[Libxmlplusplus-general] parametrizing libxml++ for the character /string type



hi there,

libxml2 represents all text as UTF-8, using its xmlChar type.
libxml++ currently uses std::string, which is only safe for the
ASCII subset of UTF-8, since std::string knows nothing about the
encoding. It was suggested to use glibmm::ustring instead, but I'd
like to propose a different solution:

What if the xmlpp::Node class (as an example) is split into two
parts: one that is character-type agnostic, i.e. uses libxml2's
internal type, and one that does the conversion to C++ string types
(for example glibmm::ustring)?

Here is how this could look:

#include <libxml/tree.h>

/* The character-type agnostic layer: talks to libxml2 directly,
   in terms of its native xmlChar (UTF-8) type. */
class _TextNode
{
public:
  void set_content(const xmlChar *content)
  {
     xmlNodeSetContent(_impl, content);
  }

  /* ... */
private:
  xmlNode *_impl;
};

/* The typed adapter: converts between the user's string type and
   UTF-8 via the string_traits policy, then forwards to the
   agnostic layer. */
template <typename string_type, typename string_traits>
class TextNode : private _TextNode
{
public:
  void set_content(const string_type &content)
  {
     _TextNode::set_content(string_traits::to_utf8(content));
  }
  /* ... */
};

So the real libxml2 wrapper class for a text node is _TextNode;
it is this class that does all the real work. TextNode then provides
a thin adapter around it (i.e. it uses _TextNode as its
implementation), offering a type-safe interface and, by means of the
string_traits parameter, a mapping to arbitrary user-provided
Unicode string implementations.
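
For example, a traits class for glibmm::ustring could be little more
than a pair of casts, since ustring already stores UTF-8 internally.
A minimal sketch (the member names to_utf8/from_utf8 are just the
convention I'm assuming here, matching the typedef below):

struct your_ustring_adaptor
{
  /* glibmm::ustring holds UTF-8 internally, so converting to and
     from libxml2's xmlChar is just a reinterpretation of the
     character pointer. */
  static const xmlChar *to_utf8(const glibmm::ustring &s)
  {
    return reinterpret_cast<const xmlChar *>(s.c_str());
  }

  static glibmm::ustring from_utf8(const xmlChar *s)
  {
    return glibmm::ustring(reinterpret_cast<const char *>(s));
  }
};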

You may then just write

typedef TextNode<glibmm::ustring, your_ustring_adaptor> YourTextNode;

to hide the templating, while others can plug in a different
Unicode library.
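
To sketch the reverse direction as well (assuming the from_utf8
member shown above), a getter would pass libxml2's raw content
through the traits class on the way out:

/* in the agnostic _TextNode layer: */
const xmlChar *get_content() const
{
  return _impl->content;   /* a text node keeps its data here */
}

/* ...and in the typed TextNode adapter: */
string_type get_content() const
{
  return string_traits::from_utf8(_TextNode::get_content());
}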

Hope this makes some sense to you.

Stefan




