Re: ustring bug?



Hello Dodji,

> Do not forget to call setlocale (LC_ALL, "") to initialize the charset
> conversion subsystem.

You are right, this version works:
<code>
        #include <glibmm.h>
        #include <iostream>
        #include <clocale>   // for setlocale

        int main (int argc, char **argv)
        {
            // <edit>
            // Initialize the charset conversion subsystem.
            setlocale (LC_ALL, "");
            // </edit>

            Glib::ustring ustr ("hä");

            try {
                Glib::locale_from_utf8 (ustr);
            } catch (Glib::ConvertError& e) {
                std::cout << e.what () << std::endl;
            }

            return 0;
        }
</code>

Even
        setlocale (LC_ALL, "de_DE.utf8")
works. Thank you, Dodji; I did not know that setlocale is necessary.
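
As an aside (my own sketch, not something from the thread): if the source
charset is known in advance, the conversion can apparently also be done
explicitly with Glib::convert(), which takes the target and source charsets
as arguments and so does not depend on the locale at all:
<code>
        #include <glibmm.h>
        #include <iostream>

        int main ()
        {
            // "füll" as raw Latin-1 bytes; 0xFC is 'ü' in ISO-8859-1.
            std::string latin1 ("f\xfcll");

            try {
                // Explicit charset conversion: Latin-1 -> UTF-8.
                // No setlocale () call is needed here.
                std::string utf8 = Glib::convert (latin1, "UTF-8", "ISO-8859-1");
                std::cout << "converted to " << utf8.size () << " bytes" << std::endl;
            } catch (Glib::ConvertError& e) {
                std::cout << e.what () << std::endl;
            }

            return 0;
        }
</code>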

> Also, I think ustring("foo") cannot work properly if "foo" is not a
> valid utf8 sequence.
> In your case, "foo" is not. So if "foo" is a string for which each
> character can fit in one byte, you should rather put "foo" in a
> std::string and convert it to a valid utf8 string using
> Glib::locale_to_utf8() .
> 
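
If I understand correctly, the suggestion looks something like this (a
minimal sketch; I am assuming a Latin-1 locale such as de_DE.iso88591 is
installed and selected via the environment):
<code>
        #include <glibmm.h>
        #include <iostream>
        #include <clocale>

        int main ()
        {
            // Pick up the charset conversion subsystem from the
            // environment; run with e.g. LANG=de_DE.iso88591 so the
            // locale charset is Latin-1.
            setlocale (LC_ALL, "");

            // Raw bytes in the locale's charset, kept in a std::string
            // because they are not necessarily valid UTF-8.
            std::string raw ("f\xfcll");   // 0xFC is 'ü' in Latin-1

            try {
                // Convert locale charset -> UTF-8; now it is safe to
                // store the result in a Glib::ustring.
                Glib::ustring utf8 = Glib::locale_to_utf8 (raw);
                std::cout << "now " << utf8.bytes () << " bytes" << std::endl;
            } catch (Glib::ConvertError& e) {
                std::cout << e.what () << std::endl;
            }

            return 0;
        }
</code>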

"foo" works all the way, but "f�ll only if setlocale is called. As
far as I know this is caused by the hehavior of utf8 only to use more
than 8 Bits for an non-ASCII character. But I didn't know about
setlocale, until today ;)
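
To see this for myself, I dumped the raw bytes and checked them with
Glib::ustring::validate() (again just a sketch of mine):
<code>
        #include <glibmm.h>
        #include <iostream>
        #include <iomanip>

        // Print each byte of a raw string as two hex digits.
        static void dump (const std::string& s)
        {
            for (std::string::size_type i = 0; i < s.size (); ++i)
                std::cout << std::hex << std::setw (2) << std::setfill ('0')
                          << int ((unsigned char) s[i]) << ' ';
            std::cout << std::dec << std::endl;
        }

        int main ()
        {
            std::string ascii ("foo");       // pure ASCII
            std::string latin1 ("f\xfcll");  // 'ü' as the Latin-1 byte 0xFC

            dump (ascii);    // 66 6f 6f     -- also valid UTF-8
            dump (latin1);   // 66 fc 6c 6c  -- 0xFC alone is invalid UTF-8

            // validate () checks whether the bytes form well-formed UTF-8.
            std::cout << Glib::ustring (ascii).validate ()  << std::endl; // 1
            std::cout << Glib::ustring (latin1).validate () << std::endl; // 0

            return 0;
        }
</code>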

I will google for it. 


regards, Maik





