- Subject: Re: [slang-users] Welcome to the "slang-users-l" mailing list
- From: Ian Langworth <ian.langworth@xxxxxxxxx>
- Date: Sun, 16 Jan 2011 18:30:11 -0800
Hi,
I'm attempting to print a UTF-8 encoded character to the screen in a
Unicode-capable terminal:
#include <slang.h>
#include <stdio.h>
int main(int argc, char* argv[]) {
    SLtt_get_terminfo();
    SLang_init_tty(-1, 1, 0);
    SLsmg_init_smg();
    SLutf8_enable(-1);
    SLsmg_cls();
    SLsmg_gotorc(0, 0);
    SLsmg_write_string("黑\n");
    SLsmg_write_nchars("黑", 1);
    SLsmg_gotorc(5, 0);
    SLsmg_refresh();
    SLang_getkey();
    SLsmg_reset_smg();
    SLang_reset_tty();
    return 0;
}
Instead of printing two 黑 characters separated by a newline, all I see is:
é»<91>é
The bytes written to the screen are (hex): c3a9 c2bb 3c39 313e c3a9
My terminal, Terminal.app on Mac OS X, definitely supports Unicode,
and other applications (Emacs, Vim) have no trouble. The source file
is UTF-8 encoded and xxd confirms it. I've tried replacing the string
with "\xe9\xbb\x91", and I've tried changing my TERM and LC_ALL
environment variables.
I've tried changing the parameter of SLutf8_enable(). I've tried
setting SLsmg_Display_Eight_Bit to 128, but that simply prints "é»é".
By default, SLsmg_Display_Eight_Bit is 160.
Any help would be appreciated.
_______________________________________________
slang-users-l mailing list
slang-users-l@xxxxxxxxxxx
http://mailman.jedsoft.org/mailman/listinfo/slang-users-l