Regarding the question of whether liblouis supports 32-bit Unicode: the answer is that it depends, and it is out of your control as the table author.
It all depends on how the user's copy of liblouis was compiled. In my view that is a mistaken design, though I am not in the mindset of conditional compilation anyway: in my world there should be one build and one build alone (granted, you would need a separate build for each platform, but the code should be equivalent).
I do not know what would happen if one tried to use a table which needs 32-bit Unicode on a 16-bit Unicode build.
It is also worth noting that the 16-bit builds use UCS-2, not UTF-16. UTF-16 is a variable-width encoding and can therefore represent every Unicode character, whereas UCS-2 is fixed width and so is limited to the 16-bit character set (the Basic Multilingual Plane).
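To make the UCS-2 versus UTF-16 distinction concrete, here is a small illustration in Python (this is only a sketch of the encodings themselves, not anything liblouis does). A character outside the Basic Multilingual Plane, such as U+1D15E MUSICAL SYMBOL HALF NOTE, occupies two 16-bit code units (a surrogate pair) in UTF-16, while UCS-2 simply cannot represent it at all:

```python
# A code point above U+FFFF: needs a surrogate pair in UTF-16.
half_note = "\U0001D15E"  # MUSICAL SYMBOL HALF NOTE

# UTF-16 (little-endian, no BOM): 4 bytes = two 16-bit units.
print(len(half_note.encode("utf-16-le")))  # 4

# A BMP character such as U+00E9 fits in a single 16-bit unit,
# which is the only case UCS-2 can handle.
print(len("\u00E9".encode("utf-16-le")))   # 2

# In UTF-32 every code point is one fixed-width 32-bit unit.
print(len(half_note.encode("utf-32-le")))  # 4
```

So a "16-bit Unicode" build that is really UCS-2 has no way to carry U+1D15E at all, whereas a true UTF-16 implementation would pass it through as a surrogate pair.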
Michael Whapples

On 09/01/2014 15:41, Aaron Cannon wrote:
Hi all.

I've been working on the beginnings of a table to support IPA Braille. Has anyone been doing something similar? I would hate to duplicate effort.

Also, doing this has caused me to learn a bit more about Unicode, and I'm wondering how complete Liblouis's support for Unicode is. I know it can handle Unicode codepoints in tables and translation, but I'm wondering if we are currently doing any normalization? For those of you not familiar, there are basically multiple ways to write many Unicode characters. For e with an acute accent, for example, you can either use the single code point representing e acute, or you can use an e followed by a combining acute accent code point. What normalization does is map all Unicode strings to a canonical form, so that every character is represented in only one way.

My final question is how many bytes we are using to represent Unicode characters. Are we supporting 32 bits, or just 16?

Thanks.

Aaron

For a description of the software, to download it and links to project pages go to http://www.abilitiessoft.com
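For readers unfamiliar with the normalization issue Aaron raises, Python's standard `unicodedata` module makes it concrete (this is purely illustrative; it says nothing about what liblouis itself does):

```python
import unicodedata

# "e with acute accent" can be written two ways:
precomposed = "\u00E9"   # U+00E9 LATIN SMALL LETTER E WITH ACUTE
decomposed = "e\u0301"   # U+0065 followed by U+0301 COMBINING ACUTE ACCENT

# They look identical but are different code point sequences.
print(precomposed == decomposed)  # False

# Normalization maps both spellings to a single canonical form:
# NFC prefers precomposed characters, NFD prefers decomposed ones.
print(unicodedata.normalize("NFC", decomposed) == precomposed)  # True
print(unicodedata.normalize("NFD", precomposed) == decomposed)  # True
```

Without a normalization step, a translation table would have to list both spellings of every such character to match them reliably.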