[Solved] Yet another 'Trouble rendering Unicode' topic

Hey Embers,

I seem to be having trouble rendering Unicode, despite following the helpful guidance provided in a previous post.

I load this font into a ci::gl::TextureFontRef, which renders regular characters without issue. I then try to render something a bit more obscure, using the following code:

mTextureFont->drawString( "Test Render: \xE2\x96\x97 ok?", ci::vec2( 100, 400 ) );

I got the hex code by looking up the character in the Character Map on Windows and converting it to its UTF-8 hex encoding via this very useful website.
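
For reference, the character is U+2597 (QUADRANT LOWER RIGHT), and the conversion works out like this:

code point:    U+2597                        = 0010 0101 1001 0111
UTF-8 layout:  1110xxxx 10xxxxxx 10xxxxxx      (3-byte form)
filled in:     11100010 10010110 10010111    = 0xE2 0x96 0x97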

Alas, it just renders a blank space (where the special character should be), leaving me puzzled. It's supposed to render as a little black square in the bottom-right corner of the character cell.

I'm open to any ideas and thoughts!

Cheers,

Gazoo

The font may not support those characters. When in doubt, try using a widely supported system font like Arial or Trebuchet, which contain most common characters. Fonts downloaded from a “free fonts” website, especially the decorative ones, usually support only a very small number of characters.

-Paul

Thanks for the reply @paul.houx! :slight_smile: Always a pleasure to interact with you.

As far as I know, the font is supposed to support the character, given that I can look it up in the Character Map on Windows. Furthermore, I tested the font by creating a simple HTML page, including the font via a CSS stylesheet, and rendering the character with its HTML entity (&#x2597;, which displays as ▗). It renders fine in Chrome.

Doesn’t that mean it should work in Cinder too?

Gazoo

For me it works using your font (e.g. the cool “PetMe64.ttf” Commodore-64 font from the website you linked). As the posting you linked to indicates, you need to specify any extra characters you want to support via the supportedChars argument to TextureFont::create().

For example, here I added a couple of characters (you’d need to include a string with all the ones you want!). The following works (assuming you have that font in the asset directory):

void UnicodeFontApp::setup()
{
    auto font = Font(loadAsset("PetMe64.ttf"), 16);
    // You can add extra characters as multibyte hex, octal
    // or (\u) 16-bit code point if your compiler supports it.
    auto myChars = gl::TextureFont::defaultChars() + "\xE2\x96\x97" + "\u25cc";
    fontTex = gl::TextureFont::create(font, gl::TextureFont::Format(), myChars);
}

void UnicodeFontApp::draw()
{
    gl::clear( Color( 0, 0, 0 ) );
    fontTex->drawString("Test Render: \xE2\x96\x97 ok? \u25cc", {10, 50});
}

[Screenshot: TestRenderUnicode-C64]

Does that not work for you?

Glen.

P.S. If you need to add a large range (or, more likely, several ranges) of special characters, you can build the string with arrays and loops instead of writing them all out manually! (-;
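
Something like this minimal sketch could work (the helpers appendUtf8 and charsFromRanges are my own hypothetical names, not Cinder API; it handles code points up to U+FFFF and assumes your ranges avoid the surrogate block U+D800..U+DFFF):

#include <cstdint>
#include <string>
#include <utility>
#include <vector>

// Append a single code point to 'out', encoded as UTF-8.
static void appendUtf8( std::string &out, uint32_t cp )
{
    if( cp < 0x80 ) {        // 1 byte (ASCII)
        out += static_cast<char>( cp );
    }
    else if( cp < 0x800 ) {  // 2 bytes
        out += static_cast<char>( 0xC0 | ( cp >> 6 ) );
        out += static_cast<char>( 0x80 | ( cp & 0x3F ) );
    }
    else {                   // 3 bytes (up to U+FFFF)
        out += static_cast<char>( 0xE0 | ( cp >> 12 ) );
        out += static_cast<char>( 0x80 | ( ( cp >> 6 ) & 0x3F ) );
        out += static_cast<char>( 0x80 | ( cp & 0x3F ) );
    }
}

// Build a supportedChars string from inclusive code point ranges,
// e.g. the Block Elements block U+2580..U+259F, which contains U+2597.
static std::string charsFromRanges( const std::vector<std::pair<uint32_t, uint32_t>> &ranges )
{
    std::string chars;
    for( const auto &range : ranges )
        for( uint32_t cp = range.first; cp <= range.second; ++cp )
            appendUtf8( chars, cp );
    return chars;
}

You could then pass something like

auto myChars = gl::TextureFont::defaultChars() + charsFromRanges( { { 0x2580, 0x259F } } );

as the supportedChars argument to gl::TextureFont::create().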


Hey @totalgee,

I had absolutely no idea that I needed to specify any additional characters, but of course, in hindsight this is clearly necessary. A TextureFont with every single character supported by a font would likely be unnecessarily large. Drawing the character now works perfectly!

Thanks for the tip re. adding several characters. I may well pack a vector with the ones I need to iterate through!

Thanks so much for the help!

Gazoo