Determining ex, em and en

I'm writing some typography software which allows users to specify lengths in various units.

If the user asks for a length of "1ex", is it reasonable to determine this length from the sxHeight entry in the OS/2 table? Should I be literally measuring the point height of an "x" in the font instance? Or something else?

Similarly for "1em" - I'm just using the point size of the font instance, but should I be doing something more clever?

And "1en" - half an em?

Comments

  • John Hudson
    em and en, yes, should be equal to nominal font size and half of same, respectively.

    For ex, I'd be inclined to use the sxHeight entry in the OS/2 table, on the basis that if it isn't actually equal to the lowercase x height, there might be some good reason for that, e.g. the font doesn't contain a Latin lowercase!
  • Nick Shinn
    Why are you doing this, Simon?
    Shouldn’t these characters be a function of the font, not the layout app?
  • Thanks, John! Nick, I'm not defining characters; I'm defining units of measurement. ("Set interline distance to 1.2em" and so on.)
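    Concretely, the arithmetic is just this (a rough C sketch; actually pulling sxHeight out of the font is a separate question):

        /* Sketch only: em/en/ex as lengths in points for a nominal size.
         * sx_height and units_per_em come from the font, e.g. the OS/2
         * and head tables. */
        double em_pts(double point_size) { return point_size; }        /* 1em = nominal size */
        double en_pts(double point_size) { return point_size / 2.0; }  /* 1en = half an em   */
        double ex_pts(double point_size, double sx_height, double units_per_em)
        {
            return point_size * sx_height / units_per_em;               /* scale to the size  */
        }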
  • John Hudson
    BTW, my general recommendation would be to match CSS use of these units.
  • Good idea... although now that I've discovered neither HarfBuzz nor FreeType supports interrogating the sxHeight field, this may all become a little academic. Time to go digging in browser source code to see how they do it.
  • FreeType: http://www.freetype.org/freetype2/docs/reference/ft2-truetype_tables.html#TT_OS2
    HarfBuzz: Well, you can extend the OS/2 support (that you wrote initially :) ) to read later versions of the table and provide the x-height.
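    For what it's worth, reading it through that FreeType interface looks roughly like this (untested sketch; sxHeight only exists in OS/2 table version 2 and later):

        #include <ft2build.h>
        #include FT_FREETYPE_H
        #include FT_TRUETYPE_TABLES_H

        /* Sketch: x-height in font units, or 0 if the face has no usable
         * sxHeight (no OS/2 table, version < 2, or a non-positive value). */
        FT_Short os2_x_height(FT_Face face)
        {
            TT_OS2 *os2 = (TT_OS2 *)FT_Get_Sfnt_Table(face, FT_SFNT_OS2);
            if (os2 && os2->version >= 2 && os2->sxHeight > 0)
                return os2->sxHeight;
            return 0;
        }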

  • Firefox with FreeType seems to try measuring the height of x, then sxHeight, then half an em.
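    In sketch form (my reading of it, not Gecko's actual code), that chain looks something like:

        #include <ft2build.h>
        #include FT_FREETYPE_H
        #include FT_TRUETYPE_TABLES_H

        /* Sketch: x-height in font units, in the order described above:
         * measure /x, then sxHeight, then half the em. */
        double x_height_units(FT_Face face)
        {
            FT_UInt gid = FT_Get_Char_Index(face, 'x');
            TT_OS2 *os2;

            /* 1. Measure the lowercase x: loaded unscaled, its bearing above
             *    the baseline is the x-height in font units. */
            if (gid != 0 &&
                FT_Load_Glyph(face, gid, FT_LOAD_NO_SCALE) == 0 &&
                face->glyph->metrics.horiBearingY > 0)
                return (double)face->glyph->metrics.horiBearingY;

            /* 2. Fall back to sxHeight from a version 2+ OS/2 table. */
            os2 = (TT_OS2 *)FT_Get_Sfnt_Table(face, FT_SFNT_OS2);
            if (os2 && os2->version >= 2 && os2->sxHeight > 0)
                return (double)os2->sxHeight;

            /* 3. Last resort: half the em. */
            return face->units_per_EM / 2.0;
        }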
  • For completeness:
    • Mozilla, when using FreeType, measures the height of /x in font units, with the following comment: "Prefering a measured x over sxHeight because sxHeight doesn't consider hinting, but maybe the x extents are not quite right in some fancy script fonts. CSS 2.1 suggests possibly using the height of an "o", which would have a more consistent glyph across fonts."
    • Mozilla, when not using FreeType, tries to load sxHeight from the OS/2 table, ignoring negative values. If that doesn't work and you're on a Mac, it calls CGFontGetXHeight. It then tries to measure characters, and if that fails, multiplies the maximum ascent by 0.56.
    • Chromium uses Skia, which checks the OS/2 table, then measures an "X", and then gives up and uses the ascender(!)
  • And when using DirectWrite, it uses whatever value DirectWrite calculates for the font metrics.

    I’d say go for sxHeight (you don’t need to account for hinting), falling back to half an em, and call it a day.
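    In the terms of the sketches above, that would be something like (reusing os2_x_height() from earlier; illustrative only):

        /* Sketch of that recommendation: ex from sxHeight, falling back to
         * half an em; everything scaled to the requested nominal size. */
        void units_at_size(FT_Face face, double size,
                           double *em, double *en, double *ex)
        {
            FT_Short sx = os2_x_height(face);               /* 0 if unusable */
            *em = size;                                     /* 1em = nominal size */
            *en = size / 2.0;                               /* 1en = half an em   */
            *ex = sx > 0 ? size * sx / face->units_per_EM
                         : size / 2.0;                      /* fall back to 0.5em */
        }

    So "set interline distance to 1.2em" at a nominal 10pt is just 12pt.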