Units per em

From your experience, does the choice of "units per em" (e.g., 1000, 1024, 2048) make a difference in font functionality? Any reason you choose one size over another? I have generally used 1000, but I've always wondered about other scales.

Comments

  • John Savard
    John Savard Posts: 1,126
    Given that 18 units per em was good enough for Monotype, on the one hand no doubt 1000 units per em is enough, but on the other hand, a number of units that is a multiple of 18 would provide compatibility.
  • AbrahamLee
    AbrahamLee Posts: 262
    Given that 18 units per em was good enough for Monotype, on the one hand no doubt 1000 units per em is enough, but on the other hand, a number of units that is a multiple of 18 would provide compatibility.
    Thanks, John! That's very interesting. You say "was good enough". When was that? Is it that way now? I've never heard of scaling by a multiple of 18.
  • Bhikkhu Pesala
    Bhikkhu Pesala Posts: 210
    edited March 2017
    If a font contains very fine details, use a higher funits/em value (see http://forum.high-logic.com/viewtopic.php?f=3&t=5093&p=22507).

    At 2048 funits/em very small circles will no longer be circular, so increase the value to 4096 or 8192 funits/em.

    Scripts with fine lines like Gabriola use 4096 funits/em. 
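
A quick way to see the small-circle problem: snapping points on a tiny circle to the integer grid leaves only a handful of distinct positions, so the shape turns polygonal. A Python sketch (the radius here is made up, not taken from any particular font):

```python
import math

def quantized_circle(radius_em, upm, steps=8):
    """Sample points on a circle whose radius is given as a fraction of
    the em, then snap each point to the integer grid of a upm-unit em."""
    r = radius_em * upm
    return [
        (round(r * math.cos(2 * math.pi * i / steps)),
         round(r * math.sin(2 * math.pi * i / steps)))
        for i in range(steps)
    ]

# A dot whose radius is 0.2% of the em, e.g. in a delicate ornament:
print(quantized_circle(0.002, 2048))   # radius ~4 units: visibly polygonal
print(quantized_circle(0.002, 8192))   # radius ~16 units: noticeably rounder
```

The snapped points' distances from the centre vary by a much smaller fraction of the radius at the higher UPM, which matches the advice above.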
  • AbrahamLee
    AbrahamLee Posts: 262
    If a font contains very fine details, use a higher funits/em value

    At 2048 funits/em very small circles will no longer be circular, so increase the value to 4096 or 8192 funits/em.

    Scripts with fine lines like Gabriola use 4096 funits/em. 
    Now this is getting interesting. Thank you for this. I presumed that the main reason for increasing the funits/em would be to support delicate features, but I never would have imagined going to 4096 or 8192 funits/em. Are there many fonts that do this? Is the only reason _not_ to do so that it will increase file size because you're using larger integers? Does that have that much of an impact? Thanks for feeding my curiosity.

    For MM families is it better to use these higher numbers so interpolated instances are more precise and less affected by integer round-off error? Or does it matter? I guess it probably depends on the complexity of the design. On the other hand, perhaps it's better that the font editor not round off any transformation at all, except at the final TTF/OTF generation stage?
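
One way to make the round-off question concrete: if both masters sit on the integer grid, an interpolated coordinate has to be rounded back to that grid, and the worst-case error of half a unit is a smaller fraction of the em at a higher UPM. A quick stdlib sketch (the master coordinates are hypothetical):

```python
def roundoff_error(a, b, t, upm):
    """Interpolate between two master coordinates (given as fractions of
    the em) on an integer grid of `upm` units, and return the rounding
    error as a fraction of the em."""
    x0, x1 = round(a * upm), round(b * upm)   # masters live on the grid
    exact = x0 + t * (x1 - x0)
    return abs(round(exact) - exact) / upm

# A stem going from 2% to 24% of the em, sampled at a 37% instance:
for upm in (1000, 2048, 4096, 8192):
    print(upm, roundoff_error(0.020, 0.240, 0.37, upm))
```

The printed errors shrink roughly in proportion to the UPM, which is the sense in which a finer grid makes interpolated instances more precise.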
  • AbrahamLee
    AbrahamLee Posts: 262
    The 18 unit em John mentions is from the metal type days, and it was strictly to do with the widths of the characters. The shapes of the characters were not constrained by any sort of unit grid the way they are in digital fonts.
    Ah! That makes more sense.
  • Unless you have a good reason to not use 1000, use 1000, since some software still assumes it.

    An Em that's a power of two (typically 2048) is reputed to result in faster rasterization, but it's probably rarely relevant.
  • AbrahamLee
    AbrahamLee Posts: 262
    Unless you have a good reason to not use 1000, use 1000, since some software still assumes it.

    An Em that's a power of two (typically 2048) is reputed to result in faster rasterization, but it's probably rarely relevant.
    That's kind of what I've heard, but without specifics, which is partially why I asked in the first place. Do you know of specific (esp. common) software that still assumes it? Have there been any studies that validate the claim of faster rasterization when using an Em that's power of two?
  • No specifics about 1000 assumption, sorry. Mostly very old stuff. Sort of like how COBOL is still around.  :-)

    When MS [co-]created TrueType, they conceived of and confirmed the speed advantage.
  • Chris Lozos
    Chris Lozos Posts: 1,458
    The problem you can run into when drawing is that the vector angles are also constrained by the unit density. This can cause undue distortion of details in the drawing. This may not be a problem for the average font, but when you approach extremes of weight and detail, things can get ugly. Choose the UPM based on the intended function of the face.
  • AbrahamLee
    AbrahamLee Posts: 262
    The problem you can run into when drawing is that the vector angles are also constrained by the unit density. This can cause undue distortion of details in the drawing. This may not be a problem for the average font, but when you approach extremes of weight and detail, things can get ugly. Choose the UPM based on the intended function of the face.
    That's precisely why I wondered if a higher density is more advantageous, particularly in the case of MM interpolation because the distortion (at least when the resultant points are rounded to the nearest integer location) would be lessened.
  • Thomas Phinney
    Thomas Phinney Posts: 2,883
    Unless you have a good reason to not use 1000, use 1000, since some software still assumes it.

    The assumption is that fonts with PostScript outlines (Type 1 or OpenType CFF/.otf) will have a 1000-unit em, and fonts with TrueType outlines, a 2048-unit em. What is considered “standard” is dependent on the outline format.
  • AbrahamLee
    AbrahamLee Posts: 262
    Thank you for those insights, Thomas and John!
  • Chris Lozos
    Chris Lozos Posts: 1,458
    When you use a typeface that is 2048 UPM, InDesign will round kerning (and perhaps spacing) to 1/1000 em units. Granted, this is a small amount and will rarely matter to most people. I surmise that keeping the UPM to even thousands [1000, 2000, 4000] would keep clear of this problem. I am assuming that Microsoft is accommodating legacy systems in sticking with their power of 2 recommendation?
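
The rounding described above, kern values re-expressed in 1/1000s of an em, is simple arithmetic to sketch (that InDesign does exactly this internally is an assumption on my part):

```python
def kern_in_thousandths(kern_units, upm):
    """Convert a kern value from font units to the nearest 1/1000 em,
    the granularity the layout application is said to round to."""
    return round(kern_units * 1000 / upm)

print(kern_in_thousandths(-61, 2048))   # -29.785... rounds to -30
print(kern_in_thousandths(-122, 2000))  # exactly -61, no rounding loss
```

With an even-thousand UPM every kern value maps to 1/1000-em units without loss, which is the point of [1000, 2000, 4000] above.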
  • John Hudson
    John Hudson Posts: 3,186
    edited March 2017
    I am assuming that Microsoft is accommodating legacy systems in sticking with their power of 2 recommendation? 
    Not just legacy systems. The optimisations in the rasterisers are current for DirectWrite as well as the older GDI environments. Basically, Microsoft's view is that since the optimisations improve things — even if the effects of the improvements are smaller and smaller — why would you ditch the recommendation?

    That said, their position has shifted. When we started making fonts for Microsoft in the late 1990s, a power-of-2 UPM was a procurement requirement. Now it is still a recommendation, but no longer a requirement, and I have occasionally shipped them a font that, for one reason or another, had a non-power-of-2 UPM.

  • Bhikkhu Pesala
    Bhikkhu Pesala Posts: 210
    edited March 2017
    Abraham Lee said:
    I presumed that the main reason for increasing the funits/em would be to support delicate features, but I never would have imagined going to 4096 or 8192 funits/em. Are there many fonts that do this? Is the only reason _not_ to do so that it will increase file size because you're using larger integers? Does that have that much of an impact? Thanks for feeding my curiosity.
    I have only come across a few that used 8192, and as I said Gabriola uses 4096. One foundry that deserves to remain anonymous (but not for that reason) uses 4096 funits/em.

    File size increase for my Guru Italic font with 2,893 glyphs (and 1,265 composites) is as follows:

    Hinted

    2048 funits/em = 916 KB (938,232 bytes)
    4096 funits/em = 940 KB (962,928 bytes)
    8192 funits/em = 959 KB (982,232 bytes)

    Unhinted
    2048 funits/em = 719 KB (736,496 bytes)
    4096 funits/em = 737 KB (755,016 bytes)
    8192 funits/em = 761 KB (779,384 bytes)

    I think there is still a good reason for sticking to powers of 2, however fast modern computer chips may be, and even though Microsoft's recommendation dates from 2002. Even microseconds add up if many operations are performed when rendering text to the screen while zooming in or reflowing text.

    For Mac OS X, 1,000 funits/em is recommended. Apparently, there's a bug that affects PDF output for fonts with funits/em values other than 1,000.

  • File size increase for my Guru Italic font with 2,893 glyphs (and 1,265 composites) is as follows:

    Hinted

    2048 funits/em = 916 KB (938,232 bytes)
    4096 funits/em = 940 KB (962,928 bytes)
    8192 funits/em = 959 KB (982,232 bytes)
    Are these different fonts using the same number of points? I know, for example, that FontLab Studio 5 adds more off-curve points in a TrueType conversion the higher the UPM is.

    For Mac OS X, 1,000 funits/em is recommended. Apparently, there's a bug that affects PDF output for fonts with funits/em values other than 1,000.
    That was in the first version of Mac OS X 10.10, and I believe has been fixed in 10.10.1 or 10.10.2. Oh, I see I was mentioned in that forum entry ;)
  • Are these different fonts using the same number of points? I know, for example, that FontLab Studio 5 adds more off-curve points in a TrueType conversion the higher the UPM is.

    For Mac OS X, 1,000 funits/em is recommended. Apparently, there's a bug that affects PDF output for fonts with funits/em values other than 1,000.
    That was in the first version of Mac OS X 10.10, and I believe has been fixed in 10.10.1 or 10.10.2. Oh, I see I was mentioned in that forum entry ;)
    They were all exported from the same font project after changing the funits/em, so the number of points is identical. This is what I do to change funits/em:

    1. Select all composites
    2. Invert selection
    3. Copy
    4. Change funits/em
    5. Paste
    This keeps the glyphs the same size. Hinting is reapplied on export. Metrics would need to be recalculated and composites would need to be recomposed, if you wanted to do this for real. 
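
For contrast with the copy-and-paste procedure above, which keeps the glyphs the same size, actually rescaling outlines to a new UPM is a multiply-and-round per coordinate. A minimal stdlib sketch with made-up points:

```python
from fractions import Fraction

def scale_coords(coords, old_upem, new_upem):
    """Rescale glyph coordinates from one UPM to another, rounding each
    value to the new integer grid; exact ratios avoid float drift."""
    f = Fraction(new_upem, old_upem)
    return [(round(x * f), round(y * f)) for x, y in coords]

# Hypothetical outline points drawn in a 2048-unit em, rescaled to 1000:
print(scale_coords([(0, 0), (1024, 1434), (2048, 0)], 2048, 1000))
# → [(0, 0), (500, 700), (1000, 0)]
```

Recent fontTools releases also ship a ttLib.scaleUpem module that, as I understand it, does this for a whole font, metrics and composites included; worth checking before doing it by hand.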
  • Chris Lozos
    Chris Lozos Posts: 1,458
    I would think that Asian fonts, like Chinese, would be much easier to deal with at a very high UPM. There are an awful lot more layers and strokes compared to Latin.
  • Roel Nieskens
    Roel Nieskens Posts: 188
    edited March 2017
    Regarding file size, I suppose that once you cross the line where most points are more than 256 units apart, it doesn't really matter if you use a UPM of 2048, 4096, 8192, etc. The savings come from being able to use a single byte to store a point's coordinates, as opposed to a signed short, which is two bytes. A font with a 1024 UPM might have 50% of its coordinates stored as shorts — say 25,000 points, which equals 50,000 bytes — but when "scaled" to 256 UPM they might all fit in a byte. That saves you 25 KB.

    There's a silly experiment I did with a famous icon font that reduces file size in exactly that manner.

    Oh, and get your rotten tomatoes ready: I think most webfonts would work perfectly well for screen at a low UPM like 256. The least that could be done is scaling the UPM down as far as possible by using the greatest common divisor of all coordinates.
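
Both points, the one-byte-versus-short rule for glyf coordinate deltas and the greatest-common-divisor reduction, can be sketched with the standard library (the coordinates below are made up for illustration):

```python
from functools import reduce
from math import gcd

def delta_bytes(deltas):
    """glyf stores point coordinates as deltas from the previous point;
    a delta with magnitude 0..255 fits in one byte (the sign lives in
    the flag byte), anything larger needs a two-byte signed short."""
    return sum(1 if abs(d) <= 255 else 2 for d in deltas)

def reducible_factor(upem, coords):
    """Largest factor by which the UPM and every coordinate can be
    divided with no rounding loss at all (the GCD idea above)."""
    values = [upem] + [v for point in coords for v in point]
    return reduce(gcd, values)

print(delta_bytes([60, 300, -12]))                               # 1 + 2 + 1 = 4 bytes
print(reducible_factor(2048, [(0, 0), (504, 1016), (2048, 8)]))  # 8
```

So that hypothetical outline, drawn on an 8-unit grid inside a 2048-unit em, could be stored losslessly at a 256 UPM.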
  • I would think that Asian fonts, like Chinese, would be much easier to deal with at a very high UPM. There are an awful lot more layers and strokes compared to Latin.
    All the type foundries I know use a 1000×1000 square when drawing kanji.
    When building CJK typefaces for MS, they sent me a lot of JSON files where the coordinates are floating-point numbers!
    The target UPM is 2048, though. Most of the bytes in the file are instructions.
  • May be off-topic, but can anyone explain why Zapfino has 400 UPM?
  • John Hudson
    John Hudson Posts: 3,186
    Yes, that's right. The first version of Zapfino for Apple was very small on the body, in order to accommodate the very tall ascenders of the taller style variants. This was done in consultation with Apple, and they shipped the font, but then they got a lot of complaints from people saying that the font was too small relative to other fonts. So Apple decided to change the UPM so that everything would scale larger.
  • John Savard
    John Savard Posts: 1,126
    The 18 unit em John mentions is from the metal type days, and it was strictly to do with the widths of the characters. The shapes of the characters were not constrained by any sort of unit grid the way they are in digital fonts.
    Oops! Yes, I should have remembered that the 1000 or 2048 points into which the em is divided for a digital typeface are used for describing the shape of the character, not just the width of its bounding box - which is why a fine grid is needed to avoid it becoming an obtrusive constraint.

    If one wants to reproduce the widths of a classic typeface precisely, it might make sense to choose a grid that is a multiple of the unit system used for the original face. Some foundries appear to have designed even their foundry faces to a unit of 1/4 of a point; which makes sense, as without some commonality, how could lines be justified exactly?