Meaning of head.flags.LSBearing

I can't figure out whether the LSBearing bit (bit 1) of head.flags controls the horizontal metrics, or is simply an "efficiency" bit that summarizes all the LSB values in the 'hmtx' table. Does:

(A) head.flags.LSBearing mean "ignore the per-glyph LSB values in the 'hmtx' table"? ... or

(B) Is it more of an advisory bit that certifies that the 'hmtx' LSB values all match the xMin values declared in the 'glyf' table for the corresponding glyph? ... or

(C) Some other, undreamt-of, interpretation.

-Clint

Comments

  • ClintGoss
    It appears that two popular font authoring apps consider the head.flags.LSBearing bit to be advisory.
    Current versions of FontCreator and the Python-based ttx tool set this bit based on whether any glyphs have LSB values that differ from the xMin values for those glyphs.
    It appears that these tools do not let the font author override the setting of the bit. In the case of TTX, altering the bit by editing the <head><flags value= ...> of a .ttx file has no effect - TTX forces the bit based on the glyph data.
    ** However ** if head.flags.LSBearing is advisory, then that bit should be set iff no glyph's LSB differs from its xMin. I ran a script over 1,894 Open Source fonts and it turned up 140 fonts where this rule is broken(!). That is, head.flags.LSBearing is on, but they have glyphs (sometimes hundreds) where the LSB for the glyph (in the 'hmtx' table) does not match the xMin declared in the glyph header (or the actual minimum x value of the points in the glyph).
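    (For reference, the kind of check I'm describing looks roughly like the sketch below - a simplified version using fontTools, not my actual script; "SomeFont.ttf" is just a placeholder:)

    from fontTools.ttLib import TTFont

    font = TTFont("SomeFont.ttf")              # placeholder path
    head, hmtx, glyf = font["head"], font["hmtx"], font["glyf"]

    lsb_bit_set = bool(head.flags & 0x0002)    # bit 1: "left sidebearing point at x=0"
    mismatches = []

    for name in font.getGlyphOrder():
        glyph = glyf[name]
        if glyph.numberOfContours == 0:
            continue                           # empty glyph: no outline, no xMin
        advance, lsb = hmtx.metrics[name]
        if lsb != glyph.xMin:
            mismatches.append((name, lsb, glyph.xMin))

    print("head.flags bit 1 set:", lsb_bit_set)
    print("glyphs where hmtx LSB != glyf xMin:", len(mismatches))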
    So again, I'm not sure how these fonts should be handled ... do I:
    (A) honor the head.flags.LSBearing bit and use xMin instead of the glyph's LSB declaration, or
    (B) use the glyph's LSB declaration?
    By the way, I also compared the four bounding box values stored in each glyph against the bounding box implied by the most extreme points in each glyph. 368 of the 1,894 fonts had at least one error of this type (!! again).
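    (That comparison was done along the same lines - again a rough sketch assuming fontTools, with "SomeFont.ttf" as a placeholder:)

    from fontTools.ttLib import TTFont
    from fontTools.misc.arrayTools import calcIntBounds

    font = TTFont("SomeFont.ttf")              # placeholder path
    glyf = font["glyf"]
    bad_boxes = []

    for name in font.getGlyphOrder():
        glyph = glyf[name]
        if glyph.numberOfContours == 0:
            continue                           # empty glyph: nothing to compare
        # every point of the outline, with composite components flattened
        coords, _endPts, _flags = glyph.getCoordinates(glyf)
        bounds = calcIntBounds(coords)
        if bounds != (glyph.xMin, glyph.yMin, glyph.xMax, glyph.yMax):
            bad_boxes.append(name)

    print("glyphs whose stored bbox differs from the point extrema:", len(bad_boxes))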

    -Clint
  • The description of this bit in the OpenType spec on the Microsoft site says:
    Left sidebearing point at x=0 (relevant only for TrueType rasterizers)
    If I understand it correctly, the effective sidebearing of a glyph could be altered by the LSB value in the hmtx table.

    You could, for example, design a glyph without a left sidebearing, which means xMin would be 0. The LSB in the 'hmtx' table could then be used to give the glyph an LSB of e.g. 50 font units. In this case the head table bit must not be set.
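    If that reading is right, the relationship is trivial to state - this is a sketch of my understanding, not something quoted from the spec:

    def effective_shift(xMin, lsb):
        # How far the rendered outline effectively moves so that its leftmost
        # extreme (xMin) ends up at the hmtx left side bearing. Illustrative only.
        return lsb - xMin

    print(effective_shift(xMin=0, lsb=50))    # 50: outline pushed 50 units right
    print(effective_shift(xMin=50, lsb=50))   # 0: LSB point already at x = 0,
                                              #    so the head flag could be set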
  • ** However ** if head.flags.LSBearing is advisory, then that bit should be set iff no glyph's LSB differs from its xMin. I ran a script over 1,894 Open Source fonts and it turned up 140 fonts where this rule is broken(!). That is, head.flags.LSBearing is on, but they have glyphs (sometimes hundreds) where the LSB for the glyph (in the 'hmtx' table) does not match the xMin declared in the glyph header (or the actual minimum x value of the points in the glyph).
    The question is, how big is the difference? If it is very small, it could simply be a rounding error. If the leftmost extreme of a contour lies on a curve rather than at an on-curve point, the precise value may be fractional, but the LSB value must be rounded to an integer.
  • ClintGoss
    Thanks Jens ... this is a real head-scratcher for me!

    So ... hmtx.leftSideBearing is an array of int16's, which I'm taking to be +/- funits. The glyph.xMin entries are also int16's. If head.flags.LSBearing is set, I was thinking that hmtx.leftSideBearing for a glyph should be the same as glyph.xMin for that glyph, using integer comparison.
    The question is, how big is the difference? If it is very small, it could simply be a rounding error. If the leftmost extreme of a contour lies on a curve rather than at an on-curve point, the precise value may be fractional, but the LSB value must be rounded to an integer.
    Oh, that's a whole kettle of headaches. I was thinking that bearings and xMin values were both simply the location of some (left-most) point in the glyph. However, do we have to (essentially) extrapolate the curve shape pulled by an off-curve point and figure out where its (left-most) bound is?? Is that the way it really works??
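    If that is how it works, then for a quadratic TrueType segment the true leftmost x would be something like the sketch below (my own math, so treat it with suspicion):

    def quad_min_x(p0, p1, p2):
        # Leftmost x of a quadratic Bezier segment: on-curve p0, off-curve p1,
        # on-curve p2. The true extreme can fall between the endpoints and can
        # be fractional even though all the stored points are integers.
        x0, x1, x2 = p0[0], p1[0], p2[0]
        candidates = [x0, x2]                  # the endpoints are always candidates
        denom = x0 - 2 * x1 + x2
        if denom != 0:
            t = (x0 - x1) / denom              # where the x-derivative is zero
            if 0 < t < 1:
                candidates.append((1 - t) ** 2 * x0 + 2 * (1 - t) * t * x1 + t ** 2 * x2)
        return min(candidates)

    # the off-curve point pulls the curve left of both on-curve endpoints
    print(quad_min_x((10, 0), (-19, 50), (10, 100)))    # -4.5: fractional!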

    (I'm gonna need some chocolate before I ponder this ...)

    -Clint
  • ClintGoss
    Here's what I've found over the last week:

    * The head.flags.LSBearing bit appears to be advisory. Font authoring tools and TTF rendering engines that handle fonts where the rule is broken (i.e. head.flags.LSBearing is set but a glyph's offset is not zero) do not force the offset to zero.

    * The suggestion that Jens made regarding differences between the LSB in the 'hmtx' table and the declared xMin in the header of the glyph (or the actual minimum x value of the points in the glyph) ...
    ... it could simply be a rounding error.
    ... I think is spot on! The vast majority of the mismatches I found across the 1,894 Open Source fonts I'm scanning are "off-by-one" errors, and most appear to involve composite glyphs with scaling and rotations. I am rounding differently than the system that produced the fonts. I am now wondering if this involves the interpretation of the ROUND_XY_TO_GRID bit in composite glyphs ...
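    (To show the kind of one-unit difference I mean - a contrived example of how two reasonable rounding conventions disagree, not numbers taken from any particular font:)

    import math

    def round_half_up(v):
        # round half up, toward +infinity: floor(v + 0.5);
        # I believe this is the convention fontTools' otRound uses
        return math.floor(v + 0.5)

    scale, dx = 0.5, 7           # hypothetical component transform: scale, then x offset
    component_xMin = -21         # xMin of the referenced component glyph, in funits

    exact = scale * component_xMin + dx     # -3.5
    print(round(exact))          # -4 (Python rounds halves to the even neighbour)
    print(round_half_up(exact))  # -3 (round half up)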

    * How the bounding box for a glyph is calculated was also brought up by Jens. One could imagine what I'll call the "Outline Method": the bounding box stated in the font should represent the actual outline of the glyph as it would be rendered. A straightforward implementation would be what I'll call the "Data Point Method": simply use the extrema of the coordinates of the outline's points.

    The _g_l_y_f.py module of the Python FontTools package (I'm looking at v3.38.0) actually implements the Outline Method - a significant chunk of thorny code - but then disables it in favor of the Data Point Method. Other authoring tools and the actual Open Source fonts I am surveying also seem to use the Data Point Method.
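    In fontTools terms the two methods map roughly onto its two bounds pens, which is how I've been comparing them - a minimal sketch, with the path and glyph name as placeholders:

    from fontTools.ttLib import TTFont
    from fontTools.pens.boundsPen import BoundsPen, ControlBoundsPen

    font = TTFont("SomeFont.ttf")             # placeholder path
    glyphset = font.getGlyphSet()

    for name in ("A",):                       # placeholder glyph name
        outline_pen = BoundsPen(glyphset)          # "Outline Method": true curve extrema
        points_pen = ControlBoundsPen(glyphset)    # "Data Point Method": point extrema only
        glyphset[name].draw(outline_pen)
        glyphset[name].draw(points_pen)
        print(name, outline_pen.bounds, points_pen.bounds)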

    -Clint