I can't figure out whether the LSBearing bit (bit 1) of head.flags controls the horizontal metrics, or is simply an "efficiency" bit that summarizes all the LSB values in the 'hmtx' table. Does:
(A) head.flags.LSBearing mean "ignore the per-glyph LSB values in the 'hmtx' table"? ... or
(B) Is it more of an advisory bit that certifies that the 'hmtx' LSB values all match the xMin values declared in the 'glyf' table for the corresponding glyph? ... or
(C) Some other, undreamt-of, interpretation.
-Clint
Comments
In other words, when rendering, should one: (A) honor the head.flags.LSBearing bit and use xMin instead of the glyph's LSB declaration, or
(B) use the glyph's LSB declaration?
You could, for example, design a glyph without a left sidebearing, which means xMin would be 0. The LSB in the 'hmtx' table could then be used to give the glyph an LSB of, e.g., 50 font units. In this case the head table bit must not be set.
So ... hmtx.leftSideBearing is an array of int16's, which I'm taking to be +/- funits. The glyph.xMin entries are also int16's. If head.flags.LSBearing is set, I was thinking that hmtx.leftSideBearing for a glyph should be the same as glyph.xMin for that glyph, using integer comparison.
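The integer comparison above can be sketched as a small helper. This is a hypothetical function (the names `lsb_mismatches`, `hmtx_lsb`, and `glyf_xmin` are mine, not part of any font library); it assumes you have already extracted the per-glyph LSB and xMin values into plain dicts:

```python
def lsb_mismatches(hmtx_lsb, glyf_xmin):
    """Return glyph names whose 'hmtx' LSB differs from 'glyf' xMin.

    hmtx_lsb  -- dict: glyph name -> int16 LSB from the 'hmtx' table
    glyf_xmin -- dict: glyph name -> int16 xMin from the 'glyf' table
                 (empty glyphs, which have no bounding box, are omitted)
    """
    return sorted(
        name for name, lsb in hmtx_lsb.items()
        if name in glyf_xmin and lsb != glyf_xmin[name]
    )

# If head.flags.LSBearing were a hard guarantee, this list would be empty:
print(lsb_mismatches({"A": 50, "B": 0}, {"A": 50, "B": 1}))  # ['B']
```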
(I'm gonna need some chocolate before I ponder this ...)
-Clint
* The head.flags.LSBearing bit appears to be advisory. Font authoring tools and TTF rendering engines that handle fonts where the rule is broken (i.e. head.flags.LSBearing is set but a glyph's offset is not zero) do not force the offset to zero.
* The suggestion that Jens made regarding differences between the LSB in the 'hmtx' table and the declared xMin in the header of the glyph (or the actual minimum x value of the points in the glyph) ... I think is spot on! The vast majority of the mismatches I found across the 1,894 Open Source fonts I'm scanning are "off-by-one" errors, and most appear to involve composite glyphs with scaling and rotations. I am rounding differently than the system that produced the fonts. I am now wondering if this involves the interpretation of the ROUND_XY_TO_GRID bit in composite glyphs ...
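A sketch of how one might bucket those survey results. The function name and return strings are mine, purely illustrative; the only assumption taken from the thread is that a delta of exactly one font unit is the signature of a rounding difference:

```python
def classify_mismatch(lsb, xmin):
    """Classify an LSB/xMin disagreement (hypothetical helper).

    Off-by-one differences are consistent with rounding divergence,
    e.g. scaled/rotated composite components rounded differently by
    the authoring tool; anything larger looks like a real mismatch.
    """
    delta = abs(lsb - xmin)
    if delta == 0:
        return "match"
    if delta == 1:
        return "off-by-one (probable rounding difference)"
    return "mismatch (delta=%d)" % delta

print(classify_mismatch(-2, -1))  # off-by-one (probable rounding difference)
```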
* How the bounding box for a glyph is calculated was also brought up by Jens. One could imagine what I'll call the "Outline Method": the bounding box stated in the font should represent the actual outline of the glyph as it would be rendered. A more straightforward implementation is what I'll call the "Data Point Method": simply take the extrema of the outline's coordinate points.
The _g_l_y_f.py module of the Python FontTools package (I'm looking at v3.38.0) actually implements the Outline Method - a significant chunk of thorny code - but then disables it in favor of the Data Point Method. Other authoring tools and the actual Open Source fonts I am surveying also seem to use the Data Point Method.
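The Data Point Method is trivial to sketch, which is presumably why tools prefer it. This is my own illustration, not FontTools code; it assumes points are given as (x, y) pairs, off-curve control points included:

```python
def data_point_bbox(points):
    """Bounding box by the "Data Point Method": take the extrema of the
    raw coordinate points, off-curve control points included.

    Note that an off-curve point can lie outside the rendered curve, so
    this can overstate the box relative to the "Outline Method", which
    would evaluate the actual quadratic segments.
    """
    xs = [x for x, y in points]
    ys = [y for x, y in points]
    return min(xs), min(ys), max(xs), max(ys)

# A contour whose off-curve point (150, 800) sits above the on-curve
# extrema, and therefore above the curve it controls:
print(data_point_bbox([(0, 0), (150, 800), (300, 0)]))  # (0, 0, 300, 800)
```

The printed yMax of 800 is exactly the kind of value the Outline Method would shrink, since the quadratic curve never actually reaches its off-curve control point.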
-Clint