I have been working on a new hyphenation-and-justification engine for variable fonts; you can find the code here and also view a demo site. But I've come across a problem I don't know how to solve.
It works beautifully for some fonts (Skia, for instance) but not for others, and I've more or less worked out what the root problem is. I am using the CSS `font-stretch` property to set the desired width of text to a given percentage. For example, if I have a word which occupies 207 points and for line-breaking purposes I need it to occupy 124 points, I set `font-stretch` to 60%. Now I'm testing with DJR's Fit, which can most certainly compress to 60% of normal and even less, but with `font-stretch` set to 60%, the word still occupies 145 points: the actual compression is only 70%.
From my point of view, the browser is misbehaving: I asked for 60% compression, I got 70%. But I wonder if underlying that is the arbitrariness of the extremes of the `wdth` axis: Fit has a "skyline" named instance which has a `wdth` of 0, but it clearly has some width, while its "ultra extended" instance with `wdth` 1000 is somewhat less than ten times wider than the regular (which itself has a `wdth` of 110!). So the `font-stretch` property really has no hope of getting it right.
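Those extremes live in the font's fvar table, and no browser API exposes them directly; one workaround is to fetch the font binary and parse the table yourself. A minimal sketch, assuming an uncompressed TTF/OTF (not WOFF2) and ignoring TrueType Collections; the function name is mine:

```javascript
// Read each variation axis's min/default/max from a font's fvar table.
// OpenType is big-endian, which is DataView's default byte order.
function parseFvarAxes(buffer) {
  const view = new DataView(buffer);
  const numTables = view.getUint16(4);
  let fvarOffset = -1;
  for (let i = 0; i < numTables; i++) {
    const rec = 12 + i * 16; // 16-byte table records follow the 12-byte header
    const tag = String.fromCharCode(
      view.getUint8(rec), view.getUint8(rec + 1),
      view.getUint8(rec + 2), view.getUint8(rec + 3));
    if (tag === 'fvar') { fvarOffset = view.getUint32(rec + 8); break; }
  }
  if (fvarOffset < 0) return null; // not a variable font
  const axesArrayOffset = view.getUint16(fvarOffset + 4);
  const axisCount = view.getUint16(fvarOffset + 8);
  const axisSize = view.getUint16(fvarOffset + 10);
  const axes = {};
  for (let i = 0; i < axisCount; i++) {
    const a = fvarOffset + axesArrayOffset + i * axisSize;
    const tag = String.fromCharCode(
      view.getUint8(a), view.getUint8(a + 1),
      view.getUint8(a + 2), view.getUint8(a + 3));
    axes[tag] = {
      min: view.getInt32(a + 4) / 65536, // values are signed Fixed 16.16
      default: view.getInt32(a + 8) / 65536,
      max: view.getInt32(a + 12) / 65536,
    };
  }
  return axes;
}
```

In a browser you'd feed it `fetch(url).then(r => r.arrayBuffer())`, where the URL is whatever your @font-face points at; for Fit this should report `wdth` running from 0 to 1000 with a default of 110.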
How do I set a piece of text in a variable font to a particular width in a browser?
I guess the dumb-stupid solution is to set a piece of text at `font-variation-settings: 'wdth' (whatever the minimum is)` and measure it, then at `font-variation-settings: 'wdth' (whatever the maximum is)` and measure that, and interpolate between the two. But I'm not even sure how to get the minimum and maximum values from the font in a browser. Any ideas?
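For what it's worth, the measure-and-interpolate idea can be sketched like this. Glyph widths aren't guaranteed to vary linearly across the axis (there may be intermediate masters, and axis values are normalized around the default), so the linear estimate is only a first guess that you'd re-measure and refine. The function names and the hidden-span measuring trick are mine, not anything standardized, and the measuring helper assumes the font is already loaded:

```javascript
// Linear estimate: given widths measured at the two axis extremes, which
// 'wdth' value should yield the target width? Clamped to the axis range.
function wdthForTargetWidth(minWdth, widthAtMin, maxWdth, widthAtMax, target) {
  const t = (target - widthAtMin) / (widthAtMax - widthAtMin);
  return minWdth + Math.max(0, Math.min(1, t)) * (maxWdth - minWdth);
}

// Browser-only helper: render the text in a hidden span at a given 'wdth'
// and report its rendered width in CSS pixels.
function measureAt(text, fontFamily, fontSize, wdth) {
  const span = document.createElement('span');
  span.style.cssText = 'position:absolute;visibility:hidden;white-space:nowrap;';
  span.style.fontFamily = fontFamily;
  span.style.fontSize = fontSize;
  span.style.fontVariationSettings = `'wdth' ${wdth}`;
  span.textContent = text;
  document.body.appendChild(span);
  const width = span.getBoundingClientRect().width;
  span.remove();
  return width;
}
```

With Fit, say, you'd measure once at `wdth` 0 and once at `wdth` 1000, compute the estimate, set `font-variation-settings` to it, re-measure, and iterate until the result is close enough.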
Comments
https://github.com/chrissam42/font-to-width/blob/master/font-to-width.js#L341
But here's what the W3C CSS Fonts specification says about font-stretch:
If that's what the spec says, it's not unreasonable to think "I set font-stretch to 50%, so the glyphs should be 50% as wide." Arguably the spec is wrong here, but the spec is wrong because it assumes that "declared" width and "actual" width are identical, when as you say, the relationship is more complicated than that.
The same is true for weight as for width: an axis adjustment that adds +45 per mille of em to the x-opaque (horizontal stem thickness) of the uprights of an uppercase H likely won't add the same amount to the uprights of a lowercase n, let alone to the upright of a plus sign; and the latter won't share the same x,y behaviour as the letters, since the plus sign needs to retain optically equivalent x and y stem weights while the stroke contrast of the letters might change.
So I still don't see how you can make quantifiable predictions from parametric axes for arbitrary text strings any more than you can using percentage values as in CSS font-stretch, at least for proportionally spaced fonts.
At that point, one needs to consider what, e.g., 60% actually means. Is it:
a) the average change to the width of the glyphs to which font-stretch is applied?
b) the average change to the width of all the non-zero glyphs in the font?*
c) the change to the width of the bounding box of the text to which font-stretch is applied?
_____
* Thought in passing: if (b) were the case, then it would be possible for font makers to design to this standard by making the wdth axis relative to the OS/2 table xAvgCharWidth value. Hmm. I like this idea, although I've not thought it through thoroughly.