I have been working on a new hyphenation-and-justification engine for variable fonts; you can find the code here and also view a demo site. But I've come across a problem I don't know how to solve.
It works beautifully for some fonts (Skia, for instance) but not for others, and I've more or less worked out what the root problem is. I am using the CSS font-stretch property to set the desired width of text to a given percentage. For example, if I have a word which occupies 207 points and for line-breaking purposes I need it to occupy 124 points, I set font-stretch to 60%. Now I'm testing with DJR's Fit, which can most certainly compress to 60% of normal and even less, but with font-stretch set to 60%, the word still occupies 145 points; the actual compression is only to 70%.
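The arithmetic behind the naive approach can be sketched as a tiny helper (hypothetical name; it assumes font-stretch maps linearly onto rendered width, which is exactly the assumption that fails here):

```typescript
// Naive width-to-stretch conversion: what percentage to request so that
// text of naturalPt points fits into targetPt points. Assumes (wrongly,
// as the Fit example shows) that font-stretch scales width linearly.
function stretchPercent(naturalPt: number, targetPt: number): number {
  return (targetPt / naturalPt) * 100;
}

stretchPercent(207, 124); // requested: ~59.9%
stretchPercent(207, 145); // what was actually delivered: ~70.0%
```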
From my point of view, the browser is misbehaving: I asked for 60% compression, I got 70%. But I wonder if underlying that is the arbitrariness of the extremes of the wdth axis: Fit has a "skyline" named instance with a wdth of 0, but it clearly still has some width, while its "ultra extended" instance at wdth 1000 is somewhat less than 10 times wider than the regular (which itself has wdth 110!). So the font-stretch property really has no hope of getting it right.
How do I set a piece of text in a variable font to a particular width in a browser?
I guess the dumb-stupid solution is to set a piece of text at font-variation-settings: 'wdth' (whatever the minimum is) and measure it, then at font-variation-settings: 'wdth' (whatever the maximum is) and measure that, and interpolate between the two. But I'm not even sure how to get the minimum and maximum wdth values from the font in a browser. Any ideas?
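The measure-and-interpolate idea can be sketched as a pure function (hypothetical names; it assumes rendered width varies linearly with wdth between the two measurements, which real fonts need not honour, so in practice you would re-measure at the guessed value and iterate):

```typescript
// Given the measured widths of the same text at the axis extremes,
// guess the 'wdth' value that should hit targetWidth, assuming a
// linear relationship between wdth and rendered width.
function wdthForTarget(
  wdthMin: number, widthAtMin: number,
  wdthMax: number, widthAtMax: number,
  targetWidth: number,
): number {
  const t = (targetWidth - widthAtMin) / (widthAtMax - widthAtMin);
  // Clamp so we never ask for a value outside the axis range.
  const clamped = Math.min(1, Math.max(0, t));
  return wdthMin + clamped * (wdthMax - wdthMin);
}
```

Since the wdth-to-width mapping is generally not linear, one or two refinement steps (measure at the guessed wdth, then interpolate again between the two nearest measurements, secant-style) would likely get much closer than a single pass.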
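As for reading the axis minimum and maximum in a browser: one possibility, rather than relying on any CSS API, is to fetch the raw font binary and parse its fvar table directly. This is only a sketch (table layout per the OpenType spec; it assumes an uncompressed TTF/OTF ArrayBuffer, so WOFF/WOFF2 files would need decompressing first):

```typescript
// Read variation axis ranges (e.g. 'wdth' min/default/max) from a raw
// TTF/OTF font binary by walking the sfnt table directory to 'fvar'.
// All multi-byte values in OpenType are big-endian; axis min/default/max
// are 16.16 fixed-point numbers.
interface AxisRange { tag: string; min: number; def: number; max: number; }

function readTag(view: DataView, offset: number): string {
  return String.fromCharCode(
    view.getUint8(offset), view.getUint8(offset + 1),
    view.getUint8(offset + 2), view.getUint8(offset + 3));
}

function fixedToFloat(view: DataView, offset: number): number {
  return view.getInt32(offset) / 65536; // 16.16 fixed-point
}

function fvarAxes(font: ArrayBuffer): AxisRange[] {
  const view = new DataView(font);
  const numTables = view.getUint16(4);
  // sfnt table directory starts at byte 12; each record is 16 bytes:
  // tag(4), checksum(4), offset(4), length(4).
  for (let i = 0; i < numTables; i++) {
    const rec = 12 + i * 16;
    if (readTag(view, rec) !== "fvar") continue;
    const fvar = view.getUint32(rec + 8); // offset of fvar in the file
    const axesOffset = fvar + view.getUint16(fvar + 4);
    const axisCount = view.getUint16(fvar + 8);
    const axisSize = view.getUint16(fvar + 10);
    const axes: AxisRange[] = [];
    for (let a = 0; a < axisCount; a++) {
      const axis = axesOffset + a * axisSize;
      axes.push({
        tag: readTag(view, axis),
        min: fixedToFloat(view, axis + 4),
        def: fixedToFloat(view, axis + 8),
        max: fixedToFloat(view, axis + 12),
      });
    }
    return axes;
  }
  return []; // no fvar table: not a variable font
}
```

Usage would be something like `fetch(fontUrl).then(r => r.arrayBuffer()).then(fvarAxes)`, then look for the entry whose tag is "wdth".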