Why are f-ligatures included in some monospace fonts? I imagine it's just to cover the Unicode code points? What are the downsides of leaving those code points uncovered?
I was just thinking that consistency could be a reason to include them: across a larger typeface collection, all styles, even the monospace offering, would have those glyphs covered.
And Ray, I agree with you on that, monospace ligatures can be rather unsightly, lol.
Yep. That's one reason why some people dislike preformed ligatures. If you're of such a mind, you might find it more elegant to compose them with pre- and/or post-variant glyphs. Instead of creating individual /T_b, /T_l, /T_k and such because you have /T_h, just create a /T.pre with a longer right arm, adjust its sidebearings, and add some glyph substitution code to calt and liga. Ligs with /f? Just create a couple of versions (tall and short) and tell calt/liga (hell, even kern) the order to substitute them in -- orchestra conductors can say defffinitely, as Bringhurst suggested. /t/t and /t/y can join. Mrs. Eaves would be proud. It's extensible and flexible. (Granted, this works better in a sans or monoline face, but hey, you can't have everything.)
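As a rough sketch of that approach in OpenType feature syntax (the glyph and class names here are hypothetical, not from any particular font):

```fea
# Hypothetical names: T.pre has a longer right arm; f.short is a short-f variant
@tallAfterT = [b h k l];

feature calt {
    # Swap in the extended-arm T before any tall follower
    sub T' @tallAfterT by T.pre;
    # In a run of f's, shorten every f that is followed by another f,
    # so "defffinitely" comes out short-short-tall
    sub f' f by f.short;
} calt;
```

One class and two contextual rules replace a whole drawer of /T_b, /T_l, /T_k ligature glyphs, which is the extensibility being claimed above.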
But any tracking applied to a monospace font with double-width ligatures will throw off the overall monowidth consistency, right?
Not really. In InDesign, the Ligatures feature is designed to be disabled beyond a narrow tracking range, for obvious reasons. This is not a stipulation of the feature, just a convention.
Nick — That wouldn’t apply in this case, since presumably the only way that one of the precomposed ligatures that we’re talking about (the legacy fi and fl chars) would be in the text stream is because it was hard-coded to begin with. So, there would be no {liga} feature in effect to fail when tracked.
One could put in dummy fi and fl glyphs and then add a {ccmp} to decompose them when encountered as encoded in the text.
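In feature syntax, that decomposition is a one-liner per ligature (assuming the standard fi and fl glyph names for U+FB01 and U+FB02):

```fea
feature ccmp {
    # Decompose hard-coded legacy ligatures back to plain letters
    sub fi by f i;
    sub fl by f l;
} ccmp;
```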
But, why bother with all this for a monospaced font, really? Couldn’t anyone who is pulling up legacy text with hard-coded ligatures and reformatting it with a new monospaced font be expected to run a quick search-and-replace routine to update the text to remove the legacy chars?
@Kent Lew: ... presumably the only way that one of the precomposed ligatures that we’re talking about (the legacy fi and fl chars) would be in the text stream is because it was hard-coded to begin with.
In fact, InDesign will substitute the precomposed fi and fl ligatures for f i and f l, even in the absence of a {liga} feature.
Also, some older applications would substitute f-ligs if the glyphs existed. It wasn't so long ago that no applications supported the liga feature. I'm not sure it's such a big issue these days, but who knows how many applications have hard-coded f-ligature substitution?
The problem with these workarounds is that they just perpetuate legacy f-ligs in text which could easily be fixed with a search/replace.
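For what it's worth, that cleanup is trivial in any scripting language; in Python, for instance, Unicode compatibility normalization (NFKC) already decomposes U+FB01 and U+FB02 (a generic sketch, not anything specific from this thread):

```python
import unicodedata

# Legacy precomposed ligatures: U+FB01 (fi) and U+FB02 (fl)
text = "\ufb01le \ufb02oat"

# NFKC compatibility normalization decomposes them to plain f+i / f+l
fixed = unicodedata.normalize("NFKC", text)
print(fixed)  # file float
```

Note that NFKC also folds other compatibility characters (fullwidth forms, superscripts, etc.), so a targeted replace of just U+FB01/U+FB02 is safer if the text must otherwise stay untouched.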
One could handle double-width ligatures like one has to handle combining marks in monospaced fonts. The combining marks must be normal width for the monospaced requirement, then a default feature can change their advance width to zero. For double-width ligatures, the default feature would double their advance width.
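A sketch of that advance-width trick in feature syntax, assuming a 600-unit monospace advance (the glyph names and the choice of feature are illustrative; GPOS single adjustment is the mechanism, and the value record order is xPlacement, yPlacement, xAdvance, yAdvance):

```fea
@marks = [gravecomb acutecomb];

# kern is used here only because it is a default-on GPOS feature
feature kern {
    # Pull each full-width combining mark back over its base and zero its advance
    pos @marks <-600 0 -600 0>;
    # Give a double-width ligature twice the standard advance
    pos f_i <0 0 600 0>;
} kern;
```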
I think it goes back to the earliest version of InDesign, when there were not yet many OT fonts but lots of older fonts containing fi and fl. It was probably intended as a way to improve typography with older fonts.
Yeah, I remember that Quark and InDesign both used to do that with PostScript Type 1 fonts, actually, pre-OT.
In fact, I remember that Cherie Cone once shared with me a utility that ran on the FOND resource in order to get the fi and fl glyphs to be recognized by Quark’s auto-ligaturing. As I recall, sometimes they worked fine with default FL generation, but sometimes they didn’t and you had to run the FOND modifier.
But I didn’t realize that behavior was still present. Good to know.
Only downside would be with ancient Mac files using the hardcoded ligature. I doubt many people use it today. But we might be surprised?
œ and æ are quite likely to look unsightly/ridiculous in a monowidth font.
@Kent Lew: One could put in dummy fi and fl glyphs and then add a {ccmp} to decompose them when encountered as encoded in the text. ... Couldn't anyone who is pulling up legacy text with hard-coded ligatures ... be expected to run a quick search-and-replace routine ...?
Ooh. Interesting thought.
@Mark: The combining marks must be normal width for the monospaced requirement, then a default feature can change their advance width to zero. ... For double-width ligatures, the default feature would double their advance width.
Ah, I hadn’t paid attention to that. Thanks for pointing that out, Mark.