Hey everyone!
TypeThursday just published an interview with script typeface designer Laura Worthington.
We discussed how she became a typeface designer, the rise of the casual user market and design considerations for this market.
It was a great talk. Well worth your time.
Read it on Medium
She is well known for her scripts, but her body of work is more diverse than most.
At Adobe, we initially used PUA for all non-standard glyphs (in addition to proper OT features), and then stopped. Pretty much nobody cared.
Maybe Laura's audience is different? Could be.
I used to use the PUA in my early OT fonts, but stopped because Adobe recommended against it. I can see the argument for not doing it (basically, it's a hack, plus unpredictable results when switching fonts). On the other hand, when I stopped doing it, it was because I bought the argument that OT support would improve and there would be no need for such a hack in the future. Except that OT support hasn't improved all that much. (The latest version of Mac OS is an exception, but unfortunately rather hidden away.)
Aside from the potential issues with switching fonts, it doesn't seem to be such a terrible idea and has some real immediate benefits for users until OT feature support becomes more common beyond pro apps, if ever.
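For anyone wondering what the hack looks like in practice, here's a minimal sketch with fontTools (the font path is hypothetical, and a real workflow would assign codepoints deliberately rather than in glyph order): it simply maps the otherwise unencoded alternates to Private Use Area codepoints in the cmap so that Character Map-style tools can reach them.

```python
from fontTools.ttLib import TTFont

font = TTFont("MyScript-Regular.ttf")  # hypothetical path

# Collect every glyph that no Unicode cmap subtable currently encodes.
encoded = set()
for table in font["cmap"].tables:
    if table.isUnicode():
        encoded.update(table.cmap.values())
unencoded = [g for g in font.getGlyphOrder()
             if g not in encoded and g != ".notdef"]

# Map each unencoded glyph to a Private Use Area codepoint (U+E000 and up)
# in every Unicode subtable, so non-OT-aware apps can insert it directly.
codepoint = 0xE000
for glyph in unencoded:
    for table in font["cmap"].tables:
        if table.isUnicode():
            table.cmap[codepoint] = glyph
    codepoint += 1

font.save("MyScript-Regular-PUA.ttf")
```

That's the whole trick, and also the whole problem: what ends up stored in the document is those PUA codepoints, not the underlying letters.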
Anyway, reading Laura's interview makes me wonder if following Adobe's recommendation was really worth it.
I would suspect MyFonts and other vendors have some data showing how many users are using the fonts they purchase in MS Office (still very OT-unfriendly as of v2016).
The downside is that as long as type designers use hacks to make up for the failings of software developers then said developers have an excuse to keep failing. This opens us to a slippery slope problem. We might end up with the kind of situation Microsoft put web developers into for years, having to use hacks to develop web pages that were standards-compliant and that also worked with Internet Explorer. We might have to start releasing two sets of fonts in every release—one with PUA encodings for the people who use cheap software, and another for people running Adobe software, and the support headaches that entails.
Using PUA encodings intelligently involves a degree of knowledge on the part of the user that I wouldn't presume for general retail fonts. We use PUA only exceptionally, when our clients are very aware of the issues and have determined that there is no other mechanism that will meet their needs. Typically such clients are also the sort of people who will be creating their own OTL->PUA and PUA->OTL scripting and workflows; again, not the typical purchaser of a retail font license.
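For illustration, the PUA->OTL direction of that scripting can be as simple as the sketch below (the mapping table is hypothetical; a real one would be generated from the font's documentation or its GSUB rules):

```python
# Hypothetical mapping from a font's PUA codepoints back to the characters
# they visually stand in for (swash caps, ligatures, and so on).
PUA_TO_TEXT = {
    0xE000: "A",   # hypothetical swash cap A
    0xE001: "Th",  # hypothetical Th ligature
}

def pua_to_otl(text: str) -> str:
    """Restore searchable, sortable text from PUA-encoded glyph codepoints."""
    return "".join(PUA_TO_TEXT.get(ord(ch), ch) for ch in text)

print(pua_to_otl("\ue001e \ue000rt of Type"))  # -> "The Art of Type"
```

The OTL->PUA direction is the same table inverted, applied just before handing text to a non-OT-aware application.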
My view, which I've held for about the same 15 years to which you allude, is that PUA should properly be reserved to purely non-semantic elements such as ornaments, border parts, etc., or to test case and documentation fonts for unencoded scripts.
Using PUA to access variants of encoded characters, such as swash forms, contextual variants, or ligatures, means that you are corrupting the text even as you are typesetting it. The result may look pretty, but it can't be searched, sorted, indexed, or any of the other things that we should be able to do with digital text. If one is producing only print media, then I suppose the old adage that whatever it takes to get the ink on the page stands, but how often these days are we producing only print media copy? Digital text is text that can be copied, pasted, and exchanged, and tends to find its way into multiple formats. If the text has been corrupted at some stage in order to make it look a particular way, that isn't going to be immediately obvious to the person who cuts and pastes it into a rich text email and sends it off, not realising that it is going to be gibberish to the recipient.
_____
Microsoft began supporting OTL stylistic set features, initially in Publisher, because Geraldine Banes and I sat down with people in the Office team and showed them the Gabriola font and the sort of things it could do. To their credit, they devoted resources to add new feature support to a product that was supposed to be in pre-release lockdown, just because they thought the font was so cool. If we'd said '...alternatively the user can use PUA encodings to access all these glyphs', I doubt very much if they would have gone for the OTL option. If one continues to support bad legacy solutions, there's little impetus for software developers to support better new solutions.
I do not think anyone is suggesting that Microsoft should continue their poor OT support.
Ugh.
No, it isn’t. But if the type industry as a whole embraces PUA then we are making it easier for software developers of all sorts to not bother with OpenType support. And that’s bad for all of our customers.
@Adam Twardoch has made an interesting proposal: To use the TTC ("TrueType Collection") format as a fallback method of accessing swash glyphs (etc) when OpenType fails us.
AIUI, this was invented for CJK fonts a long time ago, and is supported everywhere since MacOS and Windows have shipped TTCs for a long time. Basically it allows you to put a whole font family (say, what would be 4 font files, Regular, Italic, Bold, BoldItalic) into a single file, and where those fonts have duplicate TTF tables, to de-duplicate them.
I think that this is a fantastic idea, because fiddling about with the Character Map or FontBook to access PUA encoded glyphs is a bit tricky, requiring step-by-step guides and videos, whereas "install the font file" and then "see fonts listed in the family menu" is immediately discoverable and quite obvious for naive users.
One caveat is that while Adobe and Apple support "OTC" fonts (OpenType-CFF style collections) they are not supported on Windows.
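For the curious, assembling such a collection is already scriptable today; here's a minimal sketch with fontTools (file names are placeholders), which can de-duplicate identical tables across the member fonts when saving:

```python
from fontTools.ttLib import TTFont, TTCollection

# Hypothetical members: e.g. one font per default-on swash/alternate set,
# so each shows up as its own entry in a plain font menu.
members = ["Script-Regular.ttf", "Script-Swash.ttf", "Script-Alternates.ttf"]

collection = TTCollection()
collection.fonts = [TTFont(path) for path in members]

# shareTables=True de-duplicates tables that are byte-identical across members.
collection.save("Script.ttc", shareTables=True)
```

Each member would presumably need its GSUB defaults and name table set up so that the swash variant really does behave as its own font in the menu.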
I do think things have improved on the Mac. The recent versions of 10.11 have much improved OT support at the system level. And pretty much everything is supported and works, including things like named stylistic sets. Any app that uses the system font APIs can support OT features relatively easily. (This doesn't affect apps that roll their own OT support, such as Adobe apps.) The downside is that OT features are practically hidden in a tiny pop-up menu on the systemwide Font browser. On the other hand, it's the same UI for any app that uses it. So there's not much reason for cheap or free software on the Mac not to have decent OT support.
Even so, I sometimes wonder if users will ever get it. Even among pro users of pro apps, it seems unusual for them to know how to access OT features. I get the impression that most people just use the Glyph palette. I hope it's just because the UI for accessing OT features is so bad. Speaking of which, I wonder what's happening with the Adobe OT UI improvement effort?
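Part of the problem is that there's no obvious way for a user to even see which features a font offers. For the technically inclined, a quick fontTools sketch that lists a font's GSUB feature tags (the font path is a placeholder):

```python
from fontTools.ttLib import TTFont

font = TTFont("MyScript-Regular.ttf")  # hypothetical path

if "GSUB" not in font:
    print("No GSUB table, so no OpenType substitution features.")
else:
    records = font["GSUB"].table.FeatureList.FeatureRecord
    tags = sorted({record.FeatureTag for record in records})
    print("GSUB features:", ", ".join(tags))  # e.g. calt, liga, salt, ss01
```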
That is also my concern. True, software support has not progressed enough, but people's habits are hard to break, both users' and developers'. Perhaps if the software supported it better, students in design schools would learn about it and use it more.
Meanwhile it's a fact that there are a lot of casual users of fonts who do not use OT-aware software and need a way to access hitherto typically unencoded glyphs.
Windows continues to be the most popular desktop operating system, though, so the recent improvements in Mac OS X (and in libre desktops) are sadly mostly irrelevant.
The way this all began is by interacting with customers who, upon purchasing my typefaces, found they couldn't access the very elements that encouraged them to buy the typeface in the first place. And yes, I had put in disclaimers and such so as not to mislead, but it still happened. There were even more potential customers asking if there was a way to use my fonts to their full capabilities without investing the time and expense in professional software. There was overall a lot of disappointment, and I found that this audience and their requests were growing... rapidly. So, I PUA encoded my fonts and put together instructions and videos on how to use them. While this works for my customers, the process still sucks, having to copy and paste glyphs out of Character Map or Font Book and into another program, but at least it works.
One thing I find encouraging about doing this, and the changes I've seen as a result of it, is that this group of people, the crafters/hobbyists/DIYers, are a very tight-knit group capable of pushing change. As a result, a couple of design programs that they use have indeed improved their typography. For example, there's a die-cutting machine and corresponding software out there called "Sure Cuts A Lot" that now has a native glyphs panel as a result. If a small company like that has gotten the hint that their users find typography important and did something about it, I think others may follow suit. Or at least, I hope so.
The crafters/hobbyists/DIY group is a big one that is growing rapidly and impacting the type market in many ways. My estimation is that they contributed to over half of my sales last year, without impacting the sales I received from professional users. I believe they're a force to be reckoned with, and their influence will lead to both good and bad things happening in our industry - most of which I believe will be good and could be a boon to our industry. I have a lot of thoughts on this, which I should save for another thread before I go completely off topic, but I do believe this group's needs should be addressed, and I think that as they're becoming more aware of this situation they could become an agent of change. Maybe that's a Pollyanna attitude to assume, but given what I've seen so far, I don't think so. Even though PUA works for them, they still grumble about it and want something better. And I've noticed that as they become more interested in type and aware of it, they become more sophisticated about it and want more and better options.
As for Dave's suggestions, I'd like to look into them further and learn more before I comment. Ultimately, and I'm sure many of you will agree, the real changes need to happen with software developers and within the OS. How do we go about doing that? Petitions, public outcry, typographic mobs?
Display type shops and small software makers could work together to develop a BSD/MIT-licensed OpenType API that can be easily integrated into commercial Windows software. DirectWrite does the hard work; all people need is a couple of panels to turn stuff on and off.
Note also that there is very little support for CFF flavour TTCs, especially on Windows, so they would need to be built from TTFs, with the usual manufacturing quality issues that this entails. If you've got a solid TTF production workflow, that's fine, but if you're relying on automated conversions from PS sources, you might find making good TTCs a challenge.
And users just going to the glyph palette to insert swashes and alternates seems an awful lot like Linotype typesetters just inserting the uncommon sorts manually from a pi case.
Ah, Progress . . . !
Indeed, HarfBuzz has been around for ages, licensed as you suggest (https://github.com/behdad/harfbuzz/blob/master/COPYING), and it isn't used much.
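And to the earlier point about only needing "a couple of panels to turn stuff on and off", the library side really is that small; here's a minimal sketch with the uharfbuzz Python bindings (font path and text are placeholders) shaping a string with ligatures and a stylistic set enabled:

```python
import uharfbuzz as hb

with open("MyScript-Regular.ttf", "rb") as f:  # hypothetical path
    data = f.read()

face = hb.Face(hb.Blob(data))
font = hb.Font(face)

buf = hb.Buffer()
buf.add_str("The Art of Type")
buf.guess_segment_properties()  # infer script, language, direction

# These feature toggles are exactly what an on/off UI panel would drive.
features = {"liga": True, "calt": True, "ss01": True}
hb.shape(font, buf, features)

for info, pos in zip(buf.glyph_infos, buf.glyph_positions):
    print(info.codepoint, pos.x_advance)  # glyph index and advance width
```

The shaping is the easy part; what's missing in most non-pro Windows apps is the panel on top of it.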
I think these low-sophistication methods are entirely appropriate for unsophisticated casual users.
John, both excellent points. It sounds like Laura's PUA scheme is the best fallback option, then.
It is reliance upon such things by presumably more sophisticated, professional users that strikes me as particularly ironic. Perhaps worse is indeed better.
For instance, the typographer may set the same text in three successive lines, in the three fonts as reference, and in a fourth working line in which the preferred combination is worked out. And perhaps a fifth line as “next best”.
At least, that’s how I’ve done it in the past, pre OT.
If the typographer is not relying on the type designer’s off-the-shelf mixology, but prefers to concoct their own, it’s actually better than OT in InDesign etc., because they don’t have to repeatedly dig down through several layers of GUI to get to the options (stylistic sets), which is hugely annoying.
It seems that any Windows application that uses the standard text stack gets OpenType shaping (such that automatic ligatures, OpenType (GPOS) kerning, etc., work) - but no UI! @Adam Twardoch described this in the GitHub thread on this:
That's not 'the way the Web works'; that's the way bad run integration works. The idea that a CSS span applied to characters automatically creates a distinct run for glyph processing is really stupid.
Then again, as I discussed in my Unicode presentation last year, there are numerous poor decisions around run identification and integration, most of which may only be fixed by downstream processing in which all runs in a line are integrated post-shaping.