Hrant H. Papazian

Specialist in Armenian typeface design. Interested in multi-script typography, readability and notan... and everything else.

About

Username: Hrant H. Papazian
Joined:
Visits: 2,592
Last Active:
Roles: Member
Points: 449
Posts: 637
  • Re: Efficiency in kerning pairs

    I have to be a bit careful saying this, but... I wonder if there is a danger in coming to think of the number of kern pairs as a proxy for font quality. And I wonder if it becomes a pride issue - a way of showing how seriously you are taking things.
    Don't be too careful; it's very much a latter-day problem in the field. But equally problematic is the cavalier dismissiveness towards robust kerning, especially when one spends so much time determining sidebearings...

    How many "kern pairs" did some of the most famous letterpress types have?
    Their kerning was limited to over-hanging bits and filed-down sides (and ligatures, in a way), but I agree with Ray: that's not very relevant.
  • Re: Efficiency in kerning pairs

    Well, one issue is technical limits. I remember they were pretty severe during the production of Ernestine, a font with many glyphs. But from what I understand, things have improved since then?

    The more interesting issue is which pairs are worth kerning at all. Here I recommend kerning as much as you have the time and stamina for. Even something like "¿ß" can happen in text, but it might not be worth it. This makes for very fuzzy decisions...

    Probably the best way to limit things is to decide on a threshold below which you don't kern. Mine is 5 units in an em of 1000; in comparison, I use a threshold of 2 when determining sidebearings.
    Do very careful spacing of all glyphs first.
    But ideally while keeping in mind how kerning will eventually kick in; for example, I let glyphs touch (although rarely). This is because environments that don't care enough to enable kerning probably look worse in more important ways...
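
    (A minimal sketch of that kind of threshold filter, for illustration only. The glyph pairs and kern values below are hypothetical; only the 5-units-per-1000-em cutoff comes from the post above.)

        UPM = 1000                          # units per em of the font
        KERN_THRESHOLD = 5 * UPM // 1000    # the 5-units-per-1000 cutoff, scaled to this UPM

        # Hypothetical kern values in font units, keyed by glyph-name pair.
        kern_pairs = {
            ("A", "V"): -72,
            ("T", "o"): -48,
            ("r", "n"): -3,                             # below threshold: dropped
            ("questiondown", "germandbls"): -4,         # the "¿ß" case: dropped
        }

        # Keep only the pairs whose adjustment meets the threshold.
        kept = {pair: value for pair, value in kern_pairs.items()
                if abs(value) >= KERN_THRESHOLD}

        print(f"kept {len(kept)} of {len(kern_pairs)} pairs")   # -> kept 2 of 4 pairs

    (Running the same filter with a threshold of 2 would mirror the sidebearing rule mentioned above.)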
  • Re: [OTVar] Axes Proposals: variationsguide.typenetwork.com

    It seems to me that he considers optical-size variation to be a treatment rather than a style.
    I think that makes sense because optical compensation is intended to be subvisible: not consciously noticeable by the reader. That said, this should not prevent typographers from intentionally using the "wrong" optical size.
  • Re: Freitag — toying around with a geometric display sans

    The "Q" is disrupting the texture from afar.
  • Re: Brain Sees Words As Pictures

    You'll need to define which version of “empiricism” you mean before anybody can disagree.
    But you hit Disagree anyway?  :-)
    To be fair, though, I'm actually not sure what that last sentence means. Few people (and nobody in this thread, AFAIK) consider empiricism pointless. But I for one believe it's only half the picture. A favorite quote from Paul Klee: "Where intuition is combined with exact research it speeds up the process of research." I would go further and say that intuition can actually guide research (even though research might in fact end up countering it). Einstein instinctively felt he was right before he could formally prove it. The people who have doubted the parallel-letterwise model might not be Einsteins, but just because they don't have as much empirical backing doesn't make their intuition wrong.

    Yes, it's very easy to fool oneself in the absence of empiricism... but it's not much harder to do the same by leaning too heavily on empiricism. And now we're in a position where some notable empirical research (in fact since 2009, apparently) supports the existence of boumas, or at least whole-word reading. So unless one side can formally disprove the other side's empirical findings, intuition becomes the "tie-breaker" in terms of what one believes.

    I would say the necessity of taking both empiricism and intuition seriously parallels the necessity of taking both parallel-letterwise and bouma reading seriously.