The future of type

Mark Simonson
Mark Simonson Posts: 1,717
edited August 2016 in Technique and Theory
Like many of you here, I imagine, I think a lot about the historical development of type in terms of its forms and how it is made.

Metal type reigned for about 520 years, produced by hand for the first roughly 430 years, and then with machines for about 90 years. (I'm ignoring some overlap here and below.)

Phototype arose and became dominant within a span of about 50 years, all of it produced mechanically, with master artwork made by hand.

It was then replaced with digital type, which has now been around for about 50 years, dominant for about 30. All of it is produced on computers by human operators at the design and mastering stage.

What, if anything, comes after digital type? It seems to me that digital type is going be around a lot longer than phototype. (You could argue that digital type is really a form of phototype, especially in its early implementations. Nowadays, it doesn't seem to have much to do with photography, except perhaps for print, and even that is a stretch.)

It's hard for me to imagine anything as different as phototype was from metal type, or as digital type was from phototype. In other words, a new form beyond digital. The only developments I can envision have to do with design and production, which I believe will continue to become more automated, probably way beyond what we can imagine now. But it's hard to imagine the form of type changing again any time soon.

But I'm 60 now, and my life experience perhaps limits what I can imagine about the future. I was already an adult when the idea of owning your own computer was a novelty, and was in my early thirties when it became possible to use them to make fonts.

I would love to hear other peoples' perspectives on this.

Comments

  • Chris Lozos
    Chris Lozos Posts: 1,458
    At 72, my perspective is not that much different, but enough so that in the craft years of type, when enough skill was required to use type, the dabblers were kept far enough away to make it a reliable living for the skilled. My fear is that the "skilled" will be out of a job and the "dabblers" will become the mainstream users [already happening]. As for the next new technology? I am not a future reader, but I assume that something will come along. My guess would be some connection between robotics and biotechnics. At some point, we will be able to "think text" and have it delivered in a stream to whomever we choose [yes, this will be a nightmare, but so are e-mail spam and robocalling]. Instead of the sender being the one defining how the message displays in the receiver's mind, the recipient will define what messages will "perceive" like [look, or "seeing," may not even be the proper sense here]. Some recipients may do a decent job of it, but others will do the future's version of all-caps Curlz in 3D with rainbow colors. This will not be a public item, so it really won't matter much what clutters the individual "reader's" mind. I guess what I am saying is that at some point "Type is Dead," except for uses like calligraphy today. There will be museums and anthropologists' musings to refer to, but no broad-use commerce.
  • For me, the future of digital type (and its design) lies a bit in its past. Early explorations of computer-aided type design and digital typefaces (formats, technology, etc.) led to very interesting results - METAFONT, FontChameleon, MultipleMaster, etc. - that were briefly forgotten but are currently being revived - Prototypo*, Metapolator*, MutatorMath*, Superpolator*, etc. - and I now tend to envision them as its future. For me, parametric, on-demand, end-user-controlled fonts are imminent - a not-so-bright future (for us type designers) that may strongly conflict with good taste and aesthetics, but be favorable to users' needs. I also presume/predict that the type designer's job will be transformed into designing smart parametric font templates, with the "occasional oddity" of designing a family the way we do it today reserved for nostalgic reasons (or prestige).

    ---
    Note (*) that I am fully aware that I am giving examples using highly specialized type design software. But I see the possibility that one day the engine/math behind any of those cited (or a mix of those technologies) may have the potential to become "the thing" that MultipleMaster once was, and that "thing" may grow into a standard format for representing digital type (especially for the web).
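    To make the parametric/interpolation idea concrete: the core trick behind MultipleMaster (and, in spirit, MutatorMath and Superpolator) is blending point-compatible master outlines. A minimal sketch, with hypothetical names and toy coordinate data, not taken from any of the tools mentioned:

```python
# Minimal sketch of master-based interpolation, the core idea behind
# MultipleMaster-style parametric fonts: a glyph outline is a list of
# (x, y) points, and an instance is a weighted blend of two compatible
# masters. All names and coordinates below are illustrative only.

def interpolate(light, bold, t):
    """Blend two point-compatible outlines; t=0 gives light, t=1 gives bold."""
    if len(light) != len(bold):
        raise ValueError("masters must have the same point structure")
    return [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
            for (x0, y0), (x1, y1) in zip(light, bold)]

# Two toy "masters" for a vertical stem, as point lists:
light_stem = [(100, 0), (140, 0), (140, 700), (100, 700)]
bold_stem  = [(80, 0),  (200, 0), (200, 700), (80, 700)]

# A semibold instance, 70% of the way toward the bold master:
semibold = interpolate(light_stem, bold_stem, 0.7)
```

    An end-user-controlled font would simply expose `t` (and more axes like it) as a slider; the designer's job shifts to making the masters, and the arithmetic does the rest.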
  • Mark Simonson
    Mark Simonson Posts: 1,717
    Parametric, sure. That may yet catch on. But it's still digital, just a different way to encode things. 

    Type has gotten increasingly more abstract, going from a three-dimensional object made of metal (or wood), to an image on a photo negative, to numbers stored in a computer. Maybe it can't get any more abstract than that.

    It may be that digital is the final form of type. And that's kind of what I'm getting at. 
  • Chris Lozos
    Chris Lozos Posts: 1,458
    Digital implies an output device akin to our present-day screens. I doubt that this will hold for more than 25 years. There will be something else; don't ask me what. The technology to do it will not come from type people. It will probably come from some field with a big need, like medical technology, energy conservation, global warming...
  • Mark Simonson
    Mark Simonson Posts: 1,717
    edited August 2016
    I don't think digital implies anything about what kind of output device, except that there always is one for visible things like type, even if it's bypassing the eyes and going directly to the visual cortex. That would still presumably take some kind of digital form. Assuming it is something that people would still call type.

    Keep in mind that what we call type might not have been recognized as type by people a hundred or more years ago. They would see what we have now (a computer or device drawing letters extremely rapidly), but no type in the form they knew. Phototype was not considered type by its proponents at first. We call it type because the forms look similar and it performs the same role.

    So perhaps there might be some post-digital form of type that we (today) would not recognize as type.
  • 15 Years? Hopefully by then people will have finally accepted Windows 10 as being something of value. Maybe not.
  • If we look back at the past of digital type, we've already had at least two generations — bitmap and outline/vector. And if we look at contemporary cutting-edge font technology, what stands out to me is emoji (especially color elements) and animations (see Apple's developments in adding animations in messaging apps in the next version of macOS & iOS). Along with this, of course, we have ridiculous increases in CPU/GPU power, leading to faster rendering.

    So my prediction is that type, having moved from metal to photo to static bitmap to static (mathematical) outline, becomes truly dynamic and changeable in realtime. Imagine the kinetic typography of the best film credits becoming the norm. Of course, this happens because the tools for both font design and text design (possibly just CSS) are improved to support this kinetic typography.

    I'm not claiming this will be pretty, of course. The future never is.
  • Sure, but that feels incremental and not revelatory. And beyond that, those effects have been achievable before now without being foisted onto type. You could say that their inclusion in the type container is somewhere between a convenience and a kludge.
  • James Puckett
    James Puckett Posts: 1,987
    I think there’s a future for finding ways to use fonts for things other than type. Apple opened this door with SBIX and the notion that glyphs can be containers for just about anything. People will find all sorts of clever hacks for this technology that have nothing to do with letters.
  • The eroded significance of font vs. typeface seems to be an important distinction to make in this context, though. The file format feels like an implementation detail, not the most interesting or key part of what comprises "type".
  • Nick Shinn
    Nick Shinn Posts: 2,181
    I predict that fonts will migrate to become part of layout applications, as parametric (metafont) entities. Eventually.

    This will shift the focus of interest in typography further away from the reductive incunabula archetype (one-glyph-per-character, static, black on white).

    For instance, I am presently working on a “bounced” concept, in which contextual alternates of a typeface are displaced vertically, and rotated.

    This is rather a kludge of the OpenType format, but it works.
    It’s not really a typeface per se, more a style like monospace, as the effect can be applied to any typeface.
    So, it is an effect that could be executed as a “filter” in a layout application.
    This would require some kind of spacing algorithm, though, to produce pleasantly smooth typography, rather than a jazzy mess (although that would not be without some merit).

    A similar kind of typography design could be invented for distress.
    Or serifizing a sans. Motion and (split-) color, too, in the manner of Letterror’s Federal and Hoefler’s Obsidian. And so on.

    The aesthetic hurdle which such metafont typography faces is huge—look at how little progress has been made in 25 years in producing satisfactory faux bold, slanted and orthogonally scaled styles.

  • Stephen Coles
    Stephen Coles Posts: 1,006
    I wish it were possible to Like and Disagree with something at the same time. Ray’s scenario is an entertaining read and feels plausible, but 15 years is an extremely aggressive timeline.
  • To your point, Mark: PostScript outlines that define letter shapes require the least amount of information to render the letter. Given that, I can only see optimizations to that core concept, and as emoji were mentioned earlier in the thread, fonts will only get more compressed, like .mp4s, to economize on file size for faster delivery/rendering.

    Ultimately, I'd think that until we hit the next stride in technology, some outline data, be it binary or quantum, will likely be the future of fonts. I also suspect that whatever this technology is, since we've already made fonts digitally, it will be the equivalent of a future MAME emulator bringing them forward, versus scanning printed samples or photo filmstrips and manually tracing them.

    You can rest assured, in most cases provenance will fall by the wayside and history may not record us as accurately.
  • I think the proverbial Man from Mars might see all of the incarnations of typography as just mechanized writing systems of one sort or another. The gulfs between them are minuscule compared to the difference between written and oral transmissions of language. Similarly, the difference between works printed on paper and those presented on screens are merely differences between modes of storage and retrieval. The big ideas are the alphabets, and they have changed little if at all. We fully recognize writing from two or even three thousand years ago as perfectly useful and comprehensible. We use many of the same letterforms today, unaltered. Digital drawing is still drawing, as much made by hand and eye as any previous method. We've added some flexibility and range, but these were always available to people with chisels and pens—if they were willing to spend the time on them. 

    It's hard to think of anything more historically conscious than type. I disagree with Mark about the radical break from phototype to digital type. Both began with precisely the same goal: to reproduce the most popular designs that existed in metal type. In regard to the finer points of type design, I think the greater rift occurred with the advent of mechanical punch cutting and matrix engraving. And even then, one might say that, by the era of Benton, the change had already happened with the development and use of electrotyping, about fifty years earlier. What Benton added that I think was very important—and damaging—was the use of the pantograph. We're still suffering with the notion of one-size-fits-all type, despite the efforts of a few.

    Chris, do you really think people will completely forgo the allure of the custom-made or of the celebrated designer who sets a new fashion, which may itself be a revival of something old? It's not chiefly a matter of snobbery or even about the desire to stand out, though it is that to a degree, but rather about the need for something to suit a specific purpose, a need that hasn't been met already or exactly. That need will always be there.

    Do you remember the talk when "desktop publishing" came into existence in the 1980s? It was supposed to mark the death of all print craftsmanship. It surely meant death for some, but it was also democratizing, and it opened the doors for a significant number of people who were able to adapt it to excellent work. Companies like Adobe, which had the vision to understand that it wasn't about engineering alone, hired real artists. Look at old magazines and newspapers—or screen grabs of early websites—and compare them to those of today. It's an incredible improvement.
  • Mark Simonson
    Mark Simonson Posts: 1,717
    edited August 2016
    It may be splitting hairs to say this, but creating a typeface in a modern font editor (going back to Fontographer and probably earlier) isn't really drawing. Instead, the computer is doing the drawing, and we specify its parameters by setting control points. (It's possible to draw in some font editors, but most typefaces are not made this way.)

    In the phototype era, drawing was everything. It required skilled drafting to make a good typeface. This was true in the pantograph era as well, but not quite in the same way. (I realize that cutting film was also a common technique in phototype production, but it's a skill very much like drawing.)

    Being able to draw well is a boon for a type designer today, but not essential the way it once was. Today, the critical skill is knowing where to place the control points to get the computer to draw shapes you want. You don't necessarily need to be able to draw it by hand.

    As someone with decent drawing skills, I frequently wish I could toss the mouse aside and just draw the shape I want right on the screen. Technically, there are ways to do this, but not with the subtlety and control of using a pen or pencil on paper. (And going back to paper gives up too many of the advantages of the computer, like undo.)
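    To illustrate the "computer does the drawing" point: in an outline font, the designer places a handful of control points and the rasterizer computes every point on the curve between them. A small sketch of that division of labor, evaluating one cubic Bézier segment with De Casteljau's algorithm (the coordinates are arbitrary illustration values, not from any real font):

```python
# One cubic Bézier segment of the kind used in PostScript/CFF outlines:
# the human supplies four control points; the machine computes the curve.

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bézier at parameter t in [0, 1] via De Casteljau."""
    def lerp(a, b, t):
        # Linear interpolation between two 2D points.
        return ((1 - t) * a[0] + t * b[0], (1 - t) * a[1] + t * b[1])
    a, b, c = lerp(p0, p1, t), lerp(p1, p2, t), lerp(p2, p3, t)
    d, e = lerp(a, b, t), lerp(b, c, t)
    return lerp(d, e, t)

# On-curve endpoints plus two off-curve handles for one arch-like stroke,
# sampled at 11 points along the curve:
curve = [cubic_bezier((0, 0), (50, 200), (250, 200), (300, 0), i / 10)
         for i in range(11)]
```

    The designer's skill lives entirely in choosing `p1` and `p2`; every intermediate point is the machine's interpolation, which is exactly the "backseat driving" feel being described.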
  • Chris Lozos
    Chris Lozos Posts: 1,458
    Chris, do you really think people will completely forgo the allure of the custom-made or of the celebrated designer who sets a new fashion<<

    That is not what I am saying.  I am saying that people will assume they are the best creator of their own experience.
  • Chris Lozos
    Chris Lozos Posts: 1,458
    edited August 2016
     "...creating a typeface in a modern font editor...isn't really drawing"
    I disagree. I don't know how you draw, Mark, but I draw using a mouse with complete freedom and with no underdrawing, sketch, or mask. I just draw. I don't "place control points" like some people do, any more than I did with a pencil 65 years ago. The big boon to me is "undo," compared to whiteout and the lumpy surface from many layers of paint. The bad part is that every curve is smooth even if it is wrong. When every line is perfectly clean too soon, it is too easy to get infatuated with the cleanliness and overlook the flaws in the real form.
    Back when I would go back and forth with black then white paint and two brushes, the edge was soft until I worked it into the "right" curve for the job. It never got close to smooth and sharp until the shape was what it needed to be. After this, I would make a large stat and refine the edges with a French curve and Rapidograph, or a knife on Amberlith. I am not sure that I could draw properly with the computer and mouse now without the 40 years I spent drawing with pencil, brush, pen, knife, etc. That background freed me from the need to draw dumbly and with enslaved ritual (the easy way out), like a computer can trick you into doing.
  • Chris beat me to some of this, but . . . 

    Though I hate to admit it, I'm old enough to remember something about the production of phototype. Yes, there was drawing, but it was drawing with a lot of assistance in the form of photostats, film negatives, rubylith, straight edges, French curves, knife blades, glue, splines—and a lot of white-out and red opaque. It was drawing for photo reproduction, not at all like calligraphy, so it could be built up and drawn over. One usually found more skill among those working for headline machines than among those working on drawings intended for text types. There were some virtuosos, like Ed Benguiat, working for companies such as Photolettering, in New York, but most of the people preparing the final art were re-renderers, tracers, and retouchers. 

    In the pantograph era, a large drawing was turned into a relief brass pattern (created photochemically) and traced by an operator, a technique requiring no drawing skill at all, though a clever person like Fred Goudy could use it to make adjustments on the fly. (Goudy was a great rarity in doing this.) In The Eternal Letter, there's a picture of Bruce Rogers's paste-up for one of the larger sizes of Centaur, in which you can see what a pastiche it was. How much drawing was done in the early era of punch cutting is a matter of some conjecture.

    I would venture to say that there is far more digital type drawn by its creators than was ever the case before. Just because the tools are in some ways easier doesn't mean it isn't a form of drawing.

    Chris, to your point: I don't see much evidence that people are so apt to do things themselves, especially when they are uncertain as to how the results will be judged. I see people wearing designer labels as a kind of pre-approval, no matter how idiotic it may appear.

  • Mark Simonson
    Mark Simonson Posts: 1,717
    edited August 2016
    Yes, of course, there were aids to getting nice, clean final art, like templates, knives, etc. I'm talking about the lines you put down to determine the shapes of the letters.

    When I was doing lettering by hand (and my early attempts at type design), the best way to get the kind of shapes I wanted was to draw on tracing paper with a pencil, and then gradually refine it by retracing the drawing on another sheet. I repeated this as many times as needed until everything looked just the way I wanted it, sometimes redrawing parts and cutting and pasting them together. Yes, I would use a straight edge for straight lines, but curves would always be drawn by hand at this stage.

    Once the drawing was done, I would transfer this to illustration board to do the final inking. (Some people used a knife and rubylith.) Only at this point would I use things like French curves and ellipse templates. The lines I had drawn would dictate which aids to use. Using such aids before the shapes were drawn would produce stiff, mechanical curves. The lines I put down on paper determined the shapes, not templates. Drawing with templates is not drawing. You could maybe use templates earlier in the drawing stage for a very geometric style, or if you were after a mechanical look.

    Like I said, I may be splitting hairs here.

    But working with Bézier curves in a font editor feels nothing like drawing to me, even if the results are similar. It's like the difference between driving and backseat driving.

    (Also, I'm not saying they are not both difficult skills to learn, just not the same skill.)
  • Chris Lozos
    Chris Lozos Posts: 1,458
    Scott, I am not talking about production art, I am talking about creation art or design drawing, the place and time where you define the form, not cleaning it up for camera--that comes last.  That is a production skill and is quite different from a design skill.
    I know plenty of designers who were lousy at production art and plenty of production artists who were bad designers.  I also knew some who were good at both.  I always did my own production art because I found it an extension of the design process and allowed me final control [and one last chance to change my mind ;-)].
  • Chris Lozos
    Chris Lozos Posts: 1,458
    "I don't see much evidence that people are so apt to do things themselves, especially when they are uncertain as to how the results will be judged."

    When creator is the only judge, the game is rigged ;-)
  • Ray Larabie
    Ray Larabie Posts: 1,423
    @Stephen Coles
    It's a conservative timeline. If progress continued at its current rate, I'd give it 25 years. But technological progress is accelerating. Conservatively, I'll say a $1,000 computer matches a human brain by 2026. I think it's more like 2023, but let's say 2026.
    Maybe three years later, a $1,000 computer matches the brain power of all humans on earth, with a few million times the processing speed. Fifteen years from now it'll be 2031. That $1,000 computer is now cheap enough to put in an electronic poster. I'm not talking about artificial human consciousness or anything like that: just raw computing power, local storage, and access to all human and machine knowledge. No far-out sci-fi stuff, just more of what we already have in 2016. Also consider the potential of software that can self-test and generate its own bug reports... fixing centuries of bugs in seconds. The innovation gap will close up very quickly.

    I think this is how the rest of the story of type design will unfold. I think the evolution of automated type design will develop from targeted advertising. At first, design will dynamically change according to consumer preference. The layout engine will choose an existing, licensed typeface. It will create illustrations and modify and translate ad copy in order to generate an image. This will happen before 2020. The ads will look synthetic. Some consumers will be able to tell the difference between an ad designed by humans and a synthetic one. But advertisers will be intrigued, especially by the low cost, and the ads will improve rapidly based on instant performance feedback. There will be no need to call a meeting to talk about how an ad is performing. The race will be on to create better, more convincing targeted ads. By 2022, the ads will have been improving. The layout, illustration, and typography will seem more creatively designed. Illustrations that were once simple photo filters and clip art will be able to mimic specific styles. A customer who loves Duran Duran may get an illustration in Nagel's style with a layout reminiscent of the Rio album cover. A Garfield enthusiast will get a cartoon illustration paired with Cooper Black. But font selection isn't enough. To make these ads stand out, the software will need to learn how to do lettering. A human graphic designer modifies type and creates lettering; the software will need to do that too. First it's just kiss kerning. Then it's modifying ascenders and descenders. Then it's setting type on paths and playfully filling counters. It will improve week by week. It will imitate 1990s grunge typography, art nouveau, 2010s retro. At first these will look like crude parodies, but they'll improve quickly.

    Meanwhile, this software will be generating ads in every language. Fully parametric type won't be required. It'll be possible to keep using existing fonts and keep modifying them until the end of time. But the lack of font licensing fees will make automated type design very appealing. By that time, the pieces required to create automated type design will already exist; then it's just a matter of putting them together. I think that's precisely where the drive for automated type design will come from. The ultimate type design tool won't be driven by type designers. It won't come from Glyphs or FontLab. It'll be driven by targeted advertising. If you think this scenario is off-the-wall, think about Monotype's recent acquisition. Are they nuts? No. I think they know what's coming around the corner. On the upside, we'll always need poodle groomers.
  • Mark Simonson
    Mark Simonson Posts: 1,717
    edited August 2016
    So, you're saying it'll still be digital for a while then?
  • Dave Crossland
    Dave Crossland Posts: 1,428
    edited August 2016
    Sad to see Comic Sans cut from the crop!

    https://www.youtube.com/watch?v=eTI6NBanGkA
  • Nothing will change very much, because nothing has changed very much. Art is a constant; design aspires to art. Modes of production become more automated and society will become more automated – because we're fundamentally lazy. But that's about it.
  • [Deleted User]
    [Deleted User] Posts: 0
    edited August 2016
    The user and all related content has been deleted.
  • Chris Lozos
    Chris Lozos Posts: 1,458
    I see that course is not as Latin centric as many others.