Fonts for augmented reality

Vasil Stanev Posts: 573
edited September 5 in Type Business
Hello,
I believe AR is going to be the next big thing once the technology gets perfected, and early adopters will be able to cash in handsomely. There is already a contact lens that supports rudimentary AR, and I think this is the screenless direction things will develop in over the new decade. Does anybody have experience in the field concerning fonts?
https://www.mojo.vision/mojo-lens

For now it is, of course, hardly impressive.

Comments

  • Just remember, you can totally read moving text... as long as it's the same words over and over again.  :->
  • Vasil Stanev Posts: 573
    edited September 5
    As a byproduct of the headset testing, I discovered that my eyes have totally different focal depths.
    Is that dangerous?
  • @Gen Ramírez did his thesis project on this at KBK
  • John Hudson Posts: 1,955
    Is that dangerous?
    Not so far.  :)

    One eye works for close range things, and one for distant things. Well, perhaps I should say worked, as neither is as good as it used to be. I feel I should probably have glasses for driving (I can't read signs until I'm fairly close to them), but when tested I'm told that I don't need glasses.
  • Craig Eliason Posts: 1,033
    People pay good money for contact lenses that make their eyes work like yours, John!
  • Indeed, some Lasik procedures deliberately adjust the eyes in that way (my sister being a case in point).
  • What does augmented reality font rendering actually mean?
  • (Some threads/people are allowed to go off-topic...)
  • John Hudson Posts: 1,955
    What does augmented reality font rendering actually mean?
    Display of fonts as part of graphical content overlaid on the physical world using headsets which project that content to the eyes of the viewer using lenses. [Augmented reality (AR) projects an overlay of digital content on the physical world; virtual reality (VR) projects a fully digital non-physical world. Both use the same kind of headset and lenses.]

    Rendering of fonts in AR involves some of the same kind of display techniques used in traditional screen rendering—rasterisation of outlines to pixels, antialiasing—but also needs to take into account the background and foreground of text in the 3D environment, slanting of text in perspective when not viewed directly, movement, and, at the device level, the kind of lenses used in the headset.
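    To make the perspective point concrete, here is a minimal sketch (a hypothetical pinhole-camera model with made-up numbers, not any real AR pipeline) of how the apparent size of a glyph falls off with depth, which is part of why AR rendering can't assume a fixed ppem:

```python
# Minimal pinhole-camera sketch (hypothetical, made-up numbers): projecting
# points on a text quad into screen space, to show why glyph size depends
# on depth in AR and why perspective distorts shapes seen at an angle.

def project(point, focal_length=1.0):
    """Perspective-project a 3D point (x, y, z) onto the image plane."""
    x, y, z = point
    if z <= 0:
        raise ValueError("point must be in front of the camera")
    return (focal_length * x / z, focal_length * y / z)

# The same 1-unit-tall glyph edge at two depths:
near_height = project((0.0, 1.0, 2.0))[1] - project((0.0, 0.0, 2.0))[1]
far_height = project((0.0, 1.0, 8.0))[1] - project((0.0, 0.0, 8.0))[1]
# near_height = 0.5, far_height = 0.125: four times the distance, a quarter
# of the apparent size, so the rasteriser's effective ppem changes
# continuously as the viewer moves.
```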
  • What does augmented reality font rendering actually mean?
    Display of fonts as part of graphical content overlaid on the physical world […]
    So it's similar to the theoretical 3D-rendering problem of rendering text embedded in a 3D model. Standard hinting methods would probably not work properly under arbitrary perspective transformations, because each transformation introduces extrema that are not marked in the outline. It would likely be complicated, so FreeType, the TD renderer, etc. are unlikely to introduce font hinting for 3D rendering any time soon. Also, as in any anti-aliased rendering, the anti-aliasing should definitely be gamma-correct, that is, blend in linear light. For instance, halfway between 0 and 255 (on a scale of 0 to 255) is not 128 but 188, because the 0–255 sRGB scale is not linear and must be converted to linear and back with the sRGB formulas. With lens problems, though, it might be necessary to use more precise formulas for each specific unit, likely generated with a calibration tool. There is also the question of how the gasp-table threshold for disabling anti-aliasing should apply in perspective rendering.
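    For anyone who wants to check the 188 figure: a short sketch of linear-light blending using the standard sRGB transfer functions (a real AR device might need measured per-unit curves instead, as noted above):

```python
# Sketch of gamma-correct (linear-light) blending with the standard sRGB
# transfer functions, reproducing the "halfway is 188, not 128" point.

def srgb_to_linear(c):
    """Normalised sRGB value (0..1) to linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Linear light (0..1) back to sRGB."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def blend(a8, b8, coverage):
    """Blend two 8-bit sRGB values by antialiasing coverage, in linear light."""
    lin = ((1 - coverage) * srgb_to_linear(a8 / 255)
           + coverage * srgb_to_linear(b8 / 255))
    return round(255 * linear_to_srgb(lin))

half = blend(0, 255, 0.5)   # 188, not the naive 128
```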
  • Ray Larabie Posts: 1,015
    AR seems like a good place for variable fonts. There's a door in front of you overlaid with a sign rendered in an appropriate typeface. Back away from the door and the optical axis gradually changes.
  • AR seems like a good place for variable fonts. There's a door in front of you overlaid with a sign rendered in an appropriate typeface. Back away from the door and the optical axis gradually changes.
    Although that should apply to any screen typography.
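    As a sketch of that idea (all numbers are made up for illustration, and "opsz" is assumed to be the font's registered optical size axis): the renderer could derive the axis value from viewer distance, so the instance interpolates smoothly as you approach rather than jumping between styles.

```python
# Sketch (made-up constants): mapping viewer distance to a variable-font
# optical size axis value, clamped to an assumed axis range.

def distance_to_opsz(distance_m, ppem_at_1m=72.0, min_opsz=8.0, max_opsz=144.0):
    """Apparent size falls off inversely with distance; clamp to the axis range."""
    apparent_ppem = ppem_at_1m / max(distance_m, 0.01)
    return max(min_opsz, min(max_opsz, apparent_ppem))

close = distance_to_opsz(0.5)   # 144.0 (clamped at the axis maximum)
arm = distance_to_opsz(1.0)     # 72.0
far = distance_to_opsz(6.0)     # 12.0
```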
  • Ray Larabie Posts: 1,015
    edited September 7
    Although that should apply to any screen typography.
    One difference between AR and screen typography is that pops would be distracting. You know how a letterform can change at a certain size? In a variable font, a Q might have a tail that crosses through the counter and then, at a certain optical size, swap to a form where the tail doesn't go into the counter. In screen typography that would be fine. But if you're walking, you might be distracted by Qs popping, a $ going from one stroke to two, ligatures appearing. Changes in metrics could be problematic. You're at the airport, headed to your gate, and

    RECEPTION
    COUNTER

    suddenly snaps to

    RECEPTION COUNTER
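    One way to damp such pops, sketched below with made-up ppem thresholds: hysteresis. A discrete swap (the Q tail, the $ strokes, a re-break of a line) fires at one threshold going up and a lower one going down, so jitter near the boundary while the viewer walks doesn't toggle it back and forth.

```python
# Hysteresis sketch for discrete glyph substitutions (made-up thresholds):
# swap to the large-size form at/above swap_up, swap back only at/below
# swap_down, so small wobbles in apparent size cause no extra swaps.

class SwapWithHysteresis:
    def __init__(self, swap_up=24.0, swap_down=20.0):
        self.swap_up = swap_up
        self.swap_down = swap_down
        self.large_form = False

    def update(self, ppem):
        if not self.large_form and ppem >= self.swap_up:
            self.large_form = True
        elif self.large_form and ppem <= self.swap_down:
            self.large_form = False
        return self.large_form

s = SwapWithHysteresis()
forms = [s.update(p) for p in (22, 25, 23, 21, 19)]
# [False, True, True, True, False]: the wobble between 21 and 23 ppem
# causes no swap; only the decisive drop below 20 switches back.
```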
  • Piotr Grochowski Posts: 76
    edited September 7
    One difference between AR and screen typography is that pops would be distracting. […]
    There are many such possibilities in theory. Would that mean a book would have its words move between pages randomly? In this case it would be better to use a scalable rendering approach where each glyph's position is rounded to integer individually, rather than rounding the advance widths to integer. And what would happen when the text outlines are also a physical object? That could lead to a mismatch between the rendering engine and the physics engine. It could also be interesting to see the ppem glyph in RasterInfo, which, when TrueType hinting is supported, would be a way of measuring inverse distance to the camera.
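    The two rounding strategies can be contrasted in a few lines (the fractional advance widths below are made up for illustration):

```python
# (a) Round each glyph's accumulated position: total line length stays
# close to the real-valued sum.
# (b) Round each advance before accumulating: the error compounds.

advances = [7.3, 7.3, 7.3, 7.3]   # fractional advances in pixels

pos = 0.0
positions_a = []
for a in advances:
    positions_a.append(round(pos))
    pos += a

pos = 0
positions_b = []
for a in advances:
    positions_b.append(pos)
    pos += round(a)

# positions_a == [0, 7, 15, 22]: tracks the true positions 0, 7.3, 14.6, 21.9
# positions_b == [0, 7, 14, 21]: drift grows with every glyph, which matters
# even more when AR text is continually re-laid-out as the viewer moves.
```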
  • Niteesh Yadav also has some research on fonts for AR/VR. I shouldn't talk in his stead, but I've never known him to be less than willing to speak, when approached.
  • Thanks Sérgio for pointing me to this.

    Hey Vasil, thanks for opening up the discussion. Mojo is in a good state in terms of testing and FDA approval; however, it will take time to get to a consumer state, and further still to get to a stage where manufacturers pay attention to reading and fonts.

    However, right now AR headsets are getting better, and we are fast approaching high-density displays. The type industry should focus on this direction, but I have not seen much traction there. Also, for developers, text is the lowest priority at the moment when it comes to allocating processing power for rendering.

    If you're interested in learning more about typography in AR, have a look at the articles from my ongoing research here: https://niteeshyadav.com/research/ They cover it from both a type-design and a tech perspective.

    Also, you can refer to my TypeTech video where I have covered the topic from multiple perspectives. 
    Warning: it is a long, in-depth overview! My talk was supposed to be half an hour, but because of COVID the in-person conference was cancelled, so I decided to change and expand the content for better understanding.

    AR seems like a good place for variable fonts. There's a door in front of you overlaid with a sign rendered in an appropriate typeface. Back away from the door and the optical axis gradually changes.


    Though that would be ideal, the tech doesn't allow for it yet, for the reasons I mentioned above. Please refer to the article shared below to understand the rendering.

    What does augmented reality font rendering actually mean?

    If you want to understand font rendering in AR have a look at this article 
    The current state of the text in Augmented Reality

  • Here are the time-stamps for various topics related to Typography in AR

    01:30 Introduction
    01:31 Difference between AR/VR
    05:34 Typographic structure and basics
    08:33 Need for AR-specific typefaces
    10:08 New challenges in AR/VR
    10:45 AR/VR display types
    14:38 Tech design challenges
    26:05 Current scenario
    27:10 Typography in AR (classification)


  • Also, you can refer to my TypeTech video where I have covered the topic from multiple perspectives.
    multiple perspectives

    Pun intended?

  • Pun intended?
    I would say it is a literal one, since the talk content was put together with type design, tech development and interface design in mind.

  • As in, when text embedded in a 3D object can be rendered in multiple perspectives?

  • As in, when text embedded in a 3D object can be rendered in multiple perspectives?
    In AR, text gets refreshed every moment; even a slight movement (~1 mm) results in a different perspective from the renderer's point of view. Please read about resolution, refresh rate and latency in this article: https://niteeshyadav.com/blog/variables-that-affect-the-experience-in-ar-8618/