Somebody Agrees with Hrant about Black-Letter
Comments
-
@Christian Thalmann There is also a big difference between Jenson and Griffo: in Jenson you are able to record information from three lines together (I mean that while you read a line, some information also comes from the lines above and below), while in Griffo you are restricted to only one line. That's one of the things I find so exciting in Jenson.
-
Frode said: Adding to my hypothesis: Serif types with contrast (thick verticals, thin horizontals) also allow for a more regular stem interval.
-
I don't know. I can only observe that this is true for the typefaces we read most, and it might perhaps contribute to why they work so well.
-
Maybe they work merely acceptably... in spite of it. Because logic (well, the one I use) says stem regularity is a lost opportunity for contrast; it might even be a distraction in the conscious layer (versus immersive reading). Sorry, FvB.
-
But the original (prior to printing) Blackletter was all about stem regularity...
-
Stem regularity (in the case of Roman, not sure about Blackletter) allows for more fluid, uninterrupted reading; isn't that the point? Analogously to how the consistent x-height and baseline help guide the eye, don't regular stems allow it to slide smoothly over the text? Although irregular stems are sometimes a distinctive feature of a character (kind of): think of oh and zero.
-
It's certainly pretty...
The eyes only slide smoothly when tracking a moving object; when reading, they saccade. This was Javal's seminal contribution to the field of reading research. And the only thing we need in order to keep saccading further along the same line is enough leading (which is guaranteed, because returning from the end of a line to the beginning of the next one requires much more leading). Ergo: no features of letterforms guide reading. (This is another reason Chinese readers are not slower.)
-
The way Latin sets, this ends up meaning we have to perform extra saccades, instead of taking advantage of acuity in the vertical dimension.
To be fair, most writing systems have this problem; Chinese (especially when set horizontally) and Hangul are notable exceptions.
My understanding, from conversation with Nadine, is that eye-tracking studies of Chinese readers reveal that saccades are short and frequent, and that saccade length correlates to semantic load, not graphical density, i.e. English readers have longer saccades, and Chinese readers have shorter, but both are taking in about the same amount of semantic content with each fixation.
-
When the writing system has no effect on reading speed, that strongly indicates non-immersive reading, a very low bar. All of Larson's data has this problem too.
-
BTW, saccades in Chinese would (presumably) have to be over four times shorter than in English for the two to be comparable in speed... Also useful would be to compare regressions (noting here that too few regressions are also a sign of non-immersion).
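For what it's worth, here is the rough arithmetic that could lie behind that "over four times" figure. The characters-per-word values below are illustrative assumptions of mine, not numbers taken from this discussion:

```python
# Back-of-envelope arithmetic behind the "over four times shorter" claim.
# Both per-word figures are assumptions for illustration, not thread data.
chars_per_word_english = 5 + 1   # ~5 letters plus the following space (assumption)
chars_per_word_chinese = 1.5     # commonly cited average hanzi per word (assumption)

# If fixation durations are comparable (another assumption), matching reading
# speed means matching *words* per saccade, so saccade lengths measured in
# characters must scale inversely with characters-per-word:
ratio = chars_per_word_english / chars_per_word_chinese
print(f"Chinese saccades would need to be ~{ratio:.0f}x shorter, in characters")
# ~4x, which is at least consistent with the 3 vs 12-15 character saccade
# lengths mentioned later in this thread.
```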
-
Hrant H. Papazian said: When the writing system has no effect on reading speed, that strongly indicates non-immersive reading, a very low bar. All of Larson's data has this problem too.
(The other possibility being, of course, that your theory is wrong.)
-
Just look at the net reading speed. When a subject knows they're being tested*, reading is slow, deliberative; as long as nothing is way off, it just ambles along. When you simply measure how long it took somebody to read something (notably, something they're enjoying), you see that everything matters; this is why people more often report text in some font being more tiring than text in another font. That happened to me personally in the '90s with the Armenian-language edition of AIM Magazine, where after three issues the people who were actually reading the thing (the priests...) complained they had trouble reading it. They weren't sure why. We figured it must have been the typeface, so I made a new one, a revival of a design with a solid readability pedigree (if pretty ugly, TBH). No complaints from then on. Years later, once I grasped immersive reading, I understood the problem with the first attempt: the x-height was too big. In the readability trenches, everything matters.
* "The difference between readability and legibility? Readability: You are sitting in an armchair, a novel by Raymond Chandler in your hands, at your side is a glass of beer and a cheese sandwich. Legibility: You are in a psychology lab, a few lines of nonsense set in 3 mm x-height sanserif type are flashed on a screen, a guy in a white coat comes towards you with pincers and a blinkometer.” — Robin Kinross
Of course I could be wrong. But you can only prove something exists, not that it doesn't, and when any research seems to show boumas can matter* –and due to the fovea's high resolution, they would matter most in the parafovea, with its precipitous drop in lateral acuity– we should try to build our model accordingly. Narrow boumas are better, especially since you can set them larger.
* http://typedrawers.com/discussion/2285/brain-sees-words-as-pictures
-
I noticed people are curious about Hangulatin. I'd like to take the chance to give some background insights about Hangulatin. Thank you for posting it, Hrant!
~ Every syllable (character group) is in itself readable in the Western style - from the top left to the lower right corner. Usually Latin-reading people recognize that intuitively very fast.
~ I've set pages full of text in Hangulatin. Most people (used to Latin) are able to read it fluently after reading a few lines.
~ As history showed, the Latin character set wasn't spread to the world because of its high legibility. It was spread because of power and religion. The Christian church taught people how to read Latin so they could read the Bible; Hitler decided not to use blackletter anymore so he could spread his propaganda more widely in Latin, because the surrounding countries were used to reading Latin. Latin was usually adapted to the languages and people had to learn it. There was no choice. If the character set didn't fit the phonetics of the language, they put some diacritics on it. That's why Vietnamese is so hard to understand for the rest of the Latin-reading world. It actually didn't match the Latin alphabet at all.
The Korean King Sejong invented Hangul because he wanted to educate ordinary people to read. Higher-educated people didn't want to lose their status, so they kept using the sophisticated Chinese characters. But in fact Hangul was easier to read, and after a time of "fight" between these two scripts, Hangul won. They still use some Chinese characters too, but Hangul is used as the main character set. Hangul is the only script that has won a fight of legibility on that scale. This process took decades to centuries. Even the Christian missionaries decided to learn Hangul and spread the word in Hangul, as this was easier than teaching Latin when there already was an easy-to-learn script available. Hangul is exceptional in this position, being designed to be a simple script. Most other scripts were spread by force and religion, and people just had to adapt.
~ That also shows that legibility depends mostly on what we are used to reading. Children (Latin) usually have years to learn to read. (The Chinese are learning their whole lives, as I was told.) I learned to read Hangul within two weeks (to read, not to understand), and if people are reading Hangulatin fluently after some minutes, that means it is highly adaptable and legible for people used to Latin. And it is basically possible to use a script like Hangul in the Western world. This is what I wanted to prove with this project.
I am looking forward to your thoughts and opinions. Please feel welcome to comment on it.
-
Anita Jürgeleit said: That also shows that legibility depends mostly on what we are used to reading. Children (Latin) usually have years to learn to read. (The Chinese are learning their whole lives, as I was told.) I learned to read Hangul within two weeks (to read, not to understand), and if people are reading Hangulatin fluently after some minutes, that means it is highly adaptable and legible for people used to Latin. And it is basically possible to use a script like Hangul in the Western world. This is what I wanted to prove with this project.

Hangul is easy to learn because it is an alphabet, not because it is organised into syllables. I suppose the syllabification might bring some advantage to readers who have trouble splitting large words into syllables, but I don't know whether that's a thing. Then again, the loss of linear reading direction might present a disadvantage. Has anyone investigated whether syllabified alphabets might be easier on dyslexics than linear ones...?
-
Anita Jürgeleit said: As history showed, the Latin character set wasn't spread to the world because of its high legibility. It was spread because of power and religion.

You are absolutely right in your history of scripts - up to a point. It is well known, for example, that where Christianity was spread by the Roman Catholic church, the Latin script is used - and where, instead, the Orthodox church is followed, Cyrillic is the script in use. But I wouldn't trust the conclusion you seem to have arrived at, that since Hangul has no such history, it's the only script we can trust to be legible!

Would Hangul be less legible if it were used by the Japanese, the Thais, and the Burmese as well, due to Korea becoming an imperial power? If Hangul was adopted by the Koreans because it was legible, then perhaps we should look at the Etruscans, the Greeks, or the Phoenicians. Well, maybe the Etruscans got conquered by the Greeks.

Fortunately, today, we use computers and laser printers to put our thoughts on paper. I happen to be familiar with the meaning of the Korean word "sebeolsik". The Koreans had to go to some interesting lengths to adapt the manual typewriter to their script - but they were successful without having to go to the extremes required for Chinese. If one looks at how people were taught to program the IBM 1401 computer, it is strongly underscored just how natural a fit a plain alphabetic script is to the computer in conceptual terms.
-
Thomas Phinney said: research data gives results that disagree with your theory of reading

While I commend your desire to have Hrant keep his ideas evidence-based... it has occurred to me that measuring reading speed poses a fundamental difficulty. How do you measure how fast someone reads a given text? After all, the subject could have, without even doing so intentionally, or noticing that he had, skipped over parts of the text. The obvious way to make sure that the subject has read Every. Single. Word. is to ask questions later. Lots of questions. So then you're measuring the time it takes to memorize a given text, not the time it takes to read it.

While I'm sure researchers have tried to address these issues, I don't think a solution has been found that makes them go away completely. It takes me about two hours to read - and subjectively, I think I'm reading every word - a moderate-sized novel of light reading, say a James Bond novel, or one of Edgar Rice Burroughs' adventures set on Barsoom. Based on my limited experience of Hangulatin, such feats are probably easily within the reach of Koreans as well, but I can't be sure.
-
Thanks for coming by, Anita!
Hangul is indeed very exceptional.

Anita Jürgeleit said: That's why Vietnamese is so hard to understand for the rest of the Latin-reading world. It actually didn't match the Latin alphabet at all.

Anita Jürgeleit said: ~ That also shows that legibility depends mostly on what we are used to reading.

Christian Thalmann said: Hangul is easy to learn because it is an alphabet, not because it is organised into syllables.

Christian Thalmann said: the loss of linear reading direction might present a disadvantage.

John Savard said: So then you're measuring the time it takes to memorize a given text, not the time it takes to read it.
-
Hrant H. Papazian said: BTW, saccades in Chinese would (presumably) have to be over four times shorter than in English for the two to be comparable in speed.

Note that I didn't say anything about speed, only about saccade length being relative to linguistic content, so the denser the semantic encoding, the shorter the saccades will be. If I recall correctly, Nadine reported that the typical saccade length for Chinese readers is three characters, vs 12–15 for English readers. Since these measurements are based on the same kind of eye-tracking studies, the comparative observation is sound whether or not one believes that test conditions produce slower reading.
-
Interesting numbers.
The observation seems sound... for deliberative reading.
-
John Savard said: But I wouldn't trust the conclusion you seem to have arrived at, that since Hangul has no such history, it's the only script we can trust to be legible!

I do not think Hangul is the only script we can trust to be legible. It is the only one that was chosen by the general public because it was easier to learn (between two scripts - Hangul and Chinese). All the others didn't have any choice. But otherwise I think it is impossible to see if one script is more legible than another. Humans have a very capable brain - especially children. When someone grows up with a certain script, it becomes more readable for this person than for others. Forever. So how can we then compare the readability between two scripts? Only by assumption? Just by the missing imagination that Hangul or any other script could possibly be read as fluently as our beloved linear Latin alphabet? Hangulatin shows that it is possible. It helps to understand this situation of cultural difference by showing that we could handle such a script, and we would survive very well, actually. There would be a lot of new work for type designers, btw.
-
John Hudson said: Note that I didn't say anything about speed, only about saccade length being relative to linguistic content, so the denser the semantic encoding, the shorter the saccades will be. If I recall correctly, Nadine reported that the typical saccade length for Chinese readers is three characters, vs 12–15 for English readers.
-
"That's interesting, indeed. But what is actually the value of one saccade? For Chinese it would be 3 words. For Hangul, the average number of characters in one square is 3, so that makes 9 characters; Latin 12–15. The value (in words) here depends on the language."

Yes, that's the point. Saccade length seems to be governed by linguistic content, and varies depending on how language is encoded in a writing system. This indicates that fixations are spaced and timed according to how the brain processes the linguistic content of the text, rather than on how much distance the parafovea can take in during the fixation. The implication of this is that you can't speed up reading by compressing the linguistic content into less graphical space, as in Chinese or Hangul, because all you end up doing is increasing the frequency of saccades because the brain still needs the same amount of time to process the linguistic content of each fixation.
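A minimal sketch of the model being described here, for those who think better in numbers: if every fixation delivers a fixed linguistic payload and fixations last about as long in either system, compressing that payload into fewer characters changes saccade length but not net reading speed. The payload (2 words per fixation) and fixation duration (250 ms) are illustrative assumptions, not measurements reported in this thread:

```python
# Sketch: reading rate under the "fixed linguistic payload per fixation" model.
# The 2-words payload and 250 ms fixation are assumptions for illustration.

def words_per_minute(words_per_fixation: float, fixation_ms: float) -> float:
    fixations_per_minute = 60_000 / fixation_ms
    return words_per_fixation * fixations_per_minute

english = {"chars_per_saccade": 13, "wpm": words_per_minute(2, 250)}
chinese = {"chars_per_saccade": 3, "wpm": words_per_minute(2, 250)}

print(english)  # {'chars_per_saccade': 13, 'wpm': 480.0}
print(chinese)  # {'chars_per_saccade': 3, 'wpm': 480.0}
# Identical words-per-minute despite very different saccade lengths in characters:
# compressing the encoding only changes how far each saccade travels.
```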
-
Anita Jürgeleit said: All the others didn't have any choice.

Anita Jürgeleit said: So how can we then compare the readability between two scripts?
-
John Hudson said:
The implication of this is that you can't speed up reading by compressing the linguistic content into less graphical space, as in Chinese or Hangul, because all you end up doing is increasing the frequency of saccades because the brain still needs the same amount of time to process the linguistic content of each fixation.
It's pretty inconceivable that a writing system has no effect on reading efficiency. One can easily design one that's less efficient than Latin, just as one can design a font that's less readable.
----
An aside:
Encouraged by that –admittedly modest– poll, I'm going to give "grayletter" (to mean a blackletter/Roman hybrid) a go for a few years and see if it sticks... After all, my "uniwidth" took over 13 years to be used in the mainstream.
-
"This is like saying all fonts are equally readable."

No, it's not remotely like saying that. It's saying that within any given writing system, there are typical patterns of fixation and saccade that correlate to linguistic content load, not to visual acuity. That's not saying that within an individual writing system individual texts and their typographic display are equally readable, but it does suggest that the variable readability of texts and types within a writing system is probably not a matter of how frequent or long saccades are, since that seems determined by content processing, not by how much our eyes can or cannot take in during fixations.
There are lots of things that demonstrably affect reading speed and accuracy, most notably complexity (both graphical and textual). In terms of saccade distance and fixation length, it would be interesting to test whether the longer, fewer saccades or shorter, more frequent saccades associated with particular writing systems tend to produce faster or slower reading rates, more or fewer regressions, etc. Or, indeed, whether saccade length and frequency significantly affect speed and accuracy at all.
-
I find that to be an artificial compartmentalization of writing systems versus fonts. It's irrational to believe the writing system has no effect on performance, and that some writing systems are not better/worse for reading than others. It's certainly PC though.
There is indeed a lot of actually-good testing to be done. Mostly we've had observations of the ripples on the surface.
-
Hrant H. Papazian said: It's irrational to believe the writing system has no effect on performance, and that some writing systems are not better/worse for reading than others.

It is also not a thing I said. For someone who goes on about reading a lot, you're kinda shit at the whole comprehension part.
Signing off.
-
John Hudson said: Saccade length seems to be governed by linguistic content, and varies depending on how language is encoded in a writing system. This indicates that fixations are spaced and timed according to how the brain processes the linguistic content of the text, rather than on how much distance the parafovea can take in during the fixation. The implication of this is that you can't speed up reading by compressing the linguistic content into less graphical space, as in Chinese or Hangul, because all you end up doing is increasing the frequency of saccades because the brain still needs the same amount of time to process the linguistic content of each fixation.

Here, perhaps, is the text that Hrant apparently has misunderstood. If the time required to read a text is determined by the amount of "linguistic content" it contains, and not by the number of saccades involved in accessing the visual representation of that content, then the apparent conclusion is that reading speed is invariant: it depends on the linguistic content being read, and not on the encoding of that content. The eye muscles might get a bit more or less exercise from any particular encoding, but the brain chugs along at a uniform pace regardless.

Of course there are obvious ways in which this might be an overly broad conclusion. There might well be ways to make reading more difficult of a different nature than the specific class of modifications to the writing system under discussion. This would mean that you're merely saying that going from an alphabetic system to something like Hangul or to an abugida wouldn't, in itself, change how fast people could read, but that different writing systems within each class could still be better or worse - making Hrant's interpretation of your words too sweeping.

So my point is: even though you may well be correct to say "It is also not a thing I said", it is, at least to some extent, a thing you seemed to have said, and so Hrant really isn't guilty of intentionally misrepresenting your argument - which is also a thing you haven't said, but which I can't help but suspect you might feel, given the force of your response.
-
@John Hudson
"both are taking in about the same amount of semantic content with each fixation."
"Saccade length seems to be governed by linguistic content"
"fixations are spaced and timed according to how the brain processes the linguistic content of the text, rather than on how much distance the parafovea can take in during the fixation."
"you can't speed up reading by compressing the linguistic content into less graphical space, as in Chinese or Hangul"
[My emphases.]
I'm not sure how to interpret all of that as anything other than an attempt to claim that the writing system does not affect reading performance. Essentially, a zealous defense of Latin in the face of a claim of imperfection. The same thing happened when I gave my Alphabet Reform talk in '99.
-
“Saccade length” in this context is referring to distance in words/characters/physical space, not temporal duration.
If you keep linguistic-content per saccade fixed across writing systems, and the spatial part is not critical (being below the threshold of what people can take in visually), then time-per-saccade could still be a variable.
I am not saying it does, or does not, differ across writing systems. Just saying that none of the quoted passages, nor all of them together, contradict this possibility.
I actually think variation on this score is plausible, but... unless the difference is very large, it would be hard to test experimentally, because unlike many other experiments, you can't just use the same set of test subjects and run them through both languages, varying whether they get language A or language B first. Or you could, but since the same subject might be better at one language than another—or all your subjects might be more fluent in one than the other, depending on where you run the test—it could be hard to draw clean conclusions.
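A toy illustration of that confound, with every number made up purely for the sake of the example: even if two writing systems are defined to be equally efficient, a subject pool that is more fluent in one language will make the other look slower.

```python
# Toy simulation of the fluency confound described above; all values invented.
import random

random.seed(1)

BASE_MS_PER_WORD = 250       # same underlying efficiency for both systems (assumption)
TEXT_LENGTH_WORDS = 1000

def reading_time_s(fluency: float) -> float:
    """Seconds to read the text; lower fluency inflates the time."""
    noise = random.gauss(1.0, 0.05)
    return TEXT_LENGTH_WORDS * BASE_MS_PER_WORD / fluency * noise / 1000

# Subjects happen to be more fluent in language A than in language B.
times_a = [reading_time_s(random.uniform(0.9, 1.0)) for _ in range(30)]
times_b = [reading_time_s(random.uniform(0.6, 0.9)) for _ in range(30)]

mean = lambda xs: sum(xs) / len(xs)
print(f"A: {mean(times_a):.0f} s   B: {mean(times_b):.0f} s")
# B looks markedly slower even though the writing systems were defined as equal:
# the apparent difference is entirely the fluency confound.
```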
-
Hrant H. Papazian said: I'm not sure how to interpret all of that as anything other than an attempt to claim that the writing system does not affect reading performance. Essentially, a zealous defense of Latin in the face of a claim of imperfection.

Even if one did accept that reading performance was strongly dependent on the writing system, because Latin and other systems like it are the simplest possible systems known, the assumption would be that it is best - and I wouldn't be surprised if research couldn't be found to support that.

Of course, this now suggests to me a "writing system" that could put more data in the foveal field without being dependent on syllabification information, and thus which could be more easily retrofitted into a world designed for Latin: remove the vowels from the main line of text, and put them directly underneath - between the two letters they occurred between - while allowing a vowel at the beginning or end of the word to stick out, just as consonants would. Since the vowels are a, e, i, o, and u, they're all letters with neither an ascender nor a descender... except, of course, for the dot on the i, which is small. (Presumably the definition would be dumb, to keep the implementation simple, and so a u acting as a consonant and a y acting as a vowel would still be treated as a vowel and as a consonant respectively.)
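To make the proposed scheme concrete, here is a hypothetical sketch under one reading of it: a, e, i, o, u are always "vowels" (the deliberately dumb definition), an interior vowel drops to a line underneath, and a vowel at the very start or end of a word stays on the main line. For simplicity the dropped vowel keeps its own column; in the scheme as described it would sit under the join of its two neighbours and the main line would close up around it.

```python
# Hypothetical sketch of the vowel-lowering scheme; an illustration, not a spec.
VOWELS = set("aeiou")

def lower_vowels(word: str) -> tuple[str, str]:
    """Return (main line, vowel line) as two monospaced strings."""
    main, below = [], []
    for i, ch in enumerate(word):
        interior = 0 < i < len(word) - 1
        if ch.lower() in VOWELS and interior:
            main.append(" ")    # vowel vacates the main line...
            below.append(ch)    # ...and reappears on the line beneath
        else:
            main.append(ch)     # consonants, and edge vowels, stay put
            below.append(" ")
    return "".join(main), "".join(below)

for w in ["reading", "type", "area"]:
    top, bottom = lower_vowels(w)
    print(top)
    print(bottom)
    print()
```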