Artemy Lebedev
§ 81. The life and extraordinary adventures of a typographical point
January 22, 2002
What should one know about a point? The word stems from the Latin punctum. The point is a unit of the typographical measurement system, typometry. Before the typographic point was invented, font sizes were differentiated by their names. Say, cicero (12 points) was so named because Cicero's works were first printed in 1467 using this font size.
The term typometry stems from the name of the typometer, a device (originally named prototype) invented by Fournier
The idea of font-size standardization dates back to the 17th century, but the first easy-to-handle typographic point was proposed only in 1737 by the French printer and typefounder Pierre Simon Fournier. According to his system, every font size was equal to a certain number of points: nonpareil was 6 points, petit was 8, and so on.
|
In the France of that period a common linear measurement unit was the toise, which was equal to 6 Royal feet (pied de roi). The foot was equal to 12 inches; the inch was divided into 12 lines; the line, into 12 metric points. The length of two of these points was adopted as one typographic point in a booklet published by Fournier. However, we've seen plenty of times before that everything in human history has consistently been done in a shitty, screwed-up way. For some reason, by the time Fournier's two-volume manual was published in 1764, the newly-born typographic point proved shorter than the point of 1737, thus becoming an arbitrary unit.
|
Housekeeping tip |
Any measurement system must have a standard. Say, the acre is the area of ground tillable by two oxen in one day. Zero Fahrenheit is the freezing point of a saturated salt solution (or water with sal ammoniac, according to other sources). The Russian arsheen was set equal to 28 English inches by a decree of the Russian tsar Peter the Great. The English inch is equal to the width of a man's thumb, or, more precisely, to three barley grains arranged in a line. In today's world all non-metric units (all those yards and furlongs, ounces and bushels) are defined based on metric parameters. Until 1964 the meter was defined in the USA as 39.37 inches. But now the English inch is precisely defined as being equal to 2.54 cm; the French inch is 2.706 cm. Unfortunately, there's no way of saying today how tall Thumbelina actually was.
|
Nevertheless, due to its simplicity, Fournier's spoilt system became a standard. Later (in 1783) the system's inaccuracies were corrected by another Frenchman, François-Ambroise Didot, a well-known printer. More specifically, he set the typographic point not to an arbitrary value but to a mathematically precise one: 1/6 of a line, taking the same French Royal foot (324.84 mm) as a basis. Simply put, he returned to the value originally proposed by Fournier in 1737.
|
But as early as 1795 the French introduced the metric system, turfing out the Royal feet and, with them, the standard basis of the typographic point.
|
In 1879 the USA started to widely use a point devised by Nelson C. Hawks, who claimed authorship of the idea of the point itself. Hawks discovered that the cicero (a 12-point font) was equal to 1/6 of an inch (and we know that's the way it should be). And he persuaded his boss, who owned one of the largest type foundries and was busy restoring his business after a great fire, to adopt the new system.
|
Things went so far as adopting the Hawks point as a national standard. But the funny thing is that at that time the Association of Typefounders of the United States tied all measurement systems to the metric system. So it declared that 83 ciceros were equal to 35 centimeters (making the point equal to 0.3514 mm): so much did they want to conform to the metric system.
The United States introduced the Hawks point as a national standard in 1886, with Great Britain following suit several years later |
|
The same year (1879) Hermann Berthold, a German, translated the Didot point into the metric system: one meter held 2660 points. These days Germany, Eastern Europe and Russia use this point, defined in the metric system as 0.3759 mm and rounded off to 0.376 mm.
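Both metric definitions above are plain divisions; as an illustration (not part of the original article), a few lines of Python reproduce the quoted values:

```python
# American (Hawks) point: 83 ciceros = 35 cm, with 12 points to the cicero
american_point_mm = 350 / (83 * 12)   # 350 mm spread over 996 points

# Berthold's Didot point: 2660 points to the meter
didot_point_mm = 1000 / 2660

print(round(american_point_mm, 4))  # 0.3514
print(round(didot_point_mm, 4))     # 0.3759, rounded off in practice to 0.376
```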
|
And here's what we get in the end (eat your hearts out, dear designers):

Didot point (Germany, Eastern Europe, Russia)  0.376 mm
American point (Hawks)                         0.3514 mm

It's a historical fact that none of these points is precisely equal to 1/72 of an inch. Many people tried to make it fit, but none of them succeeded. But in the 1980s Adobe devised the PostScript language, in which the point was defined as exactly one seventy-second of the English inch: 0.013(8) of an inch.
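A similar back-of-the-envelope check (again just an illustration, using the exact 2.54 cm inch mentioned in the housekeeping tip) shows where the PostScript point lands:

```python
MM_PER_INCH = 25.4              # the English inch, exactly 2.54 cm

ps_point_mm = MM_PER_INCH / 72  # the PostScript point: 1/72 of an inch

print(round(ps_point_mm, 4))    # 0.3528
print(round(1 / 72, 4))         # 0.0139, i.e. 0.013(8) of an inch
```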
|
From that time on, everybody has been able to set type on a computer with mind-bending precision.
|