WorldMaker 2 days ago
Rendering Unicode was always this complex. Emoji don't do anything that some other language in real use doesn't also do. What emoji does is bring that visually to the forefront among contemporary English text. The assumption that 8-bit character sets of simple bitmaps are all you need mostly only ever worked for English (and then only if you didn't need nice print-like typography, or math formulas, or…).
gmueckl 2 days ago | parent
This isn't exactly true. Emoji and other symbols introduced new notions, like color, that weren't present before. I'm no longer certain that it's feasible to handcraft a font that contains all the symbols for codepoints affected by color modifiers. Also, 8-bit codepages, for all their problems (a different kind of hell), didn't break the assumption that each character is encoded as one byte. In that way, they didn't break software in interesting ways like UTF-encoded and possibly decomposed Unicode can. Back then, that assumption was something of a blessing at surface level, but the proliferation of string handling code and concepts that assume this one-to-one mapping just doesn't fit well with Unicode. And UTF-8 specifically gives English speakers the illusion that naive 8-bit string handling works.
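A quick Python sketch of the failure modes being described here — one visible character mapping to multiple code points (decomposition), and code-point counts diverging from UTF-8 byte counts. The specific strings are illustrative examples, not anything from the thread:

```python
import unicodedata

# One visible glyph, two valid encodings:
precomposed = "\u00e9"       # é as a single code point (U+00E9)
decomposed = "e\u0301"       # 'e' + combining acute accent (two code points)

print(len(precomposed))      # 1
print(len(decomposed))       # 2 — same glyph, different length
print(precomposed == decomposed)  # False without normalization
print(unicodedata.normalize("NFC", decomposed) == precomposed)  # True

# UTF-8 byte counts diverge from code-point counts. Pure ASCII is
# 1 byte per character — the "illusion" for English speakers:
print(len("abc".encode("utf-8")))        # 3 bytes for 3 characters
print(len(precomposed.encode("utf-8")))  # 2 bytes for 1 code point

# Emoji with a skin-tone modifier: one visible symbol,
# two code points, eight UTF-8 bytes:
thumbs = "\U0001F44D\U0001F3FD"          # thumbs up + medium skin tone
print(len(thumbs))                       # 2
print(len(thumbs.encode("utf-8")))       # 8
```

Any code that indexes "characters" by byte or code-point offset gives different answers depending on which of these representations it receives.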