Dear software people,

Unicode is older now than ASCII was when Unicode was introduced. It’s not a weird new fad.

It’s complicated, but so is the domain it represents. We recognize that we have to think about time zones and leap days and leap seconds, for instance. And it’s a cleaner abstraction when you aren’t halfhearted about it.

Sincerely,
Charlie

@vruba Han unification (and related Syriac and Nastaliq issues) aren’t well-solved problems. But plain ASCII doesn’t help you there!


@pnorman Right – the point is not that Unicode is perfect, because it has various obvious warts and arguable major design flaws. It’s that its utility vastly exceeds any realistic alternative’s, especially ASCII’s, for basically all user-facing software.


@pnorman Someday that will change, and that will be a new question. For now, as I expect you agree, arguing for ASCII as the only way to do strings is like arguing for 32-bit unsigned integers as the only way to do numbers. “But they’re fast and easy to reason about!”
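The analogy is easy to make concrete. A minimal sketch in Python (the strings here are illustrative examples, not from the thread): ASCII fails the moment user input leaves the 7-bit range, while a Unicode encoding such as UTF-8 round-trips it without loss.

```python
# Ordinary user-facing text routinely falls outside ASCII.
name = "José"           # accented Latin, common in names
greeting = "こんにちは"   # Japanese

# Encoding to ASCII raises as soon as a character is outside 7 bits...
try:
    name.encode("ascii")
except UnicodeEncodeError:
    print("ASCII cannot represent:", name)

# ...while UTF-8, a Unicode encoding, round-trips both losslessly.
for s in (name, greeting):
    assert s.encode("utf-8").decode("utf-8") == s
```

Like 32-bit unsigned integers, ASCII is fast and simple right up until real-world data arrives.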
