• computers were not always binary
• computers were not always digital
• computers were not always even machines

@doriantaylor I still think it’s weird how much people like privileging binary as an especially important layer of abstraction. “It’s all ones and zeros” is no more true than like a dozen other things you could say about even computers in the commonly understood contemporary style.

@vruba ehhh i mean i think shannon had a pretty decent insight; a bit is the answer to a yes-or-no question and you can construct (à la twenty questions) more nuanced answers from more generic questions
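
A minimal sketch of that twenty-questions idea, assuming Python and a made-up candidate list: each yes/no answer is one bit, and ceil(log2(n)) of them are enough to pick out one answer from n candidates.

```python
from math import ceil, log2

candidates = ["cat", "dog", "fish", "newt", "owl", "yak"]
width = ceil(log2(len(candidates)))  # yes/no questions needed: 3 for 6 items

def to_bits(index, width):
    """Encode a choice as a list of yes/no answers (most significant first)."""
    return [bool(index >> i & 1) for i in reversed(range(width))]

def from_bits(bits):
    """Recover the choice from the yes/no answers."""
    index = 0
    for b in bits:
        index = index << 1 | int(b)
    return index

answers = to_bits(candidates.index("newt"), width)   # [False, True, True]
assert candidates[from_bits(answers)] == "newt"
```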

@doriantaylor I would go further and say it was a pretty good insight. But it doesn’t really address what I said, does it?

@vruba it does to the extent that it's isomorphic to other bases

@doriantaylor I feel like we’re having two different conversations here. But I agree with the one you’re having, and that’s good enough for me.

@vruba maybe? (caveat: i am very tired and sore)

i mean maybe i'm in the camp of people who privilege binary because it's an especially convenient representation of information (that maps onto things like voltage states) that happens to be isomorphic to all other (digital) representations by dint of certain physical/mathematical tricks, i.e. if it ain't broke, don't fix it

@doriantaylor No worries. I’m just saying it’s interesting how much people in ordinary speech bring up binary as a ·layer· of abstraction – not a choice or method, where isomorphism matters, but a level. Not binary as opposed to trinary, but binary as somehow more diagnostic or distinctive than all the other things in the stack that make computers like that.

@doriantaylor Yes, absolutely. Or electrical charges, or Turing machines, or any of the other things that ordinary computers ·are·. Somehow binary is the one people always seem to point to. And it’s fun to think about the reasons for that.

@doriantaylor Like if you want to be reductive about computers, that’s the conventional thing to point to. Roughly comparably, if you want to be reductive about humans, you conventionally say flesh and blood or flesh and bone, not, say, “mostly water” or “a bunch of cells” or “just great apes” or whatever. There’s a conventional level to point to. Why? (Rhetorical, thoughtful, not looking for a detailed and literal answer.)

@vruba oh, sure.

i mean, i wouldn't ever say "it's just ones and zeroes" because that's not a meaningful expression.

like, a bit is the answer to a yes-or-no question, and any yes-or-no question is packed with bits of its own.

@doriantaylor Is there some level of abstraction, X, at which you feel it starts being meaningful to say it’s just X, or is that formulation always not meaningful to you?

@vruba hmmm; i don't really get it but my first mental association is something like "scalars that aren't scalars", like dates.

like, a date is isomorphic to a number (integer?) projected onto some kind of scheme (at some arbitrary resolution).

*but*, the cognitive entity "date" actually has components (year, month, day, hour, minute, second, timezone, etc)

another "scalar that isn't a scalar" would be phone numbers or postal codes

@doriantaylor This is really interesting, and while it may seem kind of trivial to you, I think it only appears as a product of a lot of thought and experience.

@doriantaylor Maybe not! So I guess: it may be your first mental association, but I think it’s the product of etc.

@doriantaylor I feel like I skimmed this once a while ago based on a recommendation from you, made a note to read it more carefully, and then never got around to it. (Strictly my fault, not the book’s!)

@vruba what is interesting to me about "tech stuff" in general is that a lot of the insights worth insighting were figured out a very, very, very, very, very long time ago.

@doriantaylor As Mia Doi Todd sings:

It takes such a long time
To make things happen
Such a long time
To see things through

@doriantaylor (I often think tech is riding the wave of Moore’s Law the way the larger society runs on the free-ish energy of fossil fuels. It’s unearned and, at best, really distortive of fundamental progress.)

@doriantaylor (I think this is one reason people get so excited about ML stuff. It feels actually new. If you look under the hood, most of it is still just new processing power, not new theory. Quantity has a quality all of its own and all that.)

@vruba well i mean, a turing machine *could* operate with an arbitrary symbol alphabet, but then any turing machine is equivalent to any other turing machine, so it's equivalent to a binary turing machine (with a single head)
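
A rough sketch of the alphabet-reduction trick behind that equivalence, assuming Python and an arbitrary made-up alphabet: each tape symbol becomes a fixed-width block of 0/1, so a binary machine stepping over blocks can simulate the original head.

```python
from math import ceil, log2

alphabet = ["_", "a", "b", "c", "#"]            # arbitrary tape symbols
width = max(1, ceil(log2(len(alphabet))))       # bits per symbol: 3

encode = {s: format(i, f"0{width}b") for i, s in enumerate(alphabet)}
decode = {bits: s for s, bits in encode.items()}

tape = ["a", "b", "_", "c", "#"]
binary_tape = "".join(encode[s] for s in tape)  # '001010000011100'

# reading back in fixed-width blocks recovers the original tape
recovered = [decode[binary_tape[i:i + width]]
             for i in range(0, len(binary_tape), width)]
assert recovered == tape
```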

tbh i think the predilection toward binary is at least as much a cognitive/linguistic thing

@doriantaylor That cognitive/linguistic thing is what I’m trying to point at here. It’s interesting to me.

@vruba there is definitely something about “ones and zeros” that seems to be where computing transitions from understandable machines to magic.
