No, really, don't do this, you could get someone killed that way (priority traffic causes tons of traffic accidents annually, and if responders follow up on a 911 call when there is nothing wrong you are in very hot water, and rightly so).
There are a lot of situations where one bit communications can make big changes. Sending a voltage down the wire to the 'launch' relay on some missile will do a lot worse. In the end, any actuator anywhere is going to respond to just a single bit changing state. So we're in violent agreement about that.
But to reduce the available bandwidth to nothing when lots is available has me puzzled. Twitter found that reducing the number of characters (by necessity, rather than by choice) launched a new medium. I can see how taking that to its logical conclusion (from 840 bits to 1 bit) may create an entirely new mode of expression.
At the same time I fear it will not leave them with a value that is 1/840th of Twitter's; that '1' is perilously close to a '0'. In that sense it is like 'pair', which tries (tried?) to be a social network for 2.
Reminds me of Reverend Harry Powell, an itinerant technology evangelist, con artist, and serial entrepreneur with the words "LIKE" tattooed on the knuckles of one hand and "POKE" tattooed on the other, who would explain this fact to his victims by using his hands in a sermon about the eternal struggle between doing no good and doing no evil.
A technical note: Twitter carries unicode messages; the limit is not 140 octets but 140 code points [0]. This is especially useful when tweeting in Japanese.
I believe there are more than 65536 code points assigned, so a tweet should be able to carry over 140 × 16 = 2240 bits of information -- 2240+ Yos.
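For the record, the arithmetic in a few lines of Python (note that 0x110000 is Unicode's total code space, not the assigned count, so 2240 is a floor and ~20 bits per code point a theoretical ceiling):

```python
import math

# Unicode spans 17 planes of 65,536 code points each: 0x0 .. 0x10FFFF.
TOTAL_CODE_POINTS = 0x110000            # 1,114,112

# More than 65,536 assigned code points means > 16 bits per code point,
# so 140 of them carry at least:
print(140 * 16)                         # 2240 bits

# The theoretical ceiling if every code point were usable:
print(math.log2(TOTAL_CODE_POINTS))    # ~20.09 bits per code point
```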
---
Incidentally, in Japanese and BEV (probably adopted from jp), the word "yo" has an entirely different meaning.
A good one, thanks. Yes, that makes good sense. I guess that in some languages twitter really does carry a lot more information than it does in Latin alphabet languages.
If the limit really were 140 bytes you'd be really out of luck if your language routinely requires multi-byte sequences in UTF-8.
Hm. You could do a 'tweet compression' trick where you pick a subset of code points that each carry a fixed number of extra bits (selected from the blocks with the extra long UTF-8 sequences), pack a longer message into 140 of them, and then use a decompressor on the other end to turn it back into ASCII.
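A minimal sketch of that trick, assuming a contiguous run of code points Twitter passes through untouched (I'm using the CJK Unified Ideographs block starting at 0x4E00 as a stand-in). Each code point carries 14 bits here, i.e. two 7-bit ASCII characters, doubling the effective length:

```python
BASE = 0x4E00  # start of the CJK Unified Ideographs block (0x4E00-0x9FFF),
               # a contiguous run of well over 2**14 assigned code points

def compress(text: str) -> str:
    """Pack two 7-bit ASCII characters into one CJK code point."""
    data = text.encode("ascii")
    if len(data) % 2:
        data += b"\x00"  # pad to an even length
    return "".join(
        chr(BASE + (data[i] << 7 | data[i + 1]))
        for i in range(0, len(data), 2)
    )

def decompress(tweet: str) -> str:
    """Invert compress(): two ASCII characters per code point."""
    chars = []
    for c in tweet:
        v = ord(c) - BASE
        chars.append(chr(v >> 7))
        chars.append(chr(v & 0x7F))
    return "".join(chars).rstrip("\x00")

msg = "this message is twice as long as it looks"
packed = compress(msg)
assert decompress(packed) == msg
assert len(packed) == (len(msg) + 1) // 2
```

Whether the real service preserves these code points unmangled (normalization, counting rules for astral characters, etc.) is exactly the kind of detail that would make or break this in practice.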
"YoTor" could push innocuous "Yo" messages out over disposable one-time use IPv6 addresses, and all of the side-band message data could be transmitted in the IPv6 address. "The source address is the message."
6*140. But as your sibling comment points out Twitter is not based on octets but on code points, so there is actually a lot more information in there. I made the assumption that the limit was the 140 byte limit from GSM messages, with a 'payload' of roughly 6 bits per character position, but as was pointed out it is in fact now 140 Unicode code points.
Yeah, I'm familiar with tweeting in Japanese, and how much more information you can get across in a tweet, which made me question the number.
However, I'm still not sure why (in an imaginary world where Twitter doesn't encode in UTF-8) it would be 6 bits. 7 or 8 I can understand.
I'm just curious of what your thinking was, not trying to do any one-upmanship.
Well, ASCII is ' ' to 'del'; above that I can't even type on this keyboard with any reliability, so that gives an upper boundary for me of 96 characters. Of those, the actual information is carried mostly by the letters A-Z (twice if you want to count uppercase and lowercase, for 52 letters), plus 10 digits and a space. That's 63 characters; round up to 64 (maybe add the @ or the # character, those are pretty prevalent in tweets as well). That's exactly 2^6. If you drop the lowercase/uppercase distinction then you can fit all the other ASCII glyphs and punctuation marks in the second half of your imaginary 6 bit code. You can't enter anything below 32 in a tweet, other than a linefeed.
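That 64-symbol code is easy to demonstrate: with a made-up 64-character alphabet along the lines above (26 letters, 10 digits, space, and common punctuation), 140 characters pack into exactly 105 bytes (840 bits):

```python
# A hypothetical 64-symbol alphabet: 26 letters, 10 digits, space,
# and 27 common punctuation marks. The index of each symbol is its 6-bit code.
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 @#.,!?':;-_()/\"&+*=<>~%$^[]"
assert len(ALPHABET) == 64

def pack(text: str) -> bytes:
    """Pack text over the 64-symbol alphabet at 6 bits per character."""
    acc = bits = 0
    out = bytearray()
    for ch in text.upper():
        acc = (acc << 6) | ALPHABET.index(ch)
        bits += 6
        while bits >= 8:            # emit full bytes as they fill up
            bits -= 8
            out.append((acc >> bits) & 0xFF)
    if bits:                        # flush any leftover bits, zero-padded
        out.append((acc << (8 - bits)) & 0xFF)
    return bytes(out)

print(len(pack("X" * 140)))   # 105 -- i.e. 140 * 6 / 8 bytes
```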
Maybe someone should come up with a 0 bit medium?