EvanED wrote: Really, most instructions operate at the word level, not the byte level, so base 4-billion-whatever for a 32-bit machine (or one in 32-bit mode, like x86). I'm pretty sure your typical RISC architecture (thinking of MIPS specifically) doesn't even have a way of operating on individual bytes, except for the lb and sb instructions.
Heh, my knowledge is pretty out of date at that level. I actually did consider register size when thinking about this, but I ended up just going with bytes because they're the smallest addressable unit.
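For what it's worth, C++ makes the "smallest addressable unit" point concrete: pointer arithmetic on unsigned char* steps one byte at a time, and there's no pointer type for anything finer. A quick sketch (the variable names are just made up for the example):

    #include <cstdio>
    #include <cstdint>

    int main() {
        std::uint32_t word = 0xAABBCCDD;  // one 32-bit word
        unsigned char* p = reinterpret_cast<unsigned char*>(&word);

        // Each increment of p moves exactly one byte; there is no
        // pointer type that addresses anything smaller than that.
        // (On a little-endian machine this prints DD, CC, BB, AA.)
        for (int i = 0; i < 4; ++i)
            std::printf("byte %d = 0x%02X\n", i, (unsigned)p[i]);
        return 0;
    }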
Ben-oni wrote: I think saying computers "think in 1's and 0's" is fair, actually. That's the smallest unit that the computer can operate on. For instance, when you program, you can access each individual bit in a file. You can't get at any finer representation, like the actual electric potential state.

Well, I can deal with the bits directly in C++ too. My main reason for saying that computers "think" in higher bases is that I can't ask the computer for a bit; I have to ask it for a byte and then have it process the byte to give me the bit. That can be abstracted away, but at the most basic level the machine is fetching its atomic representation of the data and then manipulating it to hand me a piece of it, rather than just grabbing that piece directly. It can't do anything to that one bit without doing stuff to seven others, so... yeah.
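Concretely, in C++ that "processing" is a shift and a mask; something like this minimal sketch (the name get_bit is just one I made up):

    #include <cstdio>

    // Extract bit 'n' (0 = least significant) from a byte: the
    // machine hands us the whole byte, and we do arithmetic on it
    // to isolate the single bit we actually wanted.
    int get_bit(unsigned char byte, int n) {
        return (byte >> n) & 1;
    }

    int main() {
        unsigned char b = 0xB2;          // 10110010 in binary
        for (int n = 7; n >= 0; --n)     // most significant bit first
            std::printf("%d", get_bit(b, n));
        std::printf("\n");               // prints 10110010
        return 0;
    }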
If you are lucky enough to work with a language that has good bitstream processing capabilities, then as the programmer you actually can deal with individual bits. And why not? It doesn't matter that the actual representation might not be bytes (in signal processing, you often deal with symbols that can attain a number of states that is not a power of two), but by the time it reaches the processor, it's bits.
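A bare-bones version of that kind of bitstream layer in C++, assuming the underlying storage is bytes (the BitReader class and all its names are invented for illustration, not any particular library's API):

    #include <cstdio>
    #include <cstddef>
    #include <vector>

    // Minimal MSB-first bit reader over a byte buffer: callers ask
    // for n bits at a time and never see the byte boundaries.
    class BitReader {
        const std::vector<unsigned char>& buf;
        std::size_t pos = 0;  // position in bits, not bytes
    public:
        explicit BitReader(const std::vector<unsigned char>& b) : buf(b) {}

        // Read the next n bits (n <= 32) as an unsigned value.
        unsigned read(int n) {
            unsigned v = 0;
            while (n-- > 0) {
                unsigned bit = (buf[pos / 8] >> (7 - pos % 8)) & 1;
                v = (v << 1) | bit;
                ++pos;
            }
            return v;
        }
    };

    int main() {
        std::vector<unsigned char> data = {0xB2, 0x5C};  // 1011001001011100
        BitReader r(data);
        // Pull out 5-bit symbols that don't line up with byte
        // boundaries, the way a signal-processing format might.
        unsigned a = r.read(5);
        unsigned b = r.read(5);
        unsigned c = r.read(5);
        std::printf("%u %u %u\n", a, b, c);  // prints 22 9 14
        return 0;
    }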
There are clearly more subtleties here than I originally thought of, though, so I'll have to consider them.
EvanED wrote: Computers think in voltages.
Psh, people have been doing that for as long as they've existed.