Mikeski wrote: the "sandal instead" reference is two NOR gates wired as a flip-flop.
Ah, that is interesting, though not (I think) necessary to the point I was getting at. Can you explain how that works?
Here's a way of building a NOR flip-flop that might be clearer. You've been imagining NOR gates wired together in a feedforward network, with ones and zeros flowing along the wires. But sometimes they're wired to form loops instead of just feeding forward. If a wire forms a loop, then either a 0 flows around that circle forever or a 1 does, so the loop forms a memory of one bit. Now insert two NOR gates into that loop, with each gate taking the loop wire as one input and sending its output back onto the loop, and with an external 0 feeding into each gate's other input. You again get a single bit of memory: either a 0 flows from the first NOR to the second and a 1 flows around the other half of the loop, or vice versa. So there are two possible states, 01 or 10. That's one bit of memory.
Now, if you momentarily change one of those outside inputs from a 0 to a 1, then change it back to 0, you will force the loop into a particular state. Pulse the input of one NOR gate and you'll set it one way; pulse the input of the other and you'll set it the other way. In this way, we have a long-term way to store one bit of information. We can read it by splitting the loop wire off and feeding it into some other circuit, and we can set it by briefly raising one outside input or the other to 1. This is a simple flip-flop. There are many other kinds of flip-flops in use. By the way, it's more common to use NAND gates than NOR gates, but either can be used to do anything.
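If it helps to see this as running code, here's a little Python sketch of that two-NOR loop. (The names r, s, q, and qbar are just my labels for the two outside inputs and the two loop wires; the "keep updating until it stops changing" loop is a stand-in for the signals propagating around the real circuit.)

def nor(a, b):
    # A NOR gate: output is 1 only when both inputs are 0.
    return 0 if (a or b) else 1

def settle(r, s, q, qbar):
    # Repeatedly recompute both gates until the loop stops changing.
    # q is the output of the gate whose outside input is r,
    # qbar is the output of the gate whose outside input is s.
    for _ in range(10):
        new_q = nor(r, qbar)
        new_qbar = nor(s, q)
        if (new_q, new_qbar) == (q, qbar):
            break
        q, qbar = new_q, new_qbar
    return q, qbar

# Start in one of the two stable states (q=1, qbar=0).
q, qbar = 1, 0

# With both outside inputs held at 0, the state just persists.
print(settle(0, 0, q, qbar))      # (1, 0)

# Briefly raise r to 1, then drop it back to 0:
q, qbar = settle(1, 0, q, qbar)   # forces q to 0
q, qbar = settle(0, 0, q, qbar)
print(q, qbar)                    # 0 1 -- the other stable state

# Pulse s instead to flip it back:
q, qbar = settle(0, 1, q, qbar)
q, qbar = settle(0, 0, q, qbar)
print(q, qbar)                    # 1 0

Notice that with both outside inputs sitting at 0, the output depends entirely on which input was pulsed last, not on the current inputs.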
Getting back to the topic: feedforward circuits of NOR gates are "functional", but feedback circuits are not. The output of a flip-flop is not just a function of its current inputs, because there's also an internal state that changes over time. Computers use gates that are connected both acyclically and cyclically, so computers at the lowest level are imperative, not functional.
Pfhorrest wrote: And then it would seem higher-level languages should essentially be made up of a bunch of programs (functions) written in that level of code. For a very simple example you could have a slightly higher-level function which adds X to A "in place" as it were, call it the "add-to" function, where add-to(A,X) = store(A, add(load(A), X)). You could then piece those higher-level functions together into still higher-level functions until you have functions on the level of simple shell commands, and piece those together still further into the realm of full-featured end-user programs. And at every stage the functional nature of the program would be clearly evident.
No, doing something "in place" is not functional. It's imperative. Your command store(A, add(load(A), X)) assumes there's a variable A that can change over time. In functional programming, there are no variables that can change. The "add" and "load" commands are functional, but the "store" command is imperative.
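Here's the same contrast in Python, modeling your load/store/add-to idea with a dictionary standing in for memory (the dictionary and the function names are just my illustration, not anything standard):

# Imperative version: memory is a dictionary that gets mutated in place.
memory = {"A": 10}

def load(name):
    return memory[name]

def store(name, value):
    memory[name] = value          # mutation: the old value of A is gone

def add_to(name, x):
    store(name, load(name) + x)

add_to("A", 5)
print(memory["A"])                # 15 -- the variable A changed over time

# Functional version: nothing is mutated; instead we return a *new*
# memory in which A is bound to a new value.
def store_fn(mem, name, value):
    return {**mem, name: value}   # builds a new dict; the old one is untouched

def add_to_fn(mem, name, x):
    return store_fn(mem, name, mem[name] + x)

mem0 = {"A": 10}
mem1 = add_to_fn(mem0, "A", 5)
print(mem0["A"], mem1["A"])       # 10 15 -- the original never changed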
It might seem like being functional would prevent you from writing certain programs. But it turns out that anything you can do imperatively, you can also do functionally. You just have to think in a very different way. But at the lowest levels, computer CPUs are definitely imperative, because everything works using flip flops and registers whose contents can change.
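As a tiny example of that different way of thinking (again just an illustration in Python): summing a list imperatively means overwriting a variable over and over, while the functional version never reassigns anything and instead threads the "changing" value through as an argument.

# Imperative: sum a list by repeatedly overwriting a variable.
def total_imperative(numbers):
    total = 0
    for n in numbers:
        total = total + n            # 'total' changes over time
    return total

# Functional: no variable is ever reassigned; the running total is
# passed along as an argument to each recursive call instead.
def total_functional(numbers, acc=0):
    if not numbers:
        return acc
    return total_functional(numbers[1:], acc + numbers[0])

print(total_imperative([1, 2, 3, 4]))   # 10
print(total_functional([1, 2, 3, 4]))   # 10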
There's a higher-level example that might make this clearer. Mathematica allows both functional and imperative programming. If x is a list of a million numbers, then the command x[[1]] = 42 will change the first number in the list to be a 42. That's imperative. The command ReplacePart[x, 1 -> 42] makes a brand new list that is a copy of all million numbers, except that the first one in the new list is a 42 instead of a copy of the old first element. And then the function ReplacePart returns the new list of a million numbers. It doesn't change x. It doesn't change any variables. That's functional, because the original list didn't change. It was just copied to a new list that had a slight difference.
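If you don't know Mathematica, the same contrast in Python looks like this (and note that the functional version here really does copy all million entries):

xs = list(range(1_000_000))

# Imperative: mutate the first element of the existing list.
xs[0] = 42

# Functional style: build a new list with the change; the original is untouched.
ys = list(range(1_000_000))
zs = [42] + ys[1:]
print(ys[0], zs[0])   # 0 42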
Of course, it would be horrendously inefficient to actually do all that copying on a real computer, because computers are inherently imperative. But Mathematica is smart enough not to really copy the whole list. It just maintains in memory a directed acyclic graph of nodes with reference counts, in such a way that it LOOKS like it made a copy of all million numbers, even though it actually did something else. And that's typical of functional languages. The compilers are smart enough to translate functional operations into equivalent imperative operations that are efficient on real hardware.
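I'm not claiming this is literally what Mathematica does internally, but here's a sketch of the general trick in Python, using a hand-rolled linked list: "replacing" the first element builds exactly one new node and shares the other 999,999 with the old list, so it looks like a full copy without doing one.

# A tiny persistent (immutable) list: each node is a (head, tail) pair.
def cons(head, tail):
    return (head, tail)

def from_items(items):
    node = None
    for item in reversed(items):
        node = cons(item, node)
    return node

def replace_first(node, value):
    head, tail = node
    return cons(value, tail)     # one new node; 'tail' is shared, not copied

old = from_items(range(1_000_000))
new = replace_first(old, 42)

print(old[0], new[0])            # 0 42
print(old[1] is new[1])          # True -- the rest of the list is shared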