Anpheus wrote:If you trust your compiler implicitly to the point where you'll let it do different things on a debug machine and on a production machine, you're making a grave error. As has already been pointed out, when the compiler changes code the errors change as well, and that makes it extremely confusing to reproduce problems.
So what should, say, Valve do? Spend four times longer in development and raise the production costs four times because they have to debug optimized code, or get smashed at release because the code runs four times longer?
Oh, good, I can come up with hypothetical strawmen too. What should Blizzard do? Spend a fourth as much time in development, release a product before it's ready, and then make up those costs fixing bugs that weren't caught due to differences between debug and production machines? Or should they release a product that works the first time, because they wrote the code in a provably correct and tested manner to begin with, and then never have to worry about those problems in production?
You're muddying the issue by conflating production builds with full-debug-symbol builds and with regular debug builds. Neither of the latter is production. And I see no reason why you shouldn't run both, obviously, to catch all possible errors, but there's really no point in strawmanning my argument and then pompously declaring that coding correctly, using tests, and doing every possible quality-control check is valuable. Duh.
We know that. I know that.
Oh, of course. But the flip side is that if you have an algorithm that can be programmed cleaner with recursion, you should do so until profiling tells you that the overhead of the runtime stack is a problem.
Absolutely, never over-optimize a problem. If you have an O(n³) algorithm and you're trying to find problems in your code, and you spend three weeks optimizing that algorithm down to something like O(n²), and later find out that n is always less than, I don't know, six... you've wasted your time. You and I clearly both know when optimization is premature, so don't start bringing up such points as if you're a bastion of clarity here. There is no "flip side" other than common sense (something most programmers lack, admittedly). My point still stands in its entirety: just because a tool is available, such as recursion, or, to bring up other topics, polymorphism, object orientation, operator overloading, or any other tool, doesn't mean you have to use it. At all. That said, if you're writing C code and you end up writing something that looks object-oriented, you probably wasted more time writing the framework for it than you saved by writing it in C instead of learning a little C++.
It's always, always about using the right tool for the job, and we both know that, EvanED. So, recursion isn't always the right tool for the job. There's a misconception instilled in people when they learn to code that they should write recursive code whenever possible, when it's often stupid to do so: Fibonacci sequences, factorials, and so on.