Ended wrote:Yeah, the issue is one of terminology, rather than anything fundamental. We wouldn't change the value of pi as such, just redefine what we meant when we said 'pi'.
You are exactly correct. Pi is just a Greek letter, the same as beta, chi, epsilon and the twenty-odd others. They are all inherently meaningless, but we give them meaning by letting them stand for ideas, such as the ratio of a circle's circumference to its diameter. What is the difference between p and pi? Well, one is Roman and one is Greek, and they are shaped a bit differently, but that is about it.
Granted, pi is usually defined and understood as the ratio of circumference to diameter of a circle, but that convention is relatively recent; according to Wikipedia, it was uncommon until Euler used it consistently in his publications.
So, in conclusion: define pi (or pii, for that matter) however you like when writing papers or doing assignments or whatever it is that you do. Just be explicit if you are not following the convention, and be consistent, and no one should have any problems. They might laugh at you behind your back, but certainly nothing more serious than that.
Felgraf wrote:I wouldn't mind changing some of the notation for physics. V is velocity, a slightly curvier v is *sometimes* used for frequency (but sometimes not! MWAHAHAH!), god help you if you have crappy handwriting and need to use both in the same equation.
You mean the lower-case Greek letter "nu" (ν), which does look a lot like a curvy v.
Alan wrote:While we are at it, we should also move everyone to use base-Pi instead of base 10. That would eliminate the need for a special symbol. Pi is represented as 1.
First, I doubt that basing a number system on a non-integer makes much sense; second, even if it did, pi would be written 10, not 1, since in any base b the base itself is one unit in the b's place and zero units in the ones place.
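That last point is easy to check with a greedy digit expansion, which works for non-integer bases too. This is a minimal sketch, not a full positional-notation implementation; `to_base` is a hypothetical helper assuming a base greater than 1 and a non-negative argument:

```python
import math

def to_base(x, base, frac_digits=5):
    """Greedy digit expansion of x >= 0 in a (possibly non-integer) base > 1."""
    # Find the highest power of the base that fits into x.
    k = 0
    while base ** (k + 1) <= x:
        k += 1
    digits = []
    # Peel off one digit per power, from base**k down to base**(-frac_digits).
    for p in range(k, -frac_digits - 1, -1):
        d = int(x // base ** p)
        digits.append((p, d))
        x -= d * base ** p
    int_part = "".join(str(d) for p, d in digits if p >= 0)
    frac_part = "".join(str(d) for p, d in digits if p < 0)
    return int_part + "." + frac_part

print(to_base(math.pi, math.pi))  # "10.00000"
print(to_base(7, 10))             # "7.00000"
```

Sure enough, pi in base pi is 10, exactly as ten in base ten is 10: one copy of the base, zero units.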