Choosing a language


User avatar
Xanthir
My HERO!!!
Posts: 5426
Joined: Tue Feb 20, 2007 12:49 am UTC
Location: The Googleplex
Contact:

Postby Xanthir » Sat Jun 23, 2007 2:30 am UTC

Of course, I'd then ask you, what *is* C's forte? ^_^

Rysto
Posts: 1460
Joined: Wed Mar 21, 2007 4:07 am UTC

Postby Rysto » Sat Jun 23, 2007 2:33 am UTC

:roll:

Come on. Systems programming, for one?

User avatar
djn
Posts: 610
Joined: Mon May 07, 2007 1:33 pm UTC
Location: Oslo, Norway

Postby djn » Sat Jun 23, 2007 8:53 am UTC

Rysto wrote:I have to say, to call C's efficiency a "myth" based on its performance in numeric and scientific computation is rather disingenuous. Numerical computation is hardly C's forte.

I'd say that one realistic definition of a "real language" would be one that's widely used in commercial software projects. By that measure, the C family would most definitely qualify, while I doubt that Python or Ruby would.


Oh, I don't know. For not-too-complicated loops that have to be fast (anything much more complicated and you'd want a higher-level language anyway), it's quite good.
Incidentally, I've just spent some months studying how to migrate numeric/scientific code from pure Python to Python modules written in C and Fortran.

User avatar
yy2bggggs
Posts: 1261
Joined: Tue Oct 17, 2006 6:42 am UTC

Re: Choosing a language

Postby yy2bggggs » Sat Jun 23, 2007 4:55 pm UTC

Xanthir wrote:Seriously, properly declared Lisp code can actually beat C in numerical computations! See this article for a discussion of the "C is efficient" fallacy.

That article is cargo cult computer science.

Rysto
Posts: 1460
Joined: Wed Mar 21, 2007 4:07 am UTC

Postby Rysto » Sat Jun 23, 2007 6:13 pm UTC

I do notice that he completely ignored the existence of the restrict keyword.

User avatar
FiddleMath
Posts: 245
Joined: Wed Oct 11, 2006 7:46 am UTC
Location: Madison, WI
Contact:

Postby FiddleMath » Sat Jun 23, 2007 7:25 pm UTC

I think...

It doesn't matter that much which language you use next. All that really matters, I think, is that you learn *how* to learn programming languages, and get done whatever you want to get done in them. Know your algorithms, know some ideas about structuring programs, understand functional decomposition, have some idea of how computers work.

Of course, different languages are built for different reasons, and have different uses. I'd focus more on making bigger and bigger projects work, so long as you're willing to learn a new language when it'll help you do that better. Know what the relative strengths of programming languages are, but that doesn't mean you need to master all of them. They practically blink in and out of existence; you'll never manage.

On the other hand, I do recommend making something substantial in C, and something else substantial in Lisp or Scheme, if only because those languages are likely to stretch your mind a little.

iw
Posts: 150
Joined: Tue Jan 30, 2007 3:58 am UTC

Postby iw » Sun Jun 24, 2007 12:22 pm UTC

C is most certainly a "real" language, as are Python and Ruby and C++ and C# and Java and yes, even PHP.

I think the important thing is to try out as many languages as possible so you understand what the best tools for the situation are. A lot of it also depends on what you feel the most comfortable working with given a situation. A lot of the time, the solution I envision to a problem is clearly suited for one language or another. So it's not really a matter of what to learn, it's a question of what to learn first.

I suggested C because it is a useful language to learn. In my opinion it's better to learn C before Python, because knowing C makes Python's nice features make a lot more sense, and I think it's better for a beginning amateur to learn C before something like Lisp or Scheme because of the difficulty factor. It's better to learn C before C++ so you don't have too much thrown at you at once.

User avatar
yy2bggggs
Posts: 1261
Joined: Tue Oct 17, 2006 6:42 am UTC

Postby yy2bggggs » Sun Jun 24, 2007 4:18 pm UTC

iw wrote:It's better to learn C before C++ so you don't have too much thrown at you at once.

What do you mean, too much thrown at you at once?

User avatar
b0b
Posts: 79
Joined: Sat Jun 23, 2007 9:33 pm UTC
Contact:

Postby b0b » Sun Jun 24, 2007 4:40 pm UTC

yy2bggggs wrote:
iw wrote:It's better to learn C before C++ so you don't have too much thrown at you at once.

What do you mean, too much thrown at you at once?

C++ is basically C with objects. If you learn C first, you learn the fundamental syntax used within functions. That syntax carries over almost verbatim to C++ object methods. I've heard C++ described as "C with twice as much punctuation". Unfortunately, that description isn't too far off the mark.

C is also a good precursor to Java. There are a lot of similarities. C is simpler, of course, because you don't have to deal with the encapsulation of code in objects. Java doesn't have as much obscure punctuation as C++ because it doesn't support pointers. (Actually, everything in Java is a pointer, but that's a layer of the onion that shouldn't concern you.)

C# is like C++ with a layer of Microsoft obfuscation on top of it. I only use it when I have to. I would never choose it for personal projects. It's way too heavy for my taste.

EvanED
Posts: 4331
Joined: Mon Aug 07, 2006 6:28 am UTC
Location: Madison, WI
Contact:

Postby EvanED » Sun Jun 24, 2007 5:17 pm UTC

b0b wrote:
yy2bggggs wrote:
iw wrote:It's better to learn C before C++ so you don't have too much thrown at you at once.

What do you mean, too much thrown at you at once?

C++ is basically C with objects.


And templates, and its libraries.

If you learn C first, you learn the fundamental syntax used within functions. That syntax carries over almost verbatim to C++ object methods.


You also learn things like "use char* for strings" and "use arrays" and "use raw pointers" and other Bad Things (TM) if you actually want to program C++ instead of C with objects.

I've never heard a reason to start with C that I buy. There's no reason you have to have everything in C++ thrown at you at once; you can learn stuff like the syntax of functions that you mention above without seeing anything more C++-specific than cout, std collections, and std::string, which are all easier to learn and use than their C counterparts. Then you can start introducing classes and stuff like that. You don't have to throw everything into the pot at once.
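
For illustration, a minimal sketch of what that first slice of C++ might look like -- nothing but iostream, std::string, and std::vector, no user-defined classes yet (the word-reading task itself is just a made-up example):

Code: Select all

#include <iostream>
#include <string>
#include <vector>

int main()
{
    std::vector<std::string> words;   // grows on its own, unlike a C array
    std::string w;
    while (std::cin >> w)             // each read resizes the string as needed
        words.push_back(w);

    std::cout << "Read " << words.size() << " words";
    if (!words.empty())
        std::cout << ", the last one was \"" << words.back() << "\"";
    std::cout << "\n";
}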

C# is like C++ with a layer of Microsoft obfuscation on top of it.


What? That is *way* misrepresenting C#. C# is *far* closer to Java than it is to C++. I describe C# as "a largely MS-only implementation of what Java should have been."

User avatar
yy2bggggs
Posts: 1261
Joined: Tue Oct 17, 2006 6:42 am UTC

Postby yy2bggggs » Sun Jun 24, 2007 5:48 pm UTC

b0b wrote:
yy2bggggs wrote:
iw wrote:It's better to learn C before C++ so you don't have too much thrown at you at once.

What do you mean, too much thrown at you at once?

C++ is basically C with objects.

If your C++ code is "C with objects", you are almost certainly writing bad C++ code. This is the trap we need to avoid.

C++ is not basically "C with objects"; it's an entirely different language. But as it supports C as a subset (with few exceptions), it's easy to fall into the trap of thinking you know C++ when all you know is how to write C code with objects.

Also, as I mentioned before, C++ is not an object oriented programming language anyway--it's a multi-paradigm programming language. Another trap to avoid is the idea that, damn the costs, you should strive to implement everything using objects. Sure, OOP is a powerful paradigm, and there's no problem with using it in most of your C++ code, but if there is a better way than objects to solve a problem, and you use objects because you think that's how you're supposed to code in C++, you've just burned yourself.

If you learn C first, you learn the fundamental syntax used within functions.

You can learn the same fundamentals in C++.
That syntax carries over almost verbatim to C++ object methods.

Problem is, if you learn C first, you're learning a lot more than syntax. You're learning the following things that you should, but probably won't, learn how to do properly (i.e., safer, more powerful, less bug prone) in C++ using methodologies that don't exist in C (this is just a partial list I'm making off the top of my head):
  • How to manipulate strings (append, copy, search, etc)
  • How to work with strings (magic character buffer sizes in C?)
  • How to manipulate memory
  • How to print things
  • How to write to files; how to read from files
  • How to perform basic manipulation of data in practical applications (sort, lookup, etc)


And finally, once you've learned to do everything the wrong way (because it's the right way in C, and you can still get things done with it, eventually), you'll think you know C++. You'll put down that you know C++ on your resume. You'll get hired, you'll work with me some day, and you'll write buggier code more slowly, and I'll have to fix it.
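
To make the string items on that list concrete, here's a hedged side-by-side sketch (a toy example, not anyone's production code): the C idioms with their magic buffer size next to the std::string equivalents.

Code: Select all

#include <cstdio>
#include <cstring>
#include <iostream>
#include <string>

int main()
{
    // The C way: pick a "magic" buffer size and hope it's big enough.
    char cbuf[32];
    std::strcpy(cbuf, "Hello, ");
    std::strcat(cbuf, "world");                   // silently overflows if the pieces outgrow 32 bytes
    const char *cpos = std::strstr(cbuf, "world");
    std::printf("%s (substring at offset %td)\n", cbuf, cpos - cbuf);

    // The C++ way: the string manages its own storage.
    std::string s = "Hello, ";
    s += "world";                                 // grows as needed
    std::string::size_type pos = s.find("world");
    std::cout << s << " (substring at offset " << pos << ")\n";
}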

User avatar
aldimond
Otter-duck
Posts: 2665
Joined: Fri Nov 03, 2006 8:52 am UTC
Location: Uptown, Chicago
Contact:

Postby aldimond » Sun Jun 24, 2007 6:10 pm UTC

EvanED wrote:
b0b wrote:
yy2bggggs wrote:
iw wrote:It's better to learn C before C++ so you don't have too much thrown at you at once.

What do you mean, too much thrown at you at once?

C++ is basically C with objects.


And templates, and its libraries.

If you learn C first, you learn the fundamental syntax used within functions. That syntax carries over almost verbatim to C++ object methods.


You also learn things like "use char* for strings" and "use arrays" and "use raw pointers" and other Bad Things (TM) if you actually want to program C++ instead of C with objects.

I've never heard a reason to start with C that I buy. There's no reason you have to have everything in C++ thrown at you at once; you can learn stuff like the syntax of functions that you mention above without seeing anything more C++-specific than cout, std collections, and std::string, which are all easier to learn and use than their C counterparts. Then you can start introducing classes and stuff like that. You don't have to throw everything into the pot at once.


Learn C so that you get a solid understanding of what pointers and structs mean in terms of memory layout, and what your code means in terms of what values are passed and copied where; what different types mean, and how to copy and cast between them; the meaning of "unsigned". That's, in my mind, easier when your strings are simple character arrays, your printf is just a function, and << is simply a bit-shift. You're operating with fairly concrete stuff that's easy to grasp.

Then try to do something big and complicated in C. Note all the tedious bookkeeping you have to do, how much you wind up relying on the preprocessor. How you wish you could have some class abstraction. How much it sucks to have to realloc stuff when you're doing string manipulations (I'm really good at low-level string manipulations; I wish I didn't have to be).

Then learn C++ and be slightly relieved of some of that hassle. But realize when you do that some of C's warts are still in place. Passing structs/objects by value still isn't usually what you want, and you'll know why (C++ even has "pass by reference", much nicer than having to pass pointers as in C, but some CS majors that don't know their C still don't use it!). String literals in code still represent pointers to static memory locations. There are still cases in C++ where a double-pointer is a good thing to use.
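
As an aside, a minimal sketch of the pass-by-value/pass-by-reference point (BigStruct is just a made-up type for the example):

Code: Select all

#include <cstring>

struct BigStruct { char payload[4096]; };

// C style, by value: the whole 4 KB is copied, and the caller never sees the change.
void by_value(BigStruct s)      { s.payload[0] = 'x'; }

// C style, by pointer: no copy, but every call site has to write &obj.
void by_pointer(BigStruct *s)   { s->payload[0] = 'x'; }

// C++ reference: no copy, no & at the call site, and the caller's object is modified.
void by_reference(BigStruct &s) { s.payload[0] = 'x'; }

int main()
{
    BigStruct b;
    std::memset(b.payload, 0, sizeof b.payload);
    by_value(b);        // b unchanged
    by_pointer(&b);     // b modified
    by_reference(b);    // b modified, and the call looks just like pass-by-value
}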

It might not be a bad idea to learn assembly before C, even.
One of these days my desk is going to collapse in the middle and all its weight will come down on my knee and tear my new fake ACL. It could be tomorrow. This is my concern.

User avatar
b0b
Posts: 79
Joined: Sat Jun 23, 2007 9:33 pm UTC
Contact:

Postby b0b » Sun Jun 24, 2007 8:18 pm UTC

Well, since I am an OO programmer, I'm not going to defend C over C++. I'm just saying that I found C++ pretty easy to learn because I already knew C. I knew the basic syntax used within C++ methods. I just had to add some new concepts and syntax to my existing knowledge.

I've worked on very large C programs. It was difficult. OOP made large programs much easier to write. I can't even write C anymore. I just don't think that way. I think OOP.

I also think that Java is easier than C++, and it's much easier than Visual C++. Just my opinion.

User avatar
yy2bggggs
Posts: 1261
Joined: Tue Oct 17, 2006 6:42 am UTC

Postby yy2bggggs » Sun Jun 24, 2007 10:14 pm UTC

b0b wrote:Well, since I am an OO programmer, I'm not going to defend C over C++.

I think you're missing the point.

User avatar
b0b
Posts: 79
Joined: Sat Jun 23, 2007 9:33 pm UTC
Contact:

Postby b0b » Tue Jun 26, 2007 4:09 am UTC

yy2bggggs wrote:
b0b wrote:Well, since I am an OO programmer, I'm not going to defend C over C++.

I think you're missing the point.

No, I think I get it. It's like some people will learn lap steel before pedal steel, and some of the stuff on lap steel doesn't translate well to pedal steel (bar slants, for instance). But the whole idea of pedals to change the tuning on the fly is totally absent in lap steel.

Some people go straight to pedal steel, and don't see any value in learning the more primitive lap steel first. I always recommend the lap steel route though, to familiarize your brain (and hands) with the basic techniques.

It's the same with C and C++. That's how I see it, anyway.

User avatar
yy2bggggs
Posts: 1261
Joined: Tue Oct 17, 2006 6:42 am UTC

Postby yy2bggggs » Tue Jun 26, 2007 9:03 am UTC

b0b wrote:
yy2bggggs wrote:
b0b wrote:Well, since I am an OO programmer, I'm not going to defend C over C++.

I think you're missing the point.

No, I think I get it. ...
It's the same with C and C++. That's how I see it, anyway.

Not quite. Everything you can do in C, you can do in C++. That's the problem.

Say I give you a beginner's assignment. You need to write a program that will ask you for your name. Afterwards, it should ask you for a series of numbers (as many as you want to enter), and print the sum out with your name.

Now, how much C do you have to know to finish this assignment? Not all that much. You don't have to know how to create variables that point to functions. You don't have to know how to use qsort. You don't have to know how to assert, what exit does, or how to dup files. You just have to know a few basic things--string manipulation, how to read in integers, how to loop, and how to make a terminator for your loop... how to add, and print things out. Just a tiny bit of C.

Now I've got two places to go with this...

First off, what do you suppose you need to know in C++ to write the same assignment? Not much. You don't need to know OOP, templates, or how to use std::sort. You don't need to know how to declare anonymous namespaces, or how to use iterators. Just the same basic things you need to know for C would do. In C++ there are some better basics (string, for example), but they are still basics.

Second, let's say you've learned how to do all of your assignments in C, and now you are given a C++ compiler. You are told you need to learn C++. And you are given an assignment--you need to read in your name, and a series of integers, etc. Now write the program. What will you do? Simple--you'll write a C program. Why? Because C++ just extends C (for all practical purposes), and your C program will compile and work just fine. Will it run? You betcha! Anything you do in C will work in C++. And your C program will be very likely to have a security bug in it--namely, a buffer overrun vulnerability. If you did it using C++ 101 knowledge, however, it'd work just fine, but with no buffer overrun vulnerabilities.

Scale the problem up, and that is what I'm talking about.
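
For reference, a minimal sketch of that assignment written with nothing but C++ 101 pieces -- std::string, std::getline, cin, a loop -- and no fixed-size character buffer anywhere to overrun (the prompts are obviously just made up):

Code: Select all

#include <iostream>
#include <string>

int main()
{
    std::string name;
    std::cout << "What is your name? ";
    std::getline(std::cin, name);      // the string grows to fit whatever is typed

    std::cout << "Enter numbers, anything non-numeric to stop:\n";
    long sum = 0;
    long n;
    while (std::cin >> n)              // the loop terminates on bad input or end-of-file
        sum += n;

    std::cout << name << ", your sum is " << sum << "\n";
}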

iw
Posts: 150
Joined: Tue Jan 30, 2007 3:58 am UTC

Postby iw » Tue Jun 26, 2007 11:37 am UTC

yy2bggggs wrote:Second, let's say you've learned how to do all of your assignments in C, and now you are given a C++ compiler. You are told you need to learn C++. And you are given an assignment--you need to read in your name, and a series of integers, etc. Now write the program. What will you do? Simple--you'll write a C program. Why? Because C++ just extends C (for all practical purposes), and your C program will compile and work just fine. Will it run? You betcha! Anything you do in C will work in C++. And your C program will be very likely to have a security bug in it--namely, a buffer overrun vulnerability. If you did it using C++ 101 knowledge, however, it'd work just fine, but with no buffer overrun vulnerabilities.

Scale the problem up, and that is what I'm talking about.

But that problem would therefore exist no matter what programming language you start with. If you learn C++, and then you try to learn Python next, you're going to want to write a C++ program. It may be harder in Python, but if you have this problem, you're going to try to write C++ programs anyway (imagine lots of "for i in range(1,n)"). If you learn C++ first, and then later have to use C, you'll probably start programming in object-oriented C, which is usually not the right idea.

The idea of learning several different languages is to figure out what works best in those languages, thus expanding your mind. If you start coding C in C++, you aren't doing any learning; at that point it's your fault. The idea here is that we want to offer the beginner as painless of a transition as possible between languages, and going from C to C++ is less painful than learning all the features of C++ right away.

User avatar
evilbeanfiend
Posts: 2650
Joined: Tue Mar 13, 2007 7:05 am UTC
Location: the old world

Postby evilbeanfiend » Tue Jun 26, 2007 12:25 pm UTC

b0b wrote:...whole guitar/language syntax analogy...


the catch being that learning the syntax for a new language really doesn't take very long at all; it's the whole what-and-how to do things morally as well as legally in the language that takes the time. so if your objective is to learn c++ you really are better off going straight to c++.

that said, 'learning c++' is a pretty narrow objective. most people actually have an objective of 'become a better programmer', and in that case learning both is a good idea (with not much difference whichever way round you do it). i can't think of a single language that doesn't teach you something by learning it (well, maybe esoteric ones, but then the point of them is arguably to learn something by making one up in the first place)
in ur beanz makin u eveel

UltraNurd
Posts: 5
Joined: Mon Jun 25, 2007 5:34 pm UTC
Location: Watertown, Boston, MA
Contact:

Postby UltraNurd » Tue Jun 26, 2007 7:33 pm UTC

For me, the first thing I usually ask is "do I need to handle multilingual text easily?". If the answer to that is yes, then I decide between Perl and Python, which usually comes down to my mood, how complicated it's likely to be, and whether I think I need objects or just objectish things.

Then again, I could be biased, because right now I am fully engaged in wchar_t/char/wstring/string induced pain.

User avatar
aldimond
Otter-duck
Posts: 2665
Joined: Fri Nov 03, 2006 8:52 am UTC
Location: Uptown, Chicago
Contact:

Postby aldimond » Tue Jun 26, 2007 9:58 pm UTC

yy2bggggs wrote:And your C program will be very likely to have a security bug in it--namely, a buffer overrun vulnerability.


If you're a shitty C coder, then yes, your program will have buffer overruns.

It's been a while since I did any C++; what is the C++ 101 way to read a value into a string without a buffer overrun? Is that handled automatically by operator>>(istream, std::string) (or whatever the right way to refer to that is)?
One of these days my desk is going to collapse in the middle and all its weight will come down on my knee and tear my new fake ACL. It could be tomorrow. This is my concern.

EvanED
Posts: 4331
Joined: Mon Aug 07, 2006 6:28 am UTC
Location: Madison, WI
Contact:

Postby EvanED » Tue Jun 26, 2007 11:10 pm UTC

aldimond wrote:
yy2bggggs wrote:And your C program will be very likely to have a security bug in it--namely, a buffer overrun vulnerability.


If you're a shitty C coder, then yes, your program will have buffer overruns.


Then it seems that almost every C program ever written has had shitty programmers on the team.

This is somewhat in jest, but C makes it easy to screw up.

It's been a while since I did any C++; what is the C++ 101 way to read a value into a string without a buffer overrun? Is that handled automatically by operator>>(istream, std::string) (or whatever the right way to refer to that is)?


Yes. cin >> mystring; can't overflow unless your library/compiler is buggy.
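
(A small self-contained illustration: operator>> reads one whitespace-delimited token, std::getline a whole line, and in both cases the string allocates whatever it needs.)

Code: Select all

#include <iostream>
#include <string>

int main()
{
    std::string word, rest;
    std::cin >> word;               // one token; the string resizes itself, so no overrun
    std::getline(std::cin, rest);   // the remainder of that line, same guarantee
    std::cout << word << "|" << rest << "\n";
}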

User avatar
yy2bggggs
Posts: 1261
Joined: Tue Oct 17, 2006 6:42 am UTC

Postby yy2bggggs » Wed Jun 27, 2007 1:36 am UTC

iw wrote:But that problem would therefore exist no matter what programming language you start with.

Yes and no. Certainly, there are paradigm shifts when moving from language to language--in C and C++ you should be thinking in terms of offsets; in Ada, your ranges should just match your domain. There are beginning C coders migrating from Pascal who do the #define BEGIN { and #define END } thing. And yes, this is sort of an issue. But this isn't what I was talking about exactly.

The issue stems from C++ supporting C in its entirety; it's a fundamentally different type of thing than going from C++ to Python. It's an issue of knowing when you know C++. And it's a lot more than just theoretical: it's damned near impossible to tell whether someone really knows C++ from the amount of experience they claim, even without any lying on the resume, specifically because some people don't know that they don't know--they really only know "C with objects", and all the experience they bring up was spent brushing up their techniques and skills with "C with objects".

And yes, these poor "it's their fault" people really did improve their skills with "C with objects" during those five years; sadly, they just didn't know that a few weeks spent learning such-and-such would have been worth three of those years, or that learning this little thing over here is invaluable.

Starting right from scratch goes a long way. And it's not harder to do--it is in fact easier.

User avatar
taggedunion
Posts: 146
Joined: Fri Jul 06, 2007 6:20 am UTC
Location: BEHIND YOU

Postby taggedunion » Fri Jul 06, 2007 5:40 pm UTC

I maintain, along with others, that Python and Ruby are indeed REAL programming languages. I don't see why "real" programming languages must be the ones with braces and static typing. In that case, assembler isn't a real one, and it's what eventually everything is in! And assembler especially would qualify if "not easy to use" is part of the definition of a "real" language. :P

Also, "scripting language" really shouldn't be a pejorative anymore. Maybe in the day of shell, Awk, and such, but Perl blew that out of the water, and that and Python and Ruby are amazingly powerful and relatively easy to use right out of the box. That's more than I can say about C and C++. I mean, Perl is the computer language closest to natural language, which any linguist could tell you is quite a feat. It is not a mere "scripting" language!

I like C, and respect it for what it does, but I try to stay away from the monster that is C++ if possible. C is a fun one to know, but one to avoid actually using, much as one avoids using TNT for landscaping.

I despise Java and its incredible verbosity. Why the hell do I need to fully type out "NoSuchElementException" and "implements"? Eclipse makes it somewhat palatable.

I highly recommend Lua after Python as a language to learn. It's a superb exercise in simplicity. A lot of games, including WoW and LucasArts games, use Lua as their scripting language. There's a port for the PSP and I think the DS as well.

You can use OO programming principles in most any language, including C. You just don't get the stuff normally taken for granted, like the virtual table of function pointers and passing around the this reference. And an object is a glorified table, really. Look at Lua for an example. :)
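
For illustration, a minimal sketch of that hand-rolled approach (a made-up Animal "class"; the snippet is written so it compiles as C or as C++): the "vtable" is just a struct of function pointers, and the this reference is an explicit self argument.

Code: Select all

#include <stdio.h>

struct Animal;                                    /* forward declaration */

struct AnimalVTable {                             /* the hand-rolled vtable */
    void (*speak)(struct Animal *self);
};

struct Animal {                                   /* the "object": vtable pointer plus data */
    const struct AnimalVTable *vt;
    const char *name;
};

static void dog_speak(struct Animal *self) { printf("%s says woof\n", self->name); }
static void cat_speak(struct Animal *self) { printf("%s says meow\n", self->name); }

static const struct AnimalVTable dog_vt = { dog_speak };
static const struct AnimalVTable cat_vt = { cat_speak };

int main(void)
{
    struct Animal pets[2] = { { &dog_vt, "Rex" }, { &cat_vt, "Tom" } };
    for (int i = 0; i < 2; ++i)
        pets[i].vt->speak(&pets[i]);              /* "virtual" dispatch, done by hand */
    return 0;
}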

Well, anyway, after all that, a sort of summary or point:
* C is good to learn. Avoid it if possible.
* C++ is something I know little about and wish to keep it that way.
* Python, Perl, and Ruby are indeed REAL languages.
* Lua is really cool -- go check it out!
* OO is a design practice, not just a language feature.
Yo tengo un gato en mis pantelones.

User avatar
b0b
Posts: 79
Joined: Sat Jun 23, 2007 9:33 pm UTC
Contact:

Postby b0b » Fri Jul 06, 2007 5:55 pm UTC

taggedunion wrote:I maintain, along with others, that Python and Ruby are indeed REAL programming languages. I don't see why "real" programming languages must be the ones with braces and static typing. In that case, assembler isn't a real one, and it's what eventually everything is in!

Not true at all. Assembler is a human-readable abstraction of machine code. Most languages boil down to machine code eventually, but few of them pass through Assembler along the way.
taggedunion wrote:Also, "scripting language" really shouldn't be a pejorative anymore. Maybe in the day of shell, Awk, and such, but Perl blew that out of the water ...

Perl gave scripting languages a bad name. Awk was much more respectable, IMHO.
taggedunion wrote:I despise Java and its incredible verbosity. Why the hell do I need to fully type out "NoSuchElementException" and "implements"? Eclipse makes it somewhat palatable.

I take it that you don't have "strong typing skills". Perhaps you'd do better with a game controller as your input device. :P

User avatar
taggedunion
Posts: 146
Joined: Fri Jul 06, 2007 6:20 am UTC
Location: BEHIND YOU

Postby taggedunion » Fri Jul 06, 2007 6:35 pm UTC

b0b wrote:
taggedunion wrote:I maintain, along with others, that Python and Ruby are indeed REAL programming languages. I don't see why "real" programming languages must be the ones with braces and static typing. In that case, assembler isn't a real one, and it's what eventually everything is in!

Not true at all. Assembler is a human-readable abstraction of machine code. Most languages boil down to machine code eventually, but few of them pass through Assembler along the way.


Well, most pass through C and C++ which are compiled to ASM and then assembled to machine code. I wanted to say machine code, really, but opcodes and such in bit form were farther away than my example demanded.

taggedunion wrote:Also, "scripting language" really shouldn't be a pejorative anymore. Maybe in the day of shell, Awk, and such, but Perl blew that out of the water ...

Perl gave scripting languages a bad name. Awk was much more respectable, IMHO.


Yeah, I don't particularly like Perl, but you have to admit that it is one powerful language.

I must admit I'm overstepping my bounds here. For me, all that stuff is history. I'm quite familiar with shell but my Awk is negligible, so I shouldn't have brought it up. I was trying to voice my opinion on how people seem to dismiss languages like Perl, Python, etc. because they are "scripting languages". From what I know of shell, minimalist languages like Lua, and, hell, JavaScript, languages like Perl, Python, etc. are much more powerful and fuller-featured. But disregarding connotations, Python is indeed an excellent scripting language, and my system scripting language of choice.

taggedunion wrote:I despise Java and its incredible verbosity. Why the hell do I need to fully type out "NoSuchElementException" and "implements"? Eclipse makes it somewhat palatable.

I take it that you don't have "strong typing skills". Perhaps you'd do better with a game controller as your input device. :P


Well, :P to you too. I actually don't like most video/computer games and I haven't touched a controller, much less a system (like XBox or PS2 or whatever), in over a year. I'm a quick typist and make good use of Vim. I'm just more in the six-characters-or-less name camp, and don't like typing long strings of characters for a single name, especially if I have to repeat that name often. The "implements" example was probably a bit overboard, but I chose those names as examples of how ridiculously long Java built-in and library names can get. Of course, there's always the problem of too-short, single-character variable names with no meaning. But I err on the side of short.
Yo tengo un gato en mis pantelones.

EvanED
Posts: 4331
Joined: Mon Aug 07, 2006 6:28 am UTC
Location: Madison, WI
Contact:

Postby EvanED » Sat Jul 07, 2007 12:48 am UTC

taggedunion wrote:Also, "scripting language" really shouldn't be a pejorative anymore. Maybe in the day of shell, Awk, and such, but Perl blew that out of the water, ...


IMHO, "Perl" should be a pejorative by now ;-)

...and it and Python and Ruby are amazingly powerful and relatively easy to use right out of the box. That's more than I can say about C and C++. I mean, Perl is the computer language closest to natural language, which any linguist could tell you is quite a feat.


*boggles*

Besides ASM, I'd have a hard time thinking of a language that I would consider *less* like natural language.

I despise Java and its incredible verbosity. Why the hell do I need to fully type out "NoSuchElementException" and "implements"? Eclipse makes it somewhat palatable.


I think that there are a number of things in Java that are excessive, but I do understand where they are coming from. Remember, your code will be read a lot more (and probably by a lot more eyes) than it's written. If avoiding an abbreviation makes the code clearer or results in less confusion, it's worth it. For words long enough that abbreviations make sense, there are a zillion ways to produce a reasonable-sounding abbreviation, so I think you should only use one if the abbreviation is standard or you are able to make it standard within the context of your code.

For instance, Java could have standardized on something like "Exn" for exception, then your example becomes "NoSuchElementExn", but I'm not sure I would reduce it further. (You could probably do BadElementExn.)

(This isn't to say that abbreviations are the devil's spawn either. I just posted a sarcastic message to TheDailyWTF a couple days ago about how Java's designers thought that abbreviating 'integer' to 'int' was perfectly reasonable but abbreviating 'boolean' to 'bool' was out of the question. A couple people responded with messages saying readability above conciseness, and that even abbreviating integer doesn't make sense now that we don't need the extra bytes you save, and I responded effectively saying I thought that idea was pretty dumb. But the abbreviation should make sense even if you're not too familiar with the code base. I believe Ken Thompson or Dennis Ritchie has said that if they could fix one mistake with Unix, it would be to spell 'creat' with an 'e'.)

You can use OO programming principles in most any language, including C. You just don't get the stuff normally taken for granted, like the virtual table of function pointers and passing around the this reference. And an object is a glorified table, really. Look at Lua for an example. :)


You also lose everything else that you get with C++, which totally overshadows the syntactic sugar of objects and is responsible for why I like C++, and which I have expounded upon in several threads. If I were to give up one major feature of C++, the first one to go would be objects, and I would go back to C-style OO programming.

User avatar
taggedunion
Posts: 146
Joined: Fri Jul 06, 2007 6:20 am UTC
Location: BEHIND YOU

Postby taggedunion » Sat Jul 07, 2007 3:25 am UTC

EvanED wrote:*boggles*

Besides ASM, I'd have a hard time thinking of a language that I would consider *less* like natural language.


Larry Wall, the creator of Perl, was himself trained as a linguist and got a job hacking Unix to pay the bills. Many of the "weird" features of Perl are derived from linguistics, with the wide variety of options -- TIMTOWTDI, "there's more than one way to do it" -- reflective of natural language (synonyms). Because of this great diversity, Perl users develop a "dialect" of Perl -- a subset of the full language determined by peers, needs, and personal taste.

You -- no, everyone -- should read some of Larry Wall's papers on Perl. While I agree that it is a visually unappealing language and I wouldn't want to work with it if I could help it, you have to admit that it is engineering genius. Compiler and interpreter writers (like me) would find these papers useful, as well as full of insights into language design.

...snip part about Java...


I understand. I just don't like typing them out. I think a lot of what Python did for keyword and builtin names, including the exceptions, is a good model in brevity vs. intelligibility. Like, IndexError vs. ArrayIndexOutOfBoundsException. Java does distinguish errors and exceptions, but Python seems to do just fine with the namespace overlap in that department.

You also lose everything else that you get with C++, which totally overshadows the syntactic sugar of objects and is responsible for why I like C++, and which I have expounded upon in several threads. If I were to give up one major feature of C++, the first one to go would be objects, and I would go back to C-style OO programming.


Yeah, objects and OO are mostly hype. A large part of what's considered "OO" is actually "OO as Java sees it" and includes a lot of things that aren't necessarily OO. Look at Simula and Self for the origins of, and a counterexample to, Java's approach.

I agree that C++ is plenty powerful and has many awesome constructs, but I avoid it for largely the same reason I avoid Perl: I find it ugly and clunky. I'm not saying others can't use it, or can't feel good about using it! I just don't want it for myself.

Hehe... I don't like C++, Java, C#... I use C sparingly... you know what I should pick up? The D Programming Language. :D
Yo tengo un gato en mis pantelones.

EvanED
Posts: 4331
Joined: Mon Aug 07, 2006 6:28 am UTC
Location: Madison, WI
Contact:

Postby EvanED » Sat Jul 07, 2007 3:43 am UTC

taggedunion wrote:I understand. I just don't like typing them out. I think a lot of what Python did for keyword and builtin names, including the exceptions, is a good model in brevity vs. intelligibility. Like, IndexError vs. ArrayIndexOutOfBoundsException. Java does distinguish errors and exceptions, but Python seems to do just fine with the namespace overlap in that department.


IndexError seems fine to me. I was going to say that I like the tag on the name of exceptions just to distinguish them from other classes (rather than errors from exceptions), but now that I think about it further their presence in a throw, throws, or catch clause should be plenty documentation. ;-)

BadElement would be okay for the one you suggested before.

Yeah, objects and OO are mostly hype. A large part of what's considered "OO" is actually "OO as Java sees it" and includes a lot of things that aren't necessarily OO.


I... don't go that far. If I were forced to use C, but still code it however I wanted, I would still use OO design. In some sense, I would be writing C++ in C.

Hehe... I don't like C++, Java, C#... I use C sparingly... you know what I should pick up? The D Programming Language. :D


I'm not sure if you're being facetious or not, but from what I know, D looks like an awesome language. I just wish the tool support were better. Near as I can tell, it's a clean version of C++. I would miss macros, even in their watered-down, crappy version in C/C++, but the other benefits seem like they would make up for it. (There IS a reason that I am insistent on the macro thing in the Lisp thread, and it's because I (over)use macros already. I want to see what proper support will let you do.)

User avatar
taggedunion
Posts: 146
Joined: Fri Jul 06, 2007 6:20 am UTC
Location: BEHIND YOU

Postby taggedunion » Sat Jul 07, 2007 4:00 am UTC

EvanED wrote:BadElement would be okay for the one you suggested before.


Or BadElem?

Yeah, the Hungarian notation is certainly useful. All the exception handling should be a clue though, yes. :) That reminds me: I don't like Java's checked exceptions, and pull in Python for the counterexample of unchecked exceptions. Unchecked exceptions would remove a lot of boilerplate and make things safer in a way (no more broadly catching Exception in main). I read somewhere that checked exceptions hampered code production in anything larger than a small project, but damned if I can recall the source.

Yeah, objects and OO are mostly hype. A large part of what's considered "OO" is actually "OO as Java sees it" and includes a lot of things that aren't necessarily OO.


I... don't go that far. If I were forced to use C, but still code it however I wanted, I would still use OO design. In some sense, I would be writing C++ in C.


OO is certainly still useful, and I use it on a regular basis. But I like Python's interpretation of it, not Java's overbearing one.

Hehe... I don't like C++, Java, C#... I use C sparingly... you know what I should pick up? The D Programming Language. :D


I'm not sure if you're being facetious or not, but from what I know, D looks like an awesome language. I just wish the tool support were better. Near as I can tell, it's a clean version of C++. I would miss macros, even in their watered-down, crappy version in C/C++, but the other benefits seem like they would make up for it. (There IS a reason that I am insistent on the macro thing in the Lisp thread, and it's because I (over)use macros already. I want to see what proper support will let you do.)


I would use D except:
1) Bad/incomplete/lacking libraries
2) Already know C
3) My roommate was a D evangelist and turned me off to it a bit :P

It puts a whole helluva lotta features from functional programming languages in it, and lots of other crazy stuff. You can see Walter Bright give talks on his ideas and possible directions for the design of D on Google Video or Youtube. It's absolutely crazy what this dude is doing.

And macros? C/C++ macros are weak textual ones, and are hardly worth comparing to Lisp's macros. Yeah, I wish more languages had syntactical macros, but look at it from a design angle: syntactical macros have to work on the abstract syntax tree, which is fairly simple in Lisp because the code is the AST. In other languages things are more complicated. Ruby might have a shot as of now, since the language is interpreted from its AST rather than from bytecodes. The macros would be in a completely different language from the rest of the code, though.

Revisiting D -- I haven't switched to it in much the same way as I haven't switched to Ruby -- I already know a language similar enough that does pretty much the same thing. I already know Python, and I toy with Ruby but haven't fully picked it up. I already know C and can hack it pretty well, so I don't really bother with D. I can justify it less with the latter, though, as D is in many ways a much superior language. Also, it doesn't scare me away like C++.
Yo tengo un gato en mis pantelones.

EvanED
Posts: 4331
Joined: Mon Aug 07, 2006 6:28 am UTC
Location: Madison, WI
Contact:

Postby EvanED » Sat Jul 07, 2007 4:16 am UTC

taggedunion wrote:
EvanED wrote:BadElement would be okay for the one you suggested before.


Or BadElem?

Yeah, the Hungarian notation is certainly useful. All the exception handling should be a clue though, yes. :) That reminds me: I don't like Java's checked exceptions, and pull in Python for the counterexample of unchecked exceptions. Unchecked exceptions would remove a lot of boilerplate and make things safer in a way (no more broadly catching Exception in main). I read somewhere that checked exceptions hampered code production in anything larger than a small project, but damned if I can recall the source.


It doesn't surprise me.

I'm not sure what to think of them, whether they are a good idea or not. I said in the Lisp thread I'm a big static typing fan; I like to make the compiler do work for me. So the idea of having it determine whether I'm handling everything I should, error-wise, is appealing. On the other hand, there is the typical counterargument to static typing, which is that type errors (and in this case failures to catch exceptions) are such a small set of what can go wrong that they should be caught by unit tests anyway; and checked exceptions DO add a lot of boilerplate code.

C# went the opposite way for instance.

Perhaps there is a middle ground. Have some notion of a module boundary (not sure how you get that), and do type inference to determine what exceptions can reach the module boundary, then just force the exported functions to declare their thrown exceptions.

You can see Walter Bright give talks on his ideas and possible directions for the design of D on Google Video or Youtube. It's absolutely crazy what this dude is doing.


I'll check them out.

And macros? C/C++ macros are weak textual ones, and are hardly worth comparing to Lisp's macros. Yeah, I wish more languages had syntactical macros, but look at it from a design angle: syntactical macros have to work on the abstract syntax tree, which is fairly simple in Lisp because the code is the AST. In other languages things are more complicated.


Right. Dylan manages this too, despite not being a Lisp on its face. (Though it shares a lot more with Lisp than is immediately apparent; in fact, early on it was going to use S-expressions as its syntax.)

Then camlp4 appears to do direct modification of the ASTs, though I haven't read enough to see exactly how this is done.

Still, textual substitution macros are better than nothing. And I think it would be possible to improve CPP macros quite a bit too, to make them safer. (For instance, scope them like other names.)

Revisiting D -- I haven't switched to it in much the same way as I haven't switched to Ruby -- I already know a language similar enough that does pretty much the same thing. I already know Python, and I toy with Ruby but haven't fully picked it up. I already know C and can hack it pretty well, so I don't really bother with D. I can justify it less with the latter, though, as D is in many ways a much superior language. Also, it doesn't scare me away like C++.


Gotcha. That's probably part of the reason for me too. I think I know C++ pretty damn well (as distinct from being a good programmer; I don't have a good way of measuring that), and I've put in a LOT of time to reach the point where I can say that. So I'm at the point where I take the ugliness in C++ in stride and can work around it, which reduces the motivation to not use it.

User avatar
b0b
Posts: 79
Joined: Sat Jun 23, 2007 9:33 pm UTC
Contact:

Postby b0b » Sat Jul 07, 2007 4:22 am UTC

Macros are evil. :evil:

EvanED
Posts: 4331
Joined: Mon Aug 07, 2006 6:28 am UTC
Location: Madison, WI
Contact:

Postby EvanED » Sat Jul 07, 2007 4:49 am UTC

b0b wrote:Macros are evil. :evil:


No, improper or poor uses of macros are evil.

Well, no, that's not necessarily true. CPP macros are evil. However, they are also wonderful at the same time, and the latter outweighs the former.

User avatar
b0b
Posts: 79
Joined: Sat Jun 23, 2007 9:33 pm UTC
Contact:

Postby b0b » Sat Jul 07, 2007 3:29 pm UTC

EvanED wrote:
b0b wrote:Macros are evil. :evil:


No, improper or poor uses of macros are evil.

Well, no, that's not necessarily true. CPP macros are evil. However, they are also wonderful at the same time, and the latter outweighs the former.

You're wrong, but I'll humor you for the moment out of curiosity. Please cite a "wonderful" use of a macro.

User avatar
yy2bggggs
Posts: 1261
Joined: Tue Oct 17, 2006 6:42 am UTC

Postby yy2bggggs » Sat Jul 07, 2007 5:42 pm UTC

b0b wrote:Please cite a "wonderful" use of a macro.

http://forums.xkcd.com/viewtopic.php?t=5587

zenten
Posts: 3799
Joined: Fri Jun 22, 2007 7:42 am UTC
Location: Ottawa, Canada

Postby zenten » Sat Jul 07, 2007 5:59 pm UTC

Why do people keep on saying to learn Python, and then learn Java later to get OO down? Python is by nature OO; it doesn't even really have primitives.

EvanED
Posts: 4331
Joined: Mon Aug 07, 2006 6:28 am UTC
Location: Madison, WI
Contact:

Postby EvanED » Sat Jul 07, 2007 6:01 pm UTC

b0b wrote:
EvanED wrote:
b0b wrote:Macros are evil. :evil:


No, improper or poor uses of macros are evil.

Well, no, that's not necessarily true. CPP macros are evil. However, they are also wonderful at the same time, and the latter outweighs the former.

You're wrong, but I'll humor you for the moment out of curiosity. Please cite a "wonderful" use of a macro.


I'll name three.

1. assert, and similar macros.

These have to take an expression, because they evaluate its truthiness, but they also stringize it so that the expression can be displayed when the assertion fails. If you wanted to do that without a macro, you would need to repeat the expression in a string literal.
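
As a hedged illustration of the stringizing trick (a toy MY_ASSERT, not the real <cassert> implementation):

Code: Select all

#include <cstdio>
#include <cstdlib>

// #expr turns the expression into a string literal, so the failure
// message can show the source text without repeating it by hand.
#define MY_ASSERT(expr)                                               \
    do {                                                              \
        if (!(expr)) {                                                \
            std::fprintf(stderr, "Assertion failed: %s (%s:%d)\n",    \
                         #expr, __FILE__, __LINE__);                  \
            std::abort();                                             \
        }                                                             \
    } while (0)

int main()
{
    int x = 2;
    MY_ASSERT(x + 2 == 4);   // passes silently
    MY_ASSERT(x == 3);       // prints "Assertion failed: x == 3 (...)" and aborts
}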


Items 2 and 3 are in a similar vein, because they both use macros to generate a lot of similar code.

2. This one is from my code. I was working with a SATisfiability solver (technically an SMT solver), and needed to generate ASTs describing the expressions I wanted solved.

The solver's API is a bunch of C functions along the lines of 'sat_expr* make_and(sat_expr*, sat_expr*)' (where sat_expr* is a typedef of void* or something like that). Working in C++, I made a C++ interface to it. This included a class that represented expressions, say Expr. Expr contains a field, .expr, that contains a sat_expr*.

One of the main motivations for doing this was that I could abuse operator overloading in the Expr class to build expression trees. For instance, I had a function

Code: Select all

Expr operator&& (Expr lhs, Expr rhs)
{
    Expr out;
    out.expr = make_and(lhs.expr, rhs.expr);
    return out;
}

but I needed one of these functions for every operator I needed to support: +, -, *, <, <=, >=, >, ==, !=, &&, ||, and probably one or two I'm not thinking of.

That's a lot of horribly redundant code to write. So I made a macro and used it thusly:

Code: Select all

#define DEFINE_EXPR_OP( cpp_opname, sat_opname )    \
    Expr operator cpp_opname (Expr lhs, Expr rhs)   \
    {                                               \
        Expr out;                                   \
        out.expr = sat_opname(lhs.expr, rhs.expr);  \
        return out;                                 \
    }

DEFINE_EXPR_OP(+, make_plus)
DEFINE_EXPR_OP(-, make_sub)
DEFINE_EXPR_OP(*, make_mult)
DEFINE_EXPR_OP(<, make_lt)
DEFINE_EXPR_OP(<=, make_lte)
DEFINE_EXPR_OP(>, make_gt)
DEFINE_EXPR_OP(>=, make_gte)
DEFINE_EXPR_OP(==, make_eql)
DEFINE_EXPR_OP(!=, make_neq)
DEFINE_EXPR_OP(&&, make_and)
DEFINE_EXPR_OP(||, make_and)

#undef DEFINE_EXPR_OP


21 lines of code vs. ~75 lines of less maintainable code that you need without macros.


3. There's a similar use of this thing in the Linux kernel, especially for defining sysfs accessors and mutators. However, I can't get to the LXR now to cite it because my ISP sucks donkey balls, and I can't get to about 10% of the sites I try even if I know they are up...

User avatar
yy2bggggs
Posts: 1261
Joined: Tue Oct 17, 2006 6:42 am UTC

Postby yy2bggggs » Sat Jul 07, 2007 6:05 pm UTC

EvanED wrote:

Code: Select all

DEFINE_EXPR_OP(||, make_and)

I hope that's a typo.

Rysto
Posts: 1460
Joined: Wed Mar 21, 2007 4:07 am UTC

Postby Rysto » Sat Jul 07, 2007 6:13 pm UTC

It also comes in very handy for all kinds of debugging. Like functions that handle mutexes:

#define mtx_lock(mtx) _mtx_lock(mtx, __LINE__, __FILE__, __func__)

Now your mutexes can track exactly where they were locked, but it's invisible to the users of your mutexes.
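
A hedged sketch of what the other half of that might look like (hypothetical tracked_mutex and _mtx_lock names, built on plain pthreads; real kernel implementations differ):

Code: Select all

#include <pthread.h>
#include <cstdio>

struct tracked_mutex {
    pthread_mutex_t mtx;
    const char     *last_file;    // where the lock was last taken
    int             last_line;
    const char     *last_func;
};

void _mtx_lock(tracked_mutex *m, int line, const char *file, const char *func)
{
    pthread_mutex_lock(&m->mtx);
    m->last_file = file;          // record the call site for debugging
    m->last_line = line;
    m->last_func = func;
}

// Callers just write mtx_lock(&m); the location is captured invisibly.
#define mtx_lock(m) _mtx_lock((m), __LINE__, __FILE__, __func__)

int main()
{
    tracked_mutex m = {};
    pthread_mutex_init(&m.mtx, nullptr);
    mtx_lock(&m);
    std::printf("locked at %s:%d in %s\n", m.last_file, m.last_line, m.last_func);
    pthread_mutex_unlock(&m.mtx);
}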

EvanED
Posts: 4331
Joined: Mon Aug 07, 2006 6:28 am UTC
Location: Madison, WI
Contact:

Postby EvanED » Sat Jul 07, 2007 6:23 pm UTC

yy2bggggs wrote:
EvanED wrote:

Code: Select all

DEFINE_EXPR_OP(||, make_and)

I hope that's a typo.


Oops, yeah. That's what I get for retyping it instead of bringing up the code and copy&pasting...

Rysto
Posts: 1460
Joined: Wed Mar 21, 2007 4:07 am UTC

Postby Rysto » Sat Jul 07, 2007 7:43 pm UTC

I find that I spend a lot more of my coding time thinking than I do typing, so I really don't think that Java's verbosity is much of an issue. It's not like I'm a lightning-fast typist, either. My typing style is much closer to hunt and peck than it is to real touch typing.

