I find this article very interesting. I can see a time, in the not-too-distant future, when higher-level languages will routinely outperform lower-level languages for the programs humans actually write. By this I mean that a higher-level language will manage a typical program's computing resources more efficiently than the hand-coded micro-optimizations humans like to put in.
This article mentions GC as an area where high-level management beats humans. Here's the text version, from the Google cache, of a presentation Hans Boehm gave on the issues surrounding GC vs. malloc performance. Hans Boehm, being a careful scientist, doesn't argue an unambiguous case either way, but he does explode a lot of the myths people hold about these questions.
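To illustrate the kind of case Boehm discusses (this sketch is my own, not taken from his slides): in a generational collector, allocating a short-lived object is typically just a pointer bump, and objects that die young never need to be visited individually, whereas a malloc/free version pays per-object bookkeeping on every free. The `Point` class below is a made-up example.

```java
// Hypothetical illustration: millions of short-lived allocations.
// Under a generational GC, each `new` is roughly a pointer bump and
// the dead Points are reclaimed wholesale when the nursery is swept;
// a hand-rolled malloc/free or object-pool "optimization" for the
// same pattern adds per-object work the collector never does.
public class GcSketch {
    static final class Point {
        final int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    public static void main(String[] args) {
        long sum = 0;
        for (int i = 0; i < 1_000_000; i++) {
            Point p = new Point(i, i + 1); // young-generation allocation
            sum += p.x + p.y;              // p is garbage immediately after
        }
        System.out.println(sum);           // prints 1000000000000
    }
}
```

The point isn't that this loop is fast in absolute terms, but that the "obvious" manual optimization (pooling and reusing `Point` objects) can easily be slower than letting the collector handle it, which is exactly the sort of myth Boehm's presentation examines.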
There's also this delightful little story about a success built on prototyping in a higher-level language.