So Apple is regretting the corner they painted themselves into by having their core development language be Objective-C. This language is a horrid mashup made half of Smalltalk and half of traditional unreconstructed C. Compared to C++, the modern half is more modern, but the primitive half is more primitive. Steve Jobs used it for NeXT during his time away from Apple, and brought it back with him. But what looked cool and exciting in 1986 is looking awfully outdated today.
The trend in the industry is clearly moving away from these half-and-half languages, toward stuff that doesn’t inherit primitive baggage from the previous century. Microsoft has had great success by stripping all the old C-isms out of C++ to make C#, and Java — the oldest and crudest of this new generation of programming languages — may still be the world’s most widely used language, even though most people probably now see it as something that’s had its day and is not the place to invest future effort.
Now Apple has announced a nu-school language of their own, to replace Objectionable-C. They’re calling it Swift. It’s even more hep and now and with-it than C#. There’s just one problem: there’s already another computer language using the name. It’s a scripting language for parallel computing. Its purpose is to make it easy to spread work over many computers at once. And this, to me, is far more interesting than Apple’s new me-too language. (Or any of the other new contenders coming up, like Google’s Go or the Mozilla Foundation’s Rust.)
See, massive parallelism is where the future of computing lies. If you haven’t noticed, desktop CPUs aren’t improving by leaps and bounds anymore like they used to. Clock speeds have been stuck in the same few-gigahertz range for years, and speeds and capacities in general are showing a much flatter growth curve than they did five years ago. You can’t keep making the same old CPUs faster and smaller… you run into hard physical limits, like heat dissipation and the atomic scale of transistors.
And this means that if we want tomorrow’s computers to be capable of feats qualitatively beyond what today’s can do — stuff like understanding natural language, or running a realistic VR simulation, or making robots capable of general-purpose labor — the only way to get there is through massive parallelism. I think that in a decade or two, we’ll mainly compare computer performance specs not with gigahertz or teraflops, but with kilocores or megacores. That is, by the degree of parallelism.
One problem is that 95% of programming is still done in a serial, single-tasking form. Most programmers have little idea of how to really organize computing tasks in parallel rather than in series. There’s very little teaching and training directed toward unlearning that traditional approach, which is soon going to be far too limiting. Promulgating a new language built around the idea — especially one that makes it as simple and easy as possible — strikes me as a very positive and helpful step to take. I’m really disappointed that Apple has chosen to dump on that helpful effort by trying to steal its name.
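To make the series-versus-parallel distinction concrete, here’s a deliberately tiny sketch — in Python, not in either Swift — of the same job organized both ways. The point is the change in thinking: the serial version imposes an order of evaluation the problem never required, while the parallel version states only that each unit of work is independent and lets a pool of workers schedule them. The primality check and the worker count are illustrative choices, not anything from the languages discussed above.

```python
# A toy illustration of organizing work in parallel rather than in series.
# (Python stand-in; not Apple's Swift nor the parallel-scripting Swift.)
from concurrent.futures import ThreadPoolExecutor

def check(n):
    """One independent unit of work: is n prime? (naive on purpose)"""
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

numbers = range(2, 10_000)

# The serial habit: compute one result at a time, in a fixed order.
serial = [n for n in numbers if check(n)]

# The parallel organization: each check(n) depends on nothing else,
# so map() can distribute all of them across the pool's workers at once.
with ThreadPoolExecutor(max_workers=8) as pool:
    flags = pool.map(check, numbers)   # results come back in input order
parallel = [n for n, ok in zip(numbers, flags) if ok]

assert serial == parallel  # same answer; only the organization differs
```

The unlearning the paragraph above describes is exactly the move from the first form to the second: stop writing “do this, then this,” and start declaring which pieces of work are independent, so a runtime — whether a thread pool on one machine or a language like the parallel Swift across many — can spread them out.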