Supersonic Man

June 19, 2014

the Swift programming language(s)

Filed under: Hobbyism and Nerdry,thoughtful handwaving — Supersonic Man @ 9:52 pm

So Apple is regretting the corner they painted themselves into by having their core development language be Objective-C.  This language is a horrid mashup made half of Smalltalk and half of traditional unreconstructed C.  Compared to C++, the modern half is more modern, but the primitive half is more primitive.  Steve Jobs used it for NeXT during his time away from Apple, and brought it back with him.  But what looked cool and exciting in 1986 is looking awfully outdated today.

The trend in the industry is clearly moving away from these half-and-half languages, toward stuff that doesn’t inherit primitive baggage from the previous century.  Microsoft has had great success by stripping all the old C-isms out of C++ to make C#, and Java — the oldest and crudest of this new generation of programming languages — may still be the world’s most widely used language, even though most people probably now see it as something that’s had its day and is not the place to invest future effort.

Now Apple has announced a nu-school language of their own, to replace Objectionable-C.  They’re calling it Swift.  It’s even more hep and now and with-it than C#.

There’s just one problem: there’s already another computer language using the name.  It’s a scripting language for parallel computing.  Its purpose is to make it easy to spread work over many computers at once.  And this, to me, is far more interesting than Apple’s new me-too language.  (Or any of the other new contenders coming up, like Google’s Go or the Mozilla foundation’s Rust.)

See, massive parallelism is where the future of computing lies.  If you haven’t noticed, desktop CPUs aren’t improving by leaps and bounds anymore like they used to.  Speeds and capacities are showing a much flatter growth curve than they did five years ago.  You can’t keep making the same old CPUs faster and smaller… you run into physical limits.

And this means that if we want tomorrow’s computers to be capable of feats qualitatively beyond what today’s can do — stuff like understanding natural language, or running a realistic VR simulation, or making robots capable of general-purpose labor — the only way to get there is through massive parallelism.  I think that in a decade or two, we’ll mainly compare computer performance specs not with gigahertz or teraflops, but with kilocores or megacores.  That is, by the degree of parallelism.

One problem is that 95% of programming is still done in a single-tasking form.  Most programmers have little idea of how to really organize computing tasks in parallel rather than in series.
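
As a minimal sketch of that serial-versus-parallel contrast (in Python, purely my own illustration — it has nothing to do with either language named Swift), the shift is from running independent tasks one after another to handing them to a pool of workers:

```python
from concurrent.futures import ThreadPoolExecutor

def work(n):
    # stand-in for an independent task that shares no state with the others
    return sum(i * i for i in range(n))

def run_serial(inputs):
    # the traditional single-tasking habit: one task after another
    return [work(n) for n in inputs]

def run_parallel(inputs):
    # the parallel habit: describe the tasks, let a pool of workers run them
    # (threads here to keep the sketch simple; CPU-bound work would use
    # processes, and a cluster language spreads it over many machines)
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(work, inputs))

print(run_serial([100, 200]) == run_parallel([100, 200]))  # prints True
```

The point of the sketch is that `run_parallel` says nothing about order: the tasks are organized as independent units, which is exactly the habit of mind the traditional serial style never teaches.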

There’s very little teaching and training being directed toward unlearning that traditional approach, which soon is going to be far too limiting.  Promulgating a new language built around the idea — especially one that makes it as simple and easy as possible — strikes me as a very positive and helpful step to take.  I’m really disappointed that Apple has chosen to dump on that helpful effort by trying to steal its name.

May 13, 2014

cosmic inflation

Filed under: Hobbyism and Nerdry,thoughtful handwaving — Supersonic Man @ 10:24 am

The cosmological inflation theory always sounded weird to me.  I’ve been reading a bit about it, trying to get my head around what they’re claiming.  And I’m unconvinced that it’s a valid theory, even though it’s currently winning at prediction.

The classical big bang theory certainly has problems that need addressing.  Now, there’s no doubt that there was a bang which was big.  Everything we see in the universe is flying apart from everything else, and behind everything we can see is a thermal glow which, if you account for redshift, appears to come from hot gas just at the point where it cools enough to be transparent. (This is around 3000 Kelvin, about as hot as a halogen lamp filament).  So there’s no way around the conclusion that, thirteen point something billion years ago, the entire observable universe was packed into a much smaller volume which was dense and hot, and that it exploded out at terrific speed.  That much is clear.

The problems arise when you try to extrapolate what came before the ball of dense hot gas, which clearly was already expanding.  The math says that it must have all expanded from a volume that was much smaller and hotter still.  In fact, the classical mathematical solution to the big bang insists that the initial explosion must have taken place in a region that wasn’t just tiny, or even infinitesimal, but in no volume at all: an absolute single point where density and temperature equalled infinity.

This answer is nonsensical.  Scientists are now rightly rejecting it, as many of them also reject the notion that the center of a black hole must be a singularity of zero size.  Clearly both are an oversimplification.

The trouble is, if you postulate anything else, the consequences run up against observed data.  The key point of observation is that the universe, as far back as it’s possible to see, has an essentially uniform density and temperature in all directions.  It appears very much as if the volume of hot gas which existed at the earliest moment we can see was in a state of thermal equilibrium with itself, as if all energy differences had been allowed to settle down and blend themselves together at a smooth common level of heat.

But such a blending could not have happened.  There was no time for it to happen in.  And worse, the uniformity includes regions which could not have interacted with each other, because light is only now managing to cross from one to the other, or hasn’t even done so yet.  If we look at opposite sides of our sky, we see as far as light has travelled either way in all of history, which means the total distance from one side to the other has only been crossed halfway.  Since no influence can move faster than light, the two opposite sides cannot possibly have exchanged any energy with each other.  It’s true that they may once have only been a foot apart, but they moved away from each other so close to lightspeed that the light from one side still hasn’t reached the debris of the other.  This means they cannot have exchanged any form of energy.
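
In standard textbook notation (my gloss, not the post's), this is the horizon problem: the farthest distance any signal can have travelled by time t, in a universe with scale factor a(t), is the particle horizon

```latex
% particle horizon: maximum distance light can have crossed since t = 0
d_{\mathrm{hor}}(t) \;=\; a(t) \int_{0}^{t} \frac{c\,dt'}{a(t')}
```

At the time the background glow was emitted, that horizon corresponds to only a degree or two on today's sky — yet the glow is uniform to roughly one part in a hundred thousand across all 360 degrees, between patches that by this formula could never have exchanged a signal.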

The smallest random quantum fluctuations back then should loom large today as differences in the cosmic backround temperature, and in the density of galaxies.  The differences are too small to account for without some means of smoothing them away.  It can’t have been ordinary thermal equilibrium.  What could it have been?
The smallest random quantum fluctuations back then should loom large today as differences in the cosmic background temperature, and in the density of galaxies.  The observed differences are far too small to account for without some means of smoothing them away.  It can’t have been ordinary thermal equilibrium.  What could it have been?

Similar questions apply on more esoteric levels too, like why the curvature of space appears to be so near to the ideal value that makes it absolutely flat.  It’s not that it’s close today that’s the problem, since our measurement of it is not all that precise… it’s that any small departure would increase over time, so if it’s close now, then in the distant past it must have been ultra-close with improbable precision.
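
The flatness worry has a compact standard form (again my summary, not the post's).  Rearranging the Friedmann equation, the departure from critical density is

```latex
% departure from critical density, from the Friedmann equation
\Omega(t) - 1 \;=\; \frac{k c^{2}}{a^{2} H^{2}}
% In decelerating expansion aH shrinks, so |Omega - 1| grows with time:
% roughly as t in the radiation era, and as t^{2/3} in the matter era.
```

Since any deviation grows, an Ω anywhere near 1 today demands that it was fantastically close to 1 — fine-tuned to many decimal places — in the earliest moments.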

The inflation theory is an attempt to answer these questions.  It postulates that some unknown repulsive force caused space itself, and the matter in it, to undergo some kind of self-generating expansion which kept the universe hot and dense while it grew at an immense rate.  Then, at some point, it ran out of gas and the universe started coasting outward in a conventional way, as we now observe.

The math they postulate for this process would have the effect of ironing out the irregularities that preceded it, making everything steadily smoother and flatter as long as it continued.  And it allows for a time before the inflation started, in which parts of the universe that are now inseparably distant could have achieved energy equilibrium.  It would only have required a brief instant of delay between the initial creation, when all the matter may have been trapped by gravity in a very small initial volume, and the commencement (by entirely hypothetical means) of inflation.
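
That "ironing out" has a simple form in the standard treatment (my gloss, not the post's): during inflation the expansion rate H is nearly constant, so

```latex
% exponential expansion during inflation
a(t) \;\propto\; e^{H t}
% which drives the flatness term toward zero exponentially fast:
\left|\Omega - 1\right| \;\propto\; \frac{1}{a^{2} H^{2}} \;\propto\; e^{-2 H t}
```

A commonly quoted figure is that around sixty doublings-of-e ("e-folds") suffice to flatten and smooth any plausible starting irregularities.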

It might even mean that the initial appearance of the pre-inflated universe could be possible as a quantum fluctuation in vacuum, because the amount of energy that needs to appear from nothing might be comparatively tiny.  In these versions, the universe’s gravitational field constitutes a store of negative energy, and the total net mass of the cosmos is zero.

But if you run it backwards to see what initial conditions could have worked out that way, many say that it falls victim to the same problems as a conventional big bang.  Whatever state preceded it has to have been already uniform to an utterly unnatural degree.  Some say that it actually makes the problem worse.

But aside from that, the whole theory just seems like the ultimate in ad-hockery, a contrivance of arbitrary rules and conditions based on imaginary physics, tuned to “predict” observed results. And it makes some claims that are hard to credit, like that the inflationary expansion was in some sense faster than light. Apparently this is a mathematically allowed solution to the equations of general relativity.

We already knew that we can’t account for the ultimate question of why there is something and not nothing.  Even a religious hypothesis doesn’t have an answer for that.  But even leaving that aside, it’s looking like we’ve really got no answers as to what physical conditions must have existed in the early part of the big bang — at a point where matter and energy had fully come into being and started obeying the physical laws of our cosmos, but before what can be observed.

I’d say both the conventional big bang theory and the inflation theory must be missing something essential.  There’s something big going on there which we’re totally failing to see yet.  Probably something that will look blatant and obvious in hindsight someday. Maybe brane theory, or something equally far out, can provide a missing piece if it develops enough.

The inflation theory may be half true. I’m sure some parts are valid. Maybe even most parts. The part where energy and gravity cancel each other out and can therefore be created together in arbitrarily large quantities, for instance, sounds pretty attractive. But I think we’re probably still missing some key piece of context in which these parts can make a good overall theory.

It’s clear that the inflation hypothesis in its current form is incomplete… I wouldn’t be surprised if whatever comes along to complete the picture ends up discarding a large part of it in the process, and the hypothetical inflation stage seems like a prime candidate for something that might turn out to not be needed anymore if we only knew what was missing.

September 21, 2013

Java is doomed?

Filed under: Hobbyism and Nerdry — Supersonic Man @ 9:25 am

I was talking earlier about Windows now having a somewhat bleak future despite still being firmly dominant today, and now I have to recognize something else that’s gotten itself into a similar position: the Java language. Over much of the last decade it’s probably been the most widely used programming language… though it’s hard to be sure, and it certainly was never in any position of majority dominance.  But now nobody sees any kind of growth in its future, and other languages like C# are making it look outdated.  Combine that with the well-publicized security troubles which, among other things, nailed shut the coffin for applets in the browser (the one place where the average computer user came into direct contact with the Java brand), and nobody’s seeing it as the right horse to bet on anymore.

Which is a shame, because it’s still one of the most widely supported and most available languages, and it’s probably still the best teaching language in the C-derived family.  It’s going to have to be fairly widely used in schools, even if it drops slowly out of use in industry.  There isn’t a suitable replacement for that role yet, as far as I can see.

Even as it gets into a state where people scoff at it for real work, it might still be unavoidable for a long time as something you have to know.

. . . . .

Another sad observation of decline: I think MS Office is now better at supporting Open Office than Open Office is at supporting MS Office.

September 10, 2013

strict doctypes and old markup

Filed under: Hobbyism and Nerdry — Supersonic Man @ 9:49 am

I thought strict doctypes, like XHTML Strict, were just for eliminating all the deprecated HTML tags that were used for stuff that now uses CSS, such as <font> and <center>.  But there are a couple of gotchas with it.  For instance, strict [X]HTML does not allow you to put a target= attribute on a link.  Apparently this is considered a matter of presentation and styling, though only cutting-edge implementations of CSS support setting it in a stylesheet.  But the one that really makes me scratch my head is that <blockquote> is only allowed to contain block-level elements.  What?  The obvious semantics of a block quote are that it should contain text.  But no, now it’s only supposed to contain paragraphs and divs, not act as a paragraph in itself.
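
For the record (my example, not from the post), the blockquote rule looks like this in practice:

```html
<!-- NOT valid under XHTML 1.0 Strict: bare text directly inside blockquote -->
<blockquote>You cannot put the quoted text here directly.</blockquote>

<!-- Valid: the text must be wrapped in a block-level element first -->
<blockquote><p>The quoted text has to go inside a paragraph.</p></blockquote>
```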

(I’m posting this partly just as a sort of note to myself.)

I do try to use modern standards, but my website has content dating back as far as 1996, so no way am I going to clean out all the old <font> tags.

Maybe I should at least validate, since the content there is all fairly new, and generated from a single master page that I can easily modernize.

[update] I did: that section is now fully XHTML Strict compliant, though the site still has tons of content that’s stuck at a Netscape 4 level of markup, using no CSS at all.  The front landing page is the only part that uses any modern browser technology, and even that dates mainly from about 2005.

[update 2] I made a spreadsheet of all the HTML pages on the site, assessing their state of modernity in terms of styling.  The current status is:

  • root level: almost everything is archaic except the index page and the one page that draws the most search traffic.
  • the old film-era photo gallery folder (which, frankly, has been an embarrassment for some time, and really needs updating, or even just some severe culling) is also completely archaic.
  • the Enron & Friends material is 90% bad, with a light sprinkling of modern style tweaks, but the current events movie reviews in the same folder are 90% good.
  • the B movie folder is good, and the boids folder, plus bits in the Amiga folder and the Reagan folder.
  • two of the biggest folders are good, but they’re both unfinished projects which are not yet exposed to the public.

The question is, which of these archaic areas is even worth updating?  The answer would be, almost none.  They’re all dated, essentially of historical interest only, except for the gallery, where markup is the least of its problems.


April 3, 2013

is Microsoft now the underdog? (WordPress is annoying)

Filed under: Hobbyism and Nerdry,Rantation and Politicizing — Supersonic Man @ 7:15 pm

I think I’ve about had it with Wordpiss.  Their comment approval process is fine for rejecting dozens of spam comments, but it’s terrible for approving a valid comment where you have to actually READ it before you’re sure it’s good.  The only way to read the whole comment to the end, as far as I could see, was to edit it!  I could not find any option for viewing the comment as it would appear if approved.  And then, when I try to follow any links to the post it’s a comment to, they’re links for editing it, not reading it.  This is stupid.

I have a sneaking feeling that Blogger is much easier to work with.  But I don’t want to move yet more of my life on to Google’s servers.  I think they’ve now officially crossed the line into being the new Microsoft — the big dominant choice that anyone who doesn’t like monopolies ought to look for alternatives to.  Since Windows 8 came out, Microsoft might actually now qualify as an underdog.  If not now, then they will soon.

IBM has been an underdog for a while now.  If they achieve the ability to answer natural-language questions before Google does, as they well might, I’ll be rooting for them, even though they were once the bad guy.  But I won’t go so far as to root for Microsoft… the memories of their ways when they were on top are a bit too fresh.

As for blogging platforms… what I really miss is Livejournal.  Why are today’s social networking sites so good for connecting people but so terrible for longer-form writing?  LJ was the one and only time that I saw thoughtful blogging combined with strong social networking in a way where both were able to work to their fullest.

February 21, 2013

pictures on top

Filed under: Hobbyism and Nerdry — Supersonic Man @ 5:01 pm

It was bugging me that the text along the right hand side of this blog would be rendered on top of my lovely pictures. So I experimented, and it turns out that WordPress is perfectly happy to let you add this little adjustment to your <img> tags:
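
A plausible reconstruction — my assumption, since the original snippet is not in this copy of the post — is that the adjustment raises the image's stacking order so text can no longer render over it:

```html
<!-- hypothetical reconstruction: lift the image above overlapping text -->
<img src="photo.jpg" alt="" style="position: relative; z-index: 10;" />
```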


January 23, 2013

Artificial Intelligence — what’s coming

Filed under: Hobbyism and Nerdry,thoughtful handwaving — Supersonic Man @ 8:08 pm

The term “Artificial Intelligence” means a computer or robot programmed to be smart like a person.  It’s a pipe dream so far, but a lot of people think it makes sense that it can happen eventually, and the idea is a staple of science fiction, in which it’s often taken for granted that a hundred years from now, our machines will be as smart as a lot of us are, and might even be considered citizens with the same rights as people.

Is this notion realistic?  Is it possible?  Is it likely?  If it happens, what form will it take?  I think I may be able to help clarify these questions a bit.


December 2, 2012

the end of Windows hegemony?

Filed under: Hobbyism and Nerdry,Rantation and Politicizing — Supersonic Man @ 12:30 pm

Are we finally seeing the first signs of the end of Windows?  Can the vast decaying empire of the Windows desktop finally be about to fall? (more…)

February 4, 2012

instant volume control

Filed under: Hobbyism and Nerdry — Supersonic Man @ 9:38 pm

Let’s say you’re at work and listening to MP3s in your headphones.  And someone comes up and needs to talk.  You have to stop the music… and in the past I’ve found I’m always fumbling to open up the player GUI and hit Pause, and it takes several seconds, during which you’re not looking very professional.

And if you use the taskbar icon to adjust the volume, it makes a loud blatt in your ear as soon as you let go of it.  That’s not much good either.

I decided I wanted a quick keyboard-shortcut way to go play/pause, and to increase or decrease the volume.  Only problem was, at work I’m not allowed to install any outside software.  So it had to be done with nothing but a script.  Turns out the Windows Script Host can emulate the special media-player keys on a multimedia keyboard, like so: (more…)

January 3, 2012

European vs Indian state names

Filed under: Hobbyism and Nerdry — Supersonic Man @ 11:40 am

Some states have names of purely European derivation, such as New Hampshire or Georgia.  Others have names of native origin, such as Massachusetts or Hawaii.  Which category has more states in it?  Turns out, this question is not all that easy to answer.

First, let’s list the states whose names have definite unambiguous European origins.

New Hampshire
New Jersey
New York
North Carolina
Rhode Island
South Carolina
West Virginia

That’s 21 names. Now, the ones with names of definite native origin:

North Dakota
South Dakota

That’s 24 names.  This list is ahead… but it doesn’t have a clear majority.  To settle the question, we have to look at the five remaining states.

Indiana
New Mexico
Arizona
Oregon
Idaho

Turns out, all of these five are debatable.  What about the name Indiana?  It’s from a term used in European languages, but the term refers to the native people.  How do you count it?  That’s a philosophical question.

What about New Mexico?  The name “México” is of native origin, but the state is named after a country with a European-derived language and culture.  Do you count it as native?

Arizona.  The origin of the name is said to be a Spanish corruption of an Aztec word.  Should you count that as native?  But others say it’s a Spanish corruption of an O’odham name, still others say it comes from Basque, and finally, it might just be short for “árida zona”, meaning dry zone, though you’d expect the adjective in that phrase to be placed after the noun.  So the fact is, no one actually knows whether the name is native or not.

The case of Oregon is even worse.  The name came into use long before there was a United States of America, among people who knew almost nothing about the area, and nobody knows where it came from at all.  There are various theories but they’re basically all guessing and hoping.

Idaho may be the one case where a land speculator just went and made a name up.  He at first claimed it was a Shoshone name, then that he had just invented it to sound Indianish, but then later someone argued that he got it from the Comanche term for “enemy”, because that’s how they saw the people who lived in that direction.  Again, no one actually knows.

So the odds are that there are probably more state names of native origin than of European or colonial origin, since if you count only two of these five as native that gives them the majority… but we can’t say for certain.

What we probably can do is link the cases of Indiana and New Mexico, since whatever principle you use to decide one of them will tend to place the other on the opposite side.  If you count New Mexico as native then Indiana looks colonial.  So that would make the balance 25 to 22 for the native side, giving them at least a tie, and they have the majority if any one of the three unknowns is actually native.  But it still isn’t settled.
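
The arithmetic of that last paragraph, spelled out (the counts come from the post itself):

```python
european = 21   # state names of definite European origin
native = 24     # state names of definite native origin
debatable = 5   # Indiana, New Mexico, Arizona, Oregon, Idaho
assert european + native + debatable == 50

# Pair Indiana with New Mexico: whichever principle decides one of them
# tends to push the other to the opposite side, so award one to each column.
native += 1      # say New Mexico counts as native...
european += 1    # ...which makes Indiana look colonial
unknown = 3      # Arizona, Oregon, and Idaho stay undecided

assert (native, european) == (25, 22)
# Native names need 26 for a strict majority of 50, so if any one of the
# three unknowns turns out native, the native side wins outright.
assert native + 1 >= 26
```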
