Supersonic Man

April 23, 2022

Bitcoin is a Ponzi scheme

Cryptocurrencies were supposed to be a new tool of commerce for people who don’t trust governments, not an investment commodity or a get-rich-quick scheme. Yet that’s what they’ve turned into. And the further we get into it, the harder it is to see the whole idea as anything but a con.

(more…)

May 29, 2021

the end of Windows hegemony — update

Filed under: computing,Hobbyism and Nerdry,the future! — Supersonic Man @ 1:06 pm

More than eight years ago, I wrote a post here called “the end of Windows hegemony?”.  It was quite premature at the time, and for year after year nothing seemed to happen to make any of the predictions or possibilities mentioned in that post move any closer to reality.

But in the year of the pandemic, it’s finally starting to look like people are reconsidering their automatic default allegiance to Microsoft Windows.  At the time of that post, according to statcounter.com, the desktop market share of Windows was 84% in North America and 91% worldwide.  Now it’s down to 63% in North America and 75% worldwide.  The biggest gainer has been MacOS, though it looks like they may have started trending back down again in the middle of 2020, perhaps due to caution over the change of CPU architecture.  Back then they were at 15% and 8%, and at the peak they hit 28% and 18%.  The other main beneficiary has been ChromeOS, which has gone from essentially nothing to 6% in North America and 2% worldwide.

Perhaps as a response to this downward trend, Microsoft is now planning a fancy new update to the Windows look and feel… and unlike previous major updates, this one is pretty much mandatory.  They’d probably call it Windows 11 if they hadn’t committed to using the name Windows 10 until the end of time… and maybe they will anyway.  Time will tell whether there are good options available for those who decide they hate whatever new style they come up with.

In this, Windows is becoming like Android, though with less ability to choose different aesthetic styles of UI by picking a different hardware maker.  As with Android, those who make the effort to dig into alternatives will probably have pretty good options to change some things they don’t like, but most non-techy users will not benefit from this, and will take what they’re given.

Mostly what they want, from what I’ve seen, is lack of change.  They want the time and attention they’ve invested in learning software systems to not be lost.  Automatic and mandatory changes are likely to be met with resentment, if they require any relearning.  The time when they feel open to change is when they buy new hardware, which is why Android suffers less of this resentment.  It used to be that paying money for a new OS version would also open this window, but that’s not something that happens anymore.

Marketing-wise, Microsoft was never well served by trying to switch to an evergreen software model in which they pump out updates when they see fit rather than when the user wants them.  Their users, outside of corporate IT departments and technical professionals, are willing to take what they’re given, but want it to be stable and predictable once they’ve gotten used to it.

And I think that what Microsoft has failed to appreciate about its own position is how much their entire Windows business has depended on people’s willingness to take what they’re given.  Aside from gamers, almost nobody chooses Windows for themselves because they actively want it.  They take it because it’s what’s been given to them.  Because it’s the default — because it’s what you get automatically if you don’t make an active choice.  Because it’s what everybody has always gotten, and they don’t need to think about it.  I suspect that, like many others before them, Microsoft has mistaken a historical privilege for an earned reward.  They’re probably having thoughts like “They love what we’re doing, so let’s give them more of it.”  Decisions based on such thoughts will not mesh well with reality.

Soon, with Apple gaining by leaps and bounds, now having superior hardware thanks to in-house silicon with no x86 baggage, and with ChromeOS rapidly becoming more visible and viable, customers are going to have to start thinking about that choice again.  The time is near when the average computer shopper might no longer get Windows automatically, but will actually make a mindful decision about which OS they prefer.  And I don’t think very many are going to actively avow that they really like and prefer Windows.  After all, the first bar that any competing OS has to clear, in order to be commercially viable at all, is to do better than Windows.

December 6, 2020

the Amiga 1000 was better built than I thought

Filed under: computing,fun,Hobbyism and Nerdry,life — Supersonic Man @ 10:26 pm

An Amiga 1000 was the first computer I ever bought with my own money, and I still have it. And I always knew that in some ways it was well built, because that one I bought back in the eighties still runs, whereas the far more expensive and rugged and professional Amiga 3000 that I bought in the nineties died long ago. But now I’ve found that it’s even truer than I thought.

(more…)

March 27, 2019

what makes one programming language better than another?

Filed under: computing,Hobbyism and Nerdry — Supersonic Man @ 3:26 pm

Every programmer who knows more than one language has opinions about which languages are better than which other languages. I’m no different from anyone in that respect, but I realize now that I’ve never taken the time to clearly think through what criteria to use in such a comparison. Of course there are times when different language features and styles are suitable for different tasks, but some generalities are pretty universal, and I think different situations mostly just change the emphasis and priority between them.

What got me to take a closer look was hearing someone state the opinion that the measure of the better language is simply the one which forces you to write less repetitive boilerplate. That turns out to be a surprisingly valid and comprehensive metric, despite how plodding and un-abstract it sounds, and I had never thought in those specific terms before.
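
To make that concrete, here’s a tiny JavaScript sketch of my own (the data is invented purely for illustration): the same filter-and-transform task written first with hand-rolled loop scaffolding, and then with the scaffolding absorbed by the language’s standard library.

// The repetitive way: index bookkeeping, a temporary array, and a manual push.
var prices = [5, 12, 8, 30, 2];
var markedUp = [];
for (var i = 0; i < prices.length; i++) {
  if (prices[i] >= 8) {
    markedUp.push(prices[i] * 2);
  }
}

// The same intent once the language's library absorbs the boilerplate.
var markedUp2 = prices.filter(function (p) { return p >= 8; }).map(function (p) { return p * 2; });

console.log(markedUp, markedUp2);  // [ 24, 16, 60 ] both times

By that measure, a language that keeps forcing you to re-type the first form scores worse than one that hands you the second.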

So, what are some useful criteria for distinguishing a good language from a bad language? Here’s what comes to mind for me:

(more…)

May 5, 2017

makes it easy!

Filed under: computing,Hobbyism and Nerdry — Supersonic Man @ 3:08 pm

Whenever someone introduces me to a new software framework which is designed to make things easier, especially one to make visual layout easier, I usually end up wishing they’d left things difficult.  Because the thing about these frameworks is that they impose assumptions and expectations.  As long as you work within those assumptions and expectations, the framework saves a lot of labor.  But as soon as a requirement comes along which makes you step outside of those expectations, the framework stops working with you and starts fighting against you.  You end up expending as much work getting around the framework as on solving the problem.

This is especially relevant when the framework is for visual layout.  Because then, they only keep things easy when you adhere to certain limitations of visual styling, and the only people who understand those limitations are the developers.  Which means you’re fine as long as you’re willing to live with a programmer’s sense of visual style.  These frameworks seem terrific in demos, because the examples always take advantage of their strengths and avoid their weaknesses.  But as soon as you bring in a designer or marketer who understands design but doesn’t know the quirks of the framework, their ideas will immediately push you into fighting the built-in assumptions, and all the benefits of having a simplified labor-saving technology wave goodbye, going out for a beer while you’re stuck with a job which is now more difficult than it would have been with no help.

This has been true since the early days of graphical interfaces, from Visual Basic to Twitter Bootstrap.  The latter is my particular bête noire at the moment: we adopted it at my job, had to retrofit parts of our old design so they wouldn’t be broken by it, then started to develop new stuff which used it (with the retrofitting still in place), and of course were immediately hit with design change requests which don’t get along with it.  Even before those requests, we were already in a situation where our own CSS was in a fight with itself, half of it saying “don’t be Bootstrap” and the other half saying “you gotta be Bootstrap”.

In the nonvisual realm, it isn’t necessarily so bad.  Some frameworks actually do make things easier without making you fight them.  It helps when a framework is purely for code, designed by programmers for programmers, with no end users involved.  One good example is jQuery.  It makes many things easier and almost nothing harder.  (Though nowadays you might do better to use Cash, a lightweight jQuery subset, rather than jQuery itself.)
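
For a feel of what that ease looks like in practice, here’s a tiny sketch (the class names are invented for illustration) comparing plain DOM code with the jQuery equivalent:

// Plain DOM: find every element with class "notice" and highlight each one.
var notices = document.querySelectorAll(".notice");
for (var i = 0; i < notices.length; i++) {
  notices[i].classList.add("highlight");
}

// jQuery: the same intent, with the iteration folded into one chainable call.
$(".notice").addClass("highlight");

The library removes repetition without blocking anything the plain version could do, which is why it so rarely ends up fighting you.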

We’ve been using it at work but now the word is we’re going to switch to Angular.  We shall see how that turns out.

April 8, 2017

eight-bit nostalgia

Filed under: computing,fun,Hobbyism and Nerdry — Supersonic Man @ 1:03 pm

There’s a lot of nostalgia out there for the era of eight-bit computers — especially the home-oriented ones from the likes of Commodore and Sinclair and Atari.  And I get why: they were tremendously liberating and empowering to those who had never had access to computing before.  And the BASIC interpreters they all came with were likewise quite empowering to those who hadn’t previously realized that they could write their own programs.

But as someone who was already empowered, I couldn’t stand those crappy toy computers.  They’d run out of bits just when you were at the point where a program was starting to get interesting.  I never owned one.  I didn’t start wanting my own computer until the sixteen-bit era.  The first personal computer I actually wanted was the Apple Lisa, which of course was prohibitively expensive.  The first one I wanted enough to pay hard-earned money for, at a time when I didn’t have much, was the Amiga 1000.

(Last I checked, my Amiga 1000 still runs.  But one of these days the disk drives are going to fail, and any available replacements will be just as old and worn.  Turns out that what a lot of retrocomputing hobbyists do is to use hardware adapters to connect their old disk cables to modern flash-memory drives.  It may be kind of cheating but at least you won’t have range anxiety about how much you dare use it before it breaks.)

To me, the sixteen-bit era, and the 32-bit transition that followed it, was the most fun time, when the computers were capable enough to do plenty of cool stuff, but also still innovative and diverse enough to not be all boring and businesslike.

If I were of a mind to recapture any of that fun with modern hardware, it sure doesn’t cost money like it used to: I’d look at, for instance, getting a Pi 3 with Raspbian on it.  You could have a complete Linux system just by velcroing it to the back of a monitor or TV.  But there are even cheaper alternatives: there’s a quite good hacking environment available across all modern platforms, more empowering and ubiquitous than BASIC ever was… in your browser’s javascript.
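
If that comparison to BASIC sounds like a stretch, consider how closely the browser console’s immediacy mirrors a first BASIC session.  This little loop (purely illustrative) runs in any modern browser’s developer console:

// Open the developer console on any page, paste this, and press Enter.
for (var i = 1; i <= 10; i++) { console.log("HELLO, WORLD " + i); }

No install, no compiler, no permission needed — which is about as empowering as typing your first PRINT statement ever was.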

October 3, 2016

a tribute to the HTC One M7

Filed under: computing,Hobbyism and Nerdry,life — Supersonic Man @ 11:09 pm

My current phone, on which I am typing this post, is an HTC One — the iconic model known, but not advertised, as the M7.  It’s old and I’m now only days away from replacing it. The battery can barely hold a charge anymore, the main camera is busted, and the proximity sensor ain’t what it used to be. Besides that, of course the CPU isn’t much by today’s standards and 32 GB of storage is rather limiting with no SD slot… but if it weren’t for the wear-and-tear issues, I’d feel pretty darn okay with continuing to use this phone for quite a while longer. It’s an excellent phone, and I definitely wish there were more phones out there which embraced front stereo speakers.

The M7 was quite an important and influential model. Its design and build set a new standard for the kinds of materials and aesthetics that a high-end phone should aspire to. Samsung took a couple of years to catch up, and I’m not quite certain Apple ever did. It’s because of HTC’s chamfered aluminum back that nowadays every midrange Chinese wannabe model has a “premium” metallic build, and plastic became intolerable on a high-end model. And though the stereo speakers may not have been imitated nearly as often as they ought to have been, their presence did manage to embarrass all but the cheapo models into at least putting a speaker on the edge, like Apple, instead of on the back.

Even its camera, which was often regarded as the most disappointing piece of the phone, was influential. The “ultrapixel” approach forced makers and buyers to realize that pixel size matters as much as pixel count, and this is why today’s camera spec comparisons include that metric, along with numbers for megapixels and lens aperture. And yes, this was also among the first cameras to make an issue of its aperture, with f/2.0 when competitors were f/2.4 or slower. The “zoe” feature also helped popularize sharing brief video snippets as if they were still pictures.

Another imitated feature was the IR blaster, though that is now falling out of favor again. Don’t blame HTC for the trend to nonremovable batteries, though — that was well under way a year earlier.

Aside from innovative aspects, it was just a solidly good phone. Its software, for instance (initially a skin on Jellybean, eventually updated to Lollipop), was dramatically smoother and more pleasant than that of the competing Galaxy S4, which tended to be jerky even when fresh out of the box. It also had a stronger headphone amp than the Galaxy. Its audio features even included FM radio, while other phones were giving that up. The display was pretty good for a non-amoled, with nice color and 1080p resolution, which is actually better than 1440p for those who watch movies and TV on their phones. Also, the size of the display was about what I still consider ideal for a compromise between ergonomic convenience and viewing area. The whole industry has pursued the trend to phablet-sized enormity too far, in my opinion, and I’m glad to see a sign of reversal coming now, with Google’s new Pixel phones (made by HTC) each being a size smaller than their Nexus predecessors, and with no performance penalty for the smaller model relative to the larger.

What are the important and influential models in the history of Android phones? The HTC Dream, a.k.a. the T-Mobile G1, was the first Android phone. The Moto Droid was the first to popularize the platform with massive advertising, pointing out that there were areas where it could outdo iOS. The Galaxy Nexus showed off the alternative of a “pure Google” unlocked phone, and a high definition screen without a high price. The Galaxy Note put phablets on the map, and the Galaxy S III was, for many, the first phone to show that Android might actually be superior to iOS, depending on one’s personal priorities. The M7 was the first phone to outdo Apple at physical design and construction, and to demonstrate the importance of good speakers and fast lenses. And maybe we can make a spot for the S6 Edge for being the first to put curved glass to good use, eliminating the side bezel and taking another definite step beyond Apple in physical design. Historically, the M7 stands in distinguished company.

We shall see what becomes influential next — perhaps modularity, though judging by current sales, probably not.

The M7’s physical design is definitely iconic, and it’s unsurprising that HTC kept physical changes to a minimum for the M8 and M9, comparing them to a Porsche 911 which still looks like it did 40 years ago. Unfortunately they kept too much else the same, and lost popularity. To me it’s sad that HTC has regained customers by losing its definitive feature, the stereo speakers… though the HTC 10’s mix of front sound at one end and edge sound at the other is still influential, having been copied by Apple.

HTC as a company deserves respect. They’ve pushed innovation, they’ve challenged Apple and Samsung at times, and for a brief period they were actually the world’s #1 smartphone maker. Nowadays they’re innovating in virtual reality. And they’re a rarity among tech companies in that their most visible founder, Cher Wang, is a woman.

So as I say goodbye to my hard-working HTC One, it’s mostly just with regret that it’s getting physically worn out, not that it’s fallen too far behind. I will definitely keep it around — if my new phone ever has an issue and I need a backup, I know that the old phone will still be able to perform well, as long as I can keep juice in it.

August 6, 2016

pseudo-documentation

Filed under: computing,Hobbyism and Nerdry,Rantation and Politicizing — Supersonic Man @ 8:09 am

In my occupation as a coder, I have to read a lot of technical documentation in order to use existing software components.  And sometimes that documentation can be frustratingly incomplete or unavailable, but to me the worst situation to encounter is what I call pseudo-documentation.  It’s abundant out there.

I will give you a little example of what that’s like.  Let’s say you just encountered a line of code like this:

myThingy.FrabnicateZinxer(Zinxer.Load("arf"));

You have no idea what this does, so you look it up, and this is what you find:


Thingy.FrabnicateZinxer

Frabnicates a Zinxer for an instance of Thingy. If successful, the Zinxer will become frabnicated for this Thingy. If the Zinxer was already frabnicated for another Thingy, the new Thingy will be placed first in the frabnication order of the Zinxer. If it is already frabnicated for this Thingy, no change takes place.

Signature:
public void FrabnicateZinxer(Zinxer zinxerToFrabnicate);

Parameters:
zinxerToFrabnicate – the Zinxer which is to be frabnicated for this Thingy.

Return value:
none

Exceptions:
NullParameterException – a null value was passed as zinxerToFrabnicate.
InvalidOperationException – the Zinxer passed as zinxerToFrabnicate is in a nonfrabnicable state.

Example:
Thingy thingy = new Thingy();
Zinxer zinxer = Zinxer.Load("brb");
thingy.FrabnicateZinxer(zinxer);

See also:
Zinxer class
Thingy class


. . . You see what the problem is?  The documentation covers all aspects of what needs to be available in reference material, but you learn nothing by reading it.  It labels the parts but says nothing about what they actually do.  All it tells you is what you had already assumed just from seeing the name — that some unknown thing undergoes some unknown process.  The only new knowledge you come away with is maddening hints of ways it might go wrong, none of which have any explanatory context.

There are many outfits which produce crap like this, but Microsoft may be the worst.  Their tech writers don’t seem to have any supervision by anyone who checks the quality of the work.  Even when they’re writing at length in tutorial or instructional form, the result is often full of gaps and omissions where crucial pieces of context are missing, not to mention inconsistencies which undermine your chances of piecing together anything definite.

June 2, 2016

a small attempt to emulate the gadget press

Filed under: computing,Hobbyism and Nerdry — Supersonic Man @ 9:08 am

Nowadays the popular media report on the latest gadgets almost as eagerly as they report on celebrity gossip.  Since my smartphone is now three models out of date, I’ve been reading more than my share of this stuff.  And this is inspiring me to try adding a little noise of my own to that topic.  So:

Five Things Premium Phones Will Need in Order to Stay Premium

(more…)

March 28, 2016

some fatuous computer industry predictions

Filed under: computing,Hobbyism and Nerdry,the future! — Supersonic Man @ 11:15 am

I think I’ll call some trends in where the computer industry is headed over the coming years. And yes, these are pulled straight from my lower gastrointestinal tract.

  • Is Windows going to start dying off?  Yes, but it will be very slow.  Home use will disappear before office use.
  • What will replace it?  A windowed variant of Android, or something Android-compatible, which doesn’t even exist yet.
  • Will that be Google’s planned merger of Android with ChromeOS?  Maybe, but I think it may be more likely to come from an independent outfit.  And if it’s advertised as being half Android and half ChromeOS, it’ll really be 90% Android.
  • Will ARM architecture replace Intel x86 architecture?  Yes, but only temporarily.
  • Then what will win out in the long term?  Something designed for massive parallelism, like a GPU.  I predict that in The Future, when comparing the size and power of different computers, the main stat that will be quoted is the number of kilocores.
  • Will these cores be similar to full-blown processors such as an ARM core, or will they be more basic and stripped-down like a GPU core?  I think the trend may be from the former toward the latter — quantity over quality.
  • Will we still be using Android variants when things get into kilocore country?  Nah, something fundamentally more advanced will replace the whole current idea of desktop-like interfaces.
  • Will neural networks be important?  Maybe.  They’ll remain a specialized minority of architectures, but I think as the massively parallel architecture evolves toward having more cores and less capability in each core, it will converge toward neural-net architecture and then replace it.
  • What about software?  I think it will be stored in portable binary format and adapted to individual architectures with JIT compilation and/or automatic local optimizers.  The actual coding of highly parallel algorithms will rarely be done by hand, and will usually depend heavily on automated assistance.
  • What about quantum computing?  It’s impossible to tell how big an impact it will have.  It’s essentially a form of analog computing, and as such may be confined to niche specialties… but you never know: it could end up beating conventional computing at its own game and become much more general-purpose.  If this happens, the need for automated assistance in coding goes double.
  • Will we eventually use computers through direct brain interfaces?  Yes, but progress toward that will be frustratingly slow and gradual.
  • Will these new architectures lead to Artificial Intelligence?  Yes, though in a quite limited sense for the shorter term.  See this article for how I think that will go.
  • Does this mean that a computer will take your job?  It sure does, and it’s going to be a very difficult social challenge to adapt to.  See this further article.