Computers should get BETTER over time, not worse

I was given this pretty surprising insight at a permacomputing meet-up in Utrecht:

Computers should get BETTER over time, not WORSE

Why? It's our everyday experience that computers, software and smartphones all tend to get worse over time: slower, more unreliable, strange behaviour. We have a habit of ascribing that to age, and of thinking that the obvious answer is to buy a new device. But the true reason is mostly that bloatware accumulates on the machine. Most commonly it's pieces of software sitting in memory (RAM), taking up space for no good reason, and often taking more and more of it over time as data and software bloat accumulate. Once the RAM is too full, almost every piece of software on your device starts to feel slow and unresponsive, because the underlying system has to keep shuffling data out to much slower storage to make room.

(I think there's another possible reason, which is that software security patches also accumulate. Sometimes these have to make dramatic negative interventions like turning off certain CPU optimisations, which definitely impacts our user experience - but this is one kind of "bloat" that I have to admit needs to be there.)
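To make the RAM point above a bit more concrete, here's a minimal sketch - my own illustration, assuming Python 3 with the third-party psutil library installed, not anything from the meet-up. It reports how full RAM and swap are, and lists the processes holding the most resident memory, typically the long-lived background software that quietly accumulates.

    # A minimal sketch, assuming Python 3 with the third-party psutil
    # library installed (pip install psutil) - an illustration on my part,
    # not a tool anyone showed me.
    import psutil

    def memory_report(top_n=5):
        ram = psutil.virtual_memory()
        swap = psutil.swap_memory()

        # When RAM is nearly full, the OS starts paging out to much slower
        # storage, which is roughly when "everything feels slow" sets in.
        print(f"RAM:  {ram.percent:.0f}% used "
              f"({ram.used / 2**30:.1f} of {ram.total / 2**30:.1f} GiB)")
        print(f"Swap: {swap.percent:.0f}% used "
              f"({swap.used / 2**30:.1f} of {swap.total / 2**30:.1f} GiB)")

        # List the processes holding the most resident memory (RSS) -
        # often background software you forgot was running.
        procs = []
        for p in psutil.process_iter(["name", "memory_info"]):
            mem = p.info.get("memory_info")
            if mem is not None:
                procs.append((mem.rss, p.info.get("name") or "?"))
        for rss, name in sorted(procs, reverse=True)[:top_n]:
            print(f"{rss / 2**20:8.0f} MiB  {name}")

    if __name__ == "__main__":
        memory_report()

Running something like this on a machine that has been in use for a while, versus one freshly set up, is a fairly direct way to see the kind of accumulation I mean.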

But what's weird is:

Old computers, from the 1980s and 1990s, keep getting better and better. The hardware doesn't change at all. But keen programmers work out how to achieve more and more within the limits of the computing platform. The demoscene (recently recognised by UNESCO!) is one sub-culture where there's a lot of focus on this, where some of the fun comes from working out how to achieve impressive graphics/sound effects in very limited systems.

Now, what a strange contrast, eh? Why would a computer from the 1980s get better and better, while a computer from 2020, such as a smartphone, gets worse and worse?

There are two notable things that old computers don't really do: multi-tasking, and the internet. Multi-tasking brings a lot of complexity. On old machines, you often have the whole computer to use for one program, and you can drive the hardware directly to its limits, without having to allow for the fact that your software is probably only one of ten or more running programs. And when your device is not internet-connected, you can do a lot of coding without worrying about hardening its security.

So, the comparison is not entirely fair. Nevertheless, it's a very strong motivating example: we know that it's possible to have a world where we don't just throw away expensive silicon devices and buy bigger ones. If we focus on the devices we have, and on making room for the software to be refined and - even - simplified, we can get close to a much more sustainable computing world. I'd like to apply that in my own work. It's hard to work out how, but I'd like to.
