The Observatory


An old-timey hand-drawn postcard showing the school of the year 2000, in which a professor grinds books down in a machine that distributes their contents into the ears of the students through headphones. Picture from En L'An 2000, illustrating the school of the future.

Our first child was born in early autumn 2022, almost perfectly timed with the latest energy-driven stagflation. Due to a combination of economic factors (in particular, the skyrocketing mortgage interest rates and cost of food) and the whole new-parent situation, we have very little money to spend, and even less time to spare. For at least the coming year or so, we will have to mostly make do with what we have.

This seems like a good time for some Nietzschean amor fati. To rise to meet the conditions, I will declare that I willed it so, that I made it so. This year, for my New Year's resolution, I resolve to Not Buy Anything.

Of Flying Cars and Moore's Law

A graph showing the declining number of iPhones I can buy with my wage

There is a second reason behind this decision. Like David Graeber in his essay Of Flying Cars and the Declining Rate of Profit, I have for a while felt like there is something wrong with technology.

To take an example, I am writing this on a late 2013 MacBook Pro. This machine is wonderful despite being ten years old, and still fast enough for me by a good margin. I'd consider replacing the battery with a new one (it's still running on the original). The only reason I would even consider upgrading it is that Apple has stopped rolling out OS updates for it, which also blocks me from developing apps for newer versions of iOS or macOS, including some necessary quality-of-life improvements to SwiftUI on macOS. I fail to see how this is not a blatant case of planned obsolescence, but for now, that is beside the point.

Before the new M1 Macs, every subsequent Mac model except the virtually identical 2015 one was worse. The keyboards were more error-prone, the ports disappeared for no good reason, and there were no marked performance improvements. In particular, I don't need more performance (though I sometimes do need programs and webpages to be less inefficient). The M1 Macs are also worse: they don't have the MagSafe connector, a lifesaver in a cramped household with two adults, one baby, and one dog, and they are unable to drive two external screens.

I grew up as a bottom-feeder on discarded late-1990s technology, when a two-year-old computer was junk. Using a ten-year-old machine would have been inconceivable. This was when clock speeds were cranked higher every year and when storage went from holding megabytes to gigabytes, then hundreds of gigabytes, and finally terabytes. It seems to me that around the time Moore's law stopped giving us faster computers and started giving us computers with more cores, other things started to change too.

The concept of a Pareto front might be useful to explain this. Whenever we optimise something as complex as a computer, there are trade-offs involved. Therefore, there isn't usually a single optimum but many Pareto optima: equally good configurations with different trade-offs, none of which is a total improvement over any other in every aspect. Your computer can, for example, be optimal with respect to performance, portability, or energy efficiency, depending on which variable you prioritise. For a given state of technology, this yields a frontier of equally optimal outcomes, the current Pareto frontier. The kicker is that as users of technology we may value some properties over others, which means that our Pareto frontier may not be the same as the industry's.
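For the programmatically inclined, here is a minimal sketch of the idea in Python. The laptops and their numbers are invented purely for illustration; the point is that a configuration sits on the Pareto front exactly when no other configuration beats it in every aspect at once.

```python
# Toy sketch of a Pareto front over made-up laptop configurations.
from dataclasses import dataclass

@dataclass
class Laptop:
    name: str
    performance: float    # higher is better
    battery_hours: float  # higher is better
    weight_kg: float      # lower is better

def dominates(a: Laptop, b: Laptop) -> bool:
    """True if a is at least as good as b in every aspect and strictly better in at least one."""
    at_least_as_good = (
        a.performance >= b.performance
        and a.battery_hours >= b.battery_hours
        and a.weight_kg <= b.weight_kg
    )
    strictly_better = (
        a.performance > b.performance
        or a.battery_hours > b.battery_hours
        or a.weight_kg < b.weight_kg
    )
    return at_least_as_good and strictly_better

def pareto_front(laptops: list[Laptop]) -> list[Laptop]:
    """Keep every laptop that no other laptop dominates."""
    return [l for l in laptops if not any(dominates(other, l) for other in laptops)]

machines = [
    Laptop("ultraportable", performance=5, battery_hours=14, weight_kg=1.0),
    Laptop("workstation",   performance=9, battery_hours=4,  weight_kg=2.8),
    Laptop("all-rounder",   performance=7, battery_hours=9,  weight_kg=1.6),
    Laptop("old budget",    performance=4, battery_hours=5,  weight_kg=2.5),  # dominated by the all-rounder
]

for m in pareto_front(machines):
    print(m.name)  # the first three survive; "old budget" is worse than the all-rounder in every aspect
```

The three survivors are all "optimal", just with different trade-offs; which one you should buy depends entirely on which column you care about.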

As technology improved, the trade-offs between generations of hardware became less and less severe, with each new generation being essentially a Pareto improvement over the previous one. Ultraportable laptops got acceptable screens, SSDs became large enough to replace most mechanical drives, and CPUs became efficient enough to give even the most brütal gaming laptop an acceptable battery life (as long as the GPU was off). The Pareto frontier at the current level of technology thus grew to encompass most needs for most people, which crucially meant that for most users each new model was an improvement in something they cared about, without any sacrifices, simply because everything got better.

At about the same time, we started hitting diminishing returns on most possible improvements. What the improvement budget now affords us is slightly different trade-offs, and those are not distributed equally: some fields show easier gains, which invites new trade-offs. This means that for most people, most upgrades are a lateral move at best, and often a partial downgrade. I lose the trip-free charging connector, my multiple screens, or the regular USB ports I still need, but I gain burst CPU speed I don't need, or battery life I only sometimes appreciate. To sell units, manufacturers must rely on increasingly unmotivated software-driven planned obsolescence, or on social manipulation to manufacture demand.

In other words, the downturn in productivity that Graeber observes holds for computers too. I was promised advanced cybernetic implants, quantum computers, and general AI within four years, and they can't even make the Google Glass we were promised in 2013 a reality ten years after its announcement. Even the simulation technology that Graeber argues has replaced real technological progress is a disappointment. The current advances in "AI" show no actual improvement in reasoning. Rather, large language models, the proverbial stochastic parrots, follow Graeber's thesis perfectly. They are not so much about intelligence as about a dual sleight of hand: hide the labour, as with the Mechanical Turk, and make the machine appear intelligent without actually being so.

The true tragedy of our time is not so much that technologists insist on building the dystopian nightmare of Ready Player One as the fact that they can't even get that right. Imagine a broken torment nexus, stomping on a human face forever.

At the same time, products get more expensive and/or more enshittified as companies rush to extract more profits in the new non-zero interest rate economy. All in all, it's a great time to lie flat and feed from the bottom.

#English #theory

Photo by Shu Qian on Unsplash

Most modern computers are, for now, time machines. Like arctic ice sheets or tree trunks, they are sediments upon sediments of historical baggage, layered on top of each other. You boot a computer into 1976, take a brief tour through the '80s, and land in the early '00s, where we remain to this day.

While this happens, a similar negotiation takes place on the display through a process called mode setting, which essentially moves the time machine into higher and higher screen resolutions, each with a corresponding flicker of the screen. Earlier versions of Linux-based systems were designed by well-trained software engineers who could only conceive of systems in terms of interchangeable lego-like bricks.

One of the consequences of the lego brick fetish was that, in addition to the flickering of the various boot programs (also the work of software engineers), there would be additional flickering as the lower-level bricks, which knew of the display hardware only as abstract things, handed off control over the screen to higher-level software that knew how to produce high-resolution visuals using modern graphics cards. The end result was a boot process that flickered in and out of successively less pixellated visuals until it finally arrived at a level where you could comfortably read the text.

These days, more display-related functionality has been moved into the lower lego bricks, and there is consequently less flickering.
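For the curious, here is a minimal sketch of how to peek at that negotiation from the kernel's side, assuming a Linux machine with kernel mode setting and the usual sysfs layout; it simply lists the display connectors the kernel knows about and the modes each can drive.

```python
# Minimal sketch: list display connectors and their modes as exposed by the
# kernel's mode setting layer. Assumes the usual sysfs layout under
# /sys/class/drm on a Linux machine with kernel mode setting (KMS).
from pathlib import Path

for connector in sorted(Path("/sys/class/drm").glob("card*-*")):
    status = (connector / "status").read_text().strip()   # "connected" or "disconnected"
    modes = (connector / "modes").read_text().split()     # e.g. ["2560x1440", "1920x1080", ...]
    print(f"{connector.name}: {status}, {len(modes)} modes")
    for mode in modes[:3]:
        print(f"  {mode}")
```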

My experiences working with these systems, and more importantly with their failures, were so formative that I still think in terms of them whenever I see a jagged edge of something that exposes a bit of the inner workings of a process that should have been hidden from me.

I find ruptures (and continuities) like the idiosyncrasies of the x86 architecture particularly interesting because they allow us to mark time. One of the things that have bothered me lately is how much slower time seems to move than in my youth. Nothing new seems to be invented, from franchises to technology, and no new political ideologies appear. And most alarming of all: I, now in my 30s, still feel like I understand youth culture, broadly speaking. I have clear memories of how clueless my parents were. For my entire life I have identified as old and grumpy, but my social age still has not managed to catch up with my biological age, it seems.

Then, some time ago, I was visiting a class as part of my training in academic teaching as a Ph.D. student. I remember walking around the classroom, a classic slanted lecture hall, observing the students. A few of them turned to speak to me, since it turned out I had briefly TA'd them in earlier courses, and I realised, as they turned away from their friends, that I could see the flicker. They were switching from whatever mode of communication they used natively to whatever legacy common layer they used with other people.

I looked around the room and realised there were several cultural cues I could not read. Someone dressed as what I would have described as a male jock, but with bright nail polish. Was that a weird student thing, or part of some other cultural layer? I couldn't tell. I saw stickers and had no idea what they meant. Proof of a vibrant culture I was not privy to.

And I thought: Thank god, at last.

#theory #English

In this post I discuss identity and uncertainty, coming out, and the strategic problems with working inside identity politics, through a personal experience and the recent debate about Rapid-Onset Gender Dysphoria. Content Warning: mention of transphobia, discrimination against LGBT people, and gender dysphoria.
