After three and a half years of running Debian Unstable as the primary OS on my personal laptop, shit finally hit the fan, as what should have been a routine
apt upgrade command managed to break things so badly that
systemd itself was segfaulting on startup and the system would no longer boot. I have since reinstalled Debian, and I figured I’d share my thoughts on the experience of intentionally running an unstable OS.
First of all, why was I running Sid (the permanent codename for Debian’s unstable distribution) on something as crucial as my laptop?
Well, the story starts in January of 2020, when I bought my Dell XPS 13 Developer Edition. The laptop came preinstalled with Ubuntu, which I hadn’t used for a while, and which I gamely gave a spin for old times’ sake. Turns out, I still really hate a lot of the choices Canonical makes for you out of the box, especially the window manager — for a long time I’ve strongly preferred simpler, customizable WMs and desktops on Linux. For a while I was using Xfce, and lately I’ve been fond of Cinnamon, so I tried to switch to that while removing things like snapd at the same time, and all of this ended up breaking an awful lot of things. Typical Ubuntu nonsense.
Well, it had been worth a shot, but with my dislike for Ubuntu confirmed I decided to wipe the thing and install Debian, my old standby distro. I’ve been a Debian fan for a long time, ever since I really started getting into Linux. I did have a long dalliance with Arch, which I initially fled because it adopted systemd while Debian was still on SysV-style init (ironic, because Debian was, naturally, just late to the party and adopted systemd shortly thereafter). But I’d been running Debian on my previous laptop, an aging secondhand ThinkPad, very successfully for a while, so it seemed natural to return to the old standard.
However, it turned out that at the time Debian Stable and even Debian Testing were still on an older kernel (v4.19) that didn’t fully support the USB-C/Thunderbolt ports that were all the laptop had, while Debian Unstable was already on kernel version 5.4, which did. But even that kernel didn’t yet support the Dell XPS’s firmware!
So ultimately what I had to do was compile my own 5.4 kernel with some patches helpfully provided by someone on the internet (the site sadly no longer exists, and I only have the patchfiles on my disk, so I have no way to credit and thank them for their efforts). This was the first time I’d actually compiled a custom Linux kernel for day-to-day use, which was exciting in its own way, and I was able to compile it on the laptop itself, which was even cooler.
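For the curious, building a patched kernel as installable Debian packages boils down to a few commands. This is a from-memory sketch, not my exact recipe — the patch filename is a placeholder, and the version numbers are only illustrative:

```
# Inside an unpacked kernel source tree (e.g. linux-5.4):
patch -p1 < ../xps-thunderbolt.patch   # hypothetical patch file name

# Start from the running kernel's config, accepting defaults for new options:
cp /boot/config-"$(uname -r)" .config
make olddefconfig

# Build .deb packages so the kernel installs cleanly through dpkg:
make -j"$(nproc)" bindeb-pkg
sudo dpkg -i ../linux-image-5.4.*.deb
```

The nice thing about `bindeb-pkg` over a bare `make install` is that the custom kernel is tracked by the package manager and can be removed like any other package.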
Naturally, only a few months later Unstable updated their kernel to a version that supported the XPS firmware out of the box, but it was still a fun experiment.
Anyway since I had started off on the journey I figured I’d keep on the path and see how things went.
High Risk, Low Reward
My overall impression was: things broke more often than I personally think was worth it.
This is, of course, what you sign up for. It’s in the name. The system cannot be guaranteed to be stable at any given time.
Still, it is annoying to expect that something will break every time an update is run. And since the Unstable packages change very frequently, the update cycle became a delicate balancing game. At first, because the upstream changes were so frequent, I was updating the system very often (about weekly) to ensure things ran smoothly, but this led to constant little issues as bugs were introduced and then fixed. Then, since things broke so often, once I had a stable system I reduced the update frequency to about every other month, which was its own sort of risk: major upstream changes could pile up and get pulled in all at once, almost guaranteeing that something would not work as expected after every update.
Eventually, this is almost certainly what broke the thing. Some minor incompatibility was introduced that got pulled in along with everything else. I’m sure it was fixed shortly after it broke, but by then it was too late for my poor machine.
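In hindsight, Debian ships a tool for exactly this tightrope walk: apt-listbugs, which checks the Debian bug tracker for release-critical bugs filed against packages before they’re upgraded. A minimal routine for an Unstable box might look something like this (the package name in the last query is just an example):

```
# apt-listbugs hooks into apt and warns about release-critical bugs
# filed against the packages about to be upgraded:
sudo apt install apt-listbugs

# From then on, a normal upgrade pauses to list known RC bugs and
# offers to pin the affected packages instead of upgrading them:
sudo apt update
sudo apt upgrade

# You can also query the bug tracker for a specific package by hand:
apt-listbugs list systemd
```

I can’t say whether it would have caught whatever finally broke my install, but it tilts the odds considerably when you’re pulling in months of changes at once.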
Even before that final update, this cycle quickly became an annoyance. Once the firmware issue was fixed in the kernel, I had no actual need to be on the Unstable distro, no requirement to have the absolute latest version of something I couldn’t get anywhere else. And even if I did need that, the tools are there to compile, build, and backport individual packages as necessary — the Debian package ecosystem is fairly robust. In fact, at one point I had to do exactly that (I can’t remember for what, and my apt sources are gone now): hold a package back to an older version due to some complex interaction. I was able to pull the correct dependencies, build the package locally, and pin the version so that the latest upstream updates weren’t installed over it. It required a decent amount of Linux know-how and some gentle Googling and Stack Overflowing, but it ended up being pretty simple.
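Since I no longer have the actual commands, here’s a reconstructed sketch of that build-and-pin dance. The package name and version are entirely hypothetical stand-ins, and fetching sources assumes you have `deb-src` lines in your apt sources:

```
# Hypothetical example -- "somepkg" stands in for whatever package
# actually needed holding back; I no longer remember which one it was.

# Fetch build dependencies and the source for the older version:
sudo apt-get build-dep somepkg
apt-get source somepkg=1.2.3-1
cd somepkg-1.2.3
dpkg-buildpackage -us -uc -b          # build unsigned binary packages
sudo dpkg -i ../somepkg_1.2.3-1_amd64.deb

# Then pin it so apt won't replace it on the next upgrade:
sudo tee /etc/apt/preferences.d/somepkg <<'EOF'
Package: somepkg
Pin: version 1.2.3-1
Pin-Priority: 1001
EOF
```

(If you don’t need a locally built version and just want to freeze whatever’s installed, `sudo apt-mark hold somepkg` is the lighter-weight option.)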
Still, the fact remains that this is kind of de rigueur for Debian Unstable — some things will break, some things have to be custom-built, some things are added that you don’t like or removed that you do. It’s a constantly changing rolling-release environment — you know, the exact same reason I’d stopped using Arch Linux.
Now, I want to make perfectly clear that the ultimate breakdown of my system is all my fault. I don’t blame Debian or the package maintainers for any of this. Most daily users of Debian Unstable will tell you that daily updates are the way to go, so that the number of changes per upgrade is minimal and the chance to recover the system from failure is much higher. But I wasn’t using my personal laptop every day, and when I did I mostly wanted to scroll social media, read articles, write code, or watch YouTube, not perform system maintenance.
Clearly, what I failed to recognize was that my needs for an operating system were very different from what Debian Unstable was offering. Ultimately, I didn’t need the most cutting-edge version of everything. I just needed a machine that could run the applications I use most often, and those — basically Firefox, VS Code, MuseScore, the Rust compiler, .NET Core, and Python — are all available in up-to-date versions for almost any modern Linux distro. In fact, many of them are either available as platform-independent binary downloads (MuseScore), provide their own Debian apt package sources (VS Code, .NET), or ship their own non-apt toolchain managers (rustup, pyenv) that let me install not just the latest stable version but also newer or older versions alongside it, switching according to my project needs.
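rustup is a good illustration of why the base distro’s package freshness stops mattering for these tools — it keeps whole toolchains side by side and switches between them per project, with apt never involved:

```
# Install two toolchains side by side:
rustup toolchain install stable
rustup toolchain install nightly

# Use nightly only inside the current project directory:
rustup override set nightly

# Everywhere else, stay on stable:
rustup default stable
```

pyenv works along the same lines for Python interpreters.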
The real sticking point when considering more stable Debian distributions is Firefox.
Now, Debian is somewhat notorious for only including the Extended Support Release (ESR) version of Firefox, which historically has lagged significantly behind the mainline release, forcing those of us who want to run a modern version of the browser to resort to alternative means. But I’ve gone through those motions before (the same is true of Unstable, in fact) so I was prepared to deal with that when the time came.
So after evaluating my actual needs in a laptop OS, I decided to swing almost all the way back the other way and install Debian Stable with Backports — the official channel for packages in Testing to make their way onto Stable once they’ve been tested reasonably thoroughly. I was ready to make some concessions and jump through some hoops to get it working but with the hope that once those hoops were jumped through I would have a rock-solid system that needed very little constant tweaking just to run.
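For reference, the Backports setup itself is a one-line apt sources entry (shown here for Debian 12 “bookworm”; the package in the last command is only an example):

```
# Enable the Backports channel:
echo 'deb http://deb.debian.org/debian bookworm-backports main' |
  sudo tee /etc/apt/sources.list.d/backports.list
sudo apt update

# Backports packages are never installed implicitly; you opt in per package:
sudo apt install -t bookworm-backports linux-image-amd64
```

That opt-in behavior is the appeal: the system stays plain Stable except for the handful of packages you explicitly ask to have newer.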
So with that, I booted a live USB (I actually initially attempted a few potential solutions to fix the Unstable OS in place but none of them panned out). As usual, the Debian installer is very intuitive and flexible, and since I keep my
/home directory on a separate partition there was no need to worry about losing files or configuration (though there’s a caveat I wasn’t aware of initially, more on that in a bit). Installation was largely a breeze and very quickly I had a bootable machine again.
So the first thing I noticed upon booting the new system was that none of my configuration had been preserved!
My home directory was still there, and so were all of my regular files, but my bash configuration, vim configuration, git config, helper scripts etc. were all missing, or so I thought at first.
What had actually happened was that the installer had moved all of the hidden files and directories in my base home directory into a backup directory. So everything was there and just needed to be moved back into the home directory. I’m a little unsure why this happened; I’ve never actually had to fully reinstall Linux over the same drive like this, so I don’t know if it’s a common thing or not. But it was surprising and very confusing (since all the non-hidden files were obviously still in place).
But once I had realized what was going on, moving things back restored much of the configuration I really needed (syncthing, ssh, git).
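The restore itself is scriptable. Here’s a minimal sketch — the backup directory name is whatever the installer chose on your system (mine is long gone, so treat the example path in the comment as hypothetical):

```shell
# restore_dotfiles: move hidden files/dirs from a backup directory back
# into a home directory. Paths are arguments for easy testing; on a real
# system you might call it as:
#   restore_dotfiles "$HOME/old-home-backup" "$HOME"   # backup name is a guess
restore_dotfiles() {
  backup="$1"; dest="$2"
  # The two glob patterns together match all dotfiles except . and ..
  for f in "$backup"/.[!.]* "$backup"/..?*; do
    [ -e "$f" ] || continue      # pattern matched nothing; skip the literal glob
    mv -n "$f" "$dest"/          # -n: never overwrite files already in place
  done
}
```

The `-n` flag matters: the fresh install may have written new default dotfiles, and silently clobbering your restored configs with those would recreate the original confusion in reverse.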
Once that was sorted, one of the first things I did was install VS Code and Discord. This went smoothly, as I’d expect, but then a strange issue appeared — both apps would start and then immediately hang their UI threads, forcing me to kill the process. I tried removing caches, configuration, what have you, but nothing helped.
One thing I noticed was that the apps having trouble were all Electron apps, which embed Chromium for rendering. On a hunch, I installed Vivaldi, a Chromium-based browser, and it also would not start up. So something was seriously broken with Chromium rendering. This was very disappointing to see on a supposedly stable OS freshly installed.
I then remembered something I had seen when first booting and then immediately forgot about — a notification that something was out of date and needed updates. I tried
apt upgrade but there were just a few package updates that didn’t solve anything. I poked around and eventually found a Gnome app that showed there were firmware updates for the Dell XPS laptop itself. This was interesting, since I hadn’t installed any Dell proprietary software (like the kind that comes baked into the Dev Edition Ubuntu distro), but since I hadn’t updated the firmware since I got the thing, I gave it a spin. The machine rebooted, flashed several firmware updates (whoops!), and booted back into Debian — and all the Chromium-based apps worked just fine!
So, once again, it was all my fault. My guess is that the integrated Intel graphics had a firmware bug, that the update fixed it, and that even slow-and-steady Debian Stable had moved far enough along that its packages assumed the fix was already in place.
It’s a little annoying, though, that the updates aren’t surfaced through apt, and I have to use this other Gnome update utility. I wonder if there’s some way to get these updates to appear elsewhere so I can fit them into my regular upgrade cadence.
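As far as I can tell, that Gnome utility is a frontend to fwupd, the firmware-update daemon that pulls Dell’s published XPS firmware from the Linux Vendor Firmware Service. The same updates can be driven from the terminal, which slots more naturally into a regular upgrade routine:

```
# fwupdmgr is the command-line frontend to the fwupd daemon:
sudo apt install fwupd

fwupdmgr refresh       # pull the latest update metadata from LVFS
fwupdmgr get-updates   # list devices with pending firmware updates
fwupdmgr update        # download and apply them (may trigger a reboot)
```

It still won’t show up in `apt upgrade` output, but at least it can live in the same shell session.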
Surprises Under the Hood
Once the Chromium/Electron issues were resolved things were very stable and I could get back to installing and configuring things to be just the way I wanted (I think I missed copying or restoring my Cinnamon configuration in particular somehow, but whatever).
As for the Firefox issue, I was very surprised to see that the latest ESR release is actually fairly current: version 102, vs. 115 for the desktop release. I’m going to have to keep an eye on how often it updates from here on out, but for now I might not jump through any hoops at all. I was able to sync my settings and bookmarks, and my extensions all work fine for the moment. Of course, given Debian’s history with Firefox ESR (they package releases much, much slower than Firefox puts them out), I might want to figure out a solution soon (hopefully one that doesn’t involve snap; I hate snap).
Another pleasant surprise was that, for the first time that I’ve seen in a new Debian install, the Backports release channel was enabled by default! So I was already running more up-to-date versions of packages than a stock-standard Debian Stable distro would be normally. I wonder if this is a convenience thing for desktop installations — I explicitly used the Debian + Cinnamon Live USB image. My guess is that for headless or desktop-agnostic installations they still leave Backports out of the apt lists (it certainly isn’t enabled on my Debian 11 server, though we’ll see what happens when I finally sit down and upgrade to 12).
The final pleasant surprise was the Linux kernel version. This whole saga started because Debian Stable in 2020 was still running the 4.19 series kernels. That makes some sense, as 4.19 was the previous LTS version of the kernel at the time, which suits the Debian philosophy of system stability and reliability over all. If I’d waited a year and a half, I could’ve gotten Debian Stable with the 5.10 kernel, the next LTS version Debian adopted. But at the time, I basically had only one choice if Debian was to be my distro.
But now, Debian Stable is running the 6.1 Linux kernel! I suppose if I’d been paying attention it wouldn’t be a surprise at all — the Debian stable releases historically leapfrog the LTS kernel versions rather than progressing to each next one. My predicament in early 2020 was really just a matter of bad timing — Debian Stable releases are every two years, on odd years, so I was caught between the old and the new.
Wow this one ran way longer than I thought. Guess I have more thoughts about Linux than I… thought…
Anyway, will this quest for more stability cause me to run into outdated packages and frustrating gaps in features? Maybe! But right now I’m content to give Debian Stable (or at least mostly-Stable, with Backports) a spin. Since a lot of things are more up-to-date than I’d expected, and since a lot of other things are easy to keep up-to-date without needing to jump through Debian’s hoops, this might actually be a good choice for a modern desktop system. Is this truly the year of the Linux desktop? Obviously not for everyone — but for me, it always will be.