
ubuntu

Updating Kubernetes to the Community-Owned Package Repositories

An Unexpected Update

Most days I update my little server when I log into my PC, and today it gave me quite an unexpected surprise:

# apt update && apt full-upgrade -y
...
E: The repository 'https://apt.kubernetes.io kubernetes-xenial Release' no longer has a Release file.
N: Updating from such a repository can't be done securely, and is therefore disabled by default.
N: See apt-secure(8) manpage for repository creation and user configuration details.

This had already been promptly reported yesterday as Ubuntu kubernetes-xenial package repository issue #123673 and quickly triaged, pointing to the announcement from August 2023: pkgs.k8s.io: Introducing Kubernetes Community-Owned Package Repositories.
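The fix, following that announcement, is to drop the old apt.kubernetes.io entry and add the new community-owned repository in its place. A rough sketch of the steps, run as root (the v1.28 path is only an example, substitute the Kubernetes minor version you actually run, and the old entry may live in a differently named file on your system):

# rm /etc/apt/sources.list.d/kubernetes.list
# mkdir -p /etc/apt/keyrings
# curl -fsSL https://pkgs.k8s.io/core:/stable:/v1.28/deb/Release.key | gpg --dearmor -o /etc/apt/keyrings/kubernetes-apt-keyring.gpg   # v1.28 is an example
# echo 'deb [signed-by=/etc/apt/keyrings/kubernetes-apt-keyring.gpg] https://pkgs.k8s.io/core:/stable:/v1.28/deb/ /' > /etc/apt/sources.list.d/kubernetes.list
# apt update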

Calibrating screen color with DisplayCAL

Calibrating screen color is mostly optional these days, if you buy a good screen that comes out of the factory with a good calibration profile. Nonetheless, it is recommended to re-calibrate every year or so to account for display ageing, so a few years ago I purchased a Datacolor Spyder5 Express, which is known to work on Linux with the Argyll Color Management System.

What doesn't always work so well is DisplayCAL. In fact, the original project is dead and was dropped from Ubuntu 20.04, although it was still possible to build it there with Python 2.7 packages. That is no longer possible in Ubuntu 22.04, but there is a Python 3 fork: eoyilmaz/displaycal-py3
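A rough sketch of how the fork might be set up on Ubuntu 22.04, assuming the colorimeter tools come from the standard argyll package, that the fork installs with pip straight from its Git repository, and that it provides a displaycal launcher afterwards (treat these lines as a starting point, not a recipe):

# apt install argyll
$ python3 -m venv ~/.local/displaycal-venv
$ ~/.local/displaycal-venv/bin/pip install git+https://github.com/eoyilmaz/displaycal-py3.git
$ ~/.local/displaycal-venv/bin/displaycal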

Single-node Kubernetes cluster on Ubuntu Server (lexicon)

After playing around with a few Docker containers and Docker compose, I decided it was time to dive into Kubernetes. But I only have one server: lexicon.

For the most part I followed Computing for Geeks' article Install Kubernetes Cluster on Ubuntu 22.04 using kubeadm, while taking some bits from How to install Kubernetes on Ubuntu 22.04 Jammy Jellyfish Linux (from LinuxConfig) and How to Install Kubernetes Cluster on Ubuntu 22.04 (from LinuxTechi).
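The one twist on a single-node cluster is that the control-plane node must also run regular workloads, which means removing the taint kubeadm puts on it. Leaving aside the container runtime and networking prerequisites those guides walk through, the core of it looks roughly like this (the pod network CIDR and the choice of Flannel are illustrative, not necessarily what the guides prescribe):

# kubeadm init --pod-network-cidr=10.244.0.0/16
# export KUBECONFIG=/etc/kubernetes/admin.conf
# kubectl apply -f https://github.com/flannel-io/flannel/releases/latest/download/kube-flannel.yml
# kubectl taint nodes --all node-role.kubernetes.io/control-plane-   # allow pods on the only node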

Ubuntu Studio 22.04 on Computer, for a young artist

The young artist in the house, who will soon have an upgraded PC for the new year, will then be running Ubuntu Studio 22.04, made for creative people.

The current system is running Ubuntu Studio 20.04 on a very old system, based on an AMD Phenom II X4 and an Asus M4A89GTD Pro/USB3 from 2010. In preparation for the upcoming upgrade, the process starts by installing Ubuntu Studio 22.04 on an old SSD in my own PC (Rapture), which can later be transplanted to the new PC.

Low-effort homelab server with Ubuntu Server on Intel NUC

Need. More. Server. Need. More. POWER!!!

But only a little bit, maybe just enough to run a Minecraft server, which refuses to start on my Raspberry Pi 4 because it has only a meagre 2 GB of RAM.

I had known about Intel NUC tiny PCs for a while, and how handy they can be to have a dedicated physical PC for experimentation. There was a very real possibility that I would have to set one up as a light gaming PC in the near future, so I thought cutting my teeth on a simpler server setup would be a good way to get acquainted with this hardware platform and its Linux support.

Detailed system and process monitoring

I never got the hang of telegraf; it was all too easy to cook up my own monitoring...

Humble Beginnings

In fact, when I started building detailed process monitoring I knew nothing about telegraf, influxdb, grafana or even Raspberry Pi computers.

It was back in 2017, when pondering whether to build my next PC around an Intel Core i7-6950X or an AMD Ryzen 5 1600X, that I started looking into measuring CPU usage of a specific process. I wanted to better see and understand whether more (but slower) CPU cores would be a better investment than faster (but fewer) CPU cores.

At the time my PC had an AMD Phenom II X4 965 BE C3 with 4 cores at 3.4GHz, and I had no idea how often those CPU cores were all used to their full extent. To learn more about the possibilities (and limitations) of fully multi-threading CPU-bound applications, I started running top commands in a loop and dumping lines in .csv files to then plot charts in Google Sheets. This was very crude, but it did show the difference between rendering a video in Blender (not multi-threaded) compared to using the pulverize tool to fully multi-thread the same task:
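A minimal version of that loop might look something like the sketch below (the %CPU column position assumes top's default field layout, and the blender process name is just the example from these tests; pidstat from the sysstat package would be a tidier alternative):

# Append one line per sample: unix timestamp, %CPU of the watched process.
# Uses the second top iteration, since the first one is not measured over a short interval.
PID=$(pgrep -o blender)
while sleep 2; do
    CPU=$(top -b -n 2 -d 1 -p "$PID" | awk -v p="$PID" '$1 == p {v=$9} END {print v}')
    echo "$(date +%s),$CPU" >> blender_cpu.csv
done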

Chart of CPU usage over time showing a single Blender process never using much more than one CPU core

Chart of CPU usage over time showing how pulverize, running 4 Blender processes, uses most CPU cores most of the time

This early ad-hoc effort resulted in a few scripts to measure per-process CPU usage, overall CPU with thermals, and even GPU usage.
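Those scripts all boil down to the same pattern: sample a counter in a loop and append a CSV line. As a rough illustration of the overall-CPU-plus-thermals-plus-GPU variant (assuming an English-locale top summary, lm-sensors for thermals with an Intel-style Package id 0 label, where AMD's k10temp reports Tctl instead, and an NVIDIA card for the nvidia-smi query):

# One CSV line per sample: timestamp, overall CPU %, CPU temperature, GPU utilisation.
while sleep 5; do
    cpu=$(top -b -n 2 -d 1 | awk -F'[, ]+' '/^%Cpu/ {idle=$8} END {print 100 - idle}')
    temp=$(sensors | awk '/Package id 0:/ {gsub(/[^0-9.]/, "", $4); print $4; exit}')   # Intel coretemp label
    gpu=$(nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader,nounits)
    echo "$(date +%s),$cpu,$temp,$gpu" >> system_usage.csv
done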