The history and use of /etc/glob in early Unixes

One of the innovations that the V7 Bourne shell introduced was built in shell wildcard globbing, which is to say expanding things like *, ?, and so on. Of course Unix had shell wildcards well before V7, but in V6 and earlier, the shell didn’t implement globbing itself; instead this was delegated to an external program, /etc/glob (this affects things like looking into the history of Unix shell wildcards, because you have to know to look at the glob source, not the shell).

↫ Chris Siebenmann
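For those curious how that delegation actually worked: when the V6 shell spotted a wildcard character in a command line, it didn’t expand anything itself – it ran /etc/glob with the whole command, and glob expanded the patterns and then executed the command with the expanded argument list. Here’s a minimal sketch of that flow in C – an illustration of the idea, not the actual V6 source:

```c
/* Sketch of the V6-era division of labour (not the actual V6 source):
 * if any argument contains a wildcard, hand the entire command to
 * /etc/glob, which expands the patterns and then execs the command. */
#include <string.h>
#include <unistd.h>

static int has_wildcard(char *argv[])
{
    for (int i = 0; argv[i] != NULL; i++)
        if (strpbrk(argv[i], "*?[") != NULL)
            return 1;
    return 0;
}

static void run_command(char *argv[])
{
    if (has_wildcard(argv)) {
        char *glob_argv[64] = { "/etc/glob" };
        int i;
        for (i = 0; argv[i] != NULL && i < 62; i++)
            glob_argv[i + 1] = argv[i];
        glob_argv[i + 1] = NULL;
        execv("/etc/glob", glob_argv); /* glob expands, then runs the command */
    } else {
        execvp(argv[0], argv);         /* no patterns, run it directly */
    }
}
```

The payoff of this design is that the shell itself stays tiny – pattern matching lives in a separate program that only gets loaded when a command actually uses a wildcard, which mattered on the memory-starved machines V6 ran on.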

I never knew expanding wildcards in UNIX shells was once done by a separate program, but if you stop and think about the original UNIX philosophy, it kind of makes sense. On a slightly related note, I’m currently very deep into setting up, playing with, and actively using HP-UX 11i v1 on the HP c8000 I was able to buy thanks to countless donations from you all, OSNews readers, and one of the things I want to get working is email in dtmail, the CDE email program. However, dtmail is old, and wants you to do email the UNIX way: instead of dtmail retrieving and sending email itself, it expects other programs to do those tasks for it.

In other words, to set up and use dtmail (instead of relying on a 2010 port of Thunderbird), I’ll have to learn how to set up things like sendmail, fetchmail, or alternatives to those tools. Those programs will in turn deposit the mail in a local mailbox for dtmail to work with. Configuring these tools could very well be above my pay grade, but I’ll do my best to try and get it working – I think it’s more authentic to use something like dtmail than a random Thunderbird port.
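For what it’s worth, the retrieval half of that setup is usually the simpler one. As a rough sketch of the shape of it – the host name, account, and delivery agent below are made-up placeholders, not a tested HP-UX 11i configuration:

```
# Hypothetical ~/.fetchmailrc -- host, account, and MDA path are
# placeholders, not a tested HP-UX 11i v1 setup.
poll mail.example.com protocol IMAP
    user "remoteuser" password "secret"
    ssl
    mda "/usr/bin/procmail -d %T"
```

fetchmail polls the remote mailbox and hands each message to the local delivery agent, which drops it into the local mailbox for dtmail to pick up; the sendmail side, as anyone who has ever opened sendmail.cf can attest, is where the real pay grade question comes in.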

In any event, this, too, feels very UNIX-y, much like delegating wildcard expansion to a separate program. What this also shows is that the “UNIX philosophy” was subject to erosion from the very beginning, and really isn’t a modern phenomenon like many people seem to imply. I doubt many of the people complaining about the demise of the UNIX philosophy today even knew wildcard expansion used to be done by a separate program.

Bringing SerenityOS to real hardware, one driver at a time

Many moons ago, around the time when Andreas formally resigned from being Serenity’s BDFL, I decided that I want to get involved in the project more seriously. Looking at it from a perspective of “what do I not like about this (codebase)”, the first thing that came to mind was that it runs HERE (points at QEMU) and not THERE (points at real hardware). Obvious oversight, let’s fix it.

↫ sdomi

There’s no way for me to summarise this cursed saga, so just follow the lovely link and read it. It’s a meandering story of complexity, but eventually, a corrupted graphical session appeared.

Now the real work starts.

Google launches Chromium development fund to ward off antitrust concerns

Don’t you just love it when companies get together under the thin guise of open source to promote their own interests?

Today Google is pleased to announce our partnership with The Linux Foundation and the launch of the Supporters of Chromium-based Browsers. The goal of this initiative is to foster a sustainable environment of open-source contributions towards the health of the Chromium ecosystem and financially support a community of developers who want to contribute to the project, encouraging widespread support and continued technological progress for Chromium embedders.

The Supporters of Chromium-based Browsers fund will be managed by the Linux Foundation, following their long established practices for open governance, prioritizing transparency, inclusivity, and community-driven development. We’re thrilled to have Meta, Microsoft, and Opera on-board as the initial members to pledge their support.

↫ Shruthi Sreekanta on the Chromium blog

First, there’s absolutely no way around the fact that this entire effort is designed to counter some of the antitrust actions against Google, including a possible forced divestment of Chrome. By setting up an additional fund atop the Chromium organisation, placed under the management of the Linux Foundation, Google creates the veneer of more independence for Chromium than there really is. In reality, however, Chromium is very much a Google-led project, with 94% of code contributions coming from Google, and with the Linux Foundation being very much a corporate affair, of which Google itself is a member, one has to wonder just how much it means that the Linux Foundation is managing this new fund.

Second, the initial members of this fund don’t exactly instill confidence in the fund’s morals and values. We’ve got Google, the largest online advertising company in the world. Then there’s Facebook, another major online advertising company, followed by Microsoft, which, among other business ventures, is also a major online advertising company. Lastly, we have Opera, an NFT and crypto scammer making money through predatory loans in poor countries. It’s a veritable who’s who of some of the companies you least want near anything related to your browsing experience.

I highly doubt a transparent effort like this is going to persuade any judge or antitrust regulator to back down. It’s clear this fund is entirely self-serving and designed almost exclusively for optics, with an obvious bias towards online advertising companies who want to make the internet worse rather than towards companies and people trying to make the internet better.

VLC gets caught in “AI” hype, adds “AI” subtitles and translations

VLC media player, the popular open-source software developed by nonprofit VideoLAN, has topped 6 billion downloads worldwide and teased an AI-powered subtitle system.

The new feature automatically generates real-time subtitles — which can then also be translated in many languages — for any video using open-source AI models that run locally on users’ devices, eliminating the need for internet connectivity or cloud services, VideoLAN demoed at CES.

↫ Manish Singh at TechCrunch

VLC is choosing to throw users who rely on subtitles for accessibility or translation reasons under the bus. Using speech-to-text and even “AI” as a starting point for a proper accessibility expert or translator is fine, and can greatly reduce the workload. However, as anyone who works with STT and “AI” translation software knows, their output is highly variable and wildly unreliable, especially once English isn’t involved. Dumping the raw output of these tools onto people who rely on closed captions and subtitles to even be able to view videos is not only lazy, it’s deeply irresponsible and demonstrates a complete lack of respect and understanding.

I was a translator for almost 15 years, with two university degrees on the subject to show for it. This is obviously a subject close to my heart, and the complete and utter lack of respect and understanding from Silicon Valley and the wider technology world for proper localisation and translation has been a thorn in my side for decades. We all know about bad translations, but it goes much deeper than that – with Silicon Valley’s utter disregard for multilingual people drawing most of my ire. Despite about 60 million people in the US alone using both English and Spanish daily, software still almost universally assumes you speak only one language at all times, often forcing fresh installs for something as simple as changing a single application’s language, or not even allowing autocorrect on a touch keyboard to work with multiple languages simultaneously.

I can’t even imagine how bad things are for people who, for instance, require closed-captions for accessibility reasons. Imagine just how bad the “AI”-translated Croatian closed-captions on an Italian video are going to be – that’s two levels of “AI” brainrot between the source and the ears of the Croatian user.

It seems subtitles and closed captions are going to be the next area where technology companies are going to slash costs, without realising – or, more likely, without giving a shit – that this will hurt users who require accessibility or translations more than anything. Seeing even an open source project like VLC jump onto this bandwagon is disheartening, but not entirely unexpected – the hype bubble is inescapable, and a lot more respected projects are going to throw their users under the bus before this bubble pops.

…wait a second. Why is VLC at CES in the first place?

Nvidia CEO says company has plans for desktop chip designed with MediaTek

On Monday at CES 2025, Nvidia unveiled a desktop computer called Project DIGITS. The machine uses Nvidia’s latest “Blackwell” AI chip and will cost $3,000. It contains a new central processor, or CPU, which Nvidia and MediaTek worked to create.

Responding to an analyst’s question during an investor presentation, Huang said Nvidia tapped MediaTek to co-design an energy-efficient CPU that could be sold more widely.

“Now they could provide that to us, and they could keep that for themselves and serve the market. And so it was a great win-win,” Huang said.

Previously, Reuters reported that Nvidia was working on a CPU for personal computers to challenge the consumer and business computer market dominance of Intel, Advanced Micro Devices and Qualcomm.

↫ Stephen Nellis at Reuters

I’ve long wondered why NVIDIA never entered the general purpose processor market in a more substantial way than it did a few years ago with the Tegra, especially now that ARM has cemented itself as an architecture choice for more than just mobile devices. Much like Intel, AMD, and now Qualcomm, NVIDIA could easily deliver the whole package to laptop, tablet, and desktop makers: processor, chipset, GPU, all of course glued together with special NVIDIA magic that other companies opting to use NVIDIA GPUs won’t get.

There’s a lot of money to be made there, and it’s the move that could help NVIDIA survive the inevitable crash of the “AI” wave it’s currently riding, which has pushed the company to become one of the most valuable companies in the world. I’m also sure OEMs would love nothing more than to have more than just Qualcomm to choose from for ARM laptops and desktops, if only to aid in bringing costs down through competition, and to potentially offer ARM devices with the same kind of powerful GPUs currently mostly reserved for x86 machines.

I’m personally always for more competition, but this time with the asterisk that NVIDIA really doesn’t need to get any bigger than it already is. The company has a long history of screwing over consumers, and I doubt that would change if they also conquered a chunky slice of the general purpose processor market.

Pairs not taken

So we all know about twisted-pair ethernet, huh? I get a little frustrated with a lot of histories of the topic, like the recent neil breen^w^wserial port video, because they often fail to address some obvious questions about the origin of twisted-pair network cabling. Well, I will fail to answer these as well, because the reality is that these answers have proven very difficult to track down.

↫ J. B. Crawford

The problem with nailing down an accurate history of the development of the various standards, ideas, concepts, and implementations of Ethernet and other, by now dead, network standards is their age, as well as the fact that their history is entangled with the even longer history of telephone wiring. The reasoning behind some of the choices made by engineers over the course of more than 100 years of telephone technology isn’t always clear, and can be very difficult to retrace.

Crawford dives into some seriously old and fun history here, trying to piece together the origins of twisted pair the best he can. It’s a great read, as all of his writings are.

An operating system in 1000 lines

Hey there! In this book, we’re going to build a small operating system from scratch, step by step.

You might get intimidated when you hear OS or kernel development, but the basic functions of an OS (especially the kernel) are surprisingly simple. Even Linux, which is often cited as a huge open-source software project, was only 8,413 lines in version 0.01. Today’s Linux kernel is overwhelmingly large, but it started with a tiny codebase, just like your hobby project.

We’ll implement basic context switching, paging, user mode, a command-line shell, a disk device driver, and file read/write operations in C. Sounds like a lot, however, it’s only 1,000 lines of code!

↫ Seiya Nuta

It’s exactly what it says on the tin.
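If you’re wondering what “basic context switching” looks like in practice, here’s a generic illustration using POSIX ucontext on a hosted system – this is not code from the book, which implements the equivalent by hand for 32-bit RISC-V:

```c
/* Generic context-switch illustration using POSIX ucontext (not from
 * the book): save one task's register state, restore another's, and
 * resume execution exactly where the other task left off. */
#include <stdio.h>
#include <ucontext.h>

static ucontext_t main_ctx, task_ctx;
static char task_stack[64 * 1024];

static void task(void)
{
    puts("task: running");
    swapcontext(&task_ctx, &main_ctx); /* yield back to main */
    puts("task: resumed");
}

int main(void)
{
    getcontext(&task_ctx);
    task_ctx.uc_stack.ss_sp = task_stack;
    task_ctx.uc_stack.ss_size = sizeof(task_stack);
    task_ctx.uc_link = &main_ctx;      /* where to go when task returns */
    makecontext(&task_ctx, task, 0);

    swapcontext(&main_ctx, &task_ctx); /* run task until it yields */
    puts("main: back in control");
    swapcontext(&main_ctx, &task_ctx); /* resume task until it returns */
    puts("main: done");
    return 0;
}
```

A kernel does the same dance with its own hand-saved register state instead of ucontext_t, which is exactly the kind of thing a thousand-line teaching kernel can demystify.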

HDMI 2.2 will require new “Ultra96” cables, whenever we have 8K TVs and content

We’ve all had a good seven years to figure out why our interconnected devices refused to work properly with the HDMI 2.1 specification. The HDMI Forum announced at CES today that it’s time to start considering new headaches: HDMI 2.2 will require new cables for full compatibility, but it keeps the same physical connectors. Tiny QR codes on the cables are suggested to help tell compliant cables apart.

The new specification is named HDMI 2.2, but compatible cables will carry an “Ultra96” marker to indicate that they can carry 96Gbps, double the 48Gbps of HDMI 2.1b. The Forum anticipates this will result in higher resolutions and refresh rates and a “next-gen HDMI Fixed Rate Link.” The Forum cited “AR/VR/MR, spatial reality, and light field displays” as benefiting from increased bandwidth, along with medical imaging and machine vision.

↫ Kevin Purdy at Ars Technica

I’m sure this will not pose any problems whatsoever, and that no shady no-name manufacturers will abuse this situation at all. DisplayPort is the better standard and connector anyway.

No, I will not be taking questions.

NESFab: a new programming language for creating NES games

NESFab is a new programming language for creating NES games. Designed with 8-bit limitations in mind, the language is more ergonomic to use than C, while also producing faster assembly code. It’s easy to get started with, and has a useful set of libraries for making your first — or hundredth — NES game.

↫ NESFab website

NESFab has some smart features developers of NES games will certainly appreciate, most notably automatic bank switching. Instead of you doing this manually, NESFab will automatically carve your code and data up into banks to be switched in and out of memory when needed. There’s also an optional map editor, which makes it very easy to create additional levels for your game. All in all, a very cool project I hadn’t heard of, which also claims to perform better than other compilers.
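To appreciate what’s being automated here: on a classic UxROM-style cartridge, manual bank switching means writing the bank number into a mapper register yourself, and making sure whatever you call next actually lives in the bank you just mapped in. A rough C-style sketch of the idea – simplified, since real hardware also forces you to work around bus conflicts:

```c
/* Simplified sketch of manual UxROM-style bank switching, the chore
 * NESFab automates. Writing a bank number anywhere in $8000-$FFFF
 * selects which 16 KiB PRG-ROM bank is visible at $8000; bus-conflict
 * workarounds are omitted for brevity. */
#define BANK_SELECT (*(volatile unsigned char *)0x8000U)

void call_in_bank(unsigned char bank, void (*fn)(void))
{
    BANK_SELECT = bank; /* map in the bank that contains fn */
    fn();               /* everything fn touches must live in that bank */
}
```

Doing this by hand for every level, song, and cutscene is exactly the bookkeeping NESFab’s compiler takes off your plate.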

If you’ve ever considered making an NES game, NESFab might be a tool to consider.

OpoLua: a compiled-OPL interpreter for iOS written in Lua

An OPO (compiled OPL) interpreter written in Lua and Swift, based on the Psion Series 5 era format (ie ER5, prior to the Quartz 6.x changes). It lets you run Psion 5 programs written in OPL on any iOS device, subject to the limitations described below.

↫ OpoLua GitHub page

If you’re pining for that Psion Series 5, but don’t want to deal with the hassle of owning and maintaining a real one – here’s a solution if you’re an iOS user. Incredibly neat, but with one limitation: only pure OPL programs work. Any program that also has native ARM code will not work.

Dell rebrands its entire product line: XPS, Inspiron, Latitude, etc. are going away

Dell has announced it’s rebranding literally its entire product line, so mainstays like XPS, Latitude, and Inspiron are going away. They’re replacing all of these old brands with Dell, Dell Pro, and Dell Pro Max, and within each of these, there will be three tiers: Base, Plus, and Premium. Of course, the reason is “AI”.

The AI PC market is quickly evolving. Silicon innovation is at its strongest and everyone from IT decision makers to professionals and everyday users are looking at on-device AI to help drive productivity and creativity. To make finding the right AI PC easy for customers, we’ve introduced three simple product categories to focus on core customer needs – Dell (designed for play, school and work), Dell Pro (designed for professional-grade productivity) and Dell Pro Max (designed for maximum performance). 

We’ve also made it easy to distinguish products within each of the new product categories. We have a consistent approach to tiering that lets customers pinpoint the exact device for their specific needs. Above and beyond the starting point (Base), there’s a Plus tier that offers the most scalable performance and a Premium tier that delivers the ultimate in mobility and design.

↫ Kevin Terwilliger on Dell’s blog

Setting aside the nonsensical reasoning behind the rebrand, I do actually kind of dig the simplicity here. This is a simple, straightforward set of brand names and tiers that pretty much anyone can understand. That being said, the issue with Dell in particular is that once you go to their website to actually buy one of their machines, the clarity abruptly ends and it gets confusing fast. I hope these new brand names and tiers will untangle some of that mess to make it easier to find what you need, but I’m skeptical.

My XPS 13 from 2017 is really starting to show its age, and considering how happy I’ve been with it over the years, its current Dell equivalent would be a top contender (assuming I had the finances for it). I wonder if the Linux support on current Dell laptops has improved since my XPS 13 was new?

Microsoft’s tone-deaf advice to Windows 10 users: just buy a new PC, you’re all rich, right?

Over 60% of Windows users are still using Windows 10, with only about 35% – and falling! – opting to use Windows 11. As we’ve talked about many times before, this is a major issue going into 2025, since Windows 10’s support will end in October of this year, meaning hundreds of millions of people all over the world will suddenly be running an operating system that will no longer receive security updates. Most of those people don’t want to, or cannot, upgrade to Windows 11, meaning Microsoft is leaving 60% of its Windows customer base out to dry.

I’m sure this will go down just fine with regulators and governments the world over.

Microsoft has tried everything, and it’s clear desperation is setting in, because the company just declared 2025 “The year of the Windows 11 PC refresh”, stating that Windows 11 is the best way to get all the “AI” stuff people are clearly clamoring for.

All of the innovation arriving on new Windows 11 PCs is coming at an important time. We recently confirmed that after providing 10 years of updates and support, Windows 10 will reach the end of its lifecycle on Oct. 14, 2025. After this date, Windows 10 PCs will no longer receive security or feature updates, and our focus is on helping customers stay protected by moving to modern new PCs running Windows 11. Whether the current PC needs a refresh, or it has security vulnerabilities that require the latest hardware-backed protection, now is the time to move forward with a new Windows 11 PC.

↫ Some overpaid executive at Microsoft

What makes this so incredibly aggravating and deeply tone-deaf is that for most of the people affected by this, “upgrading” to Windows 11 simply isn’t a realistic option. Their current PC is most likely performing and working just fine, but the steep and strict hardware requirements prohibit them from installing Windows 11. Buying an entirely new PC is often not only not needed from a performance perspective, but for many, many people also simply unaffordable. In case you haven’t noticed, it’s not exactly going great, financially, for a lot of people out there, and even in the US alone, 70-80% of people live paycheck-to-paycheck, and they’re certainly not going to be able to just “move forward with a new Windows 11 PC” for nebulous and often regressive “benefits” like “AI”.

The fact that Microsoft seems to think all of those hundreds of millions of people not only want to buy a new PC to get “AI” features, but that they also can afford it like it’s no big deal, shows some real lack of connective tissue between the halls of Microsoft’s headquarters and the wider world. Microsoft’s utter lack of a grasp on the financial realities of so many individuals and families today is shocking, at best, and downright offensive, at worst.

I guess if you live in a world where you can casually bribe a president-elect with one million dollars, buying a new computer feels like buying a bag of potatoes.

Why Half-Life 3 speculation is reaching a fever pitch again

The more than two decades since Half-Life 2‘s release have been filled with plenty of rumors and hints about Half-Life 3, ranging from the officialish to the thin to the downright misleading. As we head into 2025, though, we’re approaching something close to a critical mass of rumors and leaks suggesting that Half-Life 3 is really in the works this time, and could be officially announced in the coming months.

↫ Kyle Orland at Ars Technica

We should all be skeptical of anything related to Half-Life 3, but there’s no denying something’s buzzing. The one reason why I personally think a Half-Life 3 might be happening is the imminent launch of SteamOS for generic PCs, possibly accompanied by prebuilt SteamOS PCs and consoles and third-party Steam Decks. It makes perfect sense for Valve to have such a launch accompanied by the release of Half-Life 3, similar to how Half-Life 2 was accompanied by the launch of Steam.

We’ll have to wait and see. It will be hard to fulfill all the crazy expectations, though.

One dog v. the Windows 3.1 graphics stack

I’d like to write a full-fledged blog post about these adventures at some point, but for now I’m going to focus on one particular side quest: getting acceptable video output out of the 1000H when it’s running Windows 3.11 for Workgroups.

By default, Windows 3.x renders using the standard “lowest common denominator” of video: VGA 640×480 at 16 colours. Unfortunately this looks awful on the Eee PC’s beautiful 1024×600 screen, and it’s not even the same aspect ratio.

But how can we do better?

↫ Ash Wolf

If you ever wanted to know how display drivers work in Windows 3.x, here’s your chance. This definitely falls into the category of light reading for the weekend.

The Mac OS X dock turns 25

James Thomson, developer of, originally, DragThing and now PCalc, also happens to be the developer of the very first publicly shown version of the Mac OS dock. Now that it was shown to the world by Steve Jobs exactly 25 years ago, he reminisces about what it was like to create such an iconic piece of software history.

The new Finder (codename “Millennium”) was at this point being written on Mac OS 9, because Mac OS X wasn’t exactly firing on all cylinders quite yet. The filesystem wasn’t working well, which is not super helpful when you are trying to write a user interface on top of it. The Dock was part of the Finder then, and could lean on all the high level C++ interfaces for dealing with disks and files that the rest of the team was working on. So, I started on Mac OS 9, working away in Metrowerks Codewarrior. The Finder was a Carbon app, so we could actually make quite a bit of early progress on 9, before the OS was ready for us. I vividly remember the first time we got the code running on Mac OS X.

↫ James Thomson

I especially like the story about how Steve Jobs really demanded Thomson live in Cupertino in order to work on the dock, instead of remaining remote in Ireland. Thomson and his wife decided not to move to the United States, so he figured he’d lose his assignment, or maybe even his job altogether. Instead, his managers told him something along the lines of “don’t worry, we’ll just tell Steve you moved”. What followed were a lot of back-and-forth flights between Ireland and California, and Thomson’s colleagues telling Steve all sorts of lies and cover stories for whenever he was in Ireland and Steve noticed.

Absolutely wild.

The dock is one of those things from my years using Mac OS X – between roughly 2003 and 2009 or so – that has stuck around with me ever since. To this day, I have a dock at the bottom of my screen that looks and works eerily similar to the Mac OS X dock, and I doubt that’s going to change any time soon. It suits my way of using my computer incredibly well, and it’s the first thing I set up on any new installation I perform (I use Fedora KDE).

NVIDIA’s RTX 5090 will supposedly have a monstrous 575W TDP

The RTX 5090 and RTX 5080 are receiving their final updates. According to two highly reliable leakers, the RTX 5090 is officially a 575W TDP model, confirming that the new SKU requires significantly more power than its predecessor, the RTX 4090 with a TDP of 450W.

According to Kopite, there has also been an update to the RTX 5080 specifications. While the card was long rumored to have a 400W TDP, the final figure is now set at 360W. This change is likely because NVIDIA has confirmed the TDP, as opposed to earlier TGP figures that are higher and represent the maximum power limit required by NVIDIA’s specifications for board partners.

↫ WhyCry at VideoCardz.com

These kinds of batshit insane GPU power requirements are eventually going to run into the limits of the kind of airflow an ATX case can provide. We’re still putting the airflow stream of GPUs (bottom to top) perpendicular to the airflow through the case (front to back) like it’s 1987, and you’d think at least someone would be thinking about addressing this – especially when a GPU is casually dumping this much heat into the constrained space within a computer case.

I don’t want more glass and gamer lights. I want case makers to hire at least one proper fluid dynamics engineer.

Windows 2: Final Fantasy of operating systems

It is common knowledge that Final Fantasy could have been the last game in the series. It is far less known that Windows 2, released around the same time, could too have been the last. If anything, things were more certain: even Microsoft believed that Windows 2 would be the last.

The miracle of overwhelming commercial success brought incredible attention to Windows. The retro community and computer historians generally seem to be interested in the legendary origins of the system (how it all began) or in its turnabout Windows 3.0 release (what did they do right?).

This story instead will be about the underdog of Windows, version 2. To understand where it all went wrong, we must start looking at events that happened even before Microsoft was founded. By necessity, I will talk a lot about the origins of Windows, too. Instead of following interpersonal/corporate drama, I will try to focus on the technical aspects of Windows and its competitors, as well as the technological limitations of the computers around the time. Some details are so convoluted and obscure that even multiple Microsoft sources, including Raymond Chen, are wrong about essential technical details. It is going to be quite a journey, and it might seem a bit random, but I promise that eventually, it all will start making sense.

↫ Nina Kalinina

I’m not going to waste your precious time with my stupid babbling when you could instead spend it reading this amazingly detailed, lovingly crafted, beautifully illustrated, and deeply in-depth article by Nina Kalinina about the history, development, and importance of Windows 2. She’s delivered something special here, and it’s a joy to read and stare at the screenshots from beginning to end. Don’t forget to click on the little expander triangles for a ton of in-depth technical stuff and even more background information.

AROS centimeters closer to 64bit

We’ve just entered the new year, and that means we’re going to see some overviews about what the past year has brought. Today we’re looking at AROS, as AROS News – great name, very classy, you’ve got good taste, don’t change it – summarised AROS’ 2024, and it’s been a good year for the project. We don’t hear a lot about AROS-proper, as the various AROS distributions are a more optimal way of getting to know the operating system and the project’s communication hasn’t always been great, but that doesn’t mean they’ve been sitting still.

Perhaps the most surprising amount of progress in 2024 was made in the move from 32bit to 64bit AROS.

Deadwood also released a 64-bit version of the system (ABIv11) in a Linux hosted version (ABIv11 20241102-1) and AxRuntime version 41.12, which promises a complete switch to 64-bit in the near future. He has also developed a prototype emulator that will enable 64-bit AROS to run programs written for the 32-bit version of the system.

↫ Andrzej “retrofaza” Subocz at AROS News

This is great news for AROS, as being stuck in 32bit isn’t particularly future-proof. It might not pose many problems today, as older hardware remains available and 64bit x86 processors can handle running 32bit operating systems just fine, but you never know when that will change. In the same vein, Deadwood also released a 64bit version of Odyssey, the WebKit-based browser, which was updated this year from August 2015’s WebKit to February 2019’s WebKit. Sure, 2019 might still be a little outdated, but it does mean a ton of complex sites now work again on AROS, and that’s a hugely positive development.

Things like Python and GCC were also updated this year, and there was, as is fitting for an Amiga-inspired operating system, a lot of activity in the gaming world, including big updates to Doom 3 and ScummVM. This is just a selection of course, so be sure to read Subocz’s entire summary at AROS News.

The GPU, not the TPM, is the root of hardware DRM

Do you think streaming platforms and other entities that employ DRM schemes use the TPM in your computer to decrypt stuff? Well, the Free Software Foundation seems to think so, and adds Microsoft’s insistence on requiring a TPM for Windows 11 into the mix, but it turns out that’s simply not true.

I’m going to be honest here and say that I don’t know what Microsoft’s actual motivation for requiring a TPM in Windows 11 is. I’ve been talking about TPM stuff for a long time. My job involves writing a lot of TPM code. I think having a TPM enables a number of worthwhile security features. Given the choice, I’d certainly pick a computer with a TPM. But in terms of whether it’s of sufficient value to lock out Windows 11 on hardware with no TPM that would otherwise be able to run it? I’m not sure that’s a worthwhile tradeoff.

What I can say is that the FSF’s claim is just 100% wrong, and since this seems to be the sole basis of their overall claim about Microsoft’s strategy here, the argument is pretty significantly undermined. I’m not aware of any streaming media platforms making use of TPMs in any way whatsoever. There is hardware DRM that the media companies use to restrict users, but it’s not in the TPM – it’s in the GPU.

↫ Matthew Garrett

A TPM is simply not designed to handle decryption of media streams, and even if it were, it’s far, far too slow and underpowered to decode even a 1080p stream, let alone anything more demanding than that. In reality, DRM schemes like Google’s Widevine, Apple’s FairPlay, and Microsoft’s PlayReady offer different levels of functionality, both in software and in hardware. The hardware DRM stuff is all done by the GPU, and not by the TPM. By focusing so much on the TPM, Garrett argues, the FSF is failing to see how GPU makers have enabled a ton of hardware DRM without anyone noticing.

Personally, I totally understand why organisations like the Free Software Foundation are focusing on TPMs right now. They’re one of the main reasons why people can’t upgrade to Windows 11, it’s the thing people have heard about, and it’s the thing that’ll soon prevent them from getting security updates for their otherwise perfectly fine machines. I’m not sure the FSF has enough clout these days to make any meaningful media impact, especially in more general, non-tech media, but by choosing the TPM as their focus they’re definitely choosing a viable vector.

Of course, over here in the tech corner, we don’t like it when people are factually inaccurate or twisting and bending the truth, and I’m glad someone as knowledgeable as Garrett stepped up to set the record straight for us tech-focused people, while everyone else can continue to ignore this matter.