Here’s an interesting talk:
As you can tell from the title, I disagree.
For some reason it blows people’s minds when they realize that most of the code on their PC is running in some embedded chip in a peripheral, just moving data around. They think it’s meaningful. It’s not.
These peripherals provide a stable interface to whatever OS you install. Windows, Linux, BSD, FreeRTOS… all of them are on the same footing and must use the same hardware interfaces. So why does it matter if, deep in some chip on your motherboard, there’s a CPU core running embedded Linux?
If you’re a hardcore Stallman-style libre-head, then yes, this means something to you. You should try to avoid using such systems, because those chips almost surely contain non-libre code. But for the average user, it’s irrelevant.
Part of the argument seems to be that computers are getting more complex, and that this poses an existential threat to Linux. If Linux doesn’t evolve, it’ll become irrelevant, as subsystems of your PC, running closed-source proprietary code, slowly gain ground.
But that’s nonsense. The PC “form factor” has won. All of the “complexity” we’ve seen recently is just a reification or exposure of existing internal systems. Once upon a time, you made OpenGL calls, and your Nvidia/ATI driver translated those to some internal format, and then ultimately sent triangles to the GPU. Now, we have Vulkan, which lets you get access to that intermediate format. But you don’t need to use it. You’re still free to use OpenGL (and you should).
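The layering point can be illustrated with a toy analogy. To be clear, nothing below is real driver code; the names and the "command list" format are made up for illustration. The idea is just that a high-level API is implemented on top of a lower-level format, and exposing that lower layer (Vulkan-style) adds an option without removing the old one (OpenGL-style):

```python
# Toy analogy of graphics API layering. Entirely hypothetical:
# the command-tuple format stands in for the driver's internal
# representation (the kind of thing Vulkan-style APIs expose),
# and the high-level draw call stands in for OpenGL.

def submit_commands(commands):
    """Pretend 'hardware': just records what it was sent."""
    return list(commands)

def draw_triangle_high_level(vertices):
    """OpenGL-style call: the 'driver' builds the low-level
    command list for you, then submits it."""
    commands = [("upload", list(vertices)), ("draw", len(vertices))]
    return submit_commands(commands)

triangle = [(0, 0), (1, 0), (0, 1)]

# Vulkan-style usage: build the same command list explicitly.
explicit = submit_commands([("upload", list(triangle)),
                            ("draw", 3)])

# OpenGL-style usage still works, and lands in the same place.
convenient = draw_triangle_high_level(triangle)

assert explicit == convenient
```

The exposed lower layer and the older convenience layer coexist; neither obsoletes the other, which is the point being made about Vulkan and OpenGL.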
One of the strengths of the PC ecosystem is the standardization. Modern PC hardware, which I define as UEFI/ACPI/xHCI/PCIe, is universal. You won’t find modern x86 hardware that deviates from it. If Linux tries to get too fancy and starts tinkering with the lower-level details of the system, that just creates fragmentation. We should avoid concerning ourselves with those low-level details (except perhaps in cases where security improvements can be added without breaking compatibility).