Oohhh hahaha I get it
Nostalgia journey has made me find and download ThunderCats. I'mma watch all 4 seasons.
Pffffff, top-load Betamax with the wood paneling on the front was where it was at!
And my dad spending weekend after weekend renting a VHS machine and some tapes to copy over because they stopped producing Betamax tapes, and being too cheap to get a new machine.
That started out as a terrible tin foil hat joke, but I like the version of shaping my fingers to change the channel a lot more!
The first one I used that had that feature was new back in 2007. But microwaves last forever so I was probably late to the game.
With the wired remote!
I'm also getting old… And somehow Windows 11 is significantly worse than Windows 95… Let's forget about the tragedy of Windows 8.
I view Windows 8 as the beginning of everything Microsoft wanted Windows to be and the end of everything users wanted it to be.
Windows 7 was a definite milestone
You can tell the best versions by how heavily Microsoft bitches about users hanging on to them for dear life years and years after they were replaced.
98 to a certain extent.
XP and 7 definitely.
Don't get me wrong, they all had flaws, and time has marched on, but those were the ones whose replacements definitely weren't upgrades.
Thank you, and please do. I hope that an awareness of the need for obedient tech will catch on.
I think of it in terms of the "Cyborg Philosophy" versus the "Android Philosophy." The former wishes for technology to add functionality to the human, upgrading us to better carry out our wills. The latter wishes to hand human activity over to an external agent, outsourcing our workload, our responsibility, and in the process, our learning, mastery, and control.
For me it began when the rich high school I went to decided to spend their budget on "smart boards," glorified projector screens with stylus support. For the absurd amount they spent on them, neighbouring schools could have gotten the basic equipment they needed. Our science departments, seemingly in defiance of this nonsense, continued to use the old overhead projectors with the transparent film you could write on, or they'd use cheap 3D mice to control their projected PC screens, achieving the exact same technical advantages much more efficiently.
Then the parents of the rich kids bought them all iPhones when they came out, and I'd be sitting on my laptop typing with all my fingers on a full-sized, full-powered machine while these kids were walking around trying to compute with just their thumbs on a tiny screen. Each time a new version came out, the rich kids would "accidentally" drop their iPhone in the toilet and their parents would buy them new ones. By this time, I'd already experienced the MyVu ocular display, and I was sure this "smart phone" fad would pass quickly and we'd mount proper monitors to our eyes, proper keyboards to our belts, and proper PCs to our backs. I waited, and waited, but late-stage laissez-faire consumerism doesn't deliver. Google bought up MyVu and stuffed "smart" into it, ruining it. I finally realised I have to figure out how to do it myself.
Is this why my new car lags before responding to the accelerator? That's terrifying. Wonder if the brakes do that too?
I have to wait several minutes after turning on my car for the computer to boot up before I can control the thermostat on the touchscreen, during which time it blasts undesired hot or cold air, because the thing has no actual buttons. Unlike my old cars, where I just slide the potentiometer to where I want it (by touch, which is important when you need your eyes for the road) and it reads that position the next time the car starts.
And it has these smart lights for the cargo area that seem to intelligently stay on just long enough for me to step away from the light switch before plunging me into darkness. Thanks. Not that the light switches are much help, 'cause they're smart too. One in the front knows to ignore me when I tell it to turn off, one in the back knows to ignore me when I tell it to turn on, and both of them know to ignore me after the first instruction I give. Also, there's no fuse box to kill them with, because that's managed in the onboard computer. I intend to cut the wires and replace them with a simple light on a circuit connected to a stand-alone battery, so I can simply open and close the circuit to control it. No "smart" crap.
So that's why we call them "clickers!"
There's an old hotel
down in Hell
where the TV works but the clicker's broken.
– A lyric that's been stuck in my head for years.
[Edit: found the actual lyric.]
So hopefully the signal won't spend a significant amount of time in unnecessary and unreliable processing in any of these systems, just a quick and direct electronic control circuit that's only disrupted by other systems when needed. Even so, I've disabled or minimised as many of the driver-override features as I can and just let it notify me, so that I can remain in control of the giant metal battering ram of death. There are a lot of things I would hope wouldn't have excessive smart bloat in them, but they do.
With the amount of people deploying a full Raspberry Pi to blink an LED… no way it's anything less complicated. Remember the Toyota accelerator scandal? Even though in the end I think it was determined not to be the fault of bad code in the ECU of certain Toyota cars, it did reveal that many car manufacturers have no idea how their ECUs actually work… a ton of black-box and spaghetti code means safety reviews are being skirted, because nobody is really evaluating all the code in its entirety.
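To the Pi point, for scale: the entire blink "application" really is this small. A minimal sketch using the real gpiozero library (the pin number is arbitrary), with a whole multi-core Linux system idling underneath to do the job of a 555 timer:

```python
# Blink an LED on a Raspberry Pi with gpiozero (canonical usage).
from gpiozero import LED
from signal import pause

led = LED(17)                     # BCM GPIO 17; any free pin works
led.blink(on_time=1, off_time=1)  # toggles forever on a background thread

pause()  # keep the process alive; a full OS sits under these few lines
```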
It's an interesting phenomenon that, as a species, we have learned to create technology that exceeds our capacity to comprehend it.
That pretty much happened immediately… master artisans making things that the general population had no idea about… arrow fletchers… fly fishing lures… aqueducts… concrete… glass making… blacksmithing… simple technologies that made amazing things possible.
That's been very painful to watch…
Don't forget about old, architecture-dependent code being used in embedded devices. There are a bunch of products that rely on old code that only runs on certain families of 8-bit MCUs, and everybody is afraid to do anything beyond simple fixes to that codebase… And then they add a more powerful CPU and a touchscreen, and WiFi, etc… all interfaced to the old micro through a proprietary bit-banged protocol, because no one dared touch the old code base…
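For anyone who hasn't seen one of these bridges: the bit-banging itself is conceptually tiny. A hypothetical Python sketch of clocking one byte out over two GPIO lines, MSB-first; the pin-setter callables and the timing here are made up for illustration, since the scary part in real products is the undocumented framing that lives only in the old code base:

```python
import time

def bitbang_send_byte(byte, set_clk, set_dat, half_period_s=0.0005):
    """Shift one byte out MSB-first over a made-up clock+data link."""
    for bit in range(7, -1, -1):
        set_dat((byte >> bit) & 1)  # present the data bit
        set_clk(1)                  # rising edge: receiver samples here
        time.sleep(half_period_s)
        set_clk(0)                  # falling edge: prepare for next bit
        time.sleep(half_period_s)

# dummy pin drivers so this runs anywhere (prints instead of toggling pins)
if __name__ == "__main__":
    bitbang_send_byte(0xA5,
                      lambda v: print("CLK", v),
                      lambda v: print("DAT", v))
```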
When that doesn't happen, every single product in the lineup gets a different firmware with code from different vendors… Networking gear is a great example of this. Every single AP model has a different firmware, and bugs go unpatched because this creates too many projects to maintain. And that's why I love Mikrotik and their unified RouterOS that runs on everything they make: it actually gets updated.
Good software architecture and a willingness to ditch old code and start fresh go a long way. Yes, it's painful in the short term, but managing several software projects and adapting new tech to work with old code out of fear of touching said old code will only increase the R&D costs…
FFS, there are HP calculators that run an emulator of one of their old CPUs and still use the same firmware they did back in the 80s. I bought an HP Prime, and it's limited to 500 digits because of a memory limitation of a CPU they used back in the 80s. That thing has a beefy ARM CPU and comes with Python, but they couldn't maintain their original math library…
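For contrast, the bundled Python has arbitrary-precision integers out of the box; this standard-library snippet sails straight past any 500-digit ceiling:

```python
import math

n = math.factorial(500)  # exact arbitrary-precision integer
print(len(str(n)))       # 1135 digits, no fixed cap
```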
I fear that this is only going to get worse as companies offshore their engineering departments to certain parts of Asia and Latin America where the culture is strongly biased against rocking the boat. Collectivist and conformist cultures make innovation difficult… But please, before you lynch me for saying that, let me tell you that I think people who leave such places are often more individually minded. I also grew up in Colombia and hated the crab-bucket culture there, so I'm speaking from experience. Progress requires the tall poppies…
Anyways, I'm going to bed. Please don't prepare a noose while I sleep…
Mmmm, kind of. The thing is, an arrow fletcher or a blacksmith could explain their craft. Sure, their grasp of physics as we know it was shit, but they knew what they were doing and why.
We can build really complex neural networks now, and we know how the pieces work, but we just can't comprehend the whole system. Nobody can. All we know is that for this input you get that output, but how it got there is a total mystery.
Reminds me of that famous quote: "If the brain were simple enough for us to understand it, we would be so simple we couldn't."
Their craft, yes, but not how things actually worked… why arrows flew, or why combining various ingredients or performing certain arcane steps produced this metal or that kind of blade.
I know what you're saying about neural networks, but a lot of human innovation was also discovery… it was more "art" than science… nobody knew why sand became glass, they could not explain it… it just did… arrow makers could not explain the archer's paradox… it just "was". I don't see this as all that different from today's discovery of the emergent properties of LLMs.
One day, we will be able to completely explain it… people are already working on ways to do that. If there is one thing about humans… there is a certain tenacious contingent of us that just needs to know how shit works… here are just a few of the tool categories being developed to find out exactly how LLMs work and why they work that way (a toy sketch of the gradient-based flavour follows the list):
- Interactive Interpretability Platforms
- Gradient & Perturbation-Based Attribution Libraries
- Mechanistic Interpretability & Circuit Analysis
- Observability & MLOps for LLMs
- Visualization Toolkits for Attention & Activations
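To ground the gradient-based bullet with something concrete: the simplest attribution trick is gradient × input. Run a forward pass, backprop from one output, and weight each input feature by its gradient. A toy PyTorch sketch with a stand-in model (not any specific tool's API; a real LLM is this, times a few billion parameters):

```python
import torch
import torch.nn as nn

# toy stand-in model; the technique is the same for a real network
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))

x = torch.randn(1, 8, requires_grad=True)  # one fake input vector
logits = model(x)
logits[0, 1].backward()   # d(chosen output) / d(input)

attribution = x.grad * x  # gradient × input: per-feature credit
print(attribution.detach())
```

That gives a first-order answer to "which inputs mattered"; the hard research problem is that first-order stories stop being the whole story at LLM scale.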
The phrase "the sky's the limit" does not apply when it comes to human knowledge acquisition… one day, people are gonna build a planetary computer to derive the Ultimate Question of Life, the Universe, and Everything.
I really like this Brit phrase I just learned about: "tall poppies."
I like watching a lot of craft-specific YouTube. Inheritance Machining and Alec Steele are great channels.
Alec Steele is a blacksmith, and his gig is specifically Damascus steel forging. He goes on kicks where he tries to make Damascus with different alloys. And he often refers to very old blacksmithing texts and how they did things. He covers a ton of old texts, including even arrowhead pulling.
So, from what I have gathered from this, a blacksmith could explain what he does and why he does it, but not why it works.
I'm going to use Damascus as an example. Period blacksmiths would incorporate iron filings between the layers of steel when forging their Damascus, because it would help the layers stick together. They didn't know why, just that it worked.
Likewise, if they were forging arrowheads, they would make them all in a standard way but didn't really know specifically what made one better than another. But they knew pulling them from a single rod produced better results, because they could make them quickly.
This conversation brings to mind the permacomputing principle that a good level of complexity to aim for in a device is the level that a single human can understand.
I don't think we need to be able to understand all the details of why a thing works. That is an endless search: fruitful, but endless. The blacksmith and the fletcher don't know why many of their techniques work, but they do know how to apply a set of techniques to get the desired results. Also, in their case, they're working with natural mediums, so it's a bit different from working with artefacts that we ought to be designing to work in human-comprehensible ways. They are adapting to nature, but our artefacts should be adapted to us, or at least meet us in the middle. (I hold that elegant mathematics is that middle, or near to it.)
We don't know why it works, but I think it's good to build systems at a level of complexity whereat we can track how everything works from a functional, utilitarian standpoint. That is to say, we don't need to understand all of the implementation, but we do need to be able to track all of the causes and effects, so that the user knows exactly what to expect as a result of a given input, and exactly how to get to an available output.
I yearn for this ideal. There's a high cost to starting over on a big project, though, like an entire desktop operating system and all the accumulated projects that depend upon it. The more we can break projects down into smaller independent modules and elegant interfaces (or, more broadly, good software architecture, as you say), the easier this becomes in the long run. Elegance is always a trade-off against production and performance, but for interfaces that we'll be calling upon for a long time, elegance is ideal.
Not always. In fact, I tend to think that elegant solutions are cheaper, easier to manufacture, and perform better. But I come from a hardware background, and while there are instances where you get some overhead from coding things properly, this often pays off with shorter development times for more complex projects. And unless you're doing a one-and-done proof-of-concept thing, there's no point tangling things up and diving into the assembly side of a niche architecture…