The anti🚫-derailment🚃 & thread🧵 hijacking🔫 thread🧵 ⁉

Oohhh hahaha I get it

1 Like

This nostalgia journey made me find and download ThunderCats. I’m gonna watch all 4 seasons.

1 Like

Pffffff, top-load Betamax with the wood paneling on the front was where it was at!

And my dad spending weekend after weekend renting a VHS machine and some tapes to copy over because they stopped producing Betamax tapes, and being too cheap to get a new machine.

1 Like

That started out as a terrible tin foil hat joke, but I like the version of shaping my fingers to change the channel a lot more!

:robot_windows:

The first one I used that had that feature was new back in 2007. But microwaves last forever so I was probably late to the game.

With the wired remote!

I’m also getting old… and somehow Windows 11 is significantly worse than Windows 95. Let’s forget about the tragedy of Windows 8.

:cry:

2 Likes

I view Windows 8 as the beginning of everything Microsoft wanted Windows to be and the end of everything users wanted it to be.

1 Like

Windows 7 was a definite milestone

1 Like

You can tell the best versions: they’re the ones Microsoft bitches most heavily about, with users hanging on for dear life years and years after they were replaced.

98 to a certain extent.
XP and 7 definitely.

Don’t get me wrong, they all had flaws, and time has marched on, but those were the ones whose replacements definitely weren’t upgrades.

1 Like

Thank you, and please do. I hope that an awareness of the need for obedient tech will catch on.

I think of it in terms of the “Cyborg Philosophy” versus the “Android Philosophy.” The former wishes for technology to add functionality to the human, upgrading us to better carry out our wills. The latter wishes to outsource human activity to an external agent, outsourcing our work load, our responsibility, and in the process, outsourcing our learning, mastery, and control.

For me it began when the rich high school I went to decided to spend their budget on “smart boards”: glorified projector screens with stylus support. For the absurd amount they spent, neighbouring schools could have gotten the basic equipment they needed. Our science departments, seemingly in defiance of this nonsense, continued to use the old overhead projectors with the transparent film you could write on, or they’d use cheap 3D mice to control their projected PC screens, achieving the exact same technical advantages much more efficiently.

Then the parents of the rich kids bought them all iPhones when they came out, and I’d be sitting on my laptop typing with all my fingers on a full-sized, full-powered machine while these kids were walking around trying to compute with just their thumbs on a tiny screen. Each time a new version would come out, the rich kids would “accidentally” drop their iPhone in the toilet and their parents would buy them new ones. By this time, I’d already experienced the MyVu ocular display, and I was sure this “smart phone” fad would pass quickly and we’d mount proper monitors to our eyes and proper keyboards to our belts and proper PCs to our backs. I waited, and waited, but late-stage laissez-faire consumerism doesn’t deliver. Google bought up MyVu and stuffed “smart” into it, ruining it. I finally realised I have to figure out how to do it myself.

Is this why my new car lags before responding to the accelerator? That’s terrifying. Wonder if the brakes do that too?

I have to wait several minutes after turning on my car for the computer to boot up before I can control the thermostat on the touchscreen, during which time it blasts undesired hot or cold air, because the thing has no actual buttons. Unlike my old cars where I just slide the potentiometer where I want (by touch, which is important when you need your eyes for the road) and it reads that position next time the car starts.

And it has these smart lights for the cargo area that seem to intelligently stay on just long enough for me to step away from the light switch before plunging me into darkness. Thanks. Not that the light switches are much help, 'cause they’re smart too. One in the front knows to ignore me when I tell it to turn off, one in the back knows to ignore me when I tell it to turn on, and both of them know to ignore me after the first instruction I give. Also, there’s no fuse box to kill them, because that’s managed in the onboard computer. I intend to cut the wires and replace them with a simple light on a circuit connected to a stand-alone battery, so I can simply open and close the circuit to control it. No “smart” crap.

So that’s why we call them “clickers!”

There’s an old hotel
down in Hell
where the TV works but the clicker’s broken.
– A lyric that’s been stuck in my head for years.

[Edit: found the actual lyric.]

3 Likes

So hopefully the signal won’t spend a significant amount of time in unnecessary and unreliable processing in any of these systems, just a quick and direct electronic control circuit that’s only disrupted by other systems when needed. Even so, I’ve disabled or minimised as many of the driver-override features as I can and just let it notify me, so that I can remain in control of the giant metal battering ram of death. There are a lot of things I would hope wouldn’t have excessive smart bloat in them, but they do.

2 Likes

With the amount of people deploying a full Raspberry Pi to blink an LED… no way it’s anything less complicated. Remember the Toyota accelerator scandal? Even though in the end I think it was determined not to be the fault of bad code in the ECU of certain Toyota cars, it did reveal that many car manufacturers have no idea how their ECUs actually work… a ton of black-box and spaghetti code means safety reviews are being skirted, because nobody is really evaluating all the code in its entirety.

4 Likes

It’s an interesting phenomenon that, as a species, we have learned to create technology that exceeds our capacity to comprehend it.

2 Likes

That pretty much happened immediately… master artisans making things that the general population had no idea about: arrow fletching, fly-fishing lures, aqueducts, concrete, glass making, blacksmithing… simple technologies that made amazing things possible.

2 Likes

That’s been very painful to watch


Don’t forget about old architecture-dependent code being used in embedded devices. There are a bunch of products that rely on old code that only runs on certain families of 8-bit MCUs, and everybody is afraid to do anything beyond simple fixes to that codebase… and then they added a more powerful CPU and a touchscreen, and WiFi, etc., all interfaced to the old micro through a proprietary bit-banged protocol, because no one dared touch the old code base.
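
For anyone wondering what “bit-banged” means: toggling GPIO pins in software instead of using a hardware peripheral. Here’s a toy Python simulation of an SPI-like link between a “new” CPU and an “old” micro; the function names and framing are made up for illustration, and real proprietary protocols add timing constraints, framing, and checksums:

```python
# Toy bit-banged link: the sender clocks each bit of a byte out on two
# virtual GPIO lines (clock, data); the receiver samples data on every
# rising clock edge. Purely illustrative, not any real product's protocol.

def bitbang_send(byte, lines):
    """Append (clock, data) samples for one byte, MSB first."""
    for i in range(7, -1, -1):
        bit = (byte >> i) & 1
        lines.append((0, bit))   # set data while clock is low
        lines.append((1, bit))   # rising edge: receiver samples here

def bitbang_recv(lines):
    """Rebuild bytes by sampling data on rising clock edges."""
    bits = [data for clock, data in lines if clock == 1]
    return bytes(
        int("".join(map(str, bits[i:i + 8])), 2)
        for i in range(0, len(bits), 8)
    )

lines = []
for b in b"OLD":
    bitbang_send(b, lines)
print(bitbang_recv(lines))  # b'OLD'
```

The fragility is easy to see: both sides must agree on bit order, sampling edge, and byte boundaries, and none of that is written down anywhere except the old code nobody dares touch.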


When that doesn’t happen, every single product in the lineup gets a different firmware with code from different vendors… Networking gear is a great example of this. Every single AP model has a different firmware, and bugs go unpatched because this creates too many projects to maintain. That’s why I love Mikrotik and their unified RouterOS that runs on everything they make: it actually gets updated.

Good software architecture and a willingness to ditch old code and start fresh goes a long way. Yes, it’s painful in the short term, but managing several software projects and adapting new tech to work with old code out of fear of touching said old code will only increase the R&D costs


FFS, there are HP calculators that run an emulator of one of their old CPUs and still use the same firmware as they did back in the 80s. I bought an HP Prime, and it’s limited to 500 digits because of a memory limitation of a CPU they used back in the 80s. That thing has a beefy ARM CPU and comes with Python, but they couldn’t maintain their original math library
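
To put that 500-digit cap in perspective, it’s a firmware inheritance, not a hardware limit. Stock Python (which the Prime ships with) does arbitrary-precision integers out of the box on any modern ARM:

```python
# Python's built-in integers have no fixed digit limit; the interpreter
# grows them as needed. 1000! alone blows far past a 500-digit cap.
from math import factorial

digits = len(str(factorial(1000)))
print(digits)  # 2568
```

So the cap lives entirely in the emulated 80s math library, not in anything the ARM chip can’t do.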


I fear that this is only going to get worse as companies offshore their engineering departments to certain places in Asia and Latin America where the culture is strongly biased against rocking the boat. Collectivist and conformist cultures make innovation difficult… but please, before you lynch me for saying that, let me tell you that I think people who leave such places are often more individually minded. I also grew up in Colombia and hated the crab-bucket culture there, so I’m speaking from experience. Progress requires the tall poppies.


Anyways, I’m going to bed. Please don’t prepare a noose while I sleep


3 Likes

Mmmm, kind of. The thing is an arrow fletcher or a blacksmith could explain their craft. Sure, their grasp of physics as we know it was shit, but they knew what they were doing and why.

We can build really complex neural networks now, and we know how the pieces work, but we just can’t comprehend the whole system. Nobody can. All we know is that for this input, that output; how it got there is a total mystery.

Reminds me of that famous quote, “If the brain were simple enough for us to understand it, we would be so simple we couldn’t.”

2 Likes

Their craft, yes, but not how things actually worked… why arrows flew, or why combining various ingredients or performing certain arcane steps produced this metal or that kind of blade.

I know what you’re saying about neural networks, but a lot of human innovation was discovery… it was more “art” than science. Nobody knew why sand became glass; they could not explain it, it just did. Arrow makers could not explain the archer’s paradox; it just “was”. I don’t see this as all that different from today’s discovery of the emergent properties of LLMs.

One day, we will be able to completely explain it… people are already working on ways to do that. If there is one thing about humans, it’s that there is a tenacious contingent of us that just needs to know how shit works. Here are just a few of the tools being developed to find out exactly how LLMs work and why they work that way:

  • Interactive Interpretability Platforms
  • Gradient & Perturbation‑Based Attribution Libraries
  • Mechanistic Interpretability & Circuit Analysis
  • Observability & MLOps for LLMs
  • Visualization Toolkits for Attention & Activations
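
The perturbation-based attribution family above is simple at heart: knock out one input at a time and see how much the model’s output moves. A toy sketch (the model and numbers here are made up for illustration; real tools like Captum do this over tokens and embeddings):

```python
# Perturbation-based attribution, minimal version: score each input
# feature by how much the output shifts when that feature is zeroed out.

def perturbation_attribution(model, x):
    base = model(x)
    scores = []
    for i in range(len(x)):
        perturbed = list(x)
        perturbed[i] = 0.0          # knock out one feature
        scores.append(abs(base - model(perturbed)))
    return scores

# Toy "model": a fixed linear scorer standing in for a real network.
weights = [0.1, -2.0, 0.5]
model = lambda x: sum(w * xi for w, xi in zip(weights, x))

scores = perturbation_attribution(model, [1.0, 1.0, 1.0])
print(scores)  # feature 1 (weight -2.0) dominates
```

The same idea scaled up (occlude a token, re-run the LLM, measure the shift) is one of the workhorse techniques behind those interpretability platforms.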

The phrase “the sky’s the limit” does not apply when it comes to human knowledge acquisition… one day, people are gonna build a planetary computer to derive the Ultimate Question of Life, the Universe, and Everything.

1 Like

I really like this Brit phrase I just learned:

Start As You Mean To Go On

1 Like

I like watching a lot of craft-specific YouTube. Inheritance Machining and Alec Steele are great channels.

Alec Steele is a blacksmith and his gig is specifically Damascus-steel forging. He goes on kicks where he tries to make Damascus with different alloys. And he often refers to very old blacksmithing texts and how they did things. He covers a ton of old texts, even including arrowhead pulling.

So from what I have gathered from this, a blacksmith could explain what he does, and why he does it, but not why it works.

I’m going to use Damascus as an example. Period blacksmiths would incorporate iron filings in between the layers of steel to create their Damascus when forging because it would help the layers stick together. They didn’t know why, just that it worked.

Likewise, if they were forging arrowheads, they would make them all in a standard way but didn’t really know specifically what made one better than another. But they knew pulling them from a single rod produced better results, because they could make them quickly.

2 Likes

This conversation brings to mind the permacomputing principle that a good level of complexity to aim for in a device is one that a single human can understand.

I don’t think we need to be able to understand all the details of why a thing works. That is an endless search. Fruitful, but endless. The blacksmith and the fletcher don’t know why many of their techniques work, but they do know how to apply a set of techniques to acquire the desired results. Also, in their case, they’re working with natural mediums, so it’s a bit different than working with artefacts that we ought to be designing to work in human comprehensible ways. They are adapting to nature, but our artefacts should be adapted to us, or at least meet us in the middle. (I hold that elegant mathematics is that middle, or near to it.)

We don’t know why it works, but I think it’s good to build systems at a level of complexity whereat we can track how everything works from a functional, utilitarian standpoint. That is to say, we don’t need to understand all of the implementation, but we do need to be able to track all of the causes and effects so that the user knows exactly what to expect as a result of a given input, and exactly how to get to an available output.

I yearn for this ideal. There’s a high cost to starting over on a big project though, like an entire desktop operating system and all the accumulated projects that depend upon it. The more we can break projects down into smaller independent modules and elegant interfaces, or more broadly good software architecture as you say, the easier this becomes in the long run. Elegance is always a trade-off against production and performance, but for interfaces that we’ll be calling upon for a long time, elegance is ideal.

1 Like

Not always. In fact, I tend to think that elegant solutions are cheaper, easier to manufacture, and perform better. But I come from a hardware background, and while there are instances where you get some overhead from coding things properly, this often pays off with shorter development times for more complex projects. And unless you’re doing a one-and-done proof-of-concept thing, there’s no point in tangling things up and diving into the assembly side of a niche architecture.


1 Like