What’s your worst “horrible coincidence” experience?

Oh, shit, I had forgotten that it’s been so many years. I’ve made that mistake too, but luckily it was not a prod system (yet).
Why opinion on AI is so divided. AI power users are pulling away from everyone else.
It’s not just due to coding. It’s about understanding the tool.
I could mog just about anyone before. But with AI, I can easily be 1,000x as smart as a normie. Whereas a normie using AI is often dumber than before, because they do not understand how to use AI well or correctly. Using an LLM, they end up with something worse than if they just had not used it at all.
Someone like me, however, can produce in a few hours with AI something that would otherwise have taken me five hundred hours to create (I just did something like that at work yesterday, though that one wouldn’t have taken a full 500 hours). It greatly enhances my capabilities and output.
In other words, my arrogance aside, AI (like many technologies) amplifies pre-existing advantages. Perhaps more than any other tech we’ve ever created. This disparity will only become greater.
So, to condense all those words, AI makes smart people tons smarter and normies a bit to a lot dumber. It’ll be interesting to see what falls out of that.
Anyone read this 49-day SSL expiration thing and think they would rather just retire?
Yes. It’s such a fucking clownish, useless idea that is going to make my life worse and many other people’s too for absolutely zero real-world security benefit.
The shit-for-brains set of doofuses who thought of this should be run out of the industry and be required to work at a Chili’s in Duluth until they expire from despair.
What a huge waste of time and resources that is going to lead to more security issues, not fewer.
This is an actual question on the official study guide of a cert (CISM) I am studying for. And it is egregiously, laughably wrong in every aspect. (The bolded answer is what it thinks is correct.)
1) “IP Security v6” is not a real protocol name. There is nothing called that anywhere in my field. There is IPsec, which applies to both IPv4 and IPv6.
2) The explanation about source and destination IPs being inside the encrypted portion is false in general. In transport mode the IP header is never encrypted, and even in tunnel mode the outer header’s source and destination addresses are visible to everyone on the path.
3) MITM resistance in IPsec comes from authenticated key exchange and integrity, not from hiding IP addresses (see the toy sketch after this list).
4) Even good ol’ IPsec does not prevent MITM in all deployment types. Pre-shared-key deployments, for example, can be attacked such that an MITM attacker obtains an OTP and logs in as the remote user. And the fairly common NULL-authenticated IPsec offers no MITM protection at all and should be treated like plaintext traffic in almost all cases.
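To make item 3 concrete, here’s a toy sketch — not real IKE/IPsec, just bare Diffie-Hellman with tiny illustrative numbers. With no authentication, an on-path attacker simply runs two separate exchanges, and hiding or encrypting IP addresses would change nothing. Only authenticating the exchange (certificates/signatures) lets either side detect the swap.

```python
# Toy unauthenticated Diffie-Hellman MITM (tiny numbers, illustration only).
import secrets

p, g = 23, 5  # toy group; real IKE uses 2048-bit+ MODP or elliptic-curve groups

def keypair():
    priv = secrets.randbelow(p - 2) + 1
    return priv, pow(g, priv, p)

a_priv, a_pub = keypair()  # Alice
b_priv, b_pub = keypair()  # Bob
m_priv, m_pub = keypair()  # Mallory, on-path

# Mallory intercepts both public values and substitutes her own:
# Alice unknowingly keys with Mallory, and so does Bob.
k_alice = pow(m_pub, a_priv, p)  # Alice's "shared" secret (really with Mallory)
k_bob   = pow(m_pub, b_priv, p)  # Bob's "shared" secret (really with Mallory)

# Mallory derives both secrets and can decrypt/re-encrypt in both directions.
assert k_alice == pow(a_pub, m_priv, p)
assert k_bob   == pow(b_pub, m_priv, p)
print("MITM holds two valid sessions:", k_alice, k_bob)
```

Nothing in that attack touches IP addresses; it works no matter what the packets hide.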
That question is just wrong in fifty-eleven different ways. Amazing that it’s on a study guide. But networking is usually the worst area on anything like this, as no one in that field ever knows what the hell they are talking about (see the clownishly asinine NAT ISN’T A FIREWALL AND OFFERS NO SECURITY bullshit the Hacker News-type doofs always spout).
Ooh, ooh, I know this one!
First, there are various types of data centers. Not all are water-cooled; some are only air-cooled. And there are various types of water cooling. The answer is also heavily dependent on climate and on the type of data center. There is no one pat response to a question like this. As is usual with life, the permutations are endlessly complex.
However, these days when people say “data center” they usually mean “AI data center” because that is all they are aware of. And in reality, the concern about water usage (as Noah Smith pointed out) is really displaced anxiety about AI-related job loss. So the water use question already starts out in epistemically-shaky territory.
Now, let’s talk baselines. The average 18-hole golf course in Texas uses roughly 275,000 gallons of water a day. You rarely see anyone complaining about that, right? And that’s a lot: enough water for around 1,000 households. As another side note, there are about 430 18-hole golf courses in Texas, and about 13,000 in the US total.
The most common data center type now, an ~100 MW aggregate AI data center, uses about 387,000 gallons/day, or about 1.4x the golf-course figure above. Also a lot of water. I’d argue that this DC is doing something far more useful than letting some old dudes hit a little ball around, though. I’ve seen claims that a single AI data center uses as much water per day as a large city, which isn’t remotely true even for the densest, 250-megawatt, evaporatively-cooled DCs. Those can use about 3.2 million gallons/day, but there aren’t many of them: probably only 20-40 in the world. Houston, as an example, uses around 475 million gallons of water per day.
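If you want to sanity-check those ratios, here’s the back-of-the-envelope arithmetic in Python (all inputs are the rough, approximate figures above, nothing more precise than that):

```python
# Rough water-use comparison using the approximate figures cited above.
GOLF_COURSE = 275_000        # gal/day, average 18-hole Texas golf course
DC_100MW    = 387_000        # gal/day, ~100 MW aggregate AI data center
DC_250MW    = 3_200_000      # gal/day, dense 250 MW evaporatively-cooled DC
HOUSTON     = 475_000_000    # gal/day, city of Houston

print(f"100 MW DC vs. golf course: {DC_100MW / GOLF_COURSE:.1f}x")  # ~1.4x
print(f"250 MW DC vs. golf course: {DC_250MW / GOLF_COURSE:.1f}x")  # ~11.6x
print(f"Houston vs. 250 MW DC:     {HOUSTON / DC_250MW:.0f}x")      # ~148x
```

Even the densest facility is roughly two orders of magnitude below a single large city’s daily draw.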
So, data centers — at least AI data centers — do use quite a lot of water, but the usual reports I see misrepresent this number by 10x to 1,000x. Having the real facts is important. Else you’re just dealing with fantasy, which helps no one.
I think we need something like Git, just not something so horrible and complex to use.
Git is great — if you’re a full-time developer. Ok, not great, as even among devs only about 10% can use it competently. But it does some things they need while being way over-complex.
But it does almost nothing I need. I’d use something like Git all the time if it were about 100x easier to use.
It’s AI. My company has canceled at least one hire because the role is no longer needed: AI is doing 100% of the job.
And that’s how it’ll happen. Despite the flashy layoffs (Oracle, Microsoft), most of AI’s impact will show up as people who are never hired: junior and even mid-level people in various tech fields just aren’t necessary to bring on board any longer. And that is only going to get worse.
So, in other words, the problem is not and won’t be firing. It’ll be lack of hiring.
And yes, this will soon happen to other fields, as it always does. As usual, tech just gets hit first.
Kubernetes is such absolute shit to work with. It’s one of the worst tech products I’ve ever used. It perpetually feels alpha-level. Partially that’s because it’s based on the priorities and “needs” of developers, so it favors extreme complexity over everything else — with security, networking and performance as a very late afterthought. Infrastructure people wouldn’t overlook those items. But the reality is, most devs don’t care about them and don’t understand them. (Many devs think a millisecond is “not much different” than a microsecond, for instance.)
It’s also an example of “new, cool, so must be good.” Tech is much the same as the fashion world. Surprisingly similar, in many ways. Various flashy products emerge, most of which don’t perform in any way as well as the old versions, but offer some cachet, some sense of being “in style” and are adopted by people shilling things or too clueless to actually understand anything other than the latest headlines.
And then MBAs and poor tech leaders pick up on this and force others to adopt that tech, even when it is completely inappropriate to the use case. Kubernetes experienced significant growth and usage from that.
Kubernetes is needed by maybe 0.01% of companies. Everywhere else, it’s an absolute, pointless waste.

Also, Gen X, if you started early (as I did). The sweet spot of tech adding many benefits while avoiding most drawbacks was around 1986-1999. Being charitable, it could be extended up until social media became more popular, around 2005. I wouldn’t, though, as ubiquitous mass surveillance had already become easy at that point.
But in 1990, you could dial into a BBS, chat with someone, download a little game, then go outside and ride your bike. Social ties weren’t destroyed by tech and if someone wanted to surveil you, they had to have a real reason.
Also, no smartphones, which are nothing but evil.
Microsoft has a new plan for Windows 11, and it actually sounds great.
Windows supported moving the taskbar to different sides of the screen for years, at least as far back as Windows XP in 2001.
Moving the taskbar to any edge of the screen was supported in Windows 95 on day one of its release. I know because I did it.

The world is about to change in a big, big way, and no one will be able to stop it. Most especially not the laughable Bluesky screechers.
Glad I have enough storage/compute resources to last until 2029-2030. That was a smart decision.
World might end by then, who knows. Either way, I am set.
A Decade of Docker Containers.
Too bad about the terrible design, absolute shit networking, anemic performance, and poor security.
Other than all that, Docker is great.

Same thing I’ve been saying. AI has been smarter than the average human since ~2022, which is not a very high bar. Now it’s getting into the range of low-grade genius (~130 IQ).
The good news is that I think, due to certain physical realities and how matter and the universe itself are organized, AI will top out at around ~160 IQ; I don’t believe superintelligence is possible in this universe.
The bad news is that having a tireless, always-on genius operating 24/7 as needed will radically reshape societies in innumerable hard-to-predict and likely-to-be-destructive ways even sans superintelligence.
So there’s that.
Not a good thing.