Of course he would. He could probably give Hitler lessons on oven design.
Late stage capitalism rewards management for any appearance of change. It really doesn’t matter whether the results of that change are good or bad. And even a CEO who keeps destroying companies can always find a similar position elsewhere. The feedback loop is hopelessly broken.
Does he still have a company at all?
This type of shortsightedness should be punished. I mean, AI can be useful for certain tasks, but it’s still just a tool. It’s like these CEOs were just introduced to a screwdriver and they’re trying to use it for everything.
“Look employees, you can use this new screwdriver thing to brush your teeth and wipe your ass. “
As a paid, captive squirrel, focusing on spinning my workout wheel and getting my nuts at the end of the day, I hate that AI is mostly a (very expensive) solution in search of a problem. I am being told “you must use AI, find a way to use it” but my AI successes are very few and mostly non-repeatable (my current AI use case is: “try it once for non-vital, not time-sensitive stuff, if at first you don’t succeed, just give up, if you succeed, you saved some time for more important stuff”).
If I try to think as a CEO or an entrepreneur, though, I sort of see where these people might be coming from. They see AI as the new “internet”, something that for good or bad is getting ingrained in everything we do, something that can bankrupt your company if you try too hard to do things “the new way”, but can also make you quickly fade into irrelevance if you keep doing things the old way.
It’s easy, with the benefit of hindsight, to say now “haha, Blockbuster could have bought Netflix for $50 million and now they are out of business”, but all the people who watched that happen see AI as the new disruptive technology that can spell great success or complete doom for their current businesses. All hype? Maybe. But if I were a CEO I’d probably be sweating too (and having a couple of VPs at my company wipe up the sweat with dollar bills).
My use case for AI is to get it to tell me water to cereal ratios, like for rice, oatmeal, corn meal. If there is a mistake, I can easily control for it, and it’s a decent enough starting point.
That said, I am just being lazy by avoiding taking my own notes. I can easily make my own list of water to cereal ratios to hang on the fridge.
I’m working in a small software development company. We’re exploring AI. It’s not being pushed without foundation.
There’s no need to commit when you don’t even know what you’re committing to, disregarding cost and risk. It just doesn’t make sense. We should expect better from CEOs than emotionally following a fear of missing out without a reasonable assessment.
Just like an AI. Instead of learning from mistakes, he repeats them, and denies any wrongdoing.
“You’re Absolutely right!”
Because he asks the AI what’s best, but the chatbot always treats it as a loaded question, and since it wants to be seen as helpful, it finds a way to agree, yes-man style.
God, this article is so full of bullshit my phone stinks. And I’m not even an AI-phobe
What if we just swap CEOs with psychopathic assholes that only… oh wait.
CEO of enterprise-software powerhouse IgniteTech.
Can someone tell me what they do? They don’t have a Wikipedia article and their website is mostly AI slop.
They throw buzzwords at venture capitalists in hopes of one day selling out.
After grilling their silly LLM for a while, I was able to squeeze out what that company really is all about. They don’t really make anything. They just buy miscellaneous software companies, and turn those apps into subscription based cloud cancer. Enterprise software meets maximum enshittification, yeah baby!
Ah, so removing employees from this dumpster fire was a net positive for society.
They throw buzzwords
Now I understand why the CEO thinks AI could replace everybody.
deleted by creator
I understand what enterprise software is. That wasn’t my question.
“Doing Our Part to Make the World a Greener Place”
Clown company. You can’t promote AI and make a claim like that at the same time.
By accelerating the collapse of a human survivable ecosystem we will bring about the end of humanity, resulting in a greener environment for the handful of surviving species.
By ~~accelerating the collapse of~~ pivoting a human survivable ecosystem we will ~~bring about the end~~ accelerate a paradigm shift of humanity, resulting in a greener environment for the ~~handful~~ stable base of ~~surviving species~~ recurring revenue.
Jesus fucking christ they wanna force us to use AI.
“The marketing and salespeople were enthused by the possibilities of working with these new tools, he added.”
https://youtu.be/KHJbSvidohg#t=13s
I see the same push where I work and I cannot get a good answer to the most basic question:
“Why?”
“We want more people using AI.”
“Why?”
“. . .”
Same reason as forcing people back into the office even though remote work is the solution to a number of serious issues affecting society:
Investors/banks have tons of money in these markets and are incentivizing/forcing companies to adopt these policies to prop up the markets, whether it is in their interest or not.
Oh, yeah, we have that too… we want people in the office because collaboration! Synergy! etc. etc.
“How does that work if you want everyone using AI?”
“. . .”
Oooo hot take time: I’d rather work in an office again than be forced to use LLMs.
I’d rather correct LLM hallucinations than some of the crap front line tech support tells people. :)
I’d rather use the AI than go back to the office. The AI doesn’t care if I’m wearing any pants.
I usually ignore these kinds of trends. Just meet any required deadlines etc., but don’t engage too much. The vast majority will just disappear.
Specifically as a software developer I cannot see a good outcome from engaging with this trend either. It’s going to go one of two ways.
1: It pans out sooner rather than later that AI wasn’t the panacea they thought it was, and it either is forgotten about, or becomes a set of realized tools we use, but don’t rely on.
2: They believe it can replace us all, and so they replace us all with freshly graduated vibe “programmers” and I don’t have a job anyway.
I don’t really see an upside to engaging with this in any kind of long term plan.
2. It’s about breaking the power of tech workers by reducing them from highly skilled specialists to interchangeable low-status workers whose job is to clean up botshit until it compiles. (Given that the machine does the real work and they’re just tidying up the output it generates when prompted, they naturally don’t merit high wages or indulgent perks, even if getting 30,000 lines of code regurgitated from the mashed-up contents of Github and Stack Overflow working is more cognitively taxing than writing that code from scratch would have been.)
It doesn’t matter what they claim if they simply can’t get people to babysit the AI codebase, or the AIs themselves, for less money than the original developers (who never had to deal with AIs and their output) used to cost.
As a pretty senior dev who spent a lot of my career as a contractor, mainly coming in to unfuck code-bases seriously fucked up by a couple of cycles under less experienced people: if I was pitched work to unfuck AI output, I would demand a premium for my services, purely because it is far more fucked up, in far harder-to-follow ways, than the work done by less experienced humans (who at least are consistent in the mistakes they make and follow a specific pattern in how they work). And that’s even before any moral considerations; on principle I would probably just not take a contract with a company that had used AI like that.
I mean, I can see their strategy working on junior devs, but that kind of reduction in the power of specialized workers was already being done to junior devs using “outsourcing”.
My prediction is that it’s just the latest buzzword on the pile of buzzwords and by 2028 a new one will pop up and the only time you hear “AI” will be in the line of “Hey, remember when everyone was talking about AI?”
Before AI it was “The Cloud”. Before the cloud it was “Virtualization”. They’re saying all the same things about AI that they said about the cloud and virtualization…
I guess the real money is in inventing the new buzzword that sales people can say will make your business faster, more agile, and more efficient. :)
Every time I see this kind of hype pop up, I think back to when there was this great announcement from Silicon Valley about a “revolution in transportation” and it turned out to be the Segway.
“People will be designing cities around this!”
It could have had an impact if it hadn’t been $5,000…
And, BTW, not even the coolest thing Kamen designed…
Before AI it was “The Cloud”. Before the cloud it was “Virtualization”. They’re saying all the same things about AI that they said about the cloud and virtualization…
So you’re saying AI will make a measurable (arguably net positive) impact and forever change the way we do things in our day to day, just becoming a standard toolset offered by many providers? Because I’d argue that’s what virtualization was, as well as the cloud to a lesser extent. Hell, I’d be hard pressed to be convinced that virtualization was a bad thing (not so much the cloud though, that has some solid negative arguments).
If you’re trying to shit talk AI, you’d be better off comparing it to blockchain than cloud/virtualization, since the latter two are an integral part of a large amount of the work we do, and the former is mainly for illicit drugs/activities and stealing money.
Agreed, virtualization was one of the best things to happen in IT since the dawn of the internet. I can’t even imagine how much less efficient and reliable datacenters and the entire internet would be without it. Not at all comparable to AI.
I actually work for a company that does very little virtualization now and it’s fucking awful.
I think the comparison is apt; it’s not that LLMs are useless, it’s just that, currently, they’re insanely overhyped. Just like the .com boom had irrational companies that evaporated, but the underlying tech was useful. Just like in-house servers were considered dead with everything being cloud hosted, and now there’s recognition of a trade-off. Just like there was pressure to ship everything as an ‘.ova’, and nowadays that’s not really done as much.
An appropriate level of LLM use might even be nice, but when it’s fuel for the grifters, it is going to be obnoxious.
The Cloud is still a thing though. As is virtualization
And AI (LLMs, media generation, machine learning) is going to stay a thing as well.
Yeah, there’s generally a kernel of value wrapped up in all sorts of bullshit.
Same with the .com boom: obviously here we are with the internet as critical infrastructure, but the 1999 ‘internet’ was a mess of overhype.
Yeah, but nobody talks about them solving/causing all the problems. :)
I think the next buzzword teed up is “quantum”
Looking forward to the infinitely scalable quantum AI blockchain cloud virtualization E2E P2P VPN micro-services!
Get this person a golden parachute, stat
🤢
Yeah, it’s already happening…
I think it’s a real shame because all three of those things you mention are useful. The problem is that once they become a buzzword, then everything needs to be done using that buzzword.
Cloud has been misused to hell and back, and I have no doubt AI will too.
“AI-powered cloud software virtualization”
It’s just Kubernetes.
You have to use AI! For what? I dunno, figure it out or you’re fired! <- a genius businessman, apparently…
This blind lemming-like rush towards AI that so many CEOs seem to suffer from seriously resembles cult behavior or severe drug addiction, my god…
Off-topic, but not-so-fun fact: lemmings don’t actually follow each other off cliffs in mass suicide events. The people filming the documentary actually scared and chased them to get them to panic and do that.
Horrible, I know
True and completely right, but I lack the vocabulary to replace it with something more accurate and still evocative enough :)
Nah, vulture capital.
AI will now supplement all interactions with the genius businessman
They are so hung up on replacing employees with AI, but they don’t know how, so they force the employees to use AI, in the hope that the employee will teach the AI how to replace them
AI always tells them what they want to hear and will make up sources ad infinitum. So unless you step outside of that bubble and search on your own, you would never know.
cool story bro
You probably don’t want to work at a company like that anyway.
Or work with them. Or hire them. And I don’t even know what they do.
They “do AI”.
A lot of the large(ish) corporates are moving in this direction, including where I work. It’s not unusual, I always liken large organizations to insects, just following where the others are going, and what they are doing. They don’t really ever put much thought into their actions.
There is going to be insane technical debt all over the place in a few years…