I came across this article in another Lemmy community that dislikes AI. I’m reposting instead of cross-posting so that we can have a conversation about how “work” might be changing with advancements in technology.
The headline is clickbaity: Altman was referring to how farmers who lived decades ago might look at the work “you and I do today” (Altman included) and not see it as work.
The fact is that most of us work many levels removed from basic human survival. Very few of us are farming, building shelters, protecting our families from wildlife, or doing the back-breaking labor jobs that humans were forced to do generations ago.
In my first job, which was IT support, the concept was not lost on me that all day long I pushed buttons to make computers beep in more friendly ways. There was no physical result to see, no produce to harvest, no pile of wood being transitioned from a natural to a chopped state, nothing tangible to step back and enjoy at the end of the day.
Bankers, fashion designers, artists, video game testers, software developers and countless other professions experience something quite similar. Yet, all of these jobs do in some way add value to the human experience.
As humanity’s core needs have been met with technology requiring fewer human inputs, our focus has been able to shift to creating value in less tangible, but perhaps not less meaningful ways. This has created a more dynamic and rich life experience than any of those previous farming generations could have imagined. So while it doesn’t seem like the work those farmers were accustomed to, humanity has been able to shift its attention to other types of work for the benefit of many.
I postulate that AI - as we know it now - is merely another technological tool that will allow new layers of abstraction. At one time bookkeepers had to write in books, now software automatically encodes accounting transactions as they’re made. At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.
These days we have fewer bookkeepers - most companies don’t need armies of clerks anymore. But now we have more data analysts who work to understand the information and make important decisions. In the future we may need fewer software coders, and in turn, there will be many more software projects that seek to solve new problems in new ways.
How do I know this? I think history shows us that innovations in technology always bring new problems to be solved. There is an endless reservoir of challenges to be worked on that previous generations didn’t have time to think about. We are going to free minds from tasks that can be automated, and many of those minds will move on to the next level of abstraction.
At the end of the day, I suspect we humans are biologically wired with a deep desire to produce rewarding and meaningful work, and many of the results of our abstracted work are hard to see and touch. Perhaps this is why I enjoy mowing my lawn so much, no matter how advanced robotic lawn mowers become.
> At one time software developers might spend days setting up the framework of a new project, and now an LLM can do the bulk of the work in minutes.
No and no. Have you ever coded anything?
Yeah, I have never spent “days” setting anything up. Anyone who can’t do it without spending “days” struggling with it is not reading the documentation.
You guys are getting documentation?
Well, if I’m not, then neither is an LLM.
But for most projects built with modern tooling, the documentation is fine, and they mostly have simple CLIs for scaffolding a new application.
I mean, if you use the code base you’re working in as context, it’ll probably learn the code base faster than you will. I’m not saying that’s a good strategy, though; I’d never personally do that.
The thing is, it really won’t. The context window isn’t large enough, especially for a decently sized application, and that seems to be a fundamental limitation. Make the context window too large, and the LLM gets massively off track very easily, because there’s too much in it to distract it.
And LLMs don’t remember anything. The next time you interact with it and put the whole codebase into its context window again, it won’t know what it did before, even if the last session was ten minutes ago. That’s why they so frequently create bloat.
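For a sense of scale, here’s a rough back-of-the-envelope sketch of the context-window point. The ~4 characters-per-token ratio and the 200,000-token window are illustrative assumptions, not any particular model’s numbers, and the “codebase” is synthetic:

```python
# Rough sketch: can a whole codebase fit in one LLM context window?
# Assumptions (illustrative, not tied to a specific model):
#   ~4 characters per token, 200,000-token context window.

CHARS_PER_TOKEN = 4
CONTEXT_WINDOW_TOKENS = 200_000

def estimate_tokens(text: str) -> int:
    """Crude token estimate from character count."""
    return len(text) // CHARS_PER_TOKEN

def fits_in_context(files: dict[str, str]) -> bool:
    """True if the entire codebase (and nothing else) fits in one window."""
    total = sum(estimate_tokens(src) for src in files.values())
    return total <= CONTEXT_WINDOW_TOKENS

# A "decently sized" app: 2,000 files averaging ~6 KB of source each.
codebase = {f"file_{i}.py": "x" * 6_000 for i in range(2_000)}

print(fits_in_context(codebase))  # prints False: ~3M tokens >> 200k window
```

Under these assumptions even a mid-sized codebase overshoots the window by an order of magnitude, which is the commenter’s point: the model only ever sees a slice of the project at a time.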
If your argument attacks my credibility, that’s fine, you don’t know me. We can find cases where developers use the technology and cases where they refuse.
Do you have anything substantive to add to the discussion about whether LLMs are anything more than a tool that lets workers abstract further, moving every profession they touch toward some combination of better, faster, cheaper, or easier?
You seem to be taking this a bit personally…
I’ve got something to add: in every practical application, AI has increased liabilities and produced a vastly inferior product, so it’s not even “just a tool that allows workers to further abstract,” because it’s less than that. Add to this the fact that AI companies can’t turn a profit, and it’s not better, not faster, not cheaper. It is certainly easier, though (to do a shit job).
CEO isn’t an actual job either, it’s just the 21st century’s title of nobility.
deleted by creator
Executive positions are probably the easiest to replace with AI.
- AI will listen to the employees
- They will try to be helpful by providing context and perspective based on information the employee might not have.
- They will accept being told they are wrong and update their advice.
- They will leave the employee to get the job done, trusting that the employee will get back to them if they need more help.
Don’t executives spend their day talking to AI and doing whatever they say?
Exactly. No need to add the executive toxicity filter.
Sam Altman is a huckster, not a technologist. As such, I don’t really care what he says about technology. His purpose has always been to transfer as much money as possible from investors into his own pocket before the bubble bursts. Anything else is incidental.
I am not entirely writing off LLMs, but very little of the discussion about them has been rational. They do some things fairly well and a lot of things quite poorly. It would be nice if we could just focus on the former.
Sam, I say this with all my heart…
Fuck you very kindly. I’m pretty sure what you do is not “a real job” and should be replaced by AI.
What do we need the mega rich for anyway? They aren’t creative and easily replaced with AI at this point.
> What do we need the mega rich for anyway?
Supposedly the creation of and investment in industries, then managing those businesses, which also supposedly provide employment for the thousands who make things for them. Except they’ll find ways to cut costs and maximize profit, like looking for cheaper labor while thinking about building the next megayacht to flex at Monte Carlo next summer.
If OpenAI gets wiped out, maybe it wasn’t even a “real company” to start with
The problem is that the capitalist investor class, by and large, determines what work will be done, what kinds of jobs there will be, and who will work those jobs. They are becoming increasingly out of touch with reality as their wealth and power grow, and they seem to be trying to mold the world into something along the lines of what Curtis Yarvin advocates, which most people would consider very dystopian.
This discussion is also ignoring the fact that currently, 95% of AI projects fail, and studies show that LLM use hurts the productivity of programmers. But yeah, there will almost surely be breakthroughs in the future that will produce more useful AI tech; nobody knows what the timeline for that is though.
To be fair, a lot of jobs in capitalist societies are indeed pointless. Some of them even actively do nothing but subtract value from society.
That said, people still need to make a living, and his piece-of-shit artificial insanity is only making it more difficult. How about you stop starving people to death and propose solutions to the problem?
There’s a book Bullshit Jobs that explores this phenomenon. Freakonomics also did an episode referring to the book, which I found interesting.
Bullshit Jobs: A Theory is a 2018 book by anthropologist David Graeber that postulates the existence of meaningless jobs and analyzes their societal harm. He contends that over half of societal work is pointless and becomes psychologically destructive when paired with a work ethic that associates work with self-worth.
The jobs did not start out that way, I guess these people have been tossed to the side and are not where the action currently is.
Yet they are still employed because the boss does not understand what they are doing and they might embellish their contributions etc.
There are so many people who do little, drink free coffee, talk to everyone, and are seen as very social and liked by everyone. They do fucking nothing; I know a handful of them.
They may seem pointless to those outside the organization, but as long as someone is willing to pay them, someone considers that they have value.
No one is “starving to death” but you’d have people just barely scraping by.
With many bureaucracies there’s plenty of practically valueless work going on.
Because some executive wants to brag about having over a hundred people under them. Because some process requires a sort of document that hasn’t been used in decades, but no one has the time to validate what does or does not matter anymore. Because of a lot of little nonsense reasons where the path of least resistance is to keep plugging away. Because if you are 99% sure something is a waste of time and you optimize it, there’s a 1% chance you’ll catch hell for a mistake and almost no chance you’ll get great recognition for the efficiency boost if it pans out.
This is the tricky nature of “value”, isn’t it?
Something can be both valuable and detrimental to humanity.
why capitalist societies specifically?
You know what, he actually wouldn’t be horrifically wrong if he were actually pushing for something there. Let’s say, hypothetically, that our jobs aren’t real work and it’s no big deal that they get replaced. The original intent of technological progress was that when the ratio of work needing to be done to people shifts, we’d work less for more pay. But no, we just capitalism it and say “labor is in high supply, so we need to cut its price until people can find a use for it.”
Can’t AI replace Sam Altman?
He doesn’t know Jobs was wiped out by cancer?
I’ve worked for big corporations that employ a lot of people. Every job has a metric showing how much money every single task they do creates. Believe me. They would never pay you if your tasks didn’t generate more money than they need to pay you to do the task.
> Every job has a metric showing how much money every single task they do creates.
Management accountants would love to do this. In practice you can only do this for low-level, commoditised roles.
Mopping a floor has a determined metric. I’m not kidding. Clean bathrooms are worth a determined dollar amount. It’s not simply sales or production; every task has a dollar amount, and the amount of time it takes to do the task has a dollar value determined and on paper.

Corporations know what every task is worth in dollars. Processing hazmats? Prevents the fine. Removing trash or pallets? Prevents lawsuits and workplace injuries. Level of light reflected from the floor? Has a multiplier effect on sales. Determined. Defined. Training salespeople on language choices has a massive sales effect. They know how much money every single task generates: fines or lawsuits prevented, multiplier effects on average ticket sales, training people to say “highest consumer-rated repair services” instead of “extended warranty.” These are defined dollar amounts on paper.

There is NO JOB in which you are paid to do something of no financial value. There are no unprofitable positions or tasks.
Your examples are all commoditized and measurable. Many roles are not this quantifiable.
> There is NO JOB in which you are paid to do something of no financial value.
Compliance, marketing, social outreach, branding.
Putting a $ amount on these and other similar roles is very difficult.
But I agree, if the value added is known to be zero or negative then usually no-one is paid to do it.
> There are no unprofitable positions or tasks.
Not when they are set up, but they can become unprofitable over time, and get overlooked.
Compliance is calculated with previous years’ costs in workman’s comp, hiring and training costs, and lawsuit and fine payouts. It’s one of the easiest tasks to break down into dollar amounts. If we paid $8k at every site and one site paid $2k because they didn’t get fined for electrical outlets out of code, then one task in compliance saved $6k. I’m not theorising with you. I have seen the Excel spreadsheets; this isn’t me assuming they exist. This is quantified. This is specified on paper, man. What don’t you get here?

Marketing is VERY easy to assign a dollar amount to. We made $100k one quarter with $1k paid in marketing; we made $200k the next quarter with $2k paid in marketing. Very easy to determine. You want to wake everyone up in the morning meeting? Tell them you want to pull money out of advertising and redirect it to payroll. They’ll all spit their coffee out.

Social media is also very easy to quantify. You just compare metrics across all quarters and pair them with social media follows. This is a huge metric that a lot of business decisions are made on; it isn’t amorphous just because you’re unaware of how important it is to business. Branding also has hard values assigned, and supporting or changing branding is very much a numbers game. Why else would companies be willing to buy the name of another company even when they don’t need its production or staff?

I don’t think you grasp that every single task someone does for a corporation is matched to a dollar figure. Seriously. If I could get labor-class people to drop one myth, it would be that their labor has next to no value. They know what you’re worth, and they know how much they aren’t paying you out of the value you produce.
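To make the compliance arithmetic in that comment concrete, here is a minimal sketch using the comment’s own hypothetical figures (an $8k baseline, one site at $2k). The site names and the baseline-comparison approach are illustrative assumptions, not a real accounting method:

```python
# Sketch of the comment's compliance arithmetic: compare each site's
# compliance cost against the baseline paid elsewhere.
# All figures are the comment's hypothetical numbers, not real data.

BASELINE_COST = 8_000  # what "every site" paid

site_costs = {"site_a": 8_000, "site_b": 8_000, "site_c": 2_000}

def savings_vs_baseline(costs: dict[str, int], baseline: int) -> dict[str, int]:
    """Dollar amount each site came in under (or over) the baseline."""
    return {site: baseline - cost for site, cost in costs.items()}

print(savings_vs_baseline(site_costs, BASELINE_COST))
# prints {'site_a': 0, 'site_b': 0, 'site_c': 6000}
# i.e. site_c's avoided fines show up as a $6k saving against the baseline
```

Note this only measures spend against a baseline; attributing the difference to one specific task (as the comment does) is an extra inference the spreadsheet alone can’t prove.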
> Compliance is calculated with previous years’ costs
No, that’s just what you spent last year.
> Marketing is VERY easy to assign a dollar amount to.
It’s easy to see how much it costs. It’s very hard to determine exactly how much additional revenue any particular campaign creates.
> They know what you’re worth
Pick anyone at the C-Level. How much revenue do they bring in? What’s the ROI of a CFO?
This is part of the reason I don’t work for big corporations… yuck
That would actually be true if companies were run by the people doing the work.
They’re not real until your bullshit factory falls apart without them, fucktard
I was at the Canton Fair last week, which is a trade show in China where manufacturers display some of their latest technology.
There was a robotics display hall where they were showing off how lots of factories, kitchens, and other labor-based jobs can be automated with technology.

This doesn’t really have a lot to do with AI or LLMs, but the field of robotics is advancing fast and a lot of basic work that humans had to do in the past won’t be needed as much in the future.
Yeah… but rich people don’t want to eat food prepared cheaply and efficiently by robots. They want $10k-a-plate bullshit, not peasant food. They will, however, gladly use robots for manual labor like construction and soldiering.