I think it’ll end up like any industry with machine made options. There will be a spectrum of products from 100% human made to majority machine made.
There will be a few bespoke artists doing interesting things for the wealthy and the passionate. But, for most of society, the mass produced stuff is fine.
Take clothes: how many of yours were hand made vs machine made? Cobblers are hand making shoes the world over; were yours hand made? I have some hand knitted wool stuff (because I’m passionate about wool) but my Levi’s are machine made. Shoes, motorcycle gear…
Furniture. There are cabinet makers the world over doing beautiful pieces of work, but I got most of my stuff from IKEA. How about you?
It’ll end up like any other industry with machine made options. The bubble will burst, don’t get me wrong, but after the .com bubble burst we still had the internet.
One of the top posts of fuckai right now is a bottle of olive oil. Now, I’m not yucking their yum; I just have different things I wanna do with my day than stare at someone’s olive oil bottle. Not better. I’m glad they have the free time and mental effort to do that; pondering mass produced labels is their jam, and I support it. I just wanna do different things. I expect the world is going to want to do different things too.
That we will all be forced to adopt it whether we like it or not, in scummy ways, and those who don’t will be unfairly seen as “boomers,” when in reality they are the last people to genuinely do their work with love and care.
- machine learning models will continue to improve their output somewhat, but gains will be incremental and the intrinsic problems with ml-derived content (e.g. hallucinations, context window limitations, long-term coherency) will remain
- open source models will catch up with commercial ones
- the smaller ml companies (like openai and anthropic) will be absorbed, probably by Microsoft and Amazon
- The increasing cost of hardware and energy will force companies to raise prices for ml subscriptions and eventually lock ml features behind paywalls
- Computer parts will remain expensive for a long time
- Programmers will collectively spend the next decade wrestling with the consequences of filling their codebases with millions of lines of ai generated code
- Google images will never fully recover
You forgot the most important part: it will be infected with ads. Asked about what the best dish soap is? Why it obviously is either Fairy or Dawn, depending on which brand paid more that week.
Impossible to make predictions that far away. We could have AI models that are barely better than our current ones or we might be extinct with our AGI system already spreading out into the universe.
“It’s Difficult to Make Predictions, Especially About the Future”
~ apocryphal
Narrow AI will get better, even faster than normal because of the research that big AI companies are doing now, but attempts at more general AI will stop being profitable.
General “AI” is not profitable at all, even rn. Raising money is not making profit.
I predict software engineers won’t go away, but coders will go away.
That’s already a thing in some areas of programming. Block programming, where you just drag and connect blocks, is very possible, especially in game development.
Judging by the quality of software, I’d say hackers went away at least 15 years ago.
LLMs are shit at doing large code changes, and fundamentally always will be, because they fundamentally can’t reason. LLMs are good text completers, and that’s their place in the IDE.
My prediction is that most well run organizations are going to push back against coding agents soon. Look at the reports that even Amazon is now demanding senior engineers review AI code changes and take responsibility for them, which doesn’t scale now, and will scale even less in the future if we train fewer coders.
People hate LLMs because of their unreliability, and they are right. But AI is a much more vast field.
As soon as we have more reliable, causal and general intelligence, the opinions will change.
I personally believe that humans have no clue how limited our brain power is. So much so that there will be no AGIs. Only ASIs. Same thing that happened with chess bots.
LLMs are a dead end, and the massive amounts of money being wasted on them will make people too scared to invest in other forms of AI.
So we are currently at a local maximum that we won’t overcome in 10 years. It will take much longer before we try a different approach to create “AGI,” and the money wasted on LLMs will slow other forms of AI research, leaving us stagnating for >10 years.
I’m not convinced that investors would know the difference between a company trying to improve llms vs taking a new approach. So I don’t think it will stifle investment in other forms of AI research.
I also don’t think they are a dead end overall. They sure aren’t likely to get to agi, but you don’t need agi to be useful.
You have to convince investors why your AI research won’t hit a wall like LLMs are now - they’ve poisoned the term “AI”
They’re a dead end, insofar as they already do all they’ll ever be able to; if you can find a use for them at their current level, great, but it does not look likely they will be able to do more than they currently can.
I think that all depends on what else there is to invest in. In general, as terrible as AI is, it’s carrying the stock market; investors need something else to turn to before they’ll divert away from AI.
LLMs will go the way of NFTs. No AGI will exist yet.
Well, assuming some AGI breakthrough doesn’t happen (which in my opinion would require a vastly different approach than LLMs), we will see more of this AI swarm type stuff. Essentially you end up with a bunch of specialized AIs, plus some AI coordinators. The AI that we talk to will just farm the work out to other AIs, including ones specialized in verifying the work that gets done.
Most people, pre-AI, did work that was say 60% implementation, 30% figuring out what needs to be done, and 10% verifying what was done. That will shift to 15% implementation, 50% requirements gathering, and 35% verification.
Obviously those numbers are just to show the shift, not intended to be an accurate representation of the current way our work is divided. Overall, if you give AI a way to verify what it is doing, and let it iterate, it is far more useful than just telling it to do a thing or asking it a question.
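The coordinator/verifier loop described above can be sketched in a few lines. This is a hypothetical toy, not any real framework: all names (`coordinator`, `workers`, `verifier`) are made up for illustration, and the "AIs" are stand-in functions.

```python
# Toy sketch of the "AI swarm" pattern: a coordinator farms a task out
# to a specialized worker, a separate verifier checks the result, and
# the loop iterates with feedback until the check passes. All names
# here are hypothetical.

def coordinator(task, workers, verifier, max_rounds=3):
    """Route a task to its specialist, verify the output, retry on failure."""
    worker = workers[task["kind"]]            # pick the specialized "AI"
    feedback = None
    for _ in range(max_rounds):
        result = worker(task, feedback)       # attempt (or re-attempt) the task
        ok, feedback = verifier(task, result) # independent verification step
        if ok:
            return result
    return None                               # give up after max_rounds

# Stand-ins: a "math worker" and a verifier that checks its arithmetic.
workers = {"math": lambda task, feedback: task["a"] + task["b"]}
verifier = lambda task, result: (result == task["a"] + task["b"], "recheck the sum")

print(coordinator({"kind": "math", "a": 2, "b": 3}, workers, verifier))  # 5
```

The point of the structure is the last sentence above: the verification-and-iterate loop, not the individual worker, is what makes the output trustworthy.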
Someone somewhere, is going to decide “AI is great to lead scientific studies!” just like that terrible idea someone had about bringing AI into the surgery room.
And it is already happening, but it will become rampant: AI revisionists rewriting history for propaganda purposes. People are already feeding historical figures into AI to recreate them, down to the voice, and make them say things they never would have said originally or at all, just to push a political message.
This shit will be all over campaigns in any election.
Most interaction between people and computers will move from keyboard and mouse to spoken word.
I have RSI and already do this. Apart from my condition, it’s a game changer.
I’m so much faster using assistive tech than I ever was with a keyboard and shortcuts (and I’m a fast mover).
I agree. Once people taste the speed of talking to the pc, the keyboard will start to fall away.
For many at least.
Yes, and right now they are limited in what control they have over the OS or other apps. Input is set up for keyboard and mouse. Just like touch screens changed inputs, the other software will be redesigned over time to accommodate the new preference, leading to more ease and more integration.
LLMs will be a standard part of software tooling like IDEs, and people won’t talk about them much anymore.
LLMs and image/video generation will be a standard part of adult entertainment.
Improvement stagnates.
Venture capital availability reduces.
Mag 7 try to monetise to continue development.
Business adoption is tepid as long term heavy use reduces skills and productivity.
Some VC fund learns from a credible whistleblower that generative AI is not a pathway to AGI, revalues their portfolio, and enters administration.
The ensuing fallout triggers a global depression.
It will worm its way into more and more everyday interactions, and the bulk of society will accept it like they did cameras everywhere, smart appliances, the digital tracking device in their pocket, and screens in their cars instead of physical controls. Avoiding it will become a lifestyle choice that takes effort, and the secondary market for decades old digital/analog technology will continue to grow.






