Ask HN: Does better hardware mean OpenAI, Anthropic, etc. are doomed in the future?

5 points by kart23 3 days ago

This is something I don't understand: how will all these AI-as-a-service companies survive in the future when hardware gets better and people are able to run LLMs locally? Of course, right now the rent-vs.-buy equation is heavily tilted toward rent, but eventually I could see people buying a desktop they keep at home and running all their personal inference on that one machine, or even joining inference pools that distribute load among many people.

Do you think this is possible, and what are these companies' plans in that event?

fogzen 2 days ago

They won't survive. AI-as-a-service for frontier models will be relegated to military and research use, if that. We're already at diminishing returns on model improvements; the latest gains are in the surrounding architecture, harnesses, agent systems, etc. Consumer hardware will be running the equivalent of ChatGPT 5.2, and IMO most interaction with personal computing devices will happen via natural-language LLM personal assistants.

Maybe it takes a bit longer than 5 years, but that's where we're going. Already, the only thing stopping you from interacting with everything via a personal assistant isn't really LLM capability but the lack of tooling.

  • ediblelegible 5 hours ago

    Agree 100% with what you're saying about capabilities, but I want to push back on the conclusion that these trillion-dollar companies will let themselves just go away.

    They are already moving into enterprise, government, and product software. I think they'll find a way to make money even if access to their models is no longer a huge moat.

farseer 3 days ago

The frontier of model quality also keeps shifting and will remain ahead of local models unless we hit some dead-end limitation in the algorithms themselves: a ceiling, so to speak, on how good LLMs can get before the law of diminishing returns starts to apply.

  • kart23 2 days ago

    I don't understand. All models are local models; they're just not running on your machine.

verdverm 3 days ago

1. Is it cheaper for me to buy hardware and electricity than to call an API? (doesn't seem like it right now)

2. The best models are still worth it; it's unclear when this changes

3. The average person doesn't have the skill to do this; they're afraid to run even simpler things

  • kart23 2 days ago

    Definitely not right now. But I believe at some point the progress of models will plateau while hardware continues to get better, and then it could actually be cheaper, especially if you have solar (see the rough sketch at the end of this comment).

    3. This is like saying the average person doesn't have the skill to run GTA under Wine on their Linux box. Gaming consoles exist.
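    For what it's worth, here's a rough back-of-envelope sketch of that rent-vs.-buy break-even. Every number in it is a made-up placeholder, not a real quote from any vendor or power company; plug in your own:

      # Back-of-envelope rent-vs-buy comparison for local LLM inference.
      # Every number below is an illustrative assumption, not a quoted price.
      HARDWARE_COST = 3000.0     # one-time cost of a local inference box (assumed)
      POWER_DRAW_KW = 0.4        # average draw while serving, in kW (assumed)
      HOURS_PER_MONTH = 200      # hours of active inference per month (assumed)
      ELECTRICITY_RATE = 0.15    # $ per kWh (assumed; near zero with spare solar)
      API_PRICE_PER_MTOK = 10.0  # blended $ per million tokens via an API (assumed)
      TOKENS_PER_MONTH = 30e6    # monthly token volume (assumed)

      # Monthly cost of running locally is just electricity once the box is paid for.
      local_power_cost = POWER_DRAW_KW * HOURS_PER_MONTH * ELECTRICITY_RATE
      api_cost = TOKENS_PER_MONTH / 1e6 * API_PRICE_PER_MTOK
      monthly_savings = api_cost - local_power_cost

      # Months until the hardware purchase is recouped (infinite if the API is cheaper).
      breakeven_months = HARDWARE_COST / monthly_savings if monthly_savings > 0 else float("inf")
      print(f"local power: ${local_power_cost:.2f}/mo vs API: ${api_cost:.2f}/mo")
      print(f"hardware pays for itself in ~{breakeven_months:.1f} months")

    With these made-up numbers the box pays for itself in roughly ten months; drop the token volume or raise the hardware price and renting stays ahead indefinitely.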

throwaway5465 2 days ago

Young people have had even the concept of a filesystem conditioned out of them; files just live in a 'folder' inside an app.

Local sovereignty isn't a pressing need for most users.

freakynit 3 days ago

I do believe this is gonna get commoditized like the internet has. Hardware obviously keeps getting better and cheaper as time goes by. Software in this case is already free/open-weights.

The moats these companies might end up having in the near future:

1. Government and enterprise contracts;

2. Even better private models not released to the public and only accessible through long-term/exclusive contracts;

3. Gatekeeping access to millions of their users, especially the non-technical ones, and charging a premium for it;

4. Becoming more and more like full-stack OSes to build on top of, by providing ready-made foundational layers like knowledge, memory, search/research, sandboxes, deployments, etc.;

5. Data/network effects from large-scale usage and feedback loops.

...