
Local is enough for most users as long as they're willing to accept a non-realtime response - which is a real limitation (especially for personal agentic use) but not a very significant one. The hardware is not that expensive, and a single user's needs aren't going to saturate a state-of-the-art AI datacenter rack or anything like that. Not even for heavy agentic workloads.



You rent your broadband internet. It's not a foreign concept that we can't own all the infra.

I don't know why we can't just get over the local compute thing and instead build open infra and models in the cloud. That's literally the only way we'll be able to keep pace with hyperscalers.

Local is not going to benefit 99% of use cases. It's a silly toy.

If we build open infra for cloud-based provisioning and inference, we could build a future we still have some ownership in. We'd be able to fine-tune large models for lots of purposes, and we wouldn't be locked into major vendors.



