A Pocket LLM Server, Just Like Pocket WiFi
Hey HN,
If there were pocket-sized hardware that hosts large open-source LLMs and that you could connect to offline, wouldn't that be useful?
The benefits:
- You can run large open-source LLMs without taxing your PC's or smartphone's compute
- You keep your data private
- You can use a high-performance LLM offline
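To make the idea concrete, here's a minimal sketch of how a client might talk to such a device, assuming it exposes an OpenAI-compatible HTTP endpoint on the local network (as llama.cpp's server and Ollama already do). The hostname `pocket.local` and port are hypothetical placeholders.

```python
# Sketch: querying a hypothetical pocket LLM server over the local network.
# Assumes an OpenAI-compatible /v1/chat/completions endpoint; the device
# address "pocket.local:8080" is made up for illustration.
import json
import urllib.request

POCKET_URL = "http://pocket.local:8080/v1/chat/completions"

def build_request(prompt: str) -> urllib.request.Request:
    """Build a POST request carrying a single-turn chat completion."""
    payload = {
        "model": "local",  # placeholder; the device serves whatever model it has loaded
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        POCKET_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_request("Summarize this file for me.")
print(req.get_full_url())
# To actually send it (only works with the device on the network):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because no traffic ever leaves the local network, the privacy benefit above falls out of the architecture rather than a policy promise.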
I think it's a very interesting approach. It could serve a niche group: LLM power users who are often mobile and value privacy.
Any thoughts on how small this hardware could eventually become?
Any external device would need to be cheaper and/or perform better than running the model onboard a current-gen iPhone/Android device. Seems like a limited market. What would the benefits be over a spec'd-out mobile PC doing the same thing?
Not necessarily. A dedicated device gives you a separate battery, freedom from the OS killing inference under memory pressure (which will be a real problem), the ability to schedule tasks (a la Grok), more latitude to let the LLM actually manage files (something you can't get on iOS), and active cooling. Those are all very good reasons to have a dedicated piece of hardware.