Is there a way to host it yourself on, say, a decently specced MacBook Pro, e.g. through Hugging Face https://huggingface.co/deepseek-ai/DeepSeek-R1, without any information leaving your computer?
Running it on a MacBook Pro entirely locally is possible via Ollama. Even the full model (~680B parameters) can apparently be run distributed across multiple M2 Ultras: https://x.com/awnihannun/status/1881412271236346233
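For the single-machine case, the Ollama workflow is just a pull and a run. A minimal sketch, assuming the `deepseek-r1:7b` distilled tag from Ollama's model library (the full 671B model won't fit on a laptop), guarded so it's a no-op if Ollama isn't installed:

```shell
# Hedged sketch: run a distilled DeepSeek-R1 locally via Ollama.
# The model tag is an assumption from Ollama's library naming; check `ollama list`.
if command -v ollama >/dev/null 2>&1; then
  ollama pull deepseek-r1:7b                        # download the distilled 7B weights
  ollama run deepseek-r1:7b "Why is the sky blue?"  # inference stays on your machine
else
  echo "ollama not installed; see https://ollama.com"
fi
```

Nothing leaves the machine during inference; Ollama only hits the network for the initial weight download.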
This is not comparable to the DeepSeek-R1 language model, which has over 600 billion parameters. This one is for image generation, is 7B parameters, and will run locally on most recent computers.
For the phone app, does it send your prompts and information to China?
OpenRouter says that if you use them, none of their providers send data to China - but what about other third parties? https://x.com/OpenRouterAI/status/1883701716971028878