
What is a good GPU to put in a home server that can run the TTS / STT and the local LLM required to make this shine?

A 3090 is too expensive and power-hungry. Maybe a 3060 12 GB? Is there anything in the "workstation" lineup that is more efficient, especially since I don't need the video outs?
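
For deciding whether 12 GB is enough, here is a rough back-of-envelope VRAM estimator (a sketch, not a benchmark): it assumes a GGUF-style ~4-bit quantized model plus an fp16 KV cache, and the layer/hidden-dim defaults are illustrative guesses for a ~7B-class model, so treat the output as a ballpark only.

    # Rough VRAM estimate for a quantized local LLM, to sanity-check whether a
    # 12 GB card (e.g. a 3060) leaves headroom for TTS/STT models alongside it.
    # All figures are approximations; real usage depends on runtime and context size.

    def estimate_vram_gb(params_billion: float,
                         bits_per_weight: float = 4.5,   # roughly a Q4-class quant
                         context_tokens: int = 4096,
                         layers: int = 32,                # illustrative 7B-ish defaults
                         hidden_dim: int = 4096,
                         overhead_gb: float = 1.0) -> float:
        """Return an approximate VRAM footprint in GB."""
        weights_gb = params_billion * 1e9 * bits_per_weight / 8 / 1e9
        # KV cache: 2 (K and V) * layers * hidden_dim * 2 bytes (fp16) per token
        kv_cache_gb = 2 * layers * hidden_dim * 2 * context_tokens / 1e9
        return weights_gb + kv_cache_gb + overhead_gb

    if __name__ == "__main__":
        for size in (7, 8, 13):
            print(f"~{size}B model @ ~4-bit: {estimate_vram_gb(size):.1f} GB")

By this rough math a 7B-8B model at ~4-bit comes in around 7-8 GB, which would leave a few GB on a 12 GB card for a Whisper-class STT model and a small TTS model, while 13B starts to get tight.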
