Comment by tough 15 days ago
Doesn't Google have TPUs that make inference of their own models much more profitable than, say, having to rent Nvidia cards?
Doesn't OpenAI depend mostly on its relationship/partnership with Microsoft to get GPUs to run inference on?
Thanks for the links, interesting book!
Yes. Google is probably gonna win the LLM game, tbh. They had a massive head start with TPUs, which are very energy-efficient compared to Nvidia cards.