Comment by pm90
I agree with this take. Their intense focus on generative AI seems a bit short-sighted, tbh. Research thrives when people have the freedom to explore, but what they're doing now seems to be focusing everyone on LLMs, and those who aren't comfortable with that are asked to leave (e.g. the programming-language experts who left or were fired).
So I don’t doubt they’ve done well with LLMs, but in research what matters is long-term bets. The only encouraging thing I can glean is that they’re still investing in quantum computing (although that too is a bit hype-y).
Disclosure: I work @ goog, opinions my own
There’s absolutely been a lot of focus on LLMs, but they simply work very well for a lot of things.
That said, Carbon (a C++ successor) is an active experimental open-source project. Fuchsia (an operating system, also open source) is shipping in consumer products today. Non-LLM AI research keeps being delivered at a level I’m not sure any other frontier lab matches. There’s hardware too (TPUs, OpenTitan, etc.). Beam is mind-blowing and, IMO, such a sleeper that I can’t wait for people to try it.
So whilst LLMs certainly take the limelight, Google is still working on new languages, operating systems, ground-up silicon, etc. Few (if any?) companies are doing that.