AI Summit Recap: Where to Invest in AI

Over the last two years, it’s felt like investors have poured money into any startup with dot-ai in its domain name. In 2023 alone, artificial intelligence funding topped $189 billion. But with hundreds of new AI startups launching each year, it can be difficult for investors to distinguish a true unicorn from a flash in the pan.
As part of The Information’s San Francisco AI Summit, Laura Mandaro, managing editor of news and talent at The Information, sat down with three AI venture veterans to discuss the next big opportunities for AI investors:
- Aaref Hilaly, partner, Bain Capital Ventures
- Guillermo Rauch, founder and CEO, Vercel
- Anima Anandkumar, Bren Professor of Computing and Mathematical Sciences, California Institute of Technology
Beyond Foundation Models
Some of the biggest moves of the last few years have centered on foundation models for generative AI, from Microsoft’s $10 billion funding of OpenAI to Amazon’s multibillion-dollar investment in Anthropic. Despite the headline-grabbing valuations of the companies behind the most popular large language models, Aaref Hilaly believes the market is reaching a saturation point.
“At this point, I think it’s hard to think there’ll be another foundation model company taking the same approach as the three or four that [already] exist today,” he said. “To do the same thing over and over, I don’t think it’s fundable today.”
Guillermo Rauch agreed, adding that he sees opportunity in startups creating AI-native applications that sit on top of popular open source AI models like Llama or that use proprietary data to fine-tune existing models.
“The IP becomes your tests, your evaluations and what you’ve built on top of the model to make these applications better over time,” Rauch said.
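Rauch’s point about tests and evaluations as the defensible layer can be made concrete with a toy example. The Python sketch below shows a minimal evaluation harness that scores any model hidden behind a simple `call_model` interface; the `EvalCase` structure, the `stub_model` and the substring check are hypothetical simplifications for illustration, not a description of Vercel’s tooling.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class EvalCase:
    prompt: str
    expected_substring: str  # simple pass/fail check; production evals are richer


def run_evals(call_model: Callable[[str], str], cases: list[EvalCase]) -> float:
    """Return the fraction of cases whose output contains the expected text."""
    passed = sum(
        1 for case in cases
        if case.expected_substring.lower() in call_model(case.prompt).lower()
    )
    return passed / len(cases)


if __name__ == "__main__":
    # Stand-in for a real model call (Llama behind an API, a fine-tuned model, etc.).
    def stub_model(prompt: str) -> str:
        return "Paris is the capital of France."

    cases = [EvalCase("What is the capital of France?", "Paris")]
    print(f"Pass rate: {run_evals(stub_model, cases):.0%}")
```

The value accumulates in the test cases and scoring logic, which stay the same even when the underlying model is swapped out.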
Proving AI’s Value
The investors pouring billions into AI are increasingly anxious to see evidence of the technology’s much-touted cost-saving and revenue-generating potential. When asked by Mandaro about these proof points, Hilaly said AI is already proving itself when it comes to getting more value out of employees, whether by automating more-routine aspects of customer service, boosting lead generation for sales teams or generating code for developers.
“Engineers are expensive, and you can [use generative AI] to make them way more [productive],” he said. Rauch said Vercel is already doing this: using AI to automate software upgrades that free up hours of developer time.
AI in the Physical World
Because AI is a digital product, its most wide-ranging applications have also been digital. Injecting AI into the physical sphere, said Anima Anandkumar, is the next big frontier for the technology.
“The concept of digital twins needs an upgrade to be fully AI based—not just in the world of what you see, but also the world that is hidden from us,” she said, pointing to mechanical wear and tear or the inner workings of nuclear reactors as examples.
As AI becomes more integrated into the physical world, Anandkumar anticipates it will be used to speed up drug discovery, create cutting-edge medical devices and accelerate the development of new aircraft. To accomplish these feats, she said, AI models would need to expand their understanding “beyond text and images.”
“We need to think of quantum-level simulations for drug discovery, or look at how to simulate plasma or do solutions for energy transitions,” she said.
To illustrate what’s possible, Anandkumar pointed to a recent example at Caltech, where a team used AI to design and print a catheter that achieved a 100-fold reduction in bacterial contamination.
“The AI understood fluid dynamics and how bacteria swims upstream into the human body. Based on that, we could create these shapes within the tube that prevent the bacteria from swimming upstream,” she said. “The more testing and validation we do in the digital world, [the bigger] the cost savings to create physical products.”
Optimizing AI Usage
AI demands a lot of compute power, a limited resource that can drive up costs for enterprises that aren’t mindful of how they use it across the organization. As business leaders keep a closer eye on their companies’ inference costs, Rauch said, opportunities are opening up for startups focused on streamlining AI performance.
“There are so many technologies coming out to optimize your AI usage,” he said, pointing to Anthropic’s recently launched prompt caching and Gemini’s context-caching features. He believes compute costs have become less of a constraint because fewer people are training models themselves and more are relying on pre-trained models from third parties.
“Because the costs of inference are going down so dramatically, I think it is the perfect time to be investing in these higher levels of abstractions,” he said.
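For readers curious what the prompt caching Rauch mentions looks like in practice, here is a rough sketch using Anthropic’s Python SDK. It marks a long, shared system prompt as cacheable so repeated requests can reuse it; the model ID and document text are placeholders, and the exact parameters (in particular the `cache_control` block) reflect the API as publicly documented around the feature’s launch and may have since changed.

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Placeholder for a large, reusable context (docs, a codebase, a policy manual).
LONG_REFERENCE_DOCUMENT = "...thousands of tokens of shared reference material..."

response = client.messages.create(
    model="claude-3-5-sonnet-20241022",  # example model ID; substitute your own
    max_tokens=512,
    system=[
        {
            "type": "text",
            "text": LONG_REFERENCE_DOCUMENT,
            # Marks this block as cacheable so later calls that reuse the same
            # prefix are cheaper and faster to serve.
            "cache_control": {"type": "ephemeral"},
        }
    ],
    messages=[{"role": "user", "content": "Summarize the key risks in the document."}],
)

print(response.content[0].text)
```

Gemini’s context caching takes a broadly similar approach, letting developers store a large shared context once and reference it across subsequent requests rather than resending it each time.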