The AI talent wars are just getting started

For my last issue of the year, I’m focusing on the AI talent war, which is a theme I’ve been covering since this newsletter launched almost two years ago. And keep reading for the latest from inside Google and Meta this week.

But first, I need your questions for a mailbag I’m planning as my first issue of 2025. You can submit them via this form or leave them in the comments.

“It’s like looking for LeBron James”

This week, Databricks announced the largest known funding round for any private tech company in history. The AI enterprise firm is in the final stretch of raising $10 billion, almost all of which will go toward buying back vested employee stock.

How companies approach compensation is often undercovered in the tech industry, even though those strategies play a crucial role in determining which companies get ahead faster. Nowhere is this dynamic more intense than in the war for AI talent, as I’ve covered before.

To better understand what’s driving the state of play going into 2025, this week I spoke with Naveen Rao, VP of AI at Databricks. Rao is one of my favorite people to talk to about the AI industry. He’s deeply technical but also business-minded, having successfully sold multiple startups. His last company, MosaicML, sold to Databricks for $1.3 billion in 2023. Now, he oversees the AI products for Databricks and is closely involved with its recruiting efforts for top talent.

Our conversation below touches on the logic behind Databricks’s massive funding round, what specific AI talent remains scarce, why he thinks AGI is not imminent, and more.

The following conversation has been edited for length and clarity:

Why is this round mostly to help employees sell stock? Because $10 billion is a lot. You can do a lot with that.

The company is a little over 11 years old. There are employees who have been here for a long time. This is a way to get them liquidity.

Most people don’t understand that this is not going onto the balance sheet of Databricks. This is largely going to provide liquidity for past employees, [and] liquidity going forward for current and new employees. It ends up being neutral on dilution because these are shares that already exist. They’ve been allocated to employees, and this allows them to sell those shares to cover the associated taxes.

How much of the rapid increases in AI company valuations have to do with the talent war?

It’s real. The key thing here is that it’s not just pure AI talent — people who come up with the next big thing, the next big paper. We are definitely trying to hire those people. There is an entire infrastructure of software and cloud that needs to be built to support those things. When you build a model and you want to scale it, that actually is not AI talent, per se. It’s infrastructure talent. 

The perceived bubble that we’re in around AI has created an environment where all of that talent is getting recruited heavily. We need to stay competitive.

Who is being the most aggressive with setting market rates for AI talent?

OpenAI is certainly there. Anthropic. Amazon. Google. Meta. xAI. Microsoft. We’re in constant competition with all of these companies.

Would you put the number of researchers who can build a new frontier model under 1,000?

Yeah. That’s why the talent war is so hot. The leverage that a researcher has in an organization is unprecedented. One researcher’s ideas can completely change the product. That’s kind of new. In semiconductors, people who came up with a new transistor architecture had that kind of leverage. 

That’s why these researchers are so sought after. Somebody who comes up with the next big idea and the next big unlock can have a massive influence on the ability of a company to win.

Do you see that talent pool expanding in the near future or is it going to stay constrained? 

I see some aspects of the pool expanding. Being able to build the appropriate infrastructure and manage it, those roles are expanding. The top-tier researcher side is the hard part. It’s like looking for LeBron James. There are just not very many humans who are capable of that. 

I would say the Inflection-style acquisitions were largely driven by this kind of mentality. You have these concentrations of top-tier talent in these startups, and it sounds ridiculous how much people pay. But it’s not ridiculous. I think that’s why you see Google hiring back Noam Shazeer. It’s very hard to find another Noam Shazeer.

A guy from Nervana, the previous company I started, is arguably the best GPU programmer in the world. He’s at OpenAI now. Every inference that happens on an OpenAI model is running through his code. You start computing the downstream cost and it’s like, “Holy shit, this one guy saved us $4 billion.”

What’s the edge you have when you’re trying to hire a researcher to Databricks?

You start to see some selection bias of different candidates. Some are AGI or bust, and that’s okay. It’s a great motivation for some of the smartest people out there. We think we’re going to get to AGI through building products. When people use technology, it gets better. That’s part of our pitch. 

AI is in a massive growth phase, but it has also hit peak hype and is on the way down the Gartner hype curve. I think we’re on that downward slope right now, whereas Databricks has established a very strong business. That’s very attractive to some because I don’t think we’re as susceptible to the hype.

Do the researchers you talk to really believe that AGI is right around the corner? Is there any consensus of when it’s coming? 

Honestly, there’s not a great consensus. I’ve been in this field for a very long time and I’ve been pretty vocal in saying that it’s not right around the corner. The large language model is a great piece of technology. It has massive amounts of economic uplift and efficiencies that can be gained by building great products around it. But it’s not the spirit of what we used to call AGI, which was human or even animal-like intelligence.

These things are not creating magical intelligence. They’re able to slice up the space that we’re calling facts and patterns more easily. It’s not the same as building a causal learner. They don’t really understand how the world works. 

You may have seen Ilya Sutskever’s talk. We’re all kind of groping in the dark. Scaling was a big unlock. It was natural for a lot of people to feel enthusiastic about that. It turns out that we weren’t solving the right problem.

Is the new idea that’s going to get to AGI the test-time compute or “reasoning” approach?

No. I think it’s going to be an important thing for performance. We can improve the quality of answers, probably reduce the probability of hallucinations, and increase the probability of having responses that are grounded in fact. It’s definitely a positive for the field. But is it going to solve the fundamental problem of the spirit of AGI? I don’t believe so. I’m happy to be wrong, too.

Do you agree with the sentiment that there’s a lot of room to build more good products with existing models, since they are so capable but still constrained by compute and access?

Yeah. Meta started years later than OpenAI and Anthropic and they basically caught up, and xAI caught up extremely fast. I think it’s because the rate of improvement has essentially stopped.

Nilay Patel compares the AI model race to early Bluetooth. Everyone keeps saying there’s a fancier Bluetooth, but my phone still won’t connect.

You see this with every product cycle. The first few versions of the iPhone were drastically better than the previous versions. Now, I can’t tell the difference between a three-year-old phone and a new phone. 

I think that’s what we see here. How we utilize these LLMs and the distribution that has been built into them to solve business problems is the next frontier. 

Elsewhere

  • Google gets flatter. CEO Sundar Pichai told employees this week that the company’s drip-drip series of layoffs has reduced the number of managers, directors, and VPs by 10 percent, according to Business Insider and multiple employees I spoke with who also heard the remarks. Relatedly, Pichai took the opportunity to add “being scrappy” as a character trait to the internal definition of “Googleyness.” (Yes, that’s a real thing.) He demurred on the most upvoted employee question about whether layoffs will continue, though I’m told he did note that there will be “overall” headcount growth next year.
  • Meta cuts a perk. File this one under “sad violin”: I’m told that, starting in early January, Meta will stop offering free EV charging at its Bay Area campuses. Keep your heads held high, Metamates.

What else you should know about

  • OpenAI teased its next o3 “reasoning” model (yes, “o2” was skipped) with impressive evals.
  • TikTok convinced the Supreme Court to hear its case just before its US ban is set to take effect. Meanwhile, CEO Shou Chew met with Donald Trump at Mar-a-Lago to (I’m assuming) get a sense of what his other options are should TikTok lose its case.
  • More tech-meets-Mar-a-Lago news: Elon Musk inserted himself into the meeting between Jeff Bezos and Trump. Robinhood donated $2 million to Trump’s inauguration. And SoftBank CEO Masayoshi Son pledged to invest $100 billion in AI tech in the US, which happens to be the same number he has floated for a chip venture to compete with Nvidia.
  • Apple complained about Meta pressuring the EU to make iOS more compatible with third-party hardware. Anyone who has synced photos from the Ray-Ban Meta glasses to an iPhone will understand why this is a battle that is very important for Meta to win, especially as it gears up to release its own pair of AR glasses with a controller wristband next year. 
  • Amazon is delaying its return-to-office mandate in some cities because it doesn’t have enough office space.
  • Perplexity, which is projected to make $127 million in revenue next year, recently raised $500 million at a valuation of $9 billion. It also acquired another AI startup called Carbon to help it hook into other services, like Notion and Google Docs.

Job board

A few notable moves this week:

  • Meta promoted John Hegeman to chief revenue officer, reporting to COO Javier Olivan. Another one of Olivan’s reports, Justin Osofsky, was also promoted to be head of partnerships for the whole company, including the company’s go-to-market strategy for Llama.
  • Alec Radford, an influential, veteran OpenAI researcher who authored its original GPT research paper, is leaving but will apparently continue working with the company in some capacity. And Shivakumar Venkataraman, who was recently brought in from Google to lead OpenAI’s search efforts, has also left.
  • Coda co-founder and CEO Shishir Mehrotra will also run Grammarly now that the two companies are merging, with Grammarly CEO Rahul Roy-Chowdhury staying on as a board member. 
  • Tencent removed two directors, David Wallerstein and Ben Feder, from the board of Epic Games after the Justice Department said their involvement violated antitrust law. 
  • Former Twitter CFO Ned Segal has been tapped to be chief of housing and economic development for the city of San Francisco. 

More links

  • My full Decoder interview with Arm CEO Rene Haas about the AI chip race, Intel, and more.
  • Waymo’s new report shows that its AV system is far safer than human drivers.
  • The US AI task force’s recommendations and policy proposals. 
  • Apple’s most downloaded app of the year was Temu, followed by Threads, TikTok, and ChatGPT.
  • Global spending on mobile apps increased 15.7 percent this year while overall downloads decreased 2.3 percent.

If you aren’t already getting new issues of Command Line, don’t forget to subscribe to The Verge, which includes unlimited access to all of our stories and an improved ad experience on the web. You’ll also get access to the full archive of past issues.

As always, I want to hear from you, especially if you have a tip or feedback. Respond here, and I’ll get back to you, or ping me securely on Signal.
