Portfolio · 27/03/2025

The year of churn and the death of the specialist: What’s top of mind for tech startups?

We asked our portfolio founders.


Death of the specialist. Taste as the new bottleneck in programming. The year of churn. We recently surveyed our founders – who, together, are building the future of robotics, data analytics, fintech, healthtech and more – to understand the major trends and unexpected market developments shaping their strategy, their most pressing hiring challenges and opportunities, and their “hot takes” (shared anonymously, to encourage only the boldest predictions). We also wanted to know how AI is changing their product development and engineering workflows, and which API-based LLMs they’re using or planning to use.

It’s a snapshot of what early-stage tech founders are thinking, planning for and prioritising right now.

So what did they have to say?

Unsurprisingly, AI is top of mind for most of our founders.

From experimentation to application

For some, the focus is shifting from experimentation to full-scale deployment, with businesses looking to integrate AI into core operations across different verticals.

Anders Krohn of Kernel observes that “enterprises are moving (or trying to move) from experimental budgets and production to deployment of AI applications.” Even traditionally conservative technology adopters, like finance departments, are embracing AI. According to a KPMG survey last year, three-quarters of companies were already using AI to some degree in their financial reporting processes, and Ole Heine of Haydn notes that finance teams “are getting more and more open-minded about implementing LLMs in their workflows.”

This extends to compliance and security. Mike McNeil of Fleet sees IT and security teams “automating and changing their practices to be more efficient,” and Baran Ozkan of Flagright sees “LLMs becoming the forefront strategy for all AML providers.”

The future is verticalised and niche

“Incumbents will eat most of your basic AI feature set. Keyword: Salesforce Agentforce”, was one of our anonymous takes.

As tech giants like Salesforce integrate AI features into their products and cause the first cohort of B2B AI applications to churn, startups will need to rethink the application layer, combining both depth and breadth in their offerings.

Our founders see the future of AI in verticalised, agentic applications. One founder building back-office SaaS for businesses believes that “The biggest wins will come from vertical solutions that use LLMs to augment humans in specific domain-specific complex processes.” Dhruv Tandon of Decisional adds that “verticalised AI-native UX will evolve into a maker-checker experience: the agent will make and the human will check and regenerate.”

So verticality, product depth, and proprietary data are the moats. “Also, AI products should require more user ‘depth’ and skill; single one-button tools are actually bad for wide adoption of AI,” was another founder’s take.
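Dhruv Tandon’s “maker-checker” framing above is, at its heart, a simple control loop: the agent drafts, a human reviews, and the agent regenerates with the reviewer’s feedback. As a purely illustrative sketch – not any founder’s product, with hypothetical callables standing in for the agent and the reviewer – it might look like this in Python:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Review:
    approved: bool
    comments: str = ""

def maker_checker_loop(
    task: str,
    generate_draft: Callable[..., str],             # the agent: "makes" a draft, optionally from feedback
    request_human_review: Callable[[str], Review],  # the human: "checks" the draft
    max_rounds: int = 3,
) -> str:
    """Agent drafts, human reviews, agent regenerates until approval or rounds run out."""
    draft = generate_draft(task)
    for _ in range(max_rounds):
        review = request_human_review(draft)
        if review.approved:
            return draft
        draft = generate_draft(task, feedback=review.comments)  # regenerate with the human's notes
    return draft  # fall back to the latest draft if review rounds are exhausted
```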

Open source and roll-your-own

Eduardo Candela of Maihem reflects that “The commoditisation of smaller and open source LLMs will enable many new AI companies and applications.” New models like DeepSeek R1 – and the fact that the DeepSeek team openly documented their methodology – are lowering adoption costs and driving faster innovation, and founders see open-source AI addressing scalability issues encountered by proprietary systems.

However, the real bottleneck is not AI but data. Success will depend on startups’ ability to invest in backend optimisation and data formatting to get the most use out of these models, whether fine-tuning an open source model or training one from scratch. “The open source or roll-your-own models are attractive but will require more funding to properly leverage as a startup,” says George Webster, stealth. “You have to have the business structures in place to optimise your systems with AI and the data needs to be in the right structure and backend for the AI to execute against. Data engineering and business processes are 90% of AI.”
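To give a concrete (if simplified) flavour of the data-formatting work Webster describes, here is a hedged Python sketch that converts raw Q&A records into the chat-style JSONL structure many open-source fine-tuning pipelines expect. The field names and example records are hypothetical, not drawn from any portfolio company:

```python
import json

# Hypothetical raw business records; in practice these would come from a warehouse or CRM.
raw_records = [
    {"question": "What was Q3 churn?", "answer": "Q3 churn was 4.2%, down from 5.1% in Q2."},
    {"question": "Summarise the AML alert backlog.", "answer": "312 alerts reviewed, 9 escalated."},
]

def to_chat_example(record: dict) -> dict:
    """Wrap a Q&A pair in the chat-message structure most fine-tuning pipelines expect."""
    return {
        "messages": [
            {"role": "system", "content": "You are a concise back-office assistant."},
            {"role": "user", "content": record["question"]},
            {"role": "assistant", "content": record["answer"]},
        ]
    }

# One JSON object per line: the common JSONL format for supervised fine-tuning data.
with open("train.jsonl", "w") as f:
    for record in raw_records:
        f.write(json.dumps(to_chat_example(record)) + "\n")
```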

Demand for simplicity

Across every category of B2B software, the signal is clear: simplicity wins.

“After years of being bombarded by SaaS and point solutions, HR buyers crave simplicity more than ever,” says Sançar Sahin of Oliva. “This means consolidated tools, simple pricing models, and low admin.”

This expectation isn’t limited to HR. In compliance, AI is moving beyond “black box” solutions toward “glass box” systems. “Compliance officers now demand the same intuitive interfaces as consumer apps, with AI decisions explained in clear business language,” says Madhu Nadig of Flagright. “This convergence of advanced AI and human-centered design is redefining what compliance teams expect.”

Hiring and talent

Hiring for global growth

Our portfolio companies are prioritising hires for growth. That means sales teams and GTM leadership to scale beyond founder-led sales.

From ramping up GTM in the US to hiring top engineering talent in Lagos, hiring is increasingly global – from day one, with several founders prioritising remote talent. “The ability to operate as a multinational from the start is crucial,” says George Webster, stealth. “You need the ability to recruit the right talent, in the right location, from the start.”

The death of job-hopping

For one founder, “this is the year of churn. Tech professionals will face a reckoning, adapting to AI-driven workflows or becoming obsolete.” And as AI agents begin to take on a bigger role in the workforce, Sebastian Schüller of HiPeople believes that “Talent acquisition and HR teams will start budgeting agents against headcount, not existing software budgets.”

Specialisation is no longer sufficient in an AI-powered world. “We're prioritising adaptable generalists – general purpose athletes over specialists – individuals who combine rapid learning ability with strong execution drive,” says Madhu Nadig of Flagright. “The AI era shows that willpower and ability to navigate ambiguity are better performance predictors.”

An HR tech founder echoes the sentiment: “The future kings/queens of the workforce won't be specialists. Instead, it'll be generalist ‘conductors’ who know how to leverage AI and external expertise to get 10x done.” Instead of a design team with graphic designers, motion designers, UX designers, video editors and so on, you'll have one person who can produce the same outputs through a strong command of many AI tools.

Another anonymous founder predicted that this prioritisation of generalists could mark the death of job-hopping culture. “Contrary to a decade of ‘growth through job switches,’ top talent is now choosing 7+ year tenures. Not from loyalty, but because rapid AI advancement means deep company context is becoming a career superpower, while frequent switchers struggle to demonstrate lasting impact. The two-year Silicon Valley stint is becoming career suicide.”

Robotics’ ChatGPT moment?

Even though the competition for top tech talent – particularly in applied ML and robotics – is fierce, the “expertise [in LLMs and VLMs] is quickly becoming accessible to a larger crowd,” says Nikita Rudin of Flexion Robotics. “A lot of information is available online and having a PhD in the field is no longer necessary to train such models.”

So is the ChatGPT moment for general robotics right around the corner, as NVIDIA CEO Jensen Huang suggested in his CES keynote in January? A few months later, Google DeepMind announced Gemini Robotics, a vision-language-action (VLA) model, and Gemini Robotics-ER, claiming advanced spatial understanding capabilities.

It’s “a very hyped field at the moment,” with “new players emerging on a weekly basis across the whole spectrum of robotics from hardware to software,” say Fabian Tischhauser, Julian Nubert and Nikita Rudin of Flexion Robotics. But they believe progress will be further accelerated by the success of open source LLMs, VLMs and VLA models, and by advances in robotic hand development. “It’s unclear how good robots will actually be in a few years – but the progress will be huge.”

AI tooling

Is “vibecoding” taking over? For a quarter of the Winter 2025 batch at YC, 95% of their code is LLM-generated, according to Garry Tan. And Anthropic’s co-founder Dario Amodei predicts that by the end of the year AI will be “writing essentially all of the code”. While not writing everything with AI just yet, our founders are certainly tooling up.

Massive productivity gains

AI-powered tools – whether being used alongside traditional IDEs or as standalone AI-native code editors like Cursor and Windsurf – are now staples of engineering teams, allowing them to take on a larger scope in each dev cycle and automate away the unsexy areas of the process. Our founders are reporting productivity boosts of between 10x and 50x.

And that goes for the whole organisation. “We’re using AI codegen for back-office tasks that would typically take 50x longer,” says Adam McCann of Claimer. “It’s transformative for internal operations and marketing.”

From designs to prototypes

AI is also transforming product prototyping by automating mundane design tasks. For Dhruv Tandon of Decisional, product development “is changing from iterating on designs to iterating on mock prototypes that work, thanks to tools like Magic Patterns.” Both Eduardo Candela of Maihem and Seth Phillips of Bound echo this, using AI to rapidly prototype non-production apps for quick proof-of-concept validation.

Taste is the new bottleneck

With producing code and completing routine tasks becoming easier, the challenge shifts to problem-solving and experimentation. “Right now producing code is not the bottleneck, but thinking through the problem solving part and trying out experiments is where the bulk of the work happens, which is awesome!” says Asheem Panakkat of fore AI.

“Taste, quality, and thoughtfulness will matter more now that the dev per unit of labor is 2-3x of what was possible before,” says Dhruv Tandon of Decisional.

LLM usage and evaluation

Our founders are increasingly selective about which LLMs they use. OpenAI models still dominate, but they’re often using Claude (valued for its reasoning capabilities) and Gemini (noted for its security integrations) alongside, or at least evaluating them. Cost concerns are also driving more interest in open-source alternatives like DeepSeek, particularly for those who want more control over their models.

Here's what some of our founders are using, and how they’re interacting with them.


Chart: Which API-based LLMs are tech founders using or planning to evaluate?

Chart: What programming languages are tech founders using to interact with foundation model APIs?
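For context on what “interacting with a foundation model API” typically involves, here is a minimal Python sketch using the OpenAI SDK’s chat completions endpoint; the model name and prompt are placeholders rather than anything a founder reported using:

```python
from openai import OpenAI

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": "You are a helpful assistant for a fintech back office."},
        {"role": "user", "content": "Draft a one-paragraph summary of last month's reconciliation exceptions."},
    ],
)

print(response.choices[0].message.content)
```

In practice, swapping in Claude, Gemini or a hosted open-source model is largely a matter of changing the client library and model name, which is one reason founders can evaluate several providers side by side.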

Hot takes

And to end, here are some more anonymised hot takes from our founders.

  • “The thinner the wrapper you build over different foundation models the more defensible the product.”
  • “Humanoids in household application in 2030.”
  • “Non-human identities doing tasks formerly relegated to knowledge workers.”
  • “The DeepSeek craze will die down, generative AI capabilities will continue to improve at a steady rate, and most of the value will keep going to NVIDIA.”
  • “Voice AI will be the new hot thing.”
  • “Government investment will become an increasingly attractive form of funding in the US, with the UK and EU working to replicate.”

Thanks to Thomas Vande Casteele, Seth Phillips, Adam McCann, Dhruv Tandon, Adit Sanghvi, Baran Ozkan, Madhu Nadig, Mike McNeil, Fabian Tischhauser, Julian Nubert, Nikita Rudin, Asheem Panakkat, Ole Heine, Sebastian Schüller, Giovanni Luperti, Anders Krohn, Gerard Clos, Eduardo Candela, Max Ahrens, Jiameng Gao, Sançar Sahin, Karl Moritz Hermann, Ruben Burdin and George Webster for sharing your thoughts.
