AI money is moving fast today — from hospitals all the way to the battlefield. Capital is chasing startups that can diagnose faster, defend smarter, and run the infrastructure underneath open-source models.
This is Startup Fundraising. Today, we’re tracking the biggest checks and VC signals across clinical AI, defense intelligence, and AI infrastructure.
Friday deal flow. Let’s get into it.
Absolutely. First up: the AI rounds setting the tone.
From Anuja Vaidya at TechTarget:
Aidoc, a clinical AI company focused on medical imaging analysis and workflow optimization, has raised $150 million in a Series E funding round, bringing its total funding to over $500 million. Aidoc's proprietary aiOS aggregates and analyzes medical data and imaging to detect diagnostic errors and flag abnormalities.
That’s a big endorsement for AI that’s actually inside the clinical workflow — not just sitting off to the side as another dashboard. The question is how hospitals use it: decision support, or something that quietly becomes the default call.
Next, from Intelligence Community News:
On April 29, Scout AI Inc. announced an oversubscribed $100 million Series A financing to accelerate development of Fury, its foundation model for unmanned warfare. The round was co-led by Align Ventures and Draper Associates.
A hundred-million-dollar Series A for unmanned warfare is a pretty clear signal. Defense AI is not a niche corner anymore — it’s turning into a capital arms race. And the pitch is blunt: control autonomous systems at scale, and you may control the battlefield.
Now to AI infrastructure. From Duncan Riley at SiliconANGLE:
Featherless.ai Inc., a serverless inference platform startup that hosts open-source artificial intelligence models, today revealed it has raised $20 million in new funding to expand its global infrastructure and launch a marketplace for specialized open models. Founded in 2024, Featherless.ai pitches itself as a neutral hosting layer for enterprises that want to run open-source AI without committing to a single proprietary cloud or hardware stack.
That "neutral hosting layer" pitch is the key line here. If enterprises want open models without being boxed into one cloud or hardware stack, serverless inference stops sounding like a nice feature — and starts looking like the next AI infrastructure fight.
You’ll find links to every story we covered today in the show notes, so if one caught your ear, you can dig into the full piece there.
That’s Startup Fundraising for today. Thanks for listening, and have a great Friday. This is a Lantern Podcast.