The chatter around artificial general intelligence (AGI) may dominate headlines coming from Silicon Valley companies like OpenAI, Meta and xAI, but for enterprise leaders on the ground, the focus is squarely on practical applications and measurable results. At VentureBeat's recent Transform 2025 event in San Francisco, a clear picture emerged: the era of real, deployed agentic AI is here, it is accelerating, and it is already reshaping how businesses operate.
Companies like Intuit, Capital One, LinkedIn, Stanford University and Highmark Health are quietly putting AI agents into production, tackling concrete problems, and seeing tangible returns. Here are the four biggest takeaways from the event for technical decision-makers.
1. AI agents are moving into production, faster than anyone realized
Enterprises are now deploying AI agents in customer-facing applications, and the trend is accelerating at a breakneck pace. A recent VentureBeat survey of 2,000 industry professionals conducted just before VB Transform revealed that 68% of enterprise companies (those with 1,000+ employees) had already adopted agentic AI, a figure that seemed high at the time. (In fact, I worried it was too high to be credible, so when I presented the survey results on the event stage, I cautioned that the high adoption rate may be a reflection of VentureBeat's particular readership.)
However, new data validates this rapid shift. A KPMG survey released on June 26, a day after our event, shows that 33% of organizations are now deploying AI agents, a surprising threefold increase from just 11% in the previous two quarters. This market shift validates the trend VentureBeat first identified just weeks ago in its pre-Transform survey.
This acceleration is being fueled by tangible results. Ashan Willy, CEO of New Relic, noted a staggering 30% quarter-over-quarter growth in the monitoring of AI applications by its customers, largely driven by those customers' move to adopt agents. Companies are deploying AI agents to help customers automate workflows they need help with. Intuit, for instance, has deployed invoice generation and reminder agents in its QuickBooks software. The result? Businesses using the feature are getting paid five days faster and are 10% more likely to be paid in full.
Even non-developers are feeling the shift. Scott White, product lead for Anthropic's Claude, described how he, despite not being a professional programmer, is now building production-ready software features himself. "This wasn't possible six months ago," he explained, highlighting the power of tools like Claude Code. Similarly, OpenAI's head of product for its API platform, Olivier Godement, detailed how customers like Stripe and Box are using its Agents SDK to build out multi-agent systems.
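For readers who haven't tried it, the Agents SDK Godement referenced is OpenAI's open-source Python library for composing agents. The sketch below is a minimal illustration of the handoff pattern the SDK supports, assuming the openai-agents package is installed and an OPENAI_API_KEY is set; the agent names and instructions are hypothetical and are not drawn from Stripe's or Box's actual systems.

```python
# pip install openai-agents  (assumes OPENAI_API_KEY is set in the environment)
from agents import Agent, Runner

# Two hypothetical specialist agents; names and instructions are illustrative only.
billing_agent = Agent(
    name="Billing agent",
    instructions="Answer questions about invoices and charges.",
)
refund_agent = Agent(
    name="Refund agent",
    instructions="Handle refund requests and explain the refund policy.",
)

# A triage agent that hands the conversation off to the right specialist.
triage_agent = Agent(
    name="Triage agent",
    instructions="Route each request to the billing or refund agent.",
    handoffs=[billing_agent, refund_agent],
)

result = Runner.run_sync(triage_agent, "I was double-charged last month. Can I get a refund?")
print(result.final_output)
```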
2. The hyperscaler race has no clear winner as multi-cloud, multi-model reigns
The days of betting on a single large language model (LLM) provider are over. A consistent theme throughout Transform 2025 was the move toward a multi-model and multi-cloud strategy. Enterprises want the flexibility to choose the best tool for the job, whether that's a powerful proprietary model or a fine-tuned open-source alternative.
As Armand Ruiz, VP of AI Platform at IBM, explained, the company's development of a model gateway, which routes applications to whichever LLM is most efficient and performant for the specific use case, was a direct response to customer demand. IBM started by offering enterprise customers its own open-source models, then added open-source support, and finally realized it needed to support all models. This desire for flexibility was echoed by XD Huang, the CTO of Zoom, who described his company's three-tiered model approach: supporting proprietary models, offering its own fine-tuned model, and allowing customers to create their own fine-tuned versions.
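IBM didn't share the gateway's internals on stage, so the snippet below is only a generic sketch of what the routing layer of such a gateway might look like: applications ask for a task type, and the gateway picks whichever configured model fits that task. The ModelGateway class, model names, and routing table are hypothetical stand-ins, not IBM's implementation.

```python
# A minimal, hypothetical sketch of an LLM gateway's routing layer.
from dataclasses import dataclass
from typing import Callable


@dataclass
class ModelEndpoint:
    name: str
    cost_per_1k_tokens: float
    call: Callable[[str], str]  # wraps the provider-specific client


class ModelGateway:
    def __init__(self) -> None:
        self.routes: dict[str, ModelEndpoint] = {}

    def register(self, task: str, endpoint: ModelEndpoint) -> None:
        self.routes[task] = endpoint

    def complete(self, task: str, prompt: str) -> str:
        # Fall back to the default route if the task isn't explicitly mapped.
        endpoint = self.routes.get(task, self.routes["default"])
        return endpoint.call(prompt)


# Example wiring: route general requests to a small open model and code tasks to a
# proprietary one. The lambdas are stubs standing in for real provider clients.
gateway = ModelGateway()
gateway.register("default", ModelEndpoint("open-small", 0.1, lambda p: f"[open-small] {p}"))
gateway.register("code", ModelEndpoint("proprietary-code", 1.5, lambda p: f"[proprietary-code] {p}"))
print(gateway.complete("code", "Write a unit test for the invoice parser."))
```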
This trend is creating a powerful but constrained ecosystem, where the GPUs and the power needed to generate tokens are in limited supply. As Dylan Patel of SemiAnalysis and fellow panelists Jonathan Ross of Groq and Sean Lie of Cerebras pointed out, this puts pressure on the profitability of the many companies that simply buy more tokens whenever they become available, instead of locking in profits as the cost of those tokens continues to fall. Enterprises are getting smarter about how they use different models for different tasks to optimize for both cost and performance, and that may often mean not just relying on Nvidia chips but taking a much more customized approach, something also echoed in a VB Transform session led by Solidigm on the emergence of customized memory and storage solutions for AI.
3. Enterprises are focused on solving real problems, not chasing AGI
While tech leaders like Elon Musk, Mark Zuckerberg and Sam Altman are talking about the dawn of superintelligence, enterprise practitioners are rolling up their sleeves and solving immediate business challenges. The conversations at Transform were refreshingly grounded in reality.
Take Highmark Health, the nation's third-largest integrated health insurance and provider company. Its chief data officer, Richard Clarke, said it is using LLMs for practical applications like multilingual communication to better serve its diverse customer base, and for streamlining medical claims. In other words, leveraging the technology to deliver better services today. Similarly, Capital One is building teams of agents that mirror the capabilities of the company, with specific agents for tasks like risk evaluation and auditing, along with helping its car dealership clients connect customers with the right loans.
The travel industry is also seeing a pragmatic shift. CTOs from Expedia and Kayak discussed how they're adapting to the new search paradigms enabled by LLMs. Users can now search for a hotel with an "infinity pool" on ChatGPT, and travel platforms need to incorporate that level of natural language discovery to stay competitive. The focus is on the customer, not the technology for its own sake.
4. The future of AI teams is small, nimble, and empowered
The age of AI agents is also transforming how teams are structured. The consensus is that small, agile "squads" of three to four engineers are most effective. Varun Mohan, CEO of Windsurf, a fast-growing agentic IDE, kicked off the event by arguing that this small-team structure allows for rapid testing of product hypotheses and avoids the slowdown that plagues larger groups.
This shift means that "everyone is a builder" and, increasingly, "everyone is a manager" of AI agents. As GitHub and Atlassian noted, engineers are now learning to manage fleets of agents. The skills required are evolving, with a greater emphasis on clear communication and strategic thinking to guide these autonomous systems.
This nimbleness is supported by a growing acceptance of sandboxed development. Andrew Ng, a leading voice in AI, advised attendees to leave safety, governance, and observability to the end of the development cycle. While this may seem counterintuitive for large enterprises, the idea is to foster rapid innovation within a controlled environment to prove value quickly. This sentiment was mirrored in our survey, which found that 10% of organizations adopting AI have no dedicated AI safety team, suggesting a willingness to prioritize speed in these early stages.
Together, these takeaways paint a clear picture of an enterprise AI landscape that is maturing rapidly, moving from broad experimentation to focused, value-driven execution. The conversations at Transform 2025 confirmed that companies are deploying AI agents today, even if they've had to learn tough lessons along the way. Many have already gone through one or two big pivots since first trying out generative AI a year or two ago, so it pays to get started early.
For a more conversational dive into these themes and further analysis from the event, you can listen to the full discussion I had with independent AI developer Sam Witteveen on our recent podcast below. We've also just uploaded the main-stage talks from VB Transform here, and our full coverage of articles from the event is here.
Listen to the VB Transform takeaways podcast with Matt Marshall and Sam Witteveen here:
Editor's note: As a thank-you to our readers, we've opened up early-bird registration for VB Transform 2026 at just $200. This is where AI ambition meets operational reality, and you're going to want to be in the room. Reserve your spot now.