Insurance giant AIG deploys agentic AI with orchestration layer

News Room

American International Group (AIG) has reported faster-than-expected gains from its use of generative AI, with implications for underwriting capacity, operating costs, and portfolio integration. The company’s recent disclosures at an Investor Day merit attention from AI decision-makers because they contain assertions about measurable throughput and workflow redesign.

AIG has previously outlined potential benefits from generative AI. Chief executive Peter Zaffino later described the company’s early projections as “aspirational,” yet on a fourth-quarter earnings call he stated that “we see the abilities are much greater.” The change in tone points to positive internal results; according to Zaffino, “We’re seeing a massive change in our ability to process a submission flow way […] without additional human capital resources. That has been the biggest surprise.”

The company claims that generative AI has increased submission-processing capacity, and the economic impact is direct. AIG reports that in 2025 it “made progress embedding generative AI in our core underwriting and claims processes, and expanding it.” The company’s internal tool, AIG Assist, is implemented in most commercial lines of business.

Lexington Insurance, AIG’s excess and surplus unit, has targeted reaching 500,000 submissions by 2030. Zaffino reports that Lexington has already surpassed 370,000 submissions in 2025. AIG uses generative models to extract and summarise incoming data, and has developed an orchestration layer in its technology stack “to coordinate AI agents to drive better decision-making and reduce costs in the organisation.” At previous Investor Days, this level of orchestration was not a focus.
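AIG has not published implementation details, but the pattern it describes (generative models extracting and summarising submission data, coordinated by an orchestration layer) can be sketched in a few lines of Python. Every class and function name below is hypothetical; this is an illustration of the general pattern, not AIG’s system.

```python
from dataclasses import dataclass, field
from typing import Callable

# Hypothetical sketch of an orchestration layer that routes an incoming
# submission through extraction and summarisation "agents". Each agent is
# a callable that enriches a shared context dict.

@dataclass
class Submission:
    raw_text: str                      # e.g. broker email plus attached schedules
    context: dict = field(default_factory=dict)

def extraction_agent(sub: Submission) -> None:
    # In a real system this step would call an LLM to pull structured fields
    # (insured name, limits, exposures) out of unstructured documents.
    sub.context["fields"] = {"insured": "<extracted>", "limit": "<extracted>"}

def summarisation_agent(sub: Submission) -> None:
    # Condenses the extracted fields and raw text into an underwriter-ready brief.
    sub.context["summary"] = f"Brief built from {len(sub.raw_text)} characters of submission data"

class Orchestrator:
    """Runs agents in a fixed order; a production system would add retries,
    audit logging, and human review gates."""

    def __init__(self, agents: list[Callable[[Submission], None]]):
        self.agents = agents

    def run(self, sub: Submission) -> dict:
        for agent in self.agents:
            agent(sub)
        return sub.context

pipeline = Orchestrator([extraction_agent, summarisation_agent])
print(pipeline.run(Submission(raw_text="ACORD forms, loss runs, broker notes...")))
```

The point of the orchestration layer in this sketch is that the sequencing, logging, and escalation logic lives in one place rather than inside each agent, which is what allows additional agents to be added to the workflow without redesigning it.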

The chief executive describes AI agents “as companions that operate with our teams,” providing real-time information, drawing on historical cases, and challenging underwriting decisions. The company relies on its ability to manage incoming data “at a fraction of the time” and to orchestrate agents so they can “scale and be able to analyse that information that’s not biased in any way; that’s through the entire workflow.”

AIG links orchestration to the compression of what it terms a “front-to-back workflow”: tighter integration between intake, risk assessment, and claims handling. The company states that multiple agents, coordinated through an orchestration layer, streamline repetitive and previously lengthy processes.

AIG has applied its generative AI stack in specific transactions. During the conversion of Everest’s retail commercial business, the company reports that accounts were prioritised for renewal “in a fraction of the time.” Management states that it built an ontology of Everest’s portfolio and combined it with its own, which “allowed [the company] to prioritise how the portfolios could blend together.” Ontological alignment is technically demanding, and its costs are often underestimated.
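The article does not describe how the ontologies were built or combined, but the general idea of blending two portfolio ontologies can be illustrated with a toy example: normalise each book’s line-of-business labels to a shared vocabulary, then rank the overlap to prioritise renewals. All labels and account counts below are invented.

```python
# Toy illustration of aligning two portfolio "ontologies": map each book's
# line-of-business labels to a shared vocabulary, then rank the overlap so
# renewal candidates can be prioritised. Real ontology alignment covers
# thousands of classes, coverage terms, and exposure attributes.

AIG_TO_SHARED = {"Commercial Property": "property", "Casualty - Primary": "casualty"}
ACQUIRED_TO_SHARED = {"Property - Retail": "property", "General Liability": "casualty",
                      "Marine Cargo": "marine"}

def align(book_a: dict[str, int], book_b: dict[str, int]) -> list[tuple[str, int]]:
    """Return shared classes ranked by combined account count."""
    a = {AIG_TO_SHARED[k]: v for k, v in book_a.items() if k in AIG_TO_SHARED}
    b = {ACQUIRED_TO_SHARED[k]: v for k, v in book_b.items() if k in ACQUIRED_TO_SHARED}
    shared = set(a) & set(b)
    return sorted(((cls, a[cls] + b[cls]) for cls in shared), key=lambda x: -x[1])

# Illustrative account counts per line of business in each portfolio.
print(align({"Commercial Property": 1200, "Casualty - Primary": 800},
            {"Property - Retail": 950, "General Liability": 400, "Marine Cargo": 150}))
# -> [('property', 2150), ('casualty', 1200)]
```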

The launch of Lloyd’s Syndicate 2479, in partnership with Amwins and Blackstone, extended the ontological approach to a special purpose vehicle. In conjunction with Palantir, AIG used LLMs to assess whether Amwins’ programme portfolio aligned with the syndicate’s stated risk appetite. Zaffino stated that AIG has a “strong pipeline of SPV opportunities.”
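Neither AIG nor Palantir has disclosed how that assessment works, but a minimal, self-contained sketch of the triage step (scoring each programme against a written risk appetite and returning a structured verdict) might look like the following. The call_llm stub, prompt wording, and appetite text are all assumptions introduced for illustration.

```python
import json

# Hypothetical sketch: score each programme against a written risk appetite
# using an LLM and return a structured verdict for underwriters to review.
# Nothing here reflects AIG's or Palantir's actual implementation.

RISK_APPETITE = (
    "Target: US excess and surplus property and casualty programmes; "
    "maximum net line of $10M; no standalone cyber; "
    "cat-exposed property capped at 20% of premium."
)

PROMPT_TEMPLATE = (
    "Risk appetite:\n{appetite}\n\n"
    "Programme description:\n{programme}\n\n"
    'Respond with JSON: {{"aligned": true or false, "reason": "<one sentence>"}}'
)

def call_llm(prompt: str) -> str:
    # Placeholder for a real model client; a canned response keeps the
    # sketch self-contained and runnable.
    return '{"aligned": false, "reason": "Standalone cyber sits outside the stated appetite."}'

def assess_programme(description: str) -> dict:
    prompt = PROMPT_TEMPLATE.format(appetite=RISK_APPETITE, programme=description)
    return json.loads(call_llm(prompt))

def screen_portfolio(programmes: list[str]) -> list[dict]:
    # The model only triages; alignment decisions stay with human underwriters.
    return [assess_programme(p) for p in programmes]

print(screen_portfolio(["Standalone cyber programme for mid-market retailers"]))
```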

For AI decision-makers, the case illustrates the value that orchestration and workflow integration can deliver when generative models are embedded in core processes, and the degree to which economic impact depends on measurable changes in capacity and cycle time.

(Image source: “Nagasaki, AIG (Insurance company) building” by Admanchester is licensed under CC BY-NC-ND 2.0. )

