This post was co-authored with Benjamin Kleppe, Machine Learning Engineering Lead at bunq.
The integration of agentic AI is transforming the banking industry, marking a significant shift from traditional customer service systems. Agentic AI demonstrates autonomous decision-making capabilities in complex financial environments, enabling banks to offer round-the-clock multilingual support, process transactions, and deliver personalized financial insights at scale.
bunq is Europe’s second-largest neobank, built to make life easy for people and businesses who live an international lifestyle. Founded in 2012 by serial entrepreneur Ali Niknam, bunq has always put users at the heart of everything it does. The company helps its 20 million users across Europe spend, save, budget, and invest confidently, all within a single, user-friendly application built on user feedback.
In this post, we show how bunq upgraded Finn, its in-house generative AI assistant, using Amazon Bedrock to make user support and banking operations seamless across multiple languages and time zones.
Business challenge
Banks face a major challenge in delivering consistent, high-quality customer support across multiple channels, languages, and time zones. Traditional support systems struggle with the complexity of financial products, regulatory requirements, and the growing expectation for immediate, accurate responses. Customers expect instant access to essential banking capabilities like transaction disputes, account management, and financial advice, and banks need to maintain strict security protocols and compliance standards. As a user-centric bank, bunq’s users expect round-the-clock support for their banking needs, such as requesting a refund or seeking guidance on features. Traditional support models couldn’t keep up with this demand, creating frustrating bottlenecks and straining internal resources. Beyond direct support, bunq’s team also needed efficient ways to analyze incoming feature requests and bug reports to continuously improve the system. It was clear that bunq needed a smarter solution that could provide instant, accurate assistance around the clock and help the team turn valuable user feedback into action.
Solution overview
Launched in 2023, bunq’s generative AI assistant, Finn, is fully built in-house as part of bunq’s proprietary AI stack. Finn uses leading AI foundation models (FMs) and tooling, including Anthropic’s Claude models through Amazon Bedrock. Unlike generic chatbots, Finn processes natural language and provides real-time, intelligent answers. Finn can translate the bunq application into 38 languages and translate speech-to-speech calls to the support team in real time. It can also summarize complex banking information, provide financial insights and budgeting advice, and even recognize images, automating tedious tasks such as invoice processing. bunq’s approach uses AWS services to create a scalable AI agent infrastructure that can handle the demands of modern banking while maintaining security and compliance. The solution uses the following AWS services:
- Amazon Bedrock – A fully managed service that makes high-performing FMs from leading AI companies and Amazon available through a unified API. bunq uses Amazon Bedrock to access Anthropic’s Claude models with enhanced security features, scalability, and compliance, which are critical requirements for banking applications. A minimal invocation sketch follows this list.
- Amazon Elastic Container Service (Amazon ECS) – A fully managed container orchestration service that makes it straightforward to deploy, manage, and scale containerized applications. Amazon ECS removes the need to install and operate container orchestration software or manage clusters of virtual machines, helping bunq focus on building Finn’s multi-agent architecture.
- Amazon DynamoDB – A fully managed, serverless, NoSQL database service designed to run high-performance applications at scale. DynamoDB delivers single-digit millisecond performance and stores agent memory, conversation history, and session data, enabling Finn to maintain context across customer interactions.
- Amazon OpenSearch Serverless – An on-demand, automatically scaling configuration for Amazon OpenSearch Service. OpenSearch Serverless scales compute resources based on application needs and provides vector search capabilities for Finn’s Retrieval Augmented Generation (RAG) implementation, enabling semantic search across bunq’s knowledge base.
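Finn’s implementation is proprietary, but the following minimal sketch shows what a basic call to an Anthropic Claude model through the Amazon Bedrock Converse API looks like with boto3. The model ID, AWS Region, system prompt, and user message are illustrative assumptions, not bunq’s configuration.

```python
# Minimal sketch of invoking an Anthropic Claude model on Amazon Bedrock with
# the Converse API. Model ID, Region, and prompts are placeholders.
import boto3

bedrock_runtime = boto3.client("bedrock-runtime", region_name="eu-central-1")

response = bedrock_runtime.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",
    system=[{"text": "You are Finn, a multilingual banking support assistant."}],
    messages=[
        {
            "role": "user",
            "content": [{"text": "Why did my payment to the utility company fail yesterday?"}],
        }
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

# The assistant's reply is returned as content blocks in the output message.
print(response["output"]["message"]["content"][0]["text"])
```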
Building a multi-agent implementation with Amazon Bedrock
Users can interact with Finn through bunq’s application and web interface, using natural language for requests such as account information, transaction history, financial advice, and support issues. The system processes requests in real time, accessing only the data pertinent to the request, while maintaining strict security and privacy controls.

User support scenarios demand more than a single AI agent can deliver. A multi-agent architecture allows specialized agents to handle distinct tasks: one agent might excel at understanding the user, another focuses on retrieving relevant documentation, and a third handles transaction analysis or account operations. For Finn, this means a user asking about a failed payment can trigger a coordinated response: one agent interprets the question, another checks transaction logs, and a third suggests solutions based on similar cases. They work together seamlessly to deliver a comprehensive answer in seconds, instead of bouncing the user between departments.

The initial multi-agent support system for banking services followed a seemingly simple pattern: a central router agent directed user queries to specialized sub-agents. Each agent handled a specific domain, such as technical support, general inquiries, transaction status, or account management. However, as the system grew, so did the size and complexity of the demands. As bunq added more specialized agents to handle the expanding ecosystem, three issues became apparent:
- Routing complexity – With multiple specialized agents, the router needed increasingly sophisticated logic to determine the correct destination.
- Overlapping capabilities – Several agents required access to the same data sources and capabilities, forcing the router to predict not just the primary intent but also which secondary agents might be needed downstream, an impossible task at scale.
- Scalability bottleneck – Every new agent or capability meant updating the router’s logic. Adding a new specialized agent required comprehensive testing of all routing scenarios. The router became a single point of failure and a potential development bottleneck.
Rethinking the architecture
bunq redesigned its system around an orchestrator agent that works fundamentally differently from the old router. Instead of trying to route to every possible agent, the orchestrator performs the following actions:
- Routes queries to only three to five primary agents
- Empowers these primary agents to invoke other agents as tools when needed
- Delegates decision-making to the agents themselves
With this agent-as-tool pattern, primary agents detect when they need specialized help. Tool agents are invoked dynamically by primary agents. Agents can call other agents through a well-defined interface; they become tools in one another’s toolkits.
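The following is a deliberately simplified sketch of the agent-as-tool pattern described above. The agent names, routing keywords, and stubbed logic are assumptions for illustration, not bunq’s internal implementation.

```python
# Simplified sketch of the agent-as-tool pattern: an orchestrator routes to a
# small set of primary agents, and a primary agent can call other agents
# through the same interface, as tools. Names and logic are illustrative.
from dataclasses import dataclass, field
from typing import Callable, Dict


@dataclass
class Agent:
    name: str
    handle: Callable[[str, "Agent"], str]                    # the agent's own logic
    tools: Dict[str, "Agent"] = field(default_factory=dict)  # other agents it can invoke

    def run(self, query: str) -> str:
        return self.handle(query, self)


def transaction_logs_agent(query: str, _agent: Agent) -> str:
    # A specialized "tool" agent; real logic would query transaction data.
    return "one failed payment found: insufficient balance"


def payments_primary_agent(query: str, agent: Agent) -> str:
    # The primary agent decides it needs specialized help and invokes another
    # agent as a tool, rather than the router predicting this upfront.
    evidence = agent.tools["transaction_logs"].run(query)
    return f"Your payment failed ({evidence}). You can retry after topping up."


class Orchestrator:
    """Routes each query to one of a handful of primary agents."""

    def __init__(self, primary_agents: Dict[str, Agent]):
        self.primary_agents = primary_agents

    def route(self, query: str) -> str:
        # A production orchestrator would use an LLM to pick the primary agent;
        # keyword matching keeps this sketch short.
        key = "payments" if "payment" in query.lower() else "general"
        return self.primary_agents[key].run(query)


logs = Agent("transaction_logs", transaction_logs_agent)
payments = Agent("payments", payments_primary_agent, tools={"transaction_logs": logs})
general = Agent("general", lambda query, agent: "How can I help with your account?")

orchestrator = Orchestrator({"payments": payments, "general": general})
print(orchestrator.route("Why did my payment fail?"))
```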
The following diagram illustrates this workflow.

bunq’s Finn service uses a comprehensive AWS infrastructure designed for security, scalability, and intelligent orchestration. The following architecture diagram shows how multiple AWS services work together to deliver a multi-agent AI system.

Orchestration and agent architecture
At the core of the system is the orchestrator agent, running on Amazon ECS. The orchestrator implements the agent-as-tool pattern, routing user queries to a limited set of primary agents rather than attempting to predict every possible scenario. The orchestrator maintains three to five primary agents (Primary Agent 1 through 5), each deployed as a containerized service on Amazon ECS. This design provides horizontal scalability: as demand increases, more agent instances can be spun up automatically. Each primary agent is empowered to invoke specialized agents as needed. These specialized agents (Specialized Agent 1, 2, 3, and so on) act as tools that primary agents can call upon for specific capabilities, such as analyzing transaction data, retrieving documentation, or processing complex queries. This hierarchical structure avoids the routing complexity bottleneck while maintaining flexibility.
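As a hypothetical illustration of the well-defined interface between agents, a primary agent’s model call can declare a specialized agent as a tool through the Converse API’s toolConfig. The tool name, schema, and model ID below are assumptions, not bunq’s actual definitions.

```python
# Hypothetical example: exposing a specialized agent to a primary agent's model
# call as a tool via the Bedrock Converse API. Names and schema are illustrative.
import boto3

bedrock_runtime = boto3.client("bedrock-runtime", region_name="eu-central-1")

tool_config = {
    "tools": [
        {
            "toolSpec": {
                "name": "transaction_analysis_agent",
                "description": "Analyzes a user's recent transactions for failures or anomalies.",
                "inputSchema": {
                    "json": {
                        "type": "object",
                        "properties": {"question": {"type": "string"}},
                        "required": ["question"],
                    }
                },
            }
        }
    ]
}

response = bedrock_runtime.converse(
    modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",
    messages=[{"role": "user", "content": [{"text": "Why did my payment fail?"}]}],
    toolConfig=tool_config,
)

# If the model requests the specialized agent, the primary agent dispatches the
# call, appends the tool result to the conversation, and asks the model again.
for block in response["output"]["message"]["content"]:
    if "toolUse" in block:
        print("Delegating to specialized agent:", block["toolUse"]["name"])
```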
Infrastructure details
The architecture is built on a robust foundation of AWS services that enable Finn’s performance. Users access the service through bunq’s application, with traffic secured by AWS WAF and Amazon CloudFront, while authentication flows through bunq’s proprietary identity system. Amazon Bedrock provides access to Anthropic’s Claude models for natural language understanding, complemented by fine-tuned models hosted on Amazon SageMaker for specialized banking scenarios. Agent memory and conversation history are stored in DynamoDB, and OpenSearch Serverless serves as a vector store for RAG capabilities, enabling semantic search across bunq’s knowledge base. Amazon Simple Storage Service (Amazon S3) handles document storage, and Amazon MemoryDB manages user sessions for real-time interactions. Comprehensive observability through AWS CloudTrail, Amazon GuardDuty, and Amazon CloudWatch helps the team monitor performance, detect threats, and maintain compliance, all within a secure virtual private cloud (VPC).
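For example, keeping agent memory in DynamoDB can be as simple as writing each conversation turn as an item keyed by session. The table name, key schema, and Region below are hypothetical and not bunq’s actual data model.

```python
# Assumed sketch of conversation memory in DynamoDB: one item per turn, keyed by
# session_id (partition key) and a millisecond timestamp (sort key).
import time

import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb", region_name="eu-central-1")
table = dynamodb.Table("finn-conversation-history")  # hypothetical table name


def append_turn(session_id: str, role: str, text: str) -> None:
    """Persist a single conversation turn."""
    table.put_item(
        Item={
            "session_id": session_id,
            "timestamp": int(time.time() * 1000),
            "role": role,
            "text": text,
        }
    )


def load_history(session_id: str, limit: int = 20) -> list:
    """Read back the oldest turns so an agent can rebuild session context."""
    response = table.query(
        KeyConditionExpression=Key("session_id").eq(session_id),
        ScanIndexForward=True,  # oldest first
        Limit=limit,
    )
    return response["Items"]
```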
Real-world impact
The transformation from bunq’s initial router-based architecture to the orchestrator pattern with Amazon Bedrock delivered measurable improvements across user support operations. The multi-agent deployment achieved significant operational efficiency gains:
- Finn now handles 97% of bunq’s user support activity, with over 82% fully automated. Average response times dropped to just 47 seconds, helping bunq deliver the real-time solutions users expect.
- The rapid deployment timeline highlights bunq’s focus on innovation. The team moved from concept to production in 3 months, starting in January 2025. bunq brought together a team of 80 people, from AI engineers to support staff, who worked together to test, learn, and deploy updates three times a day.
- Before implementing the orchestrator architecture, escalations were primarily manual processes. The new multi-agent system increased automation, transforming end-to-end support metrics. Beyond that, Finn expanded bunq’s reach by translating the application into 38 languages, making banking more accessible to millions of users across Europe.
- The solution enabled bunq to become Europe’s first AI-powered bank, offering capabilities no traditional support system could deliver: real-time speech-to-speech translation (a first in global banking), image recognition for receipt processing and document verification, and intelligent financial insights, all while maintaining the round-the-clock availability users demand.
“We went from concept to production in 3 months. Before the orchestrator architecture, escalations were primarily manual. Now Finn handles 97% of support with 70% fully automated and 47-second average response times.”
– Benjamin Kleppe, Machine Learning Engineering Lead at bunq.
Conclusion
bunq’s journey from manual support escalations to an intelligent multi-agent system shows how modern AI architecture can transform banking operations. By moving from a rigid router-based approach to a flexible orchestrator pattern with Amazon Bedrock, bunq avoided scalability bottlenecks while maintaining the agility needed to serve 20 million users across Europe.

The orchestrator pattern with agent-as-tool capabilities proved essential to bunq’s success. Rather than predicting every possible user scenario upfront, the system empowers primary agents to dynamically invoke specialized agents as needed. This architectural shift reduced complexity, accelerated development cycles, and helped bunq deploy updates three times per day during the initial rollout.

The results: 97% of support interactions handled by Finn, 70% fully automated, and average response times of just 47 seconds. Beyond efficiency gains, the solution expanded bunq’s reach to 38 languages and positioned the company as Europe’s first AI-powered bank. By freeing internal resources from manual processes, bunq can now focus on what it does best: building a bank that makes life easy for its users.
To learn more about building AI-powered applications with FMs, refer to Amazon Bedrock. Explore how Anthropic’s Claude on Amazon Bedrock can transform your customer experience with enhanced security features and scalability. Get started with the Amazon Bedrock documentation to build your own multi-agent solutions.
About the Authors
Benjamin Kleppe is Machine Learning Engineering Lead at bunq, where he leads the development and scaling of AI-powered solutions that make banking smarter and more personal for 20 million users across Europe. He focuses on building intelligent systems that enhance user experience, improve product discovery, and automate complex banking processes. Benjamin is passionate about pushing the boundaries of AI innovation in banking, having led bunq to become Europe’s first AI-powered bank with the launch of Finn, its proprietary generative AI platform.
Jagdeep Singh Soni is a Senior AI/ML Solutions Architect at AWS based in the Netherlands, specializing in generative AI and Amazon Bedrock. He helps customers and partners architect and implement intelligent agent solutions using Amazon Bedrock and other AWS AI/ML services. With 16 years of experience in innovation and cloud architecture, Jagdeep focuses on enabling organizations to build production-ready generative AI applications that use foundation models and agent frameworks for real-world business outcomes.
Guy Kfir is a generative AI Lead at AWS with over 15 years of experience in cloud technology sales, business development, and AI/ML evangelism. He works with enterprise customers, startups, and partners across EMEA to accelerate adoption of generative AI solutions and execute go-to-market strategies.


