In this post, we show you how to build an Amazon Bedrock agent that uses MCP to access data sources so you can quickly build generative AI applications. With Amazon Bedrock Agents, your agent can be assembled on the fly with MCP-based tools, as in the following example.
We showcase an example of building an agent to understand your Amazon Web Services (AWS) spend by connecting to AWS Cost Explorer, Amazon CloudWatch, and Perplexity AI through MCP. You can use the code referenced in this post to connect your agents to other MCP servers to address challenges for your business. We envision a world where agents have access to an ever-growing list of MCP servers that they can use to accomplish a wide variety of tasks.
We provide a step-by-step guide on how to connect your favorite MCP servers with Amazon Bedrock Agents as action groups that an agent can use to accomplish tasks provided by the user. We also introduce the Amazon Bedrock Inline Agent SDK, which streamlines the process of invoking inline agents by managing the complex workflow orchestration. The SDK is also packaged with a built-in MCP client implementation that gives you direct access to tools delivered by an MCP server. Without this SDK, developers must write and maintain custom code for:
• Parsing response streams
• Handling return control flows
• Managing state between agent interactions
• Coordinating API calls
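With the SDK, that plumbing collapses into a few declarative calls. The following is a minimal sketch of assembling an inline agent with an MCP-based action group, assuming the InlineAgent package from the Amazon Bedrock Agent Samples repository; the class names, parameters, and the server launch command are assumptions based on that repository and may differ in your version, so treat this as an illustration rather than a verbatim recipe.

```python
import asyncio

from mcp import StdioServerParameters

# These imports assume the InlineAgent SDK from the
# amazon-bedrock-agent-samples repository; adjust to your installed version.
from InlineAgent.agent import InlineAgent
from InlineAgent.action_group import ActionGroup
from InlineAgent.tools import MCPStdio


async def main():
    # Launch a local cost analysis MCP server over stdio.
    # The command and args are placeholders for however you start your server.
    cost_server_params = StdioServerParameters(
        command="python",
        args=["cost_explorer_mcp_server.py"],
    )
    cost_client = await MCPStdio.create(server_params=cost_server_params)

    try:
        # Expose the MCP server's tools to the agent as an action group.
        cost_action_group = ActionGroup(
            name="CostActionGroup",
            mcp_clients=[cost_client],
        )

        # Assemble the inline agent on the fly and invoke it.
        await InlineAgent(
            foundation_model="us.anthropic.claude-3-5-sonnet-20241022-v2:0",
            instruction="You are an assistant that helps analyze AWS spend.",
            agent_name="cost_agent",
            action_groups=[cost_action_group],
        ).invoke(input_text="What was my Amazon Bedrock spend last month?")
    finally:
        # Release the stdio connection to the server (method name may vary).
        await cost_client.cleanup()


if __name__ == "__main__":
    asyncio.run(main())
```

Because the model, instructions, and tools all travel with the invocation, the agent really is assembled on the fly rather than pre-provisioned.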
As part of creating an agent, the developer creates a dedicated MCP client for each MCP server the agent needs to communicate with. When invoked, the agent determines which tools are needed for the user’s task; if MCP server tools are required, it uses the corresponding MCP client to request tool execution from that server.
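For reference, this is roughly what one such client looks like when built directly on the MCP Python SDK (the mcp package). The server launch command and the get_cost_and_usage tool name are placeholders for illustration; only the ClientSession and stdio_client usage reflects the SDK itself.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Placeholder launch command for a local MCP server; replace with your own.
server_params = StdioServerParameters(
    command="python",
    args=["cost_explorer_mcp_server.py"],
)


async def main():
    # One client <-> one server: open a stdio transport and an MCP session.
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools this server exposes so the agent can advertise them.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # When the agent asks for a tool, the client forwards the call.
            result = await session.call_tool(
                "get_cost_and_usage",  # hypothetical tool name
                arguments={"days": 30},
            )
            print(result.content)


if __name__ == "__main__":
    asyncio.run(main())
```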
To orchestrate this workflow, you take advantage of the return control capability of Amazon Bedrock Agents. The following diagram illustrates the end-to-end flow of an agent handling a request that uses two tools. In the first flow, the agent invokes a Lambda-based action; in the second, it uses an MCP server.
Host: This is the Amazon Bedrock inline agent. This agent adds MCP clients as action groups that can be invoked through RETURN_CONTROL when the user asks an AWS spend-related question (see the return-control sketch after this component list).
Clients: You create two clients that establish one-to-one connections with their respective servers: a Cost Explorer client with specific cost server parameters and a Perplexity AI client with Perplexity server parameters.
Servers: You create two MCP servers that each run locally on your machine and communicate with your application over standard input/output (alternatively, you could also configure the client to talk to remote MCP servers):
• An MCP server that retrieves the AWS spend data from Cost Explorer and Amazon CloudWatch Logs (for Amazon Bedrock model invocation log data).
• A Perplexity AI MCP server that interprets the AWS spend data.
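To make the return-control handoff concrete, here is a sketch of the loop that the Inline Agent SDK manages for you, written against the boto3 bedrock-agent-runtime client's InvokeInlineAgent API. The event and field names follow that API as we understand it, so verify them against the current API reference; call_mcp_tool and the get_cost_and_usage function are hypothetical placeholders for dispatching the request to the appropriate MCP client.

```python
import json
import uuid

import boto3

# Hypothetical helper: routes a tool call to the MCP client for the right
# server and returns its output. In the real SDK this dispatching is built in.
from my_mcp_clients import call_mcp_tool  # placeholder module

client = boto3.client("bedrock-agent-runtime")

# Action group whose tools are executed by our code via RETURN_CONTROL,
# not by a Lambda function. The function schema mirrors the MCP server's tools.
action_groups = [
    {
        "actionGroupName": "CostActionGroup",
        "actionGroupExecutor": {"customControl": "RETURN_CONTROL"},
        "functionSchema": {
            "functions": [
                {
                    "name": "get_cost_and_usage",  # hypothetical tool name
                    "description": "Retrieve AWS spend data",
                    "parameters": {"days": {"type": "integer", "required": False}},
                }
            ]
        },
    }
]

request = {
    "sessionId": str(uuid.uuid4()),
    "foundationModel": "us.anthropic.claude-3-5-sonnet-20241022-v2:0",
    "instruction": "You are an assistant that helps analyze AWS spend.",
    "actionGroups": action_groups,
    "inputText": "What was my Amazon Bedrock spend last month?",
}

while True:
    response = client.invoke_inline_agent(**request)
    return_control = None

    for event in response["completion"]:
        if "chunk" in event:
            print(event["chunk"]["bytes"].decode(), end="")
        elif "returnControl" in event:
            return_control = event["returnControl"]

    if return_control is None:
        break  # final answer streamed; no tool execution requested

    # The agent paused and handed us a tool call; run it against the MCP server.
    results = []
    for inv in return_control["invocationInputs"]:
        fn = inv["functionInvocationInput"]
        params = {p["name"]: p["value"] for p in fn.get("parameters", [])}
        output = call_mcp_tool(fn["actionGroup"], fn["function"], params)
        results.append(
            {
                "functionResult": {
                    "actionGroup": fn["actionGroup"],
                    "function": fn["function"],
                    "responseBody": {"TEXT": {"body": json.dumps(output)}},
                }
            }
        )

    # Send the tool results back so the agent can continue reasoning.
    request["inlineSessionState"] = {
        "invocationId": return_control["invocationId"],
        "returnControlInvocationResults": results,
    }
    request.pop("inputText", None)
```

Note that every continuation call carries the full inline agent definition again, because nothing about the agent is persisted between invocations.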
The MCP servers run locally on your computer and need to access AWS services and the Perplexity API. You can read more about AWS credentials in Manage access keys for IAM users. Make sure that your credentials include AWS Identity and Access Management (IAM) read access to Cost Explorer and CloudWatch. You can do this by attaching the AWSBillingReadOnlyAccess and CloudWatchReadOnlyAccess IAM managed policies. You can get the Perplexity API key from the Perplexity Sonar API page.
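Because the servers run as separate local processes, one way to hand them these credentials is through environment variables on their launch parameters. This is only a sketch: the PERPLEXITY_API_KEY variable name and the launch commands are assumptions for illustration; use whatever your chosen MCP servers document.

```python
import os

from mcp import StdioServerParameters

# Pass AWS credentials and the Perplexity API key to the local MCP servers
# through their environment. Variable names and commands are illustrative.
cost_server_params = StdioServerParameters(
    command="python",
    args=["cost_explorer_mcp_server.py"],
    env={
        "AWS_ACCESS_KEY_ID": os.environ["AWS_ACCESS_KEY_ID"],
        "AWS_SECRET_ACCESS_KEY": os.environ["AWS_SECRET_ACCESS_KEY"],
        "AWS_REGION": os.environ.get("AWS_REGION", "us-east-1"),
    },
)

perplexity_server_params = StdioServerParameters(
    command="python",
    args=["perplexity_mcp_server.py"],
    env={"PERPLEXITY_API_KEY": os.environ["PERPLEXITY_API_KEY"]},
)
```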
Note: While this post demonstrates the use of AWS access keys for simplicity and to focus on the MCP use case, in production environments we strongly recommend using IAM roles instead of long-term access keys. If you must use access keys, follow these security best practices: never share or expose your access keys publicly (including in code repositories); implement the principle of least privilege by scoping IAM policies to only the required permissions; and rotate your access keys regularly (every 90 days is recommended). It is also important to use AWS CloudTrail to monitor access key usage and to consider using temporary credentials through AWS Security Token Service (AWS STS) when possible.
Organizations can now offer their teams natural, conversational access to complex financial data while enhancing responses with contextual intelligence from sources like Perplexity. As AI continues to evolve, the ability to securely connect models to your organization’s critical systems will become increasingly valuable. Whether you’re looking to transform customer service, streamline operations, or gain deeper business insights, the Amazon Bedrock and MCP integration provides a flexible foundation for your next AI innovation. You can dive deeper into this MCP integration by exploring our code samples.