MCP introduces an open standard for connecting AI systems with data sources and tools, replacing fragmented integrations with a single protocol. OpenAI's Apps SDK demonstrates how standardised software toolkits will shape future AI developments.
OpenAI's DevDay conference on 6 October 2025 marked a pivotal moment in AI's evolution, transforming AI from a conversational tool to an all-in-one interface. The launch of OpenAI's Apps SDK, a Software Development Kit (SDK) that provides developers with pre-built code libraries, tools and documentation for integrating applications directly within ChatGPT conversations, perfectly demonstrates how the leading tech giants increasingly expect AI to be used as a multifunctional tool. Rather than building integrations from scratch, developers can use this standardised toolkit to connect their services to ChatGPT, dramatically reducing development time and complexity. With the launch of Apps SDK, users can now access interactive applications from companies including Booking.com, Expedia and Spotify without leaving the conversation.
Apps SDK is laying the foundations for ChatGPT to transform from a recommendation engine into a travel superapp capable of handling the entire journey from inspiration to booking. With external websites integrated directly into conversations, users no longer need to copy details from a chat into a separate booking site. This is a significant move with the potential to fundamentally shift how people discover and purchase travel. Instead of keyword-based searches that return lists of links, Apps SDK enables conversational experiences in which options are progressively refined whilst full context is maintained.
The Apps SDK gives third-party developers the full stack: connecting data, triggering actions and rendering fully interactive user interfaces. Users can invoke apps by mentioning their name at the start of a chat, or ChatGPT may even suggest relevant apps during the conversation. This conversational approach dissolves the boundaries between applications, creating experiences that feel natural rather than fragmented across disconnected platforms.
What makes this significant isn't just the feature itself, but the infrastructure powering it. The Apps SDK is built on the open standard Model Context Protocol (MCP), which was introduced by rival Anthropic nearly a year ago. By adopting a competitor's protocol rather than creating its own proprietary system, OpenAI has signalled that interoperability matters more than platform lock-in. This choice is likely to define the next decade of AI development, with MCP acting as the universal connector for AI systems.

Source: Model Context Protocol (MCP)
The tourism sector illustrates why this transformation resonates with the public. Approximately 40% of travellers worldwide have now used AI-based tools for planning trips, with one-third of US travellers using AI assistants extensively to research, plan and book trips. In the UK, holidaymakers increased their use of AI-powered travel platforms by 61% in 2025, suggesting we've crossed a threshold where AI assistance is becoming the preferred method rather than an alternative approach. When someone asks "Find me a boutique hotel in Lisbon with excellent reviews under £150 per night and book it for next weekend", the traditional workflow of opening multiple tabs, comparing prices across sites, reading reviews and checking availability suddenly feels like unnecessary friction.
Behind these seamless experiences lies a technical challenge, one that MCP elegantly addresses. Traditionally, connecting AI applications to external tools required custom integrations for every pairing. With five AI applications and ten external tools, you’d need fifty separate integrations, each requiring bespoke code, ongoing maintenance and updates whenever either side changed. This complexity made advanced AI solutions economically unviable for all but the largest organisations.
MCP introduces a universal, open standard for connecting AI systems with data sources and tools, replacing fragmented integrations with a single protocol. The protocol works through a client-server architecture, where AI applications function as MCP clients, whilst external tools and data sources operate as MCP servers. Instead of fifty integrations, only five AI clients and ten tool servers are needed; just fifteen connections in total. This simplification transforms what was once a prohibitive engineering challenge into an achievable implementation.

Source: SalesforceDevops (Humanloop)
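The integration arithmetic above can be sketched directly: with N AI clients and M tools, point-to-point pairing requires N × M bespoke integrations, whilst a shared protocol requires only N + M implementations.

```python
# Point-to-point pairing vs. a shared protocol such as MCP.

def pairwise_integrations(clients: int, tools: int) -> int:
    """Every AI application needs bespoke code for every tool."""
    return clients * tools

def protocol_integrations(clients: int, tools: int) -> int:
    """Each side implements the protocol once: one client adapter
    per AI application plus one server per tool."""
    return clients + tools

if __name__ == "__main__":
    clients, tools = 5, 10
    print(pairwise_integrations(clients, tools))   # 50 bespoke integrations
    print(protocol_integrations(clients, tools))   # 15 protocol implementations
```

The gap widens as the ecosystem grows: doubling both sides quadruples the pairwise count but merely doubles the protocol count.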
The protocol defines three core primitives that govern how an AI system works with external resources. Tools are actions the AI can perform, such as looking up information in a database, sending an email or updating a record. Resources are data sources a server can expose to the AI, such as documents, databases or links to online services that provide context. Prompts are predefined messages or templates that guide the AI's conversations, ensuring it behaves consistently across different tasks. Together, these three primitives form a flexible structure for building sophisticated AI integrations.
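As a purely illustrative sketch (the class names, fields and example values below are simplified stand-ins, not the official MCP schema), a hypothetical destination server might describe its three primitives like this:

```python
from dataclasses import dataclass, field

# Simplified sketch of MCP's three primitives -- illustrative only,
# not the official schema from the MCP specification.

@dataclass
class Tool:
    """An action the AI can invoke, e.g. a database lookup."""
    name: str
    description: str
    input_schema: dict = field(default_factory=dict)  # JSON Schema for arguments

@dataclass
class Resource:
    """A data source the server exposes for context, e.g. a document."""
    uri: str
    name: str
    mime_type: str = "text/plain"

@dataclass
class Prompt:
    """A reusable template that standardises how a task is framed."""
    name: str
    description: str
    template: str

# A hypothetical DMO server might expose:
availability = Tool(
    name="check_availability",
    description="Check room availability for a hotel and date range",
    input_schema={"type": "object",
                  "properties": {"hotel_id": {"type": "string"},
                                 "check_in": {"type": "string"}}},
)
guide = Resource(uri="docs://lisbon/guide", name="Lisbon city guide")
planner = Prompt(name="plan_trip",
                 description="Frame a trip-planning request consistently",
                 template="Plan a trip to {destination} for {nights} nights.")
```

The point of the separation is that the AI client decides *when* to call a tool, *which* resources to read and *which* prompt to apply, whilst the server merely advertises what is available.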
The design of MCP is smart because it builds on ideas that already work well rather than inventing entirely new ones. The protocol borrows key concepts from the Language Server Protocol (LSP), the de facto standard for connecting programming languages with software development tools. LSP is what lets a code editor offer features like autocomplete, error checking and refactoring for dozens of programming languages without a custom connection for every single one. By adapting LSP's design for AI systems, MCP prioritises interoperability over proprietary advantage, reducing platform risk for businesses investing in AI integration whilst accelerating ecosystem development.

Source: Language Server Protocol (LSP)
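One concrete inheritance from LSP is the wire format: both protocols frame communication as JSON-RPC 2.0 messages. The sketch below shows just the envelope; the `tools/list` method name is drawn from the MCP specification, but transport and error handling are omitted here.

```python
import json

# Minimal JSON-RPC 2.0 request envelope, the framing shared by LSP and MCP.
# Transport (stdio, HTTP) and error handling are deliberately left out.

def make_request(request_id: int, method: str, params: dict) -> str:
    """Build a JSON-RPC 2.0 request as a JSON string."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": method,
        "params": params,
    })

# An MCP client asking a server what tools it offers:
msg = make_request(1, "tools/list", {})
print(msg)
```

Because the envelope is identical across methods, adding a new capability to a server never requires changing how messages are framed, only registering a new method.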
For DMOs, the growing popularity of MCP raises a strategic question that will shape digital investment decisions for years to come. Should DMOs continue developing proprietary AI tools in isolation, or should they prioritise integration with the large language models where travellers are already spending time?
The traditional approach of building bespoke chatbots and trip planning tools made sense when AI capabilities were fragmented and LLMs hadn't achieved mainstream adoption. Today, building an MCP integration allows DMOs to surface destination information more prominently within the recommendations of mainstream LLMs, meeting travellers where they already are rather than hoping they'll visit another platform.
Whilst MCP establishes how AI systems connect to external tools and data sources, it doesn't address the critical question of how transactions actually complete. An AI assistant might retrieve product information, check availability and compare prices through MCP connections, but executing a purchase requires a different kind of protocol. This additional protocol handles payment credentials, merchant authorisation and transaction security.
This is where the Agentic Commerce Protocol (ACP) enters the picture. Developed jointly by OpenAI and Stripe, ACP provides the language that lets buyers, their AI agents and businesses work together to complete purchases. The protocol defines a standardised checkout API that businesses implement once, enabling them to accept orders from any ACP-compatible AI agent. This separation of concerns proves crucial.
When a user decides to purchase, the AI agent constructs a structured checkout request containing product details, quantities and delivery preferences. This request flows to the company's backend system via the ACP specification. The merchant validates the order against current inventory and business rules, then either accepts or declines it. Payment processing happens through the company's existing payment provider, with the protocol supporting multiple payment methods and processors.
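The merchant-side step of this flow can be sketched in miniature. Every name, field and stock figure below is a hypothetical illustration of the steps just described, not the actual ACP wire format; real payment processing sits with the merchant's payment provider and is not shown.

```python
from dataclasses import dataclass

# Illustrative sketch of merchant-side order validation in an
# agentic checkout flow. Hypothetical names and data throughout;
# not the ACP specification's schema.

@dataclass
class CheckoutRequest:
    product_id: str
    quantity: int
    delivery: str  # e.g. "standard" or "express"

INVENTORY = {"lisbon-food-tour": 4}  # hypothetical stock levels

def validate_order(req: CheckoutRequest) -> dict:
    """Check the order against current inventory and business rules,
    then accept or decline. Payment happens afterwards, via the
    merchant's existing payment provider."""
    stock = INVENTORY.get(req.product_id, 0)
    if req.quantity <= 0:
        return {"status": "declined", "reason": "invalid quantity"}
    if stock < req.quantity:
        return {"status": "declined", "reason": "insufficient stock"}
    return {"status": "accepted", "remaining": stock - req.quantity}

result = validate_order(CheckoutRequest("lisbon-food-tour", 2, "standard"))
print(result)  # {'status': 'accepted', 'remaining': 2}
```

The design point is that the decision to accept or decline never leaves the merchant, even though the request originated inside an AI conversation.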
Throughout this flow, the business retains full control over pricing, availability and fulfilment, whilst the AI agent manages the conversational interface and user experience. As an open standard, businesses not processing with Stripe are also able to adopt ACP with their existing payment providers, while for those processing payments with Stripe, enabling agentic payments requires updating as little as one line of code.
Whilst travellers cannot yet move seamlessly from research to completed booking without switching contexts, the path has already been set. In late September 2025, OpenAI launched Instant Checkout, enabling purchases directly within ChatGPT. US ChatGPT users can now buy directly from US Etsy sellers within chats, with over one million Shopify merchants to follow.

Source: ChatGPT
With AI-enabled transactions now a reality, DMOs face a critical challenge in ensuring that their destination's entire visitor experience ecosystem is ready for this transformation. As consumers grow familiar with purchasing products directly through AI, they will become more open to making high-value purchases this way. When visitors discover experiences through conversational AI and expect seamless booking, experience providers with poor digital infrastructure will ultimately become invisible.
This technological advancement means that DMOs will increasingly face critical decisions about whether to expand their remit towards more actively supporting their industry's digital maturity. This means facilitating access to digital booking platforms and helping SMEs implement the structured data and APIs that enable AI integration. Destinations whose SMEs maintain a strong digital presence will benefit from increased visibility in AI platforms, higher conversion rates from inspiration to booking and reduced friction throughout the visitor journey. Visitors increasingly judge destinations not just by their experiences but by how effortlessly those experiences can be discovered and booked. Poor digital infrastructure signals poor experience quality, even when that perception proves entirely inaccurate.
OpenAI's implementation of MCP and ACP demonstrates the visitor-facing potential of these standards. But how do they translate to enterprise environments where security, data governance and operational complexity multiply? Anthropic's approach with Claude Integrations provides the answer, showing how the same foundational standards enable differentiated experiences tailored to organisational needs.
Anthropic announced Integrations in May 2025, allowing Claude to work seamlessly with remote MCP servers across web and desktop applications. Users can connect Claude to external platforms, such as Intercom and Asana, and authenticate through third-party services. All data transfers are encrypted and access permissions are enforced at the user level. This additional layer of security proves essential for enterprise adoption.
The efficiency gains compound when LLMs maintain context across an organisation's entire tech stack. When tools connect to Claude, it gains deep context about work, including project histories, task statuses and organisational knowledge. Claude's extended thinking capabilities improve dramatically when given access to relevant project documentation, historical decisions, team workflows and operational data. Rather than staff manually gathering information from disparate systems before making recommendations, MCP allows that context to be synthesised automatically.
For DMOs managing complex stakeholder relationships, content production schedules and multi-channel campaigns, this contextual awareness transforms productivity. With this strategic use of AI, the automation of individual tasks becomes an institutional memory that helps teams make better-informed decisions faster, whilst reducing the cognitive overhead of tracking dependencies across systems.

Source: Runloop (Humanloop)
This enterprise implementation reveals the true power of open standards. MCP handles the data connections and tool integrations. ACP manages the commerce layer when transactions occur. Yet organisations can implement these protocols in ways that match their specific security requirements, operational workflows and customer experience goals. This flexibility is a key reason why generative AI is so often hailed as a route to significantly greater operational efficiency.
The parallel with Apple's iOS ecosystem illuminates why MCP and ACP represent inflection points in AI's development. When Apple introduced its own standardised SDK for mobile app development, individual devices transformed into platforms supporting millions of applications. Developers built once and reached Apple's entire user base, who benefited from exponentially expanding functionality. An entire economy emerged around those standards.
MCP and ACP provide comparable infrastructure for AI applications, with the crucial advantage of genuinely being open standards. A single MCP integration potentially provides access across multiple AI platforms as they emerge. Implementing ACP once enables commerce across various AI agents. Businesses maintain customer relationships and operational control whilst accessing new distribution channels that reach customers in the very conversational interfaces where they're already spending time. This ensures conversations continue to feel natural and engaging.
The convergence of rising AI usage, standardised integration protocols and embedded commerce capabilities signals that we're witnessing the formation of a new AI-enabled digital ecosystem. The standards exist. The user base has reached critical mass. The technical barriers have been dismantled. What remains is vision: recognising that MCP and ACP aren't merely technical specifications but foundational building blocks for how commerce will function in an AI-mediated world.
For decades, technology markets have been defined by platform wars. Closed ecosystems competed for dominance, with businesses and developers forced to choose sides or maintain costly multi-platform strategies. The decision by leading AI companies to embrace open standards suggests a different trajectory. In fact, the launch of OpenAI's Apps SDK using MCP represents a watershed moment for the AI industry.
This matters because AI's potential impact dwarfs previous platform shifts. Mobile computing transformed how billions of people communicate, work and access services. AI promises to fundamentally reshape how we interact with information, make decisions and conduct business. If that transformation occurs within closed platforms controlled by individual companies, innovation will be constrained by whatever those companies choose to enable. Their business interests will inevitably shape which use cases receive support, which integrations are permitted and how value is distributed across the ecosystem.
Open standards create fundamentally different dynamics. When MCP defines how AI systems connect to external tools, any developer can build compatible servers without requiring permission or partnership agreements. When ACP specifies how commerce transactions flow through AI agents, merchants can implement it regardless of their size or existing technology partnerships. These standards lower barriers to entry whilst accelerating innovation across the entire ecosystem. Startups can compete with established players on capability rather than platform access and specialised solutions can emerge for niche use cases that large platforms would never prioritise.
The Language Server Protocol's success provides a template for what is possible. Today, developers can use obscure domain-specific languages in mainstream editors, with the entire developer community benefiting from this shared infrastructure. MCP and ACP have the potential to do the same for AI applications. That is why the embrace of open standards in AI represents a clear choice about the AI future we're building.
Together, these protocols form a layered infrastructure where each part solves a specific problem while still being able to work smoothly with the others. This setup allows for permissionless innovation, meaning developers can build and deploy new AI tools without needing specific approval from the creators of the underlying platform. Such an approach means that AI tools will become increasingly integrated into daily life in a much more native way, with endless opportunities for enhancing visitor experiences and engagement and internal operational processes.