X. Design Week 2026: The AI Destination Intensive

Every industry event now features AI. X. Design Week 2026, in Brussels on 2–4 June, goes much further, gathering digital, creative and strategy teams around the principle that progress happens through doing. This is a dedicated, intensive and practical programme where DMOs make significant progress on their AI strategies through hands-on experimentation, high-level expertise and structured problem-solving. The entire event is built around a single objective: every destination leaves with a credible, implementable AI strategy and the capability to execute it.

Participants will gain a grounded understanding of where AI is headed and the specific steps required to prepare for the next three to five years, with practical frameworks and templates that are ready to use when you return to the office. Drawing on the shared challenges and collective intelligence of your peers, X. Design Week will build your confidence in AI governance by clarifying the legal implications, the policies required and how to tackle them. Rather than leaving with inspiration alone, you'll depart with the practical techniques and expertise needed to take immediate action.

XDW 2026 is structured across three days, each addressing a distinct layer of AI strategy. The programme begins by building solid internal foundations, moves on to sharpening competitive positioning and concludes by diving into real-world implementation. Instead of scattered breakout rooms, XDW is built around four distinct zones. Following impactful keynotes that set the agenda, attendees rotate between the zones, choosing the format that suits their learning style and immediate priorities.

  1. The Lab: This is where capability is built through doing, with time dedicated to hands-on testing of AI tools, guided experiments and live builds. Participants work through practical tasks, such as building MCP servers, running GEO audits, testing AI content workflows and experimenting with data visualisation. Every session produces an output you can take away and use.
  2. The Strategy Room: These facilitated planning sessions use proven frameworks and templates. Attendees map their organisation’s AI roadmap with expert support, working through prioritisation, resourcing and sequencing. Participants leave with a transformation playbook and a concrete action plan shaped by both specialist input and peer feedback.
  3. The Debating Room: Here, structured debates will track the collective position of attending DMOs on the most contested AI questions in tourism. Should destinations build AI natively or use external plug-ins? Is AI-generated content a brand risk or an opportunity? Where should the line sit between automation and human judgement? Pre- and post-debate polling tracks how the room’s thinking shifts over the course of each session.
  4. The Clinic: Here, your problems are addressed by experts and peers. Each cluster of challenges begins with a short expert briefing on the common patterns behind the submitted challenges, including their causes, potential solutions and common pitfalls. Then, working in small groups, attendees collaborate on their shared problems with facilitator support, drawing on the combined experience of peers facing similar issues in different contexts.
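As a flavour of the kind of Lab exercise involved, a GEO audit can start very simply: collect the answers an AI assistant gives to typical trip-planning prompts and measure how often your destination is actually named. A minimal sketch of that measurement step (the sample answers and destination names are illustrative, not drawn from the event programme):

```python
from collections import Counter

def mention_share(answers, destinations):
    """For each destination, return the share of AI-generated answers
    in which it is mentioned at least once."""
    counts = Counter()
    for text in answers:
        lowered = text.lower()
        for dest in destinations:
            if dest.lower() in lowered:
                counts[dest] += 1
    total = len(answers)
    return {dest: counts[dest] / total for dest in destinations}

# Illustrative answers, as if collected from an AI travel assistant
answers = [
    "For a city break, consider Brussels or Ghent for their food scenes.",
    "Bruges is ideal for a romantic weekend; Brussels suits culture trips.",
    "Antwerp and Ghent both offer excellent museums.",
]
shares = mention_share(answers, ["Brussels", "Ghent", "Bruges", "Antwerp"])
```

In practice an audit would run many prompts across several AI platforms, but even this crude share-of-voice figure makes visibility gaps concrete and trackable over time.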

Supercharging Internal Transformation

The first day focuses on what happens inside your organisation, building the capability, culture and governance that determine whether AI becomes embedded in how a DMO operates.

Closing the gap between experimentation and operational impact starts with identifying the processes where AI can deliver efficiency gains without requiring a complete overhaul. McKinsey’s State of AI in 2025 report found that the redesign of workflows around AI had the single biggest effect on achieving measurable financial impact. Yet, only around 6% of organisations qualify as high performers on this measure. The implication for DMOs is that simply adopting AI tools isn’t enough. The value sits in rethinking how work gets done, which processes to automate, which to augment and which to leave untouched. For agile destination teams juggling marketing, partnerships, research and stakeholder management, even targeted workflow automation can free up significant capacity for higher-value strategic work.

Source: McKinsey

However, these AI-enabled efficiency gains mean little without the right guardrails. MIT's 2025 State of AI in Business report identifies the scale of the ‘shadow AI economy’. While only 40% of companies reported purchasing an official LLM subscription, workers from over 90% of organisations regularly use personal AI tools for work tasks. With staff already experimenting with AI, the question is whether they're doing so within a framework that protects the organisation or if there is a policy vacuum where data handling, accuracy and brand consistency are left to individual judgement. This means that clear AI policies and guidelines are becoming increasingly necessary for ensuring that the experimentation already happening across every team is channelled productively and responsibly.

As AI becomes embedded across more organisational functions, questions about data handling, intellectual property, transparency and accountability multiply. In fact, the Boston Consulting Group predicts that AI agents will see 45% annual growth in business applications. In contrast, Deloitte’s State of AI in the Enterprise report identified that only one in five (21%) organisations is actually ready to govern autonomous AI systems. For DMOs, this governance gap carries particular weight given the management of publicly funded operations and responsibility for representing places and communities accurately. Moving forward without clear policies risks both compliance issues and reputational harm. The challenge of AI sovereignty compounds these governance pressures, with 77% of companies reporting that the geographic origin of AI tools plays a leading role in tool selection. For many smaller nations, such issues raise the barrier to ensuring responsible AI use and data protection compliance.

Source: Boston Consulting Group

In a similar vein, responsible AI usage also requires a strong focus on people and skills. The Deloitte survey identified insufficient worker skills as the single biggest barrier to integrating AI into existing workflows. Despite the buzz around AI, 84% of businesses have failed to restructure jobs or adapt their daily workflows to accommodate AI's potential. For DMOs, AI literacy needs to extend from leadership through to operations teams. Embedding capability across the organisation means moving beyond one-off training sessions toward sustained programmes that combine formal learning with hands-on experimentation and peer support. This is exactly the kind of sustained support that X. Design Week is built to provide. The destinations that treat AI upskilling as a change management challenge are the ones best positioned to make progress.

Such a focus on skills development means that DMOs can better leverage their years of accumulated insight, enabling AI to complement existing knowledge. Yet the question remains whether that knowledge is accessible in a form AI systems can work with, or whether it stays locked in documents and the memories of long-serving staff. McKinsey’s research found that 88% of organisations now use AI in at least one business function, yet only around one-third have begun scaling it across the enterprise. A significant part of that scaling challenge is structural, with organisations struggling to connect AI tools to the institutional knowledge that would make them useful. For DMOs, structuring internal knowledge so that AI can surface, connect and amplify it is a foundational step that determines whether AI tools produce generic outputs or genuinely useful, context-specific intelligence.

Source: McKinsey
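Making institutional knowledge machine-accessible can begin with something as modest as a keyword-scored lookup over internal documents, long before investing in a full retrieval pipeline. A stdlib-only sketch under that assumption (the document snippets are invented for illustration):

```python
def score(query, document):
    """Score a document by how many words it shares with the query."""
    q_words = set(query.lower().split())
    d_words = set(document.lower().split())
    return len(q_words & d_words)

def best_match(query, documents):
    """Return the internal document that best matches the query."""
    return max(documents, key=lambda doc: score(query, doc))

# Illustrative internal knowledge snippets a DMO might hold
docs = [
    "2024 visitor survey: coastal campaigns outperformed city campaigns",
    "Partnership playbook: onboarding steps for regional tourism boards",
    "Crisis comms protocol for weather-related disruption",
]
top = best_match("which campaigns performed best with visitors", docs)
```

The point is less the scoring method than the prerequisite it exposes: knowledge has to exist as retrievable text (not tacit memory) before any AI system, however capable, can surface it.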

Enhancing Competitive Positioning

With Day 1 focused on what happens inside your organisation, Day 2 prioritises how your destination is perceived by the world. AI is reshaping the relationship between destinations and their own brand narratives. This is particularly important given that AI interfaces are rapidly becoming the front door to travel planning. Gartner predicts that by 2026, traditional search engine volume will have declined by 25%. For DMOs whose digital strategies have been built around search engine optimisation, this represents a sizeable change.

This shift in behaviour is reshaping how destinations are discovered and chosen in real time. Adobe’s analysis of more than 8 million visits to US travel sites found that generative AI traffic grew by 3,500% year-on-year in July 2025, with 88% of travellers who used AI for trip planning reporting that it enhanced their experience. If a DMO isn’t actively positioning itself within AI systems through structured content and MCP servers, it risks becoming invisible at the point of decision. Generative Engine Optimisation has emerged in response and is already reshaping how visibility is measured and pursued, but remains in its early stages. Traditional metrics like click-through rates and keyword rankings are giving way to new questions. How does AI perceive your destination? What information does it surface, and crucially, what does it miss?

Source: Adobe
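One concrete form that "structured content" takes is schema.org markup embedded in a destination's pages as JSON-LD, which AI crawlers can parse far more reliably than free-form prose. A minimal sketch of generating such a block (the destination details are illustrative placeholders):

```python
import json

# A minimal schema.org JSON-LD block of the kind AI crawlers can parse;
# TouristDestination is a real schema.org type, the values are invented.
destination = {
    "@context": "https://schema.org",
    "@type": "TouristDestination",
    "name": "Example City",
    "description": "A historic city known for its markets and museums.",
    "touristType": ["Culture lover", "Foodie"],
}
jsonld = json.dumps(destination, indent=2)
```

Embedded in a page inside a `<script type="application/ld+json">` tag, this gives AI systems an unambiguous, machine-readable statement of what the destination is and who it is for.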

The shift is being accelerated by a wave of direct brand integrations inside the AI platforms travellers already use. This represents a fundamental change in how discovery and decision-making happen, collapsing what used to be a multi-step journey across multiple websites into a single conversational interface. At the same time, these integrations have exposed a serious trust gap. A 2025 Gartner survey of US consumers found that 53% remain sceptical of AI-powered search results, with 61% wanting a simple ‘off switch’ for AI summaries altogether. This creates a credibility challenge as audiences question whether content has been generated by AI, and whether it is accurate and reliable.

A parallel trust gap exists around destination imagery. It cuts in both directions: audiences question whether the content they encounter has been produced by AI, while authentic imagery is frequently mistaken for being AI-created. In fact, a Clutch survey found that 57% of people can’t accurately spot an AI-generated photo, despite 66% being confident they could. For destinations, the stakes are particularly high as imagery and storytelling carry the weight of brand identity. With the provenance of creative work increasingly unclear, AI transparency is now an essential pillar for protecting brand trust. This is particularly important given that Sojern's State of Destination Marketing 2026 report highlights that 66% of DMOs have integrated AI in content creation.

Maintaining public integrity is vital, but it’s just as important to ensure the data DMOs already hold can be turned into intelligence that drives targeting and promotion. DMOs sit on rich data, including sentiment analysis and market intelligence. AI has the capacity to connect those sources, identify patterns that human analysis might miss and surface insights that directly inform where to invest promotional spend, which markets to target and which stories to tell. Destinations that want AI to deliver strategic value need to break down data silos and build the infrastructure that connects their sources. Without this foundation, even the most capable AI tools produce shallow, generic outputs that fail to reflect the specificity of a destination’s context. Sojern's research shows significant progress in this area, with AI's application in data analytics rapidly increasing from 28% to 51% in a single year.

Source: Sojern
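What "connecting siloed sources" looks like in miniature: joining sentiment data and campaign spend by market to flag where strong audience sentiment is not yet matched by investment. A sketch with entirely illustrative figures and market codes:

```python
# Two siloed sources, keyed by market code; all figures are invented.
sentiment = {"DE": 0.82, "FR": 0.74, "NL": 0.91, "UK": 0.63}     # avg positive share
spend = {"DE": 120_000, "FR": 95_000, "NL": 30_000, "UK": 140_000}  # EUR per year

def underinvested(sentiment, spend, min_sentiment=0.8):
    """Markets with strong sentiment but below-median spend:
    candidates for increased promotional investment."""
    median = sorted(spend.values())[len(spend) // 2]
    return [m for m in sentiment
            if sentiment[m] >= min_sentiment and spend[m] < median]

candidates = underinvested(sentiment, spend)
```

The join itself is trivial once both sources share a common key; the hard work, and the infrastructure investment the paragraph above describes, is getting siloed datasets into that joinable shape in the first place.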

The Implementation Lab

The final day of X. Design Week is dedicated to hands-on AI use case sessions. The rationale for focusing strongly on implementation is grounded in evidence. MIT’s research found that 95% of custom generative AI pilots fail to deliver measurable business impact because organisations struggle to move beyond the experimentation stage. The core barriers are familiar: resistance to new tools and misaligned expectations. Supported by expert guidance, participants will arrive with a specific challenge and leave with a working approach.

Source: MIT

For DMOs, where budgets are tight, learning from both best practices and documented failures is what drives innovation forward. The Implementation Lab is structured to address this directly, operating as a working session. Attendees select specific use cases relevant to their organisation and work through the practicalities with expert support: scoping, sequencing, identifying dependencies and stress-testing assumptions. Collective experience is one of the most valuable resources in the room, with the sharpest insights often coming from candid accounts of where things went wrong and why. By creating the conditions for those conversations, X. Design Week turns individual experience into shared intelligence that helps destinations avoid repeating the same mistakes.

XDW 2026 is built for DMOs that have decided to act and need the tools, frameworks and peer support to use AI with confidence. Beyond the programme itself, every participant takes home two resources designed for ongoing use:

  1. The XDW 2026 Insights Report acts as an actionable reference built from the collective output of the room, synthesising the discussion outcomes and strategic recommendations generated across the three days.
  2. The Use Cases Library functions as a searchable database of AI implementations from destinations worldwide, providing a living resource of what DMOs are doing with AI, how they are approaching it and the results they are seeing.

