About
AI is becoming embedded in every workflow, every application, and every decision engine. But as data grows exponentially, IT estates grow heavier, and moving data across clouds, regions, or environments has become slow, expensive, and risky. Latency, sovereignty requirements, egress fees, and unpredictable cloud bills make the old defaults unsustainable.
This roundtable proposes a pragmatic shift: keep data where it is governed and protected; move only the intelligence - models, features, metadata - to where value is created.
Who This Is For:
CIOs, CDOs, Heads of Infrastructure & Platforms, Enterprise Architects, and technology leaders operating large, distributed, or regulated data landscapes.
Key Discussion Themes:
- Data Gravity Today: where it creates real friction in AI delivery, from latency and egress to sovereignty and operational overhead.
- What to Move and What Not To: How leaders decide between keeping data in place, moving compute, or shipping models/features.
- Local Data, Distributed Intelligence: Experiences with running AI close to the source while avoiding unnecessary data movement.
- Cloud‑Right Placement in Practice: How organisations classify workloads and choose the best execution environment.
- AI Cost & Performance Management: Metrics and behaviours that help teams keep AI predictable, scalable, and efficient.
- Multicloud Without the Chaos: Approaches that keep environments governable, portable, and resilient.
This session is a peer-to-peer exchange: a practical discussion about what works, where the real thresholds lie, and how to build an AI-ready architecture without adding unnecessary complexity or cost.
What You’ll Take Away:
- Peer-tested ways to approach workload placement in an AI-heavy landscape.
- How others are aligning cost, performance, and sovereignty without slowing delivery.
- Practical metrics leaders use to keep AI predictable and accountable.
- Examples of patterns that work - and those that don't - in distributed data environments.