Hourglass Model

The Hourglass Model is a conceptual framework that illustrates how different layers of digital capabilities and corresponding stakeholder groups interact to create intelligent, interoperable, and scalable digital ecosystems. Named for its distinctive shape — wide at the top and bottom with a narrow central layer — this model is especially relevant for complex systems such as smart cities, mobility platforms, industrial automation, and digital twins.

It shows how data flows from the physical world (sensors and devices) through enabling platforms and infrastructure, and ultimately up to applications and user interfaces. Each layer is structured to highlight a specific type of stakeholder, representing those responsible for building, enabling, or governing that part of the digital system. Opposite them are the key capabilities: the technological functions that make the system work.

Together, these dual aspects of the model help visualise the alignment between people, processes, and platforms. The five stakeholder groups are introduced first, from the top of the hourglass to its base, followed by a closer look at the capability layers they support.

Application developers are the professionals who design and develop end-user digital solutions. They focus on translating business or societal needs into practical applications, such as mobile apps, dashboards, or user platforms. Their role bridges technology and usability, often requiring collaboration with UI/UX experts and analytics teams to ensure solutions are both functional and user-friendly.

This group includes system integrators and software teams that connect different services and technologies. They ensure interoperability across tools, platforms, and data sources by leveraging APIs, middleware, and integration frameworks. Their work allows different components of the digital ecosystem to “talk” to each other securely and efficiently.
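As a rough sketch of the glue code this group writes, the example below reads a record from one hypothetical REST endpoint, translates its field names, and forwards it to another; the URLs, the field mapping, and the choice of the `requests` library are assumptions for illustration, not part of the model.

```python
# Minimal integration sketch: fetch from one service, map fields, forward to another.
# The two endpoints and the field mapping are hypothetical placeholders.
import requests

SOURCE_URL = "https://sensors.example.org/api/latest"    # hypothetical source API
TARGET_URL = "https://platform.example.org/api/ingest"   # hypothetical target API

FIELD_MAP = {"temp_c": "temperature", "ts": "timestamp"}  # source field -> target field


def bridge_once() -> None:
    """Read one record from the source API and push it to the target API."""
    record = requests.get(SOURCE_URL, timeout=10).json()

    # Translate the source schema into the schema the target platform expects.
    payload = {target: record[source] for source, target in FIELD_MAP.items()}

    response = requests.post(TARGET_URL, json=payload, timeout=10)
    response.raise_for_status()


if __name__ == "__main__":
    bridge_once()
```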

Platform and tool providers are the organisations that build and maintain the core platforms, data spaces, and software environments on which digital solutions run. They provide the backbone of the ecosystem, offering scalable, standardised infrastructure that supports everything from data sharing to analytics and application development.

Infrastructure service providers manage the orchestration and coordination of cloud, edge, and core services. They ensure digital processes are responsive, resilient, and securely distributed across computing environments. These stakeholders also enable digital twins, federated systems, and cross-network collaboration at scale.

At the foundation of the model, infrastructure providers design, deploy, and manage the physical devices, sensors, and connectivity infrastructure needed to collect real-world data. They include manufacturers of IoT devices, robotics, and drones, as well as the network providers who make real-time data acquisition and transmission possible.

This top layer represents the front-end applications and services that users interact with directly. Application developers create solutions such as mobile apps, web platforms, and control dashboards by integrating data and functionalities provided by the lower layers. These interfaces must be user-friendly, intelligent, and responsive. Developers rely on standardised APIs and user experience (UX) principles to ensure seamless operation. They serve as the final bridge between digital systems and human users—transforming complex backend data into usable services.
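A minimal sketch of this application layer is shown below: a small web endpoint (using Flask purely for illustration) that takes readings a lower layer would supply and turns them into a user-facing summary. The route name, data shape, and alert logic are hypothetical assumptions, not a prescribed design.

```python
# Minimal application-layer sketch: a web endpoint that turns backend data into a
# user-facing view. Flask and the example data shape are assumptions for illustration.
from flask import Flask, jsonify

app = Flask(__name__)


def fetch_latest_readings() -> list[dict]:
    """Stand-in for a call to a lower-layer data API (stubbed with sample data)."""
    return [
        {"station": "north", "pm25": 12.4},
        {"station": "south", "pm25": 18.9},
    ]


@app.route("/dashboard")
def dashboard():
    """Summarise raw readings into something a user can act on."""
    readings = fetch_latest_readings()
    worst = max(readings, key=lambda r: r["pm25"])
    return jsonify({
        "stations": readings,
        "alert": f"Highest PM2.5 at station '{worst['station']}' ({worst['pm25']} µg/m³)",
    })


if __name__ == "__main__":
    app.run(port=8000)
```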

This layer handles the interpretation and analysis of data, converting raw input into actionable insights. Dataspace stakeholders are typically organisations that manage access to and governance of data—ensuring it is shared securely, ethically, and legally across systems. They enable powerful analytical capabilities, including:

  • Artificial Intelligence (AI) and Generative AI, which can automate decision-making and provide predictions.
  • Digital Twins, which simulate physical environments or systems (see the sketch after this list).
  • Swarm Intelligence, which enables decentralised coordination, especially in robotics and autonomous agents.
These tools form the brain of digital systems, powering automation, simulation, and deep insights.
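To make the digital-twin idea concrete, the sketch below mirrors the state of a hypothetical industrial pump and runs a crude what-if simulation. The asset, its fields, and the linear cooling model are illustrative assumptions only.

```python
# Minimal digital-twin sketch: a software object that mirrors the last known state of a
# physical asset and can run a simple "what if" simulation. The pump, its fields, and
# the cooling model are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class PumpTwin:
    """Digital counterpart of a (hypothetical) industrial pump."""
    pump_id: str
    temperature_c: float = 20.0
    rpm: float = 0.0

    def update_from_telemetry(self, telemetry: dict) -> None:
        """Mirror the latest measurements reported by the physical pump."""
        self.temperature_c = telemetry.get("temperature_c", self.temperature_c)
        self.rpm = telemetry.get("rpm", self.rpm)

    def simulate_shutdown(self, minutes: int, cooling_per_minute: float = 0.8) -> float:
        """Predict the temperature after a shutdown, using a crude linear cooling model."""
        ambient = 20.0
        predicted = self.temperature_c - cooling_per_minute * minutes
        return max(predicted, ambient)


if __name__ == "__main__":
    twin = PumpTwin(pump_id="pump-07")
    twin.update_from_telemetry({"temperature_c": 68.5, "rpm": 1450})
    print(f"Predicted temperature 30 min after shutdown: {twin.simulate_shutdown(30):.1f} °C")
```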

This middle layer is the narrow “waist” of the hourglass. It connects the infrastructure and data below with the applications and analytics above. Platform and tool providers offer:

  • Middleware and orchestration tools to manage data flow and service integration (a minimal sketch follows this list).
  • Data platforms that store, organise, and provide access to information.
  • Dataspaces that enable secure and sovereign data sharing between organisations.
This layer is crucial for ensuring interoperability, scalability, and standardisation. It acts as a digital backbone that makes collaboration across stakeholders and sectors possible.
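The publish/subscribe pattern below is one minimal way to picture what such middleware does: producers and consumers exchange data without knowing about each other. A production system would use a dedicated message broker; the topic names and handlers here are illustrative assumptions.

```python
# Minimal middleware sketch: an in-process publish/subscribe broker that decouples data
# producers from consumers. The topic names and callbacks are illustrative only.
from collections import defaultdict
from typing import Callable


class MiniBroker:
    """Routes messages from publishers to whoever subscribed to the topic."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(message)


if __name__ == "__main__":
    broker = MiniBroker()
    # An analytics service and a dashboard both consume the same sensor topic
    # without the sensor needing to know either of them exists.
    broker.subscribe("sensors/air", lambda m: print("analytics received:", m))
    broker.subscribe("sensors/air", lambda m: print("dashboard received:", m))
    broker.publish("sensors/air", {"station": "north", "pm25": 12.4})
```

Decoupling producers from consumers in this way is what allows new analytics services or applications to be added without touching the devices and systems that generate the data.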

The fourth layer is the computational engine of the system. Infrastructure service providers offer cloud platforms, edge computing environments, and AI-accelerated processing. These resources allow data to be stored, processed, and protected at scale:

  • Cloud computing offers powerful centralised processing.
  • Edge computing enables low-latency, localised decisions that are vital for real-time applications (see the routing sketch after this list).
  • Cybersecurity services safeguard sensitive data, ensuring integrity and compliance.
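The split between edge and cloud processing can be pictured as a simple routing decision, as in the sketch below; the latency budget and handler functions are hypothetical placeholders rather than a prescribed design.

```python
# Minimal sketch of an edge-versus-cloud decision: handle latency-critical readings
# locally and defer the rest to centralised processing. Threshold and handlers are
# illustrative assumptions.
LATENCY_BUDGET_MS = 50  # hypothetical: anything needing a faster response stays at the edge


def handle_at_edge(reading: dict) -> str:
    """Immediate, local decision (e.g. stop a machine, switch a traffic light)."""
    return f"edge action for {reading['sensor_id']}"


def send_to_cloud(reading: dict) -> str:
    """Defer to centralised analytics where latency is not critical."""
    return f"queued {reading['sensor_id']} for cloud processing"


def route(reading: dict) -> str:
    """Route a reading based on how quickly a response is required."""
    if reading["max_response_ms"] <= LATENCY_BUDGET_MS:
        return handle_at_edge(reading)
    return send_to_cloud(reading)


if __name__ == "__main__":
    print(route({"sensor_id": "brake-07", "max_response_ms": 10}))
    print(route({"sensor_id": "air-03", "max_response_ms": 60000}))
```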

At the base of the hourglass is the physical world—the real-life systems that generate data. Infrastructure providers design and maintain:

  • Sensors that collect information from the environment.
  • Connectivity infrastructure such as 5G, Wi-Fi, and LoRaWAN.
  • Smart devices and robotics that interact with their surroundings.
These components gather and transmit the raw data that powers the entire model. They are essential for automating industrial processes, monitoring environments, and feeding real-time data into higher-level systems, as the short sketch below illustrates.
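The sketch simulates one sensor reading and packages it as a JSON payload ready for transmission over whichever link (5G, Wi-Fi, LoRaWAN) a deployment uses; the device identifier, fields, and value range are assumptions for illustration.

```python
# Minimal device-layer sketch: package a (simulated) sensor reading as JSON, ready to be
# transmitted upstream by a gateway. Sensor, fields, and value range are illustrative.
import json
import random
import time


def read_temperature_sensor() -> float:
    """Stand-in for a real driver call; returns a plausible temperature in °C."""
    return round(random.uniform(18.0, 26.0), 2)


def build_payload(device_id: str) -> str:
    """Serialise one measurement into the JSON payload a gateway would transmit upstream."""
    return json.dumps({
        "device_id": device_id,
        "temperature_c": read_temperature_sensor(),
        "timestamp": int(time.time()),
    })


if __name__ == "__main__":
    print(build_payload("env-sensor-01"))
```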