In 2006, a lot of smart people were invited to an exercise at the Institute for the Future to explore paths into a 3D Metaverse. Essentially, they wanted to know what happens when virtual worlds meet real maps of the planet: what happens when simulations get real, and life and business go virtual. See the [Metaverse Roadmap Overview Page](https://www.metaverseroadmap.org/overview/).

They already saw that the Metaverse might be a duality, in the sense that it refers both to a set of technologies and to a narrative. See also [[Metaverse#Metaverse Multiplicity]].

## Metaverse Scenarios

They looked at two key continua arranged in a matrix: identity-focused vs. world-focused scenarios, and augmentation of the real world vs. constructed (simulated) worlds:

**Intimate**: Focused inward, describing the actions of an individual actor, represented by an individual appearance in the system: "*Somebody or something takes action.*"

**External**: Focused outward, with systems providing information about and control of the surrounding environment / world: "*Parts of the world can be perceived and changed in a specific way.*"

**Augmentation**: Layers information and capabilities onto the perception of the real: "*Acting within the actual world.*"

**Simulation**: Models physical or parallel realities based on information gathered from them: "*Acting within a simulation or abstraction of the world.*"

That led to four quadrants:

![[Metaverse Roadmap.png]]

### Virtual Worlds (Intimate, Simulation)

These are platforms where virtual identities act within virtual maps. They are essentially [[Virtual Worlds]], although they can serve many purposes beyond just socializing.

**Concepts**: Massive multiplayer games, [[Multi-User Dungeons]], virtual event spaces, virtual meeting rooms, training and education environments.
**Drivers**: Immersion, 3D representations, networking

**Examples**: [[Second Life]], [[Decentraland]], World of Warcraft, Minecraft, AltspaceVR

### Mirror Worlds (External, Simulation)

These are platforms containing virtual models (or "reflections") of the real world, constructed using, for example, the [[Internet of Things]] and [[Digital Twins]]. Conceptually, these are virtual worlds where the virtual map has been swapped for a simulated map of the real world.

These mirror worlds can be populated by representations of real actors (people, things and environments) that a user can interact with. They can be experienced through virtual representations, for example an avatar of you running around a real map of New York in Google Earth or Google Street View. Alternatively, they can be experienced through a further abstraction layer: a map representing the real world and the real actors within it that you can influence, for example looking up a bus schedule or calling an Uber. Finally, mirror worlds can be experienced as the results of simulation and modelling, for example looking forward or backward in time through simulation.

**Concepts**: Digital twins of spaces (from connected homes to industry-scale properties to smart cities) and of actors within spaces (occupancy, mobility options, ...), environmental data (weather, temperature, pollution, ...), machine-based data (real-time and historic).

**Drivers**: Connected devices + sensors, models + ontologies

**Examples**: Google Earth & Maps, [Amsterdam Schiphol API](https://www.schiphol.nl/en/developer-center/page/explore-all-schiphols-apis-in-the-developer-center/), Uber, Foursquare

### Augmented Reality (External, Augmentation)

These are platforms where individuals act as themselves while perceiving a digitally augmented reality around them. See [[Augmented Reality]].
Conceptually, this scales the real-world data of Mirror Worlds to 1:1 size and aligns it with the current perception of the world. As a mode within the model, this leads to [[Ambient Computing]].

**Concepts & Examples**: See [[Augmented Reality]]

**Drivers**: Contextualization, especially location-based services, [[Machine Learning]]

### Lifelogging (Intimate, Augmentation)

These are platforms that allow real individuals and things to virtually represent themselves and their identities. In 2006, lifelogging became a thing: people would stream a version of themselves onto virtual platforms, spawning concepts like lifestreaming, cam personalities and microblogging. Today we know people who professionally augment themselves into online characters as "**social media influencers**".

Conceptually, this moves from augmenting the world around the spectator to augmenting the image of the spectator itself.

**Concepts**: Social media, work & professional platforms, publishing platforms (from blogs to video platforms), personal [[Avatars]]

**Drivers**: Identity, representation

**Examples**: Facebook, Instagram, Twitter, [[Twitch]], LinkedIn, Microsoft 365, OnlyFans, [[Avatars#CodeMiko]]

## System requirements

We can look at the technical and conceptual requirements for these quadrants and end up at the requirements for [[Virtual Worlds]]. You can think of all of these as contextual, transparent multi-user systems.

So, yes, these attributes describe Minecraft AND Google Maps AND Pokémon Go AND Facebook, and pull them all into the Metaverse. Really, the only difference between Minecraft and Google Street View is that one map is based on a fictional world and the other on the real one. The only difference between Uber and Pokémon Go is the type of things they spawn on their maps. And Twitter is pretty much a massive multiplayer game where they only bothered to develop the chat part.
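The "same system, different map, different spawns" claim above can be sketched as a toy data model. Everything here (class names, fields, coordinates) is hypothetical and purely illustrative: the point is that one abstraction, a map plus entities spawned onto it, covers all of these platforms.

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    """Anything spawned onto a world's map: an avatar, a car, a Pokémon."""
    kind: str
    position: tuple[float, float]  # coordinates in the map's own system

@dataclass
class World:
    """A contextual multi-user system: a map plus the entities on it.

    map_source marks whether coordinates refer to a fictional map
    (Minecraft) or to the real planet (Street View, Uber, Pokémon Go).
    """
    name: str
    map_source: str  # "fictional" or "real"
    entities: list[Entity] = field(default_factory=list)

    def spawn(self, kind: str, position: tuple[float, float]) -> Entity:
        entity = Entity(kind, position)
        self.entities.append(entity)
        return entity

# The same abstraction, instantiated with different parameters:
minecraft = World("Minecraft", map_source="fictional")
minecraft.spawn("avatar", (128.0, 64.0))           # block coordinates

uber = World("Uber", map_source="real")
uber.spawn("car", (40.7128, -74.0060))             # lat/lon, New York

pokemon_go = World("Pokemon Go", map_source="real")
pokemon_go.spawn("pokemon", (40.7128, -74.0060))   # same real map
```

Under this sketch, Uber and Pokémon Go differ only in the `kind` of entity they spawn, while Minecraft differs from both only in its `map_source`, which is exactly the argument the paragraph above makes.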
With this broad definition, we can map a lot of existing and new fields into the Metaverse. This way, we can also build on existing concepts and experience.

![[Metaverse Roadmap Platform Analysis.png]]