The key differentiator between the World Wide Web and the metaverse is the use of a three-dimensional space that gives everything in the metaverse a coordinate where it resides. Whether augmented or virtual, every object and service needs to be located somewhere. This track focuses on those places, big and small, plus the technologies that make them accessible. You may opt to make an existing place even better, build your own place, create something that makes it more efficient for others to build their places, or implement functionality that makes it possible to navigate places here on Earth.
Spatial fabrics are the building blocks of the metaverse. And just like websites, they come in all shapes and sizes. The metaverse map has three layers -- celestial, terrestrial, and physical -- each providing a specific mapping function. Here's a crash course on what you might be able to create within those layers.
The parcel is the lowest map object type in the terrestrial layer, roughly equivalent to a single parcel you'd find in a county assessor's map. Think one building equals one parcel. When you create a spatial fabric attached to a parcel parent, your fabric can only consist of physical map objects. This is how most spatial fabrics here on Earth will eventually be attached.
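To make that parent/child rule concrete, here is a minimal sketch of how it might be modeled. All names here (`MapObject`, the `kind` labels, the example addresses) are illustrative assumptions, not RP1's actual schema or API.

```python
from dataclasses import dataclass, field

@dataclass
class MapObject:
    name: str
    kind: str  # e.g. "parcel" or "physical" (illustrative labels)
    children: list["MapObject"] = field(default_factory=list)

def attach(parent: MapObject, child: MapObject) -> None:
    """Attach a child, enforcing that a fabric with a parcel parent
    may only consist of physical map objects."""
    if parent.kind == "parcel" and child.kind != "physical":
        raise ValueError("a parcel fabric may only contain physical objects")
    parent.children.append(child)

# One building equals one parcel; its contents live at the physical layer.
parcel = MapObject("350 Ocean Blvd", "parcel")
attach(parcel, MapObject("lobby kiosk", "physical"))
```

Trying to attach anything other than a physical object to the parcel raises an error, mirroring the constraint described above.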
The tools for creating a spatial fabric are pretty limited at the moment. Your only effective option is to use RP1's Scene Assembler. However, that's not to say it can't be done other ways.
The scenes that currently exist at the physical layer are still fairly rudimentary. But in light of the billions of single parcel spatial fabrics that have yet to be created, it would be exceedingly helpful to have a few pristine examples that not only show off what can be accomplished, but also stand as a prime example of how to do it correctly using best practices for building in the metaverse.
Above a parcel is the campus, which will be needed by just about every school and business with more than one building on premises. Think of campuses as collections of parcels; they are generally made up of multiple spatial fabrics.
The process for setting up a campus on Earth still involves too much manual work, but only because no one has built effective tools for doing so yet. We have high hopes that will no longer be the case by the end of this hackathon! Given that real-world campuses will number in the tens of millions, efficient ways to create and manage them will surely be needed.
It wouldn't be out of the question to expect that schools and corporations looking to build in the metaverse in just a few weeks or months will be searching for solutions to help them twin their own campuses. Perhaps your solution might be the cornerstone of a company that fills that demand.
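As a starting point for such tooling, one could batch-create a campus from a list of building records (say, exported from an assessor's database): one parcel per building, all grouped under a single campus. The function and field names below are hypothetical stand-ins for whatever API real campus tooling would call, not part of RP1.

```python
def build_campus(name: str, buildings: list[dict]) -> dict:
    """Create one parcel per building and group them under a campus."""
    parcels = [{"name": b["name"], "lat": b["lat"], "lon": b["lon"]}
               for b in buildings]
    lats = [p["lat"] for p in parcels]
    lons = [p["lon"] for p in parcels]
    return {
        "name": name,
        "parcels": parcels,
        # A bounding box tells a metaverse browser when the campus
        # is close enough to be worth loading.
        "bounds": {"south": min(lats), "north": max(lats),
                   "west": min(lons), "east": max(lons)},
    }

campus = build_campus("Example University", [
    {"name": "Library", "lat": 33.7768, "lon": -118.1142},
    {"name": "Gymnasium", "lat": 33.7830, "lon": -118.1115},
])
```

The point is that twinning tens of millions of campuses is a data-pipeline problem: given a clean list of buildings, creating the campus itself should be one call, not hours of manual assembly.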
Like the parcel, the surface is the lowest map object type in the celestial layer; it corresponds to the surface of a planet. When you create a spatial fabric attached to a surface parent, your fabric defines all of the terrestrial map object subsurfaces that comprise the surface of a planet. RP1 already has a spatial fabric for Earth. But don't let that discourage you, because there will surely be thousands of other planets in galaxies far, far away that need to be terraformed from the ground up.
How would you even start inventing a planet from scratch? Continents, countries, states, counties, and cities need to be imagined for starters. Then you have to handle terrain from the highest mountains to the lowest valleys. Then you have oceans, rivers, roads, buildings, trees, and plants. Of course, in virtual reality, you're not limited to the same laws of physics that we have here on Earth.
Let's get real, though. Entirely terraforming a planet in 29 hours is not really practical. Yet there may just be some creative solutions that can tackle some of the problems that must eventually be solved to do so. No doubt, some of those solutions might better fall under the AI track.
The celestial layer of the map is used to create outer space. At universe scale (1.0e+24 m and beyond) there's an unlimited number of galaxies, nebulae, star clusters, star systems, planets, moons, and satellites to explore. But before we can set off to explore all of these new worlds, someone has to first create them.
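One practical consequence of universe scale is numeric precision: a single flat coordinate system cannot work, because a 64-bit float near 1.0e+24 m can only resolve steps of roughly 100,000 km. This quick check (plain Python arithmetic, implying nothing about RP1's internals) shows why each map layer needs objects positioned relative to a local parent rather than in absolute universe coordinates:

```python
import math

UNIVERSE_SCALE = 1.0e24  # metres

# Smallest representable coordinate step near universe scale
# for a 64-bit float (2**27 m, about 134,000 km -- wider than
# ten Earth diameters).
step = math.ulp(UNIVERSE_SCALE)
print(f"{step:.3e} m")  # prints "1.342e+08 m"

# So positions inside a planet must be stored relative to a nearby
# parent object (surface, campus, parcel), not absolutely.
```

This is the same reason game engines re-center their world origin around the player: precision is relative, so coordinates must be too.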
There is a real need to have a highly accurate representation of our own solar system. RP1 has already started doing that by adding objects for all of the major known planets and moons, but only a small portion of the data is accurate. It sure would be nice to step that up to the next level. We have already piqued the interest of NASA and the European Space Agency, and we believe this information can be very useful at the university level.
Beyond our own solar system, relatively little is known about the rest of the universe. So it's the perfect playground to create imaginary worlds used primarily for entertainment. One such location of interest to a few hundred million people here on Earth is a well known galaxy, Phaar'farra Way. Over 800 star systems are mapped within it, but it's missing all of its stars, planets, moons, and space stations. This project could take months or years to complete, unless someone clever can come up with a way to speed that up.
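One way to speed this up is deterministic procedural generation: derive each star system's contents from a seed, so an entire galaxy can be regenerated on demand from a few bytes of stored data instead of being modeled by hand. The sketch below is purely illustrative -- the field names, spacing formula, and seeds are invented and imply nothing about how Phaar'farra Way is actually defined:

```python
import random

def generate_system(galaxy_seed: int, system_id: int) -> dict:
    # The same seed and id always produce the same system, so only
    # hand-authored overrides ever need to be stored.
    rng = random.Random(f"{galaxy_seed}:{system_id}")
    return {
        "id": system_id,
        "star_class": rng.choice("OBAFGKM"),
        "planets": [
            {"orbit_au": round(0.4 + 0.3 * 2 ** n, 2),  # Titius-Bode-like spacing
             "moons": rng.randint(0, 5)}
            for n in range(rng.randint(0, 8))
        ],
    }

# Regenerating all 800 mapped systems is instant and reproducible.
galaxy = [generate_system(42, i) for i in range(800)]
```

Artists could then hand-polish the handful of systems visitors actually care about, while the generator fills in the other thousands for free.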
No doubt the metaverse will one day be filled with billions of places to visit. For places here on Earth, it's not enough to be able to see them. We must also be able to navigate them. VPS technology will be crucial for solving that problem, and will need to be built directly into the core of native metaverse browsers.
The first step in visual positioning is scanning the space. There are many technologies and standards that already exist for performing this task. However, no work has been performed to make them compatible with spatial fabrics.
Ultimately, it would be beneficial if scanning were not only possible, but easy using the hardware devices that everyone already has -- mobile phones. In the future, it would be even more convenient to perform this task with AR glasses or passthrough headsets.
Don't overlook the fact that the metaverse is meant to be streamed in real time, and the devices that people use are not guaranteed to have a lot of processing power. So it's vital that the resulting data sets be compact and efficient for transmission and processing.
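For example, one well-known way to keep scan payloads small (used by mesh-compression formats such as Draco and glTF's quantization extension) is to snap floating-point coordinates to 16-bit integers within the scan's bounding box, shrinking each coordinate from 8 bytes to 2 at sub-millimetre cost for room-scale scans. A minimal sketch, assuming a uniform bounding range per scan:

```python
import struct

def quantize(points, lo, hi, bits=16):
    """Map float coordinates in [lo, hi] onto unsigned ints."""
    scale = (2 ** bits - 1) / (hi - lo)
    return [tuple(round((c - lo) * scale) for c in p) for p in points]

def dequantize(qpoints, lo, hi, bits=16):
    """Recover approximate float coordinates from quantized ints."""
    scale = (hi - lo) / (2 ** bits - 1)
    return [tuple(lo + c * scale for c in q) for q in qpoints]

# A 10 m room scanned at 16 bits resolves to about 0.15 mm.
points = [(0.12, 1.5, 2.875), (9.0, 0.25, 4.5)]
q = quantize(points, lo=0.0, hi=10.0)
payload = b"".join(struct.pack("<3H", *p) for p in q)  # 6 bytes per point
restored = dequantize(q, lo=0.0, hi=10.0)
```

A real pipeline would add entropy coding and per-axis bounds, but even this simple step cuts transmission size by 4x versus raw 64-bit floats, which matters on the low-powered devices described above.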
Augmented reality will be entirely dependent on the ability to synchronize wearables with the space they are moving through. Without a reasonably precise understanding of the space around you, along with your position within it, it will not be possible to position virtual objects within that space, locate and speak with other virtual avatars, or navigate the space.
In the absence of commercial AR glasses, we can safely focus on positioning with mobile phones and VR headsets in passthrough mode. If you could only choose one, mobile phones are much more important, since billions of people already have these devices and rely on them throughout the day.
Once you know where you are in the metaverse -- whether virtually or in real life -- it would be extremely helpful to be able to search for and get navigational assistance for locating things or places.
Note that a good portion of this task is focused on the search itself. What methods would people invoke to enter their desired object or location? That's up to you. However, keep in mind that typing on wearable devices is excruciatingly inefficient.
Here's another tip you might be interested in... a spatial fabric for the Long Beach Convention Center, Hall A, is currently attached to RP1 with the booth layout set up for the AWE expo being held in mid-June. You can teleport directly to it in RP1.