The number of applications that qualify as tools for the metaverse will eventually be counted in the millions. However, considering where we are in the development of the metaverse, the number of existing tools can still be counted on one hand. You may opt to enhance one of these existing tools, implement a suggested tool that is not yet started, or for the truly bold, recognize the need for a tool entirely on your own.
RP1's Scene Assembler allows you to create and modify spatial fabrics, or scenes, at the physical object layer. While other products, such as Blender, are used to create 3D objects, the Scene Assembler's job is to take existing 3D objects and combine them to form a scene. This makes the metaverse more dynamic because objects can be added, moved, and removed from a complex scene without having to rebuild the whole scene.
The program is web based and can be installed on any web server. It logs in remotely to a map service and allows you to modify the scenes hosted there. Each scene can be used as a spatial fabric and attached to a primary spatial fabric, or to another secondary fabric, to become part of the larger metaverse.
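To make the attachment idea concrete, here is a minimal sketch of how a scene could be modeled as a secondary fabric and attached to a parent fabric. The object shapes, function names, and asset URL are assumptions for illustration, not the actual RP1 data model.

```javascript
// A scene groups references to existing 3D objects with transforms.
// (Hypothetical data shapes; the real Scene Assembler model may differ.)
function createScene(id, objectRefs) {
  return { id, objects: objectRefs, children: [] };
}

// Attaching a scene to a parent fabric keeps the scene intact, so it
// can later be moved or removed without rebuilding the parent.
function attachFabric(parent, child, position) {
  parent.children.push({ fabric: child, position });
  return parent;
}

const earth = createScene("earth", []);
const plaza = createScene("plaza", [
  { objectUrl: "https://assets.example.com/fountain.glb", position: [0, 0, 0] },
]);
attachFabric(earth, plaza, [120.5, 0, -34.2]);
// earth.children now holds the plaza scene as a secondary fabric
```

Because the child scene is stored as a reference rather than flattened into the parent, removing or repositioning it is a single operation on the parent's child list.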
There are numerous prospective features that would be nice to have in this program, detailed here. You may opt to implement one or more of these, or figure out for yourself another way in which this tool could be enhanced.
Patched Reality's Manifolder allows you to explore spatial fabrics from the celestial level, through the terrestrial level, and all the way down to the physical level. This product is very young and missing oodles of features that will no doubt make this program the most ubiquitous tool for content creators and developers in the known metaverse.
The program is web based and can be installed on any web server. It connects remotely to one or more map services to retrieve the objects contained within the spatial fabrics hosted on those servers. For now, this program only allows you to view content, but it won't be long before it can also be used to edit spatial fabrics, perhaps even as soon as the end of this hackathon!
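A viewer that pulls from several map services at once needs to combine their results. The sketch below assumes a hypothetical REST endpoint and response shape (neither is documented in this text) and shows one way to fetch and merge object lists without rendering duplicates.

```javascript
// Hypothetical endpoint path and response shape, for illustration only.
async function fetchFabricObjects(serviceUrl, fabricId) {
  const res = await fetch(`${serviceUrl}/fabrics/${fabricId}/objects`);
  if (!res.ok) throw new Error(`Map service error: ${res.status}`);
  return res.json(); // assumed to be an array of map objects with an `id`
}

// Combine results from multiple services, keeping the first copy of
// each object id so overlapping fabrics don't produce duplicates.
function mergeObjects(...objectLists) {
  const seen = new Map();
  for (const list of objectLists) {
    for (const obj of list) {
      if (!seen.has(obj.id)) seen.set(obj.id, obj);
    }
  }
  return [...seen.values()];
}
```

A viewer would call `fetchFabricObjects` once per configured service and pass all the results to `mergeObjects` before rendering.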
There are numerous prospective features that would be nice to have in this program, detailed here. You may opt to implement one or more of these, or figure out for yourself another way in which this tool could be enhanced.
RP1's Map Service is technically not a tool, but in its present form, it's close enough. Metaverse browsers connect to map services to obtain spatial fabrics and the map objects contained within. This service relies on a SQL database to store the data within the spatial fabric, making it a great option for most fabrics, large and small.
This service uses Node.js running on either Windows or Linux servers and connects to either a SQL Server or MySQL database. Setup is easy and can generally be done on your own server or a cloud-hosted server in 15-30 minutes. You can host any number of spatial fabrics using this service, from minuscule to exceptionally large (so long as you have enough storage).
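To illustrate the kind of query a SQL-backed map service might issue, here is a sketch that builds a parameterized query for the objects inside a region of a fabric. The table name, column names, and coordinate scheme are assumptions; a real deployment would hand the result to a database driver such as mysql2.

```javascript
// Build a parameterized spatial query for one fabric's objects within
// a bounding box. (Table and column names are hypothetical.)
function buildRegionQuery(fabricId, bounds) {
  const sql =
    "SELECT id, object_url, x, y, z FROM map_objects " +
    "WHERE fabric_id = ? AND x BETWEEN ? AND ? AND z BETWEEN ? AND ?";
  const params = [fabricId, bounds.minX, bounds.maxX, bounds.minZ, bounds.maxZ];
  return { sql, params };
}

const { sql, params } = buildRegionQuery("earth", {
  minX: -100, maxX: 100, minZ: -100, maxZ: 100,
});
// Pass sql and params to the database driver as a parameterized query;
// placeholders keep fabric ids and coordinates out of the SQL string.
```

Keeping the values in a separate parameter array, rather than concatenating them into the SQL, avoids injection issues and lets the database cache the query plan.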
This solution is already fairly robust. RP1 uses this exact software to host their public spatial fabrics, including Earth, which contains many millions of map objects. However, there are a few features, detailed here, that have yet to be added.
While the SQL-based Map Service works well, it could be very useful to have a very simple service that either uses a JSON database, such as CouchDB, or doesn't use any database at all. Such an option could be great for hosting very simple spatial fabrics that require minimal effort to deploy.
Universal Scene Description (USD) is a very popular standard that already serves as a de facto intermediary for transferring scene data between various software systems due to its nearly universal support. USD offers some nice benefits: multiple people can edit scenes simultaneously, and it stores reusable references to objects in pretty much any file format. These features perfectly match how the SQL-based Map Service works! Given the extensive amount of existing software support for USD, there is no doubt that USD-based map services will be very desirable in the not-too-distant future.
This is another component that isn't exactly a tool, but would be extremely useful to have. In VR, your headset and hand controllers provide input for head and hand positioning, along with limited finger movement. On desktop and mobile devices, however, you only have keyboard and thumbstick controls to move your avatar around the world, leaving the rest of your avatar motionless and expressionless.
There exist libraries that can take an input video stream from your webcam and provide real-time motion tracking of your head, face, expressions, body, arms, hands, and legs. One could then convert these values and incorporate them into the avatar update packet, thereby giving full movement and expression to people not using VR headsets.
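The conversion step might look like the sketch below, which folds tracking output into an avatar update packet. The landmark field names and packet layout are assumptions; in practice the input values would come from a tracking library such as MediaPipe, and the packet format would match whatever the metaverse browser already sends.

```javascript
// Fold webcam tracking output into an avatar update packet.
// (Hypothetical field names on both the input and the packet.)
function buildAvatarPacket(avatarId, tracking) {
  return {
    avatarId,
    timestamp: Date.now(),
    head: tracking.headRotation,         // e.g. [pitch, yaw, roll]
    leftHand: tracking.leftHand || null,  // null when out of frame
    rightHand: tracking.rightHand || null,
    expression: tracking.blendShapes || {}, // e.g. { smile: 0.8 }
  };
}
```

Sending `null` for limbs that are out of frame lets the receiving client fall back to its existing idle animation instead of freezing the avatar.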
You don't have to be a rocket scientist to set up a metaverse server with the currently available software. However, it still requires a lot of steps and a certain amount of technical proficiency. Can you find a way to make this markedly easier, quicker, or more intuitive?
As the stack of metaverse software matures, the need for tools will grow. However, right now, it might not seem so obvious what is needed. Here are a few ideas that might start you thinking in the right direction:
Conversion tools that export scenes from popular game engines like Unity or Unreal and import them into a spatial fabric.
Tools that examine and/or correct objects in a scene for compatibility and best practices. For example, poly count, material size, measurement units, and file size.
Tools that make it possible to administrate a spatial fabric similar to the control centers envisioned in shows like The Hunger Games and Westworld.
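The compatibility-and-best-practices idea above could start as small as the sketch below, which checks a scene object against a few limits. The limit values and object fields here are invented for illustration; a real checker would derive its thresholds from the target platforms.

```javascript
// Hypothetical best-practices limits for scene objects.
const LIMITS = {
  maxPolyCount: 100000,
  maxFileSizeBytes: 10 * 1024 * 1024, // 10 MB
};

// Return a list of human-readable problems; empty means the object passes.
function checkObject(obj) {
  const problems = [];
  if (obj.polyCount > LIMITS.maxPolyCount) {
    problems.push(`poly count ${obj.polyCount} exceeds ${LIMITS.maxPolyCount}`);
  }
  if (obj.fileSizeBytes > LIMITS.maxFileSizeBytes) {
    problems.push(`file size ${obj.fileSizeBytes} exceeds ${LIMITS.maxFileSizeBytes}`);
  }
  if (obj.units !== "meters") {
    problems.push(`units should be meters, found ${obj.units}`);
  }
  return problems;
}
```

Returning a list of problems, rather than a pass/fail boolean, leaves room for a future "correct" mode that fixes whatever it can and reports the rest.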