Are human-like avatars here?
Bi-weekly update with the latest insights from the Metaverse: news, posts, creations, and communities.
Hey there! Hope you’re doing well and continue to stay safe.
This week’s focus is on APIs: machine learning for 3D, high-quality spatial audio, and realistic human avatars. These are the sorts of tools that will lay the foundations for the Metaverse. Unreal Engine’s MetaHumans caused quite a stir: they look outstanding and run in real time.
In the news
TensorFlow 3D to understand 3D scenes: Google recently announced TensorFlow 3D. It comes with pipelines for efficient scene understanding: semantic segmentation, instance segmentation (grouping voxels that belong to the same object), and object detection. There’s been a lot of work on using machine learning for 3D, and state-of-the-art algorithms are already used in mobile AR and for tracking in VR headsets. It’s nice to see that more people will have access to these models. [Link]
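To make "grouping voxels that belong to the same object" concrete, here is a minimal conceptual sketch (not the TensorFlow 3D API) that groups occupied voxels into object instances with connected-component labeling, assuming voxels that touch belong to the same object:

```python
# Conceptual sketch of instance grouping on a voxel occupancy grid.
# Hypothetical toy example; NOT the TensorFlow 3D API.
import numpy as np
from scipy import ndimage

# Toy 6x6x6 occupancy grid containing two separate "objects"
grid = np.zeros((6, 6, 6), dtype=bool)
grid[0:2, 0:2, 0:2] = True   # object A: a 2x2x2 cube of voxels
grid[4:6, 4:6, 4:6] = True   # object B: another cube, not touching A

# Label connected regions; by default, voxels sharing a face are connected
labels, num_objects = ndimage.label(grid)

print(num_objects)           # 2 distinct objects found
print(np.sum(labels == 1))   # 8 voxels belong to the first object
```

Real systems like TensorFlow 3D learn per-voxel features and predict these groupings rather than relying on simple adjacency, but the output shape of the task is the same: an instance label per voxel.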
High Fidelity spatial audio API: They pivoted from social VR to an audio-only 2D space, and they’ve now released an API as an npm module. It makes it easy to use their great audio capabilities from any Web app. It’s free while in alpha and will then cost $0.003333 per minute per user. That comes to roughly $4 a day if there are 20 people active for 1h each. It’s not cheap, but audio is very important, and high-quality spatial audio is rare. [Link]
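A quick back-of-the-envelope check of that pricing, using only the per-minute rate quoted above:

```python
# Back-of-the-envelope cost at High Fidelity's quoted post-alpha rate.
rate_per_user_minute = 0.003333   # USD per minute per user
users = 20
minutes_per_user = 60             # each user active for 1 hour

daily_cost = rate_per_user_minute * users * minutes_per_user
print(f"${daily_cost:.2f} per day")        # ≈ $4.00 per day
print(f"${daily_cost * 30:.2f} per month")  # ≈ $120 per month
```

So a small community that gathers for an hour a day lands around $120 a month, and the bill scales linearly with both headcount and session length.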
Realistic digital humans in Unreal Engine: A surprise announcement by Epic Games showing MetaHuman, a very realistic digital human creator tool that will work seamlessly with Unreal Engine. Like the upcoming UE5, it will scale down easily to lower-end platforms. The characters run in real time and still look amazing. They support a vast range of customization and animation, and can be rigged manually or with motion capture (even from mobile apps). I believe we’ll need realistic humans to truly be with friends and family in the Metaverse and feel a real sense of presence. [Link]
Over 60 Quest apps made $1m: That’s almost double the number from back in September. It’s a fast-growing market, and Quest 2 is proving to be a big push. With more people jumping into VR, and with App Lab now allowing more experimental applications onto the platform, it does feel like it can start to become mainstream. Can’t wait for proper Quest competitors though. [Link]
Tutorial to deploy Hubs Cloud on Digital Ocean: Fabien Benetou, whom I’ve mentioned previously for his cool Hubs experiments, is back with a great stream on how to deploy your own Hubs Cloud instance on Digital Ocean. The Digital Ocean option is still in alpha (the stable route is AWS), but Fabien is doing a series on how to move away from monopolies, and that includes Amazon’s cloud infrastructure. [Link]
Tilt Brush port in WebXR: msub2 has created another Tilt Brush port, but this one uses the Unity WebXR exporter. It’s still buggy but mostly works, and hey, that’s a big accomplishment. It shows that we can have more complete apps running on the Web. [Link]
Skittish is a fun virtual space for events: Andy Baio has made a Web-based virtual space where avatars are animals. It’s a fun-looking world with proximity chat and creation tools to modify the space around you. Something like this could certainly be built on top of Hubs, and I think it’s a very good approach to making the space welcoming. [Link]
See you in 2 weeks!