
Nvidia Designs The ‘3D Internet’ And It Is… Incredible!

Nvidia has come up with an interesting artificial intelligence project expected to run on the metaverse.


During Nvidia’s SIGGRAPH keynote, the technology and GPU manufacturer dropped plenty of jargon. However, those who attended the event were also given a glimpse of the future of the internet – and the metaverse.

Companies including Pixar, Autodesk, Adobe, Siemens, and Nvidia are pioneering the new technology that is expected to reshape the way we use the internet. They all describe it as a ‘3D internet’.

The future as laid out by Nvidia and its partners is intriguing, and even the most metaverse-conversant people may need to sit up and take notes. And when has Pixar ever got it wrong? If you are not familiar with the metaverse, you can learn something about it here.

Related: The First 3D Metaverse Platform LightCycle Arrives in the Caduceus Ecosystem


In general, developers are seeking new ways of interacting with the internet through 3D, VR spaces, and connected worlds.

To make the new 3D concept a reality that works for everyone, Nvidia is building its Avatar Cloud Engine (ACE), a new tool that lets anyone develop lifelike 3D ‘humans’ who can speak and interact with other computers and characters in the metaverse.
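To make the idea concrete, here is a toy sketch in Python of the kind of loop such an avatar runs through – listen, understand, respond, then speak and animate. It is illustrative only: the function names and canned responses are hypothetical stand-ins and do not come from Nvidia’s ACE APIs.

# Illustrative stand-in for an ACE-style conversational avatar pipeline.
# None of these names are Nvidia APIs; a real system would use speech
# recognition, a language model, text-to-speech, and facial animation.

CANNED_RESPONSES = {
    "table": "I can book you a table for two at 7 pm. Shall I confirm?",
    "car": "Here are three cars in your budget. Want to see them in 3D?",
}

def understand(user_text: str) -> str:
    """Stand-in for speech recognition plus a conversational language model."""
    for keyword, reply in CANNED_RESPONSES.items():
        if keyword in user_text.lower():
            return reply
    return "Sorry, could you rephrase that?"

def speak_and_animate(reply: str) -> None:
    """Stand-in for text-to-speech and facial/gesture animation of the avatar."""
    print(f"[avatar smiles] {reply}")

if __name__ == "__main__":
    for utterance in ("I'd like a table tonight", "Show me a car"):
        speak_and_animate(understand(utterance))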

Nvidia sees its vision of advanced avatars, built with technology such as ACE, as the next step in human-computer interaction. The future of the internet will be designed as a collection of 3D spaces – the metaverse – and Nvidia (together with the likes of Meta) sees us communicating in these new 3D worlds using human-like faces and gestures.

Whether we are booking a restaurant or a hotel, buying a car, or logging on for work, we may be doing it through ACE-created virtual humans. The twist is that not everyone in the metaverse might be a real, actual human. Nvidia insists that its artificial intelligence technology is already on the verge of passing the Turing test.

Therefore, in the near future, you might not be able to tell the difference between a human and a bot. We are already used to ‘Live Chat’ on most websites; now it will come with a human face and believable, nuanced responses. Some observers say they are getting Holly-from-Red-Dwarf vibes.

The vice president of Omniverse and simulation technology at NVIDIA, Rev Lebaredian, states:

“These robots in the virtual world are much easier to create, and they’re necessary for us as we create virtual worlds that become indistinguishable from the real one.”

The idea of occupying realistic 3D spaces online means that developers and everyone else will need tools to build their own worlds – imagine something like Squarespace for the metaverse, where instead of a 2D website you are building a room, or an entire world.

To that end, Nvidia is co-creating several tools to enable artists to work in the metaverse, including – deep breath – the DeepSearch AI 3D search engine, the Avatar Cloud Engine, and NeuralVDB. These will enhance NVIDIA Omniverse and Pixar’s USD (Universal Scene Description). All of these tools are designed so that realistic, easily created 3D worlds can exist on the internet.
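To get a feel for what USD actually is, here is a minimal sketch (not taken from Nvidia’s announcement) that describes a tiny scene with Pixar’s open-source USD Python bindings; it assumes the pxr module is installed, for example via the usd-core package on PyPI.

from pxr import Usd, UsdGeom

# Create a new USD stage – the scene file – on disk.
stage = Usd.Stage.CreateNew("hello_world.usda")

# Define a transform prim with a sphere underneath it.
UsdGeom.Xform.Define(stage, "/hello")
sphere = UsdGeom.Sphere.Define(stage, "/hello/world")

# Author a simple attribute: the sphere's radius.
sphere.GetRadiusAttr().Set(2.0)

# Save the layer; the resulting .usda file is human-readable text.
stage.GetRootLayer().Save()

Any USD-aware application – Omniverse included – can open the resulting file, which is part of what makes the format attractive as a common 3D interchange layer.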

It is expected that we will all soon have access to versions of this technology. Explaining how Nvidia expects us to eventually reconstruct our entire world in 3D, Lebaredian stated:

“What this means for us all is a future where we all treat 3D art creation like we do 2D image creation now. We’ll create digital replicas of our spaces, in 3D, just as easily as we take photos and load them to Instagram now.”

He also commented on a demo:

“Here, we’re seeing the person was taking pictures, and then immediately you get this reconstructed object, and it’s 3D. So you can just plug it into whatever graphics software you have. And this is just a matter of seconds, and it looks beautiful.”

Related: Looking Glass Labs on Their Metaverse-Ready 3D Asset NFTs and More

The new metaverse proposed by Nvidia means that web designers who use this platform are going to have to think like animators and game developers. If you are skilled in 3D animation and modeling, the metaverse could prove to be a good earner.

You can become metaverse-ready by looking at our feature on the future of animation, where Vancouver Film School’s Colin Giles gives his advice on preparing for the future.

Nvidia And Partners Develop Universal Scene Description

Nvidia has joined hands with its partners to build out Universal Scene Description, aiming to speed up the industrial metaverse and the next wave of artificial intelligence.

In that context, Nvidia announced open-source USD resources and a test suite as part of a major initiative to evolve Universal Scene Description (USD) – the open-source, extensible language for describing 3D worlds – into a foundation of the 3D internet and the open metaverse.

Working with USD’s inventor, Pixar, as well as Adobe, Autodesk, Siemens, and many other leading companies, Nvidia will pursue a multi-year roadmap to expand USD’s capabilities beyond visual effects – enabling it to properly support industrial metaverse applications in manufacturing, architecture, robotics, engineering, industrial digital twins, and scientific computing.

In its SIGGRAPH special address, the firm shared forthcoming updates to USD. They include international character support, which lets users from all nations and languages participate in USD.

Support for geospatial coordinates will allow city-scale and planetary-scale digital twins to operate, while real-time streaming of IoT data will power digital twins that stay synchronized with the physical world.
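As a rough illustration of that idea, the sketch below writes a stream of sensor readings into a USD digital twin as time samples, so the twin carries a replayable history of the physical asset. The prim path, attribute name, and read_sensor() feed are hypothetical, and a real deployment would subscribe to an actual IoT source (MQTT, Kafka, and so on) rather than generate dummy values.

from pxr import Usd, Sdf

# Create a stage standing in for a factory digital twin.
stage = Usd.Stage.CreateNew("robot_twin.usda")
robot = stage.DefinePrim("/Factory/RobotArm", "Xform")

# A custom, namespaced attribute standing in for a live temperature feed.
temp_attr = robot.CreateAttribute("sensors:temperatureC", Sdf.ValueTypeNames.Float)

def read_sensor(tick):
    # Placeholder for a real IoT subscription; returns dummy data.
    return 40.0 + 0.5 * tick

# One time sample per tick keeps the twin in step with the physical world.
for tick in range(10):
    temp_attr.Set(read_sensor(tick), tick)

stage.GetRootLayer().Save()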

To speed up USD development and adoption, the firm also confirmed an open USD compatibility testing and certification suite, which developers can use to test their USD builds and certify that they deliver an expected result.
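Nvidia has not published the suite’s internals, but a compatibility check of this sort boils down to loading known files with a given USD build and confirming that the expected scene survives. The snippet below is a minimal, hypothetical example of such a check, run against the hello_world.usda file from the earlier sketch.

from pxr import Usd

def check_roundtrip(path, expected_paths):
    # Open the file with the USD build under test and collect every prim path.
    stage = Usd.Stage.Open(path)
    found = {str(prim.GetPath()) for prim in stage.Traverse()}
    missing = set(expected_paths) - found
    assert not missing, f"Missing prims: {missing}"
    return True

check_roundtrip("hello_world.usda", ["/hello", "/hello/world"])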

The vice president of Omniverse and simulation technology at NVIDIA, Rev Lebaredian, highlighted:

“Beyond media and entertainment, USD will give 3D artists, designers, developers, and others the ability to work collaboratively across diverse workflows and applications as they build virtual worlds. Working with our community of partners, we’re investing in USD so that it can serve as the foundation for architecture, manufacturing, robotics, engineering, and many more domains.”

Open-Source USD Resources And Industry Leaders Support USD And The 3D Internet

Nvidia is introducing a collection of free resources to help speed up USD adoption. They include thousands of USD assets designed to open up virtual-world building to users with no 3D internet or metaverse expertise.

The firm is also offering hundreds of on-demand tutorials, documentation pages, and developer tools to help spread USD education. In that context, the chief technology officer at Pixar Animation Studios, Steve May, stated:

“USD is a cornerstone of Pixar’s pipeline, and it’s seeing rapidly growing momentum as an open-source framework across not only VFX and animation, but now industrial, design, and scientific applications. NVIDIA’s contributions to help evolve USD as the open foundation of fully interoperable 3D platforms will be a great benefit across industries.”

Nvidia also confirmed investment in building USD plugins that connect popular 3D software to NVIDIA Omniverse, its platform for developing and integrating virtual worlds built primarily on Universal Scene Description. New beta releases include SideFX Houdini and PTC Creo, with Autodesk Civil 3D, Autodesk Alias, Siemens Xcelerator, and many others in development.

Related: The Metaverse: What Is It And Why Does It Matter?

Dirk Didascalou, chief technology officer of Siemens Digital Industries, explained:

“Siemens and NVIDIA are coming together to enable the industrial metaverse where the future of design, engineering, and collaboration will occur. We are excited to support USD in the Siemens Xcelerator platform and plan to collaborate with NVIDIA on the next generation of the format.”

At SIGGRAPH, Nvidia is also bringing hundreds of engineering and product leads from across the USD ecosystem into working councils, mainly to help align USD development priorities and to gather feedback on where Nvidia should concentrate its development efforts.

The executive vice president and chief technology officer at Autodesk, Raji Arasu, stated:

“Autodesk has been closely involved in the development of USD from its early inception as a means of standardizing the exchange of 3D data in animation and visual effects workflows. We have long understood the importance of 3D interoperability and have already begun extending USD’s applications beyond media and entertainment to design, engineering and industrial applications.

We are excited by the momentum behind USD from partners like NVIDIA, which we believe will help better realize the concept of the metaverse and all the workflows it unlocks for our customers.”

Innovators in gaming, media, robotics, retail, industrial automation, grocery, and the wider 3D internet – including Volvo Cars and Kroger – are already adopting USD as their metaverse language of choice. Mattias Wikenmalm, a senior visualization expert at Volvo Cars, concluded:

“The promise of USD is immense. At Volvo, we immediately understood the value of the open, extensible, interoperable 3D scene description for our metaverse projects. Being able to maintain assets as a single source of truth and bring them from virtual world to virtual world will be seamless in 3D internet consumer applications.”
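The ‘single source of truth’ Wikenmalm describes maps naturally onto USD’s composition features: an asset lives in one file, and any number of worlds reference it rather than copy it. The sketch below shows the pattern with hypothetical file names and prim paths; it is not Volvo’s actual pipeline.

from pxr import Usd

# Author the shared asset once – the single source of truth.
asset = Usd.Stage.CreateNew("car_asset.usda")
car_prim = asset.DefinePrim("/Car", "Xform")
asset.SetDefaultPrim(car_prim)
asset.GetRootLayer().Save()

# Two different "worlds" that each pull in the same asset by reference.
for world_name in ("showroom", "configurator"):
    world = Usd.Stage.CreateNew(world_name + ".usda")
    car = world.DefinePrim("/World/Car")
    car.GetReferences().AddReference("car_asset.usda")
    world.GetRootLayer().Save()

Edit car_asset.usda once, and every world that references it picks up the change.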


Kevin Moore - E-Crypto News Editor


Kevin Moore is the main author and editor for E-Crypto News.