Nvidia launches ‘Omniverse Cloud’ service
A service for building the industrial metaverse
Nvidia announced at GTC 2022 on the 21st (Korean time) that it has launched ‘Nvidia Omniverse Cloud’, its first software- and infrastructure-as-a-service offering.
Omniverse Cloud is a comprehensive cloud service that enables artists, developers, and enterprise teams to design, launch, operate, and experience metaverse applications from anywhere.
It lets individuals and teams build 3D workflows with one click, without the need for local computing power, and take advantage of the full power of Omniverse, including physics simulation, ray tracing, and AI capabilities.
“The metaverse, the 3D internet, connects virtual 3D worlds described in USD (Universal Scene Description) and viewed through a simulation engine,” said Jensen Huang, CEO of Nvidia. “You can design, build, and operate these worlds and your digital twins.”
Omniverse Cloud consists of three elements: RTX computers for creation, design, and engineering; OVX servers that connect to the Nucleus database and run virtual-world simulations; and the NVIDIA Graphics Delivery Network, Omniverse’s portal, delivered via GeForce Now.
Nvidia’s second-generation OVX platform is the metaverse counterpart to the HGX H100 AI platform: instead of AI computation, OVX accelerates metaverse applications such as digital twins.
The second-generation OVX is expected to launch in early 2023 and uses eight L40 GPUs of the new Ada Lovelace generation. Within the Omniverse platform, OVX handles graphics-intensive virtual-world simulations while HGX handles AI workloads.
Companies such as Siemens, Rimac, and WPP have already announced that they are using Omniverse Cloud.
Siemens has built the Xcelerator business metaverse platform, which enables organizations to connect remotely and operate in real time across product and production lifecycles.
WPP uses Omniverse Cloud to launch a marketing service for the automotive industry, and Rimac uses it to provide an end-to-end automotive pipeline from electric vehicle design to marketing.
Park Chan, reporter [email protected]
Tool-using AI appears… Will its evolution speed up?
Humans are tool-using animals. Since walking upright and beginning to use tools, roughly 300,000 years ago, humans have evolved rapidly.
Now artificial intelligence (AI) that can use tools like humans has emerged. Just as with humans, tool use is expected to accelerate AI’s evolutionary pace.
Ars Technica reported on the 16th (local time) that the American startup Adept has developed and unveiled ‘ACT-1 (Action Transformer)’, an AI model that can operate computer programs like a human.
ACT-1 is an AI model trained to use software tools, APIs, and apps. Given a command by voice or text, it performs computer tasks the way a person would, including browsing the internet by clicking the mouse or scrolling.
Based on a transformer neural network, it learned by building knowledge of the context and relationships between items in a dataset.
It connects to a Chrome extension to observe what is happening in the browser and performs actions such as clicking, typing, and scrolling. By observing how humans manipulate software, the model learned to automate complex sequences of user interface (UI) tasks and to carry them out itself.
It can also handle high-level user requests. The user simply enters a command into the text box, and ACT-1 does the rest.
For example, if a user types “Find a house in Houston for a family of four. Budget is $600,000” into the text box, ACT-1 automatically navigates to a real estate site in a web browser, clicks the appropriate areas of the site, enters search terms, and adjusts search parameters until matching houses appear on the screen.
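Adept has not published ACT-1’s internals, but the flow described above can be sketched as a planner that maps a high-level request to a sequence of primitive UI actions, which an agent then replays in the browser. Everything in the sketch below is hypothetical illustration: the `Action` type, the `plan_actions` stub, and the hard-coded house-hunting steps are assumptions made for this example, not Adept’s API.

```python
from dataclasses import dataclass

# Hypothetical UI-action primitives of the kind ACT-1 is described as emitting.
@dataclass
class Action:
    kind: str       # "navigate", "click", "type", or "scroll"
    target: str     # URL or UI element the action applies to
    text: str = ""  # payload for "type" actions

def plan_actions(request: str) -> list[Action]:
    """Stand-in for the model: turn a high-level request into UI actions.

    A real action transformer would predict these steps one at a time from
    the observed browser state; here the article's house-hunting example is
    hard-coded just to show the shape of the output.
    """
    return [
        Action("navigate", "https://realestate.example.com"),  # hypothetical site
        Action("click", "search-box"),
        Action("type", "search-box", "houses in Houston"),
        Action("click", "max-price-filter"),
        Action("type", "max-price-filter", "600000"),
        Action("click", "search-button"),
    ]

def execute(actions: list[Action]) -> list[str]:
    """Replay the planned actions; a real agent would drive the browser here."""
    log = []
    for a in actions:
        desc = f"{a.kind} {a.target}" + (f" -> {a.text!r}" if a.text else "")
        log.append(desc)
    return log

if __name__ == "__main__":
    for step in execute(plan_actions("Find a house in Houston. Budget is $600,000")):
        print(step)
```

The key design point the article hints at is this separation: a learned planner that emits generic actions (click, type, scroll), and a thin executor (the Chrome extension) that applies them, so the same model can operate any web UI without site-specific code.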