
Bringing Real-Time AI Models into XR: Introducing Unity Sentis Support in QuarkXR

As XR development continues to evolve, the convergence of real-time 3D graphics and advanced AI models is reshaping how professionals create and interact with immersive environments. At QuarkXR, we’ve always seen Extended Reality as more than just a visualization medium - it’s a dynamic, interactive interface that can intelligently respond to user input, contextual data, and enterprise-specific needs. Today, we’re excited to announce a major step in that direction: support for Unity Sentis, Unity’s AI inference framework, is now integrated into QuarkXR.

Why This Matters for XR Professionals:
If you’re an XR developer, 3D content creator, or an enterprise user in sectors like architecture, engineering, construction (AEC), manufacturing, or location-based scouting, you know that adding intelligence to your immersive environments can streamline workflows, reduce iteration time, and enhance end-user experiences. With Unity Sentis support, it’s now possible to run advanced AI models - ranging from speech recognition systems to custom data analytics - right inside your XR application.
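
To give a concrete sense of what this looks like in practice, here is a minimal C# sketch of loading a model and running inference with Sentis inside a Unity scene. The model asset and input shape are placeholders, and class names vary between Sentis releases (this sketch assumes the 1.x-style API with WorkerFactory and TensorFloat), so treat it as an illustration rather than drop-in code.

```csharp
using Unity.Sentis;
using UnityEngine;

// Minimal sketch: load an ONNX model imported as a Sentis ModelAsset and run it
// on the GPU. "modelAsset" is a placeholder for whatever model your project uses.
public class SentisInferenceExample : MonoBehaviour
{
    public ModelAsset modelAsset;   // assigned in the Inspector
    IWorker worker;

    void Start()
    {
        // Build a runtime model and create a GPU-compute worker for it.
        Model model = ModelLoader.Load(modelAsset);
        worker = WorkerFactory.CreateWorker(BackendType.GPUCompute, model);
    }

    public float[] RunInference(float[] features)
    {
        // Wrap the input in a tensor of shape (1, featureCount) and execute the model.
        using var input = new TensorFloat(new TensorShape(1, features.Length), features);
        worker.Execute(input);

        // Pull the result back to the CPU so it can be read by game logic.
        var output = worker.PeekOutput() as TensorFloat;
        output.MakeReadable();
        return output.ToReadOnlyArray();
    }

    void OnDestroy() => worker?.Dispose();
}
```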

Real-Time AI at the Edge:
By leveraging NVIDIA Cloud GPUs such as the A10 and L4, QuarkXR can handle both the heavy lifting of complex rendering and the computational intensity of AI inference. This combined GPU-based approach means you’re not just visualizing data; you’re interpreting and interacting with it intelligently. Imagine providing your end-users with a voice-driven architectural walkthrough or using computer vision to identify key features in a manufacturing training simulation - all processed in real time, without latency-heavy round trips to distant data centers.

Security & Compliance:
With QuarkXR, you are in charge of your data. The AI model and XR experience are hosted in a containerized environment that is isolated from the outside world. QuarkXR instances can run in any corporate cloud tenant, as well as on premises.

QuarkXR integrates seamlessly with NVIDIA CloudXR for fully encrypted, edge-to-cloud streaming. This ensures that all your organization’s proprietary data, user interactions, and AI inference requests remain private, secure, and compliant with industry standards. Data never leaves the secure environment, so you can confidently incorporate sensitive models or workflows into your XR applications without sacrificing privacy.

A Working Example: Whisper by OpenAI:
To demonstrate the power of embedded AI in XR, we’ve integrated OpenAI’s Whisper speech recognition model directly into a Unity-powered environment. In this setup, your spoken commands or questions are transcribed locally - no external APIs or data transfers required. From there, these transcripts can feed into a variety of downstream tasks, such as language translation, summarization, or keyword extraction, all within your XR experience.
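
To illustrate the capture side of such a setup, the hedged sketch below records a short clip from the default microphone at the 16 kHz sample rate Whisper expects and hands the raw samples to a transcription helper. The WhisperRunner class is a hypothetical stand-in for the Sentis-backed Whisper encoder/decoder and token decoding, which are beyond the scope of this snippet.

```csharp
using UnityEngine;

// Sketch: capture a short voice command and pass the samples to a local
// Whisper transcription pipeline running via Unity Sentis.
public class VoiceCommandCapture : MonoBehaviour
{
    const int SampleRate = 16000;   // Whisper expects 16 kHz mono audio
    const int ClipSeconds = 5;

    AudioClip recording;

    public void StartListening()
    {
        // null selects the default microphone device.
        recording = Microphone.Start(null, false, ClipSeconds, SampleRate);
    }

    public string StopAndTranscribe()
    {
        Microphone.End(null);

        // Copy the recorded samples out of the AudioClip.
        var samples = new float[recording.samples * recording.channels];
        recording.GetData(samples, 0);

        // Hand the samples to the local transcription pipeline.
        return WhisperRunner.Transcribe(samples, SampleRate);
    }
}

// Hypothetical placeholder for the Sentis-backed Whisper pipeline; in the real
// integration this is where the encoder/decoder workers and token decoding run.
public static class WhisperRunner
{
    public static string Transcribe(float[] samples, int sampleRate)
    {
        // ... Sentis inference would go here ...
        return string.Empty;
    }
}
```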

Why Whisper?
Whisper runs efficiently on our GPU-accelerated setup and doesn’t send data to any external services. This local processing ensures both speed and confidentiality. As a result, you can integrate advanced speech recognition into your production pipelines without hitting bandwidth or privacy bottlenecks.

Next Steps: Connect to LLMs and Beyond
The Whisper demo is just the start. One natural extension is to connect the transcription output to a Large Language Model (LLM) for real-time translations, domain-specific Q&A, or dynamic content generation. Whether you’re guiding enterprise clients through a complex product catalog in VR or offering on-the-fly updates during an AR-guided maintenance session, the potential to enhance user experience with language and AI-driven insights is immense.
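
As a rough illustration of that hand-off, the sketch below posts a transcript to a self-hosted LLM service running inside the same secure environment. The endpoint URL and JSON payload shape are purely hypothetical placeholders for whatever service your deployment exposes.

```csharp
using System.Collections;
using System.Text;
using UnityEngine;
using UnityEngine.Networking;

// Sketch: forward a Whisper transcript to a self-hosted LLM endpoint for
// translation, Q&A, or content generation. Endpoint and payload are placeholders.
public class TranscriptToLlm : MonoBehaviour
{
    [SerializeField] string llmEndpoint = "http://localhost:8000/v1/generate"; // hypothetical

    [System.Serializable]
    class Prompt { public string prompt; }

    public IEnumerator Ask(string transcript, System.Action<string> onReply)
    {
        string payload = JsonUtility.ToJson(new Prompt { prompt = transcript });

        // Build a POST request carrying the transcript as JSON.
        using var request = new UnityWebRequest(llmEndpoint, UnityWebRequest.kHttpVerbPOST);
        request.uploadHandler = new UploadHandlerRaw(Encoding.UTF8.GetBytes(payload));
        request.downloadHandler = new DownloadHandlerBuffer();
        request.SetRequestHeader("Content-Type", "application/json");

        yield return request.SendWebRequest();

        if (request.result == UnityWebRequest.Result.Success)
            onReply(request.downloadHandler.text);   // raw response; parse as needed
        else
            Debug.LogWarning($"LLM request failed: {request.error}");
    }
}
```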

And that's just the tip of the iceberg. Imagine feeding IoT data streams from connected devices into a domain-specific AI model to inform real-time simulations, making complex information immediately interpretable in an immersive setting. Or consider the XR environment itself as a simulation for training AI that needs real-world awareness, as we've already seen with XR being used to train robots in manufacturing.

Get Started with Our Unity SDK:
Ready to experiment with embedded AI models in your XR workflows? Visit our Unity SDK page and access our SDK download form to start building your own AI-driven XR applications. Our documentation includes detailed instructions, sample projects, and integration tips for using Unity Sentis alongside QuarkXR and NVIDIA CloudXR.

Conclusion:
The fusion of AI and XR is no longer a futuristic concept - it’s here, and it’s accessible. With QuarkXR’s support for Unity Sentis and secure NVIDIA CloudXR streaming, you can push the boundaries of what’s possible in spatial computing. We can’t wait to see what you’ll create.
