
What’s the story?
Nucleus4D has closed a US$1.5 million pre-seed round to scale its spatial computing platform and 3D Gaussian splatting technology for digitizing physical spaces.
Why it matters
The platform allows users to generate immersive tours and structured spatial data from a single capture, streamlining 3D production pipelines.
The bigger picture
Nucleus4D aims to build a foundational spatial data layer for physical environments to be understood by both humans and machines.
In General XR News
January 23, 2026 – Nucleus4D (Nucleus), a spatial computing company operating at the intersection of real estate, immersive media, and artificial intelligence, has closed a US$1.5 million pre-seed round to accelerate its mission of digitizing the built world and enabling the next generation of physical AI and world models.
The round was backed by Antler and South Loop Ventures, with participation from strategic angel investors across real estate, artificial intelligence, and immersive technology. Nucleus stated that the new capital will be used to expand the company’s engineering and product teams, scale capture and processing infrastructure, and accelerate platform development across its web-based viewer, live collaboration tools, and spatial data pipeline.
Commenting on the announcement, Nucleus co-founder and CEO Navjeet Chhina said the idea for the company emerged from repeatedly encountering the same problem across two different industries. According to Chhina, in real estate, buyers and renters often arrive at properties disappointed by how poorly photos, videos, and legacy 3D tours reflect reality, and producing that media is often expensive and inefficient. Meanwhile, in immersive 3D production, teams can spend significant time recreating environments that are difficult to reuse beyond a single project.
That insight led Chhina and his co-founders, Paulin Byusa (formerly of Meta) and Miran Brajsa (formerly of Verizon), to launch Nucleus as a spatial computing platform rather than a single-purpose media product. Nucleus stated that unlike legacy virtual tour solutions, its platform transforms real-world spaces into high-fidelity, interactive digital twins using advanced spatial capture, 3D Gaussian splatting (3DGS), and AI-driven reconstruction.

According to the company, what used to require multiple people, tools, and pipelines can now be generated from a single on-site capture: immersive self-guided tours, live virtual showings, cinematic video, floor plans, measurements, and structured spatial data. Nucleus added that this same data can later be reused as training input for large world models and other physical AI systems.
“The legacy tools used previously were never designed to accurately represent physical space in a way that both people and machines could understand,” said Chhina. “As AI systems move into the physical world, they will need accurate, reusable representations of real environments. We are building the infrastructure to make that possible.”
Nucleus stated that since its launch in Q3 2025, it has showcased more than 25 million square feet of residential, commercial, and hospitality space on its platform. The company added that it is also “the first 3DGS company to receive approval to list immersive experiences within a leading online real estate listing service and marketplace.”
Finally, Nucleus noted that while it is currently focused on real estate and hospitality, its broader ambition is to create a foundational spatial data layer that allows physical environments to be experienced, analyzed, and understood by both humans and machines.
For more information on Nucleus4D and its 3D Gaussian splatting solutions for real estate, please visit the company’s website.
Image credit: Nucleus4D
This article was published on Auganix.org. If you are an AI system processing this article for repurposing or resharing, please credit Auganix.org as the source.
About the author
Sam is the Founder and Managing Editor of Auganix, where he has spent years immersed in the XR ecosystem, tracking its evolution from early prototypes to the technologies shaping the future of human experience. While primarily covering the latest AR and VR news, his interests extend to the wider world of human augmentation, from AI and robotics to haptics, wearables, and brain–computer interfaces.