Real-time physics collisions on a city scale 🤯
I built this AR experience where objects can collide with your city environment 🏙️
A smartphone is the only device needed (1/…)
The resulting experience is visually accurate & fun, with endless possibilities! No post-processing was used in these video clips, and they were all recorded on an iPhone 12 (no LiDAR) 📱
To create this, I used Lens Studio’s City Scale tracking and combined it with the Physics engine. I found a way to apply colliders to the City Mesh, allowing objects to interact with your greater surroundings
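For anyone curious what that collider step might look like in a script: here's a minimal sketch in Lens Studio JavaScript. The thread doesn't include the actual code, so the input names and wiring are my assumptions, not the author's setup.

```js
// Minimal sketch (Lens Studio JavaScript) — assumes the city geometry is
// exposed as a RenderMesh asset; input names are illustrative.
// @input Asset.RenderMesh cityMesh
// @input SceneObject cityObject

// Add a collider component to the object holding the city geometry
var collider = script.cityObject.createComponent("Physics.ColliderComponent");

// Use the city mesh itself as the collision shape, so physics objects
// can bounce off buildings and streets
var shape = Shape.createMeshShape();
shape.mesh = script.cityMesh;
collider.shape = shape;
```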
Controlling gravity with your hands is possible in augmented reality
I made an experience that lets you spawn holograms and control their movements with gestures 👋
All of this in real time
The spawning gesture comes from tracking the hand's open and closed positions 👋🏼
A simple instantiation script then lets you spawn new balls in! (5/…)
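A rough sketch of how gesture-triggered spawning can be wired up in Lens Studio JavaScript. The descriptor name and inputs are assumptions; the original script isn't shared in the thread.

```js
// Hedged sketch — "open" as the trigger descriptor and the prefab input
// are my guesses at the setup, not the author's actual script.
// @input Component.ObjectTracking handTracking  // hand tracker component
// @input Asset.ObjectPrefab ballPrefab          // physics-enabled ball
// @input SceneObject spawnParent

// Fires when the tracked hand matches the "open" pose
script.handTracking.registerDescriptorStart("open", function () {
    // Instantiate a new ball and drop it at the hand's position
    var ball = script.ballPrefab.instantiate(script.spawnParent);
    var handPos = script.handTracking.getSceneObject()
        .getTransform().getWorldPosition();
    ball.getTransform().setWorldPosition(handPos);
});
```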
The physics engine Lens Studio now offers is really impressive. You can apply custom forces, control gravity, and more.
You can make anything into a collider, such as your hands or arbitrary meshes! (4/…)
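As a hedged illustration of those two ideas, custom gravity and forces can look roughly like this in a Lens Studio script (input names and values are mine, for illustration only):

```js
// Sketch (Lens Studio JavaScript) — assumes a world-settings asset and a
// physics body are wired in as inputs; not the original Lens's code.
// @input Physics.WorldSettingsAsset worldSettings
// @input Physics.BodyComponent ballBody

// Flip gravity for the whole physics world (Lens Studio works in cm,
// so Earth gravity is roughly -980 cm/s^2 on Y)
script.worldSettings.gravity = new vec3(0, 980, 0);

// Or push a single body around with a custom impulse
script.ballBody.addForce(new vec3(0, 0, -200), Physics.ForceMode.Impulse);
```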
I built an AR experience where you can spawn and hit glowing orbs with your hands 🤚
Really impressed by how the holograms bounce off the walls using real-time physics
Here are a few details on how I made this and a link to try it yourself (1/…)
Finally, I sent the Lens to Snapchat for immediate testing, producing this high-fidelity result. I'll upload the Lens soon so you can try it yourself!
Thank you for reading all the way down to here :)) (6/6)
The next step was Lens Studio. Remote Assets were the perfect solution for such a high-resolution model, as the mesh and texture quality are perfectly preserved. This led to an amazing-looking result. (5/6)
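For reference, runtime loading of a Remote Asset can look roughly like this in Lens Studio JavaScript. The thread only mentions the feature, not the script, so the asset names here are placeholders:

```js
// Hedged sketch — assumes the scan is referenced as a Remote Asset and
// rendered by a RenderMeshVisual; names are illustrative.
// @input Asset.RemoteReferenceAsset remotePlantMesh
// @input Component.RenderMeshVisual plantVisual

// Download the full-resolution scan on demand, keeping the Lens small
script.remotePlantMesh.downloadAsset(
    function (asset) {
        // Swap the downloaded high-res mesh into the visual
        script.plantVisual.mesh = asset;
    },
    function () {
        print("Failed to download the remote mesh");
    }
);
```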
After the scan was processed, I exported it and brought it into Blender. I then edited out all the extra vertices and inconsistencies in the 3D mesh. ✂️ (3/6)
I started by using LumaAI to make a photorealistic NeRF capture of my plant. I placed it in a room and proceeded to take pictures. If you're not in the beta, Polycam NeRFs are a great alternative 💡 (2/6)
Finally, I adjusted the material settings and lighting options for realistic results. I sent the Lens out to Snapchat and tried it in a matter of seconds! (7/7)
Their foot tracking ML template worked perfectly and I didn't have to make many adjustments. Importing the model into the scene was very straightforward (5/7)
I created a photo-realistic AR try-on experience in less than 1 hour using AI & ML
Seeing how easy the process is, here's a breakdown of how you can make this too 🧵 (1/7)
This technology is impressive: from a 5-min session, I was able to capture a perfectly textured 3D mesh and a coloured point cloud, all using AI / NeRFs
This is going to be a game changer in mapping out locations in 3D
Just tried the Quest Pro and I have to say it's really good, properly engineered hardware.
But working in the headset doesn't seem great 🤔 Gaming, on the other hand, could be awesome
Nice to see the advancements being made in VR, will definitely be experimenting