City-Scale Augmented Reality

February 21, 2023

Snap has released a new feature called City-scale that allows creators to place virtual content at real-world locations without having to scan the site and build a mesh in advance. An invisible mesh is generated over the buildings and landmarks at the location, which lets content interact meaningfully with the environment and enhances the realism: characters can hide behind walls; balls can bounce off them. The technology itself is not new, and Snap has offered a handful of locations through its Landmarker Cloud Service for a while, but City-scale opens up the possibilities across central London, downtown Los Angeles and Santa Monica. Creators have also been able to scan their own locations and build custom location lenses, but that approach is more restricted and doesn't provide mesh data at anything like this scale. To demonstrate just how incredible these types of experiences can be, I built a small prototype set in Trafalgar Square, London.

The first test was simply to see if I could put a party hat and a balloon on Nelson's head, so I headed off to London to try it out. Thankfully, I am only about 45 minutes from central London on the train from my office in Kent. Unfortunately, when I arrived and launched Snapchat to test the lens, it crashed. It quickly became apparent that Snapchat only crashed when opening this particular type of lens, so I had to abandon the test.

The tech team at Snap confirmed that there was indeed a known bug with City-scale in the iOS client, and that it would be fixed in the next release. I just had to wait. At least I knew I was exploring something cutting-edge! Whilst I waited, I carried on with the project and decided to try adding some physics. The city mesh that is generated does not include a collider, but with a little pain-free scripting I was able to add one to each of the mesh tiles. I then added a cannon from the Physics template.
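For anyone curious, the hack amounted to something along these lines: a small script that walks the generated city-mesh tiles and attaches a mesh collider to each one. Treat it as a sketch rather than a drop-in solution; the tile hierarchy under cityMeshRoot is my assumption about how the generated mesh is organised, and the Physics API names (Physics.ColliderComponent, Shape.createMeshShape) are as I remember them, so check them against the current Lens Studio documentation.

    // AddTileColliders.js — attach to a Scene Object in the lens.
    // Walks the children of the city-mesh root and gives each tile a
    // mesh collider so physics objects can bounce off buildings.
    // NOTE: the tile hierarchy and the exact Physics API calls below are
    // assumptions worth verifying against the Lens Studio docs.

    //@input SceneObject cityMeshRoot

    script.createEvent("OnStartEvent").bind(function () {
        var root = script.cityMeshRoot;
        for (var i = 0; i < root.getChildrenCount(); i++) {
            var tile = root.getChild(i);
            var visual = tile.getComponent("Component.RenderMeshVisual");
            if (!visual) {
                continue; // skip anything that isn't a mesh tile
            }
            // A collider with no body component acts as static geometry,
            // which is what we want for buildings and monuments.
            var collider = tile.createComponent("Physics.ColliderComponent");
            var shape = Shape.createMeshShape();
            shape.mesh = visual.mesh;
            collider.shape = shape;
        }
    });

With something like that in place, anything fired from the Physics template's cannon collides with the invisible city geometry instead of sailing straight through it.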

Snap released their new Snapchat iOS client soon afterwards, and with great excitement I headed back into London to see the results. Success! The lens loaded and I was able to launch massive yellow ducks at Nelson's Column.

I started documenting the results and making notes on some of the peculiarities of the experience. It became apparent that the tracking wasn't going to stay completely accurate if you moved around too much, so the activity should be designed around a single vantage point. The occlusion algorithms try to hide virtual objects behind people who walk in front of them, but in a place as busy as Trafalgar Square that becomes increasingly difficult, so it's best to keep content above head-height.

It is also not obvious just how big landmarks are in the real world: that party hat and balloon I put on Nelson's head were practically invisible at that scale. The hat wasn't accurately positioned either, as the mesh drifted slightly, so don't expect pinpoint placement.

Overall, a successful prototype that has given me enough insight to guide my clients in the future. I would definitely recommend City-scale to retail stores or brands that could make good use of city landmarks.

For now, Trafalgar Square will be my City-scale sandbox for prototypes and experimentation. I'll be releasing an Easter-themed lens soon.
