Following on from my earlier Lab Post about City-Scale Augmented Reality, I thought I would expand on the tests I have done to see how it could benefit Digital-Out-of-Home (DOOH) campaigns. DOOH is a huge industry and perfectly suited to location-based Augmented Reality. It's all about the eyeballs, they say, but increasingly it's also about the socials. DOOH campaigns are competing with social media for that wider engagement, and sometimes that can be the downfall of traditional media spaces: they are static and pinned in place. Anamorphic screens like Ocean Outdoor's DeepScreen, on the other hand, are very popular with audiences and brands alike. They have viral appeal and are shared enthusiastically on social media.
I wanted to see how much potential there is in combining City-scale AR with DOOH, so I built a fun Easter AR experience for three locations across London, UK. Two of these were prime DOOH sites: the Old Street roundabout screen owned by JCDecaux and the previously mentioned DeepScreen at Piccadilly Circus. I also returned to Trafalgar Square to see what I could do with Nelson's Column.
I bundled all the virtual content into a campaign called LDN|AR that will launch on Snapchat for the Easter holiday. I wanted to explore how the virtual content would sit in the real world and how Snapchat's City-scale tracking would hold up under stress.
I decided to create an Easter-themed experience and gathered together some 3D content, namely some Easter Bunnies and Eggs. With Mixamo I added some motion data, imported everything into Snap Lens Studio, and got to work.
I will save the technical details of creating City-scale AR experiences in Lens Studio for a more in-depth tutorial, but I will say that it is surprisingly straightforward, and I was able to get the AR scenes for each location ready quickly. It was time to test it out on location!
I headed into London. I spent a good few hours testing the content on location and edited the footage down into a video that you can watch here.
As you can see, this was a lot of fun, but it also taught me a great deal about creating content for these types of experiences. Here's what I learnt.
City-scale Augmented Reality is a Snap World Tracking technology that is built into their Snapchat app. It only works in their app and it only works when you are actually at the location designed for the experience. Why is this? When you build your project you choose the exact location you want to use as your 'stage' for the experience. Snap's Cloud service then pulls down the mesh data that it has available for that location. This data isn't perfect. It's also (most likely) not completely up-to-date. So the first thing you need to do is pick a spot that you are sure is a permanent fixture (or not likely to move any time soon). Once you have your mesh in place it is rendered within the IDE like any other scene object, so you can then place your virtual content within it. This is the fun bit.
Snap provides an ingenious animated material that shows up IRL as a coloured texture that seems to wash over buildings as you move your phone around. This tells you where you can tap to interact with the space. The properties of this animation can be adjusted, or it can be hidden completely, but for my tests I wanted to see that the mesh was aligning correctly with the objects in the real world.
When you arrive at the location, the app tells you to scan the environment and point your phone at buildings. It needs to see landmarks so that it can recognise them and align itself to the mesh that has been bundled into the Lens. This can take some time, and it relies on the user moving smoothly, slowly and methodically in the right lighting and weather conditions. It's also important to have landscape features for it to recognise, so being in the middle of a large, open park might be an issue.
Once localised, the mesh becomes visible through the animated texture on its material. This animation gives you a sense of volume, and it should be obvious if the mesh is aligning correctly with the buildings around you. I found that this was not always the case, and it could take additional scans with the phone for it to recalculate and adjust.
It became apparent that the mesh data can easily be jolted out of position if the phone moves around too quickly or if the user walks around and turns too much. It is best to keep the virtual content within an area that minimises this requirement: completely surrounding the user with content just isn't going to work. This was apparent at Trafalgar Square, where I had bunnies on the fountains, over the road, and all over the square. Direct your users to a spot that they can easily recognise and then ask them to scan the area. Keep the content within a 180° field of view from this spot.
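To make that 180° rule concrete, here is a minimal sketch of the kind of check you could run when authoring a scene. This is not part of Snap's API; the function name and the flat ground-plane coordinates are my own hypothetical illustration of "is this content in the front half-circle of the scan spot?".

```python
def within_half_circle(scan_spot, forward, content_pos):
    """Ground-plane (x, z) check: does content_pos sit within the
    180-degree field of view in front of scan_spot?

    scan_spot, content_pos: (x, z) world positions
    forward: (x, z) direction the user is asked to face after scanning
    """
    to_content = (content_pos[0] - scan_spot[0], content_pos[1] - scan_spot[1])
    # A non-negative dot product means the content is no more than 90 degrees
    # either side of the forward direction, i.e. inside the front half-circle.
    dot = forward[0] * to_content[0] + forward[1] * to_content[1]
    return dot >= 0.0

# Example: the user faces north (+z). A bunny ahead passes the check;
# one behind the user would force them to turn around, and fails.
print(within_half_circle((0, 0), (0, 1), (3, 5)))   # True
print(within_half_circle((0, 0), (0, 1), (0, -4)))  # False
```

Running a pass like this over every placed object would have flagged my Trafalgar Square layout, where content wrapped all the way around the user.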
The mesh data is not always accurate, and it can be incomplete and ragged. Sometimes there is mesh data that is not associated with any real-world item at all (at Piccadilly Circus, I think a bus may have been accidentally scanned). The mesh also occludes any virtual content you are trying to show, and whilst this is great for realism, it may not be that clean. The software also tries very hard to occlude people from the scenes, but in very busy, crowded areas this soon becomes an impossible task.
For this reason, it is best to avoid relying on occlusion and to place your virtual content up high, out in the open, and above head height if possible. Also, don't design your experience to replace any part of the scene that will be meshed. For example, my original plan for the Piccadilly Circus experience was to create my own 'portal' version of the DeepScreen and have Easter Eggs falling out of it. I soon realised that this wouldn't work because the generated mesh would occlude my virtual portal. I still think this could be made to work if occlusion is turned off and there is no other content that requires it. Something to explore further.
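The occlusion behaviour itself boils down to a per-ray depth test: if the city mesh is closer to the camera than your virtual object along the same view ray, the object is hidden. A toy illustration (my own hypothetical helper, with depths in metres, not Snap's implementation) of why the portal plan failed:

```python
def is_occluded(mesh_depth, content_depth, tolerance=0.1):
    """Depth-test sketch: the city mesh occludes virtual content whenever
    the mesh surface is nearer to the camera than the content along the
    same ray. mesh_depth is None where no mesh was hit; tolerance absorbs
    small mesh inaccuracy.
    """
    return mesh_depth is not None and mesh_depth + tolerance < content_depth

# A portal placed 2 m behind the meshed screen surface gets hidden...
print(is_occluded(10.0, 12.0))  # True
# ...while a bunny floating above the rooftops has no mesh in front of it.
print(is_occluded(None, 30.0))  # False
```

This is also why "up high and out in the open" works so well: content above the skyline rarely has any mesh between it and the camera, so the ragged edges of the scan never get a chance to clip it.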
I think the best parts of my test content were the pieces located up high: bunnies sat on top of buildings, out of the way, or floating in the sky. The bunnies would not have worked at Old Street roundabout if they had been at floor level, as there was way too much traffic and chaos. It was a huge building site, and most of the structures at floor level were not there when the mesh data was scanned.
To allow Easter Eggs and large chicks to bounce and spin off buildings without falling through the floor, I had to hack the Lens Studio template code and add a Collider to the city mesh. This was quite straightforward. Physics is what makes it really fun to play with, but, again, it relies on there being decent mesh data that is well aligned. Don't rely on it being super accurate. The same goes for positioning: you are not going to get the mesh to align perfectly every time, so don't try to place a virtual object with pinpoint accuracy.
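The collider is what gives the physics engine a surface to resolve against; without one, gravity simply carries the eggs through the ground. A Lens-Studio-free toy sketch of that difference (all numbers hypothetical; the real city mesh is obviously not a flat plane):

```python
def simulate_drop(y0, floor_y=None, restitution=0.5, dt=0.02, steps=200, g=-9.81):
    """Semi-implicit Euler integration of a falling egg.

    With floor_y=None there is no collider, so the egg falls forever.
    With a floor, the 'collider' clamps position and reflects velocity
    with some energy loss, so the egg bounces and settles.
    """
    y, vy = y0, 0.0
    for _ in range(steps):
        vy += g * dt
        y += vy * dt
        if floor_y is not None and y < floor_y:
            y = floor_y               # push the egg back onto the surface
            vy = -vy * restitution    # reflect with energy loss
    return y

no_collider = simulate_drop(5.0)               # plunges far below ground
with_collider = simulate_drop(5.0, floor_y=0)  # settles on the mesh
print(no_collider < 0.0)    # True
print(with_collider >= 0.0) # True
```

The real fix in Lens Studio was just attaching a Collider component to the city mesh object, but the principle is the same: physics objects only interact with geometry that has a collision shape.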
If you want to design a DOOH campaign that offers a companion City-scale AR Snap Lens, then you should definitely take these points into consideration. Think about how the content on the screen can be used in the environment around the screen, but in such a way that it is easy to materialise for the user and doesn't look messy. Make sure your virtual content has enough space and is not reliant on being pinned too accurately to a real-world feature. The Piccadilly Circus screen has a large area of space behind it that is perfect for a reveal: a character could easily transition up from behind the screen without clashing with, or being occluded by, buildings, people, etc. Text and other graphic accoutrements are also great candidates for content, as long as you hang them up above head height.
City-scale AR is a fantastic technology but it only works well if your campaign is tied to a specific location and your story is designed in such a way that it will look great even when the technology is not performing so well. To achieve this, you need to test on location as early as possible and consider all the points I have raised above.
If you would like help to design and implement your own City-scale Augmented Reality campaign, then get in touch for a free consultation: email@example.com