After big concerts and festivals, your Snapchat feed is likely full of footage of the stage interspersed with your friends rocking out. With its new Crowd Surf feature, Snapchat wants to take your second-hand viewing experience to the next level.
Snapchat's new AI could completely change the way we watch live events
Snapchat's new Crowd Surf feature uses people's concert Snaps to seamlessly film the stage from multiple perspectives.
Built in-house by Snap's research team, the proprietary machine learning technology stitches together Snaps submitted to Our Story, using geolocation data and timestamps to align the clips into a semi-seamless video with continuous audio.
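Snap hasn't published how Crowd Surf's stitching actually works, but the basic idea of chaining timestamped clips from different attendees into one continuous timeline can be sketched roughly like this (the data and the greedy hand-off logic here are purely illustrative, not Snap's method):

```python
# Illustrative sketch only: Snap has not disclosed Crowd Surf's algorithm.
# Given timestamped clips from different concertgoers (hypothetical data),
# greedily chain them into one playlist, switching perspective wherever
# clips overlap so playback never stops.

def stitch(clips):
    """Return a list of (source, start, end) segments covering the
    event timeline with no gaps, switching sources at overlaps."""
    clips = sorted(clips, key=lambda c: c["start"])
    timeline = []
    position = clips[0]["start"]
    for clip in clips:
        if clip["end"] <= position:
            continue  # this clip adds no new footage
        if clip["start"] > position:
            break     # gap in coverage: nothing bridges it, so stop
        timeline.append((clip["user"], position, clip["end"]))
        position = clip["end"]
    return timeline

# Hypothetical Snaps from three attendees (times in seconds):
snaps = [
    {"user": "A", "start": 0,  "end": 12},
    {"user": "B", "start": 8,  "end": 25},
    {"user": "C", "start": 20, "end": 40},
]
print(stitch(snaps))  # [('A', 0, 12), ('B', 12, 25), ('C', 25, 40)]
```

Even this toy version shows why the feature needs dense coverage: the moment no clip overlaps the current playback position, the chain breaks, which matches the snippet-like results described below.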
A button at the bottom of the screen lets viewers switch between perspectives. During Lorde's performance, you get views from the VIP section and from people scattered at various angles around the stage.
The technology relies on people taking Snaps at the same time from many different places. In the case of the Lorde concert, you only get snippets of full songs, and while the audio is seamless, the video jumping from person to person is disorienting. You also get the classic front-facing shots of a random person's face.
Crowd Surf is just the latest update to Snap's story feature. Since launching it in 2014, Snap has tried to pull viewers in with curated event coverage and its Discover content. The company has struggled since its March IPO, and continues to search for new ways to serve ads to the millions of viewers who open the app every day.
Snap has faced tough competition since Facebook launched its own Stories feature on both Facebook and Instagram. But Facebook hasn't made any moves into editorial curation, and for the moment Snap rules that space.