360 Animation for Vespertine #15

On October 19th 2016, 'Zoom Through' was presented at the National Centre for Early Music, York. The project was commissioned by Vespertine, the National Centre for Early Music and AOC Archaeology. The 360 animation was an abstract/artistic response to the historic space being scanned with LIDAR equipment. It was then released on YouTube after the event, creating an artefact that had a life beyond the event day.


The LIDAR dataset produced by AOC Archaeology was a model of over 40 million points of the church that is home to the National Centre for Early Music. Using MeshLab, this was reduced to around 500,000 points so it could work efficiently in Trapcode Form without memory errors. The reduction also created the 'x-ray' aesthetic of the animation: the point cloud resolution is low enough to see through the walls and columns. To produce the equirectangular video output for 360, the Mettle SkyBox Studio plugin was used in After Effects to render the Trapcode Form object. The plugin creates a six-camera set-up to produce a cubemap, which can then be unfolded into the equirectangular image.
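That reduction step is easy to prototype outside MeshLab too. Here is a minimal Python sketch of the same idea, assuming the scan has been exported as a plain-text .xyz file; the file names and the plain random sampling are placeholders (MeshLab's own sampling filters are smarter about preserving even coverage):

```python
import numpy as np

TARGET_POINTS = 500_000  # roughly the density that gave the 'x-ray' look

# Load the full scan: a plain-text .xyz export, one "x y z" point per line.
# (The real project used MeshLab's sampling filters instead.)
points = np.loadtxt("church_scan.xyz")
print(f"Loaded {len(points):,} points")

# Randomly keep ~500k points. Thinning the cloud evenly is what lets
# you see through the walls and columns in the final render.
rng = np.random.default_rng(seed=42)
keep = rng.choice(len(points), size=TARGET_POINTS, replace=False)
np.savetxt("church_scan_reduced.xyz", points[keep], fmt="%.4f")
```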
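For reference, the cubemap-to-equirectangular unfold that SkyBox Studio performs can be sketched generically: for every pixel of the output panorama, compute a direction on the sphere, work out which cube face that direction hits, and sample it. Below is a nearest-neighbour version in NumPy; the face names and orientation convention are assumptions, since renderers differ on both:

```python
import numpy as np

def cubemap_to_equirect(faces, width=4096):
    """Unfold six square cube faces into one 2:1 equirectangular image.

    `faces` maps "+x", "-x", "+y", "-y", "+z", "-z" to NxNx3 uint8
    arrays. The orientation convention below is one common choice;
    faces from another renderer may need flipping or rotating.
    """
    height = width // 2
    # Longitude and latitude of every output pixel.
    lon = (np.arange(width) + 0.5) / width * 2 * np.pi - np.pi
    lat = np.pi / 2 - (np.arange(height) + 0.5) / height * np.pi
    lon, lat = np.meshgrid(lon, lat)

    # Unit direction vector on the sphere for each pixel.
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    eps = 1e-9  # avoids divide-by-zero warnings on masked-out pixels
    ax, ay, az = np.abs(x) + eps, np.abs(y) + eps, np.abs(z) + eps

    # (mask, face name, u, v): which face each ray hits, and where.
    lookups = [
        ((ax >= ay) & (ax >= az) & (x > 0),  "+x", -z / ax, -y / ax),
        ((ax >= ay) & (ax >= az) & (x <= 0), "-x",  z / ax, -y / ax),
        ((ay > ax) & (ay >= az) & (y > 0),   "+y",  x / ay,  z / ay),
        ((ay > ax) & (ay >= az) & (y <= 0),  "-y",  x / ay, -z / ay),
        ((az > ax) & (az > ay) & (z > 0),    "+z",  x / az, -y / az),
        ((az > ax) & (az > ay) & (z <= 0),   "-z", -x / az, -y / az),
    ]

    out = np.zeros((height, width, 3), dtype=np.uint8)
    for mask, name, u, v in lookups:
        face = faces[name]
        n = face.shape[0]
        # Map u, v from [-1, 1] to pixel indices (nearest neighbour).
        col = np.clip(((u[mask] + 1) / 2 * (n - 1)).round().astype(int), 0, n - 1)
        row = np.clip(((v[mask] + 1) / 2 * (n - 1)).round().astype(int), 0, n - 1)
        out[mask] = face[row, col]
    return out
```

Running this over the six face renders gives the 2:1 frame that YouTube's 360 player expects; the exported video then needs 360 metadata injected (Google's Spatial Media Metadata Injector) before YouTube will treat it as a 360 upload.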

Utilising the same Trapcode Form asset, a sound-reactive version was also developed as a visual response to Jez Wells's 16-speaker audio installation, which was also part of the Vespertine event that evening.
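The standard route for this in After Effects is Layer > Keyframe Assistant > Convert Audio to Keyframes, which bakes the soundtrack into an 'Audio Amplitude' layer whose sliders Form's parameters can be expression-linked to. Under the hood that is just a per-frame loudness envelope; a rough Python equivalent of the idea, assuming a mono 16-bit WAV (the file name is a placeholder) and a 25 fps comp:

```python
import wave
import numpy as np

FPS = 25  # composition frame rate (assumed)

# Read a mono 16-bit WAV (a multichannel installation mix would need
# mixing down first).
with wave.open("vespertine_mix.wav", "rb") as wav:
    rate = wav.getframerate()
    samples = np.frombuffer(wav.readframes(wav.getnframes()), dtype=np.int16)

# RMS loudness per video frame: one value per frame to drive a Form
# parameter such as particle size or disperse amount.
hop = rate // FPS
frames = len(samples) // hop
envelope = np.array([
    np.sqrt(np.mean(samples[i * hop:(i + 1) * hop].astype(np.float64) ** 2))
    for i in range(frames)
])
envelope /= max(envelope.max(), 1e-9)  # normalise to 0..1
```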


On the evening, Google Cardboard headsets were used with a BYOD (bring your own device) expectation of the audience. This worked out roughly 50/50: about half the audience had a smartphone or tablet capable of playing 360 YouTube video. Part of the event was also the audience constructing their own headset from a pre-cut kit, which they could take away afterwards. Ink stamps and other art supplies were at hand to customise the Cardboard headsets, which is something I feel needs to be exploited further in future projects, as there is value here for the audience's investment and addition to the creative conversation.

The experience of viewing the LIDAR data whilst being in the same space was an interesting experiment. Usually the expectation of these videos is escapist: to take you out of the current space to somewhere new. The emergent play of people using the headset, then looking around the church interior and back again to the animation, was slightly unexpected but welcome. The original conception was that the video would be watched in the reception area before people filtered into the church's main hall. Anecdotally, audience members gave feedback that it made them see details of the space that they hadn't focused on before, particularly the rafters in the roof. This cheap 'Mixed Reality' is something I want to focus on in future projects - like AR, using the technology to add to the immersion/narrative of the space the audience is already in.



Exploring AR with Vuforia and Unity



Playing catch-up on fundamental skills in Augmented Reality. This test was done using the Vuforia plugin for Unity, exporting as an Android app. Using the tutorials at https://developer.vuforia.com/downloads/samples as a starting point, it was quick to get up and running. (More time was spent trying to get the Nexus 7 to play ball with Windows 10, grrr!) The tracking target was a postcard; you upload an image of it to the Vuforia database to generate the tracking information. Vuforia is free if you are happy to have a watermark on the app. You also have to create an app license key that you copy and paste into the Unity project to register it.
Having a dabble also gave an opportunity to use Mixamo to generate a 3D animated character to test with. Mixamo has now been added to Adobe's Creative Cloud services, so there are pre-made animations for Adobe Fuse characters that can be exported as .fbx.
As a conceptual idea, I added particle trails to the dancer; this is a direction I plan to explore further when I work with dancers on other project ideas.