Having just recently moved to The Mill London from Amsterdam, it was a treat to get back into the swing of things with a project that incorporated some of the techniques I’ve always had fun with and experimented with in both professional and personal work: Kinect-based 3D scanning and object-based particle systems.

For this spot, the CG team was tasked with creating a digital, information-based look that would be applied to footage of people interacting with devices. The look and feel had to match the live-action shots preceding the CG, which featured turnarounds of people immersed in their devices against a dark environment.
To keep real-life forms and proportions correct, we opted to use the Kinect to scan people. This way we could direct their position and pose, scan them, and immediately bring them into Cinema4D. This not only eliminated the need to model, rig and pose 3D characters, but also gave us really natural shape distortions and imperfections that would be time-consuming to achieve via traditional modelling.
Once we had those models in place, we smoothed out some of the inevitable scanning artefacts and refined some of the facial details. The next step was to animate the camera revolving at about the same speed and angles as the live-action shots – for this we made low-res proxies to speed up our viewport previews.
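At its core, a revolve like this is just sampling camera positions on a circle around the subject at a fixed angular speed, with the camera aimed back at the subject each frame. A minimal sketch in plain Python (the frame rate, radius, and orbit speed below are assumed values, not the actual shot settings):

```python
import math

def orbit_keyframes(center, radius, height, degrees_per_sec, fps=25, seconds=4.0):
    """Sample camera positions on a circular orbit around `center`.

    Returns a list of (frame, (x, y, z)) tuples; in the 3D app, the camera
    would also get a look-at constraint targeting `center` each frame.
    """
    frames = int(fps * seconds)
    keys = []
    for f in range(frames + 1):
        # angle swept so far at the given angular speed
        angle = math.radians(degrees_per_sec * f / fps)
        x = center[0] + radius * math.cos(angle)
        z = center[2] + radius * math.sin(angle)
        keys.append((f, (x, center[1] + height, z)))
    return keys

# e.g. a 4-second, 90-degrees-per-second revolve at 25 fps
keys = orbit_keyframes(center=(0.0, 150.0, 0.0), radius=400.0,
                       height=20.0, degrees_per_sec=90.0)
```

Matching the live action then becomes a matter of tuning `degrees_per_sec` and the start angle against the plate.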
Once the models and camera moves were in a good place, the next round of fun began. After some R&D we opted to emit particles based on the lighting setup. Using X-Particles, we built a setup that would emit static particles where the light on the models was at its brightest, and cease to emit where there was less than 25% light.
With this setup it was easy to sculpt our particle-based forms – moving a light around would give dramatic effects where needed, and allowed us to increase or decrease the particle count based on the amount of light. For farm rendering we baked the light maps onto the geometry, which also allowed for really fast and accurate rendering.
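The light-driven emission rule described above – no particles below the 25% cutoff, more particles where it's brighter – can be sketched as a simple scatter over surface points with baked luminance values. This is a generic illustration, not the X-Particles setup itself; `MAX_PARTICLES_PER_POINT` and the jitter range are assumed values:

```python
import random

LIGHT_THRESHOLD = 0.25        # the 25% emission cutoff from the setup above
MAX_PARTICLES_PER_POINT = 8   # assumed density cap

def emit_from_luminance(points, luminance, seed=0):
    """Scatter static particles on surface points, driven by baked light.

    `points` are surface positions, `luminance` their baked light values
    in [0, 1]. Brighter points get more particles; points below the 25%
    threshold get none.
    """
    rng = random.Random(seed)
    particles = []
    for pos, lum in zip(points, luminance):
        if lum < LIGHT_THRESHOLD:
            continue  # too dark: no emission here
        # scale the count linearly with brightness above the threshold
        t = (lum - LIGHT_THRESHOLD) / (1.0 - LIGHT_THRESHOLD)
        count = max(1, round(t * MAX_PARTICLES_PER_POINT))
        for _ in range(count):
            # small random jitter so particles don't stack exactly
            particles.append(tuple(c + rng.uniform(-0.5, 0.5) for c in pos))
    return particles
```

Moving a light around then reshapes the particle form for free, because the luminance map – and with it the emission density – follows the light.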
The light-based particle setup was re-used for the city scene. Using a mix of light setups for the roads, buildings, and accents (cars, light posts, trash bins) allowed us to flesh out a seemingly elaborate look in a remarkably short amount of time. Using X-Particles groups also allowed us to render out separate passes for different elements and sections, which sped up rendering.
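The pass-splitting idea amounts to tagging each particle with a group and rendering each group independently. A minimal sketch (the group names mirror the city elements mentioned above; the positions are placeholder data):

```python
from collections import defaultdict

def split_into_passes(particles):
    """Bucket tagged particles so each group renders as its own pass,
    in the spirit of X-Particles groups.

    `particles` is an iterable of (group_name, position) pairs.
    """
    passes = defaultdict(list)
    for group, position in particles:
        passes[group].append(position)
    return dict(passes)

city = [
    ("roads", (0, 0, 0)),
    ("buildings", (10, 40, 0)),
    ("accents", (5, 2, 3)),    # cars, light posts, trash bins
    ("roads", (20, 0, 5)),
]
passes = split_into_passes(city)
# each group can now be rendered – and graded in comp – independently
```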
We did some tests for particle rendering using Hair and Pyrocluster, but at the end of the day X-Particles’ built-in shader was really fast and nailed the look we were going for.
In post, all the elements were brought together – additional colouring and light manipulation helped the renders look even better. The city scene also got some additional design treatment using Trapcode Particular to add and accentuate the data lines that the design required.
The title of this project is SuperFast Broadband – I believe it does more than serve as a title for the TV spot; it actually defines the workflow and timeframe. The mix of Kinect scans, C4D, and X-Particles gave us a lot of flexibility: the speed and efficiency of that combination (along with the great team dynamic) let us bang out many iterations until we were happy, without feeling restricted by a short timeline for research and delivery.
Agency: The Mill
Director: Carl Addy
Producer: Luiza Cruz-Flade
Art Director: Antar Walker
Lead 3D: Fred Huergo
3D: Gabor Ekes, Ashley Tyas, Stephanie Dewhirst, Antar Walker
2D: Ivo Soussa, Ashley Tyas
Editor: Joe Wilby