It's been a long time since I've done a Project Challenge post. I haven't done a single one since moving to Digital Domain either. So let's go over a spot that I really enjoyed working on...LG Advanced Learning. Here is a link to the commercial. This is the 30-second cut. We did a 60, but I can't find it anywhere online that doesn't require a subscription.
*edit* The 60-second version is now online at Digital Domain's website...Here is a link to the QuickTime.
This project was a lot of fun to work on. It was a pretty good challenge, and in the end it went very smoothly and I'm quite pleased with how it all turned out. This was the third project I worked on at DD.
The great thing for me was that another co-worker of mine and I convinced our supervisors to use Max/Vray for this job. Vray is exceptional when it comes to photoreal metallic surfaces. The only problem with this idea was that DD didn't have a Max TD around at the time to write tools, and DD's older tools for Max hadn't been updated in a while. There were a few tools that we HAD to have in order for this to work, so right off the start I set to work creating the ones we didn't have, and my co-worker, Chris (who had used the older DD Max pipeline), started testing the existing tools to make sure they were working and stable. Fortunately, they were all in good condition, so it was up to me to script a few needed tools.

I ended up writing a tool that would automatically import animation data, in the form of MDD files, from Maya onto objects in Max. We had hundreds of objects, and there was no way this was going to be done by hand. I was very happy to see it get used the first time in production without breaking. In fact, I think we only broke it once during the project. I also rewrote a tool that I originally came up with at Blur, but it needed to be rewritten to plug into DD's pipeline with Vray. It was a quick-test-render tool that let you override render settings at the push of a button to do very fast test renders without actually changing any of your final render settings.
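For the curious: the pipeline tool itself isn't something I can share, but the MDD point-cache format it consumed is pleasantly simple, which is part of why per-object caches work so well for moving animation between packages. Here's a rough sketch of reading one, in Python purely for illustration (this is not the production code), assuming the standard big-endian MDD layout:

```python
import struct

def read_mdd(path):
    """Parse an MDD point cache: per-frame positions for every vertex.

    Standard MDD layout (all values big-endian):
      int32 frame count, int32 point count,
      one float32 time stamp per frame,
      then frame_count blocks of point_count XYZ float32 triples.
    """
    with open(path, "rb") as f:
        num_frames, num_points = struct.unpack(">ii", f.read(8))
        times = struct.unpack(">%df" % num_frames, f.read(4 * num_frames))
        frames = []
        for _ in range(num_frames):
            flat = struct.unpack(">%df" % (num_points * 3),
                                 f.read(12 * num_points))
            # Regroup the flat float run into (x, y, z) tuples.
            frames.append([flat[i:i + 3] for i in range(0, len(flat), 3)])
    return times, frames
```

A batch importer then just walks a directory of caches and applies each one to the matching scene object; matching by object name is one obvious way to do it, though how the actual tool matched them is beside the point here.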
When the project first got rolling we learned that the director had already had another company model the characters, so we were given meshes to start working with. For the most part the models were well done. However, we really wanted these characters to be hyper-detailed, so we went back into the models and re-worked them A LOT. We added tons of tiny details that you will only see if you're lucky enough to catch the commercial in HD, and maybe not even then unless it's a close-up shot. We went so far as to model little weld points for all the circuitry. Every inch of these characters had some fine detail on them. We spent a lot of time in this part of the production perfecting textures and models. This was actually the hardest part of the project. As soon as the characters started getting signed off on, it was pretty smooth sailing to the finish.
When it came time to start rendering shots it was a breeze. The time we spent up front getting the tools ready really paid off. Shots came together quickly and we had very few issues on the render farm.

The one thing that did present a problem for us, though, was grainy noise. We were throwing everything at Vray on this spot: GI calc'd per frame, glossy reflections, glossy refractions, translucency, depth of field rendered in camera, and motion blur. We also had nuclear-hot lights, which didn't help the grain at all. That's when the challenges started. It really wasn't all that bad to solve the grain issues, but it was the most challenging thing about this project. I love working with Chris though. He and I come from two different schools of thought about rendering with Vray. He is very much in favor of using Light Cache and Irradiance Mapping for GI, where I'm in favor of Light Cache and brute force, or in some cases completely brute-force GI. What we learned was that about 50% of the time his method worked best: we got fast render times and clean GI. The other 50% of the time, when things got more complicated, we found that my approach worked out better. It ultimately came down to what was happening in each shot. With so many variables it was a little tricky at first to figure out what needed to be tweaked in order to get rid of the grain. The best way to figure it out was to look at all the buffers we were saving out. We could do test renders looking only at the GI pass to see if our settings were causing grain there, and we could tweak the GI settings apart from everything else. We did the same for the reflection and refraction passes to make sure our samples were high enough there. Then finally we'd turn on motion blur and DOF and check again to make sure everything was clean before submitting to the farm again. Naturally the render times went up, but all in all none were very long for what we were doing. Most frames averaged around 30-40 minutes at 1024x576.
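That pass-by-pass isolation can even be made semi-objective. We judged it by eye, but as a tiny illustration of the idea (pure Python, made-up sample values): grab a patch of pixels from a region that *should* be flat in each saved buffer, and compare the relative noise to see which pass's settings need attention first.

```python
from statistics import mean, pstdev

def grain_score(patch):
    """Relative grain in a patch of luminance values that should be flat:
    standard deviation normalized by the mean level. Bigger = noisier."""
    m = mean(patch)
    return pstdev(patch) / m if m else 0.0

# Hypothetical samples from the same flat region of two render buffers:
gi_patch = [0.50, 0.55, 0.45, 0.52, 0.48, 0.56, 0.44, 0.51]
refl_patch = [0.50, 0.50, 0.51, 0.49, 0.50, 0.50, 0.51, 0.49]

# The noisier pass is the one to tweak before re-submitting to the farm.
noisier = "GI" if grain_score(gi_patch) > grain_score(refl_patch) else "reflection"
```

The point is the workflow, not the math: fixing GI sample settings won't help if the grain is actually coming from glossy reflections, and the only way to know is to look at (or measure) the passes in isolation.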
A few of the close up shots were rendered at 1920x1080 and those were about 3 1/2 hours per frame for finals.
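For perspective on those numbers: HD has about 3.5x the pixels of our 1024x576 working resolution, so scaling a 35-minute frame linearly would predict roughly two hours, not three and a half. The quick arithmetic (my back-of-envelope, not a production metric):

```python
# Pixel counts for the two resolutions mentioned above.
sd = 1024 * 576        # 589,824 pixels
hd = 1920 * 1080       # 2,073,600 pixels
ratio = hd / sd        # ~3.52x the pixels

# Linear scaling from a 35-minute SD frame predicts about two hours...
predicted_minutes = 35 * ratio   # ~123 minutes
# ...but the HD finals took ~3.5 hours (210 minutes), so the per-pixel
# cost grew too -- plausibly the in-camera DOF and glossy sampling
# getting more expensive at the higher resolution.
```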
This was a rare project. Things went so smoothly on the back end that the lighters pretty much finished a week early and were just there to help the comp artists with additional matte passes and misc fixes. We pulled a few late nights in the beginning during the modeling phase, but once we were into lighting it was regular work hours until delivery.
The FX were very cool on this spot. DD hired an artist to come in and code real-time particle FX and render them in OpenGL. I sat next to this guy, and it was very interesting to listen to him click away for hours writing code; then there would be a flash from his monitor, and I'd look over and he'd be testing his particle system...it would just swirl endlessly without looping until he stopped it and went back to coding. In order to get those FX into our scenes with the correct camera motion, he rendered out cards of his particles, and another 3D artist took them into Lightwave, positioned them, and rendered out passes for the compositors. In some cases the compositors took care of the FX placement themselves. The lighters didn't have to do any of this, but we did have to render interactive lighting passes and a few reflection passes.
That's about it.