In the previous post, I mentioned how the renders randomly returned black dots. I spent about two hours trying to figure out this simple fix. Most of the time was spent adjusting the render sliders until the dots disappeared for one frame, then jumping to the next frame only to find them again. The first hour went to realizing that I needed to clamp the HDR. The next hour was spent searching through more forum pages and reading until I found that I also needed to clamp the ground. Clamping both fixed all the issues!
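Conceptually, clamping just caps any sample brighter than a threshold so a single super-bright HDR value (a firefly) can't blow out a whole pixel. Here's a rough Python sketch of the idea, not Arnold's actual implementation, and the threshold value is made up:

```python
def clamp_sample(rgb, max_value=10.0):
    """Cap each channel at max_value so one extremely bright sample
    (a 'firefly') can't dominate the pixel it gets averaged into.
    Conceptual sketch only; a renderer like Arnold clamps raw samples
    internally, and the right threshold depends on the scene."""
    return tuple(min(channel, max_value) for channel in rgb)

# A firefly sample picked up from the sun in the HDR:
print(clamp_sample((250.0, 240.0, 230.0)))  # -> (10.0, 10.0, 10.0)
# A normal sample passes through untouched:
print(clamp_sample((0.8, 0.6, 0.5)))        # -> (0.8, 0.6, 0.5)
```

The trade-off is that clamping also dims legitimate bright highlights, which is why it took some slider-tweaking to find a value that killed the dots without flattening the lighting.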
Camera Track Issue
I was super excited for these renders to finish, but after some compositing, the Storm Troopers looked a little floaty. After studying the track more and more, I realized that the camera was tracking properly; it was just off by a few frames. I went back into Nuke, exported a new camera, and brought that back into the scene. In the 3D view, in red for a few seconds before it faded away, there was an error that said "Frame rate mismatch: The imported scene frame rate '24 fps' differs from the existing frame rate '30 fps'." (After doing more tests, I've found that the red text doesn't always pop up. However, it does appear in the script editor before being pushed up by other import information. That's probably how I missed it.)
Now I know that the camera is exporting at a different frame rate! But how do I change that in Nuke? In the WriteGeo node, there is no option to change the frame rate of the camera, and the playback was already at 30 fps. It turns out I had not changed the frame rate in the project settings. One simple fix, a new export, and some more render testing later, and I now have a properly tracked camera for rendering!
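The few-frame drift makes sense in hindsight: a keyframe authored at 24 fps lands later and later on a 30 fps timeline, so the offset grows the further into the shot you go. A quick sketch of the arithmetic (my own illustration, not anything from Nuke):

```python
def remap_frame(frame, src_fps=24.0, dst_fps=30.0):
    """Where a keyframe authored at src_fps lands on a dst_fps timeline.
    Shows why a 24/30 fps mismatch makes a camera track drift more
    with every second of the shot."""
    return frame * dst_fps / src_fps

for f in (24, 48, 72):  # one, two, three seconds in at 24 fps
    print(f, "->", remap_frame(f))  # drifts by 6, 12, 18 frames
```

So even a track that's perfect at the start of the shot ends up noticeably "floaty" a few seconds in, which matches what I was seeing.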
Fixed Camera Match
With a fixed camera track, I can focus on the other issues I'm having. The smaller issue is the position of the Storm Troopers; I am going to move them around a little. But the biggest issue is the random black dots that pop up in random frames. After checking the AOVs, I found that the dots are all in the Direct Diffuse pass. Unfortunately, I cannot get rid of them using a FireFly Killer gizmo; it is a rendering issue. I checked the Arnold documentation, and there is nothing about this problem. Next, I checked through the forums. Some people said that this could be an issue with the samples being set too high for certain computers. I'm still testing it out.
In the other two projects, I would use a chrome ball to create the indirect light. The problem with using the chrome ball is that I'd also be capturing all the scratches and dents that would be seen in the render. For this project, I borrowed my professor's pano head.
This pano head was really helpful for shooting in consistent increments. My one issue was with how heavy it was: as I rotated the camera around, the tripod's head would slowly tilt forward or back under the weight. I was also originally going to do a pan, but I couldn't produce the same smoothness as when the pano head was off.
When it came to stitching the images together, I discovered that there was a way to do it in both Photoshop and Lightroom. Both programs have an image limit of 100, and I took 124 images. I chose to use Lightroom because I can actually see which images I am selecting; in Photoshop, I only see the file names.
Unfortunately, creating the panorama takes a lot of memory (something my computer lacks) and time. I think it took about five hours for my computer to produce this image, but after checking the resolution, I understand why: it's 31,396 x 3,859!
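The five hours make more sense once you run the numbers. Assuming the stitcher holds an uncompressed 32-bit float RGBA buffer (16 bytes per pixel, which is just my guess at what it keeps in memory):

```python
# Rough memory footprint of the stitched panorama, assuming an
# uncompressed 32-bit float RGBA buffer (16 bytes per pixel).
w, h = 31_396, 3_859
pixels = w * h
gb = pixels * 16 / 1e9
print(f"{pixels:,} pixels, roughly {gb:.1f} GB for one float buffer")
```

That's over 121 megapixels, and the stitcher likely needs several working buffers of that size at once, so a machine short on RAM ends up swapping to disk for hours.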
The final in this class is all about compositing an animated character into a live-action plate. So, I have this rigged Storm Trooper that I plan on using in combination with some old motion capture data. My idea is to have the Storm Trooper running through the scene.