For resubmissions, I've decided to redo Project 1, aka Wilson. This is because it was admittedly my lowest grade, but I also had several rendering and shader issues I wanted to fix. As I am writing this post, the frames are being rendered on Google Zync. Meanwhile, let me talk about what I changed.

Shader

Original Shader

I had originally created something overly simple in Substance Painter. It was pretty much a white layer with three layers of a handprint. It wasn't my best work, and I remember doing the texture last minute even though I knew what I was going to do. So, I went back into Substance and completely overhauled the layers, giving the ball a more leathery feel. When using Substance Painter, I always start to like what I have in that viewport; it's when I bring the textures into Arnold that things get interesting. For example, I liked the specular difference between the blood and the ball in Substance, but in Arnold, that all went away. The single best addition was a displacement map to fake the stitching. Since Wilson isn't going to be that close up, I can easily get away with a ton of dots (which I had to hand paint, click by click, all over the geometry). Anyway, I wanted the shader to be better than before, and I definitely accomplished that. It's not ideal, but I'm happy with where it's at now.

Shading Network

Animation

Previously, I had Wilson and the grey sphere just rolling around in frame. This time, the grey sphere stays in place while five Wilsons drop from the sky and bounce around. I wanted to Alembic cache the Wilsons, but every time I attempted it, I got a "rigidBody## is an unsupported type of kRigid" error up to frame 120. When I imported the animation back into Maya, each Wilson was off in space. I tried redoing this several times on different computers, deleting history and freezing transformations, but no luck. I'll have to see how the animation turns out when it renders.
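I never did solve the cache, but for anyone hitting the same kRigid complaint: the usual advice is to bake the rigid body simulation down to plain keyframes before exporting, so Alembic never sees the solver nodes at all. A minimal Maya Python sketch of that idea, with hypothetical object names (I can't confirm this fixes my exact error):

```python
import maya.cmds as cmds

cmds.loadPlugin('AbcExport', quiet=True)

# Hypothetical transform names; substitute your own Wilson geometry.
wilsons = ['wilson1', 'wilson2', 'wilson3', 'wilson4', 'wilson5']

# Bake the simulation into ordinary keyframes so the Alembic exporter
# only sees animated transforms instead of rigid body nodes.
cmds.bakeResults(wilsons, time=(1, 120), simulation=True,
                 attribute=['tx', 'ty', 'tz', 'rx', 'ry', 'rz'])

# One -root flag per object, then export the baked range.
roots = ' '.join('-root ' + cmds.ls(w, long=True)[0] for w in wilsons)
cmds.AbcExport(j='-frameRange 1 120 {} -file wilsons.abc'.format(roots))
```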
In the previous post, I mentioned how the renders returned random black dots. I spent about two hours figuring out what turned out to be a simple fix. Most of that time went to adjusting the render sliders until the dots disappeared for one frame, then jumping to the next frame only to find them again. The first hour went to realizing that I needed to clamp the HDR. The next hour went to searching through more forum pages and reading up on how to also clamp the ground. Clamping both fixed all the issues!
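For anyone curious what "clamp the HDR" looks like in practice, here is a minimal Maya Python sketch of one way to do it: wiring a clamp utility node between the HDR file texture and the skydome light. The node names are hypothetical, and the ceiling value is a guess to tune per scene; the ground texture would get the same treatment.

```python
import maya.cmds as cmds

# Hypothetical node names; substitute the file node feeding your skydome.
hdr_file = 'hdrFile1'
skydome = 'aiSkyDomeLightShape1'

# Insert a clamp utility between the HDR and the light so no single
# super-bright texel can spike the indirect samples into fireflies.
hdr_clamp = cmds.shadingNode('clamp', asUtility=True, name='hdrClamp')
cmds.setAttr(hdr_clamp + '.max', 10, 10, 10, type='double3')  # ceiling is a guess
cmds.connectAttr(hdr_file + '.outColor', hdr_clamp + '.input', force=True)
cmds.connectAttr(hdr_clamp + '.output', skydome + '.color', force=True)
```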
Camera Track Issue
password: stormy
I was super excited for these renders to finish, but after some compositing, the Storm Troopers are a little floaty. After studying the track more and more, I realized that the camera was tracking properly, but it was only off by a few frames. I went back into Nuke, exported a new camera, and brought that back into the scene. At the top of the 3D view, in red for a few seconds before it faded away, there was an error that said "Frame rate mismatch: The imported scene frame rate '24 fps' differs from the existing frame rate '30 fps'." (After doing more tests, I've found that the red text doesn't always pop up. However, it does appear in the Script Editor before being pushed up by other import information. That's probably how I missed it.)
Now I know that the camera is exporting at a different frame rate! But how do I change that in Nuke? In the WriteGeo node, there is no option to change the camera's frame rate, and the playback was already at 30 fps. It turns out I had never changed the frame rate in the project settings. One simple fix, a new export, and some more render testing later, I now have a properly tracked camera for rendering! (A quick sketch of the fix is below the video.)

Fixed Camera Match
password: stormy
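For the record, the setting that mattered lives on Nuke's project root, not on the WriteGeo node. A minimal Python sketch (the node name and frame range are illustrative):

```python
import nuke

# The frame rate lives on the project root; match it to the 30 fps
# plate before exporting the camera.
nuke.root()['fps'].setValue(30)

# Hypothetical node name; substitute your own WriteGeo node.
write_geo = nuke.toNode('WriteGeo1')
nuke.execute(write_geo, 1, 120)  # frame range is illustrative
```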
With a fixed camera track, I can focus on the other issues I'm having. The smaller issue is the position of the Storm Troopers; I am going to move them around a little. But the biggest issue is these random black dots that pop up in random frames. After checking the AOVs, the dots are all in the Direct Diffuse. Unfortunately, I cannot get rid of them using a FireFly Killer gizmo; it is a rendering issue. I checked the Arnold documentation and there is nothing about this issue. Next, I checked through the forums. Some people said this could be an issue with the samples being too high for certain computers. I'm still testing it out.
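One way to pin down which AOV an artifact lives in is to shuffle each pass into the viewer one at a time in Nuke. A minimal sketch, assuming a multi-layer EXR with an AOV layer named direct_diffuse (the file path is hypothetical):

```python
import nuke

# Read the multi-layer EXR and route one AOV into rgba so the
# artifact can be inspected in isolation.
read = nuke.nodes.Read(file='render.%04d.exr')  # hypothetical path
shuffle = nuke.nodes.Shuffle(inputs=[read])
shuffle['in'].setValue('direct_diffuse')  # Arnold AOV layer name
```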
In the other two projects, I used a chrome ball to create the indirect light. The problem with using the chrome ball is that I'd also capture all the scratches and dents that would be seen in the render. For this project, I borrowed my professor's pano head, which was really helpful for getting good increments. My one issue was with how heavy it was: as I rotated the camera around, the tripod's head would slowly start to tip forward or back. I was also originally going to do a pan, but I couldn't produce the same smoothness as when the pano head was off. When it came to stitching the images together, I discovered that there was a way to do it in both Photoshop and Lightroom. Both programs have an image limit of 100, and I took 124 images. I chose to use Lightroom because I can actually see which images I am selecting; in Photoshop, I only see the file names. Unfortunately, creating the panorama takes a lot of memory (something my computer lacks) and time. I think it took about 5 hours for my computer to bring back this image, but when I checked the resolution, I understood why: it's 31,396 x 3,859!
The final in this class is all about compositing an animated character into a live-action plate. So, I have this rigged Storm Trooper I plan on using in combination with some old motion capture data. My idea is to have the Storm Trooper running through the scene.
Purple Subsurface, Purple Transmission, White Subsurface, Gloss Mask, Nuke Tree, Render Settings & Time

This shader got out of hand quickly. There are so many little details in my stone, but I decided to focus on only four aspects. I created a purple, a white, and a grey shader, and separated the glossy coat into its own aiStandardSurface. I used aiMixShaders with a ramp or noise in the mix setting. After putting all of these together, I have created what you see below.

Composite In Nuke

A number of things happened in this render. The main one is that I rendered the reflection of the instant film in with the main shader. I did this to save on render time and avoid adding another layer. I have figured out how to mask just the reflection, but the mask also takes away the geometry. Guess I will be separating that into a new layer.
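As a rough illustration of that kind of network, here is a Maya Python sketch of two of the bases blended through an aiMixShader, with a noise texture driving the mix (node names are hypothetical; assumes MtoA is loaded):

```python
import maya.cmds as cmds

# Two of the base looks; the full network uses purple, white, grey,
# and a separate glossy-coat aiStandardSurface.
purple = cmds.shadingNode('aiStandardSurface', asShader=True, name='purpleSSS')
white = cmds.shadingNode('aiStandardSurface', asShader=True, name='whiteSSS')

# An aiMixShader blends the two, with a noise texture in the mix slot.
mix = cmds.shadingNode('aiMixShader', asShader=True, name='stoneMix')
noise = cmds.shadingNode('noise', asTexture=True, name='mixNoise')

cmds.connectAttr(purple + '.outColor', mix + '.shader1', force=True)
cmds.connectAttr(white + '.outColor', mix + '.shader2', force=True)
cmds.connectAttr(noise + '.outAlpha', mix + '.mix', force=True)
```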
Another issue is that when rendering several mix shaders put together while also trying to use AOVs, Arnold has a hard time recognizing the differences between transmission and subsurface scattering. The way around this is to separate each of those shaders into its own layer and composite them back in Nuke. To get the masks to work, I'll have to create another layer and use a surface shader. This match is nearly perfect in terms of lighting. The main difference between the two is the bounce light coming from the notebook onto the sphere. I did create a render layer for the bounce, but where the light hits the spheres is a little off. The simplest explanation could be where the sphere is in space compared to where it was in reality.
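A minimal sketch of that recombination step in Nuke, assuming each shader component was written to its own EXR sequence (the file paths are hypothetical):

```python
import nuke

# One Read per separated shader pass.
sss = nuke.nodes.Read(file='sphere_sss.%04d.exr')
trans = nuke.nodes.Read(file='sphere_transmission.%04d.exr')

# Additive recombination: the light passes sum back to the full beauty.
merge = nuke.nodes.Merge2(inputs=[sss, trans], operation='plus')
```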