Elizabeth House

I really wanted to fix the color discrepancies between my render view and my actual renders from the farm before moving any further with look development. I knew the issue was somewhere in the ACES-to-sRGB conversion, so I started working from there. After reading the documentation and testing several different settings in my Redshift ROP, I was still getting a washed-out render, so I took my files into Nuke to see if I could get more control over the color space.

Raw render

View from render view

Once in Nuke, it was pretty simple to set my project settings to ACEScg, which is what Redshift in Houdini uses by default. Then, using an OCIO node, I transformed the color space to sRGB and wrote the sequence out of Nuke as .png files rather than .exrs. With this, I was able to match my render view to my renders from the farm (big shoutout and thanks to T'Naige and Michael for helping me figure this one out).
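For my own future reference, the whole Nuke fix can also be scripted. This is just a minimal sketch assuming an ACES 1.x OCIO config is loaded; the exact colorspace names and the file paths here are placeholders that depend on your config and project:

```python
# Minimal Nuke Python sketch of the ACEScg -> sRGB conversion described
# above. Assumes an ACES 1.x OCIO config; colorspace names and file
# paths are placeholders to adapt.
import nuke

# Project settings: OCIO color management with ACEScg as the working space
root = nuke.root()
root['colorManagement'].setValue('OCIO')
root['workingSpaceLUT'].setValue('ACES - ACEScg')

# Read the raw .exr renders from the farm as ACEScg
read = nuke.nodes.Read(file='renders/shot_####.exr')
read['colorspace'].setValue('ACES - ACEScg')

# OCIO node transforming the working space to sRGB for display
ocio = nuke.nodes.OCIOColorSpace(inputs=[read])
ocio['in_colorspace'].setValue('ACES - ACEScg')
ocio['out_colorspace'].setValue('Output - sRGB')

# Write the .png sequence that matches the render view
write = nuke.nodes.Write(inputs=[ocio], file='comp/shot_####.png')
write['file_type'].setValue('png')
```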





I also didn't know about assigning different color spaces to texture maps. You can change this manually in each Redshift texture node, but the Redshift render nodes also have an option to use OCIO color rules, which automatically set the correct color space for texture assets. So I turned this on as well.
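If you ever need to fix the texture color spaces in bulk instead, a quick Houdini Python shell pass along these lines could work. This is only a sketch: the parameter name on the Redshift texture VOP varies between Redshift versions, so treat 'tex0_colorSpace' as an assumption and check node.parms() in your build:

```python
# Hypothetical batch pass over Redshift texture VOPs; 'tex0_colorSpace'
# is an assumed parameter name -- verify it in your Redshift build.
import hou

for node in hou.node('/mat').allSubChildren():
    if node.type().name() == 'redshift::TextureSampler':
        parm = node.parm('tex0_colorSpace')
        if parm is None:
            continue  # parameter is named differently in this version
        # Rough heuristic: data maps stay raw, everything else is sRGB
        name = node.name().lower()
        is_data = any(k in name for k in ('normal', 'rough', 'disp', 'bump'))
        parm.set('Raw' if is_data else 'sRGB')
```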


I got a lot of great feedback from my peers on the current state of my project, so I'm glad the color management problem was a fairly simple fix. For more information and my own future reference, the Redshift documentation for color management in Houdini was quite useful, as was this blog, which breaks down some of the quirks of working in ACES.

Elizabeth House

This week, my team and I started dividing up our tasks and finalizing our shot lists. Earlier last week, we were all able to go out and shoot some HDRIs, and Camilo was able to film some preliminary background plates for shots 2 and 3.

We all really liked the pink/red glow we got from the neon signs outside Leopold's and Trustees Theater, so we decided to stick with that location. While we liked how the cars parked along the road gave our miniature car a sense of scale, we agreed it would probably be best to come back in the early morning, while it was still dark, to avoid the Broughton Street crowds and traffic.


Once the shots were broken up, we were able to form a better list of what effects were needed for each shot. I'm currently responsible for the tire mark transfers on the sketchbook pages, as well as the exhaust and smoke coming from the miniature car, both in the first shot. I started with the tire transfers, developing a solution for how to add them in. First, I set up a very simple rotation on the tires and a basic keyframed movement on the car body to match our references. My first instinct was to start with an attribute transfer between the tires and the paper using a color attribute.

Obviously, the tire marks need to be permanent rather than appearing only under the tires. My roundabout way of doing this was to scatter points based on the color attribute I used for the transfer, then use a group expression node to select only the points where the tires made contact with the paper (blue, in this case). I then added a trail node with the length set high so that the scattered points would persist across the pages.
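As a sketch of that chain built from Python (the stock SOP names are real; the network path, the existing transfer node name, the blue threshold, and the trail length are all assumptions I'd tune per shot):

```python
# Hypothetical hou sketch of the scatter -> group -> trail chain above.
import hou

parent = hou.node('/obj/tire_marks')   # assumed network path
xfer = parent.node('attribtransfer1')  # the existing tire -> paper Cd transfer

# Scatter points across the paper, carrying the transferred color along;
# the density/transfer parm names are from the stock Scatter SOP but
# treat them as assumptions to verify.
scatter = parent.createNode('scatter', 'scatter_marks')
scatter.setInput(0, xfer)
scatter.parm('usedensityattrib').set(1)
scatter.parm('densityattrib').set('Cd')
scatter.parm('transferattributes').set('Cd')

# Group Expression: keep only the blue contact points
grp = parent.createNode('groupexpression', 'blue_contact')
grp.setInput(0, scatter)
grp.parm('groupname1').set('contact')
grp.parm('snippet1').set('@Cd.b > 0.5')  # threshold is a guess

# Delete everything outside the group, then trail what's left
blast = parent.createNode('blast', 'keep_contact')
blast.setInput(0, grp)
blast.parm('group').set('contact')
blast.parm('negate').set(1)  # delete non-selected, i.e. keep the group

trail = parent.createNode('trail', 'persistent_marks')
trail.setInput(0, blast)
trail.parm('length').set(1000)  # long enough to span the shot
```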

This result was very sparse, so I added a point replicate node after the trail node to get more detail. I then promoted the color attribute to primitives and added a color node afterward to give the points some more saturation.

My other thought was to use a SOP solver node to carry over the transfers from previous frames, but I realized it might be more efficient to do this effect in compositing/post-production. So I tabled this effect and moved on to the exhaust.
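For reference, the SOP solver version would look something like this hypothetical Python SOP dropped inside the solver, with the previous frame fed back in as the second input (the wiring is an assumption, and a wrangle doing the same max() would be the more typical choice):

```python
# Hypothetical Python SOP inside a SOP Solver: input 0 is this frame's
# fresh attribute transfer, input 1 is Prev_Frame. Assumes the paper's
# point count and order never change.
import hou

node = hou.pwd()
geo = node.geometry()               # this frame's transfer result
prev = node.inputs()[1].geometry()  # accumulated marks so far

for pt, prev_pt in zip(geo.points(), prev.points()):
    cur = pt.attribValue('Cd')
    old = prev_pt.attribValue('Cd')
    # Keep the strongest mark ever seen at each point so it never fades
    pt.setAttribValue('Cd', tuple(max(c, o) for c, o in zip(cur, old)))
```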


For the exhaust, I found a few up-close references (mainly this one, found here). For the miniature, I figured it would be important that the exhaust look like it was coming from the actual exhaust pipe. I grouped together the geometry for both exhaust pipes and tried using that geometry directly as the source for my smoke via the pyro source node's volume scatter, which worked okay, but I thought I might get more control by using particles as the source for the pyro.


Using the same first approach, I used the grouped exhaust pipes as the source emitter for the particles. Since the car was animated, the particles moved along with it; however, I was getting some strange "puffs," or banding, even with a high number of particles. Increasing the substeps slightly on the POP network fixed this. Once that was sorted, I added a bit of noise with a POP wind node and some slight variance in the y and z axes in the attributes tab of the POP source, and I was ready to feed the points into a pyro node.

Substeps: 1



Substeps: 4
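For future reference, the fix itself is a one-line parameter change, here as a quick sketch (the network path is a placeholder):

```python
# The banding fix, scripted: raise substeps on the POP network.
import hou

popnet = hou.node('/obj/exhaust/popnet')  # assumed path
popnet.parm('substep').set(4)             # 1 gave visible puffs; 4 smoothed them out
```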

To feed my points into the default shelf tool, I used a dopimport node, fetching only the geometry from the POP object node, then set the mode on my pyro source node to keep the input. From there, it was just a matter of experimenting with the parameters inside the pyro look node and the pyro solver itself (the clip below is strictly to show how the pyro is behaving, NOT for look development).
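Sketched in Python, that import step looks roughly like this (the Dop Import SOP's doppath/objpattern parameters are real; the paths and node names are assumptions):

```python
# Hypothetical wiring of the Dop Import fetch described above.
import hou

parent = hou.node('/obj/exhaust')
imp = parent.createNode('dopimport', 'fetch_exhaust_points')
imp.parm('doppath').set('/obj/exhaust/popnet')  # the POP network
imp.parm('objpattern').set('popobject1')        # only the POP object's geometry
```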

Of course, this still has a long way to go, but I went ahead and rendered this take using the HDRI Camilo made for us. I added some proxy geometry that mimics the table we plan to use when shooting this plate (both shape and material), as well as the sketchbook pages.

There's definitely a lot of work ahead of me, but I think I have a solid start to the exhaust on this shot. Next week, I plan to continue developing this, as well as start some trial runs of the Houdini-to-Maya workflow so we can utilize Eaza's texturing work. Our team also plans on further developing the overall aesthetic of our commercial to be more cohesive.

Elizabeth House

My goal for class 5 was to have the entire opening sequence textured, and I got that accomplished! Here's the rough edit.

I do have a couple of concerns with this particular edit, but I think they're easy to fix in the broad scheme of things. There are currently a lot of discrepancies between what my IPR render view shows and what my actual progressive render shows. I believe this comes down to color management between the IPR and my render nodes, as well as viewing the correct render node while working (select the Houdini tab in the Redshift render view and choose the node from there, for future reference). Right now I'm getting a really washed-out render back from the farm, which is unexpected since it looks the way I want on my end. In the edit above, I did some minor color correction to try and bring back some of the saturation, but it's still not quite right. This week I plan on doing a lot of R&D into the OCIO color management Redshift uses by default and how to get a more accurate render.


More minor concerns going forward are getting more detailed geometry in place (i.e., blocking out more of the downstairs props) and keeping the lighting consistent across files. As of right now, there is only one light/light portal for the upstairs room, with volume scattering and global illumination doing a lot of the heavy lifting. This works for the opening shots, but later in the sequence there is no light coming from the other side, which will need to be added and might affect the opening sequence. This just means more iterations and experimentation with supplemental lighting going forward.


Aside from some small workflow changes and texture tweaks, I think I'm in a pretty good spot for class 5 and I'm excited to finally see this project come to life.

