Alexander Somma
alexsomma
About Me
Expertise: Generalist
Industry: Film/TV
Houdini Skills
Availability: Not Specified
Recent Forum Posts
Render Window actual resolution? Jan. 22, 2020, 3:10 a.m.
I think we may be talking about different things.
With Mantra we have the “Render View” which does indeed have 1:1 or “true resolution” homing.
With Solaris and Hydra delegates the closest thing we have right now seems to be the “Image viewport” which does not have this.
Render Window actual resolution? Jan. 22, 2020, 2:42 a.m.
True resolution is critical, but as far as I know it is one of the many basic features still missing from the new renderer.
Karma beta documentation Dec. 3, 2019, 5:04 a.m.
mark
The renderview is already a todo item for the future release.
Hi Mark & SideFX,
I must admit this causes me some concern.
First, let me start by saying how much I appreciate the effort you’ve put into the node-based USD workflow with Solaris. Ever since I first saw it, I’ve thought it was a game changer, and I’ve been looking forward to the day we could use it in production.
I also want to mention that we are a studio that has been doing all of its layout, lighting and rendering work in Houdini since 2015.
What concerns me is that it feels like you are under-prioritizing the importance of having decent renderview features at launch. While the viewport interactivity with Karma/Hydra is very nice, the workflow for finalizing the look and dialing in the sampling is a big step back. While I do understand that the Hydra viewport is a first iteration, I still cannot understand why you chose to launch without these essential features, both from a user standpoint and from a branding standpoint.
We’ve relied on the legacy renderview since 2015, and even though it’s been OK, if we’re being honest it was already falling behind, lacking several simple productivity features that would make our everyday lives easier. Now that you are labeling the old renderview “legacy”, the default renderview experience in Houdini has to be one of the least feature-complete in the industry.
I’ve voiced this concern several times before, but since I now see that a proper renderview is scheduled for “a future release”, I feel the need to go through it again. These points are very obvious, but I’m spelling them out because it sometimes feels like SideFX does not fully appreciate how much these features matter.
There are many ways of thinking about this, but after a scene is set up, the remaining work can be divided into two main areas: working with the look, and dialing in the render settings. To get both done efficiently, a few things are essential:
1. Snapshots
Snapshots are essential in both areas. When working with the look, we are constantly taking snapshots to compare and make decisions on shaders, lighting and even layout. When dialing in sampling, we also need to switch back and forth to check the often tiny differences between render settings. Without this we have to remember what the previous look was, or what the previous indirect noise level was, and that’s just unrealistic.
Features the legacy renderview was already missing: keeping snapshots between sessions, thumbnails, render info, scene states, side-by-side comparison, tying snapshots to takes.
Current workaround: Save the image to disk and open it in a compositing package. This is an extremely slow workflow, made even slower because we often don’t want to render the whole image when we work this way. We could try the ancient viewport snapshot feature, but it’s a horrible experience.
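For anyone stuck with the save-to-disk route, here is a minimal sketch of how it could be scripted, assuming a Mantra ROP at /out/mantra1 and a $HIP/snapshots folder (both names are placeholders, not anything from an actual setup):

# Render the current ROP to a timestamped file so successive versions can
# be compared side by side in an external viewer or compositing package.
# Node path and output directory below are placeholders.
import os
import time

import hou

ROP_PATH = "/out/mantra1"        # placeholder ROP node
SNAPSHOT_DIR = "$HIP/snapshots"  # placeholder output location

def render_snapshot(label="snapshot"):
    rop = hou.node(ROP_PATH)
    if rop is None:
        raise RuntimeError("No ROP found at %s" % ROP_PATH)

    out_dir = hou.expandString(SNAPSHOT_DIR)
    if not os.path.isdir(out_dir):
        os.makedirs(out_dir)

    # Timestamped filename so nothing gets overwritten between tweaks.
    stamp = time.strftime("%Y%m%d_%H%M%S")
    out_file = os.path.join(out_dir, "%s_%s.exr" % (label, stamp))

    # RopNode.render() accepts an explicit output file, so the ROP's own
    # output parameter is left untouched.
    rop.render(output_file=out_file, verbose=True)
    return out_file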
2. Render region
Another ultra-essential feature. Even with the nice interactivity we get in the Solaris viewport, a quick render region workflow is essential. The usual workflow is to look through the shot camera but to constantly drag regions wherever you need to focus your attention. The speedup from this can't be overstated: getting a small region at the final look and quality, at the actual render resolution, lets us make judgements in a fraction of the time.
Current workaround: Set up cameras and crops for every region you are interested in and render them to disk - not viable. Maybe we can make a tiny Solaris viewport and use the new 2D viewport feature to bounce around? Nope, there is no snapshot, so we can't compare results anyway.
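For the record, here is a rough scripted version of that crop workaround. The camera and ROP paths are placeholders, and the crop parameter names (cropl/cropr/cropb/cropt as 0-1 fractions) are what I'd expect on the standard Houdini camera, so verify them on your own setup:

# Render only a pixel window of the shot camera by narrowing the camera
# crop, instead of building a separate camera per region of interest.
# Paths and crop parameter names below are assumptions for this sketch.
import hou

CAM_PATH = "/obj/shotcam"  # placeholder camera
ROP_PATH = "/out/mantra1"  # placeholder ROP

def render_region(xmin, xmax, ymin, ymax):
    cam = hou.node(CAM_PATH)
    rop = hou.node(ROP_PATH)

    xres = cam.parm("resx").eval()
    yres = cam.parm("resy").eval()

    # Convert the pixel window to the 0-1 crop fractions the camera expects.
    cam.parm("cropl").set(xmin / float(xres))
    cam.parm("cropr").set(xmax / float(xres))
    cam.parm("cropb").set(ymin / float(yres))
    cam.parm("cropt").set(ymax / float(yres))

    try:
        rop.render()
    finally:
        # Restore the full frame so later renders are not silently cropped.
        for name, value in (("cropl", 0.0), ("cropr", 1.0),
                            ("cropb", 0.0), ("cropt", 1.0)):
            cam.parm(name).set(value)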
3. Fixed resolution with fit window and 1:1 pixel modes
Also very important for both areas. Shaders need to be dialed in with enough detail for the final resolution, and for sampling it is crucial to see the output at 1:1. By switching between fit-window and 1:1 modes, we can judge the whole look and inspect it closely from the same render.
Current workaround: We tried the “image view”, but there is no way to be sure it is at 1:1 and no way to fit the window.
There are many other issues around the missing features, but these are the most essential ones as we see it.
Even though Karma itself is in beta, the Solaris viewport is not. We use Arnold, and I’m concerned this lack of features will affect third-party renderers for a long time as well, slowing the adoption of LOPs considerably, unless they (hopefully) decide to make their own render viewers.
In your presentation, BlueSky Studios spoke about the adoption of LOPs, but they had their own renderview integration. I’m glad they are able to work with LOPs, but that doesn’t help the rest of us.
My guess is that most of you at SideFX are already aware of this, but since you still ended up releasing Houdini 18 without these major workflow elements solved, your conclusion seems to have been that users can work well enough without them, and that’s the bubble I want to burst.
Since Solaris is such an awesome way of setting up our scenes and lighting them, it’s a shame it all ends in such an awkward rendering experience.
For us, this results in a big wow effect when first opening up Solaris, but at the end of the day, we’re not able to use it for actual production.