interactive rendering with Karma workflow?
- noicevstime
- Member
- 15 posts
- Joined: Aug. 2006
- Offline
Hi all, coming from Maya/Katana with V-Ray/Arnold/PRMan, I'm curious how you typically do interactive lighting/rendering in Houdini/Solaris with Karma outside of the viewport. The viewport rendering is great for seeing camera/light/material updates interactively, but I can't A/B compare renders while also adjusting exposure/gamma up/down to inspect.
Combined with the Render Gallery tab it's better, but I can't compare the current live render to a snapshot render while also being able to adjust exposure up/down and toggle among AOVs to inspect.
The Render Gallery seems like a good place to do A/B wipe comparisons, but it seems I can't funnel live rendered pixels there (is MPlay an alternative? Is an A/B wipe available there?).
Basically, I'm looking for a way to send rendered pixels "live" to a render view/frame buffer type of window so I can compare A (still updating interactively) to B (done rendering, a snapshot), while also being able to adjust exposure up/down and check AOVs.
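One partial workaround, sketched below under the assumption that your husk build has the MPlay monitor flags (verify against `husk --help`): drive a command-line Karma render into MPlay, which does give you exposure/gamma controls and an image-plane (AOV) selector, though not a live A/B wipe.

    # Rough sketch: launch a command-line Karma render with husk and
    # monitor it in MPlay. "--mplay-monitor" is an assumption from
    # memory -- confirm the exact flag name with `husk --help`.
    import subprocess

    cmd = [
        "husk",
        "-f", "1001",                    # start frame
        "-o", "render/shot.$F4.exr",     # hypothetical on-disk output path
        "--mplay-monitor",               # assumed flag: mirror the render to MPlay
        "shot.usd",                      # hypothetical exported stage
    ]
    subprocess.Popen(cmd)                # non-blocking, so the session stays interactive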
Edited by noicevstime - March 8, 2024 20:12:06
- daveborck
- Member
- 19 posts
- Joined: March 2019
- Offline
- robp_sidefx
- Staff
- 501 posts
- Joined: June 2020
- Offline
*Some* of what you're asking for is achievable in the Render Gallery using Cloned Rendering. At ~18m30s into https://www.youtube.com/watch?v=R4SLw5EdzQ8 we show doing an A/B diff between a live (cloned) render and a snapshot.
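For the snapshot ("B") side of that comparison, a minimal sketch of writing a one-frame render from a USD Render ROP via Python; the node path is a placeholder, and the "outputimage" parm name should be verified on your build:

    import hou

    # Write a single-frame snapshot to disk so it can be compared against
    # the live (cloned) render. "/stage/usdrender_rop1" is a hypothetical
    # path; "outputimage" is assumed to be the Output Picture parm.
    rop = hou.node("/stage/usdrender_rop1")
    rop.parm("outputimage").set("$HIP/snapshots/compare_B.exr")
    rop.render(frame_range=(1001, 1001))  # blocking single-frame render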
I can't instantly find it, but I know there is an existing forum post loaded with feature requests for a "render view" in Solaris. Even if we can't immediately act on these, it'd be great to hear your thoughts on what's missing.
- Jonathan de Blok
- Member
- 274 posts
- Joined: July 2013
- Offline
I guess this is the post you're referring to: https://www.sidefx.com/forum/topic/94172/
And +1 for Jason's point; it's one of those things you just expect to be there. Max/Maya with V-Ray have had years to develop a VFB and a workflow that's comfortable and efficient. It also helped that one of their main purposes is to actually output renders. Doing renders in Houdini was, at least in my experience, mostly something you did when it was too much hassle to export to other packages, or when some funky attribute-driven contraption made it impractical to do so. So yeah, some catch-up in UX is expected here, and I do hope we can have a topic here about plans/feedback before they are put into action, so we don't end up with an API-with-buttons v2.0.
Edited by Jonathan de Blok - March 12, 2024 08:08:09
More code, less clicks.
- noicevstime
- Member
- 15 posts
- Joined: Aug. 2006
- Offline
Thanks Rob. I'll definitely check out the cloned rendering feature in H20. I'd like to echo Jonathan and Dave's take: whether it's a new dedicated window/panel or added features in the existing ones (Render Gallery or MPlay), having a place that lets you compare a live render to a snapshot, with the ability to check AOVs and adjust exposure/gamma, is quite essential for the lighting workflow. Being able to live-render in the viewport is great, but for sequence/shot lighting I tend to fly around in a persp camera or look through a light in the GL viewport while checking how the noise resolves from that particular shot camera angle.
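As a stopgap for that flip between GL navigation and a Karma noise check, a small sketch using the viewport Python API; the renderer display-name strings ("Karma CPU", "Houdini GL") are assumptions that vary across builds and licenses:

    import hou

    # Toggle the active viewport between the GL delegate (for navigation)
    # and Karma (for a noise check) without opening the display options.
    # The display-name strings below are assumptions -- check the names
    # your build shows in the viewport renderer menu.
    viewer = hou.ui.paneTabOfType(hou.paneTabType.SceneViewer)
    if viewer.currentHydraRenderer().startswith("Karma"):
        viewer.setHydraRenderer("Houdini GL")
    else:
        viewer.setHydraRenderer("Karma CPU")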
Edited by noicevstime - March 12, 2024 13:24:25
- martinkindl83
- Member
- 262 posts
- Joined: Nov. 2014
- Offline
I have to agree.
We are slowly jumping onto the USD train, and this is a key feature that's missing from a lighting department/workflow perspective.
Viewport rendering is great, but apart from old XSI I never used it, only a render view where you can snapshot and compare against live renders.
Use the viewport for navigation, not rendering.
- Hamilton Meathouse
- Member
- 199 posts
- Joined: Nov. 2013
- Offline