interactive rendering with Karma workflow?
- noicevstime
- Member
- 18 posts
- Joined: Aug. 2006
- Offline
Hi all, coming from Maya/Katana with V-Ray/Arnold/PRMan, I am curious how you typically do interactive lighting/rendering in Houdini/Solaris with Karma outside of the viewport. The viewport rendering is great in terms of seeing camera/light/material updates interactively, but I can't A/B-compare renders while also adjusting exposure/gamma up/down to inspect them.
Combined with the Render Gallery tab it's better, but I still can't compare the current live render to a snapshot render while also adjusting exposure and toggling among AOVs to inspect.
The Render Gallery seems like a good place to do A/B wipe comparisons, but it seems I can't funnel live rendered pixels there (is MPlay an alternative? Is an A/B wipe available there?).
Basically, I am looking for a way to send rendered pixels "live" to a render view/frame buffer type of window, so I can compare A (still updating interactively) to B (done rendering, a snapshot) while also adjusting exposure up/down and checking AOVs.
Edited by noicevstime - March 8, 2024 20:12:06
-
- daveborck
- Member
- 21 posts
- Joined: 3月 2019
- Offline
-
- robp_sidefx
- Staff
- 519 posts
- Joined: June 2020
- Online
*Some* of what you're asking for is achievable in the Render Gallery using Cloned Rendering. At ~18m30s into https://www.youtube.com/watch?v=R4SLw5EdzQ8 [www.youtube.com] we show an A/B diff between a live (cloned) render and a snapshot.
I can't instantly find it, but I know there is an existing forum post loaded with feature requests for a "render view" in Solaris. Even if we can't immediately act on them, it'd be great to hear your thoughts on what's missing.
-
- Jonathan de Blok
- Member
- 276 posts
- Joined: July 2013
- Offline
I guess this is the post you're referring to: https://www.sidefx.com/forum/topic/94172/ [www.sidefx.com]
And +1 for Jason's point; it's one of those things you just expect to be there. Max/Maya with V-Ray have had years to develop a VFB and a workflow that is comfortable and efficient. It also helped that one of their main purposes is to actually output renders. Doing renders from Houdini was, at least in my experience, mostly done when it was too much hassle to export to other packages, or when some funky attribute-driven contraption made it impractical to do so. So yeah, some catch-up in UX is expected here, and I do hope we can have a topic here about plans/feedback before they are put into action, so we don't end up with an API-with-buttons v2.0.

Edited by Jonathan de Blok - March 12, 2024 08:08:09
More code, less clicks.
-
- noicevstime
- Member
- 18 posts
- Joined: Aug. 2006
- Offline
Thanks Rob. I'll definitely check out the clone rendering feature in H20. I'd like to echo Jonathan and Dave's take: whether it's a new dedicated window/panel or added features in the existing ones (Render Gallery or MPlay), having a place that enables comparing a live render to a snapshot, with the ability to check AOVs and adjust exposure/gamma, is quite essential for the lighting workflow. Being able to live-render in the viewport is great, but for sequence/shot lighting I tend to fly around in a persp camera or look through a light in the GL viewport while checking how the noise resolves from that particular shot camera angle.
Edited by noicevstime - March 12, 2024 13:24:25
-
- martinkindl83
- Member
- 275 posts
- Joined: Nov. 2014
- Offline
I have to agree.
We are slowly jumping onto the USD train, and this is a key feature that is missing from a lighting department/workflow perspective.
Viewport rendering is great, but apart from old XSI I never used it, only a render view, where you can snapshot and compare against live renders.
Use the viewport for navigation, not rendering.
-
- Hamilton Meathouse
- Member
- 200 posts
- Joined: Nov. 2013
- Offline
-
- sheep
- Member
- 28 posts
- Joined: Jan. 2013
- Offline
robp_sidefx
*Some* of what you're asking for is achievable in the Render Gallery using Cloned Rendering. At ~18m30s into https://www.youtube.com/watch?v=R4SLw5EdzQ8 [www.youtube.com] we show an A/B diff between a live (cloned) render and a snapshot.
I can't instantly find it, but I know there is an existing forum post loaded with feature requests for a "render view" in Solaris. Even if we can't immediately act on them, it'd be great to hear your thoughts on what's missing.
It would actually be awesome if we could get Karma to render directly to the Render Gallery, which seems like an improved version of the Render View. Is there any plan for something like that in the future?
-
- robp_sidefx
- Staff
- 519 posts
- Joined: June 2020
- Online
sheep
It would actually be awesome if we could get Karma to render directly to the Render Gallery, which seems like an improved version of the Render View. Is there any plan for something like that in the future?
We are looking at making it easier for the Render Gallery to be a replacement (or at least an alternative) for existing render "targets" (for example, MPlay and the viewport). Nothing to show just yet, but yes, we're working on this.
-
- martinkindl83
- Member
- 275 posts
- Joined: Nov. 2014
- Offline
That would be awesome Rob. Thank you for the heads up.
I was quite excited about the clone option; unfortunately, it only worked in simple scenes. I was never able to make it work on production scenes. Currently we render to MPlay without interactive rendering, or take snapshots with a background render, since rendering into the viewport has its own limitations and we keep the viewport in OpenGL only.
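For reference, one way to sketch this kind of disk-backed snapshot workflow from the command line is with husk (Houdini's standalone USD render utility) plus MPlay. This is a minimal sketch, not a recommendation: it assumes a stage already written out to a USD file, the file paths are placeholders, and you should verify the exact flag names against `husk --help` for your Houdini version.

```shell
# Render four frames starting at 1001 with the Karma delegate to EXRs on
# disk. $F4 is husk's frame-number token, so it is single-quoted to keep
# the shell from expanding it.
husk --renderer Karma --frame 1001 --frame-count 4 \
     --output '/tmp/snapshots/shot.$F4.exr' /tmp/shot.usd

# Load the rendered snapshots into MPlay to flip through and compare them.
mplay /tmp/snapshots/shot.*.exr
```

Because the snapshots live on disk, they survive a Houdini crash, which is one practical advantage over in-session render galleries.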
-
- martinkindl83
- Member
- 275 posts
- Joined: Nov. 2014
- Offline
The problem has always been connecting to the clone.
On a simple scene it's not a problem; in production, 99% of the time we are not able to connect to the clone.
There's no error message that would help.
Once I spent half a day deleting nodes until I was able to connect to the clone, but I wasn't able to track it down; it seemed more or less random rather than any particular node causing it.
-
- robp_sidefx
- Staff
- 519 posts
- Joined: June 2020
- Online
martinkindl83
On production 99% of the time we are not able to connect to the clone. No error that would help.
That is concerning. I wonder whether the Log Viewer pane tab has anything useful in it? I know it's a big ask, especially in the context of running production, but please do share any messages/logs/etc. that even hint at what's gone wrong. You can post them in this thread, send them via support, or DM me.