Is it possible to use a camera render as a UV texture? I see that there's an option in the UV Texture node to use the viewpoint of a camera, but this is a projection, not a rendered view.
I would like to use a camera render as a UV texture to get Jim Crutchfield-style video feedback loops. Another project I have in mind is neural/genetic rendering of scenes: using a target image, a loss/fitness value could be computed by comparing it with what the camera has rendered (which can be very low resolution), and that information could then be used to update the geometry. I have some setups like this in Python and JavaScript, but I frankly have no idea where to start in Houdini.
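The fitness evaluation described above can be sketched in plain Python as a mean-squared-error comparison between the rendered pixels and the target pixels (a minimal sketch; in practice you would read both images from disk and likely use NumPy):

```python
def mse_loss(rendered, target):
    """Mean squared error between two equally sized flat pixel lists.

    Lower is better: a genetic/neural setup would keep geometry updates
    that reduce this value against the target image.
    """
    assert len(rendered) == len(target), "images must have the same size"
    return sum((r - t) ** 2 for r, t in zip(rendered, target)) / len(rendered)
```

Because the camera render can be very low resolution, even a naive loop like this stays cheap to evaluate per generation.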
One option could be to render the image out to a file. On the next frame, a Python node reads that image back in and updates the geometry. Then render the next frame, and repeat. Is there a way to control when and how a frame is rendered from a node or a Python command?
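The render-read-update loop could be driven from Python roughly like this. The driver itself is generic; the ROP path `/out/mantra1` and the way the three callbacks are filled in are assumptions for illustration, not a confirmed setup:

```python
def feedback_loop(render_frame, read_image, update_geometry, n_frames):
    """Generic driver: render a frame, read the result back, update, repeat.

    Each callback is supplied by the user:
      render_frame(f)   -> renders frame f and returns the output file path
      read_image(path)  -> loads the rendered image into pixel data
      update_geometry(image, f) -> pushes the pixel data back into the scene
    """
    for f in range(1, n_frames + 1):
        path = render_frame(f)
        image = read_image(path)
        update_geometry(image, f)
```

Inside Houdini, `render_frame` could wrap the standard `hou` API, e.g. `hou.setFrame(f)` followed by `hou.node('/out/mantra1').render(frame_range=(f, f))`, which renders exactly one frame on demand; `read_image` could load the written file with PIL or a COP network.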
Camera Render as UV texture
- matigekunstintelligentie
- Member
- 3 posts
- Joined: April 2022
- jordibares
- Member
- 655 posts
- Joined: Feb. 2006
You can certainly render out and feed that back into the system using a variety of methods (I would go for COPs, for example), and then render the scene again with the texture applied.
You can structure this process in ROPs quite easily and, with these dependencies in place, make it a one-button thing.
I hope it helps