I tried H18.5's new VEX functions ramp_unpack / ramp_lookup yesterday.
But I'm confused about how the ramp_unpack function is meant to be used.
Ramps are commonly packed as JSON formatted strings by Houdini operations.
As I understand it, ramp_unpack's string parm needs a JSON-formatted string describing a ramp channel. But how can I get the right JSON string?
I tried the “Attribute from Parameters” SOP, but the dict it builds from the ramp channel (which I convert to JSON with the json_dumps function) doesn't seem to be the format ramp_unpack needs. I've posted my hip here.
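For reference, here is roughly how I've been inspecting the ramp on the Python side, just to see what data a packed ramp string has to carry (the node path and parm name are placeholders for my scene):

import hou

# Placeholder node/parm; evalAsRamp() returns a hou.Ramp object.
ramp = hou.node('/obj/geo1/mynode').parm('ramp').evalAsRamp()
print(ramp.basis())   # per-key interpolation bases
print(ramp.keys())    # key positions
print(ramp.values())  # key values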
Ok, I finally found the “Flatten Ramps” option on the “Attribute from Parameters” SOP,
and it works correctly this time…
I also found that Houdini 18.5 can now use dict attributes to store ramps / multiparms, which is really nice.
PS: the “Attribute from Parameters” SOP always grabs the parameter values at the current time, so it seems impossible to get the right multiparm values if the parms are animated.
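One workaround I can think of for animated parms is to bake them out per frame from Python instead of relying on the SOP; a minimal sketch, assuming a placeholder node path and parm name:

import hou

parm = hou.node('/obj/geo1/mynode').parm('ramp')  # placeholder node/parm

samples = {}
for frame in range(1, 11):      # whatever frame range you need
    hou.setFrame(frame)         # evaluate the parm at this frame
    r = parm.evalAsRamp()
    samples[frame] = (list(r.keys()), list(r.values()))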
The ramp_lookup() for arrays is a wrapper around what used to be ramp_lookup in pyro_aaramp. That did a linear warp of the sample position into a uniform key distribution, and then sampled the ramp as if the keys were uniform. I'm guessing maybe the full key spline function wasn't present when that was written, and it "gets away" with it for many ramp types as the key spacing isn't important. But for bsplines it is pretty important!
Fortunately there is a workaround: rather than ramp_lookup(), you can go directly to the spline function:
y = spline(s[]@r_basis, x, f[]@r_val, f[]@r_key);
Hopefully we can fix ramp_lookup, but you probably want to use the spline() version for backwards compatibility in 18.5 for a while.
papsphilip: How can I write a ramp to a JSON file? I am trying to construct a dictionary but I am stuck here.
There is an error in this line:
dict['point'].append[data]
It should look like this:
dict['point'].append(data)
Beyond that, I'm not sure whether you're trying to match the specific JSON structure of the ramp parameter (like the one Attribute From Parameters uses) or keep a custom one, but this should at least fix the error.
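If a custom structure is fine (i.e. you don't need to match the exact JSON that Attribute From Parameters / ramp_unpack use), something along these lines should work; the node path, parm name and output file below are placeholders:

import json
import hou

# Placeholder node/parm; evalAsRamp() returns a hou.Ramp object.
ramp = hou.node('/obj/geo1/mynode').parm('ramp').evalAsRamp()

# One entry per ramp key; this is a custom layout, not the packed-ramp format.
data = {'point': []}
for basis, key, value in zip(ramp.basis(), ramp.keys(), ramp.values()):
    data['point'].append({
        'basis': str(basis),   # e.g. 'rampBasis.Linear'
        'key': key,
        'value': value,
    })

with open(hou.expandString('$HIP/ramp.json'), 'w') as f:
    json.dump(data, f, indent=2)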