I realized that my confusion stemmed from the subtle yet noticeable difference between viewing linear images through a simple 2.2 gamma curve (the default in Houdini and most other 3D apps) versus an actual sRGB LUT. Nuke uses an sRGB LUT by default, which I always thought was the right way to go, given that my monitor is properly calibrated to sRGB and I have other apps like PS set up to use an sRGB ICC profile.
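For reference, here's the gist of the difference as I understand it: sRGB is roughly a 2.4 power curve with a small linear toe, so it diverges most from a plain 2.2 gamma down in the shadows. This is just a quick Python sketch of my own to show where the two curves split (the sRGB constants are the usual published ones), not anything pulled from Houdini or Nuke:

def gamma_22(x):
    # plain 2.2 gamma encode
    return x ** (1.0 / 2.2)

def srgb_encode(x):
    # piecewise sRGB encode: linear toe below 0.0031308, 2.4 power curve above
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * (x ** (1.0 / 2.4)) - 0.055

for v in (0.001, 0.01, 0.05, 0.18, 0.5, 1.0):
    print("%.3f  gamma2.2=%.4f  sRGB=%.4f" % (v, gamma_22(v), srgb_encode(v)))

The two land very close in the mids and highlights, which is why the difference is subtle overall, but the shadows separate noticeably.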
This of course led me to try to get a proper linear-to-sRGB LUT working in Houdini, which was kind of a pain. There's a method in Nuke for LUT generation using a CMSTestPattern, but the resulting .blut file was way off when I imported it into Houdini (I think it had something to do with needing a 1D prelut ahead of the blut's 3D transform, but I could never quite figure it out). Another option was to generate a LUT with the OCIO libraries from the shell (I'm no programmer, and I couldn't quite crack how to get OCIO working in a custom environment), or to create a LUT using a CHOPs or COPs network. Again, it was simple enough to save a LUT out of a COPs net, but I couldn't figure out how to program in the specific sRGB transfer function. (I still consider myself a new Houdini user.)

Long story short: this all led me down the rabbit hole to Wikipedia, grabbing the sRGB transfer function and simply coding an ASCII tab-delimited LUT in Python. It took some trial and error. I was on the right track with a 10-bit (1024-step) LUT, which worked reasonably well but still looked noticeably different from the Nuke LUT. Finally, when I upped the resolution to a 14-bit LUT (16,000+ steps), I had a perceptually perfect match between what I was viewing in Houdini and in Nuke.
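In case it helps, here's a sketch of the idea (not my exact script, and the file name and exact layout are just assumptions; adjust the output to whatever header/format your LUT loader expects):

STEPS = 2 ** 14  # 16384 samples; the 10-bit (1024) version was still visibly off for me

def linear_to_srgb(x):
    # sRGB encode: linear segment below 0.0031308, 2.4 power curve above
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * (x ** (1.0 / 2.4)) - 0.055

with open("lin_to_srgb.lut", "w") as f:
    for i in range(STEPS):
        lin = i / (STEPS - 1.0)
        f.write("%f\t%f\n" % (lin, linear_to_srgb(lin)))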
So, I lay this all out in case anyone else has had difficulty getting LUTs to work in Houdini. The resulting LUT is attached if you find it useful. This whole thing made me wonder whether it's simply better to stick with gamma 2.2, whether that's more efficient for the graphics processing, or whether any of it really matters. I assume the sRGB spec is there for a reason, and if it's what we calibrate our monitors to, it'd be best to match it everywhere, in every app, to simplify and unify the grading process. Yes, it's a subtle difference, but it was bugging me.