As far as I can tell, alayer.usd works in the example above whether it's binary or ascii. Similarly, if the sdf format argument isn't there, the format of the .usd doesn't matter either. I'm just wondering whether to use extensions or these format arguments as the identifier for binary vs. ascii. These args get introduced when a Configure Layer LOP has the format specified, or when an output processor on the 'master' USD ROP forces the formatting.
If you don't specify the format in that fashion, a .usd file defaults to saving in binary. The format argument is how you save in ascii without breaking existing links to .usd files.
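To make that concrete, here's a rough, untested sketch of the write-side behaviour using the stock pxr Python bindings (the file names are just placeholders, not anything from the scene above):

```python
from pxr import Usd

# Build a trivial stage in memory.
stage = Usd.Stage.CreateInMemory()
stage.DefinePrim("/hello", "Xform")

# With the plain .usd extension, export defaults to the binary (crate) encoding.
stage.GetRootLayer().Export("alayer_binary.usd")

# Same .usd extension, but the "format" file-format argument forces ascii,
# so anything sublayering the .usd path keeps resolving.
stage.GetRootLayer().Export("alayer_ascii.usd", args={"format": "usda"})
```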
I don't think I explained it well enough, though. I understand the idea you describe; what's a bit perplexing is that even if I manually save that sublayer as a binary file at some later date, nothing breaks, even though the sublayer reference still carries the ascii format argument. So it seems a little defunct.
I'm just wondering whether it matters at all, or whether I'll regret ignoring the format specified by the argument at some point in the future.
Ah, I see. I should probably defer to the folks who understand USD at a lower level than I do. Although responses may be slow this week due to an event in London.
I don't believe this flag/argument has any impact during loading, only writing. Its usage (in the context of writing) is also mentioned here: https://www.sidefx.com/forum/topic/86932/?page=1. I appreciate it's confusing (perhaps even misleading), though I worry that any system we put in place to include it when a path is used as an output and strip it when the path is used as a reference/sublayer would have other unexpected consequences. Good food for thought, though. I've created an RFE so that we don't abandon this train of thought.
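To illustrate the read side (another untested sketch, with a placeholder path): my understanding is that when a .usd layer is opened, the actual encoding is detected from the file contents, so an identifier carrying the argument opens fine whether the file on disk is currently ascii or binary.

```python
from pxr import Sdf

# A sublayer path as it would appear with the format argument baked in.
identifier = "alayer.usd:SDF_FORMAT_ARGS:format=usda"

# Opening goes through the same identifier parsing USD uses for
# sublayer/reference paths; it succeeds even if alayer.usd has since
# been re-saved as binary.
layer = Sdf.Layer.FindOrOpen(identifier)
print(layer is not None)
```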