Deploy a trained PPO (shared model) #334
Unanswered
AndreaRossetto asked this question in Q&A
Replies: 0 comments
Hi,
I'm trying to deploy a trained PPO agent; the checkpoint is shared (policy + value).
I create the shared model by reusing the model code from training in IsaacLab.
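For context, a minimal sketch of what such a shared model looks like, following the skrl shared Gaussian/Deterministic model pattern used by the IsaacLab skrl examples; the hidden layer sizes and activations below are placeholders and have to match whatever was used during training:

```python
import torch
import torch.nn as nn
from skrl.models.torch import Model, GaussianMixin, DeterministicMixin


class Shared(GaussianMixin, DeterministicMixin, Model):
    """Single network with a policy (Gaussian) head and a value (deterministic) head."""

    def __init__(self, observation_space, action_space, device,
                 clip_actions=False, clip_log_std=True,
                 min_log_std=-20, max_log_std=2, reduction="sum"):
        Model.__init__(self, observation_space, action_space, device)
        GaussianMixin.__init__(self, clip_actions, clip_log_std, min_log_std, max_log_std, reduction)
        DeterministicMixin.__init__(self, clip_actions)

        # placeholder architecture -- must match the training config
        self.net = nn.Sequential(nn.Linear(self.num_observations, 256), nn.ELU(),
                                 nn.Linear(256, 128), nn.ELU(),
                                 nn.Linear(128, 64), nn.ELU())
        self.mean_layer = nn.Linear(64, self.num_actions)
        self.log_std_parameter = nn.Parameter(torch.zeros(self.num_actions))
        self.value_layer = nn.Linear(64, 1)
        self._shared_output = None

    def act(self, inputs, role):
        # dispatch to the mixin that matches the requested role
        if role == "policy":
            return GaussianMixin.act(self, inputs, role)
        elif role == "value":
            return DeterministicMixin.act(self, inputs, role)

    def compute(self, inputs, role):
        if role == "policy":
            # cache the trunk output so the value head can reuse it
            self._shared_output = self.net(inputs["states"])
            return self.mean_layer(self._shared_output), self.log_std_parameter, {}
        elif role == "value":
            shared_output = self.net(inputs["states"]) if self._shared_output is None else self._shared_output
            self._shared_output = None
            return self.value_layer(shared_output), {}
```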
When I try to run it, I get an error.
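In case it helps pin down where the error comes from, a sketch of how the checkpoint could be loaded, assuming a skrl agent checkpoint, i.e. a dict whose "policy" entry holds the shared model's state dict; the key names, the path, and the space sizes below are assumptions, and printing checkpoint.keys() shows what is actually stored:

```python
import gymnasium as gym
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

# spaces must match the ones from training (sizes here are placeholders / assumptions)
observation_space = gym.spaces.Box(low=-float("inf"), high=float("inf"), shape=(48,))
action_space = gym.spaces.Box(low=-1.0, high=1.0, shape=(12,))

# hypothetical checkpoint path (assumption)
checkpoint_path = "runs/checkpoints/best_agent.pt"

checkpoint = torch.load(checkpoint_path, map_location=device)
print(checkpoint.keys())  # inspect what the agent actually saved

# for a shared model the "policy" and "value" entries usually hold the same weights (assumption)
model = Shared(observation_space, action_space, device)
model.load_state_dict(checkpoint["policy"])
model.to(device)
model.eval()
```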
There is another part that I'm not sure is correct.
Then, to run inference with the model, I do this:
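Roughly the following (a sketch assuming skrl's act() interface, which returns the sampled actions, their log-prob, and an outputs dict whose "mean_actions" entry is the deterministic action usually used for deployment):

```python
with torch.no_grad():
    # `observation` is the flat sensor vector (see the sketch after the next question)
    obs = torch.as_tensor(observation, dtype=torch.float32, device=device).unsqueeze(0)
    # if a state preprocessor (e.g. RunningStandardScaler) was used in training,
    # the same scaling must be applied to `obs` here as well (assumption)
    actions, log_prob, outputs = model.act({"states": obs}, role="policy")
    action = outputs["mean_actions"].squeeze(0).cpu().numpy()  # deterministic action (assumption)
```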
Is this correct? If yes, how can I populate the observation space with readings from the sensors?
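To make the second question concrete, this is roughly how I imagine filling the observation vector from sensor readings; every name below is hypothetical, and the ordering, units, and scaling have to match exactly what the IsaacLab environment computed during training:

```python
import numpy as np


def build_observation(robot) -> np.ndarray:
    """Assemble the flat observation vector in the same order as the training environment.
    `robot` and its attributes are hypothetical sensor accessors (assumptions)."""
    obs = np.concatenate([
        robot.base_lin_vel,      # base linear velocity (3,)   -- assumption
        robot.base_ang_vel,      # base angular velocity (3,)  -- assumption
        robot.joint_positions,   # per-joint positions         -- assumption
        robot.joint_velocities,  # per-joint velocities        -- assumption
        robot.last_action,       # previous action, if the env observed it -- assumption
    ]).astype(np.float32)
    assert obs.shape[0] == model.num_observations, "must match the training observation size"
    return obs
```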