Is it possible to modify OpenAI environments?
There are some things I would like to modify in the OpenAI Gym environments. With the CartPole example, we can edit the values set in the class's __init__ function, but with the Box2D-based environments it doesn't seem to be as straightforward.
For example, consider the BipedalWalker environment. How would I edit variables like SPEED_HIP or SPEED_KNEE?
Thanks
reinforcement-learning openai-gym
edited Nov 10 at 20:59 by Davia DeNisco
asked Nov 7 at 16:49 by tryingtolearn
2 Answers
Yes, you can modify or create new environments in gym. The simplest (but not recommended) way is to edit the constants in your local gym installation directly, but of course that's not very clean.
A nicer way is to download the bipedal walker environment file (from here) and save it locally, say as my_bipedal_walker.py. Then modify the constants in my_bipedal_walker.py and import it in your code (assuming you put the file on an importable path, such as the same folder as your other code files):
import gym
from my_bipedal_walker import BipedalWalker
env = BipedalWalker()
The env variable is then an instance of the environment, using your modified constants for the physics computation, and you can use it with any RL algorithm.
An even nicer way would be to make your custom environment available in the OpenAI gym registry, which you can do by following the instructions here.
answered Nov 10 at 16:33 by Matias Valdenegro (accepted)
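For completeness, a similar trick works without copying the file: SPEED_HIP and SPEED_KNEE are module-level constants in gym's bipedal_walker module (true for classic gym releases; check your version), so you can patch them on the imported module before creating the environment (e.g. import gym.envs.box2d.bipedal_walker as bw; bw.SPEED_HIP = 2.0). The sketch below uses a small stand-in module instead of gym itself, just to show why patching a module global affects later calls; the module and function names are illustrative, not gym's API.

```python
import types

# Stand-in for a copied environment module such as my_bipedal_walker.py:
# a module-level constant plus a function that reads it at call time.
mod = types.ModuleType("my_bipedal_walker")
exec(
    "SPEED_HIP = 4.0\n"
    "def hip_torque(action):\n"
    "    return SPEED_HIP * action\n",
    mod.__dict__,
)

print(mod.hip_torque(1.0))  # 4.0 -- uses the default constant

mod.SPEED_HIP = 2.0         # patch the module global
print(mod.hip_torque(1.0))  # 2.0 -- later calls see the new value
```

The reason this works is that Python functions resolve globals at call time against their defining module's namespace, so rebinding the constant on the module is enough; no reload is needed.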
I'd emphasize that very last sentence a bit more. That is 100% the best way to do things.
– Dennis Soemers
Nov 10 at 17:03
@DennisSoemers Is that really the best way if all I'm looking to do is make some small adjustments to various attributes?
– tryingtolearn
Nov 10 at 23:41
I would say it's too much for a small change; the easiest is to make the changes in a local copy of the file, as I mentioned.
– Matias Valdenegro
Nov 11 at 0:23
@tryingtolearn It's the "cleanest" way; it's how you should do things, in my opinion, if this is e.g. a longer-term project you're working on, or something you may eventually want to share with others or publish publicly on GitHub. It is technically a little more work, though, so if you're really sure you don't care how clean your solution is and are fine with a very quick "ugly" solution, then I guess it's OK to go with the env = BipedalWalker() approach.
– Dennis Soemers
Nov 11 at 10:00
@MatiasValdenegro Thanks for your answer. I have marked it correct. Further to my question, is it possible to modify the landing zone for the agent in LunarLander? github.com/openai/gym/blob/master/gym/envs/box2d/… As far as I can tell it's fixed at point (0,0) and not sure how to change that
– tryingtolearn
Nov 15 at 12:59
You can edit the bipedal walker environment just like you can modify the cartpole environment.
All you have to do is modify the constants SPEED_HIP and SPEED_KNEE.
If you want to change how those constants are used in the agent's locomotion, you might also want to tweak the step method.
After making changes to the code, any environment you instantiate will use your modifications.
answered Nov 10 at 13:13 by R.F. Nelson
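As a hedged sketch of the "tweak the step method" idea: rather than editing the installed file, you can subclass and override. Note that for the real BipedalWalker, SPEED_HIP and SPEED_KNEE are module-level constants rather than class attributes, so a class-attribute override like the one below would not reach them; the toy BaseEnv here only illustrates the subclass-and-override pattern (all names are illustrative, not gym's API).

```python
# Toy stand-in for an environment class; illustrates overriding step()
# in a subclass instead of editing the library source.
class BaseEnv:
    SPEED_HIP = 4.0

    def step(self, action):
        # toy "dynamics": the observation is just the applied hip speed
        return self.SPEED_HIP * action


class SlowHipEnv(BaseEnv):
    SPEED_HIP = 2.0  # halved hip speed for this variant

    def step(self, action):
        # example tweak: clip the action before applying the base dynamics
        action = max(-1.0, min(1.0, action))
        return super().step(action)


env = SlowHipEnv()
print(env.step(2.0))  # action clipped to 1.0, scaled by 2.0 -> 2.0
```

The same shape applies to a real gym environment: subclass it, override step (and __init__ if you need different constants), and train against your subclass.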
Thanks for your answer. But I can't do this directly from the env object, because the object doesn't have those attributes. Does your suggestion involve editing the pip-installed file directly?
– tryingtolearn
Nov 10 at 23:40
Yes. I am suggesting that, instead of installing the package via pip, you download the library from the github repo and require it locally.
– R.F. Nelson
Nov 11 at 1:17