An OpenAI Gym environment for vertical rocket landing, with an extensive toolkit for incorporating curriculum learning into your agents.
pip install rocketgym
from rocketgym.environment import Environment
import random

env = Environment()
observation = env.reset()
done = False

# Run one episode with a random agent
while not done:
    observation, reward, done, info = env.step(random.randint(0, 3))
    env.render()
The rocket is treated as a 2D free body, and its physics properties have been modeled after the Falcon 9. Its spawn conditions (height and starting rotation) can be adjusted through the curriculum settings described below. The observation consists of:
- Angle made with the y-axis ($rad$): $[-\frac{\pi}{2}, \frac{\pi}{2}]$
- Position Y ($m$): $[0, \infty]$
- Velocity X ($\frac{m}{s}$): $[-\infty, \infty]$
- Velocity Y ($\frac{m}{s}$): $[-\infty, \infty]$
- Angular velocity ($\frac{rad}{s}$): $[-\infty, \infty]$
At each timestep, the rocket can perform one of four actions (see the sketch after this list):
- Left - rotate the engine to the left and set maximum thrust
- Mid - rotate the engine to the middle and set maximum thrust
- Right - rotate the engine to the right and set maximum thrust
- None - turn off thrust
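As a concrete (and heavily simplified) illustration, the sketch below steps the environment with a hand-written policy. It assumes that the observation vector is ordered as in the list above and that the integer actions 0-3 map to Left, Mid, Right, and None in that order; both orderings are assumptions made for this example, not guarantees from the library.

import random
from rocketgym.environment import Environment

# Assumed integer-to-action mapping; check the library's source for the real one.
ACTION_LEFT, ACTION_MID, ACTION_RIGHT, ACTION_NONE = 0, 1, 2, 3


def naive_policy(observation):
    """Toy policy: fire the engine straight down when falling fast, otherwise coast.

    Assumes observation = [angle, pos_y, vel_x, vel_y, ang_vel] as listed above.
    """
    angle, pos_y, vel_x, vel_y, ang_vel = observation
    if vel_y < -2.0:       # falling faster than 2 m/s: thrust through the middle
        return ACTION_MID
    return ACTION_NONE     # otherwise save fuel


env = Environment()
observation = env.reset()
done = False
while not done:
    observation, reward, done, info = env.step(naive_policy(observation))

A learned agent would simply replace naive_policy with, for example, a Q-network; the stepping loop stays the same.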
The reward function takes multiple components into consideration (a worked sketch follows the list):

- For each second, the agent loses $0.3$.
- For impact with the ground, the agent receives $15$.
- At impact, the agent loses $0.5$ for each $rad$ off the vertical axis.
- At impact, the agent loses $0.25$ for each $\frac{rad}{s}$ of angular velocity.
- At impact, the agent loses $1$ for each $\frac{m}{s}$ away from the target $-1\frac{m}{s}$ vertical velocity.
- At impact, the agent loses $0.25$ for each $\frac{m}{s}$ of horizontal velocity.
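Putting those bullet points together, the total return for a flight that ends on the ground could be computed roughly as in the sketch below. This is a reconstruction from the description above, not the library's actual implementation, and the parameter names are chosen for the example.

def episode_return(flight_time, angle, angular_velocity, velocity_x, velocity_y):
    """Approximate total reward for an episode that ends with ground contact.

    Reconstructed from the bullet points above; not the library's internal code.
    """
    reward = -0.3 * flight_time                # -0.3 per second of flight
    reward += 15.0                             # reward for reaching the ground
    reward -= 0.5 * abs(angle)                 # tilt off the vertical axis (rad)
    reward -= 0.25 * abs(angular_velocity)     # residual spin (rad/s)
    reward -= 1.0 * abs(velocity_y - (-1.0))   # distance from the target -1 m/s descent
    reward -= 0.25 * abs(velocity_x)           # horizontal drift (m/s)
    return reward


# A gentle, upright touchdown close to -1 m/s scores near +15 minus the time penalty.
print(episode_return(flight_time=4.0, angle=0.05, angular_velocity=0.1,
                     velocity_x=0.2, velocity_y=-1.1))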
The best part about this gym is its curriculum module: it lets you alter the difficulty of the environment by changing things like the initial height, the action space, etc.
from rocketgym.environment import Environment

env = Environment()

# Spawn height
env.curriculum.set_fixed_height()
env.curriculum.set_random_height(1, 5)
env.curriculum.enable_random_height()
env.curriculum.disable_random_height()

# Gradually raise the spawn height between episodes
env.curriculum.enable_increasing_height(rate=0.05)
env.curriculum.disable_increasing_height()

# Starting rotation
env.curriculum.enable_random_starting_rotation()
env.curriculum.disable_random_starting_rotation()

# Reward shaping on horizontal velocity
env.curriculum.enable_x_velocity_reward()
env.curriculum.disable_x_velocity_reward()
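For example, a simple curriculum run might start the rocket low and slowly raise the spawn height while an agent (here just the random one from the first example) collects episodes. This is only a sketch of how the calls above could be combined, not a prescribed training recipe.

import random
from rocketgym.environment import Environment

env = Environment()
env.curriculum.enable_increasing_height(rate=0.05)   # raise the spawn height over time
env.curriculum.enable_random_starting_rotation()     # and vary the initial tilt

for episode in range(100):
    observation = env.reset()
    done = False
    total_reward = 0
    while not done:
        observation, reward, done, info = env.step(random.randint(0, 3))
        total_reward += reward
    print(f"Episode {episode}: return {total_reward:.2f}")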
The dashboard module enables saving flight logs. Plots are saved in the logs/plots directory.
from rocketgym.dashboard import Dashboard

# Create the dashboard
dash = Dashboard()

# Plot and save the flight log for episode 0
dash.plot_log(env.rocket.flight_log, episode=0)
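Putting the pieces together, each episode's flight can be plotted right after it finishes. The sketch below only combines calls that already appear above.

import random
from rocketgym.environment import Environment
from rocketgym.dashboard import Dashboard

env = Environment()
dash = Dashboard()

for episode in range(3):
    observation = env.reset()
    done = False
    while not done:
        observation, reward, done, info = env.step(random.randint(0, 3))
    # Save this episode's flight log as a plot under logs/plots
    dash.plot_log(env.rocket.flight_log, episode=episode)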