Add the new Minitaur environments to pybullet_envs through __init__.py:
id='MinitaurReactiveEnv-v0',
id='MinitaurTrottingEnv-v0',
id='MinitaurBallGymEnv-v0',
id='MinitaurStandGymEnv-v0',
id='MinitaurAlternatingLegsEnv-v0',
id='MinitaurFourLegStandEnv-v0',
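For reference, a minimal sketch of what one such registration entry could look like in pybullet_envs/__init__.py, using gym's register API; the entry_point path and max_episode_steps value here are illustrative assumptions, not the exact values in the file:

from gym.envs.registration import register

register(
    id='MinitaurReactiveEnv-v0',
    # Assumed module path; the actual entry_point in __init__.py may differ.
    entry_point='pybullet_envs.minitaur.envs.minitaur_reactive_env:MinitaurReactiveEnv',
    max_episode_steps=1000,  # illustrative episode limit
)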
Disable reflection in minitaur_four_leg_stand_env, since the floor changes orientation (the reflection is rendered on a fixed plane with a [0,0,1] normal).
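A minimal sketch of turning the reflection off through pybullet's debug visualizer; whether the env uses exactly this call is an assumption:

import pybullet as p
import pybullet_data

p.connect(p.GUI)
p.setAdditionalSearchPath(pybullet_data.getDataPath())
floor = p.loadURDF("plane.urdf")
# The GUI draws the reflection on a fixed plane with a [0,0,1] normal, so it
# no longer matches once the floor is re-oriented; disable it instead.
p.configureDebugVisualizer(p.COV_ENABLE_PLANAR_REFLECTION, 0)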
from pybullet_envs.minitaur.envs.minitaur_alternating_legs_env import MinitaurAlternatingLegsEnv
from pybullet_envs.minitaur.envs.minitaur_ball_gym_env import MinitaurBallGymEnv
from pybullet_envs.minitaur.envs.minitaur_four_leg_stand_env import MinitaurFourLegStandEnv
from pybullet_envs.minitaur.envs.minitaur_randomize_terrain_gym_env import MinitaurRandomizeTerrainGymEnv
from pybullet_envs.minitaur.envs.minitaur_reactive_env import MinitaurReactiveEnv
from pybullet_envs.minitaur.envs.minitaur_stand_gym_env import MinitaurStandGymEnv
from pybullet_envs.minitaur.envs.minitaur_trotting_env import MinitaurTrottingEnv
Use python -m pybullet_envs.examples.testEnv --env AntBulletEnv-v0 --render=1 --steps 1000 --resetbenchmark=1
Added environments HumanoidFlagrunBulletEnv-v0, HumanoidFlagrunHarderBulletEnv-v0, StrikerBulletEnv-v0, ThrowerBulletEnv-v0, PusherBulletEnv-v0, ReacherBulletEnv-v0, and CartPoleBulletEnv-v0, and registered them with OpenAI Gym.
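As a quick usage check, a minimal sketch that builds one of the newly registered environments through Gym and steps it with random actions (classic pre-0.26 gym step API assumed):

import gym
import pybullet_envs  # importing the package registers the Bullet envs with Gym

env = gym.make('ReacherBulletEnv-v0')
obs = env.reset()
for _ in range(100):
    obs, reward, done, info = env.step(env.action_space.sample())
    if done:
        obs = env.reset()
env.close()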
Allow numpy/humanoid_running.py to use a batch or non-batch update (setJointMotorControlArray or setJointMotorControl2, respectively).
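The two paths differ only in call granularity: setJointMotorControl2 issues one command per joint, while setJointMotorControlArray sends all commands in a single call, cutting per-call overhead. A minimal sketch of both, assuming POSITION_CONTROL; r2d2.urdf and the zero targets are placeholders:

import pybullet as p
import pybullet_data

p.connect(p.DIRECT)
p.setAdditionalSearchPath(pybullet_data.getDataPath())
body = p.loadURDF("r2d2.urdf")  # placeholder model
joints = list(range(p.getNumJoints(body)))
targets = [0.0] * len(joints)

# Non-batch: one call per joint.
for j, t in zip(joints, targets):
    p.setJointMotorControl2(body, j, p.POSITION_CONTROL, targetPosition=t)

# Batch: a single call covering every joint.
p.setJointMotorControlArray(body, joints, p.POSITION_CONTROL, targetPositions=targets)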