gym-bandits

Implements multi-armed bandits.

License: MIT

Install: pip install gym-bandits==0.0.1

Documentation

Gym Bandits

A multi-armed bandit environment for OpenAI Gym.

Installation instructions

Requirements: gym and numpy

pip install gym_bandits

Usage

import gym
import gym_bandits

env = gym.make('MultiArmedBandits-v0')              # 10-armed bandit (default)
env = gym.make('MultiArmedBandits-v0', nr_arms=15)  # 15-armed bandit
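For context on what an agent does with such an environment, here is a self-contained sketch of the classic k-armed bandit problem with an epsilon-greedy agent. It does not use gym_bandits itself (the function name, arm count, and reward distribution are illustrative assumptions, loosely following the standard Gaussian testbed), and needs only the standard library:

```python
import random

def run_epsilon_greedy(n_arms=10, steps=1000, epsilon=0.1, seed=0):
    # Illustrative stand-in for a 10-armed bandit, not the gym_bandits API.
    rng = random.Random(seed)
    # True mean reward of each arm, drawn once (Gaussian testbed assumption)
    true_means = [rng.gauss(0.0, 1.0) for _ in range(n_arms)]
    q = [0.0] * n_arms      # estimated value of each arm
    counts = [0] * n_arms   # number of pulls per arm
    total_reward = 0.0
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)                   # explore
        else:
            arm = max(range(n_arms), key=q.__getitem__)   # exploit
        reward = rng.gauss(true_means[arm], 1.0)
        counts[arm] += 1
        q[arm] += (reward - q[arm]) / counts[arm]         # incremental mean
        total_reward += reward
    return total_reward / steps, true_means

avg_reward, true_means = run_epsilon_greedy()
```

With the Gym environment, the same loop would instead call env.reset() once and env.step(arm) each iteration to obtain the reward.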