MABpy

A library of multi-armed bandit problem models and tools for analyzing various bandit strategies.
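As a quick illustration of the kind of strategy such a library analyzes, here is a minimal, self-contained epsilon-greedy bandit sketch in plain Python. It is a generic example, not the MABpy API: the function and parameter names (`epsilon_greedy`, `rewards_fn`, `n_arms`) are assumptions for this sketch only.

```python
import random

def epsilon_greedy(rewards_fn, n_arms, steps, epsilon=0.1, seed=0):
    """Generic epsilon-greedy sketch (not the MABpy API):
    explore a random arm with probability epsilon, else exploit
    the arm with the highest estimated mean reward."""
    rng = random.Random(seed)
    counts = [0] * n_arms
    values = [0.0] * n_arms  # running mean reward per arm
    total = 0.0
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)  # explore
        else:
            arm = max(range(n_arms), key=values.__getitem__)  # exploit
        r = rewards_fn(arm, rng)
        counts[arm] += 1
        values[arm] += (r - values[arm]) / counts[arm]  # incremental mean
        total += r
    return values, total

# Toy Bernoulli arms with success probabilities 0.2, 0.5, 0.8.
probs = [0.2, 0.5, 0.8]
values, total = epsilon_greedy(
    lambda a, rng: 1.0 if rng.random() < probs[a] else 0.0,
    n_arms=3, steps=10_000)
```

After enough steps the estimated values should rank the arms correctly, with the third (highest-probability) arm identified as best.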


Keywords
MultiArmedBandits, ReinforcementLearning
License
MIT
Install
pip install MABpy==0.1a0

Documentation