• Multi-Armed Bandit Networks: Exploring Online Learning with Networks 

      Hansen, Viktor (Master thesis, 2018-06-26)
      Classical Multi-Armed Bandit solutions often assume independent arms as a simplification of the problem. This simplification has produced strong results in many fields of practice, but could in some cases presumably leave untapped ...
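      The abstract refers to the classical bandit setting in which each arm's reward estimate is maintained independently. A minimal epsilon-greedy sketch of that setting is given below; all class and parameter names are illustrative and not taken from the thesis itself.

```python
import random

class EpsilonGreedyBandit:
    """Classical multi-armed bandit agent that treats each arm as independent."""

    def __init__(self, n_arms, epsilon=0.1, seed=0):
        self.epsilon = epsilon
        self.counts = [0] * n_arms      # pulls per arm
        self.values = [0.0] * n_arms    # running mean reward per arm
        self.rng = random.Random(seed)

    def select_arm(self):
        # Explore uniformly with probability epsilon, otherwise exploit
        # the arm with the highest estimated mean reward.
        if self.rng.random() < self.epsilon:
            return self.rng.randrange(len(self.counts))
        return max(range(len(self.counts)), key=lambda a: self.values[a])

    def update(self, arm, reward):
        # Incremental mean update: only the pulled arm's estimate changes,
        # reflecting the independence assumption between arms.
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]


# Usage: three Bernoulli arms with hidden success probabilities.
probs = [0.2, 0.5, 0.8]
agent = EpsilonGreedyBandit(n_arms=3, epsilon=0.1, seed=42)
env_rng = random.Random(7)
for _ in range(5000):
    arm = agent.select_arm()
    reward = 1.0 if env_rng.random() < probs[arm] else 0.0
    agent.update(arm, reward)
```

      Because each arm's value is updated in isolation, pulling one arm teaches the agent nothing about the others; exploiting correlations between arms is precisely the untapped potential the abstract alludes to.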