    • Multi-Armed Bandit Networks: Exploring Online Learning with Networks 

      Hansen, Viktor (The University of Bergen, 2018-06-26)
      Classical Multi-Armed Bandit solutions often assume independent arms as a simplification of the problem. This has shown great results in many fields of practice, but could in some cases presumably leave untapped ...
      Master thesis
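
As context for the independent-arms assumption mentioned in the abstract, below is a minimal sketch of one classical bandit strategy, epsilon-greedy, in which each arm's reward estimate is maintained separately with no shared structure between arms. The function name, arm means, and parameters are illustrative, not taken from the thesis.

```python
import random

def epsilon_greedy_bandit(true_means, steps=1000, epsilon=0.1, seed=0):
    """Classical multi-armed bandit under the independent-arms assumption:
    each arm keeps its own reward estimate, with no information shared
    between arms. Arms are modeled as Bernoulli for simplicity."""
    rng = random.Random(seed)
    n_arms = len(true_means)
    counts = [0] * n_arms          # number of pulls per arm
    estimates = [0.0] * n_arms     # running mean reward per arm
    total_reward = 0.0
    for _ in range(steps):
        if rng.random() < epsilon:                       # explore: random arm
            arm = rng.randrange(n_arms)
        else:                                            # exploit: best estimate
            arm = max(range(n_arms), key=lambda a: estimates[a])
        reward = 1.0 if rng.random() < true_means[arm] else 0.0
        counts[arm] += 1
        # incremental update of the running mean for this arm only
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
        total_reward += reward
    return estimates, total_reward

estimates, total = epsilon_greedy_bandit([0.2, 0.5, 0.8])
```

Because each arm is updated in isolation, pulling one arm reveals nothing about the others; exploiting correlations between arms (e.g., via a network structure) is the kind of extension the abstract hints at.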