Bandits With Budgets
DOI: 10.1145/2745844.2745847
Abstract: available in the full text.
Date: January 1, 2015
Authors: Richard Combes, Chong Jiang, Rayadurgam Srikant
Publisher: ACM Press
Related search
Bernoulli Two-Armed Bandits With Geometric Termination (Stochastic Processes and their Applications)
Two-Armed Bandits With a Goal, II. Dependent Arms (Advances in Applied Probability)
Robust Risk-Averse Stochastic Multi-Armed Bandits (Lecture Notes in Computer Science)
Multi-Armed Bandits Under General Depreciation and Commitment (Probability in the Engineering and Informational Sciences)
Multitasking, Multiarmed Bandits, and the Italian Judiciary (Manufacturing and Service Operations Management)
Equilibrium With Fixed Budgets and Superlinear Connections (ANZIAM Journal)
Second-Best Beam-Alignment via Bayesian Multi-Armed Bandits
Putting Bandits Into Context: How Function Learning Supports Decision Making (Journal of Experimental Psychology: Learning, Memory, and Cognition)
Our Military Budgets (American Journal of Public Health)