Amanote Research
Publications by Rayadurgam Srikant
Bandits With Budgets
Related publications
Bernoulli Two-Armed Bandits With Geometric Termination
Stochastic Processes and their Applications (Modeling, Applied Mathematics, Statistics, Probability, Simulation)
Two-Armed Bandits With a Goal, II. Dependent Arms
Advances in Applied Probability (Applied Mathematics, Statistics, Probability)
Robust Risk-Averse Stochastic Multi-Armed Bandits
Lecture Notes in Computer Science (Computer Science, Theoretical Computer Science)
Multi-Armed Bandits Under General Depreciation and Commitment
Probability in the Engineering and Informational Sciences (Industrial, Probability, Uncertainty, Manufacturing Engineering, Management Science, Statistics, Operations Research)
Multitasking, Multiarmed Bandits, and the Italian Judiciary
Manufacturing and Service Operations Management (Management Science, Management, Operations Research, Strategy)
Equilibrium With Fixed Budgets and Superlinear Connections
ANZIAM Journal (Mathematics)
Second-Best Beam-Alignment via Bayesian Multi-Armed Bandits
Putting Bandits Into Context: How Function Learning Supports Decision Making.
Journal of Experimental Psychology: Learning, Memory, and Cognition (Linguistics, Language, Experimental, Cognitive Psychology)
Our Military Budgets.
American Journal of Public Health (Environmental, Public Health, Occupational Health)