Amanote Research




Publications by Rayadurgam Srikant

Bandits With Budgets

2015, English

Related publications

Bernoulli Two-Armed Bandits With Geometric Termination

Stochastic Processes and their Applications
Modeling, Applied Mathematics, Statistics, Probability, Simulation
1981, English

Two-Armed Bandits With a Goal, II. Dependent Arms

Advances in Applied Probability
Applied Mathematics, Statistics, Probability
1980, English

Robust Risk-Averse Stochastic Multi-Armed Bandits

Lecture Notes in Computer Science
Computer Science, Theoretical Computer Science
2013, English

Multi-Armed Bandits Under General Depreciation and Commitment

Probability in the Engineering and Informational Sciences
Industrial, Probability, Uncertainty, Manufacturing Engineering, Management Science, Statistics, Operations Research
2014, English

Multitasking, Multiarmed Bandits, and the Italian Judiciary

Manufacturing and Service Operations Management
Management Science, Management, Operations Research, Strategy
2016, English

Equilibrium With Fixed Budgets and Superlinear Connections

ANZIAM Journal
Mathematics
2001, English

Second-Best Beam-Alignment via Bayesian Multi-Armed Bandits

2019, English

Putting Bandits Into Context: How Function Learning Supports Decision Making.

Journal of Experimental Psychology: Learning Memory and Cognition
Linguistics, Language, Experimental, Cognitive Psychology
2018, English

Our Military Budgets.

American Journal of Public Health
Environmental, Public Health, Occupational Health
1980, English

© 2025 Amaplex Software S.P.R.L. All rights reserved.