Please use this identifier to cite or link to this item: http://hdl.handle.net/123456789/5469
Full metadata record
DC Field | Value | Language
dc.contributor.author | Mondal, Rishov | -
dc.date.accessioned | 2024-02-15T06:46:31Z | -
dc.date.available | 2024-02-15T06:46:31Z | -
dc.date.issued | 2023-05 | -
dc.identifier.uri | http://hdl.handle.net/123456789/5469 | -
dc.description.abstract | This thesis investigates Markov decision processes (MDPs) as a tool for solving complex decision-making problems in real-life scenarios. It examines the application of MDPs to stochastic games, specifically by analyzing an inventory duopoly with yield uncertainty, a classical operations research problem. The thesis also explores the role of MDPs in the budget allocation problem for the Voter Model, a popular model in opinion dynamics. The study analyzes the effectiveness of MDPs in solving real-life problems, highlights their benefits over other decision-making models, and provides directions for future research in this area. | en_US
dc.language.iso | en_US | en_US
dc.publisher | IISER Mohali | en_US
dc.subject | Markov Decision Process | en_US
dc.subject | Operations Research | en_US
dc.subject | Opinion Dynamics | en_US
dc.title | Markov Decision Process and Its Applications | en_US
dc.type | Thesis | en_US
dc.guide | Sahasrabudhe, Neeraja | en_US
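
The abstract above refers to Markov decision processes applied to an inventory problem with yield uncertainty. As a purely illustrative aside, the sketch below runs value iteration on a toy single-firm inventory MDP in which each ordered unit arrives with a fixed probability; every parameter, name, and modelling simplification here is a hypothetical assumption and is not taken from the thesis itself.

    # Illustrative only: value iteration for a toy inventory MDP with random yield.
    # All names, parameters, and modelling choices are hypothetical assumptions.
    from math import comb

    MAX_STOCK = 5            # on-hand inventory levels 0..MAX_STOCK
    ORDERS = range(0, 4)     # actions: number of units ordered this period
    YIELD_P = 0.8            # probability that an ordered unit actually arrives
    DEMAND = 2               # deterministic demand per period, for simplicity
    PRICE, COST, HOLD = 4.0, 1.0, 0.5
    GAMMA = 0.95             # discount factor

    def q_value(s, a, V):
        """Expected reward plus discounted continuation value, averaged over yield."""
        total = 0.0
        for arrived in range(a + 1):
            # binomial probability that exactly `arrived` of the `a` ordered units show up
            p = comb(a, arrived) * YIELD_P**arrived * (1 - YIELD_P)**(a - arrived)
            stock = min(s + arrived, MAX_STOCK)
            sold = min(stock, DEMAND)
            nxt = stock - sold
            reward = PRICE * sold - COST * arrived - HOLD * nxt
            total += p * (reward + GAMMA * V[nxt])
        return total

    V = [0.0] * (MAX_STOCK + 1)
    for _ in range(500):     # value iteration until approximately converged
        V = [max(q_value(s, a, V) for a in ORDERS) for s in range(MAX_STOCK + 1)]

    policy = [max(ORDERS, key=lambda a: q_value(s, a, V)) for s in range(MAX_STOCK + 1)]
    print("order quantity by stock level:", policy)

Under these assumptions the printed policy gives, for each stock level, the order quantity maximizing expected discounted profit; the same Bellman-recursion structure underlies the richer stochastic-game and Voter Model settings the abstract describes.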
Appears in Collections: MP-20

Files in This Item:
File | Description | Size | Format
embargo period.pdf | | 6.04 kB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.