Please use this identifier to cite or link to this item:
http://hdl.handle.net/123456789/5469
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Mondal, Rishov | - |
dc.date.accessioned | 2024-02-15T06:46:31Z | - |
dc.date.available | 2024-02-15T06:46:31Z | - |
dc.date.issued | 2023-05 | - |
dc.identifier.uri | http://hdl.handle.net/123456789/5469 | - |
dc.description.abstract | This thesis investigates the potential of Markov decision processes (MDPs) as a tool for solving complex decision-making problems in real-life scenarios. The project delves into the application of MDPs to stochastic games, specifically by analyzing an inventory duopoly with yield uncertainty, a classical operations research problem. The thesis also explores the role of MDPs in analyzing the budget allocation problem in the Voter Model, a popular model in opinion dynamics. The study provides a comprehensive analysis of the effectiveness of MDPs in solving real-life problems and highlights their benefits over other decision-making models. The project offers insights into how MDPs can be used effectively to analyze and solve real-life problems and provides directions for future research in this area. | en_US |
dc.language.iso | en_US | en_US |
dc.publisher | IISER Mohali | en_US |
dc.subject | Markov Decision Process | en_US |
dc.subject | Operations Research | en_US |
dc.subject | Opinion Dynamics | en_US |
dc.title | Markov Decision Process and Its Applications | en_US |
dc.type | Thesis | en_US |
dc.guide | Sahasrabudhe, Neeraja | en_US |
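
The abstract above centers on Markov decision processes. As a quick orientation for readers of this record, the sketch below shows value iteration on a tiny two-state MDP in Python; the inventory-style states, actions, transition probabilities, and rewards are invented here purely for illustration and are not taken from the thesis.

```python
import numpy as np

# Illustrative two-state MDP (states: low stock / high stock; actions: order / hold).
# All numbers below are assumptions chosen for this sketch, not results from the thesis.
states = ["low_stock", "high_stock"]
actions = ["order", "hold"]

# P[s][a] = list of (next_state_index, probability); R[s][a] = expected one-step reward
P = {
    0: {0: [(1, 0.8), (0, 0.2)],   # low stock, order: likely replenished
        1: [(0, 1.0)]},            # low stock, hold: stays low
    1: {0: [(1, 1.0)],             # high stock, order: stays high
        1: [(1, 0.6), (0, 0.4)]},  # high stock, hold: may be sold down
}
R = {
    0: {0: -2.0, 1: -5.0},  # ordering costs less than lost sales when stock is low
    1: {0: -1.0, 1: 4.0},   # holding high stock earns sales revenue
}

gamma = 0.95            # discount factor
V = np.zeros(len(states))

# Value iteration: repeatedly apply the Bellman optimality update until convergence
for _ in range(1000):
    V_new = np.empty_like(V)
    for s in range(len(states)):
        V_new[s] = max(
            R[s][a] + gamma * sum(p * V[s2] for s2, p in P[s][a])
            for a in range(len(actions))
        )
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

# Greedy policy with respect to the converged value function
policy = [
    max(range(len(actions)),
        key=lambda a: R[s][a] + gamma * sum(p * V[s2] for s2, p in P[s][a]))
    for s in range(len(states))
]
print("Optimal values:", V)
print("Optimal policy:", [actions[a] for a in policy])
```

The same Bellman-update structure underlies the larger applications named in the abstract; only the state space, action space, and reward model change.
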
Appears in Collections: MP-20
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
embargo period.pdf | | 6.04 kB | Adobe PDF |