A Markov decision process framework for optimal operation of monitored multi-state systems

Compare, Michele; Baraldi, Piero; Zio, Enrico
2018-01-01

Abstract

We develop a decision support framework based on Markov decision processes to maximize the profit from the operation of a multi-state system. The framework enables comprehensive management of the multi-state system, considering maintenance decisions together with decisions on the system operation settings, namely its loading condition and configuration. The decisions are informed by a condition monitoring system, which estimates the health state of the multi-state system components. The approach is demonstrated with reference to a mechanical system made up of components affected by fatigue.
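To illustrate the kind of decision problem the abstract describes, the following is a minimal sketch, not the authors' implementation: a toy Markov decision process for a single monitored component with assumed health states (good, degraded, failed) and assumed actions (run at low load, run at high load, perform maintenance), solved by standard value iteration to maximize discounted expected profit. All transition probabilities, rewards, and the discount factor are illustrative assumptions.

```python
# Minimal sketch of an MDP for operation/maintenance decisions on a monitored
# component. States, actions, probabilities, and rewards are assumed for
# illustration only; they are not taken from the paper.
import numpy as np

states = ["good", "degraded", "failed"]
actions = ["low_load", "high_load", "maintain"]

# P[a][s, s']: probability of moving from state s to s' under action a (assumed)
P = {
    "low_load":  np.array([[0.90, 0.09, 0.01],
                           [0.00, 0.85, 0.15],
                           [0.00, 0.00, 1.00]]),
    "high_load": np.array([[0.70, 0.25, 0.05],
                           [0.00, 0.60, 0.40],
                           [0.00, 0.00, 1.00]]),
    "maintain":  np.array([[1.00, 0.00, 0.00],
                           [0.95, 0.05, 0.00],
                           [0.90, 0.10, 0.00]]),
}

# R[a][s]: expected one-step profit of taking action a in state s (assumed)
R = {
    "low_load":  np.array([ 5.0,  4.0,   0.0]),
    "high_load": np.array([10.0,  8.0,   0.0]),
    "maintain":  np.array([-8.0, -8.0, -20.0]),
}

gamma = 0.95  # discount factor (assumed)

def value_iteration(tol=1e-8, max_iter=10_000):
    """Return the optimal value function and policy for the toy MDP."""
    V = np.zeros(len(states))
    for _ in range(max_iter):
        # Q[a, s] = immediate profit + discounted expected future value
        Q = np.array([R[a] + gamma * P[a] @ V for a in actions])
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            V = V_new
            break
        V = V_new
    policy = [actions[i] for i in Q.argmax(axis=0)]
    return V, policy

if __name__ == "__main__":
    V, policy = value_iteration()
    for s, v, a in zip(states, V, policy):
        print(f"state={s:9s}  value={v:8.2f}  best action={a}")
```

In the paper's setting, the state would instead encode the estimated health states of all components of the multi-state system (as provided by the condition monitoring system), and the action set would combine maintenance choices with loading and configuration settings; the sketch only shows the general solution structure.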
Keywords: condition-based maintenance; Markov decision process; Multi-component system; optimal maintenance policy; optimal operation policy; prognostics and health management; Safety, Risk, Reliability and Quality

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1077916
Citations
  • PubMed Central: not available
  • Scopus: 12
  • Web of Science: 7