
Advancing Sustainable Energy Management in Public Buildings through Digital Twins and Reinforcement Learning

Ullah, Zahid; Gruosso, Giambattista
2025-01-01

Abstract

Public buildings play a critical role in urban energy consumption and require innovative solutions to enhance sustainability while utilizing renewable energy sources. This study develops a digital twin framework that models the physical and operational characteristics of government facilities, including solar photovoltaic installations, battery storage units, heating, ventilation, and air conditioning (HVAC) systems, as well as dynamic load profiles. The digital twin continuously integrates sensor-derived data on temperature, lighting, occupancy, and power consumption to generate an accurate, real-time representation of building performance. A reinforcement learning–based control mechanism is embedded within the digital twin to evaluate multiple operational scenarios and implement strategies that minimize energy costs while maximizing renewable energy integration. By dynamically coordinating storage, distributed generation, and demand response, the framework enables adaptive energy scheduling under variable price signals and renewable intermittency. Results indicate significant improvements in energy efficiency, resilience, and cost-effectiveness, demonstrating the potential of digital twin technology to advance sustainable energy management in public buildings.
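The abstract does not specify the reinforcement learning formulation used in the paper. As a purely illustrative sketch (not the authors' method), the kind of price-aware storage scheduling it describes can be framed as tabular Q-learning over a toy problem: a small battery, a fixed hourly load, and a two-level electricity tariff are all hypothetical assumptions introduced here for illustration.

```python
import random

# Illustrative toy setup (NOT the paper's model): a battery scheduled over a
# 24-hour horizon under a simple off-peak/peak price signal. State = (hour,
# state of charge); actions = discharge, idle, charge. Tabular Q-learning
# learns a policy that minimizes the day's grid energy cost.

HOURS = 24
SOC_LEVELS = 5          # discretized battery state of charge: 0..4 (assumed)
ACTIONS = (-1, 0, 1)    # discharge, idle, charge (one SOC level per hour)
PRICE = [0.10 if h < 8 or h >= 20 else 0.30 for h in range(HOURS)]  # EUR/kWh
DEMAND = 1.0            # constant 1 kWh building load each hour (assumed)

def step(hour, soc, action):
    """Apply one action; return the next state and the hour's energy cost."""
    new_soc = min(max(soc + action, 0), SOC_LEVELS - 1)
    grid_energy = DEMAND + (new_soc - soc)   # battery charging adds grid draw
    cost = max(grid_energy, 0.0) * PRICE[hour]
    return (hour + 1) % HOURS, new_soc, cost

def train(episodes=3000, alpha=0.1, gamma=0.95, eps=0.1, seed=0):
    rng = random.Random(seed)
    q = {(h, s): [0.0] * len(ACTIONS)
         for h in range(HOURS) for s in range(SOC_LEVELS)}
    for _ in range(episodes):
        hour, soc = 0, 2
        for _ in range(HOURS):
            # epsilon-greedy action selection (minimizing cost, so argmin)
            if rng.random() < eps:
                a = rng.randrange(len(ACTIONS))
            else:
                a = min(range(len(ACTIONS)), key=lambda i: q[(hour, soc)][i])
            nh, ns, cost = step(hour, soc, ACTIONS[a])
            # Q-learning update with cost treated as a penalty to minimize
            best_next = min(q[(nh, ns)])
            q[(hour, soc)][a] += alpha * (cost + gamma * best_next
                                          - q[(hour, soc)][a])
            hour, soc = nh, ns
    return q

def greedy_cost(q):
    """Roll out one day with the learned greedy policy; return total cost."""
    hour, soc, total = 0, 2, 0.0
    for _ in range(HOURS):
        a = min(range(len(ACTIONS)), key=lambda i: q[(hour, soc)][i])
        hour, soc, cost = step(hour, soc, ACTIONS[a])
        total += cost
    return total

q = train()
print(f"learned daily cost: {greedy_cost(q):.2f}")
```

The always-idle baseline costs 4.80 per day under these assumed prices; the learned policy shifts battery charging into off-peak hours and discharging into peak hours to come in below that. The paper's framework additionally coordinates PV generation, HVAC, and demand response, none of which this sketch models.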
2025
IEEE
ISBN: 979-8-3315-6755-2
Keywords: Digital Twin, Reinforcement Learning, Public Buildings, Energy Management, Renewable Energy

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1309323