TY - GEN
T1 - A framework of stochastic power management using hidden Markov model
AU - Tan, Ying
AU - Qiu, Qinru
PY - 2008
Y1 - 2008
N2 - The effectiveness of stochastic power management relies on accurate system and workload models and effective policy optimization. Workload modeling is a machine learning procedure that finds the intrinsic pattern of incoming tasks based on observed workload attributes. Markov Decision Process (MDP) based models have been widely adopted for stochastic power management because they deliver provably optimal policies. Given a sequence of observed workload attributes, the hidden Markov model (HMM) of the workload is trained. If the observed workload attributes and the states in the workload model do not have a one-to-one correspondence, the MDP becomes a Partially Observable Markov Decision Process (POMDP). This paper presents a framework of modeling and optimization for stochastic power management using HMM and POMDP. The proposed technique discovers the HMM of the workload by maximizing the likelihood of the observed attribute sequence. The POMDP optimization is formulated and solved as a quadratically constrained linear program (QCLP). Compared with the traditional optimization technique based on value iteration, the QCLP-based optimization provides a superior policy by enabling stochastic control.
AB - The effectiveness of stochastic power management relies on accurate system and workload models and effective policy optimization. Workload modeling is a machine learning procedure that finds the intrinsic pattern of incoming tasks based on observed workload attributes. Markov Decision Process (MDP) based models have been widely adopted for stochastic power management because they deliver provably optimal policies. Given a sequence of observed workload attributes, the hidden Markov model (HMM) of the workload is trained. If the observed workload attributes and the states in the workload model do not have a one-to-one correspondence, the MDP becomes a Partially Observable Markov Decision Process (POMDP). This paper presents a framework of modeling and optimization for stochastic power management using HMM and POMDP. The proposed technique discovers the HMM of the workload by maximizing the likelihood of the observed attribute sequence. The POMDP optimization is formulated and solved as a quadratically constrained linear program (QCLP). Compared with the traditional optimization technique based on value iteration, the QCLP-based optimization provides a superior policy by enabling stochastic control.
UR - http://www.scopus.com/inward/record.url?scp=49749095612&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=49749095612&partnerID=8YFLogxK
U2 - 10.1109/DATE.2008.4484668
DO - 10.1109/DATE.2008.4484668
M3 - Conference contribution
AN - SCOPUS:49749095612
SN - 9783981080
SN - 9789783981089
T3 - Proceedings - Design, Automation and Test in Europe, DATE
SP - 92
EP - 97
BT - Design, Automation and Test in Europe, DATE 2008
T2 - Design, Automation and Test in Europe, DATE 2008
Y2 - 10 March 2008 through 14 March 2008
ER -