Automating Inhabitant Interactions In Home And Workplace Environments Through Data-driven Generation Of Hierarchical Partially-observable Markov Decision Processes
Youngblood, Gregory Michael
Markov models provide a useful representation of system behavioral actions and state observations, but they do not scale well. Introducing hierarchy and abstraction through hierarchical hidden Markov models (HHMMs) improves scalability, but such structures are usually constructed manually using knowledge-engineering techniques. We introduce a new method for automatically constructing HHMMs from the output of a sequential data-mining algorithm, Episode Discovery, and apply it to automation problems in the intelligent environment domain. Repetitive behavioral actions in sensor-rich environments such as smart homes can be observed and categorized into periodic and frequent episodes by data-mining techniques based on the minimum description length principle. Building on this approach, we provide an architecture and a set of algorithms for a pervasive computing system, and we show that inhabitant interactions in home and workplace environments can be accurately automated through sensor observation and intelligent control. Our data-driven approach automatically generates hierarchical inhabitant interaction models in the form of hierarchical partially observable Markov decision processes (HPOMDPs); these models are then refined with temporal-difference reinforcement learning to continually adapt to changes in the inhabitant's patterns until a new model should be generated. We present our life-long learning system and apply this work in our MavPad and MavLab environments, where we have successfully automated up to 40% of the interactions of a real inhabitant and 76% of those of a virtual inhabitant, while dynamically adapting to concept changes over time. Findings from several case studies demonstrate the feasibility of this approach.
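The abstract mentions temporal-difference reinforcement learning as the mechanism for adapting the generated models between regenerations. As a hypothetical illustration only (not the thesis's actual algorithm, and using invented state names), a minimal tabular TD(0) value update has this form:

```python
def td_update(values, state, reward, next_state, alpha=0.1, gamma=0.9):
    """One TD(0) step: move the value estimate of `state` toward the
    bootstrapped target `reward + gamma * V(next_state)`.

    values: dict mapping state -> estimated value
    alpha:  learning rate; gamma: discount factor
    """
    target = reward + gamma * values[next_state]
    values[state] += alpha * (target - values[state])
    return values


# Example: two hypothetical inhabitant states, one observed transition.
V = {"lights_on": 0.0, "lights_off": 0.0}
td_update(V, "lights_on", reward=1.0, next_state="lights_off",
          alpha=0.5, gamma=0.9)
```

Repeated updates of this kind let a fixed model track gradual drift in an inhabitant's behavior; when the drift is too large for incremental correction, a new model is generated instead, as the abstract describes.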