Fast Adaptation of Activity Sensing Policies in Mobile Devices

Mohammad Abu Alsheikh, Dusit Niyato, Shaowei Lin, Hwee Pink Tan, Dong In Kim

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

With the proliferation of sensors, such as accelerometers, in mobile devices, activity and motion tracking has become a viable technology to understand and create an engaging user experience. This paper proposes a fast adaptation and learning scheme of activity tracking policies when user statistics are unknown a priori, varying with time, and inconsistent for different users. In our stochastic optimization, user activities are required to be synchronized with a backend under a cellular data limit to avoid overcharges from cellular operators. The mobile device is charged intermittently using wireless or wired charging for receiving the required energy for transmission and sensing operations. First, we propose an activity tracking policy by formulating a stochastic optimization as a constrained Markov decision process (CMDP). Second, we prove that the optimal policy of the CMDP has a threshold structure using a Lagrangian relaxation approach and the submodularity concept. We accordingly present a fast Q-learning algorithm by considering the policy structure to improve the convergence speed over that of conventional Q-learning. Finally, simulation examples are presented to support the theoretical findings of this paper.
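To make the abstract's central idea concrete, the sketch below shows how a proven threshold structure can be exploited inside tabular Q-learning: the greedy policy is projected onto the set of threshold policies at every step, shrinking the policy search space. Everything here is a hypothetical illustration under assumed toy dynamics (the `step` environment, the resource-level state, the `project_to_threshold` helper, and all hyperparameters are placeholders), not the paper's actual CMDP model or algorithm.

```python
import numpy as np

# Hypothetical toy setup: states index a scalar resource level (e.g., energy
# units), actions are {0: idle, 1: sense-and-transmit}. These dynamics are
# illustrative placeholders, not the paper's model.
N_STATES, N_ACTIONS = 10, 2
rng = np.random.default_rng(0)

def step(state, action):
    """Toy transition: sensing spends one resource unit and earns a reward
    that grows with the current level; idling recharges with probability 0.5."""
    if action == 1 and state > 0:
        return state - 1, 1.0 + 0.1 * state
    next_state = min(N_STATES - 1, state + int(rng.random() < 0.5))
    return next_state, 0.0

def project_to_threshold(q):
    """Structure-aware step: force the greedy policy to be a threshold policy,
    i.e. once 'sense' is preferred at some state, it stays preferred for all
    higher states. This mimics exploiting the proven threshold structure to
    speed up convergence (a sketch, not the authors' exact algorithm)."""
    greedy = q.argmax(axis=1)
    threshold = next((s for s, a in enumerate(greedy) if a == 1), N_STATES)
    return np.array([0] * threshold + [1] * (N_STATES - threshold))

# Standard tabular Q-learning loop; only action selection uses the projected
# threshold policy, so exploration respects the known structure.
q = np.zeros((N_STATES, N_ACTIONS))
alpha, gamma, epsilon = 0.1, 0.95, 0.1
state = N_STATES - 1
for t in range(20000):
    policy = project_to_threshold(q)
    action = rng.integers(N_ACTIONS) if rng.random() < epsilon else policy[state]
    next_state, reward = step(state, action)
    q[state, action] += alpha * (reward + gamma * q[next_state].max() - q[state, action])
    state = next_state

print("learned threshold policy:", project_to_threshold(q))
```

Because the projection restricts candidate policies to a one-dimensional family of thresholds, far fewer state-action pairs must be distinguished before the greedy policy stabilizes, which is the intuition behind the convergence-speed gain the abstract claims over conventional Q-learning.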

Original language: English
Article number: 7744681
Pages (from-to): 5995-6008
Number of pages: 14
Journal: IEEE Transactions on Vehicular Technology
Volume: 66
Issue number: 7
DOIs: 10.1109/TVT.2016.2628966
Publication status: Published - 15 Nov 2017
Externally published: Yes

Cite this

Abu Alsheikh, Mohammad; Niyato, Dusit; Lin, Shaowei; Tan, Hwee Pink; Kim, Dong In. / Fast Adaptation of Activity Sensing Policies in Mobile Devices. In: IEEE Transactions on Vehicular Technology. 2017; Vol. 66, No. 7. pp. 5995-6008.
@article{2ad348d734a147bfa6398dc74d7ace72,
title = "Fast Adaptation of Activity Sensing Policies in Mobile Devices",
abstract = "With the proliferation of sensors, such as accelerometers, in mobile devices, activity and motion tracking has become a viable technology to understand and create an engaging user experience. This paper proposes a fast adaptation and learning scheme of activity tracking policies when user statistics are unknown a priori, varying with time, and inconsistent for different users. In our stochastic optimization, user activities are required to be synchronized with a backend under a cellular data limit to avoid overcharges from cellular operators. The mobile device is charged intermittently using wireless or wired charging for receiving the required energy for transmission and sensing operations. First, we propose an activity tracking policy by formulating a stochastic optimization as a constrained Markov decision process (CMDP). Second, we prove that the optimal policy of the CMDP has a threshold structure using a Lagrangian relaxation approach and the submodularity concept. We accordingly present a fast Q-learning algorithm by considering the policy structure to improve the convergence speed over that of conventional Q-learning. Finally, simulation examples are presented to support the theoretical findings of this paper.",
keywords = "Activity tracking, fast adaptation, Internet of things, Markov decision processes, wireless charging",
author = "{Abu Alsheikh}, Mohammad and Dusit Niyato and Shaowei Lin and Tan, {Hwee Pink} and Kim, {Dong In}",
year = "2017",
month = "11",
day = "15",
doi = "10.1109/TVT.2016.2628966",
language = "English",
volume = "66",
pages = "5995--6008",
journal = "IEEE Transactions on Vehicular Communications",
issn = "0018-9545",
publisher = "IEEE, Institute of Electrical and Electronics Engineers",
number = "7",

}

Fast Adaptation of Activity Sensing Policies in Mobile Devices. / Abu Alsheikh, Mohammad; Niyato, Dusit; Lin, Shaowei; Tan, Hwee Pink; Kim, Dong In.

In: IEEE Transactions on Vehicular Technology, Vol. 66, No. 7, 7744681, 15.11.2017, p. 5995-6008.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Fast Adaptation of Activity Sensing Policies in Mobile Devices

AU - Abu Alsheikh, Mohammad

AU - Niyato, Dusit

AU - Lin, Shaowei

AU - Tan, Hwee Pink

AU - Kim, Dong In

PY - 2017/11/15

Y1 - 2017/11/15

N2 - With the proliferation of sensors, such as accelerometers, in mobile devices, activity and motion tracking has become a viable technology to understand and create an engaging user experience. This paper proposes a fast adaptation and learning scheme of activity tracking policies when user statistics are unknown a priori, varying with time, and inconsistent for different users. In our stochastic optimization, user activities are required to be synchronized with a backend under a cellular data limit to avoid overcharges from cellular operators. The mobile device is charged intermittently using wireless or wired charging for receiving the required energy for transmission and sensing operations. First, we propose an activity tracking policy by formulating a stochastic optimization as a constrained Markov decision process (CMDP). Second, we prove that the optimal policy of the CMDP has a threshold structure using a Lagrangian relaxation approach and the submodularity concept. We accordingly present a fast Q-learning algorithm by considering the policy structure to improve the convergence speed over that of conventional Q-learning. Finally, simulation examples are presented to support the theoretical findings of this paper.

AB - With the proliferation of sensors, such as accelerometers, in mobile devices, activity and motion tracking has become a viable technology to understand and create an engaging user experience. This paper proposes a fast adaptation and learning scheme of activity tracking policies when user statistics are unknown a priori, varying with time, and inconsistent for different users. In our stochastic optimization, user activities are required to be synchronized with a backend under a cellular data limit to avoid overcharges from cellular operators. The mobile device is charged intermittently using wireless or wired charging for receiving the required energy for transmission and sensing operations. First, we propose an activity tracking policy by formulating a stochastic optimization as a constrained Markov decision process (CMDP). Second, we prove that the optimal policy of the CMDP has a threshold structure using a Lagrangian relaxation approach and the submodularity concept. We accordingly present a fast Q-learning algorithm by considering the policy structure to improve the convergence speed over that of conventional Q-learning. Finally, simulation examples are presented to support the theoretical findings of this paper.

KW - Activity tracking

KW - fast adaptation

KW - Internet of things

KW - Markov decision processes

KW - wireless charging

UR - http://www.scopus.com/inward/record.url?scp=85029611760&partnerID=8YFLogxK

UR - http://www.mendeley.com/research/fast-adaptation-activity-sensing-policies-mobile-devices

U2 - 10.1109/TVT.2016.2628966

DO - 10.1109/TVT.2016.2628966

M3 - Article

VL - 66

SP - 5995

EP - 6008

JO - IEEE Transactions on Vehicular Technology

JF - IEEE Transactions on Vehicular Technology

SN - 0018-9545

IS - 7

M1 - 7744681

ER -