Abstract
Mobile crowdsourcing (MCS) has become an important source of information for smart cities, especially with the help of unmanned aerial vehicles (UAVs) and driverless cars. These vehicles are equipped with various high-precision sensors and can be fully scheduled and controlled during data collection, which makes the MCS system more robust. However, they are limited by energy constraints, especially for long-term, long-distance sensing tasks, and cities are often too crowded to deploy stationary charging stations. To this end, in this paper we propose to leverage emerging deep reinforcement learning (DRL) techniques to enable model-free unmanned vehicle control, and we present a novel and highly effective control framework called 'DRL-RVC.' It utilizes a convolutional neural network to extract features from the necessary information (including sample distribution, traffic flow, etc.) and then makes decisions under the guidance of a deep Q network. That is, UAVs cruise the city autonomously and collect most of the required data in the sensing region, while a mobile unmanned charging station reaches the charging point in the shortest possible time. Finally, we validate and evaluate the proposed framework via extensive simulations based on a real dataset collected in Rome. The simulation results justify the effectiveness and robustness of our approach.
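To make the Q-network-guided decision loop concrete, the following is a minimal illustrative sketch of the underlying idea: an agent (standing in for a UAV) learns, via Q-learning with epsilon-greedy exploration, to cruise a sensing region toward a target cell. This is not the paper's DRL-RVC implementation — the paper uses a CNN feature extractor feeding a deep Q network over real traffic and sample-distribution inputs; here a tabular Q function on a hypothetical toy grid (the `GRID`, `ACTIONS`, and reward shaping are all assumptions for illustration) stands in for the learned value estimates.

```python
import random

GRID = 5                                       # hypothetical 5x5 sensing region
ACTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0)]   # move up / down / right / left

def step(state, action):
    """Move the agent one cell, clipped to the grid; reward 1 at the goal corner."""
    x, y = state
    dx, dy = action
    nx = min(max(x + dx, 0), GRID - 1)
    ny = min(max(y + dy, 0), GRID - 1)
    # Small step cost encourages short routes; goal pays a positive reward.
    reward = 1.0 if (nx, ny) == (GRID - 1, GRID - 1) else -0.01
    return (nx, ny), reward

def train(episodes=500, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning with epsilon-greedy exploration."""
    rng = random.Random(seed)
    Q = {}  # Q[(state, action)] -> estimated return
    for _ in range(episodes):
        s = (0, 0)
        for _ in range(50):  # cap episode length
            if rng.random() < eps:
                a = rng.choice(ACTIONS)  # explore
            else:                        # exploit: pick the highest-value action
                a = max(ACTIONS, key=lambda act: Q.get((s, act), 0.0))
            s2, r = step(s, a)
            best_next = max(Q.get((s2, b), 0.0) for b in ACTIONS)
            # One-step Q-learning update (Bellman backup)
            q = Q.get((s, a), 0.0)
            Q[(s, a)] = q + alpha * (r + gamma * best_next - q)
            s = s2
            if s == (GRID - 1, GRID - 1):
                break
    return Q
```

After training, following the greedy action `max(ACTIONS, key=...)` from the start cell traces a short route to the goal; in the paper's setting, the deep Q network plays the role of this lookup table, generalizing value estimates across the much larger state space of a real city.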
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1666-1676 |
| Number of pages | 11 |
| Journal | IEEE Transactions on Industrial Informatics |
| Volume | 14 |
| Issue number | 4 |
| DOIs | |
| State | Published - Apr 2018 |
Keywords
- Data crowdsourcing
- energy-efficiency
- smart city
ASJC Scopus subject areas
- Control and Systems Engineering
- Information Systems
- Computer Science Applications
- Electrical and Electronic Engineering