Optimal Auction Design with Quantized Bids for Target Tracking via Crowdsensing

Nianxia Cao, Swastik Brahma, Baocheng Geng, Pramod K. Varshney

Research output: Contribution to journal › Article › peer-review

14 Scopus citations

Abstract

This paper considers the design of an auction mechanism for target tracking via crowdsensing. The crowdsensing framework consists of a set of sensors, embedded in devices belonging to crowd participants, and a fusion center (FC) that uses the sensors' quantized measurements to track a target. The auction mechanism we develop addresses the sensors' participatory concerns, which arise from the energy cost of participation, while maximizing the utility of the FC toward achieving the desired sensing objectives and preventing market manipulation. Moreover, since a crowdsensing environment is typically resource constrained, our auction model assumes that the sensors quantize their private value estimates of their energy costs before communicating them to the FC. Furthermore, the paper proposes selecting, from the set of available sensors, a subset of sensors (bidders) to bid, so as to satisfy resource constraints during the bidding process. Extensive numerical results are provided to give insight into the proposed mechanism.
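To illustrate the flavor of the setting, the following is a minimal sketch, not the paper's actual mechanism: sensors quantize their private energy costs onto a small grid before bidding, and the FC runs a reverse auction that selects the k lowest quantized bids and pays winners the next-lowest bid level (a second-price-style rule). The grid, costs, and payment rule here are illustrative assumptions.

```python
def quantize(cost, levels):
    """Map a private cost to the nearest level of a small quantization grid."""
    return min(levels, key=lambda q: abs(q - cost))

def reverse_auction(costs, levels, k):
    """Select the k sensors with the lowest quantized bids; pay each winner
    the (k+1)-th lowest quantized bid (second-price-style payment)."""
    bids = sorted((quantize(c, levels), i) for i, c in enumerate(costs))
    winners = [i for _, i in bids[:k]]
    payment = bids[k][0] if len(bids) > k else bids[-1][0]
    return winners, payment

# Illustrative run: 5 sensors, a 4-level bid grid, 2 winners selected.
levels = [0.25, 0.5, 0.75, 1.0]
costs = [0.31, 0.62, 0.18, 0.9, 0.44]   # private energy-cost estimates
winners, payment = reverse_auction(costs, levels, 2)
```

Here sensors 0 and 2 (quantized bids 0.25) win and are each paid 0.5, the third-lowest quantized bid; quantization trades bid precision for reduced communication cost, which is the resource-constraint motivation discussed in the abstract.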

Original language: English (US)
Article number: 8795570
Pages (from-to): 847-857
Number of pages: 11
Journal: IEEE Transactions on Computational Social Systems
Volume: 6
Issue number: 5
State: Published - Oct 2019
Externally published: Yes

Keywords

  • Crowdsensing
  • mechanism design
  • quantization
  • reverse auctions
  • sensor selection
  • target tracking
  • wireless sensor networks (WSNs)

ASJC Scopus subject areas

  • Modeling and Simulation
  • Social Sciences (miscellaneous)
  • Human-Computer Interaction

