Detection of composite events spanning multiple camera views with wireless embedded smart cameras

Youlu Wang, Senem Velipasalar, Mauricio Casares

Research output: Conference contribution

8 Scopus citations

Abstract

With the introduction of battery-powered and embedded smart cameras, it has become viable to install many spatially distributed cameras interconnected by wireless links. However, many problems need to be solved to build scalable, battery-powered wireless smart-camera networks (Wi-SCaNs), including limited processing power, memory, energy and bandwidth. Limited resources necessitate lightweight algorithms that can be implemented and run on the embedded cameras, as well as a careful choice of when and what data to transfer. We present a wireless embedded smart-camera system in which each camera platform consists of a camera board and a wireless mote, and cameras communicate in a peer-to-peer manner over wireless links. Lightweight background subtraction and tracking algorithms are implemented and run on the camera boards. Cameras exchange data to track objects consistently and to update the locations of lost objects. Since frequent transfer of large-sized data requires more power and incurs more communication delay, transferring all captured frames to a server should be avoided. Another challenge is the limited local memory for storage on camera motes. Thus, instead of transferring or saving every frame or every trajectory, there should be a mechanism to detect events of interest. In the presented system, events of interest can be defined beforehand, and simpler events can be combined in a sequence to define semantically higher-level, composite events. Moreover, event scenarios can span multiple camera views, which makes the definition of more complex events possible. Cameras communicate with each other about the portions of a scenario they have detected, so that an event spanning different camera views can be recognized. We present examples of label transfer for consistent tracking, and of updating the location of occluded or lost objects from other cameras by wirelessly exchanging small-sized packets. We also show examples of detecting different composite and spatio-temporal event scenarios spanning multiple camera views. All of the processing is performed on the camera boards.
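The abstract describes composite events as ordered sequences of simpler events, where each step may be observed by a different camera. One common way to realize this idea is a small finite-state machine that advances only when the expected simple event arrives from the expected camera view. The sketch below illustrates that pattern; the class and event names (`SimpleEvent`, `CompositeEvent`, `appears`, `crosses_line`) are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch: composite-event detection as a finite-state
# machine over simpler events, each tagged with the camera view that
# must observe it. Not the authors' implementation.
from dataclasses import dataclass


@dataclass(frozen=True)
class SimpleEvent:
    name: str       # e.g. "appears", "crosses_line"
    camera_id: int  # camera view where this step must occur


class CompositeEvent:
    """Matches an ordered sequence of simple events across camera views."""

    def __init__(self, name, steps):
        self.name = name
        self.steps = steps
        self.next_step = 0  # index of the step we are waiting for

    def observe(self, event_name, camera_id):
        """Feed one detected simple event; return True when the full
        scenario has been matched (and reset for reuse)."""
        expected = self.steps[self.next_step]
        if event_name == expected.name and camera_id == expected.camera_id:
            self.next_step += 1
            if self.next_step == len(self.steps):
                self.next_step = 0
                return True
        return False


# Example scenario spanning two views: an object first appears in
# camera 0, then crosses a line in camera 1.
scenario = CompositeEvent("handoff_crossing", [
    SimpleEvent("appears", 0),
    SimpleEvent("crosses_line", 1),
])
assert scenario.observe("appears", 0) is False       # partial match
assert scenario.observe("crosses_line", 1) is True   # scenario complete
```

In a deployment like the one described, each camera would run the detectors for its own steps and wirelessly forward only a small "step matched" packet to the peer responsible for the next step, which is consistent with the paper's emphasis on exchanging small-sized packets rather than frames.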

Original language: English (US)
Title of host publication: 2009 3rd ACM/IEEE International Conference on Distributed Smart Cameras, ICDSC 2009
DOIs
State: Published - 2009
Externally published: Yes
Event: 2009 3rd ACM/IEEE International Conference on Distributed Smart Cameras, ICDSC 2009 - Como, Italy
Duration: Aug 30 2009 – Sep 2 2009


ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Hardware and Architecture
  • Electrical and Electronic Engineering

