TY - CONF
T1 - Detection of composite events spanning multiple camera views with wireless embedded smart cameras
AU - Wang, Youlu
AU - Velipasalar, Senem
AU - Casares, Mauricio
PY - 2009
Y1 - 2009
N2 - With the introduction of battery-powered and embedded smart cameras, it has become viable to install many spatially distributed cameras interconnected by wireless links. However, many problems need to be solved to build scalable, battery-powered wireless smart-camera networks (Wi-SCaNs), including limited processing power, memory, energy and bandwidth. These limited resources necessitate light-weight algorithms that can be implemented and run on the embedded cameras, as well as careful choices of when and what data to transfer. We present a wireless embedded smart camera system wherein each camera platform consists of a camera board and a wireless mote, and cameras communicate in a peer-to-peer manner over wireless links. Light-weight background subtraction and tracking algorithms are implemented and run on the camera boards. Cameras exchange data to track objects consistently and to update the locations of lost objects. Since frequent transfer of large-sized data requires more power and incurs more communication delay, transferring all captured frames to a server should be avoided. Another challenge is the limited local memory available for storage on camera motes. Thus, instead of transferring or saving every frame or every trajectory, there should be a mechanism to detect events of interest. In the presented system, events of interest can be defined beforehand, and simpler events can be combined in sequence to define semantically higher-level, composite events. Moreover, event scenarios can span multiple camera views, which makes the definition of more complex events possible. Cameras communicate with each other about the portions of a scenario so as to detect an event that spans different camera views. We present examples of label transfer for consistent tracking, and of updating the locations of occluded or lost objects from other cameras by wirelessly exchanging small-sized packets. We also show examples of detecting different composite and spatio-temporal event scenarios spanning multiple camera views. All of the processing is performed on the camera boards.
AB - With the introduction of battery-powered and embedded smart cameras, it has become viable to install many spatially distributed cameras interconnected by wireless links. However, many problems need to be solved to build scalable, battery-powered wireless smart-camera networks (Wi-SCaNs), including limited processing power, memory, energy and bandwidth. These limited resources necessitate light-weight algorithms that can be implemented and run on the embedded cameras, as well as careful choices of when and what data to transfer. We present a wireless embedded smart camera system wherein each camera platform consists of a camera board and a wireless mote, and cameras communicate in a peer-to-peer manner over wireless links. Light-weight background subtraction and tracking algorithms are implemented and run on the camera boards. Cameras exchange data to track objects consistently and to update the locations of lost objects. Since frequent transfer of large-sized data requires more power and incurs more communication delay, transferring all captured frames to a server should be avoided. Another challenge is the limited local memory available for storage on camera motes. Thus, instead of transferring or saving every frame or every trajectory, there should be a mechanism to detect events of interest. In the presented system, events of interest can be defined beforehand, and simpler events can be combined in sequence to define semantically higher-level, composite events. Moreover, event scenarios can span multiple camera views, which makes the definition of more complex events possible. Cameras communicate with each other about the portions of a scenario so as to detect an event that spans different camera views. We present examples of label transfer for consistent tracking, and of updating the locations of occluded or lost objects from other cameras by wirelessly exchanging small-sized packets. We also show examples of detecting different composite and spatio-temporal event scenarios spanning multiple camera views. All of the processing is performed on the camera boards.
UR - http://www.scopus.com/inward/record.url?scp=72149122359&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=72149122359&partnerID=8YFLogxK
U2 - 10.1109/ICDSC.2009.5289355
DO - 10.1109/ICDSC.2009.5289355
M3 - Conference contribution
AN - SCOPUS:72149122359
SN - 9781424446209
T3 - 2009 3rd ACM/IEEE International Conference on Distributed Smart Cameras, ICDSC 2009
BT - 2009 3rd ACM/IEEE International Conference on Distributed Smart Cameras, ICDSC 2009
T2 - 2009 3rd ACM/IEEE International Conference on Distributed Smart Cameras, ICDSC 2009
Y2 - 30 August 2009 through 2 September 2009
ER -