Proceedings of the IEEE

Issue 10 • Date Oct. 2008

22 items
  • [Front cover]

    Publication Year: 2008 , Page(s): C1
    PDF (430 KB) | Freely Available from IEEE
  • Proceedings of the IEEE publication information

    Publication Year: 2008 , Page(s): C2
    PDF (62 KB) | Freely Available from IEEE
  • Special issue on Distributed Smart Cameras - Table of contents

    Publication Year: 2008 , Page(s): 1557 - 1558
    Cited by:  Papers (1)
    PDF (281 KB) | Freely Available from IEEE
  • A Method to Avoid Dangers Caused by Fossil Fuels

    Publication Year: 2008 , Page(s): 1559 - 1561
    Cited by:  Papers (3)
    PDF (140 KB) | Freely Available from IEEE
  • A Bright Future for Distributed Smart Cameras

    Publication Year: 2008 , Page(s): 1562 - 1564
    Cited by:  Papers (3)
    PDF (128 KB) | Freely Available from IEEE
  • An Introduction to Distributed Smart Cameras

    Publication Year: 2008 , Page(s): 1565 - 1575
    Cited by:  Papers (48)
    PDF (563 KB)

    Distributed smart cameras (DSCs) are real-time distributed embedded systems that perform computer vision using multiple cameras. This new approach has emerged thanks to a confluence of simultaneous advances in four key disciplines: computer vision, image sensors, embedded computing, and sensor networks. Processing images in a network of distributed smart cameras introduces several complications. However, we believe that the problems DSCs solve are much more important than the challenges of designing and building a distributed video system. We argue that distributed smart cameras represent key components for future embedded computer vision systems and that smart cameras will become an enabling technology for many new applications. We summarize smart camera technology and applications, discuss current trends, and identify important research challenges.

  • The Signal Passing Interface and Its Application to Embedded Implementation of Smart Camera Applications

    Publication Year: 2008 , Page(s): 1576 - 1587
    PDF (905 KB)

    Embedded smart camera systems comprise computation- and resource-hungry applications implemented on small, complex, resource-limited platforms. Efficient implementation of such applications can benefit significantly from parallelization; however, communication between different processing units is a nontrivial task. In addition, new and emerging distributed smart cameras require efficient methods of communication for optimized distributed implementations. In this paper, a novel communication interface, called the signal passing interface (SPI), is presented that attempts to overcome this challenge by integrating relevant properties of two different, yet important, paradigms in this context: dataflow and the message passing interface (MPI). Dataflow is a widely used modeling paradigm for signal processing applications, while MPI is an established communication interface in the general-purpose processor community. SPI is targeted toward computation-intensive signal processing applications and, due to its careful specialization, is more performance-efficient for embedded implementation in this domain; it is also easier and more intuitive to use. In this paper, the successful application of this communication interface to two smart camera applications is presented in detail to validate a new methodology for efficient distributed implementation in this domain.

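The SPI abstract above layers dataflow semantics (an actor "fires" when tokens are available on its input edges) over message passing between processing units. The paper does not spell out SPI's API, so the sketch below is only an illustration of that underlying idea, with invented actor names (`detector_actor`, `tracker_actor`) exchanging tokens over a bounded channel:

```python
# Toy sketch (not the actual SPI API): two dataflow "actors" on separate
# threads exchange tokens over a bounded queue, illustrating how dataflow
# firing rules can be layered on a message-passing channel between
# processing units of a smart camera pipeline.
import queue
import threading

channel = queue.Queue(maxsize=4)  # bounded edge between the two actors

def detector_actor(frames, out):
    # Fires once per input frame; emits a token describing detections.
    for f in frames:
        out.put({"frame": f, "objects": f % 3})  # placeholder detection count
    out.put(None)  # end-of-stream token

def tracker_actor(inp, results):
    # Fires whenever a token is available on its input edge.
    while (token := inp.get()) is not None:
        results.append((token["frame"], token["objects"]))

results = []
t1 = threading.Thread(target=detector_actor, args=(range(5), channel))
t2 = threading.Thread(target=tracker_actor, args=(channel, results))
t1.start(); t2.start(); t1.join(); t2.join()
print(results)  # -> [(0, 0), (1, 1), (2, 2), (3, 0), (4, 1)]
```

The bounded queue stands in for SPI's fixed-capacity edges: a fast producer blocks rather than exhausting memory on a resource-limited node.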
  • Wireless Multimedia Sensor Networks: Applications and Testbeds

    Publication Year: 2008 , Page(s): 1588 - 1605
    Cited by:  Papers (80)  |  Patents (1)
    PDF (2094 KB)

    The availability of low-cost hardware is enabling the development of wireless multimedia sensor networks (WMSNs), i.e., networks of resource-constrained wireless devices that can retrieve multimedia content such as video and audio streams, still images, and scalar sensor data from the environment. In this paper, ongoing research on prototypes of multimedia sensors and their integration into testbeds for experimental evaluation of algorithms and protocols for WMSNs is described. Furthermore, open research issues and future research directions, both at the device level and at the testbed level, are discussed. This paper is intended to be a resource for researchers interested in advancing the state of the art in experimental research on wireless multimedia sensor networks.

  • Object Detection, Tracking and Recognition for Multiple Smart Cameras

    Publication Year: 2008 , Page(s): 1606 - 1624
    Cited by:  Papers (27)  |  Patents (1)
    PDF (1574 KB)

    Video cameras are among the most commonly used sensors in a large number of applications, ranging from surveillance to smart rooms for videoconferencing. There is a need to develop algorithms for tasks such as detection, tracking, and recognition of objects, specifically using distributed networks of cameras. The projective nature of imaging sensors provides ample challenges for data association across cameras. We first discuss the nature of these challenges in the context of visual sensor networks. Then, we show how real-world constraints can be favorably exploited in order to tackle these challenges. Examples of real-world constraints are (a) the presence of a world plane, (b) the presence of a three-dimensional scene model, (c) consistency of motion across cameras, and (d) color and texture properties. In this regard, the main focus of this paper is on highlighting the efficient use of the geometric constraints induced by the imaging devices to derive distributed algorithms for target detection, tracking, and recognition. Our discussions are supported by several examples drawn from real applications. Lastly, we also describe several potential research problems that remain to be addressed.

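Constraint (a) above, a common world plane, is the classic route to data association across cameras: pixel coordinates of a ground-plane point in one view map to another view through a 3x3 homography. The matrix and detections below are invented for illustration; the paper's own methods are richer.

```python
# Minimal sketch of world-plane data association: a detection in camera 1
# is mapped through a ground-plane homography H and matched to the nearest
# detection in camera 2. H here is an illustrative stand-in.
def apply_homography(H, x, y):
    """Map pixel (x, y) through H in homogeneous coordinates."""
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return (u / w, v / w)

H = [[2.0, 0.0, 10.0],   # toy homography: scale by 2, shift by (10, -5)
     [0.0, 2.0, -5.0],
     [0.0, 0.0, 1.0]]

def associate(det_cam1, dets_cam2, H, tol=3.0):
    """Match a camera-1 detection to the camera-2 detection nearest its
    homography prediction, within tol pixels; None if nothing is close."""
    px, py = apply_homography(H, *det_cam1)
    best = min(dets_cam2, key=lambda d: (d[0] - px) ** 2 + (d[1] - py) ** 2)
    if (best[0] - px) ** 2 + (best[1] - py) ** 2 <= tol ** 2:
        return best
    return None

print(associate((100, 50), [(210.0, 95.0), (40.0, 40.0)], H))  # -> (210.0, 95.0)
```

In practice H is estimated from point correspondences rather than known in advance, and the gating threshold reflects detector noise.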
  • Calibrating Distributed Camera Networks

    Publication Year: 2008 , Page(s): 1625 - 1639
    Cited by:  Papers (20)
    PDF (1285 KB)

    Recent developments in wireless sensor networks have made feasible distributed camera networks, in which cameras and processing nodes may be spread over a wide geographical area, with no centralized processor and limited ability to communicate a large amount of information over long distances. This paper overviews distributed algorithms for the calibration of such camera networks, that is, the automatic estimation of each camera's position, orientation, and focal length. In particular, we discuss a decentralized method for obtaining the vision graph for a distributed camera network, in which each edge of the graph represents two cameras that image a sufficiently large part of the same environment. We next describe a distributed algorithm in which each camera performs a local, robust nonlinear optimization over the camera parameters and scene points of its vision graph neighbors in order to obtain an initial calibration estimate. We then show how a distributed inference algorithm based on belief propagation can refine the initial estimate to be both accurate and globally consistent.

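The vision graph described above connects two cameras when they image enough of the same environment. A minimal sketch of that idea, with shared feature IDs standing in for matched image features (the IDs and threshold are invented, not from the paper):

```python
# Sketch of vision-graph construction: add an edge between two cameras
# when they observe at least min_shared common scene features.
from itertools import combinations

def vision_graph(observations, min_shared=3):
    """observations: dict mapping camera name -> set of observed feature IDs.
    Returns edges (cam_a, cam_b) sharing at least min_shared features."""
    edges = []
    for a, b in combinations(sorted(observations), 2):
        if len(observations[a] & observations[b]) >= min_shared:
            edges.append((a, b))
    return edges

obs = {
    "cam0": {1, 2, 3, 4, 5},
    "cam1": {3, 4, 5, 6},   # overlaps cam0 on {3, 4, 5}
    "cam2": {7, 8, 9},      # no sufficient overlap with anyone
}
print(vision_graph(obs))  # -> [('cam0', 'cam1')]
```

In the decentralized setting the paper targets, each camera would exchange compressed feature digests with its neighbors instead of a central node computing all pairwise intersections.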
  • Smart Camera Networks in Virtual Reality

    Publication Year: 2008 , Page(s): 1640 - 1656
    Cited by:  Papers (8)
    PDF (2508 KB)

    This paper presents our research towards smart camera networks capable of carrying out advanced surveillance tasks with little or no human supervision. A unique centerpiece of our work is the combination of computer graphics, artificial life, and computer vision simulation technologies to develop such networks and experiment with them. Specifically, we demonstrate a smart camera network comprising static and active simulated video surveillance cameras that provides extensive coverage of a large virtual public space, a train station populated by autonomously self-animating virtual pedestrians. The realistically simulated network of smart cameras performs persistent visual surveillance of individual pedestrians with minimal intervention. Our innovative camera control strategy naturally addresses camera aggregation and handoff, is robust against camera and communication failures, and requires no camera calibration, detailed world model, or central controller.

  • Macroscopic Human Behavior Interpretation Using Distributed Imager and Other Sensors

    Publication Year: 2008 , Page(s): 1657 - 1677
    Cited by:  Papers (7)  |  Patents (1)
    PDF (1979 KB)

    This paper presents BScope, a new system for interpreting human activity patterns using a sensor network. BScope provides a runtime, user-programmable framework that processes streams of timestamped sensor data along with prior context information to infer activities and generate appropriate notifications. The users of the system are able to describe human activities with high-level scripts that are directly mapped to hierarchical probabilistic grammars used to parse low-level sensor measurements into high-level distinguishable activities. Our approach is presented (though not limited to this setting) in the context of an assisted living application in which a small, privacy-preserving camera sensor network of five nodes is used to monitor activity in the entire house over a period of 25 days. Privacy is preserved by the fact that camera sensors only provide discrete high-level features, such as motion information in the form of image locations, and not actual images. In this deployment, our primary sensing modality is a distributed array of image sensors with wide-angle lenses that observe people's locations in the house during the course of the day. We demonstrate that our system can successfully generate summaries of everyday activities and trigger notifications at runtime by using more than 1.3 million location measurements acquired through our real home deployment.

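The core idea above is that user scripts compile to hierarchical probabilistic grammars whose production rules parse low-level sensor events into activities. The toy below illustrates only that bottom-up parsing idea; the rule names, probabilities, and greedy matching strategy are invented, and BScope's actual grammars are far richer:

```python
# Toy illustration of grammar-based activity inference: timestamped
# location events are matched bottom-up against production rules, each
# carrying a probability, to yield high-level activity labels.
RULES = {
    # activity: (required event sequence, rule probability) -- invented
    "cooking":  (("kitchen", "kitchen", "kitchen"), 0.9),
    "sleeping": (("bedroom", "bedroom"), 0.8),
}

def parse(events):
    """Greedily match rule right-hand sides against the event stream."""
    activities, i = [], 0
    while i < len(events):
        for name, (rhs, p) in RULES.items():
            if tuple(events[i:i + len(rhs)]) == rhs:
                activities.append((name, p))
                i += len(rhs)
                break
        else:
            i += 1  # unexplained event: skip it
    return activities

stream = ["hall", "kitchen", "kitchen", "kitchen", "bedroom", "bedroom"]
print(parse(stream))  # -> [('cooking', 0.9), ('sleeping', 0.8)]
```

A real probabilistic parser would score competing derivations and keep the most likely one rather than committing greedily to the first match.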
  • Security and Privacy in Distributed Smart Cameras

    Publication Year: 2008 , Page(s): 1678 - 1687
    Cited by:  Papers (8)
    PDF (296 KB)

    Distributed smart camera systems are becoming increasingly important in a wide range of applications. As they are often deployed in public space and/or our personal environment, they increasingly access and manipulate sensitive or private information. Their architectures need to address security and privacy issues appropriately, considering them from the inception of the overall system structure. In this paper, we present security and privacy issues of distributed smart camera systems. We describe security requirements, possible attacks, and common risks, analyzing issues at the node and at the network level and presenting available solutions. Although the security issues of distributed smart cameras are analogous to those of networked embedded systems and sensor networks, emphasis is given to the special requirements of smart camera networks, including privacy and continuous real-time operation.

  • Smart Cameras and the Right to Privacy

    Publication Year: 2008 , Page(s): 1688 - 1697
    Cited by:  Papers (3)
    PDF (427 KB)

    This essay provides a matrix for use by researchers and system designers as a heuristic device to assess the likely legality of the deployment of a surveillance camera system. After presenting the matrix, the essay considers examples in which smart camera technology might enhance the venues for deployment of surveillance cameras. Lastly, it speculates about legal risks that may confront smart camera technology as it becomes more sophisticated.

  • Smart Camera Based Monitoring System and Its Application to Assisted Living

    Publication Year: 2008 , Page(s): 1698 - 1714
    Cited by:  Papers (23)
    PDF (2410 KB)

    Western societies are aging rapidly. Automated 24/7 surveillance that ensures the safety of the elderly while respecting their privacy is becoming a major challenge. At the same time, this is representative of novel and emerging video surveillance applications discovered lately, beyond the classic surveillance applications protecting airports, government buildings, and industrial plants. Three problems of current surveillance systems are identified, and a distributed, automated, smart-camera-based approach is proposed that addresses them. The proposed system's goal is to analyze the real world and reflect all relevant (and only relevant) information live in an integrated virtual counterpart for visualization. It covers georeferenced person tracking and activity recognition (falling person detection). A prototype system installed in a home for assisted living has been running 24/7 for several months now and shows quite promising performance.

  • Real-Time Human Pose Estimation: A Case Study in Algorithm Design for Smart Camera Networks

    Publication Year: 2008 , Page(s): 1715 - 1732
    Cited by:  Papers (4)
    PDF (2710 KB)

    Monitoring human activities finds novel applications in smart environment settings. Examples include immersive multimedia and virtual reality, smart buildings and occupancy-based services, assisted living and patient monitoring, and interactive classrooms and teleconferencing. A network of cameras can enable detection and interpretation of human events by utilizing multiple views and collaborative processing. Distributed processing of acquired videos at the source camera facilitates operation of scalable vision networks by avoiding transfer of raw images. This allows for efficient collaboration between the cameras under communication and latency constraints, and is further motivated by the aim of preserving the privacy of the network users (no image transfer out of the camera) while offering services in applications such as assisted living or virtual placement. In this paper, collaborative processing and data fusion techniques in a multicamera setting are examined in the context of human pose estimation. Multiple mechanisms for information fusion across space (multiple views), time, and different feature levels are introduced to meet system constraints and are described through examples.

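One of the simplest fusion mechanisms in the family the abstract describes is combining per-camera estimates of the same quantity by inverse-variance weighting, so that more confident views dominate. The sketch below illustrates that standard technique with invented numbers; it is not the paper's specific method:

```python
# Minimal sketch of multi-view fusion: combine per-camera estimates of
# the same quantity (e.g., a joint's x-coordinate) by inverse-variance
# weighting. Lower-variance (more confident) views get more weight.
def fuse_estimates(estimates):
    """estimates: list of (value, variance) pairs from different cameras.
    Returns the inverse-variance weighted mean and the fused variance."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    mean = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return mean, 1.0 / total

# Three cameras estimate an elbow's x-coordinate with differing confidence.
fused, var = fuse_estimates([(10.0, 1.0), (12.0, 4.0), (11.0, 2.0)])
print(round(fused, 3), round(var, 3))  # -> 10.571 0.571
```

Note that the fused variance is smaller than any single camera's, which is exactly why collaboration across views pays off despite the communication cost.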
  • Electrical Engineering Hall of Fame: John B. Whitehead

    Publication Year: 2008 , Page(s): 1733 - 1735
    PDF (435 KB) | Freely Available from IEEE
  • Future Special Issues/Special Sections of the Proceedings

    Publication Year: 2008 , Page(s): 1736 - 1737
    PDF (118 KB) | Freely Available from IEEE
  • Put your technology leadership in writing

    Publication Year: 2008 , Page(s): 1738
    PDF (329 KB) | Freely Available from IEEE
  • IEEE copyright form

    Publication Year: 2008 , Page(s): 1739 - 1740
    PDF (1065 KB) | Freely Available from IEEE
  • IEEE Potentials is looking for article submissions

    Publication Year: 2008 , Page(s): C3
    PDF (211 KB) | Freely Available from IEEE
  • [Back cover]

    Publication Year: 2008 , Page(s): C4
    PDF (394 KB) | Freely Available from IEEE

Aims & Scope

The most highly cited general-interest journal in electrical engineering and computer science, the Proceedings is the best way to stay informed on an exemplary range of topics.


Meet Our Editors

Editor-in-Chief
H. Joel Trussell
North Carolina State University