University of Maryland Libraries | Digital Repository at the University of Maryland (DRUM)

    Sensor, Motion and Temporal Planning

    View/Open: umi-umd-3542.pdf (6.270 MB)
    No. of downloads: 1183

    Date: 2006-05-31
    Author: Lim, Ser-Nam
    Advisor: Davis, Larry S
    Abstract
    In this dissertation, we describe planning strategies that enhance the accuracy with which visual surveillance can be conducted and that expand the capabilities of visual surveillance systems. Several classes of planning strategies are considered: sensor planning, motion planning, and temporal planning. Sensor planning is the study of controlling cameras to optimize information gathering for vision algorithms. The study of camera control spans camera placement strategies, active camera control (specifically, of Pan-Tilt-Zoom or PTZ cameras), and, in some cases, camera selection from a collection of static cameras.

    Camera placement strategies have previously been employed to enhance vision tasks such as 3D reconstruction, area coverage in surveillance, and occlusion and visibility analysis. We introduce a two-camera placement strategy that is used by a background subtraction algorithm, allowing it to achieve video-rate performance and invariance to several illumination artifacts, such as lighting changes and shadows. While camera placement strategies can improve the performance of vision algorithms significantly, their utility is limited in situations where it is more cost-effective to use an existing camera network instead. In these situations, we can employ camera selection strategies that choose, from the camera network, the cameras that yield the best performance on a given surveillance task. We illustrate this with an algorithm that detects and tracks people under severe occlusion by selecting the best stereo pairs for counting people in a scene.

    The study of sensor planning is also closely related to motion and temporal planning, which involve predicting the trajectories of objects into the future from previously observed tracks and are very useful for modeling interactions between moving objects in the scene. We use such predictions in an active camera system that reasons about periods of occlusion, allowing it to select cameras and PTZ settings that, with high probability, can be used to capture unobstructed video segments. Finally, we introduce a left-package system. This system first detects an abandoned package in the scene and then searches back in time to determine the time window in which the package was first left. Images or video segments collected during that window can then be retrieved to identify the person who left the package. We present the left-package detection sub-system and show that it can detect abandoned packages even under severe occlusion, without any hard thresholding steps.
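
    The abstract describes predicting object trajectories from previously observed tracks and reasoning about periods of occlusion so that cameras and PTZ settings can be selected to capture unobstructed video. The Python sketch below illustrates one minimal form such reasoning could take, assuming a constant-velocity motion model on the ground plane and a purely angular occlusion test from each candidate camera position; the function names, the min_angle threshold, and the 2D camera representation are illustrative assumptions, not the method developed in the dissertation.

    # Hypothetical sketch, not the dissertation's algorithm: constant-velocity
    # track extrapolation plus an angular occlusion test per candidate camera.
    import numpy as np

    def predict_positions(track, horizon):
        """Extrapolate a track of ground-plane positions (shape (N, 2), one row
        per frame) over `horizon` future frames with a constant-velocity model."""
        velocity = np.diff(track, axis=0).mean(axis=0)   # average displacement per frame
        steps = np.arange(1, horizon + 1)[:, None]       # 1, 2, ..., horizon
        return track[-1] + steps * velocity              # (horizon, 2) predicted positions

    def occluded_frames(cam, track_a, track_b, horizon, min_angle=0.05):
        """Return future frame indices at which objects a and b are predicted to lie
        within `min_angle` radians of each other as seen from camera position `cam`,
        i.e. frames where one object is likely to occlude the other."""
        pa = predict_positions(track_a, horizon) - cam
        pb = predict_positions(track_b, horizon) - cam
        cos = (pa * pb).sum(axis=1) / (np.linalg.norm(pa, axis=1) * np.linalg.norm(pb, axis=1))
        angles = np.arccos(np.clip(cos, -1.0, 1.0))
        return np.flatnonzero(angles < min_angle)

    def least_occluded_camera(cameras, track_a, track_b, horizon=30):
        """Pick the candidate camera position with the fewest predicted occluded frames."""
        return min(cameras,
                   key=lambda c: occluded_frames(np.asarray(c, float), track_a, track_b, horizon).size)

    # Example: two candidate cameras, two tracked people walking toward each other.
    cams = [(0.0, 0.0), (10.0, 10.0)]
    a = np.array([[2.0, 5.0], [2.5, 5.0], [3.0, 5.0]])
    b = np.array([[8.0, 5.2], [7.5, 5.2], [7.0, 5.2]])
    print(least_occluded_camera(cams, a, b))

    The system described in the abstract additionally plans PTZ settings and selects views that capture unobstructed video with high probability; the sketch only conveys the basic idea of extrapolating tracks and scoring candidate cameras by predicted visibility.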
    URI: http://hdl.handle.net/1903/3723
    Collections
    • Computer Science Theses and Dissertations
    • UMD Theses and Dissertations

    DRUM is brought to you by the University of Maryland Libraries
    University of Maryland, College Park, MD 20742-7011 (301)314-1328.