
    Deep Neural Network for Robust Multiple Object Tracking

    Name: TETDEDXChu-temple-0225E-13991.pdf
    Size: 14.28 MB
    Format: PDF
    Genre: Thesis/Dissertation
    Date: 2020
    Author: Chu, Peng
    Advisor: Ling, Haibin
    Committee members: Souvenir, Richard M.; Yang, Jie; Zhang, Kai
    Department: Computer and Information Science
    Subject: Computer Science
    Permanent link to this record: http://hdl.handle.net/20.500.12613/2698
    
    DOI: http://dx.doi.org/10.34944/dspace/2680
    Abstract
    Tracking multiple objects in video is critical for many applications, ranging from vision-based surveillance to autonomous driving. The popular solution to Multiple Object Tracking (MOT) is the tracking-by-detection strategy, in which detections produced by an external detector in each frame are associated and connected to form target trajectories, in either online or offline batch mode. Following this strategy, the challenges of robust tracking come mainly from three aspects: discriminating targets with similar appearance; handling noise in the input detections; and unifying the separate functional modules for generalizability. Recently, deep neural networks (DNNs) have demonstrated the ability to automatically learn discriminative features from training samples and have thus achieved success in various computer vision tasks. My research leverages this powerful learning ability of DNNs to tackle the above challenges for robust MOT in real-world applications. In this dissertation, I first introduce the popular framework of MOT systems, the datasets, the evaluation metrics, and the challenges in MOT. Then I discuss a work that encodes the structural prior of curvilinear structures in a rank-1 tensor approximation tracking framework to reduce the ambiguity arising from indistinguishable parts of curvilinear structures. This work uses a convolutional neural network to generate more reliable candidates for tracking and consequently improves tracking robustness. In the third chapter, I present a work that adapts DNN-based Single Object Tracking (SOT) techniques to recover missing detections. The SOT tracker in this work merges the originally separate feature extraction and similarity evaluation into an integrated affinity estimator. Learning the integrated affinity estimator requires dedicated affinity samples to be manually fabricated from ground-truth associations, which usually does not guarantee a consistent data distribution between the training and inference phases. In Chapter 4, FAMNet is proposed to integrate feature extraction, affinity estimation, and multi-dimensional assignment into a unified DNN to realize end-to-end learning; comprehensive experiments demonstrate its capability across different target categories and tracking scenarios. On the other hand, training a DNN usually requires a large amount of labeled data, which is not always available in tracking tasks. To tackle this problem, in Chapter 5, I present a work that uses transfer learning and a multi-task scheme to facilitate feature learning with limited training data. Finally, we conclude with a discussion of future work, including a DNN that also integrates the detector for MOT, as well as other possible MOT frameworks such as model-free MOT trackers.
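    For readers unfamiliar with the tracking-by-detection framework described in the abstract, the sketch below illustrates the generic per-frame association step: pairwise affinities are computed between existing trajectories and new detections, and matches are chosen by solving an assignment problem. This is not the dissertation's method; the IoU-based affinity, the associate helper, and the min_affinity threshold are hypothetical placeholders standing in for the learned DNN affinity estimators (e.g., FAMNet) discussed above, and the assignment here is standard bipartite Hungarian matching rather than the multi-dimensional assignment of Chapter 4.

    # Minimal sketch of one tracking-by-detection association step, assuming
    # per-frame detections from an external detector. The affinity function is
    # a placeholder for a learned DNN affinity estimator.
    import numpy as np
    from scipy.optimize import linear_sum_assignment


    def iou(box_a, box_b):
        """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
        x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
        x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
        inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
        area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
        area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
        return inter / (area_a + area_b - inter + 1e-9)


    def associate(tracks, detections, affinity=iou, min_affinity=0.3):
        """Match existing tracks to new-frame detections.

        Returns (matches, unmatched_tracks, unmatched_detections), where
        matches is a list of (track_index, detection_index) pairs.
        """
        if not tracks or not detections:
            return [], list(range(len(tracks))), list(range(len(detections)))

        # Affinity matrix: rows = existing tracks, columns = new detections.
        aff = np.array([[affinity(t, d) for d in detections] for t in tracks])

        # The Hungarian algorithm minimizes cost, so negate the affinity.
        rows, cols = linear_sum_assignment(-aff)

        matches = [(r, c) for r, c in zip(rows, cols) if aff[r, c] >= min_affinity]
        matched_t = {r for r, _ in matches}
        matched_d = {c for _, c in matches}
        unmatched_tracks = [i for i in range(len(tracks)) if i not in matched_t]
        unmatched_dets = [j for j in range(len(detections)) if j not in matched_d]
        return matches, unmatched_tracks, unmatched_dets

    In an online tracker this step runs once per frame: matched detections extend their trajectories, unmatched detections may spawn new tracks, and unmatched tracks are terminated or kept alive for a few frames to recover from missed detections.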
    ADA compliance
    For Americans with Disabilities Act (ADA) accommodation, including help with reading this content, please contact scholarshare@temple.edu
    Collections: Theses and Dissertations

