Property  Value
?:abstract
  • Tracking features between two consecutive images captures the essence of motion and allows objects in a scene to be categorized as either static or moving. There is a large body of literature on tracking features (sparse or dense), and many improvements have been proposed over time. Most of these methods extract motion either through global optic flow methods such as Horn-Schunck or local optic flow methods such as Lucas-Kanade. The analysis typically assumes a static camera, where the background remains stationary; extracting motion from a moving camera is more challenging because the camera's motion is imparted to every object in the image. We examine the problem of tracking and tailing one or more people from a camera mounted on a mobile robot and present a solution. In the proposed method, an alternative approach to optic flow computation is taken by formulating it in an energy-minimization framework. The computed flow field is filtered with a spatial relative-velocity-based filter to identify potential moving objects. Color and depth information is then used to segment and correctly classify the moving objects. The approach works in a variety of test environments, including changes in illumination, the presence of many textured static objects, and backgrounds of similar color.
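The first two stages the abstract describes (dense optic flow via energy minimization, then a spatial relative-velocity filter over the flow field) can be illustrated with a minimal sketch. This is not the paper's implementation: the Horn-Schunck iteration below stands in for the energy-minimization flow, and `relative_velocity_mask`, its window size, and its threshold are all assumptions made for illustration.

```python
import numpy as np

def horn_schunck(I1, I2, alpha=1.0, n_iters=2000):
    """Dense optic flow between two grayscale frames by minimizing the
    Horn-Schunck energy (brightness constancy plus an alpha-weighted
    smoothness term), solved with Jacobi-style iterations."""
    I1, I2 = I1.astype(float), I2.astype(float)
    Ix = np.gradient(I1, axis=1)   # spatial derivatives
    Iy = np.gradient(I1, axis=0)
    It = I2 - I1                   # temporal derivative
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    for _ in range(n_iters):
        # Four-neighbour averages implement the smoothness term.
        u_avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                        + np.roll(u, 1, 1) + np.roll(u, -1, 1))
        v_avg = 0.25 * (np.roll(v, 1, 0) + np.roll(v, -1, 0)
                        + np.roll(v, 1, 1) + np.roll(v, -1, 1))
        resid = (Ix * u_avg + Iy * v_avg + It) / (alpha**2 + Ix**2 + Iy**2)
        u = u_avg - Ix * resid
        v = v_avg - Iy * resid
    return u, v

def relative_velocity_mask(u, v, win=5, thresh=0.5):
    """Flag pixels whose flow deviates from the local median flow by
    more than `thresh` -- a simple stand-in for a spatial
    relative-velocity filter (window size and threshold assumed)."""
    pad = win // 2
    up = np.pad(u, pad, mode='edge')
    vp = np.pad(v, pad, mode='edge')
    H, W = u.shape
    mask = np.zeros((H, W), dtype=bool)
    for i in range(H):
        for j in range(W):
            nu = np.median(up[i:i + win, j:j + win])
            nv = np.median(vp[i:i + win, j:j + win])
            mask[i, j] = np.hypot(u[i, j] - nu, v[i, j] - nv) > thresh
    return mask

# Synthetic check: a horizontal sinusoid shifted right by one pixel
# should yield flow (u, v) close to (1, 0), and uniform flow should
# leave the relative-velocity mask empty.
x = np.arange(32)
I1 = np.tile(np.sin(2 * np.pi * x / 32), (32, 1))
I2 = np.roll(I1, 1, axis=1)
u, v = horn_schunck(I1, I2)
mask = relative_velocity_mask(u, v)
```

Under camera ego-motion, a global flow component affects every pixel; the relative-velocity idea is that independently moving objects stand out against the locally dominant flow, which is why the mask compares each pixel to its neighbourhood rather than to zero.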
?:citationCount
  • 0
?:cites
?:created
  • 2016-06-24
?:creator
?:estimatedCitationCount
  • 0
is ?:hasCitingEntity of
?:hasDiscipline
?:hasURL
?:publicationDate
  • 2008-01-01
?:rank
  • 25024
?:referenceCount
  • 20
?:title
  • Optical Flow based person following behaviour of a robot
?:type
