Please use this identifier to cite or link to this resource: http://dx.doi.org/10.18419/opus-5155
Full metadata record
DC Field | Value | Language
dc.contributor.author | Dridi, Mohamed H. | de
dc.date.accessioned | 2015-03-31 | de
dc.date.accessioned | 2016-03-31T08:36:55Z | -
dc.date.available | 2015-03-31 | de
dc.date.available | 2016-03-31T08:36:55Z | -
dc.date.issued | 2015 | de
dc.identifier.other | 428432751 | de
dc.identifier.uri | http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-98961 | de
dc.identifier.uri | http://elib.uni-stuttgart.de/handle/11682/5172 | -
dc.identifier.uri | http://dx.doi.org/10.18419/opus-5155 | -
dc.description.abstract | In this paper we present a number of methods (manual, semi-automatic and automatic) for tracking individual targets in high density crowd scenes where thousands of people are gathered. The necessary data about the motion of individuals, along with much other physical information, can be extracted from consecutive image sequences in different ways, including optical flow and block motion estimation. One well-known method for tracking moving objects is block matching. This way of estimating subject motion requires the specification of a comparison window, which determines the scale of the estimate. In this work we present a real-time method for pedestrian recognition and tracking in sequences of high resolution images obtained by a stationary (high definition) camera placed at different locations on the Haram Mosque in Mecca. The objective is to estimate pedestrian velocities as a function of the local density. The resulting data from tracking moving pedestrians in video sequences are presented in the following section. Using the evaluated system, the spatio-temporal coordinates of each pedestrian during the Tawaf ritual are established. The pilgrim velocities as a function of the local densities in the Mataf area (Haram Mosque, Mecca) are illustrated and precisely documented. Tracking in such places, where pedestrian density reaches 7 to 8 persons/m², is extremely challenging due to the small number of pixels on the target, appearance ambiguity resulting from the dense packing, and severe inter-object occlusions. The tracking method outlined in this paper overcomes these challenges by using a virtual camera which is matched in position, rotation and focal length to the original camera, such that the features of the 3D model match the feature positions of the filmed mosque. In this model an individual feature has to be identified by eye, with contrast as a criterion.
We know that the pilgrims walk on a plane, and after matching the camera we also obtain the height of that plane in 3D space from our 3D model. A point object is placed at the position of a selected pedestrian. During the animation we set multiple animation keys (approximately every 25 to 50 frames, i.e. every 1 to 2 seconds) for the position, so that the point and the pedestrian overlap at nearly every time step. By combining all these variables with the available appearance information, we are able to track individual targets in high density crowds. | en
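The abstract mentions block motion estimation (block matching) with a comparison window as one way to extract individual motion from consecutive frames. As a minimal illustration, and not the authors' implementation, the following sketch performs exhaustive-search block matching with a sum-of-absolute-differences (SAD) criterion on grayscale NumPy frames; all function names, block and search-window sizes are illustrative assumptions.

```python
import numpy as np

def block_match(prev, curr, top, left, block=16, search=8):
    """Exhaustive-search block matching: find the displacement of the
    `block`x`block` patch of `prev` at (top, left) within a +/-`search`
    pixel window in `curr`, minimizing the sum of absolute differences."""
    ref = prev[top:top + block, left:left + block].astype(np.int32)
    best_sad, best_dv = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            # Skip candidate windows that fall outside the frame.
            if y < 0 or x < 0 or y + block > curr.shape[0] or x + block > curr.shape[1]:
                continue
            cand = curr[y:y + block, x:x + block].astype(np.int32)
            sad = int(np.abs(ref - cand).sum())
            if best_sad is None or sad < best_sad:
                best_sad, best_dv = sad, (dy, dx)
    return best_dv  # (row shift, column shift)

# Synthetic example: a bright 16x16 square shifted by (2, 3) pixels.
prev = np.zeros((64, 64), dtype=np.uint8)
prev[20:36, 20:36] = 255
curr = np.zeros_like(prev)
curr[22:38, 23:39] = 255
print(block_match(prev, curr, 20, 20))  # → (2, 3)
```

In dense crowd scenes the choice of the comparison-window size matters: too small and the patch is ambiguous among similarly dressed pilgrims, too large and it spans several people moving differently, which is one reason the paper supplements block-based estimates with the virtual-camera geometry.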
dc.language.iso | en | de
dc.rights | info:eu-repo/semantics/openAccess | de
dc.subject.classification | Object tracking, transport, building planning, data processing, social sciences, traffic planning | de
dc.subject.ddc | 530 | de
dc.subject.other | Pedestrian flows, pedestrian simulation, pedestrian density, pedestrian movement and transport | de
dc.subject.other | Pedestrian dynamics, crowd management, crowd control, object tracking | en
dc.title | Tracking individual targets in high density crowd scenes: analysis of a video recording in Hajj 2009 | en
dc.type | article | de
ubs.fakultaet | Fakultät Mathematik und Physik | de
ubs.institut | Institut für Theoretische Physik I | de
ubs.opusid | 9896 | de
ubs.publikation.source | Current Urban Studies 3 (2015), pp. 35-53. URL http://dx.doi.org/10.4236/cus.2015.31005 | de
ubs.publikation.typ | Zeitschriftenartikel (journal article) | de
Appears in collections: 08 Fakultät Mathematik und Physik

Files in this item:
File | Description | Size | Format
CUS_2015033010291490.pdf | | 3.04 MB | Adobe PDF


All items in this repository are protected by copyright.