Show simple item record

dc.contributor.author: Wang, Y
dc.contributor.author: Brownjohn, J
dc.contributor.author: Dai, K
dc.contributor.author: Patel, M
dc.date.accessioned: 2020-09-30T15:28:14Z
dc.date.issued: 2019-11-15
dc.description.abstract: Vibration serviceability of footbridges is important in terms of fitness for purpose. Human-induced dynamic loading is the primary excitation of footbridges and has traditionally been researched with sensors such as inertial sensors and force plates. Alongside advances in computer hardware and algorithms (e.g., machine learning, and deep learning in particular), computer vision technology has improved rapidly and has potential applications to this problem. High-precision pedestrian detection can be achieved with various computer vision methods, corresponding to different situations or demands. In this paper, two widely recognized computer vision approaches are used to detect body center-of-mass and ankle movement, to explore the potential of these methods for human-induced vibration research. Consumer-grade cameras are used without artificial markers to record videos for further processing, and wearable inertial sensors are used to validate and evaluate the computer vision measurements.
dc.identifier.citation: Vol. 5, article 133
dc.identifier.doi: 10.3389/fbuil.2019.00133
dc.identifier.uri: http://hdl.handle.net/10871/123044
dc.language.iso: en
dc.publisher: Frontiers Media
dc.rights: © 2019 Wang, Brownjohn, Dai and Patel. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
dc.subject: human-induced vibration
dc.subject: footbridge
dc.subject: computer vision
dc.subject: instance segmentation
dc.subject: human pose estimation
dc.title: An Estimation of Pedestrian Action on Footbridges Using Computer Vision Approaches
dc.type: Article
dc.date.available: 2020-09-30T15:28:14Z
dc.description: This is the final version. Available on open access from Frontiers Media via the DOI in this record.
dc.description: Data Availability Statement: The datasets generated for this study are available on request to the corresponding author.
dc.identifier.eissn: 2297-3362
dc.identifier.journal: Frontiers in Built Environment
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dcterms.dateAccepted: 2019-10-29
rioxxterms.version: VoR
rioxxterms.licenseref.startdate: 2019-11-15
rioxxterms.type: Journal Article/Review
refterms.dateFCD: 2020-09-30T15:26:42Z
refterms.versionFCD: VoR
refterms.dateFOA: 2020-09-30T15:28:19Z
refterms.panel: B
refterms.depositException: publishedGoldOA
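
The field/value pairs above can be consumed programmatically. A minimal Python sketch follows; the list-of-tuples representation and the `values` helper are illustrative assumptions (not a DSpace API), chosen because repeatable fields such as dc.contributor.author cannot share a single dictionary key. All field names and values are taken directly from the record.

```python
# Represent repeatable Dublin Core fields as (field, value) pairs.
# This structure is an assumption for illustration; values below are
# copied from the record above.
record = [
    ("dc.contributor.author", "Wang, Y"),
    ("dc.contributor.author", "Brownjohn, J"),
    ("dc.contributor.author", "Dai, K"),
    ("dc.contributor.author", "Patel, M"),
    ("dc.identifier.doi", "10.3389/fbuil.2019.00133"),
    ("dc.identifier.uri", "http://hdl.handle.net/10871/123044"),
    ("dc.title", "An Estimation of Pedestrian Action on Footbridges "
                 "Using Computer Vision Approaches"),
]

def values(rec, field):
    """Collect every value of a (possibly repeated) metadata field."""
    return [v for f, v in rec if f == field]

authors = values(record, "dc.contributor.author")
# A DOI resolves through the doi.org proxy by convention.
doi_url = "https://doi.org/" + values(record, "dc.identifier.doi")[0]

print(authors)
print(doi_url)
```

For example, `values(record, "dc.contributor.author")` returns all four author entries, and `doi_url` gives the resolvable link to the article.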

