Publication: Article / Preprint · 2019 · Italy
Publisher: Public Library of Science (PLoS)
Funded by: EC | TRADR; EC | SecondHands
Authors: Sanzari, Marta; Ntouskos, Valsamis; Pirri, Fiora

Abstract: We present a novel framework for the automatic discovery and recognition of motion primitives in videos of human activities. Given the 3D pose of a human in a video, human motion primitives are discovered by optimizing the 'motion flux', a quantity which captures the motion variation of a group of skeletal joints. A normalization of the primitives is proposed to make them invariant with respect to a subject's anatomical variations and to the data sampling rate. The discovered primitives are unknown and unlabeled, and are collected into classes in an unsupervised manner via a hierarchical non-parametric Bayes mixture model. Once classes are determined and labeled, they are further analyzed to establish models for recognizing discovered primitives. Each primitive model is defined by a set of learned parameters. Given new video data and the estimated pose of the subject appearing in the video, the motion is segmented into primitives, which are recognized with a probability given by the parameters of the learned models. Using our framework we build a publicly available dataset of human motion primitives, using sequences taken from well-known motion capture datasets. We expect that our framework, by providing an objective way of discovering and categorizing human motion, will be a useful tool in numerous research fields, including video analysis, human-inspired motion generation, learning by demonstration, intuitive human-robot interaction, and human behavior analysis.
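The abstract describes scoring groups of skeletal joints by a 'motion flux' that captures their motion variation, normalized so that the score is invariant to a subject's anatomy and to the data sampling rate. The paper's exact definition is not reproduced in this record, so the following is only a minimal sketch of such a quantity: it accumulates per-joint speed over a clip, dividing by a reference limb length (anatomy) and by the frame interval `dt` (sampling rate). The function name `motion_flux` and both normalization parameters are illustrative assumptions, not the authors' implementation.

```python
import math

def motion_flux(trajectory, dt=1.0, limb_length=1.0):
    """Sketch of a motion-flux-like score for one group of joints.

    trajectory: a list of frames; each frame is a list of (x, y, z)
    positions for the joints in the group. Dividing displacements by
    `dt` and `limb_length` mirrors the sampling-rate and anatomical
    invariances the abstract describes (hypothetical normalization).
    """
    flux = 0.0
    for prev, cur in zip(trajectory, trajectory[1:]):
        for p0, p1 in zip(prev, cur):
            speed = math.dist(p0, p1) / dt      # joint speed this step
            flux += speed / limb_length          # anatomy-normalized
    return flux

# A static pose contributes zero flux; a moving joint contributes its
# normalized path length.
still = [[(0.0, 0.0, 0.0)]] * 3
moving = [[(0.0, 0.0, 0.0)], [(1.0, 0.0, 0.0)], [(2.0, 0.0, 0.0)]]
print(motion_flux(still))   # 0.0
print(motion_flux(moving))  # 2.0
```

Under this sketch, candidate primitives would be joint groups and time intervals where such a score is high; the abstract's hierarchical non-parametric Bayes mixture model would then cluster the discovered, unlabeled primitives into classes.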
Sources:
- PLoS ONE; Archivio della ricerca - Università di Roma La Sapienza: Other literature type / Article, 2019, peer-reviewed. License: CC BY
- Europe PubMed Central: Article, 2019. Full text: http://europepmc.org/articles/PMC6443174 (data source: PubMed Central)
- Archivio della ricerca - Università di Roma La Sapienza: Article, 2019
- https://doi.org/10.48550/arxiv... : Article, 2017. License: arXiv Non-Exclusive Distribution (data source: Datacite)

This Research product is the result of merged Research products in OpenAIRE.
For further information contact us at helpdesk@openaire.eu.

Access routes: Green, Gold · 11 citations · Popularity: Top 10% · Influence: Average · Impulse: Top 10% (powered by BIP!)