Publication: Conference object / Article, 2012, Germany
Publisher: Springer Science and Business Media LLC
Funding: Science Foundation Ireland (SFI) | Nick Campbell
Authors: Oertel, Catharine; Cummins, Fred; Campbell, Nick; Edlund, Jens; Wagner, Petra

Abstract: In recent years there has been substantial debate about the need for increasingly spontaneous, conversational corpora of spoken interaction that are not controlled or task-directed. In parallel, the need has arisen to record multimodal corpora that are not restricted to the audio domain alone. With a corpus that fulfills both needs, it would be possible to investigate natural coupling not only in turn-taking and voice, but also in the movement of participants. In this paper we describe the design and recording of such a corpus and provide some illustrative examples of how it might be exploited in the study of dynamic interaction. The D64 corpus is a multimodal corpus recorded over two successive days. Each day yielded approximately 4 hours of recordings. In total, five participants took part in the recordings, of whom two were female and three were male. Seven video cameras were used, of which at least one was trained on each participant. The OptiTrack motion capture kit was used to enrich the recordings with movement data. The D64 corpus comprises annotations on conversational involvement, speech activity and pauses, as well as information on the average degree of change in the movement of participants.
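The "average degree of change in the movement of participants" mentioned in the abstract could, for example, be computed as the mean frame-to-frame displacement of motion-capture markers. A minimal sketch follows; the array layout and function name are illustrative assumptions, not the actual D64 export format:

```python
import numpy as np

def average_movement_change(frames: np.ndarray) -> float:
    """Mean frame-to-frame Euclidean displacement across all markers.

    frames: array of shape (n_frames, n_markers, 3) holding x/y/z marker
    positions per motion-capture frame (layout assumed for illustration;
    the real D64 data format may differ).
    """
    # Displacement vectors between consecutive frames, per marker.
    deltas = np.diff(frames, axis=0)          # shape: (n_frames - 1, n_markers, 3)
    # Euclidean distance each marker moved between consecutive frames.
    dists = np.linalg.norm(deltas, axis=-1)   # shape: (n_frames - 1, n_markers)
    # Average over all frame transitions and all markers.
    return float(dists.mean())

# Toy example: two markers each moving 1 unit along x per frame.
frames = np.array([
    [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]],
    [[1.0, 0.0, 0.0], [2.0, 0.0, 0.0]],
    [[2.0, 0.0, 0.0], [3.0, 0.0, 0.0]],
])
print(average_movement_change(frames))  # → 1.0
```

In practice such a measure would typically be computed over a sliding window per participant, so that movement activity can be aligned with the corpus's speech-activity and involvement annotations.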
Data sources: Publications at Bielefeld University (Conference object, 2010; License: CC BY-NC-ND); Journal on Multimodal User Interfaces (Article, 2012/2013, peer-reviewed; License: Springer TDM)
Access route: hybrid. Citations: 34. Popularity: Top 10%. Influence: Top 10%. Impulse: Top 10% (powered by BIP!).