Pythagoras 360° Movement Analysis

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution - Published abstract for conference with selection process › Research › peer-review

Video analysis technology that enables the detection of specific movements has long been used in various sports as a tool for professionalization. The motion analysis system Pythagoras allows precise visualization of the surrounding space and captures 360-degree movement information (Büning, Baumgart, Grawunder, & Temme, 2017). These data can be computed and analyzed in real time using parallel computing algorithms developed especially for this method. Pythagoras is of central importance in at least two areas: (1) movement learning and (2) clinical dance therapy. For both settings, the software-supported three-dimensional video recording method offers a variety of applications.

Through the implementation of deep learning algorithms, which can be trained to identify specific movement patterns or to detect key features of dominant movement qualities (e.g., flow, acceleration), Pythagoras functions as a supervision tool in learning contexts. It also enables the visualization of interactive 360-degree tutorials in pedagogical settings, allowing the recipient to take different perspectives in order to obtain the most appropriate view of the movement material. The system can thus support self-organized, observational learning, in that learners can view a given movement not only repeatedly but also from self-chosen perspectives.

Pythagoras allows the recording and analysis of high-resolution three-dimensional movement information. The motion information is stored as point clouds, enabling marker-free motion analysis. Precise software-based measurement of the testing room enables 360° real-time visualization, which in turn allows scientific analysis based on algorithms developed especially for this process. The movement analysis encompasses a combination of parameters that capture the use of the body, shape variations, position in space and level, tempo, and the dynamics of the movements.
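As an illustration only (this is not code from the Pythagoras system), a velocity and acceleration profile of the kind that could feed detectors for qualities such as flow might be derived from tracked body centroids; the frame rate and trajectory below are assumed values:

```python
# Hypothetical sketch: per-frame speed and acceleration from a sequence
# of 3D point-cloud centroids (marker-free tracking). The 120 fps rate
# and the toy trajectory are illustrative assumptions.
import numpy as np

def velocity_profile(centroids, fps):
    """centroids: (n_frames, 3) array of body centroids in metres."""
    dt = 1.0 / fps
    velocity = np.diff(centroids, axis=0) / dt    # (n-1, 3) in m/s
    speed = np.linalg.norm(velocity, axis=1)      # scalar speed per frame
    acceleration = np.diff(speed) / dt            # m/s^2, a rough "flow" cue
    return speed, acceleration

# Toy example: a body moving at a constant 1 m/s along the x-axis.
fps = 120
t = np.arange(0, 1, 1 / fps)
centroids = np.stack([t, np.zeros_like(t), np.zeros_like(t)], axis=1)
speed, accel = velocity_profile(centroids, fps)
print(f"mean speed: {speed.mean():.2f} m/s, max |accel|: {np.abs(accel).max():.2f} m/s^2")
```

A real pipeline would first reduce each point cloud to a body segmentation or skeleton estimate; the centroid here stands in for that step.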
Furthermore, effort qualities such as space, time, movement flow, and weight use are evaluated (Bouchard & Badler, 2007). The current pilot project includes the implementation of additional algorithms that can be trained to detect specific movement patterns. This new approach of machine-learning-based movement pattern recognition can also be used to compare new data sets with existing records at any time. Records are stored on a secured internal server of the institution, designed to support sustainable software development based on Big Data infrastructures.

In detail: data will be collected with four high-speed, high-resolution cameras covering a full 360° angle. The high quality of the recorded data allows an exact measurement of movement speed and thus enables the creation of velocity profiles. The data will be collected as point clouds providing several million points for motion detection and tracking every second. Up to 4 gigabytes of body movement data can be collected per minute and will be stored on high-speed physical hard drives. In contrast to movement analysis systems based on optical markers, the recorded point cloud allows exact measurement of body form and movement behavior (e.g., of the upper extremities) without modifications or biases caused by the limitations of common marker systems. The implemented data storage system can be scaled up to 1.6 petabytes, allowing sustainable and economical data storage.
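The quoted data rate and storage capacity can be sanity-checked with simple arithmetic; the decimal interpretation of the units (1 PB = 10^6 GB) is an assumption:

```python
# Back-of-the-envelope check of the figures quoted in the abstract:
# ~4 GB of point-cloud data per minute against 1.6 PB total capacity.
GB_PER_MINUTE = 4
CAPACITY_PB = 1.6

capacity_gb = CAPACITY_PB * 1_000_000   # 1.6 PB = 1,600,000 GB (decimal units)
minutes = capacity_gb / GB_PER_MINUTE   # recording minutes until the store is full
hours = minutes / 60

print(f"capacity: {minutes:,.0f} min ≈ {hours:,.0f} h of recording")
# → capacity: 400,000 min ≈ 6,667 h of recording
```

At the stated rate the 1.6 PB store thus holds on the order of 400,000 minutes of continuous recording, which puts the "sustainable and economical" claim into concrete terms.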
Original language: English
Title of host publication: DDCMC'19 Book of Abstracts: First International Conference on Dance Data, Cognition and Multimodal Communication, September 19-21, Universidade Nova de Lisboa
Number of pages: 1
Publication date: 20.09.2019
Publication status: Published - 20.09.2019
Event: International Conference on Dance Data, Cognition and Multimodal Communication - Lisbon, Portugal
Duration: 19.09.2019 - 21.09.2019
Conference number: 1

ID: 4823181
