Interfaz gestual para interacción con música generativa
Publication date
2023
Abstract
In this project we have developed an interactive music system controlled by the position of a hand seen by a camera and by gestures made in the air. The hand position, detected by the camera and processed with MediaPipe, lets us interact with parameters such as pitch, volume, or the distribution applied. Gestures made with the whole hand, detected with the Nicla Sense ME board, let us pause or resume the music. A rough sketch of this kind of tracking loop is shown below.
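As an illustration only (not the project's actual code), a MediaPipe hand-tracking loop of this kind could look like the following Python sketch; the choice of mapping the wrist's vertical position to volume and its horizontal position to a pitch offset is an assumption made here for clarity.

```python
# Minimal sketch: read one hand with MediaPipe and map its normalized
# position to hypothetical volume / pitch controls.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def track_hand():
    cap = cv2.VideoCapture(0)
    with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.6) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB frames; OpenCV captures BGR.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                wrist = results.multi_hand_landmarks[0].landmark[0]
                # Landmark coordinates are normalized to [0, 1].
                volume = 1.0 - wrist.y                    # higher hand -> louder
                pitch_offset = int((wrist.x - 0.5) * 24)  # +/- one octave in semitones
                print(f"volume={volume:.2f} pitch_offset={pitch_offset}")
    cap.release()

if __name__ == "__main__":
    track_hand()
```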
The music on which this interaction takes place is a piece of generative music, that is, a composition produced in real time algorithmically, according to a specific set of logical control rules. Starting from a piece of generative music code, we can change parameters of that code to modify the sound that is generated (see the sketch below).
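A minimal sketch of what such a rule-driven loop might look like, assuming a simple rule set in which notes are drawn from a fixed scale and a set of parameters (transpose, volume, density) can be updated from outside while the loop runs; none of these names or rules come from the thesis itself.

```python
# Illustrative generative loop: notes are chosen algorithmically from a scale,
# and externally updated parameters change the output while it runs.
import random
import time

SCALE = [0, 2, 4, 7, 9]  # pentatonic degrees, in semitones from the root

# Hypothetical parameters that a gesture interface could update at runtime.
params = {"transpose": 0, "volume": 0.8, "density": 0.7}

def next_event(root=60):
    """Pick the next note (or a rest) according to the current rules/parameters."""
    if random.random() > params["density"]:
        return None  # rest
    degree = random.choice(SCALE)
    octave = random.choice([0, 12])
    return root + degree + octave + params["transpose"]

def run(beats=8, bpm=90):
    for _ in range(beats):
        note = next_event()
        if note is not None:
            print(f"note={note} velocity={int(params['volume'] * 127)}")
        time.sleep(60 / bpm)

if __name__ == "__main__":
    run()
```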
This project aims to bring Internet of Things applications into the field of entertainment, exploring the possibilities this technology can open up when applied to music, as well as developing our creativity by interacting with it in different ways. It is mainly an exploratory work that investigates this kind of musical interaction using elements and tools covered during this master's programme.
Although the main objective is entertainment, the project also serves as a demonstration of the potential of the tools used, such as the Nicla Sense ME board, which can run inference on a previously trained gesture in a matter of milliseconds, highlighting the value of edge computing. In addition, the backbone of this project is a Raspberry Pi 4, an inexpensive and portable mini-computer capable of supporting a complex IoT system.
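The abstract does not say how the gesture inferred on the Nicla Sense ME reaches the Raspberry Pi; the sketch below assumes a serial link delivering one text label per line, with a placeholder toggle_playback() standing in for the real playback control.

```python
# Hedged sketch: listen for a gesture label from the board and pause/resume.
# The transport (serial) and the label name are assumptions, not the thesis design.
import serial  # pyserial

def toggle_playback(state):
    state["paused"] = not state["paused"]
    print("paused" if state["paused"] else "playing")

def listen(port="/dev/ttyACM0", baud=115200):
    state = {"paused": False}
    with serial.Serial(port, baud, timeout=1) as link:
        while True:
            label = link.readline().decode(errors="ignore").strip()
            if label == "pause_gesture":  # hypothetical label emitted by the board
                toggle_playback(state)

if __name__ == "__main__":
    listen()
```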
Description
Master's thesis in Internet of Things, Facultad de Informática UCM, Departamento de Arquitectura de Computadores, academic year 2022/2023