YONDULIB: Herramienta para el uso de sonidos como método de control de videojuegos Unity
Publication date
2022
Abstract
The video game industry is constantly growing and long ago ceased to be a sector aimed at children. Today there are countless games covering very diverse fields, so it is easy to find one that we enjoy.
However, almost all games are controlled through the keyboard-and-mouse combination on PC, or through gamepads with buttons and joysticks on consoles; this is what we mean by user input. This final degree project aims to support the development of video games in which users can interact through sounds, offering an interface to replace the predefined controls with others that are more accessible and potentially more fun.
This adds to the range of new interactions that can be developed in the field of user experience, bringing new ways of playing that in turn imply new ways of conceiving the design and development of game mechanics and dynamics. To this end, we have developed a package for the well-known Unity game development environment that allows the different in-game actions to be bound to sound commands such as whistles, clicks, or taps. Since Unity has no built-in APIs for real-time sound recognition, we rely on the libsoundio library, which provides low-latency audio input functions.
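Outside Unity, the command-to-action binding described above can be sketched in a few lines. The names below (`SoundCommandMapper`, `bind`, `dispatch`) are illustrative assumptions for this sketch, not yondulib's actual API:

```python
# Illustrative sketch (assumed names, not yondulib's real API): a registry
# that maps recognized sound commands to game-action callbacks.

class SoundCommandMapper:
    """Binds sound-command names (e.g. 'whistle', 'click') to actions."""

    def __init__(self):
        self._bindings = {}

    def bind(self, command, action):
        # Associate a recognized command with a callable game action.
        self._bindings[command] = action

    def dispatch(self, command):
        # Invoke the action bound to a recognized command, if any.
        action = self._bindings.get(command)
        if action is not None:
            action()
            return True
        return False  # unrecognized or unbound command


# Usage: bind a jump action to a whistle, then simulate the recognizer
# reporting two detected sounds.
events = []
mapper = SoundCommandMapper()
mapper.bind("whistle", lambda: events.append("jump"))
mapper.dispatch("whistle")   # bound: triggers the jump action
mapper.dispatch("clap")      # unbound: ignored
```

In the real package this dispatch step would feed Unity's input pipeline rather than plain callbacks, but the mapping idea is the same.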
Thanks to the resulting library, yondulib, we can analyse the input and identify the sounds using frequency- and intensity-detection algorithms. A developer using yondulib can associate sounds with in-game actions, receiving real-time graphical feedback on the match rate of each association, and a player can see in real time how the game recognizes their sounds.
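The frequency-detection idea can be illustrated with a minimal, self-contained sketch: a naive DFT that finds the dominant frequency of a tonal sound such as a whistle. This is an assumed stand-in for yondulib's actual algorithms, which are not detailed here:

```python
import cmath
import math

def dominant_frequency(samples, sample_rate):
    """Return (frequency_hz, magnitude) of the strongest DFT bin.

    A naive O(n^2) DFT, only a sketch of the frequency-detection idea;
    a real implementation would use an FFT plus windowing, and would
    also check intensity to tell tonal whistles from broadband clicks.
    """
    n = len(samples)
    best_bin, best_mag = 0, 0.0
    for k in range(1, n // 2):          # skip DC, stop at Nyquist
        coeff = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        mag = abs(coeff)
        if mag > best_mag:
            best_bin, best_mag = k, mag
    return best_bin * sample_rate / n, best_mag

# Synthetic "whistle": a 1 kHz sine sampled at 8 kHz.
rate = 8000
tone = [math.sin(2 * math.pi * 1000 * t / rate) for t in range(256)]
freq, mag = dominant_frequency(tone, rate)   # freq is 1000.0 Hz here
```

A strong, narrow peak like this one suggests a whistle; a percussive click or tap would spread its energy across many bins instead.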
As a result, this work describes the design and development of yondulib, which is available both as source code in a public repository and as a Unity package that contains a practical example of how to use the library. We also describe how the package was integrated into two existing games.
This project marks out a path to continue exploring novel sound-based inputs, and can also serve as a starting point for parallel research into innovative user input of any kind in video game development, which is undoubtedly a field yet to be explored.
Description
Final degree project (Trabajo de Fin de Grado) in Video Game Development, Facultad de Informática UCM, Departamento de Ingeniería del Software e Inteligencia Artificial, academic year 2021/2022.