Authors: Murillo Montero, Raúl; Del Barrio García, Alberto Antonio; Botella Juan, Guillermo
Date issued: 2021
Date available: 2023-12-01
Citation: Raul Murillo, Alberto Antonio Del Barrio, & Guillermo Botella. (2021, September 21). Posit Arithmetic Units for Deep Neural Networks. Avances en arquitectura y tecnología de computadores. Actas de las Jornadas SARTECO 20/21, Málaga.
ISBN: 978-84-09-32487-3
DOI: 10.5281/zenodo.7737760
Handle: https://hdl.handle.net/20.500.14352/91048

Abstract: Posit™ arithmetic is a recent alternative to the IEEE 754 standard for floating-point numbers that claims to provide compelling advantages over floats, including higher accuracy, a larger dynamic range, and bitwise compatibility across systems. In particular, this format is a suitable candidate to replace floating-point numbers in Deep Neural Networks (DNNs), an area of growing interest with a large computational cost. This work presents parameterized designs for several posit functional units, including addition, multiplication, and the multiply-accumulate operation, and integrates them as templates of the FloPoCo framework. Synthesis results show that the proposed arithmetic units significantly reduce hardware requirements compared with previous implementations. Finally, this work proposes the use of posit arithmetic for both DNN inference and training. Experiments on several datasets, including CIFAR-10, reveal that 16-bit posits can safely replace 32-bit floats for training, and that low-precision 8-bit posits can be used for DNN inference with a negligible accuracy drop.

Language: English
License: Attribution-ShareAlike 4.0 International (http://creativecommons.org/licenses/by-sa/4.0/)
Title: Posit Arithmetic Units for Deep Neural Networks
Alternative title (Spanish): Unidades Aritméticas Posit para Redes Neuronales Profundas
Type: conference paper
Related URLs: https://sarteco.org/jornadas-sarteco-20-21/; https://doi.org/10.5281/zenodo.7737760; https://www.jornadassarteco.org/sdm_downloads/actas-jornadas-paralelismo-20-21/
Access rights: open access
Keywords: Hardware; Artificial intelligence (Computer science)
UNESCO codes: 1203.17 Computer Science; 1203.18 Information Systems, Component Design
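
Note: for readers unfamiliar with the posit format referenced in the abstract, the following is a minimal Python sketch of how an n-bit posit with es exponent bits decodes into a real value through its sign, regime, exponent, and fraction fields. It illustrates the encoding defined by the posit standard in general; it is not part of, and makes no claims about, the paper's FloPoCo-based hardware designs.

def decode_posit(bits: int, n: int = 8, es: int = 0) -> float:
    """Decode an n-bit posit with es exponent bits into a Python float.

    Fields (after the sign) are: regime, exponent, fraction, and the value is
    (-1)^s * (2^(2^es))^k * 2^e * (1 + f), with k derived from the regime run.
    """
    mask = (1 << n) - 1
    bits &= mask
    if bits == 0:
        return 0.0
    if bits == 1 << (n - 1):               # 100...0 encodes NaR ("not a real")
        return float("nan")

    sign = (bits >> (n - 1)) & 1
    if sign:                               # negative posits are stored in two's complement
        bits = (-bits) & mask

    body = bits & ((1 << (n - 1)) - 1)     # bits after the sign
    rem = n - 1

    # Regime: run of identical bits terminated by the opposite bit (or the end).
    first = (body >> (rem - 1)) & 1
    run = 0
    while rem > 0 and ((body >> (rem - 1)) & 1) == first:
        run += 1
        rem -= 1
    k = run - 1 if first else -run
    if rem > 0:                            # skip the terminating (opposite) bit
        rem -= 1

    # Exponent: up to `es` bits; missing bits are treated as trailing zeros.
    e_bits = min(es, rem)
    e = ((body >> (rem - e_bits)) & ((1 << e_bits) - 1)) << (es - e_bits)
    rem -= e_bits

    # Fraction: remaining bits, with an implicit leading 1.
    frac = (body & ((1 << rem) - 1)) / (1 << rem) if rem > 0 else 0.0

    scale = k * (1 << es) + e              # regime and exponent combine into a power of two
    return (-1.0) ** sign * 2.0 ** scale * (1.0 + frac)


# Example: in posit<8,0>, 0x40 decodes to 1.0, 0x50 to 1.5, and 0xC0 to -1.0.
assert decode_posit(0x40, n=8, es=0) == 1.0
assert decode_posit(0x50, n=8, es=0) == 1.5
assert decode_posit(0xC0, n=8, es=0) == -1.0

The tapered precision visible here (long regimes trade fraction bits for dynamic range) is what motivates low-precision posits for DNN inference in the abstract.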