RT Journal Article
T1 Evaluation of Juliana Tool: A Translator for Julia’s CUDA.jl Code into KernelAbstraction.jl
A1 De la Calle, Enrique
A1 García Sánchez, Carlos
AB Julia is a high-level language that supports the execution of parallel code through various packages. CUDA.jl is widely used for developing GPU-accelerated code in Julia and is integrated into many libraries and programs. In this paper, we present Juliana, a novel tool that automatically translates Julia code utilizing the CUDA.jl package into an abstract, multi-backend representation powered by the KernelAbstractions.jl package. To evaluate the tool’s viability and performance, we analyzed four Julia projects: Rodinia, miniBUDE, BabelStream, and Oceananigans.jl. The performance overhead of this approach was found to be relatively low (under 7% for the Rodinia suite), with performance portability metrics showing results nearly identical to those of the native implementations. By running the same code across multiple KernelAbstractions.jl backends, we successfully executed the translated projects on GPUs from NVIDIA, Intel, AMD, and Apple. This ensured compatibility across these platforms and enabled first-time execution on some devices.
PB Elsevier
YR 2025
FD 2025
LK https://hdl.handle.net/20.500.14352/119578
UL https://hdl.handle.net/20.500.14352/119578
LA eng
DS Docta Complutense
RD 21 Jan 2026