Evaluation of Juliana Tool: A Translator for Julia’s CUDA.jl Code into KernelAbstractions.jl

Full text at PDC

Publication date

2025

Publisher

Elsevier
Abstract

Julia is a high-level language that supports the execution of parallel code through various packages. CUDA.jl is widely used for developing GPU-accelerated code in Julia and is integrated into many libraries and programs. In this paper, we present Juliana, a novel tool that automatically translates Julia code using the CUDA.jl package into an abstract, multi-backend representation powered by the KernelAbstractions.jl package. To evaluate the tool’s viability and performance, we analyzed four Julia projects: Rodinia, miniBUDE, BabelStream, and Oceananigans.jl. The performance overhead of this approach proved relatively low (under 7% for the Rodinia suite), with performance-portability metrics nearly identical to those of the native implementations. By running the same code across multiple KernelAbstractions.jl backends, we successfully executed the translated projects on GPUs from NVIDIA, Intel, AMD, and Apple, ensuring compatibility across these platforms and enabling first-time execution on some devices.
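To illustrate the kind of rewrite the abstract describes, here is a minimal sketch (not taken from the paper) of a hand-written CUDA.jl SAXPY kernel alongside a KernelAbstractions.jl equivalent; the kernel names and launch parameters are illustrative assumptions, not Juliana's actual output.

```julia
# CUDA.jl version: explicit thread/block indexing, NVIDIA-only.
using CUDA

function saxpy_cuda!(y, x, a)
    i = (blockIdx().x - 1) * blockDim().x + threadIdx().x
    if i <= length(y)
        @inbounds y[i] = a * x[i] + y[i]
    end
    return nothing
end
# Launch: @cuda threads=256 blocks=cld(length(y), 256) saxpy_cuda!(y, x, a)

# KernelAbstractions.jl version: backend-agnostic global indexing;
# the same kernel runs on NVIDIA, AMD, Intel, or Apple backends.
using KernelAbstractions

@kernel function saxpy_ka!(y, x, a)
    i = @index(Global)
    @inbounds y[i] = a * x[i] + y[i]
end
# Launch on whatever backend holds the arrays:
#   backend = get_backend(y)
#   saxpy_ka!(backend, 256)(y, x, a; ndrange = length(y))
#   KernelAbstractions.synchronize(backend)
```

The key difference is that the KernelAbstractions.jl kernel derives its index from the abstract `@index(Global)` and receives the backend at launch time, which is what allows one translated source to target GPUs from multiple vendors.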
