Status : Verified
Personal Name : Galapon, Fredrick Angelo R.
Resource Title : An Energy-Efficient Hyperdimensional Computing Architecture Based on Binary Sparse Distributed Representations
Date Issued : 9 June 2025
Abstract : Hyperdimensional computing (HDC), drawing inspiration from the brain, uses extremely high-dimensional vectors to solve classification problems in a more robust and energy-efficient way than traditional machine learning methods. Unlike conventional algorithms, which often rely on complex models and computationally intensive training, HDC offers a scalable and lightweight alternative by performing simple bitwise operations on these high-dimensional vectors, or hypervectors. State-of-the-art HDC architectures rely on binary dense hypervectors with approximately equal numbers of 1 and 0 elements. While these provide good classification performance, they may suffer from high energy consumption due to high switching activity within the architecture, especially with hypervectors whose dimensionality is in the thousands.

In this thesis, we explore the use of binary sparse representations, in which most of the elements are 0s, for a highly energy-efficient HDC architecture. We evaluate our approach on three classification tasks, namely language recognition, character recognition, and hand gesture recognition, targeting a 65nm CMOS process. Our results show that sparse representations achieve a 1.34-5.70x reduction in energy per inference compared to dense representations, while maintaining comparable classification performance across all three tasks, with little to no area overhead. This improvement is observed at a hypervector dimensionality of 10,000, and remains consistent for a wide range of dimensionalities, from 256 to 8,192. Furthermore, increasing the sparsity in the architecture, where as few as 1%-2% of the elements in a hypervector are 1s, improves energy efficiency without degrading classification accuracy. Despite this higher sparsity, the proposed HDC architecture remains robust against memory errors, comparable to conventional architectures based on dense representations. These findings highlight the potential of sparse representations in enabling more energy-efficient HDC architectures.
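As a rough illustration of the sparse binary operations the abstract refers to, the sketch below builds random binary sparse hypervectors (~2% of elements set to 1, matching the sparsity regime mentioned above), bundles them by summing and thinning back to the target sparsity, and compares vectors by counting shared active positions. This is a minimal, generic sketch of sparse HDC bundling and similarity, not the thesis's actual hardware architecture; the dimensionality, sparsity level, and function names are illustrative assumptions.

```python
import random

DIM = 10_000       # hypervector dimensionality (the abstract evaluates up to 10,000)
SPARSITY = 0.02    # ~2% of elements are 1s, per the sparse regime described

def random_sparse_hv(rng, dim=DIM, sparsity=SPARSITY):
    """Random binary sparse hypervector, stored as the set of positions that are 1."""
    k = int(dim * sparsity)
    return frozenset(rng.sample(range(dim), k))

def bundle(hvs, dim=DIM, sparsity=SPARSITY):
    """Bundle by elementwise sum followed by top-k thresholding ("thinning"),
    so the result keeps roughly the same sparsity as its inputs."""
    counts = {}
    for hv in hvs:
        for i in hv:
            counts[i] = counts.get(i, 0) + 1
    k = int(dim * sparsity)
    # Keep the k positions with the highest counts (ties broken by index).
    top = sorted(counts, key=lambda i: (-counts[i], i))[:k]
    return frozenset(top)

def overlap(a, b):
    """Similarity = number of shared active positions (binary dot product)."""
    return len(a & b)

rng = random.Random(0)
a, b, c = (random_sparse_hv(rng) for _ in range(3))
proto = bundle([a, b])  # class prototype from two "training" hypervectors
# The bundle stays far more similar to its constituents than to an unrelated vector.
assert overlap(proto, a) > overlap(proto, c)
```

Because similarity reduces to counting coincident 1s, a hardware implementation only switches on the few active positions, which is the intuition behind the energy savings the abstract reports for sparse versus dense representations.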
Degree Course : Master of Science in Electrical Engineering
Language : English
Keyword : energy efficiency, sparsity, associative-projective neural network, binary spatter code
Material Type : Thesis/Dissertation
Preliminary Pages : 1.01 MB
Category : I - Has patentable or registrable invention or creation.
 
Access Permission : Limited Access