
Classification with binary hyperdimensional computing

Hyperdimensional computing (HDC) has been introduced as a brain-inspired, lightweight, and energy-efficient algorithmic framework suitable for wearable internet-of-things (IoT) devices, near-sensor artificial intelligence (AI) applications, and on-device processing. It originates at the intersection of symbolic AI and connectionism, with the potential to combine the advantages of both: the transparency of symbolic representations, and the flexibility and robustness of connectionist representations. HDC maps input data to vectors in a hyperdimensional (HD) space, i.e., hyperdimensional vectors (HVs) with typically thousands of components, and relies on simple component-wise arithmetic vector operations. Several HDC frameworks exist, distinguished by their vector component types.

This dissertation focuses on dense binary HDC, which has been reported to offer higher energy efficiency and greater robustness than its non-binary counterpart, at the cost of slightly lower classification accuracy. It therefore investigates adjustments and improvements to the dense binary HDC classification model that enhance its classification performance in terms of accuracy and transparency.

First, a novel uniform framework is developed to encode binarized images into HVs. The framework uses a local feature extraction method relying only on native HD arithmetic vector operations. The proposed encoding outperforms other studies using native HDC with different encoding approaches, is on par with more complex hybrid HDC models and lightweight binary neural networks (BNNs), and demonstrates higher robustness to noise and blur than the baseline encoding.

Second, two models that enhance previous HDC classifiers are introduced and explored. The models use a confidence metric that measures the confidence with which a sample has been classified during the training procedure.
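The component-wise arithmetic the abstract refers to can be illustrated with the standard dense binary HDC operations: binding (XOR), bundling (component-wise majority vote), and similarity via normalized Hamming distance. The sketch below is a minimal illustration of these generic operations, not the dissertation's specific encoding framework; all names are illustrative.

```python
import numpy as np

D = 10_000  # typical HV dimensionality: thousands of components
rng = np.random.default_rng(0)

def random_hv():
    """Dense binary HV: each component is 0 or 1 with equal probability."""
    return rng.integers(0, 2, size=D, dtype=np.uint8)

def bind(a, b):
    """Binding via component-wise XOR; the result is dissimilar to both inputs."""
    return a ^ b

def bundle(hvs):
    """Bundling via component-wise majority vote (odd counts avoid ties);
    the result is similar to each of its inputs."""
    return (np.sum(hvs, axis=0) * 2 > len(hvs)).astype(np.uint8)

def hamming_sim(a, b):
    """Normalized similarity: 1.0 for identical HVs, about 0.5 for random pairs."""
    return 1.0 - np.count_nonzero(a ^ b) / D

a, b, c = random_hv(), random_hv(), random_hv()
print(hamming_sim(a, b))                  # close to 0.5: random HVs are quasi-orthogonal
print(hamming_sim(bundle([a, b, c]), a))  # clearly above 0.5: a bundle resembles its inputs
```

Because random HVs are quasi-orthogonal in high dimensions, bound and bundled HVs remain distinguishable from unrelated vectors, which is what makes these simple operations usable for encoding and classification.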
This enhanced training procedure consistently improves classification accuracy over the baseline and increases confidence in the classifier's predictions.

Third, a generic post-hoc analysis is proposed that determines the importance of individual features in the HDC classification model. This analysis provides insight into, and transparency of, the decision-making process at every deployment of the HDC classification model.
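A common binary-HDC classifier bundles the encoded training samples of each class into a class-prototype HV and predicts by nearest Hamming similarity; a natural confidence measure is then the similarity margin between the top two classes. The sketch below follows that generic recipe on synthetic data; the margin-based `confidence` is one plausible instantiation and is not necessarily the dissertation's exact metric.

```python
import numpy as np

D = 10_000
rng = np.random.default_rng(1)

def random_hv():
    return rng.integers(0, 2, size=D, dtype=np.uint8)

def bundle(hvs):
    # component-wise majority vote; an odd number of inputs avoids ties
    return (np.sum(hvs, axis=0) * 2 > len(hvs)).astype(np.uint8)

def hamming_sim(a, b):
    return 1.0 - np.count_nonzero(a ^ b) / D

def noisy(hv, flip=0.2):
    """Toy 'encoded sample': a copy of hv with 20% of components flipped."""
    return hv ^ (rng.random(D) < flip).astype(np.uint8)

# Toy data: each class has a latent HV; samples are noisy copies of it.
latents = [random_hv() for _ in range(3)]
train = {c: [noisy(p) for _ in range(5)] for c, p in enumerate(latents)}

# Training: bundle each class's samples into a class-prototype HV.
prototypes = {c: bundle(hvs) for c, hvs in train.items()}

def classify(hv):
    """Predict the most similar class; confidence is the top-2 similarity margin."""
    sims = {c: hamming_sim(hv, p) for c, p in prototypes.items()}
    ranked = sorted(sims, key=sims.get, reverse=True)
    confidence = sims[ranked[0]] - sims[ranked[1]]
    return ranked[0], confidence

label, conf = classify(noisy(latents[0]))
print(label, round(conf, 3))
```

A small margin flags samples that lie near a decision boundary; a confidence-aware training procedure can revisit such samples to sharpen the prototypes, which is the intuition behind using confidence during training.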
University of Antwerp

