SI - Venturing Beyond Boundaries: AI Empowered by Massively Parallel Hyperdimensional Computers

Artificial intelligence, with its capacity to emulate human cognition and execute tasks in real-world scenarios, has gained significant prominence in both academic research and industrial development. Machine learning algorithms, and deep neural networks in particular, have become commonplace tools for pattern recognition and autonomous decision-making.

A fascinating and emerging domain within machine learning is Vector Symbolic Architectures (VSA), also known as Hyperdimensional Computing (HDC). These approaches form a neuro-inspired computational framework that harnesses high-dimensional random vector spaces to deliver learning and inference with practical levels of accuracy. Notably, VSA/HDC models often exhibit superior processing efficiency and robustness compared to conventional deep networks.

The computational components of the VSA/HDC architecture are particularly well-suited for deployment on highly parallel processing units, making them a promising avenue for future developments in artificial intelligence.

This special issue comprises a compilation of articles that present VSA/HDC architectures, demonstrate their significance in emerging artificial intelligence systems, and highlight substantial opportunities for parallel processing at every level, from hardware and software down to the algorithms themselves.

The origins of VSA/HDC trace back to the 1980s, with notable early contributions including the work of Hinton (1984) and Kanerva (1988), who respectively developed the theories of distributed representations and sparse distributed memory. VSA/HDC systems can effectively represent a “concept space” by harnessing the geometry and algebra of high-dimensional random vector spaces. In essence, they enable metaphorical and analogy-based reasoning akin to the processes in the animal brain. A fundamental insight is the importance of large circuits in brain computation, a concept that VSA/HDC incorporates by operating with distributed representations in high-dimensional spaces, referred to as hypervectors. These hypervectors typically have dimensions on the order of thousands or tens of thousands, and they encode information holographically: no single component carries meaning on its own, so the representation degrades gracefully under noise, ensuring inherent robustness.
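
To make this geometric intuition concrete, the following is a minimal sketch in Python with NumPy, assuming the common bipolar {-1, +1} hypervector model; the names random_hypervector and cosine are illustrative, not any fixed API. It shows that independently drawn hypervectors are quasi-orthogonal, and that the distributed (holographic) encoding tolerates substantial component noise.

import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality, typically thousands to tens of thousands

def random_hypervector(d=D):
    # Draw a bipolar hypervector with i.i.d. components in {-1, +1}.
    return rng.choice([-1, 1], size=d)

def cosine(a, b):
    # Similarity between hypervectors via the normalized dot product.
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

a = random_hypervector()
b = random_hypervector()
print(cosine(a, a))  # 1.0: a concept is maximally similar to itself
print(cosine(a, b))  # ~0.0 (within ~1/sqrt(D)): unrelated concepts are quasi-orthogonal

# Holographic robustness: flip 20% of the components at random.
noisy = a.copy()
flipped = rng.choice(D, size=D // 5, replace=False)
noisy[flipped] *= -1
print(cosine(a, noisy))  # ~0.6: still clearly identifiable as a, not as b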

The arithmetic within HDC relies on well-defined operations between hypervectors: addition (bundling), multiplication (binding), and permutation. While the specific implementations of these operations vary among HDC models, they all achieve the same abstract outcomes: superposition, association, and ordering of information, respectively. Another crucial function in HDC is information comparison, which typically measures the similarity between hypervectors using the dot product or cosine similarity. Both information manipulation and comparison in HDC are based on dimension-independent, ultra-wide vector operations, offering substantial opportunities for parallel processing and optimization.
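
As a hedged illustration of these operations, the sketch below (again Python/NumPy with bipolar hypervectors; bundle, bind, and permute are illustrative names, and concrete VSA models differ in their exact definitions) implements bundling as an elementwise majority sign, binding as elementwise multiplication, permutation as a cyclic shift, and comparison via cosine similarity.

import numpy as np

rng = np.random.default_rng(1)
D = 10_000

def hv():
    # A random bipolar hypervector with components in {-1, +1}.
    return rng.choice([-1, 1], size=D)

def bundle(*vs):
    # Addition/bundling: elementwise majority vote (an odd input count avoids ties).
    # The result stays similar to every input: superposition.
    return np.sign(np.sum(vs, axis=0))

def bind(a, b):
    # Multiplication/binding: elementwise product. The result is quasi-orthogonal
    # to both inputs (association), and in this model binding is its own inverse.
    return a * b

def permute(a, k=1):
    # Permutation: a cyclic shift, used to encode order in sequences.
    return np.roll(a, k)

def cosine(a, b):
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

a, b, c = hv(), hv(), hv()
s = bundle(a, b, c)
print(cosine(s, a))           # ~0.5: each bundled item remains recoverable
x = bind(a, b)
print(cosine(x, a))           # ~0.0: the bound pair resembles neither input
print(cosine(bind(x, b), a))  # 1.0: binding with b again recovers a exactly
print(cosine(permute(a), a))  # ~0.0: a shifted hypervector acts as a new symbol

Every call above is an elementwise or reduction operation over D independent components, which is exactly the structure that maps naturally onto wide SIMD units, GPUs, and in-memory accelerators.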
