Key Points

Introduction

Neural networks beyond digital hardware represent a shift toward exploiting the inherent physical properties of a computational substrate, inspired by biological systems. They include analog circuits that process continuous signals, optical systems that compute via light propagation, quantum networks that exploit superposition, and more exotic forms such as mechanical or chemical networks. This paradigm aims to address the power-consumption and speed limitations inherent in von Neumann architectures.

Main Types

These systems fall into several principal categories: analog electronic, optical, quantum, neuromorphic, physical/mechanical, and chemical/molecular neural networks, each leveraging a different physical substrate for computation.
Advantages and Challenges

These networks often excel in energy efficiency (e.g., analog systems avoid much of the costly analog-to-digital conversion that digital pipelines require) and fault tolerance, but they face challenges in precision control and in integration with existing digital ecosystems. Applications span edge computing to quantum-enhanced machine learning, with research increasingly emphasizing hybrid analog-digital approaches.


Survey on Neural Networks Not Based on Digital Hardware or Gate Circuits

Abstract

This survey provides a comprehensive overview of neural networks that operate without reliance on digital hardware or traditional gate circuits. These systems leverage analog, optical, quantum, neuromorphic, physical, mechanical, and chemical/molecular principles to perform computation, often drawing inspiration from biological neural networks. We systematically review historical developments, key architectures, implementation methods, advantages, challenges, applications, and future directions. The analysis is grounded in recent literature and highlights how these approaches address the energy and scalability bottlenecks of conventional digital neural networks. While promising for specialized tasks, these systems introduce unique challenges such as noise sensitivity and integration with digital hardware.

1. Introduction and Historical Context

Neural networks have traditionally been implemented on digital hardware using gate circuits, but this von Neumann-based approach suffers from high energy consumption due to data movement between memory and processors [1]. In contrast, non-digital neural networks exploit physical phenomena for inherent parallelism and efficiency, mimicking the analog nature of biological brains where neurons process continuous signals without discrete logic gates [2].
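The continuous-signal dynamics mentioned above can be made concrete with a small simulation. The following sketch (not taken from the survey; all parameter values are arbitrary illustrative choices) integrates the classic leaky integrate-and-fire neuron model, whose continuous-time membrane equation underlies many analog and neuromorphic circuit implementations, in contrast to discrete logic gates:

```python
# Illustrative sketch: a leaky integrate-and-fire (LIF) neuron simulated
# with forward-Euler integration. The membrane obeys the continuous-time
# equation  tau * dV/dt = -(V - v_rest) + I,  and a spike is emitted
# (with reset) whenever V crosses the threshold.
tau = 20e-3       # membrane time constant, seconds (illustrative value)
v_rest = 0.0      # resting potential, arbitrary units
v_thresh = 1.0    # spike threshold, arbitrary units
dt = 1e-4         # Euler integration step, seconds

def simulate_lif(input_current, duration=0.1):
    """Return spike times (s) for a constant input current over `duration`."""
    steps = int(duration / dt)
    v, spikes = v_rest, []
    for step in range(steps):
        v += (dt / tau) * (-(v - v_rest) + input_current)
        if v >= v_thresh:            # threshold crossing: spike, then reset
            spikes.append(step * dt)
            v = v_rest
    return spikes

# A constant suprathreshold drive yields regular, repeated spiking.
spike_times = simulate_lif(input_current=1.5)
```

A digital gate would map the same input to a fixed logic level; here the output spike rate varies smoothly with the (analog) input current, which is the property physical neural networks exploit.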

The concept traces back to the 1960s, when early analog implementations such as ADALINE (Adaptive Linear Neuron) used memistors (electrically adjustable resistors) to emulate synapses [3]. In the late 1980s, Carver Mead coined the term neuromorphic engineering, emphasizing analog VLSI circuits operating in the subthreshold regime for low-power, brain-like computation [4]. More recently, driven by the slowing of Moore's Law, the field has expanded to optical [5], quantum [6], mechanical [7], and chemical [8] systems, aiming for energy efficiencies orders of magnitude beyond digital counterparts.