Silicon Compute is Massively Energy Intensive Compared to Biological Brains
Modern deep learning and general-purpose computation demand enormous energy, limiting scalability and sustainability. Improving energy efficiency is critical for the next generation of computing platforms, though because it would also lower barriers to the proliferation of advanced AI, it should be pursued alongside AI safety and governance considerations.
Foundational Capabilities (10)
Explore superconducting hardware for brain-inspired computing with drastically reduced energy consumption that scales to large networks.
Build hardware for probabilistic computation, which can perform tasks such as sampling and optimization more energy-efficiently by embracing uncertainty rather than suppressing it.
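A common building block in this area is the "p-bit": a bit that fluctuates between states with a probability set by its input, so that thermal noise does useful sampling work instead of being fought with energy. A minimal software sketch (the `p_bit` function, its `beta` gain parameter, and the chosen input value are illustrative assumptions, not a reference design):

```python
import math
import random

def p_bit(input_current, beta=1.0, rng=random):
    """A probabilistic bit: outputs +1 with a sigmoid probability of its
    input, and -1 otherwise. In hardware, thermal noise supplies the
    randomness essentially for free."""
    p_one = 1.0 / (1.0 + math.exp(-2.0 * beta * input_current))
    return 1 if rng.random() < p_one else -1

random.seed(0)
samples = [p_bit(0.5) for _ in range(100_000)]
frac_plus = samples.count(1) / len(samples)
# With input 0.5 and beta 1.0, P(+1) = sigmoid(1) ~ 0.731, so frac_plus
# should settle near that value.
```

Networks of coupled p-bits can implement Gibbs-style sampling for optimization and inference; the energy advantage comes from the device, not the math, which is why this sketch only illustrates the statistics.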
Develop novel hardware architectures optimized for deep learning and artificial intelligence that dramatically reduce energy consumption compared to current systems.
Develop switching technologies that operate at millivolt levels, significantly reducing the energy required for signal processing and computation.
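The payoff of millivolt switching follows from the quadratic dependence of dynamic switching energy on voltage, E = 1/2 C V^2. A short worked comparison (the 1 fF node capacitance and the 1 V vs 50 mV operating points are illustrative assumptions):

```python
import math

def switching_energy(capacitance_f, voltage_v):
    """Dynamic switching energy of a capacitive node: E = 1/2 * C * V^2."""
    return 0.5 * capacitance_f * voltage_v ** 2

C = 1e-15                             # assumed 1 fF node capacitance
e_1v = switching_energy(C, 1.0)       # ~5.0e-16 J at a CMOS-like 1 V swing
e_50mv = switching_energy(C, 0.05)    # ~1.25e-18 J at a 50 mV swing
ratio = e_1v / e_50mv                 # quadratic savings: (1.0 / 0.05)^2 = 400x

# Thermodynamic floor for erasing one bit (Landauer, T = 300 K):
landauer_j = 1.380649e-23 * 300 * math.log(2)   # ~2.87e-21 J
```

Dropping the swing from 1 V to 50 mV cuts per-switch energy 400-fold while still sitting orders of magnitude above the Landauer floor, which is why millivolt-level switches are a target rather than a limit.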
Leverage neuromorphic hardware to perform computation more efficiently, emulating the low-energy, event-driven operation of biological neural networks.
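The core idea behind neuromorphic efficiency is event-driven computation: a neuron only "spends" energy when it spikes, rather than on every clock cycle. A minimal sketch of a leaky integrate-and-fire neuron, the canonical neuromorphic primitive (the parameter values `tau`, `v_thresh`, and the constant drive are illustrative assumptions):

```python
def lif_neuron(inputs, tau=20.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Leaky integrate-and-fire neuron: the membrane potential leaks toward
    rest, integrates input current, and emits a spike on crossing threshold.
    Returns the time steps at which spikes occur."""
    v, spikes = 0.0, []
    for t, i_in in enumerate(inputs):
        v += dt * (-v / tau + i_in)   # leak term plus input integration
        if v >= v_thresh:
            spikes.append(t)
            v = v_reset               # reset after the spike
    return spikes

# A constant drive yields a sparse, regular spike train: information is
# carried in discrete events, not in continuously clocked arithmetic.
spike_times = lif_neuron([0.08] * 200)
```

In hardware, this sparsity is the energy win: silent neurons draw almost nothing, so activity (and power) scales with events rather than with network size times clock rate.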
Develop new logic and memory technologies based on carbon nanotubes (CNTs) or other nanostructures, 3D integration with fine-grained connectivity, and new architectures that immerse computation in memory (compute-in-memory).
Create computing architectures that use reversible logic, theoretically allowing computation with near-zero energy dissipation by avoiding information loss.
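The connection between reversibility and energy is Landauer's principle: only *erasing* information carries a mandatory thermodynamic cost, so logic built entirely from bijective gates has no such floor. A small sketch using the Toffoli (CCNOT) gate, a universal reversible gate, to make the point concrete:

```python
def toffoli(a, b, c):
    """Reversible CCNOT gate: flips target bit c iff controls a and b are
    both 1. The gate is its own inverse."""
    return a, b, c ^ (a & b)

# Enumerate all 3-bit inputs and check the mapping is a bijection:
# no two inputs collide, so no information is ever erased.
states = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
images = [toffoli(*s) for s in states]
assert len(set(images)) == 8                              # bijective on 3 bits
assert all(toffoli(*toffoli(*s)) == s for s in states)    # self-inverse

# Contrast: an ordinary AND gate maps four distinct inputs to one output
# bit, erasing information and incurring a Landauer cost per operation.
```

With fixed controls a = b = 1, Toffoli computes NOT on c, and embedding AND in its third output (with c = 0) keeps the inputs alongside the result; carrying those extra bits through the computation is the price of avoiding erasure.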
Engineer biologically inspired or living computing systems that use biological components to perform computation at very low energy levels.