Fundamental Development Gap Map v1.0

Welcome! This is a new web portal from the team at Convergent Research, built with the help of scientists and researchers in our ecosystem, to explore the landscape of R&D gaps holding back science and the bridge-scale fundamental development efforts that might allow humanity to solve them. To illustrate this, we’ve created a dataset showing the relationships between the following:

  • R&D Gaps: we’ve focused on “fundamental development” gaps that may require a coordinated research program to solve, typically by building a mid-sized piece of infrastructure.
  • Foundational Capabilities: tools, technologies, and data (mid-sized science infrastructure) that could fill the related R&D gap.
  • Resources: for each foundational capability, the initiatives and organizations working toward it; relevant research papers, roadmaps, and technology seeds; and ideas (proposals, whitepapers, essays, etc.) illustrating the capability.

This is just v1.0 of the map, so it isn’t comprehensive - at all! We’re putting it out to help you get an intuition for how folks in our ecosystem think about the scientific landscape, to spark discussion, and to get your constructive and critical feedback.

You can read more about the “how” and “why” behind it here, learn more about the contents and how to use it here, and suggest contributions to it here. Have other questions/suggestions? Reach out to us at gapmap@convergentresearch.org.

Augmentation technology, from better interfaces to AI tools to brain-computer interfaces (BCIs), can amplify human strengths, democratize high-skill work, enable greater oversight, and help make humans more economically capable. See BCI-related Foundational Capabilities:
  • Minimally Invasive Ultrasound–Based Whole Brain Computer Interface
  • Fully Noninvasive Read–Write Technologies
  • Micro- to Nano-Scale Minimally Invasive BCI Transducers
Develop approaches to protect ocean ecosystems. For example, studies have shown that shading corals during the four hottest hours of the day can significantly reduce bleaching.
Understand whether specific interventions in the Arctic, from Mixed-Phase Cloud Thinning to Sea Ice Thickening, can counter extreme warming and tipping points in the region.
New technologies to collect data about ice sheets and improve the efficiency with which we can use that data. For example, UAV-borne ice-penetrating radar systems.
Large open dataset of experimentally determined mechanical, thermal, electrical properties of millions of samples that consolidates published and crowdsourced data to enable ML models. This would augment initiatives like the Materials Project and OQMD, which are simulation-heavy.
Engineered microbes and plants for environmental remediation (e.g., superfund and landfill clean-up and mining) that can survive in toxic conditions and degrade diverse classes of pollutants while concentrating and mining valuable elements. Biocontainment risks need to be addressed.
An open dataset that links multiple modes of chemical characterization, integrating existing databases that are currently siloed or behind paywalls. The dataset should include structure, synthetic route, NMR/IR spectra, and bioactivity to enable truly holistic chemical AI that can predict synthesis routes and spectra from structure.
A community-wide, structured repository (PDB-equivalent) of how materials are made, including processing parameters, environments, recipes, and results, as well as procedure logs and outcomes from failed experiments.
Measure the biological effects of food components at the receptor, cellular, organ, and organism levels to understand their impact on human health.
Systematically catalog the chemical and biological components of foods and study their interactions with the human body at multiple scales—from receptors to whole organisms.
Surveillance networks for genetic mutations in bacteria and fungi, covering both foodborne pathogens and antimicrobial resistance trends.
Develop models to forecast which epitopes will dominate immune responses upon exposure to new antigens, guiding vaccine and therapeutic development.
Create computational models to predict and mitigate immune responses to biologic drugs, improving safety profiles.
Develop large libraries of tool compounds to systematically probe molecular interactions, aiding both drug discovery and toxicity prediction.
Adapt pharm-ome mapping approaches to environmental toxins to predict their biological impacts and improve safety assessments. For example, scalable solutions for rapid detection and neutralization of mycotoxins in food systems, which contaminate 25% of agricultural products and pose significant health risks (e.g., portable sensors and enzyme-, microbial-, or RNA-based detoxification systems).
Characterize microplastics in food and water and understand human exposure and impacts. Develop new technologies to degrade PET, polystyrene, and microplastics (e.g., microbe/enzyme systems).
Develop predictive models for ADME-Tox to lower drug candidate failure rates and increase clinical safety and efficacy.
Systematically map drug–target interactions (the “pharm-ome”) to better predict drug efficacy, side effects, and repurposing.
Broaden the scope and diversity of existing biobanks (e.g., UK Biobank) to include more global populations and additional longitudinal health measures.
Implement noninvasive, low-cost sampling methods (e.g., breath analysis and point-of-care nucleic acid sequencing) to collect repeated, high-resolution physiological data. This would augment the depth of data from initiatives such as the NIH All of Us research program.
To study interactions in complex systems, we need to measure multiple agents simultaneously, ideally with time-course data. Doing this in live aged organisms would require new tools.
Human long-term multimorbidity endpoint trials of compounds already known to be safe.
There are many other biological systems that are understudied. We need field building to drive greater study of important biological dark matter, e.g., extracellular matrix biology, pregnancy, thymic involution, chronic infections driving chronic disease, menopause biology, and many others.
Develop immortal model organisms beyond cell lines to enable the study of longevity and underlying mechanisms of aging. 
Implement pooled screening techniques directly in living organisms to test multiple intervention combinations concurrently in an aged context, accelerating discovery in complex disease and aging research.
Develop aging-relevant in vitro models and screen combinations of interventions (e.g., small molecules, gene therapies) using multi-omic and functional readouts to identify synergistic treatments that extend lifespan or promote regeneration.
Study and engineer endosymbiotic relationships (e.g., mitochondria) to better understand and manipulate cellular energy production, potentially offering new avenues to treat bioenergetic disorders.
Map the connectivity not only within the brain but also between the brain and peripheral organs to reveal integrated regulatory networks.
Create detailed, functional maps of hypothalamus/brainstem activation by specific peptides and hormones to link neural activity with physiological outcomes and decipher how the brain orchestrates complex physiological responses.
Utilize ultra-small microelectronic devices delivered via endovascular routes to target specific tissues (including the brain), bypassing the limitations of viral vectors and bulky implants.
Develop platforms (e.g., multi-channel data collection systems) to continuously monitor health parameters in diverse human cohorts, e.g., existing or planned clinical trials. 
Map the determinants of cellular migration in the body to block metastasis and control cellular delivery.
Develop improved gene therapy vectors that can be manufactured at scale, as well as completely novel therapeutic gene therapy delivery vehicles. 
Diversify who can be a funder of science by launching new Fast Grants-style programs.
We should design research institutions to support the particular kinds of research they need to house, not fit every square peg into the same round hole.
Accelerate clinical trials and support research into novel vaccine technologies that can be distributed in low-resource settings, especially those that do not require a cold chain.
There are various solutions that could help scientists spend less time fundraising and more time doing science. This could include new structures for research institutes, where researchers don’t need to apply for external grants, and new tools for decentralized science funding.
Promote studies conducted outside traditional clinical settings to gather real-world evidence more effectively.
Innovate and scale sanitation technologies to improve water, sanitation, and hygiene, to reduce disease transmission.
Advance public health by creating better models and strategies for intervention, integrating data-driven insights to inform policy and practice.
Lead exposure remains a critical but under-addressed public health challenge. Approximately 1 in 3 children globally have toxic levels of lead in their bloodstream, leading to developmental delays, cognitive impairment, and increased risk of chronic diseases. However, as it stands most countries do not do comprehensive, routine surveillance of blood lead levels. This is because the primary method for blood lead surveillance is costly and inconvenient.
Develop rapid, low-cost, integrated diagnostic tests for many diseases at once. From Jacob Trefethen’s essay: “I would count success in the diagnostic row as: a multiplex diagnostic for at least 3 pathogens (i.e. flu + COVID does not count), available over the counter for use at home. Either a respiratory panel (e.g. flu + COVID + strep throat) or a fever panel (e.g. malaria + dengue + typhoid) would count. An at-home multiplex STI panel would be great (e.g. chlamydia + gonorrhea + syphilis).”
We need to better understand the physiology of malnutrition and develop improved interventions.
New treatments for chronic infections (e.g., achieving a functional cure for hepatitis B) and novel monoclonal antibodies for diseases such as malaria. New formulations (e.g., one-time-dose, time-release delivery) could improve delivery and compliance in low-resource settings.
Implement federated data and differential privacy systems to aggregate clinical trial data from multiple sites while ensuring patient privacy.
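As a minimal sketch of the privacy half of this idea (the data, threshold, and epsilon below are invented for illustration), each site can release only a Laplace-noised count, so the central aggregator never sees exact per-site values:

```python
import math
import random

def laplace(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse-transform sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_count(records, predicate, epsilon: float) -> float:
    """Differentially private count: a counting query has sensitivity 1,
    so Laplace noise with scale 1/epsilon gives epsilon-DP."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace(1.0 / epsilon)

# Federated aggregation: each site computes its own noisy count locally
# and shares only that single number with the aggregator.
site_a = [52, 61, 48, 70, 55]   # toy per-patient measurements
site_b = [44, 58, 66, 49]
threshold = 50
total_estimate = sum(
    noisy_count(site, lambda x: x > threshold, epsilon=1.0)
    for site in (site_a, site_b)
)
```

Real deployments compose this with secure aggregation and a privacy budget tracked across queries; the sketch only shows the core mechanism.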
Vaccine distribution is also a significant challenge in low-resource settings.
Apply quasi-experimental designs to strengthen causal inference in clinical studies, improving evidence quality, and do this for wider and more unified datasets (e.g., existing medical records) and across many conditions, e.g., combined with federated data approaches. Historically it has taken 5-10 years for advanced methods to percolate into relevant areas of omics / biotechnology x clinical area. It is also changing a culture of thinking — that there exists a different kind of validation that...
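Difference-in-differences is one such quasi-experimental workhorse; below is a minimal sketch with invented numbers (a real analysis would add covariates, clustered standard errors, and parallel-trends checks):

```python
from statistics import mean

def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """Estimate a treatment effect as the treated group's pre/post change
    minus the control group's pre/post change (its counterfactual trend)."""
    treated_change = mean(treated_post) - mean(treated_pre)
    control_change = mean(control_post) - mean(control_pre)
    return treated_change - control_change

# Toy outcome data: both groups trend upward, but treatment adds ~3 units.
effect = diff_in_diff(
    treated_pre=[10, 12], treated_post=[15, 17],
    control_pre=[9, 11], control_post=[11, 13],
)
```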
Enhance early clinical studies by integrating predictive validation techniques to forecast outcomes and optimize trial design.
A combination of new tools and norms situated outside the traditional publishing pipeline: protocols for empowering and recognizing citizen science on social media, and new practices around micro- and nanopublishing to lower the barrier to publishing.
Promote and support frugal science projects that empower communities worldwide to participate in scientific research, especially in developing countries.
Support modern, open, and community-driven publishing platforms that challenge and replace outdated models.
Establish a community-driven, post-publication peer review system that supplements traditional publishing, enhancing transparency and accountability.
Infrastructure to support and incentivize the diverse curation of research through science social media and/or dedicated spaces. This infrastructure would support rapid dissemination of research and encourage broader exploration of the research landscape, mitigating the risk of a homogeneous research focus and maladaptive collective attention patterns in science.
Knowledge models that can facilitate reasoning by synthesizing and clarifying relevant information transparently from multiple domains. “Provide a semantic medium that is both more expressive and more computationally tractable than natural language, a medium able to support formal and informal reasoning, human and inter-agent communication, and the development of scalable quasilinguistic corpora with characteristics of both [scientific] literatures and associative memory”. LLMs have powerful c...
Intelligent databases and automatic probabilistic integration across multiple databases.
Build robust infrastructure to integrate and synthesize diverse scientific findings into cohesive, accessible formats for researchers.
Develop AI agents that can read, summarize, and integrate scientific literature, providing researchers and policymakers with synthesized insights.
Develop AI and multimodal LLM systems to automatically detect fraudulent research, flag suspicious publications, and improve overall scientific integrity. This matters especially as rapid advances in AI models make it feasible to generate inaccurate scientific content at scale.
Establish initiatives to systematically archive and preserve essential datasets from proprietary platforms, ensuring long-term accessibility and reproducibility.
Make it easier to calculate the cost-effectiveness of interventions without a large staff through AI-assisted methods.
Deploy comprehensive sequencing-based early warning systems to rapidly detect and attribute emerging bio-threats. It is important to distinguish detection of known pathogens (easy to find in sequencing data) vs. previously unknown ones (hard).
Understand the role of the microbiome in immunity against infection and develop the ability to engineer or transplant microbiome communities that protect against infection.
Innovate new materials and coatings for surfaces and textiles that actively reduce pathogen viability and transmission.
Develop next-generation, affordable, high-performance personal protective equipment to reduce transmission risks.
Enhance indoor environmental controls to reduce pathogen transmission through advanced sensor networks, UV/opto-acoustic disinfection, and improved HVAC systems.
Develop nasal sprays that could be applied daily for broad-spectrum protection against respiratory pathogens.
Use technologies such as rapid B-cell sorting and computational design to quickly characterize and discover antibodies and antibody-like molecules to combat emerging biosecurity threats.
Develop prototype vaccines or therapeutics for viruses in each viral family that infects humans, to be rapidly adapted for the next pandemic.
Improve epidemic surveillance and modeling by integrating data from multiple sources for timely, actionable insights.
Utilize the analysis of volatile organic compounds (VOCs) as an early detection tool to identify pathogenic outbreaks.
Establish scalable, decentralized vaccine production systems to rapidly deploy immunizations during outbreaks, reducing reliance on centralized facilities.
Capabilities to determine the geographical source of bio-threats and whether they stem from natural sources or human activity. 
Technologies to detect and modulate the host immune response across a broad range of infections to improve patient outcomes and treatment strategies.
Develop detailed economic models to better assess the costs and benefits of different bio-threat interventions and inform policy decisions.
Develop innovative vaccine platforms that can adapt to or be robust to rapidly mutating viruses (e.g., influenza, HIV, coronaviruses) using methods such as mosaic nanoparticles and mRNA cocktails. Safety and security considerations: https://www.sciencedirect.com/science/article/pii/S0264410X21001717 
Integrate hardware-based locks into DNA synthesizers to ensure secure operation and prevent unauthorized use. Enhance systems for detecting and reporting flagged orders—enabling cross-verification with other intelligence data—and secure AI-driven biodesign tools to mitigate potential biosecurity risks.
Implement advanced, potentially cryptographic (where appropriate), and/or adversarial AI-based DNA synthesis screening methods to prevent misuse and ensure biosecurity.
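One simplified pattern for this kind of screening (the window size and sequences below are toy values; real systems also handle reverse complements, near-matches, and salted or encrypted hashes) is to compare hashed k-mers of an order against a database that stores only hashes of hazardous subsequences:

```python
import hashlib

def kmer_hashes(seq: str, k: int) -> set:
    """Hash every length-k window of a DNA sequence."""
    return {
        hashlib.sha256(seq[i:i + k].encode()).hexdigest()
        for i in range(len(seq) - k + 1)
    }

def flag_order(order_seq: str, hazard_hashes: set, k: int) -> bool:
    """Flag the order if any of its k-mers appears in the hazard database.
    Simplification: ignores reverse complements and single-base mutations."""
    return bool(kmer_hashes(order_seq, k) & hazard_hashes)

K = 8  # real screens use much longer windows
hazard_db = kmer_hashes("ATGCATTACGGATCCA", K)  # hypothetical hazard sequence
benign_order = "GGGGCCCCTTTTAAAA"
risky_order = "TTTATTACGGATTTTT"  # shares an 8-mer with the hazard sequence
```

Because only hashes are distributed, synthesizers can screen orders without ever holding a plaintext list of hazardous sequences.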
Develop rapid test platforms that can be reconfigured within days or weeks for emerging pathogens, enabling quick self-testing during a pandemic. Address the limitations of traditional lateral flow tests—which rely on antibodies and take months to develop—by significantly accelerating the deployment of diagnostic tools.
Develop new broad-spectrum antivirals that can be used to treat or prevent infection from evolving viruses.
Adopt circular economy principles and advanced design strategies to build more resilient, robust, and sustainable supply chains.
Develop embedded watermarks in generative protein models to promote traceability of engineered sequences.
Develop next-generation antibiotics and novel antimicrobial compounds using advanced discovery platforms to stay ahead of evolving pathogens.
Develop systems that provide context to content, helping users understand the broader background and counteract misinformation without censorship.
Build on initiatives like Community Notes, a crowdsourced fact-checking system that attaches contextual annotations to tweets, to create robust community-driven evaluation tools for social media platforms. Users create and vote on annotations, while an open-source algorithm determines which context notes to attach.
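Community Notes itself scores notes with matrix factorization over the full rating matrix; as a much-simplified toy of the underlying bridging idea (the groups and ratings below are invented), a note could be required to win majority support within each viewpoint cluster rather than merely overall:

```python
def bridging_score(ratings):
    """ratings: list of (group, helpful) pairs for one note, where `group`
    is a coarse viewpoint cluster. The score is the *minimum* per-group
    helpful fraction, so a note only scores well if it bridges all groups."""
    by_group = {}
    for group, helpful in ratings:
        by_group.setdefault(group, []).append(helpful)
    return min(sum(votes) / len(votes) for votes in by_group.values())

# A note loved by one side and rejected by the other scores poorly...
partisan_note = [("A", True)] * 9 + [("B", False)] * 9 + [("B", True)]
# ...while a note with majority support in both clusters is surfaced.
bridging_note = ([("A", True)] * 7 + [("A", False)] * 2
                 + [("B", True)] * 6 + [("B", False)] * 3)

show_partisan = bridging_score(partisan_note) >= 0.5
show_bridging = bridging_score(bridging_note) >= 0.5
```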
Develop a comprehensive toolkit to reboot essential infrastructure and restore societal functions in the event of widespread failure.
Air-breathing fusion propulsion for single-stage-to-orbit (SSTO) reusable spaceplanes.
Develop tools that empower users to both create personalized information streams and collaboratively curate content. Users can set up custom feeds—filtered by semantic content, social network data, and engagement signals—to tailor the information they receive.
Develop digital tools that shield individuals from online scams and fraudulent schemes, and that detect, filter, or flag coordinated misinformation campaigns.
Methods to pressurize large surface areas, or positive-pressure habitation domes, will be necessary for living in most places off Earth.
Develop “systems to increase mutual understanding and trust across divides, creating space for productive conflict, deliberation, or cooperation”.
Develop recommender systems that prioritize content we’ll appreciate upon reflection, rather than content that only captures our immediate attention. 
Create market-based mechanisms that incentivize accurate fact-checking and hold sources accountable.
Assuming no faster-than-light (FTL) travel, the only way to get outside the solar system within a single lifetime is with cryosleep.
Global, distributed, crowdsourced public knowledge and knowledge graphs.
Implement frameworks that allow actors to reduce collaboration risks and costs by defining and enforcing precise flows of information. This allows a larger negotiating space for strategic actors dealing with advanced technologies. 
Build centralized platforms to monitor and analyze the information ecosystem, enabling better identification of misinformation trends.
Invest in the development of interstellar probe technologies that can traverse vast distances, opening the door to exploration beyond our solar system.
We need comprehensive methods to detect and quantify human disempowerment, including economic, cultural, and political metrics, as well as research and education.
Robust identity verification systems that safeguard personal privacy.
Invest in research exploring non-biological methods for planetary engineering, such as atmospheric modification and energy-efficient infrastructure.
Develop AI delegates that can advocate for people's interests with high fidelity, while also keeping up with the competitive dynamics that are driving human replacement.
Develop direct interventions for preventing accumulation of excessive AI influence:
  • Regulatory frameworks mandating human oversight for critical decisions, limiting AI autonomy in specific domains, and restricting AI ownership of assets or participation in markets
  • Progressive taxation of AI-generated revenues, both to redistribute resources to humans and to subsidize human participation in key sectors
  • Cultural norms supporting human agency and influence, and opposing AI that is overly auto...
Tools to forecast and monitor key thresholds or tipping points beyond which human influence becomes critically compromised, and the ability to measure effectiveness of intervention strategies.
Initiate dedicated research into biological terraforming strategies that utilize living organisms to modify planetary environments.
Develop next-generation voting systems and auditing protocols that are secure, transparent, and capable of supporting robust collective decision-making.
Develop AI-driven platforms that facilitate deliberative democracy by aggregating diverse perspectives, advancing community moderation, and guiding public decision-making processes to move beyond corporate and partisan pressures.
Everything from mining the Moon and using those resources to build things in space, to producing fuel off Earth.
Develop modern, scalable survey platforms that can efficiently capture and analyze public opinion data and other qualitative data for social sciences. New tools are needed to collect, analyze, curate, and model large-scale qualitative data.
A framework to help researchers identify and pursue meaningful questions. This includes support structures—such as fellowship programs to train students in problem-driven research workflows, and innovative funding mechanisms to empower researchers to focus on questions that truly matter.
Build tools to enable policy-makers to write mechanized (i.e. runnable software) versions, moving more of the subjective evaluation ahead of the action (rather than interpreting more things post-facto). This would leverage formal logic to streamline policy development and governance.
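To make the idea concrete, here is a minimal sketch of a mechanized policy: a hypothetical subsidy rule (the thresholds are invented) written as executable code that can be tested and analyzed before enactment rather than interpreted after the fact:

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    age: int
    annual_income: float
    dependents: int

def subsidy_eligible(a: Applicant) -> bool:
    """Hypothetical eligibility rule as executable policy: adults qualify
    if income is under a threshold that rises with each dependent."""
    threshold = 30_000 + 8_000 * a.dependents
    return a.age >= 18 and a.annual_income <= threshold

# Mechanized policy can be probed for edge cases up front, e.g. exactly
# at the age and income boundaries, instead of litigating them later.
edge_case = Applicant(age=18, annual_income=30_000, dependents=0)
```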
Develop novel construction methods—such as inflatable or modular architectures—that can be assembled in orbit to create new space stations more efficiently.
Tools for lesson planning and administrative tasks. Tools for administration of school systems, including managing resource distribution.
Develop sophisticated digital tutor systems that offer personalized learning assistance by adapting to individual learning styles using AI.
Apply satellite imagery and ML techniques to modernize archaeological surveys and analysis, enabling more rapid and systematic discoveries.
Utilize emerging commercial spaceflight companies to drive down costs and accelerate the development of new space stations through innovative design and manufacturing.
Develop simulation models to explore scenarios of profound economic change to help plan for disruptive shifts. For example: a future where extended youth replaces traditional retirement and end-of-life expenses are significantly reduced, assessing the economic impact of a major AI transition.
Utilize techniques from chaos and complex systems theory to develop more realistic economic models that account for non-linear dynamics.
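The logistic map is the standard minimal illustration of why this matters: in the chaotic regime, two nearly identical starting states diverge completely, defeating point forecasts (the parameters below are the textbook chaotic values, not an economic calibration):

```python
def logistic_step(x: float, r: float = 3.9) -> float:
    """One step of the logistic map, a classic chaotic system."""
    return r * x * (1.0 - x)

def trajectory(x0: float, steps: int = 50, r: float = 3.9):
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic_step(xs[-1], r))
    return xs

# Two 'economies' that start almost identically end up uncorrelated,
# which is why non-linear models emphasize distributions over forecasts.
a = trajectory(0.200000)
b = trajectory(0.200001)
max_gap = max(abs(x - y) for x, y in zip(a[-10:], b[-10:]))
```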
Create complex simulation models using AI agents that mimic real-world economic dynamics, providing a richer framework for policy analysis.
Explore novel architectures and settings for AI assisted learning
Design and implement missions specifically aimed at exploring Europa and/or Enceladus for signs of life, leveraging advanced detection and sampling technologies.
Bridge the translational research gap by adopting more operational, nonacademic approaches to development economics, moving beyond traditional models like JPAL.
Develop and refine prediction markets augmented by AI to improve future forecasting accuracy and better inform policy decisions.
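The classic automated market maker underlying many such designs is Hanson's logarithmic market scoring rule (LMSR); a minimal two-outcome sketch (the liquidity parameter b = 100 is arbitrary):

```python
import math

def lmsr_cost(shares, b: float = 100.0) -> float:
    """LMSR cost function C(q) = b * ln(sum_i exp(q_i / b))."""
    return b * math.log(sum(math.exp(q / b) for q in shares))

def lmsr_price(shares, i: int, b: float = 100.0) -> float:
    """Instantaneous price of outcome i (a probability; prices sum to 1)."""
    denom = sum(math.exp(q / b) for q in shares)
    return math.exp(shares[i] / b) / denom

def trade_cost(shares, i: int, amount: float, b: float = 100.0) -> float:
    """What a trader pays to buy `amount` shares of outcome i."""
    after = list(shares)
    after[i] += amount
    return lmsr_cost(after, b) - lmsr_cost(shares, b)

q = [0.0, 0.0]        # fresh two-outcome market: both outcomes priced at 0.5
cost = trade_cost(q, 0, 50.0)
q[0] += 50.0          # buying pushes the price of outcome 0 above 0.5
```

An AI-augmented version might use model forecasts to seed or trade in such a market; the scoring rule itself is what keeps prices interpretable as probabilities.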
No rock survives from the Hadean - but that is because Earth has plate tectonics. Statistically, there must be Hadean rocks waiting on ancient Lunar terrains for us to find (just as there are abundant Lunar meteorites on Earth). A plausible Earth mineral has already been identified in Apollo samples (https://www.sciencedirect.com/science/article/abs/pii/S0012821X19300202). They will be easy to spot due to their distinct mineralogy (and color). However, raking through the Lunar regolith to find t...
Utilize multiplayer online games as testbeds for large-scale economic experiments, enabling controlled studies of human behavior in dynamic environments.
Run controlled experiments—such as basic income trials and alternative market models for carbon pricing—to test and refine innovative economic structures.
Most major economic studies include data collection for 1-2 years, but the outcomes (e.g. education, child health) could plausibly impact large sections of one’s life.  There is little funding for followups on existing studies.
Almost all our knowledge of Hadean Earth comes from Hadean zircons, the “black-box recorder” of minerals. Zircons can incorporate small amounts of the ancient ocean. All known Hadean zircons come from a single site, and they are gathered artisanally by individual graduate students. The solution is to search for more Hadean zircons with the steady, systematic approach of (for example) a diamond company, and then screen massive numbers of zircons for ocean and atmosphere data.
Design and implement novel, decentralized models for public goods allocation and cooperation that overcome existing systemic inefficiencies.
Use advanced machine learning techniques to decode animal vocalizations and behavioral signals, enabling meaningful communication and insights into animal cognition. Note: Many animals (e.g. insects, cephalopods, amphibians) use non-vocal communication (bioluminescence, gestures, electroreception, olfaction). For olfaction see: https://www.osmo.ai/   
Systematically test the functions of ancient proteins from 3800 to 50 Mya to understand ancient biochemical capabilities and environmental constraints, informing evolutionary ecology.
Deploy cutting-edge imaging techniques to capture high-resolution details of natural nanostructures, revealing new insights into their form and function.
Innovate and deploy more efficient and robust systems for deep ocean exploration, enabling comprehensive study of deep-sea environments and their unique biology.
Implement dense, systematic DNA sampling in diverse ocean environments—especially areas with sharp gradients in temperature, depth, salinity, and pH (such as reefs, deep trenches, and hydrothermal vents)—to uncover new species and biological insights.
Atmospheric bio-aerosols could have applications in cloud seeding and other areas.
Use evolutionary algorithms and generative models to simulate human evolution processes in computational systems, driving more robust and adaptable AI.
Develop AI architectures that are inspired by the human brain's structure and functionality, enabling more flexible and general reasoning and planning.
Utilize formal verification techniques to synthesize software that is provably secure, reducing vulnerabilities and enhancing system robustness. Traditional programs in these areas have tended to assume that AI capabilities are saturating, leaving important avenues neglected. Efforts could break down into several key components:
  • Specification generation and validation tools
  • Code and proof generation systems
  • Tools that integrate formal verification into existing engineering workflows
  • ...
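Full formal verification relies on proof assistants and model checkers; as a toy stand-in (exhaustive checking, not a proof), the basic contract can be illustrated by validating a specification over a finite input domain (the `saturating_add` implementation and its spec below are invented for illustration):

```python
def check_bounded(prop, domain):
    """Exhaustively check a property over a finite domain; returns a
    counterexample if one exists, else None. Real verification tools
    certify *all* inputs via proof rather than enumeration."""
    for x in domain:
        if not prop(x):
            return x
    return None

def saturating_add(a: int, b: int, cap: int = 255) -> int:
    """The implementation under test: addition clamped at a cap."""
    return min(a + b, cap)

def spec(pair) -> bool:
    """Hypothetical specification: the result never exceeds the cap and
    never falls below the smaller input (for in-range inputs)."""
    a, b = pair
    result = saturating_add(a, b)
    return result <= 255 and result >= min(a, b)

pairs = [(a, b) for a in range(0, 256, 5) for b in range(0, 256, 5)]
counterexample = check_bounded(spec, pairs)   # None: the property holds
```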
Maintain decentralized control over large neural network training (like SETI@home or Wikipedia, but for large language models) to equalize access. However, this also introduces some AI proliferation and control risks.
Improve the ability of large language models, or systems built on them, to synthesize disparate information beyond their context windows and formulate new scientific hypotheses.
Implement Bayesian cognitive models and probabilistic programming and program synthesis techniques to endow AI with more human-like reasoning, planning, and decision-making capabilities.
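A minimal example of the enumeration flavor of this approach (the coin-weighing task and priors are invented): exact Bayesian inference over a small hypothesis space, the building block that probabilistic programming languages generalize to richer programs:

```python
import math

def posterior_by_enumeration(prior, likelihood, observation):
    """Exact Bayesian inference over a finite hypothesis space:
    posterior(h) is proportional to prior(h) * P(observation | h)."""
    unnormalized = {h: p * likelihood(observation, h) for h, p in prior.items()}
    z = sum(unnormalized.values())
    return {h: v / z for h, v in unnormalized.items()}

# Toy cognitive task: is a coin fair or biased toward heads, given "HHH"?
heads_prob = {"fair": 0.5, "biased": 0.9}

def coin_likelihood(obs: str, hypothesis: str) -> float:
    p = heads_prob[hypothesis]
    return math.prod(p if flip == "H" else 1.0 - p for flip in obs)

posterior = posterior_by_enumeration(
    prior={"fair": 0.5, "biased": 0.5},
    likelihood=coin_likelihood,
    observation="HHH",
)
```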
Develop techniques to ensure AI systems are resistant to "jailbreaking" or circumvention of built-in safety and control protocols.
Develop virtual life simulations powered by machine learning and modern GPU infrastructure to mimic the complex, evolved computations seen in biological systems.
Train AI models on data similar to what developing humans or animals actually experience.
Digital fortresses that enable sensitive data to be processed in a controlled, privacy-preserving environment. 
Develop AI-driven design tools that optimize manufacturing systems from the ground up, moving beyond traditional approaches to enable more efficient, automated production processes.
Develop advanced predictive systems that integrate diverse climate data to forecast tipping points more accurately.
Robust strategies for data integrity, anomaly detection, and defensive training protocols to mitigate situations where indirect data poisoning could lead to intentionally misaligned AI systems (not unlike “sleeper agents”).
Develop an integrated "bio lab of the future" that combines robotics, AI, and real-time data analysis to fully automate bioengineering tasks.
Implement AI-assisted systems for programming and controlling lab robots to automate bioengineering workflows and reduce the reliance on manual processes.
Utilize generative AI to automatically generate and optimize building designs and construction plans, streamlining the design process and reducing manual effort.
Observe emergent AI decision-making processes and cognitive patterns with fewer anthropomorphic assumptions.
Design mobile robots with streamlined, efficient designs. Reduce unnecessary degrees of freedom for simpler, cheaper, and more reliable mobility platforms.
Create advanced robot bodies that leverage novel materials, innovative design principles, and improved manufacturing and control techniques to enhance dexterity while reducing cost.
Develop a comprehensive multiphysics foundation model that can be applied across various mechanical systems, integrating physics-based models, experimental data, and machine learning for broad transferability.
Develop hardware-level governance mechanisms to enforce safety and compliance in AI systems, ensuring robust operational constraints. This includes tamper-proof hardware. 
Develop novel actuator technologies that combine hybrid or mode-switching capabilities with power-dense magnetic actuation. These improvements would allow robots to seamlessly transition between compliant and stiff modes, mimicking biological muscle performance while enhancing energy efficiency.
Employ lithographic techniques and advanced nanofabrication to create tiny robots with high precision. Scalable production of nanoscale robotic systems could enable breakthroughs in medicine (e.g., targeted drug delivery) and materials science.
Advance the use of underactuated robotic systems, which use fewer actuators than degrees of freedom, to create compliant, adaptable, and significantly cheaper robots. 
Use AI to enhance the interpretability of other AI systems, creating tools that automatically explain and verify AI behavior.
Build the robots that build the robots. The current paradigm of automated manufacturing is for machines (from robot arms to presses) to make things smaller than themselves, which quickly runs into scaling limits.
Develop multimodal electronic skin (e-skin) that enables robots to detect force vectors, slippage, and temperature across large surface areas. Enhanced tactile perception will facilitate fine-grained control and more adaptive interactions with the environment.
Integrate hardware and software design processes to co-evolve robot bodies alongside control policies. This approach reduces inefficiencies caused by decoupled design methods and can unlock entirely new performance regimes.
Study the neural basis of human social instincts to inform AI design, ensuring that AI systems can safely interpret and emulate human social behavior.
Automating welding requires an improved ability to verify weld quality.
Electrochemical machining (ECM) creates complex shapes with high precision, but is costly. Lower cost ECM could make machined titanium as cheap as aluminum.
Develop and implement AI architectures with separable, auditable world models; where safety can be specified in terms of the state space of the model; and proposed AI outputs come with proofs that the output does not leave the safe region of the world model’s state space.
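One way to make this concrete is a runtime "shield" around the world model: the system only emits an output if the model-predicted successor state stays inside the declared safe region. The sketch below is illustrative only; the one-dimensional dynamics, bounds, and function names are all invented for the example, and a real system would attach an actual proof object rather than a runtime check.

```python
# Hypothetical world-model "safety shield": a proposed action is only emitted
# if the model-predicted next state stays inside a declared safe region of the
# state space. All names and dynamics here are illustrative.

def step(state, action):
    """Toy world model: state is a 1-D position, action a displacement."""
    return state + action

def in_safe_region(state, lo=-10.0, hi=10.0):
    """Safety specified directly on the world model's state space."""
    return lo <= state <= hi

def shielded_output(state, proposed_action):
    """Return the action only if the predicted successor state is safe;
    a real system would emit a proof, not just a runtime check."""
    predicted = step(state, proposed_action)
    if in_safe_region(predicted):
        return proposed_action, predicted  # action plus the checked prediction
    raise ValueError("proposed output leaves the safe region of the world model")

action, predicted = shielded_output(9.0, 0.5)   # accepted: predicted state 9.5
# shielded_output(9.0, 2.0) would raise: predicted state 11.0 is unsafe
```

The separable world model is what makes the check auditable: the safety predicate is stated over model states, not over opaque network internals.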
Systematically collect and curate large training datasets focusing on tactile interactions to enhance simulation accuracy and real-world performance.
Integrate detailed physics models with generative AI techniques to improve the accuracy of simulations and facilitate effective transfer of robotic behavior from simulation to reality.
If we could create more manufacturing systems with low capex but high energy needs, we could take advantage of drastically cheaper solar. One particularly useful example would be desalination.
Close the loop between AI and human-guided interactive theorem provers by using reinforcement learning to refine proofs based on feedback from proof assistants.
Build open, composable Earth System Simulation infrastructure based on high-resolution data. De-silo climate data across ESMs, IAMs, and observational datasets to increase collaborative potential. Traditional ESMs are built in legacy programming languages, which raise a barrier for new entrants to the field and impede the use of hybrid machine-learning techniques and modern computing architectures. Collect high-resolution earth system data: Today’s global models and reanalyses are at tens of ...
Robot foundational models can address major obstacles in robot learning and enable training on action-free data including video. This is essential for enabling reasoning about novel situations and robustly handling real-world variability.
New logic and memory technologies based on CNTs or other structures, 3D integration with fine-grained connectivity, and new architectures for computation immersed in memory
Retrofit existing vessels and aircraft with advanced sensors to systematically measure aerosol–cloud interactions in situ, improving our understanding and models.
Enhance predictive models for solar flares using advanced data analytics and observation techniques to better forecast solar activity. Note: NASA spends $0.8 bn/yr on heliophysics. 
Create more robust integrated assessment models that minimize ungrounded economic assumptions and better capture sensitive intervention points and amplification mechanisms in socioeconomic and political systems.
Use AI to design the next generation of hardware for AI.
Explore the concept of using satellite constellations (like Starlink) to perform atmospheric tomography, thereby building a 3D picture of atmospheric dynamics.
Deploy specialized platforms in the stratosphere to gather high-resolution data on atmospheric processes and composition.
Develop improved sensor technologies for airborne particulates, e.g., hyperspectral and lidar-based remote sensing for aerosol particle size, type, and radiative forcing potential
Biologically inspired or living computer systems that use biological components to perform computation at very low energy levels.
Global ARGO-like sensors for deep ocean currents. We have relatively sparse ocean data compared to atmospheric data. Initiatives like Argo floats (~4,000 drifting sensors) have collected over two million ocean profiles of temperature and salinity, providing a crucial 3D view of the oceans. Expanding such efforts (more floats, deeper measurements, biogeochemical sensors) and releasing the data in unified formats could enable AI to model ocean currents, carbon uptake, and climate patterns like E...
Develop dynamic models that incorporate microbial, hydrological, and climate-driven processes to better capture methane/N₂O feedback loops.
Deploy networks of in-situ and remote sensors to monitor emissions across key ecosystems like thawing permafrost, peatlands, and tropical forests. 
Computing paradigms based on thermodynamic principles, where computation is driven by energy gradients and can operate at lower energy costs by harnessing reversible and low-energy processes.
Establish protocols and infrastructure for banking backups of power transformers to mitigate the impact of solar flare-induced disruptions.
Develop detailed models that capture the local impacts of climate change (e.g., heatwaves) and implement responsive strategies to minimize harm and adapt to changing conditions.
Hardware for probabilistic computation, which can perform certain tasks more energy-efficiently by embracing uncertainty.
Explore strategies to divert or mitigate the impact of hurricanes using advanced atmospheric control methods.
Assess the feasibility of using cloud seeding techniques to stimulate precipitation and modulate local weather conditions in a controlled manner.
Create computing architectures that use reversible logic, theoretically allowing computation with near-zero energy dissipation by avoiding information loss.
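As a concrete illustration of why reversibility matters: a reversible gate such as the Fredkin (controlled-SWAP) gate destroys no information, and Landauer's principle ties the minimum energy cost of computing to bit erasure. The toy Python below, not tied to any specific hardware proposal, shows the gate is its own inverse and can still embed ordinary logic such as AND.

```python
# Illustrative Fredkin (controlled-SWAP) gate, a universal reversible logic
# gate. Because no information is destroyed, the gate is its own inverse,
# which is what in principle permits near-zero-dissipation computing.

def fredkin(c, a, b):
    """If control bit c is 1, swap a and b; otherwise pass them through."""
    return (c, b, a) if c else (c, a, b)

# Reversibility: applying the gate twice returns the original bits.
for bits in [(c, a, b) for c in (0, 1) for a in (0, 1) for b in (0, 1)]:
    assert fredkin(*fredkin(*bits)) == bits

# Ordinary logic embeds reversibly: the third output of fredkin(x, y, 0)
# equals x AND y, with the other outputs carrying the "garbage" needed to
# undo the computation.
assert all(fredkin(x, y, 0)[2] == (x & y) for x in (0, 1) for y in (0, 1))
```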
Conduct controlled, scaled experiments to validate the basic science and engineering principles behind emergency climate interventions such as sunlight reflection modification and atmospheric methane removal.
Develop and test intervention strategies aimed at stabilizing glaciers and mitigating associated climate impacts.
Develop switching technologies that operate at millivolt levels, significantly reducing the energy required for signal processing and computation.
Understand whether a planetary sunshade could be viable
Measurement, reporting, and verification (MRV) and foundation modeling for solar radiation modification.
Explore superconducting hardware to achieve brain-inspired computing with drastically reduced energy consumption, scaling to large networks.
Create computational and experimental platforms tailored to validate, measure, report, and verify carbon removal and its environmental impacts in natural systems such as the ocean or soil. Beyond MRV, methods to valorize CDR at scale are needed.
Develop systems to oxidize diffuse atmospheric methane and nitrous oxide.
Leverage brain-inspired neuromorphic hardware to perform computation more efficiently, emulating the low-energy operation of biological neural networks.
Reduce methane production from cows by modifying or removing the methanogen microbes in their rumen:  • Microbe-targeting vaccines • Gene engineered cow microbiome • Highly specific antibacterials
Develop novel hardware architectures optimized for deep learning and artificial intelligence that dramatically reduce energy consumption compared to current systems.
Develop and scale advanced recycling methods, especially for plastics.
Innovate and deploy efficient, scalable methods for cleaning up ocean pollution to restore marine ecosystems and improve ocean health.
Implement high-precision monitoring systems to measure methane production from livestock, enabling optimized agricultural practices.
E-waste represents a significant opportunity to recapture and reuse rare earth elements that are in short supply. 
Standardize a broadly capable design, and create a space telescope factory to produce modular components at scale, dramatically reducing the cost and time required for mission development.
Leverage the reduced launch costs enabled by vehicles like Starship and employ modular assembly techniques to build telescopes more rapidly and economically.
Develop high-strength permanent magnets not made of rare-earth elements. Currently the high-strength magnets underpinning many technologies (e.g., hard disk drives, mobile phones, electric vehicle motors) are all made out of neodymium, a rare earth element at risk of supply chain shortages and environmental issues. 
Develop and launch independent planetary science missions to explore key astrobiological questions. Relying solely on traditional government agencies like NASA unnecessarily limits planetary science and astrobiology research.
Utilize advances in commercial components and software to accelerate the development cycle of telescopes.
Create an industrial center of excellence focused on the practical separation of Lanthanides to distribute this knowledge as a public good.
Space-based Laser Interferometer Gravitational-Wave Observatories (LIGOs) allow extremely large detectors to study regions of the gravitational wave spectrum that are inaccessible from Earth.
Investigate new methodologies to detect high-frequency gravitational waves, potentially unveiling phenomena that are invisible to current detectors.
Carbon fiber could potentially replace steel in many applications and sequester atmospheric carbon rather than emitting it, if we could make enough of it cheaply enough. Arbitrarily long carbon nanotubes would enable tethers with tensile strength near the limits of physics, unlocking things like space elevators. Scaling the production of conductive carbon materials could potentially replace copper.
Even small optical interferometers in space could vastly exceed the sensitivity of ground-based systems and directly image Earth-like planets around Sun-like stars, but this requires advances in precision (<1 micron) formation-flying technology for spacecraft. [This is both an engineering bottleneck and a scientific bottleneck]
An exceptionally large number of compelling signals live in this band, including the elusive intermediate-mass black holes, white dwarf mergers (putative SN Ia progenitors), tidal disruption events, high-precession and high-eccentricity systems, neutron star merger early warning, and more.
Methods to build large-scale structures from cells and proteins.
Develop and deploy neutron microscopy techniques to image material structures at high resolution, benefiting from neutrons’ deep penetration and sensitivity to light elements.
Develop compact particle accelerators that can be operated on a benchtop scale, reducing both cost and size while retaining necessary performance for scientific and medical applications.
Next generation, high performance protein-based fibers created through new spinning processes that can align and control the molecular assembly of the final fiber, taking advantage of protein’s unique capabilities.
Design and execute key experiments that probe quantum gravity phenomena without requiring massive accelerators, leveraging innovative, cost-effective approaches.
Encourage and fund rigorous experimental investigations into metallic hydrogen superconductivity, ensuring data integrity and reproducibility.
Develop magneto-inertial confinement strategies that combine magnetic fields with inertial forces to better confine plasma for fusion.
Implement AI-based control systems to dynamically stabilize and confine plasma during fusion reactions, improving overall efficiency and stability.
Develop integrated computational frameworks that combine quantum chemistry, reaction potential modeling, experimental data, and ML generative models to describe and direct crystal growth at the atomic level.
Develop comprehensive, high-quality datasets and predictive models to better understand and forecast animal movements and biodiversity shifts. Use advanced computational and theoretical models that capture how species interact, cascade through ecosystems, and ultimately influence stability or collapse. These models will help identify key feedback loops and thresholds, enabling targeted intervention before degradation accelerates.
Create and implement novel mathematical models and computational frameworks that can more accurately simulate and predict turbulent flows.
Design and implement experiments that rigorously test low-energy nuclear reaction (LENR) principles, providing clear data on reaction mechanisms and viability. This is highly speculative and at this point unlikely to yield practical LENR; see: https://coldfusionblog.net/2019/03/13/the-case-against-cold-fusion-experiments/
Utilize deep learning and computational modeling to predict and discover millions of new materials, expanding our understanding of what can exist.
Secure and restore pollination services critical to food systems and biodiversity. This involves: • Building ecological models that map plant–pollinator interactions and forecast vulnerability or collapse points. • Deploying global pollinator monitoring systems using visual, acoustic, and eDNA sensors paired with AI to track pollinator diversity and behavior. • Designing landscape-level interventions such as habitat corridors, floral resource planning, and pesticide regulations to boost wild po...
Develop monitoring techniques to detect early-warning indicators—such as critical slowing down—that suggest an ecosystem is approaching a tipping point. Establish robust, scalable networks for real-time ecological monitoring using integrated technologies. Deploy sensor arrays that combine soundscapes, environmental DNA (eDNA) sampling, and satellite data fusion to continuously assess ecosystem health across diverse regions.
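As a sketch of what a "critical slowing down" detector might compute: rising variance and rising lag-1 autocorrelation in a monitored signal are the classic early-warning statistics. The example below is a toy, with a synthetic AR(1) series standing in for real sensor data; the window size and the resilience-loss schedule are arbitrary choices for illustration.

```python
# Toy early-warning detector: compute variance and lag-1 autocorrelation over
# a sliding window. Sustained increases in both ("critical slowing down") are
# the indicator. The series below is synthetic, not real ecological data.
import random

def lag1_autocorr(xs):
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs)
    if var == 0:
        return 0.0
    cov = sum((xs[i] - mean) * (xs[i + 1] - mean) for i in range(n - 1))
    return cov / var

def early_warning(series, window=50):
    """Yield (variance, lag-1 autocorrelation) for each sliding window."""
    for start in range(len(series) - window + 1):
        chunk = series[start:start + window]
        mean = sum(chunk) / window
        var = sum((x - mean) ** 2 for x in chunk) / window
        yield var, lag1_autocorr(chunk)

# Synthetic example: an AR(1) process whose memory (phi) drifts toward 1,
# mimicking an ecosystem losing resilience as it nears a tipping point.
random.seed(0)
x, series = 0.0, []
for t in range(400):
    phi = 0.1 + 0.8 * t / 400          # autocorrelation ramps up over time
    x = phi * x + random.gauss(0, 1)
    series.append(x)

stats = list(early_warning(series))
# Both indicators should read higher late in the record than early on.
```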
Develop more efficient assay platforms to test the properties of materials, thus enabling faster feedback and iteration in materials discovery.
Track and forecast the spread of invasive species and simulate containment strategies. 
Leverage machine learning models combined with physics-based property prediction to iteratively explore the materials space using automated, self-driving laboratory platforms, to find things like higher temperature superconductors or topological materials. New designs are needed to minimize large capital expenditures and integrate flexible, modular components that can be rapidly repurposed for new experiments and are robust to variations and error handling.
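A minimal sketch of the closed loop such a self-driving lab runs, with everything hypothetical: a toy objective stands in for the instrument, a 1-nearest-neighbor surrogate for the ML model, and a simple value-plus-distance acquisition rule for the explore/exploit policy.

```python
# Closed-loop "self-driving lab" sketch: surrogate + acquisition rule picks
# the next experiment, the (simulated) instrument runs it, and the result
# updates the model. Objective, grid, and surrogate are all stand-ins.
import math

def measure(x):                       # pretend instrument: unknown property
    return math.sin(3 * x) + 0.5 * x  # peak near x ~ 0.58 on [0, 2]

candidates = [i * 0.02 for i in range(101)]      # composition grid on [0, 2]
tried, results = [0.0, 2.0], [measure(0.0), measure(2.0)]

def acquisition(x):
    """1-nearest-neighbor prediction plus a distance bonus (explore/exploit)."""
    d, y = min((abs(x - t), r) for t, r in zip(tried, results))
    return y + 1.0 * d               # the weight on d trades off exploration

for _ in range(15):                  # budget of 15 robot experiments
    x_next = max((c for c in candidates if c not in tried), key=acquisition)
    tried.append(x_next)
    results.append(measure(x_next))

best_x = tried[results.index(max(results))]   # best composition found so far
```

A real platform would swap in a trained property predictor and a principled acquisition function (e.g. expected improvement), but the propose-measure-update loop is the same shape.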
Conduct detailed studies of existing closed ecosystems to understand the interactions and feedback loops that enable self-sustainability.
Employ laser direct writing techniques to pattern chips with high precision, offering an alternative to traditional lithographic methods.
Create well-designed experimental testbeds that simulate key ecosystems, including closed ecosystems, allowing for controlled experimentation and refinement of restoration and life-support strategies.
Use digital techniques to plan and assemble chip components, integrating computational design with physical fabrication.
Utilize cutting-edge stem cell and mammalian genome editing techniques to facilitate de-extinction and promote genetic diversity in repopulating ecosystems. Are we preserving the genomes of critically endangered species to enable future de-extinction?
Explore methods to assemble chips using randomized DNA as a templating or assembly tool, allowing for scalable production with inherent molecular diversity.
Develop modular electronic components at the nanoscale, enabling flexible, low-cost assemblies with high molecular diversity.
Develop comprehensive genomic mapping initiatives for global microbiomes to catalog species essential to Earth's biosphere and inform conservation efforts.
Develop methods for producing smaller, more diverse assemblies using patterned techniques that allow for greater molecular variability.
Broaden the capabilities of existing chip fabrication facilities to produce a wider range of devices, potentially reducing costs and expanding functionality.
Extend work on ML force fields to charge transfer problems in external potentials, enabling in-silico discoveries in batteries, electrolysis, carbon capture, biochemistry and the origins of life
Current fabrication methods allow us to work at macroscopic scales (10^0 m) down to the nanometer scale (10^-8 m) with photolithography, and further down to the atomic scale (10^-10 m) with proteins. However, directly bridging from macroscopic to atomic scales (10^0 m to 10^-10 m) for nanotechnology applications remains a significant challenge. A key obstacle is the lack of effective interfaces between single addressable electrodes and proteins.
Develop novel optical and opto-acoustic techniques that reduce scattering, enabling deep-tissue imaging without expensive MRI. These methods aim to improve resolution and enable whole-brain activity mapping as well as cost-effective “body scanners” for diagnostics.
Develop the instrumentation and controls required to directly measure quantum effects in biological systems.
Develop methods to measure and predict allosteric regulation mechanisms across the proteome, capturing dynamic conformational changes that impact protein function.
Develop label-free biosensing techniques that leverage vibrational signatures to capture complex cellular information without the need for physical probe delivery.
Develop magnetic imaging (magneto-genetics) approaches as an alternative modality that circumvents the limitations of optical scattering, potentially offering noninvasive imaging of deep tissue structures.
Utilize nuclear magnetic resonance (NMR) techniques to capture the dynamic ensembles of intrinsically disordered proteins, enabling accurate modeling of their fluctuating structures.
Explore quantum non-demolition x-ray imaging (or ghost imaging) methods that use quantum correlations to image samples with minimal perturbation, preserving sample integrity for longitudinal studies.
Develop a quantum electron microscope that leverages quantum principles to achieve high-resolution imaging while minimizing sample damage, enabling repeated measurements on the same specimen.
Engineer cells to incorporate imaging labels metabolically, thereby eliminating the need for external probe delivery and enabling noninvasive imaging.
Barcode the orientations of small proteins in cryo-EM
Develop comprehensive spectroscopic databases to train AI models for both forward and inverse predictions, enabling more accurate structure determination.
Employ microwave spectroscopy to establish a direct one-to-one mapping between molecular structure and spectrum, preserving detailed structural information.
Develop software for simulating stochastic thermodynamics at scale with modern hardware accelerators (GPUs, etc), going beyond Gillespie algorithm.
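For reference, the Gillespie algorithm this card proposes to move beyond is exact but inherently serial, simulating one reaction event at a time. A minimal implementation for a toy birth-death process is below (the rates and horizon are illustrative); scaling up typically means approximations like tau-leaping or batching many independent trajectories across GPU threads.

```python
# Reference Gillespie SSA for a toy birth-death process:
#   X -> X+1 at rate k1 (production), X -> X-1 at rate k2*X (decay).
# Exact but one-event-at-a-time; the serial baseline to accelerate.
import random

def gillespie(x0=0, k1=10.0, k2=1.0, t_end=50.0, seed=1):
    rng = random.Random(seed)
    t, x = 0.0, x0
    while t < t_end:
        a1, a2 = k1, k2 * x          # propensities of the two reactions
        a0 = a1 + a2
        t += rng.expovariate(a0)     # waiting time to next event ~ Exp(a0)
        if rng.random() * a0 < a1:   # choose which reaction fired
            x += 1
        else:
            x -= 1
    return x

# The stationary distribution is Poisson(k1/k2), so endpoints hover near 10.
samples = [gillespie(seed=s) for s in range(30)]
mean_x = sum(samples) / len(samples)
```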
Modify the genetic code or structure of membrane proteins so they become soluble in water, thereby enabling more robust experimental analysis and practical applications.
Utilize hybrid quantum algorithms to perform quantum chemistry simulations, capitalizing on recent progress in quantum computing.
Leverage AI techniques to accelerate quantum chemistry calculations, improving the speed and accuracy of electronic structure predictions.
Use neural network potentials and force fields to enhance molecular dynamics simulations, making them more efficient and accurate.
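One reason neural network potentials slot in cleanly: standard MD integrators such as velocity Verlet only ever query forces, so a learned force field can replace the analytic one without touching the integration loop. In the hypothetical sketch below, a 1-D harmonic force stands in for the learned model.

```python
# Sketch of where a learned potential plugs into molecular dynamics: the
# integrator only needs a force function, so a neural network force field
# could replace `force` below without changing the velocity-Verlet loop.
# A 1-D harmonic oscillator stands in for the (hypothetical) learned model.

def force(x, k=1.0):
    return -k * x                     # swap in: force = learned_model(x)

def velocity_verlet(x, v, dt=0.01, steps=1000, m=1.0):
    a = force(x) / m
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt
        a_new = force(x) / m
        v += 0.5 * (a + a_new) * dt
        a = a_new
    return x, v

x, v = velocity_verlet(1.0, 0.0)
energy = 0.5 * v * v + 0.5 * x * x    # should stay near the initial 0.5
```

Velocity Verlet is symplectic, so total energy stays bounded near its initial value, a property any drop-in learned force field must preserve in practice.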
Integrated analytic tools to study clinical, molecular, and environmental datasets to identify patterns and infer the underlying causes of disease.
Create accurate, thoroughly benchmarked calculations for elements beyond the second row of the periodic table, for clusters of 3 or more atoms. Could complement 29.5.
Produce a high-quality experimental dataset to validate molecular simulation techniques and transition to frictionless reproducibility.
Develop application-specific integrated circuits (ASICs) tailored for quantum chemistry calculations, aiming to combine efficiency with high computational power.
Conduct comprehensive “-omics” studies of breast milk to capture its molecular and microbial composition and its role in infant development.
Utilize nanoscale laser ionization techniques to achieve atom-by-atom imaging of materials, providing unprecedented resolution of atomic structures.
Support open alternatives to Reaxys and the Cambridge Structural Database to the point where they are equal or superior in quality
Systematically profile cellular mechanisms by which early-life malnutrition affects physiology across the lifespan, providing insights into long-term developmental impacts.
Build solid-phase synthesizers capable of the universal, programmable synthesis of polymers such as proteins, peptides, spiroligomers, carbohydrates, and RNA mimetics.
Map and model chemical reactions and catalysts more comprehensively to better understand reaction mechanisms and discover novel catalysts.
Make X-ray lasers more accessible and compact
New delivery mechanisms are needed to target the desired tissues and avoid uptake by hepatocytes and macrophages. Off-target uptake necessitates higher dosing, increasing toxicity risks.
Implement a “chemputer” system to automate chemical synthesis processes, reducing human intervention and increasing reproducibility.
Humans have ~400 odorant receptor genes that encode functional GPCR proteins. Binding ligands have been identified for ~80. Mapping the binding profiles of olfactory receptors from humans and other animals would enable novel biosensors and reveal novel therapeutic targets.
Develop an integrative model that explains how the olfactory system decodes and classifies complex chemical stimuli, linking molecular features to perceived odors.
Use new enzymes to enable the use of synthetic genetic polymers. Note: don’t make mirror life https://www.science.org/doi/10.1126/science.ads9158
Systematically investigate cell-type and tissue specific targets equivalent to ASGPR for the liver. 
Develop and utilize a better set of standardized building blocks to enable modular synthesis, making the assembly of complex molecules more efficient and scalable.
Advance the field of chemistry automation through additional robotics and high-throughput platforms, enabling scalable synthesis processes.
Living human bodies created from stem cells without neural components could be transformative for medical research and drug development. There are many open questions: for example, the long time maturation takes, and whether a body would function without neural components.
Fast on-demand complex peptide manufacturing
Direct synthesis of long DNA could still become much cheaper. Biosecurity consideration: implementation should be governed by security measures such as: • Trustless DNA synthesis screening • Hardware lock for DNA synthesizers. See these capabilities under the “Risks of Malicious Bioengineering” bottleneck.
Synthesize massive biopolymer libraries according to the statistics of a generative model
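A toy version of this idea: fit a simple generative model (here a first-order Markov chain, a stand-in for a real deep generative model) to example sequences, then sample a library whose statistics echo the model's. The training peptides and smoothing choice below are invented for illustration; a real pipeline would feed the sampled library to a synthesizer.

```python
# Fit a first-order Markov model to example peptide sequences, then sample a
# large library matching its statistics. Training data and alphabet are made
# up; a real pipeline would use a learned deep generative model.
import random

train = ["MKKL", "MKQL", "MAKL", "MKKV"]          # illustrative peptides
alphabet = sorted(set("".join(train)))

# Transition counts with add-one smoothing so every move stays possible.
counts = {a: {b: 1 for b in alphabet} for a in alphabet + ["^"]}
for seq in train:
    prev = "^"                                     # start-of-sequence token
    for aa in seq:
        counts[prev][aa] += 1
        prev = aa

def sample_seq(length=4, rng=random.Random(7)):
    prev, out = "^", []
    for _ in range(length):
        row = counts[prev]
        aa = rng.choices(list(row), weights=list(row.values()))[0]
        out.append(aa)
        prev = aa
    return "".join(out)

library = [sample_seq() for _ in range(1000)]
# The library's first-position statistics echo the training data, where
# every sequence starts with M.
```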
Develop engineering methods for cryopreserving and safely rewarming large organs or whole bodies, which could revolutionize transplantation and long-term tissue storage.
Leverage generative AI to design new enzymes with novel functions by predicting active conformations and optimizing catalytic activity. Especially for complex redox reactions and reactions that go beyond classic biological catalysis.
Develop techniques for “protein carpentry” that allow for the precise construction and remodeling of protein structures to yield dynamic, functional enzymes.
Physics-based control across the cell membrane of DNA synthesis inside the cell as a route to maximize “bandwidth across the cell membrane”
Artificial wombs could revolutionize neonatal care and reduce preterm birth complications. They are an early stage research area with various positive biomedical externalities. 
Biosynthesis of C allotropes, e.g., (strong) polyynes and chiral CNTs; biosynthesis of Si allotropes, e.g., crystalline Si from SiO2 for photovoltaics
Expand the operating range of biological enzymes to new solvents, enabling anhydrous enzymes, gas phase and vacuum biochemistry
Expand the chemical diversity of proteins by incorporating non-canonical amino acids, thereby enabling functions beyond those accessible with natural amino acids.
High-throughput testing in a single animal would enable entire studies to be run in rare/exceptional animals, e.g. with spontaneous disease mimicking humans or species not suitable for research labs.
Create integrated platforms that allow for real-time monitoring, modeling, and precise manipulation of developmental processes, including with bioelectric and other novel control layers. Goals could include: genome encoding of symmetry (e.g., 2- to 8-fold radial), accurate size ratios, branching pattern codes, and natural and synbio eutely and other counting mechanisms
Develop advanced technologies to facilitate genetic manipulation in plants, broadening the range of programmable organisms. 
Create platforms that simplify the process of identifying and adapting new microbial hosts, providing recipes for their use in synthetic biology applications. This could take advantage of biocontainment approaches to enhance safety.
A greater variety of small animal models (along with corresponding suites of tools such as species-specific antibodies, annotated genomes, transgenics, etc.) would enable novel biological insights and could be used to develop models of complex human diseases. Additional rodent models, as well as models beyond mouse and rat, would be highly enabling. More realistic models are also critical for aging research, as many diseases of aging are studied in young animals. Analytical tools are also important ...
Develop synthetic meat production processes based on fungus engineering to create affordable, ethical, and sustainable protein sources.
Decode the molecular signals governing how cells communicate and organize to form complex 3D structures, a fundamental process in tissue formation, organ development, immune cell targeting, etc.
Grow human organs in animals to study disease and drug response more accurately. Use advanced stem cell technologies to grow patient-specific tissues and organs in animals for transplantation. Human organs could also be maintained ex vivo (“in a vat”) for research purposes. Perfused organ systems (including cadaver-based models) that maintain the structure and function of human tissues ex vivo would also be enabling.
Develop modular bioreactor designs that can be easily scaled up, offering flexibility and improved efficiency in industrial bioproduction.
Implement alternative strategies that enhance the ethical aspects of food production without compromising cost-performance.
Lab-grown 3D organ tissues have become an established additional model system to recapitulate aspects of human biology. This technology could also enable the development of functional organs for transplant. It is also important to make tissue models that recapitulate the effects of aging, or a form of “accelerated lifetime testing”.
Utilize cell-free platforms that enable synthetic biology outside of living cells, thereby bypassing the limitations of evolved cellular machinery.
Develop entirely synthetic cells constructed from the ground up, rather than relying on evolved cell systems. This approach would enable precise control over cellular functions and properties, although risks must also be considered, e.g., https://www.science.org/doi/10.1126/science.ads9158
Replace steam-sterilized steel bioreactors with something more scalable
Infrastructure and coalition to collect extra blood samples during trials and apply biomarkers for different diseases to get more value from each trial.
Mine biomedical knowledge to decide which neglected assets to run trials on for which conditions
Design polymers that are not limited to amino acids and are directly specified in three dimensions, enabling precise positional control in synthesis and potentially broader or more robust functions than proteins.
Create enzymes specifically engineered via quantum chemical methods and de novo protein design, which can precisely catalyze reactions at defined positions.
Recruitment is a bottleneck that could be addressed by socio-technical programs. Increased interoperability and unity of data access and patient recruitment across centers and disease states would help de-silo recruitment and improve efficiency.
Investigate the feasibility of vacuum mechanosynthesis—a process that uses mechanical forces under vacuum conditions to construct molecules with high positional precision.
Clinical trial designs such as challenge trials, adaptive trial design, and group testing can improve efficiency. These can be complemented with analytics and decision-support systems to optimize design, patient enrollment, and outcome interpretation.
Create high-fidelity computational models (digital twins) that accurately simulate human physiology, enabling synthetic clinical trials and faster hypothesis testing.
Explore methods for molecular-scale 3D printing, which would enable the precise assembly of molecules layer by layer. This would in principle move us towards a general-purpose approach to atomically precise fabrication as envisioned by Drexler in the 1980s and Feynman in the late 1950s. DNA origami made a leap in 2006, but DNA is in some key ways a much less precise and versatile nanoscale building material than proteins/peptides. A promising path would extend “DNA origami” to “protein carpent...
Identify and validate robust biomarkers and surrogate endpoints to serve as effective proxies in clinical trials, enabling faster and more informative evaluations. This is especially important for aging (e.g., as proposed by the Norn Group).
Noninvasive monitoring technologies can enable high-resolution, point-of-care data collection.
Diagnostic tools for the top 10 hard to diagnose diseases could have a significant impact on patients and the healthcare ecosystem. Diagnostics as an industry is in a state of market failure. Many disorders remain challenging to diagnose, impacting patient outcomes and burdening the healthcare system. Monitoring the epigenome can enable the identification of exposures to infectious disease and reveal exposure to threat agents.
Technologies to control gene expression in single neurons post-transplantation. Light-based or acoustic methods could offer precision for neuro-activation to enable axonal guidance and integration and enable cellular transplantation for neuroregeneration, circuitry reconstruction etc.
Use peripheral immune cells as both reporters and interventions to decode and influence the brain. This approach leverages the adaptive immune system’s inherent function as sentinel and archivist for etiological and pathological processes in other end organs (including the brain).
The ability to study the real-time dynamics of molecular pathways in living humans remains extremely limited in time-resolution (frequency and duration) and scalability (beyond lab testing). Continuous, minimally invasive technologies are needed to improve our understanding of human physiology, accurately diagnose patients, and assess the impact of clinical interventions. Molecular engineering, particularly protein engineering, and integrated circuit design can be combined to develop new, miniat...
Models that represent cells as self-organizing and adaptive are critical for enabling the simulation of multi-cellular systems. These should incorporate subcellular data from experimental techniques tracking individual molecules within cells in real time, as well as capture the fundamental interaction between external and internal molecular environment.
Develop automated approaches for mechanistic interpretability of virtual neural network models trained on biological state data. This would enable the extraction of mechanistic insights from predictive models, potentially informing both basic science and therapeutic design.
Train predictive models that can probabilistically infer any missing omics data from any available measurements by learning a broadly useful latent space. Such a universal representation of cellular state could become the foundation for many predictive and diagnostic applications.
Highly specific mAb/nanobody-type binders for every target epitope, including for specific post-translational modifications.
Develop methods to “freeze” and subsequently “unfreeze” living cells, effectively capturing dynamic states for later analysis and then resuming cellular function.
Develop techniques for spatial multiplexing of dynamic signals in live cells, capturing real-time changes and molecular ticker-tapes that record cellular events over time.
Create whole-cell, nanometer-resolved, in-situ multi-omic imaging approaches that can map hundreds to thousands of molecules (e.g., proteins, RNAs) within intact specimens at nanoscale resolution. The core chemistries and imaging technologies for this exist, but they need to be integrated and brought to scale. We could, for example, comprehensively measure the many aspects of the “Hallmarks of Aging” within a single tissue sample.
Develop live-cell subcellular imaging techniques in tissues, combined with computational foundation models to interpret the resulting data, enabling detailed mapping of subcellular structures and their dynamics in native environments.
Decipher the “DNA regulatory code” that governs gene expression. Understanding it would enable the prediction of how perturbations to cell state affect transcription in development and disease.
Develop technology suites for single-cell glycomics and lipidomics using methods such as in-situ multi-cycle imaging or spatially resolved nanopore/mass spectrometry. This would scale up the mapping of key molecules beyond proteins and metabolites.
Establish foundational datasets and machine learning models for inverse mass spectrometry of small molecules, enabling interpretation of metabolomic data at the single-cell level.
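For context, classical spectral-library matching — cosine similarity over binned m/z peaks — is the baseline that learned inverse models would need to surpass. A minimal sketch, with made-up spectra and illustrative bin settings:

```python
import numpy as np

def bin_spectrum(peaks, n_bins=100, mz_max=500.0):
    """Collapse (m/z, intensity) pairs into a fixed-length unit vector."""
    vec = np.zeros(n_bins)
    for mz, intensity in peaks:
        vec[min(int(mz / mz_max * n_bins), n_bins - 1)] += intensity
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# Hypothetical reference library of metabolite spectra.
library = {
    "metab_A": [(101.2, 1.0), (203.5, 0.4)],
    "metab_B": [(150.0, 0.8), (320.7, 1.0)],
}

# A noisy observed spectrum of metab_A; pick the best cosine match.
query = bin_spectrum([(101.3, 0.9), (203.4, 0.5)])
best = max(library, key=lambda name: bin_spectrum(library[name]) @ query)
```

A learned model would replace the fixed binning with embeddings robust to fragmentation variability, and ideally run in "inverse" mode — proposing candidate structures rather than ranking a finite library.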
Develop new technologies to improve the cost-performance of single-cell proteomics (>100x), enabling proteome-wide analysis at scale. This would allow proteins — the functional output of genetics — to be analyzed comprehensively across complex biological systems. Some of these technologies are single-molecule; others are not.
Construct a comprehensive map linking trillions of T-cell receptors (TCRs) and B-cell receptors (BCRs) to millions of disease antigens. This map would facilitate antibody–antigen binding prediction, allowing target antigens to be identified from immune cell DNA sequences alone.
We need a way to programmably induce immune tolerance to a user-defined foreign protein, in order to enable many new forms of gene and cell therapy (not to mention help with autoimmune diseases). This is especially needed for brain–computer interfaces, as many of the most powerful BCI concepts would involve adding foreign proteins such as transducers for optical or acoustic signals. As Hannu Rajaniemi wrote, “a flexible ability to induce immune tolerance to opsins is a prerequisite of a...
Develop a universal immune–computer interface (ICI) to enhance the immune system’s targeting of pathogens and cancers, while reducing issues such as autoimmunity and transplant rejection. The ICI would involve two-way coupling — where the immune system and computer mutually optimize their matching processes — and real-time feedback loops. An example could be a wearable device that integrates mRNA manufacturing with single-cell sequencing.
Generate longitudinal, multi-omic immunological data from a diverse cohort of individuals. This dataset would be critical for enabling immune-ome modeling and prediction (for example, in forecasting vaccine responses).
Conduct a large-scale, comprehensive study to establish causal links between persistent microbial/viral infections (such as herpes simplex 1) and neurodegenerative diseases like Alzheimer’s Disease. Such a study would illuminate the role of infections in increasing disease risk and progression. This effort could serve as a sequel to the Genome-Wide Association Studies (GWAS) that have been performed since the completion of the Human Genome Project. Many diseases have failed to show obvious genetic e...
Adapt and build experimental and computational methods to read out distributed and sparse immune memory signatures from adaptive immune cells. Beyond individual antibody-antigen binding, this approach focuses on signatures distributed across cellular subpopulations and the repertoire. Decoding these signatures could identify hidden causes of and cures for disease, enabling more accurate diagnosis, treatment, and prevention of chronic conditions.
Develop a novel physical pathway for multiphoton fluorescence generation that enables optical sectioning and excitation at red wavelengths with simple systems like continuous wave lasers.
Develop techniques that render brain tissue optically transparent in vivo, allowing deeper and higher-resolution imaging of neural networks without invasive sectioning.
Create methods to map neuronal connectivity in living brains, capturing the dynamic interactions between neurons at a single-cell level.
Develop innovative microscopy techniques that enable rapid, high-resolution imaging of neuronal networks in vivo at single-neuron resolution (e.g., Light Beads Microscopy).
Use rich data collection and machine learning to correlate natural behavior with neural activity in animals and humans and decipher the “grammar” and subcomponents of bodily movement.
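A standard starting point for this kind of behavior-to-neural-activity mapping is a regularized linear encoding model. The sketch below uses closed-form ridge regression on synthetic data; the feature count, sample sizes, and noise level are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins: 500 time points of 6 behavioral features
# (e.g., limb angles from markerless pose tracking) and one neuron's
# firing rate that depends linearly on them, plus noise.
T, F = 500, 6
pose = rng.normal(size=(T, F))
true_w = rng.normal(size=F)
rate = pose @ true_w + 0.1 * rng.normal(size=T)

# Closed-form ridge regression: fit on the first half, test on the second.
lam = 1.0
Xtr, ytr = pose[:250], rate[:250]
Xte, yte = pose[250:], rate[250:]
w = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(F), Xtr.T @ ytr)

# Held-out correlation between predicted and observed activity.
r = np.corrcoef(Xte @ w, yte)[0, 1]
```

Real pipelines would swap the synthetic features for pose-estimation output and the linear map for nonlinear models, but the fit-then-validate-on-held-out-behavior structure is the same.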
Utilize live human brains maintained ex vivo (“in a vat”) to study disease and drug responses more accurately.
Develop an in-dish model that mimics the structure and function of the human cortex, providing a controllable platform for studying cortical development, function, and disease.
Systematically map how specific brain regions like the hypothalamus and brainstem (arguably the “steering subsystem” of the brain) drive innate behaviors and learning signals, and understand their role in obesity, chronic pain, fertility, inflammation, and other disorders.
Use machine learning to construct functional digital emulations of human and primate brain systems. These models, built at varying levels of fidelity, support automated interpretability and data-driven discovery.
Develop virtual models (e.g., a virtual fly, rodent) to simulate neuro–cognitive processes in a controlled, embodied environment.
Construct a digital replica of a model organism’s brain (e.g., C. elegans) that accurately recapitulates neuronal activity and behavior. 
Harmless nanoscale transducers, deliverable minimally invasively, to record or modulate brain activity.
Access brain fluids through far smaller openings than currently possible, facilitating less invasive delivery of drugs or devices to intracranial or intraventricular spaces.
Use peripheral sampling methods to indirectly monitor brain molecular biomarkers. One approach involves using ultrasound to transiently open the blood–brain barrier, releasing engineered protein markers into the bloodstream for detection.
Use noninvasive modalities—such as ambient field magnetoencephalography (MEG) with quantum gradiometers, sono–magnetic tomography, optical interference methods, and ultrasound modulated optical tomography—to record and modulate brain activity without surgery.
Develop a minimally invasive ultrasound-based platform that can interface programmably with the whole human brain. This approach leverages ultrasound’s ability to penetrate deep tissues, offering scalable imaging and modulation with minimal invasiveness.
Develop new physical detection methods for electron microscopy that improve the scalability of visualizing brain circuitry.
Combine advanced imaging methods — such as synchrotron X-ray microscopy and expansion microscopy — to rapidly and scalably image neural circuits in both small and large brain regions. This combination promises high spatial resolution with faster throughput.
Develop a scalable technology that can map the brain’s wiring at the single–cell level and link molecular and circuit properties.