On January 12, 2026, at the J.P. Morgan Healthcare Conference, NVIDIA CEO Jensen Huang and Eli Lilly CEO David Ricks announced what may be the most significant partnership in pharmaceutical history: a $1 billion co-innovation AI lab pairing one of the world's largest pharmaceutical companies with the dominant supplier of AI computing, with the goal of fundamentally reinventing how drugs are discovered.
The partnership represents far more than a technology collaboration. It signals a shift in pharmaceutical research from what has traditionally been an artisanal, trial-and-error process toward an engineering discipline in which AI models explore vast biological and chemical spaces computationally before a single molecule is synthesized. The lab, based in the San Francisco Bay Area and slated to open by late March 2026, will co-locate Lilly's scientists, including its biologists and chemists, with NVIDIA's AI researchers and engineers, working side by side to build next-generation foundation models for biology and chemistry.
"We're creating a new blueprint for drug discovery—one where scientists can explore vast biological and chemical spaces in silico before a single molecule is made," Huang stated in the announcement. "This partnership combines Lilly's volume of data and scientific knowledge with NVIDIA's computational power to tackle some of the most enduring challenges in medicine."
The initiative addresses one of the pharmaceutical industry's most persistent problems: the enormous cost and time required to bring new drugs to market. Current estimates suggest the industry spends approximately $300 billion annually on research and development, with the average drug taking over a decade and costing billions of dollars to develop. The traditional drug discovery process involves synthesizing thousands or even millions of candidate molecules, testing them in laboratories, and iterating based on results—a slow, expensive, and often inefficient process.
The NVIDIA-Lilly partnership aims to transform this process by enabling scientists to use AI models to predict which molecules are likely to be effective before any physical synthesis occurs. This computational approach could dramatically reduce the number of molecules that need to be synthesized and tested, accelerating drug discovery while reducing costs.
The Continuous Learning System: Bridging Wet and Dry Labs
The most innovative aspect of the partnership is its focus on creating a continuous learning system that tightly integrates Lilly's "agentic wet labs" with computational dry labs. This integration represents a fundamental shift from traditional drug discovery, where computational predictions and experimental validation operate in separate, often disconnected workflows.
In traditional pharmaceutical research, computational scientists develop models and make predictions, then experimental scientists test those predictions in physical laboratories. The results flow back to computational scientists, who update their models, and the cycle repeats. In practice, the cycle is slow: significant delays separate computational predictions from experimental validation, and the feedback loops are too limited to support rapid iteration.
The NVIDIA-Lilly approach creates a scientist-in-the-loop framework where experiments, data generation, and AI model development continuously inform and improve one another. According to NVIDIA's technical documentation, this enables 24/7 AI-assisted experimentation, where AI models guide experimental design, experimental results immediately inform model training, and the improved models generate better predictions for the next round of experiments.
This continuous learning approach addresses one of the fundamental challenges in applying AI to drug discovery: the gap between computational predictions and real-world experimental results. AI models trained on historical data can make predictions, but those predictions must be validated experimentally. The traditional approach treats this as a sequential process: predict, then validate, then update. The continuous learning system makes it an integrated, iterative process in which prediction and validation happen concurrently, each informing the other in real time.
The system's "agentic wet labs" component suggests that physical laboratory equipment itself may be automated or AI-controlled, enabling experiments to run continuously with AI systems making decisions about what experiments to run next based on previous results. This automation could dramatically increase the throughput of experimental validation, allowing the system to test many more hypotheses than would be possible with traditional manual laboratory work.
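Neither company has published the lab's software design, but the core control flow of such a loop can be sketched in a few lines. The example below is a hypothetical active-learning cycle, not the partnership's implementation: a surrogate model (here a simple random forest over made-up descriptors) scores a virtual library, the top-ranked candidates go to a stand-in "assay", and the measured results are folded back into the training data before the next round.

```python
# Hypothetical sketch of a closed-loop design-make-test-learn cycle.
# The featurization, "assay", and model below are illustrative stand-ins,
# not the partnership's actual systems.
import random

from sklearn.ensemble import RandomForestRegressor

def featurize(molecule_id: int) -> list[float]:
    """Stand-in for real molecular descriptors or learned embeddings."""
    rng = random.Random(molecule_id)
    return [rng.random() for _ in range(8)]

def run_assay(molecule_id: int) -> float:
    """Stand-in for an automated wet-lab measurement (e.g. binding affinity)."""
    rng = random.Random(molecule_id + 1)
    return sum(featurize(molecule_id)) + 0.1 * rng.gauss(0, 1)

library = list(range(1000))                 # virtual library awaiting testing
train_X, train_y = [], []                   # grows as experiments complete
model = RandomForestRegressor(n_estimators=50, random_state=0)

def measure(batch):
    """'Run' a batch of experiments and add the results to the training set."""
    for mol in batch:
        library.remove(mol)
        train_X.append(featurize(mol))
        train_y.append(run_assay(mol))

measure(random.Random(0).sample(library, 16))   # seed with a random batch

for cycle in range(5):
    model.fit(train_X, train_y)                              # learn from all data so far
    preds = model.predict([featurize(m) for m in library])   # score untested candidates
    ranked = sorted(zip(library, preds), key=lambda t: t[1], reverse=True)
    measure([mol for mol, _ in ranked[:16]])                 # test the most promising ones
    print(f"cycle {cycle}: best measured activity so far = {max(train_y):.3f}")
```

In the real system the surrogate would be a foundation model and the assay an automated laboratory, but the control flow, predict, select, measure, retrain, is the essence of the continuous learning loop.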
The BioNeMo Platform: AI Foundation Models for Biology and Chemistry
The partnership is built on NVIDIA's BioNeMo platform, a framework specifically designed for training and deploying large AI models specialized in computational biology and chemistry. BioNeMo represents NVIDIA's effort to bring the same foundation model approach that has transformed natural language processing and computer vision to the biological and chemical sciences.
According to NVIDIA's BioNeMo documentation, the platform supports multiple biological modalities. For protein analysis, BioNeMo includes ESM-2 and AMPLIFY models for representation learning, EquiDock for predicting protein-protein interactions, and ESMFold for predicting protein structure from amino acid sequences. For DNA and genomics, DNABERT analyzes DNA sequences, predicts genome function, and assesses gene mutations. For single-cell biology, Geneformer analyzes single-cell RNA sequencing data. For chemistry, generative models like MolMIM design drug candidates optimized for specific protein targets, while DiffDock predicts drug-protein binding structures.
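BioNeMo packages these models for NVIDIA's training and inference stack, but the underlying idea of protein representation learning can be demonstrated with the openly released ESM-2 checkpoints. The sketch below uses the Hugging Face transformers port of a small ESM-2 model rather than BioNeMo itself, and the sequence is an arbitrary example; it shows how a raw amino-acid string becomes numerical embeddings that downstream models can consume.

```python
# Minimal protein-embedding example using an open ESM-2 checkpoint via
# Hugging Face transformers (a stand-in for the BioNeMo-packaged models).
import torch
from transformers import AutoTokenizer, EsmModel

model_name = "facebook/esm2_t12_35M_UR50D"   # small public ESM-2 variant
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = EsmModel.from_pretrained(model_name)
model.eval()

sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"   # arbitrary example sequence

inputs = tokenizer(sequence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Per-residue embeddings: one vector per amino acid (plus special tokens).
per_residue = outputs.last_hidden_state          # shape: (1, seq_len + 2, hidden_dim)
# A crude fixed-length summary of the whole protein, via mean pooling.
protein_embedding = per_residue.mean(dim=1)

print(per_residue.shape, protein_embedding.shape)
```

Embeddings like these are the raw material that structure predictors and property models build on; BioNeMo's role is to train and serve such models at far larger scale on NVIDIA hardware.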
The platform is built for this kind of scale: on 256 NVIDIA A100 GPUs, BioNeMo can train a three-billion-parameter, BERT-based protein language model on more than one trillion tokens in roughly 4.2 days. That throughput makes it practical to train models on biological and chemical datasets far larger than traditional computing infrastructure could handle.
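Those headline numbers can be sanity-checked with the standard back-of-envelope rule that transformer training costs roughly 6 FLOPs per parameter per token. The figures below are an illustrative estimate under that approximation, not a published NVIDIA benchmark.

```python
# Back-of-envelope check of the quoted training run, using the common
# ~6 * parameters * tokens approximation for transformer training FLOPs.
params = 3e9            # 3-billion-parameter protein language model
tokens = 1e12           # ~1 trillion training tokens
gpus = 256              # NVIDIA A100s
days = 4.2

total_flops = 6 * params * tokens             # ~1.8e22 FLOPs
gpu_seconds = gpus * days * 86_400            # ~9.3e7 GPU-seconds
per_gpu_flops = total_flops / gpu_seconds     # sustained throughput per GPU

print(f"total compute:      {total_flops:.2e} FLOPs")
print(f"per-GPU throughput: {per_gpu_flops / 1e12:.0f} TFLOP/s sustained")
# Roughly 190 TFLOP/s per A100, a substantial fraction of the chip's
# ~312 TFLOP/s BF16 peak, which is plausible for a well-optimized run.
```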
The partnership will use BioNeMo to build next-generation foundation and frontier models for biology and chemistry. These models will be trained on Lilly's proprietary data—the company's extensive knowledge base of drug compounds, experimental results, and biological insights accumulated over decades of pharmaceutical research. This combination of NVIDIA's computational capabilities and Lilly's domain expertise and data could create models that are more powerful and more applicable to real-world drug discovery than either could achieve independently.
The expanded BioNeMo platform also includes new capabilities specifically relevant to the partnership. According to NVIDIA's announcements, the platform now includes RNAPro for RNA structure prediction and ReaSyn v2 for molecular synthesis reasoning, along with GPU-accelerated tools for molecular design. These capabilities address specific challenges in drug discovery, such as understanding how RNA molecules fold and function, and determining how to synthesize complex drug molecules efficiently.
The Vera Rubin Architecture: Unprecedented Computational Power
The partnership will leverage NVIDIA's Vera Rubin architecture, which the company describes as the most powerful computational framework in the industry. The Rubin platform, announced at CES 2026 just days before the Lilly partnership, delivers 50 petaflops of NVFP4 inference performance and 35 petaflops of training performance, representing a 5x performance uplift compared to the previous Blackwell generation.
For drug discovery applications, this computational power enables training larger, more sophisticated AI models on larger datasets in less time. The ability to train models faster means researchers can iterate more quickly, testing new hypotheses and refining models based on experimental results. The platform's 10x reduction in inference token costs compared to Blackwell also makes it more economical to use trained models for large-scale molecular screening and prediction tasks.
The lab's initial computational infrastructure will be built on NVIDIA DGX B300 systems, with Rubin-based platforms offering a path to still greater scale as they ship, providing the horsepower needed to train the large foundation models that power the continuous learning system. These systems will let the lab process massive amounts of biological and chemical data, train sophisticated AI models, and run inference at scale to guide experimental design.
The computational requirements for drug discovery AI are substantial. Training foundation models on biological and chemical data requires processing enormous datasets—protein sequences, molecular structures, experimental results, and scientific literature. The Rubin architecture's capabilities make it feasible to train these models in reasonable timeframes, enabling the rapid iteration that's essential for the continuous learning approach.
The Scientist-in-the-Loop Framework: Human Expertise Meets AI Capability
A critical aspect of the partnership is its scientist-in-the-loop framework, which ensures that human expertise remains central to the drug discovery process even as AI capabilities expand. This approach recognizes that while AI can process vast amounts of data and identify patterns that humans might miss, human scientists bring essential domain knowledge, intuition, and judgment that AI systems cannot yet replicate.
The framework enables scientists to guide AI model development, interpret results, and make strategic decisions about research directions. AI systems handle the computational heavy lifting—processing data, training models, making predictions, and suggesting experiments—while scientists provide oversight, validation, and strategic direction. This division of labor leverages the strengths of both humans and AI, creating a more powerful research capability than either could achieve alone.
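What that division of labor could look like in software is easy to sketch. The example below is purely illustrative, not the partnership's actual framework: an AI agent attaches a confidence estimate and a cost estimate to each proposed experiment, routine high-confidence proposals run automatically, and anything uncertain or expensive is escalated to a scientist.

```python
# Hypothetical scientist-in-the-loop gate: high-confidence, low-cost proposals
# run automatically; everything else waits for human review.
from dataclasses import dataclass

@dataclass
class Proposal:
    experiment: str       # e.g. "binding assay for candidate #4521"
    confidence: float     # model's self-estimated confidence, 0..1
    cost_hours: float     # estimated instrument time

AUTO_CONFIDENCE = 0.90    # thresholds are illustrative, not real policy
AUTO_MAX_HOURS = 2.0

def route(proposal: Proposal) -> str:
    """Decide whether a proposed experiment runs autonomously or is escalated."""
    if proposal.confidence >= AUTO_CONFIDENCE and proposal.cost_hours <= AUTO_MAX_HOURS:
        return "run_automatically"
    return "escalate_to_scientist"

queue = [
    Proposal("repeat assay on candidate A at lower concentration", 0.96, 1.0),
    Proposal("synthesize novel scaffold B and run full panel", 0.71, 40.0),
    Proposal("re-dock candidate C against mutant target", 0.93, 0.5),
]

for p in queue:
    print(f"{route(p):>22}  <- {p.experiment}")
```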
The continuous learning system also lets scientists work more efficiently. Rather than spending their time on routine data analysis or experimental design, they can concentrate on interpreting results and setting research direction, with AI systems handling the computational legwork so that human expertise is applied where it is most valuable.
However, the scientist-in-the-loop approach also requires scientists to develop new skills. Working effectively with AI systems requires understanding their capabilities and limitations, interpreting AI-generated predictions, and knowing when to trust AI suggestions versus when to apply human judgment. The partnership will likely include training programs to help Lilly scientists develop these skills, ensuring they can work effectively with the new AI-powered research infrastructure.
The Economic Impact: Transforming a $300 Billion Industry
As noted earlier, the pharmaceutical industry's research and development costs are enormous: roughly $300 billion a year industry-wide, with the average drug taking more than a decade and billions of dollars to develop. Those costs are driven largely by attrition; the vast majority of candidate molecules fail during development, and only a small fraction ever become approved drugs.
The NVIDIA-Lilly partnership aims to address these costs by improving the efficiency of drug discovery. If AI models can better predict which molecules are likely to be effective, researchers can focus experimental efforts on the most promising candidates, reducing the number of molecules that need to be synthesized and tested. This could dramatically reduce both the time and cost required to bring new drugs to market.
The partnership's $1 billion investment over five years is substantial, but it must be viewed against the scale of pharmaceutical R&D spending: even a 1 percent efficiency gain on roughly $300 billion in annual industry R&D would be worth about $3 billion per year, several times the lab's total cost. More importantly, if the approach surfaces drugs that would never have been found through traditional methods, the value would be measured in lives saved and improved, not just dollars.
The economic impact extends beyond cost reduction. Faster drug discovery means that new treatments can reach patients sooner, potentially saving lives and reducing suffering. The ability to explore larger chemical and biological spaces computationally could also enable discovering treatments for diseases that have been difficult to address with traditional approaches, creating value that extends far beyond cost savings.
However, the economic benefits depend on the partnership's success. AI-powered drug discovery is still an emerging field, and while early results are promising, it's not yet clear how much the approach will improve efficiency in practice. The $1 billion investment represents a significant bet that AI can transform drug discovery, but success is not guaranteed.
Beyond Discovery: Manufacturing and Supply Chain Optimization
While the initial focus is on drug discovery, the partnership extends to other aspects of pharmaceutical operations. The collaboration will explore using robotics and physical AI for manufacturing optimization, creating digital twins using NVIDIA Omniverse and RTX PRO Servers to test and optimize manufacturing supply chains before implementation.
This expansion reflects the partnership's broader vision of transforming pharmaceutical operations through AI, not just drug discovery. Manufacturing optimization could reduce production costs, improve quality control, and enable more flexible production capabilities. Supply chain optimization could improve efficiency and resilience, reducing the risk of shortages or disruptions.
The use of digital twins is particularly interesting. By creating virtual models of manufacturing processes and supply chains, the partnership can test different configurations and strategies computationally before implementing them physically. This approach enables rapid iteration and optimization without the cost and risk of physical experimentation, similar to how the continuous learning system enables computational exploration of chemical and biological spaces.
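The test-before-deploy idea behind those digital twins can be illustrated with a toy discrete-event simulation. The sketch below uses the open-source simpy library rather than Omniverse, and every number in it is invented; the point is simply that alternative line configurations can be compared entirely in software before any physical change is made.

```python
# Toy digital-twin-style experiment: compare two hypothetical packaging-line
# configurations in simulation (simpy), not on the physical line.
import simpy

def run_line(num_stations: int, process_minutes: float, shift_minutes: float = 480) -> int:
    """Simulate one shift and return the number of batches completed."""
    env = simpy.Environment()
    stations = simpy.Resource(env, capacity=num_stations)
    completed = [0]

    def batch(env):
        with stations.request() as req:
            yield req                       # wait for a free station
            yield env.timeout(process_minutes)
            completed[0] += 1

    def arrivals(env):
        while True:
            env.process(batch(env))
            yield env.timeout(5)            # a new batch arrives every 5 minutes

    env.process(arrivals(env))
    env.run(until=shift_minutes)
    return completed[0]

# Compare configurations computationally before committing to either.
for stations, minutes in [(2, 12.0), (3, 12.0)]:
    print(f"{stations} stations: {run_line(stations, minutes)} batches per shift")
```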
The partnership also plans to explore AI applications across clinical development, potentially using AI to optimize clinical trial design, identify patient populations, and analyze trial results. This expansion could further accelerate drug development and improve the efficiency of bringing new treatments to market.
The Competitive Landscape: AI in Pharmaceutical Research
The NVIDIA-Lilly partnership enters a competitive landscape where multiple companies are pursuing AI-powered drug discovery. Other pharmaceutical companies, including Merck, Pfizer, and Novartis, have also invested in AI capabilities, though typically through partnerships with AI companies rather than the deep co-innovation approach that NVIDIA and Lilly are pursuing.
The partnership's scale and commitment, $1 billion over five years with co-located teams, suggest a deeper level of integration than typical pharmaceutical-AI partnerships. Rather than simply licensing AI technology or contracting for AI services, NVIDIA and Lilly are building a joint capability that combines both companies' expertise in ways that an arm's-length arrangement could not.
This approach could provide competitive advantages. The co-located teams can iterate more rapidly, with computational and experimental work happening in parallel rather than sequentially. The continuous learning system enables faster feedback loops, potentially accelerating research compared to approaches where computational and experimental work are more separated.
However, the competitive landscape is also evolving rapidly. Other AI companies are developing drug discovery capabilities, and other pharmaceutical companies are building their own AI expertise. The success of the NVIDIA-Lilly partnership will depend not just on the technology, but on execution—whether the teams can work effectively together, whether the continuous learning system delivers on its promise, and whether the approach produces better results than alternatives.
Technical Challenges: From Concept to Reality
While the partnership's vision is compelling, translating it into reality will require solving numerous technical challenges. Building foundation models for biology and chemistry is more complex than building models for natural language or images, as biological and chemical data have different structures and relationships that must be captured in model architectures.
The continuous learning system also presents challenges. Integrating computational and experimental workflows requires sophisticated software infrastructure that can manage data flows, coordinate between systems, and ensure that experimental results are properly incorporated into model training. The "agentic wet labs" component, if it involves automated laboratory equipment, requires reliable robotics and automation systems that can perform experiments accurately and consistently.
The scientist-in-the-loop framework requires careful design to ensure that human expertise is effectively integrated with AI capabilities. The system must provide scientists with the information and tools they need to make decisions, while also enabling AI systems to operate autonomously when appropriate. Balancing human oversight with AI autonomy is a complex challenge that will require ongoing refinement.
Data quality and availability are also critical. The partnership's success depends on having high-quality data for training models, and on experimental results that are reliable and reproducible. Ensuring data quality across both computational and experimental domains requires careful attention to data collection, validation, and management processes.
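On the chemistry side, even simple automated checks catch a lot of bad data before it reaches model training. The snippet below is a minimal, illustrative gate built on the open-source RDKit library, not a Lilly pipeline: it drops records whose structures fail to parse or whose assay values are physically implausible.

```python
# Minimal data-quality gate for assay records before model training,
# using RDKit to validate molecular structures (illustrative only).
from rdkit import Chem

records = [
    {"smiles": "CC(=O)Oc1ccccc1C(=O)O", "ic50_nM": 1200.0},  # aspirin, plausible
    {"smiles": "C1CC1C(",               "ic50_nM": 85.0},     # malformed SMILES
    {"smiles": "c1ccccc1O",             "ic50_nM": -4.0},     # impossible negative value
]

def is_valid(record: dict) -> bool:
    mol = Chem.MolFromSmiles(record["smiles"])   # returns None if unparsable
    if mol is None:
        return False
    return 0.0 < record["ic50_nM"] < 1e9         # crude plausibility bounds

clean = [r for r in records if is_valid(r)]
print(f"kept {len(clean)} of {len(records)} records")
```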
The Future of Drug Discovery: A New Paradigm
The NVIDIA-Lilly partnership represents a potential paradigm shift in how drugs are discovered. Rather than synthesizing and testing molecules sequentially, as the traditional process does, the lab aims for a computational-first approach in which AI models explore vast chemical and biological spaces before any physical synthesis occurs.
This shift could fundamentally change the economics of drug discovery. If computational exploration can identify the most promising drug candidates more efficiently, the cost and time required to bring new drugs to market could decrease significantly. This could make it economically viable to develop treatments for diseases that are currently too rare or too difficult to address profitably.
The approach could also enable discovering entirely new classes of drugs. By exploring larger chemical and biological spaces computationally, AI models might identify promising molecules that wouldn't have been considered through traditional approaches. This could lead to breakthrough treatments for diseases that have been difficult to address with existing drug discovery methods.
However, the paradigm shift depends on the partnership delivering. The vision is compelling, but turning it into practical drug discovery capability means solving the technical challenges described above and demonstrating that the approach outperforms traditional methods. The $1 billion investment is a serious commitment to making that happen; it is not a guarantee.
Conclusion: A Blueprint for the Future
The NVIDIA-Eli Lilly partnership represents more than a collaboration between two companies—it's a blueprint for how AI can transform industries that have traditionally relied on experimental, trial-and-error approaches. The partnership's focus on continuous learning, scientist-in-the-loop frameworks, and integration of computational and experimental capabilities could serve as a model for applying AI to other complex scientific and engineering challenges.
For the pharmaceutical industry, the partnership signals that AI-powered drug discovery is moving from experimental to production. The $1 billion investment, co-located teams, and comprehensive approach suggest that both companies believe AI can fundamentally improve drug discovery, not just incrementally enhance existing processes.
As the lab opens in late March 2026 and begins operations, the industry will be watching closely to see whether the continuous learning system delivers on its promise. Success would mean faster, cheaper drug discovery and new treatments for diseases that have resisted existing approaches, and it would likely inspire similar partnerships and investments across the pharmaceutical industry, reshaping how drugs are discovered and developed.
The open question is less whether AI will change drug discovery than how quickly the change will arrive, how broadly it will spread, and what new capabilities it will enable. The NVIDIA-Lilly partnership is one of the most ambitious attempts to answer those questions, combining cutting-edge AI infrastructure with deep pharmaceutical expertise to create a new blueprint for discovering the medicines of the future.
What is clear is that the partnership marks a significant step toward a future in which drug discovery is driven by computational exploration and AI-guided experimentation, in which scientists explore vast biological and chemical spaces before synthesizing a single molecule, and in which the time and cost of bringing new treatments to market fall substantially. As 2026 unfolds and the lab begins operations, we will see how quickly that vision becomes reality and what new possibilities it opens for treating disease and improving human health.




