The December timing is no coincidence. OpenAI’s ChatGPT launched on November 30, 2022, and the chatbot, which runs on Nvidia hardware, has prompted a reprioritization of AI across the industry. As Powell described it: “From the top down, every group is saying, go do at least two things using AI and come up with your own ideas, and report back. They are really just starting to make it part of the DNA of the company. But this accessibility that ChatGPT has created, it’s like a new programming language that has just opened up the entire world.”
Nvidia is actively working with partners in drug discovery to enable conversational interfaces with AI models. The company has a demo where users can pose complex queries, such as: “Can you use AlphaFold to predict the structure of a new SARS-CoV-2 variant?” Or, “Given a recently approved antiviral, can you generate 200 similar structures and screen them against this new variant?”
While it may sound like a fairly ordinary chatbot interaction to someone with a conceptual understanding of drug discovery, the exchange triggers complex AI models on the backend. It is an example of how generative AI can bridge the traditional gap between biologists and computer scientists.
Even on their own, large language models can streamline documentation access, which is significant in pharma. “Now, you can simply feed this documentation into chat and engage in a real conversation with it,” Powell said. “It directly answers your questions, removing the need to methodically sift through documentation to understand how to use a given application.”
Generative AI: Bridging the gap between biologists and computer scientists
But it goes beyond that. “It’s going to be wild,” Powell said. Generative AI has the promise to “take every biologist and turn them into a computer scientist,” and to do the same, in reverse, for computer scientists focusing on biology.
In the early discovery phase, AI promises to shorten timelines, cut costs and increase a drug candidate’s odds in the clinic. With traditional methods, “you’re just going to be bound by time, cost and how much you’re actually looking at. This is a needle in a haystack problem,” Powell said. But computational methods enable drug discovery professionals to “explore much larger spaces,” she said. “The outcome is potentially better because you’re analyzing more; it’s purely a function of statistics.” Fold AI into this iterative process, and each experimental cycle builds on the last, boosting the prospects of the resulting drug. “That’s what’s going to change the game,” Powell said.
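Powell’s “purely a function of statistics” point can be illustrated with a toy calculation. The hit rate below is a hypothetical number chosen for illustration, not a figure from the article: if each screened compound independently has some small probability p of being a hit, the chance of finding at least one hit among n compounds is 1 − (1 − p)^n, which climbs rapidly as the screen gets larger.

```python
# Toy sketch of the "needle in a haystack" statistics.
# The hit rate p is a hypothetical assumption, not a figure from the article.

def p_at_least_one_hit(p: float, n: int) -> float:
    """Probability of finding at least one hit when screening n compounds,
    assuming each compound is an independent hit with probability p."""
    return 1.0 - (1.0 - p) ** n

p = 1e-6  # assume a one-in-a-million hit rate

# A traditional-scale screen of 100,000 compounds:
print(p_at_least_one_hit(p, 100_000))        # ~0.095

# A billion-compound virtual screen is near-certain to surface hits:
print(p_at_least_one_hit(p, 1_000_000_000))  # ~1.0
```

Under these toy assumptions, screening 10,000× more compounds takes the chance of at least one hit from roughly 10% to essentially 100%, which is the statistical argument for exploring larger chemical spaces.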
The boost in speed AI can offer scientists is another potential game-changer, Powell noted. “Scientists often already have their next question in mind. If it takes weeks or months to get an answer, their focus may have already shifted elsewhere,” she said. “Scientific efficiency suffers when everything requires traditional experimental methods.”
Big Pharma companies are sharpening their focus on AI. A notable example is Sanofi, which announced in June that it intends to put AI at the core of its operations. Morgan Stanley predicts that within the next decade, AI in early-stage drug development could spark the discovery of 50 additional novel therapies, translating to sales exceeding $50 billion.
AI’s growing pains and tapping its promise in drug discovery
Over the years, AI has also stumbled in the healthcare realm. It largely struggled to live up to its potential during the pandemic, despite unprecedented data availability and need, as Harvard Business Review noted. While early AI tools correctly warned of the emerging outbreak, systems for diagnosis, prognosis and spread prediction often fell flat. Challenges related to dataset quality, algorithmic bias, human error and global complexity all played a role in pandemic-era AI missteps.
In addition, multiple high-profile techbio companies tapping AI have struggled post-IPO. As observers have noted, the promises of accelerated development through AI platforms have often outstripped reality. Inherent biological complexity, fierce ongoing competition from academia and Big Pharma, and a lack of sharp biological focus have challenged some techbio firms over the years.
While Big Pharma companies are increasing their AI investments, smaller, more nimble players with access to quality proprietary data may wield an advantage. After all, as Powell pointed out, data is “a crucial component of a company’s DNA that determines success.”
Pharma’s future at scale
Ultimately, the pharma industry is just scratching the surface of what is possible with AI, Powell said. “People typically perform virtual screening at scales of hundreds of thousands to millions, but the goal is to screen at the scale of billions,” she said. As an illustration, Recursion Pharmaceuticals screened 36 billion chemical compounds for drug-target interactions in just a week, a feat the company reckons would otherwise have taken 100,000 years with traditional methods. Relatively speaking, screening against that vast repository of 36 billion mappable chemical compounds cost “not even half a million dollars,” Powell said.
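A quick back-of-envelope check, using only the figures quoted above (36 billion compounds in one week, versus an estimated 100,000 years traditionally), conveys the scale involved:

```python
# Back-of-envelope arithmetic from the figures quoted above.
compounds = 36_000_000_000        # 36 billion compounds screened
seconds_per_week = 7 * 24 * 3600  # one week of wall-clock time

# Implied sustained screening throughput:
throughput = compounds / seconds_per_week
print(f"{throughput:,.0f} compounds per second")  # ~59,524 per second

# Implied speedup vs. the 100,000-year traditional estimate
# (comparing total elapsed time, 100,000 years to one week):
weeks_per_year = 52
speedup = 100_000 * weeks_per_year
print(f"~{speedup:,}x speedup")  # ~5,200,000x
```

In other words, the quoted numbers imply screening tens of thousands of compounds every second, a rate several million times faster than the traditional benchmark the company cites.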
Given the vast potential in the nexus of tech and bio, the question shouldn’t be, “Does it make sense to use AI in drug discovery?” but rather, “Given the vast potential of AI at scale, why wouldn’t every drug program use it?”