This interview analysis is sponsored by Deloitte and was written, edited, and published in alignment with our Emerj sponsored content guidelines. Learn more about our thought leadership and content creation services on our Emerj Media Services page.
In R&D-intensive sectors such as life sciences, agriculture, and advanced materials, a familiar challenge persists: fragmented data ecosystems, incompatible systems, and computational ceilings that hinder innovation.
Organizations continue to struggle to integrate domain-specific data sources — genomics, agronomy, chemistry, and instrumentation logs — into unified frameworks that can accelerate discovery.
The economic impact is staggering. According to IBM and the Harvard Business Review, data silos and inefficiencies are estimated to cost the global economy $3.1 trillion annually in lost productivity and revenue. Poor data quality, a frequent byproduct of this fragmentation, costs companies an average of $12.9 million per year, according to Gartner.
Within research and development, the consequences are particularly acute in the pharmaceutical sector. For example, research published in Drug Discovery Today earlier this year notes that only about 10.8 percent of drug candidates advance from early development to market. Each failed project compounds the inefficiencies, extending timelines and inflating costs. Against this backdrop, even marginal improvements in throughput can translate into outsized competitive advantage.
In an exclusive interview for Emerj’s ‘AI in Business’ podcast, Ben Ninio, Principal in Strategy at Deloitte, offers a perspective on how R&D leaders can overcome these structural limits.
Rather than pursuing ever-greater computational capacity or isolated platforms, he describes how treating data as a language — and connecting these “languages” across scientific domains — can unify discovery workflows. His approach reframes the challenge as one of interpretation rather than calculation, enabling researchers to extract insight from complexity without exponential compute.
This article analyzes two of Ninio’s core insights for enterprise leaders across research and development spaces:
- Treating scientific data as a shared language: How reimagining data as a linguistic structure helps enterprises unify R&D systems and accelerate discovery.
- Building multimodal frameworks to uncover hidden relationships: How combining domain-specific “languages” such as genomics, soil health, and chemical structures allows organizations to surface new insights and redefine innovation strategy.
Guest: Ben Ninio, Principal in Strategy, Deloitte
Expertise: AI Strategy, Digital Transformation, R&D Innovation, Cross-Industry Growth
Brief Recognition: With more than a decade of experience in digital innovation and enterprise strategy, Ben has advised global leaders in life sciences, agriculture, and industrial sectors on building AI-driven operating models and data ecosystems. Before joining Deloitte, Ben founded and scaled multiple technology ventures — including GoTo Global and CAR2GO Israel — and led digital transformation at Syngenta, creating new business capabilities across analytics, acquisitions, and global delivery. A graduate of Reichman University and The Wharton School, Ben is recognized for his work bridging advanced technology and business value in the transition to an AI-led future.
Treating Scientific Data as a Shared Language
When discussing how AI is changing research and development, Ninio begins with a strikingly simple idea: language. He describes language as the common denominator across all domains of scientific inquiry — life sciences, agriculture, and industrial chemistry alike. “The short answer is language,” Ninio says. “Language is the unifying common factor across the way that we work in R&D.”
Ninio explains that researchers have long relied on AI and statistical tools to predict how molecules or proteins behave when combined. These brute-force computational approaches are competent for modeling single molecules or small groups, but beyond that scale they hit hard physical limits. As Ninio puts it, “We just don’t have enough compute in the world. These are quantum systems, too complex to simulate exhaustively.”
He observes that compute ceilings in scientific modeling mirror the limits seen in general AI today: impressive acceleration followed by physical and practical constraints.
To address these limits, Ninio outlines what he calls “the hack,” or treating scientific data itself as a form of language. He emphasizes that treating scientific data this way means researchers no longer need to resort to brute-force modeling. “We can start to use language to connect the dots and make some pretty good guesses about what nature is going to do next,” notes Ninio.
In his view, molecules, DNA, and proteins are structured languages with syntax, grammar, and meaning, just not the kind expressed in English words.
Ninio then draws a parallel between this approach and large language models (LLMs), which predict the next word in a sequence based on context and probability:
“If I say a word, there are almost infinite possibilities for the next one. But systems like LLMs have learned to make statistically meaningful predictions. That same principle applies to biological systems.”
– Ben Ninio, Principal in Strategy at Deloitte
Instead of simulating every potential molecular interaction, AI models can now predict likely outcomes, narrowing billions of possibilities to a manageable shortlist for scientists to test.
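The next-token idea Ninio borrows from LLMs can be sketched with a toy bigram (Markov) model over a DNA-like alphabet. This is purely illustrative: the sequences below are invented, and real sequence models use trained neural networks rather than raw transition counts, but the principle is the same: learn which "word" tends to follow which, then rank the likely continuations.

```python
from collections import Counter, defaultdict

# Toy corpus of DNA-like sequences (invented for illustration).
sequences = ["ATGCGT", "ATGCCA", "ATGAGT", "TTGCGT"]

# Count bigram transitions: which base tends to follow which.
transitions = defaultdict(Counter)
for seq in sequences:
    for prev, nxt in zip(seq, seq[1:]):
        transitions[prev][nxt] += 1

def predict_next(base, k=2):
    """Return the k most probable next bases after `base`, with probabilities."""
    counts = transitions[base]
    total = sum(counts.values())
    return [(b, n / total) for b, n in counts.most_common(k)]

print(predict_next("G"))
```

Scaled up from bases to codons, proteins, or molecular fragments, this is the shape of the "pretty good guesses" Ninio describes: ranking a shortlist of likely continuations instead of simulating every possibility.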
The implications, Ninio notes, are profound for R&D productivity:
“Only about one percent of ideas make it all the way through the R&D pipeline. We don’t need to be perfect; we just need to do a little better than we’ve done in the past. If we can move that success rate to two or three percent, the amount of value created globally is enormous.”
– Ben Ninio, Principal in Strategy at Deloitte
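The leverage in Ninio's numbers is easy to verify with back-of-envelope arithmetic. The per-candidate cost below is an assumed placeholder, not a figure from the interview; the point is only that the expected spend per successful candidate falls in inverse proportion to the success rate.

```python
def expected_cost_per_success(cost_per_candidate, success_rate):
    """Average spend needed to yield one successful candidate."""
    return cost_per_candidate / success_rate

cost = 1_000_000  # assumed cost per pipeline candidate, illustrative only

for rate in (0.01, 0.02, 0.03):
    print(f"{rate:.0%} success: ${expected_cost_per_success(cost, rate):,.0f} per success")
```

Moving from a one percent to a three percent success rate cuts the expected cost per success to a third, which is why Ninio frames even small improvements as enormous in aggregate.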
To achieve these success rates, Ninio emphasizes the need for semantic infrastructure — data systems that translate domain-specific information into a symbolic format that both humans and AI can interpret. “Scientists will still be sitting in front of lists of 30 or 40 AI-generated options,” says Ninio. “But those options will now be informed by the probability structures of nature itself.”
In his framing, the organizations that succeed in this new paradigm are those that learn to “speak” the language of their own data, turning disconnected datasets into connected insight. This linguistic approach, he argues, transforms discovery from a problem of scale into one of interpretation, enabling innovation to progress at the speed of meaning rather than the limits of computation.
Building Multimodal Frameworks to Uncover Hidden Relationships
From Ninio’s perspective, treating scientific data as a language is only the beginning. The next step, he explains, is enabling AI to combine languages — to reason across domains and data types simultaneously.
“Where this starts to get really exciting is that we can actually start to combine different languages. It could be English, it could be proteins, it could be music, it could be soil health and microbiome. These are vastly different languages, but because of the way we construct knowledge graphs, they can start to make their own linkages which are not necessarily intuitive.”
– Ben Ninio, Principal in Strategy at Deloitte
Ninio uses this analogy to illustrate the concept of multimodal AI — systems capable of synthesizing information from text, numerical data, sequences, and imagery into unified knowledge graphs. These structures, he explains, enable cross-domain reasoning.
According to Ninio, this capacity for abstraction marks a shift from linear modeling to networked understanding. “The algorithms are seeing relationships we can’t necessarily articulate,” he explains. “They’re connecting patterns between biological, chemical, and environmental data that our minds weren’t built to hold all at once.”
Ninio stresses that this approach requires enterprises to balance technological innovation with cultural change. He identifies three foundational layers for organizations adopting multimodal reasoning:
- Semantic Interoperability: Translating data from multiple scientific “languages” into shared representational formats.
- Knowledge Graph Construction: Mapping relationships between concepts and entities across disciplines.
- Multimodal Modeling: Training AI systems to reason across text, sequence, image, and numerical data simultaneously.
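The first two layers above, and a drastically simplified stand-in for the third, can be sketched as follows. The entities and relations are invented for illustration, and real multimodal modeling involves trained models rather than set intersection; the sketch only shows how normalizing two domain "languages" into a shared triple format lets a graph surface a link neither domain states directly.

```python
from collections import defaultdict

# Layer 1: semantic interoperability -- translate records from two
# scientific "languages" (genomics and soil health; entities invented)
# into a shared (subject, relation, object) triple format.
genomics = [("geneX", "upregulates", "enzymeQ")]
soil = [("microbeM", "produces", "enzymeQ"),
        ("microbeM", "improves", "soil_nitrogen")]

triples = genomics + soil

# Layer 2: knowledge graph construction -- an adjacency map over all
# triples, treated as undirected for neighborhood queries.
graph = defaultdict(set)
for subj, _, obj in triples:
    graph[subj].add(obj)
    graph[obj].add(subj)

# Layer 3 (toy stand-in): two entities that share a neighbor may be
# related even though no single dataset ever observed them together.
def shared_neighbors(a, b):
    return graph[a] & graph[b]

print(shared_neighbors("geneX", "microbeM"))  # both touch enzymeQ
```

The non-intuitive linkage Ninio describes is exactly this shape: a gene and a soil microbe connected only through a shared intermediary that lives in neither original dataset's silo.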
These capabilities, Ninio explains, depend on cross-functional collaboration. “Multimodal frameworks only work when teams are open,” he says. “Executives need to align incentives around collective discovery, not just departmental performance.”
He adds that aligning human governance with machine learning systems ensures that AI-generated hypotheses remain grounded in scientific credibility.
Ninio emphasizes that the quality and framing of data remain decisive factors in the success of AI-driven research. Not all data carries equal value, and the way information is structured and contextualized determines how effectively models can learn from it. He describes this as the point where “art meets science” — a process of continual iteration that balances human intuition with empirical validation through wet-lab experimentation.
In Deloitte’s applied work, Ninio notes that the most transformative discoveries emerge when public datasets are connected with private, proprietary information. By combining open research with the often underutilized internal data locked inside R&D organizations, teams can reveal patterns that would otherwise remain invisible.
The synthesis Ninio describes allows AI systems to generate hypotheses that cross traditional disciplinary boundaries, linking insights from genomics to materials science, or agricultural chemistry to molecular biology.
Ninio welcomes comparisons between the current stage of AI adoption and the early internet era, when the potential of digital connectivity was only beginning to be understood. Just as the web transformed communication and commerce, he believes that multimodal AI will redefine scientific discovery, creating continuous, cross-domain systems of innovation rather than siloed lines of inquiry.
Ultimately, Ninio argues that the organizations willing to invest in this connective infrastructure today will be the ones shaping the future of research and development. Those capable of uniting their data environments into shared, intelligent networks will not simply accelerate discovery — they will redefine the pace and purpose of innovation itself.