At SapioCon 2025, amid the excitement around AI and digital innovation, Dr. Joanne Hackett’s talk stood out for its insistence on something much harder: getting the fundamentals right.
She began with a striking visual—a room full of attendees rising to their tiptoes, a physical metaphor for the leaps we’ve made in medicine, like a $3.2 million gene therapy injection that restores mobility to muscular dystrophy patients. But the bigger question was this: how do we make such transformative outcomes routine rather than exceptional?
The answer, she argued, lies in the full arc of healthcare—not just in discovery or diagnosis, but in a connected continuum that spans prevention, diagnosis, treatment, and follow-up. That vision is increasingly within reach, but only if we stop treating data as an endpoint and start treating it as the infrastructure that supports the continuum.
From big data to useful data
We’ve all heard about the promise of big data in healthcare. Genomics, wearables, electronic health records, patient-reported outcomes—the information is there. What we’re missing is coherence. We are drowning in data but starving for insight.
Hackett put it plainly: collecting data without a plan for curation, standardization, and connectivity is a complete waste of time.
She wasn’t exaggerating. Of approximately 200 global genomics initiatives today, only a handful are using the data they collect effectively. That’s not just a technical issue—it’s a strategic failure. The real value of healthcare data doesn’t emerge at the point of collection. It comes from making that data interoperable, longitudinal, and reusable.
That means clean data. It means data that is formatted correctly at the outset. And it means breaking down silos between research, diagnostics, and clinical care. This is where digitalization needs to grow up—not to impress, but to enable.
“Collecting data without a plan for curation, standardization, and connectivity is a complete waste of time.”
Dr. Joanne Hackett, IQVIA
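What does "formatted correctly at the outset" look like in practice? As a rough illustration only, and not anything Hackett or IQVIA prescribes, here is a minimal Python sketch of an intake step that rejects records missing identifiers and converts values to one canonical unit per test before they are stored; the field names, codes, and conversion table are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical intake record; field names are illustrative, not a real standard.
@dataclass
class LabResult:
    patient_id: str
    test_code: str       # standardized test identifier, e.g. a LOINC code
    value: float
    unit: str
    collected_at: datetime

# Conversions applied once, at the point of intake, so every downstream
# consumer sees a single canonical unit per test (factor is approximate).
UNIT_CONVERSIONS = {
    ("glucose", "mmol/L"): ("mg/dL", 18.0),
}

def normalize(result: LabResult) -> LabResult:
    """Validate and normalize a record before it enters the shared store."""
    if not result.patient_id or not result.test_code:
        raise ValueError("record is missing its patient or test identifier")
    if result.collected_at.tzinfo is None:
        # Force explicit time zones so longitudinal ordering stays unambiguous.
        result.collected_at = result.collected_at.replace(tzinfo=timezone.utc)
    key = (result.test_code, result.unit)
    if key in UNIT_CONVERSIONS:
        target_unit, factor = UNIT_CONVERSIONS[key]
        return LabResult(result.patient_id, result.test_code,
                         round(result.value * factor, 1), target_unit,
                         result.collected_at)
    return result
```

The point is not this particular function but where it sits: curation happens once, at the boundary, rather than ad hoc in every downstream analysis.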
Imagine being able to track subtle health markers over time, correlating life events and symptoms in ways we’ve never done before. Is the 35-year-old woman struggling with infertility also the child who had cystic acne and poor motor coordination? Might she later develop early-onset Parkinson’s? We don’t know—because we don’t collect or connect this kind of longitudinal data. But we could. And we should. Because it’s not only more insightful—it’s cheaper and more humane than the reactive model we have now.
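To make the idea of a connected longitudinal record slightly more concrete, here is a deliberately naive sketch of my own: a single append-only timeline per patient that any care setting can write to and query. The patient, events, and dates are invented.

```python
from collections import defaultdict
from datetime import date

# One append-only timeline per patient, shared across care settings.
# Event domains and codes here are invented for illustration.
timelines: dict[str, list[tuple[date, str, str]]] = defaultdict(list)

def record_event(patient_id: str, when: date, domain: str, code: str) -> None:
    timelines[patient_id].append((when, domain, code))

def history_before(patient_id: str, cutoff: date) -> list[tuple[date, str, str]]:
    """Everything recorded for this patient before a given date, in order,
    regardless of which silo originally produced it."""
    return sorted(event for event in timelines[patient_id] if event[0] < cutoff)

# The fertility work-up at 35 can see the dermatology and motor findings
# from childhood, because they live on the same thread.
record_event("p-001", date(2002, 6, 1), "dermatology", "cystic_acne")
record_event("p-001", date(2004, 3, 15), "paediatrics", "poor_motor_coordination")
record_event("p-001", date(2025, 1, 10), "fertility", "infertility_workup")
print(history_before("p-001", date(2025, 1, 10)))
```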
Precision at the population level
The field of precision medicine has come a long way from its boutique beginnings. Hackett urged us to go a step further—to think in terms of precision public health. That means moving beyond individual mutations or risk scores and using global data to inform policies, tailor programs, and ultimately shape how entire populations are treated.
Take oncology. We’re now watching precision approaches replace broad-spectrum treatments in real time. The same is happening in neurology, where the historically slow pace of CNS (central nervous system) therapy development is finally accelerating. Cell, gene, and RNA therapies are no longer limited to acute or last-resort scenarios—they’re entering the mainstream.
But this precision is only possible if we can match the right treatment to the right patient at the right time. And that’s only possible if we understand the patient journey end to end—not in silos, but as a living thread of data that travels with the patient.
This kind of continuity is not just clinically important—it’s operationally critical. As Hackett reminded us, healthcare is also a business. The ability to triage faster, diagnose earlier, and initiate appropriate care sooner doesn’t just improve outcomes; it clears hospital beds, reduces waste, and allows systems to function more effectively.
Real-world evidence: From nice-to-have to necessity
In this new paradigm, real-world evidence (RWE) isn’t a supplementary dataset—it’s foundational. RWE helps us understand how therapies perform outside tightly controlled clinical trials. It informs access, reimbursement, and clinical decision making. And when combined with traditional trial data, it creates a more nuanced, more equitable picture of health outcomes.
To get there, though, we need systems that are built for it. Not just to collect and store data, but to curate, annotate, and connect it—across time, across platforms, and across domains.
That’s why Hackett emphasized the importance of platforms that can integrate everything from sequencing data to clinical records to published literature. Without that connective tissue, even the most advanced therapies will fall short of their potential.
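As a thought experiment, and emphatically not a description of any specific platform, that connective tissue can start with something as simple as shared keys that let sequencing results, clinical records, and literature annotations be pulled into one view. The records below are invented.

```python
# Invented example records; in a real system these would come from a
# sequencing pipeline, an EHR, and a curated literature source respectively.
sequencing = [{"patient_id": "p-001", "gene": "LRRK2", "variant": "G2019S"}]
clinical = [{"patient_id": "p-001", "finding": "early-onset parkinsonism"}]
literature = [{"gene": "LRRK2",
               "note": "variants in this gene are linked to Parkinson's risk"}]

def unified_view(patient_id: str) -> dict:
    """Join the three sources on patient and gene so one query spans domains."""
    variants = [s for s in sequencing if s["patient_id"] == patient_id]
    genes = {v["gene"] for v in variants}
    return {
        "patient_id": patient_id,
        "variants": variants,
        "clinical": [c for c in clinical if c["patient_id"] == patient_id],
        "literature": [entry for entry in literature if entry["gene"] in genes],
    }

print(unified_view("p-001"))
```

The hard part in real systems is agreeing on those keys and vocabularies in the first place, which is exactly the curation and standardization work Hackett insists on.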
The consumer data opportunity
One of the more provocative parts of Hackett’s talk focused on wearables and digital health tools. These devices are collecting data at an unprecedented scale—heart rates, sleep patterns, glucose levels, even stress markers. But we’re barely scratching the surface of what that data could mean across a patient’s lifetime.
The challenge isn’t just technological—it’s conceptual. We need to shift from a snapshot model of health to a streaming model. Health isn’t a series of isolated events; it’s a continuous state, and our systems should reflect that.
Imagine if data from your fitness tracker could one day inform your cancer treatment plan—or help predict which therapies you’re most likely to respond to. That’s not science fiction; it’s an integration problem. The data is there. The bridge is not.
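One way to picture the shift from snapshots to streams, sketched here purely for illustration rather than as any product's design: keep a rolling window over continuous wearable readings and surface summaries instead of single-visit values. The window size and numbers are arbitrary placeholders, not clinical guidance.

```python
from collections import deque
from statistics import mean

class RollingVital:
    """A rolling window over a continuously streamed measurement."""

    def __init__(self, window_size: int = 7 * 24):  # e.g. a week of hourly readings
        self.readings: deque[float] = deque(maxlen=window_size)

    def add(self, value: float) -> None:
        self.readings.append(value)

    def summary(self) -> dict:
        if not self.readings:
            return {}
        return {"mean": round(mean(self.readings), 1),
                "min": min(self.readings),
                "max": max(self.readings),
                "count": len(self.readings)}

resting_hr = RollingVital()
for reading in (62, 61, 63, 74, 78, 80):  # hypothetical hourly resting heart rate
    resting_hr.add(reading)
print(resting_hr.summary())
```

The design choice mirrors the argument above: the system treats health as a continuous state, and the clinic consumes the stream rather than reconstructing it at each visit.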
Why the digital thread matters
Hackett described the current state of healthcare data as a string of pearls without the string. We have isolated datasets—some quite beautiful on their own—but no thread connecting them. That thread is what digitalization should offer: not just more data, but continuity.
A longitudinal, interoperable data thread enables more than just better science. It enables faster diagnoses, earlier interventions, smarter policy, and ultimately, better lives.
It also enables consent-driven, ethically managed data sharing, where patients understand the value of their data and receive actionable insights in return—a “value exchange” model that could be the future of public health programs.
If real-world data is to feed genomic, precision, and personalized medicine, then those data streams must be connected across care settings and geographies. This isn’t just about scale; it’s about interoperability—ensuring that data generated in Sweden, Kenya, or New York can inform care equally.
LIMS software as a strategic lever
Hackett highlighted the role of laboratory information management systems (LIMS) in making all this possible. In particular, she pointed to large-scale government projects in Norway and Denmark, where systems were deployed both to track samples and to curate genomic data in ways that accelerate treatment access and enable patient recontact.
These aren’t small pilots. Norway’s largest medical genetics department now processes over 25,000 samples per year through a unified system, while Denmark’s national initiative is using curated genomic data to proactively improve care.
These examples illustrate how LIMS software, when designed with interoperability and curation in mind, can become more than just back-end infrastructure. It becomes the nervous system of a truly digital healthcare continuum.
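To see why that matters at the record level, here is a simplified sketch of my own, not how the Norwegian or Danish systems are actually built: when a LIMS keeps the sample, its results, and recontact consent on one linked record, "go back to the patient when a finding becomes actionable" becomes a query rather than a project. All identifiers and fields are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Sample:
    sample_id: str
    patient_id: str
    received: date
    assay: str
    consent_recontact: bool          # stored alongside the sample, not elsewhere
    findings: list[str] = field(default_factory=list)

def recontact_candidates(samples: list[Sample], finding: str) -> set[str]:
    """Patients whose stored consent allows recontact when a finding
    (for example, a reclassified variant) becomes clinically actionable."""
    return {s.patient_id for s in samples
            if s.consent_recontact and finding in s.findings}

cohort = [Sample("s-100", "p-001", date(2024, 5, 2), "WGS", True,
                 ["variant_of_uncertain_significance"])]
print(recontact_candidates(cohort, "variant_of_uncertain_significance"))
```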
The role of policy and trust
Even the best technology won’t matter if trust isn’t built in. Hackett acknowledged the regulatory challenges of working across borders, especially around consent and data reuse. But she also emphasized that people are willing to share their health data—if they understand how it will be used and what they’ll get in return.
That places a new burden on health systems and technology vendors alike: to prioritize transparency, to design with patients (not just regulators) in mind, and to build platforms that are as easy to trust as they are to use.
Hackett also pointed out a practical truth often overlooked: health data isn’t particularly valuable in and of itself. It won’t make patients rich. But it can—and should—help make them healthier. That reframing is key to building the kind of public-private partnerships needed for healthcare’s digital future.
Where do we go from here?
Hackett closed with a vision of healthcare that’s as pragmatic as it is ambitious. Not a world of flying cars and miracle cures, but one where the basics—clean data, smart systems, interoperable platforms—are finally in place.
It’s a vision that requires discipline, coordination, and yes, humility. We don’t need to reinvent the wheel. We need to connect it to the axle.
LIMS platforms, especially those built for scalability and connectivity, are no longer optional—they’re essential. Not because they collect data, but because they enable us to make sense of it. Not because they’re digital, but because they are the digital thread itself.
At a time when healthcare innovation is accelerating, the bottleneck is no longer discovery. It’s integration.
And that’s exactly where our focus needs to be.