Onyx Blog

The Architecture Behind Intelligence: A Conversation with Dragon Bashyam


By Susheel Ladwa, CEO, Onyx 

Moving clinical data is no longer the hard part. Using it is.

Payers today receive clinical information from dozens of sources — provider EHRs, public health systems, exchange partners — and most of it arrives fragmented, inconsistent, and hard to operationalize. Connectivity is solved. Intelligence isn’t.

I wrote recently about why the payer organizations that will lead are the ones focused on risk, quality, and care delivery — the ones building toward intelligence, not just connectivity. This post is about how that actually gets built.

That’s the gap OnyxOS is designed to close: a platform that acquires, validates, curates, enriches, and standardizes clinical data so payer organizations can use it at scale — for risk adjustment, HEDIS quality programs, care gap identification, prior authorization, and payment integrity.

How that works under the hood is a technical conversation, and Nagesh “Dragon” Bashyam is the right person to have it. Our CTO and co-founder of InteropX (now part of Onyx), Dragon has authored more than 20 HL7 FHIR and C-CDA standards, built clinical data infrastructure serving 30 million patients nationwide, and advised HHS, ONC, CDC, and HRSA on U.S. digital health standards and policies.

What follows is that conversation.


On why interoperability didn’t solve the usability problem 

I began by asking Dragon why so many organizations still struggle to use clinical data even after major investments in interoperability. 

Dragon: 

Interoperability solved a very important part of the problem, which is connectivity. 

Today we have standards like HL7, C-CDA, and FHIR that allow systems to exchange information more consistently. Regulations accelerated adoption, so data can move between payers, providers, and other organizations much more easily than before. 

But moving data is only the first step. 

Most payer organizations receive clinical information from many different sources — provider EHR systems, public health systems, and other partners. Each of those sources produces data slightly differently. 

So the data arrives fragmented. Sometimes it’s incomplete. Sometimes it isn’t standardized in a way analytics or operational systems can easily use. 

The real challenge now is not whether we can exchange data. 

The challenge is making that data usable. 
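To make the fragmentation concrete, here is a hypothetical sketch of the same HbA1c lab result as it might arrive from two different sources. The field names, code systems, and date formats are illustrative, not taken from any real feed, but the pattern is the one Dragon describes: both records describe the same clinical fact, yet they share nothing a downstream system could join on directly.

```python
# Hypothetical sketch: the same HbA1c result as it might arrive from two
# sources -- field names, code systems, value types, and dates all differ.
record_from_ehr = {
    "code": {"system": "http://loinc.org", "value": "4548-4"},  # LOINC code for HbA1c
    "result": 7.2,
    "unit": "%",
    "collected": "2024-03-01",
}
record_from_hie = {
    "testName": "HEMOGLOBIN A1C",   # free-text name, no code system
    "value": "7.2",                  # numeric value arrives as a string
    "units": "percent",
    "date": "03/01/2024",
}

def is_directly_comparable(a: dict, b: dict) -> bool:
    """Naive check: do two records share any field names at all?"""
    return bool(set(a) & set(b))

print(is_directly_comparable(record_from_ehr, record_from_hie))  # False
```

Nothing here is wrong in either record; they simply cannot be analyzed together until something normalizes them into one model.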

On the architecture required to make clinical data usable 

Solving that problem requires thinking about the platform architecture in layers. 

Clinical data must first be acquired reliably, then standardized and structured, and finally made usable inside operational workflows. 

This is the approach behind OnyxOS, which combines scalable data acquisition, a FHIR-native data fabric, and intelligence layers that allow payer organizations to operationalize clinical data across multiple programs. 
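The layered flow described above can be sketched as three plain functions composed in order. This is a minimal illustration of the acquire, standardize, operationalize sequence, not OnyxOS itself; all field names (`pid`, `loinc`, the queue label) are invented for the example, and a real platform would back each layer with connectors, a FHIR store, and program-specific services.

```python
from typing import Callable

def acquire(source: dict) -> dict:
    """Layer 1: pull a raw record from a source system (stubbed here)."""
    return dict(source)  # copy so later layers never mutate the source

def standardize(raw: dict) -> dict:
    """Layer 2: map source-specific fields onto one consistent shape."""
    return {
        "patient_id": raw.get("pid") or raw.get("patientId"),
        "code": raw.get("loinc") or raw.get("testCode"),
        "value": float(raw["value"]),
    }

def operationalize(record: dict) -> str:
    """Layer 3: hand the standardized record to a downstream program."""
    return f"risk-adjustment queue <- {record['patient_id']}:{record['code']}"

pipeline: list[Callable] = [acquire, standardize, operationalize]
result = {"pid": "p-001", "loinc": "4548-4", "value": "7.2"}
for step in pipeline:
    result = step(result)
print(result)  # risk-adjustment queue <- p-001:4548-4
```

The point of the layering is that each stage can evolve independently: a new source only touches acquisition, a new code system only touches standardization.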

On the role of clinical data pipelines 

I asked Dragon what that architecture looks like in practice. 

Dragon: 

The key is building reliable data pipelines between payers and providers. 

At InteropX we focused on creating what we call a virtual data pipeline — infrastructure that continuously brings clinical data from provider systems into payer environments in a standardized way. 

This foundation now supports capabilities integrated into OnyxOS, including Payer-to-Payer Data Exchange, Provider Access APIs, Electronic Prior Authorization, and Clinical Data Acquisition from provider EHR systems. 
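A core mechanic of a continuous pipeline like the one Dragon describes is incremental pull: fetch only what changed since the last run, then advance a cursor. The sketch below is a hypothetical, stubbed version of that loop; a real implementation would query a provider's FHIR server (for example with a `_lastUpdated` search parameter) instead of the in-memory `SOURCE` list used here.

```python
# Hypothetical sketch of an incremental pull. fetch_page stands in for a
# FHIR search such as GET /Observation?_lastUpdated=gt{since}.
SOURCE = [
    {"id": "obs-1", "lastUpdated": "2024-03-01T10:00:00Z"},
    {"id": "obs-2", "lastUpdated": "2024-03-02T09:30:00Z"},
]

def fetch_page(since: str) -> list[dict]:
    """Stub: return records updated after the given ISO-8601 timestamp."""
    return [r for r in SOURCE if r["lastUpdated"] > since]

def pull_incremental(cursor: str) -> tuple[list[dict], str]:
    """Pull only records newer than the cursor, then advance the cursor."""
    batch = fetch_page(cursor)
    if batch:
        cursor = max(r["lastUpdated"] for r in batch)
    return batch, cursor

batch, cursor = pull_incremental("2024-03-01T12:00:00Z")
print([r["id"] for r in batch], cursor)  # ['obs-2'] 2024-03-02T09:30:00Z
```

Keeping the cursor durable between runs is what makes the pipeline "virtual": the payer never re-ingests the full history, only the delta.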

Without something like that, organizations usually try to build the pipelines themselves. They pull data from many sources, normalize different formats, and try to maintain those integrations over time. 

That becomes very complex and expensive. 

Healthcare standards evolve. Regulations change. New data sources appear. Maintaining those pipelines becomes a full-time engineering effort. 

A platform approach allows organizations to plug into technology that already understands healthcare standards and data models — and continues evolving as the ecosystem changes. 

That allows payer organizations to focus on their core business instead of constantly maintaining the underlying data plumbing. 

On turning clinical data into something operational teams can use 

Once that data is flowing reliably, the next challenge is turning it into something operational teams can actually use. 

I asked Dragon what that requires from a platform perspective. 

Dragon: 

The next step is making sure the data becomes usable. 

Clinical information often arrives as documents or fragmented records across multiple systems. To make that useful, the platform needs to normalize and structure the data so it can support analytics and operational workflows. 

That’s where a FHIR-native data fabric becomes important. 

Once clinical data is standardized in a consistent model, organizations can apply analytics, automation, and AI much more effectively. 
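One small but representative piece of that standardization is terminology mapping: translating source-local lab codes onto a shared code system so every downstream consumer sees one shape. The sketch below maps a local code to LOINC and emits a FHIR-style Observation; the `LOCAL_TO_LOINC` table is purely illustrative (real mappings come from terminology services), and the output is a simplified subset of the FHIR Observation resource, not the full specification.

```python
# Illustrative local-code -> LOINC table; real mappings are maintained in
# terminology services, not hard-coded dictionaries.
LOCAL_TO_LOINC = {
    "HBA1C": "4548-4",
    "GLU-F": "1558-6",
}

def to_fhir_observation(raw: dict) -> dict:
    """Normalize a source-local lab record into a FHIR-style Observation."""
    loinc = LOCAL_TO_LOINC.get(raw["localCode"].upper())
    if loinc is None:
        raise ValueError(f"unmapped local code: {raw['localCode']}")
    return {
        "resourceType": "Observation",
        "code": {"coding": [{"system": "http://loinc.org", "code": loinc}]},
        "valueQuantity": {"value": float(raw["value"]), "unit": raw.get("unit", "")},
    }

obs = to_fhir_observation({"localCode": "hba1c", "value": "7.2", "unit": "%"})
print(obs["code"]["coding"][0]["code"])  # 4548-4
```

Once every source funnels through a step like this, analytics and AI can query one model instead of dozens of source dialects.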

This is also where platforms like OnyxOS extend beyond interoperability — enabling applications that support risk adjustment, HEDIS quality programs, care gap identification, prior authorization workflows, and payment integrity. 

Without that structured data foundation, those programs become extremely difficult to scale. 
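To show why the structured foundation matters for programs like care gap identification, here is a hypothetical sketch in the spirit of a HEDIS-style measure: flag members with diabetes who have no HbA1c result in the measurement year. The member records, condition labels, and the use of LOINC 4548-4 are illustrative simplifications, not the actual measure specification, and the check only works because the lab data has already been normalized to one code system.

```python
from datetime import date

# Illustrative member records, already standardized to one model.
members = [
    {"id": "m1", "conditions": ["diabetes"],
     "labs": [{"loinc": "4548-4", "date": date(2024, 2, 10)}]},
    {"id": "m2", "conditions": ["diabetes"], "labs": []},
    {"id": "m3", "conditions": [], "labs": []},
]

def hba1c_gaps(members: list[dict], year: int) -> list[str]:
    """Return ids of members with diabetes but no HbA1c in the given year."""
    gaps = []
    for m in members:
        if "diabetes" not in m["conditions"]:
            continue  # member is not in the eligible population
        tested = any(
            lab["loinc"] == "4548-4" and lab["date"].year == year
            for lab in m["labs"]
        )
        if not tested:
            gaps.append(m["id"])
    return gaps

print(hba1c_gaps(members, 2024))  # ['m2']
```

Against fragmented source data, the same query would first have to recognize every local spelling and coding of "HbA1c", which is exactly the work that does not scale.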

Across the market, we’re seeing this challenge play out in very real ways. 

In one engagement with a large regional health plan in the Northeast, the focus is improving risk adjustment accuracy by ensuring clinical data arriving from provider systems is standardized and usable for analytics. 

In another deployment with a large regional health plan in the Southeast, the priority is supporting HEDIS quality programs, where teams need reliable clinical data to identify care gaps and improve quality performance. 

In both cases, the underlying issue is the same: the data exists, but organizations need the right architecture to turn that data into operational intelligence. 

On where AI fits 

AI is receiving enormous attention across healthcare today, so I asked Dragon where it fits into this architecture. 

Dragon: 

AI can be very powerful, but it has to be applied to real problems. 

Right now there is a lot of hype around AI, but in healthcare the most important thing is still the data. 

If the data is fragmented or inconsistent, AI cannot solve that problem. 

AI works best when it operates on clean, standardized data. 

Once that foundation exists, AI can help extract insights, support analytics, and enable new operational workflows. 

But the foundation has to come first. 

On the next phase of healthcare data platforms 

Healthcare interoperability has made tremendous progress. The industry can now exchange clinical data at a scale that wasn’t possible even a decade ago. 

But the next phase of the industry is about making that data operational. 

Platforms like OnyxOS combine scalable data acquisition, a FHIR-native data fabric, and intelligent workflows so organizations can move beyond simply exchanging information and begin using clinical data to drive measurable outcomes. 

The organizations that succeed in the next phase of healthcare will not simply move data faster. 

They will turn clinical data into intelligence that powers risk, quality, and care programs across the enterprise. 

That’s what a payer intelligence platform actually means in practice.

Susheel Ladwa

Onyx CEO