Event Recording, Make Way for Long-Term Holter Monitoring!
Among the things that we can absolutely count on is change. Time marches on, and technology marches with it. The ambulatory ECG market is no exception to this rule. The dramatic improvements in electronics, particularly in semiconductors (e.g., processors, memory) and communication technology, have given us new ways to address the desire, and now the need, to capture the heartbeats (if not the hearts) of our patients. An argument can be made that ambulatory ECG provides a fundamental and potentially critical piece of data in any health profile. It is the medical equivalent of the automotive industry's "engine analyzer."
A brief history of ambulatory ECG devices (Holters, event recorders)
Briefly stated, the earliest "ambulatory" ECG recorders were bulky, clumsy arrangements that cost a fortune. Patients lugged around recording systems (reel-to-reel magnetic tapes) and power supplies of prodigious proportion: storage media of such size, weight, and volume that only the hardy need apply. As the storage technology shrank from reel-to-reel, to cassette tapes, to solid-state (semiconductor) memory, the portability of the ambulatory system became more and more reasonable.
The last barrier to fall was the storage capacity of these memory devices. While memory capacity was still limited, engineers were constrained to detect and record only the most egregious, life-threatening arrhythmias. That was the focus of the ambulatory ECG market for most of its relatively short history (that is, since the late 1970s), and it gave rise to the technology we refer to as "event recording."
For many years, the technology didn’t exist to record each and every beat. (Truth be told, neither did the computing power exist to subsequently analyze each and every beat in a reasonable period of time.) Systems were designed to record only the fraction of beats determined to be problematic. The “snapshot” of beats could be initiated by the patient pressing a button, or the machine in its infinite wisdom could “auto-trigger” when a specific condition was detected. In either case, the broader concept of an “event recorder system” was born.
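The "auto-trigger" idea can be sketched in a few lines. The logic below is purely illustrative: it assumes a QRS detector has already produced beat-to-beat (R-R) intervals, and the bradycardia/tachycardia thresholds are made-up round numbers, not clinical criteria or any particular device's algorithm.

```python
# Minimal sketch of an event recorder's "auto-trigger" check.
# Assumes R-R intervals (seconds between successive beats) are already
# available from a QRS detector. Thresholds are illustrative only.

def heart_rate_bpm(rr_interval_s):
    """Convert a single R-R interval (seconds) to beats per minute."""
    return 60.0 / rr_interval_s

def auto_trigger(rr_intervals, brady_bpm=40, tachy_bpm=150):
    """Return a label if any beat-to-beat rate crosses a threshold,
    else None. Real devices use far more sophisticated criteria
    (morphology, sustained runs, noise rejection, etc.)."""
    for rr in rr_intervals:
        bpm = heart_rate_bpm(rr)
        if bpm < brady_bpm:
            return "bradycardia"
        if bpm > tachy_bpm:
            return "tachycardia"
    return None

# A run of 0.35 s intervals (~171 bpm) trips the tachycardia trigger.
print(auto_trigger([0.8, 0.8, 0.35, 0.35, 0.35]))  # tachycardia
```

When a check like this fires, the device saves a "snapshot" of ECG surrounding the trigger, which is exactly the fraction-of-beats compromise the limited memory of the era forced.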
Initially, event transmission was accomplished by the patient connecting the recorder to a land-line phone in order to send the data to the monitoring site. Of course, the patient had to be symptomatic to realize there was something to transmit, and a very serious (read "fatal") arrhythmia left no time for that. (Putting anything that might be construed as an "alarm" on a medical device to alert the patient to a possible event has serious implications for how the FDA views the product.) In fact, even with the advent of auto-detection of events by the recorder and the addition of nearly real-time transmission of those events to a central monitoring location, the improvement in patient outcomes remains the subject of much controversy. The event recording/monitoring/analysis industry is desperately trying to justify its existence with papers and studies that attempt to conclusively prove the efficacy of event monitoring in terms of patient outcomes.
Growth of an industry
When the concept of transmitting event data to a central monitoring site was introduced, an entire industry grew up around the event recording, transmission, and analysis paradigm. Every possible permutation occurred. Service companies tried owning all the recorders and software and selling the service to doctors. Sometimes they sold the recorders but retained the service component. Leasing recorders to doctors. Pay-per-study contracts. Preferential volume discounts. Doctors owning the hardware and contracting only for the monitoring service and the report. CPT coding and billing were a nightmare. CMS couldn't implement codes fast enough to keep track of who was doing what. Loopholes abounded. Abuse was rampant. It was the "wild west" era for ambulatory ECG monitoring.
Unfortunately, the response time to a patient in distress from a major cardiac event needs to be measured in seconds or minutes (as in 1-3 minutes), not tens of minutes or hours. For example, a study done by Panasonic Healthcare in Yokohama, Japan, published in the Journal of Telemedicine and Telecare, found that the single biggest factor in improving outcomes for patients with remote heart monitoring of chronic heart failure was rapid intervention. The application of all this technology culminating in near real-time transmission and monitoring is for naught if the response to that patient’s condition is not near real-time as well. Not to be flippant about a serious medical matter, but we’d probably save a lot more in terms of resources if we simply sent the GPS coordinates of the patient in distress so we’d know where to find the body!
The truth is that for the relatively small segment of society that can afford to pay for this boutique type of monitoring, and that lives in an urban environment where an extremely rapid response is possible, it may change the outcome. For the rest of us who live in the real world, let’s be honest, the paradigm of event recording is fatally flawed, both technically and practically.
Technology to the rescue
However, that is not the end of the story. With advances in semiconductor technology (processing power, low-power design, and memory capacity), there is tremendous diagnostic value to be had by recording all of a patient's heartbeats over an extended period and letting a computer do the "heavy lifting" involved in a complete post-analysis of that data. That value cannot be overstated.
Multi-day Holter is not only possible, it makes enormous sense. Around the world, countries are recognizing the value of two-, three-, even 14-day Holter studies. The added cost of the longer use of the equipment and the analysis of far more data is incremental. It leverages the gains made by improved technology to provide more valuable information at a lower cost per unit of data. And it eliminates the single major cost factor in the outdated event paradigm still in use here in the U.S.: the 24/7 human monitoring of the data, with no clear improvement in outcome for all that expense.
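The feasibility claim comes down to simple arithmetic. As a rough sketch (the sample rate, resolution, and channel count below are assumed typical ambulatory ECG values, not the specification of any particular device):

```python
# Back-of-envelope storage estimate for full-disclosure multi-day Holter.
# Sample rate, sample size, and channel count are illustrative
# assumptions, not the spec of any real recorder.

def storage_mb(days, channels=3, sample_rate_hz=250, bytes_per_sample=2):
    """Raw (uncompressed) storage needed for continuous ECG recording."""
    seconds = days * 24 * 60 * 60
    total_bytes = seconds * sample_rate_hz * channels * bytes_per_sample
    return total_bytes / (1024 * 1024)

for days in (1, 7, 14):
    print(f"{days:2d} days: about {storage_mb(days):,.0f} MB")
```

Even a 14-day, three-channel study at these assumed settings lands under 2 GB uncompressed, which is trivial for modern flash memory; the once-prohibitive storage barrier is simply gone.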
Better diagnostic capabilities — a resounding “Yes!”
So where do we go from here? Research is being done. Papers are being published. It's a great grad-school topic for a statistician, a health care provider, or a business major. But we can certainly reduce the cost of collecting the ECG data. We can gather far more high-quality data, with less human intervention, than ever before. That leads to improvements in the accuracy of the diagnostics.
But in the regime where urgent treatment is required (i.e., a human response to a dire ECG situation), few changes make a real difference. For most ambulatory patients, the response time necessary to effect a good outcome is orders of magnitude shorter than what is practical outside an inpatient setting. For patients in extremis, the greatest improvement is seen in those already residing in a facility with acute cardiac care capabilities. And that does not scale well at all!