How long should patient test data be retained?

In a recent post on the AARC Diagnostics forum, a PFT Lab manager asked how long they needed to keep database records. The ostensible reason was that they had accumulated too many years of records and had been having database problems.

The poster wasn’t specific about what kinds of problems they were having. Database problems can be hard to diagnose, particularly when a database is networked, but with a modern SQL database the number of records shouldn’t be an issue. SQL databases containing millions of records are routinely used in demanding multi-user applications. If this were thirty years ago, when computers first started to become commonplace in the PFT Lab, I could understand, since PC-based databases were still in their infancy then. It was at least partly for this reason that a number of PFT equipment manufacturers developed their own proprietary databases. That is no longer the case, and I have difficulty believing there are any manufacturers today that don’t use a commercial SQL database of one kind or another.

I am not suggesting the poster wasn’t having problems. Even though SQL databases tend to be very robust, that doesn’t mean that incorrect settings or bugs in the software accessing the database can’t cause problems. Equipment manufacturers and hospital IT departments may not have the expertise or the patience (or even the desire) to diagnose and fix these kinds of problems either. What I found curious, however, was that almost everybody responding to the original post seemed eager to get rid of their “old” patient data as soon as they possibly could.

I don’t understand this at all. My lab’s database goes back to 1990 and we’ve worked hard to maintain its integrity throughout that time. Since 1990 my lab has gone through at least six major software and database revisions and has migrated the database from its original home on an IBM PC AT, to a lab-only local area network, and then to a shared SQL server managed by the hospital’s IT department. When my hospital merged with another in the late 1990s, we also merged our database with that of the other hospital’s PFT lab. Our database currently contains (among other things) 667,000 spirometry results from 159,000 patient visits. It’s hard to be absolutely sure, of course, but as far as I can tell we haven’t lost any information along the way.

So why go to all this trouble? Legally, as long as our paper PFT reports are in Medical Records (yes {sigh}, we’re still required to generate paper reports, although with electronic report signing on the horizon that requirement will go away soon), we’re not required to keep our electronic database records for any particular time period at all (I will mention in passing that our state’s requirement for regular Medical Records is that they be kept for 40 years). We’ve also had an interface with the hospital’s computer system for almost 20 years (which itself has gone through at least four major revisions), and all of the patient reports we’ve uploaded are still in the hospital’s information system.

The reason we go to all this trouble is very simple: neither Medical Records nor the hospital’s information system can generate a trend report.

I think that trends are a critical but often overlooked part of PFT testing. Certainly there are many patients who are only seen once in the PFT Lab and never seen again (or at least not in the same lab, and that is an important issue unto itself). Many patients are seen more than once, however, and how normal or abnormal their test results are is often less important than how their results have changed. We regularly see a number of patients every week who have PFT records dating from 10, 15 or even more than 20 years ago. When test results are reviewed, their trends often have a significant influence on the test interpretation (and the inability to assess trends is one reason that computerized interpretations have been a consistent failure for decades).
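
To make this concrete, here is a minimal sketch of the kind of query a trend report is built on. It uses Python’s built-in sqlite3 module and an entirely hypothetical table layout (the table, column and patient identifiers are illustrative, not any manufacturer’s actual schema): pull one patient’s FEV1 values and line them up by test date.

```python
# A minimal sketch, assuming a hypothetical "spirometry" table with
# patient_id, test_date and fev1 columns -- not any vendor's schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE spirometry (patient_id TEXT, test_date TEXT, fev1 REAL)"
)

# A few illustrative rows standing in for decades of visits.
conn.executemany(
    "INSERT INTO spirometry VALUES (?, ?, ?)",
    [("MRN001", "1994-03-02", 3.10),
     ("MRN001", "2004-06-17", 2.71),
     ("MRN001", "2014-09-30", 2.28)],
)

# The heart of a trend report: one patient's results, in date order.
for test_date, fev1 in conn.execute(
    "SELECT test_date, fev1 FROM spirometry "
    "WHERE patient_id = ? ORDER BY test_date",
    ("MRN001",),
):
    print(test_date, fev1)
```

The specifics don’t matter; the point is that once results live in an ordinary SQL database, a lifetime of tests can be lined up chronologically with a single query, which is exactly what a system that only stores finished reports cannot do.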

Keeping patient records, most particularly in a usable form, is a critical part of patient service, and this is why I don’t understand the apparent desire of some labs to get rid of their data as quickly as they can. Even if a lab regularly uploads patient test results to its hospital’s information system, does that system store the graphical results? Does it store all trials for a single test? Does it store the raw test data and not just a few of the results? And most importantly of all, can it generate a trend report? If the answer is yes to all of these, then I would agree that the lab itself does not need to retain test data any longer than the bare minimum necessary. Otherwise, I think a lab has a responsibility to try to retain patient test results indefinitely.

[Full disclosure: I am forced to admit that I have a reputation as a data pack rat, and with some justification. I’ve had a PC of one kind or another since the 1970s (anybody remember the TRS-80 Model III?) and I have some personal files on my computer that are over 30 years old, so you can see that I have trouble understanding why you wouldn’t want to keep patient data for as long as possible.]

There are a number of factors, however, that are driving dramatic improvements in data interchange between hospital information systems, clinics, physicians and patients. Patient information has been stored as paper records in Medical Records for a very long time, and more recently and more conveniently in Hospital Information Systems, but both are a bit like a black hole: information goes in, but it takes an awful lot of hard work to get it back out. The need to interchange data is forcing fundamental changes in the way information is stored and retrieved. This process is still ongoing, but it is already allowing extremely large on-line datasets to be created and maintained with relative ease. These in turn can be searched quickly for trends and correlations, and although this is certainly changing the nature of research, it is also changing the nature of patient care.

At some point, data interchange will reach the stage where it won’t matter when or where a patient had their PFT tests performed, since it will all be on tap, ready to be reviewed and trended no matter where the patient is or who they are seeing. We’re not there yet, however, and at a guess the era of universal and transparent data interchange is still at least a decade away, so retaining data still matters.

When we have the opportunity to acquire new test systems for our labs, we usually focus on what tests the equipment performs and how well we think it will perform them. Just as importantly, we also need to think about how the test data is stored, retrieved and shared. There should be no technical reason a lab can’t keep all of its test data for as long as it wants or needs to, even if that is for multiple decades. SQL databases are a mature and reliable technology; retrieval speed is not greatly affected by how many records are stored, and disk space is cheap. All of the major SQL databases have a variety of tools for maintaining database health. Database problems are unacceptable, and if your lab is having them, it is unlikely that having “too many” records is the reason. Instead, those responsible (the equipment vendor and the hospital IT department) need to be taken to task until the problems are fixed.
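
To put the speed and maintenance claims in concrete terms, here is a small sketch, again using Python’s built-in sqlite3 module and the same hypothetical spirometry layout as above (the table, column and index names are illustrative). An index on patient and date is what keeps a single patient’s trend lookup fast whether the table holds ten thousand rows or ten million, and SQLite’s ANALYZE and integrity_check commands stand in here for the vendor-specific health tools the major databases provide.

```python
# A sketch, not a vendor recipe: the "spirometry" table is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE spirometry (patient_id TEXT, test_date TEXT, fev1 REAL)"
)

# The index is what makes per-patient retrieval roughly independent of
# how many total records the table holds.
conn.execute(
    "CREATE INDEX idx_spiro_patient ON spirometry (patient_id, test_date)"
)

# Routine maintenance: refresh the query planner's statistics and verify
# that the database itself is structurally sound.
conn.execute("ANALYZE")
print(conn.execute("PRAGMA integrity_check").fetchone())  # prints ('ok',)
```

The commercial databases behind PFT software have their own equivalents of these commands; the larger point is that keeping decades of records is a routine, well-supported workload, not an exotic one.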


PFT Blog by Richard Johnston is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
