New test to establish integrity of biosamples used in medical research

Effective diagnosis and treatment of disease draw on painstaking research, which often relies on biological samples. The avalanche of studies undertaken to better understand illnesses and design effective therapies costs billions of dollars and potentially affects millions of lives.

So, it would seem reasonable to assume that the reliability of biological samples, on which accurate results depend, would be of paramount concern for the scientific community.

According to Chad Borges, a researcher in the Biodesign Virginia G. Piper Center for Personalized Diagnostics at Arizona State University, that assumption is quite often wrong.

"One of the major reasons that there are so many discoveries of biomarkers (early indicators of disease) in the literature, but so few positive validations that confirm those findings is the fact that in many cases during the discovery, samples were used that have a history or an integrity that's simply unknown."

Biological samples can be highly susceptible to changes over time, which often occur when they are removed from deep refrigeration. Degraded samples can produce spurious results in research. To address these concerns, Borges and his colleagues have designed a highly sensitive test that can be used to establish the integrity of blood plasma and serum, the most common biosamples used in medical research.

Ensuring such samples have been properly handled is the first step in careful research that meets the necessary high standards of reliability and reproducibility. The new test, which relies on accurate measurement of the relative proportions of two forms of the protein albumin present in blood, was recently described in the journal Molecular & Cellular Proteomics.

Houston, we have a problem

The immediacy of the issue of sample quality became apparent to Borges during the course of his own research, which involved experiments on biological samples slated for distribution by the National Institutes of Health (NIH). "We got a little suspicious that something wasn't quite right about the sample set," he says. Borges applied the newly designed test to the samples, with surprising results. "Lo and behold, there was a major difference between the cases and controls for this specimen integrity marker."

Further investigation revealed that the freezer in which the control samples were stored had lost power for several days during a natural disaster. "That information is really important with regard to the quality of the samples and the stability of the markers that were in them."

The implications of this discrepancy plainly went beyond his own research.

"Who knows how many other markers are differentiated simply because of the way in which the cases and controls were handled."

Chad Borges, researcher, Biodesign Virginia G. Piper Center for Personalized Diagnostics, Arizona State University

Lurking beneath the surface

In 2018 alone, the National Institutes of Health's MEDLINE database logged close to a million published papers on health-related research. Advances relying on research findings have transformed medical science, improving the quality of life and saving millions from dreaded diseases and afflictions.

But progress has not always been smooth going. In addition to the formidable scientific challenges facing researchers, political considerations and career concerns also influence how science is done. The pressure to publish findings in scientific journals often weighs heavily on researchers and is considered essential to the advancement of a young scientist's career. While rooted in career pragmatism, the increasingly competitive "publish or perish" culture can overshadow concerns about data reliability.

Long taken for granted, the issue of scientific reproducibility has recently moved to the forefront of discussions on the practice of science, as many studies face reexamination and increased scrutiny. Just how solid are the results of published studies? Can they be replicated? A recent book-length exposé makes the case that the issues surrounding scientific reliability are considerably more profound and alarming than once thought.

Biological samples are ground zero in the quest for dependable science, yet researchers hoping to publish their work may have a disincentive to spend time probing the integrity of their specimens. Should they uncover a problem, it may throw their data into question and preclude publication, a serious setback with little for the researcher to show for it. There is a danger of an "ignorance is bliss" mentality.

Know your specimens!

As Borges notes, addressing the problem requires two things. First, a regulatory body like the NIH needs to issue strict guidelines that include detailed documentation of sample history and handling. Currently, some scientific journals do require documentation of specimen storage conditions prior to publication, but such records are often inadequate for ensuring a high level of sample integrity. Second, researchers need reliable methods for testing their samples to ensure they meet exacting standards. The technique described in the current study is an important advance in this direction.

Depending on the nature and purpose of a given blood plasma or serum sample, even minor fluctuations in collection, processing, storage and handling can affect quality and reliability. The most important of the many factors affecting such samples is the time they have spent in a thawed condition above -30 °C. In many instances, this is also one of the more challenging variables to track over time.

One recent clinical example highlighting the issue of sample integrity concerns HER2, a critical biomarker used in the diagnosis of breast cancer. It has recently been discovered that this marker is highly unstable and can yield spurious results unless the tissue is processed within one hour of surgical resection.

QC for blood

The new biomarker sets cutoff values for blood plasma and serum, allowing researchers to easily assess the quality of samples and their suitability for given experiments, even if a detailed record of sample handling and storage is unavailable. For the first time, plasma and serum, the most commonly used biospecimens for medical research, can be tracked with a reliable biomarker.

The biomarker, which relies on the relative proportions of two isoforms of albumin, requires only a low volume of plasma or serum and minimal sample preparation. (The two isoforms are functionally similar, but one carries an oxygen-induced modification that accumulates to an abnormal extent outside the body, making it a telltale indicator of mishandled samples.)

Albumin is the most abundant protein in blood plasma and serum, accounting for roughly half of all protein content in these biofluids. Outside the body, the natural unmodified form of albumin becomes oxidized over time. This can be detected as a change in protein mass using mass spectrometry.
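To make the measurement concrete, here is a minimal sketch in Python of how the relative proportion of the oxidized form might be computed once the two albumin mass peaks have been resolved. The function name and the use of integrated peak areas are illustrative assumptions, not the published assay's actual data-processing pipeline.

```python
def oxidized_albumin_fraction(area_unmodified: float, area_oxidized: float) -> float:
    """Relative proportion of the oxidized albumin isoform.

    Assumes the two forms appear as resolvable peaks in a deconvoluted
    mass spectrum (the oxidized form at a slightly higher mass) and that
    their integrated areas are proportional to abundance. Illustrative
    only; the published assay's processing may differ.
    """
    total = area_unmodified + area_oxidized
    if total <= 0:
        raise ValueError("Peak areas must sum to a positive value")
    return area_oxidized / total
```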

Because the researchers have characterized the chemical rate law for this oxidation reaction in plasma and serum, the biomarker can act as a kind of molecular stopwatch, precisely gauging the elapsed time a particular sample has remained in a thawed state.
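As a rough sketch of the "molecular stopwatch" idea, the snippet below inverts a pseudo-first-order approach-to-plateau rate law to recover elapsed thaw time from a measured oxidized fraction. The functional form is one plausible reading of "chemical rate law", and the rate constant and fraction values are hypothetical placeholders chosen for illustration, not the constants reported in the study.

```python
import math

def estimate_thaw_time_hours(f_measured: float,
                             f_initial: float = 0.20,    # hypothetical fraction at time zero
                             f_plateau: float = 0.75,    # hypothetical equilibrium fraction
                             k_per_hour: float = 0.05):  # hypothetical rate constant at room temp
    """Invert f(t) = f_plateau - (f_plateau - f_initial) * exp(-k * t)
    to estimate how long a sample has spent in a thawed state."""
    if not (f_initial <= f_measured < f_plateau):
        raise ValueError("Measured fraction outside the model's valid range")
    return -math.log((f_plateau - f_measured) / (f_plateau - f_initial)) / k_per_hour

# Example: a sample measured at 45% oxidized albumin would, under these
# placeholder parameters, be estimated to have spent ~12 hours thawed.
print(round(estimate_thaw_time_hours(0.45), 1))
```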

The biomarker assay is inexpensive, rapid, easy to use and fully automatable, making it a strong candidate to serve as the new gold standard for plasma and serum quality control. It can detect biospecimen exposure to room-temperature conditions lasting as little as 2 hours, quickly and accurately identifying mishandled or mis-stored samples and preventing their inclusion in clinical research.
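In practice, such a readout would be compared against the matrix-specific cutoff values mentioned above to accept or reject a specimen. The sketch below shows that final quality-control step; the cutoff numbers are invented placeholders, not the thresholds established in the paper.

```python
# Hypothetical per-matrix cutoffs for the integrity marker; the real
# thresholds are established in the published study.
CUTOFFS = {"plasma": 0.30, "serum": 0.35}

def sample_passes_qc(marker_value: float, matrix: str) -> bool:
    """Return True if the specimen's integrity-marker value is below the
    cutoff for its matrix (plasma or serum), i.e. fit for research use."""
    try:
        return marker_value < CUTOFFS[matrix]
    except KeyError:
        raise ValueError(f"Unknown matrix: {matrix!r}") from None

assert sample_passes_qc(0.22, "plasma")      # acceptable sample
assert not sample_passes_qc(0.40, "serum")   # flagged as mishandled
```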

The study was carried out in collaboration with Maricopa County Hospital. Patient study samples were acquired with the help of cardiologist Dr. Christian Breburda and his staff.

In addition to more conventional clinical research, the new biomarker is poised to make inroads in a variety of health-related investigations. It has recently been incorporated into an ambitious project sponsored by the Defense Advanced Research Projects Agency (DARPA), which uses epigenetic markers in blood to identify exposure to weapons of mass destruction or their precursor chemicals. The new biomarker will be used to ensure the quality of blood samples, further establishing the power and versatility of this approach.

Journal reference:

Jeffs, J. W., et al. (2019). Delta-S-Cys-Albumin: A lab test that quantifies cumulative exposure of archived human blood plasma and serum samples to thawed conditions. Molecular & Cellular Proteomics. https://doi.org/10.1074/mcp.TIR119.001659
