What Is Real-World Evidence and How Is It Used?

In an age where data drives research and regulation, real-world evidence plays a pivotal role in health and public safety decisions. Technological advances, such as the rise of digital fitness trackers, electronic medical records, and mobile devices, have increased interest in converting real-world data into real-world evidence (RWE) that can inform healthcare decision-making.

If you’re interested in breaking into this exciting new sector of data analysis, here’s everything you need to know about real-world data, how it relates to real-world evidence, and how this information is leveraged in today’s market.

What Is Real-World Data?

Real-world data (RWD) refers to data about a patient’s health status or the routine delivery of healthcare. According to the U.S. Food and Drug Administration (FDA), this data can come from any number of sources, including:

  • Electronic health records (EHRs)
  • Claims and billing activities
  • Product and disease registries
  • Patient-generated data, including data from in-home use settings
  • Data gathered from other sources that can inform on health status, such as mobile devices

While this list encompasses most data sources, it continues to grow. “Think of real-world data as outside of a controlled environment. All the ways that you digitally exist and how you might experience health, health events, or health outcomes are logged in various ways,” says Kristin Kostka, director of the OHDSI Center at Northeastern University’s Roux Institute.


An excellent example of this is technology targeting the fitness community. Wearable devices such as Apple Watches and Fitbits capture data like daily movement, heart-rate abnormalities, and calories burned. Mobile apps have also joined this movement, collecting data on trends such as sleeping habits and mental health. Fitness machines like a Peloton or NordicTrack capture this kind of information as well. “There’s a whole world of things that fall in the RWD bucket,” says Kostka.

What Is Real-World Evidence?

According to the FDA, real-world evidence is clinical evidence about the usage and potential benefits or risks of a procedure or product, derived from analysis of real-world data. “The idea of real-world evidence is taking [real-world data] and doing something methodically to create a finding,” says Kostka.

These findings are generated by various studies and analyses, including:

  • Randomized trials
  • Pragmatic trials
  • Observational studies

Real-world evidence is the product of these scientific processes, which turn raw data into a clinical finding. As Kostka puts it, “data by itself is not evidence, but evidence is comprised of data.” In other words, real-world evidence doesn’t exist without the collection of valuable and translatable data.

How Is Real-World Evidence Used?

Real-world evidence has become increasingly important as a result of the data boom occurring in healthcare today. “There’s more data being captured than we ever thought humanly possible,” says Kostka.

Real-world evidence has many uses, but perhaps the most topical are saving money in healthcare, learning from past trends and models, and preventing possible negative health outcomes.

Saves Money

Since Congress passed the 21st Century Cures Act in 2016, the field of real-world evidence has gained newfound momentum. This is largely because Congress has put more money into data-driven research rather than costly experiments, especially after the emergence of the COVID-19 pandemic.

For example, significant increases in government funding to the National Institutes of Health (NIH) over the past few years have supported the development of a nationwide network of nearly 500,000 researchers. In fact, the president’s 2023 budget request allocated $62.5 billion to NIH, an increase of nearly 46 percent over the $42.9 billion the agency received in 2022. While this might seem expensive, according to the U.S. Department of Health and Human Services, clinical trials typically cost between $4 million and $20 million, and trials for new FDA-approved drugs cost about $41,117 per patient.

These high experiment-based expenses have pushed Congress to better understand how science can respond to these kinds of challenges without a hefty price tag. As a result, the federal government has become a more active partner in research and development through budget allocation and regulatory decision-making.

Helps Us Learn From the Past

The pandemic didn’t just reveal the need for research funding; it also “made it really obvious that we need an agile learning ecosystem,” says Kostka. If COVID-19 response protocols had been informed by historical data from previous pandemics, public health officials might have had better information going into March 2020.

To make healthcare more proactive rather than reactive, real-world evidence uses past data to build models that predict possible health outcomes. “We’re looking at who’s more likely to have a bad health outcome, hospitalization risk, or mortality rate,” says Kostka.
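
As a simple illustration of the kind of modeling Kostka describes, the sketch below builds a toy hospitalization-risk model in Python. Everything in it is invented for illustration, including the synthetic patient records and feature names; real-world evidence studies draw on far richer sources, such as EHRs and claims data, and use far more rigorous methods.

```python
# A minimal, illustrative sketch: predicting hospitalization risk from
# synthetic patient records. All data and feature names are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(seed=0)
n = 1000

# Synthetic features: age (years), number of chronic conditions, smoker flag.
age = rng.integers(18, 90, size=n)
chronic_conditions = rng.integers(0, 6, size=n)
smoker = rng.integers(0, 2, size=n)
X = np.column_stack([age, chronic_conditions, smoker])

# Synthetic outcome: hospitalization, made more likely by age,
# comorbidity, and smoking (a made-up data-generating rule).
logits = 0.04 * age + 0.5 * chronic_conditions + 0.7 * smoker - 5.0
hospitalized = rng.random(n) < 1 / (1 + np.exp(-logits))

X_train, X_test, y_train, y_test = train_test_split(
    X, hospitalized, test_size=0.25, random_state=0
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")

# Estimated hospitalization risk for a hypothetical 72-year-old smoker
# with three chronic conditions.
risk = model.predict_proba([[72, 3, 1]])[0, 1]
print(f"Estimated risk: {risk:.0%}")
```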

Prevents Possible Negative Outcomes

Science is all about trial and error. Real-world evidence follows suit by “learning from available patterns of treatment to derive what works and what doesn’t,” says Kostka. The main difference, however, is that real-world evidence draws on care that has already happened, reducing the risk of harm to patients and clinical trial participants.

An excellent example of this is vaccine approval. Vaccines must go through a rigorous vetting process that includes trials testing possible effects on various groups of people. After FDA approval, a vaccine is rolled out to millions of patients, but rare side effects may only surface once it’s administered to this much larger pool of people.

For example, several vaccines carry a risk of blood clots. “This is a rare, but devastating outcome,” says Kostka. “Real-world evidence takes those claims and observations and finds a systematic way to evaluate and reproduce that finding if it’s reproducible.” When these adverse and dangerous side effects become more predictable, healthcare professionals can give better-informed advice to their patients.
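
To see what “a systematic way to evaluate” a safety signal might look like in its simplest form, here is a toy calculation with entirely invented numbers: comparing the crude rate of an adverse event in an exposed cohort against a comparison cohort. Real studies adjust for confounding, follow-up time, and data quality; this only shows the shape of the comparison.

```python
# Toy sketch with invented numbers: comparing an adverse-event rate in an
# exposed cohort against a comparison cohort. A crude rate ratio near 1
# suggests no signal; a much larger ratio would merit rigorous follow-up.

def incidence_per_100k(events: int, person_years: float) -> float:
    """Crude incidence rate per 100,000 person-years."""
    return events / person_years * 100_000

exposed_rate = incidence_per_100k(events=12, person_years=850_000)
comparison_rate = incidence_per_100k(events=9, person_years=1_200_000)

rate_ratio = exposed_rate / comparison_rate
print(f"Exposed: {exposed_rate:.2f} per 100k person-years")
print(f"Comparison: {comparison_rate:.2f} per 100k person-years")
print(f"Crude rate ratio: {rate_ratio:.2f}")
```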

Consider a Career in Real-World Evidence

Real-world evidence has historically been used to monitor the post-market safety of medical products and services, but now that it’s integrated into product development as well, real-world evidence professionals are expected to be highly skilled, with a thorough understanding of the industry.

“There are a series of foundational competencies that are necessary to be a guru in this space,” says Kostka. One method of developing these skills is earning an MS in Real-World Evidence. “Northeastern’s program focuses on the skills needed to succeed in the field, but, in general, you have to know a little bit about the regulatory context and the data that’s available.”

Want to learn more about how a degree in real-world evidence can boost your career in healthcare? Check out our Master of Science in Real-World Evidence program to learn more.
