
The Food and Drug Administration, more commonly known as the FDA, has been involved in food and drug safety in the United States for more than a century. From its humble beginnings as an offshoot of the Department of Agriculture, the FDA has undergone a range of transformations since its founding in 1906.

For those undertaking online PMHNP programs, understanding how the FDA regulates drug safety in the United States offers insight into how rigorously the medications available for prescription today have been tested. Let's explore seven key moments in the FDA's history, from its founding to the present day, and dive into how the agency has helped set a better standard for what we can put into our bodies.

1902: The Poison Squad

Before the inception of the FDA, food and medication safety was managed by the Division of Chemistry (later the Bureau of Chemistry) within the U.S. Department of Agriculture. For two decades beginning in the early 1880s, Harvey Washington Wiley assisted the U.S. Government in several investigations into preservatives that were in wide use at the time, including the use of formaldehyde to adulterate poor-quality food and drink such as milk.

After the embalmed beef scandal of 1898, which raised concerns that canned beef supplied to soldiers had been tainted with chemicals such as boric acid, Wiley was granted substantial funding in 1902 to run a series of human trials of the preservatives that were widely used and broadly available at the time.

These preservatives included compounds such as borax, copper sulfate, and formaldehyde, all commonly used in manufactured food products of the day. Over six months, in exchange for free food, a group of young men dubbed the Poison Squad served as human test subjects for these widely used additives.

1906: The Pure Food and Drugs Act

The results were shocking. Even in healthy young men, these preservatives had horrific effects on the body. The public outcry that followed eventually led to the Pure Food and Drugs Act, passed by Congress in 1906.

This represented a seismic shift in how food and drugs were regulated. Previously, food safety had been the Wild West: with little oversight, companies could do largely as they pleased. With a new set of laws behind it, the Bureau of Chemistry could begin to drive improvements in American food standards.

The 1930s: An Overhaul

The 1930s proved a transformational decade for the Food, Drug, and Insecticide Administration. In 1930, its name was shortened to the one we know today: the Food and Drug Administration.

By 1933, it was clear that legislation passed more than a quarter of a century earlier was showing its inadequacies, and the FDA called for a full overhaul of the 1906 Act to reflect the changing nature of drug regulation in the U.S.

In 1937, a new preparation of the antibacterial drug sulfanilamide, marketed as Elixir Sulfanilamide, reached the market with little prior testing. The results were horrific: the solvent used in the elixir, diethylene glycol, was poisonous, and more than a hundred people died after taking the medication.

One year later, the Federal Food, Drug, and Cosmetic Act was passed, imposing tighter requirements on drug testing before products could be brought to market. It set the standard for modern regulatory reform at the time.

1951: New Supervision Safeguards

In 1951, the Durham-Humphrey Amendment was passed. It introduced new safeguards that restricted medications deemed unsafe for self-medication to prescription-only status, an early precursor to modern prescription rules.


1962: The Thalidomide Crisis

In the late 1950s, the German pharmaceutical company Grünenthal brought thalidomide to many global markets, including Europe and Australia. The drug had never been tested in pregnant women, and in the years that followed it was revealed to cause serious birth defects and health problems, leading to deaths and deformities in an estimated 10,000 infants worldwide.

Thalidomide would have had a far greater impact in the U.S. had it not been for the work of one remarkable FDA reviewer, Frances Oldham Kelsey. Dr. Kelsey's refusal to authorize thalidomide for the American market earned her the President's Award for Distinguished Federal Civilian Service and helped spur the Kefauver-Harris Amendment of 1962, which transformed how drugs had to be tested, first in animal cohorts, before being allowed anywhere near the general public.

1993: A New Testing Regime

While the thalidomide scandal drew attention to the need for drug testing, it had a second, undesirable effect: in 1977, women of childbearing age were excluded from early-stage clinical trials. This exclusion has had significant, ongoing consequences for what we know about women's health, and it took the NIH Revitalization Act of 1993 to require the National Institutes of Health and other organizations to begin including women in clinical trials again.

1998–99: AERS and the Drug Facts Label

As computers became more widespread in the 1990s, standards evolved to give patients the information they needed to make informed decisions. In 1998, the FDA introduced a new digital database, the Adverse Event Reporting System (AERS), allowing it to assess the safety of drugs already on the market.

This has, in part, led to greater awareness of the harms of opioids, and it has empowered the FDA to push for change in a highly addictive space.

In 1999, the Drug Facts label was introduced for products sold over the counter. This standardization of medical information brought greater transparency between pharmacy and patient.

Where will the next century of the FDA take us? As we can see, there have been many landmark changes over the last century—and as the food and drug landscape evolves, you can be sure that the FDA will be there to help tackle the challenges that will undoubtedly present themselves.