It’s hard to imagine a time when vaccines didn’t exist. However, up until the Second World War, they were few and far between. It wasn’t until the U.S. Army decided to collaborate with medical experts and industry professionals that a vaccine was developed to prevent one of the world’s most common illnesses: the flu.
Disease in the military pre-WWII
Before WWII, more frontline troops died of disease than of battlefield injury. Improved sanitary conditions during World War I helped somewhat, but they still couldn't prevent the spread of one of the deadliest outbreaks of the twentieth century: the 1918 influenza pandemic. The flu caused an estimated 50 million deaths worldwide, and soldiers fighting in Europe were partly responsible for its spread.
WWI saw soldiers from across the world travel to Europe in numbers the continent had never before witnessed. This allowed for not only the introduction of new illnesses to the population but also their rapid spread. The U.S. Navy saw 40 percent of its personnel infected, while the Army wasn't far behind with an infection rate of 36 percent.
WWII and the need for a flu vaccine
WWII brought a new wave of epidemics across Europe. Diseases like diphtheria spread like wildfire, and the U.S. military realized it was losing more troops to illness than to combat. This spurred a partnership between the Army, industry and academia, enabling unprecedented levels of innovation against wartime disease threats.
Ten vaccines were developed or improved during this time, over a third of which were put to use against wartime disease threats. One was a botulinum toxoid vaccine, created in response to false intelligence that the Germans had loaded their V-1 bombs with the toxin that causes botulism. Another was the Japanese encephalitis vaccine, developed in preparation for the Allied landing in Japan.
During this time, penicillin, the world-changing discovery of Scottish physician Alexander Fleming, was also brought into widespread use. However, the most significant development to come from the war effort was the flu vaccine.
A federal commission is organized
In 1941, fearing another pandemic, the U.S. Army organized a commission. It was part of a much larger network of federal vaccine development programs, which had enlisted a slew of specialists to target the flu, measles and mumps, among other diseases.
The commission, led by virologist Thomas Francis Jr., pooled knowledge of how to isolate, grow and purify the flu virus so that development of the vaccine could be fast-tracked. Scientists were able to work from their home institutions, giving the military access to expertise and facilities located in the civilian sector.
Seeing it as part of their wartime duty, manufacturers offered to work with the commission for little to no profit. With the medical industry as an active partner, a new research model took shape, one that effectively turned scientific findings into working vaccines.
All this, paired with the fact that intellectual property rights weren't a barrier, allowed the vaccine to be developed quickly. Francis and the other directors could shift people and resources to the projects deemed most important, enabling coordination across every phase of the development process.
The result of all this work was an FDA-approved flu vaccine in under two years, the first flu vaccine licensed in the United States.
Lack of cooperation in the post-war era
Upon the conclusion of the war, cooperation between the Army, manufacturers, and medical professionals continued. This resulted in the development of a variety of vaccines throughout the middle of the century, including those targeting adenovirus and meningococcal meningitis.
However, this partnership disintegrated toward the end of the twentieth century. Legal, political and economic changes during the 1970s and 1980s caused major disruptions, and without full cooperation, vaccine development stalled. Some existing vaccine programs were even discontinued.