Not long after the United States entered World War II in December 1941, Allied leaders Winston Churchill and Franklin D. Roosevelt — along with commanding Allied general Dwight D. Eisenhower — began to plan an invasion of Nazi-occupied France. Opening a new front was vital to defeating the Nazis, so plans were set in place for Operation Overlord — the codename for the Normandy landings on June 6, 1944. The massive operation began the liberation of France and other parts of Western Europe, ultimately turning the tide of World War II and bringing about the end of Nazi Germany. Here are five facts about that fateful day, now commonly known as D-Day.
D-Day Was Supposed to Happen a Day Earlier
Allied leaders originally set a date of June 5, 1944, for D-Day. But something very British managed to delay the invasion: the weather. Foul weather over the English Channel meant that it was too rough for ships to sail, so the invasion was postponed until the day after. It was a nervous, pensive wait for everyone involved, not least for the soldiers waiting to cross the Channel. Then came news from the meteorologists, who forecast a brief window of calmer weather for June 6. There were a limited number of dates with the right tidal conditions for an invasion, so if the operation didn’t go forward during the break in the weather on June 6, it would have had to wait until June 19-21 (when, as it turned out, there was a storm that would have made invasion impossible). The green light was finally given, and D-Day took place on June 6.
The Germans Weren’t Expecting the Invasion to Be at Normandy
The Germans knew that an Allied invasion of Nazi-occupied France could turn the tide of war, and had planned to counter such an invasion. But they didn’t consider Normandy as a particularly likely landing point. Instead, they believed the Allies would invade further north, at the French port city of Calais, which sat just a little more than 20 miles across the English Channel from Dover. The German army installed three massive gun batteries along the Calais coast in order to counter this threat. That’s not to say that Normandy was an easy target. It was defended by the Atlantic Wall, a 2,000-mile-long chain of fortresses, mines, gun emplacements, tank traps, and obstacles. It was an impressive piece of defensive engineering, but it wasn’t enough to stop the Allied invasion.
Spies and Misinformation Played a Major Part in the Success of D-Day
The Allies did all they could to convince the Nazis that an invasion would not take place at Normandy. Leading up to D-Day, nearly every German spy in England had been captured or turned into a double agent, and the double agents were told to inform their Nazi handlers that the invasion was indeed planned for Calais. At the same time, the Allies sent out fake radio traffic to further convince the Germans that Calais was the plan. This deception was all part of Operation Fortitude, which aimed to dupe the Nazis with misinformation, including creating an entirely fake army. This fictitious force, known as the First U.S. Army Group (FUSAG), was made up of thousands of fake tanks and airplanes, as well as decoy buildings, all placed on England’s southeast coast and supposedly commanded by General George S. Patton. The Allies let German reconnaissance planes photograph the site of the dummy army, further convincing the enemy that a military buildup was being made for an invasion of Calais. What’s more, the Allies by this time had cracked the Nazis’ Enigma code, so they could monitor the success of their misinformation campaign by tapping into German communications.
D-Day Was the Largest Amphibious Invasion in History
The Allied invasion of Normandy was the largest single-day amphibious invasion in history. The scale of the assault is hard to even imagine, as the numbers are mind-boggling. In the months and days leading up to the invasion, 7 million tons of supplies, including 450,000 tons of ammunition, were brought into Britain from the United States, and war planners created around 17 million maps to support the operation. In the hours prior to the beach landings, 11,590 Allied aircraft flew 14,674 sorties to support the invasion, and 15,500 American and 7,900 British airborne troops parachuted into France behind enemy lines. Then came the beach assault by 132,715 Allied troops, consisting of 75,215 British and Canadian forces and 57,500 Americans. Between them, they stormed the beaches of Normandy, the Americans fighting their way ashore at Utah Beach and Omaha Beach, the British at Gold and Sword beaches, and the Canadians at Juno Beach.
Eisenhower Wrote a Secret “In Case of Failure” Message
The success of D-Day was in no way assured. In the days before the invasion, General Eisenhower secretly wrote a statement now known as the “In Case of Failure” message, to be released if the invasion failed. In the letter, Eisenhower took full blame for any such failure. “My decision to attack at this time and place was based upon the best information available,” he wrote. “The troops, the air, and the Navy did all that bravery and devotion to duty could do.”
But D-Day was a military success that paved the way for a German surrender less than a year later. The invasion, however, came at a terrible cost. Historians are still investigating the actual number of deaths that resulted from the chaos of D-Day, but we know that at least 4,414 Allied soldiers, sailors, airmen, and coast guardsmen lost their lives, with at least 10,000 total casualties. On the German side, meanwhile, estimates suggest between 4,000 and 9,000 killed, wounded, or missing, with around 200,000 Germans captured as prisoners of war. Today, just a few thousand D-Day veterans may still be alive, the youngest now in their late 90s. On June 6, 2023, around 40 World War II veterans gathered at Normandy to mark the 79th anniversary of D-Day and pay tribute to the lives lost that day.
Vehicles and weaponry attract much wartime attention, but failing to give proper consideration to uniform design can spell disaster. Take, for instance, World War I, when the French army insisted on retaining the conspicuous red coloring of its historic pantalon rouge uniforms despite the warning of war minister Adolphe Messimy: “This stupid blind attachment to the most visible of colors will have cruel consequences.” The French went on to suffer heavy casualties at the outset of the war, and switched to issuing horizon blue uniforms in 1915. The importance of uniforms became apparent to the Soviet Union as well, when soldiers suffered frostbite and other cold injuries during the Winter War against Finland at the start of World War II.
Both world wars created shifts in uniform design that were sometimes innovative, sometimes bizarre, and in some cases, enduringly impactful to civilian fashion. These are some of the more notable facts about military uniforms from the two world wars.
WWI Marked the U.S. Army’s First Monochromatic Uniform
The uniform worn by the United States Army in the First World War was called the M1910 uniform. In addition to being the Army’s first single-color uniform — allowing for better camouflage and easier manufacturing — it was also the first time the standard olive drab uniform was worn during a war (though the Army switched to khaki-colored cotton uniforms during the summer). The M1910 was also notable for not including any blue outerwear or pants, which had been a part of every United States (or Continental) Army uniform since the Revolutionary War.
The French Army Reintroduced the Metal Helmet During WWI
Metal helmets had been in use throughout antiquity, but they fell out of favor in the 18th and 19th centuries with the decline of sword-and-spear close combat and the rise of firearms. In 1915, the French army outfitted its soldiers with steel helmets in order to protect them from falling shrapnel, which was an increasing problem in trench warfare. The helmets were designed by Intendant-General Louis Auguste Adrian, and were known as M15 Adrian helmets. The Adrian helmets proved effective enough that other Western armies began adopting them. Eventually, most countries began manufacturing some sort of metal helmet design.
The WWI-Era German Helmet Spikes Originally Had a Function
The famous spike-adorned German helmet from World War I is called a pickelhaube, and it was designed in 1842 by King Friedrich Wilhelm IV of Prussia. (Similar spiked helmets were already in use in other countries, such as Russia.) So what was the spike for? Its original purpose was as a point to attach the decorative strands of a cavalry helmet plume. Later, the spike itself (without the attached plume) gained aesthetic value for its aggressive appearance and was favored by the German infantry. The pickelhaube didn’t last for the entirety of the war, however, as it was discontinued in 1916 due to its ill-suitedness for trench warfare and a shortage of the materials necessary to manufacture it. That didn’t stop it from becoming an enduring symbol of the German army, though, as its sinister appearance was ideal for war propaganda depicting the Germans as vicious aggressors. To celebrate the end of the war, New York City built a pyramid out of around 85,000 pickelhaubes.
Soviet Soldiers Wore Fur Coats and Felt and Wool Boots
The frigid climate of the Soviet Union posed a harrowing challenge to soldiers, especially during winter months. To combat the cold, the Soviet Union outfitted the Red Army with thick fur coats and traditional boots made of felt and wool, known as valenki. Valenki weren’t exclusively military-issued boots, though; in fact, they were a traditional form of Russian footwear worn for hundreds of years. Though the fur coats and valenki were age-old items, they were only added to the army’s provisions in August of 1941 after poor cold-weather preparedness in the Soviet Union’s 1939 invasion of Finland caused higher-than-expected casualties.
Soviet Soldiers Didn’t Wear Socks
During the world wars, and at least as far back as the Napoleonic Wars, the Russian army used a predecessor to socks: the foot wrap. The garment, known in Russian as portyanki, is a simple rectangular piece of cloth wrapped around the foot that serves the same function as a sock. The advantage is that foot wraps are cheap and easy to manufacture en masse; the drawback is that they require considerably more technique to put on correctly. Incorrectly wrapping the cloth could result in creases or folds that would cause blistering or other discomfort. Nonetheless, portyanki remained in use by the Russian army as late as 2013.
WWII Flight Jackets Became an Enduring Fashion in the U.S.
World War II aerial warfare created a unique problem for military uniform designers to solve. Temperatures at the altitudes pilots flew at could reach as low as minus 30 degrees Fahrenheit, but the tight quarters of an aircraft’s cockpit meant that any outerwear that was too bulky would impede movement. In response to this challenge, the U.S. Army Aviation Clothing Board developed two leather flight jackets that were used in World War II: the A-2 and the G-1. The A-2 was issued to the Army Air Forces; it had a medium snap collar and an enclosed snap-secured pocket on each side, and zipped closed. The G-1 was issued to the Navy and had a similar pocket and zipper design, but added some flashiness in the form of a larger fur-lined collar and ornamental patches on the front, arms, and back.
The military ceased production of the A-2 in 1943, but the design was popularized in the 1963 Steve McQueen film The Great Escape, and retailers manufacture replicas to this day. As for the G-1, it’s still issued to enlisted members of the Navy, and had its own film role as the jacket Tom Cruise wore in 1986’s Top Gun.
“There never was a good war or a bad peace,” Benjamin Franklin wrote in 1783. Wise words indeed, and very true. Unfortunately, humans too often find themselves at war, as millennia of conflict can attest — the earliest known war was in Sudan a staggering 13,400 years ago.
Among the many wars fought in human history, some stand out for their peculiar nature, whether due to the strange events that provoked the conflict or for the lack of any actual fighting. Here are 10 of the strangest wars in history, from the 14th century to modern times.
The War of the Oaken Bucket
The War of the Oaken Bucket certainly has one of the strangest names in the history of conflict, and it does involve a bucket — just not as prominently as the myth would suggest. According to legend, the war began one night in 1325 after soldiers from Modena crept into Bologna and stole the oaken pail from the municipal well. In reality, the war was the culmination of ongoing tensions that had existed between the Italian city-states for 300 years. There was a bucket involved, but not until the end of the conflict, when Modenese soldiers took the municipal bucket as a trophy of war.
The Three Hundred and Thirty-Five Years’ War
In 1651, the Netherlands decided to get involved in the English Civil War between the Royalists and Parliamentarians. During the whole messy affair, the Dutch sent a fleet of 12 warships to the Isles of Scilly, an archipelago off the southwestern tip of Cornwall, to demand reparations from the Royalists, who had been raiding Dutch shipping lanes. Their demands were ignored, at which point the Dutch declared war on the Isles of Scilly. The Dutch hung around for three months and then abandoned the fruitless conflict and sailed home. But they forgot one thing: to declare peace with the Isles of Scilly. The bloodless war technically lasted for 335 years before anyone saw fit to formally sign a peace treaty, which finally happened in 1986. It remains, arguably at least, one of the longest wars in history (the shortest, in contrast, lasted just 38 minutes).
The War of Jenkins’ Ear
In 1738, British merchants were increasingly protesting the way the Spanish Guarda Costa (coast guard) treated their trading ships in the Americas. The mood in Britain was that the Spanish needed to be taught a lesson. Enter Captain Robert Jenkins, a Welsh mariner who, in 1731, had his ear cut off by overzealous Spanish coast guards when they searched his ship for contraband. Seven years after that incident, Jenkins was called to appear in the House of Commons in London, where, according to some accounts, he presented his preserved ear, much to the outrage of the gathered assembly. The British public soon became aware of this episode, further stoking anti-Spanish fervor and helping to pave the way for a full-scale war that began in 1739 and ended in 1748.
The Kettle War
The Kettle War was a bizarre conflict that, in truth, was more of an international incident than a war. In 1784, the Holy Roman Empire and the Dutch Republic were squabbling over access to the ports of Antwerp and Ghent in Belgium. In a show of force, the Holy Roman emperor dispatched three vessels, led by his magnificent warship Le Louis, to seize control of the Dutch port at Amsterdam. The Dutch were waiting with their own smaller ships. When the enemy approached, their lead ship, the Dolfijn, fired a single shot that ricocheted off a kettle on the deck of the Le Louis. This terrified the ship’s incompetent captain, who immediately surrendered, handing victory — and the emperor’s flagship — to the Dutch.
The Pastry War
In the early 1830s, a French pastry cook living in Tacubaya, near Mexico City, claimed that some Mexican army officers had damaged and looted his restaurant. He appealed to the king of France, demanding compensation, and in doing so, unwittingly helped launch a war. The pastry cook’s complaint prompted France to press Mexico for the grand sum of 600,000 pesos in compensation. In November 1838, with the Mexican president yet to make any payments, France sent a fleet to Veracruz, the principal port on the Gulf of Mexico. The French bombarded the fortress of San Juan de Ulúa, and Mexico declared war on France. But before the crisis could escalate any further, Britain stepped in and negotiated a peace treaty. The French forces withdrew in March 1839. The pastry cook, meanwhile, never saw a single peso from Mexico, which never paid the compensation — a fact that was later used by France to justify the second French intervention in Mexico, in 1861.
The Pig War
The Oregon Treaty of 1846 settled long-standing border disputes between the U.S. and British North America (present-day Canada). Even on the strategically important island of San Juan in Washington state, which remained contested, the British and American settlers seemed to be getting along. But then, on June 15, 1859, an American farmer named Lyman Cutlar shot a British pig that had wandered onto his land and was eating his potatoes. Things escalated quickly, and the local Americans requested U.S. military protection. A 66-person company of the U.S. 9th Infantry was sent to San Juan. In response, the British sent three warships. Then came a voice of reason in the guise of Admiral Robert L. Baynes, commander in chief of the British navy in the Pacific. He refused to engage any further, stating that he would not “involve two great nations in a war over a squabble about a pig.” So ended the Pig War, with only one casualty: the unfortunate pig.
The Town of Líjar Versus France
In 1883, the tiny town of Líjar in Andalusia, Spain, declared war against the entire military might of France. Líjar’s mayor was apparently infuriated by some news he had heard, and immediately called a town meeting to discuss the matter. He explained the situation as follows: “Our King Alfonso [of Spain], when passing through Paris on the 29th day of September was stoned and offended in the most cowardly fashion by miserable hordes of the French nation.” The town council approved the mayor’s war motion, and Líjar duly announced its decision to the Spanish government and the president of the French Republic. Then, nothing happened — until, 100 years later, the town decided to formally end its war with France, with very little fanfare outside of Líjar, because everyone else had forgotten the war ever started.
The War of the Stray Dog
Following decades of territorial disputes, tensions were already running high between Greece and Bulgaria in 1925 — and then a dog sparked a war. It all began when the dog ran across the border between Greece and Bulgaria. His owner, a Greek soldier, ran after the dog, and was promptly shot by the Bulgarians. The ensuing diplomatic chaos resulted in a brief invasion of Bulgaria by Greece, known as the War of the Stray Dog or the Incident at Petrich, which lasted 10 days and resulted in at least 50 casualties. The fate of the dog remains unknown.
The Great Emu War
In 1932, a marauding horde of emus arrived in Western Australia, where they began destroying crops and causing general havoc. Farmers petitioned the government for help to combat the mob, which totaled at least 20,000 flightless birds. In response, Major G.P.W. Meredith of the Australian army was sent to the region in command of a small group of soldiers armed with Lewis light machine guns and 10,000 rounds of ammunition. Things didn’t go well. The emus were tougher, faster, and more intelligent than expected, and it took 2,500 rounds of ammunition to fell just 200 of the birds. The “war” was eventually abandoned, with the emus victorious.
The Whisky War
For decades, Denmark and Canada were engaged in what must be one of the friendliest wars of all time. It all started in the 1970s, when the two nations were deliberating over their Arctic boundaries, including a small, desolate chunk of rock called Hans Island. No one could really agree on how to divide Hans, so it remained in rather unimportant limbo. Then, in 1984, some Canadian soldiers landed on the rock, promptly planted a maple leaf flag, and left a bottle of whisky before returning home. In response, Denmark flew a representative out to the island, who replaced the flag with the Danish flag, leaving a bottle of schnapps and a note that read “Welcome to Danish Island.” The Whisky War had begun in earnest. This amicable conflict continued for nearly four decades, with the regular exchange of flags, notes, and bottles of booze. Finally, in 2022, Denmark and Canada struck a deal over the tiny, uninhabited Arctic island, ending the Whisky War for good.
World War II was one of the most transformative events of the 20th century. It was the largest war ever fought, with more than 50 nations and 100 million troops involved, and it reshaped geopolitics, resulting in the United States and Soviet Union emerging as major world powers leading into the Cold War. This far-reaching war also inspired new global peacekeeping efforts, including the creation of the United Nations, and it brought to light incredibly courageous acts of humanity from soldiers and civilians alike. Here are the stories of six daring heroes of the Second World War.
The Youngest American Soldier in WWII
Calvin L. Graham was the youngest U.S. military member during WWII, and is still the youngest recipient of the Purple Heart and Bronze Star. It wasn’t unusual for boys to lie about their age to enlist, but Graham was just 12 years old when he forged his mother’s signature and headed to Houston to enlist. The 125-pound, 5-foot-2 boy was miraculously cleared for naval service and assigned to the USS South Dakota as an anti-aircraft gunner.
On November 14, 1942, the South Dakota was ambushed by Japanese forces at the Battle of Guadalcanal. Graham was severely burned and thrown down three stories of the ship, but still mustered the strength to tend to his severely wounded shipmates. He was honored for his heroism, but when his mother found out about the honor, she informed the Navy of his real age and he was stripped of his medals and thrown into the brig for three months. In 1978, President Jimmy Carter learned of Graham’s story and restored his medals, except for his Purple Heart, which wasn’t restored until two years after Graham’s death.
Wojtek the Soldier Bear
Polish soldiers stationed in Iran during the war were met with great surprise when a shepherd traded them a Syrian brown bear cub for a Swiss army knife and some canned goods. The cub’s mother was likely killed by hunters, so the soldiers adopted him, giving him the name “Wojtek,” meaning “joyful warrior” in Polish — a title he soon lived up to. His caretaker, a soldier named Peter Prendys, taught the bear how to salute, wave, and march, and Wojtek became a great morale booster.
In 1944, Wojtek was given the rank of private and a serial number (pets were banned in the Polish army), and he shipped off to Italy with his unit. That May, the bear even joined combat during the Battle of Monte Cassino, carrying supplies to his fellow troops, according to witnesses. He was promoted to the rank of corporal for his bravery. After the war, Wojtek found his forever home at the Edinburgh Zoo in 1947. A bronze statue of the bear and Prendys still stands in downtown Edinburgh today.
One of the Most Decorated Women in U.S. Military History
Army Colonel Ruby Bradley of the U.S. Army Nurse Corps was working at Camp John Hay in the Philippines when she was taken prisoner by the Japanese army in 1941. She became a POW at the Santo Tomas Internment Camp in Manila — but she didn’t let it break her spirit. Bradley immediately went to work helping her fellow POWs by offering medical aid and smuggling food and medicine to those in need. She assisted on 230 major surgeries and delivered 13 babies during her 37 months at the camp. Bradley and her fellow nurses became known as the “Angels in Fatigues.”
In February 1945, the camp was finally liberated, and Bradley — who was malnourished from giving her food rations to children — went home. She continued her career in the Army, amassing 34 decorations, medals, and awards (including the Bronze Star Medal), making her one of the most decorated women in U.S. military history.
Commander of the Tuskegee Airmen
General Benjamin O. Davis Jr. faced racial discrimination from the very beginning of his military career. He was only the fourth Black cadet in the history of the United States Military Academy at West Point before joining the Army in 1936. After being stationed in Alabama, he received the opportunity of a lifetime: squadron commander of the first all-Black unit in the Army Air Forces. This unit of 1,000 Black pilots became known as the Tuskegee Airmen, renowned for their exceptional achievements in combat despite the discrimination they faced.
Davis led the 99th Fighter Squadron during their 1943 deployment against Axis forces in North Africa, and later that year, he commanded the 332nd Fighter Group on the front lines in Italy. During his two-year command of the Tuskegee Airmen, Davis and his crew sank more than 40 enemy ships and downed more than twice the number of aircraft they lost, earning them a reputation as a formidable fighting squadron. Their impressive record wasn’t just a message to the enemy; it broke racial barriers at home, furthering the fight for desegregation and equal rights. Davis went on to a life of public service and was promoted to four-star general by President Bill Clinton in 1998.
The First Female Asian American Officer
For Navy Lieutenant Susan Ahn Cuddy, entry into military service was personal. Her father, Dosan Ahn Chang Ho, died while imprisoned by the Japanese in 1938. He had been incarcerated for anti-Japanese activism as a known leader of the Korean independence movement. Despite growing anti-Asian sentiment during WWII, Cuddy wanted to honor her father and fight against the Japanese, so she enlisted in the U.S. Navy in 1942. She was the first female Asian American naval officer and eventually became the first female gunnery officer, training pilots to fire a .50-caliber machine gun. She later worked with codebreakers at the Naval Intelligence Office, putting her knowledge of the Korean language to use. Even there, Cuddy faced discrimination — one of her superiors wouldn’t let her access classified documents. After the war, Cuddy worked at the National Security Agency during the Cold War. She died peacefully in her sleep in 2015 at the age of 100.
The Crane Operator Who Fought Back at Pearl Harbor
On the morning of December 7, 1941, George Walters, a crane operator at the Pearl Harbor dockyard in Hawaii, awoke to a devastating surprise attack by Japanese forces. Walters ran to a massive crane next to the USS Pennsylvania and began moving it back and forth on its track to shield the ship from an onslaught of rounds from Japanese fighters and dive bombers. He even attempted to knock planes out of the sky with the boom. The protected gunners onboard the Pennsylvania were able to return fire. Later, a bomb exploded on the dock next to Walters’ crane, knocking him out of the fight. He survived with a concussion, and it’s believed that his actions helped save the ship from certain destruction. The story of Walters’ heroism was featured in Walter Lord’s 1957 book “Day of Infamy.” Walters continued to work at the shipyard for 25 years following the attack. His son Lewis, a young shipyard apprentice at the time, witnessed his father’s bravery firsthand.
The American Revolution was one of the most significant conflicts of the 18th century. It not only led to the 13 original colonies gaining independence from Great Britain, but also helped establish democracy and representation as a path for governments around the world. Today, schools teach the famous events and figures from this chapter of American history year after year, from the rebellious Boston Tea Party to Paul Revere’s “midnight ride” to the “shot heard round the world” during the Revolutionary War. But the storied details of the nation’s founding aren’t always completely accurate, and there are plenty of myths that persist to this day.
Myth: The American Colonies Went to War Solely Over Taxes
The phrase “taxation without representation” is a popular and easy-to-remember slogan of the American Revolution, based on the argument laid out in Patrick Henry’s Virginia Resolves in 1765. Henry wrote a series of resolutions that were passed in Virginia’s House of Burgesses in response to the Stamp Act, which levied additional taxes on the British colonies in America. Though taxes were a major point of contention between the colonists and the British crown, they were not the sole reason for the conflict. Mounting tensions between American colonists and the British were also caused by disputes over land distribution — the British planned to reserve the western part of North America for Indigenous peoples, angering colonists with plans to expand outward.
Myth: Paul Revere Was the Only Rider Who Warned About the British
Paul Revere’s “midnight ride” was immortalized in Henry Wadsworth Longfellow’s 1860 poem “Paul Revere’s Ride,” which in turn inspired painter Grant Wood’s 1931 depiction of the event, “The Midnight Ride of Paul Revere.” While Revere did ride out on the evening of April 18, 1775, to warn Sons of Liberty leaders Samuel Adams and John Hancock of the arrival of British troops, he wasn’t alone. Patriots William Dawes and Samuel Prescott also rode on different routes through the greater Boston area. All three riders were stopped by the British, but managed to escape and complete their task, warning the rebels that an attack was coming.
Myth: The Phrase “Don’t Fire Until You See the Whites of Their Eyes” Was Coined During the Revolution
The phrase “don’t shoot until you see the whites of their eyes” is used as shorthand today, meant as a warning against reacting too quickly. The idiom is typically credited to Colonel Israel Putnam at the Battle of Bunker Hill. But there’s no concrete evidence that Putnam uttered the phrase, or that it was first said during that particular battle, or even during the Revolutionary War. In fact, some historians have traced the phrase back to the Seven Years’ War a decade earlier, or even to Prussian soldiers during various battles in the 18th century. It’s likely this was a phrase already known to soldiers before the American Revolution.
Myth: The Declaration of Independence Was Signed on July 4
Every year, Americans celebrate Independence Day on the Fourth of July, and it’s commonly believed that July 4, 1776, marks the date the Declaration of Independence was signed. In reality, the Continental Congress voted to declare independence on July 2, and the Declaration of Independence was formally adopted two days later on July 4. (John Adams even predicted that July 2 would be celebrated as a national holiday for centuries to come.) The signing of the document, meanwhile, didn’t begin for another month; John Hancock was the first founding father to sign the declaration, on August 2, 1776.
Myth: The Liberty Bell Cracked While the Declaration of Independence Was Being Read
No trip to Philadelphia is complete without a visit to the Liberty Bell, a 2,000-pound bell that long hung in the Pennsylvania State House (now known as Independence Hall) and is displayed nearby today. The bell was ordered from London by Pennsylvania statesman Isaac Norris in 1751, and when it arrived stateside, it cracked on the first ring. The original bell was then melted down and recast in Philadelphia, and it was this second iteration of the Liberty Bell that was rung to celebrate the first public reading of the Declaration of Independence on July 8, 1776. According to lore, the bell fractured again at this historic moment, but as far as records show, no cracks appeared that day. The famous split in the current bell actually occurred sometime in the mid-19th century; the first record of the blemish appears in 1846.
Myth: George Washington Was a Military Mastermind
The nation’s first president is possibly the most famous American of all time, but he was not quite the military mastermind he’s often credited as being. Most of the military decisions during the Revolutionary War were hidden from the public, sparing people the details of the indecision that Washington often faced in times of strife. The general had never commanded a large unit before leading the Continental Army, and though his bravery was lauded, his skills as a tactician left something to be desired, by some accounts. In the years after the war, Thomas Paine — famous for writing the revolutionary pamphlet Common Sense — wrote that Washington “slept away [his] time in the field.” That said, Washington’s skills as a leader were unparalleled, and his willingness to step down from the presidency after two terms allowed America’s fledgling democracy to establish a system of shifting leaders.
Myth: Americans Were United in Their Support of the War
The “spirit of ’76” — a nickname for the patriotic fervor around the revolution — was really only a spirit of around 70% to 80% of the population at the time. The rest of the colonists were either loyal to the crown or skeptical of conflict. Some of this divide occurred because of geography, as New England colonists were dragged into the conflict sooner than those in the South. Many people were concerned with the cost (human and financial) of going to war with one of the world’s most powerful empires, and some militia fighters had to be paid to enlist rather than volunteering for the cause. By the end of the revolution, however, enthusiasm for American independence was more widespread. This was due in part to a mass exodus of loyalists: By 1786, between 60,000 and 80,000 loyalists had left the colonies for Great Britain and other parts of the British Empire.
Winston Churchill is widely regarded as one of the greatest leaders of the 20th century, especially for his role in guiding Britain and the Allies to victory in World War II. Born in 1874 to an aristocratic family that included his prominent politician father, Lord Randolph Churchill, and American socialite mother, Jennie Jerome, Churchill spent his childhood largely in the care of a nanny and in boarding school, where he struggled to keep up academically. At age 18, he enrolled in the Royal Military College, a major achievement for the young man, who had an early interest in the military and also saw it as a direct path into politics. After a four-year stint serving as both a soldier and war correspondent around the world, Churchill resigned from the army in 1899 to focus on his career as a writer and politician.
Churchill went on to hold a variety of political positions in both the Liberal and Conservative parties, including first lord of the admiralty, chancellor of the exchequer, secretary of state for war, and, of course, prime minister of the United Kingdom. He also became a prolific and celebrated writer and a renowned orator whose powerful speeches, such as his famous “We shall fight on the beaches” address, inspired both his country and people around the world. Churchill was known for his eloquence, courage, wit, and vision, but he wasn’t without his faults, and his controversial views on imperialism, race, and social reform remain an equally entrenched part of his legacy. Churchill died in 1965 at the age of 90, and to some he remains one of the greatest Britons of all time.
Churchill Did a Stint as a War Correspondent
Churchill struggled through his school years in nearly every subject, history and English being the exceptions. His father steered him away from academics and toward a military career, though it took Churchill three attempts to get into the Royal Military College at Sandhurst (now the Royal Military Academy Sandhurst). In 1895, he joined the 4th Queen’s Own Hussars cavalry unit and made his first army trip to Cuba — but not for combat. Churchill took a short leave to report on the Cuban War of Independence for London’s Daily Graphic. In 1896, his regiment was deployed to India, where he served as both a soldier and a journalist; his dispatches were later compiled into The Story of the Malakand Field Force, the first of his many published nonfiction works. His journalism also led to one of the most dramatic episodes of his young career: While covering the Boer War in South Africa for The Morning Post, he and members of the British army were captured and taken to a prisoner-of-war camp. He escaped by scaling a wall in the dark of night, returning a hero.
He Was Awarded the Nobel Prize in Literature in 1953
Churchill’s war reporting marked the beginning of an esteemed literary career. His first major work following his war dispatch collections was a 1906 biography of his father, titled Lord Randolph Churchill; he also wrote a four-volume biography of his ancestor, the Duke of Marlborough. Churchill’s most famous works, however, are his histories of the two world wars, which he both witnessed and shaped. The World Crisis covers the First World War and its aftermath, while the six-volume The Second World War details the global conflict that made him a legendary leader. Churchill also published several collections of speeches and essays, as well as a book on his hobby of painting, Painting as a Pastime. In 1953, his work earned him the Nobel Prize in Literature, awarded “for his mastery of historical and biographical description as well as for brilliant oratory in defending exalted human values.” As high an honor as it was, it’s believed that what Churchill truly wanted was the Nobel Peace Prize.
He Was the First Official Honorary Citizen of the United States
On April 9, 1963, President John F. Kennedy declared Churchill an honorary citizen of the United States, making the former British prime minister the first person to officially have the distinction. “In the dark days and darker nights when England stood alone… he mobilized the English language and sent it into battle,” Kennedy said of Churchill during the ceremony. “The incandescent quality of his words illuminated the courage of his countrymen… By adding his name to our rolls, we mean to honor him — but his acceptance honors us far more.”
Despite the surety of Kennedy’s words at the time, granting Churchill the title was an arduous process. American journalist Kay Halle had pushed for the honor as early as 1957, but the debate dragged on, and Kennedy eventually informed Halle in 1962 that such a move would be unconstitutional (he proposed naming a Navy ship after Churchill instead). Some progress was made later that year, but the matter languished in legislative limbo. In early 1963, amid concerns about the aging politician’s health, Congress passed legislation authorizing the distinction, and just seven days later, Churchill’s honorary citizenship ceremony took place.
He Was the First British Prime Minister to Top the Pop Music Charts
Churchill’s life and career were filled with accolades, but one of his more unusual accomplishments was being the first British prime minister to earn a spot on the pop music charts — not once, but twice. The first time was in 1965, shortly after his death, when a recording of his speeches called The Voice Of reached No. 6 on the Official U.K. Albums Chart. The second Top 10 hit came in 2010, when the Central Band of the Royal Air Force released an album called Reach for the Skies, commemorating the 70th anniversary of the Battle of Britain. The album featured some of Churchill’s World War II speeches set to music, and it sat on the charts alongside contemporary acts including Mumford & Sons, KT Tunstall, and the Killers frontman Brandon Flowers.
He Served as Prime Minister Two Separate Times
Despite proving himself to be a popular prime minister who led his country to victory during World War II, Churchill was defeated in the 1945 general election by the Labour Party leader Clement Attlee. The Labour Party at the time was strongly influenced by the Beveridge Report, a 1942 government document that outlined the need for greater social support for Brits following the war, including an emphasis on social security, affordable housing, and health care. In contrast, Churchill’s Conservatives focused on lowering taxes and maintaining defense spending. The need for social reform weighed on the minds of voters, and they gave the Labour Party a landslide victory at the polls. Six years later, however, after the party failed to fully deliver on promises of radical social and economic change, Churchill was voted back into office. Just shy of his 77th birthday at the time, the leader had already begun to experience strokes, and suffered several more during his second run as PM. On April 5, 1955, the 80-year-old Churchill finally retired.
He Was a Member of a Bricklayers’ Union
Churchill famously wore many hats, including politician, writer, painter, master orator — and bricklayer. He could often be found building walls for his garden, and he constructed a cottage for his daughters at his Chartwell estate in Kent. He once described the physical labor as a “delightful” contrast to his intellectual work, committing to putting down “200 bricks and 2,000 words a day.” In 1928, a photo of Churchill working at his property appeared in the press; his skills were criticized by some, but encouraged by James Lane, the mayor of Battersea and the organizer of the local chapter of the Amalgamated Union of Building Trade Workers (AUBTW). Lane invited Churchill to join, and after some initial hesitation, on October 10, 1928, Churchill was inducted into the union. His membership card read: “Winston S. Churchill, Westerham, Kent. Occupation, bricklayer.”
The First Known Use of “OMG” Was in a Letter to Churchill
The now-ubiquitous “OMG,” an abbreviation meaning “Oh my God,” started popping up in text messages and online chats in the 1990s and early 2000s, but the first known use of the term was actually in a letter to Winston Churchill during World War I. Sent by retired Royal Navy Admiral John Arbuthnot Fisher, the letter was a reaction to newspaper reports of the time, with Fisher criticizing Britain’s WWI strategies. At the end of his letter, Fisher snarkily wrote, “I hear that a new order of knighthood is on the tapis” (meaning “on the table”). “O.M.G. — Oh! My God! Shower it on the Admiralty!!” The retired admiral, in all his sarcasm, was already in his 70s at the time, but his quip laid the groundwork for a linguistic revolution decades later.
In the wake of World War II, new ideological borders were drawn across the European continent. Vast cultural and economic differences formed a deep divide between the democratic nations of Western Europe and the communist regimes of the Soviet Union and its allies in the East. Throughout the Cold War era, these two distinct factions were separated by a symbolic boundary that cut through the continent, known as the Iron Curtain.
The term “Iron Curtain” was first used in reference to the Cold War in 1946; nations that were considered “behind” the Iron Curtain were those under Soviet and communist influence, as those regimes maintained a firm grasp on power. As time progressed, cracks formed in the Iron Curtain as former communist nations embraced democracy, ultimately leading to the political reunification of Europe. But for as long as it existed, the Iron Curtain served as a philosophical barrier between two vastly different worlds. Here are five fascinating facts from behind the Iron Curtain.
The Term “Iron Curtain” Was Popularized by Winston Churchill
Long before the term “Iron Curtain” was coined in reference to the Cold War, the words referred to a fireproof safety mechanism that separated the audience from the stage in theatrical productions. In 1945, author Alexander Campbell borrowed the term in his book It’s Your Empire to describe censorship related to World War II-era Japanese conquests. “Iron Curtain” was first used in the context of communist Europe during a speech by British Prime Minister Winston Churchill on March 5, 1946. Appearing with President Harry Truman at Westminster College in Fulton, Missouri, Churchill stated, “From Stettin in the Baltic, to Trieste in the Adriatic, an iron curtain has descended across the continent.” Churchill sought to warn the audience of the threat posed by the Soviet Union, and the term “Iron Curtain” resonated, remaining popular for decades after. Months before Churchill’s speech, another great wordsmith had used the phrase “Cold War” for the first time: author George Orwell, in his 1945 essay “You and the Atom Bomb.” Two years later, Truman adviser Bernard Baruch brought the term into the American political lexicon, using “Cold War” to describe the cooling relationship between the United States and the Soviet Union.
Poland Was the First Eastern Bloc Country to Hold Democratic Elections
For decades, communist regimes maintained uninterrupted power over the many nations of the Eastern Bloc, a group of communist states largely located in Central and Eastern Europe and parts of Asia. Dictators ruled with an iron fist thanks to the lack of free and fair elections within the Soviet Union, Czechoslovakia, and the other countries that fell behind the Iron Curtain. That trend continued until 1989, when Poland held its first democratic elections since the Cold War began. Tadeusz Mazowiecki emerged as Eastern Europe’s first noncommunist leader in decades, representing Solidarity, the pro-labor trade union movement turned political party. Mazowiecki embraced Western ideas such as a free-market economy, and though he was replaced as prime minister less than two years later, the election remains a historic event. Other former communist nations soon followed Poland’s lead; Czechoslovakia and Hungary both held their first free multiparty elections in 1990. Not long after, the Iron Curtain disintegrated as the Soviet Union collapsed.
Albania Escaped Soviet Control and Aligned Itself With China
Albania may be firmly located in Europe, but during the Cold War it found an unlikely anti-Soviet ally in China. Beginning in 1949, Albania aligned itself with the Soviet Union, which was then still ruled by Joseph Stalin. But after Stalin’s death in 1953, Soviet-Albanian relations became strained, as Albanian leader Enver Hoxha was far less fond of incoming Soviet premier Nikita Khrushchev. Khrushchev denounced Stalin’s ideology, which rubbed Hoxha and the Albanian people the wrong way and eventually led to the formal termination of Soviet-Albanian relations in 1961. Around the same time, the U.S.S.R. and China had a falling-out of their own, as Chinese leader Mao Zedong also decried the Soviet Union’s revisionism. Given their shared pro-Stalin views, Albania and China found common cause, with China informally providing support to the tiny Balkan country. These strange bedfellows remained on the same page until Richard Nixon visited China in 1972, forcing Albania to reevaluate the relationship. The alliance between Albania and China came to an end in 1978.
Jazz Music Was Used as a Propaganda Tool
Since its creation, jazz has represented the American identity, in part as a symbol of freedom of expression. That’s what made jazz music such a useful propaganda tool for the West to reach nations behind the Iron Curtain. Underground American radio transmissions brought jazz to residents of Eastern Bloc nations beginning in 1955 in an effort to appeal to those repressed societies. Programs such as Willis Conover’s Music USA: Jazz Hour earned cult followings within these pro-Soviet countries, as the U.S. sought to win the ideological war over its Eastern European rivals. The United States hoped that free-flowing jazz music could open the minds of citizens who were used to more formal operas and ballets, and in turn make those individuals more receptive to Western culture. Barriers were broken down even further in 1958 when jazz pianist Dave Brubeck traveled to Poland to perform a landmark concert series. These efforts extended into the Middle East, Asia, and Africa around the same time, as the U.S. spread its jazz music around the world.
The Fall of the Berlin Wall Was Partly Due to a Misinformed Spokesperson
On November 9, 1989, a gaffe uttered by East German spokesperson Günter Schabowski accelerated the fall of the Berlin Wall. After decades of difficult and dangerous travel between East and West Germany, East German officials sought to loosen restrictions while still maintaining control over the visa application process. However, Schabowski, a spokesperson for East Germany’s Politburo, misinterpreted the notes he was given at a press conference and claimed that travel between the two sides could begin without delay. Shortly after the televised slipup, massive crowds gathered at the Berlin Wall, though these groups of travelers were initially held back by guards who had been given no specific instructions. Eventually, East Berliners vastly outnumbered the guards, leaving officials no choice but to let people through. Every one of the checkpoints was open by midnight, as people from the East freely traveled into West Berlin for the first time since the wall was erected in 1961. The feeling of independence and celebration was palpable, as East Berliners climbed the wall, destroyed it with hammers, and reunited with their neighbors in West Berlin.
During the high medieval period in Europe — a time when the continent’s population increased rapidly, bringing about great social and political change — Pope Urban II sparked a religious war when he urged Christians to recapture the Holy Land, a region that lies between the Jordan River and the Mediterranean Sea and is held sacred by Christians, Jews, and Muslims, particularly the holy city of Jerusalem. The result was a series of military campaigns that lasted from 1095 to 1291, known today as the Crusades.
The Roots of Religious Conflict
By the end of the 11th century, about two-thirds of the ancient Christian world had been conquered by Muslims, including strategically and religiously important regions such as Palestine, Syria, and Egypt. When, around 1077, Muslim Turks took control of the Holy Land, religious and territorial frictions between the two religious groups reached a tipping point. Byzantine Emperor Alexius I Comnenus feared that his lands — the Eastern Roman Empire, which included the strategically important city of Constantinople (now Istanbul) — would be next, and he appealed to the pope for assistance. Pope Urban II responded in 1095, promising the knights of Europe that their sins would be forgiven if they recaptured the Holy Land and, more specifically, Jerusalem, for Christianity. With that fateful promise, the Crusades began.
Pope Urban’s plea was met with a massive response, not only among knights but also among ordinary citizens who chose to join the Crusades. Their reasons for joining the holy war were varied. Many knights sought to defend Christianity while earning forgiveness for their sins and eternal glory, as promised by Pope Urban II. But their motives weren’t always religious. Knights also believed firmly in chivalry and sought out adventures in which they might gain honor or renown. The campaigns also presented material opportunities, with conquered land up for grabs. European merchants, too, were typically in favor of the Crusades, as they sought to monopolize important trading centers that were under Muslim control. They could also earn good money shipping Crusaders to the Middle East — an estimated 90,000 men, women, and children of all classes joined the First Crusade alone, so there was wealth to be made in providing passage.
The armies of the First Crusade set off for the Holy Land in 1096. Warriors had gathered from around Europe, and a force of about 60,000 soldiers (of which some 6,000 were knights) and at least 30,000 noncombatants was amassed, known as the “Princes’ Crusade.” By 1098, the campaigning forces had taken major cities including Nicaea and Antioch, but not without suffering significant losses. By the time the Crusader army reached its primary objective, Jerusalem, in June 1099, there were only around 1,300 knights and 12,000 infantry remaining. Undeterred, they laid siege to Jerusalem, breaching the walls on July 15. The Crusaders then spent two days massacring the inhabitants and pillaging the city. The First Crusade was a bloody victory for Europe, and Jerusalem returned to Christian rule.
Despite the outcome of the First Crusade, holding territory in the Holy Land was no easy task. So-called “Crusader states” were formed, but the first of them, the County of Edessa, was retaken in 1144 by Muslim forces. In response, Pope Eugene III called for a Second Crusade, this time to be led by European kings. A force of 60,000 people assembled and marched east. The campaign, however, was a failure. The Crusader armies became stretched out, and after numerous skirmishes and a handful of sieges, the whole endeavor fizzled out.
Advertisement
Advertisement
Saladin and Richard the Lionheart
Next came the rise of Saladin, one of the most famous figures of the Crusades. The Muslim sultan of Egypt and Syria brought forces from both nations under his own control and began attacking the Crusader states. Most importantly, Saladin recaptured Jerusalem — the prize of the First Crusade — in 1187, sending shockwaves around Europe. The pope soon called for a Third Crusade to recapture Jerusalem. The kings of Europe responded — most notably Richard I of England, known as Richard the Lionheart due to his reputation as a fearsome warrior. So began one of the greatest military rivalries not only of the Crusades, but of warfare in general. Richard and Saladin developed a mutual respect and admiration for each other through their conflicts, and became the defining figures of the Crusades. The Third Crusade ended in 1192 with the Crusader armies recapturing the vital cities of Acre and Jaffa. The Crusaders retook most of the territory captured by Saladin, but failed to recapture Jerusalem.
The Crusades didn’t end there. There were at least eight Crusades in total, until the period finally ended with the fall of the city of Acre in 1291. This marked the end of any major European presence in the region, and no further large military campaigns were organized. Between 1095 and 1291, a number of popular campaigns also took place, but without papal approval, so they are not considered part of the Crusades. These included the infamous and somewhat bizarre “Children’s Crusade,” led by a 12-year-old shepherd named Stephen. The details of this unofficial quest are murky and may well have been hugely exaggerated by contemporary sources, which claim that as many as 30,000 children set off toward Jerusalem. Whatever the actual numbers, the Children’s Crusade was a failure and never even reached the Holy Land — but it can lay claim to being the first European youth movement.
The Central Intelligence Agency has its fingers in many pies, from counterterrorism to offensive cyber operations and covert paramilitary actions. The mere mention of the CIA brings with it a certain mystique, conjuring up images of secret agents, globe-trotting spies, and clandestine activities. It’s no surprise, then, that the agency has featured heavily in numerous Hollywood movies, from Spy Game and Zero Dark Thirty to The Bourne Identity and Bridge of Spies.
The CIA was formed in 1947 by President Harry Truman, partly as a replacement for the Office of Strategic Services (OSS), which was disbanded after World War II. As a civilian intelligence service and part of the U.S. Intelligence Community, it is officially tasked with gathering, processing, and analyzing national security information from around the world. Unlike the FBI, the CIA has no law enforcement function — it’s also not allowed to collect information regarding “U.S. Persons,” although the agency’s actions have often proven controversial in that regard.
Unsurprisingly, the CIA has kept — and uncovered — many secrets over the decades. Here are some of the most fascinating secrets from the agency’s history, from innovative spy techniques to daring covert missions.
The CIA Had Plenty of Secret Gadgets
The CIA created a range of secret gadgets that could have been straight out of a James Bond movie. The extensive list of low- and high-tech trickery includes hollow silver dollars for holding messages or film; miniature compasses hidden in cufflinks; pigeon-mounted mini cameras; a listening device designed to look like tiger excrement; and a robot fish called Charlie that secretly collected water samples. Perhaps most impressive of all was the “insectothopter,” a tiny robotic dragonfly that could eavesdrop on otherwise inaudible conversations.
Agents Were Plucked Off the Ground by the Skyhook
In 1962, the CIA launched a mission — code-named Project COLDFEET — to investigate an abandoned Soviet research station on a floating ice island in the Arctic. Getting there was easy enough: Two intelligence officers secretly parachuted down onto the ice and began their search for information. The tricky part was how to recover the men and the information they had retrieved, as it was impossible to land an aircraft on the ice. So, the CIA decided to use its new Fulton surface-to-air recovery system, colloquially known as the Skyhook. The agents on the ground deployed a helium balloon that lifted a 500-foot line into the air. A slow-moving B-17 plane, with the Skyhook device attached to its nose, then flew overhead and snagged the line with the agents attached to the end of it, sweeping them into the air, at which point they were winched aboard the aircraft. Sound familiar? You might have seen the Skyhook used later by James Bond in 1965’s Thunderball and Batman in 2008’s The Dark Knight.
Anthropologists and Archaeologists Made Great Spies
In Indiana Jones and the Kingdom of the Crystal Skull (2008), Indy reveals that he once worked for the OSS, the predecessor of the CIA. This is a believable part of the character’s backstory, as the OSS and other intelligence agencies often recruited archaeologists like Indy during World War I and World War II. The same is true for the CIA, which has routinely enlisted archaeologists, anthropologists, art historians, and other academics for intelligence-gathering purposes. It makes sense, as these professionals already have the perfect reason to be conducting fieldwork in foreign countries — and they tend to be somewhat more discreet than Indiana Jones.
Animals Served as Secret Agents
Animals have been used in espionage since the days of the ancient Egyptians and ancient Greeks, who used homing pigeons and dogs to deliver secret messages (the latter carrying missives in their collars). The CIA took things to a whole new level, however, with highly trained nonhumans undertaking a range of complex tasks. In a secret 1960s project known as Project OXYGAS, dolphins were trained to attach explosive devices to enemy ships. Birds have proved useful, too. Camera-carrying pigeons could take higher-quality photos than spy satellites operating at the time, while ravens were trained to deliver and retrieve small objects of up to 1.4 ounces from otherwise inaccessible buildings. The CIA also spent $20 million trying to train cats to eavesdrop while fitted with recording devices. Known as Operation Acoustic Kitty (seriously), the project was eventually abandoned after a feline operative was tasked with listening in on the conversation between two men sitting in a park. The cat, totally uninterested in any previous training, wandered off and was hit by a passing car.
It’s often said that wars bring about a wave of innovation — necessity being the mother of invention, as the old adage goes. In reality, that’s not necessarily true. According to some studies, there tends to be a significant decline in inventiveness immediately after the outbreak of a war, followed by a marked surge, the net result being a fairly standard rate of innovation overall. Creation through necessity or even desperation certainly happens, but prosperous, peaceful, and free societies tend to be just as inventive, if not more so.
That said, plenty of technological innovation took place during World War II, especially in fields that had military applications. Here are some of the most pivotal, successful, and enduring inventions to come out of the war, from handy tools used by millions of people to miracle drugs that have saved countless lives.
Duct Tape
In 1943, Vesta Stoudt, an Illinois woman with two sons serving in the U.S. Navy, was working in an ordnance plant when she noticed a problem with the ammunition boxes she was packing. The boxes were sealed with paper tape with a tab to open them, but this tab could easily tear off, leaving soldiers potentially scrambling to open the boxes in life-threatening situations. So, Stoudt came up with the idea of a waterproof fabric tape with which to seal the boxes — an idea she sent to none other than President Franklin D. Roosevelt. Impressed, the President sent her letter to the War Production Board, which soon came up with what we now know as duct tape. Not only was it easy to apply and remove on ammo boxes, but it also turned out to be endlessly handy for quickly repairing military equipment, including vehicles and weapons.
Radar
The fundamental principle underlying modern radar (which is actually an acronym for “radio detection and ranging”) was first observed in 1886 by physicist Heinrich Hertz, who found that electromagnetic waves could be reflected from various objects. It was during World War II, however, that modern, practical radar was developed. Britain had already established a chain of radar stations along its south and east coasts by the outbreak of the war, allowing for the detection of enemy aircraft at a range of 80 miles. The British then invented the cavity magnetron in 1940, paving the way for far more compact, powerful, and sensitive radar units (and, as it happens, microwave ovens).
Jet Engines
British engineer and RAF officer Frank Whittle first put forward his vision of jet propulsion in 1928, at which time he was roundly ridiculed. Undeterred, he ran a successful test of the first practical jet engine in 1937, albeit on the ground. Then, in August 1939 — a month before the outbreak of the war — the German-built Heinkel He 178 made the first jet-powered flight in history. The war ramped up the development of jet engines, and Whittle found himself with more funding than ever before. In May 1941, his jet-propelled Gloster E.28/39 took flight, achieving a top speed of 370 mph at 25,000 feet, faster than any conventional propeller-driven aircraft. Though neither the Heinkel He 178 nor the Gloster E.28/39 ever flew combat missions during the war, the “jet age” had begun. The first jet aircraft used in the war were the Messerschmitt Me 262 and Gloster Meteor, starting in 1944.
Super Glue
During World War II, Harry Coover was part of a team at Eastman Kodak trying to find a way to make clear plastic gun sights for Allied soldiers. During his research, Coover accidentally created a new compound called cyanoacrylate. At the time, he and his team found the new compound to be incredibly durable but far too sticky to use, and they soon abandoned the substance. Nine years later, in 1951, Coover returned to cyanoacrylates, and this time he and his team recognized new potential in them. The sticky adhesive required no heat or pressure to bond, and the bond was incredibly strong. Coover had invented super glue, albeit by accident.
Electronic Computers
It’s hard to say precisely when the first computer was invented. You could argue that it was the abacus of the ancient world, and there’s certainly a case to be made for Charles Babbage and his mechanical computer of the early 19th century. But when it comes to programmable, electronic computers, we can reliably trace the origin to the Second World War. First there was Colossus, a huge set of computers developed by British code breakers at Bletchley Park between 1943 and 1945. In its first test, Colossus successfully decoded a genuine coded message so quickly that 10 improved machines were ordered right away. Then there was ENIAC (Electronic Numerical Integrator and Computer), developed by the United States and completed in 1945, which lays claim to being the first programmable, general-purpose, electronic digital computer. (It was designed specifically for computing values for artillery range tables, but had other uses as well.) Together, they marked a turning point in the history of modern computers.
The Slinky
Not all the inventions that arose from World War II had military applications. In 1943, Richard James, a naval mechanical engineer stationed at the William Cramp & Sons shipyards in Philadelphia, was working to devise springs that could keep sensitive ship equipment steady at sea. While he was working, he accidentally knocked a coiled spring from a shelf. He watched, surprised and amused, as the spring seemingly walked its way end-over-end across the ground. That same day, he went home and told his wife, Betty James, about an idea he had for a new toy. They took out a $500 loan, co-founded James Industries, and in 1945 the Slinky hit shelves. By the end of the 20th century, around 250 million Slinkys had been sold.
Penicillin
Penicillin was discovered in 1928 by the Scottish bacteriologist Alexander Fleming, and it was isolated and purified in the late 1930s. Making large amounts of penicillin, however, was difficult. In 1942, when the new antibiotic was first used to successfully treat a patient for sepsis, the treatment consumed half the available supply of penicillin in the United States. During the war, a combined effort between the U.S. and Great Britain saw scientists working around the clock to develop mass-production techniques. They were successful, manufacturing 2.3 million doses of penicillin in preparation for the D-Day invasion on June 6, 1944.