Elvis was on the radio, The Ed Sullivan Show was on the TV, and scores of people were hightailing it to the suburbs — this was 1950s America. It was a young nation, with 31% of its 151 million residents under age 18, and it was on the brink of change. Birth rates continued to rise at unprecedented levels, giving rise to a new generation of “baby boomers.” The “nuclear family” (describing married couples with kids at home) was ingrained in the culture; roughly two-thirds of men (68%) and women (66%) were married. By the time the ’60s rolled around, many of these cultural norms would be upended, but this generation left a lasting mark on American society. Here is a snapshot of family life in the 1950s, by the numbers.
Post-World War II America saw a rapid increase in birth rates lasting from 1946 through 1964. It became known as the “baby boom,” and the 1950s were smack dab in the middle of it. During the ’50s, around 4 million babies were born every year in America, a sharp increase from the previous average, around 2.7 million births annually between 1910 and 1945. By the end of the boom, around 77 million babies had been born. This influx of births was due to many positive aspects of the postwar era, including low unemployment rates, a burgeoning economy, low interest rates, and a strengthened middle class.
In alignment with the nuclear family mindset, most ’50s households consisted of a married couple, and typically only one spouse worked (generally the man). In 1950, only 29% of working-age women living in the U.S. held a job — but nearly half of single working-age women (46.4%) worked. The share was far lower among married working-age women; less than a quarter of them (21.6%) held jobs.
By 1960, the number of working women in America had grown from 16.5 million to nearly 22.5 million, an increase of about 35% (even though the population of working-age women grew by only 14%). The five most popular jobs for women of this era were secretary (stenographer or typist), salesperson (retail), schoolteacher, bookkeeper, and apparel factory worker.
Mortgage Rates Averaged Around 2.5%
The housing market of the 1950s was booming. An increasing number of Americans were leaving busy urban lifestyles behind in favor of the suburbs. Mortgage rates ranged between 2.1% in 1950 and 2.6% in 1959. For the 16 million World War II veterans living in 1950, the G.I. Bill lowered mortgage rates even more. For many, it was the perfect time to purchase a home.
One of the most recognizable examples of 1950s suburban neighborhoods is the “Levittowns,” named after real estate developer William Levitt, who built thousands of houses in planned communities around the mid-Atlantic during the late 1940s and early ’50s. The most famous of these communities was in Long Island, New York, where during peak construction, one house was built every 16 minutes.
Around 4.4 million homes had television sets by 1950. This might sound like a lot for the era, but it was only 9% of households. By the end of the decade, the figure had climbed to 90% of households, marking a transformational decade for entertainment. Television programming, especially the American sitcom, became a staple of family life. These shows epitomized the stereotypical American family unit, from the Cleavers of Leave It to Beaver to the Andersons of Father Knows Best.
Although the golden age of Hollywood was nearing its end in the 1950s, cinemas were still as popular as ever, and fortunately for moviegoers, this pastime didn’t cost a fortune. In 1950, one theater ticket cost 46 cents, which was less than the price of a dozen eggs (60 cents). A family of four could go to the movies for the price of around two gallons of milk (one gallon cost 83 cents) — a feat unlikely to be matched today.
Families flocked to theaters to see Disney’s Cinderella, the top-grossing film of 1950. Released on February 15, the film grossed more than $52 million that year and sold nearly 99 million tickets. Other top-grossing films of 1950 included King Solomon’s Mines, Father of the Bride, and All About Eve.
More Than Half of All Households Had Children at Home
Due to the ongoing baby boom, most American households had young children at home in the 1950s. Census records show that around 52% of households had children under 18 at home in 1950; in 2019, that number was down to 41%. Families were large during the decade: Around 58% of households had between three and five members, 21% had more than six members, 18% had two members, and only 3% had one member. The average family unit size has steadily declined since its peak during the late 1950s and early ’60s. In 2022, the average American family size was 3.13 people.
With so many Americans moving to the suburbs in the ’50s, more and more families depended on a car to get around. In 1954, most U.S. households (64%) owned one car. Between 1954 and 1960, the number of one-car families rose from 30.1 million households to 32.4 million. Multicar ownership wasn’t popular — a little more than 8% of households owned two cars in 1954, and only 0.9% had three or more cars. (Owning two cars became slightly more common by the end of the decade.) Just how much did a car set you back during the 1950s? Two popular family cars, the Cadillac DeVille and the Oldsmobile 88 Fiesta, cost around $3,523 and $3,541, respectively, which would be around $37,000 today.
The concept of miniature dwellings traces back to ancient civilizations, when Egyptians placed small clay replicas of their houses and belongings in and around burials. These models were intended to provide the comforts of home to the deceased in the afterlife. Although the tiny dwellings we know as dollhouses today are quite different from these ancient versions, their history also includes purposes other than play. Over the last 500 years, dollhouses have evolved from elaborate displays for adults, to useful household teaching tools, to enduring objects of imagination and aspiration for children.
The earliest known dollhouses were made in the 16th century, primarily in Germany, and later in Holland and England. Known as a “dockenhaus” (miniature house), “cabinet house,” or “baby house” (because of the size, not the intended audience), these handcrafted items were not initially made for children to play with — they served as display cases for wealthy adults to fill with miniature furniture, fabrics, and artwork that reflected their own taste and lifestyle.
One of the earliest recorded examples of a dollhouse is the Munich Baby House. Commissioned by Albert V, the Duke of Bavaria, in the 1550s, the piece was made by skilled artisans in the shape of a royal residence (instead of a wooden cabinet like the dominant style that soon followed). Though the Munich Baby House was lost in a fire in the 1600s, Albert V had the object detailed in an inventory of his household goods. Historians believe that the Munich Baby House was likely made for the duke’s entertainment, but some suggest it may have been built as a gift for his daughter, which would make it an early example of a dollhouse for children.
Throughout the 17th century, dollhouses remained elaborate and whimsical showcases of a family's wealth; they were most often constructed in cabinets with doors that opened and closed like a china cabinet. In the early 1600s, however, baby houses also took on a more practical purpose. Nuremberg houses and Nuremberg kitchens, named for their primary place of manufacture in Germany, emerged as inspirational and educational tools to motivate and teach young women how to decorate and care for a household.
These types of baby houses were less ornamental than their predecessors, typically made entirely out of metal and sometimes consisting of just a kitchen. But they were no less meticulous, featuring tiny handcrafted brooms, kettles, copper cooking pots, and mini masonry hearths at the heart of the kitchen. Though they’re thought of as being instructional in nature, Nuremberg houses were also used for play, primarily by girls, and often during the Christmas holidays. This trend played another important role in the evolution of dollhouses: A well-preserved Nuremberg house from 1673 is another early example of a dollhouse built to look like a family home.
The Victorian Dollhouse
By the 18th century, baby houses had become popular in England. These structures commonly had detailed and realistic facades, including doors and windows, informing what we commonly think of as the classic Victorian dollhouse. They were often modeled after the owner’s home, and although they still functioned as displays of opulence, caring for them and decorating them also became a beloved hobby for women.
Until the mid-18th century, dollhouses were unique, one-of-a-kind creations. But with the innovations of the Industrial Revolution, the once custom-made dollhouses became much easier to produce en masse. This dovetailed with changing ideas of childhood in the early 19th century, when kids were no longer expected to weather the hardships of adulthood. Around this time, dollhouses started to be used more frequently as toys.
One of the first mass-produced dollhouses available in the United States came courtesy of the Bliss Manufacturing Company. Bliss began producing dollhouses in the 1890s, even as the U.S. also imported the tiny dwellings and miniature furnishings from Germany. Despite becoming more mainstream in the early 1900s, dollhouses didn’t become affordable for most families until after World War II. The postwar economic boom, along with expanded manufacturing capacity and materials, made the toys fixtures in playrooms throughout America. Mattel released Barbie’s first Dreamhouse in 1962, just three years after the Barbie doll’s debut, and by the 1970s, other major toy companies including Fisher-Price, Playmobil, and Tomy were producing popular miniature toy houses as well.
In recent decades, dollhouses have evolved into collector's items and remain a passionate pursuit for hobbyists. Over the past few years, social media communities have developed around a renewed love of miniature dwellings. Today, the toy also reflects a cultural shift — many people consider the meticulous structures a form of escapism and a source of solace and joy in a world with issues that often feel beyond individual control. Dollhouses continue to be cherished toys that exist at the intersection of craftsmanship, culture, and creativity.
The concept of etiquette dates back to Europe during the medieval era, when rules and social conventions first gained prominence. During the Renaissance, expectations of behavior at royal and noble courts were outlined in courtesy books, or books of manners. In the 19th century, etiquette manuals continued to flourish in Europe and the United States, guiding behavior for ladies and gentlemen in both social and professional settings. By the early 20th century, these guidebooks were increasingly popular with both wealthy and middle-class women in the U.S., and author Emily Post became the definitive expert with the publication of her first book of etiquette in 1922.
Today, the rules of behavior observed by previous generations might seem old-fashioned and strange, and certainly there are some social conventions better left in the past, as they reflect the inequality and biases of bygone eras. But etiquette itself isn’t inherently outdated. While specific customs may evolve, the underlying principles of courtesy, respect, and consideration for others remain as relevant today as they were a century or two ago. With that in mind, here are some of the most unusual and surprising etiquette rules from decades past.
According to Vogue’s Book of Etiquette, published in 1948, a wife should defer to her husband as “head of the house.” By not paying proper respect to their husbands, the thinking went, bad-mannered American wives were placing their husbands in a subordinate position, which was “most unbecoming to a man.” Among the suggestions for being a better wife were to say “we” or “our” instead of “I” or “me,” and to let one’s husband take the lead on deciding when to leave a party. Reflecting the often oppressive gender norms of the era, the guide reminds wives that “a woman can gracefully play second fiddle, but a man who is obviously subordinated to a dominating woman is a pathetic and foolish figure.”
There Should Be One Servant for Every Two Dinner Guests
The anonymous countess who authored the 1870 etiquette book Mixing in Society: A Complete Manual of Manners declares, “It is impossible to over-estimate the importance of dinners.” She goes on to detail the many aspects of planning and hosting a dinner party. In addition to having an equal number of ladies and gentlemen at a dinner (and never 13, out of respect for superstitious guests), the hostess should make sure to have “one servant to every two guests, or, at least, one to every three.”
A Man Was Expected to Choose His Riding Companion’s Horse
Published in 1883, American Etiquette and Rules of Politeness outlines the rules for men going horseback riding with a woman, noting that the gentleman should “be very careful in selecting her horse, and should procure one that she can easily manage.” He is also admonished to “trust nothing to the stable men, without personal examination,” and to “be constantly on the lookout for anything that might frighten the lady’s horse.”
The Most Important Rule for Children Was Obedience
In 1922, Emily Post published her first book of good manners, Etiquette in Society, in Business, in Politics and at Home. It offered more than 600 pages of rules and standards, from how to make introductions to proper behavior when traveling abroad. No one was exempt from learning and practicing proper etiquette, including children. “No young human being, any more than a young dog, has the least claim to attractiveness unless it is trained to manners and obedience,” Post states in the chapter “The Kindergarten of Etiquette.” In addition to learning how to properly use a fork and knife and remaining quiet while adults are speaking, a child should be taken away “the instant it becomes disobedient,” directs Post. By teaching a child that it can’t “‘stay with mother’ unless it is well-behaved,” she writes, “it learns self-control in babyhood.”
Flirting Was a Sign of Ill Breeding
Published in 1892, the guidebook Etiquette: Good Manners for All People; Especially for Those “Who Dwell Within the Broad Zone of the Average” focused on advice for middle-class Americans rather than just wealthy society. The book describes itself as offering “some of the fundamental laws of good behavior in every-day life” for “people of moderate means and quiet habits of living.” In the chapter on “Gallantry and Coquetry,” readers are reminded that there is nothing wrong with a man enjoying the company of a charming woman, or a woman delighting in the conversation of a brilliant man. However, these acts of mutual appreciation have “nothing in common with the shallow travesty of sentiment that characterizes a pointless flirtation.” Not only is flirting a sign of poor breeding, the guide suggests, but “a married flirt is worse than vulgar.”
A Man Couldn’t Speak to a Woman Unless She Spoke to Him First
During England’s Victorian era in the 19th century, women had “the privilege of recognizing a gentleman” first by acknowledging him with a bow, according to the 1859 British handbook The Habits of Good Society: A Handbook of Etiquette for Ladies and Gentlemen. Men were expected to wait for that acknowledgment before speaking. “No man may stop to speak to a lady until she stops to speak to him,” the book advises. The guidelines go on to say, “The lady, in short, has the right in all cases to be friendly or distant. Women have not many rights; let us gracefully concede the few that they possess.”
The 20th century produced an array of iconic toys that captured the public’s imagination and, in some cases, continue to delight young people worldwide. The Slinky, originating in the 1940s, and the Rubik’s Cube, first sold in the United States in the early 1980s, have remained more or less the same since their invention, invoking a nostalgic simplicity. Other toys, such as LEGO and Barbie, have offered up countless iterations, weathering changing trends to endure in popularity and appeal. The legacy of these toys is in more than just their entertainment value — it’s in the way they reflected or even set cultural trends, interests, and technological advancements. Here are some of the most popular toys throughout the 20th century, many of which are still around today.
In the early 1940s, United States industry was largely focused on producing goods for the war effort, and it was during this time that the Slinky was accidentally invented. Richard James, a mechanical engineer, stumbled on the idea in 1943 while working with tension springs for naval equipment at a Philadelphia shipyard. After accidentally knocking some of his prototypes off a shelf, James couldn’t help but notice the way one of them “walked” down a stack of books on his desk. He worked on this strange spring — which his wife named “Slinky” after seeing the word in the dictionary — over the next two years. By the end of 1945, James got an initial run of 400 Slinkys into a local department store. It wasn’t until he staged a live demonstration, however, that the product’s popularity picked up, and the toy sold out. Within the first 10 years, he sold 100 million. The Slinky has endured for decades, not only as a popular toy on its own, but also through licensing and its iconic jingle — the longest-running jingle in television advertising history.
LEGO is known for its colorful modular plastic bricks, but when the company started in Denmark in 1932, it made wooden toys such as cars and yo-yos. Plastic toys didn’t come along until the late 1940s, when founder Ole Kirk Christiansen developed the forerunner of the buildable bricks we know today, known at the time as Automatic Binding Bricks. In 1958, the modern LEGO brick was patented, with an updated interlocking design that became its signature.
Through a deal with Samsonite, LEGO made its way to Canada and the U.S. in the early 1960s, but the iconic toy didn’t truly find its footing in North America until the early 1970s. The New York Times claimed the toy had been “ineptly marketed” since its stateside arrival, and the then-head of LEGO’s U.S. operations called the deal with Samsonite “a disaster.” In 1973, however, the company took over its own U.S. production and sales and, per the Times, sales “soared.” LEGO grew to be much more than a toy in the ensuing decades — it became an entertainment empire. Throughout it all, the company has stood by its name, which also happens to be its guiding principle: LEGO is an abbreviation of the Danish words “leg godt,” meaning “play well.”
When Mattel released the first Barbie doll on March 9, 1959, it was the first time that most children had seen a three-dimensional, adult-bodied doll — the norm at the time was baby dolls designed to be taken care of. Ruth Handler, the co-founder of Mattel and creator of Barbie, had a different idea. After watching her daughter Barbara, the toy’s namesake, play with paper dolls, Handler envisioned a doll that was a little bit older and could inspire more aspirational play: Young girls could see their future selves in the doll, instead of a child to nurture. Barbie’s initial launch at the New York Toy Fair faced some skepticism from other toy industry executives, but Handler’s instincts were right: Around 300,000 Barbies sold within the first year. As beloved as Barbie was, though, she also courted controversy. Early on, detractors were uncomfortable with the doll’s figure. Barbie was at times criticized for being too conventional; other times, too progressive. But the doll’s popularity endured as the company diversified her looks, skin tones, body types, and, of course, jobs: Throughout her lifetime, Barbie has explored more than 250 different careers. The cultural phenomenon continues to this day: Around 1 billion Barbie dolls have been sold, and in 2023, the first live-action movie based on Barbie became the year’s biggest release.
G.I. Joe
Following Mattel’s major Barbie breakout, rival toy company Hasbro sought a similar success story. Barbie thrived by marketing primarily to young girls, and Hasbro aimed to fill a gap in the market with a toy made for boys. In the early 1960s, toy maker Stan Weston approached Hasbro with an idea for a military toy, but was turned down. One Hasbro executive, Don Levine, saw the toy’s potential, however, and workshopped the idea until the company approved. It wouldn’t be called a doll, though — Hasbro created the term “action figure” to market the new product, and even forbade anyone in the company from referring to it as a doll. Released in 1964, the original G.I. Joe line consisted of four 12-inch figures, one for each of the U.S. military branches: the Army, Navy, Air Force, and Marines. The action figure took off, and within two years, G.I. Joe accounted for almost 66% of Hasbro’s overall profits. The franchise eventually created less military-centric characters, expanded to comic books and animated series, and embraced sci-fi, espionage, and team-based narratives that have carried the toy as a symbol of adventure and heroism across generations.
At first glance, a Rubik’s Cube appears simple, but the mathematically complex puzzle is anything but, and solving it is a problem that has captivated the public ever since the toy’s invention. Created by Hungarian architect and professor Ernő Rubik in 1974, the first “Magic Cube,” as he called it, resulted from months of work assembling blocks of wood with rubber bands, glue, and paper. After painting the faces of the squares, Rubik started twisting the blocks around and then struggled to return them to their original state; it took him about a month to finally solve his own puzzle. He patented the toy as the “Buvos Kocka,” or “Magic Cube,” and it first appeared in Hungarian toy shops in 1977. Within two years, 300,000 Hungarians had bought the puzzling cube. By 1980, an American toy company was on board, and international sales of the renamed Rubik’s Cube took off — 100 million were sold in three years. As quickly as the craze started, however, it seemed to fade. The New York Times reported in 1982 that it had “become passe,” replaced by “E.T. paraphernalia…[and] electronic video games.” But the toy has nonetheless endured, and to date, an estimated 350 million colorful cubes have been sold, making it one of the bestselling puzzles in history.
Cabbage Patch Kids
Known for their one-of-a-kind features, unique names, and adoption certificates, Cabbage Patch Kids caused a full-on frenzy in the 1980s, leading to long lines at stores — and even riots. Although the dolls are known as the invention of Xavier Roberts, whose signature is on every doll, the origin story reportedly starts with a folk artist named Martha Nelson Thomas. In the late 1970s, Thomas was selling her handmade “doll babies” at craft fairs in Louisville, Kentucky. Roberts reportedly resold the doll babies at his own store for a while, but eventually remade and renamed them Cabbage Patch Kids. (Thomas eventually took Roberts to court over the copyright, but the pair settled in 1985.) In 1982, Roberts licensed his dolls to the Coleco toy company, and the following year, thanks to a robust advertising campaign, demand was much greater than supply, sparking angry mobs of disappointed parents that holiday season. Around 3 million Cabbage Patch Kids had been “adopted” by the end of 1983, and over the next two years, sales topped half a billion dollars. The doll’s popularity faded quickly after that, but Cabbage Patch Kids remain toy store fixtures to this day.
Tamagotchi
In the early 1990s, video game consoles were household staples, and by the end of the decade, tech toys such as Tickle Me Elmo and Furbies caused consumer crazes. But one pocket-sized toy that combined the best of both worlds was a ’90s must-have: the virtual pet. The handheld pixelated companions required regular feeding and playing, instilling a sense of responsibility and emotional attachment in users and engaging them in a type of continual play that was relatively new at the time.
The most popular virtual pet was the Tamagotchi, created by Japanese toy company Bandai. It was released in the United States on May 1, 1997, six months after it was launched (and subsequently sold 5 million units) in Japan. After the first day of the toy’s U.S. release, some stores were already sold out. Within the year, there were several competing virtual pets: GigaPets and Digimon offered different pet options and more gameplay. The constant connectivity of the virtual pets led to school bans, and as the internet gained traction in the late ’90s and early 2000s, online versions such as Neopets all but replaced the Tamagotchi. Virtual pets had an undeniable influence on future trends in gaming and handheld electronic devices, and while the toy has gone through several iterations and relaunches over the years, the original Tamagotchi largely remains a nostalgic relic of the ’90s.
The 1960s and ’70s are considered a golden age in advertising, though the industry’s creative revolution arguably started in the 1950s, thanks in part to the rise of television unlocking new forms of storytelling. It was an era of bold ideas, increasingly large budgets, and even bigger personalities — a time when advertising was seen as a glamorous, if ethically murky, profession populated by well-dressed men and women (but mostly men) profiting from the postwar consumer culture.
At the time, many of the nation’s largest ad agencies were located on Madison Avenue in Manhattan, and the street came to be synonymous with American advertising and its unique methodology. Safire’s Political Dictionary, published in 1978, referred to “Madison Avenue techniques” as the “gimmicky, slick use of the communications media to play on emotions.” More recently, the culture surrounding this advertising boom has been portrayed in 2007’s acclaimed AMC series “Mad Men,” centered on the charismatic creative director Don Draper (played by Jon Hamm). Here are five fascinating facts about the golden age of advertising, and the real-life ad men and women of Madison Avenue.
A “Small” Ad Changed the Way Americans Looked at Cars
In the 1960s, advertising underwent a transformation that became known as the Creative Revolution, shifting the industry’s focus from research and science to an approach that was creative and emotionally driven. For better or worse, this era of advertising owes a lot to the Volkswagen Beetle, and the visionary ad man Bill Bernbach. In 1959, at a time when Americans were buying cars out of Detroit and vehicles were getting bigger and flashier, Bernbach’s agency, Doyle Dane Bernbach (DDB), was contracted to promote the German-made Volkswagen Beetle in the United States. The problem was, Volkswagen’s strong link to Nazi Germany made it a tough sell in the U.S. The challenge called for an unconventional approach. Rather than attempting to duplicate the busy, colorful advertising style of American-made cars, the creative team behind Volkswagen’s campaign went in the opposite direction. The first ad, “Think Small,” featured a small black-and-white image of a Volkswagen Beetle against a backdrop of white space. The now-iconic ad encouraged consumers to look at the car in a new light, from being able to “squeeze into a small parking spot” to having small insurance payments and small repair bills.
The 1960s ushered in a new era of creativity in advertising, delivering advertisements that were brash and irreverent but also respectful of the consumer and entertaining to read. Ironically, one of the biggest players in American advertising was British ad man David Ogilvy, founder of the New York City-based advertising giant Ogilvy & Mather and known today as the “father of advertising.” Ogilvy believed in the importance of creating “story appeal” through the use of unique, unexpected elements or “hooks,” such as the eye patch worn by “The Man in the Hathaway Shirt” ads. “Every advertisement is part of the long-term investment in the personality of the brand,” Ogilvy said, and it was a philosophy that Madison Avenue took to heart. Ogilvy’s portfolio included the first national advertising campaign for Sears, the quirky Commander Whitehead ads promoting Schweppes Quinine Water, and the beautiful tourism ads that helped revitalize the image of Puerto Rico. In 1962, Ogilvy’s creative and innovative vision led TIME magazine to call him “The most sought-after wizard in today’s advertising industry.”
The “three-martini lunch” — the typically all-male leisurely power lunches where ideas were sparked and deals were made over a few rounds of cocktails — is the stuff of legend today, and an iconic image of the culture surrounding Madison Avenue. And indeed, during the heyday of advertising’s golden age, drinking at lunch was not only acceptable, but expected. According to ad exec Jerry Della Femina, “The bartender would be shaking the martinis as we walked in.” It was accepted that drinking fueled the creative process; David Ogilvy’s advice for tapping into the creativity of the unconscious included “going for a long walk, or taking a hot bath, or drinking half a pint of claret.” Regardless of whether the real ad execs of Madison Avenue were regularly imbibing as much alcohol as Don Draper and his pals on “Mad Men,” there’s one indisputable fact about the so-called three-martini lunch: It was a deductible business expense that symbolized success as much as excess.
The Leo Burnett Company was one of the few major advertising agencies not based in Manhattan, but the Chicago agency was responsible for a number of well-known campaigns, including the Pillsbury Doughboy, Tony the Tiger, and one of the most successful campaigns in advertising history, the Marlboro Man. Smokers and non-smokers alike know the iconic cowboy character, who was developed by Burnett in the mid-1950s to rebrand what had been marketed as a “mild” cigarette for women. The ads were a hit and, in the mid-1960s, the team at Burnett went even further in promoting the brand by using real cowboys on a Texas ranch. When tobacco advertising was banned from television and radio in the early 1970s, the Marlboro cowboys still found success in print, making Marlboro the top-selling brand worldwide in 1972.
The ’60s Saw the First Female CEO of a Major Ad Agency
Throughout the 1950s and ’60s, college-educated women recruited to work on Madison Avenue were more likely to be found sitting behind a typewriter than in the boardroom. In a booklet published in 1963 by the J. Walter Thompson agency (JWT), young women were encouraged to hone their typing and shorthand skills so they could become the “right hand to a busy executive” or “secretary to one of the senior analysts.” But in 1966, Mary Wells Lawrence, the founding president of Wells Rich Greene, became the first woman to found, own, and run a major advertising agency. Two years later, she became the first woman CEO of a company listed on the New York Stock Exchange. Some of her agency’s most notable campaigns include Alka-Seltzer’s “Plop, Plop, Fizz, Fizz,” Ford’s “Quality Is Job One,” and the “I ❤ New York” tourism campaign.
We all know of the Freemasons and the ever-mysterious Illuminati, but throughout history, plenty of other secret societies have flourished under the radar. The western U.S. is home to a long-running, low-key historical society with a unique and eccentric ethos, while northern Spain’s historic food culture has been kept alive through selective supper clubs for more than a century. Though their stories don’t often get told, these clandestine groups have nonetheless left their own obscure marks. Read on to learn about five little-known secret societies.
Secret societies typically conjure a dark air of mystery, but the Order of the Occult Hand illustrates the fun side of underground organizations. Its origins can be traced to 1965, when Joseph Flanders, a crime reporter for the Charlotte News, wrote an article about the shooting of a local millworker. “It was as if an occult hand had reached down from above and moved the players like pawns upon some giant chessboard,” Flanders wrote. His colleagues, the legend goes, found the flowery description so funny, they formed the Order of the Occult Hand, a secret society dedicated to sneaking “it was as if an occult hand,” or a similar phrase, into their work.
The mission quickly spread among journalism circles in Charlotte and beyond. By the early 1970s, the mischievous media conspiracy was becoming so prevalent that the Boston Herald reportedly banned “occult hand” from the paper. Over the years, the phrase continued to show up in The New York Times, The Washington Post, and the Los Angeles Times. In 2004, writer James Janega published a thorough exposé of the Order in the Chicago Tribune, and in 2006, journalist Paul Greenberg, a longtime member of the society, copped to creating a new secret phrase that went into circulation, even as the “occult hand” keeps going.
In 2011, a team of researchers cracked the code of a centuries-old manuscript belonging to a secret society known as the Oculists. The text, known as the Copiale Cipher and believed to date back to between about 1760 and 1780, was discovered in former East Germany following the Cold War. Once the confusing use of Roman and Greek characters, arrows and shapes, and mathematical symbols was deciphered, a ritual manual for an 18th-century German group with a keen interest in eyesight was revealed.
The cipher detailed the Oculists’ initiation ceremonies, oaths, and “surgeries,” which seemed to consist of plucking hairs from eyebrows with tweezers — a nonsurgical procedure, of course, described by the manuscript as a symbolic act. Another passage described a tobacco ceremony in which the hand pointedly touches the eye; another still told of a candidate kneeling in a candlelit room in front of a man wearing an amulet with a blue eye in the center. Research has suggested that the group’s focus on the eye was simply due to the fact that eyes are part of the symbology of secret societies — the Oculists did not appear to be optometrists, and their ultimate purpose remains a mystery.
Northern Spain’s Basque Country is home to a handful of “txokos” — food-centric secret societies that started as a way to save money on food and drink when dining out of the home. These gastronomic societies function as exclusive clubs; members, often chosen after being waitlisted for years, have access to a fully stocked kitchen and pantry, where they cook for themselves or each other, using the honor system to pay for items needed or used. While it sounds similar to a modern dinner party, many of the txokos have been around for decades, and are still going strong.
Kañoyetan, reportedly the oldest society in the region (founded in 1900), counts renowned local chef Martin Berasategui — a 12-time Michelin star recipient — among its members. Until recently, txokos operated as secret societies only for men; the clubs claimed to be places for men to socialize and cook outside of the home, where, according to the BBC, “their wives traditionally called the shots.” Wine and cider are always on hand; these days the dinners can start late in the evening, and have been known to stretch on until the early morning hours.
The mysterious society known as E Clampus Vitus originated in West Virginia sometime around the mid-1840s. By the early 1850s, the “Clampers,” like many people during the gold rush era, made their way west to California. Many of the fraternal club’s rituals were adopted as a reaction to the formalities of other organizations at the time, such as the Odd Fellows and Freemasons. Clampers, who were primarily miners, wore eccentric clothing and accessories, conducted lighthearted rituals, and adopted the slogan “Credo Quia Absurdum,” or, roughly translated, “I believe because it is absurd.”
The Clampers’ clubs waned around the turn of the 20th century, and by the 1920s, the society was all but defunct. But in the 1930s, the Clampers reestablished themselves with a new objective: to chronicle some of the most obscure details of the history of the American West. In California alone, more than 1,400 historical markers have been installed to commemorate moments in the state’s history that might otherwise go overlooked, including the birthplace of the martini, filming locations, and the “world’s largest blossoming plant.”
The name Pythagoras likely brings back memories of high school geometry, but the ancient Greek philosopher and mathematician was also the head of a mysterious society. The Divine Brotherhood of Pythagoras was formed in the sixth century BCE. The community may have been based on the study of mathematics, but it operated more like a secret society — or, as some might say, a cult. It’s believed they lived together communally, surrendered their personal possessions, followed several strict rituals, and were vegetarians who purportedly avoided beans, which were thought to contain souls.
The Pythagoreans’ motto was “all is number,” and their aim was to be pure of mind and soul. Their focus on mathematics and science was a way to achieve purity — as was avoiding wearing woolen clothing, and never stirring a fire with a knife, as laid out in Pythagoras’ rules. The group ultimately had many mathematical achievements, but their selective and rigid way of life contributed to a lingering sense of mystery around the community.
As we look back at American history, it’s crucial to take a moment to reflect on and recognize the contributions made by the nation’s Indigenous peoples, who are so often overshadowed by famous figures who came to the United States from other parts of the world. To commemorate this important part of America’s heritage, here’s a look at five notable Indigenous heroes and leaders who shaped the nation through their tireless efforts.
Geronimo (1829-1909)
A medicine man and leader of the Bedonkohe band of the Chiricahua Apache, Geronimo was born on the Gila River in New Mexico, where he was originally given the name Goyahkla, meaning “the one who yawns.” After the United States government forcibly relocated 4,000 Apaches to a reservation in San Carlos, Arizona, Geronimo led dozens of breakouts in an effort to return his community to their nomadic roots. Geronimo’s legacy is vast. His relationship with many American and Mexican civilians was complex, as he fought against colonialism but was made famous after appearing in Buffalo Bill’s “Wild West” show and eventually riding in Theodore Roosevelt’s 1905 inaugural parade. Geronimo’s tireless fight for Apache independence cemented him as a fearless crusader for freedom by the time of his death from pneumonia in 1909.
The son of a warrior, Sitting Bull was born in what is now South Dakota and was nicknamed “Slow” for his lack of fighting ability — that is, until he was branded Tatanka Yotanka (“Sitting Bull”) at age 14 after “counting coup” in a battle against the Crow Tribe. (“Counting coup” is a way to humiliate an enemy by riding close enough to touch them with a stick.) Sitting Bull eventually rose to become chief of the Hunkpapa Sioux, and fought tirelessly against the U.S. military, who sought to seize Indigenous land.
After fleeing to Canada to escape a vengeful army in the wake of the defeat of General George Armstrong Custer (and his 210 troops) in 1876 at the Battle of Little Bighorn, Sitting Bull returned to the U.S. in 1881 and was held prisoner at the Standing Rock Reservation in the Dakota Territory. His impact, however, could not be contained: After an Indigenous mystic claimed in 1889 that a ghost dance would eliminate the threat of white settlers on Native land, Sitting Bull allowed his followers to practice the dance — much to the horror of federal officials, who feared another uprising. Sitting Bull was killed by gunfire upon his arrest in 1890, and is remembered as a martyr for freedom.
Born near the Black Hills of South Dakota, Lakota Chief Crazy Horse was the son of a warrior with the same name, and at a young age he began showcasing his capacity for battle and bravery. Having helped lead the Sioux resistance against the U.S. military’s attempts to colonize the Great Plains throughout the 1860s and ’70s, Crazy Horse led a band of Lakota warriors against General Custer's 7th Cavalry Regiment during the Battle of Little Bighorn in 1876 (alongside Sitting Bull) before returning to the Northern Plains. Unfortunately, Crazy Horse and his community faced an unwavering enemy; forced to keep moving — and fighting — to evade federal resettlement, the chief and his 1,100 followers ultimately surrendered to the U.S. military at Fort Robinson in May 1877. There, in the wake of his arrest (and under the banner of truce), Crazy Horse was stabbed during a scuffle with U.S. soldiers and died of his injuries. He is remembered for his courage, leadership, and his endless perseverance against the colonizing forces.
Sacagawea (c. 1788-1812 or 1884)
Sacagawea was only around 16 years old when she carved her place in Native American history through her ability to communicate with different peoples. Kidnapped by the Hidatsa (Indigenous people of North Dakota) at age 12, Sacagawea was then claimed by French Canadian trader Toussaint Charbonneau as one of his wives at age 13. Despite this treatment, upon the arrival of explorers Meriwether Lewis and William Clark to Hidatsa territory in 1804, the young woman proved herself invaluable. Chosen by her husband to serve as interpreter as he and the explorers moved west, she rescued records and supplies from the river when the crew’s boat tipped and took on water, helped acquire horses from her brother when the expedition passed through Idaho, and saved her counterparts from starvation as they faced food shortages. Most importantly, her role as translator helped assure safety for both her own team and the Indigenous communities they crossed paths with. Her knowledge and wherewithal earned her immense respect from the 45 white men who relied on her, and ultimately made the expedition a success. Her date of death remains a mystery. Following the expedition, Sacagawea and Charbonneau worked for the Missouri Fur Company in St. Louis in 1810, and it was believed that Sacagawea succumbed to typhus in 1812. However, some Native American oral histories claim that she lived until 1884 on the Shoshone lands where she was born.
Wilma Mankiller (1945-2010)
For 10 years, Wilma Mankiller served as the principal chief of the Cherokee Nation, the first woman to do so. Born on Cherokee Nation land in Oklahoma in 1945, Mankiller and her family were moved to a housing project in California in the 1950s, where they endured culture shock, racism, and the effects of poverty, which shaped the future chief’s ethos. Mankiller returned to Cherokee territory in 1977, where she founded the Community Development Department for the Cherokee Nation and advocated endlessly for improved education, health care, and housing services.
For these efforts, then-Principal Chief Ross Swimmer asked her to run as his deputy in 1983. Two years later, Swimmer stepped down to lead the Bureau of Indian Affairs, and Mankiller became principal chief, serving until 1995. She was celebrated for lowering infant mortality rates, boosting education, and working to ensure financial and social equality. Mankiller was inducted into the National Women’s Hall of Fame in 1993, received the Presidential Medal of Freedom in 1998, and continued to advocate for women’s rights and Indigenous rights until her death in 2010 at age 64.
Depending on where you lived and when you grew up, it’s possible you might have known more than one person with the same name. Maybe there was a Jennifer A. and a Jennifer L., or maybe you knew four different people named Michael. Year after year, decade after decade, there are trends in baby names that draw on history, religion, and cultural references. Here are the most popular baby names in the United States during each decade of the 20th century.
Between 1900 and 1909, the most popular name for boys in the U.S. was John, and the most popular girls’ name, by a long shot, was Mary. This is according to data from the U.S. Social Security Administration, based on people applying for Social Security cards. There were 84,591 applications under the name John, and 161,504 entries for Mary. These two names popped up time and time again throughout the 20th century. Both names come from the Bible — John is one of Jesus’ disciples, and Mary is the name of both Jesus’ mother and Mary Magdalene. After John, the most popular boys’ names of this decade were William, James, George, and Charles, and the most popular girls’ names after Mary were Helen, Margaret, Anna, and Ruth.
1910s
Between 1910 and 1919, the most popular names were once again John and Mary. In this decade, there were 376,312 registered Johns and 478,637 Marys. Why the sudden jump? For one, the Social Security Administration began collecting data in 1937, so anyone born before that was only counted if they applied for a Social Security card after 1937. (That means the data for the 1900s, 1910s, and 1920s is based on people who listed their birthdays in these decades despite obtaining cards later in life, and doesn’t count anyone born in this period that didn’t apply for a Social Security card.) The U.S. also saw a population spike as infant mortality rates decreased throughout the 20th century, thanks to advances in health care and better access to clean water.
In the 1910s, for the second decade in a row, the second most popular names for boys and girls were William and Helen, respectively, followed by James, Robert, and Joseph for boys, and Dorothy, Margaret, and Ruth for girls. William has long been a popular English name dating back to William the Conqueror, who became the first Norman king of England in the 11th century. Helen, meanwhile, has its origins in Greek mythology: Helen of Troy was a famous beauty, known as the “face that launched a thousand ships.”
Between 1920 and 1929, John finally fell out of the top spot, as the most popular name for boys was Robert, with 576,373 entries. Robert, like William, dates back to English royalty and translates to “bright with fame” or “shining.” Mary stayed strong for girls, with 701,755 registered applications. The 1920s saw continued population increases both in the U.S. and worldwide. This is sometimes credited to a baby boom that occurred after World War I and the Spanish influenza, but is largely due, as in the previous decade, to better health care.
1930s
Between 1930 and 1939, Robert and Mary stayed at the top of the list, with 590,787 Roberts and 572,987 Marys. Though there were more Roberts born this decade than in the previous one, there was a decline in the birth rate overall due to the strain that the Great Depression placed on families. (The overall population was still higher in 1940 than in 1930, at roughly 132 million versus 123 million people.) A few new interesting names entered the runner-up positions in the 1930s. In female names, Betty and Barbara grew in popularity. Betty is a nickname for Elizabeth, a versatile name with Hebrew origins that is also found in English royalty (namely, Queen Elizabeth I). Barbara, like Helen, comes from Greek, and is also the name of St. Barbara, the patron saint of armorers, miners, and artillerymen. For boys’ names, the runners-up after Robert were James, John, William, and Richard.
Between 1940 and 1949, the name Robert fell to the second spot after James, which had 795,753 entries. Mary remained the most popular name for girls at 640,066 entries. The name James derives from Hebrew, and, like John, stems from a number of uses in the Bible. Like many other popular names, James is also found in the English monarchy, as well as the Scottish monarchy. Though it’s fallen out of the top slots in recent years in the United States, James remains one of the most popular baby names in Scotland. The next most popular boys’ names in the 1940s were Robert, John, William, and Richard; for girls, the list included Linda, Barbara, Patricia, and Carol. Interestingly, while Linda was never the most popular name in any given year, it is the most popular American baby name of all time, translating to “beautiful” in Spanish and Portuguese. Patricia, on the other hand, had been popular in England long before its time in the states, as it was the name of Queen Victoria’s granddaughter.
1950s
Between 1950 and 1959, the names James and Mary remained at the top of the list with 843,711 and 625,601 entries, respectively. Not far behind James, however, was a new popular name: Michael. Michael, like James, stems from the Hebrew Bible, and variations of the name exist across a number of languages, such as Miguel in Spanish and Micha in German. After James and Michael, Robert, John, and David topped the list for boys’ names, while Linda, Patricia, Susan, and Deborah followed Mary for the most popular girls’ names.
Between 1960 and 1969, everything changed, as is fitting for this revolutionary decade. Both James and Mary were unseated from the No. 1 slot: Michael became the most popular name for boys at 833,102 entries, and Lisa for girls at 496,975 entries. In fact, there were almost 150,000 more Lisas than Marys in the 1960s. The name is another variation on the popular moniker Elizabeth, and even Elvis Presley picked it for his daughter, Lisa Marie, who was born in 1968. While not much else changed in boys’ names this decade, popular girls’ names saw the addition of newcomers Susan, Karen, and Kimberly.
Between 1970 and 1979, Michael remained the most popular name for boys, topping the decade with 707,458 entries, while Jennifer ended Lisa’s short-lived reign with 581,753 entries. More new names cropped up in the second and third slots, however, including Christopher and Jason for boys. The name Jennifer, meanwhile, grew so popular, it became known as the “standard” name for a baby girl. The initial spike in Jennifers started 50 years prior with the appearance of the name in a George Bernard Shaw play called The Doctor’s Dilemma. After Jennifer, the most popular ’70s girls’ names were Amy, Melissa, Michelle, and Kimberly.
Between 1980 and 1989, Michael retained its title as the most popular name for boys, with 663,827 entries, while Jessica just barely unseated Jennifer as the most popular name for girls — there were 469,518 Jessicas versus 440,896 Jennifers. Jessica stems from the Hebrew Bible, where its original spelling was “Jeska”; the common spelling in English comes from William Shakespeare’s play The Merchant of Venice. The top five boys’ names in the 1980s were Michael, Christopher, Matthew, Joshua, and David, and the top five for girls were Jessica, Jennifer, Amanda, Ashley, and Sarah.
Between 1990 and 1999, Michael and Jessica stayed the most popular names for each gender, with 462,390 Michaels and 303,118 Jessicas. Still, there were fewer entries for both than in the previous decade, in part because a handful of newer, trendy names cropped up as well, such as Matthew, Justin, and Andrew for boys and Ashley and Tiffany for girls. Andrew, like James, is a popular name with links to Scotland, while Matthew goes back to the Bible. Ashley and Tiffany, meanwhile, reflect the trend of girls’ names ending in “y” — names such as Brittany, Courtney, Emily, and Kelsey took off in the beginning of the 21st century.
Some of the most profound moments in history can be encapsulated in a single, memorable quote. These succinct phrases, often pulled from longer speeches or events, distill complex ideas into digestible gems. At their best, they act as verbal snapshots, capturing the essence of historical moments with an emotional urgency that lingers and lets them resonate across generations. Martin Luther King Jr.’s rallying cry of “I have a dream” is easily one of the most famous such lines in history. Similarly, Neil Armstrong’s “That’s one small step for man, one giant leap for mankind” immortalizes a peak moment in humanity; the astronaut’s muffled voice as he spoke to the public on Earth from the moon is unforgettable.
These sound bites have become cultural shorthand for momentous events and the ideals they captured, and their historical weight will keep them in the cultural consciousness for years to come.
At the heart of Martin Luther King Jr.’s famous 1963 speech were four simple words: “I have a dream.” On August 28, from the steps of the Lincoln Memorial and against a backdrop of racial segregation and discrimination in the United States, King energized the crowd — and the world — with his dream of a better life for his family and all African Americans. “I have a dream,” King said, “that one day this nation will rise up and live out the true meaning of its creed: We hold these truths to be self-evident, that all men are created equal.” He employed the phrase again, several times, to great effect, throughout the speech. “I have a dream that my four little children will one day live in a nation where they will not be judged by the color of their skin but by the content of their character,” he said. “I have a dream today.” The urgent, eloquent delivery laid bare the need for change; “I have a dream” became a rallying cry for the civil rights movement, and remains not a relic of history, but a living aspiration to this day.
King’s speech was televised by major broadcasters to a large live audience. At the time, he was a nationally known figure, but this was the first time many Americans — including, reportedly, President John F. Kennedy — had ever seen him deliver a full address. Less than a year later, President Lyndon B. Johnson signed the Civil Rights Act of 1964; the following year saw the Voting Rights Act of 1965 come into law. These pieces of legislation were the biggest civil rights advancements since the end of the Civil War.
On July 20, 1969, the first human walked on the moon. As astronaut Neil Armstrong climbed down the ladder of Apollo 11’s lunar module and onto the moon’s surface, he encapsulated the profound moment with these words: “That’s one small step for man, one giant leap for mankind.” He spoke through a muddled transmission to Earth as some 650 million people looked on in awe.
Armstrong later told his biographer that, while he had thought ahead about what to say, it wasn’t too rehearsed. “What can you say when you step off of something?” he told biographer James R. Hansen. “Well, something about a step. It just sort of evolved during the period that I was doing the procedures of the practice takeoff and… all the other activities that were on our flight schedule at that time.” Although the quote has endured, Armstrong himself said it had been misquoted all along, and that he actually said, or at least meant to say, “one small step for a man.” (After many years and multiple attempts to clean up the audio quality, the Smithsonian National Air and Space Museum has concluded that the original quote is accurate.)
President John F. Kennedy assumed office during a tumultuous time in America’s history. But right from his inaugural address, he conveyed a spirit of hope and idealism in a resonant quote that went on to define his presidency. “Ask not what your country can do for you — ask what you can do for your country,” he famously said.
JFK’s inauguration, the first to be broadcast in color, was watched by some 38 million people. The speech, although credited principally to Kennedy, was also written by Kennedy’s longtime aide (and later, principal speechwriter) Ted Sorensen. Kennedy wanted a speech that would “set a tone for the era about to begin,” and he got just that. America was on the precipice of great social change, and the inaugural address encapsulated the country’s need for unity and the civic engagement the moment would call for.
Photo credit: Todd Warshaw/ Getty Images Sport via Getty Images
“Do You Believe in Miracles?” (1980)
One of the most iconic moments in sports history happened during the 1980 Winter Olympics in Lake Placid, New York. In the last few minutes of the men’s ice hockey medal-round game between the United States and the Soviet Union, the U.S. was, improbably, ahead by one goal. The Soviets were seasoned players known for their dominance in international hockey; they had placed in the top three in every world championship and Olympic tournament they had entered since 1954. The U.S. team, by comparison, was made up primarily of young college players with an average age of 21, making it the youngest American Olympic hockey team in history.
No one expected a U.S. victory. A New York Times columnist even wrote that “unless the ice melts,” the USSR would once again be victorious. As the clock counted down, with just five seconds left and the U.S. still up by one, ABC sportscaster Al Michaels remarked, “Do you believe in miracles?” before letting out an elated “Yes!” as the clock ran out and the U.S. won 4-3. The victory was soon dubbed the “Miracle on Ice.” Two days later, the U.S. went on to clinch the gold medal after defeating Finland. A TV documentary about the road to gold used Michaels’ quote for its title, and in 2016, Sports Illustrated called the victory the “greatest moment in sporting history,” proving that a good underdog story can be better than fiction.
On June 12, 1987, during a ceremony at Berlin’s Brandenburg Gate for the city’s 750th anniversary, U.S. President Ronald Reagan delivered the now-famous line, “Mr. Gorbachev, tear down this wall.” The Berlin Wall, which had divided East and West Berlin since 1961, was more than just an imposing physical barrier; it symbolized the ideological divide between communism and democracy across Europe during the Cold War.
Reagan’s speech eventually became a defining moment of his presidency. Although reactions were mixed at the time, the address was viewed far more favorably after the Berlin Wall finally fell two years later, on November 9, 1989. The line now stands as a pivotal moment in history, capturing an era of tense political dynamics and, of course, solidifying Reagan’s legacy as “the Great Communicator.” The fall of the Berlin Wall was a historic turning point, signaling a victory for democracy and peace. Soviet leader Mikhail Gorbachev even won the Nobel Peace Prize in 1990 for his role in bringing the Cold War to an end.
Over the past century, the typical home kitchen has undergone a significant transformation, reflecting both social changes and new technology. In the 1920s and ’30s, kitchens were primarily utilitarian spaces with a focus on functionality and easy-to-clean surfaces. Appliances were limited, hand mixers had cranks, and gas ovens, which had replaced wood- or coal-burning stoves in most homes, were themselves starting to be replaced by electric ovens.
The post-World War II consumerism of the late 1940s and 1950s brought bigger kitchens for entertaining and more labor-saving appliances, including blenders, mixers, and dishwashers. The kitchen space became more streamlined and functional, and the 1960s and 1970s brought countertop food processors and microwave ovens into the mainstream.
Open-plan kitchens and islands became increasingly popular in home design throughout the 1980s and ’90s, indicative of the kitchen’s role as a hub for family and friends to gather. That trend continued into the 21st century, along with a significant shift toward high-tech kitchens, smart appliances, and a focus on sustainability. Today’s kitchens — reflecting the changing ways we prepare, store, and consume food — look dramatically different than they did a century ago, making many once-popular items obsolete. Here are six things that your grandparents and great-grandparents might have had in their own home kitchens a century ago.
Photo credit: George Rinhart/ Corbis Historical via Getty Images
An Icebox
Before the widespread availability of electric refrigerators, iceboxes were used to keep perishable food cool. These wooden or metal boxes had a compartment for ice at the top, and fresh ice was delivered each week by an iceman. The design of the icebox allowed cold air to circulate around the stored items, while a drip pan collected the water as the ice melted. Naturally, iceboxes fell out of fashion as electric fridges went mainstream. In 1927, General Electric introduced the first affordable electric refrigerator, which relied on a refrigerant for cooling rather than ice.
Photo credit: FPG/ Archive Photos via Getty Images
A Butter Churn
Before commercial butter production made it possible to buy butter at the market, churning cream into butter was an activity done at home. The hand-crank butter churn was introduced in the mid-19th century, and it remained the most commonly used household butter churn until the 1940s. In the early 20th century, the Dazey Churn & Manufacturing Company began producing glass churns that could make smaller quantities of butter much more quickly than the larger, time-intensive churns. Once the butter was churned, it could be poured or pressed into decorative molds for serving.
A Hoosier is a freestanding, self-contained kitchen cabinet that was popular in the early 1900s, named after the Hoosier Manufacturing Company that made it. Also known as a “Kitchen Piano” due to its shape, this kitchen necessity offered homemakers ample storage space and an additional work surface. Hoosier cabinets had numerous drawers and shelves for storing cookware and utensils, as well as features such as a flour bin with a built-in sifter, a sugar bin, a spice and condiment rack, a bread bin, a pull-out cutting board, and a cookbook holder. The all-in-one cabinet fell out of favor as kitchen designs began to incorporate built-in cabinets and islands for additional storage and counter space, but these cabinets are still sometimes used for decorative storage.
Photo credit: Camerique/ Archive Photos via Getty Images
A Manual Hand Mixer
While the iconic KitchenAid stand mixer was patented more than 100 years ago in 1919, electric hand mixers weren’t commercially available until the 1960s. Before then, beating eggs or mixing other ingredients was done by hand, often with a manual hand mixer (also called a rotary egg beater). First developed in the 1850s, hand mixers had two beaters that rotated when you turned a crank. Though the style and mechanisms evolved over the years, manual hand mixers were still widely used in the 1920s, when only two-thirds of American households had electricity.
Even though ground coffee was available in bags and cans in the 1920s, and instant coffee was gaining popularity, household coffee grinders, such as the wall-mounted coffee grinder (or mill), were still common kitchen appliances. According to a 1918 New-York Tribune article on the art of making perfect coffee, “The real coffee lover will always have a mill in the kitchen.” The wall-mounted, hand-crank style had a glass container that could hold a pound of coffee beans, and a container with tablespoon markings to catch the ground coffee.
There was a time when treasured family recipes were written on 3-by-5-inch index cards and stored in a box on the kitchen counter. Before the 1920s, most recipes were passed on by example; young women would learn how to make their grandmother’s pot roast by helping her in the kitchen. As a result, handwritten recipes were generally little more than a list of ingredients, often without quantities, and some vague directions. As kitchen science developed, magazines began advertising recipe subscriptions delivered as preprinted, perforated cards. Women also started writing their own recipes on blank cards to collect and exchange, and the recipe box proved to be a more decorative and lasting storage solution than a shoebox. Like many vintage kitchen items, this nostalgic throwback still has novelty appeal, but the recipe box has largely been replaced by digital recipes stored on apps and websites.