Historically Speaking: The Ancient Elixir Made by Bees

Honey has always been a sweet treat, but it has also long served as a preservative, medicine and salve.

The Wall Street Journal

February 9, 2023

The U.S. Department of Agriculture made medical history last month when it approved the first vaccine for honey bees. Hives will be inoculated against American Foulbrood, a highly contagious bacterial disease that kills bee larvae. Our buzzy friends need all the help they can get. In 2021, a national survey of U.S. beekeepers reported that 45.5% of managed colonies died during the preceding year. Since more than one-third of the foods we eat depend on insect pollinators, a bee-less world would drastically alter everyday life.

The loss of bees would also cost us honey, a foodstuff that throughout human history has been much more than a pleasant sugar substitute. Energy-dense, nutritionally rich wild honey, ideal for brain development, may have helped our earliest human ancestors along the path of evolution. The importance of honey foraging can be inferred from its frequent appearance in Paleolithic art. The Araña Caves of Valencia, Spain, are notable for a particularly evocative line drawing of a honey harvester dangling precariously while thrusting an arm into a beehive.

Honey is easily fermented, and there is evidence that the ancient Chinese were making an alcoholic beverage of mixed fruit, rice and honey as early as 7000 B.C. The Egyptians may have been the first to domesticate bees. A scene in the sun temple of Pharaoh Nyuserre Ini, built around 2400 B.C., depicts beekeepers blowing smoke into hives as they collect honey. They loved the taste, of course, but honey also played a fundamental role in Egyptian culture. It was used in religious rituals, as a preservative (for embalming) and, because of its anti-bacterial properties, as an ingredient in hundreds of concoctions from contraceptives to gastrointestinal medicines and salves for wounds.

The oldest known written reference to honey comes from a 4,000-year-old recipe for a skin ointment, noted on a cuneiform clay tablet found among the ruins of Nippur in the Iraqi desert.

The ancient Greeks judged honey like fine wine, rating its qualities by bouquet and region. The thyme-covered slopes of Mount Hymettus, near Athens, were thought to produce the best varieties, prompting sometimes violent competition between beekeepers. The Greeks also appreciated its preservative properties. In 323 B.C., the body of Alexander the Great was allegedly transported in a vat of honey to prevent it from spoiling.

Honey’s many uses were also recognized in medieval Europe. In fact, in 1403 honey helped to save the life of the 16-year-old Prince Henry, the future King Henry V of England. During the Battle of Shrewsbury, an arrowhead became embedded in his cheekbone. The extraction process was long and painful, resulting in a gaping hole. Knowing the dangers of an open wound, the royal surgeon John Bradmore treated the cavity with a honey mixture that kept it safe from dirt and bacteria.

Despite Bradmore’s success, honey was relegated to folk remedy status until World War I, when medical shortages encouraged Russian doctors to use honey in wound treatments. Honey was upstaged soon after by the discovery of penicillin in 1928, but today its time has come.

A 2021 study in the medical journal BMJ found honey to be a cheap and effective treatment for the symptoms of upper respiratory tract infections. Scientists are exploring its potential uses in fighting cancer, diabetes, asthma and cardiovascular disease.

To save ourselves, however, first we must save the bees.

Historically Speaking: Fakers, Con Men and Pretenders to the Throne

George Santos is far from the first public figure to have assumed an identity later discovered to be rife with fictions

The Wall Street Journal

January 27, 2023

Few would have thought it possible in the age of the internet, and yet U.S. Rep. George Santos turns out to have invented a long list of details of the life story he told as a candidate.

It was much easier to be an impostor in the ancient world, when travel was difficult, communications slow, and few people even knew what their rulers looked like. One of history’s oldest and strangest examples depended on this ignorance.

According to the Greek historian Herodotus, King Cambyses II of Persia became so jealous of his younger brother Bardiya, also known in history as Smerdis, that he ordered his assassination in 522 B.C. Imagine Cambyses’s shock, therefore, when news came to him while he was away fighting in Egypt that a man calling himself Smerdis had seized the throne.

ILLUSTRATION: THOMAS FUCHS

Cambyses died before he could confront the impostor. But Prince Darius, a former ally of Cambyses, suspected that the new king was actually a Magian priest named Gaumata. Herodotus relates that Darius knew a crucial fact about Gaumata: his ears had been cut off.

Royal etiquette kept the king at a distance from everyone—even the queen lived and slept in separate quarters. But Darius persuaded her to find an opportunity to be in the king’s apartment when he was asleep and check his ears. They were missing! Darius promptly denounced this Smerdis as an impostor, proclaimed himself the savior of the kingdom and took the throne. Modern historians suspect that the real impostor was Darius and that he invented the Gaumata story to justify his coup against Smerdis.

The Middle Ages were a boom time for royal impostors. Kings and crown princes were often assassinated or executed under conditions that could plausibly be spun into tales of miraculous escape. Some of those impersonating long-lost monarchs managed to get quite far. King Henry VII of England spent eight years fighting a rebellion led by Perkin Warbeck, who claimed to be one of the two princes held in the Tower of London and killed by King Richard III in the 1480s.

During the Renaissance, the most famous case in Europe involved neither a king nor the heir to a great fortune but a French peasant named Martin Guerre. In 1548, the feckless Guerre abandoned his wife, Bertrande, and their son. Eight years later, a look-alike named Arnaud du Thil suddenly appeared, claiming to be Guerre. He settled down and became a good member of the community—too good for those who had known the old Guerre. But Bertrande insisted he was the same man. The deception unraveled in 1560 when the real Martin Guerre made a sensational return in the middle of a court trial to decide du Thil’s identity. Du Thil was executed for impersonation, but the judge declared Bertrande innocent of adultery on the grounds that women are known to be very silly creatures and easily deceived.

Opportunities to claim titles and thrones diminished after the 18th century, and a new class of impostor arose: the confidence man. Mr. Santos isn’t yet a match for the Czech-American fraudster “Count” Victor Lustig. In 1925, posing as a French government official, Lustig successfully auctioned off the Eiffel Tower. In Chicago the following year, he posed as an investor and swindled Al Capone out of $5,000.

Lustig’s long list of crimes eventually landed him in Alcatraz. Where Mr. Santos is heading is a mystery—much like where he’s been.

Historically Speaking: The Long, Dark Shadow of the Real ‘Bedlam’

Today’s debate over compulsory treatment for the mentally ill has roots in a history of good intentions gone awry

The Wall Street Journal

January 12, 2023

This year, California and New York City will roll out plans to force the homeless mentally ill to receive hospital treatment. The initiatives face fierce legal challenges despite their backers’ good intentions and promised extra funds.

Opposition to compulsory hospitalization has its roots in the historic maltreatment of mental patients. For centuries, the biggest problem regarding the care of the mentally ill was the lack of it. Until the 18th century, Britain was typical in having only one public insane asylum, Bethlehem Royal Hospital. The conditions were so notorious, even by contemporary standards, that the hospital’s nickname, Bedlam, became synonymous with violent anarchy.

Plate 8 of William Hogarth’s ‘A Rake’s Progress,’ titled ‘In The Madhouse,’ painted around 1735, depicts the hospital known as ‘Bedlam.’
PHOTO: HERITAGE IMAGES VIA GETTY IMAGES

Treatment at Bedlam consisted of pacifying the patients through pain and terror, and its cost was offset by viewing fees. Anyone could pay to stare or laugh at the inmates, and thousands did. But social attitudes toward mental illness were changing. By the end of the 18th century, psychiatric reformers such as Benjamin Rush in America and Philippe Pinel in France had demonstrated the efficacy of more humane treatment.

In a burst of optimism, New York Hospital created a ward for the “curable” insane in 1792. The Quaker-run “Asylum for the Relief of Persons Deprived of the Use of Their Reason” in Pennsylvania became the first dedicated mental hospital in the U.S. in 1813. By the 1830s there were at least a dozen private mental hospitals in America.

The public authorities, however, were still shutting the mentally ill in prisons, as the social reformer Dorothea Dix was appalled to discover in 1841. Dix’s energetic campaigning bore fruit in New Jersey, which soon built its first public asylum. Designed by Thomas Kirkbride to provide state-of-the-art care amid pleasant surroundings, Trenton State Hospital served as a model for more than 70 purpose-built asylums that sprang up across the nation after Congress approved government funding for them in 1860.

Unfortunately, the philanthropic impetus driving the public mental hospital movement created as many problems as it solved. Abuse became rampant. It was so easy to have a person committed that in the 1870s Grover Cleveland, then an aspiring politician and future president, successfully silenced the mother of his illegitimate son by having her spirited away to an asylum.

In 1887, the journalist Nellie Bly went undercover as a patient in the Women’s Lunatic Asylum on Blackwell’s Island (now Roosevelt Island), New York. She exposed both the brutal practices of that institution and the general lack of legal safeguards against unwarranted incarceration.

The social reformer Dorothea Dix advocated for public mental health care.

During the first half of the 20th century, the best-run public mental hospitals lived up to the ideals that had inspired them. But the worst seemed to confirm fears that patients on the receiving end of state benevolence lost all basic rights. At Trenton State Hospital between 1907 and 1930, the director Henry Cotton performed thousands of invasive surgeries in the mistaken belief that removing patients’ teeth or organs would cure their mental illnesses. He ended up killing almost a third of those he treated and leaving the rest damaged and disfigured. The public uproar was immense. And yet, just a decade later, some mental hospitals were performing lobotomies on patients with or without consent.

In 1975 the ACLU persuaded the Supreme Court that the mentally ill had the right to refuse hospitalization, making public mental-health care mostly voluntary. But while legal principles are black and white, mental illness comes in shades of gray: A half century later, up to a third of people living on the streets are estimated to be mentally ill. As victories go, the Supreme Court decision was also a tragedy.

The New York Times: This Royal Saga Has a Surprise Ending

December 11, 2022

The New York Times

Once upon a time, a boy met a girl, and they fell in love. This was no ordinary love, just as this is no ordinary story. They first set eyes on each other in a crowded restaurant. They talked, nothing more. And yet each felt the connection between them. It was the beginning of a profound love that would survive extraordinary trials.

The couple had much in common. On the surface they had every advantage, being attractive, well educated and popular. Yet behind the mask of good fortune was a more complicated reality. Both were the products of unconventional childhoods. Both desperately wanted to escape their family backgrounds. And this was the rub. The boy and girl came from opposite ends of the social spectrum.

One was of royal blood, brought up in the lap of luxury. The other was a commoner, brought up in straitened circumstances by a single mother. Neither cared in the least about such things, although they were not so naïve as to think other people would share their views. Their greatest fear was that once their relationship became public they would lose what little freedom they had. Between the demands of royal protocol and the 24-hour glare of public scrutiny, they would become prisoners in a gilded cage.

But love conquers all, and by 2017, the couple were engaged. The announcement sent the country into a tizzy. That two people from such vastly different backgrounds could fall in love and marry seemed like a fairy tale come true with something in it for everyone. By welcoming the engagement, the royal family could prove that it was in step with the times. In covering it, the media had a gold mine on its hands. As for the public at large, royalist or not, there was a gallery of delights, whether it was the gossip, the glamour or simply pride in knowing that the world was watching.

This was no fairy tale, however, and it didn’t take long for the couple’s fears to be realized. The media loves a good scandal, and when there’s money involved there is often a family member or two ready to stir up trouble. The couple were horribly embarrassed after a private family quarrel was turned into a kind of national sport, with the public invited to take sides. They felt simultaneously trapped and exposed. She became the target of mockery, criticism, abuse. He was forced to defend his love for her against not-so-subtle schemes to break them up. One newspaper even ran a poll on whether their wedding should go ahead. Later, she revealed that the relentless attacks on their privacy and on her personally nearly drove her to a nervous breakdown. She loved him, but it was killing her.

The wedding did happen, of course. By then the couple were already plotting their escape. The plan was so audacious, so unprecedented, that they told no one until all the pieces were in place. The announcement brought fresh condemnation. They weren’t just going to opt out of public life — they were going to emigrate to America. It would mean starting afresh, without titles, status or public money to support them. They would make their own way in the world, living on their salaries like ordinary citizens. To demonstrate their seriousness, they turned down the offer of a lump sum to see them off.

Princess Mako and Kei Komuro announced their plans on Oct. 26, 2021, in Tokyo.

Their low-key arrival in America naturally prompted speculation. What did the couple really have planned? Would they fulfill their vow to lead private lives, or would they take advantage of their newfound freedom to air their grievances against the royal family, the media and anyone who had ever crossed them? There was no need for them to live in a tiny, cramped apartment when they could earn millions telling their story.

Whether the couple were tempted to accept any of these opportunities, we will never know. They live a modest and quiet existence out of the limelight. They have no presence on any social media platforms, though that hasn’t stopped the paparazzi from dogging their footsteps. But former Princess Mako of Japan and her husband, Kei Komuro, are finally free and independent.

Ms. Komuro, who had to give up her royal status to marry Mr. Komuro, has a master’s degree in museum studies and several years of experience in the art world. Using this as her entry point, she took a position as an intern at the Metropolitan Museum of Art. Mr. Komuro, who already had a Japanese degree in business law, received a scholarship to attend Fordham University’s law school. When he failed the bar exam on his first attempt, hardly surprising since English is his second language, he kept at it until he passed on his third try earlier this year. He is working at the law firm Lowenstein Sandler. The couple may well turn out to be Japan’s greatest export. Other present and future ex-royals might take note.

Historically Speaking: You Might Not Want to Win a Roman Lottery

Humans have long liked to draw lots as a way to win fortunes and settle fates

The Wall Street Journal

November 25, 2022

Someone in California won this month’s $2.04 billion Powerball lottery—the largest in U.S. history. The odds are staggering. The likelihood of death by plane crash (often estimated at 1 in 11 million for the average American) is greater than that of winning the Powerball or Mega Millions lottery (1 in roughly 300 million). Despite this, just under 50% of American adults bought a lottery ticket last year.

What drives people to risk their luck playing the lottery is more than just lousy math. Lotteries tap into a deep need among humans to find meaning in random events. Many ancient societies, from the Chinese to the Hebrews, practiced cleromancy, or the casting of lots to enable divine will to express itself. It is stated in the Bible’s Book of Proverbs: “The lot is cast into the lap, but its every decision is from the Lord.”

The ancient Greeks were among the first to use lotteries to ensure impartiality for non-religious purposes. The Athenians relied on a special device called a “kleroterion” for selecting jurors and public officials at random, to avoid unfair interference. The Romans had a more terrible use for drawing lots: A kind of collective military punishment known as “decimation” required a disgraced legion to select 1 out of every 10 soldiers at random and execute them. The last known use of the practice was in World War I by French and possibly Italian commanders.

The Romans also found less bloody uses for lotteries, including as a source of state revenue. Emperor Nero was likely the first ruler to use a raffle system as a means of filling the treasury without raising taxes.

ILLUSTRATION: THOMAS FUCHS

Following the fall of Rome, lotteries found other uses in the West—for example, as a means of allocating market stalls. But state lotteries only returned to Europe after 1441, when the city of Bruges successfully experimented with one as a means to finance its community projects. These fundraisers didn’t always work, however. A lack of faith in the English authorities severely dampened ticket sales for Queen Elizabeth I’s first (and last) National Lottery in 1567.

When they did work, the bonanzas could be significant: In the New World, lotteries helped to pay for the first years of the Jamestown Colony, as well as Harvard, Yale, Princeton, Columbia and many other institutions. And in France in 1729, the philosopher Voltaire got very rich by winning the national lottery, which was meant to sell bonds by making each bond a ticket for a jackpot drawing. He did it by unsavory means: Voltaire was part of a consortium of schemers who took advantage of a flaw in the lottery’s design by buying up enormous numbers of very cheap bonds.

Corruption scandals and failures eventually took their toll. Critics such as the French novelist Honoré de Balzac, who called lotteries the “opium of poverty,” denounced them for exploiting the poor. Starting in the late 1820s, a raft of anti-lottery laws were enacted on both sides of the Atlantic. Debates continued about them even where they remained legal. The Russian novelist Anton Chekhov highlighted their debilitating effects in his 1887 short story “The Lottery Ticket,” about a contented couple who are tormented and finally turned into raging malcontents by the mere possibility of winning.

New Hampshire was the first American state to roll back the ban on lotteries in 1964. Since then, state lotteries have proven to be neither the disaster nor the cure-all predicted. As for the five holdouts—Alabama, Alaska, Hawaii, Nevada and Utah still have no state lotteries—they are doing just fine.

Historically Speaking: Modern Dentistry’s Painful Past

Just be thankful that your teeth aren’t drilled with a flint or numbed with cocaine

The Wall Street Journal

November 3, 2022

Since the start of the pandemic, a number of studies have uncovered a surprising link: The presence of gum disease, the first sign often being bloody gums when brushing, can make a patient with Covid three times more likely to be admitted to the ICU with complications. Putting off that visit to the dental hygienist may not be such a good idea just now.

Many of us have unpleasant memories of visits to the dentist, but caring for our teeth has come a very long way over the millennia. What our ancestors endured is incredible. In 2006 an Italian-led team of evolutionary anthropologists working in Balochistan, in southwestern Pakistan, found several 9,000-year-old skeletons whose infected teeth had been drilled using a pointed flint tool. Attempts at re-creating the process found that the patients would have had to sit still for up to a minute. Poppies grow in the region, so there may have been some local expertise in using them for anesthesia.

Acupuncture may have been used to treat tooth pain in ancient China, but the chronology remains uncertain. In ancient Egypt, dentistry was considered a distinct medical skill, but the Egyptians still had some of the worst teeth in the ancient world, mainly from chewing on food adulterated with grit and sand. They were averse to dental surgery, relying instead on topical pain remedies such as amulets, mouthwashes and even pastes made from dead mice. Faster relief could be obtained in Rome, where tooth extraction—and dentures—were widely available.

ILLUSTRATION: THOMAS FUCHS

In Europe during the Middle Ages, dentistry fell under the purview of barbers. They could pull teeth but little else. The misery of mouth pain continued unabated until the early 18th century, when the French physician Pierre Fauchard became the first doctor to specialize in teeth. A rigorous scientist, Fauchard helped to found modern dentistry by recording his innovative methods and discoveries in a two-volume work, “The Surgeon Dentist.”

Fauchard elevated dentistry to a serious profession on both sides of the Atlantic. Before he became famous for his midnight ride, Paul Revere earned a respectable living making dentures. George Washington’s false teeth were actually a marvel of colonial-era technology, combining human teeth with elephant and walrus ivory to create a realistic look.

During the 1800s, the U.S. led the world in dental medicine, not only as an academic discipline but in the standardization of the practice, from the use of automatic drills to the dentist’s chair.

Perhaps the biggest breakthrough was the invention of local anesthetic in 1884. In July of that year, Sigmund Freud published a paper in Vienna on the potential uses of cocaine. Four months later in America, Richard Hall and William S. Halsted, whose pioneering work included the first radical mastectomy for breast cancer, decided to test cocaine’s numbing properties by injecting it into their dental nerves. Hall had an infected incisor filled without feeling a thing.

For patients, the experiment was a miracle. For Hall and Halsted, it was a disaster; both became cocaine addicts. Fortunately, dental surgery would be made safe by the invention of non-habit-forming Novocain 20 years later.

With orthodontics, veneers, implants and teeth whiteners, dentists can give anyone a beautiful smile nowadays. But the main thing is that oral care doesn’t have to hurt, and it could just save your life. So make that appointment.

Historically Speaking: The Fungus That Fed Gods And Felled a Pope

There’s no hiding the fact that mushrooms, though delicious, have a dark side

The Wall Street Journal

October 21, 2022

Fall means mushroom season. And, oh, what joy. The Romans called mushrooms the food of the gods; to the ancient Chinese, they contained the elixir of life; and for many people, anything with truffles is the next best thing to a taste of heaven.

Lovers of mushrooms are known as mycophiles, while haters are mycophobes. Both sets have good reasons for feeling so strongly. The medicinal properties of mushrooms have been recognized for thousands of years. The ancient Chinese herbal text “Shen Nong Ben Cao Jing,” written down sometime during the Eastern Han Dynasty (A.D. 25-220), was among the earliest medical treatises to highlight the immune-boosting powers of the reishi mushroom, also known as lingzhi.

The hallucinogenic powers of certain mushrooms were also widely known. Many societies, from the ancient Mayans to the Vikings, used psilocybin-containing fungi, popularly known as magic mushrooms, to achieve altered states either during religious rituals or in preparation for battle. One of the very few pre-Hispanic texts to survive Spanish destruction, the Codex Yuta Tnoho or Vindobonensis Mexicanus I, reveals the central role played by the mushroom in the cosmology of the Mixtecs.

ILLUSTRATION: THOMAS FUCHS

There is no hiding the fact that mushrooms have a dark side, however. Of the thousands of varieties that have been identified, fewer than 100 species are actually poisonous. But some are so deadly—the death cap (Amanita phalloides), for example—that recovery is uncertain even with swift treatment. Murder by mushroom is a staple of crime writing, although modern forensic science has made it impossible to disguise.

There is a strong possibility that this is how the Roman Emperor Claudius died on Oct. 13, 54 A.D. The alleged perpetrator, his fourth wife Agrippina the Younger, wanted to clear the path for her son Nero to sit on the imperial throne. Nero dutifully deified the late emperor, as was Roman custom. But according to the historian Dio Cassius, he revealed his true feelings by joking that mushrooms were surely a dish for gods, since Claudius, by means of a mushroom, had become a god.

Another suspected victim of the death cap mushroom was Pope Clement VII, who died in 1534 and is best known for opposing King Henry VIII’s attempt to get rid of Catherine of Aragon, the first of his six wives. Two centuries later, in what was almost certainly an accident, the Holy Roman Emperor Charles VI died in Vienna on Oct. 20, 1740, after attempting to treat a cold and fever with his favorite dish of stewed mushrooms.

Of course, mushrooms don’t need to be lethal to be dangerous. Ergot fungus, which looks like innocuous black seeds, can contaminate cereal grains, notably rye. Its baleful effects include twitching, convulsions, the sensation of burning, and terrifying hallucinations. The Renaissance painter Hieronymus Bosch may well have been suffering from ergotism, known as St. Anthony’s Fire in his day, when he painted his depictions of hell. Less clear is whether ergotism was behind the strange symptoms recorded among some of the townspeople during the Salem witch panic of 1692-93.

Unfortunately, the mushroom’s mixed reputation deterred scientific research into its many uses. But earlier this year a small study in the Journal of Psychopharmacology found evidence to support what many college students already believe: Magic mushrooms can be therapeutic. Medication containing psilocybin had an antidepressant effect over the course of a year. More studies are needed, but I know one thing for sure: Sautéed mushrooms and garlic are a recipe for happiness.

Historically Speaking: A Pocket-Sized Dilemma for Women

Unlike men’s clothes, female fashion has been indifferent for centuries to creating ways for women to stash things in their garments

The Wall Street Journal

September 29, 2022

The current round of Fashion Weeks started in New York on Sept. 9 and will end in Paris on Oct. 4, with London and Milan slotted in between. Amid the usual impractical and unwearable outfits on stage, some designers went their own way and featured—gasp—women’s wear with large pockets.

The anti-pocket prejudice in women’s clothing runs deep. In 1954, the French designer Christian Dior stated: “Men have pockets to keep things in, women for decoration.” Designers seem to think that their idea of how a woman should look outweighs what she needs from her clothes. That mentality probably explains why a 2018 survey found that 100% of the pockets in men’s jeans were large enough to fit a midsize cellphone, but only 40% of women’s jeans pockets measured up.

The pocket is an ancient idea, initially designed as a pouch that was tied or sewn to a belt beneath a layer of clothing. Ötzi, the 5,300-year-old ice mummy I wrote about recently for having the world’s oldest known tattoos, also wore an early version of a pocket; it contained his fire-starting tools.

ILLUSTRATION: THOMAS FUCHS

The ancient concept was so practical that the same basic design was still in use during the medieval era. Attempts to find other storage solutions usually came up short. In the 16th century a man’s codpiece sometimes served as an alternative holdall, despite the awkwardness of having to fish around your crotch to find things. Its fall from favor at the end of the 1600s coincided with the first in-seam pockets for men.

It was at this stage that the pocket divided into “his” and “hers” styles. Women retained the tie-on version; the fashion for wide dresses allowed plenty of room to hang a pouch underneath the layers of petticoats. But it was also impractical since reaching a pocket required lifting the layers up.

Moralists looked askance at women’s pockets, which seemed to defy male oversight and could potentially be a hiding place for love letters, money and makeup. On the other hand, in the 17th century a maidservant was able to thwart the unwelcome advances of the diarist Samuel Pepys by grabbing a pin from her pocket and threatening to stab him with it, according to his own account.

Matters looked up for women in the 18th century with the inclusion of side slits on dresses that allowed them direct access to their pockets. Newspapers began to carry advertisements for articles made especially to be carried in them. Sewing kits and snuff boxes were popular items, as were miniature “conversation cards” containing witty remarks “to create mirth in mixed companies.”

Increasingly, though, the essential difference between men’s and women’s pockets—his being accessible and almost anywhere, hers being hidden and nestled near her groin—gave them symbolism. Among the macabre acts committed by the Victorian serial killer Jack the Ripper was the disemboweling of his victims’ pockets, which he left splayed open next to their bodies.

Women had been agitating for more practical dress styles since the formation of the Rational Dress Society in Britain in 1881, but it took the upheavals caused by World War I for real change to happen. Women’s pantsuits started to appear in the 1920s. First lady Eleanor Roosevelt caused a sensation by appearing in one in 1933. The real revolution began in 1934, with the introduction of Levi’s bluejeans for women, 61 years after the originals for men. The women’s front pocket was born. And one day, with luck, it will grow up to be the same size as men’s.

Harper’s Bazaar: Behind her eyes: celebrating the Queen as a cultural icon

Our steadfast hope

Harper’s Bazaar

June 2022

If you’ve ever had a dream involving the Queen, you are not alone. After her Silver Jubilee in 1977, it was estimated that more than a third of Britons had dreamt about her at least once, with even ardent republicans confessing to receiving royal visits in their slumbers. For the past 70 years, the Queen has been more than just a presence in our lives, subconscious or otherwise; she has been a source of fascination, inspiration and national pride.

Queen Elizabeth II in 2002

When Princess Elizabeth became Queen in 1952, the country was still struggling to emerge from the shadow of World War II. Her youth offered a break with the past. Time magazine in the United States named her its ‘Woman of the Year’, not because of anything she had achieved but because of the hope she represented for Britain’s future. A barrister and political hopeful named Margaret Thatcher wrote in the Sunday Graphic that having a queen ought to remove “the last shreds of prejudice against women aspiring to the highest places”. After all, Elizabeth II was a wife and mother of two small children, and yet no one was suggesting that family life made her unfit to rule.

Thatcher’s optimism belied the Queen’s dilemma over how to craft her identity as a modern monarch in a traditional role. At the beginning, tradition seemed to have the upper hand: a bagpiper played beneath her window every morning (a holdover from Queen Victoria). The Queen knew she didn’t want to be defined by the past. “Some people have expressed the hope that my reign may mark a new Elizabethan age,” she stated in 1953. “Frankly, I do not myself feel at all like my great Tudor forebear.”

Nevertheless, the historical parallels between the two queens are instructive. Elizabeth I created a public persona, yet made it authentic. Fakery was impossible, since “we princes,” she observed, “are set on stages in the sight and view of all the world.” Although Elizabeth I was a consummate performer, her actions were grounded in sincere belief. She began her reign by turning her coronation into a great public event. Observers were shocked by her willingness to interact with the crowds, but the celebrations laid the foundation for a new relationship between the queen and her subjects.

The introduction of television cameras for Elizabeth II’s coronation performed a similar function. In the 1860s, the journalist Walter Bagehot observed that society itself is a kind of “theatrical show” where “the climax of the play is the Queen”. The 1953 broadcast enabled 27 million Britons and 55 million Americans to participate in the ‘show’ from the comfort of their homes. It was a new kind of intimacy that demanded more from Elizabeth II than any previous monarch.

Images and quotes from the Queen’s coronavirus address in April 2020 displayed across London

The Queen had resisted being filmed, but having been convinced by Prince Philip of its necessity, she worked to master the medium. She practised reading off a teleprompter so that her 1957 Christmas speech, the first to be telecast, would appear warm and natural. Harking back to Elizabeth I, she admitted: “I cannot lead you into battle, I do not give you laws or administer justice, but I can do something else, I can give you my heart and my devotion.” She vowed to fight for “fundamental principles” while not being “afraid of the future”.

In practice, embracing the future could be as groundbreaking as instituting the royal “walkabout”, or as subtle as adjusting her hemline to rest at the knee. Indeed, establishing her own sense of fashion was one of the first successes of Elizabeth II’s reign. Its essence was pure glamour, but the designs were performing a double duty: nothing could be too patterned, too hot, too shiny or too sheer, or else it wouldn’t photograph well. Her wardrobe carried the subversive message that dresses should be made to work for the wearer, not the other way round. In an era when female celebrity was becoming increasingly tied to “sexiness”, the Queen offered a different kind of confident femininity. Never afraid to wear bright blocks of colour, she has encouraged generations of women to think beyond merely blending in.

The opportunity to demonstrate her “fundamental principles” on the international stage came in 1961, during a Cold War crisis involving Ghana. The Queen was due to go on a state visit, until growing violence there led to calls for it to be cancelled. She not only insisted on keeping the engagement, but during the wildly popular trip, she also made a point of dancing with President Kwame Nkrumah at a state ball. Her adept handling of the situation helped to prevent Ghana from switching allegiance to the Soviet Union. Just as important, though, was the coverage given to her rejection of contemporary racism. As Harold Macmillan noted: “She loves her duty and means to be Queen and not a puppet.” This determination has seen her through 14 prime ministers, 14 US presidents, seven popes and 265 official overseas visits.

At the beginning of the Covid epidemic in 2020, with the nation in shock at the sudden cessation of ordinary life, Elizabeth II spoke directly to the country, sharing a wartime memory to remind people of what can be endured. “We will succeed,” she promised, and in that desperate moment, she united us all in hope. The uniqueness of the Queen lies in her ability to weather change with grace and equanimity – as the poet Philip Larkin once wrote: “In times when nothing stood/but worsened, or grew strange/there was one constant good:/she did not change.” That steadfast continuity, so rare in a world of permanent flux, is an endless source of inspiration for artists and writers, designers and composers, all of us.

The Mail on Sunday: No miniskirts. No railing about being a working mother.

Leading historian AMANDA FOREMAN explains why the Queen was a true feminist icon who changed the world for millions of women – in very surprising ways.

The Mail on Sunday

September 17, 2022

Ask someone for the name of a famous feminist and no doubt you’ll get one of a few prominent women batted back to you. Germaine Greer. Gloria Steinem. Hillary Clinton. But Elizabeth Windsor? That would be a no. She looked the opposite of today’s powerful women with her knee-length tweeds and distinctly unfashionable court shoes.

I, though, argue differently. As a historian with a particular interest in female power, I believe one thing above all puts the Queen in a special category of achievement. Not the length of her reign. Not even her link to the courageous wartime generation. No, it is her global impact on the cause of gender equality that should be remembered, all without donning a miniskirt or wailing MeToo. All without spilling emotions, making herself a victim or hiding the effects of age and motherhood.

I believe the Queen is the ultimate feminist icon of the 20th Century, more of a symbol of women’s progress than other icons like Madonna or Beyoncé could ever dream of being. Women everywhere, particularly those past menopause, have much to thank her for.

But when it has been previously suggested the Queen was a feminist, or that women should celebrate her life, critics have bitten back sharply.

In 2019 Olivia Colman, who portrayed the Queen in the Netflix drama The Crown, provoked equal cheers and jeers for describing her as ‘the ultimate feminist’. A few years before, Woman’s Hour chief presenter Emma Barnett had her intellectual credentials questioned for calling the Queen a ‘feminist icon’.

They justified the view for different reasons. For Colman, it was because the Queen had shown a wife could assume a man’s role while retaining her femininity. The argument went in reverse for Barnett: the Queen had shown her gender was ‘irrelevant to her capacity to do her job’.

Yet no King would ever have his masculinity and the definition of manhood so conflated in the same way. It’s doubtful anyone will question whether King Charles defines the essence of what it is to be a man.

In the midst of all the grief for the Queen, we should remember that at the beginning of her reign Elizabeth’s potential power to effect change provoked as much unease as it did anticipation. In a patriarchal world, female empowerment is a force to fear. After all, we never talk about ‘male empowerment’, do we?

Our two other long-lived queens, Elizabeth I and Victoria, faced the same scrutiny. Foreign affairs, great questions of state, probity in government: what did any of that matter compared with the burning issue of what it meant to have a woman placed above the heads of men?

It was not easy for Elizabeth II to escape from under the shadow of Queen Victoria, the figurative mother of the nation.

Initially, it wasn’t even clear she wanted to. Though the command for brides to obey their husbands had not been part of the Book of Common Prayer since 1928, Elizabeth included it in her wedding vows.

Aged 25, she was a mother of two when she made her accession speech before the Privy Council. Accompanied by her husband, Elizabeth looked even younger than her years, surrounded by a roomful of mostly old men. But after the Privy Council meeting, the comparisons with Victoria stopped, and you can begin to see her innate feminism come to the fore. Elizabeth did not lose her self-confidence between pregnancies, neglect the red boxes or deputise Philip to meet her Ministers. Far from it. She took on the role of sovereign, and Philip accepted his as the world’s most famous house-husband.

In reality, there were few actions or speeches of the Queen’s that could be classed as declaratively feminist – such as the time she drove Crown Prince Abdullah of Saudi Arabia around Balmoral in her Land Rover when Saudi women were forbidden to drive, going at such breakneck speed while chatting that the Prince begged her to slow down.

Or her few comments about the work of the WI, or the potential to be tapped if only society can ‘find ways to allow girls and women to play their full part’.

No, instead of examples like these, the Queen was a feminist for reasons most women can instantly relate to: first, she established clear boundaries between the demands of her job and those of her family.

Society still expects wives will drop everything for the family, no matter how consuming their careers, so husbands can go to work. Not once did the Queen say or imply she ought to shift her weekly audience with the Prime Minister, or cancel the ribbon-cutting of a hospital because of some domestic concern.

Second, society judges working mothers much more harshly than working fathers, giving the latter a free pass if their jobs are important enough but condemning the former as terrible people if their children don’t turn out to be outstanding successes. The Queen’s fitness as sovereign has never been tied to her fitness as a mother. Although she always made her family a part of her life, Elizabeth did not allow it to define her as Victoria did.

Third, society makes middle-aged women feel that they are invisible. Their opinions stop mattering, their contributions don’t count and their bodies, according to fashion designers, don’t exist. Whispers that the Queen ought to abdicate began in her 50s. By 1977, her Silver Jubilee, critics wondered what she was good for now that her youth and figure were in the rear-view mirror.

In answer, she embodied the reverse of Invisible Woman Syndrome. By refusing to countenance abdication, she showed what a working woman looks like past menopause. Rather than shrinking, she revved up a gear and demonstrated a woman’s age has no bearing on her agency and authority.

Her fabulous colour sense and ability to match dresses to the mood excited intense interest – but this alone didn’t make her a feminist icon. In an age when a woman’s sexiness is her currency, and empowerment is judged by how much of her body she exposes, she refused to make any concessions to fashion.

This was a confident femininity, an inner feminism based on absolute assuredness of who she was and why she mattered. For over five decades, the Queen showed what strength and purpose look like on the body of an older woman.

The next three generations of monarchs are due to be Kings. To some extent, the old way of doing things will return. So, it is up to us to honour Queen Elizabeth’s memory by following her example.

She tore up the rule book on gender roles without society falling apart or families breaking down. Despite heavy restrictions on what she could do as a woman, let alone a Queen, she forged her own path – and invited the rest of us to follow.