Historically Speaking: How Roses Came to Mean True Love

Our favorite Valentine’s Day flower was already a symbol of passion in ancient Greek mythology

The Wall Street Journal

February 13, 2021

“My luve is like a red red rose,/That’s newly sprung in June,” wrote the Scottish poet Robert Burns in 1794, creating an inexhaustible revenue stream for florists everywhere, especially around Valentine’s Day. But why a red rose, you might well ask.

According to Greek myth, the blood of Aphrodite turned roses red.
PHOTO: GETTY IMAGES

Longevity is one reason. The rose is an ancient and well-traveled flower: A 55 million-year-old rose fossil found in Colorado suggests that roses were already blooming when our earliest primate ancestors began populating the earth. If you want to see where it all began, at least in the New World, then a trip to the Florissant Fossil Beds National Monument, roughly two hours’ drive from Denver, should be on your list of things to do once the pandemic is over.

In Greek mythology the rose was associated with Aphrodite, goddess of love, who was said to have emerged from the sea in a shower of foam that transformed into white roses. Her son Cupid bribed Harpocrates, the god of silence, with a single rose in return for not revealing his mother’s love affairs, giving rise to the Latin phrase sub rosa, “under the rose,” as a term for secrecy. As for the red rose, it was said to be born of tragedy: Aphrodite became tangled in a rose bush when she ran to comfort her lover Adonis as he lay dying from a wild boar attack. Scratched and torn by its thorns, her feet bled onto the roses and turned them crimson.

For the ancient Romans, the rose’s symbolic connection to love and death made it useful for celebrations and funerals alike. A Roman banquet without a suffocating cascade of petals was no banquet at all, and roses were regularly woven into garlands or crushed for their perfume. The first time Mark Antony saw Cleopatra he had to wade through a carpet of rose petals to reach her, by which point he had completely lost his head.

Rose cultivation in Asia became increasingly sophisticated during the Middle Ages, but in Europe the early church looked askance at the flower, regarding it as yet another example of pagan decadence. Fortunately, the Frankish emperor Charlemagne, an avid horticulturalist, refused to be cowed by old pieties, and in 794 he decreed that all royal gardens should contain roses and lilies.

The imperial seal of approval hastened the rose’s acceptance into the ecclesiastical fold. The Virgin Mary was likened to a thornless white rose because she was free of original sin. In fact, a climbing rose planted in her honor in 815 by the monks of Germany’s Hildesheim Cathedral is the oldest surviving rose bush today. Red roses, by contrast, symbolized the Crucifixion and Christian martyrs like St. Valentine, a priest killed by the Romans in the 3rd century, whose feast day is celebrated on Feb. 14. In the 14th century, his emergence as the patron saint of romantic love tipped the scales in favor of the red over the white rose.

The symbolism attached to the rose has long made it irresistible to poets. Shakespeare’s audience would have known that when Juliet compares Romeo to the flower—“that which we call a rose,/By any other name would smell as sweet”—it meant tragedy awaited the lovers. Yet they would have felt comforted, too, since each red rose bears witness, as Burns wrote, to the promise of love unbound and eternal: “Till a’ the seas gang dry, my dear,/And the rocks melt wi’ the sun.”

Historically Speaking: The Original Victims of Cancel Culture

Roman emperors and modern dictators have feared the social and spiritual penalties of excommunication.

The Wall Street Journal

January 28, 2021

Nowadays, all it takes for a person to be condemned to internal exile is a Twitter stampede of outrage. The lack of any regulating authority or established criteria for what constitutes repentance gives “cancel culture,” as it is popularly known, a particularly modern edge over more old-fashioned expressions of public shaming such as tar-and-feathering, boycotts and blacklists.

Portrait of Martin Luther by Lucas Cranach the Elder.
PHOTO: CORBIS/VCG/GETTY IMAGES

But the practice of turning nonconforming individuals into non-persons has been used with great effectiveness for centuries, never more so than in the Roman Catholic Church’s punishment of excommunication. The penalties included social ostracism, refusal of communion and Christian burial, and eternal damnation of one’s soul.

The fear inspired by excommunication was great enough to make even kings fall in line. In 390, soldiers under the command of the Roman emperor Theodosius I massacred thousands in the Greek city of Thessalonica. In response, Bishop Ambrose of Milan excommunicated Theodosius, forcing him to don sackcloth and ashes as public penance. Ambrose’s victory established the Church’s authority over secular rulers.

Later church leaders relied on the threat of excommunication to maintain their power, but the method could backfire. In 1054, Pope Leo IX of Rome excommunicated Patriarch Michael Cerularius of Constantinople, the head of the eastern Church, who retaliated by excommunicating Leo and the western Church. Since this Great Schism, the two churches, Roman Catholic and Eastern Orthodox, have never reunited.

During the Middle Ages, the penalty of excommunication broadened to include the cancellation of all legal protections, including the right to collect debts. Neither kings nor cities were safe. After being excommunicated by Pope Gregory VII in 1076, Holy Roman Emperor Henry IV stood barefoot in the snow for three days before the pontiff grudgingly welcomed him inside to hear his repentance. The entire city of Venice was excommunicated over half a dozen times, and on each occasion the frightened Venetians capitulated to papal authority.

But the excommunication of Martin Luther, the founder of Protestantism, by Pope Leo X in January 1521, 500 years ago this month, didn’t work out as planned. Summoned to explain himself at the Diet of Worms, a meeting presided over by the Holy Roman Emperor Charles V, Luther refused to recant and ask forgiveness, allegedly saying: “Here I stand, I can do no other.” In response, the Emperor declared him a heretic and outlaw, putting his life in danger. Luther was only saved from assassination by his patron, Frederick, Elector of Saxony, who hid him in a castle. Luther used the time to begin translating the Bible into German.

Napoleon Bonaparte was equally unconcerned about the spiritual consequences when he was excommunicated by Pope Pius VII in 1809. Nevertheless, he was sufficiently frightened of public opinion to kidnap the pontiff and keep him out of sight for almost five years. In 1938, angry over Nazi Germany’s takeover of Austria, the Italian dictator Benito Mussolini tried to persuade Pope Pius XI to excommunicate the German dictator Adolf Hitler, a nonpracticing Catholic. Who knows what would have happened if he had been successful.

Historically Speaking: Two Centuries of Exploring Antarctica

Charting the southern continent took generations of heroic sacrifice and international cooperation.

The Wall Street Journal

January 14, 2021

There is a place on Earth that remains untouched by war, slavery or riots. Its inhabitants coexist in peace, and all nationalities are welcomed. No, it’s not Neverland or Shangri-La—it’s Antarctica, home to the South Pole, roughly 20 million penguins and a transient population of about 4,000 scientists and support staff.

Antarctica’s existence was only confirmed 200 years ago. Following some initial sightings by British and Russian explorers in January 1821, Captain John Davis, a British-born American sealer and explorer, landed on the Antarctic Peninsula on Feb. 7, 1821. Davis was struck by its immense size, writing in his logbook, “I think this Southern Land to be a Continent.” It is, in fact, the fifth-largest of Earth’s seven continents.

Herbert Ponting is attacked by a penguin during the 1911 Scott expedition in Antarctica.
PHOTO: HERBERT PONTING/SCOTT POLAR RESEARCH INSTITUTE, UNIVERSITY OF CAMBRIDGE/GETTY IMAGES

People had long speculated that there had to be something down at the bottom of the globe—in cartographers’ terms, a Terra Australis Incognita (“unknown southern land”). The ancient Greeks referred to the putative landmass as “Ant-Arktos,” because it was on the opposite side of the globe from the constellation of Arktos, the Bear, which appears in the north. But the closest anyone came to penetrating the freezing wastes of the Antarctic Circle was Captain James Cook, the British explorer, who looked for a southern continent from 1772 to 1775. He got within 80 miles of the coast, but the harshness of the region convinced Cook that “no man will ever venture further than I have done.”

Davis proved him wrong half a century later, but explorers were unable to make further progress until the heroic age of Antarctic exploration in the early 20th century. In 1911, the British explorer Robert F. Scott led a research expedition to the South Pole, only to be beaten by the Norwegian Roald Amundsen, who misled his backers about his true intentions and jettisoned scientific research for the sake of getting there quickly.

Extraordinarily bad luck led to the deaths of Scott and his teammates on their return journey. In 1915, Ernest Shackleton led a British expedition that aimed to make the first crossing of Antarctica by land, but his ship Endurance was trapped in the polar ice. The crew’s 18-month odyssey to return to civilization became the stuff of legend.

Soon exploration gave way to international competition over Antarctica’s natural resources. Great Britain marked almost two-thirds of the continent’s landmass as part of the British Empire, but a half dozen other countries also staked claims. In 1947 the U.S. joined the fray with Operation High Jump, a U.S. Navy-led mission to establish a research base that involved 13 ships and 23 aircraft.

Antarctica’s freedom and neutrality were in question during the Cold War. But in 1957, a group of geophysicists managed to launch a year-long Antarctic research project involving 12 countries. It was such a success that two years later the countries, including the U.S., the U.K. and the USSR, signed the Antarctic Treaty, guaranteeing the continent’s protection from militarization and exploitation. This goodwill toward men took a further 20 years to extend to women, but in 1979 American engineer Irene C. Peden became the first woman to work at the South Pole for an entire winter.

Historically Speaking: Awed by the Meteor Shower of the New Year’s Sky

Human beings have always marveled at displays like this weekend’s Quadrantids, but now we can understand them as well.

The Wall Street Journal

January 1, 2021

If you wish upon a star this week, you probably won’t get your heart’s desire. But if you’re lucky, you’ll be treated to an outstanding display of the Quadrantids, the annual New Year’s meteor shower that rivals the Perseids in intensity and quality of fireballs. The Quadrantids are exceptionally brief, however: The peak lasts only a few hours on January 2, and a cloudy sky or full moon can ruin the entire show.

A long-exposure photograph of the Draconid meteor shower in October 2018.
PHOTO: SMITYUK YURI/TASS/ZUMA PRESS

Meteor showers happen when the Earth encounters dust and rock sloughed off by a comet as it orbits the sun. The streaks of light we see are produced by this debris burning up in the Earth’s atmosphere.

Human beings have been aware of the phenomenon since ancient times. Some Christian archaeologists have theorized that the biblical story of Sodom and Gomorrah was inspired by a massive meteor strike near the Dead Sea some 3,700 years ago, which wiped out the Bronze Age city of Tall el-Hammam in modern Jordan.

Aristotle believed that comets and meteors weren’t heavenly bodies but “exhalations” from the Earth that ignited in the sky. As a result, Western astronomers took little interest in them until the rise of modern science. By contrast, the Chinese began recording meteor events as early as 687 B.C. The Mayans were also fascinated by meteor showers: Studies of hieroglyphic records suggest that important occasions, such as royal coronations, were timed to coincide with the Eta Aquarid shower in the spring.

Even before telescopes were invented, it wasn’t hard to observe comets, meteors and meteor showers. The 11th-century Bayeux Tapestry contains a depiction of Halley’s comet, which appeared in 1066. But people couldn’t see meteors for what they really were. Medieval Christians referred to the annual Perseid shower as “the tears of St. Lawrence,” believing that the burning tears of the martyred saint lit up the sky on his feast day, August 10.

Things began to change in the 19th century, as astronomers noticed that some meteor showers recurred on a fixed cycle. In November 1799, the Leonid shower was recorded by Andrew Ellicott, an American surveyor on a mission to establish the boundary between the U.S. and the Spanish territory of Florida. Ellicott was on board a ship in the Florida Keys when he observed the Leonids, writing in his journal that “the whole heavens appeared as if illuminated with skyrockets, flying in an infinity of directions, and I was in constant expectation of some of them falling on the vessel.” When a similar spectacle lit up the skies in the eastern U.S. in 1833, astronomers realized that it was a recurrence of the same phenomenon and that the meteor storm must be linked to the orbit of a particular comet.

The origin of the Quadrantids was harder to locate. Astronomers kept looking for its parent comet until 2003, when NASA scientist Peter Jenniskens realized that they were on the wrong track: The shower is actually caused by a giant asteroid, designated 2003 EH1, which broke off from a comet 500 years ago. It is somehow fitting that a mystery of the New Year’s night sky yielded to the power of an open mind.

Historically Speaking: The Martini’s Contribution to Civilization

The cocktail was invented in the U.S., but it soon became a worldwide symbol of sophistication.

Wall Street Journal

December 18, 2020

In 1887, the Chicago Tribune hailed the martini as the quintessential Christmas drink, reminding readers that it is “made of Vermouth, Booth’s Gin, and Angostura Bitters.” That remains the classic recipe, even though no one can say for certain who created it.

The journalist H.L. Mencken famously declared that the martini was “the only American invention as perfect as the sonnet,” and there are plenty of claimants to the title of inventor. The city of Martinez, Calif., insists the martini was first made there in 1849, for a miner who wanted to celebrate a gold strike with something “special.” Another origin story gives the credit to Jerry Thomas, the bartender of the Occidental Hotel in San Francisco, in 1867.

Actor Pierce Brosnan as James Bond, with his signature martini.
PHOTO: MGM/EVERETT COLLECTION

Of course, just as calculus was discovered simultaneously by Isaac Newton and Gottfried Leibniz, the martini may have sprung from multiple cocktail shakers. What soon made it stand out from all other gin cocktails was its association with high society. The hero of “Burning Daylight,” Jack London’s 1910 novel about a gold-miner turned entrepreneur, drinks martinis to prove to himself and others that he has “arrived.” Ernest Hemingway paid tribute to the drink in his 1929 novel “A Farewell To Arms” with the immortal line, “I had never tasted anything so cool and clean. They made me feel civilized.”

Prohibition was a golden age for the martini. Its adaptability was a boon: Even the coarsest bathtub gin could be made palatable with the addition of vermouth and olive brine (a dirty martini), a pickled onion (Gibson), lemon (twist), lime cordial (gimlet) or extra vermouth (wet). President Franklin D. Roosevelt was so attached to the cocktail that he tried a little martini diplomacy on Stalin during the Tehran conference of 1943. Stalin could just about stand the taste but informed Roosevelt that the cold on the way down wasn’t to his liking at all.

The American love affair with the martini continued in Hollywood films like “All About Eve,” starring Bette Davis, which portrayed it as the epitome of glamour and sophistication. But change was coming. In Ian Fleming’s 1954 novel “Live and Let Die,” James Bond ordered a martini made with vodka instead of gin. Worse, two years later in “Diamonds Are Forever,” Fleming described the drink as being “shaken and not stirred,” even though shaking weakens it. Then again, according to an analysis of Bond’s alcohol consumption published in the British Medical Journal in 2013, 007 sometimes downed the equivalent of 14 martinis in a 24-hour period, so his whole body would have been shaking.

American businessmen weren’t all that far behind. The three-martini lunch was a national pastime until business lunches ceased to be fully tax-deductible in the 1980s. Banished from meetings, the martini went back to its roots as a mixologists’ dream, reinventing itself as a ’tini for all seasons.

The 1990s brought new varieties that even James Bond might have thought twice about, like the chocolate martini, made with creme de cacao, and the appletini, made with apple liqueur, cider or juice. Whatever your favorite, this holiday season let’s toast to feeling civilized.

Leaders Who Bowed Out Gracefully

Kings and politicians have used their last moments on the world stage to deliver words of inspiration.

The Wall Street Journal

November 5, 2020

The concession speech is one of the great accomplishments of modern democracy. The election is over and passions are running high, but the loser graciously concedes defeat, calls for national unity and reminds supporters that tomorrow is another day. It may be pure political theater, but it’s pageantry with a purpose.

For most of history, defeated rulers didn’t give concession speeches; they were too busy begging for their lives, since a king who lost his throne was usually killed shortly after. The Iliad recounts six separate occasions where a defeated warrior asks his opponent for mercy, only to be hacked to death anyway. The Romans had no interest whatsoever in listening to defeated enemies—except once, in the 1st century, when the British chieftain Caractacus was brought in chains before the Senate.

Republican presidential candidate John McCain delivers his concession speech on Nov. 4, 2008, after losing the election to Barack Obama.
PHOTO: ROBYN BECK/AGENCE FRANCE-PRESSE/GETTY IMAGES

On a whim, the Emperor Claudius told Caractacus to give one reason why his life should be spared. According to the historian Cassius Dio, the defeated Briton gave an impassioned speech about the glory of Rome, and how much greater it would be if he was spared: “If you save my life, I shall be an everlasting memorial of your clemency.” Impressed, the Senate set him free.

King Charles I had no hope of clemency on Jan. 30, 1649, when he faced execution after the English Civil War. But this made his speech all the more powerful, because Charles was speaking to posterity more than to his replacement, Oliver Cromwell. His final words have been a template for concession speeches ever since: After defending his record and reputation, Charles urged Cromwell to rule for the good of the country, “to endeavor to the last gasp the peace of the kingdom.”

In modern times, appeals to the nation became an important part of royal farewell speeches. When Napoleon Bonaparte abdicated as emperor of France in 1814, he stood in the courtyard of the palace of Fontainebleau and bade an emotional goodbye to the remnants of his Old Guard. He said that he was leaving to prevent further bloodshed, and ended with the exhortation: “I go, but you, my friends, will continue to serve France.”

Emperor Hirohito delivered a similar message in his radio broadcast on Aug. 14, 1945, announcing Japan’s surrender in World War II. The Emperor stressed that by choosing peace over annihilation he was serving the ultimate interests of the nation. He expected his subjects to do the same, to “enhance the innate glory of the Imperial State.” The shock of the Emperor’s words was compounded by the fact that no one outside the court and cabinet had ever heard his voice before.

In the U.S., the quality of presidential concession speeches rose markedly after they began to be televised in 1952. Over the years, Republican candidates, in particular, have elevated the art of losing to almost Churchillian heights. John McCain’s words on election night 2008, when he lost to Barack Obama, remain unmatched: “Americans never quit. We never surrender. We never hide from history. We make history.”

Historically Speaking: Tales That Go Bump in the Night

From Homer to Edgar Allan Poe, ghost stories have given us a chilling good time

The Wall Street Journal

October 23, 2020

As the novelist Neil Gaiman, a master of the macabre, once said, “Fear is a wonderful thing, in small doses.” In this respect, we’re no different than our ancestors: They, too, loved to tell ghost stories.

One of the earliest ghosts in literature appears in Homer’s Odyssey. Odysseus entertains King Alcinous of Phaeacia with an account of his trip to the Underworld, where he met the spirits of Greek heroes killed in the Trojan War. The dead Achilles complains that being a ghost is no fun: “I should choose, so I might live on earth, to serve as the hireling of another…rather than to be lord over all the dead.”

ILLUSTRATION: THOMAS FUCHS

It was a common belief in both Eastern and Western societies that ghosts could sometimes return to right a great wrong, such as an improper burial. The idea that ghosts are intrinsically evil—the core of any good ghost story—received a boost from Plato, who believed that only wicked souls hang on after death; the good know when it’s time to let go.

Ghosts were problematic for early Christianity, which taught that sinners went straight to Hell; they weren’t supposed to be slumming it on Earth. The ghost story was dangerously close to heresy until the Church adopted the belief in Purgatory, a realm where the souls of minor sinners waited to be cleansed. The Byland Abbey tales, a collection of ghost stories recorded by an anonymous 15th-century English monk, suggest that the medieval Church regarded the supernatural as a useful form of advertising: Not paying the priest to say a mass for the dead could lead to a nasty case of haunting.

The ghost story reached its apogee in the early modern era with Shakespeare’s “Hamlet,” which opens with a terrified guard seeing the ghost of the late king on the battlements of Elsinore Castle. But the rise of scientific skepticism made the genre seem old-fashioned and unsophisticated. Ghosts were notably absent from English literature until Horace Walpole, son of Britain’s first prime minister, published the supernatural mystery novel “The Castle of Otranto” in 1764, as a protest against the deadening effect of “reason” on art.

Washington Irving was the first American writer to take the ghost story seriously, creating the Headless Horseman in his 1820 tale “The Legend of Sleepy Hollow.” He was a lightweight, however, compared with Edgar Allan Poe, who turned horror into an art form. His famous 1839 story “The Fall of the House of Usher” heightens the tension with ambiguity: For most of the story, it isn’t clear whether Roderick Usher’s house really is haunted, or if he is merely “enchained by certain superstitious impressions.”

Henry James used a similar technique when, unhappy with the tepid reception of his novels in the U.S., he decided to frighten Americans into liking him. The result was the 1898 psychological horror story “The Turn of the Screw,” about a governess who may or may not be seeing ghosts. The reviews expressed horror at the horror, with one critic describing it as “the most hopelessly evil story that we could have read in any literature.” With such universal condemnation, success was assured.

Historically Speaking: The Business and Pleasure of Dining Out

The food service industry will eventually overcome the pandemic, just as it bounced back from ancient Roman bans and Prohibition.

The Wall Street Journal

September 24, 2020

It remains anyone’s guess what America’s once-vibrant restaurant scene will look like in 2021. At the beginning of this year, there were 209 Michelin-starred restaurants in the U.S. This month, the editors of the Michelin Guide announced that just 29 had managed to reopen after the pandemic lockdown.

ILLUSTRATION: THOMAS FUCHS

The food-service industry has always had to struggle. In the Roman Empire, the typical eatery was the thermopolium, a commercial kitchen that sold mulled wine and a prepared meal—either to-go or, in larger establishments, to eat at the counter. They were extremely popular among the working poor—archaeologists have found over 150 in Pompeii alone—and therefore regarded with suspicion by the authorities. In 41 A.D., Emperor Claudius ordered a ban on thermopolia, but the setback was temporary at best.

In Europe during the Middle Ages, the “cook shop” served a similar function for the poor. For the wealthier sort, however, it was surprisingly difficult to find places to eat out. Only a few monasteries and taverns provided hospitality. In Geoffrey Chaucer’s 14th-century comic travelogue “The Canterbury Tales,” the pilgrims have to bring their own cook, Roger of Ware, who is said to be an expert at roasting, boiling, broiling and frying.

To experience restaurant-style dining with separate tables, waiters and a menu, one had to follow in the footsteps of the Venetian merchant Marco Polo to the Far East. The earliest prototype of the modern restaurant developed in Kaifeng, the capital of the Northern Song Dynasty (960-1127), to accommodate its vast transient population of merchants and officials. The accumulation of so many rich and homesick men led to a boom in sophisticated eateries offering meals cooked to order.

Europe had nothing similar until the French began to experiment with different forms of public catering in the 1760s. These new places advertised themselves as a healthy alternative to the tavern, offering restorative soups and broths—hence their name, the restaurant.

In 1782, this rather humble start inspired Antoine Beauvilliers to open the first modern restaurant, La Grande Taverne de Londres, which unashamedly replicated the luxury of royal dining. By the 1800s, the term “restaurant” in any language meant a superior establishment serving refined French cuisine.

In 1830, two Swiss brothers, John and Peter Delmonico, opened the first restaurant in the U.S., Delmonico’s in New York. It was a temple of haute cuisine, with uniformed waiters, imported linens and produce grown on a dedicated farm. What’s more, diners could make reservations ahead of time and order either a la carte or prix fixe—all novel concepts in 19th century America.

Delmonico’s reign lasted until Prohibition, which forced thousands of U.S. restaurants out of business, unable to survive without alcohol sales. During that time, the only growth in the restaurant trade was in illegal speakeasies and family-friendly diners. Yet in 1934, just one year after Prohibition’s repeal, the art deco-themed Rainbow Room opened its doors at Rockefeller Plaza in New York. Out of the ashes of the old, a new age of elegance had begun.

Stepping out of the Shadows

Sylvia Pankhurst by Rachel Holmes, review — finally having her moment.

Her mother and sister were once better known, but this fine biography shows just how remarkable the women’s rights activist was.

The Times

September 22, 2020

After decades of obscurity, Sylvia Pankhurst is finally having her moment. This is the third biography in seven years — not bad for a woman who spent much of her life being unfavourably compared with her more popular mother and sister.

The neglect is partly because Sylvia’s rich, complex life resists easy pigeonholing. Although she played an instrumental role in the suffrage movement, she was first and foremost a defender of the poor, the oppressed and the marginalised. Her political choices were often noble, but they were lonely ones.

Sylvia inherited her appetite for social activism and boundless energy for work from her parents, Richard and Emmeline. A perpetually aspiring MP, Richard cheerfully espoused atheism, women’s suffrage, republicanism, anti-imperialism and socialism at a time when any one of these causes was sufficient to scupper a man’s electoral chances. Emmeline was just as politically involved and only slightly less radical.

Sylvia’s mother, the suffragette Emmeline Pankhurst
ALAMY

Despite financial troubles and career disappointments, the Pankhurst parents were a devoted couple and the household a happy one. Sylvia was born in 1882, the second of five children and the middle daughter between Christabel (her mother’s favourite) and Adela (no one’s favourite). She craved Emmeline’s good opinion, but was closer to her father. “Life is valueless without enthusiasms,” he once told her, a piece of advice she took to heart.

Sylvia was only 16 when her father died. Without his counter-influence, the three sisters (and their brother, Harry, who died of polio aged 20) lived in thrall to their powerful mother. After Emmeline and Christabel founded the Women’s Social and Political Union (WSPU) in 1903 — having become frustrated by the lack of support from the Independent Labour Party — there was no question that Sylvia and Adela would do anything other than sacrifice their personal interests for the good of the cause. Sylvia later admitted that one of her greatest regrets was being made to give up a promising art career for politics.

She was imprisoned for the first time in 1906. As the tactics of the WSPU became more extreme, so did the violence employed by the authorities against its members. Sylvia was the only Pankhurst to be subjected to force-feeding, an experience she likened to rape.

“Infinitely worse than the pain,” she wrote of the experience, “was the sense of degradation.” Indeed, in some cases that was the whole point of the exercise. While not widespread, vaginal and anal “feeding” was practised on the hunger strikers. Holmes hints, but doesn’t speculate, that Sylvia may have been one of its victims.

Pankhurst died in Ethiopia in 1960 after accepting an invitation from Emperor Haile Selassie, pictured, to emigrate to Africa
PHOTO ARCHIVE/GETTY

Ironically, Sylvia suffered the most while being the least convinced by the WSPU’s militant tactics. It wasn’t only the violence she abhorred. Emmeline and Christabel wanted the WSPU to remain an essentially middle-class, politically aloof organisation, whereas Sylvia regarded women’s rights as part of a wider struggle for revolutionary socialism. The differences between them became unbridgeable after Sylvia founded a separate socialist wing of the WSPU in the East End. Both she and Adela, whom Emmeline and Christabel dismissed as a talentless lightweight, were summarily expelled from the WSPU in February 1914. The four women would never again be in the same room together.

Sylvia had recognised early on that first-wave feminism suffered from a fundamental weakness. It was simultaneously too narrow and too broad to be a stand-alone political platform. The wildly different directions that were taken by the four Pankhursts after the victory of 1918 proved her right: Emmeline became a Conservative, Christabel a born-again Christian, Sylvia a communist and Adela a fascist, yet all remained loyal to their concept of women’s rights.

Once cut loose from the Pankhurst orbit, Sylvia claimed the freedom to think and act as her conscience directed. In 1918 she fell in love with an Italian anarchist socialist refugee, Silvio Corio, who already had three children with two women. Undeterred, she lived with him in Woodford Green, Essex, in a ramshackle home appropriately named Red Cottage. They remained partners for the best part of 30 years, writing, publishing and campaigning together. Even more distressing for her uptight family, not to mention society in general, at the advanced age of 45 she had a son by him, Richard, who was given her surname rather than Silvio’s.

Sylvia Pankhurst, here in 1940, became a communist after the victory of 1918
ALAMY

Broadly speaking, her life can be divided into four campaigns: after women’s suffrage came communism, then anti-fascism and finally Ethiopian independence. (The last has received the least attention, although Sylvia insisted it gave her the greatest pride.) None was an unalloyed success or without controversy. Her fierce independence would lead her to break with Lenin over their ideological differences, and later support her erstwhile enemy Winston Churchill when their views on fascism aligned. She never had any time for Stalin, left-wing antisemitism or liberal racism. In her mid-seventies and widowed, she cut all ties with Britain by accepting an invitation from Emperor Haile Selassie to emigrate to Ethiopia. She died there in 1960.

The genius of Holmes’s fascinating and important biography is that it approaches Sylvia’s life as if she were a man. The writing isn’t prettified or leavened by amusing anecdotes about Victorian manners; it’s dense and serious, as befits a woman who never wore make-up and didn’t care about clothes. To paraphrase the WSPU’s slogan, it is about deeds, not domesticity. Rather than dwelling on moods and relationships, Holmes is interested in ideas and consequences. It’s wonderfully refreshing. Sylvia lived for her work; her literary output was astounding. In addition to publishing her own newspaper almost every week for over four decades, she wrote nonfiction, fiction, plays, poetry and investigative reports. She even taught herself Romanian so that she could translate the poems of the 19th-century Romantic poet Mihail Eminescu. It doesn’t matter whether Sylvia was right or wrong in her political enthusiasms; as Holmes rightly insists, what counts is that by acting on them she helped to make history.

Historically Speaking: Women Who Made the American West

From authors to outlaws, female pioneers helped to shape frontier society.

The Wall Street Journal

September 9, 2020

On Sept. 14, 1920, Connecticut became the 37th state to ratify the 19th Amendment, which guaranteed women the right to vote. The exercise was largely symbolic, since ratification had already been achieved thanks to Tennessee on Aug. 18. Still, the fact that Connecticut and the rest of the laggard states were located in the eastern part of the U.S. wasn’t a coincidence. Though women are often portrayed in Westerns as either vixens or victims, they played a vital role in the life of the American frontier.

The outlaw Belle Starr, born Myra Belle Shirley, in 1886.
PHOTO: ROEDER BROTHERS/BUYENLARGE/GETTY IMAGES

Louisa Ann Swain of Laramie, Wyo., was the first woman in the U.S. to vote legally in a general election, in September 1870. The state was also ahead of the pack in granting women the right to sit on a jury, act as a justice of the peace and serve as a bailiff. Admittedly, it wasn’t so much enlightened thinking that opened up these traditionally male roles as it was the desperate shortage of women. No white woman crossed the continent until 17-year-old Nancy Kelsey traveled with her husband from Missouri to California in 1841. Once there, as countless pioneer women subsequently discovered, the family’s survival depended on her ability to manage without his help.

That women can and must fend for themselves was the essential message of the “Little House on the Prairie” series of books by Laura Ingalls Wilder, who was brought up on a succession of homesteads in Wisconsin and Minnesota in the 1870s. Independence was so natural to her that she refused to say “I obey” in her marriage vows, explaining, “even if I tried, I do not think I could obey anybody against my better judgment.”

Although the American frontier represented incredible hardship and danger, for many women it also offered a unique kind of freedom. They could forge themselves anew, seizing opportunities that would have been impossible for women in the more settled and urbanized parts of the country.

This was especially true for women of color. Colorado’s first Black settler was a former slave named Clara Brown, who won her freedom in 1856 and subsequently worked her way west to the gold-mining town of Central City. Recognizing a need in the market, she founded a successful laundry business catering to miners and their families. Some of her profits went to buy land and shares in mines; the rest she spent on philanthropy, earning her the nickname “Angel of the Rockies.” After the Civil War, Brown made it her mission to locate her lost family, ultimately finding a grown-up daughter, Eliza.

However, the flip side of being able to “act like men” was that women had to be prepared to die like men, too. Myra Belle Shirley, aka Belle Starr, was a prolific Texas outlaw whose known associates included the notorious James brothers. Despite a long criminal career that mainly involved bootlegging and fencing stolen horses, Starr was convicted only once, resulting in a nine-month prison sentence in the Detroit House of Correction. Her luck finally ran out in 1889, two days before her 41st birthday. By now a widow for the third time, Belle was riding alone in Oklahoma when she was shot and killed in an ambush. The list of suspects included her own children, although the murder was never solved.