These days, every neighborhood bar celebrates Oktoberfest, but the original fall beer festival is the one in Munich, Germany—still the largest of its kind in the world. Oktoberfest was started in 1810 by the Bavarian royal family as a celebration of Crown Prince Ludwig’s marriage to Princess Therese von Sachsen-Hildburghausen. Nowadays, it lasts 16 days and attracts some 6 million tourists, who guzzle almost 2 million gallons of beer.
Yet these staggering numbers conceal the fact that, outside of the developing world, the beer industry is suffering. Beer sales in the U.S. last year accounted for 45.6% of the alcohol market, down from 48.2% in 2010. In Germany, per capita beer consumption has dropped by one-third since 1976. It is a sad decline for a drink that has played a central role in the history of civilization. Brewing beer, like baking bread, is considered by archaeologists to be one of the key markers in the development of agriculture and communal living.
In Sumer, the ancient civilization in modern-day Iraq where the world’s first cities emerged in the 4th millennium BC, up to 40% of all grain production may have been devoted to beer. It was more than an intoxicating beverage; beer was nutritious and much safer to drink than ordinary water because it was boiled first. The oldest known beer recipe comes from a Sumerian hymn to Ninkasi, the goddess of beer, composed around 1800 BC. The fact that a female deity oversaw this most precious commodity reflects the importance of women in its production. Beer was brewed in the kitchen and was considered as fundamental a skill for women as cooking and needlework.
The ancient Egyptians similarly regarded beer as essential for survival: Construction workers for the pyramids were usually paid in beer rations. The Greeks and Romans were unusual in preferring wine; blessed with climates that aided viticulture, they looked down on beer-drinking as foreign and unmanly. (There’s no mention of beer in Homer.)
Northern Europeans adopted wine-growing from the Romans, but beer was their first love. The Vikings imagined Valhalla as a place where beer perpetually flowed. Still, beer production remained primarily the work of women. With most occupations in the Middle Ages restricted to members of male-only guilds, widows and spinsters could rely on ale-making to support themselves. Among her many talents as a writer, composer, mystic and natural scientist, the renowned 12th-century Rhineland abbess Hildegard of Bingen was also an expert on the use of hops in beer.
The female domination of beer-making lasted in Europe until the 15th and 16th centuries, when the growth of the market economy helped to transform it into a profitable industry. As professional male brewers took over production and distribution, female brewers lost their respectability. By the 19th century, women were far more likely to be temperance campaigners than beer drinkers.
When Prohibition ended in the U.S. in 1933, brewers struggled to get beer into American homes. Their solution was an ad campaign selling beer to housewives—not to drink it but to cook with it. In recent years, beer ads have rarely bothered to address women at all, which may explain why only a quarter of U.S. beer drinkers are female.
As we’ve seen recently in the Kavanaugh hearings, a male-dominated beer-drinking culture can be unhealthy for everyone. Perhaps it’s time for brewers to forget “the king of beers”—Budweiser’s slogan—and seek their once and future queen.
Writer and historian Amanda Foreman took her family on an epic motorhome adventure. Would it drive them all round the bend?
Crossing Devils Garden, Utah (Angela Hays)
Imagining yourself behind the wheel of an RV and actually driving one are two completely different things. I discovered this shortly after we hit the road in our 30ft rental for a 1,500-mile odyssey from Denver, Colorado, to Las Vegas, Nevada. As soon as I pressed down on the accelerator, other faculties such as breathing and thinking stopped functioning. My husband, Jonathan, took over, and remained in the driving seat for the rest of the trip, while I quietly buried my pride beneath a pile of flip-flops and rain jackets.
Did I mention there were seven of us — including five children? Masochistic, perhaps, but my motivation in initiating this most ambitious of family holidays was sound. Last year, I wrote in The Sunday Times about the poisonous effect social media can have on families, and on girls’ self-esteem in particular. With four daughters — Helena, 16, Halcyon, 13, and 11-year-old twins Xanthe and Hero (15-year-old Theo completes the gang) — this is particularly pertinent to us.
We’re the Foremans: Amanda and family in their RV (Angela Hays)
The latest research offers hope, though. The best antidote to the rapacious demands of the internet, experts say, is its reverse — the power of lived experience. And so we decided to put that theory to the test by taking to the road.
Everything was new and exciting at first. Look, a herd of cattle blocking the road! Oh, we blew the electrics! It was when we left the cool mountain ranges of Colorado and descended into Utah, a dip that sent the thermometer climbing into the 30s, that reality took over.
We were heading for Moab and its two national and one state parks. Some of the best scenery in the totemic road-trip movie Thelma and Louise is here. I kept repeating this fact until Helena reminded me that they’d never seen the film and couldn’t care less. The atmosphere of the RV, already thick with sweat and hormones, was beginning to grow heavy. Knees and elbows were making contact where they shouldn’t.
Our first hike was through the Arches National Park, so named after the formations that stud the desert like a thousand oversized Durdle Doors. Our trail took us to Devils Garden, with its giant phallic pillars. Another led to Landscape Arch, the longest natural arch in the world. A peculiar tension emanates from it, as though at any second something will snap and the whole thing will collapse. I recognised the feeling.
Arches National Park lives up to its name (Alamy)
Three children claimed to be on the edge of survival by day’s end (heat, thirst, sand in shoe, etc). A night’s camping at Devils Garden did it for the rest.
The evening had started well. We ate under the starriest sky I’d ever seen. The intense blackness around us served to highlight the strange symphony of sounds wafting from the brush. After dinner, we sat in companionable silence, listening to the music of the desert. If we could have stayed that way until morning, the night would have been perfect. But shutting down an RV for the night requires concentration — not the futzing and farting about of a family of camping novices. By the time we had secured the awning, washed and put everything away, cleared the only escape route of detritus, converted the “dinette” into a quasi-bed large enough for a small person, or in our case reluctant twins, turned the banquette into a sofa bed (tricky with the seatbelt fittings running down the middle) and stashed the teenagers in the bunk above the driver’s seat, the mercury had started rising in the worst way.
I lay on my side in the back alcove, feeling the grit between my toes and the sweat between my eyes, waiting for the inevitable blow-up. There was a yell, a thwack, and the RV started shaking on its plastic levellers. My husband and I leapt out of bed and turned on the lights. The children froze, the twins in mid-wrestle. Each returned sulkily to his or her place.
Once peace was restored, Jonathan and I listened for signs of a round two: was that it, or merely the first skirmish in a battle to the death? “It depends,” I said, “on whether we kill them first.”
“Where do we go from here?” Evita sang over the sound system as we headed to our next stop the following day. There were three more destinations to see before we arrived at the Grand Canyon. “This isn’t where we intended to be,” she warbled.
“Don’t listen to her,” my husband told the children, “she’s being a wuss.”
I was beginning to fear the wuss factor would end up trumping the wow factor. Your basic renters’ RV is essentially a tin can on wheels with a lavatory attached. By midweek we had wrecked the place. Only the desperate and foolhardy dared face the loo. Every spare inch of the RV was occupied by wet and drying clothes, creating a steam-room effect without the refreshing scent of eucalyptus.
On the prowl: a puma roams through Moab, Utah (Getty)
A gentle rain accompanied our arrival at Bryce Canyon. Despite the name, it’s not a canyon at all, but a series of natural stone amphitheatres created by millions of years of frost-thaw cycles. Each one is filled with densely packed pillars and spires called “hoodoos”, which turn crimson and ginger in the sun. Our goal, as we set off from Ruby’s Campground, was to see the sun set over Bryce’s most famous amphitheatre, the hauntingly named Silent City. There was a whiff of mutiny in the damp air. Nothing obvious, but if I had suggested going back to the RV, no one would have complained.
We trudged through the pines towards Inspiration Point (by God, did we need some) in a long, straggly line. Glimpses of Silent City flashed through breaks in the trees. The hoodoos seemed not alive exactly, but not entirely inanimate either. I looked back and saw that the stragglers had stopped. They, too, were staring at the rocks.
At the Point, we waited in vain for a break in the clouds. Our son suddenly shouted: “It’s a peregrine falcon, I’m sure of it.” His sisters clustered around, taking photographs as the bird swooped and then banked hard into the air. “They do this to impress their mates,” Theo informed us.
Back at camp, we broke out the chocolate Oreos in celebration of Theo’s surprise expertise in all things avian. The rain also cleared in time for us to light the barbecue and go full-on carnivore. There wasn’t a peep from anyone that night.
Gorgeous: the Virgin River flows through Zion National Park (Alamy)
The next morning, we reached Zion National Park right on schedule, a first for us. This day was a hike through the Virgin River to the Narrows, where the orange-brown walls of the canyon rise 1,000ft, but the gorge is only 30ft across. The water was cold, the rocks slippery, providing ample opportunities for a whinge. None came. I noticed a subtle change as we waded upriver: the children were the ones out in front. That wasn’t all. Over the next few days, blisters that would have been incapacitating at the beginning of the trip were now displayed with pride.
The news that we were camping on the rustic side of the Grand Canyon, at the North Rim, didn’t faze anyone at all. The South Rim, which overlooks the Colorado River, gets 10 times more visitors and has two dozen main viewing points. The North is higher, colder, smaller, with just three principal viewpoints, and is so quiet that you can hear the trees rustling in the early-morning breeze. It means you have one of the great natural wonders of the world all to yourself. We could picnic among the most spectacular scenery without another person in sight.
If there was a wuss among the group, it was me. We had trekked to Cape Royal, where a wooded trail leads to a dramatic plateau that offers one of the widest panoramas of the canyon. The children were naturally drawn to the ledge, carelessly dangling their legs over the side. I was torn between wanting to capture the moment for ever and shouting at them to get back. Jonathan intervened. “It’s all right,” he said. “They are free.”
There wasn’t an ounce of regret when we dropped off the camper in Las Vegas. I doubt we will ever rent an RV again. Yet we ended the week feeling truly content. Without being sappy about it, to experience nature in its untrammelled state is to feel insignificant and uplifted in equal measure.
We came, we saw, and were conquered, happily.
Amanda’s guide to a stress-free US motorhome holiday
Do book your campground berths at least six months in advance. I’m not exaggerating.
Do find out what your RV comes with. Usually, it’s practically nothing.
Do check the depot for “left behinds”. We picked up a barbecue that way, and donated it back at the end.
Do bring physical maps with you — there isn’t much coverage out in the sticks. Praise be.
Do have a games bag with amusements that don’t need charging, such as playing cards, colouring books and chess.
Do take the desert heat seriously. Load up on hats, sunscreen, insulated water bottles and handheld fans.
Do make a list of chores. There should be no passengers on your trip, only crew members.
Don’t overpack.
Don’t attempt more than four hours’ driving a day. It’s not much fun for the people stuck in the back.
Don’t assume that you’ll be allowed to use your generator past 8pm. Plan your meals ahead of time.
Don’t leave out any food overnight. Nature is not your friend. Nature wants to swarm over your leftovers and/or eat you.
Don’t forget to check your water levels every day. Remember how disgusting it is when you flush the loo on a plane and nothing happens when the trap opens? Now multiply that by five.
Don’t expect kids to be as rhapsodic about the scenery as you are.
Hiring a six-berth RV for 10 days in November, picking up in Denver and dropping off in Las Vegas, starts at £830, including the one-way fee, with Road Bear (roadbearrv.com). A 10-day fly-drive package from Denver in October costs £1,225pp for a family of six, including RV hire (bon-voyage.co.uk). A World on Fire by Amanda Foreman is published by Penguin
Family ties
Have you successfully bonded as a family on holiday? Or had a disaster? Tell us and you could win a £250 holiday voucher; see Letters for details. Email [email protected]
Fifty years ago, on September 30, 1968, the world’s first 747 Jumbo Jet rolled out of Boeing’s plant in Everett, Washington, north of Seattle. It was hailed as the future of commercial air travel, complete with fine dining, live piano music and glamorous stewardesses. And perhaps we might still be living in that future, were it not for the 1978 Airline Deregulation Act signed into law by President Jimmy Carter.
Deregulation was meant to increase the competitiveness of the airlines, while giving passengers more choice about the prices they paid. It succeeded in greatly expanding the accessibility of air travel, but at the price of making it a far less luxurious experience. Today, flying is a matter of “calculated misery,” as Columbia Law School professor Tim Wu put it in a 2014 article in the New Yorker. Airlines deliberately make travel unpleasant in order to force economy passengers to pay extra for things that were once considered standard, like food and blankets.
So it has always been with mass travel, since its beginnings in the 17th century: a test of how much discomfort and delay passengers are willing to endure. For the English Puritans who sailed to America on the Mayflower in 1620, light and ventilation were practically non-existent, the food was terrible and the sanitation primitive. All 102 passengers were crammed into a tiny living area just 80 feet long and 20 feet wide. To cap it all, the Mayflower took 66 days to arrive instead of the usual 47 for a trans-Atlantic crossing and was 600 miles off course from its intended destination of Virginia.
The introduction of the commercial stagecoach in 1610, by a Scottish entrepreneur who offered trips between Edinburgh and Leith, made it easier for the middle classes to travel by land. But it was still an expensive and unpleasant experience. Before the invention of macadam roads—which rely on layers of crushed stone to create a flat and durable surface—in Britain in the 1820s, passengers sat cheek by jowl on springless benches, in a coach that trundled along at around five miles per hour.
The new paving technology improved travel times but not necessarily the overall experience. Charles Dickens had already found fame with his comic stories of coach travel in “The Pickwick Papers” when he and Mrs. Dickens traveled on an American stagecoach in Ohio in 1842. They paid to have the coach to themselves, but the journey was still rough: “At one time we were all flung together in a heap at the bottom of the coach.” Dickens chose to go by rail for the next leg of the trip, which wasn’t much better: “There is a great deal of jolting, a great deal of noise, a great deal of wall, not much window.”
Despite its primitive beginnings, 19th-century rail travel evolved to offer something revolutionary to its paying customers: quality service at an affordable price. In 1868, the American inventor George Pullman introduced his new designs for sleeping and dining cars. For a modest extra fee, the distinctive green Pullman cars provided travelers with hotel-like accommodation, forcing rail companies to raise their standards on all sleeper trains.
By contrast, the transatlantic steamship operators pampered their first-class passengers and abused the rest. In 1879, a reporter at the British Pall Mall Gazette sailed Cunard’s New York to Liverpool route in steerage in order to “test [the] truth by actual experience.” He was appalled to find that passengers were treated worse than cattle. No food was provided, “despite the fact that the passage is paid for.” The journalist noted that two steerage passengers “took one look at the place” and paid for an upgrade. I think we all know how they felt.
Among the pallbearers at Senator John McCain’s funeral in Washington last weekend was the Russian dissident Vladimir Kara-Murza. Mr. Kara-Murza is a survivor of two poisoning attempts, in 2015 and 2017, which he believes were intended as retaliation for his activism against the Putin regime.
Indeed, Russia is known or suspected to be responsible for several notorious recent poisoning cases, including the attempted murder this past March of Sergei Skripal, a former Russian spy living in Britain, and his daughter Yulia with the nerve agent Novichok. They survived the attack, but several months later a British woman died of Novichok exposure a few miles from where the Skripals lived.
Poison has long been a favorite tool of brutal statecraft: It both terrorizes and kills, and it can be administered without detection. The Arthashastra, an ancient Indian political treatise that out-Machiavels Machiavelli, contains hundreds of recipes for toxins, as well as advice on when and how to use them to eliminate an enemy.
Most royal and imperial courts of the classical world were also awash with poison. Though it is impossible to prove so many centuries later, the long list of putative victims includes Alexander the Great (poisoned wine), Emperor Augustus (poisoned figs) and Emperor Claudius (poisoned mushrooms), as well as dozens of royal heirs, relatives, rivals and politicians. King Mithridates of Pontus, an ancient Hellenistic kingdom, was so paranoid—having survived a poison attempt by his own mother—that he took daily microdoses of every known toxin in order to build up his immunity.
Poisoning reached its next peak during the Italian Renaissance. Every ruling family, from the Medicis to the Viscontis, either fell victim to poison or employed it as a political weapon. The Borgias were even reputed to have their own secret recipe, a variation of arsenic called “cantarella.” Although a large number of their rivals conveniently dropped dead, the Borgias were small fry compared with the republic of Venice. The records of the Venetian Council of Ten reveal that a secret poison program went on for decades. Remarkably, two victims are known to have survived their assassination attempts: Count Francesco Sforza in 1450 and the Ottoman Sultan Mehmed II in 1477.
In the 20th century, the first country known to have established a targeted poisoning program was Russia under the Bolsheviks. According to Boris Volodarsky, a former Russian agent, Lenin ordered the creation of a poison laboratory called the “Special Room” in 1921. By the Cold War, the one-room lab had evolved into an international factory system staffed by hundreds, possibly thousands of scientists. Their specialty was untraceable poisons delivered by ingenious weapons—such as a cigarette packet made in 1954 that could fire bullets filled with potassium cyanide.
In 1978, the prizewinning Bulgarian writer Georgi Markov, then working for the BBC in London, was killed by an umbrella tip that shot a pellet containing the poison ricin into his leg. After the international outcry, the Soviet Union toned down its poisoning efforts but didn’t end them. And Putin’s Russia has continued to use similar techniques. In 2006, according to an official British inquiry, Russian secret agents murdered the ex-spy Alexander Litvinenko by slipping polonium into his drink during a meeting at a London hotel. It was the beginning of a new wave of poisonings whose end is not yet in sight.
Since it began making headlines last year, the #MeToo movement has expanded into a global rallying cry. The campaign has many facets, but its core message is clear: Women who are victims of sexual harassment and assault still face too many obstacles in their quest for justice.
How much harder it was for women in earlier eras is illustrated perfectly by Emperor Constantine’s 326 edict on rape and abduction. While condemning both, the law assumed that all rape victims deserved punishment for their failure to resist more forcefully. The best outcome for the victim was disinheritance from her parents’ estate; the worst, death by burning.
In the Middle Ages, a rape victim was more likely to be blamed than believed, unless she suffered death or dismemberment in the attack. That makes the case of the Englishwoman Isabella Plomet all the more remarkable. In 1292, Plomet went to her doctor Ralph de Worgan to be treated for a leg problem. He made her drink a sleeping drug and then proceeded to rape her while she was unconscious.
It’s likely that Worgan, a respected pillar of local society, had relied for years on the silence of his victims. But Plomet’s eloquence in court undid him: He was found guilty and fined. The case was a landmark in medieval law, broadening the definition of rape to include nonconsent through intoxication.
But prejudice against the victims of sexual assault was slow to change. In Catholic Europe, notions of family honor and female reputation usually meant that victims had to marry their rapists or be classed as ruined. This was the origin of the most famous case of the 17th century. In 1612, Artemisia Gentileschi and her father Orazio brought a suit in a Roman court against her art teacher, Agostino Tassi, for rape.
Although Tassi had a previous criminal record, as a “dishonored” woman it was Gentileschi who had to submit to torture to prove that she was telling the truth. She endured an eight-month trial to see Tassi convicted and temporarily banished from Rome. “Cleared” by her legal victory, Gentileschi refused to let the attack define her or determine the rest of her life. She is now regarded as one of the greatest artists of the Baroque era.
One class of victims who had no voice and no legal recourse was free and enslaved black women in pre-Civil War America. Their stories make grim reading. In 1855, Celia, an 18-year-old slave in Missouri, killed her master when he attempted to rape her. At her trial she insisted—through her lawyers, since she was barred from testifying—that the right to self-defense extended to all women. The court disagreed, and Celia was executed—but not before making a successful prison break and almost escaping.
Change was still far off in 1931, when the 18-year-old Rosa Parks, working as a housekeeper, was pounced on by her white employer. As she later recalled, “He offered me a drink of whiskey, which I promptly and vehemently refused. He moved nearer to me and put his hand on my waist.” She managed to fight him off, and in a larger sense Parks never stopped fighting. She became a criminal investigator for the NAACP, helping black victims of white sexual assault to press charges.
Rosa Parks is often referred to as the “first lady of civil rights,” in recognition of her famous protest on a segregated bus in Montgomery, Alabama in 1955. She should also be remembered as one of the unsung heroines in the long prehistory of #MeToo.
“Ay me!” laments Lysander in Shakespeare’s “A Midsummer Night’s Dream.” “For aught that I could ever read, / Could ever hear by tale or history, / The course of true love never did run smooth.” What audience would disagree? Thwarted lovers are indeed the stuff of history and art—especially when the lovers are kings and queens.
But there were good reasons why the monarchs of old were not allowed to follow their hearts. Realpolitik and royal passion do not mix, as Cleopatra VII (69-30 B.C.), the anniversary of whose death falls on Aug. 12, found to her cost. Her theatrical seduction of and subsequent affair with Julius Caesar insulated Egypt from Roman imperial designs. But in 41 B.C., she let her heart rule her head and fell in love with Mark Antony, who was fighting Caesar’s adopted son Octavian for control of Rome.
Cleopatra’s demand that Antony divorce his wife Octavia—sister of Octavian—and marry her instead was a catastrophic misstep. It made Egypt the target of Octavian’s fury, and forced Cleopatra into fighting Rome on Antony’s behalf. The couple’s defeat at the sea battle of Actium in 31 B.C. didn’t only end in personal tragedy: the 300-year-old Ptolemaic dynasty was destroyed, and Egypt was reduced to a Roman province.
In Shakespeare’s play “Antony and Cleopatra,” Antony laments, “I am dying, Egypt, dying.” It is a reminder that, as Egypt’s queen, Cleopatra was the living embodiment of her country; their fates were intertwined. That is why royal marriages have usually been inseparable from international diplomacy.
In 1339, when Prince Pedro of Portugal fell in love with his wife’s Castilian lady-in-waiting, Inés de Castro, the problem wasn’t the affair per se but the opportunity it gave to neighboring Castile to meddle in Portuguese politics. In 1355, Pedro’s father, King Afonso IV, took the surest way of separating the couple—who by now had four children together—by having Inés murdered. Pedro responded by launching a bloody civil war against his father that left northern Portugal in ruins. The dozens of romantic operas and plays inspired by the tragic love story neglect to mention its political repercussions; for decades afterward, the Portuguese throne was weak and the country divided.
Perhaps no monarchy in history bears more scars from Cupid’s arrow than the British. From Edward II (1284-1327), whose poor choice of male lovers unleashed murder and mayhem on the country—he himself was allegedly killed with a red-hot poker—to Henry VIII (1491-1547), who bullied and butchered his way through six wives and destroyed England’s Catholic way of life in the process, British rulers have been remarkable for their willingness to place personal happiness above public responsibility.
Edward VIII (1894-1972) was a chip off the old block, in the worst way. The moral climate of the 1930s couldn’t accept the King of England marrying a twice-divorced American. Declaring he would have Wallis Simpson or no one, Edward plunged the country into crisis by abdicating in 1936. With European monarchies falling on every side, Britain’s suddenly looked extremely vulnerable. The current Queen’s father, King George VI, quite literally saved it from collapse.
According to a popular saying, “Everything in the world is about sex except sex. Sex is about power.” That goes double when the lovers wear royal crowns.
Evidence shows people become happier in their fifties, but achieving that takes some soul-searching
I used not to believe in the “midlife crisis”. I am ashamed to say that I thought it was a convenient excuse for self-indulgent behaviour — such as splurging on a Lamborghini or getting buttock implants. So I wasn’t even aware that I was having one until earlier this year, when my family complained that I had become miserable to be around. I didn’t shout or take to my bed, but five minutes in my company was a real downer. The closer I got to my 50th birthday, the more I radiated dissatisfaction.
Can you be simultaneously contented and discontented? The answer is yes. Surveys of “national wellbeing” in several countries, including those carried out in the UK by the Office for National Statistics, have revealed a fascinating U-curve in the relationship between happiness and age. In Britain, feelings of stress and anxiety appear to peak at 49 and subsequently fade as the years increase. Interestingly, a 2012 study showed that chimpanzees and orang-utans exhibit a similar U-curve of happiness as they reach middle age.
On a rational level, I wasn’t the least bit disappointed with my life. The troika of family, work and friends made me very happy. And yet something was eating away at my peace of mind. I regarded myself as a failure — not in terms of work but as a human being. Learning that I wasn’t alone in my daily acid bath of gloom didn’t change anything.
One of F Scott Fitzgerald’s most memorable lines is: “There are no second acts in American lives.” It’s so often quoted that it’s achieved the status of a truism. It’s often taken to be an ironic commentary on how Americans, particularly men, are so frightened of failure that they cling to the fiction that life is a perpetual first act. As I thought about the line in relation to my own life, Fitzgerald’s meaning seemed clear. First acts are about actions and opportunities. There is hope, possibility and redemption. Second acts are about reactions and consequences.
Old habits die hard, however. I couldn’t help conducting a little research into Fitzgerald’s life. What was the author of The Great Gatsby really thinking when he wrote the line? Would it even matter?
The answer turned out to be complicated. As far as the quotation goes, Fitzgerald actually wrote the reverse. The line appears in a 1935 essay entitled My Lost City, about his relationship with New York: “I once thought that there were no second acts in American lives, but there was certainly to be a second act to New York’s boom days.”
It reappeared in the notes for his Hollywood novel, The Love of the Last Tycoon, which was half finished when he died in 1940, aged 44. Whatever he had planned for his characters, the book was certainly meant to have been Fitzgerald’s literary comeback — his second act — after a decade of drunken missteps, declining book sales and failed film projects.
Fitzgerald may not have subscribed to the “It’s never too late to be what you might have been” school of thought, but he wasn’t blind to reality. Of course he believed in second acts. The world is full of middle-aged people who successfully reinvented themselves a second or even third time. The mercurial rise of Emperor Claudius (10BC to AD54) is one of the earliest historical examples of the true “second act”.
According to Suetonius, Claudius’s physical infirmities had made him the butt of scorn among his powerful family. But his lowly status saved him after the assassination of his nephew, Caligula. The plotters found the 50-year-old Claudius cowering behind a curtain. On the spur of the moment, instead of killing him, as they did Caligula’s wife and daughter, they decided the stumbling and stuttering scion of the Julio-Claudian dynasty could be turned into a puppet emperor. It was a grave miscalculation. Claudius seized on his changed circumstances. The bumbling persona was dropped and, although flawed, he became a forceful and innovative ruler.
Mostly, however, it isn’t a single event that shapes life after 50 but the willingness to stay the course long after the world has turned away. It’s extraordinary how the granting of extra time can turn tragedy into triumph. In his heyday, General Mikhail Kutuzov was hailed as Russia’s greatest military leader. But by 1800 the 55-year-old was prematurely aged. Stiff-limbed, bloated and blind in one eye, Kutuzov looked more suited to play the role of the buffoon than the great general. He was Alexander I’s last choice to lead the Russian forces at the Battle of Austerlitz in 1805, but was the first to be blamed for the army’s defeat.
Kutuzov was relegated to the sidelines after Austerlitz. He remained under official disfavour until Napoleon’s army was halfway to Moscow in 1812. Only then, with the army and the aristocracy begging for his recall, did the tsar agree to his reappointment. Thus, in Russia’s hour of need it ended up being Kutuzov, the disgraced general, who saved the country.
Winston Churchill had a similar apotheosis in the Second World War. For most of the 1930s he was considered a political has-been by friends and foes alike. His elevation to prime minister in 1940 at the age of 65 changed all that, of course. But had it not been for the extraordinary circumstances created by the war, Robert Rhodes James’s Churchill: A Study in Failure, 1900-1939 would have been the epitaph rather than the prelude to the greatest chapter in his life.
It isn’t just generals and politicians who can benefit from second acts. For writers and artists, particularly women, middle age can be extremely liberating. The Booker prize-winning novelist Penelope Fitzgerald published her first book at 59 after a lifetime of teaching while supporting her children and alcoholic husband. Thereafter she wrote at a furious pace, producing nine novels and three biographies before she died at 83.
I could stop right now and end with a celebratory quote from Morituri Salutamus by the American poet Henry Wadsworth Longfellow: “For age is opportunity no less / Than youth itself, though in another dress, / And as the evening twilight fades away / The sky is filled with stars, invisible by day.”
However, that isn’t — and wasn’t — what was troubling me in the first place. I don’t think the existential anxieties of middle age are caused or cured by our careers. Sure, I could distract myself with happy thoughts about a second act where I become someone who can write a book a year rather than one a decade. But that would still leave the problem of the flesh-and-blood person I had become in reality. What to think of her? It finally dawned on me that this had been my fear all along: it doesn’t matter which act I am in; I am still me.
My funk lifted once the big day rolled around. I suspect that joining a gym and going on a regular basis had a great deal to do with it. But I had also learnt something valuable during these past few months. Worrying about who you thought you would be or what you might have been fills a void but leaves little space for anything else. It’s coming to terms with who you are right now that really matters.
Since the 16th century, travelers have recorded the overwhelming impact of a natural wonder.
Illustration: Thomas Fuchs
Strange as it may sound, it was watching Geena Davis and Susan Sarandon in the tragic final scene of “Thelma and Louise” (1991) that convinced me I had to go to the Grand Canyon one day and experience its life-changing beauty. Nearly three decades have passed, but I’m finally here. Instead of a stylish 1966 Ford Thunderbird, however, I’m driving a mammoth RV, with my family in tow.
The overwhelming presence of the Grand Canyon is just as I dreamed. Yet I’m acutely aware of how one-sided the relationship is. As the Pulitzer Prize-winning poet Carl Sandburg wrote in “Many Hats” in 1928: “For each man sees himself in the Grand Canyon—each one makes his own Canyon before he comes.”
The first Europeans to encounter the Canyon were Spanish conquistadors searching for the legendary Seven Golden Cities of Cibola. In 1540, Hopi guides took a small scouting party led by García López de Cárdenas to the South Rim (60 miles north of present-day Williams, Ariz.). In Cárdenas’s mind, the Canyon was a route to riches. After trying for three days to find a path to reach the river below, he cut his losses in disgust and left. Cárdenas saw no point to the Grand Canyon if it failed to yield any treasure.
Three centuries later, in 1858, the first Euro-American to follow in Cárdenas’s footsteps, Lt. Joseph Christmas Ives of the U.S. Army Corps of Topographical Engineers, had a similar reaction. In his official report, Ives waxed lyrical about the magnificent scenery but concluded, “The region is, of course, altogether valueless. … Ours has been the first, and will doubtless be the last, party of whites to visit this profitless locality.”
Americans only properly “discovered” the Grand Canyon through the works of artists such as Thomas Moran. A devotee of the Hudson River School of painters, Moran found his spiritual and artistic home in the untamed landscapes of the West. His romantic pictures awakened the public to the natural wonder in their midst. Eager to see the real thing, visitors came in growing numbers, and the trickle turned into a stream by the late 1880s.
The effusive reactions to the Canyon recorded by tourists who made the arduous trek from Flagstaff, Ariz. (a railway to Grand Canyon Village was only built in 1901) have become a familiar refrain: “Not for human needs was it fashioned, but for the abode of gods…. To the end it effaced me,” wrote Harriet Monroe, the founder of Poetry magazine, in 1899.
But there was one class of people who were apparently insensible to the Canyon: copper miners. Watching their thoughtless destruction of the landscape, Monroe wondered, “Do they cease to feel it?” President Theodore Roosevelt feared so, and in 1908 he made an executive decision to protect 800,000 acres from exploitation by creating the Grand Canyon National Monument.
Roosevelt’s farsightedness may have put a crimp in the profits of mining companies, but it paid dividends in other ways. By the 1950s, the Canyon had become a must-see destination, attracting visitors from all over the world. Among them were the tragic Sylvia Plath, author of “The Bell Jar,” and her husband, Ted Hughes, the future British Poet Laureate. Thirty years later, the visit to the Canyon still haunted Hughes: “I never went back and you are dead. / But at odd moments it comes, / As if for the first time.” He is not alone, I suspect, in never fully leaving the Canyon behind.
Today’s gyms, which depend on our vanity and body envy, are a far cry from what the Greeks envisioned
Illustration: Thomas Fuchs
Going to the gym takes on a special urgency at this time of year, as we prepare to put our bodies on display at the pool and beach. Though the desire to live a virtuous life of fitness no doubt plays its part, vanity and body envy are, I suspect, the main motivation for our seasonal exertions.
The ancient Greeks, who invented gyms (the Greek gymnasion means “school for naked exercise”), were also body-conscious, but they saw a deeper point to the sweat. No mere muscle shops, Greek gymnasia were state-sponsored institutions aimed at training young men to embody, literally, the highest ideals of Greek virtue. In Plato’s “The Republic,” Socrates says that the two branches of physical and academic education “seem to have been given by some god to men…to ensure a proper harmony between energy and initiative on the one hand and reason on the other, by tuning each to the right pitch.”
Physical competition, culminating in the Olympics, was a form of patriotic activity, and young men went to the gym to socialize, bathe and learn to think. Aristotle founded his school of philosophy at the Lyceum, a gymnasium that also offered physical training.
The Greek concept fell out of favor in the West with the rise of Christianity. The abbot St. Bernard of Clairvaux (1090–1153), who advised five popes, wrote, “The spirit flourishes more strongly…in an infirm and weak body,” neatly summing up the medieval ambivalence toward physicality.
Many centuries later, an eccentric German educator named Friedrich Jahn (1778-1852) played a key role in the gym’s revival. Convinced that Prussia’s defeat by Napoleon was due to his compatriots’ descent into physical and moral weakness, Jahn decided that a Greek-style gym would “preserve young people from laxity and…prepare them to fight for the fatherland.” In 1811, he opened a gym in Berlin for military-style physical training (not to be confused with the older German usage of the term gymnasium for the most advanced level of secondary schools).
By the mid-19th century, Europe’s upper-middle classes had sufficient wealth and leisure time to devote themselves to exercise for exercise’s sake. Hippolyte Triat opened two of the first truly commercial gyms in Brussels and Paris in the late 1840s. A retired circus strongman, he capitalized on his physique to sell his “look.”
But broader spiritual ideas still influenced the spread of physical fitness. The 19th-century movement Muscular Christianity sought to transform the working classes into healthy, patriotic Christians. One offshoot, the Young Men’s Christian Association, became famous for its low-cost gyms.
By the mid-20th century, Americans were using their gyms for two different sets of purposes. Those devoted to “manliness” worked out at places like Gold’s Gym and aimed to wow others with their physiques. The other group, “health and fitness” advocates, expanded sharply after Jack LaLanne, who founded his first gym in 1936, turned a healthy lifestyle into a salable commodity. A few decades later, Jazzercise, aerobics, disco and spandex made the gym a liberating, fashionable and sexy place.
More than 57 million Americans belong to a health club today, but until local libraries start adding spinning classes and CrossFit, the gym will remain a shadow of the original Greek ideal. We prize our sound bodies, but we aren’t nearly as devoted to developing sound mind and character.
In the U.S., the author Junot Díaz has stepped down as Pulitzer Prize chairman while the board investigates allegations of sexual misconduct. In a statement through his literary agent earlier this month, Mr. Díaz did not address individual accusations but said in part, “I take responsibility for my past.” Finally, the organizers of the Echo, Germany’s version of the Grammys, said they would no longer bestow the awards after one of this year’s prizes went to rappers who used anti-Semitic words and images in their lyrics and videos.
Prize-giving controversies—some more serious than others—go back millennia. I know something about prizes, having served as chairwoman of the literary Man Booker Prize jury.
The ancient Greeks gave us the concept of the arts prize. To avoid jury corruption in their drama competitions during the Festival of Dionysus, the Athenians devised a complicated system of votes and lotteries that is still not entirely understood today. Looking back now, the quality of the judging seems questionable. Euripides, the greatest tragedian of classical Greece, habitually challenged his society’s assumptions in tragedies like “Medea,” which sympathetically portrayed female desperation in a society where men ruled absolutely. In a three-way competition, “Medea,” which still holds the stage today, placed last.
Controversy surrounding a competition can be a revitalizing force—especially when the powers that be support the dissidents. By the 1860s, France’s Academy of Fine Arts, the defender of official taste, was growing increasingly out of touch with contemporary art. In 1863, the jury of the prestigious annual Salon exhibition, which the academy controlled, rejected artists such as Paul Cézanne, Camille Pissarro and Édouard Manet.
The furor caused Emperor Napoleon III to order a special exhibition, the Salon des Refusés (“Salon of Rejects”), to “let the public judge” who was right. The public was divided, but the artists felt emboldened, and many scholars regard 1863 as the birthdate of modern painting. The Academy ultimately relinquished its control of the Salon in 1881. Its time was over.
At other times, controversies over prizes are more flash than substance. As antigovernment student protests swept Paris and many other places in 1968, a group of filmmakers tried to show solidarity with the protesters by shutting down the venerable Cannes Film Festival. At one point, directors hung from a curtain to prevent a film from starting. The festival was canceled but returned in 1969 without the revolutionary changes some critics were hoping for.
As the summer approaches and the beleaguered festivals around the world take a breather, here’s some advice from a survivor of the prize process: Use this time to reflect and revive.