03/5/20

Mr. Sammler’s City


Saul Bellow’s prophetic 1970 novel captured New York’s unraveling and remains a cautionary tale.
Myron Magnet
Spring 2008

Fear was a New Yorker’s constant companion in the 1970s and ’80s. We lived behind doors with triple locks, some like engines of medieval ironmongery. We barred our ground-floor and fire-escape windows with steel grates that made us feel imprisoned. I was thankful for mine, though, when a hatchet turned up on my fire escape, origin unknown. Nearing our building entrances, we held our keys at the ready and looked over our shoulders, as police and street-smart lore advised; our hearts pounded as we tried to shove the heavy doors open and slam them shut before some mugger could push in behind us, standard mugging procedure. Only once was I too slow and lost my money. A neighbor, who worked at a midtown bank, lost his life.
So to read Saul Bellow’s Mr. Sammler’s Planet when it came out in 1970 was like a jolt of electricity. Just when New York had begun to spin out of control—steadily worsening for over two decades until murders numbered over 2,200 a year, one every four hours—Bellow’s novel described the unraveling with brilliant precision and explained unflinchingly why it was happening. His account shocked readers: some thought it racist and reactionary; others feared it was true but too offensive for a decent person to say. In those days, I felt I should cover my copy with a plain brown wrapper on the subway to veil the obscenity of its political incorrectness.
The book was true, prophetically so. And now that we live in New York’s second golden age—the age of reborn neighborhoods in every borough, of safe streets bustling with tourists, of $40 million apartments, of filled-to-overflowing private schools and colleges, of urban glamour; the age when the New York Times runs stories that explain how once upon a time there was THE AGE OF THE MUGGER and that ask, IS NEW YORK LOSING ITS STREET SMARTS?—it’s important to recall that today’s peace and prosperity mustn’t be taken for granted. Hip young residents of the revived Lower East Side or Williamsburg need to know that it’s possible to kill a city, that the streets they walk daily were once no-go zones, that within living memory residents and companies were fleeing Gotham, that newsweeklies heralded the rotting of the Big Apple and movies like Taxi Driver and Midnight Cowboy plausibly depicted New York as a nightmare peopled by freaks. That’s why it’s worth looking back at Mr. Sammler to understand why that decline occurred: we need to make sure it doesn’t happen again.

02/7/20

Drain the Swamp of Ugly Architecture

Trump plans a welcome executive order requiring federal buildings to be built in the classical style.
By Myron Magnet
Feb. 6, 2020

“Making Federal Buildings Beautiful Again,” a new executive order planned by the Trump administration, would thrill lifelong amateur architects George Washington and Thomas Jefferson. These Founders—who designed Mount Vernon, Monticello and the Virginia State Capitol—wanted the new nation’s public buildings to embody its ideals of self-governance, rooted in Greek democracy and Roman republicanism. They would surely applaud President Trump’s proposed order to build new federal buildings in the classical style.

Architectural classicism is a living language, not an antiquarian straitjacket. Its grammar of columns and capitals, pediments and proportions allows a wide range of expression. Just look at the original genius with which Michelangelo marshaled that language in his era or Christopher Wren in his. It’s a language that constantly updated itself in America’s federal city, from the handsome 1790s White House to John Russell Pope’s sublime 1940s Jefferson Memorial and National Gallery of Art. In the language of classicism, buildings relate civilly to each other, forming harmonious cities—Venice or pre-World War II London—in which the whole adds up to more than the sum of its parts, however beautiful some may be. A bad classical building may be awkward or uninspired; it is never hideous. And all is based on human proportions and human scale.

Not so for the modernism that the proposed executive order discourages. Though modernism is an odd word for a style that’s now almost a century old, it began with an explicit European rejection of American architecture and a thoroughly 20th-century impulse toward central planning and state control. Modernism brought housing projects so bare and standardized that no worker wanted to live in them.

Even when you look at a supposed masterpiece of that style—Mies van der Rohe’s Seagram Building on Park Avenue in New York, say—you see one identical office piled on top of another, with the same curtains and furniture arrangement, as if every inmate were an interchangeable cog in some vast machine that utterly dwarfs him. It is an architecture that belittles rather than exalts the individual, the very opposite of the exhilaration you feel in the Capitol rotunda or Grand Central Terminal. Modernist buildings, the expression of a mechanical, anonymous vision of a social leviathan that individuals are born to serve, might as well be designed by machines. In this computer age, they largely are.

What’s more, they are ugly. The Pritzker Prize in architecture, like the Nobel Peace Prize, almost guarantees the honoree will be the Yasser Arafat of architecture, the very opposite of what the prize claims to honor. Consider Pritzker winner Thom Mayne’s contribution to America’s national patrimony. His Orwellian San Francisco Federal Building resembles a cyclops mated with a prison. The building is so hideously antisocial that, like Boston’s brutalist concrete City Hall, the homeless camp there permanently.

Of course the modernist establishment has already slammed the proposed executive order, which overturns the General Services Administration’s design excellence program, long a full-employment scheme for modernist architects. The debate now, says an arts critic in the Guardian, is between “those who trust architects and professionals to design whatever they think is best, and those who seek to control what they do.”

That’s precisely right. Most Americans don’t like the buildings that architecture’s mandarins have crammed down their throats. Ordinary people choose traditional values over the wisdom of self-proclaimed experts every time. In fact, that is Trumpism’s hallmark.

02/1/20

Clarence Thomas: the Movie

Don’t miss this new documentary.
Myron Magnet
January 31, 2020

From a kerosene-lit shanty in a Georgia swamp to the Supreme Court bench is almost as meteoric a rise as from a log cabin to the White House, and if you add in overcoming segregation in the days when the KKK marched openly down Savannah’s main street, it’s closer still. Michael Pack’s riveting documentary on Justice Clarence Thomas, Created Equal—opening in theaters this week and airing on PBS in May—movingly captures the uncompromising ethic that propelled the justice’s career past so many obstacles as it distills 30 hours of interviews with Thomas and his wife, Virginia, into what feels not only like the exemplary life story of an underappreciated hero but also like a laser-focused, two-hour account of our nation’s race relations over the last 70 years. Yes, we overcame, but at a cost—of which Justice Thomas paid more than his fair share.

The film is purely biographical—Thomas’s brilliant jurisprudence plays no role here—and the justice’s somberly eloquent, slightly melancholy recounting of his saga as he faces the camera directly, dark-suited, with starched white shirt and monochromatic necktie, closely follows the style of his bestselling memoir, My Grandfather’s Son. But as Thomas tells his story, Pack shows us haunting images, over a nostalgically evocative American musical score—bluegrass guitars and banjos, jazz, and Louis Armstrong longingly singing “Moon River” (with lyrics by Savannah-born Johnny Mercer, Thomas reminds us)—that bring it all even more vividly to life than the excellent memoir does. The film clips of the mazy creeks around Thomas’s birthplace, the coastal Georgia hamlet of Pin Point—founded by freed slaves just after the Civil War—sometimes seen from above, as in the iconic shot toward the end of The African Queen, and sometimes seen as we travel along them in one of the little “bateaux” that the oystermen and crab fishers of that lush and remote outpost on the very edge of America still use, bring home how “far removed in time and space” it was from modern, urban America, as Thomas puts it.

It was a completely different world—a tiny, poor, all-black community of jumbled shacks around the cinderblock workshop where the women picked the crabs and shucked the oysters that the men caught and raked. The still photos Pack found from the 1940s show you a preindustrial world so vanished that it could just as easily be the nineteenth century as the twentieth. Descended from West Africans, Thomas and his neighbors spoke a dialect called Gullah or Geechee, incomprehensible to outsiders; but when Pack shows us a film clip of a woman singing that patois as she feeds her chickens, we grasp viscerally from the creole lilt how this corner of America was a link in Britain’s triangle trade, with ships bringing enslaved Africans to the Caribbean and southern colonies, carrying the sugar north for distillation into rum, and returning to Britain to sell it.

For Thomas and his playfellows, this was a Mark Twain world of improvised games in the woods and swamps, with no such thing as a store-bought toy—until the heartbreakingly tiny, jerrybuilt shack where he lived with his mother, older sister, and little brother burned down. He came home to “just ashes and twisted tin,” he says. “Everything that you ever knew in life is just there—I mean, it’s smoldering.”

01/17/20

What City Journal Wrought


An editor looks back

Autumn 2015

 

The “Lights Out Club” used to meet for monthly lunches in the early 1990s, my late friend Lorian Marlantes, then chief of Rockefeller Center, told me. Why the name? Because Marlantes’s fellow members—the CEOs of Consolidated Edison, a couple of big Gotham banks, and a few other firms whose core business chained them to New York—thought that soon one of them would be the man who’d turn the lights out forever on a city that was dying before their eyes, killing their companies along with it.

In those days, you didn’t need to be Nostradamus to make such a dire prediction. The evidence was everywhere—on the graffiti-scrawled buildings and mailboxes, the potholed streets, the squalor of the panhandlers, the dustbowl that had been Olmsted and Vaux’s sublime Central Park, and the pervasive stench of urine, thanks to the bums who were turning the capital of the twentieth century into a giant pissoir, with the carriage drive of Grand Central Station the urinal of the universe.

In 1983, the Mobil Oil Corporation, to show Mayor Edward Koch why it was contemplating leaving New York, videotaped the sordidness around its 42nd Street headquarters, near Grand Central. The camera caught the rotting trash, the pee-filled potholes, the degradation of the homeless hordes—some crazy and some shiftless—through which Mobil employees had to pick their way into the then-shabby, billboard-plastered station to catch trains home to their orderly suburbs, fragrant with new-mown grass. After shots of corporate headquarters located in similarly bucolic suburbs, the wordless video closed with the written question: “What do we tell our employees?”

Mobil’s answer, in 1987, was to move to Fairfax, Virginia. More than 100 of some 140 Fortune 500 companies headquartered in Gotham in the 1950s asked the same question and reached the same conclusion, pulling out their tax dollars and leading their well-paid workers into greener pastures in those pre–Rudolph Giuliani decades. They were among the million New Yorkers, many of them the elderly rich and the well-educated young, who fled Gotham in the 1970s and 1980s.

The squalor was only one problem. Another was crime. Of course, much of the disorder—the open dope-dealing, the public drinking, the streetwalkers serving every almost-unthinkable taste, the three-card-monte cardsharpers and their pickpocket confederates preying on the crowds they drew, the window-rattling boombox radios—was itself against the law. But these minor crimes deepened as a coastal shelf into burglary, car theft, armed robbery, assault, rape, and murder—one killing every four hours every day of the annus horribilis 1990.

Those New Yorkers who could afford it tried to insulate themselves with doormen and limo services, as in Tom Wolfe’s 1987 bestseller The Bonfire of the Vanities; those who couldn’t, like the protagonist of Saul Bellow’s 1970 Mr. Sammler’s Planet, envied the guarded doors, the trustworthy drivers, the hushed private clubs—islands of civility in a sea of chaos—as they held on to the strap of the lurching, graffiti-fouled bus, watching the pickpocket ply his craft, or walked down their own dark streets, adrenaline rushing at the sound of every footfall.

Just as the crack of a jungle twig cocks every ear, tenses every muscle, and sends birds screaming indignantly into the sky, apprehension was as characteristic a New York feeling as was ambition in those days. If we didn’t quite live in “continuall feare, and danger of violent death,” as in Thomas Hobbes’s state of nature, “where every man is Enemy to every man,” we were sufficiently on edge. And no wonder. One friend, robbed at gunpoint on Broadway of his wallet, which the thief searched for his address, was then marched to his apartment, forced to unlock it, and tied up, while the gunman coolly stuffed everything of value into my friend’s bedsheets and carted it off. For the sheer thrill, a gang of teen girls swarming up from Morningside Park stomped the girlfriend of a fellow graduate student unconscious and blood-drenched in front of the Columbia University president’s mansion one afternoon. A neighbor, pushed into his lobby as he unlocked his building’s unattended front door after a very long day’s work—the typical thief’s M.O. in that era—was not only robbed but also killed. Another friend, raped at knifepoint on a filthy hallway floor in a neighborhood where she had gone for a purpose she never mentioned, had her satisfied assailant ask her for another “date,” a proposal she declined. But in a way, on the street, in the subway, in the parks, we all felt continually violated and continually asked to go through it again. That people were leaving town all around us came as no surprise.

What to do? A Manhattan Institute seminar on Gotham school reform I attended in the late 1980s, as Koch’s 12-year mayoralty drew to a sadly sordid close, caught the temper of the times. Its chairmen were wily national teachers’ union chief Albert Shanker and New York Board of Education president Robert F. Wagner III, a long-valued friend. Maybe we could try X, a panelist suggested. No: union work rules forbade. How about Y? No: the state legislature . . . the budget. . . . And so on for two hours. The profoundly depressing expert consensus: the more you knew about New York, the more you knew that there was nothing nothing nothing we could do to fix a calamitous mess. After all, wasn’t this the “ungovernable city”?

01/6/20

The Last Victorian Sage


Gertrude Himmelfarb, 1922–2019
Myron Magnet
January 2, 2020

Gertrude Himmelfarb, our foremost historian of ideas and one of the nation’s greatest historians of any stamp, died Monday at 97. Though a Washingtonian for the last decades of her long and productive life, the Brooklyn-born Himmelfarb was among the last of a storied band of New York Jewish intellectuals—the “Family,” they called themselves—who joined scholarly erudition to wide-ranging social, political, cultural, and ethical concerns far transcending the merely academic. They wrote for an educated general audience eager for the acuity with which they brought the wisdom and experience of the past to bear on the problems of present-day life. Through much reflection and debate, they’d mostly thought their way through the Trotskyist political correctness that prevailed in their student days to arrive at a liberal Americanism that, in time, metamorphosed into their own brand of conservatism. Now, with wonks and pundits, pedants and ideologues, taking their places, and with the “educated general reader” going extinct, today’s intellectuals seem shallow and dull by contrast.

Acerbic in her impatience with foolishness, Himmelfarb particularly scorned the Marxoid view that people’s beliefs and ideals have no independent reality but are just reflections of the material conditions around them. She rejected social-policy theories that give short shrift to cultural life, ignoring what goes on in people’s minds and hearts as a mere reflection of the real reality—the economic reality that should be the focus of our attention. According to this viewpoint, what people think can’t possibly alter the large forces that shape their lives. What determines individual behavior is the environment, not the content of the mind and spirit of the individual—as in, for example, the belief that crime springs from a lack of opportunity. She wasn’t much more sympathetic to social-policy thinkers who consider individuals the authors of their own actions and fates only to the extent that they choose rationally among various economic incentives—a welfare check versus a minimum-wage job, say. To her, this was just another way of saying that individuals merely respond mechanically to the environment: they don’t shape it.

01/6/20

‘Hate Crime’ Is Only a Step Away From Thoughtcrime


Punishing people, even criminals, for ideas is inimical to the American tradition of free speech.
By Myron Magnet
Jan. 1, 2020

Does it make sense that a person can burn an American flag with impunity but not a gay-pride flag? Earlier this month, a judge in Story County, Iowa, sentenced Adolfo Martinez to a preposterous 16 years in prison for swiping the rainbow flag from a nearby church and burning it in front of a strip club.
Mr. Martinez, 30, has a long criminal history, which partly explains the long sentence. He had two felony convictions, and Iowa law deems any three-time felon an “habitual offender,” subject to enhanced sentencing. But a jury convicted Mr. Martinez of three misdemeanors—third-degree arson, for which the maximum penalty is two years in prison, along with third-degree harassment and the reckless use of fire, each subject to a maximum one-year term.
Mr. Martinez complicated his own defense by telling a local TV station that he had torched the flag because he didn’t like gay people and had “burned down their pride, plain and simple.” In response, the judge increased the misdemeanor arson charge to a hate-crime charge—a felony, normally carrying a maximum of five years in prison. So what seemed on its face to be a minor infraction suddenly became Mr. Martinez’s strike three, inflating his five-year maximum to 15, plus an extra year for the reckless use of fire.
The absurdity of the sentence points up the larger absurdity of hate crimes as a class of criminal offense. Burning an American flag, the Supreme Court says, is free speech. The First Amendment allows you to register disapproval of the government in whatever expressive way you choose, though watch out for the arson laws. Calling the cops “pigs” or singing “F— da Police”? Also no problem, legally speaking. Unlike Canada, Europe and American colleges, the U.S. doesn’t have “hate speech” laws.
The idea that free speech means free speech is a jewel of American exceptionalism. It’s odious and moronic to deny the Holocaust, but it isn’t—and shouldn’t be—a crime. The New York Times didn’t clutch its pearls when Hillary Clinton dismissed Donald Trump’s supporters as “deplorables” who are “irredeemable” and “not America.” Nor did the guardians of correct opinion blanch when Barack Obama disparaged a large number of Americans as troglodytes clinging to their guns and religion. Rep. Ilhan Omar is entirely at liberty to explain away support for Israel as being “all about the Benjamins, baby.” Robert De Niro is similarly free to give the finger to Mr. Trump and his supporters. All this is as American as apple pie, if less appetizing.
Designating an offense as a hate crime criminalizes not the action but the idea that supposedly impelled it. Here we are but a step away from the “thoughtcrime” George Orwell described in “1984.”
Properly, the law should ask only two questions about your state of mind. First, do you have the faculty of reason that allows you to distinguish right from wrong? Second, did you intend to do the crime you committed? Beyond that, as James Madison repeatedly insisted, you have freedom of conscience. You can believe whatever you want, however politically incorrect—especially since today’s political correctness may be deemed tyranny in retrospect. In a far-flung republic composed of various subgroups, multiple viewpoints and interests are bound to proliferate. Under such circumstances, toleration is required.
The New York area has experienced a rash of what Gov. Andrew Cuomo denounces as “hate crimes.” Swastikas have been scrawled in largely Orthodox Jewish neighborhoods. Adolescent thugs have assaulted Hasidim on the streets. In mid-December three customers and a cop were murdered in an attack on a Jersey City, N.J., kosher market. On Saturday, a madman stabbed five people at the home of a rabbi in Monsey, N.Y., north of the city.
I abhor these offenses, but I don’t see what is gained by Mr. Cuomo’s apoplectic imprecations. These outrages don’t presage pogroms, and it seems a fair bet that the perpetrators don’t know what the Holocaust was. Did it matter to the victims whether their assailants attacked them to steal their money, express their hostility, or take advantage of their vulnerability? Surely the solution isn’t relabeling but rather energetic and activist policing of the kind that discouraged violent acts by ill-socialized adolescents and street-dwelling crazies in New York for 20 golden years. Proactive policing also largely rid the streets of graffiti, offensive symbols included.
Let cops vigorously enforce existing laws against assault, harassment, vandalism, arson and the like. If the harassment amounts to an organized campaign of repression rather than random acts of delinquents or lunatics, then it’s time to dust off the Reconstruction Era’s antiterrorism laws. No group, whether Klansmen or members of an antifa mob, should be allowed to threaten or brutalize people.
It’s a sad reflection on the failure of New York’s current political culture, with its recent soft-on-crime legislation and abhorrence of common-sense policing, that ordinary people must think hard about the less appealing alternative of pressing for more teeth in the Supreme Court’s Heller decision, upholding citizens’ Second Amendment right to keep and bear arms for self-defense.

11/8/19

The Court Moves Right

But judges have a lot of unlearning to do.
Myron Magnet
Autumn 2019

Less than a decade ago, surveying the shambles that half a century’s judicial activism and judicial abdication had made of the Framers’ original Constitution, such insightful commentators as Philip Howard and Mark Levin feared that only a new constitutional convention could fix the mess. Not a full replay of the 1787 drama, but something almost as drastic—the amending convention that the Constitution’s Article V outlines. Its terms allow two-thirds of state legislatures to name a council empowered to frame a balanced-budget or income-tax-limit amendment, say, or—most important—to repeal unconstitutional laws, regulations, and Supreme Court decisions. Three-quarters of the legislatures would then need to ratify such measures.

Now, though, the advent of Justices Neil Gorsuch and Brett Kavanaugh has reshaped the Supreme Court enough to stop such despondent talk. While the decisions announced at the end of the Court’s term in June, marking the first year with both new justices on the bench, don’t amount to a stampede toward the Right, they display a wholesome focus on what the Constitution and statutes actually say. The Nine are “redirecting the judge’s interpretive task back to its roots, away from open-ended policy appeals and speculation about legislative intentions and toward the traditional tools of interpretation that judges have employed for centuries to elucidate the law’s original public meaning,” Gorsuch explained in a June opinion. “Today, it is even said that we judges are, to one degree or another, ‘all textualists now.’ ” And that’s already a quiet revolution.

10/1/19

Imprimis


Clarence Thomas and the Lost Constitution
September 2019 • Volume 48, Number 9
Myron Magnet
Author, Clarence Thomas and the Lost Constitution

The following is adapted from a speech delivered on September 17, 2019, at Hillsdale College’s Constitution Day Celebration in Washington, D.C.

Clarence Thomas is our era’s most consequential jurist, as radical as he is brave. During his almost three decades on the bench, he has been laying out a blueprint for remaking Supreme Court jurisprudence. His template is the Constitution as the Framers wrote it during that hot summer in Philadelphia 232 years ago, when they aimed to design “good government from reflection and choice,” as Alexander Hamilton put it in the first Federalist, rather than settle for a regime formed, as are most in history, by “accident and force.” In Thomas’s view, what the Framers achieved remains as modern and up-to-date—as avant-garde, even—as it was in 1787.

What the Framers envisioned was a self-governing republic. Citizens would no longer be ruled. Under laws made by their elected representatives, they would be free to work out their own happiness in their own way, in their families and local communities. But since those elected representatives are born with the same selfish impulses as everyone else—the same all-too-human nature that makes government necessary in the first place—the Framers took care to limit their powers and to hedge them with checks and balances, to prevent the servants of the sovereign people from becoming their masters. The Framers strove to avoid at all costs what they called an “elective despotism,” understanding that elections alone don’t ensure liberty.

Did they achieve their goal perfectly, even with the first ten amendments that form the Bill of Rights? No—and they recognized that. It took the Thirteenth, Fourteenth, and Fifteenth Amendments—following a fearsome war—to end the evil of slavery that marred the Framers’ creation, but that they couldn’t abolish summarily if they wanted to get the document adopted. Thereafter, it took the Nineteenth Amendment to give women the vote, a measure that followed inexorably from the principles of the American Revolution.

During the ratification debates, one gloomy critic prophesied that if citizens ratified the Constitution, “the forms of republican government” would soon exist “in appearance only” in America, as had occurred in ancient Rome. American republicanism would indeed eventually decline, but the decline took a century to begin and unfolded with much less malice than it did at the end of the Roman Republic. Nor was it due to some defect in the Constitution, but rather to repeated undermining by the Supreme Court, the president, and the Congress.

The result today is a crisis of legitimacy, fueling the anger with which Americans now glare at one another. Half of us believe we live under the old Constitution, with its guarantee of liberty and its expectation of self-reliance. The other half believe in a “living constitution”—a regime that empowers the Supreme Court to sit as a permanent constitutional convention, issuing decrees that keep our government evolving with modernity’s changing conditions. The living constitution also permits countless supposedly expert administrative agencies, like the SEC and the EPA, to make rules like a legislature, administer them like an executive, and adjudicate and punish infractions of them like a judiciary.

To the Old Constitutionalists, this government of decrees issued by bureaucrats and judges is not democratic self-government but something more like tyranny—hard or soft, depending on whether or not you are caught in the unelected rulers’ clutches. To the Living Constitutionalists, on the other hand, government by agency experts and Ivy League-trained judges—making rules for a progressive society (to use their language) and guided by enlightened principles of social justice that favor the “disadvantaged” and other victim groups—constitutes real democracy. So today we have the Freedom Party versus the Fairness Party, with unelected bureaucrats and judges saying what fairness is.

This is the constitutional deformation that Justice Thomas, an Old Constitutionalist in capital letters, has striven to repair. If the Framers had wanted a constitution that evolved by judicial ruling, Thomas says, they could have stuck with the unwritten British constitution that governed the American colonists in just that way for 150 years before the Revolution. But Americans chose a written constitution, whose meaning, as the Framers and the state ratifying conventions understood it, does not change—and whose purpose remains, as the Preamble states, to “secure the Blessings of Liberty to ourselves and our Posterity.”

In Thomas’s view, there is no nobler or more just purpose for any government. If the Framers failed to realize that ideal fully because of slavery, the Civil War amendments proved that their design was, in Thomas’s word, “perfectible.” Similarly, if later developments fell away from that ideal, it is still perfectible, and Thomas takes it as his job—his calling, he says—to perfect it. And that can mean that where earlier Supreme Court decisions have deviated from what the document and its amendments say, it is the duty of today’s justices to overrule them. Consequently, while the hallowed doctrine of stare decisis—the rule that judges are bound to respect precedent—certainly applies to the lower courts, Supreme Court justices owe fidelity to the Constitution alone, and if their predecessors have construed it erroneously, today’s justices must say so and overturn their decisions.

09/30/19

Misjudging Clarence Thomas

Corey Robin’s assessment of the Supreme Court justice is lost in left field.
Myron Magnet
September 29, 2019
The Enigma of Clarence Thomas, by Corey Robin (Metropolitan Books, 320 pp., $27)

What deliciously ironic wit the New Yorker’s first art editor, Rea Irvin, deployed in his iconic drawing of Eustace Tilley, the Regency dandy quizzically inspecting a butterfly through a monocle on the magazine’s inaugural cover nearly a century ago. Ah yes, we Gotham cosmopolites view the rest of America as exotic insects worth a moment’s gaze as they hatch from the basket of deplorables and flit by for their 24 hours in the sun. But, Irvin hinted, what an affected fop is Eustace himself—as showy as the bright creature catching his glance but oh, how much more contrived in his top hat and impossibly high neckcloth. I can’t help wishing that Corey Robin, a Brooklyn College professor who has made a career of turning a supercilious monocle on conservatives and explaining their curious, “reactionary” ideas to his fellow enlightened “progressives,” had shown a scintilla of Irvin’s wry self-knowledge in his new book, The Enigma of Clarence Thomas, an excerpt of which the New Yorker coincidentally has just published. But since Robin’s assessment of the Supreme Court justice lacks a single self-questioning moment, let’s look back at him through his monocle and take our own measure of the author before we consider his account of our era’s greatest jurist.

How fashions have changed! Despite a modish dash of race, class, and gender, today’s New Yorker of refined sensibility, if Robin is a representative specimen, presents himself in his book as a conventional socialist, an admirer of the French rather than the American Revolution, and still mooning with nostalgia for that imaginary 1960s “revolution” that Bernie Sanders has dreamt of since his long-ago youth. In Robin’s vision, politics centers on the “power the state will have to involve itself in the affairs of the citizens,” making “rules for a more just and humane economy.” It is a realm of “democratic transformation, where men and women act deliberatively and collectively to alter their estate,” led by the “heroic action of an elite few,” masters of “the arts of persuasion, the mobilization and transformation of popular belief”—though Robin’s evocation of the Robespierres and Lenins of the world is bound to make one wonder just how democratic his vision of the popular will really is. What were the editors of the publication for which he writes a column thinking when they called it Jacobin, after a political elite that wrought its social transformation by removing the heads of those of the wrong class or opinion?

For Robin, capitalism is a system of “overwhelming, anti-democratic constraint” that takes “the great questions of society—justice, equality, freedom, distribution—off the table of public deliberation,” shielding them from “the conscious and collective interference of citizens acting through their government.” In this collectivist vein, he casts a cold eye on Madison’s classic formulation of American constitutionalism in Federalist 10. The Constitution protects life, liberty, and property, Madison writes, and since individual citizens have a boundless variety of talents, ambitions, and energies, the liberty the Constitution safeguards will result in different and unequal outcomes, including economic inequality. The danger in the democratic republic that the Constitution frames, Madison wrote, is that the unpropertied majority could use their voting numbers to expropriate the wealth of the rich few, trampling the Constitution’s protection of property. Such an expropriation is what Madison meant by the tyranny of the majority, and a key goal of the Constitution’s checks and balances is to forestall just that. When Robin holds up Justice Thomas’s citation of Madison’s argument as a mere ploy “to moralize moneymaking, to lend the market a legitimacy it had been denied by New Deal liberalism, to shield money and the market from political critique,” he seems to be looking at the Constitution through the wrong end of a telescope, seeing FDR and the New Deal’s tyranny of the majority, rather than James Madison and the protection of individual liberty, as the nation’s real Founding Father. Of the individual citizen whose liberty the Constitution is meant to shield, we hear nary a word until a third of the way through the book, and then only once or twice thereafter. Everyone is simply an atom dissolved in the mass of race, class, or gender.

The lens through which Robin views Thomas is even more distorting—not surprising, given that he “reject[s] virtually all of Thomas’s views” and moreover believes that the justice, during his confirmation hearings, “lied to the Judiciary Committee when he stated that he never sexually harassed Anita Hill,” an allegation that’s now the stock, and thus increasingly incredible, gambit for opponents of conservative judicial nominees. In the justice’s opinions, what Robin sees, as anyone who spends even an hour or two reading them must see, is Thomas’s striking concern with race, a subject that he raises repeatedly, even in cases seemingly far from the question. Upon this observation, Robin erects a wildly far-fetched account of the justice’s worldview and jurisprudence, one that imperiously sweeps away Thomas’s own careful exposition of his intellectual journey in his speeches and memoir as if he must be incapable of understanding his own mind and heart. But of course, this concern springs not just from Thomas’s personal history but also from the belief, central to his jurisprudence, that it’s precisely on race matters that the Court has made so many fateful wrong turns that need correction. Continue reading

06/20/19

Justice Thomas’s Credo

The Constitution, not precedent, is the law of the land.
Myron Magnet
June 19, 2019

One of the most striking aspects of Monday’s Supreme Court decision in Gamble v. United States was Clarence Thomas’s eloquent summary of the core precept of his judicial philosophy: that stare decisis—the venerable doctrine that courts should respect precedent—deserves but a minor place in Supreme Court jurisprudence. His 17-page concurrence in a case concerning double jeopardy, really a stand-alone essay, emphasizes that, in America’s system of government, the “Constitution, federal statutes, and treaties are the law.” That’s why justices and other governmental officers take an oath to “preserve, protect, and defend the Constitution of the United States”—not to safeguard judicial precedents. “That the Constitution outranks other sources of law is inherent in its nature,” he writes. The job of a Supreme Court justice, therefore, “is modest: We interpret and apply written law to the facts of particular cases.” Continue reading

05/27/19

How John Marshall Made the Supreme Court Supreme


Myron Magnet
Spring 2019

His brains and bonhomie forged a band of Federalist brethren.

Most serious American readers know National Review columnist and National Humanities Medal laureate Richard Brookhiser as the author of a shelf of elegantly crafted biographies of our nation’s Founding Fathers, from George Washington and Alexander Hamilton up to our re-founder, Abraham Lincoln. Those crisp, pleasurable volumes rest on the assumption that these were very great men who created (or re-created) something rare in human history: a self-governing republic whose growing freedom and prosperity validated the vision they strove so hard and sacrificed so much to make real. It’s fitting that the most recent of Brookhiser’s exemplary works is John Marshall: The Man Who Made the Supreme Court, for it was Marshall—a junior member of the Founding Fathers, so to speak—who made the Court a formidable bastion of the nation’s founding governmental principles, shielding them from attacks by demagogically inclined presidents from Jefferson to Jackson, until his death in 1835.

It takes all a biographer’s skills to write Marshall’s life, for he left no diaries and few letters or speeches. One must intuit the man’s character from bits and pieces of his own writings, his weighty but wooden biography of George Washington, his judicial opinions, and his contemporaries’ descriptions of him. From these gleanings, however, like Napoleon’s chef after the Battle of Marengo, Brookhiser concocts a rich and nourishing dish.

Born in backwoods Virginia in 1755, Marshall all his life kept a rural simplicity of manner and dress that once misled a Richmond citizen to think him a porter and ask him to carry a turkey home from the market, which the chief justice cheerfully did, refusing a tip for his efforts. Gregarious, athletic, and full of jokes, Marshall in his thirties was the life of the Quoits Club, a select Richmond group dedicated to weekly bibulous good fellowship and a horseshoe-like game played with metal rings, activities at which Marshall excelled. During one barroom game of inventing rhymes on assigned words, he drew “paradox” and, glancing at a knot of bourbon-drinking Kentuckians, promptly declaimed:

In the Blue Grass region,
A paradox was born.
The corn was full of kernels,
And the colonels full of corn.

“In his youth, he gamed, bet, and drank,” a temperate congressman grumbled; yet in old age, the legislator had to drive uphill in his gig, “while the old chief justice walks.”

Service in Washington’s army during the Revolution left Marshall with veneration for his commander in chief—“the greatest Man on earth,” he thought. Like most of his fellow officers, he came away from the war with the beliefs, born from the bone-chilling, stomach-gnawing privation of icy winter quarters, that became the core principles of Federalism once the Constitution was ratified—including by the Virginia ratifying convention, where Marshall played a key role. For its own preservation, the United States needed to be a real union, not a confederation of states, the Federalists held, with a central government powerful enough to fight a war and fund it, without inflicting superfluous suffering on its soldiers.
Continue reading

05/24/19

Thomas and Breyer’s ‘Stare’ Contest

Their sharp disagreement about precedent reflects different worldviews that go far beyond abortion.

By

Myron Magnet

May 22, 2019 6:53 p.m. ET

Justice Clarence Thomas in Washington, Feb. 15, 2018.

Justice Stephen Breyer lamented last week that the Supreme Court had overturned “a well-reasoned decision that has caused no serious practical problems in the four decades since we decided it.” Dissenting from Justice Clarence Thomas’s majority decision in Franchise Tax Board v. Hyatt, Justice Breyer added: “Today’s decision can only cause one to wonder which cases the Court will overrule next.”

Court watchers assumed the two justices were arguing about abortion, although the case had nothing to do with that issue. But the clash over stare decisis—the doctrine that courts must respect precedent as binding—runs far deeper. It is a manifestation of the crisis of legitimacy that has split Americans into two increasingly hostile camps.

On Justice Thomas’s side is the belief that the government’s authority rests on the written Constitution. This view regards a self-governing republic—designed to protect the individual’s right to pursue his own happiness in his own way, in his family and local community—as the most just and up-to-date form of government ever imagined, even 232 years after the Constitutional Convention.

Justice Breyer, by contrast, assumes America is rightly governed by a “living Constitution,” which evolves by judicial decree to meet modernity’s fast-changing conditions. Judges make up law “with boldness and a touch of audacity,” as Woodrow Wilson put it, rather than merely interpreting a Constitution he thought obsolete.

Wilson also established a corps of supposedly expert, nonpartisan administrators in such agencies as the Interstate Commerce Commission and the Federal Trade Commission, to make rules like a legislature, carry them out like an executive, and adjudicate and punish infractions of them like a judiciary. Wilson and Franklin D. Roosevelt, who supersized this system, considered it the cutting edge of modernity in the protection it afforded workers and the disadvantaged. Call it the Fairness Party, as distinct from Justice Thomas’s Freedom Party.

The Freedom Party does not view the rule by decrees of unelected officials, however enlightened, as an advance over democratic self-government. If the framers had wanted such a system, they could have stuck with the unwritten British constitution, which had governed the American colonists for 150 years and evolves by judicial precedent. They wanted a written constitution, strictly limiting federal authority, because they knew that human nature’s inborn selfishness and aggression not only make government necessary but also lead government officials to abuse their power if not restrained.

U.S. history justifies the framers’ caution, as Justice Thomas has argued in hundreds of opinions since joining the court in 1991. At crucial junctures, the Supreme Court has twisted the Constitution that guarantees liberty toward government oppression.

Start with The Slaughter-House Cases (1873) and U.S. v. Cruikshank (1876), which blew away the protection of the Bill of Rights with which the 14th Amendment’s framers and ratifiers thought they had clothed freed slaves against depredations by state governments. The result was 90 years of Jim Crow tyranny in the South. “I have a personal interest in this,” Justice Thomas once said. “I lived under segregation.” He grew up in 1950s Savannah, Ga., where the law forbade him to drink out of this fountain or walk across that park. If the Fairness Party thinks Supreme Court distortions can twist only to the left, it should think again. Far better to stick to the original meaning, as Justice Thomas urges.

Look what happened when the court allowed Congress and the president to proliferate administrative agencies with no political accountability. The justices have “overseen and sanctioned the growth of an administrative system that concentrates the power to make laws and the power to enforce them in the hands of a vast and unaccountable administrative apparatus that finds no comfortable home in our constitutional structure,” Justice Thomas wrote in a 2015 opinion, the first of a series that argued for reining in the administrative state.

Such lawless power ends in tyranny, as in the case of Joseph Robertson. As these pages recently reported, the Montana rancher dug two ponds fed by a trickle that ran down his mountain acres, only to be prosecuted and imprisoned for polluting “navigable waterways,” as absurdly defined by bureaucrats at the Environmental Protection Agency.

Beginning with the Warren Court in the 1950s, bold and audacious justices began making up law out of the Constitution’s “penumbras, formed by emanations”—literally, shadows and gas. As Justice Thomas has objected, the court invented rights that sharply curtailed the traditional order-keeping authority of police and teachers, making streets, schools, and housing projects in poor neighborhoods dangerous, and depriving mostly minority citizens of the first civil right—to be safe. The justices have even trampled the Bill of Rights, sanctioning campaign-finance laws that curtail the political speech at the core of First Amendment protections.

It’s as if the Court respects no limits. Thus the hallmark of Justice Thomas’s jurisprudence is his willingness to overturn prior decisions when he thinks his predecessors have construed the Constitution incorrectly. The justices readily overturn unconstitutional laws passed by a duly elected Congress. Why be more tender toward judicial errors?

“Stare decisis is not an inexorable command,” Justice Thomas observes in Hyatt. He has said elsewhere: “I think that the Constitution itself, the written document, is the ultimate stare decisis.” Justice Breyer asks which cases the court will overrule next. Justice Thomas’s reasonable answer: Whichever ones go against the Constitution.

Mr. Magnet is editor-at-large of the Manhattan Institute’s City Journal, a National Humanities Medal laureate and author of “Clarence Thomas and the Lost Constitution.”

07/22/17

“Let Right Be Done!”

A classic film’s lesson in liberty

July 21, 2017

May I recommend one of my candidates for the Ten Greatest Movies list—The Winslow Boy? What the 1948 British film (not David Mamet’s 1999 remake) has going for it is a brilliant director, Anthony Asquith—who ranks with such luminaries as Carol Reed, Alfred Hitchcock, John Huston, or Jean Renoir—and a stellar cast, which includes some of the most skilled actors in movie history, from Cedric Hardwicke on down, all at the top of their form. But above all these advantages, the movie’s animating spirit is its script, by Terence Rattigan and Anatole de Grunwald from Rattigan’s play, which grippingly dramatizes a principle at the very heart of Anglo-Saxon liberty—a principle that today’s America badly needs to relearn.

The Winslow Boy–and his father

The story, set in 1912—when director Asquith’s father, H. H. Asquith, was Britain’s Liberal prime minister, and World War I was brewing—is simple, and it won’t spoil the movie for you if I sketch its outline. Twelve-year-old Ronnie Winslow gets expelled from Osborne, the prestigious boarding school for cadets headed for Royal Navy commissions, for allegedly stealing five shillings. Though the sum is trivial, the alleged breach of the code of officers and gentlemen is not. His father, Arthur, a newly retired Wimbledon bank manager played by Hardwicke, solemnly asks him if he is guilty—twice—and when the boy twice asserts his innocence, his father, who raised him to tell the truth, vows to vindicate the boy’s honor, whatever the cost.

It proves immense. In his quest, which lasts until after Ronnie turns 14, Arthur sacrifices his health, much of his savings, and the happiness and future of his solidly respectable and eminently likable upper-middle-class family. He meets obstacles at every point. The school’s commandant tells him that, as he had no doubt of Ronnie’s guilt after hearing the details of the theft, he has no second thoughts about summarily expelling the boy, without any formal procedure or even someone to advise Ronnie or speak in his defense. He won’t reconsider the evidence or say what it was. A visit to the Admiralty Commission to threaten a lawsuit gains Arthur only a haughty declaration that he needn’t bother: a subject of the king can’t sue the king’s representatives, for the law holds that the king can do no wrong.

True enough, his solicitor tells him; but nevertheless Magna Carta, the thirteenth-century charter of English liberties, declares that “no subject of the King may be condemned without a trial,” so perhaps Arthur should ask his MP to denounce the wrong done to Ronnie in Parliament. Good advice: for the MP, seeing a chance to win favorable press as a defender of justice, is glad to oblige. Reporters readily take the bait and make the Winslow case a national cause célèbre.

The uproar catches the interest of Sir Robert Morton, England’s most eminent—and expensive—barrister, masterfully played by Robert Donat as a complex mix of eloquence, cold hauteur, ruthless intelligence, and deep but hidden feeling, a legal version of Jane Eyre’s Mr. Rochester. Morton drops in at the Winslows’ house on his way to dinner with a duchess, politely introduces himself, and mercilessly cross-examines Ronnie, until the boy stammers with confusion and his family (along with the audience) wonders if he’s been telling the truth. But after such browbeating, the great man abruptly announces that he’ll take the case, for he thinks Ronnie is innocent. Continue reading

07/23/16

Why Are Voters So Angry?

Summer 2016

They want self-government back.

Haunting this year’s presidential contest is the sense that the U.S. government no longer belongs to the people and no longer represents them. And this uneasy feeling is not misplaced. It reflects the real state of affairs.

We have lost the government we learned about in civics class, with its democratic election of representatives to do the voters’ will in framing laws, which the president vows to execute faithfully, unless the Supreme Court rules them unconstitutional. That small government of limited powers that the Founders designed, hedged with checks and balances, hasn’t operated for a century. All its parts still have their old names and appear to be carrying out their old functions. But in fact, a new kind of government has grown up inside the old structure, like those parasites hatched in another organism that grow by eating up their host from within, until the adult creature bursts out of the host’s carcass. This transformation is not an evolution but a usurpation.

What has now largely displaced the Founders’ government is what’s called the Administrative State—a transformation premeditated by its main architect, Woodrow Wilson. The thin-skinned, self-righteous college-professor president, who thought himself enlightened far beyond the citizenry, dismissed the Declaration of Independence’s inalienable rights as so much outmoded “nonsense,” and he rejected the Founders’ clunky constitutional machinery as obsolete. (See “It’s Not Your Founding Fathers’ Republic Any More,” Summer 2014.) What a modern country needed, he said, was a “living constitution” that would keep pace with the fast-changing times by continual, Darwinian adaptation, as he called it, effected by federal courts acting as a permanent constitutional convention. Continue reading

04/25/16

The End of Democracy in America


Tocqueville foresaw how it would come.
Myron Magnet
Spring 2016

Alexis de Tocqueville was a more prophetic observer of American democracy than even his most ardent admirers appreciate. True, readers have seen clearly what makes his account of American exceptionalism so luminously accurate, and they have grasped the profundity of his critique of American democracy’s shortcomings. What they have missed is his startling clairvoyance about how democracy in America could evolve into what he called “democratic despotism.” That transformation has been in process for decades now, and reversing it is the principal political challenge of our own moment in history. It is implicitly, and should be explicitly, at the center of our upcoming presidential election.
Readers don’t fully credit Tocqueville with being the seer he was for the same reason that, though volume 1 of Democracy in America set cash registers jingling as merrily as Santa’s sleigh bells at its 1835 publication, volume 2, five years later, met a much cooler reception. The falloff, I think, stems from the author’s failure to make plain a key step in his argument between the two tomes—an omission he righted two decades later with the publication of The Old Regime and the French Revolution in 1856. Reading the two books together makes Tocqueville’s argument—and its urgent timeliness—snap into focus with the clarity of revelation.

Alexis de Tocqueville in 1850

Continue reading

02/21/16

Liberty—If You Can Keep It

Yes, it does demand eternal vigilance.

Myron Magnet
Winter 2016

The gates of Auschwitz—with their demonic jeer, “Work Makes You Free”—led to history’s vilest demonstration of everything freedom isn’t.

Isn’t a sexual revolution a kind of revolution?” a Soviet dissident, the grandson of one of Stalin’s henchmen, asked me rhetorically in the mid-1970s. Recently released from five years’ Siberian exile, he certainly knew what slavery and tyranny were. But now, he wondered, couldn’t the waning of Russia’s sexual constraints be the harbinger of wider liberty? After all, he asked hopefully, “Isn’t sexual freedom, freedom?”

It didn’t turn out that way. So impoverished was the Soviet empire that it couldn’t give its subjects the bread and circuses that pacified imperial Rome’s populace; so, to the cheap vodka drastically shortening Russian life spans, it added lascivious license. Drunken stupor; moments of voluptuous rapture: that’s escape, not liberty. Continue reading

07/4/15

The Vision of the Founding Fathers


What kind of nation did the Founders aim to create?
By Myron Magnet — July 3, 2015

Men, not vast, impersonal forces — economic, technological, class struggle, what have you — make history, and they make it out of the ideals that they cherish in their hearts and the ideas they have in their minds. So what were the ideas and ideals that drove the Founding Fathers to take up arms and fashion a new kind of government, one formed by reflection and choice, as Alexander Hamilton said, rather than by accident and force?

Signing of the Declaration of Independence, John Trumbull


The worldview out of which America was born centered on three revolutionary ideas, of which the most powerful was a thirst for liberty. For the Founders, liberty was not some vague abstraction. They understood it concretely, as people do who have a keen knowledge of its opposite. They understood it in the same way as Eastern Europeans who have lived under Communist tyranny, for instance, or Jews who escaped the Holocaust. Continue reading

06/13/15

Free Speech in Peril

Spring 2015

Trigger warning: may offend the illiberal or intolerant

Shut up or die. It’s hard to think of a more frontal assault on the basic values of Western freedom than al-Qaida’s January slaughter of French journalists for publishing cartoons they disliked. I disagree with what you say, and I’ll defend to the death my right to make you stop saying it: the battle cry of neo-medievalism. And it worked. The New York Times, in reporting the Charlie Hebdo massacre, flinched from printing the cartoons. The London Telegraph showed the magazine’s cover but pixelated the image of Muhammad. All honor to the Washington Post and the New York Post for the courage to show, as the latter so often does, the naked truth.

The Paris atrocity ought to make us rethink the harms we ourselves have been inflicting on the freedom to think our own thoughts and say and write them that is a prime glory of our Bill of Rights—and that its author, James Madison, shocked by Virginia’s jailing of Baptist preachers for publishing unorthodox religious views, entered politics to protect. Our First Amendment allows you to say whatever you like, except, a 1942 Supreme Court decision held, “the lewd and obscene, the profane, the libelous, and the insulting or ‘fighting’ words—those which by their very utterances inflict injury or tend to incite an immediate breach of the peace,” though subsequent decisions have allowed obscene and profane speech. A 1992 judgment further refined the “fighting words” exemption, ruling that the First Amendment forbids government from discriminating among the ideas that the fighting words convey, banning anti-Catholic insults, for example, while permitting slurs against anti-Catholics. In other words, government can’t bar what we would now call “hate speech”—speech that will cause “anger, alarm or resentment in others on the basis of race, color, creed, religion or gender.”
This expansive freedom prevails nowhere else on earth. European countries, and even Canada, have passed hate-speech laws that criminalize casual racial slurs or insults to someone’s sexual habits. An Oxford student spent a night in jail for opining to a policeman that his horse seemed gay. France, which has recently fined citizens for antigay tweets and criminalized calls for jihad as an incitement to violence—a measure that our First Amendment would allow only if the calls presented a “clear and present danger”—also (most improperly) forbids the denial of crimes against humanity, especially the Holocaust. The pope has weighed in as well, with the platitude that no one should insult anyone’s religion—or his mother. Continue reading

05/30/15

Magnet School


THE CORNER
by JAY NORDLINGER May 27, 2015 2:36 PM
My Impromptus today is kind of unusual. (I know, no different from the norm.) What are the least overrated places you know? In other words, places about which the hype is true. And what are the most overrated? I brought up this topic a couple of weeks ago, and, today, I report reader responses. One of those responses is this: Least overrated: Mount Vernon. Warm, approachable, understandable. Most overrated: Monticello. As much as I love Jefferson, his home leaves me cold, especially when compared with Mount Vernon.
I brought this opinion — this pairing — to the attention of Myron Magnet. Why? Well, Myron knows about everything. But he is especially knowledgeable in this area, as the author of The Founders at Home: The Building of America, 1735-1817. He was good enough to write a comment, which I’m so pleased to share with you.
“The reader’s comparison surprised me,” he begins. “In truth, both houses are profoundly moving to visit, haunted as they are by spirits of the great statesmen and amateur architects who, as a lifetime hobby, spent years planning, building up, repairing, perfecting these outward embodiments of their inner vision of the kind of domestic life they were building a nation to make possible. By contrast with your correspondent, in politics I love Washington, while the only Jeffersonian political principle I agree with is that all men are created equal. So I like the Burkean approach Washington took to enlarging and improving Mount Vernon, not altering structures that worked fine as he added new and improved sections of the house. The result is a house that, for all its attempts to look classically symmetrical, is endearingly lopsided, with the rooftop lantern 18 inches off center, and a different number of windows under each half of the pediment over the entrance portico. Jefferson, by contrast, is a rationalist’s rationalist, with the plan of Monticello an endlessly interesting, complex, but always symmetrical puzzle of abstract geometrical shapes forming a brilliantly harmonious whole. Well, I like rationalism — in architecture, if not in politics, where it led Jefferson to his monstrous views on the French Revolution. There is, however, one truly disturbing thing about Monticello, and that’s the care and trouble Jefferson took to hide the economic reality of slavery that supported the whole operation, putting the service wings half-underground and devising ways to bring food and wine into the dining room without a human being having to carry it in. I suppose one should give him credit at least for being ashamed of slavery. As Dr. Johnson said of that proto-Darwinian, the Scotch judge Lord Monboddo, who believed that men were descended from monkeys, ‘If one has a tail, one should take pains to conceal it; but Monboddo flaunts his with pride.’”

Read more at: http://www.nationalreview.com/corner/418942/magnet-school-jay-nordlinger

02/19/15

What Must We Think About When We Think About Politics?

Winter 2015
Man is a political animal, but he is much more.
A headless body in a topless bar would not have surprised political philosopher Thomas Hobbes.

The late political scientist James Q. Wilson used to caution, with his elegant precision, that it’s not enough to have political opinions. You also need facts—which, for him and his brilliant colleagues at The Public Interest of the 1960s and 1970s, meant data. You think this policy will produce that outcome? Okay, try it—and then measure what happens. Did you reduce poverty? Raise test scores? And you had also better comb the data for consequences you neither expected nor intended, for all policies must stand or fall by the totality of their results. Remember, too, Wilson and his colleagues used to insist, that correlation is not causation: if two things alter more or less in tandem, that doesn’t by itself prove that one of the changes produced the other. They may be independent of each other, or some as-yet-unnoticed third force may have sparked both of them. Data don’t speak for themselves but require interpretation—which may or may not be correct. It’s art, not science.

This warning proved a powerful corrective to the liberal ideology about social policy that reigned in the 1960s—pious, unproved platitudes about “root causes” that gave birth to the War on Poverty, whose dire consequences, including an ever-more-deeply entrenched underclass, still bedevil America. But Wilson’s rigor tones up only one of the areas where political thought and discourse tend to be flabby. At least two more elements, well known to political philosophers since antiquity but often ignored today, are essential to intelligent political thinking. You have to have some understanding of psychology—of the minds and hearts that motivate the individuals who are the stuff of politics—and you have to know something about culture, the thick web of beliefs and customs that shape individuals and their social world at least as much as public policies do. Continue reading