
The Public Good

 

[Image caption: a worker at a factory learning that his plant is being shut down and his job is gone.]

 

 I worry that the idea of a common good is declining. Suddenly, for example, that dour intellectual battleaxe of the 1950s, Ayn Rand, has found an enthusiastic new audience among young adults. This is the same Ayn Rand who identified self-interest as the highest good and preached that caring about others was a fake value invented by the contemptible weak as a means of hobbling the heroic strong.

Somehow, her ideas have acquired a patina of cool.

I was fulminating about all this the other day, sounding, I’m sure, like the crusty old codgers of my youth. Picture skinny old men shaking their canes and yelling in high-pitched, cracked voices, “Young people today! No respect!”

My wife heard my fulminations and took me to task. “Young people are no more selfish than they ever were,” she said. “In fact, less so. Just look at websites like Kickstarter and Kiva and Indiegogo, and how popular they are.” For anyone who doesn’t know, these websites let anyone seeking money for a cause connect with people who want to donate (or lend) money to their exact cause. And it’s working. People really are getting funding for all kinds of good works, and a lot of it is coming from the young; maybe most of it.

But I never disputed the idealism. I’m not saying young people are getting more selfish. I know lots of young adults who have compassionate feelings and want to reach out. They just want to choose who they reach out to. They want their giving to reflect who they are. Helping others becomes, to some extent, an act of self-definition, self-realization. Self-expression.

Which is fine. But I’m just saying, the social compact of old offered a different proposition. It proposed that individuals relinquish their idea of themselves as the center of the universe and see themselves as smaller parts of a greater whole, a society whose collective promise was that no one would be left (entirely) behind.

At the leftist end of the axis, that compact was expressed as socialism. And when I was young, though “communist” might have been a curse word to older folks in mainstream America, calling someone a “socialist” was no worse than calling them “European.” Many young people cheerfully embraced “socialist” as a label. They saw no stigma in it. In many quarters, “socialist” had a positive connotation. It meant you believed it was right to care about the well-being of the whole society and that you had a duty to contribute to that well-being. Giving money to a beggar was fine, but it was merely charity. Fighting for a social program that would help thousands was on a higher plane, more noble, and that’s what being a socialist was all about, that struggle.

That’s the thing that’s vanishing, it seems to me. In its place, rising up like swamp gas, is a notion that the whole will take care of itself if only every individual looks out for his or her own interest vigorously and competitively, giving no quarter and asking for no help. Seeking the well-being of one’s own individual self is what has glamour now.

I overheard a conversation between two twenty-somethings in a bookstore one day about an election. The guy was telling the woman that he was not going to vote for a certain candidate.

Why not? she asked. After all, the candidate had the right stand on many issues; and she went on to list positions of which she and her guy both evidently approved.

Yes, the guy admitted, “but on the other hand…” And he cited a list of issues on which the candidate was at odds with him. In fact, declared this fellow, he had decided not to vote at all, because: “There just isn’t any candidate out there who really represents ME.”

I thought about his expectation. I thought about the implication that the only candidate worth voting for is one whose preferences and positions exactly match your own. At some level (polls tell us) that is what many voters look for in a candidate now—a surrogate self: someone who “represents” them by looking, sounding, talking and thinking exactly as they do.

I have to say, I’m not one of those voters. A candidate who held exactly the same positions and preferences as me would be ineffective. And a candidate exactly like me would be a disaster. I’m good at some things, but I know I’d be no good at being president. Or vice president. Or the Senator from California. Or dogcatcher of a small town. I’m looking for someone whose positions and approaches I can approve of in the main, and who also, in my judgment, would be able to work with enough different people to effect some worthwhile changes and who could take decisive but judicious action when needed.

To me, if you’re looking for a candidate exactly like yourself, you’re looking at voting as a form of self-expression.

What strikes me is the way this development in politics mirrors a modern trend driven by technology. This goes back to the algorithms that power all search engines. These identify the preferences of the person searching and offer them (the algorithm’s best guess of) what they’re looking for and also of what else they might like.

As some hi-tech professional once put it (I forget who or where), “each person who visits Amazon.com enters a bookstore visited by no other person on Earth.”

That’s because anyone with a history of purchasing books on Amazon is offered a range of books that have been selected by the search engine based on that consumer’s earlier choices. The same is true of Netflix, Pandora, YouTube, et al.

The same is true of Google: every single person who seeks information from Google gets a different set of options. When I Google the term “Egypt,” I get lots of information about the Muslim Brotherhood, the Egyptian elections, the Arab Spring, etc. When a friend of mine Googles the same term on her computer, she gets a list of websites about mummies, the temples at Luxor, airlines offering bargain flights to Cairo, etc.

But that’s not the worst of it. Another friend of mine has enthusiastically embraced the idea of “seasteading”—of building floating cities on the ocean and declaring them sovereign countries. He tells me the idea is catching on wildly; there’s a virtual prairie fire of enthusiasm about it in the country. “Just Google seasteading,” he urged.

He said this because when he Googles the term he gets endless lists of blogs that rave about seasteading. When I Google that term, I get sites on which people are ranting about how naive, dopey, and possibly unethical the idea is.

Here’s the creepy thing. I got these sites the first time I Googled “seasteading.” The list Google gave me wasn’t derived from choices I had previously made about this term. Somehow, Google’s algorithm had an opinion about my opinion of seasteading. It turns out that Google’s algorithm has its opinion about my opinion of any topic I might look up, every topic I might ever look up.
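For readers curious about the mechanics, here is a minimal sketch of how preference-based ranking can work. It is a deliberately toy model, built on nothing more than keyword overlap, and every name in it (build_profile, rank_for_user, the sample profiles) is hypothetical; real search and recommendation engines rely on far richer signals. Still, it shows how two people asking the same question can be handed differently ordered answers.

# A toy illustration of preference-based ranking, assuming a simple
# keyword-overlap model; real engines use far more elaborate signals.
from collections import Counter

def build_profile(past_queries_and_clicks):
    # Count the words a user has searched for or clicked on before.
    profile = Counter()
    for text in past_queries_and_clicks:
        profile.update(text.lower().split())
    return profile

def rank_for_user(results, profile):
    # Order candidate results by how strongly they overlap with the profile.
    def score(result):
        return sum(profile[word] for word in result.lower().split())
    return sorted(results, key=score, reverse=True)

results = [
    "seasteading critics call floating cities naive and unethical",
    "seasteading pioneers rave about sovereign floating cities",
]
skeptic = build_profile(["politics ethics naive utopian critique"])
enthusiast = build_profile(["startup libertarian floating cities sovereign"])

print(rank_for_user(results, skeptic)[0])     # the critical page ranks first
print(rank_for_user(results, enthusiast)[0])  # the enthusiastic page ranks first

Even in this toy version, neither user ever sees how the other’s list was ordered, which is exactly the effect described above.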

What does this mean? To me it means that we’re slowly losing the capacity to see what the universe looks like from any place except where we are standing. As people spend less and less time interacting live with communities of other people in problem-solving settings—in offices, schools, town hall meetings, union sessions, conferences, and so on—and let their interactions with the world be mediated increasingly by search and information technology and its algorithms, this trend will speed up. Every person will in fact be the center of the universe.

Politically, my whole life I have been committed to the notion of a public good and to the idea that each of us has a duty to contribute to it. But that enterprise depends on a common vision that all the members of a society can enter into. Politics is partly about building that common vision. I fear for the prospects of such a politics in a world from which the very idea of a public good has vanished and nothing remains but private interests duking it out in a competition of all against all.



Not Your (Founding) Father’s Democracy

 

 

 

Another gut-wrenching presidential campaign season screams into full gear. What a process! Why on Earth did the founders ever craft such a system?

Actually, they didn’t. The process we are in the middle of bears little resemblance to the one that put George Washington in office. For better or for worse, huge innovations have entered the system. Here (as I see it) are the ten biggest changes. I wrote this column two elections ago, so I don’t include here the impact of social media and the Internet in general. That’s material for a whole other column to come.

1. Today we have a popular vote.

In the first 34 years of our republic (spanning the terms of five presidents) we had no popular vote to speak of. Then as now, presidents were chosen by the electoral college, as mandated by the constitution, but at first, the electors in many states were simply appointed by state lawmakers. So, in California, for example, the state legislature would meet behind closed doors and pick a slate of electors, and those electors would decide whom Californians wanted for president. Gradually, however, states came around to letting voters pick electors, the system we have today. The first time enough states did this to make a popular vote even worth recording was 1824. (A total of 356,035 ballots were cast for president that year.)

2. Today we have political parties.

The constitution never mentions political parties. The founders thought they would be divisive and hoped to prevent any from forming. In their vision, the nation’s top leader would be chosen from amongst eminent personalities who had proven themselves above all special interests. The process would simply entail selecting the most capable of all the available sages. The founders thought such a lineup existed and always would.

They were naïve, of course. Today, no one can seriously run for president unless they belong to a party; and political parties by nature represent subsets of the nation, not the nation as a whole. A presidential election today represents a struggle between conglomerations of interest groups—rural vs. urban, oil interests vs. environment, and so on.

3. Today we have presidential campaigns.

This wasn’t part of the original plan. The founders considered “vote-chasing” undignified. Of course, supporters of early presidential hopefuls did write diatribes and polemics on behalf of their heroes, but George Washington held no campaign rallies. That I Like Tom button you’ve been hoarding probably references Tom Arnold, not Thomas Jefferson. Vote-chasing did not come into full bloom until the election of 1840. Not coincidentally, that was the first year a nationwide popular vote existed.

4. It now takes money to win the presidency.

Washington spent virtually nothing to become president. The next few candidates incurred only small costs—small enough to handle out of their own and their friends’ pockets. Really big money didn’t pour into presidential campaigns until after the Civil War. A crucial turning point came in 1896, when William McKinley’s campaign manager basically invented systematic fundraising. That year, McKinley raised and spent about seven million dollars to his opponent’s piddly $650,000. This year, according to the Financial Times of London, the two presidential candidates have spent over $1.2 billion between them. Whatever else a presidential election may be, it’s now a contest between fundraising honchos.

5. Persuasive techniques developed for business are used in politics now.

In the distant past, advertisers were in charge of herding existing demand toward their clients’ products. The advent of television, and the rise of “Madison Avenue,” brought a subtle change. Now advertising professionals took on the task of creating demand. In the 1950s, advertisers made the heady discovery that they could actually do this—motivate people to buy things they did not start out wanting. Political campaign professionals were quick to draw on the expertise of Madison Avenue to create, shape, mold, and herd public opinion. This tends to blur the boundary between what we think and what political professionals want us to think. Whatever else it may be, a presidential election is now a contest between marketing teams.

 


6. Today, candidates come to us in “bite-sized” portions.

It’s part of the effect of advertising in politics, but I think this one deserves separate mention. In 1952, Dwight D. Eisenhower’s campaign hired ad whiz Rosser Reeves straight out of Madison Avenue. Reeves had invented the slogan “melts in your mouth, not in your hand” for M&M’s (one of the century’s 15 greatest ad slogans, according to many advertising experts), and Eisenhower’s team thought Reeves might do for Ike what he had done for candy. Reeves happened upon a seminal idea called “spot advertising.” He saw that moments of time were for sale between hit shows on television. He could buy those “spots” for small bucks and thereby reach the huge audiences built at a cost of millions by the big companies that sponsored the shows. The only catch: he had to deliver a message in 30 seconds or less. Reeves made a series of “spot ads” for Ike that compressed a town-hall-meeting feeling into a 30-second clip. Today’s presidential campaigns consist largely of “spot ads,” “sound bites,” and the like.

7. Today the candidates interact with voters through mass media. 

About sixty years ago, technology made it possible for candidates to speak to millions at one time through radio and television. Frank Merriam, who ran for governor of California in 1934, was the first to really exploit the political potential of mass media—he used radio advertising (and fake newsreels) to squash populist Upton Sinclair. 

Today, the bulk of the money raised by presidential candidates goes into mass media buys. One consequence of addressing millions at once is that candidates have to deliver least-common-denominator messages. However… 

8. Mass media appeals are now filtered through “narrowcasting.” 

Mass media still rules, but so many forms of media now exist that campaigns can deliver tailored messages to different target audiences. Viewers experience these ads as mass appeals—as what the candidate is broadcasting to everybody. Actually, different demographic segments see slightly different messages. What’s more, the direct-mail industry has databases from which it can assemble lists of individuals fitting particular profiles based on the products they buy, the television shows they watch, the work they do, etc. By mail and phone, therefore, particularized messages can be delivered to each individual, appropriate to his or her opinions and leanings. The Internet will undoubtedly promote this trend.

9. Polling has come to permeate the election process. 

Scientific polling was invented in the 1920s as an instrument of business, but it didn’t enter politics until the late 1930s, when Franklin D. Roosevelt began using a private polling service. At that point, polling was still a one-way process: the president would give a speech and then see how it went over. 

In the election of 1960, however, the Kennedy campaign began running polls in a given area before the candidate’s appearances and using the results to write the speeches he would give there—which changed the function of polling. By 1976, Jimmy Carter’s key campaign advisors included a pollster, Pat Caddell. Reagan followed suit and brought his pollster into the White House to help him govern. All these precedents have endured.

Meanwhile, pollsters have refined their techniques through the use of “focus groups.” These are small groups of people selected to mirror a particular demographic profile. Campaign professionals sit down for in-depth discussions with a focus group to get behind mere numbers and draw out people’s underlying emotions and unconscious leanings. In 1984, for example, focus group research helped Mondale discover that Gary Hart’s supporters felt uneasy about Hart’s ability to handle an international crisis. Ads based on that research stopped Hart’s momentum.

Polling enables candidates to tell the voters what they want to hear. As a result, voters cannot tell what the candidates really think. Yet the opinions politicians glean from voters may be the very ones their own campaigns have planted out there, through advertising. In combination, then, polling and opinion management create a hall of mirrors in which no one knows what anyone really thinks.

10. Today political consultants run presidential campaigns. 

Once upon a time, people who wanted to be president gathered a group of supporters and molded them into a staff of loyalists who did the tasks needed to get their man elected. 

Then in the early 1930s, a husband-and-wife team in California, Clem Whitaker and Leone Baxter, set up the first political consulting firm. They offered clients a complete package of campaign services, from developing strategy to writing speeches to catering fundraising dinners—in short, they turned campaigning into a paid service separable from any particular candidate or cause, just like lawyering or advertising.

Political consultants now dominate elections at every level. At this point, they still remain vaguely associated with one side or the other of the political spectrum, but when fierce Democratic hired gun James Carville can marry his equally fierce Republican counterpart Mary Matalin, you know that electing a candidate exists today as a content-free abstraction, a craft in itself, independent of any particular worldly goal.

And yes, there is an American Association of Political Consultants, and yes, they are holding an awards banquet in 2005 to hand out “Pollies” for the best political consulting of the past year. Whatever else it might be, a presidential election is now a race to win a Pollie.



The Case for Liberal Arts  

 

 

It’s too soon to write obituaries for the classic, residential, liberal arts college. Applications at my own alma mater, Reed College, are up. Ditto for Haverford, Williams, and all their ilk.

But why would anyone pay for an education that provides no concrete job skills?

Seven arts

The answer traces back to the first European universities. Those universities had no founders but formed spontaneously because scholars gravitated to places with books and students gravitated to places with scholars. The University of Paris, for example, grew out of the community of learners around Notre Dame cathedral.

Early on, this first university organized learning into four colleges. Every student had to first get through the College of Art. Those who did were titled “beginners” or, in Latin, “baccalaureates”—whence comes our modern-day Bachelor (of Arts).

At that gateway college, students studied seven “arts”: grammar, rhetoric, logic, arithmetic, geometry, astronomy, and music. In short, they learned how to think, write, speak, argue, and calculate. Only then were they allowed to pursue advanced studies at the College of Theology, Law, or Medicine.

Mere baccalaureates could get positions in the church or secular jobs as “clerks” and “notaries,” so the College of Art did have vocational implications, but only as a by-product. Its core purpose was to turn raw noodles into “well-educated persons.”

That mission remains.

The well-educated person

A liberal arts education proposes to give students a survey from up high of the whole landscape of human knowledge. Then, B.A. in hand, students can make their way to the grubby real-world corner that suits them best. They’ll make better choices, goes the thinking, once they’ve seen the context. And their work will better serve the common good if they know how it fits in with the human endeavor.

Ultimately, then, the driving ideal of a liberal arts education is to forge well-educated persons. This presumes that “well-educated” is a coherent quality, quite apart from “good at this” and “good at that.”

What this quality is and how it’s formed remains always in play. In America, however, until about 30 years ago, the liberal arts curriculum had a definite three-part form:

1. A core course gave students a big picture of where civilization had been and where it was going.

2. Distribution requirements led students to take courses in disparate disciplines and thus experience different modes of thinking and the range of human thought.

3. A major immersed students in deeper study of one field.

The last two planks remain intact, but student activists of my generation dented the first one. We charged that core humanities courses really boiled down to reverent study of books written by “dead white European males,” ignoring the contributions of women, Blacks, Latinos, Asians and others; and besides, we said, students being so varied, why should we all have to squeeze through the same portal?

Many colleges dropped the core course idea. But a few (Reed, for example) never abandoned the ancient doctrines. And at least one college, St. John’s, aggressively clung to a curriculum built almost entirely around “great books” (of Western Civilization).

Now, however, many colleges are painstakingly reconstituting core humanities courses, often with a global cast. As it turns out, “we’re-all-different” is not really an argument for abandoning a core course. It’s the strongest reason to have one!

Elitism?

Still, the question remains: what good does it do any individual to be “well-educated”? Is this not an elitist concept analogous to the aristocratic notion of “gentleman?”

I’ll quote an answer someone gave me recently. Alicia Neumann earned her B.A. from Occidental, a traditional liberal arts college in California. Years later, she went back to school and got a Master’s in Public Health. Now she works in her new field and is doing well. But what makes her good at her job, she confided, is mostly stuff she learned at Occidental.

“Just the ability to give and get information clearly!” she declared. “So much of my job consists of writing—emails, reports, letters! Or attending meetings, giving presentations. The ability to get my point across is major. I look around at some of my colleagues who skipped the liberal arts and they’re fuzzier at communication. It’s an obstacle in their work. It makes them less efficient.

“Then, there’s the ability to synthesize. A lot of what I did in college was collect information from many sources, discern patterns, and put it together to make a new point. Back then I did it with literature, but the underlying skill is applicable to everything I do now.

“One of my friends is a lobbyist in Washington, and she’s just zooming up through that world. Why? Because she can write and think.”

This is what a liberal arts education is about, just as it was 900 years ago at the College of Art in Paris. And this is why places like Reed and Occidental keep flourishing: they open pathways to leadership and power in America, not just because of whom one meets at such places, but because of what one learns there.

Yes, a good liberal arts education tends to produce America’s elite, but that’s not a reason to mark it down. It’s a reason to keep it open to students from all walks of life.

 
