‘In the beginning was the relation’: On becoming less strange

We are strangers to each other despite our seeming transparencies, our open confessions, our public language. We use the same words and mean different things. We use different words and mean something else. We stand inside the clatter and the grind and say our public sayings. Some of us spend our mornings prizing apart concepts. But to what end? With what cause? For what reason?

There is another way: We become less strange to each other as we walk slowly along. There is no other way around: no beeline, no backdoor, no beck-and-call. Understanding: this is the slow work of finding words that, through the cultivation of unhurried experiences and the enactment of delicate gestures, finally become shared, becoming ours. So that words resonate in the relation.

Buber says, “In the beginning was the relation.”

The discourse of choice is the burden we carry

We are strangers to each other. Our moral incoherences are stated aloud yet go unremarked.

  • “Who are you to say?”
  • “That’s just your opinion.”
  • “Well, let’s just agree to disagree.”
  • “Because I said so.”
  • “That’s my choice, not yours.”

We have even forgotten the question: What general beliefs confer rational authority upon a set of social practices? To say that a belief is rational is to imply that any person whatever can (ought to) assent to it. To say that a belief carries authority is to suggest that the utterance makes a demand for action or assent.

Every day our cultural failures are on view in our overextended discourse of freedom. If a “justification” doesn’t end with “Well, because so-and-so chose it,” then we feel unsatisfied. And yet this pseudo-justification is unsatisfying because the ability to choose one thing or another says nothing about whether the thing that we’re choosing is good or intrinsically worthwhile.

And that is scary because choosing is so arbitrary. In the modern age, the discourse of choice is the burden we carry.

Addendum: A Case about Playhouses

Of the $50,000 playhouse designed in a Cape Cod style for her young daughter, the mother remarked, “My daughter loves it. And it’s certainly a conversation piece.”

The mother’s reasoning: The daughter wanted it, she chose it, and she loves it. “But is it good–good independent of her choosing it?” The question is not asked, and doubtless it cannot be answered.

Further Reading

Alasdair MacIntyre, After Virtue

Allison Arieff on design ‘beyond the cubicle’

Over cocktails, an old-school literary agent told me that either an editor “gets your work” or she doesn’t. It was the best description of intellectual sympathy, of shared understanding, of common purpose that I’d heard since moving to New York 2 years ago.

So when I read Allison Arieff’s “Beyond the Cubicle” yesterday, I saw immediately that she gets it. She gets that design can’t start from old assumptions concerning the separation of work life from home life, thinking from playing, the work day from the weekend, making from resting. Instead, genuine design must begin from philosophical considerations: What is work? What is the workplace? What does it mean to work in common? How does work integrate different facets of our being? And so on.

Our spaces of work must not only reflect our mental life but inspire our creative potency in an endless feedback loop of physical surrounding, intellectual activity, necessary reprieve, and leisurely strolling.

Arieff concludes,

[H]ow can the workplace evolve to respond to the contemporary realities of work culture?

The [Wall Street] Journal is right that good design can inspire creativity and great ideas, but I’d argue that the focus should be less on floor plans and more on ways of working. When’s the last time you had a creative breakthrough in a Monday morning meeting? Creativity springs from unexpected places and sources — from a walk in the park to the rare block of uninterrupted time — so thinking more broadly about the intrinsic motivations (autonomy, learning, etc.) that facilitate good work is likely to have a far happier outcome than the “latest” innovation in cubicles.

Definitely gets it.

Further Reading

Andrew Taggart, “The Life Need of Philosophy”

—. “Rules of Thumb for Starting a Way of Life Business.” (See especially Example 2.)

To become pessimistic or to let a hundred flowers bloom

It’s one of those mornings when I read scores of sour stories about a world headed downhill. In an effort to make playgrounds safer, parents, legislators, and lawyers have made our children more risk-averse and less thick-skinned. Meanwhile, the debt crisis in Greece rages on in the eurozone: no quick fixes, no easy solutions. On a small island in the South Pacific, an island once featured on This American Life if memory serves, environmental degradation and climate change have made this 8-mile stretch of land virtually uninhabitable.

My morning mood, of a groping necessity, seems to find its way to philosophical pessimism, the view that human life cannot be justified. So the philosopher David Benatar: “no life is good.”

It is here that my mind doubles back in a dialectical reversal: the decline of public intellectuals is a blessing, not a curse, the time now ripe for new experiments in living, the movement led by thinkers-makers on the horizon. Criticism, as a modus operandi, has been ineffectual (my essay on the bankruptcy of the critical orientation can be read here), and the social critic as a subverter of the status quo has become bitter and resentful. And so what, Jacoby? So what, thou sour grapes?

So what when local markets have returned?

So what when a movement is just now under way?

So what when friends are thinking beyond sustainability, beyond apocalypse, beyond saving the world?

So what when, as Matthew Abrams put it in an email to me,

Our focus [at The Mycelium School] is not on less bad, but new models to make current issues moot.

So what when, provided we bracket the terror that ensued, we remember the call to “let a hundred flowers bloom”?

‘The choice is to join an institution or die on the vine’

No, Jacoby, it isn’t.

Five years ago I would have assumed that Jacoby was right. Three years ago I would have agreed with Jacoby. Two years ago I would have despaired with him. A year ago I floated the question: “Is it possible to live otherwise today?” That little question has made all the difference.

Our problem is not just political; it is also conceptual, lying in our failure of imagination.

Jacoby’s conclusion applies only to the public intellectual who has joined an institution (hence not a public intellectual) or who is dying on the vine (hence not a public intellectual). The question is whether the mid-century public intellectual was all that effective in his criticisms of the status quo. I doubt it. In any case, in the early 21st century other figures have sprung up in his place.

In the long view, Jacoby and Scialabba represent an older generation of intellectuals whose thinking was shaped by the rise of the university after WWII. They were important then, but their time has passed. And now their bitterness is apparent.

The following is an excerpt from a symposium on the work of George Scialabba held at Crooked Timber (August 2009).

Russell Jacoby writes,

The anthology that Robert Boynton published some years ago, “The New New Journalism,” subtitled “Conversations with America’s Best Non-fiction Writers on Their Craft,” tiptoed around an issue that still remains too hot: money. How [in the early 21st C.] do intellectuals earn enough money to write and think?

The possibilities are worse than ever. Yes, a few souls manage to hustle and do quite nicely, for instance, Christopher Hitchens. Yes, a few magazines like the “New Yorker” pay a living wage, but for most to survive, if not flourish, requires a working (and willing) spouse, family money or an academic position (or its equivalent such as a slot in a think tank or policy outfit). Yes, [George] Scialabba has a chair at Harvard, but his sits behind a desk on the ground floor of the building which he superintends. Only the most resolute can juggle for years a day job and night time of writing. For almost everyone else, the choice is to join an institution or die on the vine.

George Scialabba replies,

Russell asks another fundamental question: “How Will Intellectuals Eat?” Independent intellectuals have always depended on conversations, lectures, seminars, libraries, museums, bookstores, newsstands, cafes, small publishers, little magazines, cheap apartments, and easy movement into and out of part-time jobs, preferably on the fringes of culture or academe. In other words, cities. In return, they supplied the civilization in “bourgeois civilization.”

Capitalist rationality is not synonymous with bourgeois civilization; on the contrary, it is the chief subverter of bourgeois civilization. By its inflexible logic, the material prerequisites of intellectual life were economically irrational. Inexpensive urban neighborhoods, small-scale enterprises, relaxed personnel policies all succumbed to the same polite, deadly formula: “We’re sorry, but nowadays investors expect a higher rate of return.” In an earlier example of industry consolidation, Nixon’s delightful Secretary of Agriculture, Earl Butz, helpfully advised small farmers and ranchers: “Get big or get out.” They got out, and American food is now, by and large, mass-produced dreck. Will the same thing happen to American culture? There are symptoms: the difficulty of getting non-blockbusters published, promoted, and kept in print; the pressure of bookstore chains on independents; the vast wasteland of Clear Channel radio; the metastasis of the Murdoch media empire. There are also exceptions, of course. There are always exceptions to trends before they become accomplished facts, as a great many people were eager to remind Russell when The Last Intellectuals appeared. But in general, I think Russell’s formulation here is spot-on: for nearly everyone, “the choice is to join an institution or die on the vine.”