Archive for the “Trust and Transparency” Category

Dealing with an aging parent has no end of frustrations.  Yet with those frustrations come some unexpected positive consequences.

For me, many of these unexpected benefits have been “lessons about time management” and “lessons about learning how to say no.”  Even though those lessons do not always help with the “dealing with Mom” part of life, they have been, and will be, valuable in the other parts of my life.  I.e., “work.”

Here’s an example that fits both categories.  My mother has always been a bit of a nag.  With her decline in memory, however, she has become the epitome of the impossible-to-satisfy micromanager.  Not only does she nag, but she constantly interrupts.  There are many days that, once she rises, I can count on not having an uninterrupted 15 minutes until she goes to bed.

But the purpose of today’s post is not to whine about my frustrations.  I do that too much as it is, and the reality of her aging is that no amount of ranting or whining or yelling about unfairness is going to change things. That’s what the first part of the Serenity Prayer is all about.

No, this post is about a positive that has come out of that frustration.  I’ve learned, in a deep way that I probably should have realized long ago, something about why deadlines work.   I’ve learned that a major component of my deciding to take a project on is whether the client or “boss” imposes a deadline.

I’ve realized that I can handle deadlines.  Even ones that are somewhat earlier than I’d prefer.  But what I can’t handle is the combination of “when you can” and “now”.

Give me your deadline.  If I think it impossible, if I can’t make it, I’ll tell you.  If ex ante I underestimate the effort a project will take, I’ll burn as much of my midnight oil as I can to get it done anyway.  And the necessities of trying to balance care-giving with all the other stuff have actually made me much better at that ex ante estimation, and much better at saying no when that estimate says the deadline is impossible to meet.

You’re much more likely to be screwed, though, if you don’t give me a deadline.  Because I’ve got too much going on — and, knowing me, I’m always going to have too much going on — your wants are going to get shunted aside.  Over and over.

And that’s going to happen no matter how much you nag me.   All the nagging does is get everyone’s blood pressure up.  Yours goes up, because I’m not helping you.  And mine goes up, because-I’m-really-busy-and-everything-takes-four-times-as-long-when-I’m-constantly-being-interrupted.

Yes, I know.  What took me so long to figure this out?  Well, when you find as many things interesting as I do, it’s really, really hard to say no.  I’ve always been juggling a lot of balls.  If truth be told, I *like* juggling — in a way that’s what interdisciplinarity, true interdisciplinarity of the sort I’ve been working at for my entire adult life, is about.  Juggling.

And, unlike most people my age, I’ve never had the kind of 24-7 relationship with a dependent before.  I’ve never come close to the parenting thing.

But now the juggling is constrained.  Now asking me to do something interesting isn’t enough to get my effort.  No matter how good my intentions are.

So now, when a client or colleague comes to me with a new project, no matter how exciting that project sounds, I ask another question:  When do you need it by?

And I insist on a mutually satisfactory answer, one that does two things:

1.  It’s a deadline I’m sure I can meet.

And:

2.  It says you’re not going to bug me unless and until that deadline is about to pass with the project unfinished.

Otherwise, my answer to the project will be no.

Give me a deadline.  Then stand aside.  Or expect disappointment.

I can’t tell my mother to go away.  But, and this is another lesson I’ve learned in these months of increasing interruptions, you’d be surprised how many non-Moms can be told to go away.

Give me a deadline.  Then stand aside.

Or go away.


In my last post, I compared being a successful teacher to running a successful conference. Today, I want to discuss the benefits and limits of the analogy.

First, a warning: At just a bit under 5400 words, what follows will be the single longest Iterations post to date. Perhaps ever.

The rules of Iterations haven’t changed. Notwithstanding this particular child of Godzilla, and perhaps one more near the end of summer, Iterations will remain a place for fragmentary exploration. One where posts aim for the short side of 500-1500 words. Indeed, since I’m trying to develop my information-density skills to become followorthy on Twitter before fall term starts, if Iterations morphs at all, it will be toward shorter and shorter entries.

However, as the Barriers of Faith (formerly titled “Technology and Education”) book project moves into its next phase, I feel compelled to post a full chapter or two. Partly it’s simple vanity: I want to keep people updated on my (clearly cool) ideas. :) But also necessity: BoF will be as radical in rhetorical design as in content. As people encounter the book, they’ll experience not one odd design feature, but three.

And that’s a risky strategy, since if I screw it up, it’ll make listening more expensive. If I want my ideas — including the design for exploring those ideas — to sell, I need to test the costs of listening to them with as many different kinds of readers as possible. Iterations readers by definition being rather diverse, I’m hoping “Letter #3” below will intrigue several of you enough to volunteer as readers of future pieces.

(If you want to be a reader/reviewer, e-mail me at barriers@thelisteningphd.com.  You won’t get paid in $$. But if you review at least one piece of the book, even just this one, you’ll get a free copy of the book when published (target date: late fall, 2010). And not just a free PDF; I’ll mail an autographed copy of a physical book.)

Second, a bit of preface about that larger design. (If you don’t read prefaces, feel free to skip to “Letter #3” below.) Since I’m only giving one piece here, I’d like to say a bit about the overall design of Barriers of Faith, and how I hope it will work.

Design innovation A:  “Discourses.”  Barriers of Faith is built around four multi-chapter “discourses,” each iterating a different dimension of the larger question of “whence economic higher ed in the 21st century?” Chapters will be short, on the order of 10-12 pages.

“Discourse” is not just a pretentious word for “part” or “section” or even (for those of you who like books from the 18th and 19th centuries) “volume.” Chapters within each discourse here do not merely explore a sub-question of that section’s main question.

They attempt to face the reality that people who first engage a “question of importance” come to true conversation on a “shared question” via different methodological, ideological, epistemological, intellectual, and moral values, and that their mode of discoursing gets converted from “arguing across each other” to “conversing with each other” only insofar as these differing values get exposed and dealt with. True conversation requires shared questions. Just as England and America often appear to be “two nations divided by a common language,” people in a public discourse who appear to be talking about the same question are not. Their different value sets assign different meanings to the same words. Persuasion isn’t possible in such a discourse. Only the sham victories of “getting the last word.”

Each of the four discourses of BoF seeks to expose these hidden barriers to serious conversation, then replace their false commonality with true common ground. They do so through the contrast of perspectives offered by what I call the “alternate universe.” By using perspectives that are clearly “out there” to most readers (e.g., technophiles in discourse II, anarchists and evangelical Christians in discourse IV, and science fiction writers throughout), the reader’s own ways of asking the same questions also get exposed. Listen more to the aliens of C.J. Cherryh or the alternate histories of L.E. Modesitt, Jr., and you hear yourself better as well. You hear when you are asking different questions than your neighbor.

Design innovation B:  “Provocations.”  Aiding exposure are a half dozen “provocations” situated between each pair of discourses. Each provocation states specific — and very radical — “proposals” for pedagogic innovation. While I would love it if individual teachers followed my lead (I have tested, or will prior to the book’s publication, each provocation with actual students), that’s not my goal in presenting them. None, however, and emphatically, are proposals I expect to get reduced to actual “educational policy” or “curricular reform.” They are only what their name suggests, “provocations” to take public discussion past “the usual suspects” of funding, curriculum change, ideology, etc. Ways to help expose the real reforms needed, the ones that will arise from full engagement in the four discourses.

Borrowing a bit of jargon from one galaxy in the alternate universe, the provocations are not themselves the outside-the-box thinking. They’re a technique for getting real outside-the-box thinking to occur.

Design innovation C:  “Opening letters.”  Discourse plus provocation provides exposure. Yet exposure alone isn’t enough. It must somehow encourage what Adam Smith called sympathy. Before a discourse will morph into mutually beneficial conversation, into something where “outside the box” ideas actually get traded, one needs a trigger of “fellow feeling” that encourages the discourse participants to see value in building that true common ground.

Which is where the third design innovation comes in. Each of the discourses opens with a letter written to an “old friend” named Jack. Full sympathy, in my opinion, can come from strangers only if some of us are willing to practice what I call “absurd transparency.” And few things can be as transparent of our true beliefs, both the good ones and the bad ones, as the letters we write to long-time friends. And can I ask others to be sympathetic to my request for their transparency if I am unwilling to provide my own?

For readers of Christian apologetics, or for fans of Narnia, “Jack” is my homage to the late C.S. Lewis. It was either that or address them to Paul, in honor of the famous ancient correspondent with Rome, Corinth, Galatia, Ephesus, Philippi, Colosse, and Thessalonica. And not even I am hubristic enough to dare the latter.

It’s a conceit, of course. I never met Lewis, who died long before I had heard of either Narnia or Mere Christianity, much less entertained a thought of corresponding. Yet for me he has been the sort of writer I expect all of us compulsive readers “know.” An influencer of my own thought and development as profound as any lifelong, first-name friend. In the manner of his life, in the quality of his thinking, Lewis was, in my mind, far more significant than another, more famous, Christian who died on the same day in November, 1963. “Jack” Lewis was a model of how to reconcile Christian faith and intellectual rigor. A model for bridging interpretive worlds.

Were he around to read Barriers of Faith, I expect he’d have some rather strenuous objections. Yet, given that my immediate inspiration was one of his lesser-known gems, the posthumous Letters to Malcolm: Chiefly on Prayer, to who else could I address my own letters?

Readers of Iterations should feel free to quote-with-attribution from “Letter #3” in the usual ways of the blogosphere. I ask, however, that any quotation be accompanied by a link to this page, and that any discussion of the details of Letter #3 note that it has been released into the ether as a “work-in-progress.”

Without any further ado, then, “Letter #3”:

Letter #3
On Giving Students Too Much Responsibility?

© 2009 Wade E. Shilts. All rights reserved.

Jack,

Sometimes, I despair.

I mean, if I can’t even get someone as smart as you, someone as open-minded and thoughtful and caring as you are on matters pedagogic, to hear what I’m saying . . . how am I ever going to get the idiots without a clue to listen?

A case in point: your last letter, where you object on several grounds to my teaching-as-conference-programming idea. [Aside to Iterations readers: Immediately preceding this chapter will have been Provocation C, "It's not a class. It's an international conference." Following it will be the other five chapters of Discourse III, "Economics as fellow traveling; Or, education of the commons, by the commons, and for the commons."]

1. It’s unrealistic, you say. At most a good seminar might be able to handle 20 students. Even the smallest conferences have 50 to 150 people. You can’t run an effective seminar with 50.

2. Conferences take a lot of planning, you add. Work up front. How in the world is a teacher going to find the time to do all that before the term starts? It’s months of work, coordinating several people.

3. And students aren’t far enough along in their professional development to know what a good economics or history conference would be. They can’t even write a decent paper yet.

4. And look at all those other things we need to get done in the course of a semester-long class. The content we need to cover. How are we going to get that done via a “conference” format?

5. Oh, and by the way, have you forgotten how bad many conference panels are? We’d be bored stiff if we had to sit through an entire semester of conference panels.

I’m afraid, old friend, that you have missed my point altogether. That you have fallen, again, into the trap of rounding up the usual suspects.

Let’s start with your point about not being able to run a seminar with 50 people in it. Sure. Actually, I think your 20 is grossly optimistic. Personally, I wouldn’t want more than 10 students in a seminar. Maybe 15 if they’re both “A” students and experienced.

But who said anything about running a seminar? I said “put together a conference.” With the exception of the by-invitation-only annual gathering of the Cliometric Society, I’ve never been to a “conference” that is a seminar. (And much as I think Clio provides the model for professional seminars, I’m not suggesting you run an undergrad Clio. Argh. That’d be a disaster.)

Seriously, Jack, have you never been to a good conference with more than 15 people? Conferences are a different animal than seminars. They do different things. They scale differently.

Have you never been to a conference and come away saying it was worth the trip? (And if not, whyever do you keep going? I mean, I know what a conference trip costs, and I know for a fact that your college doesn’t cover all travel expenses.) Yeah, I know some conferences are rotten — remember Boston? — but there are really good ones, too. Grand Rapids last month, for example.

And who said anything about the teacher doing all the prep? I didn’t say “run a conference.” I said “put together a conference.”

Look, I absolutely agree on conference prep. Lots of stuff has to happen before attendees get to check into posh hotel rooms and order room service. Contracts with hotels, finagling sponsors, finding activities, reviewing proposals, scheduling, checking prices, arranging airport shuttles, etc, etc, etc.

But whatever makes you think that all that stuff has to take place before your class starts? And what makes you think that you have to be the one to do it?

Okay, I’ll confess. I tricked you. I wrote Provocation C as I did because I was pretty sure each “you” I used would be read in the singular. Especially since some were. But the ones about putting together a conference? Go back and read those again. They’re the plural you. As in “you and your students.” Nyah, nyah, nyah. Gotcha.

I know. Unfair. I’m a bastard sometimes, what can I say. How can I sit here, having manipulated you through the ambiguity of my pronouns, and then take you to task?

But, don’t you see, my writing and your reading illustrate the point I’ve been trying to make about just how insidiously the usual suspects work upon us.

After all, as you read those opening sentences of Provocation C, did it even enter your mind to think there might be multiple ways of reading the second sentence?

Seriously.

Or did the interpretive paths hardwired into even your brilliant brain tell you that I was speaking to you, to the same singular teacher implied by “you’re teaching…”? Did you think, at all, whether I might be arguing that the “you” in the second sentence of the provocation, for example, was the plural “you”?

Sure, now that I’ve thrown the possibility out there, you can see the annoying ambiguity in what I wrote. Anyone can. Now.

But earlier? Would the plural you of my audience have noticed? Will most “teacher” readers of the book know me as well as you do? If even someone as close to me as you, Jack, someone who knows what kind of smart-ass writer I can be, couldn’t see through to the ambiguity, would the others? I doubt it.

And your own reaction just highlights the huge problem we have if we’re going to get higher ed types past the usual suspects. To get them past means exposing their “you” as beholden to an evil “them.” None of us have problems showing others to be stuck in a rut — we mastered that skill in graduate school. The problems come when someone points out that we are rutting around in the mud like the pigs my uncle used to slop.

I suspect this might be part of why economists on average get poorer course evaluations. In many ways, economics as a discipline is all about highlighting how usual suspects fall short. About how easy it is to miss all the consequences of personal or policy choice. About how so many questions are answered by “It depends…,” or by “On the one hand …., but on the other hand…” All disciplines have practitioners who specialize in debunking, iconoclasts who thrive on piercing the bubbles of received wisdom. But economics in many ways defines itself as debunking. Rounding up the usual suspects and striving to put them out of their misery — it’s not just what a few of us do, it’s what all of us are.

But, you’re right, I am asking you to do a lot of new work. Though not as much work as the “require a walkabout to understand local demand and supply” proposal I suggested earlier. (I’m assuming you failed to object to that idea since you know I borrowed it from John Taylor Gatto, who managed to do it in poorly funded NYC schools with 9th graders.)

But I’m not sure the “conference” idea has to mean more work in the long run. Oh it would require you to say “no” more often to the administrivia that surrounds you. Yes, it would be impossible were you one of our poor untenured or adjunct colleagues who lack significant say over the contents of their syllabi. But why is it that we tenured types find it so much easier to justify saying “no” to our students because of “other obligations” than justifying “no” to the committee babble because of our students?

And yes, coming to your third objection, I’m afraid that the proposal is that radical. Probably more so. Because, yes, I’m saying the class should “put together” the conference. Not you. Those 18-20 year olds.

The conference program won’t be a syllabus you hand out in week 1. It’ll be a collaborative effort that your class struggles to finish before week 10 or 11 of the term. That’s right. The bulk of “your” economic history “course” will be spent coaching your students in putting the program together, in finding people to moderate panels, and in the hundred other tasks that having a good conference requires.

Yes, yes. I know. That just strengthens your point. Our undergraduate students just aren’t ready for that. They’re not deep enough or versed enough in the discipline of thinking about choices and consequences, or in their writing skills, to write a conference-quality paper. Much less serve on the program committee.

True.

But answer me this. Will your usual syllabus make them any more ready? Especially if that syllabus follows the model of the usual texts and the usual major requirements? Seriously.

Think about your objection a bit more. Think about why it is that, after they have taken our history course, even our best students are still no more ready to decide which potential speakers are worthy of invitation, which papers are worth hearing. Think about why they’re no more ready to read the tea leaves of resumes and abstracts.

And be clear. They’re not going to be ready. I don’t want the economic history organizations of which I am a member involving undergraduates in the program selection and planning process either. In fact, I’m opposed to most of the “undergraduate research initiatives” that have been springing up in the minds of deans everywhere the last several years. For exactly your reason: they’re not going to be ready. Not even close. At least not in the social sciences.

Yet is our students’ inability to judge well enough to serve on an actual program committee a reason against modeling our classes on conference preparation? Or exactly the opposite?

Judgment only improves by exercise. You can’t “tell” people that Xs are good and Ys are bad and then just expect them to be able to move on and correctly decide whether various Zs encountered in their life-after-your-class are more X-ish or more Y-ish. Until you give them opportunities to make dumb mistakes, how are they going to learn to make smart judgments?

Have you ever wondered why, when you have students do an end-of-semester class presentation, so many are awful? Why so many give presentations a half-way diligent tenth grader should be ashamed to give?

Yes, I know. They leave things until the last minute. No matter what you or I say, they put too much off, until they’re long past the time where they can do everything that needs to be done. Sure. Okay. Fine.

But why do they leave it go so long?

Sorry, it isn’t that “they’re lazy” and it isn’t that “all they want to do is drink beer and get laid.” (As you’ll see in the “Genius of Gen Y” chapter I’ll send you soon.) It’s complicated, but a big reason they procrastinate on the projects we assign is that our projects don’t make them feel the necessities of collaboration. And if they don’t feel it, they aren’t going to listen to our rants that they’re procrastinating. Much less heed them.

Want good presentations? Want them to work at their writing and their Powerpoints and the quality of their Q&A? I’m sorry, Jack, but then you need to involve them in a project process that truly makes them hurt with the necessities of collaboration. A process that makes them see their individual tasks of outline, draft, final paper, what have you, all as part of a group activity. As Adam Smith pointed out in Theory of Moral Sentiments, coordination in a system depends on the quality of sympathy or “fellow feeling” by the individuals interacting within that system. Not the empathy of “I feel your pain” that our liberal do-gooder friends are always going on about, but the sympathy of “I feel my pain when you feel yours.”

Think. Have you ever noticed how the truly awful presentations at a conference almost never get made by conference committee members? And if anyone has an excuse for procrastinating on their own papers, it would be a committee member who has had to do all that other stuff. Yet, if you go to a paper being presented by a committee member, you’re much less likely to see her reading in a monotone. Or a dozen boring PowerPoint slides, each with 200 words full of bullet points. Canned graphs from Excel. Fidgeting from the moderator when they run overtime.

Program committee members invest in their conference. Conference organizers don’t just take it personal when people think their own paper bad. Conference organizers take it personal whenever people think any paper is bad.

Oh, they aren’t always great presenters. But they’re not the horrible ones that get us cussing either.

I’m betting that if you do your job as program chair well, so will your students. That you’ll see a pretty substantial improvement in the overall level of end-of-semester presentations.

And more importantly, you’ll get more accomplished toward your real goals for the class.

If we want our students to have skills — writing skills, presentation skills, economic skills, historical skills — that they bring to bear in their lives after college, we must involve them in the collaborative judgment process from the beginning. Knowing that in their inexperience and their ignorance they are going to judge badly, we must get them judging early and often. If you think what program committees do is important — and your history of continuing conference-going tells me you do — then you need people practicing at doing program committee-type stuff.

Because the need for program-committee skills — the skills of collaborating — is critical. Not because we need our students to grow up and put on academic conferences. (I expect that, no matter how much you and I like them, the world could do without 90% of academic conferences.) But so they choose good professional development conferences. So they better decide what happens at all those industry conventions and trade shows and webinars. And — and this is the really big one — so they exercise better judgment in all those non-conference settings where they are going to have to collaborate and judge each other’s economic or historical claims.

No, I haven’t forgotten your fourth criticism. I was just saving it for last.

Because, to be frank, Jack, and please don’t take this personal, you know I love and respect you, I just don’t care whether your students get most of that content you and the textbook makers and the curriculum committee think so bloody important.

Sure, I care about my students’ ability to tell the differences among averages, marginals, and totals. But I could care less if they could manipulate the n-teen cost curves they get inundated with in most principles-level courses. Sure, I firmly believe they need to get precise in their “balance sheet” and “national income” thinking. To know how to answer questions of “how big is ‘big’?” But I could care less if they remember the differences among gross national product, gross domestic product, and net national product.

Yes, yes, details matter. That’s my point. What we have to get across is that paying attention to details matters. Not “this detail matters” or “that detail matters”. That’s what my “walkabout” and “conference” ideas are about. Getting past the abstractions. Making them feel and work with the details.

What matters — what determines whether our classes are worth anything — is how our students think after having had our classes. Whether they work better when confronted with new details of “economic issues” or “historical experience” after they have been with us (and with their classmates) for 10 or 12 or 16 weeks. If they think better, we’ve succeeded. If they don’t, we’ve failed. Even if they got the best score on every test and wrote the best term paper we’ve seen in years.

And our usual methods of teaching — our survey courses, our lectures, our term papers, our exams? They’ve failed big time. They were failing long before Gen Y revolutionized the institutions of information and its interpretation and “made our job harder.”

Look no farther than what passes for “economic” discussion these days. I’m not talking about the evils of the Patriot Act or the idiocy of 2 trillion dollar deficits. Those are easy targets. Too easy. I’m talking about the everyday talk of Boomer CEOs, politicos, CNN reporters. Dinner table talk. New York Times editorial talk.

Look at the continued health of mercantilist ideas. I mean, it’s been, what, 235 years since Smith demolished the fallacies and empirical errors of mercantilism? And what economic doctrines still pervade virtually all public discussion? My conservative friends talk about the evils of socialism, my liberal friends talk about the evils of capitalism, and almost none of them realize how much they are captive of stupid, stupid, stupid 17th and 18th century ideas. I swear, sometimes it makes me want to puke.

And just how long have we in the undergraduate economics curriculum been teaching ideas better than mercantilism? Two generations? Three? Four? Five? Okay, I know, that other economic thinking — the neoclassicals and Keynesians and social democrats and the Chicago School, even Austro-institutionalist-anarchists like me — all of them have flaws, some of them big. But bigger than the idiocies of the zero-sum, it’s-all-about-who-has-power thinking that drives mercantilism?

I mean, it takes what, a couple production possibility curves, a couple stories about trade, and the seeds of anti-mercantilist thinking are planted, right?

Yet what the heck have we been watering and fertilizing with in the rest of our economics classes, if the best “public discourse” that comes from all the alternatives-to-mercantilism seeds we have planted over the last few generations is a nation sending millions of pieces of junk mail whining about NAFTA, a nation whose most educated people choose between the likes of Bush and Obama and Pelosi, between CNN and FoxNews? C’mon. A nation where undergraduate economic education had been doing its job should have laughed Ross Perot and his “giant sucking sound” metaphor off the stage in 1992. Not still have “leaders” from the major political parties trying to out-Perot each other in 2009.

And it isn’t just politics. Look at the corporate world. I’m not talking Enron and Worldcom, or the banks, airlines, or GM. I’m talking about the middle manager at Ordinary Company, Inc., who thinks he’s in a “war” with the competition. The college-educated shop floor worker who thinks the only way to get higher wages for “labor” is to take something away from “management.” The farmer who blames low corn prices on increasing costs of growing corn.

Sorry. You knew you were going to wind me up, didn’t you?

But I get frustrated. I believe in the value of thinking “like an economist.” And there have been hundreds of thousands of economics classes taught on the current model in the last fifty years or so, thousands of economics teachers striving to illustrate the value of the economist’s thinking tools. Yet what gets the public energy up in economics? Three quarters of a century from The General Theory and “educated” America thinks the “new” ideas of Keynes and Roosevelt are workable?

And no I don’t want to debate Keynes with you again. Because that’s not my point. Regardless of whether Keynes was right (your position) or wrong (mine), the fact is that we economics and economic history teachers have for the better parts of three generations been teaching against Keynes. And if we had been doing anything other than a generally abysmal job, people would have either elected the new ideas of Obama decades earlier or have consigned them to the dustbin with the physiocrats, the real bills doctrine, anarchists, and all those other things we teachers have consigned to the dustbin.

I know, I know. I’m asking that a whole lot of trust be put in the hands of 18-20-year-olds. Trust that will with great frequency prove misplaced because, well, because 18-20-year-olds are, more often than not, inexperienced with, and ignorant of, important matters economic and historical.

And I have to admit this scares the heck out of me. (Remember, Jack, I’m the one who, with a bit of bourbon in me, likes to rant about the “utter idiocy” of so many of the ideas of so-called “student-centered” education.)

But we don’t have a choice.

No, I’m not just going off on my “Gen Y is special” thing again.

Yes, I do think Gen Y is special and exciting and willing and able to do things you and I weren’t at their age. And, yes, maybe I’m deluding myself.

But, don’t you see, Jack? If I’m wrong in seeing Gen Y as exceptional, if you’re right, then the case for radical reform of pedagogy of the sort I’m suggesting doesn’t get weaker.

It gets stronger.

A lot stronger. Because the important question isn’t whether they can be trusted at 18 with such responsibility. The question is how they handle that responsibility when they get it at age 22.

Because they will get it. No matter what you and I do, whether we stay with your trickle-down methods, or whether we adopt my radical ones, our students are going to get responsibility when they graduate. A lot more than we did.

They’ll insist.

And, more importantly, the global marketplace will insist.

It has to. Paradigm shifts today happen too frequently, too fast, for a small core of masters and “leaders” to handle them all. Tomorrow’s paradigmatic changes will be even faster. A world of paradigm shiftiness, a world of change compression, will work only if we can find a way to enable 50 percent to think the way that liberal arts education has historically enabled 5-10 percent to think.

I’d love it if we were dealing with nothing more than the impatience of youth, than the foolish wisdom of sophomores. If we could do what our elders did. If we could just exert the power of our experience and position. Tell them to wait their turn. To play the game.

But the world of “playing the game” is gone. To quote the Wachowski brothers, “There is no spoon.” Sorry, but there is no one game. We have multiple games, each one changing, and changing faster and faster.

It’s our job, somehow, as teachers of economics and history, not to warn them about the brave new world, but to ready them for it. To help them acquire skills not of transmitting and receiving information, but next order skills of judging and interpreting it. Skills of creating, of collaborating. Skills of practicing all those skills at light speed.

I’d love to be able to say, as my teachers did, “Trust us, because in your youth, you’re not ready. Because we have the knowledge and the experience, and you don’t.” But I can’t. Way too much of that knowledge of mine is going to be outdated by the time they graduate. (If it isn’t already.) What we “know” simply isn’t going to remain worthy of trust long enough. If I ask them to “trust me” that way, I’m not doing my job. They’ll fail.

You and I are personally as safe as anyone in this flat world can be, thanks to the protections of tenure. But our students aren’t. When you and I screw up, they, not us, bear the biggest consequences. We may not agree about who will (or should) finally hold the bag when Social Security finally goes bye-bye, but we ought to agree about who holds the bag if we continue our ineffective economic and historical education.

We must trust our students today, knowing they will sometimes disappoint us. Because if we don’t, they will fail after they leave us. Bank on it.

And when they do, bank on losing a lot more of your retirement savings than you’ve lost in the last few years.

Maybe my “conference” idea is a bad one. But it’s not a bad one because it precludes us from covering enough content.

Content is just the icing on the economic/historical cake. If we want to ensure our students get the cake — i.e., the skills and judgment we yammer on about when talking about curriculum at faculty retreats, the skills and judgment we keep failing to get across — we’d better start thinking about something other than how much sugar to put in the icing.

Best,
Wade

p.s. Speaking of icing: About Friday — could we do the first martini at 7:30 this week instead of 7? I’ve a group who wants to talk about a piece of their history project and the only time all can make it is 6:30. And I’m guessing that, with this particular group of overachievers, their “little question or two” will go 40 minutes.


Spent a little bit of time this weekend contemplating the syllabus for what has long been my favorite class to teach.  Economics 130, Luther’s one-semester principles of economics course.  As with all of my courses, I expect to make major changes to the class before returning to the classroom, and so I was spending an hour or so thinking about my mission for the class.

I prefer to call the course, not by its title in the college catalog, but by the name of the book I have been using as long as I have taught it, The Economic Way of Thinking.  Capital letters and all.

When I do so in his hearing, however, my colleague, C. Nicholas Gomersall, will invariably attempt to correct my diction, asking me in his polite Yorkshire way to speak of ways of thinking instead.  Sometimes I have assented, sometimes not.  But, almost always, a little part of me adds something to the effect of “Okay, sure, but mine is the best one.”

Yet beneath Nick’s observation, beneath even my silly intellectual sneers, lies an important point.   A sobering point.  Even a disturbing one.

You see, Nick is absolutely correct.  There isn’t one economic way of thinking, but several.  In fact, more than several.  Not only are there the ways of thinking Nick and I like to compare and contrast, the neoclassical and the Marxist, the Austrian and the Chicago, the Keynesian and the post-Keynesian, the Smithian and the Mercantilist.  There are the Donald Trump and the Bill Gates, the Dubya and the Barack, the Oliver Stone and the Frank Capra.

But enough of my gratuitous allusions and false dichotomies.

Nick’s point is sobering because, as a teacher of economics (and not merely a holder of a particular way of thinking about economics), I have a responsibility to properly account for that intellectual diversity.  And somehow do so when I know that getting across enough of even one way of thinking in a single semester is problematic.

Were my only pedagogic goal to introduce concepts and vocabulary, it wouldn’t be so bad.  But I’ve always been of the opinion that Economics 130 is economics for citizenship.  Economics for those who are to be members of a society, not just atomistic individuals bouncing off each other in a sort of self-absorbed Brownian motion.  Economics that only matters to the extent that it gets used after the student has finished my class and received my grade.

If I have to deal with all those other ways of thinking, how in the world am I going to ensure my students can apply any of them?

That’s the sobering part.

The disturbing part is that a truth lies within my sneer, too.

Oh, not that my way is necessarily the best.  That’s pure hubris.

No, the disturbing bit is that one can put to use any economic way of thinking only to the extent one believes that the way being used is the better one.  I don’t think in terms of opportunity cost because it might be better than the alternative.  I think in terms of opportunity cost because, at my core, I believe that doing so is a better way to think.

And so, if I want to teach my students economics in a way that they will apply it after they leave my classroom, I have to teach them how to figure out which way of thinking is better and which way of thinking is worse.

And that means a lot more than just putting the different ways of thinking on the table.  It means I have to somehow get them to adopt a way of thinking about ways of thinking.  A way that allows them to weigh the benefits and costs of thinking my way or Nick’s way or Oliver Stone’s way or Adam Smith’s way.  A way by which they can better decide which way is better.

I can’t do it in a way where my way of thinking always wins by default.  Even when I think, and think deeply, that it does.  Yet I can’t just say all ways of thinking are equally worthy of belief, either.

Because they never are.

Oh, sure, sometimes one way might be better, and other times another way.  Nothing is always best.

But that’s not the point.  At any given moment, at each particular time when a particular decision must be made, one is going to be better and another one is going to be worse.  If it weren’t so, there’d be no need to choose.  Choices matter only when the alternatives are unequal.

Choosing means deciding about better and worse.   And teaching, which is nothing if it is not preparing people for better choosing, means being able to say two things in a believable way, often within minutes of each other:

1. “My way” is not the only worthy way.
2. “My way” is better.

The statements need not be contradictory.  One is a statement of a general truth.  One is a statement of a truth for a particular situation.

But I can guarantee that those listening will often hear a contradiction.  And, unless the teacher is very, very careful, and maybe even if he is, a contradiction that greatly damages the teacher’s credibility.

And when that contradiction gets repeated with some frequency, as, I regret to say, it probably will?   Well, it’s no accident that average student evaluations of the teaching quality of economists tend to be lower than the overall average.  And, more unfortunately, it’s no accident that so many graduates of introductory economics classes end up not applying what their economics professors were trying to teach them to apply.

As I said.  Sobering.

And disturbing.


If how we respond to failure is so revealing of character, why are we so reticent to reveal our failures on resumes, in our profiles on social and professional networking sites, in interviews and cold calls, in website bios, etc.?

Oh, I understand why it’s a bad idea to clutter a job-hunting resume with the various times in the last 32 years when I have fallen short. Those who read my resume are busy people who have only a few seconds to decide whether to read on or give me a call.  The last thing I need — or they need — is a bunch of detail about where I haven’t performed.  What they need is evidence of where I have performed.   Evidence that I might be able to get a job done and done well.

In the job-hunting context, a resume’s traditional first function is enabling the saying of “no.”  When you’re confronted with dozens or hundreds of applicants you need a quick way of reducing the possibles to a manageable few.  Okay.  I get that.

But we provide resume-type information in a lot of contexts beyond simply hunting for a job.  Contexts where the information provided doesn’t have to serve that same weeding-out function.
Contexts where the reader has already reduced matters to a manageable few.  Contexts where you’ve already made it past that first cut.

When a visitor to your website clicks on your bio, they’ve already provisionally decided that you might be interesting.   And so the contents of your bio should be all about keeping them interested.  And keeping them interested requires more than reciting your accomplishments.   It requires revealing some breadth and depth about the way you think, about the way you deal with successes and failures, about the way you approach life.  In short, it means starting to reveal something about your character.

Yet what do most website bios do?  They may not be formatted like a job hunter’s resume, but their content is the same.  They trumpet accomplishments — degrees, past jobs, awards, publications.  Credentials and citations and important clients and friends.

Nothing wrong with a bit of trumpeting per se, but it can’t be all you do.  At least not if your goal with the bio is getting a response other than “what an arrogant bore!”

Does this mean you should go off on an extended discussion of your problems over the years?  No.  Not at all.  The reader of your bio isn’t interested in being your brother or your therapist or your bartender.

What it means is that your bio reader wants to know something more about you.  About what you are.  About how you do what you do, not about what you’ve done.   They aren’t interested in your problems, but they are interested in how you deal with problems.

Most people that are going to be thinking about buying your goods, your services, your ideas, are going to realize that everyone falls short sometimes.  (And do you really want to work with the minority who don’t?)  Every product fails some time.  Every consultant gives bad advice from time to time.  Not every idea is a good one.  Unless they’re just looking for someone to do an assembly line task, they’re going to want to know what you do when things don’t work right, when an unreasonable deadline is imposed, when the stress level increases.  When you fail.

A student who insists on teachers who never make mistakes should never sign up for my class.  A company who believes perfection is possible (as opposed to worth striving for) should not hire me.  Blog readers who want all my postings to be relevant to them, correctly argued, who expect me to always practice what I preach, should take a walk over to the About page … and probably keep going.

Because I’m going to make mistakes in my class.  I’m not always going to give perfect advice.  I’m not always going to say the right things in the right way.

I’ve disappointed some people in the past.  And, alas, much as I would rather it be otherwise, I’m going to disappoint some people in the future.

The world still needs the old-fashioned resume.  Employers, awarders of grants, anyone who has to deal with lots of applications — all these people need weeding tools.

But the world also needs transparency.   I need ways not just of getting past the first cut, but reasons for being paid attention to in depth.   I don’t need to be without error.  I need to show how I iterate through and beyond my errors.

I need to show that I follow my own advice.  I need to show that, most of the time anyway, I …

Listen.  Think.  Repeat.


I haven’t been blogging much the last several weeks.

Part of it’s been the distractions of the usual suspects: dealing with the needs of my 87-year-old parent; my lifelong penchant for avoiding work whenever possible; my propensity for getting distracted by dog and household tasks.

And part of it has been the need to put some extended work in on three major projects: (i) following through on a long-standing promise to help a novelist friend and client put together his website; (ii) doing my job as “Vice President of Marketing” of the Economic and Business Society and helping it get ready for conferences this April in Grand Rapids and in May 2010 in Braga, Portugal; and (iii) readying several additions, too often postponed, to the Iterative Listening website, additions that I want to have up absolutely no later than Christmas.

Even more, though, I’ve been grappling with the “Technology and Education in the 21st Century” project. That project, initially started with the aim of publishing a book on the needs of economic higher education in today’s world, has morphed in a major way.

Oh, the book part is still on the table. (Click here for a reasonably current picture.) I feel more every day that I’m on to something, and that the book needs to be done. And while there is a part of me that worries about getting as much of it done as I promised in my college sabbatical application, another part of me is yelling that I need to get more of it, maybe even all of it, done before returning to the classroom full-time.

But it isn’t just the book anymore. It’s about how I redo every one of my classes from the bottom up. It’s about how I’m retraining myself as a teacher, cleaning the garage out and trashing all the accumulated clutter from twenty-plus years of teaching. It’s about how one synthesizes “in school” higher education needs with “out of school” needs for business and professional education. It’s … well, it’s a whole lot of stuff that has to be put together and kept from exploding like an overinflated balloon.

But the biggest reason is that I’ve been contemplating a major addition to the topics of conversation. My last post, Nov 21 on judgment, was an example of what is going to be coming. And coming, I expect — no, I hope — with a good deal of regularity.

In one sentence: I am going to be much more transparent and vocal about my Christian believing.

For all its importance, religious belief is something that a lot of us prefer to compartmentalize and separate from the rest of our lives. Every Christian churchgoer past the age of, say, 15, has heard dozens of sermons complaining of “only on Sunday” worship, yet virtually all of us think “business” and “religion” should be kept separate.

I would venture a guess that, did I poll 1000 experts on “starting a new business” or “how to succeed in business,” 990 of them would tell me to keep God completely off the blog and the rest of the company web site. (The other 10 might allow me a vague bit of “Christian commitment to service” or some such wherever it is that I post the company’s mission, places like here and here.)

So, it isn’t a step I’m taking lightly or without a great deal of contemplation on the costs and benefits. And, yes, prayer, too.

But I am taking it, and I have no intention of taking it tentatively. Because if I believe, as I do believe, in the utter essentiality of the Great Commandment (“Love the Lord your God with all your heart and with all your soul and with all your mind and with all your strength” [Mark 12:30-31]), I have no business trying to separate things.

For some reason, I don’t think He who said to a disciple, after the disciple had asked to be excused to bury his father, “Follow me, and let the dead bury their own dead” [Matthew 8:21-22], is going to be particularly impressed by my relegating my own following to non-business hours. Being a Christian is more than just admitting that I am one or saying that “my prayers are with you” when an occasion for sympathy presents itself. More than saying “Happy Thanksgiving” on the last Thursday of November or “Merry Christmas” at the office Christmas party.

Oh, I’m not planning on a lot of hellfire and damnation sermons. That to my mind leads too easily to violation of the “Judge not” teaching. With my legal and professorial background, it’s already hard enough to resist my natural propensity for scribe and Pharisee mode. Though I believe in the inevitability of God’s judging, I find grossly offensive and hubristic the rantings of all who would profess to know others’ fate when that day comes. It is, to my mind, one of the greatest shortcomings of “organized” Christianity.

Nor do I intend this to be a place where I’m actively seeking to persuade my non-Christian readers to convert to the true faith. I consider myself an evangelical, as someone who wants all to believe in Him; but I can’t do the TV preacher hard sell. I just look ridiculous thumping the Bible that way.

No, my witness is my own life. How I go about striving to follow His teaching, and shape my life as I think He wants me to shape it. How my actions and beliefs change as I listen to His teachings. The example I set, or fail to set, with my own belief.

And how I credit God for that life and that belief. How and when I bring Him up in conversation, online and off. How I admit His presence in and into my life. How, if I have any good ideas, they come through Him; and how, if I have bad ideas, they come despite Him.

Because even though God won’t be mentioned in every post — far from that — He, capitalization and all, is in all of them.

He always has been. And it’s time I admitted it.

Long past time.


Just discovered another home run by Michael Lopp, the software engineering manager who is the real world alter ego of the pseudonymous Rands (www.RandsInRepose.com).  In the chapter of his Managing Humans titled “Trickle Theory,” he says,

Fact is, your world is changing faster than you’ll ever be able to keep up with, and you can view that fact from two different perspectives:

•    I believe I can control my world, and through an aggressive campaign of task management, personal goals, and a can-do attitude, I will succeed in doing the impossible.  Go me!

Or . . .

•    I know there is no controlling the world, but I will fluidly surf the entropy by constantly changing myself.

Surfing entropy.  It’s a marvelous metaphor.   I love it.  (And I just bought the domain name. Nyah nyah nyah.  :) )

*     *     *     *     *

Of course it’s easier said than done.  As Lopp says in concluding the chapter, it takes confidence. The kind of confidence that you can earn only by trying the constant-adapt-yourself-to-the-impossible approach for awhile.  And if you’re anything like me, having confidence to do the stuff that builds confidence is … well, it’s just really hard.

And I’ll be honest:  That “for awhile”?  It’s a long fight.  You don’t just say “I’m going to build confidence that I can do the adapt thing” today and tomorrow become a Michael Lopp.  It doesn’t work that way. There’s going to be a constant chorus of smart little voices yelling from the back of your brain and telling you, “You’re lost,” “It’s impossible,” “Give it up!”  And there won’t be any shortage of voices, internal and external, telling you that you just have to get a better plan, a better to-do list, a better way of doing things, a platoon of Marine drill sergeants yelling “Jump!”

No, surfing the entropy is going to take more than donning a mental Hawaiian shirt and heading to your interior Malibu.  And I’d be lying if I said I were there yet.

I am, however, closer than I was six months ago, and a lot closer than I was three years ago.

And for me, the key, the thing that has kept me moving forward and building a bit of that confidence, is that I have a set of mental gurus on call.  Half a dozen people who, when I get an email from them or open one of their books, say something that I’m going to listen to uncritically.

These gurus don’t have to be “mentors.”  They don’t even have to be someone you know “personally.”  I’ve never met or talked with Lopp.  Closest I’ve ever come to that sort of contact was my once leaving an unanswered comment on his blog.  But he still fits the “guru” bill.

Because the key is that I can listen to what they are saying uncritically.  To listen to someone uncritically is often pooh-poohed, especially by people like me (teachers in the business of convincing others to use their critical faculties).  But it actually is a valuable part of the thinking  toolbox.  Because, when you are listening uncritically to a guru, three things are happening.

First, you are extending unconditional trust.  No one can be a guru for you unless you trust him or her deeply.  Whether it’s their charisma, their track record, their personal relationship with you, or some combination of all three, you give this person more than your usual level of trustingness.

You don’t have to *always* be unconditional in your trust.  Every guru I’ve ever had, every guru I will ever have, is human.  Capable of error.  Capable of vice.  Capable of saying and doing things that are boneheaded or wrongheaded or both.  None of them do I always listen to uncritically.

But all of them I do listen to uncritically sometimes.

Until you reach the point where you can just “soak up” what someone says, without adding your own elaboration or clarification or “but, what about…”, that someone may be a valuable source or colleague or friend.  But he or she won’t be a guru.  Not to you.

The second thing that is happening is that you are in fact soaking up valuable information.  When you are listening uncritically to a guru, you are still learning something you didn’t know before.  You are seeing things a little bit differently, understanding something a bit more, walking a mental path you haven’t managed to walk before.

The third thing that is happening is that you are reinforcing your own pre-existing belief.  This third thing is what the “critical thinking” people are always harping on as bad:  it’s the “not being open-minded” thing, the “playing to your biases and prejudices” thing.  When done too much it makes you the prey for the manure I ranted about in my last blog entry.

But if the second thing is happening, if you are actually getting new and valuable information from that guru, the reinforcement of your pre-existing belief does something else.  It increases your confidence in yourself.  You are more sure than before in the rightness of your position.  You “know” that you are correct, that you are doing the right thing, that you are approaching things from the right frame of mind, to a greater extent than you were before.

You may not be surfing the entropy yet, but you’re starting to see how to try a wave or two.

*     *     *     *     *

So where does one find these gurus, Wade?

I’m tempted to Om, do my best to look wise, and say, “When you are ready, they will appear, grasshopper.”

But that’s the smart-ass in me talking.  If truth be told, I don’t really know.  And my attraction to Zen-ing notwithstanding, I think finding gurus is one of those entropic things.  What you do is you ignore the impossibility, and try to find ways to make yourself trusting and uncritical.

Find an opportunity to do some uncritical listening daily.  When someone you respect says something, and your first reaction is something along the lines of “But, what about …” or “No, that’s not quite right …” or “But isn’t it true that …”, stop yourself.  Don’t insert a critical remark.  Treat it as a temptation and do what God tells people to do about temptation.  Resist.

Do it once a day.  Shift yourself away from “I’m right” toward “I agree” by way of “I’m listening without planning my next rebuttal.”  Do it once a day and I bet you’ll find you haven’t lost a thing.

Then up it to twice a day.  Believe one absurd thing before lunch, and another before going to bed.  Put “listen uncritically twice today” at the top of your to-do list.  Don’t worry, there’s little chance you’re going to eliminate your habits of critical thinking.   But you might just discover that uncritical thinking isn’t always a bad idea.

And I’m betting that if you keep this habit for awhile, you’ll start seeing the same people pop up.  You’ll start noticing who you’re trusting with your uncritical listening moments.  That neighbor, that science fiction writer, that scholar, that whoever it is that is always saying things that you say “of course!” and “aha!” to.

And you’ll have found someone who may be one of your gurus for a while.

Someone who can help you move toward a higher comfort level with “constantly changing yourself.”

Surf’s up!

*     *     *     *     *

“Can’t.”  It’s easily among my least favorite words.  Right up there with “w-o-r-k” and “b-i-l-l” and “h-a-t-e.”  A real four-letter word.

I can’t do …
I can’t afford …
I can’t …
I can’t …
I can’t …

Too often, can’t simply means won’t.  It reflects our choices and priorities and values.  Choices that we can change.  Priorities that we can alter.  Values that we don’t have to have.

Now, I’m well aware that some things can’t be done at all.  And I’m well aware that not everyone can do  the things that can be done.  After all, there are a lot of things I can’t do, too.    I’m never going to be able to run a four-minute mile.  I’m never going to be very good at small talk.  I’m not very likely to stop being a messy pack rat any time soon.  And a dozen others.

And I’m aware of your time constraint.  Just like me, you only have 168 hours in your week to spend.  And no matter how much you might want to say, “yes, sure, I’ll do that,” you have to say no sometimes.

No, I understand why “can’t” can be a necessary word for you to use.

But it’s just that it’s not a very helpful word for me when you use it.

You see, if you keep saying “I can’t” to me, I’m not going to be sympathetic.  I’m not going to understand.  Because however true your “can’t” might be, I’m not just going to be noticing physical inability or your time constraint or any of the other perfectly legitimate reasons you have for saying it.

What I’m going to be hearing is, “you don’t have time for me.”  What I’m going to be hearing is, “you’re not interested in helping me with what I need.”  What I’m going to be hearing is, “the only person that matters to you is you, not me.”

Look, I know that’s probably being unfair to you.  You do care.  You are interested.   But, you see, that’s the problem with the word.  It tends to give the wrong impression.

Because “can’t” isn’t just a statement of your abilities.  It’s a word that can’t help its hearer.  When I’ve asked you to do something, and you’ve replied, “I can’t,” I still have that something that needs doing.

Something that is coming up against my only-168-hours-in-a-week constraint.  If I’ve said something which inspired an “I can’t” from you, it’s an extremely good bet that what I said was inspired by my own need to find a way to deal with that time constraint of mine.

And your throwing of “I can’t” into the conversation does absolutely nothing to help me with my problem.

In fact, your use of the phrase likely means I’ve got more to worry about than I did before you used it.  Because in addition to the original something (that still needs doing), now I have to also account for something else one of my students, employees, friends, family members, acquaintances, etc., can’t do.  I have to put a little sticky-note in my mind somewhere that says, “when I need X again, make sure to go somewhere else for help.”  Or one that says “remember that Y has this problem.” Or both.

And some of that is fine.  Even inevitable.  Anyone who wishes to have teachers, employees, friends, family, even acquaintances needs to be willing to add both kinds of mental sticky-notes.

The problem comes with repeated use.  When you say “can’t” to someone over and over and over again, you just highlight the subtext.   Whether you intend to do so or not, you are saying to that person that their needs just aren’t very important to you.   You’re saying you can’t be bothered.  You’re saying that you’re only interested in sucking energy from the relationship, not in  providing energy to it.

But what am I to do, you ask.  How am I to avoid saying “I can’t” too much when the 168-hour constraint and the rest of my personal limitations are, after all, real?

Well, I’m afraid I don’t have any good answers here.  As I expect anyone who knows me can attest, I myself am doubtless among those who say “I can’t” far too often.  But I do have three suggestions.

The first step, the essential step, is self-awareness.  You want to get in the habit of noticing when you are using the word.  Several times a day, ask yourself, “How many times have I said ‘can’t’ recently?”  Put a sticky-note on your computer monitor that asks, “Have I said or written ‘can’t’ in the last hour?”  Use the “find” or “search” features of your word processor or browser or email client to see how often you use the word in print.  After you’ve had a conversation with someone, ask yourself how many times you said the word “can’t” in that conversation.
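If a lot of your writing already lives on your computer, you can even automate that counting.  Here’s a minimal sketch in Python, assuming your sent mail or drafts are sitting around as plain-text files; the folder name is made up, so point it at wherever your own words actually live.

    import re
    from pathlib import Path

    # Matches can't, can’t (curly apostrophe), and the apostrophe-less "cant".
    CANT = re.compile(r"\bcan['’]?t\b", re.IGNORECASE)

    def count_cants(folder="my_writing"):
        total = 0
        for path in Path(folder).rglob("*.txt"):
            hits = len(CANT.findall(path.read_text(errors="ignore")))
            if hits:
                print(f"{path}: {hits}")
            total += hits
        print(f"total uses of the c-word: {total}")

    if __name__ == "__main__":
        count_cants()

Run it once a week and see whether the total is shrinking.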

At first, I expect, you’ll miss most of the times you use the word.  But as you get in the habit of looking for it, I think you’ll be surprised just how big a part of your vocabulary the word really is.

Again, the goal in this first step is just self-awareness.  Before you can worry about how to change a habit, you must be aware of the habit.

Step two further strengthens that self-awareness, but also provides the most important route to changing the habit.  Once you’ve got in the habit of noticing when you are using the c-word, start focusing on how you use it.  And then start changing your diction.

Does your use of “can’t” tend to be a conversation ender for the other party?  What are the next words out of your mouth?

Are they perhaps “because I…”?    Are they:  “I can’t talk right now because I…”?  “I can’t do that because I…”?  “I can’t handle that because I …”?

“I can’t” modified by “because I…” may be slightly better than an unadorned “I can’t”, but it is nowhere near as adequate as people think.  All “because I…” does is give your reasons for saying no.  It still says no.   Giving the reasons for your rejection may be valuable for getting the rejected person to be more sympathetic to your needs, but it does not help at all with the rejected person’s own needs.

And remember, when someone asks you for something, they aren’t doing it primarily to satisfy one of your needs, but to satisfy one of theirs.  Your inability to satisfy their need doesn’t eliminate their need.  In fact, for them, it just accents that need.

And so the key to breaking the negative parts of your “I can’t” habits is changing your language.  Instead of focusing on your reasons for saying no (the natural temptation), focus on providing the asker with help in fulfilling his needs.  Focus on being seen, not as a no-sayer, but as a person who offers valuable alternatives:

“I don’t know how to do that, but did you know Y is really good at doing that?”
“I’ve never been able to do that very well, but have you talked to Z?  She’s an expert.  Inexpensive, too.”
“I’m already booked for the next four weeks, but have you tried…?”
“I can’t do A, but I can do B.  Will that help at all?”

And so on.

The point is that you want your words to say, “I understand your needs.  And I’m interested in helping you satisfy them.”  The focus is not on providing reasons and getting sympathy from others, but on giving sympathy.  Not on getting value, but on giving value.

Step three?  Well, step three is dedicated to the spirit of Marines everywhere.  Step three is the there’s-no-such-thing-as-can’t-only-won’t principle.  Step three is “Replace ‘can’t’ with ‘Can do!’”

I admit to having really mixed feelings on this step.

On the one hand, I very much admire the Marines.  And I tend to believe that a lot of “I can’t”s are really just the can’t-sayer’s way of saying won’t.  I’ve taught for over 20 years, and I’ve seen lots of times when students have said “I can’t” regarding something when in fact I know they can.  Lots of times when I’ve wanted to reply, “just shut up with the can’t and do it.”

But on the other hand, I also  know there are things for which “I can’t” and “I won’t” are each appropriate responses.  In that same 20 years of teaching, I’ve seen lots of times when a student should have been saying no to me or my fellow teachers.  I’ve seen lots of times when a student really couldn’t do something.

“Can do” isn’t always possible.  And even when it is possible, it isn’t always wise.

I guess that’s why this is only step three.  It’s far more important to be aware of the can’ts than to eliminate all of them.  It’s far more important to focus your no-saying on the needs of the other person than it is to stop saying no.

In the end, it’s not about deleting four-letter words from your vocabulary.

It’s only about knowing when to use them.

*     *     *     *     *

My remote stops and lingers on CNN less than once a month.

With the exception of the occasional seismic event (9/11, election night), I couldn’t care less what the people on CNN say.  And even during those seismic-event moments, I’m not going to be lingering on CNN; I’m going to be flipping back and forth between CNN and various other channels I visit equally rarely (FoxNews, MSNBC, ABC/CBS/NBC).

No, it’s not apathy about what’s happening in the world.  And, no, it really doesn’t have much of anything to do with politics or media bias or any of those usual suspects.

I don’t listen to TV news sources because they simply aren’t trustworthy sources of information anymore.  And I no longer have time for “information sources” who consistently prove unreliable.  Information sources whose research is incomplete.  Information sources who only tell one third of one half of one perspective on a story.   Information sources who are too busy shouting about the pebble of news they have found to realize that they are standing in the middle of a quarry.

Now, I understand that one function, perhaps the most important function, of information sources like CNN and the others is to filter information for me.  To pick the important bits out of the mass of data out there, the mass of data that I simply don’t have the time or energy or contacts to filter on my own.

Point taken.

But if you’re going to be an information filter, that brings with it an awesome responsibility.  Not a responsibility to “present all sides,” whatever the eff that means.  Not to be “fair and balanced,” another platitude that has the intellectual content of the instructions on a tube of Preparation H.

A responsibility to sift the wheat from the chaff.

A responsibility to go deeper rather than shallower.  To go broader rather than narrower.  To know when it’s time to look at the forest, and when it’s time to examine a specific tree.

A responsibility to go beyond first impressions and gut reactions.   A responsibility not to get distracted by trivia or celebrity or ephemerality.  A responsibility to avoid the facile categorization and the half-thought-out conclusion.

A responsibility not to shout the loudest, but to listen the hardest.

And, sad to say, this awesome responsibility is far too rarely met.  CNN, the others — they fail abysmally at it.  Over and over and over again.

Why?  I don’t know.  Perhaps it is their incompetence.  Perhaps it is their ideological bias.  Perhaps it is just imperfect human beings being imperfect human beings.  I don’t know.

And, frankly, I no longer care.

Because, in the end, the reason for their failure doesn’t matter.  Only the failure itself.  In a world of information glut, a world where each of us gets bombarded by something between 3,000 and 30,000 messages a day, I simply don’t have time for information sources with that low a signal/noise ratio.  I have too much muck to wade through as it is.

It’s like finding time for a daily walk with the dog.  Why in the world should I take that walk through a swamp filled with disease-carrying mosquitos?

And no matter how many perfectly coiffed pretty people with top-of-the-line orthodontia you put at anchor desks, TV “news” channels, no matter how good your talking heads get at combining celebrity and a sense of gravitas, you keep insisting I walk through swamps.

I don’t have that kind of time.

Or interest.

Swamp gas may be an alternate source of energy.  But it’s still swamp gas.

No thanks.

*     *     *     *     *

Slashdot?  What’s that?

Most of my offline friends, colleagues, and acquaintances have never heard of Slashdot.  Heck, my guess is that a majority of my online contacts haven’t either.

That’s too bad, because despite its “News for Nerds” masthead, Slashdot offers anyone who visits, nerd or non-nerd, an excellent opportunity to reap the value of a listening mindset.

In fact, for most visitors, nerd or otherwise, Slashdot (abbreviated /.) is going to have its greatest value only to the extent that the visitor enters with that listening mindset.

Look, I like talking.  This blog wouldn’t exist if I didn’t.  But I very rarely post to /. discussions.  Not because I don’t have anything I want to say (go to /. and within 5 minutes I’m probably muttering to myself), but because I find it impossible to post there effectively.

Oh, it isn’t truly impossible.  However, given the way /. works, together with the way *I* work, it makes very little sense for me to post very often.  Because I’ve never paid for a /. “subscription,” and because I don’t set my /. up in a way to immediately notify me of stories, and because I don’t typically find myself reading a /. thread until it has been online for at least 6 hours, or even a couple days, my first opportunity to post on a thread comes at a moment when the thread already has a few hundred entries on a couple dozen sub-threads.  Add in the moderation system that assigns a score to each post, and each user’s ability to screen posts based on score, and it’s pretty darn likely that any post I add is going to disappear unseen and, worse, uncommented upon.

And since I’d be posting primarily because of a hope that someone might listen … well, I’m just not going to post very often.
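If it helps to see the mechanics, here’s a rough sketch of that score screening in code.  It is my simplification, not Slashdot’s actual data model or moderation algorithm; the field names and the sample thread are invented.  The point is just that a reader’s threshold decides which posts ever get seen, and a late, unmoderated post sits below most readers’ thresholds.

    from dataclasses import dataclass

    @dataclass
    class Comment:
        author: str
        score: int   # on /. the moderation score runs from -1 to 5
        text: str

    def visible(comments, threshold=1):
        """Only the comments a reader browsing at `threshold` ever sees."""
        return [c for c in comments if c.score >= threshold]

    thread = [
        Comment("AC", -1, "first post!"),
        Comment("longtime_reader", 4, "a genuinely insightful point"),
        Comment("me_arriving_late", 1, "my unmoderated two cents"),
    ]

    print(len(visible(thread, threshold=2)))   # prints 1; my late post never shows up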

This used to annoy me.  But it doesn’t anymore.

Why not?  Because at some point I realized just how many people there were on /. who had something to offer me if I just went on and listened.  My inability to post there isn’t a weakness.  It’s a strength.

The thread titled “Give Up the Fight for Personal Privacy” is, to my mind, a perfect case in point.  Because of my personal intellectual, epistemological, and ideological interests, and because of my deep-seated fears for my own privacy and the ability of states and organizations and assorted scum-bags to invade it, I pay a lot of attention to privacy issues.  I expect I know more about privacy and its infringement than most people.  I would go so far as to say my knowledge of the potential for privacy invasion is greater than that of 90 percent of the individuals I know.

But, this /. thread makes it clear that, despite all that interest of mine, I’m still a piker.  What I know may be huge, as compared to what my mother knows or my reading group pals know or my teaching colleagues know.  But it’s tiny compared to what I don’t know.

Unfortunately, if you want to know what I mean you’re going to have to read a big chunk of the thread.  And that means a few minutes of your time.  When I started writing this blog entry, this /. thread on privacy already had 565 comments. By the time I finish this article, post it on Iterations, and you read it, who knows how long the /. thread will have become.

Because, I’m sorry, but I can’t summarize the thread for you.  I couldn’t even if I had read the whole thing.  (I’m only about 30% through the thread right now.)  That’s part of the point, you see.  Read a /. thread on a topic of interest to you, and you’ll hear a lot of really smart people out there with a lot of different perspectives on the topic.  Perspectives that cannot easily be reduced to this or that interest or ideology.

Listen to your average television panel yakking about privacy,  or attend an academic conference session devoted to the subject, and it’s pretty easy to line people and ideas up, and place them into nice and neat categories.  But read a /. thread on the topic and you simply cannot do the same.

Oh you’ll see a number of trolls and knuckleheads (especially if, as I do, you set your reading threshold to admit all posts, even those with a score of -1).  But you’ll also see lots of really smart people making very good points that you cannot make consistent with each other.   You’ll find yourself realizing, over and over again, that even the bits you thought you had figured out already aren’t quite as obvious as they used to be.

At least I do.

And that’s why I try to find room to read at least one big chunk of a /. thread several times a week.  Each chunk may take me 15-20 minutes, or more, to work my way through.  But I find it’s invariably going to be 15-20 minutes well spent.  I’ve been visiting /.  for a couple years now, and I’ve *never* been disappointed with the chunks I’ve decided to spend 15-20 minutes with.

Even if its abbreviation weren’t so cool, Slashdot belongs on the “1000 cool things” list.

News for listeners.

*     *     *     *     *

Tenured faculty members enjoy one perk still very rare in the rest of the working world — sabbaticals.  And for more than one reason, none of which I’m going to go into here, I’m glad I’m on one of those sabbaticals right now.

But today I find myself with an additional reason for gladness.  Were I in the classroom this fall, I’m sure I’d be getting “Well, what do you think about the situation in the financial markets, the proposed bailouts, etc., etc.?”  Such questions go with the territory when you teach economics.

But, while I have opinions, I am absolutely certain that very few of my students, or my colleagues, or my superiors, or my students’ parents would appreciate them.  And they would appreciate my explanation of them even less, for, even for someone as verbose as I usually am, my explanation in this case would be very, very lengthy.  It is not a simple situation.  It doesn’t have simple answers.  And any question or answer that claims otherwise is wrong.  Including, I expect, every question and answer about it in the upcoming presidential debates, which I refuse to waste my time upon.

And, were I in the position of teacher this fall, I would feel compelled to give that explanation.

Compelled in a way that I do not feel in this blog.  Here, I’m perfectly content to just blurt out my convictions without explanation, just like everyone else.  And, no, I’m not particularly interested in arguing these points right now — because it would take too much time, and, believe it or not, there are far more important things to talk about.

So here are my blurts:
1.  Why in the world does anyone think that “government” can be trusted with a bailout decision?  Has the Bush administration, or the Congress, or their respective predecessors for the last 20 years or so, together or separately, provided any evidence that they should be trusted with $700, much less $700 billion?

2.  The people of this country have become addicted to the idea that “papa government is here to help us with every travail of our lives.”  Despite that government showing, over and over and over again, that it belongs at the top of any list of delinquent and abusive dads.

We are like Ponzi-scheme victims who never learn:  First we went to them because they promised solutions to million-dollar problems.  When, helped by their fraud and mismanagement and out-and-out “governing” stupidity, those million-dollar problems were replaced by billion-dollar problems, we went to them again.  And, now, after more fraud and stupidity and help-that-is-no-real-help-at-all, we are going to them with “You Must Do Something About” a $700 billion problem.

Yes, America may have a financial crisis right now.  And I certainly have no easy solutions, much less any painless ones, to that crisis.  But for the life of me, I simply don’t understand why people think the Bushes and the Pelosis, the McCains and the Obamas, are going to be the ones with the solution.

In the days, weeks, and months ahead, I expect no shortage of bickering and finger-pointing and grandiose plans promising solutions to this latest crisis.  But, frankly, I have no confidence in these people even if they do somehow manage to come together and actually embrace the true bipartisanship they all claim is essential.

Because that’s not how Ponzi schemes work.   When we trust these con men and con women — any of them — with the $700 billion, all we are assured of is that tomorrow they’ll be asking us for control over $7 trillion more.

And that’s what we do with every problem today.  We ask con men and con women to solve it for us.

Pick any major national, regional, or state newspaper.  Dig into its archives for the last, say, five years.  And I bet you will be hard-pressed to find three consecutive issues whose front pages don’t have one or more stories reflecting a majority opinion that “the government must do something” about some problem or event or catastrophe.

Heck, I’d be surprised to find two consecutive issues.  Ours is a nation whose history and success has been built on the notion of self-reliance.  Yet that self-reliance has become too much of a myth.  Every act of God or of our fellow man now deserves government assistance and retribution.

No single Ponzi scheme, not even one of $700 billion, can threaten an economy capable of generating at least $14 trillion of new value every year.  After all, $700 billion is only about five percent of one year’s output.

But a cultural addiction to the lures of Ponzi-masquerading-as-Papa?  That can.

All content of this blog, except comments added under names other than "Wade," are copyright © 2008, 2009 Wade E. Shilts