Archive for the "foibles of the listener-in-chief" category

I haven’t been blogging much the last several weeks.

Part of it’s been the distractions of the usual suspects: dealing with the needs of my 87-year-old parent; my lifelong penchant for avoiding work whenever possible; my propensity for getting distracted by dog and household tasks.

And part of it has been the need to put some extended work in on three major projects: (i) following through on a long-standing promise to help a novelist friend and client put together his website; (ii) doing my job as "Vice President of Marketing" of the Economic and Business Society and helping it get ready for conferences this April in Grand Rapids and in May 2010 in Braga, Portugal; and (iii) readying several additions, too often postponed, to the Iterative Listening website, additions that I want to have up absolutely no later than Christmas.

Even more, though, I've been grappling with the "Technology and Education in the 21st Century" project. That project, which began with the aim of publishing a book on the needs of economic higher education in today's world, has morphed in a major way.

Oh, the book part is still on the table. (Click here for a reasonably current picture.) I feel more strongly every day that I'm on to something, and that the book needs to be done. And while there is a part of me that worries about getting as much of it done as I promised in my college sabbatical application, another part of me is yelling that I need to get more of it, maybe even all of it, done before returning to the classroom full-time.

But it isn’t just the book anymore. It’s about how I redo every one of my classes from the bottom up. It’s about how I’m retraining myself as a teacher, cleaning the garage out and trashing all the accumulated clutter from twenty-plus years of teaching. It’s about how one synthesizes “in school” higher education needs with “out of school” needs for business and professional education. It’s … well, it’s a whole lot of stuff that has to be put together and kept from exploding like an overinflated balloon.

But the biggest reason is that I've been contemplating a major addition to the topics of conversation. My last post, on Nov. 21, about judgment, was an example of what is going to be coming. And coming, I expect — no, I hope — with a good deal of regularity.

In one sentence: I am going to be much more transparent and vocal about my Christian believing.

For all its importance, religious belief is something that a lot of us prefer to compartmentalize and separate from the rest of our lives. Every Christian churchgoer past the age of, say, 15, has heard dozens of sermons complaining of “only on Sunday” worship, yet virtually all of us think “business” and “religion” should be kept separate.

I would venture a guess that, did I poll 1000 experts on “starting a new business” or “how to succeed in business,” 990 of them would tell me to keep God completely off the blog and the rest of the company web site. (The other 10 might allow me a vague bit of “Christian commitment to service” or some such wherever it is that I post the company’s mission, places like here and here.)

So, it isn’t a step I’m taking lightly or without a great deal of contemplation on the costs and benefits. And, yes, prayer, too.

But I am taking it, and I have no intention of taking it tentatively. Because if I believe, as I do believe, in the utter essentiality of the Great Commandment ("Love the Lord your God with all your heart and with all your soul and with all your mind and with all your strength" [Mark 12:30-31]), I have no business trying to separate things.

For some reason, I don't think He who said to a disciple, after the disciple had asked to be excused to bury his father, "Follow me, and let the dead bury their own dead" [Matthew 8:21-22], is going to be particularly impressed by my relegating my own following to non-business hours. Being a Christian is more than just admitting that I am one or saying that "my prayers are with you" when an occasion for sympathy presents itself. More than saying "Happy Thanksgiving" on the fourth Thursday of November or "Merry Christmas" at the office Christmas party.

Oh, I’m not planning on a lot of hellfire and damnation sermons. That to my mind leads too easily to violation of the “Judge not” teaching. With my legal and professorial background, it’s already hard enough to resist my natural propensity for scribe and Pharisee mode. Though I believe in the inevitability of God’s judging, I find grossly offensive and hubristic the rantings of all who would profess to know others’ fate when that day comes. It is, to my mind, one of the greatest shortcomings of “organized” Christianity.

Nor do I intend this to be a place where I’m actively seeking to persuade my non-Christian readers to convert to the true faith. I consider myself an evangelical, as someone who wants all to believe in Him; but I can’t do the TV preacher hard sell. I just look ridiculous thumping the Bible that way.

No, my witness is my own life. How I go about striving to follow His teaching, and shape my life as I think He wants me to shape it. How my actions and beliefs change as I listen to His teachings. The example I set, or fail to set, with my own belief.

And how I credit God for that life and that belief. How and when I bring Him up in conversation, online and off. How I admit His presence in and into my life. How, if I have any good ideas, they come through Him; and how, if I have bad ideas, they come despite Him.

Because even though God won’t be mentioned in every post — far from that — He, capitalization and all, is in all of them.

He always has been. And it’s time I admitted it.

Long past time.


Since June 2007 I’ve been on leave from my full-time teaching gig.  And it’ll be another ten months or so before I return to my spot at the front of the room in Olin 101.

While the leave has been bad for my short-term finances, it has done wonders for my intellectual and mental health. For a long list of reasons, none of which I'm going to go into in this post, I needed some time away and I needed it badly. So badly that it was even worth putting up, along the way, with that one 12-month stretch when my "income" totaled a mere $1,000. (In the mystery that is typical of Wade's lifetime approach to personal finance, I had set up the first year away as an "unpaid leave"; only when the second "sabbatical" year kicked in did my paychecks start again.)

But the point of this post is not to whine about my finances either. Doing things the way I did them was my choice all the way and, if I had the opportunity to do it over again, I still would have done the "unpaid year followed by sabbatical" thing. The debts I ran up, doing my part to ensure the low savings rate America is known for? Looking back, I might have kept them a bit smaller, but not significantly so. No matter how annoying/scary that debt balance looks right now, I prefer it to the alternative. So, I'm not going to go off on a whine.

No, the point of this post is to highlight one reason why that leave has been so, so valuable. Because, let's be frank, most people in the working world don't have the luxury of leaves that tenured college faculty like me have. Most people, if they want extended time off from their day job, must quit that job (and its income) with no promise of gainful employment after the leave period is done. Most people reading this blog don't ever get the choice that I had back in 2007 and that I will get again if I stay with my current employer another six years or so.

And that's too bad, because I am utterly convinced we'd be better off if they did. And I don't just mean individually better off in terms of the leave-taker's mental health. I mean better off in terms of aggregate economic wealth. Overall productivity would be higher; there would be more creativity, innovation, and entrepreneurship. And GDP would soar.

Put another way: I'm in no way advocating this (or anything else) as a federal spending program, but if the feds offered $75,000 each to, say, 10 million workers, told them to take a year off without "gainful employment," and attached no strings whatever, I'm betting that the net benefits of that $750 billion program would exceed the net benefits of the similarly sized financial bailout that Congress just passed.

Because the greatest benefits of my leave to date, the ones that have already justified my incurring of that major financial “hit”, were not the ones I wrote down when I asked for the leave and sabbatical.   I still expect to get the outcomes I described on my application — outcomes involving the book I’m writing on education, yes.

But the big benefits?  The really big benefits?  The ones that are really going to impact the quality of my teaching?  The ones that are really going to improve what my students and colleagues take away from my scholarship and my lectures?

All those big benefits are coming from places unexpected. Places like this blog. Places like the copywriting business I launched in September 2007, whose connection to my "professor" life was rather tangential, only to see that business morph in a matter of months toward a focus on educational products and services, a focus that means it will be intimately connected.

In a matter of months, I went from thinking I'd have to choose between my love of teaching and my entrepreneurial aspirations, to realizing that the only way I was going to be as good at either as I wanted to be was if I could somehow find a way to do both.

Both "Professor Wade" and "Iterative Listening, LLC" are going to survive the leave. Neither is going to be what I envisioned back in mid-2007. Both will be better. Far better.

And they will be better because my extended leave gave me the freedom to morph.  Or, to be more  precise about it, the leave helped me see that I had that freedom.  When I was working full-time (and for anyone who wants to be a good college professor, “full-time” means 60-80 hour weeks year-round), I didn’t have the freedom to morph or even see the possibilities for morphing that I might have had.

Here’s just one example of the unexpected benefits that can come from the freedom an extended leave can give.  I was sitting in my favorite recliner earlier today, my dog sleeping in my lap, not having done much all afternoon except grumping about the Packers’ poor performance and sipping a bit of Jack Daniels and Coke.   And I realized that I had never been able to articulate one of my biggest, yet previously hidden reasons for spending my life in higher education:

Spend time in the academy, especially if you focus on teaching, and you can get a lot of Wow! moments.  The kind of moments that happen when you see someone do or say or think something really amazing.  Colleagues.  Students.   People who write the stuff you get to read in summers and spring breaks.

And as I sat there I got to thinking back.  Thinking about people like Jody and Ross and Jon and Amber and Deirdre and Sadad and Ed and Catherine and Russ and Wayne and Storm and Brian and Brian and Tyson and Scott and Matt and Mark and Luke and John and hundreds more who have given me “Wow!” moments since I first stepped in front of a class in the fall of 1987.

And I realized how often those “Wow!” moments were wholly unexpected consequences of being in the academy.    How often the people providing the Wow! were not the A students or the professors most published.  How often they were the people labelled “slacker” or “lazy” or “dumb as a post” or “not very bright” or “not much of a student” or half a hundred other negatives.

Sure, for some of the people on that list, were I to ask my Iowa, Kirkwood, Central, or Luther colleagues about them, all would agree that they are amazing people. But others? I'd hear, "Him? He was a total brick"; "Her? She was a ditz"; "Those two? Those were the dumbest/laziest/worst students I've ever seen."

And because I've had the luxury during this extended leave of not only spending more Sunday afternoons in the recliner with my dog, but of ramping up my reading, I realized how my experience with the Wow! in the academy was just the tip of the iceberg. How many of the Wow!s being brought forth out there are not coming from the people everyone recognizes as disgustingly smart and observant and all the other good stuff. They're coming from the deviant and the socially challenged and the "not very bright" and the "lazy" and all the rest.

At least if teachers and “responsible elders” like me don’t get in the way first.

Now I’m a big believer in the value of the Wow!  The more Wow! as a general rule, in my opinion, the better.

And so, one of the unexpected results of this extended leave of mine is going to be a revision of how I approach my own enabling of the Wow! It means that one of the (unplanned) things I'm going to be doing in the months remaining of my leave is finding better ways to stay out of the way of those who create the Wow!

I've realized that if I want to be that better teacher, I'm going to have to find new ways of short-circuiting myself on those occasions when I'm tempted to label a student as "lazy" or "not very bright." (Because it isn't just my colleagues who have used those labels.)

I’m going to have to find new ways of putting my students in a position where they can bring forth the Wow!

And because I have the luxury of that leave, ten more months in which the bastard parts of academic life aren't going to keep me from my recliner and from thinking about the Wow!, I'm betting that I'll find some of those new ways.

Wow!


For years I have prided myself on focusing my pedagogy on "asking questions" rather than "giving and getting answers."

Unfortunately, all these years I’ve only had it half right.

Oh, I still think the emphasis on a teacher as a “giver of answers” is severely misplaced.   I still think we spend way too much time in our classrooms transmitting the “content” of our fields, content that any halfway literate person could get, and get more cheaply, by reading a book, visiting a website, or watching a video on their iPod.

No, I haven’t had an epiphany that “content is what matters” after all.  Sorry.  I’m going to continue to argue against teaching-as-information-transmission for the foreseeable future.

No, the point is that while (correctly) shifting away from giver-of-answers mode I've also (incorrectly) over-emphasized asker-of-questions mode. And it's taken a couple of years of the stress of dealing 24-7 with an aging mother to drive the lesson home.

Let me illustrate with what is going to sound like a trivial, even whiny, personal example. Every day the ritual prior to and during breakfast is the same. My mother will ask the same questions ("Do you want toast?" "Do you want anything else to drink?" "Do you want two eggs?" And so on — there are probably eight or ten questions, all told). And well before she is done with the questions, my temper is shot and, inevitably, my answers get terser and louder and more obnoxious. Add to this her desire to engage in a bunch of small talk and gossip during the meal, and it's a good bet that the meal ends with my yelling followed by a stony sort of silence, me angry and her hurt.

Now at this point, my guess is, you’re probably thinking, “Wade, grow up and stop being a child.”

But let me add, as Paul Harvey might say, the rest of the story.

The two of us generally eat breakfast together somewhere between 7:15 and 9:00 a.m.  (When my sabbatical from full-time teaching ends, I’ll have to be done and gone by 8:30, but for now the main determinant of the time is when my mother gets up.)  This is right smack-dab in the middle of the most productive part of my day.   For any given day, I’m probably going to get 80 percent of my production between 6 and 10 a.m.

If I were still living alone, I’d still be eating a big breakfast most days.  But I’d probably go out for it; or, if I were doing the cooking, it would only be “in the background.”  (One of the reasons I eat the same high protein/high fat breakfast every day is that’s what I can do in the mental background while the rest of my brain muscles are doing their heaviest lifting elsewhere.)

And so every question my mother asks as she prepares the breakfast is an interruption.  And every unasked-for piece of small talk is an interruption.  And after the fifth or tenth interruption, it’s a real good bet I’ve pulled a mental hamstring.

Because, even after I’ve come to the table, my mind is elsewhere, trying to get a handle on my great ideas and projects of the day.  I’m not in anything approaching conversation mode.

I understand.  It’s not my mother’s “fault”.  She simply doesn’t remember that I have told her a hundred times or so that “I don’t care” what she puts on the table. That if I want something other than my usual bacon, one egg, cottage cheese, half bagel, and water, I’ll ask for it.  Her short term memory, especially when she’s only been up for a few minutes, simply isn’t what it was when she was 85.  And barring some amazing medical breakthrough, it never will be again.   It’s sad (for her) and annoying (for me), but that’s just the way it is.

I'm just going to have to deal with the breakfast thing. Find ways of keeping my temper, find ways of re-allocating my thinking time, find ways of keeping concentration despite interruptions, etc., etc. In this case, the problem is mine, not hers.

But the point of this long digression into my household habits is not to complain about my mom.  The point is to set up an analogy between my mother’s “failure to listen” and the frequent failures to listen that happen in today’s higher education classrooms (including, sad to say, many of my own).

Because contrary to received wisdom, asking questions is not necessarily synonymous with listening.

How many times have you said, or heard said, that "there is no such thing as a dumb/bad question"? Well, I don't know how else to say it, but that "no bad question" idea is complete and utter bullshit.

Leaving aside for now what makes a question “dumb” (perhaps I’ll discuss that in a later post), one of the big things that makes a question “bad” is when it reflects a failure to listen.

What makes my mother’s daily questions about the number of eggs and such so annoying is that they suggest she hasn’t listened to my last hundred or thousand answers.    And, I hate to say it, that is exactly what’s wrong with a lot of teachers.  And, unlike my mother, those teachers don’t have an excuse.

(Yes, I know, a lot of students also show a repeated failure to listen,  but that, like the question of what makes for a “dumb” question, is a topic I’d rather postpone to another day.)

When we as teachers ask questions, we have particular answers in mind.  At least I hope we do.  Some of these are answers we will be correcting (“wrong” answers), others we will be applauding (“correct”), and some we hope will be conflicting (“there are no right or wrong answers here, though there may be ones that are better or worse…”).

Yet, having those answers in mind can be a barrier to effective listening.  Because the questions we are asking may not be the questions our audience is hearing.  Much less the questions our audience cares about.

They may be the questions our audience should care about.  But just asking particular questions of a student doesn’t make the student care about them, any more than my mother asking about eggs this morning made me care about them.

It’s a common belief, especially among those who teach, that “asking questions” is evidence of a good listening practice.

But, I’m sorry to say, it can be exactly the opposite.


Okay, I’ll admit it.  I often get carried away in my enthusiasm for ideas.  Despite priding myself on demanding “evidence” and on examining such evidence critically, the fact of the matter is that I often get persuaded — and persuaded deeply — without doing a whole lot of either.

Fortunately, I also try to put myself in a position where others will point out my sloppy thinking. (As, for example, David has just done in the "scholarly journals" thread here when he asks for my evidence on the demand for higher education. I'll get back to you on that one, David, I promise I will, but it might take a while. I've got a lot of evidence I've got to dig out and through first.)

But reading David's comment, and visiting both the Healthcare Wordsmith and a recent Slashdot thread on "New grads shun IT jobs as boring," all within the last 8 hours, have me thinking. I tend to be a "the world has changed" kind of person. In particular, I believe that higher education must transform itself if it is to remain relevant (and marketable) in this new world.

But how much of that belief is because I’ve got evidence to back me up and how much of it is because I have jumped on a bandwagon du jour?

Given that I'm writing a book on the subject, I'm hoping that it's not just me being a bandwagon jumper and that, between now and the time the book is finished, I'll either have the evidence or change my arguments accordingly.

However, there are a couple of interesting things about bandwagons. First, sometimes the bandwagon is travelling in the right direction. (If it weren't, we'd never get cool concerts, would we?) Second, and more importantly, the first people on the bandwagon often have to get on without sufficient evidence a priori.

That's an essential part of entrepreneurship. Jumping into something in the uncertainty of "before the evidence is in." Any schmo can see the iPod is a success … now. But harken back to the days when whoever had the idea for the iPod wheel first had it: they didn't know the future market. They were guessing.

Because unless someone has some divine powers, any evidence we might have about the "future" is built on an analogy to what has happened in the past. Now some of those analogies are so good that we'd be damn fools not to believe them (as long as I'm sitting in Calmar, IA, the apple I let loose tomorrow is going to fall to the ground just like apples have been doing since well before Isaac Newton had his idea).

But the interesting analogies that we draw on daily to tell us what to believe about the future have quite a bit more evidentiary uncertainty built into them.

And as change accelerates, that evidentiary uncertainty increases.  Change means more confounding variables, and makes drawing the analogy more problematic.

Yet we must still draw analogies.

*     *     *     *     *

So where does this leave marketing?  Marketing gets a lot of bad press as “fluff” and “hype” and “deception.”  No real surprise there, since a big chunk of marketing effort is designed to work on our emotions more than on our reason.   Or, if you will, on the emotions we connect to the “evidence” as we draw one analogy or another.

But is this truly bad?  After all, in the end, our evaluation of the quality of the product is in the future.  And, if the marketer is correct (in the empirical sense of the term), we will see “quality” differently in the future (after we use the product) than we have in the past (where, at the moment of marketing, all the “evidence” has to be found).  We demand the marketer provide us with evidence — truth in advertising, and all the rest — yet we’re demanding something that, if the marketer is correct, he cannot provide.

Because if he’s correct, he’s predicting a future that hasn’t happened before.

Don’t mistake me here.  I’m not saying marketers should lie.  Or that all consumer protection laws that encourage truth-telling are bad.  Hardly.

What I’m saying is that, when it comes to making an argument about the future, “evidence” and “facts” aren’t a trump card that ends the argument.   And, for good or ill, the essence of most “world has changed” talk, despite the diction that chooses past tense, remains an argument about the future.

Damn.

All content of this blog, except comments added under names other than "Wade," is copyright © 2008, 2009 Wade E. Shilts