Education is more than just completing something

I’m taking a course in organizational behavior this semester, and today’s assignment was to take a position on the question of whether leadership training is a waste of money. Most of my classmates had taken the counterpoint position so far; only one person had agreed with the premise by the time I posted my response, which I include here:

In the interest of not leaving Michael to hold up the entire Yes/Point position by himself, and because I’m in a feisty mood today, I’ll agree with the premise that leadership training is a waste of money. I will clarify, though, and this will seem like a hedge (only because it is): actually, I believe leadership training can be a waste of money. It isn’t that people can’t be trained to be better at something; if that were true, education in general, and higher education in particular, would be pointless.


In the pandemic, disruption may be our silver lining

The pandemic has brought plenty of changes to our everyday routines. The same is true in higher education. Beginning as soon as we closed our campus buildings in March, we had to figure out how we could continue to provide instruction and support services to students – and we had a week to figure it out. Classes moved online and to “alternative delivery methods.” Advising moved to working with students by phone and in virtual meetings. Our remaining holdout paper forms were transformed into web forms overnight.


The unfortunate commodification of higher education

The return on investment of higher education is under increased scrutiny today. This has led colleges and universities to narrow their emphasis to marketable skills that lead directly to jobs. In his 2012 book, College: What It Was, Is, and Should Be, however, Andrew Delbanco argues that colleges should cultivate a broader set of qualities beyond technical and vocational competencies. At the same time, we’ve experienced increasing economic inequality and a coarsening of our political discourse in the United States, so the diminishment of a “college” education as Delbanco describes it has profound implications for democracy.


How did I get here?

This is part two of a series on higher education. Read the first post here.

I believe in education.

Just wanted to get that out of the way first, because some of the thoughts I have about education after high school may make it sound like I’m opposed to it. That’s not true. But I do think that our current expectations, and often our methods, have some problems that need to be addressed, and the turmoil of 2020 gives us an opportunity to do exactly that.

So who am I to make suggestions about higher education? Let me give you a quick summary of my experience in higher ed so you can decide if I’m someone worth giving some of your attention.

I’ve worked in higher ed for 21 years. I spent a brief year as an adjunct instructor at Northwood University in Midland, Michigan, in the early 1990s, teaching a graphic design course about “desktop publishing.” (If that term’s unfamiliar, it’s what we called software like Aldus PageMaker and QuarkXPress that was key in moving print layout from manual work to digital.)

In 1997 I started teaching as a contract instructor for St. Clair County Community College (SC4) in Port Huron, Michigan. I taught day-long seminars that covered introductory and intermediate level use of Microsoft Office programs, including Word, Outlook, Excel, Access, and PowerPoint. It was part-time work, but in the late 1990s there was a lot of federal and state money available for companies to train their workers in these skills, so I kept pretty busy, teaching three or four days per week. That funding dried up around 2002 and I went back to my freelance work as a writer, designer, and marketing consultant.

In late 2008, I got a call from a community theater friend, Roger Hansel, who was the technical director at SC4. They needed a director for their Christmas play on short notice. I’d done about twenty plays as an actor and director for local theater groups and Roger knew I was self-employed and might be available. I took the gig and with a small group of talented young performers, we put the show together in about four weeks. I thought it went well, but didn’t expect it to be anything more than a one-off opportunity. The chair of the visual and performing arts department, David Korff, had other plans. My predecessor wasn’t coming back, so would I be interested in directing the next show? Um, sure. At the end of the semester, he asked me what plays we were doing next year. I wondered why he was asking me, so he told me that it was my job to pick them since I’d be directing them. And I’d also be teaching acting and improvisation. Oh. Okay.

I did that for eight years, and they were the best “work” years of my life. I loved teaching theater, directing shows, working with such enthusiastic young performers. For some of them, theater was their refuge from their less-than-ideal real lives. Through our program, they made friends, learned to resolve differences, and in a few cases, just survived. It was fun, but it was also important work.

(I’ll always be thankful to Roger and David. I know I solved a short-term problem for them, but I’m pretty sure I got a lot more out of the transaction than they did.)

But then that ended. After building a program, directing almost 30 shows, teaching hundreds of college students, and revising and relaunching an oral interpretation of literature course, I was told I was no longer qualified to do any of that. I have no master’s degree – not in theater, not in anything (I’ll have one – finally – in 2022, but it won’t be in theater). I have a bachelor’s degree in geography and earth science. I never claimed otherwise, and no one suggested that I had. What I did have was years of experience actually acting and directing, which continued to grow over my eight years as the artistic director of The SC4 Players. Unfortunately, that was no longer enough.

I understood intellectually why I couldn’t teach theater anymore. Updated Higher Learning Commission standards meant that instructors needed to have at least a master’s degree in the field they teach (or any master’s degree and at least 18 additional hours in that field). In order to get an MFA in theater, I’d need to do remedial work to get a BFA first, then get accepted into an MFA program, then complete the program. I’m sure I’d have learned a lot of new things, but the fact was I was already doing the work an MFA should have qualified me for, and according to my colleagues’ evaluations of my work, I was doing it well. So it made no sense for me to spend that amount of time and money to get degrees in the hope of maybe being able to reapply for my old job.

I won’t detail what happened over the next two years, but in the end, my college no longer produces any plays (production had stopped even before COVID) and fewer theater arts classes are offered. Again, intellectually I understood the need for “standards,” but I don’t think those standards ended up serving our students very well.

I was fortunate that I’d also been working as an academic advisor at the college, so I was able to continue doing that, which provided a connection to students. I have a sticker in my office from NACADA, the Global Community for Academic Advising, that says “Advising is Teaching,” and that’s true. In 2016, I was given an opportunity to rebuild our recruitment and admissions team, which I accepted. I’ve loved these challenges and I think we’ve done pretty well. I still miss teaching theater – I’ll probably never completely let that frustration go – but I’m proud of my current team. We still make a difference in people’s lives, and that makes it worthwhile.

So I believe in education. But my experience working in higher ed has led me to question some of the things we do. Especially the things that we do out of habit, because we just haven’t taken the time to ask whether it still makes sense to do it that way.

As I lay out some of my questions and perhaps even a few ideas to answer them, I hope that if you agree you’ll let me know. Even more importantly, if you disagree I hope you’ll also let me know and will engage with me in a constructive manner. That, in the end, is the point of education.

An extremely cursory look at higher ed over the past eighty years

Going to college used to be something that few people did. In 1940, only 4.6% of Americans had completed a bachelor’s degree or higher. In 2017, that number was 33.4%. Another ten percent of Americans had completed a two-year associate degree, meaning that about nine out of every twenty Americans had at least a two-year college degree. Considering that only about 15 percent of Americans finished high school in 1940 (when the average number of school years completed was about eight and a half), that’s an impressive shift in Americans’ attitude toward education in general and post-high-school education in particular.

Junior colleges started in the early 20th century, offering expanded local access to education beyond high school. These schools initially offered courses that paralleled the type of classes that a student would expect to take during the freshman and sophomore years at a senior college. Completing the two-year program originally resulted in a certificate; the University of Chicago had begun awarding associate degrees in 1899, and that credential gradually became the standard. By the 1920s, junior colleges were offering distinct programs in trades, including business management, engineering, mechanical work, agriculture, and other areas. After World War II, the occupational training component of junior colleges expanded considerably; today, about half of associate degrees are related to a specific trade or occupation and the other half are in more traditional liberal arts and sciences.

There is an ongoing debate about whether two- and four-year institutions have placed too much emphasis on occupational training and allowed the liberal arts to be diminished. The discussion can quickly veer into arguments over how rigorous college work is today, or whether general education requirements in English composition, math, social sciences, and other areas are needed at all in the 21st century.

It’s important to remember that when you hear about “higher education,” “colleges,” or “universities” in the media, one size doesn’t fit all. To an individual student there is no “higher education” in that sense. There is only the institution they’ve chosen to attend. Most are non-profit, but the for-profit education sector has grown and ebbed in recent years. Many of the best known universities are operated by states; nearly all community colleges are public institutions; but there are many private colleges and universities in the U.S. Some are enormous, some are tiny. Some are sponsored or influenced by religious denominations. Some are expensive, some much less so.

Each student’s experience is unique, though we often try to “standardize” things. Sometimes those standardization efforts benefit the student; other times it seems that the choice to standardize a process benefits the institution, the faculty, or the support staff. Trying to support a student’s goals and dreams is hard work, but there are so many of them (20 million or so in 2020)! Shortcuts are taken by the school… and by the student.

This post is by no means an exhaustive survey of the history of higher education in the United States. There are plenty of good books on that subject if you’re interested in more depth (John Thelin’s classic “A History of American Higher Education” is one choice). I wanted to set the stage for the discussion I’m hoping to have over the next few days, weeks, or months about where the (admittedly) broad concept of higher education is now and how we can begin to respond to the challenges and opportunities that already existed, but which 2020 has made even more critical.