The pandemic has brought plenty of changes to our everyday routines. The same is true in higher education. Beginning as soon as we closed our campus buildings in March, we had to figure out how we could continue to provide instruction and support services to students – and we had a week to figure it out. Classes moved online and to “alternative delivery methods.” Advising moved to working with students by phone and in virtual meetings. Our remaining holdout paper forms were transformed into web forms overnight.
The return on investment of higher education is under increased scrutiny today. This has led to a narrowed emphasis by colleges and universities on providing marketable skills that lead directly to jobs. Andrew Delbanco, in his 2012 book, College: What It Was, Is, and Should Be, suggests a broader set of qualities for colleges, however, beyond technical and vocational competencies. At the same time, we’ve experienced increasing economic inequality and a coarsening of our political discourse in the United States, so the diminishment of a “college” education as Delbanco describes it has profound implications for democracy.
This is part two of a series on higher education. Read the first post here.
I believe in education.
Just wanted to get that out of the way first, because some of the thoughts I have about education after high school may make it sound like I’m opposed to it. That’s not true. But I do think that our current expectations, and often our methods, have some problems that need to be addressed, and the turmoil of 2020 gives us an opportunity to do exactly that.
So who am I to make suggestions about higher education? Let me give you a quick summary of my experience in higher ed so you can decide whether I’m worth some of your attention.
I’ve worked in higher ed for 21 years. I spent a brief year as an adjunct instructor at Northwood University in Midland, Michigan, in the early 1990s, teaching a graphic design course about “desktop publishing.” (If that term’s unfamiliar, it’s what we called software like Aldus PageMaker and QuarkXPress, which moved print layout from manual paste-up to digital.)
In 1997 I started teaching as a contract instructor for St. Clair County Community College (SC4) in Port Huron, Michigan. I taught day-long seminars that covered introductory- and intermediate-level use of Microsoft Office programs, including Word, Outlook, Excel, Access, and PowerPoint. It was part-time work, but in the late 1990s there was a lot of federal and state money available for companies to train their workers in these skills, so I kept pretty busy, teaching three or four days per week. That funding dried up around 2002 and I went back to my freelance work as a writer, designer, and marketing consultant.
In late 2008, I got a call from a community theater friend, Roger Hansel, who was the technical director at SC4. They needed a director for their Christmas play on short notice. I’d done about twenty plays as an actor and director for local theater groups and Roger knew I was self-employed and might be available. I took the gig and with a small group of talented young performers, we put the show together in about four weeks. I thought it went well, but didn’t expect it to be anything more than a one-off opportunity. The chair of the visual and performing arts department, David Korff, had other plans. My predecessor wasn’t coming back, so would I be interested in directing the next show? Um, sure. At the end of the semester, he asked me what plays we were doing next year. I wondered why he was asking me, so he told me that it was my job to pick them since I’d be directing them. And I’d also be teaching acting and improvisation. Oh. Okay.
I did that for eight years, and they were the best “work” years of my life. I loved teaching theater, directing shows, working with such enthusiastic young performers. For some of them, theater was their refuge from their less-than-ideal real lives. Through our program, they made friends, learned to resolve differences, and in a few cases, just survived. It was fun, but it was also important work.
(I’ll always be thankful to Roger and David. I know I solved a short-term problem for them, but I’m pretty sure I got a lot more out of the transaction than they did.)
But then that ended. After building a program, directing almost 30 shows, teaching hundreds of college students, and revising and relaunching an oral interpretation of literature course, I was told I was no longer qualified to do any of that. I have no master’s degree, not in theater or in anything else (I’ll have one – finally – in 2022, but it won’t be in theater). I have a bachelor’s degree in geography and earth science. I never claimed otherwise, and no one suggested that I had. What I did have was years of experience actually acting and directing, experience that continued to grow over my eight years as artistic director of The SC4 Players. Unfortunately, that was no longer enough.
I understood intellectually why I couldn’t teach theater anymore. Updated Higher Learning Commission standards meant that instructors needed at least a master’s degree in the field they teach (or any master’s degree plus at least 18 graduate credit hours in that field). To get an MFA in theater, I’d need to do remedial work to earn a BFA first, then get accepted into an MFA program, then complete it. I’m sure I’d have learned a lot of new things, but the fact was that I was already doing the work an MFA would have qualified me for, and according to my colleagues’ evaluations, I was doing it well. So it made no sense to spend that much time and money on degrees in the hope of maybe being able to reapply for my old job.
I won’t detail what happened over the next two years, but in the end my college no longer produces any plays (that was true even before COVID) and fewer theater arts classes are offered. Again, intellectually I understood the need for “standards,” but I don’t think those standards ended up serving our students very well.
I was fortunate that I’d also been working as an academic advisor at the college, so I was able to continue doing that, which provided a connection to students. I have a sticker in my office from NACADA, the Global Community for Academic Advising, that says “Advising is Teaching,” and that’s true. In 2016, I was given an opportunity to rebuild our recruitment and admissions team, which I accepted. I’ve loved these challenges and I think we’ve done pretty well. I still miss teaching theater – I’ll probably never completely let that frustration go – but I’m proud of my current team. We still make a difference in people’s lives, and that makes it worthwhile.
So I believe in education. But my experience working in higher ed has led me to question some of the things we do. Especially the things we do out of habit, because we just haven’t taken the time to ask whether it still makes sense to do them that way.
As I lay out some of my questions and perhaps even a few ideas to answer them, I hope that if you agree you’ll let me know. Even more importantly, if you disagree I hope you’ll also let me know and will engage with me in a constructive manner. That, in the end, is the point of education.
Going to college used to be something that few people did. In 1940, only 4.6% of Americans had completed a bachelor’s degree or higher. By 2017, that number was 33.4%. Another 10% of Americans had completed a two-year associate degree, meaning that about nine out of every twenty Americans had at least a two-year college degree. Considering that only about 15% of Americans finished high school in 1940 (when the average number of school years completed was about eight and a half), that’s an impressive shift in Americans’ attitude toward education in general and post-high-school education in particular.
Junior colleges started around the turn of the 20th century, offering expanded local access to education beyond high school. These schools initially offered courses that paralleled the classes a student would have taken in the freshman and sophomore years at a senior college. Completing the two-year program initially resulted in a certificate, though the University of Chicago began awarding associate degrees as early as 1899. By the 1920s, junior colleges were offering distinct programs in trades, including business management, engineering, mechanical work, agriculture, and other areas. After World War II, the occupational training component of junior colleges expanded considerably; today, about half of associate degrees are related to a specific trade or occupation and the other half are in more traditional liberal arts and sciences.
There is an ongoing debate over whether both two- and four-year institutions have placed too much emphasis on occupational training and allowed the liberal arts to be diminished. The discussion can quickly veer into arguments over how rigorous college work is today, or whether general education requirements in English composition, math, social sciences, and other areas are needed at all in the 21st century.
It’s important to remember that when you hear about “higher education,” “colleges,” or “universities” in the media, one size doesn’t fit all. To an individual student there is no “higher education” in that sense; there is only the institution they’ve chosen to attend. Most are non-profit, but the for-profit education sector has grown and ebbed in recent years. Many of the best-known universities are operated by states; nearly all community colleges are public institutions; but there are also many private colleges and universities in the U.S. Some are enormous, some are tiny. Some are sponsored or influenced by religious denominations. Some are expensive, some much less so.
Each student’s experience is unique, though we often try to “standardize” things. Sometimes those standardization efforts benefit the student; other times the choice to standardize a process seems to benefit the institution, the faculty, or the support staff. Trying to support each student’s goals and dreams is hard work, and there are so many students (20 million or so in 2020)! Shortcuts are taken by the school… and by the student.
This post is by no means an exhaustive survey of the history of higher education in the United States. There are plenty of good books on that subject if you’re interested in more depth (John Thelin’s classic “A History of American Higher Education” is one choice). I wanted to set the stage for the discussion I’m hoping to have over the next few days, weeks, or months about where the (admittedly) broad concept of higher education is now and how we can begin to respond to the challenges and opportunities that already existed, but which 2020 has made even more critical.
Has there been a more disruptive year in modern history? 2020 is certainly setting some records in that sense. Between the COVID-19 pandemic and political polarization, we don’t seem to be able to work together on anything. Add climate change – evidenced by today’s orange skies over northern California from massive fires – and our ongoing inability to agree that climate change even exists, and this has been a year to forget.
On September 11, 2001, we realized that we could be attacked right here at home, and many of our current political divisions have their genesis in the aftermath of those attacks, including the Iraq War and the broader War on Terror. The recession of 2007–09 made us less confident about our economic and governmental structures, but it didn’t affect everyone. We also experienced a fairly strong recovery that, while not “lifting all boats,” soon made the recession a faded memory.
Anyone old enough to have lived through World War II could describe first-hand the economic and political upheavals of those years. Before that, the Great Depression was certainly disruptive, but there aren’t many Americans who remember that from any more than a child’s point of view.
The protests against the country’s involvement in Vietnam and the civil rights movement of the 1960s were certainly divisive. But those conflicts didn’t involve everyone, and many people weren’t affected that much. Many of the unaffected were privileged enough to avoid the problems and learned nothing from them: those who were wealthy (and white) enough to get deferments from military service, to never have to deal with the hippies or the student protesters or anyone else who didn’t look like them.
Going further back, the Spanish Flu pandemic of 1918 is probably the closest parallel for obvious reasons, including the arguments over mask wearing. The U.S. Civil War is probably the most disruptive time in American history, of course, with the country literally split in two. But both of those periods are distant, something we read about in history books or in television documentaries.
So as we enter the last months of 2020, is there anything positive to build on? We have an election in 55 days that will help answer that question, of course, and there has been a newfound awareness of the systemic injustice, racism, and economic inequality in the U.S. If we move in a different direction in 2021, will we continue to build on the changes we’ve started, or will we be tempted to return to “normal”? Is that even possible, even if we wanted to?
I believe there are opportunities to change and to improve as we recover from the pandemic and the current state of political discourse. In the field I’ve worked in for twenty years, higher education, we’re already seeing some of those changes, many of them quite small and yet still significant. Larger changes are required, and this may be the opportunity we’ve needed to address them, discuss what the future of education looks like, and move toward that future.
That’s what I’m planning to discuss here for now. I may nibble at the hand that feeds me occasionally, but to not consider the larger opportunities that this disruptive year is providing would be a shame. If you’re interested I invite you to follow me here or on Twitter (@TomKephart). I’ll tweet each new post there as well.