Over the past four weeks, I have become increasingly aware of how the media I consume affects my perceptions of higher education. Most of the TV shows and movies I watch, and the songs I listen to, talk about college but definitely do not describe my college experience accurately. Since our topic for this upcoming week is professors, I began to think about my past and current professors and how I have felt about them. Are they knowledgeable about the material they are teaching? Do I feel comfortable asking them questions during class? Would I attend their office hours to ask for help? I always answer yes to the first question because professors are supposed to be experts in their fields, right? At least, that’s how they are represented in the media. After taking time to break down the article “Anti-Intellectualism and Faculty: Representations of the Prime Time Professoriate,” I discovered that professors are depicted as the embodiment of academia, people who dedicate their entire lives to learning. Although this is a slight overstatement, I do not disagree with the point, because professors have built their careers on true scholarship. When evaluating my next two questions, however, my answer is more hesitant, never a straightforward “yes.” But why is this? Why do I not feel comfortable asking questions of the men and women who get paid to help students like me? This discomfort probably stems from the way higher education faculty members are presented in popular media. Author Barbara Tobolowsky references a quote in her article stating that the term professor implies the adjectives “dry, hectoring, unemotional, self-important” and lacking any “human connection.” After evaluating several TV shows and movies in which professors play a major role, Tobolowsky shows this characterization to hold true.
Think of any interaction you have seen between a professor and his or her class in the media. Nine times out of ten, the professor is getting mad at the class for poor performance or criticizing the students’ behaviors and opinions. When professors talk down to their students in movies and TV shows, we, as viewers, develop the opinion that all higher ed faculty members are cold and unyielding. Furthermore, most professors in popular media are portrayed as lacking morals and a conscience. There are so many occasions in which professors develop inappropriate relationships with students, some of them sexual, and most of these professors never face consequences for their behavior. After watching shows that depict professors having affairs with students, I wonder if this portrayal has caused me to see professors as intimidating, making me feel vulnerable when approaching them with questions about coursework. On top of that, most professors in popular media do not have a happy home life. Often, they are alcoholics or divorced, with nothing in their lives except their respective fields of study.
This then leads into the conversation about tenure. Is tenure a good or a bad thing? This is a hot debate right now and has been for many years. Rather than giving my opinion on the matter, I will focus on how popular media portrays tenure, which, by the way, is not good. In the media, tenure is used to avoid punishment for poor teaching or bad behavior. Professors who skip class, disregard students and colleagues, or even develop inappropriate relationships with their students do not face consequences because they have tenure. This representation could be a leading factor in why many Americans see tenure as a bad thing. It is so interesting to me that most popular media (or at least the several sources referenced in Tobolowsky’s article) represent professors in a similar, negative light. Could this be why Americans distrust professors and higher education in general? I am sure that the portrayal of professors as rude, arrogant scholars has shaped my perceptions of the men and women who instruct me in the classroom daily. It is more than likely one of the reasons I second-guess raising my hand in class, not because I am scared to speak but because I am scared of the response. Professors are the people who are paid to teach us, using their knowledge and backgrounds to help students grow into critical thinkers and good citizens. If they were all actually the way the media portrays them, no one would go to college. With that being said, why does popular media paint such a negative picture of them?