When someone has an irrational fear, no trained psychologist advises them to remove themselves entirely from whatever causes them discomfort. Avoidance does nothing to overcome fear; in fact, it amplifies it, feeding the fear of even encountering fear. Yet most American universities today are coddling their students’ minds, allowing them to wrap themselves in an offense-free cocoon that slows their maturation and leaves them unprepared for the life that awaits them after college.
In the past five years, American college campuses have seen a remarkable rise in the usage of the term “microaggression.” Microaggressions are defined as “small actions or word choices that seem on their face to have no malicious intent but that are thought of as a kind of violence nonetheless.” For instance, asking a person of Asian descent for help with math could be considered a microaggression. In this charged atmosphere, even throwing a football to an African American player in a pick-up game, when other players are open, could be characterized as one as well.
The plague of microaggressions has become such a problem that many students have organized coalitions forcing professors and campus faculty to issue “trigger warnings” before they teach on a subject that includes any element or terminology that may be considered offensive. These trigger warnings are supposed to prevent students from reliving the shock of past harm or simply experiencing emotions of offense. Like the announcements on TV when the news anchor is about to play a disturbing video clip, these warnings are designed to protect the seemingly fragile minds of college students from words such as “violate” or “kill.”
Whence did microaggressions and the need for trigger warnings stem? While trigger warnings have been used for almost a century to assist those traumatized by war and assault, their more recent misapplication has furthered the cause of moral relativism. When people reject objective facts or refuse to take into account a view of reality beyond their sensory understanding, they tend to act as children. Sociologists call this sort of thinking, based solely on one’s own subjective view of reality, “emotional reasoning.” Emotional reasoning rejects many streams of wisdom, including classical and Christian notions of philosophy, and instead of centering the world on God or some other external high ideal, places reality’s center solely on the self; this is called solipsism. Any attack on the self, therefore, is an attack on reality itself—catastrophic in the “fragile” mind of a student.
American society at large provides far greater freedom of expression than what students, faculty, and administrators enjoy today as a result of this downward spiral of extreme self-censorship. The trend among American universities of sheltering the rising generation from the challenges and complexities of reality has led—and will continue to lead—to a culture marked by paranoia and distrust. If we continue to teach our students that they ought to hide from all discomfort, then they will hide. The world truly is an unsafe place in many respects, but confronting and changing those realities will require men and women who have learned to ground their critical thinking in evidence rather than their own subjective emotions. If universities continue to pander to “emotional reasoning,” it will be to society’s detriment. The nation needs leaders capable of engaging difficult, even painful, questions with equal parts intellectual rigor and magnanimity, and the ability to listen rather than demonize the opposing party. Ultimately, educating students in paranoia and fear helps no one. If we want greater levels of justice and goodness in society, then we should invest in the next generation by steering them well clear of solipsism.
See The Atlantic for more.
September 8, 2015