By Nancy Duchesneau
There’s a comic that illustrates a phenomenon researchers know all too well: the tendency for the complex findings of rigorous studies to be boiled down to facile statements about what is or is not true. Press releases and the media compress nuanced results into sound bites that can reach a wider audience. The intention is undeniably benign; we want the public well informed about the progress of research. Unfortunately, this means the true message sometimes gets skewed, or results that hold only sometimes, and only under certain conditions, are presented as absolute truths for all situations. Trained researchers know that science rests on probability and on reducing uncertainty as much as possible, but when headlines declare that one factor is “scientifically proven” to improve a particular outcome, many readers without that training are misled into thinking there is no uncertainty or variability involved. In all sciences, vetting the evidence through a critique of methods and context is the pivotal point at which results are either believed or dismissed as ungeneralizable.
Given the enormous amount of information available through a multitude of media sources, the need to critically analyze “facts” and their sources has become a hot topic, especially as the terms “fake news” and “alternative facts” have recently taken the spotlight. It has become commonplace for people to recognize that they must sift through the information given to them, distinguishing what is valid and reliable from what is not. Yet it seems clear to me that this skillset is not ubiquitous, and that leaders in both academia and public service must be diligent about promoting trustworthy information to limit the harms of publicizing falsehoods and conspiracies. The problem is compounded when political leaders themselves struggle to make these distinctions.
Policymakers are rarely experts in the issues they legislate on. Instead, they rely on experts in the relevant fields to provide the insight needed to make sound judgments; hence, congressional hearings often call expert witnesses from whom politicians can learn and ask questions. But when policymakers misinterpret evidence, generalize too far, fail to recognize fallacious evidence, or ignore the value of more rigorous evidence, a division opens between the scientific community and the leaders of our communities and our country. Vaccination and climate change are commonly cited examples. The scientists who study these topics are nearly unanimous that vaccinations do not cause autism and that climate change is a real, manmade phenomenon. Despite this, politicians at all levels of government deny the existence of climate change and warn their constituents about the dangers of vaccines. This, in turn, leads to policies in which invalid and unreliable studies are used to propagate political agendas.
Today, research is being misused on a number of education issues. For example, Mayor Emanuel of Chicago has proposed new graduation requirements intended to encourage students to raise their expectations by requiring them to show proof of acceptance into a college, the military, a job program, or a similar path. There is no dispute among researchers that encouraging students to raise their expectations of themselves has benefits. But making this a requirement for graduation ignores the reality of students who may not fit a particular mold, and disenfranchises them further by denying them a high school diploma. Mayor Emanuel is misusing the research in a way that may significantly harm students.
As we move forward in improving our education system (as well as other systems in our country), we, the public, academics, and policymakers alike, need to be more careful about how we interpret and use the information we’re given. Policymakers in particular should be especially critical of the information they receive, disseminate, and use as the basis for legislation. At the very least, we should be able to believe our leaders, and so our leaders should use their substantial resources to vet their information as thoroughly as they can.
Contact Nancy: Duchesn4@msu.edu