March 2nd, 2013
“Critical Thinking is Best Taught Outside Classroom” claims that museums, TV shows, and hands-on fairs like the Maker Faire are better at teaching critical thinking skills than the standard classroom setting. Critical thinking here is marked by asking questions like “What if …?” and “How can …?” followed by questions about cause and effect.
While museums can play an important role in helping children develop critical thinking skills, it isn’t clear that museums or other “institutions of informal learning” are better suited to teach those skills. Any critical thinking—asking good questions, as the article would have it—requires more than just an unstructured encounter with some exhibit or device or situation. And learning how to ask a good question requires more than being taught to ask “What if …?” and “How can …?” Good questions require some relevant background knowledge. Children, adolescents, and even adults often lack the knowledge to know how to formulate a good question when looking at an unfamiliar object or artifact. That’s why museums spend so much time, energy, and money designing exhibitions, selecting objects, arranging displays, crafting labels, and training docents and guides. We shouldn’t be surprised that college students were found to ask better questions than fifth graders. College students simply knew more.
The failure of the U.S. school system to teach students to think critically, or to give them the opportunity to develop the habit of thinking critically, is not news to anybody.1 A post over at College Misery summarizes a common lament shared by college and university professors across the country. In “Extra Class,” Hiram suggests that students have never been asked to think:
They come from the worksheet generation. Literally, most of what they’ve done in school is take standardized tests and fill in worksheets. They haven’t been asked to think, or to try to think, or to imagine that thinking means anything. They just have done things, filled in things.
Teaching students, especially college students, to start thinking is laborious and requires not only a conscious decision on the part of the professor, but also a commitment by the professor to struggle through a period of resentment and anger. Students get into college because they have done well in school, or at least because they have learned how the system works. When given discrete tasks with well-defined criteria for success—e.g., a fixed number of questions, each with one right and multiple wrong answers, or a prompt to write 500 words about a particular historical event—they perform admirably. After nearly a decade of acquiring those skills and coming to believe both that they constitute thinking and that school is the application of those skills, students are understandably uncomfortable when professors inform them that neither is true. Student morale continues its downward spiral when professors then demand something new and different. Consequently, teaching students how to think also requires buy-in from the students themselves. Professors have to convince them to give up their comfortable system of right and wrong answers, along with the familiar forms of evaluation that mark success, in exchange for an approach that has neither tidy answers nor simple assessment criteria.
In my history of science courses I address these challenges by teaching curiosity, modeling curiosity, showing students how to ask questions, and explaining what makes a good question. I begin by explaining clearly why all this matters, why we are going to make the effort, and why they, the students, will benefit from learning a new set of skills. When approached this way, the formal nature of the classroom setting facilitates rather than inhibits teaching students how to think critically.
1 Lambasting the school system for its failures has become a national pastime. Informal learning environments, with their higher tolerance for failure, are not panaceas, however. They may “tolerate failure better than schools,” but only because such environments generally lack systems of evaluation. Success and failure have no meaning in such environments. Progressing to the next exhibit in a museum does not depend on having succeeded at the previous one, the way course prerequisites work. If your gadget at the Maker Faire falls apart or fails to work, you aren’t held back until you demonstrate proficiency, as you might be held back a grade or not allowed to graduate. It would be nice to teach students to start thinking at a relatively early age, but that would require changing the incentives and rewards that so many students, parents, teachers, school districts, admissions committees, employers, and the government have come to expect and depend on.