More Or Less Technology In The Classroom? We’re Asking The Wrong Question


My husband Ken likes to tell a story about his middle-school trigonometry class in the late 1970s, when the first solid-state, pocket-sized electronic calculators were coming onto the market. They were pricey, about $250, and he and his fellow math geeks were thrilled by the amazing things you could do with them. They wanted to be able to use them in class.

Not so fast. The teachers were sure their students would become dependent on the calculators and that their math skills would suffer ever after. Calculators were banned from Ken’s math class. Instead, he was taught how to do trigonometry with a slide rule.

This article is adapted from The New Education: How to Revolutionize the University to Prepare Students for a World in Flux, available now. [Image: courtesy Basic Books]

In retrospect, that seems ridiculous. Why would calculators hurt your trig ability, but slide rules would not? The answer can be summed up as “technophobia,” a fear of change as embodied in new technology, especially technology that the young seem to master easily and that makes their elders feel clumsy and out of date, yearning for the good ole days. Ken’s teachers weren’t focused on their students’ ability to do mental calculations (a foundational math skill) but were worried about them relying on a new device instead of an old one. These well-meaning teachers wanted students to learn on the device they themselves had used when they were young.

We hear arguments about the use of technology every year about this time, as students prepare to head back to school. Invariably, pundits fill the airwaves with extreme views on the role of technology in our lives, often quoting “studies” that confirm one ill effect or another. My favorite one headlining on the evening news this season contends that it’s the “Likes” on Facebook that are making teens stupid. The logic seems to be that you can click “like” without really thinking about or responding to the content on social media. Does that then mean that comments sections make us smart? I don’t think so.

“Mad claims for technology’s ability to cure all are enough to send anyone back to her slide rule.” [Photo: Robert Daemmrich Photography Inc/Corbis/Getty Images]

What technology “does” to our brains is a recurring debate with high stakes and a lot of confusing data. For every article claiming technology damages you, there’s a counter tale of all the wonders technology brings, its magical ability to compensate for all the woeful gaps in the current education system. Technology is touted as the solution to making everyone and anyone workforce-ready, practically for free. This utopian view of technology–technophilia–fuels the $240 billion global educational technology market. It’s often hyperbolic. Remember way back in 2012, the year the New York Times dubbed “The Year of the MOOC”–the Massive Open Online Course? Major universities had famous professors webcast their lectures, and students could take these courses for free.

Best-selling author and Times columnist Thomas L. Friedman hailed MOOCs as “a budding revolution in global online education” that would universalize knowledge and slash skyrocketing college tuitions. Why stop there? “Nothing has more potential to lift people out of poverty,” Friedman insisted. Needless to say, in the years since, MOOCs haven’t done a thing to lower tuition or end poverty. Such mad claims for technology’s ability to cure all problems are enough to send anyone back to her slide rule.

Here’s the connection between educational technophobia and technophilia: both presume that technology in and of itself has superpowers that can either tank or replace human learning. Technology can automate many things. What it cannot automate is how humans learn something new and challenging. Neither can it, in its own right, rob us of our ability to learn when we want and need to learn. It can distract us, entice us away from the business of learning–but so can just about anything else when we’re bored.

Learning requires trial, error, correction, feedback, more trial, more error, and onward along the road of social, interactive learning. Seymour Papert, a pioneer in the field of artificial intelligence and cofounder of the MIT Media Lab, was an influential theorist and passionate advocate of student-centered learning, or what he called “constructionism.” He championed the idea that the best way to learn in the post-internet world of interactive communication is literally by constructing something: making, doing, exploring, experimenting, failing, analyzing the failures, and trying again.

Rather than having famous experts deliver the answers for neophytes to master, he believed it was the job of education to provide the “conditions for invention.” He posed problems to his students and let them figure out their way to the answer. He preferred to mix experts and nonexperts, specialists and novices, computer scientists and artists, and thought his role was to keep asking ever-more challenging questions. If he had been teaching Ken’s middle-school trig class, for example, he would have encouraged the kids to use calculators as a starting place for active learning. He might have had them calculate sine, cosine, and tangent functions in order to then explore more complex trigonometric problems in areas where they had passionate interests: astronomy, programming, acoustics, optics, biology, chemistry, computer graphics, and other subjects far beyond the syllabus of eighth-grade math. In other words, they would learn that technology is a tool–and so is trigonometry.
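
To make that concrete, here is a minimal sketch of such an exercise, written in Python purely for illustration (Papert’s own classroom language was Logo, and the flight_range function and its numbers are invented here, not taken from the original article): students use sine and cosine not as answers to memorize but as tools for a question they might actually care about, such as the angle at which a thrown ball travels farthest.

    import math

    def flight_range(speed_m_s: float, angle_deg: float, g: float = 9.81) -> float:
        """Horizontal distance covered by a projectile launched from flat ground."""
        angle_rad = math.radians(angle_deg)
        # The sin(2*theta) term combines the cosine (horizontal) and sine
        # (vertical) components of the launch velocity.
        return (speed_m_s ** 2) * math.sin(2 * angle_rad) / g

    # Let students discover for themselves that 45 degrees maximizes range.
    for angle in range(15, 80, 5):
        print(f"{angle:2d} degrees -> {flight_range(20, angle):5.1f} m")

From there, the natural constructionist move is to break the model and rebuild it: add air resistance, launch from a rooftop, and watch the tidy 45-degree answer stop being true.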

Most of the technophobic responses to devices assume that school should be cordoned off from the real world. Far too many research studies prove or disprove the efficacy of technology by seeing if it improves students’ scores on conventional, multiple-choice, objective exams. That’s the wrong metric. The purpose of education should not be better grades or a diploma. It should be the best possible preparation for thriving in a complex and changing world.

“Technology is a tool–and so is trigonometry.” [Photo: Jonathan Kirn/Getty Images]

Does taking notes longhand (instead of typing them into a laptop) really help you get higher grades on final exams (as one study proclaims)? Then focus on teaching better note-taking online since, outside of the classroom, that’s pretty much how everyone takes notes. Does having a screen available in a lecture hall mean students pay less attention to the lecturer? Sure! If I remember my own school days correctly, even yesterday’s school newspaper could distract me from a boring lecture.

It’s long past time that we found more engaged, effective ways of teaching than the lecture. In a 2014 analysis of 228 different studies of STEM teaching and learning comparing the efficacy of lectures (“continuous exposition by the teacher”) to active learning (“the process of learning through activities and/or discussion in class, as opposed to passively listening to an expert”), active learning won hands down, yielding greater success rates, higher completion rates, and better exam grades than traditional lecturing methods. It also took students less time to master the material and methods.

That’s common sense, really. Instead of either banning devices or automating information retrieval–whether from a screen or a lecturer droning on from the podium–the best pedagogical research we have reinforces the idea that learning in the classroom is most effective when it proceeds pretty much the way it does when we try to master something new outside of school: learning incrementally, being challenged, trying again. I even studied for my driver’s test that way–and certainly that’s what I do if I’m aspiring to something really difficult.

Banning devices does nothing to empower the students who use them. Empowerment requires separating knee-jerk technophobia and technophilia from wise and real cautions. Today’s devices are so attractive and useful that it is hard to avoid using them, even when we know that they can render us and our most valuable data insecure. That’s a caveat worth paying attention to.

Calculators versus slide rules? Laptops versus calculators? In order to have a saner relationship to our devices, we need to get rid of magical thinking. This is difficult, since there is a very long history of seeing technology as superhuman. Ken and his classmates would have been amused to know that in 17th-century England many were wary of the spanking-new invention of the slide rule. Both Sir Isaac Newton and the Reverend William Oughtred, one of the slide rule’s inventors, used the device privately and quietly. Many powerful dons and religious leaders of the day believed that any man-made device presuming to improve upon the capacities God had given humanity had to be heretical. As Galileo and others could testify, life didn’t go well for scientists who were thought to be of Satan’s party.

As school begins and the studies roll forth, keep this history in mind. As with many contemporary arguments against science, the roots of today’s technophobia and technophilia can be found in ancient worries about blasphemy. Sir Isaac Newton understood this. He taught his students how to use slide rules on the down low.


Cathy N. Davidson directs the Futures Initiative at the City University of New York (CUNY). Previously, she spent 25 years at Duke as a scholar and administrator. She is the author of many books including Now You See It and has written for the Wall Street Journal and Fast Company, among others. Davidson lives in New York.


Source: Fast Company
