Want to fix big tech? Change what classes are required for a computer science degree

“Are you Eng or Non-Eng?”

“Around here, you’re either SWE or Support.”

These are common sentiments across the tech industry: people are sorted into binary categories of engineering or non-engineering, software engineering or support. The classification extends from companies’ hiring databases to the way colleagues talk about one another to how leaders describe their employees. It also creates a hierarchy of whose opinions matter: engineering and computer science perspectives overshadow design expertise and insight from the humanities.

This way of thinking then permeates the products and services that technology companies build. And today, those products and services affect millions of people: Computer scientists at Facebook and Google create algorithms and models that influence what news countless people do–or don’t–read. Computer scientists at banks write code that determines if someone is eligible for a mortgage. And computer scientists at insurance companies build machine learning systems that decide how much someone pays for their policy.

When this narrow thinking translates into products and services, and then into the world at large, the unintended consequences can be dire. In recent years, we’ve encountered algorithms that radicalize youth, platforms that amplify misinformation, facial recognition software that perpetuates racial bias, and systems that exacerbate inequality in the courts and in hiring.

We often scramble for shortsighted solutions to these problems: product fixes with minimal effect, or acrimonious congressional hearings with little impact. But if we want more ethical technology, we need systemic change. We need to start at square one–in the classroom. Why? Because while the harmful technical vs. nontechnical classification flourishes in the tech industry, that’s not where it begins. For decades, a platitude about the social sciences and humanities has circulated on college and university campuses. It goes something like this: You’re studying History / Sociology / English? You must really love the subject. I hope you can find a job. (Even a sitting president joked about this–and critics have been warning about the effects of this Two Cultures false dichotomy since at least the 1950s.)

More recently, another platitude has emerged, this one about computer science majors: that they’re precocious, that their opinions matter most, that they’re destined for six-figure starting salaries.

During my own time on campus, I witnessed how deeply those two platitudes shaped my peers’ education. Computer science, like many majors, was heavily “siloed”–my fellow students had course schedules loaded with classes like Data Structures and Algorithms, but usually devoid of classes like Social Movements, Political Uprisings, or Race and Gender Studies. Even when social science was a core requirement, the classes were viewed as fluff, a means to an end on the way to graduation. We were also told, from our first days on campus, that we were different–that we were heading in the opposite direction from our social science counterparts. Humanities and social science students would become lawyers, teachers, and social workers; computer science students would build the future through software. Everyone else would use the stuff we built.

This wasn’t unique to my university. In the years since graduation, I have worked across the tech industry, government, and academia, and it was rare to find someone with a double major, or a major and minor, who bridged computer science with the social sciences and humanities–not as electives and add-ons, but as fields with equal weight.

Further: performance reviews at tech companies focus more on a product’s user growth and engagement, and less on how that product affects its users or the broader society. It’s why software is often developed with a mindset of “what’s possible?” rather than “what’s responsible?” Disruption and speed are the reigning virtues.

Of course, this isn’t to say computer science programs produce amoral people, or that computer scientists have an inherent empathy deficit. (That’s a whole other harmful and misguided platitude–that Silicon Valley is full of mercenary coders who don’t care about society.) Many colleagues I have worked with across consumer tech, enterprise, and government are thoughtful and empathetic people.

Rather, computer science programs produce people with blind spots: people who excel at writing code, but who aren’t equipped to assess how that code might intersect with human behavior, privacy, safety, vulnerability, equality, and many other factors. As the entrepreneur Anil Dash recently wrote, “Tech is often built with surprising ignorance about its users.” Too few people follow the advice of legal scholar Salome Viljoen to honor all expertise when writing code and building technology.

This blind spot plays a major role in the tech controversies we see every day. It’s one of the reasons we end up with recommendation algorithms that can predict our television preferences to a tee–but that also spread anti-vaccine rhetoric at gigabit speeds. It’s why we encounter targeted pregnancy advertisements that spoil the surprise or, far worse, continue to appear after a stillbirth. The computer scientists who built Facebook’s news feed algorithm, or Google’s search ranking algorithm, didn’t fully anticipate how or why bad actors could hijack their technology–an oversight that a more complete understanding of sociology might have helped mitigate.

By recasting computer science and social science as compatible, rather than mutually exclusive, we can make real progress on these problems and help prevent future ones. There are already bright spots: at my alma mater, Georgia Tech, the faculty has introduced an “Ethics in Undergraduate Education” program that explores how ethics intersect with automated decision-making and privacy policies. Other schools are piloting similar efforts: Northwestern University has a course titled “Algorithms and Society,” and the University of Wisconsin at Madison has one titled “Code and Power.” Casey Fiesler, an assistant professor of information science at the University of Colorado, led a movement that crowdsourced more than 200 tech ethics courses across computer science, information science and studies, communication, law, philosophy, and other fields.

Tech companies and civil society are realizing the importance of this, too: Omidyar Network, Mozilla, Schmidt Futures, and Craig Newmark Philanthropies recently launched the Responsible Computer Science Challenge, a $3.5 million initiative to better integrate ethics into undergraduate computer science education. (I’m currently a Fellow at Mozilla co-leading this challenge.) On April 30, the first wave of winners received funds to pilot their responsible CS curricula.

It’s critical that these first attempts take root. Long-term change in the tech industry requires a commitment to lasting change in curricula and in the siloed culture of academia. I’m hopeful this nascent movement will grow. We might one day see computer scientists bound by ethical guidelines, much as doctors swear the Hippocratic Oath and lawyers are accountable to bar associations. We might see engineers who act more like fiduciaries, shipping and deploying code only after it has been tested for bias and potential user harm. We might see certification programs that designate developers as “ethics conscious.”

Of course, changing how we teach computer science won’t solve all of tech’s ills. Even when a cohort of concerned employees at YouTube spoke up about the effects of misinformation on the platform, the company didn’t take action. Regulation, web literacy, greater gender and ethnic diversity, and other approaches also have important roles to play. But by training a new generation of technologists to think about ethics just as often as they think about code, we can spark change from the start. And as the internet becomes ever more entwined with our cities, our safety, and our well-being, there’s never been a more important time to do this.


Kathy Pham is a computer scientist, product leader, and serial founder who has held roles in product management, software engineering, data science, people operations, and leadership in the private, nonprofit, and public sectors. She is currently a Mozilla Fellow leading the Responsible Computer Science Challenge.


Source: Fast Company
