⚡ Three Things Your CS Degree Won't Teach You

What industry influence means for the CS classroom

I'm excited to share a guest essay from my dear friend Jessica Dai, who cofounded Reboot with me back in March. She's been on hiatus from the project for the last few months, instead embarking on far more productive ventures like improving model transparency at Arthur or presenting a paper on "Fairness Under Partial Compliance" to the Women in Machine Learning workshop at NeurIPS.

Jessica also wrote a feature piece, "The Paradox of Socially Responsible Computing," on the limits and potential of teaching tech ethics—insights she gained from leading Brown University's Socially Responsible Computing program from 2019-2020. I highly recommend you read that first. It's incredibly illuminating, and it sets the stage for this deeper interrogation into the systemic factors (read: technology capitalism) that constrain ethics in the CS classroom. So I apologize for the clickbait title, but we're really not kidding.

🤖 cs vs. the tech industry

By Jessica Dai

Undergraduate computer science programs have a long-standing tradition of choosing not to cover many topics students might encounter “in the real world,” eschewing Kubernetes and AWS in favor of theory and algorithms. In terms of technical content, perhaps this is acceptable, even desirable. But as departments attempt to contend with and encourage responsible or ethical CS, they fall short by ignoring the non-technical aspects of the broader environments that their graduates will be entering. Beyond setting up recruiting and research open houses, students are left to slowly discover for themselves the nature of the tech industry and CS academia. And yet, it is precisely this non-technical context that shapes the way the technology itself is operationalized.

As I allude to in my original piece about CS ethics programs, it would be disingenuous to discuss "CS ethics" in the university without addressing the behavior of the tech industry itself, or even understanding what we mean when we talk about “tech.” Capital-t Tech—think Silicon Valley, startups, and tech giants—is a place where, despite all its claims to disruption, few innovations are more revolutionary than the reinvention of the servant—see, for example, all the gig work apps. It’s a place where anyone who is not a white or Asian male software engineer will generally be underrepresented, underpaid, mistreated, and possibly even abused (a term I do not use lightly). 

And, in the context of tech ethics, it’s a place where some of the most egregious policies are decisions made far away from product development, where it seems like there’s little an individual employee can do. ICE and military contracts, for example; the abdication of responsibility for content moderation; the treatment of massive shadow workforces of contractors that are the backbone for these companies’ continued operation [editor’s note: see Reboot’s review of Voices from the Valley]. In the big picture, much of what is concerning about tech is less about the actual technology and more about “business-side” decisions made by executives. I suspect this is both a feature—oftentimes employees won’t learn about these decisions until after they’ve been made—and a bug—I wonder if at least some of employees’ anger is due to a perceived lack of agency over the products they’ve built. 

on recruiting, and choosing where to work

Many tech companies, recognizing the overall shift in sentiment at universities around the US, have repackaged their “change the world” marketing and recruiting pitch as “tech for good.” But the vacuousness of the phrase means that quite literally anything could be tech for good, whether it's cop-tech ("VR for empathy building") or predatory payday lending companies (“a financial system that works for people”). At large companies like Microsoft, you could be working on making PowerPoint screen-reader accessible, or you could be building the war cloud. Tech for good needs to be more than a branding effort—it’s up to students to think critically about what they’re really contributing to.

There is nothing quite like the way tech companies make students going through recruiting simultaneously feel special, unique, valuable—after all, these companies try so hard to get applicants—and generic, disposable, worthless—at the end of the day, you’re just one of thousands. Having been there myself, I can’t entirely blame students for whatever decisions they end up making. On the one hand, there’s the illusion of “changing the world”; on the other, there are very real reasons for students to pursue these careers—the financial stability of a tech job is incredibly attractive in light of six-digit student loans, for example. While there are no easy answers, it seems to me, at least, that university departments have the responsibility to not pretend that the companies hosting recruiting events are all the same (I've written previously about Palantir’s industry partnership with Brown, which ended the following year). 

on academia and the university-industrial complex

But of course, not all CS graduates are going into industry. On the research side, there are two major elephants in the room. The first is the massive, disproportionate amount of power that industry currently has over CS research. Dr. Timnit Gebru's recent firing from Google Brain's Ethical AI team was ostensibly due to (a) a paper criticizing large language models or (b) her overall pattern of activism and advocacy—either of which is intensely chilling. This episode exemplifies and confirms many core fears about industry-driven research that, until now, most people were able to rationalize away: that industry research is ultimately still motivated by the bottom line; that "Ethics" teams, despite their high-quality work, might be a facade that companies will sacrifice to preserve their power, and only valuable insofar as they are revenue-generating; that true, vocal concern about ethical oversights is cause for termination.

It's not much better in academia proper, either. The second elephant is funding: many of the biggest research grants are from either industry or the military (though NSF has a nontrivial grant volume as well). The work that is funded by the tech giants is often related to areas that they can then fold back into their profit machine. For example, Amazon has recently been funding fair machine learning grants; a cynical view would note that fairness is a feature Amazon is adding to its SageMaker product. Meanwhile, military funding is, well, military funding, and while there certainly may be beneficial projects funded from any source (the internet, for example), I personally can't help but feel uncomfortable with the notion that the military chooses grant awardees based on its mission of protecting liberal democracy (by occupying sovereign nations and killing civilians around the world). 

As with recruiting, I am in no position to be too prescriptive about what professors should be doing. The reality seems to be that for an academic looking for funding, there aren’t many options not in some way related to big tech or the military. Rather than individual grants or individual faculty members, the problem is the industry-filled funding space itself—reminiscent of research funded by Big Oil and Big Tobacco.

if i’m a college student, what am i supposed to do with this information? 

While I don’t have an especially unique take on this issue, perhaps the most succinct description of my personal view is the meme, “we live in a society.” We live in a society where it’s not possible to truly be one hundred percent “ethical,” and nowhere is that more true than in tech. “Just be good,” after all, is an NP-hard problem. But that in itself isn’t a reason to not care or to withdraw entirely. I find Jenny Odell’s phrasing in How to Do Nothing especially compelling: to participate in society as long as we live in one.

[editor’s note: concerned students can stay involved with Reboot as we continue to interrogate these issues ;)]


what are you most proud of accomplishing with brown’s src program?

Honestly, just getting it off the ground was a pretty significant leap. As I’ve written, there’s a lot we could improve on, but a lot of work really did go into setting up the program in the first place. The results of that first year, which I see as a proof of concept, are definitely promising.

what was most surprising about engaging with students on cs and ethics?

I started out pretty unsure of how receptive students would be to this content—I fully expected a slew of unsavory posts on Dear Blueno (Brown’s anonymous confessions page)—but it turns out that the majority of students were happy to engage once prompted (and we got at most one DB post, which wasn’t too mean at all!).

can you tell me about a book you loved this year?

Why Fish Don't Exist: A Story of Loss, Love, and the Hidden Order of Life by Lulu Miller. It doesn’t sound very exciting, but it’s truly delightful at every turn.

Find more of Jessica on Twitter or Goodreads.

🌀 microdoses

💝 closing note

Any gift ideas for last-minute shoppers?

  • Jasmine: A paid newsletter subscription to support your giftee’s favorite independent writer—some of my favorites are Anne Helen Petersen’s Culture Study, Jeff Ding’s ChinAI, and Azeem Azhar’s Exponential View.

  • Deb: Anything from Paper Source can only be a gift, especially the hand finger puppets which are hilarious and also completely useless :)

  • Em: Waterproof notepad and pencil for the shower, to write your shower thoughts on!

  • Ben: Brain Pickings is one of my long-time favorite blogs, and I love old science art. You can buy something from their Society6 page.

Reboot's next author event will be on Tuesday, Jan 12 from 5-6pm Pacific—and surprise, Jessica’s back with us and hosting. See you then, and happy holidays!

—Jasmine & Reboot team