Dr. Robert Howell, Dedman Family Distinguished Professor and Philosophy Department Chair, has contributed to SMU’s cutting-edge research on the ethics of technology. His work with the Dedman College Interdisciplinary Institute research cluster “Technology, Society, and Value” combines philosophy, law, statistics, and engineering to explore the future of technology and its ethical implications.
No one likes asking for directions. Nowadays we’re lucky: we don’t have to. Almost everyone carries a smartphone loaded with Google Maps or Waze, which can tell us, turn by turn, how to get from the office to the closest purveyor of macarons or specialty donuts. Now suppose that in addition to these apps we had another: Google Morals. This time those wizards of Silicon Valley have developed an app that tells us the ethical thing to do in any given situation. Trying to decide whether to take a “sick day” at the beach? Ask Google Morals. Deciding whether to buy that Tesla or to give a little more money to the Texas Food Bank? Ask Google Morals.
Some years back I asked what might be wrong with relying on an app as an ethical compass (here’s a link to a Huffington Post article summarizing my argument). The idea of Google Morals was far-fetched, but I thought it would nevertheless help us recognize the mistake in offloading the burden of ethical decision-making. In the years since, it has become clear that even if we aren’t appealing to our phones to make these sorts of decisions, we have nonetheless allowed technology to lead where we should follow. Social networks have transformed democratic discourse, and the news we read is increasingly shaped by algorithms hidden from our view. Corporations have detailed data on all of us and can use that data to shape our behavior in subtle ways. Artificial intelligence is being used for everything from recognizing our faces at the border to arbitrating parole hearings. The list could go on, but the point is clear: it’s long past time to attend to the ethics of emerging technologies so that we don’t race forward into a future we will regret.
The Philosophy Department at SMU is leading the way in researching and teaching the ethics of technology. Three years ago, Ken Daley and I developed a course called Technology, Society, and Value, in which we discuss issues including the nature of privacy, the ethics of human enhancement, and the dangers of artificial intelligence and social networks. SMU students, it turns out, see the importance of these issues. This fall, over 150 students signed up for sections of the course, with many more on waiting lists, making it the department’s most popular course. With support from the Dedman College Interdisciplinary Institute, we have joined with Suku Nair of the Lyle School of Engineering to form a Technology, Society, and Value Research Cluster that sponsors dinners, discussions, and lectures, bringing people from across the University together with leaders in industry to discuss the issues we face. Suku and I, meanwhile, are working with professors Meghan Ryan in the Dedman School of Law and Tony Ng in the Department of Statistics to form an interdisciplinary team that aims to make SMU a leader in research and education in the ethics of technology. We believe that SMU’s research strengths can be allied with the international profile and expertise of Dallas industry to make a difference at a crucial time in our history. Technological change is inevitable, but ethical leadership is indispensable. We’ll only have the future we want if we first understand the values we hold dear.