In this seminar, we will learn to think about how the world we live in shapes and is shaped by technology, and what the role of law is and should be with respect to regulating technology’s creation, use, and effects. We will learn to analyze the relationships between technology, law, and society through theoretical and critical lenses. We will examine how law and policy interact with the development and proliferation of new technologies, or new applications of existing technologies. We will examine the challenges that legislators, judges, and regulators face when acting in contexts affected by technology, including uncertainty about the future, lack of technical expertise, and the speed at which technology develops. We will learn to synthesize insights from a variety of sources and fields in our discussions and in a final research paper.
By the end of this course, you will be able to:
- Articulate what it means to think about technology through a sociotechnical lens, and why it matters.
- Scrutinize the ways in which technologies can be beneficial or harmful to society.
- Describe the differences between how legal actors, such as judges, legislators, regulators, and in-house counsel at technology companies, interact with technological development.
- Examine the role and limitations of law in regulating the social effects of technology.
Class Meetings and Office Hours
We will meet Tuesday 5:30–7:30 PM, on Zoom (link on MyLaw page). My office hours will be by appointment in the Zoom classroom.
There is no assigned textbook for this seminar. We will discuss journal articles, news articles, cases, book chapters, and other materials. Many of the assignments below are hyperlinked, but readings that are not will be posted in the course OneDrive folder.
The OneDrive folder also has a space where each of you can upload interesting discussion materials that are relevant to the class, like news articles you see. If you upload something, please just let me or the class know via email what it is, so we can enjoy and discuss it.
In addition, for assistance with the research paper, I recommend Professor Volokh’s book Academic Legal Writing. (It doesn’t really matter which edition, though later is probably at least a little better.)
Regular attendance is required for all classes at UCLA Law. Pursuant to our academic standards, students who do not regularly attend class may, at my discretion, be prohibited from turning in a final paper (resulting in a grade of “F”) or be dropped from the class. Students for whom this may be an issue will receive a written warning before this final action, and may need to attend all remaining classes after the written warning is given. If you must miss a class because of a medical need, or serious familial, religious, or professional obligation, please email me at least 2 hours before class to request an excused absence. By UCLA policy, the class will be recorded, but as should be obvious in a seminar, watching the recording is no substitute for attending class sessions.
Your grade will be based on class participation (40%) and a final research paper as described below (60%).
Because this is a course centered on small-group discussion, class participation counts for 40% of your final grade. I cannot stress this enough: If you do not want to be an active participant in class discussion, do not take this course. Participation is required, and you will not receive a passing grade if you do not contribute meaningfully to the discussion. We are here to learn from and teach each other, and if you do not participate meaningfully, you are depriving not only yourself, but your fellow students as well.
You will be assessed on the quality, not the quantity, of your participation. In this seminar, we will read challenging material, often from types of sources that law students do not typically engage with. The unfamiliarity of these texts means that you will probably not have well-developed notions going in about how you are supposed to engage with them. That is intentional. We each bring our own valuable perspectives and expertise to the discussion, and the lenses through which each of us sees the readings are as interesting and important as the readings themselves. Quality participation, therefore, does not mean that you understand the material fully and give a correct answer—there is usually no such thing. Rather, quality participation means thorough preparation and thoughtful engagement that enriches class discussion and thus helps us all learn.
The Classroom Environment
I want to ask three things of you in creating a classroom environment conducive to learning: presence, generosity, and security. First, I ask for presence. We are engaged together in a semester-long intellectual exploration, and I hope you will be present for it, as much as you are able. This goes beyond simple physical attendance or even preparation. I encourage and invite you to bring into the class as much of your attention, your curiosity, your interest and imagination, as you can. This may be exceptionally challenging in a semester of remote classes, so it is especially important that we all be intentional about it.
Some of the topics we face in here may be difficult, not just intellectually, but emotionally or politically as well. In daily life, we may often face strong pressure to embrace, or to reject, particular opinions in matters of law and policy. On some issues, this can be part of how we express our values and define ourselves. The seminar room is an intentionally different kind of space: It is a safe place to play with new ideas, to try on arguments, to entertain controversial, uncertain, or unpopular views. Advocating a position with which one actually disagrees can at times be fruitful. In any event, I expect you to honor one another’s intellectual growth by making an active effort to hold your mind open to new perspectives.
Therefore, the second thing I ask is that you be generous toward the ideas you encounter here — whether from me, from the readings, or from our shared discussion — in a specific way: seek out the strongest version of each idea, before responding to it. Ask yourself, “what’s the smartest thing this person (or this text) could possibly be trying to say?” Then, respond to that, including with spirited objections if they arise for you.
Third, I ask that you work with me to create conditions in which we can feel secure in taking intellectual risks. Specifically, please do not quote your classmates outside of class without first obtaining their permission. Views expressed in the seminar are part of the learning process, and may not reflect the speaker’s fully developed view. Conversation is a vehicle that helps us discover and develop our perspectives.
Discussion, participation, and presence are essential to all law school classes, but nowhere more so than in a seminar such as this. I ask that each of you please turn on your video whenever possible so we can be as fully engaged as possible while in a remote setting. If you must turn your video off for personal reasons or for bandwidth reasons, please let me know in advance where possible, and please keep it the exception rather than the rule. Please note that I am requesting camera usage to aid classroom discussion and presence, not as a way to introduce surveillance into the class; if you have a momentary need to turn off your camera for whatever reason, please do so at your own discretion. If your video is off for an extended period and you did not tell me ahead of time, I may come check on you, not because you’re in trouble, but because I will take that as a sign that something is wrong and I will want to see that you’re ok.
Per University policy, UCLA Student Conduct Code 102.28 provides that expectations of privacy apply: it specifically prohibits recording without the consent of all recorded parties and prohibits taking photographs where there is a reasonable expectation of privacy. In remote teaching, advising, chatting, and other course activities, there is a reasonable expectation that photographing, screen capture, or other copying methods or recordings will not occur without express permission from all participants. A violation subjects a student to the disciplinary process. Do not record your courses; do not take screenshots of your classes, professors, or classmates; and do not release, post, email, text, or otherwise share or sell course materials to others.
UCLA Law strives to provide accommodations in a way that supports students with disabilities while maintaining their anonymity and the fundamental nature of our law program. As such, students needing academic accommodations should not contact their professors directly, but should instead contact Carmina Ocampo, Director of Student Life, or the UCLA Center for Accessible Education (CAE). When possible, students should start this process within the first two weeks of the semester, as reasonable notice is needed to coordinate accommodations.
Resources for Health and Wellness
Students needing assistance with medical or mental health issues, substance abuse, anxiety or depression or other health-related matters should contact the Office for Student Affairs, UCLA Counseling and Psychological Services (CAPS) at 310-825-0768 (Courtney Walters is the counselor regularly assigned to the law school) or the Ashe Student Health & Wellness Center at 310-825-4073.
The majority of your grade will be determined by an original research paper. In it, you should do new factual and legal research to explore a topic related to law, technology, and society that we do not cover in class, or a new aspect of a topic we do cover. The paper must be substantial, as it satisfies the SAW requirement. As I read it, this means it should likely be about 10,000-12,000 words in length. This word count is not itself important, and I will not be counting; you should write exactly as many words as needed to make your arguments well. I am giving you an approximate word count solely to explain that I anticipate that something in that range will accomplish the task. If you greatly exceed that range, your claim should be large enough to justify the extra length used to defend it.
The paper will be evaluated on clarity and organization, the integration of sources and analytical arguments to back up your claims, and a demonstrated understanding of the relevant theoretical concepts we cover in class. (Hint: If you write about technology as if it were entirely separable from the people who build, use, and otherwise interact with it, you’re going to have a bad time. Hopefully you will see what I mean in the first two weeks.)
You must complete three assignments on the way to the final paper. Each of these must be submitted before the start of class on the specified due date. Together, the successful completion of these assignments will count for 25% of your paper grade (15% of your overall grade for the course). Please submit the assignments in the designated folder in the student Drop Box on MyLaw, as a Word document titled “Paper_Proposal.docx”, “Outline_and_Bibliography.docx”, “First_Draft.docx” or “Final_Draft.docx”, as appropriate. Do not include your name; the Drop Box takes care of that. Please number your pages as well. Doing this correctly ensures that I will be able to give you credit for having turned it in.
Paper Proposal (Due Week 4, 9/15):
You must submit a paper proposal. This should be about one page and no more than two, and will consist of a proposed abstract for your paper, identifying the topic you wish to explore, some open questions, and a rough sketch of the paper’s expected thesis. For this assignment, you will need to do some initial research to understand what is out there already. I also expect this thesis to change as you research more, so do not worry about it being perfect; I just want to make sure you’re on the right track and asking the right sorts of questions.
In the following week, I will return the proposals with comments and may ask to meet with you if there is something I feel we should discuss. But feel free to check in with me as you think about topics before your proposal is due.
Annotated Paper Outline and Initial Bibliography (Due Week 7, 10/6):
Next, you must submit an annotated paper outline and initial bibliography. By this time, you should have done substantial research and understand your argument decently well. I expect you to outline the paper at a level of detail such that I can understand your argument by reading the outline. Please annotate the outline at the key points needed to make this happen. For the bibliography, you should provide ten sources and for each, a few sentences explaining what it is and how it fits in the argument of your paper.
In the following two weeks, I will meet with each of you for half an hour to discuss the paper outlines and offer suggestions. It is imperative that you lay out the outline and annotation in a tight, organized manner. I’ll have many of these to read; if you do not give me information in an organized enough way, I will not be able to help you in the half hour we have to go over the outline.
First Draft (Due Week 11, 11/10):
Your first draft is due five weeks after your outline. This first draft should be as close to your finished product as you can make it, to spare you from having to make big changes during finals period. I will not be grading the drafts yet, but I will offer feedback. I hope to turn them around within about 10 days, to give you a full four weeks to work on the final draft.
Final Draft (Due at the end of finals period, 12/18):
Self-explanatory. This is the product that will make up the remainder of your paper grade.
The first third of the course will be dedicated to thinking about what technology is and what it means. We will interrogate the idea of technology as an object, and explore the social systems and power structures that co-create technology, often determining aspects of its creation, use, and effects on people. We will spend this portion of the course developing a vocabulary that is associated with these questions in the scholarship. Class readings will mix classics with newer iterations of similar questions. (5 classes)
The second third of this class will apply our newly acquired theoretical frameworks to several different categories of technology. We will examine their function on a technical level, and interrogate the societal frameworks in which they sit. Does technology drive change? How much is the effect of a technology new, and how much is attributable to existing societal structures? What kinds of problems does the technology raise, and how can law best respond or anticipate the problems? (4 classes)
The final four classes will be run by you all. I will ask you each to submit one reading selection to the class—whether a news article or an academic article or something from another discipline—to open discussion of your paper topic. I will also ask you to bring in and share your annotated outlines, to help us understand the topic, and so that we can provide feedback. (4 classes, 4 students in each class, randomly assigned). The readings and outlines should be placed in the appropriate OneDrive folder.
The weekly readings are below. By 2:30 PM the day of class, please post a reaction to the readings in the Teams discussion channel. This can be just a few sentences, or longer. There is no right or wrong answer; it is meant to drive in-class discussion. Please also read your classmates’ reactions before class.
Week 1: The Social Shaping and Politics of Technology
What is technology? This week, we’ll probe the definition of technology, discuss the forces that shape it, and examine how lawyers tend to speak and write about technology. We will also have some time to get to know each other and go over the plan for the course.
- Donald MacKenzie & Judy Wajcman, Introductory Essay, in The Social Shaping of Technology (Donald MacKenzie & Judy Wajcman, eds. 1985) (23 pgs).
- Langdon Winner, Do Artifacts Have Politics?, reprinted in The Social Shaping of Technology, supra at 26 (12 pgs)
- Meg Leta Jones, Does Technology Drive Law? The Dilemma of Technological Exceptionalism in Cyberlaw, 2018 J. L., Tech. & Pol’y 101 (pp 250–268, 277–281, 24 pgs)
Optional Supplemental Readings:
- Ruth Schwartz Cowan, How the Refrigerator Got its Hum, reprinted in The Social Shaping of Technology, supra, at 202
- Nick Seaver, Knowing Algorithms, in DigitalSTS: A Field Guide for Science & Technology Studies 412 (Janet Vertesi & David Ribes, eds., 2019)
Week 2: Social Constructivism: Technology and Race
This week we will discuss social constructivism and its implications. The theory of “social construction of technology” (SCOT) is useful for understanding the social origins of technology, but it mostly lacks a normative account of technology’s effects. Social construction of race is, however, explicitly normative and is a central concept underlying Critical Race Theory. Parts of the assignment address the relationship between technology and race directly. As you read and watch, think about the parallels between these kinds of deconstructive analyses, as well as the differences. They will be key to understanding what’s to come.
- Trevor Pinch & Wiebe Bijker, The Social Construction of Facts and Artifacts: or How the Sociology of Science and the Sociology of Technology might Benefit Each Other, reprinted in The Social Construction of Technological Systems 11, (1987) (31 pgs)
- Ian Haney Lopez, The Social Construction of Race, reprinted in Critical Race Theory: The Cutting Edge 238 (Jean Stefancic & Richard Delgado eds., 3d ed. 2013) (10 pages)
- Ruha Benjamin, Automating Anti-Blackness, excerpt from Race After Technology: Abolitionist Tools for the New Jim Code (2019) (4 pages)
- Joy Buolamwini, TED Talk, How I’m Fighting Bias in Algorithms (November 2016) (8-9 minutes)
- Please also watch this related short video on the history of photography and race (4-5 minutes)
Optional Supplemental Readings:
- Amy Moran-Thomas, How a Popular Medical Device Encodes Racial Bias, Boston Rev. (August 5, 2020)
- Sergio Sismondo, Science and Technology Studies and an Engaged Program, in The Handbook of Science and Technology Studies 13 (Edward J. Hackett et al., eds., 3d ed. 2008)
- Issa Kohler-Hausmann, Eddie Murphy and the Dangers of Counterfactual Causal Thinking About Detecting Racial Discrimination, 113 Nw. U. L. Rev. 1163 (2019)
Week 3: Code Is—But Really Isn’t—Law
If you speak to technology law scholars and practitioners you’ll sometimes hear the phrase “code is law.” But what do we mean by that? This week we’ll discuss regulation by law and by technology. How are regulation by law and regulation by technology similar? How are they different? How can translating legal requirements into code distort their meaning? When is that an acceptable risk?
- Lawrence Lessig, What Things Regulate?, Chapter 7 in Code 2.0 (2006) (18 pgs)
- Danielle Citron, Technological Due Process, 85 Wash. U. L. Rev. 1249 (2008) (Read pp. 1259–1263, 1267–81 (end of top paragraph); 21 pgs)
- Karen E.C. Levy, Book-Smart, Not Street-Smart: Blockchain-Based Smart Contracts and The Social Workings of Law, 3 Engaging Sci., Tech. & Soc’y 1 (2017) (12 pgs)
- Deirdre K. Mulligan, Colin Koopman & Nick Doty, Privacy Is an Essentially Contested Concept: A Multi-Dimensional Analytic for Mapping Privacy, 374 Philosophical Transactions Royal Soc’y, December 2016 (read Parts 1–3, pp. 1–9; 9 pgs)
Optional Supplemental Readings:
- Bert-Jaap Koops, Criteria for Normative Technology: The Acceptability of “Code as Law” in Light of Democratic and Constitutional Values, in Regulating Technologies: Legal Futures, Regulatory Frames, and Technological Fixes 157 (Roger Brownsword & Karen Yeung, eds., 2007) (read pp. 157–174)
- Ruha Benjamin, Retooling Solidarity, Chapter 5 in Race After Technology
- Danielle Citron, Technological Due Process, 85 Wash. U. L. Rev. 1249 (2008) (the rest)
Week 4: When Technology Occludes: Manipulation, Secrecy, and Abstraction
Paper Proposal Due
Technology can be designed or used to undermine people’s autonomy. Technologically-mediated decisions can also be secret or hard to understand, and thus difficult to contest or work around. As you read, think about the different people involved in these different scenarios—coders, judges, administrators, clerks doing the data entry, tech company management, benefits applicants. What choices does each make and who experiences the effects? What role is the technological system playing in these scenarios? What social categories does it rely on or co-construct? What does all this mean for the rule of law? How should the law respond?
- Ryan Calo, Digital Market Manipulation, 82 Geo. Wash. L. Rev. 995 (read pp. 996–1003; 8 pgs)
- Jamie Luguri & Lior Jacob Strahilevitz, Shining a Light on Dark Patterns (forthcoming) (read Part I & Part III-C, pp. 6–17 & 38–43; 18 pgs)
- Jack M. Balkin & Jonathan Zittrain, A Grand Bargain to Make Tech Companies Trustworthy, The Atlantic (July 26, 2020) (5 pgs)
- Rebecca Wexler, When a Computer Program Keeps You in Jail, N.Y. Times (June 13, 2017) (2 pgs)
- State v. Loomis, 881 N.W.2d 749 (Wis. 2016) (15 pgs)
- Sonia K. Katyal, The Paradox of Source Code Secrecy, 104 Cornell L. Rev. 1183 (2019) (read Part IV-A, pp. 1237–42; 6 pgs)
- James W. Malazita & Korryn Resetar, Infrastructures of Abstraction: How Computer Science Education Produces Anti-Political Subjects, 30 Digital Creativity 300 (2019) (read Parts 1–2, pp. 300–07, 8 pgs)
Optional Supplemental Readings
- Daniel Susser, Beate Roessler & Helen Nissenbaum, Online Manipulation: Hidden Influences in a Digital World 4 Geo. L. Tech. Rev. 1 (2019)
- Rebecca Wexler, Life, Liberty and Trade Secrets: Intellectual Property in the Criminal Justice System, 70 Stan. L. Rev. 1343 (2018)
- This is the full article that the NY Times piece comes from
- Andrew D. Selbst et al., Fairness and Abstraction in Sociotechnical Systems, 2019 ACM Conference on Fairness, Accountability and Transparency 59
Week 5: Unintended Uses, Unintended Consequences, and Regulation
The ways in which technology gets put to use, and by whom, are often unexpected. What should regulators do about that fact? Sit back and wait for the dust to settle for fear of doing more harm than good? Proactively regulate to ensure benefits outweigh harms? Whose interests are considered when regulators make such decisions? Is it ever right to ban a technology or form of technological research? Why or why not?
- Ronald Kline & Trevor Pinch, The Social Construction of the Automobile in the Rural United States, 37 Tech. & Culture 763 (1996) (read 764–65, 772–95; 26 pgs)
- Katharine Trendacosta, Reevaluating the DMCA 22 Years Later: Let’s Think of the Users, EFF Deeplinks (7 pgs)
- Steve Clarke, Future Technologies, Dystopic Futures and the Precautionary Principle, 7 Ethics & Info. Tech. 121 (6 pgs)
- Woodrow Hartzog & Evan Selinger, Facial Recognition Is the Perfect Tool for Oppression, Medium (Aug 2, 2018) (9 pgs)
- Malkia Devich-Cyril, Defund Facial Recognition, The Atlantic (July 5, 2020) (11 pgs)
- Bruce Schneier, We’re Banning Facial Recognition. We’re Missing the Point., N.Y. Times (Jan 20, 2020) (2 pgs)
- Dave Gershgorn, New Coalition Calls to End ‘Racist’ A.I. Research Claiming to Match Faces to Criminal Behavior, OneZero (Jun 23, 2020) (4 pgs)
Optional Supplemental Readings:
- Karl Bode, Congress To Consider National Right To Repair Law For First Time, TechDirt (Aug. 7, 2020)
- Jonathan Zittrain, The Generative Internet, 119 Harv. L. Rev. 1974 (2006)
- Madeleine Akrich, The De-Scription of Technical Objects, in Shaping Technology/Building Society (Wiebe Bijke & John Law, eds. 1992)
- Bert-Jaap Koops, The Concept of Function Creep, in 13 Law, Innovation & Technology (forthcoming 2021)
Week 6: Police Technology (Guest: Prof. Sarah Brayne, University of Texas Dept. of Sociology)
This week is the first of our case studies. We’re going to apply what we’ve learned about how to think about technology, its origins, and its effects, to look at police technology. What is special about police technology? How is it used? Where does it come from? Should law regulate it, and how? We also have a special guest! Professor Sarah Brayne has spent the last 7 years studying the LAPD’s use and understanding of their technology, and she’s coming to talk with us about it. Please come to class with questions for her.
- Sarah Brayne, Predict and Surveil (2020) [excerpt TBD]
- The book will be released during the semester, and I’ll post an excerpt for us to read.
- Brian Barrett, The Pentagon’s Hand-Me-Downs Helped Militarize Police. Here’s How, Wired (June 2, 2020) (4 pgs)
- Seth Stoughton, Law Enforcement’s Warrior Problem, 128 Harv. L. Rev. F. 225 (2014-2015) (read pp. 225–231, up to “To flesh out.”; 6 pgs).
- Barry Friedman & Maria Ponomarenko, Democratic Policing, 90 N.Y.U. L. Rev. 1827 (read Part I-B, pp 1843–55; 12 pgs)
Optional Supplemental Readings:
- Upturn & The Leadership Conference, The Illusion of Accuracy: How Body-Worn Camera Footage Can Distort Evidence (2017)
- Tracey Meares, Programming Errors: Understanding the Constitutionality of Stop-and-Frisk as a Program, Not an Incident, 82 U. Chi. L. Rev. 159 (2015)
- Andrew Guthrie Ferguson, Big Data and Predictive Reasonable Suspicion, 163 U. Penn. L. Rev. 327 (2015)
Week 7: Gendered Digital Harms
Outline and Annotated Bibliography Due
Technology can enhance harmful and abusive patterns in society, especially for anyone who is not a cisgender heterosexual white man. Women and transgender people in particular face abuse online and off in the form of cyberharassment and cybermobs, and intimate partner abusers can sometimes find that the affordances of technology offer new techniques to assert and maintain their control. As you read, focus on what is new or different about this situation, and ask how much of it is even a problem of technology. How does the design of the technology encourage this behavior? How can or should the law address these issues?
This is a challenging topic to discuss, and the readings may be difficult to get through for some of you. I am putting this on the syllabus because as challenging as it may be, I believe it is extremely important to understand the relationship between these kinds of harms and technology, and these discussions are too often considered secondary in society. Because of the sensitivity of the topic, I ask that we take special care this week to be kind, generous, and sensitive to others’ experiences that may differ from our own.
- Amanda Hess, Why Women Aren’t Welcome on the Internet, Pacific Standard (June 14, 2017) (11 pgs)
- CW: rape and death threats
- Diana Freed et al, “A Stalker’s Paradise”: How Intimate Partner Abusers Exploit Technology, Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems 1 (10 pgs)
- Danielle Keats Citron, The Problem of Social Attitudes, Chapter 3 in Hate Crimes in Cyberspace (2014) (19 pgs)
- Complaint, In re Snapchat, FTC Docket No. C-4501 (9 pgs)
- Mary Anne Franks, Fearless Speech, 17 First Amendment L. Rev. 294 (2019) (read pp 301–315; 15 pgs)
Optional Supplemental Readings:
- Siva Vaidhyanathan, Well done, Twitter. You’ve Finally Figured Out How To Deal With Trump’s Tweets, The Guardian (May 31, 2020) (4 pgs)
- Diana Freed et al, Digital Technologies and Intimate Partner Violence: A Qualitative Analysis with Multiple Stakeholders, Proceedings of the ACM on Human-Computer Interaction, CSCW 1 (2017)
- Christo Sims, The Politics of Design, Design as Politics, in The Routledge Companion to Digital Ethnography 439 (2017)
Week 8: Technology on a Global Scale
Today’s technology companies are powerful, maybe as powerful as governments. The huge multinationals sometimes even envision themselves as separate and apart from any individual state government. How does their global nature affect regulation? Despite their positioning as global citizens, many of the companies are based here in the US. How does this affect the products they make? Whose voices matter to them, and how does that change the technological realities?
- Kristin Eichensehr, Digital Switzerlands, 167 U. Penn. L. Rev. 665 (read pp. 685–695, 701; 12 pgs)
- tl;dr. (But actually do read Professor Eichensehr’s article.)
- Kate Klonick, The New Governors: The People, Rules, and Processes Governing Online Speech, 131 Harv. L. Rev. 1598 (read pp. 1616–22; 7 pgs)
- Siva Vaidhyanathan, Facebook and the Folly of Self-Regulation, Wired (May 9, 2020) (6 pgs)
- Anupam Chander, Margot E. Kaminski & William McGeveran, Catalyzing Privacy Law (forthcoming 2020) (read Part I & III-A, pp. 6–11, 26–27; 8 pgs)
- Jack Goldsmith & Tim Wu, Digital Borders (2006) (5 pgs)
- Chinmayi Arun, AI and the Global South: Designing for Other Worlds, in Oxford Handbook of Ethics and AI (Markus D. Dubber, Frank Pasquale, & Sunit Das eds., 2019) (13 pgs)
Optional Supplemental Readings:
- Farhad Manjoo, ‘Right to Be Forgotten’ Online Could Spread, N.Y. Times (Aug. 5, 2015)
- Center for Democracy and Technology, Mixed Messages? The Limits of Social Media Content Analysis (read Executive Summary; 5 pgs)
- Mark Latonero, AI for Good Is Often Bad, Wired (Nov. 18, 2019)
- An Independent Assessment of the Human Rights Impact of Facebook in Myanmar (2018)
Week 9: Technology, Democracy, and Elections
Election Day is next week! Let’s spend a class talking about some of the ways technology is affecting our elections, and what law can or should do about it. (Reminder: Election Day also means no class next week! I’d say “go vote,” but you probably shouldn’t do so in person if you can help it. So go mail in your ballot!)
- Samuel Wooley, We’re Fighting Fake News AI Bots by Using More AI. That’s a Mistake., Tech. Rev. (Jan. 8 2020) (7 pgs)
- James Grimmelmann, The Platform is the Message, 2 Geo. L. Tech. Rev. 217 (2018) (17 pgs)
- Bobby Chesney & Danielle Citron, Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security, 107 Calif. L. Rev. 1753 (2019) (read pp. 1758–68, 1776–79; 14 pgs)
- Jen Schwartz, The Vulnerabilities of Our Voting Machines, Sci. Am. (Nov. 1, 2018) (8 pgs)
- Jonathan Zittrain, Facebook Could Decide an Election Without Anyone Ever Finding Out, The New Republic (June 1, 2014) (5 pgs)
Optional Supplemental Readings:
- Francesca Tripodi, Searching for Alternative Facts (2018)
- Kate Klonick, The Facebook Oversight Board: Creating an Independent Institution to Adjudicate Online Free Expression, 129 Yale L.J. 2418 (2020)
- Dipayan Ghosh, Facebook’s Oversight Board is Not Enough, Harv. Bus. Rev. (Oct. 16, 2019)
Week 10: Presentations 1
Week 11: Presentations 2
First Draft Due
Week 12: Presentations 3
Week 13: Presentations 4