•   When: Friday, March 26, 2021 from 03:00 PM to 04:00 PM
  •   Speakers: Alexander Monea

Abstract: This talk will examine the ontology, schema, and representative images that constitute ImageNet, a large-scale database for training image recognition and computer vision algorithms. It will survey biases embedded in the dataset, ranging from the banal (e.g., complexity of objects, focal depth) to the intolerably cruel (e.g., images of white people in blackface). In assessing the impact of these biases, I will move to examining their compounding effects as ImageNet is increasingly used as a benchmark to train and assess the accuracy of image recognition and computer vision algorithms. In doing so, I hope to open a discussion about the importance of human review (manual or supervised) in the creation of large datasets, especially those that may serve as industry benchmarks. To what extent is such review feasible? Ethically mandated? Necessary for technical accuracy? A savvy financial investment, given the attendant public relations crises when things go awry?

Bio: Dr. Alexander Monea is an Assistant Professor serving jointly in George Mason's English Department and Cultural Studies Department. He researches the history and cultural impacts of computers and digital media. He received his PhD in Communication, Rhetoric, & Digital Media from North Carolina State University. His forthcoming book project examines how heteronormative biases become embedded in the computer vision algorithms and content filters that control the flow of content we encounter online. It examines everything from the anti-porn coalition of Christian conservatives and Alt-Right groups, to the biases in the datasets and algorithms that filter content, to instances of sex education and artistic materials being rendered invisible online. He is also currently Co-PI on an NSF grant with Aditya Johri and Huzefa Rangwala that uses role-playing scenarios to study best practices for teaching data ethics.
