Lecture | November 9 | 4:10-5:30 p.m. | 202 South Hall
Jessica Hullman
Co-sponsored by the Goldman School of Public Policy
Research and development in computer science and statistics have produced increasingly sophisticated software interfaces for interactive visual data analysis, and data visualizations have become ubiquitous in communicative contexts like news and scientific publishing. However, despite these successes, our understanding of how to design robust visualizations for data-driven inference remains limited. For example, designing visualizations to maximize perceptual accuracy and users' reported satisfaction can lead people to adopt visualizations that promote overconfident interpretations. Design philosophies that emphasize data exploration and hypothesis generation over other phases of analysis can encourage pattern-finding over sensitivity analysis and quantification of uncertainty. I will motivate alternative objectives for measuring the value of a visualization, and describe design approaches that better satisfy these objectives. I will discuss how the concept of a model check can help bridge traditionally exploratory and confirmatory activities, and suggest new directions for software and empirical research.
This lecture will also be live streamed via Zoom.
510-642-1464
Catherine Cronquist Browning, catherine@ischool.berkeley.edu