Designing AI for Social Fairness

 

AI-based technologies increasingly shape access to safety, health, and essential services, yet they can produce outcomes that raise concerns about fairness. This project examines how AI systems influence experiences in transportation, health, and public-sector contexts through studies of pedestrian behavior, healthcare access and provision, and municipal technology practices. By combining computational modeling, immersive simulations, stakeholder interviews, and community-centered design, the team develops insights and tools that help ensure AI systems reflect societal needs and support fairer decision making. Ongoing engagement with government agencies, healthcare providers, and community partners informs policy and practice aimed at fostering more responsible and fair uses of AI.

 

Team Members


Kaya de Barbaro
Psychology

News


At the Crosswalk: Who Does AI See?
A UT Austin research team is investigating how appearance affects driver behavior, and how those patterns could be replicated in autonomous vehicles.
Aug. 25, 2025
Unyielding Bias

A team of Good Systems researchers is using video footage and VR simulations to study driver behavior toward pedestrians — and how bias may influence future AI systems.

March 28, 2024
UT Austin, Cornell Researchers Developing AI Interventions to Address Suicide Rates Among Black Youth
A multi-university team that includes University of Texas at Austin researchers has been awarded a grant from the National Institutes of Health to create AI-based interventions to help address the risk for suicide among Black youth.
Sept. 20, 2023
Good Systems Awards Funding to Advance Ethical AI Research in Core Project Areas
Good Systems has awarded funding to five projects that investigate the ethical implications of AI technologies in society in the areas of racial justice, surveillance and privacy, smart cities, and community-robot interaction.
