Bias and Diversity Working Group

Last updated January 19, 2024

Diverse data collection and curation strategies, along with the mitigation of bias in data analysis within the MIDRC commons, are critically important for developing ethical AI algorithms that produce trustworthy results for all groups. MIDRC strives to mitigate bias in its study population, data collection, curation, and analysis.

Check out the Bias and Diversity Working Group's resources page: a bias awareness tool is available to help researchers identify and mitigate biases that may arise in the AI/ML development pipeline.

Members:

Brad Bower, PhD, NIH
Karen Drukker, PhD (lead), University of Chicago
Weijie Chen, PhD, US Food and Drug Administration
Judy Gichoya, PhD (lead), Emory University
Maryellen Giger, PhD, University of Chicago
Nick Gruszauskas, PhD, University of Chicago
Jayashree Kalpathy-Cramer, PhD, University of Colorado
Hui Li, PhD, University of Chicago
Rui Carlos Pereira De Sá, PhD, NIH
Kyle Myers, PhD, Puente Solutions
Berkman Sahiner, PhD, US Food and Drug Administration
Heather Whitney, PhD, University of Chicago
Zi Jill Zhang, MD, University of Pennsylvania
