One of the things I’ve been thinking about a lot recently is how to make sure that the ways we collect and analyze data don’t inadvertently reinforce biases. I found the following article on the “four pillars of the responsible use of data” really helpful in establishing basic principles that anyone who collects or works with data would benefit from keeping in mind.
I won’t spoil the whole article, but as a quick preview, here’s a brief look at the four pillars:
- Trust: “Organizations must ensure how and why they are collecting data are mission-driven practices. […] Communities and individuals must have input. Big data must be given local context in the form of ‘ground-truth’—information coming directly from the people, place, or things surveyed.”
- Humanity: “There is fear in the nonprofit sector that data turns human beings into cold, hard numbers. Finding ways to humanize data and even how we talk about data is critical to widespread acceptance of data-driven culture.”
- Equity: “Inclusiveness and participatory design are two ways to move towards equity. Organizations should involve the surveyed communities, or better, let them lead. People and communities should own their own data and systems.”
- Privacy: “It’s important that people and communities are in charge of their own data. They should be able to have a say in, if not outright control of, what happens with data collected from them.”
In the age of corporate-driven big data, it’s easy to feel like any effort to collect data is reductionist and can produce more harm than good. But I’m optimistic: insofar as data has been used to drive profit and reinforce inequities, it can also be used to reveal society’s biggest shortcomings in startling detail and shine a light on how we can do better.