Digital Tech Doesn’t Have to Be Toxic

The benefits of digital transformation are potentially limitless: technology that keeps us connected while optimizing efficiency, cost savings, and competitive advantage.

But how do you navigate the ethical risks and unintended consequences of allowing algorithms and automation to make decisions that affect people’s lives? According to a recent study by global analytics firm FICO, almost two-thirds (65%) of senior executives can’t explain how specific artificial intelligence (AI) model decisions or predictions are made, and 73% struggle to get executive support for prioritizing AI ethics.

These are just some of the challenges Michelle Shevin explores as Senior Program Manager, technology and society at the Ford Foundation. As a technologist, Shevin is obsessed with the interplay of technological, regulatory, and cultural change and anticipating and preparing to adapt to the challenges coming down the line including:

  • How do we create digital trust in what analyst firm Gartner calls the age of hyperautomation?
  • What are the ethical implications of how we design technology?
  • How do we design for accessibility and inclusion to ensure that everyone can use technology?

This is where Public Interest Technology (PIT) comes in, a digital equity movement that dates back to the early 2010s. To get the most out of technology, says Shevin, it’s not enough to “build it and they will come.” Better, says Shevin, to take a human-centered approach to innovation that invests in research, education, and ways to deploy technology that protect and benefit society.

“Companies that are explicitly incorporating the public interest technology frame are beginning to see real value generated, both in terms of their profits,” says Shevin, “and in terms of the trust that they’re generating with the critical communities that they’re working with, showing up for, and engaging with through business relationships. So, we think public interest tech is really core to a long-term ecosystem of building accountability and trust across constituencies and across customer relationships.”

“For example,” says Shevin, “as a business leader, you don’t want to get 20 steps down the road of designing and deploying facial recognition technology that could soon be regulated out of existence because nobody thought to put guard rails in place, or because nobody was consulting with the communities that might be most impacted by the technology.”

Perhaps this is a usability issue as much as anything else. But 44% of consumers don’t fully trust digital services, according to global research from McKinsey. In other words, people are conflicted about the acceleration of digital transformation in the post-COVID world. On the other hand, taking a human-centered approach to digital innovation can change people’s perception of technology. In the following interview, Shevin shares her insight on the PIT movement and how prioritizing public trust speeds tech adoption, drives tech equity, and helps businesses anticipate regulation in the age of hyperautomation. The questions have been edited for clarity and brevity.

We Need a New Kind of Technologist

Question:

The public interest technology (PIT) movement seems to be gaining momentum. You talked about that in a recent Fast Company article you wrote. You basically argued we need more technologists trained to understand the ethical, legal, and political ramifications of the technology they create. Talk about that and also about what you’ve called the public interest law framing of PIT.

Shevin:

Public interest law is a useful mental model to keep in mind. In the 1950s and ’60s, there was a great need for legal expertise that also understood the demands of the civil rights movement. At that time, funders started to make big investments in legal defense funds, institutions like the ACLU, university law clinics, and pro bono infrastructure for private law firms, for example.

Because public interest law is a thriving field, we often say that it’s hard to imagine now that this important infrastructure for the pursuit of justice needed to be intentionally built, but it did. And in that continued pursuit of justice, we see now a big moment to invest in public interest tech. And this is partially about responding to the escalating damage of the move-fast-and-break-things era that has characterized much of digital transformation thus far.

So, it’s become clear to stakeholders in philanthropy that we need a new framework to make sure technology is developed, designed, regulated, and used in a way that protects consumer rights and improves people’s lives. So, PIT aims to be that new framework. And it’s really a growing field that’s made up of a different kind of technologist who tends to be more interdisciplinary and intersectional.

Question:

In what way? Can you be more specific?

Shevin:

They might not be your classic computer scientist or programmer or developer. They could be advocates, they could be artists, and yes, they could also be coders and programmers. And it’s this different kind of technologist that really expects and demands, you know, that technologies be created and used responsibly. And so you may have heard of the terms ‘responsible tech’ or ‘ethical tech’ or ‘tech for good.’ Well, PIT resonates with all of those frames. It’s kind of a big tent. It’s a movement, it’s a field, and it really is a home for technologists to call out where technology can better deliver services and contribute to solving big problems.

Question:

So, PIT is really different from traditional computer science or data science expertise.

Shevin:

Increasingly, we’re going to see mainstream computer science and data science recognizing this, but traditionally it has not. Public interest tech centers a recognition that historically marginalized groups are most often harmed by technology. Data is impacted by structural inequality. Computing is not immune to power dynamics. Public interest technologists are people who have the knowledge and experience to make technology that really advances values like justice and equity in addition to, for example, optimizing for the bottom line of the business: profit margin, efficiency, or cost savings. PIT is an extremely diverse field, as you could imagine. It spans job categories and it’s intentionally diverse in terms of people’s backgrounds. And because of that, it’s a group who better represents public interests in the US and abroad.

People and Technology – In That Order

Question:

So, when you talk about equitable tech, you’re talking about making sure the technologies we develop really don’t harm people or make the digital divide even worse than what it already is. But beyond social impact and doing the right thing, what’s the business case for PIT, and why should business leaders care about it?

Shevin:

So, the private sector is a critical piece of the PIT ecosystem, and also a place where we think there’s enormous potential for deepening commitments and expanding our notions of what accountability looks like in the private sector. And of course, a lot of technology development happens through private industry, which makes the private sector this really critical node. Simply put, companies that center public interest values in the way they hire talent, approach technology, and maintain technical systems will create stronger products and services.

Question:

So, if I’m a business leader who cares about improving usability and driving innovation with a public interest mindset, what’s the playbook for doing that?

Shevin:

First, it’s essential to understand what PIT is, and how values like equity, transparency, and accountability get prioritized in a business framework, right? So, we’ve seen anecdotally and at scale how transformative adopting a PIT mindset can be for businesses. You can look at companies like Twitter, and you’ll see that over the past year they’ve been hiring more and more public interest technologists.

We’ve also seen companies embrace PIT and generate real value not just in terms of profits, but also in terms of the trust they’re generating with critical communities they’re working with. We see these companies engaging with communities through business relationships. For businesses, this is core to building an ecosystem and winning trust across constituencies and across customer relationships.

Question:

But business leaders can be skeptical of terms like public interest technology. It can come across as just another way to say we need more tech regulation.

Shevin:

That’s an important distinction because regulation is inevitable and important but has a reputation as a hard hammer that can inhibit innovation. But as a field and a set of approaches that work across sectors to center public interest values like equity, transparency, and justice in the way we design and apply technology, public interest tech is core to driving your business growth and sustainability.

So, we talk about PIT as a way to be anticipatory of forthcoming regulations and standards and important changes in the environment as it relates to regulatory hurdles and compliance.

It’s Okay to Pump the Brakes on Innovation

Question:

You’ve said that one of the roles of public interest technologists is to help us build and deploy technology that’s more human-centric. What did you mean by that?

Shevin:

Yes, when everyone’s excited about the next big innovation, PIT sometimes encourages us to actually pump the brakes and more closely examine how tech trends can and do harm marginalized groups, and how we can prevent that from happening.

I read a recent FICO survey that revealed many executives are ill-equipped to manage the ethical implications of using AI systems. For example, when asked about the standards and processes in place to govern AI usage, only 38% said their companies had data bias detection and mitigation steps in place. And just 6% said that they tried to ensure their development teams are diverse.

So, that’s a really disappointing snapshot of where we’re at when it comes to IT governance and hyperautomation governance. We’ve seen time and time again, how algorithmic bias has already caused real-world harm for marginalized communities, right? We’ve seen false arrests, increasing surveillance, and increasing marginalization of folks who don’t have access to systems that may require them to be machine-readable or visible to AI systems. So that’s really where public interest tech comes in.

Question:

So, pragmatically speaking, what’s your advice for businesses that want to prioritize trust, diversity, equity, and inclusion as part of their digital transformation strategy?

Shevin:

My advice is to consider the consequences of the technology you’re building and make every effort to build and use technology in a way that avoids harming marginalized communities. Consider adopting and incorporating frameworks that prioritize trust and accountability like we prioritize cost savings, efficiency, profit, and speed.