Not so long ago, technology was seen primarily as a force for good. But in the wake of crippling cyberattacks, wide-scale data breaches, and unintended outcomes driven by artificial intelligence (AI), that’s quickly changed. Welcome to the tech backlash. With it comes increasing public scrutiny of how data and technology are used today, and what they will bring in the future.
It’s a concerning development not just for Silicon Valley startups and massive platform companies but for every company operating in the digital economy. But just how big an issue is it? What can and should corporate leaders do? What can they expect from regulators?
These questions and many more were topics of discussion at PwC’s recent Exchange on Responsible Technology. At the event, senior executives were encouraged to learn from one another, as well as from those at the forefront of responsible technology. These included Paula Goldman, Chief Ethical and Humane Use Officer at Salesforce, and DJ Patil, Chief Data Scientist at Devoted Health and former US Chief Data Scientist.
From those conversations emerged four key takeaways for how you can navigate responsible tech in your organisation:
Exchange attendees expected to gain timely insights about top-of-mind issues, such as navigating new privacy regulations or decoding AI bias. But what surprised many was just how vast and layered the responsible tech domain is. Alongside these more familiar areas are less obvious considerations around how a company designs its products and services, specifies how they can be used, and codifies responsible behaviour. In addition, there are higher-level concerns about tech’s impact on employees, society, and other stakeholders — a growing focus in light of the Business Roundtable’s new definition of corporate purpose, which along with delivering long-term shareholder value includes a commitment to invest in employees, support broader communities, provide customer value, and treat suppliers ethically.1
While responsible technology’s broad remit might be daunting, one thing became clear over the course of the Exchange: Standing still is no longer an option. Business leaders are urged to begin somewhere — be it by defining an organisation-wide data use checklist, getting feedback from individual customers about their concerns, or incorporating questions related to tech ethics into the job candidate interview process.
Just as technology has become instrumental to business strategy, so too have questions related to its responsible use, which means accountability at the most senior levels of a business. It was notable that several panelists who are leading the charge in their own companies report directly to the CEO and the Board. The senior executives in attendance, however, also realise this isn’t a one-person job. They need to champion the relevant principles, policies, and behaviours throughout all facets of the company. That requires creating an environment where people care about these issues, have permission to raise questions or ideas, and feel like they are part of the solution.
Balance short-term needs with longer-term options that are not yet known. For example, you might be collecting a wealth of data as a byproduct of a core system and have no immediate uses for it. But down the line, it could drive a new business model or partnership. How do you ensure you’re ethically handling the data and protecting your business interests while allowing for its responsible monetisation? Or put another way: How can you know what you don’t know? The answer is to consider what kinds of frameworks or processes might be needed to allow for optionality. For example, you might outline how you and a partner would negotiate rights around new value created from existing data. Likewise, building and practising responsible AI approaches, such as establishing end-to-end governance processes, is another way to prepare for the future — whatever that might bring.
From the first session on data ethics to the last on public policy developments, it became clear that we’re at an inflection point. The consensus of speakers and participants alike was that there’s a closing window in which to positively shape how data and technology deliver value and how they are regarded by customers, employees, and the greater public. This is especially true with regard to data privacy — already the focus of the EU’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA). Many global businesses are already struggling to meet compliance requirements — CCPA goes into effect in the US in January 2020 — and the prospect of proliferating US state privacy laws is daunting.
Instead of a patchwork of laws that vary state by state, business and tech leaders at the Exchange said they would prefer a US national policy. But merely hoping for, or even anticipating, such a policy is not enough, especially as lawmakers may not be well versed in the intricacies of the issue or see it as a priority. Collectively, business leaders have the opportunity to help shape effective regulation by engaging and educating regulators on the topic of tech ethics, participating in industry or standards groups, and staying abreast of activity related to emerging technology like AI.
Business leaders who participated in the Exchange were energised but admitted they had a lot to think about with regard to establishing guardrails around the use of technology. That first step — an awareness of the myriad issues and the many different ways to begin attacking the problem — is critical for any company. So too is engaging with others, especially in different industries, who are experimenting with new approaches you can learn from. The upshot? Responsible technology is a shared challenge that will take a collaborative approach to fully address.
© 2017 - 2021 PwC. All rights reserved. PwC refers to the PwC network and/or one or more of its member firms, each of which is a separate legal entity. Please see www.pwc.com/structure for further details. Liability limited by a scheme approved under Professional Standards Legislation.