
Products Must Be Buttressed by Ethical Principles
Salesforce Chief Ethical and Humane Use Officer Paula Goldman says that with the unprecedented uptick in tech adoption this year, teams need to build ethical-use guidance that underpins the product development lifecycle.
By Paula Goldman
The global pandemic has changed all aspects of our lives — how we work, how we learn, how we interact. To meet the need for contactless practices and social distancing, businesses of all sizes and across all industries have gone digital. And we’ve seen Asia lead the way in digitalization: a DBS study released in September 2020 found that Singapore businesses are the most digitally ready in APAC, and that close to half of businesses in Singapore have a well-defined digital strategy.
As we find our way on the road to business recovery, we recognize the important role technology can play in protecting health and restoring economic access. However, for technology to have a long-lasting impact, it must be built responsibly — because no matter how good a tool is, people won’t use it unless they trust it.
Building a Response Framework That Works
There’s no blueprint for developing technology in a global pandemic, and as we began to develop COVID response technology to help businesses return to work safely via work.com, we identified the need to build privacy and ethical use guidance for teams to reference as they develop products. This helped us think through any unintended consequences of our products, as well as how our products might be used after the pandemic.
Our entire industry is grappling with the human and societal impact of modern technologies — the issues we face are significant, and we know we cannot tackle them alone or in silos. To create this guidance, we knew we had to listen to a variety of voices, both across our own teams — product, legal, equality — and from industry experts. We also created mechanisms for continued feedback and ensured the guidance we created was digestible and customized for the intended audience.
Through this, we created our ethical use and privacy principles for technology solutions responding to COVID-19 that aim to uphold human rights and protect personal data. In addition, we have developed guidance tailored for our AppExchange partners. The principles were critical in the development of work.com.
When we design our technologies responsibly and collaboratively, we can help drive adoption of important technologies and support COVID-19 recovery. Our world is rapidly changing, and solutions will vary by context based on differences in industry, business nature and local conditions.
A key challenge of building a responsible technology framework in a crisis is balancing scale and speed. We need to respond rapidly to the need, but with ethical use in mind.
Through our experience, we have identified some key principles that help maintain that balance. By sharing them, we hope to empower others to build responsible solutions to support recovery from COVID-19 and beyond:
Protecting Human Rights and Equality
The first and most important principle is that solutions should not cause harm, and should not adversely affect already marginalized groups.
We anchor on the principle of “intention vs. impact”, meaning that regardless of our intent, it is the impact of our actions that matters and that we are held accountable for. We actively involve diverse experts — such as public health and medical professionals — in the development and implementation process to help avoid harm and maximize impact and inclusivity.
Some of the ways we have kept equality top of mind include offering Work.com in multiple languages, following accessibility best practices, and protecting data and privacy in ways that guard against adverse impact on marginalized groups. Work.com also minimizes the risk that shift optimization software could enable unintentional discrimination against workers. It does this by keeping the Resource Priority field set to null by default, and advising customers to keep it that way. As a result, the optimization engine treats all workers (“resources”) equally when allocating shifts.
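To make that design choice concrete, here is a minimal sketch in Python of the pattern described above, assuming a hypothetical Worker class and assign_shifts routine rather than Salesforce’s actual scheduling engine: when no priority is set, the allocation loop has nothing to rank workers by, so every worker is treated the same.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional


@dataclass
class Worker:
    name: str
    # Mirrors the idea of a Resource Priority field that is null (None) by default.
    resource_priority: Optional[int] = None


def assign_shifts(workers: List[Worker], shifts: List[str]) -> Dict[str, str]:
    """Allocate shifts round-robin; workers are only ranked if a priority was explicitly set."""
    if any(w.resource_priority is not None for w in workers):
        # A customer has opted in to prioritization: rank by the field, unset workers last.
        ordered = sorted(
            workers,
            key=lambda w: (w.resource_priority is None, w.resource_priority or 0),
        )
    else:
        # Default behavior: no priority set anywhere, so all workers are treated equally.
        ordered = list(workers)
    return {shift: ordered[i % len(ordered)].name for i, shift in enumerate(shifts)}


# Example: with priorities left at their default of None, shifts rotate evenly.
team = [Worker("Amara"), Worker("Bo"), Worker("Chen")]
print(assign_shifts(team, ["Mon AM", "Mon PM", "Tue AM", "Tue PM"]))
```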
Honoring Transparency
Trusted solutions should not prevent users from knowing how their personal data is being collected and used, or what rights they have to control that information.
Users should trust that their data is protected. Data transparency is critical to ensuring your company’s accountability. Employees should be able to ask, ‘What rights do I have over the information I put in here?’
Minimizing Data Collection
We encourage customers to collect only the data that is absolutely essential for a solution to be effective, and to safeguard the privacy of the individual or employee by default. Wherever possible, the information collected from employees or customers should be anonymized or aggregated.
For example, in Work.com, shift managers do not see “wellness status” by default — they only see if someone is available to work or not, protecting employees’ privacy while also helping managers schedule appropriately. Another example of this is how we partnered with privacy and medical experts on the design of the Wellness Check survey found in Work.com. The survey intentionally bundles all symptom and exposure questions into a single yes or no attestation.
This way, an administrator viewing Wellness Check data knows only whether someone has passed the screening or not, and not their specific symptoms or points of exposure.
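A minimal sketch of that data-minimization pattern, with hypothetical question fields rather than the actual Work.com survey schema: individual answers are collapsed into a single pass/fail attestation before anything reaches an administrator’s view.

```python
from dataclasses import dataclass


@dataclass
class WellnessSurveyAnswers:
    """Raw answers from the employee; hypothetical fields for illustration only."""
    has_symptoms: bool
    had_known_exposure: bool
    has_positive_test: bool


def bundle_attestation(answers: WellnessSurveyAnswers) -> bool:
    """Collapse all symptom and exposure questions into one yes/no result, so a
    reviewer learns only whether the person passed, never which question failed."""
    return not (
        answers.has_symptoms
        or answers.had_known_exposure
        or answers.has_positive_test
    )


# Only the bundled boolean would ever be stored where an administrator can see it.
print(bundle_attestation(WellnessSurveyAnswers(False, False, False)))  # True (passed)
print(bundle_attestation(WellnessSurveyAnswers(False, True, False)))   # False (did not pass)
```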
Taking a Long-Term Approach
The pandemic will eventually come to an end — but technology solutions, the data collected through those solutions, and their implications can have a longer impact. Consider the long-term exposure and usability of the information that is being collected. Solutions should not retain data once it is no longer needed; keep information only for as long as necessary. Your users and company should be able to answer how data will be collected and deleted. As an example, automatically deleting data after a set period of time is a best practice for contact tracing.
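As a sketch of that retention practice, assuming a hypothetical record structure and a placeholder retention window (actual periods should follow local guidance and your own policy), a scheduled purge job might look like this:

```python
from datetime import datetime, timedelta, timezone
from typing import Dict, List

# Placeholder retention window; the right value depends on local guidance and policy.
RETENTION_PERIOD = timedelta(days=28)


def purge_expired_records(records: List[Dict]) -> List[Dict]:
    """Keep only records collected within the retention period; everything older is dropped."""
    cutoff = datetime.now(timezone.utc) - RETENTION_PERIOD
    return [r for r in records if r["collected_at"] >= cutoff]


# Example: a job scheduled to run daily would delete stale check-in data automatically.
records = [
    {"employee_id": 1, "collected_at": datetime.now(timezone.utc) - timedelta(days=3)},
    {"employee_id": 2, "collected_at": datetime.now(timezone.utc) - timedelta(days=60)},
]
print(purge_expired_records(records))  # only the 3-day-old record remains
```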
Ensuring the Security of Personal Data
It’s crucial to limit access to sensitive data to only those with a strict need-to-know, especially in cases where sensitive health or employment data are concerned. Safeguards should be implemented to protect against misuse.
In Work.com, for example, Salesforce strongly recommends customers set up a Workplace Command Center in an organization separate from the company’s core CRM (Customer Relationship Management) system, which partitions access to sensitive data.
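The need-to-know principle can be sketched as a simple role check; the role names below are hypothetical, and in practice a Salesforce deployment would lean on the platform’s own profiles and permission sets rather than application-level code like this:

```python
from typing import Set

# Hypothetical roles allowed to see sensitive wellness data; everyone else is denied.
SENSITIVE_DATA_ROLES: Set[str] = {"command_center_admin", "health_and_safety_lead"}


def can_view_wellness_data(user_roles: Set[str]) -> bool:
    """Grant access to sensitive health data only on a strict need-to-know basis."""
    return bool(user_roles & SENSITIVE_DATA_ROLES)


print(can_view_wellness_data({"shift_manager"}))          # False
print(can_view_wellness_data({"command_center_admin"}))   # True
```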
Technology adoption in recent months has skyrocketed, and through it we are not just seeing changes in the workplace, but are also beginning to understand how emerging technology impacts society. Careful consideration, meaningful collaboration and strong communication with stakeholders will be the best way forward towards building a transparent, responsible and long-term COVID-19 framework.
Ed.: Paula Goldman is the first Chief Ethical and Humane Use Officer at Salesforce. Goldman joins Salesforce from Omidyar Network, a social impact investment firm established by eBay founder Pierre Omidyar. Goldman is also the founder of Imagining Ourselves, a project of the International Museum of Women, where she led the creation of one of the world’s first online museums. Goldman earned the Social Impact Award from the Anita Borg Institute for Women and Technology and a Muse Award from the American Association of Museums for this project. Goldman has a Ph.D. from Harvard University, a Master’s Degree in Public Affairs from Princeton and a B.A. with highest honors from UC Berkeley. She has been on faculty at both UC Berkeley and Mills College and was an inaugural Social Impact Fellow at the UC Berkeley Haas School of Business. Featured image by photographer Vincent M.A. Janssen.