
Ethics Toolkit Aims to Forestall Harms, Abuses of Technology

Tech giants are reeling from months of criticism over security breaches, lax privacy protections, and other damage caused by use and misuse of their products. It’s often said that science and technology advance so quickly that regulation and ethics cannot keep up. But, while some abuses of technology might be hard to anticipate, it is possible to incorporate risk mitigation into design and development processes, thereby enabling ethics to keep pace with technological development. A new “ethics toolkit” seeks to encourage this foresight.
The teams and companies developing and marketing new technologies, particularly those based on artificial intelligence (AI), have the option of focusing on potential uses—beneficial and harmful—and anticipating a broad range of scenarios. This would empower them to build in safeguards against some of the worst possible abuses of their products. A partnership of two organizations is eager to help them do just that with its Ethical OS guide and toolkit.
According to Wired.com, the “new guidebook shows tech companies that it’s possible to predict future changes to humans’ relationship with technology, and that they can tweak their products so they’ll do less damage when those eventual days arrive.” Ethical OS is the product of the Institute for the Future and the Tech and Society Solutions Lab.
Ethical OS guides developers, executives, product managers, and others in “warming up your foresight muscles and kicking off important conversations with your team.” The free kit comprises asynchronous eLearning for developers, a downloadable checklist, a set of scenarios intended to spark discussion about the long-term impact of technologies being developed, and strategies to guide developers toward ethical action to mitigate potential harms.
The kit describes eight “risk zones”; developers can focus on the areas most relevant to their product. The zones are: Truth, Disinformation, and Propaganda; Addiction and the Dopamine Economy; Economic and Asset Inequalities; Machine Ethics and Algorithmic Biases; Surveillance State; Data Control and Monetization; Implicit Trust and User Understanding; and Hateful and Criminal Actors.
L&D teams might, for example, focus on Machine Ethics and Algorithmic Biases, taking steps to ensure that they are not rushing to automate and incorporate AI in eLearning and performance support without considering whether the AI engines perpetuate discrimination or exacerbate bias. The checklist encourages developers to consider whether there is any recourse or accountability to people impacted by the algorithms used in their products, and whether those algorithms are transparent or are “black boxes.”
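For L&D teams wondering what such a check might look like in practice, here is a minimal, hypothetical sketch in Python. It is not part of the Ethical OS kit; the data, group labels, and 80-percent threshold are illustrative assumptions only, standing in for whatever outcomes and demographic attributes a team actually tracks.

from collections import defaultdict

def selection_rates(records):
    # Rate of positive outcomes (e.g., "recommended for an advanced course")
    # for each demographic group, given (group, outcome) pairs.
    totals = defaultdict(int)
    positives = defaultdict(int)
    for group, outcome in records:
        totals[group] += 1
        if outcome:
            positives[group] += 1
    return {g: positives[g] / totals[g] for g in totals}

def flag_low_rate_groups(rates, ratio=0.8):
    # A simple "four-fifths"-style screen: flag any group whose rate
    # falls below 80 percent of the best-served group's rate.
    top = max(rates.values())
    return {g: r for g, r in rates.items() if r < ratio * top}

# Illustrative (group, was_recommended) pairs from a hypothetical AI engine
sample = [("A", True), ("A", True), ("A", False),
          ("B", True), ("B", False), ("B", False)]
rates = selection_rates(sample)
print("Recommendation rates by group:", rates)
print("Groups to investigate:", flag_low_rate_groups(rates))

A screen like this does not prove or disprove bias, but it gives a team a concrete starting point for the kinds of conversations the checklist prompts.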
Additionally, the section on Data Control and Monetization could be relevant to L&D teams that collect data about learners. Considering the ramifications of what information is collected and how it is used can lead to a better learner experience as well as improved security and stronger compliance with GDPR and other data privacy regulations.
The Institute for the Future is a 50-year-old nonprofit organization that considers foresight training among its core missions. Its researchers create tools—including massive multiplayer online games—and offer guidance to governments, businesses, and nonprofits in anticipating and preparing for future dilemmas and scenarios.
The Tech and Society Solutions Lab is a project of Omidyar Network, which invests in pro-social entrepreneurship while also seeking to mitigate unintended harmful consequences of emerging technologies.
“We can’t predict the future. But that shouldn’t mean we can’t systematically build safeguards against future risk directly into our design and development processes,” Paula Goldman and Raina Kumra of the Tech and Society Solutions Lab wrote in their announcement of the toolkit’s release on August 7.
Ethical OS aims to make ethical L&D design and development easier, providing a toolkit that helps teams build foresight into content planning processes, conduct ethics-focused discussions, and follow a checklist to ensure that their products anticipate and mitigate potential abuses of technology that could harm learners or negatively affect the learner experience.






