Technology-facilitated abuse is a challenging issue, and there is no simple solution that eliminates it. However, by making subtle design decisions that balance intended with unintended consequences, it is possible to build technology that resists such abuse. To aid technologists in making these decisions, IBM is proposing five key design principles for making products resistant to coercive control.
Creating products that do not contribute to, or enable, society’s problems is an ethical responsibility of all companies—not only because it is the right thing to do but also because it is the best business approach. A recent study found that 80% of global respondents agreed with the statement that corporations have a responsibility to prioritize their employees, the environment, and their community as much as they prioritize delivering profits to their shareholders. Focusing on values and purpose has been IBM’s approach for more than a century, with these design principles being the latest example of IBM’s desire for technology to shape lives and society for the better.
By sharing this set of design principles, IBM aims to improve the usability, security, and privacy of new technologies to make them inherently safer. We recommend that these become an integral part of any product design review. While these principles may be familiar to technologists, they take on additional meaning when looked at through the lens of coercive control.
Five Key Design Principles
Having a diverse design team broadens the understanding of user habits, enabling greater exploration of use cases, both positive and negative. When developing a new technology, designers often have target users in mind. However, those target users may not be the only people who end up using the technology; others often leverage it in unexpected ways.
Users need to be able to make informed decisions about their privacy settings. Small red buttons or phrases like ‘advanced settings’ can intimidate users, causing them to accept the default settings without understanding the consequences of that choice. Settings should be simple to understand and easy to configure, and their presentation should not steer the user toward a particular choice. Prompt users periodically to review any configuration that results in data being shared, and consider a diverse user base when establishing default privacy settings.
Gaslighting is when a person psychologically manipulates someone into doubting their own memory and judgment. If a user can remove all evidence of an action taking place, or if there never was any evidence, the person affected may begin to question their memory. Timely, pertinent notifications and auditing are essential for making it obvious who has done what and when. Technology needs to be transparent about what changes have been made and when remote functionality is triggered, making gaslighting attempts difficult to obscure or hide. Where appropriate, a local override for a remote activation should be provided, so users retain the ability to control their own environment. The user interface and design around such notifications and auditing should be treated with the same importance as the product’s regular functions, not relegated to a hard-to-find corner of the interface.
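One way to realize this, sketched below with illustrative names and a print statement standing in for a real notification channel, is an append-only audit trail that records who triggered each action and when, notifies all users, and gives anyone physically present a local override for remote activations:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditEntry:
    actor: str        # who performed the action
    action: str       # what was done
    source: str       # "local" or "remote"
    timestamp: datetime

@dataclass
class SmartDevice:
    name: str
    active: bool = False
    audit_log: list[AuditEntry] = field(default_factory=list)  # append-only

    def _record(self, actor: str, action: str, source: str) -> None:
        entry = AuditEntry(actor, action, source, datetime.now(timezone.utc))
        self.audit_log.append(entry)
        self._notify_all_users(entry)

    def _notify_all_users(self, entry: AuditEntry) -> None:
        # In a real product, push a timely notification to every user.
        print(f"[{entry.timestamp:%H:%M}] {entry.actor} ({entry.source}): {entry.action}")

    def remote_activate(self, actor: str) -> None:
        self.active = True
        self._record(actor, "activated", source="remote")

    def local_override(self, actor: str) -> None:
        """A person physically present can always regain control."""
        self.active = False
        self._record(actor, "deactivated via local override", source="local")
```

Because entries are only ever appended and every user is notified, no single account can make an action disappear, which is exactly what undermines a gaslighting attempt.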
It is important that products are secure and collect and share only necessary data, limiting the risk that they could be used maliciously. This involves thinking beyond traditional security threat models and paying attention to the potential risks if the technology is used abusively. For example, many home devices and services are managed by a single user even though many members of the family use them (e.g. virtual assistants, subscription channels, family calendar/data-sharing plans). An intuitive, easy way for family members to subscribe and unsubscribe themselves could be a more effective model, empowering users with joint control.
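A minimal sketch of such a joint-control model (class and method names are illustrative): every household member can see who is subscribed, and anyone can remove themselves without the administrator’s approval:

```python
class SharedService:
    """A household service where control is joint rather than admin-only."""

    def __init__(self, name: str):
        self.name = name
        self.members: set[str] = set()

    def subscribe(self, user: str) -> None:
        self.members.add(user)

    def unsubscribe(self, user: str) -> None:
        # Any member can remove themselves; no single "owner" gatekeeps exit.
        self.members.discard(user)

    def roster(self) -> list[str]:
        # Membership is visible to everyone, not just one account holder.
        return sorted(self.members)

calendar = SharedService("family calendar")
calendar.subscribe("alex")
calendar.subscribe("sam")
calendar.unsubscribe("sam")   # sam leaves without needing alex's approval
print(calendar.roster())      # ['alex']
```

The contrast with the common single-administrator model is the point: exit and visibility are rights of every member, not privileges granted by one account.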
Victims of coercive control live in complex, ever-shifting circumstances and may lack the energy or confidence to navigate new technologies. If all end-user technology were intuitive to use and understand, it would help reduce the risk of abusers exploiting their greater technical confidence, whether through threats or by installing applications the victim does not understand. The combination of ease of use and an auditing feedback loop visible to every user can reassure a potential victim that the technology in question is not being used to control them.