Dark Pattern UX Design
User experience designers aim to design apps that are ethical, provide a good experience, and build trust... but not always. Whether intentional or not, apps can behave in unethical and potentially abusive ways.
Dark patterns of user experience design trick users into behaving in ways they would not if they were consciously aware of what they were doing and why.
- Male designers outnumber female designers roughly 80/20, which is not representative of the general population.
- Male developers also greatly outnumber women in VR development.
- Male attitudes to safety and risk differ from women's lived experience.
- Women's sense of safety is based on feeling physically safe in a physical space.
- Women's sense of safety is also based on feeling psychologically safe in their intimate relationships, with 1 in 4 women experiencing intimate partner violence (IPV) within their lifetime.
- Women are negatively impacted by systems not designed with them in mind, on a regular basis.
- UX designers want to develop digital applications that are ethical and safe in order to establish trust in their products.
- Dark pattern user interaction is psychologically abusive and non-consensual.
- VR experiences can be developed as positive, safe spaces for women.
- Women are not afforded safety equality in real-life experiences, in intimate spaces, or in real-world environments and systems.
- It is possible to develop a set of design principles that provide safety equality in VR.
- Implementing these principles could provide an assurance that VR is safe, which in turn would set a higher safety standard for all VR apps. With that assurance, VR could be safely extended to deliver highly emotive trauma training experiences for all users.
- We can design safe, controlled, fully immersive trauma training for all users, and specifically for first responders, which will have a range of positive impacts.
Dark UX and Coercive Control
Coercive control generally refers to a pattern of behavior in a relationship that seeks to take away a woman’s independence and autonomy through manipulation, fear, and surveillance.
In a digital context, dark patterns mirror this by creating an environment where users feel pressured, guilted, or tricked into actions they would not otherwise take.
Design choices act as mechanisms of control:
Loss of Autonomy: Users lose control over their data, their time, and their finances.
Forced Action: Access to features or information may be denied unless the user agrees to unwanted terms or subscriptions, mirroring how an abuser might restrict access to necessities.
Manipulation of Information: Important details, like costs or cancellation steps, are hidden or obscured, ensuring users cannot make a fully informed decision.
Common Dark Patterns as Coercive Tactics
Several specific dark patterns function as digital manifestations of coercive control:
Roach Motel (Hard to Cancel): Signing up for a service is made easy, but cancelling is made intentionally difficult, requiring multiple steps or forcing users to call customer service. There may be no obvious exit button.
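In code terms, the asymmetry is easy to quantify. A minimal sketch (TypeScript; the step names and function are hypothetical illustrations, not taken from any real service):

```typescript
// Hypothetical sketch of a "roach motel" flow: far more friction to
// leave a service than to join it.
const signupSteps: string[] = ["enter email", "choose plan", "confirm"];

const cancelSteps: string[] = [
  "log in",
  "find account settings",
  "locate the buried subscriptions page",
  "click through retention offers",
  "call customer service during office hours",
];

// Ratio of exit friction to entry friction; a value above 1 means
// leaving is harder than joining, the hallmark of this pattern.
function exitFrictionRatio(join: string[], leave: string[]): number {
  return leave.length / join.length;
}
```

An ethical flow keeps this ratio close to 1: cancelling should take roughly as many steps as signing up.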
Blocking exit
This mimics the real-world tactic of making it nearly impossible for a woman to leave a relationship, which may involve many or all of the following tactics. The coercive controller may physically stand in the way of a woman's exit route, cut off her emotional support, control her finances and access to funds, surveil her life, erode her confidence and sense of herself, and remove her entire sense of agency. This happens not suddenly but gradually. It's like the frog in a saucepan of water: if the coercive controller turns up the heat too quickly, the frog will jump out; if he does it slowly, the frog will die. Speed and time become weaponised in the relationship.
Forced Continuity: After a "free trial," a user's credit card is automatically charged without clear warning or easy cancellation options.
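A fair trial-end policy can be stated in a few lines. A hedged sketch (TypeScript; the type and rule are illustrative assumptions, not any platform's actual billing logic):

```typescript
// Hypothetical trial-end policy. Forced continuity charges the saved
// card silently; a fair policy charges only after notice and consent.
interface TrialEndState {
  userWasNotified: boolean;      // warned before the trial expired
  userConfirmedRenewal: boolean; // actively agreed to be charged
}

function mayCharge(state: TrialEndState): boolean {
  return state.userWasNotified && state.userConfirmedRenewal;
}
```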
Pattern mechanisms: There may be repeated patterns of behaviour that are difficult to spot in isolation but that, together, reveal a mechanism of control. The pattern becomes normalised; this 'conditioning' happens very subtly.
Financial control
This removes the user's financial control. In financial abuse, an abusive man might control the family income and give his wife 'housekeeping' money barely enough to buy basics, or force her to choose between eating and providing her children with shoes. He might take out credit in his wife's name, trick her into signing contracts, remortgage their home, or generally mislead her about where money is being spent while spending it on himself. He might then use her belief in their poor financial situation to coerce and control other people, such as grandparents or parents, into giving him money, a form of extortion or elder abuse.
Confirmshaming: Emotionally manipulative language is used to guilt users out of a decision (e.g., the option to decline a newsletter subscription is worded as, "No thanks, I don't want to save money").
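The tell is in the copy itself: the decline option describes the user rather than the action. A rough heuristic sketch (TypeScript; the regex is an illustrative assumption, not a robust detector):

```typescript
// Hypothetical heuristic: flag decline labels that guilt the user by
// putting words in their mouth ("I don't want to ...").
const shamingDecline = "No thanks, I don't want to save money";
const neutralDecline = "No thanks";

function looksLikeConfirmshaming(label: string): boolean {
  return /\bI (don't|do not) want\b/i.test(label);
}
```

A neutral decline simply names the action ("No thanks") and leaves the user's self-image out of it.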
Threats
If you leave me you will break up the family and hurt the children.
Or, when extended from shame to fear: "If you leave me, I will unalive myself, and potentially the children too."
Privacy Zuckering (Data Grabs): The design tricks users into sharing more personal information than intended, often by hiding privacy-respecting settings behind obscure menus or using pre-checked boxes.
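The pre-checked-box version of this pattern reduces to a question of defaults. A minimal sketch (TypeScript; the settings object and audit function are hypothetical examples):

```typescript
// Hypothetical consent settings. Under opt-in consent, nothing should
// be shared until the user actively says yes.
interface ConsentSettings {
  marketingEmails: boolean;
  thirdPartySharing: boolean;
  analyticsTracking: boolean;
}

// Dark pattern: every box pre-checked for the user.
const darkDefaults: ConsentSettings = {
  marketingEmails: true,
  thirdPartySharing: true,
  analyticsTracking: true,
};

// Respectful alternative: everything off until opted in.
const ethicalDefaults: ConsentSettings = {
  marketingEmails: false,
  thirdPartySharing: false,
  analyticsTracking: false,
};

// Consent is only meaningful when the default answer is "no".
function consentIsOptIn(defaults: ConsentSettings): boolean {
  return Object.values(defaults).every((value) => value === false);
}
```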
Consent
This is non-consensual, and it happens in personal intimate relationships where one partner elicits information early on in the relationship that could later be used to blackmail the other into keeping quiet. The initial trust means you are less guarded and may reveal too much about finances, or personal, embarrassing or shameful information or photographs, that could be weaponised later on.
Nagging: Repeated, non-dismissible prompts for actions like enabling notifications wear down user resistance over time through sheer repetition. In a relationship, this shows up as continually repeated demands. If she finally concedes by this process of coercion, her act is not consensual.
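A respectful prompt policy treats a dismissal as an answer. A small sketch (TypeScript; the class and method names are hypothetical):

```typescript
// Hypothetical prompt policy: once the user dismisses a prompt, it is
// never shown again. A nagging design would re-prompt indefinitely.
class PromptPolicy {
  private dismissed = new Set<string>();

  recordDismissal(promptId: string): void {
    this.dismissed.add(promptId);
  }

  shouldShow(promptId: string): boolean {
    // Respect the earlier "no" instead of wearing it down.
    return !this.dismissed.has(promptId);
  }
}
```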
Compliance
This might show up as persistent demands made over a long period of time that go against a firm boundary. For example, her boundary is that she will not perform more domestic labour than her partner, so as to share the load equally. He may triangulate her with an ex who was 'a great domestic worker'. He may have greater financial power and status and pay for everything (from which he also benefits, while making her believe they are 'treats'); however, the 'gifts' are transactional. They are not given as signs of cherishing a partner but create an imbalance, making her feel she should give him something back in return while also teaching her to expect less, thus making her compliant over time.
These 'tests of compliance' help him gauge how easily he may be able to manipulate the woman, with the intention of taking control. His intention is to take advantage of her resources, which may not necessarily be physical or material things but could be her time, energy, emotional support or labour.
Similarly, in app design, the app's mechanisms may exploit user vulnerabilities and cognitive biases to ensure the business's goals are met at the user's expense. Regulators worldwide, including the U.S. Federal Trade Commission (FTC) and the European Union, are increasingly prohibiting these manipulative practices under consumer protection and data privacy laws, such as the EU's Digital Services Act and California's CPRA.
We will consider the legal implications of dark UX (and coercive control) more carefully in the next post.