Dark Pattern UX Design

User experience designers design apps that are ethical, provide a good experience and build trust... but not always. Whether intentional or not, apps can behave in unethical and potentially abusive ways. 

Dark patterns of user experience design trick users into behaving in ways that they would not if they were consciously aware of what they were doing and why. 

UX designers are not consciously intending to deploy dark pattern user experience design. They are not necessarily deliberately trying to control users with malicious intent; however, the apps they design may sometimes not give the user full control or seek full consent. UX designers may simply not be fully aware that they are designing apps whose user experiences are unethical or non-consensual, because their own world experience differs substantially from their users'. 

The risk arises when there is unintentional coercive control stemming from UX designers being unaware of a user's physical and psychological vulnerability. They are unable to 'stand in the shoes of' a user's lived experience. Although there is capacity within the development process for user testers to make developers aware that aspects of the app's control mechanisms are problematic, there does not seem to be a set of design standards that can be applied to support safe experiences for vulnerable users.

As a person who has worked in VR development, safeguarding children in immersive environments, stated only yesterday, 'the problem is that VR is being designed and promoted by 50 men in Silicon Valley with their own set of values and ideals'. 

Coercive controllers similarly seek to gain control of a woman in a relationship. The techniques of coercive control can also be applied simply to 'win' an argument or get one's own way without seeking complete control, which could be a valid comparison. Either way, in both dark UX and coercive control, the intention is not clear, but the impact on a woman is unethical and abusive. 

The impact on her ranges from feeling unsafe, distrustful, confused, angry, frustrated, scared, submissive and out of control, to a loss of agency, self-respect and self-esteem. The impact is not just one of trust: it can lead to physical attack, and can feel like a physical attack even when it is experienced in a virtual environment. The loss of power can lead to a complete loss of her sense of value as a woman. 

In previous posts, we challenged a set of assumptions, as follows: 

  • Male developers outnumber female designers roughly 80/20, which is not representative of the general population. 
  • Male developers also generally greatly outnumber women in VR development. 
  • Male attitudes to safety and behaviour are different from women's lived experience. 
  • Women's sense of safety is based on feeling physically safe in a physical space. 
  • Women's sense of safety is based on feeling psychologically safe in their interactions within intimate relationships, with 1 in 4 women experiencing intimate partner violence (IPV) within their lifetime. 
  • Women are negatively impacted by systems not designed with them in mind, on a regular basis.
We will continue by exploring the following set of assumptions: 
  • UX designers want to develop digital applications that are ethical and safe in order to establish trust in their products.
  • Dark pattern user interaction is psychologically abusive and non-consensual. 
  • VR experiences can be developed as positive, safe spaces for women. 
  • Women are not afforded safety equality in real-life experiences, in intimate spaces, and in real environments and systems. 
  • It is possible to develop a set of design principles that provide safety equality in VR. 
  • Implementing these principles could lead to an assurance that VR is safe, and this in turn would provide a higher safety standard for all VR apps. In offering this assurance, VR could be safely extended to deliver highly emotive trauma training experiences for all users.
  • We can design safe, controlled, fully immersive trauma training for all users, and specifically for first responders, which will have a range of positive impacts. 
There is an opportunity at this point in the development of VR to build a foundation of equality in VR (which may also transfer to real-space equality). 

This may or may not include the use of AI (if virgin data can be obtained that is not gender-biased in favour of male experience).  


Dark UX and Coercive Control

Coercive control generally refers to a pattern of behaviour in a relationship that seeks to take away a woman’s independence and autonomy through manipulation, fear, and surveillance. 

In a digital context, dark patterns mirror this by creating an environment where users feel pressured, guilted, or tricked into actions they would not otherwise take. 

Design choices act as mechanisms of control: 

Loss of Autonomy: Users lose control over their data, their time, and their finances.

Forced Action: Access to features or information may be denied unless the user agrees to unwanted terms or subscriptions, mirroring how an abuser might restrict access to necessities.

Manipulation of Information: Important details, like costs or cancellation steps, are hidden or obscured, ensuring users cannot make a fully informed decision. 


Common Dark Patterns as Coercive Tactics

Several specific dark patterns function as digital manifestations of coercive control:

Roach Motel (Hard to Cancel): It is made easy for a user to sign up for a service but intentionally difficult to cancel, requiring multiple steps or forcing users to call customer service. Maybe there is no obvious exit button. 
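As a rough sketch of the asymmetry, the two journeys can be compared simply by counting their steps. The step descriptions below are hypothetical, not drawn from any particular app:

```typescript
// Illustrative sketch of the sign-up vs. cancellation asymmetry.
// The step names are invented examples, not a real product's flow.
const signUpSteps: string[] = [
  "Tap 'Start free trial'",
];

const cancelSteps: string[] = [
  "Find 'Account' inside a nested settings menu",
  "Open 'Manage subscription'",
  "Dismiss two 'Are you sure?' retention offers",
  "Phone customer services during office hours",
];

console.log(`Steps to join: ${signUpSteps.length}`);  // 1
console.log(`Steps to leave: ${cancelSteps.length}`); // 4
```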

Blocking exit 

This mimics the real-world tactic of making it nearly impossible for a woman to leave a relationship, and it may involve many or all of the following tactics. The coercive controller may physically stand in the way of a woman’s exit route, cut off her emotional support, control her finances and access to funds, surveil her life, erode her confidence and sense of herself, and remove her entire sense of agency. This happens not suddenly but gradually. It’s like a frog in a saucepan of water: if the coercive controller turns up the heat too quickly, the frog will jump out; if he does it slowly, the frog will die. Speed and time become weaponised in the relationship. 

Forced Continuity: After a "free trial," a user's credit card is automatically charged without clear warning or easy cancellation options. 
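A minimal sketch of how this might look in code, assuming a hypothetical trial record (the field names are illustrative); in the dark pattern the reminder flag defaults to off, so the charge arrives as a surprise:

```typescript
// Hypothetical free-trial record; field names are illustrative only.
interface Trial {
  startedOn: Date;
  lengthDays: number;
  cardOnFile: boolean;
  remindBeforeCharge: boolean; // dark pattern: defaults to false
}

// The date the stored card will be billed once the trial lapses.
function chargeDate(trial: Trial): Date {
  const due = new Date(trial.startedOn);
  due.setDate(due.getDate() + trial.lengthDays);
  return due;
}

const darkTrial: Trial = {
  startedOn: new Date(),
  lengthDays: 14,
  cardOnFile: true,
  remindBeforeCharge: false, // the user is never warned before billing begins
};

console.log(`Card charged on ${chargeDate(darkTrial).toDateString()}`);
```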

Pattern mechanisms: There may be repeated patterns of behaviour that are difficult to spot in isolation but that, together, reveal a mechanism of control. The pattern becomes normalised. This ‘conditioning’ happens very subtly. 


Financial control 

This removes the user's financial control. In financial abuse, an abusive man might control the family income and give the wife 'housekeeping' of barely enough to buy basics, or force her to choose between eating and providing their children with shoes. He might take out credit in the wife's name, trick her into signing contracts, remortgage their home, or generally mislead her about where money is being spent while spending it on himself. He might then use her belief in their poor financial situation to coerce and control other people, such as grandparents or parents, into giving him money, as a form of extortion or elder abuse.

Confirmshaming: Emotionally manipulative language is used to guilt users out of a decision (e.g., the option to decline a newsletter subscription is worded as, "No thanks, I don't want to save money").
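A small sketch of the difference in wording; both sets of button labels are invented for illustration:

```typescript
// Copy for a newsletter prompt; both variants are invented examples.
interface DialogCopy {
  accept: string;
  decline: string;
}

// Confirmshaming: the refusal is worded so that declining feels foolish.
const confirmshaming: DialogCopy = {
  accept: "Yes, sign me up for exclusive savings!",
  decline: "No thanks, I don't want to save money",
};

// Neutral alternative: both options are plain and carry equal weight.
const neutral: DialogCopy = {
  accept: "Subscribe to the newsletter",
  decline: "Don't subscribe",
};

console.log(confirmshaming.decline, "vs.", neutral.decline);
```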

Threats 

If you leave me you will break up the family and hurt the children. 

Or when extended from shame to fear... (If you leave me, I will unalive myself - and also potentially the children). 

Privacy Zuckering (Data Grabs): The design tricks users into sharing more personal information than intended, often by hiding privacy-respecting settings behind obscure menus or using pre-checked boxes.
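As a sketch, the difference often comes down to nothing more than default values; the setting names here are hypothetical:

```typescript
// Hypothetical data-sharing settings; names are illustrative only.
interface ConsentSettings {
  shareWithPartners: boolean;
  personalisedAds: boolean;
  locationTracking: boolean;
}

// Dark pattern: everything is pre-checked, so silence counts as consent.
const preChecked: ConsentSettings = {
  shareWithPartners: true,
  personalisedAds: true,
  locationTracking: true,
};

// Consent-first alternative: nothing is shared until the user opts in.
const optIn: ConsentSettings = {
  shareWithPartners: false,
  personalisedAds: false,
  locationTracking: false,
};

console.log({ preChecked, optIn });
```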

Consent 

This is non-consensual and happens in personal intimate relationships where information revealed to a new partner early in the relationship could later be used to blackmail the person into keeping quiet. The initial trust means you are less guarded and may reveal too much about finances, or personal, embarrassing or shameful information, photographs, etc., that could be weaponised later on.  

Nagging: Repeated, non-dismissible prompts for actions like enabling notifications wear down user resistance over time through sheer repetition. In a relationship, this shows up as continually repeated demands. If she finally concedes through this process of coercion, her act is not consensual.
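A minimal sketch of the mechanism, with invented function and type names; the point is that declining never ends the loop, it only schedules the next prompt:

```typescript
// Hypothetical nagging loop; function and type names are invented.
type PromptResult = "accepted" | "dismissed";

// Stands in for showing the "Enable notifications?" dialog to the user.
function showNotificationPrompt(attempt: number): PromptResult {
  console.log(`Prompt #${attempt}: "Enable notifications?"`);
  return "dismissed"; // the user keeps saying no
}

function naggingLoop(maxPrompts: number): void {
  for (let attempt = 1; attempt <= maxPrompts; attempt++) {
    if (showNotificationPrompt(attempt) === "accepted") return;
    // No "never ask again" option exists; the prompt simply comes back later.
  }
}

naggingLoop(5);
```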

Compliance 

This might show up as persistent demands made over a long period of time that go against a firm boundary. For example, she may assert a boundary that she will not perform more domestic labour than her partner, so as to share the load equally. He may triangulate her with an ex who was a great domestic worker. He may have greater financial power and higher financial status and pay for everything (from which he also benefits, but he makes her believe these are ‘treats’); however, the ‘gifts’ are transactional. They are not given as signs of cherishing a partner but create an imbalance, making her feel she should give him something back in return while also making her expect less, thus making her compliant over time.   

These ‘tests of compliance’ assist him in checking how easily he may be able to manipulate the woman, with the intention of taking control. His intention is to take advantage of her resources, which may not necessarily be physical or material things but could be her time, energy, emotional support or labour. 

Similarly, in app design, an app's mechanisms or design may exploit user vulnerabilities and cognitive biases to ensure the business's goals are met at the user's expense. Regulators worldwide, including the U.S. Federal Trade Commission (FTC) and the European Union, are increasingly prohibiting these manipulative practices under consumer protection and data privacy laws, such as the EU's Digital Services Act and California's CPRA. 

We will consider the legal implications of dark UX (and coercive control) more carefully in the next post. 
