Have you ever had items added to your online shopping cart automatically, or come across limited-time offers that weren’t truly time-sensitive and offered no actual price reduction?
If either of these situations sounds familiar, you’ve encountered so-called “dark patterns”: interface elements cleverly designed to steer people into choices that benefit businesses, often at the users’ expense.
Prof Dr Ho Chin Kuan, the vice chancellor of the Asia Pacific University of Technology and Innovation (APU), says dark patterns are intentionally used to manipulate or deceive users into taking an action they ordinarily would not.
“A straightforward example is when a website deliberately makes it difficult for users to find the unsubscribe button to discourage them from leaving a mailing list. This tactic is referred to as a ‘roach motel’.
“Another typical dark pattern we commonly encounter is called ‘hidden cost’ – displaying a product’s price in a way that hides additional costs until the user is further along the process.
“You may have experienced this while booking your holiday,” Prof Ho adds.
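The mechanics of “hidden cost” – sometimes called drip pricing – can be sketched in a few lines: the advertised price omits fees that only surface as the user moves through checkout. This is a minimal illustration of the pattern’s logic, not any real platform’s code; the fee names and amounts are invented.

```python
# Minimal sketch of "hidden cost" / drip pricing: fees surface only
# as the user advances through checkout. All figures are invented.

ADVERTISED_PRICE = 89.00  # what the listing page shows

# Fees revealed one checkout step at a time
DRIP_FEES = [
    ("Service fee", 12.50),    # appears on the cart page
    ("Resort fee", 25.00),     # appears at payment details
    ("Processing fee", 4.99),  # appears on the final confirmation
]

def price_at_step(step: int) -> float:
    """Total shown after `step` checkout pages (0 = listing page)."""
    return round(ADVERTISED_PRICE + sum(fee for _, fee in DRIP_FEES[:step]), 2)

print(price_at_step(0))  # 89.0   - the price that attracted the click
print(price_at_step(3))  # 131.49 - the price actually charged
```

By the time the full figure appears, the user has already invested several steps of effort, which is precisely what the pattern exploits.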
Harry Brignull, the researcher who coined the term in 2010, laid out 16 categories of dark patterns, including the two mentioned above.
The others are: fake scarcity, forced action, comparison prevention, misdirection, nagging, obstruction, confirm shaming, disguised ads, fake social proof, fake urgency, forced continuity, preselection, sneaking into baskets, and trick wording.
According to Prof Ho, the most prevalent types of dark patterns are misdirection and forced continuity, in addition to hidden costs and roach motels.
Misdirection, or visual interference, occurs when a website employs cues, whether visual or verbal, to guide users towards or away from a specific action – for instance, using a green-coloured button to make an option appear as the most favourable choice.
According to him, forced continuity (also known as hidden subscriptions) involves automatically enrolling users in paid subscriptions after a free trial period without providing clear notification.
Digital deceit
Examining various types of dark patterns reveals that some are more overt than others. For instance, the “sneaking into the basket” technique involves adding an item secretly to a user’s cart during checkout, resulting in additional costs without their consent if it goes unnoticed.
In other instances, a platform might default to selecting certain options for users, a practice known as preselection, which requires users to remain vigilant and actively opt out.
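Preselection is easy to simulate: optional extras default to “on”, so the total silently includes them unless the user unchecks each one. The sketch below is hypothetical – the add-on names and prices are invented – but it captures how the burden of action is shifted onto the user.

```python
# Sketch of "preselection": optional add-ons default to checked,
# so an inattentive user pays for them. Names and prices are invented.

BASE_FARE = 120.00

# (label, price, preselected?) - the dark pattern is the default True
ADDONS = [
    ("Travel insurance", 18.00, True),
    ("Priority boarding", 9.50, True),
    ("Carbon offset", 2.00, True),
]

def total(opt_outs: set) -> float:
    """Checkout total; the user must actively remove every add-on."""
    extras = sum(p for label, p, on in ADDONS if on and label not in opt_outs)
    return round(BASE_FARE + extras, 2)

print(total(set()))                                   # 149.5 - the default
print(total({label for label, _, _ in ADDONS}))       # 120.0 - after opting out
```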
Shopping platforms are also known to use a technique called fake scarcity, falsely indicating low stock or high demand to create a misleading sense of urgency or FOMO (fear of missing out).
Additional pressure can be applied by showing a timer, which pushes customers to make a choice or buy something within a short amount of time.
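One giveaway of fake urgency is a countdown that never really expires. A common implementation simply derives the “time remaining” from the clock modulo a fixed window, so the timer quietly restarts for every visit. The sketch below is an assumed, illustrative implementation of that trick, not code from any specific site.

```python
# Sketch of a fake-urgency countdown: the "deal ends in..." timer is
# derived from the wall clock modulo a fixed window, so it quietly
# restarts and the offer never actually expires. Illustrative only.

WINDOW_SECONDS = 15 * 60  # every visitor sees "under 15 minutes left"

def seconds_remaining(now_epoch: int) -> int:
    """Fake time left: resets to WINDOW_SECONDS at every window boundary."""
    return WINDOW_SECONDS - (now_epoch % WINDOW_SECONDS)

# The timer at one moment, and again a full "deadline" later:
t = 1_700_000_000
print(seconds_remaining(t))                   # 100
print(seconds_remaining(t + WINDOW_SECONDS))  # 100 - identical; it reset
```

A genuine deadline would be anchored to a fixed end time; a modulo-based timer like this one produces the same “almost over” reading no matter when a shopper arrives.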
A platform may also include false reviews or show notifications of others making purchases to trick visitors into believing something is more popular or well-received than it actually is. This is known as fake social proof.
The objective is almost always the same: to encourage users to make impulsive purchases without taking other alternatives into account.
Sites can also employ “comparison prevention” by intentionally displaying prices or specifications of a product or service at different positions on different pages.
Obstruction is yet another prevalent dark pattern that discourages users from taking certain actions, such as removing an item from their cart, by introducing virtual obstacles in their path.
“I have personally observed a popular ecommerce mobile app shrinking the ‘Remove From Cart’ button and placing it in a corner to make it difficult for customers to remove items from their shopping carts,” says Prof Ho.
This can function in conjunction with roach motels, where the option to unsubscribe is hidden behind multiple clicks and webpages.
“Nagging” is when a service repeatedly requests users to do something without giving them the option to permanently silence those requests.
Most users will likely be familiar with this from apps incessantly requesting more permissions on their devices – particularly access to contacts – or repeatedly seeking reviews on app stores.
Confirm shaming refers to a tactic where rejecting or opting out of something is framed negatively in an attempt to guilt-trip users into selecting the option the platform desires.
For instance, a platform requesting to track your online activity may label the choice to decline as “No, I do not want personalised content”, implying that a user is losing out by opting out.
Though dark patterns are particularly prominent on ecommerce websites, they go beyond generating sales and are also used to gather personal data.
Prof Ho outlines a simple goal behind the use of dark patterns: “An obvious motivation is the increased conversion rates, leading to increased revenue.
“Another reason is data collection; some sites trick users into sharing more personal information than required. Retailers can then monetise the data.”
These thoughts are echoed by Siraj Jalil, president of the Malaysia Cyber Consumer Association (MCCA).
“Online retailers and platforms use dark patterns primarily to boost their metrics, such as increasing sign-ups, sales, user engagement, or retention.
“These manipulative tactics can give businesses short-term gains by extracting more value from users, though they can harm long-term trust and brand reputation,” he says.
Regulatory rigour
There are currently no laws in the country that specifically address dark patterns, and experts like Siraj hope to see them covered under the upcoming new cybersecurity bill.
“It (dark patterns) is already part of the marketing mechanism among Malaysian ecommerce industries.
“From MCCA’s perspective, we hope the government can hasten the new cybersecurity bill to be enacted.
“The government needs to redefine the word ‘cybersecurity’, not just in terms of network security, but it must consist of all our consumerism in cyberspace,” he says, adding that this would allow for a more comprehensive approach to curbing emerging issues in the digital world.
Prof Ho, on the other hand, stresses that such regulations would require a delicate balance.
“Creating legal safeguards to protect consumers against suspicious commercial behaviour is advantageous.
“Such steps can ensure, to a certain extent, a just and open online environment, increasing consumer and business trust.
“Regulations may call for unambiguous terms and conditions, explicit user interface requirements, and a ban on dishonest design techniques.
“But it’s essential to strike the correct balance. Regulations shouldn’t inhibit legitimate marketing strategies or innovation.
“Maintaining a healthy digital ecosystem necessitates balancing safeguarding consumers with enabling businesses to operate,” he states.
Prof Ho also highlighted the role that the tech industry, government bodies, and consumer advocacy groups have to play in addressing the issue of dark patterns.
“Collaboration between these parties is essential to creating a digital environment that respects user rights and encourages corporations to operate ethically.
“The tech and ecommerce industries should take proactive steps to self-regulate and promote ethical design practices.
“Government bodies may consider creating and enforcing regulations that set standards for user interface design.
“But before doing so, comprehensive stakeholder consultations ought to be carried out.
“Advocacy groups play a vital role in educating consumers about the existence and consequences of dark patterns. We need to empower consumers to recognise and resist manipulative designs,” he explains.