What if we told you there was an app out there that could get inside your head and change your behavior, any way it wanted? An app that knows how your mind works, and was built specifically to exploit its vulnerabilities so it could steer your actions with predictable accuracy.
Now what if we told you that app already exists? Or that you have dozens of apps like this already installed on your phone?
This is your reality, but it’s more complicated than it seems.
Apps that shape your behavior can be a positive thing. Consider your money habits: weeding out bad ones and establishing stronger, better ones can improve your financial wellness for years to come, and an app designed to encourage that shift has almost universally positive long-term effects. Take Acorns as an example: the app invests the spare change from your everyday transactions in stocks, bonds, and index funds, and shows you your growth curve regularly, which encourages you to keep using it. Acorns takes a cut, of course, but it's helping people save for retirement, some of whom had never considered the option before.
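The round-up mechanism described above is simple arithmetic: each purchase is rounded up to the next whole dollar, and the difference is set aside. A minimal sketch of the idea (an illustration only, not Acorns' actual code or transaction amounts):

```python
import math

def round_up_savings(transactions):
    """Round each purchase up to the next whole dollar and
    return the total spare change set aside for investing."""
    return round(sum(math.ceil(t) - t for t in transactions), 2)

# Three everyday purchases yield $1.17 of investable change.
print(round_up_savings([4.25, 12.60, 3.98]))  # → 1.17
```

The individual amounts are trivial, which is exactly the point: the behavioral design makes saving feel effortless, and the growth curve the app shows you does the motivational work.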
The Center for Advanced Hindsight and the design firm Bridgeable have also built a prototype app meant to encourage more rational decision making. Not every app is designed with benevolent intent, but this one shows how the same power can be used for users' benefit. There are also dozens of apps that help users break bad habits or form new ones: by offering small daily rewards for logging in and connecting socially with other users, they give people an incentive to keep up the positive behavior.
There are, however, apps that abuse their understanding of human psychology. Infinite scrolling, a feature used by Twitter, Pinterest, Facebook, Reddit, and dozens of other apps, is designed to hijack your attention and encourage repeated use, the same way slot machines use intermittent rewards to keep casino patrons paying.
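The slot-machine comparison refers to what psychologists call a variable-ratio reinforcement schedule: rewards arrive on average at a fixed rate, but at unpredictable intervals. A minimal simulation (hypothetical numbers, not any real app's logic) shows the pattern:

```python
import random

def variable_ratio_schedule(actions, mean_ratio, seed=0):
    """Simulate a variable-ratio reward schedule: each action
    (a scroll, a refresh, a lever pull) pays out with probability
    1/mean_ratio, so rewards average one per mean_ratio actions
    but arrive at unpredictable intervals."""
    rng = random.Random(seed)
    return [rng.random() < 1 / mean_ratio for _ in range(actions)]

rewards = variable_ratio_schedule(1000, mean_ratio=4)
hits = [i for i, r in enumerate(rewards) if r]
gaps = [b - a for a, b in zip(hits, hits[1:])]
# Roughly one action in four pays out, but the gaps between
# payouts vary widely; that unpredictability is what keeps
# people scrolling.
```

Because the next reward could always be one more scroll away, there is never a natural stopping point, which is precisely why variable-ratio schedules produce such persistent behavior.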
Facebook, specifically, has been criticized for using techniques like bright red notifications, selective alerts, and newsfeed curation to maximize the time a user spends in the app. In a darker turn, the company has even openly admitted to intentionally manipulating users' emotions for a scientific experiment, the 2014 "emotional contagion" study. If you're not aware of how these subtle design choices affect your behavior, you can fall victim to them, spending far more time in the app than you intended, or walking away in a worse mood through no fault of your own.
The ugly part is that there isn’t a clear answer for how we can move forward.
Do we propose new regulations that restrict what developers can include in their designs? If so, who would be responsible for making those decisions? And would doing so limit the innovative capacity of our most talented engineers, or prevent us from unlocking the benefits of positive behavioral rewards?
Do we give consumers more tools, knowledge, and resources so they can avoid being controlled? If so, how can we do this in a way that reaches the majority of the population?
Do we let things ride as they are and hope the good outweighs the bad? If so, are we supposed to trust that the majority of companies will operate with users’ best interests in mind? And how far will it go before intervention is no longer a possibility?
Technology has given us awesome power, and it's on us, as developers, consumers, and policymakers, to decide what to do with it. There's a dark side and a light side to being able to all but dictate user behavior within an app. Is it worth taking action on? And if so, who should be in charge? There's no succinct answer here, but more information is never a bad thing, so spend some time learning how your favorite apps work, and how they might be exploiting you for their own gain.