Facebook treats you like a lab rat

You can be angry and upset with Facebook after it manipulated its users' emotions as part of a study. But you shouldn't be surprised.

After all, experiments on users are commonplace on the Web. We are lab rats in Internet mazes.

The most common kind of Web experiment conducted on users is called an "A/B" test. That's when an online company provides a different Web experience for a small subset of customers. If you are part of the A/B test, your screen may look different than your neighbor's, even though you're both on the same website.

Google (GOOGL) constantly conducts A/B tests by making tiny tweaks to its search algorithm to see if the changes provide more useful results. CNN.com has a tool that tests different headlines to see which one generates more clicks. (CNNMoney does A/B testing as well.) And Facebook (FB) A/B tests everything from the placement of ads to which content appears in your News Feed.
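To give a sense of how simple these tests are to run, here is a minimal, hypothetical sketch of how a site might split its users into an "A" group and a "B" group. This is only an illustration of the general technique, not how Facebook or Google actually implement their experiments; the experiment name and user IDs are made up.

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "checkout-button-color") -> str:
    """Deterministically bucket a user into variant 'A' or 'B'.

    Hashing the user ID together with the experiment name keeps each
    user's experience stable across visits while splitting traffic
    roughly 50/50 between the two versions of the page.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Two different users may quietly see two different versions of the same page.
print(assign_variant("user-12345"))  # e.g. 'A' -> the current design
print(assign_variant("user-67890"))  # e.g. 'B' -> the experimental design
```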

We all agree to these experiments, whether we're aware of it or not -- it's in these companies' terms of service. For instance, Facebook's data use policy says, "We may use the information we receive about you for internal operations, including troubleshooting, data analysis, testing, research and service improvement." These kinds of experiments are part of the price we pay for free online content and services.

By their very nature, A/B tests are manipulative. If you're part of a test, you might click on something you otherwise would have ignored, buy something you wouldn't otherwise have purchased or feel something you wouldn't otherwise have felt.

But there's an important distinction between typical A/B tests and the one that Facebook conducted two years ago.

Most A/B tests serve two purposes: to improve a company's business and make your Web experience better. A better-placed checkout button could drive up sales. More relevant search results or social posts will keep users engaged and coming back for more.

The goals of Facebook's mood experiment aren't as clear. For one week in 2012, Facebook changed the mix of posts in the News Feeds of almost 690,000 users: some people were shown more positive posts, while others were shown more negative ones. A subset of users was intentionally made less happy.

So for some users, their Web experience was made worse -- not better.

Facebook, in an emailed statement, claimed that the experiment was conducted to "improve our services and make the content people see on Facebook as relevant and engaging as possible."

If that's true, then the goal was noble, but the method was unethical. Toying with people's emotions is always a potential byproduct of A/B testing, but it's a step too far to intentionally make some users feel negative emotions. That distinction might be subtle, but it's important.

For example, most people would be fine with an Amazon (AMZN) experiment that manipulated search results to nudge customers toward healthier food purchases. But there would be an uproar if Amazon steered a group of customers toward less healthy choices for a week.

The way to avoid these kinds of practices in the future is simple -- Facebook and other Internet giants could switch to an "opt-in" model of consent, in which users are explicitly asked to take part in research such as Facebook's mood experiment. If the 689,000 people subjected to Facebook's experiment had known they were taking part in a study, there might not be as big a stink about it now.

But don't hold your breath. The culture in Silicon Valley is to test and manipulate users without asking. We can only hope that our Internet overlords are benevolent -- or that we're chosen only for the "positive" experiments.
