
Should Facebook’s Advertising Practices Be Regulated?

11/May/2017 · 2 MINS READ

There’s been quite a stir in the world of social media advertising. The Australian newspaper recently reported, based on leaked internal Facebook documents, that the social media giant may have allowed advertisers to target lonely youths in need of a ‘confidence boost’.

By using a system called ‘sentiment analysis’, Facebook’s algorithms can allegedly judge the moods of its users. The technology can identify what people are feeling and when, picking up states such as ‘overwhelmed’, ‘stressed’, or ‘anxious’. It can also determine whether someone feels they are ‘looking good’ or ‘losing weight’.
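To make the idea concrete, here’s a deliberately toy sketch of how lexicon-based sentiment analysis can work, written in Python. To be clear, this is purely illustrative: the mood categories and keyword lists are invented for the example, and Facebook’s actual system is undisclosed and certainly far more sophisticated (almost certainly machine-learned rather than keyword-based).

```python
# Purely illustrative: a toy lexicon-based mood classifier.
# The mood categories and keyword lists below are invented for this
# example; real sentiment-analysis systems are trained statistical
# models, not hand-written word lists.

MOOD_LEXICON = {
    "stressed":  {"overwhelmed", "stressed", "exhausted", "deadline"},
    "anxious":   {"anxious", "worried", "nervous"},
    "confident": {"looking good", "losing weight", "proud"},
}

def classify_mood(post: str) -> str:
    """Return the mood whose keywords appear most often in the post."""
    text = post.lower()
    scores = {
        mood: sum(phrase in text for phrase in phrases)
        for mood, phrases in MOOD_LEXICON.items()
    }
    best_mood, best_score = max(scores.items(), key=lambda kv: kv[1])
    return best_mood if best_score > 0 else "neutral"

print(classify_mood("So overwhelmed right now, completely stressed out"))  # stressed
print(classify_mood("Feeling like I'm losing weight and looking good!"))   # confident
```

Even a classifier this crude hints at why regulators are uneasy: paired with data on when people post, as described below, such signals could in principle flag users at vulnerable moments.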

The internal documents also stated that Facebook’s research found that young people expressed confidence-boosting emotions in the early part of the week, while the weekend was often used to boast about or broadcast achievements.

The furore around ‘sentiment analysis’ is largely built on the concern that it could be used to target people on Facebook when they are at their most vulnerable, especially since those targeted would include young people.

Targeting vulnerable children would certainly put Facebook (and the advertisers using it) in breach of ethics codes and of Australian guidelines for marketing to children, but Facebook denies this is the case. According to the company, the ‘sentiment analysis’ study was never a targeting tool for ads; it was run simply to show advertisers how people express themselves on the platform.

So, though there may be no immediate cause for concern, this leak raises a red flag about the potential issues a platform like Facebook will continue to surface.

Social media and self-regulation

Facebook is a platform funded almost entirely by ad revenue, and its users can be targeted by advertisers based on their demographics and the things they have ‘liked’. Moreover, Facebook is a continually developing platform – both for its users and its advertisers. And that means the platform will continue to develop new methods of targeting users, new ways to improve the measurability of its results and new tools for driving an advertiser’s ROI.

Whilst Facebook can continue creating enhanced advertising tools, it’s important that regulatory and ethics boards are able to keep an eye on just what those enhancements could mean for its users.

The major problem with regulating a platform like Facebook, or indeed any online platform, is that updates and patches arrive at an incredibly fast pace. That pace makes it difficult to effectively gauge what the impact of a feature change or new tool will be. In effect, it’s unlikely that any single regulatory panel would be able to keep up with the rate of change.

This means that keeping Facebook in check will likely fall back into the hands of Facebook itself. Self-regulation is used in some form or another by most of the advertising industry, but Facebook has an opportunity to define what it can mean for social media.

So what’s the solution?

The answer seems simple: transparency. This speculation about unethical targeting practices would never have occurred if the ‘sentiment analysis’ study had been publicly available, rather than a ‘for your eyes only’ document restricted to advertisers wanting to see how people use the platform.

If every update to how ads are targeted or what user data is available were made public, then regulating any changes would become that much simpler. Transparency would not only allow the public and the media to rest easy but would also spare Facebook much of the speculation and suspicion swirling around its advertising practices.

Maybe it’s time for Facebook to take inspiration from one of humanity’s most popular clichés: “if you want something done right, you have to do it yourself.”