By Miles Pilgrim (Instagram: hex738)

Social platforms like Facebook, Instagram, and Snapchat are continually gaining government attention across the world due to their largely unregulated nature – and the effects this might have on society.

Two kinds of regulation in particular have become the focus: the regulation of user data held by the social media companies, and the regulation of the content uploaded by their users.

The 2017 suicide of Molly Russell, 14, recently attracted attention in the UK press after her Instagram account was found to contain content relating to depression, self-harm, and suicide, which is believed to have contributed to her death. This has since prompted Instagram to announce new ‘sensitivity filters’ which hide such images until users actively choose to look at them. (1)

Other incidents have involved Instagram influencers advertising diet pills, ‘detox’ teas, and appetite-suppressing sweets which turned out to be harmful. In response, Professor Steve Powis, England’s national medical director, has urged “social media companies to ban ‘irresponsible and unsafe’ adverts for health products by celebrities.” (2)

More recently it emerged that nearly 2,000 incidents of child grooming had taken place on these platforms over the six months to September 2018. And while spokespeople for Instagram and Facebook have responded that they aggressively fight grooming on their platforms, the head of the NSPCC has accused social media platforms of “10 years of failed self-regulation.” (3)

Facebook, in addition to hiring thousands of staff solely for content monitoring, has gone so far as to design its own computer chips to make AI a more practical tool for policing the platform, including “monitoring video in real time and helping its army of human moderators decide what content should be allowed on the service.” (4)

Artificial Intelligence is faster and cheaper

The issues remain for these platforms, however: not only has self-imposed regulation been ineffective, but it will most likely continue to be.

There is some suggestion now that the companies themselves have become too powerful, with some describing Facebook’s purchases of WhatsApp and Instagram as “mergers that should have never been allowed”. (5) And even with ‘an army’ of staff hired for content monitoring, Facebook as of 2019 has over 2.32 billion active accounts, making it difficult to monitor them all.

This is partly why AI is set to be used for content monitoring in the future. But this too will most likely prove unsuccessful. While computer algorithms will most likely be both faster and cheaper for these companies than hiring an ever-growing number of staff, the reality is that these systems simply cannot think critically or abstractly enough to be good at the job. This may already have been demonstrated when several popular YouTube channels were shut down without warning for apparently violating YouTube’s community guidelines on sexual content involving minors. In truth, the channels and videos in question contained nothing of the sort, only harmless Pokémon GO videos. Google refused to clarify “whether this process is performed by a human, or a computer” (6) in relation to the bans. The issue arose because an acronym used on the channels was misinterpreted, but in context it seems difficult to believe that a human could have made such a mistake.
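The context problem can be illustrated with a toy keyword filter – a hypothetical sketch, not YouTube’s actual system; the blocklist entry and the example comments are invented for illustration:

```python
# Toy keyword filter: a simplified stand-in for real moderation systems.
# "cp" here is a hypothetical blocklist entry, standing in for any
# acronym that is harmless in one context and alarming in another.
BANNED_TERMS = {"cp"}

def flags(text: str) -> bool:
    """Return True if any word in the text matches a banned term."""
    words = text.lower().split()
    return any(word.strip(".,!?") in BANNED_TERMS for word in words)

# An innocuous gaming comment trips the filter, because the check has
# no notion of context -- in Pokémon GO, "CP" means "combat points".
print(flags("My Tyranitar hit 3000 CP today!"))  # flagged
print(flags("Great raid video, thanks!"))        # not flagged
```

A human reading the first comment would recognise the gaming context instantly; a keyword match cannot, which is exactly the kind of mistake the paragraph above describes.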

Another issue faced by AI content monitoring is that the content uploaded to the platforms continually changes. Those operating in illicit or underground circles within these social platforms will, like most criminals, simply adapt to find ways around the regulations imposed. This could be something as simple as a filter overlaid on a picture or video that defeats the patterns an AI uses to distinguish harmful content. In essence, the computer cannot ‘see’ the content as harmful, and so it goes unnoticed. The company would likely become aware of the problem eventually and set out to improve the AI’s programming, but by that point harmful content has already slipped through the cracks, so to speak, and criminals are already finding new ways around the systems in place.
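The evasion problem can be sketched with a deliberately naive detector that flags only exact copies of known harmful files. This is a toy illustration, not any platform’s actual system – real platforms use more robust perceptual fingerprinting – but the cat-and-mouse dynamic is the same:

```python
import hashlib

# Naive detector: flag an upload only if its exact bytes match a
# known-bad fingerprint. A simplified stand-in for real matching.
BLOCKLIST = set()

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    return fingerprint(data) in BLOCKLIST

# Moderators add a known harmful file to the blocklist.
original = bytes([10, 20, 30, 40, 50])   # toy stand-in for image data
BLOCKLIST.add(fingerprint(original))

# Re-uploading the identical file is caught...
print(is_flagged(original))   # True

# ...but altering a single byte (one pixel, a filter, re-encoding)
# produces a completely different hash and slips past the check.
altered = bytes([10, 20, 30, 40, 51])
print(is_flagged(altered))    # False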

Government regulation is fought or circumvented

Because the likes of Instagram and Facebook have so many users, who are unlikely to leave the service any time soon, self-regulation for these companies, their staff, and their AIs will be a never-ending, near-impossible task, costing both time and money while remaining riddled with flaws.

Nor is it in the interest of these platforms to self-regulate for the purpose of protecting their user base, since their main purpose is to generate advertising revenue from other businesses, with user data as the product.

Forced regulation of user data through government policy will most likely be fought or circumvented via loopholes, since it potentially damages profit margins. This became evident recently when the German government implemented new data protection policies that Facebook will have to abide by, controlling what data can be collected from users across Facebook, Instagram, and WhatsApp. Facebook has since said that they “disagree with their conclusions and intend to appeal.” (7)

This regulation of user data has come in the wake of the 2018 Facebook–Cambridge Analytica data scandal, which revealed that personal data from over a million Facebook profiles had been harvested without consent and used for political gain.

More recently, members of a private Facebook group “discovered that their details could be downloaded by third parties” (8), including individuals’ health data.

It has already been made clear that social media giants cannot efficiently self-regulate user data; it is not in their interest to do so. Following Germany’s example, this regulation will have to be imposed via government policy. The issue here is that governments have only recently become aware of the scale of the problem, and in some cases do not fully understand the technology.

Self-regulation is an unwinnable contest

Regulation of user content, whether by governments or by the platforms themselves, is an unwinnable contest. Trying to control what content can and cannot be shared over social media will simply drive people to find ways around whatever regulation is put in place. Much as an oppressive regime struggles to control what its people are allowed to talk about, a business trying to control what its two billion users share faces an impossible task, even with AI and thousands of content monitoring staff.

When profit is the priority of a capitalist business, self-regulation becomes paradoxical, because regulation ultimately means implementing restrictions that in some way reduce revenue – an outcome counterproductive to capitalist objectives. In the case of social media platforms like Facebook, Instagram, and Snapchat, it will most likely always be the end user who bears the cost instead.

Regulation of social media content would always be a challenge, even in a perfect society. But in the final analysis, the best form of scrutiny of such platforms will be achieved only when they are run democratically as a public service and not as sources of private profit.

Resources:

(1)    https://www.bbc.co.uk/news/uk-47114313

(2)    https://www.theguardian.com/society/

(3)    https://www.bbc.co.uk/news/uk-47410520

(4)    (Financial Times, February 19 2019)

(5)    (Financial Times, March 1 2019)

(6)    https://www.bbc.co.uk/news/technology-47278362

(7)    https://www.theguardian.com/technology/

(8)    https://www.bbc.co.uk/news/technology-47308655

March 4, 2019
