SpoilNet // The Scary Real-Time Censorship Engine Beyond Filter Bubbles


Do you hate spoilers for your favorite T.V. shows and movies? You're not alone; in fact, some die-hard fans go so far as to block words and phrases mentioning a new movie or series in the filter settings of their social media apps.

This behavior has caught the attention of an Artificial Intelligence research team at the University of California San Diego (UCSD). The group has developed an A.I. that can read and listen to data and media coming through your devices and automatically remove the spoilers.

SpoilNet uses a neural network to identify words, phrases, and cultural metaphors that pertain to the target subject. Using an online knowledge base of words and phrases, it can auto-censor spoilers before you see them.
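The team's actual neural model isn't published in this article, but the core idea, scoring each sentence of a review against a knowledge base of spoiler-prone terms and redacting the ones that score too high, can be sketched with a toy keyword filter. Everything here (the tiny word list, the threshold, the function names) is invented for illustration and stands in for the real network:

```python
# Toy spoiler filter: a keyword/knowledge-base stand-in for SpoilNet's
# neural model. The word list and threshold are invented examples.
import re

SPOILER_TERMS = {"dies", "killer", "betrays", "ending", "twist"}  # toy knowledge base

def spoiler_score(sentence: str) -> float:
    """Fraction of words in the sentence that appear in the spoiler term list."""
    words = re.findall(r"[a-z']+", sentence.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in SPOILER_TERMS)
    return hits / len(words)

def censor(review: str, threshold: float = 0.1) -> str:
    """Replace any sentence scoring above the threshold with a redaction marker."""
    sentences = re.split(r"(?<=[.!?])\s+", review)
    return " ".join(
        "[SPOILER REDACTED]" if spoiler_score(s) > threshold else s
        for s in sentences
    )

print(censor("Great cinematography. The killer is the butler and everyone dies!"))
# → Great cinematography. [SPOILER REDACTED]
```

A real system would replace `spoiler_score` with a learned model, but the censoring pipeline around it looks much the same.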



The team used the open social network Goodreads, which allows anyone to access user data and annotations. This database allowed the programmers to create an index with fine-grained accuracy.
A lack of granular data had previously been the main limitation preventing programmers from building a sophisticated version of this type of A.I. tool. UCSD researchers used over 1.3 million annotations from users sharing thoughts and notes on the highlighted portions of books they read.

One pattern they found in the data was that spoilers tend to cluster at the end of reviews. Another finding was that not everyone agrees on what counts as a spoiler. The main labor involved in developing the system was making the A.I. understand the varying quality of spoiler annotations and establish a standard for contextual information.
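The first finding, that spoilers cluster at the end of reviews, suggests sentence position itself is a useful signal. One simple way such a bias could feed into a model is as a normalized-position feature, 0.0 for the first sentence and 1.0 for the last. This is a sketch of the idea only, not the researchers' actual feature set:

```python
def position_features(review_sentences):
    """Normalized position of each sentence: 0.0 for the first, 1.0 for the last.
    A spoiler model could weight later sentences more heavily, reflecting the
    finding that spoilers cluster near the end of reviews."""
    n = len(review_sentences)
    if n <= 1:
        return [0.0] * n
    return [i / (n - 1) for i in range(n)]

feats = position_features(
    ["Loved it.", "Pacing dragged a bit.", "The twist: he was dead all along."]
)
print(feats)  # → [0.0, 0.5, 1.0]
```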

When developing the A.I. tool, programmers found that a difficult hurdle was the use of words in different contexts. For example, "green" could describe an item's color, mean that someone is naive, or even be the name of an actual character.

Once the context hurdle was cleared, the A.I. system could accurately predict spoilers at an 80% success rate. The team also ran SpoilNet on reviews of a recent T.V. show and got a 74% success rate. The remaining hurdle is newer generations using serious words in an unserious context.

The main worry privacy advocates have about SpoilNet is that it might be employed for nefarious purposes. For example, future versions could be used to install listening software on a personal computer, mobile device, or smart device and censor a user's online experience without their knowledge.

This framework has the potential to listen for spoken keywords in live streams and mute the audio, or disrupt the connection, until the unfavored content has passed. Some fear it might be used to remove articles, images, search results, and videos from social media search queries and algorithmic news feeds.
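Mechanically, that live-stream scenario reduces to a gate over transcribed audio chunks. Nothing in the article confirms an implementation, so the following is purely a hypothetical sketch: the chunk format, blocklist, and function are all invented to show how little machinery real-time keyword muting would actually require:

```python
# Hypothetical sketch of real-time keyword muting over a transcribed stream.
# Each chunk is (transcript_text, audio_bytes); "muting" swaps in silence.
BLOCKLIST = {"protest", "corruption"}  # invented example keywords

def gate_stream(chunks):
    """Yield audio chunks, substituting silence when a transcript hits the blocklist."""
    for transcript, audio in chunks:
        words = set(transcript.lower().split())
        if words & BLOCKLIST:
            yield b"\x00" * len(audio)  # silence of equal length, so timing is preserved
        else:
            yield audio

stream = [("hello everyone", b"\x01\x02"), ("the corruption scandal", b"\x03\x04")]
print(list(gate_stream(stream)))  # → [b'\x01\x02', b'\x00\x00']
```

The listener downstream would hear only a brief dropout, which is exactly why this pattern is so hard to detect.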

This is of concern because, while the tool was created to improve user experience, it could be repurposed by corporate security contractors (spies for hire) to wreak havoc on a target without their knowledge.

This is a major upset for those few social media and internet heavy hitters, like Twitter's Jack Dorsey, who advocate that the web shall remain free, fair, and a public square.




Where this gets scary is if an oppressive government obtains this code. They could use it NOT for spoilers, but on live videos, where victims of government abuse are muted or "Glitched Out" in real time to their audience. These could be on-location news anchors, or protesters trying to expose government corruption.

The nightmare scenario is government human rights abuse, where the mention of certain keywords or locations during live podcasts, on-location streams, or even a non-live YouTube video could be censored in real time, and you would never know the broadcaster's rights, or your own, had been suppressed.

So, if you'll never be the wiser... do you trust your country's government not to use it on you or the population?

A hammer can build a home, or be used to commit assault... it's time to start looking at digital tools and algorithms in the same way.


Read more about SpoilNet on EurekAlert



