
Who SAM is

Samara Areto Monitorbot, aka SAMbot or SAM, is a machine learning bot that detects and tracks toxic sentiment on Twitter. During the 2021 Canadian federal election, SAM is collecting data and insights about the online abuse directed at party leaders and incumbent candidates running for re-election across the country.

SAM’s journey

SAM has analyzed hundreds of thousands of tweets sent to party leaders and incumbent candidates on the digital campaign trail.

Since August 15, SAM has found:

  • Nearly 20% of tweets were labelled toxic
  • 1 in 4 tweets directed at women were toxic
  • 1 in 5 tweets directed at men were toxic

SAMbot report: September 12-19, 2021

In the final week of the campaign, activity on Twitter decreased slightly, but the proportion of toxic tweets remained steady.

Read the report

SAMbot report: September 5-12, 2021

With the Leaders’ Debates and other notable campaign events, the fourth week of the campaign saw high activity on Twitter.

Read the report

Why do we need SAM?

While it is commonly understood that toxic online spaces are harming our democracy, there is little data that illustrates the extent of the problem in detail. Consequently, effective regulations, policies and social expectations for online conduct in Canadian political contexts are lacking.

Online political discourse during campaign periods can be extremely toxic, and the 2021 federal election is a unique opportunity for SAM to investigate the current state of Canada’s online political space. SAM will track all English and French tweets directed at political party leaders and incumbent candidates. Each message that SAM tracks, whether a reply, quote tweet or mention, will be analyzed on a seven-point toxicity scale.

The information SAM provides can inform important conversations and nuanced approaches to reducing the toxicity of online political spaces in Canada.

Toxicity Scale

TOXICITY: A rude, disrespectful, or unreasonable comment that is likely to make people leave a discussion.
SEVERE TOXICITY: A very hateful, aggressive, disrespectful comment or otherwise very likely to make a user leave a discussion or give up on sharing their perspective. This attribute is much less sensitive to more mild forms of toxicity, such as comments that include positive uses of curse words.
IDENTITY ATTACK: Negative or hateful comments targeting someone because of their identity.
INSULT: Insulting, inflammatory, or negative comment towards a person or a group of people.
PROFANITY: Swear words, curse words, or other obscene or profane language.
THREAT: Describes an intention to inflict pain, injury, or violence against an individual or group.
SEXUALLY EXPLICIT: Contains references to sexual acts, body parts, or other lewd content.
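
The seven attribute names above match those published for Google’s Perspective API. Assuming a Perspective-style scoring backend (an assumption for illustration, not a statement about SAMbot’s internals), a single tweet could be scored on all seven attributes roughly as in the minimal Python sketch below; the endpoint, key handling and lack of error handling are simplifications.

import os
import requests

# The seven attributes from the toxicity scale above, in Perspective API naming.
ATTRIBUTES = [
    "TOXICITY", "SEVERE_TOXICITY", "IDENTITY_ATTACK",
    "INSULT", "PROFANITY", "THREAT", "SEXUALLY_EXPLICIT",
]

PERSPECTIVE_URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"


def score_text(text: str, api_key: str) -> dict:
    """Return an {attribute: probability} dict for one piece of tweet text."""
    body = {
        "comment": {"text": text},
        # Language is auto-detected when not specified, covering English and French tweets.
        "requestedAttributes": {name: {} for name in ATTRIBUTES},
    }
    response = requests.post(PERSPECTIVE_URL, params={"key": api_key}, json=body, timeout=10)
    response.raise_for_status()
    scores = response.json()["attributeScores"]
    return {name: scores[name]["summaryScore"]["value"] for name in ATTRIBUTES}


if __name__ == "__main__":
    print(score_text("You are a disgrace and should quit.", os.environ["PERSPECTIVE_API_KEY"]))

Each score is a probability between 0 and 1, which is what makes threshold-based labelling (such as the “nearly 20% of tweets were labelled toxic” figure above) possible.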

How SAM does it

SAM is a machine learning bot that collects and analyzes tweets in real time, storing each tweet that mentions the Twitter handle of at least one of the monitored candidates together with its analysis.

SAM does not track or store retweets, as counting the same tweet more than once would skew the analysis. Twitter data is collected and used in line with Twitter’s terms of use.

SAM analyzes and stores tweets by:

  1. analyzing the tweet information to identify and extract the tweet text
  2. scoring the tweet on a seven-point toxicity scale, and
  3. storing the tweet text and its toxicity score in a database
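
As a rough sketch of these three steps, and of the retweet filtering described above, the snippet below stores one incoming tweet, assuming Twitter API v2-style tweet objects and per-attribute scores computed as in the earlier Perspective example; the table layout and field names are illustrative, not SAMbot’s actual schema.

import sqlite3


def is_retweet(tweet: dict) -> bool:
    # Retweets are skipped so the same text is never counted more than once.
    return any(ref.get("type") == "retweeted" for ref in tweet.get("referenced_tweets", []))


def init_db(path: str = "sambot.db") -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS tweets (
               tweet_id TEXT PRIMARY KEY,
               author_id TEXT,
               created_at TEXT,
               text TEXT,
               toxicity REAL, severe_toxicity REAL, identity_attack REAL,
               insult REAL, profanity REAL, threat REAL, sexually_explicit REAL
           )"""
    )
    return conn


def store_tweet(conn: sqlite3.Connection, tweet: dict, scores: dict) -> None:
    """Take the extracted tweet fields plus precomputed attribute scores and persist them."""
    if is_retweet(tweet):
        return
    conn.execute(
        "INSERT OR IGNORE INTO tweets VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)",
        (
            tweet["id"], tweet.get("author_id"), tweet.get("created_at"), tweet["text"],
            scores["TOXICITY"], scores["SEVERE_TOXICITY"], scores["IDENTITY_ATTACK"],
            scores["INSULT"], scores["PROFANITY"], scores["THREAT"], scores["SEXUALLY_EXPLICIT"],
        ),
    )
    conn.commit()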

Read more about the technology behind SAM and how it has been used in other elections:

View PDF

Who is SAM listening to?

SAM has the capacity to monitor millions of tweets in real time. For the federal election, SAM is tracking the Twitter mentions of the major political parties’ leaders and the incumbent Members of Parliament who are running for re-election in 2021.

The list of accounts that SAM is monitoring is available in this spreadsheet.
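
The sketch below shows one way mentions of such a list of handles could be tracked, assuming Twitter API v2 filtered-stream rule syntax (the @ mention operator combined with -is:retweet to exclude retweets); the example handles and the 512-character rule limit are illustrative assumptions, not SAMbot’s actual configuration.

def build_stream_rules(handles: list[str], max_len: int = 512) -> list[str]:
    """Group @mention clauses into OR'd rule strings that stay under a per-rule length limit."""
    rules, current = [], []
    for handle in handles:
        clause = f"@{handle.lstrip('@')}"
        candidate = " OR ".join(current + [clause])
        # "-is:retweet" excludes retweets, in line with SAM's no-retweet policy.
        if current and len(f"({candidate}) -is:retweet") > max_len:
            rules.append(f"({' OR '.join(current)}) -is:retweet")
            current = [clause]
        else:
            current.append(clause)
    if current:
        rules.append(f"({' OR '.join(current)}) -is:retweet")
    return rules


# A few illustrative handles; the full list is in the linked spreadsheet.
print(build_stream_rules(["JustinTrudeau", "erinotoole", "theJagmeetSingh"]))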

When is SAM tracking?

SAM is tracking candidates’ mentions 24/7 during the 2021 federal election campaign and providing findings on a weekly basis.
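
Assuming tweets are stored in a table like the one sketched earlier, with ISO 8601 created_at timestamps, the weekly findings could come from a simple aggregation such as the one below; the 0.7 toxicity threshold is an illustrative assumption, not the Samara Centre’s published methodology.

import sqlite3


def weekly_toxicity_share(conn: sqlite3.Connection, threshold: float = 0.7):
    """Return (ISO year-week, tweet count, share of tweets labelled toxic) rows."""
    query = """
        SELECT strftime('%Y-%W', created_at) AS week,
               COUNT(*) AS total,
               AVG(CASE WHEN toxicity >= :threshold THEN 1.0 ELSE 0.0 END) AS toxic_share
        FROM tweets
        GROUP BY week
        ORDER BY week
    """
    return conn.execute(query, {"threshold": threshold}).fetchall()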

How does SAM detect toxicity?

SAM is a machine learning bot: a software application that runs automated tasks over the Internet. SAM is tracking all English and French tweets directed at political party leaders and incumbent candidates. Each message that SAM tracks, whether a reply, quote tweet or mention, is ranked on a seven-point toxicity scale. SAM has the capacity to distinguish between a microaggression and a threat.
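
To make that distinction concrete, the sketch below shows one way per-attribute scores could be turned into coarse labels, so that a generally rude reply and an outright threat are flagged differently; the thresholds are illustrative assumptions rather than SAM’s actual decision rules.

def label_tweet(scores: dict, toxic_threshold: float = 0.7, threat_threshold: float = 0.5) -> list[str]:
    """Turn {attribute: probability} scores into coarse labels.

    A tweet can be rude without being threatening (and vice versa), which is why
    THREAT is checked separately from the general TOXICITY score.
    """
    labels = []
    if scores.get("TOXICITY", 0.0) >= toxic_threshold:
        labels.append("toxic")
    if scores.get("THREAT", 0.0) >= threat_threshold:
        labels.append("threatening")
    if scores.get("IDENTITY_ATTACK", 0.0) >= toxic_threshold:
        labels.append("identity attack")
    return labels or ["not flagged"]


# A merely insulting reply and an explicit threat get very different label sets:
print(label_tweet({"TOXICITY": 0.75, "THREAT": 0.05}))  # ['toxic']
print(label_tweet({"TOXICITY": 0.85, "THREAT": 0.90}))  # ['toxic', 'threatening']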

Get SAM updates straight to your inbox from the Samara Centre

Subscribe Now

Support SAM's work by donating to the Samara Centre

Donate Now