
Who SAM is

Samara Areto Monitorbot, aka SAMbot or SAM, is a machine learning bot deployed to detect and track toxic sentiment on Twitter during the 2021 Canadian federal election.

SAM’s journey

SAM analyzed millions of tweets sent to party leaders and incumbent candidates on the digital campaign trail during the election period (August 15-September 20, 2021).

SAMbot report: September 20, 2021

On Election Day, toxicity rose at 11 PM ET, shortly after CBC and CTV declared a Liberal minority government.

Read the report

SAMbot report: September 12-19, 2021

In the final week of the campaign, activity on Twitter decreased slightly, but the proportion of toxic content held steady.

Read the report

Why do we need SAM?

While it is commonly understood that toxic online spaces are harming our democracy, there is little data that illustrates the extent of the problem in detail. Consequently, effective regulations, policies and social expectations for online conduct in Canadian political contexts are lacking.

Online political discourse during campaign periods can be extremely toxic, and the 2021 federal election was a unique opportunity for SAM to investigate the current state of Canada’s online political space. SAM tracked all English and French tweets directed at political party leaders and incumbent candidates. Each message that SAM tracked, whether a reply, quote tweet or mention, was analyzed across the seven toxicity attributes listed below.

This information will help to inform important conversations and nuanced approaches to reducing the toxicity of online political spaces in Canada.

Toxicity Scale

TOXICITY: A rude, disrespectful, or unreasonable comment that is likely to make people leave a discussion.
SEVERE TOXICITY: A very hateful, aggressive, or disrespectful comment, or one otherwise very likely to make a user leave a discussion or give up on sharing their perspective. This attribute is much less sensitive to milder forms of toxicity, such as comments that include positive uses of curse words.
IDENTITY ATTACK: Negative or hateful comments targeting someone because of their identity.
INSULT: Insulting, inflammatory, or negative comments towards a person or a group of people.
PROFANITY: Swear words, curse words, or other obscene or profane language.
THREAT: Describes an intention to inflict pain, injury, or violence against an individual or group.
SEXUALLY EXPLICIT: Contains references to sexual acts, body parts, or other lewd content.

How SAM does it

SAM is a machine learning bot that collected and analyzed tweets in real time during the election campaign period, storing every tweet that mentioned the Twitter handle of at least one of the monitored candidates.

SAM did not track or store retweets, since counting the same tweet more than once would have skewed the analysis. Twitter data was collected and used in line with Twitter’s terms of use.
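As a rough picture of what that collection step could look like, the sketch below builds a Twitter v2 search query that matches mentions of a few monitored handles in English or French while excluding retweets, then fetches recent matches with the tweepy client. The sample handle list, the bearer token, and the choice of tweepy are illustrative assumptions; SAM’s actual collection pipeline is not documented here.

```python
# Hypothetical sketch only: collect mentions of monitored candidates, excluding retweets.
# The handles, bearer token, and use of tweepy are assumptions for illustration.
import tweepy

MONITORED_HANDLES = ["JustinTrudeau", "erinotoole", "theJagmeetSingh"]  # small sample

def build_query(handles):
    """Build a Twitter v2 search query: any mention of a monitored handle,
    in English or French, with retweets excluded (SAM did not store them)."""
    mentions = " OR ".join(f"@{h}" for h in handles)
    return f"({mentions}) (lang:en OR lang:fr) -is:retweet"

client = tweepy.Client(bearer_token="YOUR_BEARER_TOKEN")  # placeholder credential

# A full deployment would split roughly 300 handles across several queries or
# stream rules to stay within Twitter's query-length limits.
response = client.search_recent_tweets(
    query=build_query(MONITORED_HANDLES),
    tweet_fields=["lang", "author_id"],
    max_results=100,
)

for tweet in response.data or []:
    print(tweet.id, tweet.lang, tweet.text[:80])
```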

SAM analyzed and stored tweets in three steps (sketched in the code below):

  1. parsing the incoming tweet data to extract the tweet text
  2. scoring the text on each of the seven toxicity attributes, and
  3. storing the tweet text and its toxicity scores in a database
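The seven attribute names in the Toxicity Scale above match the attributes exposed by Google’s Perspective API, so the sketch below uses that API as a stand-in scorer and SQLite as the database; both are assumptions made for illustration, not a description of the model SAM actually runs.

```python
# Hypothetical sketch of the three steps: extract text, score seven attributes, store.
# Using Google's Perspective API and SQLite here is an illustrative assumption.
import sqlite3
import requests

ATTRIBUTES = ["TOXICITY", "SEVERE_TOXICITY", "IDENTITY_ATTACK",
              "INSULT", "PROFANITY", "THREAT", "SEXUALLY_EXPLICIT"]

PERSPECTIVE_URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
                   "comments:analyze?key=YOUR_API_KEY")  # placeholder key

def extract_text(tweet_payload: dict) -> str:
    """Step 1: pull the tweet text out of the raw payload."""
    return tweet_payload["text"]

def score_text(text: str, lang: str = "en") -> dict:
    """Step 2: request a 0-1 score for each of the seven attributes."""
    body = {
        "comment": {"text": text},
        "languages": [lang],
        "requestedAttributes": {attr: {} for attr in ATTRIBUTES},
    }
    resp = requests.post(PERSPECTIVE_URL, json=body, timeout=10)
    resp.raise_for_status()
    scores = resp.json()["attributeScores"]
    return {attr: scores[attr]["summaryScore"]["value"] for attr in ATTRIBUTES}

def store_tweet(db: sqlite3.Connection, tweet_id: str, text: str, scores: dict) -> None:
    """Step 3: persist the tweet text and its per-attribute scores."""
    db.execute(
        "INSERT OR REPLACE INTO tweets VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)",
        (tweet_id, text, *(scores[a] for a in ATTRIBUTES)),
    )
    db.commit()

db = sqlite3.connect("sambot.db")
db.execute("""CREATE TABLE IF NOT EXISTS tweets (
    id TEXT PRIMARY KEY, text TEXT,
    toxicity REAL, severe_toxicity REAL, identity_attack REAL,
    insult REAL, profanity REAL, threat REAL, sexually_explicit REAL)""")
```

Run over the collected stream, each incoming payload would pass through extract_text, score_text, and store_tweet in turn.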

Read more about the technology behind SAM and how it has been used in other elections:

View PDF

Who did SAM listen to?

SAM has the capacity to monitor millions of tweets in real time. For the federal election, SAM tracked the Twitter mentions of the major political parties’ leaders and the incumbent Members of Parliament who ran for re-election in 2021.

SAM tracked 210 male incumbent candidates (including Justin Trudeau, Erin O’Toole and Jagmeet Singh), 88 female incumbent candidates, and two party leaders who did not hold a seat in the last Parliament (Annamie Paul and Maxime Bernier).

The list of accounts that SAM monitored is available in this spreadsheet.

When did SAM track?

SAM tracked the candidates’ mentions 24/7 during the 2021 federal election campaign, from Sunday, August 15 to Monday, September 20, 2021. Weekly findings can be found on our Reports page.

How does SAM detect toxicity?

SAM is a machine learning bot – a software application that runs automated tasks over the Internet. SAM tracked all English and French tweets directed at political party leaders and incumbent candidates.

Each message that SAM tracked – whether a reply, quote tweet or mention – was scored on each of the seven toxicity attributes described above. Because each attribute is scored separately, SAM had the capacity to distinguish between a microaggression and a threat, as the sketch below illustrates.
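That distinction comes down to which attributes carry the high scores. The hypothetical helper below, with an arbitrary 0.8 threshold and made-up example scores, shows how per-attribute scoring lets downstream analysis separate a profanity-driven reply from a threatening one; none of these numbers come from SAM’s data.

```python
# Hypothetical illustration: separate per-attribute scores let analysis distinguish,
# for example, a profane insult from a direct threat.
# The 0.8 threshold and the example scores below are made up for illustration.
def flag_attributes(scores: dict, threshold: float = 0.8) -> list:
    """Return (attribute, score) pairs at or above the threshold, highest first."""
    flagged = [(attr, s) for attr, s in scores.items() if s >= threshold]
    return sorted(flagged, key=lambda pair: pair[1], reverse=True)

profane_reply = {"TOXICITY": 0.85, "SEVERE_TOXICITY": 0.30, "IDENTITY_ATTACK": 0.10,
                 "INSULT": 0.70, "PROFANITY": 0.92, "THREAT": 0.05,
                 "SEXUALLY_EXPLICIT": 0.02}
threatening_reply = {"TOXICITY": 0.90, "SEVERE_TOXICITY": 0.80, "IDENTITY_ATTACK": 0.10,
                     "INSULT": 0.40, "PROFANITY": 0.20, "THREAT": 0.95,
                     "SEXUALLY_EXPLICIT": 0.02}

print(flag_attributes(profane_reply))      # flags PROFANITY and TOXICITY
print(flag_attributes(threatening_reply))  # flags THREAT, TOXICITY, SEVERE_TOXICITY
```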

Be the first to know about what SAM does next

Subscribe Now

Support SAM's work by donating to the Samara Centre

Donate Now