Friday, October 7, 2022

How does the collection of personal information by political parties affect elector engagement?

My new article with Julia Kalinina, Elizabeth Dubois, and Nicole Goodman, "Privacy and Canadian Political Parties: The Effects of the Data-Driven Campaign on Elector Engagement," is now available Open Access in the Canadian Journal of Political Science.

Datafied campaigning raises concerns about surveillance, divisiveness, digital redlining, and elector autonomy. This article asks whether awareness of parties' data collection practices affects electors' willingness to engage with campaigns.

We surveyed Canadians to answer this question, and found: 

  • respondents are not fully aware of political parties' data collection practices,
  • awareness of parties' collection of personal information may reduce electors' willingness to interact with political parties online, and
  • respondents support the application of privacy law to political parties.

Our team wrote about some of these findings in an article published in The Conversation in 2021, "Electronic tracking of voters is a thorny topic in a tight federal election race."

The Algorithmic Distribution of News: Policy Responses


My new edited collection with James Meese, The Algorithmic Distribution of News: Policy Responses, examines regulatory responses, and the responses of journalism organizations, to the distribution of news by algorithms. It examines how news recommenders and aggregators like Google News and the Facebook news feed have challenged traditional news gatekeepers and the revenue flows of news organizations. Countries around the world have come up with a broad range of policy responses. The spectacular array of authors in this book provides both reflections on these initial responses and food for thought about future directions.

Tuesday, June 7, 2022

The Online Streaming Act (Bill C-11) claims to level the playing field … but for whom?

The Online Streaming Act (Bill C-11) claims to level the playing field … but for whom? Nawshaba Ahmed and I ask this question in our piece published in The Conversation.

See also my comments to the Heritage committee on algorithmic recommendations and C-11.

Wednesday, June 1, 2022

Comments to the House of Commons Standing Committee on Canadian Heritage on Bill C-11

44th Parliament, 1st Session
Tuesday, May 31, 2022, 6:30 p.m. to 8:30 p.m.
Watch on ParlVu

My name is Sara Bannerman (she/her). I’m a Canada Research Chair in Communication Policy and Governance at McMaster University.

Today I will focus on discoverability and algorithmic bias. 

Governments around the world are working on measures to ensure that algorithms are accountable.

There is a common misconception that streaming platforms recommend what the user wants to see.

Platforms show us what they want to show us. They show us what will keep us watching ads, purchasing advertised products, contributing data, and subscribing.

Platforms are not neutral. They serve their business interests. Beyond these business interests, there are three types of bias I am concerned about. These biases can affect both users and content providers.

First, there can be bias when algorithms are used to select content for carriage on a streaming service:
  • Such algorithms predict how many viewers a piece of content will attract.
  • A poor algorithmic showing could sink a content provider's chances of being carried.
Second, there can be bias in the recommendation algorithms that users use to discover content.
  • Recommenders often display popularity bias, recommending what’s popular and concentrating users’ viewing on a smaller catalog of content.
    • This can be unfair to artists in the “long tail” and to users who like something other than popular content.
    • It could be unfair to Canadian content, including user-generated content.

Third, users’ own biases can be amplified
  • Beyond users’ biases towards or against Canadian content, if some users have a gender bias, for example, this could be amplified in the recommendations that respond to past viewing habits.
  • Such biases can form a feedback loop that can spread throughout the system.
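
The feedback loop described above can be sketched in a toy simulation. Everything here is an illustrative assumption, not a model of any real platform: the catalogue size, the top-k recommender, and the share of users who follow recommendations are all made up for the sake of the example.

```python
import random

def simulate_popularity_feedback(n_items=20, n_users=5000, k=3, seed=42):
    """Toy model of popularity bias: a recommender that always surfaces
    the k most-viewed items, and users who mostly pick from what is
    recommended. Small initial differences get amplified over time."""
    rng = random.Random(seed)
    views = [1] * n_items  # every item starts with a single view
    for _ in range(n_users):
        # Recommend the current k most-viewed items.
        top_k = sorted(range(n_items), key=lambda i: views[i], reverse=True)[:k]
        if rng.random() < 0.9:
            # 90% of users (an assumed rate) pick from the recommendations.
            choice = rng.choice(top_k)
        else:
            # The rest browse the full catalogue at random.
            choice = rng.randrange(n_items)
        views[choice] += 1
    return views

views = simulate_popularity_feedback()
top_share = sum(sorted(views, reverse=True)[:3]) / sum(views)
print(f"Top 3 of 20 items capture {top_share:.0%} of all views")
```

Even though all twenty items start out identical, the recommender concentrates the overwhelming majority of views on a handful of items: the loop rewards whatever happens to be popular early on, which is the "long tail" unfairness described above.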

Research in this area is still in its early stages.

CRTC intervention in these algorithms would raise many difficult problems, and it may not be the first, most likely, or best answer to them. The CRTC has said, today, that it does not want to play that role. But the Commission could play a role in bringing such problems to light.

There are concerns that requiring discoverability would infringe on freedom of expression.
Streaming services’ user interfaces and recommendations may be forms of expression, and, if so, regulatory interventions could constitute a limit on that expression.
There are legitimate concerns that promoting some content could mean demoting other content, among other concerns.
Sometimes limits on expression are justified, but they must be justified.
To assess whether any justification exists, or even simply to understand these systems, we need data.

It may be that the best role for the CRTC will be to monitor and call attention to problems, not just with the discoverability of Canadian content, but also with recommender biases relating to other Canadian values, so that civil society and others can intervene.
The CRTC can only do that if it has data.

The bill's provisions on information provision and disclosure are important to the study and examination of discoverability algorithms and data, and to the CRTC's potential work with outside organizations on this matter.

It may be necessary to require platforms to collect certain data to permit these examinations to happen.
  • The general powers section of the Bill could include (under paragraphs (n) and (o)) the phrase "collection and provision".
  • The general powers section of the Bill could name information on discoverability as information the Commission can demand.
I strongly disagree with proposals that would allow a company to prevent the disclosure of information in section 25.3.[1]

The Canadian broadcasting system has often served dominant groups.
It has also been open to change and improvement based on the work of civil society and others.

We need to ensure that the discoverability mechanisms of online streaming platforms are also open to critique and change through public transparency, debate, and data. 

Thank you.



[1] See Appendix for details


Appendix

Amendments to C-11, for consideration

 9 (1) (n) the collection and provision to the Commission […]

9 (1) (o) the collection and provision to the Commission […]

9 (1) (o) (v) information related to discoverability; and [NEW]


Comments on Netflix's proposed amendments to Bill C-10, for consideration

I object to Netflix's proposed amendments to Bill C-10 section 25.3, which could equally be proposed for C-11.

If their proposed 25.3 (4 and 5) (a) (a.1) is adopted (I do not support this), it should read "must not knowingly be publicly disclosed or made available […]." This would permit the CRTC to work privately with groups to examine discoverability mechanisms and other matters.