44th Parliament, 1st Session
Tuesday, May 31, 2022, 6:30 p.m. to 8:30 p.m.
My name is Sara Bannerman (she/her). I’m a Canada Research
Chair in Communication Policy and Governance at McMaster University.
Today I will focus on discoverability and algorithmic
bias.
Governments around the world are working on measures to ensure
that algorithms are accountable.
There is a common misconception that streaming platforms
recommend what users want to see.
Platforms show us what they want to show us.
They show us what will keep us watching ads, purchasing advertised products, contributing
data, and subscribing.
Platforms are not neutral. They serve
their business interests. In addition, there are three types of bias I am concerned
about. These biases can affect both users and content providers.
First, there can be bias when algorithms are used to select content for carriage on a streaming service,
- predicting how many viewers the content will
attract.
- A poor algorithmic showing could sink a content
provider’s chances of being shown.
Second, there can be bias in the recommendation
algorithms that users rely on to discover content.
- Recommenders often display popularity bias,
recommending what’s popular and concentrating users’ viewing on a smaller
catalog of content.
- This can be unfair to artists in the “long tail”
and to users who like something other than popular content.
- It could also be unfair to Canadian content, including
user-generated content.
Third, users’ own biases can be amplified.
- Beyond users’ biases towards or against Canadian
content, if some users have a gender bias, for example, this could be amplified
in the recommendations that respond to past viewing habits.
- Such biases can form a feedback loop that can
spread throughout the system.
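The feedback loop described above can be sketched as a toy “rich get richer” simulation. This is a hypothetical illustration for this discussion, not any platform’s actual algorithm: each round, an item is recommended with probability proportional to its past views, and the resulting view reinforces that item’s future chances.

```python
import random

def simulate(n_items=5, n_steps=1000, seed=0):
    """Toy popularity-bias loop: each step, one item is 'recommended'
    with probability proportional to its past view count, and the new
    view is recorded, reinforcing that item's future chances."""
    rng = random.Random(seed)
    views = [1] * n_items  # every item starts on an equal footing
    for _ in range(n_steps):
        total = sum(views)
        r = rng.uniform(0, total)
        cumulative = 0
        for i, v in enumerate(views):
            cumulative += v
            if r <= cumulative:
                views[i] += 1
                break
    return views

final = simulate()
print(final)  # views typically end up concentrated on a few items,
              # even though all items started equal
```

Even in this tiny sketch, small early differences in views compound over time, concentrating attention on a narrow slice of the catalogue, which is the dynamic that disadvantages artists in the “long tail”.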
Research in this area is only beginning to develop.
CRTC intervention in the algorithms raises many difficult
problems. It may not be the first, most
likely, or best answer to them. The CRTC
has said today that it does not want to play that role, but the Commission could
play a role in bringing such problems to light.
There are concerns that requiring discoverability would
infringe on freedom of expression.
Streaming services’ user interfaces and recommendations may be forms of
expression, and, if so, regulatory interventions could constitute a limit on
that expression.
There are legitimate concerns that promoting some content could mean demoting other
content, among other concerns.
Sometimes limits on expression are justifiable, but they must be justified.
To determine whether any justification exists, or even simply to understand these systems, we
need data.
It may be that the best role for the CRTC will be to monitor
and call attention to problems, not just with the discoverability of Canadian
content, but also with recommender biases relating to other Canadian values, so
that civil society and others can intervene.
The CRTC can only do that if it has data.
The bill’s provisions on information provision and information
disclosure are important to the study and examination of
discoverability algorithms and data, and to the CRTC’s potential work with
outside organizations on this matter.
It may be necessary to require platforms to
collect certain data to permit these examinations to happen.
- The general powers section of the Bill could
include, under paragraphs (n) and (o), the phrase “collection and provision”.
- The general powers section of the Bill could
name information on discoverability as information the Commission can demand.
I strongly disagree with proposals
that would allow a company to prevent the disclosure of information under section
25.3.
The Canadian broadcasting system has often served dominant
groups.
It has also been open to change and improvement based on the work of civil society
and others.
We need to ensure that the discoverability mechanisms of
online streaming platforms are also open to critique and change through public
transparency, debate, and data.
Thank you.
Appendix
Amendments
to C-11, for consideration
9 (1) (n)
the collection and provision to the Commission […]
9 (1) (o)
the collection and provision to the Commission […]
9 (1)
(o) (v) information related to discoverability; and [NEW]
Comments on Netflix’s
proposed amendments to Bill C-10, for consideration
I object to
Netflix’s proposed amendments to Bill C-10 section 25.3, which could equally be
proposed for C-11.
If their proposed 25.3 (4 and 5) (a) (a.1) is adopted (which I do not support), it should
read “must not knowingly be publicly disclosed or made available […].”
This would permit the CRTC to work privately with groups to examine
discoverability mechanisms and other matters.