AI Systems Use Canadian Journalism Without Proper Credit, Study Finds

A new study says artificial intelligence systems rely heavily on Canadian journalism to answer users’ questions but often fail to credit the original news sources.

Researchers at McGill University’s Centre for Media, Technology and Democracy analyzed 2,267 Canadian news stories and tested how major AI platforms respond when asked about Canadian news events.

The study found that popular AI models such as ChatGPT, Gemini, Claude and Grok show strong knowledge of Canadian current affairs, suggesting they have absorbed large amounts of Canadian news reporting during training.

However, the researchers said the systems rarely acknowledge where the information comes from. In about 82 per cent of responses, the AI tools did not provide any source attribution when discussing Canadian news events.

Even when AI platforms were given web access and asked about specific articles, many responses summarized so much of the original reporting that users had little reason to visit the outlet's website.

About half of the responses included at least one link to Canadian content, but only 28 per cent directly named the Canadian news outlet that produced the reporting.

The report said that while links can technically guide readers back to the original source, most users reading the AI response would not know whose journalism they were consuming.

Researchers warned that AI companies are extracting value from journalism at multiple levels. They said AI systems absorb news archives as training data, create new content based on that material without naming the sources, and deliver answers directly to users in a way that reduces the need to visit news websites.

The study argues that this process could worsen the financial challenges already facing the news industry by weakening traffic to original news outlets.

The report was released during a national summit in Banff where the federal government is discussing the impact of artificial intelligence on culture and media.

Opening the event, federal Culture Minister Mark Miller said there are serious questions about copyright, licensing and the growing presence of AI-generated content in the marketplace.

Artificial Intelligence Minister Evan Solomon said consultations on Canada’s upcoming AI strategy revealed concerns from creators who want stronger safeguards as AI technology develops.

Solomon said the government is aware of questions around copyright, ownership and the use of data by AI systems, and indicated that discussions on these issues are ongoing.

At the same time, several major Canadian news organizations are taking legal action over the issue. A coalition that includes The Canadian Press, Torstar, The Globe and Mail, Postmedia and CBC/Radio-Canada has filed a lawsuit in Ontario against OpenAI.

The media companies argue that their news content has been used to train AI systems without permission and that OpenAI is profiting from that material without paying compensation.

Canada has already taken steps to address similar issues with technology companies. In 2023, the federal government passed the Online News Act, requiring platforms such as Google and Meta to compensate media outlets for displaying their news content.

Following the law’s passage, Meta removed news content from its platforms in Canada, while Google agreed to make payments to media organizations.

Researchers say the impact of AI on journalism could be different from what happened with social media. While social media platforms attracted advertising revenue by gathering audiences around news content, AI systems may go further by directly delivering the information itself.

According to the report, this means readers may no longer need to visit the original news sources at all, raising new concerns about the future of journalism.