The Globe and Mail reports in its Friday edition that if your toaster catches fire and causes damage, you can sue the manufacturer for product liability. The Globe's guest columnist Vass Bednar writes that if an artificial intelligence system, however, promotes harmful content, such as eating disorders or financial scams, there is often little legal recourse. Despite their significant influence on society, the products of digital platforms, including recommender systems and large language models, remain largely under-regulated. Product liability law protects consumers from unsafe goods and holds manufacturers accountable even without intent to harm, fostering a culture of responsibility, but Big Tech has largely evaded these standards. Many courts still do not consider software a "product" under traditional product liability law, and the harms these systems cause are difficult to assess and prove. Meanwhile, platforms such as Meta and Google claim they are not publishers or producers but neutral intermediaries, despite actively designing systems that shape what people see, click on, pay for and believe. This legal exceptionalism is especially glaring given the growing harms associated with algorithmic systems.
© 2025 Canjex Publishing Ltd. All rights reserved.