By Nazli Bostandoust
Nazli Bostandoust is Manager, Corporate Innovation at MaRS Discovery District.
Modern artificial intelligence is, famously, a Canadian story. The so-called godfathers of AI – Geoffrey Hinton, Yoshua Bengio and Richard Sutton – made their machine-learning breakthroughs here, and the world-renowned research institutes they're associated with (Toronto's Vector Institute, Mila in Montreal, the Alberta Machine Intelligence Institute in Edmonton) continue to do pioneering work in the field.
In 2017, we were the first country in the world to launch a national AI strategy. This past May, when Ontario MP Evan Solomon was appointed Minister of Artificial Intelligence and Digital Innovation, he became one of the few government ministers in the world with a dedicated AI portfolio.
But all this homegrown innovation and accomplishment will mean little if we don’t create and control the infrastructure necessary to continue building cutting-edge AI.
Canada leads the G7 in average year-over-year growth in AI talent concentration. Yet we also have the least publicly available compute capacity – the processing power needed to train and deploy AI models – of any G7 nation.
Most Canadian businesses and startups, in fact, rely on the massive American cloud providers known as hyperscalers (Amazon Web Services, Microsoft Azure, and Google Cloud) for their compute capacity needs.
This is detrimental in two ways: one, it hinders the growth and innovation of Canada’s AI industry; two, it potentially makes us vulnerable to the whims of these tech giants and the antagonistic political climate within which they operate.
Earlier this month, to cite just one dramatic example, President Donald Trump's executive order sanctioning the International Criminal Court led Microsoft to suspend the email account of a prosecutor investigating Israeli officials for war crimes.
At a moment when we're talking about the sovereignty of everything – our borders, our governance, our natural resources, our defence – we also need to talk about taking control of our compute capacity.
The federal government has, thankfully, begun to take these concerns seriously. Late last year, it launched its $2-billion Canadian Sovereign AI Compute Strategy, which includes a $300-million AI Compute Access Fund designed to mobilize private-sector investment and build out domestic supercomputing infrastructure. Of that $2 billion, up to $700 million is being spent on the construction of data centres in Canada through Ottawa's AI Compute Challenge.
Toronto AI firm Cohere was the first to receive funding – $240 million – which it is putting toward a new data centre in Canada. Cohere will be the anchor tenant of the facility, which New Jersey-based cloud computing firm CoreWeave is building.
At the same time, the major telecommunications companies are making encouraging moves. Bell is building a network of six data centres in B.C., while Telus will soon unveil what it calls sovereign AI factories in B.C. and Quebec.
This investment is a good start, but infrastructure alone won't secure Canada's AI future. Compute capacity is notoriously expensive: a single AI server can cost US$300,000, and a full cloud deployment can run to hundreds of millions.
Reframing the idea of digital sovereignty
Rather than look to Ottawa for more cash, we need complementary solutions. One thing we might consider is reframing the very idea of digital sovereignty. Building more data centres on Canadian soil is imperative, but we also need to build domestic capabilities, supply chains and adoption pathways that make sovereignty meaningful.
If we think about sovereignty this way, we can also better democratize compute so that the next wave of homegrown innovators can emerge alongside anchors like Cohere. These innovators tend to be the small and medium-sized enterprises (SMEs) that drive over half of our GDP.
They’re faster at deploying real-world AI and more representative of Canada’s regional and demographic diversity. Recently, at an event MaRS hosted during Toronto Tech Week, Hojjat Salemi, the chief business development officer at Ranovus, made the excellent point that full participation in the data centre supply chain is as significant as the ownership of that infrastructure.
For a data centre operator, it's easy and convenient to go with NVIDIA racks, but there are smaller, competitive Canadian firms all along the supply chain – from copper refiners to optical cable manufacturers – that, in aggregate, can do the same job just as well.
On the same panel, Craig McLellan, CEO of ThinkOn, the only sovereign cloud provider in the country, echoed this sentiment, arguing that the best way for the government to close the compute capacity gap is to trace the supply chain.
This isn't just about helping individual companies; it's about bolstering the whole ecosystem by guaranteeing that a portion of public-sector AI infrastructure purchases goes through Canadian-owned providers. Such a commitment would also make clear that investing in sovereign compute capacity is more than propping up a new industry whose benefits to the larger economy still seem, to the general public, somewhat speculative.
Many of these SMEs are pioneering vertical AI, building models tailored to specific sectors or problems such as medical diagnostics and advanced manufacturing. These solutions are grounded in Canadian research and data and often serve public-good applications. Without targeted support, we risk sidelining the very companies best positioned to translate AI into real-world value. Minister Solomon is fond of saying "sovereignty does not mean solitude."
We agree. But that doesn’t mean waiting. Canada must act now to close the compute capacity gap, strengthen its AI supply chain, and give SMEs a fair shot. Sovereignty means more than infrastructure; it means intention.
Funding a major company like Cohere must not come at the expense of the government's capacity to dedicate time, resources and money to supporting the real job creators. Collaboration must start at home, with procurement, investment and policy aligned to build a truly Canadian AI future.