Lack of supercomputing power is impairing Canada’s research and business innovation

Mark Lowey
December 13, 2023

Lack of supercomputing power is hobbling Canadian research and business innovation and driving scientists to other countries that have the resource, say experts in high-performance computing systems.

Along with this “brain drain,” the country is also missing out on billions of dollars in taxable revenue by not providing commercial supercomputing services, as the U.S. and other countries do, they said.

Canada has about 25 per cent of the accelerated supercomputing power required by academic researchers, and there is no supercomputer in the country for innovation by businesses or national defense, says Dr. Ryan Grant, PhD (photo at right), assistant professor of electrical and computer engineering in the Smith School of Engineering at Queen’s University. He helped build the world’s largest and fastest supercomputers in the U.S. before returning to Canada.

“We have the top AI talent in the world in research,” Grant told Research Money. “But there’s no place for them to go and translate that product from the lab into a product that makes dollars we get to tax in Canada. There’s no place for industry to go to do work on AI.”

On a list of the top 500 fastest supercomputing systems in the world, Canada’s best supercomputers rank No. 86 and 87. Among the many countries with a more powerful supercomputer than Canada has are Finland, Italy, South Korea, Switzerland, Brazil, Netherlands, Luxembourg, Sweden, Taiwan and Thailand.

“China has a top 10 [supercomputer] on that list that’s more powerful than all the supercomputer systems in Canada combined,” Grant said.

The power of supercomputers is measured in “petaflops.” Canada’s supercomputing power totals about 42 petaflops, compared with nearly 3,700 petaflops in the U.S.

“You’d expect us to be at a ratio of one to 10,” given that the U.S. population is about 10 times the size of Canada’s, Grant said.

“We’re not competitive with other nations and our [supercomputing] capacity is lacking,” Dave King (photo at left), chief revenue officer at Calgary-based Denvr Dataworks, said in an interview. Denvr Dataworks provides and operates infrastructure and software to companies looking to run artificial intelligence compute workloads.

The scale of computing power needed to work on large databases has grown from a single computer getting the job done to a minimum of 128 computers networked together to act as one, King said. For example, the enormous databases AI requires mean the computers need to be clustered together into a supercomputer.

“This is where Canada lacks this scale – they’re not building clusters,” he said.

The lack of supercomputing capacity is already resulting in Canada’s “brain trust” – talented researchers and graduates – relocating to the U.S. and other countries, King said.

“It’s absolutely happening now and it’s going to accelerate, not back off,” he added. “If we don’t put infrastructure on the ground, people will leave because they’ll go to the place where they have the tools they need.”

Canada falling further behind other countries

Other countries are recognizing the need to invest in supercomputing systems, including the U.K., which recently announced that it will invest £900 million in a cutting-edge supercomputer as part of a national artificial intelligence strategy. The “exascale” computer would be several times more powerful than the U.K.’s biggest computers.

Exascale supercomputers are the biggest and fastest systems in the world, costing about US$600 million to US$700 million, Grant explained. An exascale computer can do a quintillion (10¹⁸), or a billion billion, simple calculations per second. To put that into context, if everybody on Earth were to do one calculation per second, it would take four years to equal what the world’s fastest supercomputer can do in one second.
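That comparison is easy to sanity-check. A minimal back-of-envelope sketch in Python, assuming a world population of roughly eight billion:

```python
# Back-of-envelope check of the exascale comparison above.
# Assumptions: ~8 billion people, one calculation per person per second.
EXAFLOP = 10**18            # calculations per second for an exascale machine
population = 8_000_000_000  # approximate world population
seconds_per_year = 365 * 24 * 3600

years = EXAFLOP / population / seconds_per_year
print(f"{years:.1f} years")  # → 4.0 years
```

One second of exascale work divided among eight billion people does indeed come out to about four years of everyone calculating nonstop.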

The U.S. has one exascale supercomputer in operation, called Frontier, which cost US$600 million and came online in 2022. The machine takes up the space of two tennis courts at the Oak Ridge National Laboratory and is used for scientific research.

Several more exascale machines are scheduled to come online in 2024 – including one at Argonne National Laboratory in Illinois and another at Lawrence Livermore National Laboratory in California – as well as Europe’s first exascale supercomputer.

Canada has five major supercomputers – although none anywhere near the size of Frontier in the U.S. – which are part of a national research computing infrastructure coordinated by the Digital Research Alliance of Canada.

These supercomputers are housed at McGill University, Simon Fraser University, University of Victoria, University of Toronto, and University of Waterloo.

Innovation, Science and Economic Development Canada has acknowledged on its website that “There is currently a shortage of supercomputer power for Canada’s researchers, which means they are not able to undertake the world-leading science they otherwise could.”

The federal government is investing $50 million, disbursed across the five post-secondary institution supercomputer sites, to expand the national supercomputing infrastructure.

King said while the federal investment is welcome, “is $50 million sufficient to catch up to the latent demand and start to serve Canadian companies as well? No, it’s not even close.”

Denvr, a privately held company, has a capital investment plan of more than $1 billion for 2024, he noted. This year alone, the company is rolling out in the U.S. the equivalent of about 25 supercomputing clusters worth $50 million each.

Hessam Mirsadeghi (photo at left), a senior software engineer who lives in Canada but works remotely for California-based software company NVIDIA, said Canada’s shortfall in supercomputing power is even worse when it comes to GPUs, or graphics processing units. GPUs can process many pieces of data simultaneously and act as accelerators in high-performance computing systems.

The U.S. has 161 supercomputers on the Top 500 list, compared with 10 for Canada, four of which are reserved for government use, Mirsadeghi said. Of the six remaining Canadian systems, only three have GPUs installed, compared with 75 GPU-equipped systems in the U.S.

“GPUs are so important because with the changes we have seen in recent years, with respect to the computation that the world needs and the important workloads we have today, they can all benefit significantly if you use GPUs,” he said.

“Today, with AI advancements, accelerated supercomputing is now a necessity, not just a nice-to-have capability,” he added.

As for the $50-million federal investment, the government’s approach for the last decade or so of sporadically injecting funds into Canada’s supercomputing infrastructure and then forgetting about it won’t work, Mirsadeghi said. Government needs to commit to long-term funding to ensure the country has a modern computing infrastructure that’s consistently kept up to date, he said.

Supercomputing touches our lives every day

Grant pointed out that the $50-million federal investment will help only to increase the supercomputing power available for academic researchers engaged in basic science. Ottawa’s investment will not address the supercomputing needs of industry, national defense or national security, and other sectors, he said.

“I don’t think people realize how much supercomputing touches their lives every day,” he said.

During the COVID-19 pandemic, the two-metre safe separation distance between people was calculated on a supercomputer, based on models simulating how far virus particles expelled by a person sneezing would travel, Grant noted.

Environment Canada uses a supercomputer to make weather predictions, and models created by supercomputers help protect people from the impacts of natural disasters, including earthquakes and wildfires, along with climate change impacts.

A growing number of Nobel laureates have relied heavily on high-performance computing for their achievements. Supercomputers help with drug discovery and in genomics, with cybersecurity, and are used to train the models underpinning AI systems and tools.

A fast-growing application for supercomputers is building “digital twins” – virtual representations of real-world physical products, systems or processes that serve as their effectively indistinguishable digital counterparts. Digital twins can be used for simulation, integration, testing, monitoring and maintenance.

For example, digital twins were used to test the designs for Boeing’s 737 MAX airliner and other aircraft. In the aerospace and automotive industries, high-performance computing has dramatically reduced the time-to-market and increased the safety and reliability of new aircraft and vehicle designs.

“Being able to do those simulations with digital twins on big supercomputing systems really returns a lot of money. It’s quite good for your overall cost curve,” Grant noted.

Digital twins are already being used to design and build manufacturing facilities, procurement and supply chains, infrastructure (such as smart cities or a national power grid) and cybersecurity systems, to monitor space debris, for deep-space exploration, and in health care, for example as a digital twin of the human immune system.

Supercomputers provide huge financial return on investment

Each of the world’s largest supercomputing systems comprises tens of thousands of individual computers wired together in a network. How they’re wired together is a critical and unique component of supercomputers.

Grant is one of a few people in the world with the expertise to write the software for the application that connects all the computers in a supercomputer. Prior to joining Queen’s University, he worked at Sandia National Laboratories, a U.S. Department of Energy national security lab, helping to build the U.S.’s new exascale computing systems.

Grant said he saw first-hand how much revenue can be earned by providing supercomputing services. A study by Minnesota-based Hyperion Research found on average, for every dollar invested in supercomputing in the finance, manufacturing, life sciences and transportation sectors, the return is about $40 in business profits or cost savings, and $500 in business revenue.

“This isn’t a 20-year payoff, either. This is three- to five-year payoff,” Grant said.

“Companies in this area [of providing supercomputing services] can’t deploy equipment fast enough in the for-profit sector,” he added. “They literally just can’t get it on the floor fast enough. It’s sold before it hits the floor.”

King agreed, saying: “There is great demand from the world and Canada needs to get in line. The return on the capital investment is excellent.”

The global high-performance computing market, worth US$45 billion in 2022, is forecast to grow by five per cent per year between 2022 and 2032, reaching US$90 billion by 2032, according to a report by Global Market Insights.

If the public and private sectors invest in a world-class supercomputer in Canada, “they will not only earn a ridiculous amount of money on the return, but that business itself will be able to grow and not flow that money out of Canada,” Grant said.

Building a world-class supercomputer in Canada

Grant said he came back to Canada from the U.S. “to try and help my country, because I saw us falling behind [in supercomputing] and I didn’t want that to happen.” His dream is to build a world-class supercomputing system at Queen’s University.

Canada now has the critical mass of expertise to be able to actually build its own supercomputer, he said. “To me, it’s a no-brainer. Of course we should be taking this business in-house.”

The supercomputing system he envisions would be used for both business innovation and national defense, and be secure enough to safeguard highly sensitive datasets. At the same time, the system could be operated in a less-secure mode for scientific researchers who want access and for training highly qualified personnel – something other countries have done.

Businesses, including SMEs, would be able to obtain supercomputing services on a nonprofit supercomputer at far less cost than existing commercial services, Grant said.

Supercomputers consume a lot of electricity, but the supercomputer Grant wants to build would have a “green” advantage. It would use exclusively “direct-to-[micro]chip” water cooling, enabling the waste heat the system generates to be captured and used to heat nearby buildings.

Most importantly, Grant said, the supercomputer needn’t cost $600 million or $700 million. Using a modular approach, the machine could be built in phases starting with $50 million to $100 million, he said.

King said that Denvr Dataworks has spent the past seven years developing and deploying modular clusters of networked computers. “The notion of building a supercomputer and expanding it over time like a set of Lego bricks – that’s exactly what Denvr Dataworks has pioneered.”

The initial supercomputing cluster that Grant envisions would be “a great start,” and would be an economic driver attracting Canadian graduates, international talent and startups like a magnet, he said.

King said having the supercomputer would mean that skilled research groups at the Vector Institute in Toronto, Mila-Quebec AI Institute, Queen’s University and the Alberta Machine Intelligence Institute (Amii) in western Canada would be able to retain talent and keep new talent coming to Canada.

Denvr has been talking with Grant and is very interested in partnering with him, King said. “We’re a Canadian-owned company and we want to help build this infrastructure for the benefit of Canadians.”

Grant said the new supercomputer for Canada would provide a space for researchers to collaborate with industry, to translate their academic knowledge to industry while taking real-world problems back to the lab to find solutions.

“It’s that ‘last mile’ that needs to be addressed – that’s kind of where the magic happens,” he said. “We’re trying to take science from the lab to the store, to your shelf, to your computer where you get a new thing that you actually use.”

The key benefit of having a world-class supercomputer is, in one word, “competitiveness,” Grant said. “Supercomputers let us do what we need to do for the information economy, to make us a player. Either we have supercomputers and we are a producer in the information economy, or we don’t have them and we’re a consumer in the information age.”  

R$

