
Check out how a computer chip maker is providing the better brains AI runs on

Untether AI is a rare breed among the many AI outfits that have sprung up over the past decade.

For starters, it is Canadian in a field dominated by American companies. It also isn't trying to build the next ChatGPT.

Instead, Untether, headquartered in Toronto, produces the grey matter needed to run any AI program — specialized computer chips. Thanks to the ubiquity of AI, these chips are needed everywhere. One notable place is under the hood of GM’s autonomous vehicles.

Last year, Untether AI announced a partnership with the auto giant to produce AI perception systems that are used to help autonomous vehicles find their way around without human assistance.

But the future of AI chip design goes far beyond Untether. Tech superpowers like the U.S., Taiwan and China are pushing the boundaries of chip design in what looks like the 21st century's space race, or arguably its nuclear arms race. About 90 per cent of the semiconductor world outsources its production, and the largest factories tend to sit on either side of the new Cold War.

Untether AI's contribution to today's chip industry is described as energy-efficient yet affordable. With AI set to dominate computing over the next five years, the industry's electricity consumption will go through the roof.

A low-footprint chip capable of handling the most advanced AI’s operations without melting into a shiny puddle is, in Untether AI CEO Arun Iyengar’s view, essential to lightening the load.

Are there fundamental differences between the chips people have in their cars, and the chips your company is designing?

There are all sorts of chips, just like there are all sorts of books. If you want to learn Russian, you wouldn’t pick up an English dictionary. From a silicon perspective, you need to figure out where and what you’re using a chip for. So the answer is, yes, you end up using different types of chips in a lot of different types of scenarios.

We are actually very, very different from other chip makers. We're one of maybe three Canadian chip companies that were founded here. Startups also typically don't try to do simple things. We've taken on a very big, complex thing. What we focus on is artificial intelligence usage. In artificial intelligence, there are two components: coming up with a model, and then using that model. We're on the use side of the equation. That could mean autonomous vehicles, smart cities, smart retail or robotics. Those are all good examples of applying a trained model.

Why are there so few Canadian chip companies?

Canada’s got great AI talent on the software side. If you close your eyes in downtown Toronto and throw a rock, you’ll probably hit an AI shop that’s doing software — and that’s great. But AI is the one discipline that’s going to change the way hardware and software work together. It used to be that when you designed something for the internet, you didn’t really care about the underlying hardware. You could use anyone’s processor — anyone being Intel.

AI is different. You need specialized hardware to run it. So I think what's happened in Canada is that everybody's focused on software. The hardware has been an afterthought. These companies assume somebody somewhere else will make the hardware, and they'll just use it, because they think Canada doesn't have that capability.

We really thought it was silly for a company to be based anywhere except in the hub of AI software land. There’s a lot of usage that happens right here.

How do your AI chips differ from what’s already on the market? Are they more resistant to heat? Are they faster?

Now you're getting into why we're in existence, which is the right question. Companies like Nvidia, AMD and Intel make graphics processing units, or GPUs. They are typically used for AI today. GPUs are built on what's called the von Neumann architecture, a really powerful, old-school design that has worked very well for the semiconductor world from the 1950s until now.

It's called a load-store architecture, which basically means you have memory outside the chip, processing inside the chip, and a very long, narrow connection between the two. It's kind of like the Don Valley Parkway when it gets clogged: you just sit there, burning energy. And you end up with very low utilization when it comes to AI.

If AI is deployed on this general-purpose von Neumann architecture, you end up with a hidden energy crisis. It will actually take away the ability to deploy AI and make it available to the global population. It'll be more for the elite.

What’s this energy crisis?

You've heard of ChatGPT? Running ChatGPT for a month on traditional chip technology consumes the same energy as powering a town of 175,000 people for that month. Nobody talks about that, because ChatGPT is amazing. It's awesome. But this is a problem. This is the crisis that's in front of us if we don't have a more efficient way of deploying artificial intelligence.

That's where we come in. We basically blow up the von Neumann architecture and come up with a completely different approach that puts memory and data processing right next to each other, so data movement, and the energy it consumes, is drastically minimized. The data moves such a tiny distance that you can't even see it with the naked eye. In a traditional von Neumann architecture, 90 per cent of the energy going into the chip is spent moving data. We take that to pretty much zero.

You’ve worked in the U.S., and now you’re in Canada. Are there any major differences between the AI sector in both countries?

When I joined the company, the idea of a chip company in Canada made no sense to me. To me, a chip company needed to be in the U.S. because, guess what, that’s where Silicon Valley is, and we should be in the land of silicon. So I thought maybe I’d have the software team in Canada and move the hardware side into Silicon Valley. It’s easier to find software engineers in the Toronto area because it is a hotbed for a lot of good AI talent. It’s not as straightforward to find hardware talent.

Very quickly, I realized moving the company to the U.S. made no sense, because the talent I found here, and continue to find here, is amazing. I'm also seeing people who spent time in the U.S. saying they want to come back to Canada, because that's where they're from.

We're creating those opportunities for them. We're giving them the chance to work at a cutting-edge company working on silicon, the equivalent of what I'd be doing at a startup in Silicon Valley. That's been a huge magnet, if you will, for people to say: 'Wow, this is pretty cool.' When they actually come in — and these are people who have worked at companies like Google — they think what we're creating is very, very interesting technology.

The Biden administration is really interested in setting up semiconductor fabricators in places like Ohio. What kind of impact would it have on the AI sector if we did that here?

That would be the Canadian government basically saying that semiconductors are going to be the backbone of everything we do. And if we don’t own the supply chain, we’re at the mercy of somebody else. It would be great. The three Canadian chip startups I talked about earlier would probably grow to 30 the day it gets announced, and would probably be 300 when the factory actually started functioning.

There are some massive chip makers in the semiconductor arms race, like Nvidia. How do you keep up with a giant like that if you’re a startup in Canada?

The way we answer that question is not necessarily by saying: 'Let me tell you how we're better.' It's: 'Let me show you how we're better.' I'll give you an example. We work with a smart retail customer that was looking for a way to bring more cameras into a store and capture theft, or whatever the case may be.

What they found is that they were limited to a certain number of cameras with their existing chip. For the same amount of power they were using, we could support six times as many cameras. All of a sudden, every aisle could have a camera, and not just for theft deterrence. You could just come by, wave your card, and the store would know what you had already picked up. These are use cases customers could not handle with their existing implementation.

Are there any supports you’d like to see from the federal government to make the AI space better in Canada?

I think a lot of it starts with universities, and the Canadian university system is really, really good. We get a lot of really good talent coming out of the University of Toronto and the University of Waterloo. On the semiconductor side, you need to decide that it is critical to the country's success.

If you have that as your DNA, which is something the U.S. has really moved toward, and something China did about five years ago, then your policies would change automatically. I wouldn’t have to pick the one thing I need. Recognize that semiconductors are going to be the DNA of progress.

This article was reported by The Star.