NEXT Canada, GRIT Ed.06, Brain Gain vs. Brain Drain - Magazine - Page 7
A ROBUST COMPUTE
Professor Graham Taylor
Academic Director, Next AI; Interim Research Director and Canada CIFAR AI Chair, Vector Institute for Artificial Intelligence; Professor and Canada Research Chair in Machine Learning, School of Engineering, University of Guelph
NEXT provides many forms of support to its community of founders: funding, space and, importantly, access to renowned academics, a deep mentor network and a diverse alumni community. Still, every year we survey our entrepreneurs to ask how we can further support them, and every year we receive the same message: startups need access to compute.
It is no secret that progress in AI is bottlenecked by computing power.
As AI's performance improves, in some cases to near-human levels, and its applications reach into new industries, AI models grow more complex, driving demand for more compute. It is common to see early-stage startups in Next AI spending $2,500 to $10,000 monthly on US-based commercial cloud computing. More established SMEs spend much more. In effect, this funnels government and private support for Canadian startups directly to US tech giants. A robust homegrown compute ecosystem will be a key differentiator in attracting and retaining Canadian startups.
In a recent Globe and Mail op-ed, my Vector Institute colleagues Garth Gibson and Ron Bodkin used OpenAI's GPT-3 as an example of the scale of compute needed in modern applications of AI. Estimates suggest it costs more than $3.5M to train this model, even using an economical cloud computing option.
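That $3.5M figure is roughly consistent with a back-of-envelope estimate. As a sketch only (assuming GPT-3's published 175-billion-parameter size, roughly 300 billion training tokens, the common 6·N·D approximation for training FLOPs, and illustrative, assumed values for effective GPU throughput and cloud pricing):

```python
# Back-of-envelope GPT-3 training cost estimate.
# The parameter and token counts come from OpenAI's GPT-3 paper;
# the throughput and price figures below are illustrative assumptions.
params = 175e9                    # GPT-3 parameter count
tokens = 300e9                    # approximate training tokens
flops = 6 * params * tokens       # common 6*N*D training-FLOPs approximation

effective_flops_per_gpu = 37.5e12  # assumed: ~30% utilization of a 125 TFLOP/s GPU
price_per_gpu_hour = 1.50          # assumed discounted cloud rate, USD

gpu_hours = flops / effective_flops_per_gpu / 3600
cost = gpu_hours * price_per_gpu_hour
print(f"{gpu_hours:.2e} GPU-hours, ~${cost / 1e6:.1f}M")
```

Under these assumptions the estimate lands in the millions of dollars, in line with the figure Gibson and Bodkin cite; different utilization or pricing assumptions shift it, but not by enough to change the conclusion.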
But the compute bottleneck is not limited to top AI
research organizations like Vector. In March, OpenAI claimed that more than 300 applications, many
built by startups, are using GPT-3. While using a
pre-trained model reduces some of that $3.5M
cost, deploying language models in specific do-