Elliott says Nvidia is in a ‘bubble’ and AI is ‘overhyped’
by Canary Wharfian, August 10th 2024

Hedge fund tells clients many supposed applications of the technology are ‘never going to actually work’

https://www.ft.com/content/24a12be1-a973-4efe-ab4f-b981aee0cd0b

OK, so I work in tech and wanted to share this one as it's quite close to home. I largely agree with the sentiment of the article: I don't think general AI will ever materialise, no matter how hard interested parties push the idea. There is a limit to how far statistical and probabilistic models can be pushed, even with an infinite amount of computing power.

I think this story will play out similarly to how bitcoin did, although we haven't seen the end of it yet. Plenty of people compared bitcoin to other historical bubbles, like the dotcom bust at the turn of the century or the Dutch tulip mania of the 1630s, and yet bitcoin is still here.

Another interesting parallel story: https://finance.yahoo.com/news/intel-ceo-fires-back-nvidia-040943532.html

AI also consumes a ton of energy, and I think this could put a limit on the industry's growth, especially now that ESG is receiving a lot more attention (https://www.cnbc.com/2024/05/15/microsofts-carbon-emissions-have-risen-30percent-since-2020-due-to-data-center-expansion.html). Data centres (the facilities powering LLMs) are already a significant and growing contributor to global electricity demand and emissions.

Now, I have seen some great applications of generative AI (one of the main flavours of AI) across different domains: patient consultations at healthtech startups, contract review at legal tech startups, and assistance with writing tenders. The reason AI is useful is that it saves time, not that it's super smart. It's also very good at summarising long texts. That's about it.

AI (or, in technical terms, machine learning and natural language processing) is also nothing new; it has been around in some shape or form for decades. The main thing stopping it from being commercialised was the lack of computing capacity powerful enough for deployment at scale. It was recent advances in hardware that enabled the big data revolution, not some groundbreaking academic research. So betting on AI is really betting on tech: the ability to generate far more data, and to process it at the same time, is what will deliver value to businesses.