
Azeem’s 2024 Trends: AI, Energy, and Decentralization



AZEEM AZHAR: Hi there, it’s Azeem. Now, I have just come back from Davos in Switzerland, where I was attending the annual meeting of the World Economic Forum, as well as enjoying meetings and conversations with some of the 80,000 or so other visitors who descend on that ski resort for the week. It’s quite a tiring period. Across three days, I walked 64,480 steps in snow boots. That’s more than 45 kilometers, more than 30 miles, which was reasonably tiring. And I spent many hours, probably more than 12 hours a day, talking to really interesting people, many of them listeners of this podcast and readers of the newsletter. We talked about AI, its impact on jobs, productivity, the economy. We talked about climate change and the climate crisis. We talked about new materials, batteries, electrification. We talked about how those trends are taking hold in different parts of the world. The overwhelming sense I got is that a series of crises is brewing. That’s perhaps why so many leaders gathered in Davos: it’s an opportunity to congregate, talk those crises through, and find points of commonality. But alongside those crises is a sense of real excitement. It’s excitement driven not just by the technologies at scale, but also by some deep, deep science that was on display. And all of that is finished off with a dusting of uncertainty.

Now, in a sea of uncertainty, that sense of the unknowable, I think a horizon scan can help people make better decisions. It’s more helpful than a rear-view mirror or any point predictions. The point about looking out over the horizon is that there are things you can see approaching. They’re not very clear, and they may not materialize, but you have a chance to veer towards them or course correct.

So, what I wanted to do was dedicate this episode of the podcast to a special post-Davos review that allows me to take you through my 2024 outlook. To give you a sense of how I got here: it’s based on nearly three decades in industry. This September, I will celebrate 30 years of working, the last nine of which have been focused entirely on building the thesis behind Exponential View and investing in my team’s research capability and a lot of the original thinking that we do. Beyond having been in the industry for so long, and now being focused on EV, I also listen to a lot of people. It’s not just the conversations we have on this podcast; it’s also the ones behind closed doors. I had hundreds of conversations in 2023, which have helped inform this 2024 analysis.

And those conversations were with a really wide variety of people. It included cabinet level ministers and heads of state, the leaders at pretty much every one of the large AI labs, as well as a couple of dozen AI startups, public and private market investors in Europe, Asia, and the US, venture capitalists right across the spectrum, the US, Silicon Valley, the UK, and Asia, and policymakers responsible for AI regulation, net zero, and energy transition policies in a number of those markets. I also spoke to senior execs at large public firms, leading scientists in academia, in AI, energy transition, biotech, geopolitics, political science. In other words, it’s a broad and deep group of people I had a chance to learn from.

Now, in my full outlook, I cover 12 themes, from AI adoption to geopolitics to decentralization and the energy transition. In today’s podcast exclusive overview, I’m going to walk through a few of those themes, but if you want to go through all 12, see the link in the show notes. Or you can visit this short URL. It’s bit.ly/24outlook. That’s bit.ly/24outlook. So, let’s get started.

Theme one is electrifying everything. Now, we’re at the very peak of fossil fuel use globally. Coal, which powered the industrial revolution from the mid-eighteenth century, will be in decline from now on. And as we defossilise, technology-driven energy systems like solar will guarantee a declining price ceiling for energy, freeing us of the vagaries of commodity-driven energy provision. Solar power capacity additions are racing ahead. Renewable investment has exceeded fossil fuel investment for six years. Chinese solar panel prices dropped some 40% in 2023, to as low as 12 cents a watt, strengthening the case for the ongoing deployment of solar. The rapid improvements in the techno-economics of solar are making it increasingly appealing and making forecasts progressively ropey. External forecasters are having to revise their estimates upwards. Our own internal models at Exponential View are more bullish than many of these because we place more weight on learning effects and positive feedback loops.
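As a side note on what “learning effects” means in practice, here is a minimal sketch of Wright’s law, the empirical rule that unit costs fall by a roughly constant percentage with every doubling of cumulative production. The starting price and learning rate below are illustrative assumptions chosen for the example, not Exponential View’s actual model parameters.

```python
# Minimal Wright's law sketch: unit cost falls by a roughly constant fraction
# with each doubling of cumulative production. All numbers are illustrative.

def cost_after_doublings(initial_cost: float, learning_rate: float, doublings: int) -> float:
    """Unit cost after `doublings` doublings of cumulative production."""
    return initial_cost * (1.0 - learning_rate) ** doublings

# Illustrative assumptions: modules at $0.12/W today, ~20% cost drop per doubling.
start_cost, learning_rate = 0.12, 0.20
for d in range(6):
    print(f"after {d} doublings: ${cost_after_doublings(start_cost, learning_rate, d):.3f}/W")
```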

Electric vehicles are on a tear, with more than 40 million in use. Now, that’s true even though there are some mumblings and rumblings from Hertz, for example, and Ford and GM and some of the European automakers about the pace of the electric vehicle rollout. But the numbers don’t lie. And compared to earlier years, consumers can choose from a more comprehensive selection of electric vehicles, and a virtuous cycle is taking hold. As the market grows, more firms enter, they compete, and they offer consumers diversity and innovation. Prices come down, and range, probably the most critical buying factor for an EV, increases. Much of this has been helped by the consistent decline in battery pack prices. Cheaper alternatives like sodium-ion batteries will contribute further to downward price pressure.

Now, technology transitions typically follow an S-curve. And since the 20th century, many of these transitions have taken eight to 14, maybe 15 years. This was true for the replacement of horses by cars, of Sanger gene sequencing by next-generation sequencing, and of feature phones by smartphones. And it has happened even more quickly as electric vehicles have replaced internal combustion engine vehicles in markets such as Norway. With so much more consumer choice in EVs, with the positive feedback loops accelerating, and with owning and maintaining a gas car concomitantly becoming less attractive, it wouldn’t be unreasonable to see transitions taking less than a decade from the point at which roughly 5% of all new cars sold are electric.
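To make the S-curve arithmetic concrete, here is a small sketch of a logistic adoption curve. The growth rate and midpoint are illustrative assumptions, chosen only to show how a market sitting at roughly 5% share can sweep to the vast majority within about a decade.

```python
import math

def adoption_share(year: float, midpoint: float, rate: float) -> float:
    """Adoption share (0..1) in a given year on a logistic S-curve."""
    return 1.0 / (1.0 + math.exp(-rate * (year - midpoint)))

# Illustrative assumptions: midpoint at year 5, logistic rate of 0.6 per year.
midpoint, rate = 5.0, 0.6
for year in range(0, 11):
    print(f"year {year:2d}: {adoption_share(year, midpoint, rate):5.1%}")
# Roughly 5% in year 0, ~50% by year 5, ~95% by year 10.
```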

The impact of stable and declining electricity prices, divorced from fossil fuel commodity volatility, will be profound. It’ll allow for longer-term investment decisions in under-electrified sectors such as heating and industrial processes. These sectors could benefit from learning curve effects as demand grows. For example, colleagues at the Oxford Martin School, where I’m a visiting fellow, reckon that electrolyser costs could drop tenfold by 2040. We ultimately need electrolysers to split water into hydrogen and oxygen and then to do useful industrial things with the hydrogen.

Now, if you’re a business leader, there are questions you could be asking to help you get onto the upside of this trend. Are you planning for an appropriately rapid transition towards electrification? Have you oriented your business towards an electrified world of stable and declining power prices? In any part of your world, do these changes leave you with soon-to-be stranded assets?

So, the second theme is the corporate AI agenda. And what I’d like you to do is consider two numbers: six and 92. These two numbers speak to the pace of corporate adoption of new generative AI tools. In research conducted in the summer of 2023, Tom Davenport of Babson College surveyed 334 chief digital officers about how they were using generative AI. Now, even though this technology has only recently been commercialized, remarkably 6%, one in 16, had a generative AI use case in production deployment. Now, understand how important 6% is as a threshold. Technologies are not adopted linearly. They go through an S-curve: a period of exponential installation, after which the curve slows down as the laggards finally catch up. That ramp has typically started around the 6% level.

The second number to understand is 92%. In November, OpenAI announced it was supporting 2 million developers, including teams from 92% of the Fortune 500. This is grassroots interest. Roughly 80% of large firms that do not have generative AI in production deployment nonetheless have developers playing with OpenAI’s tools. Davenport’s data doesn’t overlap perfectly with the Fortune 500, but it’s a helpful proxy. The stark gap between CDOs’ awareness of employees experimenting at an individual level, less than 30%, and OpenAI’s 92% number suggests, and my informal conversations bear this out, that some bosses in the middle are unclear about how far frontline developers have moved.

Now, beyond the technologists, other members of the C-suite are excited by the potential of this new wave of tools. When I spoke to several dozen European CFOs in the summer of 2023, the interest in generative AI as a tool for productivity was palpable. Even the top bosses have been electrified. The C-suite seems to be leading in employee adoption. Now, this certainly wasn’t the case with the internet in the 1990s, when senior executives had to be dragged onto the web kicking and screaming. In 1999, I was helping a major telco, and the then CEO told me he would never allow customers to pay or even access their bills online. He was right. He left the business a year or so later.

Now, this combination of eagerness from the top and grassroots developer adoption from below creates a high-octane mix. In 2024 and beyond, that 6% will inexorably rise towards the 92%. It may take a few years, because it does take time to roll out IT projects. Talent is short, prioritization is challenging, but the groundwork has been laid. Points for listeners to consider: firms will be under pressure to deliver robust applications on a technology that is in many cases not robust and is facing various types of legal challenges. At the same time, more straightforward applications will be available to companies. If these projects succeed in improving productivity, there may be an immediate impact on the rate of new hires, or even signs of more widespread job cuts. 2023 was too soon to see those impacts. This coming year might bring more clarity.

My next theme is the business model of AI. Now, OpenAI grew from a run rate of $1.3 billion in October 2023 to $1.8 billion by the end of the year. That’s pretty fast. Anthropic, another firm peddling LLMs, is on course to make $850 million in 2024, about four times higher than its number in ’23. These are small, but they’re real numbers in the context of the tech industry.

The New York Times lawsuit against OpenAI for copyright infringement will be a critical test case to help us understand one aspect of this business model. It doesn’t seem unreasonable for there to be some kind of settlement between the two firms. After all, the New York Times and others have put significant effort into creating really unique insights. If courts believe that the high-water mark set by the music industry is reasonable, there’ll be a fracas. Spotify pays a punishing 70% of revenue in license fees. It seems more likely that any agreement will be lower than that. Apple is reputedly negotiating a $50 million multi-year deal with the Times to license material for its AI models. But before anyone gets too excited about that, consider what such deals mean and which firms will be able to afford to play in this field. Only those with the biggest platforms can afford the fees, and only the largest publishers will have the legal and commercial teams to secure those licenses and the preferential access that comes with them.

But the right way to look at this case is as a lens on the exponential gap. Copyright law has been straining for decades, increasingly so with the advent of media convergence, digitization, and the internet. Under AI, it could become shambolic. To apply it in its current guise, in its most literal sense, will not give good long-term outcomes. Instead, we need systems that create incentives for creating new things, which could or may extend to the kind of rights that emerge as those new things get remixed into more new things. We’re well behind on tackling this. The Creative Commons movement is more than 20 years old. And as Jeff Jarvis, a former professor of journalism and expert on media in the internet age, argued in his guest essay in my newsletter, Exponential View, we should consider a shift from a world of copyright to credit right. And VC investor Albert Wenger argues that reforms could include automatic attribution of authorship for free, but any further restrictions by the copyright holder could require paid registration.

Now, these are just a couple of the ideas that we ought to put into the pot. A stringent application of 20th century copyright law to a 21st century technology could be suffocating, like discouraging the use of a telescope because it might upset a set of ecclesiastical pronouncements. It would certainly create legal risks for firms using the technology and challenge the richness of open source and ultimately impact its deployment.

Now, copyright is only one crucial aspect of the business model question. Another is the extent to which the Amazon, Google, Apple, you name it, digital tax that pervades the internet economy will extend into the AI economy. We’ve benefited from the hygiene and clarity of Apple’s App Store to be sure, but was a 30% Apple tax, since reduced in some cases, the right way to structure the digital economy? As artificial intelligence becomes infrastructural, how do we avoid both the economic toll gate and the political control that could be handed over to these tech providers?

For a sense of just how large this cash-and-power lodestone could be, consider that in these early days, OpenAI is doing $1.8 billion in revenues, growing insanely fast, with a valuation approaching a hundred billion dollars, paying engineers $10 million a year, [inaudible 00:13:42] around a thousand employees. In this world, what could an AI tax end up looking like?

My next theme is about compressing time with scientific AI. Now, DeepMind’s AlphaFold increased the speed of discovering protein structures by about a million percent or so. Their GNoME toolkit unlocked more than 380,000 stable crystal structures back in 2023, roughly 65 years’ worth of human effort. The impact of these two developments has yet to be priced in. The contribution of AlphaFold could be that more applications, or whole new application fields, get developed, for example helping us design protein machines with complex or specific functions, design new organisms, and help with disease treatment. Even so, AlphaFold 2 is not a panacea. There are lots of other issues in getting proteins to work that still need to be resolved.

Now, GNoME was released at the end of November 2023, so it’s early days to assess its potential. In fact, since its release there have been some questions about just how useful it is. But as Madonna pointed out, we are living in a material world. Vaclav Smil and others have made this case compellingly well. Today our economies rely on vast quantities of bulky, polluting, energy-intensive materials: steel, plastics, and cement, materials humans have used for millennia, at least 12 millennia in the case of cement. Discovering stable new materials that are thermodynamically plausible and can do useful things for us is hard. And so it’s quite interesting that, by the time I wrote my Horizon Scan, researchers had already synthesized more than 700 stable materials identified by GNoME. So perhaps new pathways are emerging.

Both AlphaFold and GNoME enable vastly accelerated discovery of potential candidates, reducing the cost to synthesize, test, and ultimately use them. Even though AlphaFold has been available longer than GNoME, proteins are more fiddly than crystals, and they operate in more complex biological environments. And for many applications of proteins in that biological domain, which is largely healthcare, regulatory scrutiny is necessarily deep. This makes the case that tools like GNoME, operating in the materials domain, could have a more immediate impact.

Beyond these discovery tools that work in silico, we also need to do things in vitro: real experiments and real physical synthesis. Much current laboratory automation is really just scaling throughput with brittle, older-generation robotics. Experiments like the AI co-scientist could expand AI’s role in scientific inquiry. This emerging kind of tool is an autonomous research system that can process complex instructions from the scientific literature, help design and physically execute experiments, and even refine them based on feedback.

So, my next theme is about small AI. Now, while GPT-4 and Gemini Ultra demonstrate what can be done with extremely large AI models, we’re also witnessing a common pattern in the evolution of technologies: optimizations that lead to miniaturization. French open-source provider Mistral has already shown GPT-3.5-level performance from its open-source models, including Mixtral, a sparse mixture-of-experts model built out of 7-billion-parameter experts. I have a model that is nearly as good as GPT-3.5 that runs locally on my Mac laptop. Microsoft also showed off Phi-2, another small model that matches or outperforms models up to 25 times larger thanks to new innovations in model scaling and training data curation. That’s what Microsoft says. The Phi-2 team relied on extremely high-quality, textbook-grade data. We’ve also got new architectures like state space models. They’re different from the transformer that powers today’s LLMs, but they’re much smaller than the leviathans of GPT-4 and its cousins.
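For a flavour of how accessible these small models have become, here is a minimal sketch of local inference using the open-source llama-cpp-python bindings. The model path is a placeholder for whichever quantized GGUF checkpoint you have downloaded (a Mistral 7B build, for instance); this illustrates the pattern rather than any exact setup.

```python
# Minimal local-inference sketch using llama-cpp-python (pip install llama-cpp-python).
# The model path is a placeholder for whatever quantized GGUF checkpoint you have downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",  # placeholder path
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload layers to the GPU (Metal on Apple silicon) if available
)

response = llm(
    "In two sentences, make the case for small, locally run language models.",
    max_tokens=128,
)
print(response["choices"][0]["text"])
```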

Smaller models will be in demand. They’re cheaper to run and can operate in a wider array of environments than the mega models. For many organizations, smaller models might also be easier to use because they’re easier to diagnose and easier to control. This will create many opportunities, and it’ll also create space for new businesses. How do we test, manage, maintain, and govern them?

And so the final theme for this podcast special is the idea that decentralization is gathering speed. Now, the personal computer and then the smartphone democratized access to computing. No longer did you need a multi-million-dollar budget and an account manager at IBM or Sperry to secure a mainframe. With that democratization came a huge expansion of the computing market. From software to services, everyone had access to the bicycle of the mind. As the history of the blockchain, and indeed of democracy, has shown, decentralization of resources, and of the power that goes with them, is an appealing idea.

Now, coming soon will be decentralized AI systems: local AI services running on your phone or laptop, no internet access required. As I mentioned, I already run a variety of LLMs on my MacBook Pro. Generalized language models for local execution could become productized and enter operating systems. Google’s Gemini Nano already runs on Android devices. Apple has released its optimized machine learning framework, called MLX, which should bring the same sorts of technologies to iPhones. Small is indeed beautiful. Decentralized AI systems running open-source LLMs may operate across the fabric of the internet. And I think what that will do is see developers experimenting with distributed agentic systems, these agents being other LLMs or more narrow AI tools that are orchestrated by traditional software or by LLMs themselves. Think of it as the creation of almost a new kind of computer, in which the LLM acts as the central processor.
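To make the “LLM as a central processor” picture a little more concrete, here is a toy sketch of one orchestration pattern: a routing model chooses which narrow tool should handle a request, and ordinary software dispatches to it. The tools and the `ask_local_llm` call are hypothetical placeholders, not any particular product’s API.

```python
# Toy sketch of an agentic orchestration pattern: a routing LLM picks which
# narrow tool handles a request, and plain code performs the dispatch.
from typing import Callable, Dict

def summarise(text: str) -> str:
    return text[:80] + "..."            # stand-in for a narrow summarisation tool

def translate(text: str) -> str:
    return f"[translated] {text}"       # stand-in for a narrow translation tool

TOOLS: Dict[str, Callable[[str], str]] = {"summarise": summarise, "translate": translate}

def ask_local_llm(prompt: str) -> str:
    # Hypothetical placeholder: in practice this would call a locally hosted
    # model and return the name of the tool it judges most appropriate.
    return "summarise"

def orchestrate(request: str) -> str:
    tool_name = ask_local_llm(
        f"Pick one tool from {sorted(TOOLS)} for this request: {request!r}. "
        "Answer with the tool name only."
    )
    tool = TOOLS.get(tool_name.strip(), summarise)  # fall back to a default tool
    return tool(request)

print(orchestrate("Summarise this week's notes on decentralized AI."))
```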

Now, we’ve seen architectural shifts like this before in history. The phone network, with its centralized switching intelligence, gave way to the internet, which pushed the smarts to the edges. So too we might see a similar dance. Consider OpenAI’s GPT-4 or Google’s Gemini as a gigantic switchboard for AI services. The shift to a decentralized architecture on the internet unleashed innovation at the edges, along with the stack of economic rewards and governance issues that have kept me busy for a couple of decades.

Now, this process isn’t happening only on the internet and, soon, in the AI world. The energy system is undergoing a similar process of democratization and decentralization. Thirty years ago, you needed billions of dollars and tons of contacts if you wanted to get into the electricity-generating business. Combined-cycle gas turbines are not cheap, and you have to figure out how to connect them to the grid. Well, today you can do that with rooftop solar for $10,000. But that’s a separate theme, and I tackle it in my 12-Point Outlook for 2024. The rest of that horizon scan includes my expectations for blockchain, how to think about AI during the election year, and the role of temperance technologies, which I find fascinating, and how they might form a layer between human speed and the exponential pace of technology.

So, for the full outlook, go and visit this short link. You can find it via bit.ly/24outlook, or grab the link in the episode notes below, in your podcast app of choice. Next week I’ll be back with a brand-new discussion. Until then, a belated Happy New Year, and goodbye.
