Excited about NVIDIA upside, and reflecting on virtualization
Publishing note: Starting today, I will generally bundle notes into a single weekly publication, which will come out on Thursday mornings.
NVIDIA results
NVIDIA reported today, and it was another great quarter. Sell on good news, they say, but I remain a holder. I am sure other people will dig through their numbers and find many interesting signals. In my view, we’ve only realized a tiny fraction of the potential of AI; AI consumption and NVIDIA stock will continue to head up:
- Consumer use is still in the very early adoption phase. I am the only person in my extended family using AI explicitly or regularly. There is nothing relevant about it for anyone else yet — it doesn’t help them manage people or run small businesses or provide counseling or run a workshop or care for dogs or any other life activity. Some would consider this and say, “AI is overhyped.” I consider this and say, “Oh my gosh, the upside is tremendous.”
- Business applications are not very deep yet. When I set up a company a year ago, I used Stripe Atlas to help automate all the registration and legal nonsense you must do. But Stripe Atlas doesn’t go very far in automating the setup and operation of a company; after 1-2 days, you have blown through its help and you are on your own. I dream of the day when a future version of Stripe Atlas gives me an AI GC, an AI Controller, an AI VP of HR, and other AI role placeholders and operates a lot of the company for me, giving me daily or weekly reminders about the few things I have to do.
- AI is also only reactive so far. I have to bring AI into my workflow explicitly: proactively asking an AI tool to critique my writing, provide some imagery, etc. That is starting to change; continuously active tools like GitHub Copilot and Grammarly are very nice, though they still operate at a superficial level. I trust Grammarly to tune up my spelling and grammar, but that is it. There is unlimited upside here.
I am also bullish on NVIDIA because they have executed very well and continue to execute very well, though some of this may be shorter term:
- NVIDIA took advantage of Intel’s missteps to become the provider of the most strategic FLOPs for the industry. Books will be written about the rise of NVIDIA and the fall of Intel; heck, books have already been written on it. Intel is desperately trying to catch up, but the gap between Intel and NVIDIA has developed over a decade or more and won’t be easy to fix.
- NVIDIA continually invested in its software stack and provided incredibly relevant software to developers. CUDA and its friends have been used by every AI developer and continue to expand, creating stickiness for NVIDIA. Intel had many software adventures but never made an Intel-specific stack that was critical to developers, so it created no stickiness.
- NVIDIA invested in a developer community: tools, events, information, etc. The energy and enthusiasm at NVIDIA GTCs and other events are great. Communities can wane, and developers are fickle, but NVIDIA has done a great job to date.
Many competitors are focused on NVIDIA’s growth and margins, charging hard to displace them, but it is not easy. And with the demand for AI growing substantially, NVIDIA should be fine for quite a while.
Virtualization matters
VMware Explore is happening this week, and almost no one cares; there is little or no news coverage. VMware is a faded star; it used to be a vital tool for the industry, but that was long ago. VMware will probably be a great business forever, but it is not much of a growth story anymore.
But virtualization still matters, a lot. At various times, teams I’ve been on have depended on Windows’ MS-DOS emulation, Sun’s VirtualBox, VMware, Docker and its variants, Parallels, WSL, Kubernetes, and some I have probably forgotten. My personal greatest virtualization hits in the PC era:
- Windows would have been dead in the water without great MS-DOS virtualization. You could run great Windows apps like Excel, but you still needed the DOS box to run games and old apps. Without this capability, nobody would have run Windows. Game support was essential, and DirectX didn’t arrive until Windows 95, so a great DOS VM was mandatory. The step up to Windows 3.x and 386s made the DOS box even better (and the foolish insistence on supporting 286s was a boat anchor for OS/2).
- The growth of Azure and AWS was all about their ability to host arbitrary Linux (and other) machines. They’ve since expanded to many different services, but hosting VM instances is at the core of those businesses. This capability allowed the cloud services to take on legacy computing loads and allowed for (relatively) easy movement of loads between private hosting, public hosting, development hosting, etc.
- Mobile app development and testing are much faster thanks to emulator and simulator VMs. I’ve read that the Xcode iOS Simulator is not a full VM but the Android Studio emulator is. I don’t know what the truth is, but the ability to quickly fire up and run your built app on various emulated target devices is a huge productivity lever. And the upcoming iOS 18 feature to mirror your phone apps on your Mac is strangely useful.
- Using Docker to try out new software. It is so easy to spin up a container and run some random open-source project rather than polluting your machine or fighting dependency conflicts (a throwaway example follows this list). Docker has saved me so much pain. Maybe a container is not full virtualization, but for a massive range of use cases, it is close enough.
- Using Docker containers in your dev process to spin up candidate builds. The ability to stand up service instances on your machine and quickly test an end-to-end transaction was a game changer.
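To make the try-it-in-a-container point concrete, here is the kind of throwaway session I mean. It is only a sketch; the image tag and the package are placeholders, not a recommendation of any specific project:

```
# Start a disposable Python container; --rm throws it away on exit,
# so nothing touches the host install.
docker run --rm -it python:3.11-slim bash

# Inside the container, experiment freely:
#   pip install <some-project-you-want-to-try>
#   python -m <some_project> --help
```

When the shell exits, the experiment is gone and my laptop is exactly as it was.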
Conversely, I have been hamstrung on projects when I didn’t have excellent virtualization support:
- Automotive software development. The auto industry is way behind on virtualization. Too often, in-car software cannot be tested without being in a car or without using an expensive and scarce hardware emulator. As a result, the development of in-car software is slow and costly. The problem is harder than classic virtualization; the VM has to have a virtualized car underneath it, one that can generate automotive events and consume automotive signals. But millions of man-years have been wasted for lack of it. OEM software efforts have stumbled time and time again, and this is a big part of why. Virtualization is not only a development concern: no one at an automaker can “self-host” on the infotainment systems under development. Every person at an automaker should be running the infotainment software they expect to deliver in their next car on their desk, but they can’t due to a lack of quality virtualization. They have no idea what the customer experience will be.
- Open source projects that don’t provide a Docker container. Thankfully, these are fewer and fewer, but I still run across projects that don’t offer a container definition, including some recent ML and AI projects. What a waste of time. One person could do a little work and make it easy for thousands of people downstream to try out the project (a minimal example of what I mean appears after this list).
- Academic papers that don’t provide the source code and a Dockerfile to run it. I have little confidence in reproducibility if the authors haven’t invested in a repeatable way to run their own code.
- Native macOS VM capability. Dependency management on macOS can be finicky; we’ve all dealt with colliding Python installs, and macOS + Homebrew + Anaconda (or whatever) is unreliable. I wish I could spin up a macOS VM as easily as a Docker image.
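To be concrete about how little work that container definition is: the Dockerfile I wish every project shipped is often no more than the sketch below. The file names (requirements.txt, main.py) are assumptions about a typical Python project, not any particular repo:

```
# Minimal container definition for a typical Python project (layout assumed, not a real repo)
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the source
COPY . .

# Run whatever the project's entry point happens to be
CMD ["python", "main.py"]
```

A downstream user then needs only `docker build -t project .` and `docker run --rm project` to see it work, with zero impact on their own machine.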
A foundational task in any project is establishing the virtualization environment that lets collaborators spin up the project easily; there are still corners of the industry that need to pay more attention to this. Virtualization is all about compatibility and speed. Done right, it makes your new product 100% compatible with existing code and installations, making it easier and faster for customers to adopt your new thing. It also helps you quickly and repeatably trial solutions, which is critical at every stage of the development chain: experimentation, development, test, deployment, and pre-sales. Customers, trial users, and potential contributors should always be able to spin up a virtualized version of your product and try it out quickly; every barrier to easy trial is a lost sale or a lost opportunity to work together.
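For a product with more moving parts, the same idea usually takes the shape of a Compose file. This is only a sketch; the service names, images, ports, and credentials below are made up for illustration:

```
# docker-compose.yml (illustrative): one command brings up the product and its database
services:
  api:
    build: .                          # built from the repo's own Dockerfile
    ports:
      - "8080:8080"                   # exposed so a trial user can hit it immediately
    environment:
      DATABASE_URL: postgres://app:app@db:5432/app
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: app
      POSTGRES_PASSWORD: app
      POSTGRES_DB: app
```

One `docker compose up` and a prospective customer, trial user, or contributor is looking at a running system instead of a README full of setup steps.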
Other things this week
- Noah Smith had some good pieces at the start of this week. Regarding the VC winter, AI costs are driving up startup capex/opex requirements; the path to great outcomes is a lot harder now than it was 15 years ago. And the point that solar abundance == water abundance is very well stated. I don't think we've comprehended how transformational this might be.
- The Diff talks about PE ownership in sports. When will PE come for college football? Consider The Ohio State University, my alma mater. TOSU is sitting on a $7.8B endowment. Besides the academic programs, the university has full ownership of the Buckeyes and sports facilities – stadium, arenas, etc. The Buckeyes generate ~$250M in annual media and ticket revenue, though that all gets spent on opex and capex (and who knows if that is warranted). What price would be enough for the University to consider separating this asset? That $250M revenue figure is free of donations; donations to the athletic department are another $50-60M on top of that, and then there is an unknown amount of donations to the University itself that are due to athletic success and affiliation. It would be tricky to untangle, but the media dollars probably overwhelm the complexity at some point. We’ve seen dramatic changes in college conference affiliations mainly due to media money issues, and no one predicted those changes. The money pressures won’t stop.
- Adam Singer on agency. We all have agency, and it is up to us to put down the smartphones and start spending time in more meaningful ways. I need to hear this as much as anyone.
- Noah Smith reviewing Ender’s Game. An odd time to be reviewing this book, perhaps, but Noah’s message resonates with me: IQ and morality don’t necessarily go hand in hand.