We have no inside knowledge of Nvidia or CoreWeave; we write this based on Nvidia's 10-K and 10-Q filings and on news and research from FactSet. Ayesha brings almost two decades in corporate banking to the table, and Mayhem brings over two decades in technology. We know there have been numerous allegations of some big conspiracy surrounding Nvidia, and we want to clear up some of the confusion.
Data Center vs Consumer Products
Nvidia has been driving significant innovation in technology that accelerates the processing of complex, mathematically intensive data sets. Using graphics processors, Nvidia found that its own technology could outperform CPUs for this kind of work. From there, the company built an innovation advantage by creating CUDA, which eventually became the software ecosystem that forms a relatively durable competitive moat around Nvidia’s data center offerings.
Data center sales are much more profitable for Nvidia than consumer products. With gaming in decline, Nvidia’s A100 and H100 sales to data center customers more than made up the difference. Data center sales have become the largest driver of Nvidia’s revenue and revenue growth, and these products are extremely high margin, meaning Nvidia spends less and makes more on them, dollar for dollar, than on, say, video cards for gaming.
GPUs Have Specific Advantages over CPUs for Specific Workloads
Why GPUs are Critical for Data-intensive Computing
Parallelism: Many data-intensive tasks can be broken down into smaller sub-tasks that can be executed simultaneously. GPUs excel at this kind of parallel processing.
High Throughput: The architecture of a GPU allows for more data to be processed per unit of time compared to a CPU, especially for tasks that can be parallelized.
Specialized Libraries: Libraries like CUDA for NVIDIA GPUs and OpenCL for other vendors have made it easier to offload data-intensive computations to the GPU.
Machine Learning and AI: Training models often involves many matrix operations, which GPUs are highly efficient at handling (see the sketch after this list).
Big Data Analytics: Processing and analyzing large data sets can be expedited using the parallel computing capabilities of GPUs.
Scientific Computing: Simulations involving complex calculations can be significantly sped up using GPUs.
Real-Time Processing: GPUs can process large streams of data in real-time, making them ideal for applications like video processing and gaming.
Cost-Efficiency: While GPUs may be expensive, the computational speed-up they offer for specific tasks can make them more cost-effective in the long run.
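To make the parallelism point concrete, here is a minimal sketch of a GPU matrix multiply, the kind of matrix operation the list above refers to. It is illustrative rather than production code: it assumes an NVIDIA GPU and the CUDA toolkit are available, and the file name and compile command (`nvcc matmul.cu -o matmul`) are our own choices. Each GPU thread computes one element of the output matrix, so a 512 x 512 multiply runs 262,144 of these small jobs in parallel.

```cuda
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// Naive matrix multiply: one thread per output element of C = A * B.
__global__ void matmul(const float* A, const float* B, float* C, int n) {
    int row = blockIdx.y * blockDim.y + threadIdx.y;
    int col = blockIdx.x * blockDim.x + threadIdx.x;
    if (row < n && col < n) {
        float sum = 0.0f;
        for (int k = 0; k < n; ++k)
            sum += A[row * n + k] * B[k * n + col];  // dot product of row and column
        C[row * n + col] = sum;
    }
}

int main() {
    const int n = 512;
    std::vector<float> A(n * n, 1.0f), B(n * n, 2.0f), C(n * n, 0.0f);

    // Allocate device memory and copy the inputs over
    float *dA, *dB, *dC;
    cudaMalloc(&dA, n * n * sizeof(float));
    cudaMalloc(&dB, n * n * sizeof(float));
    cudaMalloc(&dC, n * n * sizeof(float));
    cudaMemcpy(dA, A.data(), n * n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dB, B.data(), n * n * sizeof(float), cudaMemcpyHostToDevice);

    dim3 block(16, 16);                       // 256 threads per block
    dim3 grid((n + 15) / 16, (n + 15) / 16);  // enough blocks to cover the matrix
    matmul<<<grid, block>>>(dA, dB, dC, n);

    cudaMemcpy(C.data(), dC, n * n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("C[0] = %.1f (expected %.1f)\n", C[0], 2.0f * n);

    cudaFree(dA); cudaFree(dB); cudaFree(dC);
    return 0;
}
```

A CPU would walk through those output elements largely one at a time; the GPU schedules thousands of them at once, which is where the throughput advantage comes from.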
Empirical Evidence
Speed-Up in Machine Learning: In a study published in 2017, researchers found that GPUs can speed up the training of deep learning models by a factor of up to 50 compared to CPUs (source).
Big Data Analytics: According to a paper published in the ACM Digital Library, GPUs can accelerate big data analytics tasks by 10x to 100x compared to traditional CPU-based systems (source).
By providing high throughput, parallelism, and specialized libraries, GPUs have become a critical component in data-intensive computing, machine learning, big data analytics, and more.
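For readers who want to sanity-check this kind of claim on their own hardware, here is a minimal sketch of how a CPU-versus-GPU comparison is typically run. It assumes an NVIDIA GPU and the CUDA toolkit; the file name and compile command (`nvcc compare.cu -o compare`) are our own choices. It times the same element-wise operation once on the CPU and once as a GPU kernel; the speed-up you actually see will depend on your workload and hardware, not on the specific figures cited in the studies above.

```cuda
#include <chrono>
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// SAXPY (y = a*x + y): one GPU thread per vector element.
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 24;  // ~16.7 million elements
    std::vector<float> x(n, 1.0f), y(n, 2.0f);

    // CPU timing with std::chrono
    auto t0 = std::chrono::high_resolution_clock::now();
    for (int i = 0; i < n; ++i) y[i] = 2.0f * x[i] + y[i];
    auto t1 = std::chrono::high_resolution_clock::now();
    double cpu_ms = std::chrono::duration<double, std::milli>(t1 - t0).count();

    // Copy inputs to device memory
    float *dx, *dy;
    cudaMalloc(&dx, n * sizeof(float));
    cudaMalloc(&dy, n * sizeof(float));
    cudaMemcpy(dx, x.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dy, y.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    // GPU timing with CUDA events (measures kernel execution only,
    // not the host-to-device transfers above)
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);
    cudaEventRecord(start);
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);
    float gpu_ms = 0.0f;
    cudaEventElapsedTime(&gpu_ms, start, stop);

    printf("CPU: %.2f ms, GPU kernel: %.2f ms\n", cpu_ms, gpu_ms);

    cudaFree(dx); cudaFree(dy);
    cudaEventDestroy(start); cudaEventDestroy(stop);
    return 0;
}
```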
Nvidia’s Secret Sauce: CUDA
NVIDIA's Compute Unified Device Architecture (CUDA) is a parallel computing platform and application programming interface (API) model that allows software developers to use a CUDA-enabled graphics processing unit (GPU) for general-purpose processing. While other APIs like OpenCL (Open Computing Language), OpenACC, and Vulkan also offer GPU-accelerated computing, CUDA has certain advantages that have made it a popular choice in various domains. Here are some of the key advantages of CUDA, with a short library-call sketch after the list:
Performance Benchmarks: Studies often show CUDA outperforming other APIs in specific tasks. For instance, a research paper comparing CUDA and OpenCL for financial computing found that CUDA was up to 30% faster (source).
Feature Comparison: A comprehensive study comparing CUDA and OpenCL stated that while OpenCL provides more flexibility by supporting multiple hardware vendors, CUDA provides a richer set of features and often better performance (source).
Active Development: NVIDIA actively maintains and updates CUDA, ensuring that it takes advantage of the latest hardware features.
Strong Community: Due to its widespread adoption, CUDA has a large community of developers, contributing to forums, open-source projects, and academic research.
Widespread Use: CUDA is used across various industries, including finance for risk modeling and the sciences for computational biology, and obviously in machine learning and data analytics.
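One concrete way to see the ecosystem advantage is that a developer often does not need to write kernels at all. Here is a minimal sketch, assuming the CUDA toolkit is installed, that calls cuBLAS, one of NVIDIA's CUDA libraries, to run a SAXPY (y = a*x + y) on the GPU; the file name and compile command (`nvcc saxpy_cublas.cu -lcublas -o saxpy_cublas`) are our own choices.

```cuda
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>
#include <cublas_v2.h>

int main() {
    const int n = 1 << 20;
    const float alpha = 2.0f;
    std::vector<float> x(n, 1.0f), y(n, 3.0f);

    // Allocate device memory and copy the inputs over
    float *dx, *dy;
    cudaMalloc(&dx, n * sizeof(float));
    cudaMalloc(&dy, n * sizeof(float));
    cudaMemcpy(dx, x.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dy, y.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    // cuBLAS chooses and launches a tuned kernel for us
    cublasHandle_t handle;
    cublasCreate(&handle);
    cublasSaxpy(handle, n, &alpha, dx, 1, dy, 1);  // y = alpha * x + y
    cublasDestroy(handle);

    cudaMemcpy(y.data(), dy, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("y[0] = %.1f (expected 5.0)\n", y[0]);

    cudaFree(dx); cudaFree(dy);
    return 0;
}
```

The point is less the dozen lines of code than what sits behind them: tuned libraries, documentation, and a large installed base of developers who already know this API. That, as much as the hardware itself, is the moat.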