[Infographic summary: exponential growth of global FLOP/s; hardware mix dominated by NVIDIA GPUs over TPUs and ASICs; distribution of compute across cloud (~16%), consumer (~33%), and other (~51%); smartphones alone at ~23% of global FLOP/s; roughly 7× capacity growth from Q1 2023 to Q4 2024.]

Appendix I

Estimated Global Compute Distribution and Capacity

This appendix documents the global distribution of computing power as of Q4 2024, with particular focus on AI accelerators and general-purpose computing hardware. All computing power is measured in floating point operations per second (FLOP/s), enabling direct comparison between different types of computing devices.

Methodology

Our analysis tracks two primary categories of computing power:

  1. Specialized AI Hardware: Data center GPUs (e.g., NVIDIA H100s, AMD MI300X), TPUs, and custom ASICs (e.g., Microsoft MAIA, Meta MTIA)
  2. General Purpose Computing: Consumer devices including smartphones, personal computers, and gaming consoles

Time Period Normalization

Many source datasets span 2022-2024. For these three-year totals, we estimate the 2023-2024 portion as 80% of the total, reflecting accelerating deployment rates through the period and increased manufacturing capacity in later years.

Infrastructure Accounting

To account for infrastructure deployed before our analysis period that remains in active use, we apply a 1.3× multiplier to new deployments.
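The two adjustments above can be composed into a single normalization step. The sketch below is illustrative only: it assumes the 80% temporal factor and the 1.3× legacy multiplier are chained as described, and the input figure is hypothetical.

```python
# Sketch of the normalization steps described above (illustrative values).
TEMPORAL_FACTOR = 0.80   # share of a 2022-2024 total attributed to 2023-2024
LEGACY_MULTIPLIER = 1.3  # accounts for pre-2023 infrastructure still in use

def normalize_capacity(three_year_total_flops: float) -> float:
    """Estimate active Q4 2024 capacity from a 2022-2024 deployment total."""
    recent = three_year_total_flops * TEMPORAL_FACTOR
    return recent * LEGACY_MULTIPLIER

# Example: a hypothetical 1.0e21 FLOP/s three-year deployment total
print(f"{normalize_capacity(1.0e21):.2e}")  # 1.04e+21
```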

Consumer Device Calculations

For devices like smartphones, we calculate:

$$\text{Total Capacity} = \text{Quarterly Sales} \times \text{Number of Quarters} \times \text{Device Performance}$$
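A worked instance of this formula, using the iPhone 14 Pro figure from Table 1 ($2.00 \times 10^{12}$ FLOP/s). The quarterly sales number here is illustrative, not a figure from this analysis.

```python
# Worked example of the consumer-device capacity formula above.
quarterly_sales = 50_000_000   # illustrative: units sold per quarter (hypothetical)
num_quarters = 8               # Q1 2023 through Q4 2024
device_performance = 2.00e12   # FLOP/s per device (iPhone 14 Pro, Table 1)

total_capacity = quarterly_sales * num_quarters * device_performance
print(f"{total_capacity:.2e} FLOP/s")  # 8.00e+20 FLOP/s
```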

Hardware Specifications

Table 1: Hardware Specifications (2023-2024)

| Hardware | Type | Peak Performance (FLOP/s) |
|---|---|---|
| Data Center AI Chips | | |
| AMD MI300X OAM | GPU | $2.61 \times 10^{15}$ |
| NVIDIA H100 | GPU | $9.89 \times 10^{14}$ |
| NVIDIA A100 | GPU | $3.12 \times 10^{14}$ |
| Microsoft MAIA | ASIC | $8.00 \times 10^{14}$ |
| Meta MTIA | ASIC | $3.54 \times 10^{14}$ |
| AWS Trainium | ASIC | $2.50 \times 10^{14}$ |
| AWS Inferentia | ASIC | $1.92 \times 10^{14}$ |
| Consumer Devices | | |
| iPhone 14 Pro | Mobile SoC | $2.00 \times 10^{12}$ |
| Samsung S24 | Mobile SoC | $3.40 \times 10^{12}$ |
| Average PC CPU | CPU | $4.08 \times 10^{10}$ |
| Average PC GPU | GPU | $4.60 \times 10^{12}$ |
| PS5 | Console | $1.03 \times 10^{13}$ |

NVIDIA GPU Shipments

Table 2: NVIDIA GPU Shipments (2023-2024)

| Type | Units | Computing Power (FLOP/s) |
|---|---|---|
| A100s (2023) | 2,260,000 | $7.05 \times 10^{20}$ |
| H100s (2023) | 1,500,000 | $1.48 \times 10^{21}$ |
| A100s (2024) | 2,000,000 | $6.24 \times 10^{20}$ |
| H100s (2024) | 2,000,000 | $1.98 \times 10^{21}$ |
| Total 2023-2024 | 7,760,000 | $4.79 \times 10^{21}$ |
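The totals in Table 2 follow directly from the shipment counts and the per-unit peak performance figures in Table 1, as this cross-check shows:

```python
# Cross-check of Table 2: units shipped times per-unit peak performance
# (per-unit figures from Table 1) should reproduce the listed totals.
A100_FLOPS = 3.12e14  # NVIDIA A100, Table 1
H100_FLOPS = 9.89e14  # NVIDIA H100, Table 1

shipments = {
    "A100s (2023)": (2_260_000, A100_FLOPS),
    "H100s (2023)": (1_500_000, H100_FLOPS),
    "A100s (2024)": (2_000_000, A100_FLOPS),
    "H100s (2024)": (2_000_000, H100_FLOPS),
}

total_units = sum(units for units, _ in shipments.values())
total_flops = sum(units * perf for units, perf in shipments.values())
print(total_units)           # 7760000
print(f"{total_flops:.2e}")  # 4.79e+21
```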

Major Cloud Provider Deployments

Table 3: Estimated Major Cloud Provider Deployments (2022-2024)

| Provider | Hardware Type | Computing Power (FLOP/s) |
|---|---|---|
| Microsoft/OpenAI | H100 equivalents | $6.53 \times 10^{20}$ |
| | AMD MI300X | $2.51 \times 10^{20}$ |
| | MAIA | $1.58 \times 10^{20}$ |
| | Total Q4 2024 | $1.38 \times 10^{21}$ |
| Amazon/Anthropic | H100/A100 | $2.87 \times 10^{20}$ |
| | Inferentia | $1.75 \times 10^{20}$ |
| | Trainium | $9.15 \times 10^{19}$ |
| | Total Q4 2024 | $7.19 \times 10^{20}$ |
| Google/DeepMind | TPUs (Q4 2024) | $9.10 \times 10^{20}$ |
| | H100/A100 | $3.16 \times 10^{20}$ |
| | Total Q4 2024 | $1.23 \times 10^{21}$ |
| Meta | H100/A100 | $3.96 \times 10^{20}$ |
| | AMD MI300X | $4.52 \times 10^{20}$ |
| | MTIA | $5.31 \times 10^{20}$ |
| | Total Q4 2024 | $1.79 \times 10^{21}$ |

Base Computing Capacity

Table 4: Base Computing Capacity (Q1 2023)

| Type | Computing Power (FLOP/s) |
|---|---|
| TPU | $2.93 \times 10^{19}$ |
| GPU | $3.95 \times 10^{21}$ |
| Total | $3.98 \times 10^{21}$ |

New Computing Capacity

Table 5: Estimated New Computing Capacity (Q1 2023 - Q4 2024)

| Category | Computing Power (FLOP/s) |
|---|---|
| Specialized AI Hardware | |
| NVIDIA GPUs (avg of sources) | $4.81 \times 10^{21}$ |
| New Google TPUs | $8.81 \times 10^{20}$ |
| New Cloud Hardware | $2.35 \times 10^{21}$ |
| Consumer Devices | |
| Active iPhones | $5.98 \times 10^{21}$ |
| Active Androids | $1.50 \times 10^{21}$ |
| PC CPUs | $1.96 \times 10^{19}$ |
| PC GPUs | $2.21 \times 10^{21}$ |
| Game Consoles | $8.64 \times 10^{20}$ |
| Total New | $2.82 \times 10^{22}$ |

Total Worldwide Computing Capacity

Table 6: Estimated Total Worldwide Computing Capacity (Q4 2024)

| Category | Computing Power (FLOP/s) | Share (%) |
|---|---|---|
| Cloud/AI Providers | | |
| Meta | $1.79 \times 10^{21}$ | 5.57% |
| Microsoft/OpenAI | $1.38 \times 10^{21}$ | 4.29% |
| Google/DeepMind | $1.23 \times 10^{21}$ | 3.81% |
| Amazon/Anthropic | $7.19 \times 10^{20}$ | 2.23% |
| Consumer Computing | | |
| Smartphones | $7.48 \times 10^{21}$ | 23.23% |
| PC CPUs/GPUs | $2.23 \times 10^{21}$ | 6.92% |
| Game Consoles | $8.64 \times 10^{20}$ | 2.68% |
| Other Cloud/Pre-2023 | $1.65 \times 10^{22}$ | 51.28% |
| Total | $3.22 \times 10^{22}$ | 100.00% |
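The share column of Table 6 can be recomputed from the capacity figures; small discrepancies at the second decimal place reflect rounding in the underlying estimates.

```python
# Recomputing the Table 6 share column from the listed capacities (FLOP/s).
capacities = {
    "Meta": 1.79e21,
    "Microsoft/OpenAI": 1.38e21,
    "Google/DeepMind": 1.23e21,
    "Amazon/Anthropic": 7.19e20,
    "Smartphones": 7.48e21,
    "PC CPUs/GPUs": 2.23e21,
    "Game Consoles": 8.64e20,
    "Other Cloud/Pre-2023": 1.65e22,
}

total = sum(capacities.values())
print(f"Total: {total:.2e} FLOP/s")
for name, flops in capacities.items():
    print(f"{name}: {100 * flops / total:.2f}%")
```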

Analysis and Implications

Distribution of Computing Power

  • Cloud Concentration: The four largest cloud/AI providers collectively control 15.9% of global computing power, representing significant concentration of high-performance computing resources.
  • Consumer Device Significance: Despite lower per-unit performance, consumer devices collectively represent 32.83% of global computing power, with smartphones alone accounting for 23.23%.
  • Legacy Infrastructure: Pre-2023 and other cloud infrastructure remains significant at 51.28% of total capacity.

Growth Patterns

  • Accelerating Deployment: New capacity added in 2023-2024 ($2.82 \times 10^{22}$ FLOP/s) is approximately 7 times larger than the Q1 2023 base capacity ($3.98 \times 10^{21}$ FLOP/s).
  • Custom Hardware: Major cloud providers are increasingly deploying custom AI accelerators alongside NVIDIA GPUs.
  • Consumer Updates: The rapid replacement cycle of consumer devices maintains their significant share of total computing power.
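The growth multiple in the first bullet follows from the totals in Tables 4 and 5:

```python
# Growth check: new 2023-2024 capacity relative to the Q1 2023 base.
new_capacity = 2.82e22   # Table 5, Total New
base_capacity = 3.98e21  # Table 4, Q1 2023 total
print(round(new_capacity / base_capacity, 1))  # 7.1
```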

Methodological Limitations

  • Utilization Rates: Our analysis uses peak theoretical performance. Actual achieved performance varies based on workload characteristics, cooling constraints, and network bottlenecks.
  • Temporal Distribution: The 80% factor applied to the 2023-2024 portion of multi-year totals is an estimate based on recent NVIDIA sales growth.
  • Consumer Device Usage: Not all consumer devices are actively used for computation, potentially overestimating their contribution.