The AI boom isn't slowing down. It's tripping over itself. That's the picture painted by IREN Limited (IREN) CEO Daniel Roberts in a recent podcast appearance. Speaking during Nvidia Corp.'s (NVDA) GTC conference, Roberts described the current scramble to build AI infrastructure in blunt, almost comical terms: "permanent whack-a-mole." It might just be the most accurate description of the industry's growing pains right now.
Think about it like this. The constraints keep shifting. First, everyone was desperate for GPUs. Then, once you could theoretically get your hands on the chips, the problem became power: where do you plug all this stuff in? Now, according to Roberts, the real bottleneck is something else entirely: "time to compute." Even when companies manage to secure both the silicon and the juice, they still can't flip the switch fast enough. Why? Because AI isn't just software; it's intensely physical.
Building a data center isn't like downloading an app. It means securing land, negotiating connections to the electrical grid, hiring thousands of specialized workers, and constructing massive cooling and electrical systems. It's a marathon of logistics, permits, and construction. And that's where the system, built for digital speed, starts to grind against the slow gears of the physical world.
Roberts put it plainly. The "real world" is struggling to keep up with "digital exponential demand curves." Every time you think you've solved one problem, another one pops up. You fix the chip supply? Suddenly power becomes the scarce resource. You secure enough power? Now you can't find enough skilled labor to build the facilities. You solve the labor issue? The supply chains for transformers or cooling units tighten. That's the whack-a-mole. And it's constant.
The demand side of this equation shows no signs of letting up. "There are no idle GPUs in the world," Roberts said, adding that the industry still "cannot meet demand." That's a critical signal for anyone watching this space. This isn't a speculative bubble struggling to find real use cases—it's an entire ecosystem struggling to keep up with them. The demand is very real; the supply chain just can't deliver fast enough.
All of this points to a bigger shift in the AI trade. It's evolving. It's no longer just a story about whose chip is fastest or whose model is smartest. It's increasingly a story about infrastructure, and the hard, slow, expensive limits of building it. The next phase of AI's breakneck growth might not be defined by a flashy software breakthrough from a lab. It might be defined by how quickly we can pour concrete, run power lines, and train technicians. The race is on, and the finish line keeps moving.