There are many reasons why AI needs to achieve some level of decentralization:
Power: AI consumes more power than many large, centralized data centers can readily absorb. Either the local grid can't deliver enough electricity to satisfy an AI data center, or the data center itself lacks the underlying power infrastructure to support full-blown AI applications.
Cooling: Even if enough power can be brought in to satisfy AI applications, many existing data centers wouldn't be able to cool the servers and processors, and outages due to overheating would be inevitable. Liquid cooling has been proposed as the solution, but many data centers don't have room to retrofit it, lack the skilled staff to support it, or can't justify it economically.
Latency: If you send all data to, and perform all analysis at, a central point, you introduce round-trip latency. If the data center is hundreds or thousands of miles away, valuable time is wasted as the data is moved around and the numbers are crunched. This is particularly a factor for real-time applications. Imagine self-driving vehicles with a lag of a second introduced for every decision. At just 20 miles per hour (about 32 kilometres per hour), a vehicle covers roughly 9 metres in that second, so crashes would be commonplace.
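To make the latency argument concrete, here is a minimal sketch of the arithmetic. The 1-second lag and the 1,600 km data-center distance are illustrative assumptions, and the fibre figure assumes the common rule of thumb that light travels through optical fibre at roughly 200,000 km/s (about two-thirds of c):

```python
def distance_during_latency(speed_kmh: float, latency_s: float) -> float:
    """Metres a vehicle travels at speed_kmh during latency_s seconds."""
    return speed_kmh / 3.6 * latency_s  # km/h -> m/s, then multiply by time

def round_trip_propagation_ms(distance_km: float,
                              fibre_speed_km_s: float = 200_000) -> float:
    """Best-case round-trip propagation delay (ms) over fibre; ignores
    processing, queuing, and routing overhead, which add much more."""
    return 2 * distance_km / fibre_speed_km_s * 1000

# A vehicle at ~32 km/h (20 mph) with a 1-second decision lag
# travels about 9 metres before the answer arrives.
print(f"{distance_during_latency(32.19, 1.0):.1f} m")   # → 8.9 m

# Even the physics-limited floor for a data center ~1,600 km
# (1,000 miles) away is about 16 ms per round trip.
print(f"{round_trip_propagation_ms(1600):.0f} ms")      # → 16 ms
```

The second number is a lower bound: real networks add switching, queuing, and inference time on top, which is why centralizing every decision is untenable for real-time control.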