Anthropic co-founder discusses decentralized AI training, noting its compute is growing far faster than that of centralized models
Foresight News reported that Jack Clark, co-founder of Anthropic and former policy director at OpenAI, discussed the importance of decentralized training in his weekly AI newsletter, Import AI. He noted that distributing learning across multiple nodes can improve data privacy and system robustness. Citing an Epoch AI research report that analyzed more than 100 related papers, he pointed out that the computational scale of decentralized training is growing by roughly 20 times per year, well above the roughly 5-fold annual growth of frontier centralized training. Decentralized training currently remains about 1,000 times smaller than frontier centralized training, but it is technically feasible and could support broader, collective development of more powerful models.
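The cited figures imply a closing gap. A minimal back-of-the-envelope sketch, assuming the quoted growth rates (20x per year for decentralized, 5x per year for frontier centralized) and the ~1,000x current gap simply hold constant; this is an illustrative extrapolation, not a projection from the Epoch AI report itself:

```python
import math

# Figures cited in the article (assumed constant for this sketch)
decentralized_growth = 20.0   # annual compute multiplier for decentralized training
centralized_growth = 5.0      # annual compute multiplier for frontier centralized training
current_gap = 1_000.0         # decentralized training is ~1,000x smaller today

# The gap shrinks by (20 / 5) = 4x each year, so parity arrives when 4**t >= 1,000.
years_to_parity = math.log(current_gap) / math.log(decentralized_growth / centralized_growth)
print(f"Years until compute parity (if both rates hold): {years_to_parity:.1f}")
# -> roughly 5 years under these assumptions
```

Under those assumptions, decentralized training compute would reach parity with frontier centralized runs in about five years, which illustrates why a 20x-versus-5x growth differential matters despite the large present-day gap.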