
Jensen Huang Praises Bittensor, comparing its decentralized AI training to Folding@Home during a live podcast at the GTC event.
Author: Akshat Thakur
March 20, 2026
Jensen Huang praises Bittensor during a live conversation at the GPU Technology Conference, describing its decentralized AI training model as “our modern version of Folding@Home.” The comment came during a discussion with investor Chamath Palihapitiya and quickly spread across X, pushing the project into wider attention beyond crypto-native circles.
High-Signal Summary for a Quick Glance
KevinVC97
@vazquez_ke70888
@opentensor @tplr_ai @chamath @nvidia $Tao to $1000
The largest decentralised LLM pre-training run in history. SN3 @tplr_ai trained Covenant-72B across 70+ contributors on open internet infrastructure. Now it’s being discussed by @chamath with @nvidia CEO Jensen Huang. Distributed, open-weight model training on Bittensor is https://t.co/aZZcigIyFW
05:20 PM·Mar 20, 2026
taoswap
@taoswap_org
@opentensor @tplr_ai @chamath @nvidia A big win for @tplr_ai and the Bittensor ecosystem!
The largest decentralised LLM pre-training run in history. SN3 @tplr_ai trained Covenant-72B across 70+ contributors on open internet infrastructure. Now it’s being discussed by @chamath with @nvidia CEO Jensen Huang. Distributed, open-weight model training on Bittensor is https://t.co/aZZcigIyFW
01:13 AM·Mar 20, 2026
El Balderino
@Baldies4Bitcoin
@opentensor @bittingthembits @tplr_ai @chamath @nvidia Finally staked some of my Tao to a subnet today and chose @tplr_ai
The largest decentralised LLM pre-training run in history. SN3 @tplr_ai trained Covenant-72B across 70+ contributors on open internet infrastructure. Now it’s being discussed by @chamath with @nvidia CEO Jensen Huang. Distributed, open-weight model training on Bittensor is https://t.co/aZZcigIyFW
12:56 AM·Mar 20, 2026
Jensen Huang praises Bittensor at a moment when decentralized AI is starting to move into mainstream conversations. Speaking on the All-In Podcast recorded live at the GPU Technology Conference, Huang compared the network to Folding@Home, the distributed computing project that once used millions of personal computers for scientific workloads.
He also shared his broader view on AI development:
“I believe we fundamentally need models as a first class product, proprietary product, as well as models as open source. These two things are not A or B, it’s A and B.”
The statement positions decentralized AI as a complement to existing systems, not a replacement.
Bittensor operates as a decentralized network where participants contribute machine learning models and compute power in exchange for TAO tokens. The network organizes activity into subnets, each focused on a specific task. Subnet 3, known as Templar and operated by Covenant AI, focuses on large-scale language model training.
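For readers who want to look at the network directly, the sketch below shows how one might inspect Subnet 3 with the public bittensor Python SDK. Attribute names vary somewhat between SDK versions, so treat it as illustrative rather than canonical.

```python
# Minimal sketch: inspecting a Bittensor subnet with the public `bittensor`
# Python SDK (pip install bittensor). Exact attribute names can differ
# between SDK versions, so this is illustrative, not canonical.
import bittensor as bt

# Connect to the main network ("finney") and pull the metagraph for
# Subnet 3 (Templar), the subnet discussed in this article.
subtensor = bt.subtensor(network="finney")
metagraph = subtensor.metagraph(netuid=3)

# Each registered hotkey is a participant (miner or validator) on the subnet.
print(f"Registered participants on netuid 3: {len(metagraph.hotkeys)}")

# Stake (in TAO) backs each participant; rewards are weighted by performance.
print(f"Total stake on the subnet: {float(metagraph.S.sum()):.2f} TAO")
```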
Earlier this month, the subnet completed a major milestone. It trained a 72-billion-parameter model, Covenant-72B, on 1.1 trillion tokens using more than 70 independent contributors connected through standard internet infrastructure.
The system reached 94.5 percent compute utilization while reducing communication overhead by over 146 times using a custom SparseLoCo optimizer. Benchmark results showed a score of 67.1 on the MMLU zero-shot test, outperforming earlier open models like LLaMA-2-70B. The full model weights and research are publicly available. (Source: SimplyTAO Analysis)
Jensen Huang praises Bittensor in a way that carries weight beyond crypto. As the CEO of NVIDIA, the company powering much of the world’s AI infrastructure, his acknowledgment signals that decentralized AI has reached a level worth paying attention to.
The comparison to Folding@Home reframes the idea. It turns decentralized training from an experimental concept into something familiar and proven in structure.
The market reacted immediately. TAO rose more than 17 percent after the clips circulated, with trading volume nearly doubling (Source: SimplyTAO Analysis). At the same time, institutional interest is building. Grayscale has already filed an S-1 for a potential Bittensor ETF, adding another layer of attention to the ecosystem (Source: Grayscale Announcement).
Bittensor distributes AI training across a global network of independent contributors. Participants provide compute power and model outputs, while validators score the usefulness of each contribution. Rewards are distributed in TAO tokens based on performance.
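As a rough illustration of that reward logic, the snippet below splits a fixed emission of TAO in proportion to validator scores. Bittensor’s actual mechanism (Yuma Consensus) is considerably more involved; the names and numbers here are hypothetical.

```python
# Purely illustrative sketch of performance-weighted reward splitting.
# Bittensor's real mechanism (Yuma Consensus) is more involved; this only
# shows the basic idea: validators score contributions, and a fixed emission
# of TAO is split in proportion to those scores.
def split_emission(scores: dict[str, float], emission_tao: float) -> dict[str, float]:
    total = sum(scores.values())
    if total == 0:
        return {hotkey: 0.0 for hotkey in scores}
    return {hotkey: emission_tao * s / total for hotkey, s in scores.items()}

# Hypothetical scores for three contributors over one scoring interval.
rewards = split_emission({"miner_a": 0.9, "miner_b": 0.6, "miner_c": 0.3}, emission_tao=1.0)
print(rewards)  # miner_a receives half of the emission, and so on
```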
In the Subnet 3 training run, contributors operated in a coordinated but decentralized setup. The system allowed training to pause and resume while maintaining consistency across machines.
The SparseLoCo optimizer reduces the bandwidth required for training updates, allowing efficient coordination even over standard internet connections. This removes a key bottleneck in distributed AI systems.
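The general idea behind such communication-efficient optimizers is to take several local steps and then exchange only the largest-magnitude entries of the accumulated update. The sketch below shows that top-k sparsification concept in plain NumPy; it is a simplified illustration of the principle, not Templar’s actual SparseLoCo implementation.

```python
# Simplified sketch of communication-efficient distributed training:
# accumulate a local update, then transmit only the k largest-magnitude
# entries (top-k sparsification). Not Templar's actual SparseLoCo optimizer.
import numpy as np

def top_k_sparsify(update: np.ndarray, k: int) -> tuple[np.ndarray, np.ndarray]:
    """Keep only the k largest-magnitude entries; return (indices, values)."""
    flat = update.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]

# A contributor accumulates a local update over several steps, then sends
# only a small fraction of the values it would otherwise transmit each round.
local_update = np.random.randn(1_000_000).astype(np.float32)
idx, vals = top_k_sparsify(local_update, k=10_000)
print(f"Roughly {local_update.size / idx.size:.0f}x fewer values sent this round")
```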
The immediate impact is visibility. Clips from the podcast continue to circulate, and the Opentensor Foundation is using the moment to highlight the network’s open-source approach.
The next phase focuses on scaling. Developers aim to train larger models, expand into inference and fine-tuning, and attract more contributors. Institutional players may begin testing decentralized training more seriously as awareness grows.
Our Crypto Talk is committed to unbiased, transparent, and true reporting to the best of our knowledge. This news article aims to provide accurate information in a timely manner. However, we advise the readers to verify facts independently and consult a professional before making any decisions based on the content since our sources could be wrong too. Check our Terms and conditions for more info.