Fred

Change the world by Web3 @RyzeLabs | alumni @THUBA_DAO

[In-depth Analysis] What kind of sparks can AI and Web3 create?

Introduction: The Development of AI+Web3#

In recent years, the rapid development of artificial intelligence (AI) and Web3 technologies has garnered widespread attention globally. AI, as a technology that simulates and mimics human intelligence, has made significant breakthroughs in areas such as facial recognition, natural language processing, and machine learning. The rapid advancement of AI technology has brought tremendous transformation and innovation to various industries.

The market size of the AI industry reached $200 billion in 2023, with industry giants and standout players like OpenAI, Character.AI, and Midjourney springing up in rapid succession and leading the AI boom.

At the same time, Web3, as an emerging internet model, is gradually changing our understanding and usage of the internet. Based on decentralized blockchain technology, Web3 achieves data sharing and control, user autonomy, and the establishment of trust mechanisms through features such as smart contracts, distributed storage, and decentralized identity verification. The core idea of Web3 is to liberate data from centralized authoritative institutions, empowering users with control over their data and the right to share its value.

Meanwhile, the total market capitalization of the Web3 industry has reached roughly $2.5 trillion. New narratives and scenarios continue to emerge, from Bitcoin, Ethereum, and Solana to application-layer players like Uniswap and Stepn, attracting more and more people to the Web3 industry.

It is easy to see that the combination of AI and Web3 is a field of great interest to builders and VCs from both the East and West, and how to effectively integrate the two is a question worth exploring.

This article will focus on the current state of AI+Web3 development, exploring the potential value and impact brought by this integration. We will first introduce the basic concepts and characteristics of AI and Web3, and then discuss their interrelationship. Next, we will analyze the current status of AI+Web3 projects and delve into the limitations and challenges they face. Through this research, we hope to provide valuable references and insights for investors and practitioners in related industries.

Ways AI and Web3 Interact#

The development of AI and Web3 is like the two sides of a balance scale, with AI bringing productivity improvements and Web3 bringing changes to production relationships. So what kind of sparks can AI and Web3 create together? We will first analyze the dilemmas and areas for improvement faced by the AI and Web3 industries, and then discuss how they can help solve these dilemmas.

Dilemmas Faced by the AI Industry#

To explore the dilemmas faced by the AI industry, we first need to look at the essence of the AI industry. The core of the AI industry revolves around three elements: computing power, algorithms, and data.


  1. Computing power: Computing power refers to the ability to perform large-scale computation and processing. AI tasks often require processing large amounts of data and performing complex calculations, such as training deep neural network models. Abundant computing power accelerates model training and inference, improving the performance and efficiency of AI systems. In recent years, advances in hardware such as graphics processing units (GPUs) and dedicated AI chips (like TPUs) have played a significant role in the development of the AI industry. Nvidia, whose stock has soared in recent years, holds a large market share as a GPU provider and earns substantial profits.

  2. Algorithms: Algorithms are the core components of AI systems; they are the mathematical and statistical methods used to solve problems and accomplish tasks. AI algorithms can be divided into traditional machine learning algorithms and deep learning algorithms, with the latter achieving significant breakthroughs in recent years. The choice and design of algorithms are crucial to the performance and effectiveness of AI systems, and continuous improvement and innovation can enhance their accuracy, robustness, and generalization ability. Different algorithms yield different results, so better algorithms are also vital for completing tasks well.

  3. Data: The core task of AI systems is to extract patterns and rules from data through learning and training. Data is the foundation for training and optimizing models; with large-scale data samples, AI systems can learn more accurate and intelligent models. Rich datasets provide more comprehensive and diverse information, enabling models to generalize better to unseen data and helping AI systems understand and solve real-world problems.

After understanding the three core elements of AI, let's look at the dilemmas and challenges AI faces in each of these areas. First, computing power: AI tasks typically require significant computational resources for model training and inference, especially for deep learning models. However, acquiring and managing large-scale computing power is an expensive and complex challenge; the cost, energy consumption, and maintenance of high-performance computing hardware are all problems. Startups and individual developers in particular may find it difficult to obtain sufficient computing power.

In terms of algorithms, although deep learning has achieved tremendous success in many fields, dilemmas and challenges remain. For instance, training deep neural networks requires large amounts of data and computational resources, and for certain tasks the interpretability and explainability of the models may be insufficient. The robustness and generalization ability of algorithms are also important issues, as models may perform inconsistently on unseen data. And among the many available algorithms, finding the one best suited to a given task is a process of continuous exploration.

Regarding data, while data drives AI, acquiring high-quality, diverse data remains a challenge. In some fields, data may be difficult to obtain, such as sensitive health data in the medical field. Furthermore, the quality, accuracy, and labeling of data are also issues; incomplete or biased data may lead to erroneous behavior or biases in models. Protecting data privacy and security is also a significant consideration.

Moreover, issues such as explainability and transparency exist; the black-box nature of AI models is a public concern. For certain applications, such as finance, healthcare, and justice, the decision-making processes of models need to be explainable and traceable, while existing deep learning models often lack transparency. Explaining the decision-making processes of models and providing trustworthy explanations remains a challenge.

Additionally, many AI projects have unclear business models, which leaves many AI entrepreneurs feeling lost.

Dilemmas Faced by the Web3 Industry#

The Web3 industry also faces many dilemmas that need to be addressed: data analysis in Web3, the poor user experience of many Web3 products, and smart contract vulnerabilities and hacker attacks all leave plenty of room for improvement. AI, as a tool for enhancing productivity, has significant potential in these areas.

First, there is room for improvement in data analysis and predictive capabilities: The application of AI technology in data analysis and prediction has a tremendous impact on the Web3 industry. Through intelligent analysis and mining using AI algorithms, Web3 platforms can extract valuable information from vast amounts of data and make more accurate predictions and decisions. This is particularly significant for risk assessment, market prediction, and asset management in the decentralized finance (DeFi) sector.

Furthermore, improvements in user experience and personalized services can also be achieved: The application of AI technology enables Web3 platforms to provide better user experiences and personalized services. By analyzing and modeling user data, Web3 platforms can offer personalized recommendations, customized services, and intelligent interactive experiences. This helps to increase user engagement and satisfaction, promoting the development of the Web3 ecosystem. For example, many Web3 protocols integrate AI tools like ChatGPT to better serve users.

In terms of security and privacy protection, the application of AI also has profound implications for the Web3 industry. AI technology can be used to detect and defend against cyberattacks, identify abnormal behaviors, and provide stronger security guarantees. Additionally, AI can be applied to data privacy protection through techniques such as data encryption and privacy computing, safeguarding users' personal information on Web3 platforms. In the auditing of smart contracts, since there may be vulnerabilities and security risks during the writing and auditing processes, AI technology can be used for automated contract auditing and vulnerability detection, enhancing the security and reliability of contracts.

It is evident that AI can participate in and provide assistance in many aspects of the dilemmas and potential improvements faced by the Web3 industry.

Analysis of the Current State of AI+Web3 Projects#

Projects that combine AI and Web3 mainly focus on two major aspects: utilizing blockchain technology to enhance the performance of AI projects, and leveraging AI technology to improve Web3 projects.

A large number of projects have emerged exploring these two aspects, including Io.net, Gensyn, Ritual, and various others. The following sections will analyze the current status and development of different sub-tracks where AI supports Web3 and Web3 supports AI.


Web3 Supporting AI#

Decentralized Computing Power#

Since OpenAI launched ChatGPT at the end of 2022, it has ignited a boom in AI. Within five days of launch, ChatGPT reached 1 million users, a milestone that took Instagram roughly two and a half months. Growth remained rapid afterwards: ChatGPT reached 100 million monthly active users within two months, and by November 2023 its weekly active users had reached 100 million. With the advent of ChatGPT, the AI field quickly transformed from a niche track into a highly regarded industry.

According to a report by Trendforce, ChatGPT requires 30,000 NVIDIA A100 GPUs to operate, and future GPT-5 will require even more computational power. This has sparked an arms race among AI companies, where only those with sufficient computing power can secure enough momentum and advantages in the AI battle, leading to a shortage of GPUs.

Before the rise of AI, the largest GPU provider, Nvidia, had customers concentrated in three major cloud services: AWS, Azure, and GCP. With the rise of artificial intelligence, a large number of new buyers have emerged, including major tech companies like Meta, Oracle, and other data platforms and AI startups, all joining the race to hoard GPUs for training AI models. Large tech companies like Meta and Tesla have significantly increased their purchases for custom AI models and internal research. Foundational model companies like Anthropic and data platforms like Snowflake and Databricks have also purchased more GPUs to help clients provide AI services.

As Semi Analysis noted last year, there are "GPU rich" and "GPU poor" companies. A handful of companies own more than 20,000 A100/H100 GPUs, and their team members can use 100 to 1,000 GPUs for a single project. These companies are either cloud providers or build their own LLMs; they include OpenAI, Google, Meta, Anthropic, Inflection, Tesla, Oracle, and Mistral.

Most companies, however, fall into the GPU-poor category, struggling with far fewer GPUs and spending a great deal of time and effort on work that is hard to push forward in the ecosystem. Nor is this situation limited to startups: some of the most well-known AI companies, including Hugging Face, Databricks (MosaicML), Together, and even Snowflake, have fewer than 20K A100/H100 GPUs. They have world-class technical talent but are constrained by GPU supply, putting them at a disadvantage against larger companies in the AI competition.

This shortage is not limited to the "GPU poor": at the end of 2023, even the leading AI player OpenAI had to pause new paid sign-ups for weeks because it could not secure enough GPUs, and had to procure more GPU supply.


It is evident that the rapid development of AI has led to a severe mismatch between the demand and supply sides of GPUs, with the issue of supply not meeting demand becoming urgent.

To address this issue, some Web3 projects have begun to leverage the characteristics of Web3 technology to provide decentralized computing power services, including Akash, Render, Gensyn, and others. These projects share a common goal: to incentivize users to provide idle GPU computing power through tokens, becoming the supply side of computing power to support AI clients.

The supply side can be broadly categorized into three areas: cloud service providers, cryptocurrency miners, and enterprises.

Cloud service providers include large cloud service providers (like AWS, Azure, GCP) and GPU cloud service providers (like Coreweave, Lambda, Crusoe, etc.), where users can resell idle computing power from cloud service providers to earn income. Cryptocurrency miners, following Ethereum's transition from PoW to PoS, have also become an important potential supply side with idle GPU computing power. Additionally, large enterprises like Tesla and Meta, which have purchased large quantities of GPUs for strategic reasons, can also provide idle GPU computing power as a supply side.

Currently, players in this space can be roughly divided into two categories: those using decentralized computing power for AI inference and those using decentralized computing power for AI training. The former includes projects like Render (which focuses on rendering but can also provide AI computing power), Akash, and Aethir; the latter includes io.net (which can support both inference and training) and Gensyn, with the main difference being the different requirements for computing power.

Let's first discuss the projects focused on AI inference. These projects attract users to participate in providing computing power through token incentives, and then provide computing power network services to the demand side, thus matching supply and demand for idle computing power. An introduction and analysis of such projects were mentioned in our previous DePIN research report at Ryze Labs, which you are welcome to refer to.

The core point is that through a token incentive mechanism, projects first attract suppliers and then attract users, thus achieving the project's cold start and core operational mechanism, allowing for further expansion and development. In this cycle, the supply side receives more valuable token rewards, while the demand side benefits from cheaper and more cost-effective services. The project's token value aligns with the growth of participants on both the supply and demand sides, and as the token price rises, it attracts more participants and speculators, forming value capture.
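To make this cold-start flywheel concrete, here is a minimal toy sketch of a token-incentivized compute marketplace. It is purely illustrative: the class names, prices, and the emission subsidy are assumptions invented for the example and do not describe the mechanism of Akash, Render, io.net, or any other project.

```python
# Toy two-sided GPU marketplace with token incentives.
# Illustrative sketch only: names, numbers, and the emission rule are assumptions,
# not the mechanism of any real project.
from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    gpu_hours_available: float
    price_per_gpu_hour: float   # asking price, denominated in tokens
    earned_tokens: float = 0.0

@dataclass
class Job:
    name: str
    gpu_hours: float

EMISSION_PER_GPU_HOUR = 0.2  # assumed protocol subsidy that bootstraps the supply side

def match_and_settle(job: Job, providers: list[Provider]) -> Provider | None:
    """Route a job to the cheapest provider with enough spare capacity and pay it in tokens."""
    candidates = [p for p in providers if p.gpu_hours_available >= job.gpu_hours]
    if not candidates:
        return None  # demand exceeds supply: the availability problem discussed later
    best = min(candidates, key=lambda p: p.price_per_gpu_hour)
    best.gpu_hours_available -= job.gpu_hours
    payment = job.gpu_hours * best.price_per_gpu_hour   # paid by the demand side
    subsidy = job.gpu_hours * EMISSION_PER_GPU_HOUR     # minted by the protocol
    best.earned_tokens += payment + subsidy
    return best

providers = [Provider("home-rig", 100, 0.30), Provider("ex-miner", 1000, 0.25)]
winner = match_and_settle(Job("fine-tune-7b", 64), providers)
print(winner.name if winner else "no capacity", providers)
```

The point of the toy is the loop itself: token emissions subsidize early supply, cheap supply attracts demand, and demand-side payments gradually take over from emissions.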


The other category uses decentralized computing power for AI training, such as Gensyn and io.net (which can support both AI training and inference). In fact, the operational logic of this category of projects is not fundamentally different from that of AI inference projects; they still attract supply side participation to provide computing power through token incentives for the demand side to use.

Among them, io.net, as a decentralized computing power network, currently has over 500,000 GPUs, performing exceptionally well among decentralized computing power projects. Additionally, it has integrated the computing power of Render and Filecoin, continuously developing its ecosystem.


Furthermore, Gensyn promotes the allocation and rewarding of machine learning tasks through smart contracts to facilitate AI training. According to Gensyn, the hourly cost of machine learning training on its network is about $0.40, significantly lower than the $2+ equivalent cost on AWS and GCP.

Gensyn's system includes four participants: submitters, executors, validators, and reporters; a simplified sketch of how they interact follows the list below.

  • Submitters: demand-side users who consume tasks; they submit the tasks to be computed and pay for AI training.
  • Executors: perform the model training tasks and generate proofs of completed work for validators to check.
  • Validators: link the non-deterministic training process to deterministic linear computation, comparing the executors' proofs against expected thresholds.
  • Reporters: check the validators' work and raise challenges when issues are found, earning rewards for valid challenges.
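As a rough mental model of how these four roles interact, here is a minimal sketch. It is not Gensyn's actual protocol (which relies on cryptographic proofs of learning and an on-chain dispute game rather than the naive recomputation and hash check used here); all function names and the toy "proof" are assumptions for illustration.

```python
# Minimal role-play of a submitter / executor / validator / reporter flow.
# Illustrative only; not Gensyn's real verification scheme.
import hashlib

def train(task: str, seed: int) -> str:
    """Stand-in for model training; returns a deterministic 'model fingerprint'."""
    return hashlib.sha256(f"{task}:{seed}".encode()).hexdigest()

def submitter(task: str) -> dict:
    return {"task": task, "payment": 10.0, "seed": 42}        # pays for the work

def executor(order: dict) -> dict:
    result = train(order["task"], order["seed"])
    return {**order, "result": result, "proof": result[:16]}  # claims completion

def validator(claim: dict) -> bool:
    expected = train(claim["task"], claim["seed"])             # recompute / spot-check the work
    return claim["proof"] == expected[:16]

def reporter(claim: dict, accepted: bool) -> str:
    # Challenges a validator that accepts bad work; earns a reward if the challenge holds.
    truly_ok = validator(claim)
    return "challenge raised, reporter rewarded" if accepted != truly_ok else "no challenge"

order = submitter("train-mnist-cnn")
claim = executor(order)
ok = validator(claim)
print(ok, reporter(claim, ok))
```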

It is clear that Gensyn aims to become a large-scale, cost-effective computing protocol for global deep learning models. However, across this track, why do most projects choose to use decentralized computing power for AI inference rather than training?

To help those unfamiliar with AI training and inference, let's briefly explain the difference between the two:

  • AI Training: If we compare artificial intelligence to a student, training is akin to providing the student with a wealth of knowledge and examples, which can also be understood as the data we commonly refer to. The AI learns from these knowledge examples. Since learning inherently requires understanding and memorizing vast amounts of information, this process demands significant computational power and time.
  • AI Inference: So what is inference? It can be understood as using the acquired knowledge to solve problems or take exams. During the inference phase, the AI uses the knowledge it has learned to answer questions rather than acquiring new knowledge, so the computational requirements during inference are much lower.

It is evident that the computational power requirements for the two are vastly different. The feasibility of using decentralized computing power for AI inference is much greater than for training, as training large models requires an enormous amount of data and high bandwidth for data communication.
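A widely used rule of thumb makes the gap concrete: training a transformer takes roughly 6 FLOPs per parameter per training token, while generating one token at inference takes roughly 2 FLOPs per parameter, and a training run also has to sweep through trillions of tokens. The concrete numbers below (model size, token counts) are assumptions chosen only to show the orders of magnitude.

```python
# Back-of-the-envelope: training vs. inference compute for a transformer.
# Rule of thumb: ~6 FLOPs per parameter per training token, ~2 FLOPs per parameter
# per generated token. The concrete numbers are assumptions for illustration.
PARAMS = 70e9            # 70B-parameter model (assumed)
TRAIN_TOKENS = 15e12     # 15T training tokens (assumed)
PROMPT_TOKENS = 1_000    # one inference request (assumed)

train_flops = 6 * PARAMS * TRAIN_TOKENS    # the whole training run
infer_flops = 2 * PARAMS * PROMPT_TOKENS   # a single request

print(f"training run : {train_flops:.2e} FLOPs")
print(f"one inference: {infer_flops:.2e} FLOPs")
print(f"ratio        : {train_flops / infer_flops:.1e}x")
```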

In addition, there are projects like Ritual that hope to combine distributed networks with model creators, maintaining decentralization and security. Its first product, Infernet, allows smart contracts on the blockchain to access AI models off-chain, enabling such contracts to access AI in a manner that maintains verification, decentralization, and privacy protection.

The coordinator of Infernet is responsible for managing the behavior of nodes in the network and responding to computation requests from consumers. When users utilize Infernet, tasks like inference and proof are performed off-chain, with results returned to the coordinator and ultimately passed to consumers on-chain through contracts.
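The request/compute/callback pattern can be sketched as follows. This is an illustrative mock, not Infernet's actual API: the contract, coordinator, and node here are plain Python objects, and the "proof" is a placeholder.

```python
# Illustrative mock of an on-chain request -> off-chain inference -> on-chain callback flow.
# Not Ritual/Infernet's actual API; all class and method names are made up for the sketch.
class ConsumerContract:
    def __init__(self):
        self.results = {}
    def request_inference(self, coordinator, prompt: str) -> int:
        return coordinator.enqueue(self, prompt)       # emits a request "on-chain"
    def receive(self, request_id: int, output: str, proof: str):
        self.results[request_id] = (output, proof)     # callback written back on-chain

class OffchainNode:
    def run_model(self, prompt: str) -> tuple[str, str]:
        output = f"echo: {prompt}"                     # stand-in for real model inference
        return output, "proof-placeholder"

class Coordinator:
    def __init__(self, node: OffchainNode):
        self.node, self.next_id = node, 0
    def enqueue(self, contract: ConsumerContract, prompt: str) -> int:
        self.next_id += 1
        output, proof = self.node.run_model(prompt)    # heavy work happens off-chain
        contract.receive(self.next_id, output, proof)  # result delivered via callback
        return self.next_id

contract = ConsumerContract()
rid = contract.request_inference(Coordinator(OffchainNode()), "classify this tx")
print(contract.results[rid])
```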

In addition to decentralized computing power networks, there are also decentralized bandwidth networks like Grass that enhance data transmission speed and efficiency. Overall, the emergence of decentralized computing power networks provides a new possibility for the supply side of AI computing power, pushing AI forward in new directions.

Decentralized Algorithm Models#

As mentioned in Chapter 2, the three core elements of AI are computing power, algorithms, and data. Since computing power can form a supply network through decentralization, can algorithms follow a similar approach and form a supply network of algorithm models?

Before analyzing the projects in this track, let's first understand the significance of decentralized algorithm models. Many may wonder: since OpenAI already exists, why do we need a decentralized algorithm network?

Essentially, a decentralized algorithm network is a decentralized marketplace for AI algorithm services that connects many different AI models, each with its own areas of expertise and skills. When users pose questions, the market selects the most suitable AI model to provide an answer. ChatGPT, by contrast, is a single AI model developed by OpenAI that can understand and generate human-like text.

In simple terms, ChatGPT is like a highly capable student helping to solve different types of problems, while a decentralized algorithm network is like a school with many students helping to solve problems. Although this student is currently very capable, over the long term, a school that can recruit students globally has tremendous potential.

Currently, several projects in the field of decentralized algorithm models are exploring and experimenting; below, we use the representative project Bittensor as a case study to help understand the development of this niche area.

In Bittensor, the supply side of algorithm models (or miners) contributes their machine learning models to the network. These models can analyze data and provide insights. Model providers are rewarded with cryptocurrency tokens (TAO) for their contributions.

To ensure the quality of answers to questions, Bittensor employs a unique consensus mechanism to ensure the network reaches consensus on the best answers. When a question is posed, multiple model miners provide answers. The validators in the network then begin their work to determine the best answer and send it back to the user.

The TAO token in Bittensor plays two main roles throughout the process: it incentivizes miners to contribute algorithm models to the network, and users need to spend tokens to ask questions and have the network complete tasks.
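A simplified sketch of this incentive loop is shown below. It is not Bittensor's actual Yuma consensus; the scoring function and the proportional reward split are assumptions used only to illustrate how validators rank miner answers and how TAO-style rewards could follow the ranking.

```python
# Toy version of a decentralized model marketplace: miners answer, validators score,
# rewards are split in proportion to the scores. Not Bittensor's actual consensus.
def miner_a(q): return "Paris"
def miner_b(q): return "paris, the capital of France"
def miner_c(q): return "I don't know"

MINERS = {"A": miner_a, "B": miner_b, "C": miner_c}

def validator_score(question: str, answer: str) -> float:
    """Stand-in scoring rule (assumed): reward non-empty answers that contain the key term."""
    if not answer or "don't know" in answer.lower():
        return 0.0
    return 1.0 + ("paris" in answer.lower())

def run_round(question: str, emission: float = 1.0):
    answers = {name: fn(question) for name, fn in MINERS.items()}
    scores = {name: validator_score(question, ans) for name, ans in answers.items()}
    total = sum(scores.values()) or 1.0
    rewards = {name: emission * s / total for name, s in scores.items()}  # pro-rata token split
    best = max(scores, key=scores.get)
    return answers[best], rewards

answer, rewards = run_round("What is the capital of France?")
print(answer, rewards)
```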

Since Bittensor is decentralized, anyone with internet access can join the network, either as a user posing questions or as a miner providing answers. This allows more people to utilize powerful artificial intelligence.

In summary, using networks like Bittensor as an example, the field of decentralized algorithm models has the potential to create a more open and transparent landscape, where AI models can be trained, shared, and utilized in a secure and decentralized manner. Additionally, there are decentralized algorithm model networks like BasedAI attempting similar initiatives, with a particularly interesting aspect being the use of ZK to protect user data privacy during interactions with models, which will be further discussed in the fourth section.

As decentralized algorithm model platforms develop, they will enable small companies to compete with large organizations in utilizing top-tier AI tools, potentially having a significant impact across various industries.

Decentralized Data Collection#

For training AI models, a large supply of data is essential. However, most Web2 companies still monopolize user data, with platforms like X, Reddit, TikTok, Snapchat, Instagram, and YouTube prohibiting data collection for AI training. This has become a significant obstacle to the development of the AI industry.

On the other hand, some Web2 platforms sell user data to AI companies without sharing any profits with users. For example, Reddit reached a $60 million agreement with Google, allowing Google to train AI models on its posts. This has led to the monopolization of data collection rights by large capital and big data entities, resulting in an overly capital-intensive direction for the industry.

In response to this situation, some projects are combining Web3 with token incentives to achieve decentralized data collection. Taking PublicAI as an example, users can participate in two roles:

  • One role is as AI data providers, where users can find valuable content on X, tag the official PublicAI account, and attach insights using #AI or #Web3 as classification tags to send content to the PublicAI data center for data collection.
  • The other role is as data validators, where users can log into the PublicAI data center to vote for the most valuable data for AI training.

In return, users can receive token incentives for these contributions, promoting a win-win relationship between data contributors and the AI industry.

In addition to projects like PublicAI that specifically collect data for AI training, many other projects are also engaging in decentralized data collection through token incentives. For example, Ocean collects user data to serve AI through data tokenization, Hivemapper collects map data through users' car-mounted cameras, Dimo collects user car data, and WiHi collects weather data. These projects that collect data through decentralization are also potential supply sides for AI training, so broadly speaking, they can also be included in the paradigm of Web3 supporting AI.

ZK Protecting User Privacy in AI#

In addition to the advantages of decentralization brought by blockchain technology, another significant aspect is zero-knowledge proofs. Through zero-knowledge technology, privacy can be protected while achieving information verification.

In traditional machine learning, data typically needs to be stored and processed centrally, which may lead to risks of data privacy breaches. On the other hand, methods for protecting data privacy, such as data encryption or data de-identification, may limit the accuracy and performance of machine learning models.

The technology of zero-knowledge proofs can help address this dilemma by resolving the conflict between privacy protection and data sharing. ZKML (Zero-Knowledge Machine Learning) allows for the training and inference of machine learning models without revealing the original data. Zero-knowledge proofs enable the features of data and the results of models to be proven correct without disclosing the actual data content.

The core goal of ZKML is to achieve a balance between privacy protection and data sharing. It can be applied in various scenarios, such as medical health data analysis, financial data analysis, and cross-organizational collaboration. By using ZKML, individuals can protect the privacy of their sensitive data while sharing it with others to gain broader insights and collaborative opportunities without worrying about the risk of data privacy breaches.
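Stated a bit more formally, using a standard framing that is not tied to any particular ZKML system: for a public commitment c to model weights, a public input x, and a claimed output y, the prover convinces the verifier of the relation below without revealing the weights W (variants instead keep the input private and the model public).

```latex
% A standard zero-knowledge framing applied to model inference (not specific to any project).
% Public: a commitment c to the model weights, an input x, and a claimed output y.
% Private witness: the weights W.
\text{Statement:}\quad \exists\, W:\; \mathrm{Commit}(W) = c \;\wedge\; f_W(x) = y
% Completeness:   an honest prover holding a valid W always convinces the verifier.
% Soundness:      without such a W, the verifier accepts only with negligible probability.
% Zero-knowledge: the proof reveals nothing about W beyond the truth of the statement.
```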

Currently, this field is still in its early stages, with most projects still exploring. For example, BasedAI has proposed a decentralized approach that seamlessly integrates FHE with LLM to maintain data confidentiality. By embedding privacy into its distributed network infrastructure using zero-knowledge large language models (ZK-LLM), it ensures that user data remains private throughout the network's operation.

Here, let’s briefly explain what Fully Homomorphic Encryption (FHE) is. Fully homomorphic encryption is a type of encryption technology that allows computations to be performed on encrypted data without needing to decrypt it. This means that various mathematical operations (such as addition, multiplication, etc.) performed on FHE-encrypted data can be conducted while keeping the data encrypted, yielding results equivalent to those obtained by performing the same operations on the original unencrypted data, thus protecting user data privacy.
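The homomorphic property itself is easy to demonstrate with textbook RSA, which is multiplicatively homomorphic (true FHE schemes such as BGV or CKKS support both addition and multiplication and are far more involved). The tiny parameters below are toy values chosen only for illustration and are completely insecure.

```python
# Toy demonstration of a homomorphic property using textbook RSA (multiplicative only).
# This is NOT fully homomorphic encryption and NOT secure: the parameters are tiny and
# unpadded RSA is used purely to show "compute on ciphertexts, then decrypt the result".
p, q = 61, 53
n = p * q                # 3233
phi = (p - 1) * (q - 1)  # 3120
e, d = 17, 2753          # e * d = 46801 = 1 (mod 3120)

encrypt = lambda m: pow(m, e, n)
decrypt = lambda c: pow(c, d, n)

a, b = 7, 11
c_a, c_b = encrypt(a), encrypt(b)

# Multiply the two ciphertexts WITHOUT decrypting them...
c_product = (c_a * c_b) % n

# ...and the decrypted result equals the product of the plaintexts.
assert decrypt(c_product) == (a * b) % n
print(decrypt(c_product))  # 77
```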

In addition to the four categories above, there are also blockchain projects like Cortex that support executing AI programs on-chain. Executing machine learning programs on traditional blockchains faces a basic challenge: virtual machines are extremely inefficient at running even modest machine learning models, so most people believe running AI on a blockchain is impossible. The Cortex Virtual Machine (CVM), however, uses GPUs to execute AI programs on-chain and is compatible with the EVM. In other words, the Cortex chain can execute all Ethereum DApps and integrate AI machine learning into those DApps, allowing machine learning models to run in a decentralized, immutable, and transparent manner, with network consensus verifying every step of AI inference.

AI Supporting Web3#

In the collision of AI and Web3, in addition to Web3 supporting AI, the assistance of AI to the Web3 industry is also worth noting. The core contribution of artificial intelligence lies in enhancing productivity, thus there are many attempts in areas such as AI auditing of smart contracts, data analysis and prediction, personalized services, security, and privacy protection.

Data Analysis and Prediction#

Currently, many Web3 projects have begun to integrate existing AI services (such as ChatGPT) or develop their own to provide data analysis and prediction services for Web3 users. The coverage is extensive, including providing investment strategies through AI algorithms, on-chain analysis AI tools, price and market predictions, and more.

For example, Pond uses AI graph algorithms to predict valuable alpha tokens for users and institutions, providing investment assistance. BullBear AI trains based on users' historical data and price trends to provide the most accurate information to support price trend predictions, helping users gain profits.

There are also investment competition platforms like Numerai, where participants predict the stock market based on AI and large language models, utilizing the platform's free high-quality data to train models and submit predictions daily. Numerai calculates the performance of these predictions over the next month, allowing participants to stake NMR on models and earn returns based on their performance.

Additionally, there are on-chain data analysis platforms like Arkham that also integrate AI for services. Arkham links blockchain addresses with entities such as exchanges, funds, and whales, displaying key data and analysis for users to provide decision-making advantages. The AI integration part involves Arkham Ultra using algorithms to match addresses with real-world entities, developed over three years with support from core contributors at Palantir and OpenAI founders.

Personalized Services#

In Web2 projects, AI has many application scenarios in search and recommendation fields, serving users' personalized needs. The same is true for Web3 projects, where many project teams optimize user experiences by integrating AI.

For example, the well-known data analysis platform Dune recently launched the Wand tool, which allows users to write SQL queries using large language models. Through the Wand Create feature, users can automatically generate SQL queries based on natural language questions, making it very convenient for users unfamiliar with SQL to search.
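The underlying pattern, turning a natural-language question plus a schema description into SQL via an LLM, can be sketched as below. This is not Dune's Wand implementation; `call_llm` is a hypothetical stand-in for whatever LLM API is used, and the table schema is invented for the example.

```python
# Sketch of the natural-language-to-SQL pattern behind tools like Dune's Wand.
# `call_llm` is a hypothetical placeholder, not a real API; the schema is invented.
SCHEMA = """
Table dex_trades(block_time TIMESTAMP, token_symbol TEXT, amount_usd NUMERIC, trader TEXT)
"""

PROMPT_TEMPLATE = """You are a SQL assistant. Given this schema:
{schema}
Write a single SQL query answering: "{question}"
Return only SQL."""

def call_llm(prompt: str) -> str:
    # Placeholder: in practice this would call an LLM provider's completion API.
    return ("SELECT token_symbol, SUM(amount_usd) AS volume_usd\n"
            "FROM dex_trades\n"
            "WHERE block_time >= NOW() - INTERVAL '7 days'\n"
            "GROUP BY token_symbol ORDER BY volume_usd DESC LIMIT 10;")

def question_to_sql(question: str) -> str:
    return call_llm(PROMPT_TEMPLATE.format(schema=SCHEMA, question=question))

print(question_to_sql("Which ten tokens had the highest DEX volume in the past week?"))
```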

Moreover, some Web3 content platforms have begun to integrate ChatGPT for content summarization. For instance, the Web3 media platform Followin integrates ChatGPT to summarize viewpoints and recent developments in a given field. The Web3 encyclopedia platform IQ.wiki, which aims to be a primary online source of objective, high-quality knowledge about blockchain technology and cryptocurrencies and to make blockchain information easier to discover worldwide, integrates GPT-4 to summarize wiki articles. And Kaito, an LLM-based search engine, aims to become the Web3 search platform and change how information is accessed in Web3.

In terms of content creation, there are projects like NFPrompt that reduce user creation costs. NFPrompt allows users to generate NFTs more easily through AI, thereby lowering the cost of creation and providing many personalized services in the creative process.

AI Auditing of Smart Contracts#

In the Web3 field, auditing smart contracts is also a very important task. Implementing AI for auditing smart contract code can more efficiently and accurately identify and uncover vulnerabilities in the code.

As Vitalik has mentioned, one of the biggest challenges in the cryptocurrency field is the bugs in our code. An exciting possibility is that artificial intelligence could significantly simplify the use of formal verification tools to prove that a piece of code satisfies specific properties. If this can be achieved, we may have bug-free ZK-EVMs. The more bugs are eliminated, the more secure the space becomes, and AI can be very helpful in achieving this.

For example, the 0x0.ai project provides an AI smart contract auditor: a tool that uses advanced algorithms to analyze smart contracts and identify potential vulnerabilities or issues that could lead to fraud or other security risks. The auditor uses machine learning techniques to identify patterns and anomalies in the code, flagging potential issues for further review.
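At its simplest, the "flag patterns for further review" step can be pictured as a static scan over contract source. The sketch below uses a handful of regex heuristics (tx.origin authorization, delegatecall, value-bearing low-level calls, timestamp dependence); it is not 0x0.ai's tool, and real auditors combine static analysis, ML models, and human review.

```python
# Naive pattern-based scan of Solidity source for well-known red flags.
# Illustrative only; not 0x0.ai's auditor. Real tools go far beyond regex heuristics.
import re

HEURISTICS = {
    r"tx\.origin": "tx.origin used for authorization (phishing-prone)",
    r"\.delegatecall\(": "delegatecall to external code (storage hijack risk)",
    r"\.call\{value:": "low-level call transferring value (check reentrancy / return value)",
    r"block\.timestamp": "timestamp dependence (miner-influenceable)",
}

def scan(source: str) -> list[tuple[int, str]]:
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for pattern, message in HEURISTICS.items():
            if re.search(pattern, line):
                findings.append((lineno, message))
    return findings

contract = """
function withdraw(uint amount) external {
    require(tx.origin == owner);
    (bool ok, ) = msg.sender.call{value: amount}("");
}
"""
for lineno, message in scan(contract):
    print(f"line {lineno}: {message}")
```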

In addition to the three categories mentioned above, there are also some native cases utilizing AI to support the Web3 field, such as PAAL, which helps users create personalized AI bots that can be deployed on Telegram and Discord to serve Web3 users; and AI-driven multi-chain DEX aggregator Hera, which uses AI to provide the best trading paths between the widest range of tokens and any token pairs. Overall, AI's support for Web3 is more about being a tool-level assistance.

Limitations and Challenges of AI+Web3 Projects#

Real Obstacles in Decentralized Computing Power#

Currently, many of the Web3 projects supporting AI are focused on decentralized computing power, promoting global users to become the supply side of computing power through token incentives, which is a very interesting innovation. However, on the other hand, there are some real issues that need to be addressed:

Compared to centralized computing service providers, decentralized computing products typically rely on nodes and participants distributed globally to provide computational resources. Due to potential delays and instability in network connections between these nodes, performance and stability may be lower than that of centralized computing products.

Additionally, the availability of decentralized computing products is influenced by the degree of matching between supply and demand. If there are not enough suppliers or demand is too high, it may lead to resource shortages or an inability to meet user needs.

Finally, compared to centralized computing products, decentralized computing products typically involve more technical details and complexities. Users may need to understand and deal with knowledge related to distributed networks, smart contracts, and cryptocurrency payments, increasing the cost of understanding and using these products.

After in-depth discussions with many decentralized computing project teams, it has been found that current decentralized computing is still largely limited to AI inference rather than AI training.

Next, I will address four small questions to help everyone understand the underlying reasons:

  1. Why do most decentralized computing projects choose to do AI inference rather than AI training?
  2. What exactly makes Nvidia so powerful? What are the reasons for the difficulty of decentralized computing training?
  3. What will the ultimate outcome of decentralized computing (Render, Akash, io.net, etc.) look like?
  4. What will the ultimate outcome of decentralized algorithm models (Bittensor) look like?

Let’s unravel these layers one by one:

  1. Across this track, most decentralized computing projects choose to do AI inference rather than training, primarily due to the different requirements for computing power and bandwidth.

To recap the student analogy from earlier: training is like feeding the student a wealth of knowledge and examples (the data we commonly refer to), and since learning requires understanding and memorizing vast amounts of information, it demands significant computational power and time; inference is like using that acquired knowledge to solve problems or sit an exam, so its computational requirements are much lower.

It is easy to see that the difficulty difference between the two fundamentally lies in the fact that training large models requires an enormous amount of data and extremely high bandwidth requirements for data communication, making the implementation of decentralized computing for training extremely challenging. In contrast, inference has much lower data and bandwidth requirements, making implementation more feasible.

For large models, stability is paramount; if training is interrupted, it requires retraining, which incurs high sunk costs. On the other hand, demands with relatively lower computing requirements can be realized, such as AI inference mentioned earlier, or training of smaller models in specific scenarios, which is possible when there are relatively large node service providers in the decentralized computing network.

  2. So where are the bottlenecks in data and bandwidth? Why is decentralized training difficult to achieve?

This involves two key elements of large model training: single-card computing power and multi-card parallelism.

Single-card computing power: Today, essentially all large-model training happens in supercomputing centers. To make this easier to picture, think of the supercomputing center as a human body, with the individual GPU as its cells. If a single cell (GPU) has strong computing power, then the overall computing power (single cell × quantity) can also be strong.

Multi-card parallelism: A large model today often has hundreds of billions or even trillions of parameters, and for supercomputing centers training large models, at least tens of thousands of A100s are needed as a baseline. All of those cards must be mobilized for training. However, training a large model is not as simple as training on the first A100 and then the second in sequence; different parts of the model are trained on different graphics cards, and training part A may require results from part B, so multi-card parallelism is involved.
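To see why "tens of thousands of cards" is the right order of magnitude, a back-of-the-envelope estimate helps. All inputs below (model size, token count, per-GPU throughput, utilization, target wall-clock time) are assumptions for illustration, not figures from any specific lab.

```python
# Rough estimate of how many GPUs a large training run needs.
# All inputs are assumptions for illustration.
PARAMS = 70e9                    # model parameters (assumed)
TOKENS = 15e12                   # training tokens (assumed)
FLOPS_PER_TOKEN = 6 * PARAMS     # ~6 FLOPs per parameter per token (rule of thumb)
GPU_PEAK_FLOPS = 312e12          # A100 BF16 peak, ~312 TFLOPS
UTILIZATION = 0.4                # fraction of peak actually achieved (assumed)
TARGET_DAYS = 30                 # desired wall-clock training time (assumed)

total_flops = FLOPS_PER_TOKEN * TOKENS
effective_flops_per_gpu = GPU_PEAK_FLOPS * UTILIZATION
gpu_seconds = total_flops / effective_flops_per_gpu
gpus_needed = gpu_seconds / (TARGET_DAYS * 24 * 3600)

print(f"total compute        : {total_flops:.2e} FLOPs")
print(f"GPUs for {TARGET_DAYS} days : {gpus_needed:,.0f}")   # on the order of tens of thousands
```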

Why is Nvidia so powerful, with its market value soaring, while AMD and Chinese companies like Huawei and Horizon Robotics find it difficult to catch up? The core reasons lie in two aspects: the CUDA software ecosystem and NVLink multi-card communication.

On one hand, having a software ecosystem that can adapt to hardware is crucial, such as Nvidia's CUDA system. Building a new system is challenging, akin to creating a new language, with very high replacement costs.

On the other hand, multi-card communication essentially involves the input and output of information between cards. How to parallelize and transmit data is key. Due to the existence of NVLink, it is impossible to connect Nvidia and AMD cards; furthermore, NVLink limits the physical distance between graphics cards, requiring them to be within the same supercomputing center, making it difficult for decentralized computing power distributed worldwide to form a computing cluster for large model training.
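The bandwidth point can also be made with rough numbers. Suppose the gradients of a 70B-parameter model in FP16 (about 140 GB) must be exchanged every training step; the NVLink and internet figures below are approximate, and the simple "size ÷ bandwidth" model ignores topology, overlap, and compression, so treat the result only as an order-of-magnitude comparison.

```python
# Order-of-magnitude comparison: syncing gradients over NVLink vs. over the internet.
# Simplified "size / bandwidth" model; ignores topology, overlap, and compression.
PARAMS = 70e9
BYTES_PER_PARAM = 2                  # FP16 gradients
grad_bytes = PARAMS * BYTES_PER_PARAM          # roughly 140 GB exchanged per step

NVLINK_BW = 600e9                    # ~600 GB/s per A100 over NVLink (approximate published figure)
INTERNET_BW = 1e9 / 8                # 1 Gbps home connection ~ 0.125 GB/s (assumed)

print(f"per-step gradient volume: {grad_bytes / 1e9:.0f} GB")
print(f"over NVLink   : ~{grad_bytes / NVLINK_BW:.2f} s per step")
print(f"over internet : ~{grad_bytes / INTERNET_BW / 60:.0f} min per step")
```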

The first point explains why AMD and Chinese companies like Huawei and Horizon Robotics currently find it difficult to catch up; the second point explains why decentralized training is hard to achieve.

  3. What will the ultimate outcome of decentralized computing power look like?
    Decentralized computing power currently struggles to conduct large model training, primarily because stability is crucial for large model training. If training is interrupted, it requires retraining, which incurs high sunk costs. The requirements for multi-card parallelism are high, and bandwidth is limited by physical distance. Nvidia achieves multi-card communication through NVLink; however, within a supercomputing center, NVLink limits the physical distance between graphics cards, making it difficult for dispersed computing power to form a computing cluster for large model training.

On the other hand, demands with relatively lower computing requirements can be realized, such as AI inference or training of smaller models in specific scenarios, which is possible when there are relatively large node service providers. Additionally, edge computing scenarios like rendering are also relatively easier to implement.

  4. What will the ultimate outcome of decentralized algorithm models look like?
    The ultimate outcome of decentralized algorithm models depends on the future of AI. I believe the future AI battle may consist of one or two closed-source model giants (like ChatGPT) alongside a flourishing array of models. In this context, application layer products do not need to be tied to a single large model but can collaborate with multiple large models. In this regard, Bittensor's model still holds significant potential.

The Combination of AI and Web3 is Relatively Rough, Failing to Achieve 1+1>2#

Currently, in projects combining Web3 and AI, especially those where AI supports Web3, most projects still use AI only superficially, without reflecting a deep integration between AI and cryptocurrency. This superficial application is primarily manifested in two aspects:

First, whether AI is used for data analysis and prediction, in recommendation and search scenarios, or for code audits, these integrations differ little from how Web2 projects integrate AI. The projects simply use AI to improve efficiency and run analysis, without demonstrating any native fusion of AI and cryptocurrency or genuinely innovative solutions.

Secondly, many Web3 teams' integration with AI is more about marketing, purely leveraging the concept of AI. They only apply AI technology in very limited areas and then begin to promote AI trends, creating a false impression that their projects are closely tied to AI. However, there remains a significant gap in genuine innovation.

Despite the current limitations of Web3 and AI projects, we should recognize that this is merely the early stage of development. In the future, we can expect more in-depth research and innovation to achieve a closer integration between AI and cryptocurrency, creating more intrinsic and meaningful solutions in fields such as finance, decentralized autonomous organizations, prediction markets, and NFTs.

Token Economics as a Buffer for AI Project Narratives#

As mentioned at the beginning, AI projects face a business-model dilemma as more and more large models become open-source. Many AI+Web3 projects that would struggle to develop and raise funding as pure Web2 ventures therefore overlay Web3 narratives and token economics to drive user participation.

However, the key question is whether the integration of token economics genuinely helps AI projects address actual needs or is merely a narrative or a pursuit of short-term value.

Currently, most AI+Web3 projects are far from practical, and it is hoped that more grounded and thoughtful teams will not merely use tokens to hype AI projects, but will truly address real demand scenarios.

Conclusion#

Currently, numerous cases and applications of AI+Web3 projects have emerged. First, AI technology can provide more efficient and intelligent application scenarios for Web3. Through AI's data analysis and prediction capabilities, Web3 users can have better tools for investment decision-making and other scenarios. Additionally, AI can audit smart contract code, optimize the execution process of smart contracts, and improve the performance and efficiency of blockchains. At the same time, AI technology can provide more precise and intelligent recommendations and personalized services for decentralized applications, enhancing user experience.

Meanwhile, the decentralized and programmable characteristics of Web3 also provide new opportunities for the development of AI technology. Through token incentives, decentralized computing power projects offer new solutions to the dilemma of insufficient AI computing power, while Web3's smart contracts and distributed storage mechanisms provide broader space and resources for sharing and training AI algorithms. The user autonomy and trust mechanisms of Web3 also bring new possibilities for AI development, allowing users to choose to participate in data sharing and training, thereby improving the diversity and quality of data and further enhancing the performance and accuracy of AI models.

Although the current intersection of AI+Web3 projects is still in its early stages and faces many dilemmas, it also brings many advantages. For instance, while decentralized computing power products have some drawbacks, they reduce reliance on centralized institutions, provide greater transparency and auditability, and enable broader participation and innovation. For specific use cases and user needs, decentralized computing power products may be a valuable choice; the same applies to data collection, where decentralized data collection projects also offer advantages, such as reducing dependence on single data sources, providing broader data coverage, and promoting data diversity and inclusivity. In practice, it is essential to weigh these pros and cons and adopt appropriate management and technical measures to overcome challenges, ensuring that decentralized data collection projects positively impact AI development.

In summary, the integration of AI and Web3 offers limitless possibilities for future technological innovation and economic development. By combining AI's intelligent analysis and decision-making capabilities with Web3's decentralization and user autonomy, we believe that a more intelligent, open, and equitable economic and social system can be built in the future.

References#

https://docs.bewater.xyz/zh/aixcrypto/
https://medium.com/@ModulusLabs
https://docs.bewater.xyz/zh/aixcrypto/chapter2.html#_3-4-4-%E4%BA%BA%E5%B7%A5%E6%99%BA%E8%83%BD%E6%8A%80%E6%9C%AF%E5%BC%80%E6%BA%90%E7%9A%84%E9%97%AE%E9%A2%98
https://docs.bewater.xyz/zh/aixcrypto/chapter4.html#_1-2-1-%E6%95%B0%E6%8D%AE
https://mirror.xyz/lukewasm.eth/LxhWgl-vaAoM3s_i9nCP8AxlfcLvTKuhXayBoEr00mA
https://www.galaxy.com/insights/research/understanding-intersection-crypto-ai/
https://www.theblockbeats.info/news/48410?search=1
https://www.theblockbeats.info/news/48758?search=1
https://www.theblockbeats.info/news/49284?search=1
https://www.theblockbeats.info/news/50419?search=1
https://www.theblockbeats.info/news/50464?search=1
https://www.theblockbeats.info/news/50814?search=1
https://www.theblockbeats.info/news/51165?search=1
https://www.theblockbeats.info/news/51099?search=1
https://www.techflowpost.com/article/detail_16418.html
https://blog.invgate.com/chatgpt-statistics
https://www.windowscentral.com/hardware/computers-desktops/chatgpt-may-need-30000-nvidia-gpus-should-pc-gamers-be-worried
https://www.trendforce.com/presscenter/news/20230301-11584.html
https://www.linkedin.com/pulse/great-gpu-shortage-richpoor-chris-zeoli-5cs5c/
https://www.semianalysis.com/p/google-gemini-eats-the-world-gemini
https://news.marsbit.co/20230613141801035350.html
https://medium.com/@taofinney/bittensor-tao-a-beginners-guide-eb9ee8e0d1a4
