China now has the most powerful neural network: it is many times larger than its competitors from Google and OpenAI


When OpenAI's GPT-3 neural network was introduced in May 2020, it set a new standard in deep learning and was considered the most advanced model of its time. It could generate text that was virtually indistinguishable from text written by a human. But just ten months later, researchers from the Beijing Academy of Artificial Intelligence announced their own generative model, called Wu Dao, capable of everything GPT-3 can do and more.

Just three months later, Wu Dao 2.0 appeared with 1.75 trillion parameters: ten times more than GPT-3 and 150 billion more than Google's Switch Transformer. To build it, the Chinese researchers first developed FastMoE, an open-source training system similar to Google's Mixture of Experts. It made it possible to train the model both on supercomputer clusters and on conventional GPUs, which gave the system considerable flexibility: unlike Google's approach, it does not require proprietary hardware and can run on standard equipment.
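The core idea behind Mixture-of-Experts systems like FastMoE is that a small gating network routes each input to only one (or a few) of many expert sub-networks, so the total parameter count can grow enormously without a proportional increase in compute per input. The following is a minimal illustrative sketch of top-1 routing in pure Python; the expert functions and gate weights are made-up stand-ins, not FastMoE's actual API.

```python
# Toy sketch of top-1 Mixture-of-Experts routing (illustration only;
# real systems like FastMoE implement this over PyTorch sub-networks).
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

# Two hypothetical "experts": simple functions standing in for sub-networks.
experts = [
    lambda x: 2.0 * x,   # expert 0
    lambda x: x + 10.0,  # expert 1
]

def moe_forward(x, gate_weights):
    # The gate scores every expert for this input, then the input is
    # processed only by the top-scoring expert (top-1 routing, as in
    # Switch Transformer). Only one expert's compute is spent per input.
    scores = softmax([w * x for w in gate_weights])
    best = max(range(len(scores)), key=lambda i: scores[i])
    return experts[best](x), best

out, chosen = moe_forward(3.0, gate_weights=[0.5, -0.5])  # routes to expert 0
```

Because each input activates only a fraction of the parameters, a model can hold trillions of parameters in total while each forward pass remains affordable, which is what lets systems in this family scale past dense models like GPT-3.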


With all this computing power, the new neural network has a broad set of capabilities. Unlike most deep learning models, which are built for a single task, Wu Dao is multimodal: it can work with several kinds of data, such as text and images, and handle many different tasks. In that respect it resembles the AI that Facebook uses to combat hate speech and misinformation.

Researchers have demonstrated Wu Dao's abilities in natural language processing, image and text generation, and image recognition. The network can not only write essays, poems and couplets in traditional Chinese, but can also generate alternative text describing a static image and produce nearly photorealistic images from a text description. Wu Dao can also imitate speech, create recipes and predict the three-dimensional structure of proteins, much like AlphaFold.

Almost 5 TB of data was used to train Wu Dao 2.0. Several dozen companies have already expressed interest in the technology.
