It’s the largest neural network ever created

Chinese artificial intelligence researchers at the Beijing Academy of Artificial Intelligence (BAAI) unveiled Wu Dao 2.0, the world’s biggest natural language processing (NLP) model.

NLP (Natural Language Processing) is a branch of A.I. research that aims to give computers the ability to understand text and spoken words and respond much the same way human beings do.

GPT-3 (Generative Pre-trained Transformer 3) was until now the most powerful language model we had seen: a 175 billion–parameter deep learning model trained on text datasets containing hundreds of billions of words. A parameter in a neural network is a value, learned during training, that gives each piece of input data a larger or smaller weighting in the model's output.
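To make the idea of a parameter concrete, here is a toy sketch (not GPT-3 itself): a single artificial neuron whose parameters are its weights and bias, each weight scaling one input up or down.

```python
# Toy illustration of "parameters": a single neuron.
inputs = [0.5, -1.2, 3.0]    # one piece of input data
weights = [0.8, 0.1, -0.4]   # parameters: a larger weight means more influence
bias = 0.2                   # another learned parameter

# Each input is multiplied by its weight; training nudges the weights
# (GPT-3 has 175 billion of them) so outputs move toward desired answers.
output = sum(x * w for x, w in zip(inputs, weights)) + bias
print(output)  # 0.5*0.8 + (-1.2)*0.1 + 3.0*(-0.4) + 0.2 = -0.72
```

A large model is, at heart, billions of such weighted sums stacked and composed.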

Wu Dao 2.0 (Chinese for enlightenment) is 10 times larger than GPT-3, with 1.75 trillion parameters. It is also a multitasker: it can simulate conversational speech, write poems, understand pictures, and even generate recipes.

This A.I. can write essays, poems, and couplets in traditional Chinese, as well as caption images, produce natural language descriptions, and create nearly photorealistic artwork.

Wu Dao 2.0 is multimodal: it can perform text generation, image recognition, and image generation tasks. It covers both Chinese and English, with skills acquired by studying 4.9 terabytes of images and texts, including 1.2 terabytes each of Chinese and English text. It can also learn from text and images together and tackle tasks that involve both types of data (something GPT-3 can't do).

Wu Dao 2.0 was trained with FastMoE, a Mixture-of-Experts system that trains different expert sub-models for each modality and lets the larger model select which expert to consult for each type of task. Moreover, FastMoE is open-source and doesn't require specialized hardware, which makes this approach more accessible.
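The Mixture-of-Experts idea behind FastMoE can be sketched as follows. This is a hypothetical illustration, not FastMoE's actual API: the expert functions and the `gate` routing on an explicit task label are simplifications of what is, in practice, a learned gating network.

```python
# Hypothetical Mixture-of-Experts routing sketch (names are illustrative).
def text_expert(x):
    return f"text-result({x})"

def image_expert(x):
    return f"image-result({x})"

EXPERTS = {"text": text_expert, "image": image_expert}

def gate(task_type):
    # A real gate is a trained network that scores every expert per input;
    # here we route on an explicit task label for clarity.
    return EXPERTS[task_type]

def moe_model(task_type, x):
    expert = gate(task_type)  # select the expert sub-model for this task
    return expert(x)          # only the chosen expert runs

print(moe_model("text", "a poem about spring"))
```

Because only the selected expert runs for a given input, a trillion-parameter model never has to evaluate all of its parameters at once, which is what makes such scale practical.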


In addition, Wu Dao 2.0 can predict the 3D structures of proteins and can also power "virtual idols". Just recently, BAAI researchers unveiled Hua Zhibing, China's first A.I.-powered virtual student. She can learn continuously, compose poetry, and draw pictures, and will learn to code in the future. In contrast with GPT-3, Wu Dao 2.0 can learn different tasks over time without forgetting what it learned previously. This feature seems to bring A.I. even closer to human memory and learning mechanisms.

“Wu Dao 2.0 aims to enable machines to think like humans and achieve cognitive abilities beyond the Turing test”, said Jie Tang, the lead researcher behind Wu Dao 2.0. The rapid progress in language models suggests that A.I. may soon pass the Turing test.

“What we are building is a power plant for the future of AI. With mega data, mega computing power, and mega models, we can transform data to fuel the AI applications of the future”, BAAI chair Dr. Hongjiang Zhang said.

A.I. capability is constantly increasing, and it brings both pros and cons. This technology has the potential to automate tasks that require language understanding and technical sophistication, such as interpreting or translating documents, launching actions, creating alerts, generating code and articles, and answering questions on almost any topic. Application areas include technology, customer service (e.g. assisting with customer queries), marketing (e.g. writing attractive copy), and sales (e.g. communicating with potential customers), among others.

However, one of the biggest problems will surely be manipulation, especially of information. Anyone might use these kinds of algorithms to generate fake news at scale, or to write while impersonating someone else. But maybe it's too early to imagine all the possible implications.
