Stargate: Artificial Superintelligence in 4 Years?

“According to OpenAI’s press release outlining the Stargate project, the partners will begin deploying $100 billion immediately. The key technology partners are Arm Holdings, Microsoft, NVIDIA, Oracle, and OpenAI. Initial funding will be supplied by Softbank, OpenAI, Oracle, and the MGX AI investment fund based in the United Arab Emirates. Starting with a gigantic data center currently under construction in Abilene, Texas, the ultimate goal is to build as many as 20 AI data centers scattered around the country over the next four years.
“Stargate will be building the physical and virtual infrastructure to power the next generation of advancements in AI,” declared President Trump. He further suggested, “I think it’s going to be something that’s very special. It’ll lead to something that could be the biggest of all.” How special? How about the development of artificial superintelligence?

“I think AGI is coming very, very soon,” said Son. Artificial general intelligence (AGI) refers to systems capable of performing any intellectual task that a human can. But Son didn’t stop with just the advent of human-level capabilities. “After that, artificial superintelligence will come to solve the issues that mankind would never, ever have thought that we could solve. Well, this is the beginning of our golden age,” he observed.”

https://reason.com/2025/01/22/stargate-artificial-superintelligence-in-4-years/

China’s DeepSeek AI is hitting Nvidia where it hurts

“DeepSeek also claims to have needed only about 2,000 specialized chips from Nvidia to train V3, compared to the 16,000 or more required to train leading models, according to the New York Times. These unverified claims are leading developers and investors to question the compute-intensive approach favored by the world’s leading AI companies. And if true, it means that DeepSeek engineers had to get creative in the face of trade restrictions meant to ensure US domination of AI.”

https://www.theverge.com/2025/1/27/24352801/deepseek-ai-chatbot-chatgpt-ios-app-store

Chinese researchers develop AI model for military use on back of Meta’s Llama

“Top Chinese research institutions linked to the People’s Liberation Army have used Meta’s publicly available Llama model to develop an AI tool for potential military applications, according to three academic papers and analysts.”

https://www.yahoo.com/finance/news/exclusive-chinese-researchers-develop-ai-023814416.html

Amazing New Chinese A.I.-Powered Language Model Wu Dao 2.0 Unveiled

“Chinese artificial intelligence (A.I.) researchers at the Beijing Academy of Artificial Intelligence (BAAI) unveiled Wu Dao 2.0, the world’s biggest natural language processing (NLP) model. And it’s a big deal.

NLP is a branch of A.I. research that aims to give computers the ability to understand text and spoken words and respond to them in much the same way human beings can.

Last year, the San Francisco–based nonprofit A.I. research laboratory OpenAI wowed the world when it released its GPT-3 (Generative Pre-trained Transformer 3) language model. GPT-3 is a 175 billion–parameter deep learning model trained on text datasets with hundreds of billions of words. A parameter is a calculation in a neural network that shapes the model’s data by assigning to each chunk a greater or lesser weighting, thus providing the neural network a learned perspective on the data.
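The article’s description of a parameter as a learned weighting can be made concrete with a toy sketch. The code below is purely illustrative and not from the article: it shows a single linear layer whose six weights are the “parameters” in the article’s sense, where GPT-3 has roughly 175 billion of them spread across many such layers.

```python
# Illustrative sketch only: "parameters" are the learned weights a neural
# network applies to its inputs. This toy linear layer has just 6 parameters
# (a 3x2 weight matrix); GPT-3 has ~175 billion across all its layers.

def linear_layer(inputs, weights):
    """Weight each input feature and sum, once per output unit."""
    return [sum(x * w for x, w in zip(inputs, unit)) for unit in weights]

# Each inner list is one output unit's weights over the 2 input features.
# Training nudges these numbers so the outputs better match the data.
weights = [[0.5, -1.0], [2.0, 0.25], [-0.75, 1.5]]
inputs = [1.0, 2.0]

outputs = linear_layer(inputs, weights)
print(outputs)  # [-1.5, 2.5, 2.25]
```

The greater or lesser weighting the article mentions is exactly these numbers: a large positive weight amplifies an input’s contribution to an output unit, while a negative one suppresses it.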

Back in November, The New York Times reported that GPT-3 “generates tweets, pens poetry, summarizes emails, answers trivia questions, translates languages and even writes its own computer programs, all with very little prompting.” GPT-3, move on over. Wu Dao 2.0 is here.

Wu Dao (Chinese for “enlightenment”) 2.0 is ten times larger than GPT-3, using 1.75 trillion parameters to simulate conversational speech, write poems, understand pictures, and even generate recipes. In addition, as the South China Morning Post reports, Wu Dao 2.0 is multimodal, covering both Chinese and English with skills acquired by studying 4.9 terabytes of images and texts, including 1.2 terabytes each of Chinese and English texts.

“Wu Dao 2.0’s multimodal design affords it a range of skills, including the ability to perform natural language processing, text generation, image recognition, and image generation tasks,” reports VentureBeat. “It can write essays, poems, and couplets in traditional Chinese, as well as captioning images and creating nearly photorealistic artwork, given natural language descriptions.” In addition, Wu Dao 2.0 can predict the 3D structures of proteins, like DeepMind’s AlphaFold, and can also power “virtual idols.” Just recently, BAAI researchers unveiled Hua Zhibing, China’s first A.I.-powered virtual student.”