Nvidia, the world’s most valuable listed chip company, announced Monday that it will build one of the world’s fastest artificial intelligence supercomputers in Israel at a cost of hundreds of millions of dollars.
The system, to be called “Israel-1,” will deliver up to eight exaflops of AI computing performance, according to Reuters. For perspective, one exaflop is one quintillion (1,000,000,000,000,000,000) calculations per second.
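To give a sense of that scale, here is a back-of-the-envelope calculation (a hypothetical sketch for illustration only; the eight-exaflop figure is the one reported above, and the population figure is an assumption of roughly eight billion people):

```python
# Back-of-the-envelope: the scale of an eight-exaflop system.
EXAFLOP = 10**18                    # one quintillion operations per second
system_ops_per_sec = 8 * EXAFLOP    # reported peak of "Israel-1"

# If every one of ~8 billion people performed one calculation per second,
# how long would they need to match one second of this system's work?
world_population = 8 * 10**9
seconds_needed = system_ops_per_sec / world_population
years_needed = seconds_needed / (365 * 24 * 3600)
print(f"{years_needed:.0f} years")  # → roughly 32 years
```

In other words, one second of the machine’s peak output would take the entire human population, calculating by hand, more than three decades to reproduce.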
“This morning we woke up to great news,” Finance Minister Bezalel Smotrich said in a statement. “The State of Israel has been preparing itself for many years for this, and here it is.
“The Israel-1 computer is to be one of the most powerful supercomputers in the world for AI calculations.
“Nvidia believes in Israeli high-tech and has already announced its intention to continue recruiting 1,000 additional workers here,” Smotrich noted.
“This is important news for high-tech and AI and an expression of confidence in the State of Israel, its economy and Israeli human capital. Nvidia has phenomenal development momentum, and Israel has been and will remain fertile ground for the development of the high-tech and AI environment.
“As Minister of Finance, I will soon, with God’s help, promote significant steps for the high-tech industry, and the State of Israel will continue to flourish in this arena as in other arenas.
“The Israeli Startup Nation is alive and kicking and will kick a lot more,” Smotrich added.
‘Generative AI is Going Everywhere’
The cloud-based system, developed by the former Mellanox team, is to be partly operational by the end of 2023. Nvidia outbid Intel Corporation to acquire Mellanox Technologies, an Israeli chip designer, in 2019 for close to $7 billion.
Nvidia senior vice president Gilad Shainer said the company worked on the project with 800 startups and tens of thousands of software engineers in the Jewish State.
Calling artificial intelligence the “most important technology in our lifetime,” Shainer added that large numbers of graphics processing units (GPUs) are needed to develop AI and generative AI applications.
OpenAI’s ChatGPT, for example, was built with thousands of Nvidia GPUs.
“Generative AI is going everywhere nowadays,” Shainer told Reuters. “You need to be able to run training on large datasets.”
The system will enable companies in Israel to carry out training much faster, “to build frameworks and build solutions that can tackle more complex problems,” he said.
Nvidia Works with Microsoft and Google
ChatGPT initially required approximately 20,000 GPUs in 2020 to process the data on which it was trained. That number has since leaped to more than 30,000 GPUs as OpenAI backer Microsoft added AI features to its Bing search engine, according to the Euroeseuro tech website.
Nvidia holds roughly a 90-percent share of the market for artificial intelligence hardware, with clients that include both Microsoft and Google. This past March, Google’s cloud computing division announced an agreement with Nvidia to help accelerate generative artificial intelligence in the market.
Microsoft uses Nvidia’s GPUs to power its artificial intelligence supercomputer, while Google has developed a specialized chip, the “tensor processing unit” (TPU), to power its own.
The internet giant is also gradually releasing its new conversational chatbot service, dubbed “Bard,” described by the company as “an early experiment that lets you collaborate with generative AI.”