What is the expansion of pi? Computers have given a very long answer

You probably remember from math lessons that the expansion of pi begins with 3.14. However, two decimal places is not a very impressive result compared to what Google's computers have achieved.

In 2019, Google Cloud computed this number to 31.4 trillion decimal places. The record did not last long, however: only two years later, scientists from the University of Applied Sciences of the Grisons pushed the figure to 62.8 trillion decimal places. And this year there was another remarkable achievement.


Google played the main role again, or more precisely, its computers did. With their help, the value of pi has been determined to 100 trillion decimal places. Google Cloud is thus once again involved in breaking records, which is especially impressive when you consider that the record has roughly tripled in three years.

The expansion of pi is now known to 100 trillion decimal places

This achievement is a testament to how much the Google Cloud infrastructure accelerates from year to year. "The core technology that made this possible is Compute Engine, Google Cloud's secure and configurable computing service, plus several recent additions and improvements: the Compute Engine N2 machine family, 100 Gbps egress bandwidth, Google Virtual NIC, and balanced Persistent Disks," we can read in a statement published by Google.

The program that managed to calculate 100 trillion digits of pi is y-cruncher v0.7.8, and the algorithm it relies on is the Chudnovsky algorithm. The computation began on October 14, 2021 and ended on March 21, 2022; more precisely, it took 157 days, 23 hours, 31 minutes, and 7.651 seconds.
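To give a sense of how the Chudnovsky series converges, here is a minimal sketch of the algorithm in Python using the standard-library `decimal` module. It is an illustration only, not the production approach: record attempts like this one use binary splitting and highly optimized arbitrary-precision arithmetic, while this version simply iterates the series term by term. The function name `chudnovsky_pi` and the guard-digit margin are choices of this sketch.

```python
from decimal import Decimal, getcontext

def chudnovsky_pi(digits):
    """Approximate pi to `digits` decimal places with the Chudnovsky series:
    1/pi = 12 * sum_k (-1)^k (6k)! (13591409 + 545140134k) / ((3k)!(k!)^3 640320^(3k+3/2))
    """
    getcontext().prec = digits + 10  # extra guard digits for intermediate rounding
    C = 426880 * Decimal(10005).sqrt()  # 640320^(3/2) / 12 = 426880 * sqrt(10005)
    M, L, X, K = 1, 13591409, 1, 6
    S = Decimal(L)
    # Each additional term contributes roughly 14 more correct digits.
    for i in range(1, digits // 14 + 2):
        M = M * (K**3 - 16 * K) // (i**3)  # ratio of consecutive (6k)!/((3k)!(k!)^3)
        L += 545140134
        X *= -262537412640768000           # (-640320^3)^k
        S += Decimal(M * L) / X
        K += 12
    pi = C / S
    getcontext().prec = digits + 1
    return +pi  # unary plus rounds to the target precision

print(chudnovsky_pi(50))
```

Because the series gains about 14 digits per term, even 50 decimal places require only a handful of iterations; the 100-trillion-digit computation owes its feasibility to this rapid convergence combined with fast large-integer multiplication.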


As you can imagine, such a large-scale operation requires a lot of computing power and resources. Google Cloud estimated the temporary storage needed to perform the calculations at 554 TB. It is worth noting that the company built a cluster consisting of one computational node and 32 storage nodes, which together exposed 64 iSCSI block storage targets.
