What is the expansion of pi? Computers have given a very long answer

You probably remember from math class that pi can be expanded as 3.14. Two decimal places, however, is not a very impressive result compared to what Google's computers have achieved.

In 2019, Google Cloud calculated pi to 31.4 trillion decimal places. The record did not last long, however: just two years later, scientists at the University of Applied Sciences of the Grisons pushed the figure to 62.8 trillion decimal places. And this year brought yet another remarkable achievement.

Read also: Google’s latest text-to-image generator can create realistic images of what you dream of

Google, or more precisely its computers, played the main role again. Thanks to them, the value of pi has now been determined to 100 trillion decimal places. Google Cloud is thus once again involved in breaking the record, which is all the more impressive considering that the figure has roughly tripled in three years.

The expansion of pi is now known to 100 trillion decimal places

This achievement is a testament to how much faster Google Cloud infrastructure gets year after year. The core technology that made it possible is Compute Engine, Google Cloud's secure and configurable computing service, together with its recent additions and improvements: the Compute Engine N2 machine family, 100 Gbps egress bandwidth, Google Virtual NIC, and balanced Persistent Disks, we can read in a statement published by Google.

The program that calculated 100 trillion digits of pi is y-cruncher v0.7.8, which relies on the Chudnovsky algorithm. The computation began on October 14, 2021 and ended on March 21, 2022; in total it took 157 days, 23 hours, 31 minutes, and 7.651 seconds.
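The Chudnovsky series converges very quickly, adding roughly 14 decimal digits of pi per term. The record run used y-cruncher's heavily optimized implementation; as a minimal illustration of the same underlying series, here is a sketch in Python using the standard decimal module (the function name and guard-digit choices are illustrative, not taken from y-cruncher):

```python
from decimal import Decimal, getcontext

def chudnovsky_pi(digits):
    """Return pi to `digits` decimal places via the Chudnovsky series."""
    # Extra guard digits absorb rounding error in the intermediate sums.
    getcontext().prec = digits + 10
    C = 426880 * Decimal(10005).sqrt()
    M, L, X, K = 1, 13591409, 1, 6
    S = Decimal(L)
    # Each term contributes ~14 digits, so digits // 14 + 2 terms suffice.
    for k in range(1, digits // 14 + 2):
        M = M * (K**3 - 16 * K) // k**3   # exact integer recurrence
        L += 545140134
        X *= -262537412640768000
        S += Decimal(M * L) / X
        K += 12
    pi = C / S
    getcontext().prec = digits + 1        # trim the guard digits
    return +pi                            # unary plus rounds to new precision

print(chudnovsky_pi(50))
```

Real record attempts replace the naive term-by-term loop with binary splitting and arbitrary-precision integer arithmetic, which is what makes trillions of digits feasible.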

Read also: They spent four months puzzled over this math question. Chinese genius solved it overnight

As you can imagine, an operation on this scale requires enormous computing power and resources. Google Cloud estimated the temporary storage needed for the calculation at 554 TB. It is worth noting that the company built a cluster consisting of one computing node and 32 storage nodes, exposing a total of 64 iSCSI block storage targets.

