

Q&A: Neil Thompson's computing power and innovation | MIT News


Moore’s Law is the famous prediction by Intel co-founder Gordon Moore that the number of transistors on a microchip will double every one to two years. Since the 1970s this forecast has largely been met or exceeded: computing power doubles about every two years, and smaller, faster microchips become less expensive.

This rapid growth in computing power has fueled innovation for decades, but early in the 21st century researchers began sounding the alarm that Moore’s Law was slowing down. With standard silicon technology, there is a physical limit to how many tiny transistors can be crammed into an affordable microchip.

Neil Thompson, a research scientist at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) and the MIT Sloan School of Management, and his research team set out to quantify how much more powerful computers have improved outcomes across society. In a new working paper, co-authored with research assistants Gabriel F. Manso and Shuning Ge, they analyzed five application areas of computing, including weather forecasting, oil exploration, and protein folding (important for drug discovery).

They found that between 49 and 94 percent of the improvement in these areas can be explained by computing power. In weather forecasting, for example, a 10-fold increase in computer processing power improves three-day-ahead predictions by one-third of a degree.
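That kind of relationship can be sketched as a simple log-linear model: performance improves by a fixed amount for every ten-fold increase in compute. This is a hypothetical illustration, not the paper's actual model; the function name and the `gain_per_decade` coefficient are placeholders for whatever per-decade improvement a given domain exhibits.

```python
import math

def forecast_improvement(compute_ratio, gain_per_decade=1.0):
    """Improvement in some performance metric (e.g. forecast accuracy)
    for a multiplicative increase in computing power, assuming the
    improvement scales with log10 of compute -- a placeholder model,
    not the model from Thompson et al.'s working paper."""
    return gain_per_decade * math.log10(compute_ratio)

# Under this model, each 10x of compute buys one "unit" of improvement:
# forecast_improvement(10)  -> 1.0
# forecast_improvement(100) -> 2.0
```

The key property is that equal *multiplicative* jumps in compute buy equal *additive* gains in performance, which is why exponential compute growth has translated into steady year-over-year progress.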

However, computer progress is slowing, which could have far-reaching implications for the economy and society as a whole. Thompson spoke with MIT News about the implications of this research and the end of Moore’s Law.

Q: How did you approach this analysis and quantify the impact of computing on different domains?

A: It is difficult to quantify the impact of computing on real-world results. The most common way to look at computing power, and IT advancements more generally, is to look at how much money companies spend on it and see how that correlates with outcomes. Spending, however, is a difficult measure to use because it only partially reflects the value of the computing power purchased. For example, today’s computer chips may cost as much as last year’s but be much more powerful. Economists try to adjust for such changes in quality, but it is hard to know exactly what that adjustment should be. Our project measured computing power more directly: for instance, we looked at the capabilities of the systems used when protein folding was first done with deep learning. By looking directly at capabilities, we get more accurate measurements and can better estimate the impact of computing power on performance.

Q: How will more powerful computers improve weather forecasting, oil exploration, and protein folding?

A: Simply put, the increase in computing power has had a huge impact on these areas. Weather forecasting has seen a trillion-fold increase in the amount of computing power used for these models. That tells you both how much computing power has grown and how it has been put to use. This is not a matter of someone simply running old programs on faster computers. Instead, users have to constantly redesign their algorithms to take advantage of 10 or 100 times more computer power. A great deal of human ingenuity still goes into improving performance, but our results show that much of that ingenuity is focused on harnessing ever-more-powerful computing engines.

Oil exploration is an interesting case because it gets harder over time as the easy-to-find wells are drilled. Oil companies fight that trend by using some of the world’s largest supercomputers to interpret seismic data and map the subsurface geology, which helps them do a better job of drilling holes in exactly the right places.

Using computing to improve protein folding has long been a goal, because understanding the three-dimensional shape of these molecules is crucial: that shape determines how they interact with other molecules. In recent years, the AlphaFold system has made remarkable progress in this area. Our analysis shows that these improvements are well predicted by the significant increases in computing power used.

Q: What was the biggest challenge in conducting this analysis?

A: When looking at two trends that both grow over time, in this case performance and computing power, the most important question is how much of the relationship between them is causation and how much is just correlation. In the areas we studied, we can partially answer that question because the companies involved invest a lot of money and do a lot of testing. They are not just spending tens of millions of dollars on computers and hoping it works: they run an evaluation, find that running a model for twice as long improves performance, and then buy a system powerful enough to do that computation quickly so they can put it into production. That gives us considerable confidence. But there are other ways to check for causality as well. For example, the computing power that NOAA (the National Oceanic and Atmospheric Administration) uses for weather forecasting has grown in big, discrete leaps, and when a much bigger computer is bought and installed all at once, performance jumps accordingly.

Q: Would these advances have been possible without the exponential increase in computing power?

A: This is a tough question because there are different inputs, such as human capital, traditional capital, and computing power, and all three have changed over time. One might argue that if computing power increased a trillion-fold, it must have had the greatest effect. That is a good intuition, but you also have to account for diminishing marginal returns. Going from having no computers to having one computer is an enormous change; going from 100 computers to 101 adds only a small benefit. So there are two competing forces: on the one hand computing power has grown enormously, and on the other hand its marginal returns keep shrinking. Our research shows that even though we already have a ton of computing power, it is growing so fast that it explains much of the performance improvement in these areas.
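The two competing forces described above can be seen in a toy production function. This is a minimal sketch under an assumed log-shaped relationship between compute and performance; the function and its exact form are illustrative choices, not the paper's model.

```python
import math

def performance(compute):
    """Toy production function with diminishing marginal returns:
    performance grows only with the logarithm of computing power."""
    return math.log10(1 + compute)

# Diminishing returns: the first computer matters far more than the 101st.
gain_first = performance(1) - performance(0)      # ~0.301
gain_101st = performance(101) - performance(100)  # ~0.004

# But exponentially growing compute offsets this: once compute is large,
# each doubling adds a near-constant increment (~log10(2), about 0.301),
# so steady exponential growth keeps delivering steady gains.
gain_doubling = performance(2**20) - performance(2**19)
```

This is why both halves of the answer hold at once: each additional unit of compute is worth less than the last, yet compute has grown so fast that the logarithm of compute, and hence performance, has kept climbing steadily.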

Q: What are the implications of decelerating Moore’s Law?

A: I am very concerned about the impact. As computing improves, it enhances weather forecasting and the other areas we studied, but it also improves countless other areas we did not measure that are nonetheless an important part of our economy and society. If that engine of improvement slows down, all of these follow-on effects slow down too.

Some may object that there are many ways to innovate, and that if one path slows down, others will make up for it. On some level that is true. For example, we have already seen growing interest in designing specialized computer chips as a way to compensate for the end of Moore’s Law. The problem, however, is the magnitude of these effects. The benefits of Moore’s Law have been so great that, in many application areas, they cannot be made up by other sources of innovation.