Engineers and designers can do only so much to miniaturize transistors and pack as many of them as possible into chips. So they're turning to other approaches to chip design, incorporating technologies like AI into the process. Samsung, for instance, is adding AI to its memory chips to enable processing in memory, thereby saving energy and speeding up machine learning. Speaking of speed, Google's TPU V4 AI chip has doubled its processing power compared with that of its previous version.

But AI holds still more promise and potential for the semiconductor industry. To better understand how AI is set to revolutionize chip design, we spoke with Heather Gorr, senior product manager for MathWorks' MATLAB platform.

**How is AI currently being used to design the next generation of chips?**

Heather Gorr: AI is such an important technology because it's involved in most parts of the cycle, including the design and manufacturing process. There are a lot of important applications here, even in general process engineering, where we want to optimize things. I think defect detection is a big one at all phases of the process, especially in manufacturing. But even thinking ahead in the design process, when you're designing the light and the sensors and all the different components, there's a lot of anomaly detection and fault mitigation that you really want to consider. Then, thinking about the logistical modeling that you see in any industry, there is always planned downtime that you want to mitigate, but you also end up having unplanned downtime. So, looking back at the historical data from those moments when it took a bit longer than expected to manufacture something, you can take all of that data and use AI to try to identify the proximate cause, or to see something that might jump out even in the processing and design phases. We often think of AI as a predictive tool, or as a robot doing something, but a lot of times you get a lot of insight from the data through AI.

**What are the benefits of using AI for chip design?**

Gorr: Historically, we've seen a lot of physics-based modeling, which is a very intensive process. We want to do a reduced-order model, where instead of solving such a computationally expensive and extensive model, we can do something a little cheaper. You could create a surrogate model, so to speak, of that physics-based model, use the data, and then do your parameter sweeps, your optimizations, and your Monte Carlo simulations using the surrogate model. That takes a lot less time computationally than solving the physics-based equations directly. So we're seeing that benefit in many ways, including the efficiency and economy that come from iterating quickly on the experiments and simulations that will really help in the design.

**So it's like having a digital twin in a sense?**

Gorr: That's pretty much what people are doing, where you have the physical system model and the experimental data. Then, in conjunction, you have this other model that you can tweak and tune and try different parameters and experiments with, letting you sweep through all of those different situations and come up with a better design in the end.

**So, it's going to be more efficient and, as you said, cheaper?**

Gorr: Especially in the experimentation and design phases, where you're trying different things. That's obviously going to yield dramatic cost savings if you're actually manufacturing and producing. You want to simulate, test, and experiment as much as possible without making anything with the actual process engineering.

**How accurate are these surrogate models?**

Gorr: They tend to not be as accurate as physics-based models. Of course, that's why you do many simulations and parameter sweeps. Both chip design and manufacturing are system-intensive; you have to consider every little part. But that's also the benefit of having that digital twin: you can keep in mind that it's not going to be as accurate as the precise model that we've developed over the years.
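The surrogate-model workflow Gorr describes (run the expensive physics-based model a limited number of times, fit a cheap surrogate to that data, then run the parameter sweeps and Monte Carlo trials on the surrogate) can be sketched roughly as follows. This is an illustrative sketch in Python rather than MATLAB, and `physics_model`, the quadratic feature set, and all the numbers are invented stand-ins, not anything from MathWorks:

```python
import math
import random

# Stand-in for an "expensive" physics-based model of two design
# parameters. In a real flow this would be a device-physics or
# finite-element simulation that is costly to evaluate.
def physics_model(x, y):
    return math.exp(-((x - 0.3) ** 2 + (y + 0.2) ** 2))

# Step 1: evaluate the expensive model at a modest number of points.
random.seed(0)
samples = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(200)]
observations = [physics_model(x, y) for x, y in samples]

# Step 2: fit a cheap surrogate -- here a quadratic polynomial fit by
# least squares (normal equations + Gaussian elimination, stdlib only).
def features(x, y):
    return [1.0, x, y, x * x, x * y, y * y]

def fit_least_squares(points, targets):
    k = len(features(0.0, 0.0))
    ata = [[0.0] * k for _ in range(k)]  # A^T A
    atb = [0.0] * k                      # A^T b
    for (x, y), t in zip(points, targets):
        f = features(x, y)
        for i in range(k):
            atb[i] += f[i] * t
            for j in range(k):
                ata[i][j] += f[i] * f[j]
    for col in range(k):                 # elimination with partial pivoting
        pivot = max(range(col, k), key=lambda r: abs(ata[r][col]))
        ata[col], ata[pivot] = ata[pivot], ata[col]
        atb[col], atb[pivot] = atb[pivot], atb[col]
        for r in range(col + 1, k):
            m = ata[r][col] / ata[col][col]
            for c in range(col, k):
                ata[r][c] -= m * ata[col][c]
            atb[r] -= m * atb[col]
    w = [0.0] * k                        # back-substitution
    for r in range(k - 1, -1, -1):
        w[r] = (atb[r] - sum(ata[r][c] * w[c] for c in range(r + 1, k))) / ata[r][r]
    return w

weights = fit_least_squares(samples, observations)

def surrogate(x, y):
    return sum(w * f for w, f in zip(weights, features(x, y)))

# Step 3: a Monte Carlo parameter sweep on the surrogate, which is far
# cheaper per evaluation than re-running the physics model.
trials = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(100_000)]
best = max(trials, key=lambda p: surrogate(*p))
print("best design parameters found on the surrogate:", best)
```

Whether a quadratic is an adequate surrogate depends entirely on the underlying model; in practice people reach for Gaussian processes, neural networks, or other reduced-order models, and, as Gorr notes about accuracy, the best candidates from the sweep are validated against the full physics-based model.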
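Gorr's point about looking back at historical data for the runs that took longer than expected is, at its simplest, anomaly detection. Below is a minimal sketch in Python (not MATLAB, and the cycle-time numbers are entirely synthetic) using a modified z-score; the median-based baseline is robust, so the outliers being hunted don't skew the statistics used to find them:

```python
import statistics

# Synthetic historical cycle times (hours) for one manufacturing step;
# a couple of runs took much longer than expected.
cycle_times = [
    4.1, 3.9, 4.0, 4.2, 4.0, 3.8, 4.1, 9.5,   # 9.5 h: unplanned downtime
    4.0, 4.3, 3.9, 4.1, 8.8, 4.0, 4.2, 4.1,   # 8.8 h: another slow run
]

# Robust baseline: median and median absolute deviation (MAD).
med = statistics.median(cycle_times)
mad = statistics.median(abs(t - med) for t in cycle_times)

# Modified z-score (Iglewicz-Hoaglin); 3.5 is the conventional cutoff.
def modified_z(t):
    return 0.6745 * (t - med) / mad

anomalies = [(i, t) for i, t in enumerate(cycle_times) if abs(modified_z(t)) > 3.5]
print(anomalies)
```

Runs 7 and 12 (the 9.5 h and 8.8 h cycles) are flagged; in a real pipeline each flagged run would then be cross-referenced with process logs to find the proximate cause Gorr mentions.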