Future of chipmaking will rely heavily on AI to detect flaws, says Applied Materials
To make the high-end chips in the Apple iPhone, such as the A14, or Nvidia's A100-series AI processors, with their billions of transistors, you need a factory that costs $16 billion to build and maintain. That figure is up from $10 billion just eight years ago, and is expected to climb significantly further, to perhaps $18 billion in the next few years.
This poses a dilemma for the chip industry: these chips need to be inspected for flaws more than ever, but chipmakers are under more pressure than ever to ship product in order to recoup their investment.
“You should naturally want to inspect more, because there are more process steps, more things that can go wrong, but if you look at what has happened, economics have prevented our customers from doing this inspection,” said Keith Wells, group vice president of the imaging and process control group at Applied Materials, the world’s largest maker of tools for chipmaking.
“We see this need to really solve this economic problem for our customers,” said Wells, who spoke with ZDNet via Zoom.
A number of tools are used to solve this economic dilemma, and one of the areas with the most potential, Wells said, is artificial intelligence, particularly the machine learning form of AI.
In March, the company unveiled Enlight, a scanner that had been in development for five years and was deployed in beta in 2019.
Enlight uses the polarization of light to maximize resolution and detect critical defects in half the time of a conventional optical scanner. For the first time, the scanner captures both direct light bouncing off the surface of the wafer and scattered light, referred to as “brightfield” and “grayfield,” respectively. It’s like two scans in one pass, cutting the time in half.
In addition to the new optics, the scanner uses variable algorithms rather than hard-wired ones. This paves the way for functions learned through machine learning.
With more and more particles appearing on wafers, chipmakers must sort out which particles will actually be lethal to a chip. Examples of truly damaging faults are an “open,” where a circuit is broken, and a “short,” where two parallel circuits have been bridged by a particle that fell between them, also called a bridge defect.
This leads to a classification problem which is a classic machine learning task: is this flaw fatal or manageable?
“When you get down to 5 nanometers, if a particle falls into an open area you don’t care; it’s unlikely to impact a circuit,” Wells explained. “But if it falls on a transistor, it is likely to kill the chip,” meaning that part of the silicon wafer is unusable. Chip manufacturing economics depend on how many working chips, the known good die, can be obtained from each wafer. This metric is the “yield” of chip manufacturing.
“Therefore, our customers want to know not only what is on the wafer, but what is likely to kill the wafer or not, so you have to sort out those two things,” Wells said.
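The sorting task Wells describes is, at bottom, binary classification. As a toy illustration, with invented features and a made-up labeling rule rather than Applied's actual model, even a plain logistic regression can separate killer detections from nuisances:

```python
# Toy sketch: classify detections as killer (1) vs. nuisance (0).
# The two features and the labeling rule below are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
# Features per detection: x0 = normalized particle size,
#                         x1 = normalized distance to the nearest circuit line
X = rng.uniform(0, 1, size=(n, 2))
# Invented ground truth: big particles close to a circuit line are fatal
y = ((X[:, 0] > 0.5) & (X[:, 1] < 0.3)).astype(float)

# Logistic regression fit by plain gradient descent
w, b = np.zeros(2), 0.0
for _ in range(3000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probabilities
    w -= (X.T @ (p - y)) / n                 # gradient of the log loss
    b -= np.mean(p - y)

def is_killer(size, dist):
    """True if the model thinks this detection would kill the chip."""
    return 1.0 / (1.0 + np.exp(-(size * w[0] + dist * w[1] + b))) > 0.5

print(is_killer(0.9, 0.1))   # large particle right on a circuit line
print(is_killer(0.9, 0.9))   # large particle in an open area
```

In a real fab the features would come from the optical signal and the labels from SEM review; the point here is only the shape of the decision problem.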
This judgment happens in two parts, detection and then classification. One is fast, the other slow.
The Enlight scanner, being faster than previous scanners, can be used more frequently during manufacturing, thus gathering more images of potential defects. In fifteen minutes to an hour, it will generate maybe a million images of what could be a fatal flaw, or what could just be noise in scattered light.
These millions of potential defects are passed to a separate tool, called SEMVision, which performs the classification. SEMVision is a scanning electron microscope: it scans the surface of the wafer with a focused electron beam. It works more slowly, at greater resolution, covering just the sections of the wafer suggested by Enlight. It examines each potential defect Enlight reports to see whether it is an open or a bridge, or perhaps a harmless protrusion, and classifies what it sees.
SEMVision takes thirty minutes to scan, then forty-five minutes to review. “They work together, sensing with the optical system and examining at very high resolution with the eBeam tool,” Wells explained. “You use this for the initial training of your AI, which then tells you about the prediction of performance faults.”
Both Enlight and SEMVision send their data to a third system, a rack computer running a program called ExtractAI. The program calculates the probability that each flagged site on the wafer is a real, yield-killing defect, and can direct SEMVision to spend its slow, high-resolution reviews on other areas.
“ExtractAI says, I looked at the last fifteen flaws placed in this n-dimensional space, and they’re all bridges, so now I’m very confident that all of those are bridges,” Wells explained. “Stop looking at those areas, go sample this other population and take 50 or 100 of them, and it works through that with this back-and-forth communication between SEMVision and the ExtractAI computer.”
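The back-and-forth Wells describes resembles what the machine learning literature calls active learning with uncertainty sampling. Here is a deliberately simplified sketch: synthetic one-dimensional data, a nearest-neighbour stand-in for the classifier, and nothing of ExtractAI's actual algorithm, which Applied does not disclose.

```python
# Sketch of an adaptive review loop: the fast optical scan produces many
# candidates; a slow "SEM review" reveals a candidate's true class; the
# classifier's uncertainty decides which candidates get reviewed next.
import random

random.seed(0)

# Synthetic candidates: a 1-D appearance score plus a hidden ground truth
# that only the slow SEM review can reveal.
pool = [(random.gauss(0, 1), "bridge") for _ in range(500)] + \
       [(random.gauss(5, 1), "nuisance") for _ in range(500)]
random.shuffle(pool)

labeled = [pool.pop(), pool.pop()]  # seed with two SEM reviews

def nearest_label(x):
    """1-nearest-neighbour prediction from the SEM-reviewed examples."""
    return min(labeled, key=lambda p: abs(p[0] - x))[1]

def uncertainty(x):
    """High when the nearest examples of each class are equally far away."""
    d_b = min((abs(p[0] - x) for p in labeled if p[1] == "bridge"), default=1e9)
    d_n = min((abs(p[0] - x) for p in labeled if p[1] == "nuisance"), default=1e9)
    return -abs(d_b - d_n)

for _ in range(5):                      # a few SEM review rounds
    pool.sort(key=lambda c: uncertainty(c[0]), reverse=True)
    labeled.extend(pool[:20])           # SEM reviews the 20 most uncertain
    del pool[:20]

# After a few rounds, check predictions on the never-reviewed candidates
acc = sum(nearest_label(x) == truth for x, truth in pool) / len(pool)
print(f"accuracy on remaining candidates: {acc:.2f}")
```

The design point matches the quote: the expensive reviewer stops confirming what the classifier is already sure about and spends its time where confidence is lowest.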
This feedback loop is new in the use of AI, Wells said. Traditional use of scanning electron microscopes uses a set of static rules to classify defects.
“You hear people say they are using AI, but they set up a static classifier, and then someone has to decide whether that classifier is no longer good because the process has changed,” Wells explained.
In contrast, “this classification system is constantly adapting and updating,” he said of ExtractAI.
“Every wafer that goes through Enlight and is sent to SEMVision feeds a real-time AI process that learns and adapts our classifier to determine what is a yield-killing defect versus a nuisance,” he said. Each cycle from Enlight to SEMVision to ExtractAI “tries to optimize the outcome” once more.
“We use an adaptive learning method, which is unique,” he said. That means “with every wafer we adapt to what we see; with every wafer the AI makes a decision and tries to optimize the outcome.”
ExtractAI, once trained, can use the million points of interest to classify the entire surface of the wafer. “We’re delivering a real defect map, and it’s more accurate and valuable than what a competitor can do,” Wells said.
The achievement is a confluence of factors. Faster optical scanning with Enlight makes it possible to see more things, sooner, than a human reviewing the SEM output could. And the computational capacity of today's faster processors has allowed for continuous retraining.
“It’s this synergy that’s happening in the semiconductor industry,” he said, “all of a sudden the processors and GPUs are fast enough that you can start deploying AI and get results quickly.
“The element of math to make this near real time, where you have answers in seconds, has been a key element in moving this forward.”
Wells did not disclose the exact details of the machine learning algorithms, citing protected intellectual property. It’s not really deep learning, he clarified, given the relatively small number of data points.
“It’s closer to what is considered traditional machine learning than a neural network, and one of the reasons we’ve gone down this path is that neural networks are very data hungry,” Wells said.
“We have big data, but if you look at a Google Image Classifier, their configuration is hundreds of thousands of data points, while we have thousands of data points.”
What works well with those thousands of data points up front, to get wafer inspection up and running, “is closer to a machine learning approach, where you extract the ‘interesting’ attributes and then use techniques to determine which attributes are most important in a classification.”
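That extract-attributes-then-rank-them recipe can be sketched with a standard toolkit. The feature names and labeling rule here are invented; only the technique, using a tree ensemble's feature importances as one common way to rank attributes, mirrors what Wells describes:

```python
# Sketch: hand-extracted attributes per detection, then rank which ones
# matter most for the killer-vs-nuisance call. All names are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n = 2000
names = ["area", "aspect_ratio", "brightness", "distance_to_line"]
X = rng.uniform(0, 1, size=(n, 4))
# Invented ground truth: only area and distance_to_line actually matter
y = ((X[:, 0] > 0.5) & (X[:, 3] < 0.4)).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Rank the attributes by how much they contribute to the classification
ranking = sorted(zip(names, clf.feature_importances_), key=lambda p: -p[1])
for name, imp in ranking:
    print(f"{name:17s} {imp:.3f}")
```

With informative features, a ranking like this flags which extracted attributes carry the signal; uninformative ones land near zero importance, which is why this style of model stays practical with thousands rather than hundreds of thousands of examples.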
Keep in mind that the factory is a secure facility. It’s not on the Internet and there is no outside connection to the tool in the chip factory, Wells noted.
“When you figure out how to do that, you can start with an algorithm that contains weights but doesn’t contain anyone’s IP,” he said. “But then all the additional data to do the training can only be generated on site, in that factory; we can’t bring in a bunch of other data points.”
The immediate economic gain from Enlight, SEMVision and ExtractAI, Wells said, is that the combined tools let chipmakers inspect wafers 1.6 times more frequently, increasing the chances of finding the killer defect. Over time, Applied believes that more tools like ExtractAI will augment the work of human fab technicians and thus help close the yield gap, the difference between desired and actual yield.
The ExtractAI tool has been used exclusively for logic circuits to date and has just been deployed in beta to DRAM and NAND memory chipmakers, Wells said.