AI and ML Jobs: Engineering Tool or Job Threat?

Semiconductor Engineering spoke about using AI to design and test complex chips with Michael Jackson, corporate vice president of R&D at Cadence; Joel Sumner, vice president of semiconductor and electronics engineering at National Instruments; Grace Yu, product and engineering manager at Meta; and David Pan, professor of electrical and computer engineering at the University of Texas at Austin. Below is an excerpt of that conversation, which took place in front of a live audience at DesignCon.

SE: Will AI really replace engineers or will it become a tool?

Jackson: AI really is a tool for engineers to use. In design automation, we have been creating tools for many years, and AI is part of a new breed. We see a lot of research and a lot of development, new products are coming out, and this is a way for users to get better results faster.

Sumner: It is inevitable that our work will change in some way. No one who has been in this industry for a long time is doing the job exactly the way they did in the past, and AI is changing the way we work because we are ambitious. We all have grand things we want to do that we cannot achieve today, and AI makes our lives easier and faster by removing distractions. Everyone is asking for more resources, being asked to help get something done, or talking to their boss, but the reality of the world we live in is that it all can't be done. That's the limiter, and AI unlocks it.

Pan: AI is a positive. It is very useful for optimization, prediction, and even generation. Some of the less creative, low-level, mundane jobs could be replaced by AI. Looking back at the four industrial revolutions, the first with coal and steam, the second with electricity, the third with computers, and now the fourth, each saw some jobs, workers, and engineers replaced. But each also created new breeds of engineers who are more productive or create better designs.

Yu: We see this on the PCB side. The amount of work required to manually check millions of connections is enormous. How do you know that everything is connected correctly? EDA tools provide design automation and verification. But AI can do much more than simple, repetitive tasks. Training a model to capture the same patterns allows it to do some intelligent work, although that depends on the quality of the AI model. That way we can get more out of AI, freeing up engineers to do the more creative work that AI can't.
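The repetitive connectivity check Yu mentions can be pictured as a simple netlist comparison. The sketch below is only an illustration with toy data, not how any particular EDA tool implements it; the net and pin names are made up.

```python
# Toy illustration of an automated connectivity check: verify that every
# connection in the schematic netlist also appears in the extracted board
# netlist. Real tools do this across millions of connections.
schematic = {
    "VDD": {"U1.1", "U2.8", "C1.1"},
    "NET_CLK": {"U1.5", "U2.3"},
}
extracted = {
    "VDD": {"U1.1", "U2.8", "C1.1"},
    "NET_CLK": {"U1.5"},          # missing pin -> open connection
}

for net, pins in schematic.items():
    missing = pins - extracted.get(net, set())
    if missing:
        print(f"Open on {net}: unconnected pins {sorted(missing)}")
```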

SE: Can AI debug and validate complex chips? When we think of AI, we tend to think of it as something out of a sci-fi movie, but it really isn’t.

Pan: AI can perform mundane tasks today. But with a lot of training data, and with supervised, unsupervised, semi-supervised, active learning, and transfer learning, it can also start doing very intelligent things. That said, it isn't smarter than a top engineer. It may be very well suited to some applications, but not necessarily all.

Jackson: It can definitely do more than mundane tasks. For example, in the area of optimizing digital ICs, products have been released that achieve better results than you could have gotten without the technology. You might save 10% or more in power, or see results a month sooner than you otherwise would. That's nothing exotic. It's real productivity.

Sumner: The promise we see is the ability to give engineers hints about where to look, especially on the debug side. We have already introduced technology like this. A major semiconductor vendor we worked with was doing automation around root cause, looking for patterns in the incoming data. It's very well accepted because there is a human in the loop, and it accelerates what they can do. I look forward to the day when I won't have to sit in my room staring at plots trying to spot problems.
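The kind of hint Sumner describes can be approximated with off-the-shelf anomaly detection. This is a minimal sketch of the idea, not NI's tooling; the feature set and thresholds are hypothetical.

```python
# Flag anomalous test records so an engineer knows where to look first
# during debug. Synthetic data stands in for incoming measurements.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
measurements = rng.normal(loc=1.0, scale=0.05, size=(500, 4))  # e.g. Vdd, Idd, delay, temp
measurements[490:] += 0.4                                      # a handful of bad units

model = IsolationForest(contamination=0.02, random_state=0)
labels = model.fit_predict(measurements)        # -1 = anomaly, 1 = normal

suspect_rows = np.where(labels == -1)[0]
print("Units worth a closer look:", suspect_rows)
```

The engineer stays in the loop: the model only prioritizes which records to inspect, it doesn't decide the root cause.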

SE: There is a lot of custom design going on right now, and there isn't much data about those designs. How does that affect AI?

Yu: For some of the custom designs we're doing now, we don't have an existing data set, so we're modifying the model to make use of whatever existing data we have. Perhaps in a few years, when we have collected more data and the data is more accurate, we will be able to introduce new techniques and apply AI to them.

Jackson: There is a good set of training data to help develop the statistical models used as part of machine learning. A framework can be common, so that companies A and B can tune and customize their statistical models based on their local data.

Pan: Agreed. Since you need application-specific AI for each application and each customer, you'll likely need to migrate one model to another and make minor changes. There can still be a common framework, with transfer learning and similar techniques applied on top of it.
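A minimal transfer-learning sketch of what Pan and Jackson describe, assuming a generic PyTorch workflow: the frozen base network stands in for the shared framework, and the synthetic data stands in for one company's local, application-specific data. None of this is from the speakers' actual tools.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

base = nn.Sequential(                      # stand-in for a pretrained common model
    nn.Linear(64, 128), nn.ReLU(),
    nn.Linear(128, 32), nn.ReLU(),
)
head = nn.Linear(32, 1)                    # new task-specific head for this customer

for p in base.parameters():                # freeze the shared backbone...
    p.requires_grad = False

model = nn.Sequential(base, head)
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)  # ...and tune only the head
loss_fn = nn.MSELoss()

local_x = torch.randn(256, 64)             # synthetic stand-in for local design data
local_y = torch.randn(256, 1)
loader = DataLoader(TensorDataset(local_x, local_y), batch_size=32)

for features, target in loader:            # a few passes over local data adapt the model
    optimizer.zero_grad()
    loss = loss_fn(model(features), target)
    loss.backward()
    optimizer.step()
```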

SE: The processes at various leading edge fabs have become very different. How do you develop AI tools that work with each of them?

Sumner: There are two research areas here. One is to get the basics of the variance, train a model on what the processes have in common, and then be able to learn the ways in which they differ. One thing I see with ML models is that they require a lot of data. And it's not just the records themselves; the data also needs to be tagged. Is it good or bad? And if it's bad, what's wrong with it? That's what allows the algorithm to move forward. But if your data set is small and there is at least a little bit of commonality, you can establish the baseline and what they have in common around it in order to make it work.
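The tagging Sumner describes amounts to attaching a verdict and a failure mode to every record. A sketch of that kind of labeled record, with illustrative field names only:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TestRecord:
    device_id: str
    measurements: dict[str, float]      # e.g. {"vdd": 0.99, "idd_mA": 12.3}
    passed: bool                        # is it good or bad?
    failure_mode: Optional[str] = None  # and if it's bad, what's wrong?

records = [
    TestRecord("u001", {"vdd": 1.00, "idd_mA": 11.8}, passed=True),
    TestRecord("u002", {"vdd": 0.87, "idd_mA": 19.4}, passed=False,
               failure_mode="supply droop"),
]

# Labels like these are what let a supervised model learn which patterns
# separate the failure modes, even when the data set is small.
X = [list(r.measurements.values()) for r in records]
y = [r.failure_mode or "pass" for r in records]
```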

SE: If engineers are imperfect and the algorithms they create are imperfect, does that mean the AI will be imperfect? And if so, how can we avoid that?

Jackson: AI isn't always perfect, but it learns. A good example is the AlphaGo work done by DeepMind. It can play Go better than a human, so it's much better than the developers who designed it. The trick was to create something that could actually learn, and then let it run so it could reach its full potential.

Pan: There is a case related to MAGICAL, which is sponsored by DARPA. It is a fully automated analog IC layout system that handles all kinds of constraint generation, placement, and routing. The advantage is that it can generate tens of thousands of different layouts and automatically extract and simulate them without a human in the loop. It can generate all sorts of strange layouts that still meet your design constraints, and it can also explore solutions that human designers might be interested in but haven't touched. So, just as AlphaGo can win a game with moves a human wouldn't make, we can do the same here. Engineers are arguably imperfect, but this process improves iteratively and lets them do things they never thought possible.
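This is not MAGICAL's actual algorithm, but the closed generate/evaluate loop Pan describes can be illustrated with a toy search in which every helper function is a hypothetical stand-in:

```python
# Produce many candidate layouts, filter by design constraints, "simulate"
# each one, and keep the best, with no human in the loop.
import random

def generate_layout(rng):
    # stand-in for constraint-aware placement and routing
    return {"symmetry": rng.random(), "wirelength": rng.uniform(1.0, 5.0)}

def meets_constraints(layout):
    return layout["symmetry"] > 0.3          # toy design constraint

def simulate(layout):
    # stand-in for extraction plus SPICE simulation; lower score is better
    return layout["wirelength"] * (1.5 - layout["symmetry"])

rng = random.Random(0)
candidates = (generate_layout(rng) for _ in range(10_000))
best = min((c for c in candidates if meets_constraints(c)), key=simulate)
print("Best candidate:", best)
```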

Jackson: With move 37 in game two, the people watching said they didn't understand it and thought it was a mistake.

SE: So part of this depends on how much of the data includes both good and bad examples?

Sumner: Yes. Especially for those using AI on their own problems, the question is where the data comes from. Most people have a lot of data sitting on their own machines. But this changes how you have to think about storing data. That data must be stored in an easily accessible way, tagged the right way, so the algorithms can do their work quickly and you can experiment with what works specifically for you. A hard drive full of Excel files takes so long to pull together that it makes experimenting much harder than it should be. Having tools that help you extract data quickly is key to this experimentation and progress.
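A minimal sketch of that storage point: consolidate scattered spreadsheets into one tagged, columnar dataset that algorithms can query quickly. The directory and column names here are hypothetical, not anything from NI's workflow.

```python
from pathlib import Path
import pandas as pd

frames = []
for xlsx in sorted(Path("test_logs").glob("*.xlsx")):   # hypothetical folder of spreadsheets
    df = pd.read_excel(xlsx)
    df["source_file"] = xlsx.name                        # tag every record with its provenance
    frames.append(df)

dataset = pd.concat(frames, ignore_index=True)
dataset.to_parquet("test_logs.parquet")                  # columnar format: fast to reload and filter

# Later experiments can pull just the slice they need without reopening Excel files.
dataset = pd.read_parquet("test_logs.parquet")
failures = dataset[dataset["passed"] == False]           # "passed" column assumed to exist
```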

Yu: Some of the current work in Meta's lab is pioneering work. There are a lot of open questions across the industry, not only in terms of design, but also in how to utilize the metaverse. It's not just games. We want to apply it to many areas that are not yet known at this time, and there is uncertainty about AI. There is a lot of research. We have tons of data to train our AI to recognize the underlying patterns versus the noise in the data. We still have a lot of work and challenges ahead of us. We can also take existing AI models from other industries, determine the best approach, and extend them into new areas for developers.

Related Stories
AI Becomes More Prominent In Chip Design (part two of this roundtable discussion)
Experts at the table: The pros and cons of more data, and how AI can leverage that data to optimize designs and improve reliability.


