How is Autodesk using AI in product design?

Applications of AI


A conversation with Stephen Hooper

Stephen Hooper, vice president of Software Engineering, Design and Manufacturing at Autodesk

Earlier this year, we met with Stephen Hooper, Autodesk's vice president of Software Engineering, Design and Manufacturing, to discuss how Autodesk is applying AI to mechanical design. This is part of an ongoing project to determine how (and if) every major CAD and CAE company will adopt AI; so far, every company has said it will use AI in some form. More than anyone else at Autodesk, Hooper oversees the roadmap for the company's product design software, including Fusion and Inventor.

This follows an interview with Mike Haley, SVP of Research at Autodesk, who is laying the foundations of AI across all industries at Autodesk (design and manufacturing, AEC, and entertainment). Check out part 1 of our conversation with Mike Haley here.

The article below is based largely on our conversation.

Hooper's eagerness to talk about AI and its implementation in the product is unusual; most software companies are reluctant to discuss product features that haven't shipped yet. Perhaps it's a sign that Autodesk is answering the call for AI after the ChatGPT furor.

Engineering.com: Can you tell us more about your role at Autodesk and how you work with Mike Haley?

Stephen Hooper: Mike works on the foundational AI models that serve the entire company. He is not responsible for turning them into products in the various industry verticals. My job is to work with Mike to help him think through how to build a generic service, understand the market opportunity from a manufacturing perspective, define it, prototype it, test it, and then build production-ready code on top of his generic service in Fusion. Fusion is, of course, a tool targeted at manufacturing (you'll see why in a moment). Mike's team builds the generic learning model, and my team implements it on the client side so that it shows up as a product feature.

Fusion's drawing automation automatically creates drawings of each part in an assembly, including all views and many of the annotations. This image is from an Autodesk video on YouTube.

Engineering.com: So will the next release of Fusion employ AI?

Hooper: We have something we call Drawing Automation. It's a drawing automation service. We break down any assembly into its parts, then create the layout and scale for each part. We do all the view orientations and annotate all the drawings. It's easier to show you than to explain. Here's a quick overview. Here's the base plate of the machine [refer to the image above]. Fusion will extract all the individual parts and create the drawings. The angled part is [part 4 above]. We make sure it's oriented correctly on the drawing. This is all done in the cloud, not on the client side. You choose a few important dimensions, and the AI does the rest. The drawing is scaled and annotated automatically. There's still some way to go, with things like geometric tolerances and dimensions, but we're already counting the parts [for a parts list].

As you know, drawings can be interpreted in many different ways, so rather than jumping ahead and dimensioning everything, we give you options. On the side, you see optional ways you can dimension. You can place some basic dimensions and fill in the rest yourself. Maybe you like a particular dimension style but want to reduce the density of dimensions. You can work interactively. It's not perfect yet, but it does a good job of creating and laying out all of your drawings. You can create balloons, bills of materials, annotations, baseline dimensions, ordinate dimensions and so on. You can't yet create drawing notes or geometric tolerances and dimensions. Those are advanced areas, and we're working on them.

One of our next steps is to use large language models for notes in drawings, rather than training specific models for annotations and so on.

There are still some challenges with drawing automation. The circled areas are where leader lines were broken. Image taken from an Autodesk video: https://youtu.be/3DkUlbgnw-8?feature=shared.

Another thing we've done is instrument our product. We instrumented it because even if you train a large model on a huge amount of data, it's still not perfect, and iteratively improving the model helps. In this example, one of the callouts crosses another, so where it says 2x6x20 holes [see circled area in the image above], the text annotation and leader line intersect [marked in red in the image above]. The user generates the dimension and then manually drags it clear. Because we instrument the product, we know the user has made a correction, and we retrain the model so that future users don't have to drag the dimension at all. This happens over time. In this way, the product gets more and more sophisticated as more people use it.

Engineering.com: Can you elaborate on your use of customer data? Is it proprietary?

Hooper: That's an important aspect to consider. First of all, you have to have access to a lot of unique data; otherwise, you'd be better off licensing it. For drawing notes, there's no point in building our own large language model. You're better off licensing OpenAI and building specific training on technical documentation on top of it. But if you have something specific to your industry, like the actual drawing layout itself, Autodesk is in the best position. We have a huge amount of in-house data on drawing specifications, standard styles, layouts and so on. Moreover, we can instrument our product to learn from user interactions. We're not generating proprietary data, and we're not learning from someone else's designs. This is what I call vertical AI: it helps our users design and be more creative and innovative. We are very sensitive to data protection.

Take Grammarly, for example. That's what I call horizontal functionality. Grammarly is not AI-based, but it updates its algorithms based on how people use it. Nobody cares if Grammarly learns from your grammar and spelling corrections. But you would care, especially if you are a writer, if Grammarly started creating articles for others based on your IP and the creative content you have produced. The same applies here. Nobody cares if we train a model on drawing standards, because everyone works to the same drawing standards. There is no IP issue, just increased productivity. This is a perfect area to apply machine learning.

To be continued.


