Real-time monitoring of Additive Manufacturing (AM) processes can greatly benefit from spatial-temporal modeling using deep learning. However, existing deep-learning approaches in AM are case-dependent and therefore not robust to changes in control inputs and data types. Because AM is dynamic and complex, this limitation leads to a lack of systematic deep-learning approaches for real-time monitoring of AM, which involves a large number of varying control parameters and monitoring data streams. To address this challenge, this paper introduces a novel approach for developing spatial-temporal models to monitor Laser Powder Bed Fusion (LPBF) processes using deep learning on real-time monitoring data. First, we present a novel model for representing in-situ monitoring and control data of LPBF at multiple scales. Second, from this model, we extract spatial-temporal relationships for in-situ monitoring of LPBF processes. Third, we present a spatial-temporal modeling approach based on the convolutional long short-term memory (LSTM) architecture to monitor these spatial-temporal relationships and detect anomalies. A case study applied a convolutional LSTM autoencoder to optical melt-pool monitoring data, one of the most widely adopted data types for in-situ monitoring of LPBF. The data was generated on an LPBF testbed, the Additive Manufacturing Metrology Testbed. The proposed learning approach enables spatial-temporal modeling of AM dynamics directly from real-time data for monitoring varying AM environments, and it offers the potential to fuse real-time data at multiple spatial-temporal scales.
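The paper's trained model and data are not reproduced here, but the core mechanism the abstract names can be illustrated. The sketch below, in plain numpy, implements a single convolutional LSTM cell (gates computed by convolutions rather than dense layers) and the autoencoder-style anomaly score: each frame is encoded, decoded back to the input channels, and scored by reconstruction error. All shapes, weights, and the 1x1 decoder convolution are hypothetical choices for illustration; a trained model would flag frames whose score exceeds a calibrated threshold.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv2d_same(x, k):
    """'Same'-padded 2-D convolution: x (H, W, Cin), kernel k (kh, kw, Cin, Cout)."""
    kh, kw, cin, cout = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw), (0, 0)))
    H, W = x.shape[:2]
    out = np.zeros((H, W, cout))
    for i in range(H):
        for j in range(W):
            patch = xp[i:i + kh, j:j + kw, :]  # (kh, kw, Cin)
            out[i, j] = np.tensordot(patch, k, axes=([0, 1, 2], [0, 1, 2]))
    return out

class ConvLSTMCell:
    """One ConvLSTM cell: the LSTM gates are convolutions over input and hidden state."""
    def __init__(self, cin, nfilters, ksize=3, seed=0):
        rng = np.random.default_rng(seed)
        # Stacked kernels for the four gates: input, forget, output, candidate.
        self.Wx = rng.normal(0, 0.1, (ksize, ksize, cin, 4 * nfilters))
        self.Wh = rng.normal(0, 0.1, (ksize, ksize, nfilters, 4 * nfilters))
        self.b = np.zeros(4 * nfilters)
        self.nf = nfilters

    def step(self, x, h, c):
        z = conv2d_same(x, self.Wx) + conv2d_same(h, self.Wh) + self.b
        i, f, o, g = np.split(z, 4, axis=-1)
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)  # cell state update
        h = sigmoid(o) * np.tanh(c)                   # new hidden state
        return h, c

def reconstruction_scores(frames, cell, Wdec):
    """Encode each frame with the ConvLSTM, decode via a 1x1 conv, score by MSE."""
    H, W, _ = frames[0].shape
    h = np.zeros((H, W, cell.nf))
    c = np.zeros_like(h)
    scores = []
    for x in frames:
        h, c = cell.step(x, h, c)
        recon = conv2d_same(h, Wdec)  # map hidden state back to input channels
        scores.append(float(np.mean((recon - x) ** 2)))
    return scores

# Toy stand-in for a melt-pool image sequence: 8x8 single-channel intensity maps.
rng = np.random.default_rng(1)
frames = [rng.random((8, 8, 1)) for _ in range(5)]
cell = ConvLSTMCell(cin=1, nfilters=4)
Wdec = rng.normal(0, 0.1, (1, 1, 4, 1))  # hypothetical 1x1 decoder kernel
scores = reconstruction_scores(frames, cell, Wdec)  # one anomaly score per frame
```

Because the weights here are random, the scores only demonstrate the data flow; in practice the encoder-decoder is trained on normal builds so that anomalous melt-pool frames reconstruct poorly and stand out.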
