Article
Melt Pool Temperature Prediction Based on Recurrent Neural Network for Directed Energy Deposition

In directed energy deposition (DED), the melt pool temperature is closely related to microstructures and defects, and thus significantly affects final part quality. Multiple factors, such as laser power, scanning speed, spot size, and powder feed rate, affect the melt pool temperature profile. It is therefore critical to determine the melt pool temperature distribution and history during material deposition. However, the interaction between the high-energy beam and the metal in the molten pool is a complex coupled process involving physical metallurgical changes, which makes it challenging to investigate the melt pool temperature distribution under different processing parameters by experimental methods alone. To address this challenge, we establish two data-driven machine learning models, based on extreme gradient boosting (XGBoost) and long short-term memory (LSTM) respectively, to predict the melt pool temperature during thin-wall structure deposition. Experimental results show that the LSTM prediction is sufficiently accurate and fast for in-situ correction. The proposed predictive models are expected to facilitate DED process optimization with respect to melt pool thermal-related properties.
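The LSTM approach described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: a single NumPy LSTM cell with randomly initialized weights consumes a sequence of per-step process-parameter vectors (the feature choices, dimensions, normalization, and output bias scale are all assumptions for illustration) and maps the final hidden state to a scalar temperature through a linear readout. A real model would be trained on measured melt pool temperature histories.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Minimal LSTM cell (forward pass only), in plain NumPy."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        # One stacked weight matrix for the input, forget, cell, output gates.
        self.W = rng.standard_normal((4 * n_hidden, n_in + n_hidden)) * 0.1
        self.b = np.zeros(4 * n_hidden)
        self.n_hidden = n_hidden

    def step(self, x, h, c):
        # Gate pre-activations from the current input and previous hidden state.
        z = self.W @ np.concatenate([x, h]) + self.b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)   # update cell state
        h = o * np.tanh(c)           # new hidden state
        return h, c

def predict_temperature(cell, W_out, b_out, sequence):
    """Run the cell over a sequence of per-step process-parameter
    vectors and map the final hidden state to a scalar temperature."""
    h = np.zeros(cell.n_hidden)
    c = np.zeros(cell.n_hidden)
    for x in sequence:
        h, c = cell.step(x, h, c)
    return float(W_out @ h + b_out)

# Hypothetical example: 3 process features per time step
# (e.g. laser power, scanning speed, powder feed rate),
# assumed normalized before being fed to the network.
rng = np.random.default_rng(1)
cell = LSTMCell(n_in=3, n_hidden=16)
W_out = rng.standard_normal(16) * 0.1
b_out = 1200.0  # bias near a typical melt pool temperature scale (assumed)
seq = rng.standard_normal((50, 3))  # 50 time steps of normalized parameters
temp = predict_temperature(cell, W_out, b_out, seq)
print(round(temp, 1))
```

With untrained weights the output stays near the readout bias; training would fit `W`, `b`, `W_out`, and `b_out` to measured temperature data so the hidden state captures the thermal history of the deposition.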

DOI: 10.32548/RS.2022.033

