Bernas, M. Raphael (2024) Lipschitz Regularization for Neural Networks in High Dimension PRE - Research Project, ENSTA.
PDF available under a Creative Commons Attribution license (2671 KB).
Abstract
Neural network robustness is a critical issue in the field of machine learning, affecting various applications. Small perturbations in the input can significantly impact the output quality, leading to problems in image classification, autonomous vehicles, and object detection. As deep learning continues to grow in importance, these issues could lead to security risks for systems relying on deep neural networks. To address this, regularization methods have been developed to ensure Deep Neural Network (DNN) robustness. One such method penalizes the model based on its Lipschitz constant. However, computing this constant exactly is infeasible, especially in higher dimensions. A method proposed by Bungert et al. in "CLIP: Cheap Lipschitz Training of Neural Networks" suggests estimating the Lipschitz constant on a restricted set determined by adversarial methods. This work aims to extend the results of Bungert et al. and address the challenges encountered in higher dimensions.
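The core idea described above can be illustrated with a minimal sketch: instead of the intractable global Lipschitz constant, one maximizes the difference quotient of the network over a restricted set of input pairs found by local search. Everything below is an assumption for illustration (a toy two-layer ReLU network and a simple random local search in place of the gradient-based adversarial updates used in CLIP), not the thesis's actual implementation.

```python
import numpy as np

# Hypothetical toy network: f(x) = W2 @ relu(W1 @ x)
rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 4))
W2 = rng.normal(size=(1, 8))

def f(x):
    return W2 @ np.maximum(W1 @ x, 0.0)

def difference_quotient(x, y):
    # |f(x) - f(y)| / |x - y|, a lower bound on the Lipschitz constant
    return np.linalg.norm(f(x) - f(y)) / np.linalg.norm(x - y)

def estimate_lipschitz(n_pairs=100, n_steps=50, step=0.05):
    # CLIP-style idea (sketched): maximize the difference quotient over
    # a restricted set of adversarially chosen pairs. Here the pairs are
    # improved by random local search; the paper uses gradient ascent.
    best = 0.0
    for _ in range(n_pairs):
        x = rng.normal(size=4)
        y = x + 1e-2 * rng.normal(size=4)
        q = difference_quotient(x, y)
        for _ in range(n_steps):
            # propose a small random perturbation of the pair
            x2 = x + step * rng.normal(size=4)
            y2 = y + step * rng.normal(size=4)
            if not np.allclose(x2, y2):
                q2 = difference_quotient(x2, y2)
                if q2 > q:  # keep moves that increase the quotient
                    x, y, q = x2, y2, q2
        best = max(best, q)
    return best

L_hat = estimate_lipschitz()
# Crude upper bound: product of layer spectral norms (ReLU is 1-Lipschitz)
L_bound = np.linalg.norm(W2, 2) * np.linalg.norm(W1, 2)
print(L_hat, L_bound)
```

In a training loop, `L_hat` would be added to the loss as a penalty term, so the network is pushed toward a smaller Lipschitz constant on the adversarially selected set rather than globally.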
| Item Type: | Thesis (PRE - Research Project) |
|---|---|
| Uncontrolled Keywords: | Lipschitz Constant, Adversarial Training, Neural Network, Robustness, Over-fitting, High Dimension |
| Subjects: | Mathematics and Applications |
| ID Code: | 10063 |
| Deposited By: | Raphaël BERNAS |
| Deposited On: | 03 Sept. 2024 10:02 |
| Last Modified: | 03 Sept. 2024 10:02 |