TY - GEN
T1 - Validated Computation of Lipschitz Constant of Recurrent Neural Networks
AU - Guo, Yuhua
AU - Li, Yiran
AU - Farjudian, Amin
PY - 2023/6/4
Y1 - 2023/6/4
N2 - A validated method is presented for the computation of the Lipschitz constant of recurrent neural networks. Lipschitz estimation of neural networks has gained prominence due to its close links with robustness analysis, a central concern in modern machine learning, especially in safety-critical applications. In recent years, several methods for validated Lipschitz estimation of feed-forward networks have been proposed, yet relatively few methods are available for recurrent networks. In the current article, based on an interval enclosure of Clarke's generalized gradient, a method is proposed for Lipschitz estimation of recurrent networks that is applicable to both differentiable and non-differentiable networks. The method has a firm foundation in domain theory, and the algorithms can be proven correct by construction. A bisection-based maximization algorithm is devised with which a certified estimate of the Lipschitz constant can be obtained and the region of least robustness can be located in the input domain. The method is implemented using interval arithmetic, and experiments on vanilla recurrent networks are reported.
AB - A validated method is presented for the computation of the Lipschitz constant of recurrent neural networks. Lipschitz estimation of neural networks has gained prominence due to its close links with robustness analysis, a central concern in modern machine learning, especially in safety-critical applications. In recent years, several methods for validated Lipschitz estimation of feed-forward networks have been proposed, yet relatively few methods are available for recurrent networks. In the current article, based on an interval enclosure of Clarke's generalized gradient, a method is proposed for Lipschitz estimation of recurrent networks that is applicable to both differentiable and non-differentiable networks. The method has a firm foundation in domain theory, and the algorithms can be proven correct by construction. A bisection-based maximization algorithm is devised with which a certified estimate of the Lipschitz constant can be obtained and the region of least robustness can be located in the input domain. The method is implemented using interval arithmetic, and experiments on vanilla recurrent networks are reported.
KW - Clarke gradient
KW - interval arithmetic
KW - Lipschitz constant
KW - recurrent neural network
UR - http://www.scopus.com/inward/record.url?scp=85162693409&partnerID=8YFLogxK
U2 - 10.1145/3583788.3583795
DO - 10.1145/3583788.3583795
M3 - Conference contribution
AN - SCOPUS:85162693409
T3 - ICMLSC: Machine Learning and Soft Computing
SP - 46
EP - 52
BT - ICMLSC '23
PB - Association for Computing Machinery
T2 - 7th International Conference on Machine Learning and Soft Computing, ICMLSC 2023
Y2 - 5 January 2023 through 7 January 2023
ER -