On the Trade-Off Between Efficiency and Precision of Neural Abstraction

Alec Edwards*, Mirco Giacobbe, Alessandro Abate

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Neural abstractions have been recently introduced as formal approximations of complex, nonlinear dynamical models. They comprise a neural ODE and a certified upper bound on the error between the abstract neural network and the concrete dynamical model. So far, neural abstractions have been obtained exclusively as neural networks consisting entirely of ReLU activation functions, resulting in neural ODE models that have piecewise affine dynamics, and which can be equivalently interpreted as linear hybrid automata. In this work, we observe that the utility of an abstraction depends on its use: some scenarios might require coarse abstractions that are easier to analyse, whereas others might require more complex, refined abstractions. We therefore consider neural abstractions of alternative shapes, namely either piecewise constant or nonlinear non-polynomial (specifically, obtained via sigmoidal activations). We employ formal inductive synthesis procedures to generate neural abstractions that result in dynamical models with these semantics. Empirically, we demonstrate the trade-off that these different neural abstraction templates have vis-à-vis their precision and synthesis time, as well as the time required for their safety verification (done via reachability computation). We improve existing synthesis techniques to enable abstraction of higher-dimensional models, and additionally discuss the abstraction of complex neural ODEs to improve the efficiency of reachability analysis for these models.
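The central object of the abstract — an approximant of the concrete dynamics together with an error bound ε — can be illustrated with a minimal sketch. The example below builds a piecewise-constant abstraction of a hypothetical one-dimensional system ẋ = −x + x³ and only *estimates* ε empirically on samples; the paper instead synthesises the abstraction as a neural network and certifies the bound formally (e.g. via SMT solving), both of which are omitted here.

```python
import numpy as np

# Hypothetical concrete nonlinear dynamics: x' = f(x).
def f(x):
    return -x + x**3

# Piecewise-constant abstraction: partition [-1, 1] into 10 cells and use
# the value of f at each cell midpoint as the abstract dynamics on that cell.
cells = np.linspace(-1.0, 1.0, 11)
midpoints = (cells[:-1] + cells[1:]) / 2
abstract_values = f(midpoints)

def f_abstract(x):
    # Locate the cell containing x and return its constant dynamics.
    i = np.clip(np.searchsorted(cells, x, side="right") - 1,
                0, len(midpoints) - 1)
    return abstract_values[i]

# Empirical stand-in for the certified error bound epsilon:
# the maximum observed deviation |f(x) - f_abstract(x)| on a dense sample.
xs = np.linspace(-1.0, 1.0, 10001)
eps = np.max(np.abs(f(xs) - f_abstract(xs)))
print(f"empirical error bound: {eps:.4f}")
```

The abstract system ẋ ∈ f_abstract(x) + [−ε, ε] then over-approximates the concrete one on the domain, which is what makes safety properties verified on the abstraction carry over to the concrete model.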
Original language: English
Title of host publication: Quantitative Evaluation of Systems
Subtitle of host publication: 20th International Conference, QEST 2023, Antwerp, Belgium, September 20–22, 2023, Proceedings
Editors: Nils Jansen, Mirco Tribastone
Publisher: Springer
Pages: 152–171
Number of pages: 20
Edition: 1
ISBN (Electronic): 9783031438356
ISBN (Print): 9783031438349
DOIs
Publication status: Published - 15 Sept 2023
Event: 20th International Conference on Quantitative Evaluation of SysTems (QEST) - Antwerp, Belgium
Duration: 20 Sept 2023 – 22 Sept 2023

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer
Volume: 14287
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 20th International Conference on Quantitative Evaluation of SysTems (QEST)
Abbreviated title: QEST 2023
Country/Territory: Belgium
City: Antwerp
Period: 20/09/23 – 22/09/23

Keywords

  • nonlinear dynamical systems
  • formal abstractions
  • safety verification
  • SAT modulo theory
  • neural networks
