NWO VIDI grant VI.Vidi.223.096, 2023-2028
Summary. Automata learning techniques have proved successful in improving the quality of software and hardware systems. They have been applied to analyse network protocols, legacy software, smart card readers and embedded control software. However, current learning techniques suffer from fundamental scalability issues that prevent their widespread use in industry.
To alleviate these scalability issues, it is essential to understand the underlying principles of automata learning. Indeed, automata learning has not reached its full potential, due to a gap in our understanding of its mathematical foundations. This is confirmed, for instance, by recent work co-developed by the PI, which shows that the notion of apartness from constructive mathematics can play an important role in improving both the efficiency and the conceptual presentation of automata learning algorithms.
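To give a flavour of the notion of apartness mentioned above: two states of an automaton are apart when there is concrete, constructive evidence — a witness word — on which they behave differently. The sketch below is a minimal illustration for deterministic finite automata, with hypothetical names (`run`, `apartness_witness`) and a bounded search chosen for exposition; it is not the project's algorithm.

```python
# Minimal sketch of "apartness" between DFA states: two states are apart
# when some word (a witness) leads them to differing acceptance.
# All names and the length bound are illustrative assumptions.
from itertools import product

def run(delta, state, word):
    """Follow transitions delta[(state, symbol)] along word."""
    for sym in word:
        state = delta[(state, sym)]
    return state

def apartness_witness(delta, accepting, q1, q2, alphabet, max_len=3):
    """Return a word witnessing that q1 and q2 are apart, or None."""
    for n in range(max_len + 1):
        for word in product(alphabet, repeat=n):
            a1 = run(delta, q1, word) in accepting
            a2 = run(delta, q2, word) in accepting
            if a1 != a2:
                return word  # constructive evidence the states differ
    return None

# Toy DFA over {a, b}: state 0 is accepting, state 1 is not,
# so the empty word already separates them.
delta = {(0, "a"): 1, (0, "b"): 0, (1, "a"): 0, (1, "b"): 1}
print(apartness_witness(delta, {0}, 0, 1, "ab"))  # → ()
```

The key point is that apartness is witnessed by explicit evidence rather than defined by the absence of equivalence, which is what makes it useful inside learning algorithms.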
This project—APPLE—will advance foundational research in automata learning in two directions: approximation and abstraction. Understanding approximation is key to measuring the quality of learned automata, which is crucial for determining to what extent correctness properties that hold on the learned automaton actually hold in reality. Abstraction is essential for constructing meaningful automata for large-scale, realistic systems. Such systems are not faithfully represented by finite automata, so learning necessarily results in an abstraction of the real system.
APPLE proposes a novel approach to approximation and abstraction, building on and extending the role of apartness in the foundations of automata learning, as well as the deep connections between logic, testing and learning. Based on these insights and techniques, APPLE significantly extends the foundations of automata learning, expands the scope of these techniques to larger-scale systems, and reduces the amount of work required of the engineer. Overall, the project will yield a deeper understanding of automata learning and a boost in its applicability.