Code mixes natural language in identifier names, comments, and stylistic choices (ordering and typesetting) with a formal language that defines a computation. The snippets in each language form a communication channel. Developers read both channels; a CPU processes only the formal channel. These two channels interact and constrain each other. The theory of dual channel constraints elucidates these interactions and shows how to exploit them. One prominent application is probabilistic type inference. In an optionally typed language, developers can add type annotations to find local type errors and to provide signposts to their development environment to facilitate navigation, refactoring, completion, and documentation. Natural type inference (NTI) reformulates probabilistic type inference as an optimisation problem: it combines hard logical constraints from the formal channel with soft natural constraints from the natural channel to soundly infer types. OptTyper realises NTI for TypeScript. Most work, including NTI, assumes the two channels are in sync. I will close by outlining work that solves problems that arise when they are not.
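The interplay of hard and soft constraints can be sketched as a toy filter-then-rank step. This is a deliberate simplification of NTI's optimisation formulation, not OptTyper's actual algorithm; the function names, candidate types, and scores below are hypothetical:

```python
# Toy sketch: hard logical constraints from the formal channel filter the
# candidate types; soft scores from the natural channel (e.g. a model's
# guess based on the identifier name) rank the survivors.

def infer_type(candidates, natural_scores, hard_constraint):
    """Return the highest-scoring candidate that satisfies the hard
    constraint, or None if no candidate does."""
    feasible = [t for t in candidates if hard_constraint(t)]
    if not feasible:
        return None
    return max(feasible, key=lambda t: natural_scores.get(t, 0.0))

# Hypothetical example: an identifier `count` appears in `count + 1`, so
# the formal channel demands a numeric type; the (assumed) natural-channel
# model also favours "number" for a name like `count`.
scores = {"number": 0.7, "string": 0.2, "boolean": 0.1}
numeric_only = lambda t: t == "number"
print(infer_type(["number", "string", "boolean"], scores, numeric_only))
# → number
```

In NTI proper, both kinds of constraint are relaxed into a single continuous objective and solved jointly rather than applied in two discrete stages as above.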
After over 100 online talks during the 2020/2021 pandemic, this talk was billed as the first in-person SERG talk in 18 months.
However, the meeting will take place on Zoom, due to Dutch restrictions on travel from the UK.
Earl Barr (https://earlbarr.com/) is a professor of software engineering at University College London. He received his PhD from UC Davis in 2009. Earl’s research interests include machine learning for software engineering (and vice versa), debugging, testing and analysis, game theory, and computer security. His recent work focuses on probabilistically quantifying program equivalence, probabilistic type inference, and dual channel constraints. After a pandemic-imposed hiatus, Earl once again dodges vans and taxis on his bike commute in London.