Welcome!

Yun Peng
Ph.D. Candidate at CUHK
  • Chinese Name
    彭昀
  • Major
    Computer Science
  • City
    Hong Kong
  • Age
    22
  • Mail
    normal@yunpeng.work
Research Interests
Software Engineering
Artificial Intelligence
Programming Languages

ICSE 2022

Static Inference Meets Deep Learning: A Hybrid Type Inference Approach for Python

Paper Information

Paper Name: Static Inference Meets Deep Learning: A Hybrid Type Inference Approach for Python

Conference: 44th International Conference on Software Engineering (ICSE 2022)

Authors: Yun Peng, Cuiyun Gao, Zongjie Li, Bowei Gao, David Lo, Qirun Zhang, and Michael Lyu

Abstract

Type inference for dynamic programming languages such as Python is an important yet challenging task. Static type inference techniques can precisely infer variables with enough static constraints but are unable to handle variables with dynamic features. Deep learning (DL) based approaches are feature-agnostic, but they cannot guarantee the correctness of the predicted types. Their performance significantly depends on the quality of the training data (i.e., DL models perform poorly on some common types that rarely appear in the training dataset). It is interesting to note that the static and DL-based approaches offer complementary benefits. Unfortunately, to our knowledge, precise type inference based on both static inference and neural predictions has not been exploited and remains an open challenge. In particular, it is hard to integrate DL models into the framework of rule-based static approaches.

This paper fills the gap and proposes a hybrid type inference approach named HiTyper based on both static inference and deep learning. Specifically, our key insight is to record type dependencies among variables in each function and encode the dependency information in type dependency graphs (TDGs). Based on TDGs, we can easily integrate type inference rules in the nodes to conduct static inference and type rejection rules to inspect the correctness of neural predictions. HiTyper iteratively conducts static inference and DL-based prediction until the TDG is fully inferred. Experiments on two benchmark datasets show that HiTyper outperforms state-of-the-art DL models by exactly matching 10% more human annotations. HiTyper also achieves an increase of more than 30% on inferring rare types. The static part of HiTyper alone infers 2 to 3 times more types than existing static type inference tools. Moreover, HiTyper successfully corrected seven wrong human annotations in six GitHub projects, and two of these corrections have already been approved by the repository owners.
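To make the iterative workflow concrete, here is a minimal sketch, in toy form, of alternating static inference over a type dependency graph with validated neural predictions. All names here (TDG, infer_static, hybrid_infer) are illustrative assumptions, not HiTyper's actual API, and the inference and rejection rules are deliberately simplistic stand-ins for the real ones described in the paper.

```python
# Illustrative sketch only: a toy type-dependency-graph (TDG) loop in the
# spirit of HiTyper. Names and rules are hypothetical, not the tool's API.

from typing import Dict, List, Optional

class TDG:
    """A tiny type dependency graph: each variable node depends on others."""

    def __init__(self) -> None:
        self.deps: Dict[str, List[str]] = {}          # node -> dependencies
        self.types: Dict[str, Optional[str]] = {}     # node -> inferred type

    def add(self, node: str, deps: List[str]) -> None:
        self.deps[node] = deps
        self.types.setdefault(node, None)
        for d in deps:
            self.types.setdefault(d, None)
            self.deps.setdefault(d, [])

    def infer_static(self) -> bool:
        """Toy inference rule: a node whose dependencies are all inferred
        to the same type takes that type. Returns True on any progress."""
        progress = False
        for node, deps in self.deps.items():
            if self.types[node] is None and deps:
                dep_types = {self.types[d] for d in deps}
                if None not in dep_types and len(dep_types) == 1:
                    self.types[node] = dep_types.pop()
                    progress = True
        return progress

    def fully_inferred(self) -> bool:
        return all(t is not None for t in self.types.values())


def hybrid_infer(tdg: TDG, neural_predict) -> Dict[str, Optional[str]]:
    """Alternate static inference with (checked) neural predictions
    until the TDG is fully inferred or no progress can be made."""
    while not tdg.fully_inferred():
        if tdg.infer_static():
            continue  # static rules made progress; retry them first
        # Static inference is stuck: ask the DL model for one open node
        # and accept its prediction only if it passes a rejection check.
        hot = next((n for n, t in tdg.types.items() if t is None), None)
        if hot is None:
            break
        pred = neural_predict(hot)
        if pred is not None:          # toy "rejection rule": reject None
            tdg.types[hot] = pred
        else:
            break                     # no acceptable prediction; give up
    return tdg.types
```

For example, with dependencies y -> x and z -> y and a stub model that always predicts "int", the loop seeds one node from the model, then the static rule propagates the type through the rest of the graph.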

Workflow of HiTyper

Code

https://github.com/JohnnyPeng18/HiTyper

Or simply install it with pip: pip install hityper

Resources

Contact

    © 2022 All Rights Reserved.