Binary file modified .DS_Store
Binary file not shown.
5 changes: 4 additions & 1 deletion .gitignore
@@ -10,4 +10,7 @@ results/
 .history/
 .vscode/
 tree-sitter-*/
-huggingface_models/
+huggingface_models/
+attentions/.DS_Store
+.DS_Store
+.DS_Store
7 changes: 5 additions & 2 deletions README.md
@@ -1,16 +1,19 @@
 # CAT-probing: A Metric-based Approach to Interpret How Pre-trained Models for Programming Language Attend Code Structure
 
+![License](https://img.shields.io/badge/License-MIT-blue) ![Build](https://img.shields.io/badge/Build-Passing-green) ![Last Commit](https://img.shields.io/github/last-commit/QiushiSun/CodeAttention)
+
 ## Updates
 
+- 2022/12/10: Please check out our [slides](https://drive.google.com/file/d/1Nb1QdwqFcmQuObRtSLZ-j3wNvp5qD89W/view?usp=share_link). 🐈
 - 2022/10/12: Our code is available. 😋
-- 2022/10/06: Release the paper of CAT-probing, check out our [paper](https://arxiv.org/abs/2210.04633). 👏
+- 2022/10/06: Released the CAT-probing paper; check out our [paper](https://preview.aclanthology.org/emnlp-22-ingestion/2022.findings-emnlp.295/). 👏
 - 2022/10/06: CAT-probing is accepted to **Findings of EMNLP 2022** 🎉
 
 ## Introduction
 
 We propose a metric-based probing method, CAT-probing, to quantitatively evaluate how the attention scores of CodePTMs relate to the distances between AST nodes.
 
-More details are provided in our EMNLP'22 paper and [our paper on arXiv](https://arxiv.org/abs/2210.04633).
+More details are provided in our EMNLP'22 [paper](https://preview.aclanthology.org/emnlp-22-ingestion/2022.findings-emnlp.295/).
 
 ## Environment & Preparation
 
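The Introduction in the diff above summarizes the method in a single sentence, so here is one way to make it concrete: binarize a head's attention map and the matrix of pairwise AST-node distances over the same tokens, then measure how often the two agree. The NumPy sketch below assumes this reading; `cat_score`, its inputs, and both thresholds are illustrative placeholders, not the authors' implementation or the paper's exact metric.

```python
import numpy as np

def cat_score(attention: np.ndarray, ast_distance: np.ndarray,
              attn_threshold: float = 0.1, dist_threshold: int = 2) -> float:
    """Fraction of token pairs on which the two binarized matrices agree:
    high attention score vs. structurally close AST nodes (both n x n)."""
    high_attn = attention > attn_threshold        # binarize the attention map
    close_ast = ast_distance <= dist_threshold    # binarize the AST distances
    return float((high_attn == close_ast).mean())

# Toy usage with random stand-in data (no real model or parser involved):
rng = np.random.default_rng(0)
n = 8
attn = rng.dirichlet(np.ones(n), size=n)   # row-stochastic fake attention
dist = rng.integers(1, 6, size=(n, n))     # fake pairwise AST node distances
print(f"toy agreement score: {cat_score(attn, dist):.3f}")
```

On real data, a score near 1 would indicate that the head's strong attention links fall mostly between AST-adjacent tokens.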
Binary file added slides.pdf
Binary file not shown.