
AttributeError: type object 'PrecisionRecallDisplay' has no attribute 'from_predictions' #17

Open
664730 opened this issue Jun 2, 2024 · 5 comments


@664730

664730 commented Jun 2, 2024

100%|████████████████████████████████████████████████████████████████████████████████████████████| 18864/18864 [00:25<00:00, 726.31it/s]
06/02/2024 17:05:54 - INFO - __main__ - ***** Running Test *****
06/02/2024 17:05:54 - INFO - __main__ - Num examples = 18864
06/02/2024 17:05:54 - INFO - __main__ - Batch size = 128
Traceback (most recent call last):
  File "linevul_main.py", line 1294, in <module>
    main()
  File "linevul_main.py", line 1290, in main
    test(args, model, tokenizer, test_dataset, best_threshold=0.5)
  File "linevul_main.py", line 332, in test
    PrecisionRecallDisplay.from_predictions(y_trues, logits[:, 1], name="LineVul")
AttributeError: type object 'PrecisionRecallDisplay' has no attribute 'from_predictions'
May I ask what to do when this error occurs? I have already updated scikit-learn to 0.24.2, but it still doesn't work.

@MichaelFu1998-create
Collaborator

Hi @664730,
Please update scikit-learn to at least 1.0.0 and see if it works.
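(Note for anyone hitting this: PrecisionRecallDisplay.from_predictions only exists in scikit-learn >= 1.0, so 0.24.2 predates it. If upgrading is not an option, a rough fallback sketch using the 0.24-era API could look like the following, assuming y_trues and logits are the arrays used inside test():)

from sklearn.metrics import PrecisionRecallDisplay, average_precision_score, precision_recall_curve

# Fallback for scikit-learn 0.24.x: build the display by hand instead of via from_predictions.
precision, recall, _ = precision_recall_curve(y_trues, logits[:, 1])
ap = average_precision_score(y_trues, logits[:, 1])
disp = PrecisionRecallDisplay(precision=precision, recall=recall, average_precision=ap, estimator_name="LineVul")
disp.plot()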

@664730
Author

664730 commented Jun 3, 2024

Hi @664730, Please update scikit-learn to at least 1.0.0 and see if it works.

Thanks a lot! You are right.

@664730
Author

664730 commented Jun 4, 2024

Hi @664730, Please update scikit-learn to at least 1.0.0 and see if it works.

File "linevul_main.py", line 526, in test
write_invalid_data=False)
File "linevul_main.py", line 881, in line_level_localization_tp
results = {"total_lines": total_lines,
UnboundLocalError: local variable 'total_lines' referenced before assignment
Hello, what should I do about this error?It appears when running RQ2.Thanks a lot.
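(Side note on this error class: Python raises UnboundLocalError when a local variable is read on a code path where it was never assigned, typically because the only assignment sits inside a branch that did not run for this input. A minimal, hypothetical sketch of the pattern, not the LineVul code itself:)

def build_results(do_explanation):
    if do_explanation:
        total_lines = 100  # assigned only on this branch
    # if do_explanation is False, the name was never bound in this scope
    return {"total_lines": total_lines}

build_results(False)  # UnboundLocalError: local variable 'total_lines' referenced before assignment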

@MichaelFu1998-create
Collaborator

Hi @664730
I'm not sure about the cause given the current error message. Could you show me the complete "line_level_localization_tp" function that you used? Thanks!

@664730
Author

664730 commented Jun 4, 2024

Hi @664730 I'm not sure about the cause given the current error message. Could you show me the complete "line_level_localization_tp" function that you used? Thanks!

def line_level_localization_tp(flaw_lines: str, tokenizer, model, mini_batch, original_func: str, args, top_k_loc: list, top_k_constant: list, reasoning_method: str, index: int, write_invalid_data: bool):
    # function for captum LIG.
    def predict(input_ids):
        return model(input_ids=input_ids)[0]

    def lig_forward(input_ids):
        logits = model(input_ids=input_ids)[0]
        y_pred = 1 # for positive attribution, y_pred = 0 for negative attribution
        pred_prob = logits[y_pred].unsqueeze(-1)
        return pred_prob

    flaw_line_seperator = "/~/"
    (input_ids, labels) = mini_batch
    ids = input_ids[0].detach().tolist()
    all_tokens = tokenizer.convert_ids_to_tokens(ids)
    all_tokens = [token.replace("Ġ", "") for token in all_tokens]
    all_tokens = [token.replace("ĉ", "Ċ") for token in all_tokens]
    original_lines = ''.join(all_tokens).split("Ċ")

    # flaw line verification
    # get flaw tokens ground truth
    flaw_lines = get_all_flaw_lines(flaw_lines=flaw_lines, flaw_line_seperator=flaw_line_seperator)
    flaw_tokens_encoded = encode_all_lines(all_lines=flaw_lines, tokenizer=tokenizer)
    verified_flaw_lines = []
    do_explanation = False
    for i in range(len(flaw_tokens_encoded)):
        encoded_flaw = ''.join(flaw_tokens_encoded[i])
        encoded_all = ''.join(all_tokens)
        if encoded_flaw in encoded_all:
            verified_flaw_lines.append(flaw_tokens_encoded[i])
            do_explanation = True

Of course, it's the original one in your code.
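(One thing worth checking in the snippet above: do_explanation starts as False and only flips to True when a flaw line is verified against the tokenized function. If the rest of line_level_localization_tp computes total_lines only under an "if do_explanation:" branch, a sample whose flaw lines cannot be verified would reach the results dictionary with total_lines unbound, which matches the traceback. A hedged guard sketch, assuming that structure and continuing the names from the pasted function:)

    # Hedged sketch: give total_lines a safe default before any conditional logic,
    # so the results dict can be built even when do_explanation stays False.
    total_lines = 0
    if do_explanation:
        total_lines = len(original_lines)  # placeholder for the real per-line attribution logic
    results = {"total_lines": total_lines}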
