

Where the adversarials are stored. #2

Open
mhsamavatian opened this issue Feb 28, 2021 · 1 comment

Comments

@mhsamavatian

I cannot tell from the code where the adversarial images are stored. If I want to supply my own adversarial inputs to the path-generation code, where should I put them, and in what format? NumPy?

@uchuhimo
Collaborator

uchuhimo commented Mar 1, 2021

@mhsamavatian You can store your adversarial inputs wherever you like, and use the following code to generate the path (see https://github.com/Ptolemy-DL/Ptolemy/blob/master/src/nninst/backend/tensorflow/attack/common.py#L4527-L4546):

def adversarial_input_fn():
    # Normalize the raw adversarial example with the model's preprocessing.
    adversarial_input = model_config.normalize_fn(adversarial_example)
    # Wrap a plain tensor/array in a tf.data.Dataset so the input fn is uniform.
    if not isinstance(adversarial_input, tf.data.Dataset):
        adversarial_input = tf.data.Dataset.from_tensors(adversarial_input)
    return adversarial_input

# Reconstruct the activation path for the adversarial input.
trace = reconstruct_trace_from_tf_v2(
    model_fn=model_fn,
    input_fn=adversarial_input_fn,
    trace_fn=partial(
        trace_fn,
        # Seed the path from the rank-th highest output logit.
        select_seed_fn=lambda output: arg_sorted_topk(output, rank)[
            rank - 1 : rank
        ],
    ),
    model_dir=model_dir,
    rank=rank,
)[0]

Here, adversarial_example is your adversarial input as a NumPy array, and trace is the generated path.

We store our adversarial inputs in store/example/{attack_name}/{name}/{class_id}/{image_id}.pkl (see https://github.com/Ptolemy-DL/Ptolemy/blob/master/src/nninst/backend/tensorflow/attack/common.py#L1163), e.g., store/example/FGSM/alexnet_imagenet/883/0.pkl is an adversarial input generated by the FGSM attack for AlexNet on ImageNet.
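To illustrate the storage layout described above, here is a minimal sketch of saving and loading an adversarial example as a pickled NumPy array under the store/example/{attack_name}/{name}/{class_id}/{image_id}.pkl convention. The helper functions (save_adversarial_example, load_adversarial_example) and the 224x224x3 image shape are assumptions for illustration, not part of Ptolemy's API:

```python
import os
import pickle

import numpy as np


def save_adversarial_example(example, attack_name, name, class_id, image_id,
                             root="store/example"):
    """Hypothetical helper: pickle an adversarial example into the
    store/example/{attack_name}/{name}/{class_id}/{image_id}.pkl layout."""
    path = os.path.join(root, attack_name, name, str(class_id), f"{image_id}.pkl")
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "wb") as f:
        pickle.dump(example, f)
    return path


def load_adversarial_example(path):
    """Load a pickled adversarial example back as a NumPy array."""
    with open(path, "rb") as f:
        return pickle.load(f)


# Example: a placeholder 224x224 RGB image for ImageNet class 883, image id 0.
example = np.zeros((224, 224, 3), dtype=np.float32)
path = save_adversarial_example(example, "FGSM", "alexnet_imagenet", 883, 0)
restored = load_adversarial_example(path)
assert np.array_equal(example, restored)
```

The array loaded this way could then serve as adversarial_example in the path-generation snippet above.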
