
FGM is wrong + extend to all p >= 1 #2381

Open
ego-thales opened this issue Jan 9, 2024 · 5 comments · May be fixed by #2382
ego-thales commented Jan 9, 2024

Hello,

I'm not sure, but I think the FGM extension to the $L^1$ norm is not correct.

From what I can read here, it seems to me that the current version implements (essentially)
$$\text{noise direction}=\frac{\nabla}{\Vert\nabla\Vert_1}$$
when
$$\text{noise direction}=(0, \dots, 0, \text{sign}(\nabla_i), 0, \dots, 0),\quad(i=\text{argmax}_j\vert\nabla_j\vert)$$
gives a higher inner product $\langle\nabla,\text{noise direction}\rangle$ for the same $L¹$ budget.

Indeed, in both cases $\Vert\text{noise direction}\Vert_1=1$, while the first and second options respectively give inner products $\Vert\nabla\Vert_2^2/\Vert\nabla\Vert_1$ and $\Vert\nabla\Vert_{\infty}$. The latter is of course at least as large, by Hölder's inequality: $\Vert\nabla\Vert_2^2=\langle\nabla,\nabla\rangle\leq\Vert\nabla\Vert_1\Vert\nabla\Vert_{\infty}$.
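A quick numerical check of the claim (a hypothetical sketch with NumPy, not ART code): for a random gradient, the one-hot sign direction achieves a larger inner product than the $L^1$-normalized gradient under the same unit $L^1$ budget.

```python
import numpy as np

rng = np.random.default_rng(0)
grad = rng.normal(size=100)  # stand-in for a loss gradient

# Current (suboptimal) direction: gradient normalized to unit L1 norm.
d_current = grad / np.abs(grad).sum()

# Proposed direction: all mass on the largest-magnitude coordinate.
d_optimal = np.zeros_like(grad)
i = np.argmax(np.abs(grad))
d_optimal[i] = np.sign(grad[i])

# Both directions have unit L1 norm...
assert np.isclose(np.abs(d_current).sum(), 1.0)
assert np.isclose(np.abs(d_optimal).sum(), 1.0)

# ...but <grad, d_current> = ||grad||_2^2 / ||grad||_1
#    <= ||grad||_inf = <grad, d_optimal>   (Hölder)
assert grad @ d_current <= grad @ d_optimal
```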


Edit: See here for generalization to all $p\in[1, +\infty]$.

@ego-thales ego-thales changed the title $L¹$ FGM is wrong? FGM is wrong? Jan 9, 2024
@beat-buesser (Collaborator)

Hi @ego-thales Thank you for this comment. Without deciding on the correctness yet, how did you notice this issue? Have you already checked which version the literature on FGM is using?

@ego-thales (Author)

Thanks for your answer,

I stumbled upon this while reading the FGSM paper (the reference for the implementation) and thinking about generalizing to $L^p$ norms. I then saw that this repo implements the $L^1$ and $L^2$ extensions specifically, so I went and checked the code (since no source is cited for the maths used) and noticed this (apparently) suboptimal implementation.

@eliegoudout eliegoudout linked a pull request Jan 9, 2024 that will close this issue

ego-thales commented Jan 9, 2024

Actually, now that I think about it, I see no reason why this attack shouldn't be generalized to arbitrary $L^p$ noise.

Let $p\in[1, +\infty]$ and $q$ such that $\frac{1}{p}+\frac{1}{q}=1$ (with the usual abuse of notation when $p=1$ or $p=+\infty$). With

$$\text{noise direction}:=\left(\frac{\vert\nabla\vert}{\Vert\nabla\Vert_q}\right)^{q/p}\text{sign}(\nabla),$$
one gets:

  • $\Vert\text{noise direction}\Vert_p=1$,
  • $\langle \nabla, \text{noise direction}\rangle=\Vert\nabla\Vert_q$ (I skip the quick computation; it mainly relies on $\frac{q}{p}+1=q$), which is the equality case of Hölder's inequality and, as such, optimal.

It would therefore be a nice addition to entirely generalize FGM to all $p\geq 1$.
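The construction above can be sketched numerically (a hypothetical helper, not the ART API; it assumes $1 < p < +\infty$, with $p=1$ reducing to the one-hot sign direction and $p=+\infty$ to the elementwise sign):

```python
import numpy as np

def lp_noise_direction(grad, p):
    """Unit-L^p direction maximizing <grad, d> (Hölder equality case).

    Hypothetical sketch for 1 < p < inf; p = 1 (one-hot sign) and
    p = inf (elementwise sign) are the limit cases.
    """
    q = p / (p - 1)  # conjugate exponent: 1/p + 1/q = 1
    abs_g = np.abs(grad)
    norm_q = (abs_g ** q).sum() ** (1.0 / q)
    return (abs_g / norm_q) ** (q / p) * np.sign(grad)

grad = np.array([3.0, -4.0, 1.0])
for p in (1.5, 2.0, 3.0):
    d = lp_noise_direction(grad, p)
    q = p / (p - 1)
    # ||d||_p = 1, and <grad, d> attains the optimum ||grad||_q.
    assert np.isclose((np.abs(d) ** p).sum(), 1.0)
    assert np.isclose(grad @ d, (np.abs(grad) ** q).sum() ** (1.0 / q))
```

Note that $p=2$ recovers the usual $\nabla/\Vert\nabla\Vert_2$ direction, as expected.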

@ego-thales ego-thales changed the title FGM is wrong? FGM is wrong + extend to all _p_ Jan 9, 2024
@ego-thales ego-thales changed the title FGM is wrong + extend to all _p_ FGM is wrong + extend to all p >= 1 Jan 9, 2024
@beat-buesser
Copy link
Collaborator

Hi @ego-thales Thank you very much for the explanation and pull request! Let me take a closer look at the required changes. Related to this issue in FGSM, what do you think about the perturbation per iteration and overall perturbation calculation for p=1 in the Projected Gradient Descent attacks in art.attacks.evasion.projected_gradient_descent.*?

@eliegoudout
I'm not entirely sure, but after a quick glance it looks to me like PGD was implemented as a subclass of FGSM and inherits its loss from it.
