Fix nosec for nested dicts
Before this commit, nosec was searched for from the beginning
of the expression's context, which may be broader than
the exact piece of code that a developer wants to skip.
As a result, for the example below:

1. example = {
2.     'S3_CONFIG_PARAMS': dict(  # nosec B106
3.         ...
4.     ),
5.     'LOCALFS_BASEDIR': '/var/tmp/herp',  # nosec B108
6. }

for line 5, the nosec from line 2 was returned, so `nosec B108` was ignored.

This commit changes the algorithm that searches for a nosec comment for an
expression: the nosec on the exact line of the expression is now preferred.

Resolves: PyCQA#1003
kfrydel committed Mar 27, 2023
1 parent 02d73e9 commit ecd53cc
Showing 2 changed files with 13 additions and 2 deletions.
4 changes: 2 additions & 2 deletions bandit/core/utils.py
@@ -373,8 +373,8 @@ def check_ast_node(name):


 def get_nosec(nosec_lines, context):
-    for lineno in context["linerange"]:
-        nosec = nosec_lines.get(lineno, None)
+    for lineno in [context["lineno"], *context["linerange"]]:
+        nosec = nosec_lines.get(lineno)
         if nosec is not None:
             return nosec
     return None
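The effect of the change can be seen in a short, self-contained sketch: the two functions below mirror the before/after bodies of `get_nosec` from the diff, while the `nosec_lines` and `context` values are illustrative stand-ins for the structures Bandit builds from the example in the commit message.

```python
# Illustrative input: a map of line numbers to the nosec comment parsed
# on that line, matching the dict example from the commit message.
nosec_lines = {2: "# nosec B106", 5: "# nosec B108"}

# Context for the string literal on line 5: its own line is 5, but the
# enclosing dict expression spans lines 1-6.
context = {"lineno": 5, "linerange": [1, 2, 3, 4, 5, 6]}


def get_nosec_old(nosec_lines, context):
    # Old behaviour: scan the whole expression range top-down, so the
    # first nosec found (line 2) wins even for code on line 5.
    for lineno in context["linerange"]:
        nosec = nosec_lines.get(lineno, None)
        if nosec is not None:
            return nosec
    return None


def get_nosec_new(nosec_lines, context):
    # New behaviour: check the expression's exact line first, then fall
    # back to the surrounding range.
    for lineno in [context["lineno"], *context["linerange"]]:
        nosec = nosec_lines.get(lineno)
        if nosec is not None:
            return nosec
    return None


print(get_nosec_old(nosec_lines, context))  # '# nosec B106' (wrong line wins)
print(get_nosec_new(nosec_lines, context))  # '# nosec B108' (exact line wins)
```

With the old order, the `# nosec B108` on line 5 was shadowed by the earlier `# nosec B106`; preferring `context["lineno"]` makes the annotation on the expression's own line take precedence.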
11 changes: 11 additions & 0 deletions examples/nosec.py
@@ -13,3 +13,14 @@
 subprocess.Popen('/bin/ls *', shell=True) # type: ... # noqa: E501 ; pylint: disable=line-too-long # nosec
 subprocess.Popen('#nosec', shell=True) # nosec B607, B101
 subprocess.Popen('#nosec', shell=True) # nosec B602, subprocess_popen_with_shell_equals_true
+# check that nosec in nested dict does not cause "higher" annotations to be ignored
+# reproduction of https://github.com/PyCQA/bandit/issues/1003
+example = {
+    'S3_CONFIG_PARAMS': dict(  # nosec B106
+        aws_access_key_id='key_goes_here',
+        aws_secret_access_key='secret_goes_here',
+        endpoint_url='s3.amazonaws.com',
+    ),
+    'LOCALFS_BASEDIR': '/var/tmp/herp',  # nosec B108
+    'ALPINE_APORTS_DIR': '/tmp/derp',  # nosec B108
+}
