
Fix multiple Sphinx warnings & docstrings #985

Closed · wants to merge 95 commits

Conversation

ProGamerGov (Contributor) commented Jun 30, 2022

The warning messages take up a lot of space in the console log, and they were simple to resolve.

The common.rst file was also pointing to the wrong path, and some of the functions had been renamed since it was created, so no docs were being generated for that page. InputBaselineXGradient was also removed from the public rst API docs, as it's not supposed to be public.

In addition to these easy doc-warning fixes, I found a module that was listed on the API docs site even though there's no public path to use it and no tests were ever written for it. I opened an issue for it here: #989

I fixed assorted docstring issues as I came across them: inconsistent capitalization of Any and Callable, spacing, and random type / case mistakes.

I also fixed some paths.

Issues with upgrading to later versions of Sphinx were also resolved.


Currently Sphinx gives the following warnings / errors for the master branch:

/content/captum/sphinx/source/base_classes.rst:2: WARNING: Title underline too short.

Base Classes
==========
/content/captum/sphinx/source/base_classes.rst:29: WARNING: Title underline too short.

Perturbation Attribution
^^^^^^^^^^^^^^^^^^^^^
/content/captum/sphinx/source/base_classes.rst:29: WARNING: Title underline too short.

Perturbation Attribution
^^^^^^^^^^^^^^^^^^^^^
WARNING: autodoc: failed to import function 'validate_input' from module 'captum.attr._utils.common'; the following exception was raised:
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/sphinx/util/inspect.py", line 448, in safe_getattr
    return getattr(obj, name, *defargs)
AttributeError: module 'captum.attr._utils.common' has no attribute 'validate_input'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/importer.py", line 110, in import_object
    obj = attrgetter(obj, mangled_name)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/__init__.py", line 332, in get_attr
    return autodoc_attrgetter(self.env.app, obj, name, *defargs)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/__init__.py", line 2780, in autodoc_attrgetter
    return safe_getattr(obj, name, *defargs)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/util/inspect.py", line 464, in safe_getattr
    raise AttributeError(name) from exc
AttributeError: validate_input

WARNING: autodoc: failed to import function 'validate_noise_tunnel_type' from module 'captum.attr._utils.common'; the following exception was raised:
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/sphinx/util/inspect.py", line 448, in safe_getattr
    return getattr(obj, name, *defargs)
AttributeError: module 'captum.attr._utils.common' has no attribute 'validate_noise_tunnel_type'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/importer.py", line 110, in import_object
    obj = attrgetter(obj, mangled_name)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/__init__.py", line 332, in get_attr
    return autodoc_attrgetter(self.env.app, obj, name, *defargs)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/__init__.py", line 2780, in autodoc_attrgetter
    return safe_getattr(obj, name, *defargs)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/util/inspect.py", line 464, in safe_getattr
    raise AttributeError(name) from exc
AttributeError: validate_noise_tunnel_type

WARNING: autodoc: failed to import function 'format_input' from module 'captum.attr._utils.common'; the following exception was raised:
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/sphinx/util/inspect.py", line 448, in safe_getattr
    return getattr(obj, name, *defargs)
AttributeError: module 'captum.attr._utils.common' has no attribute 'format_input'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/importer.py", line 110, in import_object
    obj = attrgetter(obj, mangled_name)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/__init__.py", line 332, in get_attr
    return autodoc_attrgetter(self.env.app, obj, name, *defargs)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/__init__.py", line 2780, in autodoc_attrgetter
    return safe_getattr(obj, name, *defargs)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/util/inspect.py", line 464, in safe_getattr
    raise AttributeError(name) from exc
AttributeError: format_input

WARNING: autodoc: failed to import function '_format_attributions' from module 'captum.attr._utils.common'; the following exception was raised:
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/sphinx/util/inspect.py", line 448, in safe_getattr
    return getattr(obj, name, *defargs)
AttributeError: module 'captum.attr._utils.common' has no attribute '_format_attributions'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/importer.py", line 110, in import_object
    obj = attrgetter(obj, mangled_name)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/__init__.py", line 332, in get_attr
    return autodoc_attrgetter(self.env.app, obj, name, *defargs)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/__init__.py", line 2780, in autodoc_attrgetter
    return safe_getattr(obj, name, *defargs)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/util/inspect.py", line 464, in safe_getattr
    raise AttributeError(name) from exc
AttributeError: _format_attributions

WARNING: autodoc: failed to import function 'zeros' from module 'captum.attr._utils.common'; the following exception was raised:
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/sphinx/util/inspect.py", line 448, in safe_getattr
    return getattr(obj, name, *defargs)
AttributeError: module 'captum.attr._utils.common' has no attribute 'zeros'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/importer.py", line 110, in import_object
    obj = attrgetter(obj, mangled_name)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/__init__.py", line 332, in get_attr
    return autodoc_attrgetter(self.env.app, obj, name, *defargs)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/__init__.py", line 2780, in autodoc_attrgetter
    return safe_getattr(obj, name, *defargs)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/util/inspect.py", line 464, in safe_getattr
    raise AttributeError(name) from exc
AttributeError: zeros

WARNING: autodoc: failed to import function '_run_forward' from module 'captum.attr._utils.common'; the following exception was raised:
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/sphinx/util/inspect.py", line 448, in safe_getattr
    return getattr(obj, name, *defargs)
AttributeError: module 'captum.attr._utils.common' has no attribute '_run_forward'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/importer.py", line 110, in import_object
    obj = attrgetter(obj, mangled_name)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/__init__.py", line 332, in get_attr
    return autodoc_attrgetter(self.env.app, obj, name, *defargs)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/__init__.py", line 2780, in autodoc_attrgetter
    return safe_getattr(obj, name, *defargs)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/util/inspect.py", line 464, in safe_getattr
    raise AttributeError(name) from exc
AttributeError: _run_forward

/content/captum/sphinx/source/concept.rst:2: WARNING: Title underline too short.

Concept-based Interpretability
======
/content/captum/sphinx/source/concept.rst:12: WARNING: Title underline too short.

ConceptInterpreter
^^^^^^^^^^^^^^^^
/content/captum/sphinx/source/concept.rst:12: WARNING: Title underline too short.

ConceptInterpreter
^^^^^^^^^^^^^^^^
/content/captum/sphinx/source/deconvolution.rst:2: WARNING: Title underline too short.

Deconvolution
=========
/content/captum/captum/attr/_core/deep_lift.py:docstring of captum.attr._core.deep_lift.DeepLiftShap:12: WARNING: Definition list ends without a blank line; unexpected unindent.
/content/captum/sphinx/source/feature_ablation.rst:2: WARNING: Title underline too short.

Feature Ablation
=========
/content/captum/captum/attr/_core/feature_ablation.py:docstring of captum.attr._core.feature_ablation.FeatureAblation.attribute:36: WARNING: Bullet list ends without a blank line; unexpected unindent.
/content/captum/sphinx/source/feature_permutation.rst:2: WARNING: Title underline too short.

Feature Permutation
=========
WARNING: autodoc: failed to import class 'InputBaselineXGradient' from module 'captum.attr'; the following exception was raised:
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/sphinx/util/inspect.py", line 448, in safe_getattr
    return getattr(obj, name, *defargs)
AttributeError: module 'captum.attr' has no attribute 'InputBaselineXGradient'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/importer.py", line 110, in import_object
    obj = attrgetter(obj, mangled_name)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/__init__.py", line 332, in get_attr
    return autodoc_attrgetter(self.env.app, obj, name, *defargs)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/__init__.py", line 2780, in autodoc_attrgetter
    return safe_getattr(obj, name, *defargs)
  File "/usr/local/lib/python3.7/dist-packages/sphinx/util/inspect.py", line 464, in safe_getattr
    raise AttributeError(name) from exc
AttributeError: InputBaselineXGradient

/content/captum/sphinx/source/guided_backprop.rst:2: WARNING: Title underline too short.

Guided Backprop
=========
/content/captum/sphinx/source/guided_grad_cam.rst:2: WARNING: Title underline too short.

Guided GradCAM
=========
/content/captum/sphinx/source/influence.rst:2: WARNING: Title underline too short.

Influential Examples
======
/content/captum/sphinx/source/influence.rst:12: WARNING: Title underline too short.

SimilarityInfluence
^^^^^^^^^^^^^^^^^
/content/captum/sphinx/source/influence.rst:12: WARNING: Title underline too short.

SimilarityInfluence
^^^^^^^^^^^^^^^^^
/content/captum/captum/influence/_core/tracincp.py:docstring of captum.influence._core.tracincp.TracInCPBase.influence:61: WARNING: Inline interpreted text or phrase reference start-string without end-string.
/content/captum/captum/influence/_core/tracincp.py:docstring of captum.influence._core.tracincp.TracInCP.influence:61: WARNING: Inline interpreted text or phrase reference start-string without end-string.
/content/captum/captum/influence/_core/tracincp_fast_rand_proj.py:docstring of captum.influence._core.tracincp_fast_rand_proj.TracInCPFast.influence:62: WARNING: Inline interpreted text or phrase reference start-string without end-string.
/content/captum/sphinx/source/influence.rst:38: WARNING: Title underline too short.

TracInCPFastRandProj
^^^^^^^^^^^^^^^^
/content/captum/sphinx/source/influence.rst:38: WARNING: Title underline too short.

TracInCPFastRandProj
^^^^^^^^^^^^^^^^
/content/captum/captum/influence/_core/tracincp_fast_rand_proj.py:docstring of captum.influence._core.tracincp_fast_rand_proj.TracInCPFastRandProj:1: WARNING: Inline literal start-string without end-string.
/content/captum/sphinx/source/input_x_gradient.rst:2: WARNING: Title underline too short.

Input X Gradient
===============
WARNING: autodoc: failed to import class 'api.Batch' from module 'captum.insights'; the following exception was raised:
No module named 'captum.insights.api'
WARNING: autodoc: failed to import class 'api.AttributionVisualizer' from module 'captum.insights'; the following exception was raised:
No module named 'captum.insights.api'
WARNING: autodoc: failed to import class 'features.BaseFeature' from module 'captum.insights'; the following exception was raised:
No module named 'captum.insights.features'
WARNING: autodoc: failed to import class 'features.GeneralFeature' from module 'captum.insights'; the following exception was raised:
No module named 'captum.insights.features'
WARNING: autodoc: failed to import class 'features.TextFeature' from module 'captum.insights'; the following exception was raised:
No module named 'captum.insights.features'
WARNING: autodoc: failed to import class 'features.ImageFeature' from module 'captum.insights'; the following exception was raised:
No module named 'captum.insights.features'
/content/captum/captum/attr/_core/integrated_gradients.py:docstring of captum.attr._core.integrated_gradients.IntegratedGradients.attribute:43: WARNING: Bullet list ends without a blank line; unexpected unindent.
/content/captum/captum/attr/_core/kernel_shap.py:docstring of captum.attr._core.kernel_shap.KernelShap.attribute:66: WARNING: Bullet list ends without a blank line; unexpected unindent.
/content/captum/captum/attr/_core/kernel_shap.py:docstring of captum.attr._core.kernel_shap.KernelShap.kernel_shap_perturb_generator:4: WARNING: Block quote ends without a blank line; unexpected unindent.
/content/captum/sphinx/source/layer.rst:2: WARNING: Title underline too short.

Layer Attribution
======
/content/captum/captum/attr/_core/layer/layer_conductance.py:docstring of captum.attr._core.layer.layer_conductance.LayerConductance.attribute:35: WARNING: Bullet list ends without a blank line; unexpected unindent.
/content/captum/sphinx/source/layer.rst:18: WARNING: Title underline too short.

Internal Influence
^^^^^^^^^^^^^^^^^
/content/captum/sphinx/source/layer.rst:18: WARNING: Title underline too short.

Internal Influence
^^^^^^^^^^^^^^^^^
/content/captum/captum/attr/_core/layer/internal_influence.py:docstring of captum.attr._core.layer.internal_influence.InternalInfluence.attribute:209: WARNING: Inline interpreted text or phrase reference start-string without end-string.
/content/captum/sphinx/source/layer.rst:24: WARNING: Title underline too short.

Layer Gradient X Activation
^^^^^^^^^^^^^^^^^^^^^^^^^
/content/captum/sphinx/source/layer.rst:24: WARNING: Title underline too short.

Layer Gradient X Activation
^^^^^^^^^^^^^^^^^^^^^^^^^
/content/captum/captum/attr/_core/layer/layer_deep_lift.py:docstring of captum.attr._core.layer.layer_deep_lift.LayerDeepLift.attribute:39: WARNING: Bullet list ends without a blank line; unexpected unindent.
/content/captum/captum/attr/_core/layer/layer_deep_lift.py:docstring of captum.attr._core.layer.layer_deep_lift.LayerDeepLiftShap:16: WARNING: Definition list ends without a blank line; unexpected unindent.
/content/captum/sphinx/source/layer.rst:54: WARNING: Title underline too short.

Layer Integrated Gradients
^^^^^^^^^^^^^^^^^^^^^^^^^
/content/captum/sphinx/source/layer.rst:54: WARNING: Title underline too short.

Layer Integrated Gradients
^^^^^^^^^^^^^^^^^^^^^^^^^
/content/captum/captum/attr/_core/layer/layer_integrated_gradients.py:docstring of captum.attr._core.layer.layer_integrated_gradients.LayerIntegratedGradients.attribute:35: WARNING: Unexpected indentation.
/content/captum/captum/attr/_core/layer/layer_integrated_gradients.py:docstring of captum.attr._core.layer.layer_integrated_gradients.LayerIntegratedGradients.attribute:140: WARNING: Unexpected indentation.
/content/captum/captum/attr/_core/layer/layer_integrated_gradients.py:docstring of captum.attr._core.layer.layer_integrated_gradients.LayerIntegratedGradients.attribute:158: WARNING: Block quote ends without a blank line; unexpected unindent.
/content/captum/captum/attr/_core/layer/layer_lrp.py:docstring of captum.attr._core.layer.layer_lrp.LayerLRP.attribute:79: WARNING: Definition list ends without a blank line; unexpected unindent.
/content/captum/captum/attr/_core/layer/layer_lrp.py:docstring of captum.attr._core.layer.layer_lrp.LayerLRP.attribute:93: WARNING: Block quote ends without a blank line; unexpected unindent.
/content/captum/captum/attr/_core/lime.py:docstring of captum.attr._core.lime.LimeBase.attribute:111: WARNING: Inline strong start-string without end-string.
/content/captum/captum/attr/_core/lime.py:docstring of captum.attr._core.lime.Lime.attribute:66: WARNING: Bullet list ends without a blank line; unexpected unindent.
/content/captum/captum/attr/_core/lrp.py:docstring of captum.attr._core.lrp.LRP.attribute:68: WARNING: Unexpected indentation.
/content/captum/captum/attr/_core/lrp.py:docstring of captum.attr._core.lrp.LRP.attribute:80: WARNING: Block quote ends without a blank line; unexpected unindent.
/content/captum/sphinx/source/metrics.rst:2: WARNING: Title underline too short.

Metrics
======
/content/captum/captum/metrics/_core/infidelity.py:docstring of captum.metrics._core.infidelity.infidelity:83: WARNING: Definition list ends without a blank line; unexpected unindent.
/content/captum/captum/metrics/_core/sensitivity.py:docstring of captum.metrics._core.sensitivity.sensitivity_max:112: WARNING: Inline strong start-string without end-string.
/content/captum/sphinx/source/neuron.rst:2: WARNING: Title underline too short.

Neuron Attribution
=======
/content/captum/sphinx/source/neuron.rst:11: WARNING: Title underline too short.

Neuron Integrated Gradients
^^^^^^^^^^^^^^^^^^^^^^^^^^
/content/captum/sphinx/source/neuron.rst:11: WARNING: Title underline too short.

Neuron Integrated Gradients
^^^^^^^^^^^^^^^^^^^^^^^^^^
/content/captum/captum/attr/_core/neuron/neuron_deep_lift.py:docstring of captum.attr._core.neuron.neuron_deep_lift.NeuronDeepLiftShap:16: WARNING: Definition list ends without a blank line; unexpected unindent.
/content/captum/captum/attr/_core/neuron/neuron_feature_ablation.py:docstring of captum.attr._core.neuron.neuron_feature_ablation.NeuronFeatureAblation.attribute:69: WARNING: Bullet list ends without a blank line; unexpected unindent.
/content/captum/captum/attr/_core/noise_tunnel.py:docstring of captum.attr._core.noise_tunnel.NoiseTunnel:18: WARNING: Definition list ends without a blank line; unexpected unindent.
/content/captum/captum/attr/_core/occlusion.py:docstring of captum.attr._core.occlusion.Occlusion.attribute:68: WARNING: Bullet list ends without a blank line; unexpected unindent.
WARNING: autodoc: failed to import module 'pytext' from module 'captum.attr._models'; the following exception was raised:
No module named 'pytext'
WARNING: don't know which module to import for autodocumenting 'PyTextInterpretableEmbedding' (try placing a "module" or "currentmodule" directive in the document, or giving an explicit module name)
WARNING: don't know which module to import for autodocumenting 'BaselineGenerator' (try placing a "module" or "currentmodule" directive in the document, or giving an explicit module name)
/content/captum/sphinx/source/robust.rst:2: WARNING: Title underline too short.

Robustness
======
/content/captum/sphinx/source/robust.rst:26: WARNING: Title underline too short.

Min Param Perturbation
^^^^^^^^^^^^^^^^
/content/captum/sphinx/source/robust.rst:26: WARNING: Title underline too short.

Min Param Perturbation
^^^^^^^^^^^^^^^^
/content/captum/captum/robust/_core/metrics/min_param_perturbation.py:docstring of captum.robust._core.metrics.min_param_perturbation.MinParamPerturbation:75: WARNING: Inline strong start-string without end-string.
/content/captum/sphinx/source/shapley_value_sampling.rst:2: WARNING: Title underline too short.

Shapley Value Sampling
=========
/content/captum/captum/attr/_core/shapley_value.py:docstring of captum.attr._core.shapley_value.ShapleyValueSampling.attribute:42: WARNING: Bullet list ends without a blank line; unexpected unindent.
/content/captum/captum/attr/_core/shapley_value.py:docstring of captum.attr._core.shapley_value.ShapleyValues.attribute:42: WARNING: Bullet list ends without a blank line; unexpected unindent.
/content/captum/captum/attr/_utils/visualization.py:docstring of captum.attr._utils.visualization.visualize_image_attr:37: WARNING: Enumerated list ends without a blank line; unexpected unindent.
/content/captum/captum/attr/_utils/visualization.py:docstring of captum.attr._utils.visualization.visualize_image_attr:54: WARNING: Enumerated list ends without a blank line; unexpected unindent.
looking for now-outdated files... none found
pickling environment... done
checking consistency... /content/captum/sphinx/source/approximation_methods.rst: WARNING: document isn't included in any toctree
/content/captum/sphinx/source/common.rst: WARNING: document isn't included in any toctree
/content/captum/sphinx/source/pytext.rst: WARNING: document isn't included in any toctree
done
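All the "Title underline too short" warnings in the log above share one mechanical fix: extend the underline to at least the title's length. As an illustration only (a hypothetical helper, not code from this PR), the padding can be automated:

```python
# Hypothetical helper (not part of this PR) showing the mechanical fix
# behind every "Title underline too short" warning above: pad each
# section underline so it is at least as long as its title.
UNDERLINE_CHARS = set("=-^~\"'`#*+.:")

def fix_underlines(rst_text: str) -> str:
    lines = rst_text.splitlines()
    for i in range(1, len(lines)):
        underline, title = lines[i], lines[i - 1]
        # A reST underline is a run of one punctuation character
        # directly below a non-empty title line.
        if (
            underline
            and title.strip()
            and len(set(underline)) == 1
            and underline[0] in UNDERLINE_CHARS
            and len(underline) < len(title)
        ):
            lines[i] = underline[0] * len(title)
    return "\n".join(lines)
```

Running it over a file like base_classes.rst would lengthen `==========` under "Base Classes" to twelve characters, which is exactly the kind of change this PR makes by hand.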

With the changes in this PR, Sphinx now gives only the following warnings instead of the multi-page list above:

/content/captum/captum/attr/_core/layer/layer_integrated_gradients.py:docstring of captum.attr._core.layer.layer_integrated_gradients.LayerIntegratedGradients.attribute:147: WARNING: Bullet list ends without a blank line; unexpected unindent.
/content/captum/captum/attr/_core/lime.py:docstring of captum.attr._core.lime.LimeBase.attribute:111: WARNING: Inline strong start-string without end-string.
/content/captum/captum/metrics/_core/sensitivity.py:docstring of captum.metrics._core.sensitivity.sensitivity_max:112: WARNING: Inline strong start-string without end-string.
WARNING: autodoc: failed to import class 'pytext.PyTextInterpretableEmbedding' from module 'captum.attr._models'; the following exception was raised:
No module named 'pytext'
WARNING: autodoc: failed to import class 'pytext.BaselineGenerator' from module 'captum.attr._models'; the following exception was raised:
No module named 'pytext'
/content/captum/captum/robust/_core/metrics/min_param_perturbation.py:docstring of captum.robust._core.metrics.min_param_perturbation.MinParamPerturbation:77: WARNING: Inline strong start-string without end-string.
looking for now-outdated files... none found
pickling environment... done
checking consistency... /content/captum/sphinx/source/pytext.rst: WARNING: document isn't included in any toctree
done

@ProGamerGov ProGamerGov changed the title Fix "WARNING: Title underline too short." message from Sphinx Fix multiple Sphinx warnings Jul 5, 2022
@ProGamerGov ProGamerGov changed the title Fix multiple Sphinx warnings Fix multiple Sphinx warnings & docstrings Jul 18, 2022
ProGamerGov (Contributor, Author) commented Jul 19, 2022

I've come across some issues:

  • Doc variables with the lowercase any may be read as referencing the any() function rather than the type, so instead of any we should probably use the uppercase Any.

  • The callable type has the same issue as any, since there is also a callable() function.

  • Which format is correct for a tuple of known size: tuple of int, tuple(int, int), or tuple[int, int]? I've seen conflicting examples in Captum's code.

  • Are we supposed to append an 's' to types in the doc variables, e.g. tuple of ints (plural) instead of tuple of int? There are conflicting examples in Captum's code.

  • Is there a specific format for dict, such as dict[int, float]?

  • I assume the format for listing multiple type options is (type1 or type2) and (type1, type2, or type3)?
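For concreteness, here is a hypothetical docstring (none of these names are real Captum code) showing the competing formats side by side:

```python
def blur(image, kernel_size, weights=None):
    """
    Hypothetical example illustrating the docstring type formats
    under discussion; the function and its parameters are made up.

    Args:

        image (Tensor): The input image.
        kernel_size (tuple[int, int]): Square-bracket style for a
            fixed-size tuple; tuple of int and tuple(int, int) are
            the competing spellings seen in the codebase.
        weights (list of int, optional): The "of" style with a
            singular type name (no trailing 's').
    """
    return image
```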

ProGamerGov (Contributor, Author) commented Jul 20, 2022

I've also noticed a major mistake in how code snippets are formatted in docstrings.

The other Captum modules all use this incorrect format, which doesn't render on the website:

  • `code_highlight`: works for Markdown only.

They should be using this format, which works on both the website and in Markdown:

  • ``code_highlight``: works for Markdown and reStructuredText (reST).

I'm working on a regex solution to fix this issue Captum-wide, but that'll probably be for another PR.
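A Captum-wide regex cleanup could look something like this sketch (the helper name is hypothetical; the idea is to upgrade single backticks to reST double backticks while leaving existing ``literals`` and :role:`targets` alone):

```python
import re

# Hedged sketch of a single-to-double backtick cleanup, not the
# script actually used in the PR. The negative lookbehinds skip
# backticks that are already doubled or that open a :role:`target`
# construct, so only Markdown-style `code` spans are rewritten.
def fix_inline_literals(docstring: str) -> str:
    pattern = r"(?<!`)(?<!:)`([^`]+)`(?!`)"
    return re.sub(pattern, r"``\1``", docstring)
```

This handles the common cases, though a real pass over the codebase would want a review of the diff, since docstrings can contain backticks in contexts a regex cannot fully disambiguate.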


You can also link directly to functions / classes by name, as shown below, but this is not something I can easily automate:

# Other classes and functions in the same file
# Sometimes this format may require the '.' prefix to work

:class:`MyClass`

:func:`myclass.my_func`

:func:`my_func`


# Classes and functions in another file (prefix of '.')

:class:`.MyClass`

:func:`.MyClass.my_func`

:func:`.my_func`


# You can also give exact paths

:class:`captum.module.MyClass`

:func:`captum.module.MyClass.my_func`

:func:`captum.module.my_func`


# Or you can show a different name for the hyperlink link

:func:`my_func_name <captum.module.my_func>`

NarineK (Contributor) commented Jul 21, 2022

  • callable

Thank you for the suggestion, @ProGamerGov! Regarding any vs. Any: currently I don't see the website linking to the right type; it isn't generating any link at all. I see the same issue for Callable.
https://captum.ai/api/layer.html
Do you see any links generated for Any or any?

I think we can use tuple of int, similar to what we do for tensors, e.g. tuple of tensors.

For tuple of int, I don't think we need to add an 's' at the end.

I also looked into the PyTorch docs, and it looks like they just annotate it with dict:
https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/module.py#L1508
I think we have been using dict[type1, type2] in most cases.

Do you mean this for tuples: (type1 or type2) and (type1, type2, or type3)?

ProGamerGov (Contributor, Author) commented Jul 21, 2022

@NarineK For Callable and Any, I'm planning to use the function below in conf.py to ensure that each type is hyperlinked by intersphinx. Whether we use uppercase or lowercase isn't important, since we can hyperlink either option, but we are linking to the uppercase type hints Callable and Any. The PyTorch library code seems to use the upper- and lowercase versions interchangeably.

There aren't any links for Callable or Any currently, but the code below will ensure they are created when generating the Sphinx webpages. (This function hasn't been added to this PR, as I was planning to add it with the rest of the Optim module. However, I can add it to the master branch now if you want.)

import re
from typing import List


def autodoc_process_docstring(
    app, what: str, name: str, obj, options, lines: List[str]
) -> None:
    """
    Modify docstrings before creating html files.

    See here for more information:
    https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html
    """
    for i in range(len(lines)):
        # Skip unless the line is a parameter doc or a return doc
        if not (lines[i].startswith(":type") or lines[i].startswith(":rtype")):
            continue

        # Change "nn.Module" to "torch.nn.Module" in doc type hints for intersphinx
        lines[i] = re.sub(r"\bnn.Module\b", "torch.nn.Module", lines[i])
        lines[i] = lines[i].replace("torch.torch.", "torch.")

        # Ensure nn.Module and torch.Tensor are hyperlinked
        lines[i] = re.sub(r"\btorch.nn.Module\b", ":obj:`torch.nn.Module`", lines[i])
        lines[i] = re.sub(r"\btorch.Tensor\b", ":obj:`torch.Tensor`", lines[i])

        # Handle Any & Callable types
        lines[i] = re.sub(r"\bAny\b", ":obj:`Any <typing.Any>`", lines[i])
        lines[i] = re.sub(r"\bany\b", ":obj:`Any <typing.Any>`", lines[i])
        lines[i] = re.sub(
            r"\bCallable\b", ":obj:`Callable <typing.Callable>`", lines[i]
        )
        lines[i] = re.sub(
            r"\bcallable\b", ":obj:`Callable <typing.Callable>`", lines[i]
        )

        # Handle list & tuple types
        lines[i] = re.sub(r"\blist\b", ":obj:`list`", lines[i])
        lines[i] = re.sub(r"\btuple\b", ":obj:`tuple`", lines[i])

        # Handle str, bool, & slice types
        lines[i] = re.sub(r"\bstr\b", ":obj:`str`", lines[i])
        lines[i] = re.sub(r"\bbool\b", ":obj:`bool`", lines[i])
        lines[i] = re.sub(r"\bslice\b", ":obj:`slice`", lines[i])

        # Handle int & float types
        lines[i] = re.sub(r"\bint\b", ":obj:`int`", lines[i])
        lines[i] = re.sub(r"\bfloat\b", ":obj:`float`", lines[i])
        lines[i] = re.sub(r"\bints\b", ":obj:`ints <int>`", lines[i])
        lines[i] = re.sub(r"\bfloats\b", ":obj:`floats <float>`", lines[i])

        # Handle tensor types that are using lowercase
        # Bolding return types doesn't work with Sphinx hyperlinks
        lines[i] = lines[i].replace("*tensors*", "tensors")
        lines[i] = lines[i].replace("*tensor*", "tensor")
        lines[i] = re.sub(r"\btensor\b", ":class:`tensor <torch.Tensor>`", lines[i])
        lines[i] = re.sub(r"\btensors\b", ":class:`tensors <torch.Tensor>`", lines[i])

        # Handle None type
        lines[i] = re.sub(r"\bNone\b", ":obj:`None`", lines[i])


def setup(app) -> None:
    app.connect("autodoc-process-docstring", autodoc_process_docstring)

Do you mean this for tuples: (type1 or type2) and (type1, type2, or type3)?

I was referring to listing multiple types for a variable in the docs like this:

def func(x, y, z):
    """
    Args:

        x (type1, type2, or type3): Variable description.
        y (type1 or type2): Variable description.
        z (tuple of type1 or list of type2): Variable description.
    """

I think I saw a few cases where this was used: (type1 or type2 or type3 or type4). Though it's not really that important for this PR.

ProGamerGov (Contributor, Author) commented:

Now that my PR has been merged, PyTorch officially has the same docstring type guidelines as the ones I've created for Captum in this PR!

pytorch/pytorch#83536

https://github.com/pytorch/pytorch/blob/master/CONTRIBUTING.md#docstring-type-formatting

ProGamerGov (Contributor, Author) commented:

@NarineK @aobo-y This PR is ready for merging! Changes were made based on the reviewer feedback, and it would probably be best to merge this PR before other PRs end up conflicting with it.

aobo-y (Contributor) commented Sep 1, 2022

@ProGamerGov This PR contains too much content for me to review directly. Could you help me break things down a little?

Based on my understanding, the changes can be categorized into the following areas:

  • typos and wording
  • fixes for Sphinx build errors/warnings
  • docstring types
    • as you have already noticed, we have a standardized style now; do you think we need any updates here? To be clear, we are definitely not suggesting you correct all existing errors, just checking that this PR won't diverge further from our style.
  • a function to ensure hyperlinks for Python built-in types on the web

Did I miss anything?

ProGamerGov (Contributor, Author) commented Sep 1, 2022

@aobo-y So in addition to what you listed, there are the following changes:

  • Add return type hints to the __len__ functions (__len__(self) -> int:) and to the __getitem__ functions.
  • Convert all http links to https in docstrings for enhanced security & privacy.
  • Changed arxiv.org links to not link directly to the PDF.
  • Added ignore-members to some rst files so that Sphinx 5.x and up is supported.
  • Changed the name of docs/algorithms.md to docs/attribution_algorithms.md as Narine asked me to do in the email thread.

I can easily change all references to lowercase tensor & tensors in this PR if you wish. There are also a few instances where I changed list(int) to list of int, so I'll change those back to list[int].

@aobo-y
Copy link
Contributor

aobo-y commented Sep 1, 2022

Great! If the tensor type thing is not too complicated, let's include it in this PR.

@ProGamerGov
Copy link
Contributor Author

I just made the following changes, and now am working on changing the lowercase tensor to uppercase.

  • Converted all usages of the docstring types using list of type to list[type]:

    (list of Concept):
    (list of str):
    (list of str, optional): 
    (list of int):
    (str, list of str):
    (str or list of str):
    (list of str or None, optional):
    (list of BaseFeature):
    (str, list of str, or Iterator):
    (list of Stat):
    (list of Dataset):
    (dict[Concept, list of str]):
    
  • Added missing comma to all docstring type references of:

    (int, tuple, tensor or list, optional): # -> (int, tuple, tensor, or list, optional):
    (scalar, tensor, tuple of scalars or tensors, optional): # -> (scalar, tensor, tuple of scalars, or tensors, optional):
    
  • Added missing "or" to docstring types for all references of:

    (tensor, tuple of tensors, Callable): # -> (tensor, tuple of tensors, or Callable):
    
  • Removed plural usage of "scalars".
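The list-type conversions above can be approximated with a simple regex pass. A throwaway sketch (not the actual commands used in the PR), which only covers the simple, non-nested cases:

```python
import re

def modernize_list_types(doc: str) -> str:
    # Rewrite "list of X" docstring types as "list[X]".
    # Handles only simple, non-nested cases like "list of str";
    # nested forms such as "list of list of Concept" need extra passes.
    return re.sub(r"list of (\w+)", r"list[\1]", doc)

print(modernize_list_types("inputs (list of str, optional): ..."))
# inputs (list[str], optional): ...
```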

@ProGamerGov
Copy link
Contributor Author

ProGamerGov commented Sep 1, 2022

@aobo-y Okay, I just converted all docstring type instances of tensor & tensors to Tensor!

@ProGamerGov
Copy link
Contributor Author

ProGamerGov commented Sep 1, 2022

@aobo-y I just noticed a new warning in the libmamba test. I thought it might be causing the Conda test to fail, but it still seems to be working though there's an extra 10 minutes of install time now. I reported the issue to the Conda devs: conda/conda#11790

All tests have passed!

lines[i] = re.sub(_rt[0] + r"Any" + _rt[1], "~typing.Any", lines[i])
lines[i] = re.sub(_rt[0] + r"Callable" + _rt[1], "~typing.Callable", lines[i])
lines[i] = re.sub(_rt[0] + r"Iterator" + _rt[1], "~typing.Iterator", lines[i])
lines[i] = re.sub(_rt[0] + r"Iterable" + _rt[1], "~typing.Iterable", lines[i])
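For context, a self-contained sketch of how this kind of docstring-processing hook can work. The `_rt` delimiter pair here is a guessed stand-in for the PR's actual definition (simple word-boundary lookarounds), not the real conf.py code:

```python
import re

# Hypothetical stand-in for the PR's _rt delimiter pair: match the type
# name only when it is not already part of a dotted or ~-prefixed path.
_rt = (r"(?<![\w.~])", r"(?![\w.])")

def process_typing_names(lines):
    """Prefix bare typing-module names with ~typing. so intersphinx links them."""
    for i in range(len(lines)):
        for name in ("Any", "Callable", "Iterator", "Iterable"):
            lines[i] = re.sub(_rt[0] + name + _rt[1], "~typing." + name, lines[i])
    return lines

print(process_typing_names([":type func: Callable, optional"])[0])
# :type func: ~typing.Callable, optional
```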
Copy link
Contributor


Why do we need to add ~typing. for them? In what exact situation can they not be identified?

Copy link
Contributor Author


@aobo-y Adding ~typing. to them ensures that intersphinx recognizes them properly, as it does not autodetect anything under the typing module. This way they show up as hyperlinks (the ~typing. portion doesn't show up in the HTML docs).

Copy link
Contributor


Hmm, then I'm wondering if you have any clue why Tensor can be correctly linked via from torch import Tensor, but Iterable cannot via from typing import Iterable?

Copy link
Contributor Author


@aobo-y I'm not sure. Even inferring Tensor from the imports doesn't seem very robust, as it only happens 3 times in total across the entire project.

These are the type hints before autodoc_process_docstring, with each line showing the name of the class / function and then the docstring type (name, lines[i]): https://gist.github.com/ProGamerGov/1d0636cd173f46effc8db7a2705f509d

And these are the type hints after: https://gist.github.com/ProGamerGov/31871f2d231ab68179ab18eecabb77b6

Copy link
Contributor Author


I had thought that there were a few more cases where Sphinx detected the from torch import Tensor part, but these are the only ones:

captum.attr.LayerAttribution.interpolate :type layer_attribution: :py:class:`~torch.Tensor`
captum.influence.TracInCPFastRandProj.influence :type targets: :py:class:`~torch.Tensor`
captum.attr.LRP.compute_convergence_delta :type output: :py:class:`~torch.Tensor`

There are no cases where it detects the typing module imports. So I guess we can basically just assume that it doesn't reliably detect the from torch import Tensor imports either.

Weirdly enough, a small handful of random Python type usages get detected and wrapped in intersphinx-supported markup before the replacement function runs, like :py:class:`str`. Though somewhere down the line all of them get properly detected, which is a bit confusing.

Copy link
Contributor


Interesting. TracInCPFast and TracInCPFastRandProj are in the same file with the same docstring, but one is detected and one is not. Indeed pretty weird.

captum.influence.TracInCPFast.influence :type targets: Tensor, optional
...
captum.influence.TracInCPFastRandProj.influence :type targets: :py:class:`~torch.Tensor`

# Preserve signature defaults
# Prevents entire tensors from being printed, & gives callable functions
# proper names
autodoc_preserve_defaults = True
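For background (a minimal illustration, not code from the PR): autodoc evaluates default values and stringifies them with repr(), which is why a function used as a default renders as <function ... at 0x...> unless the source text is preserved:

```python
import inspect

def cosine_similarity(a, b):  # hypothetical stand-in for the real default
    return 0.0

def influence(similarity_metric=cosine_similarity):
    pass

# The evaluated signature stringifies the default via repr(), which is
# what the HTML docs showed before autodoc_preserve_defaults was enabled.
print(str(inspect.signature(influence)))
# prints something like: (similarity_metric=<function cosine_similarity at 0x...>)
```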
Copy link
Contributor


Can you point me to where we use variables as defaults? I am thinking that for some of these cases, we can set the default to None and assign the variable within the code after checking for None
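The None-default pattern suggested here might look like the following. This is a hypothetical sketch using the TransformationRobustness translate default as an example, not actual Captum code:

```python
from typing import List, Optional

_TR_TRANSLATE: List[int] = [4] * 10  # stand-in for a long default list

class TransformationRobustness:
    def __init__(self, translate: Optional[List[int]] = None) -> None:
        # Resolve the verbose default inside the body, so the rendered
        # signature stays short: translate=None instead of a 10-item list.
        self.translate = _TR_TRANSLATE if translate is None else translate

print(TransformationRobustness().translate)
# [4, 4, 4, 4, 4, 4, 4, 4, 4, 4]
```

One trade-off with this pattern is that None can no longer mean "disable the option", which is the concern raised below about needing a second variable per option.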

Copy link
Contributor Author


@aobo-y Here's an example: https://captum.ai/api/influence.html#similarityinfluence

The similarity_metric variable is set to the cosine_similarity function, and it shows up in the docs with additional characters wrapped around the name:

similarity_metric=<function cosine_similarity>

Setting autodoc_preserve_defaults = True corrects it.

similarity_metric=cosine_similarity

This setting is also something I need for the Optim module's TransformationRobustness class:

# Define TransformationRobustness defaults externally for easier Sphinx docs formatting
_TR_TRANSLATE: List[int] = [4] * 10
_TR_SCALE: List[float] = [0.995**n for n in range(-5, 80)] + [
    0.998**n for n in 2 * list(range(20, 40))
]
_TR_DEGREES: List[int] = (
    list(range(-20, 20)) + list(range(-10, 10)) + list(range(-5, 5)) + 5 * [0]
)


class TransformationRobustness(nn.Module):
    def __init__(
        self,
        padding_transform: Optional[nn.Module] = nn.ConstantPad2d(2, value=0.5),
        translate: Optional[Union[int, List[int]]] = _TR_TRANSLATE,
        scale: Optional[NumSeqOrTensorOrProbDistType] = _TR_SCALE,
        degrees: Optional[NumSeqOrTensorOrProbDistType] = _TR_DEGREES,
        final_translate: Optional[int] = 2,
        crop_or_pad_output: bool = False,
    ) -> None:

This way I don't need a second variable to disable each option, and Sphinx doesn't try to show hundreds of numbers in the HTML docs.

Copy link
Contributor


Generally, we do not encourage using variables as defaults, but there are surely some cases where it makes more sense. Using a huge list/tensor as a default is something that does not exist in the codebase yet. We can discuss the optim example further when reviewing the optim module.

For the existing code, I believe the only variable defaults are all functions, like the example you gave. In that case, <function cosine_similarity> is actually slightly preferable, but there's no big difference, so we can keep this change.

Please correct me if you are aware of other cases in our codebase.

Copy link
Contributor Author


@aobo-y I think there may be a few other cases of variables with function defaults, but I don't recall any other variables in Captum at the moment that would be affected by this change.

sphinx/source/index.rst
@ProGamerGov
Copy link
Contributor Author

I added the Tensor intersphinx helper line to autodoc_process_docstring, as we previously discussed. I also checked each replacement instance to ensure that there were no issues.

* list of list of Concept -> list[list[Concept]]

* Return Types: tensor & tensors -> Tensor
@facebook-github-bot
Copy link
Contributor

@aobo-y has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

@aobo-y
Copy link
Contributor

aobo-y commented Sep 16, 2022

Hi @ProGamerGov, I cannot merge this PR because there is a conflict in pytorch/captum/captum/influence/_core/tracincp_fast_rand_proj.py. Could you give it a look?

@ProGamerGov
Copy link
Contributor Author

@aobo-y I already fixed the merge conflicts in a PR earlier today, but I just noticed a mypy issue in the master branch that was causing lint errors. I've fixed that mypy issue with the latest commit.

@facebook-github-bot
Copy link
Contributor

@aobo-y has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
