From 16b707b67baa334ab627be6d63bc619dbb7dbf44 Mon Sep 17 00:00:00 2001 From: Timothy Pansino <11214426+TimPansino@users.noreply.github.com> Date: Thu, 10 Aug 2023 17:32:51 -0700 Subject: [PATCH] Merge main into develop-graphql-async (#892) * Update Versioning Scheme (#651) * Update versioning scheme to 3 semver digits * Fix version indexing Co-authored-by: Hannah Stepanek Co-authored-by: Lalleh Rafeei * Remove version truncation * [Mega-Linter] Apply linters fixes * Bump tests Co-authored-by: Hannah Stepanek Co-authored-by: Lalleh Rafeei Co-authored-by: TimPansino * Fix Trace Finalizer Crashes (#652) * Patch crashes in various traces with None settings * Add tests for graphql trace types to unittests * Add test to ensure traces don't crash in finalizer * [Mega-Linter] Apply linters fixes * Bump tests Co-authored-by: TimPansino Co-authored-by: Lalleh Rafeei <84813886+lrafeei@users.noreply.github.com> * Add usage tracking metrics for Kafka clients. (#658) * Add usage tracking metrics for Kafka clients. * Fix double import lint error * [Mega-Linter] Apply linters fixes * Create version util file and add metrics to consumer. * Address linting errors. * Add missing semi-colon. * [Mega-Linter] Apply linters fixes * Bump tests. Co-authored-by: Hannah Stepanek Co-authored-by: hmstepanek Co-authored-by: umaannamalai * Deprecate add_custom_parameter(s) API (#655) * Deprecate add_custom_parameter(s) API * Fix unicode tests and some pylint errors * Fix more pylint errors * Revert "Fix more pylint errors" This reverts commit 807ec1c5c40fe421300ccdcd6fedd81f288dce2c. * Edit deprecation message in add_custom_parameters * Add usage metrics for Daphne and Hypercorn. (#665) * Add usage metrics for Daphne and Hypercorn. * [Mega-Linter] Apply linters fixes Co-authored-by: umaannamalai * Fix Flask view support in Code Level Metrics (#664) * Fix Flask view support in Code Level Metrics Co-authored-by: Lalleh Rafeei Co-authored-by: Hannah Stepanek Co-authored-by: Uma Annamalai * [Mega-Linter] Apply linters fixes * Bump tests * Fix CLM tests for flaskrest * [Mega-Linter] Apply linters fixes * Bump tests Co-authored-by: Lalleh Rafeei Co-authored-by: Hannah Stepanek Co-authored-by: Uma Annamalai Co-authored-by: TimPansino Co-authored-by: Uma Annamalai * Fix aioredis version crash (#661) Co-authored-by: Uma Annamalai * Add double wrapped testing for Hypercorn and Daphne and dispatcher argument to WSGI API. (#667) * Add double wrapped app tests. * Fix linting errors. * [Mega-Linter] Apply linters fixes * Add co-authors. 
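The WSGI API change in #667 above adds a dispatcher argument to the WSGI decorator. A minimal usage sketch follows; `newrelic.agent.wsgi_application` is the agent's public decorator, but the keyword usage and the "gunicorn" value shown here are illustrative assumptions, not taken from this patch.

```python
# Hedged sketch of the WSGI decorator with the dispatcher argument from #667.
# The dispatcher value and config file path are illustrative assumptions.
import newrelic.agent

newrelic.agent.initialize("newrelic.ini")  # assumes a config file exists

@newrelic.agent.wsgi_application(name="my-app", dispatcher="gunicorn")
def application(environ, start_response):
    # A trivial WSGI app; the decorator wraps it in a web transaction.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello, world!\n"]
```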
Co-authored-by: Tim Pansino Co-authored-by: Lalleh Rafeei Co-authored-by: Hannah Stepanek Co-authored-by: umaannamalai Co-authored-by: Tim Pansino Co-authored-by: Lalleh Rafeei Co-authored-by: Hannah Stepanek * Add Python 3.11 Support (#654) * Add py311 tests * Fix typo * Added 3.11 support for aiohttp framework Co-authored-by: Timothy Pansino * Set up environment to run Python 3.11 Co-authored-by: Timothy Pansino * Add Python 3.11 support for agent_features Co-authored-by: Timothy Pansino * Partial Python 3.11 support added for Tornado Co-authored-by: Timothy Pansino * Adjust postgres versions * Fix tornado install path locally * Remove aioredis py311 tests * Update 3.11 to dev in tests * Fix sanic instrumentation and imp/importlib deprecation Co-authored-by: Timothy Pansino * Simplify wheel build options * Update cibuildwheel for 3.11 * Remove falconmaster py311 test Co-authored-by: Lalleh Rafeei Co-authored-by: Timothy Pansino * Remove devcontainer submodule (#669) * Uncomment NewRelicContextFormatter from agent.py (#676) * Fix botocore tests for botocore v1.28.1+ (#675) * Fix botocore tests for botocore v1.28.1+ Co-authored-by: Timothy Pansino * Fix boto3 tests for botocore v1.28.1+ Co-authored-by: Timothy Pansino Co-authored-by: Hannah Stepanek Co-authored-by: Uma Annamalai * Fix boto3 tests for python 2.7 Co-authored-by: Timothy Pansino Co-authored-by: Hannah Stepanek Co-authored-by: Uma Annamalai Co-authored-by: Timothy Pansino Co-authored-by: Hannah Stepanek Co-authored-by: Uma Annamalai Co-authored-by: Timothy Pansino <11214426+TimPansino@users.noreply.github.com> * Feature increased custom event limit (#674) * Update reservoir size for custom events. * [Mega-Linter] Apply linters fixes * Increase custom event limit. (#666) * Remove duplicated CUSTOM_EVENT_RESERVOIR_SIZE Co-authored-by: Tim Pansino Co-authored-by: TimPansino Co-authored-by: Timothy Pansino <11214426+TimPansino@users.noreply.github.com> Co-authored-by: Uma Annamalai * Add python 3.11 stable release to GHA (#671) * Double kafka test runners (#677) Co-authored-by: Hannah Stepanek * Fix failing flask_rest tests (#683) * Pin flask-restx in flask_rest tests for 2.7 flask-restx dropped support for 2.7 in 1.0.1. * Drop support for flask-restplus flask-restx replaced flask-restplus. flask-restplus's latest version supports 3.6 which we don't even support anymore. * Fix failing botocore tests (#684) * Change queue url for botocore>=1.29.0 botocore >=1.29.0 uses sqs.us-east-1.amazonaws.com url instead of queue.amazonaws.com. * Use tuple version instead of str * Change botocore129->botocore128 * Add record_log_event to public api (#681) * Add patch for sentry SDK to correct ASGI v2/v3 detection. (#680) * Add patch for sentry to correct ASGI v2/v3 detection. 
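The botocore fix above (#684, "Use tuple version instead of str") hinges on the fact that comparing version strings is lexicographic and therefore wrong. A minimal, self-contained illustration of the tuple approach:

```python
def version_tuple(version):
    """Convert a dotted version string into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split(".")[:3])

# Lexicographic string comparison gets this wrong: "1.9.0" sorts after "1.29.0".
assert "1.9.0" >= "1.29.0"

# Tuple comparison is numeric and correct.
assert version_tuple("1.9.0") < version_tuple("1.29.0")

# Tests can then gate on the real botocore version, e.g. expect the
# sqs.us-east-1.amazonaws.com queue URL when the version is >= (1, 29, 0)
# and the legacy queue.amazonaws.com URL otherwise.
```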
Co-authored-by: Tim Pansino Co-authored-by: Lalleh Rafeei Co-authored-by: Hannah Stepanek * [Mega-Linter] Apply linters fixes Co-authored-by: Tim Pansino Co-authored-by: Lalleh Rafeei Co-authored-by: Hannah Stepanek Co-authored-by: umaannamalai Co-authored-by: Timothy Pansino <11214426+TimPansino@users.noreply.github.com> * Update pip install command (#688) * Validator transfer from fixtures.py to validators directory, Part 1 (#672) * Move validate_transaction_metrics to validators directory * Comment out original validate_transaction_metrics from fixtures.py * Move validate_time_metrics_outside_transaction to validators directory * Move validate_internal_metrics into validators directory and fixed validate_transaction_metrics * Move validate_transaction_errors into validators directory * Move validate_application_errors into validators directory * Move validate_custom_parameters into validators directory * Move validate_synthetics_event into validators directory * Move validate_transaction_event_attributes into validators directory * Move validate_non_transaction_error_event into validators directory * Fix import issues * Fix (more) import issues * Fix validate_transaction_metrics import in aioredis * Remove commented code from fixtures.py * Initialize ExternalNode properties (#687) Co-authored-by: Hannah Stepanek * Fix package_version_utils.py logic (#689) * Fix package_version_utils.py logic Co-authored-by: Timothy Pansino Co-authored-by: Hannah Stepanek Co-authored-by: Uma Annamalai * Move description of func into func itself * typecast lists into tuples * Remove breakpoints * Empty _test_package_version_utils.py * Make changes to the test Co-authored-by: Timothy Pansino Co-authored-by: Hannah Stepanek Co-authored-by: Uma Annamalai * Pin Github Actions Runner to Ubuntu 20 for Py27 (#698) * Pin Github Actions runner to ubuntu 20 for Py27 * Upgrade setup-python * Fix Confluent Kafka Producer Arguments (#699) * Add confluentkafka test for posargs/kwargs * Fix confluent kafka topic argument bug * More sensible producer arguments * Fix tornado master tests & instrument redis 4.3.5 (#695) * Remove 3.7 testing of tornado master tornadomaster dropped support for 3.7 * Instrument new redis 4.3.5 client methods Co-authored-by: Lalleh Rafeei <84813886+lrafeei@users.noreply.github.com> Co-authored-by: Timothy Pansino <11214426+TimPansino@users.noreply.github.com> * Remove pylint codes from flake8 config (#701) * Validator transfer from fixtures.py to validators directory, Part 2 (#690) * Move validate_transaction_metrics to validators directory * Comment out original validate_transaction_metrics from fixtures.py * Move validate_time_metrics_outside_transaction to validators directory * Move validate_internal_metrics into validators directory and fixed validate_transaction_metrics * Move validate_transaction_errors into validators directory * Move validate_application_errors into validators directory * Move validate_custom_parameters into validators directory * Move validate_synthetics_event into validators directory * Move validate_transaction_event_attributes into validators directory * Move validate_non_transaction_error_event into validators directory * Move validate_application_error_trace_count into validators directory * Move validate_application_error_event_count into validators directory * Move validate_synthetics_transaction_trace into validators directory * Move validate_tt_collector_json to validators directory * Move validate_transaction_trace_attributes into validator directory * Move 
validate_transaction_error_trace_attributes into validator directory * Move validate_error_trace_collector_json into validator directory * Move validate_error_event_collector_json into validator directory * Move validate_transaction_event_collector_json into validator directory * Fix import issues from merge * Fix some pylint errors * Revert 'raise ValueError' to be PY2 compatible * Delete commented lines * Fix bug in celery where workers don't report data (#696) This fixes the "Missing information from Celery workers when using MAX_TASKS_PER_CHILD" issue. Previously, if celery was run with the --loglevel=INFO flag, an agent instance would be created for the main celery process, and after the first worker shutdown all subsequent workers' agent instances would point to that instance instead of creating a new one. This was root-caused to an agent instance being created incorrectly when application activation was not set. Now no agent instance is created for the main celery process (see the sketch below). * Reverts removal of flask_restful hooks. (#705) * Update instrumented methods in redis. (#707) Co-authored-by: Lalleh Rafeei Co-authored-by: Uma Annamalai Co-authored-by: Hannah Stepanek Co-authored-by: Lalleh Rafeei Co-authored-by: Uma Annamalai Co-authored-by: Hannah Stepanek * Add TraceCache Guarded Iteration (#704) * Add MutableMapping API to TraceCache * Update trace cache usage to use guarded APIs. * [Mega-Linter] Apply linters fixes * Bump tests * Fix keys iterator * Comments for trace cache methods * Reorganize tests * Fix fixture refs * Fix testing refs * [Mega-Linter] Apply linters fixes * Bump tests * Upper case constant Co-authored-by: TimPansino Co-authored-by: Lalleh Rafeei <84813886+lrafeei@users.noreply.github.com> * Fix Type Constructor Classes in Code Level Metrics (#708) * Fix CLM exception catching * Reorganize CLM Tests * Add type constructor tests to CLM * Fix line number * Pin tox version * Fix lambda tests in CLM * Fix lint issues * Turn helper func into pytest fixture Co-authored-by: Hannah Stepanek * Fix sanic and starlette tests (#734) * Fix sanic tests * Tweak test fix for sanic * Remove test for v18.12 in sanic (no longer supported) * Pin starlette latest to v0.23.1 (for now) * Add comment in tox about pinned starlette version * Add methods to instrument (#738) * Add card to instrumented methods in Redis (#740) * Add DevContainer (#711) * Add devcontainer setup * Add newrelic env vars to passenv * Add default extensions * Add devcontainer instructions to contributing docs * Convert smart quotes in contributing docs. * Apply proper RST formatting * [Mega-Linter] Apply linters fixes * Add GHCR to prerequisites * [Mega-Linter] Apply linters fixes * Bump tests Co-authored-by: TimPansino * Module classmethod fix (#662) * Fix function_wrapper calls to module * Fix wrapper in pika hook * Revert elasticsearch instrumentation * Revert some wrap_function_wrappers to orig * Remove comments/breakpoints * Fix hooks in elasticsearch Co-authored-by: Timothy Pansino <11214426+TimPansino@users.noreply.github.com> * Fix log decorating to be JSON compatible (#736) * Initial addition of JSON capability * Add NR-LINKING metadata JSON compatibility * Remove breakpoint * Hardcode local log decorating tests * Tweak linking metadata parsing/adding * Revert "Fix log decorating to be JSON compatible" (#746) * Revert "Fix log decorating to be JSON compatible (#736)" This reverts commit 0db5fee1e5d44b0791dc517ac9f5d88d1240a340.
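The Celery fix described in #696 above boils down to never creating an agent instance as a side effect of a lookup. A hypothetical sketch of the idea; the names here are illustrative and do not mirror the agent's internals:

```python
# Hypothetical sketch of the #696 fix; names are illustrative only.
_agent_instance = None

def agent_instance(activate=False):
    """Return the agent singleton, creating it only on explicit activation.

    Before the fix, merely looking up the agent (which happened in the main
    Celery process when --loglevel=INFO was used) created an instance there;
    after the first worker shutdown, every subsequent worker pointed at that
    stale instance instead of creating its own.
    """
    global _agent_instance
    if _agent_instance is None and activate:
        _agent_instance = object()  # stands in for the real Agent object
    return _agent_instance
```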
* [Mega-Linter] Apply linters fixes * Trigger tests Co-authored-by: hmstepanek * Add apdexPerfZone attribute to Transaction. (#753) Co-authored-by: Enriqueta De Leon Co-authored-by: Kate Anderson Co-authored-by: Mary Martinez Co-authored-by: Enriqueta De Leon Co-authored-by: Kate Anderson Co-authored-by: Mary Martinez Co-authored-by: Hannah Stepanek * Fix tests in starlette v0.23.1 (#752) * Fix tests in starlette v0.23.1 * Fix conditional tests * Add comment to bg_task test * Support `redis.asyncio` (#744) * Support `redis.asyncio` * Fix `flake8` issues Co-authored-by: Timothy Pansino <11214426+TimPansino@users.noreply.github.com> * Redis Asyncio Testing (#750) * Add standardized method for package version tuples * Adapt aioredis tests to redis.asyncio * Standardize version tuple * Refresh uninstrumented redis methods * Fix aioredis version checking * Remove aioredis version function * CodeCov Integration (#710) * Add aggregate coverage settings to tox.ini * Refactor coverage fixture for GHA * Send coverage data files * Linter fixes * Configure codecov report * Yield cov handle from fixture * Fix empty coverage fixture * Specify artifact download dir * Find coverage files with find command * Add concurrency cancelling to github actions * uncomment test deps * Fix or symbol * Fix concurrency groups * Linter fixes * Add comment for yield None in fixture * [Mega-Linter] Apply linters fixes * Bump Tests --------- Co-authored-by: TimPansino * Mergify (#761) * Add mergify config file * Remove priority * Clean up mergify rules * Add non-draft requirement for merge * Add merge method * [Mega-Linter] Apply linters fixes * Don't update draft PRs. * Remove merge rules for develop branches * Linting --------- Co-authored-by: TimPansino * Elasticsearch v8 support (#741) * Fix function_wrapper calls to module * Fix wrapper in pika hook * Revert elasticsearch instrumentation * Revert some wrap_function_wrappers to orig * Remove comments/breakpoints * Fix hooks in elasticsearch * Add new client methods from v8 and their hooks * Add elasticsearch v8 to workflow and tox * Fix indices for elasticsearch01 * Disable xpack security in elasticsearch v8.0 * Start to add try/except blocks in tests * Add support for v8 transport * add support for v8 connection * Add tests-WIP * Clean up most tests * Clean up unused instrumentation Co-authored-by: Lalleh Rafeei Co-authored-by: Enriqueta De Leon Co-authored-by: Uma Annamalai Co-authored-by: Kate Anderson Co-authored-by: Mary Martinez Co-authored-by: Hannah Stepanek * Remove elastic search source code * Elasticsearch v8 testing Co-authored-by: Lalleh Rafeei Co-authored-by: Enriqueta De Leon Co-authored-by: Uma Annamalai Co-authored-by: Kate Anderson Co-authored-by: Mary Martinez Co-authored-by: Hannah Stepanek * Scope ES fixture * ES v8 only supports Python3.6+ * Refactor transport tests for v8 Co-authored-by: Lalleh Rafeei Co-authored-by: Uma Annamalai Co-authored-by: Hannah Stepanek Co-authored-by: Kate Anderson Co-authored-by: Enriqueta De Leon * Remove extra comments * Added perform_request_kwargs to test_transport * Fix some linter issues * Remove extra newline * Group es v7 v8 process modules together * Add auto signature detection & binding * Use bind_arguments in ES * Add test for wrapped function * Add validator for datastore trace inputs * Use common bind_arguments for PY3 * Fix tests in starlette v0.23.1 (#752) * Fix tests in starlette v0.23.1 * Fix conditional tests * Add comment to bg_task test * Split below es 8 methods from es 8 methods Note the 
previous tests in this file did not actually test anything: they checked that the methods on our instrumented list were wrapped, instead of checking for uninstrumented methods on the es client that we had missed. Because our customers had reported no bugs, we decided not to support the buggy wrapping on earlier es versions (below es8), and only added tests asserting that all methods are wrapped on es8+ (a sketch of such a test appears below). We are also only testing es8+ method wrapping, since the wrapping behavior on earlier versions may not have been correct: method signatures could have changed without us detecting it, given the lack of tests. Since our customers have not reported any issues, it does not seem worth going back to fix these bugs at this time. * Remove signature auto detection implementation * Fixup: remove signature autodetection * Fixup: cleanup * Test method calls on all es versions * Fixup: don't run some methods on es7 --------- Co-authored-by: Timothy Pansino <11214426+TimPansino@users.noreply.github.com> Co-authored-by: mary-martinez Co-authored-by: enriqueta Co-authored-by: Tim Pansino Co-authored-by: Lalleh Rafeei Co-authored-by: Enriqueta De Leon Co-authored-by: Uma Annamalai Co-authored-by: Kate Anderson Co-authored-by: Mary Martinez Co-authored-by: Hannah Stepanek Co-authored-by: Hannah Stepanek Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> * Update contributors workspace link in CONTRIBUTING.rst. (#760) * Update link in CONTRIBUTING.rst. * Update to RST syntax. * [Mega-Linter] Apply linters fixes --------- Co-authored-by: umaannamalai Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> * Add Retry to Pip Install (#763) * Add retry to pip install * Fix retry backoff constant * Fix script failures --------- Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> * Add aiohttp support for expected status codes (#735) * Add aiohttp support for expected status codes * Adjust naming convention * Fix expected tests for new validator behavior --------- Co-authored-by: Timothy Pansino <11214426+TimPansino@users.noreply.github.com> Co-authored-by: Tim Pansino * Fix PyPy Priority Sampling Test (#766) * Fix pypy priority sampling * [Mega-Linter] Apply linters fixes * Bump tests --------- Co-authored-by: TimPansino * Config linter fixes (#768) * Fix default value and lazy logging pylint * Fix default value and lazy logging pylint * Fix unnecessary 'else' in pylint * Fix logging-not-lazy in pylint * Fix redefined built-in error in Pylint * Fix implicit string concatenation in Pylint * Fix dict() to {} in Pylint * Make sure eval is OK to use for Pylint * Fix logging format string for Pylint * Change list comprehension to generator expression * [Mega-Linter] Apply linters fixes * Rerun tests --------- Co-authored-by: lrafeei * Sync tests w/ agents/cross_agent_tests/pull/150 (#770) Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> * Infinite Tracing Batching & Compression (#762) * Infinite Tracing Batching and Compression settings (#756) * Add compression setting * Add batching setting * Infinite Tracing Compression (#758) * Initial commit * Add compression option in StreamingRPC * Add compression default to tests * Add test to confirm compression settings * Remove commented out code * Set compression settings from settings override * Infinite Tracing Batching (#759) * Initial infinite tracing batching implementation * Add
RecordSpanBatch method to mock grpc server * Span batching settings and testing. Co-authored-by: Lalleh Rafeei Co-authored-by: Hannah Stepanek Co-authored-by: Uma Annamalai * Add final 8t batching tests * Rename serialization test * Formatting * Guard unittests from failing due to batching * Linting * Simplify batching algorithm * Properly wire batching parametrization * Fix incorrect validator use * Data loss on reconnect regression testing Co-authored-by: Uma Annamalai Co-authored-by: Hannah Stepanek * Test stream buffer batch sizes * Fix logic in supportability metrics for spans * Clean up nested conditionals in stream buffer * Compression parametrization in serialization test * Formatting * Update 8t test_no_delay_on_ok * Update protobufs * Remove unnecessary patching from test * Fix waiting in supportability metric tests * Add sleep to waiting in test * Reorder sleep and condition check * Mark no data loss xfail for py2. * Fix conditional check * Fix flake8 linter issues --------- Co-authored-by: Lalleh Rafeei Co-authored-by: Hannah Stepanek Co-authored-by: Uma Annamalai Co-authored-by: Hannah Stepanek * Infinite Tracing Supportability Feature Toggle Metrics (#769) * Add 8T feature toggle supportability metrics * Remove supportability metrics when 8t is disabled. * Formatting --------- Co-authored-by: Lalleh Rafeei <84813886+lrafeei@users.noreply.github.com> Co-authored-by: Lalleh Rafeei Co-authored-by: Hannah Stepanek Co-authored-by: Uma Annamalai Co-authored-by: Hannah Stepanek * Fix DT settings for txn feature tests (#771) * Fix pyramid testing versions (#764) Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> * Fix Ariadne Middleware Testing (#776) * Fix ariadne middleware testing Co-authored-by: Uma Annamalai Co-authored-by: Lalleh Rafeei * [Mega-Linter] Apply linters fixes * Bump tests --------- Co-authored-by: Uma Annamalai Co-authored-by: Lalleh Rafeei Co-authored-by: TimPansino * Exclude merged PRs from automatic mergify actions. (#774) Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> * Refactor Code Coverage (#765) * Reorder dependency of code coverage fixture * Fix tests with coverage disabled * Refactor code coverage fixture * Clean out old coverage settings * Fix missing code coverage fixture * Fix pypy priority sampling * Start coverage from pytest-cov for better tracking * Refactor coverage config file * Ripping out coverage fixtures * Move tool config to bottom of tox.ini * Disabling py27 warning * Renaming env var --------- Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> * Add GraphQL Introspection Setting (#783) * Add graphql introspection setting * Sort settings object hierarchy * Add test for introspection queries setting * Expand introspection queries testing * [Mega-Linter] Apply linters fixes * Adjust introspection detection for graphql --------- Co-authored-by: TimPansino * Fix instance info tests for redis. (#784) * Fix instance info tests for redis. * [Mega-Linter] Apply linters fixes --------- Co-authored-by: umaannamalai * Fix Redis Instance Info (#790) * Fix failing redis test for new default behavior * Revert "Fix instance info tests for redis. (#784)" This reverts commit f7108e3c2a54ab02a1104f6c16bd5fd799b9fc7e. 
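The Elasticsearch testing note in #741 above distinguishes real coverage (scanning the client for methods we failed to wrap) from the old tests (checking only our own list). A minimal sketch of the es8+ style of test, assuming instrumented methods are wrapped in a wrapt-style proxy exposing `__wrapped__`; the ignore list is illustrative:

```python
# Sketch of an "uninstrumented methods" test in the spirit of the es8+ tests.
from elasticsearch import Elasticsearch

IGNORED = {"close", "options", "perform_request"}  # illustrative exclusions

def test_no_uninstrumented_methods():
    client = Elasticsearch("http://localhost:9200")
    # Collect every public callable on the client that is not wrapped.
    uninstrumented = {
        name
        for name in dir(client)
        if not name.startswith("_")
        and name not in IGNORED
        and callable(getattr(client, name))
        and not hasattr(getattr(client, name), "__wrapped__")
    }
    assert not uninstrumented, "Uninstrumented methods: %s" % sorted(uninstrumented)
```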
* Guard GraphQL Settings Lookup (#787) * Guard graphql settings lookup * [Mega-Linter] Apply linters fixes * Bump tests * Update graphql settings test --------- Co-authored-by: TimPansino Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> * Errors Inbox Improvements (#791) * Errors inbox attributes and tests (#778) * Initial errors inbox commit Co-authored-by: Timothy Pansino Co-authored-by: Hannah Stepanek Co-authored-by: Uma Annamalai * Add enduser.id field * Move validate_error_trace_attributes into validators directory * Add error callback attributes test * Add tests for enduser.id & error.group.name Co-authored-by: Timothy Pansino * Uncomment code_coverage * Drop commented out line --------- Co-authored-by: Timothy Pansino Co-authored-by: Hannah Stepanek Co-authored-by: Uma Annamalai Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> Co-authored-by: Timothy Pansino <11214426+TimPansino@users.noreply.github.com> * Error Group Callback API (#785) * Error group initial implementation * Rewrite error callback to pass map of info * Fixed incorrect validators causing errors Co-authored-by: Uma Annamalai Co-authored-by: Hannah Stepanek * Fix validation of error trace attributes * Expanded error callback test * Add incorrect type to error callback testing * Change error group callback to private setting * Add testing for error group callback inputs * Separate error group callback tests * Add explicit testing for the set API * Ensure error group is string * Fix python 2 type validation --------- Co-authored-by: Uma Annamalai Co-authored-by: Hannah Stepanek Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> * User Tracking for Errors Inbox (#789) * Add user tracking feature for errors inbox. * Address review comments, * Add high_security test. * Cleanup invalid tests test. * Update user_id string check. * Remove set_id outside txn test. 
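The errors-inbox work merged above exposes two user-facing hooks: an error group callback (#785) that receives a map of error info and returns a group name, and user tracking (#789). A usage sketch, assuming the public names `newrelic.agent.set_error_group_callback` and `newrelic.agent.set_user_id`; the callback parameters shown are assumptions based on the commit notes:

```python
import newrelic.agent

def error_group_callback(exception, data):
    # `data` is a map of error details (#785: "pass map of info"); returning
    # a string assigns the error group, returning None leaves it unchanged.
    if isinstance(exception, TimeoutError):
        return "timeouts"
    return None

newrelic.agent.set_error_group_callback(error_group_callback)

@newrelic.agent.background_task(name="handle-request")
def handle_request(user_id):
    # Associate any errors in this transaction with the acting user (#789);
    # the value must be a string.
    newrelic.agent.set_user_id(str(user_id))
```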
--------- Co-authored-by: Timothy Pansino <11214426+TimPansino@users.noreply.github.com> --------- Co-authored-by: Lalleh Rafeei <84813886+lrafeei@users.noreply.github.com> Co-authored-by: Timothy Pansino Co-authored-by: Hannah Stepanek Co-authored-by: Uma Annamalai Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> Co-authored-by: Uma Annamalai * Update Packages (#793) * Update urllib3 to v1.26.15 * Update six to v1.16.0 * Update coverage exclude for newrelic/packages * [Mega-Linter] Apply linters fixes * Drop removed package from urllib3 * Update pytest * Downgrade websockets version for old sanic testing --------- Co-authored-by: TimPansino * Remove Unused Instrumentation and Tests (#794) * Remove unused instrumentation files * Remove testing for deprecated CherryPy versions * Remove unused pyelasticsearch tests --------- Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> * Fix Loguru Instrumentation for v0.7.0 (#798) * Add autosignature implementation * Fix loguru with auto-signature * [Mega-Linter] Apply linters fixes * Fix tests for Py2 * [Mega-Linter] Apply linters fixes * Bump tests * Remove unwrap from signature utils * Fix arg unpacking * Remove unwrap arg from bind_args * Fix linter errors --------- Co-authored-by: TimPansino Co-authored-by: Lalleh Rafeei * Remove Twisted framework (#800) * Initial twisted commit * Remove Twisted Framework * Pin virtualenv, fix pip arg deprecation & disable kafka tests (#803) * Pin virtualenv * Fixup: use 20.21.1 instead * Replace install-options with config-settings See https://github.com/pypa/pip/issues/11358. * Temporarily disable kafka tests * Add tests for pyodbc (#796) * Add tests for pyodbc * Move imports into tests to get import coverage * Fixup: remove time import * Trigger tests --------- Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> * Add tests for Waitress (#797) * Change import format * Initial commit * Add more tests to adapter_waitress * Remove commented out code * [Mega-Linter] Apply linters fixes * Add assertions to all tests * Add more NR testing to waitress --------- Co-authored-by: lrafeei Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> * Add testing for genshi and mako. (#799) * Add testing for genshi and mako. 
* [Mega-Linter] Apply linters fixes --------- Co-authored-by: umaannamalai Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> * Omit some frameworks from coverage analysis (#810) * Omit some frameworks from coverage analysis * Remove commas * Change format of omit * Add relative_files option to coverage * Add absolute directory * Add envsitepackagedir * Add coveragerc file * Add codecov.yml * [Mega-Linter] Apply linters fixes * Revert coveragerc file settings * Add files in packages and more frameworks * Remove commented line --------- Co-authored-by: lrafeei Co-authored-by: Hannah Stepanek * Run coverage around pytest (#813) * Run coverage around pytest * Trigger tests * Fixup * Add redis client_no_touch to ignore list * Temporarily remove kafka from coverage * Remove coverage for old libs * Add required option for tox v4 (#795) * Add required option for tox v4 * Update tox in GHA * Remove py27 no-cache-dir * Fix Testing Failures (#828) * Fix tastypie tests * Adjust asgiref pinned version * Make aioredis key PID unique * Pin more asgiref versions * Fix pytest test filtering when running tox (#823) Co-authored-by: Uma Annamalai * Validator transfer p3 (#745) * Move validate_transaction_metrics to validators directory * Comment out original validate_transaction_metrics from fixtures.py * Move validate_time_metrics_outside_transaction to validators directory * Move validate_internal_metrics into validators directory and fixed validate_transaction_metrics * Move validate_transaction_errors into validators directory * Move validate_application_errors into validators directory * Move validate_custom_parameters into validators directory * Move validate_synthetics_event into validators directory * Move validate_transaction_event_attributes into validators directory * Move validate_non_transaction_error_event into validators directory * Move validate_application_error_trace_count into validators directory * Move validate_application_error_event_count into validators directory * Move validate_synthetics_transaction_trace into validators directory * Move validate_tt_collector_json to validators directory * Move validate_transaction_trace_attributes into validator directory * Move validate_transaction_error_trace_attributes into validator directory * Move validate_error_trace_collector_json into validator directory * Move validate_error_event_collector_json into validator directory * Move validate_transaction_event_collector_json into validator directory * Move validate_custom_event_collector_json into validator directory * Move validate_tt_parameters into validator directory * Move validate_tt_parameters into validator directory * Move validate_tt_segment_params into validator directory * Move validate_browser_attributes into validators directory * Move validate_error_event_attributes into validators directory * Move validate_error_trace_attributes_outside_transaction into validators directory * Move validate_error_event_attributes_outside_transaction into validators directory * Fix some pylint errors * Redirect check_error_attributes * Fix more Pylint errors * Fix import issues from move * Fix more import shuffle errors * Sort logging JSON test for PY2 consistency * Fix Pylint errors in validators * Fix import error --------- Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> * Fix set output warning using new GHA syntax (#833) * Fix set output warning using new GHA syntax * Fix quoting * Remove Python 2.7 and pypy2 testing (#835) * Change 
setup-python to @v2 for py2.7 * Remove py27 and pypy testing * Fix syntax errors * Fix comma related syntax errors * Fix more issues in tox * Remove gearman test * Containerized CI Pipeline (#836) * Revert "Remove Python 2.7 and pypy2 testing (#835)" This reverts commit abb6405d2bfd629ed83f48e8a17b4a28e3a3c352. * Containerize CI process * Publish new docker container for CI images * Rename github actions job * Copyright tag scripts * Drop debug line * Swap to new CI image * Move pip install to just main python * Remove libcurl special case from tox * Install special case packages into main image * Remove unused packages * Remove all other triggers besides manual * Add make run command * Cleanup small bugs * Fix CI Image Tagging (#838) * Correct templated CI image name * Pin pypy2.7 in image * Fix up scripting * Temporarily Restore Old CI Pipeline (#841) * Restore old pipelines * Remove python 2 from setup-python * Rework CI Pipeline (#839) Change pypy to pypy27 in tox. Fix checkout logic Pin tox requires * Fix Tests on New CI (#843) * Remove non-root user * Test new CI image * Change pypy to pypy27 in tox. * Fix checkout logic * Fetch git tags properly * Pin tox requires * Adjust default db settings for github actions * Rename elasticsearch services * Reset to new pipelines * [Mega-Linter] Apply linters fixes * Fix timezone * Fix docker networking * Pin dev image to new sha * Standardize gearman DB settings * Fix elasticsearch settings bug * Fix gearman bug * Add missing odbc headers * Add more debug messages * Swap out dev ci image * Fix required virtualenv version * Swap out dev ci image * Swap out dev ci image * Remove aioredis v1 for EOL * Add coverage paths for docker container * Unpin ci container --------- Co-authored-by: TimPansino * Instrument Redis waitaof (#851) * Add uninstrumented command to redis * Update logic for datastore_aioredis instance info * [Mega-Linter] Apply linters fixes * Bump tests * Update defaults for aioredis port --------- Co-authored-by: TimPansino * Ignore patched hooks files. (#849) Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> * Fix local scoped package reporting (#837) * Include isort stdlibs for determining stdlib modules * Use isort & sys to eliminate std & builtin modules Previously, the logic would fail to identify third party modules installed within the local user scope. This fixes that issue by skipping builtin and stdlib modules by name, instead of attempting to identify third party modules based on file paths (a sketch of this name-based check appears below). * Handle importlib_metadata.version being a callable * Add isort into third party notices * [Mega-Linter] Apply linters fixes * Remove Python 2.7 and pypy2 testing (#835) * Change setup-python to @v2 for py2.7 * Remove py27 and pypy testing * Fix syntax errors * Fix comma related syntax errors * Fix more issues in tox * Remove gearman test * Containerized CI Pipeline (#836) * Revert "Remove Python 2.7 and pypy2 testing (#835)" This reverts commit abb6405d2bfd629ed83f48e8a17b4a28e3a3c352.
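The #837 entry above describes filtering standard-library and builtin modules by name, using isort's stdlib tables plus `sys.builtin_module_names`. A sketch of that check; the agent vendors isort under `newrelic/packages/isort` (see the diffstat), but for brevity this imports isort directly:

```python
import sys

# isort ships per-Python-version stdlib name sets; the agent vendors them.
from isort.stdlibs.py39 import stdlib as py39_stdlib

def is_third_party(module_name):
    top_level = module_name.split(".")[0]
    if top_level in sys.builtin_module_names:  # compiled-in modules
        return False
    if top_level in py39_stdlib:  # stdlib, matched by name rather than path
        return False
    return True

print(is_third_party("json"))      # False: stdlib
print(is_third_party("requests"))  # True: third party, wherever installed
```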
* Containerize CI process * Publish new docker container for CI images * Rename github actions job * Copyright tag scripts * Drop debug line * Swap to new CI image * Move pip install to just main python * Remove libcurl special case from tox * Install special case packages into main image * Remove unused packages * Remove all other triggers besides manual * Add make run command * Cleanup small bugs * Fix CI Image Tagging (#838) * Correct templated CI image name * Pin pypy2.7 in image * Fix up scripting * Temporarily Restore Old CI Pipeline (#841) * Restore old pipelines * Remove python 2 from setup-python * Rework CI Pipeline (#839) Change pypy to pypy27 in tox. Fix checkout logic Pin tox requires * Fix Tests on New CI (#843) * Remove non-root user * Test new CI image * Change pypy to pypy27 in tox. * Fix checkout logic * Fetch git tags properly * Pin tox requires * Adjust default db settings for github actions * Rename elasticsearch services * Reset to new pipelines * [Mega-Linter] Apply linters fixes * Fix timezone * Fix docker networking * Pin dev image to new sha * Standardize gearman DB settings * Fix elasticsearch settings bug * Fix gearman bug * Add missing odbc headers * Add more debug messages * Swap out dev ci image * Fix required virtualenv version * Swap out dev ci image * Swap out dev ci image * Remove aioredis v1 for EOL * Add coverage paths for docker container * Unpin ci container --------- Co-authored-by: TimPansino * Trigger tests --------- Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> Co-authored-by: hmstepanek Co-authored-by: Lalleh Rafeei <84813886+lrafeei@users.noreply.github.com> Co-authored-by: Timothy Pansino <11214426+TimPansino@users.noreply.github.com> Co-authored-by: TimPansino Co-authored-by: Uma Annamalai * MSSQL Testing (#852) * Fork mysql tests into mssql * Add tox envs for mssql * Add mssql DB settings * Add correct MSSQL tests * Add mssql to GHA * Add MSSQL libs to CI image * Pin to dev CI image sha * Swap SQLServer container image * Fix healthcheck * Put MSSQL image back * Drop pypy37 tests * Unpin dev image sha * Exclude command line functionality from test coverage (#855) * FIX: resilient environment settings (#825) If the application uses generalimport to manage optional dependencies, it's possible that generalimport.MissingOptionalDependency is raised. In this case, we should not report the module, as it is not actually loaded and is not a runtime dependency of the application.
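The #825 fix above guards the environment scan against generalimport's lazy import stubs. The exception name `generalimport.MissingOptionalDependency` comes from the commit message; the surrounding scan is an illustrative sketch, not the agent's actual code:

```python
import sys

def collect_reportable_modules():
    """Gather (name, version) pairs without tripping lazy import stubs."""
    modules = []
    for name, module in list(sys.modules.items()):
        try:
            version = getattr(module, "__version__", None)
        except Exception:
            # With generalimport, touching an attribute of a missing optional
            # dependency raises MissingOptionalDependency: the module was never
            # actually loaded, so skip it instead of reporting it.
            continue
        modules.append((name, version))
    return modules
```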
Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> Co-authored-by: Hannah Stepanek * Replace drop_transaction logic by using transaction context manager (#832) * Replace drop_transaction call * [Mega-Linter] Apply linters fixes * Empty commit to start tests * Change logic in BG Wrappers --------- Co-authored-by: lrafeei Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> * Upgrade to Pypy38 for TypedDict (#861) * Fix base branch * Revert tox dependencies * Replace all pypy37 with pypy38 * Remove action.yml file * Push Empty Commit * Fix skip_missing_interpreters behavior * Fix skip_missing_interpreters behavior * Pin dev CI image sha * Remove unsupported Tornado tests * Add latest tests to Tornado * Remove pypy38 (for now) --------- Co-authored-by: Tim Pansino * Add profile_trace testing (#858) * Include isort stdlibs for determining stdlib modules * Use isort & sys to eliminate std & builtin modules Previously, the logic would fail to identify third party modules installed within the local user scope. This fixes that issue by skipping builtin and stdlib modules by name, instead of attempting to identify third party modules based on file paths. * Handle importlib_metadata.version being a callable * Add isort into third party notices * [Mega-Linter] Apply linters fixes * Remove Python 2.7 and pypy2 testing (#835) * Change setup-python to @v2 for py2.7 * Remove py27 and pypy testing * Fix syntax errors * Fix comma related syntax errors * Fix more issues in tox * Remove gearman test * Containerized CI Pipeline (#836) * Revert "Remove Python 2.7 and pypy2 testing (#835)" This reverts commit abb6405d2bfd629ed83f48e8a17b4a28e3a3c352. * Containerize CI process * Publish new docker container for CI images * Rename github actions job * Copyright tag scripts * Drop debug line * Swap to new CI image * Move pip install to just main python * Remove libcurl special case from tox * Install special case packages into main image * Remove unused packages * Remove all other triggers besides manual * Add make run command * Cleanup small bugs * Fix CI Image Tagging (#838) * Correct templated CI image name * Pin pypy2.7 in image * Fix up scripting * Temporarily Restore Old CI Pipeline (#841) * Restore old pipelines * Remove python 2 from setup-python * Rework CI Pipeline (#839) Change pypy to pypy27 in tox. Fix checkout logic Pin tox requires * Fix Tests on New CI (#843) * Remove non-root user * Test new CI image * Change pypy to pypy27 in tox. * Fix checkout logic * Fetch git tags properly * Pin tox requires * Adjust default db settings for github actions * Rename elasticsearch services * Reset to new pipelines * [Mega-Linter] Apply linters fixes * Fix timezone * Fix docker networking * Pin dev image to new sha * Standardize gearman DB settings * Fix elasticsearch settings bug * Fix gearman bug * Add missing odbc headers * Add more debug messages * Swap out dev ci image * Fix required virtualenv version * Swap out dev ci image * Swap out dev ci image * Remove aioredis v1 for EOL * Add coverage paths for docker container * Unpin ci container --------- Co-authored-by: TimPansino * Trigger tests * Add testing for profile trace. * [Mega-Linter] Apply linters fixes * Ignore __call__ from coverage on profile_trace.
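The profile trace tests added in #858 above exercise the profile trace API. A usage sketch; the import path is taken from `newrelic/api/profile_trace.py` in the diffstat, but the decorator's exact signature (including `depth`) is an assumption:

```python
from newrelic.api.background_task import background_task
from newrelic.api.profile_trace import profile_trace

@profile_trace(depth=3)  # assumed parameter: how deep to sample the call stack
def expensive_work():
    return sum(i * i for i in range(10_000))

@background_task(name="profiled-job")
def job():
    # The profile trace records time spent in expensive_work() and the
    # functions it calls, inside this background transaction.
    return expensive_work()
```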
* [Mega-Linter] Apply linters fixes --------- Co-authored-by: Hannah Stepanek Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> Co-authored-by: hmstepanek Co-authored-by: Lalleh Rafeei <84813886+lrafeei@users.noreply.github.com> Co-authored-by: Timothy Pansino <11214426+TimPansino@users.noreply.github.com> Co-authored-by: TimPansino Co-authored-by: umaannamalai * Add Transaction API Tests (#857) * Test for suppress_apdex_metric * Add custom_metrics tests * Add distributed_trace_headers testing in existing tests * [Mega-Linter] Apply linters fixes * Remove redundant if-statement * Ignore deprecated transaction function from coverage * [Mega-Linter] Apply linters fixes * Push empty commit * Update newrelic/api/transaction.py --------- Co-authored-by: lrafeei Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> Co-authored-by: Timothy Pansino <11214426+TimPansino@users.noreply.github.com> Co-authored-by: Uma Annamalai * Add tests for jinja2. (#842) * Add tests for jinja2. * [Mega-Linter] Apply linters fixes * Update tox.ini Co-authored-by: Timothy Pansino <11214426+TimPansino@users.noreply.github.com> --------- Co-authored-by: umaannamalai Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> Co-authored-by: Timothy Pansino <11214426+TimPansino@users.noreply.github.com> * Add tests for newrelic/config.py (#860) Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> * Fix starlette testing matrix for updated behavior. (#869) Co-authored-by: Lalleh Rafeei Co-authored-by: Hannah Stepanek Co-authored-by: Uma Annamalai * Correct Serverless Distributed Tracing Logic (#870) * Fix serverless logic for distributed tracing * Test stubs * Collapse testing changes * Add negative testing to regular DT test suite * Apply linter fixes * [Mega-Linter] Apply linters fixes --------- Co-authored-by: TimPansino * Fix Kafka CI (#863) * Reenable kafka testing * Add kafka dev lib * Sync install python with devcontainer * Fix kafka local host setting * Drop set -u flag * Pin CI image dev sha * Add parallel flag to kafka * Fix proper exit status * Build librdkafka from source * Updated dev image sha * Remove coverage exclusions * Add new options to better emulate GHA * Reconfigure kafka networking Co-authored-by: Hannah Stepanek * Fix kafka ports on GHA * Run kafka tests serially * Separate kafka consumer groups * Put CI container makefile back * Remove confluent kafka Py27 for latest * Roll back ubuntu version update * Update dev ci sha --------- Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> Co-authored-by: Hannah Stepanek * Change image tag to latest (#871) * Change image tag to latest * Use built sha * Fixup * Replace w/ latest * Add full version for pypy3.8 to tox (#872) * Add full version for pypy3.8 * Remove solrpy from tests * Instrument RedisCluster (#809) * Add instrumentation for RedisCluster * Add tests for redis cluster * Ignore Django instrumentation from older versions (#859) * Ignore Django instrumentation from older versions * Ignore Django instrumentation from older versions * Fix text concatenation * Update newrelic/hooks/framework_django.py Co-authored-by: Hannah Stepanek * Update newrelic/hooks/framework_django.py Co-authored-by: Hannah Stepanek --------- Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> Co-authored-by: Hannah Stepanek * Modify postgresql tests to include WITH query (#885) * Modify postgresql tests to include WITH * 
[Mega-Linter] Apply linters fixes --------- Co-authored-by: lrafeei * Develop redis addons (#888) * Added separate instrumentation for redis.asyncio.client (#808) * Added separate instrumentation for redis.asyncio.client Merge main branch updates Add tests for newrelic/config.py (#860) Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> * Modify redis tests * removed redis.asyncio from aioredis instrumentation removed aioredis instrumentation in redis asyncio client removed redis.asyncio from aioredis instrumentation --------- Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> Co-authored-by: Lalleh Rafeei Co-authored-by: Lalleh Rafeei <84813886+lrafeei@users.noreply.github.com> * Redis asyncio testing (#881) * Add/modify redis asyncio tests * Change to psubscribe * Tweak redis async tests/instrumentation * [Mega-Linter] Apply linters fixes * Push empty commit * Exclude older instrumentation from coverage * Resolve requested testing changes * Tweak async pubsub test * Fix pubsub test --------- Co-authored-by: lrafeei * Remove aioredis and aredis from tox (#891) * Remove aioredis and aredis from tox * Add aredis and aioredis to coverage ignore * Push empty commit * Fix codecov ignore file --------- Co-authored-by: Ahmed Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> Co-authored-by: lrafeei * Fix drift * Fix ariadne middleware tests for all versions. --------- Co-authored-by: Hannah Stepanek Co-authored-by: Lalleh Rafeei Co-authored-by: TimPansino Co-authored-by: Lalleh Rafeei <84813886+lrafeei@users.noreply.github.com> Co-authored-by: Uma Annamalai Co-authored-by: Hannah Stepanek Co-authored-by: umaannamalai Co-authored-by: Lalleh Rafeei Co-authored-by: Kevin Morey Co-authored-by: Kate Anderson <90657569+kanderson250@users.noreply.github.com> Co-authored-by: Enriqueta De Leon Co-authored-by: Kate Anderson Co-authored-by: Mary Martinez Co-authored-by: Dmitry Kolyagin Co-authored-by: mary-martinez Co-authored-by: enriqueta Co-authored-by: Mary Martinez Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com> Co-authored-by: Justin Richert Co-authored-by: Ahmed Helil Co-authored-by: Ahmed
---
.devcontainer/Dockerfile | 4 +
.devcontainer/devcontainer.json | 35 +
.../actions/setup-python-matrix/action.yml | 45 -
.github/containers/Dockerfile | 98 +
.github/containers/Makefile | 48 +
.github/containers/install-python.sh | 53 +
.github/containers/requirements.txt | 5 +
.github/mergify.yml | 80 +
.github/scripts/retry.sh | 42 +
.github/workflows/build-ci-image.yml | 68 +
.github/workflows/deploy-python.yml | 4 +-
.github/workflows/get-envs.py | 14 +
.github/workflows/mega-linter.yml | 2 +-
.github/workflows/tests.yml | 528 ++++-
.gitignore | 3 +
CONTRIBUTING.rst | 245 ++-
MANIFEST.in | 1 +
THIRD_PARTY_NOTICES.md | 19 +-
codecov.yml | 25 +
newrelic/__init__.py | 7 +-
newrelic/admin/validate_config.py | 99 +-
newrelic/agent.py | 666 +++----
newrelic/api/application.py | 14 +-
newrelic/api/asgi_application.py | 19 +-
newrelic/api/background_task.py | 38 +-
newrelic/api/database_trace.py | 5 +-
newrelic/api/graphql_trace.py | 7 +-
newrelic/api/message_transaction.py | 26 +-
newrelic/api/profile_trace.py | 50 +-
newrelic/api/settings.py | 30 +-
newrelic/api/time_trace.py | 68 +-
newrelic/api/transaction.py | 202 +-
newrelic/api/wsgi_application.py | 34 +-
newrelic/common/package_version_utils.py | 102 +
newrelic/common/signature.py | 31 +
newrelic/common/streaming_utils.py | 49 +-
newrelic/config.py | 275 ++-
newrelic/core/agent_streaming.py | 31 +-
newrelic/core/application.py | 89 +-
newrelic/core/attribute.py | 162 +-
newrelic/core/code_level_metrics.py | 2 +-
newrelic/core/config.py | 86 +-
newrelic/core/context.py | 10 +-
newrelic/core/data_collector.py | 23 +-
newrelic/core/environment.py | 78 +-
newrelic/core/error_node.py | 3 +-
newrelic/core/external_node.py | 2 +
newrelic/core/infinite_tracing_pb2.py | 21 +-
newrelic/core/infinite_tracing_v3_pb2.py | 498 ++---
newrelic/core/infinite_tracing_v4_pb2.py | 147 +-
newrelic/core/stats_engine.py | 106 +-
newrelic/core/trace_cache.py | 103 +-
newrelic/core/transaction_node.py | 437 +++--
newrelic/hooks/adapter_daphne.py | 4 +-
newrelic/hooks/adapter_hypercorn.py | 9 +-
newrelic/hooks/adapter_waitress.py | 17 +-
newrelic/hooks/component_sentry.py | 41 +
newrelic/hooks/datastore_aioredis.py | 54 +-
newrelic/hooks/datastore_aredis.py | 2 +-
newrelic/hooks/datastore_bmemcached.py | 23 +-
newrelic/hooks/datastore_elasticsearch.py | 711 +++++--
newrelic/hooks/datastore_memcache.py | 46 +-
newrelic/hooks/datastore_pyelasticsearch.py | 91 +-
newrelic/hooks/datastore_pylibmc.py | 29 +-
newrelic/hooks/datastore_pymemcache.py | 28 +-
newrelic/hooks/datastore_pymongo.py | 73 +-
newrelic/hooks/datastore_pysolr.py | 18 +-
newrelic/hooks/datastore_redis.py | 171 +-
newrelic/hooks/datastore_solrpy.py | 17 +-
newrelic/hooks/framework_aiohttp.py | 45 +-
newrelic/hooks/framework_django.py | 396 ++--
newrelic/hooks/framework_flask.py | 18 +
newrelic/hooks/framework_graphql.py | 5 +-
newrelic/hooks/framework_sanic.py | 2 +-
newrelic/hooks/framework_twisted.py | 560 ------
newrelic/hooks/logger_logging.py | 7 +-
newrelic/hooks/logger_loguru.py | 20 +-
newrelic/hooks/memcache_pylibmc.py | 57 -
newrelic/hooks/memcache_umemcache.py | 82 -
.../hooks/messagebroker_confluentkafka.py | 6 +-
newrelic/hooks/messagebroker_kafkapython.py | 13 +-
newrelic/hooks/messagebroker_pika.py | 235 ++-
newrelic/hooks/nosql_pymongo.py | 44 -
newrelic/hooks/nosql_redis.py | 62 -
newrelic/hooks/solr_pysolr.py | 39 -
newrelic/hooks/solr_solrpy.py | 50 -
newrelic/packages/isort/LICENSE | 21 +
newrelic/packages/isort/__init__.py | 0
newrelic/packages/isort/stdlibs/__init__.py | 2 +
newrelic/packages/isort/stdlibs/all.py | 3 +
newrelic/packages/isort/stdlibs/py2.py | 3 +
newrelic/packages/isort/stdlibs/py27.py | 301 +++
newrelic/packages/isort/stdlibs/py3.py | 3 +
newrelic/packages/isort/stdlibs/py310.py | 222 +++
newrelic/packages/isort/stdlibs/py311.py | 222 +++
newrelic/packages/isort/stdlibs/py36.py | 224 +++
newrelic/packages/isort/stdlibs/py37.py | 225 +++
newrelic/packages/isort/stdlibs/py38.py | 224 +++
newrelic/packages/isort/stdlibs/py39.py | 224 +++
newrelic/packages/six.py | 689 ++++++-
newrelic/packages/urllib3/LICENSE.txt | 2 +-
newrelic/packages/urllib3/__init__.py | 1 +
newrelic/packages/urllib3/_version.py | 2 +-
newrelic/packages/urllib3/connection.py | 15 +-
newrelic/packages/urllib3/connectionpool.py | 44 +-
.../packages/urllib3/contrib/appengine.py | 2 +-
newrelic/packages/urllib3/contrib/ntlmpool.py | 4 +-
.../packages/urllib3/contrib/pyopenssl.py | 17 +-
.../urllib3/contrib/securetransport.py | 1 -
.../packages/urllib3/packages/__init__.py | 5 -
newrelic/packages/urllib3/packages/six.py | 1 -
.../packages/ssl_match_hostname/__init__.py | 24 -
newrelic/packages/urllib3/poolmanager.py | 1 +
newrelic/packages/urllib3/response.py | 72 +-
newrelic/packages/urllib3/util/connection.py | 3 +-
newrelic/packages/urllib3/util/request.py | 5 +-
newrelic/packages/urllib3/util/retry.py | 62 +-
.../ssl_match_hostname.py} | 15 +-
newrelic/packages/urllib3/util/timeout.py | 9 +-
newrelic/packages/urllib3/util/url.py | 11 +-
newrelic/packages/urllib3/util/wait.py | 1 -
setup.cfg | 2 +-
setup.py | 15 +-
tests/adapter_cheroot/conftest.py | 8 +-
tests/adapter_cheroot/test_wsgi.py | 2 +-
tests/adapter_daphne/conftest.py | 11 +-
tests/adapter_daphne/test_daphne.py | 26 +-
tests/adapter_gevent/conftest.py | 3 +-
tests/adapter_gevent/pytest.ini | 2 -
tests/adapter_gunicorn/conftest.py | 3 +-
tests/adapter_gunicorn/pytest.ini | 2 -
.../test_aiohttp_app_factory.py | 4 +-
tests/adapter_gunicorn/test_asgi_app.py | 4 +-
tests/adapter_gunicorn/test_gaiohttp.py | 4 +-
tests/adapter_hypercorn/conftest.py | 11 +-
tests/adapter_hypercorn/test_hypercorn.py | 23 +-
tests/adapter_uvicorn/conftest.py | 10 +-
tests/adapter_uvicorn/test_uvicorn.py | 4 +-
tests/adapter_waitress/_application.py | 54 +
tests/adapter_waitress/conftest.py | 40 +
tests/adapter_waitress/test_wsgi.py | 101 +
.../_test_async_coroutine_trace.py | 6 +-
.../_test_code_level_metrics.py | 41 +-
tests/agent_features/conftest.py | 15 +-
tests/agent_features/test_apdex_metrics.py | 31 +-
tests/agent_features/test_asgi_browser.py | 749 +++----
.../test_asgi_distributed_tracing.py | 4 +-
tests/agent_features/test_asgi_transaction.py | 40 +-
.../test_asgi_w3c_trace_context.py | 6 +-
.../test_async_context_propagation.py | 9 +-
tests/agent_features/test_async_timing.py | 12 +-
tests/agent_features/test_attribute.py | 346 ++--
.../test_attributes_in_action.py | 86 +-
tests/agent_features/test_browser.py | 825 ++++----
.../agent_features/test_code_level_metrics.py | 437 +++--
.../agent_features/test_collector_payloads.py | 51 +-
tests/agent_features/test_configuration.py | 403 +++-
tests/agent_features/test_coroutine_trace.py | 36 +-
.../test_coroutine_transaction.py | 75 +-
tests/agent_features/test_custom_metrics.py | 62 +
.../test_distributed_tracing.py | 409 ++--
tests/agent_features/test_error_events.py | 8 +-
.../test_error_group_callback.py | 261 +++
.../test_event_loop_wait_time.py | 31 +-
tests/agent_features/test_function_trace.py | 5 +-
.../agent_features/test_high_security_mode.py | 24 +-
.../test_ignore_expected_errors.py | 26 +-
tests/agent_features/test_lambda_handler.py | 281 +--
tests/agent_features/test_notice_error.py | 12 +-
.../agent_features/test_priority_sampling.py | 59 +-
tests/agent_features/test_profile_trace.py | 88 +
tests/agent_features/test_serverless_mode.py | 144 +-
tests/agent_features/test_span_events.py | 748 +++----
.../test_supportability_metrics.py | 2 +-
tests/agent_features/test_synthetics.py | 4 +
tests/agent_features/test_time_trace.py | 41 +-
...n_event_data_and_some_browser_stuff_too.py | 274 +--
tests/agent_features/test_transaction_name.py | 2 +-
.../test_transaction_trace_segments.py | 166 +-
.../agent_features/test_w3c_trace_context.py | 6 +-
tests/agent_features/test_web_transaction.py | 153 +-
tests/agent_features/test_wsgi_attributes.py | 15 +-
tests/agent_streaming/_test_handler.py | 64 +-
tests/agent_streaming/conftest.py | 54 +-
.../agent_streaming/test_infinite_tracing.py | 381 +++-
tests/agent_streaming/test_stream_buffer.py | 87 +
tests/agent_streaming/test_streaming_rpc.py | 115 +-
tests/agent_unittests/conftest.py | 9 +-
tests/agent_unittests/test_agent_connect.py | 3 +-
tests/agent_unittests/test_harvest_loop.py | 21 +-
.../test_package_version_utils.py | 104 +
tests/agent_unittests/test_signature.py | 31 +
tests/agent_unittests/test_trace_cache.py | 129 ++
.../test_utilization_settings.py | 246 ++-
tests/application_celery/conftest.py | 8 +-
tests/application_celery/test_celery.py | 2 +-
tests/application_gearman/conftest.py | 8 +-
tests/application_gearman/test_gearman.py | 6 +-
.../component_djangorestframework/conftest.py | 9 +-
.../test_application.py | 6 +-
tests/component_flask_rest/conftest.py | 9 +-
.../component_flask_rest/test_application.py | 87 +-
tests/component_graphqlserver/conftest.py | 11 +-
tests/component_graphqlserver/test_graphql.py | 8 +-
tests/component_tastypie/conftest.py | 8 +-
tests/component_tastypie/test_application.py | 5 +-
tests/coroutines_asyncio/conftest.py | 12 +-
.../test_context_propagation.py | 9 +-
tests/cross_agent/conftest.py | 10 +-
.../utilization/utilization_json.json | 6 +-
.../utilization_vendor_specific/gcp.json | 28 +-
.../cross_agent/test_aws_utilization_data.py | 2 +-
.../test_azure_utilization_data.py | 2 +-
.../test_boot_id_utilization_data.py | 2 +-
tests/cross_agent/test_cat_map.py | 168 +-
tests/cross_agent/test_collector_hostname.py | 68 +-
tests/cross_agent/test_distributed_tracing.py | 213 +-
.../cross_agent/test_gcp_utilization_data.py | 2 +-
tests/cross_agent/test_lambda_event_source.py | 4 +-
.../cross_agent/test_pcf_utilization_data.py | 2 +-
tests/cross_agent/test_rum_client_config.py | 98 +-
tests/cross_agent/test_utilization_configs.py | 123 +-
tests/cross_agent/test_w3c_trace_context.py | 252 +--
tests/datastore_aioredis/conftest.py | 31 +-
.../test_custom_conn_pool.py | 15 +-
.../test_execute_command.py | 19 +-
tests/datastore_aioredis/test_get_and_set.py | 24 +-
.../datastore_aioredis/test_instance_info.py | 7 +-
tests/datastore_aioredis/test_multiple_dbs.py | 23 +-
tests/datastore_aioredis/test_trace_node.py | 10 +-
tests/datastore_aioredis/test_transactions.py | 42 +-
.../test_uninstrumented_methods.py | 24 +
tests/datastore_aredis/conftest.py | 8 +-
.../datastore_aredis/test_custom_conn_pool.py | 4 +-
.../datastore_aredis/test_execute_command.py | 4 +-
tests/datastore_aredis/test_get_and_set.py | 4 +-
tests/datastore_aredis/test_multiple_dbs.py | 6 +-
tests/datastore_aredis/test_trace_node.py | 86 +-
tests/datastore_asyncpg/conftest.py | 13 +-
tests/datastore_asyncpg/test_multiple_dbs.py | 6 +-
tests/datastore_asyncpg/test_query.py | 8 +-
tests/datastore_bmemcached/conftest.py | 9 +-
tests/datastore_bmemcached/test_memcache.py | 2 +-
tests/datastore_elasticsearch/conftest.py | 38 +-
.../test_connection.py | 50 +-
.../test_database_duration.py | 44 +-
.../test_elasticsearch.py | 288 +--
.../test_instrumented_methods.py | 144 +-
tests/datastore_elasticsearch/test_mget.py | 161 +-
.../test_multiple_dbs.py | 98 +-
.../test_trace_node.py | 84 +-
.../datastore_elasticsearch/test_transport.py | 132 +-
tests/datastore_memcache/conftest.py | 9 +-
tests/datastore_memcache/test_memcache.py | 4 +-
tests/datastore_memcache/test_multiple_dbs.py | 4 +-
tests/datastore_mysql/conftest.py | 9 +-
tests/datastore_mysql/test_database.py | 2 +-
tests/datastore_postgresql/conftest.py | 34 +-
tests/datastore_postgresql/test_database.py | 43 +-
tests/datastore_psycopg2/conftest.py | 9 +-
tests/datastore_psycopg2/test_async.py | 6 +-
tests/datastore_psycopg2/test_cursor.py | 4 +-
tests/datastore_psycopg2/test_multiple_dbs.py | 4 +-
tests/datastore_psycopg2/test_register.py | 4 +-
tests/datastore_psycopg2/test_rollback.py | 4 +-
tests/datastore_psycopg2/test_trace_node.py | 60 +-
tests/datastore_psycopg2cffi/conftest.py | 10 +-
tests/datastore_psycopg2cffi/test_database.py | 5 +-
.../test_pyelasticsearch.py | 116 --
tests/datastore_pylibmc/conftest.py | 9 +-
tests/datastore_pylibmc/test_memcache.py | 2 +-
tests/datastore_pymemcache/conftest.py | 9 +-
tests/datastore_pymemcache/test_memcache.py | 2 +-
tests/datastore_pymongo/conftest.py | 11 +-
tests/datastore_pymongo/test_pymongo.py | 7 +-
.../conftest.py | 29 +-
tests/datastore_pymssql/test_database.py | 115 ++
tests/datastore_pymysql/conftest.py | 10 +-
tests/datastore_pymysql/test_database.py | 3 +-
tests/datastore_pyodbc/conftest.py | 33 +
tests/datastore_pyodbc/test_pyodbc.py | 120 ++
tests/datastore_pysolr/conftest.py | 8 +-
tests/datastore_pysolr/test_solr.py | 2 +-
tests/datastore_redis/conftest.py | 8 +-
tests/datastore_redis/test_asyncio.py | 100 +
.../datastore_redis/test_custom_conn_pool.py | 4 +-
tests/datastore_redis/test_execute_command.py | 4 +-
tests/datastore_redis/test_get_and_set.py | 4 +-
tests/datastore_redis/test_multiple_dbs.py | 4 +-
tests/datastore_redis/test_rb.py | 4 +-
tests/datastore_redis/test_trace_node.py | 76 +-
.../test_uninstrumented_methods.py | 55 +-
tests/datastore_rediscluster/conftest.py | 32 +
...est_uninstrumented_rediscluster_methods.py | 168 ++
tests/datastore_solrpy/conftest.py | 8 +-
tests/datastore_solrpy/test_solr.py | 2 +-
tests/datastore_sqlite/conftest.py | 9 +-
tests/datastore_sqlite/test_database.py | 4 +-
tests/datastore_umemcache/conftest.py | 38 -
tests/datastore_umemcache/test_memcache.py | 143 --
tests/external_boto3/conftest.py | 8 +-
tests/external_boto3/test_boto3_iam.py | 69 +-
tests/external_boto3/test_boto3_s3.py | 122 +-
tests/external_boto3/test_boto3_sns.py | 111 +-
tests/external_botocore/conftest.py | 8 +-
.../test_botocore_dynamodb.py | 191 +-
tests/external_botocore/test_botocore_ec2.py | 85 +-
tests/external_botocore/test_botocore_s3.py | 113 +-
tests/external_botocore/test_botocore_sqs.py | 121 +-
tests/external_feedparser/conftest.py | 10 +-
tests/external_feedparser/test_feedparser.py | 2 +-
tests/external_http/conftest.py | 9 +-
tests/external_http/test_http.py | 2 +-
tests/external_httplib/conftest.py | 11 +-
tests/external_httplib/test_httplib.py | 16 +-
tests/external_httplib/test_urllib.py | 3 +-
tests/external_httplib/test_urllib2.py | 3 +-
tests/external_httplib2/conftest.py | 8 +-
tests/external_httplib2/test_httplib2.py | 2 +-
tests/external_httpx/conftest.py | 12 +-
tests/external_httpx/test_client.py | 12 +-
tests/external_requests/conftest.py | 9 +-
tests/external_requests/test_requests.py | 5 +-
tests/external_urllib3/conftest.py | 9 +-
tests/external_urllib3/test_urllib3.py | 5 +-
.../framework_aiohttp/_target_application.py | 135 +-
tests/framework_aiohttp/conftest.py | 15 +-
tests/framework_aiohttp/test_client.py | 30 +-
.../test_client_async_await.py | 3 +-
tests/framework_aiohttp/test_client_cat.py | 33 +-
tests/framework_aiohttp/test_externals.py | 34 +-
tests/framework_aiohttp/test_middleware.py | 64 +-
tests/framework_aiohttp/test_server.py | 242 +--
tests/framework_aiohttp/test_server_cat.py | 180 +-
tests/framework_aiohttp/test_ws.py | 23 +-
.../framework_ariadne/_target_application.py | 45 +-
.../framework_ariadne/_target_schema_sync.py | 11 +-
tests/framework_ariadne/conftest.py | 11 +-
tests/framework_ariadne/test_application.py | 23 +-
tests/framework_bottle/conftest.py | 8 +-
tests/framework_bottle/test_application.py | 7 +-
tests/framework_cherrypy/conftest.py | 8 +-
tests/framework_cherrypy/test_application.py | 6 +-
tests/framework_cherrypy/test_dispatch.py
| 2 +- tests/framework_cherrypy/test_resource.py | 2 +- tests/framework_cherrypy/test_routes.py | 2 +- tests/framework_django/conftest.py | 9 +- tests/framework_django/test_application.py | 6 +- .../framework_django/test_asgi_application.py | 6 +- tests/framework_falcon/conftest.py | 8 +- tests/framework_falcon/test_application.py | 6 +- tests/framework_fastapi/conftest.py | 11 +- tests/framework_fastapi/test_application.py | 2 +- tests/framework_flask/conftest.py | 8 +- tests/framework_flask/test_application.py | 6 +- tests/framework_flask/test_blueprints.py | 5 +- tests/framework_flask/test_compress.py | 5 +- tests/framework_flask/test_middleware.py | 5 +- tests/framework_flask/test_not_found.py | 4 +- tests/framework_flask/test_user_exceptions.py | 4 +- tests/framework_flask/test_views.py | 12 +- tests/framework_graphene/conftest.py | 11 +- tests/framework_graphql/conftest.py | 13 +- tests/framework_graphql/test_application.py | 32 +- tests/framework_grpc/conftest.py | 12 +- tests/framework_grpc/test_clients.py | 4 +- .../test_distributed_tracing.py | 5 +- tests/framework_grpc/test_server.py | 11 +- tests/framework_pyramid/conftest.py | 8 +- .../test_append_slash_app.py | 6 +- tests/framework_pyramid/test_application.py | 5 +- tests/framework_pyramid/test_cornice.py | 4 +- tests/framework_sanic/conftest.py | 12 +- tests/framework_sanic/test_application.py | 20 +- .../framework_sanic/test_cross_application.py | 150 +- tests/framework_starlette/conftest.py | 11 +- tests/framework_starlette/test_application.py | 8 +- tests/framework_starlette/test_bg_tasks.py | 24 +- tests/framework_starlette/test_graphql.py | 3 +- tests/framework_strawberry/conftest.py | 10 +- .../framework_tornado/_target_application.py | 88 +- tests/framework_tornado/conftest.py | 9 +- .../framework_tornado/test_custom_handler.py | 2 +- tests/framework_tornado/test_externals.py | 6 +- tests/framework_tornado/test_inbound_cat.py | 5 +- tests/framework_tornado/test_server.py | 234 +-- tests/logger_logging/conftest.py | 11 +- tests/logger_logging/test_local_decorating.py | 28 +- tests/logger_logging/test_metrics.py | 5 +- tests/logger_logging/test_settings.py | 7 +- tests/logger_loguru/conftest.py | 11 +- tests/logger_loguru/test_metrics.py | 4 +- tests/logger_loguru/test_settings.py | 7 +- .../messagebroker_confluentkafka/conftest.py | 25 +- .../test_consumer.py | 20 +- .../test_producer.py | 67 +- .../test_serialization.py | 4 +- tests/messagebroker_kafkapython/conftest.py | 27 +- .../test_consumer.py | 20 +- .../test_producer.py | 13 +- .../test_serialization.py | 8 +- tests/messagebroker_pika/conftest.py | 12 +- tests/messagebroker_pika/test_cat.py | 3 +- .../test_distributed_tracing.py | 4 +- .../test_pika_async_connection_consume.py | 327 ++-- .../test_pika_blocking_connection_consume.py | 212 +- ...a_blocking_connection_consume_generator.py | 219 +-- tests/messagebroker_pika/test_pika_produce.py | 14 +- .../test_pika_supportability.py | 2 +- tests/template_genshi/conftest.py | 30 + tests/template_genshi/test_genshi.py | 38 + tests/template_jinja2/conftest.py | 30 + tests/template_jinja2/test_jinja2.py | 41 + tests/template_mako/conftest.py | 8 +- tests/template_mako/test_mako.py | 17 +- tests/testing_support/db_settings.py | 196 +- tests/testing_support/fixtures.py | 1728 ++--------------- tests/testing_support/sample_applications.py | 154 +- .../sample_asgi_applications.py | 6 +- tests/testing_support/util.py | 15 + .../validate_application_error_event_count.py | 39 + .../validate_application_error_trace_count.py | 
39 + .../validators/validate_application_errors.py | 56 + .../validators/validate_browser_attributes.py | 74 + .../validate_custom_event_collector_json.py | 64 + .../validators/validate_custom_parameters.py | 48 + .../validate_datastore_trace_inputs.py | 50 + .../validate_error_event_attributes.py | 51 + ...or_event_attributes_outside_transaction.py | 48 + .../validate_error_event_collector_json.py | 69 + .../validate_error_trace_attributes.py | 47 + ...or_trace_attributes_outside_transaction.py | 45 + .../validate_error_trace_collector_json.py | 58 + .../validators/validate_internal_metrics.py | 64 + .../validators/validate_metric_payload.py | 6 +- .../validate_non_transaction_error_event.py | 71 + .../validators/validate_synthetics_event.py | 71 + .../validate_synthetics_transaction_trace.py | 67 + ...lidate_time_metrics_outside_transaction.py | 93 + ...date_transaction_error_trace_attributes.py | 49 + .../validators/validate_transaction_errors.py | 83 + .../validate_transaction_event_attributes.py | 53 + ...lidate_transaction_event_collector_json.py | 56 + .../validate_transaction_metrics.py | 135 ++ .../validate_transaction_trace_attributes.py | 69 + .../validators/validate_tt_collector_json.py | 184 ++ .../validators/validate_tt_parameters.py | 52 + .../validators/validate_tt_segment_params.py | 91 + tox.ini | 333 ++-- 449 files changed, 18478 insertions(+), 11562 deletions(-) create mode 100644 .devcontainer/Dockerfile create mode 100644 .devcontainer/devcontainer.json delete mode 100644 .github/actions/setup-python-matrix/action.yml create mode 100644 .github/containers/Dockerfile create mode 100644 .github/containers/Makefile create mode 100755 .github/containers/install-python.sh create mode 100644 .github/containers/requirements.txt create mode 100644 .github/mergify.yml create mode 100755 .github/scripts/retry.sh create mode 100644 .github/workflows/build-ci-image.yml create mode 100644 codecov.yml create mode 100644 newrelic/common/package_version_utils.py create mode 100644 newrelic/common/signature.py create mode 100644 newrelic/hooks/component_sentry.py delete mode 100644 newrelic/hooks/framework_twisted.py delete mode 100644 newrelic/hooks/memcache_pylibmc.py delete mode 100644 newrelic/hooks/memcache_umemcache.py delete mode 100644 newrelic/hooks/nosql_pymongo.py delete mode 100644 newrelic/hooks/nosql_redis.py delete mode 100644 newrelic/hooks/solr_pysolr.py delete mode 100644 newrelic/hooks/solr_solrpy.py create mode 100644 newrelic/packages/isort/LICENSE create mode 100644 newrelic/packages/isort/__init__.py create mode 100644 newrelic/packages/isort/stdlibs/__init__.py create mode 100644 newrelic/packages/isort/stdlibs/all.py create mode 100644 newrelic/packages/isort/stdlibs/py2.py create mode 100644 newrelic/packages/isort/stdlibs/py27.py create mode 100644 newrelic/packages/isort/stdlibs/py3.py create mode 100644 newrelic/packages/isort/stdlibs/py310.py create mode 100644 newrelic/packages/isort/stdlibs/py311.py create mode 100644 newrelic/packages/isort/stdlibs/py36.py create mode 100644 newrelic/packages/isort/stdlibs/py37.py create mode 100644 newrelic/packages/isort/stdlibs/py38.py create mode 100644 newrelic/packages/isort/stdlibs/py39.py delete mode 100644 newrelic/packages/urllib3/packages/ssl_match_hostname/__init__.py rename newrelic/packages/urllib3/{packages/ssl_match_hostname/_implementation.py => util/ssl_match_hostname.py} (92%) delete mode 100644 tests/adapter_gevent/pytest.ini delete mode 100644 tests/adapter_gunicorn/pytest.ini create mode 100644 
tests/adapter_waitress/_application.py create mode 100644 tests/adapter_waitress/conftest.py create mode 100644 tests/adapter_waitress/test_wsgi.py create mode 100644 tests/agent_features/test_custom_metrics.py create mode 100644 tests/agent_features/test_error_group_callback.py create mode 100644 tests/agent_features/test_profile_trace.py create mode 100644 tests/agent_streaming/test_stream_buffer.py create mode 100644 tests/agent_unittests/test_package_version_utils.py create mode 100644 tests/agent_unittests/test_signature.py create mode 100644 tests/agent_unittests/test_trace_cache.py delete mode 100644 tests/datastore_pyelasticsearch/test_pyelasticsearch.py rename tests/{datastore_pyelasticsearch => datastore_pymssql}/conftest.py (50%) create mode 100644 tests/datastore_pymssql/test_database.py create mode 100644 tests/datastore_pyodbc/conftest.py create mode 100644 tests/datastore_pyodbc/test_pyodbc.py create mode 100644 tests/datastore_redis/test_asyncio.py create mode 100644 tests/datastore_rediscluster/conftest.py create mode 100644 tests/datastore_rediscluster/test_uninstrumented_rediscluster_methods.py delete mode 100644 tests/datastore_umemcache/conftest.py delete mode 100644 tests/datastore_umemcache/test_memcache.py create mode 100644 tests/template_genshi/conftest.py create mode 100644 tests/template_genshi/test_genshi.py create mode 100644 tests/template_jinja2/conftest.py create mode 100644 tests/template_jinja2/test_jinja2.py create mode 100644 tests/testing_support/validators/validate_application_error_event_count.py create mode 100644 tests/testing_support/validators/validate_application_error_trace_count.py create mode 100644 tests/testing_support/validators/validate_application_errors.py create mode 100644 tests/testing_support/validators/validate_browser_attributes.py create mode 100644 tests/testing_support/validators/validate_custom_event_collector_json.py create mode 100644 tests/testing_support/validators/validate_custom_parameters.py create mode 100644 tests/testing_support/validators/validate_datastore_trace_inputs.py create mode 100644 tests/testing_support/validators/validate_error_event_attributes.py create mode 100644 tests/testing_support/validators/validate_error_event_attributes_outside_transaction.py create mode 100644 tests/testing_support/validators/validate_error_event_collector_json.py create mode 100644 tests/testing_support/validators/validate_error_trace_attributes.py create mode 100644 tests/testing_support/validators/validate_error_trace_attributes_outside_transaction.py create mode 100644 tests/testing_support/validators/validate_error_trace_collector_json.py create mode 100644 tests/testing_support/validators/validate_internal_metrics.py create mode 100644 tests/testing_support/validators/validate_non_transaction_error_event.py create mode 100644 tests/testing_support/validators/validate_synthetics_event.py create mode 100644 tests/testing_support/validators/validate_synthetics_transaction_trace.py create mode 100644 tests/testing_support/validators/validate_time_metrics_outside_transaction.py create mode 100644 tests/testing_support/validators/validate_transaction_error_trace_attributes.py create mode 100644 tests/testing_support/validators/validate_transaction_errors.py create mode 100644 tests/testing_support/validators/validate_transaction_event_attributes.py create mode 100644 tests/testing_support/validators/validate_transaction_event_collector_json.py create mode 100644 tests/testing_support/validators/validate_transaction_metrics.py 
create mode 100644 tests/testing_support/validators/validate_transaction_trace_attributes.py create mode 100644 tests/testing_support/validators/validate_tt_collector_json.py create mode 100644 tests/testing_support/validators/validate_tt_parameters.py create mode 100644 tests/testing_support/validators/validate_tt_segment_params.py diff --git a/.devcontainer/Dockerfile b/.devcontainer/Dockerfile new file mode 100644 index 000000000..f42af328e --- /dev/null +++ b/.devcontainer/Dockerfile @@ -0,0 +1,4 @@ +ARG IMAGE=ghcr.io/newrelic-experimental/pyenv-devcontainer:latest + +# To target other architectures, change the --platform directive in the Dockerfile. +FROM --platform=linux/amd64 ${IMAGE} diff --git a/.devcontainer/devcontainer.json b/.devcontainer/devcontainer.json new file mode 100644 index 000000000..92a8cdee4 --- /dev/null +++ b/.devcontainer/devcontainer.json @@ -0,0 +1,35 @@ +// For format details, see https://containers.dev/implementors/json_reference/. +{ + "name": "pyenv", + "build":{ + // To target other architectures, change the --platform directive in the Dockerfile. + "dockerfile": "Dockerfile", + "args": { + "IMAGE": "ghcr.io/newrelic-experimental/pyenv-devcontainer:latest" + } + }, + "remoteUser": "vscode", + "runArgs": ["--network=host"], + "features": { + // Available Features: https://containers.dev/features + // "ghcr.io/devcontainers/features/docker-from-docker:1": {"moby": false}, + // "ghcr.io/devcontainers/features/aws-cli:1": {}, + // "ghcr.io/devcontainers/features/github-cli:1": {} + }, + "containerEnv": { + "NEW_RELIC_HOST": "${localEnv:NEW_RELIC_HOST}", + "NEW_RELIC_LICENSE_KEY": "${localEnv:NEW_RELIC_LICENSE_KEY}", + "NEW_RELIC_INSERT_KEY": "${localEnv:NEW_RELIC_INSERT_KEY}", + "NEW_RELIC_DEVELOPER_MODE": "${localEnv:NEW_RELIC_DEVELOPER_MODE}" + }, + "customizations": { + "vscode": { + "settings": {}, + "extensions": [ + "ms-python.python", + "ms-vsliveshare.vsliveshare", + "eamodio.gitlens" + ] + } + } +} diff --git a/.github/actions/setup-python-matrix/action.yml b/.github/actions/setup-python-matrix/action.yml deleted file mode 100644 index 344cf686c..000000000 --- a/.github/actions/setup-python-matrix/action.yml +++ /dev/null @@ -1,45 +0,0 @@ -name: "setup-python-matrix" -description: "Sets up all versions of python required for matrix testing in this repo." -runs: - using: "composite" - steps: - - uses: actions/setup-python@v3 - with: - python-version: "pypy-3.7" - architecture: x64 - - - uses: actions/setup-python@v3 - with: - python-version: "pypy-2.7" - architecture: x64 - - - uses: actions/setup-python@v3 - with: - python-version: "3.7" - architecture: x64 - - - uses: actions/setup-python@v3 - with: - python-version: "3.8" - architecture: x64 - - - uses: actions/setup-python@v3 - with: - python-version: "3.9" - architecture: x64 - - - uses: actions/setup-python@v3 - with: - python-version: "3.10" - architecture: x64 - - - uses: actions/setup-python@v3 - with: - python-version: "2.7" - architecture: x64 - - - name: Install Dependencies - shell: bash - run: | - python3.10 -m pip install -U pip - python3.10 -m pip install -U wheel setuptools tox virtualenv!=20.0.24 diff --git a/.github/containers/Dockerfile b/.github/containers/Dockerfile new file mode 100644 index 000000000..2fbefb14a --- /dev/null +++ b/.github/containers/Dockerfile @@ -0,0 +1,98 @@ + +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +FROM ubuntu:20.04 + +# Install OS packages +RUN export DEBIAN_FRONTEND=noninteractive && \ + apt-get update && \ + apt-get install -y \ + bash \ + build-essential \ + curl \ + expat \ + freetds-common \ + freetds-dev \ + gcc \ + git \ + libbz2-dev \ + libcurl4-openssl-dev \ + libffi-dev \ + libgmp-dev \ + libkrb5-dev \ + liblzma-dev \ + libmpfr-dev \ + libncurses-dev \ + libpq-dev \ + libreadline-dev \ + libsqlite3-dev \ + libssl-dev \ + locales \ + make \ + odbc-postgresql \ + openssl \ + python2-dev \ + python3-dev \ + python3-pip \ + tzdata \ + unixodbc-dev \ + unzip \ + wget \ + zip \ + zlib1g \ + zlib1g-dev && \ + rm -rf /var/lib/apt/lists/* + +# Build librdkafka from source +ARG LIBRDKAFKA_VERSION=2.1.1 +RUN cd /tmp && \ + wget https://github.com/confluentinc/librdkafka/archive/refs/tags/v${LIBRDKAFKA_VERSION}.zip -O ./librdkafka.zip && \ + unzip ./librdkafka.zip && \ + rm ./librdkafka.zip && \ + cd ./librdkafka-${LIBRDKAFKA_VERSION} && \ + ./configure && \ + make all install && \ + cd /tmp && \ + rm -rf ./librdkafka-${LIBRDKAFKA_VERSION} + +# Setup ODBC config +RUN sed -i 's|Driver=psqlodbca.so|Driver=/usr/lib/x86_64-linux-gnu/odbc/psqlodbca.so|g' /etc/odbcinst.ini && \ + sed -i 's|Driver=psqlodbcw.so|Driver=/usr/lib/x86_64-linux-gnu/odbc/psqlodbcw.so|g' /etc/odbcinst.ini && \ + sed -i 's|Setup=libodbcpsqlS.so|Setup=/usr/lib/x86_64-linux-gnu/odbc/libodbcpsqlS.so|g' /etc/odbcinst.ini + +# Set the locale +RUN locale-gen --no-purge en_US.UTF-8 +ENV LANG=en_US.UTF-8 \ LANGUAGE=en_US:en \ LC_ALL=en_US.UTF-8 +ENV TZ="Etc/UTC" +RUN ln -fs "/usr/share/zoneinfo/${TZ}" /etc/localtime && \ + dpkg-reconfigure -f noninteractive tzdata + +# Use root user +ENV HOME /root +WORKDIR "${HOME}" + +# Install pyenv +ENV PYENV_ROOT="${HOME}/.pyenv" +RUN curl https://pyenv.run/ | /bin/bash +ENV PATH="$PYENV_ROOT/bin:$PYENV_ROOT/shims:${PATH}" +RUN echo 'eval "$(pyenv init -)"' >>$HOME/.bashrc && \ + pyenv update + +# Install Python +ARG PYTHON_VERSIONS="3.10 3.9 3.8 3.7 3.11 2.7 pypy2.7-7.3.12 pypy3.8-7.3.11" +COPY --chown=1000:1000 --chmod=+x ./install-python.sh /tmp/install-python.sh +COPY ./requirements.txt /requirements.txt +RUN /tmp/install-python.sh && \ + rm /tmp/install-python.sh diff --git a/.github/containers/Makefile b/.github/containers/Makefile new file mode 100644 index 000000000..35081f738 --- /dev/null +++ b/.github/containers/Makefile @@ -0,0 +1,48 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +# Repository root for mounting into container. 
+MAKEFILE_DIR:=$(dir $(realpath $(firstword $(MAKEFILE_LIST)))) +REPO_ROOT:=$(realpath $(MAKEFILE_DIR)../../) + +.PHONY: default +default: test + +.PHONY: build +build: + @# Perform a shortened build for testing + @docker build $(MAKEFILE_DIR) \ + -t ghcr.io/newrelic/newrelic-python-agent-ci:local \ + --build-arg='PYTHON_VERSIONS=3.10 2.7' + +.PHONY: test +test: build + @# Ensure python versions are usable (run the same tag the build target creates) + @docker run --rm ghcr.io/newrelic/newrelic-python-agent-ci:local /bin/bash -c '\ + python3.10 --version && \ + python2.7 --version && \ + touch tox.ini && tox --version && \ + echo "Success! Python versions installed."' + +.PHONY: run +run: build + @docker run --rm -it \ + --mount type=bind,source="$(REPO_ROOT)",target=/home/github/python-agent \ + --workdir=/home/github/python-agent \ + --add-host=host.docker.internal:host-gateway \ + -e NEW_RELIC_HOST="${NEW_RELIC_HOST}" \ + -e NEW_RELIC_LICENSE_KEY="${NEW_RELIC_LICENSE_KEY}" \ + -e NEW_RELIC_DEVELOPER_MODE="${NEW_RELIC_DEVELOPER_MODE}" \ + -e GITHUB_ACTIONS="true" \ + ghcr.io/newrelic/newrelic-python-agent-ci:local /bin/bash diff --git a/.github/containers/install-python.sh b/.github/containers/install-python.sh new file mode 100755 index 000000000..2031e2d92 --- /dev/null +++ b/.github/containers/install-python.sh @@ -0,0 +1,53 @@ +#!/bin/bash +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +set -eo pipefail + +main() { + # Coerce space separated string to array + if [[ ${#PYTHON_VERSIONS[@]} -eq 1 ]]; then + PYTHON_VERSIONS=($PYTHON_VERSIONS) + fi + + if [[ -z "${PYTHON_VERSIONS[@]}" ]]; then + echo "No python versions specified. Make sure PYTHON_VERSIONS is set." 1>&2 + exit 1 + fi + + # Find all latest pyenv supported versions for requested python versions + PYENV_VERSIONS=() + for v in "${PYTHON_VERSIONS[@]}"; do + LATEST=$(pyenv latest -k "$v" || pyenv latest -k "$v-dev") + if [[ -z "$LATEST" ]]; then + echo "Latest version could not be found for ${v}."
1>&2 + exit 1 + fi + PYENV_VERSIONS+=($LATEST) + done + + # Install each specific version + for v in "${PYENV_VERSIONS[@]}"; do + pyenv install "$v" & + done + wait + + # Set all installed versions as globally accessible + pyenv global ${PYENV_VERSIONS[@]} + + # Install dependencies for main python installation + pyenv exec pip install --upgrade -r /requirements.txt +} + +main diff --git a/.github/containers/requirements.txt b/.github/containers/requirements.txt new file mode 100644 index 000000000..27fa6624b --- /dev/null +++ b/.github/containers/requirements.txt @@ -0,0 +1,5 @@ +pip +setuptools +wheel +virtualenv<20.22.0 +tox \ No newline at end of file diff --git a/.github/mergify.yml b/.github/mergify.yml new file mode 100644 index 000000000..9536b3039 --- /dev/null +++ b/.github/mergify.yml @@ -0,0 +1,80 @@ +# For condition grammar see: https://docs.mergify.com/conditions/#grammar + +shared: + conditions: + - and: &pr_ready_checks + - "#approved-reviews-by>=1" # A '#' pulls the length of the underlying list + - "label=ready-to-merge" + - "check-success=tests" + - "-draft" # Don't include draft PRs + - "-merged" + - or: # Only handle branches that target main or develop branches + - "base=main" + - "base~=^develop" + +queue_rules: + - name: default + conditions: + - and: *pr_ready_checks + merge_method: squash + +pull_request_rules: + # Merge Queue PR Rules + - name: Regular PRs - Add to merge queue on approval (squash) + conditions: + - and: *pr_ready_checks + - "-head~=^develop" # Don't include PRs from develop branches + actions: + queue: + method: squash + + # Automatic PR Updates + - name: Automatic PR branch updates + conditions: + - "queue-position=-1" # Not queued + - "-draft" # Don't include draft PRs + - "-merged" + actions: + update: + + # Automatic Labeling + - name: Clean up after merge + conditions: + - merged + actions: + delete_head_branch: + label: + remove: + - "merge-conflicts" + - "ready-to-merge" + - "tests-failing" + + - name: Toggle label on merge conflicts + conditions: + - "-merged" + - conflict + actions: + label: + toggle: + - "merge-conflicts" + + # Don't use a toggle for this, as the label constantly gets applied and removed when tests are rerun. + - name: Add label on test failures + conditions: + - "-merged" + - or: + - check-failure=tests + - check-skipped=tests + actions: + label: + add: + - "tests-failing" + + - name: Remove label on test success + conditions: + - "-merged" + - check-success=tests + actions: + label: + remove: + - "tests-failing" diff --git a/.github/scripts/retry.sh b/.github/scripts/retry.sh new file mode 100755 index 000000000..079798a72 --- /dev/null +++ b/.github/scripts/retry.sh @@ -0,0 +1,42 @@ +#!/bin/bash +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + + +# Time in seconds to backoff after the initial attempt. 
+INITIAL_BACKOFF=10 + +# Grab first arg as number of retries +retries=$1 +shift + +# Use for loop to repeatedly try the wrapped command, breaking on success +for i in $(seq 1 $retries); do + echo "Running: $@" + + # Exponential backoff + if [[ i -gt 1 ]]; then + # Starts with the initial backoff then doubles every retry. + backoff=$(($INITIAL_BACKOFF * (2 ** (i - 2)))) + echo "Command failed, retrying in $backoff seconds..." + sleep $backoff + fi + + # Run wrapped command, and exit on success + $@ && break + result=$? +done + +# Exit with status code of wrapped command +exit $result diff --git a/.github/workflows/build-ci-image.yml b/.github/workflows/build-ci-image.yml new file mode 100644 index 000000000..5bd0e6f69 --- /dev/null +++ b/.github/workflows/build-ci-image.yml @@ -0,0 +1,68 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +name: Build CI Image + +on: + workflow_dispatch: # Allow manual trigger + +concurrency: + group: ${{ github.ref || github.run_id }} + cancel-in-progress: true + +jobs: + build: + runs-on: ubuntu-latest + + steps: + - uses: actions/checkout@v3 + with: + persist-credentials: false + fetch-depth: 0 + + - name: Set up Docker Buildx + id: buildx + uses: docker/setup-buildx-action@v2 + + - name: Generate Docker Metadata (Tags and Labels) + id: meta + uses: docker/metadata-action@v4 + with: + images: ghcr.io/${{ github.repository }}-ci + flavor: | + prefix= + suffix= + latest=false + tags: | + type=raw,value=latest,enable={{is_default_branch}} + type=schedule,pattern={{date 'YYYY-MM-DD'}} + type=sha,format=short,prefix=sha- + type=sha,format=long,prefix=sha- + + - name: Login to GitHub Container Registry + if: github.event_name != 'pull_request' + uses: docker/login-action@v2 + with: + registry: ghcr.io + username: ${{ github.repository_owner }} + password: ${{ secrets.GITHUB_TOKEN }} + + - name: Build and Publish Image + uses: docker/build-push-action@v3 + with: + push: ${{ github.event_name != 'pull_request' }} + context: .github/containers + platforms: linux/amd64 + tags: ${{ steps.meta.outputs.tags }} + labels: ${{ steps.meta.outputs.labels }} diff --git a/.github/workflows/deploy-python.yml b/.github/workflows/deploy-python.yml index e8fbd4f7f..fe16ee485 100644 --- a/.github/workflows/deploy-python.yml +++ b/.github/workflows/deploy-python.yml @@ -54,10 +54,10 @@ jobs: CIBW_ENVIRONMENT: "LD_LIBRARY_PATH=/opt/rh/devtoolset-8/root/usr/lib64:/opt/rh/devtoolset-8/root/usr/lib:/opt/rh/devtoolset-8/root/usr/lib64/dyninst:/opt/rh/devtoolset-8/root/usr/lib/dyninst:/usr/local/lib64:/usr/local/lib" - name: Build Manylinux Wheels (Python 3) - uses: pypa/cibuildwheel@v2.1.3 + uses: pypa/cibuildwheel@v2.11.1 env: CIBW_PLATFORM: linux - CIBW_BUILD: cp37-manylinux_aarch64 cp38-manylinux_aarch64 cp39-manylinux_aarch64 cp310-manylinux_aarch64 cp37-manylinux_x86_64 cp38-manylinux_x86_64 cp39-manylinux_x86_64 cp310-manylinux_x86_64 + CIBW_BUILD: cp37-manylinux* cp38-manylinux* cp39-manylinux* cp310-manylinux* cp311-manylinux* CIBW_ARCHS: x86_64
aarch64 CIBW_ENVIRONMENT: "LD_LIBRARY_PATH=/opt/rh/devtoolset-8/root/usr/lib64:/opt/rh/devtoolset-8/root/usr/lib:/opt/rh/devtoolset-8/root/usr/lib64/dyninst:/opt/rh/devtoolset-8/root/usr/lib/dyninst:/usr/local/lib64:/usr/local/lib" diff --git a/.github/workflows/get-envs.py b/.github/workflows/get-envs.py index 576cbeb5c..4fcba6aa7 100755 --- a/.github/workflows/get-envs.py +++ b/.github/workflows/get-envs.py @@ -1,4 +1,18 @@ #!/usr/bin/env python3.8 +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + import fileinput import os diff --git a/.github/workflows/mega-linter.yml b/.github/workflows/mega-linter.yml index d378752dc..cd0930507 100644 --- a/.github/workflows/mega-linter.yml +++ b/.github/workflows/mega-linter.yml @@ -15,7 +15,7 @@ env: # Comment env block if you do not want to apply fixes APPLY_FIXES_MODE: commit # If APPLY_FIXES is used, defines if the fixes are directly committed (commit) or posted in a PR (pull_request) concurrency: - group: ${{ github.ref }}-${{ github.workflow }} + group: ${{ github.ref || github.run_id }}-${{ github.workflow }} cancel-in-progress: true jobs: diff --git a/.github/workflows/tests.yml b/.github/workflows/tests.yml index 892bfce9a..b2c221bcf 100644 --- a/.github/workflows/tests.yml +++ b/.github/workflows/tests.yml @@ -24,29 +24,69 @@ on: schedule: - cron: "0 15 * * *" +concurrency: + group: ${{ github.ref || github.run_id }}-${{ github.workflow }} + cancel-in-progress: true + jobs: - tests: # Aggregate job that provides a single check for workflow success - runs-on: ubuntu-latest + # Aggregate job that provides a single check for all tests passing + tests: + runs-on: ubuntu-20.04 needs: - python - - elasticsearchserver01 - elasticsearchserver07 + - elasticsearchserver08 - gearman - grpc - kafka - - libcurl - memcached - mongodb + - mssql - mysql - postgres - rabbitmq - redis + - rediscluster - solr steps: - name: Success run: echo "Success!" + # Combine and upload coverage data + coverage: + if: success() || failure() # Does not run on cancelled workflows + runs-on: ubuntu-20.04 + needs: + - tests + + steps: + - uses: actions/checkout@v3 + + - uses: actions/setup-python@v4 + with: + python-version: "3.10" + architecture: x64 + + - name: Download Coverage Artifacts + uses: actions/download-artifact@v3 + with: + path: ./ + + - name: Combine Coverage + run: | + pip install coverage + find . 
-name ".coverage.*" -exec mv {} ./ \; + coverage combine + coverage xml + + - name: Upload Coverage to Codecov + uses: codecov/codecov-action@v3 + with: + files: coverage.xml + fail_ci_if_error: true + + # Tests python: env: TOTAL_GROUPS: 20 @@ -78,17 +118,25 @@ jobs: 20, ] - runs-on: ubuntu-latest + runs-on: ubuntu-20.04 + container: + image: ghcr.io/newrelic/newrelic-python-agent-ci:latest + options: >- + --add-host=host.docker.internal:host-gateway timeout-minutes: 30 steps: - uses: actions/checkout@v3 - - uses: ./.github/actions/setup-python-matrix + + - name: Fetch git tags + run: | + git config --global --add safe.directory "$GITHUB_WORKSPACE" + git fetch --tags origin - name: Get Environments id: get-envs run: | - echo "::set-output name=envs::$(tox -l | grep "^${{ github.job }}\-" | ./.github/workflows/get-envs.py)" + echo "envs=$(tox -l | grep '^${{ github.job }}\-' | ./.github/workflows/get-envs.py)" >> $GITHUB_OUTPUT env: GROUP_NUMBER: ${{ matrix.group-number }} @@ -99,6 +147,13 @@ jobs: TOX_PARALLEL_NO_SPINNER: 1 PY_COLORS: 0 + - name: Upload Coverage Artifacts + uses: actions/upload-artifact@v3 + with: + name: coverage-${{ github.job }}-${{ strategy.job-index }} + path: ./**/.coverage.* + retention-days: 1 + grpc: env: TOTAL_GROUPS: 1 @@ -108,17 +163,25 @@ jobs: matrix: group-number: [1] - runs-on: ubuntu-latest + runs-on: ubuntu-20.04 + container: + image: ghcr.io/newrelic/newrelic-python-agent-ci:latest + options: >- + --add-host=host.docker.internal:host-gateway timeout-minutes: 30 steps: - uses: actions/checkout@v3 - - uses: ./.github/actions/setup-python-matrix + + - name: Fetch git tags + run: | + git config --global --add safe.directory "$GITHUB_WORKSPACE" + git fetch --tags origin - name: Get Environments id: get-envs run: | - echo "::set-output name=envs::$(tox -l | grep "^${{ github.job }}\-" | ./.github/workflows/get-envs.py)" + echo "envs=$(tox -l | grep '^${{ github.job }}\-' | ./.github/workflows/get-envs.py)" >> $GITHUB_OUTPUT env: GROUP_NUMBER: ${{ matrix.group-number }} @@ -129,32 +192,56 @@ jobs: TOX_PARALLEL_NO_SPINNER: 1 PY_COLORS: 0 - libcurl: + - name: Upload Coverage Artifacts + uses: actions/upload-artifact@v3 + with: + name: coverage-${{ github.job }}-${{ strategy.job-index }} + path: ./**/.coverage.* + retention-days: 1 + + postgres: env: - TOTAL_GROUPS: 1 + TOTAL_GROUPS: 2 strategy: fail-fast: false matrix: - group-number: [1] + group-number: [1, 2] - runs-on: ubuntu-latest + runs-on: ubuntu-20.04 + container: + image: ghcr.io/newrelic/newrelic-python-agent-ci:latest + options: >- + --add-host=host.docker.internal:host-gateway timeout-minutes: 30 + services: + postgres: + image: postgres:9 + env: + POSTGRES_PASSWORD: postgres + ports: + - 8080:5432 + - 8081:5432 + # Set health checks to wait until postgres has started + options: >- + --health-cmd pg_isready + --health-interval 10s + --health-timeout 5s + --health-retries 5 + steps: - uses: actions/checkout@v3 - - uses: ./.github/actions/setup-python-matrix - # Special case packages - - name: Install libcurl-dev + - name: Fetch git tags run: | - sudo apt-get update - sudo apt-get install libcurl4-openssl-dev + git config --global --add safe.directory "$GITHUB_WORKSPACE" + git fetch --tags origin - name: Get Environments id: get-envs run: | - echo "::set-output name=envs::$(tox -l | grep "^${{ github.job }}\-" | ./.github/workflows/get-envs.py)" + echo "envs=$(tox -l | grep '^${{ github.job }}\-' | ./.github/workflows/get-envs.py)" >> $GITHUB_OUTPUT env: GROUP_NUMBER: ${{ matrix.group-number }} @@ 
-165,41 +252,59 @@ jobs: TOX_PARALLEL_NO_SPINNER: 1 PY_COLORS: 0 - postgres: + - name: Upload Coverage Artifacts + uses: actions/upload-artifact@v3 + with: + name: coverage-${{ github.job }}-${{ strategy.job-index }} + path: ./**/.coverage.* + retention-days: 1 + + mssql: env: - TOTAL_GROUPS: 2 + TOTAL_GROUPS: 1 strategy: fail-fast: false matrix: - group-number: [1, 2] + group-number: [1] - runs-on: ubuntu-latest + runs-on: ubuntu-20.04 + container: + image: ghcr.io/newrelic/newrelic-python-agent-ci:latest + options: >- + --add-host=host.docker.internal:host-gateway timeout-minutes: 30 services: - postgres: - image: postgres:9 + mssql: + image: mcr.microsoft.com/azure-sql-edge:latest env: - POSTGRES_PASSWORD: postgres + MSSQL_USER: python_agent + MSSQL_PASSWORD: python_agent + MSSQL_SA_PASSWORD: "python_agent#1234" + ACCEPT_EULA: "Y" ports: - - 8080:5432 - - 8081:5432 - # Set health checks to wait until postgres has started + - 8080:1433 + - 8081:1433 + # Set health checks to wait until mssql has started options: >- - --health-cmd pg_isready + --health-cmd "/opt/mssql-tools/bin/sqlcmd -U SA -P $MSSQL_SA_PASSWORD -Q 'SELECT 1'" --health-interval 10s --health-timeout 5s --health-retries 5 steps: - uses: actions/checkout@v3 - - uses: ./.github/actions/setup-python-matrix + + - name: Fetch git tags + run: | + git config --global --add safe.directory "$GITHUB_WORKSPACE" + git fetch --tags origin - name: Get Environments id: get-envs run: | - echo "::set-output name=envs::$(tox -l | grep "^${{ github.job }}\-" | ./.github/workflows/get-envs.py)" + echo "envs=$(tox -l | grep '^${{ github.job }}\-' | ./.github/workflows/get-envs.py)" >> $GITHUB_OUTPUT env: GROUP_NUMBER: ${{ matrix.group-number }} @@ -210,6 +315,13 @@ jobs: TOX_PARALLEL_NO_SPINNER: 1 PY_COLORS: 0 + - name: Upload Coverage Artifacts + uses: actions/upload-artifact@v3 + with: + name: coverage-${{ github.job }}-${{ strategy.job-index }} + path: ./**/.coverage.* + retention-days: 1 + mysql: env: TOTAL_GROUPS: 2 @@ -219,7 +331,11 @@ jobs: matrix: group-number: [1, 2] - runs-on: ubuntu-latest + runs-on: ubuntu-20.04 + container: + image: ghcr.io/newrelic/newrelic-python-agent-ci:latest + options: >- + --add-host=host.docker.internal:host-gateway timeout-minutes: 30 services: @@ -242,12 +358,115 @@ jobs: steps: - uses: actions/checkout@v3 - - uses: ./.github/actions/setup-python-matrix + + - name: Fetch git tags + run: | + git config --global --add safe.directory "$GITHUB_WORKSPACE" + git fetch --tags origin + + - name: Get Environments + id: get-envs + run: | + echo "envs=$(tox -l | grep '^${{ github.job }}\-' | ./.github/workflows/get-envs.py)" >> $GITHUB_OUTPUT + env: + GROUP_NUMBER: ${{ matrix.group-number }} + + - name: Test + run: | + tox -vv -e ${{ steps.get-envs.outputs.envs }} -p auto + env: + TOX_PARALLEL_NO_SPINNER: 1 + PY_COLORS: 0 + + - name: Upload Coverage Artifacts + uses: actions/upload-artifact@v3 + with: + name: coverage-${{ github.job }}-${{ strategy.job-index }} + path: ./**/.coverage.* + retention-days: 1 + + rediscluster: + env: + TOTAL_GROUPS: 1 + + strategy: + fail-fast: false + matrix: + group-number: [1] + + runs-on: ubuntu-20.04 + container: + image: ghcr.io/newrelic/newrelic-python-agent-ci:latest + options: >- + --add-host=host.docker.internal:host-gateway + timeout-minutes: 30 + + services: + redis1: + image: hmstepanek/redis-cluster-node:1.0.0 + ports:
+ - 6380:6379 + - 16380:16379 + options: >- + --add-host=host.docker.internal:host-gateway + + redis3: + image: hmstepanek/redis-cluster-node:1.0.0 + ports: + - 6381:6379 + - 16381:16379 + options: >- + --add-host=host.docker.internal:host-gateway + + redis4: + image: hmstepanek/redis-cluster-node:1.0.0 + ports: + - 6382:6379 + - 16382:16379 + options: >- + --add-host=host.docker.internal:host-gateway + + redis5: + image: hmstepanek/redis-cluster-node:1.0.0 + ports: + - 6383:6379 + - 16383:16379 + options: >- + --add-host=host.docker.internal:host-gateway + + redis6: + image: hmstepanek/redis-cluster-node:1.0.0 + ports: + - 6384:6379 + - 16384:16379 + options: >- + --add-host=host.docker.internal:host-gateway + + cluster-setup: + image: hmstepanek/redis-cluster:1.0.0 + options: >- + --add-host=host.docker.internal:host-gateway + + steps: + - uses: actions/checkout@v3 + + - name: Fetch git tags + run: | + git config --global --add safe.directory "$GITHUB_WORKSPACE" + git fetch --tags origin - name: Get Environments id: get-envs run: | - echo "::set-output name=envs::$(tox -l | grep "^${{ github.job }}\-" | ./.github/workflows/get-envs.py)" + echo "envs=$(tox -l | grep '^${{ github.job }}\-' | ./.github/workflows/get-envs.py)" >> $GITHUB_OUTPUT env: GROUP_NUMBER: ${{ matrix.group-number }} @@ -258,6 +477,13 @@ jobs: TOX_PARALLEL_NO_SPINNER: 1 PY_COLORS: 0 + - name: Upload Coverage Artifacts + uses: actions/upload-artifact@v3 + with: + name: coverage-${{ github.job }}-${{ strategy.job-index }} + path: ./**/.coverage.* + retention-days: 1 + redis: env: TOTAL_GROUPS: 2 @@ -267,7 +493,11 @@ jobs: matrix: group-number: [1, 2] - runs-on: ubuntu-latest + runs-on: ubuntu-20.04 + container: + image: ghcr.io/newrelic/newrelic-python-agent-ci:latest + options: >- + --add-host=host.docker.internal:host-gateway timeout-minutes: 30 services: @@ -285,12 +515,16 @@ jobs: steps: - uses: actions/checkout@v3 - - uses: ./.github/actions/setup-python-matrix + + - name: Fetch git tags + run: | + git config --global --add safe.directory "$GITHUB_WORKSPACE" + git fetch --tags origin - name: Get Environments id: get-envs run: | - echo "::set-output name=envs::$(tox -l | grep "^${{ github.job }}\-" | ./.github/workflows/get-envs.py)" + echo "envs=$(tox -l | grep '^${{ github.job }}\-' | ./.github/workflows/get-envs.py)" >> $GITHUB_OUTPUT env: GROUP_NUMBER: ${{ matrix.group-number }} @@ -301,6 +535,13 @@ jobs: TOX_PARALLEL_NO_SPINNER: 1 PY_COLORS: 0 + - name: Upload Coverage Artifacts + uses: actions/upload-artifact@v3 + with: + name: coverage-${{ github.job }}-${{ strategy.job-index }} + path: ./**/.coverage.* + retention-days: 1 + solr: env: TOTAL_GROUPS: 1 @@ -310,7 +551,11 @@ jobs: matrix: group-number: [1] - runs-on: ubuntu-latest + runs-on: ubuntu-20.04 + container: + image: ghcr.io/newrelic/newrelic-python-agent-ci:latest + options: >- + --add-host=host.docker.internal:host-gateway timeout-minutes: 30 services: @@ -330,12 +575,16 @@ jobs: steps: - uses: actions/checkout@v3 - - uses: ./.github/actions/setup-python-matrix + + - name: Fetch git tags + run: | + git config --global --add safe.directory "$GITHUB_WORKSPACE" + git fetch --tags origin - name: Get Environments id: get-envs run: | - echo "::set-output name=envs::$(tox -l | grep "^${{ github.job }}\-" | ./.github/workflows/get-envs.py)" + echo "envs=$(tox -l | grep '^${{ github.job }}\-' | ./.github/workflows/get-envs.py)" >> $GITHUB_OUTPUT env: GROUP_NUMBER: ${{ matrix.group-number }} @@ -346,6 +595,13 @@ jobs: TOX_PARALLEL_NO_SPINNER: 1 PY_COLORS: 0 + - 
name: Upload Coverage Artifacts + uses: actions/upload-artifact@v3 + with: + name: coverage-${{ github.job }}-${{ strategy.job-index }} + path: ./**/.coverage.* + retention-days: 1 + memcached: env: TOTAL_GROUPS: 2 @@ -355,7 +611,11 @@ jobs: matrix: group-number: [1, 2] - runs-on: ubuntu-latest + runs-on: ubuntu-20.04 + container: + image: ghcr.io/newrelic/newrelic-python-agent-ci:latest + options: >- + --add-host=host.docker.internal:host-gateway timeout-minutes: 30 services: @@ -373,12 +633,16 @@ jobs: steps: - uses: actions/checkout@v3 - - uses: ./.github/actions/setup-python-matrix + + - name: Fetch git tags + run: | + git config --global --add safe.directory "$GITHUB_WORKSPACE" + git fetch --tags origin - name: Get Environments id: get-envs run: | - echo "::set-output name=envs::$(tox -l | grep "^${{ github.job }}\-" | ./.github/workflows/get-envs.py)" + echo "envs=$(tox -l | grep '^${{ github.job }}\-' | ./.github/workflows/get-envs.py)" >> $GITHUB_OUTPUT env: GROUP_NUMBER: ${{ matrix.group-number }} @@ -389,6 +653,13 @@ jobs: TOX_PARALLEL_NO_SPINNER: 1 PY_COLORS: 0 + - name: Upload Coverage Artifacts + uses: actions/upload-artifact@v3 + with: + name: coverage-${{ github.job }}-${{ strategy.job-index }} + path: ./**/.coverage.* + retention-days: 1 + rabbitmq: env: TOTAL_GROUPS: 1 @@ -398,7 +669,11 @@ jobs: matrix: group-number: [1] - runs-on: ubuntu-latest + runs-on: ubuntu-20.04 + container: + image: ghcr.io/newrelic/newrelic-python-agent-ci:latest + options: >- + --add-host=host.docker.internal:host-gateway timeout-minutes: 30 services: @@ -417,12 +692,16 @@ jobs: steps: - uses: actions/checkout@v3 - - uses: ./.github/actions/setup-python-matrix + + - name: Fetch git tags + run: | + git config --global --add safe.directory "$GITHUB_WORKSPACE" + git fetch --tags origin - name: Get Environments id: get-envs run: | - echo "::set-output name=envs::$(tox -l | grep "^${{ github.job }}\-" | ./.github/workflows/get-envs.py)" + echo "envs=$(tox -l | grep '^${{ github.job }}\-' | ./.github/workflows/get-envs.py)" >> $GITHUB_OUTPUT env: GROUP_NUMBER: ${{ matrix.group-number }} @@ -433,16 +712,27 @@ jobs: TOX_PARALLEL_NO_SPINNER: 1 PY_COLORS: 0 + - name: Upload Coverage Artifacts + uses: actions/upload-artifact@v3 + with: + name: coverage-${{ github.job }}-${{ strategy.job-index }} + path: ./**/.coverage.* + retention-days: 1 + kafka: env: - TOTAL_GROUPS: 2 + TOTAL_GROUPS: 4 strategy: fail-fast: false matrix: - group-number: [1, 2] + group-number: [1, 2, 3, 4] - runs-on: ubuntu-latest + runs-on: ubuntu-20.04 + container: + image: ghcr.io/newrelic/newrelic-python-agent-ci:latest + options: >- + --add-host=host.docker.internal:host-gateway timeout-minutes: 30 services: @@ -458,46 +748,47 @@ jobs: image: bitnami/kafka:3.2 ports: - 8080:8080 - - 8081:8081 + - 8082:8082 + - 8083:8083 env: + KAFKA_ENABLE_KRAFT: no ALLOW_PLAINTEXT_LISTENER: yes KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181 KAFKA_CFG_AUTO_CREATE_TOPICS_ENABLE: true - KAFKA_CFG_LISTENERS: L1://:8080,L2://:8081 - KAFKA_CFG_ADVERTISED_LISTENERS: L1://127.0.0.1:8080,L2://kafka:8081, - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP: L1:PLAINTEXT,L2:PLAINTEXT - KAFKA_CFG_INTER_BROKER_LISTENER_NAME: L2 + KAFKA_CFG_LISTENERS: L1://:8082,L2://:8083,L3://:8080 + KAFKA_CFG_ADVERTISED_LISTENERS: L1://host.docker.internal:8082,L2://host.docker.internal:8083,L3://kafka:8080 + KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP: L1:PLAINTEXT,L2:PLAINTEXT,L3:PLAINTEXT + KAFKA_CFG_INTER_BROKER_LISTENER_NAME: L3 steps: - uses: actions/checkout@v3 - - uses: 
./.github/actions/setup-python-matrix - # Special case packages - - name: Install librdkafka-dev + - name: Fetch git tags run: | - # Use lsb-release to find the codename of Ubuntu to use to install the correct library name - sudo apt-get update - sudo ln -fs /usr/share/zoneinfo/America/Los_Angeles /etc/localtime - sudo apt-get install -y wget gnupg2 software-properties-common - sudo wget -qO - https://packages.confluent.io/deb/7.2/archive.key | sudo apt-key add - - sudo add-apt-repository "deb https://packages.confluent.io/clients/deb $(lsb_release -cs) main" - sudo apt-get update - sudo apt-get install -y librdkafka-dev/$(lsb_release -c | cut -f 2) + git config --global --add safe.directory "$GITHUB_WORKSPACE" + git fetch --tags origin - name: Get Environments id: get-envs run: | - echo "::set-output name=envs::$(tox -l | grep "^${{ github.job }}\-" | ./.github/workflows/get-envs.py)" + echo "envs=$(tox -l | grep '^${{ github.job }}\-' | ./.github/workflows/get-envs.py)" >> $GITHUB_OUTPUT env: GROUP_NUMBER: ${{ matrix.group-number }} - name: Test run: | - tox -vv -e ${{ steps.get-envs.outputs.envs }} + tox -vv -e ${{ steps.get-envs.outputs.envs }} -p auto env: TOX_PARALLEL_NO_SPINNER: 1 PY_COLORS: 0 + - name: Upload Coverage Artifacts + uses: actions/upload-artifact@v3 + with: + name: coverage-${{ github.job }}-${{ strategy.job-index }} + path: ./**/.coverage.* + retention-days: 1 + mongodb: env: TOTAL_GROUPS: 1 @@ -507,7 +798,11 @@ jobs: matrix: group-number: [1] - runs-on: ubuntu-latest + runs-on: ubuntu-20.04 + container: + image: ghcr.io/newrelic/newrelic-python-agent-ci:latest + options: >- + --add-host=host.docker.internal:host-gateway timeout-minutes: 30 services: @@ -525,12 +820,16 @@ jobs: steps: - uses: actions/checkout@v3 - - uses: ./.github/actions/setup-python-matrix + + - name: Fetch git tags + run: | + git config --global --add safe.directory "$GITHUB_WORKSPACE" + git fetch --tags origin - name: Get Environments id: get-envs run: | - echo "::set-output name=envs::$(tox -l | grep "^${{ github.job }}\-" | ./.github/workflows/get-envs.py)" + echo "envs=$(tox -l | grep '^${{ github.job }}\-' | ./.github/workflows/get-envs.py)" >> $GITHUB_OUTPUT env: GROUP_NUMBER: ${{ matrix.group-number }} @@ -541,7 +840,14 @@ jobs: TOX_PARALLEL_NO_SPINNER: 1 PY_COLORS: 0 - elasticsearchserver01: + - name: Upload Coverage Artifacts + uses: actions/upload-artifact@v3 + with: + name: coverage-${{ github.job }}-${{ strategy.job-index }} + path: ./**/.coverage.* + retention-days: 1 + + elasticsearchserver07: env: TOTAL_GROUPS: 1 @@ -550,12 +856,16 @@ jobs: matrix: group-number: [1] - runs-on: ubuntu-latest + runs-on: ubuntu-20.04 + container: + image: ghcr.io/newrelic/newrelic-python-agent-ci:latest + options: >- + --add-host=host.docker.internal:host-gateway timeout-minutes: 30 services: - es01: - image: elasticsearch:1.4.4 + elasticsearch: + image: elasticsearch:7.17.8 env: "discovery.type": "single-node" ports: @@ -570,12 +880,16 @@ jobs: steps: - uses: actions/checkout@v3 - - uses: ./.github/actions/setup-python-matrix + + - name: Fetch git tags + run: | + git config --global --add safe.directory "$GITHUB_WORKSPACE" + git fetch --tags origin - name: Get Environments id: get-envs run: | - echo "::set-output name=envs::$(tox -l | grep "^${{ github.job }}\-" | ./.github/workflows/get-envs.py)" + echo "envs=$(tox -l | grep '^${{ github.job }}\-' | ./.github/workflows/get-envs.py)" >> $GITHUB_OUTPUT env: GROUP_NUMBER: ${{ matrix.group-number }} @@ -586,7 +900,14 @@ jobs: TOX_PARALLEL_NO_SPINNER: 1 
PY_COLORS: 0 - elasticsearchserver07: + - name: Upload Coverage Artifacts + uses: actions/upload-artifact@v3 + with: + name: coverage-${{ github.job }}-${{ strategy.job-index }} + path: ./**/.coverage.* + retention-days: 1 + + elasticsearchserver08: env: TOTAL_GROUPS: 1 @@ -595,13 +916,18 @@ jobs: matrix: group-number: [1] - runs-on: ubuntu-latest + runs-on: ubuntu-20.04 + container: + image: ghcr.io/newrelic/newrelic-python-agent-ci:latest + options: >- + --add-host=host.docker.internal:host-gateway timeout-minutes: 30 services: - es01: - image: elasticsearch:7.13.2 + elasticsearch: + image: elasticsearch:8.6.0 env: + "xpack.security.enabled": "false" "discovery.type": "single-node" ports: - 8080:9200 @@ -615,12 +941,16 @@ jobs: steps: - uses: actions/checkout@v3 - - uses: ./.github/actions/setup-python-matrix + + - name: Fetch git tags + run: | + git config --global --add safe.directory "$GITHUB_WORKSPACE" + git fetch --tags origin - name: Get Environments id: get-envs run: | - echo "::set-output name=envs::$(tox -l | grep "^${{ github.job }}\-" | ./.github/workflows/get-envs.py)" + echo "envs=$(tox -l | grep '^${{ github.job }}\-' | ./.github/workflows/get-envs.py)" >> $GITHUB_OUTPUT env: GROUP_NUMBER: ${{ matrix.group-number }} @@ -631,6 +961,13 @@ jobs: TOX_PARALLEL_NO_SPINNER: 1 PY_COLORS: 0 + - name: Upload Coverage Artifacts + uses: actions/upload-artifact@v3 + with: + name: coverage-${{ github.job }}-${{ strategy.job-index }} + path: ./**/.coverage.* + retention-days: 1 + gearman: env: TOTAL_GROUPS: 1 @@ -640,14 +977,18 @@ jobs: matrix: group-number: [1] - runs-on: ubuntu-latest + runs-on: ubuntu-20.04 + container: + image: ghcr.io/newrelic/newrelic-python-agent-ci:latest + options: >- + --add-host=host.docker.internal:host-gateway timeout-minutes: 30 services: gearman: image: artefactual/gearmand ports: - - 4730:4730 + - 8080:4730 # Set health checks to wait until gearman has started options: >- --health-cmd "(echo status ; sleep 0.1) | nc 127.0.0.1 4730 -w 1" @@ -657,12 +998,16 @@ jobs: steps: - uses: actions/checkout@v3 - - uses: ./.github/actions/setup-python-matrix + + - name: Fetch git tags + run: | + git config --global --add safe.directory "$GITHUB_WORKSPACE" + git fetch --tags origin - name: Get Environments id: get-envs run: | - echo "::set-output name=envs::$(tox -l | grep "^${{ github.job }}\-" | ./.github/workflows/get-envs.py)" + echo "envs=$(tox -l | grep '^${{ github.job }}\-' | ./.github/workflows/get-envs.py)" >> $GITHUB_OUTPUT env: GROUP_NUMBER: ${{ matrix.group-number }} @@ -672,3 +1017,10 @@ jobs: env: TOX_PARALLEL_NO_SPINNER: 1 PY_COLORS: 0 + + - name: Upload Coverage Artifacts + uses: actions/upload-artifact@v3 + with: + name: coverage-${{ github.job }}-${{ strategy.job-index }} + path: ./**/.coverage.* + retention-days: 1 diff --git a/.gitignore b/.gitignore index 8226b0e97..d4550713f 100644 --- a/.gitignore +++ b/.gitignore @@ -1,3 +1,6 @@ +.DS_Store +.DS_Store/ + # Linter megalinter-reports/ diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst index af5082362..12081d1ee 100644 --- a/CONTRIBUTING.rst +++ b/CONTRIBUTING.rst @@ -1,54 +1,85 @@ -Contributing to the Python Agent -================================= +################################## + Contributing to the Python Agent +################################## -Thanks for your interest in contributing to the ``New Relic Python Agent``! We look forward to engaging with you. +Thanks for your interest in contributing to the ``New Relic Python +Agent``! We look forward to engaging with you. 
-How to Contribute ------------------ +******************* + How to Contribute +******************* Contributions are always welcome. Before contributing please read the -`code of conduct `__ and `search the issue tracker <../../issues>`__; your issue may have already been discussed or fixed in `main`. To contribute, `fork `__ this repository, commit your changes, and `send a Pull Request `__. - -Note that our `code of conduct `__ applies to all platforms and venues related to this project; please follow it in all your interactions with the project and its participants. - -How to Get Help or Ask Questions --------------------------------- +`code of conduct +`__ +and `search the issue tracker <../../issues>`__; your issue may have +already been discussed or fixed in `main`. To contribute, `fork +`__ this repository, +commit your changes, and `send a Pull Request +`__. + +Note that our `code of conduct +`__ +applies to all platforms and venues related to this project; please +follow it in all your interactions with the project and its +participants. + +********************************** + How to Get Help or Ask Questions +********************************** Do you have questions or are you experiencing unexpected behaviors after modifying this Open Source Software? Please engage with the “Build on -New Relic” space in the `Explorers -Hub `__, -New Relic’s Forum. Posts are publicly viewable by anyone, please do not +New Relic” space in the `Explorers Hub +`__, +New Relic's Forum. Posts are publicly viewable by anyone, please do not include PII or sensitive information in your forum post. -Contributor License Agreement (“CLA”) -------------------------------------- - -We’d love to get your contributions to improve the Python Agent! Keep in mind that when you submit your Pull Request, you'll need to sign the CLA via the click-through using CLA-Assistant. You only have to sign the CLA one time per project. If you'd like to execute our corporate CLA, or if you have any questions, please drop us an email at opensource@newrelic.com. - -For more information about CLAs, please check out Alex Russell’s excellent post, -`Why Do I Need to Sign This? `__. - -Feature Requests ----------------- - -Feature requests should be submitted in the `Issue tracker <../../issues>`__, with a description of the expected behavior & use case, where they’ll remain closed until sufficient interest, `e.g. :+1: reactions `__, has been `shown by the community <../../issues?q=label%3A%22votes+needed%22+sort%3Areactions-%2B1-desc>`__. Before submitting an Issue, please search for similar ones in the -`closed issues <../../issues?q=is%3Aissue+is%3Aclosed+label%3Aenhancement>`__. - -Filing Issues & Bug Reports ---------------------------- +*************************************** + Contributor License Agreement (“CLA”) +*************************************** + +We'd love to get your contributions to improve the Python Agent! Keep in +mind that when you submit your Pull Request, you'll need to sign the CLA +via the click-through using CLA-Assistant. You only have to sign the CLA +one time per project. If you'd like to execute our corporate CLA, or if +you have any questions, please drop us an email at +opensource@newrelic.com. + +For more information about CLAs, please check out Alex Russell's +excellent post, `Why Do I Need to Sign This? +`__. 
+ +****************** + Feature Requests +****************** + +Feature requests should be submitted in the `Issue tracker +<../../issues>`__, with a description of the expected behavior & use +case, where they'll remain closed until sufficient interest, `e.g. :+1: +reactions +`__, +has been `shown by the community +<../../issues?q=label%3A%22votes+needed%22+sort%3Areactions-%2B1-desc>`__. +Before submitting an Issue, please search for similar ones in the +`closed issues +<../../issues?q=is%3Aissue+is%3Aclosed+label%3Aenhancement>`__. + +***************************** + Filing Issues & Bug Reports +***************************** We use GitHub issues to track public issues and bugs. If possible, please provide a link to an example app or gist that reproduces the issue. When filing an issue, please ensure your description is clear and includes the following information. -* Project version (ex: 1.4.0) -* Custom configurations (ex: flag=true) -* Any modifications made to the Python Agent +- Project version (ex: 1.4.0) +- Custom configurations (ex: flag=true) +- Any modifications made to the Python Agent A note about vulnerabilities -^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +============================ New Relic is committed to the security of our customers and their data. We believe that providing coordinated disclosure by security researchers @@ -56,12 +87,13 @@ and engaging with the security community are important means to achieve our security goals. If you believe you have found a security vulnerability in this project -or any of New Relic's products or websites, we welcome and greatly -appreciate you reporting it to New Relic through -`HackerOne `__. +or any of New Relic's products or websites, we welcome and greatly +appreciate you reporting it to New Relic through `HackerOne +`__. -Setting Up Your Environment ---------------------------- +***************************** + Setting Up Your Environment +***************************** This Open Source Software can be used in a large number of environments, all of which have their own quirks and best practices. As such, while we @@ -69,8 +101,95 @@ are happy to provide documentation and assistance for unmodified Open Source Software, we cannot provide support for your specific environment. -Pull Request Guidelines ------------------------ +******************************* + Developing Inside a Container +******************************* + +To avoid the issues involved with setting up a local environment, +consider using our prebuilt development container to easily create an +environment on demand with a wide selection of Python versions +installed. This also comes with the `tox +`__ tool (see Testing Guidelines) and a +few packages preinstalled. + +While we cannot provide direct support in setting up your environment to +work with this container, we develop it in the open and provide this +documentation to help reduce the setup burden on new contributors. + +Prerequisites: +============== + +#. Install `Docker `__ for your local operating + system. + +#. Log in to the `GitHub Container Registry + `__ + through Docker. + +#. Install either: - `VS Code `__ onto your local + system (recommended). + + - The `Dev Container CLI + `__ in your terminal. + (Requires a local copy of `npm + `__.) + +Steps for VS Code: +================== + +#. Ensure Docker is running. + +#. Install the `VS Code Extension for Dev Containers + `__ + into VS Code. + +#. 
In VS Code, open the command palette (Ctrl-Shift-P on Windows/Linux + or Cmd-Shift-P on Mac) and search for and run "Dev Containers: + Rebuild and Reopen in Container". + +#. Wait for the container to build and start. This may take a long time + to pull the first time the container is run; subsequent runs should + be faster thanks to caching. + +#. To update your container, open the command palette and run "Dev + Containers: Rebuild Without Cache and Reopen in Container". + +Steps for Command Line Editor Users (vim, etc.): +================================================ + +#. Ensure Docker is running. + +#. From the root of this repository, run ``devcontainer up + --workspace-folder=.`` to start the container. The running container + ID will be displayed, which is useful for subsequent steps. + +#. To gain shell access to the container, run ``docker exec -it + <container_id> /bin/bash``. Alternative shells include ``zsh`` and + ``fish``. + +#. Navigate to the ``/workspaces`` folder to find your source code. + +#. To stop the container, run ``exit`` on any open shells and then run + ``docker stop <container_id>``. ``docker ps`` may be helpful for + finding the ID if you've lost it. + +Personalizing Your Container: +============================= + +#. If you use a dotfiles repository (such as `chezmoi + `__), you can configure your container to + clone and install your dotfiles using `VS Code dotfile settings + `__. + +#. To install extra packages and features, you can edit your local copy + of the .devcontainer/devcontainer.json file to use specific `Dev + Container Features `__. A few common + needs are already included but commented out. + +************************* + Pull Request Guidelines +************************* Before we can accept a pull request, you must sign our `Contributor Licensing Agreement <#contributor-license-agreement-cla>`__, if you have @@ -79,28 +198,44 @@ same Apache 2.0 license as we use for this project in general. Minimally, the `test suite <#testing-guidelines>`__ must pass for us to accept a PR. Ideally, we would love it if you also added appropriate -tests if you’re implementing a feature! +tests if you're implementing a feature! -Please note that integration tests will be run internally before contributions are accepted. +Please note that integration tests will be run internally before +contributions are accepted. Additionally: -1. Ensure any install or build dependencies are removed before the end of the layer when doing a build. -2. Increase the version numbers in any examples files and the README.md to the new version that this Pull Request would represent. The versioning scheme we use is `SemVer `__. -3. You may merge the Pull Request in once you have the sign-off of two other developers, or if you do not have permission to do that, you may request the second reviewer to merge it for you. +#. Ensure any install or build dependencies are removed before the end + of the layer when doing a build. + +#. Increase the version numbers in any examples files and the README.md + to the new version that this Pull Request would represent. The + versioning scheme we use is `SemVer `__. + +#. You may merge the Pull Request in once you have the sign-off of two + other developers, or if you do not have permission to do that, you + may request the second reviewer to merge it for you. -Testing Guidelines ------------------- +******************** + Testing Guidelines +******************** The Python Agent uses `tox `__ for -testing. 
The repository uses tests in tests/. -You can run these tests by entering the `tests/ `__ directory and then entering the directory of the tests you want to run. Then, run the following command: +You can run these tests by entering the tests/ directory and then +entering the directory of the tests you want to run. Then, run the +following command: -tox -c tox.ini -e [test environment] +``tox -c tox.ini -e [test environment]`` -Slack ------ +******* + Slack +******* -We host a public Slack with a dedicated channel for contributors and maintainers of open source projects hosted by New Relic. If you are contributing to this project, you're welcome to request access to the #oss-contributors channel in the newrelicusers.slack.com workspace. To request access, see https://newrelicusers-signup.herokuapp.com/. +We host a public Slack with a dedicated channel for contributors and +maintainers of open source projects hosted by New Relic. If you are +contributing to this project, you're welcome to request access to the +#oss-contributors channel in the newrelicusers.slack.com workspace. To +request access, please use this `link +<https://newrelicusers-signup.herokuapp.com/>`__. diff --git a/MANIFEST.in b/MANIFEST.in index 0a75ce752..bf746435c 100644 --- a/MANIFEST.in +++ b/MANIFEST.in @@ -8,3 +8,4 @@ include newrelic/common/cacert.pem include newrelic/packages/wrapt/LICENSE include newrelic/packages/wrapt/README include newrelic/packages/urllib3/LICENSE.txt +include newrelic/packages/isort/LICENSE diff --git a/THIRD_PARTY_NOTICES.md b/THIRD_PARTY_NOTICES.md index 3662484f6..a1dd7e07d 100644 --- a/THIRD_PARTY_NOTICES.md +++ b/THIRD_PARTY_NOTICES.md @@ -14,7 +14,16 @@ Copyright (c) Django Software Foundation and individual contributors. Distributed under the following license(s): - * [The BSD 3-Clause License](https://opensource.org/licenses/BSD-3-Clause) +* [The BSD 3-Clause License](https://opensource.org/licenses/BSD-3-Clause) + + +## [isort](https://pypi.org/project/isort) + +Copyright (c) 2013 Timothy Edmund Crosley + +Distributed under the following license(s): + +* [The MIT License](http://opensource.org/licenses/MIT) ## [six](https://pypi.org/project/six) @@ -23,7 +32,7 @@ Copyright (c) 2010-2013 Benjamin Peterson Distributed under the following license(s): - * [The MIT License](http://opensource.org/licenses/MIT) +* [The MIT License](http://opensource.org/licenses/MIT) ## [time.monotonic](newrelic/common/_monotonic.c) @@ -32,7 +41,7 @@ Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, Distributed under the following license(s): - * [Python Software Foundation](https://docs.python.org/3/license.html) +* [Python Software Foundation](https://docs.python.org/3/license.html) ## [urllib3](https://pypi.org/project/urllib3) @@ -41,7 +50,7 @@ Copyright (c) 2008-2019 Andrey Petrov and contributors (see CONTRIBUTORS.txt) Distributed under the following license(s): - * [The MIT License](http://opensource.org/licenses/MIT) +* [The MIT License](http://opensource.org/licenses/MIT) ## [wrapt](https://pypi.org/project/wrapt) @@ -51,5 +60,5 @@ All rights reserved. 
Distributed under the following license(s): - * [The BSD 2-Clause License](http://opensource.org/licenses/BSD-2-Clause) +* [The BSD 2-Clause License](http://opensource.org/licenses/BSD-2-Clause) diff --git a/codecov.yml b/codecov.yml new file mode 100644 index 000000000..6ca30f640 --- /dev/null +++ b/codecov.yml @@ -0,0 +1,25 @@ +ignore: + - "newrelic/hooks/component_sentry.py" + - "newrelic/admin/*" + - "newrelic/console.py" + - "newrelic/hooks/adapter_flup.py" + - "newrelic/hooks/adapter_meinheld.py" + - "newrelic/hooks/adapter_paste.py" + - "newrelic/hooks/component_piston.py" + - "newrelic/hooks/database_oursql.py" + - "newrelic/hooks/database_psycopg2ct.py" + - "newrelic/hooks/datastore_aioredis.py" + - "newrelic/hooks/datastore_aredis.py" + - "newrelic/hooks/datastore_motor.py" + - "newrelic/hooks/datastore_pyelasticsearch.py" + - "newrelic/hooks/datastore_umemcache.py" + - "newrelic/hooks/external_dropbox.py" + - "newrelic/hooks/external_facepy.py" + - "newrelic/hooks/external_pywapi.py" + - "newrelic/hooks/external_xmlrpclib.py" + - "newrelic/hooks/framework_pylons.py" + - "newrelic/hooks/framework_web2py.py" + - "newrelic/hooks/framework_webpy.py" + - "newrelic/hooks/middleware_weberror.py" + - "newrelic/packages/*" + - "newrelic/packages/**/*" diff --git a/newrelic/__init__.py b/newrelic/__init__.py index b142f593e..2a5828f8f 100644 --- a/newrelic/__init__.py +++ b/newrelic/__init__.py @@ -13,12 +13,13 @@ # limitations under the License. import os.path + THIS_DIR = os.path.dirname(__file__) try: - with open(os.path.join(THIS_DIR, 'version.txt'), 'r') as f: + with open(os.path.join(THIS_DIR, "version.txt"), "r") as f: version = f.read() except: - version = '0.0.0.0' + version = "0.0.0" -version_info = list(map(int, version.split('.'))) +version_info = list(map(int, version.split("."))) diff --git a/newrelic/admin/validate_config.py b/newrelic/admin/validate_config.py index a842df7be..ac25b715e 100644 --- a/newrelic/admin/validate_config.py +++ b/newrelic/admin/validate_config.py @@ -25,17 +25,15 @@ def _run_validation_test(): from newrelic.api.error_trace import error_trace from newrelic.api.external_trace import external_trace from newrelic.api.function_trace import function_trace - from newrelic.api.transaction import add_custom_parameter from newrelic.api.time_trace import notice_error + from newrelic.api.transaction import add_custom_attribute from newrelic.api.wsgi_application import wsgi_application - @external_trace(library='test', - url='http://localhost/test', method='GET') + @external_trace(library="test", url="http://localhost/test", method="GET") def _external1(): time.sleep(0.1) - @function_trace(label='label', - params={'fun-key-1': '1', 'fun-key-2': 2, 'fun-key-3': 3.0}) + @function_trace(label="label", params={"fun-key-1": "1", "fun-key-2": 2, "fun-key-3": 3.0}) def _function1(): _external1() @@ -47,33 +45,29 @@ def _function2(): @error_trace() @function_trace() def _function3(): - add_custom_parameter('txn-key-1', 1) + add_custom_attribute("txn-key-1", 1) _function4() - raise RuntimeError('This is a test error and can be ignored.') + raise RuntimeError("This is a test error and can be ignored.") @function_trace() def _function4(params=None, application=None): try: _function5() except: - notice_error(attributes=(params or { - 'err-key-2': 2, 'err-key-3': 3.0}), - application=application) + notice_error(attributes=(params or {"err-key-2": 2, "err-key-3": 3.0}), application=application) @function_trace() def _function5(): - raise NotImplementedError( - 'This is a 
test error and can be ignored.') + raise NotImplementedError("This is a test error and can be ignored.") @wsgi_application() def _wsgi_application(environ, start_response): - status = '200 OK' - output = 'Hello World!' + status = "200 OK" + output = "Hello World!" - response_headers = [('Content-type', 'text/plain'), - ('Content-Length', str(len(output)))] + response_headers = [("Content-type", "text/plain"), ("Content-Length", str(len(output)))] start_response(status, response_headers) for i in range(10): @@ -107,16 +101,15 @@ def _background_task(): def _start_response(*args): pass - _environ = { 'SCRIPT_NAME': '', 'PATH_INFO': '/test', - 'QUERY_STRING': 'key=value' } + _environ = {"SCRIPT_NAME": "", "PATH_INFO": "/test", "QUERY_STRING": "key=value"} _iterable = _wsgi_application(_environ, _start_response) _iterable.close() _background_task() - _function4(params={'err-key-4': 4, 'err-key-5': 5.0}, - application=application()) + _function4(params={"err-key-4": 4, "err-key-5": 5.0}, application=application()) + _user_message = """ Running Python agent test. @@ -136,19 +129,23 @@ def _start_response(*args): data to the New Relic UI. """ -@command('validate-config', 'config_file [log_file]', -"""Validates the syntax of . Also tests connectivity to New + +@command( + "validate-config", + "config_file [log_file]", + """Validates the syntax of . Also tests connectivity to New Relic core application by connecting to the account corresponding to the license key listed in the configuration file, and reporting test data under -the application name 'Python Agent Test'.""") +the application name 'Python Agent Test'.""", +) def validate_config(args): + import logging import os import sys - import logging import time if len(args) == 0: - usage('validate-config') + usage("validate-config") sys.exit(1) from newrelic.api.application import register_application @@ -158,7 +155,7 @@ def validate_config(args): if len(args) >= 2: log_file = args[1] else: - log_file = '/tmp/python-agent-test.log' + log_file = "/tmp/python-agent-test.log" # nosec log_level = logging.DEBUG @@ -168,21 +165,20 @@ def validate_config(args): pass config_file = args[0] - environment = os.environ.get('NEW_RELIC_ENVIRONMENT') + environment = os.environ.get("NEW_RELIC_ENVIRONMENT") - if config_file == '-': - config_file = os.environ.get('NEW_RELIC_CONFIG_FILE') + if config_file == "-": + config_file = os.environ.get("NEW_RELIC_CONFIG_FILE") - initialize(config_file, environment, ignore_errors=False, - log_file=log_file, log_level=log_level) + initialize(config_file, environment, ignore_errors=False, log_file=log_file, log_level=log_level) _logger = logging.getLogger(__name__) - _logger.debug('Starting agent validation.') + _logger.debug("Starting agent validation.") _settings = global_settings() - app_name = os.environ.get('NEW_RELIC_TEST_APP_NAME', 'Python Agent Test') + app_name = os.environ.get("NEW_RELIC_TEST_APP_NAME", "Python Agent Test") _settings.app_name = app_name _settings.transaction_tracer.transaction_threshold = 0 @@ -194,17 +190,17 @@ def validate_config(args): print(_user_message % dict(app_name=app_name, log_file=log_file)) - _logger.debug('Register test application.') + _logger.debug("Register test application.") - _logger.debug('Collector host is %r.', _settings.host) - _logger.debug('Collector port is %r.', _settings.port) + _logger.debug("Collector host is %r.", _settings.host) + _logger.debug("Collector port is %r.", _settings.port) - _logger.debug('Proxy scheme is %r.', _settings.proxy_scheme) - 
_logger.debug('Proxy host is %r.', _settings.proxy_host) - _logger.debug('Proxy port is %r.', _settings.proxy_port) - _logger.debug('Proxy user is %r.', _settings.proxy_user) + _logger.debug("Proxy scheme is %r.", _settings.proxy_scheme) + _logger.debug("Proxy host is %r.", _settings.proxy_host) + _logger.debug("Proxy port is %r.", _settings.proxy_port) + _logger.debug("Proxy user is %r.", _settings.proxy_user) - _logger.debug('License key is %r.', _settings.license_key) + _logger.debug("License key is %r.", _settings.license_key) _timeout = 30.0 @@ -215,24 +211,25 @@ def validate_config(args): _duration = _end - _start if not _application.active: - _logger.error('Unable to register application for test, ' - 'connection could not be established within %s seconds.', - _timeout) + _logger.error( + "Unable to register application for test, " "connection could not be established within %s seconds.", + _timeout, + ) return - if hasattr(_application.settings, 'messages'): + if hasattr(_application.settings, "messages"): for message in _application.settings.messages: - if message['message'].startswith('Reporting to:'): - parts = message['message'].split('Reporting to:') + if message["message"].startswith("Reporting to:"): + parts = message["message"].split("Reporting to:") url = parts[1].strip() - print('Registration successful. Reporting to:') + print("Registration successful. Reporting to:") print() - print(' %s' % url) + print(" %s" % url) print() break - _logger.debug('Registration took %s seconds.', _duration) + _logger.debug("Registration took %s seconds.", _duration) - _logger.debug('Run the validation test.') + _logger.debug("Run the validation test.") _run_validation_test() diff --git a/newrelic/agent.py b/newrelic/agent.py index f635b866f..95a540780 100644 --- a/newrelic/agent.py +++ b/newrelic/agent.py @@ -12,412 +12,332 @@ # See the License for the specific language governing permissions and # limitations under the License. 
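A note before the reorganized import block below: besides sorting ``newrelic/agent.py`` into one-name-per-line imports, the hunk re-exports the newer public APIs from main, including ``add_custom_attribute``/``add_custom_attributes`` (the successors to the deprecated ``add_custom_parameter`` APIs, mirroring the ``validate_config`` change above), ``record_log_event``, ``set_user_id``, and ``set_error_group_callback``. A minimal, hypothetical sketch of calling the renamed attribute API through the public module (the task name and values are made up, and agent initialization is assumed to happen elsewhere):

```python
import newrelic.agent

@newrelic.agent.background_task(name="example-task")  # name is illustrative
def do_work():
    # New public name re-exported by this hunk; the deprecated
    # add_custom_parameter alias is still exported alongside it.
    newrelic.agent.add_custom_attribute("txn-key-1", 1)
    # record_log_event is likewise newly re-exported here.
    newrelic.agent.record_log_event("work started")
```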
-from newrelic.config import ( - initialize as __initialize, - extra_settings as __extra_settings) - -from newrelic.core.config import global_settings as __global_settings - -from newrelic.core.agent import ( - shutdown_agent as __shutdown_agent, - register_data_source as __register_data_source) - -from newrelic.samplers.decorators import ( - data_source_generator as __data_source_generator, - data_source_factory as __data_source_factory) - -from newrelic.api.log import NewRelicContextFormatter - -from newrelic.api.application import ( - application_instance as __application, - register_application as __register_application, - application_settings as __application_settings) - +from newrelic.api.application import application_instance as __application +from newrelic.api.application import application_settings as __application_settings +from newrelic.api.application import register_application as __register_application +from newrelic.api.log import NewRelicContextFormatter # noqa from newrelic.api.time_trace import ( - current_trace as __current_trace, - get_linking_metadata as __get_linking_metadata, - add_custom_span_attribute as __add_custom_span_attribute, - record_exception as __record_exception, - notice_error as __notice_error) - + add_custom_span_attribute as __add_custom_span_attribute, +) +from newrelic.api.time_trace import current_trace as __current_trace +from newrelic.api.time_trace import get_linking_metadata as __get_linking_metadata +from newrelic.api.time_trace import notice_error as __notice_error +from newrelic.api.time_trace import record_exception as __record_exception from newrelic.api.transaction import ( - current_transaction as __current_transaction, - set_transaction_name as __set_transaction_name, - end_of_transaction as __end_of_transaction, - set_background_task as __set_background_task, - ignore_transaction as __ignore_transaction, - suppress_apdex_metric as __suppress_apdex_metric, - capture_request_params as __capture_request_params, - add_custom_parameter as __add_custom_parameter, - add_custom_parameters as __add_custom_parameters, - add_framework_info as __add_framework_info, - get_browser_timing_header as __get_browser_timing_header, - get_browser_timing_footer as __get_browser_timing_footer, - disable_browser_autorum as __disable_browser_autorum, - suppress_transaction_trace as __suppress_transaction_trace, - record_custom_metric as __record_custom_metric, - record_custom_metrics as __record_custom_metrics, - record_custom_event as __record_custom_event, - accept_distributed_trace_payload as __accept_distributed_trace_payload, - create_distributed_trace_payload as __create_distributed_trace_payload, - accept_distributed_trace_headers as __accept_distributed_trace_headers, - insert_distributed_trace_headers as __insert_distributed_trace_headers, - current_trace_id as __current_trace_id, - current_span_id as __current_span_id) - + accept_distributed_trace_headers as __accept_distributed_trace_headers, +) +from newrelic.api.transaction import ( + accept_distributed_trace_payload as __accept_distributed_trace_payload, +) +from newrelic.api.transaction import add_custom_attribute as __add_custom_attribute +from newrelic.api.transaction import add_custom_attributes as __add_custom_attributes +from newrelic.api.transaction import add_custom_parameter as __add_custom_parameter +from newrelic.api.transaction import add_custom_parameters as __add_custom_parameters +from newrelic.api.transaction import add_framework_info as __add_framework_info +from 
newrelic.api.transaction import capture_request_params as __capture_request_params +from newrelic.api.transaction import ( + create_distributed_trace_payload as __create_distributed_trace_payload, +) +from newrelic.api.transaction import current_span_id as __current_span_id +from newrelic.api.transaction import current_trace_id as __current_trace_id +from newrelic.api.transaction import current_transaction as __current_transaction +from newrelic.api.transaction import ( + disable_browser_autorum as __disable_browser_autorum, +) +from newrelic.api.transaction import end_of_transaction as __end_of_transaction +from newrelic.api.transaction import ( + get_browser_timing_footer as __get_browser_timing_footer, +) +from newrelic.api.transaction import ( + get_browser_timing_header as __get_browser_timing_header, +) +from newrelic.api.transaction import ignore_transaction as __ignore_transaction +from newrelic.api.transaction import ( + insert_distributed_trace_headers as __insert_distributed_trace_headers, +) +from newrelic.api.transaction import record_custom_event as __record_custom_event +from newrelic.api.transaction import record_custom_metric as __record_custom_metric +from newrelic.api.transaction import record_custom_metrics as __record_custom_metrics +from newrelic.api.transaction import record_log_event as __record_log_event +from newrelic.api.transaction import set_background_task as __set_background_task +from newrelic.api.transaction import set_transaction_name as __set_transaction_name +from newrelic.api.transaction import suppress_apdex_metric as __suppress_apdex_metric +from newrelic.api.transaction import ( + suppress_transaction_trace as __suppress_transaction_trace, +) from newrelic.api.wsgi_application import ( - wsgi_application as __wsgi_application, - WSGIApplicationWrapper as __WSGIApplicationWrapper, - wrap_wsgi_application as __wrap_wsgi_application) + WSGIApplicationWrapper as __WSGIApplicationWrapper, +) +from newrelic.api.wsgi_application import ( + wrap_wsgi_application as __wrap_wsgi_application, +) +from newrelic.api.wsgi_application import wsgi_application as __wsgi_application +from newrelic.config import extra_settings as __extra_settings +from newrelic.config import initialize as __initialize +from newrelic.core.agent import register_data_source as __register_data_source +from newrelic.core.agent import shutdown_agent as __shutdown_agent +from newrelic.core.config import global_settings as __global_settings +from newrelic.samplers.decorators import data_source_factory as __data_source_factory +from newrelic.samplers.decorators import ( + data_source_generator as __data_source_generator, +) try: from newrelic.api.asgi_application import ( - asgi_application as __asgi_application, - ASGIApplicationWrapper as __ASGIApplicationWrapper, - wrap_asgi_application as __wrap_asgi_application) + ASGIApplicationWrapper as __ASGIApplicationWrapper, + ) + from newrelic.api.asgi_application import asgi_application as __asgi_application + from newrelic.api.asgi_application import ( + wrap_asgi_application as __wrap_asgi_application, + ) except SyntaxError: + def __asgi_application(*args, **kwargs): pass __ASGIApplicationWrapper = __asgi_application __wrap_asgi_application = __asgi_application -from newrelic.api.web_transaction import ( - WebTransaction as __WebTransaction, - web_transaction as __web_transaction, - WebTransactionWrapper as __WebTransactionWrapper, - wrap_web_transaction as __wrap_web_transaction) - +from newrelic.api.background_task import BackgroundTask as 
__BackgroundTask from newrelic.api.background_task import ( - background_task as __background_task, - BackgroundTask as __BackgroundTask, - BackgroundTaskWrapper as __BackgroundTaskWrapper, - wrap_background_task as __wrap_background_task) - -from newrelic.api.lambda_handler import ( - LambdaHandlerWrapper as __LambdaHandlerWrapper, - lambda_handler as __lambda_handler) - + BackgroundTaskWrapper as __BackgroundTaskWrapper, +) +from newrelic.api.background_task import background_task as __background_task +from newrelic.api.background_task import wrap_background_task as __wrap_background_task +from newrelic.api.database_trace import DatabaseTrace as __DatabaseTrace +from newrelic.api.database_trace import DatabaseTraceWrapper as __DatabaseTraceWrapper +from newrelic.api.database_trace import database_trace as __database_trace +from newrelic.api.database_trace import ( + register_database_client as __register_database_client, +) +from newrelic.api.database_trace import wrap_database_trace as __wrap_database_trace +from newrelic.api.datastore_trace import DatastoreTrace as __DatastoreTrace +from newrelic.api.datastore_trace import ( + DatastoreTraceWrapper as __DatastoreTraceWrapper, +) +from newrelic.api.datastore_trace import datastore_trace as __datastore_trace +from newrelic.api.datastore_trace import wrap_datastore_trace as __wrap_datastore_trace +from newrelic.api.error_trace import ErrorTrace as __ErrorTrace +from newrelic.api.error_trace import ErrorTraceWrapper as __ErrorTraceWrapper +from newrelic.api.error_trace import error_trace as __error_trace +from newrelic.api.error_trace import wrap_error_trace as __wrap_error_trace +from newrelic.api.external_trace import ExternalTrace as __ExternalTrace +from newrelic.api.external_trace import ExternalTraceWrapper as __ExternalTraceWrapper +from newrelic.api.external_trace import external_trace as __external_trace +from newrelic.api.external_trace import wrap_external_trace as __wrap_external_trace +from newrelic.api.function_trace import FunctionTrace as __FunctionTrace +from newrelic.api.function_trace import FunctionTraceWrapper as __FunctionTraceWrapper +from newrelic.api.function_trace import function_trace as __function_trace +from newrelic.api.function_trace import wrap_function_trace as __wrap_function_trace +from newrelic.api.generator_trace import ( + GeneratorTraceWrapper as __GeneratorTraceWrapper, +) +from newrelic.api.generator_trace import generator_trace as __generator_trace +from newrelic.api.generator_trace import wrap_generator_trace as __wrap_generator_trace +from newrelic.api.html_insertion import insert_html_snippet as __insert_html_snippet +from newrelic.api.html_insertion import verify_body_exists as __verify_body_exists +from newrelic.api.lambda_handler import LambdaHandlerWrapper as __LambdaHandlerWrapper +from newrelic.api.lambda_handler import lambda_handler as __lambda_handler +from newrelic.api.message_trace import MessageTrace as __MessageTrace +from newrelic.api.message_trace import MessageTraceWrapper as __MessageTraceWrapper +from newrelic.api.message_trace import message_trace as __message_trace +from newrelic.api.message_trace import wrap_message_trace as __wrap_message_trace +from newrelic.api.message_transaction import MessageTransaction as __MessageTransaction +from newrelic.api.message_transaction import ( + MessageTransactionWrapper as __MessageTransactionWrapper, +) +from newrelic.api.message_transaction import ( + message_transaction as __message_transaction, +) +from 
newrelic.api.message_transaction import ( + wrap_message_transaction as __wrap_message_transaction, +) +from newrelic.api.profile_trace import ProfileTraceWrapper as __ProfileTraceWrapper +from newrelic.api.profile_trace import profile_trace as __profile_trace +from newrelic.api.profile_trace import wrap_profile_trace as __wrap_profile_trace +from newrelic.api.settings import set_error_group_callback as __set_error_group_callback +from newrelic.api.supportability import wrap_api_call as __wrap_api_call +from newrelic.api.transaction import set_user_id as __set_user_id from newrelic.api.transaction_name import ( - transaction_name as __transaction_name, - TransactionNameWrapper as __TransactionNameWrapper, - wrap_transaction_name as __wrap_transaction_name) - -from newrelic.api.function_trace import ( - function_trace as __function_trace, - FunctionTrace as __FunctionTrace, - FunctionTraceWrapper as __FunctionTraceWrapper, - wrap_function_trace as __wrap_function_trace) + TransactionNameWrapper as __TransactionNameWrapper, +) +from newrelic.api.transaction_name import transaction_name as __transaction_name +from newrelic.api.transaction_name import ( + wrap_transaction_name as __wrap_transaction_name, +) +from newrelic.api.web_transaction import WebTransaction as __WebTransaction +from newrelic.api.web_transaction import ( + WebTransactionWrapper as __WebTransactionWrapper, +) +from newrelic.api.web_transaction import web_transaction as __web_transaction +from newrelic.api.web_transaction import wrap_web_transaction as __wrap_web_transaction +from newrelic.common.object_names import callable_name as __callable_name +from newrelic.common.object_wrapper import FunctionWrapper as __FunctionWrapper +from newrelic.common.object_wrapper import InFunctionWrapper as __InFunctionWrapper +from newrelic.common.object_wrapper import ObjectProxy as __ObjectProxy +from newrelic.common.object_wrapper import ObjectWrapper as __ObjectWrapper +from newrelic.common.object_wrapper import OutFunctionWrapper as __OutFunctionWrapper +from newrelic.common.object_wrapper import PostFunctionWrapper as __PostFunctionWrapper +from newrelic.common.object_wrapper import PreFunctionWrapper as __PreFunctionWrapper +from newrelic.common.object_wrapper import function_wrapper as __function_wrapper +from newrelic.common.object_wrapper import in_function as __in_function +from newrelic.common.object_wrapper import out_function as __out_function +from newrelic.common.object_wrapper import ( + patch_function_wrapper as __patch_function_wrapper, +) +from newrelic.common.object_wrapper import post_function as __post_function +from newrelic.common.object_wrapper import pre_function as __pre_function +from newrelic.common.object_wrapper import resolve_path as __resolve_path +from newrelic.common.object_wrapper import ( + transient_function_wrapper as __transient_function_wrapper, +) +from newrelic.common.object_wrapper import ( + wrap_function_wrapper as __wrap_function_wrapper, +) +from newrelic.common.object_wrapper import wrap_in_function as __wrap_in_function +from newrelic.common.object_wrapper import wrap_object as __wrap_object +from newrelic.common.object_wrapper import ( + wrap_object_attribute as __wrap_object_attribute, +) +from newrelic.common.object_wrapper import wrap_out_function as __wrap_out_function +from newrelic.common.object_wrapper import wrap_post_function as __wrap_post_function +from newrelic.common.object_wrapper import wrap_pre_function as __wrap_pre_function # EXPERIMENTAL - Generator traces are 
currently experimental and may not # exist in this form in future versions of the agent. -from newrelic.api.generator_trace import ( - generator_trace as __generator_trace, - GeneratorTraceWrapper as __GeneratorTraceWrapper, - wrap_generator_trace as __wrap_generator_trace) # EXPERIMENTAL - Profile traces are currently experimental and may not # exist in this form in future versions of the agent. -from newrelic.api.profile_trace import ( - profile_trace as __profile_trace, - ProfileTraceWrapper as __ProfileTraceWrapper, - wrap_profile_trace as __wrap_profile_trace) - -from newrelic.api.database_trace import ( - database_trace as __database_trace, - DatabaseTrace as __DatabaseTrace, - DatabaseTraceWrapper as __DatabaseTraceWrapper, - wrap_database_trace as __wrap_database_trace, - register_database_client as __register_database_client) - -from newrelic.api.datastore_trace import ( - datastore_trace as __datastore_trace, - DatastoreTrace as __DatastoreTrace, - DatastoreTraceWrapper as __DatastoreTraceWrapper, - wrap_datastore_trace as __wrap_datastore_trace) - -from newrelic.api.external_trace import ( - external_trace as __external_trace, - ExternalTrace as __ExternalTrace, - ExternalTraceWrapper as __ExternalTraceWrapper, - wrap_external_trace as __wrap_external_trace) - -from newrelic.api.error_trace import ( - error_trace as __error_trace, - ErrorTrace as __ErrorTrace, - ErrorTraceWrapper as __ErrorTraceWrapper, - wrap_error_trace as __wrap_error_trace) - -from newrelic.api.message_trace import ( - message_trace as __message_trace, - MessageTrace as __MessageTrace, - MessageTraceWrapper as __MessageTraceWrapper, - wrap_message_trace as __wrap_message_trace) - -from newrelic.api.message_transaction import ( - message_transaction as __message_transaction, - MessageTransaction as __MessageTransaction, - MessageTransactionWrapper as __MessageTransactionWrapper, - wrap_message_transaction as __wrap_message_transaction) - -from newrelic.common.object_names import callable_name as __callable_name - -from newrelic.common.object_wrapper import ( - ObjectProxy as __ObjectProxy, - wrap_object as __wrap_object, - wrap_object_attribute as __wrap_object_attribute, - resolve_path as __resolve_path, - transient_function_wrapper as __transient_function_wrapper, - FunctionWrapper as __FunctionWrapper, - function_wrapper as __function_wrapper, - wrap_function_wrapper as __wrap_function_wrapper, - patch_function_wrapper as __patch_function_wrapper, - ObjectWrapper as __ObjectWrapper, - pre_function as __pre_function, - PreFunctionWrapper as __PreFunctionWrapper, - wrap_pre_function as __wrap_pre_function, - post_function as __post_function, - PostFunctionWrapper as __PostFunctionWrapper, - wrap_post_function as __wrap_post_function, - in_function as __in_function, - InFunctionWrapper as __InFunctionWrapper, - wrap_in_function as __wrap_in_function, - out_function as __out_function, - OutFunctionWrapper as __OutFunctionWrapper, - wrap_out_function as __wrap_out_function) - -from newrelic.api.html_insertion import ( - insert_html_snippet as __insert_html_snippet, - verify_body_exists as __verify_body_exists) - -from newrelic.api.supportability import wrap_api_call as __wrap_api_call initialize = __initialize -extra_settings = __wrap_api_call(__extra_settings, - 'extra_settings') -global_settings = __wrap_api_call(__global_settings, - 'global_settings') -shutdown_agent = __wrap_api_call(__shutdown_agent, - 'shutdown_agent') -register_data_source = __wrap_api_call(__register_data_source, - 
'register_data_source') -data_source_generator = __wrap_api_call(__data_source_generator, - 'data_source_generator') -data_source_factory = __wrap_api_call(__data_source_factory, - 'data_source_factory') -application = __wrap_api_call(__application, - 'application') +extra_settings = __wrap_api_call(__extra_settings, "extra_settings") +global_settings = __wrap_api_call(__global_settings, "global_settings") +shutdown_agent = __wrap_api_call(__shutdown_agent, "shutdown_agent") +register_data_source = __wrap_api_call(__register_data_source, "register_data_source") +data_source_generator = __wrap_api_call(__data_source_generator, "data_source_generator") +data_source_factory = __wrap_api_call(__data_source_factory, "data_source_factory") +application = __wrap_api_call(__application, "application") register_application = __register_application -application_settings = __wrap_api_call(__application_settings, - 'application_settings') -current_trace = __wrap_api_call(__current_trace, - 'current_trace') -get_linking_metadata = __wrap_api_call(__get_linking_metadata, - 'get_linking_metadata') -add_custom_span_attribute = __wrap_api_call(__add_custom_span_attribute, - 'add_custom_span_attribute') -current_transaction = __wrap_api_call(__current_transaction, - 'current_transaction') -set_transaction_name = __wrap_api_call(__set_transaction_name, - 'set_transaction_name') -end_of_transaction = __wrap_api_call(__end_of_transaction, - 'end_of_transaction') -set_background_task = __wrap_api_call(__set_background_task, - 'set_background_task') -ignore_transaction = __wrap_api_call(__ignore_transaction, - 'ignore_transaction') -suppress_apdex_metric = __wrap_api_call(__suppress_apdex_metric, - 'suppress_apdex_metric') -capture_request_params = __wrap_api_call(__capture_request_params, - 'capture_request_params') -add_custom_parameter = __wrap_api_call(__add_custom_parameter, - 'add_custom_parameter') -add_custom_parameters = __wrap_api_call(__add_custom_parameters, - 'add_custom_parameters') -add_framework_info = __wrap_api_call(__add_framework_info, - 'add_framework_info') -record_exception = __wrap_api_call(__record_exception, - 'record_exception') -notice_error = __wrap_api_call(__notice_error, - 'notice_error') -get_browser_timing_header = __wrap_api_call(__get_browser_timing_header, - 'get_browser_timing_header') -get_browser_timing_footer = __wrap_api_call(__get_browser_timing_footer, - 'get_browser_timing_footer') -disable_browser_autorum = __wrap_api_call(__disable_browser_autorum, - 'disable_browser_autorum') -suppress_transaction_trace = __wrap_api_call(__suppress_transaction_trace, - 'suppress_transaction_trace') -record_custom_metric = __wrap_api_call(__record_custom_metric, - 'record_custom_metric') -record_custom_metrics = __wrap_api_call(__record_custom_metrics, - 'record_custom_metrics') -record_custom_event = __wrap_api_call(__record_custom_event, - 'record_custom_event') +application_settings = __wrap_api_call(__application_settings, "application_settings") +current_trace = __wrap_api_call(__current_trace, "current_trace") +get_linking_metadata = __wrap_api_call(__get_linking_metadata, "get_linking_metadata") +add_custom_span_attribute = __wrap_api_call(__add_custom_span_attribute, "add_custom_span_attribute") +current_transaction = __wrap_api_call(__current_transaction, "current_transaction") +set_user_id = __wrap_api_call(__set_user_id, "set_user_id") +set_error_group_callback = __wrap_api_call(__set_error_group_callback, "set_error_group_callback") +set_transaction_name = 
__wrap_api_call(__set_transaction_name, "set_transaction_name") +end_of_transaction = __wrap_api_call(__end_of_transaction, "end_of_transaction") +set_background_task = __wrap_api_call(__set_background_task, "set_background_task") +ignore_transaction = __wrap_api_call(__ignore_transaction, "ignore_transaction") +suppress_apdex_metric = __wrap_api_call(__suppress_apdex_metric, "suppress_apdex_metric") +capture_request_params = __wrap_api_call(__capture_request_params, "capture_request_params") +add_custom_parameter = __wrap_api_call(__add_custom_parameter, "add_custom_parameter") +add_custom_parameters = __wrap_api_call(__add_custom_parameters, "add_custom_parameters") +add_custom_attribute = __wrap_api_call(__add_custom_attribute, "add_custom_attribute") +add_custom_attributes = __wrap_api_call(__add_custom_attributes, "add_custom_attributes") +add_framework_info = __wrap_api_call(__add_framework_info, "add_framework_info") +record_exception = __wrap_api_call(__record_exception, "record_exception") +notice_error = __wrap_api_call(__notice_error, "notice_error") +get_browser_timing_header = __wrap_api_call(__get_browser_timing_header, "get_browser_timing_header") +get_browser_timing_footer = __wrap_api_call(__get_browser_timing_footer, "get_browser_timing_footer") +disable_browser_autorum = __wrap_api_call(__disable_browser_autorum, "disable_browser_autorum") +suppress_transaction_trace = __wrap_api_call(__suppress_transaction_trace, "suppress_transaction_trace") +record_custom_metric = __wrap_api_call(__record_custom_metric, "record_custom_metric") +record_custom_metrics = __wrap_api_call(__record_custom_metrics, "record_custom_metrics") +record_custom_event = __wrap_api_call(__record_custom_event, "record_custom_event") +record_log_event = __wrap_api_call(__record_log_event, "record_log_event") accept_distributed_trace_payload = __wrap_api_call( - __accept_distributed_trace_payload, 'accept_distributed_trace_payload') + __accept_distributed_trace_payload, "accept_distributed_trace_payload" +) create_distributed_trace_payload = __wrap_api_call( - __create_distributed_trace_payload, - 'create_distributed_trace_payload') + __create_distributed_trace_payload, "create_distributed_trace_payload" +) accept_distributed_trace_headers = __wrap_api_call( - __accept_distributed_trace_headers, - 'accept_distributed_trace_headers') + __accept_distributed_trace_headers, "accept_distributed_trace_headers" +) insert_distributed_trace_headers = __wrap_api_call( - __insert_distributed_trace_headers, - 'insert_distributed_trace_headers') -current_trace_id = __wrap_api_call(__current_trace_id, 'current_trace_id') -current_span_id = __wrap_api_call(__current_span_id, 'current_span_id') + __insert_distributed_trace_headers, "insert_distributed_trace_headers" +) +current_trace_id = __wrap_api_call(__current_trace_id, "current_trace_id") +current_span_id = __wrap_api_call(__current_span_id, "current_span_id") wsgi_application = __wsgi_application asgi_application = __asgi_application -WebTransaction = __wrap_api_call(__WebTransaction, - 'WebTransaction') -web_transaction = __wrap_api_call(__web_transaction, - 'web_transaction') -WebTransactionWrapper = __wrap_api_call(__WebTransactionWrapper, - 'WebTransactionWrapper') -wrap_web_transaction = __wrap_api_call(__wrap_web_transaction, - 'wrap_web_transaction') +WebTransaction = __wrap_api_call(__WebTransaction, "WebTransaction") +web_transaction = __wrap_api_call(__web_transaction, "web_transaction") +WebTransactionWrapper = __wrap_api_call(__WebTransactionWrapper, 
"WebTransactionWrapper") +wrap_web_transaction = __wrap_api_call(__wrap_web_transaction, "wrap_web_transaction") WSGIApplicationWrapper = __WSGIApplicationWrapper wrap_wsgi_application = __wrap_wsgi_application ASGIApplicationWrapper = __ASGIApplicationWrapper wrap_asgi_application = __wrap_asgi_application -background_task = __wrap_api_call(__background_task, - 'background_task') -BackgroundTask = __wrap_api_call(__BackgroundTask, - 'BackgroundTask') -BackgroundTaskWrapper = __wrap_api_call(__BackgroundTaskWrapper, - 'BackgroundTaskWrapper') -wrap_background_task = __wrap_api_call(__wrap_background_task, - 'wrap_background_task') -LambdaHandlerWrapper = __wrap_api_call(__LambdaHandlerWrapper, - 'LambdaHandlerWrapper') -lambda_handler = __wrap_api_call(__lambda_handler, - 'lambda_handler') -transaction_name = __wrap_api_call(__transaction_name, - 'transaction_name') -TransactionNameWrapper = __wrap_api_call(__TransactionNameWrapper, - 'TransactionNameWrapper') -wrap_transaction_name = __wrap_api_call(__wrap_transaction_name, - 'wrap_transaction_name') -function_trace = __wrap_api_call(__function_trace, - 'function_trace') -FunctionTrace = __wrap_api_call(__FunctionTrace, - 'FunctionTrace') -FunctionTraceWrapper = __wrap_api_call(__FunctionTraceWrapper, - 'FunctionTraceWrapper') -wrap_function_trace = __wrap_api_call(__wrap_function_trace, - 'wrap_function_trace') -generator_trace = __wrap_api_call(__generator_trace, - 'generator_trace') -GeneratorTraceWrapper = __wrap_api_call(__GeneratorTraceWrapper, - 'GeneratorTraceWrapper') -wrap_generator_trace = __wrap_api_call(__wrap_generator_trace, - 'wrap_generator_trace') -profile_trace = __wrap_api_call(__profile_trace, - 'profile_trace') -ProfileTraceWrapper = __wrap_api_call(__ProfileTraceWrapper, - 'ProfileTraceWrapper') -wrap_profile_trace = __wrap_api_call(__wrap_profile_trace, - 'wrap_profile_trace') -database_trace = __wrap_api_call(__database_trace, - 'database_trace') -DatabaseTrace = __wrap_api_call(__DatabaseTrace, - 'DatabaseTrace') -DatabaseTraceWrapper = __wrap_api_call(__DatabaseTraceWrapper, - 'DatabaseTraceWrapper') -wrap_database_trace = __wrap_api_call(__wrap_database_trace, - 'wrap_database_trace') -register_database_client = __wrap_api_call(__register_database_client, - 'register_database_client') -datastore_trace = __wrap_api_call(__datastore_trace, - 'datastore_trace') -DatastoreTrace = __wrap_api_call(__DatastoreTrace, - 'DatastoreTrace') -DatastoreTraceWrapper = __wrap_api_call(__DatastoreTraceWrapper, - 'DatastoreTraceWrapper') -wrap_datastore_trace = __wrap_api_call(__wrap_datastore_trace, - 'wrap_datastore_trace') -external_trace = __wrap_api_call(__external_trace, - 'external_trace') -ExternalTrace = __wrap_api_call(__ExternalTrace, - 'ExternalTrace') -ExternalTraceWrapper = __wrap_api_call(__ExternalTraceWrapper, - 'ExternalTraceWrapper') -wrap_external_trace = __wrap_api_call(__wrap_external_trace, - 'wrap_external_trace') -error_trace = __wrap_api_call(__error_trace, - 'error_trace') -ErrorTrace = __wrap_api_call(__ErrorTrace, - 'ErrorTrace') -ErrorTraceWrapper = __wrap_api_call(__ErrorTraceWrapper, - 'ErrorTraceWrapper') -wrap_error_trace = __wrap_api_call(__wrap_error_trace, - 'wrap_error_trace') -message_trace = __wrap_api_call(__message_trace, - 'message_trace') -MessageTrace = __wrap_api_call(__MessageTrace, - 'MessageTrace') -MessageTraceWrapper = __wrap_api_call(__MessageTraceWrapper, - 'MessageTraceWrapper') -wrap_message_trace = __wrap_api_call(__wrap_message_trace, - 'wrap_message_trace') 
-message_transaction = __wrap_api_call(__message_transaction, - 'message_trace') -MessageTransaction = __wrap_api_call(__MessageTransaction, - 'MessageTransaction') -MessageTransactionWrapper = __wrap_api_call(__MessageTransactionWrapper, - 'MessageTransactionWrapper') -wrap_message_transaction = __wrap_api_call(__wrap_message_transaction, - 'wrap_message_transaction') -callable_name = __wrap_api_call(__callable_name, - 'callable_name') -ObjectProxy = __wrap_api_call(__ObjectProxy, - 'ObjectProxy') -wrap_object = __wrap_api_call(__wrap_object, - 'wrap_object') -wrap_object_attribute = __wrap_api_call(__wrap_object_attribute, - 'wrap_object_attribute') -resolve_path = __wrap_api_call(__resolve_path, - 'resolve_path') -transient_function_wrapper = __wrap_api_call(__transient_function_wrapper, - 'transient_function_wrapper') -FunctionWrapper = __wrap_api_call(__FunctionWrapper, - 'FunctionWrapper') -function_wrapper = __wrap_api_call(__function_wrapper, - 'function_wrapper') -wrap_function_wrapper = __wrap_api_call(__wrap_function_wrapper, - 'wrap_function_wrapper') -patch_function_wrapper = __wrap_api_call(__patch_function_wrapper, - 'patch_function_wrapper') -ObjectWrapper = __wrap_api_call(__ObjectWrapper, - 'ObjectWrapper') -pre_function = __wrap_api_call(__pre_function, - 'pre_function') -PreFunctionWrapper = __wrap_api_call(__PreFunctionWrapper, - 'PreFunctionWrapper') -wrap_pre_function = __wrap_api_call(__wrap_pre_function, - 'wrap_pre_function') -post_function = __wrap_api_call(__post_function, - 'post_function') -PostFunctionWrapper = __wrap_api_call(__PostFunctionWrapper, - 'PostFunctionWrapper') -wrap_post_function = __wrap_api_call(__wrap_post_function, - 'wrap_post_function') -in_function = __wrap_api_call(__in_function, - 'in_function') -InFunctionWrapper = __wrap_api_call(__InFunctionWrapper, - 'InFunctionWrapper') -wrap_in_function = __wrap_api_call(__wrap_in_function, - 'wrap_in_function') -out_function = __wrap_api_call(__out_function, - 'out_function') -OutFunctionWrapper = __wrap_api_call(__OutFunctionWrapper, - 'OutFunctionWrapper') -wrap_out_function = __wrap_api_call(__wrap_out_function, - 'wrap_out_function') -insert_html_snippet = __wrap_api_call(__insert_html_snippet, - 'insert_html_snippet') -verify_body_exists = __wrap_api_call(__verify_body_exists, - 'verify_body_exists') +background_task = __wrap_api_call(__background_task, "background_task") +BackgroundTask = __wrap_api_call(__BackgroundTask, "BackgroundTask") +BackgroundTaskWrapper = __wrap_api_call(__BackgroundTaskWrapper, "BackgroundTaskWrapper") +wrap_background_task = __wrap_api_call(__wrap_background_task, "wrap_background_task") +LambdaHandlerWrapper = __wrap_api_call(__LambdaHandlerWrapper, "LambdaHandlerWrapper") +lambda_handler = __wrap_api_call(__lambda_handler, "lambda_handler") +transaction_name = __wrap_api_call(__transaction_name, "transaction_name") +TransactionNameWrapper = __wrap_api_call(__TransactionNameWrapper, "TransactionNameWrapper") +wrap_transaction_name = __wrap_api_call(__wrap_transaction_name, "wrap_transaction_name") +function_trace = __wrap_api_call(__function_trace, "function_trace") +FunctionTrace = __wrap_api_call(__FunctionTrace, "FunctionTrace") +FunctionTraceWrapper = __wrap_api_call(__FunctionTraceWrapper, "FunctionTraceWrapper") +wrap_function_trace = __wrap_api_call(__wrap_function_trace, "wrap_function_trace") +generator_trace = __wrap_api_call(__generator_trace, "generator_trace") +GeneratorTraceWrapper = __wrap_api_call(__GeneratorTraceWrapper, "GeneratorTraceWrapper") 
+wrap_generator_trace = __wrap_api_call(__wrap_generator_trace, "wrap_generator_trace") +profile_trace = __wrap_api_call(__profile_trace, "profile_trace") +ProfileTraceWrapper = __wrap_api_call(__ProfileTraceWrapper, "ProfileTraceWrapper") +wrap_profile_trace = __wrap_api_call(__wrap_profile_trace, "wrap_profile_trace") +database_trace = __wrap_api_call(__database_trace, "database_trace") +DatabaseTrace = __wrap_api_call(__DatabaseTrace, "DatabaseTrace") +DatabaseTraceWrapper = __wrap_api_call(__DatabaseTraceWrapper, "DatabaseTraceWrapper") +wrap_database_trace = __wrap_api_call(__wrap_database_trace, "wrap_database_trace") +register_database_client = __wrap_api_call(__register_database_client, "register_database_client") +datastore_trace = __wrap_api_call(__datastore_trace, "datastore_trace") +DatastoreTrace = __wrap_api_call(__DatastoreTrace, "DatastoreTrace") +DatastoreTraceWrapper = __wrap_api_call(__DatastoreTraceWrapper, "DatastoreTraceWrapper") +wrap_datastore_trace = __wrap_api_call(__wrap_datastore_trace, "wrap_datastore_trace") +external_trace = __wrap_api_call(__external_trace, "external_trace") +ExternalTrace = __wrap_api_call(__ExternalTrace, "ExternalTrace") +ExternalTraceWrapper = __wrap_api_call(__ExternalTraceWrapper, "ExternalTraceWrapper") +wrap_external_trace = __wrap_api_call(__wrap_external_trace, "wrap_external_trace") +error_trace = __wrap_api_call(__error_trace, "error_trace") +ErrorTrace = __wrap_api_call(__ErrorTrace, "ErrorTrace") +ErrorTraceWrapper = __wrap_api_call(__ErrorTraceWrapper, "ErrorTraceWrapper") +wrap_error_trace = __wrap_api_call(__wrap_error_trace, "wrap_error_trace") +message_trace = __wrap_api_call(__message_trace, "message_trace") +MessageTrace = __wrap_api_call(__MessageTrace, "MessageTrace") +MessageTraceWrapper = __wrap_api_call(__MessageTraceWrapper, "MessageTraceWrapper") +wrap_message_trace = __wrap_api_call(__wrap_message_trace, "wrap_message_trace") +message_transaction = __wrap_api_call(__message_transaction, "message_transaction") +MessageTransaction = __wrap_api_call(__MessageTransaction, "MessageTransaction") +MessageTransactionWrapper = __wrap_api_call(__MessageTransactionWrapper, "MessageTransactionWrapper") +wrap_message_transaction = __wrap_api_call(__wrap_message_transaction, "wrap_message_transaction") +callable_name = __wrap_api_call(__callable_name, "callable_name") +ObjectProxy = __wrap_api_call(__ObjectProxy, "ObjectProxy") +wrap_object = __wrap_api_call(__wrap_object, "wrap_object") +wrap_object_attribute = __wrap_api_call(__wrap_object_attribute, "wrap_object_attribute") +resolve_path = __wrap_api_call(__resolve_path, "resolve_path") +transient_function_wrapper = __wrap_api_call(__transient_function_wrapper, "transient_function_wrapper") +FunctionWrapper = __wrap_api_call(__FunctionWrapper, "FunctionWrapper") +function_wrapper = __wrap_api_call(__function_wrapper, "function_wrapper") +wrap_function_wrapper = __wrap_api_call(__wrap_function_wrapper, "wrap_function_wrapper") +patch_function_wrapper = __wrap_api_call(__patch_function_wrapper, "patch_function_wrapper") +ObjectWrapper = __wrap_api_call(__ObjectWrapper, "ObjectWrapper") +pre_function = __wrap_api_call(__pre_function, "pre_function") +PreFunctionWrapper = __wrap_api_call(__PreFunctionWrapper, "PreFunctionWrapper") +wrap_pre_function = __wrap_api_call(__wrap_pre_function, "wrap_pre_function") +post_function = __wrap_api_call(__post_function, "post_function") +PostFunctionWrapper = __wrap_api_call(__PostFunctionWrapper, "PostFunctionWrapper") +wrap_post_function = 
__wrap_api_call(__wrap_post_function, "wrap_post_function") +in_function = __wrap_api_call(__in_function, "in_function") +InFunctionWrapper = __wrap_api_call(__InFunctionWrapper, "InFunctionWrapper") +wrap_in_function = __wrap_api_call(__wrap_in_function, "wrap_in_function") +out_function = __wrap_api_call(__out_function, "out_function") +OutFunctionWrapper = __wrap_api_call(__OutFunctionWrapper, "OutFunctionWrapper") +wrap_out_function = __wrap_api_call(__wrap_out_function, "wrap_out_function") +insert_html_snippet = __wrap_api_call(__insert_html_snippet, "insert_html_snippet") +verify_body_exists = __wrap_api_call(__verify_body_exists, "verify_body_exists") diff --git a/newrelic/api/application.py b/newrelic/api/application.py index 41a1b0cd3..ea57829f2 100644 --- a/newrelic/api/application.py +++ b/newrelic/api/application.py @@ -33,18 +33,18 @@ def _instance(name, activate=True): if name is None: name = newrelic.core.config.global_settings().app_name - # Ensure we grab a reference to the agent before grabbing - # the lock, else startup callback on agent initialisation - # could deadlock as it tries to create a application when - # we already have the lock held. - - agent = newrelic.core.agent.agent_instance() - # Try first without lock. If we find it we can return. instance = Application._instances.get(name, None) if not instance and activate: + # Ensure we grab a reference to the agent before grabbing + # the lock, else startup callback on agent initialisation + # could deadlock as it tries to create a application when + # we already have the lock held. + + agent = newrelic.core.agent.agent_instance() + with Application._lock: # Now try again with lock so that only one gets # to create and add it. diff --git a/newrelic/api/asgi_application.py b/newrelic/api/asgi_application.py index 609a8b4b5..2e4e4979b 100644 --- a/newrelic/api/asgi_application.py +++ b/newrelic/api/asgi_application.py @@ -254,7 +254,7 @@ async def send(self, event): return await self._send(event) -def ASGIApplicationWrapper(wrapped, application=None, name=None, group=None, framework=None): +def ASGIApplicationWrapper(wrapped, application=None, name=None, group=None, framework=None, dispatcher=None): def nr_asgi_wrapper(wrapped, instance, args, kwargs): double_callable = asgiref_compatibility.is_double_callable(wrapped) if double_callable: @@ -271,9 +271,7 @@ async def nr_async_asgi(receive, send): # Check to see if any transaction is present, even an inactive # one which has been marked to be ignored or which has been # stopped already. - transaction = current_transaction(active_only=False) - if transaction: # If there is any active transaction we will return without # applying a new ASGI application wrapper context. 
In the @@ -290,6 +288,9 @@ async def nr_async_asgi(receive, send): if framework: transaction.add_framework_info(name=framework[0], version=framework[1]) + if dispatcher: + transaction.add_dispatcher_info(name=dispatcher[0], version=dispatcher[1]) + # Also override the web transaction name to be the name of # the wrapped callable if not explicitly named, and we want # the default name to be that of the ASGI component for the @@ -323,6 +324,9 @@ async def nr_async_asgi(receive, send): if framework: transaction.add_framework_info(name=framework[0], version=framework[1]) + if dispatcher: + transaction.add_dispatcher_info(name=dispatcher[0], version=dispatcher[1]) + # Override the initial web transaction name to be the supplied # name, or the name of the wrapped callable if wanting to use # the callable as the default. This will override the use of a @@ -367,20 +371,23 @@ async def nr_async_asgi(receive, send): return FunctionWrapper(wrapped, nr_asgi_wrapper) -def asgi_application(application=None, name=None, group=None, framework=None): +def asgi_application(application=None, name=None, group=None, framework=None, dispatcher=None): return functools.partial( ASGIApplicationWrapper, application=application, name=name, group=group, framework=framework, + dispatcher=dispatcher, ) -def wrap_asgi_application(module, object_path, application=None, name=None, group=None, framework=None): +def wrap_asgi_application( + module, object_path, application=None, name=None, group=None, framework=None, dispatcher=None +): wrap_object( module, object_path, ASGIApplicationWrapper, - (application, name, group, framework), + (application, name, group, framework, dispatcher), ) diff --git a/newrelic/api/background_task.py b/newrelic/api/background_task.py index a4a9e8e6a..4cdcd8a0d 100644 --- a/newrelic/api/background_task.py +++ b/newrelic/api/background_task.py @@ -13,19 +13,16 @@ # limitations under the License. import functools -import sys from newrelic.api.application import Application, application_instance from newrelic.api.transaction import Transaction, current_transaction -from newrelic.common.async_proxy import async_proxy, TransactionContext +from newrelic.common.async_proxy import TransactionContext, async_proxy from newrelic.common.object_names import callable_name from newrelic.common.object_wrapper import FunctionWrapper, wrap_object class BackgroundTask(Transaction): - def __init__(self, application, name, group=None, source=None): - # Initialise the common transaction base class. 
super(BackgroundTask, self).__init__(application, source=source) @@ -53,7 +50,6 @@ def __init__(self, application, name, group=None, source=None): def BackgroundTaskWrapper(wrapped, application=None, name=None, group=None): - def wrapper(wrapped, instance, args, kwargs): if callable(name): if instance is not None: @@ -107,39 +103,19 @@ def create_transaction(transaction): manager = create_transaction(current_transaction(active_only=False)) + # This means that a transaction already exists, so we want to return if not manager: return wrapped(*args, **kwargs) - success = True - - try: - manager.__enter__() - try: - return wrapped(*args, **kwargs) - except: - success = False - if not manager.__exit__(*sys.exc_info()): - raise - finally: - if success and manager._ref_count == 0: - manager._is_finalized = True - manager.__exit__(None, None, None) - else: - manager._request_handler_finalize = True - manager._server_adapter_finalize = True - old_transaction = current_transaction() - if old_transaction is not None: - old_transaction.drop_transaction() + with manager: + return wrapped(*args, **kwargs) return FunctionWrapper(wrapped, wrapper) def background_task(application=None, name=None, group=None): - return functools.partial(BackgroundTaskWrapper, - application=application, name=name, group=group) + return functools.partial(BackgroundTaskWrapper, application=application, name=name, group=group) -def wrap_background_task(module, object_path, application=None, - name=None, group=None): - wrap_object(module, object_path, BackgroundTaskWrapper, - (application, name, group)) +def wrap_background_task(module, object_path, application=None, name=None, group=None): + wrap_object(module, object_path, BackgroundTaskWrapper, (application, name, group)) diff --git a/newrelic/api/database_trace.py b/newrelic/api/database_trace.py index 09dfa1e11..2bc497688 100644 --- a/newrelic/api/database_trace.py +++ b/newrelic/api/database_trace.py @@ -127,6 +127,7 @@ def _log_async_warning(self): def finalize_data(self, transaction, exc=None, value=None, tb=None): self.stack_trace = None + self.sql_format = "off" connect_params = None cursor_params = None @@ -206,8 +207,8 @@ def finalize_data(self, transaction, exc=None, value=None, tb=None): transaction._explain_plan_count += 1 self.sql_format = ( - tt.record_sql if tt.record_sql else "" - ) # If tt.record_sql is None, then the empty string will default to sql being obfuscated + tt.record_sql if tt.record_sql else "off" + ) # If tt.record_sql is None, then default to sql being off self.connect_params = connect_params self.cursor_params = cursor_params self.sql_parameters = sql_parameters diff --git a/newrelic/api/graphql_trace.py b/newrelic/api/graphql_trace.py index 6863bd73d..6b0d344a2 100644 --- a/newrelic/api/graphql_trace.py +++ b/newrelic/api/graphql_trace.py @@ -69,8 +69,13 @@ def finalize_data(self, transaction, exc=None, value=None, tb=None): self._add_agent_attribute("graphql.operation.type", self.operation_type) self._add_agent_attribute("graphql.operation.name", self.operation_name) + settings = transaction.settings + if settings and settings.agent_limits and settings.agent_limits.sql_query_length_maximum: + limit = transaction.settings.agent_limits.sql_query_length_maximum + else: + limit = 0 + # Attach formatted graphql - limit = transaction.settings.agent_limits.sql_query_length_maximum self.graphql = graphql = self.formatted[:limit] self._add_agent_attribute("graphql.operation.query", graphql) diff --git a/newrelic/api/message_transaction.py 
b/newrelic/api/message_transaction.py index 291a3897e..54a71f6ef 100644 --- a/newrelic/api/message_transaction.py +++ b/newrelic/api/message_transaction.py @@ -13,7 +13,6 @@ # limitations under the License. import functools -import sys from newrelic.api.application import Application, application_instance from newrelic.api.background_task import BackgroundTask @@ -39,7 +38,6 @@ def __init__( transport_type="AMQP", source=None, ): - name, group = self.get_transaction_name(library, destination_type, destination_name) super(MessageTransaction, self).__init__(application, name, group=group, source=source) @@ -218,30 +216,12 @@ def create_transaction(transaction): manager = create_transaction(current_transaction(active_only=False)) + # This means that a transaction already exists, so we want to return if not manager: return wrapped(*args, **kwargs) - success = True - - try: - manager.__enter__() - try: - return wrapped(*args, **kwargs) - except: # Catch all - success = False - if not manager.__exit__(*sys.exc_info()): - raise - finally: - if success and manager._ref_count == 0: - manager._is_finalized = True - manager.__exit__(None, None, None) - else: - manager._request_handler_finalize = True - manager._server_adapter_finalize = True - - old_transaction = current_transaction() - if old_transaction is not None: - old_transaction.drop_transaction() + with manager: + return wrapped(*args, **kwargs) return FunctionWrapper(wrapped, wrapper) diff --git a/newrelic/api/profile_trace.py b/newrelic/api/profile_trace.py index 28113b1d8..93aa191a4 100644 --- a/newrelic/api/profile_trace.py +++ b/newrelic/api/profile_trace.py @@ -13,31 +13,27 @@ # limitations under the License. import functools -import sys import os +import sys -from newrelic.packages import six - -from newrelic.api.time_trace import current_trace +from newrelic import __file__ as AGENT_PACKAGE_FILE from newrelic.api.function_trace import FunctionTrace -from newrelic.common.object_wrapper import FunctionWrapper, wrap_object +from newrelic.api.time_trace import current_trace from newrelic.common.object_names import callable_name +from newrelic.common.object_wrapper import FunctionWrapper, wrap_object +from newrelic.packages import six -from newrelic import __file__ as AGENT_PACKAGE_FILE -AGENT_PACKAGE_DIRECTORY = os.path.dirname(AGENT_PACKAGE_FILE) + '/' +AGENT_PACKAGE_DIRECTORY = os.path.dirname(AGENT_PACKAGE_FILE) + "/" class ProfileTrace(object): - def __init__(self, depth): self.function_traces = [] self.maximum_depth = depth self.current_depth = 0 - def __call__(self, frame, event, arg): - - if event not in ['call', 'c_call', 'return', 'c_return', - 'exception', 'c_exception']: + def __call__(self, frame, event, arg): # pragma: no cover + if event not in ["call", "c_call", "return", "c_return", "exception", "c_exception"]: return parent = current_trace() @@ -49,8 +45,7 @@ def __call__(self, frame, event, arg): # coroutine systems based on greenlets so don't run # if we detect may be using greenlets. - if (hasattr(sys, '_current_frames') and - parent.thread_id not in sys._current_frames()): + if hasattr(sys, "_current_frames") and parent.thread_id not in sys._current_frames(): return co = frame.f_code @@ -84,7 +79,7 @@ def _callable(): except Exception: pass - if event in ['call', 'c_call']: + if event in ["call", "c_call"]: # Skip the outermost as we catch that with the root # function traces for the profile trace.
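For context, the profile_trace decorator reformatted in the hunks below wraps the decorated call in a FunctionTrace and temporarily installs the ProfileTrace object above via sys.setprofile, so calls nested up to the configured depth are recorded as child function traces. A minimal usage sketch, with hypothetical handler and inner functions:

from newrelic.api.profile_trace import profile_trace

def inner():
    return sum(range(10))  # calls observed by the profiler become FunctionTraces

@profile_trace(depth=3)  # record nested calls up to three frames deep
def handler():
    return inner()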
@@ -100,19 +95,17 @@ def _callable(): self.function_traces.append(None) return - if event == 'call': + if event == "call": func = _callable() if func: name = callable_name(func) else: - name = '%s:%s#%s' % (func_filename, func_name, - func_line_no) + name = "%s:%s#%s" % (func_filename, func_name, func_line_no) else: func = arg name = callable_name(arg) if not name: - name = '%s:@%s#%s' % (func_filename, func_name, - func_line_no) + name = "%s:@%s#%s" % (func_filename, func_name, func_line_no) function_trace = FunctionTrace(name=name, parent=parent) function_trace.__enter__() @@ -127,7 +120,7 @@ def _callable(): self.function_traces.append(function_trace) self.current_depth += 1 - elif event in ['return', 'c_return', 'c_exception']: + elif event in ["return", "c_return", "c_exception"]: if not self.function_traces: return @@ -143,9 +136,7 @@ def _callable(): self.current_depth -= 1 -def ProfileTraceWrapper(wrapped, name=None, group=None, label=None, - params=None, depth=3): - +def ProfileTraceWrapper(wrapped, name=None, group=None, label=None, params=None, depth=3): def wrapper(wrapped, instance, args, kwargs): parent = current_trace() @@ -192,7 +183,7 @@ def wrapper(wrapped, instance, args, kwargs): _params = params with FunctionTrace(_name, _group, _label, _params, parent=parent, source=wrapped): - if not hasattr(sys, 'getprofile'): + if not hasattr(sys, "getprofile"): return wrapped(*args, **kwargs) profiler = sys.getprofile() @@ -212,11 +203,8 @@ def wrapper(wrapped, instance, args, kwargs): def profile_trace(name=None, group=None, label=None, params=None, depth=3): - return functools.partial(ProfileTraceWrapper, name=name, - group=group, label=label, params=params, depth=depth) + return functools.partial(ProfileTraceWrapper, name=name, group=group, label=label, params=params, depth=depth) -def wrap_profile_trace(module, object_path, name=None, - group=None, label=None, params=None, depth=3): - return wrap_object(module, object_path, ProfileTraceWrapper, - (name, group, label, params, depth)) +def wrap_profile_trace(module, object_path, name=None, group=None, label=None, params=None, depth=3): + return wrap_object(module, object_path, ProfileTraceWrapper, (name, group, label, params, depth)) diff --git a/newrelic/api/settings.py b/newrelic/api/settings.py index fc70eb0d4..5cc9ba79f 100644 --- a/newrelic/api/settings.py +++ b/newrelic/api/settings.py @@ -12,10 +12,15 @@ # See the License for the specific language governing permissions and # limitations under the License. +import logging + import newrelic.core.config settings = newrelic.core.config.global_settings +_logger = logging.getLogger(__name__) + + RECORDSQL_OFF = 'off' RECORDSQL_RAW = 'raw' RECORDSQL_OBFUSCATED = 'obfuscated' @@ -23,5 +28,26 @@ COMPRESSED_CONTENT_ENCODING_DEFLATE = 'deflate' COMPRESSED_CONTENT_ENCODING_GZIP = 'gzip' -STRIP_EXCEPTION_MESSAGE = ("Message removed by New Relic " - "'strip_exception_messages' setting") +STRIP_EXCEPTION_MESSAGE = ("Message removed by New Relic 'strip_exception_messages' setting") + + +def set_error_group_callback(callback, application=None): + """Set the current callback to be used to determine error groups.""" + from newrelic.api.application import application_instance + + if callback is not None and not callable(callback): + _logger.error("Error group callback must be a callable, or None to unset this setting.") + return + + # Check for activated application if it exists and was not given. 
+ application = application_instance(activate=False) if application is None else application + + # Get application settings if it exists, or fallback to global settings object + _settings = application.settings if application is not None else settings() + + if _settings is None: + _logger.error("Failed to set error_group_callback in application settings. Report this issue to New Relic support.") + return + + if _settings.error_collector: + _settings.error_collector._error_group_callback = callback diff --git a/newrelic/api/time_trace.py b/newrelic/api/time_trace.py index dc010674c..24be0e00f 100644 --- a/newrelic/api/time_trace.py +++ b/newrelic/api/time_trace.py @@ -30,6 +30,8 @@ from newrelic.core.config import is_expected_error, should_ignore_error from newrelic.core.trace_cache import trace_cache +from newrelic.packages import six + _logger = logging.getLogger(__name__) @@ -255,13 +257,15 @@ def _observe_exception(self, exc_info=None, ignore=None, expected=None, status_c if getattr(value, "_nr_ignored", None): return - module, name, fullnames, message = parse_exc_info((exc, value, tb)) + module, name, fullnames, message_raw = parse_exc_info((exc, value, tb)) fullname = fullnames[0] # Check to see if we need to strip the message before recording it. if settings.strip_exception_messages.enabled and fullname not in settings.strip_exception_messages.allowlist: message = STRIP_EXCEPTION_MESSAGE + else: + message = message_raw # Where expected or ignore are a callable they should return a # tri-state variable with the following behavior. @@ -327,6 +331,10 @@ def _observe_exception(self, exc_info=None, ignore=None, expected=None, status_c if is_expected is None and callable(expected): is_expected = expected(exc, value, tb) + # Callable on transaction + if is_expected is None and hasattr(transaction, "_expect_errors"): + is_expected = transaction._expect_errors(exc, value, tb) + # List of class names if is_expected is None and expected is not None and not callable(expected): # Do not set is_expected to False @@ -340,7 +348,7 @@ def _observe_exception(self, exc_info=None, ignore=None, expected=None, status_c is_expected = is_expected_error(exc_info, status_code=status_code, settings=settings) # Record a supportability metric if error attributes are being - # overiden. + # overridden. if "error.class" in self.agent_attributes: transaction._record_supportability("Supportability/SpanEvent/Errors/Dropped") @@ -349,11 +357,23 @@ def _observe_exception(self, exc_info=None, ignore=None, expected=None, status_c self._add_agent_attribute("error.message", message) self._add_agent_attribute("error.expected", is_expected) - return fullname, message, tb, is_expected + return fullname, message, message_raw, tb, is_expected def notice_error(self, error=None, attributes=None, expected=None, ignore=None, status_code=None): attributes = attributes if attributes is not None else {} + # If no exception details provided, use current exception. 
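The error group callback registered through set_error_group_callback above is consulted by notice_error in the time_trace.py hunks that follow. A sketch of a caller-side registration, with a hypothetical grouping rule:

from newrelic.api.settings import set_error_group_callback

def error_group(exc_value, data):
    # data carries keys such as "error.class", "error.message",
    # "transactionName" and "response.status" (see the invocation below).
    if data.get("response.status") == 500:
        return "ServerErrors"
    return None  # a falsy result leaves the default grouping in place

set_error_group_callback(error_group)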
+ + # Pull from sys.exc_info if no exception is passed + if not error or None in error: + error = sys.exc_info() + + # If no exception to report, exit + if not error or None in error: + return + + exc, value, tb = error + recorded = self._observe_exception( error, ignore=ignore, @@ -361,7 +381,7 @@ def notice_error(self, error=None, attributes=None, expected=None, ignore=None, status_code=status_code, ) if recorded: - fullname, message, tb, is_expected = recorded + fullname, message, message_raw, tb, is_expected = recorded transaction = self.transaction settings = transaction and transaction.settings @@ -388,16 +408,45 @@ def notice_error(self, error=None, attributes=None, expected=None, ignore=None, ) custom_params = {} - if settings and settings.code_level_metrics and settings.code_level_metrics.enabled: - source = extract_code_from_traceback(tb) - else: - source = None + # Extract additional details about the exception + + source = None + error_group_name = None + if settings: + if settings.code_level_metrics and settings.code_level_metrics.enabled: + source = extract_code_from_traceback(tb) + + if settings.error_collector and settings.error_collector.error_group_callback is not None: + try: + # Call callback to obtain error group name + input_attributes = {} + input_attributes.update(transaction._custom_params) + input_attributes.update(attributes) + error_group_name_raw = settings.error_collector.error_group_callback(value, { + "traceback": tb, + "error.class": exc, + "error.message": message_raw, + "error.expected": is_expected, + "custom_params": input_attributes, + "transactionName": getattr(transaction, "name", None), + "response.status": getattr(transaction, "_response_code", None), + "request.method": getattr(transaction, "_request_method", None), + "request.uri": getattr(transaction, "_request_uri", None), + }) + if error_group_name_raw: + _, error_group_name = process_user_attribute("error.group.name", error_group_name_raw) + if error_group_name is None or not isinstance(error_group_name, six.string_types): + raise ValueError("Invalid attribute value for error.group.name. 
Expected string, got: %s" % repr(error_group_name_raw)) + except Exception: + _logger.error("Encountered error when calling error group callback:\n%s", "".join(traceback.format_exception(*sys.exc_info()))) + error_group_name = None transaction._create_error_node( settings, fullname, message, is_expected, + error_group_name, custom_params, self.guid, tb, @@ -630,8 +679,9 @@ def get_service_linking_metadata(application=None, settings=None): if not settings: if application is None: from newrelic.api.application import application_instance + application = application_instance(activate=False) - + if application is not None: settings = application.settings diff --git a/newrelic/api/transaction.py b/newrelic/api/transaction.py index f486989b4..d2bfc8528 100644 --- a/newrelic/api/transaction.py +++ b/newrelic/api/transaction.py @@ -25,13 +25,11 @@ import weakref from collections import OrderedDict -from newrelic.api.application import application_instance import newrelic.core.database_node import newrelic.core.error_node -from newrelic.core.log_event_node import LogEventNode import newrelic.core.root_node import newrelic.core.transaction_node -import newrelic.packages.six as six +from newrelic.api.application import application_instance from newrelic.api.time_trace import TimeTrace, get_linking_metadata from newrelic.common.encoding_utils import ( DistributedTracePayload, @@ -48,6 +46,7 @@ obfuscate, ) from newrelic.core.attribute import ( + MAX_ATTRIBUTE_LENGTH, MAX_LOG_MESSAGE_LENGTH, MAX_NUM_USER_ATTRIBUTES, create_agent_attributes, @@ -61,8 +60,9 @@ DST_NONE, DST_TRANSACTION_TRACER, ) -from newrelic.core.config import DEFAULT_RESERVOIR_SIZE, LOG_EVENT_RESERVOIR_SIZE +from newrelic.core.config import CUSTOM_EVENT_RESERVOIR_SIZE, LOG_EVENT_RESERVOIR_SIZE from newrelic.core.custom_event import create_custom_event +from newrelic.core.log_event_node import LogEventNode from newrelic.core.stack_trace import exception_stack from newrelic.core.stats_engine import CustomMetrics, SampledDataSet from newrelic.core.thread_utilization import utilization_tracker @@ -71,6 +71,7 @@ TraceCacheNoActiveTraceError, trace_cache, ) +from newrelic.packages import six _logger = logging.getLogger(__name__) @@ -120,7 +121,7 @@ def complete_root(self): self.exited = True @staticmethod - def complete_trace(): + def complete_trace(): # pylint: disable=arguments-differ pass @property @@ -158,13 +159,11 @@ def path(self): class Transaction(object): - STATE_PENDING = 0 STATE_RUNNING = 1 STATE_STOPPED = 2 def __init__(self, application, enabled=None, source=None): - self._application = application self._source = source @@ -186,6 +185,8 @@ def __init__(self, application, enabled=None, source=None): self._loop_time = 0.0 self._frameworks = set() + self._message_brokers = set() + self._dispatchers = set() self._frozen_path = None @@ -324,10 +325,14 @@ def __init__(self, application, enabled=None, source=None): self.enabled = True if self._settings: - self._custom_events = SampledDataSet(capacity=self._settings.event_harvest_config.harvest_limits.custom_event_data) - self._log_events = SampledDataSet(capacity=self._settings.event_harvest_config.harvest_limits.log_event_data) + self._custom_events = SampledDataSet( + capacity=self._settings.event_harvest_config.harvest_limits.custom_event_data + ) + self._log_events = SampledDataSet( + capacity=self._settings.event_harvest_config.harvest_limits.log_event_data + ) else: - self._custom_events = SampledDataSet(capacity=DEFAULT_RESERVOIR_SIZE) + self._custom_events = 
SampledDataSet(capacity=CUSTOM_EVENT_RESERVOIR_SIZE) self._log_events = SampledDataSet(capacity=LOG_EVENT_RESERVOIR_SIZE) def __del__(self): @@ -336,7 +341,6 @@ def __del__(self): self.__exit__(None, None, None) def __enter__(self): - assert self._state == self.STATE_PENDING # Bail out if the transaction is not enabled. @@ -396,7 +400,6 @@ def __enter__(self): return self def __exit__(self, exc, value, tb): - # Bail out if the transaction is not enabled. if not self.enabled: @@ -541,6 +544,14 @@ def __exit__(self, exc, value, tb): for framework, version in self._frameworks: self.record_custom_metric("Python/Framework/%s/%s" % (framework, version), 1) + if self._message_brokers: + for message_broker, version in self._message_brokers: + self.record_custom_metric("Python/MessageBroker/%s/%s" % (message_broker, version), 1) + + if self._dispatchers: + for dispatcher, version in self._dispatchers: + self.record_custom_metric("Python/Dispatcher/%s/%s" % (dispatcher, version), 1) + if self._settings.distributed_tracing.enabled: # Sampled and priority need to be computed at the end of the # transaction when distributed tracing or span events are enabled. @@ -621,7 +632,6 @@ def __exit__(self, exc, value, tb): # new samples can cause an error. if not self.ignore_transaction: - self._application.record_transaction(node) @property @@ -828,7 +838,7 @@ def trace_intrinsics(self): # Add in special CPU time value for UI to display CPU burn. - # XXX Disable cpu time value for CPU burn as was + # TODO: Disable cpu time value for CPU burn as was # previously reporting incorrect value and we need to # fix it, at least on Linux to report just the CPU time # for the executing thread. @@ -914,9 +924,7 @@ def filter_request_parameters(self, params): @property def request_parameters(self): if (self.capture_params is None) or self.capture_params: - if self._request_params: - r_attrs = {} for k, v in self._request_params.items(): @@ -1022,7 +1030,9 @@ def _create_distributed_trace_data(self): settings = self._settings account_id = settings.account_id - trusted_account_key = settings.trusted_account_key + trusted_account_key = settings.trusted_account_key or ( + self._settings.serverless_mode.enabled and self._settings.account_id + ) application_id = settings.primary_application_id if not (account_id and application_id and trusted_account_key and settings.distributed_tracing.enabled): @@ -1080,7 +1090,6 @@ def _generate_distributed_trace_headers(self, data=None): try: data = data or self._create_distributed_trace_data() if data: - traceparent = W3CTraceParent(data).text() yield ("traceparent", traceparent) @@ -1114,7 +1123,10 @@ def _can_accept_distributed_trace_headers(self): return False settings = self._settings - if not (settings.distributed_tracing.enabled and settings.trusted_account_key): + trusted_account_key = settings.trusted_account_key or ( + self._settings.serverless_mode.enabled and self._settings.account_id + ) + if not (settings.distributed_tracing.enabled and trusted_account_key): return False if self._distributed_trace_state: @@ -1160,10 +1172,13 @@ def _accept_distributed_trace_payload(self, payload, transport_type="HTTP"): settings = self._settings account_id = data.get("ac") + trusted_account_key = settings.trusted_account_key or ( + self._settings.serverless_mode.enabled and self._settings.account_id + ) # If trust key doesn't exist in the payload, use account_id received_trust_key = data.get("tk", account_id) - if settings.trusted_account_key != received_trust_key: + if trusted_account_key != 
received_trust_key: self._record_supportability("Supportability/DistributedTrace/AcceptPayload/Ignored/UntrustedAccount") if settings.debug.log_untrusted_distributed_trace_keys: _logger.debug( @@ -1177,11 +1192,10 @@ def _accept_distributed_trace_payload(self, payload, transport_type="HTTP"): except: return False - if "pr" in data: - try: - data["pr"] = float(data["pr"]) - except: - data["pr"] = None + try: + data["pr"] = float(data["pr"]) + except Exception: + data["pr"] = None self._accept_distributed_trace_data(data, transport_type) self._record_supportability("Supportability/DistributedTrace/AcceptPayload/Success") @@ -1273,8 +1287,10 @@ def accept_distributed_trace_headers(self, headers, transport_type="HTTP"): tracestate = ensure_str(tracestate) try: vendors = W3CTraceState.decode(tracestate) - tk = self._settings.trusted_account_key - payload = vendors.pop(tk + "@nr", "") + trusted_account_key = self._settings.trusted_account_key or ( + self._settings.serverless_mode.enabled and self._settings.account_id + ) + payload = vendors.pop(trusted_account_key + "@nr", "") self.tracing_vendors = ",".join(vendors.keys()) self.tracestate = vendors.text(limit=31) except: @@ -1283,7 +1299,7 @@ def accept_distributed_trace_headers(self, headers, transport_type="HTTP"): # Remove trusted new relic header if available and parse if payload: try: - tracestate_data = NrTraceState.decode(payload, tk) + tracestate_data = NrTraceState.decode(payload, trusted_account_key) except: tracestate_data = None if tracestate_data: @@ -1367,7 +1383,6 @@ def _generate_response_headers(self, read_length=None): # process web external calls. if self.client_cross_process_id is not None: - # Need to work out queueing time and duration up to this # point for inclusion in metrics and response header. If the # recording of the transaction had been prematurely stopped @@ -1411,11 +1426,17 @@ def _generate_response_headers(self, read_length=None): return nr_headers - def get_response_metadata(self): + # This function is CAT related and has been deprecated. + # Eventually, this will be removed. Until then, coverage + # does not need to factor this function into its analysis. + def get_response_metadata(self): # pragma: no cover nr_headers = dict(self._generate_response_headers()) return convert_to_cat_metadata_value(nr_headers) - def process_request_metadata(self, cat_linking_value): + # This function is CAT related and has been deprecated. + # Eventually, this will be removed. Until then, coverage + # does not need to factor this function into its analysis. + def process_request_metadata(self, cat_linking_value): # pragma: no cover try: payload = base64_decode(cat_linking_value) except: @@ -1432,7 +1453,6 @@ def process_request_metadata(self, cat_linking_value): return self._process_incoming_cat_headers(encoded_cross_process_id, encoded_txn_header) def set_transaction_name(self, name, group=None, priority=None): - # Always perform this operation even if the transaction # is not active at the time as will be called from # constructor. 
If path has been frozen do not allow @@ -1473,32 +1493,38 @@ def set_transaction_name(self, name, group=None, priority=None): self._group = group self._name = name - def record_log_event(self, message, level=None, timestamp=None, priority=None): settings = self.settings - if not (settings and settings.application_logging and settings.application_logging.enabled and settings.application_logging.forwarding and settings.application_logging.forwarding.enabled): + if not ( + settings + and settings.application_logging + and settings.application_logging.enabled + and settings.application_logging.forwarding + and settings.application_logging.forwarding.enabled + ): return - + timestamp = timestamp if timestamp is not None else time.time() level = str(level) if level is not None else "UNKNOWN" - + if not message or message.isspace(): _logger.debug("record_log_event called where message was missing. No log event will be sent.") return - + message = truncate(message, MAX_LOG_MESSAGE_LENGTH) event = LogEventNode( timestamp=timestamp, level=level, message=message, - attributes=get_linking_metadata(), + attributes=get_linking_metadata(), ) self._log_events.add(event, priority=priority) - - def record_exception(self, exc=None, value=None, tb=None, params=None, ignore_errors=None): + # This function has been deprecated (and will be removed eventually) + # and therefore does not need to be included in coverage analysis + def record_exception(self, exc=None, value=None, tb=None, params=None, ignore_errors=None): # pragma: no cover # Deprecation Warning warnings.warn( ("The record_exception function is deprecated. Please use the new api named notice_error instead."), @@ -1529,7 +1555,9 @@ def notice_error(self, error=None, attributes=None, expected=None, ignore=None, status_code=status_code, ) - def _create_error_node(self, settings, fullname, message, expected, custom_params, span_id, tb, source): + def _create_error_node( + self, settings, fullname, message, expected, error_group_name, custom_params, span_id, tb, source + ): # Only remember up to limit of what can be caught for a # single transaction. This could be trimmed further # later if there are already recorded errors and would @@ -1558,12 +1586,11 @@ def _create_error_node(self, settings, fullname, message, expected, custom_param span_id=span_id, stack_trace=exception_stack(tb), custom_params=custom_params, - file_name=None, - line_number=None, source=source, + error_group_name=error_group_name, ) - # TODO Errors are recorded in time order. If + # TODO: Errors are recorded in time order. If # there are two exceptions of same type and # different message, the UI displays the first # one. 
In the PHP agent it was recording the @@ -1603,6 +1630,8 @@ def _process_node(self, node): if type(node) is newrelic.core.database_node.DatabaseNode: settings = self._settings + if not settings: + return if not settings.collect_traces: return if not settings.slow_sql.enabled and not settings.transaction_tracer.explain_enabled: @@ -1633,12 +1662,12 @@ def stop_recording(self): self._cpu_user_time_end = os.times()[0] - def add_custom_parameter(self, name, value): + def add_custom_attribute(self, name, value): if not self._settings: return False if self._settings.high_security: - _logger.debug("Cannot add custom parameter in High Security Mode.") + _logger.debug("Cannot add custom attribute in High Security Mode.") return False if len(self._custom_params) >= MAX_NUM_USER_ATTRIBUTES: @@ -1653,19 +1682,47 @@ def add_custom_parameter(self, name, value): self._custom_params[key] = val return True - def add_custom_parameters(self, items): + def add_custom_attributes(self, items): result = True # items is a list of (name, value) tuples. for name, value in items: - result &= self.add_custom_parameter(name, value) + result &= self.add_custom_attribute(name, value) return result + # This function has been deprecated (and will be removed eventually) + # and therefore does not need to be included in coverage analysis + def add_custom_parameter(self, name, value): # pragma: no cover + # Deprecation warning + warnings.warn( + ("The add_custom_parameter API has been deprecated. " "Please use the add_custom_attribute API."), + DeprecationWarning, + ) + return self.add_custom_attribute(name, value) + + # This function has been deprecated (and will be removed eventually) + # and therefore does not need to be included in coverage analysis + def add_custom_parameters(self, items): # pragma: no cover + # Deprecation warning + warnings.warn( + ("The add_custom_parameters API has been deprecated. " "Please use the add_custom_attributes API."), + DeprecationWarning, + ) + return self.add_custom_attributes(items) + def add_framework_info(self, name, version=None): if name: self._frameworks.add((name, version)) + def add_messagebroker_info(self, name, version=None): + if name: + self._message_brokers.add((name, version)) + + def add_dispatcher_info(self, name, version=None): + if name: + self._dispatchers.add((name, version)) + def dump(self, file): """Dumps details about the transaction to the file object.""" @@ -1734,22 +1791,59 @@ def capture_request_params(flag=True): transaction.capture_params = flag -def add_custom_parameter(key, value): +def add_custom_attribute(key, value): transaction = current_transaction() if transaction: - return transaction.add_custom_parameter(key, value) + return transaction.add_custom_attribute(key, value) else: return False -def add_custom_parameters(items): +def add_custom_attributes(items): transaction = current_transaction() if transaction: - return transaction.add_custom_parameters(items) + return transaction.add_custom_attributes(items) else: return False +# This function has been deprecated (and will be removed eventually) +# and therefore does not need to be included in coverage analysis +def add_custom_parameter(key, value): # pragma: no cover + # Deprecation warning + warnings.warn( + ("The add_custom_parameter API has been deprecated. 
Please use the add_custom_attribute API."), + DeprecationWarning, + ) + return add_custom_attribute(key, value) + + +# This function has been deprecated (and will be removed eventually) +# and therefore does not need to be included in coverage analysis +def add_custom_parameters(items): # pragma: no cover + # Deprecation warning + warnings.warn( + ("The add_custom_parameters API has been deprecated. Please use the add_custom_attributes API."), + DeprecationWarning, + ) + return add_custom_attributes(items) + + +def set_user_id(user_id): + transaction = current_transaction() + + if not user_id or not transaction: + return + + if not isinstance(user_id, six.string_types): + _logger.warning("The set_user_id API requires a string-based user ID.") + return + + user_id = truncate(user_id, MAX_ATTRIBUTE_LENGTH) + + transaction._add_agent_attribute("enduser.id", user_id) + + def add_framework_info(name, version=None): transaction = current_transaction() if transaction: @@ -1869,7 +1963,9 @@ def record_log_event(message, level=None, timestamp=None, application=None, prio "record_log_event has been called but no transaction or application was running. As a result, " "the following event has not been recorded. message: %r level: %r timestamp %r. To correct " "this problem, supply an application object as a parameter to this record_log_event call.", - message, level, timestamp, + message, + level, + timestamp, ) elif application.enabled: application.record_log_event(message, level, timestamp, priority=priority) diff --git a/newrelic/api/wsgi_application.py b/newrelic/api/wsgi_application.py index 0f4d30454..67338cbdd 100644 --- a/newrelic/api/wsgi_application.py +++ b/newrelic/api/wsgi_application.py @@ -18,9 +18,6 @@ import time from newrelic.api.application import application_instance -from newrelic.api.transaction import current_transaction -from newrelic.api.time_trace import notice_error -from newrelic.api.web_transaction import WSGIWebTransaction from newrelic.api.function_trace import FunctionTrace, FunctionTraceWrapper from newrelic.api.html_insertion import insert_html_snippet, verify_body_exists from newrelic.api.time_trace import notice_error @@ -80,12 +77,12 @@ def close(self): self.response_trace = None try: - with FunctionTrace(name='Finalize', group='Python/WSGI'): + with FunctionTrace(name="Finalize", group="Python/WSGI"): if isinstance(self.generator, _WSGIApplicationMiddleware): self.generator.close() - elif hasattr(self.generator, 'close'): + elif hasattr(self.generator, "close"): FunctionTraceWrapper(self.generator.close)() except: # Catch all @@ -437,7 +434,7 @@ def close(self): # Call close() on the iterable as required by the # WSGI specification. - if hasattr(self.iterable, 'close'): + if hasattr(self.iterable, "close"): FunctionTraceWrapper(self.iterable.close)() def __iter__(self): @@ -510,7 +507,7 @@ def __iter__(self): yield data -def WSGIApplicationWrapper(wrapped, application=None, name=None, group=None, framework=None): +def WSGIApplicationWrapper(wrapped, application=None, name=None, group=None, framework=None, dispatcher=None): # Python 2 does not allow rebinding nonlocal variables, so to fix this # framework must be stored in list so it can be edited by closure. 
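The WSGI hunks below mirror the ASGI changes above, threading the new dispatcher argument through to add_dispatcher_info. A sketch of how a caller might supply it, with hypothetical version tuples:

from newrelic.api.wsgi_application import wsgi_application

@wsgi_application(framework=("Flask", "2.2.2"), dispatcher=("Gunicorn", "20.1.0"))
def app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello"]

At transaction exit this surfaces as a Python/Dispatcher/Gunicorn/20.1.0 custom metric, per the transaction.py hunk above.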
@@ -556,6 +553,9 @@ def _nr_wsgi_application_wrapper_(wrapped, instance, args, kwargs): if framework: transaction.add_framework_info(name=framework[0], version=framework[1]) + if dispatcher: + transaction.add_dispatcher_info(name=dispatcher[0], version=dispatcher[1]) + # Also override the web transaction name to be the name of # the wrapped callable if not explicitly named, and we want # the default name to be that of the WSGI component for the @@ -618,6 +618,9 @@ def _args(environ, start_response, *args, **kwargs): if framework: transaction.add_framework_info(name=framework[0], version=framework[1]) + if dispatcher: + transaction.add_dispatcher_info(name=dispatcher[0], version=dispatcher[1]) + # Override the initial web transaction name to be the supplied # name, or the name of the wrapped callable if wanting to use # the callable as the default. This will override the use of a @@ -672,7 +675,7 @@ def write(data): if "wsgi.input" in environ: environ["wsgi.input"] = _WSGIInputWrapper(transaction, environ["wsgi.input"]) - with FunctionTrace(name='Application', group='Python/WSGI'): + with FunctionTrace(name="Application", group="Python/WSGI"): with FunctionTrace(name=callable_name(wrapped), source=wrapped): if settings and settings.browser_monitoring.enabled and not transaction.autorum_disabled: result = _WSGIApplicationMiddleware(wrapped, environ, _start_response, transaction) @@ -688,11 +691,18 @@ def write(data): return FunctionWrapper(wrapped, _nr_wsgi_application_wrapper_) -def wsgi_application(application=None, name=None, group=None, framework=None): +def wsgi_application(application=None, name=None, group=None, framework=None, dispatcher=None): return functools.partial( - WSGIApplicationWrapper, application=application, name=name, group=group, framework=framework + WSGIApplicationWrapper, + application=application, + name=name, + group=group, + framework=framework, + dispatcher=dispatcher, ) -def wrap_wsgi_application(module, object_path, application=None, name=None, group=None, framework=None): - wrap_object(module, object_path, WSGIApplicationWrapper, (application, name, group, framework)) +def wrap_wsgi_application( + module, object_path, application=None, name=None, group=None, framework=None, dispatcher=None +): + wrap_object(module, object_path, WSGIApplicationWrapper, (application, name, group, framework, dispatcher)) diff --git a/newrelic/common/package_version_utils.py b/newrelic/common/package_version_utils.py new file mode 100644 index 000000000..f3d334e2a --- /dev/null +++ b/newrelic/common/package_version_utils.py @@ -0,0 +1,102 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
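Per its docstrings, the new module below resolves a package version by trying module attributes first, then importlib.metadata, then pkg_resources. A usage sketch; the printed values are illustrative and assume requests is installed:

from newrelic.common.package_version_utils import (
    get_package_version,
    get_package_version_tuple,
)

print(get_package_version("requests"))         # e.g. "2.28.1"
print(get_package_version_tuple("requests"))   # e.g. (2, 28, 1)
print(get_package_version("no-such-package"))  # None when undetectable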
+ +import sys + +# Need to account for 4 possible variations of version declaration specified in (rejected) PEP 396 +VERSION_ATTRS = ("__version__", "version", "__version_tuple__", "version_tuple") # nosec +NULL_VERSIONS = frozenset((None, "", "0", "0.0", "0.0.0", "0.0.0.0", (0,), (0, 0), (0, 0, 0), (0, 0, 0, 0))) # nosec + + +def get_package_version(name): + """Gets the version string of the library. + :param name: The name of the library. + :type name: str + :return: The version of the library. Returns None if the version can't be determined. + :rtype: str or None + + Usage:: + >>> get_package_version("botocore") + "1.1.0" + """ + + version = _get_package_version(name) + + # Coerce tuple versions into a string + if isinstance(version, tuple): + version = ".".join(str(v) for v in version) + + return version + + +def get_package_version_tuple(name): + """Gets the version tuple of the library. + :param name: The name of the library. + :type name: str + :return: The version of the library. Returns None if the version can't be determined. + :rtype: tuple or None + + Usage:: + >>> get_package_version_tuple("botocore") + (1, 1, 0) + """ + + def int_or_str(value): + try: + return int(value) + except Exception: + return str(value) + + version = _get_package_version(name) + + # Split "." separated strings and cast fields to ints + if isinstance(version, str): + version = tuple(int_or_str(v) for v in version.split(".")) + + return version + + +def _get_package_version(name): + module = sys.modules.get(name, None) + version = None + for attr in VERSION_ATTRS: + try: + version = getattr(module, attr, None) + # In certain cases like importlib_metadata.version, version is a callable + # function. + if callable(version): + continue + # Cast any version specified as a list into a tuple. + version = tuple(version) if isinstance(version, list) else version + if version not in NULL_VERSIONS: + return version + except Exception: + pass + + # importlib.metadata was added to the standard library in Python 3.8. + if "importlib" in sys.modules and hasattr(sys.modules["importlib"], "metadata"): + try: + version = sys.modules["importlib"].metadata.version(name) # pylint: disable=E1101 + if version not in NULL_VERSIONS: + return version + except Exception: + pass + + if "pkg_resources" in sys.modules: + try: + version = sys.modules["pkg_resources"].get_distribution(name).version + if version not in NULL_VERSIONS: + return version + except Exception: + pass diff --git a/newrelic/common/signature.py b/newrelic/common/signature.py new file mode 100644 index 000000000..314998196 --- /dev/null +++ b/newrelic/common/signature.py @@ -0,0 +1,31 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License.
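The signature helper below normalizes positional and keyword arguments into a single mapping with defaults applied, on both Python 2 and 3. A quick illustration with a hypothetical function:

from newrelic.common.signature import bind_args

def connect(host, port=6379, db=0):
    pass

bound = bind_args(connect, ("localhost",), {"db": 2})
# bound maps {'host': 'localhost', 'port': 6379, 'db': 2}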
+ +from newrelic.packages import six + +if six.PY3: + from inspect import Signature + + def bind_args(func, args, kwargs): + """Bind arguments and apply defaults to missing arguments for a callable.""" + bound_args = Signature.from_callable(func).bind(*args, **kwargs) + bound_args.apply_defaults() + return bound_args.arguments + +else: + from inspect import getcallargs + + def bind_args(func, args, kwargs): + """Bind arguments and apply defaults to missing arguments for a callable.""" + return getcallargs(func, *args, **kwargs) diff --git a/newrelic/common/streaming_utils.py b/newrelic/common/streaming_utils.py index ccd0b44ef..ad1b371dc 100644 --- a/newrelic/common/streaming_utils.py +++ b/newrelic/common/streaming_utils.py @@ -17,20 +17,24 @@ import threading try: - from newrelic.core.infinite_tracing_pb2 import AttributeValue + from newrelic.core.infinite_tracing_pb2 import AttributeValue, SpanBatch except: - AttributeValue = None + AttributeValue, SpanBatch = None, None + _logger = logging.getLogger(__name__) class StreamBuffer(object): - def __init__(self, maxlen): + def __init__(self, maxlen, batching=False): self._queue = collections.deque(maxlen=maxlen) self._notify = self.condition() self._shutdown = False self._seen = 0 self._dropped = 0 + self._settings = None + + self.batching = batching @staticmethod def condition(*args, **kwargs): @@ -66,14 +70,23 @@ def stats(self): return seen, dropped + def __bool__(self): + return bool(self._queue) + + def __len__(self): + return len(self._queue) + def __iter__(self): return StreamBufferIterator(self) class StreamBufferIterator(object): + MAX_BATCH_SIZE = 100 + def __init__(self, stream_buffer): self.stream_buffer = stream_buffer self._notify = self.stream_buffer._notify + self.batching = self.stream_buffer.batching self._shutdown = False self._stream = None @@ -100,12 +113,30 @@ def __next__(self): self.shutdown() raise StopIteration - try: - return self.stream_buffer._queue.popleft() - except IndexError: - pass - - if not self.stream_closed() and not self.stream_buffer._queue: + if self.batching: + stream_buffer_len = len(self.stream_buffer) + if stream_buffer_len > self.MAX_BATCH_SIZE: + # Ensure batch size is never more than 100 to prevent issues with serializing large numbers + # of spans causing their age to exceed 10 seconds. That would cause them to be rejected + # by the trace observer. + batch = [self.stream_buffer._queue.popleft() for _ in range(self.MAX_BATCH_SIZE)] + return SpanBatch(spans=batch) + elif stream_buffer_len: + # For small span batches, empty the stream buffer into a list and clear the queue. + # This is only safe to do under the lock, which prevents items being added to the queue. + batch = list(self.stream_buffer._queue) + self.stream_buffer._queue.clear() + return SpanBatch(spans=batch) + + else: + # Send items from stream buffer one at a time. + try: + return self.stream_buffer._queue.popleft() + except IndexError: + pass + + # Wait until items are added to the stream buffer.
+ if not self.stream_closed() and not self.stream_buffer: self._notify.wait() next = __next__ diff --git a/newrelic/config.py b/newrelic/config.py index 4e0912db8..8a041ad34 100644 --- a/newrelic/config.py +++ b/newrelic/config.py @@ -42,9 +42,9 @@ import newrelic.console import newrelic.core.agent import newrelic.core.config -import newrelic.core.trace_cache as trace_cache from newrelic.common.log_file import initialize_logging from newrelic.common.object_names import expand_builtin_exception_name +from newrelic.core import trace_cache from newrelic.core.config import ( Settings, apply_config_setting, @@ -102,11 +102,23 @@ _cache_object = [] + +def _reset_config_parser(): + global _config_object + global _cache_object + _config_object = ConfigParser.RawConfigParser() + _cache_object = [] + + # Mechanism for extracting settings from the configuration for use in # instrumentation modules and extensions. -def extra_settings(section, types={}, defaults={}): +def extra_settings(section, types=None, defaults=None): + if types is None: + types = {} + if defaults is None: + defaults = {} settings = {} if _config_object.has_section(section): @@ -219,12 +231,12 @@ def _map_default_host_value(license_key): def _raise_configuration_error(section, option=None): _logger.error("CONFIGURATION ERROR") if section: - _logger.error("Section = %s" % section) + _logger.error("Section = %s", section) if option is None: options = _config_object.options(section) - _logger.error("Options = %s" % options) + _logger.error("Options = %s", options) _logger.exception("Exception Details") if not _ignore_errors: @@ -234,13 +246,12 @@ def _raise_configuration_error(section, option=None): "Check New Relic agent log file for further " "details." % section ) - else: - raise newrelic.api.exceptions.ConfigurationError( - "Invalid configuration. Check New Relic agent log file for further details." - ) + raise newrelic.api.exceptions.ConfigurationError( + "Invalid configuration. Check New Relic agent log file for further details." + ) else: - _logger.error("Option = %s" % option) + _logger.error("Option = %s", option) _logger.exception("Exception Details") if not _ignore_errors: @@ -250,12 +261,11 @@ def _raise_configuration_error(section, option=None): 'section "%s". Check New Relic agent log ' "file for further details." % (option, section) ) - else: - raise newrelic.api.exceptions.ConfigurationError( - 'Invalid configuration for option "%s". ' - "Check New Relic agent log file for further " - "details." % option - ) + raise newrelic.api.exceptions.ConfigurationError( + 'Invalid configuration for option "%s". ' + "Check New Relic agent log file for further " + "details." 
% option + ) def _process_setting(section, option, getter, mapper): @@ -285,9 +295,8 @@ def _process_setting(section, option, getter, mapper): if len(fields) == 1: setattr(target, fields[0], value) break - else: - target = getattr(target, fields[0]) - fields = fields[1].split(".", 1) + target = getattr(target, fields[0]) + fields = fields[1].split(".", 1) # Cache the configuration so can be dumped out to # log file when whole main configuration has been @@ -530,6 +539,8 @@ def _process_configuration(section): _process_setting(section, "event_harvest_config.harvest_limits.log_event_data", "getint", None) _process_setting(section, "infinite_tracing.trace_observer_host", "get", None) _process_setting(section, "infinite_tracing.trace_observer_port", "getint", None) + _process_setting(section, "infinite_tracing.compression", "getboolean", None) + _process_setting(section, "infinite_tracing.batching", "getboolean", None) _process_setting(section, "infinite_tracing.span_queue_size", "getint", None) _process_setting(section, "code_level_metrics.enabled", "getboolean", None) @@ -547,6 +558,11 @@ def _process_configuration(section): _configuration_done = False +def _reset_configuration_done(): + global _configuration_done + _configuration_done = False + + def _process_app_name_setting(): # Do special processing to handle the case where the application # name was actually a semicolon separated list of names. In this @@ -568,7 +584,7 @@ def _process_app_name_setting(): def _link_applications(application): for altname in linked: - _logger.debug("link to %s" % ((name, altname),)) + _logger.debug("link to %s", ((name, altname),)) application.link_to_application(altname) if linked: @@ -594,21 +610,21 @@ def _process_labels_setting(labels=None): deduped = {} for key, value in labels: - if len(key) > length_limit: _logger.warning( - "Improper configuration. Label key %s is too long. Truncating key to: %s" % (key, key[:length_limit]) + "Improper configuration. Label key %s is too long. Truncating key to: %s", key, key[:length_limit] ) if len(value) > length_limit: _logger.warning( - "Improper configuration. Label value %s is too " - "long. Truncating value to: %s" % (value, value[:length_limit]) + "Improper configuration. Label value %s is too long. Truncating value to: %s", + value, + value[:length_limit], ) if len(deduped) >= count_limit: _logger.warning( - "Improper configuration. Maximum number of labels reached. Using first %d labels." % count_limit + "Improper configuration. Maximum number of labels reached. Using first %d labels.", count_limit ) break @@ -728,8 +744,7 @@ def translate_deprecated_settings(settings, cached_settings): ), ] - for (old_key, new_key) in deprecated_settings_map: - + for old_key, new_key in deprecated_settings_map: if old_key in cached: _logger.info( "Deprecated setting found: %r. Please use new setting: %r.", @@ -753,7 +768,6 @@ def translate_deprecated_settings(settings, cached_settings): # deprecated settings, so it gets handled separately. if "ignored_params" in cached: - _logger.info( "Deprecated setting found: ignored_params. Please use " "new setting: attributes.exclude. For the new setting, an " @@ -885,7 +899,6 @@ def _load_configuration( log_file=None, log_level=None, ): - global _configuration_done global _config_file @@ -908,8 +921,7 @@ def _load_configuration( 'Prior configuration file used was "%s" and ' 'environment "%s".' 
% (_config_file, _environment) ) - else: - return + return _configuration_done = True @@ -923,7 +935,6 @@ def _load_configuration( # If no configuration file then nothing more to be done. if not config_file: - _logger.debug("no agent configuration file") # Force initialisation of the logging system now in case @@ -963,7 +974,7 @@ def _load_configuration( return - _logger.debug("agent configuration file was %s" % config_file) + _logger.debug("agent configuration file was %s", config_file) # Now read in the configuration file. Cache the config file # name in internal settings object as indication of succeeding. @@ -1014,7 +1025,7 @@ def _load_configuration( # against the internal settings object. for option, value in _cache_object: - _logger.debug("agent config %s = %s" % (option, repr(value))) + _logger.debug("agent config %s = %s", option, repr(value)) # Validate provided feature flags and log a warning if get one # which isn't valid. @@ -1063,7 +1074,7 @@ def _load_configuration( terminal = False rollup = None - _logger.debug("register function-trace %s" % ((module, object_path, name, group),)) + _logger.debug("register function-trace %s", ((module, object_path, name, group),)) hook = _function_trace_import_hook(object_path, name, group, label, params, terminal, rollup) newrelic.api.import_hook.register_import_hook(module, hook) @@ -1080,7 +1091,7 @@ def _load_configuration( name = None group = "Function" - _logger.debug("register generator-trace %s" % ((module, object_path, name, group),)) + _logger.debug("register generator-trace %s", ((module, object_path, name, group),)) hook = _generator_trace_import_hook(object_path, name, group) newrelic.api.import_hook.register_import_hook(module, hook) @@ -1091,10 +1102,10 @@ def _load_configuration( # Generic error reporting functions. 
-def _raise_instrumentation_error(type, locals): +def _raise_instrumentation_error(instrumentation_type, locals_dict): _logger.error("INSTRUMENTATION ERROR") - _logger.error("Type = %s" % type) - _logger.error("Locals = %s" % locals) + _logger.error("Type = %s", instrumentation_type) + _logger.error("Locals = %s", locals_dict) _logger.exception("Exception Details") if not _ignore_errors: @@ -1115,7 +1126,7 @@ def module_import_hook_results(): def _module_import_hook(target, module, function): def _instrument(target): - _logger.debug("instrument module %s" % ((target, module, function),)) + _logger.debug("instrument module %s", ((target, module, function),)) try: instrumented = target._nr_instrumented @@ -1123,7 +1134,7 @@ def _instrument(target): instrumented = target._nr_instrumented = set() if (module, function) in instrumented: - _logger.debug("instrumentation already run %s" % ((target, module, function),)) + _logger.debug("instrumentation already run %s", ((target, module, function),)) return instrumented.add((module, function)) @@ -1173,7 +1184,7 @@ def _process_module_configuration(): if target not in _module_import_hook_registry: _module_import_hook_registry[target] = (module, function) - _logger.debug("register module %s" % ((target, module, function),)) + _logger.debug("register module %s", ((target, module, function),)) hook = _module_import_hook(target, module, function) newrelic.api.import_hook.register_import_hook(target, hook) @@ -1186,7 +1197,7 @@ def _process_module_configuration(): def _module_function_glob(module, object_path): """Match functions and class methods in a module to file globbing syntax.""" - if not any([c in object_path for c in {"*", "?", "["}]): # Identify globbing patterns + if not any((c in object_path for c in ("*", "?", "["))): # Identify globbing patterns return (object_path,) # Returned value must be iterable else: # Gather module functions @@ -1194,7 +1205,7 @@ def _module_function_glob(module, object_path): available_functions = {k: v for k, v in module.__dict__.items() if callable(v) and not isinstance(v, type)} except Exception: # Default to empty dict if no functions available - available_functions = dict() + available_functions = {} # Gather module classes and methods try: @@ -1240,7 +1251,6 @@ def _process_wsgi_application_configuration(): for section in _config_object.sections(): if not section.startswith("wsgi-application:"): continue - enabled = False try: @@ -1262,7 +1272,7 @@ def _process_wsgi_application_configuration(): if _config_object.has_option(section, "application"): application = _config_object.get(section, "application") - _logger.debug("register wsgi-application %s" % ((module, object_path, application),)) + _logger.debug("register wsgi-application %s", ((module, object_path, application),)) hook = _wsgi_application_import_hook(object_path, application) newrelic.api.import_hook.register_import_hook(module, hook) @@ -1318,10 +1328,10 @@ def _process_background_task_configuration(): group = _config_object.get(section, "group") if name and name.startswith("lambda "): - vars = {"callable_name": newrelic.api.object_wrapper.callable_name} - name = eval(name, vars) # nosec + callable_vars = {"callable_name": newrelic.api.object_wrapper.callable_name} + name = eval(name, callable_vars) # nosec, pylint: disable=W0123 - _logger.debug("register background-task %s" % ((module, object_path, application, name, group),)) + _logger.debug("register background-task %s", ((module, object_path, application, name, group),)) hook = 
_background_task_import_hook(object_path, application, name, group) newrelic.api.import_hook.register_import_hook(module, hook) @@ -1368,10 +1378,10 @@ def _process_database_trace_configuration(): sql = _config_object.get(section, "sql") if sql.startswith("lambda "): - vars = {"callable_name": newrelic.api.object_wrapper.callable_name} - sql = eval(sql, vars) # nosec + callable_vars = {"callable_name": newrelic.api.object_wrapper.callable_name} + sql = eval(sql, callable_vars) # nosec, pylint: disable=W0123 - _logger.debug("register database-trace %s" % ((module, object_path, sql),)) + _logger.debug("register database-trace %s", ((module, object_path, sql),)) hook = _database_trace_import_hook(object_path, sql) newrelic.api.import_hook.register_import_hook(module, hook) @@ -1423,14 +1433,14 @@ def _process_external_trace_configuration(): method = _config_object.get(section, "method") if url.startswith("lambda "): - vars = {"callable_name": newrelic.api.object_wrapper.callable_name} - url = eval(url, vars) # nosec + callable_vars = {"callable_name": newrelic.api.object_wrapper.callable_name} + url = eval(url, callable_vars) # nosec, pylint: disable=W0123 if method and method.startswith("lambda "): - vars = {"callable_name": newrelic.api.object_wrapper.callable_name} - method = eval(method, vars) # nosec + callable_vars = {"callable_name": newrelic.api.object_wrapper.callable_name} + method = eval(method, callable_vars) # nosec, pylint: disable=W0123 - _logger.debug("register external-trace %s" % ((module, object_path, library, url, method),)) + _logger.debug("register external-trace %s", ((module, object_path, library, url, method),)) hook = _external_trace_import_hook(object_path, library, url, method) newrelic.api.import_hook.register_import_hook(module, hook) @@ -1495,11 +1505,11 @@ def _process_function_trace_configuration(): rollup = _config_object.get(section, "rollup") if name and name.startswith("lambda "): - vars = {"callable_name": newrelic.api.object_wrapper.callable_name} - name = eval(name, vars) # nosec + callable_vars = {"callable_name": newrelic.api.object_wrapper.callable_name} + name = eval(name, callable_vars) # nosec, pylint: disable=W0123 _logger.debug( - "register function-trace %s" % ((module, object_path, name, group, label, params, terminal, rollup),) + "register function-trace %s", ((module, object_path, name, group, label, params, terminal, rollup),) ) hook = _function_trace_import_hook(object_path, name, group, label, params, terminal, rollup) @@ -1553,10 +1563,10 @@ def _process_generator_trace_configuration(): group = _config_object.get(section, "group") if name and name.startswith("lambda "): - vars = {"callable_name": newrelic.api.object_wrapper.callable_name} - name = eval(name, vars) # nosec + callable_vars = {"callable_name": newrelic.api.object_wrapper.callable_name} + name = eval(name, callable_vars) # nosec, pylint: disable=W0123 - _logger.debug("register generator-trace %s" % ((module, object_path, name, group),)) + _logger.debug("register generator-trace %s", ((module, object_path, name, group),)) hook = _generator_trace_import_hook(object_path, name, group) newrelic.api.import_hook.register_import_hook(module, hook) @@ -1612,10 +1622,10 @@ def _process_profile_trace_configuration(): depth = _config_object.get(section, "depth") if name and name.startswith("lambda "): - vars = {"callable_name": newrelic.api.object_wrapper.callable_name} - name = eval(name, vars) # nosec + callable_vars = {"callable_name": newrelic.api.object_wrapper.callable_name} + 
name = eval(name, callable_vars) # nosec, pylint: disable=W0123 - _logger.debug("register profile-trace %s" % ((module, object_path, name, group, depth),)) + _logger.debug("register profile-trace %s", ((module, object_path, name, group, depth),)) hook = _profile_trace_import_hook(object_path, name, group, depth=depth) newrelic.api.import_hook.register_import_hook(module, hook) @@ -1662,8 +1672,8 @@ def _process_memcache_trace_configuration(): command = _config_object.get(section, "command") if command.startswith("lambda "): - vars = {"callable_name": newrelic.api.object_wrapper.callable_name} - command = eval(command, vars) # nosec + callable_vars = {"callable_name": newrelic.api.object_wrapper.callable_name} + command = eval(command, callable_vars) # nosec, pylint: disable=W0123 _logger.debug("register memcache-trace %s", (module, object_path, command)) @@ -1680,7 +1690,7 @@ def _transaction_name_import_hook(object_path, name, group, priority): def _instrument(target): try: for func in _module_function_glob(target, object_path): - _logger.debug("wrap transaction-name %s" % ((target, func, name, group, priority),)) + _logger.debug("wrap transaction-name %s", ((target, func, name, group, priority),)) newrelic.api.transaction_name.wrap_transaction_name(target, func, name, group, priority) except Exception: _raise_instrumentation_error("transaction-name", locals()) @@ -1722,10 +1732,10 @@ def _process_transaction_name_configuration(): priority = _config_object.getint(section, "priority") if name and name.startswith("lambda "): - vars = {"callable_name": newrelic.api.object_wrapper.callable_name} - name = eval(name, vars) # nosec + callable_vars = {"callable_name": newrelic.api.object_wrapper.callable_name} + name = eval(name, callable_vars) # nosec, pylint: disable=W0123 - _logger.debug("register transaction-name %s" % ((module, object_path, name, group, priority),)) + _logger.debug("register transaction-name %s", ((module, object_path, name, group, priority),)) hook = _transaction_name_import_hook(object_path, name, group, priority) newrelic.api.import_hook.register_import_hook(module, hook) @@ -1883,7 +1893,6 @@ def _startup_data_source(): def _setup_data_source(): - global _data_sources_done if _data_sources_done: @@ -1942,7 +1951,7 @@ def _process_function_profile_configuration(): if _config_object.has_option(section, "checkpoint"): checkpoint = _config_object.getfloat(section, "checkpoint") - _logger.debug("register function-profile %s" % ((module, object_path, filename, delay, checkpoint),)) + _logger.debug("register function-profile %s", ((module, object_path, filename, delay, checkpoint),)) hook = _function_profile_import_hook(object_path, filename, delay, checkpoint) newrelic.api.import_hook.register_import_hook(module, hook) @@ -2179,6 +2188,10 @@ def _process_module_builtin_defaults(): "instrument_graphqlserver", ) + _process_module_definition( + "sentry_sdk.integrations.asgi", "newrelic.hooks.component_sentry", "instrument_sentry_sdk_integrations_asgi" + ) + # _process_module_definition('web.application', # 'newrelic.hooks.framework_webpy') # _process_module_definition('web.template', @@ -2661,56 +2674,149 @@ def _process_module_builtin_defaults(): "aioredis.connection", "newrelic.hooks.datastore_aioredis", "instrument_aioredis_connection" ) + # Redis v4.2+ + _process_module_definition( + "redis.asyncio.client", "newrelic.hooks.datastore_redis", "instrument_asyncio_redis_client" + ) + + # Redis v4.2+ + _process_module_definition( + "redis.asyncio.commands", 
"newrelic.hooks.datastore_redis", "instrument_asyncio_redis_client" + ) + + _process_module_definition( + "redis.asyncio.connection", "newrelic.hooks.datastore_aioredis", "instrument_aioredis_connection" + ) + + # v7 and below _process_module_definition( "elasticsearch.client", "newrelic.hooks.datastore_elasticsearch", "instrument_elasticsearch_client", ) + # v8 and above + _process_module_definition( + "elasticsearch._sync.client", + "newrelic.hooks.datastore_elasticsearch", + "instrument_elasticsearch_client_v8", + ) + + # v7 and below _process_module_definition( "elasticsearch.client.cat", "newrelic.hooks.datastore_elasticsearch", "instrument_elasticsearch_client_cat", ) + # v8 and above + _process_module_definition( + "elasticsearch._sync.client.cat", + "newrelic.hooks.datastore_elasticsearch", + "instrument_elasticsearch_client_cat_v8", + ) + + # v7 and below _process_module_definition( "elasticsearch.client.cluster", "newrelic.hooks.datastore_elasticsearch", "instrument_elasticsearch_client_cluster", ) + # v8 and above + _process_module_definition( + "elasticsearch._sync.client.cluster", + "newrelic.hooks.datastore_elasticsearch", + "instrument_elasticsearch_client_cluster_v8", + ) + + # v7 and below _process_module_definition( "elasticsearch.client.indices", "newrelic.hooks.datastore_elasticsearch", "instrument_elasticsearch_client_indices", ) + # v8 and above + _process_module_definition( + "elasticsearch._sync.client.indices", + "newrelic.hooks.datastore_elasticsearch", + "instrument_elasticsearch_client_indices_v8", + ) + + # v7 and below _process_module_definition( "elasticsearch.client.nodes", "newrelic.hooks.datastore_elasticsearch", "instrument_elasticsearch_client_nodes", ) + # v8 and above + _process_module_definition( + "elasticsearch._sync.client.nodes", + "newrelic.hooks.datastore_elasticsearch", + "instrument_elasticsearch_client_nodes_v8", + ) + + # v7 and below _process_module_definition( "elasticsearch.client.snapshot", "newrelic.hooks.datastore_elasticsearch", "instrument_elasticsearch_client_snapshot", ) + # v8 and above + _process_module_definition( + "elasticsearch._sync.client.snapshot", + "newrelic.hooks.datastore_elasticsearch", + "instrument_elasticsearch_client_snapshot_v8", + ) + + # v7 and below _process_module_definition( "elasticsearch.client.tasks", "newrelic.hooks.datastore_elasticsearch", "instrument_elasticsearch_client_tasks", ) + # v8 and above + _process_module_definition( + "elasticsearch._sync.client.tasks", + "newrelic.hooks.datastore_elasticsearch", + "instrument_elasticsearch_client_tasks_v8", + ) + + # v7 and below _process_module_definition( "elasticsearch.client.ingest", "newrelic.hooks.datastore_elasticsearch", "instrument_elasticsearch_client_ingest", ) + # v8 and above + _process_module_definition( + "elasticsearch._sync.client.ingest", + "newrelic.hooks.datastore_elasticsearch", + "instrument_elasticsearch_client_ingest_v8", + ) + + # v7 and below _process_module_definition( "elasticsearch.connection.base", "newrelic.hooks.datastore_elasticsearch", "instrument_elasticsearch_connection_base", ) + # v8 and above + _process_module_definition( + "elastic_transport._node._base", + "newrelic.hooks.datastore_elasticsearch", + "instrument_elastic_transport__node__base", + ) + + # v7 and below _process_module_definition( "elasticsearch.transport", "newrelic.hooks.datastore_elasticsearch", "instrument_elasticsearch_transport", ) + # v8 and above + _process_module_definition( + "elastic_transport._transport", + 
"newrelic.hooks.datastore_elasticsearch", + "instrument_elastic_transport__transport", + ) _process_module_definition("pika.adapters", "newrelic.hooks.messagebroker_pika", "instrument_pika_adapters") _process_module_definition("pika.channel", "newrelic.hooks.messagebroker_pika", "instrument_pika_channel") @@ -2745,6 +2851,10 @@ def _process_module_builtin_defaults(): ) _process_module_definition("redis.client", "newrelic.hooks.datastore_redis", "instrument_redis_client") + _process_module_definition( + "redis.commands.cluster", "newrelic.hooks.datastore_redis", "instrument_redis_commands_cluster" + ) + _process_module_definition( "redis.commands.core", "newrelic.hooks.datastore_redis", "instrument_redis_commands_core" ) @@ -2922,19 +3032,6 @@ def _process_module_builtin_defaults(): "instrument_cornice_service", ) - # _process_module_definition('twisted.web.server', - # 'newrelic.hooks.framework_twisted', - # 'instrument_twisted_web_server') - # _process_module_definition('twisted.web.http', - # 'newrelic.hooks.framework_twisted', - # 'instrument_twisted_web_http') - # _process_module_definition('twisted.web.resource', - # 'newrelic.hooks.framework_twisted', - # 'instrument_twisted_web_resource') - # _process_module_definition('twisted.internet.defer', - # 'newrelic.hooks.framework_twisted', - # 'instrument_twisted_internet_defer') - _process_module_definition("gevent.monkey", "newrelic.hooks.coroutines_gevent", "instrument_gevent_monkey") _process_module_definition( @@ -3027,8 +3124,12 @@ def _process_module_entry_points(): _instrumentation_done = False -def _setup_instrumentation(): +def _reset_instrumentation_done(): + global _instrumentation_done + _instrumentation_done = False + +def _setup_instrumentation(): global _instrumentation_done if _instrumentation_done: diff --git a/newrelic/core/agent_streaming.py b/newrelic/core/agent_streaming.py index 5cd7117e6..b581f5d17 100644 --- a/newrelic/core/agent_streaming.py +++ b/newrelic/core/agent_streaming.py @@ -18,9 +18,9 @@ try: import grpc - from newrelic.core.infinite_tracing_pb2 import RecordStatus, Span + from newrelic.core.infinite_tracing_pb2 import RecordStatus, Span, SpanBatch except Exception: - grpc, RecordStatus, Span = None, None, None + grpc, RecordStatus, Span, SpanBatch = None, None, None, None _logger = logging.getLogger(__name__) @@ -33,7 +33,6 @@ class StreamingRpc(object): retry will not occur. """ - PATH = "/com.newrelic.trace.v1.IngestService/RecordSpan" RETRY_POLICY = ( (15, False), (15, False), @@ -44,7 +43,7 @@ class StreamingRpc(object): ) OPTIONS = [("grpc.enable_retries", 0)] - def __init__(self, endpoint, stream_buffer, metadata, record_metric, ssl=True): + def __init__(self, endpoint, stream_buffer, metadata, record_metric, ssl=True, compression=None): self._endpoint = endpoint self._ssl = ssl self.metadata = metadata @@ -57,17 +56,35 @@ def __init__(self, endpoint, stream_buffer, metadata, record_metric, ssl=True): self.notify = self.condition() self.record_metric = record_metric self.closed = False + # If this is not set, None is still a falsy value. 
+ self.compression_setting = grpc.Compression.Gzip if compression else grpc.Compression.NoCompression + + if self.batching: # Stream buffer will be sending span batches + self.path = "/com.newrelic.trace.v1.IngestService/RecordSpanBatch" + self.serializer = SpanBatch.SerializeToString + else: + self.path = "/com.newrelic.trace.v1.IngestService/RecordSpan" + self.serializer = Span.SerializeToString self.create_channel() + @property + def batching(self): + # Determine batching by stream buffer settings + return self.stream_buffer.batching + def create_channel(self): if self._ssl: credentials = grpc.ssl_channel_credentials() - self.channel = grpc.secure_channel(self._endpoint, credentials, options=self.OPTIONS) + self.channel = grpc.secure_channel( + self._endpoint, credentials, compression=self.compression_setting, options=self.OPTIONS + ) else: - self.channel = grpc.insecure_channel(self._endpoint, options=self.OPTIONS) + self.channel = grpc.insecure_channel( + self._endpoint, compression=self.compression_setting, options=self.OPTIONS + ) - self.rpc = self.channel.stream_stream(self.PATH, Span.SerializeToString, RecordStatus.FromString) + self.rpc = self.channel.stream_stream(self.path, self.serializer, RecordStatus.FromString) def create_response_iterator(self): with self.stream_buffer._notify: diff --git a/newrelic/core/application.py b/newrelic/core/application.py index 419211a35..7be217428 100644 --- a/newrelic/core/application.py +++ b/newrelic/core/application.py @@ -520,30 +520,65 @@ def connect_to_data_collector(self, activate_agent): self._global_events_account = 0 - # Record metrics for how long it took us to connect and how - # many attempts we made. Also record metrics for the final - # successful attempt. If we went through multiple attempts, - # individual details of errors before the final one that - # worked are not recorded as recording them all in the - # initial harvest would possibly skew first harvest metrics - # and cause confusion as we cannot properly mark the time over - # which they were recorded. Make sure we do this before we - # mark the session active so we don't have to grab a lock on - # merging the internal metrics. - with InternalTraceContext(internal_metrics): + # Record metrics for how long it took us to connect and how + # many attempts we made. Also record metrics for the final + # successful attempt. If we went through multiple attempts, + # individual details of errors before the final one that + # worked are not recorded as recording them all in the + # initial harvest would possibly skew first harvest metrics + # and cause confusion as we cannot properly mark the time over + # which they were recorded. Make sure we do this before we + # mark the session active so we don't have to grab a lock on + # merging the internal metrics. 
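The StreamingRpc changes above derive the RPC path and serializer from the stream buffer's batching flag, and translate the boolean compression setting into a gRPC enum before the channel is created. Condensed into a sketch (the names mirror the diff, but this is not a drop-in replacement):

    import grpc

    def select_stream(batching, compression):
        # Gzip when compression is requested, otherwise explicitly none.
        compression_setting = (
            grpc.Compression.Gzip if compression else grpc.Compression.NoCompression
        )
        if batching:  # the stream buffer will send SpanBatch messages
            path = "/com.newrelic.trace.v1.IngestService/RecordSpanBatch"
        else:
            path = "/com.newrelic.trace.v1.IngestService/RecordSpan"
        return path, compression_setting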
+ internal_metric( "Supportability/Python/Application/Registration/Duration", self._period_start - connect_start ) internal_metric("Supportability/Python/Application/Registration/Attempts", connect_attempts) - # Logging feature toggle supportability metrics - application_logging_metrics = configuration.application_logging.enabled and configuration.application_logging.metrics.enabled - application_logging_forwarding = configuration.application_logging.enabled and configuration.application_logging.forwarding.enabled - application_logging_local_decorating = configuration.application_logging.enabled and configuration.application_logging.local_decorating.enabled - internal_metric("Supportability/Logging/Forwarding/Python/%s" % ("enabled" if application_logging_forwarding else "disabled"), 1) - internal_metric("Supportability/Logging/LocalDecorating/Python/%s" % ("enabled" if application_logging_local_decorating else "disabled"), 1) - internal_metric("Supportability/Logging/Metrics/Python/%s" % ("enabled" if application_logging_metrics else "disabled"), 1) + # Record metrics for feature toggles from settings + + # Logging feature toggle metrics + application_logging_metrics = ( + configuration.application_logging.enabled and configuration.application_logging.metrics.enabled + ) + application_logging_forwarding = ( + configuration.application_logging.enabled and configuration.application_logging.forwarding.enabled + ) + application_logging_local_decorating = ( + configuration.application_logging.enabled and configuration.application_logging.local_decorating.enabled + ) + internal_metric( + "Supportability/Logging/Forwarding/Python/%s" + % ("enabled" if application_logging_forwarding else "disabled"), + 1, + ) + internal_metric( + "Supportability/Logging/LocalDecorating/Python/%s" + % ("enabled" if application_logging_local_decorating else "disabled"), + 1, + ) + internal_metric( + "Supportability/Logging/Metrics/Python/%s" % ("enabled" if application_logging_metrics else "disabled"), + 1, + ) + + # Infinite tracing feature toggle metrics + infinite_tracing = configuration.infinite_tracing.enabled # Property that checks trace observer host + if infinite_tracing: + infinite_tracing_batching = configuration.infinite_tracing.batching + infinite_tracing_compression = configuration.infinite_tracing.compression + internal_metric( + "Supportability/InfiniteTracing/gRPC/Batching/%s" + % ("enabled" if infinite_tracing_batching else "disabled"), + 1, + ) + internal_metric( + "Supportability/InfiniteTracing/gRPC/Compression/%s" + % ("enabled" if infinite_tracing_compression else "disabled"), + 1, + ) self._stats_engine.merge_custom_metrics(internal_metrics.metrics()) @@ -724,11 +759,9 @@ def stop_data_samplers(self): def remove_data_source(self, name): with self._data_samplers_lock: - data_sampler = [x for x in self._data_samplers if x.name == name] if len(data_sampler) > 0: - # Should be at most one data sampler for a given name. data_sampler = data_sampler[0] @@ -741,7 +774,6 @@ def remove_data_source(self, name): data_sampler.stop() except Exception: - # If sampler has not started yet, it may throw an error. 
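The supportability metrics above share one shape: for each feature toggle, record a count of 1 under a metric name ending in "enabled" or "disabled". A compact sketch of that pattern (the helper name is illustrative):

    def record_feature_toggle(internal_metric, prefix, enabled):
        # e.g. prefix = "Supportability/InfiniteTracing/gRPC/Batching"
        internal_metric("%s/%s" % (prefix, "enabled" if enabled else "disabled"), 1)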
_logger.debug( @@ -1066,7 +1098,6 @@ def harvest(self, shutdown=False, flexible=False): with InternalTraceContext(internal_metrics): with InternalTrace("Supportability/Python/Harvest/Calls/" + call_metric): - self._harvest_count += 1 start = time.time() @@ -1204,7 +1235,6 @@ def harvest(self, shutdown=False, flexible=False): stats.reset_synthetics_events() if configuration.collect_analytics_events and configuration.transaction_events.enabled: - transaction_events = stats.transaction_events if transaction_events: @@ -1235,7 +1265,7 @@ def harvest(self, shutdown=False, flexible=False): if configuration.infinite_tracing.enabled: span_stream = stats.span_stream # Only merge stats as part of default harvest - if span_stream and not flexible: + if span_stream is not None and not flexible: spans_seen, spans_dropped = span_stream.stats() spans_sent = spans_seen - spans_dropped @@ -1267,7 +1297,6 @@ def harvest(self, shutdown=False, flexible=False): and configuration.error_collector.capture_events and configuration.error_collector.enabled ): - error_events = stats.error_events if error_events: num_error_samples = error_events.num_samples @@ -1289,7 +1318,6 @@ def harvest(self, shutdown=False, flexible=False): # Send custom events if configuration.collect_custom_events and configuration.custom_insights_events.enabled: - customs = stats.custom_events if customs: @@ -1309,8 +1337,13 @@ def harvest(self, shutdown=False, flexible=False): # Send log events - if configuration and configuration.application_logging and configuration.application_logging.enabled and configuration.application_logging.forwarding and configuration.application_logging.forwarding.enabled: - + if ( + configuration + and configuration.application_logging + and configuration.application_logging.enabled + and configuration.application_logging.forwarding + and configuration.application_logging.forwarding.enabled + ): logs = stats.log_events if logs: diff --git a/newrelic/core/attribute.py b/newrelic/core/attribute.py index 4c2673939..372711369 100644 --- a/newrelic/core/attribute.py +++ b/newrelic/core/attribute.py @@ -13,20 +13,21 @@ # limitations under the License. import logging - from collections import namedtuple +from newrelic.core.attribute_filter import ( + DST_ALL, + DST_ERROR_COLLECTOR, + DST_SPAN_EVENTS, + DST_TRANSACTION_EVENTS, + DST_TRANSACTION_SEGMENTS, + DST_TRANSACTION_TRACER, +) from newrelic.packages import six -from newrelic.core.attribute_filter import (DST_ALL, DST_ERROR_COLLECTOR, - DST_TRANSACTION_TRACER, DST_TRANSACTION_EVENTS, DST_SPAN_EVENTS, - DST_TRANSACTION_SEGMENTS) - - _logger = logging.getLogger(__name__) -_Attribute = namedtuple('_Attribute', - ['name', 'value', 'destinations']) +_Attribute = namedtuple("_Attribute", ["name", "value", "destinations"]) # The following destinations are created here, never changed, and only # used in create_agent_attributes. It is placed at the module level here @@ -34,61 +35,61 @@ # All agent attributes go to transaction traces and error traces by default. -_DESTINATIONS = (DST_ERROR_COLLECTOR | - DST_TRANSACTION_TRACER | - DST_TRANSACTION_SEGMENTS) -_DESTINATIONS_WITH_EVENTS = (_DESTINATIONS | - DST_TRANSACTION_EVENTS | - DST_SPAN_EVENTS) +_DESTINATIONS = DST_ERROR_COLLECTOR | DST_TRANSACTION_TRACER | DST_TRANSACTION_SEGMENTS +_DESTINATIONS_WITH_EVENTS = _DESTINATIONS | DST_TRANSACTION_EVENTS | DST_SPAN_EVENTS # The following subset goes to transaction events by default. 
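One behavioral fix in the harvest hunk above is easy to miss: the check changed from "if span_stream" to "if span_stream is not None". That matters when the stream object is container-like, since an empty buffer is falsy even though it exists and its stats should still be merged. A generic illustration (the list subclass is a stand-in, not the agent's stream buffer):

    class SpanStream(list):  # illustrative stand-in
        def stats(self):
            return (len(self), 0)  # (seen, dropped)

    stream = SpanStream()      # empty: a truthiness test would skip it
    assert not stream
    assert stream is not None  # the identity test still processes it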
-_TRANSACTION_EVENT_DEFAULT_ATTRIBUTES = set(( - 'host.displayName', - 'request.method', - 'request.headers.contentType', - 'request.headers.contentLength', - 'request.uri', - 'response.status', - 'request.headers.accept', - 'response.headers.contentLength', - 'response.headers.contentType', - 'request.headers.host', - 'request.headers.userAgent', - 'message.queueName', - 'message.routingKey', - 'http.url', - 'http.statusCode', - 'aws.requestId', - 'aws.operation', - 'aws.lambda.arn', - 'aws.lambda.coldStart', - 'aws.lambda.eventSource.arn', - "db.collection", - 'db.instance', - 'db.operation', - 'db.statement', - 'error.class', - 'error.message', - 'error.expected', - 'peer.hostname', - 'peer.address', - 'graphql.field.name', - 'graphql.field.parentType', - 'graphql.field.path', - 'graphql.field.returnType', - 'graphql.operation.name', - 'graphql.operation.type', - 'graphql.operation.query', +_TRANSACTION_EVENT_DEFAULT_ATTRIBUTES = set( + ( + "aws.lambda.arn", + "aws.lambda.coldStart", + "aws.lambda.eventSource.arn", + "aws.operation", + "aws.requestId", "code.filepath", "code.function", "code.lineno", "code.namespace", -)) + "db.collection", + "db.instance", + "db.operation", + "db.statement", + "enduser.id", + "error.class", + "error.expected", + "error.message", + "error.group.name", + "graphql.field.name", + "graphql.field.parentType", + "graphql.field.path", + "graphql.field.returnType", + "graphql.operation.name", + "graphql.operation.query", + "graphql.operation.type", + "host.displayName", + "http.statusCode", + "http.url", + "message.queueName", + "message.routingKey", + "peer.address", + "peer.hostname", + "request.headers.accept", + "request.headers.contentLength", + "request.headers.contentType", + "request.headers.host", + "request.headers.userAgent", + "request.method", + "request.uri", + "response.headers.contentLength", + "response.headers.contentType", + "response.status", + ) +) MAX_NUM_USER_ATTRIBUTES = 128 MAX_ATTRIBUTE_LENGTH = 255 -MAX_64_BIT_INT = 2 ** 63 - 1 +MAX_64_BIT_INT = 2**63 - 1 MAX_LOG_MESSAGE_LENGTH = 32768 @@ -109,10 +110,8 @@ class CastingFailureException(Exception): class Attribute(_Attribute): - def __repr__(self): - return "Attribute(name=%r, value=%r, destinations=%r)" % ( - self.name, self.value, bin(self.destinations)) + return "Attribute(name=%r, value=%r, destinations=%r)" % (self.name, self.value, bin(self.destinations)) def create_attributes(attr_dict, destinations, attribute_filter): @@ -142,8 +141,7 @@ def create_agent_attributes(attr_dict, attribute_filter): return attributes -def resolve_user_attributes( - attr_dict, attribute_filter, target_destination, attr_class=dict): +def resolve_user_attributes(attr_dict, attribute_filter, target_destination, attr_class=dict): u_attrs = attr_class() for attr_name, attr_value in attr_dict.items(): @@ -158,8 +156,7 @@ def resolve_user_attributes( return u_attrs -def resolve_agent_attributes( - attr_dict, attribute_filter, target_destination, attr_class=dict): +def resolve_agent_attributes(attr_dict, attribute_filter, target_destination, attr_class=dict): a_attrs = attr_class() for attr_name, attr_value in attr_dict.items(): @@ -182,10 +179,9 @@ def create_user_attributes(attr_dict, attribute_filter): return create_attributes(attr_dict, destinations, attribute_filter) -def truncate( - text, maxsize=MAX_ATTRIBUTE_LENGTH, encoding='utf-8', ending=None): +def truncate(text, maxsize=MAX_ATTRIBUTE_LENGTH, encoding="utf-8", ending=None): - # Truncate text so that it's byte representation + # Truncate text so 
that its byte representation # is no longer than maxsize bytes. # If text is unicode (Python 2 or 3), return unicode. @@ -198,21 +194,21 @@ def truncate( ending = ending and ending.encode(encoding) if ending and truncated != text: - truncated = truncated[:-len(ending)] + ending + truncated = truncated[: -len(ending)] + ending return truncated -def _truncate_unicode(u, maxsize, encoding='utf-8'): +def _truncate_unicode(u, maxsize, encoding="utf-8"): encoded = u.encode(encoding)[:maxsize] - return encoded.decode(encoding, 'ignore') + return encoded.decode(encoding, "ignore") def _truncate_bytes(s, maxsize): return s[:maxsize] -def check_name_length(name, max_length=MAX_ATTRIBUTE_LENGTH, encoding='utf-8'): +def check_name_length(name, max_length=MAX_ATTRIBUTE_LENGTH, encoding="utf-8"): trunc_name = truncate(name, max_length, encoding) if name != trunc_name: raise NameTooLongException() @@ -228,8 +224,7 @@ def check_max_int(value, max_int=MAX_64_BIT_INT): raise IntTooLargeException() -def process_user_attribute( - name, value, max_length=MAX_ATTRIBUTE_LENGTH, ending=None): +def process_user_attribute(name, value, max_length=MAX_ATTRIBUTE_LENGTH, ending=None): # Perform all necessary checks on a potential attribute. # @@ -250,23 +245,19 @@ def process_user_attribute( value = sanitize(value) except NameIsNotStringException: - _logger.debug('Attribute name must be a string. Dropping ' - 'attribute: %r=%r', name, value) + _logger.debug("Attribute name must be a string. Dropping " "attribute: %r=%r", name, value) return FAILED_RESULT except NameTooLongException: - _logger.debug('Attribute name exceeds maximum length. Dropping ' - 'attribute: %r=%r', name, value) + _logger.debug("Attribute name exceeds maximum length. Dropping " "attribute: %r=%r", name, value) return FAILED_RESULT except IntTooLargeException: - _logger.debug('Attribute value exceeds maximum integer value. ' - 'Dropping attribute: %r=%r', name, value) + _logger.debug("Attribute value exceeds maximum integer value. " "Dropping attribute: %r=%r", name, value) return FAILED_RESULT except CastingFailureException: - _logger.debug('Attribute value cannot be cast to a string. ' - 'Dropping attribute: %r=%r', name, value) + _logger.debug("Attribute value cannot be cast to a string. " "Dropping attribute: %r=%r", name, value) return FAILED_RESULT else: @@ -278,9 +269,12 @@ def process_user_attribute( if isinstance(value, valid_types_text): trunc_value = truncate(value, maxsize=max_length, ending=ending) if value != trunc_value: - _logger.debug('Attribute value exceeds maximum length ' - '(%r bytes). Truncating value: %r=%r.', - max_length, name, trunc_value) + _logger.debug( + "Attribute value exceeds maximum length " "(%r bytes). Truncating value: %r=%r.", + max_length, + name, + trunc_value, + ) value = trunc_value @@ -294,8 +288,7 @@ def sanitize(value): # # Raise CastingFailureException, if str(value) somehow fails. - valid_value_types = (six.text_type, six.binary_type, bool, float, - six.integer_types) + valid_value_types = (six.text_type, six.binary_type, bool, float, six.integer_types) if not isinstance(value, valid_value_types): original = value @@ -305,7 +298,8 @@ def sanitize(value): except Exception: raise CastingFailureException() else: - _logger.debug('Attribute value is of type: %r. Casting %r to ' - 'string: %s', type(original), original, value) + _logger.debug( + "Attribute value is of type: %r. 
Casting %r to " "string: %s", type(original), original, value + ) return value diff --git a/newrelic/core/code_level_metrics.py b/newrelic/core/code_level_metrics.py index 652715eab..ba00d93af 100644 --- a/newrelic/core/code_level_metrics.py +++ b/newrelic/core/code_level_metrics.py @@ -89,7 +89,7 @@ def extract_code_from_callable(func): # Use inspect to get file and line number file_path = inspect.getsourcefile(func) line_number = inspect.getsourcelines(func)[1] - except TypeError: + except Exception: pass # Split function path to extract class name diff --git a/newrelic/core/config.py b/newrelic/core/config.py index 60520c113..7489be222 100644 --- a/newrelic/core/config.py +++ b/newrelic/core/config.py @@ -52,6 +52,7 @@ # reservoir. Error Events have a different default size. DEFAULT_RESERVOIR_SIZE = 1200 +CUSTOM_EVENT_RESERVOIR_SIZE = 3600 ERROR_EVENT_RESERVOIR_SIZE = 100 SPAN_EVENT_RESERVOIR_SIZE = 2000 LOG_EVENT_RESERVOIR_SIZE = 10000 @@ -137,7 +138,9 @@ class TransactionTracerAttributesSettings(Settings): class ErrorCollectorSettings(Settings): - pass + @property + def error_group_callback(self): + return self._error_group_callback class ErrorCollectorAttributesSettings(Settings): @@ -333,6 +336,14 @@ def _can_enable_infinite_tracing(): return True +class InstrumentationSettings(Settings): + pass + + +class InstrumentationGraphQLSettings(Settings): + pass + + class EventHarvestConfigSettings(Settings): nested = True _lock = threading.Lock() @@ -354,50 +365,52 @@ class EventHarvestConfigHarvestLimitSettings(Settings): _settings = TopLevelSettings() +_settings.agent_limits = AgentLimitsSettings() _settings.application_logging = ApplicationLoggingSettings() _settings.application_logging.forwarding = ApplicationLoggingForwardingSettings() -_settings.application_logging.metrics = ApplicationLoggingMetricsSettings() _settings.application_logging.local_decorating = ApplicationLoggingLocalDecoratingSettings() +_settings.application_logging.metrics = ApplicationLoggingMetricsSettings() _settings.attributes = AttributesSettings() -_settings.gc_runtime_metrics = GCRuntimeMetricsSettings() -_settings.code_level_metrics = CodeLevelMetricsSettings() -_settings.thread_profiler = ThreadProfilerSettings() -_settings.transaction_tracer = TransactionTracerSettings() -_settings.transaction_tracer.attributes = TransactionTracerAttributesSettings() -_settings.error_collector = ErrorCollectorSettings() -_settings.error_collector.attributes = ErrorCollectorAttributesSettings() _settings.browser_monitoring = BrowserMonitorSettings() _settings.browser_monitoring.attributes = BrowserMonitorAttributesSettings() -_settings.transaction_name = TransactionNameSettings() -_settings.transaction_metrics = TransactionMetricsSettings() -_settings.event_loop_visibility = EventLoopVisibilitySettings() -_settings.rum = RumSettings() -_settings.slow_sql = SlowSqlSettings() -_settings.agent_limits = AgentLimitsSettings() +_settings.code_level_metrics = CodeLevelMetricsSettings() _settings.console = ConsoleSettings() -_settings.debug = DebugSettings() _settings.cross_application_tracer = CrossApplicationTracerSettings() -_settings.transaction_events = TransactionEventsSettings() -_settings.transaction_events.attributes = TransactionEventsAttributesSettings() _settings.custom_insights_events = CustomInsightsEventsSettings() -_settings.process_host = ProcessHostSettings() -_settings.synthetics = SyntheticsSettings() -_settings.message_tracer = MessageTracerSettings() -_settings.utilization = UtilizationSettings() 
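The attribute truncation helpers in the hunks above cut by encoded byte length, then decode with errors ignored, so a multi-byte character split at the boundary is dropped rather than producing invalid text. For example:

    def _truncate_unicode(u, maxsize, encoding="utf-8"):
        encoded = u.encode(encoding)[:maxsize]     # cut at a byte boundary
        return encoded.decode(encoding, "ignore")  # drop any partial character

    # u"café" is five bytes in UTF-8 (the é takes two); cutting at four
    # bytes leaves a dangling lead byte, which the lenient decode discards.
    assert _truncate_unicode(u"caf\xe9", 4) == u"caf"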
-_settings.strip_exception_messages = StripExceptionMessageSettings() _settings.datastore_tracer = DatastoreTracerSettings() -_settings.datastore_tracer.instance_reporting = DatastoreTracerInstanceReportingSettings() _settings.datastore_tracer.database_name_reporting = DatastoreTracerDatabaseNameReportingSettings() +_settings.datastore_tracer.instance_reporting = DatastoreTracerInstanceReportingSettings() +_settings.debug = DebugSettings() +_settings.distributed_tracing = DistributedTracingSettings() +_settings.error_collector = ErrorCollectorSettings() +_settings.error_collector.attributes = ErrorCollectorAttributesSettings() +_settings.event_harvest_config = EventHarvestConfigSettings() +_settings.event_harvest_config.harvest_limits = EventHarvestConfigHarvestLimitSettings() +_settings.event_loop_visibility = EventLoopVisibilitySettings() +_settings.gc_runtime_metrics = GCRuntimeMetricsSettings() _settings.heroku = HerokuSettings() +_settings.infinite_tracing = InfiniteTracingSettings() +_settings.instrumentation = InstrumentationSettings() +_settings.instrumentation.graphql = InstrumentationGraphQLSettings() +_settings.message_tracer = MessageTracerSettings() +_settings.process_host = ProcessHostSettings() +_settings.rum = RumSettings() +_settings.serverless_mode = ServerlessModeSettings() +_settings.slow_sql = SlowSqlSettings() _settings.span_events = SpanEventSettings() _settings.span_events.attributes = SpanEventAttributesSettings() +_settings.strip_exception_messages = StripExceptionMessageSettings() +_settings.synthetics = SyntheticsSettings() +_settings.thread_profiler = ThreadProfilerSettings() +_settings.transaction_events = TransactionEventsSettings() +_settings.transaction_events.attributes = TransactionEventsAttributesSettings() +_settings.transaction_metrics = TransactionMetricsSettings() +_settings.transaction_name = TransactionNameSettings() _settings.transaction_segments = TransactionSegmentSettings() _settings.transaction_segments.attributes = TransactionSegmentAttributesSettings() -_settings.distributed_tracing = DistributedTracingSettings() -_settings.serverless_mode = ServerlessModeSettings() -_settings.infinite_tracing = InfiniteTracingSettings() -_settings.event_harvest_config = EventHarvestConfigSettings() -_settings.event_harvest_config.harvest_limits = EventHarvestConfigHarvestLimitSettings() +_settings.transaction_tracer = TransactionTracerSettings() +_settings.transaction_tracer.attributes = TransactionTracerAttributesSettings() +_settings.utilization = UtilizationSettings() _settings.log_file = os.environ.get("NEW_RELIC_LOG", None) @@ -457,7 +470,6 @@ def _environ_as_mapping(name, default=""): return result for item in items.split(";"): - try: key, value = item.split(":") except ValueError: @@ -688,6 +700,7 @@ def default_host(license_key): _settings.error_collector.ignore_status_codes = _parse_status_codes("100-102 200-208 226 300-308 404", set()) _settings.error_collector.expected_classes = [] _settings.error_collector.expected_status_codes = set() +_settings.error_collector._error_group_callback = None _settings.error_collector.attributes.enabled = True _settings.error_collector.attributes.exclude = [] _settings.error_collector.attributes.include = [] @@ -730,15 +743,21 @@ def default_host(license_key): _settings.infinite_tracing.trace_observer_host = os.environ.get("NEW_RELIC_INFINITE_TRACING_TRACE_OBSERVER_HOST", None) _settings.infinite_tracing.trace_observer_port = _environ_as_int("NEW_RELIC_INFINITE_TRACING_TRACE_OBSERVER_PORT", 443) 
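The new _error_group_callback slot above, exposed read-only through the error_group_callback property, backs user-defined error grouping alongside the error.group.name attribute added to the transaction event defaults. A heavily hedged sketch of how such a callback might be registered; the registration API and callback signature here are assumptions for illustration, not confirmed by this diff:

    import newrelic.agent

    def error_group(exc, data):  # signature assumed for illustration
        # Return a group name for matching errors, or None for the default.
        if isinstance(exc, TimeoutError):
            return "upstream-timeouts"
        return None

    newrelic.agent.set_error_group_callback(error_group)  # assumed API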
+_settings.infinite_tracing.compression = _environ_as_bool("NEW_RELIC_INFINITE_TRACING_COMPRESSION", default=True) +_settings.infinite_tracing.batching = _environ_as_bool("NEW_RELIC_INFINITE_TRACING_BATCHING", default=True) _settings.infinite_tracing.ssl = True _settings.infinite_tracing.span_queue_size = _environ_as_int("NEW_RELIC_INFINITE_TRACING_SPAN_QUEUE_SIZE", 10000) +_settings.instrumentation.graphql.capture_introspection_queries = os.environ.get( + "NEW_RELIC_INSTRUMENTATION_GRAPHQL_CAPTURE_INTROSPECTION_QUERIES", False +) + _settings.event_harvest_config.harvest_limits.analytic_event_data = _environ_as_int( "NEW_RELIC_ANALYTICS_EVENTS_MAX_SAMPLES_STORED", DEFAULT_RESERVOIR_SIZE ) _settings.event_harvest_config.harvest_limits.custom_event_data = _environ_as_int( - "NEW_RELIC_CUSTOM_INSIGHTS_EVENTS_MAX_SAMPLES_STORED", DEFAULT_RESERVOIR_SIZE + "NEW_RELIC_CUSTOM_INSIGHTS_EVENTS_MAX_SAMPLES_STORED", CUSTOM_EVENT_RESERVOIR_SIZE ) _settings.event_harvest_config.harvest_limits.span_event_data = _environ_as_int( @@ -938,7 +957,6 @@ def global_settings_dump(settings_object=None, serializable=False): components = urlparse.urlparse(proxy_host) if components.scheme: - netloc = create_obfuscated_netloc(components.username, components.password, components.hostname, obfuscated) if components.port: @@ -1061,14 +1079,14 @@ def apply_server_side_settings(server_side_config=None, settings=_settings): # Overlay with agent server side configuration settings. - for (name, value) in agent_config.items(): + for name, value in agent_config.items(): apply_config_setting(settings_snapshot, name, value) # Overlay with global server side configuration settings. # global server side configuration always takes precedence over the global # server side configuration settings. - for (name, value) in server_side_config.items(): + for name, value in server_side_config.items(): apply_config_setting(settings_snapshot, name, value) event_harvest_config = server_side_config.get("event_harvest_config", {}) diff --git a/newrelic/core/context.py b/newrelic/core/context.py index 95de15b4e..7560855ae 100644 --- a/newrelic/core/context.py +++ b/newrelic/core/context.py @@ -46,7 +46,7 @@ def log_propagation_failure(s): elif trace is not None: self.trace = trace elif trace_cache_id is not None: - self.trace = self.trace_cache._cache.get(trace_cache_id, None) + self.trace = self.trace_cache.get(trace_cache_id, None) if self.trace is None: log_propagation_failure("No trace with id %d." 
% trace_cache_id) elif hasattr(request, "_nr_trace") and request._nr_trace is not None: @@ -60,11 +60,11 @@ def __enter__(self): self.thread_id = self.trace_cache.current_thread_id() # Save previous cache contents - self.restore = self.trace_cache._cache.get(self.thread_id, None) + self.restore = self.trace_cache.get(self.thread_id, None) self.should_restore = True # Set context in trace cache - self.trace_cache._cache[self.thread_id] = self.trace + self.trace_cache[self.thread_id] = self.trace return self @@ -72,10 +72,10 @@ def __exit__(self, exc, value, tb): if self.should_restore: if self.restore is not None: # Restore previous contents - self.trace_cache._cache[self.thread_id] = self.restore + self.trace_cache[self.thread_id] = self.restore else: # Remove entry from cache - self.trace_cache._cache.pop(self.thread_id) + self.trace_cache.pop(self.thread_id) def context_wrapper(func, trace=None, request=None, trace_cache_id=None, strict=True): diff --git a/newrelic/core/data_collector.py b/newrelic/core/data_collector.py index f8947927d..985e37240 100644 --- a/newrelic/core/data_collector.py +++ b/newrelic/core/data_collector.py @@ -61,6 +61,7 @@ def connect_span_stream(self, span_iterator, record_metric): port = self.configuration.infinite_tracing.trace_observer_port ssl = self.configuration.infinite_tracing.ssl + compression_setting = self.configuration.infinite_tracing.compression endpoint = "{}:{}".format(host, port) if ( @@ -68,14 +69,13 @@ def connect_span_stream(self, span_iterator, record_metric): and self.configuration.span_events.enabled and self.configuration.collect_span_events ): - metadata = ( ("agent_run_token", self.configuration.agent_run_id), ("license_key", self.configuration.license_key), ) rpc = self._rpc = StreamingRpc( - endpoint, span_iterator, metadata, record_metric, ssl=ssl + endpoint, span_iterator, metadata, record_metric, ssl=ssl, compression=compression_setting ) rpc.connect() return rpc @@ -135,9 +135,7 @@ def send_log_events(self, sampling_info, log_event_data): return self._protocol.send("log_event_data", payload) def get_agent_commands(self): - """Receive agent commands from the data collector. - - """ + """Receive agent commands from the data collector.""" payload = (self.agent_run_id,) return self._protocol.send("get_agent_commands", payload) @@ -180,8 +178,7 @@ def send_agent_command_results(self, cmd_results): return self._protocol.send("agent_command_results", payload) def send_profile_data(self, profile_data): - """Called to submit Profile Data. 
- """ + """Called to submit Profile Data.""" payload = (self.agent_run_id, profile_data) return self._protocol.send("profile_data", payload) @@ -206,9 +203,7 @@ class DeveloperModeSession(Session): def connect_span_stream(self, span_iterator, record_metric): if self.configuration.debug.connect_span_stream_in_developer_mode: - super(DeveloperModeSession, self).connect_span_stream( - span_iterator, record_metric - ) + super(DeveloperModeSession, self).connect_span_stream(span_iterator, record_metric) class ServerlessModeSession(Session): @@ -231,12 +226,8 @@ def shutdown_session(): def create_session(license_key, app_name, linked_applications, environment): settings = global_settings() if settings.serverless_mode.enabled: - return ServerlessModeSession( - app_name, linked_applications, environment, settings - ) + return ServerlessModeSession(app_name, linked_applications, environment, settings) elif settings.developer_mode: - return DeveloperModeSession( - app_name, linked_applications, environment, settings - ) + return DeveloperModeSession(app_name, linked_applications, environment, settings) else: return Session(app_name, linked_applications, environment, settings) diff --git a/newrelic/core/environment.py b/newrelic/core/environment.py index 9fc6e2dd4..9bca085a3 100644 --- a/newrelic/core/environment.py +++ b/newrelic/core/environment.py @@ -17,38 +17,30 @@ """ +import logging import os import platform import sys -import sysconfig import newrelic +from newrelic.common.package_version_utils import get_package_version from newrelic.common.system_info import ( logical_processor_count, physical_processor_count, total_physical_memory, ) +from newrelic.packages.isort import stdlibs as isort_stdlibs try: import newrelic.core._thread_utilization except ImportError: pass +_logger = logging.getLogger(__name__) + def environment_settings(): """Returns an array of arrays of environment settings""" - - # Find version resolver. - - get_version = None - # importlib was introduced into the standard library starting in Python3.8. - if "importlib" in sys.modules and hasattr(sys.modules["importlib"], "metadata"): - get_version = sys.modules["importlib"].metadata.version - elif "pkg_resources" in sys.modules: - - def get_version(name): # pylint: disable=function-redefined - return sys.modules["pkg_resources"].get_distribution(name).version - env = [] # Agent information. @@ -186,7 +178,7 @@ def get_version(name): # pylint: disable=function-redefined dispatcher.append(("Dispatcher Version", hypercorn.__version__)) else: try: - dispatcher.append(("Dispatcher Version", get_version("hypercorn"))) + dispatcher.append(("Dispatcher Version", get_package_version("hypercorn"))) except Exception: pass @@ -206,8 +198,7 @@ def get_version(name): # pylint: disable=function-redefined env.extend(dispatcher) # Module information. - purelib = sysconfig.get_path("purelib") - platlib = sysconfig.get_path("platlib") + stdlib_builtin_module_names = _get_stdlib_builtin_module_names() plugins = [] @@ -219,29 +210,58 @@ def get_version(name): # pylint: disable=function-redefined # list for name, module in sys.modules.copy().items(): # Exclude lib.sub_paths as independent modules except for newrelic.hooks. - if "." in name and not name.startswith("newrelic.hooks."): + nr_hook = name.startswith("newrelic.hooks.") + if "." in name and not nr_hook or name.startswith("_"): continue + # If the module isn't actually loaded (such as failed relative imports # in Python 2.7), the module will be None and should not be reported. 
- if not module: + try: + if not module: + continue + except Exception: + # if the application uses generalimport to manage optional dependencies, + # it's possible that generalimport.MissingOptionalDependency is raised. + # In this case, we should not report the module as it is not actually loaded and + # is not a runtime dependency of the application. + # continue + # Exclude standard library/built-in modules. - # Third-party modules can be installed in either purelib or platlib directories. - # See https://docs.python.org/3/library/sysconfig.html#installation-paths. - if ( - not hasattr(module, "__file__") - or not module.__file__ - or not module.__file__.startswith(purelib) - or not module.__file__.startswith(platlib) - ): + if name in stdlib_builtin_module_names: continue try: - version = get_version(name) - plugins.append("%s (%s)" % (name, version)) + version = get_package_version(name) except Exception: - plugins.append(name) + version = None + + # If it has no version it's likely not a real package so don't report it unless + # it's a new relic hook. + if version or nr_hook: + plugins.append("%s (%s)" % (name, version)) env.append(("Plugin List", plugins)) return env + + +def _get_stdlib_builtin_module_names(): + builtins = set(sys.builtin_module_names) + # Since sys.stdlib_module_names is not available in versions of python below 3.10, + # use isort's hardcoded stdlibs instead. + python_version = sys.version_info[0:2] + if python_version < (3,): + stdlibs = isort_stdlibs.py27.stdlib + elif (3, 7) <= python_version < (3, 8): + stdlibs = isort_stdlibs.py37.stdlib + elif python_version < (3, 9): + stdlibs = isort_stdlibs.py38.stdlib + elif python_version < (3, 10): + stdlibs = isort_stdlibs.py39.stdlib + elif python_version >= (3, 10): + stdlibs = sys.stdlib_module_names + else: + _logger.warning("Unsupported Python version. Unable to determine stdlibs.") + return builtins + return builtins | stdlibs diff --git a/newrelic/core/error_node.py b/newrelic/core/error_node.py index 67c1b449a..fe0157b81 100644 --- a/newrelic/core/error_node.py +++ b/newrelic/core/error_node.py @@ -24,8 +24,7 @@ "span_id", "stack_trace", "custom_params", - "file_name", - "line_number", "source", + "error_group_name", ], ) diff --git a/newrelic/core/external_node.py b/newrelic/core/external_node.py index 16c113794..20e07e9a5 100644 --- a/newrelic/core/external_node.py +++ b/newrelic/core/external_node.py @@ -32,6 +32,8 @@ class ExternalNode(_ExternalNode, GenericNodeMixin): + cross_process_id = None + external_txn_name = None @property def details(self): diff --git a/newrelic/core/infinite_tracing_pb2.py b/newrelic/core/infinite_tracing_pb2.py index a0fa9dc54..278dc6d18 100644 --- a/newrelic/core/infinite_tracing_pb2.py +++ b/newrelic/core/infinite_tracing_pb2.py @@ -13,13 +13,24 @@ # limitations under the License.
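Plugin detection above now filters standard-library and builtin modules by name rather than by install path. A condensed sketch of the version fallback using only the standard library (the agent vendors isort's hardcoded lists for interpreters older than 3.10):

    import sys

    def stdlib_module_names():
        builtins = set(sys.builtin_module_names)
        if sys.version_info >= (3, 10):
            # Available natively on Python 3.10 and later.
            return builtins | set(sys.stdlib_module_names)
        # Older interpreters need a hardcoded list (isort's, in the agent).
        return builtins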
try: - from google.protobuf import __version__ - PROTOBUF_VERSION = tuple(int(v) for v in __version__.split(".")) + from google.protobuf import __version__ + + PROTOBUF_VERSION = tuple(int(v) for v in __version__.split(".")) except Exception: - PROTOBUF_VERSION = (0, 0, 0) + PROTOBUF_VERSION = (0, 0, 0) # Import appropriate generated pb2 file for protobuf version if PROTOBUF_VERSION >= (4,): - from newrelic.core.infinite_tracing_v4_pb2 import * + from newrelic.core.infinite_tracing_v4_pb2 import ( # noqa: F401; pylint: disable=W0611 + AttributeValue, + RecordStatus, + Span, + SpanBatch, + ) else: - from newrelic.core.infinite_tracing_v3_pb2 import * + from newrelic.core.infinite_tracing_v3_pb2 import ( # noqa: F401; pylint: disable=W0611 + AttributeValue, + RecordStatus, + Span, + SpanBatch, + ) diff --git a/newrelic/core/infinite_tracing_v3_pb2.py b/newrelic/core/infinite_tracing_v3_pb2.py index 987c96303..79e3ec4eb 100644 --- a/newrelic/core/infinite_tracing_v3_pb2.py +++ b/newrelic/core/infinite_tracing_v3_pb2.py @@ -12,375 +12,129 @@ # See the License for the specific language governing permissions and # limitations under the License. -try: - from google.protobuf import descriptor as _descriptor - from google.protobuf import message as _message - from google.protobuf import reflection as _reflection - from google.protobuf import symbol_database as _symbol_database - # @@protoc_insertion_point(imports) -except ImportError: - pass -else: - _sym_db = _symbol_database.Default() - - - DESCRIPTOR = _descriptor.FileDescriptor( - name='infinite_tracing.proto', - package='com.newrelic.trace.v1', - syntax='proto3', - serialized_options=None, - serialized_pb=b'\n\x16infinite_tracing.proto\x12\x15\x63om.newrelic.trace.v1\"\x86\x04\n\x04Span\x12\x10\n\x08trace_id\x18\x01 \x01(\t\x12?\n\nintrinsics\x18\x02 \x03(\x0b\x32+.com.newrelic.trace.v1.Span.IntrinsicsEntry\x12H\n\x0fuser_attributes\x18\x03 \x03(\x0b\x32/.com.newrelic.trace.v1.Span.UserAttributesEntry\x12J\n\x10\x61gent_attributes\x18\x04 \x03(\x0b\x32\x30.com.newrelic.trace.v1.Span.AgentAttributesEntry\x1aX\n\x0fIntrinsicsEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x34\n\x05value\x18\x02 \x01(\x0b\x32%.com.newrelic.trace.v1.AttributeValue:\x02\x38\x01\x1a\\\n\x13UserAttributesEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x34\n\x05value\x18\x02 \x01(\x0b\x32%.com.newrelic.trace.v1.AttributeValue:\x02\x38\x01\x1a]\n\x14\x41gentAttributesEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x34\n\x05value\x18\x02 \x01(\x0b\x32%.com.newrelic.trace.v1.AttributeValue:\x02\x38\x01\"t\n\x0e\x41ttributeValue\x12\x16\n\x0cstring_value\x18\x01 \x01(\tH\x00\x12\x14\n\nbool_value\x18\x02 \x01(\x08H\x00\x12\x13\n\tint_value\x18\x03 \x01(\x03H\x00\x12\x16\n\x0c\x64ouble_value\x18\x04 \x01(\x01H\x00\x42\x07\n\x05value\"%\n\x0cRecordStatus\x12\x15\n\rmessages_seen\x18\x01 \x01(\x04\x32\x65\n\rIngestService\x12T\n\nRecordSpan\x12\x1b.com.newrelic.trace.v1.Span\x1a#.com.newrelic.trace.v1.RecordStatus\"\x00(\x01\x30\x01\x62\x06proto3' - ) - - - - - _SPAN_INTRINSICSENTRY = _descriptor.Descriptor( - name='IntrinsicsEntry', - full_name='com.newrelic.trace.v1.Span.IntrinsicsEntry', - filename=None, - file=DESCRIPTOR, - containing_type=None, - fields=[ - _descriptor.FieldDescriptor( - name='key', full_name='com.newrelic.trace.v1.Span.IntrinsicsEntry.key', index=0, - number=1, type=9, cpp_type=9, label=1, - has_default_value=False, default_value=b"".decode('utf-8'), - message_type=None, enum_type=None, containing_type=None, - is_extension=False, extension_scope=None, - 
serialized_options=None, file=DESCRIPTOR), - _descriptor.FieldDescriptor( - name='value', full_name='com.newrelic.trace.v1.Span.IntrinsicsEntry.value', index=1, - number=2, type=11, cpp_type=10, label=1, - has_default_value=False, default_value=None, - message_type=None, enum_type=None, containing_type=None, - is_extension=False, extension_scope=None, - serialized_options=None, file=DESCRIPTOR), - ], - extensions=[ - ], - nested_types=[], - enum_types=[ - ], - serialized_options=b'8\001', - is_extendable=False, - syntax='proto3', - extension_ranges=[], - oneofs=[ - ], - serialized_start=291, - serialized_end=379, - ) - - _SPAN_USERATTRIBUTESENTRY = _descriptor.Descriptor( - name='UserAttributesEntry', - full_name='com.newrelic.trace.v1.Span.UserAttributesEntry', - filename=None, - file=DESCRIPTOR, - containing_type=None, - fields=[ - _descriptor.FieldDescriptor( - name='key', full_name='com.newrelic.trace.v1.Span.UserAttributesEntry.key', index=0, - number=1, type=9, cpp_type=9, label=1, - has_default_value=False, default_value=b"".decode('utf-8'), - message_type=None, enum_type=None, containing_type=None, - is_extension=False, extension_scope=None, - serialized_options=None, file=DESCRIPTOR), - _descriptor.FieldDescriptor( - name='value', full_name='com.newrelic.trace.v1.Span.UserAttributesEntry.value', index=1, - number=2, type=11, cpp_type=10, label=1, - has_default_value=False, default_value=None, - message_type=None, enum_type=None, containing_type=None, - is_extension=False, extension_scope=None, - serialized_options=None, file=DESCRIPTOR), - ], - extensions=[ - ], - nested_types=[], - enum_types=[ - ], - serialized_options=b'8\001', - is_extendable=False, - syntax='proto3', - extension_ranges=[], - oneofs=[ - ], - serialized_start=381, - serialized_end=473, - ) - - _SPAN_AGENTATTRIBUTESENTRY = _descriptor.Descriptor( - name='AgentAttributesEntry', - full_name='com.newrelic.trace.v1.Span.AgentAttributesEntry', - filename=None, - file=DESCRIPTOR, - containing_type=None, - fields=[ - _descriptor.FieldDescriptor( - name='key', full_name='com.newrelic.trace.v1.Span.AgentAttributesEntry.key', index=0, - number=1, type=9, cpp_type=9, label=1, - has_default_value=False, default_value=b"".decode('utf-8'), - message_type=None, enum_type=None, containing_type=None, - is_extension=False, extension_scope=None, - serialized_options=None, file=DESCRIPTOR), - _descriptor.FieldDescriptor( - name='value', full_name='com.newrelic.trace.v1.Span.AgentAttributesEntry.value', index=1, - number=2, type=11, cpp_type=10, label=1, - has_default_value=False, default_value=None, - message_type=None, enum_type=None, containing_type=None, - is_extension=False, extension_scope=None, - serialized_options=None, file=DESCRIPTOR), - ], - extensions=[ - ], - nested_types=[], - enum_types=[ - ], - serialized_options=b'8\001', - is_extendable=False, - syntax='proto3', - extension_ranges=[], - oneofs=[ - ], - serialized_start=475, - serialized_end=568, - ) - - _SPAN = _descriptor.Descriptor( - name='Span', - full_name='com.newrelic.trace.v1.Span', - filename=None, - file=DESCRIPTOR, - containing_type=None, - fields=[ - _descriptor.FieldDescriptor( - name='trace_id', full_name='com.newrelic.trace.v1.Span.trace_id', index=0, - number=1, type=9, cpp_type=9, label=1, - has_default_value=False, default_value=b"".decode('utf-8'), - message_type=None, enum_type=None, containing_type=None, - is_extension=False, extension_scope=None, - serialized_options=None, file=DESCRIPTOR), - _descriptor.FieldDescriptor( - name='intrinsics', 
full_name='com.newrelic.trace.v1.Span.intrinsics', index=1, - number=2, type=11, cpp_type=10, label=3, - has_default_value=False, default_value=[], - message_type=None, enum_type=None, containing_type=None, - is_extension=False, extension_scope=None, - serialized_options=None, file=DESCRIPTOR), - _descriptor.FieldDescriptor( - name='user_attributes', full_name='com.newrelic.trace.v1.Span.user_attributes', index=2, - number=3, type=11, cpp_type=10, label=3, - has_default_value=False, default_value=[], - message_type=None, enum_type=None, containing_type=None, - is_extension=False, extension_scope=None, - serialized_options=None, file=DESCRIPTOR), - _descriptor.FieldDescriptor( - name='agent_attributes', full_name='com.newrelic.trace.v1.Span.agent_attributes', index=3, - number=4, type=11, cpp_type=10, label=3, - has_default_value=False, default_value=[], - message_type=None, enum_type=None, containing_type=None, - is_extension=False, extension_scope=None, - serialized_options=None, file=DESCRIPTOR), - ], - extensions=[ - ], - nested_types=[_SPAN_INTRINSICSENTRY, _SPAN_USERATTRIBUTESENTRY, _SPAN_AGENTATTRIBUTESENTRY, ], - enum_types=[ - ], - serialized_options=None, - is_extendable=False, - syntax='proto3', - extension_ranges=[], - oneofs=[ - ], - serialized_start=50, - serialized_end=568, - ) - - - _ATTRIBUTEVALUE = _descriptor.Descriptor( - name='AttributeValue', - full_name='com.newrelic.trace.v1.AttributeValue', - filename=None, - file=DESCRIPTOR, - containing_type=None, - fields=[ - _descriptor.FieldDescriptor( - name='string_value', full_name='com.newrelic.trace.v1.AttributeValue.string_value', index=0, - number=1, type=9, cpp_type=9, label=1, - has_default_value=False, default_value=b"".decode('utf-8'), - message_type=None, enum_type=None, containing_type=None, - is_extension=False, extension_scope=None, - serialized_options=None, file=DESCRIPTOR), - _descriptor.FieldDescriptor( - name='bool_value', full_name='com.newrelic.trace.v1.AttributeValue.bool_value', index=1, - number=2, type=8, cpp_type=7, label=1, - has_default_value=False, default_value=False, - message_type=None, enum_type=None, containing_type=None, - is_extension=False, extension_scope=None, - serialized_options=None, file=DESCRIPTOR), - _descriptor.FieldDescriptor( - name='int_value', full_name='com.newrelic.trace.v1.AttributeValue.int_value', index=2, - number=3, type=3, cpp_type=2, label=1, - has_default_value=False, default_value=0, - message_type=None, enum_type=None, containing_type=None, - is_extension=False, extension_scope=None, - serialized_options=None, file=DESCRIPTOR), - _descriptor.FieldDescriptor( - name='double_value', full_name='com.newrelic.trace.v1.AttributeValue.double_value', index=3, - number=4, type=1, cpp_type=5, label=1, - has_default_value=False, default_value=float(0), - message_type=None, enum_type=None, containing_type=None, - is_extension=False, extension_scope=None, - serialized_options=None, file=DESCRIPTOR), - ], - extensions=[ - ], - nested_types=[], - enum_types=[ - ], - serialized_options=None, - is_extendable=False, - syntax='proto3', - extension_ranges=[], - oneofs=[ - _descriptor.OneofDescriptor( - name='value', full_name='com.newrelic.trace.v1.AttributeValue.value', - index=0, containing_type=None, fields=[]), - ], - serialized_start=570, - serialized_end=686, - ) - - - _RECORDSTATUS = _descriptor.Descriptor( - name='RecordStatus', - full_name='com.newrelic.trace.v1.RecordStatus', - filename=None, - file=DESCRIPTOR, - containing_type=None, - fields=[ - _descriptor.FieldDescriptor( 
- name='messages_seen', full_name='com.newrelic.trace.v1.RecordStatus.messages_seen', index=0, - number=1, type=4, cpp_type=4, label=1, - has_default_value=False, default_value=0, - message_type=None, enum_type=None, containing_type=None, - is_extension=False, extension_scope=None, - serialized_options=None, file=DESCRIPTOR), - ], - extensions=[ - ], - nested_types=[], - enum_types=[ - ], - serialized_options=None, - is_extendable=False, - syntax='proto3', - extension_ranges=[], - oneofs=[ - ], - serialized_start=688, - serialized_end=725, - ) - - _SPAN_INTRINSICSENTRY.fields_by_name['value'].message_type = _ATTRIBUTEVALUE - _SPAN_INTRINSICSENTRY.containing_type = _SPAN - _SPAN_USERATTRIBUTESENTRY.fields_by_name['value'].message_type = _ATTRIBUTEVALUE - _SPAN_USERATTRIBUTESENTRY.containing_type = _SPAN - _SPAN_AGENTATTRIBUTESENTRY.fields_by_name['value'].message_type = _ATTRIBUTEVALUE - _SPAN_AGENTATTRIBUTESENTRY.containing_type = _SPAN - _SPAN.fields_by_name['intrinsics'].message_type = _SPAN_INTRINSICSENTRY - _SPAN.fields_by_name['user_attributes'].message_type = _SPAN_USERATTRIBUTESENTRY - _SPAN.fields_by_name['agent_attributes'].message_type = _SPAN_AGENTATTRIBUTESENTRY - _ATTRIBUTEVALUE.oneofs_by_name['value'].fields.append( - _ATTRIBUTEVALUE.fields_by_name['string_value']) - _ATTRIBUTEVALUE.fields_by_name['string_value'].containing_oneof = _ATTRIBUTEVALUE.oneofs_by_name['value'] - _ATTRIBUTEVALUE.oneofs_by_name['value'].fields.append( - _ATTRIBUTEVALUE.fields_by_name['bool_value']) - _ATTRIBUTEVALUE.fields_by_name['bool_value'].containing_oneof = _ATTRIBUTEVALUE.oneofs_by_name['value'] - _ATTRIBUTEVALUE.oneofs_by_name['value'].fields.append( - _ATTRIBUTEVALUE.fields_by_name['int_value']) - _ATTRIBUTEVALUE.fields_by_name['int_value'].containing_oneof = _ATTRIBUTEVALUE.oneofs_by_name['value'] - _ATTRIBUTEVALUE.oneofs_by_name['value'].fields.append( - _ATTRIBUTEVALUE.fields_by_name['double_value']) - _ATTRIBUTEVALUE.fields_by_name['double_value'].containing_oneof = _ATTRIBUTEVALUE.oneofs_by_name['value'] - DESCRIPTOR.message_types_by_name['Span'] = _SPAN - DESCRIPTOR.message_types_by_name['AttributeValue'] = _ATTRIBUTEVALUE - DESCRIPTOR.message_types_by_name['RecordStatus'] = _RECORDSTATUS - _sym_db.RegisterFileDescriptor(DESCRIPTOR) - - Span = _reflection.GeneratedProtocolMessageType('Span', (_message.Message,), { - - 'IntrinsicsEntry' : _reflection.GeneratedProtocolMessageType('IntrinsicsEntry', (_message.Message,), { - 'DESCRIPTOR' : _SPAN_INTRINSICSENTRY, - '__module__' : 'infinite_tracing_pb2' - # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.Span.IntrinsicsEntry) - }) - , - - 'UserAttributesEntry' : _reflection.GeneratedProtocolMessageType('UserAttributesEntry', (_message.Message,), { - 'DESCRIPTOR' : _SPAN_USERATTRIBUTESENTRY, - '__module__' : 'infinite_tracing_pb2' - # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.Span.UserAttributesEntry) - }) - , - - 'AgentAttributesEntry' : _reflection.GeneratedProtocolMessageType('AgentAttributesEntry', (_message.Message,), { - 'DESCRIPTOR' : _SPAN_AGENTATTRIBUTESENTRY, - '__module__' : 'infinite_tracing_pb2' - # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.Span.AgentAttributesEntry) - }) - , - 'DESCRIPTOR' : _SPAN, - '__module__' : 'infinite_tracing_pb2' - # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.Span) - }) - _sym_db.RegisterMessage(Span) - _sym_db.RegisterMessage(Span.IntrinsicsEntry) - _sym_db.RegisterMessage(Span.UserAttributesEntry) - 
_sym_db.RegisterMessage(Span.AgentAttributesEntry) - - AttributeValue = _reflection.GeneratedProtocolMessageType('AttributeValue', (_message.Message,), { - 'DESCRIPTOR' : _ATTRIBUTEVALUE, - '__module__' : 'infinite_tracing_pb2' - # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.AttributeValue) - }) - _sym_db.RegisterMessage(AttributeValue) - - RecordStatus = _reflection.GeneratedProtocolMessageType('RecordStatus', (_message.Message,), { - 'DESCRIPTOR' : _RECORDSTATUS, - '__module__' : 'infinite_tracing_pb2' - # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.RecordStatus) - }) - _sym_db.RegisterMessage(RecordStatus) - - - _SPAN_INTRINSICSENTRY._options = None - _SPAN_USERATTRIBUTESENTRY._options = None - _SPAN_AGENTATTRIBUTESENTRY._options = None - - _INGESTSERVICE = _descriptor.ServiceDescriptor( - name='IngestService', - full_name='com.newrelic.trace.v1.IngestService', - file=DESCRIPTOR, - index=0, - serialized_options=None, - serialized_start=727, - serialized_end=828, - methods=[ - _descriptor.MethodDescriptor( - name='RecordSpan', - full_name='com.newrelic.trace.v1.IngestService.RecordSpan', - index=0, - containing_service=None, - input_type=_SPAN, - output_type=_RECORDSTATUS, - serialized_options=None, - ), - ]) - _sym_db.RegisterServiceDescriptor(_INGESTSERVICE) - - DESCRIPTOR.services_by_name['IngestService'] = _INGESTSERVICE - - # @@protoc_insertion_point(module_scope) - +# Generated by the protocol buffer compiler. DO NOT EDIT! +# source: v1.proto +"""Generated protocol buffer code.""" +from google.protobuf import descriptor as _descriptor +from google.protobuf import descriptor_pool as _descriptor_pool +from google.protobuf import message as _message +from google.protobuf import reflection as _reflection +from google.protobuf import symbol_database as _symbol_database + +# @@protoc_insertion_point(imports) + +_sym_db = _symbol_database.Default() + + +DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile( + b'\n\x08v1.proto\x12\x15\x63om.newrelic.trace.v1"7\n\tSpanBatch\x12*\n\x05spans\x18\x01 \x03(\x0b\x32\x1b.com.newrelic.trace.v1.Span"\x86\x04\n\x04Span\x12\x10\n\x08trace_id\x18\x01 \x01(\t\x12?\n\nintrinsics\x18\x02 \x03(\x0b\x32+.com.newrelic.trace.v1.Span.IntrinsicsEntry\x12H\n\x0fuser_attributes\x18\x03 \x03(\x0b\x32/.com.newrelic.trace.v1.Span.UserAttributesEntry\x12J\n\x10\x61gent_attributes\x18\x04 \x03(\x0b\x32\x30.com.newrelic.trace.v1.Span.AgentAttributesEntry\x1aX\n\x0fIntrinsicsEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x34\n\x05value\x18\x02 \x01(\x0b\x32%.com.newrelic.trace.v1.AttributeValue:\x02\x38\x01\x1a\\\n\x13UserAttributesEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x34\n\x05value\x18\x02 \x01(\x0b\x32%.com.newrelic.trace.v1.AttributeValue:\x02\x38\x01\x1a]\n\x14\x41gentAttributesEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x34\n\x05value\x18\x02 \x01(\x0b\x32%.com.newrelic.trace.v1.AttributeValue:\x02\x38\x01"t\n\x0e\x41ttributeValue\x12\x16\n\x0cstring_value\x18\x01 \x01(\tH\x00\x12\x14\n\nbool_value\x18\x02 \x01(\x08H\x00\x12\x13\n\tint_value\x18\x03 \x01(\x03H\x00\x12\x16\n\x0c\x64ouble_value\x18\x04 \x01(\x01H\x00\x42\x07\n\x05value"%\n\x0cRecordStatus\x12\x15\n\rmessages_seen\x18\x01 \x01(\x04\x32\xc5\x01\n\rIngestService\x12T\n\nRecordSpan\x12\x1b.com.newrelic.trace.v1.Span\x1a#.com.newrelic.trace.v1.RecordStatus"\x00(\x01\x30\x01\x12^\n\x0fRecordSpanBatch\x12 .com.newrelic.trace.v1.SpanBatch\x1a#.com.newrelic.trace.v1.RecordStatus"\x00(\x01\x30\x01\x62\x06proto3' +) + + +_SPANBATCH = 
DESCRIPTOR.message_types_by_name["SpanBatch"] +_SPAN = DESCRIPTOR.message_types_by_name["Span"] +_SPAN_INTRINSICSENTRY = _SPAN.nested_types_by_name["IntrinsicsEntry"] +_SPAN_USERATTRIBUTESENTRY = _SPAN.nested_types_by_name["UserAttributesEntry"] +_SPAN_AGENTATTRIBUTESENTRY = _SPAN.nested_types_by_name["AgentAttributesEntry"] +_ATTRIBUTEVALUE = DESCRIPTOR.message_types_by_name["AttributeValue"] +_RECORDSTATUS = DESCRIPTOR.message_types_by_name["RecordStatus"] +SpanBatch = _reflection.GeneratedProtocolMessageType( + "SpanBatch", + (_message.Message,), + { + "DESCRIPTOR": _SPANBATCH, + "__module__": "v1_pb2" + # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.SpanBatch) + }, +) +_sym_db.RegisterMessage(SpanBatch) + +Span = _reflection.GeneratedProtocolMessageType( + "Span", + (_message.Message,), + { + "IntrinsicsEntry": _reflection.GeneratedProtocolMessageType( + "IntrinsicsEntry", + (_message.Message,), + { + "DESCRIPTOR": _SPAN_INTRINSICSENTRY, + "__module__": "v1_pb2" + # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.Span.IntrinsicsEntry) + }, + ), + "UserAttributesEntry": _reflection.GeneratedProtocolMessageType( + "UserAttributesEntry", + (_message.Message,), + { + "DESCRIPTOR": _SPAN_USERATTRIBUTESENTRY, + "__module__": "v1_pb2" + # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.Span.UserAttributesEntry) + }, + ), + "AgentAttributesEntry": _reflection.GeneratedProtocolMessageType( + "AgentAttributesEntry", + (_message.Message,), + { + "DESCRIPTOR": _SPAN_AGENTATTRIBUTESENTRY, + "__module__": "v1_pb2" + # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.Span.AgentAttributesEntry) + }, + ), + "DESCRIPTOR": _SPAN, + "__module__": "v1_pb2" + # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.Span) + }, +) +_sym_db.RegisterMessage(Span) +_sym_db.RegisterMessage(Span.IntrinsicsEntry) +_sym_db.RegisterMessage(Span.UserAttributesEntry) +_sym_db.RegisterMessage(Span.AgentAttributesEntry) + +AttributeValue = _reflection.GeneratedProtocolMessageType( + "AttributeValue", + (_message.Message,), + { + "DESCRIPTOR": _ATTRIBUTEVALUE, + "__module__": "v1_pb2" + # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.AttributeValue) + }, +) +_sym_db.RegisterMessage(AttributeValue) + +RecordStatus = _reflection.GeneratedProtocolMessageType( + "RecordStatus", + (_message.Message,), + { + "DESCRIPTOR": _RECORDSTATUS, + "__module__": "v1_pb2" + # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.RecordStatus) + }, +) +_sym_db.RegisterMessage(RecordStatus) + +_INGESTSERVICE = DESCRIPTOR.services_by_name["IngestService"] +if _descriptor._USE_C_DESCRIPTORS is False: + DESCRIPTOR._options = None + _SPAN_INTRINSICSENTRY._options = None + _SPAN_INTRINSICSENTRY._serialized_options = b"8\001" + _SPAN_USERATTRIBUTESENTRY._options = None + _SPAN_USERATTRIBUTESENTRY._serialized_options = b"8\001" + _SPAN_AGENTATTRIBUTESENTRY._options = None + _SPAN_AGENTATTRIBUTESENTRY._serialized_options = b"8\001" + _SPANBATCH._serialized_start = 35 + _SPANBATCH._serialized_end = 90 + _SPAN._serialized_start = 93 + _SPAN._serialized_end = 611 + _SPAN_INTRINSICSENTRY._serialized_start = 334 + _SPAN_INTRINSICSENTRY._serialized_end = 422 + _SPAN_USERATTRIBUTESENTRY._serialized_start = 424 + _SPAN_USERATTRIBUTESENTRY._serialized_end = 516 + _SPAN_AGENTATTRIBUTESENTRY._serialized_start = 518 + _SPAN_AGENTATTRIBUTESENTRY._serialized_end = 611 + _ATTRIBUTEVALUE._serialized_start = 613 + _ATTRIBUTEVALUE._serialized_end = 729 + _RECORDSTATUS._serialized_start = 731 + 
_RECORDSTATUS._serialized_end = 768 + _INGESTSERVICE._serialized_start = 771 + _INGESTSERVICE._serialized_end = 968 +# @@protoc_insertion_point(module_scope) diff --git a/newrelic/core/infinite_tracing_v4_pb2.py b/newrelic/core/infinite_tracing_v4_pb2.py index ae1739670..79e3ec4eb 100644 --- a/newrelic/core/infinite_tracing_v4_pb2.py +++ b/newrelic/core/infinite_tracing_v4_pb2.py @@ -1,5 +1,3 @@ -# -*- coding: utf-8 -*- - # Copyright 2010 New Relic, Inc. # # Licensed under the Apache License, Version 2.0 (the "License"); @@ -17,41 +15,126 @@ # Generated by the protocol buffer compiler. DO NOT EDIT! # source: v1.proto """Generated protocol buffer code.""" -from google.protobuf.internal import builder as _builder from google.protobuf import descriptor as _descriptor from google.protobuf import descriptor_pool as _descriptor_pool +from google.protobuf import message as _message +from google.protobuf import reflection as _reflection from google.protobuf import symbol_database as _symbol_database + # @@protoc_insertion_point(imports) _sym_db = _symbol_database.Default() -DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x08v1.proto\x12\x15\x63om.newrelic.trace.v1\"7\n\tSpanBatch\x12*\n\x05spans\x18\x01 \x03(\x0b\x32\x1b.com.newrelic.trace.v1.Span\"\x86\x04\n\x04Span\x12\x10\n\x08trace_id\x18\x01 \x01(\t\x12?\n\nintrinsics\x18\x02 \x03(\x0b\x32+.com.newrelic.trace.v1.Span.IntrinsicsEntry\x12H\n\x0fuser_attributes\x18\x03 \x03(\x0b\x32/.com.newrelic.trace.v1.Span.UserAttributesEntry\x12J\n\x10\x61gent_attributes\x18\x04 \x03(\x0b\x32\x30.com.newrelic.trace.v1.Span.AgentAttributesEntry\x1aX\n\x0fIntrinsicsEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x34\n\x05value\x18\x02 \x01(\x0b\x32%.com.newrelic.trace.v1.AttributeValue:\x02\x38\x01\x1a\\\n\x13UserAttributesEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x34\n\x05value\x18\x02 \x01(\x0b\x32%.com.newrelic.trace.v1.AttributeValue:\x02\x38\x01\x1a]\n\x14\x41gentAttributesEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x34\n\x05value\x18\x02 \x01(\x0b\x32%.com.newrelic.trace.v1.AttributeValue:\x02\x38\x01\"t\n\x0e\x41ttributeValue\x12\x16\n\x0cstring_value\x18\x01 \x01(\tH\x00\x12\x14\n\nbool_value\x18\x02 \x01(\x08H\x00\x12\x13\n\tint_value\x18\x03 \x01(\x03H\x00\x12\x16\n\x0c\x64ouble_value\x18\x04 \x01(\x01H\x00\x42\x07\n\x05value\"%\n\x0cRecordStatus\x12\x15\n\rmessages_seen\x18\x01 \x01(\x04\x32\xc5\x01\n\rIngestService\x12T\n\nRecordSpan\x12\x1b.com.newrelic.trace.v1.Span\x1a#.com.newrelic.trace.v1.RecordStatus\"\x00(\x01\x30\x01\x12^\n\x0fRecordSpanBatch\x12 .com.newrelic.trace.v1.SpanBatch\x1a#.com.newrelic.trace.v1.RecordStatus\"\x00(\x01\x30\x01\x62\x06proto3') - -_builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals()) -_builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'v1_pb2', globals()) -if _descriptor._USE_C_DESCRIPTORS == False: - - DESCRIPTOR._options = None - _SPAN_INTRINSICSENTRY._options = None - _SPAN_INTRINSICSENTRY._serialized_options = b'8\001' - _SPAN_USERATTRIBUTESENTRY._options = None - _SPAN_USERATTRIBUTESENTRY._serialized_options = b'8\001' - _SPAN_AGENTATTRIBUTESENTRY._options = None - _SPAN_AGENTATTRIBUTESENTRY._serialized_options = b'8\001' - _SPANBATCH._serialized_start=35 - _SPANBATCH._serialized_end=90 - _SPAN._serialized_start=93 - _SPAN._serialized_end=611 - _SPAN_INTRINSICSENTRY._serialized_start=334 - _SPAN_INTRINSICSENTRY._serialized_end=422 - _SPAN_USERATTRIBUTESENTRY._serialized_start=424 - _SPAN_USERATTRIBUTESENTRY._serialized_end=516 - _SPAN_AGENTATTRIBUTESENTRY._serialized_start=518 - 
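# Usage sketch (not part of the generated code): whichever descriptor
# style builds them, the resulting message classes behave the same.
# Names mirror the .proto definitions embedded in the serialized file
# above; the import path matches newrelic/core/infinite_tracing_pb2.py.
#
#     from newrelic.core.infinite_tracing_pb2 import AttributeValue, Span
#
#     span = Span(
#         trace_id="abc123",
#         intrinsics={"name": AttributeValue(string_value="WebTransaction/Function/index")},
#         user_attributes={"retries": AttributeValue(int_value=3)},
#     )
#     payload = span.SerializeToString()    # wire-format bytes
#     roundtrip = Span.FromString(payload)  # parses back to an equal message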
_SPAN_AGENTATTRIBUTESENTRY._serialized_end=611 - _ATTRIBUTEVALUE._serialized_start=613 - _ATTRIBUTEVALUE._serialized_end=729 - _RECORDSTATUS._serialized_start=731 - _RECORDSTATUS._serialized_end=768 - _INGESTSERVICE._serialized_start=771 - _INGESTSERVICE._serialized_end=968 + +DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile( + b'\n\x08v1.proto\x12\x15\x63om.newrelic.trace.v1"7\n\tSpanBatch\x12*\n\x05spans\x18\x01 \x03(\x0b\x32\x1b.com.newrelic.trace.v1.Span"\x86\x04\n\x04Span\x12\x10\n\x08trace_id\x18\x01 \x01(\t\x12?\n\nintrinsics\x18\x02 \x03(\x0b\x32+.com.newrelic.trace.v1.Span.IntrinsicsEntry\x12H\n\x0fuser_attributes\x18\x03 \x03(\x0b\x32/.com.newrelic.trace.v1.Span.UserAttributesEntry\x12J\n\x10\x61gent_attributes\x18\x04 \x03(\x0b\x32\x30.com.newrelic.trace.v1.Span.AgentAttributesEntry\x1aX\n\x0fIntrinsicsEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x34\n\x05value\x18\x02 \x01(\x0b\x32%.com.newrelic.trace.v1.AttributeValue:\x02\x38\x01\x1a\\\n\x13UserAttributesEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x34\n\x05value\x18\x02 \x01(\x0b\x32%.com.newrelic.trace.v1.AttributeValue:\x02\x38\x01\x1a]\n\x14\x41gentAttributesEntry\x12\x0b\n\x03key\x18\x01 \x01(\t\x12\x34\n\x05value\x18\x02 \x01(\x0b\x32%.com.newrelic.trace.v1.AttributeValue:\x02\x38\x01"t\n\x0e\x41ttributeValue\x12\x16\n\x0cstring_value\x18\x01 \x01(\tH\x00\x12\x14\n\nbool_value\x18\x02 \x01(\x08H\x00\x12\x13\n\tint_value\x18\x03 \x01(\x03H\x00\x12\x16\n\x0c\x64ouble_value\x18\x04 \x01(\x01H\x00\x42\x07\n\x05value"%\n\x0cRecordStatus\x12\x15\n\rmessages_seen\x18\x01 \x01(\x04\x32\xc5\x01\n\rIngestService\x12T\n\nRecordSpan\x12\x1b.com.newrelic.trace.v1.Span\x1a#.com.newrelic.trace.v1.RecordStatus"\x00(\x01\x30\x01\x12^\n\x0fRecordSpanBatch\x12 .com.newrelic.trace.v1.SpanBatch\x1a#.com.newrelic.trace.v1.RecordStatus"\x00(\x01\x30\x01\x62\x06proto3' +) + + +_SPANBATCH = DESCRIPTOR.message_types_by_name["SpanBatch"] +_SPAN = DESCRIPTOR.message_types_by_name["Span"] +_SPAN_INTRINSICSENTRY = _SPAN.nested_types_by_name["IntrinsicsEntry"] +_SPAN_USERATTRIBUTESENTRY = _SPAN.nested_types_by_name["UserAttributesEntry"] +_SPAN_AGENTATTRIBUTESENTRY = _SPAN.nested_types_by_name["AgentAttributesEntry"] +_ATTRIBUTEVALUE = DESCRIPTOR.message_types_by_name["AttributeValue"] +_RECORDSTATUS = DESCRIPTOR.message_types_by_name["RecordStatus"] +SpanBatch = _reflection.GeneratedProtocolMessageType( + "SpanBatch", + (_message.Message,), + { + "DESCRIPTOR": _SPANBATCH, + "__module__": "v1_pb2" + # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.SpanBatch) + }, +) +_sym_db.RegisterMessage(SpanBatch) + +Span = _reflection.GeneratedProtocolMessageType( + "Span", + (_message.Message,), + { + "IntrinsicsEntry": _reflection.GeneratedProtocolMessageType( + "IntrinsicsEntry", + (_message.Message,), + { + "DESCRIPTOR": _SPAN_INTRINSICSENTRY, + "__module__": "v1_pb2" + # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.Span.IntrinsicsEntry) + }, + ), + "UserAttributesEntry": _reflection.GeneratedProtocolMessageType( + "UserAttributesEntry", + (_message.Message,), + { + "DESCRIPTOR": _SPAN_USERATTRIBUTESENTRY, + "__module__": "v1_pb2" + # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.Span.UserAttributesEntry) + }, + ), + "AgentAttributesEntry": _reflection.GeneratedProtocolMessageType( + "AgentAttributesEntry", + (_message.Message,), + { + "DESCRIPTOR": _SPAN_AGENTATTRIBUTESENTRY, + "__module__": "v1_pb2" + # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.Span.AgentAttributesEntry) + }, + ), + "DESCRIPTOR": 
_SPAN, + "__module__": "v1_pb2" + # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.Span) + }, +) +_sym_db.RegisterMessage(Span) +_sym_db.RegisterMessage(Span.IntrinsicsEntry) +_sym_db.RegisterMessage(Span.UserAttributesEntry) +_sym_db.RegisterMessage(Span.AgentAttributesEntry) + +AttributeValue = _reflection.GeneratedProtocolMessageType( + "AttributeValue", + (_message.Message,), + { + "DESCRIPTOR": _ATTRIBUTEVALUE, + "__module__": "v1_pb2" + # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.AttributeValue) + }, +) +_sym_db.RegisterMessage(AttributeValue) + +RecordStatus = _reflection.GeneratedProtocolMessageType( + "RecordStatus", + (_message.Message,), + { + "DESCRIPTOR": _RECORDSTATUS, + "__module__": "v1_pb2" + # @@protoc_insertion_point(class_scope:com.newrelic.trace.v1.RecordStatus) + }, +) +_sym_db.RegisterMessage(RecordStatus) + +_INGESTSERVICE = DESCRIPTOR.services_by_name["IngestService"] +if _descriptor._USE_C_DESCRIPTORS is False: + DESCRIPTOR._options = None + _SPAN_INTRINSICSENTRY._options = None + _SPAN_INTRINSICSENTRY._serialized_options = b"8\001" + _SPAN_USERATTRIBUTESENTRY._options = None + _SPAN_USERATTRIBUTESENTRY._serialized_options = b"8\001" + _SPAN_AGENTATTRIBUTESENTRY._options = None + _SPAN_AGENTATTRIBUTESENTRY._serialized_options = b"8\001" + _SPANBATCH._serialized_start = 35 + _SPANBATCH._serialized_end = 90 + _SPAN._serialized_start = 93 + _SPAN._serialized_end = 611 + _SPAN_INTRINSICSENTRY._serialized_start = 334 + _SPAN_INTRINSICSENTRY._serialized_end = 422 + _SPAN_USERATTRIBUTESENTRY._serialized_start = 424 + _SPAN_USERATTRIBUTESENTRY._serialized_end = 516 + _SPAN_AGENTATTRIBUTESENTRY._serialized_start = 518 + _SPAN_AGENTATTRIBUTESENTRY._serialized_end = 611 + _ATTRIBUTEVALUE._serialized_start = 613 + _ATTRIBUTEVALUE._serialized_end = 729 + _RECORDSTATUS._serialized_start = 731 + _RECORDSTATUS._serialized_end = 768 + _INGESTSERVICE._serialized_start = 771 + _INGESTSERVICE._serialized_end = 968 # @@protoc_insertion_point(module_scope) diff --git a/newrelic/core/stats_engine.py b/newrelic/core/stats_engine.py index 0e8e74546..203e3e796 100644 --- a/newrelic/core/stats_engine.py +++ b/newrelic/core/stats_engine.py @@ -26,6 +26,7 @@ import random import sys import time +import traceback import warnings import zlib from heapq import heapify, heapreplace @@ -36,15 +37,21 @@ from newrelic.common.encoding_utils import json_encode from newrelic.common.object_names import parse_exc_info from newrelic.common.streaming_utils import StreamBuffer -from newrelic.core.attribute import create_user_attributes, process_user_attribute, truncate, MAX_LOG_MESSAGE_LENGTH +from newrelic.core.attribute import ( + MAX_LOG_MESSAGE_LENGTH, + create_agent_attributes, + create_user_attributes, + process_user_attribute, + truncate, +) from newrelic.core.attribute_filter import DST_ERROR_COLLECTOR from newrelic.core.code_level_metrics import extract_code_from_traceback from newrelic.core.config import is_expected_error, should_ignore_error from newrelic.core.database_utils import explain_plan from newrelic.core.error_collector import TracedError +from newrelic.core.log_event_node import LogEventNode from newrelic.core.metric import TimeMetric from newrelic.core.stack_trace import exception_stack -from newrelic.core.log_event_node import LogEventNode _logger = logging.getLogger(__name__) @@ -605,13 +612,15 @@ def notice_error(self, error=None, attributes=None, expected=None, ignore=None, if getattr(value, "_nr_ignored", None): return - module, name, fullnames, 
message = parse_exc_info(error) + module, name, fullnames, message_raw = parse_exc_info(error) fullname = fullnames[0] # Check to see if we need to strip the message before recording it. if settings.strip_exception_messages.enabled and fullname not in settings.strip_exception_messages.allowlist: message = STRIP_EXCEPTION_MESSAGE + else: + message = message_raw # Where expected or ignore are a callable they should return a # tri-state variable with the following behavior. @@ -707,6 +716,42 @@ def notice_error(self, error=None, attributes=None, expected=None, ignore=None, user_attributes = create_user_attributes(custom_attributes, settings.attribute_filter) + + # Extract additional details about the exception as agent attributes + agent_attributes = {} + + if settings: + if settings.code_level_metrics and settings.code_level_metrics.enabled: + extract_code_from_traceback(tb).add_attrs(agent_attributes.__setitem__) + + if settings.error_collector and settings.error_collector.error_group_callback is not None: + error_group_name = None + try: + # Call callback to obtain error group name + error_group_name_raw = settings.error_collector.error_group_callback(value, { + "traceback": tb, + "error.class": exc, + "error.message": message_raw, + "error.expected": is_expected, + "custom_params": attributes, + # Transaction specific items should be set to None + "transactionName": None, + "response.status": None, + "request.method": None, + "request.uri": None, + }) + if error_group_name_raw: + _, error_group_name = process_user_attribute("error.group.name", error_group_name_raw) + if error_group_name is None or not isinstance(error_group_name, six.string_types): + raise ValueError("Invalid attribute value for error.group.name. Expected string, got: %s" % repr(error_group_name_raw)) + else: + agent_attributes["error.group.name"] = error_group_name + + except Exception: + _logger.error("Encountered error when calling error group callback:\n%s", "".join(traceback.format_exception(*sys.exc_info()))) + + agent_attributes = create_agent_attributes(agent_attributes, settings.attribute_filter) + # Record the exception details. attributes = {} @@ -726,9 +771,10 @@ def notice_error(self, error=None, attributes=None, expected=None, ignore=None, # set source code attributes attributes["agentAttributes"] = {} - if settings and settings.code_level_metrics and settings.code_level_metrics.enabled: - extract_code_from_traceback(tb).add_attrs(attributes["agentAttributes"].__setitem__) - + for attr in agent_attributes: + if attr.destinations & DST_ERROR_COLLECTOR: + attributes["agentAttributes"][attr.name] = attr.value + error_details = TracedError( start_time=time.time(), path="Exception", message=message, type=fullname, parameters=attributes ) @@ -751,7 +797,6 @@ def notice_error(self, error=None, attributes=None, expected=None, ignore=None, self.record_time_metric(TimeMetric(name="Errors/all", scope="", duration=0.0, exclusive=None)) def _error_event(self, error): - # This method is for recording error events outside of transactions, # don't let the poorly named 'type' attribute fool you. 
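# Callback contract sketch for the error_group_callback consumed above.
# The registration helper named below is an assumption; only the
# settings.error_collector.error_group_callback hook is shown in this patch.
#
#     import newrelic.agent
#
#     def error_group(exc, data):
#         # data carries the keys assembled in notice_error() above:
#         # traceback, error.class, error.message, error.expected,
#         # custom_params, plus transaction fields (None outside one).
#         if isinstance(exc, TimeoutError):
#             return "timeouts"  # stored as the error.group.name agent attribute
#         return None            # falsy result leaves the group unset
#
#     newrelic.agent.set_error_group_callback(error_group)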
@@ -767,12 +812,15 @@ def _error_event(self, error): # Leave agent attributes field blank since not a transaction - error_event = [error.parameters["intrinsics"], error.parameters["userAttributes"], {}] + error_event = [ + error.parameters["intrinsics"], + error.parameters["userAttributes"], + error.parameters["agentAttributes"], + ] return error_event def record_custom_event(self, event): - settings = self.__settings if not settings: @@ -964,7 +1012,6 @@ def record_transaction(self, transaction): transaction_tracer = settings.transaction_tracer if not transaction.suppress_transaction_trace and transaction_tracer.enabled and settings.collect_traces: - # Transactions saved for Synthetics transactions # do not depend on the transaction threshold. @@ -987,7 +1034,6 @@ def record_transaction(self, transaction): self._synthetics_events.add(event) elif settings.collect_analytics_events and settings.transaction_events.enabled: - event = transaction.transaction_event(self.__stats_table) self._transaction_events.add(event, priority=transaction.priority) @@ -1008,40 +1054,50 @@ def record_transaction(self, transaction): # Merge in log events - if settings and settings.application_logging and settings.application_logging.enabled and settings.application_logging.forwarding and settings.application_logging.forwarding.enabled: + if ( + settings + and settings.application_logging + and settings.application_logging.enabled + and settings.application_logging.forwarding + and settings.application_logging.forwarding.enabled + ): self._log_events.merge(transaction.log_events, priority=transaction.priority) - def record_log_event(self, message, level=None, timestamp=None, priority=None): settings = self.__settings - if not (settings and settings.application_logging and settings.application_logging.enabled and settings.application_logging.forwarding and settings.application_logging.forwarding.enabled): + if not ( + settings + and settings.application_logging + and settings.application_logging.enabled + and settings.application_logging.forwarding + and settings.application_logging.forwarding.enabled + ): return - + timestamp = timestamp if timestamp is not None else time.time() level = str(level) if level is not None else "UNKNOWN" if not message or message.isspace(): _logger.debug("record_log_event called where message was missing. 
No log event will be sent.") return - + message = truncate(message, MAX_LOG_MESSAGE_LENGTH) event = LogEventNode( timestamp=timestamp, level=level, message=message, - attributes=get_linking_metadata(), + attributes=get_linking_metadata(), ) if priority is None: # Base priority for log events outside transactions is below those inside transactions - priority = random.random() - 1 + priority = random.random() - 1 # nosec self._log_events.add(event, priority=priority) return event - def metric_data(self, normalizer=None): """Returns a list containing the low level metric data for sending to the core application pertaining to the reporting @@ -1115,7 +1171,6 @@ def error_data(self): return self.__transaction_errors def slow_sql_data(self, connections): - _logger.debug("Generating slow SQL data.") if not self.__settings: @@ -1134,7 +1189,6 @@ def slow_sql_data(self, connections): result = [] for stats_node in slow_sql_nodes: - slow_sql_node = stats_node.slow_sql_node params = slow_sql_node.params or {} @@ -1398,7 +1452,9 @@ def reset_stats(self, settings, reset_stream=False): self.reset_synthetics_events() # streams are never reset after instantiation if reset_stream: - self._span_stream = StreamBuffer(settings.infinite_tracing.span_queue_size) + self._span_stream = StreamBuffer( + settings.infinite_tracing.span_queue_size, batching=settings.infinite_tracing.batching + ) def reset_metric_stats(self): """Resets the accumulated statistics back to initial state for @@ -1612,7 +1668,6 @@ def merge_metric_stats(self, snapshot): stats.merge_stats(other) def _merge_transaction_events(self, snapshot, rollback=False): - # Merge in transaction events. In the normal case snapshot is a # StatsEngine from a single transaction, and should only have one # event. Just to avoid issues, if there is more than one, don't merge. @@ -1631,7 +1686,6 @@ def _merge_transaction_events(self, snapshot, rollback=False): self._transaction_events.merge(events) def _merge_synthetics_events(self, snapshot, rollback=False): - # Merge Synthetic analytic events, appending to the list # that contains events from previous transactions. In the normal # case snapshot is a StatsEngine from a single transaction, and should @@ -1648,7 +1702,6 @@ def _merge_synthetics_events(self, snapshot, rollback=False): self._synthetics_events.merge(events) def _merge_error_events(self, snapshot): - # Merge in error events. Since we are using reservoir sampling that # gives equal probability to keeping each event, merge is the same as # rollback. There may be multiple error events per transaction. @@ -1676,7 +1729,6 @@ def _merge_log_events(self, snapshot, rollback=False): self._log_events.merge(events) def _merge_error_traces(self, snapshot): - # Append snapshot error details at end to maintain time # based order and then trim at maximum to be kept. snapshot will # always have newer data. @@ -1686,7 +1738,6 @@ def _merge_error_traces(self, snapshot): self.__transaction_errors = self.__transaction_errors[:maximum] def _merge_sql(self, snapshot): - # Add sql traces to the set of existing entries. If over # the limit of how many to collect, only merge in if already # seen the specific SQL. 
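# Usage sketch for the record_log_event path above: called outside a
# transaction via the public API (added in #681), the event lands in
# this engine with the reduced base priority (random.random() - 1).
#
#     import newrelic.agent
#
#     newrelic.agent.record_log_event("payment declined", level="WARNING")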
@@ -1701,7 +1752,6 @@ def _merge_sql(self, snapshot): stats.merge_stats(slow_sql_stats) def _merge_traces(self, snapshot): - # Limit number of Synthetics transactions maximum = self.__settings.agent_limits.synthetics_transactions diff --git a/newrelic/core/trace_cache.py b/newrelic/core/trace_cache.py index 1634d0d0b..5f0ddcd3d 100644 --- a/newrelic/core/trace_cache.py +++ b/newrelic/core/trace_cache.py @@ -28,6 +28,11 @@ except ImportError: import _thread as thread +try: + from collections.abc import MutableMapping +except ImportError: + from collections import MutableMapping + from newrelic.core.config import global_settings from newrelic.core.loop_node import LoopNode @@ -92,7 +97,7 @@ class TraceCacheActiveTraceError(RuntimeError): pass -class TraceCache(object): +class TraceCache(MutableMapping): asyncio = cached_module("asyncio") greenlet = cached_module("greenlet") @@ -100,7 +105,7 @@ def __init__(self): self._cache = weakref.WeakValueDictionary() def __repr__(self): - return "<%s object at 0x%x %s>" % (self.__class__.__name__, id(self), str(dict(self._cache.items()))) + return "<%s object at 0x%x %s>" % (self.__class__.__name__, id(self), str(dict(self.items()))) def current_thread_id(self): """Returns the thread ID for the caller. @@ -135,10 +140,10 @@ def current_thread_id(self): def task_start(self, task): trace = self.current_trace() if trace: - self._cache[id(task)] = trace + self[id(task)] = trace def task_stop(self, task): - self._cache.pop(id(task), None) + self.pop(id(task), None) def current_transaction(self): """Return the transaction object if one exists for the currently @@ -146,11 +151,11 @@ def current_transaction(self): """ - trace = self._cache.get(self.current_thread_id()) + trace = self.get(self.current_thread_id()) return trace and trace.transaction def current_trace(self): - return self._cache.get(self.current_thread_id()) + return self.get(self.current_thread_id()) def active_threads(self): """Returns an iterator over all current stack frames for all @@ -169,7 +174,7 @@ def active_threads(self): # First yield up those for real Python threads. for thread_id, frame in sys._current_frames().items(): - trace = self._cache.get(thread_id) + trace = self.get(thread_id) transaction = trace and trace.transaction if transaction is not None: if transaction.background_task: @@ -197,7 +202,7 @@ def active_threads(self): debug = global_settings().debug if debug.enable_coroutine_profiling: - for thread_id, trace in list(self._cache.items()): + for thread_id, trace in self.items(): transaction = trace.transaction if transaction and transaction._greenlet is not None: gr = transaction._greenlet() @@ -212,7 +217,7 @@ def prepare_for_root(self): trace in the cache is from a different task (for asyncio). 
Returns the current trace after the cache is updated.""" thread_id = self.current_thread_id() - trace = self._cache.get(thread_id) + trace = self.get(thread_id) if not trace: return None @@ -221,11 +226,11 @@ def prepare_for_root(self): task = current_task(self.asyncio) if task is not None and id(trace._task) != id(task): - self._cache.pop(thread_id, None) + self.pop(thread_id, None) return None if trace.root and trace.root.exited: - self._cache.pop(thread_id, None) + self.pop(thread_id, None) return None return trace @@ -240,8 +245,8 @@ def save_trace(self, trace): thread_id = trace.thread_id - if thread_id in self._cache: - cache_root = self._cache[thread_id].root + if thread_id in self: + cache_root = self[thread_id].root if cache_root and cache_root is not trace.root and not cache_root.exited: # Cached trace exists and has a valid root still _logger.error( @@ -253,7 +258,7 @@ def save_trace(self, trace): raise TraceCacheActiveTraceError("transaction already active") - self._cache[thread_id] = trace + self[thread_id] = trace # We judge whether we are actually running in a coroutine by # seeing if the current thread ID is actually listed in the set @@ -284,7 +289,7 @@ def pop_current(self, trace): thread_id = trace.thread_id parent = trace.parent - self._cache[thread_id] = parent + self[thread_id] = parent def complete_root(self, root): """Completes a trace specified by the given root @@ -301,7 +306,7 @@ def complete_root(self, root): to_complete = [] for task_id in task_ids: - entry = self._cache.get(task_id) + entry = self.get(task_id) if entry and entry is not root and entry.root is root: to_complete.append(entry) @@ -316,12 +321,12 @@ def complete_root(self, root): thread_id = root.thread_id - if thread_id not in self._cache: + if thread_id not in self: thread_id = self.current_thread_id() - if thread_id not in self._cache: + if thread_id not in self: raise TraceCacheNoActiveTraceError("no active trace") - current = self._cache.get(thread_id) + current = self.get(thread_id) if root is not current: _logger.error( @@ -333,7 +338,7 @@ def complete_root(self, root): raise RuntimeError("not the current trace") - del self._cache[thread_id] + del self[thread_id] root._greenlet = None def record_event_loop_wait(self, start_time, end_time): @@ -359,7 +364,7 @@ def record_event_loop_wait(self, start_time, end_time): task = getattr(transaction.root_span, "_task", None) loop = get_event_loop(task) - for trace in list(self._cache.values()): + for trace in self.values(): if trace in seen: continue @@ -390,6 +395,62 @@ def record_event_loop_wait(self, start_time, end_time): root.increment_child_count() root.add_child(node) + # MutableMapping methods + + def items(self): + """ + Safely iterates on self._cache.items() indirectly using a list of value references + to avoid RuntimeErrors from size changes during iteration. + """ + for wr in self._cache.valuerefs(): + value = wr() # Dereferenced value is potentially no longer live. + if ( + value is not None + ): # weakref is None means weakref has been garbage collected and is no longer live. Ignore. + yield wr.key, value # wr.key is the original dict key + + def keys(self): + """ + Iterates on self._cache.keys() indirectly using a list of value references + to avoid RuntimeErrors from size changes during iteration. + + NOTE: Returned keys are keys to weak references which may at any point be garbage collected. + It is only safe to retrieve values from the trace cache using trace_cache.get(key, None). 
+ Retrieving values using trace_cache[key] can cause a KeyError if the item has been garbage collected. + """ + for wr in self._cache.valuerefs(): + yield wr.key # wr.key is the original dict key + + def values(self): + """ + Safely iterates on self._cache.values() indirectly using a list of value references + to avoid RuntimeErrors from size changes during iteration. + """ + for wr in self._cache.valuerefs(): + value = wr() # Dereferenced value is potentially no longer live. + if ( + value is not None + ): # weakref is None means weakref has been garbage collected and is no longer live. Ignore. + yield value + + def __getitem__(self, key): + return self._cache.__getitem__(key) + + def __setitem__(self, key, value): + self._cache.__setitem__(key, value) + + def __delitem__(self, key): + self._cache.__delitem__(key) + + def __iter__(self): + return self.keys() + + def __len__(self): + return self._cache.__len__() + + def __bool__(self): + return bool(self._cache.__len__()) + _trace_cache = TraceCache() diff --git a/newrelic/core/transaction_node.py b/newrelic/core/transaction_node.py index 97f6f3ebb..0faae3790 100644 --- a/newrelic/core/transaction_node.py +++ b/newrelic/core/transaction_node.py @@ -22,35 +22,81 @@ import newrelic.core.error_collector import newrelic.core.trace_node - +from newrelic.common.streaming_utils import SpanProtoAttrs +from newrelic.core.attribute import create_agent_attributes, create_user_attributes +from newrelic.core.attribute_filter import ( + DST_ERROR_COLLECTOR, + DST_TRANSACTION_EVENTS, + DST_TRANSACTION_TRACER, +) from newrelic.core.metric import ApdexMetric, TimeMetric from newrelic.core.string_table import StringTable -from newrelic.core.attribute import create_user_attributes -from newrelic.core.attribute_filter import (DST_ERROR_COLLECTOR, - DST_TRANSACTION_TRACER, DST_TRANSACTION_EVENTS) - -from newrelic.common.streaming_utils import SpanProtoAttrs try: from newrelic.core.infinite_tracing_pb2 import Span except: pass -_TransactionNode = namedtuple('_TransactionNode', - ['settings', 'path', 'type', 'group', 'base_name', 'name_for_metric', - 'port', 'request_uri', 'queue_start', 'start_time', - 'end_time', 'last_byte_time', 'response_time', 'total_time', - 'duration', 'exclusive', 'root', 'errors', 'slow_sql', - 'custom_events', 'log_events', 'apdex_t', 'suppress_apdex', 'custom_metrics', - 'guid', 'cpu_time', 'suppress_transaction_trace', 'client_cross_process_id', - 'referring_transaction_guid', 'record_tt', 'synthetics_resource_id', - 'synthetics_job_id', 'synthetics_monitor_id', 'synthetics_header', - 'is_part_of_cat', 'trip_id', 'path_hash', 'referring_path_hash', - 'alternate_path_hashes', 'trace_intrinsics', 'agent_attributes', - 'distributed_trace_intrinsics', 'user_attributes', 'priority', - 'sampled', 'parent_transport_duration', 'parent_span', 'parent_type', - 'parent_account', 'parent_app', 'parent_tx', 'parent_transport_type', - 'root_span_guid', 'trace_id', 'loop_time']) +_TransactionNode = namedtuple( + "_TransactionNode", + [ + "settings", + "path", + "type", + "group", + "base_name", + "name_for_metric", + "port", + "request_uri", + "queue_start", + "start_time", + "end_time", + "last_byte_time", + "response_time", + "total_time", + "duration", + "exclusive", + "root", + "errors", + "slow_sql", + "custom_events", + "log_events", + "apdex_t", + "suppress_apdex", + "custom_metrics", + "guid", + "cpu_time", + "suppress_transaction_trace", + "client_cross_process_id", + "referring_transaction_guid", + "record_tt", + 
"synthetics_resource_id", + "synthetics_job_id", + "synthetics_monitor_id", + "synthetics_header", + "is_part_of_cat", + "trip_id", + "path_hash", + "referring_path_hash", + "alternate_path_hashes", + "trace_intrinsics", + "agent_attributes", + "distributed_trace_intrinsics", + "user_attributes", + "priority", + "sampled", + "parent_transport_duration", + "parent_span", + "parent_type", + "parent_account", + "parent_app", + "parent_tx", + "parent_transport_type", + "root_span_guid", + "trace_id", + "loop_time", + ], +) class TransactionNode(_TransactionNode): @@ -71,7 +117,7 @@ def __hash__(self): @property def string_table(self): - result = getattr(self, '_string_table', None) + result = getattr(self, "_string_table", None) if result is not None: return result self._string_table = StringTable() @@ -96,7 +142,7 @@ def time_metrics(self, stats): if not self.base_name: return - if self.type == 'WebTransaction': + if self.type == "WebTransaction": # Report time taken by request dispatcher. We don't # know upstream time distinct from actual request # time so can't report time exclusively in the @@ -109,11 +155,7 @@ def time_metrics(self, stats): # and how the exclusive component would appear in # the overview graphs. - yield TimeMetric( - name='HttpDispatcher', - scope='', - duration=self.response_time, - exclusive=None) + yield TimeMetric(name="HttpDispatcher", scope="", duration=self.response_time, exclusive=None) # Upstream queue time within any web server front end. @@ -128,114 +170,84 @@ def time_metrics(self, stats): if queue_wait < 0: queue_wait = 0 - yield TimeMetric( - name='WebFrontend/QueueTime', - scope='', - duration=queue_wait, - exclusive=None) + yield TimeMetric(name="WebFrontend/QueueTime", scope="", duration=queue_wait, exclusive=None) # Generate the full transaction metric. - yield TimeMetric( - name=self.path, - scope='', - duration=self.response_time, - exclusive=self.exclusive) + yield TimeMetric(name=self.path, scope="", duration=self.response_time, exclusive=self.exclusive) # Generate the rollup metric. - if self.type != 'WebTransaction': - rollup = '%s/all' % self.type + if self.type != "WebTransaction": + rollup = "%s/all" % self.type else: rollup = self.type - yield TimeMetric( - name=rollup, - scope='', - duration=self.response_time, - exclusive=self.exclusive) + yield TimeMetric(name=rollup, scope="", duration=self.response_time, exclusive=self.exclusive) # Generate Unscoped Total Time metrics. 
- if self.type == 'WebTransaction': - metric_prefix = 'WebTransactionTotalTime' - metric_suffix = 'Web' + if self.type == "WebTransaction": + metric_prefix = "WebTransactionTotalTime" + metric_suffix = "Web" else: - metric_prefix = 'OtherTransactionTotalTime' - metric_suffix = 'Other' + metric_prefix = "OtherTransactionTotalTime" + metric_suffix = "Other" yield TimeMetric( - name='%s/%s' % (metric_prefix, self.name_for_metric), - scope='', - duration=self.total_time, - exclusive=self.total_time) + name="%s/%s" % (metric_prefix, self.name_for_metric), + scope="", + duration=self.total_time, + exclusive=self.total_time, + ) - yield TimeMetric( - name=metric_prefix, - scope='', - duration=self.total_time, - exclusive=self.total_time) + yield TimeMetric(name=metric_prefix, scope="", duration=self.total_time, exclusive=self.total_time) # Generate Distributed Tracing metrics if self.settings.distributed_tracing.enabled: dt_tag = "%s/%s/%s/%s/all" % ( - self.parent_type or 'Unknown', - self.parent_account or 'Unknown', - self.parent_app or 'Unknown', - self.parent_transport_type or 'Unknown') + self.parent_type or "Unknown", + self.parent_account or "Unknown", + self.parent_app or "Unknown", + self.parent_transport_type or "Unknown", + ) - for bonus_tag in ('', metric_suffix): + for bonus_tag in ("", metric_suffix): yield TimeMetric( name="DurationByCaller/%s%s" % (dt_tag, bonus_tag), - scope='', + scope="", duration=self.duration, - exclusive=self.duration) + exclusive=self.duration, + ) if self.parent_transport_duration is not None: yield TimeMetric( name="TransportDuration/%s%s" % (dt_tag, bonus_tag), - scope='', + scope="", duration=self.parent_transport_duration, - exclusive=self.parent_transport_duration) + exclusive=self.parent_transport_duration, + ) if self.errors: yield TimeMetric( - name='ErrorsByCaller/%s%s' % (dt_tag, bonus_tag), - scope='', - duration=0.0, - exclusive=None) + name="ErrorsByCaller/%s%s" % (dt_tag, bonus_tag), scope="", duration=0.0, exclusive=None + ) # Generate Error metrics if self.errors: if False in (error.expected for error in self.errors): # Generate overall rollup metric indicating if errors present. - yield TimeMetric( - name='Errors/all', - scope='', - duration=0.0, - exclusive=None) + yield TimeMetric(name="Errors/all", scope="", duration=0.0, exclusive=None) # Generate individual error metric for transaction. - yield TimeMetric( - name='Errors/%s' % self.path, - scope='', - duration=0.0, - exclusive=None) + yield TimeMetric(name="Errors/%s" % self.path, scope="", duration=0.0, exclusive=None) # Generate rollup metric for WebTransaction errors. - yield TimeMetric( - name='Errors/all%s' % metric_suffix, - scope='', - duration=0.0, - exclusive=None) + yield TimeMetric(name="Errors/all%s" % metric_suffix, scope="", duration=0.0, exclusive=None) else: - yield TimeMetric( - name='ErrorsExpected/all', - scope='', - duration=0.0, - exclusive=None) + yield TimeMetric(name="ErrorsExpected/all", scope="", duration=0.0, exclusive=None) # Now for the children. for child in self.root.children: @@ -243,9 +255,7 @@ def time_metrics(self, stats): yield metric def apdex_metrics(self, stats): - """Return a generator yielding the apdex metrics for this node. - - """ + """Return a generator yielding the apdex metrics for this node.""" if not self.base_name: return @@ -255,7 +265,7 @@ def apdex_metrics(self, stats): # The apdex metrics are only relevant to web transactions. 
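# Worked example of the apdex bucketing used in this file (T = apdex_t,
# here 0.5s): 0.3s is satisfying (<= T), 1.2s is tolerating (<= 4T),
# 2.5s is frustrating, and any unexpected error forces 'F'. This mirrors
# apdex_perf_zone() further down in this diff.
#
#     def zone(duration, apdex_t, unexpected_error=False):
#         if unexpected_error:
#             return "F"
#         if duration <= apdex_t:
#             return "S"
#         return "T" if duration <= 4 * apdex_t else "F"
#
#     assert [zone(0.3, 0.5), zone(1.2, 0.5), zone(2.5, 0.5)] == ["S", "T", "F"]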
- if self.type != 'WebTransaction': + if self.type != "WebTransaction": return # The magic calculations based on apdex_t. The apdex_t @@ -280,20 +290,18 @@ def apdex_metrics(self, stats): # Generate the full apdex metric. yield ApdexMetric( - name='Apdex/%s' % self.name_for_metric, - satisfying=satisfying, - tolerating=tolerating, - frustrating=frustrating, - apdex_t=self.apdex_t) + name="Apdex/%s" % self.name_for_metric, + satisfying=satisfying, + tolerating=tolerating, + frustrating=frustrating, + apdex_t=self.apdex_t, + ) # Generate the rollup metric. yield ApdexMetric( - name='Apdex', - satisfying=satisfying, - tolerating=tolerating, - frustrating=frustrating, - apdex_t=self.apdex_t) + name="Apdex", satisfying=satisfying, tolerating=tolerating, frustrating=frustrating, apdex_t=self.apdex_t + ) def error_details(self): """Return a generator yielding the details for each unique error @@ -318,38 +326,49 @@ def error_details(self): params = {} params["stack_trace"] = error.stack_trace - intrinsics = {'spanId': error.span_id, 'error.expected': error.expected} + intrinsics = {"spanId": error.span_id, "error.expected": error.expected} intrinsics.update(self.trace_intrinsics) - params['intrinsics'] = intrinsics + params["intrinsics"] = intrinsics - params['agentAttributes'] = {} + params["agentAttributes"] = {} for attr in self.agent_attributes: if attr.destinations & DST_ERROR_COLLECTOR: - params['agentAttributes'][attr.name] = attr.value + params["agentAttributes"][attr.name] = attr.value - params['userAttributes'] = {} + params["userAttributes"] = {} for attr in self.user_attributes: if attr.destinations & DST_ERROR_COLLECTOR: - params['userAttributes'][attr.name] = attr.value + params["userAttributes"][attr.name] = attr.value - # add source context attrs for error - if self.settings and self.settings.code_level_metrics and self.settings.code_level_metrics.enabled and getattr(error, "source", None): - error.source.add_attrs(params['agentAttributes'].__setitem__) + # add error specific agent attributes to this error's agentAttributes - # add error specific custom params to this error's userAttributes + err_agent_attrs = {} + error_group_name = error.error_group_name + if error_group_name: + err_agent_attrs["error.group.name"] = error_group_name - err_attrs = create_user_attributes(error.custom_params, - self.settings.attribute_filter) + # add source context attrs for error + if ( + self.settings + and self.settings.code_level_metrics + and self.settings.code_level_metrics.enabled + and getattr(error, "source", None) + ): + error.source.add_attrs(err_agent_attrs.__setitem__) + + err_agent_attrs = create_agent_attributes(err_agent_attrs, self.settings.attribute_filter) + for attr in err_agent_attrs: + if attr.destinations & DST_ERROR_COLLECTOR: + params["agentAttributes"][attr.name] = attr.value + + err_attrs = create_user_attributes(error.custom_params, self.settings.attribute_filter) for attr in err_attrs: if attr.destinations & DST_ERROR_COLLECTOR: - params['userAttributes'][attr.name] = attr.value + params["userAttributes"][attr.name] = attr.value yield newrelic.core.error_collector.TracedError( - start_time=error.timestamp, - path=self.path, - message=error.message, - type=error.type, - parameters=params) + start_time=error.timestamp, path=self.path, message=error.message, type=error.type, parameters=params + ) def transaction_trace(self, stats, limit, connections): @@ -362,19 +381,19 @@ def transaction_trace(self, stats, limit, connections): attributes = {} - attributes['intrinsics'] = 
self.trace_intrinsics + attributes["intrinsics"] = self.trace_intrinsics - attributes['agentAttributes'] = {} + attributes["agentAttributes"] = {} for attr in self.agent_attributes: if attr.destinations & DST_TRANSACTION_TRACER: - attributes['agentAttributes'][attr.name] = attr.value - if attr.name == 'request.uri': + attributes["agentAttributes"][attr.name] = attr.value + if attr.name == "request.uri": self.include_transaction_trace_request_uri = True - attributes['userAttributes'] = {} + attributes["userAttributes"] = {} for attr in self.user_attributes: if attr.destinations & DST_TRANSACTION_TRACER: - attributes['userAttributes'][attr.name] = attr.value + attributes["userAttributes"][attr.name] = attr.value # There is an additional trace node labeled as 'ROOT' # that needs to be inserted below the root node object @@ -382,19 +401,17 @@ def transaction_trace(self, stats, limit, connections): # from the actual top node for the transaction. root = newrelic.core.trace_node.TraceNode( - start_time=trace_node.start_time, - end_time=trace_node.end_time, - name='ROOT', - params={}, - children=[trace_node], - label=None) + start_time=trace_node.start_time, + end_time=trace_node.end_time, + name="ROOT", + params={}, + children=[trace_node], + label=None, + ) return newrelic.core.trace_node.RootNode( - start_time=start_time, - empty0={}, - empty1={}, - root=root, - attributes=attributes) + start_time=start_time, empty0={}, empty1={}, root=root, attributes=attributes + ) def slow_sql_nodes(self, stats): for item in self.slow_sql: @@ -405,18 +422,18 @@ def apdex_perf_zone(self): # Apdex is only valid for WebTransactions. - if self.type != 'WebTransaction': + if self.type != "WebTransaction": return None if self.errors and False in (error.expected for error in self.errors): - return 'F' + return "F" else: if self.duration <= self.apdex_t: - return 'S' + return "S" elif self.duration <= 4 * self.apdex_t: - return 'T' + return "T" else: - return 'F' + return "F" def transaction_event(self, stats_table): # Create the transaction event, which is a list of attributes. 
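# Shape sketch: like the error event assembled in stats_engine above,
# the transaction event is a three-element list of attribute dicts,
# [intrinsics, user_attributes, agent_attributes]; values are illustrative.
#
#     example_event = [
#         {"type": "Transaction", "name": "WebTransaction/Function/index", "totalTime": 0.12},
#         {"customer_tier": "gold"},   # user attributes
#         {"request.uri": "/index"},   # agent attributes
#     ]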
@@ -445,39 +462,38 @@ def transaction_event_intrinsics(self, stats_table): intrinsics = self._event_intrinsics(stats_table) - intrinsics['type'] = 'Transaction' - intrinsics['name'] = self.path - intrinsics['totalTime'] = self.total_time + intrinsics["type"] = "Transaction" + intrinsics["name"] = self.path + intrinsics["totalTime"] = self.total_time def _add_if_not_empty(key, value): if value: intrinsics[key] = value + apdex_perf_zone = self.apdex_perf_zone() + _add_if_not_empty("apdexPerfZone", apdex_perf_zone) + _add_if_not_empty("nr.apdexPerfZone", apdex_perf_zone) + if self.errors: - intrinsics['error'] = True + intrinsics["error"] = True if self.path_hash: - intrinsics['nr.guid'] = self.guid - intrinsics['nr.tripId'] = self.trip_id - intrinsics['nr.pathHash'] = self.path_hash - - _add_if_not_empty('nr.referringPathHash', - self.referring_path_hash) - _add_if_not_empty('nr.alternatePathHashes', - ','.join(self.alternate_path_hashes)) - _add_if_not_empty('nr.referringTransactionGuid', - self.referring_transaction_guid) - _add_if_not_empty('nr.apdexPerfZone', - self.apdex_perf_zone()) + intrinsics["nr.guid"] = self.guid + intrinsics["nr.tripId"] = self.trip_id + intrinsics["nr.pathHash"] = self.path_hash + + _add_if_not_empty("nr.referringPathHash", self.referring_path_hash) + _add_if_not_empty("nr.alternatePathHashes", ",".join(self.alternate_path_hashes)) + _add_if_not_empty("nr.referringTransactionGuid", self.referring_transaction_guid) if self.synthetics_resource_id: - intrinsics['nr.guid'] = self.guid + intrinsics["nr.guid"] = self.guid if self.parent_tx: - intrinsics['parentId'] = self.parent_tx + intrinsics["parentId"] = self.parent_tx if self.parent_span: - intrinsics['parentSpanId'] = self.parent_span + intrinsics["parentSpanId"] = self.parent_span return intrinsics @@ -500,10 +516,21 @@ def error_events(self, stats_table): if attr.destinations & DST_ERROR_COLLECTOR: user_attributes[attr.name] = attr.value + # add error specific agent attributes to this error's agentAttributes + + err_agent_attrs = {} + error_group_name = error.error_group_name + if error_group_name: + err_agent_attrs["error.group.name"] = error_group_name + + err_agent_attrs = create_agent_attributes(err_agent_attrs, self.settings.attribute_filter) + for attr in err_agent_attrs: + if attr.destinations & DST_ERROR_COLLECTOR: + agent_attributes[attr.name] = attr.value + # add error specific custom params to this error's userAttributes - err_attrs = create_user_attributes(error.custom_params, - self.settings.attribute_filter) + err_attrs = create_user_attributes(error.custom_params, self.settings.attribute_filter) for attr in err_attrs: if attr.destinations & DST_ERROR_COLLECTOR: user_attributes[attr.name] = attr.value @@ -517,24 +544,24 @@ def error_event_intrinsics(self, error, stats_table): intrinsics = self._event_intrinsics(stats_table) - intrinsics['type'] = "TransactionError" - intrinsics['error.class'] = error.type - intrinsics['error.message'] = error.message - intrinsics['error.expected'] = error.expected - intrinsics['transactionName'] = self.path - intrinsics['spanId'] = error.span_id + intrinsics["type"] = "TransactionError" + intrinsics["error.class"] = error.type + intrinsics["error.message"] = error.message + intrinsics["error.expected"] = error.expected + intrinsics["transactionName"] = self.path + intrinsics["spanId"] = error.span_id - intrinsics['nr.transactionGuid'] = self.guid + intrinsics["nr.transactionGuid"] = self.guid if self.referring_transaction_guid: guid = 
self.referring_transaction_guid - intrinsics['nr.referringTransactionGuid'] = guid + intrinsics["nr.referringTransactionGuid"] = guid return intrinsics def _event_intrinsics(self, stats_table): """Common attributes for analytics events""" - cache = getattr(self, '_event_intrinsics_cache', None) + cache = getattr(self, "_event_intrinsics_cache", None) if cache is not None: # We don't want to execute this function more than once, since @@ -544,24 +571,24 @@ def _event_intrinsics(self, stats_table): intrinsics = self.distributed_trace_intrinsics.copy() - intrinsics['timestamp'] = int(1000.0 * self.start_time) - intrinsics['duration'] = self.response_time + intrinsics["timestamp"] = int(1000.0 * self.start_time) + intrinsics["duration"] = self.response_time if self.port: - intrinsics['port'] = self.port + intrinsics["port"] = self.port # Add the Synthetics attributes to the intrinsics dict. if self.synthetics_resource_id: - intrinsics['nr.syntheticsResourceId'] = self.synthetics_resource_id - intrinsics['nr.syntheticsJobId'] = self.synthetics_job_id - intrinsics['nr.syntheticsMonitorId'] = self.synthetics_monitor_id + intrinsics["nr.syntheticsResourceId"] = self.synthetics_resource_id + intrinsics["nr.syntheticsJobId"] = self.synthetics_job_id + intrinsics["nr.syntheticsMonitorId"] = self.synthetics_monitor_id def _add_call_time(source, target): # include time for keys previously added to stats table via # stats_engine.record_transaction - if (source, '') in stats_table: - call_time = stats_table[(source, '')].total_call_time + if (source, "") in stats_table: + call_time = stats_table[(source, "")].total_call_time if target in intrinsics: intrinsics[target] += call_time else: @@ -570,45 +597,43 @@ def _add_call_time(source, target): def _add_call_count(source, target): # include counts for keys previously added to stats table via # stats_engine.record_transaction - if (source, '') in stats_table: - call_count = stats_table[(source, '')].call_count + if (source, "") in stats_table: + call_count = stats_table[(source, "")].call_count if target in intrinsics: intrinsics[target] += call_count else: intrinsics[target] = call_count - _add_call_time('WebFrontend/QueueTime', 'queueDuration') + _add_call_time("WebFrontend/QueueTime", "queueDuration") - _add_call_time('External/all', 'externalDuration') - _add_call_time('Datastore/all', 'databaseDuration') - _add_call_time('Memcache/all', 'memcacheDuration') + _add_call_time("External/all", "externalDuration") + _add_call_time("Datastore/all", "databaseDuration") + _add_call_time("Memcache/all", "memcacheDuration") - _add_call_count('External/all', 'externalCallCount') - _add_call_count('Datastore/all', 'databaseCallCount') + _add_call_count("External/all", "externalCallCount") + _add_call_count("Datastore/all", "databaseCallCount") if self.loop_time: - intrinsics['eventLoopTime'] = self.loop_time - _add_call_time('EventLoop/Wait/all', 'eventLoopWait') + intrinsics["eventLoopTime"] = self.loop_time + _add_call_time("EventLoop/Wait/all", "eventLoopWait") self._event_intrinsics_cache = intrinsics.copy() return intrinsics def span_protos(self, settings): - for i_attrs, u_attrs, a_attrs in self.span_events( - settings, attr_class=SpanProtoAttrs): - yield Span(trace_id=self.trace_id, - intrinsics=i_attrs, - user_attributes=u_attrs, - agent_attributes=a_attrs) + for i_attrs, u_attrs, a_attrs in self.span_events(settings, attr_class=SpanProtoAttrs): + yield Span(trace_id=self.trace_id, intrinsics=i_attrs, user_attributes=u_attrs, agent_attributes=a_attrs) def 
span_events(self, settings, attr_class=dict): - base_attrs = attr_class(( - ('transactionId', self.guid), - ('traceId', self.trace_id), - ('sampled', self.sampled), - ('priority', self.priority), - )) + base_attrs = attr_class( + ( + ("transactionId", self.guid), + ("traceId", self.trace_id), + ("sampled", self.sampled), + ("priority", self.priority), + ) + ) for event in self.root.span_events( settings, diff --git a/newrelic/hooks/adapter_daphne.py b/newrelic/hooks/adapter_daphne.py index 430d9c4b3..f18cb779a 100644 --- a/newrelic/hooks/adapter_daphne.py +++ b/newrelic/hooks/adapter_daphne.py @@ -13,6 +13,7 @@ # limitations under the License. from newrelic.api.asgi_application import ASGIApplicationWrapper +from newrelic.common.package_version_utils import get_package_version @property @@ -22,9 +23,10 @@ def application(self): @application.setter def application(self, value): + dispatcher_details = ("Daphne", get_package_version("daphne")) # Wrap app only once if value and not getattr(value, "_nr_wrapped", False): - value = ASGIApplicationWrapper(value) + value = ASGIApplicationWrapper(value, dispatcher=dispatcher_details) value._nr_wrapped = True self._nr_application = value diff --git a/newrelic/hooks/adapter_hypercorn.py b/newrelic/hooks/adapter_hypercorn.py index f22dc74f1..8dec936ef 100644 --- a/newrelic/hooks/adapter_hypercorn.py +++ b/newrelic/hooks/adapter_hypercorn.py @@ -15,6 +15,7 @@ from newrelic.api.asgi_application import ASGIApplicationWrapper from newrelic.api.wsgi_application import WSGIApplicationWrapper from newrelic.common.object_wrapper import wrap_function_wrapper +from newrelic.common.package_version_utils import get_package_version def bind_worker_serve(app, *args, **kwargs): @@ -24,6 +25,7 @@ def bind_worker_serve(app, *args, **kwargs): async def wrap_worker_serve(wrapped, instance, args, kwargs): import hypercorn + dispatcher_details = ("Hypercorn", get_package_version("hypercorn")) wrapper_module = getattr(hypercorn, "app_wrappers", None) asgi_wrapper_class = getattr(wrapper_module, "ASGIWrapper", None) wsgi_wrapper_class = getattr(wrapper_module, "WSGIWrapper", None) @@ -32,13 +34,14 @@ async def wrap_worker_serve(wrapped, instance, args, kwargs): # Hypercorn 0.14.1 introduced wrappers for ASGI and WSGI apps that need to be above our instrumentation. if asgi_wrapper_class is not None and isinstance(app, asgi_wrapper_class): - app.app = ASGIApplicationWrapper(app.app) + app.app = ASGIApplicationWrapper(app.app, dispatcher=dispatcher_details) elif wsgi_wrapper_class is not None and isinstance(app, wsgi_wrapper_class): - app.app = WSGIApplicationWrapper(app.app) + app.app = WSGIApplicationWrapper(app.app, dispatcher=dispatcher_details) else: - app = ASGIApplicationWrapper(app) + app = ASGIApplicationWrapper(app, dispatcher=dispatcher_details) app._nr_wrapped = True + return await wrapped(app, *args, **kwargs) diff --git a/newrelic/hooks/adapter_waitress.py b/newrelic/hooks/adapter_waitress.py index bdeb15b37..2353510e3 100644 --- a/newrelic/hooks/adapter_waitress.py +++ b/newrelic/hooks/adapter_waitress.py @@ -12,17 +12,16 @@ # See the License for the specific language governing permissions and # limitations under the License. 
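Editor's note: the adapter hooks above all thread a ("Name", get_package_version(...)) dispatcher tuple into the application wrappers, and the Daphne setter guards against double wrapping with a _nr_wrapped flag. A minimal sketch of that wrap-once pattern follows; TracingASGIWrapper is a hypothetical stand-in for ASGIApplicationWrapper, and only the guard logic mirrors the hooks.

# Sketch only: TracingASGIWrapper is a hypothetical stand-in for
# ASGIApplicationWrapper; the wrap-once guard mirrors the Daphne hook above.
class TracingASGIWrapper:
    def __init__(self, app, dispatcher=None):
        self.app = app
        self.dispatcher = dispatcher  # e.g. ("Daphne", "4.0.0")

    async def __call__(self, scope, receive, send):
        return await self.app(scope, receive, send)

def wrap_once(app, dispatcher_details):
    # Wrap app only once, no matter how often the attribute is re-assigned.
    if app and not getattr(app, "_nr_wrapped", False):
        app = TracingASGIWrapper(app, dispatcher=dispatcher_details)
        app._nr_wrapped = True
    return app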
-import newrelic.api.wsgi_application -import newrelic.api.in_function +from newrelic.api.in_function import wrap_in_function +from newrelic.api.wsgi_application import WSGIApplicationWrapper +from newrelic.common.package_version_utils import get_package_version -def instrument_waitress_server(module): - def wrap_wsgi_application_entry_point(server, application, - *args, **kwargs): - application = newrelic.api.wsgi_application.WSGIApplicationWrapper( - application) +def instrument_waitress_server(module): + def wrap_wsgi_application_entry_point(server, application, *args, **kwargs): + dispatcher_details = ("Waitress", get_package_version("waitress")) + application = WSGIApplicationWrapper(application, dispatcher=dispatcher_details) args = [server, application] + list(args) return (args, kwargs) - newrelic.api.in_function.wrap_in_function(module, - 'WSGIServer.__init__', wrap_wsgi_application_entry_point) + wrap_in_function(module, "WSGIServer.__init__", wrap_wsgi_application_entry_point) diff --git a/newrelic/hooks/component_sentry.py b/newrelic/hooks/component_sentry.py new file mode 100644 index 000000000..cc54efa9b --- /dev/null +++ b/newrelic/hooks/component_sentry.py @@ -0,0 +1,41 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +from newrelic.common.object_wrapper import FunctionWrapper, wrap_function_wrapper + +# This is NOT a fully-featured instrumentation for the sentry SDK. Instead +# this is a monkey-patch of the sentry SDK to work around a bug that causes +# improper ASGI 2/3 version detection when inspecting our wrappers. We fix this +# by manually unwrapping the application when version detection is run. + + +def bind__looks_like_asgi3(app): + return app + + +def wrap__looks_like_asgi3(wrapped, instance, args, kwargs): + try: + app = bind__looks_like_asgi3(*args, **kwargs) + except Exception: + return wrapped(*args, **kwargs) + + while isinstance(app, FunctionWrapper) and hasattr(app, "__wrapped__"): + app = app.__wrapped__ + + return wrapped(app) + + +def instrument_sentry_sdk_integrations_asgi(module): + if hasattr(module, "_looks_like_asgi3"): + wrap_function_wrapper(module, "_looks_like_asgi3", wrap__looks_like_asgi3) diff --git a/newrelic/hooks/datastore_aioredis.py b/newrelic/hooks/datastore_aioredis.py index a2267960c..daced369a 100644 --- a/newrelic/hooks/datastore_aioredis.py +++ b/newrelic/hooks/datastore_aioredis.py @@ -12,25 +12,17 @@ # See the License for the specific language governing permissions and # limitations under the License. 
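Editor's note: the new component_sentry.py above works because ASGI v2/v3 detection inspects the app callable, so transparent wrapper layers must be peeled off first by following the __wrapped__ chain. A self-contained sketch of the same idiom, using a functools.wraps-based wrapper as a stand-in for the agent's FunctionWrapper:

# Demonstrates why unwrapping is needed: the sync wrapper hides the
# coroutine function from introspection until __wrapped__ is followed.
import functools
import inspect

async def asgi3_app(scope, receive, send):
    pass

def traced(func):
    @functools.wraps(func)  # records func as wrapper.__wrapped__
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

app = traced(asgi3_app)
assert not inspect.iscoroutinefunction(app)  # wrapper fools detection

while hasattr(app, "__wrapped__"):  # the unwrap loop from the hook above
    app = app.__wrapped__
assert inspect.iscoroutinefunction(app)  # detection works again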
-from newrelic.api.datastore_trace import DatastoreTrace, DatastoreTraceWrapper +from newrelic.api.datastore_trace import DatastoreTrace from newrelic.api.time_trace import current_trace from newrelic.api.transaction import current_transaction -from newrelic.common.object_wrapper import wrap_function_wrapper, function_wrapper, FunctionWrapper +from newrelic.common.object_wrapper import function_wrapper, wrap_function_wrapper +from newrelic.common.package_version_utils import get_package_version_tuple from newrelic.hooks.datastore_redis import ( _redis_client_methods, _redis_multipart_commands, _redis_operation_re, ) -from newrelic.common.async_wrapper import async_wrapper - -import aioredis - -try: - AIOREDIS_VERSION = tuple(int(x) for x in getattr(aioredis, "__version__").split(".")) -except Exception: - AIOREDIS_VERSION = (0, 0, 0) - def _conn_attrs_to_dict(connection): host = getattr(connection, "host", None) @@ -47,14 +39,13 @@ def _conn_attrs_to_dict(connection): def _instance_info(kwargs): host = kwargs.get("host") or "localhost" - port_path_or_id = str(kwargs.get("port") or kwargs.get("path", 6379)) + port_path_or_id = str(kwargs.get("path") or kwargs.get("port", 6379)) db = str(kwargs.get("db") or 0) return (host, port_path_or_id, db) def _wrap_AioRedis_method_wrapper(module, instance_class_name, operation): - @function_wrapper async def _nr_wrapper_AioRedis_async_method_(wrapped, instance, args, kwargs): transaction = current_transaction() @@ -63,28 +54,35 @@ async def _nr_wrapper_AioRedis_async_method_(wrapped, instance, args, kwargs): with DatastoreTrace(product="Redis", target=None, operation=operation): return await wrapped(*args, **kwargs) - + def _nr_wrapper_AioRedis_method_(wrapped, instance, args, kwargs): # Check for transaction and return early if found. # Method will return synchronously without executing, # it will be added to the command stack and run later. - if AIOREDIS_VERSION < (2,): + aioredis_version = get_package_version_tuple("aioredis") + # This conditional is for versions of aioredis that are outside + # New Relic's supportability window but will still work. New + # Relic does not provide testing/support for this. In order to + # keep functionality without affecting coverage metrics, this + # segment is excluded from coverage analysis. + if aioredis_version and aioredis_version < (2,): # pragma: no cover # AioRedis v1 uses a RedisBuffer instead of a real connection for queueing up pipeline commands from aioredis.commands.transaction import _RedisBuffer + if isinstance(instance._pool_or_conn, _RedisBuffer): # Method will return synchronously without executing, # it will be added to the command stack and run later. return wrapped(*args, **kwargs) else: # AioRedis v2 uses a Pipeline object for a client and internally queues up pipeline commands - from aioredis.client import Pipeline + if aioredis_version: + from aioredis.client import Pipeline if isinstance(instance, Pipeline): return wrapped(*args, **kwargs) # Method should be run when awaited, therefore we wrap in an async wrapper. return _nr_wrapper_AioRedis_async_method_(wrapped)(*args, **kwargs) - name = "%s.%s" % (instance_class_name, operation) wrap_function_wrapper(module, name, _nr_wrapper_AioRedis_method_) @@ -113,7 +111,9 @@ async def wrap_Connection_send_command(wrapped, instance, args, kwargs): # If it's not a multi part command, there's no need to trace it, so # we can return early. 
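Editor's note: the aioredis hook above now gates the v1-only code path on a version tuple fetched at call time, treating the value as possibly falsy (the hook assumes None when the package version cannot be determined). Python compares tuples element-wise, so a single-element sentinel like (2,) is enough; a small sketch, with parse_version as a hypothetical stand-in for get_package_version_tuple:

# Tuple comparison is element-wise, so a one-element sentinel works:
assert (1, 3, 1) < (2,)
assert not (2, 0, 1) < (2,)

def parse_version(version_string):
    # Hypothetical stand-in for get_package_version_tuple; returns None
    # when the version cannot be parsed.
    try:
        return tuple(int(part) for part in version_string.split("."))
    except Exception:
        return None

version = parse_version("1.3.1")
if version and version < (2,):
    print("take the aioredis v1 (RedisBuffer) code path")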
- if operation.split()[0] not in _redis_multipart_commands: # Set the datastore info on the DatastoreTrace containing this function call. + if ( + operation.split()[0] not in _redis_multipart_commands + ): # Set the datastore info on the DatastoreTrace containing this function call. trace = current_trace() # Find DatastoreTrace no matter how many other traces are inbetween @@ -140,7 +140,12 @@ async def wrap_Connection_send_command(wrapped, instance, args, kwargs): return await wrapped(*args, **kwargs) -def wrap_RedisConnection_execute(wrapped, instance, args, kwargs): +# This wrapper is for versions of aioredis that are outside +# New Relic's supportability window but will still work. New +# Relic does not provide testing/support for this. In order to +# keep functionality without affecting coverage metrics, this +# segment is excluded from coverage analysis. +def wrap_RedisConnection_execute(wrapped, instance, args, kwargs): # pragma: no cover # RedisConnection in aioredis v1 returns a future instead of using coroutines transaction = current_transaction() if not transaction: @@ -165,7 +170,9 @@ def wrap_RedisConnection_execute(wrapped, instance, args, kwargs): # If it's not a multi part command, there's no need to trace it, so # we can return early. - if operation.split()[0] not in _redis_multipart_commands: # Set the datastore info on the DatastoreTrace containing this function call. + if ( + operation.split()[0] not in _redis_multipart_commands + ): # Set the datastore info on the DatastoreTrace containing this function call. trace = current_trace() # Find DatastoreTrace no matter how many other traces are inbetween @@ -206,6 +213,11 @@ def instrument_aioredis_connection(module): if hasattr(module.Connection, "send_command"): wrap_function_wrapper(module, "Connection.send_command", wrap_Connection_send_command) - if hasattr(module, "RedisConnection"): + # This conditional is for versions of aioredis that are outside + # New Relic's supportability window but will still work. New + # Relic does not provide testing/support for this. In order to + # keep functionality without affecting coverage metrics, this + # segment is excluded from coverage analysis. 
+ if hasattr(module, "RedisConnection"): # pragma: no cover if hasattr(module.RedisConnection, "execute"): wrap_function_wrapper(module, "RedisConnection.execute", wrap_RedisConnection_execute) diff --git a/newrelic/hooks/datastore_aredis.py b/newrelic/hooks/datastore_aredis.py index a63da57c7..236cbf3f8 100644 --- a/newrelic/hooks/datastore_aredis.py +++ b/newrelic/hooks/datastore_aredis.py @@ -98,4 +98,4 @@ def instrument_aredis_client(module): def instrument_aredis_connection(module): - wrap_function_wrapper(module.Connection, "send_command", wrap_Connection_send_command) + wrap_function_wrapper(module, "Connection.send_command", wrap_Connection_send_command) diff --git a/newrelic/hooks/datastore_bmemcached.py b/newrelic/hooks/datastore_bmemcached.py index c54947ab7..3091f0992 100644 --- a/newrelic/hooks/datastore_bmemcached.py +++ b/newrelic/hooks/datastore_bmemcached.py @@ -14,12 +14,25 @@ from newrelic.api.datastore_trace import wrap_datastore_trace -_memcache_client_methods = ('get', 'gets', 'get_multi', 'set', 'cas', - 'set_multi', 'add', 'replace', 'delete', 'delete_multi', 'incr', - 'decr', 'flush_all', 'stats') +_memcache_client_methods = ( + "get", + "gets", + "get_multi", + "set", + "cas", + "set_multi", + "add", + "replace", + "delete", + "delete_multi", + "incr", + "decr", + "flush_all", + "stats", +) + def instrument_bmemcached_client(module): for name in _memcache_client_methods: if hasattr(module.Client, name): - wrap_datastore_trace(module.Client, name, - product='Memcached', target=None, operation=name) + wrap_datastore_trace(module, "Client.%s" % name, product="Memcached", target=None, operation=name) diff --git a/newrelic/hooks/datastore_elasticsearch.py b/newrelic/hooks/datastore_elasticsearch.py index 3db62bb90..2417aabfe 100644 --- a/newrelic/hooks/datastore_elasticsearch.py +++ b/newrelic/hooks/datastore_elasticsearch.py @@ -12,11 +12,11 @@ # See the License for the specific language governing permissions and # limitations under the License. -from newrelic.packages import six - from newrelic.api.datastore_trace import DatastoreTrace from newrelic.api.transaction import current_transaction -from newrelic.common.object_wrapper import wrap_function_wrapper +from newrelic.common.object_wrapper import function_wrapper, wrap_function_wrapper +from newrelic.common.package_version_utils import get_package_version_tuple +from newrelic.packages import six # An index name can be a string, None or a sequence. In the case of None # an empty string or '*', it is the same as using '_all'. When a string @@ -24,70 +24,113 @@ # obviously can also be more than one index name. Where we are certain # there is only a single index name we use it, otherwise we use 'other'. 
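Editor's note: several hooks above (aredis, bmemcached, and more below) switch from wrap_xxx(module.Client, name, ...) to wrap_xxx(module, "Client.%s" % name, ...), i.e. a module plus a dotted attribute path. A simplified sketch of how such a path can be resolved and patched; the agent's own resolver handles more cases, so this is illustration only:

# Resolve a dotted path like "Client.get" relative to a module, then
# patch the leaf attribute with a wrapped version.
import types

def wrap_object_by_path(module, name, factory):
    parent = module
    *attrs, leaf = name.split(".")
    for attr in attrs:
        parent = getattr(parent, attr)
    setattr(parent, leaf, factory(getattr(parent, leaf)))

class Client:
    def get(self, key):
        return "value-for-%s" % key

def traced(func):
    def wrapper(*args, **kwargs):
        print("tracing %s" % func.__name__)
        return func(*args, **kwargs)
    return wrapper

fake_module = types.SimpleNamespace(Client=Client)
wrap_object_by_path(fake_module, "Client.get", traced)
fake_module.Client().get("k")  # prints "tracing get", then returns the value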
+ES_VERSION = get_package_version_tuple("elasticsearch") + + def _index_name(index): - if not index or index == '*': - return '_all' - if not isinstance(index, six.string_types) or ',' in index: - return 'other' + if not index or index == "*": + return "_all" + if not isinstance(index, six.string_types) or "," in index: + return "other" return index -def _extract_kwargs_index(*args, **kwargs): - return _index_name(kwargs.get('index')) def _extract_args_index(index=None, *args, **kwargs): return _index_name(index) + +def _extract_args_allocation_explain_index( + current_node=None, + error_trace=None, + filter_path=None, + human=None, + include_disk_info=None, + include_yes_decisions=None, + index=None, + *args, + **kwargs +): + return _index_name(index) + + +def _extract_args_name_index(name=None, index=None, *args, **kwargs): + return _index_name(index) + + def _extract_args_body_index(body=None, index=None, *args, **kwargs): return _index_name(index) -def _extract_args_doctype_body_index(doc_type=None, body=None, index=None, - *args, **kwargs): + +def _extract_args_requests_index(requests=None, index=None, *args, **kwargs): + return _index_name(index) + + +def _extract_args_searches_index(searches=None, index=None, *args, **kwargs): + return _index_name(index) + + +def _extract_args_search_templates_index(search_templates=None, index=None, *args, **kwargs): + return _index_name(index) + + +def _extract_args_operations_index(operations=None, index=None, *args, **kwargs): + return _index_name(index) + + +def _extract_args_doctype_body_index(doc_type=None, body=None, index=None, *args, **kwargs): return _index_name(index) + def _extract_args_field_index(field=None, index=None, *args, **kwargs): return _index_name(index) -def _extract_args_name_body_index(name=None, body=None, index=None, - *args, **kwargs): + +def _extract_args_fields_index(fields=None, index=None, *args, **kwargs): return _index_name(index) -def _extract_args_name_index(name=None, index=None, *args, **kwargs): + +def _extract_args_name_body_index(name=None, body=None, index=None, *args, **kwargs): return _index_name(index) + def _extract_args_metric_index(metric=None, index=None, *args, **kwargs): return _index_name(index) -def wrap_elasticsearch_client_method(owner, name, arg_extractor, prefix=None): + +def _extract_args_settings_index(settings=None, index=None, *args, **kwargs): + return _index_name(index) + + +def instrument_es_methods(module, _class, client_methods, prefix=None): + for method_name, arg_extractor in client_methods: + if hasattr(getattr(module, _class), method_name): + wrap_elasticsearch_client_method(module, _class, method_name, arg_extractor, prefix) + + +def wrap_elasticsearch_client_method(module, class_name, method_name, arg_extractor, prefix=None): def _nr_wrapper_Elasticsearch_method_(wrapped, instance, args, kwargs): transaction = current_transaction() if transaction is None: return wrapped(*args, **kwargs) - - # When arg_extractor is None, it means there is no target field + # When index is None, it means there is no target field # associated with this method. Hence this method will only # create an operation metric and no statement metric. This is # handled by setting the target to None when calling the # DatastoreTraceWrapper. 
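Editor's note: the _index_name rules described in the comment above, demonstrated. The helper is copied from the hook so the snippet is self-contained; six.string_types is replaced by str since this sketch assumes Python 3 only.

def _index_name(index):
    if not index or index == "*":
        return "_all"
    if not isinstance(index, str) or "," in index:
        return "other"
    return index

assert _index_name(None) == "_all"              # no index given
assert _index_name("") == "_all"
assert _index_name("*") == "_all"
assert _index_name("users,orders") == "other"   # more than one index name
assert _index_name(["users", "orders"]) == "other"
assert _index_name("users") == "users"          # single, certain index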
- if arg_extractor is None: index = None else: index = arg_extractor(*args, **kwargs) if prefix: - operation = '%s.%s' % (prefix, name) + operation = "%s.%s" % (prefix, method_name) else: - operation = name + operation = method_name transaction._nr_datastore_instance_info = (None, None, None) - dt = DatastoreTrace( - product='Elasticsearch', - target=index, - operation=operation, - source=wrapped - ) + dt = DatastoreTrace(product="Elasticsearch", target=index, operation=operation, source=wrapped) with dt: result = wrapped(*args, **kwargs) @@ -100,200 +143,451 @@ def _nr_wrapper_Elasticsearch_method_(wrapped, instance, args, kwargs): return result - if hasattr(owner, name): - wrap_function_wrapper(owner, name, _nr_wrapper_Elasticsearch_method_) - -_elasticsearch_client_methods = ( - ('abort_benchmark', None), - ('benchmark', _extract_args_index), - ('bulk', None), - ('clear_scroll', None), - ('count', _extract_args_index), - ('count_percolate', _extract_args_index), - ('create', _extract_args_index), - ('delete', _extract_args_index), - ('delete_by_query', _extract_args_index), - ('delete_script', None), - ('delete_template', None), - ('exists', _extract_args_index), - ('explain', _extract_args_index), - ('get', _extract_args_index), - ('get_script', None), - ('get_source', _extract_args_index), - ('get_template', None), - ('index', _extract_args_index), - ('info', None), - ('list_benchmarks', _extract_args_index), - ('mget', None), - ('mlt', _extract_args_index), - ('mpercolate', _extract_args_body_index), - ('msearch', None), - ('mtermvectors', None), - ('percolate', _extract_args_index), - ('ping', None), - ('put_script', None), - ('put_template', None), - ('scroll', None), - ('search', _extract_args_index), - ('search_exists', _extract_args_index), - ('search_shards', _extract_args_index), - ('search_template', _extract_args_index), - ('suggest', _extract_args_body_index), - ('termvector', _extract_args_index), - ('termvectors', None), - ('update', _extract_args_index), + wrap_function_wrapper(module, "%s.%s" % (class_name, method_name), _nr_wrapper_Elasticsearch_method_) + + +_elasticsearch_client_methods_below_v8 = ( + ("abort_benchmark", None), + ("benchmark", _extract_args_index), + ("bulk", None), + ("clear_scroll", None), + ("count", _extract_args_index), + ("count_percolate", _extract_args_index), + ("create", _extract_args_index), + ("delete", _extract_args_index), + ("delete_by_query", _extract_args_index), + ("delete_script", None), + ("delete_template", None), + ("exists", _extract_args_index), + ("explain", _extract_args_index), + ("get", _extract_args_index), + ("get_script", None), + ("get_source", _extract_args_index), + ("get_template", None), + ("index", _extract_args_index), + ("info", None), + ("list_benchmarks", _extract_args_index), + ("mget", None), + ("mlt", _extract_args_index), + ("mpercolate", _extract_args_body_index), + ("msearch", None), + ("mtermvectors", None), + ("percolate", _extract_args_index), + ("ping", None), + ("put_script", None), + ("put_template", None), + ("scroll", None), + ("search", _extract_args_index), + ("search_exists", _extract_args_index), + ("search_shards", _extract_args_index), + ("search_template", _extract_args_index), + ("suggest", _extract_args_body_index), + ("termvector", _extract_args_index), + ("termvectors", None), + ("update", _extract_args_index), ) + +_elasticsearch_client_methods_v8 = ( + ("bulk", _extract_args_operations_index), + ("clear_scroll", None), + ("close", None), + ("close_point_in_time", None), + ("count", 
_extract_args_index), + ("create", _extract_args_index), + ("delete", _extract_args_index), + ("delete_by_query", _extract_args_index), + ("delete_by_query_rethrottle", None), + ("delete_script", None), + ("exists", _extract_args_index), + ("exists_source", _extract_args_index), + ("explain", _extract_args_index), + ("field_caps", _extract_args_index), + ("get", _extract_args_index), + ("get_script", None), + ("get_script_context", None), + ("get_script_languages", None), + ("get_source", _extract_args_index), + ("index", _extract_args_index), + ("info", None), + ("knn_search", _extract_args_index), + ("mget", _extract_args_index), + ("msearch", _extract_args_searches_index), + ("msearch_template", _extract_args_search_templates_index), + ("mtermvectors", _extract_args_index), + ("open_point_in_time", _extract_args_index), + ("options", None), + ("ping", None), + ("put_script", None), + ("rank_eval", _extract_args_requests_index), + ("reindex", None), + ("reindex_rethrottle", None), + ("render_search_template", None), + ("scripts_painless_execute", None), + ("scroll", None), + ("search", _extract_args_index), + ("search_mvt", _extract_args_index), + ("search_shards", _extract_args_index), + ("terms_enum", _extract_args_index), + ("termvector", _extract_args_index), + ("termvectors", _extract_args_index), + ("update", _extract_args_index), + ("update_by_query", _extract_args_index), + ("update_by_query_rethrottle", None), +) + + def instrument_elasticsearch_client(module): - for name, arg_extractor in _elasticsearch_client_methods: - wrap_elasticsearch_client_method(module.Elasticsearch, name, - arg_extractor) - -_elasticsearch_client_indices_methods = ( - ('analyze', _extract_args_index), - ('clear_cache', _extract_args_index), - ('close', _extract_args_index), - ('create', _extract_args_index), - ('delete', _extract_args_index), - ('delete_alias', _extract_args_index), - ('delete_mapping', _extract_args_index), - ('delete_template', None), - ('delete_warmer', _extract_args_index), - ('exists', _extract_args_index), - ('exists_alias', _extract_args_name_index), - ('exists_template', None), - ('exists_type', _extract_args_index), - ('flush', _extract_args_index), - ('get', _extract_args_index), - ('get_alias', _extract_args_index), - ('get_aliases', _extract_args_index), - ('get_mapping', _extract_args_index), - ('get_field_mapping', _extract_args_field_index), - ('get_settings', _extract_args_index), - ('get_template', None), - ('get_upgrade', _extract_args_index), - ('get_warmer', _extract_args_index), - ('open', _extract_args_index), - ('optimize', _extract_args_index), - ('put_alias', _extract_args_name_index), - ('put_mapping', _extract_args_doctype_body_index), - ('put_settings', _extract_args_body_index), - ('put_template', None), - ('put_warmer', _extract_args_name_body_index), - ('recovery', _extract_args_index), - ('refresh', _extract_args_index), - ('segments', _extract_args_index), - ('snapshot_index', _extract_args_index), - ('stats', _extract_args_index), - ('status', _extract_args_index), - ('update_aliases', None), - ('upgrade', _extract_args_index), - ('validate_query', _extract_args_index), + # The module path was remapped in v8 to match previous versions. + # In order to avoid double wrapping we check the version before + # wrapping. 
+ if ES_VERSION < (8,): + instrument_es_methods(module, "Elasticsearch", _elasticsearch_client_methods_below_v8) + + +def instrument_elasticsearch_client_v8(module): + instrument_es_methods(module, "Elasticsearch", _elasticsearch_client_methods_v8) + + +_elasticsearch_client_indices_methods_below_v8 = ( + ("analyze", _extract_args_index), + ("clear_cache", _extract_args_index), + ("close", _extract_args_index), + ("create", _extract_args_index), + ("delete", _extract_args_index), + ("delete_alias", _extract_args_index), + ("delete_mapping", _extract_args_index), + ("delete_template", None), + ("delete_warmer", _extract_args_index), + ("exists", _extract_args_index), + ("exists_alias", _extract_args_name_index), + ("exists_template", None), + ("exists_type", _extract_args_index), + ("flush", _extract_args_index), + ("get", _extract_args_index), + ("get_alias", _extract_args_index), + ("get_aliases", _extract_args_index), + ("get_mapping", _extract_args_index), + ("get_field_mapping", _extract_args_field_index), + ("get_settings", _extract_args_index), + ("get_template", None), + ("get_upgrade", _extract_args_index), + ("get_warmer", _extract_args_index), + ("open", _extract_args_index), + ("optimize", _extract_args_index), + ("put_alias", _extract_args_name_index), + ("put_mapping", _extract_args_doctype_body_index), + ("put_settings", _extract_args_body_index), + ("put_template", None), + ("put_warmer", _extract_args_name_body_index), + ("recovery", _extract_args_index), + ("refresh", _extract_args_index), + ("segments", _extract_args_index), + ("snapshot_index", _extract_args_index), + ("stats", _extract_args_index), + ("status", _extract_args_index), + ("update_aliases", None), + ("upgrade", _extract_args_index), + ("validate_query", _extract_args_index), +) + + +_elasticsearch_client_indices_methods_v8 = ( + ("add_block", _extract_args_index), + ("analyze", _extract_args_index), + ("clear_cache", _extract_args_index), + ("clone", _extract_args_index), + ("close", _extract_args_index), + ("create", _extract_args_index), + ("create_data_stream", None), + ("data_streams_stats", None), + ("delete", _extract_args_index), + ("delete_alias", _extract_args_index), + ("delete_data_stream", None), + ("delete_index_template", None), + ("delete_template", None), + ("disk_usage", _extract_args_index), + ("downsample", _extract_args_index), + ("exists", _extract_args_index), + ("exists_alias", _extract_args_name_index), + ("exists_index_template", None), + ("exists_template", None), + ("field_usage_stats", _extract_args_index), + ("flush", _extract_args_index), + ("forcemerge", _extract_args_index), + ("get", _extract_args_index), + ("get_alias", _extract_args_index), + ("get_data_stream", None), + ("get_field_mapping", _extract_args_fields_index), + ("get_index_template", None), + ("get_mapping", _extract_args_index), + ("get_settings", _extract_args_index), + ("get_template", None), + ("migrate_to_data_stream", None), + ("modify_data_stream", None), + ("open", _extract_args_index), + ("promote_data_stream", None), + ("put_alias", _extract_args_index), + ("put_index_template", None), + ("put_mapping", _extract_args_index), + ("put_settings", _extract_args_settings_index), + ("put_template", None), + ("recovery", _extract_args_index), + ("refresh", _extract_args_index), + ("reload_search_analyzers", _extract_args_index), + ("resolve_index", None), + ("rollover", None), + ("segments", _extract_args_index), + ("shard_stores", _extract_args_index), + ("shrink", _extract_args_index), + 
("simulate_index_template", None), + ("simulate_template", None), + ("split", _extract_args_index), + ("stats", _extract_args_index), + ("unfreeze", _extract_args_index), + ("update_aliases", None), + ("validate_query", _extract_args_index), ) + def instrument_elasticsearch_client_indices(module): - for name, arg_extractor in _elasticsearch_client_indices_methods: - wrap_elasticsearch_client_method(module.IndicesClient, name, - arg_extractor, 'indices') - -_elasticsearch_client_cat_methods = ( - ('aliases', None), - ('allocation', None), - ('count', _extract_args_index), - ('fielddata', None), - ('health', None), - ('help', None), - ('indices', _extract_args_index), - ('master', None), - ('nodes', None), - ('pending_tasks', None), - ('plugins', None), - ('recovery', _extract_args_index), - ('shards', _extract_args_index), - ('segments', _extract_args_index), - ('thread_pool', None), + # The module path was remapped in v8 to match previous versions. + # In order to avoid double wrapping we check the version before + # wrapping. + if ES_VERSION < (8,): + instrument_es_methods(module, "IndicesClient", _elasticsearch_client_indices_methods_below_v8, "indices") + + +def instrument_elasticsearch_client_indices_v8(module): + instrument_es_methods(module, "IndicesClient", _elasticsearch_client_indices_methods_v8, "indices") + + +_elasticsearch_client_cat_methods_below_v8 = ( + ("aliases", None), + ("allocation", None), + ("count", _extract_args_index), + ("fielddata", None), + ("health", None), + ("help", None), + ("indices", _extract_args_index), + ("master", None), + ("nodes", None), + ("pending_tasks", None), + ("plugins", None), + ("recovery", _extract_args_index), + ("shards", _extract_args_index), + ("segments", _extract_args_index), + ("thread_pool", None), +) + +_elasticsearch_client_cat_methods_v8 = ( + ("aliases", None), + ("allocation", None), + ("component_templates", None), + ("count", _extract_args_index), + ("fielddata", None), + ("health", None), + ("help", None), + ("indices", _extract_args_index), + ("master", None), + ("ml_data_frame_analytics", None), + ("ml_datafeeds", None), + ("ml_jobs", None), + ("ml_trained_models", None), + ("nodeattrs", None), + ("nodes", None), + ("pending_tasks", None), + ("plugins", None), + ("recovery", _extract_args_index), + ("repositories", None), + ("segments", _extract_args_index), + ("shards", _extract_args_index), + ("snapshots", None), + ("tasks", None), + ("templates", None), + ("thread_pool", None), + ("transforms", None), ) + def instrument_elasticsearch_client_cat(module): - for name, arg_extractor in _elasticsearch_client_cat_methods: - wrap_elasticsearch_client_method(module.CatClient, name, - arg_extractor, 'cat') - -_elasticsearch_client_cluster_methods = ( - ('get_settings', None), - ('health', _extract_args_index), - ('pending_tasks', None), - ('put_settings', None), - ('reroute', None), - ('state', _extract_args_metric_index), - ('stats', None), + # The module path was remapped in v8 to match previous versions. + # In order to avoid double wrapping we check the version before + # wrapping. 
+ if ES_VERSION < (8,): + instrument_es_methods(module, "CatClient", _elasticsearch_client_cat_methods_below_v8, "cat") + + +def instrument_elasticsearch_client_cat_v8(module): + instrument_es_methods(module, "CatClient", _elasticsearch_client_cat_methods_v8, "cat") + + +_elasticsearch_client_cluster_methods_below_v8 = ( + ("get_settings", None), + ("health", _extract_args_index), + ("pending_tasks", None), + ("put_settings", None), + ("reroute", None), + ("state", _extract_args_metric_index), + ("stats", None), ) + +_elasticsearch_client_cluster_methods_v8 = ( + ("allocation_explain", _extract_args_allocation_explain_index), + ("delete_component_template", None), + ("delete_voting_config_exclusions", None), + ("exists_component_template", None), + ("get_component_template", None), + ("get_settings", None), + ("health", _extract_args_index), + ("pending_tasks", None), + ("post_voting_config_exclusions", None), + ("put_component_template", None), + ("put_settings", None), + ("remote_info", None), + ("reroute", None), + ("state", _extract_args_metric_index), + ("stats", None), +) + + def instrument_elasticsearch_client_cluster(module): - for name, arg_extractor in _elasticsearch_client_cluster_methods: - wrap_elasticsearch_client_method(module.ClusterClient, name, - arg_extractor, 'cluster') - -_elasticsearch_client_nodes_methods = ( - ('hot_threads', None), - ('info', None), - ('shutdown', None), - ('stats', None), + # The module path was remapped in v8 to match previous versions. + # In order to avoid double wrapping we check the version before + # wrapping. + if ES_VERSION < (8,): + instrument_es_methods(module, "ClusterClient", _elasticsearch_client_cluster_methods_below_v8, "cluster") + + +def instrument_elasticsearch_client_cluster_v8(module): + instrument_es_methods(module, "ClusterClient", _elasticsearch_client_cluster_methods_v8, "cluster") + + +_elasticsearch_client_nodes_methods_below_v8 = ( + ("hot_threads", None), + ("info", None), + ("shutdown", None), + ("stats", None), +) +_elasticsearch_client_nodes_methods_v8 = ( + ("clear_repositories_metering_archive", None), + ("get_repositories_metering_info", None), + ("hot_threads", None), + ("info", None), + ("reload_secure_settings", None), + ("stats", None), + ("usage", None), ) + def instrument_elasticsearch_client_nodes(module): - for name, arg_extractor in _elasticsearch_client_nodes_methods: - wrap_elasticsearch_client_method(module.NodesClient, name, - arg_extractor, 'nodes') - -_elasticsearch_client_snapshot_methods = ( - ('create', None), - ('create_repository', None), - ('delete', None), - ('delete_repository', None), - ('get', None), - ('get_repository', None), - ('restore', None), - ('status', None), - ('verify_repository', None), + # The module path was remapped in v8 to match previous versions. + # In order to avoid double wrapping we check the version before + # wrapping. 
+ if ES_VERSION < (8,): + instrument_es_methods(module, "NodesClient", _elasticsearch_client_nodes_methods_below_v8, "nodes") + + +def instrument_elasticsearch_client_nodes_v8(module): + instrument_es_methods(module, "NodesClient", _elasticsearch_client_nodes_methods_v8, "nodes") + + +_elasticsearch_client_snapshot_methods_below_v8 = ( + ("create", None), + ("create_repository", None), + ("delete", None), + ("delete_repository", None), + ("get", None), + ("get_repository", None), + ("restore", None), + ("status", None), + ("verify_repository", None), ) +_elasticsearch_client_snapshot_methods_v8 = ( + ("cleanup_repository", None), + ("clone", None), + ("create", None), + ("create_repository", None), + ("delete", None), + ("delete_repository", None), + ("get", None), + ("get_repository", None), + ("restore", None), + ("status", None), + ("verify_repository", None), +) + def instrument_elasticsearch_client_snapshot(module): - for name, arg_extractor in _elasticsearch_client_snapshot_methods: - wrap_elasticsearch_client_method(module.SnapshotClient, name, - arg_extractor, 'snapshot') + # The module path was remapped in v8 to match previous versions. + # In order to avoid double wrapping we check the version before + # wrapping. + if ES_VERSION < (8,): + instrument_es_methods(module, "SnapshotClient", _elasticsearch_client_snapshot_methods_below_v8, "snapshot") + + +def instrument_elasticsearch_client_snapshot_v8(module): + instrument_es_methods(module, "SnapshotClient", _elasticsearch_client_snapshot_methods_v8, "snapshot") + _elasticsearch_client_tasks_methods = ( - ('list', None), - ('cancel', None), - ('get', None), + ("list", None), + ("cancel", None), + ("get", None), ) + def instrument_elasticsearch_client_tasks(module): - for name, arg_extractor in _elasticsearch_client_tasks_methods: - wrap_elasticsearch_client_method(module.TasksClient, name, - arg_extractor, 'tasks') - -_elasticsearch_client_ingest_methods = ( - ('get_pipeline', None), - ('put_pipeline', None), - ('delete_pipeline', None), - ('simulate', None), + # The module path was remapped in v8 to match previous versions. + # In order to avoid double wrapping we check the version before + # wrapping. + if ES_VERSION < (8,): + instrument_es_methods(module, "TasksClient", _elasticsearch_client_tasks_methods, "tasks") + + +def instrument_elasticsearch_client_tasks_v8(module): + instrument_es_methods(module, "TasksClient", _elasticsearch_client_tasks_methods, "tasks") + + +_elasticsearch_client_ingest_methods_below_v8 = ( + ("get_pipeline", None), + ("put_pipeline", None), + ("delete_pipeline", None), + ("simulate", None), +) + +_elasticsearch_client_ingest_methods_v8 = ( + ("delete_pipeline", None), + ("geo_ip_stats", None), + ("get_pipeline", None), + ("processor_grok", None), + ("put_pipeline", None), + ("simulate", None), ) + def instrument_elasticsearch_client_ingest(module): - for name, arg_extractor in _elasticsearch_client_ingest_methods: - wrap_elasticsearch_client_method(module.IngestClient, name, - arg_extractor, 'ingest') + # The module path was remapped in v8 to match previous versions. + # In order to avoid double wrapping we check the version before + # wrapping. 
+ if ES_VERSION < (8,): + instrument_es_methods(module, "IngestClient", _elasticsearch_client_ingest_methods_below_v8, "ingest") + + +def instrument_elasticsearch_client_ingest_v8(module): + instrument_es_methods(module, "IngestClient", _elasticsearch_client_ingest_methods_v8, "ingest") + # # Instrumentation to get Datastore Instance Information # + def _nr_Connection__init__wrapper(wrapped, instance, args, kwargs): """Cache datastore instance info on Connection object""" - def _bind_params(host='localhost', port=9200, *args, **kwargs): + def _bind_params(host="localhost", port=9200, *args, **kwargs): return host, port host, port = _bind_params(*args, **kwargs) @@ -302,9 +596,21 @@ def _bind_params(host='localhost', port=9200, *args, **kwargs): return wrapped(*args, **kwargs) + def instrument_elasticsearch_connection_base(module): - wrap_function_wrapper(module.Connection, '__init__', - _nr_Connection__init__wrapper) + wrap_function_wrapper(module, "Connection.__init__", _nr_Connection__init__wrapper) + + +def BaseNode__init__wrapper(wrapped, instance, args, kwargs): + result = wrapped(*args, **kwargs) + instance._nr_host_port = (instance.host, str(instance.port)) + return result + + +def instrument_elastic_transport__node__base(module): + if hasattr(module, "BaseNode"): + wrap_function_wrapper(module, "BaseNode.__init__", BaseNode__init__wrapper) + def _nr_get_connection_wrapper(wrapped, instance, args, kwargs): """Read instance info from Connection and stash on Transaction.""" @@ -323,13 +629,34 @@ def _nr_get_connection_wrapper(wrapped, instance, args, kwargs): if tracer_settings.instance_reporting.enabled: host, port_path_or_id = conn._nr_host_port instance_info = (host, port_path_or_id, None) - except: - instance_info = ('unknown', 'unknown', None) + except Exception: + instance_info = ("unknown", "unknown", None) transaction._nr_datastore_instance_info = instance_info return conn + +def _nr_perform_request_wrapper(wrapped, instance, args, kwargs): + """Read instance info from Connection and stash on Transaction.""" + + transaction = current_transaction() + + if transaction is None: + return wrapped(*args, **kwargs) + + if not hasattr(instance.node_pool.get, "_nr_wrapped"): + instance.node_pool.get = function_wrapper(_nr_get_connection_wrapper)(instance.node_pool.get) + instance.node_pool.get._nr_wrapped = True + + return wrapped(*args, **kwargs) + + def instrument_elasticsearch_transport(module): - wrap_function_wrapper(module.Transport, 'get_connection', - _nr_get_connection_wrapper) + if hasattr(module, "Transport") and hasattr(module.Transport, "get_connection"): + wrap_function_wrapper(module, "Transport.get_connection", _nr_get_connection_wrapper) + + +def instrument_elastic_transport__transport(module): + if hasattr(module, "Transport") and hasattr(module.Transport, "perform_request"): + wrap_function_wrapper(module, "Transport.perform_request", _nr_perform_request_wrapper) diff --git a/newrelic/hooks/datastore_memcache.py b/newrelic/hooks/datastore_memcache.py index 9d51aead4..90b2d43dc 100644 --- a/newrelic/hooks/datastore_memcache.py +++ b/newrelic/hooks/datastore_memcache.py @@ -14,19 +14,24 @@ from newrelic.api.datastore_trace import DatastoreTrace, wrap_datastore_trace from newrelic.api.transaction import current_transaction -from newrelic.common.object_wrapper import (wrap_object, FunctionWrapper, - wrap_function_wrapper) +from newrelic.common.object_wrapper import ( + FunctionWrapper, + wrap_function_wrapper, + wrap_object, +) + def _instance_info(memcache_host): try: 
host = memcache_host.ip port_path_or_id = str(memcache_host.port) except AttributeError: - host = 'localhost' + host = "localhost" port_path_or_id = str(memcache_host.address) return (host, port_path_or_id, None) + def _nr_get_server_wrapper(wrapped, instance, args, kwargs): transaction = current_transaction() @@ -46,14 +51,14 @@ def _nr_get_server_wrapper(wrapped, instance, args, kwargs): if tracer_settings.instance_reporting.enabled and host is not None: instance_info = _instance_info(host) except: - instance_info = ('unknown', 'unknown', None) + instance_info = ("unknown", "unknown", None) transaction._nr_datastore_instance_info = instance_info return result -def MemcacheSingleWrapper(wrapped, product, target, operation, module): +def MemcacheSingleWrapper(wrapped, product, target, operation, module): def _nr_datastore_trace_wrapper_(wrapped, instance, args, kwargs): transaction = current_transaction() @@ -76,26 +81,35 @@ def _nr_datastore_trace_wrapper_(wrapped, instance, args, kwargs): return FunctionWrapper(wrapped, _nr_datastore_trace_wrapper_) + def wrap_memcache_single(module, object_path, product, target, operation): - wrap_object(module.Client, object_path, MemcacheSingleWrapper, - (product, target, operation, module)) + wrap_object(module, "Client.%s" % object_path, MemcacheSingleWrapper, (product, target, operation, module)) + -_memcache_client_methods = ('delete', 'incr', 'decr', 'add', - 'append', 'prepend', 'replace', 'set', 'cas', 'get', 'gets') +_memcache_client_methods = ( + "delete", + "incr", + "decr", + "add", + "append", + "prepend", + "replace", + "set", + "cas", + "get", + "gets", +) -_memcache_multi_methods = ('delete_multi', 'get_multi', 'set_multi', - 'get_stats', 'get_slabs', 'flush_all') +_memcache_multi_methods = ("delete_multi", "get_multi", "set_multi", "get_stats", "get_slabs", "flush_all") def instrument_memcache(module): - wrap_function_wrapper(module.Client, '_get_server', _nr_get_server_wrapper) + wrap_function_wrapper(module, "Client._get_server", _nr_get_server_wrapper) for name in _memcache_client_methods: if hasattr(module.Client, name): - wrap_memcache_single(module, name, - product='Memcached', target=None, operation=name) + wrap_memcache_single(module, name, product="Memcached", target=None, operation=name) for name in _memcache_multi_methods: if hasattr(module.Client, name): - wrap_datastore_trace(module.Client, name, - product='Memcached', target=None, operation=name) + wrap_datastore_trace(module, "Client.%s" % name, product="Memcached", target=None, operation=name) diff --git a/newrelic/hooks/datastore_pyelasticsearch.py b/newrelic/hooks/datastore_pyelasticsearch.py index 37b7f3176..63e33a9bb 100644 --- a/newrelic/hooks/datastore_pyelasticsearch.py +++ b/newrelic/hooks/datastore_pyelasticsearch.py @@ -12,11 +12,10 @@ # See the License for the specific language governing permissions and # limitations under the License. -from newrelic.packages import six - from newrelic.api.datastore_trace import DatastoreTraceWrapper from newrelic.api.transaction import current_transaction from newrelic.common.object_wrapper import wrap_function_wrapper +from newrelic.packages import six # An index name can be a string, None or a sequence. In the case of None # an empty string or '*', it is the same as using '_all'. When a string @@ -24,58 +23,64 @@ # obviously can also be more than one index name. Where we are certain # there is only a single index name we use it, otherwise we use 'other'. 
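Editor's note: _instance_info in the memcache hook above relies on an AttributeError fallback, since hosts reached over TCP expose ip/port while unix-socket hosts expose only an address. Demonstrated with stand-in host objects (the helper is copied from the hook; the host shapes are illustrative assumptions):

import types

def _instance_info(memcache_host):
    try:
        host = memcache_host.ip
        port_path_or_id = str(memcache_host.port)
    except AttributeError:
        host = "localhost"
        port_path_or_id = str(memcache_host.address)
    return (host, port_path_or_id, None)

inet_host = types.SimpleNamespace(ip="10.1.2.3", port=11211)
unix_host = types.SimpleNamespace(address="/tmp/memcached.sock")

assert _instance_info(inet_host) == ("10.1.2.3", "11211", None)
assert _instance_info(unix_host) == ("localhost", "/tmp/memcached.sock", None)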
+ def _index_name(index): - if not index or index == '*': - return '_all' - if not isinstance(index, six.string_types) or ',' in index: - return 'other' + if not index or index == "*": + return "_all" + if not isinstance(index, six.string_types) or "," in index: + return "other" return index + def _extract_kwargs_index(*args, **kwargs): - return _index_name(kwargs.get('index')) + return _index_name(kwargs.get("index")) + def _extract_args_index(index=None, *args, **kwargs): return _index_name(index) + def _extract_args_metric_index(metric=None, index=None, *args, **kwargs): return _index_name(index) + _elasticsearch_client_methods = ( - ('bulk', None), - ('bulk_index', _extract_args_index), - ('close_index', None), - ('cluster_state', _extract_args_metric_index), - ('count', _extract_kwargs_index), - ('create_index', _extract_args_index), - ('delete', _extract_args_index), - ('delete_all', _extract_args_index), - ('delete_all_indexes', None), - ('delete_by_query', _extract_args_index), - ('delete_index', _extract_args_index), - ('flush', _extract_args_index), - ('gateway_snapshot', _extract_args_index), - ('get', _extract_args_index), - ('get_aliases', _extract_args_index), - ('get_mapping', _extract_args_index), - ('get_settings', _extract_args_index), - ('health', _extract_args_index), - ('index', _extract_args_index), - ('more_like_this', _extract_args_index), - ('multi_get', None), - ('open_index', _extract_args_index), - ('optimize', _extract_args_index), - ('percolate', _extract_args_index), - ('put_mapping', _extract_args_index), - ('refresh', _extract_args_index), - ('search', _extract_kwargs_index), - ('send_request', None), - ('status', _extract_args_index), - ('update', _extract_args_index), - ('update_aliases', None), - ('update_all_settings', None), - ('update_settings', _extract_args_index), + ("bulk", None), + ("bulk_index", _extract_args_index), + ("close_index", None), + ("cluster_state", _extract_args_metric_index), + ("count", _extract_kwargs_index), + ("create_index", _extract_args_index), + ("delete", _extract_args_index), + ("delete_all", _extract_args_index), + ("delete_all_indexes", None), + ("delete_by_query", _extract_args_index), + ("delete_index", _extract_args_index), + ("flush", _extract_args_index), + ("gateway_snapshot", _extract_args_index), + ("get", _extract_args_index), + ("get_aliases", _extract_args_index), + ("get_mapping", _extract_args_index), + ("get_settings", _extract_args_index), + ("health", _extract_args_index), + ("index", _extract_args_index), + ("more_like_this", _extract_args_index), + ("multi_get", None), + ("open_index", _extract_args_index), + ("optimize", _extract_args_index), + ("percolate", _extract_args_index), + ("put_mapping", _extract_args_index), + ("refresh", _extract_args_index), + ("search", _extract_kwargs_index), + ("send_request", None), + ("status", _extract_args_index), + ("update", _extract_args_index), + ("update_aliases", None), + ("update_all_settings", None), + ("update_settings", _extract_args_index), ) + def wrap_elasticsearch_client_method(module, name, arg_extractor): def _nr_wrapper_ElasticSearch_method_(wrapped, instance, args, kwargs): transaction = current_transaction() @@ -94,11 +99,11 @@ def _nr_wrapper_ElasticSearch_method_(wrapped, instance, args, kwargs): else: index = arg_extractor(*args, **kwargs) - return DatastoreTraceWrapper(wrapped, product='Elasticsearch',target=index, operation=name)(*args, **kwargs) + return DatastoreTraceWrapper(wrapped, product="Elasticsearch", target=index, 
operation=name)(*args, **kwargs) if hasattr(module.ElasticSearch, name): - wrap_function_wrapper(module.ElasticSearch, name, - _nr_wrapper_ElasticSearch_method_) + wrap_function_wrapper(module, "ElasticSearch.%s" % name, _nr_wrapper_ElasticSearch_method_) + def instrument_pyelasticsearch_client(module): for name, arg_extractor in _elasticsearch_client_methods: diff --git a/newrelic/hooks/datastore_pylibmc.py b/newrelic/hooks/datastore_pylibmc.py index 3c8ab636b..3d42a70fb 100644 --- a/newrelic/hooks/datastore_pylibmc.py +++ b/newrelic/hooks/datastore_pylibmc.py @@ -14,13 +14,30 @@ from newrelic.api.datastore_trace import wrap_datastore_trace -_memcache_client_methods = ('get', 'gets', 'set', 'replace', 'add', - 'prepend', 'append', 'cas', 'delete', 'incr', 'decr', 'incr_multi', - 'get_multi', 'set_multi', 'add_multi', 'delete_multi', 'get_stats', - 'flush_all', 'touch') +_memcache_client_methods = ( + "get", + "gets", + "set", + "replace", + "add", + "prepend", + "append", + "cas", + "delete", + "incr", + "decr", + "incr_multi", + "get_multi", + "set_multi", + "add_multi", + "delete_multi", + "get_stats", + "flush_all", + "touch", +) + def instrument_pylibmc_client(module): for name in _memcache_client_methods: if hasattr(module.Client, name): - wrap_datastore_trace(module.Client, name, - product='Memcached', target=None, operation=name) + wrap_datastore_trace(module, "Client.%s" % name, product="Memcached", target=None, operation=name) diff --git a/newrelic/hooks/datastore_pymemcache.py b/newrelic/hooks/datastore_pymemcache.py index f33f135de..690e95d61 100644 --- a/newrelic/hooks/datastore_pymemcache.py +++ b/newrelic/hooks/datastore_pymemcache.py @@ -14,12 +14,30 @@ from newrelic.api.datastore_trace import wrap_datastore_trace -_memcache_client_methods = ('set', 'set_many', 'add', 'replace', 'append', - 'prepend', 'cas', 'get', 'get_many', 'gets', 'gets_many', 'delete', - 'delete_many', 'incr', 'decr', 'touch', 'stats', 'flush_all', 'quit') +_memcache_client_methods = ( + "set", + "set_many", + "add", + "replace", + "append", + "prepend", + "cas", + "get", + "get_many", + "gets", + "gets_many", + "delete", + "delete_many", + "incr", + "decr", + "touch", + "stats", + "flush_all", + "quit", +) + def instrument_pymemcache_client(module): for name in _memcache_client_methods: if hasattr(module.Client, name): - wrap_datastore_trace(module.Client, name, - product='Memcached', target=None, operation=name) + wrap_datastore_trace(module, "Client.%s" % name, product="Memcached", target=None, operation=name) diff --git a/newrelic/hooks/datastore_pymongo.py b/newrelic/hooks/datastore_pymongo.py index 66a86b7ab..c9c34b1fc 100644 --- a/newrelic/hooks/datastore_pymongo.py +++ b/newrelic/hooks/datastore_pymongo.py @@ -15,38 +15,70 @@ from newrelic.api.datastore_trace import wrap_datastore_trace from newrelic.api.function_trace import wrap_function_trace -_pymongo_client_methods = ('save', 'insert', 'update', 'drop', 'remove', - 'find_one', 'find', 'count', 'create_index', 'ensure_index', - 'drop_indexes', 'drop_index', 'reindex', 'index_information', - 'options', 'group', 'rename', 'distinct', 'map_reduce', - 'inline_map_reduce', 'find_and_modify', 'initialize_unordered_bulk_op', - 'initialize_ordered_bulk_op', 'bulk_write', 'insert_one', 'insert_many', - 'replace_one', 'update_one', 'update_many', 'delete_one', 'delete_many', - 'find_raw_batches', 'parallel_scan', 'create_indexes', 'list_indexes', - 'aggregate', 'aggregate_raw_batches', 'find_one_and_delete', - 'find_one_and_replace', 
'find_one_and_update') +_pymongo_client_methods = ( + "save", + "insert", + "update", + "drop", + "remove", + "find_one", + "find", + "count", + "create_index", + "ensure_index", + "drop_indexes", + "drop_index", + "reindex", + "index_information", + "options", + "group", + "rename", + "distinct", + "map_reduce", + "inline_map_reduce", + "find_and_modify", + "initialize_unordered_bulk_op", + "initialize_ordered_bulk_op", + "bulk_write", + "insert_one", + "insert_many", + "replace_one", + "update_one", + "update_many", + "delete_one", + "delete_many", + "find_raw_batches", + "parallel_scan", + "create_indexes", + "list_indexes", + "aggregate", + "aggregate_raw_batches", + "find_one_and_delete", + "find_one_and_replace", + "find_one_and_update", +) def instrument_pymongo_connection(module): # Must name function explicitly as pymongo overrides the # __getattr__() method in a way that breaks introspection. - rollup = ('Datastore/all', 'Datastore/MongoDB/all') + rollup = ("Datastore/all", "Datastore/MongoDB/all") - wrap_function_trace(module, 'Connection.__init__', - name='%s:Connection.__init__' % module.__name__, - terminal=True, rollup=rollup) + wrap_function_trace( + module, "Connection.__init__", name="%s:Connection.__init__" % module.__name__, terminal=True, rollup=rollup + ) def instrument_pymongo_mongo_client(module): # Must name function explicitly as pymongo overrides the # __getattr__() method in a way that breaks introspection. - rollup = ('Datastore/all', 'Datastore/MongoDB/all') + rollup = ("Datastore/all", "Datastore/MongoDB/all") - wrap_function_trace(module, 'MongoClient.__init__', - name='%s:MongoClient.__init__' % module.__name__, - terminal=True, rollup=rollup) + wrap_function_trace( + module, "MongoClient.__init__", name="%s:MongoClient.__init__" % module.__name__, terminal=True, rollup=rollup + ) def instrument_pymongo_collection(module): @@ -55,5 +87,6 @@ def _collection_name(collection, *args, **kwargs): for name in _pymongo_client_methods: if hasattr(module.Collection, name): - wrap_datastore_trace(module.Collection, name, product='MongoDB', - target=_collection_name, operation=name) + wrap_datastore_trace( + module, "Collection.%s" % name, product="MongoDB", target=_collection_name, operation=name + ) diff --git a/newrelic/hooks/datastore_pysolr.py b/newrelic/hooks/datastore_pysolr.py index da29e37ed..7d4e8697d 100644 --- a/newrelic/hooks/datastore_pysolr.py +++ b/newrelic/hooks/datastore_pysolr.py @@ -14,21 +14,19 @@ from newrelic.api.datastore_trace import wrap_datastore_trace -_pysolr_client_methods = ('search', 'more_like_this', 'suggest_terms', 'add', -'delete', 'commit', 'optimize', 'extract') +_pysolr_client_methods = ("search", "more_like_this", "suggest_terms", "add", "delete", "commit", "optimize", "extract") + +_pysolr_admin_methods = ("status", "create", "reload", "rename", "swap", "unload", "load") -_pysolr_admin_methods = ('status', 'create', 'reload', 'rename', 'swap', - 'unload', 'load') def instrument_pysolr(module): for name in _pysolr_client_methods: if hasattr(module.Solr, name): - wrap_datastore_trace(module.Solr, name, - product='Solr', target=None, operation=name) + wrap_datastore_trace(module, "Solr.%s" % name, product="Solr", target=None, operation=name) - if hasattr(module, 'SolrCoreAdmin'): + if hasattr(module, "SolrCoreAdmin"): for name in _pysolr_admin_methods: if hasattr(module.SolrCoreAdmin, name): - wrap_datastore_trace(module.SolrCoreAdmin, name, - product='Solr', target=None, - operation='admin.%s' % name) + wrap_datastore_trace( + 
module, "SolrCoreAdmin.%s" % name, product="Solr", target=None, operation="admin.%s" % name + ) diff --git a/newrelic/hooks/datastore_redis.py b/newrelic/hooks/datastore_redis.py index f25c76e65..6854d84f3 100644 --- a/newrelic/hooks/datastore_redis.py +++ b/newrelic/hooks/datastore_redis.py @@ -16,12 +16,67 @@ from newrelic.api.datastore_trace import DatastoreTrace from newrelic.api.transaction import current_transaction -from newrelic.common.object_wrapper import wrap_function_wrapper +from newrelic.common.object_wrapper import function_wrapper, wrap_function_wrapper -_redis_client_methods = { +_redis_client_sync_methods = { + "acl_dryrun", + "auth", + "bgrewriteaof", + "bitfield", + "blmpop", + "bzmpop", + "client", + "command", + "command_docs", + "command_getkeysandflags", + "command_info", + "debug_segfault", + "expiretime", + "failover", + "hello", + "latency_doctor", + "latency_graph", + "latency_histogram", + "lcs", + "lpop", + "lpos", + "memory_doctor", + "memory_help", + "monitor", + "pexpiretime", + "psetex", + "psync", + "pubsub", + "renamenx", + "rpop", + "script_debug", + "sentinel_ckquorum", + "sentinel_failover", + "sentinel_flushconfig", + "sentinel_get_master_addr_by_name", + "sentinel_master", + "sentinel_masters", + "sentinel_monitor", + "sentinel_remove", + "sentinel_reset", + "sentinel_sentinels", + "sentinel_set", + "sentinel_slaves", + "shutdown", + "sort", + "sort_ro", + "spop", + "srandmember", + "unwatch", + "watch", + "zlexcount", + "zrevrangebyscore", +} + + +_redis_client_async_methods = { "acl_cat", "acl_deluser", - "acl_dryrun", "acl_genpass", "acl_getuser", "acl_help", @@ -50,11 +105,9 @@ "arrlen", "arrpop", "arrtrim", - "auth", - "bgrewriteaof", "bgsave", "bitcount", - "bitfield", + "bitfield_ro", "bitop_and", "bitop_not", "bitop_or", @@ -62,13 +115,14 @@ "bitop", "bitpos", "blmove", - "blmpop", "blpop", "brpop", "brpoplpush", - "bzmpop", + "byrank", + "byrevrank", "bzpopmax", "bzpopmin", + "card", "cdf", "clear", "client_getname", @@ -86,7 +140,6 @@ "client_trackinginfo", "client_unblock", "client_unpause", - "client", "cluster_add_slots", "cluster_addslots", "cluster_count_failure_report", @@ -113,10 +166,7 @@ "cluster_slots", "cluster", "command_count", - "command_docs", "command_getkeys", - "command_getkeysandflags", - "command_info", "command_list", "command", "commit", @@ -132,7 +182,6 @@ "createrule", "dbsize", "debug_object", - "debug_segfault", "debug_sleep", "debug", "decr", @@ -155,10 +204,8 @@ "exists", "expire", "expireat", - "expiretime", "explain_cli", "explain", - "failover", "fcall_ro", "fcall", "flushall", @@ -187,7 +234,6 @@ "getrange", "getset", "hdel", - "hello", "hexists", "hget", "hgetall", @@ -215,8 +261,9 @@ "insertnx", "keys", "lastsave", - "latency_histogram", - "lcs", + "latency_history", + "latency_latest", + "latency_reset", "lindex", "linsert", "list", @@ -225,8 +272,6 @@ "lmpop", "loadchunk", "lolwut", - "lpop", - "lpos", "lpush", "lpushx", "lrange", @@ -235,8 +280,6 @@ "ltrim", "madd", "max", - "memory_doctor", - "memory_help", "memory_malloc_stats", "memory_purge", "memory_stats", @@ -251,7 +294,6 @@ "module_load", "module_loadex", "module_unload", - "monitor", "move", "mrange", "mrevrange", @@ -267,21 +309,17 @@ "persist", "pexpire", "pexpireat", - "pexpiretime", "pfadd", "pfcount", "pfmerge", "ping", "profile", - "psetex", "psubscribe", - "psync", "pttl", "publish", "pubsub_channels", "pubsub_numpat", "pubsub_numsub", - "pubsub", "punsubscribe", "quantile", "query", @@ -289,18 +327,18 @@ "quit", "randomkey", "range", + 
"rank", "readonly", "readwrite", "rename", - "renamenx", "replicaof", "reserve", "reset", "resp", "restore", "revrange", + "revrank", "role", - "rpop", "rpoplpush", "rpush", "rpushx", @@ -310,7 +348,6 @@ "scan", "scandump", "scard", - "script_debug", "script_exists", "script_flush", "script_kill", @@ -319,24 +356,11 @@ "sdiffstore", "search", "select", - "sentinel_ckquorum", - "sentinel_failover", - "sentinel_flushconfig", - "sentinel_get_master_addr_by_name", - "sentinel_master", - "sentinel_masters", - "sentinel_monitor", - "sentinel_remove", - "sentinel_reset", - "sentinel_sentinels", - "sentinel_set", - "sentinel_slaves", "set", "setbit", "setex", "setnx", "setrange", - "shutdown", "sinter", "sintercard", "sinterstore", @@ -349,11 +373,7 @@ "smembers", "smismember", "smove", - "sort_ro", - "sort", "spellcheck", - "spop", - "srandmember", "srem", "sscan_iter", "sscan", @@ -376,13 +396,13 @@ "time", "toggle", "touch", + "trimmed_mean", "ttl", "type", "unlink", "unsubscribe", - "unwatch", "wait", - "watch", + "waitaof", "xack", "xadd", "xautoclaim", @@ -418,7 +438,6 @@ "zinter", "zintercard", "zinterstore", - "zlexcount", "zmpop", "zmscore", "zpopmax", @@ -435,7 +454,6 @@ "zremrangebyscore", "zrevrange", "zrevrangebylex", - "zrevrangebyscore", "zrevrank", "zscan_iter", "zscan", @@ -444,6 +462,8 @@ "zunionstore", } +_redis_client_methods = _redis_client_sync_methods.union(_redis_client_async_methods) + _redis_multipart_commands = set(["client", "cluster", "command", "config", "debug", "sentinel", "slowlog", "script"]) _redis_operation_re = re.compile(r"[-\s]+") @@ -460,7 +480,7 @@ def _conn_attrs_to_dict(connection): def _instance_info(kwargs): host = kwargs.get("host") or "localhost" - port_path_or_id = str(kwargs.get("port") or kwargs.get("path", "unknown")) + port_path_or_id = str(kwargs.get("path") or kwargs.get("port", "unknown")) db = str(kwargs.get("db") or 0) return (host, port_path_or_id, db) @@ -491,6 +511,29 @@ def _nr_wrapper_Redis_method_(wrapped, instance, args, kwargs): wrap_function_wrapper(module, name, _nr_wrapper_Redis_method_) +def _wrap_asyncio_Redis_method_wrapper(module, instance_class_name, operation): + @function_wrapper + async def _nr_wrapper_asyncio_Redis_async_method_(wrapped, instance, args, kwargs): + transaction = current_transaction() + if transaction is None: + return await wrapped(*args, **kwargs) + + with DatastoreTrace(product="Redis", target=None, operation=operation): + return await wrapped(*args, **kwargs) + + def _nr_wrapper_asyncio_Redis_method_(wrapped, instance, args, kwargs): + from redis.asyncio.client import Pipeline + + if isinstance(instance, Pipeline): + return wrapped(*args, **kwargs) + + # Method should be run when awaited, therefore we wrap in an async wrapper. 
+ return _nr_wrapper_asyncio_Redis_async_method_(wrapped)(*args, **kwargs) + + name = "%s.%s" % (instance_class_name, operation) + wrap_function_wrapper(module, name, _nr_wrapper_asyncio_Redis_method_) + + def _nr_Connection_send_command_wrapper_(wrapped, instance, args, kwargs): transaction = current_transaction() @@ -528,7 +571,15 @@ def _nr_Connection_send_command_wrapper_(wrapped, instance, args, kwargs): operation = _redis_operation_re.sub("_", operation) - with DatastoreTrace(product="Redis", target=None, operation=operation, host=host, port_path_or_id=port_path_or_id, database_name=db, source=wrapped): + with DatastoreTrace( + product="Redis", + target=None, + operation=operation, + host=host, + port_path_or_id=port_path_or_id, + database_name=db, + source=wrapped, + ): return wrapped(*args, **kwargs) @@ -544,6 +595,14 @@ def instrument_redis_client(module): _wrap_Redis_method_wrapper_(module, "Redis", name) +def instrument_asyncio_redis_client(module): + if hasattr(module, "Redis"): + class_ = getattr(module, "Redis") + for operation in _redis_client_async_methods: + if hasattr(class_, operation): + _wrap_asyncio_Redis_method_wrapper(module, "Redis", operation) + + def instrument_redis_commands_core(module): _instrument_redis_commands_module(module, "CoreCommands") @@ -574,7 +633,11 @@ def instrument_redis_commands_bf_commands(module): _instrument_redis_commands_module(module, "CMSCommands") _instrument_redis_commands_module(module, "TDigestCommands") _instrument_redis_commands_module(module, "TOPKCommands") - + + +def instrument_redis_commands_cluster(module): + _instrument_redis_commands_module(module, "RedisClusterCommands") + def _instrument_redis_commands_module(module, class_name): for name in _redis_client_methods: diff --git a/newrelic/hooks/datastore_solrpy.py b/newrelic/hooks/datastore_solrpy.py index 3b2ac9c17..74e808ae5 100644 --- a/newrelic/hooks/datastore_solrpy.py +++ b/newrelic/hooks/datastore_solrpy.py @@ -14,11 +14,20 @@ from newrelic.api.datastore_trace import wrap_datastore_trace -_solrpy_client_methods = ('query', 'add', 'add_many', 'delete', 'delete_many', -'delete_query', 'commit', 'optimize', 'raw_query') +_solrpy_client_methods = ( + "query", + "add", + "add_many", + "delete", + "delete_many", + "delete_query", + "commit", + "optimize", + "raw_query", +) + def instrument_solrpy(module): for name in _solrpy_client_methods: if hasattr(module.SolrConnection, name): - wrap_datastore_trace(module.SolrConnection, name, - product='Solr', target=None, operation=name) + wrap_datastore_trace(module, "SolrConnection.%s" % name, product="Solr", target=None, operation=name) diff --git a/newrelic/hooks/framework_aiohttp.py b/newrelic/hooks/framework_aiohttp.py index 346753167..68f4e70f1 100644 --- a/newrelic/hooks/framework_aiohttp.py +++ b/newrelic/hooks/framework_aiohttp.py @@ -11,7 +11,7 @@ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
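
The solrpy hook above now passes the module plus a dotted "SolrConnection.%s" path to wrap_datastore_trace instead of passing the class object directly. A rough sketch of how dotted-path patching can be implemented is below; `wrap_dotted` and `traced` are hypothetical helpers for illustration, not the agent's implementation.

import functools
import sys


def wrap_dotted(module, dotted_name, make_wrapper):
    # Resolve every attribute except the last, then patch the final one
    # in place on its parent object.
    parent = module
    parts = dotted_name.split(".")
    for attr in parts[:-1]:
        parent = getattr(parent, attr)
    original = getattr(parent, parts[-1])
    setattr(parent, parts[-1], make_wrapper(original))


class SolrConnection:
    def query(self, q):
        return "results for %s" % q


def traced(original):
    @functools.wraps(original)
    def wrapper(*args, **kwargs):
        print("tracing %s" % original.__name__)
        return original(*args, **kwargs)

    return wrapper


module = sys.modules[__name__]
wrap_dotted(module, "SolrConnection.query", traced)
print(SolrConnection().query("*:*"))
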
-import asyncio + import inspect import itertools @@ -26,7 +26,7 @@ function_wrapper, wrap_function_wrapper, ) -from newrelic.core.config import should_ignore_error +from newrelic.core.config import is_expected_error, should_ignore_error SUPPORTED_METHODS = ("connect", "head", "get", "delete", "options", "patch", "post", "put", "trace") @@ -61,6 +61,19 @@ def _should_ignore(exc, value, tb): return _should_ignore +def is_expected(transaction): + settings = transaction.settings + + def _is_expected(exc, value, tb): + from aiohttp import web + + if isinstance(value, web.HTTPException): + status_code = value.status_code + return is_expected_error((exc, value, tb), status_code, settings=settings) + + return _is_expected + + def _nr_process_response_proxy(response, transaction): nr_headers = transaction.process_response(response.status, response.headers) @@ -163,9 +176,8 @@ def _bind_params(request): @function_wrapper def _nr_aiohttp_wrap_middleware_(wrapped, instance, args, kwargs): - @asyncio.coroutine - def _inner(): - result = yield from wrapped(*args, **kwargs) + async def _inner(): + result = await wrapped(*args, **kwargs) return function_trace()(result) return _inner() @@ -221,10 +233,9 @@ def _nr_aiohttp_add_cat_headers_(wrapped, instance, args, kwargs): if is_coroutine_callable(wrapped): - @asyncio.coroutine - def new_coro(): + async def new_coro(): try: - result = yield from wrapped(*args, **kwargs) + result = await wrapped(*args, **kwargs) return result finally: instance.headers = tmp @@ -267,10 +278,9 @@ def _nr_aiohttp_request_wrapper_(wrapped, instance, args, kwargs): method, url = _bind_request(*args, **kwargs) trace = ExternalTrace("aiohttp", str(url), method) - @asyncio.coroutine - def _coro(): + async def _coro(): try: - response = yield from wrapped(*args, **kwargs) + response = await wrapped(*args, **kwargs) try: trace.process_response_headers(response.headers.items()) @@ -332,19 +342,18 @@ def _nr_request_wrapper(wrapped, instance, args, kwargs): coro = wrapped(*args, **kwargs) - if hasattr(coro, "__await__"): - coro = coro.__await__() - - @asyncio.coroutine - def _coro(*_args, **_kwargs): + async def _coro(*_args, **_kwargs): transaction = current_transaction() if transaction is None: - response = yield from coro + response = await coro return response # Patch in should_ignore to all notice_error calls transaction._ignore_errors = should_ignore(transaction) + # Patch in is_expected to all notice_error calls + transaction._expect_errors = is_expected(transaction) + import aiohttp transaction.add_framework_info(name="aiohttp", version=aiohttp.__version__) @@ -352,7 +361,7 @@ def _coro(*_args, **_kwargs): import aiohttp.web as _web try: - response = yield from coro + response = await coro except _web.HTTPException as e: _nr_process_response(e, transaction) raise diff --git a/newrelic/hooks/framework_django.py b/newrelic/hooks/framework_django.py index 005f28279..3d9f448cc 100644 --- a/newrelic/hooks/framework_django.py +++ b/newrelic/hooks/framework_django.py @@ -12,48 +12,60 @@ # See the License for the specific language governing permissions and # limitations under the License. 
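
The new `is_expected()` factory in the aiohttp hook above captures the transaction's settings in a closure and returns a predicate with the `(exc, value, tb)` shape that the error recorder calls, classifying `web.HTTPException` instances by status code. A simplified sketch of that closure pattern, with a hard-coded status set standing in for the real settings lookup:

def make_status_classifier(expected_statuses):
    # Capture the per-transaction expected statuses once; the returned
    # predicate has the (exc, value, tb) shape the error recorder expects.
    def _is_expected(exc, value, tb):
        status = getattr(value, "status_code", None)
        return status in expected_statuses

    return _is_expected


class HTTPNotFound(Exception):
    status_code = 404


classifier = make_status_classifier({404, 410})
err = HTTPNotFound()
print(classifier(HTTPNotFound, err, None))  # True: expected, not a real error
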
+import functools +import logging import sys import threading -import logging -import functools - -from newrelic.packages import six from newrelic.api.application import register_application from newrelic.api.background_task import BackgroundTaskWrapper from newrelic.api.error_trace import wrap_error_trace -from newrelic.api.function_trace import (FunctionTrace, wrap_function_trace, - FunctionTraceWrapper) +from newrelic.api.function_trace import ( + FunctionTrace, + FunctionTraceWrapper, + wrap_function_trace, +) from newrelic.api.html_insertion import insert_html_snippet -from newrelic.api.transaction import current_transaction from newrelic.api.time_trace import notice_error +from newrelic.api.transaction import current_transaction from newrelic.api.transaction_name import wrap_transaction_name from newrelic.api.wsgi_application import WSGIApplicationWrapper - -from newrelic.common.object_wrapper import (FunctionWrapper, wrap_in_function, - wrap_post_function, wrap_function_wrapper, function_wrapper) +from newrelic.common.coroutine import is_asyncio_coroutine, is_coroutine_function from newrelic.common.object_names import callable_name +from newrelic.common.object_wrapper import ( + FunctionWrapper, + function_wrapper, + wrap_function_wrapper, + wrap_in_function, + wrap_post_function, +) from newrelic.config import extra_settings from newrelic.core.config import global_settings -from newrelic.common.coroutine import is_coroutine_function, is_asyncio_coroutine +from newrelic.packages import six if six.PY3: from newrelic.hooks.framework_django_py3 import ( - _nr_wrapper_BaseHandler_get_response_async_, _nr_wrap_converted_middleware_async_, + _nr_wrapper_BaseHandler_get_response_async_, ) _logger = logging.getLogger(__name__) _boolean_states = { - '1': True, 'yes': True, 'true': True, 'on': True, - '0': False, 'no': False, 'false': False, 'off': False + "1": True, + "yes": True, + "true": True, + "on": True, + "0": False, + "no": False, + "false": False, + "off": False, } def _setting_boolean(value): if value.lower() not in _boolean_states: - raise ValueError('Not a boolean: %s' % value) + raise ValueError("Not a boolean: %s" % value) return _boolean_states[value.lower()] @@ -62,21 +74,20 @@ def _setting_set(value): _settings_types = { - 'browser_monitoring.auto_instrument': _setting_boolean, - 'instrumentation.templates.inclusion_tag': _setting_set, - 'instrumentation.background_task.startup_timeout': float, - 'instrumentation.scripts.django_admin': _setting_set, + "browser_monitoring.auto_instrument": _setting_boolean, + "instrumentation.templates.inclusion_tag": _setting_set, + "instrumentation.background_task.startup_timeout": float, + "instrumentation.scripts.django_admin": _setting_set, } _settings_defaults = { - 'browser_monitoring.auto_instrument': True, - 'instrumentation.templates.inclusion_tag': set(), - 'instrumentation.background_task.startup_timeout': 10.0, - 'instrumentation.scripts.django_admin': set(), + "browser_monitoring.auto_instrument": True, + "instrumentation.templates.inclusion_tag": set(), + "instrumentation.background_task.startup_timeout": 10.0, + "instrumentation.scripts.django_admin": set(), } -django_settings = extra_settings('import-hook:django', - types=_settings_types, defaults=_settings_defaults) +django_settings = extra_settings("import-hook:django", types=_settings_types, defaults=_settings_defaults) def should_add_browser_timing(response, transaction): @@ -92,7 +103,7 @@ def should_add_browser_timing(response, transaction): # do RUM insertion, need to 
move to a WSGI middleware and # deal with how to update the content length. - if hasattr(response, 'streaming_content'): + if hasattr(response, "streaming_content"): return False # Need to be running within a valid web transaction. @@ -121,21 +132,21 @@ def should_add_browser_timing(response, transaction): # a user may want to also perform insertion for # 'application/xhtml+xml'. - ctype = response.get('Content-Type', '').lower().split(';')[0] + ctype = response.get("Content-Type", "").lower().split(";")[0] if ctype not in transaction.settings.browser_monitoring.content_type: return False # Don't risk it if content encoding already set. - if response.has_header('Content-Encoding'): + if response.has_header("Content-Encoding"): return False # Don't risk it if content is actually within an attachment. - cdisposition = response.get('Content-Disposition', '').lower() + cdisposition = response.get("Content-Disposition", "").lower() - if cdisposition.split(';')[0].strip().lower() == 'attachment': + if cdisposition.split(";")[0].strip().lower() == "attachment": return False return True @@ -144,6 +155,7 @@ def should_add_browser_timing(response, transaction): # Response middleware for automatically inserting RUM header and # footer into HTML response returned by application + def browser_timing_insertion(response, transaction): # No point continuing if header is empty. This can occur if @@ -175,14 +187,15 @@ def html_to_be_inserted(): if result is not None: if transaction.settings.debug.log_autorum_middleware: - _logger.debug('RUM insertion from Django middleware ' - 'triggered. Bytes added was %r.', - len(result) - len(response.content)) + _logger.debug( + "RUM insertion from Django middleware triggered. Bytes added was %r.", + len(result) - len(response.content), + ) response.content = result - if response.get('Content-Length', None): - response['Content-Length'] = str(len(response.content)) + if response.get("Content-Length", None): + response["Content-Length"] = str(len(response.content)) return response @@ -192,18 +205,19 @@ def html_to_be_inserted(): # 'newrelic' will be automatically inserted into set of tag # libraries when performing step to instrument the middleware. + def newrelic_browser_timing_header(): from django.utils.safestring import mark_safe transaction = current_transaction() - return transaction and mark_safe(transaction.browser_timing_header()) or '' + return transaction and mark_safe(transaction.browser_timing_header()) or "" # nosec def newrelic_browser_timing_footer(): from django.utils.safestring import mark_safe transaction = current_transaction() - return transaction and mark_safe(transaction.browser_timing_footer()) or '' + return transaction and mark_safe(transaction.browser_timing_footer()) or "" # nosec # Addition of instrumentation for middleware. Can only do this @@ -256,9 +270,14 @@ def wrapper(wrapped, instance, args, kwargs): yield wrapper(wrapped) -def wrap_view_middleware(middleware): +# Because this is not being used in any version of Django that is +# within New Relic's support window, no tests will be added +# for this. However, value exists to keeping backwards compatible +# functionality, so instead of removing this instrumentation, this +# will be excluded from the coverage analysis. +def wrap_view_middleware(middleware): # pragma: no cover - # XXX This is no longer being used. The changes to strip the + # This is no longer being used. 
The changes to strip the # wrapper from the view handler when passed into the function # urlresolvers.reverse() solves most of the problems. To back # that up, the object wrapper now proxies various special @@ -293,7 +312,7 @@ def wrapper(wrapped, instance, args, kwargs): def _wrapped(request, view_func, view_args, view_kwargs): # This strips the view handler wrapper before call. - if hasattr(view_func, '_nr_last_object'): + if hasattr(view_func, "_nr_last_object"): view_func = view_func._nr_last_object return wrapped(request, view_func, view_args, view_kwargs) @@ -370,37 +389,28 @@ def insert_and_wrap_middleware(handler, *args, **kwargs): # priority than that for view handler so view handler # name always takes precedence. - if hasattr(handler, '_request_middleware'): - handler._request_middleware = list( - wrap_leading_middleware( - handler._request_middleware)) + if hasattr(handler, "_request_middleware"): + handler._request_middleware = list(wrap_leading_middleware(handler._request_middleware)) - if hasattr(handler, '_view_middleware'): - handler._view_middleware = list( - wrap_leading_middleware( - handler._view_middleware)) + if hasattr(handler, "_view_middleware"): + handler._view_middleware = list(wrap_leading_middleware(handler._view_middleware)) - if hasattr(handler, '_template_response_middleware'): + if hasattr(handler, "_template_response_middleware"): handler._template_response_middleware = list( - wrap_trailing_middleware( - handler._template_response_middleware)) + wrap_trailing_middleware(handler._template_response_middleware) + ) - if hasattr(handler, '_response_middleware'): - handler._response_middleware = list( - wrap_trailing_middleware( - handler._response_middleware)) + if hasattr(handler, "_response_middleware"): + handler._response_middleware = list(wrap_trailing_middleware(handler._response_middleware)) - if hasattr(handler, '_exception_middleware'): - handler._exception_middleware = list( - wrap_trailing_middleware( - handler._exception_middleware)) + if hasattr(handler, "_exception_middleware"): + handler._exception_middleware = list(wrap_trailing_middleware(handler._exception_middleware)) finally: lock.release() -def _nr_wrapper_GZipMiddleware_process_response_(wrapped, instance, args, - kwargs): +def _nr_wrapper_GZipMiddleware_process_response_(wrapped, instance, args, kwargs): transaction = current_transaction() @@ -433,36 +443,33 @@ def _nr_wrapper_BaseHandler_get_response_(wrapped, instance, args, kwargs): request = _bind_get_response(*args, **kwargs) - if hasattr(request, '_nr_exc_info'): + if hasattr(request, "_nr_exc_info"): notice_error(error=request._nr_exc_info, status_code=response.status_code) - delattr(request, '_nr_exc_info') + delattr(request, "_nr_exc_info") return response # Post import hooks for modules. + def instrument_django_core_handlers_base(module): # Attach a post function to load_middleware() method of # BaseHandler to trigger insertion of browser timing # middleware and wrapping of middleware for timing etc. 
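
The comment above describes attaching a post function to `BaseHandler.load_middleware()`: the original method runs first so the middleware lists exist, then the hook wraps them. A minimal sketch of such a post-function utility, under the assumption that it simply calls a hook after the wrapped method returns:

import functools


def wrap_post_function(owner, method_name, post_function):
    original = getattr(owner, method_name)

    @functools.wraps(original)
    def wrapper(*args, **kwargs):
        # Run the original first so its side effects (here: middleware
        # lists being populated) exist before the post hook fires.
        result = original(*args, **kwargs)
        post_function(*args, **kwargs)
        return result

    setattr(owner, method_name, wrapper)


class BaseHandler:
    def load_middleware(self):
        self._view_middleware = []


wrap_post_function(BaseHandler, "load_middleware", lambda self: print("wrapping", self._view_middleware))
BaseHandler().load_middleware()
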
- wrap_post_function(module, 'BaseHandler.load_middleware', - insert_and_wrap_middleware) + wrap_post_function(module, "BaseHandler.load_middleware", insert_and_wrap_middleware) - if six.PY3 and hasattr(module.BaseHandler, 'get_response_async'): - wrap_function_wrapper(module, 'BaseHandler.get_response_async', - _nr_wrapper_BaseHandler_get_response_async_) + if six.PY3 and hasattr(module.BaseHandler, "get_response_async"): + wrap_function_wrapper(module, "BaseHandler.get_response_async", _nr_wrapper_BaseHandler_get_response_async_) - wrap_function_wrapper(module, 'BaseHandler.get_response', - _nr_wrapper_BaseHandler_get_response_) + wrap_function_wrapper(module, "BaseHandler.get_response", _nr_wrapper_BaseHandler_get_response_) def instrument_django_gzip_middleware(module): - wrap_function_wrapper(module, 'GZipMiddleware.process_response', - _nr_wrapper_GZipMiddleware_process_response_) + wrap_function_wrapper(module, "GZipMiddleware.process_response", _nr_wrapper_GZipMiddleware_process_response_) def wrap_handle_uncaught_exception(middleware): @@ -506,10 +513,9 @@ def instrument_django_core_handlers_wsgi(module): import django - framework = ('Django', django.get_version()) + framework = ("Django", django.get_version()) - module.WSGIHandler.__call__ = WSGIApplicationWrapper( - module.WSGIHandler.__call__, framework=framework) + module.WSGIHandler.__call__ = WSGIApplicationWrapper(module.WSGIHandler.__call__, framework=framework) # Wrap handle_uncaught_exception() of WSGIHandler so that # can capture exception details of any exception which @@ -519,10 +525,10 @@ def instrument_django_core_handlers_wsgi(module): # exception, so last chance to do this as exception will not # propagate up to the WSGI application. - if hasattr(module.WSGIHandler, 'handle_uncaught_exception'): - module.WSGIHandler.handle_uncaught_exception = ( - wrap_handle_uncaught_exception( - module.WSGIHandler.handle_uncaught_exception)) + if hasattr(module.WSGIHandler, "handle_uncaught_exception"): + module.WSGIHandler.handle_uncaught_exception = wrap_handle_uncaught_exception( + module.WSGIHandler.handle_uncaught_exception + ) def wrap_view_handler(wrapped, priority=3): @@ -532,7 +538,7 @@ def wrap_view_handler(wrapped, priority=3): # called recursively. We flag that view handler was wrapped # using the '_nr_django_view_handler' attribute. 
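
The `_nr_django_view_handler` flag described above is a standard idempotent-wrapping guard: mark the wrapper with an attribute and bail out if the handler is already wrapped, so recursive resolution does not stack wrapper upon wrapper. A standalone sketch of the guard:

import functools


def wrap_view_handler_once(func):
    # Bail out early if this handler has already been wrapped.
    if getattr(func, "_nr_wrapped", False):
        return func

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)

    wrapper._nr_wrapped = True
    return wrapper


def view(request):
    return "ok"


wrapped = wrap_view_handler_once(view)
assert wrap_view_handler_once(wrapped) is wrapped  # second wrap is a no-op
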
- if hasattr(wrapped, '_nr_django_view_handler'): + if hasattr(wrapped, "_nr_django_view_handler"): return wrapped if hasattr(wrapped, "view_class"): @@ -584,7 +590,7 @@ def wrapper(wrapped, instance, args, kwargs): if transaction is None: return wrapped(*args, **kwargs) - if hasattr(transaction, '_nr_django_url_resolver'): + if hasattr(transaction, "_nr_django_url_resolver"): return wrapped(*args, **kwargs) # Tag the transaction so we know when we are in the top @@ -602,8 +608,7 @@ def _wrapped(path): if type(result) is tuple: callback, callback_args, callback_kwargs = result - result = (wrap_view_handler(callback, priority=5), - callback_args, callback_kwargs) + result = (wrap_view_handler(callback, priority=5), callback_args, callback_kwargs) else: result.func = wrap_view_handler(result.func, priority=5) @@ -636,8 +641,7 @@ def wrapper(wrapped, instance, args, kwargs): return wrap_view_handler(result, priority=priority) else: callback, param_dict = result - return (wrap_view_handler(callback, priority=priority), - param_dict) + return (wrap_view_handler(callback, priority=priority), param_dict) return FunctionWrapper(wrapped, wrapper) @@ -653,9 +657,10 @@ def wrap_url_reverse(wrapped): def wrapper(wrapped, instance, args, kwargs): def execute(viewname, *args, **kwargs): - if hasattr(viewname, '_nr_last_object'): + if hasattr(viewname, "_nr_last_object"): viewname = viewname._nr_last_object return wrapped(viewname, *args, **kwargs) + return execute(*args, **kwargs) return FunctionWrapper(wrapped, wrapper) @@ -672,20 +677,19 @@ def instrument_django_core_urlresolvers(module): # lost. We thus intercept it here so can capture that # traceback which is otherwise lost. - wrap_error_trace(module, 'get_callable') + wrap_error_trace(module, "get_callable") # Wrap methods which resolves a request to a view handler. # This can be called against a resolver initialised against # a custom URL conf associated with a specific request, or a # resolver which uses the default URL conf. - if hasattr(module, 'RegexURLResolver'): + if hasattr(module, "RegexURLResolver"): urlresolver = module.RegexURLResolver else: urlresolver = module.URLResolver - urlresolver.resolve = wrap_url_resolver( - urlresolver.resolve) + urlresolver.resolve = wrap_url_resolver(urlresolver.resolve) # Wrap methods which resolve error handlers. For 403 and 404 # we give these higher naming priority over any prior @@ -695,26 +699,22 @@ def instrument_django_core_urlresolvers(module): # handler in place so error details identify the correct # transaction. 
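
The resolver hunks above give 403/404 handlers naming priority 3 and 500 handlers priority 1, so an error handler only renames the transaction when nothing more specific has claimed it. A toy sketch of the precedence idea follows; the agent's actual rules live in its transaction object and may differ in detail.

class Transaction:
    def __init__(self):
        self.name = None
        self.priority = 0

    def set_transaction_name(self, name, priority=1):
        # A new name only wins if its priority is at least the current one,
        # so a 404 handler (priority 3) beats a generic name set at 1.
        if priority >= self.priority:
            self.name = name
            self.priority = priority


txn = Transaction()
txn.set_transaction_name("handler404", priority=3)
txn.set_transaction_name("resolve_error_handler", priority=1)  # ignored
print(txn.name)  # handler404
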
- if hasattr(urlresolver, 'resolve403'): - urlresolver.resolve403 = wrap_url_resolver_nnn( - urlresolver.resolve403, priority=3) + if hasattr(urlresolver, "resolve403"): + urlresolver.resolve403 = wrap_url_resolver_nnn(urlresolver.resolve403, priority=3) - if hasattr(urlresolver, 'resolve404'): - urlresolver.resolve404 = wrap_url_resolver_nnn( - urlresolver.resolve404, priority=3) + if hasattr(urlresolver, "resolve404"): + urlresolver.resolve404 = wrap_url_resolver_nnn(urlresolver.resolve404, priority=3) - if hasattr(urlresolver, 'resolve500'): - urlresolver.resolve500 = wrap_url_resolver_nnn( - urlresolver.resolve500, priority=1) + if hasattr(urlresolver, "resolve500"): + urlresolver.resolve500 = wrap_url_resolver_nnn(urlresolver.resolve500, priority=1) - if hasattr(urlresolver, 'resolve_error_handler'): - urlresolver.resolve_error_handler = wrap_url_resolver_nnn( - urlresolver.resolve_error_handler, priority=1) + if hasattr(urlresolver, "resolve_error_handler"): + urlresolver.resolve_error_handler = wrap_url_resolver_nnn(urlresolver.resolve_error_handler, priority=1) # Wrap function for performing reverse URL lookup to strip any # instrumentation wrapper when view handler is passed in. - if hasattr(module, 'reverse'): + if hasattr(module, "reverse"): module.reverse = wrap_url_reverse(module.reverse) @@ -723,7 +723,7 @@ def instrument_django_urls_base(module): # Wrap function for performing reverse URL lookup to strip any # instrumentation wrapper when view handler is passed in. - if hasattr(module, 'reverse'): + if hasattr(module, "reverse"): module.reverse = wrap_url_reverse(module.reverse) @@ -742,17 +742,15 @@ def instrument_django_template(module): def template_name(template, *args): return template.name - if hasattr(module.Template, '_render'): - wrap_function_trace(module, 'Template._render', - name=template_name, group='Template/Render') + if hasattr(module.Template, "_render"): + wrap_function_trace(module, "Template._render", name=template_name, group="Template/Render") else: - wrap_function_trace(module, 'Template.render', - name=template_name, group='Template/Render') + wrap_function_trace(module, "Template.render", name=template_name, group="Template/Render") # Django 1.8 no longer has module.libraries. As automatic way is not # preferred we can just skip this now. - if not hasattr(module, 'libraries'): + if not hasattr(module, "libraries"): return # Register template tags used for manual insertion of RUM @@ -766,12 +764,12 @@ def template_name(template, *args): library.simple_tag(newrelic_browser_timing_header) library.simple_tag(newrelic_browser_timing_footer) - module.libraries['django.templatetags.newrelic'] = library + module.libraries["django.templatetags.newrelic"] = library def wrap_template_block(wrapped): def wrapper(wrapped, instance, args, kwargs): - return FunctionTraceWrapper(wrapped, name=instance.name, group='Template/Block')(*args, **kwargs) + return FunctionTraceWrapper(wrapped, name=instance.name, group="Template/Block")(*args, **kwargs) return FunctionWrapper(wrapped, wrapper) @@ -812,11 +810,15 @@ def instrument_django_core_servers_basehttp(module): # instrumentation of the wsgiref module or some other means. 
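
The `wrap_in_function` call used below rewrites a function's arguments before the real call: the hook receives the original `(args, kwargs)`, swaps the WSGI application for a wrapped one, and returns the new argument tuple. A sketch of that mechanism under the assumption that the in-function returns `(args, kwargs)`:

import functools


def wrap_in_function(owner, method_name, in_function):
    original = getattr(owner, method_name)

    @functools.wraps(original)
    def wrapper(*args, **kwargs):
        # Let the hook rewrite the argument tuple before the real call.
        args, kwargs = in_function(*args, **kwargs)
        return original(*args, **kwargs)

    setattr(owner, method_name, wrapper)


class ServerHandler:
    def run(self, application):
        return application()


def swap_application(self, application, **kwargs):
    return (self, lambda: "wrapped:" + application()), kwargs


wrap_in_function(ServerHandler, "run", swap_application)
print(ServerHandler().run(lambda: "app"))  # wrapped:app
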
def wrap_wsgi_application_entry_point(server, application, **kwargs): - return ((server, WSGIApplicationWrapper(application, - framework='Django'),), kwargs) + return ( + ( + server, + WSGIApplicationWrapper(application, framework="Django"), + ), + kwargs, + ) - if (not hasattr(module, 'simple_server') and - hasattr(module.ServerHandler, 'run')): + if not hasattr(module, "simple_server") and hasattr(module.ServerHandler, "run"): # Patch the server to make it work properly. @@ -833,11 +835,10 @@ def run(self, application): def close(self): if self.result is not None: try: - self.request_handler.log_request( - self.status.split(' ', 1)[0], self.bytes_sent) + self.request_handler.log_request(self.status.split(" ", 1)[0], self.bytes_sent) finally: try: - if hasattr(self.result, 'close'): + if hasattr(self.result, "close"): self.result.close() finally: self.result = None @@ -855,17 +856,16 @@ def close(self): # Now wrap it with our instrumentation. - wrap_in_function(module, 'ServerHandler.run', - wrap_wsgi_application_entry_point) + wrap_in_function(module, "ServerHandler.run", wrap_wsgi_application_entry_point) def instrument_django_contrib_staticfiles_views(module): - if not hasattr(module.serve, '_nr_django_view_handler'): + if not hasattr(module.serve, "_nr_django_view_handler"): module.serve = wrap_view_handler(module.serve, priority=3) def instrument_django_contrib_staticfiles_handlers(module): - wrap_transaction_name(module, 'StaticFilesHandler.serve') + wrap_transaction_name(module, "StaticFilesHandler.serve") def instrument_django_views_debug(module): @@ -878,10 +878,8 @@ def instrument_django_views_debug(module): # from a middleware or view handler in place so error # details identify the correct transaction. - module.technical_404_response = wrap_view_handler( - module.technical_404_response, priority=3) - module.technical_500_response = wrap_view_handler( - module.technical_500_response, priority=1) + module.technical_404_response = wrap_view_handler(module.technical_404_response, priority=3) + module.technical_500_response = wrap_view_handler(module.technical_500_response, priority=1) def resolve_view_handler(view, request): @@ -890,8 +888,7 @@ def resolve_view_handler(view, request): # duplicate the lookup mechanism. 
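
`resolve_view_handler()` below mirrors Django's `View.dispatch()` lookup without invoking it: map the HTTP verb onto a method name, falling back to the 405 handler. A self-contained sketch of the same lookup:

class View:
    http_method_names = ["get", "post"]

    def get(self, request):
        return "GET handler"

    def http_method_not_allowed(self, request):
        return "405 Method Not Allowed"


def resolve_view_handler(view, method):
    # Duplicate the dispatch lookup: verb name if allowed, else 405.
    if method.lower() in view.http_method_names:
        return getattr(view, method.lower(), view.http_method_not_allowed)
    return view.http_method_not_allowed


view = View()
print(resolve_view_handler(view, "GET")(None))     # GET handler
print(resolve_view_handler(view, "DELETE")(None))  # 405 Method Not Allowed
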
if request.method.lower() in view.http_method_names: - handler = getattr(view, request.method.lower(), - view.http_method_not_allowed) + handler = getattr(view, request.method.lower(), view.http_method_not_allowed) else: handler = view.http_method_not_allowed @@ -936,7 +933,7 @@ def _args(request, *args, **kwargs): priority = 4 - if transaction.group == 'Function': + if transaction.group == "Function": if transaction.name == callable_name(view): priority = 5 @@ -953,22 +950,22 @@ def instrument_django_views_generic_base(module): def instrument_django_http_multipartparser(module): - wrap_function_trace(module, 'MultiPartParser.parse') + wrap_function_trace(module, "MultiPartParser.parse") def instrument_django_core_mail(module): - wrap_function_trace(module, 'mail_admins') - wrap_function_trace(module, 'mail_managers') - wrap_function_trace(module, 'send_mail') + wrap_function_trace(module, "mail_admins") + wrap_function_trace(module, "mail_managers") + wrap_function_trace(module, "send_mail") def instrument_django_core_mail_message(module): - wrap_function_trace(module, 'EmailMessage.send') + wrap_function_trace(module, "EmailMessage.send") def _nr_wrapper_BaseCommand___init___(wrapped, instance, args, kwargs): instance.handle = FunctionTraceWrapper(instance.handle) - if hasattr(instance, 'handle_noargs'): + if hasattr(instance, "handle_noargs"): instance.handle_noargs = FunctionTraceWrapper(instance.handle_noargs) return wrapped(*args, **kwargs) @@ -982,29 +979,25 @@ def _args(argv, *args, **kwargs): subcommand = _argv[1] commands = django_settings.instrumentation.scripts.django_admin - startup_timeout = \ - django_settings.instrumentation.background_task.startup_timeout + startup_timeout = django_settings.instrumentation.background_task.startup_timeout if subcommand not in commands: return wrapped(*args, **kwargs) application = register_application(timeout=startup_timeout) - return BackgroundTaskWrapper(wrapped, application, subcommand, 'Django')(*args, **kwargs) + return BackgroundTaskWrapper(wrapped, application, subcommand, "Django")(*args, **kwargs) def instrument_django_core_management_base(module): - wrap_function_wrapper(module, 'BaseCommand.__init__', - _nr_wrapper_BaseCommand___init___) - wrap_function_wrapper(module, 'BaseCommand.run_from_argv', - _nr_wrapper_BaseCommand_run_from_argv_) + wrap_function_wrapper(module, "BaseCommand.__init__", _nr_wrapper_BaseCommand___init___) + wrap_function_wrapper(module, "BaseCommand.run_from_argv", _nr_wrapper_BaseCommand_run_from_argv_) @function_wrapper -def _nr_wrapper_django_inclusion_tag_wrapper_(wrapped, instance, - args, kwargs): +def _nr_wrapper_django_inclusion_tag_wrapper_(wrapped, instance, args, kwargs): - name = hasattr(wrapped, '__name__') and wrapped.__name__ + name = hasattr(wrapped, "__name__") and wrapped.__name__ if name is None: return wrapped(*args, **kwargs) @@ -1013,16 +1006,14 @@ def _nr_wrapper_django_inclusion_tag_wrapper_(wrapped, instance, tags = django_settings.instrumentation.templates.inclusion_tag - if '*' not in tags and name not in tags and qualname not in tags: + if "*" not in tags and name not in tags and qualname not in tags: return wrapped(*args, **kwargs) - return FunctionTraceWrapper(wrapped, name=name, group='Template/Tag')(*args, **kwargs) + return FunctionTraceWrapper(wrapped, name=name, group="Template/Tag")(*args, **kwargs) @function_wrapper -def _nr_wrapper_django_inclusion_tag_decorator_(wrapped, instance, - args, kwargs): - +def _nr_wrapper_django_inclusion_tag_decorator_(wrapped, instance, 
args, kwargs): def _bind_params(func, *args, **kwargs): return func, args, kwargs @@ -1033,63 +1024,56 @@ def _bind_params(func, *args, **kwargs): return wrapped(func, *_args, **_kwargs) -def _nr_wrapper_django_template_base_Library_inclusion_tag_(wrapped, - instance, args, kwargs): +def _nr_wrapper_django_template_base_Library_inclusion_tag_(wrapped, instance, args, kwargs): - return _nr_wrapper_django_inclusion_tag_decorator_( - wrapped(*args, **kwargs)) + return _nr_wrapper_django_inclusion_tag_decorator_(wrapped(*args, **kwargs)) @function_wrapper -def _nr_wrapper_django_template_base_InclusionNode_render_(wrapped, - instance, args, kwargs): +def _nr_wrapper_django_template_base_InclusionNode_render_(wrapped, instance, args, kwargs): if wrapped.__self__ is None: return wrapped(*args, **kwargs) - file_name = getattr(wrapped.__self__, '_nr_file_name', None) + file_name = getattr(wrapped.__self__, "_nr_file_name", None) if file_name is None: return wrapped(*args, **kwargs) name = wrapped.__self__._nr_file_name - return FunctionTraceWrapper(wrapped, name=name, group='Template/Include')(*args, **kwargs) + return FunctionTraceWrapper(wrapped, name=name, group="Template/Include")(*args, **kwargs) -def _nr_wrapper_django_template_base_generic_tag_compiler_(wrapped, instance, - args, kwargs): +def _nr_wrapper_django_template_base_generic_tag_compiler_(wrapped, instance, args, kwargs): if wrapped.__code__.co_argcount > 6: # Django > 1.3. - def _bind_params(parser, token, params, varargs, varkw, defaults, - name, takes_context, node_class, *args, **kwargs): + def _bind_params( + parser, token, params, varargs, varkw, defaults, name, takes_context, node_class, *args, **kwargs + ): return node_class + else: # Django <= 1.3. - def _bind_params(params, defaults, name, node_class, parser, token, - *args, **kwargs): + def _bind_params(params, defaults, name, node_class, parser, token, *args, **kwargs): return node_class node_class = _bind_params(*args, **kwargs) - if node_class.__name__ == 'InclusionNode': + if node_class.__name__ == "InclusionNode": result = wrapped(*args, **kwargs) - result.render = ( - _nr_wrapper_django_template_base_InclusionNode_render_( - result.render)) + result.render = _nr_wrapper_django_template_base_InclusionNode_render_(result.render) return result return wrapped(*args, **kwargs) -def _nr_wrapper_django_template_base_Library_tag_(wrapped, instance, - args, kwargs): - +def _nr_wrapper_django_template_base_Library_tag_(wrapped, instance, args, kwargs): def _bind_params(name=None, compile_function=None, *args, **kwargs): return compile_function @@ -1105,14 +1089,16 @@ def _get_node_class(compile_function): # Django >= 1.4 uses functools.partial if isinstance(compile_function, functools.partial): - node_class = compile_function.keywords.get('node_class') + node_class = compile_function.keywords.get("node_class") # Django < 1.4 uses their home-grown "curry" function, # not functools.partial. - if (hasattr(compile_function, 'func_closure') and - hasattr(compile_function, '__name__') and - compile_function.__name__ == '_curried'): + if ( + hasattr(compile_function, "func_closure") + and hasattr(compile_function, "__name__") + and compile_function.__name__ == "_curried" + ): # compile_function here is generic_tag_compiler(), which has been # curried. 
To get node_class, we first get the function obj, args, @@ -1121,19 +1107,20 @@ def _get_node_class(compile_function): # is not consistent from platform to platform, so we need to map # them to the variables in compile_function.__code__.co_freevars. - cells = dict(zip(compile_function.__code__.co_freevars, - (c.cell_contents for c in compile_function.func_closure))) + cells = dict( + zip(compile_function.__code__.co_freevars, (c.cell_contents for c in compile_function.func_closure)) + ) # node_class is the 4th arg passed to generic_tag_compiler() - if 'args' in cells and len(cells['args']) > 3: - node_class = cells['args'][3] + if "args" in cells and len(cells["args"]) > 3: + node_class = cells["args"][3] return node_class node_class = _get_node_class(compile_function) - if node_class is None or node_class.__name__ != 'InclusionNode': + if node_class is None or node_class.__name__ != "InclusionNode": return wrapped(*args, **kwargs) # Climb stack to find the file_name of the include template. @@ -1146,9 +1133,8 @@ def _get_node_class(compile_function): for i in range(1, stack_levels + 1): frame = sys._getframe(i) - if ('generic_tag_compiler' in frame.f_code.co_names and - 'file_name' in frame.f_code.co_freevars): - file_name = frame.f_locals.get('file_name') + if "generic_tag_compiler" in frame.f_code.co_names and "file_name" in frame.f_code.co_freevars: + file_name = frame.f_locals.get("file_name") if file_name is None: return wrapped(*args, **kwargs) @@ -1167,22 +1153,22 @@ def instrument_django_template_base(module): settings = global_settings() - if 'django.instrumentation.inclusion-tags.r1' in settings.feature_flag: + if "django.instrumentation.inclusion-tags.r1" in settings.feature_flag: - if hasattr(module, 'generic_tag_compiler'): - wrap_function_wrapper(module, 'generic_tag_compiler', - _nr_wrapper_django_template_base_generic_tag_compiler_) + if hasattr(module, "generic_tag_compiler"): + wrap_function_wrapper( + module, "generic_tag_compiler", _nr_wrapper_django_template_base_generic_tag_compiler_ + ) - if hasattr(module, 'Library'): - wrap_function_wrapper(module, 'Library.tag', - _nr_wrapper_django_template_base_Library_tag_) + if hasattr(module, "Library"): + wrap_function_wrapper(module, "Library.tag", _nr_wrapper_django_template_base_Library_tag_) - wrap_function_wrapper(module, 'Library.inclusion_tag', - _nr_wrapper_django_template_base_Library_inclusion_tag_) + wrap_function_wrapper( + module, "Library.inclusion_tag", _nr_wrapper_django_template_base_Library_inclusion_tag_ + ) def _nr_wrap_converted_middleware_(middleware, name): - @function_wrapper def _wrapper(wrapped, instance, args, kwargs): transaction = current_transaction() @@ -1197,9 +1183,7 @@ def _wrapper(wrapped, instance, args, kwargs): return _wrapper(middleware) -def _nr_wrapper_convert_exception_to_response_(wrapped, instance, args, - kwargs): - +def _nr_wrapper_convert_exception_to_response_(wrapped, instance, args, kwargs): def _bind_params(original_middleware, *args, **kwargs): return original_middleware @@ -1214,21 +1198,19 @@ def _bind_params(original_middleware, *args, **kwargs): def instrument_django_core_handlers_exception(module): - if hasattr(module, 'convert_exception_to_response'): - wrap_function_wrapper(module, 'convert_exception_to_response', - _nr_wrapper_convert_exception_to_response_) + if hasattr(module, "convert_exception_to_response"): + wrap_function_wrapper(module, "convert_exception_to_response", _nr_wrapper_convert_exception_to_response_) - if hasattr(module, 
'handle_uncaught_exception'): - module.handle_uncaught_exception = ( - wrap_handle_uncaught_exception( - module.handle_uncaught_exception)) + if hasattr(module, "handle_uncaught_exception"): + module.handle_uncaught_exception = wrap_handle_uncaught_exception(module.handle_uncaught_exception) def instrument_django_core_handlers_asgi(module): import django - framework = ('Django', django.get_version()) + framework = ("Django", django.get_version()) - if hasattr(module, 'ASGIHandler'): + if hasattr(module, "ASGIHandler"): from newrelic.api.asgi_application import wrap_asgi_application - wrap_asgi_application(module, 'ASGIHandler.__call__', framework=framework) + + wrap_asgi_application(module, "ASGIHandler.__call__", framework=framework) diff --git a/newrelic/hooks/framework_flask.py b/newrelic/hooks/framework_flask.py index ea2f4f143..c0540a60d 100644 --- a/newrelic/hooks/framework_flask.py +++ b/newrelic/hooks/framework_flask.py @@ -16,6 +16,8 @@ """ +from inspect import isclass + from newrelic.api.function_trace import ( FunctionTrace, FunctionTraceWrapper, @@ -55,6 +57,22 @@ def _nr_wrapper_handler_(wrapped, instance, args, kwargs): name = getattr(wrapped, "_nr_view_func_name", callable_name(wrapped)) view = getattr(wrapped, "view_class", wrapped) + try: + # Attempt to narrow down class based views to the correct method + from flask import request + from flask.views import MethodView + + if isclass(view): + if issubclass(view, MethodView): + # For method based views, use the corresponding method if available + method = request.method.lower() + view = getattr(view, method, view) + else: + # For class based views, use the dispatch_request function if available + view = getattr(view, "dispatch_request", view) + except ImportError: + pass + # Set priority=2 so this will take precedence over any error # handler which will be at priority=1. diff --git a/newrelic/hooks/framework_graphql.py b/newrelic/hooks/framework_graphql.py index b5af068f0..3e1d4333c 100644 --- a/newrelic/hooks/framework_graphql.py +++ b/newrelic/hooks/framework_graphql.py @@ -187,8 +187,9 @@ def wrap_execute_operation(wrapped, instance, args, kwargs): if operation.selection_set is not None: fields = operation.selection_set.selections # Ignore transactions for introspection queries - for field in fields: - if get_node_value(field, "name") in GRAPHQL_INTROSPECTION_FIELDS: + if not (transaction.settings and transaction.settings.instrumentation.graphql.capture_introspection_queries): + # If all selected fields are introspection fields + if all(get_node_value(field, "name") in GRAPHQL_INTROSPECTION_FIELDS for field in fields): ignore_transaction() fragments = execution_context.fragments diff --git a/newrelic/hooks/framework_sanic.py b/newrelic/hooks/framework_sanic.py index 745cdbf70..94b5179c2 100644 --- a/newrelic/hooks/framework_sanic.py +++ b/newrelic/hooks/framework_sanic.py @@ -311,4 +311,4 @@ def instrument_sanic_response(module): def instrument_sanic_touchup_service(module): if hasattr(module, "TouchUp") and hasattr(module.TouchUp, "run"): - wrap_function_wrapper(module.TouchUp, "run", _nr_wrap_touchup_run) + wrap_function_wrapper(module, "TouchUp.run", _nr_wrap_touchup_run) diff --git a/newrelic/hooks/framework_twisted.py b/newrelic/hooks/framework_twisted.py deleted file mode 100644 index 0270282a6..000000000 --- a/newrelic/hooks/framework_twisted.py +++ /dev/null @@ -1,560 +0,0 @@ -# Copyright 2010 New Relic, Inc. 
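
The Flask hunk above narrows class-based views so code-level metrics point at the method that actually handles the request: for `MethodView` subclasses, prefer the method matching the HTTP verb; for other class-based views, prefer `dispatch_request`. A standalone sketch with a stand-in `MethodView`, since importing Flask is not assumed here:

from inspect import isclass


class MethodView:  # stand-in for flask.views.MethodView
    pass


class ItemAPI(MethodView):
    def get(self):
        return "one item"


def narrow_view(view, http_method):
    # When the endpoint is a class, pick the code that will actually run.
    if isclass(view):
        if issubclass(view, MethodView):
            return getattr(view, http_method.lower(), view)
        return getattr(view, "dispatch_request", view)
    return view


print(narrow_view(ItemAPI, "GET"))  # <function ItemAPI.get at ...>
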
-# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -import logging -import sys -import weakref -import UserList - -import newrelic.api.application -import newrelic.api.object_wrapper -import newrelic.api.transaction -import newrelic.api.web_transaction -import newrelic.api.function_trace -import newrelic.api.error_trace - -from newrelic.api.time_trace import notice_error - -_logger = logging.getLogger(__name__) - -class RequestProcessWrapper(object): - - def __init__(self, wrapped): - if isinstance(wrapped, tuple): - (instance, wrapped) = wrapped - else: - instance = None - - newrelic.api.object_wrapper.update_wrapper(self, wrapped) - - self._nr_instance = instance - self._nr_next_object = wrapped - - if not hasattr(self, '_nr_last_object'): - self._nr_last_object = wrapped - - def __get__(self, instance, klass): - if instance is None: - return self - descriptor = self._nr_next_object.__get__(instance, klass) - return self.__class__((instance, descriptor)) - - def __call__(self): - assert self._nr_instance != None - - transaction = newrelic.api.transaction.current_transaction() - - # Check to see if we are being called within the context of any - # sort of transaction. If we are, then we don't bother doing - # anything and just call the wrapped function. This should not - # really ever occur with Twisted.Web wrapper but check anyway. - - if transaction: - return self._nr_next_object() - - # Always use the default application specified in the agent - # configuration. - - application = newrelic.api.application.application_instance() - - # We need to fake up a WSGI like environ dictionary with the key - # bits of information we need. - - environ = {} - - environ['REQUEST_URI'] = self._nr_instance.path - - # Now start recording the actual web transaction. - - transaction = newrelic.api.web_transaction.WSGIWebTransaction( - application, environ, source=self._nr_next_object) - - if not transaction.enabled: - return self._nr_next_object() - - transaction.__enter__() - - self._nr_instance._nr_transaction = transaction - - self._nr_instance._nr_is_deferred_callback = False - self._nr_instance._nr_is_request_finished = False - self._nr_instance._nr_wait_function_trace = None - - # We need to add a reference to the Twisted.Web request object - # in the transaction as only able to stash the transaction in a - # deferred. Need to use a weakref to avoid an object cycle which - # may prevent cleanup of transaction. - - transaction._nr_current_request = weakref.ref(self._nr_instance) - - try: - # Call the original method in a trace object to give better - # context in transaction traces. Three things can happen - # within this call. The render() function which is in turn - # called can return a result immediately which means user - # code should have called finish() on the request, it can - # raise an exception which is caught in process() function - # where error handling calls finish(), or it can return that - # it is not done yet and register deferred callbacks to - # complete the request. 
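
The removed Twisted code above stashes the request on the transaction through `weakref.ref()` so that the transaction-to-request back-reference does not create a cycle that delays cleanup. A small demonstration of the pattern (CPython's reference counting collects the request promptly once the last strong reference is dropped):

import weakref


class Request(object):
    pass


class Transaction(object):
    pass


request = Request()
transaction = Transaction()

# A weak reference back to the request avoids the reference cycle that a
# plain attribute would create.
transaction._nr_current_request = weakref.ref(request)

print(transaction._nr_current_request() is request)  # True while alive
del request
print(transaction._nr_current_request())  # None after collection
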
- - result = newrelic.api.function_trace.FunctionTraceWrapper(self._nr_next_object, name='Request/Process', group='Python/Twisted') - - # In the case of a result having being returned or an - # exception occuring, then finish() will have been called. - # We can't just exit the transaction in the finish call - # however as need to still pop back up through the above - # function trace. So if flagged that have finished, then we - # exit the transaction here. Otherwise we setup a function - # trace to track wait time for deferred and manually pop the - # transaction as being the current one for this thread. - - if self._nr_instance._nr_is_request_finished: - transaction.__exit__(None, None, None) - self._nr_instance._nr_transaction = None - self._nr_instance = None - - else: - self._nr_instance._nr_wait_function_trace = \ - newrelic.api.function_trace.FunctionTrace( - name='Deferred/Wait', - group='Python/Twisted', - source=self._nr_next_object) - - self._nr_instance._nr_wait_function_trace.__enter__() - transaction.drop_transaction() - - except: # Catch all - # If an error occurs assume that transaction should be - # exited. Technically don't believe this should ever occur - # unless our code here has an error or Twisted.Web is - # broken. - - _logger.exception('Unexpected exception raised by Twisted.Web ' - 'Request.process() exception.') - - transaction.__exit__(*sys.exc_info()) - self._nr_instance._nr_transaction = None - self._nr_instance = None - - raise - - return result - -class RequestFinishWrapper(object): - - def __init__(self, wrapped): - if isinstance(wrapped, tuple): - (instance, wrapped) = wrapped - else: - instance = None - - newrelic.api.object_wrapper.update_wrapper(self, wrapped) - - self._nr_instance = instance - self._nr_next_object = wrapped - - if not hasattr(self, '_nr_last_object'): - self._nr_last_object = wrapped - - def __get__(self, instance, klass): - if instance is None: - return self - descriptor = self._nr_next_object.__get__(instance, klass) - return self.__class__((instance, descriptor)) - - def __call__(self): - assert self._nr_instance != None - - # Call finish() method straight away if request is not even - # associated with a transaction. - - if not hasattr(self._nr_instance, '_nr_transaction'): - return self._nr_next_object() - - # Technically we should only be able to be called here without - # an active transaction if we are in the wait state. If we - # are called in context of original request process() function - # or a deferred the transaction should already be registered. - - transaction = self._nr_instance._nr_transaction - - if self._nr_instance._nr_wait_function_trace: - if newrelic.api.transaction.current_transaction(): - _logger.debug('The Twisted.Web request finish() method is ' - 'being called while in wait state but there is ' - 'already a current transaction.') - else: - transaction.save_transaction() - - elif not newrelic.api.transaction.current_transaction(): - _logger.debug('The Twisted.Web request finish() method is ' - 'being called from request process() method or a ' - 'deferred but there is not a current transaction.') - - # Except for case of being called when in wait state, we can't - # actually exit the transaction at this point as may be called - # in context of an outer function trace node. We thus flag that - # are finished and pop back out allowing outer scope to actually - # exit the transaction. - - self._nr_instance._nr_is_request_finished = True - - # Now call the original finish() function. 
- - if self._nr_instance._nr_is_deferred_callback: - - # If we are in a deferred callback log any error against the - # transaction here so we know we will capture it. We - # possibly don't need to do it here as outer scope may catch - # it anyway. Duplicate will be ignored so not too important. - # Most likely the finish() call would never fail anyway. - - try: - result = newrelic.api.function_trace.FunctionTraceWrapper(self._nr_next_object, name='Request/Finish', group='Python/Twisted') - except: # Catch all - notice_error(sys.exc_info()) - raise - - elif self._nr_instance._nr_wait_function_trace: - - # Now handle the special case where finish() was called - # while in the wait state. We might get here through - # Twisted.Web itself somehow calling finish() when still - # waiting for a deferred. If this were to occur though then - # the transaction will not be popped if we simply marked - # request as finished as no outer scope to see that and - # clean up. We will thus need to end the function trace and - # exit the transaction. We end function trace here and then - # the transaction down below. - - try: - self._nr_instance._nr_wait_function_trace.__exit__( - None, None, None) - - result = newrelic.api.function_trace.FunctionTraceWrapper(self._nr_next_object, name='Request/Finish', group='Python/Twisted') - - - transaction.__exit__(None, None, None) - - except: # Catch all - transaction.__exit__(*sys.exc_info()) - raise - - finally: - self._nr_instance._nr_wait_function_trace = None - self._nr_instance._nr_transaction = None - self._nr_instance = None - - else: - # This should be the case where finish() is being called in - # the original render() function. - - result = newrelic.api.function_trace.FunctionTraceWrapper(self._nr_next_object, name='Request/Finish', group='Python/Twisted') - - return result - -class ResourceRenderWrapper(object): - - def __init__(self, wrapped): - if isinstance(wrapped, tuple): - (instance, wrapped) = wrapped - else: - instance = None - - newrelic.api.object_wrapper.update_wrapper(self, wrapped) - - self._nr_instance = instance - self._nr_next_object = wrapped - - if not hasattr(self, '_nr_last_object'): - self._nr_last_object = wrapped - - def __get__(self, instance, klass): - if instance is None: - return self - descriptor = self._nr_next_object.__get__(instance, klass) - return self.__class__((instance, descriptor)) - - def __call__(self, *args): - - # Temporary work around due to customer calling class method - # directly with 'self' as first argument. Need to work out best - # practice for dealing with this. - - if len(args) == 2: - # Assume called as unbound method with (self, request). - instance, request = args - else: - # Assume called as bound method with (request). - instance = self._nr_instance - request = args[-1] - - assert instance != None - - transaction = newrelic.api.transaction.current_transaction() - - if transaction is None: - return self._nr_next_object(*args) - - # This is wrapping the render() function of the resource. We - # name the function node and the web transaction after the name - # of the handler function augmented with the method type for the - # request. 
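
The render wrapper above names the transaction after the resource class plus the HTTP method, so a GET handled by a `Root` resource reports as something like "Root.render_GET". A sketch of that naming scheme, using standard attribute introspection in place of the agent's `callable_name()` helper:

class Resource(object):
    pass


def render_transaction_name(resource, method):
    cls = type(resource)
    # e.g. "__main__.Resource.render_GET" when handling a GET request.
    return "%s.%s.render_%s" % (cls.__module__, cls.__name__, method)


print(render_transaction_name(Resource(), "GET"))
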
- - name = "%s.render_%s" % ( - newrelic.api.object_wrapper.callable_name( - instance), request.method) - transaction.set_transaction_name(name, priority=1) - - return newrelic.api.function_trace.FunctionTraceWrapper(self._nr_next_object, name)(*args) - - -class DeferredUserList(UserList.UserList): - - def pop(self, i=-1): - import twisted.internet.defer - item = super(DeferredUserList, self).pop(i) - - item0 = item[0] - item1 = item[1] - - if item0[0] != twisted.internet.defer._CONTINUE: - item0 = (newrelic.api.function_trace.FunctionTraceWrapper( - item0[0], group='Python/Twisted/Callback'), - item0[1], item0[2]) - - if item1[0] != twisted.internet.defer._CONTINUE: - item1 = (newrelic.api.function_trace.FunctionTraceWrapper( - item1[0], group='Python/Twisted/Errback'), - item1[1], item1[2]) - - return (item0, item1) - -class DeferredWrapper(object): - - def __init__(self, wrapped): - if isinstance(wrapped, tuple): - (instance, wrapped) = wrapped - else: - instance = None - - newrelic.api.object_wrapper.update_wrapper(self, wrapped) - - self._nr_instance = instance - self._nr_next_object = wrapped - - if not hasattr(self, '_nr_last_object'): - self._nr_last_object = wrapped - - def __get__(self, instance, klass): - if instance is None: - return self - descriptor = self._nr_next_object.__get__(instance, klass) - return self.__class__((instance, descriptor)) - - def __call__(self, *args, **kwargs): - - # This is wrapping the __init__() function so call that first. - - self._nr_next_object(*args, **kwargs) - - # We now wrap the list of deferred callbacks so can track when - # each callback is actually called. - - if self._nr_instance: - transaction = newrelic.api.transaction.current_transaction() - if transaction: - self._nr_instance._nr_transaction = transaction - self._nr_instance.callbacks = DeferredUserList( - self._nr_instance.callbacks) - -class DeferredCallbacksWrapper(object): - - def __init__(self, wrapped): - if isinstance(wrapped, tuple): - (instance, wrapped) = wrapped - else: - instance = None - - newrelic.api.object_wrapper.update_wrapper(self, wrapped) - - self._nr_instance = instance - self._nr_next_object = wrapped - - if not hasattr(self, '_nr_last_object'): - self._nr_last_object = wrapped - - def __get__(self, instance, klass): - if instance is None: - return self - descriptor = self._nr_next_object.__get__(instance, klass) - return self.__class__((instance, descriptor)) - - def __call__(self): - assert self._nr_instance != None - - transaction = newrelic.api.transaction.current_transaction() - - # If there is an active transaction then deferred is being - # called within a context of another deferred so simply call the - # callback and return. - - if transaction: - return self._nr_next_object() - - # If there is no transaction recorded against the deferred then - # don't need to do anything and can simply call the callback and - # return. - - if not hasattr(self._nr_instance, '_nr_transaction'): - return self._nr_next_object() - - transaction = self._nr_instance._nr_transaction - - # If we can't find a Twisted.Web request object associated with - # the transaction or it is no longer valid then simply call the - # callback and return. - - if not hasattr(transaction, '_nr_current_request'): - return self._nr_next_object() - - request = transaction._nr_current_request() - - if not request: - return self._nr_next_object() - - try: - # Save the transaction recorded against the deferred as the - # active transaction. 
-
-            transaction.save_transaction()
-
-            # Record that we are calling a deferred. This changes what
-            # we do if the request finish() method is called.
-
-            request._nr_is_deferred_callback = True
-
-            # We should always be calling into a deferred when we are
-            # in the wait state for the request. We need to exit that
-            # wait state.
-
-            if request._nr_wait_function_trace:
-                request._nr_wait_function_trace.__exit__(None, None, None)
-                request._nr_wait_function_trace = None
-
-            else:
-                _logger.debug('Called a Twisted.Web deferred when we were '
-                        'not in a wait state.')
-
-            # Call the deferred and capture any errors that may come
-            # back from it.
-
-            with newrelic.api.error_trace.ErrorTrace():
-                return newrelic.api.function_trace.FunctionTraceWrapper(self._nr_next_object, name='Deferred/Call', group='Python/Twisted')()
-
-        finally:
-            # If the request finish() method was called from the
-            # deferred, then we need to exit the transaction. Otherwise
-            # we need to create a new function trace node for a new
-            # wait state and pop the transaction.
-
-            if request._nr_is_request_finished:
-                transaction.__exit__(None, None, None)
-                self._nr_instance._nr_transaction = None
-
-            else:
-                # XXX Should we be removing the transaction from the
-                # deferred object as well? Can the same deferred be
-                # called multiple times for the same request? It
-                # probably can be reregistered.
-
-                request._nr_wait_function_trace = \
-                        newrelic.api.function_trace.FunctionTrace(
-                        name='Deferred/Wait',
-                        group='Python/Twisted',
-                        source=self._nr_next_object)
-
-                request._nr_wait_function_trace.__enter__()
-                transaction.drop_transaction()
-
-            request._nr_is_deferred_callback = False
-
-class InlineGeneratorWrapper(object):
-
-    def __init__(self, wrapped, generator):
-        self._nr_wrapped = wrapped
-        self._nr_generator = generator
-
-    def __iter__(self):
-        name = newrelic.api.object_wrapper.callable_name(self._nr_wrapped)
-        iterable = iter(self._nr_generator)
-        while 1:
-            with newrelic.api.function_trace.FunctionTrace(
-                    name, group='Python/Twisted/Generator', source=self._nr_wrapped):
-                yield next(iterable)
-
-class InlineCallbacksWrapper(object):
-
-    def __init__(self, wrapped):
-        if isinstance(wrapped, tuple):
-            (instance, wrapped) = wrapped
-        else:
-            instance = None
-
-        newrelic.api.object_wrapper.update_wrapper(self, wrapped)
-
-        self._nr_instance = instance
-        self._nr_next_object = wrapped
-
-        if not hasattr(self, '_nr_last_object'):
-            self._nr_last_object = wrapped
-
-    def __get__(self, instance, klass):
-        if instance is None:
-            return self
-        descriptor = self._nr_next_object.__get__(instance, klass)
-        return self.__class__((instance, descriptor))
-
-    def __call__(self, *args, **kwargs):
-        transaction = newrelic.api.transaction.current_transaction()
-
-        if not transaction:
-            return self._nr_next_object(*args, **kwargs)
-
-        result = self._nr_next_object(*args, **kwargs)
-
-        if not result:
-            return result
-
-        return iter(InlineGeneratorWrapper(self._nr_next_object, result))
-
-def instrument_twisted_web_server(module):
-    module.Request.process = RequestProcessWrapper(module.Request.process)
-
-def instrument_twisted_web_http(module):
-    module.Request.finish = RequestFinishWrapper(module.Request.finish)
-
-def instrument_twisted_web_resource(module):
-    module.Resource.render = ResourceRenderWrapper(module.Resource.render)
-
-def instrument_twisted_internet_defer(module):
-    module.Deferred.__init__ = DeferredWrapper(module.Deferred.__init__)
-    module.Deferred._runCallbacks = DeferredCallbacksWrapper(
module.Deferred._runCallbacks) - - #_inlineCallbacks = module.inlineCallbacks - #def inlineCallbacks(f): - # return _inlineCallbacks(InlineCallbacksWrapper(f)) - #module.inlineCallbacks = inlineCallbacks diff --git a/newrelic/hooks/logger_logging.py b/newrelic/hooks/logger_logging.py index 22cdc8c78..67fb46525 100644 --- a/newrelic/hooks/logger_logging.py +++ b/newrelic/hooks/logger_logging.py @@ -15,10 +15,9 @@ from newrelic.api.application import application_instance from newrelic.api.time_trace import get_linking_metadata from newrelic.api.transaction import current_transaction, record_log_event -from newrelic.common.object_wrapper import wrap_function_wrapper, function_wrapper +from newrelic.common.object_wrapper import function_wrapper, wrap_function_wrapper from newrelic.core.config import global_settings - try: from urllib import quote except ImportError: @@ -28,7 +27,7 @@ def add_nr_linking_metadata(message): available_metadata = get_linking_metadata() entity_name = quote(available_metadata.get("entity.name", "")) - entity_guid = available_metadata.get("entity.guid", "") + entity_guid = available_metadata.get("entity.guid", "") span_id = available_metadata.get("span.id", "") trace_id = available_metadata.get("trace.id", "") hostname = available_metadata.get("hostname", "") @@ -72,7 +71,7 @@ def wrap_callHandlers(wrapped, instance, args, kwargs): if application and application.enabled: application.record_custom_metric("Logging/lines", {"count": 1}) application.record_custom_metric("Logging/lines/%s" % level_name, {"count": 1}) - + if settings.application_logging.forwarding and settings.application_logging.forwarding.enabled: try: message = record.getMessage() diff --git a/newrelic/hooks/logger_loguru.py b/newrelic/hooks/logger_loguru.py index 801a1c8cd..9e7ed3eae 100644 --- a/newrelic/hooks/logger_loguru.py +++ b/newrelic/hooks/logger_loguru.py @@ -18,15 +18,18 @@ from newrelic.api.application import application_instance from newrelic.api.transaction import current_transaction, record_log_event from newrelic.common.object_wrapper import wrap_function_wrapper +from newrelic.common.signature import bind_args from newrelic.core.config import global_settings from newrelic.hooks.logger_logging import add_nr_linking_metadata from newrelic.packages import six -_logger = logging.getLogger(__name__) +_logger = logging.getLogger(__name__) is_pypy = hasattr(sys, "pypy_version_info") + def loguru_version(): from loguru import __version__ + return tuple(int(x) for x in __version__.split(".")) @@ -54,7 +57,7 @@ def _nr_log_forwarder(message_instance): if application and application.enabled: application.record_custom_metric("Logging/lines", {"count": 1}) application.record_custom_metric("Logging/lines/%s" % level_name, {"count": 1}) - + if settings.application_logging.forwarding and settings.application_logging.forwarding.enabled: try: record_log_event(message, level_name, int(record["time"].timestamp())) @@ -64,14 +67,13 @@ def _nr_log_forwarder(message_instance): ALLOWED_LOGURU_OPTIONS_LENGTHS = frozenset((8, 9)) -def bind_log(level_id, static_level_no, from_decorator, options, message, args, kwargs): - assert len(options) in ALLOWED_LOGURU_OPTIONS_LENGTHS # Assert the options signature we expect - return level_id, static_level_no, from_decorator, list(options), message, args, kwargs - def wrap_log(wrapped, instance, args, kwargs): try: - level_id, static_level_no, from_decorator, options, message, subargs, subkwargs = bind_log(*args, **kwargs) + bound_args = bind_args(wrapped, args, kwargs) + 
options = bound_args["options"] = list(bound_args["options"]) + assert len(options) in ALLOWED_LOGURU_OPTIONS_LENGTHS # Assert the options signature we expect + options[-2] = nr_log_patcher(options[-2]) # Loguru looks into the stack trace to find the caller's module and function names. # options[1] tells loguru how far up to look in the stack trace to find the caller. @@ -87,14 +89,14 @@ def wrap_log(wrapped, instance, args, kwargs): _logger.debug("Exception in loguru handling: %s" % str(e)) return wrapped(*args, **kwargs) else: - return wrapped(level_id, static_level_no, from_decorator, options, message, subargs, subkwargs) + return wrapped(**bound_args) def nr_log_patcher(original_patcher=None): def _nr_log_patcher(record): if original_patcher: record = original_patcher(record) - + transaction = current_transaction() if transaction: diff --git a/newrelic/hooks/memcache_pylibmc.py b/newrelic/hooks/memcache_pylibmc.py deleted file mode 100644 index 190980f08..000000000 --- a/newrelic/hooks/memcache_pylibmc.py +++ /dev/null @@ -1,57 +0,0 @@ -# Copyright 2010 New Relic, Inc. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -import newrelic.api.memcache_trace - -def instrument(module): - - if hasattr(module.Client, 'add'): - newrelic.api.memcache_trace.wrap_memcache_trace( - module, 'Client.add', 'add') - if hasattr(module.Client, 'append'): - newrelic.api.memcache_trace.wrap_memcache_trace( - module, 'Client.append', 'replace') - if hasattr(module.Client, 'decr'): - newrelic.api.memcache_trace.wrap_memcache_trace( - module, 'Client.decr', 'decr') - if hasattr(module.Client, 'delete'): - newrelic.api.memcache_trace.wrap_memcache_trace( - module, 'Client.delete', 'delete') - if hasattr(module.Client, 'delete_multi'): - newrelic.api.memcache_trace.wrap_memcache_trace( - module, 'Client.delete_multi', 'delete') - if hasattr(module.Client, 'get'): - newrelic.api.memcache_trace.wrap_memcache_trace( - module, 'Client.get', 'get') - if hasattr(module.Client, 'get_multi'): - newrelic.api.memcache_trace.wrap_memcache_trace( - module, 'Client.get_multi', 'get') - if hasattr(module.Client, 'incr'): - newrelic.api.memcache_trace.wrap_memcache_trace( - module, 'Client.incr', 'incr') - if hasattr(module.Client, 'incr_multi'): - newrelic.api.memcache_trace.wrap_memcache_trace( - module, 'Client.incr_multi', 'incr') - if hasattr(module.Client, 'prepend'): - newrelic.api.memcache_trace.wrap_memcache_trace( - module, 'Client.prepend', 'replace') - if hasattr(module.Client, 'replace'): - newrelic.api.memcache_trace.wrap_memcache_trace( - module, 'Client.replace', 'replace') - if hasattr(module.Client, 'set'): - newrelic.api.memcache_trace.wrap_memcache_trace( - module, 'Client.set', 'set') - if hasattr(module.Client, 'set_multi'): - newrelic.api.memcache_trace.wrap_memcache_trace( - module, 'Client.set_multi', 'set') diff --git a/newrelic/hooks/memcache_umemcache.py b/newrelic/hooks/memcache_umemcache.py deleted file mode 100644 index 5241374e8..000000000 --- a/newrelic/hooks/memcache_umemcache.py +++ /dev/null @@ 
-1,82 +0,0 @@ -# Copyright 2010 New Relic, Inc. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -from newrelic.api.memcache_trace import memcache_trace -from newrelic.api.object_wrapper import ObjectWrapper - -class Client(ObjectWrapper): - - def __init__(self, wrapped): - super(Client, self).__init__(wrapped, None, None) - - @memcache_trace('set') - def set(self, *args, **kwargs): - return self._nr_next_object.set(*args, **kwargs) - - @memcache_trace('get') - def get(self, *args, **kwargs): - return self._nr_next_object.get(*args, **kwargs) - - @memcache_trace('get') - def gets(self, *args, **kwargs): - return self._nr_next_object.gets(*args, **kwargs) - - @memcache_trace('get') - def get_multi(self, *args, **kwargs): - return self._nr_next_object.get_multi(*args, **kwargs) - - @memcache_trace('get') - def gets_multi(self, *args, **kwargs): - return self._nr_next_object.gets_multi(*args, **kwargs) - - @memcache_trace('add') - def add(self, *args, **kwargs): - return self._nr_next_object.add(*args, **kwargs) - - @memcache_trace('replace') - def replace(self, *args, **kwargs): - return self._nr_next_object.replace(*args, **kwargs) - - @memcache_trace('replace') - def append(self, *args, **kwargs): - return self._nr_next_object.append(*args, **kwargs) - - @memcache_trace('replace') - def prepend(self, *args, **kwargs): - return self._nr_next_object.prepend(*args, **kwargs) - - @memcache_trace('delete') - def delete(self, *args, **kwargs): - return self._nr_next_object.delete(*args, **kwargs) - - @memcache_trace('replace') - def cas(self, *args, **kwargs): - return self._nr_next_object.cas(*args, **kwargs) - - @memcache_trace('incr') - def incr(self, *args, **kwargs): - return self._nr_next_object.incr(*args, **kwargs) - - @memcache_trace('decr') - def decr(self, *args, **kwargs): - return self._nr_next_object.decr(*args, **kwargs) - -def instrument(module): - - _Client = module.Client - - def _client(*args, **kwargs): - return Client(_Client(*args, **kwargs)) - - module.Client = _client diff --git a/newrelic/hooks/messagebroker_confluentkafka.py b/newrelic/hooks/messagebroker_confluentkafka.py index 965fd765b..81d9fa59a 100644 --- a/newrelic/hooks/messagebroker_confluentkafka.py +++ b/newrelic/hooks/messagebroker_confluentkafka.py @@ -22,6 +22,7 @@ from newrelic.api.time_trace import notice_error from newrelic.api.transaction import current_transaction from newrelic.common.object_wrapper import function_wrapper, wrap_function_wrapper +from newrelic.common.package_version_utils import get_package_version _logger = logging.getLogger(__name__) @@ -54,7 +55,9 @@ def wrap_Producer_produce(wrapped, instance, args, kwargs): topic = args[0] args = args[1:] else: - topic = kwargs.get("topic", None) + topic = kwargs.pop("topic", None) + + transaction.add_messagebroker_info("Confluent-Kafka", get_package_version("confluent-kafka")) with MessageTrace( library="Kafka", @@ -161,6 +164,7 @@ def wrap_Consumer_poll(wrapped, instance, args, kwargs): name = "Named/%s" % destination_name 
transaction.record_custom_metric("%s/%s/Received/Bytes" % (group, name), received_bytes) transaction.record_custom_metric("%s/%s/Received/Messages" % (group, name), message_count) + transaction.add_messagebroker_info("Confluent-Kafka", get_package_version("confluent-kafka")) return record diff --git a/newrelic/hooks/messagebroker_kafkapython.py b/newrelic/hooks/messagebroker_kafkapython.py index 697b46349..9124a16dc 100644 --- a/newrelic/hooks/messagebroker_kafkapython.py +++ b/newrelic/hooks/messagebroker_kafkapython.py @@ -26,6 +26,7 @@ function_wrapper, wrap_function_wrapper, ) +from newrelic.common.package_version_utils import get_package_version HEARTBEAT_POLL = "MessageBroker/Kafka/Heartbeat/Poll" HEARTBEAT_SENT = "MessageBroker/Kafka/Heartbeat/Sent" @@ -48,6 +49,8 @@ def wrap_KafkaProducer_send(wrapped, instance, args, kwargs): topic, value, key, headers, partition, timestamp_ms = _bind_send(*args, **kwargs) headers = list(headers) if headers else [] + transaction.add_messagebroker_info("Kafka-Python", get_package_version("kafka-python")) + with MessageTrace( library="Kafka", operation="Produce", @@ -112,6 +115,7 @@ def wrap_kafkaconsumer_next(wrapped, instance, args, kwargs): message_count = 1 transaction = current_transaction(active_only=False) + if not transaction: transaction = MessageTransaction( application=application_instance(), @@ -124,7 +128,7 @@ def wrap_kafkaconsumer_next(wrapped, instance, args, kwargs): source=wrapped, ) instance._nr_transaction = transaction - transaction.__enter__() + transaction.__enter__() # pylint: disable=C2801 # Obtain consumer client_id to send up as agent attribute if hasattr(instance, "config") and "client_id" in instance.config: @@ -143,12 +147,13 @@ def wrap_kafkaconsumer_next(wrapped, instance, args, kwargs): name = "Named/%s" % destination_name transaction.record_custom_metric("%s/%s/Received/Bytes" % (group, name), received_bytes) transaction.record_custom_metric("%s/%s/Received/Messages" % (group, name), message_count) + transaction.add_messagebroker_info("Kafka-Python", get_package_version("kafka-python")) return record def wrap_KafkaProducer_init(wrapped, instance, args, kwargs): - get_config_key = lambda key: kwargs.get(key, instance.DEFAULT_CONFIG[key]) # noqa: E731 + get_config_key = lambda key: kwargs.get(key, instance.DEFAULT_CONFIG[key]) # pylint: disable=C3001 # noqa: E731 kwargs["key_serializer"] = wrap_serializer( instance, "Serialization/Key", "MessageBroker", get_config_key("key_serializer") @@ -162,13 +167,13 @@ def wrap_KafkaProducer_init(wrapped, instance, args, kwargs): class NewRelicSerializerWrapper(ObjectProxy): def __init__(self, wrapped, serializer_name, group_prefix): - ObjectProxy.__init__.__get__(self)(wrapped) + ObjectProxy.__init__.__get__(self)(wrapped) # pylint: disable=W0231 self._nr_serializer_name = serializer_name self._nr_group_prefix = group_prefix def serialize(self, topic, object): - wrapped = self.__wrapped__.serialize + wrapped = self.__wrapped__.serialize # pylint: disable=W0622 args = (topic, object) kwargs = {} diff --git a/newrelic/hooks/messagebroker_pika.py b/newrelic/hooks/messagebroker_pika.py index 625302ba1..cecc1b934 100644 --- a/newrelic/hooks/messagebroker_pika.py +++ b/newrelic/hooks/messagebroker_pika.py @@ -17,73 +17,74 @@ import types from newrelic.api.application import application_instance -from newrelic.api.message_transaction import MessageTransaction from newrelic.api.function_trace import FunctionTraceWrapper from newrelic.api.message_trace import MessageTrace +from 
newrelic.api.message_transaction import MessageTransaction from newrelic.api.transaction import current_transaction from newrelic.common.object_names import callable_name -from newrelic.common.object_wrapper import (wrap_function_wrapper, wrap_object, - FunctionWrapper, function_wrapper, resolve_path, apply_patch) - +from newrelic.common.object_wrapper import ( + FunctionWrapper, + apply_patch, + function_wrapper, + resolve_path, + wrap_function_wrapper, + wrap_object, +) -_START_KEY = '_nr_start_time' -KWARGS_ERROR = 'Supportability/hooks/pika/kwargs_error' +_START_KEY = "_nr_start_time" +KWARGS_ERROR = "Supportability/hooks/pika/kwargs_error" -def _add_consume_rabbitmq_trace(transaction, method, properties, - nr_start_time, queue_name=None): +def _add_consume_rabbitmq_trace(transaction, method, properties, nr_start_time, queue_name=None): routing_key = None - if hasattr(method, 'routing_key'): + if hasattr(method, "routing_key"): routing_key = method.routing_key properties = properties and properties.__dict__ or {} - correlation_id = properties.get('correlation_id') - reply_to = properties.get('reply_to') - headers = properties.get('headers') + correlation_id = properties.get("correlation_id") + reply_to = properties.get("reply_to") + headers = properties.get("headers") # Do not record dt headers in the segment parameters if headers: - headers.pop( - MessageTrace.cat_id_key, None) - headers.pop( - MessageTrace.cat_transaction_key, None) - headers.pop( - MessageTrace.cat_distributed_trace_key, None) - headers.pop('traceparent', None) - headers.pop('tracestate', None) + headers.pop(MessageTrace.cat_id_key, None) + headers.pop(MessageTrace.cat_transaction_key, None) + headers.pop(MessageTrace.cat_distributed_trace_key, None) + headers.pop("traceparent", None) + headers.pop("tracestate", None) # The transaction may have started after the message was received. In this # case, the start time is reset to the true transaction start time. 
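An aside on the back-dating pattern in this hunk: the transaction start is rewound with min(nr_start_time, transaction.start_time), and the consume trace is entered and then has its start_time overwritten with the receive timestamp. A toy sketch of that idea (ToySpan is an invented stand-in, not the agent's API):

    import time

    class ToySpan(object):
        """Toy stand-in for a trace object; not the agent's real API."""
        def __enter__(self):
            self.start_time = time.time()
            return self
        def __exit__(self, exc_type, exc_value, tb):
            self.duration = time.time() - self.start_time

    received_at = time.time() - 0.25  # pretend the message arrived 250 ms ago

    span = ToySpan()
    span.__enter__()
    span.start_time = received_at  # back-date, mirroring trace.start_time = nr_start_time
    span.__exit__(None, None, None)
    assert span.duration >= 0.25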
- transaction.start_time = min(nr_start_time, - transaction.start_time) + transaction.start_time = min(nr_start_time, transaction.start_time) params = {} if routing_key is not None: - params['routing_key'] = routing_key + params["routing_key"] = routing_key if correlation_id is not None: - params['correlation_id'] = correlation_id + params["correlation_id"] = correlation_id if reply_to is not None: - params['reply_to'] = reply_to + params["reply_to"] = reply_to if headers is not None: - params['headers'] = headers + params["headers"] = headers if queue_name is not None: - params['queue_name'] = queue_name + params["queue_name"] = queue_name # create a trace starting at the time the message was received - trace = MessageTrace(library='RabbitMQ', - operation='Consume', - destination_type='Exchange', - destination_name=method.exchange or 'Default', - params=params) + trace = MessageTrace( + library="RabbitMQ", + operation="Consume", + destination_type="Exchange", + destination_name=method.exchange or "Default", + params=params, + ) trace.__enter__() trace.start_time = nr_start_time trace.__exit__(None, None, None) -def _bind_basic_publish( - exchange, routing_key, body, properties=None, *args, **kwargs): +def _bind_basic_publish(exchange, routing_key, body, properties=None, *args, **kwargs): return (exchange, routing_key, body, properties, args, kwargs) @@ -95,8 +96,7 @@ def _nr_wrapper_basic_publish(wrapped, instance, args, kwargs): from pika import BasicProperties - (exchange, routing_key, body, properties, args, kwargs) = ( - _bind_basic_publish(*args, **kwargs)) + (exchange, routing_key, body, properties, args, kwargs) = _bind_basic_publish(*args, **kwargs) properties = properties or BasicProperties() properties.headers = properties.headers or {} user_headers = properties.headers.copy() @@ -112,20 +112,22 @@ def _nr_wrapper_basic_publish(wrapped, instance, args, kwargs): params = {} if routing_key is not None: - params['routing_key'] = routing_key + params["routing_key"] = routing_key if properties.correlation_id is not None: - params['correlation_id'] = properties.correlation_id + params["correlation_id"] = properties.correlation_id if properties.reply_to is not None: - params['reply_to'] = properties.reply_to + params["reply_to"] = properties.reply_to if user_headers: - params['headers'] = user_headers - - with MessageTrace(library='RabbitMQ', - operation='Produce', - destination_type='Exchange', - destination_name=exchange or 'Default', - params=params, - source=wrapped): + params["headers"] = user_headers + + with MessageTrace( + library="RabbitMQ", + operation="Produce", + destination_type="Exchange", + destination_name=exchange or "Default", + params=params, + source=wrapped, + ): cat_headers = MessageTrace.generate_request_headers(transaction) properties.headers.update(cat_headers) return wrapped(*args, **kwargs) @@ -133,7 +135,6 @@ def _nr_wrapper_basic_publish(wrapped, instance, args, kwargs): def _wrap_Channel_get_callback(module, obj, wrap_get): def _nr_wrapper_basic_get(wrapped, instance, args, kwargs): - @function_wrapper def callback_wrapper(callback, _instance, _args, _kwargs): transaction = current_transaction() @@ -143,13 +144,11 @@ def callback_wrapper(callback, _instance, _args, _kwargs): if not _kwargs: method, properties = _args[1:3] - start_time = getattr(callback_wrapper, '_nr_start_time', None) + start_time = getattr(callback_wrapper, "_nr_start_time", None) - _add_consume_rabbitmq_trace(transaction, - method=method, - properties=properties, - 
nr_start_time=start_time, - queue_name=queue) + _add_consume_rabbitmq_trace( + transaction, method=method, properties=properties, nr_start_time=start_time, queue_name=queue + ) else: m = transaction._transaction_metrics.get(KWARGS_ERROR, 0) transaction._transaction_metrics[KWARGS_ERROR] = m + 1 @@ -172,26 +171,23 @@ def _nr_wrapper_Basic_Deliver_init_(wrapper, instance, args, kwargs): def _nr_wrap_BlockingChannel___init__(wrapped, instance, args, kwargs): ret = wrapped(*args, **kwargs) - impl = getattr(instance, '_impl', None) + impl = getattr(instance, "_impl", None) # Patch in the original basic_consume to avoid wrapping twice - if impl and hasattr(impl, '_nr_basic_consume'): + if impl and hasattr(impl, "_nr_basic_consume"): impl.basic_consume = impl.basic_consume.__wrapped__ return ret -def _wrap_basic_consume_BlockingChannel_old(wrapper, - consumer_callback, queue, *args, **kwargs): +def _wrap_basic_consume_BlockingChannel_old(wrapper, consumer_callback, queue, *args, **kwargs): args = (wrapper(consumer_callback), queue) + args return queue, args, kwargs -def _wrap_basic_consume_Channel_old(wrapper, consumer_callback, queue='', - *args, **kwargs): +def _wrap_basic_consume_Channel_old(wrapper, consumer_callback, queue="", *args, **kwargs): return queue, (wrapper(consumer_callback), queue) + args, kwargs -def _wrap_basic_consume_Channel(wrapper, queue, on_message_callback, *args, - **kwargs): +def _wrap_basic_consume_Channel(wrapper, queue, on_message_callback, *args, **kwargs): args = (queue, wrapper(on_message_callback)) + args return queue, args, kwargs @@ -201,8 +197,7 @@ def _wrap_basic_get_Channel(wrapper, queue, callback, *args, **kwargs): return queue, args, kwargs -def _wrap_basic_get_Channel_old(wrapper, callback=None, queue='', - *args, **kwargs): +def _wrap_basic_get_Channel_old(wrapper, callback=None, queue="", *args, **kwargs): if callback is not None: callback = wrapper(callback) args = (callback, queue) + args @@ -210,7 +205,6 @@ def _wrap_basic_get_Channel_old(wrapper, callback=None, queue='', def _ConsumeGeneratorWrapper(wrapped): - def wrapper(wrapped, instance, args, kwargs): def _possibly_create_traces(yielded): # This generator can be called either outside of a transaction, or @@ -245,16 +239,15 @@ def _possibly_create_traces(yielded): else: # 3. Outside of a transaction - exchange = method.exchange or 'Default' - routing_key = getattr(method, 'routing_key', None) + exchange = method.exchange or "Default" + routing_key = getattr(method, "routing_key", None) headers = None reply_to = None correlation_id = None if properties is not None: - headers = getattr(properties, 'headers', None) - reply_to = getattr(properties, 'reply_to', None) - correlation_id = getattr( - properties, 'correlation_id', None) + headers = getattr(properties, "headers", None) + reply_to = getattr(properties, "reply_to", None) + correlation_id = getattr(properties, "correlation_id", None) # Create a messagebroker task for each iteration through the # generator. This is important because it is foreseeable that @@ -262,15 +255,16 @@ def _possibly_create_traces(yielded): # many messages. 
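An aside on the per-message pattern here: each iteration of a consuming generator gets its own transaction scope, entered before the message is handed to the caller and exited when the caller comes back for the next one. A toy sketch with invented stand-ins (ToyTransaction is not the agent's MessageTransaction):

    class ToyTransaction(object):
        """Invented stand-in for a per-message transaction."""
        def __init__(self, item):
            self.item = item
            self.done = False
        def __enter__(self):
            return self
        def __exit__(self, exc_type, exc_value, tb):
            self.done = True

    def per_message_scope(gen):
        for item in gen:
            txn = ToyTransaction(item)
            txn.__enter__()
            try:
                yield item  # the caller processes the message inside the scope
            finally:
                txn.__exit__(None, None, None)

    for msg in per_message_scope(iter(["a", "b"])):
        pass  # each msg was bracketed by its own ToyTransaction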
bt = MessageTransaction( - application=application_instance(), - library='RabbitMQ', - destination_type='Exchange', - destination_name=exchange, - routing_key=routing_key, - headers=headers, - reply_to=reply_to, - correlation_id=correlation_id, - source=wrapped) + application=application_instance(), + library="RabbitMQ", + destination_type="Exchange", + destination_name=exchange, + routing_key=routing_key, + headers=headers, + reply_to=reply_to, + correlation_id=correlation_id, + source=wrapped, + ) bt.__enter__() return bt @@ -327,28 +321,25 @@ def _generator(generator): def _wrap_Channel_consume_callback(module, obj, wrap_consume): - @function_wrapper def _nr_wrapper_Channel_consume_(wrapped, channel, args, kwargs): - @function_wrapper def callback_wrapper(wrapped, instance, args, kwargs): name = callable_name(wrapped) transaction = current_transaction(active_only=False) - if transaction and (transaction.ignore_transaction or - transaction.stopped): + if transaction and (transaction.ignore_transaction or transaction.stopped): return wrapped(*args, **kwargs) elif transaction: return FunctionTraceWrapper(wrapped, name=name)(*args, **kwargs) else: - if hasattr(channel, '_nr_disable_txn_tracing'): + if hasattr(channel, "_nr_disable_txn_tracing"): return wrapped(*args, **kwargs) # Keyword arguments are unknown since this is a user # defined callback - exchange = 'Unknown' + exchange = "Unknown" routing_key = None headers = None reply_to = None @@ -356,32 +347,31 @@ def callback_wrapper(wrapped, instance, args, kwargs): unknown_kwargs = False if not kwargs: method, properties = args[1:3] - exchange = method.exchange or 'Default' - routing_key = getattr(method, 'routing_key', None) + exchange = method.exchange or "Default" + routing_key = getattr(method, "routing_key", None) if properties is not None: - headers = getattr(properties, 'headers', None) - reply_to = getattr(properties, 'reply_to', None) - correlation_id = getattr( - properties, 'correlation_id', None) + headers = getattr(properties, "headers", None) + reply_to = getattr(properties, "reply_to", None) + correlation_id = getattr(properties, "correlation_id", None) else: unknown_kwargs = True with MessageTransaction( - application=application_instance(), - library='RabbitMQ', - destination_type='Exchange', - destination_name=exchange, - routing_key=routing_key, - headers=headers, - queue_name=queue, - reply_to=reply_to, - correlation_id=correlation_id, - source=wrapped) as mt: + application=application_instance(), + library="RabbitMQ", + destination_type="Exchange", + destination_name=exchange, + routing_key=routing_key, + headers=headers, + queue_name=queue, + reply_to=reply_to, + correlation_id=correlation_id, + source=wrapped, + ) as mt: # Improve transaction naming - _new_txn_name = 'RabbitMQ/Exchange/%s/%s' % (exchange, - name) - mt.set_transaction_name(_new_txn_name, group='Message') + _new_txn_name = "RabbitMQ/Exchange/%s/%s" % (exchange, name) + mt.set_transaction_name(_new_txn_name, group="Message") # Record that something went horribly wrong if unknown_kwargs: @@ -411,35 +401,30 @@ def _disable_channel_transactions(wrapped, instance, args, kwargs): def instrument_pika_adapters(module): import pika - version = tuple(int(num) for num in pika.__version__.split('.', 1)[0]) + + version = tuple(int(num) for num in pika.__version__.split(".", 1)[0]) if version[0] < 1: wrap_consume = _wrap_basic_consume_BlockingChannel_old else: wrap_consume = _wrap_basic_consume_Channel - _wrap_Channel_consume_callback( - module.blocking_connection, 
- 'BlockingChannel.basic_consume', - wrap_consume) - wrap_function_wrapper(module.blocking_connection, - 'BlockingChannel.__init__', _nr_wrap_BlockingChannel___init__) - wrap_object(module.blocking_connection, 'BlockingChannel.consume', - _ConsumeGeneratorWrapper) + _wrap_Channel_consume_callback(module, "blocking_connection.BlockingChannel.basic_consume", wrap_consume) + wrap_function_wrapper(module, "blocking_connection.BlockingChannel.__init__", _nr_wrap_BlockingChannel___init__) + wrap_object(module, "blocking_connection.BlockingChannel.consume", _ConsumeGeneratorWrapper) - if hasattr(module, 'tornado_connection'): - wrap_function_wrapper(module.tornado_connection, - 'TornadoConnection.channel', _disable_channel_transactions) + if hasattr(module, "tornado_connection"): + wrap_function_wrapper(module, "tornado_connection.TornadoConnection.channel", _disable_channel_transactions) def instrument_pika_spec(module): - wrap_function_wrapper(module.Basic.Deliver, '__init__', - _nr_wrapper_Basic_Deliver_init_) + wrap_function_wrapper(module, "Basic.Deliver.__init__", _nr_wrapper_Basic_Deliver_init_) def instrument_pika_channel(module): import pika - version = tuple(int(num) for num in pika.__version__.split('.', 1)[0]) + + version = tuple(int(num) for num in pika.__version__.split(".", 1)[0]) if version[0] < 1: wrap_consume = _wrap_basic_consume_Channel_old @@ -448,11 +433,7 @@ def instrument_pika_channel(module): wrap_consume = _wrap_basic_consume_Channel wrap_get = _wrap_basic_get_Channel - wrap_function_wrapper(module, 'Channel.basic_publish', - _nr_wrapper_basic_publish) + wrap_function_wrapper(module, "Channel.basic_publish", _nr_wrapper_basic_publish) - _wrap_Channel_get_callback(module, 'Channel.basic_get', wrap_get) - _wrap_Channel_consume_callback( - module, - 'Channel.basic_consume', - wrap_consume) + _wrap_Channel_get_callback(module, "Channel.basic_get", wrap_get) + _wrap_Channel_consume_callback(module, "Channel.basic_consume", wrap_consume) diff --git a/newrelic/hooks/nosql_pymongo.py b/newrelic/hooks/nosql_pymongo.py deleted file mode 100644 index b2d747b28..000000000 --- a/newrelic/hooks/nosql_pymongo.py +++ /dev/null @@ -1,44 +0,0 @@ -# Copyright 2010 New Relic, Inc. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -import newrelic.api.function_trace - -_methods = ['save', 'insert', 'update', 'drop', 'remove', 'find_one', - 'find', 'count', 'create_index', 'ensure_index', 'drop_indexes', - 'drop_index', 'reindex', 'index_information', 'options', - 'group', 'rename', 'distinct', 'map_reduce', 'inline_map_reduce', - 'find_and_modify'] - -def instrument_pymongo_connection(module): - - # Must name function explicitly as pymongo overrides the - # __getattr__() method in a way that breaks introspection. 
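An aside illustrating the comment above: a toy class with a pymongo-style __getattr__ (assumed, simplified behaviour) shows why generic name introspection fails there and explicit names are passed instead:

    # Because Collection.__getattr__ returns a sub-collection for any
    # attribute name, attribute probing never raises AttributeError and
    # can silently hand back the wrong kind of object.
    class Collection(object):
        def __init__(self, name):
            self._name = name
        def __getattr__(self, attr):
            return Collection("%s.%s" % (self._name, attr))

    coll = Collection("db.users")
    print(type(coll.find_one))  # a Collection named "db.users.find_one", not a method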
- - newrelic.api.function_trace.wrap_function_trace( - module, 'Connection.__init__', - name='%s:Connection.__init__' % module.__name__) - -def instrument_pymongo_collection(module): - - # Must name function explicitly as pymongo overrides the - # __getattr__() method in a way that breaks introspection. - - for method in _methods: - if hasattr(module.Collection, method): - #newrelic.api.function_trace.wrap_function_trace( - # module, 'Collection.%s' % method, - # name=method, group='Custom/MongoDB') - newrelic.api.function_trace.wrap_function_trace( - module, 'Collection.%s' % method, - name='%s:Collection.%s' % (module.__name__, method)) diff --git a/newrelic/hooks/nosql_redis.py b/newrelic/hooks/nosql_redis.py deleted file mode 100644 index 1f099ec97..000000000 --- a/newrelic/hooks/nosql_redis.py +++ /dev/null @@ -1,62 +0,0 @@ -# Copyright 2010 New Relic, Inc. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -import newrelic.api.function_trace - -_methods_1 = ['bgrewriteaof', 'bgsave', 'config_get', 'config_set', - 'dbsize', 'debug_object', 'delete', 'echo', 'flushall', - 'flushdb', 'info', 'lastsave', 'object', 'ping', 'save', - 'shutdown', 'slaveof', 'append', 'decr', 'exists', - 'expire', 'expireat', 'get', 'getbit', 'getset', 'incr', - 'keys', 'mget', 'mset', 'msetnx', 'move', 'persist', - 'randomkey', 'rename', 'renamenx', 'set', 'setbit', - 'setex', 'setnx', 'setrange', 'strlen', 'substr', 'ttl', - 'type', 'blpop', 'brpop', 'brpoplpush', 'lindex', - 'linsert', 'llen', 'lpop', 'lpush', 'lpushx', 'lrange', - 'lrem', 'lset', 'ltrim', 'rpop', 'rpoplpush', 'rpush', - 'rpushx', 'sort', 'sadd', 'scard', 'sdiff', 'sdiffstore', - 'sinter', 'sinterstore', 'sismember', 'smembers', - 'smove', 'spop', 'srandmember', 'srem', 'sunion', - 'sunionstore', 'zadd', 'zcard', 'zcount', 'zincrby', - 'zinterstore', 'zrange', 'zrangebyscore', 'zrank', 'zrem', - 'zremrangebyrank', 'zremrangebyscore', 'zrevrange', - 'zrevrangebyscore', 'zrevrank', 'zscore', 'zunionstore', - 'hdel', 'hexists', 'hget', 'hgetall', 'hincrby', 'hkeys', - 'hlen', 'hset', 'hsetnx', 'hmset', 'hmget', 'hvals', - 'publish'] - -_methods_2 = ['setex', 'lrem', 'zadd'] - -def instrument_redis_connection(module): - - newrelic.api.function_trace.wrap_function_trace( - module, 'Connection.connect') - -def instrument_redis_client(module): - - if hasattr(module, 'StrictRedis'): - for method in _methods_1: - if hasattr(module.StrictRedis, method): - newrelic.api.function_trace.wrap_function_trace( - module, 'StrictRedis.%s' % method) - else: - for method in _methods_1: - if hasattr(module.Redis, method): - newrelic.api.function_trace.wrap_function_trace( - module, 'Redis.%s' % method) - - for method in _methods_2: - if hasattr(module.Redis, method): - newrelic.api.function_trace.wrap_function_trace( - module, 'Redis.%s' % method) diff --git a/newrelic/hooks/solr_pysolr.py b/newrelic/hooks/solr_pysolr.py deleted file mode 100644 index a4972681b..000000000 --- a/newrelic/hooks/solr_pysolr.py +++ /dev/null @@ -1,39 +0,0 @@ -# Copyright 2010 New 
Relic, Inc. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -import newrelic.api.solr_trace - -def instrument(module): - - if hasattr(module.Solr, 'search'): - newrelic.api.solr_trace.wrap_solr_trace( - module, 'Solr.search', 'pysolr', 'query') - if hasattr(module.Solr, 'more_like_this'): - newrelic.api.solr_trace.wrap_solr_trace( - module, 'Solr.more_like_this', 'pysolr', 'query') - if hasattr(module.Solr, 'suggest_terms'): - newrelic.api.solr_trace.wrap_solr_trace( - module, 'Solr.suggest_terms', 'pysolr', 'query') - if hasattr(module.Solr, 'add'): - newrelic.api.solr_trace.wrap_solr_trace( - module, 'Solr.add', 'pysolr', 'add') - if hasattr(module.Solr, 'delete'): - newrelic.api.solr_trace.wrap_solr_trace( - module, 'Solr.delete', 'pysolr', 'delete') - if hasattr(module.Solr, 'commit'): - newrelic.api.solr_trace.wrap_solr_trace( - module, 'Solr.commit', 'pysolr', 'commit') - if hasattr(module.Solr, 'optimize'): - newrelic.api.solr_trace.wrap_solr_trace( - module, 'Solr.optimize', 'pysolr', 'optimize') diff --git a/newrelic/hooks/solr_solrpy.py b/newrelic/hooks/solr_solrpy.py deleted file mode 100644 index e8b87ad47..000000000 --- a/newrelic/hooks/solr_solrpy.py +++ /dev/null @@ -1,50 +0,0 @@ -# Copyright 2010 New Relic, Inc. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. 
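An aside on the idiom used throughout these deleted solr hooks: a method is wrapped only if the installed library version actually provides it. A generic sketch under that assumption (wrap_with_trace is a hypothetical stand-in for helpers like wrap_solr_trace):

    import functools

    def wrap_with_trace(cls, method_name, operation):
        """Hypothetical stand-in for a trace-wrapping helper."""
        original = getattr(cls, method_name)
        @functools.wraps(original)
        def traced(*args, **kwargs):
            # a real hook would open a trace node named after `operation` here
            return original(*args, **kwargs)
        setattr(cls, method_name, traced)

    def instrument(cls):
        # Only wrap what the installed library version actually provides.
        for method_name, operation in (("search", "query"), ("add", "add"), ("delete", "delete")):
            if hasattr(cls, method_name):
                wrap_with_trace(cls, method_name, operation)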
- -import newrelic.api.solr_trace - -def instrument(module): - - if hasattr(module.Solr, 'delete'): - newrelic.api.solr_trace.wrap_solr_trace( - module, 'Solr.delete', 'solrpy', 'delete') - if hasattr(module.Solr, 'delete_many'): - newrelic.api.solr_trace.wrap_solr_trace( - module, 'Solr.delete_many', 'solrpy', 'delete') - if hasattr(module.Solr, 'delete_query'): - newrelic.api.solr_trace.wrap_solr_trace( - module, 'Solr.delete_query', 'solrpy', 'delete') - if hasattr(module.Solr, 'add'): - newrelic.api.solr_trace.wrap_solr_trace( - module, 'Solr.add', 'solrpy', 'add') - if hasattr(module.Solr, 'add_many'): - newrelic.api.solr_trace.wrap_solr_trace( - module, 'Solr.add_many', 'solrpy', 'add') - if hasattr(module.Solr, 'commit'): - newrelic.api.solr_trace.wrap_solr_trace( - module, 'Solr.commit', 'solrpy', 'commit') - if hasattr(module.Solr, 'optimize'): - newrelic.api.solr_trace.wrap_solr_trace( - module, 'Solr.optimize', 'solrpy', 'optimize') - - if hasattr(module.SolrConnection, 'query'): - newrelic.api.solr_trace.wrap_solr_trace( - module, 'SolrConnection.query', 'solrpy', 'query') - if hasattr(module.SolrConnection, 'raw_query'): - newrelic.api.solr_trace.wrap_solr_trace( - module, 'SolrConnection.raw_query', 'solrpy', 'query') - - if hasattr(module, 'SearchHandler'): - newrelic.api.solr_trace.wrap_solr_trace( - module, 'SearchHandler.__call__', 'solrpy', 'query') diff --git a/newrelic/packages/isort/LICENSE b/newrelic/packages/isort/LICENSE new file mode 100644 index 000000000..b5083a50d --- /dev/null +++ b/newrelic/packages/isort/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2013 Timothy Edmund Crosley + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/newrelic/packages/isort/__init__.py b/newrelic/packages/isort/__init__.py new file mode 100644 index 000000000..e69de29bb diff --git a/newrelic/packages/isort/stdlibs/__init__.py b/newrelic/packages/isort/stdlibs/__init__.py new file mode 100644 index 000000000..3394a7eda --- /dev/null +++ b/newrelic/packages/isort/stdlibs/__init__.py @@ -0,0 +1,2 @@ +from . import all as _all +from . import py2, py3, py27, py36, py37, py38, py39, py310, py311 diff --git a/newrelic/packages/isort/stdlibs/all.py b/newrelic/packages/isort/stdlibs/all.py new file mode 100644 index 000000000..08a365e19 --- /dev/null +++ b/newrelic/packages/isort/stdlibs/all.py @@ -0,0 +1,3 @@ +from . 
import py2, py3 + +stdlib = py2.stdlib | py3.stdlib diff --git a/newrelic/packages/isort/stdlibs/py2.py b/newrelic/packages/isort/stdlibs/py2.py new file mode 100644 index 000000000..74af019e4 --- /dev/null +++ b/newrelic/packages/isort/stdlibs/py2.py @@ -0,0 +1,3 @@ +from . import py27 + +stdlib = py27.stdlib diff --git a/newrelic/packages/isort/stdlibs/py27.py b/newrelic/packages/isort/stdlibs/py27.py new file mode 100644 index 000000000..a9bc99d0c --- /dev/null +++ b/newrelic/packages/isort/stdlibs/py27.py @@ -0,0 +1,301 @@ +""" +File contains the standard library of Python 2.7. + +DO NOT EDIT. If the standard library changes, a new list should be created +using the mkstdlibs.py script. +""" + +stdlib = { + "AL", + "BaseHTTPServer", + "Bastion", + "CGIHTTPServer", + "Carbon", + "ColorPicker", + "ConfigParser", + "Cookie", + "DEVICE", + "DocXMLRPCServer", + "EasyDialogs", + "FL", + "FrameWork", + "GL", + "HTMLParser", + "MacOS", + "MimeWriter", + "MiniAEFrame", + "Nav", + "PixMapWrapper", + "Queue", + "SUNAUDIODEV", + "ScrolledText", + "SimpleHTTPServer", + "SimpleXMLRPCServer", + "SocketServer", + "StringIO", + "Tix", + "Tkinter", + "UserDict", + "UserList", + "UserString", + "W", + "__builtin__", + "_ast", + "_winreg", + "abc", + "aepack", + "aetools", + "aetypes", + "aifc", + "al", + "anydbm", + "applesingle", + "argparse", + "array", + "ast", + "asynchat", + "asyncore", + "atexit", + "audioop", + "autoGIL", + "base64", + "bdb", + "binascii", + "binhex", + "bisect", + "bsddb", + "buildtools", + "bz2", + "cPickle", + "cProfile", + "cStringIO", + "calendar", + "cd", + "cfmfile", + "cgi", + "cgitb", + "chunk", + "cmath", + "cmd", + "code", + "codecs", + "codeop", + "collections", + "colorsys", + "commands", + "compileall", + "compiler", + "contextlib", + "cookielib", + "copy", + "copy_reg", + "crypt", + "csv", + "ctypes", + "curses", + "datetime", + "dbhash", + "dbm", + "decimal", + "difflib", + "dircache", + "dis", + "distutils", + "dl", + "doctest", + "dumbdbm", + "dummy_thread", + "dummy_threading", + "email", + "encodings", + "ensurepip", + "errno", + "exceptions", + "fcntl", + "filecmp", + "fileinput", + "findertools", + "fl", + "flp", + "fm", + "fnmatch", + "formatter", + "fpectl", + "fpformat", + "fractions", + "ftplib", + "functools", + "future_builtins", + "gc", + "gdbm", + "gensuitemodule", + "getopt", + "getpass", + "gettext", + "gl", + "glob", + "grp", + "gzip", + "hashlib", + "heapq", + "hmac", + "hotshot", + "htmlentitydefs", + "htmllib", + "httplib", + "ic", + "icopen", + "imageop", + "imaplib", + "imgfile", + "imghdr", + "imp", + "importlib", + "imputil", + "inspect", + "io", + "itertools", + "jpeg", + "json", + "keyword", + "lib2to3", + "linecache", + "locale", + "logging", + "macerrors", + "macostools", + "macpath", + "macresource", + "mailbox", + "mailcap", + "marshal", + "math", + "md5", + "mhlib", + "mimetools", + "mimetypes", + "mimify", + "mmap", + "modulefinder", + "msilib", + "msvcrt", + "multifile", + "multiprocessing", + "mutex", + "netrc", + "new", + "nis", + "nntplib", + "ntpath", + "numbers", + "operator", + "optparse", + "os", + "ossaudiodev", + "parser", + "pdb", + "pickle", + "pickletools", + "pipes", + "pkgutil", + "platform", + "plistlib", + "popen2", + "poplib", + "posix", + "posixfile", + "posixpath", + "pprint", + "profile", + "pstats", + "pty", + "pwd", + "py_compile", + "pyclbr", + "pydoc", + "quopri", + "random", + "re", + "readline", + "resource", + "rexec", + "rfc822", + "rlcompleter", + "robotparser", + "runpy", + "sched", + "select", + "sets", 
+ "sgmllib", + "sha", + "shelve", + "shlex", + "shutil", + "signal", + "site", + "smtpd", + "smtplib", + "sndhdr", + "socket", + "spwd", + "sqlite3", + "sre", + "sre_compile", + "sre_constants", + "sre_parse", + "ssl", + "stat", + "statvfs", + "string", + "stringprep", + "struct", + "subprocess", + "sunau", + "sunaudiodev", + "symbol", + "symtable", + "sys", + "sysconfig", + "syslog", + "tabnanny", + "tarfile", + "telnetlib", + "tempfile", + "termios", + "test", + "textwrap", + "thread", + "threading", + "time", + "timeit", + "token", + "tokenize", + "trace", + "traceback", + "ttk", + "tty", + "turtle", + "types", + "unicodedata", + "unittest", + "urllib", + "urllib2", + "urlparse", + "user", + "uu", + "uuid", + "videoreader", + "warnings", + "wave", + "weakref", + "webbrowser", + "whichdb", + "winsound", + "wsgiref", + "xdrlib", + "xml", + "xmlrpclib", + "zipfile", + "zipimport", + "zlib", +} diff --git a/newrelic/packages/isort/stdlibs/py3.py b/newrelic/packages/isort/stdlibs/py3.py new file mode 100644 index 000000000..988254385 --- /dev/null +++ b/newrelic/packages/isort/stdlibs/py3.py @@ -0,0 +1,3 @@ +from . import py36, py37, py38, py39, py310, py311 + +stdlib = py36.stdlib | py37.stdlib | py38.stdlib | py39.stdlib | py310.stdlib | py311.stdlib diff --git a/newrelic/packages/isort/stdlibs/py310.py b/newrelic/packages/isort/stdlibs/py310.py new file mode 100644 index 000000000..f45cf50a3 --- /dev/null +++ b/newrelic/packages/isort/stdlibs/py310.py @@ -0,0 +1,222 @@ +""" +File contains the standard library of Python 3.10. + +DO NOT EDIT. If the standard library changes, a new list should be created +using the mkstdlibs.py script. +""" + +stdlib = { + "_ast", + "_thread", + "abc", + "aifc", + "argparse", + "array", + "ast", + "asynchat", + "asyncio", + "asyncore", + "atexit", + "audioop", + "base64", + "bdb", + "binascii", + "binhex", + "bisect", + "builtins", + "bz2", + "cProfile", + "calendar", + "cgi", + "cgitb", + "chunk", + "cmath", + "cmd", + "code", + "codecs", + "codeop", + "collections", + "colorsys", + "compileall", + "concurrent", + "configparser", + "contextlib", + "contextvars", + "copy", + "copyreg", + "crypt", + "csv", + "ctypes", + "curses", + "dataclasses", + "datetime", + "dbm", + "decimal", + "difflib", + "dis", + "distutils", + "doctest", + "email", + "encodings", + "ensurepip", + "enum", + "errno", + "faulthandler", + "fcntl", + "filecmp", + "fileinput", + "fnmatch", + "fractions", + "ftplib", + "functools", + "gc", + "getopt", + "getpass", + "gettext", + "glob", + "graphlib", + "grp", + "gzip", + "hashlib", + "heapq", + "hmac", + "html", + "http", + "idlelib", + "imaplib", + "imghdr", + "imp", + "importlib", + "inspect", + "io", + "ipaddress", + "itertools", + "json", + "keyword", + "lib2to3", + "linecache", + "locale", + "logging", + "lzma", + "mailbox", + "mailcap", + "marshal", + "math", + "mimetypes", + "mmap", + "modulefinder", + "msilib", + "msvcrt", + "multiprocessing", + "netrc", + "nis", + "nntplib", + "ntpath", + "numbers", + "operator", + "optparse", + "os", + "ossaudiodev", + "pathlib", + "pdb", + "pickle", + "pickletools", + "pipes", + "pkgutil", + "platform", + "plistlib", + "poplib", + "posix", + "posixpath", + "pprint", + "profile", + "pstats", + "pty", + "pwd", + "py_compile", + "pyclbr", + "pydoc", + "queue", + "quopri", + "random", + "re", + "readline", + "reprlib", + "resource", + "rlcompleter", + "runpy", + "sched", + "secrets", + "select", + "selectors", + "shelve", + "shlex", + "shutil", + "signal", + "site", + "smtpd", + "smtplib", + 
"sndhdr", + "socket", + "socketserver", + "spwd", + "sqlite3", + "sre", + "sre_compile", + "sre_constants", + "sre_parse", + "ssl", + "stat", + "statistics", + "string", + "stringprep", + "struct", + "subprocess", + "sunau", + "symtable", + "sys", + "sysconfig", + "syslog", + "tabnanny", + "tarfile", + "telnetlib", + "tempfile", + "termios", + "test", + "textwrap", + "threading", + "time", + "timeit", + "tkinter", + "token", + "tokenize", + "trace", + "traceback", + "tracemalloc", + "tty", + "turtle", + "turtledemo", + "types", + "typing", + "unicodedata", + "unittest", + "urllib", + "uu", + "uuid", + "venv", + "warnings", + "wave", + "weakref", + "webbrowser", + "winreg", + "winsound", + "wsgiref", + "xdrlib", + "xml", + "xmlrpc", + "zipapp", + "zipfile", + "zipimport", + "zlib", + "zoneinfo", +} diff --git a/newrelic/packages/isort/stdlibs/py311.py b/newrelic/packages/isort/stdlibs/py311.py new file mode 100644 index 000000000..6fa42e995 --- /dev/null +++ b/newrelic/packages/isort/stdlibs/py311.py @@ -0,0 +1,222 @@ +""" +File contains the standard library of Python 3.11. + +DO NOT EDIT. If the standard library changes, a new list should be created +using the mkstdlibs.py script. +""" + +stdlib = { + "_ast", + "_thread", + "abc", + "aifc", + "argparse", + "array", + "ast", + "asynchat", + "asyncio", + "asyncore", + "atexit", + "audioop", + "base64", + "bdb", + "binascii", + "bisect", + "builtins", + "bz2", + "cProfile", + "calendar", + "cgi", + "cgitb", + "chunk", + "cmath", + "cmd", + "code", + "codecs", + "codeop", + "collections", + "colorsys", + "compileall", + "concurrent", + "configparser", + "contextlib", + "contextvars", + "copy", + "copyreg", + "crypt", + "csv", + "ctypes", + "curses", + "dataclasses", + "datetime", + "dbm", + "decimal", + "difflib", + "dis", + "distutils", + "doctest", + "email", + "encodings", + "ensurepip", + "enum", + "errno", + "faulthandler", + "fcntl", + "filecmp", + "fileinput", + "fnmatch", + "fractions", + "ftplib", + "functools", + "gc", + "getopt", + "getpass", + "gettext", + "glob", + "graphlib", + "grp", + "gzip", + "hashlib", + "heapq", + "hmac", + "html", + "http", + "idlelib", + "imaplib", + "imghdr", + "imp", + "importlib", + "inspect", + "io", + "ipaddress", + "itertools", + "json", + "keyword", + "lib2to3", + "linecache", + "locale", + "logging", + "lzma", + "mailbox", + "mailcap", + "marshal", + "math", + "mimetypes", + "mmap", + "modulefinder", + "msilib", + "msvcrt", + "multiprocessing", + "netrc", + "nis", + "nntplib", + "ntpath", + "numbers", + "operator", + "optparse", + "os", + "ossaudiodev", + "pathlib", + "pdb", + "pickle", + "pickletools", + "pipes", + "pkgutil", + "platform", + "plistlib", + "poplib", + "posix", + "posixpath", + "pprint", + "profile", + "pstats", + "pty", + "pwd", + "py_compile", + "pyclbr", + "pydoc", + "queue", + "quopri", + "random", + "re", + "readline", + "reprlib", + "resource", + "rlcompleter", + "runpy", + "sched", + "secrets", + "select", + "selectors", + "shelve", + "shlex", + "shutil", + "signal", + "site", + "smtpd", + "smtplib", + "sndhdr", + "socket", + "socketserver", + "spwd", + "sqlite3", + "sre", + "sre_compile", + "sre_constants", + "sre_parse", + "ssl", + "stat", + "statistics", + "string", + "stringprep", + "struct", + "subprocess", + "sunau", + "symtable", + "sys", + "sysconfig", + "syslog", + "tabnanny", + "tarfile", + "telnetlib", + "tempfile", + "termios", + "test", + "textwrap", + "threading", + "time", + "timeit", + "tkinter", + "token", + "tokenize", + "tomllib", + "trace", + "traceback", 
+ "tracemalloc", + "tty", + "turtle", + "turtledemo", + "types", + "typing", + "unicodedata", + "unittest", + "urllib", + "uu", + "uuid", + "venv", + "warnings", + "wave", + "weakref", + "webbrowser", + "winreg", + "winsound", + "wsgiref", + "xdrlib", + "xml", + "xmlrpc", + "zipapp", + "zipfile", + "zipimport", + "zlib", + "zoneinfo", +} diff --git a/newrelic/packages/isort/stdlibs/py36.py b/newrelic/packages/isort/stdlibs/py36.py new file mode 100644 index 000000000..59ebd24cb --- /dev/null +++ b/newrelic/packages/isort/stdlibs/py36.py @@ -0,0 +1,224 @@ +""" +File contains the standard library of Python 3.6. + +DO NOT EDIT. If the standard library changes, a new list should be created +using the mkstdlibs.py script. +""" + +stdlib = { + "_ast", + "_dummy_thread", + "_thread", + "abc", + "aifc", + "argparse", + "array", + "ast", + "asynchat", + "asyncio", + "asyncore", + "atexit", + "audioop", + "base64", + "bdb", + "binascii", + "binhex", + "bisect", + "builtins", + "bz2", + "cProfile", + "calendar", + "cgi", + "cgitb", + "chunk", + "cmath", + "cmd", + "code", + "codecs", + "codeop", + "collections", + "colorsys", + "compileall", + "concurrent", + "configparser", + "contextlib", + "copy", + "copyreg", + "crypt", + "csv", + "ctypes", + "curses", + "datetime", + "dbm", + "decimal", + "difflib", + "dis", + "distutils", + "doctest", + "dummy_threading", + "email", + "encodings", + "ensurepip", + "enum", + "errno", + "faulthandler", + "fcntl", + "filecmp", + "fileinput", + "fnmatch", + "formatter", + "fpectl", + "fractions", + "ftplib", + "functools", + "gc", + "getopt", + "getpass", + "gettext", + "glob", + "grp", + "gzip", + "hashlib", + "heapq", + "hmac", + "html", + "http", + "imaplib", + "imghdr", + "imp", + "importlib", + "inspect", + "io", + "ipaddress", + "itertools", + "json", + "keyword", + "lib2to3", + "linecache", + "locale", + "logging", + "lzma", + "macpath", + "mailbox", + "mailcap", + "marshal", + "math", + "mimetypes", + "mmap", + "modulefinder", + "msilib", + "msvcrt", + "multiprocessing", + "netrc", + "nis", + "nntplib", + "ntpath", + "numbers", + "operator", + "optparse", + "os", + "ossaudiodev", + "parser", + "pathlib", + "pdb", + "pickle", + "pickletools", + "pipes", + "pkgutil", + "platform", + "plistlib", + "poplib", + "posix", + "posixpath", + "pprint", + "profile", + "pstats", + "pty", + "pwd", + "py_compile", + "pyclbr", + "pydoc", + "queue", + "quopri", + "random", + "re", + "readline", + "reprlib", + "resource", + "rlcompleter", + "runpy", + "sched", + "secrets", + "select", + "selectors", + "shelve", + "shlex", + "shutil", + "signal", + "site", + "smtpd", + "smtplib", + "sndhdr", + "socket", + "socketserver", + "spwd", + "sqlite3", + "sre", + "sre_compile", + "sre_constants", + "sre_parse", + "ssl", + "stat", + "statistics", + "string", + "stringprep", + "struct", + "subprocess", + "sunau", + "symbol", + "symtable", + "sys", + "sysconfig", + "syslog", + "tabnanny", + "tarfile", + "telnetlib", + "tempfile", + "termios", + "test", + "textwrap", + "threading", + "time", + "timeit", + "tkinter", + "token", + "tokenize", + "trace", + "traceback", + "tracemalloc", + "tty", + "turtle", + "turtledemo", + "types", + "typing", + "unicodedata", + "unittest", + "urllib", + "uu", + "uuid", + "venv", + "warnings", + "wave", + "weakref", + "webbrowser", + "winreg", + "winsound", + "wsgiref", + "xdrlib", + "xml", + "xmlrpc", + "zipapp", + "zipfile", + "zipimport", + "zlib", +} diff --git a/newrelic/packages/isort/stdlibs/py37.py b/newrelic/packages/isort/stdlibs/py37.py new file 
mode 100644 index 000000000..e0ad1228a --- /dev/null +++ b/newrelic/packages/isort/stdlibs/py37.py @@ -0,0 +1,225 @@ +""" +File contains the standard library of Python 3.7. + +DO NOT EDIT. If the standard library changes, a new list should be created +using the mkstdlibs.py script. +""" + +stdlib = { + "_ast", + "_dummy_thread", + "_thread", + "abc", + "aifc", + "argparse", + "array", + "ast", + "asynchat", + "asyncio", + "asyncore", + "atexit", + "audioop", + "base64", + "bdb", + "binascii", + "binhex", + "bisect", + "builtins", + "bz2", + "cProfile", + "calendar", + "cgi", + "cgitb", + "chunk", + "cmath", + "cmd", + "code", + "codecs", + "codeop", + "collections", + "colorsys", + "compileall", + "concurrent", + "configparser", + "contextlib", + "contextvars", + "copy", + "copyreg", + "crypt", + "csv", + "ctypes", + "curses", + "dataclasses", + "datetime", + "dbm", + "decimal", + "difflib", + "dis", + "distutils", + "doctest", + "dummy_threading", + "email", + "encodings", + "ensurepip", + "enum", + "errno", + "faulthandler", + "fcntl", + "filecmp", + "fileinput", + "fnmatch", + "formatter", + "fractions", + "ftplib", + "functools", + "gc", + "getopt", + "getpass", + "gettext", + "glob", + "grp", + "gzip", + "hashlib", + "heapq", + "hmac", + "html", + "http", + "imaplib", + "imghdr", + "imp", + "importlib", + "inspect", + "io", + "ipaddress", + "itertools", + "json", + "keyword", + "lib2to3", + "linecache", + "locale", + "logging", + "lzma", + "macpath", + "mailbox", + "mailcap", + "marshal", + "math", + "mimetypes", + "mmap", + "modulefinder", + "msilib", + "msvcrt", + "multiprocessing", + "netrc", + "nis", + "nntplib", + "ntpath", + "numbers", + "operator", + "optparse", + "os", + "ossaudiodev", + "parser", + "pathlib", + "pdb", + "pickle", + "pickletools", + "pipes", + "pkgutil", + "platform", + "plistlib", + "poplib", + "posix", + "posixpath", + "pprint", + "profile", + "pstats", + "pty", + "pwd", + "py_compile", + "pyclbr", + "pydoc", + "queue", + "quopri", + "random", + "re", + "readline", + "reprlib", + "resource", + "rlcompleter", + "runpy", + "sched", + "secrets", + "select", + "selectors", + "shelve", + "shlex", + "shutil", + "signal", + "site", + "smtpd", + "smtplib", + "sndhdr", + "socket", + "socketserver", + "spwd", + "sqlite3", + "sre", + "sre_compile", + "sre_constants", + "sre_parse", + "ssl", + "stat", + "statistics", + "string", + "stringprep", + "struct", + "subprocess", + "sunau", + "symbol", + "symtable", + "sys", + "sysconfig", + "syslog", + "tabnanny", + "tarfile", + "telnetlib", + "tempfile", + "termios", + "test", + "textwrap", + "threading", + "time", + "timeit", + "tkinter", + "token", + "tokenize", + "trace", + "traceback", + "tracemalloc", + "tty", + "turtle", + "turtledemo", + "types", + "typing", + "unicodedata", + "unittest", + "urllib", + "uu", + "uuid", + "venv", + "warnings", + "wave", + "weakref", + "webbrowser", + "winreg", + "winsound", + "wsgiref", + "xdrlib", + "xml", + "xmlrpc", + "zipapp", + "zipfile", + "zipimport", + "zlib", +} diff --git a/newrelic/packages/isort/stdlibs/py38.py b/newrelic/packages/isort/stdlibs/py38.py new file mode 100644 index 000000000..3d89fd26b --- /dev/null +++ b/newrelic/packages/isort/stdlibs/py38.py @@ -0,0 +1,224 @@ +""" +File contains the standard library of Python 3.8. + +DO NOT EDIT. If the standard library changes, a new list should be created +using the mkstdlibs.py script. 
+""" + +stdlib = { + "_ast", + "_dummy_thread", + "_thread", + "abc", + "aifc", + "argparse", + "array", + "ast", + "asynchat", + "asyncio", + "asyncore", + "atexit", + "audioop", + "base64", + "bdb", + "binascii", + "binhex", + "bisect", + "builtins", + "bz2", + "cProfile", + "calendar", + "cgi", + "cgitb", + "chunk", + "cmath", + "cmd", + "code", + "codecs", + "codeop", + "collections", + "colorsys", + "compileall", + "concurrent", + "configparser", + "contextlib", + "contextvars", + "copy", + "copyreg", + "crypt", + "csv", + "ctypes", + "curses", + "dataclasses", + "datetime", + "dbm", + "decimal", + "difflib", + "dis", + "distutils", + "doctest", + "dummy_threading", + "email", + "encodings", + "ensurepip", + "enum", + "errno", + "faulthandler", + "fcntl", + "filecmp", + "fileinput", + "fnmatch", + "formatter", + "fractions", + "ftplib", + "functools", + "gc", + "getopt", + "getpass", + "gettext", + "glob", + "grp", + "gzip", + "hashlib", + "heapq", + "hmac", + "html", + "http", + "imaplib", + "imghdr", + "imp", + "importlib", + "inspect", + "io", + "ipaddress", + "itertools", + "json", + "keyword", + "lib2to3", + "linecache", + "locale", + "logging", + "lzma", + "mailbox", + "mailcap", + "marshal", + "math", + "mimetypes", + "mmap", + "modulefinder", + "msilib", + "msvcrt", + "multiprocessing", + "netrc", + "nis", + "nntplib", + "ntpath", + "numbers", + "operator", + "optparse", + "os", + "ossaudiodev", + "parser", + "pathlib", + "pdb", + "pickle", + "pickletools", + "pipes", + "pkgutil", + "platform", + "plistlib", + "poplib", + "posix", + "posixpath", + "pprint", + "profile", + "pstats", + "pty", + "pwd", + "py_compile", + "pyclbr", + "pydoc", + "queue", + "quopri", + "random", + "re", + "readline", + "reprlib", + "resource", + "rlcompleter", + "runpy", + "sched", + "secrets", + "select", + "selectors", + "shelve", + "shlex", + "shutil", + "signal", + "site", + "smtpd", + "smtplib", + "sndhdr", + "socket", + "socketserver", + "spwd", + "sqlite3", + "sre", + "sre_compile", + "sre_constants", + "sre_parse", + "ssl", + "stat", + "statistics", + "string", + "stringprep", + "struct", + "subprocess", + "sunau", + "symbol", + "symtable", + "sys", + "sysconfig", + "syslog", + "tabnanny", + "tarfile", + "telnetlib", + "tempfile", + "termios", + "test", + "textwrap", + "threading", + "time", + "timeit", + "tkinter", + "token", + "tokenize", + "trace", + "traceback", + "tracemalloc", + "tty", + "turtle", + "turtledemo", + "types", + "typing", + "unicodedata", + "unittest", + "urllib", + "uu", + "uuid", + "venv", + "warnings", + "wave", + "weakref", + "webbrowser", + "winreg", + "winsound", + "wsgiref", + "xdrlib", + "xml", + "xmlrpc", + "zipapp", + "zipfile", + "zipimport", + "zlib", +} diff --git a/newrelic/packages/isort/stdlibs/py39.py b/newrelic/packages/isort/stdlibs/py39.py new file mode 100644 index 000000000..4b7dd5954 --- /dev/null +++ b/newrelic/packages/isort/stdlibs/py39.py @@ -0,0 +1,224 @@ +""" +File contains the standard library of Python 3.9. + +DO NOT EDIT. If the standard library changes, a new list should be created +using the mkstdlibs.py script. 
+""" + +stdlib = { + "_ast", + "_thread", + "abc", + "aifc", + "argparse", + "array", + "ast", + "asynchat", + "asyncio", + "asyncore", + "atexit", + "audioop", + "base64", + "bdb", + "binascii", + "binhex", + "bisect", + "builtins", + "bz2", + "cProfile", + "calendar", + "cgi", + "cgitb", + "chunk", + "cmath", + "cmd", + "code", + "codecs", + "codeop", + "collections", + "colorsys", + "compileall", + "concurrent", + "configparser", + "contextlib", + "contextvars", + "copy", + "copyreg", + "crypt", + "csv", + "ctypes", + "curses", + "dataclasses", + "datetime", + "dbm", + "decimal", + "difflib", + "dis", + "distutils", + "doctest", + "email", + "encodings", + "ensurepip", + "enum", + "errno", + "faulthandler", + "fcntl", + "filecmp", + "fileinput", + "fnmatch", + "formatter", + "fractions", + "ftplib", + "functools", + "gc", + "getopt", + "getpass", + "gettext", + "glob", + "graphlib", + "grp", + "gzip", + "hashlib", + "heapq", + "hmac", + "html", + "http", + "imaplib", + "imghdr", + "imp", + "importlib", + "inspect", + "io", + "ipaddress", + "itertools", + "json", + "keyword", + "lib2to3", + "linecache", + "locale", + "logging", + "lzma", + "mailbox", + "mailcap", + "marshal", + "math", + "mimetypes", + "mmap", + "modulefinder", + "msilib", + "msvcrt", + "multiprocessing", + "netrc", + "nis", + "nntplib", + "ntpath", + "numbers", + "operator", + "optparse", + "os", + "ossaudiodev", + "parser", + "pathlib", + "pdb", + "pickle", + "pickletools", + "pipes", + "pkgutil", + "platform", + "plistlib", + "poplib", + "posix", + "posixpath", + "pprint", + "profile", + "pstats", + "pty", + "pwd", + "py_compile", + "pyclbr", + "pydoc", + "queue", + "quopri", + "random", + "re", + "readline", + "reprlib", + "resource", + "rlcompleter", + "runpy", + "sched", + "secrets", + "select", + "selectors", + "shelve", + "shlex", + "shutil", + "signal", + "site", + "smtpd", + "smtplib", + "sndhdr", + "socket", + "socketserver", + "spwd", + "sqlite3", + "sre", + "sre_compile", + "sre_constants", + "sre_parse", + "ssl", + "stat", + "statistics", + "string", + "stringprep", + "struct", + "subprocess", + "sunau", + "symbol", + "symtable", + "sys", + "sysconfig", + "syslog", + "tabnanny", + "tarfile", + "telnetlib", + "tempfile", + "termios", + "test", + "textwrap", + "threading", + "time", + "timeit", + "tkinter", + "token", + "tokenize", + "trace", + "traceback", + "tracemalloc", + "tty", + "turtle", + "turtledemo", + "types", + "typing", + "unicodedata", + "unittest", + "urllib", + "uu", + "uuid", + "venv", + "warnings", + "wave", + "weakref", + "webbrowser", + "winreg", + "winsound", + "wsgiref", + "xdrlib", + "xml", + "xmlrpc", + "zipapp", + "zipfile", + "zipimport", + "zlib", + "zoneinfo", +} diff --git a/newrelic/packages/six.py b/newrelic/packages/six.py index 8a877b174..4e15675d8 100644 --- a/newrelic/packages/six.py +++ b/newrelic/packages/six.py @@ -1,6 +1,4 @@ -"""Utilities for writing code that runs on Python 2 and 3""" - -# Copyright (c) 2010-2013 Benjamin Peterson +# Copyright (c) 2010-2020 Benjamin Peterson # # Permission is hereby granted, free of charge, to any person obtaining a copy # of this software and associated documentation files (the "Software"), to deal @@ -20,17 +18,24 @@ # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE # SOFTWARE. 
+"""Utilities for writing code that runs on Python 2 and 3""" + +from __future__ import absolute_import + +import functools +import itertools import operator import sys import types __author__ = "Benjamin Peterson " -__version__ = "1.3.0" +__version__ = "1.16.0" # Useful for very coarse version differentiation. PY2 = sys.version_info[0] == 2 PY3 = sys.version_info[0] == 3 +PY34 = sys.version_info[0:2] >= (3, 4) if PY3: string_types = str, @@ -53,6 +58,7 @@ else: # It's possible to have sizeof(long) != sizeof(Py_ssize_t). class X(object): + def __len__(self): return 1 << 31 try: @@ -65,6 +71,11 @@ def __len__(self): MAXSIZE = int((1 << 63) - 1) del X +if PY34: + from importlib.util import spec_from_loader +else: + spec_from_loader = None + def _add_doc(func, doc): """Add documentation to a function.""" @@ -84,9 +95,13 @@ def __init__(self, name): def __get__(self, obj, tp): result = self._resolve() - setattr(obj, self.name, result) - # This is a bit ugly, but it avoids running this again. - delattr(tp, self.name) + setattr(obj, self.name, result) # Invokes __set__. + try: + # This is a bit ugly, but it avoids running this again by + # removing this descriptor. + delattr(obj.__class__, self.name) + except AttributeError: + pass return result @@ -104,6 +119,27 @@ def __init__(self, name, old, new=None): def _resolve(self): return _import_module(self.mod) + def __getattr__(self, attr): + _module = self._resolve() + value = getattr(_module, attr) + setattr(self, attr, value) + return value + + +class _LazyModule(types.ModuleType): + + def __init__(self, name): + super(_LazyModule, self).__init__(name) + self.__doc__ = self.__class__.__doc__ + + def __dir__(self): + attrs = ["__doc__", "__name__"] + attrs += [attr.name for attr in self._moved_attributes] + return attrs + + # Subclasses should override this + _moved_attributes = [] + class MovedAttribute(_LazyDescr): @@ -130,34 +166,126 @@ def _resolve(self): return getattr(module, self.attr) +class _SixMetaPathImporter(object): + + """ + A meta path importer to import six.moves and its submodules. + + This class implements a PEP302 finder and loader. It should be compatible + with Python 2.5 and all existing versions of Python3 + """ + + def __init__(self, six_module_name): + self.name = six_module_name + self.known_modules = {} + + def _add_module(self, mod, *fullnames): + for fullname in fullnames: + self.known_modules[self.name + "." + fullname] = mod + + def _get_module(self, fullname): + return self.known_modules[self.name + "." + fullname] + + def find_module(self, fullname, path=None): + if fullname in self.known_modules: + return self + return None + + def find_spec(self, fullname, path, target=None): + if fullname in self.known_modules: + return spec_from_loader(fullname, self) + return None + + def __get_module(self, fullname): + try: + return self.known_modules[fullname] + except KeyError: + raise ImportError("This loader does not know module " + fullname) + + def load_module(self, fullname): + try: + # in case of a reload + return sys.modules[fullname] + except KeyError: + pass + mod = self.__get_module(fullname) + if isinstance(mod, MovedModule): + mod = mod._resolve() + else: + mod.__loader__ = self + sys.modules[fullname] = mod + return mod + + def is_package(self, fullname): + """ + Return true, if the named module is a package. 
+ + We need this method to get correct spec objects with + Python 3.4 (see PEP451) + """ + return hasattr(self.__get_module(fullname), "__path__") + + def get_code(self, fullname): + """Return None + + Required, if is_package is implemented""" + self.__get_module(fullname) # eventually raises ImportError + return None + get_source = get_code # same as get_code + + def create_module(self, spec): + return self.load_module(spec.name) + + def exec_module(self, module): + pass + +_importer = _SixMetaPathImporter(__name__) + + +class _MovedItems(_LazyModule): -class _MovedItems(types.ModuleType): """Lazy loading of moved objects""" + __path__ = [] # mark as package _moved_attributes = [ MovedAttribute("cStringIO", "cStringIO", "io", "StringIO"), MovedAttribute("filter", "itertools", "builtins", "ifilter", "filter"), + MovedAttribute("filterfalse", "itertools", "itertools", "ifilterfalse", "filterfalse"), MovedAttribute("input", "__builtin__", "builtins", "raw_input", "input"), + MovedAttribute("intern", "__builtin__", "sys"), MovedAttribute("map", "itertools", "builtins", "imap", "map"), + MovedAttribute("getcwd", "os", "os", "getcwdu", "getcwd"), + MovedAttribute("getcwdb", "os", "os", "getcwd", "getcwdb"), + MovedAttribute("getoutput", "commands", "subprocess"), MovedAttribute("range", "__builtin__", "builtins", "xrange", "range"), - MovedAttribute("reload_module", "__builtin__", "imp", "reload"), + MovedAttribute("reload_module", "__builtin__", "importlib" if PY34 else "imp", "reload"), MovedAttribute("reduce", "__builtin__", "functools"), + MovedAttribute("shlex_quote", "pipes", "shlex", "quote"), MovedAttribute("StringIO", "StringIO", "io"), + MovedAttribute("UserDict", "UserDict", "collections"), + MovedAttribute("UserList", "UserList", "collections"), + MovedAttribute("UserString", "UserString", "collections"), MovedAttribute("xrange", "__builtin__", "builtins", "xrange", "range"), MovedAttribute("zip", "itertools", "builtins", "izip", "zip"), - + MovedAttribute("zip_longest", "itertools", "itertools", "izip_longest", "zip_longest"), MovedModule("builtins", "__builtin__"), MovedModule("configparser", "ConfigParser"), + MovedModule("collections_abc", "collections", "collections.abc" if sys.version_info >= (3, 3) else "collections"), MovedModule("copyreg", "copy_reg"), + MovedModule("dbm_gnu", "gdbm", "dbm.gnu"), + MovedModule("dbm_ndbm", "dbm", "dbm.ndbm"), + MovedModule("_dummy_thread", "dummy_thread", "_dummy_thread" if sys.version_info < (3, 9) else "_thread"), MovedModule("http_cookiejar", "cookielib", "http.cookiejar"), MovedModule("http_cookies", "Cookie", "http.cookies"), MovedModule("html_entities", "htmlentitydefs", "html.entities"), MovedModule("html_parser", "HTMLParser", "html.parser"), MovedModule("http_client", "httplib", "http.client"), + MovedModule("email_mime_base", "email.MIMEBase", "email.mime.base"), + MovedModule("email_mime_image", "email.MIMEImage", "email.mime.image"), MovedModule("email_mime_multipart", "email.MIMEMultipart", "email.mime.multipart"), + MovedModule("email_mime_nonmultipart", "email.MIMENonMultipart", "email.mime.nonmultipart"), MovedModule("email_mime_text", "email.MIMEText", "email.mime.text"), - MovedModule("email_mime_base", "email.MIMEBase", "email.mime.base"), MovedModule("BaseHTTPServer", "BaseHTTPServer", "http.server"), MovedModule("CGIHTTPServer", "CGIHTTPServer", "http.server"), MovedModule("SimpleHTTPServer", "SimpleHTTPServer", "http.server"), @@ -165,12 +293,14 @@ class _MovedItems(types.ModuleType): MovedModule("queue", "Queue"), 
MovedModule("reprlib", "repr"), MovedModule("socketserver", "SocketServer"), + MovedModule("_thread", "thread", "_thread"), MovedModule("tkinter", "Tkinter"), MovedModule("tkinter_dialog", "Dialog", "tkinter.dialog"), MovedModule("tkinter_filedialog", "FileDialog", "tkinter.filedialog"), MovedModule("tkinter_scrolledtext", "ScrolledText", "tkinter.scrolledtext"), MovedModule("tkinter_simpledialog", "SimpleDialog", "tkinter.simpledialog"), MovedModule("tkinter_tix", "Tix", "tkinter.tix"), + MovedModule("tkinter_ttk", "ttk", "tkinter.ttk"), MovedModule("tkinter_constants", "Tkconstants", "tkinter.constants"), MovedModule("tkinter_dnd", "Tkdnd", "tkinter.dnd"), MovedModule("tkinter_colorchooser", "tkColorChooser", @@ -182,14 +312,199 @@ class _MovedItems(types.ModuleType): MovedModule("tkinter_messagebox", "tkMessageBox", "tkinter.messagebox"), MovedModule("tkinter_tksimpledialog", "tkSimpleDialog", "tkinter.simpledialog"), + MovedModule("urllib_parse", __name__ + ".moves.urllib_parse", "urllib.parse"), + MovedModule("urllib_error", __name__ + ".moves.urllib_error", "urllib.error"), + MovedModule("urllib", __name__ + ".moves.urllib", __name__ + ".moves.urllib"), MovedModule("urllib_robotparser", "robotparser", "urllib.robotparser"), - MovedModule("winreg", "_winreg"), + MovedModule("xmlrpc_client", "xmlrpclib", "xmlrpc.client"), + MovedModule("xmlrpc_server", "SimpleXMLRPCServer", "xmlrpc.server"), ] +# Add windows specific modules. +if sys.platform == "win32": + _moved_attributes += [ + MovedModule("winreg", "_winreg"), + ] + for attr in _moved_attributes: setattr(_MovedItems, attr.name, attr) + if isinstance(attr, MovedModule): + _importer._add_module(attr, "moves." + attr.name) +del attr + +_MovedItems._moved_attributes = _moved_attributes + +moves = _MovedItems(__name__ + ".moves") +_importer._add_module(moves, "moves") + + +class Module_six_moves_urllib_parse(_LazyModule): + + """Lazy loading of moved objects in six.moves.urllib_parse""" + + +_urllib_parse_moved_attributes = [ + MovedAttribute("ParseResult", "urlparse", "urllib.parse"), + MovedAttribute("SplitResult", "urlparse", "urllib.parse"), + MovedAttribute("parse_qs", "urlparse", "urllib.parse"), + MovedAttribute("parse_qsl", "urlparse", "urllib.parse"), + MovedAttribute("urldefrag", "urlparse", "urllib.parse"), + MovedAttribute("urljoin", "urlparse", "urllib.parse"), + MovedAttribute("urlparse", "urlparse", "urllib.parse"), + MovedAttribute("urlsplit", "urlparse", "urllib.parse"), + MovedAttribute("urlunparse", "urlparse", "urllib.parse"), + MovedAttribute("urlunsplit", "urlparse", "urllib.parse"), + MovedAttribute("quote", "urllib", "urllib.parse"), + MovedAttribute("quote_plus", "urllib", "urllib.parse"), + MovedAttribute("unquote", "urllib", "urllib.parse"), + MovedAttribute("unquote_plus", "urllib", "urllib.parse"), + MovedAttribute("unquote_to_bytes", "urllib", "urllib.parse", "unquote", "unquote_to_bytes"), + MovedAttribute("urlencode", "urllib", "urllib.parse"), + MovedAttribute("splitquery", "urllib", "urllib.parse"), + MovedAttribute("splittag", "urllib", "urllib.parse"), + MovedAttribute("splituser", "urllib", "urllib.parse"), + MovedAttribute("splitvalue", "urllib", "urllib.parse"), + MovedAttribute("uses_fragment", "urlparse", "urllib.parse"), + MovedAttribute("uses_netloc", "urlparse", "urllib.parse"), + MovedAttribute("uses_params", "urlparse", "urllib.parse"), + MovedAttribute("uses_query", "urlparse", "urllib.parse"), + MovedAttribute("uses_relative", "urlparse", "urllib.parse"), +] +for attr in 
_urllib_parse_moved_attributes: + setattr(Module_six_moves_urllib_parse, attr.name, attr) del attr -moves = sys.modules[__name__ + ".moves"] = _MovedItems("moves") +Module_six_moves_urllib_parse._moved_attributes = _urllib_parse_moved_attributes + +_importer._add_module(Module_six_moves_urllib_parse(__name__ + ".moves.urllib_parse"), + "moves.urllib_parse", "moves.urllib.parse") + + +class Module_six_moves_urllib_error(_LazyModule): + + """Lazy loading of moved objects in six.moves.urllib_error""" + + +_urllib_error_moved_attributes = [ + MovedAttribute("URLError", "urllib2", "urllib.error"), + MovedAttribute("HTTPError", "urllib2", "urllib.error"), + MovedAttribute("ContentTooShortError", "urllib", "urllib.error"), +] +for attr in _urllib_error_moved_attributes: + setattr(Module_six_moves_urllib_error, attr.name, attr) +del attr + +Module_six_moves_urllib_error._moved_attributes = _urllib_error_moved_attributes + +_importer._add_module(Module_six_moves_urllib_error(__name__ + ".moves.urllib.error"), + "moves.urllib_error", "moves.urllib.error") + + +class Module_six_moves_urllib_request(_LazyModule): + + """Lazy loading of moved objects in six.moves.urllib_request""" + + +_urllib_request_moved_attributes = [ + MovedAttribute("urlopen", "urllib2", "urllib.request"), + MovedAttribute("install_opener", "urllib2", "urllib.request"), + MovedAttribute("build_opener", "urllib2", "urllib.request"), + MovedAttribute("pathname2url", "urllib", "urllib.request"), + MovedAttribute("url2pathname", "urllib", "urllib.request"), + MovedAttribute("getproxies", "urllib", "urllib.request"), + MovedAttribute("Request", "urllib2", "urllib.request"), + MovedAttribute("OpenerDirector", "urllib2", "urllib.request"), + MovedAttribute("HTTPDefaultErrorHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPRedirectHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPCookieProcessor", "urllib2", "urllib.request"), + MovedAttribute("ProxyHandler", "urllib2", "urllib.request"), + MovedAttribute("BaseHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPPasswordMgr", "urllib2", "urllib.request"), + MovedAttribute("HTTPPasswordMgrWithDefaultRealm", "urllib2", "urllib.request"), + MovedAttribute("AbstractBasicAuthHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPBasicAuthHandler", "urllib2", "urllib.request"), + MovedAttribute("ProxyBasicAuthHandler", "urllib2", "urllib.request"), + MovedAttribute("AbstractDigestAuthHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPDigestAuthHandler", "urllib2", "urllib.request"), + MovedAttribute("ProxyDigestAuthHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPSHandler", "urllib2", "urllib.request"), + MovedAttribute("FileHandler", "urllib2", "urllib.request"), + MovedAttribute("FTPHandler", "urllib2", "urllib.request"), + MovedAttribute("CacheFTPHandler", "urllib2", "urllib.request"), + MovedAttribute("UnknownHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPErrorProcessor", "urllib2", "urllib.request"), + MovedAttribute("urlretrieve", "urllib", "urllib.request"), + MovedAttribute("urlcleanup", "urllib", "urllib.request"), + MovedAttribute("URLopener", "urllib", "urllib.request"), + MovedAttribute("FancyURLopener", "urllib", "urllib.request"), + MovedAttribute("proxy_bypass", "urllib", "urllib.request"), + MovedAttribute("parse_http_list", "urllib2", "urllib.request"), + MovedAttribute("parse_keqv_list", "urllib2", "urllib.request"), +] +for attr in 
_urllib_request_moved_attributes: + setattr(Module_six_moves_urllib_request, attr.name, attr) +del attr + +Module_six_moves_urllib_request._moved_attributes = _urllib_request_moved_attributes + +_importer._add_module(Module_six_moves_urllib_request(__name__ + ".moves.urllib.request"), + "moves.urllib_request", "moves.urllib.request") + + +class Module_six_moves_urllib_response(_LazyModule): + + """Lazy loading of moved objects in six.moves.urllib_response""" + + +_urllib_response_moved_attributes = [ + MovedAttribute("addbase", "urllib", "urllib.response"), + MovedAttribute("addclosehook", "urllib", "urllib.response"), + MovedAttribute("addinfo", "urllib", "urllib.response"), + MovedAttribute("addinfourl", "urllib", "urllib.response"), +] +for attr in _urllib_response_moved_attributes: + setattr(Module_six_moves_urllib_response, attr.name, attr) +del attr + +Module_six_moves_urllib_response._moved_attributes = _urllib_response_moved_attributes + +_importer._add_module(Module_six_moves_urllib_response(__name__ + ".moves.urllib.response"), + "moves.urllib_response", "moves.urllib.response") + + +class Module_six_moves_urllib_robotparser(_LazyModule): + + """Lazy loading of moved objects in six.moves.urllib_robotparser""" + + +_urllib_robotparser_moved_attributes = [ + MovedAttribute("RobotFileParser", "robotparser", "urllib.robotparser"), +] +for attr in _urllib_robotparser_moved_attributes: + setattr(Module_six_moves_urllib_robotparser, attr.name, attr) +del attr + +Module_six_moves_urllib_robotparser._moved_attributes = _urllib_robotparser_moved_attributes + +_importer._add_module(Module_six_moves_urllib_robotparser(__name__ + ".moves.urllib.robotparser"), + "moves.urllib_robotparser", "moves.urllib.robotparser") + + +class Module_six_moves_urllib(types.ModuleType): + + """Create a six.moves.urllib namespace that resembles the Python 3 namespace""" + __path__ = [] # mark as package + parse = _importer._get_module("moves.urllib_parse") + error = _importer._get_module("moves.urllib_error") + request = _importer._get_module("moves.urllib_request") + response = _importer._get_module("moves.urllib_response") + robotparser = _importer._get_module("moves.urllib_robotparser") + + def __dir__(self): + return ['parse', 'error', 'request', 'response', 'robotparser'] + +_importer._add_module(Module_six_moves_urllib(__name__ + ".moves.urllib"), + "moves.urllib") def add_move(move): @@ -216,11 +531,6 @@ def remove_move(name): _func_code = "__code__" _func_defaults = "__defaults__" _func_globals = "__globals__" - - _iterkeys = "keys" - _itervalues = "values" - _iteritems = "items" - _iterlists = "lists" else: _meth_func = "im_func" _meth_self = "im_self" @@ -230,11 +540,6 @@ def remove_move(name): _func_defaults = "func_defaults" _func_globals = "func_globals" - _iterkeys = "iterkeys" - _itervalues = "itervalues" - _iteritems = "iteritems" - _iterlists = "iterlists" - try: advance_iterator = next @@ -257,6 +562,9 @@ def get_unbound_function(unbound): create_bound_method = types.MethodType + def create_unbound_method(func, cls): + return func + Iterator = object else: def get_unbound_function(unbound): @@ -265,6 +573,9 @@ def get_unbound_function(unbound): def create_bound_method(func, obj): return types.MethodType(func, obj, obj.__class__) + def create_unbound_method(func, cls): + return types.MethodType(func, None, cls) + class Iterator(object): def next(self): @@ -283,73 +594,132 @@ def next(self): get_function_globals = operator.attrgetter(_func_globals) -def iterkeys(d, **kw): - """Return an iterator 
over the keys of a dictionary.""" - return iter(getattr(d, _iterkeys)(**kw)) +if PY3: + def iterkeys(d, **kw): + return iter(d.keys(**kw)) + + def itervalues(d, **kw): + return iter(d.values(**kw)) -def itervalues(d, **kw): - """Return an iterator over the values of a dictionary.""" - return iter(getattr(d, _itervalues)(**kw)) + def iteritems(d, **kw): + return iter(d.items(**kw)) -def iteritems(d, **kw): - """Return an iterator over the (key, value) pairs of a dictionary.""" - return iter(getattr(d, _iteritems)(**kw)) + def iterlists(d, **kw): + return iter(d.lists(**kw)) -def iterlists(d, **kw): - """Return an iterator over the (key, [values]) pairs of a dictionary.""" - return iter(getattr(d, _iterlists)(**kw)) + viewkeys = operator.methodcaller("keys") + + viewvalues = operator.methodcaller("values") + + viewitems = operator.methodcaller("items") +else: + def iterkeys(d, **kw): + return d.iterkeys(**kw) + + def itervalues(d, **kw): + return d.itervalues(**kw) + + def iteritems(d, **kw): + return d.iteritems(**kw) + + def iterlists(d, **kw): + return d.iterlists(**kw) + + viewkeys = operator.methodcaller("viewkeys") + + viewvalues = operator.methodcaller("viewvalues") + + viewitems = operator.methodcaller("viewitems") + +_add_doc(iterkeys, "Return an iterator over the keys of a dictionary.") +_add_doc(itervalues, "Return an iterator over the values of a dictionary.") +_add_doc(iteritems, + "Return an iterator over the (key, value) pairs of a dictionary.") +_add_doc(iterlists, + "Return an iterator over the (key, [values]) pairs of a dictionary.") if PY3: def b(s): return s.encode("latin-1") + def u(s): return s unichr = chr - if sys.version_info[1] <= 1: - def int2byte(i): - return bytes((i,)) - else: - # This is about 2x faster than the implementation above on 3.2+ - int2byte = operator.methodcaller("to_bytes", 1, "big") + import struct + int2byte = struct.Struct(">B").pack + del struct byte2int = operator.itemgetter(0) indexbytes = operator.getitem iterbytes = iter import io StringIO = io.StringIO BytesIO = io.BytesIO + del io + _assertCountEqual = "assertCountEqual" + if sys.version_info[1] <= 1: + _assertRaisesRegex = "assertRaisesRegexp" + _assertRegex = "assertRegexpMatches" + _assertNotRegex = "assertNotRegexpMatches" + else: + _assertRaisesRegex = "assertRaisesRegex" + _assertRegex = "assertRegex" + _assertNotRegex = "assertNotRegex" else: def b(s): return s + # Workaround for standalone backslash + def u(s): - return unicode(s, "unicode_escape") + return unicode(s.replace(r'\\', r'\\\\'), "unicode_escape") unichr = unichr int2byte = chr + def byte2int(bs): return ord(bs[0]) + def indexbytes(buf, i): return ord(buf[i]) - def iterbytes(buf): - return (ord(byte) for byte in buf) + iterbytes = functools.partial(itertools.imap, ord) import StringIO StringIO = BytesIO = StringIO.StringIO + _assertCountEqual = "assertItemsEqual" + _assertRaisesRegex = "assertRaisesRegexp" + _assertRegex = "assertRegexpMatches" + _assertNotRegex = "assertNotRegexpMatches" _add_doc(b, """Byte literal""") _add_doc(u, """Text literal""") -if PY3: - import builtins - exec_ = getattr(builtins, "exec") +def assertCountEqual(self, *args, **kwargs): + return getattr(self, _assertCountEqual)(*args, **kwargs) - def reraise(tp, value, tb=None): - if value.__traceback__ is not tb: - raise value.with_traceback(tb) - raise value +def assertRaisesRegex(self, *args, **kwargs): + return getattr(self, _assertRaisesRegex)(*args, **kwargs) + +def assertRegex(self, *args, **kwargs): + return getattr(self, 
_assertRegex)(*args, **kwargs) - print_ = getattr(builtins, "print") - del builtins + +def assertNotRegex(self, *args, **kwargs): + return getattr(self, _assertNotRegex)(*args, **kwargs) + + +if PY3: + exec_ = getattr(moves.builtins, "exec") + + def reraise(tp, value, tb=None): + try: + if value is None: + value = tp() + if value.__traceback__ is not tb: + raise value.with_traceback(tb) + raise value + finally: + value = None + tb = None else: def exec_(_code_, _globs_=None, _locs_=None): @@ -364,20 +734,45 @@ def exec_(_code_, _globs_=None, _locs_=None): _locs_ = _globs_ exec("""exec _code_ in _globs_, _locs_""") - exec_("""def reraise(tp, value, tb=None): - raise tp, value, tb + try: + raise tp, value, tb + finally: + tb = None +""") + + +if sys.version_info[:2] > (3,): + exec_("""def raise_from(value, from_value): + try: + raise value from from_value + finally: + value = None """) +else: + def raise_from(value, from_value): + raise value +print_ = getattr(moves.builtins, "print", None) +if print_ is None: def print_(*args, **kwargs): - """The new-style print function.""" + """The new-style print function for Python 2.4 and 2.5.""" fp = kwargs.pop("file", sys.stdout) if fp is None: return + def write(data): if not isinstance(data, basestring): data = str(data) + # If the file has an encoding, encode unicode with it. + if (isinstance(fp, file) and + isinstance(data, unicode) and + fp.encoding is not None): + errors = getattr(fp, "errors", None) + if errors is None: + errors = "strict" + data = data.encode(fp.encoding, errors) fp.write(data) want_unicode = False sep = kwargs.pop("sep", None) @@ -414,10 +809,190 @@ def write(data): write(sep) write(arg) write(end) +if sys.version_info[:2] < (3, 3): + _print = print_ + + def print_(*args, **kwargs): + fp = kwargs.get("file", sys.stdout) + flush = kwargs.pop("flush", False) + _print(*args, **kwargs) + if flush and fp is not None: + fp.flush() _add_doc(reraise, """Reraise an exception.""") +if sys.version_info[0:2] < (3, 4): + # This does exactly the same what the :func:`py3:functools.update_wrapper` + # function does on Python versions after 3.2. It sets the ``__wrapped__`` + # attribute on ``wrapper`` object and it doesn't raise an error if any of + # the attributes mentioned in ``assigned`` and ``updated`` are missing on + # ``wrapped`` object. + def _update_wrapper(wrapper, wrapped, + assigned=functools.WRAPPER_ASSIGNMENTS, + updated=functools.WRAPPER_UPDATES): + for attr in assigned: + try: + value = getattr(wrapped, attr) + except AttributeError: + continue + else: + setattr(wrapper, attr, value) + for attr in updated: + getattr(wrapper, attr).update(getattr(wrapped, attr, {})) + wrapper.__wrapped__ = wrapped + return wrapper + _update_wrapper.__doc__ = functools.update_wrapper.__doc__ + + def wraps(wrapped, assigned=functools.WRAPPER_ASSIGNMENTS, + updated=functools.WRAPPER_UPDATES): + return functools.partial(_update_wrapper, wrapped=wrapped, + assigned=assigned, updated=updated) + wraps.__doc__ = functools.wraps.__doc__ + +else: + wraps = functools.wraps + def with_metaclass(meta, *bases): """Create a base class with a metaclass.""" - return meta("NewBase", bases, {}) + # This requires a bit of explanation: the basic idea is to make a dummy + # metaclass for one level of class instantiation that replaces itself with + # the actual metaclass. 
+ class metaclass(type): + + def __new__(cls, name, this_bases, d): + if sys.version_info[:2] >= (3, 7): + # This version introduced PEP 560 that requires a bit + # of extra care (we mimic what is done by __build_class__). + resolved_bases = types.resolve_bases(bases) + if resolved_bases is not bases: + d['__orig_bases__'] = bases + else: + resolved_bases = bases + return meta(name, resolved_bases, d) + + @classmethod + def __prepare__(cls, name, this_bases): + return meta.__prepare__(name, bases) + return type.__new__(metaclass, 'temporary_class', (), {}) + + +def add_metaclass(metaclass): + """Class decorator for creating a class with a metaclass.""" + def wrapper(cls): + orig_vars = cls.__dict__.copy() + slots = orig_vars.get('__slots__') + if slots is not None: + if isinstance(slots, str): + slots = [slots] + for slots_var in slots: + orig_vars.pop(slots_var) + orig_vars.pop('__dict__', None) + orig_vars.pop('__weakref__', None) + if hasattr(cls, '__qualname__'): + orig_vars['__qualname__'] = cls.__qualname__ + return metaclass(cls.__name__, cls.__bases__, orig_vars) + return wrapper + + +def ensure_binary(s, encoding='utf-8', errors='strict'): + """Coerce **s** to six.binary_type. + + For Python 2: + - `unicode` -> encoded to `str` + - `str` -> `str` + + For Python 3: + - `str` -> encoded to `bytes` + - `bytes` -> `bytes` + """ + if isinstance(s, binary_type): + return s + if isinstance(s, text_type): + return s.encode(encoding, errors) + raise TypeError("not expecting type '%s'" % type(s)) + + +def ensure_str(s, encoding='utf-8', errors='strict'): + """Coerce *s* to `str`. + + For Python 2: + - `unicode` -> encoded to `str` + - `str` -> `str` + + For Python 3: + - `str` -> `str` + - `bytes` -> decoded to `str` + """ + # Optimization: Fast return for the common case. + if type(s) is str: + return s + if PY2 and isinstance(s, text_type): + return s.encode(encoding, errors) + elif PY3 and isinstance(s, binary_type): + return s.decode(encoding, errors) + elif not isinstance(s, (text_type, binary_type)): + raise TypeError("not expecting type '%s'" % type(s)) + return s + + +def ensure_text(s, encoding='utf-8', errors='strict'): + """Coerce *s* to six.text_type. + + For Python 2: + - `unicode` -> `unicode` + - `str` -> `unicode` + + For Python 3: + - `str` -> `str` + - `bytes` -> decoded to `str` + """ + if isinstance(s, binary_type): + return s.decode(encoding, errors) + elif isinstance(s, text_type): + return s + else: + raise TypeError("not expecting type '%s'" % type(s)) + + +def python_2_unicode_compatible(klass): + """ + A class decorator that defines __unicode__ and __str__ methods under Python 2. + Under Python 3 it does nothing. + + To support Python 2 and 3 with a single code base, define a __str__ method + returning text and apply this decorator to the class. + """ + if PY2: + if '__str__' not in klass.__dict__: + raise ValueError("@python_2_unicode_compatible cannot be applied " + "to %s because it doesn't define __str__()." % + klass.__name__) + klass.__unicode__ = klass.__str__ + klass.__str__ = lambda self: self.__unicode__().encode('utf-8') + return klass + + +# Complete the moves implementation. +# This code is at the end of this module to speed up module loading. +# Turn this module into a package. 
+__path__ = [] # required for PEP 302 and PEP 451 +__package__ = __name__ # see PEP 366 @ReservedAssignment +if globals().get("__spec__") is not None: + __spec__.submodule_search_locations = [] # PEP 451 @UndefinedVariable +# Remove other six meta path importers, since they cause problems. This can +# happen if six is removed from sys.modules and then reloaded. (Setuptools does +# this for some reason.) +if sys.meta_path: + for i, importer in enumerate(sys.meta_path): + # Here's some real nastiness: Another "instance" of the six module might + # be floating around. Therefore, we can't use isinstance() to check for + # the six meta path importer, since the other six instance will have + # inserted an importer with different class. + if (type(importer).__name__ == "_SixMetaPathImporter" and + importer.name == __name__): + del sys.meta_path[i] + break + del i, importer +# Finally, add the importer to the meta path import hook. +sys.meta_path.append(_importer) diff --git a/newrelic/packages/urllib3/LICENSE.txt b/newrelic/packages/urllib3/LICENSE.txt index c89cf27b8..429a1767e 100644 --- a/newrelic/packages/urllib3/LICENSE.txt +++ b/newrelic/packages/urllib3/LICENSE.txt @@ -1,6 +1,6 @@ MIT License -Copyright (c) 2008-2019 Andrey Petrov and contributors (see CONTRIBUTORS.txt) +Copyright (c) 2008-2020 Andrey Petrov and contributors (see CONTRIBUTORS.txt) Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal diff --git a/newrelic/packages/urllib3/__init__.py b/newrelic/packages/urllib3/__init__.py index fe86b59d7..c8c7ce691 100644 --- a/newrelic/packages/urllib3/__init__.py +++ b/newrelic/packages/urllib3/__init__.py @@ -19,6 +19,7 @@ from .util.timeout import Timeout from .util.url import get_host + __author__ = "Andrey Petrov (andrey.petrov@shazow.net)" __license__ = "MIT" __version__ = __version__ diff --git a/newrelic/packages/urllib3/_version.py b/newrelic/packages/urllib3/_version.py index 5141d980b..e12dd0e78 100644 --- a/newrelic/packages/urllib3/_version.py +++ b/newrelic/packages/urllib3/_version.py @@ -1,2 +1,2 @@ # This file is protected via CODEOWNERS -__version__ = "1.26.7" +__version__ = "1.26.15" diff --git a/newrelic/packages/urllib3/connection.py b/newrelic/packages/urllib3/connection.py index 60f70f794..54b96b191 100644 --- a/newrelic/packages/urllib3/connection.py +++ b/newrelic/packages/urllib3/connection.py @@ -51,7 +51,6 @@ class BrokenPipeError(Exception): SubjectAltNameWarning, SystemTimeWarning, ) -from .packages.ssl_match_hostname import CertificateError, match_hostname from .util import SKIP_HEADER, SKIPPABLE_HEADERS, connection from .util.ssl_ import ( assert_fingerprint, @@ -61,6 +60,7 @@ class BrokenPipeError(Exception): resolve_ssl_version, ssl_wrap_socket, ) +from .util.ssl_match_hostname import CertificateError, match_hostname log = logging.getLogger(__name__) @@ -68,7 +68,7 @@ class BrokenPipeError(Exception): # When it comes time to update this value as a part of regular maintenance # (ie test_recent_date is failing) update it to ~6 months before the current date. -RECENT_DATE = datetime.date(2020, 7, 1) +RECENT_DATE = datetime.date(2022, 1, 1) _CONTAINS_CONTROL_CHAR_RE = re.compile(r"[^-!#$%&'*+.^_`|~0-9a-zA-Z]") @@ -229,6 +229,11 @@ def putheader(self, header, *values): ) def request(self, method, url, body=None, headers=None): + # Update the inner socket's timeout value to send the request. + # This only triggers if the connection is re-used. 
+ if getattr(self, "sock", None) is not None: + self.sock.settimeout(self.timeout) + if headers is None: headers = {} else: @@ -355,17 +360,15 @@ def set_cert( def connect(self): # Add certificate verification - conn = self._new_conn() + self.sock = conn = self._new_conn() hostname = self.host tls_in_tls = False if self._is_using_tunnel(): if self.tls_in_tls_required: - conn = self._connect_tls_proxy(hostname, conn) + self.sock = conn = self._connect_tls_proxy(hostname, conn) tls_in_tls = True - self.sock = conn - # Calls self._set_hostport(), so self.host is # self._tunnel_host below. self._tunnel() diff --git a/newrelic/packages/urllib3/connectionpool.py b/newrelic/packages/urllib3/connectionpool.py index 8dccf4bc2..c23d736b1 100644 --- a/newrelic/packages/urllib3/connectionpool.py +++ b/newrelic/packages/urllib3/connectionpool.py @@ -2,6 +2,7 @@ import errno import logging +import re import socket import sys import warnings @@ -35,7 +36,6 @@ ) from .packages import six from .packages.six.moves import queue -from .packages.ssl_match_hostname import CertificateError from .request import RequestMethods from .response import HTTPResponse from .util.connection import is_connection_dropped @@ -44,6 +44,7 @@ from .util.request import set_file_position from .util.response import assert_header_parsing from .util.retry import Retry +from .util.ssl_match_hostname import CertificateError from .util.timeout import Timeout from .util.url import Url, _encode_target from .util.url import _normalize_host as normalize_host @@ -301,8 +302,11 @@ def _put_conn(self, conn): pass except queue.Full: # This should never happen if self.block == True - log.warning("Connection pool is full, discarding connection: %s", self.host) - + log.warning( + "Connection pool is full, discarding connection: %s. Connection pool size: %s", + self.host, + self.pool.qsize(), + ) # Connection never got put back into the pool, close it. if conn: conn.close() @@ -375,7 +379,7 @@ def _make_request( timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() - conn.timeout = timeout_obj.connect_timeout + conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) # Trigger any extra validation we need to do. try: @@ -745,7 +749,35 @@ def urlopen( # Discard the connection for these exceptions. It will be # replaced during the next _get_conn() call. clean_exit = False - if isinstance(e, (BaseSSLError, CertificateError)): + + def _is_ssl_error_message_from_http_proxy(ssl_error): + # We're trying to detect the message 'WRONG_VERSION_NUMBER' but + # SSLErrors are kinda all over the place when it comes to the message, + # so we try to cover our bases here! + message = " ".join(re.split("[^a-z]", str(ssl_error).lower())) + return ( + "wrong version number" in message or "unknown protocol" in message + ) + + # Try to detect a common user error with proxies which is to + # set an HTTP proxy to be HTTPS when it should be 'http://' + # (ie {'http': 'http://proxy', 'https': 'https://proxy'}) + # Instead we add a nice error message and point to a URL. + if ( + isinstance(e, BaseSSLError) + and self.proxy + and _is_ssl_error_message_from_http_proxy(e) + and conn.proxy + and conn.proxy.scheme == "https" + ): + e = ProxyError( + "Your proxy appears to only use HTTP and not HTTPS, " + "try changing your proxy URL to be HTTP. 
See: " + "https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html" + "#https-proxy-error-http-proxy", + SSLError(e), + ) + elif isinstance(e, (BaseSSLError, CertificateError)): e = SSLError(e) elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy: e = ProxyError("Cannot connect to proxy.", e) @@ -830,7 +862,7 @@ def urlopen( ) # Check if we should retry the HTTP response. - has_retry_after = bool(response.getheader("Retry-After")) + has_retry_after = bool(response.headers.get("Retry-After")) if retries.is_retry(method, response.status, has_retry_after): try: retries = retries.increment(method, url, response=response, _pool=self) diff --git a/newrelic/packages/urllib3/contrib/appengine.py b/newrelic/packages/urllib3/contrib/appengine.py index f91bdd6e7..a5a6d9103 100644 --- a/newrelic/packages/urllib3/contrib/appengine.py +++ b/newrelic/packages/urllib3/contrib/appengine.py @@ -224,7 +224,7 @@ def urlopen( ) # Check if we should retry the HTTP response. - has_retry_after = bool(http_response.getheader("Retry-After")) + has_retry_after = bool(http_response.headers.get("Retry-After")) if retries.is_retry(method, http_response.status, has_retry_after): retries = retries.increment(method, url, response=http_response, _pool=self) log.debug("Retry: %s", url) diff --git a/newrelic/packages/urllib3/contrib/ntlmpool.py b/newrelic/packages/urllib3/contrib/ntlmpool.py index 41a8fd174..471665754 100644 --- a/newrelic/packages/urllib3/contrib/ntlmpool.py +++ b/newrelic/packages/urllib3/contrib/ntlmpool.py @@ -69,7 +69,7 @@ def _new_conn(self): log.debug("Request headers: %s", headers) conn.request("GET", self.authurl, None, headers) res = conn.getresponse() - reshdr = dict(res.getheaders()) + reshdr = dict(res.headers) log.debug("Response status: %s %s", res.status, res.reason) log.debug("Response headers: %s", reshdr) log.debug("Response data: %s [...]", res.read(100)) @@ -101,7 +101,7 @@ def _new_conn(self): conn.request("GET", self.authurl, None, headers) res = conn.getresponse() log.debug("Response status: %s %s", res.status, res.reason) - log.debug("Response headers: %s", dict(res.getheaders())) + log.debug("Response headers: %s", dict(res.headers)) log.debug("Response data: %s [...]", res.read()[:100]) if res.status != 200: if res.status == 401: diff --git a/newrelic/packages/urllib3/contrib/pyopenssl.py b/newrelic/packages/urllib3/contrib/pyopenssl.py index def83afdb..1ed214b1d 100644 --- a/newrelic/packages/urllib3/contrib/pyopenssl.py +++ b/newrelic/packages/urllib3/contrib/pyopenssl.py @@ -47,10 +47,10 @@ """ from __future__ import absolute_import +import OpenSSL.crypto import OpenSSL.SSL from cryptography import x509 from cryptography.hazmat.backends.openssl import backend as openssl_backend -from cryptography.hazmat.backends.openssl.x509 import _Certificate try: from cryptography.x509 import UnsupportedExtension @@ -73,11 +73,20 @@ class UnsupportedExtension(Exception): import logging import ssl import sys +import warnings from .. import util from ..packages import six from ..util.ssl_ import PROTOCOL_TLS_CLIENT +warnings.warn( + "'urllib3.contrib.pyopenssl' module is deprecated and will be removed " + "in a future release of urllib3 2.x. Read more in this issue: " + "https://github.com/urllib3/urllib3/issues/2680", + category=DeprecationWarning, + stacklevel=2, +) + __all__ = ["inject_into_urllib3", "extract_from_urllib3"] # SNI always works. 
@@ -219,9 +228,8 @@ def get_subj_alt_name(peer_cert): if hasattr(peer_cert, "to_cryptography"): cert = peer_cert.to_cryptography() else: - # This is technically using private APIs, but should work across all - # relevant versions before PyOpenSSL got a proper API for this. - cert = _Certificate(openssl_backend, peer_cert._x509) + der = OpenSSL.crypto.dump_certificate(OpenSSL.crypto.FILETYPE_ASN1, peer_cert) + cert = x509.load_der_x509_certificate(der, openssl_backend) # We want to find the SAN extension. Ask Cryptography to locate it (it's # faster than looping in Python) @@ -406,7 +414,6 @@ def makefile(self, mode, bufsize=-1): self._makefile_refs += 1 return _fileobject(self, mode, bufsize, close=True) - else: # Platform-specific: Python 3 makefile = backport_makefile diff --git a/newrelic/packages/urllib3/contrib/securetransport.py b/newrelic/packages/urllib3/contrib/securetransport.py index 554c015fe..6c46a3b9f 100644 --- a/newrelic/packages/urllib3/contrib/securetransport.py +++ b/newrelic/packages/urllib3/contrib/securetransport.py @@ -770,7 +770,6 @@ def makefile(self, mode, bufsize=-1): self._makefile_refs += 1 return _fileobject(self, mode, bufsize, close=True) - else: # Platform-specific: Python 3 def makefile(self, mode="r", buffering=None, *args, **kwargs): diff --git a/newrelic/packages/urllib3/packages/__init__.py b/newrelic/packages/urllib3/packages/__init__.py index fce4caa65..e69de29bb 100644 --- a/newrelic/packages/urllib3/packages/__init__.py +++ b/newrelic/packages/urllib3/packages/__init__.py @@ -1,5 +0,0 @@ -from __future__ import absolute_import - -from . import ssl_match_hostname - -__all__ = ("ssl_match_hostname",) diff --git a/newrelic/packages/urllib3/packages/six.py b/newrelic/packages/urllib3/packages/six.py index ba50acb06..f099a3dcd 100644 --- a/newrelic/packages/urllib3/packages/six.py +++ b/newrelic/packages/urllib3/packages/six.py @@ -772,7 +772,6 @@ def reraise(tp, value, tb=None): value = None tb = None - else: def exec_(_code_, _globs_=None, _locs_=None): diff --git a/newrelic/packages/urllib3/packages/ssl_match_hostname/__init__.py b/newrelic/packages/urllib3/packages/ssl_match_hostname/__init__.py deleted file mode 100644 index ef3fde520..000000000 --- a/newrelic/packages/urllib3/packages/ssl_match_hostname/__init__.py +++ /dev/null @@ -1,24 +0,0 @@ -import sys - -try: - # Our match_hostname function is the same as 3.10's, so we only want to - # import the match_hostname function if it's at least that good. - # We also fallback on Python 3.10+ because our code doesn't emit - # deprecation warnings and is the same as Python 3.10 otherwise. - if sys.version_info < (3, 5) or sys.version_info >= (3, 10): - raise ImportError("Fallback to vendored code") - - from ssl import CertificateError, match_hostname -except ImportError: - try: - # Backport of the function from a pypi module - from backports.ssl_match_hostname import ( # type: ignore - CertificateError, - match_hostname, - ) - except ImportError: - # Our vendored copy - from ._implementation import CertificateError, match_hostname # type: ignore - -# Not needed, but documenting what we provide. 
-__all__ = ("CertificateError", "match_hostname") diff --git a/newrelic/packages/urllib3/poolmanager.py b/newrelic/packages/urllib3/poolmanager.py index 3a31a285b..ca4ec3411 100644 --- a/newrelic/packages/urllib3/poolmanager.py +++ b/newrelic/packages/urllib3/poolmanager.py @@ -34,6 +34,7 @@ "ca_cert_dir", "ssl_context", "key_password", + "server_hostname", ) # All known keyword arguments that could be provided to the pool manager, its diff --git a/newrelic/packages/urllib3/response.py b/newrelic/packages/urllib3/response.py index 38693f4fc..0bd13d40b 100644 --- a/newrelic/packages/urllib3/response.py +++ b/newrelic/packages/urllib3/response.py @@ -2,16 +2,22 @@ import io import logging +import sys +import warnings import zlib from contextlib import contextmanager from socket import error as SocketError from socket import timeout as SocketTimeout try: - import brotli + try: + import brotlicffi as brotli + except ImportError: + import brotli except ImportError: brotli = None +from . import util from ._collections import HTTPHeaderDict from .connection import BaseSSLError, HTTPException from .exceptions import ( @@ -478,6 +484,54 @@ def _error_catcher(self): if self._original_response and self._original_response.isclosed(): self.release_conn() + def _fp_read(self, amt): + """ + Read a response with the thought that reading the number of bytes + larger than can fit in a 32-bit int at a time via SSL in some + known cases leads to an overflow error that has to be prevented + if `amt` or `self.length_remaining` indicate that a problem may + happen. + + The known cases: + * 3.8 <= CPython < 3.9.7 because of a bug + https://github.com/urllib3/urllib3/issues/2513#issuecomment-1152559900. + * urllib3 injected with pyOpenSSL-backed SSL-support. + * CPython < 3.10 only when `amt` does not fit 32-bit int. + """ + assert self._fp + c_int_max = 2 ** 31 - 1 + if ( + ( + (amt and amt > c_int_max) + or (self.length_remaining and self.length_remaining > c_int_max) + ) + and not util.IS_SECURETRANSPORT + and (util.IS_PYOPENSSL or sys.version_info < (3, 10)) + ): + buffer = io.BytesIO() + # Besides `max_chunk_amt` being a maximum chunk size, it + # affects memory overhead of reading a response by this + # method in CPython. + # `c_int_max` equal to 2 GiB - 1 byte is the actual maximum + # chunk size that does not lead to an overflow error, but + # 256 MiB is a compromise. + max_chunk_amt = 2 ** 28 + while amt is None or amt != 0: + if amt is not None: + chunk_amt = min(amt, max_chunk_amt) + amt -= chunk_amt + else: + chunk_amt = max_chunk_amt + data = self._fp.read(chunk_amt) + if not data: + break + buffer.write(data) + del data # to reduce peak memory usage by `max_chunk_amt`. + return buffer.getvalue() + else: + # StringIO doesn't like amt=None + return self._fp.read(amt) if amt is not None else self._fp.read() + def read(self, amt=None, decode_content=None, cache_content=False): """ Similar to :meth:`http.client.HTTPResponse.read`, but with two additional @@ -510,13 +564,11 @@ def read(self, amt=None, decode_content=None, cache_content=False): fp_closed = getattr(self._fp, "closed", False) with self._error_catcher(): + data = self._fp_read(amt) if not fp_closed else b"" if amt is None: - # cStringIO doesn't like amt=None - data = self._fp.read() if not fp_closed else b"" flush_decoder = True else: cache_content = False - data = self._fp.read(amt) if not fp_closed else b"" if ( amt != 0 and not data ): # Platform-specific: Buggy versions of Python. 
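The `_fp_read` helper added above works around reads that overflow a 32-bit C int when large response bodies are drained over SSL on affected CPython versions: instead of one oversized read, the body is accumulated in bounded chunks. A stripped-down sketch of the same pattern, with the function name and chunk size chosen here for illustration:

    import io

    MAX_CHUNK = 2**28  # 256 MiB per read keeps peak memory bounded

    def read_in_chunks(fp, amt=None):
        # Never issue a single read() larger than MAX_CHUNK; collect the
        # pieces in a buffer until `amt` bytes are consumed or EOF is hit.
        buffer = io.BytesIO()
        while amt is None or amt != 0:
            chunk_amt = MAX_CHUNK if amt is None else min(amt, MAX_CHUNK)
            if amt is not None:
                amt -= chunk_amt
            data = fp.read(chunk_amt)
            if not data:
                break
            buffer.write(data)
        return buffer.getvalue()

The vendored version only takes this slow path when `amt` or the remaining content length crosses the 2**31 - 1 boundary on an affected interpreter or TLS backend; ordinary reads still go straight to self._fp.read.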
@@ -612,9 +664,21 @@ def from_httplib(ResponseCls, r, **response_kw): # Backwards-compatibility methods for http.client.HTTPResponse def getheaders(self): + warnings.warn( + "HTTPResponse.getheaders() is deprecated and will be removed " + "in urllib3 v2.1.0. Instead access HTTPResponse.headers directly.", + category=DeprecationWarning, + stacklevel=2, + ) return self.headers def getheader(self, name, default=None): + warnings.warn( + "HTTPResponse.getheader() is deprecated and will be removed " + "in urllib3 v2.1.0. Instead use HTTPResponse.headers.get(name, default).", + category=DeprecationWarning, + stacklevel=2, + ) return self.headers.get(name, default) # Backwards compatibility for http.cookiejar diff --git a/newrelic/packages/urllib3/util/connection.py b/newrelic/packages/urllib3/util/connection.py index 30b2d174d..6af1138f2 100644 --- a/newrelic/packages/urllib3/util/connection.py +++ b/newrelic/packages/urllib3/util/connection.py @@ -2,9 +2,8 @@ import socket -from ..exceptions import LocationParseError - from ..contrib import _appengine_environ +from ..exceptions import LocationParseError from ..packages import six from .wait import NoWayToWaitForSocketError, wait_for_read diff --git a/newrelic/packages/urllib3/util/request.py b/newrelic/packages/urllib3/util/request.py index 25103383e..b574b081e 100644 --- a/newrelic/packages/urllib3/util/request.py +++ b/newrelic/packages/urllib3/util/request.py @@ -14,7 +14,10 @@ ACCEPT_ENCODING = "gzip,deflate" try: - import brotli as _unused_module_brotli # noqa: F401 + try: + import brotlicffi as _unused_module_brotli # noqa: F401 + except ImportError: + import brotli as _unused_module_brotli # noqa: F401 except ImportError: pass else: diff --git a/newrelic/packages/urllib3/util/retry.py b/newrelic/packages/urllib3/util/retry.py index 0ccc767bd..2490d5e5b 100644 --- a/newrelic/packages/urllib3/util/retry.py +++ b/newrelic/packages/urllib3/util/retry.py @@ -69,6 +69,24 @@ def DEFAULT_REDIRECT_HEADERS_BLACKLIST(cls, value): ) cls.DEFAULT_REMOVE_HEADERS_ON_REDIRECT = value + @property + def BACKOFF_MAX(cls): + warnings.warn( + "Using 'Retry.BACKOFF_MAX' is deprecated and " + "will be removed in v2.0. Use 'Retry.DEFAULT_BACKOFF_MAX' instead", + DeprecationWarning, + ) + return cls.DEFAULT_BACKOFF_MAX + + @BACKOFF_MAX.setter + def BACKOFF_MAX(cls, value): + warnings.warn( + "Using 'Retry.BACKOFF_MAX' is deprecated and " + "will be removed in v2.0. Use 'Retry.DEFAULT_BACKOFF_MAX' instead", + DeprecationWarning, + ) + cls.DEFAULT_BACKOFF_MAX = value + @six.add_metaclass(_RetryMeta) class Retry(object): @@ -162,7 +180,7 @@ class Retry(object): .. warning:: - Previously this parameter was named ``method_allowlist``, that + Previously this parameter was named ``method_whitelist``, that usage is deprecated in v1.26.0 and will be removed in v2.0. :param iterable status_forcelist: @@ -181,7 +199,7 @@ class Retry(object): seconds. If the backoff_factor is 0.1, then :func:`.sleep` will sleep for [0.0s, 0.2s, 0.4s, ...] between retries. It will never be longer - than :attr:`Retry.BACKOFF_MAX`. + than :attr:`Retry.DEFAULT_BACKOFF_MAX`. By default, backoff is disabled (set to 0). @@ -220,7 +238,7 @@ class Retry(object): DEFAULT_REMOVE_HEADERS_ON_REDIRECT = frozenset(["Authorization"]) #: Maximum backoff time. 
- BACKOFF_MAX = 120 + DEFAULT_BACKOFF_MAX = 120 def __init__( self, @@ -239,23 +257,23 @@ def __init__( respect_retry_after_header=True, remove_headers_on_redirect=_Default, # TODO: Deprecated, remove in v2.0 - method_allowlist=_Default, + method_whitelist=_Default, ): - if method_allowlist is not _Default: + if method_whitelist is not _Default: if allowed_methods is not _Default: raise ValueError( "Using both 'allowed_methods' and " - "'method_allowlist' together is not allowed. " + "'method_whitelist' together is not allowed. " "Instead only use 'allowed_methods'" ) warnings.warn( - "Using 'method_allowlist' with Retry is deprecated and " + "Using 'method_whitelist' with Retry is deprecated and " "will be removed in v2.0. Use 'allowed_methods' instead", DeprecationWarning, stacklevel=2, ) - allowed_methods = method_allowlist + allowed_methods = method_whitelist if allowed_methods is _Default: allowed_methods = self.DEFAULT_ALLOWED_METHODS if remove_headers_on_redirect is _Default: @@ -302,17 +320,17 @@ def new(self, **kw): # TODO: If already given in **kw we use what's given to us # If not given we need to figure out what to pass. We decide - # based on whether our class has the 'method_allowlist' property - # and if so we pass the deprecated 'method_allowlist' otherwise + # based on whether our class has the 'method_whitelist' property + # and if so we pass the deprecated 'method_whitelist' otherwise # we use 'allowed_methods'. Remove in v2.0 - if "method_allowlist" not in kw and "allowed_methods" not in kw: - if "method_allowlist" in self.__dict__: + if "method_whitelist" not in kw and "allowed_methods" not in kw: + if "method_whitelist" in self.__dict__: warnings.warn( - "Using 'method_allowlist' with Retry is deprecated and " + "Using 'method_whitelist' with Retry is deprecated and " "will be removed in v2.0. Use 'allowed_methods' instead", DeprecationWarning, ) - params["method_allowlist"] = self.allowed_methods + params["method_whitelist"] = self.allowed_methods else: params["allowed_methods"] = self.allowed_methods @@ -348,7 +366,7 @@ def get_backoff_time(self): return 0 backoff_value = self.backoff_factor * (2 ** (consecutive_errors_len - 1)) - return min(self.BACKOFF_MAX, backoff_value) + return min(self.DEFAULT_BACKOFF_MAX, backoff_value) def parse_retry_after(self, retry_after): # Whitespace: https://tools.ietf.org/html/rfc7230#section-3.2.4 @@ -376,7 +394,7 @@ def parse_retry_after(self, retry_after): def get_retry_after(self, response): """Get the value of Retry-After in seconds.""" - retry_after = response.getheader("Retry-After") + retry_after = response.headers.get("Retry-After") if retry_after is None: return None @@ -431,15 +449,15 @@ def _is_method_retryable(self, method): """Checks if a given HTTP method should be retried upon, depending if it is included in the allowed_methods """ - # TODO: For now favor if the Retry implementation sets its own method_allowlist + # TODO: For now favor if the Retry implementation sets its own method_whitelist # property outside of our constructor to avoid breaking custom implementations. - if "method_allowlist" in self.__dict__: + if "method_whitelist" in self.__dict__: warnings.warn( - "Using 'method_allowlist' with Retry is deprecated and " + "Using 'method_whitelist' with Retry is deprecated and " "will be removed in v2.0. 
Use 'allowed_methods' instead", DeprecationWarning, ) - allowed_methods = self.method_allowlist + allowed_methods = self.method_whitelist else: allowed_methods = self.allowed_methods @@ -584,10 +602,10 @@ def __repr__(self): ).format(cls=type(self), self=self) def __getattr__(self, item): - if item == "method_allowlist": + if item == "method_whitelist": # TODO: Remove this deprecated alias in v2.0 warnings.warn( - "Using 'method_allowlist' with Retry is deprecated and " + "Using 'method_whitelist' with Retry is deprecated and " "will be removed in v2.0. Use 'allowed_methods' instead", DeprecationWarning, ) diff --git a/newrelic/packages/urllib3/packages/ssl_match_hostname/_implementation.py b/newrelic/packages/urllib3/util/ssl_match_hostname.py similarity index 92% rename from newrelic/packages/urllib3/packages/ssl_match_hostname/_implementation.py rename to newrelic/packages/urllib3/util/ssl_match_hostname.py index 689208d3c..1dd950c48 100644 --- a/newrelic/packages/urllib3/packages/ssl_match_hostname/_implementation.py +++ b/newrelic/packages/urllib3/util/ssl_match_hostname.py @@ -9,7 +9,7 @@ # ipaddress has been backported to 2.6+ in pypi. If it is installed on the # system, use it to handle IPAddress ServerAltnames (this was added in # python-3.5) otherwise only do DNS matching. This allows -# backports.ssl_match_hostname to continue to be used in Python 2.7. +# util.ssl_match_hostname to continue to be used in Python 2.7. try: import ipaddress except ImportError: @@ -78,7 +78,8 @@ def _dnsname_match(dn, hostname, max_wildcards=1): def _to_unicode(obj): if isinstance(obj, str) and sys.version_info < (3,): - obj = unicode(obj, encoding="ascii", errors="strict") + # ignored flake8 # F821 to support python 2.7 function + obj = unicode(obj, encoding="ascii", errors="strict") # noqa: F821 return obj @@ -111,11 +112,9 @@ def match_hostname(cert, hostname): try: # Divergence from upstream: ipaddress can't handle byte str host_ip = ipaddress.ip_address(_to_unicode(hostname)) - except ValueError: - # Not an IP address (common case) - host_ip = None - except UnicodeError: - # Divergence from upstream: Have to deal with ipaddress not taking + except (UnicodeError, ValueError): + # ValueError: Not an IP address (common case) + # UnicodeError: Divergence from upstream: Have to deal with ipaddress not taking # byte strings. 
addresses should be all ascii, so we consider it not # an ipaddress in this case host_ip = None @@ -123,7 +122,7 @@ def match_hostname(cert, hostname): # Divergence from upstream: Make ipaddress library optional if ipaddress is None: host_ip = None - else: + else: # Defensive raise dnsnames = [] san = cert.get("subjectAltName", ()) diff --git a/newrelic/packages/urllib3/util/timeout.py b/newrelic/packages/urllib3/util/timeout.py index ff69593b0..78e18a627 100644 --- a/newrelic/packages/urllib3/util/timeout.py +++ b/newrelic/packages/urllib3/util/timeout.py @@ -2,9 +2,8 @@ import time -# The default socket timeout, used by httplib to indicate that no timeout was -# specified by the user -from socket import _GLOBAL_DEFAULT_TIMEOUT +# The default socket timeout, used by httplib to indicate that no timeout was; specified by the user +from socket import _GLOBAL_DEFAULT_TIMEOUT, getdefaulttimeout from ..exceptions import TimeoutStateError @@ -116,6 +115,10 @@ def __repr__(self): # __str__ provided for backwards compatibility __str__ = __repr__ + @classmethod + def resolve_default_timeout(cls, timeout): + return getdefaulttimeout() if timeout is cls.DEFAULT_TIMEOUT else timeout + @classmethod def _validate_timeout(cls, value, name): """Check that a timeout attribute is valid. diff --git a/newrelic/packages/urllib3/util/url.py b/newrelic/packages/urllib3/util/url.py index 81a03da9e..e5682d3be 100644 --- a/newrelic/packages/urllib3/util/url.py +++ b/newrelic/packages/urllib3/util/url.py @@ -50,7 +50,7 @@ "(?:(?:%(hex)s:){0,6}%(hex)s)?::", ] -UNRESERVED_PAT = r"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789._!\-~" +UNRESERVED_PAT = r"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789._\-~" IPV6_PAT = "(?:" + "|".join([x % _subs for x in _variations]) + ")" ZONE_ID_PAT = "(?:%25|%)(?:[" + UNRESERVED_PAT + "]|%[a-fA-F0-9]{2})+" IPV6_ADDRZ_PAT = r"\[" + IPV6_PAT + r"(?:" + ZONE_ID_PAT + r")?\]" @@ -63,7 +63,7 @@ BRACELESS_IPV6_ADDRZ_RE = re.compile("^" + IPV6_ADDRZ_PAT[2:-2] + "$") ZONE_ID_RE = re.compile("(" + ZONE_ID_PAT + r")\]$") -_HOST_PORT_PAT = ("^(%s|%s|%s)(?::([0-9]{0,5}))?$") % ( +_HOST_PORT_PAT = ("^(%s|%s|%s)(?::0*?(|0|[1-9][0-9]{0,4}))?$") % ( REG_NAME_PAT, IPV4_PAT, IPV6_ADDRZ_PAT, @@ -279,6 +279,9 @@ def _normalize_host(host, scheme): if scheme in NORMALIZABLE_SCHEMES: is_ipv6 = IPV6_ADDRZ_RE.match(host) if is_ipv6: + # IPv6 hosts of the form 'a::b%zone' are encoded in a URL as + # such per RFC 6874: 'a::b%25zone'. Unquote the ZoneID + # separator as necessary to return a valid RFC 4007 scoped IP. match = ZONE_ID_RE.search(host) if match: start, end = match.span(1) @@ -300,7 +303,7 @@ def _normalize_host(host, scheme): def _idna_encode(name): - if name and any([ord(x) > 128 for x in name]): + if name and any(ord(x) >= 128 for x in name): try: import idna except ImportError: @@ -331,7 +334,7 @@ def parse_url(url): """ Given a url, return a parsed :class:`.Url` namedtuple. Best-effort is performed to parse incomplete urls. Fields not provided will be None. - This parser is RFC 3986 compliant. + This parser is RFC 3986 and RFC 6874 compliant. The parser logic and helper functions are based heavily on work done in the ``rfc3986`` module. 
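The `util/url.py` hunks above make `parse_url` handle RFC 6874 zone IDs: a scoped address such as `fe80::1%eth0` arrives percent-encoded as `fe80::1%25eth0`, and `_normalize_host` unquotes the separator back to a literal `%`. A hedged usage sketch against this vendored copy (the import path is this repo's; the expected values follow the comment in the hunk rather than a captured run):

from newrelic.packages.urllib3.util.url import parse_url

# '%25' is the RFC 6874 encoding of the zone-ID separator; normalization
# should hand back an RFC 4007 scoped IP with a literal '%'.
url = parse_url("http://[fe80::1%25eth0]:8080/path")
print(url.host)  # expected: [fe80::1%eth0]
print(url.port)  # expected: 8080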
diff --git a/newrelic/packages/urllib3/util/wait.py b/newrelic/packages/urllib3/util/wait.py index c280646c7..21b4590b3 100644 --- a/newrelic/packages/urllib3/util/wait.py +++ b/newrelic/packages/urllib3/util/wait.py @@ -42,7 +42,6 @@ class NoWayToWaitForSocketError(Exception): def _retry_on_intr(fn, timeout): return fn(timeout) - else: # Old and broken Pythons. def _retry_on_intr(fn, timeout): diff --git a/setup.cfg b/setup.cfg index 544bdbea3..006265c36 100644 --- a/setup.cfg +++ b/setup.cfg @@ -5,4 +5,4 @@ license_files = [flake8] max-line-length = 120 -extend-ignore = C0103,C0115,C0116,C0415,E0401,E1120,E122,E126,E127,E128,E203,E501,E722,F841,R1725,W0613,W0613,W504 +extend-ignore = E122,E126,E127,E128,E203,E501,E722,F841,W504,E731 diff --git a/setup.py b/setup.py index cdb4ac091..2b1e5191e 100644 --- a/setup.py +++ b/setup.py @@ -45,13 +45,10 @@ def newrelic_agent_guess_next_version(tag_version): version, _, _ = str(tag_version).partition("+") version_info = list(map(int, version.split("."))) - if len(version_info) < 4: + if len(version_info) < 3: return version version_info[1] += 1 - if version_info[1] % 2: - version_info[3] = 0 - else: - version_info[3] += 1 + version_info[2] = 0 return ".".join(map(str, version_info)) @@ -105,13 +102,14 @@ def build_extension(self, ext): "newrelic.hooks", "newrelic.network", "newrelic/packages", + "newrelic/packages/isort", + "newrelic/packages/isort/stdlibs", "newrelic/packages/urllib3", "newrelic/packages/urllib3/util", "newrelic/packages/urllib3/contrib", "newrelic/packages/urllib3/contrib/_securetransport", "newrelic/packages/urllib3/packages", "newrelic/packages/urllib3/packages/backports", - "newrelic/packages/urllib3/packages/ssl_match_hostname", "newrelic/packages/wrapt", "newrelic.samplers", ] @@ -124,6 +122,7 @@ def build_extension(self, ext): "Programming Language :: Python :: 3.8", "Programming Language :: Python :: 3.9", "Programming Language :: Python :: 3.10", + "Programming Language :: Python :: 3.11", "Programming Language :: Python :: Implementation :: CPython", "Programming Language :: Python :: Implementation :: PyPy", "Topic :: System :: Monitoring", @@ -134,7 +133,7 @@ def build_extension(self, ext): use_scm_version={ "version_scheme": newrelic_agent_next_version, "local_scheme": "no-local-version", - "git_describe_command": "git describe --dirty --tags --long --match *.*.*.*", + "git_describe_command": "git describe --dirty --tags --long --match *.*.*", "write_to": "newrelic/version.txt", }, setup_requires=["setuptools_scm>=3.2,<7"], @@ -177,7 +176,6 @@ def with_librt(): def run_setup(with_extensions): def _run_setup(): - # Create a local copy of kwargs, if there is no c compiler run_setup # will need to be re-run, and these arguments can not be present. 
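Under the new three-digit scheme, the `setup.py` helper above guesses the next release by bumping the minor digit and zeroing the patch digit. Restated on its own for illustration (same logic as the patched function):

def guess_next_version(tag_version):
    # e.g. "8.2.1+g0abc123" -> "8.3.0"; versions with fewer than
    # three digits are passed through unchanged.
    version, _, _ = str(tag_version).partition("+")
    version_info = list(map(int, version.split(".")))
    if len(version_info) < 3:
        return version
    version_info[1] += 1
    version_info[2] = 0
    return ".".join(map(str, version_info))


assert guess_next_version("8.2.1") == "8.3.0"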
@@ -246,7 +244,6 @@ def _run_setup(): run_setup(with_extensions=True) except BuildExtFailed: - print(75 * "*") print(WARNING) diff --git a/tests/adapter_cheroot/conftest.py b/tests/adapter_cheroot/conftest.py index 7e255783e..37d9d4df4 100644 --- a/tests/adapter_cheroot/conftest.py +++ b/tests/adapter_cheroot/conftest.py @@ -14,14 +14,8 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.adapter_cheroot', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/adapter_cheroot/test_wsgi.py b/tests/adapter_cheroot/test_wsgi.py index b59a12c2d..49858e2f5 100644 --- a/tests/adapter_cheroot/test_wsgi.py +++ b/tests/adapter_cheroot/test_wsgi.py @@ -16,7 +16,7 @@ import cheroot.wsgi import newrelic.api.transaction -from testing_support.fixtures import validate_transaction_metrics +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics def get_open_port(): diff --git a/tests/adapter_daphne/conftest.py b/tests/adapter_daphne/conftest.py index cda62f22e..3b35b2ee6 100644 --- a/tests/adapter_daphne/conftest.py +++ b/tests/adapter_daphne/conftest.py @@ -12,17 +12,8 @@ # See the License for the specific language governing permissions and # limitations under the License. -from testing_support.fixtures import ( # noqa: F401; pylint: disable=W0611 - code_coverage_fixture, - collector_agent_registration_fixture, - collector_available_fixture, -) - -_coverage_source = [ - "newrelic.hooks.adapter_daphne", -] +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { "transaction_tracer.explain_threshold": 0.0, diff --git a/tests/adapter_daphne/test_daphne.py b/tests/adapter_daphne/test_daphne.py index 4953e9a9f..e5f9dd832 100644 --- a/tests/adapter_daphne/test_daphne.py +++ b/tests/adapter_daphne/test_daphne.py @@ -21,16 +21,21 @@ from testing_support.fixtures import ( override_application_settings, raise_background_exceptions, - validate_transaction_errors, - validate_transaction_metrics, wait_for_background_threads, ) from testing_support.sample_asgi_applications import ( AppWithCall, AppWithCallRaw, simple_app_v2_raw, + simple_app_v3, ) from testing_support.util import get_open_port +from testing_support.validators.validate_transaction_errors import ( + validate_transaction_errors, +) +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) from newrelic.common.object_names import callable_name @@ -45,6 +50,10 @@ simple_app_v2_raw, marks=skip_asgi_2_unsupported, ), + pytest.param( + simple_app_v3, + marks=skip_asgi_3_unsupported, + ), pytest.param( AppWithCallRaw(), marks=skip_asgi_3_unsupported, @@ -54,7 +63,7 @@ marks=skip_asgi_3_unsupported, ), ), - ids=("raw", "class_with_call", "class_with_call_double_wrapped"), + ids=("raw", "wrapped", "class_with_call", "class_with_call_double_wrapped"), ) def app(request, server_and_port): app = request.param @@ -112,11 +121,16 @@ async def fake_app(*args, **kwargs): @override_application_settings({"transaction_name.naming_scheme": 
"framework"}) def test_daphne_200(port, app): - @validate_transaction_metrics(callable_name(app)) + @validate_transaction_metrics( + callable_name(app), + custom_metrics=[ + ("Python/Dispatcher/Daphne/%s" % daphne.__version__, 1), + ], + ) @raise_background_exceptions() @wait_for_background_threads() def response(): - return urlopen("http://localhost:%d" % port, timeout=10) + return urlopen("http://localhost:%d" % port, timeout=10) # nosec assert response().status == 200 @@ -129,7 +143,7 @@ def test_daphne_500(port, app): @wait_for_background_threads() def _test(): try: - urlopen("http://localhost:%d/exc" % port) + urlopen("http://localhost:%d/exc" % port) # nosec except HTTPError: pass diff --git a/tests/adapter_gevent/conftest.py b/tests/adapter_gevent/conftest.py index e521babe5..01dacc9e6 100644 --- a/tests/adapter_gevent/conftest.py +++ b/tests/adapter_gevent/conftest.py @@ -15,8 +15,7 @@ import pytest import webtest -from testing_support.fixtures import (collector_agent_registration_fixture, - collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/adapter_gevent/pytest.ini b/tests/adapter_gevent/pytest.ini deleted file mode 100644 index 458f898d8..000000000 --- a/tests/adapter_gevent/pytest.ini +++ /dev/null @@ -1,2 +0,0 @@ -[pytest] -usefixtures = collector_available_fixture collector_agent_registration diff --git a/tests/adapter_gunicorn/conftest.py b/tests/adapter_gunicorn/conftest.py index ca1c5fb22..228742c96 100644 --- a/tests/adapter_gunicorn/conftest.py +++ b/tests/adapter_gunicorn/conftest.py @@ -14,8 +14,7 @@ import pytest -from testing_support.fixtures import (collector_agent_registration_fixture, - collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/adapter_gunicorn/pytest.ini b/tests/adapter_gunicorn/pytest.ini deleted file mode 100644 index 458f898d8..000000000 --- a/tests/adapter_gunicorn/pytest.ini +++ /dev/null @@ -1,2 +0,0 @@ -[pytest] -usefixtures = collector_available_fixture collector_agent_registration diff --git a/tests/adapter_gunicorn/test_aiohttp_app_factory.py b/tests/adapter_gunicorn/test_aiohttp_app_factory.py index 2e0729001..dc16b1231 100644 --- a/tests/adapter_gunicorn/test_aiohttp_app_factory.py +++ b/tests/adapter_gunicorn/test_aiohttp_app_factory.py @@ -30,8 +30,8 @@ reason='aiohttp app factories were implement in 3.1') @pytest.mark.parametrize('nr_enabled', (True, False)) def test_aiohttp_app_factory(nr_enabled): - nr_admin = os.path.join(os.environ['TOX_ENVDIR'], 'bin', 'newrelic-admin') - gunicorn = os.path.join(os.environ['TOX_ENVDIR'], 'bin', 'gunicorn') + nr_admin = os.path.join(os.environ['TOX_ENV_DIR'], 'bin', 'newrelic-admin') + gunicorn = os.path.join(os.environ['TOX_ENV_DIR'], 'bin', 'gunicorn') # Restart the server if it dies during testing for _ in range(5): diff --git a/tests/adapter_gunicorn/test_asgi_app.py b/tests/adapter_gunicorn/test_asgi_app.py index 2e3445303..93e348465 100644 --- a/tests/adapter_gunicorn/test_asgi_app.py +++ b/tests/adapter_gunicorn/test_asgi_app.py @@ -27,8 +27,8 @@ @pytest.mark.parametrize('nr_enabled', (True, False)) def test_asgi_app(nr_enabled): - nr_admin = os.path.join(os.environ['TOX_ENVDIR'], 'bin', 
'newrelic-admin') - gunicorn = os.path.join(os.environ['TOX_ENVDIR'], 'bin', 'gunicorn') + nr_admin = os.path.join(os.environ['TOX_ENV_DIR'], 'bin', 'newrelic-admin') + gunicorn = os.path.join(os.environ['TOX_ENV_DIR'], 'bin', 'gunicorn') PORT = get_open_port() cmd = [gunicorn, '-b', '127.0.0.1:%d' % PORT, '--worker-class', diff --git a/tests/adapter_gunicorn/test_gaiohttp.py b/tests/adapter_gunicorn/test_gaiohttp.py index 3a421b039..9f205bad9 100644 --- a/tests/adapter_gunicorn/test_gaiohttp.py +++ b/tests/adapter_gunicorn/test_gaiohttp.py @@ -30,8 +30,8 @@ @pytest.mark.parametrize('nr_enabled', [True, False]) def test_gunicorn_gaiohttp_worker(nr_enabled): - nr_admin = os.path.join(os.environ['TOX_ENVDIR'], 'bin', 'newrelic-admin') - gunicorn = os.path.join(os.environ['TOX_ENVDIR'], 'bin', 'gunicorn') + nr_admin = os.path.join(os.environ['TOX_ENV_DIR'], 'bin', 'newrelic-admin') + gunicorn = os.path.join(os.environ['TOX_ENV_DIR'], 'bin', 'gunicorn') # Restart the server if it dies during testing for _ in range(5): diff --git a/tests/adapter_hypercorn/conftest.py b/tests/adapter_hypercorn/conftest.py index 50e8bad10..2276e9415 100644 --- a/tests/adapter_hypercorn/conftest.py +++ b/tests/adapter_hypercorn/conftest.py @@ -15,17 +15,8 @@ from testing_support.fixture.event_loop import ( # noqa: F401; pylint: disable=W0611 event_loop as loop, ) -from testing_support.fixtures import ( # noqa: F401; pylint: disable=W0611 - code_coverage_fixture, - collector_agent_registration_fixture, - collector_available_fixture, -) - -_coverage_source = [ - "newrelic.hooks.adapter_hypercorn", -] +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { "transaction_tracer.explain_threshold": 0.0, diff --git a/tests/adapter_hypercorn/test_hypercorn.py b/tests/adapter_hypercorn/test_hypercorn.py index 05bf9fdc5..8b53eee0a 100644 --- a/tests/adapter_hypercorn/test_hypercorn.py +++ b/tests/adapter_hypercorn/test_hypercorn.py @@ -22,8 +22,6 @@ from testing_support.fixtures import ( override_application_settings, raise_background_exceptions, - validate_transaction_errors, - validate_transaction_metrics, wait_for_background_threads, ) from testing_support.sample_asgi_applications import ( @@ -32,6 +30,12 @@ simple_app_v2_raw, ) from testing_support.util import get_open_port +from testing_support.validators.validate_transaction_errors import ( + validate_transaction_errors, +) +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) from newrelic.api.transaction import ignore_transaction from newrelic.common.object_names import callable_name @@ -115,7 +119,7 @@ def wait_for_port(port, retries=10): status = None for _ in range(retries): try: - status = urlopen("http://localhost:%d/ignored" % port, timeout=1).status + status = urlopen("http://localhost:%d/ignored" % port, timeout=1).status # nosec assert status == 200 return except Exception as e: @@ -128,11 +132,18 @@ def wait_for_port(port, retries=10): @override_application_settings({"transaction_name.naming_scheme": "framework"}) def test_hypercorn_200(port, app): - @validate_transaction_metrics(callable_name(app)) + hypercorn_version = pkg_resources.get_distribution("hypercorn").version + + @validate_transaction_metrics( + callable_name(app), + custom_metrics=[ + ("Python/Dispatcher/Hypercorn/%s" % hypercorn_version, 1), + ], + ) 
@raise_background_exceptions() @wait_for_background_threads() def response(): - return urlopen("http://localhost:%d" % port, timeout=10) + return urlopen("http://localhost:%d" % port, timeout=10) # nosec assert response().status == 200 @@ -145,6 +156,6 @@ def test_hypercorn_500(port, app): @wait_for_background_threads() def _test(): with pytest.raises(HTTPError): - urlopen("http://localhost:%d/exc" % port) + urlopen("http://localhost:%d/exc" % port) # nosec _test() diff --git a/tests/adapter_uvicorn/conftest.py b/tests/adapter_uvicorn/conftest.py index f44b3016d..4f2f7c2df 100644 --- a/tests/adapter_uvicorn/conftest.py +++ b/tests/adapter_uvicorn/conftest.py @@ -13,17 +13,9 @@ # limitations under the License. import pytest -from testing_support.fixtures import ( - code_coverage_fixture, - collector_agent_registration_fixture, - collector_available_fixture, -) -_coverage_source = [ - "newrelic.hooks.adapter_uvicorn", -] +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { "transaction_tracer.explain_threshold": 0.0, diff --git a/tests/adapter_uvicorn/test_uvicorn.py b/tests/adapter_uvicorn/test_uvicorn.py index e3261f4e8..93d155aa8 100644 --- a/tests/adapter_uvicorn/test_uvicorn.py +++ b/tests/adapter_uvicorn/test_uvicorn.py @@ -23,10 +23,10 @@ from testing_support.fixtures import ( override_application_settings, raise_background_exceptions, - validate_transaction_errors, - validate_transaction_metrics, wait_for_background_threads, ) +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics +from testing_support.validators.validate_transaction_errors import validate_transaction_errors from testing_support.sample_asgi_applications import ( AppWithCall, AppWithCallRaw, diff --git a/tests/adapter_waitress/_application.py b/tests/adapter_waitress/_application.py new file mode 100644 index 000000000..c3b36f0c2 --- /dev/null +++ b/tests/adapter_waitress/_application.py @@ -0,0 +1,54 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+ +from threading import Thread +from time import sleep + +from testing_support.sample_applications import ( + raise_exception_application, + raise_exception_finalize, + raise_exception_response, + simple_app_raw, +) +from testing_support.util import get_open_port + + +def sample_application(environ, start_response): + path_info = environ.get("PATH_INFO") + + if path_info.startswith("/raise-exception-application"): + return raise_exception_application(environ, start_response) + elif path_info.startswith("/raise-exception-response"): + return raise_exception_response(environ, start_response) + elif path_info.startswith("/raise-exception-finalize"): + return raise_exception_finalize(environ, start_response) + + return simple_app_raw(environ, start_response) + + +def setup_application(): + port = get_open_port() + + def run_wsgi(): + from waitress import serve + + serve(sample_application, host="127.0.0.1", port=port) + + wsgi_thread = Thread(target=run_wsgi) + wsgi_thread.daemon = True + wsgi_thread.start() + + sleep(1) + + return port diff --git a/tests/adapter_waitress/conftest.py b/tests/adapter_waitress/conftest.py new file mode 100644 index 000000000..aecbfd86d --- /dev/null +++ b/tests/adapter_waitress/conftest.py @@ -0,0 +1,40 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import pytest +import webtest +from testing_support.fixtures import ( # noqa: F401; pylint: disable=W0611 + collector_agent_registration_fixture, + collector_available_fixture, +) + +_default_settings = { + "transaction_tracer.explain_threshold": 0.0, + "transaction_tracer.transaction_threshold": 0.0, + "transaction_tracer.stack_trace_threshold": 0.0, + "debug.log_data_collector_payloads": True, + "debug.record_transaction_failure": True, +} + +collector_agent_registration = collector_agent_registration_fixture( + app_name="Python Agent Test (Waitress)", default_settings=_default_settings +) + + +@pytest.fixture(autouse=True, scope="session") +def target_application(): + import _application + + port = _application.setup_application() + return webtest.TestApp("http://localhost:%d" % port) diff --git a/tests/adapter_waitress/test_wsgi.py b/tests/adapter_waitress/test_wsgi.py new file mode 100644 index 000000000..c9fa42719 --- /dev/null +++ b/tests/adapter_waitress/test_wsgi.py @@ -0,0 +1,101 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+ + +from testing_support.fixtures import ( + override_application_settings, + raise_background_exceptions, + wait_for_background_threads, +) +from testing_support.validators.validate_transaction_errors import ( + validate_transaction_errors, +) +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) + +from newrelic.common.package_version_utils import get_package_version + +WAITRESS_VERSION = get_package_version("waitress") + + +@override_application_settings({"transaction_name.naming_scheme": "framework"}) +def test_wsgi_application_index(target_application): + @validate_transaction_metrics( + "_application:sample_application", + custom_metrics=[ + ("Python/Dispatcher/Waitress/%s" % WAITRESS_VERSION, 1), + ], + ) + @raise_background_exceptions() + @wait_for_background_threads() + def _test(): + response = target_application.get("/") + assert response.status == "200 OK" + + _test() + + +@override_application_settings({"transaction_name.naming_scheme": "framework"}) +def test_raise_exception_application(target_application): + @validate_transaction_errors(["builtins:RuntimeError"]) + @validate_transaction_metrics( + "_application:sample_application", + custom_metrics=[ + ("Python/Dispatcher/Waitress/%s" % WAITRESS_VERSION, 1), + ], + ) + @raise_background_exceptions() + @wait_for_background_threads() + def _test(): + response = target_application.get("/raise-exception-application/", status=500) + assert response.status == "500 Internal Server Error" + + _test() + + +@override_application_settings({"transaction_name.naming_scheme": "framework"}) +def test_raise_exception_response(target_application): + @validate_transaction_errors(["builtins:RuntimeError"]) + @validate_transaction_metrics( + "_application:sample_application", + custom_metrics=[ + ("Python/Dispatcher/Waitress/%s" % WAITRESS_VERSION, 1), + ], + ) + @raise_background_exceptions() + @wait_for_background_threads() + def _test(): + response = target_application.get("/raise-exception-response/", status=500) + assert response.status == "500 Internal Server Error" + + _test() + + +@override_application_settings({"transaction_name.naming_scheme": "framework"}) +def test_raise_exception_finalize(target_application): + @validate_transaction_errors(["builtins:RuntimeError"]) + @validate_transaction_metrics( + "_application:sample_application", + custom_metrics=[ + ("Python/Dispatcher/Waitress/%s" % WAITRESS_VERSION, 1), + ], + ) + @raise_background_exceptions() + @wait_for_background_threads() + def _test(): + response = target_application.get("/raise-exception-finalize/", status=500) + assert response.status == "500 Internal Server Error" + + _test() diff --git a/tests/agent_features/_test_async_coroutine_trace.py b/tests/agent_features/_test_async_coroutine_trace.py index 96cb3c7dc..51b81f5f6 100644 --- a/tests/agent_features/_test_async_coroutine_trace.py +++ b/tests/agent_features/_test_async_coroutine_trace.py @@ -14,11 +14,12 @@ import asyncio import functools +import sys import time import pytest -from testing_support.fixtures import ( - capture_transaction_metrics, +from testing_support.fixtures import capture_transaction_metrics +from testing_support.validators.validate_transaction_metrics import ( validate_transaction_metrics, ) @@ -68,6 +69,7 @@ def _test(): assert full_metrics[metric_key].total_call_time >= 0.1 +@pytest.mark.skipif(sys.version_info >= (3, 11), reason="Asyncio decorator was removed in Python 3.11+.") @pytest.mark.parametrize( "trace,metric", [ diff --git 
a/tests/agent_features/_test_code_level_metrics.py b/tests/agent_features/_test_code_level_metrics.py index 90529320d..bbe3363f4 100644 --- a/tests/agent_features/_test_code_level_metrics.py +++ b/tests/agent_features/_test_code_level_metrics.py @@ -13,11 +13,12 @@ # limitations under the License. import functools + def exercise_function(): return -class ExerciseClass(): +class ExerciseClass(object): def exercise_method(self): return @@ -30,12 +31,46 @@ def exercise_class_method(cls): return -class ExerciseClassCallable(): +class ExerciseClassCallable(object): def __call__(self): return + +def exercise_method(self): + return + + +@staticmethod +def exercise_static_method(): + return + + +@classmethod +def exercise_class_method(cls): + return + + +def __call__(self): + return + + +type_dict = { + "exercise_method": exercise_method, + "exercise_static_method": exercise_static_method, + "exercise_class_method": exercise_class_method, + "exercise_lambda": lambda: None, +} +callable_type_dict = type_dict.copy() +callable_type_dict["__call__"] = __call__ + +ExerciseTypeConstructor = type("ExerciseTypeConstructor", (object,), type_dict) +ExerciseTypeConstructorCallable = type("ExerciseTypeConstructorCallable", (object,), callable_type_dict) + + CLASS_INSTANCE = ExerciseClass() CLASS_INSTANCE_CALLABLE = ExerciseClassCallable() +TYPE_CONSTRUCTOR_CLASS_INSTANCE = ExerciseTypeConstructor() +TYPE_CONSTRUCTOR_CALLABLE_CLASS_INSTANCE = ExerciseTypeConstructorCallable() -exercise_lambda = lambda: None +exercise_lambda = lambda: None # noqa: E731 exercise_partial = functools.partial(exercise_function) diff --git a/tests/agent_features/conftest.py b/tests/agent_features/conftest.py index d3cadbd46..57263238b 100644 --- a/tests/agent_features/conftest.py +++ b/tests/agent_features/conftest.py @@ -13,25 +13,14 @@ # limitations under the License. import pytest -from testing_support.fixtures import ( # noqa: F401; pylint: disable=W0611 - code_coverage_fixture, - collector_agent_registration_fixture, - collector_available_fixture, -) + +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 from testing_support.fixtures import ( # noqa: F401; pylint: disable=W0611 newrelic_caplog as caplog, ) from newrelic.packages import six -_coverage_source = [ - "newrelic.api.transaction", - "newrelic.api.web_transaction", - "newrelic.common.coroutine", - "newrelic.api.lambda_handler", -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { "transaction_tracer.explain_threshold": 0.0, diff --git a/tests/agent_features/test_apdex_metrics.py b/tests/agent_features/test_apdex_metrics.py index e32a96e31..c150fcf7e 100644 --- a/tests/agent_features/test_apdex_metrics.py +++ b/tests/agent_features/test_apdex_metrics.py @@ -13,24 +13,41 @@ # limitations under the License. import webtest - -from testing_support.validators.validate_apdex_metrics import ( - validate_apdex_metrics) from testing_support.sample_applications import simple_app +from testing_support.validators.validate_apdex_metrics import validate_apdex_metrics +from newrelic.api.transaction import current_transaction, suppress_apdex_metric +from newrelic.api.wsgi_application import wsgi_application normal_application = webtest.TestApp(simple_app) - # NOTE: This test validates that the server-side apdex_t is set to 0.5 # If the server-side configuration changes, this test will start to fail. 
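An aside on the `_test_code_level_metrics.py` change above: the `ExerciseTypeConstructor` classes are built with the three-argument form of `type()`, which constructs a class dynamically from a name, a tuple of bases, and a namespace dict. A small illustrative sketch (names here are hypothetical):

def exercise_method(self):
    return "called"


# type(name, bases, namespace) is equivalent to a class statement
# whose body defines the same attributes.
Exercise = type("Exercise", (object,), {"exercise_method": exercise_method})

assert Exercise().exercise_method() == "called"
assert Exercise.__name__ == "Exercise"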
@validate_apdex_metrics( - name='', - group='Uri', + name="", + group="Uri", apdex_t_min=0.5, apdex_t_max=0.5, ) def test_apdex(): - normal_application.get('/') + normal_application.get("/") + + +# This has to be a Web Transaction. +# The apdex measurement only applies to Web Transactions +def test_apdex_suppression(): + @wsgi_application() + def simple_apdex_supression_app(environ, start_response): + suppress_apdex_metric() + + start_response(status="200 OK", response_headers=[]) + transaction = current_transaction() + + assert transaction.suppress_apdex + assert transaction.apdex == 0 + return [] + + apdex_suppression_app = webtest.TestApp(simple_apdex_supression_app) + apdex_suppression_app.get("/") diff --git a/tests/agent_features/test_asgi_browser.py b/tests/agent_features/test_asgi_browser.py index c2c7ce715..281d08b96 100644 --- a/tests/agent_features/test_asgi_browser.py +++ b/tests/agent_features/test_asgi_browser.py @@ -12,48 +12,57 @@ # See the License for the specific language governing permissions and # limitations under the License. -import sys import json import pytest import six - -from testing_support.fixtures import (override_application_settings, - validate_transaction_errors, validate_custom_parameters) +from bs4 import BeautifulSoup from testing_support.asgi_testing import AsgiTest +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_custom_parameters import ( + validate_custom_parameters, +) +from testing_support.validators.validate_transaction_errors import ( + validate_transaction_errors, +) from newrelic.api.application import application_settings -from newrelic.api.transaction import (get_browser_timing_header, - get_browser_timing_footer, add_custom_parameter, - disable_browser_autorum) from newrelic.api.asgi_application import asgi_application +from newrelic.api.transaction import ( + add_custom_attribute, + disable_browser_autorum, + get_browser_timing_footer, + get_browser_timing_header, +) from newrelic.common.encoding_utils import deobfuscate -from bs4 import BeautifulSoup +_runtime_error_name = RuntimeError.__module__ + ":" + RuntimeError.__name__ -_runtime_error_name = (RuntimeError.__module__ + ':' + RuntimeError.__name__) @asgi_application() async def target_asgi_application_manual_rum(scope, receive, send): - text = '%s
</head><body><p>RESPONSE</p>%s</body></html>' + text = "<html><head>%s</head><body><p>RESPONSE</p>
%s" - output = (text % (get_browser_timing_header(), - get_browser_timing_footer())).encode('UTF-8') + output = (text % (get_browser_timing_header(), get_browser_timing_footer())).encode("UTF-8") - response_headers = [(b'content-type', b'text/html; charset=utf-8'), - (b'content-length', str(len(output)).encode('utf-8'))] + response_headers = [ + (b"content-type", b"text/html; charset=utf-8"), + (b"content-length", str(len(output)).encode("utf-8")), + ] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) await send({"type": "http.response.body", "body": output}) + target_application_manual_rum = AsgiTest(target_asgi_application_manual_rum) _test_footer_attributes = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': False, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": False, + "js_agent_loader": "", } + @override_application_settings(_test_footer_attributes) def test_footer_attributes(): settings = application_settings() @@ -67,589 +76,632 @@ def test_footer_attributes(): assert settings.beacon assert settings.error_beacon - token = '0123456789ABCDEF' - headers = { 'Cookie': 'NRAGENT=tk=%s' % token } + token = "0123456789ABCDEF" # nosec + headers = {"Cookie": "NRAGENT=tk=%s" % token} - response = target_application_manual_rum.get('/', headers=headers) + response = target_application_manual_rum.get("/", headers=headers) - html = BeautifulSoup(response.body, 'html.parser') + html = BeautifulSoup(response.body, "html.parser") header = html.html.head.script.string content = html.html.body.p.string footer = html.html.body.script.string # Validate actual body content. - assert content == 'RESPONSE' + assert content == "RESPONSE" # Validate the insertion of RUM header. - assert header.find('NREUM HEADER') != -1 + assert header.find("NREUM HEADER") != -1 # Now validate the various fields of the footer. The fields are # held by a JSON dictionary. 
- data = json.loads(footer.split('NREUM.info=')[1]) + data = json.loads(footer.split("NREUM.info=")[1]) - assert data['licenseKey'] == settings.browser_key - assert data['applicationID'] == settings.application_id + assert data["licenseKey"] == settings.browser_key + assert data["applicationID"] == settings.application_id - assert data['agent'] == settings.js_agent_file - assert data['beacon'] == settings.beacon - assert data['errorBeacon'] == settings.error_beacon + assert data["agent"] == settings.js_agent_file + assert data["beacon"] == settings.beacon + assert data["errorBeacon"] == settings.error_beacon - assert data['applicationTime'] >= 0 - assert data['queueTime'] >= 0 + assert data["applicationTime"] >= 0 + assert data["queueTime"] >= 0 obfuscation_key = settings.license_key[:13] - assert type(data['transactionName']) == type(u'') + type_transaction_data = unicode if six.PY2 else str # noqa: F821, pylint: disable=E0602 + assert isinstance(data["transactionName"], type_transaction_data) - txn_name = deobfuscate(data['transactionName'], obfuscation_key) + txn_name = deobfuscate(data["transactionName"], obfuscation_key) - assert txn_name == u'WebTransaction/Uri/' + assert txn_name == "WebTransaction/Uri/" + + assert "atts" not in data - assert 'atts' not in data _test_rum_ssl_for_http_is_none = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': False, - 'browser_monitoring.ssl_for_http': None, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": False, + "browser_monitoring.ssl_for_http": None, + "js_agent_loader": "", } + @override_application_settings(_test_rum_ssl_for_http_is_none) def test_ssl_for_http_is_none(): settings = application_settings() assert settings.browser_monitoring.ssl_for_http is None - response = target_application_manual_rum.get('/') - html = BeautifulSoup(response.body, 'html.parser') + response = target_application_manual_rum.get("/") + html = BeautifulSoup(response.body, "html.parser") footer = html.html.body.script.string - data = json.loads(footer.split('NREUM.info=')[1]) + data = json.loads(footer.split("NREUM.info=")[1]) + + assert "sslForHttp" not in data - assert 'sslForHttp' not in data _test_rum_ssl_for_http_is_true = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': False, - 'browser_monitoring.ssl_for_http': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": False, + "browser_monitoring.ssl_for_http": True, + "js_agent_loader": "", } + @override_application_settings(_test_rum_ssl_for_http_is_true) def test_ssl_for_http_is_true(): settings = application_settings() assert settings.browser_monitoring.ssl_for_http is True - response = target_application_manual_rum.get('/') - html = BeautifulSoup(response.body, 'html.parser') + response = target_application_manual_rum.get("/") + html = BeautifulSoup(response.body, "html.parser") footer = html.html.body.script.string - data = json.loads(footer.split('NREUM.info=')[1]) + data = json.loads(footer.split("NREUM.info=")[1]) + + assert data["sslForHttp"] is True - assert data['sslForHttp'] is True _test_rum_ssl_for_http_is_false = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': False, - 'browser_monitoring.ssl_for_http': False, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": False, + "browser_monitoring.ssl_for_http": False, + "js_agent_loader": "", } + 
@override_application_settings(_test_rum_ssl_for_http_is_false) def test_ssl_for_http_is_false(): settings = application_settings() assert settings.browser_monitoring.ssl_for_http is False - response = target_application_manual_rum.get('/') - html = BeautifulSoup(response.body, 'html.parser') + response = target_application_manual_rum.get("/") + html = BeautifulSoup(response.body, "html.parser") footer = html.html.body.script.string - data = json.loads(footer.split('NREUM.info=')[1]) + data = json.loads(footer.split("NREUM.info=")[1]) + + assert data["sslForHttp"] is False - assert data['sslForHttp'] is False @asgi_application() async def target_asgi_application_yield_single_no_head(scope, receive, send): - output = b'
<html><body><p>RESPONSE</p></body></html>' + output = b"<html><body><p>RESPONSE</p></body></html>
" - response_headers = [(b'content-type', b'text/html; charset=utf-8'), - (b'content-length', str(len(output)).encode('utf-8'))] + response_headers = [ + (b"content-type", b"text/html; charset=utf-8"), + (b"content-length", str(len(output)).encode("utf-8")), + ] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) await send({"type": "http.response.body", "body": output}) -target_application_yield_single_no_head = AsgiTest( - target_asgi_application_yield_single_no_head) + +target_application_yield_single_no_head = AsgiTest(target_asgi_application_yield_single_no_head) _test_html_insertion_yield_single_no_head_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_yield_single_no_head_settings) def test_html_insertion_yield_single_no_head(): - response = target_application_yield_single_no_head.get('/') + response = target_application_yield_single_no_head.get("/") assert response.status == 200 - assert 'content-type' in response.headers - assert 'content-length' in response.headers + assert "content-type" in response.headers + assert "content-length" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - assert b'NREUM HEADER' in response.body - assert b'NREUM.info' in response.body + assert b"NREUM HEADER" in response.body + assert b"NREUM.info" in response.body + @asgi_application() async def target_asgi_application_yield_multi_no_head(scope, receive, send): - output = [ b'', b'
<body><p>RESPONSE</p></body></html>' ] + output = [b"<html>", b"<body><p>RESPONSE</p></body></html>
"] - response_headers = [(b'content-type', b'text/html; charset=utf-8'), - (b'content-length', str(len(b''.join(output))).encode('utf-8'))] + response_headers = [ + (b"content-type", b"text/html; charset=utf-8"), + (b"content-length", str(len(b"".join(output))).encode("utf-8")), + ] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) for data in output: more_body = data is not output[-1] await send({"type": "http.response.body", "body": data, "more_body": more_body}) -target_application_yield_multi_no_head = AsgiTest( - target_asgi_application_yield_multi_no_head) + +target_application_yield_multi_no_head = AsgiTest(target_asgi_application_yield_multi_no_head) _test_html_insertion_yield_multi_no_head_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_yield_multi_no_head_settings) def test_html_insertion_yield_multi_no_head(): - response = target_application_yield_multi_no_head.get('/') + response = target_application_yield_multi_no_head.get("/") assert response.status == 200 - assert 'content-type' in response.headers - assert 'content-length' in response.headers + assert "content-type" in response.headers + assert "content-length" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - assert b'NREUM HEADER' in response.body - assert b'NREUM.info' in response.body + assert b"NREUM HEADER" in response.body + assert b"NREUM.info" in response.body + @asgi_application() async def target_asgi_application_unnamed_attachment_header(scope, receive, send): - output = b'
<html><body><p>RESPONSE</p></body></html>' + output = b"<html><body><p>RESPONSE</p></body></html>
" - response_headers = [(b'content-type', b'text/html; charset=utf-8'), - (b'content-length', str(len(output)).encode('utf-8')), - (b'content-disposition', b'attachment')] + response_headers = [ + (b"content-type", b"text/html; charset=utf-8"), + (b"content-length", str(len(output)).encode("utf-8")), + (b"content-disposition", b"attachment"), + ] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) await send({"type": "http.response.body", "body": output}) -target_application_unnamed_attachment_header = AsgiTest( - target_asgi_application_unnamed_attachment_header) + +target_application_unnamed_attachment_header = AsgiTest(target_asgi_application_unnamed_attachment_header) _test_html_insertion_unnamed_attachment_header_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } -@override_application_settings( - _test_html_insertion_unnamed_attachment_header_settings) + +@override_application_settings(_test_html_insertion_unnamed_attachment_header_settings) def test_html_insertion_unnamed_attachment_header(): - response = target_application_unnamed_attachment_header.get('/') + response = target_application_unnamed_attachment_header.get("/") assert response.status == 200 - assert 'content-type' in response.headers - assert 'content-length' in response.headers - assert 'content-disposition' in response.headers + assert "content-type" in response.headers + assert "content-length" in response.headers + assert "content-disposition" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - assert b'NREUM HEADER' not in response.body - assert b'NREUM.info' not in response.body + assert b"NREUM HEADER" not in response.body + assert b"NREUM.info" not in response.body + @asgi_application() async def target_asgi_application_named_attachment_header(scope, receive, send): - output = b'
<html><body><p>RESPONSE</p></body></html>' + output = b"<html><body><p>RESPONSE</p></body></html>
" - response_headers = [(b'content-type', b'text/html; charset=utf-8'), - (b'content-length', str(len(output)).encode('utf-8')), - (b'content-disposition', b'Attachment; filename="X"')] + response_headers = [ + (b"content-type", b"text/html; charset=utf-8"), + (b"content-length", str(len(output)).encode("utf-8")), + (b"content-disposition", b'Attachment; filename="X"'), + ] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) await send({"type": "http.response.body", "body": output}) -target_application_named_attachment_header = AsgiTest( - target_asgi_application_named_attachment_header) + +target_application_named_attachment_header = AsgiTest(target_asgi_application_named_attachment_header) _test_html_insertion_named_attachment_header_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } -@override_application_settings( - _test_html_insertion_named_attachment_header_settings) + +@override_application_settings(_test_html_insertion_named_attachment_header_settings) def test_html_insertion_named_attachment_header(): - response = target_application_named_attachment_header.get('/') + response = target_application_named_attachment_header.get("/") assert response.status == 200 - assert 'content-type' in response.headers - assert 'content-length' in response.headers - assert 'content-disposition' in response.headers + assert "content-type" in response.headers + assert "content-length" in response.headers + assert "content-disposition" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - assert b'NREUM HEADER' not in response.body - assert b'NREUM.info' not in response.body + assert b"NREUM HEADER" not in response.body + assert b"NREUM.info" not in response.body + @asgi_application() async def target_asgi_application_inline_attachment_header(scope, receive, send): - output = b'
<html><body><p>RESPONSE</p></body></html>' + output = b"<html><body><p>RESPONSE</p></body></html>
" - response_headers = [(b'content-type', b'text/html; charset=utf-8'), - (b'content-length', str(len(output)).encode('utf-8')), - (b'content-disposition', b'inline; filename="attachment"')] + response_headers = [ + (b"content-type", b"text/html; charset=utf-8"), + (b"content-length", str(len(output)).encode("utf-8")), + (b"content-disposition", b'inline; filename="attachment"'), + ] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) await send({"type": "http.response.body", "body": output}) -target_application_inline_attachment_header = AsgiTest( - target_asgi_application_inline_attachment_header) + +target_application_inline_attachment_header = AsgiTest(target_asgi_application_inline_attachment_header) _test_html_insertion_inline_attachment_header_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } -@override_application_settings( - _test_html_insertion_inline_attachment_header_settings) + +@override_application_settings(_test_html_insertion_inline_attachment_header_settings) def test_html_insertion_inline_attachment_header(): - response = target_application_inline_attachment_header.get('/') + response = target_application_inline_attachment_header.get("/") assert response.status == 200 - assert 'content-type' in response.headers - assert 'content-length' in response.headers - assert 'content-disposition' in response.headers + assert "content-type" in response.headers + assert "content-length" in response.headers + assert "content-disposition" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - assert b'NREUM HEADER' in response.body - assert b'NREUM.info' in response.body + assert b"NREUM HEADER" in response.body + assert b"NREUM.info" in response.body + @asgi_application() async def target_asgi_application_empty(scope, receive, send): - status = '200 OK' + status = "200 OK" - response_headers = [(b'content-type', b'text/html; charset=utf-8'), - (b'content-length', b'0')] + response_headers = [(b"content-type", b"text/html; charset=utf-8"), (b"content-length", b"0")] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) await send({"type": "http.response.body"}) -target_application_empty = AsgiTest( - target_asgi_application_empty) + +target_application_empty = AsgiTest(target_asgi_application_empty) _test_html_insertion_empty_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } -@override_application_settings( - _test_html_insertion_empty_settings) + +@override_application_settings(_test_html_insertion_empty_settings) def test_html_insertion_empty(): - response = target_application_empty.get('/') + response = target_application_empty.get("/") assert response.status == 200 - assert 'content-type' in response.headers - assert 'content-length' in response.headers + assert "content-type" in response.headers + assert "content-length" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. 
- assert b'NREUM HEADER' not in response.body - assert b'NREUM.info' not in response.body + assert b"NREUM HEADER" not in response.body + assert b"NREUM.info" not in response.body assert len(response.body) == 0 + @asgi_application() async def target_asgi_application_single_empty_string(scope, receive, send): - response_headers = [(b'content-type', b'text/html; charset=utf-8'), - (b'content-length', b'0')] + response_headers = [(b"content-type", b"text/html; charset=utf-8"), (b"content-length", b"0")] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) await send({"type": "http.response.body", "body": b""}) -target_application_single_empty_string = AsgiTest( - target_asgi_application_single_empty_string) + +target_application_single_empty_string = AsgiTest(target_asgi_application_single_empty_string) _test_html_insertion_single_empty_string_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } -@override_application_settings( - _test_html_insertion_single_empty_string_settings) + +@override_application_settings(_test_html_insertion_single_empty_string_settings) def test_html_insertion_single_empty_string(): - response = target_application_single_empty_string.get('/') + response = target_application_single_empty_string.get("/") assert response.status == 200 - assert 'content-type' in response.headers - assert 'content-length' in response.headers + assert "content-type" in response.headers + assert "content-length" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. 
- assert b'NREUM HEADER' not in response.body - assert b'NREUM.info' not in response.body + assert b"NREUM HEADER" not in response.body + assert b"NREUM.info" not in response.body assert len(response.body) == 0 + @asgi_application() async def target_asgi_application_multiple_empty_string(scope, receive, send): - response_headers = [(b'content-type', b'text/html; charset=utf-8'), - (b'content-length', b'0')] + response_headers = [(b"content-type", b"text/html; charset=utf-8"), (b"content-length", b"0")] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) await send({"type": "http.response.body", "body": b"", "more_body": True}) await send({"type": "http.response.body", "body": b""}) -target_application_multiple_empty_string = AsgiTest( - target_asgi_application_multiple_empty_string) + +target_application_multiple_empty_string = AsgiTest(target_asgi_application_multiple_empty_string) _test_html_insertion_multiple_empty_string_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } -@override_application_settings( - _test_html_insertion_multiple_empty_string_settings) + +@override_application_settings(_test_html_insertion_multiple_empty_string_settings) def test_html_insertion_multiple_empty_string(): - response = target_application_multiple_empty_string.get('/') + response = target_application_multiple_empty_string.get("/") assert response.status == 200 - assert 'content-type' in response.headers - assert 'content-length' in response.headers + assert "content-type" in response.headers + assert "content-length" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. 
- assert b'NREUM HEADER' not in response.body - assert b'NREUM.info' not in response.body + assert b"NREUM HEADER" not in response.body + assert b"NREUM.info" not in response.body assert len(response.body) == 0 + @asgi_application() async def target_asgi_application_single_large_prelude(scope, receive, send): - output = 64*1024*b' ' + b'' + output = 64 * 1024 * b" " + b"" - response_headers = [(b'content-type', b'text/html; charset=utf-8'), - (b'content-length', str(len(output)).encode("utf-8"))] + response_headers = [ + (b"content-type", b"text/html; charset=utf-8"), + (b"content-length", str(len(output)).encode("utf-8")), + ] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) await send({"type": "http.response.body", "body": output}) -target_application_single_large_prelude = AsgiTest( - target_asgi_application_single_large_prelude) + +target_application_single_large_prelude = AsgiTest(target_asgi_application_single_large_prelude) _test_html_insertion_single_large_prelude_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } -@override_application_settings( - _test_html_insertion_single_large_prelude_settings) + +@override_application_settings(_test_html_insertion_single_large_prelude_settings) def test_html_insertion_single_large_prelude(): - response = target_application_single_large_prelude.get('/') + response = target_application_single_large_prelude.get("/") assert response.status == 200 # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. 
- assert 'content-type' in response.headers - assert 'content-length' in response.headers + assert "content-type" in response.headers + assert "content-length" in response.headers + + assert b"NREUM HEADER" not in response.body + assert b"NREUM.info" not in response.body - assert b'NREUM HEADER' not in response.body - assert b'NREUM.info' not in response.body + output = [32 * 1024 * b" ", 32 * 1024 * b" ", b""] - output = [32*1024*b' ', 32*1024*b' ', b''] + assert len(response.body) == len(b"".join(output)) - assert len(response.body) == len(b''.join(output)) @asgi_application() async def target_asgi_application_multi_large_prelude(scope, receive, send): - output = [32*1024*b' ', 32*1024*b' ', b''] + output = [32 * 1024 * b" ", 32 * 1024 * b" ", b""] - response_headers = [(b'content-type', b'text/html; charset=utf-8'), - (b'content-length', str(len(b''.join(output))).encode("utf-8"))] + response_headers = [ + (b"content-type", b"text/html; charset=utf-8"), + (b"content-length", str(len(b"".join(output))).encode("utf-8")), + ] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) for data in output: more_body = data is not output[-1] await send({"type": "http.response.body", "body": data, "more_body": more_body}) -target_application_multi_large_prelude = AsgiTest( - target_asgi_application_multi_large_prelude) + +target_application_multi_large_prelude = AsgiTest(target_asgi_application_multi_large_prelude) _test_html_insertion_multi_large_prelude_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } -@override_application_settings( - _test_html_insertion_multi_large_prelude_settings) + +@override_application_settings(_test_html_insertion_multi_large_prelude_settings) def test_html_insertion_multi_large_prelude(): - response = target_application_multi_large_prelude.get('/') + response = target_application_multi_large_prelude.get("/") assert response.status == 200 # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - assert 'content-type' in response.headers - assert 'content-length' in response.headers + assert "content-type" in response.headers + assert "content-length" in response.headers + + assert b"NREUM HEADER" not in response.body + assert b"NREUM.info" not in response.body - assert b'NREUM HEADER' not in response.body - assert b'NREUM.info' not in response.body + output = [32 * 1024 * b" ", 32 * 1024 * b" ", b""] - output = [32*1024*b' ', 32*1024*b' ', b''] + assert len(response.body) == len(b"".join(output)) - assert len(response.body) == len(b''.join(output)) @asgi_application() async def target_asgi_application_yield_before_start(scope, receive, send): # This is not legal but we should see what happens with our middleware await send({"type": "http.response.body", "body": b"", "more_body": True}) - output = b'
<html><body><p>RESPONSE</p></body></html>' + output = b"<html><body><p>RESPONSE</p></body></html>
" - response_headers = [(b'content-type', b'text/html; charset=utf-8'), - (b'content-length', str(len(output)).encode("utf-8"))] + response_headers = [ + (b"content-type", b"text/html; charset=utf-8"), + (b"content-length", str(len(output)).encode("utf-8")), + ] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) await send({"type": "http.response.body", "body": output}) -target_application_yield_before_start = AsgiTest( - target_asgi_application_yield_before_start) + +target_application_yield_before_start = AsgiTest(target_asgi_application_yield_before_start) _test_html_insertion_yield_before_start_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_yield_before_start_settings) def test_html_insertion_yield_before_start(): # The application should complete as pass through, but an assertion error # would be raised in the AsgiTest class with pytest.raises(AssertionError): - target_application_yield_before_start.get('/') + target_application_yield_before_start.get("/") + @asgi_application() async def target_asgi_application_start_yield_start(scope, receive, send): - output = b'
<html><body><p>RESPONSE</p></body></html>' + output = b"<html><body><p>RESPONSE</p></body></html>
" - response_headers = [(b'content-type', b'text/html; charset=utf-8'), - (b'content-length', str(len(output)).encode("utf-8"))] + response_headers = [ + (b"content-type", b"text/html; charset=utf-8"), + (b"content-length", str(len(output)).encode("utf-8")), + ] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) await send({"type": "http.response.body", "body": b""}) await send({"type": "http.response.start", "status": 200, "headers": response_headers}) -target_application_start_yield_start = AsgiTest( - target_asgi_application_start_yield_start) + +target_application_start_yield_start = AsgiTest(target_asgi_application_start_yield_start) _test_html_insertion_start_yield_start_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_start_yield_start_settings) def test_html_insertion_start_yield_start(): # The application should complete as pass through, but an assertion error # would be raised in the AsgiTest class with pytest.raises(AssertionError): - target_application_start_yield_start.get('/') + target_application_start_yield_start.get("/") + @asgi_application() async def target_asgi_application_invalid_content_length(scope, receive, send): - output = b'
<html><body><p>RESPONSE</p></body></html>' + output = b"<html><body><p>RESPONSE</p></body></html>
" - response_headers = [(b'content-type', b'text/html; charset=utf-8'), - (b'content-length', b'XXX')] + response_headers = [(b"content-type", b"text/html; charset=utf-8"), (b"content-length", b"XXX")] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) await send({"type": "http.response.body", "body": output}) -target_application_invalid_content_length = AsgiTest( - target_asgi_application_invalid_content_length) + +target_application_invalid_content_length = AsgiTest(target_asgi_application_invalid_content_length) _test_html_insertion_invalid_content_length_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_invalid_content_length_settings) def test_html_insertion_invalid_content_length(): - response = target_application_invalid_content_length.get('/') + response = target_application_invalid_content_length.get("/") assert response.status == 200 - assert 'content-type' in response.headers - assert 'content-length' in response.headers + assert "content-type" in response.headers + assert "content-length" in response.headers - assert response.headers['content-length'] == 'XXX' + assert response.headers["content-length"] == "XXX" + + assert b"NREUM HEADER" not in response.body + assert b"NREUM.info" not in response.body - assert b'NREUM HEADER' not in response.body - assert b'NREUM.info' not in response.body @asgi_application() async def target_asgi_application_content_encoding(scope, receive, send): - output = b'
<html><body><p>RESPONSE</p></body></html>' + output = b"<html><body><p>RESPONSE</p></body></html>
" - response_headers = [(b'content-type', b'text/html; charset=utf-8'), - (b'content-length', str(len(output)).encode("utf-8")), - (b'content-encoding', b'identity')] + response_headers = [ + (b"content-type", b"text/html; charset=utf-8"), + (b"content-length", str(len(output)).encode("utf-8")), + (b"content-encoding", b"identity"), + ] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) await send({"type": "http.response.body", "body": output}) -target_application_content_encoding = AsgiTest( - target_asgi_application_content_encoding) + +target_application_content_encoding = AsgiTest(target_asgi_application_content_encoding) _test_html_insertion_content_encoding_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_content_encoding_settings) def test_html_insertion_content_encoding(): - response = target_application_content_encoding.get('/') + response = target_application_content_encoding.get("/") assert response.status == 200 # Technically 'identity' should not be used in Content-Encoding @@ -657,181 +709,190 @@ def test_html_insertion_content_encoding(): # RUM for this test. Other option is to compress the response # and use 'gzip'. - assert 'content-type' in response.headers - assert 'content-length' in response.headers + assert "content-type" in response.headers + assert "content-length" in response.headers + + assert response.headers["content-encoding"] == "identity" - assert response.headers['content-encoding'] == 'identity' + assert b"NREUM HEADER" not in response.body + assert b"NREUM.info" not in response.body - assert b'NREUM HEADER' not in response.body - assert b'NREUM.info' not in response.body @asgi_application() async def target_asgi_application_no_content_type(scope, receive, send): - output = b'
<html><body><p>RESPONSE</p></body></html>' + output = b"<html><body><p>RESPONSE</p></body></html>
" - response_headers = [(b'content-length', str(len(output)).encode("utf-8"))] + response_headers = [(b"content-length", str(len(output)).encode("utf-8"))] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) await send({"type": "http.response.body", "body": output}) -target_application_no_content_type = AsgiTest( - target_asgi_application_no_content_type) + +target_application_no_content_type = AsgiTest(target_asgi_application_no_content_type) _test_html_insertion_no_content_type_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_no_content_type_settings) def test_html_insertion_no_content_type(): - response = target_application_no_content_type.get('/') + response = target_application_no_content_type.get("/") assert response.status == 200 - assert 'content-type' not in response.headers - assert 'content-length' in response.headers + assert "content-type" not in response.headers + assert "content-length" in response.headers + + assert b"NREUM HEADER" not in response.body + assert b"NREUM.info" not in response.body - assert b'NREUM HEADER' not in response.body - assert b'NREUM.info' not in response.body @asgi_application() async def target_asgi_application_plain_text(scope, receive, send): - output = b'RESPONSE' + output = b"RESPONSE" - response_headers = [ - (b'content-type', b'text/plain'), - (b'content-length', str(len(output)).encode("utf-8"))] + response_headers = [(b"content-type", b"text/plain"), (b"content-length", str(len(output)).encode("utf-8"))] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) await send({"type": "http.response.body", "body": output}) -target_application_plain_text = AsgiTest( - target_asgi_application_plain_text) + +target_application_plain_text = AsgiTest(target_asgi_application_plain_text) _test_html_insertion_plain_text_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_plain_text_settings) def test_html_insertion_plain_text(): - response = target_application_plain_text.get('/') + response = target_application_plain_text.get("/") assert response.status == 200 - assert 'content-type' in response.headers - assert 'content-length' in response.headers + assert "content-type" in response.headers + assert "content-length" in response.headers - assert b'NREUM HEADER' not in response.body - assert b'NREUM.info' not in response.body + assert b"NREUM HEADER" not in response.body + assert b"NREUM.info" not in response.body @asgi_application() async def target_asgi_application_param(scope, receive, send): - output = b'
<html><body><p>RESPONSE</p></body></html>' + output = b"<html><body><p>RESPONSE</p></body></html>
" response_headers = [ - (b'content-type', b'text/html; charset=utf-8'), - (b'content-length', str(len(output)).encode("utf-8"))] + (b"content-type", b"text/html; charset=utf-8"), + (b"content-length", str(len(output)).encode("utf-8")), + ] - add_custom_parameter('key', 'value') + add_custom_attribute("key", "value") await send({"type": "http.response.start", "status": 200, "headers": response_headers}) await send({"type": "http.response.body", "body": output}) -target_application_param = AsgiTest( - target_asgi_application_param) +target_application_param = AsgiTest(target_asgi_application_param) _test_html_insertion_param_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } @override_application_settings(_test_html_insertion_param_settings) -@validate_custom_parameters(required_params=[('key', 'value')]) +@validate_custom_parameters(required_params=[("key", "value")]) def test_html_insertion_param(): - response = target_application_param.get('/') + response = target_application_param.get("/") assert response.status == 200 - assert b'NREUM HEADER' in response.body - assert b'NREUM.info' in response.body + assert b"NREUM HEADER" in response.body + assert b"NREUM.info" in response.body + @asgi_application() async def target_asgi_application_param_on_error(scope, receive, send): - output = b'
<html><body><p>RESPONSE</p></body></html>' + output = b"<html><body><p>RESPONSE</p></body></html>
" response_headers = [ - (b'content-type', b'text/html; charset=utf-8'), - (b'content-length', str(len(output)).encode("utf-8"))] + (b"content-type", b"text/html; charset=utf-8"), + (b"content-length", str(len(output)).encode("utf-8")), + ] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) try: - raise RuntimeError('ERROR') + raise RuntimeError("ERROR") finally: - add_custom_parameter('key', 'value') + add_custom_attribute("key", "value") -target_application_param_on_error = AsgiTest( - target_asgi_application_param_on_error) + +target_application_param_on_error = AsgiTest(target_asgi_application_param_on_error) _test_html_insertion_param_on_error_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_param_on_error_settings) @validate_transaction_errors(errors=[_runtime_error_name]) -@validate_custom_parameters(required_params=[('key', 'value')]) +@validate_custom_parameters(required_params=[("key", "value")]) def test_html_insertion_param_on_error(): try: - target_application_param_on_error.get('/') + target_application_param_on_error.get("/") except RuntimeError: pass + @asgi_application() async def target_asgi_application_disable_autorum_via_api(scope, receive, send): - output = b'
<html><body><p>RESPONSE</p></body></html>' + output = b"<html><body><p>RESPONSE</p></body></html>
" disable_browser_autorum() - response_headers = [(b'content-type', b'text/html; charset=utf-8'), - (b'content-length', str(len(output)).encode("utf-8"))] + response_headers = [ + (b"content-type", b"text/html; charset=utf-8"), + (b"content-length", str(len(output)).encode("utf-8")), + ] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) await send({"type": "http.response.body", "body": output}) -target_application_disable_autorum_via_api = AsgiTest( - target_asgi_application_disable_autorum_via_api) + +target_application_disable_autorum_via_api = AsgiTest(target_asgi_application_disable_autorum_via_api) _test_html_insertion_disable_autorum_via_api_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } -@override_application_settings( - _test_html_insertion_disable_autorum_via_api_settings) + +@override_application_settings(_test_html_insertion_disable_autorum_via_api_settings) def test_html_insertion_disable_autorum_via_api(): - response = target_application_disable_autorum_via_api.get('/') + response = target_application_disable_autorum_via_api.get("/") assert response.status == 200 - assert 'content-type' in response.headers - assert 'content-length' in response.headers + assert "content-type" in response.headers + assert "content-length" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - assert b'NREUM HEADER' not in response.body - assert b'NREUM.info' not in response.body + assert b"NREUM HEADER" not in response.body + assert b"NREUM.info" not in response.body + @asgi_application() async def target_asgi_application_manual_rum_insertion(scope, receive, send): - output = b'
<html><body><p>RESPONSE</p></body></html>' + output = b"<html><body><p>RESPONSE</p></body></html>
" header = get_browser_timing_header() footer = get_browser_timing_footer() @@ -839,36 +900,38 @@ async def target_asgi_application_manual_rum_insertion(scope, receive, send): header = get_browser_timing_header() footer = get_browser_timing_footer() - assert header == '' - assert footer == '' + assert header == "" + assert footer == "" - response_headers = [(b'content-type', b'text/html; charset=utf-8'), - (b'content-length', str(len(output)).encode("utf-8"))] + response_headers = [ + (b"content-type", b"text/html; charset=utf-8"), + (b"content-length", str(len(output)).encode("utf-8")), + ] await send({"type": "http.response.start", "status": 200, "headers": response_headers}) await send({"type": "http.response.body", "body": output}) -target_application_manual_rum_insertion = AsgiTest( - target_asgi_application_manual_rum_insertion) + +target_application_manual_rum_insertion = AsgiTest(target_asgi_application_manual_rum_insertion) _test_html_insertion_manual_rum_insertion_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } -@override_application_settings( - _test_html_insertion_manual_rum_insertion_settings) + +@override_application_settings(_test_html_insertion_manual_rum_insertion_settings) def test_html_insertion_manual_rum_insertion(): - response = target_application_manual_rum_insertion.get('/') + response = target_application_manual_rum_insertion.get("/") assert response.status == 200 - assert 'content-type' in response.headers - assert 'content-length' in response.headers + assert "content-type" in response.headers + assert "content-length" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - assert b'NREUM HEADER' not in response.body - assert b'NREUM.info' not in response.body + assert b"NREUM HEADER" not in response.body + assert b"NREUM.info" not in response.body diff --git a/tests/agent_features/test_asgi_distributed_tracing.py b/tests/agent_features/test_asgi_distributed_tracing.py index bb34aba20..90f57becc 100644 --- a/tests/agent_features/test_asgi_distributed_tracing.py +++ b/tests/agent_features/test_asgi_distributed_tracing.py @@ -22,8 +22,8 @@ from newrelic.api.asgi_application import asgi_application, ASGIWebTransaction from testing_support.asgi_testing import AsgiTest -from testing_support.fixtures import (override_application_settings, - validate_transaction_metrics) +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics distributed_trace_intrinsics = ['guid', 'traceId', 'priority', 'sampled'] diff --git a/tests/agent_features/test_asgi_transaction.py b/tests/agent_features/test_asgi_transaction.py index 1820efa86..cac075cef 100644 --- a/tests/agent_features/test_asgi_transaction.py +++ b/tests/agent_features/test_asgi_transaction.py @@ -12,16 +12,11 @@ # See the License for the specific language governing permissions and # limitations under the License. 
-import asyncio import logging import pytest from testing_support.asgi_testing import AsgiTest -from testing_support.fixtures import ( - override_application_settings, - validate_transaction_errors, - validate_transaction_metrics, -) +from testing_support.fixtures import override_application_settings from testing_support.sample_asgi_applications import ( AppWithDescriptor, simple_app_v2, @@ -30,6 +25,12 @@ simple_app_v3, simple_app_v3_raw, ) +from testing_support.validators.validate_transaction_errors import ( + validate_transaction_errors, +) +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) from newrelic.api.asgi_application import ASGIApplicationWrapper, asgi_application @@ -124,15 +125,34 @@ def test_asgi_application_decorator_no_params_double_callable(): assert response.body == b"" -# Test for presence of framework info based on whether framework is specified -@validate_transaction_metrics(name="test", custom_metrics=[("Python/Framework/framework/v1", 1)]) -def test_framework_metrics(): - asgi_decorator = asgi_application(name="test", framework=("framework", "v1")) +# Test for presence of framework and dispatcher info based on whether framework is specified +@validate_transaction_metrics( + name="test", custom_metrics=[("Python/Framework/framework/v1", 1), ("Python/Dispatcher/dispatcher/v1.0.0", 1)] +) +def test_dispatcher_and_framework_metrics(): + asgi_decorator = asgi_application(name="test", framework=("framework", "v1"), dispatcher=("dispatcher", "v1.0.0")) decorated_application = asgi_decorator(simple_app_v2_raw) application = AsgiTest(decorated_application) application.make_request("GET", "/") +# Test for presence of framework and dispatcher info under existing transaction +@validate_transaction_metrics( + name="test", custom_metrics=[("Python/Framework/framework/v1", 1), ("Python/Dispatcher/dispatcher/v1.0.0", 1)] +) +def test_double_wrapped_dispatcher_and_framework_metrics(): + inner_asgi_decorator = asgi_application( + name="test", framework=("framework", "v1"), dispatcher=("dispatcher", "v1.0.0") + ) + decorated_application = inner_asgi_decorator(simple_app_v2_raw) + + outer_asgi_decorator = asgi_application(name="double_wrapped") + double_decorated_application = outer_asgi_decorator(decorated_application) + + application = AsgiTest(double_decorated_application) + application.make_request("GET", "/") + + @pytest.mark.parametrize("method", ("method", "cls", "static")) @validate_transaction_metrics(name="", group="Uri") def test_app_with_descriptor(method): diff --git a/tests/agent_features/test_asgi_w3c_trace_context.py b/tests/agent_features/test_asgi_w3c_trace_context.py index 68c192a5d..8cec2eb7a 100644 --- a/tests/agent_features/test_asgi_w3c_trace_context.py +++ b/tests/agent_features/test_asgi_w3c_trace_context.py @@ -19,11 +19,11 @@ from newrelic.api.asgi_application import asgi_application from testing_support.asgi_testing import AsgiTest -from testing_support.fixtures import (override_application_settings, - validate_transaction_event_attributes, validate_transaction_metrics) +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.validators.validate_span_events import ( validate_span_events) - +from testing_support.validators.validate_transaction_event_attributes import validate_transaction_event_attributes @asgi_application() async def target_asgi_application(scope, receive, 
send): diff --git a/tests/agent_features/test_async_context_propagation.py b/tests/agent_features/test_async_context_propagation.py index ea850c095..47d16cfc5 100644 --- a/tests/agent_features/test_async_context_propagation.py +++ b/tests/agent_features/test_async_context_propagation.py @@ -13,9 +13,8 @@ # limitations under the License. import pytest -from testing_support.fixtures import ( - function_not_called, - override_generic_settings, +from testing_support.fixtures import function_not_called, override_generic_settings +from testing_support.validators.validate_transaction_metrics import ( validate_transaction_metrics, ) @@ -132,7 +131,7 @@ def handle_exception(loop, context): # The agent should have removed all traces from the cache since # run_until_complete has terminated (all callbacks scheduled inside the # task have run) - assert not trace_cache()._cache + assert not trace_cache() # Assert that no exceptions have occurred assert not exceptions, exceptions @@ -287,7 +286,7 @@ def _test(): # The agent should have removed all traces from the cache since # run_until_complete has terminated - assert not trace_cache()._cache + assert not trace_cache() # Assert that no exceptions have occurred assert not exceptions, exceptions diff --git a/tests/agent_features/test_async_timing.py b/tests/agent_features/test_async_timing.py index 0198f151a..f8cad864d 100644 --- a/tests/agent_features/test_async_timing.py +++ b/tests/agent_features/test_async_timing.py @@ -44,17 +44,15 @@ def _validate_total_time_value(wrapped, instance, args, kwargs): @function_trace(name="child") -@asyncio.coroutine -def child(): - yield from asyncio.sleep(0.1) +async def child(): + await asyncio.sleep(0.1) @background_task(name="parent") -@asyncio.coroutine -def parent(calls): +async def parent(calls): coros = [child() for _ in range(calls)] - yield from asyncio.gather(*coros) - yield from asyncio.sleep(0.1) + await asyncio.gather(*coros) + await asyncio.sleep(0.1) @validate_total_time_value_greater_than(0.2) diff --git a/tests/agent_features/test_attribute.py b/tests/agent_features/test_attribute.py index edfffae2f..f4b9e896f 100644 --- a/tests/agent_features/test_attribute.py +++ b/tests/agent_features/test_attribute.py @@ -13,24 +13,33 @@ # limitations under the License. 
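The test_async_timing.py hunk above replaces the generator-based coroutines (@asyncio.coroutine with yield from), which were removed outright in Python 3.11, with native coroutines. A minimal standalone sketch of the same migration, with illustrative function names:

    import asyncio

    # Before (no longer valid on Python 3.11):
    #   @asyncio.coroutine
    #   def child():
    #       yield from asyncio.sleep(0.1)

    # After: native async/await
    async def child():
        await asyncio.sleep(0.1)

    async def parent(calls):
        # Fan out several children concurrently, as the timing test does
        await asyncio.gather(*(child() for _ in range(calls)))

    asyncio.run(parent(2))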
import sys + import pytest import webtest +from testing_support.fixtures import ( + override_application_settings, + validate_agent_attribute_types, + validate_attributes, + validate_attributes_complete, +) +from testing_support.sample_applications import fully_featured_app +from testing_support.validators.validate_custom_parameters import ( + validate_custom_parameters, +) from newrelic.api.background_task import background_task -from newrelic.api.transaction import (add_custom_parameter, - add_custom_parameters) +from newrelic.api.transaction import add_custom_attribute, add_custom_attributes from newrelic.api.wsgi_application import wsgi_application -from newrelic.core.attribute import (truncate, sanitize, Attribute, - CastingFailureException, MAX_64_BIT_INT, _DESTINATIONS_WITH_EVENTS) - +from newrelic.core.attribute import ( + _DESTINATIONS_WITH_EVENTS, + MAX_64_BIT_INT, + Attribute, + CastingFailureException, + sanitize, + truncate, +) from newrelic.packages import six -from testing_support.fixtures import (override_application_settings, - validate_attributes, validate_attributes_complete, - validate_custom_parameters, validate_agent_attribute_types) -from testing_support.sample_applications import fully_featured_app - - # Python 3 lacks longs if sys.version_info >= (3, 0): @@ -43,333 +52,328 @@ @wsgi_application() def target_wsgi_application(environ, start_response): - status = '200 OK' - output = b'Hello World!' + status = "200 OK" + output = b"Hello World!" - path = environ.get('PATH_INFO') - if path == '/user_attribute': - add_custom_parameter('test_key', 'test_value') + path = environ.get("PATH_INFO") + if path == "/user_attribute": + add_custom_attribute("test_key", "test_value") - response_headers = [('Content-Type', 'text/plain; charset=utf-8'), - ('Content-Length', str(len(output)))] + response_headers = [("Content-Type", "text/plain; charset=utf-8"), ("Content-Length", str(len(output)))] start_response(status, response_headers) return [output] -_required_intrinsics = ['trip_id', 'totalTime'] +_required_intrinsics = ["trip_id", "totalTime"] _forgone_intrinsics = [] -@validate_attributes('intrinsic', _required_intrinsics, _forgone_intrinsics) +@validate_attributes("intrinsic", _required_intrinsics, _forgone_intrinsics) def test_intrinsics(): target_application = webtest.TestApp(target_wsgi_application) - response = target_application.get('/') - assert response.body == b'Hello World!' - - -_required_agent = ['request.method', 'wsgi.output.seconds', 'response.status', - 'request.headers.host', 'request.headers.accept', 'request.uri', - 'response.headers.contentType', 'response.headers.contentLength'] + response = target_application.get("/") + assert response.body == b"Hello World!" + + +_required_agent = [ + "request.method", + "wsgi.output.seconds", + "response.status", + "request.headers.host", + "request.headers.accept", + "request.uri", + "response.headers.contentType", + "response.headers.contentLength", +] if ThreadUtilization: - _required_agent.append('thread.concurrency') + _required_agent.append("thread.concurrency") _forgone_agent = [] -@validate_attributes('agent', _required_agent, _forgone_agent) +@validate_attributes("agent", _required_agent, _forgone_agent) def test_agent(): target_application = webtest.TestApp(target_wsgi_application) - response = target_application.get('/', - extra_environ={'HTTP_ACCEPT': '*/*'}) - assert response.body == b'Hello World!' 
+ response = target_application.get("/", extra_environ={"HTTP_ACCEPT": "*/*"}) + assert response.body == b"Hello World!" _required_user = [] -_forgone_user = ['test_key'] +_forgone_user = ["test_key"] -@validate_attributes('user', _required_user, _forgone_user) +@validate_attributes("user", _required_user, _forgone_user) def test_user_default(): target_application = webtest.TestApp(target_wsgi_application) - response = target_application.get('/') - assert response.body == b'Hello World!' + response = target_application.get("/") + assert response.body == b"Hello World!" -_required_user = ['test_key'] +_required_user = ["test_key"] _forgone_user = [] -@validate_attributes('user', _required_user, _forgone_user) +@validate_attributes("user", _required_user, _forgone_user) def test_user_add_attribute(): target_application = webtest.TestApp(target_wsgi_application) - response = target_application.get('/user_attribute') - assert response.body == b'Hello World!' + response = target_application.get("/user_attribute") + assert response.body == b"Hello World!" -_settings_legacy_false = {'capture_params': False} +_settings_legacy_false = {"capture_params": False} _required_request_legacy_false = [] -_forgone_request_legacy_false = ['request.parameters.foo'] +_forgone_request_legacy_false = ["request.parameters.foo"] @override_application_settings(_settings_legacy_false) -@validate_attributes('agent', _required_request_legacy_false, - _forgone_request_legacy_false) +@validate_attributes("agent", _required_request_legacy_false, _forgone_request_legacy_false) def test_capture_request_params_legacy_false(): target_application = webtest.TestApp(target_wsgi_application) - response = target_application.get('/?foo=bar') - assert response.body == b'Hello World!' + response = target_application.get("/?foo=bar") + assert response.body == b"Hello World!" -_settings_legacy_true = {'capture_params': True} -_required_request_legacy_true = ['request.parameters.foo'] +_settings_legacy_true = {"capture_params": True} +_required_request_legacy_true = ["request.parameters.foo"] _forgone_request_legacy_true = [] @override_application_settings(_settings_legacy_true) -@validate_attributes('agent', _required_request_legacy_true, - _forgone_request_legacy_true) +@validate_attributes("agent", _required_request_legacy_true, _forgone_request_legacy_true) def test_capture_request_params_legacy_true(): target_application = webtest.TestApp(target_wsgi_application) - response = target_application.get('/?foo=bar') - assert response.body == b'Hello World!' + response = target_application.get("/?foo=bar") + assert response.body == b"Hello World!" -_required_request_default = ['request.parameters.foo'] +_required_request_default = ["request.parameters.foo"] _forgone_request_default = [] -@validate_attributes('agent', _required_request_default, - _forgone_request_default) +@validate_attributes("agent", _required_request_default, _forgone_request_default) def test_capture_request_params_default(): target_application = webtest.TestApp(target_wsgi_application) - response = target_application.get('/?foo=bar') - assert response.body == b'Hello World!' + response = target_application.get("/?foo=bar") + assert response.body == b"Hello World!" 
_required_display_host_default = [] -_forgone_display_host_default = ['host.displayName'] +_forgone_display_host_default = ["host.displayName"] -@validate_attributes('agent', _required_display_host_default, - _forgone_display_host_default) +@validate_attributes("agent", _required_display_host_default, _forgone_display_host_default) def test_display_host_default(): target_application = webtest.TestApp(target_wsgi_application) - response = target_application.get('/') - assert response.body == b'Hello World!' + response = target_application.get("/") + assert response.body == b"Hello World!" -_settings_display_host_custom = {'process_host.display_name': 'CUSTOM NAME'} +_settings_display_host_custom = {"process_host.display_name": "CUSTOM NAME"} -_display_name_attribute = Attribute(name='host.displayName', - value='CUSTOM NAME', destinations=_DESTINATIONS_WITH_EVENTS) +_display_name_attribute = Attribute( + name="host.displayName", value="CUSTOM NAME", destinations=_DESTINATIONS_WITH_EVENTS +) _required_display_host_custom = [_display_name_attribute] _forgone_display_host_custom = [] @override_application_settings(_settings_display_host_custom) -@validate_attributes_complete('agent', _required_display_host_custom, - _forgone_display_host_custom) +@validate_attributes_complete("agent", _required_display_host_custom, _forgone_display_host_custom) def test_display_host_custom(): target_application = webtest.TestApp(target_wsgi_application) - response = target_application.get('/') - assert response.body == b'Hello World!' + response = target_application.get("/") + assert response.body == b"Hello World!" # Tests for truncate() + def test_truncate_string(): - s = 'blahblah' + s = "blahblah" result = truncate(s, maxsize=4) assert isinstance(result, six.string_types) - assert result == 'blah' + assert result == "blah" def test_truncate_bytes(): - b = b'foobar' + b = b"foobar" result = truncate(b, maxsize=3) assert isinstance(result, six.binary_type) - assert result == b'foo' + assert result == b"foo" def test_truncate_unicode_snowman(): # '\u2603' is 'SNOWMAN' - u = u'snow\u2603' - assert u.encode('utf-8') == b'snow\xe2\x98\x83' + # decode("unicode-escape") is used to get Py2 unicode + u = "snow\u2603".decode("unicode-escape") if six.PY2 else "snow\u2603" + assert u.encode("utf-8") == b"snow\xe2\x98\x83" result = truncate(u, maxsize=5) assert isinstance(result, six.text_type) - assert result == u'snow' + assert result == "snow" def test_truncate_combining_characters(): # '\u0308' is 'COMBINING DIAERESIS' (AKA 'umlaut') - u = u'Zoe\u0308' - assert u.encode('utf-8') == b'Zoe\xcc\x88' + # decode("unicode-escape") is used to get Py2 unicode + u = "Zoe\u0308".decode("unicode-escape") if six.PY2 else "Zoe\u0308" + assert u.encode("utf-8") == b"Zoe\xcc\x88" # truncate will chop off 'COMBINING DIAERESIS', which leaves # 'LATIN SMALL LETTER E' by itself. 
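    # Concretely: u"Zoe\u0308" is four code points but five UTF-8 bytes
    # (b"Zoe" + b"\xcc\x88"); truncating to maxsize=3 bytes drops the
    # two-byte combining mark, and the result decodes back to u"Zoe".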
result = truncate(u, maxsize=3) assert isinstance(result, six.text_type) - assert result == u'Zoe' + assert result == "Zoe" def test_truncate_empty_string(): - s = '' + s = "" result = truncate(s, maxsize=4) assert isinstance(result, six.string_types) - assert result == '' + assert result == "" def test_truncate_empty_bytes(): - b = b'' + b = b"" result = truncate(b, maxsize=3) assert isinstance(result, six.binary_type) - assert result == b'' + assert result == b"" def test_truncate_empty_unicode(): - u = u'' + # decode("unicode-escape") is used to get Py2 unicode + u = "".decode("unicode-escape") if six.PY2 else "" result = truncate(u, maxsize=5) assert isinstance(result, six.text_type) - assert result == u'' + assert result == "" # Tests for limits on user attributes -TOO_LONG = '*' * 256 -TRUNCATED = '*' * 255 +TOO_LONG = "*" * 256 +TRUNCATED = "*" * 255 -_required_custom_params = [('key', 'value')] +_required_custom_params = [("key", "value")] _forgone_custom_params = [] @validate_custom_parameters(_required_custom_params, _forgone_custom_params) @background_task() def test_custom_param_ok(): - result = add_custom_parameter('key', 'value') + result = add_custom_attribute("key", "value") assert result @validate_custom_parameters(_required_custom_params, _forgone_custom_params) @background_task() def test_custom_params_ok(): - result = add_custom_parameters([('key', 'value')]) + result = add_custom_attributes([("key", "value")]) assert result _required_custom_params_long_key = [] -_forgone_custom_params_long_key = [(TOO_LONG, 'value')] +_forgone_custom_params_long_key = [(TOO_LONG, "value")] -@validate_custom_parameters(_required_custom_params_long_key, - _forgone_custom_params_long_key) +@validate_custom_parameters(_required_custom_params_long_key, _forgone_custom_params_long_key) @background_task() def test_custom_param_key_too_long(): - result = add_custom_parameter(TOO_LONG, 'value') + result = add_custom_attribute(TOO_LONG, "value") assert not result -@validate_custom_parameters(_required_custom_params_long_key, - _forgone_custom_params_long_key) +@validate_custom_parameters(_required_custom_params_long_key, _forgone_custom_params_long_key) @background_task() def test_custom_params_key_too_long(): - result = add_custom_parameters([(TOO_LONG, 'value')]) + result = add_custom_attributes([(TOO_LONG, "value")]) assert not result -_required_custom_params_long_value = [('key', TRUNCATED)] +_required_custom_params_long_value = [("key", TRUNCATED)] _forgone_custom_params_long_value = [] -@validate_custom_parameters(_required_custom_params_long_value, - _forgone_custom_params_long_value) +@validate_custom_parameters(_required_custom_params_long_value, _forgone_custom_params_long_value) @background_task() def test_custom_param_value_too_long(): - result = add_custom_parameter('key', TOO_LONG) + result = add_custom_attribute("key", TOO_LONG) assert result -@validate_custom_parameters(_required_custom_params_long_value, - _forgone_custom_params_long_value) +@validate_custom_parameters(_required_custom_params_long_value, _forgone_custom_params_long_value) @background_task() def test_custom_params_value_too_long(): - result = add_custom_parameters([('key', TOO_LONG)]) + result = add_custom_attributes([("key", TOO_LONG)]) assert result -_required_custom_params_too_many = [('key-127', 'value')] -_forgone_custom_params_too_many = [('key-128', 'value')] +_required_custom_params_too_many = [("key-127", "value")] +_forgone_custom_params_too_many = [("key-128", "value")] 
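Taken together, the limit tests above and the "too many" tests that follow pin down the attribute constraints being exercised: keys longer than 255 characters cause the attribute to be dropped, over-long values are truncated to 255 characters, and a transaction keeps at most 128 custom attributes. A minimal sketch of the first two rules using the renamed API (decorator usage mirrors the tests here):

    from newrelic.api.background_task import background_task
    from newrelic.api.transaction import add_custom_attribute

    @background_task()
    def demo_attribute_limits():
        # Over-long value: accepted, but stored truncated to 255 characters
        assert add_custom_attribute("key", "*" * 256)
        # Over-long key: the attribute is dropped and False is returned
        assert not add_custom_attribute("*" * 256, "value")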
-@validate_custom_parameters(_required_custom_params_too_many, - _forgone_custom_params_too_many) +@validate_custom_parameters(_required_custom_params_too_many, _forgone_custom_params_too_many) @background_task() def test_custom_param_too_many(): for i in range(129): - result = add_custom_parameter('key-%02d' % i, 'value') + result = add_custom_attribute("key-%02d" % i, "value") if i < 128: assert result else: - assert not result # Last one fails + assert not result # Last one fails -@validate_custom_parameters(_required_custom_params_too_many, - _forgone_custom_params_too_many) +@validate_custom_parameters(_required_custom_params_too_many, _forgone_custom_params_too_many) @background_task() def test_custom_params_too_many(): - item_list = [('key-%02d' % i, 'value') for i in range(129)] - result = add_custom_parameters(item_list) + item_list = [("key-%02d" % i, "value") for i in range(129)] + result = add_custom_attributes(item_list) assert not result _required_custom_params_name_not_string = [] -_forgone_custom_params_name_not_string = [(1, 'value')] +_forgone_custom_params_name_not_string = [(1, "value")] -@validate_custom_parameters(_required_custom_params_name_not_string, - _forgone_custom_params_name_not_string) +@validate_custom_parameters(_required_custom_params_name_not_string, _forgone_custom_params_name_not_string) @background_task() def test_custom_param_name_not_string(): - result = add_custom_parameter(1, 'value') + result = add_custom_attribute(1, "value") assert not result -@validate_custom_parameters(_required_custom_params_name_not_string, - _forgone_custom_params_name_not_string) +@validate_custom_parameters(_required_custom_params_name_not_string, _forgone_custom_params_name_not_string) @background_task() def test_custom_params_name_not_string(): - result = add_custom_parameters([(1, 'value')]) + result = add_custom_attributes([(1, "value")]) assert not result TOO_BIG = MAX_64_BIT_INT + 1 _required_custom_params_int_too_big = [] -_forgone_custom_params_int_too_big = [('key', TOO_BIG)] +_forgone_custom_params_int_too_big = [("key", TOO_BIG)] -@validate_custom_parameters(_required_custom_params_int_too_big, - _forgone_custom_params_int_too_big) +@validate_custom_parameters(_required_custom_params_int_too_big, _forgone_custom_params_int_too_big) @background_task() def test_custom_param_int_too_big(): - result = add_custom_parameter('key', TOO_BIG) + result = add_custom_attribute("key", TOO_BIG) assert not result -@validate_custom_parameters(_required_custom_params_int_too_big, - _forgone_custom_params_int_too_big) +@validate_custom_parameters(_required_custom_params_int_too_big, _forgone_custom_params_int_too_big) @background_task() def test_custom_params_int_too_big(): - result = add_custom_parameters([('key', TOO_BIG)]) + result = add_custom_attributes([("key", TOO_BIG)]) assert not result -OK_KEY = '*' * (255 - len('request.parameters.')) -OK_REQUEST_PARAM = 'request.parameters.' + OK_KEY -TOO_LONG_KEY = '*' * (256 - len('request.parameters.')) -TOO_LONG_REQUEST_PARAM = 'request.parameters.' + TOO_LONG_KEY +OK_KEY = "*" * (255 - len("request.parameters.")) +OK_REQUEST_PARAM = "request.parameters." + OK_KEY +TOO_LONG_KEY = "*" * (256 - len("request.parameters.")) +TOO_LONG_REQUEST_PARAM = "request.parameters." 
+ TOO_LONG_KEY assert len(OK_REQUEST_PARAM) == 255 assert len(TOO_LONG_REQUEST_PARAM) == 256 @@ -378,36 +382,33 @@ def test_custom_params_int_too_big(): _forgone_request_key_ok = [] -@validate_attributes('agent', _required_request_key_ok, - _forgone_request_key_ok) +@validate_attributes("agent", _required_request_key_ok, _forgone_request_key_ok) def test_capture_request_params_key_ok(): target_application = webtest.TestApp(target_wsgi_application) - response = target_application.get('/?%s=bar' % OK_KEY) - assert response.body == b'Hello World!' + response = target_application.get("/?%s=bar" % OK_KEY) + assert response.body == b"Hello World!" _required_request_key_too_long = [] _forgone_request_key_too_long = [TOO_LONG_REQUEST_PARAM] -@validate_attributes('agent', _required_request_key_too_long, - _forgone_request_key_too_long) +@validate_attributes("agent", _required_request_key_too_long, _forgone_request_key_too_long) def test_capture_request_params_key_too_long(): target_application = webtest.TestApp(target_wsgi_application) - response = target_application.get('/?%s=bar' % TOO_LONG_KEY) - assert response.body == b'Hello World!' + response = target_application.get("/?%s=bar" % TOO_LONG_KEY) + assert response.body == b"Hello World!" -_required_request_value_too_long = ['request.parameters.foo'] +_required_request_value_too_long = ["request.parameters.foo"] _forgone_request_value_too_long = [] -@validate_attributes('agent', _required_request_value_too_long, - _forgone_request_value_too_long) +@validate_attributes("agent", _required_request_value_too_long, _forgone_request_value_too_long) def test_capture_request_params_value_too_long(): target_application = webtest.TestApp(target_wsgi_application) - response = target_application.get('/?foo=%s' % TOO_LONG) - assert response.body == b'Hello World!' + response = target_application.get("/?foo=%s" % TOO_LONG) + assert response.body == b"Hello World!" # Test attribute types are according to Agent-Attributes spec. @@ -416,41 +417,48 @@ def test_capture_request_params_value_too_long(): # Types are only defined in the spec for agent attributes, not intrinsics. 
-agent_attributes = {'request.headers.accept': six.string_types, - 'request.headers.contentLength': int, - 'request.headers.contentType': six.string_types, - 'request.headers.host': six.string_types, - 'request.headers.referer': six.string_types, - 'request.headers.userAgent': six.string_types, - 'request.method': six.string_types, - 'request.parameters.test': six.string_types, - 'response.headers.contentLength': int, - 'response.headers.contentType': six.string_types, - 'response.status': six.string_types} +agent_attributes = { + "request.headers.accept": six.string_types, + "request.headers.contentLength": int, + "request.headers.contentType": six.string_types, + "request.headers.host": six.string_types, + "request.headers.referer": six.string_types, + "request.headers.userAgent": six.string_types, + "request.method": six.string_types, + "request.parameters.test": six.string_types, + "response.headers.contentLength": int, + "response.headers.contentType": six.string_types, + "response.status": six.string_types, +} @validate_agent_attribute_types(agent_attributes) def test_agent_attribute_types(): - test_environ = {'CONTENT_TYPE': 'HTML', 'CONTENT_LENGTH': '100', - 'HTTP_USER_AGENT': 'Firefox', 'HTTP_REFERER': 'somewhere', - 'HTTP_ACCEPT': 'everything'} - fully_featured_application.get('/?test=val', extra_environ=test_environ) + test_environ = { + "CONTENT_TYPE": "HTML", + "CONTENT_LENGTH": "100", + "HTTP_USER_AGENT": "Firefox", + "HTTP_REFERER": "somewhere", + "HTTP_ACCEPT": "everything", + } + fully_featured_application.get("/?test=val", extra_environ=test_environ) # Test sanitize() + def test_sanitize_string(): - s = 'foo' + s = "foo" assert sanitize(s) == s def test_sanitize_bytes(): - b = b'bytes' + b = b"bytes" assert sanitize(b) == b def test_sanitize_unicode(): - u = u'SMILING FACE: \u263a' + u = "SMILING FACE: \u263a" assert sanitize(u) == u @@ -467,22 +475,22 @@ def test_sanitize_int(): def test_sanitize_long(): - l = long(123456) - assert sanitize(l) == l + long_int = long(123456) + assert sanitize(long_int) == long_int def test_sanitize_dict(): - d = {1: 'foo'} + d = {1: "foo"} assert sanitize(d) == "{1: 'foo'}" def test_sanitize_list(): - l = [1, 2, 3, 4] - assert sanitize(l) == '[1, 2, 3, 4]' + list_var = [1, 2, 3, 4] + assert sanitize(list_var) == "[1, 2, 3, 4]" def test_sanitize_tuple(): - t = ('one', 'two', 'three') + t = ("one", "two", "three") assert sanitize(t) == "('one', 'two', 'three')" diff --git a/tests/agent_features/test_attributes_in_action.py b/tests/agent_features/test_attributes_in_action.py index 31c5d625d..08601fccf 100644 --- a/tests/agent_features/test_attributes_in_action.py +++ b/tests/agent_features/test_attributes_in_action.py @@ -20,20 +20,38 @@ override_application_settings, reset_core_stats_engine, validate_attributes, +) +from testing_support.validators.validate_browser_attributes import ( validate_browser_attributes, +) +from testing_support.validators.validate_error_event_attributes import ( validate_error_event_attributes, +) +from testing_support.validators.validate_error_event_attributes_outside_transaction import ( validate_error_event_attributes_outside_transaction, +) +from testing_support.validators.validate_error_trace_attributes import ( + validate_error_trace_attributes, +) +from testing_support.validators.validate_error_trace_attributes_outside_transaction import ( validate_error_trace_attributes_outside_transaction, +) +from testing_support.validators.validate_span_events import validate_span_events +from 
testing_support.validators.validate_transaction_error_trace_attributes import ( validate_transaction_error_trace_attributes, +) +from testing_support.validators.validate_transaction_event_attributes import ( validate_transaction_event_attributes, +) +from testing_support.validators.validate_transaction_trace_attributes import ( validate_transaction_trace_attributes, ) -from testing_support.validators.validate_span_events import validate_span_events from newrelic.api.application import application_instance as application +from newrelic.api.background_task import background_task from newrelic.api.message_transaction import message_transaction from newrelic.api.time_trace import notice_error -from newrelic.api.transaction import add_custom_parameter +from newrelic.api.transaction import add_custom_attribute, set_user_id from newrelic.api.wsgi_application import wsgi_application from newrelic.common.object_names import callable_name @@ -87,7 +105,16 @@ AGENT_KEYS_ALL = TRACE_ERROR_AGENT_KEYS + REQ_PARAMS -TRANS_EVENT_INTRINSICS = ("name", "duration", "type", "timestamp", "totalTime", "error") +TRANS_EVENT_INTRINSICS = ( + "name", + "duration", + "type", + "timestamp", + "totalTime", + "error", + "nr.apdexPerfZone", + "apdexPerfZone", +) TRANS_EVENT_AGENT_KEYS = [ "response.status", "request.method", @@ -132,8 +159,8 @@ def normal_wsgi_application(environ, start_response): output = "header
</head><body><p>RESPONSE</p></body></html>
" output = output.encode("UTF-8") - add_custom_parameter(USER_ATTRS[0], "test_value") - add_custom_parameter(USER_ATTRS[1], "test_value") + add_custom_attribute(USER_ATTRS[0], "test_value") + add_custom_attribute(USER_ATTRS[1], "test_value") response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", str(len(output)))] start_response(status, response_headers) @@ -905,3 +932,52 @@ def test_routing_key_agent_attribute(): @message_transaction(library="RabbitMQ", destination_type="Exchange", destination_name="x") def test_none_type_routing_key_agent_attribute(): pass + + +_required_agent_attributes = ["enduser.id"] +_forgone_agent_attributes = [] + + +@pytest.mark.parametrize( + "input_user_id, reported_user_id, high_security", + ( + ("1234", "1234", True), + ("a" * 260, "a" * 255, False), + ), +) +def test_enduser_id_attribute_api_valid_types(input_user_id, reported_user_id, high_security): + @reset_core_stats_engine() + @validate_error_trace_attributes( + callable_name(ValueError), exact_attrs={"user": {}, "intrinsic": {}, "agent": {"enduser.id": reported_user_id}} + ) + @validate_error_event_attributes( + exact_attrs={"user": {}, "intrinsic": {}, "agent": {"enduser.id": reported_user_id}} + ) + @validate_attributes("agent", _required_agent_attributes, _forgone_agent_attributes) + @background_task() + @override_application_settings({"high_security": high_security}) + def _test(): + set_user_id(input_user_id) + + try: + raise ValueError() + except Exception: + notice_error() + + _test() + + +@pytest.mark.parametrize("input_user_id", (None, "", 123)) +def test_enduser_id_attribute_api_invalid_types(input_user_id): + @reset_core_stats_engine() + @validate_attributes("agent", [], ["enduser.id"]) + @background_task() + def _test(): + set_user_id(input_user_id) + + try: + raise ValueError() + except Exception: + notice_error() + + _test() diff --git a/tests/agent_features/test_browser.py b/tests/agent_features/test_browser.py index 5f8492016..e0f562d1e 100644 --- a/tests/agent_features/test_browser.py +++ b/tests/agent_features/test_browser.py @@ -12,47 +12,55 @@ # See the License for the specific language governing permissions and # limitations under the License. -import sys -import webtest import json +import sys import six - -from testing_support.fixtures import (override_application_settings, - validate_transaction_errors, validate_custom_parameters) +import webtest +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_custom_parameters import ( + validate_custom_parameters, +) +from testing_support.validators.validate_transaction_errors import ( + validate_transaction_errors, +) from newrelic.api.application import application_settings -from newrelic.api.transaction import (get_browser_timing_header, - get_browser_timing_footer, add_custom_parameter, - disable_browser_autorum) +from newrelic.api.transaction import ( + add_custom_attribute, + disable_browser_autorum, + get_browser_timing_footer, + get_browser_timing_header, +) from newrelic.api.wsgi_application import wsgi_application from newrelic.common.encoding_utils import deobfuscate -_runtime_error_name = (RuntimeError.__module__ + ':' + RuntimeError.__name__) +_runtime_error_name = RuntimeError.__module__ + ":" + RuntimeError.__name__ + @wsgi_application() def target_wsgi_application_manual_rum(environ, start_response): - status = '200 OK' + status = "200 OK" - text = '%s
</head><body><p>RESPONSE</p>%s</body></html>' + text = "<html><head>%s</head><body><p>RESPONSE</p>
%s" - output = (text % (get_browser_timing_header(), - get_browser_timing_footer())).encode('UTF-8') + output = (text % (get_browser_timing_header(), get_browser_timing_footer())).encode("UTF-8") - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(output)))] + response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", str(len(output)))] start_response(status, response_headers) return [output] + target_application_manual_rum = webtest.TestApp(target_wsgi_application_manual_rum) _test_footer_attributes = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': False, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": False, + "js_agent_loader": "", } + @override_application_settings(_test_footer_attributes) def test_footer_attributes(): settings = application_settings() @@ -66,10 +74,10 @@ def test_footer_attributes(): assert settings.beacon assert settings.error_beacon - token = '0123456789ABCDEF' - headers = { 'Cookie': 'NRAGENT=tk=%s' % token } + token = "0123456789ABCDEF" # nosec + headers = {"Cookie": "NRAGENT=tk=%s" % token} - response = target_application_manual_rum.get('/', headers=headers) + response = target_application_manual_rum.get("/", headers=headers) header = response.html.html.head.script.string content = response.html.html.body.p.string @@ -77,702 +85,731 @@ def test_footer_attributes(): # Validate actual body content. - assert content == 'RESPONSE' + assert content == "RESPONSE" # Validate the insertion of RUM header. - assert header.find('NREUM HEADER') != -1 + assert header.find("NREUM HEADER") != -1 # Now validate the various fields of the footer. The fields are # held by a JSON dictionary. 
- data = json.loads(footer.split('NREUM.info=')[1]) + data = json.loads(footer.split("NREUM.info=")[1]) - assert data['licenseKey'] == settings.browser_key - assert data['applicationID'] == settings.application_id + assert data["licenseKey"] == settings.browser_key + assert data["applicationID"] == settings.application_id - assert data['agent'] == settings.js_agent_file - assert data['beacon'] == settings.beacon - assert data['errorBeacon'] == settings.error_beacon + assert data["agent"] == settings.js_agent_file + assert data["beacon"] == settings.beacon + assert data["errorBeacon"] == settings.error_beacon - assert data['applicationTime'] >= 0 - assert data['queueTime'] >= 0 + assert data["applicationTime"] >= 0 + assert data["queueTime"] >= 0 obfuscation_key = settings.license_key[:13] - assert type(data['transactionName']) == type(u'') + type_transaction_data = unicode if six.PY2 else str # noqa: F821 + assert isinstance(data["transactionName"], type_transaction_data) + + txn_name = deobfuscate(data["transactionName"], obfuscation_key) - txn_name = deobfuscate(data['transactionName'], obfuscation_key) + assert txn_name == "WebTransaction/Uri/" - assert txn_name == u'WebTransaction/Uri/' + assert "atts" not in data - assert 'atts' not in data _test_rum_ssl_for_http_is_none = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': False, - 'browser_monitoring.ssl_for_http': None, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": False, + "browser_monitoring.ssl_for_http": None, + "js_agent_loader": "", } + @override_application_settings(_test_rum_ssl_for_http_is_none) def test_ssl_for_http_is_none(): settings = application_settings() assert settings.browser_monitoring.ssl_for_http is None - response = target_application_manual_rum.get('/') + response = target_application_manual_rum.get("/") footer = response.html.html.body.script.string - data = json.loads(footer.split('NREUM.info=')[1]) + data = json.loads(footer.split("NREUM.info=")[1]) + + assert "sslForHttp" not in data - assert 'sslForHttp' not in data _test_rum_ssl_for_http_is_true = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': False, - 'browser_monitoring.ssl_for_http': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": False, + "browser_monitoring.ssl_for_http": True, + "js_agent_loader": "", } + @override_application_settings(_test_rum_ssl_for_http_is_true) def test_ssl_for_http_is_true(): settings = application_settings() assert settings.browser_monitoring.ssl_for_http is True - response = target_application_manual_rum.get('/') + response = target_application_manual_rum.get("/") footer = response.html.html.body.script.string - data = json.loads(footer.split('NREUM.info=')[1]) + data = json.loads(footer.split("NREUM.info=")[1]) + + assert data["sslForHttp"] is True - assert data['sslForHttp'] is True _test_rum_ssl_for_http_is_false = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': False, - 'browser_monitoring.ssl_for_http': False, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": False, + "browser_monitoring.ssl_for_http": False, + "js_agent_loader": "", } + @override_application_settings(_test_rum_ssl_for_http_is_false) def test_ssl_for_http_is_false(): settings = application_settings() assert settings.browser_monitoring.ssl_for_http is False - response = 
target_application_manual_rum.get('/') + response = target_application_manual_rum.get("/") footer = response.html.html.body.script.string - data = json.loads(footer.split('NREUM.info=')[1]) + data = json.loads(footer.split("NREUM.info=")[1]) + + assert data["sslForHttp"] is False - assert data['sslForHttp'] is False @wsgi_application() def target_wsgi_application_yield_single_no_head(environ, start_response): - status = '200 OK' + status = "200 OK" - output = b'<html><body><p>RESPONSE</p></body></html>' + output = b"<html><body><p>RESPONSE</p></body></html>"
" - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(output)))] + response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", str(len(output)))] start_response(status, response_headers) yield output -target_application_yield_single_no_head = webtest.TestApp( - target_wsgi_application_yield_single_no_head) + +target_application_yield_single_no_head = webtest.TestApp(target_wsgi_application_yield_single_no_head) _test_html_insertion_yield_single_no_head_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_yield_single_no_head_settings) def test_html_insertion_yield_single_no_head(): - response = target_application_yield_single_no_head.get('/', status=200) + response = target_application_yield_single_no_head.get("/", status=200) - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - response.mustcontain('NREUM HEADER', 'NREUM.info') + response.mustcontain("NREUM HEADER", "NREUM.info") + @wsgi_application() def target_wsgi_application_yield_multi_no_head(environ, start_response): - status = '200 OK' + status = "200 OK" - output = [ b'', b'

RESPONSE

' ] + output = [b"", b"

RESPONSE

"] - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(b''.join(output))))] + response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", str(len(b"".join(output))))] start_response(status, response_headers) for data in output: yield data -target_application_yield_multi_no_head = webtest.TestApp( - target_wsgi_application_yield_multi_no_head) + +target_application_yield_multi_no_head = webtest.TestApp(target_wsgi_application_yield_multi_no_head) _test_html_insertion_yield_multi_no_head_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_yield_multi_no_head_settings) def test_html_insertion_yield_multi_no_head(): - response = target_application_yield_multi_no_head.get('/', status=200) + response = target_application_yield_multi_no_head.get("/", status=200) - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - response.mustcontain('NREUM HEADER', 'NREUM.info') + response.mustcontain("NREUM HEADER", "NREUM.info") + @wsgi_application() def target_wsgi_application_unnamed_attachment_header(environ, start_response): - status = '200 OK' + status = "200 OK" - output = b'

RESPONSE

' + output = b"

RESPONSE

" - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(output))), - ('Content-Disposition', 'attachment')] + response_headers = [ + ("Content-Type", "text/html; charset=utf-8"), + ("Content-Length", str(len(output))), + ("Content-Disposition", "attachment"), + ] start_response(status, response_headers) yield output -target_application_unnamed_attachment_header = webtest.TestApp( - target_wsgi_application_unnamed_attachment_header) + +target_application_unnamed_attachment_header = webtest.TestApp(target_wsgi_application_unnamed_attachment_header) _test_html_insertion_unnamed_attachment_header_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } -@override_application_settings( - _test_html_insertion_unnamed_attachment_header_settings) + +@override_application_settings(_test_html_insertion_unnamed_attachment_header_settings) def test_html_insertion_unnamed_attachment_header(): - response = target_application_unnamed_attachment_header.get('/', status=200) + response = target_application_unnamed_attachment_header.get("/", status=200) - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers - assert 'Content-Disposition' in response.headers + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers + assert "Content-Disposition" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - response.mustcontain(no=['NREUM HEADER', 'NREUM.info']) + response.mustcontain(no=["NREUM HEADER", "NREUM.info"]) + @wsgi_application() def target_wsgi_application_named_attachment_header(environ, start_response): - status = '200 OK' + status = "200 OK" - output = b'

RESPONSE

' + output = b"

RESPONSE

" - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(output))), - ('Content-Disposition', 'Attachment; filename="X"')] + response_headers = [ + ("Content-Type", "text/html; charset=utf-8"), + ("Content-Length", str(len(output))), + ("Content-Disposition", 'Attachment; filename="X"'), + ] start_response(status, response_headers) yield output -target_application_named_attachment_header = webtest.TestApp( - target_wsgi_application_named_attachment_header) + +target_application_named_attachment_header = webtest.TestApp(target_wsgi_application_named_attachment_header) _test_html_insertion_named_attachment_header_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } -@override_application_settings( - _test_html_insertion_named_attachment_header_settings) + +@override_application_settings(_test_html_insertion_named_attachment_header_settings) def test_html_insertion_named_attachment_header(): - response = target_application_named_attachment_header.get('/', status=200) + response = target_application_named_attachment_header.get("/", status=200) - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers - assert 'Content-Disposition' in response.headers + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers + assert "Content-Disposition" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - response.mustcontain(no=['NREUM HEADER', 'NREUM.info']) + response.mustcontain(no=["NREUM HEADER", "NREUM.info"]) + @wsgi_application() def target_wsgi_application_inline_attachment_header(environ, start_response): - status = '200 OK' + status = "200 OK" - output = b'

RESPONSE

' + output = b"

RESPONSE

" - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(output))), - ('Content-Disposition', 'inline; filename="attachment"')] + response_headers = [ + ("Content-Type", "text/html; charset=utf-8"), + ("Content-Length", str(len(output))), + ("Content-Disposition", 'inline; filename="attachment"'), + ] start_response(status, response_headers) yield output -target_application_inline_attachment_header = webtest.TestApp( - target_wsgi_application_inline_attachment_header) + +target_application_inline_attachment_header = webtest.TestApp(target_wsgi_application_inline_attachment_header) _test_html_insertion_inline_attachment_header_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } -@override_application_settings( - _test_html_insertion_inline_attachment_header_settings) + +@override_application_settings(_test_html_insertion_inline_attachment_header_settings) def test_html_insertion_inline_attachment_header(): - response = target_application_inline_attachment_header.get('/', status=200) + response = target_application_inline_attachment_header.get("/", status=200) - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers - assert 'Content-Disposition' in response.headers + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers + assert "Content-Disposition" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - response.mustcontain('NREUM HEADER', 'NREUM.info') + response.mustcontain("NREUM HEADER", "NREUM.info") + @wsgi_application() def target_wsgi_application_empty_list(environ, start_response): - status = '200 OK' + status = "200 OK" - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', '0')] + response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", "0")] start_response(status, response_headers) return [] -target_application_empty_list = webtest.TestApp( - target_wsgi_application_empty_list) + +target_application_empty_list = webtest.TestApp(target_wsgi_application_empty_list) _test_html_insertion_empty_list_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } -@override_application_settings( - _test_html_insertion_empty_list_settings) + +@override_application_settings(_test_html_insertion_empty_list_settings) def test_html_insertion_empty_list(): - response = target_application_empty_list.get('/', status=200) + response = target_application_empty_list.get("/", status=200) - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. 
- response.mustcontain(no=['NREUM HEADER', 'NREUM.info']) + response.mustcontain(no=["NREUM HEADER", "NREUM.info"]) assert len(response.body) == 0 + @wsgi_application() def target_wsgi_application_single_empty_string(environ, start_response): - status = '200 OK' + status = "200 OK" - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', '0')] + response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", "0")] start_response(status, response_headers) - return [''] + return [""] + -target_application_single_empty_string = webtest.TestApp( - target_wsgi_application_single_empty_string) +target_application_single_empty_string = webtest.TestApp(target_wsgi_application_single_empty_string) _test_html_insertion_single_empty_string_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'<!-- NREUM HEADER -->', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "<!-- NREUM HEADER -->", } -@override_application_settings( - _test_html_insertion_single_empty_string_settings) + +@override_application_settings(_test_html_insertion_single_empty_string_settings) def test_html_insertion_single_empty_string(): - response = target_application_single_empty_string.get('/', status=200) + response = target_application_single_empty_string.get("/", status=200) - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - response.mustcontain(no=['NREUM HEADER', 'NREUM.info']) + response.mustcontain(no=["NREUM HEADER", "NREUM.info"]) assert len(response.body) == 0 + @wsgi_application() def target_wsgi_application_multiple_empty_string(environ, start_response): - status = '200 OK' + status = "200 OK" - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', '0')] + response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", "0")] start_response(status, response_headers) - return ['', ''] + return ["", ""] + -target_application_multiple_empty_string = webtest.TestApp( - target_wsgi_application_multiple_empty_string) +target_application_multiple_empty_string = webtest.TestApp(target_wsgi_application_multiple_empty_string) _test_html_insertion_multiple_empty_string_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'<!-- NREUM HEADER -->', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "<!-- NREUM HEADER -->", } -@override_application_settings( - _test_html_insertion_multiple_empty_string_settings) + +@override_application_settings(_test_html_insertion_multiple_empty_string_settings) def test_html_insertion_multiple_empty_string(): - response = target_application_multiple_empty_string.get('/', status=200) + response = target_application_multiple_empty_string.get("/", status=200) - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent.
- response.mustcontain(no=['NREUM HEADER', 'NREUM.info']) + response.mustcontain(no=["NREUM HEADER", "NREUM.info"]) assert len(response.body) == 0 + @wsgi_application() def target_wsgi_application_single_large_prelude(environ, start_response): - status = '200 OK' + status = "200 OK" - output = [64*1024*b' ' + b'<body></body>'] + output = [64 * 1024 * b" " + b"<body></body>"] - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(b''.join(output))))] + response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", str(len(b"".join(output))))] start_response(status, response_headers) return output -target_application_single_large_prelude = webtest.TestApp( - target_wsgi_application_single_large_prelude) + +target_application_single_large_prelude = webtest.TestApp(target_wsgi_application_single_large_prelude) _test_html_insertion_single_large_prelude_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'<!-- NREUM HEADER -->', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "<!-- NREUM HEADER -->", } -@override_application_settings( - _test_html_insertion_single_large_prelude_settings) + +@override_application_settings(_test_html_insertion_single_large_prelude_settings) def test_html_insertion_single_large_prelude(): - response = target_application_single_large_prelude.get('/', status=200) + response = target_application_single_large_prelude.get("/", status=200) # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers + + response.mustcontain(no=["NREUM HEADER", "NREUM.info"]) - response.mustcontain(no=['NREUM HEADER', 'NREUM.info']) + output = [32 * 1024 * b" ", 32 * 1024 * b" ", b"<body></body>"] - output = [32*1024*b' ', 32*1024*b' ', b'<body></body>'] + assert len(response.body) == len(b"".join(output)) - assert len(response.body) == len(b''.join(output)) @wsgi_application() def target_wsgi_application_multi_large_prelude(environ, start_response): - status = '200 OK' + status = "200 OK" - output = [32*1024*b' ', 32*1024*b' ', b'<body></body>'] + output = [32 * 1024 * b" ", 32 * 1024 * b" ", b"<body></body>"] - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(b''.join(output))))] + response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", str(len(b"".join(output))))] start_response(status, response_headers) return output -target_application_multi_large_prelude = webtest.TestApp( - target_wsgi_application_multi_large_prelude) + +target_application_multi_large_prelude = webtest.TestApp(target_wsgi_application_multi_large_prelude) _test_html_insertion_multi_large_prelude_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'<!-- NREUM HEADER -->', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "<!-- NREUM HEADER -->", } -@override_application_settings( - _test_html_insertion_multi_large_prelude_settings) + +@override_application_settings(_test_html_insertion_multi_large_prelude_settings) def test_html_insertion_multi_large_prelude(): - response = target_application_multi_large_prelude.get('/', status=200) + response = target_application_multi_large_prelude.get("/", status=200) # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers + + response.mustcontain(no=["NREUM HEADER", "NREUM.info"]) - response.mustcontain(no=['NREUM HEADER', 'NREUM.info']) + output = [32 * 1024 * b" ", 32 * 1024 * b" ", b"<body></body>"] - output = [32*1024*b' ', 32*1024*b' ', b'<body></body>'] + assert len(response.body) == len(b"".join(output)) - assert len(response.body) == len(b''.join(output)) @wsgi_application() def target_wsgi_application_yield_before_start(environ, start_response): - status = '200 OK' + status = "200 OK" # Ambiguous whether yield an empty string before calling # start_response() is legal. Various WSGI servers allow it # We have to disable WebTest lint check to get this to run. - yield b'' + yield b"" - output = b'<html><body><p>RESPONSE</p></body></html>' + output = b"<html><body><p>RESPONSE</p></body></html>"
" - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(output)))] + response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", str(len(output)))] start_response(status, response_headers) yield output -target_application_yield_before_start = webtest.TestApp( - target_wsgi_application_yield_before_start, lint=False) + +target_application_yield_before_start = webtest.TestApp(target_wsgi_application_yield_before_start, lint=False) _test_html_insertion_yield_before_start_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_yield_before_start_settings) def test_html_insertion_yield_before_start(): - response = target_application_yield_before_start.get('/', status=200) + response = target_application_yield_before_start.get("/", status=200) # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - response.mustcontain('NREUM HEADER', 'NREUM.info') + response.mustcontain("NREUM HEADER", "NREUM.info") + @wsgi_application() def target_wsgi_application_start_yield_start(environ, start_response): - output = b'

RESPONSE

' + output = b"

RESPONSE

" - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(output)))] + response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", str(len(output)))] - start_response('200 OK', response_headers) + start_response("200 OK", response_headers) - yield '' + yield "" try: - start_response(status, response_headers) + start_response(status, response_headers) # noqa: F821 except Exception: - start_response('500 Error', response_headers, sys.exc_info()) + start_response("500 Error", response_headers, sys.exc_info()) yield output -target_application_start_yield_start = webtest.TestApp( - target_wsgi_application_start_yield_start) + +target_application_start_yield_start = webtest.TestApp(target_wsgi_application_start_yield_start) _test_html_insertion_start_yield_start_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_start_yield_start_settings) def test_html_insertion_start_yield_start(): - response = target_application_start_yield_start.get('/', status=500) + response = target_application_start_yield_start.get("/", status=500) # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers + + response.mustcontain("NREUM HEADER", "NREUM.info") - response.mustcontain('NREUM HEADER', 'NREUM.info') @wsgi_application() def target_wsgi_application_invalid_content_length(environ, start_response): - output = b'

RESPONSE

' + output = b"

RESPONSE

" - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', 'XXX')] + response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", "XXX")] - start_response('200 OK', response_headers) + start_response("200 OK", response_headers) yield output -target_application_invalid_content_length = webtest.TestApp( - target_wsgi_application_invalid_content_length) + +target_application_invalid_content_length = webtest.TestApp(target_wsgi_application_invalid_content_length) _test_html_insertion_invalid_content_length_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_invalid_content_length_settings) def test_html_insertion_invalid_content_length(): - response = target_application_invalid_content_length.get('/', status=200) + response = target_application_invalid_content_length.get("/", status=200) # This is relying on WebTest not validating the # value of the Content-Length response header # and just passing it through as is. - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers - assert response.headers['Content-Length'] == 'XXX' + assert response.headers["Content-Length"] == "XXX" + + response.mustcontain(no=["NREUM HEADER", "NREUM.info"]) - response.mustcontain(no=['NREUM HEADER', 'NREUM.info']) @wsgi_application() def target_wsgi_application_content_encoding(environ, start_response): - output = b'

RESPONSE

' + output = b"

RESPONSE

" - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(output))), - ('Content-Encoding', 'identity')] + response_headers = [ + ("Content-Type", "text/html; charset=utf-8"), + ("Content-Length", str(len(output))), + ("Content-Encoding", "identity"), + ] - start_response('200 OK', response_headers) + start_response("200 OK", response_headers) yield output -target_application_content_encoding = webtest.TestApp( - target_wsgi_application_content_encoding) + +target_application_content_encoding = webtest.TestApp(target_wsgi_application_content_encoding) _test_html_insertion_content_encoding_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_content_encoding_settings) def test_html_insertion_content_encoding(): - response = target_application_content_encoding.get('/', status=200) + response = target_application_content_encoding.get("/", status=200) # Technically 'identity' should not be used in Content-Encoding # but clients will still accept it. Use this fact to disable auto # RUM for this test. Other option is to compress the response # and use 'gzip'. - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers - assert response.headers['Content-Encoding'] == 'identity' + assert response.headers["Content-Encoding"] == "identity" + + response.mustcontain(no=["NREUM HEADER", "NREUM.info"]) - response.mustcontain(no=['NREUM HEADER', 'NREUM.info']) @wsgi_application() def target_wsgi_application_no_content_type(environ, start_response): - output = b'

RESPONSE

' + output = b"

RESPONSE

" - response_headers = [('Content-Length', str(len(output)))] + response_headers = [("Content-Length", str(len(output)))] - start_response('200 OK', response_headers) + start_response("200 OK", response_headers) yield output -target_application_no_content_type = webtest.TestApp( - target_wsgi_application_no_content_type, lint=False) + +target_application_no_content_type = webtest.TestApp(target_wsgi_application_no_content_type, lint=False) _test_html_insertion_no_content_type_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_no_content_type_settings) def test_html_insertion_no_content_type(): - response = target_application_no_content_type.get('/', status=200) + response = target_application_no_content_type.get("/", status=200) + + assert "Content-Type" not in response.headers + assert "Content-Length" in response.headers - assert 'Content-Type' not in response.headers - assert 'Content-Length' in response.headers + response.mustcontain(no=["NREUM HEADER", "NREUM.info"]) - response.mustcontain(no=['NREUM HEADER', 'NREUM.info']) @wsgi_application() def target_wsgi_application_plain_text(environ, start_response): - output = b'RESPONSE' + output = b"RESPONSE" - response_headers = [('Content-Type', 'text/plain'), - ('Content-Length', str(len(output)))] + response_headers = [("Content-Type", "text/plain"), ("Content-Length", str(len(output)))] - start_response('200 OK', response_headers) + start_response("200 OK", response_headers) yield output -target_application_plain_text = webtest.TestApp( - target_wsgi_application_plain_text) + +target_application_plain_text = webtest.TestApp(target_wsgi_application_plain_text) _test_html_insertion_plain_text_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_plain_text_settings) def test_html_insertion_plain_text(): - response = target_application_plain_text.get('/', status=200) + response = target_application_plain_text.get("/", status=200) + + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers + response.mustcontain(no=["NREUM HEADER", "NREUM.info"]) - response.mustcontain(no=['NREUM HEADER', 'NREUM.info']) @wsgi_application() def target_wsgi_application_write_callback(environ, start_response): - output = b'

RESPONSE

' + output = b"

RESPONSE

" - response_headers = [('Content-Type', 'text/html'), - ('Content-Length', str(len(output)))] + response_headers = [("Content-Type", "text/html"), ("Content-Length", str(len(output)))] - write = start_response('200 OK', response_headers) + write = start_response("200 OK", response_headers) write(output) return [] -target_application_write_callback = webtest.TestApp( - target_wsgi_application_write_callback) + +target_application_write_callback = webtest.TestApp(target_wsgi_application_write_callback) _test_html_insertion_write_callback_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_write_callback_settings) def test_html_insertion_write_callback(): - response = target_application_write_callback.get('/', status=200) + response = target_application_write_callback.get("/", status=200) + + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers + response.mustcontain(no=["NREUM HEADER", "NREUM.info"]) - response.mustcontain(no=['NREUM HEADER', 'NREUM.info']) @wsgi_application() def target_wsgi_application_yield_before_write(environ, start_response): - output = [b'', b'

RESPONSE

'] + output = [b"", b"

RESPONSE

"] - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(b''.join(output))))] + response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", str(len(b"".join(output))))] - write = start_response('200 OK', response_headers) + write = start_response("200 OK", response_headers) # Technically this is in violation of the WSGI specification # if that write() should always be before yields. @@ -781,172 +818,177 @@ def target_wsgi_application_yield_before_write(environ, start_response): write(output.pop(0)) -target_application_yield_before_write = webtest.TestApp( - target_wsgi_application_yield_before_write) + +target_application_yield_before_write = webtest.TestApp(target_wsgi_application_yield_before_write) _test_html_insertion_yield_before_write_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_yield_before_write_settings) def test_html_insertion_yield_before_write(): - response = target_application_yield_before_write.get('/', status=200) + response = target_application_yield_before_write.get("/", status=200) - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers - response.mustcontain(no=['NREUM HEADER', 'NREUM.info']) + response.mustcontain(no=["NREUM HEADER", "NREUM.info"]) - expected = b'

RESPONSE

' + expected = b"

RESPONSE

" assert response.body == expected + @wsgi_application() def target_wsgi_application_write_before_yield(environ, start_response): - output = [b'', b'

RESPONSE

'] + output = [b"", b"

RESPONSE

"] - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(b''.join(output))))] + response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", str(len(b"".join(output))))] - write = start_response('200 OK', response_headers) + write = start_response("200 OK", response_headers) write(output.pop(0)) yield output.pop(0) -target_application_write_before_yield = webtest.TestApp( - target_wsgi_application_write_before_yield) + +target_application_write_before_yield = webtest.TestApp(target_wsgi_application_write_before_yield) _test_html_insertion_write_before_yield_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_write_before_yield_settings) def test_html_insertion_write_before_yield(): - response = target_application_write_before_yield.get('/', status=200) + response = target_application_write_before_yield.get("/", status=200) - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers - response.mustcontain(no=['NREUM HEADER', 'NREUM.info']) + response.mustcontain(no=["NREUM HEADER", "NREUM.info"]) - expected = b'

RESPONSE

' + expected = b"

RESPONSE

" assert response.body == expected + @wsgi_application() def target_wsgi_application_param_on_close(environ, start_response): - output = b'

RESPONSE

' + output = b"

RESPONSE

" - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(output)))] + response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", str(len(output)))] - start_response('200 OK', response_headers) + start_response("200 OK", response_headers) try: yield output finally: - add_custom_parameter('key', 'value') + add_custom_attribute("key", "value") + -target_application_param_on_close = webtest.TestApp( - target_wsgi_application_param_on_close) +target_application_param_on_close = webtest.TestApp(target_wsgi_application_param_on_close) _test_html_insertion_param_on_close_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_param_on_close_settings) -@validate_custom_parameters(required_params=[('key', 'value')]) +@validate_custom_parameters(required_params=[("key", "value")]) def test_html_insertion_param_on_close(): - response = target_application_param_on_close.get('/', status=200) + response = target_application_param_on_close.get("/", status=200) + + response.mustcontain("NREUM HEADER", "NREUM.info") - response.mustcontain('NREUM HEADER', 'NREUM.info') @wsgi_application() def target_wsgi_application_param_on_error(environ, start_response): - output = b'

RESPONSE

' + output = b"

RESPONSE

" - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(output)))] + response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", str(len(output)))] - start_response('200 OK', response_headers) + start_response("200 OK", response_headers) try: - raise RuntimeError('ERROR') + raise RuntimeError("ERROR") yield output finally: - add_custom_parameter('key', 'value') + add_custom_attribute("key", "value") + -target_application_param_on_error = webtest.TestApp( - target_wsgi_application_param_on_error) +target_application_param_on_error = webtest.TestApp(target_wsgi_application_param_on_error) _test_html_insertion_param_on_error_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } + @override_application_settings(_test_html_insertion_param_on_error_settings) @validate_transaction_errors(errors=[_runtime_error_name]) -@validate_custom_parameters(required_params=[('key', 'value')]) +@validate_custom_parameters(required_params=[("key", "value")]) def test_html_insertion_param_on_error(): try: - response = target_application_param_on_error.get('/', status=500) + response = target_application_param_on_error.get("/", status=500) except RuntimeError: pass + @wsgi_application() def target_wsgi_application_disable_autorum_via_api(environ, start_response): - status = '200 OK' + status = "200 OK" - output = b'

RESPONSE

' + output = b"

RESPONSE

" disable_browser_autorum() - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(output)))] + response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", str(len(output)))] start_response(status, response_headers) yield output -target_application_disable_autorum_via_api = webtest.TestApp( - target_wsgi_application_disable_autorum_via_api) + +target_application_disable_autorum_via_api = webtest.TestApp(target_wsgi_application_disable_autorum_via_api) _test_html_insertion_disable_autorum_via_api_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } -@override_application_settings( - _test_html_insertion_disable_autorum_via_api_settings) + +@override_application_settings(_test_html_insertion_disable_autorum_via_api_settings) def test_html_insertion_disable_autorum_via_api(): - response = target_application_disable_autorum_via_api.get('/', status=200) + response = target_application_disable_autorum_via_api.get("/", status=200) - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - response.mustcontain(no=['NREUM HEADER', 'NREUM.info']) + response.mustcontain(no=["NREUM HEADER", "NREUM.info"]) + @wsgi_application() def target_wsgi_application_manual_rum_insertion(environ, start_response): - status = '200 OK' + status = "200 OK" - output = b'

RESPONSE

' + output = b"

RESPONSE

" header = get_browser_timing_header() footer = get_browser_timing_footer() @@ -954,34 +996,33 @@ def target_wsgi_application_manual_rum_insertion(environ, start_response): header = get_browser_timing_header() footer = get_browser_timing_footer() - assert header == '' - assert footer == '' + assert header == "" + assert footer == "" - response_headers = [('Content-Type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(output)))] + response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", str(len(output)))] start_response(status, response_headers) yield output -target_application_manual_rum_insertion = webtest.TestApp( - target_wsgi_application_manual_rum_insertion) + +target_application_manual_rum_insertion = webtest.TestApp(target_wsgi_application_manual_rum_insertion) _test_html_insertion_manual_rum_insertion_settings = { - 'browser_monitoring.enabled': True, - 'browser_monitoring.auto_instrument': True, - 'js_agent_loader': u'', + "browser_monitoring.enabled": True, + "browser_monitoring.auto_instrument": True, + "js_agent_loader": "", } -@override_application_settings( - _test_html_insertion_manual_rum_insertion_settings) + +@override_application_settings(_test_html_insertion_manual_rum_insertion_settings) def test_html_insertion_manual_rum_insertion(): - response = target_application_manual_rum_insertion.get('/', status=200) + response = target_application_manual_rum_insertion.get("/", status=200) - assert 'Content-Type' in response.headers - assert 'Content-Length' in response.headers + assert "Content-Type" in response.headers + assert "Content-Length" in response.headers # The 'NREUM HEADER' value comes from our override for the header. # The 'NREUM.info' value comes from the programmatically generated # footer added by the agent. - response.mustcontain(no=['NREUM HEADER', 'NREUM.info']) + response.mustcontain(no=["NREUM HEADER", "NREUM.info"]) diff --git a/tests/agent_features/test_code_level_metrics.py b/tests/agent_features/test_code_level_metrics.py index 1d2bd6c3a..a7aeaa39a 100644 --- a/tests/agent_features/test_code_level_metrics.py +++ b/tests/agent_features/test_code_level_metrics.py @@ -12,25 +12,40 @@ # See the License for the specific language governing permissions and # limitations under the License. 
-import sys import sqlite3 -import newrelic.packages.six as six -import pytest +import sys -from testing_support.fixtures import override_application_settings, dt_enabled +import pytest +from _test_code_level_metrics import ( + CLASS_INSTANCE, + CLASS_INSTANCE_CALLABLE, + TYPE_CONSTRUCTOR_CALLABLE_CLASS_INSTANCE, + TYPE_CONSTRUCTOR_CLASS_INSTANCE, + ExerciseClass, + ExerciseClassCallable, + ExerciseTypeConstructor, + ExerciseTypeConstructorCallable, +) +from _test_code_level_metrics import __file__ as FILE_PATH +from _test_code_level_metrics import ( + exercise_function, + exercise_lambda, + exercise_partial, +) +from testing_support.fixtures import dt_enabled, override_application_settings from testing_support.validators.validate_span_events import validate_span_events +import newrelic.packages.six as six from newrelic.api.background_task import background_task -from newrelic.api.function_trace import FunctionTrace, FunctionTraceWrapper - -from _test_code_level_metrics import exercise_function, CLASS_INSTANCE, CLASS_INSTANCE_CALLABLE, exercise_lambda, exercise_partial, ExerciseClass, ExerciseClassCallable, __file__ as FILE_PATH - +from newrelic.api.function_trace import FunctionTrace is_pypy = hasattr(sys, "pypy_version_info") NAMESPACE = "_test_code_level_metrics" CLASS_NAMESPACE = ".".join((NAMESPACE, "ExerciseClass")) CALLABLE_CLASS_NAMESPACE = ".".join((NAMESPACE, "ExerciseClassCallable")) +TYPE_CONSTRUCTOR_NAMESPACE = ".".join((NAMESPACE, "ExerciseTypeConstructor")) +TYPE_CONSTRUCTOR_CALLABLE_NAMESPACE = ".".join((NAMESPACE, "ExerciseTypeConstructorCallable")) FUZZY_NAMESPACE = CLASS_NAMESPACE if six.PY3 else NAMESPACE if FILE_PATH.endswith(".pyc"): FILE_PATH = FILE_PATH[:-1] @@ -39,115 +54,166 @@ BUILTIN_ATTRS = {"code.filepath": "<builtin>", "code.lineno": None} if not is_pypy else {} + def merge_dicts(A, B): d = {} d.update(A) d.update(B) return d -@pytest.mark.parametrize( - "func,args,agents", - ( - ( # Function - exercise_function, - (), - { - "code.filepath": FILE_PATH, - "code.function": "exercise_function", - "code.lineno": 16, - "code.namespace": NAMESPACE, - }, - ), - ( # Method - CLASS_INSTANCE.exercise_method, - (), - { - "code.filepath": FILE_PATH, - "code.function": "exercise_method", - "code.lineno": 21, - "code.namespace": CLASS_NAMESPACE, - }, - ), - ( # Static Method - CLASS_INSTANCE.exercise_static_method, - (), - { - "code.filepath": FILE_PATH, - "code.function": "exercise_static_method", - "code.lineno": 24, - "code.namespace": FUZZY_NAMESPACE, - }, - ), - ( # Class Method - ExerciseClass.exercise_class_method, - (), - { - "code.filepath": FILE_PATH, - "code.function": "exercise_class_method", - "code.lineno": 28, - "code.namespace": CLASS_NAMESPACE, - }, - ), - ( # Callable object - CLASS_INSTANCE_CALLABLE, - (), - { - "code.filepath": FILE_PATH, - "code.function": "__call__", - "code.lineno": 34, - "code.namespace": CALLABLE_CLASS_NAMESPACE, - }, - ), - ( # Lambda - exercise_lambda, - (), - { - "code.filepath": FILE_PATH, - "code.function": "<lambda>", - "code.lineno": 40, - "code.namespace": NAMESPACE, - }, - ), - ( # Functools Partials - exercise_partial, - (), + +@pytest.fixture +def extract(): + def _extract(obj): + with FunctionTrace("_test", source=obj): + pass + + return _extract + + +_TEST_BASIC_CALLABLES = { + "function": ( + exercise_function, + (), + { + "code.filepath": FILE_PATH, + "code.function": "exercise_function", + "code.lineno": 17, + "code.namespace": NAMESPACE, + }, + ), + "lambda": ( + exercise_lambda, + (), + { + "code.filepath": FILE_PATH, +
"code.function": "", + "code.lineno": 75, + "code.namespace": NAMESPACE, + }, + ), + "partial": ( + exercise_partial, + (), + { + "code.filepath": FILE_PATH, + "code.function": "exercise_function", + "code.lineno": 17, + "code.namespace": NAMESPACE, + }, + ), + "builtin_function": ( + max, + (1, 2), + merge_dicts( { - "code.filepath": FILE_PATH, - "code.function": "exercise_function", - "code.lineno": 16, - "code.namespace": NAMESPACE, - }, - ), - ( # Top Level Builtin - max, - (1, 2), - merge_dicts({ "code.function": "max", "code.namespace": "builtins" if six.PY3 else "__builtin__", - }, BUILTIN_ATTRS), + }, + BUILTIN_ATTRS, ), - ( # Module Level Builtin - sqlite3.connect, - (":memory:",), - merge_dicts({ + ), + "builtin_module_function": ( + sqlite3.connect, + (":memory:",), + merge_dicts( + { "code.function": "connect", "code.namespace": "_sqlite3", - }, BUILTIN_ATTRS), + }, + BUILTIN_ATTRS, ), - ( # Builtin Method - SQLITE_CONNECTION.__enter__, - (), - merge_dicts({ + ), +} + + +@pytest.mark.parametrize( + "func,args,agents", + [pytest.param(*args, id=id_) for id_, args in six.iteritems(_TEST_BASIC_CALLABLES)], +) +def test_code_level_metrics_basic_callables(func, args, agents, extract): + @override_application_settings( + { + "code_level_metrics.enabled": True, + } + ) + @dt_enabled + @validate_span_events( + count=1, + exact_agents=agents, + ) + @background_task() + def _test(): + extract(func) + + _test() + + +_TEST_METHODS = { + "method": ( + CLASS_INSTANCE.exercise_method, + (), + { + "code.filepath": FILE_PATH, + "code.function": "exercise_method", + "code.lineno": 22, + "code.namespace": CLASS_NAMESPACE, + }, + ), + "static_method": ( + CLASS_INSTANCE.exercise_static_method, + (), + { + "code.filepath": FILE_PATH, + "code.function": "exercise_static_method", + "code.lineno": 25, + "code.namespace": FUZZY_NAMESPACE, + }, + ), + "class_method": ( + ExerciseClass.exercise_class_method, + (), + { + "code.filepath": FILE_PATH, + "code.function": "exercise_class_method", + "code.lineno": 29, + "code.namespace": CLASS_NAMESPACE, + }, + ), + "call_method": ( + CLASS_INSTANCE_CALLABLE, + (), + { + "code.filepath": FILE_PATH, + "code.function": "__call__", + "code.lineno": 35, + "code.namespace": CALLABLE_CLASS_NAMESPACE, + }, + ), + "builtin_method": ( + SQLITE_CONNECTION.__enter__, + (), + merge_dicts( + { "code.function": "__enter__", "code.namespace": "sqlite3.Connection" if not is_pypy else "_sqlite3.Connection", - }, BUILTIN_ATTRS), + }, + BUILTIN_ATTRS, ), ), +} + + +@pytest.mark.parametrize( + "func,args,agents", + [pytest.param(*args, id=id_) for id_, args in six.iteritems(_TEST_METHODS)], ) -def test_code_level_metrics_callables(func, args, agents): - @override_application_settings({ - "code_level_metrics.enabled": True, - }) +def test_code_level_metrics_methods(func, args, agents, extract): + @override_application_settings( + { + "code_level_metrics.enabled": True, + } + ) @dt_enabled @validate_span_events( count=1, @@ -155,47 +221,145 @@ def test_code_level_metrics_callables(func, args, agents): ) @background_task() def _test(): - FunctionTraceWrapper(func)(*args) + extract(func) _test() +_TEST_TYPE_CONSTRUCTOR_METHODS = { + "method": ( + TYPE_CONSTRUCTOR_CLASS_INSTANCE.exercise_method, + (), + { + "code.filepath": FILE_PATH, + "code.function": "exercise_method", + "code.lineno": 39, + "code.namespace": TYPE_CONSTRUCTOR_NAMESPACE, + }, + ), + "static_method": ( + TYPE_CONSTRUCTOR_CLASS_INSTANCE.exercise_static_method, + (), + { + "code.filepath": FILE_PATH, + "code.function": 
"exercise_static_method", + "code.lineno": 43, + "code.namespace": NAMESPACE, + }, + ), + "class_method": ( + ExerciseTypeConstructor.exercise_class_method, + (), + { + "code.filepath": FILE_PATH, + "code.function": "exercise_class_method", + "code.lineno": 48, + "code.namespace": TYPE_CONSTRUCTOR_NAMESPACE, + }, + ), + "lambda_method": ( + ExerciseTypeConstructor.exercise_lambda, + (), + { + "code.filepath": FILE_PATH, + "code.function": "", + "code.lineno": 61, + # Lambdas behave strangely in type constructors on Python 2 and use the class namespace. + "code.namespace": NAMESPACE if six.PY3 else TYPE_CONSTRUCTOR_NAMESPACE, + }, + ), + "call_method": ( + TYPE_CONSTRUCTOR_CALLABLE_CLASS_INSTANCE, + (), + { + "code.filepath": FILE_PATH, + "code.function": "__call__", + "code.lineno": 53, + "code.namespace": TYPE_CONSTRUCTOR_CALLABLE_NAMESPACE, + }, + ), +} + + @pytest.mark.parametrize( - "obj,agents", - ( - ( # Class with __call__ - ExerciseClassCallable, - { - "code.filepath": FILE_PATH, - "code.function": "ExerciseClassCallable", - "code.lineno": 33, - "code.namespace":NAMESPACE, - }, - ), - ( # Class without __call__ - ExerciseClass, - { - "code.filepath": FILE_PATH, - "code.function": "ExerciseClass", - "code.lineno": 20, - "code.namespace": NAMESPACE, - }, - ), - ( # Non-callable Object instance - CLASS_INSTANCE, - { - "code.filepath": FILE_PATH, - "code.function": "ExerciseClass", - "code.lineno": 20, - "code.namespace": NAMESPACE, - }, - ), + "func,args,agents", + [pytest.param(*args, id=id_) for id_, args in six.iteritems(_TEST_TYPE_CONSTRUCTOR_METHODS)], +) +def test_code_level_metrics_type_constructor_methods(func, args, agents, extract): + @override_application_settings( + { + "code_level_metrics.enabled": True, + } + ) + @dt_enabled + @validate_span_events( + count=1, + exact_agents=agents, + ) + @background_task() + def _test(): + extract(func) + + _test() + + +_TEST_OBJECTS = { + "class": ( + ExerciseClass, + { + "code.filepath": FILE_PATH, + "code.function": "ExerciseClass", + "code.lineno": 21, + "code.namespace": NAMESPACE, + }, + ), + "callable_class": ( + ExerciseClassCallable, + { + "code.filepath": FILE_PATH, + "code.function": "ExerciseClassCallable", + "code.lineno": 34, + "code.namespace": NAMESPACE, + }, + ), + "type_constructor_class": ( + ExerciseTypeConstructor, + { + "code.filepath": FILE_PATH, + "code.function": "ExerciseTypeConstructor", + "code.namespace": NAMESPACE, + }, + ), + "type_constructor_class_callable_class": ( + ExerciseTypeConstructorCallable, + { + "code.filepath": FILE_PATH, + "code.function": "ExerciseTypeConstructorCallable", + "code.namespace": NAMESPACE, + }, ), + "non_callable_object": ( + CLASS_INSTANCE, + { + "code.filepath": FILE_PATH, + "code.function": "ExerciseClass", + "code.lineno": 21, + "code.namespace": NAMESPACE, + }, + ), +} + + +@pytest.mark.parametrize( + "obj,agents", + [pytest.param(*args, id=id_) for id_, args in six.iteritems(_TEST_OBJECTS)], ) -def test_code_level_metrics_objects(obj, agents): - @override_application_settings({ - "code_level_metrics.enabled": True, - }) +def test_code_level_metrics_objects(obj, agents, extract): + @override_application_settings( + { + "code_level_metrics.enabled": True, + } + ) @dt_enabled @validate_span_events( count=1, @@ -203,7 +367,6 @@ def test_code_level_metrics_objects(obj, agents): ) @background_task() def _test(): - with FunctionTrace("_test", source=obj): - pass - - _test() \ No newline at end of file + extract(obj) + + _test() diff --git 
a/tests/agent_features/test_collector_payloads.py b/tests/agent_features/test_collector_payloads.py index 17b46ce49..42510e5c7 100644 --- a/tests/agent_features/test_collector_payloads.py +++ b/tests/agent_features/test_collector_payloads.py @@ -14,17 +14,30 @@ import pytest import webtest - -from testing_support.fixtures import (validate_error_trace_collector_json, - validate_tt_collector_json, validate_transaction_event_collector_json, - validate_error_event_collector_json, - validate_custom_event_collector_json, override_application_settings) - -from testing_support.sample_applications import (simple_app, - simple_exceptional_app, simple_custom_event_app) - -from testing_support.validators.validate_log_event_collector_json import validate_log_event_collector_json - +from testing_support.fixtures import override_application_settings +from testing_support.sample_applications import ( + simple_app, + simple_custom_event_app, + simple_exceptional_app, +) +from testing_support.validators.validate_custom_event_collector_json import ( + validate_custom_event_collector_json, +) +from testing_support.validators.validate_error_event_collector_json import ( + validate_error_event_collector_json, +) +from testing_support.validators.validate_error_trace_collector_json import ( + validate_error_trace_collector_json, +) +from testing_support.validators.validate_log_event_collector_json import ( + validate_log_event_collector_json, +) +from testing_support.validators.validate_transaction_event_collector_json import ( + validate_transaction_event_collector_json, +) +from testing_support.validators.validate_tt_collector_json import ( + validate_tt_collector_json, +) exceptional_application = webtest.TestApp(simple_exceptional_app) normal_application = webtest.TestApp(simple_app) @@ -34,7 +47,7 @@ @validate_error_trace_collector_json() def test_error_trace_json(): try: - exceptional_application.get('/') + exceptional_application.get("/") except ValueError: pass @@ -42,34 +55,34 @@ def test_error_trace_json(): @validate_error_event_collector_json() def test_error_event_json(): try: - exceptional_application.get('/') + exceptional_application.get("/") except ValueError: pass @validate_tt_collector_json() def test_transaction_trace_json(): - normal_application.get('/') + normal_application.get("/") @validate_tt_collector_json(exclude_request_uri=True) -@override_application_settings({'attributes.exclude': set(('request.uri',))}) +@override_application_settings({"attributes.exclude": set(("request.uri",))}) def test_transaction_trace_json_no_request_uri(): - normal_application.get('/') + normal_application.get("/") @validate_transaction_event_collector_json() def test_transaction_event_json(): - normal_application.get('/') + normal_application.get("/") @validate_custom_event_collector_json() def test_custom_event_json(): - custom_event_application.get('/') + custom_event_application.get("/") @pytest.mark.xfail(reason="Unwritten validator") @validate_log_event_collector_json def test_log_event_json(): - normal_application.get('/') + normal_application.get("/") raise NotImplementedError("Fix my validator") diff --git a/tests/agent_features/test_configuration.py b/tests/agent_features/test_configuration.py index 5846e3808..79f2a41f1 100644 --- a/tests/agent_features/test_configuration.py +++ b/tests/agent_features/test_configuration.py @@ -13,6 +13,7 @@ # limitations under the License. 
import collections +import tempfile import pytest @@ -21,8 +22,18 @@ except ImportError: import urllib.parse as urlparse +import logging + +from newrelic.api.exceptions import ConfigurationError from newrelic.common.object_names import callable_name -from newrelic.config import delete_setting, translate_deprecated_settings +from newrelic.config import ( + _reset_config_parser, + _reset_configuration_done, + _reset_instrumentation_done, + delete_setting, + initialize, + translate_deprecated_settings, +) from newrelic.core.config import ( Settings, apply_config_setting, @@ -34,6 +45,10 @@ ) +def function_to_trace(): + pass + + def parameterize_local_config(settings_list): settings_object_list = [] @@ -262,7 +277,6 @@ def parameterize_local_config(settings_list): @parameterize_local_config(_test_dictionary_local_config) def test_dict_parse(settings): - assert "NR-SESSION" in settings.request_headers_map config = settings.event_harvest_config @@ -438,12 +452,12 @@ def test_delete_setting_parent(): TSetting("event_harvest_config.harvest_limits.error_event_data", 100, 100), ), ( - TSetting("custom_insights_events.max_samples_stored", 1200, 1200), - TSetting("event_harvest_config.harvest_limits.custom_event_data", 9999, 1200), + TSetting("custom_insights_events.max_samples_stored", 3600, 3600), + TSetting("event_harvest_config.harvest_limits.custom_event_data", 9999, 3600), ), ( - TSetting("custom_insights_events.max_samples_stored", 9999, 1200), - TSetting("event_harvest_config.harvest_limits.custom_event_data", 1200, 1200), + TSetting("custom_insights_events.max_samples_stored", 9999, 3600), + TSetting("event_harvest_config.harvest_limits.custom_event_data", 3600, 3600), ), ( TSetting("application_logging.forwarding.max_samples_stored", 10000, 10000), @@ -583,3 +597,380 @@ def test_default_values(name, expected_value): settings = global_settings() value = fetch_config_setting(settings, name) assert value == expected_value + + +def test_initialize(): + initialize() + + +newrelic_ini_contents = b""" +[newrelic] +app_name = Python Agent Test (agent_features) +""" + + +def test_initialize_raises_if_config_does_not_match_previous(): + error_message = "Configuration has already been done against " "differing configuration file or environment.*" + with pytest.raises(ConfigurationError, match=error_message): + with tempfile.NamedTemporaryFile() as f: + f.write(newrelic_ini_contents) + f.seek(0) + + initialize(config_file=f.name) + + +def test_initialize_via_config_file(): + _reset_configuration_done() + with tempfile.NamedTemporaryFile() as f: + f.write(newrelic_ini_contents) + f.seek(0) + + initialize(config_file=f.name) + + +def test_initialize_no_config_file(): + _reset_configuration_done() + initialize() + + +def test_initialize_config_file_does_not_exist(): + _reset_configuration_done() + error_message = "Unable to open configuration file does-not-exist." 
+ with pytest.raises(ConfigurationError, match=error_message): + initialize(config_file="does-not-exist") + + +def test_initialize_environment(): + _reset_configuration_done() + with tempfile.NamedTemporaryFile() as f: + f.write(newrelic_ini_contents) + f.seek(0) + + initialize(config_file=f.name, environment="development") + + +def test_initialize_log_level(): + _reset_configuration_done() + with tempfile.NamedTemporaryFile() as f: + f.write(newrelic_ini_contents) + f.seek(0) + + initialize(config_file=f.name, log_level="debug") + + +def test_initialize_log_file(): + _reset_configuration_done() + with tempfile.NamedTemporaryFile() as f: + f.write(newrelic_ini_contents) + f.seek(0) + + initialize(config_file=f.name, log_file="stdout") + + +@pytest.mark.parametrize( + "feature_flag,expect_warning", + ( + (["django.instrumentation.inclusion-tags.r1"], False), + (["noexist"], True), + ), +) +def test_initialize_config_file_feature_flag(feature_flag, expect_warning, logger): + settings = global_settings() + apply_config_setting(settings, "feature_flag", feature_flag) + _reset_configuration_done() + + with tempfile.NamedTemporaryFile() as f: + f.write(newrelic_ini_contents) + f.seek(0) + + initialize(config_file=f.name) + + message = ( + "Unknown agent feature flag 'noexist' provided. " + "Check agent documentation or release notes, or " + "contact New Relic support for clarification of " + "validity of the specific feature flag." + ) + if expect_warning: + assert message in logger.caplog.records + else: + assert message not in logger.caplog.records + + apply_config_setting(settings, "feature_flag", []) + + +@pytest.mark.parametrize( + "feature_flag,expect_warning", + ( + (["django.instrumentation.inclusion-tags.r1"], False), + (["noexist"], True), + ), +) +def test_initialize_no_config_file_feature_flag(feature_flag, expect_warning, logger): + settings = global_settings() + apply_config_setting(settings, "feature_flag", feature_flag) + _reset_configuration_done() + + initialize() + + message = ( + "Unknown agent feature flag 'noexist' provided. " + "Check agent documentation or release notes, or " + "contact New Relic support for clarification of " + "validity of the specific feature flag."
+ ) + + if expect_warning: + assert message in logger.caplog.records + else: + assert message not in logger.caplog.records + + apply_config_setting(settings, "feature_flag", []) + + +@pytest.mark.parametrize( + "setting_name,setting_value,expect_error", + ( + ("transaction_tracer.function_trace", [callable_name(function_to_trace)], False), + ("transaction_tracer.generator_trace", [callable_name(function_to_trace)], False), + ("transaction_tracer.function_trace", ["no_exist"], True), + ("transaction_tracer.generator_trace", ["no_exist"], True), + ), +) +def test_initialize_config_file_with_traces(setting_name, setting_value, expect_error, logger): + settings = global_settings() + apply_config_setting(settings, setting_name, setting_value) + _reset_configuration_done() + + with tempfile.NamedTemporaryFile() as f: + f.write(newrelic_ini_contents) + f.seek(0) + + initialize(config_file=f.name) + + if expect_error: + assert "CONFIGURATION ERROR" in logger.caplog.records + else: + assert "CONFIGURATION ERROR" not in logger.caplog.records + + apply_config_setting(settings, setting_name, []) + + +func_newrelic_ini = b""" +[function-trace:] +enabled = True +function = test_configuration:function_to_trace +name = function_to_trace +group = group +label = label +terminal = False +rollup = foo/all +""" + +bad_func_newrelic_ini = b""" +[function-trace:] +enabled = True +function = function_to_trace +""" + +func_missing_enabled_newrelic_ini = b""" +[function-trace:] +function = function_to_trace +""" + +external_newrelic_ini = b""" +[external-trace:] +enabled = True +function = test_configuration:function_to_trace +library = "foo" +url = localhost:80/foo +method = GET +""" + +bad_external_newrelic_ini = b""" +[external-trace:] +enabled = True +function = function_to_trace +""" + +external_missing_enabled_newrelic_ini = b""" +[external-trace:] +function = function_to_trace +""" + +generator_newrelic_ini = b""" +[generator-trace:] +enabled = True +function = test_configuration:function_to_trace +name = function_to_trace +group = group +""" + +bad_generator_newrelic_ini = b""" +[generator-trace:] +enabled = True +function = function_to_trace +""" + +generator_missing_enabled_newrelic_ini = b""" +[generator-trace:] +function = function_to_trace +""" + +bg_task_newrelic_ini = b""" +[background-task:] +enabled = True +function = test_configuration:function_to_trace +lambda = test_configuration:function_to_trace +""" + +bad_bg_task_newrelic_ini = b""" +[background-task:] +enabled = True +function = function_to_trace +""" + +bg_task_missing_enabled_newrelic_ini = b""" +[background-task:] +function = function_to_trace +""" + +db_trace_newrelic_ini = b""" +[database-trace:] +enabled = True +function = test_configuration:function_to_trace +sql = test_configuration:function_to_trace +""" + +bad_db_trace_newrelic_ini = b""" +[database-trace:] +enabled = True +function = function_to_trace +""" + +db_trace_missing_enabled_newrelic_ini = b""" +[database-trace:] +function = function_to_trace +""" + +wsgi_newrelic_ini = b""" +[wsgi-application:] +enabled = True +function = test_configuration:function_to_trace +application = app +""" + +bad_wsgi_newrelic_ini = b""" +[wsgi-application:] +enabled = True +function = function_to_trace +application = app +""" + +wsgi_missing_enabled_newrelic_ini = b""" +[wsgi-application:] +function = function_to_trace +application = app +""" + +wsgi_unparseable_enabled_newrelic_ini = b""" +[wsgi-application:] +enabled = not-a-bool +function = function_to_trace +application = app +""" + + 
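A note on the fixtures above: each of these ini constants is appended to a bare [newrelic] section and handed to newrelic.config.initialize(), which parses the [function-trace:*], [external-trace:*], [generator-trace:*], [background-task:*], [database-trace:*] and [wsgi-application:*] sections and registers the objects they name. A minimal sketch of the same flow outside the test harness, assuming an ini file shaped like these fixtures (the file path, environment name, and timeout below are illustrative assumptions, not taken from this patch):

    # Sketch only: one-time agent initialization at process start.
    # "newrelic.ini" and environment="development" are assumed values.
    import newrelic.agent

    # Reads the [newrelic] section plus any trace/application sections
    # and registers the functions they name for instrumentation.
    newrelic.agent.initialize("newrelic.ini", environment="development")

    # Registering the application eagerly avoids dropping early data.
    application = newrelic.agent.register_application(timeout=10.0)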
+@pytest.mark.parametrize( + "section,expect_error", + ( + (func_newrelic_ini, False), + (bad_func_newrelic_ini, True), + (func_missing_enabled_newrelic_ini, False), + (external_newrelic_ini, False), + (bad_external_newrelic_ini, True), + (external_missing_enabled_newrelic_ini, False), + (generator_newrelic_ini, False), + (bad_generator_newrelic_ini, True), + (generator_missing_enabled_newrelic_ini, False), + (bg_task_newrelic_ini, False), + (bad_bg_task_newrelic_ini, True), + (bg_task_missing_enabled_newrelic_ini, False), + (db_trace_newrelic_ini, False), + (bad_db_trace_newrelic_ini, True), + (db_trace_missing_enabled_newrelic_ini, False), + (wsgi_newrelic_ini, False), + (bad_wsgi_newrelic_ini, True), + (wsgi_missing_enabled_newrelic_ini, False), + (wsgi_unparseable_enabled_newrelic_ini, True), + ), + ids=( + "func_newrelic_ini", + "bad_func_newrelic_ini", + "func_missing_enabled_newrelic_ini", + "external_newrelic_ini", + "bad_external_newrelic_ini", + "external_missing_enabled_newrelic_ini", + "generator_newrelic_ini", + "bad_generator_newrelic_ini", + "generator_missing_enabled_newrelic_ini", + "bg_task_newrelic_ini", + "bad_bg_task_newrelic_ini", + "bg_task_missing_enabled_newrelic_ini", + "db_trace_newrelic_ini", + "bad_db_trace_newrelic_ini", + "db_trace_missing_enabled_newrelic_ini", + "wsgi_newrelic_ini", + "bad_wsgi_newrelic_ini", + "wsgi_missing_enabled_newrelic_ini", + "wsgi_unparseable_enabled_newrelic_ini", + ), +) +def test_initialize_developer_mode(section, expect_error, logger): + settings = global_settings() + apply_config_setting(settings, "monitor_mode", False) + apply_config_setting(settings, "developer_mode", True) + _reset_configuration_done() + _reset_instrumentation_done() + _reset_config_parser() + + with tempfile.NamedTemporaryFile() as f: + f.write(newrelic_ini_contents) + f.write(section) + f.seek(0) + + initialize(config_file=f.name) + + if expect_error: + assert "CONFIGURATION ERROR" in logger.caplog.records + else: + assert "CONFIGURATION ERROR" not in logger.caplog.records + + +@pytest.fixture +def caplog_handler(): + class CaplogHandler(logging.StreamHandler): + """ + To prevent possible issues with pytest's monkey patching + use a custom Caplog handler to capture all records + """ + + def __init__(self, *args, **kwargs): + self.records = [] + super(CaplogHandler, self).__init__(*args, **kwargs) + + def emit(self, record): + self.records.append(self.format(record)) + + return CaplogHandler() + + +@pytest.fixture +def logger(caplog_handler): + _logger = logging.getLogger("newrelic.config") + _logger.addHandler(caplog_handler) + _logger.caplog = caplog_handler + _logger.setLevel(logging.WARNING) + yield _logger + del caplog_handler.records[:] + _logger.removeHandler(caplog_handler) diff --git a/tests/agent_features/test_coroutine_trace.py b/tests/agent_features/test_coroutine_trace.py index b15537573..36e365bc4 100644 --- a/tests/agent_features/test_coroutine_trace.py +++ b/tests/agent_features/test_coroutine_trace.py @@ -18,11 +18,12 @@ import time import pytest -from testing_support.fixtures import ( - capture_transaction_metrics, +from testing_support.fixtures import capture_transaction_metrics, validate_tt_parenting +from testing_support.validators.validate_transaction_errors import ( validate_transaction_errors, +) +from testing_support.validators.validate_transaction_metrics import ( validate_transaction_metrics, - validate_tt_parenting, ) from newrelic.api.background_task import background_task @@ -105,33 +106,30 @@ def 
test_coroutine_siblings(event_loop): # the case if child was a child of child since child is terminal) @function_trace("child", terminal=True) - @asyncio.coroutine - def child(wait, event=None): + async def child(wait, event=None): if event: event.set() - yield from wait.wait() + await wait.wait() - @asyncio.coroutine - def middle(): + async def middle(): wait = asyncio.Event() started = asyncio.Event() child_0 = asyncio.ensure_future(child(wait, started)) # Wait for the first child to start - yield from started.wait() + await started.wait() child_1 = asyncio.ensure_future(child(wait)) # Allow children to complete wait.set() - yield from child_1 - yield from child_0 + await child_1 + await child_0 @function_trace("parent") - @asyncio.coroutine - def parent(): - yield from asyncio.ensure_future(middle()) + async def parent(): + await asyncio.ensure_future(middle()) event_loop.run_until_complete(parent()) @@ -465,15 +463,13 @@ def test_trace_outlives_transaction(event_loop): running, finish = asyncio.Event(), asyncio.Event() @function_trace(name="coro") - @asyncio.coroutine - def _coro(): + async def _coro(): running.set() - yield from finish.wait() + await finish.wait() - @asyncio.coroutine - def parent(): + async def parent(): task.append(asyncio.ensure_future(_coro())) - yield from running.wait() + await running.wait() @validate_transaction_metrics( "test_trace_outlives_transaction", diff --git a/tests/agent_features/test_coroutine_transaction.py b/tests/agent_features/test_coroutine_transaction.py index 1a55385c7..8b602ffc0 100644 --- a/tests/agent_features/test_coroutine_transaction.py +++ b/tests/agent_features/test_coroutine_transaction.py @@ -19,6 +19,8 @@ from testing_support.fixtures import ( capture_transaction_metrics, override_generic_settings, +) +from testing_support.validators.validate_transaction_errors import ( validate_transaction_errors, ) @@ -39,8 +41,7 @@ def coroutine_test(event_loop, transaction, nr_enabled=True, does_hang=False, call_exit=False, runtime_error=False): @transaction - @asyncio.coroutine - def task(): + async def task(): txn = current_transaction() if not nr_enabled: @@ -55,15 +56,15 @@ def task(): try: if does_hang: - yield from loop.create_future() + await loop.create_future() # noqa else: - yield from asyncio.sleep(0.0) + await asyncio.sleep(0.0) if nr_enabled and txn.enabled: # Validate loop time is recorded after suspend assert txn._loop_time > 0.0 except GeneratorExit: if runtime_error: - yield from asyncio.sleep(0.0) + await asyncio.sleep(0.0) return task @@ -160,11 +161,10 @@ def test_async_coroutine_throw_cancel(event_loop, num_coroutines, create_test_ta tasks = [create_test_task(event_loop, transaction) for _ in range(num_coroutines)] - @asyncio.coroutine - def task_c(): + async def task_c(): futures = [asyncio.ensure_future(t()) for t in tasks] - yield from asyncio.sleep(0.0) + await asyncio.sleep(0.0) [f.cancel() for f in futures] @@ -195,8 +195,7 @@ def test_async_coroutine_throw_error(event_loop, num_coroutines, create_test_tas tasks = [create_test_task(event_loop, transaction) for _ in range(num_coroutines)] - @asyncio.coroutine - def task_c(): + async def task_c(): coros = [t() for t in tasks] for coro in coros: @@ -232,14 +231,13 @@ def test_async_coroutine_close(event_loop, num_coroutines, create_test_task, tra tasks = [create_test_task(event_loop, transaction) for _ in range(num_coroutines)] - @asyncio.coroutine - def task_c(): + async def task_c(): coros = [t() for t in tasks] if start_coroutines: [asyncio.ensure_future(coro) for coro 
in coros] - yield from asyncio.sleep(0.0) + await asyncio.sleep(0.0) [coro.close() for coro in coros] @@ -273,13 +271,12 @@ def test_async_coroutine_close_raises_error(event_loop, num_coroutines, create_t tasks = [create_test_task(event_loop, transaction, runtime_error=True) for _ in range(num_coroutines)] - @asyncio.coroutine - def task_c(): + async def task_c(): coros = [t() for t in tasks] [c.send(None) for c in coros] - yield from asyncio.sleep(0.0) + await asyncio.sleep(0.0) for coro in coros: with pytest.raises(RuntimeError): @@ -313,24 +310,21 @@ def test_deferred_async_background_task(event_loop, transaction, metric, argumen args, kwargs = arguments("deferred") @transaction(*args, **kwargs) - @asyncio.coroutine - def child_task(): - yield from asyncio.sleep(0) + async def child_task(): + await asyncio.sleep(0) main_metric = (metric % "main", "") args, kwargs = arguments("main") @transaction(*args, **kwargs) - @asyncio.coroutine - def parent_task(): - yield from asyncio.sleep(0) + async def parent_task(): + await asyncio.sleep(0) return event_loop.create_task(child_task()) - @asyncio.coroutine - def test_runner(): - child = yield from parent_task() - yield from child + async def test_runner(): + child = await parent_task() + await child metrics = [] @@ -362,18 +356,16 @@ def test_child_transaction_when_parent_is_running(event_loop, transaction, metri args, kwargs = arguments("deferred") @transaction(*args, **kwargs) - @asyncio.coroutine - def child_task(): - yield from asyncio.sleep(0) + async def child_task(): + await asyncio.sleep(0) main_metric = (metric % "main", "") args, kwargs = arguments("main") @transaction(*args, **kwargs) - @asyncio.coroutine - def parent_task(): - yield from event_loop.create_task(child_task()) + async def parent_task(): + await event_loop.create_task(child_task()) metrics = [] @@ -405,9 +397,8 @@ def test_nested_coroutine_inside_sync(event_loop, transaction, metric, arguments args, kwargs = arguments("child") @transaction(*args, **kwargs) - @asyncio.coroutine - def child_task(): - yield from asyncio.sleep(0) + async def child_task(): + await asyncio.sleep(0) main_metric = (metric % "main", "") args, kwargs = arguments("main") @@ -443,22 +434,20 @@ def test_nested_coroutine_task_already_active(event_loop, transaction, metric, a args, kwargs = arguments("deferred") @transaction(*args, **kwargs) - @asyncio.coroutine - def child_task(): - yield from asyncio.sleep(0) + async def child_task(): + await asyncio.sleep(0) @function_trace() - def child_trace(): - yield from child_task() + async def child_trace(): + await child_task() main_metric = (metric % "main", "") args, kwargs = arguments("main") @transaction(*args, **kwargs) - @asyncio.coroutine - def parent_task(): - yield from event_loop.create_task(child_trace()) + async def parent_task(): + await event_loop.create_task(child_trace()) metrics = [] diff --git a/tests/agent_features/test_custom_metrics.py b/tests/agent_features/test_custom_metrics.py new file mode 100644 index 000000000..21a67149a --- /dev/null +++ b/tests/agent_features/test_custom_metrics.py @@ -0,0 +1,62 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +from testing_support.fixtures import reset_core_stats_engine +from testing_support.validators.validate_custom_metrics_outside_transaction import ( + validate_custom_metrics_outside_transaction, +) + +from newrelic.api.application import application_instance as application +from newrelic.api.background_task import background_task +from newrelic.api.transaction import ( + current_transaction, + record_custom_metric, + record_custom_metrics, +) + + +# Testing record_custom_metric +@reset_core_stats_engine() +@background_task() +def test_custom_metric_inside_transaction(): + transaction = current_transaction() + record_custom_metric("CustomMetric/InsideTransaction/Count", 1) + for metric in transaction._custom_metrics.metrics(): + assert metric == ("CustomMetric/InsideTransaction/Count", [1, 1, 1, 1, 1, 1]) + + +@reset_core_stats_engine() +@validate_custom_metrics_outside_transaction([("CustomMetric/OutsideTransaction/Count", 1)]) +@background_task() +def test_custom_metric_outside_transaction_with_app(): + app = application() + record_custom_metric("CustomMetric/OutsideTransaction/Count", 1, application=app) + + +# Testing record_custom_metricS +@reset_core_stats_engine() +@background_task() +def test_custom_metrics_inside_transaction(): + transaction = current_transaction() + record_custom_metrics([("CustomMetrics/InsideTransaction/Count", 1)]) + for metric in transaction._custom_metrics.metrics(): + assert metric == ("CustomMetrics/InsideTransaction/Count", [1, 1, 1, 1, 1, 1]) + + +@reset_core_stats_engine() +@validate_custom_metrics_outside_transaction([("CustomMetrics/OutsideTransaction/Count", 1)]) +@background_task() +def test_custom_metrics_outside_transaction_with_app(): + app = application() + record_custom_metrics([("CustomMetrics/OutsideTransaction/Count", 1)], application=app) diff --git a/tests/agent_features/test_distributed_tracing.py b/tests/agent_features/test_distributed_tracing.py index ae6b4f32d..263b1bdcf 100644 --- a/tests/agent_features/test_distributed_tracing.py +++ b/tests/agent_features/test_distributed_tracing.py @@ -12,69 +12,86 @@ # See the License for the specific language governing permissions and # limitations under the License. 
+import copy import json + import pytest import webtest -import copy +from testing_support.fixtures import override_application_settings, validate_attributes +from testing_support.validators.validate_error_event_attributes import ( + validate_error_event_attributes, +) +from testing_support.validators.validate_transaction_event_attributes import ( + validate_transaction_event_attributes, +) +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) from newrelic.api.application import application_instance -from newrelic.api.background_task import background_task, BackgroundTask -from newrelic.api.transaction import (current_transaction, current_trace_id, - current_span_id) +from newrelic.api.background_task import BackgroundTask, background_task +from newrelic.api.external_trace import ExternalTrace from newrelic.api.time_trace import current_trace +from newrelic.api.transaction import ( + accept_distributed_trace_headers, + accept_distributed_trace_payload, + create_distributed_trace_payload, + current_span_id, + current_trace_id, + current_transaction, +) from newrelic.api.web_transaction import WSGIWebTransaction from newrelic.api.wsgi_application import wsgi_application -from testing_support.fixtures import (override_application_settings, - validate_attributes, validate_transaction_event_attributes, - validate_error_event_attributes, validate_transaction_metrics) - -distributed_trace_intrinsics = ['guid', 'traceId', 'priority', 'sampled'] -inbound_payload_intrinsics = ['parent.type', 'parent.app', 'parent.account', - 'parent.transportType', 'parent.transportDuration'] +distributed_trace_intrinsics = ["guid", "traceId", "priority", "sampled"] +inbound_payload_intrinsics = [ + "parent.type", + "parent.app", + "parent.account", + "parent.transportType", + "parent.transportDuration", +] payload = { - 'v': [0, 1], - 'd': { - 'ac': '1', - 'ap': '2827902', - 'id': '7d3efb1b173fecfa', - 'pa': '5e5733a911cfbc73', - 'pr': 10.001, - 'sa': True, - 'ti': 1518469636035, - 'tr': 'd6b4ba0c3a712ca', - 'ty': 'App', - } + "v": [0, 1], + "d": { + "ac": "1", + "ap": "2827902", + "id": "7d3efb1b173fecfa", + "pa": "5e5733a911cfbc73", + "pr": 10.001, + "sa": True, + "ti": 1518469636035, + "tr": "d6b4ba0c3a712ca", + "ty": "App", + }, } -parent_order = ['parent_type', 'parent_account', - 'parent_app', 'parent_transport_type'] +parent_order = ["parent_type", "parent_account", "parent_app", "parent_transport_type"] parent_info = { - 'parent_type': payload['d']['ty'], - 'parent_account': payload['d']['ac'], - 'parent_app': payload['d']['ap'], - 'parent_transport_type': 'HTTP' + "parent_type": payload["d"]["ty"], + "parent_account": payload["d"]["ac"], + "parent_app": payload["d"]["ap"], + "parent_transport_type": "HTTP", } @wsgi_application() def target_wsgi_application(environ, start_response): - status = '200 OK' - output = b'hello world' - response_headers = [('Content-type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(output)))] + status = "200 OK" + output = b"hello world" + response_headers = [("Content-type", "text/html; charset=utf-8"), ("Content-Length", str(len(output)))] txn = current_transaction() # Make assertions on the WSGIWebTransaction object assert txn._distributed_trace_state - assert txn.parent_type == 'App' - assert txn.parent_app == '2827902' - assert txn.parent_account == '1' - assert txn.parent_span == '7d3efb1b173fecfa' - assert txn.parent_transport_type == 'HTTP' + assert txn.parent_type == "App" + assert txn.parent_app == 
"2827902" + assert txn.parent_account == "1" + assert txn.parent_span == "7d3efb1b173fecfa" + assert txn.parent_transport_type == "HTTP" assert isinstance(txn.parent_transport_duration, float) - assert txn._trace_id == 'd6b4ba0c3a712ca' + assert txn._trace_id == "d6b4ba0c3a712ca" assert txn.priority == 10.001 assert txn.sampled @@ -85,90 +102,75 @@ def target_wsgi_application(environ, start_response): test_application = webtest.TestApp(target_wsgi_application) _override_settings = { - 'trusted_account_key': '1', - 'distributed_tracing.enabled': True, + "trusted_account_key": "1", + "distributed_tracing.enabled": True, } _metrics = [ - ('Supportability/DistributedTrace/AcceptPayload/Success', 1), - ('Supportability/TraceContext/Accept/Success', None) + ("Supportability/DistributedTrace/AcceptPayload/Success", 1), + ("Supportability/TraceContext/Accept/Success", None), ] @override_application_settings(_override_settings) -@validate_transaction_metrics( - '', - group='Uri', - rollup_metrics=_metrics) +@validate_transaction_metrics("", group="Uri", rollup_metrics=_metrics) def test_distributed_tracing_web_transaction(): - headers = {'newrelic': json.dumps(payload)} - response = test_application.get('/', headers=headers) - assert 'X-NewRelic-App-Data' not in response.headers + headers = {"newrelic": json.dumps(payload)} + response = test_application.get("/", headers=headers) + assert "X-NewRelic-App-Data" not in response.headers -@pytest.mark.parametrize('span_events', (True, False)) -@pytest.mark.parametrize('accept_payload', (True, False)) +@pytest.mark.parametrize("span_events", (True, False)) +@pytest.mark.parametrize("accept_payload", (True, False)) def test_distributed_trace_attributes(span_events, accept_payload): if accept_payload: - _required_intrinsics = ( - distributed_trace_intrinsics + inbound_payload_intrinsics) + _required_intrinsics = distributed_trace_intrinsics + inbound_payload_intrinsics _forgone_txn_intrinsics = [] _forgone_error_intrinsics = [] _exact_intrinsics = { - 'parent.type': 'Mobile', - 'parent.app': '2827902', - 'parent.account': '1', - 'parent.transportType': 'HTTP', - 'traceId': 'd6b4ba0c3a712ca', + "parent.type": "Mobile", + "parent.app": "2827902", + "parent.account": "1", + "parent.transportType": "HTTP", + "traceId": "d6b4ba0c3a712ca", } - _exact_txn_attributes = {'agent': {}, 'user': {}, - 'intrinsic': _exact_intrinsics.copy()} - _exact_error_attributes = {'agent': {}, 'user': {}, - 'intrinsic': _exact_intrinsics.copy()} - _exact_txn_attributes['intrinsic']['parentId'] = '7d3efb1b173fecfa' - _exact_txn_attributes['intrinsic']['parentSpanId'] = 'c86df80de2e6f51c' - - _forgone_error_intrinsics.append('parentId') - _forgone_error_intrinsics.append('parentSpanId') - _forgone_txn_intrinsics.append('grandparentId') - _forgone_error_intrinsics.append('grandparentId') - - _required_attributes = { - 'intrinsic': _required_intrinsics, 'agent': [], 'user': []} - _forgone_txn_attributes = {'intrinsic': _forgone_txn_intrinsics, - 'agent': [], 'user': []} - _forgone_error_attributes = {'intrinsic': _forgone_error_intrinsics, - 'agent': [], 'user': []} + _exact_txn_attributes = {"agent": {}, "user": {}, "intrinsic": _exact_intrinsics.copy()} + _exact_error_attributes = {"agent": {}, "user": {}, "intrinsic": _exact_intrinsics.copy()} + _exact_txn_attributes["intrinsic"]["parentId"] = "7d3efb1b173fecfa" + _exact_txn_attributes["intrinsic"]["parentSpanId"] = "c86df80de2e6f51c" + + _forgone_error_intrinsics.append("parentId") + _forgone_error_intrinsics.append("parentSpanId") 
+ _forgone_txn_intrinsics.append("grandparentId") + _forgone_error_intrinsics.append("grandparentId") + + _required_attributes = {"intrinsic": _required_intrinsics, "agent": [], "user": []} + _forgone_txn_attributes = {"intrinsic": _forgone_txn_intrinsics, "agent": [], "user": []} + _forgone_error_attributes = {"intrinsic": _forgone_error_intrinsics, "agent": [], "user": []} else: _required_intrinsics = distributed_trace_intrinsics - _forgone_txn_intrinsics = _forgone_error_intrinsics = \ - inbound_payload_intrinsics + ['grandparentId', 'parentId', - 'parentSpanId'] - - _required_attributes = { - 'intrinsic': _required_intrinsics, 'agent': [], 'user': []} - _forgone_txn_attributes = {'intrinsic': _forgone_txn_intrinsics, - 'agent': [], 'user': []} - _forgone_error_attributes = {'intrinsic': _forgone_error_intrinsics, - 'agent': [], 'user': []} + _forgone_txn_intrinsics = _forgone_error_intrinsics = inbound_payload_intrinsics + [ + "grandparentId", + "parentId", + "parentSpanId", + ] + + _required_attributes = {"intrinsic": _required_intrinsics, "agent": [], "user": []} + _forgone_txn_attributes = {"intrinsic": _forgone_txn_intrinsics, "agent": [], "user": []} + _forgone_error_attributes = {"intrinsic": _forgone_error_intrinsics, "agent": [], "user": []} _exact_txn_attributes = _exact_error_attributes = None _forgone_trace_intrinsics = _forgone_error_intrinsics test_settings = _override_settings.copy() - test_settings['span_events.enabled'] = span_events + test_settings["span_events.enabled"] = span_events @override_application_settings(test_settings) - @validate_transaction_event_attributes( - _required_attributes, _forgone_txn_attributes, - _exact_txn_attributes) - @validate_error_event_attributes( - _required_attributes, _forgone_error_attributes, - _exact_error_attributes) - @validate_attributes('intrinsic', - _required_intrinsics, _forgone_trace_intrinsics) - @background_task(name='test_distributed_trace_attributes') + @validate_transaction_event_attributes(_required_attributes, _forgone_txn_attributes, _exact_txn_attributes) + @validate_error_event_attributes(_required_attributes, _forgone_error_attributes, _exact_error_attributes) + @validate_attributes("intrinsic", _required_intrinsics, _forgone_trace_intrinsics) + @background_task(name="test_distributed_trace_attributes") def _test(): txn = current_transaction() @@ -181,19 +183,19 @@ def _test(): "id": "c86df80de2e6f51c", "tr": "d6b4ba0c3a712ca", "ti": 1518469636035, - "tx": "7d3efb1b173fecfa" - } + "tx": "7d3efb1b173fecfa", + }, } - payload['d']['pa'] = "5e5733a911cfbc73" + payload["d"]["pa"] = "5e5733a911cfbc73" if accept_payload: - result = txn.accept_distributed_trace_payload(payload) + result = accept_distributed_trace_payload(payload) assert result else: - txn._create_distributed_trace_payload() + create_distributed_trace_payload() try: - raise ValueError('cookies') + raise ValueError("cookies") except ValueError: txn.notice_error() @@ -201,33 +203,30 @@ def _test(): _forgone_attributes = { - 'agent': [], - 'user': [], - 'intrinsic': (inbound_payload_intrinsics + ['grandparentId']), + "agent": [], + "user": [], + "intrinsic": (inbound_payload_intrinsics + ["grandparentId"]), } @override_application_settings(_override_settings) -@validate_transaction_event_attributes( - {}, _forgone_attributes) -@validate_error_event_attributes( - {}, _forgone_attributes) -@validate_attributes('intrinsic', - {}, _forgone_attributes['intrinsic']) -@background_task(name='test_distributed_trace_attrs_omitted') 
+@validate_transaction_event_attributes({}, _forgone_attributes) +@validate_error_event_attributes({}, _forgone_attributes) +@validate_attributes("intrinsic", {}, _forgone_attributes["intrinsic"]) +@background_task(name="test_distributed_trace_attrs_omitted") def test_distributed_trace_attrs_omitted(): txn = current_transaction() try: - raise ValueError('cookies') + raise ValueError("cookies") except ValueError: txn.notice_error() # test our distributed_trace metrics by creating a transaction and then forcing # it to process a distributed trace payload -@pytest.mark.parametrize('web_transaction', (True, False)) -@pytest.mark.parametrize('gen_error', (True, False)) -@pytest.mark.parametrize('has_parent', (True, False)) +@pytest.mark.parametrize("web_transaction", (True, False)) +@pytest.mark.parametrize("gen_error", (True, False)) +@pytest.mark.parametrize("has_parent", (True, False)) def test_distributed_tracing_metrics(web_transaction, gen_error, has_parent): def _make_dt_tag(pi): return "%s/%s/%s/%s/all" % tuple(pi[x] for x in parent_order) @@ -235,11 +234,11 @@ def _make_dt_tag(pi): # figure out which metrics we'll see based on the test params # note: we'll always see DurationByCaller if the distributed # tracing flag is turned on - metrics = ['DurationByCaller'] + metrics = ["DurationByCaller"] if gen_error: - metrics.append('ErrorsByCaller') + metrics.append("ErrorsByCaller") if has_parent: - metrics.append('TransportDuration') + metrics.append("TransportDuration") tag = None dt_payload = copy.deepcopy(payload) @@ -249,15 +248,14 @@ def _make_dt_tag(pi): if has_parent: tag = _make_dt_tag(parent_info) else: - tag = _make_dt_tag(dict((x, 'Unknown') for x in parent_info.keys())) - del dt_payload['d']['tr'] + # tag = _make_dt_tag(dict((x, "Unknown") for x in parent_order)) + tag = _make_dt_tag(dict((x, "Unknown") for x in parent_info.keys())) + del dt_payload["d"]["tr"] # now run the test - transaction_name = "test_dt_metrics_%s" % '_'.join(metrics) + transaction_name = "test_dt_metrics_%s" % "_".join(metrics) _rollup_metrics = [ - ("%s/%s%s" % (x, tag, bt), 1) - for x in metrics - for bt in ['', 'Web' if web_transaction else 'Other'] + ("%s/%s%s" % (x, tag, bt), 1) for x in metrics for bt in ["", "Web" if web_transaction else "Other"] ] def _make_test_transaction(): @@ -266,16 +264,15 @@ def _make_test_transaction(): if not web_transaction: return BackgroundTask(application, transaction_name) - environ = {'REQUEST_URI': '/trace_ends_after_txn'} + environ = {"REQUEST_URI": "/trace_ends_after_txn"} tn = WSGIWebTransaction(application, environ) tn.set_transaction_name(transaction_name) return tn @override_application_settings(_override_settings) @validate_transaction_metrics( - transaction_name, - background_task=not(web_transaction), - rollup_metrics=_rollup_metrics) + transaction_name, background_task=not (web_transaction), rollup_metrics=_rollup_metrics + ) def _test(): with _make_test_transaction() as transaction: transaction.accept_distributed_trace_payload(dt_payload) @@ -289,62 +286,62 @@ def _test(): _test() -NEW_RELIC_ACCEPTED = \ - [('Supportability/DistributedTrace/AcceptPayload/Success', 1), - ('Supportability/TraceContext/Accept/Success', None), - ('Supportability/TraceContext/TraceParent/Accept/Success', None), - ('Supportability/TraceContext/Accept/Success', None)] -TRACE_CONTEXT_ACCEPTED = \ - [('Supportability/TraceContext/Accept/Success', 1), - ('Supportability/TraceContext/TraceParent/Accept/Success', 1), - ('Supportability/TraceContext/Accept/Success', 1), - 
('Supportability/DistributedTrace/AcceptPayload/Success', None)] -NO_HEADERS_ACCEPTED = \ - [('Supportability/DistributedTrace/AcceptPayload/Success', None), - ('Supportability/TraceContext/Accept/Success', None), - ('Supportability/TraceContext/TraceParent/Accept/Success', None), - ('Supportability/TraceContext/Accept/Success', None)] -TRACEPARENT = '00-0af7651916cd43dd8448eb211c80319c-00f067aa0ba902b7-01' -TRACESTATE = 'rojo=f06a0ba902b7,congo=t61rcWkgMzE' - - -@pytest.mark.parametrize('traceparent,tracestate,newrelic,metrics', - [(False, False, False, NO_HEADERS_ACCEPTED), - (False, False, True, NEW_RELIC_ACCEPTED), - (False, True, True, NEW_RELIC_ACCEPTED), - (False, True, False, NO_HEADERS_ACCEPTED), - (True, True, True, TRACE_CONTEXT_ACCEPTED), - (True, False, False, TRACE_CONTEXT_ACCEPTED), - (True, False, True, TRACE_CONTEXT_ACCEPTED), - (True, True, False, TRACE_CONTEXT_ACCEPTED)] - ) +NEW_RELIC_ACCEPTED = [ + ("Supportability/DistributedTrace/AcceptPayload/Success", 1), + ("Supportability/TraceContext/Accept/Success", None), + ("Supportability/TraceContext/TraceParent/Accept/Success", None), + ("Supportability/TraceContext/Accept/Success", None), +] +TRACE_CONTEXT_ACCEPTED = [ + ("Supportability/TraceContext/Accept/Success", 1), + ("Supportability/TraceContext/TraceParent/Accept/Success", 1), + ("Supportability/TraceContext/Accept/Success", 1), + ("Supportability/DistributedTrace/AcceptPayload/Success", None), +] +NO_HEADERS_ACCEPTED = [ + ("Supportability/DistributedTrace/AcceptPayload/Success", None), + ("Supportability/TraceContext/Accept/Success", None), + ("Supportability/TraceContext/TraceParent/Accept/Success", None), + ("Supportability/TraceContext/Accept/Success", None), +] +TRACEPARENT = "00-0af7651916cd43dd8448eb211c80319c-00f067aa0ba902b7-01" +TRACESTATE = "rojo=f06a0ba902b7,congo=t61rcWkgMzE" + + +@pytest.mark.parametrize( + "traceparent,tracestate,newrelic,metrics", + [ + (False, False, False, NO_HEADERS_ACCEPTED), + (False, False, True, NEW_RELIC_ACCEPTED), + (False, True, True, NEW_RELIC_ACCEPTED), + (False, True, False, NO_HEADERS_ACCEPTED), + (True, True, True, TRACE_CONTEXT_ACCEPTED), + (True, False, False, TRACE_CONTEXT_ACCEPTED), + (True, False, True, TRACE_CONTEXT_ACCEPTED), + (True, True, False, TRACE_CONTEXT_ACCEPTED), + ], +) @override_application_settings(_override_settings) -def test_distributed_tracing_backwards_compatibility(traceparent, - tracestate, - newrelic, - metrics): - +def test_distributed_tracing_backwards_compatibility(traceparent, tracestate, newrelic, metrics): headers = [] if traceparent: - headers.append(('traceparent', TRACEPARENT)) + headers.append(("traceparent", TRACEPARENT)) if tracestate: - headers.append(('tracestate', TRACESTATE)) + headers.append(("tracestate", TRACESTATE)) if newrelic: - headers.append(('newrelic', json.dumps(payload))) + headers.append(("newrelic", json.dumps(payload))) @validate_transaction_metrics( - "test_distributed_tracing_backwards_compatibility", - background_task=True, - rollup_metrics=metrics) - @background_task(name='test_distributed_tracing_backwards_compatibility') + "test_distributed_tracing_backwards_compatibility", background_task=True, rollup_metrics=metrics + ) + @background_task(name="test_distributed_tracing_backwards_compatibility") def _test(): - transaction = current_transaction() - transaction.accept_distributed_trace_headers(headers) + accept_distributed_trace_headers(headers) _test() -@background_task(name='test_current_trace_id_api_inside_transaction') 
+@background_task(name="test_current_trace_id_api_inside_transaction") def test_current_trace_id_api_inside_transaction(): trace_id = current_trace_id() assert len(trace_id) == 32 @@ -356,7 +353,7 @@ def test_current_trace_id_api_outside_transaction(): assert trace_id is None -@background_task(name='test_current_span_id_api_inside_transaction') +@background_task(name="test_current_span_id_api_inside_transaction") def test_current_span_id_inside_transaction(): span_id = current_span_id() assert span_id == current_trace().guid @@ -365,3 +362,65 @@ def test_current_span_id_inside_transaction(): def test_current_span_id_outside_transaction(): span_id = current_span_id() assert span_id is None + + +@pytest.mark.parametrize("trusted_account_key", ("1", None), ids=("tk_set", "tk_unset")) +def test_outbound_dt_payload_generation(trusted_account_key): + @override_application_settings( + { + "distributed_tracing.enabled": True, + "account_id": "1", + "trusted_account_key": trusted_account_key, + "primary_application_id": "1", + } + ) + @background_task(name="test_outbound_dt_payload_generation") + def _test_outbound_dt_payload_generation(): + transaction = current_transaction() + payload = ExternalTrace.generate_request_headers(transaction) + if trusted_account_key: + assert payload + # Ensure trusted account key present as vendor + assert dict(payload)["tracestate"].startswith("1@nr=") + else: + assert not payload + + _test_outbound_dt_payload_generation() + + +@pytest.mark.parametrize("trusted_account_key", ("1", None), ids=("tk_set", "tk_unset")) +def test_inbound_dt_payload_acceptance(trusted_account_key): + @override_application_settings( + { + "distributed_tracing.enabled": True, + "account_id": "1", + "trusted_account_key": trusted_account_key, + "primary_application_id": "1", + } + ) + @background_task(name="_test_inbound_dt_payload_acceptance") + def _test_inbound_dt_payload_acceptance(): + transaction = current_transaction() + + payload = { + "v": [0, 1], + "d": { + "ty": "Mobile", + "ac": "1", + "tk": "1", + "ap": "2827902", + "pa": "5e5733a911cfbc73", + "id": "7d3efb1b173fecfa", + "tr": "d6b4ba0c3a712ca", + "ti": 1518469636035, + "tx": "8703ff3d88eefe9d", + }, + } + + result = transaction.accept_distributed_trace_payload(payload) + if trusted_account_key: + assert result + else: + assert not result + + _test_inbound_dt_payload_acceptance() diff --git a/tests/agent_features/test_error_events.py b/tests/agent_features/test_error_events.py index d93f9908b..72bdb14f7 100644 --- a/tests/agent_features/test_error_events.py +++ b/tests/agent_features/test_error_events.py @@ -16,6 +16,7 @@ import time import webtest + from testing_support.fixtures import ( cat_enabled, make_cross_agent_headers, @@ -23,10 +24,15 @@ override_application_settings, reset_core_stats_engine, validate_error_event_sample_data, - validate_non_transaction_error_event, validate_transaction_error_event_count, ) from testing_support.sample_applications import fully_featured_app +from testing_support.validators.validate_error_trace_attributes import ( + validate_error_trace_attributes, +) +from testing_support.validators.validate_non_transaction_error_event import ( + validate_non_transaction_error_event, +) from newrelic.api.application import application_instance as application from newrelic.api.application import application_settings diff --git a/tests/agent_features/test_error_group_callback.py b/tests/agent_features/test_error_group_callback.py new file mode 100644 index 000000000..2fe2fc68c --- /dev/null +++ 
b/tests/agent_features/test_error_group_callback.py @@ -0,0 +1,261 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import sys +import threading +import traceback + +import pytest +from testing_support.fixtures import ( + override_application_settings, + reset_core_stats_engine, +) +from testing_support.validators.validate_error_event_attributes import ( + validate_error_event_attributes, +) +from testing_support.validators.validate_error_event_attributes_outside_transaction import ( + validate_error_event_attributes_outside_transaction, +) +from testing_support.validators.validate_error_trace_attributes import ( + validate_error_trace_attributes, +) +from testing_support.validators.validate_error_trace_attributes_outside_transaction import ( + validate_error_trace_attributes_outside_transaction, +) + +from newrelic.api.application import application_instance as application +from newrelic.api.background_task import background_task +from newrelic.api.settings import set_error_group_callback +from newrelic.api.time_trace import notice_error +from newrelic.api.transaction import current_transaction +from newrelic.api.web_transaction import web_transaction +from newrelic.common.object_names import callable_name + +_callback_called = threading.Event() +_truncated_value = "A" * 300 + + +def error_group_callback(exc, data): + _callback_called.set() + + if isinstance(exc, ValueError): + return "value" + elif isinstance(exc, ZeroDivisionError): + return _truncated_value + elif isinstance(exc, IndexError): + return [] + elif isinstance(exc, LookupError): + return 123 + elif isinstance(exc, TypeError): + return "" + + +def test_clear_error_group_callback(): + settings = application().settings + set_error_group_callback(lambda x, y: None) + assert settings.error_collector.error_group_callback is not None, "Failed to set callback." + set_error_group_callback(None) + assert settings.error_collector.error_group_callback is None, "Failed to clear callback." + + +@pytest.mark.parametrize( + "callback,accepted", [(error_group_callback, True), (lambda x, y: None, True), (None, False), ("string", False)] +) +def test_set_error_group_callback(callback, accepted): + try: + set_error_group_callback(callback) + settings = application().settings + if accepted: + assert settings.error_collector.error_group_callback is not None, "Failed to set callback." + else: + assert settings.error_collector.error_group_callback is None, "Accepted bad callback." 
+ finally: + set_error_group_callback(None) + + +@pytest.mark.parametrize( + "exc_class,group_name,high_security", + [ + (ValueError, "value", False), + (ValueError, "value", True), + (TypeError, None, False), + (RuntimeError, None, False), + (IndexError, None, False), + (LookupError, None, False), + (ZeroDivisionError, _truncated_value[:255], False), + ], + ids=("standard", "high-security", "empty-string", "None-value", "list-type", "int-type", "truncated-value"), +) +@reset_core_stats_engine() +def test_error_group_name_callback(exc_class, group_name, high_security): + _callback_called.clear() + + if group_name is not None: + exact = {"user": {}, "intrinsic": {}, "agent": {"error.group.name": group_name}} + forgone = None + else: + exact = None + forgone = {"user": [], "intrinsic": [], "agent": ["error.group.name"]} + + @validate_error_trace_attributes(callable_name(exc_class), forgone_params=forgone, exact_attrs=exact) + @validate_error_event_attributes(forgone_params=forgone, exact_attrs=exact) + @override_application_settings({"high_security": high_security}) + @background_task() + def _test(): + + try: + raise exc_class() + except Exception: + notice_error() + + assert _callback_called.is_set() + + try: + set_error_group_callback(error_group_callback) + _test() + finally: + set_error_group_callback(None) + + +@pytest.mark.parametrize( + "exc_class,group_name,high_security", + [ + (ValueError, "value", False), + (ValueError, "value", True), + (TypeError, None, False), + (RuntimeError, None, False), + (IndexError, None, False), + (LookupError, None, False), + (ZeroDivisionError, _truncated_value[:255], False), + ], + ids=("standard", "high-security", "empty-string", "None-value", "list-type", "int-type", "truncated-value"), +) +@reset_core_stats_engine() +def test_error_group_name_callback_outside_transaction(exc_class, group_name, high_security): + _callback_called.clear() + + if group_name is not None: + exact = {"user": {}, "intrinsic": {}, "agent": {"error.group.name": group_name}} + forgone = None + else: + exact = None + forgone = {"user": [], "intrinsic": [], "agent": ["error.group.name"]} + + @validate_error_trace_attributes_outside_transaction( + callable_name(exc_class), forgone_params=forgone, exact_attrs=exact + ) + @validate_error_event_attributes_outside_transaction(forgone_params=forgone, exact_attrs=exact) + @override_application_settings({"high_security": high_security}) + def _test(): + try: + raise exc_class() + except Exception: + app = application() + notice_error(application=app) + + assert _callback_called.is_set() + + try: + set_error_group_callback(error_group_callback) + _test() + finally: + set_error_group_callback(None) + + +@pytest.mark.parametrize( + "transaction_decorator", + [ + background_task(name="TestBackgroundTask"), + web_transaction( + name="TestWebTransaction", + host="localhost", + port=1234, + request_method="GET", + request_path="/", + headers=[], + ), + None, + ], + ids=("background_task", "web_transaction", "outside_transaction"), +) +@reset_core_stats_engine() +def test_error_group_name_callback_attributes(transaction_decorator): + callback_errors = [] + _data = [] + + def callback(error, data): + def _callback(): + import types + + _data.append(data) + txn = current_transaction() + + # Standard attributes + assert isinstance(error, Exception) + assert isinstance(data["traceback"], types.TracebackType) + assert data["error.class"] is type(error) + assert data["error.message"] == "text" + assert data["error.expected"] is False + + # All
attributes should always be included, but set to None when not relevant. + if txn is None: # Outside transaction + assert data["transactionName"] is None + assert data["custom_params"] == {"notice_error_attribute": 1} + assert data["response.status"] is None + assert data["request.method"] is None + assert data["request.uri"] is None + elif txn.background_task: # Background task + assert data["transactionName"] == "TestBackgroundTask" + assert data["custom_params"] == {"notice_error_attribute": 1, "txn_attribute": 2} + assert data["response.status"] is None + assert data["request.method"] is None + assert data["request.uri"] is None + else: # Web transaction + assert data["transactionName"] == "TestWebTransaction" + assert data["custom_params"] == {"notice_error_attribute": 1, "txn_attribute": 2} + assert data["response.status"] == 200 + assert data["request.method"] == "GET" + assert data["request.uri"] == "/" + + try: + _callback() + except Exception: + callback_errors.append(sys.exc_info()) + raise + + def _test(): + try: + txn = current_transaction() + if txn: + txn.add_custom_attribute("txn_attribute", 2) + if not txn.background_task: + txn.process_response(200, []) + raise Exception("text") + except Exception: + app = application() if transaction_decorator is None else None # Only set outside transaction + notice_error(application=app, attributes={"notice_error_attribute": 1}) + + assert not callback_errors, "Callback inputs failed to validate.\nerror: %s\ndata: %s" % ( + traceback.format_exception(*callback_errors[0]), + str(_data[0]), + ) + + if transaction_decorator is not None: + _test = transaction_decorator(_test) # Manually decorate test function + + try: + set_error_group_callback(callback) + _test() + finally: + set_error_group_callback(None) diff --git a/tests/agent_features/test_event_loop_wait_time.py b/tests/agent_features/test_event_loop_wait_time.py index ccf57c9a4..84c65dcdc 100644 --- a/tests/agent_features/test_event_loop_wait_time.py +++ b/tests/agent_features/test_event_loop_wait_time.py @@ -16,10 +16,14 @@ import time import pytest -from testing_support.fixtures import ( - override_application_settings, +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_event_attributes import ( validate_transaction_event_attributes, +) +from testing_support.validators.validate_transaction_metrics import ( validate_transaction_metrics, +) +from testing_support.validators.validate_transaction_trace_attributes import ( validate_transaction_trace_attributes, ) @@ -30,36 +34,33 @@ @background_task(name="block") -@asyncio.coroutine -def block_loop(ready, done, blocking_transaction_active, times=1): +async def block_loop(ready, done, blocking_transaction_active, times=1): for _ in range(times): - yield from ready.wait() + await ready.wait() ready.clear() time.sleep(0.1) done.set() if blocking_transaction_active: - yield from ready.wait() + await ready.wait() @function_trace(name="waiter") -@asyncio.coroutine -def waiter(ready, done, times=1): +async def waiter(ready, done, times=1): for _ in range(times): ready.set() - yield from done.wait() + await done.wait() done.clear() @background_task(name="wait") -@asyncio.coroutine -def wait_for_loop(ready, done, times=1): +async def wait_for_loop(ready, done, times=1): transaction = current_transaction() transaction._sampled = True # Run the waiter on another task so that the sentinel for wait appears # multiple times in the trace cache - yield from 
asyncio.ensure_future(waiter(ready, done, times)) + await asyncio.ensure_future(waiter(ready, done, times)) # Set the ready to terminate the block_loop if it's running ready.set() @@ -74,7 +75,7 @@ def wait_for_loop(ready, done, times=1): ), ) def test_record_event_loop_wait(event_loop, blocking_transaction_active, event_loop_visibility_enabled): - import asyncio + # import asyncio metric_count = 2 if event_loop_visibility_enabled else None execute_attributes = {"intrinsic": ("eventLoopTime",), "agent": (), "user": ()} @@ -139,7 +140,7 @@ def _test(): def test_record_event_loop_wait_outside_task(): # Insert a random trace into the trace cache trace = FunctionTrace(name="testing") - trace_cache()._cache[0] = trace + trace_cache()[0] = trace @background_task(name="test_record_event_loop_wait_outside_task") def _test(): @@ -183,7 +184,7 @@ def test_blocking_task_on_different_loop(): def test_record_event_loop_wait_on_different_task(event_loop): - import asyncio + # import asyncio async def recorder(ready, wait): ready.set() diff --git a/tests/agent_features/test_function_trace.py b/tests/agent_features/test_function_trace.py index 78c012b30..f1f0cd9ac 100644 --- a/tests/agent_features/test_function_trace.py +++ b/tests/agent_features/test_function_trace.py @@ -17,9 +17,8 @@ from newrelic.api.background_task import background_task from newrelic.api.function_trace import FunctionTrace -from testing_support.fixtures import (validate_transaction_metrics, - validate_tt_parenting) - +from testing_support.fixtures import validate_tt_parenting +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics _test_function_trace_default_group_scoped_metrics = [ ('Function/FunctionTrace', 1)] diff --git a/tests/agent_features/test_high_security_mode.py b/tests/agent_features/test_high_security_mode.py index 89499d365..20d997837 100644 --- a/tests/agent_features/test_high_security_mode.py +++ b/tests/agent_features/test_high_security_mode.py @@ -24,10 +24,18 @@ validate_attributes_complete, validate_custom_event_count, validate_custom_event_in_application_stats_engine, + validate_request_params_omitted, +) +from testing_support.validators.validate_custom_parameters import ( validate_custom_parameters, +) +from testing_support.validators.validate_non_transaction_error_event import ( validate_non_transaction_error_event, - validate_request_params_omitted, +) +from testing_support.validators.validate_transaction_errors import ( validate_transaction_errors, +) +from testing_support.validators.validate_tt_segment_params import ( validate_tt_segment_params, ) @@ -38,7 +46,7 @@ from newrelic.api.settings import STRIP_EXCEPTION_MESSAGE from newrelic.api.time_trace import notice_error from newrelic.api.transaction import ( - add_custom_parameter, + add_custom_attribute, capture_request_params, current_transaction, record_custom_event, @@ -396,7 +404,7 @@ def test_remote_config_hsm_fixups_server_side_disabled(): @validate_custom_parameters(required_params=[("key", "value")]) @background_task() def test_other_transaction_custom_parameters_hsm_disabled(): - add_custom_parameter("key", "value") + add_custom_attribute("key", "value") @override_application_settings(_test_transaction_settings_hsm_disabled) @@ -404,14 +412,14 @@ def test_other_transaction_custom_parameters_hsm_disabled(): @background_task() def test_other_transaction_multiple_custom_parameters_hsm_disabled(): transaction = current_transaction() - transaction.add_custom_parameters([("key-1", "value-1"), ("key-2", 
"value-2")]) + transaction.add_custom_attributes([("key-1", "value-1"), ("key-2", "value-2")]) @override_application_settings(_test_transaction_settings_hsm_enabled) @validate_custom_parameters(forgone_params=[("key", "value")]) @background_task() def test_other_transaction_custom_parameters_hsm_enabled(): - add_custom_parameter("key", "value") + add_custom_attribute("key", "value") @override_application_settings(_test_transaction_settings_hsm_enabled) @@ -419,7 +427,7 @@ def test_other_transaction_custom_parameters_hsm_enabled(): @background_task() def test_other_transaction_multiple_custom_parameters_hsm_enabled(): transaction = current_transaction() - transaction.add_custom_parameters([("key-1", "value-1"), ("key-2", "value-2")]) + transaction.add_custom_attributes([("key-1", "value-1"), ("key-2", "value-2")]) class TestException(Exception): @@ -434,7 +442,7 @@ class TestException(Exception): @validate_custom_parameters(required_params=[("key-1", "value-1")]) @background_task() def test_other_transaction_error_parameters_hsm_disabled(): - add_custom_parameter("key-1", "value-1") + add_custom_attribute("key-1", "value-1") try: raise TestException("test message") except Exception: @@ -448,7 +456,7 @@ def test_other_transaction_error_parameters_hsm_disabled(): @validate_custom_parameters(forgone_params=[("key-1", "value-1")]) @background_task() def test_other_transaction_error_parameters_hsm_enabled(): - add_custom_parameter("key-1", "value-1") + add_custom_attribute("key-1", "value-1") try: raise TestException("test message") except Exception: diff --git a/tests/agent_features/test_ignore_expected_errors.py b/tests/agent_features/test_ignore_expected_errors.py index d685c39c0..ee26245c5 100644 --- a/tests/agent_features/test_ignore_expected_errors.py +++ b/tests/agent_features/test_ignore_expected_errors.py @@ -16,12 +16,24 @@ from testing_support.fixtures import ( override_application_settings, reset_core_stats_engine, - validate_error_event_attributes_outside_transaction, validate_error_event_sample_data, +) +from testing_support.validators.validate_error_event_attributes_outside_transaction import ( + validate_error_event_attributes_outside_transaction, +) +from testing_support.validators.validate_error_trace_attributes_outside_transaction import ( validate_error_trace_attributes_outside_transaction, +) +from testing_support.validators.validate_time_metrics_outside_transaction import ( validate_time_metrics_outside_transaction, +) +from testing_support.validators.validate_transaction_error_trace_attributes import ( validate_transaction_error_trace_attributes, +) +from testing_support.validators.validate_transaction_errors import ( validate_transaction_errors, +) +from testing_support.validators.validate_transaction_metrics import ( validate_transaction_metrics, ) @@ -86,8 +98,9 @@ def test_classes_error_event_inside_transaction(settings, expected, ignore): error_count = 1 if not ignore else 0 errors = _test_runtime_error if not ignore else [] + expected_errors = _runtime_error_name if expected and not ignore else None - @validate_transaction_errors(errors=errors) + @validate_transaction_errors(errors=errors, expected_errors=expected_errors) @validate_error_event_sample_data( required_attrs=attributes, required_user_attrs=False, @@ -260,8 +273,9 @@ def test_status_codes_inside_transaction(settings, expected, ignore, status_code error_count = 1 if not ignore else 0 errors = _test_teapot_error if not ignore else [] + expected_errors = _teapot_error_name if expected and not ignore else 
None - @validate_transaction_errors(errors=errors) + @validate_transaction_errors(errors=errors, expected_errors=expected_errors) @validate_error_event_sample_data( required_attrs=attributes, required_user_attrs=False, @@ -351,8 +365,9 @@ def test_mixed_ignore_expected_settings_inside_transaction( error_count = 1 if not ignore else 0 errors = _test_runtime_error if not ignore else [] + expected_errors = _runtime_error_name if expected and not ignore else None - @validate_transaction_errors(errors=errors) + @validate_transaction_errors(errors=errors, expected_errors=expected_errors) @validate_error_event_sample_data( required_attrs=attributes, required_user_attrs=False, @@ -420,8 +435,9 @@ def test_overrides_inside_transaction(override, result, parameter): error_count = 1 if not ignore else 0 errors = _test_runtime_error if not ignore else [] + expected_errors = _runtime_error_name if expected and not ignore else None - @validate_transaction_errors(errors=errors) + @validate_transaction_errors(errors=errors, expected_errors=expected_errors) @validate_error_event_sample_data( required_attrs=attributes, required_user_attrs=False, diff --git a/tests/agent_features/test_lambda_handler.py b/tests/agent_features/test_lambda_handler.py index 4ff932e36..40b694407 100644 --- a/tests/agent_features/test_lambda_handler.py +++ b/tests/agent_features/test_lambda_handler.py @@ -13,12 +13,18 @@ # limitations under the License. import functools -import pytest from copy import deepcopy -from testing_support.fixtures import (override_application_settings, - validate_transaction_trace_attributes, - validate_transaction_event_attributes) -import newrelic.api.lambda_handler as lambda_handler + +import pytest +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_event_attributes import ( + validate_transaction_event_attributes, +) +from testing_support.validators.validate_transaction_trace_attributes import ( + validate_transaction_trace_attributes, +) + +from newrelic.api import lambda_handler # NOTE: this fixture will force all tests in this file to assume that a cold @@ -27,7 +33,7 @@ @pytest.fixture(autouse=True) def force_cold_start_status(request): try: - is_cold_start = request.getfixturevalue('is_cold') + is_cold_start = request.getfixturevalue("is_cold") lambda_handler.COLD_START_RECORDED = not is_cold_start except Exception: lambda_handler.COLD_START_RECORDED = True @@ -36,63 +42,65 @@ def force_cold_start_status(request): @lambda_handler.lambda_handler() def handler(event, context): return { - 'statusCode': '200', - 'body': '{}', - 'headers': { - 'Content-Type': 'application/json', - 'Content-Length': 2, + "statusCode": "200", + "body": "{}", + "headers": { + "Content-Type": "application/json", + "Content-Length": 2, }, } _override_settings = { - 'attributes.include': ['request.parameters.*', 'request.headers.*'], + "attributes.include": ["request.parameters.*", "request.headers.*"], } _expected_attributes = { - 'agent': [ - 'aws.requestId', - 'aws.lambda.arn', - 'request.method', - 'request.uri', - 'response.status', - 'response.headers.contentType', - 'response.headers.contentLength', + "agent": [ + "aws.requestId", + "aws.lambda.arn", + "request.method", + "request.uri", + "response.status", + "response.headers.contentType", + "response.headers.contentLength", ], - 'user': [], - 'intrinsic': [], + "user": [], + "intrinsic": [], } _exact_attrs = { - 'agent': { - 'request.parameters.foo': 'bar', - 'request.headers.host': 'myhost', + 
"agent": { + "request.parameters.foo": "bar", + "request.headers.host": "myhost", }, - 'user': {}, - 'intrinsic': {} + "user": {}, + "intrinsic": {}, } empty_event = {} firehose_event = { - "records": [{ - "recordId": "495469866831355442", - "data": "SGVsbG8sIHRoaXMgaXMgYSB0ZXN0IDEyMy4=", - "approximateArrivalTimestamp": 1495072949453 - }], + "records": [ + { + "recordId": "495469866831355442", + "data": "SGVsbG8sIHRoaXMgaXMgYSB0ZXN0IDEyMy4=", + "approximateArrivalTimestamp": 1495072949453, + } + ], "region": "us-west-2", "deliveryStreamArn": "arn:aws:kinesis:EXAMPLE", - "invocationId": "invocationIdExample" + "invocationId": "invocationIdExample", } class Context(object): - aws_request_id = 'cookies' - invoked_function_arn = 'arn' - function_name = 'cats' - function_version = '$LATEST' + aws_request_id = "cookies" + invoked_function_arn = "arn" + function_name = "cats" + function_version = "$LATEST" memory_limit_in_mb = 128 -@pytest.mark.parametrize('is_cold', (False, True)) +@pytest.mark.parametrize("is_cold", (False, True)) def test_lambda_transaction_attributes(is_cold, monkeypatch): # setup copies of the attribute lists for this test only _forgone_params = {} @@ -101,36 +109,32 @@ def test_lambda_transaction_attributes(is_cold, monkeypatch): # if we have a cold start, then we should see aws.lambda.coldStart=True if is_cold: - _exact['agent']['aws.lambda.coldStart'] = True - _expected['agent'].append('aws.lambda.coldStart') + _exact["agent"]["aws.lambda.coldStart"] = True + _expected["agent"].append("aws.lambda.coldStart") # otherwise, then we need to make sure that we don't see it at all else: - _forgone_params = { - 'agent': ['aws.lambda.coldStart'], - 'user': [], - 'intrinsic': [] - } + _forgone_params = {"agent": ["aws.lambda.coldStart"], "user": [], "intrinsic": []} - @validate_transaction_trace_attributes( - required_params=_expected, - forgone_params=_forgone_params) + @validate_transaction_trace_attributes(required_params=_expected, forgone_params=_forgone_params) @validate_transaction_event_attributes( - required_params=_expected, - forgone_params=_forgone_params, - exact_attrs=_exact) + required_params=_expected, forgone_params=_forgone_params, exact_attrs=_exact + ) @override_application_settings(_override_settings) def _test(): - monkeypatch.setenv('AWS_REGION', 'earth') - handler({ - 'httpMethod': 'GET', - 'path': '/', - 'headers': { - 'HOST': 'myhost', + monkeypatch.setenv("AWS_REGION", "earth") + handler( + { + "httpMethod": "GET", + "path": "/", + "headers": { + "HOST": "myhost", + }, + "queryStringParameters": {"foo": "bar"}, + "multiValueQueryStringParameters": {"foo": ["bar"]}, }, - 'queryStringParameters': {'foo': 'bar'}, - 'multiValueQueryStringParameters': {'foo': ['bar']}, - }, Context) + Context, + ) _test() @@ -139,23 +143,26 @@ def _test(): @validate_transaction_event_attributes(_expected_attributes) @override_application_settings(_override_settings) def test_lambda_malformed_api_gateway_payload(monkeypatch): - monkeypatch.setenv('AWS_REGION', 'earth') - handler({ - 'httpMethod': 'GET', - 'path': '/', - 'headers': {}, - 'queryStringParameters': 42, - 'multiValueQueryStringParameters': 42, - }, Context) + monkeypatch.setenv("AWS_REGION", "earth") + handler( + { + "httpMethod": "GET", + "path": "/", + "headers": {}, + "queryStringParameters": 42, + "multiValueQueryStringParameters": 42, + }, + Context, + ) _malformed_request_attributes = { - 'agent': [ - 'aws.requestId', - 'aws.lambda.arn', + "agent": [ + "aws.requestId", + "aws.lambda.arn", ], - 'user': [], - 
'intrinsic': [], + "user": [], + "intrinsic": [], } @@ -163,23 +170,26 @@ def test_lambda_malformed_api_gateway_payload(monkeypatch): @validate_transaction_event_attributes(_malformed_request_attributes) @override_application_settings(_override_settings) def test_lambda_malformed_request_headers(): - handler({ - 'httpMethod': 'GET', - 'path': '/', - 'headers': None, - }, Context) + handler( + { + "httpMethod": "GET", + "path": "/", + "headers": None, + }, + Context, + ) _malformed_response_attributes = { - 'agent': [ - 'aws.requestId', - 'aws.lambda.arn', - 'request.method', - 'request.uri', - 'response.status', + "agent": [ + "aws.requestId", + "aws.lambda.arn", + "request.method", + "request.uri", + "response.status", ], - 'user': [], - 'intrinsic': [], + "user": [], + "intrinsic": [], } @@ -187,33 +197,35 @@ def test_lambda_malformed_request_headers(): @validate_transaction_event_attributes(_malformed_response_attributes) @override_application_settings(_override_settings) def test_lambda_malformed_response_headers(): - @lambda_handler.lambda_handler() def handler(event, context): return { - 'statusCode': 200, - 'body': '{}', - 'headers': None, + "statusCode": 200, + "body": "{}", + "headers": None, } - handler({ - 'httpMethod': 'GET', - 'path': '/', - 'headers': {}, - }, Context) + handler( + { + "httpMethod": "GET", + "path": "/", + "headers": {}, + }, + Context, + ) _no_status_code_response = { - 'agent': [ - 'aws.requestId', - 'aws.lambda.arn', - 'request.method', - 'request.uri', - 'response.headers.contentType', - 'response.headers.contentLength', + "agent": [ + "aws.requestId", + "aws.lambda.arn", + "request.method", + "request.uri", + "response.headers.contentType", + "response.headers.contentLength", ], - 'user': [], - 'intrinsic': [], + "user": [], + "intrinsic": [], } @@ -221,53 +233,51 @@ def handler(event, context): @validate_transaction_event_attributes(_no_status_code_response) @override_application_settings(_override_settings) def test_lambda_no_status_code_response(): - @lambda_handler.lambda_handler() def handler(event, context): return { - 'body': '{}', - 'headers': { - 'Content-Type': 'application/json', - 'Content-Length': 2, + "body": "{}", + "headers": { + "Content-Type": "application/json", + "Content-Length": 2, }, } - handler({ - 'httpMethod': 'GET', - 'path': '/', - 'headers': {}, - }, Context) + handler( + { + "httpMethod": "GET", + "path": "/", + "headers": {}, + }, + Context, + ) -@pytest.mark.parametrize('event,arn', ( - (empty_event, None), - (firehose_event, 'arn:aws:kinesis:EXAMPLE'))) +@pytest.mark.parametrize("event,arn", ((empty_event, None), (firehose_event, "arn:aws:kinesis:EXAMPLE"))) def test_lambda_event_source_arn_attribute(event, arn): if arn is None: _exact = None _expected = None _forgone = { - 'user': [], 'intrinsic': [], - 'agent': ['aws.lambda.eventSource.arn'], + "user": [], + "intrinsic": [], + "agent": ["aws.lambda.eventSource.arn"], } else: _exact = { - 'user': {}, 'intrinsic': {}, - 'agent': {'aws.lambda.eventSource.arn': arn}, + "user": {}, + "intrinsic": {}, + "agent": {"aws.lambda.eventSource.arn": arn}, } _expected = { - 'user': [], 'intrinsic': [], - 'agent': ['aws.lambda.eventSource.arn'], + "user": [], + "intrinsic": [], + "agent": ["aws.lambda.eventSource.arn"], } _forgone = None - @validate_transaction_trace_attributes( - required_params=_expected, - forgone_params=_forgone) - @validate_transaction_event_attributes( - required_params=_expected, - forgone_params=_forgone, - exact_attrs=_exact) + 
@validate_transaction_trace_attributes(required_params=_expected, forgone_params=_forgone) + @validate_transaction_event_attributes(required_params=_expected, forgone_params=_forgone, exact_attrs=_exact) @override_application_settings(_override_settings) def _test(): handler(event, Context) @@ -275,10 +285,13 @@ def _test(): _test() -@pytest.mark.parametrize('api', ( - lambda_handler.lambda_handler, - functools.partial(lambda_handler.LambdaHandlerWrapper, handler), -)) +@pytest.mark.parametrize( + "api", + ( + lambda_handler.lambda_handler, + functools.partial(lambda_handler.LambdaHandlerWrapper, handler), + ), +) def test_deprecation_warnings(api): with pytest.deprecated_call(): api() diff --git a/tests/agent_features/test_notice_error.py b/tests/agent_features/test_notice_error.py index e052602a0..913ee9289 100644 --- a/tests/agent_features/test_notice_error.py +++ b/tests/agent_features/test_notice_error.py @@ -19,11 +19,19 @@ error_is_saved, override_application_settings, reset_core_stats_engine, + validate_transaction_error_event_count, + validate_transaction_error_trace_count, +) +from testing_support.validators.validate_application_error_event_count import ( validate_application_error_event_count, +) +from testing_support.validators.validate_application_error_trace_count import ( validate_application_error_trace_count, +) +from testing_support.validators.validate_application_errors import ( validate_application_errors, - validate_transaction_error_event_count, - validate_transaction_error_trace_count, +) +from testing_support.validators.validate_transaction_errors import ( validate_transaction_errors, ) diff --git a/tests/agent_features/test_priority_sampling.py b/tests/agent_features/test_priority_sampling.py index 41f7fc1f3..f73824f71 100644 --- a/tests/agent_features/test_priority_sampling.py +++ b/tests/agent_features/test_priority_sampling.py @@ -13,16 +13,18 @@ # limitations under the License. import pytest +from testing_support.fixtures import ( + core_application_stats_engine, + override_application_settings, + reset_core_stats_engine, +) -from testing_support.fixtures import (reset_core_stats_engine, - core_application_stats_engine, override_application_settings) from newrelic.api.application import application_instance as application from newrelic.api.background_task import BackgroundTask -@override_application_settings( - {'event_harvest_config.harvest_limits.analytic_event_data': 1}) -@pytest.mark.parametrize('first_transaction_saved', [True, False]) +@override_application_settings({"event_harvest_config.harvest_limits.analytic_event_data": 1}) +@pytest.mark.parametrize("first_transaction_saved", [True, False]) def test_priority_used_in_transaction_events(first_transaction_saved): first_priority = 1 if first_transaction_saved else 0 second_priority = 0 if first_transaction_saved else 1 @@ -32,46 +34,46 @@ def _test(): # Stats engine stats_engine = core_application_stats_engine() - with BackgroundTask(application(), name='T1') as txn: + with BackgroundTask(application(), name="T1") as txn: txn._priority = first_priority - with BackgroundTask(application(), name='T2') as txn: + with BackgroundTask(application(), name="T2") as txn: txn._priority = second_priority transaction_events = list(stats_engine.transaction_events) assert len(transaction_events) == 1 - # highest priority should win - assert stats_engine.transaction_events.pq[0][0] == 1 + # Highest priority should win. + # Priority can be 1 or 2 depending on randomness in sampling computation. 
+        assert stats_engine.transaction_events.pq[0][0] >= 1

         if first_transaction_saved:
-            assert transaction_events[0][0]['name'].endswith('/T1')
+            assert transaction_events[0][0]["name"].endswith("/T1")
         else:
-            assert transaction_events[0][0]['name'].endswith('/T2')
+            assert transaction_events[0][0]["name"].endswith("/T2")

     _test()


-@override_application_settings({
-    'event_harvest_config.harvest_limits.error_event_data': 1})
-@pytest.mark.parametrize('first_transaction_saved', [True, False])
+@override_application_settings({"event_harvest_config.harvest_limits.error_event_data": 1})
+@pytest.mark.parametrize("first_transaction_saved", [True, False])
 def test_priority_used_in_transaction_error_events(first_transaction_saved):
     first_priority = 1 if first_transaction_saved else 0
     second_priority = 0 if first_transaction_saved else 1

     @reset_core_stats_engine()
     def _test():
-        with BackgroundTask(application(), name='T1') as txn:
+        with BackgroundTask(application(), name="T1") as txn:
             txn._priority = first_priority
             try:
-                raise ValueError('OOPS')
+                raise ValueError("OOPS")
             except ValueError:
                 txn.notice_error()

-        with BackgroundTask(application(), name='T2') as txn:
+        with BackgroundTask(application(), name="T2") as txn:
             txn._priority = second_priority
             try:
-                raise ValueError('OOPS')
+                raise ValueError("OOPS")
             except ValueError:
                 txn.notice_error()

@@ -85,29 +87,28 @@ def _test():
         assert stats_engine.error_events.pq[0][0] == 1

         if first_transaction_saved:
-            assert error_events[0][0]['transactionName'].endswith('/T1')
+            assert error_events[0][0]["transactionName"].endswith("/T1")
         else:
-            assert error_events[0][0]['transactionName'].endswith('/T2')
+            assert error_events[0][0]["transactionName"].endswith("/T2")

     _test()


-@override_application_settings({
-    'event_harvest_config.harvest_limits.custom_event_data': 1})
-@pytest.mark.parametrize('first_transaction_saved', [True, False])
+@override_application_settings({"event_harvest_config.harvest_limits.custom_event_data": 1})
+@pytest.mark.parametrize("first_transaction_saved", [True, False])
 def test_priority_used_in_transaction_custom_events(first_transaction_saved):
     first_priority = 1 if first_transaction_saved else 0
     second_priority = 0 if first_transaction_saved else 1

     @reset_core_stats_engine()
     def _test():
-        with BackgroundTask(application(), name='T1') as txn:
+        with BackgroundTask(application(), name="T1") as txn:
             txn._priority = first_priority
-            txn.record_custom_event('foobar', {'foo': 'bar'})
+            txn.record_custom_event("foobar", {"foo": "bar"})

-        with BackgroundTask(application(), name='T2') as txn:
+        with BackgroundTask(application(), name="T2") as txn:
             txn._priority = second_priority
-            txn.record_custom_event('barbaz', {'foo': 'bar'})
+            txn.record_custom_event("barbaz", {"foo": "bar"})

         # Stats engine
         stats_engine = core_application_stats_engine()
@@ -119,8 +120,8 @@ def _test():
         assert stats_engine.custom_events.pq[0][0] == 1

         if first_transaction_saved:
-            assert custom_events[0][0]['type'] == 'foobar'
+            assert custom_events[0][0]["type"] == "foobar"
         else:
-            assert custom_events[0][0]['type'] == 'barbaz'
+            assert custom_events[0][0]["type"] == "barbaz"

     _test()
diff --git a/tests/agent_features/test_profile_trace.py b/tests/agent_features/test_profile_trace.py
new file mode 100644
index 000000000..f696b7480
--- /dev/null
+++ b/tests/agent_features/test_profile_trace.py
@@ -0,0 +1,88 @@
+# Copyright 2010 New Relic, Inc.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#      http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+
+from testing_support.validators.validate_transaction_metrics import (
+    validate_transaction_metrics,
+)
+
+from newrelic.api.background_task import background_task
+from newrelic.api.profile_trace import ProfileTraceWrapper, profile_trace
+
+
+def test_profile_trace_wrapper():
+    def _test():
+        def nested_fn():
+            pass
+
+        nested_fn()
+
+    wrapped_test = ProfileTraceWrapper(_test)
+    wrapped_test()
+
+
+@validate_transaction_metrics("test_profile_trace:test_profile_trace_empty_args", background_task=True)
+@background_task()
+def test_profile_trace_empty_args():
+    @profile_trace()
+    def _test():
+        pass
+
+    _test()
+
+
+_test_profile_trace_defined_args_scoped_metrics = [("Custom/TestTrace", 1)]
+
+
+@validate_transaction_metrics(
+    "test_profile_trace:test_profile_trace_defined_args",
+    scoped_metrics=_test_profile_trace_defined_args_scoped_metrics,
+    background_task=True,
+)
+@background_task()
+def test_profile_trace_defined_args():
+    @profile_trace(name="TestTrace", group="Custom", label="Label", params={"key": "value"}, depth=7)
+    def _test():
+        pass
+
+    _test()
+
+
+_test_profile_trace_callable_args_scoped_metrics = [("Function/TestProfileTrace", 1)]
+
+
+@validate_transaction_metrics(
+    "test_profile_trace:test_profile_trace_callable_args",
+    scoped_metrics=_test_profile_trace_callable_args_scoped_metrics,
+    background_task=True,
+)
+@background_task()
+def test_profile_trace_callable_args():
+    def name_callable():
+        return "TestProfileTrace"
+
+    def group_callable():
+        return "Function"
+
+    def label_callable():
+        return "HSM"
+
+    def params_callable():
+        return {"account_id": "12345"}
+
+    @profile_trace(name=name_callable, group=group_callable, label=label_callable, params=params_callable, depth=0)
+    def _test():
+        pass
+
+    _test()
diff --git a/tests/agent_features/test_serverless_mode.py b/tests/agent_features/test_serverless_mode.py
index 75b5f0075..189481f70 100644
--- a/tests/agent_features/test_serverless_mode.py
+++ b/tests/agent_features/test_serverless_mode.py
@@ -13,7 +13,16 @@
 # limitations under the License.
 import json
+
 import pytest
+from testing_support.fixtures import override_generic_settings
+from testing_support.validators.validate_serverless_data import validate_serverless_data
+from testing_support.validators.validate_serverless_metadata import (
+    validate_serverless_metadata,
+)
+from testing_support.validators.validate_serverless_payload import (
+    validate_serverless_payload,
+)

 from newrelic.api.application import application_instance
 from newrelic.api.background_task import background_task
@@ -22,23 +31,14 @@
 from newrelic.api.transaction import current_transaction
 from newrelic.core.config import global_settings

-from testing_support.fixtures import override_generic_settings
-from testing_support.validators.validate_serverless_data import (
-    validate_serverless_data)
-from testing_support.validators.validate_serverless_payload import (
-    validate_serverless_payload)
-from testing_support.validators.validate_serverless_metadata import (
-    validate_serverless_metadata)
-

-@pytest.fixture(scope='function')
+@pytest.fixture(scope="function")
 def serverless_application(request):
     settings = global_settings()
     orig = settings.serverless_mode.enabled
     settings.serverless_mode.enabled = True

-    application_name = 'Python Agent Test (test_serverless_mode:%s)' % (
-        request.node.name)
+    application_name = "Python Agent Test (test_serverless_mode:%s)" % (request.node.name)
     application = application_instance(application_name)
     application.activate()

@@ -48,17 +48,18 @@ def serverless_application(request):


 def test_serverless_payload(capsys, serverless_application):
-
-    @override_generic_settings(serverless_application.settings, {
-        'distributed_tracing.enabled': True,
-    })
+    @override_generic_settings(
+        serverless_application.settings,
+        {
+            "distributed_tracing.enabled": True,
+        },
+    )
     @validate_serverless_data(
-        expected_methods=('metric_data', 'analytic_event_data'),
-        forgone_methods=('preconnect', 'connect', 'get_agent_commands'))
+        expected_methods=("metric_data", "analytic_event_data"),
+        forgone_methods=("preconnect", "connect", "get_agent_commands"),
+    )
     @validate_serverless_payload()
-    @background_task(
-        application=serverless_application,
-        name='test_serverless_payload')
+    @background_task(application=serverless_application, name="test_serverless_payload")
     def _test():
         transaction = current_transaction()
         assert transaction.settings.serverless_mode.enabled
@@ -75,17 +76,15 @@ def _test():


 def test_no_cat_headers(serverless_application):
-    @background_task(
-        application=serverless_application,
-        name='test_cat_headers')
+    @background_task(application=serverless_application, name="test_cat_headers")
     def _test_cat_headers():
         transaction = current_transaction()

         payload = ExternalTrace.generate_request_headers(transaction)
         assert not payload

-        trace = ExternalTrace('testlib', 'http://example.com')
-        response_headers = [('X-NewRelic-App-Data', 'Cookies')]
+        trace = ExternalTrace("testlib", "http://example.com")
+        response_headers = [("X-NewRelic-App-Data", "Cookies")]
         with trace:
             trace.process_response_headers(response_headers)

@@ -94,61 +93,66 @@ def _test_cat_headers():
     _test_cat_headers()


-def test_dt_outbound(serverless_application):
-    @override_generic_settings(serverless_application.settings, {
-        'distributed_tracing.enabled': True,
-        'account_id': '1',
-        'trusted_account_key': '1',
-        'primary_application_id': '1',
-    })
-    @background_task(
-        application=serverless_application,
-        name='test_dt_outbound')
-    def _test_dt_outbound():
+@pytest.mark.parametrize("trusted_account_key", ("1", None), ids=("tk_set", "tk_unset"))
+def test_outbound_dt_payload_generation(serverless_application, trusted_account_key):
+    @override_generic_settings(
+        serverless_application.settings,
+        {
+            "distributed_tracing.enabled": True,
+            "account_id": "1",
+            "trusted_account_key": trusted_account_key,
+            "primary_application_id": "1",
+        },
+    )
+    @background_task(application=serverless_application, name="test_outbound_dt_payload_generation")
+    def _test_outbound_dt_payload_generation():
         transaction = current_transaction()
         payload = ExternalTrace.generate_request_headers(transaction)
         assert payload
-
-    _test_dt_outbound()
-
-
-def test_dt_inbound(serverless_application):
-    @override_generic_settings(serverless_application.settings, {
-        'distributed_tracing.enabled': True,
-        'account_id': '1',
-        'trusted_account_key': '1',
-        'primary_application_id': '1',
-    })
-    @background_task(
-        application=serverless_application,
-        name='test_dt_inbound')
-    def _test_dt_inbound():
+        # Ensure trusted account key or account ID present as vendor
+        assert dict(payload)["tracestate"].startswith("1@nr=")
+
+    _test_outbound_dt_payload_generation()
+
+
+@pytest.mark.parametrize("trusted_account_key", ("1", None), ids=("tk_set", "tk_unset"))
+def test_inbound_dt_payload_acceptance(serverless_application, trusted_account_key):
+    @override_generic_settings(
+        serverless_application.settings,
+        {
+            "distributed_tracing.enabled": True,
+            "account_id": "1",
+            "trusted_account_key": trusted_account_key,
+            "primary_application_id": "1",
+        },
+    )
+    @background_task(application=serverless_application, name="test_inbound_dt_payload_acceptance")
+    def _test_inbound_dt_payload_acceptance():
         transaction = current_transaction()

         payload = {
-            'v': [0, 1],
-            'd': {
-                'ty': 'Mobile',
-                'ac': '1',
-                'tk': '1',
-                'ap': '2827902',
-                'pa': '5e5733a911cfbc73',
-                'id': '7d3efb1b173fecfa',
-                'tr': 'd6b4ba0c3a712ca',
-                'ti': 1518469636035,
-                'tx': '8703ff3d88eefe9d',
-            }
+            "v": [0, 1],
+            "d": {
+                "ty": "Mobile",
+                "ac": "1",
+                "tk": "1",
+                "ap": "2827902",
+                "pa": "5e5733a911cfbc73",
+                "id": "7d3efb1b173fecfa",
+                "tr": "d6b4ba0c3a712ca",
+                "ti": 1518469636035,
+                "tx": "8703ff3d88eefe9d",
+            },
         }

         result = transaction.accept_distributed_trace_payload(payload)
         assert result

-    _test_dt_inbound()
+    _test_inbound_dt_payload_acceptance()


-@pytest.mark.parametrize('arn_set', (True, False))
+@pytest.mark.parametrize("arn_set", (True, False))
 def test_payload_metadata_arn(serverless_application, arn_set):
-
     # If the session object gathers the arn from the settings object before the
     # lambda handler records it there, then this test will fail.
@@ -157,17 +161,17 @@ def test_payload_metadata_arn(serverless_application, arn_set): arn = None if arn_set: - arn = 'arrrrrrrrrrRrrrrrrrn' + arn = "arrrrrrrrrrRrrrrrrrn" - settings.aws_lambda_metadata.update({'arn': arn, 'function_version': '$LATEST'}) + settings.aws_lambda_metadata.update({"arn": arn, "function_version": "$LATEST"}) class Context(object): invoked_function_arn = arn - @validate_serverless_metadata(exact_metadata={'arn': arn}) + @validate_serverless_metadata(exact_metadata={"arn": arn}) @lambda_handler(application=serverless_application) def handler(event, context): - assert settings.aws_lambda_metadata['arn'] == arn + assert settings.aws_lambda_metadata["arn"] == arn return {} try: diff --git a/tests/agent_features/test_span_events.py b/tests/agent_features/test_span_events.py index 0024c1b8b..05e375ff3 100644 --- a/tests/agent_features/test_span_events.py +++ b/tests/agent_features/test_span_events.py @@ -12,15 +12,26 @@ # See the License for the specific language governing permissions and # limitations under the License. -import pytest import sys -from newrelic.api.transaction import current_transaction -from newrelic.api.time_trace import (current_trace, - add_custom_span_attribute, notice_error) -from newrelic.api.background_task import background_task -from newrelic.common.object_names import callable_name +import pytest +from testing_support.fixtures import ( + dt_enabled, + function_not_called, + override_application_settings, +) +from testing_support.validators.validate_span_events import validate_span_events +from testing_support.validators.validate_transaction_event_attributes import ( + validate_transaction_event_attributes, +) +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) +from testing_support.validators.validate_tt_segment_params import ( + validate_tt_segment_params, +) +from newrelic.api.background_task import background_task from newrelic.api.database_trace import DatabaseTrace from newrelic.api.datastore_trace import DatastoreTrace from newrelic.api.external_trace import ExternalTrace @@ -28,81 +39,73 @@ from newrelic.api.memcache_trace import MemcacheTrace from newrelic.api.message_trace import MessageTrace from newrelic.api.solr_trace import SolrTrace - -from testing_support.fixtures import (override_application_settings, - function_not_called, validate_tt_segment_params, - validate_transaction_metrics, dt_enabled, - validate_transaction_event_attributes) -from testing_support.validators.validate_span_events import ( - validate_span_events) +from newrelic.api.time_trace import ( + add_custom_span_attribute, + current_trace, + notice_error, +) +from newrelic.api.transaction import current_transaction +from newrelic.common.object_names import callable_name ERROR = ValueError("whoops") ERROR_NAME = callable_name(ERROR) -@pytest.mark.parametrize('dt_enabled', (True, False)) -@pytest.mark.parametrize('span_events_enabled', (True, False)) -@pytest.mark.parametrize('txn_sampled', (True, False)) +@pytest.mark.parametrize("dt_enabled", (True, False)) +@pytest.mark.parametrize("span_events_enabled", (True, False)) +@pytest.mark.parametrize("txn_sampled", (True, False)) def test_span_events(dt_enabled, span_events_enabled, txn_sampled): - guid = 'dbb536c53b749e0b' - sentinel_guid = '0687e0c371ea2c4e' - function_guid = '482439c52de807ee' - transaction_name = 'OtherTransaction/Function/transaction' + guid = "dbb536c53b749e0b" + sentinel_guid = "0687e0c371ea2c4e" + function_guid = "482439c52de807ee" + 
transaction_name = "OtherTransaction/Function/transaction" priority = 0.5 - @function_trace(name='child') + @function_trace(name="child") def child(): pass - @function_trace(name='function') + @function_trace(name="function") def function(): current_trace().guid = function_guid child() - _settings = { - 'distributed_tracing.enabled': dt_enabled, - 'span_events.enabled': span_events_enabled - } + _settings = {"distributed_tracing.enabled": dt_enabled, "span_events.enabled": span_events_enabled} count = 0 if dt_enabled and span_events_enabled and txn_sampled: count = 1 exact_intrinsics_common = { - 'type': 'Span', - 'transactionId': guid, - 'sampled': txn_sampled, - 'priority': priority, - 'category': 'generic', + "type": "Span", + "transactionId": guid, + "sampled": txn_sampled, + "priority": priority, + "category": "generic", } - expected_intrinsics = ('timestamp', 'duration') + expected_intrinsics = ("timestamp", "duration") exact_intrinsics_root = exact_intrinsics_common.copy() - exact_intrinsics_root['name'] = 'Function/transaction' - exact_intrinsics_root['transaction.name'] = transaction_name - exact_intrinsics_root['nr.entryPoint'] = True + exact_intrinsics_root["name"] = "Function/transaction" + exact_intrinsics_root["transaction.name"] = transaction_name + exact_intrinsics_root["nr.entryPoint"] = True exact_intrinsics_function = exact_intrinsics_common.copy() - exact_intrinsics_function['name'] = 'Function/function' - exact_intrinsics_function['parentId'] = sentinel_guid + exact_intrinsics_function["name"] = "Function/function" + exact_intrinsics_function["parentId"] = sentinel_guid exact_intrinsics_child = exact_intrinsics_common.copy() - exact_intrinsics_child['name'] = 'Function/child' - exact_intrinsics_child['parentId'] = function_guid - - @validate_span_events(count=count, - expected_intrinsics=['nr.entryPoint']) - @validate_span_events(count=count, - exact_intrinsics=exact_intrinsics_root, - expected_intrinsics=expected_intrinsics) - @validate_span_events(count=count, - exact_intrinsics=exact_intrinsics_function, - expected_intrinsics=expected_intrinsics) - @validate_span_events(count=count, - exact_intrinsics=exact_intrinsics_child, - expected_intrinsics=expected_intrinsics) + exact_intrinsics_child["name"] = "Function/child" + exact_intrinsics_child["parentId"] = function_guid + + @validate_span_events(count=count, expected_intrinsics=["nr.entryPoint"]) + @validate_span_events(count=count, exact_intrinsics=exact_intrinsics_root, expected_intrinsics=expected_intrinsics) + @validate_span_events( + count=count, exact_intrinsics=exact_intrinsics_function, expected_intrinsics=expected_intrinsics + ) + @validate_span_events(count=count, exact_intrinsics=exact_intrinsics_child, expected_intrinsics=expected_intrinsics) @override_application_settings(_settings) - @background_task(name='transaction') + @background_task(name="transaction") def _test(): # Force intrinsics txn = current_transaction() @@ -116,24 +119,28 @@ def _test(): _test() -@pytest.mark.parametrize('trace_type,args', ( - (DatabaseTrace, ('select * from foo', )), - (DatastoreTrace, ('db_product', 'db_target', 'db_operation')), - (ExternalTrace, ('lib', 'url')), - (FunctionTrace, ('name', )), - (MemcacheTrace, ('command', )), - (MessageTrace, ('lib', 'operation', 'dst_type', 'dst_name')), - (SolrTrace, ('lib', 'command')), -)) +@pytest.mark.parametrize( + "trace_type,args", + ( + (DatabaseTrace, ("select * from foo",)), + (DatastoreTrace, ("db_product", "db_target", "db_operation")), + (ExternalTrace, ("lib", "url")), 
+ (FunctionTrace, ("name",)), + (MemcacheTrace, ("command",)), + (MessageTrace, ("lib", "operation", "dst_type", "dst_name")), + (SolrTrace, ("lib", "command")), + ), +) def test_each_span_type(trace_type, args): @validate_span_events(count=2) - @override_application_settings({ - 'distributed_tracing.enabled': True, - 'span_events.enabled': True, - }) - @background_task(name='test_each_span_type') + @override_application_settings( + { + "distributed_tracing.enabled": True, + "span_events.enabled": True, + } + ) + @background_task(name="test_each_span_type") def _test(): - transaction = current_transaction() transaction._sampled = True @@ -143,37 +150,37 @@ def _test(): _test() -@pytest.mark.parametrize('sql,sql_format,expected', ( - pytest.param( - 'a' * 2001, - 'raw', - ''.join(['a'] * 1997 + ['...']), - id='truncate'), - pytest.param( - 'a' * 2000, - 'raw', - ''.join(['a'] * 2000), - id='no_truncate'), - pytest.param( - 'select * from %s' % ''.join(['?'] * 2000), - 'obfuscated', - 'select * from %s...' % ( - ''.join(['?'] * (2000 - len('select * from ') - 3))), - id='truncate_obfuscated'), - pytest.param('select 1', 'off', ''), - pytest.param('select 1', 'raw', 'select 1'), - pytest.param('select 1', 'obfuscated', 'select ?'), -)) +@pytest.mark.parametrize( + "sql,sql_format,expected", + ( + pytest.param("a" * 2001, "raw", "".join(["a"] * 1997 + ["..."]), id="truncate"), + pytest.param("a" * 2000, "raw", "".join(["a"] * 2000), id="no_truncate"), + pytest.param( + "select * from %s" % "".join(["?"] * 2000), + "obfuscated", + "select * from %s..." % ("".join(["?"] * (2000 - len("select * from ") - 3))), + id="truncate_obfuscated", + ), + pytest.param("select 1", "off", ""), + pytest.param("select 1", "raw", "select 1"), + pytest.param("select 1", "obfuscated", "select ?"), + ), +) def test_database_db_statement_format(sql, sql_format, expected): - @validate_span_events(count=1, exact_agents={ - 'db.statement': expected, - }) - @override_application_settings({ - 'distributed_tracing.enabled': True, - 'span_events.enabled': True, - 'transaction_tracer.record_sql': sql_format, - }) - @background_task(name='test_database_db_statement_format') + @validate_span_events( + count=1, + exact_agents={ + "db.statement": expected, + }, + ) + @override_application_settings( + { + "distributed_tracing.enabled": True, + "span_events.enabled": True, + "transaction_tracer.record_sql": sql_format, + } + ) + @background_task(name="test_database_db_statement_format") def _test(): transaction = current_transaction() transaction._sampled = True @@ -186,115 +193,129 @@ def _test(): @validate_span_events( count=1, - exact_intrinsics={'category': 'datastore'}, - unexpected_agents=['db.statement'], + exact_intrinsics={"category": "datastore"}, + unexpected_agents=["db.statement"], +) +@override_application_settings( + { + "distributed_tracing.enabled": True, + "span_events.enabled": True, + "span_events.attributes.exclude": ["db.statement"], + } ) -@override_application_settings({ - 'distributed_tracing.enabled': True, - 'span_events.enabled': True, - 'span_events.attributes.exclude': ['db.statement'], -}) -@background_task(name='test_database_db_statement_exclude') +@background_task(name="test_database_db_statement_exclude") def test_database_db_statement_exclude(): transaction = current_transaction() transaction._sampled = True - with DatabaseTrace('select 1'): + with DatabaseTrace("select 1"): pass -@pytest.mark.parametrize('trace_type,args,attrs', ( - (DatastoreTrace, ('db_product', 'db_target', 'db_operation'), 
{"db.collection": "db_target", "db.operation": "db_operation"}), - (DatabaseTrace, ("select 1 from db_table",), {"db.collection": "db_table", "db.statement": "select ? from db_table"}), -)) +@pytest.mark.parametrize( + "trace_type,args,attrs", + ( + ( + DatastoreTrace, + ("db_product", "db_target", "db_operation"), + {"db.collection": "db_target", "db.operation": "db_operation"}, + ), + ( + DatabaseTrace, + ("select 1 from db_table",), + {"db.collection": "db_table", "db.statement": "select ? from db_table"}, + ), + ), +) def test_datastore_database_trace_attrs(trace_type, args, attrs): @validate_span_events( count=1, exact_agents=attrs, ) - @override_application_settings({ - 'distributed_tracing.enabled': True, - 'span_events.enabled': True, - }) - @background_task(name='test_database_db_statement_exclude') + @override_application_settings( + { + "distributed_tracing.enabled": True, + "span_events.enabled": True, + } + ) + @background_task(name="test_database_db_statement_exclude") def test(): transaction = current_transaction() transaction._sampled = True with trace_type(*args): pass - + test() -@pytest.mark.parametrize('exclude_url', (True, False)) +@pytest.mark.parametrize("exclude_url", (True, False)) def test_external_spans(exclude_url): override_settings = { - 'distributed_tracing.enabled': True, - 'span_events.enabled': True, + "distributed_tracing.enabled": True, + "span_events.enabled": True, } if exclude_url: - override_settings['span_events.attributes.exclude'] = ['http.url'] + override_settings["span_events.attributes.exclude"] = ["http.url"] exact_agents = {} - unexpected_agents = ['http.url'] + unexpected_agents = ["http.url"] else: - exact_agents = {'http.url': 'http://example.com/foo'} + exact_agents = {"http.url": "http://example.com/foo"} unexpected_agents = [] @validate_span_events( count=1, exact_intrinsics={ - 'name': 'External/example.com/library/get', - 'type': 'Span', - 'sampled': True, - - 'category': 'http', - 'span.kind': 'client', - 'component': 'library', - 'http.method': 'get', + "name": "External/example.com/library/get", + "type": "Span", + "sampled": True, + "category": "http", + "span.kind": "client", + "component": "library", + "http.method": "get", }, exact_agents=exact_agents, unexpected_agents=unexpected_agents, - expected_intrinsics=('priority',), + expected_intrinsics=("priority",), ) @override_application_settings(override_settings) - @background_task(name='test_external_spans') + @background_task(name="test_external_spans") def _test(): transaction = current_transaction() transaction._sampled = True - with ExternalTrace( - library='library', - url='http://example.com/foo?secret=123', - method='get'): + with ExternalTrace(library="library", url="http://example.com/foo?secret=123", method="get"): pass _test() -@pytest.mark.parametrize('kwarg_override,attr_override', ( - ({'url': 'a' * 256}, {'http.url': 'a' * 255}), - ({'library': 'a' * 256}, {'component': 'a' * 255}), - ({'method': 'a' * 256}, {'http.method': 'a' * 255}), -)) -@override_application_settings({ - 'distributed_tracing.enabled': True, - 'span_events.enabled': True, -}) +@pytest.mark.parametrize( + "kwarg_override,attr_override", + ( + ({"url": "a" * 256}, {"http.url": "a" * 255}), + ({"library": "a" * 256}, {"component": "a" * 255}), + ({"method": "a" * 256}, {"http.method": "a" * 255}), + ), +) +@override_application_settings( + { + "distributed_tracing.enabled": True, + "span_events.enabled": True, + } +) def test_external_span_limits(kwarg_override, attr_override): - 
exact_intrinsics = { - 'type': 'Span', - 'sampled': True, - - 'category': 'http', - 'span.kind': 'client', - 'component': 'library', - 'http.method': 'get', + "type": "Span", + "sampled": True, + "category": "http", + "span.kind": "client", + "component": "library", + "http.method": "get", } exact_agents = { - 'http.url': 'http://example.com/foo', + "http.url": "http://example.com/foo", } for attr_name, attr_value in attr_override.items(): if attr_name in exact_agents: @@ -303,9 +324,9 @@ def test_external_span_limits(kwarg_override, attr_override): exact_intrinsics[attr_name] = attr_value kwargs = { - 'library': 'library', - 'url': 'http://example.com/foo?secret=123', - 'method': 'get', + "library": "library", + "url": "http://example.com/foo?secret=123", + "method": "get", } kwargs.update(kwarg_override) @@ -313,9 +334,9 @@ def test_external_span_limits(kwarg_override, attr_override): count=1, exact_intrinsics=exact_intrinsics, exact_agents=exact_agents, - expected_intrinsics=('priority',), + expected_intrinsics=("priority",), ) - @background_task(name='test_external_spans') + @background_task(name="test_external_spans") def _test(): transaction = current_transaction() transaction._sampled = True @@ -326,32 +347,33 @@ def _test(): _test() -@pytest.mark.parametrize('kwarg_override,attribute_override', ( - ({'host': 'a' * 256}, - {'peer.hostname': 'a' * 255, 'peer.address': 'a' * 255}), - ({'port_path_or_id': 'a' * 256, 'host': 'a'}, - {'peer.hostname': 'a', 'peer.address': 'a:' + 'a' * 253}), - ({'database_name': 'a' * 256}, {'db.instance': 'a' * 255}), -)) -@override_application_settings({ - 'distributed_tracing.enabled': True, - 'span_events.enabled': True, -}) +@pytest.mark.parametrize( + "kwarg_override,attribute_override", + ( + ({"host": "a" * 256}, {"peer.hostname": "a" * 255, "peer.address": "a" * 255}), + ({"port_path_or_id": "a" * 256, "host": "a"}, {"peer.hostname": "a", "peer.address": "a:" + "a" * 253}), + ({"database_name": "a" * 256}, {"db.instance": "a" * 255}), + ), +) +@override_application_settings( + { + "distributed_tracing.enabled": True, + "span_events.enabled": True, + } +) def test_datastore_span_limits(kwarg_override, attribute_override): - exact_intrinsics = { - 'type': 'Span', - 'sampled': True, - - 'category': 'datastore', - 'span.kind': 'client', - 'component': 'library', + "type": "Span", + "sampled": True, + "category": "datastore", + "span.kind": "client", + "component": "library", } exact_agents = { - 'db.instance': 'db', - 'peer.hostname': 'foo', - 'peer.address': 'foo:1234', + "db.instance": "db", + "peer.hostname": "foo", + "peer.address": "foo:1234", } for k, v in attribute_override.items(): @@ -361,22 +383,22 @@ def test_datastore_span_limits(kwarg_override, attribute_override): exact_intrinsics[k] = v kwargs = { - 'product': 'library', - 'target': 'table', - 'operation': 'operation', - 'host': 'foo', - 'port_path_or_id': 1234, - 'database_name': 'db', + "product": "library", + "target": "table", + "operation": "operation", + "host": "foo", + "port_path_or_id": 1234, + "database_name": "db", } kwargs.update(kwarg_override) @validate_span_events( count=1, exact_intrinsics=exact_intrinsics, - expected_intrinsics=('priority',), + expected_intrinsics=("priority",), exact_agents=exact_agents, ) - @background_task(name='test_external_spans') + @background_task(name="test_external_spans") def _test(): transaction = current_transaction() transaction._sampled = True @@ -387,76 +409,72 @@ def _test(): _test() -@pytest.mark.parametrize('collect_span_events', 
(False, True)) -@pytest.mark.parametrize('span_events_enabled', (False, True)) -def test_collect_span_events_override(collect_span_events, - span_events_enabled): - - if collect_span_events and span_events_enabled: - spans_expected = True - else: - spans_expected = False +@pytest.mark.parametrize("collect_span_events", (False, True)) +@pytest.mark.parametrize("span_events_enabled", (False, True)) +def test_collect_span_events_override(collect_span_events, span_events_enabled): + spans_expected = collect_span_events and span_events_enabled span_count = 2 if spans_expected else 0 @validate_span_events(count=span_count) - @override_application_settings({ - 'transaction_tracer.enabled': False, - 'distributed_tracing.enabled': True, - 'span_events.enabled': span_events_enabled, - 'collect_span_events': collect_span_events - }) - @background_task(name='test_collect_span_events_override') + @override_application_settings( + { + "transaction_tracer.enabled": False, + "distributed_tracing.enabled": True, + "span_events.enabled": span_events_enabled, + "collect_span_events": collect_span_events, + } + ) + @background_task(name="test_collect_span_events_override") def _test(): transaction = current_transaction() transaction._sampled = True - with FunctionTrace('span_generator'): + with FunctionTrace("span_generator"): pass if not spans_expected: - _test = function_not_called( - 'newrelic.core.attribute', - 'resolve_agent_attributes')(_test) + _test = function_not_called("newrelic.core.attribute", "resolve_agent_attributes")(_test) _test() -@pytest.mark.parametrize('include_attribues', (True, False)) +@pytest.mark.parametrize("include_attribues", (True, False)) def test_span_event_agent_attributes(include_attribues): override_settings = { - 'distributed_tracing.enabled': True, - 'span_events.enabled': True, + "distributed_tracing.enabled": True, + "span_events.enabled": True, } if include_attribues: count = 1 - override_settings['attributes.include'] = ['*'] + override_settings["attributes.include"] = ["*"] else: count = 0 @override_application_settings(override_settings) + @validate_span_events(count=count, expected_agents=["webfrontend.queue.seconds"]) @validate_span_events( - count=count, expected_agents=['webfrontend.queue.seconds']) - @validate_span_events( - count=count, - exact_agents={'trace1_a': 'foobar', 'trace1_b': 'barbaz'}, - unexpected_agents=['trace2_a', 'trace2_b']) + count=count, + exact_agents={"trace1_a": "foobar", "trace1_b": "barbaz"}, + unexpected_agents=["trace2_a", "trace2_b"], + ) @validate_span_events( - count=count, - exact_agents={'trace2_a': 'foobar', 'trace2_b': 'barbaz'}, - unexpected_agents=['trace1_a', 'trace1_b']) - @background_task(name='test_span_event_agent_attributes') + count=count, + exact_agents={"trace2_a": "foobar", "trace2_b": "barbaz"}, + unexpected_agents=["trace1_a", "trace1_b"], + ) + @background_task(name="test_span_event_agent_attributes") def _test(): transaction = current_transaction() transaction.queue_start = 1.0 transaction._sampled = True - with FunctionTrace('trace1') as trace_1: - trace_1._add_agent_attribute('trace1_a', 'foobar') - trace_1._add_agent_attribute('trace1_b', 'barbaz') - with FunctionTrace('trace2') as trace_2: - trace_2._add_agent_attribute('trace2_a', 'foobar') - trace_2._add_agent_attribute('trace2_b', 'barbaz') + with FunctionTrace("trace1") as trace_1: + trace_1._add_agent_attribute("trace1_a", "foobar") + trace_1._add_agent_attribute("trace1_b", "barbaz") + with FunctionTrace("trace2") as trace_2: + 
trace_2._add_agent_attribute("trace2_a", "foobar") + trace_2._add_agent_attribute("trace2_b", "barbaz") _test() @@ -469,31 +487,35 @@ def __exit__(self, *args): pass -@pytest.mark.parametrize('trace_type,args', ( - (DatabaseTrace, ('select * from foo', )), - (DatastoreTrace, ('db_product', 'db_target', 'db_operation')), - (ExternalTrace, ('lib', 'url')), - (FunctionTrace, ('name', )), - (MemcacheTrace, ('command', )), - (MessageTrace, ('lib', 'operation', 'dst_type', 'dst_name')), - (SolrTrace, ('lib', 'command')), - (FakeTrace, ()), -)) -@pytest.mark.parametrize('exclude_attributes', (True, False)) +@pytest.mark.parametrize( + "trace_type,args", + ( + (DatabaseTrace, ("select * from foo",)), + (DatastoreTrace, ("db_product", "db_target", "db_operation")), + (ExternalTrace, ("lib", "url")), + (FunctionTrace, ("name",)), + (MemcacheTrace, ("command",)), + (MessageTrace, ("lib", "operation", "dst_type", "dst_name")), + (SolrTrace, ("lib", "command")), + (FakeTrace, ()), + ), +) +@pytest.mark.parametrize("exclude_attributes", (True, False)) def test_span_event_user_attributes(trace_type, args, exclude_attributes): - _settings = { - 'distributed_tracing.enabled': True, - 'span_events.enabled': True, + "distributed_tracing.enabled": True, + "span_events.enabled": True, } - forgone_params = ['invalid_value', ] - expected_params = {'trace1_a': 'foobar', 'trace1_b': 'barbaz'} + forgone_params = [ + "invalid_value", + ] + expected_params = {"trace1_a": "foobar", "trace1_b": "barbaz"} # We expect user_attributes to be included by default if exclude_attributes: count = 0 - _settings['attributes.exclude'] = ['*'] - forgone_params.extend(('trace1_a', 'trace1_b')) + _settings["attributes.exclude"] = ["*"] + forgone_params.extend(("trace1_a", "trace1_b")) expected_trace_params = {} else: expected_trace_params = expected_params @@ -503,44 +525,44 @@ def test_span_event_user_attributes(trace_type, args, exclude_attributes): @validate_span_events( count=count, exact_users=expected_params, - unexpected_users=forgone_params,) - @validate_tt_segment_params(exact_params=expected_trace_params, - forgone_params=forgone_params) - @background_task(name='test_span_event_user_attributes') + unexpected_users=forgone_params, + ) + @validate_tt_segment_params(exact_params=expected_trace_params, forgone_params=forgone_params) + @background_task(name="test_span_event_user_attributes") def _test(): transaction = current_transaction() transaction._sampled = True with trace_type(*args): - add_custom_span_attribute('trace1_a', 'foobar') - add_custom_span_attribute('trace1_b', 'barbaz') - add_custom_span_attribute('invalid_value', sys.maxsize + 1) + add_custom_span_attribute("trace1_a", "foobar") + add_custom_span_attribute("trace1_b", "barbaz") + add_custom_span_attribute("invalid_value", sys.maxsize + 1) _test() -@validate_span_events(count=1, exact_users={'foo': 'b'}) +@validate_span_events(count=1, exact_users={"foo": "b"}) @dt_enabled -@background_task(name='test_span_user_attribute_overrides_transaction_attribute') +@background_task(name="test_span_user_attribute_overrides_transaction_attribute") def test_span_user_attribute_overrides_transaction_attribute(): transaction = current_transaction() - transaction.add_custom_parameter('foo', 'a') - add_custom_span_attribute('foo', 'b') - transaction.add_custom_parameter('foo', 'c') + transaction.add_custom_parameter("foo", "a") + add_custom_span_attribute("foo", "b") + transaction.add_custom_parameter("foo", "c") -@override_application_settings({'attributes.include': '*'}) 
-@validate_span_events(count=1, exact_agents={'foo': 'b'}) +@override_application_settings({"attributes.include": "*"}) +@validate_span_events(count=1, exact_agents={"foo": "b"}) @dt_enabled -@background_task(name='test_span_agent_attribute_overrides_transaction_attribute') +@background_task(name="test_span_agent_attribute_overrides_transaction_attribute") def test_span_agent_attribute_overrides_transaction_attribute(): transaction = current_transaction() trace = current_trace() - transaction._add_agent_attribute('foo', 'a') - trace._add_agent_attribute('foo', 'b') - transaction._add_agent_attribute('foo', 'c') + transaction._add_agent_attribute("foo", "a") + trace._add_agent_attribute("foo", "b") + transaction._add_agent_attribute("foo", "c") def test_span_custom_attribute_limit(): @@ -555,71 +577,68 @@ def test_span_custom_attribute_limit(): for i in range(128): if i < 64: - span_custom_attrs.append('span_attr%i' % i) - txn_custom_attrs.append('txn_attr%i' % i) + span_custom_attrs.append("span_attr%i" % i) + txn_custom_attrs.append("txn_attr%i" % i) unexpected_txn_attrs.extend(span_custom_attrs) span_custom_attrs.extend(txn_custom_attrs[:64]) - expected_txn_attrs = {'user': txn_custom_attrs, 'agent': [], - 'intrinsic': []} - expected_absent_txn_attrs = {'agent': [], - 'user': unexpected_txn_attrs, - 'intrinsic': []} - - @override_application_settings({'attributes.include': '*'}) - @validate_transaction_event_attributes(expected_txn_attrs, - expected_absent_txn_attrs) - @validate_span_events(count=1, - expected_users=span_custom_attrs, - unexpected_users=txn_custom_attrs[64:]) + expected_txn_attrs = {"user": txn_custom_attrs, "agent": [], "intrinsic": []} + expected_absent_txn_attrs = {"agent": [], "user": unexpected_txn_attrs, "intrinsic": []} + + @override_application_settings({"attributes.include": "*"}) + @validate_transaction_event_attributes(expected_txn_attrs, expected_absent_txn_attrs) + @validate_span_events(count=1, expected_users=span_custom_attrs, unexpected_users=txn_custom_attrs[64:]) @dt_enabled - @background_task(name='test_span_attribute_limit') + @background_task(name="test_span_attribute_limit") def _test(): transaction = current_transaction() for i in range(128): - transaction.add_custom_parameter('txn_attr%i' % i, 'txnValue') + transaction.add_custom_parameter("txn_attr%i" % i, "txnValue") if i < 64: - add_custom_span_attribute('span_attr%i' % i, 'spanValue') + add_custom_span_attribute("span_attr%i" % i, "spanValue") + _test() _span_event_metrics = [("Supportability/SpanEvent/Errors/Dropped", None)] -@pytest.mark.parametrize('trace_type,args', ( - (DatabaseTrace, ('select * from foo', )), - (DatastoreTrace, ('db_product', 'db_target', 'db_operation')), - (ExternalTrace, ('lib', 'url')), - (FunctionTrace, ('name', )), - (MemcacheTrace, ('command', )), - (MessageTrace, ('lib', 'operation', 'dst_type', 'dst_name')), - (SolrTrace, ('lib', 'command')), - (FakeTrace, ()), -)) +@pytest.mark.parametrize( + "trace_type,args", + ( + (DatabaseTrace, ("select * from foo",)), + (DatastoreTrace, ("db_product", "db_target", "db_operation")), + (ExternalTrace, ("lib", "url")), + (FunctionTrace, ("name",)), + (MemcacheTrace, ("command",)), + (MessageTrace, ("lib", "operation", "dst_type", "dst_name")), + (SolrTrace, ("lib", "command")), + (FakeTrace, ()), + ), +) def test_span_event_error_attributes_notice_error(trace_type, args): - _settings = { - 'distributed_tracing.enabled': True, - 'span_events.enabled': True, + "distributed_tracing.enabled": True, + "span_events.enabled": True, 
} error = ValueError("whoops") exact_agents = { - 'error.class': callable_name(error), - 'error.message': 'whoops', + "error.class": callable_name(error), + "error.message": "whoops", } @override_application_settings(_settings) @validate_transaction_metrics( - 'test_span_event_error_attributes_notice_error', - background_task=True, - rollup_metrics=_span_event_metrics) + "test_span_event_error_attributes_notice_error", background_task=True, rollup_metrics=_span_event_metrics + ) @validate_span_events( count=1, - exact_agents=exact_agents,) - @background_task(name='test_span_event_error_attributes_notice_error') + exact_agents=exact_agents, + ) + @background_task(name="test_span_event_error_attributes_notice_error") def _test(): transaction = current_transaction() transaction._sampled = True @@ -633,36 +652,38 @@ def _test(): _test() -@pytest.mark.parametrize('trace_type,args', ( - (DatabaseTrace, ('select * from foo', )), - (DatastoreTrace, ('db_product', 'db_target', 'db_operation')), - (ExternalTrace, ('lib', 'url')), - (FunctionTrace, ('name', )), - (MemcacheTrace, ('command', )), - (MessageTrace, ('lib', 'operation', 'dst_type', 'dst_name')), - (SolrTrace, ('lib', 'command')), -)) +@pytest.mark.parametrize( + "trace_type,args", + ( + (DatabaseTrace, ("select * from foo",)), + (DatastoreTrace, ("db_product", "db_target", "db_operation")), + (ExternalTrace, ("lib", "url")), + (FunctionTrace, ("name",)), + (MemcacheTrace, ("command",)), + (MessageTrace, ("lib", "operation", "dst_type", "dst_name")), + (SolrTrace, ("lib", "command")), + ), +) def test_span_event_error_attributes_observed(trace_type, args): - error = ValueError("whoops") exact_agents = { - 'error.class': callable_name(error), - 'error.message': 'whoops', + "error.class": callable_name(error), + "error.message": "whoops", } # Verify errors are not recorded since notice_error is not called - rollups = [('Errors/all', None)] + _span_event_metrics + rollups = [("Errors/all", None)] + _span_event_metrics @dt_enabled @validate_transaction_metrics( - 'test_span_event_error_attributes_observed', - background_task=True, - rollup_metrics=rollups) + "test_span_event_error_attributes_observed", background_task=True, rollup_metrics=rollups + ) @validate_span_events( count=1, - exact_agents=exact_agents,) - @background_task(name='test_span_event_error_attributes_observed') + exact_agents=exact_agents, + ) + @background_task(name="test_span_event_error_attributes_observed") def _test(): try: with trace_type(*args): @@ -673,47 +694,52 @@ def _test(): _test() -@pytest.mark.parametrize('trace_type,args', ( - (DatabaseTrace, ('select * from foo', )), - (DatastoreTrace, ('db_product', 'db_target', 'db_operation')), - (ExternalTrace, ('lib', 'url')), - (FunctionTrace, ('name', )), - (MemcacheTrace, ('command', )), - (MessageTrace, ('lib', 'operation', 'dst_type', 'dst_name')), - (SolrTrace, ('lib', 'command')), - (FakeTrace, ()), -)) +@pytest.mark.parametrize( + "trace_type,args", + ( + (DatabaseTrace, ("select * from foo",)), + (DatastoreTrace, ("db_product", "db_target", "db_operation")), + (ExternalTrace, ("lib", "url")), + (FunctionTrace, ("name",)), + (MemcacheTrace, ("command",)), + (MessageTrace, ("lib", "operation", "dst_type", "dst_name")), + (SolrTrace, ("lib", "command")), + (FakeTrace, ()), + ), +) @dt_enabled -@validate_span_events(count=1, - exact_agents={'error.class': ERROR_NAME, 'error.message': 'whoops'}) -@background_task(name='test_span_event_notice_error_overrides_observed') +@validate_span_events(count=1, 
exact_agents={"error.class": ERROR_NAME, "error.message": "whoops"}) +@background_task(name="test_span_event_notice_error_overrides_observed") def test_span_event_notice_error_overrides_observed(trace_type, args): try: with trace_type(*args): try: raise ERROR - except: + except Exception: notice_error() - raise ValueError + raise ValueError # pylint: disable (Py2/Py3 compatibility) except ValueError: pass -@pytest.mark.parametrize('trace_type,args', ( - (DatabaseTrace, ('select * from foo', )), - (DatastoreTrace, ('db_product', 'db_target', 'db_operation')), - (ExternalTrace, ('lib', 'url')), - (FunctionTrace, ('name', )), - (MemcacheTrace, ('command', )), - (MessageTrace, ('lib', 'operation', 'dst_type', 'dst_name')), - (SolrTrace, ('lib', 'command')), - (FakeTrace, ()), -)) -@override_application_settings({'error_collector.enabled': False}) -@validate_span_events(count=0, expected_agents=['error.class']) -@validate_span_events(count=0, expected_agents=['error.message']) +@pytest.mark.parametrize( + "trace_type,args", + ( + (DatabaseTrace, ("select * from foo",)), + (DatastoreTrace, ("db_product", "db_target", "db_operation")), + (ExternalTrace, ("lib", "url")), + (FunctionTrace, ("name",)), + (MemcacheTrace, ("command",)), + (MessageTrace, ("lib", "operation", "dst_type", "dst_name")), + (SolrTrace, ("lib", "command")), + (FakeTrace, ()), + ), +) +@override_application_settings({"error_collector.enabled": False}) +@validate_span_events(count=0, expected_agents=["error.class"]) +@validate_span_events(count=0, expected_agents=["error.message"]) @dt_enabled -@background_task(name='test_span_event_errors_disabled') +@background_task(name="test_span_event_errors_disabled") def test_span_event_errors_disabled(trace_type, args): with trace_type(*args): try: @@ -725,32 +751,34 @@ def test_span_event_errors_disabled(trace_type, args): _metrics = [("Supportability/SpanEvent/Errors/Dropped", 2)] -@pytest.mark.parametrize('trace_type,args', ( - (FunctionTrace, ('name', )), - (FakeTrace, ()), -)) +@pytest.mark.parametrize( + "trace_type,args", + ( + (FunctionTrace, ("name",)), + (FakeTrace, ()), + ), +) def test_span_event_multiple_errors(trace_type, args): _settings = { - 'distributed_tracing.enabled': True, - 'span_events.enabled': True, + "distributed_tracing.enabled": True, + "span_events.enabled": True, } error = ValueError("whoops") exact_agents = { - 'error.class': callable_name(error), - 'error.message': 'whoops', + "error.class": callable_name(error), + "error.message": "whoops", "error.expected": False, } @override_application_settings(_settings) @validate_span_events( count=1, - exact_agents=exact_agents,) - @validate_transaction_metrics("test_span_event_multiple_errors", - background_task=True, - rollup_metrics=_metrics) - @background_task(name='test_span_event_multiple_errors') + exact_agents=exact_agents, + ) + @validate_transaction_metrics("test_span_event_multiple_errors", background_task=True, rollup_metrics=_metrics) + @background_task(name="test_span_event_multiple_errors") def _test(): transaction = current_transaction() transaction._sampled = True diff --git a/tests/agent_features/test_supportability_metrics.py b/tests/agent_features/test_supportability_metrics.py index d3e4b9a69..d77502180 100644 --- a/tests/agent_features/test_supportability_metrics.py +++ b/tests/agent_features/test_supportability_metrics.py @@ -20,7 +20,7 @@ from newrelic.core.agent import agent_instance -from testing_support.fixtures import validate_transaction_metrics +from 
testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.validators.validate_metric_payload import ( validate_metric_payload) diff --git a/tests/agent_features/test_synthetics.py b/tests/agent_features/test_synthetics.py index cdf02e3e4..2e08144cc 100644 --- a/tests/agent_features/test_synthetics.py +++ b/tests/agent_features/test_synthetics.py @@ -19,7 +19,11 @@ cat_enabled, make_synthetics_header, override_application_settings, +) +from testing_support.validators.validate_synthetics_event import ( validate_synthetics_event, +) +from testing_support.validators.validate_synthetics_transaction_trace import ( validate_synthetics_transaction_trace, ) diff --git a/tests/agent_features/test_time_trace.py b/tests/agent_features/test_time_trace.py index f81e8750d..eccb4d7fe 100644 --- a/tests/agent_features/test_time_trace.py +++ b/tests/agent_features/test_time_trace.py @@ -14,11 +14,21 @@ import logging -from testing_support.fixtures import validate_transaction_metrics +import pytest +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) from newrelic.api.background_task import background_task +from newrelic.api.database_trace import DatabaseTrace +from newrelic.api.datastore_trace import DatastoreTrace +from newrelic.api.external_trace import ExternalTrace from newrelic.api.function_trace import FunctionTrace -from newrelic.api.transaction import end_of_transaction +from newrelic.api.graphql_trace import GraphQLOperationTrace, GraphQLResolverTrace +from newrelic.api.memcache_trace import MemcacheTrace +from newrelic.api.message_trace import MessageTrace +from newrelic.api.solr_trace import SolrTrace +from newrelic.api.transaction import current_transaction, end_of_transaction @validate_transaction_metrics( @@ -34,3 +44,30 @@ def test_trace_after_end_of_transaction(caplog): error_messages = [record for record in caplog.records if record.levelno >= logging.ERROR] assert not error_messages + + +@pytest.mark.parametrize( + "trace_type,args", + ( + (DatabaseTrace, ("select * from foo",)), + (DatastoreTrace, ("db_product", "db_target", "db_operation")), + (ExternalTrace, ("lib", "url")), + (FunctionTrace, ("name",)), + (GraphQLOperationTrace, ()), + (GraphQLResolverTrace, ()), + (MemcacheTrace, ("command",)), + (MessageTrace, ("lib", "operation", "dst_type", "dst_name")), + (SolrTrace, ("lib", "command")), + ), +) +@background_task() +def test_trace_finalizes_with_transaction_missing_settings(monkeypatch, trace_type, args): + txn = current_transaction() + try: + with trace_type(*args): + # Validate no errors are raised when finalizing trace with no settings + monkeypatch.setattr(txn, "_settings", None) + finally: + # Ensure transaction still has settings when it exits to prevent other crashes making errors hard to read + monkeypatch.undo() + assert txn.settings diff --git a/tests/agent_features/test_transaction_event_data_and_some_browser_stuff_too.py b/tests/agent_features/test_transaction_event_data_and_some_browser_stuff_too.py index a99fc1cdd..73bdfcf53 100644 --- a/tests/agent_features/test_transaction_event_data_and_some_browser_stuff_too.py +++ b/tests/agent_features/test_transaction_event_data_and_some_browser_stuff_too.py @@ -13,36 +13,39 @@ # limitations under the License. 
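
The new finalizer test above leans on pytest's monkeypatch fixture to knock out the transaction's settings mid-trace and then restore them before the transaction exits. A minimal sketch of that set-and-undo pattern, assuming only pytest; the Holder class is a hypothetical stand-in, not an agent type:

class Holder(object):
    # Hypothetical stand-in for an object whose attribute we break mid-test.
    def __init__(self):
        self.settings = {"enabled": True}


def test_monkeypatch_set_and_undo(monkeypatch):
    holder = Holder()
    try:
        # Knock the attribute out for the code under test.
        monkeypatch.setattr(holder, "settings", None)
        assert holder.settings is None
    finally:
        # Restore eagerly, mirroring the finally/undo in the trace test,
        # so later assertions see the original value.
        monkeypatch.undo()
    assert holder.settings == {"enabled": True}
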
import json + import webtest +from testing_support.fixtures import ( + override_application_settings, + validate_transaction_event_sample_data, +) +from testing_support.sample_applications import ( + fully_featured_app, + user_attributes_added, +) +from testing_support.validators.validate_transaction_event_attributes import ( + validate_transaction_event_attributes, +) from newrelic.api.application import application_settings from newrelic.api.background_task import background_task - from newrelic.common.encoding_utils import deobfuscate from newrelic.common.object_wrapper import transient_function_wrapper -from testing_support.fixtures import (override_application_settings, - validate_transaction_event_sample_data, - validate_transaction_event_attributes) -from testing_support.sample_applications import (fully_featured_app, - user_attributes_added) - - fully_featured_application = webtest.TestApp(fully_featured_app) _user_attributes = user_attributes_added() -#====================== Test cases ==================================== +# ====================== Test cases ==================================== -_test_capture_attributes_enabled_settings = { - 'browser_monitoring.attributes.enabled': True } +_test_capture_attributes_enabled_settings = {"browser_monitoring.attributes.enabled": True} _intrinsic_attributes = { - 'name': 'WebTransaction/Uri/', - 'port': 80, + "name": "WebTransaction/Uri/", + "port": 80, } -@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, - required_user_attrs=_user_attributes) + +@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, required_user_attrs=_user_attributes) @override_application_settings(_test_capture_attributes_enabled_settings) def test_capture_attributes_enabled(): settings = application_settings() @@ -52,7 +55,7 @@ def test_capture_attributes_enabled(): assert settings.js_agent_loader - response = fully_featured_application.get('/') + response = fully_featured_application.get("/") header = response.html.html.head.script.string content = response.html.html.body.p.string @@ -60,25 +63,23 @@ def test_capture_attributes_enabled(): # Validate actual body content. - assert content == 'RESPONSE' + assert content == "RESPONSE" # We no longer are in control of the JS contents of the header so # just check to make sure it contains at least the magic string # 'NREUM'. - assert header.find('NREUM') != -1 + assert header.find("NREUM") != -1 # Now validate the various fields of the footer related to analytics. # The fields are held by a JSON dictionary. - data = json.loads(footer.split('NREUM.info=')[1]) + data = json.loads(footer.split("NREUM.info=")[1]) obfuscation_key = settings.license_key[:13] - attributes = json.loads(deobfuscate(data['atts'], - obfuscation_key)) - user_attrs = attributes['u'] - + attributes = json.loads(deobfuscate(data["atts"], obfuscation_key)) + user_attrs = attributes["u"] # When you round-trip through json encoding and json decoding, you # always end up with unicode (unicode in Python 2, str in Python 3.) 
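
For reference, the atts payload validated in this test is just obfuscated JSON: the attribute dictionary is XOR-obfuscated against the first 13 characters of the license key and base64-encoded. A round-trip sketch under the assumption that an obfuscate counterpart lives alongside the deobfuscate helper imported above; the key below is a made-up stand-in for a license key prefix:

import json

from newrelic.common.encoding_utils import deobfuscate, obfuscate

obfuscation_key = "0123456789abc"  # stand-in for settings.license_key[:13]

# Encode the way the browser agent payload is built, then decode the way
# the test does.
payload = obfuscate(json.dumps({"u": {"foo": "bar"}}), obfuscation_key)
attributes = json.loads(deobfuscate(payload, obfuscation_key))

assert attributes["u"] == {"foo": "bar"}
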
@@ -90,22 +91,18 @@ def test_capture_attributes_enabled(): browser_attributes = _user_attributes.copy() - browser_attributes['bytes'] = u'bytes-value' - browser_attributes['invalid-utf8'] = _user_attributes[ - 'invalid-utf8'].decode('latin-1') - browser_attributes['multibyte-utf8'] = _user_attributes[ - 'multibyte-utf8'].decode('latin-1') + browser_attributes["bytes"] = "bytes-value" + browser_attributes["invalid-utf8"] = _user_attributes["invalid-utf8"].decode("latin-1") + browser_attributes["multibyte-utf8"] = _user_attributes["multibyte-utf8"].decode("latin-1") for attr, value in browser_attributes.items(): - assert user_attrs[attr] == value, ( - "attribute %r expected %r, found %r" % - (attr, value, user_attrs[attr])) + assert user_attrs[attr] == value, "attribute %r expected %r, found %r" % (attr, value, user_attrs[attr]) + + +_test_no_attributes_recorded_settings = {"browser_monitoring.attributes.enabled": True} -_test_no_attributes_recorded_settings = { - 'browser_monitoring.attributes.enabled': True } -@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, - required_user_attrs={}) +@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, required_user_attrs={}) @override_application_settings(_test_no_attributes_recorded_settings) def test_no_attributes_recorded(): settings = application_settings() @@ -115,8 +112,7 @@ def test_no_attributes_recorded(): assert settings.js_agent_loader - response = fully_featured_application.get('/', extra_environ={ - 'record_attributes': 'FALSE'}) + response = fully_featured_application.get("/", extra_environ={"record_attributes": "FALSE"}) header = response.html.html.head.script.string content = response.html.html.body.p.string @@ -124,32 +120,33 @@ def test_no_attributes_recorded(): # Validate actual body content. - assert content == 'RESPONSE' + assert content == "RESPONSE" # We no longer are in control of the JS contents of the header so # just check to make sure it contains at least the magic string # 'NREUM'. - assert header.find('NREUM') != -1 + assert header.find("NREUM") != -1 # Now validate the various fields of the footer related to analytics. # The fields are held by a JSON dictionary. - data = json.loads(footer.split('NREUM.info=')[1]) + data = json.loads(footer.split("NREUM.info=")[1]) # As we are not recording any user or agent attributes, we should not # actually have an entry at all in the footer. 
- assert 'atts' not in data + assert "atts" not in data + _test_analytic_events_capture_attributes_disabled_settings = { - 'transaction_events.attributes.enabled': False, - 'browser_monitoring.attributes.enabled': True } + "transaction_events.attributes.enabled": False, + "browser_monitoring.attributes.enabled": True, +} -@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, - required_user_attrs={}) -@override_application_settings( - _test_analytic_events_capture_attributes_disabled_settings) + +@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, required_user_attrs={}) +@override_application_settings(_test_analytic_events_capture_attributes_disabled_settings) def test_analytic_events_capture_attributes_disabled(): settings = application_settings() @@ -162,7 +159,7 @@ def test_analytic_events_capture_attributes_disabled(): assert settings.js_agent_loader - response = fully_featured_application.get('/') + response = fully_featured_application.get("/") header = response.html.html.head.script.string content = response.html.html.body.p.string @@ -170,23 +167,23 @@ def test_analytic_events_capture_attributes_disabled(): # Validate actual body content. - assert content == 'RESPONSE' + assert content == "RESPONSE" # We no longer are in control of the JS contents of the header so # just check to make sure it contains at least the magic string # 'NREUM'. - assert header.find('NREUM') != -1 + assert header.find("NREUM") != -1 # Now validate that attributes are present, since browser monitoring should # be enabled. - data = json.loads(footer.split('NREUM.info=')[1]) + data = json.loads(footer.split("NREUM.info=")[1]) + + assert "atts" in data - assert 'atts' in data -@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, - required_user_attrs=_user_attributes) +@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, required_user_attrs=_user_attributes) def test_capture_attributes_default(): settings = application_settings() @@ -195,7 +192,7 @@ def test_capture_attributes_default(): assert settings.js_agent_loader - response = fully_featured_application.get('/') + response = fully_featured_application.get("/") header = response.html.html.head.script.string content = response.html.html.body.p.string @@ -203,32 +200,29 @@ def test_capture_attributes_default(): # Validate actual body content. - assert content == 'RESPONSE' + assert content == "RESPONSE" # We no longer are in control of the JS contents of the header so # just check to make sure it contains at least the magic string # 'NREUM'. - assert header.find('NREUM') != -1 + assert header.find("NREUM") != -1 # Now validate that attributes are not present, since should # be disabled. 
- data = json.loads(footer.split('NREUM.info=')[1]) + data = json.loads(footer.split("NREUM.info=")[1]) - assert 'atts' not in data + assert "atts" not in data -_test_analytic_events_background_task_settings = { - 'browser_monitoring.attributes.enabled': True } -_intrinsic_attributes = { - 'name': 'OtherTransaction/Uri/' -} +_test_analytic_events_background_task_settings = {"browser_monitoring.attributes.enabled": True} + +_intrinsic_attributes = {"name": "OtherTransaction/Uri/"} -@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, - required_user_attrs=_user_attributes) -@override_application_settings( - _test_analytic_events_background_task_settings) + +@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, required_user_attrs=_user_attributes) +@override_application_settings(_test_analytic_events_background_task_settings) def test_analytic_events_background_task(): settings = application_settings() @@ -240,20 +234,17 @@ def test_analytic_events_background_task(): assert settings.js_agent_loader - response = fully_featured_application.get('/', extra_environ={ - 'newrelic.set_background_task': True}) + response = fully_featured_application.get("/", extra_environ={"newrelic.set_background_task": True}) assert response.html.html.head.script is None -_test_capture_attributes_disabled_settings = { - 'browser_monitoring.attributes.enabled': False } -_intrinsic_attributes = { - 'name': 'WebTransaction/Uri/' -} +_test_capture_attributes_disabled_settings = {"browser_monitoring.attributes.enabled": False} + +_intrinsic_attributes = {"name": "WebTransaction/Uri/"} -@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, - required_user_attrs=_user_attributes) + +@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, required_user_attrs=_user_attributes) @override_application_settings(_test_capture_attributes_disabled_settings) def test_capture_attributes_disabled(): settings = application_settings() @@ -263,7 +254,7 @@ def test_capture_attributes_disabled(): assert settings.js_agent_loader - response = fully_featured_application.get('/') + response = fully_featured_application.get("/") header = response.html.html.head.script.string content = response.html.html.body.p.string @@ -271,30 +262,34 @@ def test_capture_attributes_disabled(): # Validate actual body content. - assert content == 'RESPONSE' + assert content == "RESPONSE" # We no longer are in control of the JS contents of the header so # just check to make sure it contains at least the magic string # 'NREUM'. - assert header.find('NREUM') != -1 + assert header.find("NREUM") != -1 # Now validate that attributes are not present, since should # be disabled. - data = json.loads(footer.split('NREUM.info=')[1]) + data = json.loads(footer.split("NREUM.info=")[1]) + + assert "atts" not in data - assert 'atts' not in data -@transient_function_wrapper('newrelic.core.stats_engine', - 'SampledDataSet.add') +@transient_function_wrapper("newrelic.core.stats_engine", "SampledDataSet.add") def validate_no_analytics_sample_data(wrapped, instance, args, kwargs): - assert False, 'Should not be recording analytic event.' + assert False, "Should not be recording analytic event." 
return wrapped(*args, **kwargs) + _test_collect_analytic_events_disabled_settings = { - 'collect_analytics_events': False, - 'browser_monitoring.attributes.enabled': True } + "collect_analytics_events": False, + "distributed_tracing.enabled": False, + "browser_monitoring.attributes.enabled": True, +} + @validate_no_analytics_sample_data @override_application_settings(_test_collect_analytic_events_disabled_settings) @@ -308,7 +303,7 @@ def test_collect_analytic_events_disabled(): assert settings.js_agent_loader - response = fully_featured_application.get('/') + response = fully_featured_application.get("/") header = response.html.html.head.script.string content = response.html.html.body.p.string @@ -316,24 +311,28 @@ def test_collect_analytic_events_disabled(): # Validate actual body content. - assert content == 'RESPONSE' + assert content == "RESPONSE" # We no longer are in control of the JS contents of the header so # just check to make sure it contains at least the magic string # 'NREUM'. - assert header.find('NREUM') != -1 + assert header.find("NREUM") != -1 # Now validate that attributes are present, since should # be enabled. - data = json.loads(footer.split('NREUM.info=')[1]) + data = json.loads(footer.split("NREUM.info=")[1]) + + assert "atts" in data - assert 'atts' in data _test_analytic_events_disabled_settings = { - 'transaction_events.enabled': False, - 'browser_monitoring.attributes.enabled': True } + "transaction_events.enabled": False, + "distributed_tracing.enabled": False, + "browser_monitoring.attributes.enabled": True, +} + @validate_no_analytics_sample_data @override_application_settings(_test_analytic_events_disabled_settings) @@ -348,7 +347,7 @@ def test_analytic_events_disabled(): assert settings.js_agent_loader - response = fully_featured_application.get('/') + response = fully_featured_application.get("/") header = response.html.html.head.script.string content = response.html.html.body.p.string @@ -356,25 +355,26 @@ def test_analytic_events_disabled(): # Validate actual body content. - assert content == 'RESPONSE' + assert content == "RESPONSE" # We no longer are in control of the JS contents of the header so # just check to make sure it contains at least the magic string # 'NREUM'. - assert header.find('NREUM') != -1 + assert header.find("NREUM") != -1 # Now validate that attributes are present, since should # be enabled. - data = json.loads(footer.split('NREUM.info=')[1]) + data = json.loads(footer.split("NREUM.info=")[1]) + + assert "atts" in data - assert 'atts' in data # -------------- Test call counts in analytic events ---------------- -@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, - required_user_attrs=_user_attributes) + +@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, required_user_attrs=_user_attributes) def test_no_database_or_external_attributes_in_analytics(): """Make no external calls or database calls in the transaction and check if the analytic event doesn't have the databaseCallCount, databaseDuration, @@ -385,7 +385,7 @@ def test_no_database_or_external_attributes_in_analytics(): assert settings.browser_monitoring.enabled - response = fully_featured_application.get('/') + response = fully_featured_application.get("/") # Validation of analytic data happens in the decorator. @@ -393,15 +393,16 @@ def test_no_database_or_external_attributes_in_analytics(): # Validate actual body content. 
- assert content == 'RESPONSE' + assert content == "RESPONSE" + _intrinsic_attributes = { - 'name': 'WebTransaction/Uri/db', - 'databaseCallCount': 2, + "name": "WebTransaction/Uri/db", + "databaseCallCount": 2, } -@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, - required_user_attrs=_user_attributes) + +@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, required_user_attrs=_user_attributes) def test_database_attributes_in_analytics(): """Make database calls in the transaction and check if the analytic event has the databaseCallCount and databaseDuration attributes. @@ -412,9 +413,9 @@ def test_database_attributes_in_analytics(): assert settings.browser_monitoring.enabled test_environ = { - 'db' : '2', + "db": "2", } - response = fully_featured_application.get('/db', extra_environ=test_environ) + response = fully_featured_application.get("/db", extra_environ=test_environ) # Validation of analytic data happens in the decorator. @@ -422,15 +423,16 @@ def test_database_attributes_in_analytics(): # Validate actual body content. - assert content == 'RESPONSE' + assert content == "RESPONSE" + _intrinsic_attributes = { - 'name': 'WebTransaction/Uri/ext', - 'externalCallCount': 2, + "name": "WebTransaction/Uri/ext", + "externalCallCount": 2, } -@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, - required_user_attrs=_user_attributes) + +@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, required_user_attrs=_user_attributes) def test_external_attributes_in_analytics(): """Make external calls in the transaction and check if the analytic event has the externalCallCount and externalDuration attributes. @@ -441,10 +443,9 @@ def test_external_attributes_in_analytics(): assert settings.browser_monitoring.enabled test_environ = { - 'external' : '2', + "external": "2", } - response = fully_featured_application.get('/ext', - extra_environ=test_environ) + response = fully_featured_application.get("/ext", extra_environ=test_environ) # Validation of analytic data happens in the decorator. @@ -452,16 +453,17 @@ def test_external_attributes_in_analytics(): # Validate actual body content. - assert content == 'RESPONSE' + assert content == "RESPONSE" + _intrinsic_attributes = { - 'name': 'WebTransaction/Uri/dbext', - 'databaseCallCount': 2, - 'externalCallCount': 2, + "name": "WebTransaction/Uri/dbext", + "databaseCallCount": 2, + "externalCallCount": 2, } -@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, - required_user_attrs=_user_attributes) + +@validate_transaction_event_sample_data(required_attrs=_intrinsic_attributes, required_user_attrs=_user_attributes) def test_database_and_external_attributes_in_analytics(): """Make external calls and database calls in the transaction and check if the analytic event has the databaseCallCount, databaseDuration, @@ -473,11 +475,10 @@ def test_database_and_external_attributes_in_analytics(): assert settings.browser_monitoring.enabled test_environ = { - 'db' : '2', - 'external' : '2', + "db": "2", + "external": "2", } - response = fully_featured_application.get('/dbext', - extra_environ=test_environ) + response = fully_featured_application.get("/dbext", extra_environ=test_environ) # Validation of analytic data happens in the decorator. @@ -485,24 +486,25 @@ def test_database_and_external_attributes_in_analytics(): # Validate actual body content. 
- assert content == 'RESPONSE' + assert content == "RESPONSE" + # -------------- Test background tasks ---------------- _expected_attributes = { - 'user': [], - 'agent': [], - 'intrinsic' : ('name', 'duration', 'type', 'timestamp', 'totalTime'), + "user": [], + "agent": [], + "intrinsic": ("name", "duration", "type", "timestamp", "totalTime"), } _expected_absent_attributes = { - 'user': ('foo'), - 'agent': ('response.status', 'request.method'), - 'intrinsic': ('port'), + "user": ("foo"), + "agent": ("response.status", "request.method"), + "intrinsic": ("port"), } -@validate_transaction_event_attributes(_expected_attributes, - _expected_absent_attributes) + +@validate_transaction_event_attributes(_expected_attributes, _expected_absent_attributes) @background_task() def test_background_task_intrinsics_has_no_port(): pass diff --git a/tests/agent_features/test_transaction_name.py b/tests/agent_features/test_transaction_name.py index 492f64df3..beaf83b57 100644 --- a/tests/agent_features/test_transaction_name.py +++ b/tests/agent_features/test_transaction_name.py @@ -15,7 +15,7 @@ from newrelic.api.background_task import background_task from newrelic.api.transaction import set_transaction_name, set_background_task -from testing_support.fixtures import validate_transaction_metrics +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics # test diff --git a/tests/agent_features/test_transaction_trace_segments.py b/tests/agent_features/test_transaction_trace_segments.py index ad4d02b18..8318c0fca 100644 --- a/tests/agent_features/test_transaction_trace_segments.py +++ b/tests/agent_features/test_transaction_trace_segments.py @@ -13,142 +13,158 @@ # limitations under the License. import pytest +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_tt_segment_params import ( + validate_tt_segment_params, +) -from newrelic.api.transaction import current_transaction from newrelic.api.background_task import background_task - from newrelic.api.database_trace import DatabaseTrace from newrelic.api.datastore_trace import DatastoreTrace -from newrelic.api.external_trace import external_trace, ExternalTrace +from newrelic.api.external_trace import ExternalTrace, external_trace from newrelic.api.function_trace import FunctionTrace +from newrelic.api.graphql_trace import GraphQLOperationTrace, GraphQLResolverTrace from newrelic.api.memcache_trace import MemcacheTrace from newrelic.api.message_trace import MessageTrace from newrelic.api.solr_trace import SolrTrace - -from testing_support.fixtures import (override_application_settings, - validate_tt_segment_params) +from newrelic.api.transaction import current_transaction -@external_trace('lib', 'https://example.com/path?q=q#frag') +@external_trace("lib", "https://example.com/path?q=q#frag") def external(): pass -@validate_tt_segment_params(present_params=('http.url',)) -@background_task(name='test_external_segment_attributes_default') +@validate_tt_segment_params(present_params=("http.url",)) +@background_task(name="test_external_segment_attributes_default") def test_external_segment_attributes_default(): external() -@override_application_settings({ - 'transaction_segments.attributes.exclude': ['http.url'], -}) -@validate_tt_segment_params(forgone_params=('http.url',)) -@background_task(name='test_external_segment_attributes_disabled') +@override_application_settings( + { + "transaction_segments.attributes.exclude": ["http.url"], + } +) 
+@validate_tt_segment_params(forgone_params=("http.url",)) +@background_task(name="test_external_segment_attributes_disabled") def test_external_segment_attributes_disabled(): external() -@validate_tt_segment_params(exact_params={'http.url': 'http://example.org'}) -@background_task(name='test_external_user_params_override_url') +@validate_tt_segment_params(exact_params={"http.url": "http://example.org"}) +@background_task(name="test_external_user_params_override_url") def test_external_user_params_override_url(): - with ExternalTrace('lib', 'http://example.com') as t: + with ExternalTrace("lib", "http://example.com") as t: # Pretend like this is a user attribute and it's legal to do this - t.params['http.url'] = 'http://example.org' + t.params["http.url"] = "http://example.org" -@validate_tt_segment_params(exact_params={'db.instance': 'a' * 255}) -@background_task(name='test_datastore_db_instance_truncation') +@validate_tt_segment_params(exact_params={"db.instance": "a" * 255}) +@background_task(name="test_datastore_db_instance_truncation") def test_datastore_db_instance_truncation(): - with DatastoreTrace('db_product', 'db_target', 'db_operation', - database_name='a' * 256): + with DatastoreTrace("db_product", "db_target", "db_operation", database_name="a" * 256): pass -@validate_tt_segment_params(exact_params={'db.instance': 'a' * 255}) -@background_task(name='test_database_db_instance_truncation') +@validate_tt_segment_params(exact_params={"db.instance": "a" * 255}) +@background_task(name="test_database_db_instance_truncation") def test_database_db_instance_truncation(): - with DatabaseTrace('select * from foo', - database_name='a' * 256): + with DatabaseTrace("select * from foo", database_name="a" * 256): pass -@override_application_settings({ - 'transaction_tracer.record_sql': 'raw', -}) -@validate_tt_segment_params(exact_params={'db.statement': 'select 1'}) -@background_task(name='test_database_db_statement') +@override_application_settings( + { + "transaction_tracer.record_sql": "raw", + } +) +@validate_tt_segment_params(exact_params={"db.statement": "select 1"}) +@background_task(name="test_database_db_statement") def test_database_db_statement_default_enabled(): - with DatabaseTrace('select 1'): + with DatabaseTrace("select 1"): pass -@override_application_settings({ - 'transaction_tracer.record_sql': 'raw', - 'agent_limits.sql_query_length_maximum': 1, -}) -@validate_tt_segment_params(exact_params={'db.statement': 'a'}) -@background_task(name='test_database_db_statement_truncation') +@override_application_settings( + { + "transaction_tracer.record_sql": "raw", + "agent_limits.sql_query_length_maximum": 1, + } +) +@validate_tt_segment_params(exact_params={"db.statement": "a"}) +@background_task(name="test_database_db_statement_truncation") def test_database_db_statement_truncation(): - with DatabaseTrace('a' * 2): + with DatabaseTrace("a" * 2): pass -@override_application_settings({ - 'transaction_segments.attributes.exclude': ['db.*'], -}) -@validate_tt_segment_params(forgone_params=('db.instance', 'db.statement')) -@background_task(name='test_database_segment_attributes_disabled') +@override_application_settings( + { + "transaction_segments.attributes.exclude": ["db.*"], + } +) +@validate_tt_segment_params(forgone_params=("db.instance", "db.statement")) +@background_task(name="test_database_segment_attributes_disabled") def test_database_segment_attributes_disabled(): transaction = current_transaction() - with DatabaseTrace('select 1', database_name='foo'): + with 
DatabaseTrace("select 1", database_name="foo"): pass -@pytest.mark.parametrize('trace_type,args', ( - (DatabaseTrace, ('select * from foo', )), - (DatastoreTrace, ('db_product', 'db_target', 'db_operation')), - (ExternalTrace, ('lib', 'url')), - (FunctionTrace, ('name', )), - (MemcacheTrace, ('command', )), - (MessageTrace, ('lib', 'operation', 'dst_type', 'dst_name')), - (SolrTrace, ('lib', 'command')), -)) +@pytest.mark.parametrize( + "trace_type,args", + ( + (DatabaseTrace, ("select * from foo",)), + (DatastoreTrace, ("db_product", "db_target", "db_operation")), + (ExternalTrace, ("lib", "url")), + (FunctionTrace, ("name",)), + (GraphQLOperationTrace, ()), + (GraphQLResolverTrace, ()), + (MemcacheTrace, ("command",)), + (MessageTrace, ("lib", "operation", "dst_type", "dst_name")), + (SolrTrace, ("lib", "command")), + ), +) def test_each_segment_type(trace_type, args): - @validate_tt_segment_params(exact_params={'blah': 'bloo'}) - @override_application_settings({ - 'distributed_tracing.enabled': True, - 'span_events.enabled': True, - 'attributes.include': ['blah'], - }) - @background_task(name='test_each_segment_type') + @validate_tt_segment_params(exact_params={"blah": "bloo"}) + @override_application_settings( + { + "distributed_tracing.enabled": True, + "span_events.enabled": True, + "attributes.include": ["blah"], + } + ) + @background_task(name="test_each_segment_type") def _test(): transaction = current_transaction() transaction._sampled = True with trace_type(*args) as trace: - trace._add_agent_attribute('blah', 'bloo') + trace._add_agent_attribute("blah", "bloo") _test() -@override_application_settings({ - 'distributed_tracing.enabled': True, - 'span_events.enabled': True, - 'attributes.include': ['*'], -}) -@background_task(name='test_attribute_overrides') +@override_application_settings( + { + "distributed_tracing.enabled": True, + "span_events.enabled": True, + "attributes.include": ["*"], + } +) +@background_task(name="test_attribute_overrides") def test_attribute_overrides(): - with FunctionTrace('test_attribute_overrides_trace') as trace: + with FunctionTrace("test_attribute_overrides_trace") as trace: trace.exclusive = 0.1 - trace._add_agent_attribute('exclusive_duration_millis', 0.2) - trace._add_agent_attribute('test_attr', 'a') - trace.add_custom_attribute('exclusive_duration_millis', 0.3) - trace.add_custom_attribute('test_attr', 'b') + trace._add_agent_attribute("exclusive_duration_millis", 0.2) + trace._add_agent_attribute("test_attr", "a") + trace.add_custom_attribute("exclusive_duration_millis", 0.3) + trace.add_custom_attribute("test_attr", "b") node = trace.create_node() params = node.get_trace_segment_params(current_transaction().settings) - assert params['exclusive_duration_millis'] == 100 - assert params['test_attr'] == 'b' + assert params["exclusive_duration_millis"] == 100 + assert params["test_attr"] == "b" diff --git a/tests/agent_features/test_w3c_trace_context.py b/tests/agent_features/test_w3c_trace_context.py index 17e40fdb0..726cf011a 100644 --- a/tests/agent_features/test_w3c_trace_context.py +++ b/tests/agent_features/test_w3c_trace_context.py @@ -19,11 +19,11 @@ from newrelic.api.transaction import current_transaction from newrelic.api.external_trace import ExternalTrace from newrelic.api.wsgi_application import wsgi_application -from testing_support.fixtures import (override_application_settings, - validate_transaction_event_attributes, validate_transaction_metrics) +from testing_support.fixtures import override_application_settings from 
testing_support.validators.validate_span_events import ( validate_span_events) - +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics +from testing_support.validators.validate_transaction_event_attributes import validate_transaction_event_attributes @wsgi_application() def target_wsgi_application(environ, start_response): diff --git a/tests/agent_features/test_web_transaction.py b/tests/agent_features/test_web_transaction.py index 22bfd6eb8..66cf25858 100644 --- a/tests/agent_features/test_web_transaction.py +++ b/tests/agent_features/test_web_transaction.py @@ -14,67 +14,112 @@ # limitations under the License. import gc -import webtest -import pytest import time + +import pytest +import webtest +from testing_support.fixtures import validate_attributes +from testing_support.sample_applications import simple_app, simple_app_raw +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) + from newrelic.api.application import application_instance from newrelic.api.web_transaction import WebTransaction -from testing_support.fixtures import (validate_transaction_metrics, - validate_attributes) -from testing_support.sample_applications import simple_app -import newrelic.packages.six as six +from newrelic.api.wsgi_application import wsgi_application +from newrelic.packages import six + application = webtest.TestApp(simple_app) # TODO: WSGI metrics must not be generated for a WebTransaction METRICS = ( - ('Python/WSGI/Input/Bytes', None), - ('Python/WSGI/Input/Time', None), - ('Python/WSGI/Input/Calls/read', None), - ('Python/WSGI/Input/Calls/readline', None), - ('Python/WSGI/Input/Calls/readlines', None), - ('Python/WSGI/Output/Bytes', None), - ('Python/WSGI/Output/Time', None), - ('Python/WSGI/Output/Calls/yield', None), - ('Python/WSGI/Output/Calls/write', None), + ("Python/WSGI/Input/Bytes", None), + ("Python/WSGI/Input/Time", None), + ("Python/WSGI/Input/Calls/read", None), + ("Python/WSGI/Input/Calls/readline", None), + ("Python/WSGI/Input/Calls/readlines", None), + ("Python/WSGI/Output/Bytes", None), + ("Python/WSGI/Output/Time", None), + ("Python/WSGI/Output/Calls/yield", None), + ("Python/WSGI/Output/Calls/write", None), ) -# TODO: Add rollup_metrics=METRICS +# Test for presence of framework and dispatcher info based on whether framework is specified @validate_transaction_metrics( - 'test_base_web_transaction', - group='Test') -@validate_attributes('agent', -[ - 'request.headers.accept', 'request.headers.contentLength', - 'request.headers.contentType', 'request.headers.host', - 'request.headers.referer', 'request.headers.userAgent', 'request.method', - 'request.uri', 'response.status', 'response.headers.contentLength', - 'response.headers.contentType', 'request.parameters.foo', - 'request.parameters.boo', 'webfrontend.queue.seconds', -]) -@pytest.mark.parametrize('use_bytes', (True, False)) + name="test", custom_metrics=[("Python/Framework/framework/v1", 1), ("Python/Dispatcher/dispatcher/v1.0.0", 1)] +) +def test_dispatcher_and_framework_metrics(): + inner_wsgi_decorator = wsgi_application( + name="test", framework=("framework", "v1"), dispatcher=("dispatcher", "v1.0.0") + ) + decorated_application = inner_wsgi_decorator(simple_app_raw) + + application = webtest.TestApp(decorated_application) + application.get("/") + + +# Test for presence of framework and dispatcher info under existing transaction +@validate_transaction_metrics( + name="test", custom_metrics=[("Python/Framework/framework/v1", 1), 
("Python/Dispatcher/dispatcher/v1.0.0", 1)] +) +def test_double_wrapped_dispatcher_and_framework_metrics(): + inner_wsgi_decorator = wsgi_application( + name="test", framework=("framework", "v1"), dispatcher=("dispatcher", "v1.0.0") + ) + decorated_application = inner_wsgi_decorator(simple_app_raw) + + outer_wsgi_decorator = wsgi_application(name="double_wrapped") + double_decorated_application = outer_wsgi_decorator(decorated_application) + + application = webtest.TestApp(double_decorated_application) + application.get("/") + + +# TODO: Add rollup_metrics=METRICS +@validate_transaction_metrics("test_base_web_transaction", group="Test") +@validate_attributes( + "agent", + [ + "request.headers.accept", + "request.headers.contentLength", + "request.headers.contentType", + "request.headers.host", + "request.headers.referer", + "request.headers.userAgent", + "request.method", + "request.uri", + "response.status", + "response.headers.contentLength", + "response.headers.contentType", + "request.parameters.foo", + "request.parameters.boo", + "webfrontend.queue.seconds", + ], +) +@pytest.mark.parametrize("use_bytes", (True, False)) def test_base_web_transaction(use_bytes): application = application_instance() request_headers = { - 'Accept': 'text/plain', - 'Content-Length': '0', - 'Content-Type': 'text/plain', - 'Host': 'localhost', - 'Referer': 'http://example.com?q=1&boat=⛵', - 'User-Agent': 'potato', - 'X-Request-Start': str(time.time() - 0.2), - 'newRelic': 'invalid', + "Accept": "text/plain", + "Content-Length": "0", + "Content-Type": "text/plain", + "Host": "localhost", + "Referer": "http://example.com?q=1&boat=⛵", + "User-Agent": "potato", + "X-Request-Start": str(time.time() - 0.2), + "newRelic": "invalid", } if use_bytes: byte_headers = {} for name, value in request_headers.items(): - name = name.encode('utf-8') + name = name.encode("utf-8") try: - value = value.encode('utf-8') + value = value.encode("utf-8") except UnicodeDecodeError: assert six.PY2 byte_headers[name] = value @@ -82,24 +127,22 @@ def test_base_web_transaction(use_bytes): request_headers = byte_headers transaction = WebTransaction( - application, - 'test_base_web_transaction', - group='Test', - scheme='http', - host='localhost', - port=8000, - request_method='HEAD', - request_path='/foobar', - query_string='foo=bar&boo=baz', - headers=request_headers.items(), + application, + "test_base_web_transaction", + group="Test", + scheme="http", + host="localhost", + port=8000, + request_method="HEAD", + request_path="/foobar", + query_string="foo=bar&boo=baz", + headers=request_headers.items(), ) if use_bytes: - response_headers = ((b'Content-Length', b'0'), - (b'Content-Type', b'text/plain')) + response_headers = ((b"Content-Length", b"0"), (b"Content-Type", b"text/plain")) else: - response_headers = (('Content-Length', '0'), - ('Content-Type', 'text/plain')) + response_headers = (("Content-Length", "0"), ("Content-Type", "text/plain")) with transaction: transaction.process_response(200, response_headers) @@ -117,8 +160,8 @@ def validate_no_garbage(): @validate_transaction_metrics( - name='', - group='Uri', + name="", + group="Uri", ) def test_wsgi_app_memory(validate_no_garbage): - application.get('/') + application.get("/") diff --git a/tests/agent_features/test_wsgi_attributes.py b/tests/agent_features/test_wsgi_attributes.py index 0f7f7d6f2..db9fc807a 100644 --- a/tests/agent_features/test_wsgi_attributes.py +++ b/tests/agent_features/test_wsgi_attributes.py @@ -13,14 +13,17 @@ # limitations under the License. 
import webtest -from testing_support.fixtures import ( - dt_enabled, - override_application_settings, +from testing_support.fixtures import dt_enabled, override_application_settings +from testing_support.sample_applications import fully_featured_app +from testing_support.validators.validate_error_event_attributes import ( validate_error_event_attributes, +) +from testing_support.validators.validate_transaction_error_trace_attributes import ( validate_transaction_error_trace_attributes, +) +from testing_support.validators.validate_transaction_event_attributes import ( validate_transaction_event_attributes, ) -from testing_support.sample_applications import fully_featured_app WSGI_ATTRIBUTES = [ "wsgi.input.seconds", @@ -44,6 +47,4 @@ @override_application_settings({"attributes.include": ["*"]}) @dt_enabled def test_wsgi_attributes(): - app.post_json( - "/", {"foo": "bar"}, extra_environ={"n_errors": "1", "err_message": "oops"} - ) + app.post_json("/", {"foo": "bar"}, extra_environ={"n_errors": "1", "err_message": "oops"}) diff --git a/tests/agent_streaming/_test_handler.py b/tests/agent_streaming/_test_handler.py index 9fa8b19f8..d46e72f4a 100644 --- a/tests/agent_streaming/_test_handler.py +++ b/tests/agent_streaming/_test_handler.py @@ -12,36 +12,84 @@ # See the License for the specific language governing permissions and # limitations under the License. +import time +from collections import deque from concurrent import futures +from threading import Event import grpc -from newrelic.core.infinite_tracing_pb2 import RecordStatus, Span + +from newrelic.core.infinite_tracing_pb2 import RecordStatus, Span, SpanBatch + +SPANS_PROCESSED_EVENT = Event() +SPANS_RECEIVED = deque() +SPAN_BATCHES_RECEIVED = deque() def record_span(request, context): metadata = dict(context.invocation_metadata()) - assert 'agent_run_token' in metadata - assert 'license_key' in metadata + assert "agent_run_token" in metadata + assert "license_key" in metadata for span in request: - status_code = span.intrinsics.get('status_code', None) - status_code = status_code and getattr( - grpc.StatusCode, status_code.string_value) + SPANS_RECEIVED.append(span) + SPANS_PROCESSED_EVENT.set() + + # Handle injecting status codes. + status_code = span.intrinsics.get("status_code", None) + status_code = status_code and getattr(grpc.StatusCode, status_code.string_value) if status_code is grpc.StatusCode.OK: - break + return elif status_code: context.abort(status_code, "Abort triggered by client") + # Give the client time to enter the wait condition before closing the server. + if span.intrinsics.get("wait_then_ok", None): + # Wait long enough that the client is now waiting for more spans and stuck in notify.wait(). + time.sleep(1) + return + yield RecordStatus(messages_seen=1) +def record_span_batch(request, context): + metadata = dict(context.invocation_metadata()) + assert "agent_run_token" in metadata + assert "license_key" in metadata + + for span_batch in request: + SPAN_BATCHES_RECEIVED.append(span_batch) + SPANS_PROCESSED_EVENT.set() + batch_size = 0 + + for span in span_batch.spans: + # Handle injecting status codes. + status_code = span.intrinsics.get("status_code", None) + status_code = status_code and getattr(grpc.StatusCode, status_code.string_value) + if status_code is grpc.StatusCode.OK: + return + elif status_code: + context.abort(status_code, "Abort triggered by client") + + # Give the client time to enter the wait condition before closing the server. 
+ if span.intrinsics.get("wait_then_ok", None): + # Wait long enough that the client is now waiting for more spans and stuck in notify.wait(). + time.sleep(1) + return + + yield RecordStatus(messages_seen=batch_size) + + HANDLERS = ( grpc.method_handlers_generic_handler( "com.newrelic.trace.v1.IngestService", { "RecordSpan": grpc.stream_stream_rpc_method_handler( record_span, Span.FromString, RecordStatus.SerializeToString - ) + ), + "RecordSpanBatch": grpc.stream_stream_rpc_method_handler( + record_span_batch, SpanBatch.FromString, RecordStatus.SerializeToString + ), }, ), ) diff --git a/tests/agent_streaming/conftest.py b/tests/agent_streaming/conftest.py index 13e31da60..390aeda9c 100644 --- a/tests/agent_streaming/conftest.py +++ b/tests/agent_streaming/conftest.py @@ -12,23 +12,17 @@ # See the License for the specific language governing permissions and # limitations under the License. +import threading + import pytest -import random -from testing_support.fixtures import ( - code_coverage_fixture, - collector_agent_registration_fixture, - collector_available_fixture, -) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 from testing_support.mock_external_grpc_server import MockExternalgRPCServer + from newrelic.common.streaming_utils import StreamBuffer -import threading CONDITION_CLS = type(threading.Condition()) -_coverage_source = [] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { "transaction_tracer.explain_threshold": 0.0, @@ -40,20 +34,19 @@ "agent_limits.errors_per_harvest": 100, "distributed_tracing.enabled": True, "infinite_tracing.trace_observer_host": "nr-internal.aws-us-east-2.tracing.staging-edge.nr-data.net", + "infinite_tracing.compression": True, "debug.connect_span_stream_in_developer_mode": True, } collector_agent_registration = collector_agent_registration_fixture( - app_name="Python Agent Test (agent_streaming)", - default_settings=_default_settings + app_name="Python Agent Test (agent_streaming)", default_settings=_default_settings ) @pytest.fixture(scope="module") def grpc_app_server(): - port = random.randint(50000, 50099) - with MockExternalgRPCServer(port=port) as server: - yield server, port + with MockExternalgRPCServer() as server: + yield server, server.port @pytest.fixture(scope="module") @@ -83,5 +76,34 @@ def buffer_empty_event(monkeypatch): def condition(*args, **kwargs): return SetEventOnWait(event, *args, **kwargs) - monkeypatch.setattr(StreamBuffer, 'condition', condition) + monkeypatch.setattr(StreamBuffer, "condition", condition) return event + + +@pytest.fixture(scope="session", params=[pytest.param(True, id="batching"), pytest.param(False, id="nonbatching")]) +def batching(request): + return request.param + + +@pytest.fixture(scope="function") +def spans_received(): + from _test_handler import SPANS_RECEIVED + + SPANS_RECEIVED.clear() + return SPANS_RECEIVED + + +@pytest.fixture(scope="function") +def span_batches_received(): + from _test_handler import SPAN_BATCHES_RECEIVED + + SPAN_BATCHES_RECEIVED.clear() + return SPAN_BATCHES_RECEIVED + + +@pytest.fixture(scope="function") +def spans_processed_event(): + from _test_handler import SPANS_PROCESSED_EVENT + + SPANS_PROCESSED_EVENT.clear() + return SPANS_PROCESSED_EVENT diff --git a/tests/agent_streaming/test_infinite_tracing.py b/tests/agent_streaming/test_infinite_tracing.py index 8942e6a75..f1119c38c 100644 --- a/tests/agent_streaming/test_infinite_tracing.py +++ 
b/tests/agent_streaming/test_infinite_tracing.py @@ -12,17 +12,20 @@ # See the License for the specific language governing permissions and # limitations under the License. -import pytest import threading +import time -from newrelic.core.config import global_settings +import pytest from testing_support.fixtures import override_generic_settings +from testing_support.util import conditional_decorator +from testing_support.validators.validate_metric_payload import validate_metric_payload -from newrelic.core.application import Application +from newrelic.common.streaming_utils import StreamBuffer from newrelic.core.agent_streaming import StreamingRpc -from newrelic.core.infinite_tracing_pb2 import Span, AttributeValue -from testing_support.validators.validate_metric_payload import ( - validate_metric_payload) +from newrelic.core.application import Application +from newrelic.core.config import global_settings +from newrelic.core.infinite_tracing_pb2 import AttributeValue, Span +from newrelic.packages import six settings = global_settings() @@ -31,7 +34,7 @@ @pytest.fixture() def app(): - app = Application('Python Agent Test (Infinite Tracing)') + app = Application("Python Agent Test (Infinite Tracing)") yield app # Calling internal_agent_shutdown on an application that is already closed # will raise an exception. @@ -40,22 +43,31 @@ def app(): app.internal_agent_shutdown(restart=False) except: pass - if active_session: + if active_session and active_session._rpc is not None: assert not active_session._rpc.response_processing_thread.is_alive() assert not active_session._rpc.channel @pytest.mark.parametrize( - 'status_code, metrics', ( - ('UNIMPLEMENTED', [ - ('Supportability/InfiniteTracing/Span/gRPC/UNIMPLEMENTED', 1), - ('Supportability/InfiniteTracing/Span/Response/Error', 1)]), - ('INTERNAL', [ - ('Supportability/InfiniteTracing/Span/gRPC/INTERNAL', 1), - ('Supportability/InfiniteTracing/Span/Response/Error', 1)]), - )) -def test_infinite_tracing_span_streaming(mock_grpc_server, - status_code, metrics, monkeypatch, app): + "status_code, metrics", + ( + ( + "UNIMPLEMENTED", + [ + ("Supportability/InfiniteTracing/Span/gRPC/UNIMPLEMENTED", 1), + ("Supportability/InfiniteTracing/Span/Response/Error", 1), + ], + ), + ( + "INTERNAL", + [ + ("Supportability/InfiniteTracing/Span/gRPC/INTERNAL", 1), + ("Supportability/InfiniteTracing/Span/Response/Error", 1), + ], + ), + ), +) +def test_infinite_tracing_span_streaming(mock_grpc_server, status_code, metrics, monkeypatch, app, batching): event = threading.Event() class TerminateOnWait(CONDITION_CLS): @@ -71,21 +83,24 @@ def wait(self, *args, **kwargs): def condition(*args, **kwargs): return TerminateOnWait(*args, **kwargs) - monkeypatch.setattr(StreamingRpc, 'condition', condition) + monkeypatch.setattr(StreamingRpc, "condition", condition) span = Span( - intrinsics={'status_code': AttributeValue(string_value=status_code)}, - agent_attributes={}, - user_attributes={}) - - @override_generic_settings(settings, { - 'distributed_tracing.enabled': True, - 'span_events.enabled': True, - 'infinite_tracing.trace_observer_host': 'localhost', - 'infinite_tracing.trace_observer_port': mock_grpc_server, - 'infinite_tracing.ssl': False, - }) - @validate_metric_payload(metrics=metrics) + intrinsics={"status_code": AttributeValue(string_value=status_code)}, agent_attributes={}, user_attributes={} + ) + + @override_generic_settings( + settings, + { + "distributed_tracing.enabled": True, + "span_events.enabled": True, + "infinite_tracing.trace_observer_host": 
"localhost", + "infinite_tracing.trace_observer_port": mock_grpc_server, + "infinite_tracing.ssl": False, + "infinite_tracing.batching": batching, + }, + ) + @validate_metric_payload(metrics) def _test(): app.connect_to_data_collector(None) @@ -98,9 +113,7 @@ def _test(): _test() -def test_reconnect_on_failure(monkeypatch, mock_grpc_server, - buffer_empty_event, app): - +def test_reconnect_on_failure(monkeypatch, mock_grpc_server, buffer_empty_event, app, batching): status_code = "INTERNAL" wait_event = threading.Event() continue_event = threading.Event() @@ -115,25 +128,25 @@ def wait(self, *args, **kwargs): def condition(*args, **kwargs): return WaitOnWait(*args, **kwargs) - monkeypatch.setattr(StreamingRpc, 'condition', condition) + monkeypatch.setattr(StreamingRpc, "condition", condition) terminating_span = Span( - intrinsics={'status_code': AttributeValue(string_value=status_code)}, - agent_attributes={}, - user_attributes={}) + intrinsics={"status_code": AttributeValue(string_value=status_code)}, agent_attributes={}, user_attributes={} + ) - span = Span( - intrinsics={}, - agent_attributes={}, - user_attributes={}) - - @override_generic_settings(settings, { - 'distributed_tracing.enabled': True, - 'span_events.enabled': True, - 'infinite_tracing.trace_observer_host': 'localhost', - 'infinite_tracing.trace_observer_port': mock_grpc_server, - 'infinite_tracing.ssl': False, - }) + span = Span(intrinsics={}, agent_attributes={}, user_attributes={}) + + @override_generic_settings( + settings, + { + "distributed_tracing.enabled": True, + "span_events.enabled": True, + "infinite_tracing.trace_observer_host": "localhost", + "infinite_tracing.trace_observer_port": mock_grpc_server, + "infinite_tracing.ssl": False, + "infinite_tracing.batching": batching, + }, + ) def _test(): app.connect_to_data_collector(None) @@ -182,7 +195,7 @@ def test_agent_restart(app): assert rpc.response_processing_thread.is_alive() -def test_disconnect_on_UNIMPLEMENTED(mock_grpc_server, monkeypatch, app): +def test_disconnect_on_UNIMPLEMENTED(mock_grpc_server, monkeypatch, app, batching): event = threading.Event() class WaitOnNotify(CONDITION_CLS): @@ -194,21 +207,25 @@ def notify_all(self, *args, **kwargs): def condition(*args, **kwargs): return WaitOnNotify(*args, **kwargs) - monkeypatch.setattr(StreamingRpc, 'condition', condition) + monkeypatch.setattr(StreamingRpc, "condition", condition) terminating_span = Span( - intrinsics={'status_code': AttributeValue( - string_value='UNIMPLEMENTED')}, + intrinsics={"status_code": AttributeValue(string_value="UNIMPLEMENTED")}, agent_attributes={}, - user_attributes={}) - - @override_generic_settings(settings, { - 'distributed_tracing.enabled': True, - 'span_events.enabled': True, - 'infinite_tracing.trace_observer_host': 'localhost', - 'infinite_tracing.trace_observer_port': mock_grpc_server, - 'infinite_tracing.ssl': False, - }) + user_attributes={}, + ) + + @override_generic_settings( + settings, + { + "distributed_tracing.enabled": True, + "span_events.enabled": True, + "infinite_tracing.trace_observer_host": "localhost", + "infinite_tracing.trace_observer_port": mock_grpc_server, + "infinite_tracing.ssl": False, + "infinite_tracing.batching": batching, + }, + ) def _test(): app.connect_to_data_collector(None) @@ -228,7 +245,7 @@ def _test(): def test_agent_shutdown(): # Get the application connected to the actual 8T endpoint - app = Application('Python Agent Test (Infinite Tracing)') + app = Application("Python Agent Test (Infinite Tracing)") 
app.connect_to_data_collector(None) rpc = app._active_session._rpc # Store references to the original rpc and threads @@ -239,39 +256,57 @@ def test_agent_shutdown(): @pytest.mark.xfail(reason="This test is flaky", strict=False) -def test_no_delay_on_ok(mock_grpc_server, monkeypatch, app): +def test_no_delay_on_ok(mock_grpc_server, monkeypatch, app, batching): wait_event = threading.Event() connect_event = threading.Event() - metrics = [('Supportability/InfiniteTracing/Span/gRPC/OK', 1), - ('Supportability/InfiniteTracing/Span/Response/Error', None)] + metrics = [ + ("Supportability/InfiniteTracing/Span/gRPC/OK", 1), + ("Supportability/InfiniteTracing/Span/Response/Error", None), + ] class SetFlagOnWait(CONDITION_CLS): + def __init__(self, event, *args, **kwargs): + super(SetFlagOnWait, self).__init__(*args, **kwargs) + self.event = event + def wait(self, *args, **kwargs): - wait_event.set() + self.event.set() return super(SetFlagOnWait, self).wait(*args, **kwargs) @staticmethod def condition(*args, **kwargs): - return SetFlagOnWait(*args, **kwargs) + return SetFlagOnWait(wait_event, *args, **kwargs) + + _create_channel = StreamingRpc.create_channel + + def create_channel(self, *args, **kwargs): + ret = _create_channel(self, *args, **kwargs) + connect_event.set() + return ret + + monkeypatch.setattr(StreamingRpc, "condition", condition) + monkeypatch.setattr(StreamingRpc, "create_channel", create_channel) - monkeypatch.setattr(StreamingRpc, 'condition', condition) span = Span( intrinsics={"status_code": AttributeValue(string_value="OK")}, agent_attributes={}, user_attributes={}, ) - @override_generic_settings(settings, { - 'distributed_tracing.enabled': True, - 'span_events.enabled': True, - 'infinite_tracing.trace_observer_host': 'localhost', - 'infinite_tracing.trace_observer_port': mock_grpc_server, - 'infinite_tracing.ssl': False, - }) - @validate_metric_payload(metrics=metrics) + @override_generic_settings( + settings, + { + "distributed_tracing.enabled": True, + "span_events.enabled": True, + "infinite_tracing.trace_observer_host": "localhost", + "infinite_tracing.trace_observer_port": mock_grpc_server, + "infinite_tracing.ssl": False, + "infinite_tracing.batching": batching, + }, + ) + @validate_metric_payload(metrics) def _test(): - def connect_complete(): connect_event.set() @@ -284,15 +319,6 @@ def connect_complete(): stream_buffer = app._stats_engine.span_stream rpc = app._active_session._rpc - _rpc = rpc.rpc - - def patched_rpc(*args, **kwargs): - connect_event.set() - return _rpc(*args, **kwargs) - - rpc.rpc = patched_rpc - - # Put a span that will trigger an OK status code and wait for an attempted # reconnect. stream_buffer.put(span) @@ -302,3 +328,182 @@ def patched_rpc(*args, **kwargs): app.harvest() _test() + + +@conditional_decorator( + condition=six.PY2, decorator=pytest.mark.xfail(reason="Test frequently times out on Py2.", strict=False) +) +def test_no_data_loss_on_reconnect(mock_grpc_server, app, buffer_empty_event, batching, spans_processed_event): + """ + Test for data loss when channel is closed by the server while waiting for more data in a request iterator. + + This is a bug that's caused by the periodic (15 second) disconnects issued by the trace observer. To observe, + wait long enough in __next__'s notify.wait() call until the server issues a grpc.StatusCode.OK causing a + disconnect and reconnect. 
Alternatively in the case of this test, we use a mock server to issue one at the + appropriate moment rather than waiting for a real trace observer to issue a disconnect. + + While in this state, the very next span placed in the StreamBuffer would wake up the request_iterator for the + now closed channel (which was waiting in the __next__ function) and be consumed. The channel, being closed, + would discard the data and finish shutting down. This is now prevented by guards checking if the channel is + closed before popping any data inside the request iterator, which instead raises a StopIteration. + + Relevant GitHub issue: https://github.com/grpc/grpc/issues/29110 + """ + + terminating_span = Span( + intrinsics={"wait_then_ok": AttributeValue(string_value="OK")}, agent_attributes={}, user_attributes={} + ) + + span = Span(intrinsics={}, agent_attributes={}, user_attributes={}) + + @override_generic_settings( + settings, + { + "distributed_tracing.enabled": True, + "span_events.enabled": True, + "infinite_tracing.trace_observer_host": "localhost", + "infinite_tracing.trace_observer_port": mock_grpc_server, + "infinite_tracing.ssl": False, + "infinite_tracing.batching": batching, + }, + ) + def _test(): + # Connect to app and retrieve references to various components + app.connect_to_data_collector(None) + + stream_buffer = app._stats_engine.span_stream + rpc = app._active_session._rpc + request_iterator = rpc.request_iterator + + # Wait until iterator is waiting on spans + assert buffer_empty_event.wait(timeout=5) + buffer_empty_event.clear() + + # Send a span that will trigger disconnect + stream_buffer.put(terminating_span) + + # Wait for spans to be processed by server + assert spans_processed_event.wait(timeout=5) + spans_processed_event.clear() + + # Wait for OK status code to close the channel + start_time = time.time() + while not (request_iterator._stream and request_iterator._stream.done()): + assert time.time() - start_time < 5, "Timed out waiting for OK status code." + time.sleep(0.5) + + # Put new span and wait until buffer has been emptied and either sent or lost + stream_buffer.put(span) + assert spans_processed_event.wait(timeout=5), "Data lost in stream buffer iterator." 
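+        # Without the channel-closed guard in the request iterator, the span above would have been consumed by the closed channel and this final wait would time out.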
+ + _test() + + +@pytest.mark.parametrize("dropped_spans", [0, 1]) +def test_span_supportability_metrics(mock_grpc_server, monkeypatch, app, dropped_spans, batching): + wait_event = threading.Event() + continue_event = threading.Event() + + total_spans = 3 + metrics = [ + ("Supportability/InfiniteTracing/Span/Seen", total_spans), + ( + "Supportability/InfiniteTracing/Span/Sent", + (total_spans - dropped_spans) or None, + ), # Replace 0 with None to indicate metric will not be sent + ] + + class WaitOnWait(CONDITION_CLS): + def wait(self, *args, **kwargs): + wait_event.set() + ret = super(WaitOnWait, self).wait(*args, **kwargs) + assert continue_event.wait(timeout=5) + return ret + + @staticmethod + def condition(*args, **kwargs): + return WaitOnWait(*args, **kwargs) + + monkeypatch.setattr(StreamBuffer, "condition", condition) + + span = Span( + intrinsics={}, + agent_attributes={}, + user_attributes={}, + ) + + @override_generic_settings( + settings, + { + "distributed_tracing.enabled": True, + "span_events.enabled": True, + "infinite_tracing.trace_observer_host": "localhost", + "infinite_tracing.trace_observer_port": mock_grpc_server, + "infinite_tracing.ssl": False, + "infinite_tracing.batching": batching, + "infinite_tracing.span_queue_size": total_spans - dropped_spans, + }, + ) + @validate_metric_payload(metrics) + def _test(): + app.connect_to_data_collector(None) + + assert wait_event.wait(timeout=5) + + stream_buffer = app._stats_engine.span_stream + + # Put enough spans to overflow buffer + for _ in range(total_spans): + stream_buffer.put(span) + + # Harvest all spans simultaneously + wait_event.clear() + continue_event.set() + assert wait_event.wait(timeout=5) + wait_event.clear() + + app.harvest() + + _test() + + +@pytest.mark.parametrize("trace_observer_host", ["localhost", None]) +@pytest.mark.parametrize("batching", [True, False]) +@pytest.mark.parametrize("compression", [True, False]) +def test_settings_supportability_metrics(mock_grpc_server, app, trace_observer_host, batching, compression): + connect_event = threading.Event() + + enabled = bool(trace_observer_host) + + metrics = [ + ("Supportability/InfiniteTracing/gRPC/Batching/enabled", 1 if enabled and batching else None), + ("Supportability/InfiniteTracing/gRPC/Batching/disabled", 1 if enabled and not batching else None), + ("Supportability/InfiniteTracing/gRPC/Compression/enabled", 1 if enabled and compression else None), + ("Supportability/InfiniteTracing/gRPC/Compression/disabled", 1 if enabled and not compression else None), + ] + + @override_generic_settings( + settings, + { + "distributed_tracing.enabled": True, + "span_events.enabled": True, + "infinite_tracing.trace_observer_host": trace_observer_host, + "infinite_tracing.trace_observer_port": mock_grpc_server, + "infinite_tracing.ssl": False, + "infinite_tracing.batching": batching, + "infinite_tracing.compression": compression, + }, + ) + @validate_metric_payload(metrics) + def _test(): + def connect_complete(): + connect_event.set() + + app.connect_to_data_collector(connect_complete) + + assert connect_event.wait(timeout=5) + connect_event.clear() + + app.harvest() + + _test() diff --git a/tests/agent_streaming/test_stream_buffer.py b/tests/agent_streaming/test_stream_buffer.py new file mode 100644 index 000000000..80551e9d3 --- /dev/null +++ b/tests/agent_streaming/test_stream_buffer.py @@ -0,0 +1,87 @@ +# Copyright 2010 New Relic, Inc. 
+# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import pytest +from conftest import CONDITION_CLS + +from newrelic.common.streaming_utils import StreamBuffer, StreamBufferIterator +from newrelic.core.infinite_tracing_pb2 import Span, SpanBatch + + +class StopIterationOnWait(CONDITION_CLS): + def wait(self, *args, **kwargs): + raise StopIteration() + + +# Defined as a staticmethod so that, when monkeypatched in as StreamBuffer.condition, attribute lookups on the buffer do not bind it as an instance method. +@staticmethod +def stop_iteration_condition(*args, **kwargs): + return StopIterationOnWait(*args, **kwargs) + + +@pytest.fixture(scope="function") +def stop_iteration_on_wait(monkeypatch): + monkeypatch.setattr(StreamBuffer, "condition", stop_iteration_condition) + + +def test_stream_buffer_iterator_batching(stop_iteration_on_wait, batching): + stream_buffer = StreamBuffer(5, batching=batching) + + for _ in range(5): + span = Span(intrinsics={}, agent_attributes={}, user_attributes={}) + stream_buffer.put(span) + + buffer_contents = list(stream_buffer) + if batching: + assert len(buffer_contents) == 1 + assert isinstance(buffer_contents.pop(), SpanBatch) + else: + assert len(buffer_contents) == 5 + assert all(isinstance(element, Span) for element in buffer_contents) + + +def test_stream_buffer_iterator_max_batch_size(stop_iteration_on_wait): + stream_buffer = StreamBuffer(StreamBufferIterator.MAX_BATCH_SIZE + 1, batching=True) + + # Add 1 more span than the maximum batch size + for _ in range(StreamBufferIterator.MAX_BATCH_SIZE + 1): + span = Span(intrinsics={}, agent_attributes={}, user_attributes={}) + stream_buffer.put(span) + + # Pull all span batches out of buffer + buffer_contents = list(stream_buffer) + assert len(buffer_contents) == 2 + + # Large batch + batch = buffer_contents.pop(0) + assert isinstance(batch, SpanBatch) + assert len(batch.spans) == StreamBufferIterator.MAX_BATCH_SIZE + + # Single span batch + batch = buffer_contents.pop(0) + assert isinstance(batch, SpanBatch) + assert len(batch.spans) == 1 + + +def test_stream_buffer_queue_size(): + stream_buffer = StreamBuffer(1) + + # Add more spans than queue can hold + for _ in range(2): + span = Span(intrinsics={}, agent_attributes={}, user_attributes={}) + stream_buffer.put(span) + + # Ensure spans are dropped and not stored + assert len(stream_buffer) == 1 + assert stream_buffer._dropped == 1 + assert stream_buffer._seen == 2 diff --git a/tests/agent_streaming/test_streaming_rpc.py b/tests/agent_streaming/test_streaming_rpc.py index 3cf5ccc25..3ab74086e 100644 --- a/tests/agent_streaming/test_streaming_rpc.py +++ b/tests/agent_streaming/test_streaming_rpc.py @@ -14,10 +14,13 @@ import threading -from newrelic.core.agent_streaming import StreamingRpc -from newrelic.common.streaming_utils import StreamBuffer -from newrelic.core.infinite_tracing_pb2 import Span, AttributeValue +import pytest +from testing_support.fixtures import override_generic_settings +from newrelic.common.streaming_utils import StreamBuffer +from newrelic.core.agent_streaming import StreamingRpc +from newrelic.core.config import global_settings +from newrelic.core.infinite_tracing_pb2 import AttributeValue, 
Span CONDITION_CLS = type(threading.Condition()) DEFAULT_METADATA = (("agent_run_token", ""), ("license_key", "")) @@ -27,13 +30,54 @@ def record_metric(*args, **kwargs): pass -def test_close_before_connect(mock_grpc_server): +# This enumeration is taken from gRPC's implementation for compression: +# https://grpc.github.io/grpc/python/grpc.html#compression +@pytest.mark.parametrize( + "compression_setting, gRPC_compression_val", + ( + (None, 0), + (True, 2), + (False, 0), + ), +) +def test_correct_settings(mock_grpc_server, compression_setting, gRPC_compression_val): + settings = global_settings() + + @override_generic_settings( + settings, + { + "distributed_tracing.enabled": True, + "infinite_tracing.trace_observer_host": "localhost", + "infinite_tracing.trace_observer_port": mock_grpc_server, + "infinite_tracing.ssl": False, + "infinite_tracing.compression": compression_setting, + }, + ) + def _test(): + endpoint = "localhost:%s" % mock_grpc_server + stream_buffer = StreamBuffer(1) + + rpc = StreamingRpc( + endpoint, + stream_buffer, + DEFAULT_METADATA, + record_metric, + ssl=False, + compression=settings.infinite_tracing.compression, + ) + + rpc.connect() + assert rpc.compression_setting.value == gRPC_compression_val + rpc.close() + + _test() + + +def test_close_before_connect(mock_grpc_server, batching): endpoint = "localhost:%s" % mock_grpc_server - stream_buffer = StreamBuffer(0) + stream_buffer = StreamBuffer(0, batching=batching) - rpc = StreamingRpc( - endpoint, stream_buffer, DEFAULT_METADATA, record_metric, ssl=False - ) + rpc = StreamingRpc(endpoint, stream_buffer, DEFAULT_METADATA, record_metric, ssl=False) # Calling close will close the grpc channel rpc.close() @@ -44,13 +88,11 @@ def test_close_before_connect(mock_grpc_server): assert not rpc.response_processing_thread.is_alive() -def test_close_while_connected(mock_grpc_server, buffer_empty_event): +def test_close_while_connected(mock_grpc_server, buffer_empty_event, batching): endpoint = "localhost:%s" % mock_grpc_server - stream_buffer = StreamBuffer(1) + stream_buffer = StreamBuffer(1, batching=batching) - rpc = StreamingRpc( - endpoint, stream_buffer, DEFAULT_METADATA, record_metric, ssl=False - ) + rpc = StreamingRpc(endpoint, stream_buffer, DEFAULT_METADATA, record_metric, ssl=False) rpc.connect() # Check the processing thread is alive and spans are being sent @@ -67,7 +109,7 @@ def test_close_while_connected(mock_grpc_server, buffer_empty_event): assert not rpc.response_processing_thread.is_alive() -def test_close_while_awaiting_reconnect(mock_grpc_server, monkeypatch): +def test_close_while_awaiting_reconnect(mock_grpc_server, monkeypatch, batching): event = threading.Event() class WaitOnWait(CONDITION_CLS): @@ -89,11 +131,9 @@ def condition(*args, **kwargs): ) endpoint = "localhost:%s" % mock_grpc_server - stream_buffer = StreamBuffer(1) + stream_buffer = StreamBuffer(1, batching=batching) - rpc = StreamingRpc( - endpoint, stream_buffer, DEFAULT_METADATA, record_metric, ssl=False - ) + rpc = StreamingRpc(endpoint, stream_buffer, DEFAULT_METADATA, record_metric, ssl=False) rpc.connect() # Send a span to trigger reconnect @@ -104,3 +144,42 @@ def condition(*args, **kwargs): rpc.close() # Make sure the processing_thread is closed assert not rpc.response_processing_thread.is_alive() + + +@pytest.mark.parametrize("compression", (True, False)) +def test_rpc_serialization_and_deserialization( + mock_grpc_server, + batching, + compression, + buffer_empty_event, + spans_received, + span_batches_received, + 
spans_processed_event, +): + """StreamingRPC sends deserializable span to correct endpoint.""" + + endpoint = "localhost:%s" % mock_grpc_server + stream_buffer = StreamBuffer(1, batching=batching) + + span = Span( + intrinsics={}, + agent_attributes={}, + user_attributes={}, + ) + + rpc = StreamingRpc(endpoint, stream_buffer, DEFAULT_METADATA, record_metric, compression=compression, ssl=False) + + rpc.connect() + + buffer_empty_event.clear() + stream_buffer.put(span) + + assert buffer_empty_event.wait(5) + assert spans_processed_event.wait(5) + + if batching: + assert not spans_received, "Spans incorrectly received." + assert span_batches_received, "No span batches received." + else: + assert not span_batches_received, "Span batches incorrectly received." + assert spans_received, "No spans received." diff --git a/tests/agent_unittests/conftest.py b/tests/agent_unittests/conftest.py index 012e6ca4b..1504d1b8d 100644 --- a/tests/agent_unittests/conftest.py +++ b/tests/agent_unittests/conftest.py @@ -16,10 +16,8 @@ import tempfile import pytest -from testing_support.fixtures import ( # noqa: F401; pylint: disable=W0611 - collector_agent_registration_fixture, - collector_available_fixture, -) + +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 from testing_support.fixtures import ( # noqa: F401; pylint: disable=W0611 newrelic_caplog as caplog, ) @@ -44,11 +42,12 @@ reload except NameError: # python 3.x - from imp import reload # pylint: disable=W0402 + from importlib import reload class FakeProtos(object): Span = object() + SpanBatch = object() sys.modules["grpc"] = object() diff --git a/tests/agent_unittests/test_agent_connect.py b/tests/agent_unittests/test_agent_connect.py index 46c4edf44..eb944c072 100644 --- a/tests/agent_unittests/test_agent_connect.py +++ b/tests/agent_unittests/test_agent_connect.py @@ -19,10 +19,9 @@ from testing_support.fixtures import ( override_generic_settings, - validate_internal_metrics, failing_endpoint ) - +from testing_support.validators.validate_internal_metrics import validate_internal_metrics SETTINGS = global_settings() diff --git a/tests/agent_unittests/test_harvest_loop.py b/tests/agent_unittests/test_harvest_loop.py index 7760e1307..305622107 100644 --- a/tests/agent_unittests/test_harvest_loop.py +++ b/tests/agent_unittests/test_harvest_loop.py @@ -61,9 +61,8 @@ def transaction_node(request): expected=False, span_id=None, stack_trace="", + error_group_name=None, custom_params={}, - file_name=None, - line_number=None, source=None, ) @@ -163,7 +162,6 @@ def transaction_node(request): def validate_metric_payload(metrics=[], endpoints_called=[]): - sent_metrics = {} @transient_function_wrapper("newrelic.core.agent_protocol", "AgentProtocol.send") @@ -204,7 +202,6 @@ def validate(): def validate_transaction_event_payloads(payload_validators): @function_wrapper def _wrapper(wrapped, instance, args, kwargs): - payloads = [] @transient_function_wrapper("newrelic.core.agent_protocol", "AgentProtocol.send") @@ -319,7 +316,6 @@ def test_serverless_application_harvest(): ], ) def test_application_harvest_with_spans(distributed_tracing_enabled, span_events_enabled, spans_created): - span_endpoints_called = [] max_samples_stored = 10 @@ -381,7 +377,10 @@ def _test(): (7, 10, 10, 7), ), ) -def test_application_harvest_with_span_streaming(span_queue_size, spans_to_send, expected_sent, expected_seen): +@pytest.mark.parametrize("span_batching", (True, False)) +def 
test_application_harvest_with_span_streaming( + span_batching, span_queue_size, spans_to_send, expected_sent, expected_seen +): @override_generic_settings( settings, { @@ -390,6 +389,7 @@ def test_application_harvest_with_span_streaming(span_queue_size, spans_to_send, "span_events.enabled": True, "infinite_tracing._trace_observer_host": "x", "infinite_tracing.span_queue_size": span_queue_size, + "infinite_tracing.batching": span_batching, }, ) @validate_metric_payload( @@ -600,7 +600,6 @@ def test_reservoir_size_zeros(harvest_name, event_name): @pytest.mark.parametrize("events_seen", (1, 5, 10)) def test_error_event_sampling_info(events_seen): - reservoir_size = 5 endpoints_called = [] @@ -671,7 +670,6 @@ def test_compute_sampled_no_reset(): def test_analytic_event_sampling_info(): - synthetics_limit = 10 transactions_limit = 20 @@ -756,7 +754,6 @@ def _test(): }, ) def test_transaction_events_disabled(): - endpoints_called = [] expected_metrics = ( ("Supportability/Python/RequestSampler/requests", None), @@ -798,7 +795,8 @@ def test_reset_synthetics_events(): @pytest.mark.parametrize( - "allowlist_event", ("analytic_event_data", "custom_event_data", "log_event_data", "error_event_data", "span_event_data") + "allowlist_event", + ("analytic_event_data", "custom_event_data", "log_event_data", "error_event_data", "span_event_data"), ) @override_generic_settings( settings, @@ -850,7 +848,8 @@ def test_flexible_events_harvested(allowlist_event): @pytest.mark.parametrize( - "allowlist_event", ("analytic_event_data", "custom_event_data", "log_event_data", "error_event_data", "span_event_data") + "allowlist_event", + ("analytic_event_data", "custom_event_data", "log_event_data", "error_event_data", "span_event_data"), ) @override_generic_settings( settings, diff --git a/tests/agent_unittests/test_package_version_utils.py b/tests/agent_unittests/test_package_version_utils.py new file mode 100644 index 000000000..435d74947 --- /dev/null +++ b/tests/agent_unittests/test_package_version_utils.py @@ -0,0 +1,104 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import sys + +import pytest +from testing_support.validators.validate_function_called import validate_function_called + +from newrelic.common.package_version_utils import ( + NULL_VERSIONS, + VERSION_ATTRS, + get_package_version, + get_package_version_tuple, +) + +IS_PY38_PLUS = sys.version_info[:2] >= (3, 8) +SKIP_IF_NOT_IMPORTLIB_METADATA = pytest.mark.skipif(not IS_PY38_PLUS, reason="importlib.metadata is not supported.") +SKIP_IF_IMPORTLIB_METADATA = pytest.mark.skipif( + IS_PY38_PLUS, reason="importlib.metadata is preferred over pkg_resources." 
+) + + +@pytest.fixture(scope="function", autouse=True) +def patched_pytest_module(monkeypatch): + for attr in VERSION_ATTRS: + if hasattr(pytest, attr): + monkeypatch.delattr(pytest, attr) + + yield pytest + + +@pytest.mark.parametrize( + "attr,value,expected_value", + ( + ("version", "1.2.3.4", "1.2.3.4"), + ("__version__", "1.3.5rc2", "1.3.5rc2"), + ("__version_tuple__", (3, 5, 8), "3.5.8"), + ("version_tuple", [3, 1, "0b2"], "3.1.0b2"), + ), +) +def test_get_package_version(attr, value, expected_value): + # There is no file/module here, so we monkeypatch + # pytest instead for our purposes + setattr(pytest, attr, value) + version = get_package_version("pytest") + assert version == expected_value + delattr(pytest, attr) + + +def test_skips_version_callables(): + # There is no file/module here, so we monkeypatch + # pytest instead for our purposes + setattr(pytest, "version", lambda x: "1.2.3.4") + setattr(pytest, "version_tuple", [3, 1, "0b2"]) + + version = get_package_version("pytest") + + assert version == "3.1.0b2" + + delattr(pytest, "version") + delattr(pytest, "version_tuple") + + +@pytest.mark.parametrize( + "attr,value,expected_value", + ( + ("version", "1.2.3.4", (1, 2, 3, 4)), + ("__version__", "1.3.5rc2", (1, 3, "5rc2")), + ("__version_tuple__", (3, 5, 8), (3, 5, 8)), + ("version_tuple", [3, 1, "0b2"], (3, 1, "0b2")), + ), +) +def test_get_package_version_tuple(attr, value, expected_value): + # There is no file/module here, so we monkeypatch + # pytest instead for our purposes + setattr(pytest, attr, value) + version = get_package_version_tuple("pytest") + assert version == expected_value + delattr(pytest, attr) + + +@SKIP_IF_NOT_IMPORTLIB_METADATA +@validate_function_called("importlib.metadata", "version") +def test_importlib_metadata(): + version = get_package_version("pytest") + assert version not in NULL_VERSIONS, version + + +@SKIP_IF_IMPORTLIB_METADATA +@validate_function_called("pkg_resources", "get_distribution") +def test_pkg_resources_metadata(): + version = get_package_version("pytest") + assert version not in NULL_VERSIONS, version diff --git a/tests/agent_unittests/test_signature.py b/tests/agent_unittests/test_signature.py new file mode 100644 index 000000000..8d44896f3 --- /dev/null +++ b/tests/agent_unittests/test_signature.py @@ -0,0 +1,31 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
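+# bind_args maps a call's positional and keyword arguments (and any parameter defaults) onto the callable's parameter names; the cases below exercise each of those paths.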
+ +import pytest + +from newrelic.common.signature import bind_args + + +@pytest.mark.parametrize( + "func,args,kwargs,expected", + [ + (lambda x, y: None, (1,), {"y": 2}, {"x": 1, "y": 2}), + (lambda x=1, y=2: None, (1,), {"y": 2}, {"x": 1, "y": 2}), + (lambda x=1: None, (), {}, {"x": 1}), + ], + ids=("posargs", "kwargs", "defaults"), +) +def test_signature_binding(func, args, kwargs, expected): + bound_args = bind_args(func, args, kwargs) + assert bound_args == expected diff --git a/tests/agent_unittests/test_trace_cache.py b/tests/agent_unittests/test_trace_cache.py new file mode 100644 index 000000000..e0f7db84f --- /dev/null +++ b/tests/agent_unittests/test_trace_cache.py @@ -0,0 +1,129 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import threading + +import pytest + +from newrelic.core.trace_cache import TraceCache + +_TEST_CONCURRENT_ITERATION_TC_SIZE = 20 + + +class DummyTrace(object): + pass + + +@pytest.fixture(scope="function") +def trace_cache(): + return TraceCache() + + +def test_trace_cache_methods(trace_cache): + """Test MutableMapping methods functional for trace_cache""" + obj = DummyTrace() # weakref compatible object + + trace_cache[1] = obj + assert 1 in trace_cache + assert bool(trace_cache) + assert list(trace_cache) + + del trace_cache[1] + assert 1 not in trace_cache + assert not bool(trace_cache) + + trace_cache[1] = obj + assert trace_cache.get(1, None) + assert trace_cache.pop(1, None) + + trace_cache[1] = obj + assert len(trace_cache) == 1 + assert len(list(trace_cache.items())) == 1 + assert len(list(trace_cache.keys())) == 1 + assert len(list(trace_cache.values())) == 1 + + +@pytest.fixture(scope="function") +def iterate_trace_cache(trace_cache): + def _iterate_trace_cache(shutdown): + while True: + if shutdown.is_set(): + return + for k, v in trace_cache.items(): + pass + for v in trace_cache.values(): + pass + for v in trace_cache.keys(): + pass + + return _iterate_trace_cache + + +@pytest.fixture(scope="function") +def change_weakref_dict_size(trace_cache): + def _change_weakref_dict_size(shutdown, obj_refs): + """ + Cause RuntimeErrors when iterating on the trace_cache by: + - Repeatedly pop and add batches of keys to cause size changes. + - Randomly delete and replace some object refs so the weak references are deleted, + causing the weakref dict to delete them and forcing further size changes. + """ + + dict_size_change = _TEST_CONCURRENT_ITERATION_TC_SIZE // 2 # Remove up to half of items + while True: + if shutdown.is_set(): + return + + # Delete and re-add keys + for i in range(dict_size_change): + trace_cache._cache.pop(i, None) + for i in range(dict_size_change): + trace_cache._cache[i] = obj_refs[i] + + # Replace every 3rd obj ref causing the WeakValueDictionary to drop it. 
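+            # (enumerate over the [::3] slice yields indices 0, 1, 2, ..., so this replaces the first third of the refs on each pass; either way the weak references drop and the dict resizes mid-iteration.)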
+ for i, _ in enumerate(obj_refs[::3]): + obj_refs[i] = DummyTrace() + + return _change_weakref_dict_size + + +def test_concurrent_iteration(iterate_trace_cache, change_weakref_dict_size): + """ + Test for exceptions related to trace_cache changing size during iteration. + + The WeakValueDictionary used internally is particularly prone to this, as iterating + on it in any way other than indirectly through WeakValueDictionary.valuerefs() + will cause RuntimeErrors due to the unguarded iteration on a dictionary internally. + """ + obj_refs = [DummyTrace() for _ in range(_TEST_CONCURRENT_ITERATION_TC_SIZE)] + shutdown = threading.Event() + + t1 = threading.Thread(target=change_weakref_dict_size, args=(shutdown, obj_refs)) + t2 = threading.Thread(target=iterate_trace_cache, args=(shutdown,)) + t1.daemon = True + t2.daemon = True + t1.start() + t2.start() + + # Run for 1 second, then shutdown. Stop immediately for exceptions. + t2.join(timeout=1) + assert t1.is_alive(), "Thread exited with exception." + assert t2.is_alive(), "Thread exited with exception." + shutdown.set() + + # Ensure threads shutdown with a timeout to prevent hangs + t1.join(timeout=1) + t2.join(timeout=1) + assert not t1.is_alive(), "Thread failed to exit." + assert not t2.is_alive(), "Thread failed to exit." diff --git a/tests/agent_unittests/test_utilization_settings.py b/tests/agent_unittests/test_utilization_settings.py index 0cc9b4bc7..8af4bcbf1 100644 --- a/tests/agent_unittests/test_utilization_settings.py +++ b/tests/agent_unittests/test_utilization_settings.py @@ -13,29 +13,33 @@ # limitations under the License. import os -import pytest import tempfile -from newrelic.common.object_wrapper import function_wrapper -from newrelic.core.agent_protocol import AgentProtocol -from newrelic.config import initialize +import pytest # these will be reloaded for each test import newrelic.config import newrelic.core.config +from newrelic.common.object_wrapper import function_wrapper +from newrelic.config import initialize +from newrelic.core.agent_protocol import AgentProtocol # the specific methods imported here will not be changed when the modules are # reloaded -from newrelic.core.config import (_remove_ignored_configs, - finalize_application_settings, _environ_as_int, _environ_as_float, - global_settings) +from newrelic.core.config import ( + _environ_as_float, + _environ_as_int, + _remove_ignored_configs, + finalize_application_settings, + global_settings, +) try: # python 2.x reload except NameError: # python 3.x - from imp import reload + from importlib import reload INI_FILE_WITHOUT_UTIL_CONF = b""" [newrelic] @@ -56,14 +60,16 @@ """ ENV_WITHOUT_UTIL_CONF = {} -ENV_WITH_UTIL_CONF = {'NEW_RELIC_UTILIZATION_BILLING_HOSTNAME': 'env-hostname'} +ENV_WITH_UTIL_CONF = {"NEW_RELIC_UTILIZATION_BILLING_HOSTNAME": "env-hostname"} ENV_WITH_BAD_UTIL_CONF = { - 'NEW_RELIC_UTILIZATION_LOGICAL_PROCESSORS': 'notanum', - 'NEW_RELIC_UTILIZATION_BILLING_HOSTNAME': 'env-hostname', - 'NEW_RELIC_UTILIZATION_TOTAL_RAM_MIB': '98765', + "NEW_RELIC_UTILIZATION_LOGICAL_PROCESSORS": "notanum", + "NEW_RELIC_UTILIZATION_BILLING_HOSTNAME": "env-hostname", + "NEW_RELIC_UTILIZATION_TOTAL_RAM_MIB": "98765", +} +ENV_WITH_HEROKU = { + "NEW_RELIC_HEROKU_USE_DYNO_NAMES": "false", + "NEW_RELIC_HEROKU_DYNO_NAME_PREFIXES_TO_SHORTEN": "meow wruff", } -ENV_WITH_HEROKU = {'NEW_RELIC_HEROKU_USE_DYNO_NAMES': 'false', - 'NEW_RELIC_HEROKU_DYNO_NAME_PREFIXES_TO_SHORTEN': 'meow wruff'} INITIAL_ENV = os.environ @@ -108,6 +114,7 @@ def reset(wrapped, instance, args, 
kwargs): returned = wrapped(*args, **kwargs) return returned + return reset @@ -115,39 +122,35 @@ def reset(wrapped, instance, args, kwargs): def test_heroku_default(): settings = global_settings() assert settings.heroku.use_dyno_names is True - assert settings.heroku.dyno_name_prefixes_to_shorten in \ - (['scheduler', 'run'], ['run', 'scheduler']) + assert settings.heroku.dyno_name_prefixes_to_shorten in (["scheduler", "run"], ["run", "scheduler"]) @reset_agent_config(INI_FILE_WITHOUT_UTIL_CONF, ENV_WITH_HEROKU) def test_heroku_override(): settings = global_settings() assert settings.heroku.use_dyno_names is False - assert settings.heroku.dyno_name_prefixes_to_shorten in \ - (['meow', 'wruff'], ['wruff', 'meow']) + assert settings.heroku.dyno_name_prefixes_to_shorten in (["meow", "wruff"], ["wruff", "meow"]) @reset_agent_config(INI_FILE_WITHOUT_UTIL_CONF, ENV_WITH_UTIL_CONF) def test_billing_hostname_from_env_vars(): settings = global_settings() - assert settings.utilization.billing_hostname == 'env-hostname' + assert settings.utilization.billing_hostname == "env-hostname" - local_config, = AgentProtocol._connect_payload( - '', [], [], settings) - util_conf = local_config['utilization'].get('config') - assert util_conf == {'hostname': 'env-hostname'} + (local_config,) = AgentProtocol._connect_payload("", [], [], settings) + util_conf = local_config["utilization"].get("config") + assert util_conf == {"hostname": "env-hostname"} @reset_agent_config(INI_FILE_WITH_UTIL_CONF, ENV_WITH_UTIL_CONF) def test_billing_hostname_precedence(): # ini-file takes precedence over env vars settings = global_settings() - assert settings.utilization.billing_hostname == 'file-hostname' + assert settings.utilization.billing_hostname == "file-hostname" - local_config, = AgentProtocol._connect_payload( - '', [], [], settings) - util_conf = local_config['utilization'].get('config') - assert util_conf == {'hostname': 'file-hostname'} + (local_config,) = AgentProtocol._connect_payload("", [], [], settings) + util_conf = local_config["utilization"].get("config") + assert util_conf == {"hostname": "file-hostname"} @reset_agent_config(INI_FILE_WITHOUT_UTIL_CONF, ENV_WITHOUT_UTIL_CONF) @@ -157,21 +160,19 @@ def test_billing_hostname_with_blank_ini_file_no_env(): # if no utilization config settings are set, the 'config' section is not in # the payload at all - local_config, = AgentProtocol._connect_payload( - '', [], [], settings) - util_conf = local_config['utilization'].get('config') + (local_config,) = AgentProtocol._connect_payload("", [], [], settings) + util_conf = local_config["utilization"].get("config") assert util_conf is None @reset_agent_config(INI_FILE_WITH_UTIL_CONF, ENV_WITHOUT_UTIL_CONF) def test_billing_hostname_with_set_in_ini_not_in_env(): settings = global_settings() - assert settings.utilization.billing_hostname == 'file-hostname' + assert settings.utilization.billing_hostname == "file-hostname" - local_config, = AgentProtocol._connect_payload( - '', [], [], settings) - util_conf = local_config['utilization'].get('config') - assert util_conf == {'hostname': 'file-hostname'} + (local_config,) = AgentProtocol._connect_payload("", [], [], settings) + util_conf = local_config["utilization"].get("config") + assert util_conf == {"hostname": "file-hostname"} @reset_agent_config(INI_FILE_WITH_BAD_UTIL_CONF, ENV_WITHOUT_UTIL_CONF) @@ -179,10 +180,9 @@ def test_bad_value_in_ini_file(): settings = global_settings() assert settings.utilization.logical_processors == 0 - local_config, = 
AgentProtocol._connect_payload( - '', [], [], settings) - util_conf = local_config['utilization'].get('config') - assert util_conf == {'hostname': 'file-hostname', 'total_ram_mib': 12345} + (local_config,) = AgentProtocol._connect_payload("", [], [], settings) + util_conf = local_config["utilization"].get("config") + assert util_conf == {"hostname": "file-hostname", "total_ram_mib": 12345} @reset_agent_config(INI_FILE_WITHOUT_UTIL_CONF, ENV_WITH_BAD_UTIL_CONF) @@ -190,160 +190,154 @@ def test_bad_value_in_env_var(): settings = global_settings() assert settings.utilization.logical_processors == 0 - local_config, = AgentProtocol._connect_payload( - '', [], [], settings) - util_conf = local_config['utilization'].get('config') - assert util_conf == {'hostname': 'env-hostname', 'total_ram_mib': 98765} + (local_config,) = AgentProtocol._connect_payload("", [], [], settings) + util_conf = local_config["utilization"].get("config") + assert util_conf == {"hostname": "env-hostname", "total_ram_mib": 98765} # Tests for combining with server side settings _server_side_config_settings_util_conf = [ { - 'foo': 123, - 'bar': 456, - 'agent_config': { - 'utilization.billing_hostname': 'server-side-hostname' - }, + "foo": 123, + "bar": 456, + "agent_config": {"utilization.billing_hostname": "server-side-hostname"}, }, { - 'foo': 123, - 'bar': 456, - 'agent_config': { - 'baz': 789, + "foo": 123, + "bar": 456, + "agent_config": { + "baz": 789, }, }, { - 'foo': 123, - 'bar': 456, + "foo": 123, + "bar": 456, }, ] -@pytest.mark.parametrize('server_settings', - _server_side_config_settings_util_conf) +@pytest.mark.parametrize("server_settings", _server_side_config_settings_util_conf) def test_remove_ignored_configs(server_settings): fixed_settings = _remove_ignored_configs(server_settings) - agent_config = fixed_settings.get('agent_config', {}) - assert 'utilization.billing_hostname' not in agent_config + agent_config = fixed_settings.get("agent_config", {}) + assert "utilization.billing_hostname" not in agent_config @reset_agent_config(INI_FILE_WITH_UTIL_CONF, ENV_WITHOUT_UTIL_CONF) -@pytest.mark.parametrize('server_settings', - _server_side_config_settings_util_conf) +@pytest.mark.parametrize("server_settings", _server_side_config_settings_util_conf) def test_finalize_application_settings(server_settings): settings = global_settings() - finalize_application_settings(server_side_config=server_settings, - settings=settings) + finalize_application_settings(server_side_config=server_settings, settings=settings) # hostname set in ini_file and not in env vars - assert settings.utilization.billing_hostname == 'file-hostname' + assert settings.utilization.billing_hostname == "file-hostname" # Tests for _environ_as_int _tests_environ_as_int = [ { - 'name': 'test no env var set, no default requested', - 'envvar_set': False, - 'envvar_val': None, # None set - 'default': None, # None requested - 'expected_value': 0, + "name": "test no env var set, no default requested", + "envvar_set": False, + "envvar_val": None, # None set + "default": None, # None requested + "expected_value": 0, }, { - 'name': 'test no env var set, default requested', - 'envvar_set': False, - 'envvar_val': None, # None set - 'default': 123, - 'expected_value': 123, + "name": "test no env var set, default requested", + "envvar_set": False, + "envvar_val": None, # None set + "default": 123, + "expected_value": 123, }, { - 'name': 'test env var is not an int, no default requested', - 'envvar_set': True, - 'envvar_val': 'testing', - 'default': None, # None 
requested - 'expected_value': 0, + "name": "test env var is not an int, no default requested", + "envvar_set": True, + "envvar_val": "testing", + "default": None, # None requested + "expected_value": 0, }, { - 'name': 'test env var is not an int, default requested', - 'envvar_set': True, - 'envvar_val': 'testing-more', - 'default': 1234, - 'expected_value': 1234, + "name": "test env var is not an int, default requested", + "envvar_set": True, + "envvar_val": "testing-more", + "default": 1234, + "expected_value": 1234, }, { - 'name': 'test env var is an int', - 'envvar_set': True, - 'envvar_val': 7239, - 'default': None, # None requested - 'expected_value': 7239, + "name": "test env var is an int", + "envvar_set": True, + "envvar_val": 7239, + "default": None, # None requested + "expected_value": 7239, }, ] _tests_environ_as_float = [ { - 'name': 'test no env var set, no default requested', - 'envvar_set': False, - 'envvar_val': None, # None set - 'default': None, # None requested - 'expected_value': 0.0, + "name": "test no env var set, no default requested", + "envvar_set": False, + "envvar_val": None, # None set + "default": None, # None requested + "expected_value": 0.0, }, { - 'name': 'test no env var set, default requested', - 'envvar_set': False, - 'envvar_val': None, # None set - 'default': 123.0, - 'expected_value': 123.0, + "name": "test no env var set, default requested", + "envvar_set": False, + "envvar_val": None, # None set + "default": 123.0, + "expected_value": 123.0, }, { - 'name': 'test env var is not a float, no default requested', - 'envvar_set': True, - 'envvar_val': 'testing', - 'default': None, # None requested - 'expected_value': 0.0, + "name": "test env var is not a float, no default requested", + "envvar_set": True, + "envvar_val": "testing", + "default": None, # None requested + "expected_value": 0.0, }, { - 'name': 'test env var is not a number, default requested', - 'envvar_set': True, - 'envvar_val': 'testing-more', - 'default': 1234.0, - 'expected_value': 1234.0, + "name": "test env var is not a number, default requested", + "envvar_set": True, + "envvar_val": "testing-more", + "default": 1234.0, + "expected_value": 1234.0, }, { - 'name': 'test env var is an int, not float', - 'envvar_set': True, - 'envvar_val': '7239', - 'default': None, # None requested - 'expected_value': 7239.0, + "name": "test env var is an int, not float", + "envvar_set": True, + "envvar_val": "7239", + "default": None, # None requested + "expected_value": 7239.0, }, { - 'name': 'test env var is a float', - 'envvar_set': True, - 'envvar_val': '7239.23234', - 'default': None, # None requested - 'expected_value': 7239.23234, + "name": "test env var is a float", + "envvar_set": True, + "envvar_val": "7239.23234", + "default": None, # None requested + "expected_value": 7239.23234, }, ] def _test_environ(env_type, test): - env = {'TESTING': test['envvar_val']} if test['envvar_set'] else {} - default = test['default'] + env = {"TESTING": test["envvar_val"]} if test["envvar_set"] else {} + default = test["default"] with Environ(env): if default: - val = env_type('TESTING', default=default) + val = env_type("TESTING", default=default) else: - val = env_type('TESTING') - assert val == test['expected_value'] + val = env_type("TESTING") + assert val == test["expected_value"] -@pytest.mark.parametrize('test', _tests_environ_as_int) +@pytest.mark.parametrize("test", _tests_environ_as_int) def test__environ_as_int(test): _test_environ(_environ_as_int, test) -@pytest.mark.parametrize('test', 
_tests_environ_as_float) +@pytest.mark.parametrize("test", _tests_environ_as_float) def test__environ_as_float(test): _test_environ(_environ_as_float, test) diff --git a/tests/application_celery/conftest.py b/tests/application_celery/conftest.py index 92b9aabd0..49f0fe477 100644 --- a/tests/application_celery/conftest.py +++ b/tests/application_celery/conftest.py @@ -14,14 +14,8 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.application_celery', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/application_celery/test_celery.py b/tests/application_celery/test_celery.py index 5bde17714..c2f9177fa 100644 --- a/tests/application_celery/test_celery.py +++ b/tests/application_celery/test_celery.py @@ -15,7 +15,7 @@ from newrelic.api.background_task import background_task from newrelic.api.transaction import ignore_transaction, end_of_transaction -from testing_support.fixtures import validate_transaction_metrics +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics from tasks import add, tsum diff --git a/tests/application_gearman/conftest.py b/tests/application_gearman/conftest.py index ec469dcae..6a38806e2 100644 --- a/tests/application_gearman/conftest.py +++ b/tests/application_gearman/conftest.py @@ -14,14 +14,8 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.application_gearman', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/application_gearman/test_gearman.py b/tests/application_gearman/test_gearman.py index 7ddc13fdc..5dda4ef47 100644 --- a/tests/application_gearman/test_gearman.py +++ b/tests/application_gearman/test_gearman.py @@ -20,14 +20,16 @@ import gearman from newrelic.api.background_task import background_task +from testing_support.db_settings import gearman_settings worker_thread = None worker_event = threading.Event() gm_client = None -GEARMAND_HOST = os.environ.get("GEARMAND_PORT_4730_TCP_ADDR", "localhost") -GEARMAND_PORT = os.environ.get("GEARMAND_PORT_4730_TCP_PORT", "4730") +GEARMAND_SETTINGS = gearman_settings()[0] +GEARMAND_HOST = GEARMAND_SETTINGS["host"] +GEARMAND_PORT = GEARMAND_SETTINGS["port"] GEARMAND_ADDR = "%s:%s" % (GEARMAND_HOST, GEARMAND_PORT) diff --git a/tests/component_djangorestframework/conftest.py b/tests/component_djangorestframework/conftest.py index cdd3d2e8f..a4b37571d 100644 --- a/tests/component_djangorestframework/conftest.py +++ b/tests/component_djangorestframework/conftest.py @@ -14,15 +14,8 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: 
F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.framework_django', - 'newrelic.hooks.component_djangorestframework', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/component_djangorestframework/test_application.py b/tests/component_djangorestframework/test_application.py index 0d0b98d82..c036f068d 100644 --- a/tests/component_djangorestframework/test_application.py +++ b/tests/component_djangorestframework/test_application.py @@ -18,9 +18,11 @@ from newrelic.packages import six from newrelic.core.config import global_settings -from testing_support.fixtures import (validate_transaction_metrics, - validate_transaction_errors, override_generic_settings, +from testing_support.fixtures import ( + override_generic_settings, function_not_called) +from testing_support.validators.validate_transaction_errors import validate_transaction_errors +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics import django diff --git a/tests/component_flask_rest/conftest.py b/tests/component_flask_rest/conftest.py index 12e985893..ff00973ab 100644 --- a/tests/component_flask_rest/conftest.py +++ b/tests/component_flask_rest/conftest.py @@ -14,15 +14,8 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, # noqa - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.framework_flask', - 'newrelic.hooks.component_flask_rest', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/component_flask_rest/test_application.py b/tests/component_flask_rest/test_application.py index d0eb41795..d463a0205 100644 --- a/tests/component_flask_rest/test_application.py +++ b/tests/component_flask_rest/test_application.py @@ -13,24 +13,33 @@ # limitations under the License. import pytest +from testing_support.fixtures import ( + override_generic_settings, + override_ignore_status_codes, +) +from testing_support.validators.validate_code_level_metrics import ( + validate_code_level_metrics, +) +from testing_support.validators.validate_transaction_errors import ( + validate_transaction_errors, +) +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) + +from newrelic.common.object_names import callable_name +from newrelic.core.config import global_settings from newrelic.packages import six -from testing_support.fixtures import (validate_transaction_metrics, - validate_transaction_errors, override_ignore_status_codes, - override_generic_settings) -from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics -from newrelic.core.config import global_settings -from newrelic.common.object_names import callable_name +TEST_APPLICATION_PREFIX = "_test_application.create_app.<locals>" 
if six.PY3 else "_test_application" -@pytest.fixture(params=["flask_restful", "flask_restplus", "flask_restx"]) +@pytest.fixture(params=["flask_restful", "flask_restx"]) def application(request): from _test_application import get_test_application + if request.param == "flask_restful": import flask_restful as module - elif request.param == "flask_restplus": - import flask_restplus as module elif request.param == "flask_restx": import flask_restx as module else: @@ -44,49 +53,46 @@ def application(request): _test_application_index_scoped_metrics = [ - ('Function/flask.app:Flask.wsgi_app', 1), - ('Python/WSGI/Application', 1), - ('Python/WSGI/Response', 1), - ('Python/WSGI/Finalize', 1), - ('Function/_test_application:index', 1), - ('Function/werkzeug.wsgi:ClosingIterator.close', 1), + ("Function/flask.app:Flask.wsgi_app", 1), + ("Python/WSGI/Application", 1), + ("Python/WSGI/Response", 1), + ("Python/WSGI/Finalize", 1), + ("Function/_test_application:index", 1), + ("Function/werkzeug.wsgi:ClosingIterator.close", 1), ] -@validate_code_level_metrics("_test_application.create_app.<locals>", "IndexResource", py2_namespace="_test_application") +@validate_code_level_metrics(TEST_APPLICATION_PREFIX + ".IndexResource", "get") @validate_transaction_errors(errors=[]) -@validate_transaction_metrics('_test_application:index', - scoped_metrics=_test_application_index_scoped_metrics) +@validate_transaction_metrics("_test_application:index", scoped_metrics=_test_application_index_scoped_metrics) def test_application_index(application): - response = application.get('/index') - response.mustcontain('hello') + response = application.get("/index") + response.mustcontain("hello") _test_application_raises_scoped_metrics = [ - ('Function/flask.app:Flask.wsgi_app', 1), - ('Python/WSGI/Application', 1), - ('Function/_test_application:exception', 1), + ("Function/flask.app:Flask.wsgi_app", 1), + ("Python/WSGI/Application", 1), + ("Function/_test_application:exception", 1), ] @pytest.mark.parametrize( - 'exception,status_code,ignore_status_code,propagate_exceptions', [ - ('werkzeug.exceptions:HTTPException', 404, False, False), - ('werkzeug.exceptions:HTTPException', 404, True, False), - ('werkzeug.exceptions:HTTPException', 503, False, False), - ('_test_application:CustomException', 500, False, False), - ('_test_application:CustomException', 500, False, True), -]) -def test_application_raises(exception, status_code, ignore_status_code, - propagate_exceptions, application): - - @validate_code_level_metrics("_test_application.create_app.<locals>", "ExceptionResource", py2_namespace="_test_application") - @validate_transaction_metrics('_test_application:exception', - scoped_metrics=_test_application_raises_scoped_metrics) + "exception,status_code,ignore_status_code,propagate_exceptions", + [ + ("werkzeug.exceptions:HTTPException", 404, False, False), + ("werkzeug.exceptions:HTTPException", 404, True, False), + ("werkzeug.exceptions:HTTPException", 503, False, False), + ("_test_application:CustomException", 500, False, False), + ("_test_application:CustomException", 500, False, True), + ], +) +def test_application_raises(exception, status_code, ignore_status_code, propagate_exceptions, application): + @validate_code_level_metrics(TEST_APPLICATION_PREFIX + ".ExceptionResource", "get") + @validate_transaction_metrics("_test_application:exception", scoped_metrics=_test_application_raises_scoped_metrics) def _test(): try: - application.get('/exception/%s/%i' % (exception, - status_code), status=status_code, expect_errors=True) + 
application.get("/exception/%s/%i" % (exception, status_code), status=status_code, expect_errors=True) except Exception as e: assert propagate_exceptions @@ -108,9 +114,8 @@ def test_application_outside_transaction(application): _settings = global_settings() - @override_generic_settings(_settings, {'enabled': False}) + @override_generic_settings(_settings, {"enabled": False}) def _test(): - application.get('/exception/werkzeug.exceptions:HTTPException/404', - status=404) + application.get("/exception/werkzeug.exceptions:HTTPException/404", status=404) _test() diff --git a/tests/component_graphqlserver/conftest.py b/tests/component_graphqlserver/conftest.py index c2b5f7d92..f62af8210 100644 --- a/tests/component_graphqlserver/conftest.py +++ b/tests/component_graphqlserver/conftest.py @@ -12,17 +12,8 @@ # See the License for the specific language governing permissions and # limitations under the License. -from testing_support.fixtures import ( - code_coverage_fixture, - collector_agent_registration_fixture, - collector_available_fixture, -) - -_coverage_source = [ - "newrelic.hooks.component_graphqlserver", -] +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { "transaction_tracer.explain_threshold": 0.0, diff --git a/tests/component_graphqlserver/test_graphql.py b/tests/component_graphqlserver/test_graphql.py index f22245d84..22cfda306 100644 --- a/tests/component_graphqlserver/test_graphql.py +++ b/tests/component_graphqlserver/test_graphql.py @@ -14,11 +14,9 @@ import importlib import pytest -from testing_support.fixtures import ( - dt_enabled, - validate_transaction_errors, - validate_transaction_metrics, -) +from testing_support.fixtures import dt_enabled +from testing_support.validators.validate_transaction_errors import validate_transaction_errors +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.validators.validate_span_events import validate_span_events from testing_support.validators.validate_transaction_count import ( validate_transaction_count, diff --git a/tests/component_tastypie/conftest.py b/tests/component_tastypie/conftest.py index da01fa46b..e38e3b2f3 100644 --- a/tests/component_tastypie/conftest.py +++ b/tests/component_tastypie/conftest.py @@ -14,14 +14,8 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.component_tastypie', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/component_tastypie/test_application.py b/tests/component_tastypie/test_application.py index 16521425c..5f81d0831 100644 --- a/tests/component_tastypie/test_application.py +++ b/tests/component_tastypie/test_application.py @@ -21,8 +21,9 @@ from newrelic.api.background_task import background_task from newrelic.api.transaction import end_of_transaction -from testing_support.fixtures import (validate_transaction_metrics, - validate_transaction_errors, override_ignore_status_codes) +from testing_support.fixtures import override_ignore_status_codes +from 
testing_support.validators.validate_transaction_errors import validate_transaction_errors +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics from wsgi import application diff --git a/tests/coroutines_asyncio/conftest.py b/tests/coroutines_asyncio/conftest.py index aa412d38a..5d3d843d0 100644 --- a/tests/coroutines_asyncio/conftest.py +++ b/tests/coroutines_asyncio/conftest.py @@ -13,18 +13,10 @@ # limitations under the License. import pytest -from testing_support.fixture.event_loop import event_loop -from testing_support.fixtures import ( - code_coverage_fixture, - collector_agent_registration_fixture, - collector_available_fixture, -) -_coverage_source = [ - "newrelic.hooks.coroutines_asyncio", -] +from testing_support.fixture.event_loop import event_loop +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { "transaction_tracer.explain_threshold": 0.0, diff --git a/tests/coroutines_asyncio/test_context_propagation.py b/tests/coroutines_asyncio/test_context_propagation.py index 3beef38d0..ef26aacc1 100644 --- a/tests/coroutines_asyncio/test_context_propagation.py +++ b/tests/coroutines_asyncio/test_context_propagation.py @@ -15,9 +15,8 @@ import sys import pytest -from testing_support.fixtures import ( - function_not_called, - override_generic_settings, +from testing_support.fixtures import function_not_called, override_generic_settings +from testing_support.validators.validate_transaction_metrics import ( validate_transaction_metrics, ) @@ -132,7 +131,7 @@ def handle_exception(loop, context): # The agent should have removed all traces from the cache since # run_until_complete has terminated (all callbacks scheduled inside the # task have run) - assert not trace_cache()._cache + assert not trace_cache() # Assert that no exceptions have occurred assert not exceptions, exceptions @@ -290,7 +289,7 @@ def _test(): # The agent should have removed all traces from the cache since # run_until_complete has terminated - assert not trace_cache()._cache + assert not trace_cache() # Assert that no exceptions have occurred assert not exceptions, exceptions diff --git a/tests/cross_agent/conftest.py b/tests/cross_agent/conftest.py index 0f0f7e3b9..d21ebf236 100644 --- a/tests/cross_agent/conftest.py +++ b/tests/cross_agent/conftest.py @@ -14,16 +14,8 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.api.transaction', - 'newrelic.api.web_transaction', - 'newrelic.core.attribute_filter', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/cross_agent/fixtures/utilization/utilization_json.json b/tests/cross_agent/fixtures/utilization/utilization_json.json index a5ed101bc..384464631 100644 --- a/tests/cross_agent/fixtures/utilization/utilization_json.json +++ b/tests/cross_agent/fixtures/utilization/utilization_json.json @@ -183,7 +183,7 @@ "input_hostname": "myotherhost", "input_full_hostname": "myotherhost.com", 
"input_ip_address": ["1.2.3.4"], - "input_gcp_id": "3161347020215157000", + "input_gcp_id": "3161347020215157123", "input_gcp_type": "projects/492690098729/machineTypes/custom-1-1024", "input_gcp_name": "aef-default-20170501t160547-7gh8", "input_gcp_zone": "projects/492690098729/zones/us-central1-c", @@ -196,7 +196,7 @@ "ip_address": ["1.2.3.4"], "vendors": { "gcp": { - "id": "3161347020215157000", + "id": "3161347020215157123", "machineType": "custom-1-1024", "name": "aef-default-20170501t160547-7gh8", "zone": "us-central1-c" @@ -211,7 +211,7 @@ "input_hostname": "myotherhost", "input_full_hostname": "myotherhost.com", "input_ip_address": ["1.2.3.4"], - "input_gcp_id": "3161347020215157000", + "input_gcp_id": "3161347020215157123", "input_gcp_type": "projects/492690098729/machineTypes/custom-1-1024", "input_gcp_name": null, "input_gcp_zone": "projects/492690098729/zones/us-central1-c", diff --git a/tests/cross_agent/fixtures/utilization_vendor_specific/gcp.json b/tests/cross_agent/fixtures/utilization_vendor_specific/gcp.json index 9912790c0..090b410ea 100644 --- a/tests/cross_agent/fixtures/utilization_vendor_specific/gcp.json +++ b/tests/cross_agent/fixtures/utilization_vendor_specific/gcp.json @@ -24,7 +24,7 @@ "uri": { "http://metadata.google.internal/computeMetadata/v1/instance/?recursive=true": { "response": { - "id": 3161347020215157000, + "id": 3161347020215157123, "machineType": "projects/492690098729/machineTypes/custom-1-1024", "name": "aef-default-20170501t160547-7gh8", "zone": "projects/492690098729/zones/us-central1-c" @@ -34,7 +34,7 @@ }, "expected_vendors_hash": { "gcp": { - "id": "3161347020215157000", + "id": "3161347020215157123", "machineType": "custom-1-1024", "name": "aef-default-20170501t160547-7gh8", "zone": "us-central1-c" @@ -46,7 +46,7 @@ "uri": { "http://metadata.google.internal/computeMetadata/v1/instance/?recursive=true": { "response": { - "id": 3161347020215157000, + "id": 3161347020215157123, "machineType": "", "name": "aef-default-20170501t160547-7gh8", "zone": "projects/492690098729/zones/us-central1-c" @@ -66,7 +66,7 @@ "uri": { "http://metadata.google.internal/computeMetadata/v1/instance/?recursive=true": { "response": { - "id": 3161347020215157000, + "id": 3161347020215157123, "machineType": "zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz", "name": "aef-default-20170501t160547-7gh8", "zone": "projects/492690098729/zones/us-central1-c" @@ -126,7 +126,7 @@ "uri": { "http://metadata.google.internal/computeMetadata/v1/instance/?recursive=true": { "response": { - "id": 3161347020215157000, + "id": 3161347020215157123, "machineType": "projects/492690098729/machineTypes/custom-1-1024", "name": "aef-default-20170501t160547-7gh8", "zone": "" @@ -146,7 +146,7 @@ "uri": { "http://metadata.google.internal/computeMetadata/v1/instance/?recursive=true": { "response": { - "id": 3161347020215157000, + "id": 3161347020215157123, "machineType": "projects/492690098729/machineTypes/custom-1-1024", "name": "aef-default-20170501t160547-7gh8", "zone": 
"zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz" @@ -166,7 +166,7 @@ "uri": { "http://metadata.google.internal/computeMetadata/v1/instance/?recursive=true": { "response": { - "id": 3161347020215157000, + "id": 3161347020215157123, "machineType": "projects/492690098729/machineTypes/custom-1-1024", "name": "", "zone": "projects/492690098729/zones/us-central1-c" @@ -186,7 +186,7 @@ "uri": { "http://metadata.google.internal/computeMetadata/v1/instance/?recursive=true": { "response": { - "id": 3161347020215157000, + "id": 3161347020215157123, "machineType": "projects/492690098729/machineTypes/custom-1-1024", "name": "zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz", "zone": "projects/492690098729/zones/us-central1-c" @@ -206,7 +206,7 @@ "uri": { "http://metadata.google.internal/computeMetadata/v1/instance/?recursive=true": { "response": { - "id": 3161347020215157000, + "id": 3161347020215157123, "machineType": "projects/492690098729/machineTypes/custom-1-1024", "name": "滈 橀槶澉 鞻饙騴 鱙鷭黂 甗糲 紁羑 嗂 蛶觢豥 餤駰鬳 釂鱞鸄", "zone": "projects/492690098729/zones/us-central1-c" @@ -216,7 +216,7 @@ }, "expected_vendors_hash": { "gcp": { - "id": "3161347020215157000", + "id": "3161347020215157123", "machineType": "custom-1-1024", "name": "滈 橀槶澉 鞻饙騴 鱙鷭黂 甗糲 紁羑 嗂 蛶觢豥 餤駰鬳 釂鱞鸄", "zone": "us-central1-c" @@ -228,7 +228,7 @@ "uri": { "http://metadata.google.internal/computeMetadata/v1/instance/?recursive=true": { "response": { - "id": 3161347020215157000, + "id": 3161347020215157123, "machineType": "projects/492690098729/machineTypes/custom-1-1024", "name": "滈 橀槶澉 鞻饙騴 鱙鷭黂 甗糲, 紁羑 嗂 蛶觢豥 餤駰鬳 釂鱞鸄", "zone": "projects/492690098729/zones/us-central1-c" @@ -248,7 +248,7 @@ "uri": { "http://metadata.google.internal/computeMetadata/v1/instance/?recursive=true": { "response": { - "id": 3161347020215157000, + "id": 3161347020215157123, "machineType": "projects/492690098729/machineTypes/custom-1-1024", "name": "Bang!", "zone": "projects/492690098729/zones/us-central1-c" @@ -268,7 +268,7 @@ "uri": { "http://metadata.google.internal/computeMetadata/v1/instance/?recursive=true": { "response": { - "id": 3161347020215157000, + "id": 3161347020215157123, "machineType": "projects/492690098729/machineTypes/custom-1-1024", "name": "a-b_c.3... and/or 503 867-5309", "zone": "projects/492690098729/zones/us-central1-c" @@ -278,7 +278,7 @@ }, "expected_vendors_hash": { "gcp": { - "id": "3161347020215157000", + "id": "3161347020215157123", "machineType": "custom-1-1024", "name": "a-b_c.3... 
and/or 503 867-5309", "zone": "us-central1-c" diff --git a/tests/cross_agent/test_aws_utilization_data.py b/tests/cross_agent/test_aws_utilization_data.py index 7ff2e623b..807b1a97b 100644 --- a/tests/cross_agent/test_aws_utilization_data.py +++ b/tests/cross_agent/test_aws_utilization_data.py @@ -18,7 +18,7 @@ from newrelic.common.utilization import AWSUtilization from testing_support.mock_http_client import create_client_cls -from testing_support.fixtures import validate_internal_metrics +from testing_support.validators.validate_internal_metrics import validate_internal_metrics CURRENT_DIR = os.path.dirname(os.path.realpath(__file__)) diff --git a/tests/cross_agent/test_azure_utilization_data.py b/tests/cross_agent/test_azure_utilization_data.py index 897fadec0..772a96f98 100644 --- a/tests/cross_agent/test_azure_utilization_data.py +++ b/tests/cross_agent/test_azure_utilization_data.py @@ -18,7 +18,7 @@ from newrelic.common.utilization import AzureUtilization from testing_support.mock_http_client import create_client_cls -from testing_support.fixtures import validate_internal_metrics +from testing_support.validators.validate_internal_metrics import validate_internal_metrics CURRENT_DIR = os.path.dirname(os.path.realpath(__file__)) diff --git a/tests/cross_agent/test_boot_id_utilization_data.py b/tests/cross_agent/test_boot_id_utilization_data.py index 2eaeae670..ea5b26a9e 100644 --- a/tests/cross_agent/test_boot_id_utilization_data.py +++ b/tests/cross_agent/test_boot_id_utilization_data.py @@ -20,7 +20,7 @@ from newrelic.common.system_info import BootIdUtilization -from testing_support.fixtures import validate_internal_metrics +from testing_support.validators.validate_internal_metrics import validate_internal_metrics CURRENT_DIR = os.path.dirname(os.path.realpath(__file__)) diff --git a/tests/cross_agent/test_cat_map.py b/tests/cross_agent/test_cat_map.py index 67c5ab815..6e7ac63d6 100644 --- a/tests/cross_agent/test_cat_map.py +++ b/tests/cross_agent/test_cat_map.py @@ -18,42 +18,58 @@ can be found in test/framework_tornado_r3/test_cat_map.py """ -import webtest -import pytest import json import os +import pytest +import webtest + try: from urllib2 import urlopen # Py2.X except ImportError: - from urllib.request import urlopen # Py3.X - -from newrelic.packages import six + from urllib.request import urlopen # Py3.X + +from testing_support.fixtures import ( + make_cross_agent_headers, + override_application_name, + override_application_settings, + validate_analytics_catmap_data, +) +from testing_support.mock_external_http_server import ( + MockExternalHTTPHResponseHeadersServer, +) +from testing_support.validators.validate_tt_parameters import validate_tt_parameters from newrelic.api.external_trace import ExternalTrace -from newrelic.api.transaction import (get_browser_timing_header, - set_transaction_name, get_browser_timing_footer, set_background_task, - current_transaction) +from newrelic.api.transaction import ( + current_transaction, + get_browser_timing_footer, + get_browser_timing_header, + set_background_task, + set_transaction_name, +) from newrelic.api.wsgi_application import wsgi_application -from newrelic.common.encoding_utils import obfuscate, json_encode - -from testing_support.fixtures import (override_application_settings, - override_application_name, validate_tt_parameters, - make_cross_agent_headers, validate_analytics_catmap_data) -from testing_support.mock_external_http_server import ( - MockExternalHTTPHResponseHeadersServer) +from newrelic.common.encoding_utils 
import json_encode, obfuscate
+from newrelic.packages import six

-ENCODING_KEY = '1234567890123456789012345678901234567890'
+ENCODING_KEY = "1234567890123456789012345678901234567890"
 CURRENT_DIR = os.path.dirname(os.path.realpath(__file__))
-JSON_DIR = os.path.normpath(os.path.join(CURRENT_DIR, 'fixtures'))
+JSON_DIR = os.path.normpath(os.path.join(CURRENT_DIR, "fixtures"))
 OUTBOUD_REQUESTS = {}

-_parameters_list = ["name", "appName", "transactionName", "transactionGuid",
-    "inboundPayload", "outboundRequests", "expectedIntrinsicFields",
-    "nonExpectedIntrinsicFields"]
+_parameters_list = [
+    "name",
+    "appName",
+    "transactionName",
+    "transactionGuid",
+    "inboundPayload",
+    "outboundRequests",
+    "expectedIntrinsicFields",
+    "nonExpectedIntrinsicFields",
+]

-@pytest.fixture(scope='module')
+@pytest.fixture(scope="module")
 def server():
     with MockExternalHTTPHResponseHeadersServer() as _server:
         yield _server
@@ -61,8 +77,8 @@ def server():

 def load_tests():
     result = []
-    path = os.path.join(JSON_DIR, 'cat_map.json')
-    with open(path, 'r') as fh:
+    path = os.path.join(JSON_DIR, "cat_map.json")
+    with open(path, "r") as fh:
         tests = json.load(fh)

     for test in tests:
@@ -77,57 +93,52 @@ def load_tests():

 @wsgi_application()
 def target_wsgi_application(environ, start_response):
-    status = '200 OK'
+    status = "200 OK"

-    txn_name = environ.get('txn')
+    txn_name = environ.get("txn")

     if six.PY2:
-        txn_name = txn_name.decode('UTF-8')
-    txn_name = txn_name.split('/', 3)
+        txn_name = txn_name.decode("UTF-8")
+    txn_name = txn_name.split("/", 3)

-    guid = environ.get('guid')
-    old_cat = environ.get('old_cat') == 'True'
+    guid = environ.get("guid")
+    old_cat = environ.get("old_cat") == "True"
     txn = current_transaction()

     txn.guid = guid
     for req in OUTBOUD_REQUESTS:
         # Change the transaction name before making an outbound call.
-        outgoing_name = req['outboundTxnName'].split('/', 3)
-        if outgoing_name[0] != 'WebTransaction':
+        outgoing_name = req["outboundTxnName"].split("/", 3)
+        if outgoing_name[0] != "WebTransaction":
             set_background_task(True)

         set_transaction_name(outgoing_name[2], group=outgoing_name[1])

-        expected_outbound_header = obfuscate(
-            json_encode(req['expectedOutboundPayload']), ENCODING_KEY)
-        generated_outbound_header = dict(
-            ExternalTrace.generate_request_headers(txn))
+        expected_outbound_header = obfuscate(json_encode(req["expectedOutboundPayload"]), ENCODING_KEY)
+        generated_outbound_header = dict(ExternalTrace.generate_request_headers(txn))

         # A 500 error is returned because 'assert' statements in the wsgi app
         # are ignored.

         if old_cat:
-            if (expected_outbound_header !=
-                    generated_outbound_header['X-NewRelic-Transaction']):
-                status = '500 Outbound Headers Check Failed.'
+            if expected_outbound_header != generated_outbound_header["X-NewRelic-Transaction"]:
+                status = "500 Outbound Headers Check Failed."
         else:
-            if 'X-NewRelic-Transaction' in generated_outbound_header:
-                status = '500 Outbound Headers Check Failed.'
-        r = urlopen(environ['server_url'])
+            if "X-NewRelic-Transaction" in generated_outbound_header:
+                status = "500 Outbound Headers Check Failed."
+        r = urlopen(environ["server_url"])  # nosec B310
         r.read(10)

     # Set the final transaction name.
-    if txn_name[0] != 'WebTransaction':
+    if txn_name[0] != "WebTransaction":
         set_background_task(True)
     set_transaction_name(txn_name[2], group=txn_name[1])

-    text = '<html><head>%s</head><body><p>RESPONSE</p>%s</body></html>'
+    text = "<html><head>%s</head><body><p>RESPONSE</p>%s</body></html>"

-    output = (text % (get_browser_timing_header(),
-            get_browser_timing_footer())).encode('UTF-8')
+    output = (text % (get_browser_timing_header(), get_browser_timing_footer())).encode("UTF-8")

-    response_headers = [('Content-type', 'text/html; charset=utf-8'),
-            ('Content-Length', str(len(output)))]
+    response_headers = [("Content-type", "text/html; charset=utf-8"), ("Content-Length", str(len(output)))]
     start_response(status, response_headers)

     return [output]
@@ -137,26 +148,35 @@ def target_wsgi_application(environ, start_response):

 @pytest.mark.parametrize(_parameters, load_tests())
-@pytest.mark.parametrize('old_cat', (True, False))
-def test_cat_map(name, appName, transactionName, transactionGuid,
-        inboundPayload, outboundRequests, expectedIntrinsicFields,
-        nonExpectedIntrinsicFields, old_cat, server):
+@pytest.mark.parametrize("old_cat", (True, False))
+def test_cat_map(
+    name,
+    appName,
+    transactionName,
+    transactionGuid,
+    inboundPayload,
+    outboundRequests,
+    expectedIntrinsicFields,
+    nonExpectedIntrinsicFields,
+    old_cat,
+    server,
+):
     global OUTBOUD_REQUESTS
     OUTBOUD_REQUESTS = outboundRequests or {}

     _custom_settings = {
-        'cross_process_id': '1#1',
-        'encoding_key': ENCODING_KEY,
-        'trusted_account_ids': [1],
-        'cross_application_tracer.enabled': True,
-        'distributed_tracing.enabled': not old_cat,
-        'transaction_tracer.transaction_threshold': 0.0,
+        "cross_process_id": "1#1",
+        "encoding_key": ENCODING_KEY,
+        "trusted_account_ids": [1],
+        "cross_application_tracer.enabled": True,
+        "distributed_tracing.enabled": not old_cat,
+        "transaction_tracer.transaction_threshold": 0.0,
     }

     if expectedIntrinsicFields and old_cat:
         _external_node_params = {
-            'path_hash': expectedIntrinsicFields['nr.pathHash'],
-            'trip_id': expectedIntrinsicFields['nr.tripId'],
+            "path_hash": expectedIntrinsicFields["nr.pathHash"],
+            "trip_id": expectedIntrinsicFields["nr.tripId"],
         }
     else:
         _external_node_params = []
@@ -167,16 +187,16 @@ def test_cat_map(name, appName, transactionName, transactionGuid,
         expectedIntrinsicFields = {}

     @validate_tt_parameters(required_params=_external_node_params)
-    @validate_analytics_catmap_data(transactionName,
-            expected_attributes=expectedIntrinsicFields,
-            non_expected_attributes=nonExpectedIntrinsicFields)
+    @validate_analytics_catmap_data(
+        transactionName, expected_attributes=expectedIntrinsicFields, non_expected_attributes=nonExpectedIntrinsicFields
+    )
     @override_application_settings(_custom_settings)
     @override_application_name(appName)
     def run_cat_test():

         if six.PY2:
-            txn_name = transactionName.encode('UTF-8')
-            guid = transactionGuid.encode('UTF-8')
+            txn_name = transactionName.encode("UTF-8")
+            guid = transactionGuid.encode("UTF-8")
         else:
             txn_name = transactionName
             guid = transactionGuid
@@ -185,20 +205,26 @@ def run_cat_test():
         # are properly ignoring these headers when the agent is using better
         # cat.

-        headers = make_cross_agent_headers(inboundPayload, ENCODING_KEY, '1#1')
-        response = target_application.get('/', headers=headers,
-                extra_environ={'txn': txn_name, 'guid': guid,
-                    'old_cat': str(old_cat),
-                    'server_url': 'http://localhost:%d' % server.port})
+        headers = make_cross_agent_headers(inboundPayload, ENCODING_KEY, "1#1")
+        response = target_application.get(
+            "/",
+            headers=headers,
+            extra_environ={
+                "txn": txn_name,
+                "guid": guid,
+                "old_cat": str(old_cat),
+                "server_url": "http://localhost:%d" % server.port,
+            },
+        )

         # Validation of analytic data happens in the decorator.
- assert response.status == '200 OK' + assert response.status == "200 OK" content = response.html.html.body.p.string # Validate actual body content as sanity check. - assert content == 'RESPONSE' + assert content == "RESPONSE" run_cat_test() diff --git a/tests/cross_agent/test_collector_hostname.py b/tests/cross_agent/test_collector_hostname.py index d9c65e34b..2ce39a1ec 100644 --- a/tests/cross_agent/test_collector_hostname.py +++ b/tests/cross_agent/test_collector_hostname.py @@ -15,29 +15,28 @@ import json import multiprocessing import os -import pytest import sys import tempfile +import pytest + try: # python 2.x reload except NameError: # python 3.x - from imp import reload + from importlib import reload CURRENT_DIR = os.path.dirname(os.path.realpath(__file__)) -FIXTURE = os.path.normpath(os.path.join(CURRENT_DIR, 'fixtures', - 'collector_hostname.json')) +FIXTURE = os.path.normpath(os.path.join(CURRENT_DIR, "fixtures", "collector_hostname.json")) -_parameters_list = ['config_file_key', 'config_override_host', - 'env_key', 'env_override_host', 'hostname'] -_parameters = ','.join(_parameters_list) +_parameters_list = ["config_file_key", "config_override_host", "env_key", "env_override_host", "hostname"] +_parameters = ",".join(_parameters_list) def _load_tests(): - with open(FIXTURE, 'r') as fh: + with open(FIXTURE, "r") as fh: js = fh.read() return json.loads(js) @@ -48,37 +47,39 @@ def _parametrize_test(test): _tests_json = _load_tests() _collector_hostname_tests = [_parametrize_test(t) for t in _tests_json] -_collector_hostname_ids = [t.get('name', None) for t in _tests_json] +_collector_hostname_ids = [t.get("name", None) for t in _tests_json] -def _test_collector_hostname(config_file_key=None, config_override_host=None, - env_key=None, env_override_host=None, hostname=None, queue=None): +def _test_collector_hostname( + config_file_key=None, config_override_host=None, env_key=None, env_override_host=None, hostname=None, queue=None +): try: - ini_contents = '[newrelic]' + ini_contents = "[newrelic]" - if 'NEW_RELIC_HOST' in os.environ: - del os.environ['NEW_RELIC_HOST'] - if 'NEW_RELIC_LICENSE_KEY' in os.environ: - del os.environ['NEW_RELIC_LICENSE_KEY'] + if "NEW_RELIC_HOST" in os.environ: + del os.environ["NEW_RELIC_HOST"] + if "NEW_RELIC_LICENSE_KEY" in os.environ: + del os.environ["NEW_RELIC_LICENSE_KEY"] if env_override_host: - os.environ['NEW_RELIC_HOST'] = env_override_host + os.environ["NEW_RELIC_HOST"] = env_override_host if env_key: - os.environ['NEW_RELIC_LICENSE_KEY'] = env_key + os.environ["NEW_RELIC_LICENSE_KEY"] = env_key if config_file_key: - ini_contents += '\nlicense_key = %s' % config_file_key + ini_contents += "\nlicense_key = %s" % config_file_key if config_override_host: - ini_contents += '\nhost = %s' % config_override_host + ini_contents += "\nhost = %s" % config_override_host import newrelic.config as config import newrelic.core.config as core_config + reload(core_config) reload(config) ini_file = tempfile.NamedTemporaryFile() - ini_file.write(ini_contents.encode('utf-8')) + ini_file.write(ini_contents.encode("utf-8")) ini_file.seek(0) config.initialize(ini_file.name) @@ -91,13 +92,11 @@ def _test_collector_hostname(config_file_key=None, config_override_host=None, raise if queue: - queue.put('PASS') + queue.put("PASS") -@pytest.mark.parametrize(_parameters, _collector_hostname_tests, - ids=_collector_hostname_ids) -def test_collector_hostname(config_file_key, config_override_host, env_key, - env_override_host, hostname): 
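
The reformatted test_collector_hostname below keeps its one structural quirk: the body runs in a child process, because it rewrites NEW_RELIC_HOST / NEW_RELIC_LICENSE_KEY and reloads the agent config, and those mutations must not leak into tests that run afterwards. A minimal sketch of the isolation pattern using only the standard library (the body here is a stand-in):

    import multiprocessing

    def _isolated_body(queue=None):
        # Mutate os.environ and reload config here; the changes die with the child.
        if queue:
            queue.put("PASS")

    def run_isolated():
        queue = multiprocessing.Queue()
        process = multiprocessing.Process(target=_isolated_body, kwargs={"queue": queue})
        process.start()
        # The parent only checks the child's verdict, never its settings.
        assert queue.get(timeout=2) == "PASS"

    if __name__ == "__main__":
        run_isolated()
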
+@pytest.mark.parametrize(_parameters, _collector_hostname_tests, ids=_collector_hostname_ids) +def test_collector_hostname(config_file_key, config_override_host, env_key, env_override_host, hostname): # We run the actual test in a subprocess because we are editing the # settings we need to connect to the data collector. With the wrong @@ -105,11 +104,18 @@ def test_collector_hostname(config_file_key, config_override_host, env_key, # run after this one. queue = multiprocessing.Queue() - process = multiprocessing.Process(target=_test_collector_hostname, - kwargs={'config_file_key': config_file_key, 'config_override_host': - config_override_host, 'env_key': env_key, 'env_override_host': - env_override_host, 'hostname': hostname, 'queue': queue}) + process = multiprocessing.Process( + target=_test_collector_hostname, + kwargs={ + "config_file_key": config_file_key, + "config_override_host": config_override_host, + "env_key": env_key, + "env_override_host": env_override_host, + "hostname": hostname, + "queue": queue, + }, + ) process.start() result = queue.get(timeout=2) - assert result == 'PASS' + assert result == "PASS" diff --git a/tests/cross_agent/test_distributed_tracing.py b/tests/cross_agent/test_distributed_tracing.py index 2ec246275..060fe8a86 100644 --- a/tests/cross_agent/test_distributed_tracing.py +++ b/tests/cross_agent/test_distributed_tracing.py @@ -14,53 +14,70 @@ import json import os + import pytest import webtest +from testing_support.fixtures import override_application_settings, validate_attributes +from testing_support.validators.validate_error_event_attributes import ( + validate_error_event_attributes, +) +from testing_support.validators.validate_span_events import validate_span_events +from testing_support.validators.validate_transaction_event_attributes import ( + validate_transaction_event_attributes, +) +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) from newrelic.api.transaction import current_transaction from newrelic.api.wsgi_application import wsgi_application from newrelic.common.encoding_utils import DistributedTracePayload from newrelic.common.object_wrapper import transient_function_wrapper -from testing_support.fixtures import (override_application_settings, - validate_transaction_metrics, validate_transaction_event_attributes, - validate_error_event_attributes, validate_attributes) -from testing_support.validators.validate_span_events import ( - validate_span_events) - CURRENT_DIR = os.path.dirname(os.path.realpath(__file__)) -JSON_DIR = os.path.normpath(os.path.join(CURRENT_DIR, 'fixtures', - 'distributed_tracing')) - -_parameters_list = ['account_id', 'comment', 'expected_metrics', - 'force_sampled_true', 'inbound_payloads', 'intrinsics', - 'major_version', 'minor_version', 'outbound_payloads', - 'raises_exception', 'span_events_enabled', 'test_name', - 'transport_type', 'trusted_account_key', 'web_transaction'] -_parameters = ','.join(_parameters_list) +JSON_DIR = os.path.normpath(os.path.join(CURRENT_DIR, "fixtures", "distributed_tracing")) + +_parameters_list = [ + "account_id", + "comment", + "expected_metrics", + "force_sampled_true", + "inbound_payloads", + "intrinsics", + "major_version", + "minor_version", + "outbound_payloads", + "raises_exception", + "span_events_enabled", + "test_name", + "transport_type", + "trusted_account_key", + "web_transaction", +] +_parameters = ",".join(_parameters_list) def load_tests(): result = [] - path = os.path.join(JSON_DIR, 'distributed_tracing.json') 
- with open(path, 'r') as fh: + path = os.path.join(JSON_DIR, "distributed_tracing.json") + with open(path, "r") as fh: tests = json.load(fh) for test in tests: values = (test.get(param, None) for param in _parameters_list) - param = pytest.param(*values, id=test.get('test_name')) + param = pytest.param(*values, id=test.get("test_name")) result.append(param) return result def override_compute_sampled(override): - @transient_function_wrapper('newrelic.core.adaptive_sampler', - 'AdaptiveSampler.compute_sampled') + @transient_function_wrapper("newrelic.core.adaptive_sampler", "AdaptiveSampler.compute_sampled") def _override_compute_sampled(wrapped, instance, args, kwargs): if override: return True return wrapped(*args, **kwargs) + return _override_compute_sampled @@ -69,58 +86,54 @@ def assert_payload(payload, payload_assertions, major_version, minor_version): # flatten payload so it matches the test: # payload['d']['ac'] -> payload['d.ac'] - d = payload.pop('d') + d = payload.pop("d") for key, value in d.items(): - payload['d.%s' % key] = value + payload["d.%s" % key] = value - for expected in payload_assertions.get('expected', []): + for expected in payload_assertions.get("expected", []): assert expected in payload - for unexpected in payload_assertions.get('unexpected', []): + for unexpected in payload_assertions.get("unexpected", []): assert unexpected not in payload - for key, value in payload_assertions.get('exact', {}).items(): + for key, value in payload_assertions.get("exact", {}).items(): assert key in payload if isinstance(value, list): value = tuple(value) assert payload[key] == value - assert payload['v'][0] == major_version - assert payload['v'][1] == minor_version + assert payload["v"][0] == major_version + assert payload["v"][1] == minor_version @wsgi_application() def target_wsgi_application(environ, start_response): - status = '200 OK' - output = b'hello world' - response_headers = [('Content-type', 'text/html; charset=utf-8'), - ('Content-Length', str(len(output)))] + status = "200 OK" + output = b"hello world" + response_headers = [("Content-type", "text/html; charset=utf-8"), ("Content-Length", str(len(output)))] txn = current_transaction() - txn.set_transaction_name(test_settings['test_name']) + txn.set_transaction_name(test_settings["test_name"]) - if not test_settings['web_transaction']: + if not test_settings["web_transaction"]: txn.background_task = True - if test_settings['raises_exception']: + if test_settings["raises_exception"]: try: 1 / 0 except ZeroDivisionError: txn.notice_error() - extra_inbound_payloads = test_settings['extra_inbound_payloads'] + extra_inbound_payloads = test_settings["extra_inbound_payloads"] for payload, expected_result in extra_inbound_payloads: - result = txn.accept_distributed_trace_payload(payload, - test_settings['transport_type']) + result = txn.accept_distributed_trace_payload(payload, test_settings["transport_type"]) assert result is expected_result - outbound_payloads = test_settings['outbound_payloads'] + outbound_payloads = test_settings["outbound_payloads"] if outbound_payloads: for payload_assertions in outbound_payloads: payload = txn._create_distributed_trace_payload() - assert_payload(payload, payload_assertions, - test_settings['major_version'], - test_settings['minor_version']) + assert_payload(payload, payload_assertions, test_settings["major_version"], test_settings["minor_version"]) start_response(status, response_headers) return [output] @@ -130,14 +143,26 @@ def target_wsgi_application(environ, start_response): 
@pytest.mark.parametrize(_parameters, load_tests()) -def test_distributed_tracing(account_id, comment, expected_metrics, - force_sampled_true, inbound_payloads, intrinsics, major_version, - minor_version, outbound_payloads, raises_exception, - span_events_enabled, test_name, transport_type, trusted_account_key, - web_transaction): +def test_distributed_tracing( + account_id, + comment, + expected_metrics, + force_sampled_true, + inbound_payloads, + intrinsics, + major_version, + minor_version, + outbound_payloads, + raises_exception, + span_events_enabled, + test_name, + transport_type, + trusted_account_key, + web_transaction, +): extra_inbound_payloads = [] - if transport_type != 'HTTP': + if transport_type != "HTTP": # Since wsgi_application calls accept_distributed_trace_payload # automatically with transport_type='HTTP', we must defer this call # until we can specify the transport type. @@ -152,78 +177,68 @@ def test_distributed_tracing(account_id, comment, expected_metrics, global test_settings test_settings = { - 'test_name': test_name, - 'web_transaction': web_transaction, - 'raises_exception': raises_exception, - 'extra_inbound_payloads': extra_inbound_payloads, - 'outbound_payloads': outbound_payloads, - 'transport_type': transport_type, - 'major_version': major_version, - 'minor_version': minor_version, + "test_name": test_name, + "web_transaction": web_transaction, + "raises_exception": raises_exception, + "extra_inbound_payloads": extra_inbound_payloads, + "outbound_payloads": outbound_payloads, + "transport_type": transport_type, + "major_version": major_version, + "minor_version": minor_version, } override_settings = { - 'distributed_tracing.enabled': True, - 'span_events.enabled': span_events_enabled, - 'account_id': account_id, - 'trusted_account_key': trusted_account_key, + "distributed_tracing.enabled": True, + "span_events.enabled": span_events_enabled, + "account_id": account_id, + "trusted_account_key": trusted_account_key, } - common_required = intrinsics['common']['expected'] - common_forgone = intrinsics['common']['unexpected'] - common_exact = intrinsics['common'].get('exact', {}) - - txn_intrinsics = intrinsics.get('Transaction', {}) - txn_event_required = {'agent': [], 'user': [], - 'intrinsic': txn_intrinsics.get('expected', [])} - txn_event_required['intrinsic'].extend(common_required) - txn_event_forgone = {'agent': [], 'user': [], - 'intrinsic': txn_intrinsics.get('unexpected', [])} - txn_event_forgone['intrinsic'].extend(common_forgone) - txn_event_exact = {'agent': {}, 'user': {}, - 'intrinsic': txn_intrinsics.get('exact', {})} - txn_event_exact['intrinsic'].update(common_exact) + common_required = intrinsics["common"]["expected"] + common_forgone = intrinsics["common"]["unexpected"] + common_exact = intrinsics["common"].get("exact", {}) + + txn_intrinsics = intrinsics.get("Transaction", {}) + txn_event_required = {"agent": [], "user": [], "intrinsic": txn_intrinsics.get("expected", [])} + txn_event_required["intrinsic"].extend(common_required) + txn_event_forgone = {"agent": [], "user": [], "intrinsic": txn_intrinsics.get("unexpected", [])} + txn_event_forgone["intrinsic"].extend(common_forgone) + txn_event_exact = {"agent": {}, "user": {}, "intrinsic": txn_intrinsics.get("exact", {})} + txn_event_exact["intrinsic"].update(common_exact) headers = {} if inbound_payloads: payload = DistributedTracePayload(inbound_payloads[0]) - headers['newrelic'] = payload.http_safe() - - @validate_transaction_metrics(test_name, - rollup_metrics=expected_metrics, - 
background_task=not web_transaction) - @validate_transaction_event_attributes( - txn_event_required, txn_event_forgone, txn_event_exact) - @validate_attributes('intrinsic', common_required, common_forgone) + headers["newrelic"] = payload.http_safe() + + @validate_transaction_metrics(test_name, rollup_metrics=expected_metrics, background_task=not web_transaction) + @validate_transaction_event_attributes(txn_event_required, txn_event_forgone, txn_event_exact) + @validate_attributes("intrinsic", common_required, common_forgone) @override_compute_sampled(force_sampled_true) @override_application_settings(override_settings) def _test(): - response = test_application.get('/', headers=headers) - assert 'X-NewRelic-App-Data' not in response.headers + response = test_application.get("/", headers=headers) + assert "X-NewRelic-App-Data" not in response.headers - if 'Span' in intrinsics: - span_intrinsics = intrinsics.get('Span') - span_expected = span_intrinsics.get('expected', []) + if "Span" in intrinsics: + span_intrinsics = intrinsics.get("Span") + span_expected = span_intrinsics.get("expected", []) span_expected.extend(common_required) - span_unexpected = span_intrinsics.get('unexpected', []) + span_unexpected = span_intrinsics.get("unexpected", []) span_unexpected.extend(common_forgone) - span_exact = span_intrinsics.get('exact', {}) + span_exact = span_intrinsics.get("exact", {}) span_exact.update(common_exact) - _test = validate_span_events(exact_intrinsics=span_exact, - expected_intrinsics=span_expected, - unexpected_intrinsics=span_unexpected)(_test) + _test = validate_span_events( + exact_intrinsics=span_exact, expected_intrinsics=span_expected, unexpected_intrinsics=span_unexpected + )(_test) elif not span_events_enabled: _test = validate_span_events(count=0)(_test) if raises_exception: - error_event_required = {'agent': [], 'user': [], - 'intrinsic': common_required} - error_event_forgone = {'agent': [], 'user': [], - 'intrinsic': common_forgone} - error_event_exact = {'agent': {}, 'user': {}, - 'intrinsic': common_exact} - _test = validate_error_event_attributes(error_event_required, - error_event_forgone, error_event_exact)(_test) + error_event_required = {"agent": [], "user": [], "intrinsic": common_required} + error_event_forgone = {"agent": [], "user": [], "intrinsic": common_forgone} + error_event_exact = {"agent": {}, "user": {}, "intrinsic": common_exact} + _test = validate_error_event_attributes(error_event_required, error_event_forgone, error_event_exact)(_test) _test() diff --git a/tests/cross_agent/test_gcp_utilization_data.py b/tests/cross_agent/test_gcp_utilization_data.py index 1d77ed0a2..7bbf6ce0a 100644 --- a/tests/cross_agent/test_gcp_utilization_data.py +++ b/tests/cross_agent/test_gcp_utilization_data.py @@ -18,7 +18,7 @@ from newrelic.common.utilization import GCPUtilization from testing_support.mock_http_client import create_client_cls -from testing_support.fixtures import validate_internal_metrics +from testing_support.validators.validate_internal_metrics import validate_internal_metrics CURRENT_DIR = os.path.dirname(os.path.realpath(__file__)) diff --git a/tests/cross_agent/test_lambda_event_source.py b/tests/cross_agent/test_lambda_event_source.py index 3a90aec58..511294cf6 100644 --- a/tests/cross_agent/test_lambda_event_source.py +++ b/tests/cross_agent/test_lambda_event_source.py @@ -17,8 +17,8 @@ import pytest from newrelic.api.lambda_handler import lambda_handler -from testing_support.fixtures import (override_application_settings, - 
validate_transaction_event_attributes)
+from testing_support.fixtures import override_application_settings
+from testing_support.validators.validate_transaction_event_attributes import validate_transaction_event_attributes

 CURRENT_DIR = os.path.dirname(os.path.realpath(__file__))
 FIXTURE_DIR = os.path.normpath(os.path.join(CURRENT_DIR, 'fixtures'))
diff --git a/tests/cross_agent/test_pcf_utilization_data.py b/tests/cross_agent/test_pcf_utilization_data.py
index ce86bfb9f..28b56f759 100644
--- a/tests/cross_agent/test_pcf_utilization_data.py
+++ b/tests/cross_agent/test_pcf_utilization_data.py
@@ -18,7 +18,7 @@

 from newrelic.common.utilization import PCFUtilization

-from testing_support.fixtures import validate_internal_metrics
+from testing_support.validators.validate_internal_metrics import validate_internal_metrics

 CURRENT_DIR = os.path.dirname(os.path.realpath(__file__))

diff --git a/tests/cross_agent/test_rum_client_config.py b/tests/cross_agent/test_rum_client_config.py
index d60cff777..c2a4a465f 100644
--- a/tests/cross_agent/test_rum_client_config.py
+++ b/tests/cross_agent/test_rum_client_config.py
@@ -14,94 +14,118 @@

 import json
 import os
+
 import pytest
 import webtest
+from testing_support.fixtures import override_application_settings

-from newrelic.api.transaction import (set_transaction_name,
-    add_custom_parameter, get_browser_timing_footer)
+from newrelic.api.transaction import (
+    add_custom_attribute,
+    get_browser_timing_footer,
+    set_transaction_name,
+)
 from newrelic.api.wsgi_application import wsgi_application

-from testing_support.fixtures import override_application_settings
-

 def _load_tests():
-    fixture = os.path.join(os.curdir, 'fixtures', 'rum_client_config.json')
-    with open(fixture, 'r') as fh:
+    fixture = os.path.join(os.curdir, "fixtures", "rum_client_config.json")
+    with open(fixture, "r") as fh:
         js = fh.read()
     return json.loads(js)

-fields = ['testname', 'apptime_milliseconds', 'queuetime_milliseconds',
-    'browser_monitoring.attributes.enabled', 'transaction_name',
-    'license_key', 'connect_reply', 'user_attributes', 'expected']
+
+fields = [
+    "testname",
+    "apptime_milliseconds",
+    "queuetime_milliseconds",
+    "browser_monitoring.attributes.enabled",
+    "transaction_name",
+    "license_key",
+    "connect_reply",
+    "user_attributes",
+    "expected",
+]

 # Replace . as not a valid character in python argument names
-field_names = ','.join([f.replace('.', '_') for f in fields])
+field_names = ",".join([f.replace(".", "_") for f in fields])
+

 def _parametrize_test(test):
     return tuple([test.get(f, None) for f in fields])

+
 _rum_tests = [_parametrize_test(t) for t in _load_tests()]

+
 @wsgi_application()
 def target_wsgi_application(environ, start_response):
-    status = '200 OK'
+    status = "200 OK"

-    txn_name = environ.get('txn_name')
-    set_transaction_name(txn_name, group='')
+    txn_name = environ.get("txn_name")
+    set_transaction_name(txn_name, group="")

-    user_attrs = json.loads(environ.get('user_attrs'))
+    user_attrs = json.loads(environ.get("user_attrs"))
     for key, value in user_attrs.items():
-        add_custom_parameter(key, value)
+        add_custom_attribute(key, value)

-    text = '<html><head>%s</head><body><p>RESPONSE</p></body></html>'
+    text = "<html><head>%s</head><body><p>RESPONSE</p></body></html>"

-    output = (text % get_browser_timing_footer()).encode('UTF-8')
+    output = (text % get_browser_timing_footer()).encode("UTF-8")

-    response_headers = [('Content-Type', 'text/html; charset=utf-8'),
-        ('Content-Length', str(len(output)))]
+    response_headers = [("Content-Type", "text/html; charset=utf-8"), ("Content-Length", str(len(output)))]
     start_response(status, response_headers)

     return [output]

+
 target_application = webtest.TestApp(target_wsgi_application)

+
 @pytest.mark.parametrize(field_names, _rum_tests)
-def test_browser_montioring(testname, apptime_milliseconds, queuetime_milliseconds,
-    browser_monitoring_attributes_enabled, transaction_name,
-    license_key, connect_reply, user_attributes, expected):
+def test_browser_montioring(
+    testname,
+    apptime_milliseconds,
+    queuetime_milliseconds,
+    browser_monitoring_attributes_enabled,
+    transaction_name,
+    license_key,
+    connect_reply,
+    user_attributes,
+    expected,
+):

     settings = {
-        'browser_monitoring.attributes.enabled': browser_monitoring_attributes_enabled,
-        'license_key': license_key,
-        'js_agent_loader': u'',
-    }
+        "browser_monitoring.attributes.enabled": browser_monitoring_attributes_enabled,
+        "license_key": license_key,
+        "js_agent_loader": "",
+    }
     settings.update(connect_reply)

     @override_application_settings(settings)
     def run_browser_data_test():

-        response = target_application.get('/',
-                extra_environ={'txn_name': str(transaction_name),
-                    'user_attrs': json.dumps(user_attributes)})
+        response = target_application.get(
+            "/", extra_environ={"txn_name": str(transaction_name), "user_attrs": json.dumps(user_attributes)}
+        )

         # We actually put the "footer" in the header, the first script is the
         # agent "header", the second one is where the data lives, hence the [1].
-        footer = response.html.html.head.find_all('script')[1]
-        footer_data = json.loads(footer.string.split('NREUM.info=')[1])
+        footer = response.html.html.head.find_all("script")[1]
+        footer_data = json.loads(footer.string.split("NREUM.info=")[1])

         # Not feasible to test the time metric values in testing
-        expected.pop('queueTime')
-        expected.pop('applicationTime')
-        assert footer_data['applicationTime'] >= 0
-        assert footer_data['queueTime'] >= 0
+        expected.pop("queueTime")
+        expected.pop("applicationTime")
+        assert footer_data["applicationTime"] >= 0
+        assert footer_data["queueTime"] >= 0

         # Python always prepends stuff to the transaction name, so this
         # doesn't match the obscured value.
-        expected.pop('transactionName')
+        expected.pop("transactionName")

         # Check that all other values are correct
@@ -112,7 +136,7 @@ def run_browser_data_test():
         # don't omit it, so we need to special case 'atts' when we compare
         # to 'expected'.

-            if key == 'atts' and value == '':
+            if key == "atts" and value == "":
                 assert key not in footer_data
             else:
                 assert footer_data[key] == value
diff --git a/tests/cross_agent/test_utilization_configs.py b/tests/cross_agent/test_utilization_configs.py
index 810631ee6..4a4adb485 100644
--- a/tests/cross_agent/test_utilization_configs.py
+++ b/tests/cross_agent/test_utilization_configs.py
@@ -14,36 +14,36 @@

 import json
 import os
-import pytest
 import sys
 import tempfile

+import pytest

 # NOTE: the test_utilization_settings_from_env_vars test mocks several of the
 # methods in newrelic.core.data_collector and does not put them back!
from testing_support.mock_http_client import create_client_cls -from newrelic.core.agent_protocol import AgentProtocol -from newrelic.common.system_info import BootIdUtilization -from newrelic.common.utilization import (CommonUtilization) -from newrelic.common.object_wrapper import (function_wrapper) + import newrelic.core.config +from newrelic.common.object_wrapper import function_wrapper +from newrelic.common.system_info import BootIdUtilization +from newrelic.common.utilization import CommonUtilization +from newrelic.core.agent_protocol import AgentProtocol try: # python 2.x reload except NameError: # python 3.x - from imp import reload + from importlib import reload INITIAL_ENV = os.environ CURRENT_DIR = os.path.dirname(os.path.realpath(__file__)) -FIXTURE = os.path.normpath(os.path.join( - CURRENT_DIR, 'fixtures', 'utilization', 'utilization_json.json')) +FIXTURE = os.path.normpath(os.path.join(CURRENT_DIR, "fixtures", "utilization", "utilization_json.json")) def _load_tests(): - with open(FIXTURE, 'r') as fh: + with open(FIXTURE, "r") as fh: js = fh.read() return json.loads(js) @@ -51,24 +51,28 @@ def _load_tests(): def _mock_logical_processor_count(cnt): def logical_processor_count(): return cnt + return logical_processor_count def _mock_total_physical_memory(mem): def total_physical_memory(): return mem + return total_physical_memory def _mock_gethostname(name): def gethostname(*args, **kwargs): return name + return gethostname def _mock_getips(ip_addresses): def getips(*args, **kwargs): return ip_addresses + return getips @@ -100,26 +104,32 @@ def __exit__(self, *args, **kwargs): def _get_response_body_for_test(test): - if test.get('input_aws_id'): - return json.dumps({ - 'instanceId': test.get('input_aws_id'), - 'instanceType': test.get('input_aws_type'), - 'availabilityZone': test.get('input_aws_zone'), - }).encode('utf8') - if test.get('input_azure_id'): - return json.dumps({ - 'location': test.get('input_azure_location'), - 'name': test.get('input_azure_name'), - 'vmId': test.get('input_azure_id'), - 'vmSize': test.get('input_azure_size'), - }).encode('utf8') - if test.get('input_gcp_id'): - return json.dumps({ - 'id': test.get('input_gcp_id'), - 'machineType': test.get('input_gcp_type'), - 'name': test.get('input_gcp_name'), - 'zone': test.get('input_gcp_zone'), - }).encode('utf8') + if test.get("input_aws_id"): + return json.dumps( + { + "instanceId": test.get("input_aws_id"), + "instanceType": test.get("input_aws_type"), + "availabilityZone": test.get("input_aws_zone"), + } + ).encode("utf8") + if test.get("input_azure_id"): + return json.dumps( + { + "location": test.get("input_azure_location"), + "name": test.get("input_azure_name"), + "vmId": test.get("input_azure_id"), + "vmSize": test.get("input_azure_size"), + } + ).encode("utf8") + if test.get("input_gcp_id"): + return json.dumps( + { + "id": test.get("input_gcp_id"), + "machineType": test.get("input_gcp_type"), + "name": test.get("input_gcp_name"), + "zone": test.get("input_gcp_zone"), + } + ).encode("utf8") def patch_boot_id_file(test): @@ -128,16 +138,16 @@ def _patch_boot_id_file(wrapped, instance, args, kwargs): boot_id_file = None initial_sys_platform = sys.platform - if test.get('input_boot_id'): + if test.get("input_boot_id"): boot_id_file = tempfile.NamedTemporaryFile() - boot_id_file.write(test.get('input_boot_id')) + boot_id_file.write(test.get("input_boot_id")) boot_id_file.seek(0) BootIdUtilization.METADATA_URL = boot_id_file.name - sys.platform = 'linux-mock-testing' # ensure boot_id is gathered + 
sys.platform = "linux-mock-testing" # ensure boot_id is gathered else: # do not gather boot_id at all, this will ensure there is nothing # extra in the gathered utilizations data - sys.platform = 'not-linux' + sys.platform = "not-linux" try: return wrapped(*args, **kwargs) @@ -153,38 +163,36 @@ def patch_system_info(test, monkeypatch): def _patch_system_info(wrapped, instance, args, kwargs): sys_info = newrelic.common.system_info - monkeypatch.setattr(sys_info, "logical_processor_count", - _mock_logical_processor_count( - test.get('input_logical_processors'))) - monkeypatch.setattr(sys_info, "total_physical_memory", - _mock_total_physical_memory( - test.get('input_total_ram_mib'))) - monkeypatch.setattr(sys_info, "gethostname", - _mock_gethostname( - test.get('input_hostname'))) - monkeypatch.setattr(sys_info, "getips", - _mock_getips( - test.get('input_ip_address'))) + monkeypatch.setattr( + sys_info, "logical_processor_count", _mock_logical_processor_count(test.get("input_logical_processors")) + ) + monkeypatch.setattr( + sys_info, "total_physical_memory", _mock_total_physical_memory(test.get("input_total_ram_mib")) + ) + monkeypatch.setattr(sys_info, "gethostname", _mock_gethostname(test.get("input_hostname"))) + monkeypatch.setattr(sys_info, "getips", _mock_getips(test.get("input_ip_address"))) return wrapped(*args, **kwargs) return _patch_system_info -@pytest.mark.parametrize('test', _load_tests()) +@pytest.mark.parametrize("test", _load_tests()) def test_utilization_settings(test, monkeypatch): - env = test.get('input_environment_variables', {}) + env = test.get("input_environment_variables", {}) - if test.get('input_pcf_guid'): - env.update({ - 'CF_INSTANCE_GUID': test.get('input_pcf_guid'), - 'CF_INSTANCE_IP': test.get('input_pcf_ip'), - 'MEMORY_LIMIT': test.get('input_pcf_mem_limit'), - }) + if test.get("input_pcf_guid"): + env.update( + { + "CF_INSTANCE_GUID": test.get("input_pcf_guid"), + "CF_INSTANCE_IP": test.get("input_pcf_ip"), + "MEMORY_LIMIT": test.get("input_pcf_mem_limit"), + } + ) for key, val in env.items(): - monkeypatch.setenv(key, val) + monkeypatch.setenv(key, str(val)) @patch_boot_id_file(test) @patch_system_info(test, monkeypatch) @@ -199,10 +207,9 @@ def _test_utilization_data(): # gathered utilization data monkeypatch.setattr(settings.utilization, "detect_docker", False) - local_config, = AgentProtocol._connect_payload( - '', [], [], settings) - util_output = local_config['utilization'] - expected_output = test['expected_output_json'] + (local_config,) = AgentProtocol._connect_payload("", [], [], settings) + util_output = local_config["utilization"] + expected_output = test["expected_output_json"] # The agent does not record full_hostname and it's not required expected_output.pop("full_hostname") diff --git a/tests/cross_agent/test_w3c_trace_context.py b/tests/cross_agent/test_w3c_trace_context.py index 479d799bc..893274ce4 100644 --- a/tests/cross_agent/test_w3c_trace_context.py +++ b/tests/cross_agent/test_w3c_trace_context.py @@ -14,87 +14,105 @@ import json import os + import pytest import webtest -from newrelic.packages import six - -from newrelic.api.transaction import current_transaction +from testing_support.fixtures import override_application_settings, validate_attributes +from testing_support.validators.validate_span_events import validate_span_events +from testing_support.validators.validate_transaction_event_attributes import ( + validate_transaction_event_attributes, +) +from testing_support.validators.validate_transaction_metrics import ( + 
validate_transaction_metrics, +) + +from newrelic.api.transaction import ( + accept_distributed_trace_headers, + current_transaction, + insert_distributed_trace_headers, +) from newrelic.api.wsgi_application import wsgi_application -from newrelic.common.object_wrapper import transient_function_wrapper -from testing_support.validators.validate_span_events import ( - validate_span_events) -from testing_support.fixtures import (override_application_settings, - validate_transaction_metrics, validate_transaction_event_attributes, - validate_attributes) from newrelic.common.encoding_utils import W3CTraceState +from newrelic.common.object_wrapper import transient_function_wrapper +from newrelic.packages import six CURRENT_DIR = os.path.dirname(os.path.realpath(__file__)) -JSON_DIR = os.path.normpath(os.path.join(CURRENT_DIR, 'fixtures', - 'distributed_tracing')) - -_parameters_list = ('test_name', 'trusted_account_key', 'account_id', - 'web_transaction', 'raises_exception', 'force_sampled_true', - 'span_events_enabled', 'transport_type', 'inbound_headers', - 'outbound_payloads', 'intrinsics', 'expected_metrics') - -_parameters = ','.join(_parameters_list) +JSON_DIR = os.path.normpath(os.path.join(CURRENT_DIR, "fixtures", "distributed_tracing")) + +_parameters_list = ( + "test_name", + "trusted_account_key", + "account_id", + "web_transaction", + "raises_exception", + "force_sampled_true", + "span_events_enabled", + "transport_type", + "inbound_headers", + "outbound_payloads", + "intrinsics", + "expected_metrics", +) + +_parameters = ",".join(_parameters_list) XFAIL_TESTS = [ - 'spans_disabled_root', - 'missing_traceparent', - 'missing_traceparent_and_tracestate', - 'w3c_and_newrelc_headers_present_error_parsing_traceparent' + "spans_disabled_root", + "missing_traceparent", + "missing_traceparent_and_tracestate", + "w3c_and_newrelc_headers_present_error_parsing_traceparent", ] + def load_tests(): result = [] - path = os.path.join(JSON_DIR, 'trace_context.json') - with open(path, 'r') as fh: + path = os.path.join(JSON_DIR, "trace_context.json") + with open(path, "r") as fh: tests = json.load(fh) for test in tests: values = (test.get(param, None) for param in _parameters_list) - param = pytest.param(*values, id=test.get('test_name')) + param = pytest.param(*values, id=test.get("test_name")) result.append(param) return result ATTR_MAP = { - 'traceparent.version': 0, - 'traceparent.trace_id': 1, - 'traceparent.parent_id': 2, - 'traceparent.trace_flags': 3, - 'tracestate.version': 0, - 'tracestate.parent_type': 1, - 'tracestate.parent_account_id': 2, - 'tracestate.parent_application_id': 3, - 'tracestate.span_id': 4, - 'tracestate.transaction_id': 5, - 'tracestate.sampled': 6, - 'tracestate.priority': 7, - 'tracestate.timestamp': 8, - 'tracestate.tenant_id': None, + "traceparent.version": 0, + "traceparent.trace_id": 1, + "traceparent.parent_id": 2, + "traceparent.trace_flags": 3, + "tracestate.version": 0, + "tracestate.parent_type": 1, + "tracestate.parent_account_id": 2, + "tracestate.parent_application_id": 3, + "tracestate.span_id": 4, + "tracestate.transaction_id": 5, + "tracestate.sampled": 6, + "tracestate.priority": 7, + "tracestate.timestamp": 8, + "tracestate.tenant_id": None, } def validate_outbound_payload(actual, expected, trusted_account_key): - traceparent = '' - tracestate = '' + traceparent = "" + tracestate = "" for key, value in actual: - if key == 'traceparent': - traceparent = value.split('-') - elif key == 'tracestate': + if key == "traceparent": + traceparent = value.split("-") 
+ elif key == "tracestate": vendors = W3CTraceState.decode(value) - nr_entry = vendors.pop(trusted_account_key + '@nr', '') - tracestate = nr_entry.split('-') - exact_values = expected.get('exact', {}) - expected_attrs = expected.get('expected', []) - unexpected_attrs = expected.get('unexpected', []) - expected_vendors = expected.get('vendors', []) + nr_entry = vendors.pop(trusted_account_key + "@nr", "") + tracestate = nr_entry.split("-") + exact_values = expected.get("exact", {}) + expected_attrs = expected.get("expected", []) + unexpected_attrs = expected.get("unexpected", []) + expected_vendors = expected.get("vendors", []) for key, value in exact_values.items(): - header = traceparent if key.startswith('traceparent.') else tracestate + header = traceparent if key.startswith("traceparent.") else tracestate attr = ATTR_MAP[key] if attr is not None: if isinstance(value, bool): @@ -105,13 +123,13 @@ def validate_outbound_payload(actual, expected, trusted_account_key): assert header[attr] == str(value) for key in expected_attrs: - header = traceparent if key.startswith('traceparent.') else tracestate + header = traceparent if key.startswith("traceparent.") else tracestate attr = ATTR_MAP[key] if attr is not None: assert header[attr], key for key in unexpected_attrs: - header = traceparent if key.startswith('traceparent.') else tracestate + header = traceparent if key.startswith("traceparent.") else tracestate attr = ATTR_MAP[key] if attr is not None: assert not header[attr], key @@ -124,127 +142,129 @@ def validate_outbound_payload(actual, expected, trusted_account_key): def target_wsgi_application(environ, start_response): transaction = current_transaction() - if not environ['.web_transaction']: + if not environ[".web_transaction"]: transaction.background_task = True - if environ['.raises_exception']: + if environ[".raises_exception"]: try: raise ValueError("oops") except: transaction.notice_error() - if '.inbound_headers' in environ: - transaction.accept_distributed_trace_headers( - environ['.inbound_headers'], - transport_type=environ['.transport_type'], + if ".inbound_headers" in environ: + accept_distributed_trace_headers( + environ[".inbound_headers"], + transport_type=environ[".transport_type"], ) payloads = [] - for _ in range(environ['.outbound_calls']): + for _ in range(environ[".outbound_calls"]): payloads.append([]) - transaction.insert_distributed_trace_headers(payloads[-1]) + insert_distributed_trace_headers(payloads[-1]) - start_response('200 OK', [('Content-Type', 'application/json')]) - return [json.dumps(payloads).encode('utf-8')] + start_response("200 OK", [("Content-Type", "application/json")]) + return [json.dumps(payloads).encode("utf-8")] test_application = webtest.TestApp(target_wsgi_application) def override_compute_sampled(override): - @transient_function_wrapper('newrelic.core.adaptive_sampler', - 'AdaptiveSampler.compute_sampled') + @transient_function_wrapper("newrelic.core.adaptive_sampler", "AdaptiveSampler.compute_sampled") def _override_compute_sampled(wrapped, instance, args, kwargs): if override: return True return wrapped(*args, **kwargs) + return _override_compute_sampled @pytest.mark.parametrize(_parameters, load_tests()) -def test_trace_context(test_name, trusted_account_key, account_id, - web_transaction, raises_exception, force_sampled_true, - span_events_enabled, transport_type, inbound_headers, - outbound_payloads, intrinsics, expected_metrics): - +def test_trace_context( + test_name, + trusted_account_key, + account_id, + web_transaction, + 
raises_exception, + force_sampled_true, + span_events_enabled, + transport_type, + inbound_headers, + outbound_payloads, + intrinsics, + expected_metrics, +): if test_name in XFAIL_TESTS: pytest.xfail("Waiting on cross agent tests update.") # Prepare assertions if not intrinsics: intrinsics = {} - common = intrinsics.get('common', {}) - common_required = common.get('expected', []) - common_forgone = common.get('unexpected', []) - common_exact = common.get('exact', {}) - - txn_intrinsics = intrinsics.get('Transaction', {}) - txn_event_required = {'agent': [], 'user': [], - 'intrinsic': txn_intrinsics.get('expected', [])} - txn_event_required['intrinsic'].extend(common_required) - txn_event_forgone = {'agent': [], 'user': [], - 'intrinsic': txn_intrinsics.get('unexpected', [])} - txn_event_forgone['intrinsic'].extend(common_forgone) - txn_event_exact = {'agent': {}, 'user': {}, - 'intrinsic': txn_intrinsics.get('exact', {})} - txn_event_exact['intrinsic'].update(common_exact) + common = intrinsics.get("common", {}) + common_required = common.get("expected", []) + common_forgone = common.get("unexpected", []) + common_exact = common.get("exact", {}) + + txn_intrinsics = intrinsics.get("Transaction", {}) + txn_event_required = {"agent": [], "user": [], "intrinsic": txn_intrinsics.get("expected", [])} + txn_event_required["intrinsic"].extend(common_required) + txn_event_forgone = {"agent": [], "user": [], "intrinsic": txn_intrinsics.get("unexpected", [])} + txn_event_forgone["intrinsic"].extend(common_forgone) + txn_event_exact = {"agent": {}, "user": {}, "intrinsic": txn_intrinsics.get("exact", {})} + txn_event_exact["intrinsic"].update(common_exact) override_settings = { - 'distributed_tracing.enabled': True, - 'span_events.enabled': span_events_enabled, - 'account_id': account_id, - 'trusted_account_key': trusted_account_key, + "distributed_tracing.enabled": True, + "span_events.enabled": span_events_enabled, + "account_id": account_id, + "trusted_account_key": trusted_account_key, } extra_environ = { - '.web_transaction': web_transaction, - '.raises_exception': raises_exception, - '.transport_type': transport_type, - '.outbound_calls': outbound_payloads and len(outbound_payloads) or 0, + ".web_transaction": web_transaction, + ".raises_exception": raises_exception, + ".transport_type": transport_type, + ".outbound_calls": outbound_payloads and len(outbound_payloads) or 0, } inbound_headers = inbound_headers and inbound_headers[0] or None - if transport_type != 'HTTP': - extra_environ['.inbound_headers'] = inbound_headers + if transport_type != "HTTP": + extra_environ[".inbound_headers"] = inbound_headers inbound_headers = None elif six.PY2 and inbound_headers: - inbound_headers = { - k.encode('utf-8'): v.encode('utf-8') - for k, v in inbound_headers.items()} - - @validate_transaction_metrics(test_name, - group="Uri", - rollup_metrics=expected_metrics, - background_task=not web_transaction) - @validate_transaction_event_attributes( - txn_event_required, txn_event_forgone, txn_event_exact) - @validate_attributes('intrinsic', common_required, common_forgone) + inbound_headers = {k.encode("utf-8"): v.encode("utf-8") for k, v in inbound_headers.items()} + + @validate_transaction_metrics( + test_name, group="Uri", rollup_metrics=expected_metrics, background_task=not web_transaction + ) + @validate_transaction_event_attributes(txn_event_required, txn_event_forgone, txn_event_exact) + @validate_attributes("intrinsic", common_required, common_forgone) @override_application_settings(override_settings) 
@override_compute_sampled(force_sampled_true) def _test(): return test_application.get( - '/' + test_name, + "/" + test_name, headers=inbound_headers, extra_environ=extra_environ, ) - if 'Span' in intrinsics: - span_intrinsics = intrinsics.get('Span') - span_expected = span_intrinsics.get('expected', []) + if "Span" in intrinsics: + span_intrinsics = intrinsics.get("Span") + span_expected = span_intrinsics.get("expected", []) span_expected.extend(common_required) - span_unexpected = span_intrinsics.get('unexpected', []) + span_unexpected = span_intrinsics.get("unexpected", []) span_unexpected.extend(common_forgone) - span_exact = span_intrinsics.get('exact', {}) + span_exact = span_intrinsics.get("exact", {}) span_exact.update(common_exact) - _test = validate_span_events(exact_intrinsics=span_exact, - expected_intrinsics=span_expected, - unexpected_intrinsics=span_unexpected)(_test) + _test = validate_span_events( + exact_intrinsics=span_exact, expected_intrinsics=span_expected, unexpected_intrinsics=span_unexpected + )(_test) elif not span_events_enabled: _test = validate_span_events(count=0)(_test) response = _test() - assert response.status == '200 OK' + assert response.status == "200 OK" payloads = response.json if outbound_payloads: assert len(payloads) == len(outbound_payloads) diff --git a/tests/datastore_aioredis/conftest.py b/tests/datastore_aioredis/conftest.py index de9c6c04d..e1cea4c01 100644 --- a/tests/datastore_aioredis/conftest.py +++ b/tests/datastore_aioredis/conftest.py @@ -12,30 +12,31 @@ # See the License for the specific language governing permissions and # limitations under the License. -import aioredis +import os import pytest +from newrelic.common.package_version_utils import get_package_version_tuple from testing_support.db_settings import redis_settings from testing_support.fixture.event_loop import event_loop as loop -from testing_support.fixtures import ( # noqa: F401 - code_coverage_fixture, - collector_agent_registration_fixture, - collector_available_fixture, -) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 + +try: + import aioredis + + AIOREDIS_VERSION = get_package_version_tuple("aioredis") +except ImportError: + import redis.asyncio as aioredis + + # Fake aioredis version to show when it was moved to redis.asyncio + AIOREDIS_VERSION = (2, 0, 2) + -AIOREDIS_VERSION = tuple(int(x) for x in aioredis.__version__.split(".")[:2]) SKIPIF_AIOREDIS_V1 = pytest.mark.skipif(AIOREDIS_VERSION < (2,), reason="Unsupported aioredis version.") SKIPIF_AIOREDIS_V2 = pytest.mark.skipif(AIOREDIS_VERSION >= (2,), reason="Unsupported aioredis version.") DB_SETTINGS = redis_settings()[0] -_coverage_source = [ - "newrelic.hooks.datastore_aioredis", -] - -code_coverage = code_coverage_fixture(source=_coverage_source) - _default_settings = { "transaction_tracer.explain_threshold": 0.0, "transaction_tracer.transaction_threshold": 0.0, @@ -67,3 +68,7 @@ def client(request, loop): pytest.skip("StrictRedis not implemented.") else: raise NotImplementedError() + +@pytest.fixture(scope="session") +def key(): + return "AIOREDIS-TEST-" + str(os.getpid()) diff --git a/tests/datastore_aioredis/test_custom_conn_pool.py b/tests/datastore_aioredis/test_custom_conn_pool.py index 7644e8ffb..b09cf0bdd 100644 --- a/tests/datastore_aioredis/test_custom_conn_pool.py +++ b/tests/datastore_aioredis/test_custom_conn_pool.py @@ -17,12 +17,15 @@ will not result in an error. 
""" -from newrelic.api.background_task import background_task - -# from testing_support.fixture.event_loop import event_loop as loop -from testing_support.fixtures import validate_transaction_metrics, override_application_settings from testing_support.db_settings import redis_settings +from testing_support.fixture.event_loop import event_loop as loop # noqa +from testing_support.fixtures import override_application_settings from testing_support.util import instance_hostname +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) + +from newrelic.api.background_task import background_task DB_SETTINGS = redis_settings()[0] @@ -107,7 +110,7 @@ async def exercise_redis(client): background_task=True, ) @background_task() -def test_fake_conn_pool_enable_instance(client, loop, monkeypatch): +def test_fake_conn_pool_enable_instance(client, loop, monkeypatch): # noqa # Get a real connection conn = getattr(client, "_pool_or_conn", None) if conn is None: @@ -132,7 +135,7 @@ def test_fake_conn_pool_enable_instance(client, loop, monkeypatch): background_task=True, ) @background_task() -def test_fake_conn_pool_disable_instance(client, loop, monkeypatch): +def test_fake_conn_pool_disable_instance(client, loop, monkeypatch): # noqa # Get a real connection conn = getattr(client, "_pool_or_conn", None) if conn is None: diff --git a/tests/datastore_aioredis/test_execute_command.py b/tests/datastore_aioredis/test_execute_command.py index bbc8b2d4f..54851a659 100644 --- a/tests/datastore_aioredis/test_execute_command.py +++ b/tests/datastore_aioredis/test_execute_command.py @@ -13,12 +13,17 @@ # limitations under the License. import pytest -from newrelic.api.background_task import background_task -from testing_support.fixtures import validate_transaction_metrics, override_application_settings -from conftest import AIOREDIS_VERSION +# import aioredis +from conftest import AIOREDIS_VERSION, loop # noqa # pylint: disable=E0611,W0611 from testing_support.db_settings import redis_settings +from testing_support.fixtures import override_application_settings from testing_support.util import instance_hostname +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) + +from newrelic.api.background_task import background_task DB_SETTINGS = redis_settings()[0] @@ -78,7 +83,7 @@ async def exercise_redis_single_arg(client): background_task=True, ) @background_task() -def test_redis_execute_command_as_one_arg_enable(client, loop): +def test_redis_execute_command_as_one_arg_enable(client, loop): # noqa loop.run_until_complete(exercise_redis_single_arg(client)) @@ -91,7 +96,7 @@ def test_redis_execute_command_as_one_arg_enable(client, loop): background_task=True, ) @background_task() -def test_redis_execute_command_as_one_arg_disable(client, loop): +def test_redis_execute_command_as_one_arg_disable(client, loop): # noqa loop.run_until_complete(exercise_redis_single_arg(client)) @@ -103,7 +108,7 @@ def test_redis_execute_command_as_one_arg_disable(client, loop): background_task=True, ) @background_task() -def test_redis_execute_command_as_two_args_enable(client, loop): +def test_redis_execute_command_as_two_args_enable(client, loop): # noqa loop.run_until_complete(exercise_redis_multi_args(client)) @@ -115,5 +120,5 @@ def test_redis_execute_command_as_two_args_enable(client, loop): background_task=True, ) @background_task() -def test_redis_execute_command_as_two_args_disable(client, loop): +def 
test_redis_execute_command_as_two_args_disable(client, loop): # noqa loop.run_until_complete(exercise_redis_multi_args(client)) diff --git a/tests/datastore_aioredis/test_get_and_set.py b/tests/datastore_aioredis/test_get_and_set.py index a446d5f6c..cbddf6091 100644 --- a/tests/datastore_aioredis/test_get_and_set.py +++ b/tests/datastore_aioredis/test_get_and_set.py @@ -12,11 +12,15 @@ # See the License for the specific language governing permissions and # limitations under the License. -from newrelic.api.background_task import background_task - -from testing_support.fixtures import validate_transaction_metrics, override_application_settings +# from conftest import AIOREDIS_VERSION, event_loop, loop from testing_support.db_settings import redis_settings +from testing_support.fixtures import override_application_settings from testing_support.util import instance_hostname +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) + +from newrelic.api.background_task import background_task DB_SETTINGS = redis_settings()[0] @@ -60,9 +64,9 @@ _disable_rollup_metrics.append((_instance_metric_name, None)) -async def exercise_redis(client): - await client.set("key", "value") - await client.get("key") +async def exercise_redis(client, key): + await client.set(key, "value") + await client.get(key) @override_application_settings(_enable_instance_settings) @@ -73,8 +77,8 @@ async def exercise_redis(client): background_task=True, ) @background_task() -def test_redis_client_operation_enable_instance(client, loop): - loop.run_until_complete(exercise_redis(client)) +def test_redis_client_operation_enable_instance(client, loop, key): + loop.run_until_complete(exercise_redis(client, key)) @override_application_settings(_disable_instance_settings) @@ -85,5 +89,5 @@ def test_redis_client_operation_enable_instance(client, loop): background_task=True, ) @background_task() -def test_redis_client_operation_disable_instance(client, loop): - loop.run_until_complete(exercise_redis(client)) +def test_redis_client_operation_disable_instance(client, loop, key): + loop.run_until_complete(exercise_redis(client, key)) diff --git a/tests/datastore_aioredis/test_instance_info.py b/tests/datastore_aioredis/test_instance_info.py index 4bb744149..d43366ab5 100644 --- a/tests/datastore_aioredis/test_instance_info.py +++ b/tests/datastore_aioredis/test_instance_info.py @@ -14,10 +14,9 @@ from inspect import isawaitable import pytest -import aioredis from newrelic.hooks.datastore_aioredis import _conn_attrs_to_dict, _instance_info -from conftest import AIOREDIS_VERSION, SKIPIF_AIOREDIS_V1 +from conftest import aioredis, AIOREDIS_VERSION, SKIPIF_AIOREDIS_V1 _instance_info_tests = [ ({}, ("localhost", "6379", "0")), @@ -35,6 +34,10 @@ class DisabledConnection(aioredis.Connection): @staticmethod async def connect(*args, **kwargs): pass + + async def can_read_destructive(self, *args, **kwargs): + return False + class DisabledUnixConnection(aioredis.UnixDomainSocketConnection, DisabledConnection): diff --git a/tests/datastore_aioredis/test_multiple_dbs.py b/tests/datastore_aioredis/test_multiple_dbs.py index cb817c9f8..d490c1f58 100644 --- a/tests/datastore_aioredis/test_multiple_dbs.py +++ b/tests/datastore_aioredis/test_multiple_dbs.py @@ -12,14 +12,18 @@ # See the License for the specific language governing permissions and # limitations under the License. 
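# The session-scoped `key` fixture added to conftest.py above gives each test
# process its own Redis key, so concurrent runs against a shared server do not
# clobber one another. A minimal standalone sketch of the same pattern (the
# fixture name and key prefix here are illustrative, not part of the patch):

import os

import pytest


@pytest.fixture(scope="session")
def unique_key():
    # Namespace the key by process ID: parallel CI workers sharing one Redis
    # instance each read and write their own key.
    return "EXAMPLE-TEST-%d" % os.getpid()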
-import pytest -import aioredis -from newrelic.api.background_task import background_task +from conftest import aioredis -from testing_support.fixtures import validate_transaction_metrics, override_application_settings -from conftest import AIOREDIS_VERSION +import pytest +from conftest import AIOREDIS_VERSION, loop # noqa from testing_support.db_settings import redis_settings +from testing_support.fixtures import override_application_settings from testing_support.util import instance_hostname +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) + +from newrelic.api.background_task import background_task DB_SETTINGS = redis_settings() @@ -102,7 +106,7 @@ @pytest.fixture(params=("Redis", "StrictRedis")) -def client_set(request, loop): +def client_set(request, loop): # noqa if len(DB_SETTINGS) > 1: if AIOREDIS_VERSION >= (2, 0): if request.param == "Redis": @@ -133,7 +137,6 @@ def client_set(request, loop): raise NotImplementedError() - async def exercise_redis(client_1, client_2): await client_1.set("key", "value") await client_1.get("key") @@ -153,7 +156,7 @@ async def exercise_redis(client_1, client_2): background_task=True, ) @background_task() -def test_multiple_datastores_enabled(client_set, loop): +def test_multiple_datastores_enabled(client_set, loop): # noqa loop.run_until_complete(exercise_redis(client_set[0], client_set[1])) @@ -166,7 +169,7 @@ def test_multiple_datastores_enabled(client_set, loop): background_task=True, ) @background_task() -def test_multiple_datastores_disabled(client_set, loop): +def test_multiple_datastores_disabled(client_set, loop): # noqa loop.run_until_complete(exercise_redis(client_set[0], client_set[1])) @@ -179,7 +182,7 @@ def test_multiple_datastores_disabled(client_set, loop): ) @override_application_settings(_enable_instance_settings) @background_task() -def test_concurrent_calls(client_set, loop): +def test_concurrent_calls(client_set, loop): # noqa # Concurrent calls made with original instrumenation taken from synchonous Redis # instrumentation had a bug where datastore info on concurrent calls to multiple instances # would result in all instances reporting as the host/port of the final call made. diff --git a/tests/datastore_aioredis/test_trace_node.py b/tests/datastore_aioredis/test_trace_node.py index e4fa1e3ba..92235f793 100644 --- a/tests/datastore_aioredis/test_trace_node.py +++ b/tests/datastore_aioredis/test_trace_node.py @@ -12,9 +12,15 @@ # See the License for the specific language governing permissions and # limitations under the License. -from testing_support.fixtures import validate_tt_collector_json, override_application_settings -from testing_support.util import instance_hostname +# import aioredis +# import pytest +# from conftest import AIOREDIS_VERSION, event_loop from testing_support.db_settings import redis_settings +from testing_support.fixtures import override_application_settings +from testing_support.util import instance_hostname +from testing_support.validators.validate_tt_collector_json import ( + validate_tt_collector_json, +) from newrelic.api.background_task import background_task diff --git a/tests/datastore_aioredis/test_transactions.py b/tests/datastore_aioredis/test_transactions.py index 168de008b..ced922022 100644 --- a/tests/datastore_aioredis/test_transactions.py +++ b/tests/datastore_aioredis/test_transactions.py @@ -13,51 +13,56 @@ # limitations under the License. 
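# The test_concurrent_calls comment above describes the bug being guarded
# against: with the original instrumentation (ported from the synchronous
# Redis hooks), datastore info captured during overlapping calls to two
# different instances was attributed to whichever call finished last. A rough
# sketch of how overlapping calls can be produced with asyncio.gather; the
# client construction is illustrative, and any two clients pointing at
# different servers would exercise the same interleaving:

import asyncio

import redis.asyncio as redis


async def exercise_concurrently(client_1, client_2):
    # Issue the commands against both instances at once so their datastore
    # traces overlap in time.
    await asyncio.gather(client_1.set("key", "value"), client_2.set("key", "value"))
    await asyncio.gather(client_1.get("key"), client_2.get("key"))


# Usage (ports are placeholders for two distinct Redis servers):
#   asyncio.run(exercise_concurrently(redis.Redis(port=6379), redis.Redis(port=6380)))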
import pytest +from conftest import AIOREDIS_VERSION, SKIPIF_AIOREDIS_V1, SKIPIF_AIOREDIS_V2 +from testing_support.validators.validate_transaction_errors import ( + validate_transaction_errors, +) from newrelic.api.background_task import background_task -from testing_support.fixtures import validate_transaction_errors - -from conftest import SKIPIF_AIOREDIS_V1, SKIPIF_AIOREDIS_V2, AIOREDIS_VERSION @background_task() @pytest.mark.parametrize("in_transaction", (True, False)) -def test_pipelines_no_harm(client, in_transaction, loop): +def test_pipelines_no_harm(client, in_transaction, loop, key): async def exercise(): if AIOREDIS_VERSION >= (2,): pipe = client.pipeline(transaction=in_transaction) else: pipe = client.pipeline() # Transaction kwarg unsupported - - pipe.set("TXN", 1) + + pipe.set(key, 1) return await pipe.execute() status = loop.run_until_complete(exercise()) assert status == [True] -def exercise_transaction_sync(pipe): - pipe.set("TXN", 1) +def exercise_transaction_sync(key): + def _run(pipe): + pipe.set(key, 1) + return _run -async def exercise_transaction_async(pipe): - await pipe.set("TXN", 1) +def exercise_transaction_async(key): + async def _run(pipe): + await pipe.set(key, 1) + return _run @SKIPIF_AIOREDIS_V1 @pytest.mark.parametrize("exercise", (exercise_transaction_sync, exercise_transaction_async)) @background_task() -def test_transactions_no_harm(client, loop, exercise): - status = loop.run_until_complete(client.transaction(exercise)) +def test_transactions_no_harm(client, loop, key, exercise): + status = loop.run_until_complete(client.transaction(exercise(key))) assert status == [True] @SKIPIF_AIOREDIS_V2 @background_task() -def test_multi_exec_no_harm(client, loop): +def test_multi_exec_no_harm(client, loop, key): async def exercise(): pipe = client.multi_exec() - pipe.set("key", "value") + pipe.set(key, "value") status = await pipe.execute() assert status == [True] @@ -66,8 +71,7 @@ async def exercise(): @SKIPIF_AIOREDIS_V1 @background_task() -def test_pipeline_immediate_execution_no_harm(client, loop): - key = "TXN_WATCH" +def test_pipeline_immediate_execution_no_harm(client, loop, key): async def exercise(): await client.set(key, 1) @@ -92,8 +96,7 @@ async def exercise(): @SKIPIF_AIOREDIS_V1 @background_task() -def test_transaction_immediate_execution_no_harm(client, loop): - key = "TXN_WATCH" +def test_transaction_immediate_execution_no_harm(client, loop, key): async def exercise(): async def exercise_transaction(pipe): value = int(await pipe.get(key)) @@ -116,8 +119,7 @@ async def exercise_transaction(pipe): @SKIPIF_AIOREDIS_V1 @validate_transaction_errors([]) @background_task() -def test_transaction_watch_error_no_harm(client, loop): - key = "TXN_WATCH" +def test_transaction_watch_error_no_harm(client, loop, key): async def exercise(): async def exercise_transaction(pipe): value = int(await pipe.get(key)) diff --git a/tests/datastore_aioredis/test_uninstrumented_methods.py b/tests/datastore_aioredis/test_uninstrumented_methods.py index f1b36b1ca..7858709c1 100644 --- a/tests/datastore_aioredis/test_uninstrumented_methods.py +++ b/tests/datastore_aioredis/test_uninstrumented_methods.py @@ -15,7 +15,11 @@ IGNORED_METHODS = { "address", + "auto_close_connection_pool", "channels", + "client_tracking_off", + "client_tracking_on", + "client_no_touch", "close", "closed", "connection_pool", @@ -25,6 +29,9 @@ "execute_command", "execute", "from_url", + "get_connection_kwargs", + "get_encoder", + "get_retry", "hscan_iter", "ihscan", "in_pubsub", @@ -33,6 +40,7 @@ 
"iscan", "isscan", "izscan", + "load_external_module", "lock", "multi_exec", "parse_response", @@ -42,9 +50,11 @@ "register_script", "response_callbacks", "RESPONSE_CALLBACKS", + "sentinel", "SET_IF_EXIST", "SET_IF_NOT_EXIST", "set_response_callback", + "set_retry", "SHUTDOWN_NOSAVE", "SHUTDOWN_SAVE", "single_connection_client", @@ -61,6 +71,20 @@ "ZSET_IF_NOT_EXIST", } +REDIS_MODULES = { + "bf", + "cf", + "cms", + "ft", + "graph", + "json", + "tdigest", + "topk", + "ts", +} + +IGNORED_METHODS |= REDIS_MODULES + def test_uninstrumented_methods(client): methods = {m for m in dir(client) if not m[0] == "_"} diff --git a/tests/datastore_aredis/conftest.py b/tests/datastore_aredis/conftest.py index 8f43d088b..78067e0fe 100644 --- a/tests/datastore_aredis/conftest.py +++ b/tests/datastore_aredis/conftest.py @@ -15,14 +15,8 @@ import pytest from testing_support.fixture.event_loop import event_loop as loop # noqa: F401 -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.datastore_aredis', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/datastore_aredis/test_custom_conn_pool.py b/tests/datastore_aredis/test_custom_conn_pool.py index 344154412..70c75de9e 100644 --- a/tests/datastore_aredis/test_custom_conn_pool.py +++ b/tests/datastore_aredis/test_custom_conn_pool.py @@ -23,8 +23,8 @@ from newrelic.api.background_task import background_task from testing_support.fixture.event_loop import event_loop as loop -from testing_support.fixtures import (validate_transaction_metrics, - override_application_settings) +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.db_settings import redis_settings from testing_support.util import instance_hostname diff --git a/tests/datastore_aredis/test_execute_command.py b/tests/datastore_aredis/test_execute_command.py index 7db3c9c59..c5b0fc332 100644 --- a/tests/datastore_aredis/test_execute_command.py +++ b/tests/datastore_aredis/test_execute_command.py @@ -17,8 +17,8 @@ from newrelic.api.background_task import background_task -from testing_support.fixtures import (validate_transaction_metrics, - override_application_settings) +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.fixture.event_loop import event_loop as loop from testing_support.db_settings import redis_settings from testing_support.util import instance_hostname diff --git a/tests/datastore_aredis/test_get_and_set.py b/tests/datastore_aredis/test_get_and_set.py index fbde29d86..2eeee947b 100644 --- a/tests/datastore_aredis/test_get_and_set.py +++ b/tests/datastore_aredis/test_get_and_set.py @@ -17,8 +17,8 @@ from newrelic.api.background_task import background_task from testing_support.fixture.event_loop import event_loop as loop -from testing_support.fixtures import (validate_transaction_metrics, - override_application_settings) +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import 
validate_transaction_metrics from testing_support.db_settings import redis_settings from testing_support.util import instance_hostname diff --git a/tests/datastore_aredis/test_multiple_dbs.py b/tests/datastore_aredis/test_multiple_dbs.py index e16ae9483..cb4cbac5b 100644 --- a/tests/datastore_aredis/test_multiple_dbs.py +++ b/tests/datastore_aredis/test_multiple_dbs.py @@ -15,10 +15,8 @@ import aredis import pytest from testing_support.db_settings import redis_settings -from testing_support.fixtures import ( - override_application_settings, - validate_transaction_metrics, -) +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.util import instance_hostname from newrelic.api.background_task import background_task diff --git a/tests/datastore_aredis/test_trace_node.py b/tests/datastore_aredis/test_trace_node.py index 9741bfbd6..9d5d86162 100644 --- a/tests/datastore_aredis/test_trace_node.py +++ b/tests/datastore_aredis/test_trace_node.py @@ -13,12 +13,13 @@ # limitations under the License. import aredis - -from testing_support.fixture.event_loop import event_loop as loop -from testing_support.fixtures import (validate_tt_collector_json, - override_application_settings) -from testing_support.util import instance_hostname from testing_support.db_settings import redis_settings +from testing_support.fixture.event_loop import event_loop as loop # noqa: F401 +from testing_support.fixtures import override_application_settings +from testing_support.util import instance_hostname +from testing_support.validators.validate_tt_collector_json import ( + validate_tt_collector_json, +) from newrelic.api.background_task import background_task @@ -28,100 +29,93 @@ # Settings _enable_instance_settings = { - 'datastore_tracer.instance_reporting.enabled': True, - 'datastore_tracer.database_name_reporting.enabled': True, + "datastore_tracer.instance_reporting.enabled": True, + "datastore_tracer.database_name_reporting.enabled": True, } _disable_instance_settings = { - 'datastore_tracer.instance_reporting.enabled': False, - 'datastore_tracer.database_name_reporting.enabled': False, + "datastore_tracer.instance_reporting.enabled": False, + "datastore_tracer.database_name_reporting.enabled": False, } _instance_only_settings = { - 'datastore_tracer.instance_reporting.enabled': True, - 'datastore_tracer.database_name_reporting.enabled': False, + "datastore_tracer.instance_reporting.enabled": True, + "datastore_tracer.database_name_reporting.enabled": False, } _database_only_settings = { - 'datastore_tracer.instance_reporting.enabled': False, - 'datastore_tracer.database_name_reporting.enabled': True, + "datastore_tracer.instance_reporting.enabled": False, + "datastore_tracer.database_name_reporting.enabled": True, } # Expected parameters _enabled_required = { - 'host': instance_hostname(DB_SETTINGS['host']), - 'port_path_or_id': str(DB_SETTINGS['port']), - 'db.instance': str(DATABASE_NUMBER), + "host": instance_hostname(DB_SETTINGS["host"]), + "port_path_or_id": str(DB_SETTINGS["port"]), + "db.instance": str(DATABASE_NUMBER), } _enabled_forgone = {} _disabled_required = {} _disabled_forgone = { - 'host': 'VALUE NOT USED', - 'port_path_or_id': 'VALUE NOT USED', - 'db.instance': 'VALUE NOT USED', + "host": "VALUE NOT USED", + "port_path_or_id": "VALUE NOT USED", + "db.instance": "VALUE NOT USED", } _instance_only_required = { - 'host': instance_hostname(DB_SETTINGS['host']), - 
'port_path_or_id': str(DB_SETTINGS['port']), + "host": instance_hostname(DB_SETTINGS["host"]), + "port_path_or_id": str(DB_SETTINGS["port"]), } _instance_only_forgone = { - 'db.instance': str(DATABASE_NUMBER), + "db.instance": str(DATABASE_NUMBER), } _database_only_required = { - 'db.instance': str(DATABASE_NUMBER), + "db.instance": str(DATABASE_NUMBER), } _database_only_forgone = { - 'host': 'VALUE NOT USED', - 'port_path_or_id': 'VALUE NOT USED', + "host": "VALUE NOT USED", + "port_path_or_id": "VALUE NOT USED", } # Query + async def _exercise_db(): - client = aredis.StrictRedis(host=DB_SETTINGS['host'], - port=DB_SETTINGS['port'], db=DATABASE_NUMBER) + client = aredis.StrictRedis(host=DB_SETTINGS["host"], port=DB_SETTINGS["port"], db=DATABASE_NUMBER) - await client.set('key', 'value') - await client.get('key') + await client.set("key", "value") + await client.get("key") - await client.execute_command('CLIENT', 'LIST', parse='LIST') + await client.execute_command("CLIENT", "LIST", parse="LIST") # Tests + @override_application_settings(_enable_instance_settings) -@validate_tt_collector_json( - datastore_params=_enabled_required, - datastore_forgone_params=_enabled_forgone) +@validate_tt_collector_json(datastore_params=_enabled_required, datastore_forgone_params=_enabled_forgone) @background_task() -def test_trace_node_datastore_params_enable_instance(loop): +def test_trace_node_datastore_params_enable_instance(loop): # noqa: F811 loop.run_until_complete(_exercise_db()) @override_application_settings(_disable_instance_settings) -@validate_tt_collector_json( - datastore_params=_disabled_required, - datastore_forgone_params=_disabled_forgone) +@validate_tt_collector_json(datastore_params=_disabled_required, datastore_forgone_params=_disabled_forgone) @background_task() -def test_trace_node_datastore_params_disable_instance(loop): +def test_trace_node_datastore_params_disable_instance(loop): # noqa: F811 loop.run_until_complete(_exercise_db()) @override_application_settings(_instance_only_settings) -@validate_tt_collector_json( - datastore_params=_instance_only_required, - datastore_forgone_params=_instance_only_forgone) +@validate_tt_collector_json(datastore_params=_instance_only_required, datastore_forgone_params=_instance_only_forgone) @background_task() -def test_trace_node_datastore_params_instance_only(loop): +def test_trace_node_datastore_params_instance_only(loop): # noqa: F811 loop.run_until_complete(_exercise_db()) @override_application_settings(_database_only_settings) -@validate_tt_collector_json( - datastore_params=_database_only_required, - datastore_forgone_params=_database_only_forgone) +@validate_tt_collector_json(datastore_params=_database_only_required, datastore_forgone_params=_database_only_forgone) @background_task() -def test_trace_node_datastore_params_database_only(loop): +def test_trace_node_datastore_params_database_only(loop): # noqa: F811 loop.run_until_complete(_exercise_db()) diff --git a/tests/datastore_asyncpg/conftest.py b/tests/datastore_asyncpg/conftest.py index 00720a55f..69bc0501a 100644 --- a/tests/datastore_asyncpg/conftest.py +++ b/tests/datastore_asyncpg/conftest.py @@ -13,11 +13,8 @@ # limitations under the License. 
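# The four settings dictionaries above enumerate every combination of the two
# datastore_tracer flags, and each is paired with the attributes that must (or
# must not) appear on the resulting trace node. A condensed sketch of that
# mapping, assuming the attribute names shown in the expected-parameter dicts:

INSTANCE_ATTRS = {"host", "port_path_or_id"}
DATABASE_ATTRS = {"db.instance"}


def expected_trace_node_attributes(instance_reporting, database_name_reporting):
    # instance_reporting controls the host/port attributes;
    # database_name_reporting controls db.instance. Disabling both yields no
    # datastore attributes on the trace node.
    attrs = set()
    if instance_reporting:
        attrs |= INSTANCE_ATTRS
    if database_name_reporting:
        attrs |= DATABASE_ATTRS
    return attrs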
from testing_support.fixture.event_loop import event_loop -from testing_support.fixtures import code_coverage_fixture # noqa -from testing_support.fixtures import ( - collector_agent_registration_fixture, - collector_available_fixture, -) + +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 _default_settings = { "transaction_tracer.explain_threshold": 0.0, @@ -30,9 +27,3 @@ collector_agent_registration = collector_agent_registration_fixture( app_name="Python Agent Test (datastore_asyncpg)", default_settings=_default_settings ) - -_coverage_source = [ - "newrelic.hooks.database_asyncpg", -] - -code_coverage = code_coverage_fixture(source=_coverage_source) diff --git a/tests/datastore_asyncpg/test_multiple_dbs.py b/tests/datastore_asyncpg/test_multiple_dbs.py index a24eaa388..a917a9e83 100644 --- a/tests/datastore_asyncpg/test_multiple_dbs.py +++ b/tests/datastore_asyncpg/test_multiple_dbs.py @@ -17,10 +17,8 @@ import asyncpg import pytest from testing_support.db_settings import postgresql_settings -from testing_support.fixtures import ( - override_application_settings, - validate_transaction_metrics, -) +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.util import instance_hostname from newrelic.api.background_task import background_task diff --git a/tests/datastore_asyncpg/test_query.py b/tests/datastore_asyncpg/test_query.py index eb44cfd16..838ced61d 100644 --- a/tests/datastore_asyncpg/test_query.py +++ b/tests/datastore_asyncpg/test_query.py @@ -12,19 +12,19 @@ # See the License for the specific language governing permissions and # limitations under the License. 
-import asyncio import os -import random from io import BytesIO import asyncpg import pytest from testing_support.db_settings import postgresql_settings -from testing_support.fixtures import ( +from testing_support.util import instance_hostname +from testing_support.validators.validate_transaction_metrics import ( validate_transaction_metrics, +) +from testing_support.validators.validate_tt_collector_json import ( validate_tt_collector_json, ) -from testing_support.util import instance_hostname from newrelic.api.background_task import background_task diff --git a/tests/datastore_bmemcached/conftest.py b/tests/datastore_bmemcached/conftest.py index 3d41ed930..c970c1c34 100644 --- a/tests/datastore_bmemcached/conftest.py +++ b/tests/datastore_bmemcached/conftest.py @@ -14,15 +14,8 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) -_coverage_source = [ - 'newrelic.api.memcache_trace', - 'newrelic.hooks.datastore_bmemcached', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/datastore_bmemcached/test_memcache.py b/tests/datastore_bmemcached/test_memcache.py index 1bef31f59..68eee0633 100644 --- a/tests/datastore_bmemcached/test_memcache.py +++ b/tests/datastore_bmemcached/test_memcache.py @@ -19,7 +19,7 @@ from newrelic.api.background_task import background_task from newrelic.api.transaction import set_background_task -from testing_support.fixtures import validate_transaction_metrics +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.db_settings import memcached_settings diff --git a/tests/datastore_elasticsearch/conftest.py b/tests/datastore_elasticsearch/conftest.py index d665bce87..53fa6fcdc 100644 --- a/tests/datastore_elasticsearch/conftest.py +++ b/tests/datastore_elasticsearch/conftest.py @@ -13,25 +13,35 @@ # limitations under the License. 
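# A refactor repeated throughout this patch: validators such as
# validate_transaction_metrics moved out of the catch-all
# testing_support.fixtures module into dedicated testing_support.validators.*
# modules. Test code that must import cleanly against either layout can fall
# back at import time, mirroring the try/except idiom this patch already uses
# for aioredis and RoundRobinSelector (sketch; assumes one of the two
# locations exists):

try:
    from testing_support.validators.validate_transaction_metrics import (
        validate_transaction_metrics,
    )
except ImportError:
    # Older layout: the validator lived in testing_support.fixtures.
    from testing_support.fixtures import validate_transaction_metrics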
import pytest +from testing_support.db_settings import elasticsearch_settings -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.datastore_elasticsearch', -] +from newrelic.common.package_version_utils import get_package_version -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { - 'transaction_tracer.explain_threshold': 0.0, - 'transaction_tracer.transaction_threshold': 0.0, - 'transaction_tracer.stack_trace_threshold': 0.0, - 'debug.log_data_collector_payloads': True, - 'debug.record_transaction_failure': True + "transaction_tracer.explain_threshold": 0.0, + "transaction_tracer.transaction_threshold": 0.0, + "transaction_tracer.stack_trace_threshold": 0.0, + "debug.log_data_collector_payloads": True, + "debug.record_transaction_failure": True, } collector_agent_registration = collector_agent_registration_fixture( - app_name='Python Agent Test (datastore_elasticsearch)', - default_settings=_default_settings, - linked_applications=['Python Agent Test (datastore)']) + app_name="Python Agent Test (datastore_elasticsearch)", + default_settings=_default_settings, + linked_applications=["Python Agent Test (datastore)"], +) + +ES_VERSION = tuple([int(n) for n in get_package_version("elasticsearch").split(".")]) +ES_SETTINGS = elasticsearch_settings()[0] +ES_MULTIPLE_SETTINGS = elasticsearch_settings() +ES_URL = "http://%s:%s" % (ES_SETTINGS["host"], ES_SETTINGS["port"]) + + +@pytest.fixture(scope="session") +def client(): + from elasticsearch import Elasticsearch + + return Elasticsearch(ES_URL) diff --git a/tests/datastore_elasticsearch/test_connection.py b/tests/datastore_elasticsearch/test_connection.py index 37df49b80..9e8f17b4c 100644 --- a/tests/datastore_elasticsearch/test_connection.py +++ b/tests/datastore_elasticsearch/test_connection.py @@ -12,21 +12,53 @@ # See the License for the specific language governing permissions and # limitations under the License. 
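# conftest.py above derives a comparable version tuple for the installed
# elasticsearch package, and the tests gate themselves on it with skipif
# marks. A minimal sketch of the same version-gating idiom, substituting the
# stdlib importlib.metadata for the agent's get_package_version helper
# (assumes a plain dotted release string such as "8.3.1"):

from importlib.metadata import version

import pytest

ES_VERSION = tuple(int(n) for n in version("elasticsearch").split("."))

# Counterparts of the SKIP_IF_V7 / SKIP_IF_V8 marks defined below: decorate
# v8-only tests with the first mark and pre-v8 tests with the second.
SKIP_BELOW_V8 = pytest.mark.skipif(ES_VERSION < (8,), reason="requires elasticsearch>=8")
SKIP_V8_AND_UP = pytest.mark.skipif(ES_VERSION >= (8,), reason="requires elasticsearch<8")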
-from elasticsearch.connection.base import Connection +import pytest + +try: + from elasticsearch.connection.base import Connection +except ImportError: + from elastic_transport._models import NodeConfig + from elastic_transport._node._base import BaseNode as Connection + +from conftest import ES_VERSION, ES_SETTINGS + + +HOST = {"scheme": "http", "host": ES_SETTINGS["host"], "port": int(ES_SETTINGS["port"])} + +IS_V8 = ES_VERSION >= (8,) +SKIP_IF_V7 = pytest.mark.skipif(not IS_V8, reason="Skipping v8 tests.") +SKIP_IF_V8 = pytest.mark.skipif(IS_V8, reason="Skipping v7 tests.") def test_connection_default(): - conn = Connection() - assert conn._nr_host_port == ('localhost', '9200') + if IS_V8: + conn = Connection(NodeConfig(**HOST)) + else: + conn = Connection(**HOST) + + assert conn._nr_host_port == (ES_SETTINGS["host"], ES_SETTINGS["port"]) + +@SKIP_IF_V7 +def test_connection_config(): + conn = Connection(NodeConfig(scheme="http", host="foo", port=8888)) + assert conn._nr_host_port == ("foo", "8888") + + +@SKIP_IF_V8 def test_connection_host_arg(): - conn = Connection('the_host') - assert conn._nr_host_port == ('the_host', '9200') + conn = Connection("the_host") + assert conn._nr_host_port == ("the_host", "9200") + +@SKIP_IF_V8 def test_connection_args(): - conn = Connection('the_host', 9999) - assert conn._nr_host_port == ('the_host', '9999') + conn = Connection("the_host", 9999) + assert conn._nr_host_port == ("the_host", "9999") + +@SKIP_IF_V8 def test_connection_kwargs(): - conn = Connection(host='foo', port=8888) - assert conn._nr_host_port == ('foo', '8888') + conn = Connection(host="foo", port=8888) + assert conn._nr_host_port == ("foo", "8888") + diff --git a/tests/datastore_elasticsearch/test_database_duration.py b/tests/datastore_elasticsearch/test_database_duration.py index a76f700b1..e2599c67b 100644 --- a/tests/datastore_elasticsearch/test_database_duration.py +++ b/tests/datastore_elasticsearch/test_database_duration.py @@ -14,38 +14,48 @@ import sqlite3 -from elasticsearch import Elasticsearch +from testing_support.validators.validate_database_duration import ( + validate_database_duration, +) from newrelic.api.background_task import background_task -from testing_support.db_settings import elasticsearch_settings -from testing_support.validators.validate_database_duration import validate_database_duration +from conftest import ES_VERSION -ES_SETTINGS = elasticsearch_settings()[0] -ES_URL = 'http://%s:%s' % (ES_SETTINGS['host'], ES_SETTINGS['port']) -def _exercise_es(es): - es.index(index="contacts", doc_type="person", - body={"name": "Joe Tester", "age": 25, "title": "QA Engineer"}, id=1) - es.index(index="contacts", doc_type="person", - body={"name": "Jessica Coder", "age": 32, "title": "Programmer"}, id=2) - es.index(index="contacts", doc_type="person", - body={"name": "Freddy Tester", "age": 29, "title": "Assistant"}, id=3) - es.indices.refresh('contacts') + + +def _exercise_es_v7(es): + es.index(index="contacts", doc_type="person", body={"name": "Joe Tester", "age": 25, "title": "QA Engineer"}, id=1) + es.index( + index="contacts", doc_type="person", body={"name": "Jessica Coder", "age": 32, "title": "Programmer"}, id=2 + ) + es.index(index="contacts", doc_type="person", body={"name": "Freddy Tester", "age": 29, "title": "Assistant"}, id=3) + es.indices.refresh("contacts") + + +def _exercise_es_v8(es): + es.index(index="contacts", body={"name": "Joe Tester", "age": 25, "title": "QA Engineer"}, id=1) + es.index(index="contacts", body={"name": "Jessica Coder", "age": 32, 
"title": "Programmer"}, id=2) + es.index(index="contacts", body={"name": "Freddy Tester", "age": 29, "title": "Assistant"}, id=3) + es.indices.refresh(index="contacts") + + +_exercise_es = _exercise_es_v7 if ES_VERSION < (8, 0, 0) else _exercise_es_v8 + @validate_database_duration() @background_task() -def test_elasticsearch_database_duration(): - client = Elasticsearch(ES_URL) +def test_elasticsearch_database_duration(client): _exercise_es(client) + @validate_database_duration() @background_task() -def test_elasticsearch_and_sqlite_database_duration(): +def test_elasticsearch_and_sqlite_database_duration(client): # Make Elasticsearch queries - client = Elasticsearch(ES_URL) _exercise_es(client) # Make sqlite queries diff --git a/tests/datastore_elasticsearch/test_elasticsearch.py b/tests/datastore_elasticsearch/test_elasticsearch.py index 548043216..d2c892ea9 100644 --- a/tests/datastore_elasticsearch/test_elasticsearch.py +++ b/tests/datastore_elasticsearch/test_elasticsearch.py @@ -12,121 +12,120 @@ # See the License for the specific language governing permissions and # limitations under the License. -from elasticsearch import Elasticsearch import elasticsearch.client +from testing_support.fixtures import override_application_settings +from testing_support.util import instance_hostname +from testing_support.validators.validate_transaction_errors import ( + validate_transaction_errors, +) +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) from newrelic.api.background_task import background_task -from testing_support.fixtures import (validate_transaction_metrics, - validate_transaction_errors, override_application_settings) -from testing_support.db_settings import elasticsearch_settings -from testing_support.util import instance_hostname +from conftest import ES_VERSION, ES_SETTINGS -ES_SETTINGS = elasticsearch_settings()[0] -ES_URL = 'http://%s:%s' % (ES_SETTINGS['host'], ES_SETTINGS['port']) # Settings _enable_instance_settings = { - 'datastore_tracer.instance_reporting.enabled': True, + "datastore_tracer.instance_reporting.enabled": True, } _disable_instance_settings = { - 'datastore_tracer.instance_reporting.enabled': False, + "datastore_tracer.instance_reporting.enabled": False, } # Metrics _base_scoped_metrics = [ - ('Datastore/statement/Elasticsearch/_all/cluster.health', 1), - ('Datastore/statement/Elasticsearch/_all/search', 2), - ('Datastore/statement/Elasticsearch/address/index', 2), - ('Datastore/statement/Elasticsearch/address/search', 1), - ('Datastore/statement/Elasticsearch/contacts/index', 3), - ('Datastore/statement/Elasticsearch/contacts/indices.refresh', 1), - ('Datastore/statement/Elasticsearch/contacts/search', 2), - ('Datastore/statement/Elasticsearch/other/search', 2), + ("Datastore/statement/Elasticsearch/_all/cluster.health", 1), + ("Datastore/statement/Elasticsearch/_all/search", 2), + ("Datastore/statement/Elasticsearch/address/index", 2), + ("Datastore/statement/Elasticsearch/address/search", 1), + ("Datastore/statement/Elasticsearch/contacts/index", 3), + ("Datastore/statement/Elasticsearch/contacts/indices.refresh", 1), + ("Datastore/statement/Elasticsearch/contacts/search", 2), + ("Datastore/statement/Elasticsearch/other/search", 2), ] _base_rollup_metrics = [ - ('Datastore/operation/Elasticsearch/cluster.health', 1), - ('Datastore/operation/Elasticsearch/index', 5), - ('Datastore/operation/Elasticsearch/indices.refresh', 1), - ('Datastore/operation/Elasticsearch/search', 7), - 
('Datastore/statement/Elasticsearch/_all/cluster.health', 1), - ('Datastore/statement/Elasticsearch/_all/search', 2), - ('Datastore/statement/Elasticsearch/address/index', 2), - ('Datastore/statement/Elasticsearch/address/search', 1), - ('Datastore/statement/Elasticsearch/contacts/index', 3), - ('Datastore/statement/Elasticsearch/contacts/indices.refresh', 1), - ('Datastore/statement/Elasticsearch/contacts/search', 2), - ('Datastore/statement/Elasticsearch/other/search', 2), + ("Datastore/operation/Elasticsearch/cluster.health", 1), + ("Datastore/operation/Elasticsearch/index", 5), + ("Datastore/operation/Elasticsearch/indices.refresh", 1), + ("Datastore/operation/Elasticsearch/search", 7), + ("Datastore/statement/Elasticsearch/_all/cluster.health", 1), + ("Datastore/statement/Elasticsearch/_all/search", 2), + ("Datastore/statement/Elasticsearch/address/index", 2), + ("Datastore/statement/Elasticsearch/address/search", 1), + ("Datastore/statement/Elasticsearch/contacts/index", 3), + ("Datastore/statement/Elasticsearch/contacts/indices.refresh", 1), + ("Datastore/statement/Elasticsearch/contacts/search", 2), + ("Datastore/statement/Elasticsearch/other/search", 2), ] # Version support +def is_importable(module_path): + try: + __import__(module_path) + return True + except ImportError: + return False + + _all_count = 14 -try: - import elasticsearch.client.cat - _base_scoped_metrics.append( - ('Datastore/operation/Elasticsearch/cat.health', 1)) - _base_rollup_metrics.append( - ('Datastore/operation/Elasticsearch/cat.health', 1)) +if is_importable("elasticsearch.client.cat") or is_importable("elasticsearch._sync.client.cat"): + _base_scoped_metrics.append(("Datastore/operation/Elasticsearch/cat.health", 1)) + _base_rollup_metrics.append(("Datastore/operation/Elasticsearch/cat.health", 1)) _all_count += 1 -except ImportError: - _base_scoped_metrics.append( - ('Datastore/operation/Elasticsearch/cat.health', None)) - _base_rollup_metrics.append( - ('Datastore/operation/Elasticsearch/cat.health', None)) - -try: - import elasticsearch.client.nodes - _base_scoped_metrics.append( - ('Datastore/operation/Elasticsearch/nodes.info', 1)) - _base_rollup_metrics.append( - ('Datastore/operation/Elasticsearch/nodes.info', 1)) +else: + _base_scoped_metrics.append(("Datastore/operation/Elasticsearch/cat.health", None)) + _base_rollup_metrics.append(("Datastore/operation/Elasticsearch/cat.health", None)) + +if is_importable("elasticsearch.client.nodes") or is_importable("elasticsearch._sync.client.nodes"): + _base_scoped_metrics.append(("Datastore/operation/Elasticsearch/nodes.info", 1)) + _base_rollup_metrics.append(("Datastore/operation/Elasticsearch/nodes.info", 1)) _all_count += 1 -except ImportError: - _base_scoped_metrics.append( - ('Datastore/operation/Elasticsearch/nodes.info', None)) - _base_rollup_metrics.append( - ('Datastore/operation/Elasticsearch/nodes.info', None)) - -if (hasattr(elasticsearch.client, 'SnapshotClient') and - hasattr(elasticsearch.client.SnapshotClient, 'status')): - _base_scoped_metrics.append( - ('Datastore/operation/Elasticsearch/snapshot.status', 1)) - _base_rollup_metrics.append( - ('Datastore/operation/Elasticsearch/snapshot.status', 1)) +else: + _base_scoped_metrics.append(("Datastore/operation/Elasticsearch/nodes.info", None)) + _base_rollup_metrics.append(("Datastore/operation/Elasticsearch/nodes.info", None)) + +if hasattr(elasticsearch.client, "SnapshotClient") and hasattr(elasticsearch.client.SnapshotClient, "status"): + 
_base_scoped_metrics.append(("Datastore/operation/Elasticsearch/snapshot.status", 1)) + _base_rollup_metrics.append(("Datastore/operation/Elasticsearch/snapshot.status", 1)) _all_count += 1 else: - _base_scoped_metrics.append( - ('Datastore/operation/Elasticsearch/snapshot.status', None)) - _base_rollup_metrics.append( - ('Datastore/operation/Elasticsearch/snapshot.status', None)) - -if hasattr(elasticsearch.client.IndicesClient, 'status'): - _base_scoped_metrics.append( - ('Datastore/statement/Elasticsearch/_all/indices.status', 1)) - _base_rollup_metrics.extend([ - ('Datastore/operation/Elasticsearch/indices.status', 1), - ('Datastore/statement/Elasticsearch/_all/indices.status', 1), - ]) + _base_scoped_metrics.append(("Datastore/operation/Elasticsearch/snapshot.status", None)) + _base_rollup_metrics.append(("Datastore/operation/Elasticsearch/snapshot.status", None)) + +if hasattr(elasticsearch.client.IndicesClient, "status"): + _base_scoped_metrics.append(("Datastore/statement/Elasticsearch/_all/indices.status", 1)) + _base_rollup_metrics.extend( + [ + ("Datastore/operation/Elasticsearch/indices.status", 1), + ("Datastore/statement/Elasticsearch/_all/indices.status", 1), + ] + ) _all_count += 1 else: - _base_scoped_metrics.append( - ('Datastore/operation/Elasticsearch/indices.status', None)) - _base_rollup_metrics.extend([ - ('Datastore/operation/Elasticsearch/indices.status', None), - ('Datastore/statement/Elasticsearch/_all/indices.status', None), - ]) - -_base_rollup_metrics.extend([ - ('Datastore/all', _all_count), - ('Datastore/allOther', _all_count), - ('Datastore/Elasticsearch/all', _all_count), - ('Datastore/Elasticsearch/allOther', _all_count), -]) + _base_scoped_metrics.append(("Datastore/operation/Elasticsearch/indices.status", None)) + _base_rollup_metrics.extend( + [ + ("Datastore/operation/Elasticsearch/indices.status", None), + ("Datastore/statement/Elasticsearch/_all/indices.status", None), + ] + ) + +_base_rollup_metrics.extend( + [ + ("Datastore/all", _all_count), + ("Datastore/allOther", _all_count), + ("Datastore/Elasticsearch/all", _all_count), + ("Datastore/Elasticsearch/allOther", _all_count), + ] +) # Instance info @@ -136,74 +135,105 @@ _enable_scoped_metrics = list(_base_scoped_metrics) _enable_rollup_metrics = list(_base_rollup_metrics) -_host = instance_hostname(ES_SETTINGS['host']) -_port = ES_SETTINGS['port'] +_host = instance_hostname(ES_SETTINGS["host"]) +_port = ES_SETTINGS["port"] -_instance_metric_name = 'Datastore/instance/Elasticsearch/%s/%s' % ( - _host, _port) +_instance_metric_name = "Datastore/instance/Elasticsearch/%s/%s" % (_host, _port) -_enable_rollup_metrics.append( - (_instance_metric_name, _all_count) -) +_enable_rollup_metrics.append((_instance_metric_name, _all_count)) -_disable_rollup_metrics.append( - (_instance_metric_name, None) -) +_disable_rollup_metrics.append((_instance_metric_name, None)) # Query -def _exercise_es(es): - es.index(index="contacts", doc_type="person", - body={"name": "Joe Tester", "age": 25, "title": "QA Engineer"}, id=1) - es.index(index="contacts", doc_type="person", - body={"name": "Jessica Coder", "age": 32, "title": "Programmer"}, id=2) - es.index(index="contacts", doc_type="person", - body={"name": "Freddy Tester", "age": 29, "title": "Assistant"}, id=3) - es.indices.refresh('contacts') - es.index(index="address", doc_type="employee", body={"name": "Sherlock", - "address": "221B Baker Street, London"}, id=1) - es.index(index="address", doc_type="employee", body={"name": "Bilbo", - "address": "Bag End, 
Bagshot row, Hobbiton, Shire"}, id=2) - es.search(index='contacts', q='name:Joe') - es.search(index='contacts', q='name:jessica') - es.search(index='address', q='name:Sherlock') - es.search(index=['contacts', 'address'], q='name:Bilbo') - es.search(index='contacts,address', q='name:Bilbo') - es.search(index='*', q='name:Bilbo') - es.search(q='name:Bilbo') + +def _exercise_es_v7(es): + es.index(index="contacts", doc_type="person", body={"name": "Joe Tester", "age": 25, "title": "QA Engineer"}, id=1) + es.index( + index="contacts", doc_type="person", body={"name": "Jessica Coder", "age": 32, "title": "Programmer"}, id=2 + ) + es.index(index="contacts", doc_type="person", body={"name": "Freddy Tester", "age": 29, "title": "Assistant"}, id=3) + es.indices.refresh("contacts") + es.index( + index="address", doc_type="employee", body={"name": "Sherlock", "address": "221B Baker Street, London"}, id=1 + ) + es.index( + index="address", + doc_type="employee", + body={"name": "Bilbo", "address": "Bag End, Bagshot row, Hobbiton, Shire"}, + id=2, + ) + es.search(index="contacts", q="name:Joe") + es.search(index="contacts", q="name:jessica") + es.search(index="address", q="name:Sherlock") + es.search(index=["contacts", "address"], q="name:Bilbo") + es.search(index="contacts,address", q="name:Bilbo") + es.search(index="*", q="name:Bilbo") + es.search(q="name:Bilbo") es.cluster.health() - if hasattr(es, 'cat'): + if hasattr(es, "cat"): es.cat.health() - if hasattr(es, 'nodes'): + if hasattr(es, "nodes"): es.nodes.info() - if hasattr(es, 'snapshot') and hasattr(es.snapshot, 'status'): + if hasattr(es, "snapshot") and hasattr(es.snapshot, "status"): es.snapshot.status() - if hasattr(es.indices, 'status'): + if hasattr(es.indices, "status"): es.indices.status() + +def _exercise_es_v8(es): + es.index(index="contacts", body={"name": "Joe Tester", "age": 25, "title": "QA Engineer"}, id=1) + es.index(index="contacts", body={"name": "Jessica Coder", "age": 32, "title": "Programmer"}, id=2) + es.index(index="contacts", body={"name": "Freddy Tester", "age": 29, "title": "Assistant"}, id=3) + es.indices.refresh(index="contacts") + es.index(index="address", body={"name": "Sherlock", "address": "221B Baker Street, London"}, id=1) + es.index(index="address", body={"name": "Bilbo", "address": "Bag End, Bagshot row, Hobbiton, Shire"}, id=2) + es.search(index="contacts", q="name:Joe") + es.search(index="contacts", q="name:jessica") + es.search(index="address", q="name:Sherlock") + es.search(index=["contacts", "address"], q="name:Bilbo") + es.search(index="contacts,address", q="name:Bilbo") + es.search(index="*", q="name:Bilbo") + es.search(q="name:Bilbo") + es.cluster.health() + + if hasattr(es, "cat"): + es.cat.health() + if hasattr(es, "nodes"): + es.nodes.info() + if hasattr(es, "snapshot") and hasattr(es.snapshot, "status"): + es.snapshot.status() + if hasattr(es.indices, "status"): + es.indices.status() + + +_exercise_es = _exercise_es_v7 if ES_VERSION < (8, 0, 0) else _exercise_es_v8 + + # Test @validate_transaction_errors(errors=[]) @validate_transaction_metrics( - 'test_elasticsearch:test_elasticsearch_operation_disabled', - scoped_metrics=_disable_scoped_metrics, - rollup_metrics=_disable_rollup_metrics, - background_task=True) + "test_elasticsearch:test_elasticsearch_operation_disabled", + scoped_metrics=_disable_scoped_metrics, + rollup_metrics=_disable_rollup_metrics, + background_task=True, +) @override_application_settings(_disable_instance_settings) @background_task() -def 
test_elasticsearch_operation_disabled(): - client = Elasticsearch(ES_URL) +def test_elasticsearch_operation_disabled(client): _exercise_es(client) + @validate_transaction_errors(errors=[]) @validate_transaction_metrics( - 'test_elasticsearch:test_elasticsearch_operation_enabled', - scoped_metrics=_enable_scoped_metrics, - rollup_metrics=_enable_rollup_metrics, - background_task=True) + "test_elasticsearch:test_elasticsearch_operation_enabled", + scoped_metrics=_enable_scoped_metrics, + rollup_metrics=_enable_rollup_metrics, + background_task=True, +) @override_application_settings(_enable_instance_settings) @background_task() -def test_elasticsearch_operation_enabled(): - client = Elasticsearch(ES_URL) +def test_elasticsearch_operation_enabled(client): _exercise_es(client) diff --git a/tests/datastore_elasticsearch/test_instrumented_methods.py b/tests/datastore_elasticsearch/test_instrumented_methods.py index 28ca8f975..4ad88c2a5 100644 --- a/tests/datastore_elasticsearch/test_instrumented_methods.py +++ b/tests/datastore_elasticsearch/test_instrumented_methods.py @@ -11,61 +11,131 @@ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. - import elasticsearch import elasticsearch.client +import pytest +from conftest import ES_VERSION +from testing_support.validators.validate_datastore_trace_inputs import ( + validate_datastore_trace_inputs, +) + +from newrelic.api.background_task import background_task -from newrelic.hooks.datastore_elasticsearch import ( - _elasticsearch_client_methods, - _elasticsearch_client_indices_methods, - _elasticsearch_client_cat_methods, - _elasticsearch_client_cluster_methods, - _elasticsearch_client_nodes_methods, - _elasticsearch_client_snapshot_methods, - _elasticsearch_client_tasks_methods, - _elasticsearch_client_ingest_methods, +RUN_IF_V8 = pytest.mark.skipif( + ES_VERSION < (8,), reason="Only run for v8+. We don't support all methods in previous versions." 
) -def _test_methods_wrapped(object, method_name_tuples): - for method_name, _ in method_name_tuples: - method = getattr(object, method_name, None) - if method is not None: - err = '%s.%s isnt being wrapped' % (object, method) - assert hasattr(method, '__wrapped__'), err +@pytest.fixture +def client(client): + if ES_VERSION < (8, 0): + client.index( + index="contacts", doc_type="person", body={"name": "Joe Tester", "age": 25, "title": "QA Engineer"}, id=1 + ) + else: + client.index(index="contacts", body={"name": "Joe Tester", "age": 25, "title": "QA Engineer"}, id=1) + return client + + +@pytest.mark.parametrize( + "sub_module,method,args,kwargs,expected_index", + [ + (None, "exists", (), {"index": "contacts", "id": 1}, "contacts"), + (None, "info", (), {}, None), + pytest.param( + None, + "msearch", + (), + {"searches": [{}, {"query": {"match": {"message": "this is a test"}}}], "index": "contacts"}, + "contacts", + marks=RUN_IF_V8, + ), + ("indices", "exists", (), {"index": "contacts"}, "contacts"), + ("indices", "exists_template", (), {"name": "no-exist"}, None), + ("cat", "count", (), {"index": "contacts"}, "contacts"), + ("cat", "health", (), {}, None), + pytest.param( + "cluster", + "allocation_explain", + (), + {"index": "contacts", "shard": 0, "primary": True}, + "contacts", + marks=RUN_IF_V8, + ), + ("cluster", "get_settings", (), {}, None), + ("cluster", "health", (), {"index": "contacts"}, "contacts"), + ("nodes", "info", (), {}, None), + ("snapshot", "status", (), {}, None), + ("tasks", "list", (), {}, None), + ("ingest", "geo_ip_stats", (), {}, None), + ], +) +def test_method_on_client_datastore_trace_inputs(client, sub_module, method, args, kwargs, expected_index): + expected_operation = "%s.%s" % (sub_module, method) if sub_module else method + + @validate_datastore_trace_inputs(target=expected_index, operation=expected_operation) + @background_task() + def _test(): + if not sub_module: + getattr(client, method)(*args, **kwargs) + else: + getattr(getattr(client, sub_module), method)(*args, **kwargs) + + _test() + + +def _test_methods_wrapped(_object, ignored_methods=None): + if not ignored_methods: + ignored_methods = {"perform_request", "transport"} + + def is_wrapped(m): + return hasattr(getattr(_object, m), "__wrapped__") + + methods = {m for m in dir(_object) if not m[0] == "_"} + uninstrumented = {m for m in (methods - ignored_methods) if not is_wrapped(m)} + assert not uninstrumented, "There are uninstrumented methods: %s" % uninstrumented + + +@RUN_IF_V8 def test_instrumented_methods_client(): - _test_methods_wrapped(elasticsearch.Elasticsearch, - _elasticsearch_client_methods) + _test_methods_wrapped(elasticsearch.Elasticsearch) + +@RUN_IF_V8 def test_instrumented_methods_client_indices(): - _test_methods_wrapped(elasticsearch.client.IndicesClient, - _elasticsearch_client_indices_methods) + _test_methods_wrapped(elasticsearch.client.IndicesClient) + +@RUN_IF_V8 def test_instrumented_methods_client_cluster(): - _test_methods_wrapped(elasticsearch.client.ClusterClient, - _elasticsearch_client_cluster_methods) + _test_methods_wrapped(elasticsearch.client.ClusterClient) + +@RUN_IF_V8 def test_instrumented_methods_client_cat(): - if hasattr(elasticsearch.client, 'CatClient'): - _test_methods_wrapped(elasticsearch.client.CatClient, - _elasticsearch_client_cat_methods) + if hasattr(elasticsearch.client, "CatClient"): + _test_methods_wrapped(elasticsearch.client.CatClient) + +@RUN_IF_V8 def test_instrumented_methods_client_nodes(): - if hasattr(elasticsearch.client, 
'NodesClient'): - _test_methods_wrapped(elasticsearch.client.NodesClient, - _elasticsearch_client_nodes_methods) + if hasattr(elasticsearch.client, "NodesClient"): + _test_methods_wrapped(elasticsearch.client.NodesClient) + +@RUN_IF_V8 def test_instrumented_methods_client_snapshot(): - if hasattr(elasticsearch.client, 'SnapshotClient'): - _test_methods_wrapped(elasticsearch.client.SnapshotClient, - _elasticsearch_client_snapshot_methods) + if hasattr(elasticsearch.client, "SnapshotClient"): + _test_methods_wrapped(elasticsearch.client.SnapshotClient) + +@RUN_IF_V8 def test_instrumented_methods_client_tasks(): - if hasattr(elasticsearch.client, 'TasksClient'): - _test_methods_wrapped(elasticsearch.client.TasksClient, - _elasticsearch_client_tasks_methods) + if hasattr(elasticsearch.client, "TasksClient"): + _test_methods_wrapped(elasticsearch.client.TasksClient) + +@RUN_IF_V8 def test_instrumented_methods_client_ingest(): - if hasattr(elasticsearch.client, 'IngestClient'): - _test_methods_wrapped(elasticsearch.client.IngestClient, - _elasticsearch_client_ingest_methods) + if hasattr(elasticsearch.client, "IngestClient"): + _test_methods_wrapped(elasticsearch.client.IngestClient) diff --git a/tests/datastore_elasticsearch/test_mget.py b/tests/datastore_elasticsearch/test_mget.py index 417b231d6..f3f7c0979 100644 --- a/tests/datastore_elasticsearch/test_mget.py +++ b/tests/datastore_elasticsearch/test_mget.py @@ -13,42 +13,43 @@ # limitations under the License. import pytest - from elasticsearch import Elasticsearch -from elasticsearch.connection_pool import RoundRobinSelector -from testing_support.fixtures import (validate_transaction_metrics, - override_application_settings) -from testing_support.db_settings import elasticsearch_settings +try: + from elastic_transport import RoundRobinSelector +except ImportError: + from elasticsearch.connection_pool import RoundRobinSelector + +from conftest import ES_MULTIPLE_SETTINGS, ES_VERSION +from testing_support.fixtures import override_application_settings from testing_support.util import instance_hostname +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) from newrelic.api.background_task import background_task -ES_MULTIPLE_SETTINGS = elasticsearch_settings() - # Settings _enable_instance_settings = { - 'datastore_tracer.instance_reporting.enabled': True, + "datastore_tracer.instance_reporting.enabled": True, } _disable_instance_settings = { - 'datastore_tracer.instance_reporting.enabled': False, + "datastore_tracer.instance_reporting.enabled": False, } # Metrics -_base_scoped_metrics = ( - ('Datastore/statement/Elasticsearch/contacts/index', 2), -) +_base_scoped_metrics = (("Datastore/statement/Elasticsearch/contacts/index", 2),) _base_rollup_metrics = ( - ('Datastore/all', 3), - ('Datastore/allOther', 3), - ('Datastore/Elasticsearch/all', 3), - ('Datastore/Elasticsearch/allOther', 3), - ('Datastore/operation/Elasticsearch/index', 2), - ('Datastore/operation/Elasticsearch/mget', 1), - ('Datastore/statement/Elasticsearch/contacts/index', 2), + ("Datastore/all", 3), + ("Datastore/allOther", 3), + ("Datastore/Elasticsearch/all", 3), + ("Datastore/Elasticsearch/allOther", 3), + ("Datastore/operation/Elasticsearch/index", 2), + ("Datastore/operation/Elasticsearch/mget", 1), + ("Datastore/statement/Elasticsearch/contacts/index", 2), ) _disable_scoped_metrics = list(_base_scoped_metrics) @@ -61,89 +62,101 @@ es_1 = ES_MULTIPLE_SETTINGS[0] es_2 = ES_MULTIPLE_SETTINGS[1] - host_1 = 
instance_hostname(es_1['host']) - port_1 = es_1['port'] + host_1 = instance_hostname(es_1["host"]) + port_1 = es_1["port"] - host_2 = instance_hostname(es_2['host']) - port_2 = es_2['port'] + host_2 = instance_hostname(es_2["host"]) + port_2 = es_2["port"] - instance_metric_name_1 = 'Datastore/instance/Elasticsearch/%s/%s' % ( - host_1, port_1) - instance_metric_name_2 = 'Datastore/instance/Elasticsearch/%s/%s' % ( - host_2, port_2) + instance_metric_name_1 = "Datastore/instance/Elasticsearch/%s/%s" % (host_1, port_1) + instance_metric_name_2 = "Datastore/instance/Elasticsearch/%s/%s" % (host_2, port_2) - _enable_rollup_metrics.extend([ + _enable_rollup_metrics.extend( + [ (instance_metric_name_1, 2), (instance_metric_name_2, 1), - ]) + ] + ) - _disable_rollup_metrics.extend([ + _disable_rollup_metrics.extend( + [ (instance_metric_name_1, None), (instance_metric_name_2, None), - ]) + ] + ) + + +@pytest.fixture(scope="module") +def client(): + urls = ["http://%s:%s" % (db["host"], db["port"]) for db in ES_MULTIPLE_SETTINGS] + # When selecting a connection from the pool, use the round robin method. + # This is actually the default already. Using round robin will ensure that + # doing two db calls will mean elastic search is talking to two different + # dbs. + if ES_VERSION >= (8,): + client = Elasticsearch(urls, node_selector_class=RoundRobinSelector, randomize_hosts=False) + else: + client = Elasticsearch(urls, selector_class=RoundRobinSelector, randomize_hosts=False) + return client + # Query + def _exercise_es_multi(es): # set on db 1 - es.index(index='contacts', doc_type='person', - body={'name': 'Joe Tester', 'age': 25, 'title': 'QA Engineer'}, - id=1) - - # set on db 2 - es.index(index='contacts', doc_type='person', - body={'name': 'Jane Tester', 'age': 22, 'title': 'Senior QA Engineer'}, - id=2) + if ES_VERSION >= (8,): + es.index(index="contacts", body={"name": "Joe Tester", "age": 25, "title": "QA Engineer"}, id=1) + # set on db 2 + es.index(index="contacts", body={"name": "Jane Tester", "age": 22, "title": "Senior QA Engineer"}, id=2) + else: + es.index( + index="contacts", doc_type="person", body={"name": "Joe Tester", "age": 25, "title": "QA Engineer"}, id=1 + ) + # set on db 2 + es.index( + index="contacts", + doc_type="person", + body={"name": "Jane Tester", "age": 22, "title": "Senior QA Engineer"}, + id=2, + ) # ask db 1, will return info from db 1 and 2 mget_body = { - 'docs': [ - {'_id': 1, '_index': 'contacts'}, - {'_id': 2, '_index': 'contacts'}, + "docs": [ + {"_id": 1, "_index": "contacts"}, + {"_id": 2, "_index": "contacts"}, ] } - results = es.mget(mget_body) - assert len(results['docs']) == 2 + results = es.mget(body=mget_body) + assert len(results["docs"]) == 2 + # Test -@pytest.mark.skipif(len(ES_MULTIPLE_SETTINGS) < 2, - reason='Test environment not configured with multiple databases.') + +@pytest.mark.skipif(len(ES_MULTIPLE_SETTINGS) < 2, reason="Test environment not configured with multiple databases.") @override_application_settings(_enable_instance_settings) @validate_transaction_metrics( - 'test_mget:test_multi_get_enabled', - scoped_metrics=_enable_scoped_metrics, - rollup_metrics=_enable_rollup_metrics, - background_task=True) + "test_mget:test_multi_get_enabled", + scoped_metrics=_enable_scoped_metrics, + rollup_metrics=_enable_rollup_metrics, + background_task=True, +) @background_task() -def test_multi_get_enabled(): - urls = ['http://%s:%s' % (db['host'], db['port']) for db in - ES_MULTIPLE_SETTINGS] - # When selecting a connection from the pool, use 
the round robin method. - # This is actually the default already. Using round robin will ensure that - # doing two db calls will mean elastic search is talking to two different - # dbs. - client = Elasticsearch(urls, selector_class=RoundRobinSelector, - randomize_hosts=False) +def test_multi_get_enabled(client): _exercise_es_multi(client) -@pytest.mark.skipif(len(ES_MULTIPLE_SETTINGS) < 2, - reason='Test environment not configured with multiple databases.') + +@pytest.mark.skipif(len(ES_MULTIPLE_SETTINGS) < 2, reason="Test environment not configured with multiple databases.") @override_application_settings(_disable_instance_settings) @validate_transaction_metrics( - 'test_mget:test_multi_get_disabled', - scoped_metrics=_disable_scoped_metrics, - rollup_metrics=_disable_rollup_metrics, - background_task=True) + "test_mget:test_multi_get_disabled", + scoped_metrics=_disable_scoped_metrics, + rollup_metrics=_disable_rollup_metrics, + background_task=True, +) @background_task() -def test_multi_get_disabled(): - urls = ['http://%s:%s' % (db['host'], db['port']) for db in - ES_MULTIPLE_SETTINGS] - # When selecting a connection from the pool, use the round robin method. - # This is actually the default already. Using round robin will ensure that - # doing two db calls will mean elastic search is talking to two different - # dbs. - client = Elasticsearch(urls, selector_class=RoundRobinSelector, - randomize_hosts=False) +def test_multi_get_disabled(client): _exercise_es_multi(client) diff --git a/tests/datastore_elasticsearch/test_multiple_dbs.py b/tests/datastore_elasticsearch/test_multiple_dbs.py index b4b1559a7..71c47b168 100644 --- a/tests/datastore_elasticsearch/test_multiple_dbs.py +++ b/tests/datastore_elasticsearch/test_multiple_dbs.py @@ -13,40 +13,36 @@ # limitations under the License. 
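A quick aside on the test_mget rewrite above: elasticsearch 8.x moved the selector classes into the separate elastic_transport package and renamed the client keyword from selector_class to node_selector_class, which is why the new client fixture branches on ES_VERSION. Below is a minimal standalone sketch of the same shim, assuming a hypothetical two-node cluster; the URLs and document bodies are illustrative only.

    try:
        from elastic_transport import RoundRobinSelector  # elasticsearch >= 8
        IS_V8 = True
    except ImportError:
        from elasticsearch.connection_pool import RoundRobinSelector  # elasticsearch < 8
        IS_V8 = False

    from elasticsearch import Elasticsearch

    urls = ["http://localhost:9200", "http://localhost:9201"]  # hypothetical nodes
    if IS_V8:
        client = Elasticsearch(urls, node_selector_class=RoundRobinSelector, randomize_hosts=False)
    else:
        client = Elasticsearch(urls, selector_class=RoundRobinSelector, randomize_hosts=False)

    # Round robin means consecutive calls alternate nodes, so two index calls
    # exercise both databases; note doc_type is gone from the v8 API.
    if IS_V8:
        client.index(index="contacts", body={"name": "Joe Tester"}, id=1)
        client.index(index="contacts", body={"name": "Jane Tester"}, id=2)
    else:
        client.index(index="contacts", doc_type="person", body={"name": "Joe Tester"}, id=1)
        client.index(index="contacts", doc_type="person", body={"name": "Jane Tester"}, id=2)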
import pytest - +from conftest import ES_MULTIPLE_SETTINGS, ES_VERSION from elasticsearch import Elasticsearch - -from testing_support.fixtures import (validate_transaction_metrics, - override_application_settings) -from testing_support.db_settings import elasticsearch_settings +from testing_support.fixtures import override_application_settings from testing_support.util import instance_hostname +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) from newrelic.api.background_task import background_task -ES_MULTIPLE_SETTINGS = elasticsearch_settings() - # Settings _enable_instance_settings = { - 'datastore_tracer.instance_reporting.enabled': True, + "datastore_tracer.instance_reporting.enabled": True, } _disable_instance_settings = { - 'datastore_tracer.instance_reporting.enabled': False, + "datastore_tracer.instance_reporting.enabled": False, } # Metrics -_base_scoped_metrics = ( - ('Datastore/statement/Elasticsearch/contacts/index', 2), -) +_base_scoped_metrics = (("Datastore/statement/Elasticsearch/contacts/index", 2),) _base_rollup_metrics = ( - ('Datastore/all', 2), - ('Datastore/allOther', 2), - ('Datastore/Elasticsearch/all', 2), - ('Datastore/Elasticsearch/allOther', 2), - ('Datastore/operation/Elasticsearch/index', 2), - ('Datastore/statement/Elasticsearch/contacts/index', 2), + ("Datastore/all", 2), + ("Datastore/allOther", 2), + ("Datastore/Elasticsearch/all", 2), + ("Datastore/Elasticsearch/allOther", 2), + ("Datastore/operation/Elasticsearch/index", 2), + ("Datastore/statement/Elasticsearch/contacts/index", 2), ) _disable_scoped_metrics = list(_base_scoped_metrics) @@ -59,61 +55,71 @@ es_1 = ES_MULTIPLE_SETTINGS[0] es_2 = ES_MULTIPLE_SETTINGS[1] - host_1 = instance_hostname(es_1['host']) - port_1 = es_1['port'] + host_1 = instance_hostname(es_1["host"]) + port_1 = es_1["port"] - host_2 = instance_hostname(es_2['host']) - port_2 = es_2['port'] + host_2 = instance_hostname(es_2["host"]) + port_2 = es_2["port"] - instance_metric_name_1 = 'Datastore/instance/Elasticsearch/%s/%s' % ( - host_1, port_1) - instance_metric_name_2 = 'Datastore/instance/Elasticsearch/%s/%s' % ( - host_2, port_2) + instance_metric_name_1 = "Datastore/instance/Elasticsearch/%s/%s" % (host_1, port_1) + instance_metric_name_2 = "Datastore/instance/Elasticsearch/%s/%s" % (host_2, port_2) - _enable_rollup_metrics.extend([ + _enable_rollup_metrics.extend( + [ (instance_metric_name_1, 1), (instance_metric_name_2, 1), - ]) + ] + ) - _disable_rollup_metrics.extend([ + _disable_rollup_metrics.extend( + [ (instance_metric_name_1, None), (instance_metric_name_2, None), - ]) + ] + ) # Query + def _exercise_es(es): - es.index(index='contacts', doc_type='person', - body={'name': 'Joe Tester', 'age': 25, 'title': 'QA Engineer'}, id=1) + if ES_VERSION >= (8,): + es.index(index="contacts", body={"name": "Joe Tester", "age": 25, "title": "QA Engineer"}, id=1) + else: + es.index( + index="contacts", doc_type="person", body={"name": "Joe Tester", "age": 25, "title": "QA Engineer"}, id=1 + ) + # Test -@pytest.mark.skipif(len(ES_MULTIPLE_SETTINGS) < 2, - reason='Test environment not configured with multiple databases.') + +@pytest.mark.skipif(len(ES_MULTIPLE_SETTINGS) < 2, reason="Test environment not configured with multiple databases.") @override_application_settings(_enable_instance_settings) @validate_transaction_metrics( - 'test_multiple_dbs:test_multiple_dbs_enabled', - scoped_metrics=_enable_scoped_metrics, - rollup_metrics=_enable_rollup_metrics, - 
background_task=True) + "test_multiple_dbs:test_multiple_dbs_enabled", + scoped_metrics=_enable_scoped_metrics, + rollup_metrics=_enable_rollup_metrics, + background_task=True, +) @background_task() def test_multiple_dbs_enabled(): for db in ES_MULTIPLE_SETTINGS: - es_url = 'http://%s:%s' % (db['host'], db['port']) + es_url = "http://%s:%s" % (db["host"], db["port"]) client = Elasticsearch(es_url) _exercise_es(client) -@pytest.mark.skipif(len(ES_MULTIPLE_SETTINGS) < 2, - reason='Test environment not configured with multiple databases.') + +@pytest.mark.skipif(len(ES_MULTIPLE_SETTINGS) < 2, reason="Test environment not configured with multiple databases.") @override_application_settings(_disable_instance_settings) @validate_transaction_metrics( - 'test_multiple_dbs:test_multiple_dbs_disabled', - scoped_metrics=_disable_scoped_metrics, - rollup_metrics=_disable_rollup_metrics, - background_task=True) + "test_multiple_dbs:test_multiple_dbs_disabled", + scoped_metrics=_disable_scoped_metrics, + rollup_metrics=_disable_rollup_metrics, + background_task=True, +) @background_task() def test_multiple_dbs_disabled(): for db in ES_MULTIPLE_SETTINGS: - es_url = 'http://%s:%s' % (db['host'], db['port']) + es_url = "http://%s:%s" % (db["host"], db["port"]) client = Elasticsearch(es_url) _exercise_es(client) diff --git a/tests/datastore_elasticsearch/test_trace_node.py b/tests/datastore_elasticsearch/test_trace_node.py index 65e773340..af96b80b4 100644 --- a/tests/datastore_elasticsearch/test_trace_node.py +++ b/tests/datastore_elasticsearch/test_trace_node.py @@ -12,102 +12,102 @@ # See the License for the specific language governing permissions and # limitations under the License. -from elasticsearch import Elasticsearch - -from testing_support.fixtures import (validate_tt_collector_json, - override_application_settings, validate_tt_parenting) -from testing_support.db_settings import elasticsearch_settings +from testing_support.fixtures import ( + override_application_settings, + validate_tt_parenting, +) from testing_support.util import instance_hostname +from testing_support.validators.validate_tt_collector_json import ( + validate_tt_collector_json, +) from newrelic.api.background_task import background_task -ES_SETTINGS = elasticsearch_settings()[0] -ES_URL = 'http://%s:%s' % (ES_SETTINGS['host'], ES_SETTINGS['port']) +from conftest import ES_SETTINGS, ES_VERSION # Settings _enable_instance_settings = { - 'datastore_tracer.instance_reporting.enabled': True, - 'datastore_tracer.database_name_reporting.enabled': True, + "datastore_tracer.instance_reporting.enabled": True, + "datastore_tracer.database_name_reporting.enabled": True, } _disable_instance_settings = { - 'datastore_tracer.instance_reporting.enabled': False, - 'datastore_tracer.database_name_reporting.enabled': False, + "datastore_tracer.instance_reporting.enabled": False, + "datastore_tracer.database_name_reporting.enabled": False, } _instance_only_settings = { - 'datastore_tracer.instance_reporting.enabled': True, - 'datastore_tracer.database_name_reporting.enabled': False, + "datastore_tracer.instance_reporting.enabled": True, + "datastore_tracer.database_name_reporting.enabled": False, } # Expected parameters _enabled_required = { - 'host': instance_hostname(ES_SETTINGS['host']), - 'port_path_or_id': str(ES_SETTINGS['port']), + "host": instance_hostname(ES_SETTINGS["host"]), + "port_path_or_id": str(ES_SETTINGS["port"]), } _enabled_forgone = { - 'db.instance': 'VALUE NOT USED', + "db.instance": "VALUE NOT USED", } _disabled_required = 
{} _disabled_forgone = { - 'host': 'VALUE NOT USED', - 'port_path_or_id': 'VALUE NOT USED', - 'db.instance': 'VALUE NOT USED', + "host": "VALUE NOT USED", + "port_path_or_id": "VALUE NOT USED", + "db.instance": "VALUE NOT USED", } _instance_only_required = { - 'host': instance_hostname(ES_SETTINGS['host']), - 'port_path_or_id': str(ES_SETTINGS['port']), + "host": instance_hostname(ES_SETTINGS["host"]), + "port_path_or_id": str(ES_SETTINGS["port"]), } _instance_only_forgone = { - 'db.instance': 'VALUE NOT USED', + "db.instance": "VALUE NOT USED", } _tt_parenting = ( - 'TransactionNode', [ - ('DatastoreNode', []), + "TransactionNode", + [ + ("DatastoreNode", []), ], ) # Query -def _exercise_es(es): - es.index(index='contacts', doc_type='person', - body={'name': 'Joe Tester', 'age': 25, 'title': 'QA Master'}, id=1) + +def _exercise_es_v7(es): + es.index(index="contacts", doc_type="person", body={"name": "Joe Tester", "age": 25, "title": "QA Master"}, id=1) +def _exercise_es_v8(es): + es.index(index="contacts", body={"name": "Joe Tester", "age": 25, "title": "QA Master"}, id=1) + + +_exercise_es = _exercise_es_v7 if ES_VERSION < (8, 0, 0) else _exercise_es_v8 + # Tests + @override_application_settings(_enable_instance_settings) -@validate_tt_collector_json( - datastore_params=_enabled_required, - datastore_forgone_params=_enabled_forgone) +@validate_tt_collector_json(datastore_params=_enabled_required, datastore_forgone_params=_enabled_forgone) @validate_tt_parenting(_tt_parenting) @background_task() -def test_trace_node_datastore_params_enable_instance(): - client = Elasticsearch(ES_URL) +def test_trace_node_datastore_params_enable_instance(client): _exercise_es(client) @override_application_settings(_disable_instance_settings) -@validate_tt_collector_json( - datastore_params=_disabled_required, - datastore_forgone_params=_disabled_forgone) +@validate_tt_collector_json(datastore_params=_disabled_required, datastore_forgone_params=_disabled_forgone) @validate_tt_parenting(_tt_parenting) @background_task() -def test_trace_node_datastore_params_disable_instance(): - client = Elasticsearch(ES_URL) +def test_trace_node_datastore_params_disable_instance(client): _exercise_es(client) @override_application_settings(_instance_only_settings) -@validate_tt_collector_json( - datastore_params=_instance_only_required, - datastore_forgone_params=_instance_only_forgone) +@validate_tt_collector_json(datastore_params=_instance_only_required, datastore_forgone_params=_instance_only_forgone) @validate_tt_parenting(_tt_parenting) @background_task() -def test_trace_node_datastore_params_instance_only(): - client = Elasticsearch(ES_URL) +def test_trace_node_datastore_params_instance_only(client): _exercise_es(client) diff --git a/tests/datastore_elasticsearch/test_transport.py b/tests/datastore_elasticsearch/test_transport.py index 49896ba07..a091a9a92 100644 --- a/tests/datastore_elasticsearch/test_transport.py +++ b/tests/datastore_elasticsearch/test_transport.py @@ -1,6 +1,6 @@ # Copyright 2010 New Relic, Inc. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # @@ -12,63 +12,99 @@ # See the License for the specific language governing permissions and # limitations under the License.
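The test_transport.py rewrite that follows leans on one trick worth calling out: alias the v8 transport names to their v7 counterparts at import time so the rest of the module stays version-agnostic; on v7 a plain dict is a valid host entry, so NodeConfig can simply be aliased to dict. A condensed sketch, assuming a hypothetical local node:

    try:
        # elasticsearch < 8: connection classes live in the client package
        from elasticsearch.connection.http_urllib3 import Urllib3HttpConnection
        from elasticsearch.transport import Transport

        NodeConfig = dict  # v7 transports accept plain host dicts
    except ImportError:
        # elasticsearch >= 8: the transport layer is the elastic_transport package
        from elastic_transport._models import NodeConfig
        from elastic_transport._node._http_urllib3 import Urllib3HttpNode as Urllib3HttpConnection
        from elastic_transport._transport import Transport

    HOST = NodeConfig(scheme="http", host="localhost", port=9200)  # hypothetical node
    transport = Transport([HOST])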
-from elasticsearch import VERSION -from elasticsearch.client.utils import _make_path -from elasticsearch.transport import Transport -from elasticsearch.connection.http_requests import RequestsHttpConnection -from elasticsearch.connection.http_urllib3 import Urllib3HttpConnection +import pytest +from conftest import ES_SETTINGS, ES_VERSION from elasticsearch.serializer import JSONSerializer -from newrelic.api.application import application_instance as application -from newrelic.api.background_task import BackgroundTask +from newrelic.api.background_task import background_task +from newrelic.api.transaction import current_transaction -from testing_support.db_settings import elasticsearch_settings +try: + from elasticsearch.connection.http_requests import RequestsHttpConnection + from elasticsearch.connection.http_urllib3 import Urllib3HttpConnection + from elasticsearch.transport import Transport -ES_SETTINGS = elasticsearch_settings()[0] -HOST = { - 'host':ES_SETTINGS['host'], - 'port': int(ES_SETTINGS['port']) -} -INDEX = 'contacts' -DOC_TYPE = 'person' -ID = 1 -METHOD = _make_path(INDEX, DOC_TYPE, ID) -PARAMS = {} -HEADERS = {"Content-Type": "application/json"} -DATA = {"name": "Joe Tester"} -BODY = JSONSerializer().dumps(DATA).encode('utf-8') + NodeConfig = dict +except ImportError: + from elastic_transport._models import NodeConfig + from elastic_transport._node._http_requests import ( + RequestsHttpNode as RequestsHttpConnection, + ) + from elastic_transport._node._http_urllib3 import ( + Urllib3HttpNode as Urllib3HttpConnection, + ) + from elastic_transport._transport import Transport -def test_transport_get_connection(): - app = application() - with BackgroundTask(app, 'transport_perform_request') as transaction: - transport = Transport([HOST]) - transport.get_connection() +IS_V8 = ES_VERSION >= (8,) +IS_V7 = ES_VERSION >= (7,) and ES_VERSION < (8, 0) +IS_BELOW_V7 = ES_VERSION < (7,) - expected = (ES_SETTINGS['host'], ES_SETTINGS['port'], None) - assert transaction._nr_datastore_instance_info == expected +RUN_IF_V8 = pytest.mark.skipif(IS_V7 or IS_BELOW_V7, reason="Only run for v8+") +RUN_IF_V7 = pytest.mark.skipif(IS_V8 or IS_BELOW_V7, reason="Only run for v7") +RUN_IF_BELOW_V7 = pytest.mark.skipif(not IS_BELOW_V7, reason="Only run for versions below v7") -def test_transport_perform_request_urllib3(): - app = application() - with BackgroundTask(app, 'perform_request_urllib3') as transaction: - transport = Transport([HOST], connection_class=Urllib3HttpConnection) - if VERSION >= (7, 16, 0): - transport.perform_request('POST', METHOD, headers=HEADERS, params=PARAMS, body=DATA) - else: - transport.perform_request('POST', METHOD, params=PARAMS, body=DATA) - expected = (ES_SETTINGS['host'], ES_SETTINGS['port'], None) - assert transaction._nr_datastore_instance_info == expected +HOST = NodeConfig(scheme="http", host=ES_SETTINGS["host"], port=int(ES_SETTINGS["port"])) + +METHOD = "/contacts/person/1" +HEADERS = {"Content-Type": "application/json"} +DATA = {"name": "Joe Tester"} + +BODY = JSONSerializer().dumps(DATA) +if hasattr(BODY, "encode"): + BODY = BODY.encode("utf-8") + +@pytest.mark.parametrize( + "transport_kwargs, perform_request_kwargs", + [ + pytest.param({}, {"body": DATA}, id="DefaultTransport_below_v7", marks=RUN_IF_BELOW_V7), + pytest.param({}, {"headers": HEADERS, "body": DATA}, id="DefaultTransport_v7+", marks=pytest.mark.skipif(IS_BELOW_V7, reason="Only run for v7+")), + pytest.param( + {"connection_class": Urllib3HttpConnection}, + {"body": DATA}, + id="Urllib3HttpConnectionv7", + marks=RUN_IF_BELOW_V7, +
), + pytest.param( + {"connection_class": RequestsHttpConnection}, + {"body": DATA}, + id="RequestsHttpConnectionv7", + marks=RUN_IF_BELOW_V7, + ), + pytest.param( + {"connection_class": Urllib3HttpConnection}, + {"headers": HEADERS, "body": DATA}, + id="Urllib3HttpConnectionv7", + marks=RUN_IF_V7, + ), + pytest.param( + {"connection_class": RequestsHttpConnection}, + {"headers": HEADERS, "body": DATA}, + id="RequestsHttpConnectionv7", + marks=RUN_IF_V7, + ), + pytest.param( + {"node_class": Urllib3HttpConnection}, + {"headers": HEADERS, "body": DATA}, + id="Urllib3HttpNodev8", + marks=RUN_IF_V8, + ), + pytest.param( + {"node_class": RequestsHttpConnection}, + {"headers": HEADERS, "body": DATA}, + id="RequestsHttpNodev8", + marks=RUN_IF_V8, + ), + ], +) +@background_task() +def test_transport_connection_classes(transport_kwargs, perform_request_kwargs): + transaction = current_transaction() -def test_transport_perform_request_requests(): - app = application() - with BackgroundTask(app, 'perform_request_requests') as transaction: - transport = Transport([HOST], connection_class=RequestsHttpConnection) - if VERSION >= (7, 16, 0): - transport.perform_request('POST', METHOD, headers=HEADERS, params=PARAMS, body=DATA) - else: - transport.perform_request('POST', METHOD, params=PARAMS, body=DATA) + transport = Transport([HOST], **transport_kwargs) + transport.perform_request("POST", METHOD, **perform_request_kwargs) - expected = (ES_SETTINGS['host'], ES_SETTINGS['port'], None) + expected = (ES_SETTINGS["host"], ES_SETTINGS["port"], None) assert transaction._nr_datastore_instance_info == expected diff --git a/tests/datastore_memcache/conftest.py b/tests/datastore_memcache/conftest.py index d19451200..835e895bd 100644 --- a/tests/datastore_memcache/conftest.py +++ b/tests/datastore_memcache/conftest.py @@ -17,17 +17,10 @@ import pytest import memcache -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 from testing_support.db_settings import memcached_settings -_coverage_source = [ - 'newrelic.api.memcache_trace', - 'newrelic.hooks.datastore_memcache', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/datastore_memcache/test_memcache.py b/tests/datastore_memcache/test_memcache.py index d8afab3b1..a66c114ee 100644 --- a/tests/datastore_memcache/test_memcache.py +++ b/tests/datastore_memcache/test_memcache.py @@ -14,8 +14,8 @@ import memcache -from testing_support.fixtures import (validate_transaction_metrics, - override_application_settings) +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.db_settings import memcached_settings from testing_support.util import instance_hostname diff --git a/tests/datastore_memcache/test_multiple_dbs.py b/tests/datastore_memcache/test_multiple_dbs.py index b83d7dfcc..dbc3ea2b3 100644 --- a/tests/datastore_memcache/test_multiple_dbs.py +++ b/tests/datastore_memcache/test_multiple_dbs.py @@ -17,8 +17,8 @@ from newrelic.api.background_task import background_task -from testing_support.fixtures import (validate_transaction_metrics, - override_application_settings) +from testing_support.fixtures import 
override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.db_settings import memcached_settings from testing_support.util import instance_hostname diff --git a/tests/datastore_mysql/conftest.py b/tests/datastore_mysql/conftest.py index 9b490b057..a2f74c398 100644 --- a/tests/datastore_mysql/conftest.py +++ b/tests/datastore_mysql/conftest.py @@ -15,15 +15,8 @@ import pytest import os -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.database_mysql', - 'newrelic.hooks.database_dbapi2', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/datastore_mysql/test_database.py b/tests/datastore_mysql/test_database.py index 0991d6df0..2fc8ca129 100644 --- a/tests/datastore_mysql/test_database.py +++ b/tests/datastore_mysql/test_database.py @@ -14,7 +14,7 @@ import mysql.connector -from testing_support.fixtures import validate_transaction_metrics +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.validators.validate_database_trace_inputs import validate_database_trace_inputs from testing_support.db_settings import mysql_settings diff --git a/tests/datastore_postgresql/conftest.py b/tests/datastore_postgresql/conftest.py index 741b53078..4a25f2574 100644 --- a/tests/datastore_postgresql/conftest.py +++ b/tests/datastore_postgresql/conftest.py @@ -12,28 +12,22 @@ # See the License for the specific language governing permissions and # limitations under the License. 
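Stepping back to the transport test a few hunks up, the heart of its assertion is that the agent stashes the datastore instance info of the last call on the active transaction. A hedged sketch of that check; _nr_datastore_instance_info is an agent-internal attribute, the target path mirrors the one the test uses, and a v8 transport may additionally require a headers argument:

    from newrelic.api.background_task import background_task
    from newrelic.api.transaction import current_transaction

    @background_task()
    def check_instance_info(transport, host, port):
        # The instrumented perform_request records (host, port, database_name);
        # Elasticsearch has no database name, hence None in the expected tuple.
        transport.perform_request("POST", "/contacts/person/1", body={"name": "Joe Tester"})
        assert current_transaction()._nr_datastore_instance_info == (host, port, None)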
-import pytest - -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) - -_coverage_source = [ - 'newrelic.hooks.database_postgresql', - 'newrelic.hooks.database_dbapi2', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) +from testing_support.fixtures import ( # noqa: F401; pylint: disable=W0611 + collector_agent_registration_fixture, + collector_available_fixture, +) _default_settings = { - 'transaction_tracer.explain_threshold': 0.0, - 'transaction_tracer.transaction_threshold': 0.0, - 'transaction_tracer.stack_trace_threshold': 0.0, - 'debug.log_data_collector_payloads': True, - 'debug.record_transaction_failure': True, - 'debug.log_explain_plan_queries': True + "transaction_tracer.explain_threshold": 0.0, + "transaction_tracer.transaction_threshold": 0.0, + "transaction_tracer.stack_trace_threshold": 0.0, + "debug.log_data_collector_payloads": True, + "debug.record_transaction_failure": True, + "debug.log_explain_plan_queries": True, } collector_agent_registration = collector_agent_registration_fixture( - app_name='Python Agent Test (datastore_postgresql)', - default_settings=_default_settings, - linked_applications=['Python Agent Test (datastore)']) + app_name="Python Agent Test (datastore_postgresql)", + default_settings=_default_settings, + linked_applications=["Python Agent Test (datastore)"], +) diff --git a/tests/datastore_postgresql/test_database.py b/tests/datastore_postgresql/test_database.py index de53808c6..19070880b 100644 --- a/tests/datastore_postgresql/test_database.py +++ b/tests/datastore_postgresql/test_database.py @@ -13,15 +13,13 @@ # limitations under the License. import postgresql.driver.dbapi20 - - -from testing_support.fixtures import validate_transaction_metrics - +from testing_support.db_settings import postgresql_settings from testing_support.validators.validate_database_trace_inputs import ( validate_database_trace_inputs, ) - -from testing_support.db_settings import postgresql_settings +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) from newrelic.api.background_task import background_task @@ -41,13 +39,14 @@ ("Datastore/operation/Postgres/create", 1), ("Datastore/operation/Postgres/commit", 3), ("Datastore/operation/Postgres/rollback", 1), + ("Datastore/operation/Postgres/other", 1), ] _test_execute_via_cursor_rollup_metrics = [ - ("Datastore/all", 13), - ("Datastore/allOther", 13), - ("Datastore/Postgres/all", 13), - ("Datastore/Postgres/allOther", 13), + ("Datastore/all", 14), + ("Datastore/allOther", 14), + ("Datastore/Postgres/all", 14), + ("Datastore/Postgres/allOther", 14), ("Datastore/operation/Postgres/select", 1), ("Datastore/statement/Postgres/%s/select" % DB_SETTINGS["table_name"], 1), ("Datastore/operation/Postgres/insert", 1), @@ -63,6 +62,10 @@ ("Datastore/operation/Postgres/call", 2), ("Datastore/operation/Postgres/commit", 3), ("Datastore/operation/Postgres/rollback", 1), + ("Datastore/operation/Postgres/other", 1), + ("Function/postgresql.driver.dbapi20:connect", 1), + ("Function/postgresql.driver.dbapi20:Connection.__enter__", 1), + ("Function/postgresql.driver.dbapi20:Connection.__exit__", 1), ] @@ -82,30 +85,27 @@ def test_execute_via_cursor(): host=DB_SETTINGS["host"], port=DB_SETTINGS["port"], ) as connection: - cursor = connection.cursor() cursor.execute("""drop table if exists %s""" % DB_SETTINGS["table_name"]) - cursor.execute( - """create table %s """ % DB_SETTINGS["table_name"] - 
+ """(a integer, b real, c text)""" - ) + cursor.execute("""create table %s """ % DB_SETTINGS["table_name"] + """(a integer, b real, c text)""") cursor.executemany( - """insert into %s """ % DB_SETTINGS["table_name"] - + """values (%s, %s, %s)""", + """insert into %s """ % DB_SETTINGS["table_name"] + """values (%s, %s, %s)""", [(1, 1.0, "1.0"), (2, 2.2, "2.2"), (3, 3.3, "3.3")], ) cursor.execute("""select * from %s""" % DB_SETTINGS["table_name"]) - for row in cursor: - pass + cursor.execute( + """with temporaryTable (averageValue) as (select avg(b) from %s) """ % DB_SETTINGS["table_name"] + + """select * from %s,temporaryTable """ % DB_SETTINGS["table_name"] + + """where %s.b > temporaryTable.averageValue""" % DB_SETTINGS["table_name"] + ) cursor.execute( - """update %s """ % DB_SETTINGS["table_name"] - + """set a=%s, b=%s, c=%s where a=%s""", + """update %s """ % DB_SETTINGS["table_name"] + """set a=%s, b=%s, c=%s where a=%s""", (4, 4.0, "4.0", 1), ) @@ -152,7 +152,6 @@ def test_rollback_on_exception(): host=DB_SETTINGS["host"], port=DB_SETTINGS["port"], ): - raise RuntimeError("error") except RuntimeError: diff --git a/tests/datastore_psycopg2/conftest.py b/tests/datastore_psycopg2/conftest.py index 47bfd9f16..dd271909d 100644 --- a/tests/datastore_psycopg2/conftest.py +++ b/tests/datastore_psycopg2/conftest.py @@ -14,15 +14,8 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.database_psycopg2', - 'newrelic.hooks.database_dbapi2', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/datastore_psycopg2/test_async.py b/tests/datastore_psycopg2/test_async.py index 78df2beca..7af9adc6a 100644 --- a/tests/datastore_psycopg2/test_async.py +++ b/tests/datastore_psycopg2/test_async.py @@ -16,11 +16,11 @@ import psycopg2.extras import pytest -from testing_support.fixtures import (validate_transaction_metrics, - validate_transaction_errors, +from testing_support.fixtures import ( validate_stats_engine_explain_plan_output_is_none, override_application_settings) - +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics +from testing_support.validators.validate_transaction_errors import validate_transaction_errors from testing_support.validators.validate_database_trace_inputs import validate_database_trace_inputs from testing_support.validators.validate_transaction_slow_sql_count import ( validate_transaction_slow_sql_count) diff --git a/tests/datastore_psycopg2/test_cursor.py b/tests/datastore_psycopg2/test_cursor.py index e7a549c13..d66d73ff8 100644 --- a/tests/datastore_psycopg2/test_cursor.py +++ b/tests/datastore_psycopg2/test_cursor.py @@ -21,8 +21,8 @@ except ImportError: sql = None -from testing_support.fixtures import (validate_transaction_metrics, - override_application_settings) +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.validators.validate_database_trace_inputs import validate_database_trace_inputs from testing_support.util import instance_hostname from utils import DB_SETTINGS diff --git 
a/tests/datastore_psycopg2/test_multiple_dbs.py b/tests/datastore_psycopg2/test_multiple_dbs.py index bf7629ebf..afbdd66f2 100644 --- a/tests/datastore_psycopg2/test_multiple_dbs.py +++ b/tests/datastore_psycopg2/test_multiple_dbs.py @@ -15,8 +15,8 @@ import psycopg2 import pytest -from testing_support.fixtures import (validate_transaction_metrics, - override_application_settings) +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.validators.validate_database_trace_inputs import validate_database_trace_inputs from testing_support.util import instance_hostname from utils import DB_MULTIPLE_SETTINGS diff --git a/tests/datastore_psycopg2/test_register.py b/tests/datastore_psycopg2/test_register.py index 03b553749..b5450c358 100644 --- a/tests/datastore_psycopg2/test_register.py +++ b/tests/datastore_psycopg2/test_register.py @@ -16,8 +16,8 @@ import psycopg2 import psycopg2.extras -from testing_support.fixtures import (validate_transaction_metrics, - validate_transaction_errors) +from testing_support.validators.validate_transaction_errors import validate_transaction_errors +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from utils import DB_SETTINGS from newrelic.api.background_task import background_task diff --git a/tests/datastore_psycopg2/test_rollback.py b/tests/datastore_psycopg2/test_rollback.py index f0ef8149f..0a23b1005 100644 --- a/tests/datastore_psycopg2/test_rollback.py +++ b/tests/datastore_psycopg2/test_rollback.py @@ -15,8 +15,8 @@ import psycopg2 import pytest -from testing_support.fixtures import (validate_transaction_metrics, - override_application_settings) +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.validators.validate_database_trace_inputs import validate_database_trace_inputs from testing_support.util import instance_hostname from utils import DB_SETTINGS diff --git a/tests/datastore_psycopg2/test_trace_node.py b/tests/datastore_psycopg2/test_trace_node.py index b9cd45788..9bfbcf42b 100644 --- a/tests/datastore_psycopg2/test_trace_node.py +++ b/tests/datastore_psycopg2/test_trace_node.py @@ -13,72 +13,78 @@ # limitations under the License. 
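One detail behind the PostgreSQL metric bumps further up: the expected metrics gained a Datastore/operation/Postgres/other entry alongside the new CTE query, which suggests the agent's SQL parser files a statement opening with WITH under the catch-all "other" operation rather than select. The added query, reformatted here with a hypothetical table name:

    cursor.execute(
        "with temporaryTable (averageValue) as (select avg(b) from mytable) "
        "select * from mytable, temporaryTable "
        "where mytable.b > temporaryTable.averageValue"
    )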
import psycopg2 - -from testing_support.fixtures import (validate_tt_collector_json, - override_application_settings, validate_tt_parenting) +from testing_support.fixtures import ( + override_application_settings, + validate_tt_parenting, +) from testing_support.util import instance_hostname +from testing_support.validators.validate_tt_collector_json import ( + validate_tt_collector_json, +) from utils import DB_SETTINGS from newrelic.api.background_task import background_task - # Settings _enable_instance_settings = { - 'datastore_tracer.instance_reporting.enabled': True, - 'datastore_tracer.database_name_reporting.enabled': True, + "datastore_tracer.instance_reporting.enabled": True, + "datastore_tracer.database_name_reporting.enabled": True, } _disable_instance_settings = { - 'datastore_tracer.instance_reporting.enabled': False, - 'datastore_tracer.database_name_reporting.enabled': False, + "datastore_tracer.instance_reporting.enabled": False, + "datastore_tracer.database_name_reporting.enabled": False, } # Expected parameters _enabled_required = { - 'host': instance_hostname(DB_SETTINGS['host']), - 'port_path_or_id': str(DB_SETTINGS['port']), - 'db.instance': DB_SETTINGS['name'], + "host": instance_hostname(DB_SETTINGS["host"]), + "port_path_or_id": str(DB_SETTINGS["port"]), + "db.instance": DB_SETTINGS["name"], } _enabled_forgone = {} _disabled_required = {} _disabled_forgone = { - 'host': 'VALUE NOT USED', - 'port_path_or_id': 'VALUE NOT USED', - 'db.instance': 'VALUE NOT USED', + "host": "VALUE NOT USED", + "port_path_or_id": "VALUE NOT USED", + "db.instance": "VALUE NOT USED", } _tt_parenting = ( - 'TransactionNode', [ - ('FunctionNode', []), - ('DatabaseNode', []), + "TransactionNode", + [ + ("FunctionNode", []), + ("DatabaseNode", []), ], ) # Query + def _exercise_db(): connection = psycopg2.connect( - database=DB_SETTINGS['name'], user=DB_SETTINGS['user'], - password=DB_SETTINGS['password'], host=DB_SETTINGS['host'], - port=DB_SETTINGS['port']) + database=DB_SETTINGS["name"], + user=DB_SETTINGS["user"], + password=DB_SETTINGS["password"], + host=DB_SETTINGS["host"], + port=DB_SETTINGS["port"], + ) try: cursor = connection.cursor() - cursor.execute("""SELECT setting from pg_settings where name=%s""", - ('server_version',)) + cursor.execute("""SELECT setting from pg_settings where name=%s""", ("server_version",)) finally: connection.close() # Tests + @override_application_settings(_enable_instance_settings) -@validate_tt_collector_json( - datastore_params=_enabled_required, - datastore_forgone_params=_enabled_forgone) +@validate_tt_collector_json(datastore_params=_enabled_required, datastore_forgone_params=_enabled_forgone) @validate_tt_parenting(_tt_parenting) @background_task() def test_trace_node_datastore_params_enable_instance(): @@ -86,9 +92,7 @@ def test_trace_node_datastore_params_enable_instance(): @override_application_settings(_disable_instance_settings) -@validate_tt_collector_json( - datastore_params=_disabled_required, - datastore_forgone_params=_disabled_forgone) +@validate_tt_collector_json(datastore_params=_disabled_required, datastore_forgone_params=_disabled_forgone) @validate_tt_parenting(_tt_parenting) @background_task() def test_trace_node_datastore_params_disable_instance(): diff --git a/tests/datastore_psycopg2cffi/conftest.py b/tests/datastore_psycopg2cffi/conftest.py index 9cff099f8..c9df1369b 100644 --- a/tests/datastore_psycopg2cffi/conftest.py +++ b/tests/datastore_psycopg2cffi/conftest.py @@ -14,16 +14,8 @@ import pytest -from testing_support.fixtures 
import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.database_psycopg2cffi', - 'newrelic.hooks.database_psycopg2', - 'newrelic.hooks.database_dbapi2', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/datastore_psycopg2cffi/test_database.py b/tests/datastore_psycopg2cffi/test_database.py index c0eb6e722..54ff6ad09 100644 --- a/tests/datastore_psycopg2cffi/test_database.py +++ b/tests/datastore_psycopg2cffi/test_database.py @@ -16,8 +16,9 @@ import psycopg2cffi.extensions import psycopg2cffi.extras -from testing_support.fixtures import (validate_transaction_metrics, - validate_transaction_errors, validate_stats_engine_explain_plan_output_is_none) +from testing_support.fixtures import validate_stats_engine_explain_plan_output_is_none +from testing_support.validators.validate_transaction_errors import validate_transaction_errors +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.validators.validate_transaction_slow_sql_count import \ validate_transaction_slow_sql_count from testing_support.validators.validate_database_trace_inputs import validate_database_trace_inputs diff --git a/tests/datastore_pyelasticsearch/test_pyelasticsearch.py b/tests/datastore_pyelasticsearch/test_pyelasticsearch.py deleted file mode 100644 index 837c9ae19..000000000 --- a/tests/datastore_pyelasticsearch/test_pyelasticsearch.py +++ /dev/null @@ -1,116 +0,0 @@ -# Copyright 2010 New Relic, Inc. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. 
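For readers skimming the psycopg2 trace-node hunks above, the parenting fixture is easiest to read as a recursive (node_name, children) tuple; the annotations below are inferred from the test body, not from the validator itself:

    _tt_parenting = (
        "TransactionNode",
        [
            ("FunctionNode", []),  # the traced psycopg2.connect call
            ("DatabaseNode", []),  # the SELECT against pg_settings
        ],
    )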
- -import sqlite3 -from pyelasticsearch import ElasticSearch - -from testing_support.fixtures import (validate_transaction_metrics, - validate_transaction_errors) -from testing_support.db_settings import elasticsearch_settings -from testing_support.validators.validate_database_duration import validate_database_duration - -from newrelic.api.background_task import background_task - -ES_SETTINGS = elasticsearch_settings()[0] -ES_URL = 'http://%s:%s' % (ES_SETTINGS['host'], ES_SETTINGS['port']) - -def _exercise_es(es): - es.index("contacts", "person", - {"name": "Joe Tester", "age": 25, "title": "QA Engineer"}, id=1) - es.index("contacts", "person", - {"name": "Jessica Coder", "age": 32, "title": "Programmer"}, id=2) - es.index("contacts", "person", - {"name": "Freddy Tester", "age": 29, "title": "Assistant"}, id=3) - es.refresh('contacts') - es.index("address", "employee", {"name": "Sherlock", - "address": "221B Baker Street, London"}, id=1) - es.index("address", "employee", {"name": "Bilbo", - "address": "Bag End, Bagshot row, Hobbiton, Shire"}, id=2) - es.search('name:Joe', index='contacts') - es.search('name:jessica', index='contacts') - es.search('name:Sherlock', index='address') - es.search('name:Bilbo', index=['contacts', 'address']) - es.search('name:Bilbo', index='contacts,address') - es.search('name:Bilbo', index='*') - es.search('name:Bilbo') - es.status() - -# Common Metrics for tests that use _exercise_es(). - -_test_pyelasticsearch_scoped_metrics = [ - ('Datastore/statement/Elasticsearch/contacts/index', 3), - ('Datastore/statement/Elasticsearch/contacts/search', 2), - ('Datastore/statement/Elasticsearch/address/index', 2), - ('Datastore/statement/Elasticsearch/address/search', 1), - ('Datastore/statement/Elasticsearch/_all/search', 2), - ('Datastore/statement/Elasticsearch/other/search', 2), - ('Datastore/statement/Elasticsearch/contacts/refresh', 1), - ('Datastore/statement/Elasticsearch/_all/status', 1), -] - -_test_pyelasticsearch_rollup_metrics = [ - ('Datastore/all', 14), - ('Datastore/allOther', 14), - ('Datastore/Elasticsearch/all', 14), - ('Datastore/Elasticsearch/allOther', 14), - ('Datastore/operation/Elasticsearch/index', 5), - ('Datastore/operation/Elasticsearch/search', 7), - ('Datastore/operation/Elasticsearch/refresh', 1), - ('Datastore/operation/Elasticsearch/status', 1), - ('Datastore/statement/Elasticsearch/contacts/index', 3), - ('Datastore/statement/Elasticsearch/contacts/search', 2), - ('Datastore/statement/Elasticsearch/address/index', 2), - ('Datastore/statement/Elasticsearch/address/search', 1), - ('Datastore/statement/Elasticsearch/_all/search', 2), - ('Datastore/statement/Elasticsearch/other/search', 2), - ('Datastore/statement/Elasticsearch/contacts/refresh', 1), - ('Datastore/statement/Elasticsearch/_all/status', 1), -] - -@validate_transaction_errors(errors=[]) -@validate_transaction_metrics( - 'test_pyelasticsearch:test_pyelasticsearch_operation', - scoped_metrics=_test_pyelasticsearch_scoped_metrics, - rollup_metrics=_test_pyelasticsearch_rollup_metrics, - background_task=True) -@background_task() -def test_pyelasticsearch_operation(): - client = ElasticSearch(ES_URL) - _exercise_es(client) - -@validate_database_duration() -@background_task() -def test_elasticsearch_database_duration(): - client = ElasticSearch(ES_URL) - _exercise_es(client) - -@validate_database_duration() -@background_task() -def test_elasticsearch_and_sqlite_database_duration(): - - # Make ElasticSearch queries - - client = ElasticSearch(ES_URL) - _exercise_es(client) - - # Make 
sqlite queries - - conn = sqlite3.connect(":memory:") - cur = conn.cursor() - - cur.execute("CREATE TABLE contacts (name text, age int)") - cur.execute("INSERT INTO contacts VALUES ('Bob', 22)") - - conn.commit() - conn.close() diff --git a/tests/datastore_pylibmc/conftest.py b/tests/datastore_pylibmc/conftest.py index 4dd03c77a..40970bdca 100644 --- a/tests/datastore_pylibmc/conftest.py +++ b/tests/datastore_pylibmc/conftest.py @@ -14,15 +14,8 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.api.memcache_trace', - 'newrelic.hooks.datastore_pylibmc', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/datastore_pylibmc/test_memcache.py b/tests/datastore_pylibmc/test_memcache.py index 554581fdc..769f3b483 100644 --- a/tests/datastore_pylibmc/test_memcache.py +++ b/tests/datastore_pylibmc/test_memcache.py @@ -17,7 +17,7 @@ import pylibmc from testing_support.db_settings import memcached_settings -from testing_support.fixtures import validate_transaction_metrics +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from newrelic.api.background_task import background_task from newrelic.api.transaction import set_background_task diff --git a/tests/datastore_pymemcache/conftest.py b/tests/datastore_pymemcache/conftest.py index ff5420903..3d4e1ce76 100644 --- a/tests/datastore_pymemcache/conftest.py +++ b/tests/datastore_pymemcache/conftest.py @@ -14,15 +14,8 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.api.memcache_trace', - 'newrelic.hooks.datastore_pymemcache', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/datastore_pymemcache/test_memcache.py b/tests/datastore_pymemcache/test_memcache.py index 12bd5da1a..9aeea4d54 100644 --- a/tests/datastore_pymemcache/test_memcache.py +++ b/tests/datastore_pymemcache/test_memcache.py @@ -16,7 +16,7 @@ import pymemcache.client -from testing_support.fixtures import validate_transaction_metrics +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.db_settings import memcached_settings from newrelic.api.background_task import background_task diff --git a/tests/datastore_pymongo/conftest.py b/tests/datastore_pymongo/conftest.py index 8d279f2e2..d269182b0 100644 --- a/tests/datastore_pymongo/conftest.py +++ b/tests/datastore_pymongo/conftest.py @@ -12,17 +12,8 @@ # See the License for the specific language governing permissions and # limitations under the License. 
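The conftest churn across these datastore suites converges on one registration pattern: drop the removed code_coverage_fixture and import only the two collector fixtures. A representative sketch; the app name is hypothetical and the settings values are the suite-wide defaults seen in the diffs:

    from testing_support.fixtures import (  # noqa: F401; pylint: disable=W0611
        collector_agent_registration_fixture,
        collector_available_fixture,
    )

    _default_settings = {
        "transaction_tracer.explain_threshold": 0.0,
        "transaction_tracer.transaction_threshold": 0.0,
        "transaction_tracer.stack_trace_threshold": 0.0,
        "debug.log_data_collector_payloads": True,
        "debug.record_transaction_failure": True,
    }

    collector_agent_registration = collector_agent_registration_fixture(
        app_name="Python Agent Test (datastore_example)",  # hypothetical suite name
        default_settings=_default_settings,
        linked_applications=["Python Agent Test (datastore)"],
    )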
-from testing_support.fixtures import ( # noqa: F401; pylint: disable=W0611 - code_coverage_fixture, - collector_agent_registration_fixture, - collector_available_fixture, -) - -_coverage_source = [ - "newrelic.hooks.datastore_pymongo", -] +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { "transaction_tracer.explain_threshold": 0.0, diff --git a/tests/datastore_pymongo/test_pymongo.py b/tests/datastore_pymongo/test_pymongo.py index 09ea62e0b..4649062ce 100644 --- a/tests/datastore_pymongo/test_pymongo.py +++ b/tests/datastore_pymongo/test_pymongo.py @@ -16,14 +16,11 @@ import pymongo from testing_support.db_settings import mongodb_settings -from testing_support.fixtures import ( - validate_transaction_errors, - validate_transaction_metrics, -) from testing_support.validators.validate_database_duration import ( validate_database_duration, ) - +from testing_support.validators.validate_transaction_errors import validate_transaction_errors +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from newrelic.api.background_task import background_task from newrelic.packages import six diff --git a/tests/datastore_pyelasticsearch/conftest.py b/tests/datastore_pymssql/conftest.py similarity index 50% rename from tests/datastore_pyelasticsearch/conftest.py rename to tests/datastore_pymssql/conftest.py index 101bf444a..a6584cdff 100644 --- a/tests/datastore_pyelasticsearch/conftest.py +++ b/tests/datastore_pymssql/conftest.py @@ -14,24 +14,23 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import ( + collector_agent_registration_fixture, + collector_available_fixture, +) # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.datastore_pyelasticsearch', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { - 'transaction_tracer.explain_threshold': 0.0, - 'transaction_tracer.transaction_threshold': 0.0, - 'transaction_tracer.stack_trace_threshold': 0.0, - 'debug.log_data_collector_payloads': True, - 'debug.record_transaction_failure': True + "transaction_tracer.explain_threshold": 0.0, + "transaction_tracer.transaction_threshold": 0.0, + "transaction_tracer.stack_trace_threshold": 0.0, + "debug.log_data_collector_payloads": True, + "debug.record_transaction_failure": True, + "debug.log_explain_plan_queries": True, } collector_agent_registration = collector_agent_registration_fixture( - app_name='Python Agent Test (datastore_pyelasticsearch)', - default_settings=_default_settings, - linked_applications=['Python Agent Test (datastore)']) + app_name="Python Agent Test (datastore_pymssql)", + default_settings=_default_settings, + linked_applications=["Python Agent Test (datastore)"], +) diff --git a/tests/datastore_pymssql/test_database.py b/tests/datastore_pymssql/test_database.py new file mode 100644 index 000000000..bdbf75c15 --- /dev/null +++ b/tests/datastore_pymssql/test_database.py @@ -0,0 +1,115 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import pymssql + +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics +from testing_support.validators.validate_database_trace_inputs import validate_database_trace_inputs + +from testing_support.db_settings import mssql_settings + +from newrelic.api.background_task import background_task + +DB_SETTINGS = mssql_settings()[0] +TABLE_NAME = "datastore_pymssql_" + DB_SETTINGS["namespace"] +PROCEDURE_NAME = "hello_" + DB_SETTINGS["namespace"] + + +def execute_db_calls_with_cursor(cursor): + cursor.execute("""drop table if exists %s""" % TABLE_NAME) + + cursor.execute("""create table %s """ % TABLE_NAME + """(a integer, b real, c text)""") + + cursor.executemany( + """insert into %s """ % TABLE_NAME + """values (%s, %s, %s)""", + [(1, 1.0, "1.0"), (2, 2.2, "2.2"), (3, 3.3, "3.3")], + ) + + cursor.execute("""select * from %s""" % TABLE_NAME) + + for row in cursor: + pass + + cursor.execute("""update %s""" % TABLE_NAME + """ set a=%s, b=%s, """ """c=%s where a=%s""", (4, 4.0, "4.0", 1)) + + cursor.execute("""delete from %s where a=2""" % TABLE_NAME) + cursor.execute("""drop procedure if exists %s""" % PROCEDURE_NAME) + cursor.execute( + """CREATE PROCEDURE %s AS + BEGIN + SELECT 'Hello World!'; + END""" + % PROCEDURE_NAME + ) + + cursor.callproc(PROCEDURE_NAME) + + +_test_scoped_metrics = [ + ("Function/pymssql._pymssql:connect", 1), + ("Datastore/statement/MSSQL/%s/select" % TABLE_NAME, 1), + ("Datastore/statement/MSSQL/%s/insert" % TABLE_NAME, 1), + ("Datastore/statement/MSSQL/%s/update" % TABLE_NAME, 1), + ("Datastore/statement/MSSQL/%s/delete" % TABLE_NAME, 1), + ("Datastore/operation/MSSQL/drop", 2), + ("Datastore/operation/MSSQL/create", 2), + ("Datastore/statement/MSSQL/%s/call" % PROCEDURE_NAME, 1), + ("Datastore/operation/MSSQL/commit", 2), + ("Datastore/operation/MSSQL/rollback", 1), +] + +_test_rollup_metrics = [ + ("Datastore/all", 13), + ("Datastore/allOther", 13), + ("Datastore/MSSQL/all", 13), + ("Datastore/MSSQL/allOther", 13), + ("Datastore/statement/MSSQL/%s/select" % TABLE_NAME, 1), + ("Datastore/statement/MSSQL/%s/insert" % TABLE_NAME, 1), + ("Datastore/statement/MSSQL/%s/update" % TABLE_NAME, 1), + ("Datastore/statement/MSSQL/%s/delete" % TABLE_NAME, 1), + ("Datastore/operation/MSSQL/select", 1), + ("Datastore/operation/MSSQL/insert", 1), + ("Datastore/operation/MSSQL/update", 1), + ("Datastore/operation/MSSQL/delete", 1), + ("Datastore/statement/MSSQL/%s/call" % PROCEDURE_NAME, 1), + ("Datastore/operation/MSSQL/call", 1), + ("Datastore/operation/MSSQL/drop", 2), + ("Datastore/operation/MSSQL/create", 2), + ("Datastore/operation/MSSQL/commit", 2), + ("Datastore/operation/MSSQL/rollback", 1), +] + + +@validate_transaction_metrics( + "test_database:test_execute_via_cursor_context_manager", + scoped_metrics=_test_scoped_metrics, + rollup_metrics=_test_rollup_metrics, + background_task=True, +) +@validate_database_trace_inputs(sql_parameters_type=tuple) +@background_task() +def test_execute_via_cursor_context_manager(): + connection = pymssql.connect( + user=DB_SETTINGS["user"], password=DB_SETTINGS["password"], 
host=DB_SETTINGS["host"], port=DB_SETTINGS["port"] + ) + + with connection: + cursor = connection.cursor() + + with cursor: + execute_db_calls_with_cursor(cursor) + + connection.commit() + connection.rollback() + connection.commit() diff --git a/tests/datastore_pymysql/conftest.py b/tests/datastore_pymysql/conftest.py index 0aeb282a4..51d037432 100644 --- a/tests/datastore_pymysql/conftest.py +++ b/tests/datastore_pymysql/conftest.py @@ -14,16 +14,8 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.database_pymysql', - 'newrelic.hooks.database_mysqldb', - 'newrelic.hooks.database_dbapi2', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/datastore_pymysql/test_database.py b/tests/datastore_pymysql/test_database.py index 2f20d8cbd..5943b1266 100644 --- a/tests/datastore_pymysql/test_database.py +++ b/tests/datastore_pymysql/test_database.py @@ -14,8 +14,7 @@ import pymysql -from testing_support.fixtures import (validate_transaction_metrics, - ) +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.validators.validate_database_trace_inputs import validate_database_trace_inputs from testing_support.db_settings import mysql_settings diff --git a/tests/datastore_pyodbc/conftest.py b/tests/datastore_pyodbc/conftest.py new file mode 100644 index 000000000..b00a0a663 --- /dev/null +++ b/tests/datastore_pyodbc/conftest.py @@ -0,0 +1,33 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +from testing_support.fixtures import ( # noqa: F401; pylint: disable=W0611 + collector_agent_registration_fixture, + collector_available_fixture, +) + +_default_settings = { + "transaction_tracer.explain_threshold": 0.0, + "transaction_tracer.transaction_threshold": 0.0, + "transaction_tracer.stack_trace_threshold": 0.0, + "debug.log_data_collector_payloads": True, + "debug.record_transaction_failure": True, + "debug.log_explain_plan_queries": True, +} + +collector_agent_registration = collector_agent_registration_fixture( + app_name="Python Agent Test (datastore_pyodbc)", + default_settings=_default_settings, + linked_applications=["Python Agent Test (datastore)"], +) diff --git a/tests/datastore_pyodbc/test_pyodbc.py b/tests/datastore_pyodbc/test_pyodbc.py new file mode 100644 index 000000000..119908e4d --- /dev/null +++ b/tests/datastore_pyodbc/test_pyodbc.py @@ -0,0 +1,120 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +import pytest +from testing_support.db_settings import postgresql_settings +from testing_support.validators.validate_database_trace_inputs import ( + validate_database_trace_inputs, +) +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) + +from newrelic.api.background_task import background_task + +DB_SETTINGS = postgresql_settings()[0] + + +@validate_transaction_metrics( + "test_pyodbc:test_execute_via_cursor", + scoped_metrics=[ + ("Function/pyodbc:connect", 1), + ], + rollup_metrics=[ + ("Datastore/all", 1), + ("Datastore/allOther", 1), + ("Datastore/ODBC/all", 1), + ("Datastore/ODBC/allOther", 1), + ], + background_task=True, +) +@validate_database_trace_inputs(sql_parameters_type=tuple) +@background_task() +def test_execute_via_cursor(pyodbc_driver): + import pyodbc + + with pyodbc.connect( + "DRIVER={%s};SERVER=%s;PORT=%s;DATABASE=%s;UID=%s;PWD=%s" + % ( + pyodbc_driver, + DB_SETTINGS["host"], + DB_SETTINGS["port"], + DB_SETTINGS["name"], + DB_SETTINGS["user"], + DB_SETTINGS["password"], + ) + ) as connection: + cursor = connection.cursor() + cursor.execute("""drop table if exists %s""" % DB_SETTINGS["table_name"]) + cursor.execute("""create table %s """ % DB_SETTINGS["table_name"] + """(a integer, b real, c text)""") + cursor.executemany( + """insert into %s """ % DB_SETTINGS["table_name"] + """values (?, ?, ?)""", + [(1, 1.0, "1.0"), (2, 2.2, "2.2"), (3, 3.3, "3.3")], + ) + cursor.execute("""select * from %s""" % DB_SETTINGS["table_name"]) + for row in cursor: + pass + cursor.execute( + """update %s """ % DB_SETTINGS["table_name"] + """set a=?, b=?, c=? 
where a=?""", + (4, 4.0, "4.0", 1), + ) + cursor.execute("""delete from %s where a=2""" % DB_SETTINGS["table_name"]) + connection.commit() + + cursor.execute("SELECT now()") + cursor.execute("SELECT pg_sleep(0.25)") + + connection.rollback() + connection.commit() + + +@validate_transaction_metrics( + "test_pyodbc:test_rollback_on_exception", + scoped_metrics=[ + ("Function/pyodbc:connect", 1), + ], + rollup_metrics=[ + ("Datastore/all", 1), + ("Datastore/allOther", 1), + ("Datastore/ODBC/all", 1), + ("Datastore/ODBC/allOther", 1), + ], + background_task=True, +) +@validate_database_trace_inputs(sql_parameters_type=tuple) +@background_task() +def test_rollback_on_exception(pyodbc_driver): + import pyodbc + + with pytest.raises(RuntimeError): + with pyodbc.connect( + "DRIVER={%s};SERVER=%s;PORT=%s;DATABASE=%s;UID=%s;PWD=%s" + % ( + pyodbc_driver, + DB_SETTINGS["host"], + DB_SETTINGS["port"], + DB_SETTINGS["name"], + DB_SETTINGS["user"], + DB_SETTINGS["password"], + ) + ) as connection: + raise RuntimeError("error") + + +@pytest.fixture +def pyodbc_driver(): + import pyodbc + + driver_name = "PostgreSQL Unicode" + assert driver_name in pyodbc.drivers() + return driver_name diff --git a/tests/datastore_pysolr/conftest.py b/tests/datastore_pysolr/conftest.py index 1f5419454..07851b698 100644 --- a/tests/datastore_pysolr/conftest.py +++ b/tests/datastore_pysolr/conftest.py @@ -14,14 +14,8 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.datastore_pysolr', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/datastore_pysolr/test_solr.py b/tests/datastore_pysolr/test_solr.py index 785c3cb9f..a987a29ac 100644 --- a/tests/datastore_pysolr/test_solr.py +++ b/tests/datastore_pysolr/test_solr.py @@ -14,7 +14,7 @@ from pysolr import Solr -from testing_support.fixtures import validate_transaction_metrics +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.db_settings import solr_settings from newrelic.api.background_task import background_task diff --git a/tests/datastore_redis/conftest.py b/tests/datastore_redis/conftest.py index 802924c75..53ff2658d 100644 --- a/tests/datastore_redis/conftest.py +++ b/tests/datastore_redis/conftest.py @@ -14,14 +14,8 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.datastore_redis', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/datastore_redis/test_asyncio.py b/tests/datastore_redis/test_asyncio.py new file mode 100644 index 000000000..97c1b7853 --- /dev/null +++ b/tests/datastore_redis/test_asyncio.py @@ -0,0 +1,100 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import asyncio + +import pytest +from testing_support.db_settings import redis_settings +from testing_support.fixture.event_loop import event_loop as loop # noqa: F401 +from testing_support.util import instance_hostname +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) + +from newrelic.api.background_task import background_task +from newrelic.common.package_version_utils import get_package_version_tuple + +# Settings + +DB_SETTINGS = redis_settings()[0] +REDIS_VERSION = get_package_version_tuple("redis") + +# Metrics + +_enable_scoped_metrics = [("Datastore/operation/Redis/publish", 3)] + +_enable_rollup_metrics = [ + ("Datastore/all", 3), + ("Datastore/allOther", 3), + ("Datastore/Redis/all", 3), + ("Datastore/Redis/allOther", 3), + ("Datastore/operation/Redis/publish", 3), + ("Datastore/instance/Redis/%s/%s" % (instance_hostname(DB_SETTINGS["host"]), DB_SETTINGS["port"]), 3), +] + +# Tests + + +@pytest.fixture() +def client(loop): # noqa + import redis.asyncio + + return loop.run_until_complete(redis.asyncio.Redis(host=DB_SETTINGS["host"], port=DB_SETTINGS["port"], db=0)) + + +@pytest.mark.skipif(REDIS_VERSION < (4, 2), reason="This functionality exists in Redis 4.2+") +@validate_transaction_metrics("test_asyncio:test_async_pipeline", background_task=True) +@background_task() +def test_async_pipeline(client, loop): # noqa + async def _test_pipeline(client): + async with client.pipeline(transaction=True) as pipe: + await pipe.set("key1", "value1") + await pipe.execute() + + loop.run_until_complete(_test_pipeline(client)) + + +@pytest.mark.skipif(REDIS_VERSION < (4, 2), reason="This functionality exists in Redis 4.2+") +@validate_transaction_metrics( + "test_asyncio:test_async_pubsub", + scoped_metrics=_enable_scoped_metrics, + rollup_metrics=_enable_rollup_metrics, + background_task=True, +) +@background_task() +def test_async_pubsub(client, loop): # noqa + messages_received = [] + + async def reader(pubsub): + while True: + message = await pubsub.get_message(ignore_subscribe_messages=True) + if message: + messages_received.append(message["data"].decode()) + if message["data"].decode() == "NOPE": + break + + async def _test_pubsub(): + async with client.pubsub() as pubsub: + await pubsub.psubscribe("channel:*") + + future = asyncio.create_task(reader(pubsub)) + + await client.publish("channel:1", "Hello") + await client.publish("channel:2", "World") + await client.publish("channel:1", "NOPE") + + await future + + loop.run_until_complete(_test_pubsub()) + assert messages_received == ["Hello", "World", "NOPE"] diff --git a/tests/datastore_redis/test_custom_conn_pool.py b/tests/datastore_redis/test_custom_conn_pool.py index 9700392cc..156c9ce31 100644 --- a/tests/datastore_redis/test_custom_conn_pool.py +++ b/tests/datastore_redis/test_custom_conn_pool.py @@ -22,8 +22,8 @@ from newrelic.api.background_task import background_task -from testing_support.fixtures import (validate_transaction_metrics, - override_application_settings) +from testing_support.fixtures import override_application_settings +from 
testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.db_settings import redis_settings from testing_support.util import instance_hostname diff --git a/tests/datastore_redis/test_execute_command.py b/tests/datastore_redis/test_execute_command.py index c86295b35..747588072 100644 --- a/tests/datastore_redis/test_execute_command.py +++ b/tests/datastore_redis/test_execute_command.py @@ -17,8 +17,8 @@ from newrelic.api.background_task import background_task -from testing_support.fixtures import (validate_transaction_metrics, - override_application_settings) +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.db_settings import redis_settings from testing_support.util import instance_hostname diff --git a/tests/datastore_redis/test_get_and_set.py b/tests/datastore_redis/test_get_and_set.py index 2c40ddc12..0e2df4bb1 100644 --- a/tests/datastore_redis/test_get_and_set.py +++ b/tests/datastore_redis/test_get_and_set.py @@ -16,8 +16,8 @@ from newrelic.api.background_task import background_task -from testing_support.fixtures import (validate_transaction_metrics, - override_application_settings) +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.db_settings import redis_settings from testing_support.util import instance_hostname diff --git a/tests/datastore_redis/test_multiple_dbs.py b/tests/datastore_redis/test_multiple_dbs.py index 67f7b7fe8..15777cc38 100644 --- a/tests/datastore_redis/test_multiple_dbs.py +++ b/tests/datastore_redis/test_multiple_dbs.py @@ -17,8 +17,8 @@ from newrelic.api.background_task import background_task -from testing_support.fixtures import (validate_transaction_metrics, - override_application_settings) +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.db_settings import redis_settings from testing_support.util import instance_hostname diff --git a/tests/datastore_redis/test_rb.py b/tests/datastore_redis/test_rb.py index 7f4feeda7..5678c2787 100644 --- a/tests/datastore_redis/test_rb.py +++ b/tests/datastore_redis/test_rb.py @@ -24,8 +24,8 @@ from newrelic.api.background_task import background_task -from testing_support.fixtures import (validate_transaction_metrics, - override_application_settings) +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.db_settings import redis_settings from testing_support.util import instance_hostname diff --git a/tests/datastore_redis/test_trace_node.py b/tests/datastore_redis/test_trace_node.py index 39b7763ba..cc0d59919 100644 --- a/tests/datastore_redis/test_trace_node.py +++ b/tests/datastore_redis/test_trace_node.py @@ -13,11 +13,12 @@ # limitations under the License. 
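# Note: test_trace_node exercises a 2x2 matrix over the two datastore_tracer
# flags -- instance_reporting.enabled and database_name_reporting.enabled --
# and asserts which trace-node parameters (host, port_path_or_id, db.instance)
# are required or forgone for each combination. A compact sketch of the mapping
# that the four settings dicts below encode (illustrative, not part of the file):
_required_params_by_flags = {
    (True, True): {"host", "port_path_or_id", "db.instance"},
    (True, False): {"host", "port_path_or_id"},
    (False, True): {"db.instance"},
    (False, False): set(),
}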
import redis - -from testing_support.fixtures import (validate_tt_collector_json, - override_application_settings) -from testing_support.util import instance_hostname from testing_support.db_settings import redis_settings +from testing_support.fixtures import override_application_settings +from testing_support.util import instance_hostname +from testing_support.validators.validate_tt_collector_json import ( + validate_tt_collector_json, +) from newrelic.api.background_task import background_task @@ -27,100 +28,93 @@ # Settings _enable_instance_settings = { - 'datastore_tracer.instance_reporting.enabled': True, - 'datastore_tracer.database_name_reporting.enabled': True, + "datastore_tracer.instance_reporting.enabled": True, + "datastore_tracer.database_name_reporting.enabled": True, } _disable_instance_settings = { - 'datastore_tracer.instance_reporting.enabled': False, - 'datastore_tracer.database_name_reporting.enabled': False, + "datastore_tracer.instance_reporting.enabled": False, + "datastore_tracer.database_name_reporting.enabled": False, } _instance_only_settings = { - 'datastore_tracer.instance_reporting.enabled': True, - 'datastore_tracer.database_name_reporting.enabled': False, + "datastore_tracer.instance_reporting.enabled": True, + "datastore_tracer.database_name_reporting.enabled": False, } _database_only_settings = { - 'datastore_tracer.instance_reporting.enabled': False, - 'datastore_tracer.database_name_reporting.enabled': True, + "datastore_tracer.instance_reporting.enabled": False, + "datastore_tracer.database_name_reporting.enabled": True, } # Expected parameters _enabled_required = { - 'host': instance_hostname(DB_SETTINGS['host']), - 'port_path_or_id': str(DB_SETTINGS['port']), - 'db.instance': str(DATABASE_NUMBER), + "host": instance_hostname(DB_SETTINGS["host"]), + "port_path_or_id": str(DB_SETTINGS["port"]), + "db.instance": str(DATABASE_NUMBER), } _enabled_forgone = {} _disabled_required = {} _disabled_forgone = { - 'host': 'VALUE NOT USED', - 'port_path_or_id': 'VALUE NOT USED', - 'db.instance': 'VALUE NOT USED', + "host": "VALUE NOT USED", + "port_path_or_id": "VALUE NOT USED", + "db.instance": "VALUE NOT USED", } _instance_only_required = { - 'host': instance_hostname(DB_SETTINGS['host']), - 'port_path_or_id': str(DB_SETTINGS['port']), + "host": instance_hostname(DB_SETTINGS["host"]), + "port_path_or_id": str(DB_SETTINGS["port"]), } _instance_only_forgone = { - 'db.instance': str(DATABASE_NUMBER), + "db.instance": str(DATABASE_NUMBER), } _database_only_required = { - 'db.instance': str(DATABASE_NUMBER), + "db.instance": str(DATABASE_NUMBER), } _database_only_forgone = { - 'host': 'VALUE NOT USED', - 'port_path_or_id': 'VALUE NOT USED', + "host": "VALUE NOT USED", + "port_path_or_id": "VALUE NOT USED", } # Query + def _exercise_db(): - client = redis.StrictRedis(host=DB_SETTINGS['host'], - port=DB_SETTINGS['port'], db=DATABASE_NUMBER) + client = redis.StrictRedis(host=DB_SETTINGS["host"], port=DB_SETTINGS["port"], db=DATABASE_NUMBER) - client.set('key', 'value') - client.get('key') + client.set("key", "value") + client.get("key") - client.execute_command('CLIENT', 'LIST', parse='LIST') + client.execute_command("CLIENT", "LIST", parse="LIST") # Tests + @override_application_settings(_enable_instance_settings) -@validate_tt_collector_json( - datastore_params=_enabled_required, - datastore_forgone_params=_enabled_forgone) +@validate_tt_collector_json(datastore_params=_enabled_required, datastore_forgone_params=_enabled_forgone) @background_task() def 
test_trace_node_datastore_params_enable_instance(): _exercise_db() @override_application_settings(_disable_instance_settings) -@validate_tt_collector_json( - datastore_params=_disabled_required, - datastore_forgone_params=_disabled_forgone) +@validate_tt_collector_json(datastore_params=_disabled_required, datastore_forgone_params=_disabled_forgone) @background_task() def test_trace_node_datastore_params_disable_instance(): _exercise_db() @override_application_settings(_instance_only_settings) -@validate_tt_collector_json( - datastore_params=_instance_only_required, - datastore_forgone_params=_instance_only_forgone) +@validate_tt_collector_json(datastore_params=_instance_only_required, datastore_forgone_params=_instance_only_forgone) @background_task() def test_trace_node_datastore_params_instance_only(): _exercise_db() @override_application_settings(_database_only_settings) -@validate_tt_collector_json( - datastore_params=_database_only_required, - datastore_forgone_params=_database_only_forgone) +@validate_tt_collector_json(datastore_params=_database_only_required, datastore_forgone_params=_database_only_forgone) @background_task() def test_trace_node_datastore_params_database_only(): _exercise_db() diff --git a/tests/datastore_redis/test_uninstrumented_methods.py b/tests/datastore_redis/test_uninstrumented_methods.py index 314f9f203..ccf5a096d 100644 --- a/tests/datastore_redis/test_uninstrumented_methods.py +++ b/tests/datastore_redis/test_uninstrumented_methods.py @@ -14,19 +14,20 @@ import pytest import redis - from testing_support.db_settings import redis_settings DB_SETTINGS = redis_settings()[0] -redis_client = redis.Redis(host=DB_SETTINGS['host'], port=DB_SETTINGS['port'], db=0) -strict_redis_client = redis.StrictRedis(host=DB_SETTINGS['host'], port=DB_SETTINGS['port'], db=0) +redis_client = redis.Redis(host=DB_SETTINGS["host"], port=DB_SETTINGS["port"], db=0) +strict_redis_client = redis.StrictRedis(host=DB_SETTINGS["host"], port=DB_SETTINGS["port"], db=0) IGNORED_METHODS = { - 'MODULE_CALLBACKS', - 'MODULE_VERSION', - 'NAME', + "MODULE_CALLBACKS", + "MODULE_VERSION", + "NAME", + "add_edge", + "add_node", "append_bucket_size", "append_capacity", "append_error", @@ -38,6 +39,11 @@ "append_no_scale", "append_values_and_weights", "append_weights", + "batch_indexer", + "BatchIndexer", + "bulk", + "call_procedure", + "client_no_touch", "client_tracking_off", "client_tracking_on", "client", @@ -46,45 +52,40 @@ "connection_pool", "connection", "debug_segfault", + "edges", "execute_command", + "flush", "from_url", "get_connection_kwargs", "get_encoder", + "get_label", + "get_params_args", + "get_property", + "get_relation", + "get_retry", "hscan_iter", + "index_name", + "labels", + "list_keys", + "load_document", "load_external_module", "lock", + "name", + "nodes", "parse_response", "pipeline", + "property_keys", "register_script", + "relationship_types", "response_callbacks", "RESPONSE_CALLBACKS", "sentinel", "set_file", "set_path", "set_response_callback", + "set_retry", "transaction", - "BatchIndexer", - "batch_indexer", - "get_params_args", - "index_name", - "load_document", - "add_edge", - "add_node", - "bulk", - "call_procedure", - "edges", - "flush", - "get_label", - "get_property", - "get_relation", - "labels", - "list_keys", - "name", - "nodes", - "property_keys", - "relationship_types", "version", - } REDIS_MODULES = { diff --git a/tests/datastore_rediscluster/conftest.py b/tests/datastore_rediscluster/conftest.py new file mode 100644 index 000000000..fe53f1fe2 --- /dev/null 
+++ b/tests/datastore_rediscluster/conftest.py @@ -0,0 +1,32 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +from testing_support.fixtures import ( # noqa: F401; pylint: disable=W0611 + collector_agent_registration_fixture, + collector_available_fixture, +) + +_default_settings = { + "transaction_tracer.explain_threshold": 0.0, + "transaction_tracer.transaction_threshold": 0.0, + "transaction_tracer.stack_trace_threshold": 0.0, + "debug.log_data_collector_payloads": True, + "debug.record_transaction_failure": True, +} + +collector_agent_registration = collector_agent_registration_fixture( + app_name="Python Agent Test (datastore_redis)", + default_settings=_default_settings, + linked_applications=["Python Agent Test (datastore)"], +) diff --git a/tests/datastore_rediscluster/test_uninstrumented_rediscluster_methods.py b/tests/datastore_rediscluster/test_uninstrumented_rediscluster_methods.py new file mode 100644 index 000000000..ae211aa31 --- /dev/null +++ b/tests/datastore_rediscluster/test_uninstrumented_rediscluster_methods.py @@ -0,0 +1,168 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +import redis +from testing_support.db_settings import redis_cluster_settings + +DB_CLUSTER_SETTINGS = redis_cluster_settings()[0] + +# Set socket_timeout to 5s for fast fail, otherwise the default is to wait forever. 
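# Note: the audit in this new test module treats any public attribute lacking a
# __wrapped__ attribute as uninstrumented, since the agent's instrumentation
# wraps each hooked method. A standalone sketch of the same check (names are
# illustrative, not part of the test file):
def unwrapped_public_methods(obj, ignored=frozenset()):
    # Public API surface of the client, minus the known-unwrapped allowlist.
    public = {m for m in dir(obj) if not m.startswith("_")}
    return {m for m in public - ignored
            if not hasattr(getattr(obj, m), "__wrapped__")}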
+client = redis.RedisCluster(host=DB_CLUSTER_SETTINGS["host"], port=DB_CLUSTER_SETTINGS["port"], socket_timeout=5) + +IGNORED_METHODS = { + "MODULE_CALLBACKS", + "MODULE_VERSION", + "NAME", + "add_edge", + "add_node", + "append_bucket_size", + "append_capacity", + "append_error", + "append_expansion", + "append_items_and_increments", + "append_items", + "append_max_iterations", + "append_no_create", + "append_no_scale", + "append_values_and_weights", + "append_weights", + "batch_indexer", + "BatchIndexer", + "bulk", + "call_procedure", + "client_tracking_off", + "client_tracking_on", + "client", + "close", + "commandmixin", + "connection_pool", + "connection", + "debug_segfault", + "edges", + "execute_command", + "flush", + "from_url", + "get_connection_kwargs", + "get_encoder", + "get_label", + "get_params_args", + "get_property", + "get_relation", + "get_retry", + "hscan_iter", + "index_name", + "labels", + "list_keys", + "load_document", + "load_external_module", + "lock", + "name", + "nodes", + "parse_response", + "pipeline", + "property_keys", + "register_script", + "relationship_types", + "response_callbacks", + "RESPONSE_CALLBACKS", + "sentinel", + "set_file", + "set_path", + "set_response_callback", + "set_retry", + "transaction", + "version", + "ALL_NODES", + "CLUSTER_COMMANDS_RESPONSE_CALLBACKS", + "COMMAND_FLAGS", + "DEFAULT_NODE", + "ERRORS_ALLOW_RETRY", + "NODE_FLAGS", + "PRIMARIES", + "RANDOM", + "REPLICAS", + "RESULT_CALLBACKS", + "RedisClusterRequestTTL", + "SEARCH_COMMANDS", + "client_no_touch", + "cluster_addslotsrange", + "cluster_bumpepoch", + "cluster_delslotsrange", + "cluster_error_retry_attempts", + "cluster_flushslots", + "cluster_links", + "cluster_myid", + "cluster_myshardid", + "cluster_replicas", + "cluster_response_callbacks", + "cluster_setslot_stable", + "cluster_shards", + "command_flags", + "commands_parser", + "determine_slot", + "disconnect_connection_pools", + "encoder", + "get_default_node", + "get_node", + "get_node_from_key", + "get_nodes", + "get_primaries", + "get_random_node", + "get_redis_connection", + "get_replicas", + "keyslot", + "mget_nonatomic", + "monitor", + "mset_nonatomic", + "node_flags", + "nodes_manager", + "on_connect", + "pubsub", + "read_from_replicas", + "reinitialize_counter", + "reinitialize_steps", + "replace_default_node", + "result_callbacks", + "set_default_node", + "user_on_connect_func", +} + +REDIS_MODULES = { + "bf", + "cf", + "cms", + "ft", + "graph", + "json", + "tdigest", + "topk", + "ts", +} + +IGNORED_METHODS |= REDIS_MODULES + + +def test_uninstrumented_methods(): + methods = {m for m in dir(client) if not m[0] == "_"} + is_wrapped = lambda m: hasattr(getattr(client, m), "__wrapped__") + uninstrumented = {m for m in methods - IGNORED_METHODS if not is_wrapped(m)} + + for module in REDIS_MODULES: + if hasattr(client, module): + module_client = getattr(client, module)() + module_methods = {m for m in dir(module_client) if not m[0] == "_"} + is_wrapped = lambda m: hasattr(getattr(module_client, m), "__wrapped__") + uninstrumented |= {m for m in module_methods - IGNORED_METHODS if not is_wrapped(m)} + + assert not uninstrumented, "Uninstrumented methods: %s" % sorted(uninstrumented) diff --git a/tests/datastore_solrpy/conftest.py b/tests/datastore_solrpy/conftest.py index 52248065c..4418e5d9a 100644 --- a/tests/datastore_solrpy/conftest.py +++ b/tests/datastore_solrpy/conftest.py @@ -14,14 +14,8 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, 
collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.datastore_solrpy', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/datastore_solrpy/test_solr.py b/tests/datastore_solrpy/test_solr.py index 86dc23d4a..ee1a7e91e 100644 --- a/tests/datastore_solrpy/test_solr.py +++ b/tests/datastore_solrpy/test_solr.py @@ -14,7 +14,7 @@ from solr import SolrConnection -from testing_support.fixtures import validate_transaction_metrics +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.db_settings import solr_settings from newrelic.api.background_task import background_task diff --git a/tests/datastore_sqlite/conftest.py b/tests/datastore_sqlite/conftest.py index 270b9b8cf..ed695b251 100644 --- a/tests/datastore_sqlite/conftest.py +++ b/tests/datastore_sqlite/conftest.py @@ -14,15 +14,8 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.database_sqlite', - 'newrelic.hooks.database_dbapi2', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/datastore_sqlite/test_database.py b/tests/datastore_sqlite/test_database.py index 5443ca1c9..584ce57bc 100644 --- a/tests/datastore_sqlite/test_database.py +++ b/tests/datastore_sqlite/test_database.py @@ -18,12 +18,12 @@ is_pypy = hasattr(sys, 'pypy_version_info') -from testing_support.fixtures import validate_transaction_metrics +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.validators.validate_database_trace_inputs import validate_database_trace_inputs from newrelic.api.background_task import background_task -DATABASE_DIR = os.environ.get('TOX_ENVDIR', '.') +DATABASE_DIR = os.environ.get('TOX_ENV_DIR', '.') DATABASE_NAME = ':memory:' _test_execute_via_cursor_scoped_metrics = [ diff --git a/tests/datastore_umemcache/conftest.py b/tests/datastore_umemcache/conftest.py deleted file mode 100644 index 1e945141d..000000000 --- a/tests/datastore_umemcache/conftest.py +++ /dev/null @@ -1,38 +0,0 @@ -# Copyright 2010 New Relic, Inc. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. 
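# Note: the umemcache suite deleted below is Python 2-era code -- its tests
# rely on the long-removed dict.has_key() idiom -- so both the conftest and
# the test module are dropped rather than migrated. For reference, the modern
# equivalent of the deleted assertions (values illustrative):
stats = {"uptime": 123, "bytes": 456}
assert "uptime" in stats  # replaces stats.has_key("uptime")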
- -import pytest - -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) - -_coverage_source = [ - 'newrelic.api.memcache_trace', - 'newrelic.hooks.datastore_umemcache', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) - -_default_settings = { - 'transaction_tracer.explain_threshold': 0.0, - 'transaction_tracer.transaction_threshold': 0.0, - 'transaction_tracer.stack_trace_threshold': 0.0, - 'debug.log_data_collector_payloads': True, - 'debug.record_transaction_failure': True -} - -collector_agent_registration = collector_agent_registration_fixture( - app_name='Python Agent Test (datastore_umemcache)', - default_settings=_default_settings, - linked_applications=['Python Agent Test (datastore)']) diff --git a/tests/datastore_umemcache/test_memcache.py b/tests/datastore_umemcache/test_memcache.py deleted file mode 100644 index ce6475a49..000000000 --- a/tests/datastore_umemcache/test_memcache.py +++ /dev/null @@ -1,143 +0,0 @@ -# Copyright 2010 New Relic, Inc. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -import os - -import umemcache - -from testing_support.db_settings import memcached_settings -from testing_support.fixtures import validate_transaction_metrics - -from newrelic.api.background_task import background_task -from newrelic.api.transaction import set_background_task - - -DB_SETTINGS = memcached_settings()[0] - -MEMCACHED_HOST = DB_SETTINGS["host"] -MEMCACHED_PORT = DB_SETTINGS["port"] -MEMCACHED_NAMESPACE = DB_SETTINGS["namespace"] - -MEMCACHED_ADDR = '%s:%s' % (MEMCACHED_HOST, MEMCACHED_PORT) - -_test_bt_set_get_delete_scoped_metrics = [ - ('Datastore/operation/Memcached/set', 1), - ('Datastore/operation/Memcached/get', 1), - ('Datastore/operation/Memcached/delete', 1)] - -_test_bt_set_get_delete_rollup_metrics = [ - ('Datastore/all', 3), - ('Datastore/allOther', 3), - ('Datastore/Memcached/all', 3), - ('Datastore/Memcached/allOther', 3), - ('Datastore/operation/Memcached/set', 1), - ('Datastore/operation/Memcached/get', 1), - ('Datastore/operation/Memcached/delete', 1)] - -@validate_transaction_metrics( - 'test_memcache:test_bt_set_get_delete', - scoped_metrics=_test_bt_set_get_delete_scoped_metrics, - rollup_metrics=_test_bt_set_get_delete_rollup_metrics, - background_task=True) -@background_task() -def test_bt_set_get_delete(): - set_background_task(True) - client = umemcache.Client(MEMCACHED_ADDR) - client.connect() - - key = MEMCACHED_NAMESPACE + 'key' - - client.set(key, 'value') - value = client.get(key)[0] - client.delete(key) - - assert value == 'value' - -_test_wt_set_get_delete_scoped_metrics = [ - ('Datastore/operation/Memcached/set', 1), - ('Datastore/operation/Memcached/get', 1), - ('Datastore/operation/Memcached/delete', 1)] - -_test_wt_set_get_delete_rollup_metrics = [ - ('Datastore/all', 3), - ('Datastore/allWeb', 3), - ('Datastore/Memcached/all', 3), - ('Datastore/Memcached/allWeb', 3), - ('Datastore/operation/Memcached/set', 1), - 
('Datastore/operation/Memcached/get', 1), - ('Datastore/operation/Memcached/delete', 1)] - -@validate_transaction_metrics( - 'test_memcache:test_wt_set_get_delete', - scoped_metrics=_test_wt_set_get_delete_scoped_metrics, - rollup_metrics=_test_wt_set_get_delete_rollup_metrics, - background_task=False) -@background_task() -def test_wt_set_get_delete(): - set_background_task(False) - client = umemcache.Client(MEMCACHED_ADDR) - client.connect() - - key = MEMCACHED_NAMESPACE + 'key' - - client.set(key, 'value') - value = client.get(key)[0] - client.delete(key) - - assert value == 'value' - -_test_wt_set_incr_decr_scoped_metrics = [ - ('Datastore/operation/Memcached/set', 1), - ('Datastore/operation/Memcached/get', 2), - ('Datastore/operation/Memcached/incr', 2), - ('Datastore/operation/Memcached/decr', 1), - ('Datastore/operation/Memcached/stats', 1)] - -_test_wt_set_incr_decr_rollup_metrics = [ - ('Datastore/all', 7), - ('Datastore/allWeb', 7), - ('Datastore/Memcached/all', 7), - ('Datastore/Memcached/allWeb', 7), - ('Datastore/operation/Memcached/set', 1), - ('Datastore/operation/Memcached/get', 2), - ('Datastore/operation/Memcached/incr', 2), - ('Datastore/operation/Memcached/decr', 1), - ('Datastore/operation/Memcached/stats', 1)] - -@validate_transaction_metrics( - 'test_memcache:test_wt_set_incr_decr', - scoped_metrics=_test_wt_set_incr_decr_scoped_metrics, - rollup_metrics=_test_wt_set_incr_decr_rollup_metrics, - background_task=False) -@background_task() -def test_wt_set_incr_decr(): - set_background_task(False) - client = umemcache.Client(MEMCACHED_ADDR) - client.connect() - - key = MEMCACHED_NAMESPACE + 'key' - - client.set(key, '667') - value = client.get(key)[0] - client.incr(key, 1) - client.incr(key, 1) - client.decr(key, 1) - value = client.get(key)[0] - - assert value == '668' - - d = client.stats() - - assert d.has_key('uptime') - assert d.has_key('bytes') diff --git a/tests/external_boto3/conftest.py b/tests/external_boto3/conftest.py index 4daa3678b..90d82f007 100644 --- a/tests/external_boto3/conftest.py +++ b/tests/external_boto3/conftest.py @@ -14,14 +14,8 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.external_botocore', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/external_boto3/test_boto3_iam.py b/tests/external_boto3/test_boto3_iam.py index 9c6246c8c..a2237dc93 100644 --- a/tests/external_boto3/test_boto3_iam.py +++ b/tests/external_boto3/test_boto3_iam.py @@ -17,68 +17,73 @@ import boto3 import moto +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_span_events import validate_span_events +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) +from testing_support.validators.validate_tt_segment_params import ( + validate_tt_segment_params, +) from newrelic.api.background_task import background_task -from testing_support.fixtures import (validate_transaction_metrics, - validate_tt_segment_params, override_application_settings) -from testing_support.validators.validate_span_events import ( - validate_span_events) -MOTO_VERSION = tuple(int(v) for v in 
moto.__version__.split('.')[:3]) +MOTO_VERSION = tuple(int(v) for v in moto.__version__.split(".")[:3]) # patch earlier versions of moto to support py37 if sys.version_info >= (3, 7) and MOTO_VERSION <= (1, 3, 1): import re + moto.packages.responses.responses.re._pattern_type = re.Pattern -AWS_ACCESS_KEY_ID = 'AAAAAAAAAAAACCESSKEY' -AWS_SECRET_ACCESS_KEY = 'AAAAAASECRETKEY' +AWS_ACCESS_KEY_ID = "AAAAAAAAAAAACCESSKEY" +AWS_SECRET_ACCESS_KEY = "AAAAAASECRETKEY" # nosec (This is fine for testing purposes) -TEST_USER = 'python-agent-test-%s' % uuid.uuid4() +TEST_USER = "python-agent-test-%s" % uuid.uuid4() _iam_scoped_metrics = [ - ('External/iam.amazonaws.com/botocore/POST', 3), + ("External/iam.amazonaws.com/botocore/POST", 3), ] _iam_rollup_metrics = [ - ('External/all', 3), - ('External/allOther', 3), - ('External/iam.amazonaws.com/all', 3), - ('External/iam.amazonaws.com/botocore/POST', 3), + ("External/all", 3), + ("External/allOther", 3), + ("External/iam.amazonaws.com/all", 3), + ("External/iam.amazonaws.com/botocore/POST", 3), ] -@override_application_settings({'distributed_tracing.enabled': True}) -@validate_span_events( - exact_agents={'http.url': 'https://iam.amazonaws.com/'}, count=3) -@validate_span_events(expected_agents=('aws.requestId',), count=3) -@validate_span_events(exact_agents={'aws.operation': 'CreateUser'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'GetUser'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'DeleteUser'}, count=1) -@validate_tt_segment_params(present_params=('aws.requestId',)) +@override_application_settings({"distributed_tracing.enabled": True}) +@validate_span_events(exact_agents={"http.url": "https://iam.amazonaws.com/"}, count=3) +@validate_span_events(expected_agents=("aws.requestId",), count=3) +@validate_span_events(exact_agents={"aws.operation": "CreateUser"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "GetUser"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "DeleteUser"}, count=1) +@validate_tt_segment_params(present_params=("aws.requestId",)) @validate_transaction_metrics( - 'test_boto3_iam:test_iam', - scoped_metrics=_iam_scoped_metrics, - rollup_metrics=_iam_rollup_metrics, - background_task=True) + "test_boto3_iam:test_iam", + scoped_metrics=_iam_scoped_metrics, + rollup_metrics=_iam_rollup_metrics, + background_task=True, +) @background_task() @moto.mock_iam def test_iam(): iam = boto3.client( - 'iam', - aws_access_key_id=AWS_ACCESS_KEY_ID, - aws_secret_access_key=AWS_SECRET_ACCESS_KEY, + "iam", + aws_access_key_id=AWS_ACCESS_KEY_ID, + aws_secret_access_key=AWS_SECRET_ACCESS_KEY, ) # Create user resp = iam.create_user(UserName=TEST_USER) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 # Get the user resp = iam.get_user(UserName=TEST_USER) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 - assert resp['User']['UserName'] == TEST_USER + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 + assert resp["User"]["UserName"] == TEST_USER # Delete the user resp = iam.delete_user(UserName=TEST_USER) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 diff --git a/tests/external_boto3/test_boto3_s3.py b/tests/external_boto3/test_boto3_s3.py index ba65bc950..a7ecf034a 100644 --- a/tests/external_boto3/test_boto3_s3.py +++ b/tests/external_boto3/test_boto3_s3.py @@ -18,106 +18,106 @@ import boto3 import botocore import moto 
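# Note: none of these boto3/botocore tests reach real AWS; the moto decorators
# (moto.mock_iam, moto.mock_s3, ...) intercept botocore's HTTP layer, which is
# why the hard-coded dummy credentials are safe. The hunk below also splits the
# expected S3 URL three ways by botocore version, tracking botocore's move from
# path-style to virtual-hosted bucket addressing at 1.28. A minimal sketch of
# the mocking pattern (bucket name and credentials illustrative):
import boto3
import moto

@moto.mock_s3
def create_and_list():
    client = boto3.client(
        "s3",
        region_name="us-west-2",
        aws_access_key_id="testing",
        aws_secret_access_key="testing",  # fake creds; moto never validates them
    )
    client.create_bucket(
        Bucket="example-bucket",
        CreateBucketConfiguration={"LocationConstraint": "us-west-2"},
    )
    return [b["Name"] for b in client.list_buckets()["Buckets"]]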
+from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_span_events import validate_span_events +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) from newrelic.api.background_task import background_task -from testing_support.fixtures import (validate_transaction_metrics, - override_application_settings) -from testing_support.validators.validate_span_events import ( - validate_span_events) -MOTO_VERSION = tuple(int(v) for v in moto.__version__.split('.')[:3]) +MOTO_VERSION = tuple(int(v) for v in moto.__version__.split(".")[:3]) # patch earlier versions of moto to support py37 if sys.version_info >= (3, 7) and MOTO_VERSION <= (1, 3, 1): import re + moto.packages.responses.responses.re._pattern_type = re.Pattern -AWS_ACCESS_KEY_ID = 'AAAAAAAAAAAACCESSKEY' -AWS_SECRET_ACCESS_KEY = 'AAAAAASECRETKEY' -AWS_REGION_NAME = 'us-west-2' +AWS_ACCESS_KEY_ID = "AAAAAAAAAAAACCESSKEY" +AWS_SECRET_ACCESS_KEY = "AAAAAASECRETKEY" # nosec +AWS_REGION_NAME = "us-west-2" + +TEST_BUCKET = "python-agent-test-%s" % uuid.uuid4() + +BOTOCORE_VERSION = tuple(map(int, botocore.__version__.split("."))) -TEST_BUCKET = 'python-agent-test-%s' % uuid.uuid4() -BOTOCORE_VERSION = tuple(map(int, botocore.__version__.split('.'))) if BOTOCORE_VERSION < (1, 7, 41): - S3_URL = 's3-us-west-2.amazonaws.com' + S3_URL = "s3-us-west-2.amazonaws.com" + EXPECTED_BUCKET_URL = "https://%s/%s" % (S3_URL, TEST_BUCKET) + EXPECTED_KEY_URL = EXPECTED_BUCKET_URL + "/hello_world" +elif BOTOCORE_VERSION < (1, 28): + S3_URL = "s3.us-west-2.amazonaws.com" + EXPECTED_BUCKET_URL = "https://%s/%s" % (S3_URL, TEST_BUCKET) + EXPECTED_KEY_URL = EXPECTED_BUCKET_URL + "/hello_world" else: - S3_URL = 's3.us-west-2.amazonaws.com' + S3_URL = "%s.s3.us-west-2.amazonaws.com" % TEST_BUCKET + EXPECTED_BUCKET_URL = "https://%s/" % S3_URL + EXPECTED_KEY_URL = EXPECTED_BUCKET_URL + "hello_world" -expected_http_url = 'https://%s/%s' % (S3_URL, TEST_BUCKET) _s3_scoped_metrics = [ - ('External/%s/botocore/GET' % S3_URL, 2), - ('External/%s/botocore/PUT' % S3_URL, 2), - ('External/%s/botocore/DELETE' % S3_URL, 2), + ("External/%s/botocore/GET" % S3_URL, 2), + ("External/%s/botocore/PUT" % S3_URL, 2), + ("External/%s/botocore/DELETE" % S3_URL, 2), ] _s3_rollup_metrics = [ - ('External/all', 6), - ('External/allOther', 6), - ('External/%s/all' % S3_URL, 6), - ('External/%s/botocore/GET' % S3_URL, 2), - ('External/%s/botocore/PUT' % S3_URL, 2), - ('External/%s/botocore/DELETE' % S3_URL, 2), + ("External/all", 6), + ("External/allOther", 6), + ("External/%s/all" % S3_URL, 6), + ("External/%s/botocore/GET" % S3_URL, 2), + ("External/%s/botocore/PUT" % S3_URL, 2), + ("External/%s/botocore/DELETE" % S3_URL, 2), ] -@override_application_settings({'distributed_tracing.enabled': True}) -@validate_span_events(exact_agents={'aws.operation': 'CreateBucket'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'PutObject'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'ListObjects'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'GetObject'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'DeleteObject'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'DeleteBucket'}, count=1) -@validate_span_events( - exact_agents={'http.url': expected_http_url}, count=3) -@validate_span_events( - exact_agents={'http.url': expected_http_url + '/hello_world'}, count=3) 
+@override_application_settings({"distributed_tracing.enabled": True}) +@validate_span_events(exact_agents={"aws.operation": "CreateBucket"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "PutObject"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "ListObjects"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "GetObject"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "DeleteObject"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "DeleteBucket"}, count=1) +@validate_span_events(exact_agents={"http.url": EXPECTED_BUCKET_URL}, count=3) +@validate_span_events(exact_agents={"http.url": EXPECTED_KEY_URL}, count=3) @validate_transaction_metrics( - 'test_boto3_s3:test_s3', - scoped_metrics=_s3_scoped_metrics, - rollup_metrics=_s3_rollup_metrics, - background_task=True) + "test_boto3_s3:test_s3", scoped_metrics=_s3_scoped_metrics, rollup_metrics=_s3_rollup_metrics, background_task=True +) @background_task() @moto.mock_s3 def test_s3(): client = boto3.client( - 's3', - aws_access_key_id=AWS_ACCESS_KEY_ID, - aws_secret_access_key=AWS_SECRET_ACCESS_KEY, - region_name=AWS_REGION_NAME, + "s3", + aws_access_key_id=AWS_ACCESS_KEY_ID, + aws_secret_access_key=AWS_SECRET_ACCESS_KEY, + region_name=AWS_REGION_NAME, ) # Create bucket - resp = client.create_bucket( - Bucket=TEST_BUCKET, - CreateBucketConfiguration={'LocationConstraint': AWS_REGION_NAME} - ) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 + resp = client.create_bucket(Bucket=TEST_BUCKET, CreateBucketConfiguration={"LocationConstraint": AWS_REGION_NAME}) + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 # Put object - resp = client.put_object( - Bucket=TEST_BUCKET, - Key='hello_world', - Body=b'hello_world_content' - ) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 + resp = client.put_object(Bucket=TEST_BUCKET, Key="hello_world", Body=b"hello_world_content") + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 # List bucket resp = client.list_objects(Bucket=TEST_BUCKET) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 - assert len(resp['Contents']) == 1 - assert resp['Contents'][0]['Key'] == 'hello_world' + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 + assert len(resp["Contents"]) == 1 + assert resp["Contents"][0]["Key"] == "hello_world" # Get object - resp = client.get_object(Bucket=TEST_BUCKET, Key='hello_world') - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 - assert resp['Body'].read() == b'hello_world_content' + resp = client.get_object(Bucket=TEST_BUCKET, Key="hello_world") + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 + assert resp["Body"].read() == b"hello_world_content" # Delete object - resp = client.delete_object(Bucket=TEST_BUCKET, Key='hello_world') - assert resp['ResponseMetadata']['HTTPStatusCode'] == 204 + resp = client.delete_object(Bucket=TEST_BUCKET, Key="hello_world") + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 204 # Delete bucket resp = client.delete_bucket(Bucket=TEST_BUCKET) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 204 + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 204 diff --git a/tests/external_boto3/test_boto3_sns.py b/tests/external_boto3/test_boto3_sns.py index 38c8c951c..bafe68611 100644 --- a/tests/external_boto3/test_boto3_sns.py +++ b/tests/external_boto3/test_boto3_sns.py @@ -13,80 +13,91 @@ # limitations under the License. 
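# Note: SNS publishes surface as message-broker produce metrics rather than
# External/* calls. The expected names defined in the hunk below follow the
# pattern "MessageBroker/SNS/Topic/Produce/Named/<destination>", where the
# destination is the topic ARN for topic publishes and the literal
# "PhoneNumber" for direct SMS sends. Illustrative helper (not agent API):
def sns_produce_metric(destination):
    return "MessageBroker/SNS/Topic/Produce/Named/%s" % destination

assert sns_produce_metric("PhoneNumber") == (
    "MessageBroker/SNS/Topic/Produce/Named/PhoneNumber"
)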
import sys + import boto3 import moto import pytest +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_span_events import validate_span_events +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) +from testing_support.validators.validate_tt_segment_params import ( + validate_tt_segment_params, +) from newrelic.api.background_task import background_task -from testing_support.fixtures import (validate_transaction_metrics, - validate_tt_segment_params, override_application_settings) -from testing_support.validators.validate_span_events import ( - validate_span_events) -MOTO_VERSION = tuple(int(v) for v in moto.__version__.split('.')[:3]) +MOTO_VERSION = tuple(int(v) for v in moto.__version__.split(".")[:3]) # patch earlier versions of moto to support py37 if sys.version_info >= (3, 7) and MOTO_VERSION <= (1, 3, 1): import re + moto.packages.responses.responses.re._pattern_type = re.Pattern -AWS_ACCESS_KEY_ID = 'AAAAAAAAAAAACCESSKEY' -AWS_SECRET_ACCESS_KEY = 'AAAAAASECRETKEY' -AWS_REGION_NAME = 'us-east-1' -SNS_URL = 'sns-us-east-1.amazonaws.com' -TOPIC = 'arn:aws:sns:us-east-1:123456789012:some-topic' -sns_metrics = [ - ('MessageBroker/SNS/Topic' - '/Produce/Named/%s' % TOPIC, 1)] -sns_metrics_phone = [ - ('MessageBroker/SNS/Topic' - '/Produce/Named/PhoneNumber', 1)] +AWS_ACCESS_KEY_ID = "AAAAAAAAAAAACCESSKEY" +AWS_SECRET_ACCESS_KEY = "AAAAAASECRETKEY" # nosec (This is fine for testing purposes) +AWS_REGION_NAME = "us-east-1" +SNS_URL = "sns-us-east-1.amazonaws.com" +TOPIC = "arn:aws:sns:us-east-1:123456789012:some-topic" +sns_metrics = [("MessageBroker/SNS/Topic" "/Produce/Named/%s" % TOPIC, 1)] +sns_metrics_phone = [("MessageBroker/SNS/Topic" "/Produce/Named/PhoneNumber", 1)] -@override_application_settings({'distributed_tracing.enabled': True}) -@validate_span_events(expected_agents=('aws.requestId',), count=2) -@validate_span_events(exact_agents={'aws.operation': 'CreateTopic'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'Publish'}, count=1) -@validate_tt_segment_params(present_params=('aws.requestId',)) -@pytest.mark.parametrize('topic_argument', ('TopicArn', 'TargetArn')) -@validate_transaction_metrics('test_boto3_sns:test_publish_to_sns_topic', - scoped_metrics=sns_metrics, rollup_metrics=sns_metrics, - background_task=True) +@override_application_settings({"distributed_tracing.enabled": True}) +@validate_span_events(expected_agents=("aws.requestId",), count=2) +@validate_span_events(exact_agents={"aws.operation": "CreateTopic"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "Publish"}, count=1) +@validate_tt_segment_params(present_params=("aws.requestId",)) +@pytest.mark.parametrize("topic_argument", ("TopicArn", "TargetArn")) +@validate_transaction_metrics( + "test_boto3_sns:test_publish_to_sns_topic", + scoped_metrics=sns_metrics, + rollup_metrics=sns_metrics, + background_task=True, +) @background_task() @moto.mock_sns def test_publish_to_sns_topic(topic_argument): - conn = boto3.client('sns', - aws_access_key_id=AWS_ACCESS_KEY_ID, - aws_secret_access_key=AWS_SECRET_ACCESS_KEY, - region_name=AWS_REGION_NAME) + conn = boto3.client( + "sns", + aws_access_key_id=AWS_ACCESS_KEY_ID, + aws_secret_access_key=AWS_SECRET_ACCESS_KEY, + region_name=AWS_REGION_NAME, + ) - topic_arn = conn.create_topic(Name='some-topic')['TopicArn'] + topic_arn = conn.create_topic(Name="some-topic")["TopicArn"] kwargs = {topic_argument: topic_arn} - 
published_message = conn.publish(Message='my msg', **kwargs) - assert 'MessageId' in published_message + published_message = conn.publish(Message="my msg", **kwargs) + assert "MessageId" in published_message -@override_application_settings({'distributed_tracing.enabled': True}) -@validate_span_events(expected_agents=('aws.requestId',), count=3) -@validate_span_events(exact_agents={'aws.operation': 'CreateTopic'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'Subscribe'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'Publish'}, count=1) -@validate_tt_segment_params(present_params=('aws.requestId',)) -@validate_transaction_metrics('test_boto3_sns:test_publish_to_sns_phone', - scoped_metrics=sns_metrics_phone, rollup_metrics=sns_metrics_phone, - background_task=True) +@override_application_settings({"distributed_tracing.enabled": True}) +@validate_span_events(expected_agents=("aws.requestId",), count=3) +@validate_span_events(exact_agents={"aws.operation": "CreateTopic"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "Subscribe"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "Publish"}, count=1) +@validate_tt_segment_params(present_params=("aws.requestId",)) +@validate_transaction_metrics( + "test_boto3_sns:test_publish_to_sns_phone", + scoped_metrics=sns_metrics_phone, + rollup_metrics=sns_metrics_phone, + background_task=True, +) @background_task() @moto.mock_sns def test_publish_to_sns_phone(): - conn = boto3.client('sns', - aws_access_key_id=AWS_ACCESS_KEY_ID, - aws_secret_access_key=AWS_SECRET_ACCESS_KEY, - region_name=AWS_REGION_NAME) + conn = boto3.client( + "sns", + aws_access_key_id=AWS_ACCESS_KEY_ID, + aws_secret_access_key=AWS_SECRET_ACCESS_KEY, + region_name=AWS_REGION_NAME, + ) - topic_arn = conn.create_topic(Name='some-topic')['TopicArn'] - conn.subscribe(TopicArn=topic_arn, Protocol='sms', Endpoint='5555555555') + topic_arn = conn.create_topic(Name="some-topic")["TopicArn"] + conn.subscribe(TopicArn=topic_arn, Protocol="sms", Endpoint="5555555555") - published_message = conn.publish( - PhoneNumber='5555555555', Message='my msg') - assert 'MessageId' in published_message + published_message = conn.publish(PhoneNumber="5555555555", Message="my msg") + assert "MessageId" in published_message diff --git a/tests/external_botocore/conftest.py b/tests/external_botocore/conftest.py index 738b51f5a..e5cf15533 100644 --- a/tests/external_botocore/conftest.py +++ b/tests/external_botocore/conftest.py @@ -14,14 +14,8 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -_coverage_source = [ - 'newrelic.hooks.external_botocore', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/external_botocore/test_botocore_dynamodb.py b/tests/external_botocore/test_botocore_dynamodb.py index eb0432aba..30114d53b 100644 --- a/tests/external_botocore/test_botocore_dynamodb.py +++ b/tests/external_botocore/test_botocore_dynamodb.py @@ -17,90 +17,96 @@ import botocore.session import moto +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_span_events import validate_span_events +from testing_support.validators.validate_transaction_metrics import ( 
+ validate_transaction_metrics, +) +from testing_support.validators.validate_tt_segment_params import ( + validate_tt_segment_params, +) from newrelic.api.background_task import background_task -from testing_support.fixtures import (validate_transaction_metrics, - validate_tt_segment_params, override_application_settings) -from testing_support.validators.validate_span_events import ( - validate_span_events) -MOTO_VERSION = tuple(int(v) for v in moto.__version__.split('.')[:3]) +MOTO_VERSION = tuple(int(v) for v in moto.__version__.split(".")[:3]) # patch earlier versions of moto to support py37 if sys.version_info >= (3, 7) and MOTO_VERSION <= (1, 3, 1): import re + moto.packages.responses.responses.re._pattern_type = re.Pattern -AWS_ACCESS_KEY_ID = 'AAAAAAAAAAAACCESSKEY' -AWS_SECRET_ACCESS_KEY = 'AAAAAASECRETKEY' -AWS_REGION = 'us-east-1' +AWS_ACCESS_KEY_ID = "AAAAAAAAAAAACCESSKEY" +AWS_SECRET_ACCESS_KEY = "AAAAAASECRETKEY" # nosec (This is fine for testing purposes) +AWS_REGION = "us-east-1" -TEST_TABLE = 'python-agent-test-%s' % uuid.uuid4() +TEST_TABLE = "python-agent-test-%s" % uuid.uuid4() _dynamodb_scoped_metrics = [ - ('Datastore/statement/DynamoDB/%s/create_table' % TEST_TABLE, 1), - ('Datastore/statement/DynamoDB/%s/put_item' % TEST_TABLE, 1), - ('Datastore/statement/DynamoDB/%s/get_item' % TEST_TABLE, 1), - ('Datastore/statement/DynamoDB/%s/update_item' % TEST_TABLE, 1), - ('Datastore/statement/DynamoDB/%s/query' % TEST_TABLE, 1), - ('Datastore/statement/DynamoDB/%s/scan' % TEST_TABLE, 1), - ('Datastore/statement/DynamoDB/%s/delete_item' % TEST_TABLE, 1), - ('Datastore/statement/DynamoDB/%s/delete_table' % TEST_TABLE, 1), + ("Datastore/statement/DynamoDB/%s/create_table" % TEST_TABLE, 1), + ("Datastore/statement/DynamoDB/%s/put_item" % TEST_TABLE, 1), + ("Datastore/statement/DynamoDB/%s/get_item" % TEST_TABLE, 1), + ("Datastore/statement/DynamoDB/%s/update_item" % TEST_TABLE, 1), + ("Datastore/statement/DynamoDB/%s/query" % TEST_TABLE, 1), + ("Datastore/statement/DynamoDB/%s/scan" % TEST_TABLE, 1), + ("Datastore/statement/DynamoDB/%s/delete_item" % TEST_TABLE, 1), + ("Datastore/statement/DynamoDB/%s/delete_table" % TEST_TABLE, 1), ] _dynamodb_rollup_metrics = [ - ('Datastore/all', 8), - ('Datastore/allOther', 8), - ('Datastore/DynamoDB/all', 8), - ('Datastore/DynamoDB/allOther', 8), + ("Datastore/all", 8), + ("Datastore/allOther", 8), + ("Datastore/DynamoDB/all", 8), + ("Datastore/DynamoDB/allOther", 8), ] -@override_application_settings({'distributed_tracing.enabled': True}) -@validate_span_events(expected_agents=('aws.requestId',), count=8) -@validate_span_events(exact_agents={'aws.operation': 'PutItem'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'GetItem'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'DeleteItem'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'CreateTable'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'DeleteTable'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'Query'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'Scan'}, count=1) -@validate_tt_segment_params(present_params=('aws.requestId',)) +@override_application_settings({"distributed_tracing.enabled": True}) +@validate_span_events(expected_agents=("aws.requestId",), count=8) +@validate_span_events(exact_agents={"aws.operation": "PutItem"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "GetItem"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "DeleteItem"}, 
count=1) +@validate_span_events(exact_agents={"aws.operation": "CreateTable"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "DeleteTable"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "Query"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "Scan"}, count=1) +@validate_tt_segment_params(present_params=("aws.requestId",)) @validate_transaction_metrics( - 'test_botocore_dynamodb:test_dynamodb', - scoped_metrics=_dynamodb_scoped_metrics, - rollup_metrics=_dynamodb_rollup_metrics, - background_task=True) + "test_botocore_dynamodb:test_dynamodb", + scoped_metrics=_dynamodb_scoped_metrics, + rollup_metrics=_dynamodb_rollup_metrics, + background_task=True, +) @background_task() @moto.mock_dynamodb2 def test_dynamodb(): session = botocore.session.get_session() client = session.create_client( - 'dynamodb', - region_name=AWS_REGION, - aws_access_key_id=AWS_ACCESS_KEY_ID, - aws_secret_access_key=AWS_SECRET_ACCESS_KEY + "dynamodb", + region_name=AWS_REGION, + aws_access_key_id=AWS_ACCESS_KEY_ID, + aws_secret_access_key=AWS_SECRET_ACCESS_KEY, ) # Create table resp = client.create_table( - TableName=TEST_TABLE, - AttributeDefinitions=[ - {'AttributeName': 'Id', 'AttributeType': 'N'}, - {'AttributeName': 'Foo', 'AttributeType': 'S'}, - ], - KeySchema=[ - {'AttributeName': 'Id', 'KeyType': 'HASH'}, - {'AttributeName': 'Foo', 'KeyType': 'RANGE'}, - ], - ProvisionedThroughput={ - 'ReadCapacityUnits': 5, - 'WriteCapacityUnits': 5, - }, + TableName=TEST_TABLE, + AttributeDefinitions=[ + {"AttributeName": "Id", "AttributeType": "N"}, + {"AttributeName": "Foo", "AttributeType": "S"}, + ], + KeySchema=[ + {"AttributeName": "Id", "KeyType": "HASH"}, + {"AttributeName": "Foo", "KeyType": "RANGE"}, + ], + ProvisionedThroughput={ + "ReadCapacityUnits": 5, + "WriteCapacityUnits": 5, + }, ) - assert resp['TableDescription']['TableName'] == TEST_TABLE + assert resp["TableDescription"]["TableName"] == TEST_TABLE # moto response is ACTIVE, AWS response is CREATING # assert resp['TableDescription']['TableStatus'] == 'ACTIVE' @@ -110,73 +116,70 @@ def test_dynamodb(): # Put item resp = client.put_item( - TableName=TEST_TABLE, - Item={ - 'Id': {'N': '101'}, - 'Foo': {'S': 'hello_world'}, - 'SomeValue': {'S': 'some_random_attribute'}, - } + TableName=TEST_TABLE, + Item={ + "Id": {"N": "101"}, + "Foo": {"S": "hello_world"}, + "SomeValue": {"S": "some_random_attribute"}, + }, ) # No checking response, due to inconsistent return values. # moto returns resp['Attributes']. 
AWS returns resp['ResponseMetadata'] # Get item resp = client.get_item( - TableName=TEST_TABLE, - Key={ - 'Id': {'N': '101'}, - 'Foo': {'S': 'hello_world'}, - 'SomeValue': {'S': 'some_random_attribute'}, - } + TableName=TEST_TABLE, + Key={ + "Id": {"N": "101"}, + "Foo": {"S": "hello_world"}, + "SomeValue": {"S": "some_random_attribute"}, + }, ) - assert resp['Item']['SomeValue']['S'] == 'some_random_attribute' + assert resp["Item"]["SomeValue"]["S"] == "some_random_attribute" # Update item resp = client.update_item( - TableName=TEST_TABLE, - Key={ - 'Id': {'N': '101'}, - 'Foo': {'S': 'hello_world'}, - 'SomeValue': {'S': 'some_random_attribute'}, - }, - AttributeUpdates={ - 'Foo2': { - 'Value': {'S': 'hello_world2'}, - 'Action': 'PUT' - }, - }, - ReturnValues='ALL_NEW', + TableName=TEST_TABLE, + Key={ + "Id": {"N": "101"}, + "Foo": {"S": "hello_world"}, + "SomeValue": {"S": "some_random_attribute"}, + }, + AttributeUpdates={ + "Foo2": {"Value": {"S": "hello_world2"}, "Action": "PUT"}, + }, + ReturnValues="ALL_NEW", ) - assert resp['Attributes']['Foo2'] + assert resp["Attributes"]["Foo2"] # Query for item resp = client.query( - TableName=TEST_TABLE, - Select='ALL_ATTRIBUTES', - KeyConditionExpression='#Id = :v_id', - ExpressionAttributeNames={'#Id': 'Id'}, - ExpressionAttributeValues={':v_id': {'N': '101'}}, + TableName=TEST_TABLE, + Select="ALL_ATTRIBUTES", + KeyConditionExpression="#Id = :v_id", + ExpressionAttributeNames={"#Id": "Id"}, + ExpressionAttributeValues={":v_id": {"N": "101"}}, ) - assert len(resp['Items']) == 1 - assert resp['Items'][0]['SomeValue']['S'] == 'some_random_attribute' + assert len(resp["Items"]) == 1 + assert resp["Items"][0]["SomeValue"]["S"] == "some_random_attribute" # Scan resp = client.scan(TableName=TEST_TABLE) - assert len(resp['Items']) == 1 + assert len(resp["Items"]) == 1 # Delete item resp = client.delete_item( - TableName=TEST_TABLE, - Key={ - 'Id': {'N': '101'}, - 'Foo': {'S': 'hello_world'}, - }, + TableName=TEST_TABLE, + Key={ + "Id": {"N": "101"}, + "Foo": {"S": "hello_world"}, + }, ) # No checking response, due to inconsistent return values. # moto returns resp['Attributes']. 
AWS returns resp['ResponseMetadata'] # Delete table resp = client.delete_table(TableName=TEST_TABLE) - assert resp['TableDescription']['TableName'] == TEST_TABLE + assert resp["TableDescription"]["TableName"] == TEST_TABLE # moto response is ACTIVE, AWS response is DELETING # assert resp['TableDescription']['TableStatus'] == 'DELETING' diff --git a/tests/external_botocore/test_botocore_ec2.py b/tests/external_botocore/test_botocore_ec2.py index 6f91ad75f..28a8ff63a 100644 --- a/tests/external_botocore/test_botocore_ec2.py +++ b/tests/external_botocore/test_botocore_ec2.py @@ -17,80 +17,81 @@ import botocore.session import moto +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_span_events import validate_span_events +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) +from testing_support.validators.validate_tt_segment_params import ( + validate_tt_segment_params, +) from newrelic.api.background_task import background_task -from testing_support.fixtures import (validate_transaction_metrics, - validate_tt_segment_params, override_application_settings) -from testing_support.validators.validate_span_events import ( - validate_span_events) -MOTO_VERSION = tuple(int(v) for v in moto.__version__.split('.')[:3]) +MOTO_VERSION = tuple(int(v) for v in moto.__version__.split(".")[:3]) # patch earlier versions of moto to support py37 if sys.version_info >= (3, 7) and MOTO_VERSION <= (1, 3, 1): import re + moto.packages.responses.responses.re._pattern_type = re.Pattern -AWS_ACCESS_KEY_ID = 'AAAAAAAAAAAACCESSKEY' -AWS_SECRET_ACCESS_KEY = 'AAAAAASECRETKEY' -AWS_REGION = 'us-east-1' -UBUNTU_14_04_PARAVIRTUAL_AMI = 'ami-c65be9ae' +AWS_ACCESS_KEY_ID = "AAAAAAAAAAAACCESSKEY" +AWS_SECRET_ACCESS_KEY = "AAAAAASECRETKEY" # nosec (This is fine for testing purposes) +AWS_REGION = "us-east-1" +UBUNTU_14_04_PARAVIRTUAL_AMI = "ami-c65be9ae" -TEST_INSTANCE = 'python-agent-test-%s' % uuid.uuid4() +TEST_INSTANCE = "python-agent-test-%s" % uuid.uuid4() _ec2_scoped_metrics = [ - ('External/ec2.us-east-1.amazonaws.com/botocore/POST', 3), + ("External/ec2.us-east-1.amazonaws.com/botocore/POST", 3), ] _ec2_rollup_metrics = [ - ('External/all', 3), - ('External/allOther', 3), - ('External/ec2.us-east-1.amazonaws.com/all', 3), - ('External/ec2.us-east-1.amazonaws.com/botocore/POST', 3), + ("External/all", 3), + ("External/allOther", 3), + ("External/ec2.us-east-1.amazonaws.com/all", 3), + ("External/ec2.us-east-1.amazonaws.com/botocore/POST", 3), ] -@override_application_settings({'distributed_tracing.enabled': True}) -@validate_span_events(expected_agents=('aws.requestId',), count=3) -@validate_span_events(exact_agents={'aws.operation': 'RunInstances'}, count=1) -@validate_span_events( - exact_agents={'aws.operation': 'DescribeInstances'}, count=1) -@validate_span_events( - exact_agents={'aws.operation': 'TerminateInstances'}, count=1) -@validate_tt_segment_params(present_params=('aws.requestId',)) +@override_application_settings({"distributed_tracing.enabled": True}) +@validate_span_events(expected_agents=("aws.requestId",), count=3) +@validate_span_events(exact_agents={"aws.operation": "RunInstances"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "DescribeInstances"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "TerminateInstances"}, count=1) +@validate_tt_segment_params(present_params=("aws.requestId",)) @validate_transaction_metrics( - 'test_botocore_ec2:test_ec2', - 
scoped_metrics=_ec2_scoped_metrics, - rollup_metrics=_ec2_rollup_metrics, - background_task=True) + "test_botocore_ec2:test_ec2", + scoped_metrics=_ec2_scoped_metrics, + rollup_metrics=_ec2_rollup_metrics, + background_task=True, +) @background_task() @moto.mock_ec2 def test_ec2(): session = botocore.session.get_session() client = session.create_client( - 'ec2', - region_name=AWS_REGION, - aws_access_key_id=AWS_ACCESS_KEY_ID, - aws_secret_access_key=AWS_SECRET_ACCESS_KEY + "ec2", region_name=AWS_REGION, aws_access_key_id=AWS_ACCESS_KEY_ID, aws_secret_access_key=AWS_SECRET_ACCESS_KEY ) # Create instance resp = client.run_instances( - ImageId=UBUNTU_14_04_PARAVIRTUAL_AMI, - InstanceType='m1.small', - MinCount=1, - MaxCount=1, + ImageId=UBUNTU_14_04_PARAVIRTUAL_AMI, + InstanceType="m1.small", + MinCount=1, + MaxCount=1, ) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 - assert len(resp['Instances']) == 1 - instance_id = resp['Instances'][0]['InstanceId'] + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 + assert len(resp["Instances"]) == 1 + instance_id = resp["Instances"][0]["InstanceId"] # Describe instance resp = client.describe_instances(InstanceIds=[instance_id]) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 - assert resp['Reservations'][0]['Instances'][0]['InstanceId'] == instance_id + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 + assert resp["Reservations"][0]["Instances"][0]["InstanceId"] == instance_id # Delete instance resp = client.terminate_instances(InstanceIds=[instance_id]) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 - assert resp['TerminatingInstances'][0]['InstanceId'] == instance_id + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 + assert resp["TerminatingInstances"][0]["InstanceId"] == instance_id diff --git a/tests/external_botocore/test_botocore_s3.py b/tests/external_botocore/test_botocore_s3.py index 3cd4ecd93..1984d8103 100644 --- a/tests/external_botocore/test_botocore_s3.py +++ b/tests/external_botocore/test_botocore_s3.py @@ -15,101 +15,104 @@ import sys import uuid +import botocore import botocore.session import moto +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_span_events import validate_span_events +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) from newrelic.api.background_task import background_task -from testing_support.fixtures import (validate_transaction_metrics, - override_application_settings) -from testing_support.validators.validate_span_events import ( - validate_span_events) -MOTO_VERSION = tuple(int(v) for v in moto.__version__.split('.')[:3]) +MOTO_VERSION = tuple(int(v) for v in moto.__version__.split(".")[:3]) +BOTOCORE_VERSION = tuple(int(v) for v in botocore.__version__.split(".")[:3]) + # patch earlier versions of moto to support py37 if sys.version_info >= (3, 7) and MOTO_VERSION <= (1, 3, 1): import re + moto.packages.responses.responses.re._pattern_type = re.Pattern -AWS_ACCESS_KEY_ID = 'AAAAAAAAAAAACCESSKEY' -AWS_SECRET_ACCESS_KEY = 'AAAAAASECRETKEY' -AWS_REGION = 'us-east-1' +AWS_ACCESS_KEY_ID = "AAAAAAAAAAAACCESSKEY" +AWS_SECRET_ACCESS_KEY = "AAAAAASECRETKEY" # nosec +AWS_REGION = "us-east-1" -TEST_BUCKET = 'python-agent-test-%s' % uuid.uuid4() -S3_URL = 's3.amazonaws.com' -expected_http_url = 'https://%s/%s' % (S3_URL, TEST_BUCKET) +TEST_BUCKET = "python-agent-test-%s" % uuid.uuid4() +if BOTOCORE_VERSION >= (1, 28): + S3_URL = 
"%s.s3.amazonaws.com" % TEST_BUCKET + EXPECTED_BUCKET_URL = "https://%s/" % S3_URL + EXPECTED_KEY_URL = EXPECTED_BUCKET_URL + "hello_world" +else: + S3_URL = "s3.amazonaws.com" + EXPECTED_BUCKET_URL = "https://%s/%s" % (S3_URL, TEST_BUCKET) + EXPECTED_KEY_URL = EXPECTED_BUCKET_URL + "/hello_world" _s3_scoped_metrics = [ - ('External/s3.amazonaws.com/botocore/GET', 2), - ('External/s3.amazonaws.com/botocore/PUT', 2), - ('External/s3.amazonaws.com/botocore/DELETE', 2), + ("External/%s/botocore/GET" % S3_URL, 2), + ("External/%s/botocore/PUT" % S3_URL, 2), + ("External/%s/botocore/DELETE" % S3_URL, 2), ] _s3_rollup_metrics = [ - ('External/all', 6), - ('External/allOther', 6), - ('External/s3.amazonaws.com/all', 6), - ('External/s3.amazonaws.com/botocore/GET', 2), - ('External/s3.amazonaws.com/botocore/PUT', 2), - ('External/s3.amazonaws.com/botocore/DELETE', 2), + ("External/all", 6), + ("External/allOther", 6), + ("External/%s/all" % S3_URL, 6), + ("External/%s/botocore/GET" % S3_URL, 2), + ("External/%s/botocore/PUT" % S3_URL, 2), + ("External/%s/botocore/DELETE" % S3_URL, 2), ] -@override_application_settings({'distributed_tracing.enabled': True}) -@validate_span_events(exact_agents={'aws.operation': 'CreateBucket'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'PutObject'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'ListObjects'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'GetObject'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'DeleteObject'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'DeleteBucket'}, count=1) -@validate_span_events( - exact_agents={'http.url': expected_http_url}, count=3) -@validate_span_events( - exact_agents={'http.url': expected_http_url + '/hello_world'}, count=3) +@override_application_settings({"distributed_tracing.enabled": True}) +@validate_span_events(exact_agents={"aws.operation": "CreateBucket"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "PutObject"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "ListObjects"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "GetObject"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "DeleteObject"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "DeleteBucket"}, count=1) +@validate_span_events(exact_agents={"http.url": EXPECTED_BUCKET_URL}, count=3) +@validate_span_events(exact_agents={"http.url": EXPECTED_KEY_URL}, count=3) @validate_transaction_metrics( - 'test_botocore_s3:test_s3', - scoped_metrics=_s3_scoped_metrics, - rollup_metrics=_s3_rollup_metrics, - background_task=True) + "test_botocore_s3:test_s3", + scoped_metrics=_s3_scoped_metrics, + rollup_metrics=_s3_rollup_metrics, + background_task=True, +) @background_task() @moto.mock_s3 def test_s3(): session = botocore.session.get_session() client = session.create_client( - 's3', - region_name=AWS_REGION, - aws_access_key_id=AWS_ACCESS_KEY_ID, - aws_secret_access_key=AWS_SECRET_ACCESS_KEY + "s3", region_name=AWS_REGION, aws_access_key_id=AWS_ACCESS_KEY_ID, aws_secret_access_key=AWS_SECRET_ACCESS_KEY ) # Create bucket resp = client.create_bucket(Bucket=TEST_BUCKET) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 # Put object - resp = client.put_object( - Bucket=TEST_BUCKET, - Key='hello_world', - Body=b'hello_world_content' - ) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 + resp = 
client.put_object(Bucket=TEST_BUCKET, Key="hello_world", Body=b"hello_world_content") + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 # List bucket resp = client.list_objects(Bucket=TEST_BUCKET) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 - assert len(resp['Contents']) == 1 - assert resp['Contents'][0]['Key'] == 'hello_world' + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 + assert len(resp["Contents"]) == 1 + assert resp["Contents"][0]["Key"] == "hello_world" # Get object - resp = client.get_object(Bucket=TEST_BUCKET, Key='hello_world') - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 - assert resp['Body'].read() == b'hello_world_content' + resp = client.get_object(Bucket=TEST_BUCKET, Key="hello_world") + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 + assert resp["Body"].read() == b"hello_world_content" # Delete object - resp = client.delete_object(Bucket=TEST_BUCKET, Key='hello_world') - assert resp['ResponseMetadata']['HTTPStatusCode'] == 204 + resp = client.delete_object(Bucket=TEST_BUCKET, Key="hello_world") + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 204 # Delete bucket resp = client.delete_bucket(Bucket=TEST_BUCKET) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 204 + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 204 diff --git a/tests/external_botocore/test_botocore_sqs.py b/tests/external_botocore/test_botocore_sqs.py index 46482c675..3f7d8c022 100644 --- a/tests/external_botocore/test_botocore_sqs.py +++ b/tests/external_botocore/test_botocore_sqs.py @@ -14,132 +14,131 @@ import sys import uuid -import pytest import botocore.session import moto +import pytest +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_span_events import validate_span_events +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) from newrelic.api.background_task import background_task -from testing_support.fixtures import (validate_transaction_metrics, - override_application_settings) -from testing_support.validators.validate_span_events import ( - validate_span_events) +from newrelic.common.package_version_utils import get_package_version -MOTO_VERSION = tuple(int(v) for v in moto.__version__.split('.')[:3]) +MOTO_VERSION = tuple(int(v) for v in moto.__version__.split(".")[:3]) # patch earlier versions of moto to support py37 if sys.version_info >= (3, 7) and MOTO_VERSION <= (1, 3, 1): import re + moto.packages.responses.responses.re._pattern_type = re.Pattern -AWS_ACCESS_KEY_ID = 'AAAAAAAAAAAACCESSKEY' -AWS_SECRET_ACCESS_KEY = 'AAAAAASECRETKEY' -AWS_REGION = 'us-east-1' +url = "sqs.us-east-1.amazonaws.com" +botocore_version = tuple([int(n) for n in get_package_version("botocore").split(".")]) +if botocore_version < (1, 29, 0): + url = "queue.amazonaws.com" + +AWS_ACCESS_KEY_ID = "AAAAAAAAAAAACCESSKEY" +AWS_SECRET_ACCESS_KEY = "AAAAAASECRETKEY" # nosec +AWS_REGION = "us-east-1" -TEST_QUEUE = 'python-agent-test-%s' % uuid.uuid4() +TEST_QUEUE = "python-agent-test-%s" % uuid.uuid4() _sqs_scoped_metrics = [ - ('MessageBroker/SQS/Queue/Produce/Named/%s' - % TEST_QUEUE, 2), - ('External/queue.amazonaws.com/botocore/POST', 3), + ("MessageBroker/SQS/Queue/Produce/Named/%s" % TEST_QUEUE, 2), + ("External/%s/botocore/POST" % url, 3), ] _sqs_rollup_metrics = [ - ('MessageBroker/SQS/Queue/Produce/Named/%s' - % TEST_QUEUE, 2), - ('MessageBroker/SQS/Queue/Consume/Named/%s' - % TEST_QUEUE, 1), - ('External/all', 3), - 
('External/allOther', 3), - ('External/queue.amazonaws.com/all', 3), - ('External/queue.amazonaws.com/botocore/POST', 3), + ("MessageBroker/SQS/Queue/Produce/Named/%s" % TEST_QUEUE, 2), + ("MessageBroker/SQS/Queue/Consume/Named/%s" % TEST_QUEUE, 1), + ("External/all", 3), + ("External/allOther", 3), + ("External/%s/all" % url, 3), + ("External/%s/botocore/POST" % url, 3), ] _sqs_scoped_metrics_malformed = [ - ('MessageBroker/SQS/Queue/Produce/Named/Unknown', 1), + ("MessageBroker/SQS/Queue/Produce/Named/Unknown", 1), ] _sqs_rollup_metrics_malformed = [ - ('MessageBroker/SQS/Queue/Produce/Named/Unknown', 1), + ("MessageBroker/SQS/Queue/Produce/Named/Unknown", 1), ] -@override_application_settings({'distributed_tracing.enabled': True}) -@validate_span_events(exact_agents={'aws.operation': 'CreateQueue'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'SendMessage'}, count=1) -@validate_span_events( - exact_agents={'aws.operation': 'ReceiveMessage'}, count=1) -@validate_span_events( - exact_agents={'aws.operation': 'SendMessageBatch'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'PurgeQueue'}, count=1) -@validate_span_events(exact_agents={'aws.operation': 'DeleteQueue'}, count=1) +@override_application_settings({"distributed_tracing.enabled": True}) +@validate_span_events(exact_agents={"aws.operation": "CreateQueue"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "SendMessage"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "ReceiveMessage"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "SendMessageBatch"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "PurgeQueue"}, count=1) +@validate_span_events(exact_agents={"aws.operation": "DeleteQueue"}, count=1) @validate_transaction_metrics( - 'test_botocore_sqs:test_sqs', - scoped_metrics=_sqs_scoped_metrics, - rollup_metrics=_sqs_rollup_metrics, - background_task=True) + "test_botocore_sqs:test_sqs", + scoped_metrics=_sqs_scoped_metrics, + rollup_metrics=_sqs_rollup_metrics, + background_task=True, +) @background_task() @moto.mock_sqs def test_sqs(): session = botocore.session.get_session() client = session.create_client( - 'sqs', - region_name=AWS_REGION, - aws_access_key_id=AWS_ACCESS_KEY_ID, - aws_secret_access_key=AWS_SECRET_ACCESS_KEY + "sqs", region_name=AWS_REGION, aws_access_key_id=AWS_ACCESS_KEY_ID, aws_secret_access_key=AWS_SECRET_ACCESS_KEY ) # Create queue resp = client.create_queue(QueueName=TEST_QUEUE) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 # QueueUrl is needed for rest of methods. 
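# The External metric hostnames in the tables above are version-gated:
# botocore >= 1.29.0 signs SQS requests against sqs.us-east-1.amazonaws.com,
# while older releases used queue.amazonaws.com. A minimal sketch of that
# gate, assuming get_package_version returns a dotted version string:
#
#     botocore_version = tuple(int(n) for n in get_package_version("botocore").split("."))
#     url = "sqs.us-east-1.amazonaws.com" if botocore_version >= (1, 29, 0) else "queue.amazonaws.com"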
- QUEUE_URL = resp['QueueUrl'] + QUEUE_URL = resp["QueueUrl"] # Send message - resp = client.send_message(QueueUrl=QUEUE_URL, MessageBody='hello_world') - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 + resp = client.send_message(QueueUrl=QUEUE_URL, MessageBody="hello_world") + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 # Receive message resp = client.receive_message(QueueUrl=QUEUE_URL) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 # Send message batch messages = [ - {'Id': '1', 'MessageBody': 'message 1'}, - {'Id': '2', 'MessageBody': 'message 2'}, - {'Id': '3', 'MessageBody': 'message 3'}, + {"Id": "1", "MessageBody": "message 1"}, + {"Id": "2", "MessageBody": "message 2"}, + {"Id": "3", "MessageBody": "message 3"}, ] resp = client.send_message_batch(QueueUrl=QUEUE_URL, Entries=messages) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 # Purge queue resp = client.purge_queue(QueueUrl=QUEUE_URL) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 # Delete queue resp = client.delete_queue(QueueUrl=QUEUE_URL) - assert resp['ResponseMetadata']['HTTPStatusCode'] == 200 + assert resp["ResponseMetadata"]["HTTPStatusCode"] == 200 -@override_application_settings({'distributed_tracing.enabled': True}) +@override_application_settings({"distributed_tracing.enabled": True}) @validate_transaction_metrics( - 'test_botocore_sqs:test_sqs_malformed', - scoped_metrics=_sqs_scoped_metrics_malformed, - rollup_metrics=_sqs_rollup_metrics_malformed, - background_task=True) + "test_botocore_sqs:test_sqs_malformed", + scoped_metrics=_sqs_scoped_metrics_malformed, + rollup_metrics=_sqs_rollup_metrics_malformed, + background_task=True, +) @background_task() @moto.mock_sqs def test_sqs_malformed(): session = botocore.session.get_session() client = session.create_client( - 'sqs', - region_name=AWS_REGION, - aws_access_key_id=AWS_ACCESS_KEY_ID, - aws_secret_access_key=AWS_SECRET_ACCESS_KEY + "sqs", region_name=AWS_REGION, aws_access_key_id=AWS_ACCESS_KEY_ID, aws_secret_access_key=AWS_SECRET_ACCESS_KEY ) # Malformed send message, uses arg instead of kwarg with pytest.raises(TypeError): - client.send_message('https://fake-url/', MessageBody='hello_world') + client.send_message("https://fake-url/", MessageBody="hello_world") diff --git a/tests/external_feedparser/conftest.py b/tests/external_feedparser/conftest.py index 818889161..11d19f1cd 100644 --- a/tests/external_feedparser/conftest.py +++ b/tests/external_feedparser/conftest.py @@ -13,8 +13,8 @@ # limitations under the License. 
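# The collector fixtures imported in these conftest modules are pytest
# fixtures pulled in purely so they are registered in the module namespace,
# which is why linters flag them as unused and why the patch attaches the
# combined "# noqa: F401; pylint: disable=W0611" suppression. A sketch of an
# equivalent per-name spelling (illustrative only, not the form the patch uses):
#
#     from testing_support.fixtures import collector_agent_registration_fixture  # noqa: F401
#     from testing_support.fixtures import collector_available_fixture  # noqa: F401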
import pytest -from testing_support.fixtures import (code_coverage_fixture, # noqa - collector_agent_registration_fixture, collector_available_fixture) + +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 from testing_support.mock_external_http_server import MockExternalHTTPServer _default_settings = { @@ -29,12 +29,6 @@ app_name='Python Agent Test (external_feedparser)', default_settings=_default_settings) -_coverage_source = [ - 'newrelic.hooks.external_feedparser', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) - def create_handler(response): def handler(self): diff --git a/tests/external_feedparser/test_feedparser.py b/tests/external_feedparser/test_feedparser.py index 9cb175711..5e515cfc3 100644 --- a/tests/external_feedparser/test_feedparser.py +++ b/tests/external_feedparser/test_feedparser.py @@ -14,7 +14,7 @@ import pytest from newrelic.api.background_task import background_task -from testing_support.fixtures import validate_transaction_metrics +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics @pytest.fixture(scope="session") diff --git a/tests/external_http/conftest.py b/tests/external_http/conftest.py index c4f768907..f8afb49f3 100644 --- a/tests/external_http/conftest.py +++ b/tests/external_http/conftest.py @@ -14,17 +14,10 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 from testing_support.mock_external_http_server import ( MockExternalHTTPHResponseHeadersServer) -_coverage_source = [ - 'newrelic.api.external_trace', - 'newrelic.hooks.external_httplib', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/external_http/test_http.py b/tests/external_http/test_http.py index 16bef3f2a..e08518f5f 100644 --- a/tests/external_http/test_http.py +++ b/tests/external_http/test_http.py @@ -21,8 +21,8 @@ from testing_support.fixtures import ( cat_enabled, override_application_settings, - validate_transaction_metrics, ) +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.validators.validate_cross_process_headers import ( validate_cross_process_headers, ) diff --git a/tests/external_httplib/conftest.py b/tests/external_httplib/conftest.py index 1c9a69713..2edbeab91 100644 --- a/tests/external_httplib/conftest.py +++ b/tests/external_httplib/conftest.py @@ -14,19 +14,10 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 from testing_support.mock_external_http_server import ( MockExternalHTTPHResponseHeadersServer) -_coverage_source = [ - 'newrelic.api.external_trace', - 'newrelic.hooks.external_httplib', - 'newrelic.hooks.external_urllib', - 'newrelic.hooks.external_urllib2', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/external_httplib/test_httplib.py 
b/tests/external_httplib/test_httplib.py index 69f790846..f67e68dc2 100644 --- a/tests/external_httplib/test_httplib.py +++ b/tests/external_httplib/test_httplib.py @@ -23,12 +23,7 @@ cache_outgoing_headers, insert_incoming_headers, ) -from testing_support.fixtures import ( - cat_enabled, - override_application_settings, - validate_transaction_metrics, - validate_tt_segment_params, -) +from testing_support.fixtures import cat_enabled, override_application_settings from testing_support.validators.validate_cross_process_headers import ( validate_cross_process_headers, ) @@ -36,6 +31,12 @@ validate_external_node_params, ) from testing_support.validators.validate_span_events import validate_span_events +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) +from testing_support.validators.validate_tt_segment_params import ( + validate_tt_segment_params, +) from newrelic.api.background_task import background_task from newrelic.common.encoding_utils import DistributedTracePayload @@ -104,7 +105,8 @@ def test_httplib_https_request(server): ) @background_task(name="test_httplib:test_httplib_https_request") def _test(): - connection = httplib.HTTPSConnection("localhost", server.port) + # fix HTTPSConnection: https://wiki.openstack.org/wiki/OSSN/OSSN-0033 + connection = httplib.HTTPSConnection("localhost", server.port) # nosec # It doesn't matter that a SSL exception is raised here because the # agent still records this as an external request try: diff --git a/tests/external_httplib/test_urllib.py b/tests/external_httplib/test_urllib.py index 38cf4f713..cea88a8dd 100644 --- a/tests/external_httplib/test_urllib.py +++ b/tests/external_httplib/test_urllib.py @@ -25,7 +25,8 @@ cache_outgoing_headers, insert_incoming_headers, ) -from testing_support.fixtures import cat_enabled, validate_transaction_metrics +from testing_support.fixtures import cat_enabled +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.validators.validate_cross_process_headers import ( validate_cross_process_headers, ) diff --git a/tests/external_httplib/test_urllib2.py b/tests/external_httplib/test_urllib2.py index cbcb25a2f..62ed23074 100644 --- a/tests/external_httplib/test_urllib2.py +++ b/tests/external_httplib/test_urllib2.py @@ -25,7 +25,8 @@ cache_outgoing_headers, insert_incoming_headers, ) -from testing_support.fixtures import cat_enabled, validate_transaction_metrics +from testing_support.fixtures import cat_enabled +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.validators.validate_cross_process_headers import ( validate_cross_process_headers, ) diff --git a/tests/external_httplib2/conftest.py b/tests/external_httplib2/conftest.py index b8bc40e6c..cf3501da5 100644 --- a/tests/external_httplib2/conftest.py +++ b/tests/external_httplib2/conftest.py @@ -14,17 +14,11 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 from testing_support.mock_external_http_server import ( MockExternalHTTPHResponseHeadersServer) -_coverage_source = [ - 'newrelic.hooks.external_httplib2', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, 
diff --git a/tests/external_httplib2/test_httplib2.py b/tests/external_httplib2/test_httplib2.py index d3e71e7b3..288aa84ee 100644 --- a/tests/external_httplib2/test_httplib2.py +++ b/tests/external_httplib2/test_httplib2.py @@ -21,7 +21,6 @@ from testing_support.fixtures import ( cat_enabled, override_application_settings, - validate_transaction_metrics, ) from testing_support.validators.validate_cross_process_headers import ( validate_cross_process_headers, @@ -29,6 +28,7 @@ from testing_support.validators.validate_external_node_params import ( validate_external_node_params, ) +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from newrelic.api.background_task import background_task diff --git a/tests/external_httpx/conftest.py b/tests/external_httpx/conftest.py index bad35d45e..87ea1bec0 100644 --- a/tests/external_httpx/conftest.py +++ b/tests/external_httpx/conftest.py @@ -16,18 +16,8 @@ import pytest from testing_support.fixture.event_loop import event_loop as loop -from testing_support.fixtures import ( - code_coverage_fixture, - collector_agent_registration_fixture, - collector_available_fixture, -) - -_coverage_source = [ - "newrelic.api.external_trace", - "newrelic.hooks.external_httpx", -] +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { "transaction_tracer.explain_threshold": 0.0, diff --git a/tests/external_httpx/test_client.py b/tests/external_httpx/test_client.py index 1d170a5fa..b4760a38f 100644 --- a/tests/external_httpx/test_client.py +++ b/tests/external_httpx/test_client.py @@ -19,9 +19,6 @@ dt_enabled, override_application_settings, override_generic_settings, - validate_transaction_errors, - validate_transaction_metrics, - validate_tt_segment_params, ) from testing_support.mock_external_http_server import ( MockExternalHTTPHResponseHeadersServer, @@ -30,6 +27,15 @@ validate_cross_process_headers, ) from testing_support.validators.validate_span_events import validate_span_events +from testing_support.validators.validate_transaction_errors import ( + validate_transaction_errors, +) +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) +from testing_support.validators.validate_tt_segment_params import ( + validate_tt_segment_params, +) from newrelic.api.background_task import background_task from newrelic.api.time_trace import current_trace diff --git a/tests/external_requests/conftest.py b/tests/external_requests/conftest.py index 02fee2419..10a2ccf05 100644 --- a/tests/external_requests/conftest.py +++ b/tests/external_requests/conftest.py @@ -14,17 +14,10 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 from testing_support.mock_external_http_server import ( MockExternalHTTPHResponseHeadersServer) -_coverage_source = [ - 'newrelic.api.external_trace', - 'newrelic.hooks.external_requests', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/external_requests/test_requests.py b/tests/external_requests/test_requests.py index b61cf36df..f6f4506e5 100644 --- 
a/tests/external_requests/test_requests.py +++ b/tests/external_requests/test_requests.py @@ -22,8 +22,6 @@ from testing_support.fixtures import ( cat_enabled, override_application_settings, - validate_transaction_errors, - validate_transaction_metrics, validate_tt_parenting, ) from testing_support.validators.validate_cross_process_headers import ( @@ -32,7 +30,8 @@ from testing_support.validators.validate_external_node_params import ( validate_external_node_params, ) - +from testing_support.validators.validate_transaction_errors import validate_transaction_errors +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from newrelic.api.background_task import background_task diff --git a/tests/external_urllib3/conftest.py b/tests/external_urllib3/conftest.py index b71263dda..19d3f394b 100644 --- a/tests/external_urllib3/conftest.py +++ b/tests/external_urllib3/conftest.py @@ -14,18 +14,11 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 from testing_support.mock_external_http_server import ( MockExternalHTTPHResponseHeadersServer) -_coverage_source = [ - 'newrelic.api.external_trace', - 'newrelic.hooks.external_urllib3', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { 'transaction_tracer.explain_threshold': 0.0, diff --git a/tests/external_urllib3/test_urllib3.py b/tests/external_urllib3/test_urllib3.py index 93287e000..68e15d463 100644 --- a/tests/external_urllib3/test_urllib3.py +++ b/tests/external_urllib3/test_urllib3.py @@ -28,8 +28,6 @@ from testing_support.fixtures import ( cat_enabled, override_application_settings, - validate_transaction_errors, - validate_transaction_metrics, ) from testing_support.util import version2tuple from testing_support.validators.validate_cross_process_headers import ( @@ -38,7 +36,8 @@ from testing_support.validators.validate_external_node_params import ( validate_external_node_params, ) - +from testing_support.validators.validate_transaction_errors import validate_transaction_errors +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from newrelic.api.background_task import background_task diff --git a/tests/framework_aiohttp/_target_application.py b/tests/framework_aiohttp/_target_application.py index 895260798..f15e7fd65 100644 --- a/tests/framework_aiohttp/_target_application.py +++ b/tests/framework_aiohttp/_target_application.py @@ -14,53 +14,51 @@ import asyncio import sys -from aiohttp import web, WSMsgType, ClientSession + +from aiohttp import ClientSession, WSMsgType, web + from newrelic.api.function_trace import function_trace -@asyncio.coroutine -def index(request): - yield - resp = web.Response(text='Hello Aiohttp!') - resp.set_cookie('ExampleCookie', 'ExampleValue') +async def index(request): + await asyncio.sleep(0) + resp = web.Response(text="Hello Aiohttp!") + resp.set_cookie("ExampleCookie", "ExampleValue") return resp -@asyncio.coroutine -def hang(request): +async def hang(request): while True: - yield + await asyncio.sleep(0) -@asyncio.coroutine -def error(request): +async def error(request): raise ValueError("Value Error") -@asyncio.coroutine -def non_500_error(request): +async def non_500_error(request): raise web.HTTPGone() -@asyncio.coroutine -def raise_404(request): 
+async def raise_403(request): + raise web.HTTPForbidden() + + +async def raise_404(request): raise web.HTTPNotFound() -@asyncio.coroutine @function_trace() -def wait(): - yield from asyncio.sleep(0.1) +async def wait(): + await asyncio.sleep(0.1) -@asyncio.coroutine -def run_task(loop): - yield from wait() +async def run_task(loop): + await wait() loop.stop() -@asyncio.coroutine -def background(request): +async def background(request): try: loop = request.loop except AttributeError: @@ -68,16 +66,14 @@ def background(request): asyncio.set_event_loop(loop) asyncio.tasks.ensure_future(run_task(loop)) - return web.Response(text='Background Task Scheduled') + return web.Response(text="Background Task Scheduled") class HelloWorldView(web.View): - - @asyncio.coroutine - def _respond(self): - yield - resp = web.Response(text='Hello Aiohttp!') - resp.set_cookie('ExampleCookie', 'ExampleValue') + async def _respond(self): + await asyncio.sleep(0) + resp = web.Response(text="Hello Aiohttp!") + resp.set_cookie("ExampleCookie", "ExampleValue") return resp get = _respond @@ -92,15 +88,13 @@ class KnownException(Exception): class KnownErrorView(web.View): - - @asyncio.coroutine - def _respond(self): + async def _respond(self): try: - yield + await asyncio.sleep(0) except KnownException: pass finally: - return web.Response(text='Hello Aiohttp!') + return web.Response(text="Hello Aiohttp!") get = _respond post = _respond @@ -109,83 +103,80 @@ def _respond(self): delete = _respond -@asyncio.coroutine -def websocket_handler(request): +async def websocket_handler(request): ws = web.WebSocketResponse() - yield from ws.prepare(request) + await ws.prepare(request) # receive messages for all eternity! # (or until the client closes the socket) while not ws.closed: - msg = yield from ws.receive() + msg = await ws.receive() if msg.type == WSMsgType.TEXT: - result = ws.send_str('/' + msg.data) - if hasattr(result, '__await__'): - yield from result.__await__() + result = ws.send_str("/" + msg.data) + if hasattr(result, "__await__"): + await result return ws -@asyncio.coroutine -def fetch(method, url, loop): +async def fetch(method, url, loop): session = ClientSession(loop=loop) - if hasattr(session, '__aenter__'): - yield from session.__aenter__() + if hasattr(session, "__aenter__"): + await session.__aenter__() else: session.__enter__() try: _method = getattr(session, method) try: - response = yield from asyncio.wait_for(_method(url), timeout=None, loop=loop) + response = await asyncio.wait_for(_method(url), timeout=None, loop=loop) except TypeError: - response = yield from asyncio.wait_for(_method(url), timeout=None) - text = yield from response.text() + response = await asyncio.wait_for(_method(url), timeout=None) + text = await response.text() finally: - if hasattr(session, '__aexit__'): - yield from session.__aexit__(*sys.exc_info()) + if hasattr(session, "__aexit__"): + await session.__aexit__(*sys.exc_info()) else: session.__exit__(*sys.exc_info()) return text -@asyncio.coroutine -def fetch_multiple(method, loop, url): +async def fetch_multiple(method, loop, url): coros = [fetch(method, url, loop) for _ in range(2)] try: - responses = yield from asyncio.gather(*coros, loop=loop) + responses = await asyncio.gather(*coros, loop=loop) except TypeError: - responses = yield from asyncio.gather(*coros) - return '\n'.join(responses) + responses = await asyncio.gather(*coros) + return "\n".join(responses) -@asyncio.coroutine -def multi_fetch_handler(request): +async def multi_fetch_handler(request): try: loop = 
request.loop except AttributeError: loop = request.task._loop - responses = yield from fetch_multiple('get', loop, request.query['url']) - return web.Response(text=responses, content_type='text/html') + responses = await fetch_multiple("get", loop, request.query["url"]) + return web.Response(text=responses, content_type="text/html") def make_app(middlewares=None): app = web.Application(middlewares=middlewares) - app.router.add_route('*', '/coro', index) - app.router.add_route('*', '/class', HelloWorldView) - app.router.add_route('*', '/error', error) - app.router.add_route('*', '/known_error', KnownErrorView) - app.router.add_route('*', '/non_500_error', non_500_error) - app.router.add_route('*', '/raise_404', raise_404) - app.router.add_route('*', '/hang', hang) - app.router.add_route('*', '/background', background) - app.router.add_route('*', '/ws', websocket_handler) - app.router.add_route('*', '/multi_fetch', multi_fetch_handler) + app.router.add_route("*", "/coro", index) + app.router.add_route("*", "/class", HelloWorldView) + app.router.add_route("*", "/error", error) + app.router.add_route("*", "/known_error", KnownErrorView) + app.router.add_route("*", "/non_500_error", non_500_error) + app.router.add_route("*", "/raise_403", raise_403) + app.router.add_route("*", "/raise_404", raise_404) + app.router.add_route("*", "/hang", hang) + app.router.add_route("*", "/background", background) + app.router.add_route("*", "/ws", websocket_handler) + app.router.add_route("*", "/multi_fetch", multi_fetch_handler) for route in app.router.routes(): handler = route.handler @@ -199,5 +190,5 @@ def make_app(middlewares=None): return app -if __name__ == '__main__': - web.run_app(make_app(), host='127.0.0.1') +if __name__ == "__main__": + web.run_app(make_app(), host="127.0.0.1") diff --git a/tests/framework_aiohttp/conftest.py b/tests/framework_aiohttp/conftest.py index b4a31d7e2..3bb814a9b 100644 --- a/tests/framework_aiohttp/conftest.py +++ b/tests/framework_aiohttp/conftest.py @@ -22,21 +22,13 @@ from testing_support.fixture.event_loop import ( # noqa: F401 pylint: disable=W0611 event_loop, ) -from testing_support.fixtures import ( # noqa: F401 pylint: disable=W0611 - code_coverage_fixture, - collector_agent_registration_fixture, - collector_available_fixture, -) + +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 from testing_support.mock_external_http_server import ( MockExternalHTTPHResponseHeadersServer, MockExternalHTTPServer, ) -_coverage_source = [ - "newrelic.hooks.framework_aiohttp", -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { "transaction_tracer.explain_threshold": 0.0, @@ -76,8 +68,7 @@ def tearDown(self): if hasattr(self, "asyncTearDown"): asyncio.get_event_loop().run_until_complete(self.asyncTearDown()) - @asyncio.coroutine - def _get_client(self, app_or_server): + async def _get_client(self, app_or_server): """Return a TestClient instance.""" client_constructor_arg = app_or_server diff --git a/tests/framework_aiohttp/test_client.py b/tests/framework_aiohttp/test_client.py index b2d23dd23..96bbb46f0 100644 --- a/tests/framework_aiohttp/test_client.py +++ b/tests/framework_aiohttp/test_client.py @@ -16,7 +16,9 @@ import aiohttp import pytest -from testing_support.fixtures import validate_transaction_metrics +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) from yarl import URL from 
newrelic.api.background_task import background_task @@ -28,18 +30,16 @@ ) -@asyncio.coroutine -def fetch(method, url): +async def fetch(method, url): with aiohttp.ClientSession() as session: _method = getattr(session, method) - response = yield from asyncio.wait_for(_method(url), timeout=None) + response = await asyncio.wait_for(_method(url), timeout=None) response.raise_for_status() - yield from response.text() + await response.text() @background_task(name="fetch_multiple") -@asyncio.coroutine -def fetch_multiple(method, url): +async def fetch_multiple(method, url): coros = [fetch(method, url) for _ in range(2)] return asyncio.gather(*coros, return_exceptions=True) @@ -126,8 +126,7 @@ class ThrowerException(ValueError): pass @background_task(name="test_client_throw_yield_from") - @asyncio.coroutine - def self_driving_thrower(): + async def self_driving_thrower(): with aiohttp.ClientSession() as session: coro = session._request(method.upper(), local_server_info.url) @@ -159,8 +158,7 @@ def task_test(): @pytest.mark.parametrize("method,exc_expected", test_matrix) def test_client_close_yield_from(event_loop, local_server_info, method, exc_expected): @background_task(name="test_client_close_yield_from") - @asyncio.coroutine - def self_driving_closer(): + async def self_driving_closer(): with aiohttp.ClientSession() as session: coro = session._request(method.upper(), local_server_info.url) @@ -219,17 +217,15 @@ def test_create_task_yield_from(event_loop, local_server_info, method, exc_expec # `loop.create_task` returns a Task object which uses the coroutine's # `send` method, not `__next__` - @asyncio.coroutine - def fetch_task(loop): + async def fetch_task(loop): with aiohttp.ClientSession() as session: coro = getattr(session, method) - resp = yield from loop.create_task(coro(local_server_info.url)) + resp = await loop.create_task(coro(local_server_info.url)) resp.raise_for_status() - yield from resp.text() + await resp.text() @background_task(name="test_create_task_yield_from") - @asyncio.coroutine - def fetch_multiple(loop): + async def fetch_multiple(loop): coros = [fetch_task(loop) for _ in range(2)] return asyncio.gather(*coros, return_exceptions=True) diff --git a/tests/framework_aiohttp/test_client_async_await.py b/tests/framework_aiohttp/test_client_async_await.py index 1e3eb79ec..dedc64c9d 100644 --- a/tests/framework_aiohttp/test_client_async_await.py +++ b/tests/framework_aiohttp/test_client_async_await.py @@ -16,7 +16,8 @@ import aiohttp import pytest -from testing_support.fixtures import cat_enabled, validate_transaction_metrics +from testing_support.fixtures import cat_enabled +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from yarl import URL from newrelic.api.background_task import background_task diff --git a/tests/framework_aiohttp/test_client_cat.py b/tests/framework_aiohttp/test_client_cat.py index a830c2269..887743429 100644 --- a/tests/framework_aiohttp/test_client_cat.py +++ b/tests/framework_aiohttp/test_client_cat.py @@ -13,21 +13,20 @@ # limitations under the License. 
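# Every coroutine in these client tests is converted the same mechanical way:
# the generator-based form
#
#     @asyncio.coroutine
#     def fetch(method, url):
#         response = yield from asyncio.wait_for(_method(url), timeout=None)
#         yield from response.text()
#
# becomes a native coroutine
#
#     async def fetch(method, url):
#         response = await asyncio.wait_for(_method(url), timeout=None)
#         await response.text()
#
# (bodies abbreviated from the hunks above)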
import asyncio -import os import aiohttp import pytest from testing_support.external_fixtures import create_incoming_headers -from testing_support.fixtures import ( - override_application_settings, - validate_transaction_metrics, -) +from testing_support.fixtures import override_application_settings from testing_support.validators.validate_cross_process_headers import ( validate_cross_process_headers, ) from testing_support.validators.validate_external_node_params import ( validate_external_node_params, ) +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) from newrelic.api.background_task import background_task from newrelic.api.external_trace import ExternalTrace @@ -41,8 +40,7 @@ _expected_error_class = aiohttp.client_exceptions.ClientResponseError -@asyncio.coroutine -def fetch(url, headers=None, raise_for_status=False, connector=None): +async def fetch(url, headers=None, raise_for_status=False, connector=None): kwargs = {} if version_info >= (2, 0): @@ -53,13 +51,13 @@ def fetch(url, headers=None, raise_for_status=False, connector=None): headers = {} try: - response = yield from request + response = await request if raise_for_status and version_info < (2, 0): response.raise_for_status() except _expected_error_class: return headers - response_text = yield from response.text() + response_text = await response.text() for header in response_text.split("\n"): if not header: continue @@ -69,7 +67,7 @@ def fetch(url, headers=None, raise_for_status=False, connector=None): continue headers[h.strip()] = v.strip() f = session.close() - yield from asyncio.ensure_future(f) + await asyncio.ensure_future(f) return headers @@ -78,9 +76,8 @@ def fetch(url, headers=None, raise_for_status=False, connector=None): @pytest.mark.parametrize("span_events", (True, False)) def test_outbound_cross_process_headers(event_loop, cat_enabled, distributed_tracing, span_events, mock_header_server): @background_task(name="test_outbound_cross_process_headers") - @asyncio.coroutine - def _test(): - headers = yield from fetch("http://127.0.0.1:%d" % mock_header_server.port) + async def _test(): + headers = await fetch("http://127.0.0.1:%d" % mock_header_server.port) transaction = current_transaction() transaction._test_request_headers = headers @@ -144,15 +141,14 @@ def test_outbound_cross_process_headers_no_txn(event_loop, mock_header_server): def test_outbound_cross_process_headers_exception(event_loop, mock_header_server): @background_task(name="test_outbound_cross_process_headers_exception") - @asyncio.coroutine - def test(): + async def test(): # corrupt the transaction object to force an error transaction = current_transaction() guid = transaction.guid delattr(transaction, "guid") try: - headers = yield from fetch("http://127.0.0.1:%d" % mock_header_server.port) + headers = await fetch("http://127.0.0.1:%d" % mock_header_server.port) assert not headers.get(ExternalTrace.cat_id_key) assert not headers.get(ExternalTrace.cat_transaction_key) @@ -163,10 +159,9 @@ def test(): class PoorResolvingConnector(aiohttp.TCPConnector): - @asyncio.coroutine - def _resolve_host(self, host, port, *args, **kwargs): + async def _resolve_host(self, host, port, *args, **kwargs): res = [{"hostname": host, "host": host, "port": 1234, "family": self._family, "proto": 0, "flags": 0}] - hosts = yield from super(PoorResolvingConnector, self)._resolve_host(host, port, *args, **kwargs) + hosts = await super(PoorResolvingConnector, self)._resolve_host(host, port, *args, **kwargs) for hinfo in 
hosts: res.append(hinfo) return res diff --git a/tests/framework_aiohttp/test_externals.py b/tests/framework_aiohttp/test_externals.py index a410590ef..acbc4ca81 100644 --- a/tests/framework_aiohttp/test_externals.py +++ b/tests/framework_aiohttp/test_externals.py @@ -12,29 +12,31 @@ # See the License for the specific language governing permissions and # limitations under the License. -import pytest -import asyncio -from testing_support.fixtures import (validate_transaction_metrics, - validate_tt_parenting) +from testing_support.fixtures import validate_tt_parenting +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) expected_parenting = ( - 'TransactionNode', [ - ('FunctionNode', [ - ('ExternalTrace', []), - ('ExternalTrace', []), - ]), -]) + "TransactionNode", + [ + ( + "FunctionNode", + [ + ("ExternalTrace", []), + ("ExternalTrace", []), + ], + ), + ], +) @validate_tt_parenting(expected_parenting) -@validate_transaction_metrics('_target_application:multi_fetch_handler', - rollup_metrics=[('External/all', 2)]) +@validate_transaction_metrics("_target_application:multi_fetch_handler", rollup_metrics=[("External/all", 2)]) def test_multiple_requests_within_transaction(local_server_info, aiohttp_app): - @asyncio.coroutine - def fetch(): - resp = yield from aiohttp_app.client.request('GET', '/multi_fetch', - params={'url': local_server_info.url}) + async def fetch(): + resp = await aiohttp_app.client.request("GET", "/multi_fetch", params={"url": local_server_info.url}) assert resp.status == 200 aiohttp_app.loop.run_until_complete(fetch()) diff --git a/tests/framework_aiohttp/test_middleware.py b/tests/framework_aiohttp/test_middleware.py index 47050232f..6cbf86677 100644 --- a/tests/framework_aiohttp/test_middleware.py +++ b/tests/framework_aiohttp/test_middleware.py @@ -12,57 +12,53 @@ # See the License for the specific language governing permissions and # limitations under the License. 
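# expected_parenting in test_externals.py (above) describes the expected
# trace tree as nested (node_type, children) pairs: one TransactionNode
# whose FunctionNode child parents the two ExternalTrace nodes produced by
# the concurrent fetches, i.e.
#
#     ("TransactionNode", [
#         ("FunctionNode", [
#             ("ExternalTrace", []),
#             ("ExternalTrace", []),
#         ]),
#     ])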
diff --git a/tests/framework_aiohttp/test_middleware.py b/tests/framework_aiohttp/test_middleware.py
index 47050232f..6cbf86677 100644
--- a/tests/framework_aiohttp/test_middleware.py
+++ b/tests/framework_aiohttp/test_middleware.py
@@ -12,57 +12,53 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-import pytest
-import asyncio
 import aiohttp
+import pytest
+from testing_support.fixtures import override_generic_settings
+from testing_support.validators.validate_code_level_metrics import (
+    validate_code_level_metrics,
+)
+from testing_support.validators.validate_transaction_metrics import (
+    validate_transaction_metrics,
+)

 from newrelic.core.config import global_settings
-from testing_support.fixtures import (validate_transaction_metrics,
-    override_generic_settings)
-from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics
-
-version_info = tuple(int(_) for _ in aiohttp.__version__.split('.')[:2])
+version_info = tuple(int(_) for _ in aiohttp.__version__.split(".")[:2])


-@asyncio.coroutine
-def middleware_factory(app, handler):
-    @asyncio.coroutine
-    def middleware_handler(request):
-        response = yield from handler(request)
+async def middleware_factory(app, handler):
+    async def middleware_handler(request):
+        response = await handler(request)
         return response

     return middleware_handler


 middleware_tests = [
-    (middleware_factory, 'Function/test_middleware:'
-        'middleware_factory.<locals>.middleware_handler'),
+    (middleware_factory, "Function/test_middleware:" "middleware_factory.<locals>.middleware_handler"),
 ]

 if version_info >= (3, 0):
+
     @aiohttp.web.middleware
-    @asyncio.coroutine
-    def new_style_middleware(request, handler):
-        response = yield from handler(request)
+    async def new_style_middleware(request, handler):
+        response = await handler(request)
         return response

     middleware_tests.append(
-        (new_style_middleware,
-            'Function/test_middleware:new_style_middleware'),
+        (new_style_middleware, "Function/test_middleware:new_style_middleware"),
     )


-@pytest.mark.parametrize('nr_enabled', [True, False])
-@pytest.mark.parametrize('middleware,metric', middleware_tests)
+@pytest.mark.parametrize("nr_enabled", [True, False])
+@pytest.mark.parametrize("middleware,metric", middleware_tests)
 def test_middleware(nr_enabled, aiohttp_app, middleware, metric):
-
-    @asyncio.coroutine
-    def fetch():
-        resp = yield from aiohttp_app.client.request('GET', '/coro')
+    async def fetch():
+        resp = await aiohttp_app.client.request("GET", "/coro")
         assert resp.status == 200
-        text = yield from resp.text()
+        text = await resp.text()
         assert "Hello Aiohttp!" in text
         return resp

@@ -71,27 +67,27 @@ def _test():

     if nr_enabled:
         scoped_metrics = [
-            ('Function/_target_application:index', 1),
+            ("Function/_target_application:index", 1),
             (metric, 1),
         ]

         rollup_metrics = [
-            ('Function/_target_application:index', 1),
+            ("Function/_target_application:index", 1),
             (metric, 1),
-            ('Python/Framework/aiohttp/%s' % aiohttp.__version__, 1),
+            ("Python/Framework/aiohttp/%s" % aiohttp.__version__, 1),
         ]

-        _test = validate_transaction_metrics('_target_application:index',
-            scoped_metrics=scoped_metrics,
-            rollup_metrics=rollup_metrics)(_test)
+        _test = validate_transaction_metrics(
+            "_target_application:index", scoped_metrics=scoped_metrics, rollup_metrics=rollup_metrics
+        )(_test)

         _test = validate_code_level_metrics("_target_application", "index")(_test)
-
+
         func_name = metric.split("/")[1].replace(":", ".").split(".")
         namespace, func_name = ".".join(func_name[:-1]), func_name[-1]
         _test = validate_code_level_metrics(namespace, func_name)(_test)
     else:
         settings = global_settings()

-        _test = override_generic_settings(settings, {'enabled': False})(_test)
+        _test = override_generic_settings(settings, {"enabled": False})(_test)

     _test()
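test_middleware.py keeps both middleware flavors: the old factory style, and the decorator style that aiohttp 3.x accepts. A minimal standalone sketch of the new style, with an illustrative X-Handled-By header that is not part of the tests:

    from aiohttp import web


    @web.middleware
    async def tagging_middleware(request, handler):
        # New-style middleware takes (request, handler) directly; the old
        # style required a factory coroutine returning a wrapped handler.
        response = await handler(request)
        response.headers["X-Handled-By"] = "tagging_middleware"  # illustrative
        return response


    async def index(request):
        return web.Response(text="Hello Aiohttp!")


    app = web.Application(middlewares=[tagging_middleware])
    app.router.add_get("/coro", index)
    # web.run_app(app)  # uncomment to serve locally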
diff --git a/tests/framework_aiohttp/test_server.py b/tests/framework_aiohttp/test_server.py
index 2b4e28b5f..6a5ef0d10 100644
--- a/tests/framework_aiohttp/test_server.py
+++ b/tests/framework_aiohttp/test_server.py
@@ -12,64 +12,65 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-import pytest
 import asyncio
+
 import aiohttp
-from newrelic.core.config import global_settings
+import pytest
+from testing_support.fixtures import (
+    count_transactions,
+    override_application_settings,
+    override_expected_status_codes,
+    override_generic_settings,
+    override_ignore_status_codes,
+)
+from testing_support.validators.validate_code_level_metrics import (
+    validate_code_level_metrics,
+)
+from testing_support.validators.validate_transaction_errors import (
+    validate_transaction_errors,
+)
+from testing_support.validators.validate_transaction_event_attributes import (
+    validate_transaction_event_attributes,
+)
+from testing_support.validators.validate_transaction_metrics import (
+    validate_transaction_metrics,
+)

-from testing_support.fixtures import (validate_transaction_metrics,
-    validate_transaction_errors, validate_transaction_event_attributes,
-    count_transactions, override_generic_settings,
-    override_application_settings, override_ignore_status_codes)
-from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics
+from newrelic.core.config import global_settings

-version_info = tuple(int(_) for _ in aiohttp.__version__.split('.')[:2])
+version_info = tuple(int(_) for _ in aiohttp.__version__.split(".")[:2])

-BASE_REQUIRED_ATTRS = ['request.headers.contentType',
-    'request.method']
+BASE_REQUIRED_ATTRS = ["request.headers.contentType", "request.method"]

 # The agent should not record these attributes in events unless the settings
 # explicitly say to do so
-BASE_FORGONE_ATTRS = ['request.parameters.hello']
-
-
-@pytest.mark.parametrize('nr_enabled', [True, False])
-@pytest.mark.parametrize('method', [
-    'GET',
-    'POST',
-    'PUT',
-    'PATCH',
-    'DELETE',
-])
-@pytest.mark.parametrize('uri,metric_name,error,status', [
-    (
-        '/error?hello=world',
-        '_target_application:error',
-        'builtins:ValueError',
-        500
-    ),
-
-    (
-        '/non_500_error?hello=world',
-        '_target_application:non_500_error',
-        'aiohttp.web_exceptions:HTTPGone',
-        410
-    ),
-
-    (
-        '/raise_404?hello=world',
-        '_target_application:raise_404',
-        None,
-        404
-    ),
-])
-def test_error_exception(method, uri, metric_name, error, status, nr_enabled,
-        aiohttp_app):
-    @asyncio.coroutine
-    def fetch():
-        resp = yield from aiohttp_app.client.request(method,
-            uri, headers={'content-type': 'text/plain'})
+BASE_FORGONE_ATTRS = ["request.parameters.hello"]
+
+
+@pytest.mark.parametrize("nr_enabled", [True, False])
+@pytest.mark.parametrize(
+    "method",
+    [
+        "GET",
+        "POST",
+        "PUT",
+        "PATCH",
+        "DELETE",
+    ],
+)
+@pytest.mark.parametrize(
+    "uri,metric_name,error,status",
+    [
+        ("/error?hello=world", "_target_application:error", "builtins:ValueError", 500),
+        ("/non_500_error?hello=world", "_target_application:non_500_error", "aiohttp.web_exceptions:HTTPGone", 410),
+        ("/raise_404?hello=world", "_target_application:raise_404", None, 404),
+        ("/raise_403?hello=world", "_target_application:raise_403", "aiohttp.web_exceptions:HTTPForbidden", 403),
+    ],
+)
+def test_error_exception(method, uri, metric_name, error, status, nr_enabled, aiohttp_app):
+    async def fetch():
+        resp = await aiohttp_app.client.request(method, uri, headers={"content-type": "text/plain"})
         assert resp.status == status

     required_attrs = list(BASE_REQUIRED_ATTRS)
@@ -80,77 +81,82 @@ def fetch():
         if error:
             errors.append(error)

-        @validate_transaction_errors(errors=errors)
-        @validate_transaction_metrics(metric_name,
+        @validate_transaction_errors(
+            errors=errors, expected_errors=["aiohttp.web_exceptions:HTTPForbidden"]
+        )
+        @validate_transaction_metrics(
+            metric_name,
             scoped_metrics=[
-                ('Function/%s' % metric_name, 1),
+                ("Function/%s" % metric_name, 1),
             ],
             rollup_metrics=[
-                ('Function/%s' % metric_name, 1),
-                ('Python/Framework/aiohttp/%s' % aiohttp.__version__, 1),
+                ("Function/%s" % metric_name, 1),
+                ("Python/Framework/aiohttp/%s" % aiohttp.__version__, 1),
             ],
         )
         @validate_transaction_event_attributes(
             required_params={
-                'agent': required_attrs,
-                'user': [],
-                'intrinsic': [],
+                "agent": required_attrs,
+                "user": [],
+                "intrinsic": [],
             },
             forgone_params={
-                'agent': forgone_attrs,
-                'user': [],
-                'intrinsic': [],
+                "agent": forgone_attrs,
+                "user": [],
+                "intrinsic": [],
             },
             exact_attrs={
-                'agent': {
-                    'response.status': str(status),
+                "agent": {
+                    "response.status": str(status),
                 },
-                'user': {},
-                'intrinsic': {},
+                "user": {},
+                "intrinsic": {},
             },
         )
         @validate_code_level_metrics(*metric_name.split(":"))
         @override_ignore_status_codes([404])
+        @override_expected_status_codes([403])
         def _test():
             aiohttp_app.loop.run_until_complete(fetch())
+
     else:
         settings = global_settings()

-        @override_generic_settings(settings, {'enabled': False})
+        @override_generic_settings(settings, {"enabled": False})
        def _test():
            aiohttp_app.loop.run_until_complete(fetch())

     _test()


-@pytest.mark.parametrize('nr_enabled', [True, False])
-@pytest.mark.parametrize('method', [
-    'GET',
-    'POST',
-    'PUT',
-    'PATCH',
-    'DELETE',
-])
-@pytest.mark.parametrize('uri,metric_name', [
-    ('/coro?hello=world', '_target_application:index'),
-    ('/class?hello=world', '_target_application:HelloWorldView._respond'),
-    ('/known_error?hello=world',
-        '_target_application:KnownErrorView._respond'),
-])
-def test_simultaneous_requests(method, uri, metric_name,
-        nr_enabled, aiohttp_app):
-
-    @asyncio.coroutine
-    def fetch():
-        resp = yield from aiohttp_app.client.request(method, uri,
-            headers={'content-type': 'text/plain'})
+@pytest.mark.parametrize("nr_enabled", [True, False])
+@pytest.mark.parametrize(
+    "method",
+    [
+        "GET",
+        "POST",
+        "PUT",
+        "PATCH",
+        "DELETE",
+    ],
+)
+@pytest.mark.parametrize(
+    "uri,metric_name",
+    [
+        ("/coro?hello=world", "_target_application:index"),
+        ("/class?hello=world", "_target_application:HelloWorldView._respond"),
+        ("/known_error?hello=world", "_target_application:KnownErrorView._respond"),
+    ],
+)
+def test_simultaneous_requests(method, uri, metric_name, nr_enabled, aiohttp_app):
+    async def fetch():
+        resp = await aiohttp_app.client.request(method, uri, headers={"content-type": "text/plain"})
         assert resp.status == 200
-        text = yield from resp.text()
+        text = await resp.text()
         assert "Hello Aiohttp!" in text
         return resp

-    @asyncio.coroutine
-    def multi_fetch(loop):
+    async def multi_fetch(loop):
         coros = [fetch() for i in range(2)]

         try:
@@ -158,7 +164,7 @@ def multi_fetch(loop):
         except TypeError:
             combined = asyncio.gather(*coros, loop=loop)

-        responses = yield from combined
+        responses = await combined
         return responses

     required_attrs = list(BASE_REQUIRED_ATTRS)
@@ -166,8 +172,7 @@ def multi_fetch(loop):

     required_attrs.extend(extra_required)

-    required_attrs.extend(['response.status',
-        'response.headers.contentType'])
+    required_attrs.extend(["response.status", "response.headers.contentType"])

     if nr_enabled:
         transactions = []
@@ -175,26 +180,27 @@ def multi_fetch(loop):
         func_name = metric_name.replace(":", ".").split(".")
         namespace, func_name = ".".join(func_name[:-1]), func_name[-1]

-        @override_application_settings({'attributes.include': ['request.*']})
-        @validate_transaction_metrics(metric_name,
+        @override_application_settings({"attributes.include": ["request.*"]})
+        @validate_transaction_metrics(
+            metric_name,
             scoped_metrics=[
-                ('Function/%s' % metric_name, 1),
+                ("Function/%s" % metric_name, 1),
             ],
             rollup_metrics=[
-                ('Function/%s' % metric_name, 1),
-                ('Python/Framework/aiohttp/%s' % aiohttp.__version__, 1),
+                ("Function/%s" % metric_name, 1),
+                ("Python/Framework/aiohttp/%s" % aiohttp.__version__, 1),
             ],
         )
         @validate_transaction_event_attributes(
             required_params={
-                'agent': required_attrs,
-                'user': [],
-                'intrinsic': [],
+                "agent": required_attrs,
+                "user": [],
+                "intrinsic": [],
             },
             forgone_params={
-                'agent': [],
-                'user': [],
-                'intrinsic': [],
+                "agent": [],
+                "user": [],
+                "intrinsic": [],
             },
         )
         @validate_code_level_metrics(namespace, func_name)
@@ -202,21 +208,21 @@ def multi_fetch(loop):
         def _test():
             aiohttp_app.loop.run_until_complete(multi_fetch(aiohttp_app.loop))
             assert len(transactions) == 2
+
     else:
         settings = global_settings()

-        @override_generic_settings(settings, {'enabled': False})
+        @override_generic_settings(settings, {"enabled": False})
         def _test():
             aiohttp_app.loop.run_until_complete(multi_fetch(aiohttp_app.loop))

     _test()


-@pytest.mark.parametrize('nr_enabled', [True, False])
+@pytest.mark.parametrize("nr_enabled", [True, False])
 def test_system_response_creates_no_transaction(nr_enabled, aiohttp_app):
-    @asyncio.coroutine
-    def fetch():
-        resp = yield from aiohttp_app.client.request('GET', '/404')
+    async def fetch():
+        resp = await aiohttp_app.client.request("GET", "/404")
         assert resp.status == 404
         return resp

@@ -227,10 +233,11 @@ def fetch():
         def _test():
             aiohttp_app.loop.run_until_complete(fetch())
             assert len(transactions) == 0
+
     else:
         settings = global_settings()

-        @override_generic_settings(settings, {'enabled': False})
+        @override_generic_settings(settings, {"enabled": False})
         def _test():
             aiohttp_app.loop.run_until_complete(fetch())

@@ -238,18 +245,17 @@ def _test():


 def test_aborted_connection_creates_transaction(aiohttp_app):
-    @asyncio.coroutine
-    def fetch():
+    async def fetch():
         try:
-            yield from aiohttp_app.client.request('GET', '/hang', timeout=0.1)
+            await aiohttp_app.client.request("GET", "/hang", timeout=0.1)
         except asyncio.TimeoutError:
             try:
                 # Force the client to disconnect (while the server is hanging)
-                yield from aiohttp_app.client.close()
+                await aiohttp_app.client.close()
             # In aiohttp 1.X, this can result in a CancelledError being raised
             except asyncio.CancelledError:
                 pass
-            yield
+            await asyncio.sleep(0)
             return

         assert False, "Request did not time out"
@@ -265,13 +271,11 @@ def _test():


 def test_work_after_request_not_recorded(aiohttp_app):
-    resp = aiohttp_app.loop.run_until_complete(
-        aiohttp_app.client.request('GET', '/background'))
+    resp = aiohttp_app.loop.run_until_complete(aiohttp_app.client.request("GET", "/background"))
     assert resp.status == 200

-    @asyncio.coroutine
-    def timeout():
-        yield from asyncio.sleep(1)
+    async def timeout():
+        await asyncio.sleep(1)
         aiohttp_app.loop.stop()
         assert False
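multi_fetch above first calls asyncio.gather without a loop argument and falls back to passing loop= on TypeError, since that keyword only exists on older Pythons (it was removed in 3.10). The same shim in isolation, under the assumption that the fallback is only reachable in those older environments:

    import asyncio


    async def work(n):
        await asyncio.sleep(0)
        return n


    async def gather_compat(coros, loop=None):
        # Try the modern call first; only pass loop= when the environment
        # demands it, as the aiohttp test above does.
        try:
            combined = asyncio.gather(*coros)
        except TypeError:
            combined = asyncio.gather(*coros, loop=loop)
        return await combined


    print(asyncio.run(gather_compat([work(1), work(2)])))  # prints [1, 2]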
diff --git a/tests/framework_aiohttp/test_server_cat.py b/tests/framework_aiohttp/test_server_cat.py
index a09fa6b79..28af90d8d 100644
--- a/tests/framework_aiohttp/test_server_cat.py
+++ b/tests/framework_aiohttp/test_server_cat.py
@@ -12,47 +12,54 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-import asyncio
 import json
+
 import pytest
+from testing_support.fixtures import (
+    make_cross_agent_headers,
+    override_application_settings,
+    validate_analytics_catmap_data,
+)
+from testing_support.validators.validate_transaction_event_attributes import (
+    validate_transaction_event_attributes,
+)

-from newrelic.common.object_wrapper import transient_function_wrapper
 from newrelic.common.encoding_utils import deobfuscate
-from testing_support.fixtures import (override_application_settings,
-    make_cross_agent_headers, validate_analytics_catmap_data,
-    validate_transaction_event_attributes)
+from newrelic.common.object_wrapper import transient_function_wrapper

-ENCODING_KEY = '1234567890123456789012345678901234567890'
+ENCODING_KEY = "1234567890123456789012345678901234567890"
 test_uris = [
-    ('/error?hello=world', '_target_application:error'),
-    ('/coro?hello=world', '_target_application:index'),
-    ('/class?hello=world', '_target_application:HelloWorldView._respond'),
+    ("/error?hello=world", "_target_application:error"),
+    ("/coro?hello=world", "_target_application:index"),
+    ("/class?hello=world", "_target_application:HelloWorldView._respond"),
 ]


 def record_aiohttp1_raw_headers(raw_headers):
     try:
-        import aiohttp.protocol
+        import aiohttp.protocol  # noqa: F401, pylint: disable=W0611
     except ImportError:
+
         def pass_through(function):
             return function
+
         return pass_through

-    @transient_function_wrapper('aiohttp.protocol', 'HttpParser.parse_headers')
+    @transient_function_wrapper("aiohttp.protocol", "HttpParser.parse_headers")
     def recorder(wrapped, instance, args, kwargs):
         def _bind_params(lines):
             return lines

         lines = _bind_params(*args, **kwargs)
         for line in lines:
-            line = line.decode('utf-8')
+            line = line.decode("utf-8")

             # This is the request, not the response
-            if line.startswith('GET'):
+            if line.startswith("GET"):
                 break

-            if ':' in line:
-                key, value = line.split(':', maxsplit=1)
+            if ":" in line:
+                key, value = line.split(":", maxsplit=1)
                 raw_headers[key.strip()] = value.strip()

         return wrapped(*args, **kwargs)
@@ -61,59 +68,61 @@ def _bind_params(lines):


 @pytest.mark.parametrize(
-    'inbound_payload,expected_intrinsics,forgone_intrinsics,cat_id', [
-
-    # Valid payload from trusted account
-    (["b854df4feb2b1f06", False, "7e249074f277923d", "5d2957be"],
-        {"nr.referringTransactionGuid": "b854df4feb2b1f06",
-            "nr.tripId": "7e249074f277923d",
-            "nr.referringPathHash": "5d2957be"},
-        [],
-        '1#1'),
-
-    # Valid payload from an untrusted account
-    (["b854df4feb2b1f06", False, "7e249074f277923d", "5d2957be"],
-        {},
-        ['nr.referringTransactionGuid', 'nr.tripId', 'nr.referringPathHash'],
-        '80#1'),
-])
-@pytest.mark.parametrize('method', ['GET'])
-@pytest.mark.parametrize('uri,metric_name', test_uris)
-def test_cat_headers(method, uri, metric_name, inbound_payload,
-        expected_intrinsics, forgone_intrinsics, cat_id, aiohttp_app):
+    "inbound_payload,expected_intrinsics,forgone_intrinsics,cat_id",
+    [
+        # Valid payload from trusted account
+        (
+            ["b854df4feb2b1f06", False, "7e249074f277923d", "5d2957be"],
+            {
+                "nr.referringTransactionGuid": "b854df4feb2b1f06",
+                "nr.tripId": "7e249074f277923d",
+                "nr.referringPathHash": "5d2957be",
+            },
+            [],
+            "1#1",
+        ),
+        # Valid payload from an untrusted account
+        (
+            ["b854df4feb2b1f06", False, "7e249074f277923d", "5d2957be"],
+            {},
+            ["nr.referringTransactionGuid", "nr.tripId", "nr.referringPathHash"],
+            "80#1",
+        ),
+    ],
+)
+@pytest.mark.parametrize("method", ["GET"])
+@pytest.mark.parametrize("uri,metric_name", test_uris)
+def test_cat_headers(
+    method, uri, metric_name, inbound_payload, expected_intrinsics, forgone_intrinsics, cat_id, aiohttp_app
+):
     _raw_headers = {}

-    @asyncio.coroutine
-    def fetch():
-        headers = make_cross_agent_headers(inbound_payload, ENCODING_KEY,
-            cat_id)
-        resp = yield from aiohttp_app.client.request(method, uri,
-            headers=headers)
+    async def fetch():
+        headers = make_cross_agent_headers(inbound_payload, ENCODING_KEY, cat_id)
+        resp = await aiohttp_app.client.request(method, uri, headers=headers)

         if _raw_headers:
             raw_headers = _raw_headers
         else:
-            raw_headers = {k.decode('utf-8'): v.decode('utf-8')
-                for k, v in resp.raw_headers}
+            raw_headers = {k.decode("utf-8"): v.decode("utf-8") for k, v in resp.raw_headers}

         if expected_intrinsics:
             # test valid CAT response header
-            assert 'X-NewRelic-App-Data' in raw_headers
+            assert "X-NewRelic-App-Data" in raw_headers

-            app_data = json.loads(deobfuscate(
-                raw_headers['X-NewRelic-App-Data'], ENCODING_KEY))
+            app_data = json.loads(deobfuscate(raw_headers["X-NewRelic-App-Data"], ENCODING_KEY))
             assert app_data[0] == cat_id
-            assert app_data[1] == ('WebTransaction/Function/%s' % metric_name)
+            assert app_data[1] == ("WebTransaction/Function/%s" % metric_name)
         else:
-            assert 'X-NewRelic-App-Data' not in resp.headers
+            assert "X-NewRelic-App-Data" not in resp.headers

     _custom_settings = {
-        'cross_process_id': '1#1',
-        'encoding_key': ENCODING_KEY,
-        'trusted_account_ids': [1],
-        'cross_application_tracer.enabled': True,
-        'distributed_tracing.enabled': False,
+        "cross_process_id": "1#1",
+        "encoding_key": ENCODING_KEY,
+        "trusted_account_ids": [1],
+        "cross_application_tracer.enabled": True,
+        "distributed_tracing.enabled": False,
     }

     # NOTE: the logic-flow of this test can be a bit confusing.
@@ -125,9 +134,11 @@ def fetch():
     # is received and subsequently processed. that code is
     # a fixture from conftest.py/_target_application.py

-    @validate_analytics_catmap_data('WebTransaction/Function/%s' % metric_name,
-        expected_attributes=expected_intrinsics,
-        non_expected_attributes=forgone_intrinsics)
+    @validate_analytics_catmap_data(
+        "WebTransaction/Function/%s" % metric_name,
+        expected_attributes=expected_intrinsics,
+        non_expected_attributes=forgone_intrinsics,
+    )
     @override_application_settings(_custom_settings)
     @record_aiohttp1_raw_headers(_raw_headers)
     def _test():
@@ -136,8 +147,8 @@ def _test():
     _test()


-account_id = '33'
-primary_application_id = '2827902'
+account_id = "33"
+primary_application_id = "2827902"

 inbound_payload = {
     "v": [0, 1],
@@ -150,14 +161,14 @@ def _test():
         "sa": True,
         "ti": 1518469636035,
         "tr": "d6b4ba0c3a712ca",
-        "ty": "App"
-    }
+        "ty": "App",
+    },
 }

 expected_attributes = {
-    'agent': [],
-    'user': [],
-    'intrinsic': {
+    "agent": [],
+    "user": [],
+    "intrinsic": {
         "traceId": "d6b4ba0c3a712ca",
         "priority": 1.234567,
         "sampled": True,
@@ -166,32 +177,28 @@ def _test():
         "parent.account": account_id,
         "parent.transportType": "HTTP",
         "parentId": "e8b91a159289ff74",
-        "parentSpanId": "7d3efb1b173fecfa"
-    }
+        "parentSpanId": "7d3efb1b173fecfa",
+    },
 }

 unexpected_attributes = {
-    'agent': [],
-    'user': [],
-    'intrinsic': [
-        "grandparentId", "cross_process_id", "nr.tripId", "nr.pathHash"
-    ]
+    "agent": [],
+    "user": [],
+    "intrinsic": ["grandparentId", "cross_process_id", "nr.tripId", "nr.pathHash"],
 }


-@pytest.mark.parametrize('uri,metric_name', test_uris)
+@pytest.mark.parametrize("uri,metric_name", test_uris)
 def test_distributed_tracing_headers(uri, metric_name, aiohttp_app):
-    @asyncio.coroutine
-    def fetch():
-        headers = {'newrelic': json.dumps(inbound_payload)}
-        resp = yield from aiohttp_app.client.request('GET', uri,
-            headers=headers)
+    async def fetch():
+        headers = {"newrelic": json.dumps(inbound_payload)}
+        resp = await aiohttp_app.client.request("GET", uri, headers=headers)

         # better cat does not send a response in the headers
-        assert 'newrelic' not in resp.headers
+        assert "newrelic" not in resp.headers

         # old-cat headers should not be in the response
-        assert 'X-NewRelic-App-Data' not in resp.headers
+        assert "X-NewRelic-App-Data" not in resp.headers

     # NOTE: the logic-flow of this test can be a bit confusing.
     # the override settings and attribute validation occur
@@ -202,14 +209,15 @@ def fetch():
     # is received and subsequently processed. that code is
     # a fixture from conftest.py/_target_application.py

-    @validate_transaction_event_attributes(
-        expected_attributes, unexpected_attributes)
-    @override_application_settings({
-        'account_id': '33',
-        'trusted_account_key': '33',
-        'primary_application_id': primary_application_id,
-        'distributed_tracing.enabled': True
-    })
+    @validate_transaction_event_attributes(expected_attributes, unexpected_attributes)
+    @override_application_settings(
+        {
+            "account_id": "33",
+            "trusted_account_key": "33",
+            "primary_application_id": primary_application_id,
+            "distributed_tracing.enabled": True,
+        }
+    )
     def _test():
         aiohttp_app.loop.run_until_complete(fetch())
diff --git a/tests/framework_aiohttp/test_ws.py b/tests/framework_aiohttp/test_ws.py
index 549902d6e..da908014d 100644
--- a/tests/framework_aiohttp/test_ws.py
+++ b/tests/framework_aiohttp/test_ws.py
@@ -12,28 +12,25 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-import asyncio
 import aiohttp
 from testing_support.fixtures import function_not_called

-version_info = tuple(int(_) for _ in aiohttp.__version__.split('.')[:2])
+version_info = tuple(int(_) for _ in aiohttp.__version__.split(".")[:2])


-@function_not_called('newrelic.core.stats_engine',
-    'StatsEngine.record_transaction')
+@function_not_called("newrelic.core.stats_engine", "StatsEngine.record_transaction")
 def test_websocket(aiohttp_app):
-    @asyncio.coroutine
-    def ws_write():
-        ws = yield from aiohttp_app.client.ws_connect('/ws')
+    async def ws_write():
+        ws = await aiohttp_app.client.ws_connect("/ws")
         try:
             for _ in range(2):
-                result = ws.send_str('Hello')
-                if hasattr(result, '__await__'):
-                    yield from result.__await__()
-                msg = yield from ws.receive()
-                assert msg.data == '/Hello'
+                result = ws.send_str("Hello")
+                if hasattr(result, "__await__"):
+                    await result
+                msg = await ws.receive()
+                assert msg.data == "/Hello"
         finally:
-            yield from ws.close(code=1000)
+            await ws.close(code=1000)
             assert ws.close_code == 1000

     aiohttp_app.loop.run_until_complete(ws_write())
diff --git a/tests/framework_ariadne/_target_application.py b/tests/framework_ariadne/_target_application.py
index b0e19ac0d..a59e7432e 100644
--- a/tests/framework_ariadne/_target_application.py
+++ b/tests/framework_ariadne/_target_application.py
@@ -15,19 +15,27 @@
 import asyncio
 import json

-import pytest
-
-from ._target_schema_sync import target_schema as target_schema_sync, target_asgi_application as target_asgi_application_sync, target_wsgi_application as target_wsgi_application_sync
-from ._target_schema_async import target_schema as target_schema_async, target_asgi_application as target_asgi_application_async
+from framework_ariadne._target_schema_async import (
+    target_asgi_application as target_asgi_application_async,
+)
+from framework_ariadne._target_schema_async import target_schema as target_schema_async
+from framework_ariadne._target_schema_sync import (
+    target_asgi_application as target_asgi_application_sync,
+)
+from framework_ariadne._target_schema_sync import target_schema as target_schema_sync
+from framework_ariadne._target_schema_sync import (
+    target_wsgi_application as target_wsgi_application_sync,
+)
+from framework_ariadne.test_application import ariadne_version_tuple
 from graphql import MiddlewareManager


 def check_response(query, success, response):
     if isinstance(query, str) and "error" not in query:
-        assert success and not "errors" in response, response["errors"]
-        assert response["data"]
+        assert success and "errors" not in response, response
+        assert response.get("data", None), response
     else:
         assert "errors" in response, response

@@ -36,15 +44,15 @@ def run_sync(schema):
     def _run_sync(query, middleware=None):
         from ariadne import graphql_sync

-        if middleware:
-            middleware = MiddlewareManager(*middleware)
-        else:
-            middleware = None
+        if ariadne_version_tuple < (0, 18):
+            if middleware:
+                middleware = MiddlewareManager(*middleware)

         success, response = graphql_sync(schema, {"query": query}, middleware=middleware)
         check_response(query, success, response)

         return response.get("data", {})
+
     return _run_sync


@@ -52,16 +60,17 @@ def run_async(schema):
     def _run_async(query, middleware=None):
         from ariadne import graphql

-        if middleware:
-            middleware = MiddlewareManager(*middleware)
-        else:
-            middleware = None
+        # Later versions of Ariadne accept a list of middleware directly,
+        # while older versions require a MiddlewareManager.
+        if ariadne_version_tuple < (0, 18):
+            if middleware:
+                middleware = MiddlewareManager(*middleware)

         loop = asyncio.get_event_loop()
         success, response = loop.run_until_complete(graphql(schema, {"query": query}, middleware=middleware))
         check_response(query, success, response)

         return response.get("data", {})
+
     return _run_async

@@ -91,7 +100,12 @@ def _run_asgi(query, middleware=None):

 def run_asgi(app):
     def _run_asgi(query, middleware=None):
-        app.asgi_application.middleware = middleware
+        if ariadne_version_tuple < (0, 16):
+            app.asgi_application.middleware = middleware
+        # In Ariadne v0.16.0, the middleware attribute was removed from the
+        # GraphQL class in favor of the http_handler.
+        elif ariadne_version_tuple >= (0, 16):
+            app.asgi_application.http_handler.middleware = middleware

         response = app.make_request(
             "POST", "/", body=json.dumps({"query": query}), headers={"Content-Type": "application/json"}
@@ -108,6 +122,7 @@ def _run_asgi(query, middleware=None):
             assert "errors" not in body or not body["errors"]

         return body.get("data", {})
+
     return _run_asgi
diff --git a/tests/framework_ariadne/_target_schema_sync.py b/tests/framework_ariadne/_target_schema_sync.py
index e42ee0bc1..2725f0866 100644
--- a/tests/framework_ariadne/_target_schema_sync.py
+++ b/tests/framework_ariadne/_target_schema_sync.py
@@ -22,11 +22,17 @@
     load_schema_from_path,
     make_executable_schema,
 )
-from ariadne.asgi import GraphQL as GraphQLASGI
 from ariadne.wsgi import GraphQL as GraphQLWSGI
 from framework_graphql._target_schema_sync import books, magazines, libraries
 from testing_support.asgi_testing import AsgiTest

+from framework_ariadne.test_application import ariadne_version_tuple
+
+if ariadne_version_tuple < (0, 16):
+    from ariadne.asgi import GraphQL as GraphQLASGI
+elif ariadne_version_tuple >= (0, 16):
+    from ariadne.asgi.graphql import GraphQL as GraphQLASGI
+

 schema_file = os.path.join(os.path.dirname(os.path.realpath(__file__)), "schema.graphql")
 type_defs = load_schema_from_path(schema_file)
@@ -36,6 +42,7 @@

 mutation = MutationType()

+
 @mutation.field("storage_add")
 def resolve_storage_add(self, info, string):
     storage.append(string)
@@ -94,4 +101,4 @@ def resolve_error(self, info):

 target_schema = make_executable_schema(type_defs, query, mutation, item)
 target_asgi_application = AsgiTest(GraphQLASGI(target_schema))
-target_wsgi_application = webtest.TestApp(GraphQLWSGI(target_schema))
+target_wsgi_application = webtest.TestApp(GraphQLWSGI(target_schema))
\ No newline at end of file
diff --git a/tests/framework_ariadne/conftest.py b/tests/framework_ariadne/conftest.py
index 31a19f5a6..210399bb9 100644
--- a/tests/framework_ariadne/conftest.py
+++ b/tests/framework_ariadne/conftest.py
@@ -14,17 +14,8 @@

 import pytest
 import six
-from testing_support.fixtures import (  # noqa: F401; pylint: disable=W0611
-    code_coverage_fixture,
-    collector_agent_registration_fixture,
-    collector_available_fixture,
-)
-
-_coverage_source = [
-    "newrelic.hooks.framework_graphql",
-]
+from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture  # noqa: F401; pylint: disable=W0611

-code_coverage = code_coverage_fixture(source=_coverage_source)

 _default_settings = {
     "transaction_tracer.explain_threshold": 0.0,
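The Ariadne files now branch on a version tuple instead of try/except imports. A condensed sketch of the pattern as the patch uses it, assuming the installed ariadne reports a plain dotted version string (pre-release suffixes would need extra parsing):

    from newrelic.common.package_version_utils import get_package_version

    ARIADNE_VERSION = get_package_version("ariadne")
    ariadne_version_tuple = tuple(map(int, ARIADNE_VERSION.split(".")))

    # Tuple comparisons make range checks readable, e.g. (0, 16, 1) < (0, 18).
    if ariadne_version_tuple < (0, 16):
        from ariadne.asgi import GraphQL as GraphQLASGI  # pre-0.16 import path
    else:
        from ariadne.asgi.graphql import GraphQL as GraphQLASGI  # 0.16+ import path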
diff --git a/tests/framework_ariadne/test_application.py b/tests/framework_ariadne/test_application.py
index fd294ac92..734a7e92c 100644
--- a/tests/framework_ariadne/test_application.py
+++ b/tests/framework_ariadne/test_application.py
@@ -11,26 +11,27 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-
+import pytest
 from framework_graphql.test_application import *

+from newrelic.common.package_version_utils import get_package_version
+
+ARIADNE_VERSION = get_package_version("ariadne")
+ariadne_version_tuple = tuple(map(int, ARIADNE_VERSION.split(".")))
+

-@pytest.fixture(scope="session", params=["sync-sync", "async-sync", "async-async", "wsgi-sync", "asgi-sync", "asgi-async"])
+@pytest.fixture(
+    scope="session", params=["sync-sync", "async-sync", "async-async", "wsgi-sync", "asgi-sync", "asgi-async"]
+)
 def target_application(request):
     from ._target_application import target_application

-    target_application = target_application[request.param]
-    try:
-        import ariadne
-        version = ariadne.__version__
-    except Exception:
-        import pkg_resources
-        version = pkg_resources.get_distribution("ariadne").version
+    target_application = target_application[request.param]

     param = request.param.split("-")
     is_background = param[0] not in {"wsgi", "asgi"}
     schema_type = param[1]
     extra_spans = 4 if param[0] == "wsgi" else 0

-    assert version is not None
-    return "Ariadne", version, target_application, is_background, schema_type, extra_spans
+    assert ARIADNE_VERSION is not None
+    return "Ariadne", ARIADNE_VERSION, target_application, is_background, schema_type, extra_spans
\ No newline at end of file
diff --git a/tests/framework_bottle/conftest.py b/tests/framework_bottle/conftest.py
index 04ed187f3..095a3331f 100644
--- a/tests/framework_bottle/conftest.py
+++ b/tests/framework_bottle/conftest.py
@@ -14,14 +14,8 @@

 import pytest

-from testing_support.fixtures import (code_coverage_fixture,
-    collector_agent_registration_fixture, collector_available_fixture)
+from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture  # noqa: F401; pylint: disable=W0611

-_coverage_source = [
-    'newrelic.hooks.framework_bottle',
-]
-
-code_coverage = code_coverage_fixture(source=_coverage_source)

 _default_settings = {
     'transaction_tracer.explain_threshold': 0.0,
diff --git a/tests/framework_bottle/test_application.py b/tests/framework_bottle/test_application.py
index f9fb0915e..28619d5eb 100644
--- a/tests/framework_bottle/test_application.py
+++ b/tests/framework_bottle/test_application.py
@@ -15,12 +15,13 @@
 import pytest
 import base64

-from testing_support.fixtures import (validate_transaction_metrics,
-    validate_transaction_errors, override_ignore_status_codes,
+from testing_support.fixtures import (
+    override_ignore_status_codes,
     override_application_settings)
-
+from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics
 from newrelic.packages import six
 from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors

 import webtest
diff --git a/tests/framework_cherrypy/conftest.py b/tests/framework_cherrypy/conftest.py
index 0de6238ea..bc730bb1f 100644
--- a/tests/framework_cherrypy/conftest.py
+++ b/tests/framework_cherrypy/conftest.py
@@ -14,14 +14,8 @@

 import pytest

-from testing_support.fixtures import (code_coverage_fixture,
-    collector_agent_registration_fixture, collector_available_fixture)
+from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture  # noqa: F401; pylint: disable=W0611

-_coverage_source = [
-    'newrelic.hooks.framework_cherrypy',
-]
-
-code_coverage = code_coverage_fixture(source=_coverage_source)

 _default_settings = {
     'transaction_tracer.explain_threshold': 0.0,
diff --git a/tests/framework_cherrypy/test_application.py b/tests/framework_cherrypy/test_application.py
index 1dd7f837d..39f8b5c16 100644
--- a/tests/framework_cherrypy/test_application.py
+++ b/tests/framework_cherrypy/test_application.py
@@ -17,9 +17,11 @@

 from newrelic.packages import six

-from testing_support.fixtures import (validate_transaction_errors,
-    override_application_settings, override_ignore_status_codes)
+from testing_support.fixtures import (
+    override_application_settings,
+    override_ignore_status_codes)
 from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors

 import cherrypy
diff --git a/tests/framework_cherrypy/test_dispatch.py b/tests/framework_cherrypy/test_dispatch.py
index 0a3f8e7c7..64dccb214 100644
--- a/tests/framework_cherrypy/test_dispatch.py
+++ b/tests/framework_cherrypy/test_dispatch.py
@@ -17,7 +17,7 @@

 from newrelic.packages import six

-from testing_support.fixtures import validate_transaction_errors
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors

 import cherrypy
diff --git a/tests/framework_cherrypy/test_resource.py b/tests/framework_cherrypy/test_resource.py
index 09c68f9fc..385d28d91 100644
--- a/tests/framework_cherrypy/test_resource.py
+++ b/tests/framework_cherrypy/test_resource.py
@@ -14,7 +14,7 @@

 import webtest

-from testing_support.fixtures import validate_transaction_errors
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors
 from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics

 import cherrypy
diff --git a/tests/framework_cherrypy/test_routes.py b/tests/framework_cherrypy/test_routes.py
index 7c1b23181..9111a29ce 100644
--- a/tests/framework_cherrypy/test_routes.py
+++ b/tests/framework_cherrypy/test_routes.py
@@ -16,7 +16,7 @@
 import sys
 import webtest

-from testing_support.fixtures import validate_transaction_errors
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors
 from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics

 import cherrypy
diff --git a/tests/framework_django/conftest.py b/tests/framework_django/conftest.py
index 6f807c766..8a43ef5c9 100644
--- a/tests/framework_django/conftest.py
+++ b/tests/framework_django/conftest.py
@@ -14,15 +14,8 @@

 import pytest

-from testing_support.fixtures import (code_coverage_fixture,
-    collector_agent_registration_fixture, collector_available_fixture)
+from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture  # noqa: F401; pylint: disable=W0611

-_coverage_source = [
-    'newrelic.hooks.framework_django',
-    'newrelic.hooks.framework_django_py3',
-]
-
-code_coverage = code_coverage_fixture(source=_coverage_source)

 _default_settings = {
     'transaction_tracer.explain_threshold': 0.0,
diff --git a/tests/framework_django/test_application.py b/tests/framework_django/test_application.py
index 1876b9a69..1f2616b0f 100644
--- a/tests/framework_django/test_application.py
+++ b/tests/framework_django/test_application.py
@@ -12,11 +12,13 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-from testing_support.fixtures import (validate_transaction_metrics,
-    validate_transaction_errors, override_application_settings,
+from testing_support.fixtures import (
+    override_application_settings,
     override_generic_settings, override_ignore_status_codes)
 from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics
 from newrelic.hooks.framework_django import django_settings
+from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors

 import os
diff --git a/tests/framework_django/test_asgi_application.py b/tests/framework_django/test_asgi_application.py
index 98f157f85..457042766 100644
--- a/tests/framework_django/test_asgi_application.py
+++ b/tests/framework_django/test_asgi_application.py
@@ -18,10 +18,12 @@
 from newrelic.core.config import global_settings
 from newrelic.common.encoding_utils import gzip_decompress

-from testing_support.fixtures import (validate_transaction_metrics,
-    validate_transaction_errors, override_application_settings,
+from testing_support.fixtures import (
+    override_application_settings,
     override_generic_settings, override_ignore_status_codes)
 from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics
+from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors

 DJANGO_VERSION = tuple(map(int, django.get_version().split('.')[:2]))
diff --git a/tests/framework_falcon/conftest.py b/tests/framework_falcon/conftest.py
index 3df2656cf..fd43715c6 100644
--- a/tests/framework_falcon/conftest.py
+++ b/tests/framework_falcon/conftest.py
@@ -14,14 +14,8 @@

 import pytest

-from testing_support.fixtures import (code_coverage_fixture,
-    collector_agent_registration_fixture, collector_available_fixture)
+from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture  # noqa: F401; pylint: disable=W0611

-_coverage_source = [
-    'newrelic.hooks.framework_falcon',
-]
-
-code_coverage = code_coverage_fixture(source=_coverage_source)

 _default_settings = {
     'transaction_tracer.explain_threshold': 0.0,
diff --git a/tests/framework_falcon/test_application.py b/tests/framework_falcon/test_application.py
index 6f59ea826..6b64c8c67 100644
--- a/tests/framework_falcon/test_application.py
+++ b/tests/framework_falcon/test_application.py
@@ -14,10 +14,12 @@
 import pytest
 from newrelic.core.config import global_settings

-from testing_support.fixtures import (validate_transaction_metrics,
-    validate_transaction_errors, override_ignore_status_codes,
+from testing_support.fixtures import (
+    override_ignore_status_codes,
     override_generic_settings)
 from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics
+from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors

 SETTINGS = global_settings()
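Most of the remaining hunks are a single refactor applied across frameworks: each validator moves out of the catch-all testing_support.fixtures module into its own module under testing_support.validators, while non-validator fixtures stay put. The before/after shape of a typical import block:

    # Before: everything came from the catch-all fixtures module.
    # from testing_support.fixtures import (
    #     validate_transaction_errors,
    #     validate_transaction_metrics,
    # )

    # After: settings overrides stay in testing_support.fixtures, and each
    # validator is imported from its own module.
    from testing_support.fixtures import override_application_settings
    from testing_support.validators.validate_transaction_errors import validate_transaction_errors
    from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics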
diff --git a/tests/framework_fastapi/conftest.py b/tests/framework_fastapi/conftest.py
index e976b05ed..d65398ffb 100644
--- a/tests/framework_fastapi/conftest.py
+++ b/tests/framework_fastapi/conftest.py
@@ -13,20 +13,11 @@
 # limitations under the License.

 import pytest
-from testing_support.fixtures import (  # noqa: F401; pylint: disable=W0611
-    code_coverage_fixture,
-    collector_agent_registration_fixture,
-    collector_available_fixture,
-)
+from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture  # noqa: F401; pylint: disable=W0611
 from testing_support.fixtures import (  # noqa: F401; pylint: disable=W0611
     newrelic_caplog as caplog,
 )

-_coverage_source = [
-    "newrelic.hooks.framework_fastapi",
-]
-
-code_coverage = code_coverage_fixture(source=_coverage_source)

 _default_settings = {
     "transaction_tracer.explain_threshold": 0.0,
diff --git a/tests/framework_fastapi/test_application.py b/tests/framework_fastapi/test_application.py
index 8fe4be34f..85d230e26 100644
--- a/tests/framework_fastapi/test_application.py
+++ b/tests/framework_fastapi/test_application.py
@@ -15,7 +15,7 @@
 import logging

 import pytest
-from testing_support.fixtures import validate_transaction_metrics
+from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics
 from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics
diff --git a/tests/framework_flask/conftest.py b/tests/framework_flask/conftest.py
index abf124817..f90ed9b51 100644
--- a/tests/framework_flask/conftest.py
+++ b/tests/framework_flask/conftest.py
@@ -17,14 +17,8 @@
 import pytest

 from flask import __version__ as flask_version
-from testing_support.fixtures import (code_coverage_fixture,
-    collector_agent_registration_fixture, collector_available_fixture)
+from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture  # noqa: F401; pylint: disable=W0611

-_coverage_source = [
-    'newrelic.hooks.framework_flask',
-]
-
-code_coverage = code_coverage_fixture(source=_coverage_source)

 _default_settings = {
     'transaction_tracer.explain_threshold': 0.0,
diff --git a/tests/framework_flask/test_application.py b/tests/framework_flask/test_application.py
index 98757a805..de7a43019 100644
--- a/tests/framework_flask/test_application.py
+++ b/tests/framework_flask/test_application.py
@@ -14,10 +14,12 @@

 import pytest

-from testing_support.fixtures import (validate_transaction_metrics,
-    validate_transaction_errors, override_application_settings,
+from testing_support.fixtures import (
+    override_application_settings,
     validate_tt_parenting)
 from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics
+from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors

 from newrelic.packages import six
diff --git a/tests/framework_flask/test_blueprints.py b/tests/framework_flask/test_blueprints.py
index 6a76d405b..4a1e361fb 100644
--- a/tests/framework_flask/test_blueprints.py
+++ b/tests/framework_flask/test_blueprints.py
@@ -14,9 +14,10 @@

 import pytest

-from testing_support.fixtures import (validate_transaction_metrics,
-    validate_transaction_errors, override_application_settings)
+from testing_support.fixtures import override_application_settings
 from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics
+from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors

 from newrelic.packages import six
diff --git a/tests/framework_flask/test_compress.py b/tests/framework_flask/test_compress.py
index ac7f323fd..f6feb01ee 100644
--- a/tests/framework_flask/test_compress.py
+++ b/tests/framework_flask/test_compress.py
@@ -12,8 +12,9 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-from testing_support.fixtures import (validate_transaction_metrics,
-    validate_transaction_errors, override_application_settings)
+from testing_support.fixtures import override_application_settings
+from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors
 from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics
diff --git a/tests/framework_flask/test_middleware.py b/tests/framework_flask/test_middleware.py
index d92d4d851..3c81ebc47 100644
--- a/tests/framework_flask/test_middleware.py
+++ b/tests/framework_flask/test_middleware.py
@@ -14,9 +14,10 @@

 import pytest

-from testing_support.fixtures import (validate_transaction_metrics,
-    validate_transaction_errors, override_application_settings)
+from testing_support.fixtures import override_application_settings
 from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics
+from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors

 def target_application():
     # We need to delay Flask application creation because of ordering
diff --git a/tests/framework_flask/test_not_found.py b/tests/framework_flask/test_not_found.py
index 22ad5efcd..c1c55475e 100644
--- a/tests/framework_flask/test_not_found.py
+++ b/tests/framework_flask/test_not_found.py
@@ -14,8 +14,8 @@

 import pytest

-from testing_support.fixtures import (validate_transaction_metrics,
-    validate_transaction_errors)
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors
+from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics

 def target_application():
     # We need to delay Flask application creation because of ordering
diff --git a/tests/framework_flask/test_user_exceptions.py b/tests/framework_flask/test_user_exceptions.py
index 5c8f3a658..844b4b9ef 100644
--- a/tests/framework_flask/test_user_exceptions.py
+++ b/tests/framework_flask/test_user_exceptions.py
@@ -14,8 +14,8 @@

 import pytest

-from testing_support.fixtures import (validate_transaction_metrics,
-    validate_transaction_errors)
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors
+from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics

 def target_application():
     # We need to delay Flask application creation because of ordering
diff --git a/tests/framework_flask/test_views.py b/tests/framework_flask/test_views.py
index d4dd8178f..0338169c7 100644
--- a/tests/framework_flask/test_views.py
+++ b/tests/framework_flask/test_views.py
@@ -16,8 +16,13 @@
     async_handler_support,
     skip_if_not_async_handler_support,
 )
-from testing_support.fixtures import (
+from testing_support.validators.validate_code_level_metrics import (
+    validate_code_level_metrics,
+)
+from testing_support.validators.validate_transaction_errors import (
     validate_transaction_errors,
+)
+from testing_support.validators.validate_transaction_metrics import (
     validate_transaction_metrics,
 )
@@ -50,6 +55,7 @@ def target_application():
     return _test_application


+@validate_code_level_metrics("_test_views.TestView", "dispatch_request")
 @validate_transaction_errors(errors=[])
 @validate_transaction_metrics("_test_views:test_view", scoped_metrics=scoped_metrics)
 def test_class_based_view():
@@ -59,6 +65,7 @@ def test_class_based_view():


 @skip_if_not_async_handler_support
+@validate_code_level_metrics("_test_views_async.TestAsyncView", "dispatch_request")
 @validate_transaction_errors(errors=[])
 @validate_transaction_metrics("_test_views_async:test_async_view", scoped_metrics=scoped_metrics)
 def test_class_based_async_view():
@@ -67,6 +74,7 @@ def test_class_based_async_view():
     response.mustcontain("ASYNC VIEW RESPONSE")


+@validate_code_level_metrics("_test_views.TestMethodView", "get")
 @validate_transaction_errors(errors=[])
 @validate_transaction_metrics("_test_views:test_methodview", scoped_metrics=scoped_metrics)
 def test_get_method_view():
@@ -75,6 +83,7 @@ def test_get_method_view():
     response.mustcontain("METHODVIEW GET RESPONSE")


+@validate_code_level_metrics("_test_views.TestMethodView", "post")
 @validate_transaction_errors(errors=[])
 @validate_transaction_metrics("_test_views:test_methodview", scoped_metrics=scoped_metrics)
 def test_post_method_view():
@@ -84,6 +93,7 @@ def test_post_method_view():


 @skip_if_not_async_handler_support
+@validate_code_level_metrics("_test_views_async.TestAsyncMethodView", "get")
 @validate_transaction_errors(errors=[])
 @validate_transaction_metrics("_test_views_async:test_async_methodview", scoped_metrics=scoped_metrics)
 def test_get_method_async_view():
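The new validate_code_level_metrics decorators pin down which handler actually ran, expressed as a (namespace, function) pair. The aiohttp middleware test derives that pair from a scoped metric name; the same parsing in isolation, with a Flask-style metric as the example input:

    # Mirrors the parsing in test_middleware.py above: take the part after
    # "Function/", treat ":" like ".", and peel the function name off the end.
    metric = "Function/_test_views:TestMethodView.get"  # illustrative input

    parts = metric.split("/")[1].replace(":", ".").split(".")
    namespace, func_name = ".".join(parts[:-1]), parts[-1]

    assert namespace == "_test_views.TestMethodView"
    assert func_name == "get"
    # validate_code_level_metrics(namespace, func_name) then checks this pair
    # against the code-level attributes the agent recorded for the handler.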
diff --git a/tests/framework_graphene/conftest.py b/tests/framework_graphene/conftest.py
index 73eccda4a..a097f9750 100644
--- a/tests/framework_graphene/conftest.py
+++ b/tests/framework_graphene/conftest.py
@@ -14,17 +14,8 @@

 import pytest
 import six
-from testing_support.fixtures import (  # noqa: F401; pylint: disable=W0611
-    code_coverage_fixture,
-    collector_agent_registration_fixture,
-    collector_available_fixture,
-)
-
-_coverage_source = [
-    "newrelic.hooks.framework_graphql",
-]
+from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture  # noqa: F401; pylint: disable=W0611

-code_coverage = code_coverage_fixture(source=_coverage_source)

 _default_settings = {
     "transaction_tracer.explain_threshold": 0.0,
diff --git a/tests/framework_graphql/conftest.py b/tests/framework_graphql/conftest.py
index e46f70f91..48cac2226 100644
--- a/tests/framework_graphql/conftest.py
+++ b/tests/framework_graphql/conftest.py
@@ -13,19 +13,10 @@
 # limitations under the License.

 import pytest
-from testing_support.fixtures import (
-    code_coverage_fixture,
-    collector_agent_registration_fixture,
-    collector_available_fixture,
-)
-
-from newrelic.packages import six
+import six

-_coverage_source = [
-    "newrelic.hooks.framework_graphql",
-]
+from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture  # noqa: F401; pylint: disable=W0611

-code_coverage = code_coverage_fixture(source=_coverage_source)

 _default_settings = {
     "transaction_tracer.explain_threshold": 0.0,
diff --git a/tests/framework_graphql/test_application.py b/tests/framework_graphql/test_application.py
index 8ac499273..2f636cf38 100644
--- a/tests/framework_graphql/test_application.py
+++ b/tests/framework_graphql/test_application.py
@@ -13,10 +13,9 @@
 # limitations under the License.

 import pytest
-from testing_support.fixtures import (
-    dt_enabled,
-    validate_transaction_errors,
-    validate_transaction_metrics,
+from testing_support.fixtures import dt_enabled, override_application_settings
+from testing_support.validators.validate_code_level_metrics import (
+    validate_code_level_metrics,
 )
 from testing_support.validators.validate_code_level_metrics import (
     validate_code_level_metrics,
@@ -25,6 +24,12 @@
 from testing_support.validators.validate_transaction_count import (
     validate_transaction_count,
 )
+from testing_support.validators.validate_transaction_errors import (
+    validate_transaction_errors,
+)
+from testing_support.validators.validate_transaction_metrics import (
+    validate_transaction_metrics,
+)

 from newrelic.api.background_task import background_task
 from newrelic.common.object_names import callable_name
@@ -90,6 +95,13 @@ def error_middleware(next, root, info, **args):
     error_middleware.append(error_middleware_async)


+def test_no_harm_no_transaction(target_application):
+    framework, version, target_application, is_bg, schema_type, extra_spans = target_application
+
+    response = target_application("{ __schema { types { name } } }")
+    assert not response.get("errors", None)
+
+
 _runtime_error_name = callable_name(RuntimeError)
 _test_runtime_error = [(_runtime_error_name, "Runtime Error!")]
@@ -136,6 +148,7 @@ def test_basic(target_application):
     def _test():
         response = target_application("{ hello }")
         assert response["hello"] == "Hello!"
+        assert not response.get("errors", None)

     _test()
@@ -437,6 +450,7 @@ def test_operation_metrics_and_attrs(target_application):
     @conditional_decorator(background_task(), is_bg)
     def _test():
         response = target_application("query MyQuery { library(index: 0) { branch, book { id, name } } }")
+        assert not response.get("errors", None)

     _test()
@@ -568,12 +582,18 @@ def _test():
     _test()


-def test_ignored_introspection_transactions(target_application):
+@pytest.mark.parametrize("capture_introspection_setting", (True, False))
+def test_ignored_introspection_transactions(target_application, capture_introspection_setting):
     framework, version, target_application, is_bg, schema_type, extra_spans = target_application
+    txn_ct = 1 if capture_introspection_setting else 0

-    @validate_transaction_count(0)
+    @override_application_settings(
+        {"instrumentation.graphql.capture_introspection_queries": capture_introspection_setting}
+    )
+    @validate_transaction_count(txn_ct)
     @background_task()
     def _test():
         response = target_application("{ __schema { types { name } } }")
+        assert not response.get("errors", None)

     _test()
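The reworked introspection test exercises the instrumentation.graphql.capture_introspection_queries setting: introspection-only operations create no transaction unless it is enabled. A reduced sketch of the disabled case, where run_query stands in for the target_application fixture:

    from testing_support.fixtures import override_application_settings
    from testing_support.validators.validate_transaction_count import validate_transaction_count


    # With the setting off (the default), a __schema-only query should leave
    # the transaction count at zero; flipping it to True should yield one.
    @override_application_settings({"instrumentation.graphql.capture_introspection_queries": False})
    @validate_transaction_count(0)
    def check_introspection_ignored(run_query):
        run_query("{ __schema { types { name } } }")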
diff --git a/tests/framework_grpc/conftest.py b/tests/framework_grpc/conftest.py
index 3e27d134d..970a096cf 100644
--- a/tests/framework_grpc/conftest.py
+++ b/tests/framework_grpc/conftest.py
@@ -16,20 +16,12 @@

 import grpc
 import pytest
-from testing_support.fixtures import (  # noqa
-    code_coverage_fixture,
-    collector_agent_registration_fixture,
-    collector_available_fixture,
-)
+
+from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture  # noqa: F401; pylint: disable=W0611
 from testing_support.mock_external_grpc_server import MockExternalgRPCServer

 import newrelic.packages.six as six

-_coverage_source = [
-    "newrelic.hooks.framework_grpc",
-]
-
-code_coverage = code_coverage_fixture(source=_coverage_source)

 _default_settings = {
     "transaction_tracer.explain_threshold": 0.0,
diff --git a/tests/framework_grpc/test_clients.py b/tests/framework_grpc/test_clients.py
index e8fed1da5..c6ada806b 100644
--- a/tests/framework_grpc/test_clients.py
+++ b/tests/framework_grpc/test_clients.py
@@ -18,8 +18,8 @@

 from newrelic.api.background_task import background_task

-from testing_support.fixtures import (validate_transaction_metrics,
-    validate_transaction_errors)
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors
+from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics

 from _test_common import create_request, get_result
diff --git a/tests/framework_grpc/test_distributed_tracing.py b/tests/framework_grpc/test_distributed_tracing.py
index 7cd134ca0..7f253651d 100644
--- a/tests/framework_grpc/test_distributed_tracing.py
+++ b/tests/framework_grpc/test_distributed_tracing.py
@@ -21,12 +21,11 @@
 from newrelic.common.encoding_utils import (
     DistributedTracePayload, W3CTraceParent, W3CTraceState, NrTraceState)

-from testing_support.fixtures import (override_application_settings,
-    validate_transaction_metrics)
+from testing_support.fixtures import override_application_settings
 from testing_support.validators.validate_span_events import (
     validate_span_events)

 from _test_common import create_request, wait_for_transaction_completion
-
+from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics

 _test_matrix = ('method_name,streaming_request', (
     ('DoUnaryUnary', False),
diff --git a/tests/framework_grpc/test_server.py b/tests/framework_grpc/test_server.py
index d1ed47ff5..534d78005 100644
--- a/tests/framework_grpc/test_server.py
+++ b/tests/framework_grpc/test_server.py
@@ -18,12 +18,13 @@
 from conftest import create_stub_and_channel
 from _test_common import create_request, wait_for_transaction_completion
 from newrelic.core.config import global_settings
-from testing_support.fixtures import (validate_transaction_metrics,
-    validate_transaction_event_attributes, override_application_settings,
-    override_generic_settings, function_not_called,
-    validate_transaction_errors)
+from testing_support.fixtures import (
+    override_application_settings,
+    override_generic_settings, function_not_called)
 from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics
-
+from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors
+from testing_support.validators.validate_transaction_event_attributes import validate_transaction_event_attributes

 def select_python_version(py2, py3):
     return six.PY3 and py3 or py2
diff --git a/tests/framework_pyramid/conftest.py b/tests/framework_pyramid/conftest.py
index 6ca07b90a..289eabeaf 100644
--- a/tests/framework_pyramid/conftest.py
+++ b/tests/framework_pyramid/conftest.py
@@ -14,14 +14,8 @@

 import pytest

-from testing_support.fixtures import (code_coverage_fixture,
-    collector_agent_registration_fixture, collector_available_fixture)
+from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture  # noqa: F401; pylint: disable=W0611

-_coverage_source = [
-    'newrelic.hooks.framework_pyramid',
-]
-
-code_coverage = code_coverage_fixture(source=_coverage_source)

 _default_settings = {
     'transaction_tracer.explain_threshold': 0.0,
diff --git a/tests/framework_pyramid/test_append_slash_app.py b/tests/framework_pyramid/test_append_slash_app.py
index 875f91012..f09e14b55 100644
--- a/tests/framework_pyramid/test_append_slash_app.py
+++ b/tests/framework_pyramid/test_append_slash_app.py
@@ -34,9 +34,9 @@
 import pytest
 import re

-from testing_support.fixtures import (validate_transaction_metrics,
-    validate_transaction_errors, override_application_settings)
-
+from testing_support.fixtures import override_application_settings
+from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors

 def _to_int(version_str):
     m = re.match(r'\d+', version_str)
diff --git a/tests/framework_pyramid/test_application.py b/tests/framework_pyramid/test_application.py
index 6b912d632..132e0f72a 100644
--- a/tests/framework_pyramid/test_application.py
+++ b/tests/framework_pyramid/test_application.py
@@ -14,8 +14,9 @@

 import pytest

-from testing_support.fixtures import (validate_transaction_metrics,
-    validate_transaction_errors, override_application_settings)
+from testing_support.fixtures import override_application_settings
+from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors

 from newrelic.packages import six
 from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics
diff --git a/tests/framework_pyramid/test_cornice.py b/tests/framework_pyramid/test_cornice.py
index 8b8dc58d2..fe36831e0 100644
--- a/tests/framework_pyramid/test_cornice.py
+++ b/tests/framework_pyramid/test_cornice.py
@@ -14,9 +14,9 @@

 import pytest

-from testing_support.fixtures import (validate_transaction_errors,
-    validate_transaction_metrics)
 from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics
+from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics
+from testing_support.validators.validate_transaction_errors import validate_transaction_errors

 from newrelic.packages import six
diff --git a/tests/framework_sanic/conftest.py b/tests/framework_sanic/conftest.py
index 434528bac..5152887b6 100644
--- a/tests/framework_sanic/conftest.py
+++ b/tests/framework_sanic/conftest.py
@@ -15,21 +15,13 @@

 import asyncio

 import pytest
-from testing_support.fixtures import (  # noqa: F401 pylint: disable=W0611
-    code_coverage_fixture,
-    collector_agent_registration_fixture,
-    collector_available_fixture,
-)
+
+from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture  # noqa: F401; pylint: disable=W0611

 from newrelic.common.object_wrapper import (  # noqa: F401 pylint: disable=W0611
     transient_function_wrapper,
 )

-_coverage_source = [
-    "newrelic.hooks.framework_sanic",
-]
-
-code_coverage = code_coverage_fixture(source=_coverage_source)

 _default_settings = {
     "transaction_tracer.explain_threshold": 0.0,
diff --git a/tests/framework_sanic/test_application.py b/tests/framework_sanic/test_application.py
index eebbde003..a949d91d0 100644
--- a/tests/framework_sanic/test_application.py
+++ b/tests/framework_sanic/test_application.py
@@ -21,22 +21,34 @@
     override_application_settings,
     override_generic_settings,
     override_ignore_status_codes,
-    validate_transaction_errors,
-    validate_transaction_event_attributes,
-    validate_transaction_metrics,
 )
 from testing_support.validators.validate_code_level_metrics import (
     validate_code_level_metrics,
 )
+from testing_support.validators.validate_transaction_errors import (
+    validate_transaction_errors,
+)
+from testing_support.validators.validate_transaction_event_attributes import (
+    validate_transaction_event_attributes,
+)
+from testing_support.validators.validate_transaction_metrics import (
+    validate_transaction_metrics,
+)

 from newrelic.api.application import application_instance
 from newrelic.api.external_trace import ExternalTrace
 from newrelic.api.transaction import Transaction
+from newrelic.common.package_version_utils import get_package_version
 from newrelic.core.config import global_settings

+SANIC_VERSION = tuple(map(int, get_package_version("sanic").split(".")))
+
+sanic_21 = SANIC_VERSION >= (21,)
+sanic_v19_to_v22_12 = SANIC_VERSION >= (19,) and SANIC_VERSION < (22, 12)
+
 BASE_METRICS = [
     ("Function/_target_application:index", 1),
-    ("Function/_target_application:request_middleware", 1 if int(sanic.__version__.split(".", 1)[0]) > 18 else 2),
+    ("Function/_target_application:request_middleware", 1 if sanic_v19_to_v22_12 else 2),
 ]
 FRAMEWORK_METRICS = [
     ("Python/Framework/Sanic/%s" % sanic.__version__, 1),
 ]
diff --git a/tests/framework_sanic/test_cross_application.py b/tests/framework_sanic/test_cross_application.py
index 0c1c724cf..31dc3b9b9 100644
--- a/tests/framework_sanic/test_cross_application.py
+++ b/tests/framework_sanic/test_cross_application.py
@@ -13,30 +13,38 @@
 # limitations under the License.

 import json
-import pytest
-import re
 import random
+
+# import re
 import string

-from newrelic.common.encoding_utils import deobfuscate
+import pytest
+from testing_support.fixtures import (
+    make_cross_agent_headers,
+    override_application_settings,
+    validate_analytics_catmap_data,
+)
+from testing_support.validators.validate_transaction_event_attributes import (
+    validate_transaction_event_attributes,
+)
+from testing_support.validators.validate_transaction_metrics import (
+    validate_transaction_metrics,
+)
+
 from newrelic.api.application import application_instance
 from newrelic.api.external_trace import ExternalTrace
 from newrelic.api.transaction import Transaction
-
-from testing_support.fixtures import (override_application_settings,
-    make_cross_agent_headers, validate_analytics_catmap_data,
-    validate_transaction_metrics, validate_transaction_event_attributes)
-
+from newrelic.common.encoding_utils import deobfuscate

 BASE_METRICS = [
-    ('Function/_target_application:index', 1),
+    ("Function/_target_application:index", 1),
 ]
 DT_METRICS = [
-    ('Supportability/DistributedTrace/AcceptPayload/Success', None),
-    ('Supportability/TraceContext/TraceParent/Accept/Success', 1),
+    ("Supportability/DistributedTrace/AcceptPayload/Success", None),
+    ("Supportability/TraceContext/TraceParent/Accept/Success", 1),
 ]
-BASE_ATTRS = ['response.status', 'response.headers.contentType',
-    'response.headers.contentLength']
+BASE_ATTRS = ["response.status", "response.headers.contentType", "response.headers.contentLength"]
+

 def raw_headers(response):
     try:
@@ -50,62 +58,69 @@ def raw_headers(response):


 @validate_transaction_metrics(
-    '_target_application:index',
+    "_target_application:index",
     scoped_metrics=BASE_METRICS,
     rollup_metrics=BASE_METRICS + DT_METRICS,
 )
-@override_application_settings({
-    'distributed_tracing.enabled': True,
-})
+@override_application_settings(
+    {
+        "distributed_tracing.enabled": True,
+    }
+)
 @validate_transaction_event_attributes(
-    required_params={'agent': BASE_ATTRS, 'user': [], 'intrinsic': []},
+    required_params={"agent": BASE_ATTRS, "user": [], "intrinsic": []},
 )
 def test_inbound_distributed_trace(app):
     transaction = Transaction(application_instance())
dt_headers = ExternalTrace.generate_request_headers(transaction) - response = app.fetch('get', '/', headers=dict(dt_headers)) + response = app.fetch("get", "/", headers=dict(dt_headers)) assert response.status == 200 -ENCODING_KEY = "".join(random.choice(string.ascii_lowercase) for _ in range(40)) +ENCODING_KEY = "".join(random.choice(string.ascii_lowercase) for _ in range(40)) # nosec _cat_response_header_urls_to_test = ( - ('/', '_target_application:index'), - ('/streaming', '_target_application:streaming'), - ('/error', '_target_application:error'), + ("/", "_target_application:index"), + ("/streaming", "_target_application:streaming"), + ("/error", "_target_application:error"), ) _custom_settings = { - 'cross_process_id': '1#1', - 'encoding_key': ENCODING_KEY, - 'trusted_account_ids': [1], - 'cross_application_tracer.enabled': True, - 'distributed_tracing.enabled': False, + "cross_process_id": "1#1", + "encoding_key": ENCODING_KEY, + "trusted_account_ids": [1], + "cross_application_tracer.enabled": True, + "distributed_tracing.enabled": False, } @pytest.mark.parametrize( - 'inbound_payload,expected_intrinsics,forgone_intrinsics,cat_id', [ - - # Valid payload from trusted account - (['b854df4feb2b1f06', False, '7e249074f277923d', '5d2957be'], - {'nr.referringTransactionGuid': 'b854df4feb2b1f06', - 'nr.tripId': '7e249074f277923d', - 'nr.referringPathHash': '5d2957be'}, - [], - '1#1'), - - # Valid payload from an untrusted account - (['b854df4feb2b1f06', False, '7e249074f277923d', '5d2957be'], - {}, - ['nr.referringTransactionGuid', 'nr.tripId', 'nr.referringPathHash'], - '80#1'), -]) -@pytest.mark.parametrize('url,metric_name', _cat_response_header_urls_to_test) -def test_cat_response_headers(app, inbound_payload, expected_intrinsics, - forgone_intrinsics, cat_id, url, metric_name): + "inbound_payload,expected_intrinsics,forgone_intrinsics,cat_id", + [ + # Valid payload from trusted account + ( + ["b854df4feb2b1f06", False, "7e249074f277923d", "5d2957be"], + { + "nr.referringTransactionGuid": "b854df4feb2b1f06", + "nr.tripId": "7e249074f277923d", + "nr.referringPathHash": "5d2957be", + }, + [], + "1#1", + ), + # Valid payload from an untrusted account + ( + ["b854df4feb2b1f06", False, "7e249074f277923d", "5d2957be"], + {}, + ["nr.referringTransactionGuid", "nr.tripId", "nr.referringPathHash"], + "80#1", + ), + ], +) +@pytest.mark.parametrize("url,metric_name", _cat_response_header_urls_to_test) +def test_cat_response_headers(app, inbound_payload, expected_intrinsics, forgone_intrinsics, cat_id, url, metric_name): _base_metrics = [ - ('Function/%s' % metric_name, 1), + ("Function/%s" % metric_name, 1), ] @validate_transaction_metrics( @@ -114,39 +129,36 @@ def test_cat_response_headers(app, inbound_payload, expected_intrinsics, rollup_metrics=_base_metrics, ) @validate_analytics_catmap_data( - 'WebTransaction/Function/%s' % metric_name, - expected_attributes=expected_intrinsics, - non_expected_attributes=forgone_intrinsics) + "WebTransaction/Function/%s" % metric_name, + expected_attributes=expected_intrinsics, + non_expected_attributes=forgone_intrinsics, + ) @override_application_settings(_custom_settings) def _test(): - cat_headers = make_cross_agent_headers(inbound_payload, ENCODING_KEY, - cat_id) - response = app.fetch('get', url, headers=dict(cat_headers)) + cat_headers = make_cross_agent_headers(inbound_payload, ENCODING_KEY, cat_id) + response = app.fetch("get", url, headers=dict(cat_headers)) if expected_intrinsics: # test valid CAT response header - assert b'X-NewRelic-App-Data' 
in raw_headers(response) + assert b"X-NewRelic-App-Data" in raw_headers(response) cat_response_header = response.headers.get("X-NewRelic-App-Data", None) - app_data = json.loads(deobfuscate(cat_response_header, - ENCODING_KEY)) + app_data = json.loads(deobfuscate(cat_response_header, ENCODING_KEY)) assert app_data[0] == cat_id - assert app_data[1] == ('WebTransaction/Function/%s' % metric_name) + assert app_data[1] == ("WebTransaction/Function/%s" % metric_name) else: - assert b'X-NewRelic-App-Data' not in raw_headers(response) + assert b"X-NewRelic-App-Data" not in raw_headers(response) _test() @override_application_settings(_custom_settings) def test_cat_response_custom_header(app): - inbound_payload = ['b854df4feb2b1f06', False, '7e249074f277923d', - '5d2957be'] - cat_id = '1#1' - custom_header_value = b'my-custom-header-value' - cat_headers = make_cross_agent_headers(inbound_payload, ENCODING_KEY, - cat_id) - - response = app.fetch('get', '/custom-header/%s/%s' % ( - 'X-NewRelic-App-Data', custom_header_value), - headers=dict(cat_headers)) + inbound_payload = ["b854df4feb2b1f06", False, "7e249074f277923d", "5d2957be"] + cat_id = "1#1" + custom_header_value = b"my-custom-header-value" + cat_headers = make_cross_agent_headers(inbound_payload, ENCODING_KEY, cat_id) + + response = app.fetch( + "get", "/custom-header/%s/%s" % ("X-NewRelic-App-Data", custom_header_value), headers=dict(cat_headers) + ) assert custom_header_value in raw_headers(response), raw_headers(response) diff --git a/tests/framework_starlette/conftest.py b/tests/framework_starlette/conftest.py index a760fe847..7c843cb08 100644 --- a/tests/framework_starlette/conftest.py +++ b/tests/framework_starlette/conftest.py @@ -12,17 +12,8 @@ # See the License for the specific language governing permissions and # limitations under the License. 
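
The CAT assertions above round-trip the X-NewRelic-App-Data header through make_cross_agent_headers and deobfuscate with a shared ENCODING_KEY. For orientation, the scheme these helpers assume is a cyclic XOR against the encoding key followed by base64; a minimal standalone sketch (function names here are illustrative, not the agent's API):

    import base64

    def obfuscate_sketch(payload, key):
        # Cyclic XOR of the payload bytes with the key bytes, then base64
        # (a sketch of the scheme the ENCODING_KEY round-trip relies on).
        data = payload.encode("utf-8")
        mixed = bytes(b ^ ord(key[i % len(key)]) for i, b in enumerate(data))
        return base64.b64encode(mixed).decode("utf-8")

    def deobfuscate_sketch(header, key):
        # The inverse: base64-decode, then XOR with the same cycling key.
        data = base64.b64decode(header)
        return bytes(b ^ ord(key[i % len(key)]) for i, b in enumerate(data)).decode("utf-8")

    key = "1234567890"
    assert deobfuscate_sketch(obfuscate_sketch('["1#1"]', key), key) == '["1#1"]'

Because XOR is its own inverse, one loop serves both directions; only the base64 step changes sides.
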
-from testing_support.fixtures import ( - code_coverage_fixture, - collector_agent_registration_fixture, - collector_available_fixture, -) - -_coverage_source = [ - "newrelic.hooks.framework_starlette", -] +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { "transaction_tracer.explain_threshold": 0.0, diff --git a/tests/framework_starlette/test_application.py b/tests/framework_starlette/test_application.py index 9c5944bd0..7d36d66cc 100644 --- a/tests/framework_starlette/test_application.py +++ b/tests/framework_starlette/test_application.py @@ -16,14 +16,12 @@ import pytest import starlette -from testing_support.fixtures import ( - override_ignore_status_codes, - validate_transaction_errors, - validate_transaction_metrics, -) +from testing_support.fixtures import override_ignore_status_codes from newrelic.common.object_names import callable_name from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics +from testing_support.validators.validate_transaction_errors import validate_transaction_errors +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics starlette_version = tuple(int(x) for x in starlette.__version__.split(".")) diff --git a/tests/framework_starlette/test_bg_tasks.py b/tests/framework_starlette/test_bg_tasks.py index 308f67d10..5e30fe32e 100644 --- a/tests/framework_starlette/test_bg_tasks.py +++ b/tests/framework_starlette/test_bg_tasks.py @@ -16,10 +16,12 @@ import pytest from starlette import __version__ -from testing_support.fixtures import validate_transaction_metrics from testing_support.validators.validate_transaction_count import ( validate_transaction_count, ) +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) starlette_version = tuple(int(x) for x in __version__.split(".")) @@ -85,11 +87,22 @@ def _test(): response = app.get("/" + route) assert response.status == 200 - BUG_COMPLETELY_FIXED = (starlette_version >= (0, 21, 0)) or ( - starlette_version >= (0, 20, 1) and sys.version_info[:2] > (3, 7) + # The bug was fixed in version 0.21.0 but re-occurred in 0.23.1. + # The bug was also not present in versions 0.20.1 to 0.23.1 on Python versions newer than 3.7. + # The bug was fixed again in version 0.29.0. + BUG_COMPLETELY_FIXED = any( + ( + (0, 21, 0) <= starlette_version < (0, 23, 1), + (0, 20, 1) <= starlette_version < (0, 23, 1) and sys.version_info[:2] > (3, 7), + starlette_version >= (0, 29, 0), + ) + ) + BUG_PARTIALLY_FIXED = any( + ( + (0, 20, 1) <= starlette_version < (0, 21, 0), + (0, 23, 1) <= starlette_version < (0, 29, 0), + ) ) - BUG_PARTIALLY_FIXED = (0, 20, 1) <= starlette_version < (0, 21, 0) and sys.version_info[:2] <= (3, 7) - if BUG_COMPLETELY_FIXED: # Assert both web transaction and background task transactions are present. _test = validate_transaction_metrics( @@ -101,6 +114,7 @@ def _test(): # The background task no longer blocks the completion of the web request/web transaction. # However, the BaseHTTPMiddleware causes the task to be cancelled when the web request disconnects, so there are no # longer function traces or background task transactions.
+ # In version 0.23.1, the check to see if more_body exists was removed, reverting behavior to this model. _test = validate_transaction_metrics("_test_bg_tasks:run_%s_bg_task" % route, scoped_metrics=[route_metric])( _test ) diff --git a/tests/framework_starlette/test_graphql.py b/tests/framework_starlette/test_graphql.py index 241371eb1..24ec3ab38 100644 --- a/tests/framework_starlette/test_graphql.py +++ b/tests/framework_starlette/test_graphql.py @@ -15,7 +15,8 @@ import json import pytest -from testing_support.fixtures import dt_enabled, validate_transaction_metrics +from testing_support.fixtures import dt_enabled +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.validators.validate_span_events import validate_span_events diff --git a/tests/framework_strawberry/conftest.py b/tests/framework_strawberry/conftest.py index 10c659b8b..c5cdbf0c8 100644 --- a/tests/framework_strawberry/conftest.py +++ b/tests/framework_strawberry/conftest.py @@ -14,17 +14,9 @@ import pytest import six -from testing_support.fixtures import ( # noqa: F401; pylint: disable=W0611 - code_coverage_fixture, - collector_agent_registration_fixture, - collector_available_fixture, -) -_coverage_source = [ - "newrelic.hooks.framework_graphql", -] +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { "transaction_tracer.explain_threshold": 0.0, diff --git a/tests/framework_tornado/_target_application.py b/tests/framework_tornado/_target_application.py index 497b81efd..98db75ab9 100644 --- a/tests/framework_tornado/_target_application.py +++ b/tests/framework_tornado/_target_application.py @@ -13,12 +13,13 @@ # limitations under the License. import time -import tornado.ioloop -import tornado.web + import tornado.gen import tornado.httpclient -import tornado.websocket import tornado.httputil +import tornado.ioloop +import tornado.web +import tornado.websocket from tornado.routing import PathMatches @@ -36,16 +37,15 @@ def get_status(self, *args, **kwargs): class ProcessCatHeadersHandler(tornado.web.RequestHandler): def __init__(self, application, request, response_code=200, **kwargs): - super(ProcessCatHeadersHandler, self).__init__(application, request, - **kwargs) + super(ProcessCatHeadersHandler, self).__init__(application, request, **kwargs) self.response_code = response_code def get(self, client_cross_process_id, txn_header, flush=None): import newrelic.api.transaction as _transaction + txn = _transaction.current_transaction() if txn: - txn._process_incoming_cat_headers(client_cross_process_id, - txn_header) + txn._process_incoming_cat_headers(client_cross_process_id, txn_header) if self.response_code != 200: self.set_status(self.response_code) @@ -53,7 +53,7 @@ def get(self, client_cross_process_id, txn_header, flush=None): self.write("Hello, world") - if flush == 'flush': + if flush == "flush": # Force a flush prior to calling finish # This causes the headers to get written immediately.
The tests # which hit this endpoint will check that the response has been @@ -61,17 +61,17 @@ def get(self, client_cross_process_id, txn_header, flush=None): self.flush() # change the headers to garbage - self.set_header('Content-Type', 'garbage') + self.set_header("Content-Type", "garbage") class EchoHeaderHandler(tornado.web.RequestHandler): def get(self): - response = str(self.request.headers.__dict__).encode('utf-8') + response = str(self.request.headers.__dict__).encode("utf-8") self.write(response) class SimpleHandler(tornado.web.RequestHandler): - options = {'your_command': 'options'} + options = {"your_command": "options"} def get(self): self.write("Hello, world") @@ -111,7 +111,7 @@ def get(self): @tornado.gen.coroutine def throw_exception(self): - raise ValueError('Throwing exception.') + raise ValueError("Throwing exception.") class CoroHandler(tornado.web.RequestHandler): @@ -165,7 +165,7 @@ async def get(self): def trace(self): from newrelic.api.function_trace import FunctionTrace - with FunctionTrace(name='trace', terminal=True): + with FunctionTrace(name="trace", terminal=True): pass @@ -178,12 +178,11 @@ class EnsureFutureHandler(tornado.web.RequestHandler): def get(self): import asyncio - @asyncio.coroutine - def coro_trace(): + async def coro_trace(): from newrelic.api.function_trace import FunctionTrace - with FunctionTrace(name='trace', terminal=True): - yield from tornado.gen.sleep(0) + with FunctionTrace(name="trace", terminal=True): + await tornado.gen.sleep(0) asyncio.ensure_future(coro_trace()) @@ -193,18 +192,14 @@ def on_message(self, message): super(WebNestedHandler, self).on_message(message) -class CustomApplication( - tornado.httputil.HTTPServerConnectionDelegate, - tornado.httputil.HTTPMessageDelegate): - +class CustomApplication(tornado.httputil.HTTPServerConnectionDelegate, tornado.httputil.HTTPMessageDelegate): def start_request(self, server_conn, http_conn): self.server_conn = server_conn self.http_conn = http_conn return self def finish(self): - response_line = tornado.httputil.ResponseStartLine( - "HTTP/1.1", 200, "OK") + response_line = tornado.httputil.ResponseStartLine("HTTP/1.1", 200, "OK") headers = tornado.httputil.HTTPHeaders() headers["Content-Type"] = "text/plain" self.http_conn.write_headers(response_line, headers) @@ -221,6 +216,7 @@ def initialize(self, yield_before_finish=False): async def get(self, total=1): import asyncio + total = int(total) cls = type(self) @@ -239,33 +235,31 @@ async def get(self, total=1): if self.yield_before_finish: await asyncio.sleep(0) - self.write('*') + self.write("*") def make_app(custom=False): handlers = [ - (PathMatches(r'/simple'), SimpleHandler), - (r'/crash', CrashHandler), - (r'/call-simple', CallSimpleHandler), - (r'/super-simple', SuperSimpleHandler), - (r'/coro', CoroHandler), - (r'/coro-throw', CoroThrowHandler), - (r'/fake-coro', FakeCoroHandler), - (r'/init', InitializeHandler), - (r'/html-insertion', HTMLInsertionHandler), - (r'/bad-get-status', BadGetStatusHandler), - (r'/force-cat-response/(\S+)/(\S+)/(\S+)', ProcessCatHeadersHandler), - (r'/304-cat-response/(\S+)/(\S+)', ProcessCatHeadersHandler, - {'response_code': 304}), - (r'/echo-headers', EchoHeaderHandler), - (r'/native-simple', NativeSimpleHandler), - (r'/multi-trace', MultiTraceHandler), - (r'/web-socket', WebSocketHandler), - (r'/ensure-future', EnsureFutureHandler), - (r'/call-web-socket', WebNestedHandler), - (r'/block/(\d+)', BlockingHandler), - (r'/block-with-yield/(\d+)', BlockingHandler, - {'yield_before_finish': True}), + 
(PathMatches(r"/simple"), SimpleHandler), + (r"/crash", CrashHandler), + (r"/call-simple", CallSimpleHandler), + (r"/super-simple", SuperSimpleHandler), + (r"/coro", CoroHandler), + (r"/coro-throw", CoroThrowHandler), + (r"/fake-coro", FakeCoroHandler), + (r"/init", InitializeHandler), + (r"/html-insertion", HTMLInsertionHandler), + (r"/bad-get-status", BadGetStatusHandler), + (r"/force-cat-response/(\S+)/(\S+)/(\S+)", ProcessCatHeadersHandler), + (r"/304-cat-response/(\S+)/(\S+)", ProcessCatHeadersHandler, {"response_code": 304}), + (r"/echo-headers", EchoHeaderHandler), + (r"/native-simple", NativeSimpleHandler), + (r"/multi-trace", MultiTraceHandler), + (r"/web-socket", WebSocketHandler), + (r"/ensure-future", EnsureFutureHandler), + (r"/call-web-socket", WebNestedHandler), + (r"/block/(\d+)", BlockingHandler), + (r"/block-with-yield/(\d+)", BlockingHandler, {"yield_before_finish": True}), ] if custom: return CustomApplication() @@ -275,5 +269,5 @@ def make_app(custom=False): if __name__ == "__main__": app = make_app() - app.listen(8888, address='127.0.0.1') + app.listen(8888, address="127.0.0.1") tornado.ioloop.IOLoop.current().start() diff --git a/tests/framework_tornado/conftest.py b/tests/framework_tornado/conftest.py index cec4549d2..920b916ee 100644 --- a/tests/framework_tornado/conftest.py +++ b/tests/framework_tornado/conftest.py @@ -14,8 +14,7 @@ import pytest -from testing_support.fixtures import (code_coverage_fixture, # noqa - collector_agent_registration_fixture, collector_available_fixture) +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 _default_settings = { 'transaction_tracer.explain_threshold': 0.0, @@ -29,12 +28,6 @@ app_name='Python Agent Test (framework_tornado)', default_settings=_default_settings) -_coverage_source = [ - 'newrelic.hooks.framework_tornado', -] - -code_coverage = code_coverage_fixture(source=_coverage_source) - @pytest.fixture(scope='module') def app(request): diff --git a/tests/framework_tornado/test_custom_handler.py b/tests/framework_tornado/test_custom_handler.py index 4cabc5e1f..a8cb77d76 100644 --- a/tests/framework_tornado/test_custom_handler.py +++ b/tests/framework_tornado/test_custom_handler.py @@ -13,7 +13,7 @@ # limitations under the License. 
import pytest -from testing_support.fixtures import validate_transaction_metrics +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics pytestmark = pytest.mark.custom_app diff --git a/tests/framework_tornado/test_externals.py b/tests/framework_tornado/test_externals.py index 7d973ac5f..0c44e4336 100644 --- a/tests/framework_tornado/test_externals.py +++ b/tests/framework_tornado/test_externals.py @@ -17,10 +17,8 @@ import sys import pytest -from testing_support.fixtures import ( - override_application_settings, - validate_transaction_metrics, -) +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.mock_external_http_server import ( MockExternalHTTPHResponseHeadersServer, MockExternalHTTPServer, diff --git a/tests/framework_tornado/test_inbound_cat.py b/tests/framework_tornado/test_inbound_cat.py index 87ebe03d0..44fbf2933 100644 --- a/tests/framework_tornado/test_inbound_cat.py +++ b/tests/framework_tornado/test_inbound_cat.py @@ -15,8 +15,9 @@ import json import pytest from testing_support.fixtures import (make_cross_agent_headers, - override_application_settings, validate_transaction_event_attributes, - validate_transaction_metrics) + override_application_settings) +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics +from testing_support.validators.validate_transaction_event_attributes import validate_transaction_event_attributes ENCODING_KEY = '1234567890123456789012345678901234567890' diff --git a/tests/framework_tornado/test_server.py b/tests/framework_tornado/test_server.py index 16aced356..6f8e6bf2a 100644 --- a/tests/framework_tornado/test_server.py +++ b/tests/framework_tornado/test_server.py @@ -13,41 +13,56 @@ # limitations under the License. 
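
The dominant pattern in this patch is the same mechanical import migration applied file by file: each validator moves out of the catch-all testing_support.fixtures module into its own testing_support.validators submodule. The before/after shape, taken directly from the hunks (it requires the agent's test support package, so it is not standalone):

    # Before: validators imported from the catch-all fixtures module
    # from testing_support.fixtures import (
    #     validate_transaction_errors,
    #     validate_transaction_metrics,
    # )

    # After: one dedicated submodule per validator
    from testing_support.validators.validate_transaction_errors import (
        validate_transaction_errors,
    )
    from testing_support.validators.validate_transaction_metrics import (
        validate_transaction_metrics,
    )
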
import pytest -from newrelic.core.config import global_settings -from testing_support.fixtures import (validate_transaction_metrics, - override_generic_settings, function_not_called, - validate_transaction_event_attributes, - validate_transaction_errors, override_ignore_status_codes, - override_application_settings) +from testing_support.fixtures import ( + function_not_called, + override_application_settings, + override_generic_settings, + override_ignore_status_codes, +) +from testing_support.validators.validate_code_level_metrics import ( + validate_code_level_metrics, +) from testing_support.validators.validate_transaction_count import ( - validate_transaction_count) -from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics - - -@pytest.mark.parametrize('uri,name,metrics, method_metric', ( - # ('/native-simple', '_target_application:NativeSimpleHandler.get', None, - # True), - # ('/simple', '_target_application:SimpleHandler.get', None, True), - ('/call-simple', '_target_application:CallSimpleHandler.get', None, True), - ('/super-simple', '_target_application:SuperSimpleHandler.get', None, - True), - ('/coro', '_target_application:CoroHandler.get', None, False), - ('/fake-coro', '_target_application:FakeCoroHandler.get', None, False), - ('/coro-throw', '_target_application:CoroThrowHandler.get', None, False), - ('/init', '_target_application:InitializeHandler.get', None, True), - ('/multi-trace', '_target_application:MultiTraceHandler.get', - [('Function/trace', 2)], True), -)) -@override_application_settings({'attributes.include': ['request.*']}) + validate_transaction_count, +) +from testing_support.validators.validate_transaction_errors import ( + validate_transaction_errors, +) +from testing_support.validators.validate_transaction_event_attributes import ( + validate_transaction_event_attributes, +) +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) + +from newrelic.core.config import global_settings + + +@pytest.mark.parametrize( + "uri,name,metrics, method_metric", + ( + # ('/native-simple', '_target_application:NativeSimpleHandler.get', None, + # True), + # ('/simple', '_target_application:SimpleHandler.get', None, True), + ("/call-simple", "_target_application:CallSimpleHandler.get", None, True), + ("/super-simple", "_target_application:SuperSimpleHandler.get", None, True), + ("/coro", "_target_application:CoroHandler.get", None, False), + ("/fake-coro", "_target_application:FakeCoroHandler.get", None, False), + ("/coro-throw", "_target_application:CoroThrowHandler.get", None, False), + ("/init", "_target_application:InitializeHandler.get", None, True), + ("/multi-trace", "_target_application:MultiTraceHandler.get", [("Function/trace", 2)], True), + ), +) +@override_application_settings({"attributes.include": ["request.*"]}) def test_server(app, uri, name, metrics, method_metric): - FRAMEWORK_METRIC = 'Python/Framework/Tornado/%s' % app.tornado_version - METHOD_METRIC = 'Function/%s' % name + FRAMEWORK_METRIC = "Python/Framework/Tornado/%s" % app.tornado_version + METHOD_METRIC = "Function/%s" % name metrics = metrics or [] metrics.append((FRAMEWORK_METRIC, 1)) metrics.append((METHOD_METRIC, 1 if method_metric else None)) - host = '127.0.0.1:' + str(app.get_http_port()) + host = "127.0.0.1:" + str(app.get_http_port()) namespace, func_name = name.split(".") namespace = namespace.replace(":", ".") @@ -56,21 +71,21 @@ def test_server(app, uri, name, metrics, method_metric): 
rollup_metrics=metrics, ) @validate_transaction_event_attributes( - required_params={ - 'agent': ('response.headers.contentType',), - 'user': (), 'intrinsic': ()}, + required_params={"agent": ("response.headers.contentType",), "user": (), "intrinsic": ()}, exact_attrs={ - 'agent': {'request.headers.contentType': '1234', - 'request.headers.host': host, - 'request.method': 'GET', - 'request.uri': uri, - 'response.status': '200'}, - 'user': {}, - 'intrinsic': {'port': app.get_http_port()}, + "agent": { + "request.headers.contentType": "1234", + "request.headers.host": host, + "request.method": "GET", + "request.uri": uri, + "response.status": "200", + }, + "user": {}, + "intrinsic": {"port": app.get_http_port()}, }, ) def _test(): - response = app.fetch(uri, headers=(('Content-Type', '1234'),)) + response = app.fetch(uri, headers=(("Content-Type", "1234"),)) assert response.code == 200 if method_metric: @@ -79,33 +94,31 @@ def _test(): _test() -@pytest.mark.parametrize('uri,name,metrics,method_metric', ( - ('/native-simple', '_target_application:NativeSimpleHandler.get', None, - True), - ('/simple', '_target_application:SimpleHandler.get', None, True), - ('/call-simple', '_target_application:CallSimpleHandler.get', None, True), - ('/super-simple', '_target_application:SuperSimpleHandler.get', None, - True), - ('/coro', '_target_application:CoroHandler.get', None, False), - ('/fake-coro', '_target_application:FakeCoroHandler.get', None, False), - ('/coro-throw', '_target_application:CoroThrowHandler.get', None, False), - ('/init', '_target_application:InitializeHandler.get', None, True), - ('/ensure-future', - '_target_application:EnsureFutureHandler.get', - [('Function/trace', None)], True), - ('/multi-trace', '_target_application:MultiTraceHandler.get', - [('Function/trace', 2)], True), -)) +@pytest.mark.parametrize( + "uri,name,metrics,method_metric", + ( + ("/native-simple", "_target_application:NativeSimpleHandler.get", None, True), + ("/simple", "_target_application:SimpleHandler.get", None, True), + ("/call-simple", "_target_application:CallSimpleHandler.get", None, True), + ("/super-simple", "_target_application:SuperSimpleHandler.get", None, True), + ("/coro", "_target_application:CoroHandler.get", None, False), + ("/fake-coro", "_target_application:FakeCoroHandler.get", None, False), + ("/coro-throw", "_target_application:CoroThrowHandler.get", None, False), + ("/init", "_target_application:InitializeHandler.get", None, True), + ("/ensure-future", "_target_application:EnsureFutureHandler.get", [("Function/trace", None)], True), + ("/multi-trace", "_target_application:MultiTraceHandler.get", [("Function/trace", 2)], True), + ), +) def test_concurrent_inbound_requests(app, uri, name, metrics, method_metric): from tornado import gen - FRAMEWORK_METRIC = 'Python/Framework/Tornado/%s' % app.tornado_version - METHOD_METRIC = 'Function/%s' % name + FRAMEWORK_METRIC = "Python/Framework/Tornado/%s" % app.tornado_version + METHOD_METRIC = "Function/%s" % name metrics = metrics or [] metrics.append((FRAMEWORK_METRIC, 1)) metrics.append((METHOD_METRIC, 1 if method_metric else None)) - + namespace, func_name = name.split(".") namespace = namespace.replace(":", ".") @@ -127,89 +140,94 @@ def _test(): _test() + @validate_code_level_metrics("_target_application.CrashHandler", "get") -@validate_transaction_metrics('_target_application:CrashHandler.get') -@validate_transaction_errors(['builtins:ValueError']) +@validate_transaction_metrics("_target_application:CrashHandler.get") 
+@validate_transaction_errors(["builtins:ValueError"]) def test_exceptions_are_recorded(app): - response = app.fetch('/crash') + response = app.fetch("/crash") assert response.code == 500 -@pytest.mark.parametrize('nr_enabled,ignore_status_codes', [ - (True, [405]), - (True, []), - (False, None), -]) +@pytest.mark.parametrize( + "nr_enabled,ignore_status_codes", + [ + (True, [405]), + (True, []), + (False, None), + ], +) def test_unsupported_method(app, nr_enabled, ignore_status_codes): - def _test(): - response = app.fetch('/simple', - method='TEAPOT', body=b'', allow_nonstandard_methods=True) + response = app.fetch("/simple", method="TEAPOT", body=b"", allow_nonstandard_methods=True) assert response.code == 405 if nr_enabled: _test = override_ignore_status_codes(ignore_status_codes)(_test) - _test = validate_transaction_metrics( - '_target_application:SimpleHandler')(_test) + _test = validate_transaction_metrics("_target_application:SimpleHandler")(_test) if ignore_status_codes: _test = validate_transaction_errors(errors=[])(_test) else: - _test = validate_transaction_errors( - errors=['tornado.web:HTTPError'])(_test) + _test = validate_transaction_errors(errors=["tornado.web:HTTPError"])(_test) else: settings = global_settings() - _test = override_generic_settings(settings, {'enabled': False})(_test) + _test = override_generic_settings(settings, {"enabled": False})(_test) _test() @validate_transaction_errors(errors=[]) -@validate_transaction_metrics('tornado.web:ErrorHandler') +@validate_transaction_metrics("tornado.web:ErrorHandler") @validate_transaction_event_attributes( - required_params={'agent': (), 'user': (), 'intrinsic': ()}, + required_params={"agent": (), "user": (), "intrinsic": ()}, exact_attrs={ - 'agent': {'request.uri': '/does-not-exist'}, - 'user': {}, - 'intrinsic': {}, + "agent": {"request.uri": "/does-not-exist"}, + "user": {}, + "intrinsic": {}, }, ) def test_not_found(app): - response = app.fetch('/does-not-exist') + response = app.fetch("/does-not-exist") assert response.code == 404 -@override_generic_settings(global_settings(), { - 'enabled': False, -}) -@function_not_called('newrelic.core.stats_engine', - 'StatsEngine.record_transaction') +@override_generic_settings( + global_settings(), + { + "enabled": False, + }, +) +@function_not_called("newrelic.core.stats_engine", "StatsEngine.record_transaction") def test_nr_disabled(app): - response = app.fetch('/simple') + response = app.fetch("/simple") assert response.code == 200 -@pytest.mark.parametrize('uri,name', ( - ('/web-socket', '_target_application:WebSocketHandler'), - ('/call-web-socket', '_target_application:WebNestedHandler'), -)) +@pytest.mark.parametrize( + "uri,name", + ( + ("/web-socket", "_target_application:WebSocketHandler"), + ("/call-web-socket", "_target_application:WebNestedHandler"), + ), +) def test_web_socket(uri, name, app): - import asyncio + # import asyncio + from tornado.websocket import websocket_connect namespace, func_name = name.split(":") @validate_transaction_metrics( name, - rollup_metrics=[('Function/%s' % name, None)], + rollup_metrics=[("Function/%s" % name, None)], ) @validate_code_level_metrics(namespace, func_name) def _test(): - url = app.get_url(uri).replace('http', 'ws') + url = app.get_url(uri).replace("http", "ws") - @asyncio.coroutine - def _connect(): - conn = yield from websocket_connect(url) + async def _connect(): + conn = await websocket_connect(url) return conn @validate_transaction_metrics( @@ -218,14 +236,13 @@ def _connect(): def connect(): return 
app.io_loop.run_sync(_connect) - @function_not_called('newrelic.core.stats_engine', - 'StatsEngine.record_transaction') + @function_not_called("newrelic.core.stats_engine", "StatsEngine.record_transaction") def call(call): - @asyncio.coroutine - def _call(): - yield from conn.write_message("test") - resp = yield from conn.read_message() + async def _call(): + await conn.write_message("test") + resp = await conn.read_message() assert resp == "hello test" + app.io_loop.run_sync(_call) conn = connect() @@ -235,13 +252,10 @@ def _call(): _test() -LOOP_TIME_METRICS = ( - ('EventLoop/Wait/' - 'WebTransaction/Function/_target_application:BlockingHandler.get', 1), -) +LOOP_TIME_METRICS = (("EventLoop/Wait/" "WebTransaction/Function/_target_application:BlockingHandler.get", 1),) -@pytest.mark.parametrize('yield_before_finish', (True, False)) +@pytest.mark.parametrize("yield_before_finish", (True, False)) @validate_transaction_metrics( "_target_application:BlockingHandler.get", scoped_metrics=LOOP_TIME_METRICS, @@ -250,9 +264,9 @@ def test_io_loop_blocking_time(app, yield_before_finish): from tornado import gen if yield_before_finish: - url = app.get_url('/block-with-yield/2') + url = app.get_url("/block-with-yield/2") else: - url = app.get_url('/block/2') + url = app.get_url("/block/2") coros = (app.http_client.fetch(url) for _ in range(2)) responses = app.io_loop.run_sync(lambda: gen.multi(coros)) diff --git a/tests/logger_logging/conftest.py b/tests/logger_logging/conftest.py index 514a14595..46e8f4ec3 100644 --- a/tests/logger_logging/conftest.py +++ b/tests/logger_logging/conftest.py @@ -15,17 +15,8 @@ import logging import pytest -from testing_support.fixtures import ( - code_coverage_fixture, - collector_agent_registration_fixture, - collector_available_fixture, -) - -_coverage_source = [ - "newrelic.hooks.logger_logging", -] +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { "transaction_tracer.explain_threshold": 0.0, diff --git a/tests/logger_logging/test_local_decorating.py b/tests/logger_logging/test_local_decorating.py index 32a47e904..d4917eff5 100644 --- a/tests/logger_logging/test_local_decorating.py +++ b/tests/logger_logging/test_local_decorating.py @@ -14,13 +14,16 @@ import platform +from testing_support.fixtures import reset_core_stats_engine +from testing_support.validators.validate_log_event_count import validate_log_event_count +from testing_support.validators.validate_log_event_count_outside_transaction import ( + validate_log_event_count_outside_transaction, +) + from newrelic.api.application import application_settings from newrelic.api.background_task import background_task from newrelic.api.time_trace import current_trace from newrelic.api.transaction import current_transaction -from testing_support.fixtures import reset_core_stats_engine -from testing_support.validators.validate_log_event_count import validate_log_event_count -from testing_support.validators.validate_log_event_count_outside_transaction import validate_log_event_count_outside_transaction def set_trace_ids(): @@ -31,6 +34,7 @@ def set_trace_ids(): if trace: trace.guid = "abcdefgh" + def exercise_logging(logger): set_trace_ids() @@ -42,9 +46,19 @@ def get_metadata_string(log_message, is_txn): assert host entity_guid = application_settings().entity_guid if is_txn: - metadata_string = "".join(('NR-LINKING|', entity_guid, '|', host, 
'|abcdefgh12345678|abcdefgh|Python%20Agent%20Test%20%28logger_logging%29|')) + metadata_string = "".join( + ( + "NR-LINKING|", + entity_guid, + "|", + host, + "|abcdefgh12345678|abcdefgh|Python%20Agent%20Test%20%28logger_logging%29|", + ) + ) else: - metadata_string = "".join(('NR-LINKING|', entity_guid, '|', host, '|||Python%20Agent%20Test%20%28logger_logging%29|')) + metadata_string = "".join( + ("NR-LINKING|", entity_guid, "|", host, "|||Python%20Agent%20Test%20%28logger_logging%29|") + ) formatted_string = log_message + " " + metadata_string return formatted_string @@ -55,7 +69,7 @@ def test_local_log_decoration_inside_transaction(logger): @background_task() def test(): exercise_logging(logger) - assert logger.caplog.records[0] == get_metadata_string('C', True) + assert logger.caplog.records[0] == get_metadata_string("C", True) test() @@ -65,6 +79,6 @@ def test_local_log_decoration_outside_transaction(logger): @validate_log_event_count_outside_transaction(1) def test(): exercise_logging(logger) - assert logger.caplog.records[0] == get_metadata_string('C', False) + assert logger.caplog.records[0] == get_metadata_string("C", False) test() diff --git a/tests/logger_logging/test_metrics.py b/tests/logger_logging/test_metrics.py index eb9419daf..f5a1c5e8d 100644 --- a/tests/logger_logging/test_metrics.py +++ b/tests/logger_logging/test_metrics.py @@ -16,10 +16,7 @@ from newrelic.api.background_task import background_task from testing_support.fixtures import reset_core_stats_engine from testing_support.validators.validate_custom_metrics_outside_transaction import validate_custom_metrics_outside_transaction -from testing_support.fixtures import ( - validate_transaction_metrics, -) - +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics def exercise_logging(logger): logger.debug("A") diff --git a/tests/logger_logging/test_settings.py b/tests/logger_logging/test_settings.py index 2406d87c0..0581e6218 100644 --- a/tests/logger_logging/test_settings.py +++ b/tests/logger_logging/test_settings.py @@ -18,11 +18,8 @@ from newrelic.api.background_task import background_task from testing_support.fixtures import reset_core_stats_engine from testing_support.validators.validate_log_event_count import validate_log_event_count -from testing_support.fixtures import ( - override_application_settings, - validate_transaction_metrics, -) - +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics def basic_logging(logger): logger.warning("C") diff --git a/tests/logger_loguru/conftest.py b/tests/logger_loguru/conftest.py index af632e300..65eaf4ab8 100644 --- a/tests/logger_loguru/conftest.py +++ b/tests/logger_loguru/conftest.py @@ -15,17 +15,8 @@ import logging import pytest -from testing_support.fixtures import ( - code_coverage_fixture, - collector_agent_registration_fixture, - collector_available_fixture, -) - -_coverage_source = [ - "newrelic.hooks.logger_loguru", -] +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { "transaction_tracer.explain_threshold": 0.0, diff --git a/tests/logger_loguru/test_metrics.py b/tests/logger_loguru/test_metrics.py index c8a9299c8..9c02d405e 100644 --- a/tests/logger_loguru/test_metrics.py +++ b/tests/logger_loguru/test_metrics.py @@ -15,9 +15,7 @@ from 
newrelic.api.background_task import background_task from testing_support.fixtures import reset_core_stats_engine from testing_support.validators.validate_custom_metrics_outside_transaction import validate_custom_metrics_outside_transaction -from testing_support.fixtures import ( - validate_transaction_metrics, -) +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics def exercise_logging(logger): diff --git a/tests/logger_loguru/test_settings.py b/tests/logger_loguru/test_settings.py index 4e5dadf5d..43d675d56 100644 --- a/tests/logger_loguru/test_settings.py +++ b/tests/logger_loguru/test_settings.py @@ -19,11 +19,8 @@ from newrelic.api.background_task import background_task from testing_support.fixtures import reset_core_stats_engine from testing_support.validators.validate_log_event_count import validate_log_event_count -from testing_support.fixtures import ( - override_application_settings, - validate_transaction_metrics, -) - +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics def get_metadata_string(log_message, is_txn): host = platform.uname().node diff --git a/tests/messagebroker_confluentkafka/conftest.py b/tests/messagebroker_confluentkafka/conftest.py index a86af3ff9..fa86b6b3c 100644 --- a/tests/messagebroker_confluentkafka/conftest.py +++ b/tests/messagebroker_confluentkafka/conftest.py @@ -17,11 +17,8 @@ import pytest from testing_support.db_settings import kafka_settings -from testing_support.fixtures import ( # noqa: F401, pylint: disable=W0611 - code_coverage_fixture, - collector_agent_registration_fixture, - collector_available_fixture, -) + +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 from newrelic.api.transaction import current_transaction from newrelic.common.object_wrapper import transient_function_wrapper @@ -30,11 +27,6 @@ BROKER = "%s:%s" % (DB_SETTINGS["host"], DB_SETTINGS["port"]) -_coverage_source = [ - "newrelic.hooks.messagebroker_confluentkafka", -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { "transaction_tracer.explain_threshold": 0.0, @@ -92,7 +84,7 @@ def producer(topic, client_type, json_serializer): @pytest.fixture(scope="function") -def consumer(topic, producer, client_type, json_deserializer): +def consumer(group_id, topic, producer, client_type, json_deserializer): from confluent_kafka import Consumer, DeserializingConsumer if client_type == "cimpl": @@ -101,7 +93,7 @@ def consumer(topic, producer, client_type, json_deserializer): "bootstrap.servers": BROKER, "auto.offset.reset": "earliest", "heartbeat.interval.ms": 1000, - "group.id": "test", + "group.id": group_id, } ) elif client_type == "serializer_function": @@ -110,7 +102,7 @@ def consumer(topic, producer, client_type, json_deserializer): "bootstrap.servers": BROKER, "auto.offset.reset": "earliest", "heartbeat.interval.ms": 1000, - "group.id": "test", + "group.id": group_id, "value.deserializer": lambda v, c: json.loads(v.decode("utf-8")), "key.deserializer": lambda v, c: json.loads(v.decode("utf-8")) if v is not None else None, } @@ -121,7 +113,7 @@ def consumer(topic, producer, client_type, json_deserializer): "bootstrap.servers": BROKER, "auto.offset.reset": "earliest", "heartbeat.interval.ms": 1000, - "group.id": "test", + "group.id": group_id, "value.deserializer": json_deserializer, 
"key.deserializer": json_deserializer, } @@ -189,6 +181,11 @@ def topic(): admin.delete_topics(new_topics) +@pytest.fixture(scope="session") +def group_id(): + return str(uuid.uuid4()) + + @pytest.fixture() def send_producer_message(topic, producer, serialize, client_type): callback_called = [] diff --git a/tests/messagebroker_confluentkafka/test_consumer.py b/tests/messagebroker_confluentkafka/test_consumer.py index 61f532a78..31f9478b3 100644 --- a/tests/messagebroker_confluentkafka/test_consumer.py +++ b/tests/messagebroker_confluentkafka/test_consumer.py @@ -14,19 +14,22 @@ import pytest from conftest import cache_kafka_consumer_headers -from testing_support.fixtures import ( - reset_core_stats_engine, - validate_attributes, - validate_error_event_attributes_outside_transaction, - validate_transaction_errors, - validate_transaction_metrics, -) +from testing_support.fixtures import reset_core_stats_engine, validate_attributes from testing_support.validators.validate_distributed_trace_accepted import ( validate_distributed_trace_accepted, ) +from testing_support.validators.validate_error_event_attributes_outside_transaction import ( + validate_error_event_attributes_outside_transaction, +) from testing_support.validators.validate_transaction_count import ( validate_transaction_count, ) +from testing_support.validators.validate_transaction_errors import ( + validate_transaction_errors, +) +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) from newrelic.api.background_task import background_task from newrelic.api.transaction import end_of_transaction @@ -63,6 +66,8 @@ def _test(): def test_custom_metrics_on_existing_transaction(get_consumer_record, topic): + from confluent_kafka import __version__ as version + transaction_name = ( "test_consumer:test_custom_metrics_on_existing_transaction.._test" if six.PY3 else "test_consumer:_test" ) @@ -72,6 +77,7 @@ def test_custom_metrics_on_existing_transaction(get_consumer_record, topic): custom_metrics=[ ("Message/Kafka/Topic/Named/%s/Received/Bytes" % topic, 1), ("Message/Kafka/Topic/Named/%s/Received/Messages" % topic, 1), + ("Python/MessageBroker/Confluent-Kafka/%s" % version, 1), ], background_task=True, ) diff --git a/tests/messagebroker_confluentkafka/test_producer.py b/tests/messagebroker_confluentkafka/test_producer.py index 71b674e80..2b3e74e7a 100644 --- a/tests/messagebroker_confluentkafka/test_producer.py +++ b/tests/messagebroker_confluentkafka/test_producer.py @@ -12,17 +12,20 @@ # See the License for the specific language governing permissions and # limitations under the License. 
+import time import threading import pytest from conftest import cache_kafka_producer_headers -from testing_support.fixtures import ( - validate_transaction_errors, - validate_transaction_metrics, -) from testing_support.validators.validate_messagebroker_headers import ( validate_messagebroker_headers, ) +from testing_support.validators.validate_transaction_errors import ( + validate_transaction_errors, +) +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) from newrelic.api.background_task import background_task from newrelic.common.object_names import callable_name @@ -34,37 +37,68 @@ ) @background_task() def test_produce_arguments(topic, producer, client_type, serialize, headers): - callback_called = threading.Event() + callback1_called = threading.Event() + callback2_called = threading.Event() + ts = int(time.time()) - def producer_callback(err, msg): - callback_called.set() + def producer_callback1(err, msg): + callback1_called.set() + + def producer_callback2(err, msg): + callback2_called.set() if client_type == "cimpl": + # Keyword Args producer.produce( - topic, + topic=topic, value=serialize({"foo": 1}), key=serialize("my-key"), - callback=producer_callback, - partition=1, - timestamp=1, + partition=0, + callback=producer_callback2, + timestamp=ts, headers=headers, ) - else: + # Positional Args producer.produce( topic, + serialize({"foo": 1}), + serialize("my-key"), + 0, + producer_callback1, + None, + ts, + headers, + ) + else: + # Keyword Args + producer.produce( + topic=topic, value=serialize({"foo": 1}), key=serialize("my-key"), - partition=1, - on_delivery=producer_callback, - timestamp=1, + partition=0, + on_delivery=producer_callback2, + timestamp=ts, headers=headers, ) + # Positional Args + producer.produce( + topic, + serialize("my-key"), + serialize({"foo": 1}), + 0, + producer_callback1, + ts, + headers, + ) producer.flush() - assert callback_called.wait(5), "Callback never called." + assert callback1_called.wait(5), "Callback never called." + assert callback2_called.wait(5), "Callback never called." def test_trace_metrics(topic, send_producer_message): + from confluent_kafka import __version__ as version + scoped_metrics = [("MessageBroker/Kafka/Topic/Produce/Named/%s" % topic, 1)] unscoped_metrics = scoped_metrics txn_name = "test_producer:test_trace_metrics..test" if six.PY3 else "test_producer:test" @@ -73,6 +107,7 @@ def test_trace_metrics(topic, send_producer_message): txn_name, scoped_metrics=scoped_metrics, rollup_metrics=unscoped_metrics, + custom_metrics=[("Python/MessageBroker/Confluent-Kafka/%s" % version, 1)], background_task=True, ) @background_task() diff --git a/tests/messagebroker_confluentkafka/test_serialization.py b/tests/messagebroker_confluentkafka/test_serialization.py index 4d948713d..0b8b41d52 100644 --- a/tests/messagebroker_confluentkafka/test_serialization.py +++ b/tests/messagebroker_confluentkafka/test_serialization.py @@ -13,8 +13,10 @@ # limitations under the License. 
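
The expanded test_produce_arguments above exercises both keyword and positional calling conventions because, as the hunks show, the two confluent-kafka clients disagree on ordering and naming: the cimpl Producer takes value before key positionally and accepts callback, while the SerializingProducer takes key before value and only accepts on_delivery. A small hypothetical helper (not part of the test suite) that sidesteps the difference by normalizing to keywords:

    def produce_kwargs(client_type, topic, value, key, callback):
        # Hypothetical sketch: build keyword arguments so the value/key
        # ordering difference between the two clients cannot be mixed up.
        kwargs = {"topic": topic, "value": value, "key": key, "partition": 0}
        if client_type == "cimpl":
            kwargs["callback"] = callback      # cimpl accepts "callback"
        else:
            kwargs["on_delivery"] = callback   # SerializingProducer only accepts "on_delivery"
        return kwargs

    assert "callback" in produce_kwargs("cimpl", "my-topic", b"{}", b"my-key", print)
    assert "on_delivery" in produce_kwargs("serializer", "my-topic", b"{}", b"my-key", print)
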
import pytest -from testing_support.fixtures import ( +from testing_support.validators.validate_transaction_errors import ( validate_transaction_errors, +) +from testing_support.validators.validate_transaction_metrics import ( validate_transaction_metrics, ) diff --git a/tests/messagebroker_kafkapython/conftest.py b/tests/messagebroker_kafkapython/conftest.py index 098486f34..de12f5830 100644 --- a/tests/messagebroker_kafkapython/conftest.py +++ b/tests/messagebroker_kafkapython/conftest.py @@ -18,11 +18,8 @@ import kafka import pytest from testing_support.db_settings import kafka_settings -from testing_support.fixtures import ( # noqa: F401, pylint: disable=W0611 - code_coverage_fixture, - collector_agent_registration_fixture, - collector_available_fixture, -) + +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 from newrelic.api.transaction import current_transaction from newrelic.common.object_wrapper import transient_function_wrapper @@ -32,11 +29,6 @@ BOOTSTRAP_SERVER = "%s:%s" % (DB_SETTINGS["host"], DB_SETTINGS["port"]) BROKER = [BOOTSTRAP_SERVER] -_coverage_source = [ - "newrelic.hooks.messagebroker_kafkapython", -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { "transaction_tracer.explain_threshold": 0.0, @@ -94,7 +86,7 @@ def producer(client_type, json_serializer, json_callable_serializer): @pytest.fixture(scope="function") -def consumer(topic, producer, client_type, json_deserializer, json_callable_deserializer): +def consumer(group_id, topic, producer, client_type, json_deserializer, json_callable_deserializer): if client_type == "no_serializer": consumer = kafka.KafkaConsumer( topic, @@ -102,7 +94,7 @@ def consumer(topic, producer, client_type, json_deserializer, json_callable_dese auto_offset_reset="earliest", consumer_timeout_ms=100, heartbeat_interval_ms=1000, - group_id="test", + group_id=group_id, ) elif client_type == "serializer_function": consumer = kafka.KafkaConsumer( @@ -113,7 +105,7 @@ def consumer(topic, producer, client_type, json_deserializer, json_callable_dese auto_offset_reset="earliest", consumer_timeout_ms=100, heartbeat_interval_ms=1000, - group_id="test", + group_id=group_id, ) elif client_type == "callable_object": consumer = kafka.KafkaConsumer( @@ -124,7 +116,7 @@ def consumer(topic, producer, client_type, json_deserializer, json_callable_dese auto_offset_reset="earliest", consumer_timeout_ms=100, heartbeat_interval_ms=1000, - group_id="test", + group_id=group_id, ) elif client_type == "serializer_object": consumer = kafka.KafkaConsumer( @@ -135,7 +127,7 @@ def consumer(topic, producer, client_type, json_deserializer, json_callable_dese auto_offset_reset="earliest", consumer_timeout_ms=100, heartbeat_interval_ms=1000, - group_id="test", + group_id=group_id, ) yield consumer @@ -210,6 +202,11 @@ def topic(): admin.delete_topics([topic]) +@pytest.fixture(scope="session") +def group_id(): + return str(uuid.uuid4()) + + @pytest.fixture() def send_producer_message(topic, producer, serialize): def _test(): diff --git a/tests/messagebroker_kafkapython/test_consumer.py b/tests/messagebroker_kafkapython/test_consumer.py index f53b2acb3..78ba086c6 100644 --- a/tests/messagebroker_kafkapython/test_consumer.py +++ b/tests/messagebroker_kafkapython/test_consumer.py @@ -14,19 +14,22 @@ import pytest from conftest import cache_kafka_consumer_headers -from testing_support.fixtures import ( - reset_core_stats_engine, - validate_attributes, - 
validate_error_event_attributes_outside_transaction,
-    validate_transaction_errors,
-    validate_transaction_metrics,
-)
+from testing_support.fixtures import reset_core_stats_engine, validate_attributes
 from testing_support.validators.validate_distributed_trace_accepted import (
     validate_distributed_trace_accepted,
 )
+from testing_support.validators.validate_error_event_attributes_outside_transaction import (
+    validate_error_event_attributes_outside_transaction,
+)
 from testing_support.validators.validate_transaction_count import (
     validate_transaction_count,
 )
+from testing_support.validators.validate_transaction_errors import (
+    validate_transaction_errors,
+)
+from testing_support.validators.validate_transaction_metrics import (
+    validate_transaction_metrics,
+)
 from newrelic.api.background_task import background_task
 from newrelic.api.transaction import end_of_transaction
@@ -60,6 +63,8 @@ def _test():
 def test_custom_metrics_on_existing_transaction(get_consumer_record, topic):
+    from kafka.version import __version__ as version
+
     transaction_name = (
         "test_consumer:test_custom_metrics_on_existing_transaction.<locals>._test" if six.PY3 else "test_consumer:_test"
     )
@@ -69,6 +74,7 @@ def test_custom_metrics_on_existing_transaction(get_consumer_record, topic):
         custom_metrics=[
             ("Message/Kafka/Topic/Named/%s/Received/Bytes" % topic, 1),
             ("Message/Kafka/Topic/Named/%s/Received/Messages" % topic, 1),
+            ("Python/MessageBroker/Kafka-Python/%s" % version, 1),
         ],
         background_task=True,
     )
diff --git a/tests/messagebroker_kafkapython/test_producer.py b/tests/messagebroker_kafkapython/test_producer.py
index 927956482..53a31dce5 100644
--- a/tests/messagebroker_kafkapython/test_producer.py
+++ b/tests/messagebroker_kafkapython/test_producer.py
@@ -14,13 +14,15 @@
 import pytest
 from conftest import cache_kafka_producer_headers
-from testing_support.fixtures import (
-    validate_transaction_errors,
-    validate_transaction_metrics,
-)
 from testing_support.validators.validate_messagebroker_headers import (
     validate_messagebroker_headers,
 )
+from testing_support.validators.validate_transaction_errors import (
+    validate_transaction_errors,
+)
+from testing_support.validators.validate_transaction_metrics import (
+    validate_transaction_metrics,
+)
 from newrelic.api.background_task import background_task
 from newrelic.common.object_names import callable_name
@@ -28,6 +30,8 @@
 def test_trace_metrics(topic, send_producer_message):
+    from kafka.version import __version__ as version
+
     scoped_metrics = [("MessageBroker/Kafka/Topic/Produce/Named/%s" % topic, 1)]
     unscoped_metrics = scoped_metrics
     txn_name = "test_producer:test_trace_metrics.<locals>.test" if six.PY3 else "test_producer:test"
@@ -36,6 +40,7 @@ def test_trace_metrics(topic, send_producer_message):
         txn_name,
         scoped_metrics=scoped_metrics,
         rollup_metrics=unscoped_metrics,
+        custom_metrics=[("Python/MessageBroker/Kafka-Python/%s" % version, 1)],
         background_task=True,
     )
     @background_task()
diff --git a/tests/messagebroker_kafkapython/test_serialization.py b/tests/messagebroker_kafkapython/test_serialization.py
index b83b4e85c..0b2bee74d 100644
--- a/tests/messagebroker_kafkapython/test_serialization.py
+++ b/tests/messagebroker_kafkapython/test_serialization.py
@@ -15,10 +15,14 @@
 import json
 import pytest
-from testing_support.fixtures import (
-    reset_core_stats_engine,
+from testing_support.fixtures import reset_core_stats_engine
+from testing_support.validators.validate_error_event_attributes_outside_transaction import (
validate_error_event_attributes_outside_transaction, +) +from testing_support.validators.validate_transaction_errors import ( validate_transaction_errors, +) +from testing_support.validators.validate_transaction_metrics import ( validate_transaction_metrics, ) diff --git a/tests/messagebroker_pika/conftest.py b/tests/messagebroker_pika/conftest.py index 9849ee014..67246f9c5 100644 --- a/tests/messagebroker_pika/conftest.py +++ b/tests/messagebroker_pika/conftest.py @@ -17,11 +17,8 @@ import pika import pytest from testing_support.db_settings import rabbitmq_settings -from testing_support.fixtures import ( # noqa: F401 - code_coverage_fixture, - collector_agent_registration_fixture, - collector_available_fixture, -) + +from testing_support.fixtures import collector_agent_registration_fixture, collector_available_fixture # noqa: F401; pylint: disable=W0611 QUEUE = "test_pika-%s" % uuid.uuid4() QUEUE_2 = "test_pika-%s" % uuid.uuid4() @@ -36,11 +33,6 @@ DB_SETTINGS = rabbitmq_settings()[0] -_coverage_source = [ - "newrelic.hooks.messagebroker_pika", -] - -code_coverage = code_coverage_fixture(source=_coverage_source) _default_settings = { "transaction_tracer.explain_threshold": 0.0, diff --git a/tests/messagebroker_pika/test_cat.py b/tests/messagebroker_pika/test_cat.py index 2b1aac4a7..e6ca848cc 100644 --- a/tests/messagebroker_pika/test_cat.py +++ b/tests/messagebroker_pika/test_cat.py @@ -23,9 +23,8 @@ from testing_support.fixtures import ( cat_enabled, override_application_settings, - validate_transaction_metrics, ) - +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from newrelic.api.background_task import background_task from newrelic.api.transaction import current_transaction diff --git a/tests/messagebroker_pika/test_distributed_tracing.py b/tests/messagebroker_pika/test_distributed_tracing.py index f911aaad8..548d41a0d 100644 --- a/tests/messagebroker_pika/test_distributed_tracing.py +++ b/tests/messagebroker_pika/test_distributed_tracing.py @@ -23,8 +23,8 @@ from newrelic.common.encoding_utils import DistributedTracePayload from testing_support.db_settings import rabbitmq_settings -from testing_support.fixtures import (override_application_settings, - validate_transaction_metrics) +from testing_support.fixtures import override_application_settings +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics DB_SETTINGS = rabbitmq_settings()[0] diff --git a/tests/messagebroker_pika/test_pika_async_connection_consume.py b/tests/messagebroker_pika/test_pika_async_connection_consume.py index 18c999845..4e44c7ed7 100644 --- a/tests/messagebroker_pika/test_pika_async_connection_consume.py +++ b/tests/messagebroker_pika/test_pika_async_connection_consume.py @@ -12,39 +12,57 @@ # See the License for the specific language governing permissions and # limitations under the License. 
-from minversion import pika_version_info
-from compat import basic_consume
 import functools
+
 import pika
-from pika.adapters.tornado_connection import TornadoConnection
 import pytest
 import six
 import tornado
-
-from newrelic.api.background_task import background_task
-
-from conftest import (QUEUE, QUEUE_2, EXCHANGE, EXCHANGE_2, CORRELATION_ID,
-        REPLY_TO, HEADERS, BODY)
-from testing_support.fixtures import (capture_transaction_metrics,
-        validate_transaction_metrics, validate_tt_collector_json,
-        function_not_called, override_application_settings)
-from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics
+from compat import basic_consume
+from conftest import (
+    BODY,
+    CORRELATION_ID,
+    EXCHANGE,
+    EXCHANGE_2,
+    HEADERS,
+    QUEUE,
+    QUEUE_2,
+    REPLY_TO,
+)
+from minversion import pika_version_info
+from pika.adapters.tornado_connection import TornadoConnection
 from testing_support.db_settings import rabbitmq_settings
+from testing_support.fixtures import (
+    capture_transaction_metrics,
+    function_not_called,
+    override_application_settings,
+)
+from testing_support.validators.validate_code_level_metrics import (
+    validate_code_level_metrics,
+)
+from testing_support.validators.validate_transaction_metrics import (
+    validate_transaction_metrics,
+)
+from testing_support.validators.validate_tt_collector_json import (
+    validate_tt_collector_json,
+)
 from newrelic.api.background_task import background_task
 DB_SETTINGS = rabbitmq_settings()[0]
 _message_broker_tt_params = {
-    'queue_name': QUEUE,
-    'routing_key': QUEUE,
-    'correlation_id': CORRELATION_ID,
-    'reply_to': REPLY_TO,
-    'headers': HEADERS.copy(),
+    "queue_name": QUEUE,
+    "routing_key": QUEUE,
+    "correlation_id": CORRELATION_ID,
+    "reply_to": REPLY_TO,
+    "headers": HEADERS.copy(),
 }
 # Tornado's IO loop is not configurable in versions 5.x and up
 try:
+
     class MyIOLoop(tornado.ioloop.IOLoop.configured_class()):
         def handle_callback_exception(self, *args, **kwargs):
             raise
@@ -55,38 +73,44 @@ def handle_callback_exception(self, *args, **kwargs):
 connection_classes = [pika.SelectConnection, TornadoConnection]
-parametrized_connection = pytest.mark.parametrize('ConnectionClass',
-        connection_classes)
+parametrized_connection = pytest.mark.parametrize("ConnectionClass", connection_classes)
 _test_select_conn_basic_get_inside_txn_metrics = [
-    ('MessageBroker/RabbitMQ/Exchange/Produce/Named/%s' % EXCHANGE, None),
-    ('MessageBroker/RabbitMQ/Exchange/Consume/Named/%s' % EXCHANGE, 1),
+    ("MessageBroker/RabbitMQ/Exchange/Produce/Named/%s" % EXCHANGE, None),
+    ("MessageBroker/RabbitMQ/Exchange/Consume/Named/%s" % EXCHANGE, 1),
 ]
 if six.PY3:
     _test_select_conn_basic_get_inside_txn_metrics.append(
-        (('Function/test_pika_async_connection_consume:'
-          'test_async_connection_basic_get_inside_txn.'
-          '<locals>.on_message'), 1))
+        (
+            (
+                "Function/test_pika_async_connection_consume:"
+                "test_async_connection_basic_get_inside_txn."
+                "<locals>.on_message"
+            ),
+            1,
+        )
+    )
 else:
-    _test_select_conn_basic_get_inside_txn_metrics.append(
-        ('Function/test_pika_async_connection_consume:on_message', 1))
+    _test_select_conn_basic_get_inside_txn_metrics.append(("Function/test_pika_async_connection_consume:on_message", 1))
 @parametrized_connection
-@pytest.mark.parametrize('callback_as_partial', [True, False])
-@validate_code_level_metrics("test_pika_async_connection_consume.test_async_connection_basic_get_inside_txn.<locals>", "on_message", py2_namespace="test_pika_async_connection_consume")
+@pytest.mark.parametrize("callback_as_partial", [True, False])
+@validate_code_level_metrics(
+    "test_pika_async_connection_consume" + (".test_async_connection_basic_get_inside_txn.<locals>" if six.PY3 else ""),
+    "on_message",
+)
 @validate_transaction_metrics(
-    ('test_pika_async_connection_consume:'
-            'test_async_connection_basic_get_inside_txn'),
-    scoped_metrics=_test_select_conn_basic_get_inside_txn_metrics,
-    rollup_metrics=_test_select_conn_basic_get_inside_txn_metrics,
-    background_task=True)
+    ("test_pika_async_connection_consume:" "test_async_connection_basic_get_inside_txn"),
+    scoped_metrics=_test_select_conn_basic_get_inside_txn_metrics,
+    rollup_metrics=_test_select_conn_basic_get_inside_txn_metrics,
+    background_task=True,
+)
 @validate_tt_collector_json(message_broker_params=_message_broker_tt_params)
 @background_task()
-def test_async_connection_basic_get_inside_txn(producer, ConnectionClass,
-        callback_as_partial):
+def test_async_connection_basic_get_inside_txn(producer, ConnectionClass, callback_as_partial):
     def on_message(channel, method_frame, header_frame, body):
         assert method_frame
         assert body == BODY
@@ -104,9 +128,7 @@ def on_open_channel(channel):
     def on_open_connection(connection):
         connection.channel(on_open_callback=on_open_channel)
-    connection = ConnectionClass(
-            pika.ConnectionParameters(DB_SETTINGS['host']),
-            on_open_callback=on_open_connection)
+    connection = ConnectionClass(pika.ConnectionParameters(DB_SETTINGS["host"]), on_open_callback=on_open_connection)
     try:
         connection.ioloop.start()
@@ -117,9 +139,8 @@ def on_open_connection(connection):
 @parametrized_connection
-@pytest.mark.parametrize('callback_as_partial', [True, False])
-def test_select_connection_basic_get_outside_txn(producer, ConnectionClass,
-        callback_as_partial):
+@pytest.mark.parametrize("callback_as_partial", [True, False])
+def test_select_connection_basic_get_outside_txn(producer, ConnectionClass, callback_as_partial):
     metrics_list = []
     @capture_transaction_metrics(metrics_list)
@@ -142,8 +163,8 @@ def on_open_connection(connection):
         connection.channel(on_open_callback=on_open_channel)
     connection = ConnectionClass(
-            pika.ConnectionParameters(DB_SETTINGS['host']),
-            on_open_callback=on_open_connection)
+        pika.ConnectionParameters(DB_SETTINGS["host"]), on_open_callback=on_open_connection
+    )
     try:
         connection.ioloop.start()
@@ -160,25 +181,24 @@ def on_open_connection(connection):
 _test_select_conn_basic_get_inside_txn_no_callback_metrics = [
-    ('MessageBroker/RabbitMQ/Exchange/Produce/Named/%s' % EXCHANGE, None),
-    ('MessageBroker/RabbitMQ/Exchange/Consume/Named/%s' % EXCHANGE, None),
+    ("MessageBroker/RabbitMQ/Exchange/Produce/Named/%s" % EXCHANGE, None),
+    ("MessageBroker/RabbitMQ/Exchange/Consume/Named/%s" % EXCHANGE, None),
 ]
 @pytest.mark.skipif(
-    condition=pika_version_info[0] > 0,
-    reason='pika 1.0 removed the ability to use basic_get with callback=None')
+    condition=pika_version_info[0] > 0, reason="pika 1.0 removed the ability to use basic_get
with callback=None" +) @parametrized_connection @validate_transaction_metrics( - ('test_pika_async_connection_consume:' - 'test_async_connection_basic_get_inside_txn_no_callback'), + ("test_pika_async_connection_consume:" "test_async_connection_basic_get_inside_txn_no_callback"), scoped_metrics=_test_select_conn_basic_get_inside_txn_no_callback_metrics, rollup_metrics=_test_select_conn_basic_get_inside_txn_no_callback_metrics, - background_task=True) + background_task=True, +) @validate_tt_collector_json(message_broker_params=_message_broker_tt_params) @background_task() -def test_async_connection_basic_get_inside_txn_no_callback(producer, - ConnectionClass): +def test_async_connection_basic_get_inside_txn_no_callback(producer, ConnectionClass): def on_open_channel(channel): channel.basic_get(callback=None, queue=QUEUE) channel.close() @@ -188,9 +208,7 @@ def on_open_channel(channel): def on_open_connection(connection): connection.channel(on_open_callback=on_open_channel) - connection = ConnectionClass( - pika.ConnectionParameters(DB_SETTINGS['host']), - on_open_callback=on_open_connection) + connection = ConnectionClass(pika.ConnectionParameters(DB_SETTINGS["host"]), on_open_callback=on_open_connection) try: connection.ioloop.start() @@ -201,27 +219,26 @@ def on_open_connection(connection): _test_async_connection_basic_get_empty_metrics = [ - ('MessageBroker/RabbitMQ/Exchange/Produce/Named/%s' % EXCHANGE, None), - ('MessageBroker/RabbitMQ/Exchange/Consume/Named/%s' % EXCHANGE, None), + ("MessageBroker/RabbitMQ/Exchange/Produce/Named/%s" % EXCHANGE, None), + ("MessageBroker/RabbitMQ/Exchange/Consume/Named/%s" % EXCHANGE, None), ] @parametrized_connection -@pytest.mark.parametrize('callback_as_partial', [True, False]) +@pytest.mark.parametrize("callback_as_partial", [True, False]) @validate_transaction_metrics( - ('test_pika_async_connection_consume:' - 'test_async_connection_basic_get_empty'), - scoped_metrics=_test_async_connection_basic_get_empty_metrics, - rollup_metrics=_test_async_connection_basic_get_empty_metrics, - background_task=True) + ("test_pika_async_connection_consume:" "test_async_connection_basic_get_empty"), + scoped_metrics=_test_async_connection_basic_get_empty_metrics, + rollup_metrics=_test_async_connection_basic_get_empty_metrics, + background_task=True, +) @validate_tt_collector_json(message_broker_params=_message_broker_tt_params) @background_task() -def test_async_connection_basic_get_empty(ConnectionClass, - callback_as_partial): - QUEUE = 'test_async_empty' +def test_async_connection_basic_get_empty(ConnectionClass, callback_as_partial): + QUEUE = "test_async_empty" def on_message(channel, method_frame, header_frame, body): - assert False, body.decode('UTF-8') + assert False, body.decode("UTF-8") if callback_as_partial: on_message = functools.partial(on_message) @@ -235,9 +252,7 @@ def on_open_channel(channel): def on_open_connection(connection): connection.channel(on_open_callback=on_open_channel) - connection = ConnectionClass( - pika.ConnectionParameters(DB_SETTINGS['host']), - on_open_callback=on_open_connection) + connection = ConnectionClass(pika.ConnectionParameters(DB_SETTINGS["host"]), on_open_callback=on_open_connection) try: connection.ioloop.start() @@ -248,33 +263,42 @@ def on_open_connection(connection): _test_select_conn_basic_consume_in_txn_metrics = [ - ('MessageBroker/RabbitMQ/Exchange/Produce/Named/%s' % EXCHANGE, None), - ('MessageBroker/RabbitMQ/Exchange/Consume/Named/%s' % EXCHANGE, None), + 
("MessageBroker/RabbitMQ/Exchange/Produce/Named/%s" % EXCHANGE, None), + ("MessageBroker/RabbitMQ/Exchange/Consume/Named/%s" % EXCHANGE, None), ] if six.PY3: _test_select_conn_basic_consume_in_txn_metrics.append( - (('Function/test_pika_async_connection_consume:' - 'test_async_connection_basic_consume_inside_txn.' - '.on_message'), 1)) + ( + ( + "Function/test_pika_async_connection_consume:" + "test_async_connection_basic_consume_inside_txn." + ".on_message" + ), + 1, + ) + ) else: - _test_select_conn_basic_consume_in_txn_metrics.append( - ('Function/test_pika_async_connection_consume:on_message', 1)) + _test_select_conn_basic_consume_in_txn_metrics.append(("Function/test_pika_async_connection_consume:on_message", 1)) @parametrized_connection @validate_transaction_metrics( - ('test_pika_async_connection_consume:' - 'test_async_connection_basic_consume_inside_txn'), - scoped_metrics=_test_select_conn_basic_consume_in_txn_metrics, - rollup_metrics=_test_select_conn_basic_consume_in_txn_metrics, - background_task=True) -@validate_code_level_metrics("test_pika_async_connection_consume.test_async_connection_basic_consume_inside_txn.", "on_message", py2_namespace="test_pika_async_connection_consume") + ("test_pika_async_connection_consume:" "test_async_connection_basic_consume_inside_txn"), + scoped_metrics=_test_select_conn_basic_consume_in_txn_metrics, + rollup_metrics=_test_select_conn_basic_consume_in_txn_metrics, + background_task=True, +) +@validate_code_level_metrics( + "test_pika_async_connection_consume" + + (".test_async_connection_basic_consume_inside_txn." if six.PY3 else ""), + "on_message", +) @validate_tt_collector_json(message_broker_params=_message_broker_tt_params) @background_task() def test_async_connection_basic_consume_inside_txn(producer, ConnectionClass): def on_message(channel, method_frame, header_frame, body): - assert hasattr(method_frame, '_nr_start_time') + assert hasattr(method_frame, "_nr_start_time") assert body == BODY channel.basic_ack(method_frame.delivery_tag) channel.close() @@ -287,9 +311,7 @@ def on_open_channel(channel): def on_open_connection(connection): connection.channel(on_open_callback=on_open_channel) - connection = ConnectionClass( - pika.ConnectionParameters(DB_SETTINGS['host']), - on_open_callback=on_open_connection) + connection = ConnectionClass(pika.ConnectionParameters(DB_SETTINGS["host"]), on_open_callback=on_open_connection) try: connection.ioloop.start() @@ -300,46 +322,67 @@ def on_open_connection(connection): _test_select_conn_basic_consume_two_exchanges = [ - ('MessageBroker/RabbitMQ/Exchange/Produce/Named/%s' % EXCHANGE, None), - ('MessageBroker/RabbitMQ/Exchange/Consume/Named/%s' % EXCHANGE, None), - ('MessageBroker/RabbitMQ/Exchange/Produce/Named/%s' % EXCHANGE_2, None), - ('MessageBroker/RabbitMQ/Exchange/Consume/Named/%s' % EXCHANGE_2, None), + ("MessageBroker/RabbitMQ/Exchange/Produce/Named/%s" % EXCHANGE, None), + ("MessageBroker/RabbitMQ/Exchange/Consume/Named/%s" % EXCHANGE, None), + ("MessageBroker/RabbitMQ/Exchange/Produce/Named/%s" % EXCHANGE_2, None), + ("MessageBroker/RabbitMQ/Exchange/Consume/Named/%s" % EXCHANGE_2, None), ] if six.PY3: _test_select_conn_basic_consume_two_exchanges.append( - (('Function/test_pika_async_connection_consume:' - 'test_async_connection_basic_consume_two_exchanges.' - '.on_message_1'), 1)) + ( + ( + "Function/test_pika_async_connection_consume:" + "test_async_connection_basic_consume_two_exchanges." 
+ ".on_message_1" + ), + 1, + ) + ) _test_select_conn_basic_consume_two_exchanges.append( - (('Function/test_pika_async_connection_consume:' - 'test_async_connection_basic_consume_two_exchanges.' - '.on_message_2'), 1)) + ( + ( + "Function/test_pika_async_connection_consume:" + "test_async_connection_basic_consume_two_exchanges." + ".on_message_2" + ), + 1, + ) + ) else: _test_select_conn_basic_consume_two_exchanges.append( - ('Function/test_pika_async_connection_consume:on_message_1', 1)) + ("Function/test_pika_async_connection_consume:on_message_1", 1) + ) _test_select_conn_basic_consume_two_exchanges.append( - ('Function/test_pika_async_connection_consume:on_message_2', 1)) + ("Function/test_pika_async_connection_consume:on_message_2", 1) + ) @parametrized_connection @validate_transaction_metrics( - ('test_pika_async_connection_consume:' - 'test_async_connection_basic_consume_two_exchanges'), - scoped_metrics=_test_select_conn_basic_consume_two_exchanges, - rollup_metrics=_test_select_conn_basic_consume_two_exchanges, - background_task=True) -@validate_code_level_metrics("test_pika_async_connection_consume.test_async_connection_basic_consume_two_exchanges.", "on_message_1", py2_namespace="test_pika_async_connection_consume") -@validate_code_level_metrics("test_pika_async_connection_consume.test_async_connection_basic_consume_two_exchanges.", "on_message_2", py2_namespace="test_pika_async_connection_consume") + ("test_pika_async_connection_consume:" "test_async_connection_basic_consume_two_exchanges"), + scoped_metrics=_test_select_conn_basic_consume_two_exchanges, + rollup_metrics=_test_select_conn_basic_consume_two_exchanges, + background_task=True, +) +@validate_code_level_metrics( + "test_pika_async_connection_consume" + + (".test_async_connection_basic_consume_two_exchanges." if six.PY3 else ""), + "on_message_1", +) +@validate_code_level_metrics( + "test_pika_async_connection_consume" + + (".test_async_connection_basic_consume_two_exchanges." 
if six.PY3 else ""), + "on_message_2", +) @background_task() -def test_async_connection_basic_consume_two_exchanges(producer, producer_2, - ConnectionClass): +def test_async_connection_basic_consume_two_exchanges(producer, producer_2, ConnectionClass): global events_received events_received = 0 def on_message_1(channel, method_frame, header_frame, body): channel.basic_ack(method_frame.delivery_tag) - assert hasattr(method_frame, '_nr_start_time') + assert hasattr(method_frame, "_nr_start_time") assert body == BODY global events_received @@ -352,7 +395,7 @@ def on_message_1(channel, method_frame, header_frame, body): def on_message_2(channel, method_frame, header_frame, body): channel.basic_ack(method_frame.delivery_tag) - assert hasattr(method_frame, '_nr_start_time') + assert hasattr(method_frame, "_nr_start_time") assert body == BODY global events_received @@ -370,9 +413,7 @@ def on_open_channel(channel): def on_open_connection(connection): connection.channel(on_open_callback=on_open_channel) - connection = ConnectionClass( - pika.ConnectionParameters(DB_SETTINGS['host']), - on_open_callback=on_open_connection) + connection = ConnectionClass(pika.ConnectionParameters(DB_SETTINGS["host"]), on_open_callback=on_open_connection) try: connection.ioloop.start() @@ -383,12 +424,11 @@ def on_open_connection(connection): # This should not create a transaction -@function_not_called('newrelic.core.stats_engine', - 'StatsEngine.record_transaction') -@override_application_settings({'debug.record_transaction_failure': True}) +@function_not_called("newrelic.core.stats_engine", "StatsEngine.record_transaction") +@override_application_settings({"debug.record_transaction_failure": True}) def test_tornado_connection_basic_consume_outside_transaction(producer): def on_message(channel, method_frame, header_frame, body): - assert hasattr(method_frame, '_nr_start_time') + assert hasattr(method_frame, "_nr_start_time") assert body == BODY channel.basic_ack(method_frame.delivery_tag) channel.close() @@ -401,9 +441,7 @@ def on_open_channel(channel): def on_open_connection(connection): connection.channel(on_open_callback=on_open_channel) - connection = TornadoConnection( - pika.ConnectionParameters(DB_SETTINGS['host']), - on_open_callback=on_open_connection) + connection = TornadoConnection(pika.ConnectionParameters(DB_SETTINGS["host"]), on_open_callback=on_open_connection) try: connection.ioloop.start() @@ -414,31 +452,44 @@ def on_open_connection(connection): if six.PY3: - _txn_name = ('test_pika_async_connection_consume:' - 'test_select_connection_basic_consume_outside_transaction.' - '.on_message') + _txn_name = ( + "test_pika_async_connection_consume:" + "test_select_connection_basic_consume_outside_transaction." + ".on_message" + ) _test_select_connection_consume_outside_txn_metrics = [ - (('Function/test_pika_async_connection_consume:' - 'test_select_connection_basic_consume_outside_transaction.' - '.on_message'), None)] + ( + ( + "Function/test_pika_async_connection_consume:" + "test_select_connection_basic_consume_outside_transaction." 
+ ".on_message" + ), + None, + ) + ] else: - _txn_name = ( - 'test_pika_async_connection_consume:on_message') + _txn_name = "test_pika_async_connection_consume:on_message" _test_select_connection_consume_outside_txn_metrics = [ - ('Function/test_pika_async_connection_consume:on_message', None)] + ("Function/test_pika_async_connection_consume:on_message", None) + ] # This should create a transaction @validate_transaction_metrics( - _txn_name, - scoped_metrics=_test_select_connection_consume_outside_txn_metrics, - rollup_metrics=_test_select_connection_consume_outside_txn_metrics, - background_task=True, - group='Message/RabbitMQ/Exchange/%s' % EXCHANGE) -@validate_code_level_metrics("test_pika_async_connection_consume.test_select_connection_basic_consume_outside_transaction.", "on_message", py2_namespace="test_pika_async_connection_consume") + _txn_name, + scoped_metrics=_test_select_connection_consume_outside_txn_metrics, + rollup_metrics=_test_select_connection_consume_outside_txn_metrics, + background_task=True, + group="Message/RabbitMQ/Exchange/%s" % EXCHANGE, +) +@validate_code_level_metrics( + "test_pika_async_connection_consume" + + (".test_select_connection_basic_consume_outside_transaction." if six.PY3 else ""), + "on_message", +) def test_select_connection_basic_consume_outside_transaction(producer): def on_message(channel, method_frame, header_frame, body): - assert hasattr(method_frame, '_nr_start_time') + assert hasattr(method_frame, "_nr_start_time") assert body == BODY channel.basic_ack(method_frame.delivery_tag) channel.close() @@ -452,8 +503,8 @@ def on_open_connection(connection): connection.channel(on_open_callback=on_open_channel) connection = pika.SelectConnection( - pika.ConnectionParameters(DB_SETTINGS['host']), - on_open_callback=on_open_connection) + pika.ConnectionParameters(DB_SETTINGS["host"]), on_open_callback=on_open_connection + ) try: connection.ioloop.start() diff --git a/tests/messagebroker_pika/test_pika_blocking_connection_consume.py b/tests/messagebroker_pika/test_pika_blocking_connection_consume.py index d52fce95a..7b41674a2 100644 --- a/tests/messagebroker_pika/test_pika_blocking_connection_consume.py +++ b/tests/messagebroker_pika/test_pika_blocking_connection_consume.py @@ -12,51 +12,56 @@ # See the License for the specific language governing permissions and # limitations under the License. 
-from compat import basic_consume import functools +import os + import pika import pytest import six -import os +from compat import basic_consume +from conftest import BODY, CORRELATION_ID, EXCHANGE, HEADERS, QUEUE, REPLY_TO +from testing_support.db_settings import rabbitmq_settings +from testing_support.fixtures import capture_transaction_metrics +from testing_support.validators.validate_code_level_metrics import ( + validate_code_level_metrics, +) +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) +from testing_support.validators.validate_tt_collector_json import ( + validate_tt_collector_json, +) from newrelic.api.background_task import background_task from newrelic.api.transaction import end_of_transaction -from conftest import QUEUE, EXCHANGE, CORRELATION_ID, REPLY_TO, HEADERS, BODY -from testing_support.fixtures import (capture_transaction_metrics, - validate_transaction_metrics, validate_tt_collector_json) -from testing_support.validators.validate_code_level_metrics import validate_code_level_metrics -from testing_support.db_settings import rabbitmq_settings - DB_SETTINGS = rabbitmq_settings()[0] _message_broker_tt_params = { - 'queue_name': QUEUE, - 'routing_key': QUEUE, - 'correlation_id': CORRELATION_ID, - 'reply_to': REPLY_TO, - 'headers': HEADERS.copy(), + "queue_name": QUEUE, + "routing_key": QUEUE, + "correlation_id": CORRELATION_ID, + "reply_to": REPLY_TO, + "headers": HEADERS.copy(), } _test_blocking_connection_basic_get_metrics = [ - ('MessageBroker/RabbitMQ/Exchange/Produce/Named/%s' % EXCHANGE, None), - ('MessageBroker/RabbitMQ/Exchange/Consume/Named/%s' % EXCHANGE, 1), - (('Function/pika.adapters.blocking_connection:' - '_CallbackResult.set_value_once'), 1) + ("MessageBroker/RabbitMQ/Exchange/Produce/Named/%s" % EXCHANGE, None), + ("MessageBroker/RabbitMQ/Exchange/Consume/Named/%s" % EXCHANGE, 1), + (("Function/pika.adapters.blocking_connection:" "_CallbackResult.set_value_once"), 1), ] @validate_transaction_metrics( - ('test_pika_blocking_connection_consume:' - 'test_blocking_connection_basic_get'), - scoped_metrics=_test_blocking_connection_basic_get_metrics, - rollup_metrics=_test_blocking_connection_basic_get_metrics, - background_task=True) + ("test_pika_blocking_connection_consume:" "test_blocking_connection_basic_get"), + scoped_metrics=_test_blocking_connection_basic_get_metrics, + rollup_metrics=_test_blocking_connection_basic_get_metrics, + background_task=True, +) @validate_tt_collector_json(message_broker_params=_message_broker_tt_params) @background_task() def test_blocking_connection_basic_get(producer): - with pika.BlockingConnection( - pika.ConnectionParameters(DB_SETTINGS['host'])) as connection: + with pika.BlockingConnection(pika.ConnectionParameters(DB_SETTINGS["host"])) as connection: channel = connection.channel() method_frame, _, _ = channel.basic_get(QUEUE) assert method_frame @@ -64,23 +69,22 @@ def test_blocking_connection_basic_get(producer): _test_blocking_connection_basic_get_empty_metrics = [ - ('MessageBroker/RabbitMQ/Exchange/Produce/Named/%s' % EXCHANGE, None), - ('MessageBroker/RabbitMQ/Exchange/Consume/Named/%s' % EXCHANGE, None), + ("MessageBroker/RabbitMQ/Exchange/Produce/Named/%s" % EXCHANGE, None), + ("MessageBroker/RabbitMQ/Exchange/Consume/Named/%s" % EXCHANGE, None), ] @validate_transaction_metrics( - ('test_pika_blocking_connection_consume:' - 'test_blocking_connection_basic_get_empty'), - scoped_metrics=_test_blocking_connection_basic_get_empty_metrics, - 
rollup_metrics=_test_blocking_connection_basic_get_empty_metrics,
-    background_task=True)
+    ("test_pika_blocking_connection_consume:" "test_blocking_connection_basic_get_empty"),
+    scoped_metrics=_test_blocking_connection_basic_get_empty_metrics,
+    rollup_metrics=_test_blocking_connection_basic_get_empty_metrics,
+    background_task=True,
+)
 @validate_tt_collector_json(message_broker_params=_message_broker_tt_params)
 @background_task()
 def test_blocking_connection_basic_get_empty():
-    QUEUE = 'test_blocking_empty-%s' % os.getpid()
-    with pika.BlockingConnection(
-            pika.ConnectionParameters(DB_SETTINGS['host'])) as connection:
+    QUEUE = "test_blocking_empty-%s" % os.getpid()
+    with pika.BlockingConnection(pika.ConnectionParameters(DB_SETTINGS["host"])) as connection:
         channel = connection.channel()
         channel.queue_declare(queue=QUEUE)
@@ -96,8 +100,7 @@ def test_blocking_connection_basic_get_outside_transaction(producer):
     @capture_transaction_metrics(metrics_list)
     def test_basic_get():
-        with pika.BlockingConnection(
-                pika.ConnectionParameters(DB_SETTINGS['host'])) as connection:
+        with pika.BlockingConnection(pika.ConnectionParameters(DB_SETTINGS["host"])) as connection:
             channel = connection.channel()
             channel.queue_declare(queue=QUEUE)
@@ -113,46 +116,57 @@ def test_basic_get():
 _test_blocking_conn_basic_consume_no_txn_metrics = [
-    ('MessageBroker/RabbitMQ/Exchange/Produce/Named/%s' % EXCHANGE, None),
-    ('MessageBroker/RabbitMQ/Exchange/Consume/Named/%s' % EXCHANGE, None),
+    ("MessageBroker/RabbitMQ/Exchange/Produce/Named/%s" % EXCHANGE, None),
+    ("MessageBroker/RabbitMQ/Exchange/Consume/Named/%s" % EXCHANGE, None),
 ]
 if six.PY3:
-    _txn_name = ('test_pika_blocking_connection_consume:'
-            'test_blocking_connection_basic_consume_outside_transaction.'
-            '<locals>.on_message')
+    _txn_name = (
+        "test_pika_blocking_connection_consume:"
+        "test_blocking_connection_basic_consume_outside_transaction."
+        "<locals>.on_message"
+    )
     _test_blocking_conn_basic_consume_no_txn_metrics.append(
-        (('Function/test_pika_blocking_connection_consume:'
-          'test_blocking_connection_basic_consume_outside_transaction.'
-          '<locals>.on_message'), None))
+        (
+            (
+                "Function/test_pika_blocking_connection_consume:"
+                "test_blocking_connection_basic_consume_outside_transaction."
+                "<locals>.on_message"
+            ),
+            None,
+        )
+    )
 else:
-    _txn_name = ('test_pika_blocking_connection_consume:'
-            'on_message')
+    _txn_name = "test_pika_blocking_connection_consume:" "on_message"
     _test_blocking_conn_basic_consume_no_txn_metrics.append(
-        ('Function/test_pika_blocking_connection_consume:on_message', None))
+        ("Function/test_pika_blocking_connection_consume:on_message", None)
+    )
-@pytest.mark.parametrize('as_partial', [True, False])
-@validate_code_level_metrics("test_pika_blocking_connection_consume.test_blocking_connection_basic_consume_outside_transaction.<locals>", "on_message", py2_namespace="test_pika_blocking_connection_consume")
+@pytest.mark.parametrize("as_partial", [True, False])
+@validate_code_level_metrics(
+    "test_pika_blocking_connection_consume"
+    + (".test_blocking_connection_basic_consume_outside_transaction.<locals>" if six.PY3 else ""),
+    "on_message",
+)
 @validate_transaction_metrics(
-        _txn_name,
-        scoped_metrics=_test_blocking_conn_basic_consume_no_txn_metrics,
-        rollup_metrics=_test_blocking_conn_basic_consume_no_txn_metrics,
-        background_task=True,
-        group='Message/RabbitMQ/Exchange/%s' % EXCHANGE)
+    _txn_name,
+    scoped_metrics=_test_blocking_conn_basic_consume_no_txn_metrics,
+    rollup_metrics=_test_blocking_conn_basic_consume_no_txn_metrics,
+    background_task=True,
+    group="Message/RabbitMQ/Exchange/%s" % EXCHANGE,
+)
 @validate_tt_collector_json(message_broker_params=_message_broker_tt_params)
-def test_blocking_connection_basic_consume_outside_transaction(producer,
-        as_partial):
+def test_blocking_connection_basic_consume_outside_transaction(producer, as_partial):
     def on_message(channel, method_frame, header_frame, body):
-        assert hasattr(method_frame, '_nr_start_time')
+        assert hasattr(method_frame, "_nr_start_time")
         assert body == BODY
         channel.stop_consuming()
     if as_partial:
         on_message = functools.partial(on_message)
-    with pika.BlockingConnection(
-            pika.ConnectionParameters(DB_SETTINGS['host'])) as connection:
+    with pika.BlockingConnection(pika.ConnectionParameters(DB_SETTINGS["host"])) as connection:
         channel = connection.channel()
         basic_consume(channel, QUEUE, on_message)
@@ -164,41 +178,51 @@ def on_message(channel, method_frame, header_frame, body):
 _test_blocking_conn_basic_consume_in_txn_metrics = [
-    ('MessageBroker/RabbitMQ/Exchange/Produce/Named/%s' % EXCHANGE, None),
-    ('MessageBroker/RabbitMQ/Exchange/Consume/Named/%s' % EXCHANGE, None),
+    ("MessageBroker/RabbitMQ/Exchange/Produce/Named/%s" % EXCHANGE, None),
+    ("MessageBroker/RabbitMQ/Exchange/Consume/Named/%s" % EXCHANGE, None),
 ]
 if six.PY3:
     _test_blocking_conn_basic_consume_in_txn_metrics.append(
-        (('Function/test_pika_blocking_connection_consume:'
-          'test_blocking_connection_basic_consume_inside_txn.'
-          '<locals>.on_message'), 1))
+        (
+            (
+                "Function/test_pika_blocking_connection_consume:"
+                "test_blocking_connection_basic_consume_inside_txn."
+                "<locals>.on_message"
+            ),
+            1,
+        )
+    )
 else:
     _test_blocking_conn_basic_consume_in_txn_metrics.append(
-        ('Function/test_pika_blocking_connection_consume:on_message', 1))
+        ("Function/test_pika_blocking_connection_consume:on_message", 1)
+    )
-@pytest.mark.parametrize('as_partial', [True, False])
-@validate_code_level_metrics("test_pika_blocking_connection_consume.test_blocking_connection_basic_consume_inside_txn.<locals>", "on_message", py2_namespace="test_pika_blocking_connection_consume")
+@pytest.mark.parametrize("as_partial", [True, False])
+@validate_code_level_metrics(
+    "test_pika_blocking_connection_consume"
+    + (".test_blocking_connection_basic_consume_inside_txn.<locals>" if six.PY3 else ""),
+    "on_message",
+)
 @validate_transaction_metrics(
-    ('test_pika_blocking_connection_consume:'
-            'test_blocking_connection_basic_consume_inside_txn'),
-    scoped_metrics=_test_blocking_conn_basic_consume_in_txn_metrics,
-    rollup_metrics=_test_blocking_conn_basic_consume_in_txn_metrics,
-    background_task=True)
+    ("test_pika_blocking_connection_consume:" "test_blocking_connection_basic_consume_inside_txn"),
+    scoped_metrics=_test_blocking_conn_basic_consume_in_txn_metrics,
+    rollup_metrics=_test_blocking_conn_basic_consume_in_txn_metrics,
+    background_task=True,
+)
 @validate_tt_collector_json(message_broker_params=_message_broker_tt_params)
 @background_task()
 def test_blocking_connection_basic_consume_inside_txn(producer, as_partial):
     def on_message(channel, method_frame, header_frame, body):
-        assert hasattr(method_frame, '_nr_start_time')
+        assert hasattr(method_frame, "_nr_start_time")
         assert body == BODY
         channel.stop_consuming()
     if as_partial:
         on_message = functools.partial(on_message)
-    with pika.BlockingConnection(
-            pika.ConnectionParameters(DB_SETTINGS['host'])) as connection:
+    with pika.BlockingConnection(pika.ConnectionParameters(DB_SETTINGS["host"])) as connection:
         channel = connection.channel()
         basic_consume(channel, QUEUE, on_message)
         try:
@@ -209,33 +233,40 @@ def on_message(channel, method_frame, header_frame, body):
 _test_blocking_conn_basic_consume_stopped_txn_metrics = [
-    ('MessageBroker/RabbitMQ/Exchange/Produce/Named/%s' % EXCHANGE, None),
-    ('MessageBroker/RabbitMQ/Exchange/Consume/Named/%s' % EXCHANGE, None),
-    ('OtherTransaction/Message/RabbitMQ/Exchange/Named/%s' % EXCHANGE, None),
+    ("MessageBroker/RabbitMQ/Exchange/Produce/Named/%s" % EXCHANGE, None),
+    ("MessageBroker/RabbitMQ/Exchange/Consume/Named/%s" % EXCHANGE, None),
+    ("OtherTransaction/Message/RabbitMQ/Exchange/Named/%s" % EXCHANGE, None),
 ]
 if six.PY3:
     _test_blocking_conn_basic_consume_stopped_txn_metrics.append(
-        (('Function/test_pika_blocking_connection_consume:'
-          'test_blocking_connection_basic_consume_stopped_txn.'
-          '<locals>.on_message'), None))
+        (
+            (
+                "Function/test_pika_blocking_connection_consume:"
+                "test_blocking_connection_basic_consume_stopped_txn."
+                "<locals>.on_message"
+            ),
+            None,
+        )
+    )
 else:
     _test_blocking_conn_basic_consume_stopped_txn_metrics.append(
-        ('Function/test_pika_blocking_connection_consume:on_message', None))
+        ("Function/test_pika_blocking_connection_consume:on_message", None)
+    )
-@pytest.mark.parametrize('as_partial', [True, False])
+@pytest.mark.parametrize("as_partial", [True, False])
 @validate_transaction_metrics(
-    ('test_pika_blocking_connection_consume:'
-            'test_blocking_connection_basic_consume_stopped_txn'),
-    scoped_metrics=_test_blocking_conn_basic_consume_stopped_txn_metrics,
-    rollup_metrics=_test_blocking_conn_basic_consume_stopped_txn_metrics,
-    background_task=True)
+    ("test_pika_blocking_connection_consume:" "test_blocking_connection_basic_consume_stopped_txn"),
+    scoped_metrics=_test_blocking_conn_basic_consume_stopped_txn_metrics,
+    rollup_metrics=_test_blocking_conn_basic_consume_stopped_txn_metrics,
+    background_task=True,
+)
 @validate_tt_collector_json(message_broker_params=_message_broker_tt_params)
 @background_task()
 def test_blocking_connection_basic_consume_stopped_txn(producer, as_partial):
     def on_message(channel, method_frame, header_frame, body):
-        assert hasattr(method_frame, '_nr_start_time')
+        assert hasattr(method_frame, "_nr_start_time")
         assert body == BODY
         channel.stop_consuming()
@@ -244,8 +275,7 @@ def on_message(channel, method_frame, header_frame, body):
     if as_partial:
         on_message = functools.partial(on_message)
-    with pika.BlockingConnection(
-            pika.ConnectionParameters(DB_SETTINGS['host'])) as connection:
+    with pika.BlockingConnection(pika.ConnectionParameters(DB_SETTINGS["host"])) as connection:
         channel = connection.channel()
         basic_consume(channel, QUEUE, on_message)
         try:
diff --git a/tests/messagebroker_pika/test_pika_blocking_connection_consume_generator.py b/tests/messagebroker_pika/test_pika_blocking_connection_consume_generator.py
index a9ee1b331..816b28323 100644
--- a/tests/messagebroker_pika/test_pika_blocking_connection_consume_generator.py
+++ b/tests/messagebroker_pika/test_pika_blocking_connection_consume_generator.py
@@ -13,65 +13,66 @@
 # limitations under the License.
import pika +from conftest import BODY, CORRELATION_ID, EXCHANGE, HEADERS, QUEUE, REPLY_TO +from testing_support.db_settings import rabbitmq_settings +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) +from testing_support.validators.validate_tt_collector_json import ( + validate_tt_collector_json, +) from newrelic.api.background_task import background_task -from conftest import QUEUE, EXCHANGE, CORRELATION_ID, REPLY_TO, HEADERS, BODY -from testing_support.fixtures import (validate_transaction_metrics, - validate_tt_collector_json) -from testing_support.db_settings import rabbitmq_settings - DB_SETTINGS = rabbitmq_settings()[0] _message_broker_tt_params = { - 'queue_name': QUEUE, - 'routing_key': QUEUE, - 'correlation_id': CORRELATION_ID, - 'reply_to': REPLY_TO, - 'headers': HEADERS.copy(), + "queue_name": QUEUE, + "routing_key": QUEUE, + "correlation_id": CORRELATION_ID, + "reply_to": REPLY_TO, + "headers": HEADERS.copy(), } _test_blocking_connection_consume_metrics = [ - ('MessageBroker/RabbitMQ/Exchange/Produce/Named/%s' % EXCHANGE, None), - ('MessageBroker/RabbitMQ/Exchange/Consume/Named/%s' % EXCHANGE, None), - ('MessageBroker/RabbitMQ/Exchange/Consume/Named/Unknown', None), + ("MessageBroker/RabbitMQ/Exchange/Produce/Named/%s" % EXCHANGE, None), + ("MessageBroker/RabbitMQ/Exchange/Consume/Named/%s" % EXCHANGE, None), + ("MessageBroker/RabbitMQ/Exchange/Consume/Named/Unknown", None), ] @validate_transaction_metrics( - ('test_pika_blocking_connection_consume_generator:' - 'test_blocking_connection_consume_break'), - scoped_metrics=_test_blocking_connection_consume_metrics, - rollup_metrics=_test_blocking_connection_consume_metrics, - background_task=True) + ("test_pika_blocking_connection_consume_generator:" "test_blocking_connection_consume_break"), + scoped_metrics=_test_blocking_connection_consume_metrics, + rollup_metrics=_test_blocking_connection_consume_metrics, + background_task=True, +) @validate_tt_collector_json(message_broker_params=_message_broker_tt_params) @background_task() def test_blocking_connection_consume_break(producer): - with pika.BlockingConnection( - pika.ConnectionParameters(DB_SETTINGS['host'])) as connection: + with pika.BlockingConnection(pika.ConnectionParameters(DB_SETTINGS["host"])) as connection: channel = connection.channel() for method_frame, properties, body in channel.consume(QUEUE): - assert hasattr(method_frame, '_nr_start_time') + assert hasattr(method_frame, "_nr_start_time") assert body == BODY break @validate_transaction_metrics( - ('test_pika_blocking_connection_consume_generator:' - 'test_blocking_connection_consume_connection_close'), - scoped_metrics=_test_blocking_connection_consume_metrics, - rollup_metrics=_test_blocking_connection_consume_metrics, - background_task=True) + ("test_pika_blocking_connection_consume_generator:" "test_blocking_connection_consume_connection_close"), + scoped_metrics=_test_blocking_connection_consume_metrics, + rollup_metrics=_test_blocking_connection_consume_metrics, + background_task=True, +) @validate_tt_collector_json(message_broker_params=_message_broker_tt_params) @background_task() def test_blocking_connection_consume_connection_close(producer): - connection = pika.BlockingConnection( - pika.ConnectionParameters(DB_SETTINGS['host'])) + connection = pika.BlockingConnection(pika.ConnectionParameters(DB_SETTINGS["host"])) channel = connection.channel() try: for method_frame, properties, body in channel.consume(QUEUE): - assert hasattr(method_frame, 
'_nr_start_time') + assert hasattr(method_frame, "_nr_start_time") assert body == BODY channel.close() connection.close() @@ -82,16 +83,15 @@ def test_blocking_connection_consume_connection_close(producer): @validate_transaction_metrics( - ('test_pika_blocking_connection_consume_generator:' - 'test_blocking_connection_consume_timeout'), - scoped_metrics=_test_blocking_connection_consume_metrics, - rollup_metrics=_test_blocking_connection_consume_metrics, - background_task=True) + ("test_pika_blocking_connection_consume_generator:" "test_blocking_connection_consume_timeout"), + scoped_metrics=_test_blocking_connection_consume_metrics, + rollup_metrics=_test_blocking_connection_consume_metrics, + background_task=True, +) @validate_tt_collector_json(message_broker_params=_message_broker_tt_params) @background_task() def test_blocking_connection_consume_timeout(producer): - with pika.BlockingConnection( - pika.ConnectionParameters(DB_SETTINGS['host'])) as connection: + with pika.BlockingConnection(pika.ConnectionParameters(DB_SETTINGS["host"])) as connection: channel = connection.channel() for result in channel.consume(QUEUE, inactivity_timeout=0.01): @@ -99,7 +99,7 @@ def test_blocking_connection_consume_timeout(producer): if result and any(result): method_frame, properties, body = result channel.basic_ack(method_frame.delivery_tag) - assert hasattr(method_frame, '_nr_start_time') + assert hasattr(method_frame, "_nr_start_time") assert body == BODY else: # timeout hit! @@ -107,16 +107,15 @@ def test_blocking_connection_consume_timeout(producer): @validate_transaction_metrics( - ('test_pika_blocking_connection_consume_generator:' - 'test_blocking_connection_consume_exception_in_for_loop'), - scoped_metrics=_test_blocking_connection_consume_metrics, - rollup_metrics=_test_blocking_connection_consume_metrics, - background_task=True) + ("test_pika_blocking_connection_consume_generator:" "test_blocking_connection_consume_exception_in_for_loop"), + scoped_metrics=_test_blocking_connection_consume_metrics, + rollup_metrics=_test_blocking_connection_consume_metrics, + background_task=True, +) @validate_tt_collector_json(message_broker_params=_message_broker_tt_params) @background_task() def test_blocking_connection_consume_exception_in_for_loop(producer): - with pika.BlockingConnection( - pika.ConnectionParameters(DB_SETTINGS['host'])) as connection: + with pika.BlockingConnection(pika.ConnectionParameters(DB_SETTINGS["host"])) as connection: channel = connection.channel() try: @@ -128,29 +127,28 @@ def test_blocking_connection_consume_exception_in_for_loop(producer): # Expected error pass except Exception as e: - assert False, 'Wrong exception was raised: %s' % e + assert False, "Wrong exception was raised: %s" % e else: - assert False, 'No exception was raised!' + assert False, "No exception was raised!" 
_test_blocking_connection_consume_empty_metrics = [ - ('MessageBroker/RabbitMQ/Exchange/Produce/Named/%s' % EXCHANGE, None), - ('MessageBroker/RabbitMQ/Exchange/Consume/Named/%s' % EXCHANGE, None), - ('MessageBroker/RabbitMQ/Exchange/Consume/Named/Unknown', None), + ("MessageBroker/RabbitMQ/Exchange/Produce/Named/%s" % EXCHANGE, None), + ("MessageBroker/RabbitMQ/Exchange/Consume/Named/%s" % EXCHANGE, None), + ("MessageBroker/RabbitMQ/Exchange/Consume/Named/Unknown", None), ] @validate_transaction_metrics( - ('test_pika_blocking_connection_consume_generator:' - 'test_blocking_connection_consume_exception_in_generator'), - scoped_metrics=_test_blocking_connection_consume_empty_metrics, - rollup_metrics=_test_blocking_connection_consume_empty_metrics, - background_task=True) + ("test_pika_blocking_connection_consume_generator:" "test_blocking_connection_consume_exception_in_generator"), + scoped_metrics=_test_blocking_connection_consume_empty_metrics, + rollup_metrics=_test_blocking_connection_consume_empty_metrics, + background_task=True, +) @validate_tt_collector_json(message_broker_params=_message_broker_tt_params) @background_task() def test_blocking_connection_consume_exception_in_generator(): - with pika.BlockingConnection( - pika.ConnectionParameters(DB_SETTINGS['host'])) as connection: + with pika.BlockingConnection(pika.ConnectionParameters(DB_SETTINGS["host"])) as connection: channel = connection.channel() try: @@ -161,29 +159,28 @@ def test_blocking_connection_consume_exception_in_generator(): # Expected error pass except Exception as e: - assert False, 'Wrong exception was raised: %s' % e + assert False, "Wrong exception was raised: %s" % e else: - assert False, 'No exception was raised!' + assert False, "No exception was raised!" _test_blocking_connection_consume_many_metrics = [ - ('MessageBroker/RabbitMQ/Exchange/Produce/Named/%s' % EXCHANGE, None), - ('MessageBroker/RabbitMQ/Exchange/Consume/Named/%s' % EXCHANGE, None), - ('MessageBroker/RabbitMQ/Exchange/Consume/Named/Unknown', None), + ("MessageBroker/RabbitMQ/Exchange/Produce/Named/%s" % EXCHANGE, None), + ("MessageBroker/RabbitMQ/Exchange/Consume/Named/%s" % EXCHANGE, None), + ("MessageBroker/RabbitMQ/Exchange/Consume/Named/Unknown", None), ] @validate_transaction_metrics( - ('test_pika_blocking_connection_consume_generator:' - 'test_blocking_connection_consume_many'), - scoped_metrics=_test_blocking_connection_consume_many_metrics, - rollup_metrics=_test_blocking_connection_consume_many_metrics, - background_task=True) + ("test_pika_blocking_connection_consume_generator:" "test_blocking_connection_consume_many"), + scoped_metrics=_test_blocking_connection_consume_many_metrics, + rollup_metrics=_test_blocking_connection_consume_many_metrics, + background_task=True, +) @validate_tt_collector_json(message_broker_params=_message_broker_tt_params) @background_task() def test_blocking_connection_consume_many(produce_five): - with pika.BlockingConnection( - pika.ConnectionParameters(DB_SETTINGS['host'])) as connection: + with pika.BlockingConnection(pika.ConnectionParameters(DB_SETTINGS["host"])) as connection: channel = connection.channel() consumed = 0 @@ -196,22 +193,21 @@ def test_blocking_connection_consume_many(produce_five): @validate_transaction_metrics( - ('test_pika_blocking_connection_consume_generator:' - 'test_blocking_connection_consume_using_methods'), - scoped_metrics=_test_blocking_connection_consume_metrics, - rollup_metrics=_test_blocking_connection_consume_metrics, - background_task=True) + 
("test_pika_blocking_connection_consume_generator:" "test_blocking_connection_consume_using_methods"), + scoped_metrics=_test_blocking_connection_consume_metrics, + rollup_metrics=_test_blocking_connection_consume_metrics, + background_task=True, +) @validate_tt_collector_json(message_broker_params=_message_broker_tt_params) @background_task() def test_blocking_connection_consume_using_methods(producer): - with pika.BlockingConnection( - pika.ConnectionParameters(DB_SETTINGS['host'])) as connection: + with pika.BlockingConnection(pika.ConnectionParameters(DB_SETTINGS["host"])) as connection: channel = connection.channel() consumer = channel.consume(QUEUE, inactivity_timeout=0.01) method, properties, body = next(consumer) - assert hasattr(method, '_nr_start_time') + assert hasattr(method, "_nr_start_time") assert body == BODY result = next(consumer) @@ -224,28 +220,28 @@ def test_blocking_connection_consume_using_methods(producer): pass else: # this is not - assert False, 'No exception was raised!' + assert False, "No exception was raised!" result = consumer.close() assert result is None @validate_transaction_metrics( - 'Named/%s' % EXCHANGE, - scoped_metrics=_test_blocking_connection_consume_metrics, - rollup_metrics=_test_blocking_connection_consume_metrics, - background_task=True, - group='Message/RabbitMQ/Exchange') + "Named/%s" % EXCHANGE, + scoped_metrics=_test_blocking_connection_consume_metrics, + rollup_metrics=_test_blocking_connection_consume_metrics, + background_task=True, + group="Message/RabbitMQ/Exchange", +) @validate_tt_collector_json(message_broker_params=_message_broker_tt_params) def test_blocking_connection_consume_outside_txn(producer): - with pika.BlockingConnection( - pika.ConnectionParameters(DB_SETTINGS['host'])) as connection: + with pika.BlockingConnection(pika.ConnectionParameters(DB_SETTINGS["host"])) as connection: channel = connection.channel() consumer = channel.consume(QUEUE) try: for method_frame, properties, body in consumer: - assert hasattr(method_frame, '_nr_start_time') + assert hasattr(method_frame, "_nr_start_time") assert body == BODY break finally: @@ -254,26 +250,24 @@ def test_blocking_connection_consume_outside_txn(producer): def test_blocking_connection_consume_many_outside_txn(produce_five): - @validate_transaction_metrics( - 'Named/%s' % EXCHANGE, - scoped_metrics=_test_blocking_connection_consume_metrics, - rollup_metrics=_test_blocking_connection_consume_metrics, - background_task=True, - group='Message/RabbitMQ/Exchange') - @validate_tt_collector_json( - message_broker_params=_message_broker_tt_params) + "Named/%s" % EXCHANGE, + scoped_metrics=_test_blocking_connection_consume_metrics, + rollup_metrics=_test_blocking_connection_consume_metrics, + background_task=True, + group="Message/RabbitMQ/Exchange", + ) + @validate_tt_collector_json(message_broker_params=_message_broker_tt_params) def consume_it(consumer, up_next=None): if up_next is None: method_frame, properties, body = next(consumer) else: method_frame, properties, body = up_next - assert hasattr(method_frame, '_nr_start_time') + assert hasattr(method_frame, "_nr_start_time") assert body == BODY return next(consumer) - with pika.BlockingConnection( - pika.ConnectionParameters(DB_SETTINGS['host'])) as connection: + with pika.BlockingConnection(pika.ConnectionParameters(DB_SETTINGS["host"])) as connection: channel = connection.channel() consumer = channel.consume(QUEUE) @@ -288,21 +282,21 @@ def consume_it(consumer, up_next=None): @validate_transaction_metrics( - 'Named/%s' % 
EXCHANGE, - scoped_metrics=_test_blocking_connection_consume_metrics, - rollup_metrics=_test_blocking_connection_consume_metrics, - background_task=True, - group='Message/RabbitMQ/Exchange') + "Named/%s" % EXCHANGE, + scoped_metrics=_test_blocking_connection_consume_metrics, + rollup_metrics=_test_blocking_connection_consume_metrics, + background_task=True, + group="Message/RabbitMQ/Exchange", +) @validate_tt_collector_json(message_broker_params=_message_broker_tt_params) def test_blocking_connection_consume_using_methods_outside_txn(producer): - with pika.BlockingConnection( - pika.ConnectionParameters(DB_SETTINGS['host'])) as connection: + with pika.BlockingConnection(pika.ConnectionParameters(DB_SETTINGS["host"])) as connection: channel = connection.channel() consumer = channel.consume(QUEUE, inactivity_timeout=0.01) method, properties, body = next(consumer) - assert hasattr(method, '_nr_start_time') + assert hasattr(method, "_nr_start_time") assert body == BODY result = next(consumer) @@ -315,22 +309,21 @@ def test_blocking_connection_consume_using_methods_outside_txn(producer): pass else: # this is not - assert False, 'No exception was raised!' + assert False, "No exception was raised!" result = consumer.close() assert result is None @validate_transaction_metrics( - ('test_pika_blocking_connection_consume_generator:' - 'test_blocking_connection_consume_exception_on_creation'), - scoped_metrics=_test_blocking_connection_consume_empty_metrics, - rollup_metrics=_test_blocking_connection_consume_empty_metrics, - background_task=True) + ("test_pika_blocking_connection_consume_generator:" "test_blocking_connection_consume_exception_on_creation"), + scoped_metrics=_test_blocking_connection_consume_empty_metrics, + rollup_metrics=_test_blocking_connection_consume_empty_metrics, + background_task=True, +) @background_task() def test_blocking_connection_consume_exception_on_creation(): - with pika.BlockingConnection( - pika.ConnectionParameters(DB_SETTINGS['host'])) as connection: + with pika.BlockingConnection(pika.ConnectionParameters(DB_SETTINGS["host"])) as connection: channel = connection.channel() try: @@ -340,4 +333,4 @@ def test_blocking_connection_consume_exception_on_creation(): pass else: # this is not - assert False, 'TypeError was not raised' + assert False, "TypeError was not raised" diff --git a/tests/messagebroker_pika/test_pika_produce.py b/tests/messagebroker_pika/test_pika_produce.py index 60e6526e4..dbc9af030 100644 --- a/tests/messagebroker_pika/test_pika_produce.py +++ b/tests/messagebroker_pika/test_pika_produce.py @@ -15,14 +15,16 @@ import pika import pytest from testing_support.db_settings import rabbitmq_settings -from testing_support.fixtures import ( - override_application_settings, - validate_transaction_metrics, - validate_tt_collector_json, -) +from testing_support.fixtures import override_application_settings from testing_support.validators.validate_messagebroker_headers import ( validate_messagebroker_headers, ) +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) +from testing_support.validators.validate_tt_collector_json import ( + validate_tt_collector_json, +) from newrelic.api.background_task import background_task from newrelic.api.transaction import current_transaction @@ -46,7 +48,7 @@ def cache_pika_headers(wrapped, instance, args, kwargs): QUEUE = "test-pika-queue" CORRELATION_ID = "testingpika" REPLY_TO = "testing" -HEADERS = {u"MYHEADER": u"pikatest"} +HEADERS = {"MYHEADER": "pikatest"} 
_message_broker_tt_included_params = { "routing_key": QUEUE, diff --git a/tests/messagebroker_pika/test_pika_supportability.py b/tests/messagebroker_pika/test_pika_supportability.py index fa0e46639..9f0d94e90 100644 --- a/tests/messagebroker_pika/test_pika_supportability.py +++ b/tests/messagebroker_pika/test_pika_supportability.py @@ -19,7 +19,7 @@ from newrelic.api.background_task import background_task from conftest import QUEUE, BODY -from testing_support.fixtures import validate_transaction_metrics +from testing_support.validators.validate_transaction_metrics import validate_transaction_metrics from testing_support.db_settings import rabbitmq_settings DB_SETTINGS = rabbitmq_settings()[0] diff --git a/tests/template_genshi/conftest.py b/tests/template_genshi/conftest.py new file mode 100644 index 000000000..932ec9bae --- /dev/null +++ b/tests/template_genshi/conftest.py @@ -0,0 +1,30 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +from testing_support.fixtures import ( # noqa: F401; pylint: disable=W0611 + collector_agent_registration_fixture, + collector_available_fixture, +) + +_default_settings = { + "transaction_tracer.explain_threshold": 0.0, + "transaction_tracer.transaction_threshold": 0.0, + "transaction_tracer.stack_trace_threshold": 0.0, + "debug.log_data_collector_payloads": True, + "debug.record_transaction_failure": True, +} + +collector_agent_registration = collector_agent_registration_fixture( + app_name="Python Agent Test (template_genshi)", default_settings=_default_settings +) diff --git a/tests/template_genshi/test_genshi.py b/tests/template_genshi/test_genshi.py new file mode 100644 index 000000000..03420579e --- /dev/null +++ b/tests/template_genshi/test_genshi.py @@ -0,0 +1,38 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +from genshi.template import MarkupTemplate +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) + +from newrelic.api.background_task import background_task + + +@validate_transaction_metrics( + "test_render", + background_task=True, + scoped_metrics=(("Template/Render/genshi.core:Stream.render", 1),), +) +@background_task(name="test_render") +def test_render(): + template_to_render = MarkupTemplate("
<html><body>hello, $name!</body></html>") + result = template_to_render.generate(name="NR").render("xhtml") + assert result == "<html><body>hello, NR!</body></html>" + + +def test_render_outside_txn(): + template_to_render = MarkupTemplate("<html><body>hello, $name!</body></html>") + result = template_to_render.generate(name="NR").render("xhtml") + assert result == "<html><body>hello, NR!</body></html>
" diff --git a/tests/template_jinja2/conftest.py b/tests/template_jinja2/conftest.py new file mode 100644 index 000000000..a6922078d --- /dev/null +++ b/tests/template_jinja2/conftest.py @@ -0,0 +1,30 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +from testing_support.fixtures import ( # noqa: F401; pylint: disable=W0611 + collector_agent_registration_fixture, + collector_available_fixture, +) + +_default_settings = { + "transaction_tracer.explain_threshold": 0.0, + "transaction_tracer.transaction_threshold": 0.0, + "transaction_tracer.stack_trace_threshold": 0.0, + "debug.log_data_collector_payloads": True, + "debug.record_transaction_failure": True, +} + +collector_agent_registration = collector_agent_registration_fixture( + app_name="Python Agent Test (template_jinja2)", default_settings=_default_settings +) diff --git a/tests/template_jinja2/test_jinja2.py b/tests/template_jinja2/test_jinja2.py new file mode 100644 index 000000000..c64dac923 --- /dev/null +++ b/tests/template_jinja2/test_jinja2.py @@ -0,0 +1,41 @@ +# Copyright 2010 New Relic, Inc. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +from jinja2 import Template +from testing_support.validators.validate_transaction_metrics import ( + validate_transaction_metrics, +) + +from newrelic.api.background_task import background_task + + +@validate_transaction_metrics( + "test_render", + background_task=True, + scoped_metrics=( + ("Template/Render/