From 1ffde3eb818eadd58f6518f75f0c981b60c2e3ef Mon Sep 17 00:00:00 2001 From: Serhiy Storchaka Date: Sun, 26 Jun 2022 09:31:08 +0300 Subject: [PATCH] Revert 30935 defer bpo45162 to 312 (#94285) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit * GH-93444: remove redundant fields from basicblock: b_nofallthrough, b_exit, b_return (GH-93445) * netrc: Remove unused "import shlex" (#93311) * gh-92886: Fix test that fails when running with `-O` in `test_imaplib.py` (#93237) * Fix missing word in sys.float_info docstring (GH-93489) * [doc] Correct a grammatical error in a docstring. (GH-93441) * gh-93442: Make C++ version of _Py_CAST work with 0/NULL. (#93500) Add C++ overloads for _Py_CAST_impl() to handle 0/NULL. This will allow C++ extensions that pass 0 or NULL to macros using _Py_CAST() to continue to compile. Without this, you get an error like: invalid ‘static_cast’ from type ‘int’ to type ‘_object*’ The modern way to use a NULL value in C++ is to use nullptr. However, we want to not break extensions that do things the old way. Co-authored-by: serge-sans-paille * gh-93442: Add test for _Py_CAST(nullptr). (gh-93505) * gh-90473: wasmtime does not support absolute symlinks (GH-93490) * gh-89973: Fix re.error in the fnmatch module. (GH-93072) Character ranges with upper bound less than lower bound (e.g. [c-a]) are now interpreted as empty ranges, for compatibility with other glob pattern implementations. Previously it was re.error. * Document LOAD_FAST_CHECK opcode (#93498) * gh-93247: Fix assert function in asyncio locks test (#93248) * gh-90473: WASI requires proper open(2) flags (GH-93529) * GH-92308 What's New: list pending removals in 3.13 and future versions (#92562) * gh-90473: Skip POSIX tests that don't apply to WASI (GH-93536) * asyncio.Barrier docs: Fix typo (#93371) taks -> tasks * gh-83728: Add hmac.new default parameter deprecation (GH-91939) * gh-90473: Make chmod a dummy on WASI, skip chmod tests (GH-93534) WASI does not have the ``chmod(2)`` syscall yet. * Remove action=None kwarg from Barrier docs (GH-93538) * [docs] fix some asyncio.Barrier.wait docs grammar (GH-93552) * gh-93475: Expose FICLONE and FICLONERANGE constants in fcntl (#93478) * gh-89018: Improve documentation of `sqlite3` exceptions (#27645) - Order exceptions as in PEP 249 - Reword descriptions, so they match the current behaviour Co-authored-by: Alex Waygood * bpo-42658: Use LCMapStringEx in ntpath.normcase to match OS behaviour for case-folding (GH-32010) * Fix contributor name in WhatsNew 3.11 (GH-93556) * Grammar fix to socket error string (GH-93523) * gh-86986: bump min sphinx version to 3.2 (GH-93337) * gh-79096: Protect cookie file created by {LWP,Mozilla}CookieJar.save() (GH-93463) Note: This change is not effective on Microsoft Windows. Cookies can store sensitive information and should therefore be protected against unauthorized third parties. This is also described in issue #79096. The filesystem permissions are currently set to 644, so everyone can read the file. This commit changes the permissions to 600, so only the creator of the file can read and modify it. This improves security, because it reduces the attack surface. Now the attacker needs control of the user that created the cookie or a way to circumvent the filesystem permissions. This change is backwards incompatible. Systems that rely on world-readable cookies will break. However, one could argue that those are misconfigured in the first place.
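
A minimal sketch of the gh-89973 fnmatch change listed above, assuming an interpreter that already contains the fix; a reversed character range such as [c-a] is now an empty range instead of a pattern-compilation error:

    import fnmatch

    # An ordinary range behaves as before.
    print(fnmatch.fnmatch("b", "[a-c]"))   # True

    # A reversed range used to raise re.error when the pattern was compiled;
    # it is now treated as an empty range that matches no character at all.
    print(fnmatch.fnmatch("b", "[c-a]"))   # False

    # translate() exposes the generated regular expression for inspection.
    print(fnmatch.translate("[c-a]"))
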
* gh-93162: Add ability to configure QueueHandler/QueueListener together (GH-93269) Also, provide getHandlerByName() and getHandlerNames() APIs. Closes #93162. * gh-57539: Increase calendar test coverage (GH-93468) Co-authored-by: Sean Fleming Co-authored-by: Adam Turner <9087854+AA-Turner@users.noreply.github.com> Co-authored-by: Łukasz Langa * gh-88831: In docs for asyncio.create_task, explain why strong references to tasks are needed (GH-93258) Co-authored-by: Łukasz Langa * Shrink the LOAD_METHOD cache by one codeunit. (#93537) * Fix MSVC compiler warnings in ceval.c (#93569) * gh-93162: test_config_queue_handler requires threading (GH-93572) * gh-84461: Emscripten's faccessat() does not accept flags (GH-92353) * gh-92592: Allow logging filters to return a LogRecord. (GH-92591) * Fix `PurePath.relative_to` links in the pathlib documentation. (GH-93268) These are currently broken as they refer to :meth:`Path.relative_to` rather than :meth:`PurePath.relative_to`, and `relative_to` is a method on `PurePath`. * GH-93481: Suppress expected deprecation warning in test_pyclbr (GH-93483) * gh-93370: Deprecate sqlite3.version and sqlite3.version_info (#93482) Co-authored-by: Alex Waygood Co-authored-by: Adam Turner <9087854+AA-Turner@users.noreply.github.com> Co-authored-by: Erlend E. Aasland * GH-93521: For dataclasses, filter out `__weakref__` slot if present in bases (GH-93535) * gh-93421: Update sqlite3 cursor.rowcount only after SQLITE_DONE (#93526) * gh-93584: Make all install+tests targets depend on all (GH-93589) All install targets use the "all" target as a synchronization point to prevent race conditions with PGO builds. PGO builds use recursive make, which can lead to two parallel `./python setup.py build` processes that step on each other's toes. "test" targets now correctly compile the PGO build in a clean repo. * gh-87961: Remove outdated notes from functions that aren't in the Limited API (GH-93581) * Remove outdated notes from functions that aren't in the Limited API Nowadays everything that *is* in the Limited API has a note added automatically. These notes could mislead people into thinking that these functions could never be added to the limited API. Remove them. * Also remove forgotten note on tp_vectorcall_offset not being finalized * gh-93180: Update os.copy_file_range() documentation (#93182) * gh-93575: Use correct way to calculate PyUnicode struct sizes (GH-93602) * gh-93575: Use correct way to calculate PyUnicode struct sizes * Add comment to keep test_sys and test_unicode in sync * Fix case code < 256 * gh-90473: Define HOSTRUNNER for WASI (GH-93606) * gh-79096: Fix/improve http cookiejar tests (GH-93614) Fixup of GH-93463: - remove stray print - use proper way to check file mode - add working chmod decorator Co-authored-by: Łukasz Langa * gh-93616: Fix env changed issue in test_modulefinder (GH-93617) * gh-90494: Reject 6th element of the __reduce__() tuple (GH-93609) copy.copy() and copy.deepcopy() now always raise a TypeError if __reduce__() returns a tuple with length 6, instead of silently ignoring the 6th item or producing an incorrect result. * Doc: Update references and examples of old, unsupported OSes and uarches (GH-92791) * bpo-45383: Get metaclass from bases in PyType_From* (GH-28748) This checks the bases of a type created using the FromSpec API to inherit the bases' metaclasses. The metaclass's alloc function will be called as is done in `tp_new` for classes created in Python.
Co-authored-by: Petr Viktorin Co-authored-by: Erlend Egeberg Aasland * Improve logging documentation with example and additional cookbook re… (GH-93644) * gh-90473: disable user site packages on WASI/Emscripten (GH-93633) * gh-90473: Skip get_config_h() tests on WASI (GH-93645) * gh-90549: Fix leak of global named resources using multiprocessing spawn (#30617) Co-authored-by: XD Trol Co-authored-by: Antoine Pitrou * gh-92434: Silence compiler warning in Modules/_sqlite/connection.c on 32-bit systems (#93090) * gh-90763: Modernise xx template module initialisation (#93078) Use C APIs such as PyModule_AddType instead of PyModule_AddObject. Also remove incorrect module decrefs if the module fails to initialise. * gh-93491: Add support tier detection to configure (GH-93492) Co-authored-by: Adam Turner <9087854+AA-Turner@users.noreply.github.com> Co-authored-by: Steve Dower Co-authored-by: Erlend Egeberg Aasland * gh-93466: Document PyType_Spec doesn't accept repeated slot IDs; raise where this was problematic (GH-93471) * gh-93671: Avoid exponential backtracking in deeply nested sequence patterns in match statements (GH-93680) Co-authored-by: Łukasz Langa * gh-81790: support "UNC" device paths in `ntpath.splitdrive()` (GH-91882) * GH-93621: reorder code in with/async-with exception exit path to reduce the size of the exception table (GH-93622) * gh-93461: Invalidate sys.path_importer_cache entries with relative paths (GH-93653) * gh-91317: Document that Path does not collapse initial `//` (GH-32193) Documentation for `pathlib` says: > Spurious slashes and single dots are collapsed, but double dots ('..') are not, since this would change the meaning of a path in the face of symbolic links: However, it omits that initial double slashes also aren't collapsed. Later, in documentation of `PurePath.drive`, `PurePath.root`, and `PurePath.name` it mentions UNC but: - this abbreviation says nothing to a person who is unaware of the existence of UNC (Wikipedia doesn't help either by [giving a disambiguation page](https://en.wikipedia.org/wiki/UNC)) - it shows up only if a person needs to use a specific property or decides to fully learn what the module provides. For context, see the BPO entry. * gh-92886: Fix tests that fail when running with optimizations (`-O`) in `test_zipimport.py` (GH-93236) * gh-92930: _pickle.c: Acquire strong references before calling save() (GH-92931) * gh-84461: Use HOSTRUNNER to run regression tests (GH-93694) Co-authored-by: Brett Cannon * gh-90473: Skip test_queue when threading is not available (GH-93712) * gh-90153: whatsnew: "z" option in format spec (GH-93624) Add what's new entry for PEP 682 in Python 3.11. * gh-86404: [doc] A make suspicious false positive. (GH-93710) * Change list to view object (#93661) * gh-84508: tool to generate cjk traditional chinese mappings (gh-93272) * Remove usage of _Py_IDENTIFIER from math module (#93739) * gh-91162: Support splitting of unpacked arbitrary-length tuple over TypeVar and TypeVarTuple parameters (alt) (GH-93412) For example: A[T, *Ts][*tuple[int, ...]] -> A[int, *tuple[int, ...]] A[*Ts, T][*tuple[int, ...]] -> A[*tuple[int, ...], int] * gh-93728: fix memory leak in deepfrozen code objects (GH-93729) * gh-93747: Fix Refleak when handling multiple Py_tp_doc slots (gh-93749) * GH-90699: use statically allocated strings in typeobject.c (gh-93751) * Add more FOR_ITER specialization stats (GH-32151) * gh-89653: PEP 670: Convert PyFunction macros (#93765) Convert PyFunction macros to static inline functions.
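
The gh-92592 logging change listed above (filters may return a LogRecord) can be sketched as follows; the redact() helper and the "demo" logger name are illustrative only, and the record-replacement behaviour is as described in that entry:

    import logging

    def redact(record):
        # Since gh-92592 a filter may return a LogRecord instead of a bool;
        # the returned record is then used for the rest of the processing.
        if "password" in record.getMessage():
            record = logging.makeLogRecord(vars(record))
            record.msg = "[redacted]"
            record.args = None
        return record

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger("demo")
    logger.addFilter(redact)

    logger.info("password=hunter2")   # message is emitted as "[redacted]"
    logger.info("all good")           # emitted unchanged
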
* Remove ANY_VARARGS() macro from the C API (#93764) The macro was exposed by mistake. * gh-84623: Remove unused imports in stdlib (#93773) * gh-91731: Don't define 'static_assert' in C++11 where it is a keyword to avoid UB (GH-93700) * gh-84623: Remove unused imports in tests (#93772) * gh-93353: Fix importlib.resources._tempfile() finalizer (#93377) Fix the importlib.resources.as_file() context manager to remove the temporary file if destroyed late during Python finalization: keep a local reference to the os.remove() function. Patch by Victor Stinner. * gh-84461: Fix parallel testing on WebAssembly (GH-93768) * gh-89653: PEP 670: Macros always cast arguments in cpython/ (#93766) Header files in Include/cpython/ are only included if the Py_LIMITED_API macro is not defined. * gh-93353: Add test.support.late_deletion() (#93774) * gh-93741: Add private C API _PyImport_GetModuleAttrString() (GH-93742) It combines PyImport_ImportModule() and PyObject_GetAttrString() and saves 4-6 lines of code on every use. Also add _PyImport_GetModuleAttr() which takes Python strings as arguments. * gh-79512: Fixed names and __module__ value of weakref classes (GH-93719) Classes ReferenceType, ProxyType and CallableProxyType now have correct attributes __module__, __name__ and __qualname__. It makes them (types, not instances) pickleable. * gh-91810: Fix regression with writing an XML declaration with encoding='unicode' (GH-93426) Suppress writing an XML declaration in open files in ElementTree.write() with encoding='unicode' and xml_declaration=None. If a file path is passed to ElementTree.write() with encoding='unicode', always open a new file in UTF-8. * gh-93761: Fix test to avoid simple delay when synchronizing. (GH-93779) * gh-89546: Clean up PyType_FromMetaclass (GH-93686) When changing PyType_FromMetaclass recently (GH-93012, GH-93466, GH-28748) I found a bunch of opportunities to improve the code. Here they are. Fixes: #89546 Automerge-Triggered-By: GH:encukou * gh-91321: Fix compatibility with C++ older than C++11 (#93784) Fix the compatibility of the Python C API with C++ older than C++11. _Py_NULL is only defined as nullptr on C++11 and newer. * GH-93662: Make sure that column offsets are correct in multi-line method calls. (GH-93673) * GH-93516: Store offset of first traceable instruction in code object (GH-93769) * gh-90473: Include stdlib dir in wasmtime PYTHONPATH (GH-93797) * GH-93429: Merge `LOAD_METHOD` back into `LOAD_ATTR` (GH-93430) * gh-93353: regrtest checks for leaked temporary files (#93776) When running tests with -jN, create a temporary directory per process and mark a test as "environment changed" if a test leaks a temporary file or directory. * gh-79579: Improve DML query detection in sqlite3 (#93623) The fix involves using pysqlite_check_remaining_sql(), not only to check for multiple statements, but now also to strip leading comments and whitespace from SQL statements, so we can improve DML query detection. pysqlite_check_remaining_sql() is renamed lstrip_sql(), to more accurately reflect its function, and hardened to handle more SQL comment corner cases. * GH-93678: reduce boilerplate and code repetition in the compiler (GH-93682) * gh-91877: Fix WriteTransport.get_write_buffer_{limits,size} docs (#92338) - Amend docs for WriteTransport.get_write_buffer_limits - Add docs for WriteTransport.get_write_buffer_size * GH-93429: Document `LOAD_METHOD` removal (GH-93803) * Include freelists in allocation total.
(GH-93799) * gh-93795: Use test.support TESTFN/unlink in sqlite3 tests (#93796) * Remove LOAD_METHOD stats. (GH-93807) * Rename 'LOAD_METHOD' specialization stat consts to 'ATTR'. (GH-93812) * gh-93353: Fix regrtest for -jN with N >= 2 (GH-93813) * [docs] Fix LOAD_ATTR version changed (GH-93816) * gh-93814: Add infinite test for itertools.chain.from_iterable (GH-93815) fix #93814 Automerge-Triggered-By: GH:rhettinger * gh-93735: Split Docs CI to speed-up the build (GH-93736) * gh-93183: Adjust wording in socket docs (#93832) package => packet Co-authored-by: Victor Norman * gh-93829: In sqlite3, replace Py_BuildValue with faster APIs (#93830) - In Modules/_sqlite/connection.c, use PyLong_FromLong - In Modules/_sqlite/microprotocols.c, use PyTuple_Pack * Add test.support.busy_retry() (#93770) Add busy_retry() and sleeping_retry() functions to test.support. * gh-87260: Update sqlite3 signature docs to reflect actual implementation (#93840) Align the docs for the following methods with the actual implementation: - sqlite3.complete_statement() - sqlite3.Connection.create_function() - sqlite3.Connection.create_aggregate() - sqlite3.Connection.set_progress_handler() * test_thread uses support.sleeping_retry() (#93849) test_thread.test_count() now fails if it takes longer than LONG_TIMEOUT seconds. * Use support.sleeping_retry() and support.busy_retry() (#93848) * Replace time.sleep(0.010) with sleeping_retry() to use an exponential sleep. * support.wait_process(): reuse sleeping_retry(). * _test_eintr: remove unused variables. * Update includes in call.c (GH-93786) * gh-93857: Fix broken audit-event targets in sqlite3 docs (#93859) Corrected targets for the following audit-events: - sqlite3.enable_load_extension => sqlite3.Connection.enable_load_extension - sqlite3.load_extension => sqlite3.Connection.load_extension * GH-93850: Fix test_asyncio exception ignored tracebacks (#93854) * gh-93824: Reenable installation of shell extension on Windows ARM64 (GH-93825) * test_asyncio: run_until() implements exponential sleep (#93866) run_until() of test.test_asyncio.utils now uses an exponential sleep delay (max: 1 second), rather than a fixed delay of 1 ms. Similar design to the support.sleeping_retry() wait strategy, which applies exponential backoff. * test_asyncore: Optimize capture_server() (#93867) Remove time.sleep(0.01) in test_asyncore capture_server(). The sleep was redundant and inefficient, since the loop starts with select.select(), which also implements a sleep (polling for socket data with a timeout). * Tests call sleeping_retry() with SHORT_TIMEOUT (#93870) Tests now call busy_retry() and sleeping_retry() with SHORT_TIMEOUT or LONG_TIMEOUT (of test.support), rather than hardcoded constants. Also add the WAIT_ACTIVE_CHILDREN_TIMEOUT constant to _test_multiprocessing. * gh-84461: Document how to install SDKs manually (GH-93844) Co-authored-by: Brett Cannon * gh-93820: Fix copy() regression in enum.Flag (GH-93876) GH-26658 introduced a regression in copy / pickle protocol for combined `enum.Flag`s. `copy.copy(re.A | re.I)` would fail with `AttributeError: ASCII|IGNORECASE`. `enum.Flag` now has a `__reduce_ex__()` method that reduces flags by combined value, not by combined name. * Call busy_retry() and sleeping_retry() with error=True (#93871) Tests no longer call busy_retry() and sleeping_retry() with error=False: an exception is now raised if the loop times out.
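
The gh-93820 enum.Flag regression fix described above can be checked directly from its own description; a small sketch, assuming a build that includes the new `__reduce_ex__()`:

    import copy
    import pickle
    import re

    flags = re.A | re.I

    # Before the fix, copying a combined flag failed with
    # AttributeError: ASCII|IGNORECASE because it was reduced by combined
    # name; it is now reduced by its combined value.
    assert copy.copy(flags) == flags
    assert copy.deepcopy(flags) == flags
    assert pickle.loads(pickle.dumps(flags)) == flags
    print(flags)
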
* gh-87347: Add parenthesis around PyXXX_Check() arguments (#92815) * gh-91321: Fix test_cppext for C++03 (#93902) Don't build _testcppext.cpp with -Wzero-as-null-pointer-constant when testing C++03: only use this compiler flag with C++11. * gh-91577: SharedMemory move imports out of methods (#91579) SharedMemory.unlink() uses the unregister() function from resource_tracker. Previously it was imported in the method, but this can fail if the method is called during interpreter shutdown, for example when unlink is part of a __del__() method. Moving the import to the top of the file means that the unregister() method is available during interpreter shutdown. The register call in SharedMemory.__init__() can also use this imported resource_tracker. * gh-92547: Amend What's New (#93872) * Fix BINARY_SUBSCR_GETITEM stats (GH-93903) * gh-93847: Fix repr of enum of generic aliases (GH-93885) * gh-93353: regrtest supports checking tmp files with -j2 (#93909) regrtest now also implements checking for leaked temporary files and directories when using -jN for N >= 2. Use tempfile.mkdtemp() to create the temporary directory. Skip this check on WASI. * GH-91389: Fix dis position information for CACHEs (GH-93663) * gh-91985: Ensure in-tree builds override platstdlib_dir in every path calculation (GH-93641) * GH-83658: make multiprocessing.Pool raise an exception if maxtasksperchild is not None or a positive int (GH-93364) Closes #83658. * test_logging: Fix BytesWarning in SysLogHandlerTest (GH-93920) * gh-91404: Revert "bpo-23689: re module, fix memory leak when a match is terminated by a signal or allocation failure (GH-32283) (#93882) Revert "bpo-23689: re module, fix memory leak when a match is terminated by a signal or memory allocation failure (GH-32283)" This reverts commit 6e3eee5c11b539e9aab39cff783acf57838c355a. Manual fixups to increase the MAGIC number and to handle conflicts with a couple of changes that landed after that. Thanks for reviews by Ma Lin and Serhiy Storchaka. * gh-89745: Avoid exact match when comparing program_name in test_embed on Windows (GH-93888) * gh-93852: Add test.support.create_unix_domain_name() (#93914) test_asyncio, test_logging, test_socket and test_socketserver now create AF_UNIX sockets in the current directory to no longer fail with OSError("AF_UNIX path too long") if the temporary directory (the TMPDIR environment variable) is too long. Modify the following tests to use create_unix_domain_name(): * test_asyncio * test_logging * test_socket * test_socketserver test_asyncio.utils: remove unused time import. * gh-77782: Py_FdIsInteractive() now uses PyConfig.interactive (#93916) * gh-74953: Add _PyTime_FromMicrosecondsClamp() function (#93942) * gh-74953: Fix PyThread_acquire_lock_timed() code recomputing the timeout (#93941) Set timeout, don't create a local variable with the same name. * gh-77782: Deprecate global configuration variable (#93943) Deprecate global configuration variables like Py_IgnoreEnvironmentFlag: the Py_InitializeFromConfig() API should be used instead. Fix declaration of Py_GETENV(): use PyAPI_FUNC(), not PyAPI_DATA().
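
A short sketch of the GH-83658 multiprocessing.Pool validation listed above. The entry only says "an exception" is raised, so the concrete exception type is not assumed here and the example catches broadly:

    from multiprocessing import Pool

    if __name__ == "__main__":
        # A positive int (or None) is accepted as before.
        with Pool(processes=2, maxtasksperchild=4) as pool:
            print(pool.map(abs, [-1, 2, -3]))    # [1, 2, 3]

        # Zero, negative or non-int values now fail up front instead of
        # misbehaving later.
        try:
            Pool(processes=2, maxtasksperchild=0)
        except Exception as exc:
            print(type(exc).__name__, exc)
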
* gh-93911: Specialize `LOAD_ATTR_PROPERTY` (GH-93912) * gh-92888: Fix memoryview bad `__index__` use after free (GH-92946) Co-authored-by: chilaxan <35645806+chilaxan@users.noreply.github.com> Co-authored-by: Serhiy Storchaka <3659035+serhiy-storchaka@users.noreply.github.com> * GH-89858: Fix test_embed for out-of-tree builds (GH-93465) * gh-92611: Add details on replacements for cgi utility funcs (GH-92792) Per @brettcannon's [suggestions on the Discourse thread](https://discuss.python.org/t/pep-594-take-2-removing-dead-batteries-from-the-standard-library/13508/51), discussed in #92611 and as a followup to PR #92612, this PR adds additional specific per-function replacement information for the utility functions in the `cgi` module deprecated by PEP 594 (PEP-594). @brettcannon, should this be backported (without the `deprecated-removed`, which I would update accordingly and re-add in my other PR adding that to the others for 3.11+), or just go in 3.11+? * GH-77403: Fix tests which fail when PYTHONUSERBASE is not normalized (GH-93917) * gh-91387: Strip trailing slash from tarfile longname directories (GH-32423) Co-authored-by: Brett Cannon * Add jaraco as primary owner of importlib.metadata and importlib.resources. (#93960) * Add jaraco as primary owner of importlib.metadata and importlib.resources. * Align indentation. Co-authored-by: Ezio Melotti Co-authored-by: Ezio Melotti * gh-84461: Fix circular dependency on BUILDPYTHON (GH-93977) * gh-89828: Do not relay the __class__ attribute in GenericAlias (#93754) list[int].__class__ returned type, and isinstance(list[int], type) returned True. This caused numerous problems in code that checks isinstance(x, type). * gh-84461: Fix pydebug Emscripten browser builds (GH-93982) The wasm_assets script did not take the ABIFLAG flag of sysconfigdata into account. * gh-93955: Use unbound methods for slot `__getattr__` and `__getattribute__` (GH-93956) * gh-91387: Fix tarfile test on WASI (GH-93984) WASI's rmdir() syscall does not like the trailing slash. * gh-93975: Nicer error reporting in test_venv (GH-93959) - gh-93957: Provide nicer error reporting from subprocesses in test_venv.EnsurePipTest.test_with_pip. - Update changelog This change does three things: 1. Extract a function for trapping output in subprocesses. 2. Emit both stdout and stderr when encountering an error. 3. Apply the change to `ensurepip._uninstall` check. * GH-93990: fix refcounting bug in `add_subclass` in `typeobject.c` (GH-93989) * What's new in 3.10: fix link to issue (#93968) * What's new in 3.10: fix link to issue * What's new in 3.10: fix link to GH issue Co-authored-by: Ezio Melotti Co-authored-by: Ezio Melotti * gh-93761: Fix test_logging test_config_queue_handler() race condition (#93952) Fix a race condition in test_config_queue_handler() of test_logging. * gh-74953: Reformat PyThread_acquire_lock_timed() (#93947) Reformat the pthread implementation of PyThread_acquire_lock_timed() using a mutex and a conditional variable. * Add goto to avoid multiple indentation levels and exit quickly * Use "while(1)" and make the control flow more obvious. * PEP 7: Add braces around if blocks.
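
The gh-89828 GenericAlias entry above translates directly into a check; a minimal sketch, assuming an interpreter with the fix:

    import types

    ga = list[int]

    print(type(ga))                # <class 'types.GenericAlias'>
    print(ga.__class__)            # no longer relayed to the origin, so not <class 'type'>
    print(isinstance(ga, type))    # False (it used to be True)
    print(ga.__origin__, ga.__args__)
    assert ga.__class__ is types.GenericAlias
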
* gh-93937, C API: Move PyFrame_GetBack() to Python.h (#93938) Move the following functions and type from frameobject.h to pyframe.h, so the standard Python.h header provides frame getter functions: * PyFrame_Check() * PyFrame_GetBack() * PyFrame_GetBuiltins() * PyFrame_GetGenerator() * PyFrame_GetGlobals() * PyFrame_GetLasti() * PyFrame_GetLocals() * PyFrame_Type Remove #include "frameobject.h" from many C files. It's no longer needed. * gh-93991: Use boolean instead of 0/1 for condition check (GH-93992) * gh-84461: Fix Emscripten umask and permission issues (GH-94002) - Emscripten's default umask is too strict, see https://github.com/emscripten-core/emscripten/issues/17269 - getuid/getgid and geteuid/getegid are stubs that always return 0 (root). Disable effective uid/gid syscalls and fix tests that use chmod() with the current user. - Cannot drop X bit from directory. * gh-84461: Skip test_unwritable_directory again on Emscripten (GH-94007) GH-93992 removed geteuid() and enabled the test again on Emscripten. * gh-93925: Improve clarity of sqlite3 commit/rollback, and close docs (#93926) Co-authored-by: CAM Gerlach * gh-61162: Clarify sqlite3 connection context manager docs (GH-93890) Explicitly note that transactions are only closed if there is an open transaction at `__exit__`, and that transactions are not implicitly opened during `__enter__`. Co-authored-by: CAM Gerlach Co-authored-by: Stanley <46876382+slateny@users.noreply.github.com> Automerge-Triggered-By: GH:erlend-aasland * gh-79009: sqlite3.iterdump now correctly handles tables with autoincrement (#9621) Co-authored-by: Erlend E. Aasland * gh-84461: Silence some compiler warnings on WASM (GH-93978) * GH-93897: Store frame size in code object and de-opt if insufficient space on thread frame stack. (GH-93908) * GH-93516: Speedup line number checks when tracing. (GH-93763) * Use a lookup table to reduce overhead of getting line numbers during tracing. * gh-90539: doc: Expand on what should not go into CFLAGS, LDFLAGS (#92754) * gh-87347: Add parenthesis around macro arguments (#93915) Add unit test on Py_MEMBER_SIZE() and some other macros. * gh-93937: PyOS_StdioReadline() uses PyConfig.legacy_windows_stdio (#94024) On Windows, PyOS_StdioReadline() now gets PyConfig.legacy_windows_stdio from _PyOS_ReadlineTState, rather than using the deprecated global Py_LegacyWindowsStdioFlag variable. Also fix a compiler warning in Py_SetStandardStreamEncoding(). * GH-93249: relax overly strict assertion on bounds->ar_start (GH-93961) * gh-94021: Address unreachable code warning in specialize code (GH-94022) * GH-93678: refactor compiler so that optimizer does not need the assembler and compiler structs (GH-93842) * gh-93839: Move Lib/ctypes/test/ to Lib/test/test_ctypes/ (#94041) * Move Lib/ctypes/test/ to Lib/test/test_ctypes/ * Remove Lib/test/test_ctypes.py * Update imports and build system. * gh-93839: Move Lib/unittest/test/ to Lib/test/test_unittest/ (#94043) * Move Lib/unittest/test/ to Lib/test/test_unittest/ * Remove Lib/test/test_unittest.py * Replace unittest.test with test.test_unittest * Remove unittest.load_tests() * Rewrite unittest __init__.py and __main__.py * Update build system, CODEOWNERS, and wasm_assets.py * GH-91432: Specialize FOR_ITER (GH-91713) * Adds FOR_ITER_LIST and FOR_ITER_RANGE specializations. * Adds _PyLong_AssignValue() internal function to avoid temporary boxing of ints.
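
A hedged sketch of the sqlite3 context-manager behaviour that the gh-61162 docs clarification above spells out: `__enter__` does not open a transaction, `__exit__` commits (or rolls back) only a transaction that the statements themselves opened, and the connection is not closed:

    import sqlite3

    cx = sqlite3.connect(":memory:")
    cx.execute("CREATE TABLE t(x)")

    with cx:
        # __enter__ did not start a transaction; the INSERT below opens one
        # implicitly, and __exit__ commits it (or rolls back on an exception).
        cx.execute("INSERT INTO t VALUES (1)")

    # The context manager does not close the connection.
    print(cx.execute("SELECT COUNT(*) FROM t").fetchone())   # (1,)
    cx.close()
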
* gh-94028: Clear and reset sqlite3 statements properly in cursor iternext (GH-94042) * gh-94052: Don't re-run failed tests with --python option (#94054) * gh-93839: Use load_package_tests() for testmock (GH-94055) Fixes failing tests on WebAssembly platforms. Automerge-Triggered-By: GH:tiran * gh-54781: Move Lib/lib2to3/tests/ to Lib/test/test_lib2to3/ (#94049) * Move Lib/lib2to3/tests/ to Lib/test/test_lib2to3/. * Remove Lib/test/test_lib2to3.py. * Update imports. * all_project_files(): use different paths and sort files to make the tests more reproducible. * Update references to tests. * gh-74953: _PyThread_cond_after() uses _PyTime_t (#94056) pthread _PyThread_cond_after() implementation now uses the _PyTime_t type to properly handle overflow: clamp to the maximum value. Remove MICROSECONDS_TO_TIMESPEC() function. * GH-93841: Allow stats to be turned on and off, cleared and dumped at runtime. (GH-93843) * gh-86986: Drop compatibility support for Sphinx 2 (GH-93737) * Revert "bpo-42843: Keep Sphinx 1.8 and Sphinx 2 compatibility (GH-24282)" This reverts commit 5c1f15b4b1024cbf0acc85832f0c623d1a4605fd. * Revert "bpo-42579: Make workaround for various versions of Sphinx more robust (GH-23662)" This reverts commit b63a620014b67a6e63d10783149c41baaf59def8. * gh-94068: Remove HVSOCKET_CONTAINER_PASSTHRU constant because it has been removed from Windows (GH-94069) Fixes #94068 Automerge-Triggered-By: GH:zware * Closes gh-94038: Update Release Schedule in README.rst from PEP 664 to PEP 693 (GH-94046) * gh-93851: Fix all broken links in Doc/ (GH-93853) * gh-93675: Fix typos in `Doc/` (GH-93676) Closes #93675 * Minor optimization for Fractions.limit_denominator (GH-93730) When we construct the upper and lower candidates in limit_denominator, the numerator and denominator are already relatively prime (and the denominator positive) by construction, so there's no need to go through the usual normalisation in the constructor. This saves a couple of potentially expensive gcd calls. Suggested by Michael Scott Asato Cuthbert in GH-93477. * gh-93240: clarify wording in IO tutorial (GH-93276) Co-authored-by: Adam Turner <9087854+AA-Turner@users.noreply.github.com> * Tutorial: specify match cases don't fall through (GH-93615) * gh-93021: Fix __text_signature__ for __get__ (GH-93023) Because of the way wrap_descr_get is written, the second argument to __get__ methods implemented through the wrapper is always optional. * gh-82927: Update files related to HTML entities. (GH-92504) * DOC: correct bytesarray -> bytearray in comments (GH-92410) * gh-87389: Fix an open redirection vulnerability in http.server. (#93879) Fix an open redirection vulnerability in the `http.server` module when a URI path starts with `//`, which could produce a 301 Location header with a misleading target. Vulnerability discovered, and logic fix proposed, by Hamza Avvan (@hamzaavvan). Test and comments authored by Gregory P. Smith [Google]. * gh-89336: Remove configparser APIs that were deprecated for 3.12 (#92503) https://github.com/python/cpython/issue/89336: Remove configparser 3.12 deprecations. Co-authored-by: Hugo van Kemenade * bpo-30535: [doc] state that sys.meta_path is not empty by default (GH-94098) Co-authored-by: Windson yang * gh-88123: Implement new Enum __contains__ (GH-93298) Co-authored-by: Ethan Furman * Stats: Add summary of top instructions for misses and deferred specialization.
(GH-94072) * gh-74696: Do not change the current working directory in shutil.make_archive() if possible (GH-93160) It is no longer changed when creating a zip or tar archive. It is still changed for custom archivers registered with shutil.register_archive_format() if root_dir is not None. Co-authored-by: Éric Co-authored-by: Łukasz Langa * gh-94101 Disallow instantiation of SSLSession objects (GH-94102) Fixes #94101 Automerge-Triggered-By: GH:tiran * Fix typo in _io.TextIOWrapper Clinic input (#94037) Co-authored-by: Łukasz Langa * gh-93951: In test_bdb.StateTestCase.test_skip, avoid including auxiliary importers. (GH-93962) Co-authored-by: Brett Cannon * gh-91172: Create a workflow for verifying bundled pip and setuptools (GH-31885) Co-authored-by: Hugo van Kemenade Co-authored-by: Adam Turner <9087854+AA-Turner@users.noreply.github.com> * gh-94114: Remove obsolete reference to python.org mirrors (GH-94115) * gh-84623: Remove unused imports (#94132) * gh-54781: Move Lib/tkinter/test/test_ttk/ to Lib/test/test_ttk/ (#94070) * Move Lib/tkinter/test/test_tkinter/ to Lib/test/test_tkinter/. * Move Lib/tkinter/test/test_ttk/ to Lib/test/test_ttk/. * Add Lib/test/test_ttk/__init__.py based on test_ttk_guionly.py. * Add Lib/test/test_tkinter/__init__.py * Remove old Lib/test/test_tk.py. * Remove old Lib/test/test_ttk_guionly.py. * Add __main__ sub-modules. * Update imports and update references to renamed files. * gh-84623: Move imports in doctests (#94133) Move imports in doctests to prevent false alarms in pyflakes. * Add ABI dump Makefile target (#94136) * gh-84623: Remove unused imports in idlelib (#94143) Remove commented code in test_debugger_r.py. Co-authored-by: Terry Jan Reedy * gh-85308: argparse: Use filesystem encoding for arguments file (GH-93277) * Closes gh-94152: Update pyvideo.org URL (GH-94075) The URL is now https://pyvideo.org, which uses HTTPS and avoids a redirect. * gh-91456: [Enum] Deprecate default auto() behavior with mixed value types (GH-91457) When used with plain Enum, auto() returns the last numeric value assigned, skipping any incompatible member values (such as strings); starting in 3.13 the default auto() for plain Enums will require all the values to be of compatible types, and will return a new value that is 1 higher than any existing value. Co-authored-by: Ethan Furman * gh-84461: Fix test_sqlite for Emscripten/WASI (#94125) * gh-86404: [doc] Fix missing backtick and double target name. (#94120) * gh-89121: Keep the number of pending SQLite statements to a minimum (#30379) Make sure statements that have run to completion or errored are reset and cleared off the cursor for all paths in execute() and executemany(). * GH-91742: Fix pdb crash after jump (GH-94171) * [Enum] fix typo (GH-94158) * gh-92858: Improve error message for some suites with syntax error before ':' (#92894) * gh-93771: Clarify how deepfreeze.py is run (#94150) * gh-91219: Add an index_pages default list and parameter to SimpleHTTPRequestHandler (GH-31985) * Add an index_pages default list to SimpleHTTPRequestHandler and an optional constructor parameter that allows the default index pages list to be overridden. This makes it easy to set a new index page name without having to override send_head.
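
A sketch of the gh-88123 Enum containment change listed above; the semantics shown here (values as well as members are accepted, and unknown values return False instead of raising TypeError) are my reading of that entry:

    from enum import Enum

    class Color(Enum):
        RED = 1
        GREEN = 2

    print(Color.RED in Color)   # True - members still match, as before
    print(1 in Color)           # True - a member's value now matches too
    print(3 in Color)           # False - previously this raised TypeError
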
* [Enum] Remove automatic docstring generation (GH-94188) * Add ABI dump script (#94135) * Add more tests for throwing into yield from (GH-94097) * gh-94169: Remove deprecated io.OpenWrapper (#94170) Remove io.OpenWrapper and _pyio.OpenWrapper, deprecated in Python 3.10: just use :func:`open` instead. The open() (io.open()) function is a built-in function. Since Python 3.10, _pyio.open() is also a static method. * gh-94199: Remove ssl.RAND_pseudo_bytes() function (#94202) Remove the ssl.RAND_pseudo_bytes() function, deprecated in Python 3.6: use os.urandom() or ssl.RAND_bytes() instead. * gh-94196: Remove gzip.GzipFile.filename attribute (#94197) gzip: Remove the filename attribute of gzip.GzipFile, deprecated since Python 2.6; use the name attribute instead. In write mode, the filename attribute added '.gz' file extension if it was not present. * gh-93692: remove "build finished successfully" message from setup.py (#93693) The message was only emitted when the build succeeded _and_ there were missing modules. * gh-84461: Fix ctypes and test_ctypes on Emscripten (#94142) - c_longlong and c_longdouble need experimental WASM bigint. - Skip tests that need threading - Define ``CTYPES_MAX_ARGCOUNT`` for Emscripten. libffi-emscripten 2022-06-23 supports up to 1000 args. * gh-94205: Ensures all required DLLs are copied on Windows for underpth tests (GH-94206) * gh-84461: Build Emscripten with WASM BigInt support (#94219) * gh-94172: urllib.request avoids deprecated check_hostname (#94193) The urllib.request module no longer uses the deprecated check_hostname parameter of the http.client module. Add private http.client._create_https_context() helper to http.client, used by urllib.request. Remove the now redundant check on check_hostname and verify_mode in http.client: the SSLContext.check_hostname setter already implements the check. * IDLE: replace if statement with expression (#94228) * Docs: Remove `Provides [...]` from `multiprocessing.shared_memory` description (#92761) * gh-93382: Sync up `co_code` changes with 3.11 (GH-94227) Sync up co_code changes with 3.11 commit 852b4d4bcd12b0b6839a015a262ce976b134f6f3. * gh-94217: Skip import tests when _testcapi is a builtin (GH-94218) * gh-85308: Add argparse tests for reading non-ASCII arguments from file (GH-94160) * bpo-46642: Explicitly disallow subclassing of instances of TypeVar, ParamSpec, etc (GH-31148) The existing test covering this case passed only incidentally. We explicitly disallow doing this and add a proper error message. Co-authored-by: Serhiy Storchaka * bpo-26253: Add compressionlevel to tarfile stream (GH-2962) `tarfile` already accepts a compressionlevel argument for creating files. This patch adds the same for stream-based tarfile usage. The default is 9, the value that was previously hard-coded. * gh-70441: Fix test_tarfile on systems w/o bz2 (gh-2962) (#94258) * gh-94199: Remove ssl.match_hostname() function (#94224) * gh-94207: Fix struct module leak (GH-94239) Make _struct.Struct a GC type This fixes a memory leak in the _struct module, where as soon as a Struct object is stored in the cache, there's a cycle from the _struct module to the cache to Struct objects to the Struct type and back to the module. If _struct.Struct is not gc-tracked, that cycle is never collected. This PR makes _struct.Struct GC-tracked, and adds a regression test.
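
A sketch of the bpo-26253 stream-mode compression level from above. The entry calls the argument "compressionlevel"; the keyword that tarfile.open() accepts for the non-stream gzip modes is compresslevel, and this sketch assumes the stream modes take the same keyword. The file names are made up for the example:

    import tarfile

    with open("payload.txt", "w") as f:
        f.write("hello\n")

    # "w|gz" writes a gzip-compressed stream; the compression level used to be
    # hard-coded to 9 and can now be chosen (keyword name assumed, see above).
    with tarfile.open("payload.tar.gz", mode="w|gz", compresslevel=1) as tar:
        tar.add("payload.txt")

    with tarfile.open("payload.tar.gz", mode="r|gz") as tar:
        print(tar.getnames())
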
* gh-94245: Test pickling and copying of typing.Tuple[()] (GH-94259) * gh-77560: Report possible errors in restoring builtins at finalization (GH-94255) Seems in the past the copy of builtins was not made in some scenarios, and the error was silenced. Write it now to stderr, so we have a chance to see it. * gh-90016: Reword sqlite3 adapter/converter docs (#93095) Also add adapters and converter recipes. Co-authored-by: CAM Gerlach Co-authored-by: Alex Waygood Co-authored-by: Ulises Ojeda Co-authored-by: jackh-ncl <1750152+jackh-ncl@users.noreply.github.com> Co-authored-by: Mark Dickinson Co-authored-by: Colin Delahunty <72827203+colin99d@users.noreply.github.com> Co-authored-by: Neil Schemenauer Co-authored-by: Christian Heimes Co-authored-by: Dennis Sweeney <36520290+sweeneyde@users.noreply.github.com> Co-authored-by: Cyker Way Co-authored-by: Hugo van Kemenade Co-authored-by: Omer Katz Co-authored-by: Stanley <46876382+slateny@users.noreply.github.com> Co-authored-by: Thomas Grainger Co-authored-by: Illia Volochii Co-authored-by: Erlend Egeberg Aasland Co-authored-by: Alex Waygood Co-authored-by: AN Long Co-authored-by: Samodya Abeysiriwardane <379594+sransara@users.noreply.github.com> Co-authored-by: Evorage Co-authored-by: Davide Rizzo Co-authored-by: Pascal Wittmann Co-authored-by: Vinay Sajip Co-authored-by: Adam Turner <9087854+AA-Turner@users.noreply.github.com> Co-authored-by: Łukasz Langa Co-authored-by: Andreas Grommek <76997441+agrommek@users.noreply.github.com> Co-authored-by: Mark Shannon Co-authored-by: Ken Jin Co-authored-by: Adrian Garcia Badaracco <1755071+adriangb@users.noreply.github.com> Co-authored-by: jacksonriley <52106215+jacksonriley@users.noreply.github.com> Co-authored-by: Kalyan Co-authored-by: Bluenix Co-authored-by: Petr Viktorin Co-authored-by: CAM Gerlach Co-authored-by: Sebastian Berg Co-authored-by: Leo Trol Co-authored-by: XD Trol Co-authored-by: Antoine Pitrou Co-authored-by: neonene <53406459+neonene@users.noreply.github.com> Co-authored-by: Steve Dower Co-authored-by: Pablo Galindo Salgado Co-authored-by: Barney Gale Co-authored-by: Oleg Iarygin Co-authored-by: Brett Cannon Co-authored-by: John Belmonte Co-authored-by: Julien Palard Co-authored-by: Pamela Fox Co-authored-by: Dong-hee Na Co-authored-by: Kumar Aditya <59607654+kumaraditya303@users.noreply.github.com> Co-authored-by: Victor Stinner Co-authored-by: Sanket Shanbhag Co-authored-by: Jeong YunWon <69878+youknowone@users.noreply.github.com> Co-authored-by: Steve Dower Co-authored-by: samtygier Co-authored-by: Ken Jin Co-authored-by: Brandt Bucher Co-authored-by: Gregory P. Smith Co-authored-by: chilaxan <35645806+chilaxan@users.noreply.github.com> Co-authored-by: Serhiy Storchaka <3659035+serhiy-storchaka@users.noreply.github.com> Co-authored-by: Chris Fernald Co-authored-by: Jason R. 
Coombs Co-authored-by: Ezio Melotti Co-authored-by: Lei Zhang Co-authored-by: Erlend Egeberg Aasland Co-authored-by: itssme Co-authored-by: Matthias Köppe Co-authored-by: MilanJuhas <81162136+MilanJuhas@users.noreply.github.com> Co-authored-by: luzpaz Co-authored-by: paulreece <96156234+paulreece@users.noreply.github.com> Co-authored-by: max <36980911+pr2502@users.noreply.github.com> Co-authored-by: Jelle Zijlstra Co-authored-by: Thomas A Caswell Co-authored-by: Windson yang Co-authored-by: Carl Bordum Hansen Co-authored-by: Ethan Furman Co-authored-by: Éric Co-authored-by: chgnrdv <52372310+chgnrdv@users.noreply.github.com> Co-authored-by: fikotta <81991278+fikotta@users.noreply.github.com> Co-authored-by: partev Co-authored-by: Terry Jan Reedy Co-authored-by: Inada Naoki Co-authored-by: Oscar R <89599049+oscar-LT@users.noreply.github.com> Co-authored-by: wookie184 Co-authored-by: Guido van Rossum Co-authored-by: Myron Walker Co-authored-by: Sam Ezeh Co-authored-by: Ken Jin <28750310+Fidget-Spinner@users.noreply.github.com> Co-authored-by: Gregory Beauregard Co-authored-by: Yaron de Leeuw Co-authored-by: Mark Dickinson --- .azure-pipelines/windows-release.yml | 220 --- .../windows-release/build-steps.yml | 84 - .azure-pipelines/windows-release/checkout.yml | 21 - .azure-pipelines/windows-release/find-sdk.yml | 17 - .../windows-release/layout-command.yml | 23 - .../windows-release/mingw-lib.yml | 13 - .../windows-release/msi-steps.yml | 181 --- .../windows-release/stage-build.yml | 193 --- .../windows-release/stage-layout-embed.yml | 61 - .../windows-release/stage-layout-full.yml | 80 - .../windows-release/stage-layout-msix.yml | 102 -- .../windows-release/stage-layout-nuget.yml | 52 - .../windows-release/stage-msi.yml | 43 - .../windows-release/stage-pack-msix.yml | 148 -- .../windows-release/stage-pack-nuget.yml | 66 - .../stage-publish-nugetorg.yml | 50 - .../stage-publish-pythonorg.yml | 192 --- .../windows-release/stage-publish-store.yml | 38 - .../windows-release/stage-sign.yml | 130 -- .../windows-release/stage-test-embed.yml | 41 - .../windows-release/stage-test-msi.yml | 108 -- .../windows-release/stage-test-nuget.yml | 58 - .github/CODEOWNERS | 5 +- .github/workflows/build.yml | 3 + .github/workflows/build_msi.yml | 3 + .github/workflows/doc.yml | 49 +- .../workflows/new-bugs-announce-notifier.yml | 5 +- .github/workflows/regen-abidump.sh | 8 + .github/workflows/verify-ensurepip-wheels.yml | 28 + .gitignore | 3 + Doc/Makefile | 8 +- Doc/c-api/bytes.rst | 3 - Doc/c-api/call.rst | 18 - Doc/c-api/init.rst | 109 ++ Doc/c-api/init_config.rst | 45 +- Doc/c-api/refcounting.rst | 14 +- Doc/c-api/structures.rst | 2 - Doc/c-api/sys.rst | 4 +- Doc/c-api/type.rst | 54 +- Doc/c-api/typeobj.rst | 8 +- Doc/c-api/unicode.rst | 66 +- Doc/conf.py | 4 +- Doc/data/stable_abi.dat | 1 + Doc/distutils/apiref.rst | 2 +- Doc/faq/design.rst | 3 +- Doc/faq/library.rst | 12 +- Doc/faq/programming.rst | 2 +- Doc/howto/functional.rst | 29 +- Doc/howto/logging-cookbook.rst | 138 +- Doc/howto/logging.rst | 22 +- Doc/howto/sockets.rst | 19 +- Doc/howto/urllib2.rst | 6 +- Doc/includes/email-headers.py | 3 +- Doc/includes/sqlite3/adapter_datetime.py | 17 - Doc/includes/sqlite3/converter_point.py | 21 +- Doc/installing/index.rst | 2 +- Doc/library/_thread.rst | 2 +- Doc/library/aifc.rst | 2 +- Doc/library/argparse.rst | 13 +- Doc/library/array.rst | 2 +- Doc/library/asynchat.rst | 4 +- Doc/library/asyncio-llapi-index.rst | 4 + Doc/library/asyncio-stream.rst | 2 +- Doc/library/asyncio-subprocess.rst | 4 + 
Doc/library/asyncio-sync.rst | 8 +- Doc/library/asyncio-task.rst | 19 +- Doc/library/asyncore.rst | 4 +- Doc/library/audioop.rst | 2 +- Doc/library/base64.rst | 2 +- Doc/library/binascii.rst | 2 +- Doc/library/cgi.rst | 33 +- Doc/library/cgitb.rst | 2 +- Doc/library/chunk.rst | 2 +- Doc/library/concurrent.futures.rst | 2 +- Doc/library/configparser.rst | 29 +- Doc/library/contextlib.rst | 2 +- Doc/library/crypt.rst | 2 +- Doc/library/datetime.rst | 4 +- Doc/library/decimal.rst | 5 +- Doc/library/difflib.rst | 6 +- Doc/library/dis.rst | 85 +- Doc/library/doctest.rst | 48 +- Doc/library/email.header.rst | 2 +- Doc/library/email.headerregistry.rst | 2 +- Doc/library/enum.rst | 8 +- Doc/library/fcntl.rst | 6 + Doc/library/fractions.rst | 12 +- Doc/library/functions.rst | 3 +- Doc/library/gzip.rst | 4 + Doc/library/hashlib.rst | 2 +- Doc/library/html.entities.rst | 6 +- Doc/library/http.cookies.rst | 2 +- Doc/library/imghdr.rst | 2 +- Doc/library/imp.rst | 2 +- Doc/library/importlib.metadata.rst | 41 +- Doc/library/importlib.resources.rst | 2 +- Doc/library/importlib.rst | 14 +- Doc/library/inspect.rst | 7 +- Doc/library/io.rst | 2 +- Doc/library/itertools.rst | 8 +- Doc/library/locale.rst | 2 + Doc/library/logging.config.rst | 76 +- Doc/library/logging.handlers.rst | 6 + Doc/library/logging.rst | 50 +- Doc/library/mailcap.rst | 14 +- Doc/library/math.rst | 2 +- Doc/library/mmap.rst | 2 +- Doc/library/msilib.rst | 2 +- Doc/library/multiprocessing.rst | 1 + Doc/library/multiprocessing.shared_memory.rst | 4 +- Doc/library/nis.rst | 2 +- Doc/library/os.path.rst | 2 +- Doc/library/os.rst | 54 +- Doc/library/ossaudiodev.rst | 2 +- Doc/library/pathlib.rst | 41 +- Doc/library/pickle.rst | 12 +- Doc/library/pipes.rst | 2 +- Doc/library/platform.rst | 2 +- Doc/library/posix.rst | 2 +- Doc/library/re.rst | 82 +- Doc/library/secrets.rst | 2 +- Doc/library/shutil.rst | 8 +- Doc/library/signal.rst | 4 +- Doc/library/smtpd.rst | 4 +- Doc/library/sndhdr.rst | 2 +- Doc/library/socket.rst | 52 +- Doc/library/spwd.rst | 2 +- Doc/library/sqlite3.rst | 378 +++-- Doc/library/ssl.rst | 92 +- Doc/library/statistics.rst | 4 +- Doc/library/stdtypes.rst | 2 + Doc/library/struct.rst | 9 +- Doc/library/subprocess.rst | 3 - Doc/library/sunau.rst | 2 +- Doc/library/symtable.rst | 3 +- Doc/library/sys.rst | 3 +- Doc/library/tarfile.rst | 7 +- Doc/library/telnetlib.rst | 2 +- Doc/library/termios.rst | 10 +- Doc/library/test.rst | 164 +- Doc/library/threading.rst | 8 +- Doc/library/tkinter.rst | 5 +- Doc/library/typing.rst | 134 +- Doc/library/unittest.mock.rst | 2 +- Doc/library/uu.rst | 2 +- Doc/library/warnings.rst | 16 +- Doc/library/weakref.rst | 2 + Doc/library/xdrlib.rst | 2 +- Doc/library/xml.dom.minidom.rst | 2 +- Doc/library/xml.etree.elementtree.rst | 2 + Doc/library/zipfile.rst | 2 +- Doc/license.rst | 6 +- Doc/make.bat | 5 +- Doc/reference/expressions.rst | 6 +- Doc/tools/extensions/asdl_highlight.py | 3 +- Doc/tools/extensions/peg_highlight.py | 2 +- Doc/tools/extensions/pyspecific.py | 8 +- Doc/tools/extensions/suspicious.py | 41 +- Doc/tools/rstlint.py | 2 +- Doc/tools/susp-ignored.csv | 2 + Doc/tools/templates/download.html | 4 + Doc/tutorial/classes.rst | 6 +- Doc/tutorial/controlflow.rst | 6 +- Doc/tutorial/inputoutput.rst | 2 +- Doc/tutorial/whatnow.rst | 7 +- Doc/using/cmdline.rst | 37 +- Doc/using/configure.rst | 20 + Doc/using/unix.rst | 2 +- Doc/using/venv-create.inc | 2 +- Doc/using/windows.rst | 10 +- Doc/whatsnew/2.1.rst | 7 +- Doc/whatsnew/2.5.rst | 14 +- Doc/whatsnew/2.6.rst | 4 +- 
Doc/whatsnew/2.7.rst | 6 +- Doc/whatsnew/3.0.rst | 6 +- Doc/whatsnew/3.1.rst | 1 + Doc/whatsnew/3.10.rst | 4 +- Doc/whatsnew/3.11.rst | 105 +- Doc/whatsnew/3.12.rst | 167 ++ Doc/whatsnew/3.2.rst | 16 +- Doc/whatsnew/3.4.rst | 6 +- Doc/whatsnew/3.6.rst | 3 +- Doc/whatsnew/3.7.rst | 3 +- Doc/whatsnew/3.8.rst | 3 + Doc/whatsnew/3.9.rst | 2 +- Grammar/python.gram | 24 +- Include/abstract.h | 16 +- Include/boolobject.h | 6 +- Include/bytearrayobject.h | 4 +- Include/bytesobject.h | 2 +- Include/ceval.h | 2 +- Include/complexobject.h | 4 +- Include/cpython/abstract.h | 8 +- Include/cpython/bytearrayobject.h | 8 +- Include/cpython/bytesobject.h | 8 +- Include/cpython/cellobject.h | 16 +- Include/cpython/classobject.h | 10 +- Include/cpython/code.h | 15 +- Include/cpython/context.h | 6 +- Include/cpython/dictobject.h | 7 +- Include/cpython/frameobject.h | 13 - Include/cpython/funcobject.h | 57 +- Include/cpython/genobject.h | 8 +- Include/cpython/import.h | 3 + Include/cpython/listobject.h | 10 +- Include/cpython/methodobject.h | 20 +- Include/cpython/modsupport.h | 6 +- Include/cpython/object.h | 12 +- Include/cpython/odictobject.h | 12 +- Include/cpython/picklebufobject.h | 2 +- Include/cpython/pydebug.h | 36 +- Include/cpython/pyerrors.h | 2 +- Include/cpython/pyframe.h | 17 + Include/cpython/pystate.h | 5 +- Include/cpython/pythonrun.h | 28 +- Include/cpython/pytime.h | 4 + Include/cpython/tupleobject.h | 10 +- Include/cpython/unicodeobject.h | 84 +- Include/cpython/warnings.h | 2 +- Include/cpython/weakrefobject.h | 10 +- Include/datetime.h | 92 +- Include/dictobject.h | 8 +- Include/fileobject.h | 6 +- Include/floatobject.h | 2 +- Include/import.h | 2 +- Include/internal/pycore_asdl.h | 4 +- Include/internal/pycore_call.h | 2 + Include/internal/pycore_ceval.h | 13 +- Include/internal/pycore_code.h | 168 +- Include/internal/pycore_descrobject.h | 26 + Include/internal/pycore_dict.h | 6 +- Include/internal/pycore_frame.h | 50 +- Include/internal/pycore_global_strings.h | 7 +- Include/internal/pycore_hamt.h | 16 +- Include/internal/pycore_hashtable.h | 4 +- Include/internal/pycore_initconfig.h | 8 +- Include/internal/pycore_interp.h | 3 - Include/internal/pycore_list.h | 6 + Include/internal/pycore_long.h | 2 + Include/internal/pycore_object.h | 11 +- Include/internal/pycore_opcode.h | 199 ++- Include/internal/pycore_pymath.h | 8 +- Include/internal/pycore_pystate.h | 14 +- Include/internal/pycore_range.h | 22 + Include/internal/pycore_runtime.h | 2 - Include/internal/pycore_runtime_init.h | 23 +- Include/internal/pycore_symtable.h | 2 +- Include/internal/pycore_unionobject.h | 4 +- Include/iterobject.h | 4 +- Include/listobject.h | 2 +- Include/longobject.h | 2 +- Include/memoryobject.h | 2 +- Include/methodobject.h | 4 +- Include/modsupport.h | 14 +- Include/moduleobject.h | 4 +- Include/object.h | 23 +- Include/objimpl.h | 4 +- Include/opcode.h | 98 +- Include/py_curses.h | 2 +- Include/pycapsule.h | 2 +- Include/pyerrors.h | 4 +- Include/pyframe.h | 6 + Include/pymacconfig.h | 12 - Include/pymacro.h | 7 + Include/pymem.h | 14 +- Include/pyport.h | 79 +- Include/pystats.h | 105 ++ Include/pythread.h | 4 +- Include/rangeobject.h | 2 +- Include/setobject.h | 10 +- Include/sliceobject.h | 2 +- Include/structseq.h | 4 +- Include/traceback.h | 2 +- Include/tupleobject.h | 2 +- Include/unicodeobject.h | 30 +- Include/weakrefobject.h | 8 +- Lib/_pyio.py | 35 +- Lib/argparse.py | 4 +- Lib/ast.py | 6 +- Lib/asyncio/base_futures.py | 1 - Lib/asyncio/coroutines.py | 1 - Lib/asyncio/locks.py | 1 - 
Lib/asyncio/proactor_events.py | 2 +- Lib/asyncio/runners.py | 1 - Lib/asyncio/streams.py | 1 - Lib/asyncio/taskgroups.py | 19 +- Lib/configparser.py | 61 +- Lib/copy.py | 2 +- Lib/ctypes/test/__main__.py | 4 - Lib/dataclasses.py | 17 +- Lib/dis.py | 51 +- Lib/distutils/sysconfig.py | 1 - Lib/distutils/tests/test_bdist.py | 1 - Lib/distutils/tests/test_build.py | 1 + Lib/distutils/tests/test_sysconfig.py | 4 +- Lib/doctest.py | 14 +- Lib/email/_header_value_parser.py | 2 +- Lib/enum.py | 180 +-- Lib/filecmp.py | 8 +- Lib/fnmatch.py | 30 +- Lib/fractions.py | 14 +- Lib/functools.py | 5 +- Lib/gzip.py | 8 - Lib/html/entities.py | 9 +- Lib/http/client.py | 32 +- Lib/http/cookiejar.py | 4 +- Lib/http/server.py | 14 +- Lib/idlelib/configdialog.py | 2 +- Lib/idlelib/idle_test/test_debugger_r.py | 19 +- Lib/idlelib/idle_test/test_editor.py | 1 - Lib/idlelib/idle_test/test_parenmatch.py | 4 +- Lib/idlelib/iomenu.py | 8 +- Lib/idlelib/util.py | 1 - Lib/importlib/_bootstrap_external.py | 25 +- Lib/importlib/metadata/__init__.py | 48 +- Lib/importlib/resources/_common.py | 7 +- Lib/io.py | 16 - Lib/locale.py | 12 +- Lib/logging/__init__.py | 90 +- Lib/logging/config.py | 92 +- Lib/logging/handlers.py | 1 + Lib/mailcap.py | 26 +- Lib/multiprocessing/context.py | 14 + Lib/multiprocessing/pool.py | 3 + Lib/multiprocessing/process.py | 10 +- Lib/multiprocessing/shared_memory.py | 7 +- Lib/netrc.py | 2 +- Lib/ntpath.py | 50 +- Lib/opcode.py | 84 +- Lib/os.py | 5 +- Lib/pathlib.py | 70 +- Lib/pickle.py | 2 +- Lib/platform.py | 4 + Lib/pydoc.py | 22 +- Lib/random.py | 2 +- Lib/re/__init__.py | 23 +- Lib/re/_compiler.py | 61 +- Lib/re/_constants.py | 3 +- Lib/re/_parser.py | 3 +- Lib/shutil.py | 102 +- Lib/site.py | 4 +- Lib/socket.py | 3 +- Lib/sqlite3/__init__.py | 21 +- Lib/sqlite3/dbapi2.py | 28 +- Lib/sqlite3/dump.py | 14 +- Lib/ssl.py | 64 +- Lib/subprocess.py | 17 +- Lib/symtable.py | 2 +- Lib/sysconfig.py | 4 +- Lib/tarfile.py | 32 +- Lib/test/_test_eintr.py | 14 +- Lib/test/_test_embed_set_config.py | 5 +- Lib/test/_test_multiprocessing.py | 99 +- Lib/test/_testcppext.cpp | 107 +- Lib/test/datetimetester.py | 1 - Lib/test/doctest_lineno.py | 50 + Lib/test/fork_wait.py | 6 +- Lib/test/leakers/test_ctypes.py | 2 +- Lib/test/libregrtest/cmdline.py | 8 +- Lib/test/libregrtest/main.py | 34 +- Lib/test/libregrtest/runtest.py | 5 + Lib/test/libregrtest/runtest_mp.py | 50 +- Lib/test/libregrtest/setup.py | 2 +- Lib/test/list_tests.py | 3 +- Lib/test/lock_tests.py | 1 - Lib/test/pickletester.py | 61 + Lib/test/pythoninfo.py | 65 +- Lib/test/setup_testcppext.py | 25 +- Lib/test/signalinterproctester.py | 13 +- Lib/test/support/__init__.py | 190 ++- Lib/test/support/os_helper.py | 54 +- Lib/test/support/socket_helper.py | 18 +- Lib/test/support/threading_helper.py | 18 +- Lib/test/test__locale.py | 6 +- Lib/test/test__xxsubinterpreters.py | 11 +- Lib/test/test_argparse.py | 36 +- Lib/test/test_ast.py | 35 + Lib/test/test_asyncio/test_events.py | 1 - Lib/test/test_asyncio/test_locks.py | 6 +- Lib/test/test_asyncio/test_runners.py | 8 +- Lib/test/test_asyncio/test_sock_lowlevel.py | 2 +- Lib/test/test_asyncio/test_tasks.py | 4 - Lib/test/test_asyncio/test_timeouts.py | 1 - Lib/test/test_asyncio/test_unix_events.py | 41 +- Lib/test/test_asyncio/utils.py | 21 +- Lib/test/test_asyncore.py | 6 +- Lib/test/test_atexit.py | 2 +- Lib/test/test_bdb.py | 10 + Lib/test/test_binascii.py | 2 +- Lib/test/test_bisect.py | 6 + Lib/test/test_bufio.py | 1 - Lib/test/test_builtin.py | 2 +- Lib/test/test_calendar.py | 13 + 
Lib/test/test_capi.py | 51 +- Lib/test/test_cmd_line.py | 58 +- Lib/test/test_cmd_line_script.py | 3 +- Lib/test/test_code.py | 33 +- Lib/test/test_codecs.py | 1 - Lib/test/test_compile.py | 6 +- Lib/test/test_compileall.py | 3 + Lib/test/test_concurrent_futures.py | 11 +- Lib/test/test_configparser.py | 42 +- Lib/test/test_context.py | 35 + Lib/test/test_contextlib_async.py | 11 +- Lib/test/test_copy.py | 22 + Lib/test/test_coroutines.py | 3 +- Lib/test/test_cppext.py | 40 +- Lib/test/test_ctypes.py | 10 - .../test => test/test_ctypes}/__init__.py | 0 Lib/test/test_ctypes/__main__.py | 4 + .../test => test/test_ctypes}/test_anon.py | 0 .../test_ctypes}/test_array_in_pointer.py | 0 .../test => test/test_ctypes}/test_arrays.py | 2 +- .../test_ctypes}/test_as_parameter.py | 3 +- .../test_ctypes}/test_bitfields.py | 2 +- .../test => test/test_ctypes}/test_buffers.py | 2 +- .../test => test/test_ctypes}/test_bytes.py | 0 .../test_ctypes}/test_byteswap.py | 0 .../test_ctypes}/test_callbacks.py | 5 +- .../test => test/test_ctypes}/test_cast.py | 2 +- .../test => test/test_ctypes}/test_cfuncs.py | 8 +- .../test_ctypes}/test_checkretval.py | 2 +- .../test => test/test_ctypes}/test_delattr.py | 0 .../test => test/test_ctypes}/test_errno.py | 0 .../test => test/test_ctypes}/test_find.py | 0 .../test_ctypes}/test_frombuffer.py | 0 .../test => test/test_ctypes}/test_funcptr.py | 0 .../test_ctypes}/test_functions.py | 3 +- .../test_ctypes}/test_incomplete.py | 0 .../test => test/test_ctypes}/test_init.py | 0 .../test_ctypes}/test_internals.py | 0 .../test_ctypes}/test_keeprefs.py | 0 .../test => test/test_ctypes}/test_libc.py | 0 .../test => test/test_ctypes}/test_loading.py | 0 .../test_ctypes}/test_macholib.py | 0 .../test_ctypes}/test_memfunctions.py | 2 +- .../test => test/test_ctypes}/test_numbers.py | 0 .../test => test/test_ctypes}/test_objects.py | 8 +- .../test_ctypes}/test_parameters.py | 3 +- .../test => test/test_ctypes}/test_pep3118.py | 0 .../test_ctypes}/test_pickling.py | 0 .../test_ctypes}/test_pointers.py | 0 .../test_ctypes}/test_prototypes.py | 2 +- .../test_ctypes}/test_python_api.py | 0 .../test_ctypes}/test_random_things.py | 0 .../test_ctypes}/test_refcounts.py | 0 .../test => test/test_ctypes}/test_repr.py | 0 .../test_ctypes}/test_returnfuncptrs.py | 0 .../test_ctypes}/test_simplesubclasses.py | 0 .../test => test/test_ctypes}/test_sizes.py | 0 .../test => test/test_ctypes}/test_slicing.py | 2 +- .../test_ctypes}/test_stringptr.py | 0 .../test => test/test_ctypes}/test_strings.py | 2 +- .../test_ctypes}/test_struct_fields.py | 0 .../test_ctypes}/test_structures.py | 2 +- .../test_ctypes}/test_unaligned_structures.py | 0 .../test => test/test_ctypes}/test_unicode.py | 2 +- .../test => test/test_ctypes}/test_values.py | 0 .../test_ctypes}/test_varsize_struct.py | 0 .../test => test/test_ctypes}/test_win32.py | 0 .../test_ctypes}/test_wintypes.py | 0 Lib/test/test_dataclasses.py | 48 + Lib/test/test_dbm_dumb.py | 2 + Lib/test/test_decorators.py | 1 - Lib/test/test_defaultdict.py | 2 - Lib/test/test_descr.py | 1 - Lib/test/test_descrtut.py | 2 +- Lib/test/test_dis.py | 531 ++++--- Lib/test/test_doctest.py | 27 +- Lib/test/test_dtrace.py | 5 + Lib/test/test_email/test_email.py | 25 +- Lib/test/test_embed.py | 97 +- Lib/test/test_enum.py | 455 +++--- Lib/test/test_exception_group.py | 1 - Lib/test/test_exceptions.py | 1 - Lib/test/test_extcall.py | 2 +- Lib/test/test_fileio.py | 5 +- Lib/test/test_float.py | 1 - Lib/test/test_flufl.py | 1 - Lib/test/test_fnmatch.py | 114 ++ 
Lib/test/test_fstring.py | 34 +- Lib/test/test_functools.py | 2 - Lib/test/test_future.py | 1 - Lib/test/test_gc.py | 14 +- Lib/test/test_genericpath.py | 2 +- Lib/test/test_getpath.py | 1 - Lib/test/test_hashlib.py | 4 +- Lib/test/test_heapq.py | 1 - Lib/test/test_http_cookiejar.py | 46 +- Lib/test/test_httpservers.py | 53 +- Lib/test/test_imaplib.py | 5 +- Lib/test/test_import/__init__.py | 26 +- .../test_importlib/extension/test_finder.py | 6 +- .../test_importlib/extension/test_loader.py | 8 + Lib/test/test_importlib/fixtures.py | 16 + Lib/test/test_importlib/import_/test_path.py | 15 +- Lib/test/test_importlib/test_abc.py | 1 - Lib/test/test_importlib/test_api.py | 6 +- Lib/test/test_importlib/test_main.py | 91 +- Lib/test/test_importlib/test_metadata_api.py | 6 +- .../test_importlib/test_threaded_import.py | 2 +- Lib/test/test_inspect.py | 5 +- Lib/test/test_io.py | 21 +- Lib/test/test_itertools.py | 1 + Lib/test/test_largefile.py | 2 + Lib/test/test_launcher.py | 35 +- Lib/test/test_lib2to3.py | 9 - .../tests => test/test_lib2to3}/__init__.py | 0 .../tests => test/test_lib2to3}/__main__.py | 0 .../tests => test/test_lib2to3}/data/README | 0 .../tests => test/test_lib2to3}/data/bom.py | 0 .../tests => test/test_lib2to3}/data/crlf.py | 0 .../test_lib2to3}/data/different_encoding.py | 0 .../test_lib2to3}/data/false_encoding.py | 0 .../test_lib2to3}/data/fixers/bad_order.py | 0 .../data/fixers/myfixes/__init__.py | 0 .../data/fixers/myfixes/fix_explicit.py | 0 .../data/fixers/myfixes/fix_first.py | 0 .../data/fixers/myfixes/fix_last.py | 0 .../data/fixers/myfixes/fix_parrot.py | 0 .../data/fixers/myfixes/fix_preorder.py | 0 .../test_lib2to3}/data/fixers/no_fixer_cls.py | 0 .../data/fixers/parrot_example.py | 0 .../test_lib2to3}/data/infinite_recursion.py | 0 .../test_lib2to3}/data/py2_test_grammar.py | 0 .../test_lib2to3}/data/py3_test_grammar.py | 0 .../test_lib2to3}/pytree_idempotency.py | 6 +- .../tests => test/test_lib2to3}/support.py | 21 +- .../test_lib2to3}/test_all_fixers.py | 1 - .../test_lib2to3}/test_fixers.py | 12 +- .../tests => test/test_lib2to3}/test_main.py | 0 .../test_lib2to3}/test_parser.py | 4 +- .../test_lib2to3}/test_pytree.py | 0 .../test_lib2to3}/test_refactor.py | 0 .../tests => test/test_lib2to3}/test_util.py | 0 Lib/test/test_lltrace.py | 9 +- Lib/test/test_locale.py | 12 +- Lib/test/test_logging.py | 329 +++- Lib/test/test_mailbox.py | 5 + Lib/test/test_mailcap.py | 13 +- Lib/test/test_marshal.py | 2 +- Lib/test/test_memoryview.py | 101 ++ Lib/test/test_modulefinder.py | 75 +- .../test_multiprocessing_main_handling.py | 27 +- Lib/test/test_netrc.py | 7 + Lib/test/test_nis.py | 1 - Lib/test/test_ntpath.py | 27 + Lib/test/test_os.py | 27 +- Lib/test/test_pathlib.py | 20 +- Lib/test/test_patma.py | 21 + Lib/test/test_pdb.py | 43 + Lib/test/test_peepholer.py | 197 ++- Lib/test/test_peg_generator/test_c_parser.py | 2 +- Lib/test/test_posix.py | 18 +- Lib/test/test_posixpath.py | 33 +- Lib/test/test_property.py | 95 ++ Lib/test/test_py_compile.py | 1 + Lib/test/test_pyclbr.py | 9 +- Lib/test/test_pydoc.py | 10 +- Lib/test/test_pyexpat.py | 3 +- Lib/test/test_queue.py | 5 +- Lib/test/test_re.py | 70 +- Lib/test/test_regrtest.py | 36 +- Lib/test/test_robotparser.py | 5 +- Lib/test/test_select.py | 1 - Lib/test/test_selectors.py | 4 +- Lib/test/test_shelve.py | 2 - Lib/test/test_shutil.py | 55 +- Lib/test/test_signal.py | 35 +- Lib/test/test_site.py | 9 +- Lib/test/test_smtpd.py | 3 + Lib/test/test_socket.py | 137 +- Lib/test/test_socketserver.py | 4 +- 
Lib/test/test_sqlite3/__init__.py | 1 - Lib/test/test_sqlite3/test_dbapi.py | 246 ++- Lib/test/test_sqlite3/test_dump.py | 47 + Lib/test/test_sqlite3/test_factory.py | 12 - Lib/test/test_sqlite3/test_regression.py | 10 +- Lib/test/test_sqlite3/test_transactions.py | 73 +- Lib/test/test_ssl.py | 216 +-- Lib/test/test_stable_abi_ctypes.py | 1 + Lib/test/test_stat.py | 4 +- Lib/test/test_struct.py | 17 + Lib/test/test_support.py | 15 +- Lib/test/test_syntax.py | 41 +- Lib/test/test_sys.py | 8 +- Lib/test/test_sys_settrace.py | 52 + Lib/test/test_sysconfig.py | 9 +- Lib/test/test_tarfile.py | 93 +- Lib/test/test_tcl.py | 3 - Lib/test/test_tempfile.py | 2 + Lib/test/test_thread.py | 18 +- Lib/test/test_threading.py | 4 +- Lib/test/test_threading_local.py | 1 - Lib/test/test_time.py | 3 + Lib/test/test_tk.py | 20 - .../test => test/test_tkinter}/README | 0 Lib/test/test_tkinter/__init__.py | 18 + Lib/test/test_tkinter/__main__.py | 4 + .../test => test/test_tkinter}/support.py | 1 - .../test/test_tkinter/test_colorchooser.py | 2 +- .../test/test_tkinter/test_font.py | 2 +- .../test_tkinter/test_geometry_managers.py | 4 +- .../test/test_tkinter/test_images.py | 2 +- .../test/test_tkinter/test_loadtk.py | 0 .../test/test_tkinter/test_messagebox.py | 2 +- .../test/test_tkinter/test_misc.py | 2 +- .../test/test_tkinter/test_simpledialog.py | 2 +- .../test/test_tkinter/test_text.py | 2 +- .../test/test_tkinter/test_variables.py | 2 +- .../test/test_tkinter/test_widgets.py | 4 +- .../test_tkinter}/widget_tests.py | 3 +- Lib/test/test_tomllib/test_misc.py | 7 +- Lib/test/test_tools/test_md5sum.py | 1 - Lib/test/test_tools/test_pathfix.py | 1 - Lib/test/test_tools/test_pindent.py | 1 - Lib/test/test_traceback.py | 29 +- .../__init__.py} | 15 +- Lib/test/test_ttk/__main__.py | 4 + .../test/test_ttk/test_extensions.py | 2 +- Lib/{tkinter => }/test/test_ttk/test_style.py | 2 +- .../test/test_ttk/test_widgets.py | 4 +- Lib/test/test_type_comments.py | 1 - Lib/test/test_types.py | 6 + Lib/test/test_typing.py | 138 +- Lib/test/test_unicode.py | 31 +- Lib/test/test_unicode_file.py | 2 +- Lib/test/test_unittest.py | 16 - Lib/test/test_unittest/__init__.py | 6 + Lib/test/test_unittest/__main__.py | 4 + .../test_unittest}/_test_warnings.py | 0 .../test => test/test_unittest}/dummy.py | 0 .../test => test/test_unittest}/support.py | 0 .../test_unittest}/test_assertions.py | 0 .../test_unittest}/test_async_case.py | 0 .../test => test/test_unittest}/test_break.py | 0 .../test => test/test_unittest}/test_case.py | 2 +- .../test_unittest}/test_discovery.py | 6 +- .../test_unittest}/test_functiontestcase.py | 2 +- .../test_unittest}/test_loader.py | 6 +- .../test_unittest}/test_program.py | 18 +- .../test_unittest}/test_result.py | 2 +- .../test_unittest}/test_runner.py | 2 +- .../test_unittest}/test_setups.py | 0 .../test_unittest}/test_skipping.py | 2 +- .../test => test/test_unittest}/test_suite.py | 2 +- Lib/test/test_unittest/testmock/__init__.py | 6 + .../test_unittest}/testmock/__main__.py | 2 +- .../test_unittest}/testmock/support.py | 0 .../test_unittest}/testmock/testasync.py | 0 .../test_unittest}/testmock/testcallable.py | 2 +- .../test_unittest}/testmock/testhelpers.py | 0 .../testmock/testmagicmethods.py | 0 .../test_unittest}/testmock/testmock.py | 2 +- .../test_unittest}/testmock/testpatch.py | 22 +- .../test_unittest}/testmock/testsealable.py | 0 .../test_unittest}/testmock/testsentinel.py | 0 .../test_unittest}/testmock/testwith.py | 2 +- Lib/test/test_univnewlines.py | 1 - Lib/test/test_unparse.py | 
3 + Lib/test/test_urllib.py | 5 + Lib/test/test_urllib2.py | 1 - Lib/test/test_urllib2net.py | 1 + Lib/test/test_urllib_response.py | 5 + Lib/test/test_uu.py | 6 + Lib/test/test_venv.py | 77 +- Lib/test/test_wait3.py | 5 +- Lib/test/test_wait4.py | 5 +- Lib/test/test_warnings/__init__.py | 4 +- Lib/test/test_weakref.py | 11 + Lib/test/test_winsound.py | 22 + Lib/test/test_xml_etree.py | 16 +- Lib/test/test_yield_from.py | 527 +++++++ Lib/test/test_zipapp.py | 18 + Lib/test/test_zipfile.py | 11 + Lib/test/test_zipimport.py | 6 +- Lib/threading.py | 23 +- Lib/tkinter/commondialog.py | 2 +- Lib/tkinter/test/__init__.py | 0 Lib/tkinter/test/test_tkinter/__init__.py | 0 Lib/tkinter/test/test_ttk/__init__.py | 0 Lib/types.py | 2 +- Lib/typing.py | 139 +- Lib/unittest/__init__.py | 10 - Lib/unittest/test/__init__.py | 25 - Lib/unittest/test/__main__.py | 18 - Lib/unittest/test/testmock/__init__.py | 17 - Lib/urllib/parse.py | 1 - Lib/urllib/request.py | 9 +- Lib/venv/__init__.py | 2 +- Lib/xml/etree/ElementTree.py | 12 +- Lib/zipapp.py | 2 +- Lib/zipfile.py | 4 +- Mac/BuildScript/scripts/postflight.framework | 4 +- Makefile.pre.in | 85 +- Misc/ACKS | 4 + Misc/NEWS.d/3.11.0b1.rst | 1 + .../2018-08-21-11-10-18.bpo-34449.Z3qm3c.rst | 1 + ...2-05-25-05-46-00.gh-issue-93202.T37jtj.rst | 4 + ...2-05-25-13-56-00.gh-issue-93207.B9Rubf.rst | 3 + ...2-05-31-18-04-58.gh-issue-69093.6lSa0C.rst | 1 + ...2-06-04-12-53-53.gh-issue-93491.ehM211.rst | 1 + ...2-06-08-14-28-03.gh-issue-93584.0xfHOK.rst | 2 + .../2021-10-05-21-59-43.bpo-45383.TVClgf.rst | 3 + ...2-05-13-18-17-48.gh-issue-92781.TVDr3-.rst | 3 + ...2-05-19-18-05-51.gh-issue-92913.Ass1Hv.rst | 2 + ...2-05-23-12-31-04.gh-issue-77782.ugC8dn.rst | 3 + ...2-05-23-13-33-18.gh-issue-93103.ooD3Eb.rst | 4 + ...2-05-23-15-22-18.gh-issue-92898.Qjc9d3.rst | 2 + ...2-06-03-14-54-41.gh-issue-93466.DDtH0X.rst | 3 + ...2-06-04-13-15-41.gh-issue-93442.4M4NDb.rst | 3 + ...2-06-10-16-50-27.gh-issue-89546.mX1f10.rst | 4 + ...2-06-10-23-41-48.gh-issue-91731.fhYUQG.rst | 3 + ...2-06-13-21-37-31.gh-issue-91321.DgJFvS.rst | 2 + ...2-06-17-13-41-38.gh-issue-93937.uKVTEh.rst | 14 + .../2022-01-02-14-53-59.bpo-46142.WayjgT.rst | 3 + ...2-04-24-02-22-10.gh-issue-91432.YPJAK6.rst | 1 + ...2-05-13-00-57-18.gh-issue-92658.YdhFE2.rst | 1 + ...2-05-13-12-36-10.gh-issue-92777.Odo4vP.rst | 1 + ...2-05-14-13-22-11.gh-issue-92804.rAqpI2.rst | 1 + ...2-05-15-15-25-05.gh-issue-90473.MoPHYW.rst | 1 + ...2-05-17-20-41-43.gh-issue-92858.eIXJTn.rst | 1 + ...2-05-18-08-32-33.gh-issue-92914.tJUeTD.rst | 1 + ...2-05-18-12-55-35.gh-issue-90690.TKuoTa.rst | 2 + ...2-05-18-18-34-45.gh-issue-92930.kpYPOb.rst | 1 + ...2-05-19-13-25-50.gh-issue-92955.kmNV33.rst | 1 + ...2-05-19-15-29-53.gh-issue-89914.8bAffH.rst | 2 + ...2-05-20-09-25-34.gh-issue-93021.k3Aji2.rst | 2 + ...2-05-20-13-32-24.gh-issue-93012.e9B-pv.rst | 8 + ...2-05-21-23-21-37.gh-issue-93065.5I18WC.rst | 5 + ...2-05-22-02-37-50.gh-issue-93061.r70Imp.rst | 1 + ...2-05-23-18-36-07.gh-issue-93143.X1Yqxm.rst | 1 + ...2-05-24-14-35-48.gh-issue-93040.9X6Ofu.rst | 1 + ...2-05-25-04-07-22.gh-issue-91924.-UyO4q.rst | 2 + ...2-05-25-12-30-12.gh-issue-84694.5sjy2w.rst | 2 + ...2-05-25-21-56-25.gh-issue-93223.gTOGVZ.rst | 1 + ...2-05-30-10-22-46.gh-issue-93345.gi1A4L.rst | 2 + ...2-05-30-14-50-03.gh-issue-93283.XDO2ZQ.rst | 2 + ...2-05-30-15-35-42.gh-issue-93354.RZk8gs.rst | 3 + ...2-05-30-15-51-11.gh-issue-93356.l5wnzW.rst | 1 + ...2-05-30-19-00-38.gh-issue-93359.zXV3A0.rst | 2 + ...2-05-31-16-36-30.gh-issue-93382.Jf6gAj.rst | 2 + 
...2-06-01-17-47-40.gh-issue-93418.24dJuc.rst | 2 + ...2-06-02-08-28-55.gh-issue-93429.DZTWHx.rst | 1 + ...2-06-02-23-00-08.gh-issue-93444.m63DIs.rst | 1 + ...2-06-06-14-28-24.gh-issue-93533.lnC0CC.rst | 1 + ...2-06-09-09-08-29.gh-issue-93621.-_Pn1d.rst | 1 + ...2-06-09-19-19-02.gh-issue-93461.5DqP1e.rst | 6 + ...2-06-10-10-31-18.gh-issue-93662.-7RSC1.rst | 2 + ...2-06-10-12-03-17.gh-issue-93671.idkQqG.rst | 2 + ...2-06-10-16-57-35.gh-issue-93678.1WBnHt.rst | 1 + ...2-06-12-19-31-56.gh-issue-89828.bq02M7.rst | 2 + ...2-06-13-10-48-09.gh-issue-93516.yJSait.rst | 2 + ...2-06-13-13-55-34.gh-issue-93516.HILrDl.rst | 2 + ...2-06-15-11-16-13.gh-issue-93841.06zqX3.rst | 3 + ...2-06-15-16-45-53.gh-issue-93678.1I_ZT3.rst | 1 + ...2-06-16-16-53-22.gh-issue-93911.RDwIiK.rst | 1 + ...2-06-17-16-30-24.gh-issue-93955.LmiAe9.rst | 1 + ...2-06-20-13-48-57.gh-issue-94021.o78q3G.rst | 1 + .../2022-01-13-16-03-15.bpo-40838.k3NVCf.rst | 2 + .../2022-03-30-17-56-01.bpo-47161.gesHfS.rst | 2 + ...2-05-18-23-58-26.gh-issue-92240.bHvYiz.rst | 2 + ...2-05-26-11-33-23.gh-issue-86438.kEGGmK.rst | 3 + ...2-05-26-14-51-25.gh-issue-88831.5Cccr5.rst | 1 + ...2-05-29-21-22-54.gh-issue-86986.lFXw8j.rst | 1 + ...2-06-15-12-12-49.gh-issue-87260.epyI7D.rst | 1 + ...2-06-16-10-10-59.gh-issue-61162.1ypkG8.rst | 1 + .../2017-07-31-13-35-28.bpo-26253.8v_sCs.rst | 2 + .../2018-09-28-22-18-03.bpo-34828.5Zyi_S.rst | 1 + .../2020-10-15-18-37-12.bpo-42047.XDdoSF.rst | 1 + .../2022-01-09-14-23-00.bpo-28249.4dzB80.rst | 2 + .../2022-02-05-18-46-54.bpo-46642.YI6nHQ.rst | 1 + .../2022-02-09-23-44-27.bpo-45393.9v5Y8U.rst | 2 + .../2022-03-08-04-46-44.bpo-46951.SWAz97.rst | 1 + .../2022-03-19-04-41-42.bpo-47063.nwRfUo.rst | 1 + .../2022-04-03-11-25-02.bpo-41287.8CTdwf.rst | 1 + .../2022-04-03-19-40-09.bpo-39064.76PbIz.rst | 2 + .../2022-04-08-22-12-11.bpo-47231.lvyglt.rst | 1 + ...2-04-11-16-55-41.gh-issue-91456.DK3KKl.rst | 3 + ...2-04-15-17-38-55.gh-issue-91577.Ah7cLL.rst | 1 + ...2-04-24-22-26-45.gh-issue-81790.M5Rvpm.rst | 2 + ...2-05-08-18-51-14.gh-issue-89336.TL6ip7.rst | 4 + ...2-05-09-09-28-02.gh-issue-92530.M4Q1RS.rst | 2 + ...2-05-09-11-55-04.gh-issue-92547.CzVZft.rst | 6 + ...2-05-09-22-27-11.gh-issue-92591.V7RCk2.rst | 3 + ...2-05-11-19-33-27.gh-issue-92671.KE4v6a.rst | 1 + ...2-05-14-11-41-23.gh-issue-90473.kPdOZl.rst | 2 + ...2-05-16-14-35-39.gh-issue-92839.owSMyo.rst | 1 + ...2-05-18-17-18-41.gh-issue-91922.DwWIsJ.rst | 3 + ...2-05-18-21-04-09.gh-issue-87901.lnf041.rst | 2 + ...2-05-19-13-33-18.gh-issue-92675.ZeerMZ.rst | 2 + ...2-05-19-17-49-58.gh-issue-92932.o2peTh.rst | 3 + ...2-05-20-15-52-43.gh-issue-93010.WF-cAc.rst | 1 + ...2-05-21-13-16-16.gh-issue-93044.eJ_XkZ.rst | 2 + ...2-05-22-16-08-01.gh-issue-89973.jc-Q4g.rst | 3 + ...2-05-22-23-46-18.gh-issue-93033.wZfiL-.rst | 1 + ...2-05-24-10-59-02.gh-issue-92728.zxTifq.rst | 3 + ...2-05-24-11-19-04.gh-issue-74696.-cnf-A.rst | 2 + ...2-05-25-00-21-28.gh-issue-91513.9VyCT4.rst | 1 + ...2-05-25-02-45-41.gh-issue-90817.yxANgU.rst | 3 + ...2-05-26-09-24-41.gh-issue-93162.W1VuhU.rst | 4 + ...2-05-26-23-10-55.gh-issue-93156.4XfDVN.rst | 2 + ...2-05-27-10-52-06.gh-issue-85308.K6r-tJ.rst | 4 + ...2-05-27-13-18-18.gh-issue-93297.e2zuHz.rst | 1 + ...2-05-27-22-17-11.gh-issue-88123.mkYl5q.rst | 2 + ...2-05-28-08-02-55.gh-issue-93312.HY0Uzj.rst | 3 + ...2-05-30-21-42-50.gh-issue-83658.01Ntx0.rst | 1 + ...2-05-31-14-58-40.gh-issue-93353.9Hvm6o.rst | 3 + ...2-06-01-11-24-13.gh-issue-91162.NxvU_u.rst | 5 + ...2-06-02-08-40-58.gh-issue-91810.Gtk44w.rst | 2 + 
...2-06-03-22-13-28.gh-issue-93370.tjfu9L.rst | 1 + ...2-06-04-00-11-54.gh-issue-93475.vffFw1.rst | 2 + ...2-06-05-22-22-42.gh-issue-93421.43UO_8.rst | 3 + ...2-06-06-12-58-27.gh-issue-79579.e8rB-M.rst | 2 + ...2-06-06-13-19-43.gh-issue-93521._vE8m9.rst | 4 + ...2-06-07-14-53-46.gh-issue-90549.T4FMKY.rst | 2 + ...2-06-08-20-11-02.gh-issue-90494.LIZT85.rst | 3 + ...2-06-09-10-12-55.gh-issue-90473.683m_C.rst | 2 + ...2-06-09-17-15-26.gh-issue-91389.OE4vS5.rst | 2 + ...2-06-11-13-32-17.gh-issue-79512.A1KTDr.rst | 3 + ...2-06-15-21-20-02.gh-issue-93820.FAMLY8.rst | 2 + ...2-06-15-21-35-11.gh-issue-91404.39TZzW.rst | 3 + ...2-06-16-09-24-50.gh-issue-93847.kuv8bN.rst | 1 + ...2-06-20-23-14-43.gh-issue-94028.UofEcX.rst | 3 + ...2-06-22-11-16-11.gh-issue-94101.V9vDG8.rst | 3 + ...2-06-23-13-12-05.gh-issue-91742.sNytVX.rst | 1 + ...2-06-23-14-35-10.gh-issue-94169.jeba90.rst | 4 + ...2-06-24-09-41-41.gh-issue-94196.r2KyfS.rst | 4 + ...2-06-24-10-29-19.gh-issue-94199.pfehmz.rst | 3 + ...2-06-24-17-11-33.gh-issue-94199.7releN.rst | 4 + ...2-06-24-19-23-59.gh-issue-94207.VhS1eS.rst | 2 + ...2-06-25-13-38-53.gh-issue-93259.FAGw-2.rst | 2 + ...2-04-27-18-25-30.gh-issue-68966.gjS8zs.rst | 4 + ...2-05-19-08-53-07.gh-issue-92888.TLtR9W.rst | 2 + ...2-06-03-12-52-53.gh-issue-79096.YVoxgC.rst | 1 + ...2-06-15-20-09-23.gh-issue-87389.QVaC3f.rst | 3 + .../2022-03-14-23-28-17.bpo-47016.K-t2QX.rst | 2 + ...2-05-12-05-51-06.gh-issue-92670.7L43Z_.rst | 3 + ...2-05-25-23-00-35.gh-issue-92886.Y-vrWj.rst | 1 + ...2-05-25-23-07-15.gh-issue-92886.Aki63_.rst | 1 + ...2-06-03-12-22-44.gh-issue-89858.ftBvjE.rst | 1 + ...2-06-03-14-18-37.gh-issue-90473.7iXVRK.rst | 2 + ...2-06-03-16-26-04.gh-issue-57539.HxWgYO.rst | 1 + ...2-06-04-12-05-31.gh-issue-90473.RSpjF7.rst | 1 + ...2-06-05-10-16-45.gh-issue-90473.QMu7A8.rst | 2 + ...2-06-08-14-17-59.gh-issue-93575.Xb2LNB.rst | 4 + ...2-06-08-22-32-56.gh-issue-93616.e5Kkx2.rst | 2 + ...2-06-10-21-18-14.gh-issue-84461.9TAb26.rst | 2 + ...2-06-16-17-50-58.gh-issue-93353.JdpATx.rst | 2 + ...2-06-16-21-38-18.gh-issue-93852.U_Hl6s.rst | 4 + ...2-06-17-13-55-11.gh-issue-93957.X4ovYV.rst | 2 + ...2-06-17-15-20-09.gh-issue-93951.CW1Vv4.rst | 1 + ...2-06-20-23-04-52.gh-issue-93839.OE3Ybk.rst | 2 + ...2-06-21-17-37-46.gh-issue-54781.BjVAVg.rst | 2 + ...2-06-19-14-56-33.gh-issue-86087.R8MkRy.rst | 2 + .../2020-01-10-23-33-03.bpo-38704.2Idtdn.rst | 1 + .../2022-03-20-15-47-35.bpo-42658.16eXtb.rst | 3 + ...2-04-12-18-35-20.gh-issue-91061.x40hSK.rst | 1 + ...2-05-16-11-45-06.gh-issue-92841.NQx107.rst | 2 + ...2-05-19-14-01-30.gh-issue-92984.Dsxnlr.rst | 1 + ...2-05-19-21-44-25.gh-issue-92817.Jrf-Kv.rst | 3 + ...2-06-15-01-03-52.gh-issue-93824.mR4mxu.rst | 2 + Misc/python.man | 44 +- Misc/stable_abi.toml | 2 + Modules/Setup.stdlib.in | 2 +- Modules/_asynciomodule.c | 7 +- Modules/_bisectmodule.c | 4 +- Modules/_ctypes/_ctypes.c | 4 +- Modules/_ctypes/callbacks.c | 14 +- Modules/_ctypes/ctypes.h | 6 +- Modules/_datetimemodule.c | 34 +- Modules/_elementtree.c | 8 +- Modules/_hashopenssl.c | 6 +- Modules/_io/bufferedio.c | 2 +- Modules/_io/textio.c | 4 +- Modules/_operator.c | 9 +- Modules/_pickle.c | 63 +- Modules/_posixsubprocess.c | 4 +- Modules/_sqlite/clinic/connection.c.h | 29 +- Modules/_sqlite/clinic/module.c.h | 54 +- Modules/_sqlite/connection.c | 113 +- Modules/_sqlite/cursor.c | 50 +- Modules/_sqlite/microprotocols.c | 4 +- Modules/_sqlite/module.c | 58 +- Modules/_sqlite/statement.c | 119 +- Modules/_sre/clinic/sre.c.h | 27 +- Modules/_sre/sre.c | 78 +- Modules/_sre/sre.h | 4 - 
Modules/_sre/sre_constants.h | 3 +- Modules/_sre/sre_lib.h | 30 +- Modules/_ssl.c | 22 +- Modules/_struct.c | 22 +- Modules/_testcapimodule.c | 335 +++- Modules/_testinternalcapi.c | 2 +- Modules/_winapi.c | 61 + Modules/_xxsubinterpretersmodule.c | 18 +- Modules/_zoneinfo.c | 35 +- Modules/arraymodule.c | 21 +- Modules/binascii.c | 10 +- Modules/cjkcodecs/cjkcodecs.h | 9 +- Modules/cjkcodecs/mappings_hk.h | 1 + Modules/cjkcodecs/mappings_tw.h | 2 + Modules/clinic/_ssl.c.h | 33 +- Modules/clinic/_winapi.c.h | 39 +- Modules/faulthandler.c | 10 +- Modules/fcntlmodule.c | 9 + Modules/gcmodule.c | 8 - Modules/getnameinfo.c | 3 +- Modules/getpath.py | 8 +- Modules/main.c | 21 +- Modules/mathmodule.c | 74 +- Modules/posixmodule.c | 13 +- Modules/pyexpat.c | 1 - Modules/signalmodule.c | 7 +- Modules/socketmodule.c | 132 ++ Modules/socketmodule.h | 17 + Modules/timemodule.c | 11 +- Modules/xxlimited_35.c | 58 +- Modules/xxmodule.c | 39 +- Objects/abstract.c | 9 +- Objects/bytearrayobject.c | 1 + Objects/bytes_methods.c | 1 + Objects/bytesobject.c | 4 - Objects/call.c | 21 +- Objects/codeobject.c | 87 +- Objects/descrobject.c | 65 +- Objects/exceptions.c | 4 - Objects/fileobject.c | 20 +- Objects/floatobject.c | 2 +- Objects/frameobject.c | 42 +- Objects/genericaliasobject.c | 100 +- Objects/genobject.c | 4 + Objects/listobject.c | 46 +- Objects/longobject.c | 34 +- Objects/memoryobject.c | 71 +- Objects/object.c | 16 +- Objects/obmalloc.c | 27 +- Objects/rangeobject.c | 23 +- Objects/setobject.c | 10 +- Objects/stringlib/asciilib.h | 1 + Objects/stringlib/fastsearch.h | 36 +- Objects/stringlib/replace.h | 4 +- Objects/stringlib/stringdefs.h | 1 + Objects/stringlib/ucs1lib.h | 1 + Objects/stringlib/ucs2lib.h | 4 + Objects/stringlib/ucs4lib.h | 4 + Objects/stringlib/undef.h | 1 + Objects/typeobject.c | 477 ++++-- Objects/unicodeobject.c | 39 +- Objects/weakrefobject.c | 6 +- PC/launcher2.c | 23 +- PC/layout/main.py | 3 - PC/layout/support/appxmanifest.py | 4 +- PC/layout/support/catalog.py | 2 - PC/layout/support/constants.py | 1 - PC/pyconfig.h | 19 +- PC/python3dll.c | 1 + PC/winsound.c | 22 +- PCbuild/_socket.vcxproj | 2 +- PCbuild/lib.pyproj | 269 ++-- PCbuild/pyproject.props | 1 + PCbuild/pythoncore.vcxproj | 4 + PCbuild/pythoncore.vcxproj.filters | 12 + Parser/asdl_c.py | 15 +- Parser/myreadline.c | 3 +- Parser/parser.c | 880 ++++++----- Parser/pegen.c | 11 +- Parser/pegen.h | 1 + Parser/string_parser.c | 44 +- Parser/tokenizer.c | 27 +- Parser/tokenizer.h | 3 + Programs/_testembed.c | 41 + Programs/test_frozenmain.h | 74 +- Python/Python-ast.c | 24 +- Python/_warnings.c | 17 - Python/ast.c | 26 + Python/bltinmodule.c | 1 + Python/ceval.c | 917 ++++++----- Python/ceval_gil.h | 18 - Python/clinic/sysmodule.c.h | 106 +- Python/compile.c | 1363 ++++++++++------- Python/condvar.h | 6 +- Python/errors.c | 16 - Python/fileutils.c | 8 +- Python/frame.c | 20 +- Python/frozenmain.c | 2 +- Python/getargs.c | 8 - Python/getopt.c | 6 +- Python/hamt.c | 18 +- Python/import.c | 65 +- Python/initconfig.c | 204 ++- Python/opcode_targets.h | 94 +- Python/pathconfig.c | 23 +- Python/preconfig.c | 50 +- Python/pylifecycle.c | 39 +- Python/pystate.c | 20 +- Python/pytime.c | 8 + Python/specialize.c | 488 +++--- Python/suggestions.c | 1 - Python/sysmodule.c | 76 +- Python/thread_pthread.h | 173 +-- Python/traceback.c | 3 +- README.rst | 2 +- Tools/c-analyzer/cpython/globals-to-fix.tsv | 7 +- Tools/clinic/clinic.py | 1 - Tools/msi/bundle/Default.wxl | 1 - Tools/msi/bundle/packagegroups/launcher.wxs | 12 +- 
Tools/msi/launcher/launcher.wxs | 13 +- Tools/msi/launcher/launcher_files.wxs | 13 +- Tools/peg_generator/Makefile | 2 +- Tools/peg_generator/pegen/c_generator.py | 8 +- Tools/peg_generator/pegen/grammar.py | 7 - Tools/peg_generator/scripts/benchmark.py | 2 +- .../peg_generator/scripts/grammar_grapher.py | 1 - .../scripts/test_parse_directory.py | 2 - .../scripts/test_pypi_packages.py | 3 +- Tools/scripts/deepfreeze.py | 18 +- Tools/scripts/freeze_modules.py | 1 - Tools/scripts/generate_re_casefix.py | 1 - Tools/scripts/generate_stdlib_module_names.py | 1 - Tools/scripts/parse_html5_entities.py | 27 +- Tools/scripts/parseentities.py | 64 - Tools/scripts/pathfix.py | 1 - Tools/scripts/run_tests.py | 46 +- Tools/scripts/stable_abi.py | 1 - Tools/scripts/summarize_stats.py | 70 +- Tools/scripts/verify_ensurepip_wheels.py | 98 ++ Tools/ssl/multissltests.py | 1 - Tools/unicode/genmap_tchinese.py | 239 +++ Tools/wasm/README.md | 125 +- Tools/wasm/config.site-wasm32-emscripten | 10 + Tools/wasm/config.site-wasm32-wasi | 27 + Tools/wasm/wasm_assets.py | 49 +- configure | 275 ++-- configure.ac | 188 +-- pyconfig.h.in | 16 +- setup.py | 4 - 1037 files changed, 14845 insertions(+), 10157 deletions(-) delete mode 100644 .azure-pipelines/windows-release.yml delete mode 100644 .azure-pipelines/windows-release/build-steps.yml delete mode 100644 .azure-pipelines/windows-release/checkout.yml delete mode 100644 .azure-pipelines/windows-release/find-sdk.yml delete mode 100644 .azure-pipelines/windows-release/layout-command.yml delete mode 100644 .azure-pipelines/windows-release/mingw-lib.yml delete mode 100644 .azure-pipelines/windows-release/msi-steps.yml delete mode 100644 .azure-pipelines/windows-release/stage-build.yml delete mode 100644 .azure-pipelines/windows-release/stage-layout-embed.yml delete mode 100644 .azure-pipelines/windows-release/stage-layout-full.yml delete mode 100644 .azure-pipelines/windows-release/stage-layout-msix.yml delete mode 100644 .azure-pipelines/windows-release/stage-layout-nuget.yml delete mode 100644 .azure-pipelines/windows-release/stage-msi.yml delete mode 100644 .azure-pipelines/windows-release/stage-pack-msix.yml delete mode 100644 .azure-pipelines/windows-release/stage-pack-nuget.yml delete mode 100644 .azure-pipelines/windows-release/stage-publish-nugetorg.yml delete mode 100644 .azure-pipelines/windows-release/stage-publish-pythonorg.yml delete mode 100644 .azure-pipelines/windows-release/stage-publish-store.yml delete mode 100644 .azure-pipelines/windows-release/stage-sign.yml delete mode 100644 .azure-pipelines/windows-release/stage-test-embed.yml delete mode 100644 .azure-pipelines/windows-release/stage-test-msi.yml delete mode 100644 .azure-pipelines/windows-release/stage-test-nuget.yml create mode 100644 .github/workflows/regen-abidump.sh create mode 100644 .github/workflows/verify-ensurepip-wheels.yml delete mode 100644 Doc/includes/sqlite3/adapter_datetime.py mode change 100755 => 100644 Doc/library/socket.rst create mode 100644 Include/cpython/pyframe.h create mode 100644 Include/internal/pycore_descrobject.h create mode 100644 Include/internal/pycore_range.h create mode 100644 Include/pystats.h delete mode 100644 Lib/ctypes/test/__main__.py mode change 100755 => 100644 Lib/socket.py create mode 100644 Lib/test/doctest_lineno.py delete mode 100644 Lib/test/test_ctypes.py rename Lib/{ctypes/test => test/test_ctypes}/__init__.py (100%) create mode 100644 Lib/test/test_ctypes/__main__.py rename Lib/{ctypes/test => test/test_ctypes}/test_anon.py (100%) rename 
Lib/{ctypes/test => test/test_ctypes}/test_array_in_pointer.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_arrays.py (99%) rename Lib/{ctypes/test => test/test_ctypes}/test_as_parameter.py (98%) rename Lib/{ctypes/test => test/test_ctypes}/test_bitfields.py (99%) rename Lib/{ctypes/test => test/test_ctypes}/test_buffers.py (98%) rename Lib/{ctypes/test => test/test_ctypes}/test_bytes.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_byteswap.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_callbacks.py (98%) rename Lib/{ctypes/test => test/test_ctypes}/test_cast.py (98%) rename Lib/{ctypes/test => test/test_ctypes}/test_cfuncs.py (97%) rename Lib/{ctypes/test => test/test_ctypes}/test_checkretval.py (95%) rename Lib/{ctypes/test => test/test_ctypes}/test_delattr.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_errno.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_find.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_frombuffer.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_funcptr.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_functions.py (99%) rename Lib/{ctypes/test => test/test_ctypes}/test_incomplete.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_init.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_internals.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_keeprefs.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_libc.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_loading.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_macholib.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_memfunctions.py (98%) rename Lib/{ctypes/test => test/test_ctypes}/test_numbers.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_objects.py (87%) rename Lib/{ctypes/test => test/test_ctypes}/test_parameters.py (99%) rename Lib/{ctypes/test => test/test_ctypes}/test_pep3118.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_pickling.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_pointers.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_prototypes.py (99%) rename Lib/{ctypes/test => test/test_ctypes}/test_python_api.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_random_things.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_refcounts.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_repr.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_returnfuncptrs.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_simplesubclasses.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_sizes.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_slicing.py (99%) rename Lib/{ctypes/test => test/test_ctypes}/test_stringptr.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_strings.py (99%) rename Lib/{ctypes/test => test/test_ctypes}/test_struct_fields.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_structures.py (99%) rename Lib/{ctypes/test => test/test_ctypes}/test_unaligned_structures.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_unicode.py (97%) rename Lib/{ctypes/test => test/test_ctypes}/test_values.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_varsize_struct.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_win32.py (100%) rename Lib/{ctypes/test => test/test_ctypes}/test_wintypes.py (100%) delete mode 100644 Lib/test/test_lib2to3.py rename Lib/{lib2to3/tests => 
test/test_lib2to3}/__init__.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/__main__.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/data/README (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/data/bom.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/data/crlf.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/data/different_encoding.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/data/false_encoding.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/data/fixers/bad_order.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/data/fixers/myfixes/__init__.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/data/fixers/myfixes/fix_explicit.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/data/fixers/myfixes/fix_first.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/data/fixers/myfixes/fix_last.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/data/fixers/myfixes/fix_parrot.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/data/fixers/myfixes/fix_preorder.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/data/fixers/no_fixer_cls.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/data/fixers/parrot_example.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/data/infinite_recursion.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/data/py2_test_grammar.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/data/py3_test_grammar.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/pytree_idempotency.py (96%) rename Lib/{lib2to3/tests => test/test_lib2to3}/support.py (77%) rename Lib/{lib2to3/tests => test/test_lib2to3}/test_all_fixers.py (99%) rename Lib/{lib2to3/tests => test/test_lib2to3}/test_fixers.py (99%) rename Lib/{lib2to3/tests => test/test_lib2to3}/test_main.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/test_parser.py (99%) rename Lib/{lib2to3/tests => test/test_lib2to3}/test_pytree.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/test_refactor.py (100%) rename Lib/{lib2to3/tests => test/test_lib2to3}/test_util.py (100%) mode change 100755 => 100644 Lib/test/test_socket.py delete mode 100644 Lib/test/test_tk.py rename Lib/{tkinter/test => test/test_tkinter}/README (100%) create mode 100644 Lib/test/test_tkinter/__init__.py create mode 100644 Lib/test/test_tkinter/__main__.py rename Lib/{tkinter/test => test/test_tkinter}/support.py (99%) rename Lib/{tkinter => }/test/test_tkinter/test_colorchooser.py (96%) rename Lib/{tkinter => }/test/test_tkinter/test_font.py (98%) rename Lib/{tkinter => }/test/test_tkinter/test_geometry_managers.py (99%) rename Lib/{tkinter => }/test/test_tkinter/test_images.py (99%) rename Lib/{tkinter => }/test/test_tkinter/test_loadtk.py (100%) rename Lib/{tkinter => }/test/test_tkinter/test_messagebox.py (94%) rename Lib/{tkinter => }/test/test_tkinter/test_misc.py (99%) rename Lib/{tkinter => }/test/test_tkinter/test_simpledialog.py (93%) rename Lib/{tkinter => }/test/test_tkinter/test_text.py (96%) rename Lib/{tkinter => }/test/test_tkinter/test_variables.py (99%) rename Lib/{tkinter => }/test/test_tkinter/test_widgets.py (99%) rename Lib/{tkinter/test => test/test_tkinter}/widget_tests.py (99%) rename Lib/test/{test_ttk_guionly.py => test_ttk/__init__.py} (68%) create mode 100644 Lib/test/test_ttk/__main__.py rename Lib/{tkinter => }/test/test_ttk/test_extensions.py (99%) rename Lib/{tkinter => }/test/test_ttk/test_style.py (98%) rename Lib/{tkinter => }/test/test_ttk/test_widgets.py (99%) delete mode 
100644 Lib/test/test_unittest.py create mode 100644 Lib/test/test_unittest/__init__.py create mode 100644 Lib/test/test_unittest/__main__.py rename Lib/{unittest/test => test/test_unittest}/_test_warnings.py (100%) rename Lib/{unittest/test => test/test_unittest}/dummy.py (100%) rename Lib/{unittest/test => test/test_unittest}/support.py (100%) rename Lib/{unittest/test => test/test_unittest}/test_assertions.py (100%) rename Lib/{unittest/test => test/test_unittest}/test_async_case.py (100%) rename Lib/{unittest/test => test/test_unittest}/test_break.py (100%) rename Lib/{unittest/test => test/test_unittest}/test_case.py (99%) rename Lib/{unittest/test => test/test_unittest}/test_discovery.py (99%) rename Lib/{unittest/test => test/test_unittest}/test_functiontestcase.py (99%) rename Lib/{unittest/test => test/test_unittest}/test_loader.py (99%) rename Lib/{unittest/test => test/test_unittest}/test_program.py (96%) rename Lib/{unittest/test => test/test_unittest}/test_result.py (99%) rename Lib/{unittest/test => test/test_unittest}/test_runner.py (99%) rename Lib/{unittest/test => test/test_unittest}/test_setups.py (100%) rename Lib/{unittest/test => test/test_unittest}/test_skipping.py (99%) rename Lib/{unittest/test => test/test_unittest}/test_suite.py (99%) create mode 100644 Lib/test/test_unittest/testmock/__init__.py rename Lib/{unittest/test => test/test_unittest}/testmock/__main__.py (86%) rename Lib/{unittest/test => test/test_unittest}/testmock/support.py (100%) rename Lib/{unittest/test => test/test_unittest}/testmock/testasync.py (100%) rename Lib/{unittest/test => test/test_unittest}/testmock/testcallable.py (98%) rename Lib/{unittest/test => test/test_unittest}/testmock/testhelpers.py (100%) rename Lib/{unittest/test => test/test_unittest}/testmock/testmagicmethods.py (100%) rename Lib/{unittest/test => test/test_unittest}/testmock/testmock.py (99%) rename Lib/{unittest/test => test/test_unittest}/testmock/testpatch.py (98%) rename Lib/{unittest/test => test/test_unittest}/testmock/testsealable.py (100%) rename Lib/{unittest/test => test/test_unittest}/testmock/testsentinel.py (100%) rename Lib/{unittest/test => test/test_unittest}/testmock/testwith.py (99%) delete mode 100644 Lib/tkinter/test/__init__.py delete mode 100644 Lib/tkinter/test/test_tkinter/__init__.py delete mode 100644 Lib/tkinter/test/test_ttk/__init__.py delete mode 100644 Lib/unittest/test/__init__.py delete mode 100644 Lib/unittest/test/__main__.py delete mode 100644 Lib/unittest/test/testmock/__init__.py create mode 100644 Misc/NEWS.d/next/Build/2018-08-21-11-10-18.bpo-34449.Z3qm3c.rst create mode 100644 Misc/NEWS.d/next/Build/2022-05-25-05-46-00.gh-issue-93202.T37jtj.rst create mode 100644 Misc/NEWS.d/next/Build/2022-05-25-13-56-00.gh-issue-93207.B9Rubf.rst create mode 100644 Misc/NEWS.d/next/Build/2022-05-31-18-04-58.gh-issue-69093.6lSa0C.rst create mode 100644 Misc/NEWS.d/next/Build/2022-06-04-12-53-53.gh-issue-93491.ehM211.rst create mode 100644 Misc/NEWS.d/next/Build/2022-06-08-14-28-03.gh-issue-93584.0xfHOK.rst create mode 100644 Misc/NEWS.d/next/C API/2021-10-05-21-59-43.bpo-45383.TVClgf.rst create mode 100644 Misc/NEWS.d/next/C API/2022-05-13-18-17-48.gh-issue-92781.TVDr3-.rst create mode 100644 Misc/NEWS.d/next/C API/2022-05-19-18-05-51.gh-issue-92913.Ass1Hv.rst create mode 100644 Misc/NEWS.d/next/C API/2022-05-23-12-31-04.gh-issue-77782.ugC8dn.rst create mode 100644 Misc/NEWS.d/next/C API/2022-05-23-13-33-18.gh-issue-93103.ooD3Eb.rst create mode 100644 Misc/NEWS.d/next/C 
API/2022-05-23-15-22-18.gh-issue-92898.Qjc9d3.rst create mode 100644 Misc/NEWS.d/next/C API/2022-06-03-14-54-41.gh-issue-93466.DDtH0X.rst create mode 100644 Misc/NEWS.d/next/C API/2022-06-04-13-15-41.gh-issue-93442.4M4NDb.rst create mode 100644 Misc/NEWS.d/next/C API/2022-06-10-16-50-27.gh-issue-89546.mX1f10.rst create mode 100644 Misc/NEWS.d/next/C API/2022-06-10-23-41-48.gh-issue-91731.fhYUQG.rst create mode 100644 Misc/NEWS.d/next/C API/2022-06-13-21-37-31.gh-issue-91321.DgJFvS.rst create mode 100644 Misc/NEWS.d/next/C API/2022-06-17-13-41-38.gh-issue-93937.uKVTEh.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-01-02-14-53-59.bpo-46142.WayjgT.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-04-24-02-22-10.gh-issue-91432.YPJAK6.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-13-00-57-18.gh-issue-92658.YdhFE2.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-13-12-36-10.gh-issue-92777.Odo4vP.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-14-13-22-11.gh-issue-92804.rAqpI2.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-15-15-25-05.gh-issue-90473.MoPHYW.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-17-20-41-43.gh-issue-92858.eIXJTn.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-18-08-32-33.gh-issue-92914.tJUeTD.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-18-12-55-35.gh-issue-90690.TKuoTa.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-18-18-34-45.gh-issue-92930.kpYPOb.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-19-13-25-50.gh-issue-92955.kmNV33.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-19-15-29-53.gh-issue-89914.8bAffH.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-20-09-25-34.gh-issue-93021.k3Aji2.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-20-13-32-24.gh-issue-93012.e9B-pv.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-21-23-21-37.gh-issue-93065.5I18WC.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-22-02-37-50.gh-issue-93061.r70Imp.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-23-18-36-07.gh-issue-93143.X1Yqxm.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-24-14-35-48.gh-issue-93040.9X6Ofu.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-25-04-07-22.gh-issue-91924.-UyO4q.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-25-12-30-12.gh-issue-84694.5sjy2w.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-25-21-56-25.gh-issue-93223.gTOGVZ.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-30-10-22-46.gh-issue-93345.gi1A4L.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-30-14-50-03.gh-issue-93283.XDO2ZQ.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-30-15-35-42.gh-issue-93354.RZk8gs.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-30-15-51-11.gh-issue-93356.l5wnzW.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-30-19-00-38.gh-issue-93359.zXV3A0.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-05-31-16-36-30.gh-issue-93382.Jf6gAj.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-06-01-17-47-40.gh-issue-93418.24dJuc.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-06-02-08-28-55.gh-issue-93429.DZTWHx.rst create mode 100644 Misc/NEWS.d/next/Core and 
Builtins/2022-06-02-23-00-08.gh-issue-93444.m63DIs.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-06-06-14-28-24.gh-issue-93533.lnC0CC.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-06-09-09-08-29.gh-issue-93621.-_Pn1d.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-06-09-19-19-02.gh-issue-93461.5DqP1e.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-06-10-10-31-18.gh-issue-93662.-7RSC1.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-06-10-12-03-17.gh-issue-93671.idkQqG.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-06-10-16-57-35.gh-issue-93678.1WBnHt.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-06-12-19-31-56.gh-issue-89828.bq02M7.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-06-13-10-48-09.gh-issue-93516.yJSait.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-06-13-13-55-34.gh-issue-93516.HILrDl.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-06-15-11-16-13.gh-issue-93841.06zqX3.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-06-15-16-45-53.gh-issue-93678.1I_ZT3.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-06-16-16-53-22.gh-issue-93911.RDwIiK.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-06-17-16-30-24.gh-issue-93955.LmiAe9.rst create mode 100644 Misc/NEWS.d/next/Core and Builtins/2022-06-20-13-48-57.gh-issue-94021.o78q3G.rst create mode 100644 Misc/NEWS.d/next/Documentation/2022-01-13-16-03-15.bpo-40838.k3NVCf.rst create mode 100644 Misc/NEWS.d/next/Documentation/2022-03-30-17-56-01.bpo-47161.gesHfS.rst create mode 100644 Misc/NEWS.d/next/Documentation/2022-05-18-23-58-26.gh-issue-92240.bHvYiz.rst create mode 100644 Misc/NEWS.d/next/Documentation/2022-05-26-11-33-23.gh-issue-86438.kEGGmK.rst create mode 100644 Misc/NEWS.d/next/Documentation/2022-05-26-14-51-25.gh-issue-88831.5Cccr5.rst create mode 100644 Misc/NEWS.d/next/Documentation/2022-05-29-21-22-54.gh-issue-86986.lFXw8j.rst create mode 100644 Misc/NEWS.d/next/Documentation/2022-06-15-12-12-49.gh-issue-87260.epyI7D.rst create mode 100644 Misc/NEWS.d/next/Documentation/2022-06-16-10-10-59.gh-issue-61162.1ypkG8.rst create mode 100644 Misc/NEWS.d/next/Library/2017-07-31-13-35-28.bpo-26253.8v_sCs.rst create mode 100644 Misc/NEWS.d/next/Library/2018-09-28-22-18-03.bpo-34828.5Zyi_S.rst create mode 100644 Misc/NEWS.d/next/Library/2020-10-15-18-37-12.bpo-42047.XDdoSF.rst create mode 100644 Misc/NEWS.d/next/Library/2022-01-09-14-23-00.bpo-28249.4dzB80.rst create mode 100644 Misc/NEWS.d/next/Library/2022-02-05-18-46-54.bpo-46642.YI6nHQ.rst create mode 100644 Misc/NEWS.d/next/Library/2022-02-09-23-44-27.bpo-45393.9v5Y8U.rst create mode 100644 Misc/NEWS.d/next/Library/2022-03-08-04-46-44.bpo-46951.SWAz97.rst create mode 100644 Misc/NEWS.d/next/Library/2022-03-19-04-41-42.bpo-47063.nwRfUo.rst create mode 100644 Misc/NEWS.d/next/Library/2022-04-03-11-25-02.bpo-41287.8CTdwf.rst create mode 100644 Misc/NEWS.d/next/Library/2022-04-03-19-40-09.bpo-39064.76PbIz.rst create mode 100644 Misc/NEWS.d/next/Library/2022-04-08-22-12-11.bpo-47231.lvyglt.rst create mode 100644 Misc/NEWS.d/next/Library/2022-04-11-16-55-41.gh-issue-91456.DK3KKl.rst create mode 100644 Misc/NEWS.d/next/Library/2022-04-15-17-38-55.gh-issue-91577.Ah7cLL.rst create mode 100644 Misc/NEWS.d/next/Library/2022-04-24-22-26-45.gh-issue-81790.M5Rvpm.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-08-18-51-14.gh-issue-89336.TL6ip7.rst create mode 100644 
Misc/NEWS.d/next/Library/2022-05-09-09-28-02.gh-issue-92530.M4Q1RS.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-09-11-55-04.gh-issue-92547.CzVZft.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-09-22-27-11.gh-issue-92591.V7RCk2.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-11-19-33-27.gh-issue-92671.KE4v6a.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-14-11-41-23.gh-issue-90473.kPdOZl.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-16-14-35-39.gh-issue-92839.owSMyo.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-18-17-18-41.gh-issue-91922.DwWIsJ.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-18-21-04-09.gh-issue-87901.lnf041.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-19-13-33-18.gh-issue-92675.ZeerMZ.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-19-17-49-58.gh-issue-92932.o2peTh.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-20-15-52-43.gh-issue-93010.WF-cAc.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-21-13-16-16.gh-issue-93044.eJ_XkZ.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-22-16-08-01.gh-issue-89973.jc-Q4g.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-22-23-46-18.gh-issue-93033.wZfiL-.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-24-10-59-02.gh-issue-92728.zxTifq.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-24-11-19-04.gh-issue-74696.-cnf-A.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-25-00-21-28.gh-issue-91513.9VyCT4.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-25-02-45-41.gh-issue-90817.yxANgU.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-26-09-24-41.gh-issue-93162.W1VuhU.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-26-23-10-55.gh-issue-93156.4XfDVN.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-27-10-52-06.gh-issue-85308.K6r-tJ.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-27-13-18-18.gh-issue-93297.e2zuHz.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-27-22-17-11.gh-issue-88123.mkYl5q.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-28-08-02-55.gh-issue-93312.HY0Uzj.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-30-21-42-50.gh-issue-83658.01Ntx0.rst create mode 100644 Misc/NEWS.d/next/Library/2022-05-31-14-58-40.gh-issue-93353.9Hvm6o.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-01-11-24-13.gh-issue-91162.NxvU_u.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-02-08-40-58.gh-issue-91810.Gtk44w.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-03-22-13-28.gh-issue-93370.tjfu9L.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-04-00-11-54.gh-issue-93475.vffFw1.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-05-22-22-42.gh-issue-93421.43UO_8.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-06-12-58-27.gh-issue-79579.e8rB-M.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-06-13-19-43.gh-issue-93521._vE8m9.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-07-14-53-46.gh-issue-90549.T4FMKY.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-08-20-11-02.gh-issue-90494.LIZT85.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-09-10-12-55.gh-issue-90473.683m_C.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-09-17-15-26.gh-issue-91389.OE4vS5.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-11-13-32-17.gh-issue-79512.A1KTDr.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-15-21-20-02.gh-issue-93820.FAMLY8.rst create mode 100644 
Misc/NEWS.d/next/Library/2022-06-15-21-35-11.gh-issue-91404.39TZzW.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-16-09-24-50.gh-issue-93847.kuv8bN.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-20-23-14-43.gh-issue-94028.UofEcX.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-22-11-16-11.gh-issue-94101.V9vDG8.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-23-13-12-05.gh-issue-91742.sNytVX.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-23-14-35-10.gh-issue-94169.jeba90.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-24-09-41-41.gh-issue-94196.r2KyfS.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-24-10-29-19.gh-issue-94199.pfehmz.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-24-17-11-33.gh-issue-94199.7releN.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-24-19-23-59.gh-issue-94207.VhS1eS.rst create mode 100644 Misc/NEWS.d/next/Library/2022-06-25-13-38-53.gh-issue-93259.FAGw-2.rst create mode 100644 Misc/NEWS.d/next/Security/2022-04-27-18-25-30.gh-issue-68966.gjS8zs.rst create mode 100644 Misc/NEWS.d/next/Security/2022-05-19-08-53-07.gh-issue-92888.TLtR9W.rst create mode 100644 Misc/NEWS.d/next/Security/2022-06-03-12-52-53.gh-issue-79096.YVoxgC.rst create mode 100644 Misc/NEWS.d/next/Security/2022-06-15-20-09-23.gh-issue-87389.QVaC3f.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-03-14-23-28-17.bpo-47016.K-t2QX.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-05-12-05-51-06.gh-issue-92670.7L43Z_.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-05-25-23-00-35.gh-issue-92886.Y-vrWj.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-05-25-23-07-15.gh-issue-92886.Aki63_.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-06-03-12-22-44.gh-issue-89858.ftBvjE.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-06-03-14-18-37.gh-issue-90473.7iXVRK.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-06-03-16-26-04.gh-issue-57539.HxWgYO.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-06-04-12-05-31.gh-issue-90473.RSpjF7.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-06-05-10-16-45.gh-issue-90473.QMu7A8.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-06-08-14-17-59.gh-issue-93575.Xb2LNB.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-06-08-22-32-56.gh-issue-93616.e5Kkx2.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-06-10-21-18-14.gh-issue-84461.9TAb26.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-06-16-17-50-58.gh-issue-93353.JdpATx.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-06-16-21-38-18.gh-issue-93852.U_Hl6s.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-06-17-13-55-11.gh-issue-93957.X4ovYV.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-06-17-15-20-09.gh-issue-93951.CW1Vv4.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-06-20-23-04-52.gh-issue-93839.OE3Ybk.rst create mode 100644 Misc/NEWS.d/next/Tests/2022-06-21-17-37-46.gh-issue-54781.BjVAVg.rst create mode 100644 Misc/NEWS.d/next/Tools-Demos/2022-06-19-14-56-33.gh-issue-86087.R8MkRy.rst create mode 100644 Misc/NEWS.d/next/Windows/2020-01-10-23-33-03.bpo-38704.2Idtdn.rst create mode 100644 Misc/NEWS.d/next/Windows/2022-03-20-15-47-35.bpo-42658.16eXtb.rst create mode 100644 Misc/NEWS.d/next/Windows/2022-04-12-18-35-20.gh-issue-91061.x40hSK.rst create mode 100644 Misc/NEWS.d/next/Windows/2022-05-16-11-45-06.gh-issue-92841.NQx107.rst create mode 100644 Misc/NEWS.d/next/Windows/2022-05-19-14-01-30.gh-issue-92984.Dsxnlr.rst create mode 100644 Misc/NEWS.d/next/Windows/2022-05-19-21-44-25.gh-issue-92817.Jrf-Kv.rst create mode 
100644 Misc/NEWS.d/next/Windows/2022-06-15-01-03-52.gh-issue-93824.mR4mxu.rst delete mode 100755 Tools/scripts/parseentities.py create mode 100755 Tools/scripts/verify_ensurepip_wheels.py create mode 100644 Tools/unicode/genmap_tchinese.py diff --git a/.azure-pipelines/windows-release.yml b/.azure-pipelines/windows-release.yml deleted file mode 100644 index 581f48ba22882b..00000000000000 --- a/.azure-pipelines/windows-release.yml +++ /dev/null @@ -1,220 +0,0 @@ -name: Release_$(Build.SourceBranchName)_$(SourceTag)_$(Date:yyyyMMdd)$(Rev:.rr) - -parameters: -- name: GitRemote - displayName: "Git remote" - type: string - default: python - values: - - 'python' - - 'pablogsal' - - 'ambv' - - '(Other)' -- name: GitRemote_Other - displayName: "If Other, specify Git remote" - type: string - default: 'python' -- name: SourceTag - displayName: "Git tag" - type: string - default: main -- name: DoPublish - displayName: "Publish release" - type: boolean - default: false -- name: SigningCertificate - displayName: "Code signing certificate" - type: string - default: 'Python Software Foundation' - values: - - 'Python Software Foundation' - - 'TestSign' - - 'Unsigned' -- name: SigningDescription - displayName: "Signature description" - type: string - default: 'Built: $(Build.BuildNumber)' -- name: DoARM64 - displayName: "Publish ARM64 build" - type: boolean - default: true -# Because there is no ARM64 Tcl/Tk pre-3.11, we need a separate option -# to keep those builds working when the files are going to be absent. -# Eventually when we stop releasing anything that old, we can drop this -# argument (and make it implicitly always 'true') -- name: ARM64TclTk - displayName: "Use Tcl/Tk for ARM64 (3.11 and later)" - type: boolean - default: true -- name: DoPGO - displayName: "Run PGO" - type: boolean - default: true -- name: DoCHM - displayName: "Produce compiled help document (pre-3.11)" - type: boolean - default: false -- name: DoLayout - displayName: "Produce full layout artifact" - type: boolean - default: true -- name: DoMSIX - displayName: "Produce Store packages" - type: boolean - default: true -- name: DoNuget - displayName: "Produce Nuget packages" - type: boolean - default: true -- name: DoEmbed - displayName: "Produce embeddable package" - type: boolean - default: true -- name: DoMSI - displayName: "Produce EXE/MSI installer" - type: boolean - default: true -- name: BuildToPublish - displayName: "Build number to publish (0 to skip)" - type: number - default: '0' - -variables: - __RealSigningCertificate: 'Python Software Foundation' - ${{ if ne(parameters.GitRemote, '(Other)') }}: - GitRemote: ${{ parameters.GitRemote }} - ${{ else }}: - GitRemote: ${{ parameters.GitRemote_Other }} - SourceTag: ${{ parameters.SourceTag }} - DoPGO: ${{ parameters.DoPGO }} - ${{ if ne(parameters.SigningCertificate, 'Unsigned') }}: - SigningCertificate: ${{ parameters.SigningCertificate }} - SigningDescription: ${{ parameters.SigningDescription }} - DoCHM: ${{ parameters.DoCHM }} - DoLayout: ${{ parameters.DoLayout }} - DoMSIX: ${{ parameters.DoMSIX }} - DoNuget: ${{ parameters.DoNuget }} - DoEmbed: ${{ parameters.DoEmbed }} - DoMSI: ${{ parameters.DoMSI }} - DoPublish: ${{ parameters.DoPublish }} - PublishARM64: ${{ parameters.DoARM64 }} -# QUEUE TIME VARIABLES -# PyDotOrgUsername: '' -# PyDotOrgServer: '' - -trigger: none -pr: none - -stages: -- ${{ if eq(parameters.BuildToPublish, '0') }}: - - stage: Build - displayName: Build binaries - jobs: - - template: windows-release/stage-build.yml - parameters: - ARM64TclTk: 
${{ parameters.ARM64TclTk }} - - - stage: Sign - displayName: Sign binaries - dependsOn: Build - jobs: - - template: windows-release/stage-sign.yml - - - stage: Layout - displayName: Generate layouts - dependsOn: Sign - jobs: - - template: windows-release/stage-layout-full.yml - parameters: - ARM64TclTk: ${{ parameters.ARM64TclTk }} - - template: windows-release/stage-layout-embed.yml - - template: windows-release/stage-layout-nuget.yml - - - stage: Pack - dependsOn: Layout - jobs: - - template: windows-release/stage-pack-nuget.yml - - - stage: Test - dependsOn: Pack - jobs: - - template: windows-release/stage-test-embed.yml - - template: windows-release/stage-test-nuget.yml - - - ${{ if eq(parameters.DoMSIX, 'true') }}: - - stage: Layout_MSIX - displayName: Generate MSIX layouts - dependsOn: Sign - jobs: - - template: windows-release/stage-layout-msix.yml - parameters: - ARM64TclTk: ${{ parameters.ARM64TclTk }} - - - stage: Pack_MSIX - displayName: Package MSIX - dependsOn: Layout_MSIX - jobs: - - template: windows-release/stage-pack-msix.yml - - - ${{ if eq(parameters.DoMSI, 'true') }}: - - stage: Build_MSI - displayName: Build MSI installer - dependsOn: Sign - jobs: - - template: windows-release/stage-msi.yml - parameters: - ARM64TclTk: ${{ parameters.ARM64TclTk }} - - - stage: Test_MSI - displayName: Test MSI installer - dependsOn: Build_MSI - jobs: - - template: windows-release/stage-test-msi.yml - - - ${{ if eq(parameters.DoPublish, 'true') }}: - - ${{ if eq(parameters.DoMSI, 'true') }}: - - stage: PublishPyDotOrg - displayName: Publish to python.org - dependsOn: ['Test_MSI', 'Test'] - jobs: - - template: windows-release/stage-publish-pythonorg.yml - - - ${{ if eq(parameters.DoNuget, 'true') }}: - - stage: PublishNuget - displayName: Publish to nuget.org - ${{ if eq(parameters.DoMSI, 'true') }}: - dependsOn: ['Test_MSI', 'Test'] - ${{ else }}: - dependsOn: 'Test' - jobs: - - template: windows-release/stage-publish-nugetorg.yml - - - ${{ if eq(parameters.DoMSIX, 'true') }}: - - stage: PublishStore - displayName: Publish to Store - ${{ if eq(parameters.DoMSI, 'true') }}: - dependsOn: ['Test_MSI', 'Pack_MSIX'] - ${{ else }}: - dependsOn: 'Pack_MSIX' - jobs: - - template: windows-release/stage-publish-store.yml - -- ${{ else }}: - - stage: PublishExisting - displayName: Publish existing build - dependsOn: [] - jobs: - - ${{ if eq(parameters.DoMSI, 'true') }}: - - template: windows-release/stage-publish-pythonorg.yml - parameters: - BuildToPublish: ${{ parameters.BuildToPublish }} - - - ${{ if eq(parameters.DoNuget, 'true') }}: - - template: windows-release/stage-publish-nugetorg.yml - parameters: - BuildToPublish: ${{ parameters.BuildToPublish }} - - - ${{ if eq(parameters.DoMSIX, 'true') }}: - - template: windows-release/stage-publish-store.yml - parameters: - BuildToPublish: ${{ parameters.BuildToPublish }} diff --git a/.azure-pipelines/windows-release/build-steps.yml b/.azure-pipelines/windows-release/build-steps.yml deleted file mode 100644 index 5ca2016d65f9e1..00000000000000 --- a/.azure-pipelines/windows-release/build-steps.yml +++ /dev/null @@ -1,84 +0,0 @@ -parameters: - ShouldPGO: false - -steps: -- template: ./checkout.yml - -- powershell: | - $d = (.\PCbuild\build.bat -V) | %{ if($_ -match '\s+(\w+):\s*(.+)\s*$') { @{$Matches[1] = $Matches[2];} }}; - Write-Host "##vso[task.setvariable variable=VersionText]$($d.PythonVersion)" - Write-Host "##vso[task.setvariable variable=VersionNumber]$($d.PythonVersionNumber)" - Write-Host "##vso[task.setvariable 
variable=VersionHex]$($d.PythonVersionHex)" - Write-Host "##vso[task.setvariable variable=VersionUnique]$($d.PythonVersionUnique)" - Write-Host "##vso[build.addbuildtag]$($d.PythonVersion)" - Write-Host "##vso[build.addbuildtag]$($d.PythonVersion)-$(Name)" - displayName: 'Extract version numbers' - -- ${{ if eq(parameters.ShouldPGO, 'false') }}: - - powershell: | - $env:SigningCertificate = $null - .\PCbuild\build.bat -v -p $(Platform) -c $(Configuration) - displayName: 'Run build' - env: - IncludeUwp: true - Py_OutDir: '$(Build.BinariesDirectory)\bin' - -- ${{ if eq(parameters.ShouldPGO, 'true') }}: - - powershell: | - $env:SigningCertificate = $null - .\PCbuild\build.bat -v -p $(Platform) --pgo - displayName: 'Run build with PGO' - env: - IncludeUwp: true - Py_OutDir: '$(Build.BinariesDirectory)\bin' - -- powershell: | - $kitroot = (gp 'HKLM:\SOFTWARE\Microsoft\Windows Kits\Installed Roots\').KitsRoot10 - $tool = (gci -r "$kitroot\Bin\*\x64\signtool.exe" | sort FullName -Desc | select -First 1) - if (-not $tool) { - throw "SDK is not available" - } - Write-Host "##vso[task.prependpath]$($tool.Directory)" - displayName: 'Add WinSDK tools to path' - -- powershell: | - $env:SigningCertificate = $null - $(_HostPython) PC\layout -vv -b "$(Build.BinariesDirectory)\bin" -t "$(Build.BinariesDirectory)\catalog" --catalog "${env:CAT}.cdf" --preset-default --arch $(Arch) - makecat "${env:CAT}.cdf" - del "${env:CAT}.cdf" - if (-not (Test-Path "${env:CAT}.cat")) { - throw "Failed to build catalog file" - } - displayName: 'Generate catalog' - env: - CAT: $(Build.BinariesDirectory)\bin\$(Arch)\python - PYTHON_HEXVERSION: $(VersionHex) - -- task: PublishPipelineArtifact@0 - displayName: 'Publish binaries' - condition: and(succeeded(), not(and(eq(variables['Configuration'], 'Release'), variables['SigningCertificate']))) - inputs: - targetPath: '$(Build.BinariesDirectory)\bin\$(Arch)' - artifactName: bin_$(Name) - -- task: PublishPipelineArtifact@0 - displayName: 'Publish binaries for signing' - condition: and(succeeded(), and(eq(variables['Configuration'], 'Release'), variables['SigningCertificate'])) - inputs: - targetPath: '$(Build.BinariesDirectory)\bin\$(Arch)' - artifactName: unsigned_bin_$(Name) - -- task: CopyFiles@2 - displayName: 'Layout Artifact: symbols' - inputs: - sourceFolder: $(Build.BinariesDirectory)\bin\$(Arch) - targetFolder: $(Build.ArtifactStagingDirectory)\symbols\$(Name) - flatten: true - contents: | - **\*.pdb - -- task: PublishBuildArtifacts@1 - displayName: 'Publish Artifact: symbols' - inputs: - PathToPublish: '$(Build.ArtifactStagingDirectory)\symbols' - ArtifactName: symbols diff --git a/.azure-pipelines/windows-release/checkout.yml b/.azure-pipelines/windows-release/checkout.yml deleted file mode 100644 index d42d55fff08dda..00000000000000 --- a/.azure-pipelines/windows-release/checkout.yml +++ /dev/null @@ -1,21 +0,0 @@ -parameters: - depth: 3 - -steps: -- checkout: none - -- script: git clone --progress -v --depth ${{ parameters.depth }} --branch $(SourceTag) --single-branch https://github.com/$(GitRemote)/cpython.git . - displayName: 'git clone ($(GitRemote)/$(SourceTag))' - condition: and(succeeded(), and(variables['GitRemote'], variables['SourceTag'])) - -- script: git clone --progress -v --depth ${{ parameters.depth }} --branch $(SourceTag) --single-branch $(Build.Repository.Uri) . 
- displayName: 'git clone (/$(SourceTag))' - condition: and(succeeded(), and(not(variables['GitRemote']), variables['SourceTag'])) - -- script: git clone --progress -v --depth ${{ parameters.depth }} --branch $(Build.SourceBranchName) --single-branch https://github.com/$(GitRemote)/cpython.git . - displayName: 'git clone ($(GitRemote)/)' - condition: and(succeeded(), and(variables['GitRemote'], not(variables['SourceTag']))) - -- script: git clone --progress -v --depth ${{ parameters.depth }} --branch $(Build.SourceBranchName) --single-branch $(Build.Repository.Uri) . - displayName: 'git clone' - condition: and(succeeded(), and(not(variables['GitRemote']), not(variables['SourceTag']))) diff --git a/.azure-pipelines/windows-release/find-sdk.yml b/.azure-pipelines/windows-release/find-sdk.yml deleted file mode 100644 index e4de78555b3f66..00000000000000 --- a/.azure-pipelines/windows-release/find-sdk.yml +++ /dev/null @@ -1,17 +0,0 @@ -# Locate the Windows SDK and add its binaries directory to PATH -# -# `toolname` can be overridden to use a different marker file. - -parameters: - toolname: signtool.exe - -steps: - - powershell: | - $kitroot = (gp 'HKLM:\SOFTWARE\Microsoft\Windows Kits\Installed Roots\').KitsRoot10 - $tool = (gci -r "$kitroot\Bin\*\${{ parameters.toolname }}" | sort FullName -Desc | select -First 1) - if (-not $tool) { - throw "SDK is not available" - } - Write-Host "##vso[task.prependpath]$($tool.Directory)" - Write-Host "Adding $($tool.Directory) to PATH" - displayName: 'Add WinSDK tools to path' diff --git a/.azure-pipelines/windows-release/layout-command.yml b/.azure-pipelines/windows-release/layout-command.yml deleted file mode 100644 index 406ccd859faa6a..00000000000000 --- a/.azure-pipelines/windows-release/layout-command.yml +++ /dev/null @@ -1,23 +0,0 @@ -steps: -- task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: bin_$(HostArch)' - condition: and(succeeded(), variables['HostArch']) - inputs: - artifactName: bin_$(HostArch) - targetPath: $(Build.BinariesDirectory)\bin_$(HostArch) - -- powershell: > - Write-Host ( - '##vso[task.setvariable variable=LayoutCmd]& - "$(Python)" - "{1}\PC\layout" - -vv - --source "{1}" - --build "{0}\bin" - --arch "$(Name)" - --temp "{0}\layout-temp" - --include-cat "{0}\bin\python.cat" - --doc-build "{0}\doc"' - -f ("$(Build.BinariesDirectory)", "$(Build.SourcesDirectory)") - ) - displayName: 'Set LayoutCmd' diff --git a/.azure-pipelines/windows-release/mingw-lib.yml b/.azure-pipelines/windows-release/mingw-lib.yml deleted file mode 100644 index 30f7d34fa61d23..00000000000000 --- a/.azure-pipelines/windows-release/mingw-lib.yml +++ /dev/null @@ -1,13 +0,0 @@ -parameters: - DllToolOpt: -m i386:x86-64 - #DllToolOpt: -m i386 --as-flags=--32 - -steps: -- powershell: | - git clone https://github.com/python/cpython-bin-deps --branch binutils --single-branch --depth 1 --progress -v "binutils" - gci "bin\$(Arch)\python*.dll" | %{ - & "binutils\gendef.exe" $_ | Out-File -Encoding ascii tmp.def - & "binutils\dlltool.exe" --dllname $($_.BaseName).dll --def tmp.def --output-lib "$($_.Directory)\lib$($_.BaseName).a" ${{ parameters.DllToolOpt }} - } - displayName: 'Generate MinGW import library' - workingDirectory: $(Build.BinariesDirectory) diff --git a/.azure-pipelines/windows-release/msi-steps.yml b/.azure-pipelines/windows-release/msi-steps.yml deleted file mode 100644 index 79fc6f5ed292d4..00000000000000 --- a/.azure-pipelines/windows-release/msi-steps.yml +++ /dev/null @@ -1,181 +0,0 @@ -parameters: - ARM64TclTk: true - -steps: 
- - template: ./checkout.yml - - - powershell: | - $d = (.\PCbuild\build.bat -V) | %{ if($_ -match '\s+(\w+):\s*(.+)\s*$') { @{$Matches[1] = $Matches[2];} }}; - Write-Host "##vso[task.setvariable variable=SigningDescription]Python $($d.PythonVersion)" - displayName: 'Update signing description' - condition: and(succeeded(), not(variables['SigningDescription'])) - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: doc' - inputs: - artifactName: doc - targetPath: $(Build.BinariesDirectory)\doc - - - task: CopyFiles@2 - displayName: 'Merge documentation files' - inputs: - sourceFolder: $(Build.BinariesDirectory)\doc - targetFolder: $(Build.SourcesDirectory)\Doc\build - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: bin_win32' - inputs: - artifactName: bin_win32 - targetPath: $(Build.BinariesDirectory)\win32 - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: bin_win32_d' - inputs: - artifactName: bin_win32_d - targetPath: $(Build.BinariesDirectory)\win32 - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: bin_amd64' - inputs: - artifactName: bin_amd64 - targetPath: $(Build.BinariesDirectory)\amd64 - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: bin_amd64_d' - inputs: - artifactName: bin_amd64_d - targetPath: $(Build.BinariesDirectory)\amd64 - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: bin_arm64' - condition: and(succeeded(), eq(variables['PublishARM64'], 'true')) - inputs: - artifactName: bin_arm64 - targetPath: $(Build.BinariesDirectory)\arm64 - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: bin_arm64_d' - condition: and(succeeded(), eq(variables['PublishARM64'], 'true')) - inputs: - artifactName: bin_arm64_d - targetPath: $(Build.BinariesDirectory)\arm64 - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: tcltk_lib_win32' - inputs: - artifactName: tcltk_lib_win32 - targetPath: $(Build.BinariesDirectory)\tcltk_lib_win32 - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: tcltk_lib_amd64' - inputs: - artifactName: tcltk_lib_amd64 - targetPath: $(Build.BinariesDirectory)\tcltk_lib_amd64 - - - ${{ if eq(parameters.ARM64TclTk, true) }}: - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: tcltk_lib_arm64' - condition: and(succeeded(), eq(variables['PublishARM64'], 'true')) - inputs: - artifactName: tcltk_lib_arm64 - targetPath: $(Build.BinariesDirectory)\tcltk_lib_arm64 - - - powershell: | - copy $(Build.BinariesDirectory)\amd64\Activate.ps1 Lib\venv\scripts\common\Activate.ps1 -Force - displayName: 'Copy signed files into sources' - condition: and(succeeded(), variables['SigningCertificate']) - - - script: | - call Tools\msi\get_externals.bat - call PCbuild\find_python.bat - echo ##vso[task.setvariable variable=PYTHON]%PYTHON% - call PCbuild/find_msbuild.bat - echo ##vso[task.setvariable variable=MSBUILD]%MSBUILD% - displayName: 'Get external dependencies' - - - script: | - %PYTHON% -m pip install blurb - %PYTHON% -m blurb merge -f Misc\NEWS - displayName: 'Merge NEWS file' - - - script: | - %MSBUILD% Tools\msi\launcher\launcher.wixproj - displayName: 'Build launcher installer' - env: - Platform: x86 - Py_OutDir: $(Build.BinariesDirectory) - - - script: | - %MSBUILD% Tools\msi\bundle\releaselocal.wixproj /t:Rebuild /p:RebuildAll=true - displayName: 'Build win32 installer' - env: - Platform: x86 - Py_OutDir: $(Build.BinariesDirectory) - PYTHON: 
$(Build.BinariesDirectory)\win32\python.exe - PythonForBuild: $(Build.BinariesDirectory)\win32\python.exe - PYTHONHOME: $(Build.SourcesDirectory) - TclTkLibraryDir: $(Build.BinariesDirectory)\tcltk_lib_win32 - BuildForRelease: true - - - script: | - %MSBUILD% Tools\msi\bundle\releaselocal.wixproj /t:Rebuild /p:RebuildAll=true - displayName: 'Build amd64 installer' - env: - Platform: x64 - Py_OutDir: $(Build.BinariesDirectory) - PYTHON: $(Build.BinariesDirectory)\amd64\python.exe - PythonForBuild: $(Build.BinariesDirectory)\amd64\python.exe - PYTHONHOME: $(Build.SourcesDirectory) - TclTkLibraryDir: $(Build.BinariesDirectory)\tcltk_lib_amd64 - BuildForRelease: true - - - script: | - %MSBUILD% Tools\msi\bundle\releaselocal.wixproj /t:Rebuild /p:RebuildAll=true - displayName: 'Build arm64 installer' - condition: and(succeeded(), eq(variables['PublishARM64'], 'true')) - env: - Platform: ARM64 - Py_OutDir: $(Build.BinariesDirectory) - PYTHON: $(Build.BinariesDirectory)\win32\python.exe - PythonForBuild: $(Build.BinariesDirectory)\win32\python.exe - PYTHONHOME: $(Build.SourcesDirectory) - BuildForRelease: true - ${{ if eq(parameters.ARM64TclTk, true) }}: - TclTkLibraryDir: $(Build.BinariesDirectory)\tcltk_lib_arm64 - - - task: CopyFiles@2 - displayName: 'Assemble artifact: msi (win32)' - inputs: - sourceFolder: $(Build.BinariesDirectory)\win32\en-us - targetFolder: $(Build.ArtifactStagingDirectory)\msi\win32 - contents: | - *.msi - *.cab - *.exe - - - task: CopyFiles@2 - displayName: 'Assemble artifact: msi (amd64)' - inputs: - sourceFolder: $(Build.BinariesDirectory)\amd64\en-us - targetFolder: $(Build.ArtifactStagingDirectory)\msi\amd64 - contents: | - *.msi - *.cab - *.exe - - - task: CopyFiles@2 - displayName: 'Assemble artifact: msi (arm64)' - condition: and(succeeded(), eq(variables['PublishARM64'], 'true')) - inputs: - sourceFolder: $(Build.BinariesDirectory)\arm64\en-us - targetFolder: $(Build.ArtifactStagingDirectory)\msi\arm64 - contents: | - *.msi - *.cab - *.exe - - - task: PublishPipelineArtifact@0 - displayName: 'Publish MSI' - inputs: - targetPath: '$(Build.ArtifactStagingDirectory)\msi' - artifactName: msi diff --git a/.azure-pipelines/windows-release/stage-build.yml b/.azure-pipelines/windows-release/stage-build.yml deleted file mode 100644 index 26f43177504362..00000000000000 --- a/.azure-pipelines/windows-release/stage-build.yml +++ /dev/null @@ -1,193 +0,0 @@ -parameters: - ARM64TclTk: true - -jobs: -- job: Build_Docs - displayName: Docs build - pool: - vmImage: windows-2022 - - workspace: - clean: all - - steps: - - template: ./checkout.yml - - - script: Doc\make.bat html - displayName: 'Build HTML docs' - env: - BUILDDIR: $(Build.BinariesDirectory)\Doc - - - script: Doc\make.bat htmlhelp - displayName: 'Build CHM docs' - condition: and(succeeded(), eq(variables['DoCHM'], 'true')) - env: - BUILDDIR: $(Build.BinariesDirectory)\Doc - - - task: CopyFiles@2 - displayName: 'Assemble artifact: Doc' - inputs: - sourceFolder: $(Build.BinariesDirectory)\Doc - targetFolder: $(Build.ArtifactStagingDirectory)\Doc - contents: | - html\**\* - htmlhelp\*.chm - - - task: PublishPipelineArtifact@0 - displayName: 'Publish artifact: doc' - inputs: - targetPath: $(Build.ArtifactStagingDirectory)\Doc - artifactName: doc - - -- job: Build_Python - displayName: Python build - - pool: - vmImage: windows-2022 - - workspace: - clean: all - - strategy: - matrix: - win32: - Name: win32 - Arch: win32 - Platform: x86 - Configuration: Release - _HostPython: .\python - win32_d: - Name: win32_d - Arch: win32 
- Platform: x86 - Configuration: Debug - _HostPython: .\python - amd64_d: - Name: amd64_d - Arch: amd64 - Platform: x64 - Configuration: Debug - _HostPython: .\python - arm64: - Name: arm64 - Arch: arm64 - Platform: ARM64 - Configuration: Release - _HostPython: python - arm64_d: - Name: arm64_d - Arch: arm64 - Platform: ARM64 - Configuration: Debug - _HostPython: python - - steps: - - template: ./build-steps.yml - -- job: Build_Python_NonPGO - displayName: Python non-PGO build - condition: and(succeeded(), ne(variables['DoPGO'], 'true')) - - pool: - vmImage: windows-2022 - - workspace: - clean: all - - strategy: - matrix: - amd64: - Name: amd64 - Arch: amd64 - Platform: x64 - Configuration: Release - _HostPython: .\python - - steps: - - template: ./build-steps.yml - - -- job: Build_Python_PGO - displayName: Python PGO build - condition: and(succeeded(), eq(variables['DoPGO'], 'true')) - - # Allow up to five hours for PGO - timeoutInMinutes: 300 - - pool: - name: 'Windows Release' - - workspace: - clean: all - - strategy: - matrix: - amd64: - Name: amd64 - Arch: amd64 - Platform: x64 - Configuration: Release - _HostPython: .\python - - steps: - - template: ./build-steps.yml - parameters: - ShouldPGO: true - - -- job: TclTk_Lib - displayName: Publish Tcl/Tk Library - - pool: - vmImage: windows-2022 - - workspace: - clean: all - - steps: - - template: ./checkout.yml - - - script: PCbuild\get_externals.bat --no-openssl --no-libffi - displayName: 'Get external dependencies' - - - task: MSBuild@1 - displayName: 'Copy Tcl/Tk lib for publish' - inputs: - solution: PCbuild\tcltk.props - platform: x86 - msbuildArguments: /t:CopyTclTkLib /p:OutDir="$(Build.ArtifactStagingDirectory)\tcl_win32" - - - task: MSBuild@1 - displayName: 'Copy Tcl/Tk lib for publish' - inputs: - solution: PCbuild\tcltk.props - platform: x64 - msbuildArguments: /t:CopyTclTkLib /p:OutDir="$(Build.ArtifactStagingDirectory)\tcl_amd64" - - - ${{ if eq(parameters.ARM64TclTk, true) }}: - - task: MSBuild@1 - displayName: 'Copy Tcl/Tk lib for publish' - inputs: - solution: PCbuild\tcltk.props - platform: ARM64 - msbuildArguments: /t:CopyTclTkLib /p:OutDir="$(Build.ArtifactStagingDirectory)\tcl_arm64" - - - task: PublishPipelineArtifact@0 - displayName: 'Publish artifact: tcltk_lib_win32' - inputs: - targetPath: '$(Build.ArtifactStagingDirectory)\tcl_win32' - artifactName: tcltk_lib_win32 - - - task: PublishPipelineArtifact@0 - displayName: 'Publish artifact: tcltk_lib_amd64' - inputs: - targetPath: '$(Build.ArtifactStagingDirectory)\tcl_amd64' - artifactName: tcltk_lib_amd64 - - - ${{ if eq(parameters.ARM64TclTk, true) }}: - - task: PublishPipelineArtifact@0 - displayName: 'Publish artifact: tcltk_lib_arm64' - inputs: - targetPath: '$(Build.ArtifactStagingDirectory)\tcl_arm64' - artifactName: tcltk_lib_arm64 diff --git a/.azure-pipelines/windows-release/stage-layout-embed.yml b/.azure-pipelines/windows-release/stage-layout-embed.yml deleted file mode 100644 index c8b23d308d81e9..00000000000000 --- a/.azure-pipelines/windows-release/stage-layout-embed.yml +++ /dev/null @@ -1,61 +0,0 @@ -jobs: -- job: Make_Embed_Layout - displayName: Make embeddable layout - condition: and(succeeded(), eq(variables['DoEmbed'], 'true')) - - pool: - vmImage: windows-2022 - - workspace: - clean: all - - strategy: - matrix: - win32: - Name: win32 - Python: $(Build.BinariesDirectory)\bin\python.exe - PYTHONHOME: $(Build.SourcesDirectory) - amd64: - Name: amd64 - Python: $(Build.BinariesDirectory)\bin\python.exe - PYTHONHOME: $(Build.SourcesDirectory) - 
arm64: - Name: arm64 - HostArch: amd64 - Python: $(Build.BinariesDirectory)\bin_amd64\python.exe - PYTHONHOME: $(Build.SourcesDirectory) - - steps: - - template: ./checkout.yml - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: bin_$(Name)' - inputs: - artifactName: bin_$(Name) - targetPath: $(Build.BinariesDirectory)\bin - - - template: ./layout-command.yml - - - powershell: | - $d = (.\PCbuild\build.bat -V) | %{ if($_ -match '\s+(\w+):\s*(.+)\s*$') { @{$Matches[1] = $Matches[2];} }}; - Write-Host "##vso[task.setvariable variable=VersionText]$($d.PythonVersion)" - displayName: 'Extract version numbers' - - - powershell: > - $(LayoutCmd) - --copy "$(Build.ArtifactStagingDirectory)\layout" - --zip "$(Build.ArtifactStagingDirectory)\embed\python-$(VersionText)-embed-$(Name).zip" - --preset-embed - displayName: 'Generate embeddable layout' - - - task: PublishPipelineArtifact@0 - displayName: 'Publish Artifact: layout_embed_$(Name)' - inputs: - targetPath: '$(Build.ArtifactStagingDirectory)\layout' - artifactName: layout_embed_$(Name) - - - task: PublishBuildArtifacts@1 - displayName: 'Publish Artifact: embed' - inputs: - PathtoPublish: '$(Build.ArtifactStagingDirectory)\embed' - ArtifactName: embed diff --git a/.azure-pipelines/windows-release/stage-layout-full.yml b/.azure-pipelines/windows-release/stage-layout-full.yml deleted file mode 100644 index 343ee1f73c215b..00000000000000 --- a/.azure-pipelines/windows-release/stage-layout-full.yml +++ /dev/null @@ -1,80 +0,0 @@ -parameters: - ARM64TclTk: true - -jobs: -- job: Make_Layouts - displayName: Make layouts - condition: and(succeeded(), eq(variables['DoLayout'], 'true')) - - pool: - vmImage: windows-2022 - - workspace: - clean: all - - strategy: - matrix: - win32: - Name: win32 - Python: $(Build.BinariesDirectory)\bin\python.exe - PYTHONHOME: $(Build.SourcesDirectory) - TclLibrary: $(Build.BinariesDirectory)\tcltk_lib\tcl8 - amd64: - Name: amd64 - Python: $(Build.BinariesDirectory)\bin\python.exe - PYTHONHOME: $(Build.SourcesDirectory) - TclLibrary: $(Build.BinariesDirectory)\tcltk_lib\tcl8 - arm64: - Name: arm64 - HostArch: amd64 - Python: $(Build.BinariesDirectory)\bin_amd64\python.exe - PYTHONHOME: $(Build.SourcesDirectory) - ${{ if eq(parameters.ARM64TclTk, true) }}: - TclLibrary: $(Build.BinariesDirectory)\tcltk_lib\tcl8 - - steps: - - template: ./checkout.yml - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: bin_$(Name)' - inputs: - artifactName: bin_$(Name) - targetPath: $(Build.BinariesDirectory)\bin - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: bin_$(Name)_d' - inputs: - artifactName: bin_$(Name)_d - targetPath: $(Build.BinariesDirectory)\bin - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: doc' - inputs: - artifactName: doc - targetPath: $(Build.BinariesDirectory)\doc - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: tcltk_lib_$(Name)' - condition: and(succeeded(), variables['TclLibrary']) - inputs: - artifactName: tcltk_lib_$(Name) - targetPath: $(Build.BinariesDirectory)\tcltk_lib - - - powershell: | - copy "$(Build.BinariesDirectory)\bin\Activate.ps1" Lib\venv\scripts\common\Activate.ps1 -Force - displayName: 'Copy signed files into sources' - condition: and(succeeded(), variables['SigningCertificate']) - - - template: ./layout-command.yml - - - powershell: | - $(LayoutCmd) --copy "$(Build.ArtifactStagingDirectory)\layout" --preset-default - displayName: 'Generate full layout' - env: - TCL_LIBRARY: 
$(TclLibrary) - - - task: PublishPipelineArtifact@0 - displayName: 'Publish Artifact: layout_full_$(Name)' - inputs: - targetPath: '$(Build.ArtifactStagingDirectory)\layout' - artifactName: layout_full_$(Name) diff --git a/.azure-pipelines/windows-release/stage-layout-msix.yml b/.azure-pipelines/windows-release/stage-layout-msix.yml deleted file mode 100644 index a44e1ede20326c..00000000000000 --- a/.azure-pipelines/windows-release/stage-layout-msix.yml +++ /dev/null @@ -1,102 +0,0 @@ -parameters: - ARM64TclTk: true - -jobs: -- job: Make_MSIX_Layout - displayName: Make MSIX layout - - pool: - vmImage: windows-2022 - - workspace: - clean: all - - strategy: - matrix: - #win32: - # Name: win32 - # Python: $(Build.BinariesDirectory)\bin\python.exe - # PYTHONHOME: $(Build.SourcesDirectory) - # TclLibrary: $(Build.BinariesDirectory)\tcltk_lib\tcl8 - amd64: - Name: amd64 - Python: $(Build.BinariesDirectory)\bin\python.exe - PYTHONHOME: $(Build.SourcesDirectory) - TclLibrary: $(Build.BinariesDirectory)\tcltk_lib\tcl8 - arm64: - Name: arm64 - HostArch: amd64 - Python: $(Build.BinariesDirectory)\bin_amd64\python.exe - PYTHONHOME: $(Build.SourcesDirectory) - ${{ if eq(parameters.ARM64TclTk, true) }}: - TclLibrary: $(Build.BinariesDirectory)\tcltk_lib\tcl8 - - steps: - - template: ./checkout.yml - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: bin_$(Name)' - inputs: - artifactName: bin_$(Name) - targetPath: $(Build.BinariesDirectory)\bin - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: bin_$(Name)_d' - inputs: - artifactName: bin_$(Name)_d - targetPath: $(Build.BinariesDirectory)\bin - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: tcltk_lib_$(Name)' - condition: and(succeeded(), variables['TclLibrary']) - inputs: - artifactName: tcltk_lib_$(Name) - targetPath: $(Build.BinariesDirectory)\tcltk_lib - - - powershell: | - copy "$(Build.BinariesDirectory)\bin\Activate.ps1" Lib\venv\scripts\common\Activate.ps1 -Force - displayName: 'Copy signed files into sources' - condition: and(succeeded(), variables['SigningCertificate']) - - - template: ./layout-command.yml - - - powershell: | - Remove-Item "$(Build.ArtifactStagingDirectory)\appx-store" -Recurse -Force -EA 0 - $(LayoutCmd) --copy "$(Build.ArtifactStagingDirectory)\appx-store" --preset-appx --precompile - displayName: 'Generate store APPX layout' - env: - TCL_LIBRARY: $(TclLibrary) - - - task: PublishPipelineArtifact@0 - displayName: 'Publish Artifact: layout_appxstore_$(Name)' - inputs: - targetPath: '$(Build.ArtifactStagingDirectory)\appx-store' - artifactName: layout_appxstore_$(Name) - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: cert' - condition: and(succeeded(), variables['SigningCertificate']) - inputs: - artifactName: cert - targetPath: $(Build.BinariesDirectory)\cert - - - powershell: | - $info = (gc "$(Build.BinariesDirectory)\cert\certinfo.json" | ConvertFrom-JSON) - Write-Host "Side-loadable APPX must be signed with '$($info.Subject)'" - Write-Host "##vso[task.setvariable variable=APPX_DATA_PUBLISHER]$($info.Subject)" - Write-Host "##vso[task.setvariable variable=APPX_DATA_SHA256]$($info.SHA256)" - displayName: 'Override signing parameters' - condition: and(succeeded(), variables['SigningCertificate']) - - - powershell: | - Remove-Item "$(Build.ArtifactStagingDirectory)\appx" -Recurse -Force -EA 0 - $(LayoutCmd) --copy "$(Build.ArtifactStagingDirectory)\appx" --preset-appx --precompile --include-symbols --include-tests - displayName: 
'Generate sideloading APPX layout' - env: - TCL_LIBRARY: $(TclLibrary) - - - task: PublishPipelineArtifact@0 - displayName: 'Publish Artifact: layout_appx_$(Name)' - inputs: - targetPath: '$(Build.ArtifactStagingDirectory)\appx' - artifactName: layout_appx_$(Name) diff --git a/.azure-pipelines/windows-release/stage-layout-nuget.yml b/.azure-pipelines/windows-release/stage-layout-nuget.yml deleted file mode 100644 index b60a324dd90e3d..00000000000000 --- a/.azure-pipelines/windows-release/stage-layout-nuget.yml +++ /dev/null @@ -1,52 +0,0 @@ -jobs: -- job: Make_Nuget_Layout - displayName: Make Nuget layout - condition: and(succeeded(), eq(variables['DoNuget'], 'true')) - - pool: - vmImage: windows-2022 - - workspace: - clean: all - - strategy: - matrix: - win32: - Name: win32 - Python: $(Build.BinariesDirectory)\bin\python.exe - PYTHONHOME: $(Build.SourcesDirectory) - amd64: - Name: amd64 - Python: $(Build.BinariesDirectory)\bin\python.exe - PYTHONHOME: $(Build.SourcesDirectory) - arm64: - Name: arm64 - HostArch: amd64 - Python: $(Build.BinariesDirectory)\bin_amd64\python.exe - PYTHONHOME: $(Build.SourcesDirectory) - - steps: - - template: ./checkout.yml - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: bin_$(Name)' - inputs: - artifactName: bin_$(Name) - targetPath: $(Build.BinariesDirectory)\bin - - - powershell: | - copy $(Build.BinariesDirectory)\bin\Activate.ps1 Lib\venv\scripts\common\Activate.ps1 -Force - displayName: 'Copy signed files into sources' - condition: and(succeeded(), variables['SigningCertificate']) - - - template: ./layout-command.yml - - - powershell: | - $(LayoutCmd) --copy "$(Build.ArtifactStagingDirectory)\nuget" --preset-nuget - displayName: 'Generate nuget layout' - - - task: PublishPipelineArtifact@0 - displayName: 'Publish Artifact: layout_nuget_$(Name)' - inputs: - targetPath: '$(Build.ArtifactStagingDirectory)\nuget' - artifactName: layout_nuget_$(Name) diff --git a/.azure-pipelines/windows-release/stage-msi.yml b/.azure-pipelines/windows-release/stage-msi.yml deleted file mode 100644 index 0566544a6eb63e..00000000000000 --- a/.azure-pipelines/windows-release/stage-msi.yml +++ /dev/null @@ -1,43 +0,0 @@ -parameters: - ARM64TclTk: true - -jobs: -- job: Make_MSI - displayName: Make MSI - condition: and(succeeded(), not(variables['SigningCertificate'])) - - pool: - vmImage: windows-2022 - - variables: - ReleaseUri: http://www.python.org/{arch} - DownloadUrl: https://www.python.org/ftp/python/{version}/{arch}{releasename}/{msi} - Py_OutDir: $(Build.BinariesDirectory) - - workspace: - clean: all - - steps: - - template: msi-steps.yml - parameters: - ARM64TclTk: ${{ parameters.ARM64TclTk }} - -- job: Make_Signed_MSI - displayName: Make signed MSI - condition: and(succeeded(), variables['SigningCertificate']) - - pool: - name: 'Windows Release' - - variables: - ReleaseUri: http://www.python.org/{arch} - DownloadUrl: https://www.python.org/ftp/python/{version}/{arch}{releasename}/{msi} - Py_OutDir: $(Build.BinariesDirectory) - - workspace: - clean: all - - steps: - - template: msi-steps.yml - parameters: - ARM64TclTk: ${{ parameters.ARM64TclTk }} diff --git a/.azure-pipelines/windows-release/stage-pack-msix.yml b/.azure-pipelines/windows-release/stage-pack-msix.yml deleted file mode 100644 index 95988151a03db7..00000000000000 --- a/.azure-pipelines/windows-release/stage-pack-msix.yml +++ /dev/null @@ -1,148 +0,0 @@ -jobs: -- job: Pack_MSIX - displayName: Pack MSIX bundles - - pool: - vmImage: windows-2022 - - workspace: - clean: all - - 
strategy: - matrix: - amd64: - Name: amd64 - Artifact: appx - Suffix: - ShouldSign: true - amd64_store: - Name: amd64 - Artifact: appxstore - Suffix: -store - Upload: true - arm64: - Name: arm64 - Artifact: appx - Suffix: - ShouldSign: true - arm64_store: - Name: arm64 - Artifact: appxstore - Suffix: -store - Upload: true - - steps: - - template: ./checkout.yml - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: layout_$(Artifact)_$(Name)' - inputs: - artifactName: layout_$(Artifact)_$(Name) - targetPath: $(Build.BinariesDirectory)\layout - - - task: DownloadBuildArtifacts@0 - displayName: 'Download artifact: symbols' - inputs: - artifactName: symbols - downloadPath: $(Build.BinariesDirectory) - - - powershell: | - $d = (.\PCbuild\build.bat -V) | %{ if($_ -match '\s+(\w+):\s*(.+)\s*$') { @{$Matches[1] = $Matches[2];} }}; - Write-Host "##vso[task.setvariable variable=VersionText]$($d.PythonVersion)" - Write-Host "##vso[task.setvariable variable=VersionNumber]$($d.PythonVersionNumber)" - Write-Host "##vso[task.setvariable variable=VersionHex]$($d.PythonVersionHex)" - Write-Host "##vso[task.setvariable variable=VersionUnique]$($d.PythonVersionUnique)" - Write-Host "##vso[task.setvariable variable=Filename]python-$($d.PythonVersion)-$(Name)$(Suffix)" - displayName: 'Extract version numbers' - - - powershell: | - ./Tools/msi/make_appx.ps1 -layout "$(Build.BinariesDirectory)\layout" -msix "$(Build.ArtifactStagingDirectory)\msix\$(Filename).msix" - displayName: 'Build msix' - - - powershell: | - 7z a -tzip "$(Build.ArtifactStagingDirectory)\msix\$(Filename).appxsym" *.pdb - displayName: 'Build appxsym' - workingDirectory: $(Build.BinariesDirectory)\symbols\$(Name) - - - task: PublishBuildArtifacts@1 - displayName: 'Publish Artifact: MSIX' - condition: and(succeeded(), or(ne(variables['ShouldSign'], 'true'), not(variables['SigningCertificate']))) - inputs: - PathtoPublish: '$(Build.ArtifactStagingDirectory)\msix' - ArtifactName: msix - - - task: PublishBuildArtifacts@1 - displayName: 'Publish Artifact: MSIX' - condition: and(succeeded(), and(eq(variables['ShouldSign'], 'true'), variables['SigningCertificate'])) - inputs: - PathtoPublish: '$(Build.ArtifactStagingDirectory)\msix' - ArtifactName: unsigned_msix - - - powershell: | - 7z a -tzip "$(Build.ArtifactStagingDirectory)\msixupload\$(Filename).msixupload" * - displayName: 'Build msixupload' - condition: and(succeeded(), eq(variables['Upload'], 'true')) - workingDirectory: $(Build.ArtifactStagingDirectory)\msix - - - task: PublishBuildArtifacts@1 - displayName: 'Publish Artifact: MSIXUpload' - condition: and(succeeded(), eq(variables['Upload'], 'true')) - inputs: - PathtoPublish: '$(Build.ArtifactStagingDirectory)\msixupload' - ArtifactName: msixupload - - -- job: Sign_MSIX - displayName: Sign side-loadable MSIX bundles - dependsOn: - - Pack_MSIX - condition: and(succeeded(), variables['SigningCertificate']) - - pool: - name: 'Windows Release' - - workspace: - clean: all - - steps: - - template: ./checkout.yml - - template: ./find-sdk.yml - - - powershell: | - $d = (.\PCbuild\build.bat -V) | %{ if($_ -match '\s+(\w+):\s*(.+)\s*$') { @{$Matches[1] = $Matches[2];} }}; - Write-Host "##vso[task.setvariable variable=SigningDescription]Python $($d.PythonVersion)" - displayName: 'Update signing description' - condition: and(succeeded(), not(variables['SigningDescription'])) - - - task: DownloadBuildArtifacts@0 - displayName: 'Download Artifact: unsigned_msix' - inputs: - artifactName: unsigned_msix - downloadPath: 
$(Build.BinariesDirectory) - - # MSIX must be signed and timestamped simultaneously - # - # Getting "Error: SignerSign() failed." (-2147024885/0x8007000b)"? - # It may be that the certificate info collected in stage-sign.yml is wrong. Check that - # you do not have multiple matches for the certificate name you have specified. - - powershell: | - $failed = $true - foreach ($retry in 1..3) { - signtool sign /a /n "$(SigningCertificate)" /fd sha256 /tr http://timestamp.digicert.com/ /td sha256 /d "$(SigningDescription)" (gi *.msix) - if ($?) { - $failed = $false - break - } - sleep 1 - } - if ($failed) { - throw "Failed to sign MSIX" - } - displayName: 'Sign MSIX' - workingDirectory: $(Build.BinariesDirectory)\unsigned_msix - - - task: PublishBuildArtifacts@1 - displayName: 'Publish Artifact: MSIX' - inputs: - PathtoPublish: '$(Build.BinariesDirectory)\unsigned_msix' - ArtifactName: msix diff --git a/.azure-pipelines/windows-release/stage-pack-nuget.yml b/.azure-pipelines/windows-release/stage-pack-nuget.yml deleted file mode 100644 index 85b44e389ab5d1..00000000000000 --- a/.azure-pipelines/windows-release/stage-pack-nuget.yml +++ /dev/null @@ -1,66 +0,0 @@ -jobs: -- job: Pack_Nuget - displayName: Pack Nuget bundles - condition: and(succeeded(), eq(variables['DoNuget'], 'true')) - - pool: - name: 'Windows Release' - - workspace: - clean: all - - strategy: - matrix: - amd64: - Name: amd64 - win32: - Name: win32 - arm64: - Name: arm64 - - steps: - - checkout: none - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: layout_nuget_$(Name)' - inputs: - artifactName: layout_nuget_$(Name) - targetPath: $(Build.BinariesDirectory)\layout - - - task: NugetToolInstaller@0 - displayName: 'Install Nuget' - inputs: - versionSpec: '>=5.0' - - - powershell: > - nuget pack - "$(Build.BinariesDirectory)\layout\python.nuspec" - -OutputDirectory $(Build.ArtifactStagingDirectory) - -NoPackageAnalysis - -NonInteractive - condition: and(succeeded(), not(variables['OverrideNugetVersion'])) - displayName: 'Create nuget package' - - - powershell: > - nuget pack - "$(Build.BinariesDirectory)\layout\python.nuspec" - -OutputDirectory $(Build.ArtifactStagingDirectory) - -NoPackageAnalysis - -NonInteractive - -Version "$(OverrideNugetVersion)" - condition: and(succeeded(), variables['OverrideNugetVersion']) - displayName: 'Create nuget package' - - - powershell: | - gci *.nupkg | %{ - nuget sign "$_" -CertificateSubjectName "$(SigningCertificate)" -Timestamper http://timestamp.digicert.com/ -Overwrite - } - displayName: 'Sign nuget package' - workingDirectory: $(Build.ArtifactStagingDirectory) - condition: and(succeeded(), variables['SigningCertificate']) - - - task: PublishBuildArtifacts@1 - displayName: 'Publish Artifact: nuget' - inputs: - PathtoPublish: '$(Build.ArtifactStagingDirectory)' - ArtifactName: nuget diff --git a/.azure-pipelines/windows-release/stage-publish-nugetorg.yml b/.azure-pipelines/windows-release/stage-publish-nugetorg.yml deleted file mode 100644 index abb9d0f0fd485a..00000000000000 --- a/.azure-pipelines/windows-release/stage-publish-nugetorg.yml +++ /dev/null @@ -1,50 +0,0 @@ -parameters: - BuildToPublish: '' - -jobs: -- job: Publish_Nuget - displayName: Publish Nuget packages - condition: and(succeeded(), eq(variables['DoNuget'], 'true'), ne(variables['SkipNugetPublish'], 'true')) - - pool: - vmImage: windows-2022 - - workspace: - clean: all - - steps: - - checkout: none - - - ${{ if parameters.BuildToPublish }}: - - task: DownloadBuildArtifacts@0 - displayName: 'Download 
artifact from ${{ parameters.BuildToPublish }}' - inputs: - artifactName: nuget - downloadPath: $(Build.BinariesDirectory) - buildType: specific - project: $(System.TeamProject) - pipeline: $(Build.DefinitionName) - buildVersionToDownload: specific - buildId: ${{ parameters.BuildToPublish }} - - - ${{ else }}: - - task: DownloadBuildArtifacts@0 - displayName: 'Download artifact: nuget' - inputs: - artifactName: nuget - downloadPath: $(Build.BinariesDirectory) - - - - powershell: 'gci pythonarm*.nupkg | %{ Write-Host "Not publishing: $($_.Name)"; gi $_ } | del' - displayName: 'Prevent publishing ARM64 packages' - workingDirectory: '$(Build.BinariesDirectory)\nuget' - condition: and(succeeded(), ne(variables['PublishARM64'], 'true')) - - - task: NuGetCommand@2 - displayName: Push packages - condition: and(succeeded(), eq(variables['SigningCertificate'], variables['__RealSigningCertificate'])) - inputs: - command: push - packagesToPush: '$(Build.BinariesDirectory)\nuget\*.nupkg' - nuGetFeedType: external - publishFeedCredentials: 'Python on Nuget' diff --git a/.azure-pipelines/windows-release/stage-publish-pythonorg.yml b/.azure-pipelines/windows-release/stage-publish-pythonorg.yml deleted file mode 100644 index 084134e009902e..00000000000000 --- a/.azure-pipelines/windows-release/stage-publish-pythonorg.yml +++ /dev/null @@ -1,192 +0,0 @@ -parameters: - BuildToPublish: '' - -jobs: -- job: Publish_Python - displayName: Publish python.org packages - condition: and(succeeded(), eq(variables['DoMSI'], 'true'), eq(variables['DoEmbed'], 'true'), ne(variables['SkipPythonOrgPublish'], 'true')) - - pool: - #vmImage: windows-2022 - name: 'Windows Release' - - workspace: - clean: all - - steps: - - template: ./checkout.yml - - - task: UsePythonVersion@0 - displayName: 'Use Python 3.6 or later' - inputs: - versionSpec: '>=3.6' - - - ${{ if parameters.BuildToPublish }}: - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact from ${{ parameters.BuildToPublish }}: Doc' - inputs: - artifactName: Doc - targetPath: $(Build.BinariesDirectory)\Doc - buildType: specific - project: $(System.TeamProject) - pipeline: $(Build.DefinitionName) - buildVersionToDownload: specific - buildId: ${{ parameters.BuildToPublish }} - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact from ${{ parameters.BuildToPublish }}: msi' - inputs: - artifactName: msi - targetPath: $(Build.BinariesDirectory)\msi - buildType: specific - project: $(System.TeamProject) - pipeline: $(Build.DefinitionName) - buildVersionToDownload: specific - buildId: ${{ parameters.BuildToPublish }} - - # Note that embed is a 'build' artifact, not a 'pipeline' artifact - - task: DownloadBuildArtifacts@0 - displayName: 'Download artifact from ${{ parameters.BuildToPublish }}: embed' - inputs: - artifactName: embed - downloadPath: $(Build.BinariesDirectory) - buildType: specific - project: $(System.TeamProject) - pipeline: $(Build.DefinitionName) - buildVersionToDownload: specific - buildId: ${{ parameters.BuildToPublish }} - - - ${{ else }}: - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: Doc' - inputs: - artifactName: Doc - targetPath: $(Build.BinariesDirectory)\Doc - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: msi' - inputs: - artifactName: msi - targetPath: $(Build.BinariesDirectory)\msi - - # Note that embed is a 'build' artifact, not a 'pipeline' artifact - - task: DownloadBuildArtifacts@0 - displayName: 'Download artifact: embed' - inputs: - artifactName: embed - 
downloadPath: $(Build.BinariesDirectory) - - - # Note that ARM64 MSIs are skipped at build when this option is specified - - powershell: 'gci *embed-arm*.zip | %{ Write-Host "Not publishing: $($_.Name)"; gi $_ } | del' - displayName: 'Prevent publishing ARM64 packages' - workingDirectory: '$(Build.BinariesDirectory)\embed' - condition: and(succeeded(), ne(variables['PublishARM64'], 'true')) - - - - task: DownloadSecureFile@1 - name: gpgkey - inputs: - secureFile: 'python-signing.key' - displayName: 'Download GPG key' - - - powershell: | - git clone https://github.com/python/cpython-bin-deps --branch gpg --single-branch --depth 1 --progress -v "gpg" - gpg/gpg2.exe --import "$(gpgkey.secureFilePath)" - $files = gci -File "msi\*\*", "embed\*.zip" - if ("$(DoCHM)" -ieq "true") { - $files = $files + (gci -File "doc\htmlhelp\*.chm") - } - $files.FullName | %{ - gpg/gpg2.exe -ba --batch --passphrase $(GPGPassphrase) $_ - "Made signature for $_" - } - displayName: 'Generate GPG signatures' - workingDirectory: $(Build.BinariesDirectory) - - - powershell: | - $p = gps "gpg-agent" -EA 0 - if ($p) { $p.Kill() } - displayName: 'Kill GPG agent' - condition: true - - - - powershell: > - $(Build.SourcesDirectory)\Tools\msi\uploadrelease.ps1 - -build msi - -user $(PyDotOrgUsername) - -server $(PyDotOrgServer) - -doc_htmlhelp doc\htmlhelp - -embed embed - -skippurge - -skiptest - -skiphash - condition: and(succeeded(), eq(variables['SigningCertificate'], variables['__RealSigningCertificate'])) - workingDirectory: $(Build.BinariesDirectory) - displayName: 'Upload files to python.org' - - - powershell: > - python - "$(Build.SourcesDirectory)\Tools\msi\purge.py" - (gci msi\*\python-*.exe | %{ $_.Name -replace 'python-(.+?)(-|\.exe).+', '$1' } | select -First 1) - workingDirectory: $(Build.BinariesDirectory) - condition: and(succeeded(), eq(variables['SigningCertificate'], variables['__RealSigningCertificate'])) - displayName: 'Purge CDN' - - - powershell: | - $failures = 0 - gci "msi\*\*.exe" -File | %{ - $d = mkdir "tests\$($_.BaseName)" -Force - gci $d -r -File | del - $ic = copy $_ $d -PassThru - "Checking layout for $($ic.Name)" - Start-Process -wait $ic "/passive", "/layout", "$d\layout", "/log", "$d\log\install.log" - if (-not $?) 
{ - Write-Error "Failed to validate layout of $($inst.Name)" - $failures += 1 - } - } - if ($failures) { - Write-Error "Failed to validate $failures installers" - exit 1 - } - condition: and(succeeded(), eq(variables['SigningCertificate'], variables['__RealSigningCertificate'])) - workingDirectory: $(Build.BinariesDirectory) - displayName: 'Test layouts' - - - powershell: | - $files = gci -File "msi\*\*.exe", "embed\*.zip" - if ("$(DoCHM)" -ieq "true") { - $files = $files + (gci -File "doc\htmlhelp\python*.chm") - } - $hashes = $files | ` - Sort-Object Name | ` - Format-Table Name, @{ - Label="MD5"; - Expression={(Get-FileHash $_ -Algorithm MD5).Hash} - }, Length -AutoSize | ` - Out-String -Width 4096 - $d = mkdir "$(Build.ArtifactStagingDirectory)\hashes" -Force - $hashes | Out-File "$d\hashes.txt" -Encoding ascii - $hashes - workingDirectory: $(Build.BinariesDirectory) - displayName: 'Generate hashes' - - - powershell: | - "Copying:" - $files = gci -File "msi\*\python*.asc", "embed\*.asc" - if ("$(DoCHM)" -ieq "true") { - $files = $files + (gci -File "doc\htmlhelp\*.asc") - } - $files.FullName - $d = mkdir "$(Build.ArtifactStagingDirectory)\hashes" -Force - move $files $d -Force - gci msi -Directory | %{ move "msi\$_\*.asc" (mkdir "$d\$_" -Force) } - workingDirectory: $(Build.BinariesDirectory) - displayName: 'Copy GPG signatures for build' - - - task: PublishPipelineArtifact@0 - displayName: 'Publish Artifact: hashes' - inputs: - targetPath: '$(Build.ArtifactStagingDirectory)\hashes' - artifactName: hashes diff --git a/.azure-pipelines/windows-release/stage-publish-store.yml b/.azure-pipelines/windows-release/stage-publish-store.yml deleted file mode 100644 index 0eae21edaa7f8e..00000000000000 --- a/.azure-pipelines/windows-release/stage-publish-store.yml +++ /dev/null @@ -1,38 +0,0 @@ -parameters: - BuildToPublish: '' - -jobs: -- job: Publish_Store - displayName: Publish Store packages - condition: and(succeeded(), eq(variables['DoMSIX'], 'true'), ne(variables['SkipMSIXPublish'], 'true')) - - pool: - vmImage: windows-2022 - - workspace: - clean: all - - steps: - - checkout: none - - - ${{ if parameters.BuildToPublish }}: - - task: DownloadBuildArtifacts@0 - displayName: 'Download artifact: msixupload' - inputs: - artifactName: msixupload - downloadPath: $(Build.BinariesDirectory) - buildType: specific - project: cpython - pipeline: Windows-Release - buildVersionToDownload: specific - buildId: ${{ parameters.BuildToPublish }} - - - ${{ else }}: - - task: DownloadBuildArtifacts@0 - displayName: 'Download artifact: msixupload' - inputs: - artifactName: msixupload - downloadPath: $(Build.BinariesDirectory) - - # TODO: eq(variables['SigningCertificate'], variables['__RealSigningCertificate']) - # If we are not real-signed, DO NOT PUBLISH diff --git a/.azure-pipelines/windows-release/stage-sign.yml b/.azure-pipelines/windows-release/stage-sign.yml deleted file mode 100644 index 4481aa86edc2cc..00000000000000 --- a/.azure-pipelines/windows-release/stage-sign.yml +++ /dev/null @@ -1,130 +0,0 @@ -parameters: - Include: '*.exe, *.dll, *.pyd, *.cat, *.ps1' - Exclude: 'vcruntime*, libffi*, libcrypto*, libssl*' - -jobs: -- job: Sign_Python - displayName: Sign Python binaries - condition: and(succeeded(), variables['SigningCertificate']) - - pool: - name: 'Windows Release' - - workspace: - clean: all - - strategy: - matrix: - win32: - Name: win32 - amd64: - Name: amd64 - arm64: - Name: arm64 - - steps: - - template: ./checkout.yml - - template: ./find-sdk.yml - - - powershell: | - $d = 
(.\PCbuild\build.bat -V) | %{ if($_ -match '\s+(\w+):\s*(.+)\s*$') { @{$Matches[1] = $Matches[2];} }}; - Write-Host "##vso[task.setvariable variable=SigningDescription]Python $($d.PythonVersion)" - displayName: 'Update signing description' - condition: and(succeeded(), not(variables['SigningDescription'])) - - - powershell: | - Write-Host "##vso[build.addbuildtag]signed" - displayName: 'Add build tags' - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: unsigned_bin_$(Name)' - inputs: - artifactName: unsigned_bin_$(Name) - targetPath: $(Build.BinariesDirectory)\bin - - - powershell: | - copy "$(Build.SourcesDirectory)\Lib\venv\scripts\common\Activate.ps1" . - displayName: 'Copy files from source' - workingDirectory: $(Build.BinariesDirectory)\bin - - - powershell: | - $files = (gi ${{ parameters.Include }} -Exclude ${{ parameters.Exclude }}) - signtool sign /a /n "$(SigningCertificate)" /fd sha256 /d "$(SigningDescription)" $files - displayName: 'Sign binaries' - workingDirectory: $(Build.BinariesDirectory)\bin - - - powershell: | - $files = (gi ${{ parameters.Include }} -Exclude ${{ parameters.Exclude }}) - $failed = $true - foreach ($retry in 1..10) { - signtool timestamp /tr http://timestamp.digicert.com/ /td sha256 $files - if ($?) { - $failed = $false - break - } - sleep 5 - } - if ($failed) { - Write-Host "##vso[task.logissue type=error]Failed to timestamp files" - } - displayName: 'Timestamp binaries' - workingDirectory: $(Build.BinariesDirectory)\bin - continueOnError: true - - - task: PublishPipelineArtifact@0 - displayName: 'Publish artifact: bin_$(Name)' - inputs: - targetPath: '$(Build.BinariesDirectory)\bin' - artifactName: bin_$(Name) - - -- job: Dump_CertInfo - displayName: Capture certificate info - condition: and(succeeded(), variables['SigningCertificate']) - - pool: - name: 'Windows Release' - - steps: - - checkout: none - - - powershell: | - $m = 'CN=$(SigningCertificate)' - $c = ((gci Cert:\CurrentUser\My), (gci Cert:\LocalMachine\My)) | %{ $_ } | ` - ?{ $_.Subject -match $m -and $_.NotBefore -lt (Get-Date) -and $_.NotAfter -gt (Get-Date) } | ` - select -First 1 - if (-not $c) { - Write-Host "Failed to find certificate for $(SigningCertificate)" - exit - } - $d = mkdir "$(Build.BinariesDirectory)\tmp" -Force - $cf = "$d\cert.cer" - [IO.File]::WriteAllBytes($cf, $c.Export("Cer")) - $csha = (certutil -dump $cf | sls "Cert Hash\(sha256\): (.+)").Matches.Groups[1].Value - - $info = @{ Subject=$c.Subject; SHA256=$csha; } - - $d = mkdir "$(Build.BinariesDirectory)\cert" -Force - $info | ConvertTo-JSON -Compress | Out-File -Encoding utf8 "$d\certinfo.json" - displayName: "Extract certificate info" - - - task: PublishPipelineArtifact@0 - displayName: 'Publish artifact: cert' - inputs: - targetPath: '$(Build.BinariesDirectory)\cert' - artifactName: cert - - -- job: Mark_Unsigned - displayName: Tag unsigned build - condition: and(succeeded(), not(variables['SigningCertificate'])) - - pool: - vmImage: windows-2022 - - steps: - - checkout: none - - - powershell: | - Write-Host "##vso[build.addbuildtag]unsigned" - displayName: 'Add build tag' diff --git a/.azure-pipelines/windows-release/stage-test-embed.yml b/.azure-pipelines/windows-release/stage-test-embed.yml deleted file mode 100644 index 252db959930f2f..00000000000000 --- a/.azure-pipelines/windows-release/stage-test-embed.yml +++ /dev/null @@ -1,41 +0,0 @@ -jobs: -- job: Test_Embed - displayName: Test Embed - condition: and(succeeded(), eq(variables['DoEmbed'], 'true')) - - pool: - vmImage: windows-2022 
- - workspace: - clean: all - - strategy: - matrix: - win32: - Name: win32 - amd64: - Name: amd64 - - steps: - - checkout: none - - - task: DownloadBuildArtifacts@0 - displayName: 'Download artifact: embed' - inputs: - artifactName: embed - downloadPath: $(Build.BinariesDirectory) - - - powershell: | - $p = gi "$(Build.BinariesDirectory)\embed\python*embed-$(Name).zip" - Expand-Archive -Path $p -DestinationPath "$(Build.BinariesDirectory)\Python" - $p = gi "$(Build.BinariesDirectory)\Python\python.exe" - Write-Host "##vso[task.prependpath]$(Split-Path -Parent $p)" - displayName: 'Install Python and add to PATH' - - - script: | - python -c "import sys; print(sys.version)" - displayName: 'Collect version number' - - - script: | - python -m site - displayName: 'Collect site' diff --git a/.azure-pipelines/windows-release/stage-test-msi.yml b/.azure-pipelines/windows-release/stage-test-msi.yml deleted file mode 100644 index a471d05bc6a4a2..00000000000000 --- a/.azure-pipelines/windows-release/stage-test-msi.yml +++ /dev/null @@ -1,108 +0,0 @@ -jobs: -- job: Test_MSI - displayName: Test MSI - - pool: - vmImage: windows-2022 - - workspace: - clean: all - - strategy: - matrix: - win32_User: - ExeMatch: 'python-[\dabrc.]+\.exe' - Logs: $(Build.ArtifactStagingDirectory)\logs\win32_User - InstallAllUsers: 0 - win32_Machine: - ExeMatch: 'python-[\dabrc.]+\.exe' - Logs: $(Build.ArtifactStagingDirectory)\logs\win32_Machine - InstallAllUsers: 1 - amd64_User: - ExeMatch: 'python-[\dabrc.]+-amd64\.exe' - Logs: $(Build.ArtifactStagingDirectory)\logs\amd64_User - InstallAllUsers: 0 - amd64_Machine: - ExeMatch: 'python-[\dabrc.]+-amd64\.exe' - Logs: $(Build.ArtifactStagingDirectory)\logs\amd64_Machine - InstallAllUsers: 1 - - steps: - - checkout: none - - - task: DownloadPipelineArtifact@1 - displayName: 'Download artifact: msi' - inputs: - artifactName: msi - targetPath: $(Build.BinariesDirectory)\msi - - - powershell: | - $p = (gci -r *.exe | ?{ $_.Name -match '$(ExeMatch)' } | select -First 1) - Write-Host "##vso[task.setvariable variable=SetupExe]$($p.FullName)" - Write-Host "##vso[task.setvariable variable=SetupExeName]$($p.Name)" - displayName: 'Find installer executable' - workingDirectory: $(Build.BinariesDirectory)\msi - - - script: > - "$(SetupExe)" - /passive - /log "$(Logs)\install\log.txt" - TargetDir="$(Build.BinariesDirectory)\Python" - Include_debug=1 - Include_symbols=1 - InstallAllUsers=$(InstallAllUsers) - displayName: 'Install Python' - - - powershell: | - $p = gi "$(Build.BinariesDirectory)\Python\python.exe" - Write-Host "##vso[task.prependpath]$(Split-Path -Parent $p)" - displayName: 'Add test Python to PATH' - - - script: | - python -c "import sys; print(sys.version)" - displayName: 'Collect version number' - - - script: | - python -m site - displayName: 'Collect site' - - - powershell: | - gci -r "${env:PROGRAMDATA}\Microsoft\Windows\Start Menu\Programs\Python*" - displayName: 'Capture per-machine Start Menu items' - - powershell: | - gci -r "${env:APPDATA}\Microsoft\Windows\Start Menu\Programs\Python*" - displayName: 'Capture per-user Start Menu items' - - - powershell: | - gci -r "HKLM:\Software\WOW6432Node\Python" - displayName: 'Capture per-machine 32-bit registry' - - powershell: | - gci -r "HKLM:\Software\Python" - displayName: 'Capture per-machine native registry' - - powershell: | - gci -r "HKCU:\Software\Python" - displayName: 'Capture current-user registry' - - - script: | - python -m pip install "azure<0.10" - python -m pip uninstall -y azure python-dateutil six - 
displayName: 'Test (un)install package' - - - script: | - python -m test -uall -v test_ttk_guionly test_tk test_idle - displayName: 'Test Tkinter and Idle' - - - script: > - "$(SetupExe)" - /passive - /uninstall - /log "$(Logs)\uninstall\log.txt" - displayName: 'Uninstall Python' - - - task: PublishBuildArtifacts@1 - displayName: 'Publish Artifact: logs' - condition: true - continueOnError: true - inputs: - PathtoPublish: '$(Build.ArtifactStagingDirectory)\logs' - ArtifactName: msi_testlogs diff --git a/.azure-pipelines/windows-release/stage-test-nuget.yml b/.azure-pipelines/windows-release/stage-test-nuget.yml deleted file mode 100644 index c500baf29b4571..00000000000000 --- a/.azure-pipelines/windows-release/stage-test-nuget.yml +++ /dev/null @@ -1,58 +0,0 @@ -jobs: -- job: Test_Nuget - displayName: Test Nuget - condition: and(succeeded(), eq(variables['DoNuget'], 'true')) - - pool: - vmImage: windows-2022 - - workspace: - clean: all - - strategy: - matrix: - win32: - Package: pythonx86 - amd64: - Package: python - - steps: - - checkout: none - - - task: DownloadBuildArtifacts@0 - displayName: 'Download artifact: nuget' - inputs: - artifactName: nuget - downloadPath: $(Build.BinariesDirectory) - - - task: NugetToolInstaller@0 - inputs: - versionSpec: '>= 5' - - - powershell: > - nuget install - $(Package) - -Source "$(Build.BinariesDirectory)\nuget" - -OutputDirectory "$(Build.BinariesDirectory)\install" - -Prerelease - -ExcludeVersion - -NonInteractive - displayName: 'Install Python' - - - powershell: | - $p = gi "$(Build.BinariesDirectory)\install\$(Package)\tools\python.exe" - Write-Host "##vso[task.prependpath]$(Split-Path -Parent $p)" - displayName: 'Add test Python to PATH' - - - script: | - python -c "import sys; print(sys.version)" - displayName: 'Collect version number' - - - script: | - python -m site - displayName: 'Collect site' - - - script: | - python -m pip install "azure<0.10" - python -m pip uninstall -y azure python-dateutil six - displayName: 'Test (un)install package' diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS index 0aea3983fa600d..013e1cbd7241d5 100644 --- a/.github/CODEOWNERS +++ b/.github/CODEOWNERS @@ -53,6 +53,7 @@ Python/pythonrun.c @iritkatriel /Lib/html/ @ezio-melotti /Lib/_markupbase.py @ezio-melotti /Lib/test/test_html*.py @ezio-melotti +/Tools/scripts/*html5* @ezio-melotti # Import (including importlib). # Ignoring importlib.h so as to not get flagged on @@ -60,6 +61,8 @@ Python/pythonrun.c @iritkatriel # bytecode. 
**/*import*.c @brettcannon @encukou @ericsnowcurrently @ncoghlan @warsaw **/*import*.py @brettcannon @encukou @ericsnowcurrently @ncoghlan @warsaw +**/importlib/resources/* @jaraco @warsaw @brettcannon +**/importlib/metadata/* @jaraco @warsaw # Dates and times **/*datetime* @pganssle @abalkin @@ -95,7 +98,7 @@ Lib/ast.py @isidentical # Mock /Lib/unittest/mock.py @cjw296 -/Lib/unittest/test/testmock/* @cjw296 +/Lib/test/test_unittest/testmock/* @cjw296 # SQLite 3 **/*sqlite* @berkerpeksag @erlend-aasland diff --git a/.github/workflows/build.yml b/.github/workflows/build.yml index e04633b711f2f0..d800442ad07e36 100644 --- a/.github/workflows/build.yml +++ b/.github/workflows/build.yml @@ -22,6 +22,9 @@ on: - '3.8' - '3.7' +permissions: + contents: read + jobs: check_source: name: 'Check for source changes' diff --git a/.github/workflows/build_msi.yml b/.github/workflows/build_msi.yml index ec18735e9b9fa6..6044ae0f7c29b4 100644 --- a/.github/workflows/build_msi.yml +++ b/.github/workflows/build_msi.yml @@ -23,6 +23,9 @@ on: paths: - 'Tools/msi/**' +permissions: + contents: read + jobs: build_win32: name: 'Windows (x86) Installer' diff --git a/.github/workflows/doc.yml b/.github/workflows/doc.yml index 0d1b85d84746a9..e06f21671b5a5a 100644 --- a/.github/workflows/doc.yml +++ b/.github/workflows/doc.yml @@ -23,6 +23,10 @@ on: paths: - 'Doc/**' - 'Misc/**' + - '.github/workflows/doc.yml' + +permissions: + contents: read jobs: build_doc: @@ -32,6 +36,38 @@ jobs: - uses: actions/checkout@v3 - name: Register Sphinx problem matcher run: echo "::add-matcher::.github/problem-matchers/sphinx.json" + - name: 'Set up Python' + uses: actions/setup-python@v4 + with: + python-version: '3' + cache: 'pip' + cache-dependency-path: 'Doc/requirements.txt' + - name: 'Install build dependencies' + run: make -C Doc/ venv + - name: 'Check documentation' + run: make -C Doc/ check + - name: 'Build HTML documentation' + run: make -C Doc/ SPHINXOPTS="-q" SPHINXERRORHANDLING="-W --keep-going" html + - name: 'Upload' + uses: actions/upload-artifact@v3 + with: + name: doc-html + path: Doc/build/html + + # Run "doctest" on HEAD as new syntax doesn't exist in the latest stable release + doctest: + name: 'Doctest' + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v3 + - name: Register Sphinx problem matcher + run: echo "::add-matcher::.github/problem-matchers/sphinx.json" + - uses: actions/cache@v3 + with: + path: ~/.cache/pip + key: ubuntu-doc-${{ hashFiles('Doc/requirements.txt') }} + restore-keys: | + ubuntu-doc- - name: 'Install Dependencies' run: sudo ./.github/workflows/posix-deps-apt.sh && sudo apt-get install wamerican - name: 'Configure CPython' @@ -40,17 +76,6 @@ jobs: run: make -j4 - name: 'Install build dependencies' run: make -C Doc/ PYTHON=../python venv - # Run "check doctest html" as 3 steps to get a more readable output - # in the web UI - - name: 'Check documentation' - run: make -C Doc/ PYTHON=../python SPHINXOPTS="-q -W --keep-going -j4" check # Use "xvfb-run" since some doctest tests open GUI windows - name: 'Run documentation doctest' - run: xvfb-run make -C Doc/ PYTHON=../python SPHINXOPTS="-q -W --keep-going -j4" doctest - - name: 'Build HTML documentation' - run: make -C Doc/ PYTHON=../python SPHINXOPTS="-q -W --keep-going -j4" html - - name: 'Upload' - uses: actions/upload-artifact@v3 - with: - name: doc-html - path: Doc/build/html + run: xvfb-run make -C Doc/ PYTHON=../python SPHINXOPTS="-q" SPHINXERRORHANDLING="-W --keep-going" doctest diff --git 
a/.github/workflows/new-bugs-announce-notifier.yml b/.github/workflows/new-bugs-announce-notifier.yml index 8cd834419f00bf..b2b63472d83421 100644 --- a/.github/workflows/new-bugs-announce-notifier.yml +++ b/.github/workflows/new-bugs-announce-notifier.yml @@ -5,6 +5,9 @@ on: types: - opened +permissions: + issues: read + jobs: notify-new-bugs-announce: runs-on: ubuntu-latest @@ -39,7 +42,7 @@ jobs: assignee : issue.data.assignees.map(assignee => { return assignee.login }), body : issue.data.body }; - + const data = { from: "CPython Issues ", to: "new-bugs-announce@python.org", diff --git a/.github/workflows/regen-abidump.sh b/.github/workflows/regen-abidump.sh new file mode 100644 index 00000000000000..251bb3857ecfcb --- /dev/null +++ b/.github/workflows/regen-abidump.sh @@ -0,0 +1,8 @@ +set -ex + +export DEBIAN_FRONTEND=noninteractive +./.github/workflows/posix-deps-apt.sh +apt-get install -yq abigail-tools python3 +export CFLAGS="-g3 -O0" +./configure --enable-shared && make +make regen-abidump diff --git a/.github/workflows/verify-ensurepip-wheels.yml b/.github/workflows/verify-ensurepip-wheels.yml new file mode 100644 index 00000000000000..61e3d1adf534d5 --- /dev/null +++ b/.github/workflows/verify-ensurepip-wheels.yml @@ -0,0 +1,28 @@ +name: Verify bundled pip and setuptools + +on: + workflow_dispatch: + push: + paths: + - 'Lib/ensurepip/_bundled/**' + - '.github/workflows/verify-ensurepip-wheels.yml' + - 'Tools/scripts/verify_ensurepip_wheels.py' + pull_request: + paths: + - 'Lib/ensurepip/_bundled/**' + - '.github/workflows/verify-ensurepip-wheels.yml' + - 'Tools/scripts/verify_ensurepip_wheels.py' + +permissions: + contents: read + +jobs: + verify: + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v3 + - uses: actions/setup-python@v4 + with: + python-version: '3' + - name: Compare checksums of bundled pip and setuptools to ones published on PyPI + run: ./Tools/scripts/verify_ensurepip_wheels.py diff --git a/.gitignore b/.gitignore index b3b22f471c2765..0ddfd717d737f8 100644 --- a/.gitignore +++ b/.gitignore @@ -150,3 +150,6 @@ Python/frozen_modules/MANIFEST # Ignore ./python binary on Unix but still look into ./Python/ directory. /python !/Python/ + +# main branch only: ABI files are not checked/maintained +Doc/data/python*.abi diff --git a/Doc/Makefile b/Doc/Makefile index 3a3417bf99af3b..5b6a95813abee5 100644 --- a/Doc/Makefile +++ b/Doc/Makefile @@ -18,7 +18,7 @@ SPHINXERRORHANDLING = -W PAPEROPT_a4 = -D latex_elements.papersize=a4paper PAPEROPT_letter = -D latex_elements.papersize=letterpaper -ALLSPHINXOPTS = -b $(BUILDER) -d build/doctrees $(PAPEROPT_$(PAPER)) \ +ALLSPHINXOPTS = -b $(BUILDER) -d build/doctrees $(PAPEROPT_$(PAPER)) -j auto \ $(SPHINXOPTS) $(SPHINXERRORHANDLING) . build/$(BUILDER) $(SOURCES) .PHONY: help build html htmlhelp latex text texinfo changes linkcheck \ @@ -213,8 +213,10 @@ dist: rm dist/python-$(DISTVERSION)-docs-texinfo.tar check: - $(SPHINXLINT) -i tools -i $(VENVDIR) -i README.rst - $(SPHINXLINT) ../Misc/NEWS.d/next/ + # Check the docs and NEWS files with sphinx-lint. + # Ignore the tools and venv dirs and check that the default role is not used. 
+ $(SPHINXLINT) -i tools -i $(VENVDIR) --enable default-role + $(SPHINXLINT) --enable default-role ../Misc/NEWS.d/next/ serve: @echo "The serve target was removed, use htmlview instead (see bpo-36329)" diff --git a/Doc/c-api/bytes.rst b/Doc/c-api/bytes.rst index 7617487a462d37..d62962cab45f6b 100644 --- a/Doc/c-api/bytes.rst +++ b/Doc/c-api/bytes.rst @@ -58,9 +58,6 @@ called with a non-bytes parameter. .. % XXX: This should be exactly the same as the table in PyErr_Format. .. % One should just refer to the other. - .. % XXX: The descriptions for %zd and %zu are wrong, but the truth is complicated - .. % because not all compilers support the %z width modifier -- we fake it - .. % when necessary via interpolating PY_FORMAT_SIZE_T. .. tabularcolumns:: |l|l|L| diff --git a/Doc/c-api/call.rst b/Doc/c-api/call.rst index cdf72bc1f4714a..13ef8b217a1622 100644 --- a/Doc/c-api/call.rst +++ b/Doc/c-api/call.rst @@ -144,8 +144,6 @@ Vectorcall Support API However, the function ``PyVectorcall_NARGS`` should be used to allow for future extensions. - This function is not part of the :ref:`limited API `. - .. versionadded:: 3.8 .. c:function:: vectorcallfunc PyVectorcall_Function(PyObject *op) @@ -158,8 +156,6 @@ Vectorcall Support API This is mostly useful to check whether or not *op* supports vectorcall, which can be done by checking ``PyVectorcall_Function(op) != NULL``. - This function is not part of the :ref:`limited API `. - .. versionadded:: 3.8 .. c:function:: PyObject* PyVectorcall_Call(PyObject *callable, PyObject *tuple, PyObject *dict) @@ -172,8 +168,6 @@ Vectorcall Support API It does not check the :const:`Py_TPFLAGS_HAVE_VECTORCALL` flag and it does not fall back to ``tp_call``. - This function is not part of the :ref:`limited API `. - .. versionadded:: 3.8 @@ -256,8 +250,6 @@ please see individual documentation for details. Return the result of the call on success, or raise an exception and return *NULL* on failure. - This function is not part of the :ref:`limited API `. - .. versionadded:: 3.9 @@ -343,8 +335,6 @@ please see individual documentation for details. Return the result of the call on success, or raise an exception and return *NULL* on failure. - This function is not part of the :ref:`limited API `. - .. versionadded:: 3.9 @@ -357,8 +347,6 @@ please see individual documentation for details. Return the result of the call on success, or raise an exception and return *NULL* on failure. - This function is not part of the :ref:`limited API `. - .. versionadded:: 3.9 @@ -372,8 +360,6 @@ please see individual documentation for details. Return the result of the call on success, or raise an exception and return *NULL* on failure. - This function is not part of the :ref:`limited API `. - .. versionadded:: 3.9 .. c:function:: PyObject* PyObject_VectorcallDict(PyObject *callable, PyObject *const *args, size_t nargsf, PyObject *kwdict) @@ -388,8 +374,6 @@ please see individual documentation for details. already has a dictionary ready to use for the keyword arguments, but not a tuple for the positional arguments. - This function is not part of the :ref:`limited API `. - .. versionadded:: 3.9 .. c:function:: PyObject* PyObject_VectorcallMethod(PyObject *name, PyObject *const *args, size_t nargsf, PyObject *kwnames) @@ -410,8 +394,6 @@ please see individual documentation for details. Return the result of the call on success, or raise an exception and return *NULL* on failure. - This function is not part of the :ref:`limited API `. - .. 
versionadded:: 3.9 diff --git a/Doc/c-api/init.rst b/Doc/c-api/init.rst index d4954958f855f5..038498f325ceb3 100644 --- a/Doc/c-api/init.rst +++ b/Doc/c-api/init.rst @@ -83,52 +83,93 @@ to 1 and ``-bb`` sets :c:data:`Py_BytesWarningFlag` to 2. .. c:var:: int Py_BytesWarningFlag + This API is kept for backward compatibility: setting + :c:member:`PyConfig.bytes_warning` should be used instead, see :ref:`Python + Initialization Configuration `. + Issue a warning when comparing :class:`bytes` or :class:`bytearray` with :class:`str` or :class:`bytes` with :class:`int`. Issue an error if greater or equal to ``2``. Set by the :option:`-b` option. + .. deprecated:: 3.12 + .. c:var:: int Py_DebugFlag + This API is kept for backward compatibility: setting + :c:member:`PyConfig.parser_debug` should be used instead, see :ref:`Python + Initialization Configuration `. + Turn on parser debugging output (for expert only, depending on compilation options). Set by the :option:`-d` option and the :envvar:`PYTHONDEBUG` environment variable. + .. deprecated:: 3.12 + .. c:var:: int Py_DontWriteBytecodeFlag + This API is kept for backward compatibility: setting + :c:member:`PyConfig.write_bytecode` should be used instead, see :ref:`Python + Initialization Configuration `. + If set to non-zero, Python won't try to write ``.pyc`` files on the import of source modules. Set by the :option:`-B` option and the :envvar:`PYTHONDONTWRITEBYTECODE` environment variable. + .. deprecated:: 3.12 + .. c:var:: int Py_FrozenFlag + This API is kept for backward compatibility: setting + :c:member:`PyConfig.pathconfig_warnings` should be used instead, see + :ref:`Python Initialization Configuration `. + Suppress error messages when calculating the module search path in :c:func:`Py_GetPath`. Private flag used by ``_freeze_module`` and ``frozenmain`` programs. + .. deprecated:: 3.12 + .. c:var:: int Py_HashRandomizationFlag + This API is kept for backward compatibility: setting + :c:member:`PyConfig.hash_seed` and :c:member:`PyConfig.use_hash_seed` should + be used instead, see :ref:`Python Initialization Configuration + `. + Set to ``1`` if the :envvar:`PYTHONHASHSEED` environment variable is set to a non-empty string. If the flag is non-zero, read the :envvar:`PYTHONHASHSEED` environment variable to initialize the secret hash seed. + .. deprecated:: 3.12 + .. c:var:: int Py_IgnoreEnvironmentFlag + This API is kept for backward compatibility: setting + :c:member:`PyConfig.use_environment` should be used instead, see + :ref:`Python Initialization Configuration `. + Ignore all :envvar:`PYTHON*` environment variables, e.g. :envvar:`PYTHONPATH` and :envvar:`PYTHONHOME`, that might be set. Set by the :option:`-E` and :option:`-I` options. + .. deprecated:: 3.12 + .. c:var:: int Py_InspectFlag + This API is kept for backward compatibility: setting + :c:member:`PyConfig.inspect` should be used instead, see + :ref:`Python Initialization Configuration `. + When a script is passed as first argument or the :option:`-c` option is used, enter interactive mode after executing the script or the command, even when :data:`sys.stdin` does not appear to be a terminal. @@ -136,12 +177,24 @@ to 1 and ``-bb`` sets :c:data:`Py_BytesWarningFlag` to 2. Set by the :option:`-i` option and the :envvar:`PYTHONINSPECT` environment variable. + .. deprecated:: 3.12 + .. c:var:: int Py_InteractiveFlag + This API is kept for backward compatibility: setting + :c:member:`PyConfig.interactive` should be used instead, see + :ref:`Python Initialization Configuration `. 
+ Set by the :option:`-i` option. + .. deprecated:: 3.12 + .. c:var:: int Py_IsolatedFlag + This API is kept for backward compatibility: setting + :c:member:`PyConfig.isolated` should be used instead, see + :ref:`Python Initialization Configuration `. + Run Python in isolated mode. In isolated mode :data:`sys.path` contains neither the script's directory nor the user's site-packages directory. @@ -149,8 +202,14 @@ to 1 and ``-bb`` sets :c:data:`Py_BytesWarningFlag` to 2. .. versionadded:: 3.4 + .. deprecated:: 3.12 + .. c:var:: int Py_LegacyWindowsFSEncodingFlag + This API is kept for backward compatibility: setting + :c:member:`PyPreConfig.legacy_windows_fs_encoding` should be used instead, see + :ref:`Python Initialization Configuration `. + If the flag is non-zero, use the ``mbcs`` encoding with ``replace`` error handler, instead of the UTF-8 encoding with ``surrogatepass`` error handler, for the :term:`filesystem encoding and error handler`. @@ -162,8 +221,14 @@ to 1 and ``-bb`` sets :c:data:`Py_BytesWarningFlag` to 2. .. availability:: Windows. + .. deprecated:: 3.12 + .. c:var:: int Py_LegacyWindowsStdioFlag + This API is kept for backward compatibility: setting + :c:member:`PyConfig.legacy_windows_stdio` should be used instead, see + :ref:`Python Initialization Configuration `. + If the flag is non-zero, use :class:`io.FileIO` instead of :class:`WindowsConsoleIO` for :mod:`sys` standard streams. @@ -174,8 +239,14 @@ to 1 and ``-bb`` sets :c:data:`Py_BytesWarningFlag` to 2. .. availability:: Windows. + .. deprecated:: 3.12 + .. c:var:: int Py_NoSiteFlag + This API is kept for backward compatibility: setting + :c:member:`PyConfig.site_import` should be used instead, see + :ref:`Python Initialization Configuration `. + Disable the import of the module :mod:`site` and the site-dependent manipulations of :data:`sys.path` that it entails. Also disable these manipulations if :mod:`site` is explicitly imported later (call @@ -183,36 +254,66 @@ to 1 and ``-bb`` sets :c:data:`Py_BytesWarningFlag` to 2. Set by the :option:`-S` option. + .. deprecated:: 3.12 + .. c:var:: int Py_NoUserSiteDirectory + This API is kept for backward compatibility: setting + :c:member:`PyConfig.user_site_directory` should be used instead, see + :ref:`Python Initialization Configuration `. + Don't add the :data:`user site-packages directory ` to :data:`sys.path`. Set by the :option:`-s` and :option:`-I` options, and the :envvar:`PYTHONNOUSERSITE` environment variable. + .. deprecated:: 3.12 + .. c:var:: int Py_OptimizeFlag + This API is kept for backward compatibility: setting + :c:member:`PyConfig.optimization_level` should be used instead, see + :ref:`Python Initialization Configuration `. + Set by the :option:`-O` option and the :envvar:`PYTHONOPTIMIZE` environment variable. + .. deprecated:: 3.12 + .. c:var:: int Py_QuietFlag + This API is kept for backward compatibility: setting + :c:member:`PyConfig.quiet` should be used instead, see :ref:`Python + Initialization Configuration `. + Don't display the copyright and version messages even in interactive mode. Set by the :option:`-q` option. .. versionadded:: 3.2 + .. deprecated:: 3.12 + .. c:var:: int Py_UnbufferedStdioFlag + This API is kept for backward compatibility: setting + :c:member:`PyConfig.buffered_stdio` should be used instead, see :ref:`Python + Initialization Configuration `. + Force the stdout and stderr streams to be unbuffered. Set by the :option:`-u` option and the :envvar:`PYTHONUNBUFFERED` environment variable. + .. deprecated:: 3.12 + .. 
c:var:: int Py_VerboseFlag + This API is kept for backward compatibility: setting + :c:member:`PyConfig.verbose` should be used instead, see :ref:`Python + Initialization Configuration `. + Print a message each time a module is initialized, showing the place (filename or built-in module) from which it is loaded. If greater or equal to ``2``, print a message for each file that is checked for when @@ -221,6 +322,8 @@ to 1 and ``-bb`` sets :c:data:`Py_BytesWarningFlag` to 2. Set by the :option:`-v` option and the :envvar:`PYTHONVERBOSE` environment variable. + .. deprecated:: 3.12 + Initializing and finalizing the interpreter =========================================== @@ -253,6 +356,9 @@ Initializing and finalizing the interpreter (without calling :c:func:`Py_FinalizeEx` first). There is no return value; it is a fatal error if the initialization fails. + Use the :c:func:`Py_InitializeFromConfig` function to customize the + :ref:`Python Initialization Configuration `. + .. note:: On Windows, changes the console mode from ``O_TEXT`` to ``O_BINARY``, which will also affect non-Python uses of the console using the C Runtime. @@ -264,6 +370,9 @@ Initializing and finalizing the interpreter *initsigs* is ``0``, it skips initialization registration of signal handlers, which might be useful when Python is embedded. + Use the :c:func:`Py_InitializeFromConfig` function to customize the + :ref:`Python Initialization Configuration `. + .. c:function:: int Py_IsInitialized() diff --git a/Doc/c-api/init_config.rst b/Doc/c-api/init_config.rst index cc223a7fa4b940..2074ec4e0e8ea5 100644 --- a/Doc/c-api/init_config.rst +++ b/Doc/c-api/init_config.rst @@ -735,9 +735,8 @@ PyConfig * ``"utf-8"`` if :c:member:`PyPreConfig.utf8_mode` is non-zero. * ``"ascii"`` if Python detects that ``nl_langinfo(CODESET)`` announces - the ASCII encoding (or Roman8 encoding on HP-UX), whereas the - ``mbstowcs()`` function decodes from a different encoding (usually - Latin1). + the ASCII encoding, whereas the ``mbstowcs()`` function + decodes from a different encoding (usually Latin1). * ``"utf-8"`` if ``nl_langinfo(CODESET)`` returns an empty string. * Otherwise, use the :term:`locale encoding`: ``nl_langinfo(CODESET)`` result. @@ -835,8 +834,10 @@ PyConfig * Set :c:member:`~PyConfig.safe_path` to ``1``: don't prepend a potentially unsafe path to :data:`sys.path` at Python - startup. - * Set :c:member:`~PyConfig.use_environment` to ``0``. + startup, such as the current directory, the script's directory or an + empty string. + * Set :c:member:`~PyConfig.use_environment` to ``0``: ignore ``PYTHON`` + environment variables. * Set :c:member:`~PyConfig.user_site_directory` to ``0``: don't add the user site directory to :data:`sys.path`. * Python REPL doesn't import :mod:`readline` nor enable default readline @@ -846,7 +847,8 @@ PyConfig Default: ``0`` in Python mode, ``1`` in isolated mode. - See also :c:member:`PyPreConfig.isolated`. + See also the :ref:`Isolated Configuration ` and + :c:member:`PyPreConfig.isolated`. .. c:member:: int legacy_windows_stdio @@ -983,6 +985,9 @@ PyConfig Incremented by the :option:`-d` command line option. Set to the :envvar:`PYTHONDEBUG` environment variable value. + Need a :ref:`debug build of Python ` (the ``Py_DEBUG`` macro + must be defined). + Default: ``0``. .. c:member:: int pathconfig_warnings @@ -1177,13 +1182,13 @@ PyConfig imported, showing the place (filename or built-in module) from which it is loaded. 
- If greater or equal to ``2``, print a message for each file that is checked - for when searching for a module. Also provides information on module - cleanup at exit. + If greater than or equal to ``2``, print a message for each file that is + checked for when searching for a module. Also provides information on + module cleanup at exit. Incremented by the :option:`-v` command line option. - Set to the :envvar:`PYTHONVERBOSE` environment variable value. + Set by the :envvar:`PYTHONVERBOSE` environment variable value. Default: ``0``. @@ -1289,7 +1294,11 @@ Example setting the program name:: } More complete example modifying the default configuration, read the -configuration, and then override some parameters:: +configuration, and then override some parameters. Note that since +3.11, many parameters are not calculated until initialization, and +so values cannot be read from the configuration structure. Any values +set before initialize is called will be left unchanged by +initialization:: PyStatus init_python(const char *program_name) { @@ -1314,7 +1323,15 @@ configuration, and then override some parameters:: goto done; } - /* Append our custom search path to sys.path */ + /* Specify sys.path explicitly */ + /* If you want to modify the default set of paths, finish + initialization first and then use PySys_GetObject("path") */ + config.module_search_paths_set = 1; + status = PyWideStringList_Append(&config.module_search_paths, + L"/path/to/stdlib"); + if (PyStatus_Exception(status)) { + goto done; + } status = PyWideStringList_Append(&config.module_search_paths, L"/path/to/more/modules"); if (PyStatus_Exception(status)) { @@ -1417,8 +1434,8 @@ It is possible to completely ignore the function calculating the default path configuration by setting explicitly all path configuration output fields listed above. A string is considered as set even if it is non-empty. ``module_search_paths`` is considered as set if -``module_search_paths_set`` is set to ``1``. In this case, path -configuration input fields are ignored as well. +``module_search_paths_set`` is set to ``1``. In this case, +``module_search_paths`` will be used without modification. Set :c:member:`~PyConfig.pathconfig_warnings` to ``0`` to suppress warnings when calculating the path configuration (Unix only, Windows does not log any warning). diff --git a/Doc/c-api/refcounting.rst b/Doc/c-api/refcounting.rst index 391907c8c2976a..1cff4e7215cf49 100644 --- a/Doc/c-api/refcounting.rst +++ b/Doc/c-api/refcounting.rst @@ -109,13 +109,13 @@ objects. It is a good idea to use this macro whenever decrementing the reference count of an object that might be traversed during garbage collection. +.. c:function:: void Py_IncRef(PyObject *o) -The following functions are for runtime dynamic embedding of Python: -``Py_IncRef(PyObject *o)``, ``Py_DecRef(PyObject *o)``. They are -simply exported function versions of :c:func:`Py_XINCREF` and -:c:func:`Py_XDECREF`, respectively. + Increment the reference count for object *o*. A function version of :c:func:`Py_XINCREF`. + It can be used for runtime dynamic embedding of Python. -The following functions or macros are only for use within the interpreter core: -:c:func:`_Py_Dealloc`, :c:func:`_Py_ForgetReference`, :c:func:`_Py_NewReference`, -as well as the global variable :c:data:`_Py_RefTotal`. +.. c:function:: void Py_DecRef(PyObject *o) + + Decrement the reference count for object *o*. A function version of :c:func:`Py_XDECREF`. + It can be used for runtime dynamic embedding of Python. 
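A rough way to see ``Py_IncRef`` and ``Py_DecRef`` in action without writing an
extension module is to call the exported symbols through :mod:`ctypes` (an
illustrative sketch only, assuming a standard CPython interpreter where
``ctypes.pythonapi`` is available)::

    import ctypes
    import sys

    # Py_IncRef and Py_DecRef are exported symbols, so they can be reached
    # through ctypes.pythonapi; both take a PyObject* and return void.
    ctypes.pythonapi.Py_IncRef.argtypes = [ctypes.py_object]
    ctypes.pythonapi.Py_IncRef.restype = None
    ctypes.pythonapi.Py_DecRef.argtypes = [ctypes.py_object]
    ctypes.pythonapi.Py_DecRef.restype = None

    obj = object()
    before = sys.getrefcount(obj)

    ctypes.pythonapi.Py_IncRef(obj)      # behaves like Py_XINCREF(obj)
    assert sys.getrefcount(obj) == before + 1

    ctypes.pythonapi.Py_DecRef(obj)      # behaves like Py_XDECREF(obj)
    assert sys.getrefcount(obj) == before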
diff --git a/Doc/c-api/structures.rst b/Doc/c-api/structures.rst index ff5ecf24072c1a..86d4536f8d28e7 100644 --- a/Doc/c-api/structures.rst +++ b/Doc/c-api/structures.rst @@ -321,8 +321,6 @@ There are these calling conventions: or possibly ``NULL`` if there are no keywords. The values of the keyword arguments are stored in the *args* array, after the positional arguments. - This is not part of the :ref:`limited API `. - .. versionadded:: 3.7 diff --git a/Doc/c-api/sys.rst b/Doc/c-api/sys.rst index 7b714678444be1..32c2bc8a0880ea 100644 --- a/Doc/c-api/sys.rst +++ b/Doc/c-api/sys.rst @@ -21,10 +21,12 @@ Operating System Utilities Return true (nonzero) if the standard I/O file *fp* with name *filename* is deemed interactive. This is the case for files for which ``isatty(fileno(fp))`` - is true. If the global flag :c:data:`Py_InteractiveFlag` is true, this function + is true. If the :c:member:`PyConfig.interactive` is non-zero, this function also returns true if the *filename* pointer is ``NULL`` or if the name is equal to one of the strings ``''`` or ``'???'``. + This function must not be called before Python is initialized. + .. c:function:: void PyOS_BeforeFork() diff --git a/Doc/c-api/type.rst b/Doc/c-api/type.rst index d740e4eb0897e5..aa77c285e3b829 100644 --- a/Doc/c-api/type.rst +++ b/Doc/c-api/type.rst @@ -190,10 +190,16 @@ Creating Heap-Allocated Types The following functions and structs are used to create :ref:`heap types `. -.. c:function:: PyObject* PyType_FromModuleAndSpec(PyObject *module, PyType_Spec *spec, PyObject *bases) +.. c:function:: PyObject* PyType_FromMetaclass(PyTypeObject *metaclass, PyObject *module, PyType_Spec *spec, PyObject *bases) + + Create and return a :ref:`heap type ` from the *spec* + (see :const:`Py_TPFLAGS_HEAPTYPE`). - Creates and returns a :ref:`heap type ` from the *spec* - (:const:`Py_TPFLAGS_HEAPTYPE`). + The metaclass *metaclass* is used to construct the resulting type object. + When *metaclass* is ``NULL``, the metaclass is derived from *bases* + (or *Py_tp_base[s]* slots if *bases* is ``NULL``, see below). + Note that metaclasses that override + :c:member:`~PyTypeObject.tp_new` are not supported. The *bases* argument can be used to specify base classes; it can either be only one class or a tuple of classes. @@ -210,6 +216,25 @@ The following functions and structs are used to create This function calls :c:func:`PyType_Ready` on the new type. + Note that this function does *not* fully match the behavior of + calling :py:class:`type() ` or using the :keyword:`class` statement. + With user-provided base types or metaclasses, prefer + :ref:`calling ` :py:class:`type` (or the metaclass) + over ``PyType_From*`` functions. + Specifically: + + * :py:meth:`~object.__new__` is not called on the new class + (and it must be set to ``type.__new__``). + * :py:meth:`~object.__init__` is not called on the new class. + * :py:meth:`~object.__init_subclass__` is not called on any bases. + * :py:meth:`~object.__set_name__` is not called on new descriptors. + + .. versionadded:: 3.12 + +.. c:function:: PyObject* PyType_FromModuleAndSpec(PyObject *module, PyType_Spec *spec, PyObject *bases) + + Equivalent to ``PyType_FromMetaclass(NULL, module, spec, bases)``. + .. versionadded:: 3.9 .. versionchanged:: 3.10 @@ -217,15 +242,32 @@ The following functions and structs are used to create The function now accepts a single class as the *bases* argument and ``NULL`` as the ``tp_doc`` slot. + .. 
versionchanged:: 3.12 + + The function now finds and uses a metaclass corresponding to the provided + base classes. Previously, only :class:`type` instances were returned. + + .. c:function:: PyObject* PyType_FromSpecWithBases(PyType_Spec *spec, PyObject *bases) - Equivalent to ``PyType_FromModuleAndSpec(NULL, spec, bases)``. + Equivalent to ``PyType_FromMetaclass(NULL, NULL, spec, bases)``. .. versionadded:: 3.3 + .. versionchanged:: 3.12 + + The function now finds and uses a metaclass corresponding to the provided + base classes. Previously, only :class:`type` instances were returned. + .. c:function:: PyObject* PyType_FromSpec(PyType_Spec *spec) - Equivalent to ``PyType_FromSpecWithBases(spec, NULL)``. + Equivalent to ``PyType_FromMetaclass(NULL, NULL, spec, NULL)``. + + .. versionchanged:: 3.12 + + The function now finds and uses a metaclass corresponding to the + base classes provided in *Py_tp_base[s]* slots. + Previously, only :class:`type` instances were returned. .. c:type:: PyType_Spec @@ -254,6 +296,8 @@ The following functions and structs are used to create Array of :c:type:`PyType_Slot` structures. Terminated by the special slot value ``{0, NULL}``. + Each slot ID should be specified at most once. + .. c:type:: PyType_Slot Structure defining optional functionality of a type, containing a slot ID diff --git a/Doc/c-api/typeobj.rst b/Doc/c-api/typeobj.rst index b3f371bb9c0623..7927e1eb129e99 100644 --- a/Doc/c-api/typeobj.rst +++ b/Doc/c-api/typeobj.rst @@ -727,12 +727,6 @@ and :c:type:`PyType_Type` effectively act as defaults.) When a user sets :attr:`__call__` in Python code, only *tp_call* is updated, likely making it inconsistent with the vectorcall function. - .. note:: - - The semantics of the ``tp_vectorcall_offset`` slot are provisional and - expected to be finalized in Python 3.9. - If you use vectorcall, plan for updating your code for Python 3.9. - .. versionchanged:: 3.8 Before version 3.8, this slot was named ``tp_print``. @@ -2071,7 +2065,7 @@ flag set. This is done by filling a :c:type:`PyType_Spec` structure and calling :c:func:`PyType_FromSpec`, :c:func:`PyType_FromSpecWithBases`, -or :c:func:`PyType_FromModuleAndSpec`. +:c:func:`PyType_FromModuleAndSpec`, or :c:func:`PyType_FromMetaclass`. .. _number-structs: diff --git a/Doc/c-api/unicode.rst b/Doc/c-api/unicode.rst index 9dccbf1dc2298a..5d420bfa93cb27 100644 --- a/Doc/c-api/unicode.rst +++ b/Doc/c-api/unicode.rst @@ -397,10 +397,6 @@ APIs: ASCII-encoded string. The following format characters are allowed: .. % This should be exactly the same as the table in PyErr_Format. - .. % The descriptions for %zd and %zu are wrong, but the truth is complicated - .. % because not all compilers support the %z width modifier -- we fake it - .. % when necessary via interpolating PY_FORMAT_SIZE_T. - .. % Similar comments apply to the %ll width modifier and .. tabularcolumns:: |l|l|L| @@ -645,8 +641,7 @@ system. cannot contain embedded null characters. Use :c:func:`PyUnicode_DecodeFSDefaultAndSize` to decode a string from - :c:data:`Py_FileSystemDefaultEncoding` (the locale encoding read at - Python startup). + the :term:`filesystem encoding and error handler`. This function ignores the :ref:`Python UTF-8 Mode `. @@ -680,9 +675,8 @@ system. *errors* is ``NULL``. Return a :class:`bytes` object. *unicode* cannot contain embedded null characters. - Use :c:func:`PyUnicode_EncodeFSDefault` to encode a string to - :c:data:`Py_FileSystemDefaultEncoding` (the locale encoding read at - Python startup). 
+ Use :c:func:`PyUnicode_EncodeFSDefault` to encode a string to the + :term:`filesystem encoding and error handler`. This function ignores the :ref:`Python UTF-8 Mode `. @@ -703,12 +697,12 @@ system. File System Encoding """""""""""""""""""" -To encode and decode file names and other environment strings, -:c:data:`Py_FileSystemDefaultEncoding` should be used as the encoding, and -:c:data:`Py_FileSystemDefaultEncodeErrors` should be used as the error handler -(:pep:`383` and :pep:`529`). To encode file names to :class:`bytes` during -argument parsing, the ``"O&"`` converter should be used, passing -:c:func:`PyUnicode_FSConverter` as the conversion function: +Functions encoding to and decoding from the :term:`filesystem encoding and +error handler` (:pep:`383` and :pep:`529`). + +To encode file names to :class:`bytes` during argument parsing, the ``"O&"`` +converter should be used, passing :c:func:`PyUnicode_FSConverter` as the +conversion function: .. c:function:: int PyUnicode_FSConverter(PyObject* obj, void* result) @@ -745,12 +739,7 @@ conversion function: Decode a string from the :term:`filesystem encoding and error handler`. - If :c:data:`Py_FileSystemDefaultEncoding` is not set, fall back to the - locale encoding. - - :c:data:`Py_FileSystemDefaultEncoding` is initialized at startup from the - locale encoding and cannot be modified later. If you need to decode a string - from the current locale encoding, use + If you need to decode a string from the current locale encoding, use :c:func:`PyUnicode_DecodeLocaleAndSize`. .. seealso:: @@ -758,7 +747,8 @@ conversion function: The :c:func:`Py_DecodeLocale` function. .. versionchanged:: 3.6 - Use :c:data:`Py_FileSystemDefaultEncodeErrors` error handler. + The :term:`filesystem error handler ` is now used. .. c:function:: PyObject* PyUnicode_DecodeFSDefault(const char *s) @@ -766,28 +756,22 @@ conversion function: Decode a null-terminated string from the :term:`filesystem encoding and error handler`. - If :c:data:`Py_FileSystemDefaultEncoding` is not set, fall back to the - locale encoding. - - Use :c:func:`PyUnicode_DecodeFSDefaultAndSize` if you know the string length. + If the string length is known, use + :c:func:`PyUnicode_DecodeFSDefaultAndSize`. .. versionchanged:: 3.6 - Use :c:data:`Py_FileSystemDefaultEncodeErrors` error handler. + The :term:`filesystem error handler ` is now used. .. c:function:: PyObject* PyUnicode_EncodeFSDefault(PyObject *unicode) - Encode a Unicode object to :c:data:`Py_FileSystemDefaultEncoding` with the - :c:data:`Py_FileSystemDefaultEncodeErrors` error handler, and return - :class:`bytes`. Note that the resulting :class:`bytes` object may contain - null bytes. - - If :c:data:`Py_FileSystemDefaultEncoding` is not set, fall back to the - locale encoding. + Encode a Unicode object to the :term:`filesystem encoding and error + handler`, and return :class:`bytes`. Note that the resulting :class:`bytes` + object can contain null bytes. - :c:data:`Py_FileSystemDefaultEncoding` is initialized at startup from the - locale encoding and cannot be modified later. If you need to encode a string - to the current locale encoding, use :c:func:`PyUnicode_EncodeLocale`. + If you need to encode a string to the current locale encoding, use + :c:func:`PyUnicode_EncodeLocale`. .. seealso:: @@ -796,7 +780,8 @@ conversion function: .. versionadded:: 3.2 .. versionchanged:: 3.6 - Use :c:data:`Py_FileSystemDefaultEncodeErrors` error handler. + The :term:`filesystem error handler ` is now used. 
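At the Python level, :func:`os.fsencode` and :func:`os.fsdecode` apply the same
:term:`filesystem encoding and error handler`, which makes them handy for
checking what the C functions above will do (an illustrative sketch, assuming
nothing beyond the standard library)::

    import os
    import sys

    # Non-ASCII on purpose; assumes the usual UTF-8 filesystem encoding.
    name = "café.txt"
    encoded = os.fsencode(name)     # bytes, analogous to PyUnicode_EncodeFSDefault
    decoded = os.fsdecode(encoded)  # str, analogous to PyUnicode_DecodeFSDefault

    assert decoded == name
    print(sys.getfilesystemencoding(), sys.getfilesystemencodeerrors())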
wchar_t Support """"""""""""""" @@ -861,10 +846,7 @@ constructor. Setting encoding to ``NULL`` causes the default encoding to be used which is UTF-8. The file system calls should use :c:func:`PyUnicode_FSConverter` for encoding file names. This uses the -variable :c:data:`Py_FileSystemDefaultEncoding` internally. This -variable should be treated as read-only: on some systems, it will be a -pointer to a static string, on others, it will change at run-time -(such as when the application invokes setlocale). +:term:`filesystem encoding and error handler` internally. Error handling is set by errors which may also be set to ``NULL`` meaning to use the default handling defined for the codec. Default error handling for all diff --git a/Doc/conf.py b/Doc/conf.py index e539da539e6551..0c7deb59d130dc 100644 --- a/Doc/conf.py +++ b/Doc/conf.py @@ -45,7 +45,7 @@ highlight_language = 'python3' # Minimum version of sphinx required -needs_sphinx = '1.8' +needs_sphinx = '3.2' # Ignore any .rst files in the venv/ directory. exclude_patterns = ['venv/*', 'README.rst'] @@ -237,5 +237,3 @@ # bpo-40204: Disable warnings on Sphinx 2 syntax of the C domain since the # documentation is built with -W (warnings treated as errors). c_warn_on_allowed_pre_v3 = False - -strip_signature_backslash = True diff --git a/Doc/data/stable_abi.dat b/Doc/data/stable_abi.dat index 3912a7c1242de0..82cd5796efd27d 100644 --- a/Doc/data/stable_abi.dat +++ b/Doc/data/stable_abi.dat @@ -653,6 +653,7 @@ function,PyTuple_Size,3.2,, var,PyTuple_Type,3.2,, type,PyTypeObject,3.2,,opaque function,PyType_ClearCache,3.2,, +function,PyType_FromMetaclass,3.12,, function,PyType_FromModuleAndSpec,3.10,, function,PyType_FromSpec,3.2,, function,PyType_FromSpecWithBases,3.3,, diff --git a/Doc/distutils/apiref.rst b/Doc/distutils/apiref.rst index 18a4aac4f70895..74c48142d6d7fd 100644 --- a/Doc/distutils/apiref.rst +++ b/Doc/distutils/apiref.rst @@ -11,7 +11,7 @@ API Reference and other APIs, makes the API consistent across different Python versions, and is hence recommended over using ``distutils`` directly. -.. _New and changed setup.py arguments in setuptools: https://setuptools.readthedocs.io/en/latest/setuptools.html#new-and-changed-setup-keywords +.. _New and changed setup.py arguments in setuptools: https://web.archive.org/web/20210614192516/https://setuptools.pypa.io/en/stable/userguide/keywords.html .. include:: ./_setuptools_disclaimer.rst diff --git a/Doc/faq/design.rst b/Doc/faq/design.rst index ff83a1b8134b77..a624fdb07a17df 100644 --- a/Doc/faq/design.rst +++ b/Doc/faq/design.rst @@ -324,8 +324,7 @@ Can Python be compiled to machine code, C or some other language? `Cython `_ compiles a modified version of Python with optional annotations into C extensions. `Nuitka `_ is an up-and-coming compiler of Python into C++ code, aiming to support the full -Python language. For compiling to Java you can consider -`VOC `_. +Python language. How does Python manage memory? diff --git a/Doc/faq/library.rst b/Doc/faq/library.rst index b9e541c150dc43..8167bf22f0b1ad 100644 --- a/Doc/faq/library.rst +++ b/Doc/faq/library.rst @@ -483,8 +483,14 @@ including :func:`~shutil.copyfile`, :func:`~shutil.copytree`, and How do I copy a file? --------------------- -The :mod:`shutil` module contains a :func:`~shutil.copyfile` function. Note -that on MacOS 9 it doesn't copy the resource fork and Finder info. +The :mod:`shutil` module contains a :func:`~shutil.copyfile` function. 
+Note that on Windows NTFS volumes, it does not copy +`alternate data streams +`_ +nor `resource forks `__ +on macOS HFS+ volumes, though both are now rarely used. +It also doesn't copy file permissions and metadata, though using +:func:`shutil.copy2` instead will preserve most (though not all) of it. How do I read (or write) binary data? @@ -664,7 +670,7 @@ A summary of available frameworks is maintained by Paul Boddie at https://wiki.python.org/moin/WebProgramming\ . Cameron Laird maintains a useful set of pages about Python web technologies at -http://phaseit.net/claird/comp.lang.python/web_python. +https://web.archive.org/web/20210224183619/http://phaseit.net/claird/comp.lang.python/web_python. How can I mimic CGI form submission (METHOD=POST)? diff --git a/Doc/faq/programming.rst b/Doc/faq/programming.rst index f87eaff9531fce..0b11bc986bbb04 100644 --- a/Doc/faq/programming.rst +++ b/Doc/faq/programming.rst @@ -56,7 +56,7 @@ Are there tools to help find bugs or perform static analysis? Yes. -`Pylint `_ and +`Pylint `_ and `Pyflakes `_ do basic checking that will help you catch bugs sooner. diff --git a/Doc/howto/functional.rst b/Doc/howto/functional.rst index 695b9b31a762bd..fb561a6a10b661 100644 --- a/Doc/howto/functional.rst +++ b/Doc/howto/functional.rst @@ -315,9 +315,15 @@ line of a file like this:: Sets can take their contents from an iterable and let you iterate over the set's elements:: - S = {2, 3, 5, 7, 11, 13} - for i in S: - print(i) + >>> S = {2, 3, 5, 7, 11, 13} + >>> for i in S: + ... print(i) + 2 + 3 + 5 + 7 + 11 + 13 @@ -335,18 +341,18 @@ List comprehensions and generator expressions (short form: "listcomps" and functional programming language Haskell (https://www.haskell.org/). You can strip all the whitespace from a stream of strings with the following code:: - line_list = [' line 1\n', 'line 2 \n', ...] + >>> line_list = [' line 1\n', 'line 2 \n', ' \n', ''] - # Generator expression -- returns iterator - stripped_iter = (line.strip() for line in line_list) + >>> # Generator expression -- returns iterator + >>> stripped_iter = (line.strip() for line in line_list) - # List comprehension -- returns list - stripped_list = [line.strip() for line in line_list] + >>> # List comprehension -- returns list + >>> stripped_list = [line.strip() for line in line_list] You can select only certain elements by adding an ``"if"`` condition:: - stripped_list = [line.strip() for line in line_list - if line != ""] + >>> stripped_list = [line.strip() for line in line_list + ... if line != ""] With a list comprehension, you get back a Python list; ``stripped_list`` is a list containing the resulting lines, not an iterator. Generator expressions @@ -363,7 +369,8 @@ have the form:: if condition1 for expr2 in sequence2 if condition2 - for expr3 in sequence3 ... + for expr3 in sequence3 + ... if condition3 for exprN in sequenceN if conditionN ) diff --git a/Doc/howto/logging-cookbook.rst b/Doc/howto/logging-cookbook.rst index 704279240b357b..79b4069f035076 100644 --- a/Doc/howto/logging-cookbook.rst +++ b/Doc/howto/logging-cookbook.rst @@ -7,7 +7,8 @@ Logging Cookbook :Author: Vinay Sajip This page contains a number of recipes related to logging, which have been found -useful in the past. +useful in the past. For links to tutorial and reference information, please see +:ref:`cookbook-ref-links`. .. 
currentmodule:: logging

@@ -713,6 +714,32 @@ which, when run, produces something like:
2010-09-06 22:38:15,301 d.e.f DEBUG IP: 123.231.231.123 User: fred A message at DEBUG level with 2 parameters
2010-09-06 22:38:15,301 d.e.f INFO IP: 123.231.231.123 User: fred A message at INFO level with 2 parameters

+Imparting contextual information in handlers
+--------------------------------------------
+
+Each :class:`~Handler` has its own chain of filters.
+If you want to add contextual information to a :class:`LogRecord` without leaking
+it to other handlers, you can use a filter that returns
+a new :class:`~LogRecord` instead of modifying it in-place, as shown in the following script::
+
+    import copy
+    import logging
+
+    def filter(record: logging.LogRecord):
+        record = copy.copy(record)
+        record.user = 'jim'
+        return record
+
+    if __name__ == '__main__':
+        logger = logging.getLogger()
+        logger.setLevel(logging.INFO)
+        handler = logging.StreamHandler()
+        formatter = logging.Formatter('%(message)s from %(user)-8s')
+        handler.setFormatter(formatter)
+        handler.addFilter(filter)
+        logger.addHandler(handler)
+
+        logger.info('A log message')

.. _multiple-processes:

@@ -2995,6 +3022,95 @@ refer to the comments in the code snippet for more detailed information.

    if __name__=='__main__':
        main()

+Logging to syslog with RFC5424 support
+--------------------------------------
+
+Although :rfc:`5424` dates from 2009, most syslog servers are configured by default to
+use the older :rfc:`3164`, which hails from 2001. When ``logging`` was added to Python
+in 2003, it supported the earlier (and only existing) protocol at the time. Since
+RFC 5424 came out, it has not seen widespread deployment in syslog servers, so the
+:class:`~logging.handlers.SysLogHandler` functionality has not been
+updated.
+
+RFC 5424 contains some useful features such as support for structured data, and if you
+need to be able to log to a syslog server with support for it, you can do so with a
+subclassed handler which looks something like this::
+
+    import datetime
+    import logging.handlers
+    import re
+    import socket
+    import time
+
+    class SysLogHandler5424(logging.handlers.SysLogHandler):
+
+        tz_offset = re.compile(r'([+-]\d{2})(\d{2})$')
+        escaped = re.compile(r'([\]"\\])')
+
+        def __init__(self, *args, **kwargs):
+            self.msgid = kwargs.pop('msgid', None)
+            self.appname = kwargs.pop('appname', None)
+            super().__init__(*args, **kwargs)
+
+        def format(self, record):
+            version = 1
+            asctime = datetime.datetime.fromtimestamp(record.created).isoformat()
+            m = self.tz_offset.match(time.strftime('%z'))
+            has_offset = False
+            if m and time.timezone:
+                hrs, mins = m.groups()
+                if int(hrs) or int(mins):
+                    has_offset = True
+            if not has_offset:
+                asctime += 'Z'
+            else:
+                asctime += f'{hrs}:{mins}'
+            try:
+                hostname = socket.gethostname()
+            except Exception:
+                hostname = '-'
+            appname = self.appname or '-'
+            procid = record.process
+            msgid = '-'
+            msg = super().format(record)
+            sdata = '-'
+            if hasattr(record, 'structured_data'):
+                sd = record.structured_data
+                # This should be a dict where the keys are SD-ID and the value is a
+                # dict mapping PARAM-NAME to PARAM-VALUE (refer to the RFC for what these
+                # mean)
+                # There's no error checking here - it's purely for illustration, and you
+                # can adapt this code for use in production environments
+                parts = []
+
+                def replacer(m):
+                    g = m.groups()
+                    return '\\' + g[0]
+
+                for sdid, dv in sd.items():
+                    part = f'[{sdid}'
+                    for k, v in dv.items():
+                        s = str(v)
+                        s = self.escaped.sub(replacer, s)
+                        part += f' {k}="{s}"'
+                    part += ']'
+                    parts.append(part)
+                sdata = ''.join(parts)
+            return f'{version} {asctime} {hostname} {appname} {procid} {msgid} {sdata} {msg}'

+You'll need to be familiar with RFC 5424 to fully understand the above code, and it
+may be that you have slightly different needs (e.g. for how you pass structured data
+to the log). Nevertheless, the above should be adaptable to your specific needs. With
+the above handler, you'd pass structured data using something like this::
+
+    sd = {
+        'foo@12345': {'bar': 'baz', 'baz': 'bozz', 'fizz': r'buzz'},
+        'foo@54321': {'rab': 'baz', 'zab': 'bozz', 'zzif': r'buzz'}
+    }
+    extra = {'structured_data': sd}
+    i = 1
+    logger.debug('Message %d', i, extra=extra)
+
.. patterns-to-avoid:

@@ -3075,3 +3191,23 @@ the :ref:`existing mechanisms ` for passing contextual
information into your logs and restrict the loggers created to those describing
areas within your application (generally modules, but occasionally slightly
more fine-grained than that).
+
+.. _cookbook-ref-links:
+
+Other resources
+---------------
+
+.. seealso::
+
+   Module :mod:`logging`
+      API reference for the logging module.
+
+   Module :mod:`logging.config`
+      Configuration API for the logging module.
+
+   Module :mod:`logging.handlers`
+      Useful handlers included with the logging module.
+
+   :ref:`Basic Tutorial `
+
+   :ref:`Advanced Tutorial `

diff --git a/Doc/howto/logging.rst b/Doc/howto/logging.rst
index 4d76c27332ccd3..f024430cedb0ae 100644
--- a/Doc/howto/logging.rst
+++ b/Doc/howto/logging.rst
@@ -8,6 +8,9 @@ Logging HOWTO

.. currentmodule:: logging

+This page contains tutorial information. For links to reference information and a
+logging cookbook, please see :ref:`tutorial-ref-links`.
+ Basic Logging Tutorial ---------------------- @@ -178,10 +181,11 @@ following example:: raise ValueError('Invalid log level: %s' % loglevel) logging.basicConfig(level=numeric_level, ...) -The call to :func:`basicConfig` should come *before* any calls to :func:`debug`, -:func:`info` etc. As it's intended as a one-off simple configuration facility, -only the first call will actually do anything: subsequent calls are effectively -no-ops. +The call to :func:`basicConfig` should come *before* any calls to +:func:`debug`, :func:`info`, etc. Otherwise, those functions will call +:func:`basicConfig` for you with the default options. As it's intended as a +one-off simple configuration facility, only the first call will actually do +anything: subsequent calls are effectively no-ops. If you run the above script several times, the messages from successive runs are appended to the file *example.log*. If you want each run to start afresh, @@ -335,7 +339,7 @@ favourite beverage and carry on. If your logging needs are simple, then use the above examples to incorporate logging into your own scripts, and if you run into problems or don't understand something, please post a question on the comp.lang.python Usenet -group (available at https://groups.google.com/forum/#!forum/comp.lang.python) and you +group (available at https://groups.google.com/g/comp.lang.python) and you should receive help before too long. Still here? You can carry on reading the next few sections, which provide a @@ -1100,11 +1104,19 @@ need: | Current process name when using ``multiprocessing`` | Set ``logging.logMultiprocessing`` to ``False``. | | to manage multiple processes. | | +-----------------------------------------------------+---------------------------------------------------+ +| Current :class:`asyncio.Task` name when using | Set ``logging.logAsyncioTasks`` to ``False``. | +| ``asyncio``. | | ++-----------------------------------------------------+---------------------------------------------------+ Also note that the core logging module only includes the basic handlers. If you don't import :mod:`logging.handlers` and :mod:`logging.config`, they won't take up any memory. +.. _tutorial-ref-links: + +Other resources +--------------- + .. seealso:: Module :mod:`logging` diff --git a/Doc/howto/sockets.rst b/Doc/howto/sockets.rst index e58f78a7cb0245..0bbf97da39768d 100644 --- a/Doc/howto/sockets.rst +++ b/Doc/howto/sockets.rst @@ -252,20 +252,25 @@ Binary Data ----------- It is perfectly possible to send binary data over a socket. The major problem is -that not all machines use the same formats for binary data. For example, a -Motorola chip will represent a 16 bit integer with the value 1 as the two hex -bytes 00 01. Intel and DEC, however, are byte-reversed - that same 1 is 01 00. +that not all machines use the same formats for binary data. For example, +`network byte order `_ +is big-endian, with the most significant byte first, +so a 16 bit integer with the value ``1`` would be the two hex bytes ``00 01``. +However, most common processors (x86/AMD64, ARM, RISC-V), are little-endian, +with the least significant byte first - that same ``1`` would be ``01 00``. + Socket libraries have calls for converting 16 and 32 bit integers - ``ntohl, htonl, ntohs, htons`` where "n" means *network* and "h" means *host*, "s" means *short* and "l" means *long*. Where network order is host order, these do nothing, but where the machine is byte-reversed, these swap the bytes around appropriately. 
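To see the two byte orders side by side, here is a small illustrative sketch
using the standard :mod:`struct` and :mod:`socket` modules (the commented
values assume a little-endian host)::

    import socket
    import struct

    # Network byte order is big-endian: the 16-bit value 1 is the bytes 00 01.
    print(struct.pack('!H', 1))    # b'\x00\x01'  ('!' selects network order)

    # Native order on a little-endian machine is reversed.
    print(struct.pack('<H', 1))    # b'\x01\x00'

    # socket.htons() converts host order to network order: a byte swap on a
    # little-endian host, a no-op on a big-endian one.
    print(hex(socket.htons(1)))    # 0x100 on little-endian hosts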
-In these days of 32 bit machines, the ascii representation of binary data is +In these days of 64-bit machines, the ASCII representation of binary data is frequently smaller than the binary representation. That's because a surprising -amount of the time, all those longs have the value 0, or maybe 1. The string "0" -would be two bytes, while binary is four. Of course, this doesn't fit well with -fixed-length messages. Decisions, decisions. +amount of the time, most integers have the value 0, or maybe 1. +The string ``"0"`` would be two bytes, while a full 64-bit integer would be 8. +Of course, this doesn't fit well with fixed-length messages. +Decisions, decisions. Disconnecting diff --git a/Doc/howto/urllib2.rst b/Doc/howto/urllib2.rst index 12d525771ddc28..e77fefffbd257d 100644 --- a/Doc/howto/urllib2.rst +++ b/Doc/howto/urllib2.rst @@ -4,13 +4,13 @@ HOWTO Fetch Internet Resources Using The urllib Package *********************************************************** -:Author: `Michael Foord `_ +:Author: `Michael Foord `_ .. note:: There is a French translation of an earlier revision of this HOWTO, available at `urllib2 - Le Manuel manquant - `_. + `_. @@ -22,7 +22,7 @@ Introduction You may also find useful the following article on fetching web resources with Python: - * `Basic Authentication `_ + * `Basic Authentication `_ A tutorial on *Basic Authentication*, with examples in Python. diff --git a/Doc/includes/email-headers.py b/Doc/includes/email-headers.py index 2c421451a8e5a8..5def0c90d28d9f 100644 --- a/Doc/includes/email-headers.py +++ b/Doc/includes/email-headers.py @@ -1,5 +1,6 @@ # Import the email modules we'll need -from email.parser import BytesParser, Parser +#from email.parser import BytesParser +from email.parser import Parser from email.policy import default # If the e-mail headers are in a file, uncomment these two lines: diff --git a/Doc/includes/sqlite3/adapter_datetime.py b/Doc/includes/sqlite3/adapter_datetime.py deleted file mode 100644 index d5221d80c35c8a..00000000000000 --- a/Doc/includes/sqlite3/adapter_datetime.py +++ /dev/null @@ -1,17 +0,0 @@ -import sqlite3 -import datetime -import time - -def adapt_datetime(ts): - return time.mktime(ts.timetuple()) - -sqlite3.register_adapter(datetime.datetime, adapt_datetime) - -con = sqlite3.connect(":memory:") -cur = con.cursor() - -now = datetime.datetime.now() -cur.execute("select ?", (now,)) -print(cur.fetchone()[0]) - -con.close() diff --git a/Doc/includes/sqlite3/converter_point.py b/Doc/includes/sqlite3/converter_point.py index 5df828e3361246..147807a2225ff0 100644 --- a/Doc/includes/sqlite3/converter_point.py +++ b/Doc/includes/sqlite3/converter_point.py @@ -5,28 +5,23 @@ def __init__(self, x, y): self.x, self.y = x, y def __repr__(self): - return "(%f;%f)" % (self.x, self.y) + return f"Point({self.x}, {self.y})" def adapt_point(point): - return ("%f;%f" % (point.x, point.y)).encode('ascii') + return f"{point.x};{point.y}".encode("utf-8") def convert_point(s): x, y = list(map(float, s.split(b";"))) return Point(x, y) -# Register the adapter +# Register the adapter and converter sqlite3.register_adapter(Point, adapt_point) - -# Register the converter sqlite3.register_converter("point", convert_point) +# 1) Parse using declared types p = Point(4.0, -3.2) - -######################### -# 1) Using declared types con = sqlite3.connect(":memory:", detect_types=sqlite3.PARSE_DECLTYPES) -cur = con.cursor() -cur.execute("create table test(p point)") +cur = con.execute("create table test(p point)") cur.execute("insert into 
test(p) values (?)", (p,)) cur.execute("select p from test") @@ -34,11 +29,9 @@ def convert_point(s): cur.close() con.close() -####################### -# 1) Using column names +# 2) Parse using column names con = sqlite3.connect(":memory:", detect_types=sqlite3.PARSE_COLNAMES) -cur = con.cursor() -cur.execute("create table test(p)") +cur = con.execute("create table test(p)") cur.execute("insert into test(p) values (?)", (p,)) cur.execute('select p as "p [point]" from test') diff --git a/Doc/installing/index.rst b/Doc/installing/index.rst index 4bacc7ba0c2cf2..e158bf1c4c0c7f 100644 --- a/Doc/installing/index.rst +++ b/Doc/installing/index.rst @@ -214,7 +214,7 @@ It is possible that ``pip`` does not get installed by default. One potential fix python -m ensurepip --default-pip There are also additional resources for `installing pip. -`__ +`__ Installing binary extensions diff --git a/Doc/library/_thread.rst b/Doc/library/_thread.rst index 1e6452b7b826fd..75b3834b9a339c 100644 --- a/Doc/library/_thread.rst +++ b/Doc/library/_thread.rst @@ -118,7 +118,7 @@ This module defines the following constants and functions: Its value may be used to uniquely identify this particular thread system-wide (until the thread terminates, after which the value may be recycled by the OS). - .. availability:: Windows, FreeBSD, Linux, macOS, OpenBSD, NetBSD, AIX. + .. availability:: Windows, FreeBSD, Linux, macOS, OpenBSD, NetBSD, AIX, DragonFlyBSD. .. versionadded:: 3.8 diff --git a/Doc/library/aifc.rst b/Doc/library/aifc.rst index fa277857574a3a..9f20a30193fa70 100644 --- a/Doc/library/aifc.rst +++ b/Doc/library/aifc.rst @@ -13,7 +13,7 @@ single: AIFF-C -.. deprecated:: 3.11 +.. deprecated-removed:: 3.11 3.13 The :mod:`aifc` module is deprecated (see :pep:`PEP 594 <594#aifc>` for details). diff --git a/Doc/library/argparse.rst b/Doc/library/argparse.rst index 83dd3cdf03136d..b2fa0b3c23c3a1 100644 --- a/Doc/library/argparse.rst +++ b/Doc/library/argparse.rst @@ -562,7 +562,7 @@ at the command line. If the ``fromfile_prefix_chars=`` argument is given to the specified characters will be treated as files, and will be replaced by the arguments they contain. For example:: - >>> with open('args.txt', 'w') as fp: + >>> with open('args.txt', 'w', encoding=sys.getfilesystemencoding()) as fp: ... fp.write('-f\nbar') >>> parser = argparse.ArgumentParser(fromfile_prefix_chars='@') >>> parser.add_argument('-f') @@ -575,9 +575,18 @@ were in the same place as the original file referencing argument on the command line. So in the example above, the expression ``['-f', 'foo', '@args.txt']`` is considered equivalent to the expression ``['-f', 'foo', '-f', 'bar']``. +:class:`ArgumentParser` uses :term:`filesystem encoding and error handler` +to read the file containing arguments. + The ``fromfile_prefix_chars=`` argument defaults to ``None``, meaning that arguments will never be treated as file references. +.. versionchanged:: 3.12 + :class:`ArgumentParser` changed encoding and errors to read arguments files + from default (e.g. :func:`locale.getpreferredencoding(False)` and + ``"strict"``) to :term:`filesystem encoding and error handler`. + Arguments file should be encoded in UTF-8 instead of ANSI Codepage on Windows. + argument_default ^^^^^^^^^^^^^^^^ @@ -1701,7 +1710,7 @@ Sub-commands .. 
method:: ArgumentParser.add_subparsers([title], [description], [prog], \ [parser_class], [action], \ - [option_string], [dest], [required], \ + [option_strings], [dest], [required], \ [help], [metavar]) Many programs split up their functionality into a number of sub-commands, diff --git a/Doc/library/array.rst b/Doc/library/array.rst index c7f137d15b4b86..975670cc81a202 100644 --- a/Doc/library/array.rst +++ b/Doc/library/array.rst @@ -52,7 +52,7 @@ Notes: .. versionchanged:: 3.9 ``array('u')`` now uses ``wchar_t`` as C type instead of deprecated - ``Py_UNICODE``. This change doesn't affect to its behavior because + ``Py_UNICODE``. This change doesn't affect its behavior because ``Py_UNICODE`` is alias of ``wchar_t`` since Python 3.3. .. deprecated-removed:: 3.3 4.0 diff --git a/Doc/library/asynchat.rst b/Doc/library/asynchat.rst index 7cc9d99779bbbb..4eb6a79d4dfbf2 100644 --- a/Doc/library/asynchat.rst +++ b/Doc/library/asynchat.rst @@ -10,8 +10,8 @@ **Source code:** :source:`Lib/asynchat.py` -.. deprecated:: 3.6 - :mod:`asynchat` will be removed in Python 3.12 +.. deprecated-removed:: 3.6 3.12 + The :mod:`asynchat` module is deprecated (see :pep:`PEP 594 <594#asynchat>` for details). Please use :mod:`asyncio` instead. diff --git a/Doc/library/asyncio-llapi-index.rst b/Doc/library/asyncio-llapi-index.rst index 69b550e43f5aa9..cdadb7975ad494 100644 --- a/Doc/library/asyncio-llapi-index.rst +++ b/Doc/library/asyncio-llapi-index.rst @@ -358,6 +358,10 @@ pipes, etc). Returned from methods like * - :meth:`transport.get_write_buffer_size() ` + - Return the current size of the output buffer. + + * - :meth:`transport.get_write_buffer_limits() + ` - Return high and low water marks for write flow control. * - :meth:`transport.set_write_buffer_limits() diff --git a/Doc/library/asyncio-stream.rst b/Doc/library/asyncio-stream.rst index 97431d103cf4ba..9b468670a2521e 100644 --- a/Doc/library/asyncio-stream.rst +++ b/Doc/library/asyncio-stream.rst @@ -192,7 +192,7 @@ StreamReader can be read. Use the :attr:`IncompleteReadError.partial` attribute to get the partially read data. - .. coroutinemethod:: readuntil(separator=b'\\n') + .. coroutinemethod:: readuntil(separator=b'\n') Read data from the stream until *separator* is found. diff --git a/Doc/library/asyncio-subprocess.rst b/Doc/library/asyncio-subprocess.rst index e5000532a895dd..28d0b21e8180b6 100644 --- a/Doc/library/asyncio-subprocess.rst +++ b/Doc/library/asyncio-subprocess.rst @@ -124,6 +124,7 @@ Constants ========= .. data:: asyncio.subprocess.PIPE + :module: Can be passed to the *stdin*, *stdout* or *stderr* parameters. @@ -137,11 +138,13 @@ Constants attributes will point to :class:`StreamReader` instances. .. data:: asyncio.subprocess.STDOUT + :module: Special value that can be used as the *stderr* argument and indicates that standard error should be redirected into standard output. .. data:: asyncio.subprocess.DEVNULL + :module: Special value that can be used as the *stdin*, *stdout* or *stderr* argument to process creation functions. It indicates that the special file @@ -157,6 +160,7 @@ wrapper that allows communicating with subprocesses and watching for their completion. .. 
class:: asyncio.subprocess.Process + :module: An object that wraps OS processes created by the :func:`create_subprocess_exec` and :func:`create_subprocess_shell` diff --git a/Doc/library/asyncio-sync.rst b/Doc/library/asyncio-sync.rst index 141733ee2c8001..05bdf5488af143 100644 --- a/Doc/library/asyncio-sync.rst +++ b/Doc/library/asyncio-sync.rst @@ -345,7 +345,7 @@ BoundedSemaphore Barrier ======= -.. class:: Barrier(parties, action=None) +.. class:: Barrier(parties) A barrier object. Not thread-safe. @@ -411,8 +411,8 @@ Barrier ... async with barrier as position: if position == 0: - # Only one task print this - print('End of *draining phasis*') + # Only one task prints this + print('End of *draining phase*') This method may raise a :class:`BrokenBarrierError` exception if the barrier is broken or reset while a task is waiting. @@ -429,7 +429,7 @@ Barrier Put the barrier into a broken state. This causes any active or future calls to :meth:`wait` to fail with the :class:`BrokenBarrierError`. - Use this for example if one of the taks needs to abort, to avoid infinite + Use this for example if one of the tasks needs to abort, to avoid infinite waiting tasks. .. attribute:: parties diff --git a/Doc/library/asyncio-task.rst b/Doc/library/asyncio-task.rst index 8e3d49dcf9d717..7796e47d4674dc 100644 --- a/Doc/library/asyncio-task.rst +++ b/Doc/library/asyncio-task.rst @@ -226,7 +226,24 @@ Creating Tasks .. important:: Save a reference to the result of this function, to avoid - a task disappearing mid execution. + a task disappearing mid execution. The event loop only keeps + weak references to tasks. A task that isn't referenced elsewhere + may get garbage-collected at any time, even before it's done. + For reliable "fire-and-forget" background tasks, gather them in + a collection:: + + background_tasks = set() + + for i in range(10): + task = asyncio.create_task(some_coro(param=i)) + + # Add task to the set. This creates a strong reference. + background_tasks.add(task) + + # To prevent keeping references to finished tasks forever, + # make each task remove its own reference from the set after + # completion: + task.add_done_callback(background_tasks.discard) .. versionadded:: 3.7 diff --git a/Doc/library/asyncore.rst b/Doc/library/asyncore.rst index a732fd7ba4f152..0084d754419d09 100644 --- a/Doc/library/asyncore.rst +++ b/Doc/library/asyncore.rst @@ -13,8 +13,8 @@ **Source code:** :source:`Lib/asyncore.py` -.. deprecated:: 3.6 - :mod:`asyncore` will be removed in Python 3.12 +.. deprecated-removed:: 3.6 3.12 + The :mod:`asyncore` module is deprecated (see :pep:`PEP 594 <594#asyncore>` for details). Please use :mod:`asyncio` instead. diff --git a/Doc/library/audioop.rst b/Doc/library/audioop.rst index 649c99e796282c..1f96575d08f5b1 100644 --- a/Doc/library/audioop.rst +++ b/Doc/library/audioop.rst @@ -5,7 +5,7 @@ :synopsis: Manipulate raw audio data. :deprecated: -.. deprecated:: 3.11 +.. deprecated-removed:: 3.11 3.13 The :mod:`audioop` module is deprecated (see :pep:`PEP 594 <594#audioop>` for details). diff --git a/Doc/library/base64.rst b/Doc/library/base64.rst index 4ff038c8d29f1a..a02ba739146aaf 100644 --- a/Doc/library/base64.rst +++ b/Doc/library/base64.rst @@ -201,7 +201,7 @@ The modern interface provides: .. versionadded:: 3.4 -.. function:: a85decode(b, *, foldspaces=False, adobe=False, ignorechars=b' \\t\\n\\r\\v') +.. 
function:: a85decode(b, *, foldspaces=False, adobe=False, ignorechars=b' \t\n\r\v') Decode the Ascii85 encoded :term:`bytes-like object` or ASCII string *b* and return the decoded :class:`bytes`. diff --git a/Doc/library/binascii.rst b/Doc/library/binascii.rst index 4417a5ac38251e..5a0815faa38eac 100644 --- a/Doc/library/binascii.rst +++ b/Doc/library/binascii.rst @@ -49,7 +49,7 @@ The :mod:`binascii` module defines the following functions: Added the *backtick* parameter. -.. function:: a2b_base64(string, strict_mode=False) +.. function:: a2b_base64(string, /, *, strict_mode=False) Convert a block of base64 data back to binary and return the binary data. More than one line may be passed at a time. diff --git a/Doc/library/cgi.rst b/Doc/library/cgi.rst index 31c03dee91ea82..983e412afafd99 100644 --- a/Doc/library/cgi.rst +++ b/Doc/library/cgi.rst @@ -15,10 +15,16 @@ single: URL single: Common Gateway Interface -.. deprecated:: 3.11 +.. deprecated-removed:: 3.11 3.13 The :mod:`cgi` module is deprecated (see :pep:`PEP 594 <594#cgi>` for details and alternatives). + The :class:`FieldStorage` class can typically be replaced with + :func:`urllib.parse.parse_qsl` for ``GET`` and ``HEAD`` requests, + and the :mod:`email.message` module or + `multipart `_ for ``POST`` and ``PUT``. + Most :ref:`utility functions ` have replacements. + -------------- Support module for Common Gateway Interface (CGI) scripts. @@ -293,6 +299,12 @@ algorithms implemented in this module in other circumstances. ``sys.stdin``). The *keep_blank_values*, *strict_parsing* and *separator* parameters are passed to :func:`urllib.parse.parse_qs` unchanged. + .. deprecated-removed:: 3.11 3.13 + This function, like the rest of the :mod:`cgi` module, is deprecated. + It can be replaced by calling :func:`urllib.parse.parse_qs` directly + on the desired query string (except for ``multipart/form-data`` input, + which can be handled as described for :func:`parse_multipart`). + .. function:: parse_multipart(fp, pdict, encoding="utf-8", errors="replace", separator="&") @@ -316,12 +328,31 @@ algorithms implemented in this module in other circumstances. .. versionchanged:: 3.10 Added the *separator* parameter. + .. deprecated-removed:: 3.11 3.13 + This function, like the rest of the :mod:`cgi` module, is deprecated. + It can be replaced with the functionality in the :mod:`email` package + (e.g. :class:`email.message.EmailMessage`/:class:`email.message.Message`) + which implements the same MIME RFCs, or with the + `multipart `__ PyPI project. + .. function:: parse_header(string) Parse a MIME header (such as :mailheader:`Content-Type`) into a main value and a dictionary of parameters. + .. deprecated-removed:: 3.11 3.13 + This function, like the rest of the :mod:`cgi` module, is deprecated. + It can be replaced with the functionality in the :mod:`email` package, + which implements the same MIME RFCs. + + For example, with :class:`email.message.EmailMessage`:: + + from email.message import EmailMessage + msg = EmailMessage() + msg['content-type'] = 'application/json; charset="utf8"' + main, params = msg.get_content_type(), msg['content-type'].params + .. function:: test() diff --git a/Doc/library/cgitb.rst b/Doc/library/cgitb.rst index 3b0b106abacd50..7f00bcd55c1e53 100644 --- a/Doc/library/cgitb.rst +++ b/Doc/library/cgitb.rst @@ -16,7 +16,7 @@ single: exceptions; in CGI scripts single: tracebacks; in CGI scripts -.. deprecated:: 3.11 +.. 
deprecated-removed:: 3.11 3.13 The :mod:`cgitb` module is deprecated (see :pep:`PEP 594 <594#cgitb>` for details). diff --git a/Doc/library/chunk.rst b/Doc/library/chunk.rst index 5a84c8904f7145..3b88e55b147882 100644 --- a/Doc/library/chunk.rst +++ b/Doc/library/chunk.rst @@ -17,7 +17,7 @@ single: Real Media File Format single: RMFF -.. deprecated:: 3.11 +.. deprecated-removed:: 3.11 3.13 The :mod:`chunk` module is deprecated (see :pep:`PEP 594 <594#chunk>` for details). diff --git a/Doc/library/concurrent.futures.rst b/Doc/library/concurrent.futures.rst index 99703ff3051d47..8efbf0a3d59554 100644 --- a/Doc/library/concurrent.futures.rst +++ b/Doc/library/concurrent.futures.rst @@ -257,7 +257,7 @@ to a :class:`ProcessPoolExecutor` will result in deadlock. replaced with a fresh worker process. By default *max_tasks_per_child* is ``None`` which means worker processes will live as long as the pool. When a max is specified, the "spawn" multiprocessing start method will be used by - default in absense of a *mp_context* parameter. This feature is incompatible + default in absence of a *mp_context* parameter. This feature is incompatible with the "fork" start method. .. versionchanged:: 3.3 diff --git a/Doc/library/configparser.rst b/Doc/library/configparser.rst index 72aa20d73d8bd3..bf49f2bfbe1b43 100644 --- a/Doc/library/configparser.rst +++ b/Doc/library/configparser.rst @@ -1206,28 +1206,6 @@ ConfigParser Objects names is stripped before :meth:`optionxform` is called. - .. method:: readfp(fp, filename=None) - - .. deprecated:: 3.2 - Use :meth:`read_file` instead. - - .. versionchanged:: 3.2 - :meth:`readfp` now iterates on *fp* instead of calling ``fp.readline()``. - - For existing code calling :meth:`readfp` with arguments which don't - support iteration, the following generator may be used as a wrapper - around the file-like object:: - - def readline_generator(fp): - line = fp.readline() - while line: - yield line - line = fp.readline() - - Instead of ``parser.readfp(fp)`` use - ``parser.read_file(readline_generator(fp))``. - - .. data:: MAX_INTERPOLATION_DEPTH The maximum depth for recursive interpolation for :meth:`get` when the *raw* @@ -1361,10 +1339,9 @@ Exceptions Exception raised when errors occur attempting to parse a file. - .. versionchanged:: 3.2 - The ``filename`` attribute and :meth:`__init__` argument were renamed to - ``source`` for consistency. - +.. versionchanged:: 3.12 + The ``filename`` attribute and :meth:`__init__` constructor argument were + removed. They have been available using the name ``source`` since 3.2. .. rubric:: Footnotes diff --git a/Doc/library/contextlib.rst b/Doc/library/contextlib.rst index 84c4ca098c6015..2d28fb35a9e316 100644 --- a/Doc/library/contextlib.rst +++ b/Doc/library/contextlib.rst @@ -361,7 +361,7 @@ Functions and classes provided: As this changes a global state, the working directory, it is not suitable for use in most threaded or async contexts. It is also not suitable for most non-linear code execution, like generators, where the program execution is - temporarily relinquished -- unless explicitely desired, you should not yield + temporarily relinquished -- unless explicitly desired, you should not yield when this context manager is active. This is a simple wrapper around :func:`~os.chdir`, it changes the current diff --git a/Doc/library/crypt.rst b/Doc/library/crypt.rst index e795f10f50eec3..efba4236bcbccc 100644 --- a/Doc/library/crypt.rst +++ b/Doc/library/crypt.rst @@ -16,7 +16,7 @@ single: crypt(3) pair: cipher; DES -.. 
deprecated:: 3.11 +.. deprecated-removed:: 3.11 3.13 The :mod:`crypt` module is deprecated (see :pep:`PEP 594 <594#crypt>` for details and alternatives). The :mod:`hashlib` module is a potential replacement for certain use cases. diff --git a/Doc/library/datetime.rst b/Doc/library/datetime.rst index e0b28d7cb978d9..ef1535c84bd5ad 100644 --- a/Doc/library/datetime.rst +++ b/Doc/library/datetime.rst @@ -998,7 +998,7 @@ Other constructors, all class methods: ISO 8601 format, with the following exceptions: 1. Time zone offsets may have fractional seconds. - 2. The `T` separator may be replaced by any single unicode character. + 2. The ``T`` separator may be replaced by any single unicode character. 3. Ordinal dates are not currently supported. 4. Fractional hours and minutes are not supported. @@ -2603,7 +2603,7 @@ Notes: many other calendar systems. .. [#] See R. H. van Gent's `guide to the mathematics of the ISO 8601 calendar - `_ + `_ for a good explanation. .. [#] Passing ``datetime.strptime('Feb 29', '%b %d')`` will fail since ``1900`` is not a leap year. diff --git a/Doc/library/decimal.rst b/Doc/library/decimal.rst index 2ad84f20b5560f..d052581c970124 100644 --- a/Doc/library/decimal.rst +++ b/Doc/library/decimal.rst @@ -571,9 +571,10 @@ Decimal objects >>> Decimal(321).exp() Decimal('2.561702493119680037517373933E+139') - .. method:: from_float(f) + .. classmethod:: from_float(f) - Classmethod that converts a float to a decimal number, exactly. + Alternative constructor that only accepts instances of :class:`float` or + :class:`int`. Note `Decimal.from_float(0.1)` is not the same as `Decimal('0.1')`. Since 0.1 is not exactly representable in binary floating point, the diff --git a/Doc/library/difflib.rst b/Doc/library/difflib.rst index 10f8808d33b160..87c38602113ead 100644 --- a/Doc/library/difflib.rst +++ b/Doc/library/difflib.rst @@ -149,7 +149,7 @@ diffs. For comparing directories and files, see also, the :mod:`filecmp` module. contains a good example of its use. -.. function:: context_diff(a, b, fromfile='', tofile='', fromfiledate='', tofiledate='', n=3, lineterm='\\n') +.. function:: context_diff(a, b, fromfile='', tofile='', fromfiledate='', tofiledate='', n=3, lineterm='\n') Compare *a* and *b* (lists of strings); return a delta (a :term:`generator` generating the delta lines) in context diff format. @@ -279,7 +279,7 @@ diffs. For comparing directories and files, see also, the :mod:`filecmp` module. emu -.. function:: unified_diff(a, b, fromfile='', tofile='', fromfiledate='', tofiledate='', n=3, lineterm='\\n') +.. function:: unified_diff(a, b, fromfile='', tofile='', fromfiledate='', tofiledate='', n=3, lineterm='\n') Compare *a* and *b* (lists of strings); return a delta (a :term:`generator` generating the delta lines) in unified diff format. @@ -321,7 +321,7 @@ diffs. For comparing directories and files, see also, the :mod:`filecmp` module. See :ref:`difflib-interface` for a more detailed example. -.. function:: diff_bytes(dfunc, a, b, fromfile=b'', tofile=b'', fromfiledate=b'', tofiledate=b'', n=3, lineterm=b'\\n') +.. function:: diff_bytes(dfunc, a, b, fromfile=b'', tofile=b'', fromfiledate=b'', tofiledate=b'', n=3, lineterm=b'\n') Compare *a* and *b* (lists of bytes objects) using *dfunc*; yield a sequence of delta lines (also bytes) in the format returned by *dfunc*. 
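The :mod:`difflib` signature fixes above only correct how the ``lineterm`` default is rendered in the docs (a real newline rather than a doubled backslash); runtime behaviour is unchanged. A minimal sketch of that documented default in use, with made-up input lines and file names::

    import difflib

    old = ["one\n", "two\n", "three\n"]
    new = ["one\n", "too\n", "three\n", "four\n"]

    # lineterm defaults to '\n': unified_diff appends it to the header lines it
    # generates, while content lines keep whatever line endings they already have.
    for line in difflib.unified_diff(old, new, fromfile="a.txt", tofile="b.txt"):
        print(line, end="")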
diff --git a/Doc/library/dis.rst b/Doc/library/dis.rst index 08e6c736d3e3c9..84712bce7fd1f6 100644 --- a/Doc/library/dis.rst +++ b/Doc/library/dis.rst @@ -6,6 +6,12 @@ **Source code:** :source:`Lib/dis.py` +.. testsetup:: + + import dis + def myfunc(alist): + return len(alist) + -------------- The :mod:`dis` module supports the analysis of CPython :term:`bytecode` by @@ -37,17 +43,17 @@ Example: Given the function :func:`myfunc`:: return len(alist) the following command can be used to display the disassembly of -:func:`myfunc`:: +:func:`myfunc`: - >>> dis.dis(myfunc) - 1 0 RESUME 0 +.. doctest:: - 2 2 PUSH_NULL - 4 LOAD_GLOBAL 1 (NULL + len) - 6 LOAD_FAST 0 (alist) - 8 PRECALL 1 - 10 CALL 1 - 12 RETURN_VALUE + >>> dis.dis(myfunc) + 2 0 RESUME 0 + + 3 2 LOAD_GLOBAL 1 (NULL + len) + 14 LOAD_FAST 0 (alist) + 16 CALL 1 + 26 RETURN_VALUE (The "2" is a line number). @@ -109,17 +115,17 @@ code. .. versionchanged:: 3.11 Added the ``show_caches`` parameter. -Example:: +Example: + +.. doctest:: >>> bytecode = dis.Bytecode(myfunc) >>> for instr in bytecode: ... print(instr.opname) ... RESUME - PUSH_NULL LOAD_GLOBAL LOAD_FAST - PRECALL CALL RETURN_VALUE @@ -506,8 +512,8 @@ the original TOS1. .. opcode:: GET_ANEXT - Implements ``PUSH(get_awaitable(TOS.__anext__()))``. See ``GET_AWAITABLE`` - for details about ``get_awaitable`` + Pushes ``get_awaitable(TOS.__anext__())`` to the stack. See + ``GET_AWAITABLE`` for details about ``get_awaitable``. .. versionadded:: 3.5 @@ -577,6 +583,8 @@ iterations of the loop. Pops TOS and yields it from a :term:`generator`. + .. versionchanged:: 3.11 + oparg set to be the stack depth, for efficient handling on frames. .. opcode:: YIELD_FROM @@ -879,7 +887,20 @@ iterations of the loop. .. opcode:: LOAD_ATTR (namei) - Replaces TOS with ``getattr(TOS, co_names[namei])``. + If the low bit of ``namei`` is not set, this replaces TOS with + ``getattr(TOS, co_names[namei>>1])``. + + If the low bit of ``namei`` is set, this will attempt to load a method named + ``co_names[namei>>1]`` from the TOS object. TOS is popped. + This bytecode distinguishes two cases: if TOS has a method with the correct + name, the bytecode pushes the unbound method and TOS. TOS will be used as + the first argument (``self``) by :opcode:`CALL` when calling the + unbound method. Otherwise, ``NULL`` and the object return by the attribute + lookup are pushed. + + .. versionchanged:: 3.12 + If the low bit of ``namei`` is set, then a ``NULL`` or ``self`` is + pushed to the stack before the attribute or unbound method respectively. .. opcode:: COMPARE_OP (opname) @@ -1034,6 +1055,17 @@ iterations of the loop. Pushes a reference to the local ``co_varnames[var_num]`` onto the stack. + .. versionchanged:: 3.12 + This opcode is now only used in situations where the local variable is + guaranteed to be initialized. It cannot raise :exc:`UnboundLocalError`. + +.. opcode:: LOAD_FAST_CHECK (var_num) + + Pushes a reference to the local ``co_varnames[var_num]`` onto the stack, + raising an :exc:`UnboundLocalError` if the local variable has not been + initialized. + + .. versionadded:: 3.12 .. opcode:: STORE_FAST (var_num) @@ -1170,27 +1202,6 @@ iterations of the loop. .. versionadded:: 3.6 -.. opcode:: LOAD_METHOD (namei) - - Loads a method named ``co_names[namei]`` from the TOS object. TOS is popped. - This bytecode distinguishes two cases: if TOS has a method with the correct - name, the bytecode pushes the unbound method and TOS. 
TOS will be used as - the first argument (``self``) by :opcode:`CALL` when calling the - unbound method. Otherwise, ``NULL`` and the object return by the attribute - lookup are pushed. - - .. versionadded:: 3.7 - - -.. opcode:: PRECALL (argc) - - Prefixes :opcode:`CALL`. Logically this is a no op. - It exists to enable effective specialization of calls. - ``argc`` is the number of arguments as described in :opcode:`CALL`. - - .. versionadded:: 3.11 - - .. opcode:: PUSH_NULL Pushes a ``NULL`` to the stack. @@ -1202,7 +1213,7 @@ iterations of the loop. .. opcode:: KW_NAMES (i) - Prefixes :opcode:`PRECALL`. + Prefixes :opcode:`CALL`. Stores a reference to ``co_consts[consti]`` into an internal variable for use by :opcode:`CALL`. ``co_consts[consti]`` must be a tuple of strings. diff --git a/Doc/library/doctest.rst b/Doc/library/doctest.rst index be5651d15a0c9a..2d50a49c6415e9 100644 --- a/Doc/library/doctest.rst +++ b/Doc/library/doctest.rst @@ -563,41 +563,35 @@ doctest decides whether actual output matches an example's expected output: .. data:: IGNORE_EXCEPTION_DETAIL - When specified, an example that expects an exception passes if an exception of - the expected type is raised, even if the exception detail does not match. For - example, an example expecting ``ValueError: 42`` will pass if the actual - exception raised is ``ValueError: 3*14``, but will fail, e.g., if - :exc:`TypeError` is raised. + When specified, doctests expecting exceptions pass so long as an exception + of the expected type is raised, even if the details + (message and fully-qualified exception name) don't match. - It will also ignore the module name used in Python 3 doctest reports. Hence - both of these variations will work with the flag specified, regardless of - whether the test is run under Python 2.7 or Python 3.2 (or later versions):: + For example, an example expecting ``ValueError: 42`` will pass if the actual + exception raised is ``ValueError: 3*14``, but will fail if, say, a + :exc:`TypeError` is raised instead. + It will also ignore any fully-qualified name included before the + exception class, which can vary between implementations and versions + of Python and the code/libraries in use. + Hence, all three of these variations will work with the flag specified: - >>> raise CustomError('message') + .. code-block:: pycon + + >>> raise Exception('message') Traceback (most recent call last): - CustomError: message + Exception: message - >>> raise CustomError('message') + >>> raise Exception('message') Traceback (most recent call last): - my_module.CustomError: message + builtins.Exception: message - Note that :const:`ELLIPSIS` can also be used to ignore the - details of the exception message, but such a test may still fail based - on whether or not the module details are printed as part of the - exception name. Using :const:`IGNORE_EXCEPTION_DETAIL` and the details - from Python 2.3 is also the only clear way to write a doctest that doesn't - care about the exception detail yet continues to pass under Python 2.3 or - earlier (those releases do not support :ref:`doctest directives - ` and ignore them as irrelevant comments). For example:: - - >>> (1, 2)[3] = 'moo' + >>> raise Exception('message') Traceback (most recent call last): - File "", line 1, in - TypeError: object doesn't support item assignment + __main__.Exception: message - passes under Python 2.3 and later Python versions with the flag specified, - even though the detail - changed in Python 2.4 to say "does not" instead of "doesn't". 
+ Note that :const:`ELLIPSIS` can also be used to ignore the + details of the exception message, but such a test may still fail based + on whether the module name is present or matches exactly. .. versionchanged:: 3.2 :const:`IGNORE_EXCEPTION_DETAIL` now also ignores any information relating diff --git a/Doc/library/email.header.rst b/Doc/library/email.header.rst index 07152c224f2ff0..e093f138936b36 100644 --- a/Doc/library/email.header.rst +++ b/Doc/library/email.header.rst @@ -116,7 +116,7 @@ Here is the :class:`Header` class description: if *s* is a byte string. - .. method:: encode(splitchars=';, \\t', maxlinelen=None, linesep='\\n') + .. method:: encode(splitchars=';, \t', maxlinelen=None, linesep='\n') Encode a message header into an RFC-compliant format, possibly wrapping long lines and encapsulating non-ASCII parts in base64 or quoted-printable diff --git a/Doc/library/email.headerregistry.rst b/Doc/library/email.headerregistry.rst index 3e1d97a03264b2..98527cea43da2a 100644 --- a/Doc/library/email.headerregistry.rst +++ b/Doc/library/email.headerregistry.rst @@ -206,7 +206,7 @@ headers. The ``decoded`` value of the header will have all encoded words decoded to unicode. :class:`~encodings.idna` encoded domain names are also decoded to - unicode. The ``decoded`` value is set by :attr:`~str.join`\ ing the + unicode. The ``decoded`` value is set by :ref:`joining ` the :class:`str` value of the elements of the ``groups`` attribute with ``', '``. diff --git a/Doc/library/enum.rst b/Doc/library/enum.rst index 5829d4617893be..b1333c7dd5cf9d 100644 --- a/Doc/library/enum.rst +++ b/Doc/library/enum.rst @@ -126,11 +126,11 @@ Module Contents :func:`member` - Make `obj` a member. Can be used as a decorator. + Make ``obj`` a member. Can be used as a decorator. :func:`nonmember` - Do not make `obj` a member. Can be used as a decorator. + Do not make ``obj`` a member. Can be used as a decorator. .. versionadded:: 3.6 ``Flag``, ``IntFlag``, ``auto`` @@ -761,6 +761,10 @@ Utilities and Decorators ``_generate_next_value_`` can be overridden to customize the values used by *auto*. + .. note:: in 3.13 the default ``_generate_next_value_`` will always return + the highest member value incremented by 1, and will fail if any + member is an incompatible type. + .. decorator:: property A decorator similar to the built-in *property*, but specifically for diff --git a/Doc/library/fcntl.rst b/Doc/library/fcntl.rst index 1ecd552fbd02cd..784e7071c2b353 100644 --- a/Doc/library/fcntl.rst +++ b/Doc/library/fcntl.rst @@ -50,6 +50,12 @@ descriptor. constants, which allow to duplicate a file descriptor, the latter setting ``FD_CLOEXEC`` flag in addition. +.. versionchanged:: 3.12 + On Linux >= 4.5, the :mod:`fcntl` module exposes the ``FICLONE`` and + ``FICLONERANGE`` constants, which allow to share some data of one file with + another file by reflinking on some filesystems (e.g., btrfs, OCFS2, and + XFS). This behavior is commonly referred to as "copy-on-write". + The module defines the following functions: diff --git a/Doc/library/fractions.rst b/Doc/library/fractions.rst index c893f2d5389d57..5f0ecf1f135ea5 100644 --- a/Doc/library/fractions.rst +++ b/Doc/library/fractions.rst @@ -114,10 +114,10 @@ another rational number, or from a string. .. versionadded:: 3.8 - .. method:: from_float(flt) + .. classmethod:: from_float(flt) - This class method constructs a :class:`Fraction` representing the exact - value of *flt*, which must be a :class:`float`.
Beware that + Alternative constructor which only accepts instances of + :class:`float` or :class:`numbers.Integral`. Beware that ``Fraction.from_float(0.3)`` is not the same value as ``Fraction(3, 10)``. .. note:: @@ -126,10 +126,10 @@ another rational number, or from a string. :class:`Fraction` instance directly from a :class:`float`. - .. method:: from_decimal(dec) + .. classmethod:: from_decimal(dec) - This class method constructs a :class:`Fraction` representing the exact - value of *dec*, which must be a :class:`decimal.Decimal` instance. + Alternative constructor which only accepts instances of + :class:`decimal.Decimal` or :class:`numbers.Integral`. .. note:: diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst index 92b2c3bb53944d..da7de18722f87e 100644 --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -500,6 +500,7 @@ are always available. They are listed here in alphabetical order. yield n, elem n += 1 +.. _func-eval: .. function:: eval(expression[, globals[, locals]]) @@ -1392,7 +1393,7 @@ are always available. They are listed here in alphabetical order. supported. -.. function:: print(*objects, sep=' ', end='\\n', file=sys.stdout, flush=False) +.. function:: print(*objects, sep=' ', end='\n', file=sys.stdout, flush=False) Print *objects* to the text stream *file*, separated by *sep* and followed by *end*. *sep*, *end*, *file*, and *flush*, if present, must be given as keyword diff --git a/Doc/library/gzip.rst b/Doc/library/gzip.rst index 8cea2649ee6cb6..1a2582d6a904b2 100644 --- a/Doc/library/gzip.rst +++ b/Doc/library/gzip.rst @@ -165,6 +165,10 @@ The module defines the following items: .. versionchanged:: 3.6 Accepts a :term:`path-like object`. + .. versionchanged:: 3.12 + Remove the ``filename`` attribute, use the :attr:`~GzipFile.name` + attribute instead. + .. deprecated:: 3.9 Opening :class:`GzipFile` for writing without specifying the *mode* argument is deprecated. diff --git a/Doc/library/hashlib.rst b/Doc/library/hashlib.rst index 57f270c8d711cf..0906ce7cb7a2d9 100644 --- a/Doc/library/hashlib.rst +++ b/Doc/library/hashlib.rst @@ -668,7 +668,7 @@ function: hash function used in the protocol summarily stops this type of attack. (`The Skein Hash Function Family - `_, + `_, p. 21) BLAKE2 can be personalized by passing bytes to the *person* argument:: diff --git a/Doc/library/html.entities.rst b/Doc/library/html.entities.rst index 7d836fe7380245..10529561a92cd0 100644 --- a/Doc/library/html.entities.rst +++ b/Doc/library/html.entities.rst @@ -34,14 +34,14 @@ This module defines four dictionaries, :data:`html5`, .. data:: name2codepoint - A dictionary that maps HTML entity names to the Unicode code points. + A dictionary that maps HTML4 entity names to the Unicode code points. .. data:: codepoint2name - A dictionary that maps Unicode code points to HTML entity names. + A dictionary that maps Unicode code points to HTML4 entity names. .. rubric:: Footnotes -.. [#] See https://html.spec.whatwg.org/multipage/syntax.html#named-character-references +.. [#] See https://html.spec.whatwg.org/multipage/named-characters.html#named-character-references diff --git a/Doc/library/http.cookies.rst b/Doc/library/http.cookies.rst index 17792b200599bd..a2c1eb00d8b33d 100644 --- a/Doc/library/http.cookies.rst +++ b/Doc/library/http.cookies.rst @@ -93,7 +93,7 @@ Cookie Objects :meth:`value_decode` are inverses on the range of *value_decode*. -.. method:: BaseCookie.output(attrs=None, header='Set-Cookie:', sep='\\r\\n') +.. 
method:: BaseCookie.output(attrs=None, header='Set-Cookie:', sep='\r\n') Return a string representation suitable to be sent as HTTP headers. *attrs* and *header* are sent to each :class:`Morsel`'s :meth:`output` method. *sep* is used diff --git a/Doc/library/imghdr.rst b/Doc/library/imghdr.rst index c17bf897b9be91..318fe650776d2a 100644 --- a/Doc/library/imghdr.rst +++ b/Doc/library/imghdr.rst @@ -7,7 +7,7 @@ **Source code:** :source:`Lib/imghdr.py` -.. deprecated:: 3.11 +.. deprecated-removed:: 3.11 3.13 The :mod:`imghdr` module is deprecated (see :pep:`PEP 594 <594#imghdr>` for details and alternatives). diff --git a/Doc/library/imp.rst b/Doc/library/imp.rst index 121a730e0c9b4a..000793a7e66cae 100644 --- a/Doc/library/imp.rst +++ b/Doc/library/imp.rst @@ -7,7 +7,7 @@ **Source code:** :source:`Lib/imp.py` -.. deprecated:: 3.4 +.. deprecated-removed:: 3.4 3.12 The :mod:`imp` module is deprecated in favor of :mod:`importlib`. .. index:: statement: import diff --git a/Doc/library/importlib.metadata.rst b/Doc/library/importlib.metadata.rst index b3740178d6f9c7..a1af7a754ba4ef 100644 --- a/Doc/library/importlib.metadata.rst +++ b/Doc/library/importlib.metadata.rst @@ -13,13 +13,13 @@ **Source code:** :source:`Lib/importlib/metadata/__init__.py` -``importlib.metadata`` is a library that provides for access to installed -package metadata. Built in part on Python's import system, this library +``importlib.metadata`` is a library that provides access to installed +package metadata, such as its entry points or its +top-level name. Built in part on Python's import system, this library intends to replace similar functionality in the `entry point API`_ and `metadata API`_ of ``pkg_resources``. Along with -:mod:`importlib.resources` in Python 3.7 -and newer (backported as `importlib_resources`_ for older versions of -Python), this can eliminate the need to use the older and less efficient +:mod:`importlib.resources`, +this package can eliminate the need to use the older and less efficient ``pkg_resources`` package. By "installed package" we generally mean a third-party package installed into @@ -32,6 +32,13 @@ By default, package metadata can live on the file system or in zip archives on anywhere. +.. seealso:: + + https://importlib-metadata.readthedocs.io/ + The documentation for ``importlib_metadata``, which supplies a + backport of ``importlib.metadata``. + + Overview ======== @@ -54,9 +61,9 @@ You can get the version string for ``wheel`` by running the following: >>> version('wheel') # doctest: +SKIP '0.32.3' -You can also get the set of entry points keyed by group, such as +You can also get a collection of entry points selectable by properties of the EntryPoint (typically 'group' or 'name'), such as ``console_scripts``, ``distutils.commands`` and others. Each group contains a -sequence of :ref:`EntryPoint ` objects. +collection of :ref:`EntryPoint ` objects. You can get the :ref:`metadata for a distribution `:: @@ -91,7 +98,7 @@ Query all entry points:: >>> eps = entry_points() # doctest: +SKIP The ``entry_points()`` function returns an ``EntryPoints`` object, -a sequence of all ``EntryPoint`` objects with ``names`` and ``groups`` +a collection of all ``EntryPoint`` objects with ``names`` and ``groups`` attributes for convenience:: >>> sorted(eps.groups) # doctest: +SKIP @@ -136,7 +143,7 @@ Inspect the resolved entry point:: The ``group`` and ``name`` are arbitrary values defined by the package author and usually a client will wish to resolve all entry points for a particular group. 
Read `the setuptools docs -`_ +`_ for more information on entry points, their definition, and usage. *Compatibility Note* @@ -174,6 +181,13 @@ all the metadata in a JSON-compatible form per :PEP:`566`:: >>> wheel_metadata.json['requires_python'] '>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*' +.. note:: + + The actual type of the object returned by ``metadata()`` is an + implementation detail and should be accessed only through the interface + described by the + `PackageMetadata protocol `. + .. versionchanged:: 3.10 The ``Description`` is now included in the metadata when presented through the payload. Line continuation characters have been removed. @@ -295,6 +309,15 @@ The full set of available metadata is not described here. See :pep:`566` for additional details. +Distribution Discovery +====================== + +By default, this package provides built-in support for discovery of metadata for file system and zip file packages. This metadata finder search defaults to ``sys.path``, but varies slightly in how it interprets those values from how other import machinery does. In particular: + +- ``importlib.metadata`` does not honor :class:`bytes` objects on ``sys.path``. +- ``importlib.metadata`` will incidentally honor :py:class:`pathlib.Path` objects on ``sys.path`` even though such values will be ignored for imports. + + Extending the search algorithm ============================== diff --git a/Doc/library/importlib.resources.rst b/Doc/library/importlib.resources.rst index f62d15dd6fdc9e..b88c9882de44b8 100644 --- a/Doc/library/importlib.resources.rst +++ b/Doc/library/importlib.resources.rst @@ -4,7 +4,7 @@ .. module:: importlib.resources :synopsis: Package resource reading, opening, and access -**Source code:** :source:`Lib/importlib/resources.py` +**Source code:** :source:`Lib/importlib/resources/__init__.py` -------------- diff --git a/Doc/library/importlib.rst b/Doc/library/importlib.rst index 0241fb30b6efb2..aac556e2c68d94 100644 --- a/Doc/library/importlib.rst +++ b/Doc/library/importlib.rst @@ -414,8 +414,8 @@ ABC hierarchy:: .. versionadded:: 3.4 - .. versionchanged:: 3.5 - Starting in Python 3.6, this method will not be optional when + .. versionchanged:: 3.6 + This method is no longer optional when :meth:`exec_module` is defined. .. method:: exec_module(module) @@ -1250,6 +1250,9 @@ Checking if a module can be imported If you need to find out if a module can be imported without actually doing the import, then you should use :func:`importlib.util.find_spec`. + +Note that if ``name`` is a submodule (contains a dot), +:func:`importlib.util.find_spec` will import the parent module. :: import importlib.util @@ -1273,8 +1276,7 @@ import, then you should use :func:`importlib.util.find_spec`. Importing a source file directly '''''''''''''''''''''''''''''''' -To import a Python source file directly, use the following recipe -(Python 3.5 and newer only):: +To import a Python source file directly, use the following recipe:: import importlib.util import sys @@ -1355,9 +1357,7 @@ Import itself is implemented in Python code, making it possible to expose most of the import machinery through importlib. The following helps illustrate the various APIs that importlib exposes by providing an approximate implementation of -:func:`importlib.import_module` (Python 3.4 and newer for the importlib usage, -Python 3.6 and newer for other parts of the code). 
-:: +:func:`importlib.import_module`:: import importlib.util import sys diff --git a/Doc/library/inspect.rst b/Doc/library/inspect.rst index 575b3088900e13..154d0f5dab0cd1 100644 --- a/Doc/library/inspect.rst +++ b/Doc/library/inspect.rst @@ -512,6 +512,7 @@ Retrieving source code If the documentation string for an object is not provided and the object is a class, a method, a property or a descriptor, retrieve the documentation string from the inheritance hierarchy. + Return ``None`` if the documentation string is invalid or missing. .. versionchanged:: 3.5 Documentation strings are now inherited if not overridden. @@ -535,12 +536,14 @@ Retrieving source code .. function:: getmodule(object) - Try to guess which module an object was defined in. + Try to guess which module an object was defined in. Return ``None`` + if the module cannot be determined. .. function:: getsourcefile(object) - Return the name of the Python source file in which an object was defined. This + Return the name of the Python source file in which an object was defined + or ``None`` if no way can be identified to get the source. This will fail with a :exc:`TypeError` if the object is a built-in module, class, or function. diff --git a/Doc/library/io.rst b/Doc/library/io.rst index d8e7b1621e2ea8..753e6c1e3b9b46 100644 --- a/Doc/library/io.rst +++ b/Doc/library/io.rst @@ -1042,7 +1042,7 @@ Text I/O The method supports ``encoding="locale"`` option. -.. class:: StringIO(initial_value='', newline='\\n') +.. class:: StringIO(initial_value='', newline='\n') A text stream using an in-memory text buffer. It inherits :class:`TextIOBase`. diff --git a/Doc/library/itertools.rst b/Doc/library/itertools.rst index 26c9253cb7f5f6..416c4eca5eb48a 100644 --- a/Doc/library/itertools.rst +++ b/Doc/library/itertools.rst @@ -992,12 +992,7 @@ which incur interpreter overhead. "Equivalent to list(combinations(iterable, r))[index]" pool = tuple(iterable) n = len(pool) - if r < 0 or r > n: - raise ValueError - c = 1 - k = min(r, n-r) - for i in range(1, k+1): - c = c * (n - k + i) // i + c = math.comb(n, r) if index < 0: index += c if index < 0 or index >= c: @@ -1071,6 +1066,7 @@ which incur interpreter overhead. >>> import operator >>> import collections + >>> import math >>> take(10, count()) [0, 1, 2, 3, 4, 5, 6, 7, 8, 9] diff --git a/Doc/library/locale.rst b/Doc/library/locale.rst index 77a3e036841baa..112f0bae78daf9 100644 --- a/Doc/library/locale.rst +++ b/Doc/library/locale.rst @@ -375,6 +375,8 @@ The :mod:`locale` module defines the following exception and functions: The default setting is determined by calling :func:`getdefaultlocale`. *category* defaults to :const:`LC_ALL`. + .. deprecated-removed:: 3.11 3.13 + .. function:: strcoll(string1, string2) diff --git a/Doc/library/logging.config.rst b/Doc/library/logging.config.rst index 2f105ba29441da..9f9361a2fd569e 100644 --- a/Doc/library/logging.config.rst +++ b/Doc/library/logging.config.rst @@ -661,6 +661,76 @@ it with :func:`staticmethod`. For example:: You don't need to wrap with :func:`staticmethod` if you're setting the import callable on a configurator *instance*. +.. _configure-queue: + +Configuring QueueHandler and QueueListener +"""""""""""""""""""""""""""""""""""""""""" + +If you want to configure a :class:`~logging.handlers.QueueHandler`, noting that this +is normally used in conjunction with a :class:`~logging.handlers.QueueListener`, you +can configure both together.
After the configuration, the ``QueueListener`` instance +will be available as the :attr:`~logging.handlers.QueueHandler.listener` attribute of +the created handler, and that in turn will be available to you using +:func:`~logging.getHandlerByName` and passing the name you have used for the +``QueueHandler`` in your configuration. The dictionary schema for configuring the pair +is shown in the example YAML snippet below. + +.. code-block:: yaml + + handlers: + qhand: + class: logging.handlers.QueueHandler + queue: my.module.queue_factory + listener: my.package.CustomListener + handlers: + - hand_name_1 + - hand_name_2 + ... + +The ``queue`` and ``listener`` keys are optional. + +If the ``queue`` key is present, the corresponding value can be one of the following: + +* An actual instance of :class:`queue.Queue` or a subclass thereof. This is of course + only possible if you are constructing or modifying the configuration dictionary in + code. + +* A string that resolves to a callable which, when called with no arguments, returns + the :class:`queue.Queue` instance to use. That callable could be a + :class:`queue.Queue` subclass or a function which returns a suitable queue instance, + such as ``my.module.queue_factory()``. + +* A dict with a ``'()'`` key which is constructed in the usual way as discussed in + :ref:`logging-config-dict-userdef`. The result of this construction should be a + :class:`queue.Queue` instance. + +If the ``queue`` key is absent, a standard unbounded :class:`queue.Queue` instance is +created and used. + +If the ``listener`` key is present, the corresponding value can be one of the following: + +* A subclass of :class:`logging.handlers.QueueListener`. This is of course only + possible if you are constructing or modifying the configuration dictionary in + code. + +* A string which resolves to a class which is a subclass of ``QueueListener``, such as + ``'my.package.CustomListener'``. + +* A dict with a ``'()'`` key which is constructed in the usual way as discussed in + :ref:`logging-config-dict-userdef`. The result of this construction should be a + callable with the same signature as the ``QueueListener`` initializer. + +If the ``listener`` key is absent, :class:`logging.handlers.QueueListener` is used. + +The values under the ``handlers`` key are the names of other handlers in the +configuration (not shown in the above snippet) which will be passed to the queue +listener. + +Any custom queue handler and listener classes will need to be defined with the same +initialization signatures as :class:`~logging.handlers.QueueHandler` and +:class:`~logging.handlers.QueueListener`. + +.. versionadded:: 3.12 .. _logging-config-fileformat: @@ -716,7 +786,7 @@ root logger section is given below. The ``level`` entry can be one of ``DEBUG, INFO, WARNING, ERROR, CRITICAL`` or ``NOTSET``. For the root logger only, ``NOTSET`` means that all messages will be -logged. Level values are :func:`eval`\ uated in the context of the ``logging`` +logged. Level values are :ref:`evaluated ` in the context of the ``logging`` package's namespace. The ``handlers`` entry is a comma-separated list of handler names, which must @@ -763,13 +833,13 @@ handler. If blank, a default formatter (``logging._defaultFormatter``) is used. If a name is specified, it must appear in the ``[formatters]`` section and have a corresponding section in the configuration file. 
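As a companion to the ``Configuring QueueHandler and QueueListener`` schema above, a rough sketch of the same configuration expressed as a :func:`~logging.config.dictConfig` dictionary; the handler names and level are illustrative, and the optional ``queue`` and ``listener`` keys are omitted so the documented defaults apply::

    import logging
    import logging.config

    config = {
        "version": 1,
        "handlers": {
            "console": {"class": "logging.StreamHandler"},
            "queued": {
                # QueueHandler/QueueListener pair; with no "queue" or "listener"
                # keys, an unbounded queue.Queue and the stock QueueListener are used.
                "class": "logging.handlers.QueueHandler",
                "handlers": ["console"],
            },
        },
        "root": {"level": "INFO", "handlers": ["queued"]},
    }

    logging.config.dictConfig(config)

    # As described above, the listener is reachable through the created handler.
    listener = logging.getHandlerByName("queued").listener

The listener would then be started before the application emits records and stopped at shutdown, as with any :class:`~logging.handlers.QueueListener`.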
-The ``args`` entry, when :func:`eval`\ uated in the context of the ``logging`` +The ``args`` entry, when :ref:`evaluated ` in the context of the ``logging`` package's namespace, is the list of arguments to the constructor for the handler class. Refer to the constructors for the relevant handlers, or to the examples below, to see how typical entries are constructed. If not provided, it defaults to ``()``. -The optional ``kwargs`` entry, when :func:`eval`\ uated in the context of the +The optional ``kwargs`` entry, when :ref:`evaluated ` in the context of the ``logging`` package's namespace, is the keyword argument dict to the constructor for the handler class. If not provided, it defaults to ``{}``. diff --git a/Doc/library/logging.handlers.rst b/Doc/library/logging.handlers.rst index f5ef80ea044c66..4bdd5508c67c4e 100644 --- a/Doc/library/logging.handlers.rst +++ b/Doc/library/logging.handlers.rst @@ -1051,7 +1051,13 @@ possible, while any potentially slow operations (such as sending an email via want to override this if you want to use blocking behaviour, or a timeout, or a customized queue implementation. + .. attribute:: listener + When created via configuration using :func:`~logging.config.dictConfig`, this + attribute will contain a :class:`QueueListener` instance for use with this + handler. Otherwise, it will be ``None``. + + .. versionadded:: 3.12 .. _queue-listener: diff --git a/Doc/library/logging.rst b/Doc/library/logging.rst index b82b90b47dd160..51b5f86e69efe5 100644 --- a/Doc/library/logging.rst +++ b/Doc/library/logging.rst @@ -30,9 +30,17 @@ is that all Python modules can participate in logging, so your application log can include your own messages integrated with messages from third-party modules. +The simplest example: + +.. code-block:: none + + >>> import logging + >>> logging.warning('Watch out!') + WARNING:root:Watch out! + The module provides a lot of functionality and flexibility. If you are -unfamiliar with logging, the best way to get to grips with it is to see the -tutorials (see the links on the right). +unfamiliar with logging, the best way to get to grips with it is to view the +tutorials (**see the links above and on the right**). The basic classes defined by the module, together with their functions, are listed below. @@ -242,6 +250,10 @@ is the module's name in the Python package namespace. above example). In such circumstances, it is likely that specialized :class:`Formatter`\ s would be used with particular :class:`Handler`\ s. + If no handler is attached to this logger (or any of its ancestors, + taking into account the relevant :attr:`Logger.propagate` attributes), + the message will be sent to the handler set on :attr:`lastResort`. + .. versionchanged:: 3.2 The *stack_info* parameter was added. @@ -662,9 +674,10 @@ empty string, all events are passed. .. method:: filter(record) - Is the specified record to be logged? Returns zero for no, nonzero for - yes. If deemed appropriate, the record may be modified in-place by this - method. + Is the specified record to be logged? Returns falsy for no, truthy for + yes. Filters can either modify log records in-place or return a completely + different record instance which will replace the original + log record in any future processing of the event. Note that filters attached to handlers are consulted before an event is emitted by the handler, whereas filters attached to loggers are consulted @@ -686,6 +699,12 @@ which has a ``filter`` method with the same semantics. parameter. 
The returned value should conform to that returned by :meth:`~Filter.filter`. +.. versionchanged:: 3.12 + You can now return a :class:`LogRecord` instance from filters to replace + the log record rather than modifying it in place. This allows filters attached to + a :class:`Handler` to modify the log record before it is emitted, without + having side effects on other handlers. + Although filters are used primarily to filter records based on more sophisticated criteria than levels, they get to see every record which is processed by the handler or logger they're attached to: this can be useful if @@ -868,10 +887,14 @@ the options available to you. +----------------+-------------------------+-----------------------------------------------+ | threadName | ``%(threadName)s`` | Thread name (if available). | +----------------+-------------------------+-----------------------------------------------+ +| taskName | ``%(taskName)s`` | :class:`asyncio.Task` name (if available). | ++----------------+-------------------------+-----------------------------------------------+ .. versionchanged:: 3.1 *processName* was added. +.. versionchanged:: 3.12 + *taskName* was added. .. _logger-adapter: @@ -1038,6 +1061,10 @@ functions. above example). In such circumstances, it is likely that specialized :class:`Formatter`\ s would be used with particular :class:`Handler`\ s. + This function (as well as :func:`info`, :func:`warning`, :func:`error` and + :func:`critical`) will call :func:`basicConfig` if the root logger doesn't + have any handler attached. + .. versionchanged:: 3.2 The *stack_info* parameter was added. @@ -1152,6 +1179,19 @@ functions. This undocumented behaviour was considered a mistake, and was removed in Python 3.4, but reinstated in 3.4.2 due to retain backward compatibility. +.. function:: getHandlerByName(name) + + Returns a handler with the specified *name*, or ``None`` if there is no handler + with that name. + + .. versionadded:: 3.12 + +.. function:: getHandlerNames() + + Returns an immutable set of all known handler names. + + .. versionadded:: 3.12 + .. function:: makeLogRecord(attrdict) Creates and returns a new :class:`LogRecord` instance whose attributes are diff --git a/Doc/library/mailcap.rst b/Doc/library/mailcap.rst index e2e5bb34456373..bfaedb46091991 100644 --- a/Doc/library/mailcap.rst +++ b/Doc/library/mailcap.rst @@ -7,7 +7,7 @@ **Source code:** :source:`Lib/mailcap.py` -.. deprecated:: 3.11 +.. deprecated-removed:: 3.11 3.13 The :mod:`mailcap` module is deprecated (see :pep:`PEP 594 <594#mailcap>` for details). The :mod:`mimetypes` module provides an alternative. @@ -60,6 +60,18 @@ standard. However, mailcap files are supported on most Unix systems. use) to determine whether or not the mailcap line applies. :func:`findmatch` will automatically check such conditions and skip the entry if the check fails. + .. versionchanged:: 3.11 + + To prevent security issues with shell metacharacters (symbols that have + special effects in a shell command line), ``findmatch`` will refuse + to inject ASCII characters other than alphanumerics and ``@+=:,./-_`` + into the returned command line. + + If a disallowed character appears in *filename*, ``findmatch`` will always + return ``(None, None)`` as if no entry was found. + If such a character appears elsewhere (a value in *plist* or in *MIMEtype*), + ``findmatch`` will ignore all mailcap entries which use that value. + A :mod:`warning ` will be raised in either case. .. 
function:: getcaps() diff --git a/Doc/library/math.rst b/Doc/library/math.rst index 5efcc7f6a6efc4..6d9a992f731316 100644 --- a/Doc/library/math.rst +++ b/Doc/library/math.rst @@ -578,7 +578,7 @@ Special functions The :func:`erf` function can be used to compute traditional statistical functions such as the `cumulative standard normal distribution - `_:: + `_:: def phi(x): 'Cumulative distribution function for the standard normal distribution' diff --git a/Doc/library/mmap.rst b/Doc/library/mmap.rst index d19580cd7ee5ce..36c15e9d1d92d1 100644 --- a/Doc/library/mmap.rst +++ b/Doc/library/mmap.rst @@ -102,7 +102,7 @@ To map anonymous memory, -1 should be passed as the fileno along with the length To ensure validity of the created memory mapping the file specified by the descriptor *fileno* is internally automatically synchronized - with physical backing store on macOS and OpenVMS. + with the physical backing store on macOS. This example shows a simple way of using :class:`~mmap.mmap`:: diff --git a/Doc/library/msilib.rst b/Doc/library/msilib.rst index 1ee8130801cba5..fbe55db9372300 100644 --- a/Doc/library/msilib.rst +++ b/Doc/library/msilib.rst @@ -13,7 +13,7 @@ .. index:: single: msi -.. deprecated:: 3.11 +.. deprecated-removed:: 3.11 3.13 The :mod:`msilib` module is deprecated (see :pep:`PEP 594 <594#msilib>` for details). diff --git a/Doc/library/multiprocessing.rst b/Doc/library/multiprocessing.rst index 70802ee1fdecb9..2a66b0f65c0882 100644 --- a/Doc/library/multiprocessing.rst +++ b/Doc/library/multiprocessing.rst @@ -1666,6 +1666,7 @@ different machines. A manager object controls a server process which manages proxies. .. function:: multiprocessing.Manager() + :module: Returns a started :class:`~multiprocessing.managers.SyncManager` object which can be used for sharing objects between processes. The returned manager diff --git a/Doc/library/multiprocessing.shared_memory.rst b/Doc/library/multiprocessing.shared_memory.rst index 2ba42b7e579a77..127a82d47aa195 100644 --- a/Doc/library/multiprocessing.shared_memory.rst +++ b/Doc/library/multiprocessing.shared_memory.rst @@ -1,5 +1,5 @@ -:mod:`multiprocessing.shared_memory` --- Provides shared memory for direct access across processes -=================================================================================================== +:mod:`multiprocessing.shared_memory` --- Shared memory for direct access across processes +========================================================================================= .. module:: multiprocessing.shared_memory :synopsis: Provides shared memory for direct access across processes. diff --git a/Doc/library/nis.rst b/Doc/library/nis.rst index 49fe62954cce8a..fd3c3d9293d247 100644 --- a/Doc/library/nis.rst +++ b/Doc/library/nis.rst @@ -10,7 +10,7 @@ .. moduleauthor:: Fred Gansevles .. sectionauthor:: Moshe Zadka -.. deprecated:: 3.11 +.. deprecated-removed:: 3.11 3.13 The :mod:`nis` module is deprecated (see :pep:`PEP 594 <594#nis>` for details). diff --git a/Doc/library/os.path.rst b/Doc/library/os.path.rst index ce7913e3712d73..85989ef32d4911 100644 --- a/Doc/library/os.path.rst +++ b/Doc/library/os.path.rst @@ -469,7 +469,7 @@ the :mod:`glob` module.) 
("c:", "/dir") If the path contains a UNC path, drive will contain the host name - and share, up to but not including the fourth separator:: + and share:: >>> splitdrive("//host/computer/dir") ("//host/computer", "/dir") diff --git a/Doc/library/os.rst b/Doc/library/os.rst index 3c189bb40e234c..baf89bf8b56aac 100644 --- a/Doc/library/os.rst +++ b/Doc/library/os.rst @@ -798,18 +798,32 @@ as internal buffering of data. Copy *count* bytes from file descriptor *src*, starting from offset *offset_src*, to file descriptor *dst*, starting from offset *offset_dst*. If *offset_src* is None, then *src* is read from the current position; - respectively for *offset_dst*. The files pointed by *src* and *dst* + respectively for *offset_dst*. + + In Linux kernel older than 5.3, the files pointed by *src* and *dst* must reside in the same filesystem, otherwise an :exc:`OSError` is raised with :attr:`~OSError.errno` set to :data:`errno.EXDEV`. This copy is done without the additional cost of transferring data from the kernel to user space and then back into the kernel. Additionally, - some filesystems could implement extra optimizations. The copy is done as if - both files are opened as binary. + some filesystems could implement extra optimizations, such as the use of + reflinks (i.e., two or more inodes that share pointers to the same + copy-on-write disk blocks; supported file systems include btrfs and XFS) + and server-side copy (in the case of NFS). + + The function copies bytes between two file descriptors. Text options, like + the encoding and the line ending, are ignored. The return value is the amount of bytes copied. This could be less than the amount requested. + .. note:: + + On Linux, :func:`os.copy_file_range` should not be used for copying a + range of a pseudo file from a special filesystem like procfs and sysfs. + It will always copy no bytes and return 0 as if the file was empty + because of a known Linux kernel issue. + .. availability:: Linux kernel >= 4.5 or glibc >= 2.27. .. versionadded:: 3.8 @@ -2328,7 +2342,7 @@ features: :exc:`IsADirectoryError` or a :exc:`NotADirectoryError` will be raised respectively. If both are directories and *dst* is empty, *dst* will be silently replaced. If *dst* is a non-empty directory, an :exc:`OSError` - is raised. If both are files, *dst* it will be replaced silently if the user + is raised. If both are files, *dst* will be replaced silently if the user has permission. The operation may fail on some Unix flavors if *src* and *dst* are on different filesystems. If successful, the renaming will be an atomic operation (this is a POSIX requirement). @@ -3897,16 +3911,25 @@ written in Python, such as a mail server's external command delivery program. .. function:: pidfd_open(pid, flags=0) - Return a file descriptor referring to the process *pid*. This descriptor can - be used to perform process management without races and signals. The *flags* - argument is provided for future extensions; no flag values are currently - defined. + Return a file descriptor referring to the process *pid* with *flags* set. + This descriptor can be used to perform process management without races + and signals. See the :manpage:`pidfd_open(2)` man page for more details. .. availability:: Linux 5.3+ .. versionadded:: 3.9 + .. data:: PIDFD_NONBLOCK + + This flag indicates that the file descriptor will be non-blocking. 
+ If the process referred to by the file descriptor has not yet terminated, + then an attempt to wait on the file descriptor using :manpage:`waitid(2)` + will immediately return the error :data:`~errno.EAGAIN` rather than blocking. + + .. availability:: Linux 5.10+ + .. versionadded:: 3.12 + .. function:: plock(op) @@ -3916,13 +3939,13 @@ written in Python, such as a mail server's external command delivery program. .. availability:: Unix. -.. function:: popen(cmd, mode='r', buffering=-1, encoding=None) +.. function:: popen(cmd, mode='r', buffering=-1) Open a pipe to or from command *cmd*. The return value is an open file object connected to the pipe, which can be read or written depending on whether *mode* is ``'r'`` (default) or ``'w'``. - The *buffering* and *encoding* arguments have the same meaning as + The *buffering* argument has the same meaning as the corresponding argument to the built-in :func:`open` function. The returned file object reads or writes text strings rather than bytes. @@ -3945,8 +3968,13 @@ written in Python, such as a mail server's external command delivery program. documentation for more powerful ways to manage and communicate with subprocesses. - .. versionchanged:: 3.11 - Added the *encoding* parameter. + .. note:: + The :ref:`Python UTF-8 Mode ` affects encodings used + for *cmd* and pipe contents. + + :func:`popen` is a simple wrapper around :class:`subprocess.Popen`. + Use :class:`subprocess.Popen` or :func:`subprocess.run` to + control options like encodings. .. function:: posix_spawn(path, argv, env, *, file_actions=None, \ @@ -4290,7 +4318,7 @@ written in Python, such as a mail server's external command delivery program. :attr:`!children_system`, and :attr:`!elapsed` in that order. See the Unix manual page - :manpage:`times(2)` and :manpage:`times(3)` manual page on Unix or `the GetProcessTimes MSDN + :manpage:`times(2)` and `times(3) `_ manual page on Unix or `the GetProcessTimes MSDN `_ on Windows. On Windows, only :attr:`!user` and :attr:`!system` are known; the other attributes are zero. diff --git a/Doc/library/ossaudiodev.rst b/Doc/library/ossaudiodev.rst index 728ee3036057dd..e14c1bf8d5367e 100644 --- a/Doc/library/ossaudiodev.rst +++ b/Doc/library/ossaudiodev.rst @@ -6,7 +6,7 @@ :synopsis: Access to OSS-compatible audio devices. :deprecated: -.. deprecated:: 3.11 +.. deprecated-removed:: 3.11 3.13 The :mod:`ossaudiodev` module is deprecated (see :pep:`PEP 594 <594#ossaudiodev>` for details). diff --git a/Doc/library/pathlib.rst b/Doc/library/pathlib.rst index d45e7aa84b28ce..2b798698385fd1 100644 --- a/Doc/library/pathlib.rst +++ b/Doc/library/pathlib.rst @@ -133,11 +133,13 @@ we also call *flavours*: PureWindowsPath('c:/Program Files') Spurious slashes and single dots are collapsed, but double dots (``'..'``) - are not, since this would change the meaning of a path in the face of - symbolic links:: + and leading double slashes (``'//'``) are not, since this would change the + meaning of a path for various reasons (e.g. symbolic links, UNC paths):: >>> PurePath('foo//bar') PurePosixPath('foo/bar') + >>> PurePath('//foo/bar') + PurePosixPath('//foo/bar') >>> PurePath('foo/./bar') PurePosixPath('foo/bar') >>> PurePath('foo/../bar') @@ -166,13 +168,17 @@ we also call *flavours*: ..
class:: PureWindowsPath(*pathsegments) A subclass of :class:`PurePath`, this path flavour represents Windows - filesystem paths:: + filesystem paths, including `UNC paths`_:: >>> PureWindowsPath('c:/Program Files/') PureWindowsPath('c:/Program Files') + >>> PureWindowsPath('//server/share/file') + PureWindowsPath('//server/share/file') *pathsegments* is specified similarly to :class:`PurePath`. + .. _unc paths: https://en.wikipedia.org/wiki/Path_(computing)#UNC + Regardless of the system you're running on, you can instantiate all of these classes, since they don't provide any operation that does system calls. @@ -309,6 +315,26 @@ Pure paths provide the following methods and properties: >>> PureWindowsPath('//host/share').root '\\' + If the path starts with more than two successive slashes, + :class:`~pathlib.PurePosixPath` collapses them:: + + >>> PurePosixPath('//etc').root + '//' + >>> PurePosixPath('///etc').root + '/' + >>> PurePosixPath('////etc').root + '/' + + .. note:: + + This behavior conforms to *The Open Group Base Specifications Issue 6*, + paragraph `4.11 Pathname Resolution + `_: + + *"A pathname that begins with two successive slashes may be interpreted in + an implementation-defined manner, although more than two leading slashes + shall be treated as a single slash."* + .. data:: PurePath.anchor The concatenation of the drive and root:: @@ -1021,8 +1047,9 @@ call fails (for example because the path doesn't exist). Rename this file or directory to the given *target*, and return a new Path instance pointing to *target*. On Unix, if *target* exists and is a file, - it will be replaced silently if the user has permission. *target* can be - either a string or another path object:: + it will be replaced silently if the user has permission. + On Windows, if *target* exists, :exc:`FileExistsError` will be raised. + *target* can be either a string or another path object:: >>> p = Path('foo') >>> p.open('w').write('some text') @@ -1264,7 +1291,7 @@ Below is a table mapping various :mod:`os` functions to their corresponding :func:`os.link` :meth:`Path.hardlink_to` :func:`os.symlink` :meth:`Path.symlink_to` :func:`os.readlink` :meth:`Path.readlink` -:func:`os.path.relpath` :meth:`Path.relative_to` [#]_ +:func:`os.path.relpath` :meth:`PurePath.relative_to` [#]_ :func:`os.stat` :meth:`Path.stat`, :meth:`Path.owner`, :meth:`Path.group` @@ -1279,4 +1306,4 @@ Below is a table mapping various :mod:`os` functions to their corresponding .. rubric:: Footnotes .. [#] :func:`os.path.abspath` normalizes the resulting path, which may change its meaning in the presence of symlinks, while :meth:`Path.absolute` does not. -.. [#] :meth:`Path.relative_to` requires ``self`` to be the subpath of the argument, but :func:`os.path.relpath` does not. +.. [#] :meth:`PurePath.relative_to` requires ``self`` to be the subpath of the argument, but :func:`os.path.relpath` does not. 
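A short illustration of the slash-handling rules documented above for :class:`~pathlib.PurePosixPath` and :class:`~pathlib.PureWindowsPath`; the values noted in the comments simply restate the documented behaviour::

    from pathlib import PurePosixPath, PureWindowsPath

    # Exactly two leading slashes are kept; three or more collapse to one.
    print(PurePosixPath('//etc').root)    # prints //
    print(PurePosixPath('///etc').root)   # prints /

    # A UNC prefix is treated as the drive, so it survives unchanged.
    unc = PureWindowsPath('//server/share/file')
    print(unc.drive)                      # prints \\server\share
    print(unc.name)                       # prints file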
diff --git a/Doc/library/pickle.rst b/Doc/library/pickle.rst index fa14b64027e8fe..41b0f48f4611da 100644 --- a/Doc/library/pickle.rst +++ b/Doc/library/pickle.rst @@ -502,10 +502,10 @@ The following types can be pickled: * tuples, lists, sets, and dictionaries containing only picklable objects; -* functions (built-in and user-defined) defined at the top level of a module - (using :keyword:`def`, not :keyword:`lambda`); +* functions (built-in and user-defined) accessible from the top level of a + module (using :keyword:`def`, not :keyword:`lambda`); -* classes defined at the top level of a module; +* classes accessible from the top level of a module; * instances of such classes whose the result of calling :meth:`__getstate__` is picklable (see section :ref:`pickle-inst` for details). @@ -517,9 +517,9 @@ structure may exceed the maximum recursion depth, a :exc:`RecursionError` will b raised in this case. You can carefully raise this limit with :func:`sys.setrecursionlimit`. -Note that functions (built-in and user-defined) are pickled by fully qualified -name, not by value. [#]_ This means that only the function name is -pickled, along with the name of the module the function is defined in. Neither +Note that functions (built-in and user-defined) are pickled by fully +:term:`qualified name`, not by value. [#]_ This means that only the function name is +pickled, along with the name of the containing module and classes. Neither the function's code, nor any of its function attributes are pickled. Thus the defining module must be importable in the unpickling environment, and the module must contain the named object, otherwise an exception will be raised. [#]_ diff --git a/Doc/library/pipes.rst b/Doc/library/pipes.rst index 1c5bb8bddaafbc..245dd0d2520881 100644 --- a/Doc/library/pipes.rst +++ b/Doc/library/pipes.rst @@ -10,7 +10,7 @@ **Source code:** :source:`Lib/pipes.py` -.. deprecated:: 3.11 +.. deprecated-removed:: 3.11 3.13 The :mod:`pipes` module is deprecated (see :pep:`PEP 594 <594#pipes>` for details). Please use the :mod:`subprocess` module instead. diff --git a/Doc/library/platform.rst b/Doc/library/platform.rst index 346063d66afab9..dc2d871b47d5ef 100644 --- a/Doc/library/platform.rst +++ b/Doc/library/platform.rst @@ -53,7 +53,7 @@ Cross Platform .. function:: machine() - Returns the machine type, e.g. ``'i386'``. An empty string is returned if the + Returns the machine type, e.g. ``'AMD64'``. An empty string is returned if the value cannot be determined. diff --git a/Doc/library/posix.rst b/Doc/library/posix.rst index ad417a17879c1f..90be191aa2f8d7 100644 --- a/Doc/library/posix.rst +++ b/Doc/library/posix.rst @@ -37,7 +37,7 @@ Large File Support .. sectionauthor:: Steve Clift -Several operating systems (including AIX, HP-UX and Solaris) provide +Several operating systems (including AIX and Solaris) provide support for files that are larger than 2 GiB from a C programming model where :c:type:`int` and :c:type:`long` are 32-bit values. This is typically accomplished by defining the relevant size and offset types as 64-bit values. Such files are diff --git a/Doc/library/re.rst b/Doc/library/re.rst index 39e7d23aaf9c93..1b9a7b63ef5e1b 100644 --- a/Doc/library/re.rst +++ b/Doc/library/re.rst @@ -7,7 +7,7 @@ .. moduleauthor:: Fredrik Lundh .. sectionauthor:: Andrew M. Kuchling -**Source code:** :source:`Lib/re.py` +**Source code:** :source:`Lib/re/` -------------- @@ -667,40 +667,14 @@ functions are simplified versions of the full featured methods for compiled regular expressions. 
Most non-trivial applications always use the compiled form. + +Flags +^^^^^ + .. versionchanged:: 3.6 Flag constants are now instances of :class:`RegexFlag`, which is a subclass of :class:`enum.IntFlag`. -.. function:: compile(pattern, flags=0) - - Compile a regular expression pattern into a :ref:`regular expression object - `, which can be used for matching using its - :func:`~Pattern.match`, :func:`~Pattern.search` and other methods, described - below. - - The expression's behaviour can be modified by specifying a *flags* value. - Values can be any of the following variables, combined using bitwise OR (the - ``|`` operator). - - The sequence :: - - prog = re.compile(pattern) - result = prog.match(string) - - is equivalent to :: - - result = re.match(pattern, string) - - but using :func:`re.compile` and saving the resulting regular expression - object for reuse is more efficient when the expression will be used several - times in a single program. - - .. note:: - - The compiled versions of the most recent patterns passed to - :func:`re.compile` and the module-level matching functions are cached, so - programs that use only a few regular expressions at a time needn't worry - about compiling regular expressions. .. class:: RegexFlag @@ -825,6 +799,41 @@ form. Corresponds to the inline flag ``(?x)``. +Functions +^^^^^^^^^ + +.. function:: compile(pattern, flags=0) + + Compile a regular expression pattern into a :ref:`regular expression object + `, which can be used for matching using its + :func:`~Pattern.match`, :func:`~Pattern.search` and other methods, described + below. + + The expression's behaviour can be modified by specifying a *flags* value. + Values can be any of the following variables, combined using bitwise OR (the + ``|`` operator). + + The sequence :: + + prog = re.compile(pattern) + result = prog.match(string) + + is equivalent to :: + + result = re.match(pattern, string) + + but using :func:`re.compile` and saving the resulting regular expression + object for reuse is more efficient when the expression will be used several + times in a single program. + + .. note:: + + The compiled versions of the most recent patterns passed to + :func:`re.compile` and the module-level matching functions are cached, so + programs that use only a few regular expressions at a time needn't worry + about compiling regular expressions. + + .. function:: search(pattern, string, flags=0) Scan through *string* looking for the first location where the regular expression @@ -1061,6 +1070,9 @@ form. Clear the regular expression cache. +Exceptions +^^^^^^^^^^ + .. exception:: error(msg, pattern=None, pos=None) Exception raised when a string passed to one of the functions here is not a @@ -1315,6 +1327,14 @@ Match objects support the following methods and attributes: >>> m[2] # The second parenthesized subgroup. 'Newton' + Named groups are supported as well:: + + >>> m = re.match(r"(?P\w+) (?P\w+)", "Isaac Newton") + >>> m['first_name'] + 'Isaac' + >>> m['last_name'] + 'Newton' + .. versionadded:: 3.6 diff --git a/Doc/library/secrets.rst b/Doc/library/secrets.rst index c22da727b55c9b..eda4616da5ec91 100644 --- a/Doc/library/secrets.rst +++ b/Doc/library/secrets.rst @@ -129,7 +129,7 @@ Other functions .. function:: compare_digest(a, b) Return ``True`` if strings *a* and *b* are equal, otherwise ``False``, - in such a way as to reduce the risk of + using a "constant-time compare" to reduce the risk of `timing attacks `_. See :func:`hmac.compare_digest` for additional details. 
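A small usage sketch of the constant-time comparison described above; the token handling is illustrative only::

   import secrets

   expected_token = secrets.token_hex(16)   # e.g. generated once and stored server-side

   def check_token(supplied_token: str) -> bool:
       # compare_digest() takes time that does not depend on where the two
       # strings first differ, limiting what a timing attacker can learn.
       return secrets.compare_digest(supplied_token, expected_token)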
diff --git a/Doc/library/shutil.rst b/Doc/library/shutil.rst index 9a25b0d008bf5f..e79caecf310f21 100644 --- a/Doc/library/shutil.rst +++ b/Doc/library/shutil.rst @@ -574,12 +574,18 @@ provided. They rely on the :mod:`zipfile` and :mod:`tarfile` modules. .. note:: - This function is not thread-safe. + This function is not thread-safe when custom archivers registered + with :func:`register_archive_format` are used. In this case it + temporarily changes the current working directory of the process + to perform archiving. .. versionchanged:: 3.8 The modern pax (POSIX.1-2001) format is now used instead of the legacy GNU format for archives created with ``format="tar"``. + .. versionchanged:: 3.10.6 + This function is now made thread-safe during creation of standard + ``.zip`` and tar archives. .. function:: get_archive_formats() diff --git a/Doc/library/signal.rst b/Doc/library/signal.rst index 678411d4f17056..d850639d242abe 100644 --- a/Doc/library/signal.rst +++ b/Doc/library/signal.rst @@ -95,7 +95,7 @@ The signal module defines three enums: :class:`enum.IntEnum` collection the constants :const:`SIG_BLOCK`, :const:`SIG_UNBLOCK` and :const:`SIG_SETMASK`. - Availability: Unix. See the man page :manpage:`sigprocmask(3)` and + Availability: Unix. See the man page :manpage:`sigprocmask(2)` and :manpage:`pthread_sigmask(3)` for further information. .. versionadded:: 3.5 @@ -739,7 +739,7 @@ To illustrate this issue, consider the following code:: def __enter__(self): # If KeyboardInterrupt occurs here, everything is fine self.lock.acquire() - # If KeyboardInterrupt occcurs here, __exit__ will not be called + # If KeyboardInterrupt occurs here, __exit__ will not be called ... # KeyboardInterrupt could occur just before the function returns diff --git a/Doc/library/smtpd.rst b/Doc/library/smtpd.rst index 121790b5ed0ddf..a0d1fb0aa51529 100644 --- a/Doc/library/smtpd.rst +++ b/Doc/library/smtpd.rst @@ -14,8 +14,8 @@ This module offers several classes to implement SMTP (email) servers. -.. deprecated:: 3.6 - :mod:`smtpd` will be removed in Python 3.12 +.. deprecated-removed:: 3.6 3.12 + The :mod:`smtpd` module is deprecated (see :pep:`PEP 594 <594#smtpd>` for details). The `aiosmtpd `_ package is a recommended replacement for this module. It is based on :mod:`asyncio` and provides a diff --git a/Doc/library/sndhdr.rst b/Doc/library/sndhdr.rst index 3ca36f270dade1..e1dbe4a1a34483 100644 --- a/Doc/library/sndhdr.rst +++ b/Doc/library/sndhdr.rst @@ -14,7 +14,7 @@ single: A-LAW single: u-LAW -.. deprecated:: 3.11 +.. deprecated-removed:: 3.11 3.13 The :mod:`sndhdr` module is deprecated (see :pep:`PEP 594 <594#sndhdr>` for details and alternatives). diff --git a/Doc/library/socket.rst b/Doc/library/socket.rst old mode 100755 new mode 100644 index 4c193b892bda8f..fc8a56e7b41ba4 --- a/Doc/library/socket.rst +++ b/Doc/library/socket.rst @@ -225,6 +225,29 @@ created. Socket addresses are represented as follows: .. versionadded:: 3.9 +- :const:`AF_HYPERV` is a Windows-only socket based interface for communicating + with Hyper-V hosts and guests. The address family is represented as a + ``(vm_id, service_id)`` tuple where the ``vm_id`` and ``service_id`` are + UUID strings. + + The ``vm_id`` is the virtual machine identifier or a set of known VMID values + if the target is not a specific virtual machine. Known VMID constants + defined on ``socket`` are: + + - ``HV_GUID_ZERO`` + - ``HV_GUID_BROADCAST`` + - ``HV_GUID_WILDCARD`` - Used to bind on itself and accept connections from + all partitions. 
+ - ``HV_GUID_CHILDREN`` - Used to bind on itself and accept connections from + child partitions. + - ``HV_GUID_LOOPBACK`` - Used as a target to itself. + - ``HV_GUID_PARENT`` - When used as a bind, accepts connections from the parent + partition. When used as an address target it will connect to the parent partition. + + The ``service_id`` is the service identifier of the registered service. + + .. versionadded:: 3.12 + If you use a hostname in the *host* portion of IPv4/v6 socket address, the program may show a nondeterministic behavior, as Python uses the first address returned from the DNS resolution. The socket address will be resolved @@ -233,9 +256,9 @@ resolution and/or the host configuration. For deterministic behavior use a numeric address in *host* portion. All errors raise exceptions. The normal exceptions for invalid argument types -and out-of-memory conditions can be raised; starting from Python 3.3, errors +and out-of-memory conditions can be raised. Errors related to socket or address semantics raise :exc:`OSError` or one of its -subclasses (they used to raise :exc:`socket.error`). +subclasses. Non-blocking mode is supported through :meth:`~socket.setblocking`. A generalization of this based on timeouts is supported through @@ -589,6 +612,25 @@ Constants .. availability:: Linux >= 3.9 +.. data:: AF_HYPERV + HV_PROTOCOL_RAW + HVSOCKET_CONNECT_TIMEOUT + HVSOCKET_CONNECT_TIMEOUT_MAX + HVSOCKET_CONNECTED_SUSPEND + HVSOCKET_ADDRESS_FLAG_PASSTHRU + HV_GUID_ZERO + HV_GUID_WILDCARD + HV_GUID_BROADCAST + HV_GUID_CHILDREN + HV_GUID_LOOPBACK + HV_GUID_PARENT + + Constants for Windows Hyper-V sockets for host/guest communications. + + .. availability:: Windows. + + .. versionadded:: 3.12 + Functions ^^^^^^^^^ @@ -712,7 +754,7 @@ The following functions all create :ref:`socket objects `. .. function:: create_server(address, *, family=AF_INET, backlog=None, reuse_port=False, dualstack_ipv6=False) Convenience function which creates a TCP socket bound to *address* (a 2-tuple - ``(host, port)``) and return the socket object. + ``(host, port)``) and returns the socket object. *family* should be either :data:`AF_INET` or :data:`AF_INET6`. *backlog* is the queue size passed to :meth:`socket.listen`; when ``0`` @@ -2033,10 +2075,10 @@ the interface:: # Include IP headers s.setsockopt(socket.IPPROTO_IP, socket.IP_HDRINCL, 1) - # receive all packages + # receive all packets s.ioctl(socket.SIO_RCVALL, socket.RCVALL_ON) - # receive a package + # receive a packet print(s.recvfrom(65565)) # disabled promiscuous mode diff --git a/Doc/library/spwd.rst b/Doc/library/spwd.rst index 40f50de07babff..87e09167ada427 100644 --- a/Doc/library/spwd.rst +++ b/Doc/library/spwd.rst @@ -6,7 +6,7 @@ :synopsis: The shadow password database (getspnam() and friends). :deprecated: -.. deprecated:: 3.11 +.. deprecated-removed:: 3.11 3.13 The :mod:`spwd` module is deprecated (see :pep:`PEP 594 <594#spwd>` for details and alternatives). diff --git a/Doc/library/sqlite3.rst b/Doc/library/sqlite3.rst index 69e77e922a9ab7..a99485b6ba7bc4 100644 --- a/Doc/library/sqlite3.rst +++ b/Doc/library/sqlite3.rst @@ -142,12 +142,22 @@ Module functions and constants The version number of this module, as a string. This is not the version of the SQLite library. + .. deprecated-removed:: 3.12 3.14 + This constant used to reflect the version number of the ``pysqlite`` + package, a third-party library which used to upstream changes to + ``sqlite3``. Today, it carries no meaning or practical value. + .. 
data:: version_info The version number of this module, as a tuple of integers. This is not the version of the SQLite library. + .. deprecated-removed:: 3.12 3.14 + This constant used to reflect the version number of the ``pysqlite`` + package, a third-party library which used to upstream changes to + ``sqlite3``. Today, it carries no meaning or practical value. + .. data:: sqlite_version @@ -199,31 +209,41 @@ Module functions and constants .. data:: PARSE_DECLTYPES - This constant is meant to be used with the *detect_types* parameter of the - :func:`connect` function. + Pass this flag value to the *detect_types* parameter of + :func:`connect` to look up a converter function using + the declared types for each column. + The types are declared when the database table is created. + ``sqlite3`` will look up a converter function using the first word of the + declared type as the converter dictionary key. + For example: + - Setting it makes the :mod:`sqlite3` module parse the declared type for each - column it returns. It will parse out the first word of the declared type, - i. e. for "integer primary key", it will parse out "integer", or for - "number(10)" it will parse out "number". Then for that column, it will look - into the converters dictionary and use the converter function registered for - that type there. + .. code-block:: sql + + CREATE TABLE test( + i integer primary key, ! will look up a converter named "integer" + p point, ! will look up a converter named "point" + n number(10) ! will look up a converter named "number" + ) + + This flag may be combined with :const:`PARSE_COLNAMES` using the ``|`` + (bitwise or) operator. .. data:: PARSE_COLNAMES - This constant is meant to be used with the *detect_types* parameter of the - :func:`connect` function. + Pass this flag value to the *detect_types* parameter of + :func:`connect` to look up a converter function by + using the type name, parsed from the query column name, + as the converter dictionary key. + The type name must be wrapped in square brackets (``[]``). + + .. code-block:: sql - Setting this makes the SQLite interface parse the column name for each column it - returns. It will look for a string formed [mytype] in there, and then decide - that 'mytype' is the type of the column. It will try to find an entry of - 'mytype' in the converters dictionary and then use the converter function found - there to return the value. The column name found in :attr:`Cursor.description` - does not include the type, i. e. if you use something like - ``'as "Expiration date [datetime]"'`` in your SQL, then we will parse out - everything until the first ``'['`` for the column name and strip - the preceding space: the column name would simply be "Expiration date". + SELECT p as "p [point]" FROM test; ! will look up converter "point" + + This flag may be combined with :const:`PARSE_DECLTYPES` using the ``|`` + (bitwise or) operator. .. function:: connect(database[, timeout, detect_types, isolation_level, check_same_thread, factory, cached_statements, uri]) @@ -247,14 +267,17 @@ Module functions and constants SQLite natively supports only the types TEXT, INTEGER, REAL, BLOB and NULL. If you want to use other types you must add support for them yourself. The - *detect_types* parameter and the using custom **converters** registered with the + *detect_types* parameter and using custom **converters** registered with the module-level :func:`register_converter` function allow you to easily do that. - *detect_types* defaults to 0 (i. e. 
off, no type detection), you can set it to - any combination of :const:`PARSE_DECLTYPES` and :const:`PARSE_COLNAMES` to turn - type detection on. Due to SQLite behaviour, types can't be detected for generated - fields (for example ``max(data)``), even when *detect_types* parameter is set. In - such case, the returned type is :class:`str`. + *detect_types* defaults to 0 (type detection disabled). + Set it to any combination (using ``|``, bitwise or) of + :const:`PARSE_DECLTYPES` and :const:`PARSE_COLNAMES` + to enable type detection. + Column names take precedence over declared types if both flags are set. + Types cannot be detected for generated fields (for example ``max(data)``), + even when the *detect_types* parameter is set. + In such cases, the returned type is :class:`str`. By default, *check_same_thread* is :const:`True` and only the creating thread may use the connection. If set :const:`False`, the returned connection may be shared @@ -309,26 +332,32 @@ Module functions and constants Added the ``sqlite3.connect/handle`` auditing event. -.. function:: register_converter(typename, callable) +.. function:: register_converter(typename, converter) + + Register the *converter* callable to convert SQLite objects of type + *typename* into a Python object of a specific type. + The converter is invoked for all SQLite values of type *typename*; + it is passed a :class:`bytes` object and should return an object of the + desired Python type. + Consult the parameter *detect_types* of + :func:`connect` for information regarding how type detection works. - Registers a callable to convert a bytestring from the database into a custom - Python type. The callable will be invoked for all database values that are of - the type *typename*. Confer the parameter *detect_types* of the :func:`connect` - function for how the type detection works. Note that *typename* and the name of - the type in your query are matched in case-insensitive manner. + Note: *typename* and the name of the type in your query are matched + case-insensitively. -.. function:: register_adapter(type, callable) +.. function:: register_adapter(type, adapter) - Registers a callable to convert the custom Python type *type* into one of - SQLite's supported types. The callable *callable* accepts as single parameter - the Python value, and must return a value of the following types: int, - float, str or bytes. + Register an *adapter* callable to adapt the Python type *type* into an + SQLite type. + The adapter is called with a Python object of type *type* as its sole + argument, and must return a value of a + :ref:`type that SQLite natively understands`. -.. function:: complete_statement(sql) +.. function:: complete_statement(statement) - Returns :const:`True` if the string *sql* contains one or more complete SQL + Returns :const:`True` if the string *statement* contains one or more complete SQL statements terminated by semicolons. It does not verify that the SQL is syntactically correct, only that there are no unclosed string literals and the statement is terminated by a semicolon. @@ -413,21 +442,20 @@ Connection Objects .. method:: commit() - This method commits the current transaction. If you don't call this method, - anything you did since the last call to ``commit()`` is not visible from - other database connections. If you wonder why you don't see the data you've - written to the database, please check you didn't forget to call this method. + Commit any pending transaction to the database. 
+ If there is no open transaction, this method is a no-op. .. method:: rollback() - This method rolls back any changes to the database since the last call to - :meth:`commit`. + Roll back to the start of any pending transaction. + If there is no open transaction, this method is a no-op. .. method:: close() - This closes the database connection. Note that this does not automatically - call :meth:`commit`. If you just close your database connection without - calling :meth:`commit` first, your changes will be lost! + Close the database connection. + Any pending transaction is not committed implicitly; + make sure to :meth:`commit` before closing + to avoid losing pending changes. .. method:: execute(sql[, parameters]) @@ -447,11 +475,11 @@ Connection Objects :meth:`~Cursor.executescript` on it with the given *sql_script*. Return the new cursor object. - .. method:: create_function(name, num_params, func, *, deterministic=False) + .. method:: create_function(name, narg, func, *, deterministic=False) Creates a user-defined function that you can later use from within SQL - statements under the function name *name*. *num_params* is the number of - parameters the function accepts (if *num_params* is -1, the function may + statements under the function name *name*. *narg* is the number of + parameters the function accepts (if *narg* is -1, the function may take any number of arguments), and *func* is a Python callable that is called as the SQL function. If *deterministic* is true, the created function is marked as `deterministic `_, which @@ -470,12 +498,12 @@ Connection Objects .. literalinclude:: ../includes/sqlite3/md5func.py - .. method:: create_aggregate(name, num_params, aggregate_class) + .. method:: create_aggregate(name, n_arg, aggregate_class) Creates a user-defined aggregate function. The aggregate class must implement a ``step`` method, which accepts the number - of parameters *num_params* (if *num_params* is -1, the function may take + of parameters *n_arg* (if *n_arg* is -1, the function may take any number of arguments), and a ``finalize`` method which will return the final result of the aggregate. @@ -518,22 +546,19 @@ Connection Objects .. method:: create_collation(name, callable) - Creates a collation with the specified *name* and *callable*. The callable will - be passed two string arguments. It should return -1 if the first is ordered - lower than the second, 0 if they are ordered equal and 1 if the first is ordered - higher than the second. Note that this controls sorting (ORDER BY in SQL) so - your comparisons don't affect other SQL operations. + Create a collation named *name* using the collating function *callable*. + *callable* is passed two :class:`string ` arguments, + and it should return an :class:`integer `: - Note that the callable will get its parameters as Python bytestrings, which will - normally be encoded in UTF-8. + * ``1`` if the first is ordered higher than the second + * ``-1`` if the first is ordered lower than the second + * ``0`` if they are ordered equal - The following example shows a custom collation that sorts "the wrong way": + The following example shows a reverse sorting collation: .. literalinclude:: ../includes/sqlite3/collation_reverse.py - To remove a collation, call ``create_collation`` with ``None`` as callable:: - - con.create_collation("reverse", None) + Remove a collation function by setting *callable* to :const:`None`. .. versionchanged:: 3.11 The collation name can contain any Unicode character. 
Earlier, only @@ -573,7 +598,7 @@ Connection Objects Added support for disabling the authorizer using :const:`None`. - .. method:: set_progress_handler(handler, n) + .. method:: set_progress_handler(progress_handler, n) This routine registers a callback. The callback is invoked for every *n* instructions of the SQLite virtual machine. This is useful if you want to @@ -581,10 +606,10 @@ Connection Objects a GUI. If you want to clear any previously installed progress handler, call the - method with :const:`None` for *handler*. + method with :const:`None` for *progress_handler*. Returning a non-zero value from the handler function will terminate the - currently executing query and cause it to raise an :exc:`OperationalError` + currently executing query and cause it to raise a :exc:`DatabaseError` exception. @@ -621,7 +646,7 @@ Connection Objects Loadable extensions are disabled by default. See [#f1]_. - .. audit-event:: sqlite3.enable_load_extension connection,enabled sqlite3.enable_load_extension + .. audit-event:: sqlite3.enable_load_extension connection,enabled sqlite3.Connection.enable_load_extension .. versionadded:: 3.2 @@ -638,7 +663,7 @@ Connection Objects Loadable extensions are disabled by default. See [#f1]_. - .. audit-event:: sqlite3.load_extension connection,path sqlite3.load_extension + .. audit-event:: sqlite3.load_extension connection,path sqlite3.Connection.load_extension .. versionadded:: 3.2 @@ -816,7 +841,7 @@ Connection Objects *name*, and reopen *name* as an in-memory database based on the serialization contained in *data*. Deserialization will raise :exc:`OperationalError` if the database connection is currently involved - in a read transaction or a backup operation. :exc:`DataError` will be + in a read transaction or a backup operation. :exc:`OverflowError` will be raised if ``len(data)`` is larger than ``2**63 - 1``, and :exc:`DatabaseError` will be raised if *data* does not contain a valid SQLite database. @@ -847,7 +872,7 @@ Cursor Objects :ref:`placeholders `. :meth:`execute` will only execute a single SQL statement. If you try to execute - more than one statement with it, it will raise a :exc:`.Warning`. Use + more than one statement with it, it will raise a :exc:`ProgrammingError`. Use :meth:`executescript` if you want to execute multiple SQL statements with one call. @@ -1101,14 +1126,20 @@ Blob Objects Exceptions ---------- +The exception hierarchy is defined by the DB-API 2.0 (:pep:`249`). + .. exception:: Warning - A subclass of :exc:`Exception`. + This exception is not currently raised by the ``sqlite3`` module, + but may be raised by applications using ``sqlite3``, + for example if a user-defined function truncates data while inserting. + ``Warning`` is a subclass of :exc:`Exception`. .. exception:: Error - The base class of the other exceptions in this module. It is a subclass - of :exc:`Exception`. + The base class of the other exceptions in this module. + Use this to catch all errors with one single :keyword:`except` statement. + ``Error`` is a subclass of :exc:`Exception`. .. attribute:: sqlite_errorcode @@ -1124,34 +1155,60 @@ Exceptions .. versionadded:: 3.11 +.. exception:: InterfaceError + + Exception raised for misuse of the low-level SQLite C API. + In other words, if this exception is raised, it probably indicates a bug in the + ``sqlite3`` module. + ``InterfaceError`` is a subclass of :exc:`Error`. + .. exception:: DatabaseError Exception raised for errors that are related to the database. 
+ This serves as the base exception for several types of database errors. + It is only raised implicitly through the specialised subclasses. + ``DatabaseError`` is a subclass of :exc:`Error`. + +.. exception:: DataError + + Exception raised for errors caused by problems with the processed data, + like numeric values out of range, and strings which are too long. + ``DataError`` is a subclass of :exc:`DatabaseError`. + +.. exception:: OperationalError + + Exception raised for errors that are related to the database's operation, + and not necessarily under the control of the programmer. + For example, the database path is not found, + or a transaction could not be processed. + ``OperationalError`` is a subclass of :exc:`DatabaseError`. .. exception:: IntegrityError Exception raised when the relational integrity of the database is affected, e.g. a foreign key check fails. It is a subclass of :exc:`DatabaseError`. -.. exception:: ProgrammingError +.. exception:: InternalError - Exception raised for programming errors, e.g. table not found or already - exists, syntax error in the SQL statement, wrong number of parameters - specified, etc. It is a subclass of :exc:`DatabaseError`. + Exception raised when SQLite encounters an internal error. + If this is raised, it may indicate that there is a problem with the runtime + SQLite library. + ``InternalError`` is a subclass of :exc:`DatabaseError`. -.. exception:: OperationalError +.. exception:: ProgrammingError - Exception raised for errors that are related to the database's operation - and not necessarily under the control of the programmer, e.g. an unexpected - disconnect occurs, the data source name is not found, a transaction could - not be processed, etc. It is a subclass of :exc:`DatabaseError`. + Exception raised for ``sqlite3`` API programming errors, + for example supplying the wrong number of bindings to a query, + or trying to operate on a closed :class:`Connection`. + ``ProgrammingError`` is a subclass of :exc:`DatabaseError`. .. exception:: NotSupportedError - Exception raised in case a method or database API was used which is not - supported by the database, e.g. calling the :meth:`~Connection.rollback` - method on a connection that does not support transaction or has - transactions turned off. It is a subclass of :exc:`DatabaseError`. + Exception raised in case a method or database API is not supported by the + underlying SQLite library. For example, setting *deterministic* to + :const:`True` in :meth:`~Connection.create_function`, if the underlying SQLite library + does not support deterministic functions. + ``NotSupportedError`` is a subclass of :exc:`DatabaseError`. .. _sqlite3-blob-objects: @@ -1208,33 +1265,32 @@ you can let the :mod:`sqlite3` module convert SQLite types to different Python types via converters. -Using adapters to store additional Python types in SQLite databases -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +Using adapters to store custom Python types in SQLite databases +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -As described before, SQLite supports only a limited set of types natively. To -use other Python types with SQLite, you must **adapt** them to one of the -sqlite3 module's supported types for SQLite: one of NoneType, int, float, -str, bytes. +SQLite supports only a limited set of data types natively. +To store custom Python types in SQLite databases, *adapt* them to one of the +:ref:`Python types SQLite natively understands`. 
-There are two ways to enable the :mod:`sqlite3` module to adapt a custom Python -type to one of the supported ones. +There are two ways to adapt Python objects to SQLite types: +letting your object adapt itself, or using an *adapter callable*. +The latter will take precedence above the former. +For a library that exports a custom type, +it may make sense to enable that type to adapt itself. +As an application developer, it may make more sense to take direct control by +registering custom adapter functions. Letting your object adapt itself """""""""""""""""""""""""""""""" -This is a good approach if you write the class yourself. Let's suppose you have -a class like this:: - - class Point: - def __init__(self, x, y): - self.x, self.y = x, y - -Now you want to store the point in a single SQLite column. First you'll have to -choose one of the supported types to be used for representing the point. -Let's just use str and separate the coordinates using a semicolon. Then you need -to give your class a method ``__conform__(self, protocol)`` which must return -the converted value. The parameter *protocol* will be :class:`PrepareProtocol`. +Suppose we have a ``Point`` class that represents a pair of coordinates, +``x`` and ``y``, in a Cartesian coordinate system. +The coordinate pair will be stored as a text string in the database, +using a semicolon to separate the coordinates. +This can be implemented by adding a ``__conform__(self, protocol)`` +method which returns the adapted value. +The object passed to *protocol* will be of type :class:`PrepareProtocol`. .. literalinclude:: ../includes/sqlite3/adapter_point_1.py @@ -1242,26 +1298,20 @@ the converted value. The parameter *protocol* will be :class:`PrepareProtocol`. Registering an adapter callable """"""""""""""""""""""""""""""" -The other possibility is to create a function that converts the type to the -string representation and register the function with :meth:`register_adapter`. +The other possibility is to create a function that converts the Python object +to an SQLite-compatible type. +This function can then be registered using :func:`register_adapter`. .. literalinclude:: ../includes/sqlite3/adapter_point_2.py -The :mod:`sqlite3` module has two default adapters for Python's built-in -:class:`datetime.date` and :class:`datetime.datetime` types. Now let's suppose -we want to store :class:`datetime.datetime` objects not in ISO representation, -but as a Unix timestamp. - -.. literalinclude:: ../includes/sqlite3/adapter_datetime.py - Converting SQLite values to custom Python types ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -Writing an adapter lets you send custom Python types to SQLite. But to make it -really useful we need to make the Python to SQLite to Python roundtrip work. - -Enter converters. +Writing an adapter lets you convert *from* custom Python types *to* SQLite +values. +To be able to convert *from* SQLite values *to* custom Python types, +we use *converters*. Let's go back to the :class:`Point` class. We stored the x and y coordinates separated via semicolons as strings in SQLite. @@ -1271,8 +1321,8 @@ and constructs a :class:`Point` object from it. .. note:: - Converter functions **always** get called with a :class:`bytes` object, no - matter under which data type you sent the value to SQLite. + Converter functions are **always** passed a :class:`bytes` object, + no matter the underlying SQLite data type. :: @@ -1280,17 +1330,17 @@ and constructs a :class:`Point` object from it. 
def convert_point(s): x, y = map(float, s.split(b";")) return Point(x, y) -Now you need to make the :mod:`sqlite3` module know that what you select from -the database is actually a point. There are two ways of doing this: - -* Implicitly via the declared type - -* Explicitly via the column name +We now need to tell ``sqlite3`` when it should convert a given SQLite value. +This is done when connecting to a database, using the *detect_types* parameter +of :func:`connect`. There are three options: -Both ways are described in section :ref:`sqlite3-module-contents`, in the entries -for the constants :const:`PARSE_DECLTYPES` and :const:`PARSE_COLNAMES`. +* Implicit: set *detect_types* to :const:`PARSE_DECLTYPES` +* Explicit: set *detect_types* to :const:`PARSE_COLNAMES` +* Both: set *detect_types* to + ``sqlite3.PARSE_DECLTYPES | sqlite3.PARSE_COLNAMES``. + Column names take precedence over declared types. -The following example illustrates both approaches. +The following example illustrates the implicit and explicit approaches: .. literalinclude:: ../includes/sqlite3/converter_point.py @@ -1324,6 +1374,52 @@ timestamp converter. offsets in timestamps, either leave converters disabled, or register an offset-aware converter with :func:`register_converter`. + +.. _sqlite3-adapter-converter-recipes: + +Adapter and Converter Recipes +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +This section shows recipes for common adapters and converters. + +.. code-block:: + + import datetime + import sqlite3 + + def adapt_date_iso(val): + """Adapt datetime.date to ISO 8601 date.""" + return val.isoformat() + + def adapt_datetime_iso(val): + """Adapt datetime.datetime to timezone-naive ISO 8601 date.""" + return val.isoformat() + + def adapt_datetime_epoch(val): + """Adapt datetime.datetime to Unix timestamp.""" + return int(val.timestamp()) + + sqlite3.register_adapter(datetime.date, adapt_date_iso) + sqlite3.register_adapter(datetime.datetime, adapt_datetime_iso) + sqlite3.register_adapter(datetime.datetime, adapt_datetime_epoch) + + def convert_date(val): + """Convert ISO 8601 date to datetime.date object.""" + return datetime.date.fromisoformat(val.decode()) + + def convert_datetime(val): + """Convert ISO 8601 datetime to datetime.datetime object.""" + return datetime.datetime.fromisoformat(val.decode()) + + def convert_timestamp(val): + """Convert Unix epoch timestamp to datetime.datetime object.""" + return datetime.datetime.fromtimestamp(int(val)) + + sqlite3.register_converter("date", convert_date) + sqlite3.register_converter("datetime", convert_datetime) + sqlite3.register_converter("timestamp", convert_timestamp) + + .. _sqlite3-controlling-transactions: Controlling Transactions @@ -1392,13 +1488,27 @@ case-insensitively by name: .. literalinclude:: ../includes/sqlite3/rowclass.py +.. _sqlite3-connection-context-manager: + Using the connection as a context manager ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -Connection objects can be used as context managers -that automatically commit or rollback transactions. In the event of an -exception, the transaction is rolled back; otherwise, the transaction is -committed: +A :class:`Connection` object can be used as a context manager that +automatically commits or rolls back open transactions when leaving the body of +the context manager. +If the body of the :keyword:`with` statement finishes without exceptions, +the transaction is committed. +If this commit fails, +or if the body of the ``with`` statement raises an uncaught exception, +the transaction is rolled back. 
+ +If there is no open transaction upon leaving the body of the ``with`` statement, +the context manager is a no-op. + +.. note:: + + The context manager neither implicitly opens a new transaction + nor closes the connection. .. literalinclude:: ../includes/sqlite3/ctx_manager.py diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst index 54bd335282699c..1bbcd7a1964ea5 100644 --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -311,27 +311,6 @@ Random generation .. versionadded:: 3.3 -.. function:: RAND_pseudo_bytes(num) - - Return (bytes, is_cryptographic): bytes are *num* pseudo-random bytes, - is_cryptographic is ``True`` if the bytes generated are cryptographically - strong. Raises an :class:`SSLError` if the operation is not supported by the - current RAND method. - - Generated pseudo-random byte sequences will be unique if they are of - sufficient length, but are not necessarily unpredictable. They can be used - for non-cryptographic purposes and for certain purposes in cryptographic - protocols, but usually not for key generation etc. - - For almost all applications :func:`os.urandom` is preferable. - - .. versionadded:: 3.3 - - .. deprecated:: 3.6 - - OpenSSL has deprecated :func:`ssl.RAND_pseudo_bytes`, use - :func:`ssl.RAND_bytes` instead. - .. function:: RAND_status() Return ``True`` if the SSL pseudo-random number generator has been seeded @@ -356,49 +335,6 @@ Certificate handling import ssl -.. function:: match_hostname(cert, hostname) - - Verify that *cert* (in decoded format as returned by - :meth:`SSLSocket.getpeercert`) matches the given *hostname*. The rules - applied are those for checking the identity of HTTPS servers as outlined - in :rfc:`2818`, :rfc:`5280` and :rfc:`6125`. In addition to HTTPS, this - function should be suitable for checking the identity of servers in - various SSL-based protocols such as FTPS, IMAPS, POPS and others. - - :exc:`CertificateError` is raised on failure. On success, the function - returns nothing:: - - >>> cert = {'subject': ((('commonName', 'example.com'),),)} - >>> ssl.match_hostname(cert, "example.com") - >>> ssl.match_hostname(cert, "example.org") - Traceback (most recent call last): - File "", line 1, in - File "/home/py3k/Lib/ssl.py", line 130, in match_hostname - ssl.CertificateError: hostname 'example.org' doesn't match 'example.com' - - .. versionadded:: 3.2 - - .. versionchanged:: 3.3.3 - The function now follows :rfc:`6125`, section 6.4.3 and does neither - match multiple wildcards (e.g. ``*.*.com`` or ``*a*.example.org``) nor - a wildcard inside an internationalized domain names (IDN) fragment. - IDN A-labels such as ``www*.xn--pthon-kva.org`` are still supported, - but ``x*.python.org`` no longer matches ``xn--tda.python.org``. - - .. versionchanged:: 3.5 - Matching of IP addresses, when present in the subjectAltName field - of the certificate, is now supported. - - .. versionchanged:: 3.7 - The function is no longer used to TLS connections. Hostname matching - is now performed by OpenSSL. - - Allow wildcard when it is the leftmost and the only character - in that segment. Partial wildcards like ``www*.example.com`` are no - longer supported. - - .. deprecated:: 3.7 - .. function:: cert_time_to_seconds(cert_time) Return the time in seconds since the Epoch, given the ``cert_time`` @@ -714,7 +650,7 @@ Constants Selects SSL version 2 as the channel encryption protocol. This protocol is not available if OpenSSL is compiled with the - ``OPENSSL_NO_SSL2`` flag. + ``no-ssl2`` option. .. 
warning:: @@ -728,8 +664,8 @@ Constants Selects SSL version 3 as the channel encryption protocol. - This protocol is not be available if OpenSSL is compiled with the - ``OPENSSL_NO_SSLv3`` flag. + This protocol is not available if OpenSSL is compiled with the + ``no-ssl3`` option. .. warning:: @@ -1272,11 +1208,6 @@ SSL sockets also have the following additional methods and attributes: 'subjectAltName': (('DNS', '*.eff.org'), ('DNS', 'eff.org')), 'version': 3} - .. note:: - - To validate a certificate for a particular service, you can use the - :func:`match_hostname` function. - If the ``binary_form`` parameter is :const:`True`, and a certificate was provided, this method returns the DER-encoded form of the entire certificate as a sequence of bytes, or :const:`None` if the peer did not provide a @@ -1291,6 +1222,8 @@ SSL sockets also have the following additional methods and attributes: :const:`None` if you used :const:`CERT_NONE` (rather than :const:`CERT_OPTIONAL` or :const:`CERT_REQUIRED`). + See also :attr:`SSLContext.check_hostname`. + .. versionchanged:: 3.2 The returned dictionary includes additional items such as ``issuer`` and ``notBefore``. @@ -1910,7 +1843,7 @@ to speed up repeated connections from the same clients. .. method:: SSLContext.session_stats() Get statistics about the SSL sessions created or managed by this context. - A dictionary is returned which maps the names of each `piece of information `_ to their + A dictionary is returned which maps the names of each `piece of information `_ to their numeric values. For example, here is the total number of hits and misses in the session cache since the context was created:: @@ -2660,10 +2593,9 @@ Therefore, when in client mode, it is highly recommended to use :const:`CERT_REQUIRED`. However, it is in itself not sufficient; you also have to check that the server certificate, which can be obtained by calling :meth:`SSLSocket.getpeercert`, matches the desired service. For many -protocols and applications, the service can be identified by the hostname; -in this case, the :func:`match_hostname` function can be used. This common -check is automatically performed when :attr:`SSLContext.check_hostname` is -enabled. +protocols and applications, the service can be identified by the hostname. +This common check is automatically performed when +:attr:`SSLContext.check_hostname` is enabled. .. versionchanged:: 3.7 Hostname matchings is now performed by OpenSSL. Python no longer uses @@ -2704,7 +2636,7 @@ enabled when negotiating a SSL session is possible through the :meth:`SSLContext.set_ciphers` method. Starting from Python 3.2.3, the ssl module disables certain weak ciphers by default, but you may want to further restrict the cipher choice. Be sure to read OpenSSL's documentation -about the `cipher list format `_. +about the `cipher list format `_. If you want to check which ciphers are enabled by a given cipher list, use :meth:`SSLContext.get_ciphers` or the ``openssl ciphers`` command on your system. @@ -2717,8 +2649,8 @@ for example the :mod:`multiprocessing` or :mod:`concurrent.futures` modules), be aware that OpenSSL's internal random number generator does not properly handle forked processes. Applications must change the PRNG state of the parent process if they use any SSL feature with :func:`os.fork`. Any -successful call of :func:`~ssl.RAND_add`, :func:`~ssl.RAND_bytes` or -:func:`~ssl.RAND_pseudo_bytes` is sufficient. +successful call of :func:`~ssl.RAND_add` or :func:`~ssl.RAND_bytes` is +sufficient. .. 
_ssl-tlsv1_3: diff --git a/Doc/library/statistics.rst b/Doc/library/statistics.rst index 4cd983bb25468a..347a1be8321e45 100644 --- a/Doc/library/statistics.rst +++ b/Doc/library/statistics.rst @@ -798,7 +798,7 @@ of applications in statistics. Compute the inverse cumulative distribution function, also known as the `quantile function `_ or the `percent-point - `_ + `_ function. Mathematically, it is written ``x : P(X <= x) = p``. Finds the value *x* of the random variable *X* such that the @@ -947,7 +947,7 @@ probability that the Python room will stay within its capacity limits? Normal distributions commonly arise in machine learning problems. Wikipedia has a `nice example of a Naive Bayesian Classifier -`_. +`_. The challenge is to predict a person's gender from measurements of normally distributed features including height, weight, and foot size. diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst index 065afb8ae6038b..33fd2831228f88 100644 --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -1866,6 +1866,8 @@ expression support in the :mod:`re` module). +.. _meth-str-join: + .. method:: str.join(iterable) Return a string which is the concatenation of the strings in *iterable*. diff --git a/Doc/library/struct.rst b/Doc/library/struct.rst index eccba20fb8fe7e..25e26201d7d437 100644 --- a/Doc/library/struct.rst +++ b/Doc/library/struct.rst @@ -146,9 +146,10 @@ If the first character is not one of these, ``'@'`` is assumed. Native byte order is big-endian or little-endian, depending on the host system. For example, Intel x86 and AMD64 (x86-64) are little-endian; -Motorola 68000 and PowerPC G5 are big-endian; ARM and Intel Itanium feature -switchable endianness (bi-endian). Use ``sys.byteorder`` to check the -endianness of your system. +IBM z and most legacy architectures are big-endian; +and ARM, RISC-V and IBM Power feature switchable endianness +(bi-endian, though the former two are nearly always little-endian in practice). +Use ``sys.byteorder`` to check the endianness of your system. Native size and alignment are determined using the C compiler's ``sizeof`` expression. This is always combined with native byte order. @@ -466,6 +467,6 @@ The :mod:`struct` module also defines the following type: .. _half precision format: https://en.wikipedia.org/wiki/Half-precision_floating-point_format -.. _ieee 754 standard: https://en.wikipedia.org/wiki/IEEE_floating_point#IEEE_754-2008 +.. _ieee 754 standard: https://en.wikipedia.org/wiki/IEEE_754-2008_revision .. _IETF RFC 1700: https://tools.ietf.org/html/rfc1700 diff --git a/Doc/library/subprocess.rst b/Doc/library/subprocess.rst index ce581abdd1dced..4031a5f62167f0 100644 --- a/Doc/library/subprocess.rst +++ b/Doc/library/subprocess.rst @@ -33,9 +33,6 @@ The recommended approach to invoking subprocesses is to use the :func:`run` function for all use cases it can handle. For more advanced use cases, the underlying :class:`Popen` interface can be used directly. -The :func:`run` function was added in Python 3.5; if you need to retain -compatibility with older versions, see the :ref:`call-function-trio` section. - .. function:: run(args, *, stdin=None, input=None, stdout=None, stderr=None,\ capture_output=False, shell=False, cwd=None, timeout=None, \ diff --git a/Doc/library/sunau.rst b/Doc/library/sunau.rst index b4d996e67e17cf..c7a38d96ade131 100644 --- a/Doc/library/sunau.rst +++ b/Doc/library/sunau.rst @@ -9,7 +9,7 @@ **Source code:** :source:`Lib/sunau.py` -.. deprecated:: 3.11 +.. 
deprecated-removed:: 3.11 3.13 The :mod:`sunau` module is deprecated (see :pep:`PEP 594 <594#sunau>` for details). diff --git a/Doc/library/symtable.rst b/Doc/library/symtable.rst index 0264f891cc8c06..65ff5bfe7abd61 100644 --- a/Doc/library/symtable.rst +++ b/Doc/library/symtable.rst @@ -69,7 +69,8 @@ Examining Symbol Tables .. method:: get_identifiers() - Return a list of names of symbols in this table. + Return a view object containing the names of symbols in the table. + See the :ref:`documentation of view objects `. .. method:: lookup(name) diff --git a/Doc/library/sys.rst b/Doc/library/sys.rst index 5f3b9b5776cb82..7e2468446eb96c 100644 --- a/Doc/library/sys.rst +++ b/Doc/library/sys.rst @@ -1087,7 +1087,8 @@ always available. A list of :term:`meta path finder` objects that have their :meth:`~importlib.abc.MetaPathFinder.find_spec` methods called to see if one - of the objects can find the module to be imported. The + of the objects can find the module to be imported. By default, it holds entries + that implement Python's default import semantics. The :meth:`~importlib.abc.MetaPathFinder.find_spec` method is called with at least the absolute name of the module being imported. If the module to be imported is contained in a package, then the parent package's :attr:`__path__` diff --git a/Doc/library/tarfile.rst b/Doc/library/tarfile.rst index f5c49b0ac4f738..f9d34def79a12b 100644 --- a/Doc/library/tarfile.rst +++ b/Doc/library/tarfile.rst @@ -98,8 +98,8 @@ Some facts and figures: If *fileobj* is specified, it is used as an alternative to a :term:`file object` opened in binary mode for *name*. It is supposed to be at position 0. - For modes ``'w:gz'``, ``'r:gz'``, ``'w:bz2'``, ``'r:bz2'``, ``'x:gz'``, - ``'x:bz2'``, :func:`tarfile.open` accepts the keyword argument + For modes ``'w:gz'``, ``'x:gz'``, ``'w|gz'``, ``'w:bz2'``, ``'x:bz2'``, + ``'w|bz2'``, :func:`tarfile.open` accepts the keyword argument *compresslevel* (default ``9``) to specify the compression level of the file. For modes ``'w:xz'`` and ``'x:xz'``, :func:`tarfile.open` accepts the @@ -152,6 +152,9 @@ Some facts and figures: .. versionchanged:: 3.6 The *name* parameter accepts a :term:`path-like object`. + .. versionchanged:: 3.12 + The *compresslevel* keyword argument also works for streams. + .. class:: TarFile :noindex: diff --git a/Doc/library/telnetlib.rst b/Doc/library/telnetlib.rst index 48a927c8ac96b2..70b8c7d1511d09 100644 --- a/Doc/library/telnetlib.rst +++ b/Doc/library/telnetlib.rst @@ -11,7 +11,7 @@ .. index:: single: protocol; Telnet -.. deprecated:: 3.11 +.. deprecated-removed:: 3.11 3.13 The :mod:`telnetlib` module is deprecated (see :pep:`PEP 594 <594#telnetlib>` for details and alternatives). diff --git a/Doc/library/termios.rst b/Doc/library/termios.rst index 3b0cb60f874522..fb1ff567d49e5c 100644 --- a/Doc/library/termios.rst +++ b/Doc/library/termios.rst @@ -85,11 +85,11 @@ The module defines the following functions: .. function:: tcsetwinsize(fd, winsize) - Set the tty window size for file descriptor *fd* from *winsize*, which is - a two-item tuple ``(ws_row, ws_col)`` like the one returned by - :func:`tcgetwinsize`. Requires at least one of the pairs - (:const:`termios.TIOCGWINSZ`, :const:`termios.TIOCSWINSZ`); - (:const:`termios.TIOCGSIZE`, :const:`termios.TIOCSSIZE`) to be defined. + Set the tty window size for file descriptor *fd* from *winsize*, which is + a two-item tuple ``(ws_row, ws_col)`` like the one returned by + :func:`tcgetwinsize`. 
Requires at least one of the pairs + (:const:`termios.TIOCGWINSZ`, :const:`termios.TIOCSWINSZ`); + (:const:`termios.TIOCGSIZE`, :const:`termios.TIOCSSIZE`) to be defined. .. versionadded:: 3.11 diff --git a/Doc/library/test.rst b/Doc/library/test.rst index 707e966455ceb5..e255952d4570ef 100644 --- a/Doc/library/test.rst +++ b/Doc/library/test.rst @@ -319,6 +319,15 @@ The :mod:`test.support` module defines the following constants: to make writes blocking. +.. data:: Py_DEBUG + + True if Python is built with the :c:macro:`Py_DEBUG` macro defined: if + Python is :ref:`built in debug mode ` + (:option:`./configure --with-pydebug <--with-pydebug>`). + + .. versionadded:: 3.12 + + .. data:: SOCK_MAX_SIZE A constant that is likely larger than the underlying OS socket buffer size, @@ -359,13 +368,19 @@ The :mod:`test.support` module defines the following constants: .. data:: MISSING_C_DOCSTRINGS - Return ``True`` if running on CPython, not on Windows, and configuration - not set with ``WITH_DOC_STRINGS``. + Set to ``True`` if Python is built without docstrings (the + :c:macro:`WITH_DOC_STRINGS` macro is not defined). + See the :option:`configure --without-doc-strings <--without-doc-strings>` option. + + See also the :data:`HAVE_DOCSTRINGS` variable. .. data:: HAVE_DOCSTRINGS - Check for presence of docstrings. + Set to ``True`` if function docstrings are available. + See the :option:`python -OO <-O>` option, which strips docstrings of functions implemented in Python. + + See also the :data:`MISSING_C_DOCSTRINGS` variable. .. data:: TEST_HTTP_URL @@ -398,6 +413,51 @@ The :mod:`test.support` module defines the following constants: The :mod:`test.support` module defines the following functions: +.. function:: busy_retry(timeout, err_msg=None, /, *, error=True) + + Run the loop body until ``break`` stops the loop. + + After *timeout* seconds, raise an :exc:`AssertionError` if *error* is true, + or just stop the loop if *error* is false. + + Example:: + + for _ in support.busy_retry(support.SHORT_TIMEOUT): + if check(): + break + + Example of error=False usage:: + + for _ in support.busy_retry(support.SHORT_TIMEOUT, error=False): + if check(): + break + else: + raise RuntimeError('my custom error') + +.. function:: sleeping_retry(timeout, err_msg=None, /, *, init_delay=0.010, max_delay=1.0, error=True) + + Wait strategy that applies exponential backoff. + + Run the loop body until ``break`` stops the loop. Sleep at each loop + iteration, but not at the first iteration. The sleep delay is doubled at + each iteration (up to *max_delay* seconds). + + See :func:`busy_retry` documentation for the parameters usage. + + Example raising an exception after SHORT_TIMEOUT seconds:: + + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if check(): + break + + Example of error=False usage:: + + for _ in support.sleeping_retry(support.SHORT_TIMEOUT, error=False): + if check(): + break + else: + raise RuntimeError('my custom error') + .. function:: is_resource_enabled(resource) Return ``True`` if *resource* is enabled and available. The list of @@ -423,11 +483,6 @@ The :mod:`test.support` module defines the following functions: Used when tests are executed by :mod:`test.regrtest`. -.. function:: system_must_validate_cert(f) - - Raise :exc:`unittest.SkipTest` on TLS certification validation failures. - - .. function:: sortdict(dict) Return a repr of *dict* with keys sorted. @@ -445,12 +500,12 @@ The :mod:`test.support` module defines the following functions: .. 
function:: match_test(test) - Match *test* to patterns set in :func:`set_match_tests`. + Determine whether *test* matches the patterns set in :func:`set_match_tests`. -.. function:: set_match_tests(patterns) +.. function:: set_match_tests(accept_patterns=None, ignore_patterns=None) - Define match test with regular expression *patterns*. + Define match patterns on test filenames and test method names for filtering tests. .. function:: run_unittest(*classes) @@ -490,7 +545,9 @@ The :mod:`test.support` module defines the following functions: .. function:: check_impl_detail(**guards) Use this check to guard CPython's implementation-specific tests or to - run them only on the implementations guarded by the arguments:: + run them only on the implementations guarded by the arguments. This + function returns ``True`` or ``False`` depending on the host platform. + Example usage:: check_impl_detail() # Only on CPython (default). check_impl_detail(jython=True) # Only on Jython. @@ -509,7 +566,7 @@ The :mod:`test.support` module defines the following functions: time the regrtest began. -.. function:: get_original_stdout +.. function:: get_original_stdout() Return the original stdout set by :func:`record_original_stdout` or ``sys.stdout`` if it's not set. @@ -554,7 +611,7 @@ The :mod:`test.support` module defines the following functions: .. function:: disable_faulthandler() - A context manager that replaces ``sys.stderr`` with ``sys.__stderr__``. + A context manager that temporarily disables :mod:`faulthandler`. .. function:: gc_collect() @@ -567,8 +624,8 @@ The :mod:`test.support` module defines the following functions: .. function:: disable_gc() - A context manager that disables the garbage collector upon entry and - reenables it upon exit. + A context manager that disables the garbage collector on entry. On + exit, the garbage collector is restored to its prior state. .. function:: swap_attr(obj, attr, new_val) @@ -642,14 +699,14 @@ The :mod:`test.support` module defines the following functions: .. function:: calcobjsize(fmt) - Return :func:`struct.calcsize` for ``nP{fmt}0n`` or, if ``gettotalrefcount`` - exists, ``2PnP{fmt}0P``. + Return the size of the :c:type:`PyObject` whose structure members are + defined by *fmt*. The returned value includes the size of the Python object header and alignment. .. function:: calcvobjsize(fmt) - Return :func:`struct.calcsize` for ``nPn{fmt}0n`` or, if ``gettotalrefcount`` - exists, ``2PnPn{fmt}0P``. + Return the size of the :c:type:`PyVarObject` whose structure members are + defined by *fmt*. The returned value includes the size of the Python object header and alignment. .. function:: checksizeof(test, o, size) @@ -665,6 +722,11 @@ The :mod:`test.support` module defines the following functions: have an associated comment identifying the relevant tracker issue. +.. function:: system_must_validate_cert(f) + + A decorator that skips the decorated test on TLS certification validation failures. + + .. decorator:: run_with_locale(catstr, *locales) A decorator for running a function in a different locale, correctly @@ -682,19 +744,19 @@ The :mod:`test.support` module defines the following functions: .. decorator:: requires_freebsd_version(*min_version) Decorator for the minimum version when running test on FreeBSD. If the - FreeBSD version is less than the minimum, raise :exc:`unittest.SkipTest`. + FreeBSD version is less than the minimum, the test is skipped. .. decorator:: requires_linux_version(*min_version) Decorator for the minimum version when running test on Linux. 
If the - Linux version is less than the minimum, raise :exc:`unittest.SkipTest`. + Linux version is less than the minimum, the test is skipped. .. decorator:: requires_mac_version(*min_version) Decorator for the minimum version when running test on macOS. If the - macOS version is less than the minimum, raise :exc:`unittest.SkipTest`. + macOS version is less than the minimum, the test is skipped. .. decorator:: requires_IEEE_754 @@ -732,7 +794,7 @@ The :mod:`test.support` module defines the following functions: Decorator for only running the test if :data:`HAVE_DOCSTRINGS`. -.. decorator:: cpython_only(test) +.. decorator:: cpython_only Decorator for tests only applicable to CPython. @@ -743,12 +805,12 @@ The :mod:`test.support` module defines the following functions: returns ``False``, then uses *msg* as the reason for skipping the test. -.. decorator:: no_tracing(func) +.. decorator:: no_tracing Decorator to temporarily turn off tracing for the duration of the test. -.. decorator:: refcount_test(test) +.. decorator:: refcount_test Decorator for tests which involve reference counting. The decorator does not run the test if it is not run by CPython. Any trace function is unset @@ -771,10 +833,9 @@ The :mod:`test.support` module defines the following functions: means the test doesn't support dummy runs when ``-M`` is not specified. -.. decorator:: bigaddrspacetest(f) +.. decorator:: bigaddrspacetest - Decorator for tests that fill the address space. *f* is the function to - wrap. + Decorator for tests that fill the address space. .. function:: check_syntax_error(testcase, statement, errtext='', *, lineno=None, offset=None) @@ -876,7 +937,7 @@ The :mod:`test.support` module defines the following functions: .. function:: check_free_after_iterating(test, iter, cls, args=()) - Assert that *iter* is deallocated after iterating. + Assert instances of *cls* are deallocated after iterating. .. function:: missing_compiler_executable(cmd_names=[]) @@ -967,6 +1028,16 @@ The :mod:`test.support` module defines the following classes: Class to save and restore signal handlers registered by the Python signal handler. + .. method:: save(self) + + Save the signal handlers to a dictionary mapping signal numbers to the + current signal handler. + + .. method:: restore(self) + + Set the signal numbers from the :meth:`save` dictionary to the saved + handler. + .. class:: Matcher() @@ -1103,11 +1174,11 @@ script execution tests. variables *env_vars* succeeds (``rc == 0``) and return a ``(return code, stdout, stderr)`` tuple. - If the ``__cleanenv`` keyword is set, *env_vars* is used as a fresh + If the *__cleanenv* keyword-only parameter is set, *env_vars* is used as a fresh environment. Python is started in isolated mode (command line option ``-I``), - except if the ``__isolated`` keyword is set to ``False``. + except if the *__isolated* keyword-only parameter is set to ``False``. .. versionchanged:: 3.9 The function no longer strips whitespaces from *stderr*. @@ -1218,15 +1289,17 @@ The :mod:`test.support.threading_helper` module provides support for threading t is still alive after *timeout* seconds. -.. decorator:: reap_threads(func) +.. decorator:: reap_threads Decorator to ensure the threads are cleaned up even if the test fails. .. function:: start_threads(threads, unlock=None) - Context manager to start *threads*. It attempts to join the threads upon - exit. + Context manager to start *threads*, which is a sequence of threads. 
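   For example, a test might use it roughly as follows (a minimal sketch; the
   worker function and thread count are hypothetical, and the *unlock*
   callback described in the next paragraph is not used here)::

      import threading
      from test.support import threading_helper

      def worker():
          pass   # hypothetical per-thread work

      threads = [threading.Thread(target=worker) for _ in range(4)]
      with threading_helper.start_threads(threads):
          pass   # every thread in *threads* has been started at this point
      # on exit, the context manager attempts to join the started threads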
+ *unlock* is a function called after the threads are started, even if an + exception was raised; an example would be :meth:`threading.Event.set`. + ``start_threads`` will attempt to join the started threads upon exit. .. function:: threading_cleanup(*original_values) @@ -1308,7 +1381,10 @@ The :mod:`test.support.os_helper` module provides support for os tests. .. data:: TESTFN_NONASCII - Set to a filename containing the :data:`FS_NONASCII` character. + Set to a filename containing the :data:`FS_NONASCII` character, if it exists. + This guarantees that if the filename exists, it can be encoded and decoded + with the default filesystem encoding. This allows tests that require a + non-ASCII filename to be easily skipped on platforms where they can't work. .. data:: TESTFN_UNENCODABLE @@ -1406,13 +1482,16 @@ The :mod:`test.support.os_helper` module provides support for os tests. .. function:: rmdir(filename) Call :func:`os.rmdir` on *filename*. On Windows platforms, this is - wrapped with a wait loop that checks for the existence of the file. + wrapped with a wait loop that checks for the existence of the file, + which is needed due to antivirus programs that can hold files open and prevent + deletion. .. function:: rmtree(path) Call :func:`shutil.rmtree` on *path* or call :func:`os.lstat` and - :func:`os.rmdir` to remove a path and its contents. On Windows platforms, + :func:`os.rmdir` to remove a path and its contents. As with :func:`rmdir`, + on Windows platforms this is wrapped with a wait loop that checks for the existence of the files. @@ -1459,7 +1538,8 @@ The :mod:`test.support.os_helper` module provides support for os tests. .. function:: unlink(filename) - Call :func:`os.unlink` on *filename*. On Windows platforms, this is + Call :func:`os.unlink` on *filename*. As with :func:`rmdir`, + on Windows platforms, this is wrapped with a wait loop that checks for the existence of the file. @@ -1516,7 +1596,7 @@ The :mod:`test.support.import_helper` module provides support for import tests. .. versionadded:: 3.1 -.. function:: import_module(name, deprecated=False, *, required_on()) +.. function:: import_module(name, deprecated=False, *, required_on=()) This function imports and returns the named module. Unlike a normal import, this function raises :exc:`unittest.SkipTest` if the module @@ -1558,7 +1638,7 @@ The :mod:`test.support.import_helper` module provides support for import tests. A context manager to force import to return a new module reference. This is useful for testing module-level behaviors, such as the emission of a - DeprecationWarning on import. Example usage:: + :exc:`DeprecationWarning` on import. Example usage:: with CleanImport('foo'): importlib.import_module('foo') # New reference. @@ -1566,7 +1646,7 @@ The :mod:`test.support.import_helper` module provides support for import tests. .. class:: DirsOnSysPath(*paths) - A context manager to temporarily add directories to sys.path. + A context manager to temporarily add directories to :data:`sys.path`. 
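   A minimal usage sketch (the directory name is hypothetical)::

      import sys
      from test.support.import_helper import DirsOnSysPath

      with DirsOnSysPath('/tmp/extra-modules'):
          assert '/tmp/extra-modules' in sys.path   # temporarily visible to imports
      # sys.path has been restored at this point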
This makes a copy of :data:`sys.path`, appends any directories given as positional arguments, then reverts :data:`sys.path` to the copied diff --git a/Doc/library/threading.rst b/Doc/library/threading.rst index b7775609616908..58e4ad786fe17f 100644 --- a/Doc/library/threading.rst +++ b/Doc/library/threading.rst @@ -114,7 +114,7 @@ This module defines the following functions: Its value may be used to uniquely identify this particular thread system-wide (until the thread terminates, after which the value may be recycled by the OS). - .. availability:: Windows, FreeBSD, Linux, macOS, OpenBSD, NetBSD, AIX. + .. availability:: Windows, FreeBSD, Linux, macOS, OpenBSD, NetBSD, AIX, DragonFlyBSD. .. versionadded:: 3.8 @@ -293,7 +293,7 @@ There is the possibility that "dummy thread objects" are created. These are thread objects corresponding to "alien threads", which are threads of control started outside the threading module, such as directly from C code. Dummy thread objects have limited functionality; they are always considered alive and -daemonic, and cannot be :meth:`~Thread.join`\ ed. They are never deleted, +daemonic, and cannot be :ref:`joined `. They are never deleted, since it is impossible to detect the termination of alien threads. @@ -366,6 +366,8 @@ since it is impossible to detect the termination of alien threads. >>> t.run() 1 + .. _meth-thread-join: + .. method:: join(timeout=None) Wait until the thread terminates. This blocks the calling thread until @@ -383,7 +385,7 @@ since it is impossible to detect the termination of alien threads. When the *timeout* argument is not present or ``None``, the operation will block until the thread terminates. - A thread can be :meth:`~Thread.join`\ ed many times. + A thread can be joined many times. :meth:`~Thread.join` raises a :exc:`RuntimeError` if an attempt is made to join the current thread as that would cause a deadlock. It is also diff --git a/Doc/library/tkinter.rst b/Doc/library/tkinter.rst index 37b6a02a31710e..096a343bd95589 100644 --- a/Doc/library/tkinter.rst +++ b/Doc/library/tkinter.rst @@ -877,8 +877,9 @@ of the bind method is:: where: sequence - is a string that denotes the target kind of event. (See the bind man page and - page 201 of John Ousterhout's book for details). + is a string that denotes the target kind of event. (See the + :manpage:`bind(3tk)` man page, and page 201 of John Ousterhout's book, + :title-reference:`Tcl and the Tk Toolkit (2nd edition)`, for details). func is a Python function, taking one argument, to be invoked when the event occurs. diff --git a/Doc/library/typing.rst b/Doc/library/typing.rst index 29bcedaddce9dd..c7c5536a64fb41 100644 --- a/Doc/library/typing.rst +++ b/Doc/library/typing.rst @@ -78,6 +78,8 @@ annotations. These include: *Introducing* :data:`TypeVarTuple` * :pep:`647`: User-Defined Type Guards *Introducing* :data:`TypeGuard` +* :pep:`655`: Marking individual TypedDict items as required or potentially-missing + *Introducing* :data:`Required` and :data:`NotRequired` * :pep:`673`: Self type *Introducing* :data:`Self` * :pep:`675`: Arbitrary Literal String Type @@ -1022,6 +1024,18 @@ These can be used as types in annotations using ``[]``, each having a unique syn .. versionadded:: 3.8 +.. data:: Required + +.. data:: NotRequired + + Special typing constructs that mark individual keys of a :class:`TypedDict` + as either required or non-required respectively. 
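   For example, a minimal sketch (assuming Python 3.11, where these constructs
   are available; ``Movie`` is a hypothetical ``TypedDict``)::

      from typing import TypedDict, Required, NotRequired

      class Movie(TypedDict):
          title: Required[str]     # explicit, although keys are required by default
          year: NotRequired[int]   # this key may be omitted

      m: Movie = {'title': 'Blade Runner'}   # fine for a type checker: "year" is optional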
+ + For more information, see :class:`TypedDict` and + :pep:`655` ("Marking individual TypedDict items as required or potentially-missing"). + + .. versionadded:: 3.11 + .. data:: Annotated A type, introduced in :pep:`593` (``Flexible function and variable @@ -1706,8 +1720,21 @@ These are not used in annotations. They are building blocks for declaring types. Point2D = TypedDict('Point2D', {'in': int, 'x-y': int}) By default, all keys must be present in a ``TypedDict``. It is possible to - override this by specifying totality. - Usage:: + mark individual keys as non-required using :data:`NotRequired`:: + + class Point2D(TypedDict): + x: int + y: int + label: NotRequired[str] + + # Alternative syntax + Point2D = TypedDict('Point2D', {'x': int, 'y': int, 'label': NotRequired[str]}) + + This means that a ``Point2D`` ``TypedDict`` can have the ``label`` + key omitted. + + It is also possible to mark all keys as non-required by default + by specifying a totality of ``False``:: class Point2D(TypedDict, total=False): x: int @@ -1721,6 +1748,21 @@ These are not used in annotations. They are building blocks for declaring types. ``True`` as the value of the ``total`` argument. ``True`` is the default, and makes all items defined in the class body required. + Individual keys of a ``total=False`` ``TypedDict`` can be marked as + required using :data:`Required`:: + + class Point2D(TypedDict, total=False): + x: Required[int] + y: Required[int] + label: str + + # Alternative syntax + Point2D = TypedDict('Point2D', { + 'x': Required[int], + 'y': Required[int], + 'label': str + }, total=False) + It is possible for a ``TypedDict`` type to inherit from one or more other ``TypedDict`` types using the class-based syntax. Usage:: @@ -1785,11 +1827,16 @@ These are not used in annotations. They are building blocks for declaring types. ``Point2D.__required_keys__`` and ``Point2D.__optional_keys__`` return :class:`frozenset` objects containing required and non-required keys, respectively. - Currently the only way to declare both required and non-required keys in the - same ``TypedDict`` is mixed inheritance, declaring a ``TypedDict`` with one value - for the ``total`` argument and then inheriting it from another ``TypedDict`` with - a different value for ``total``. - Usage:: + + Keys marked with :data:`Required` will always appear in ``__required_keys__`` + and keys marked with :data:`NotRequired` will always appear in ``__optional_keys__``. + + For backwards compatibility with Python 3.10 and below, + it is also possible to use inheritance to declare both required and + non-required keys in the same ``TypedDict`` . This is done by declaring a + ``TypedDict`` with one value for the ``total`` argument and then + inheriting from it in another ``TypedDict`` with a different value for + ``total``:: >>> class Point2D(TypedDict, total=False): ... x: int @@ -1807,6 +1854,10 @@ These are not used in annotations. They are building blocks for declaring types. .. versionadded:: 3.8 + .. versionchanged:: 3.11 + Added support for marking individual keys as :data:`Required` or :data:`NotRequired`. + See :pep:`655`. + .. versionchanged:: 3.11 Added support for generic ``TypedDict``\ s. @@ -2429,6 +2480,75 @@ Functions and decorators .. versionadded:: 3.11 +.. decorator:: dataclass_transform + + :data:`~typing.dataclass_transform` may be used to + decorate a class, metaclass, or a function that is itself a decorator. 
+ The presence of ``@dataclass_transform()`` tells a static type checker that the + decorated object performs runtime "magic" that + transforms a class, giving it :func:`dataclasses.dataclass`-like behaviors. + + Example usage with a decorator function:: + + T = TypeVar("T") + + @dataclass_transform() + def create_model(cls: type[T]) -> type[T]: + ... + return cls + + @create_model + class CustomerModel: + id: int + name: str + + On a base class:: + + @dataclass_transform() + class ModelBase: ... + + class CustomerModel(ModelBase): + id: int + name: str + + On a metaclass:: + + @dataclass_transform() + class ModelMeta(type): ... + + class ModelBase(metaclass=ModelMeta): ... + + class CustomerModel(ModelBase): + id: int + name: str + + The ``CustomerModel`` classes defined above will + be treated by type checkers similarly to classes created with + :func:`@dataclasses.dataclass `. + For example, type checkers will assume these classes have + ``__init__`` methods that accept ``id`` and ``name``. + + The arguments to this decorator can be used to customize this behavior: + + * ``eq_default`` indicates whether the ``eq`` parameter is assumed to be + ``True`` or ``False`` if it is omitted by the caller. + * ``order_default`` indicates whether the ``order`` parameter is + assumed to be True or False if it is omitted by the caller. + * ``kw_only_default`` indicates whether the ``kw_only`` parameter is + assumed to be True or False if it is omitted by the caller. + * ``field_specifiers`` specifies a static list of supported classes + or functions that describe fields, similar to ``dataclasses.field()``. + * Arbitrary other keyword arguments are accepted in order to allow for + possible future extensions. + + At runtime, this decorator records its arguments in the + ``__dataclass_transform__`` attribute on the decorated object. + It has no other runtime effect. + + See :pep:`681` for more details. + + .. versionadded:: 3.11 + .. decorator:: overload The ``@overload`` decorator allows describing functions and methods diff --git a/Doc/library/unittest.mock.rst b/Doc/library/unittest.mock.rst index acc0d67541ae82..b768557e6075f6 100644 --- a/Doc/library/unittest.mock.rst +++ b/Doc/library/unittest.mock.rst @@ -1944,7 +1944,7 @@ Both patch_ and patch.object_ correctly patch and restore descriptors: class methods, static methods and properties. You should patch these on the *class* rather than an instance. They also work with *some* objects that proxy attribute access, like the `django settings object -`_. +`_. MagicMock and magic method support diff --git a/Doc/library/uu.rst b/Doc/library/uu.rst index 026ec415c9da65..83c4aec47bbefc 100644 --- a/Doc/library/uu.rst +++ b/Doc/library/uu.rst @@ -9,7 +9,7 @@ **Source code:** :source:`Lib/uu.py` -.. deprecated:: 3.11 +.. deprecated-removed:: 3.11 3.13 The :mod:`uu` module is deprecated (see :pep:`PEP 594 <594#uu-and-the-uu-encoding>` for details). :mod:`base64` is a modern alternative. diff --git a/Doc/library/warnings.rst b/Doc/library/warnings.rst index f7a1f70833b7f5..43708f8021ad24 100644 --- a/Doc/library/warnings.rst +++ b/Doc/library/warnings.rst @@ -154,14 +154,19 @@ the disposition of the match. Each entry is a tuple of the form (*action*, +---------------+----------------------------------------------+ * *message* is a string containing a regular expression that the start of - the warning message must match. The expression is compiled to always be - case-insensitive. + the warning message must match, case-insensitively. 
In :option:`-W` and + :envvar:`PYTHONWARNINGS`, *message* is a literal string that the start of the + warning message must contain (case-insensitively), ignoring any whitespace at + the start or end of *message*. * *category* is a class (a subclass of :exc:`Warning`) of which the warning category must be a subclass in order to match. -* *module* is a string containing a regular expression that the module name must - match. The expression is compiled to be case-sensitive. +* *module* is a string containing a regular expression that the start of the + fully-qualified module name must match, case-sensitively. In :option:`-W` and + :envvar:`PYTHONWARNINGS`, *module* is a literal string that the + fully-qualified module name must be equal to (case-sensitively), ignoring any + whitespace at the start or end of *module*. * *lineno* is an integer that the line number where the warning occurred must match, or ``0`` to match all line numbers. @@ -207,8 +212,7 @@ Some examples:: error::ResourceWarning # Treat ResourceWarning messages as errors default::DeprecationWarning # Show DeprecationWarning messages ignore,default:::mymodule # Only report warnings triggered by "mymodule" - error:::mymodule[.*] # Convert warnings to errors in "mymodule" - # and any subpackages of "mymodule" + error:::mymodule # Convert warnings to errors in "mymodule" .. _default-warning-filter: diff --git a/Doc/library/weakref.rst b/Doc/library/weakref.rst index 1102c634edaf32..8397de4fb488f1 100644 --- a/Doc/library/weakref.rst +++ b/Doc/library/weakref.rst @@ -1,3 +1,5 @@ +.. _mod-weakref: + :mod:`weakref` --- Weak references ================================== diff --git a/Doc/library/xdrlib.rst b/Doc/library/xdrlib.rst index a3124a986524bd..39e75573260c50 100644 --- a/Doc/library/xdrlib.rst +++ b/Doc/library/xdrlib.rst @@ -11,7 +11,7 @@ single: XDR single: External Data Representation -.. deprecated:: 3.11 +.. deprecated-removed:: 3.11 3.13 The :mod:`xdrlib` module is deprecated (see :pep:`PEP 594 <594#xdrlib>` for details). diff --git a/Doc/library/xml.dom.minidom.rst b/Doc/library/xml.dom.minidom.rst index 20984b98b1778c..076cf34694ce4e 100644 --- a/Doc/library/xml.dom.minidom.rst +++ b/Doc/library/xml.dom.minidom.rst @@ -180,7 +180,7 @@ module documentation. This section lists the differences between the API and .. versionchanged:: 3.9 The *standalone* parameter was added. -.. method:: Node.toprettyxml(indent="\\t", newl="\\n", encoding=None, \ +.. method:: Node.toprettyxml(indent="\t", newl="\n", encoding=None, \ standalone=None) Return a pretty-printed version of the document. *indent* specifies the diff --git a/Doc/library/xml.etree.elementtree.rst b/Doc/library/xml.etree.elementtree.rst index e3932bc9e659fe..2fe0d2e082fb3a 100644 --- a/Doc/library/xml.etree.elementtree.rst +++ b/Doc/library/xml.etree.elementtree.rst @@ -826,6 +826,7 @@ Functions ^^^^^^^^^ .. function:: xml.etree.ElementInclude.default_loader( href, parse, encoding=None) + :module: Default loader. This default loader reads an included resource from disk. *href* is a URL. *parse* is for parse mode either "xml" or "text". *encoding* @@ -837,6 +838,7 @@ Functions .. function:: xml.etree.ElementInclude.include( elem, loader=None, base_url=None, \ max_depth=6) + :module: This function expands XInclude directives. *elem* is the root element. *loader* is an optional resource loader. If omitted, it defaults to :func:`default_loader`. 
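   A minimal usage sketch (``document.xml`` is a hypothetical file containing
   XInclude directives)::

      from xml.etree import ElementTree, ElementInclude

      tree = ElementTree.parse('document.xml')
      root = tree.getroot()
      ElementInclude.include(root)   # expands XInclude directives in place
      print(ElementTree.tostring(root, encoding='unicode'))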
diff --git a/Doc/library/zipfile.rst b/Doc/library/zipfile.rst index 19e128ce02f54b..4dd9fa961a8d98 100644 --- a/Doc/library/zipfile.rst +++ b/Doc/library/zipfile.rst @@ -139,7 +139,7 @@ ZipFile Objects .. class:: ZipFile(file, mode='r', compression=ZIP_STORED, allowZip64=True, \ - compresslevel=None, *, strict_timestamps=True, + compresslevel=None, *, strict_timestamps=True, \ metadata_encoding=None) Open a ZIP file, where *file* can be a path to a file (a string), a diff --git a/Doc/license.rst b/Doc/license.rst index e0ca5f2662dc1c..e0276b49243770 100644 --- a/Doc/license.rst +++ b/Doc/license.rst @@ -626,9 +626,9 @@ strtod and dtoa The file :file:`Python/dtoa.c`, which supplies C functions dtoa and strtod for conversion of C doubles to and from strings, is derived from the file of the same name by David M. Gay, currently available -from http://www.netlib.org/fp/. The original file, as retrieved on -March 16, 2009, contains the following copyright and licensing -notice:: +from https://web.archive.org/web/20220517033456/http://www.netlib.org/fp/dtoa.c. +The original file, as retrieved on March 16, 2009, contains the following +copyright and licensing notice:: /**************************************************************** * diff --git a/Doc/make.bat b/Doc/make.bat index d9a7aa4ca7fa6c..4f0b3c11f4facb 100644 --- a/Doc/make.bat +++ b/Doc/make.bat @@ -180,7 +180,10 @@ if EXIST "%BUILDDIR%\html\index.html" ( goto end :check -cmd /S /C "%SPHINXLINT% -i tools" +rem Check the docs and NEWS files with sphinx-lint. +rem Ignore the tools dir and check that the default role is not used. +cmd /S /C "%SPHINXLINT% -i tools --enable default-role" +cmd /S /C "%SPHINXLINT% --enable default-role ..\Misc\NEWS.d\next\ " goto end :serve diff --git a/Doc/reference/expressions.rst b/Doc/reference/expressions.rst index b914c48d3d4cd5..6bf21a7dde49a0 100644 --- a/Doc/reference/expressions.rst +++ b/Doc/reference/expressions.rst @@ -573,7 +573,7 @@ is already executing raises a :exc:`ValueError` exception. In typical use, this is called with a single exception instance similar to the way the :keyword:`raise` keyword is used. - For backwards compatability, however, the second signature is + For backwards compatibility, however, the second signature is supported, following a convention from older versions of Python. The *type* argument should be an exception class, and *value* should be an exception instance. 
If the *value* is not provided, the @@ -1898,7 +1898,7 @@ precedence and have a left-to-right chaining feature as described in the | ``x[index]``, ``x[index:index]``, | Subscription, slicing, | | ``x(arguments...)``, ``x.attribute`` | call, attribute reference | +-----------------------------------------------+-------------------------------------+ -| :keyword:`await` ``x`` | Await expression | +| :keyword:`await x ` | Await expression | +-----------------------------------------------+-------------------------------------+ | ``**`` | Exponentiation [#]_ | +-----------------------------------------------+-------------------------------------+ @@ -1922,7 +1922,7 @@ precedence and have a left-to-right chaining feature as described in the | :keyword:`is`, :keyword:`is not`, ``<``, | tests and identity tests | | ``<=``, ``>``, ``>=``, ``!=``, ``==`` | | +-----------------------------------------------+-------------------------------------+ -| :keyword:`not` ``x`` | Boolean NOT | +| :keyword:`not x ` | Boolean NOT | +-----------------------------------------------+-------------------------------------+ | :keyword:`and` | Boolean AND | +-----------------------------------------------+-------------------------------------+ diff --git a/Doc/tools/extensions/asdl_highlight.py b/Doc/tools/extensions/asdl_highlight.py index b1989e53957072..42863a4b3bcd6a 100644 --- a/Doc/tools/extensions/asdl_highlight.py +++ b/Doc/tools/extensions/asdl_highlight.py @@ -1,4 +1,3 @@ -import os import sys from pathlib import Path @@ -6,7 +5,7 @@ sys.path.append(str(CPYTHON_ROOT / "Parser")) from pygments.lexer import RegexLexer, bygroups, include, words -from pygments.token import (Comment, Generic, Keyword, Name, Operator, +from pygments.token import (Comment, Keyword, Name, Operator, Punctuation, Text) from asdl import builtin_types diff --git a/Doc/tools/extensions/peg_highlight.py b/Doc/tools/extensions/peg_highlight.py index 27f54cdf593c87..4bdc2ee1861334 100644 --- a/Doc/tools/extensions/peg_highlight.py +++ b/Doc/tools/extensions/peg_highlight.py @@ -1,5 +1,5 @@ from pygments.lexer import RegexLexer, bygroups, include -from pygments.token import Comment, Generic, Keyword, Name, Operator, Punctuation, Text +from pygments.token import Comment, Keyword, Name, Operator, Punctuation, Text from sphinx.highlighting import lexers diff --git a/Doc/tools/extensions/pyspecific.py b/Doc/tools/extensions/pyspecific.py index 2e122b5607936e..8ac0028cc4f412 100644 --- a/Doc/tools/extensions/pyspecific.py +++ b/Doc/tools/extensions/pyspecific.py @@ -30,7 +30,6 @@ from sphinx.util import status_iterator, logging from sphinx.util.nodes import split_explicit_title from sphinx.writers.text import TextWriter, TextTranslator -from sphinx.writers.latex import LaTeXTranslator try: from sphinx.domains.python import PyFunction, PyMethod @@ -418,12 +417,7 @@ def run(self): translatable=False) node.append(para) env = self.state.document.settings.env - # deprecated pre-Sphinx-2 method - if hasattr(env, 'note_versionchange'): - env.note_versionchange('deprecated', version[0], node, self.lineno) - # new method - else: - env.get_domain('changeset').note_changeset(node) + env.get_domain('changeset').note_changeset(node) return [node] + messages diff --git a/Doc/tools/extensions/suspicious.py b/Doc/tools/extensions/suspicious.py index c3de4d79c83f87..2d581a8a6c3db6 100644 --- a/Doc/tools/extensions/suspicious.py +++ b/Doc/tools/extensions/suspicious.py @@ -44,7 +44,6 @@ import os import re import csv -import sys from docutils import nodes from 
sphinx.builders import Builder @@ -55,9 +54,7 @@ :[a-zA-Z][a-zA-Z0-9]+| # :foo `| # ` (seldom used by itself) (?= (3, 0) + ''', re.VERBOSE).finditer class Rule: @@ -152,32 +149,15 @@ def is_ignored(self, line, lineno, issue): def report_issue(self, text, lineno, issue): self.any_issue = True self.write_log_entry(lineno, issue, text) - if py3: - self.logger.warning('[%s:%d] "%s" found in "%-.120s"' % + self.logger.warning('[%s:%d] "%s" found in "%-.120s"' % (self.docname, lineno, issue, text)) - else: - self.logger.warning( - '[%s:%d] "%s" found in "%-.120s"' % ( - self.docname.encode(sys.getdefaultencoding(),'replace'), - lineno, - issue.encode(sys.getdefaultencoding(),'replace'), - text.strip().encode(sys.getdefaultencoding(),'replace'))) self.app.statuscode = 1 def write_log_entry(self, lineno, issue, text): - if py3: - f = open(self.log_file_name, 'a') - writer = csv.writer(f, dialect) - writer.writerow([self.docname, lineno, issue, text.strip()]) - f.close() - else: - f = open(self.log_file_name, 'ab') - writer = csv.writer(f, dialect) - writer.writerow([self.docname.encode('utf-8'), - lineno, - issue.encode('utf-8'), - text.strip().encode('utf-8')]) - f.close() + f = open(self.log_file_name, 'a') + writer = csv.writer(f, dialect) + writer.writerow([self.docname, lineno, issue, text.strip()]) + f.close() def load_rules(self, filename): """Load database of previously ignored issues. @@ -188,10 +168,7 @@ def load_rules(self, filename): self.logger.info("loading ignore rules... ", nonl=1) self.rules = rules = [] try: - if py3: - f = open(filename, 'r') - else: - f = open(filename, 'rb') + f = open(filename, 'r') except IOError: return for i, row in enumerate(csv.reader(f)): @@ -203,10 +180,6 @@ def load_rules(self, filename): lineno = int(lineno) else: lineno = None - if not py3: - docname = docname.decode('utf-8') - issue = issue.decode('utf-8') - text = text.decode('utf-8') rule = Rule(docname, lineno, issue, text) rules.append(rule) f.close() diff --git a/Doc/tools/rstlint.py b/Doc/tools/rstlint.py index d1c53dcb1a698e..4ea68ef3b030c8 100644 --- a/Doc/tools/rstlint.py +++ b/Doc/tools/rstlint.py @@ -130,7 +130,7 @@ # Find role glued with another word like: # the:c:func:`PyThreadState_LeaveTracing` function. -# instad of: +# instead of: # the :c:func:`PyThreadState_LeaveTracing` function. role_glued_with_word = re.compile(r"[a-zA-Z]%s" % all_roles) diff --git a/Doc/tools/susp-ignored.csv b/Doc/tools/susp-ignored.csv index 932c3800c9fa19..29d65cd7207fe7 100644 --- a/Doc/tools/susp-ignored.csv +++ b/Doc/tools/susp-ignored.csv @@ -172,6 +172,8 @@ library/itertools,,:step,elements from seq[start:stop:step] library/itertools,,::,kernel = tuple(kernel)[::-1] library/itertools,,:stop,elements from seq[start:stop:step] library/logging.handlers,,:port,host:port +library/logging,,:root,WARNING:root:Watch out! +library/logging,,:Watch,WARNING:root:Watch out! library/mmap,,:i2,obj[i1:i2] library/multiprocessing,,`,# Add more tasks using `put()` library/multiprocessing,,:queue,">>> QueueManager.register('get_queue', callable=lambda:queue)" diff --git a/Doc/tools/templates/download.html b/Doc/tools/templates/download.html index 2456625043416c..7920e0619f9337 100644 --- a/Doc/tools/templates/download.html +++ b/Doc/tools/templates/download.html @@ -3,6 +3,10 @@ {% if daily is defined %} {% set dlbase = pathto('archives', 1) %} {% else %} + {# + The link below returns HTTP 404 until the first related alpha release. + This is expected; use daily documentation builds for CPython development. 
+ #} {% set dlbase = 'https://docs.python.org/ftp/python/doc/' + release %} {% endif %} diff --git a/Doc/tutorial/classes.rst b/Doc/tutorial/classes.rst index f44cb0b4e905a9..58b06eb5f25356 100644 --- a/Doc/tutorial/classes.rst +++ b/Doc/tutorial/classes.rst @@ -479,9 +479,9 @@ If the same attribute name occurs in both an instance and in a class, then attribute lookup prioritizes the instance:: >>> class Warehouse: - purpose = 'storage' - region = 'west' - + ... purpose = 'storage' + ... region = 'west' + ... >>> w1 = Warehouse() >>> print(w1.purpose, w1.region) storage west diff --git a/Doc/tutorial/controlflow.rst b/Doc/tutorial/controlflow.rst index f6e013b23e7e58..99a77e7addd774 100644 --- a/Doc/tutorial/controlflow.rst +++ b/Doc/tutorial/controlflow.rst @@ -253,8 +253,10 @@ at a more abstract level. The :keyword:`!pass` is silently ignored:: A :keyword:`match` statement takes an expression and compares its value to successive patterns given as one or more case blocks. This is superficially similar to a switch statement in C, Java or JavaScript (and many -other languages), but it can also extract components (sequence elements or -object attributes) from the value into variables. +other languages), but it's more similar to pattern matching in +languages like Rust or Haskell. Only the first pattern that matches +gets executed and it can also extract components (sequence elements +or object attributes) from the value into variables. The simplest form compares a subject value against one or more literals:: diff --git a/Doc/tutorial/inputoutput.rst b/Doc/tutorial/inputoutput.rst index b50063654e2628..1f1ef28e5cad40 100644 --- a/Doc/tutorial/inputoutput.rst +++ b/Doc/tutorial/inputoutput.rst @@ -179,7 +179,7 @@ square brackets ``'[]'`` to access the keys. :: ... 'Dcab: {0[Dcab]:d}'.format(table)) Jack: 4098; Sjoerd: 4127; Dcab: 8637678 -This could also be done by passing the table as keyword arguments with the '**' +This could also be done by passing the ``table`` dictionary as keyword arguments with the ``**`` notation. :: >>> table = {'Sjoerd': 4127, 'Jack': 4098, 'Dcab': 8637678} diff --git a/Doc/tutorial/whatnow.rst b/Doc/tutorial/whatnow.rst index 18805da90e5001..5f7010cb12e9e7 100644 --- a/Doc/tutorial/whatnow.rst +++ b/Doc/tutorial/whatnow.rst @@ -31,10 +31,7 @@ the set are: More Python resources: * https://www.python.org: The major Python web site. It contains code, - documentation, and pointers to Python-related pages around the web. This web - site is mirrored in various places around the world, such as Europe, Japan, and - Australia; a mirror may be faster than the main site, depending on your - geographical location. + documentation, and pointers to Python-related pages around the web. * https://docs.python.org: Fast access to Python's documentation. @@ -48,7 +45,7 @@ More Python resources: Particularly notable contributions are collected in a book also titled Python Cookbook (O'Reilly & Associates, ISBN 0-596-00797-3.) -* http://www.pyvideo.org collects links to Python-related videos from +* https://pyvideo.org collects links to Python-related videos from conferences and user-group meetings. * https://scipy.org: The Scientific Python project includes modules for fast diff --git a/Doc/using/cmdline.rst b/Doc/using/cmdline.rst index bc54ed8691825f..d9f6afb3760bd5 100644 --- a/Doc/using/cmdline.rst +++ b/Doc/using/cmdline.rst @@ -183,6 +183,8 @@ automatically enabled, if available on your platform (see Automatic enabling of tab-completion and history editing. +.. 
_using-on-generic-options: + Generic options ~~~~~~~~~~~~~~~ @@ -190,8 +192,28 @@ Generic options -h --help - Print a short description of all command line options. + Print a short description of all command line options and corresponding + environment variables and exit. + +.. cmdoption:: --help-env + + Print a short description of Python-specific environment variables + and exit. + + .. versionadded:: 3.11 + +.. cmdoption:: --help-xoptions + + Print a description of implementation-specific :option:`-X` options + and exit. + + .. versionadded:: 3.11 +.. cmdoption:: --help-all + + Print complete usage information and exit. + + .. versionadded:: 3.11 .. cmdoption:: -V --version @@ -212,6 +234,7 @@ Generic options .. versionadded:: 3.6 The ``-VV`` option. + .. _using-on-misc-options: Miscellaneous options @@ -248,8 +271,11 @@ Miscellaneous options .. cmdoption:: -d - Turn on parser debugging output (for expert only, depending on compilation - options). See also :envvar:`PYTHONDEBUG`. + Turn on parser debugging output (for expert only). + See also the :envvar:`PYTHONDEBUG` environment variable. + + This option requires a :ref:`debug build of Python `, otherwise + it's ignored. .. cmdoption:: -E @@ -457,6 +483,7 @@ Miscellaneous options See :ref:`warning-filter` and :ref:`describing-warning-filters` for more details. + .. cmdoption:: -x Skip the first line of the source, allowing use of non-Unix forms of @@ -550,6 +577,7 @@ Miscellaneous options The ``-X frozen_modules`` option. + Options you shouldn't use ~~~~~~~~~~~~~~~~~~~~~~~~~ @@ -660,6 +688,9 @@ conflict. :option:`-d` option. If set to an integer, it is equivalent to specifying :option:`-d` multiple times. + This environment variable requires a :ref:`debug build of Python + `, otherwise it's ignored. + .. envvar:: PYTHONINSPECT diff --git a/Doc/using/configure.rst b/Doc/using/configure.rst index d61647f5ea71ea..80c0dc08dae732 100644 --- a/Doc/using/configure.rst +++ b/Doc/using/configure.rst @@ -278,6 +278,8 @@ Effects of a debug build: * Add ``d`` to :data:`sys.abiflags`. * Add :func:`sys.gettotalrefcount` function. * Add :option:`-X showrefcount <-X>` command line option. +* Add :option:`-d` command line option and :envvar:`PYTHONDEBUG` environment + variable to debug the parser. * Add support for the ``__lltrace__`` variable: enable low-level tracing in the bytecode evaluation loop if the variable is defined. * Install :ref:`debug hooks on memory allocators ` @@ -747,6 +749,17 @@ Compiler flags extensions. Use it when a compiler flag should *not* be part of the distutils :envvar:`CFLAGS` once Python is installed (:issue:`21121`). + In particular, :envvar:`CFLAGS` should not contain: + + * the compiler flag `-I` (for setting the search path for include files). + The `-I` flags are processed from left to right, and any flags in + :envvar:`CFLAGS` would take precedence over user- and package-supplied `-I` + flags. + + * hardening flags such as `-Werror` because distributions cannot control + whether packages installed by users conform to such heightened + standards. + .. versionadded:: 3.5 .. envvar:: EXTRA_CFLAGS @@ -859,6 +872,13 @@ Linker flags :envvar:`CFLAGS_NODIST`. Use it when a linker flag should *not* be part of the distutils :envvar:`LDFLAGS` once Python is installed (:issue:`35257`). + In particular, :envvar:`LDFLAGS` should not contain: + + * the compiler flag `-L` (for setting the search path for libraries). 
+ The `-L` flags are processed from left to right, and any flags in + :envvar:`LDFLAGS` would take precedence over user- and package-supplied `-L` + flags. + .. envvar:: CONFIGURE_LDFLAGS_NODIST Value of :envvar:`LDFLAGS_NODIST` variable passed to the ``./configure`` diff --git a/Doc/using/unix.rst b/Doc/using/unix.rst index 0a1834453a0ee8..3f9f1364c8ae87 100644 --- a/Doc/using/unix.rst +++ b/Doc/using/unix.rst @@ -69,7 +69,7 @@ Building Python If you want to compile CPython yourself, first thing you should do is get the `source `_. You can download either the latest release's source or just grab a fresh `clone -`_. (If you want +`_. (If you want to contribute patches, you will need a clone.) The build process consists of the usual commands:: diff --git a/Doc/using/venv-create.inc b/Doc/using/venv-create.inc index ddb36f94667d9f..b9785832864e22 100644 --- a/Doc/using/venv-create.inc +++ b/Doc/using/venv-create.inc @@ -17,7 +17,7 @@ re-used. .. deprecated:: 3.6 ``pyvenv`` was the recommended tool for creating virtual environments for Python 3.3 and 3.4, and is `deprecated in Python 3.6 - `_. + `_. .. versionchanged:: 3.5 The use of ``venv`` is now recommended for creating virtual environments. diff --git a/Doc/using/windows.rst b/Doc/using/windows.rst index dcc533726813ca..d7e0a5e13825a5 100644 --- a/Doc/using/windows.rst +++ b/Doc/using/windows.rst @@ -513,9 +513,11 @@ key features: Popular scientific modules (such as numpy, scipy and pandas) and the ``conda`` package manager. -`Canopy `_ - A "comprehensive Python analysis environment" with editors and other - development tools. +`Enthought Deployment Manager `_ + "The Next Generation Python Environment and Package Manager". + + Previously Enthought provided Canopy, but it `reached end of life in 2016 + `_. `WinPython `_ Windows-specific distribution with prebuilt scientific packages and @@ -1170,7 +1172,7 @@ Compiling Python on Windows If you want to compile CPython yourself, first thing you should do is get the `source `_. You can download either the latest release's source or just grab a fresh `checkout -`_. +`_. The source tree contains a build solution and project files for Microsoft Visual Studio, which is the compiler used to build the official Python diff --git a/Doc/whatsnew/2.1.rst b/Doc/whatsnew/2.1.rst index b690f90cf6636b..0136de58774038 100644 --- a/Doc/whatsnew/2.1.rst +++ b/Doc/whatsnew/2.1.rst @@ -542,8 +542,11 @@ PEP 241: Metadata in Python Packages A common complaint from Python users is that there's no single catalog of all the Python modules in existence. T. Middleton's Vaults of Parnassus at -http://www.vex.net/parnassus/ are the largest catalog of Python modules, but -registering software at the Vaults is optional, and many people don't bother. +``www.vex.net/parnassus/`` (retired in February 2009, `available in the +Internet Archive Wayback Machine +`_) +was the largest catalog of Python modules, but +registering software at the Vaults is optional, and many people did not bother. As a first small step toward fixing the problem, Python software packaged using the Distutils :command:`sdist` command will include a file named diff --git a/Doc/whatsnew/2.5.rst b/Doc/whatsnew/2.5.rst index 4e85abaea75535..597eaf51fb68d4 100644 --- a/Doc/whatsnew/2.5.rst +++ b/Doc/whatsnew/2.5.rst @@ -11,7 +11,7 @@ This article explains the new features in Python 2.5. The final release of Python 2.5 is scheduled for August 2006; :pep:`356` describes the planned -release schedule. +release schedule. 
Python 2.5 was released on September 19, 2006. The changes in Python 2.5 are an interesting mix of language and library improvements. The library enhancements will be more important to Python's user @@ -551,7 +551,7 @@ exhausted. https://en.wikipedia.org/wiki/Coroutine The Wikipedia entry for coroutines. - http://www.sidhe.org/~dan/blog/archives/000178.html + https://web.archive.org/web/20160321211320/http://www.sidhe.org/~dan/blog/archives/000178.html An explanation of coroutines from a Perl point of view, written by Dan Sugalski. .. ====================================================================== @@ -1746,8 +1746,8 @@ modules, now that :mod:`ctypes` is included with core Python. .. seealso:: - http://starship.python.net/crew/theller/ctypes/ - The ctypes web page, with a tutorial, reference, and FAQ. + https://web.archive.org/web/20180410025338/http://starship.python.net/crew/theller/ctypes/ + The pre-stdlib ctypes web page, with a tutorial, reference, and FAQ. The documentation for the :mod:`ctypes` module. @@ -1767,7 +1767,7 @@ included. The rest of this section will provide a brief overview of using ElementTree. Full documentation for ElementTree is available at -http://effbot.org/zone/element-index.htm. +https://web.archive.org/web/20201124024954/http://effbot.org/zone/element-index.htm. ElementTree represents an XML document as a tree of element nodes. The text content of the document is stored as the :attr:`text` and :attr:`tail` @@ -1865,7 +1865,7 @@ read the package's official documentation for more details. .. seealso:: - http://effbot.org/zone/element-index.htm + https://web.archive.org/web/20201124024954/http://effbot.org/zone/element-index.htm Official documentation for ElementTree. .. ====================================================================== @@ -2065,7 +2065,7 @@ up a server takes only a few lines of code:: .. seealso:: - http://www.wsgi.org + https://web.archive.org/web/20160331090247/http://wsgi.readthedocs.org/en/latest/ A central web site for WSGI-related resources. :pep:`333` - Python Web Server Gateway Interface v1.0 diff --git a/Doc/whatsnew/2.6.rst b/Doc/whatsnew/2.6.rst index b6174a19a178b6..0dbb5e50d27d75 100644 --- a/Doc/whatsnew/2.6.rst +++ b/Doc/whatsnew/2.6.rst @@ -49,7 +49,7 @@ This saves the maintainer some effort going through the SVN logs when researching a change. -This article explains the new features in Python 2.6, released on October 1 +This article explains the new features in Python 2.6, released on October 1, 2008. The release schedule is described in :pep:`361`. The major theme of Python 2.6 is preparing the migration path to @@ -176,7 +176,7 @@ Hosting of the Python bug tracker is kindly provided by of Stellenbosch, South Africa. Martin von Löwis put a lot of effort into importing existing bugs and patches from SourceForge; his scripts for this import operation are at -http://svn.python.org/view/tracker/importer/ and may be useful to +``http://svn.python.org/view/tracker/importer/`` and may be useful to other projects wishing to move from SourceForge to Roundup. .. seealso:: diff --git a/Doc/whatsnew/2.7.rst b/Doc/whatsnew/2.7.rst index 999f148fa6b795..8eb8daa2feeeb9 100644 --- a/Doc/whatsnew/2.7.rst +++ b/Doc/whatsnew/2.7.rst @@ -1545,7 +1545,7 @@ changes, or look through the Subversion logs for all the details. *ciphers* argument that's a string listing the encryption algorithms to be allowed; the format of the string is described `in the OpenSSL documentation - `__. + `__. (Added by Antoine Pitrou; :issue:`8322`.) 
Another change makes the extension load all of OpenSSL's ciphers and @@ -2001,7 +2001,7 @@ module is imported or used. .. seealso:: - http://www.voidspace.org.uk/python/articles/unittest2.shtml + https://web.archive.org/web/20210619163128/http://www.voidspace.org.uk/python/articles/unittest2.shtml Describes the new features, how to use them, and the rationale for various design decisions. (By Michael Foord.) @@ -2089,7 +2089,7 @@ version 1.3. Some of the new features are: Fredrik Lundh develops ElementTree and produced the 1.3 version; you can read his article describing 1.3 at -http://effbot.org/zone/elementtree-13-intro.htm. +https://web.archive.org/web/20200703234532/http://effbot.org/zone/elementtree-13-intro.htm. Florent Xicluna updated the version included with Python, after discussions on python-dev and in :issue:`6472`.) diff --git a/Doc/whatsnew/3.0.rst b/Doc/whatsnew/3.0.rst index 880958d3edb900..4da3507ad2e892 100644 --- a/Doc/whatsnew/3.0.rst +++ b/Doc/whatsnew/3.0.rst @@ -53,9 +53,9 @@ This article explains the new features in Python 3.0, compared to 2.6. Python 3.0, also known as "Python 3000" or "Py3K", is the first ever -*intentionally backwards incompatible* Python release. There are more -changes than in a typical release, and more that are important for all -Python users. Nevertheless, after digesting the changes, you'll find +*intentionally backwards incompatible* Python release. Python 3.0 was released on December 3, 2008. +There are more changes than in a typical release, and more that are important for all +Python users. Nevertheless, after digesting the changes, you'll find that Python really hasn't changed all that much -- by and large, we're mostly fixing well-known annoyances and warts, and removing a lot of old cruft. diff --git a/Doc/whatsnew/3.1.rst b/Doc/whatsnew/3.1.rst index f1e6d0c4f3dd68..3d89b97fa8f1b8 100644 --- a/Doc/whatsnew/3.1.rst +++ b/Doc/whatsnew/3.1.rst @@ -47,6 +47,7 @@ when researching a change. This article explains the new features in Python 3.1, compared to 3.0. +Python 3.1 was released on June 27, 2009. PEP 372: Ordered Dictionaries diff --git a/Doc/whatsnew/3.10.rst b/Doc/whatsnew/3.10.rst index f93523a4c50a89..e6e97d98f87f2b 100644 --- a/Doc/whatsnew/3.10.rst +++ b/Doc/whatsnew/3.10.rst @@ -47,7 +47,7 @@ when researching a change. This article explains the new features in Python 3.10, compared to 3.9. - +Python 3.10 was released on October 4, 2021. For full details, see the :ref:`changelog `. Summary -- Release highlights @@ -1746,7 +1746,7 @@ Deprecated * ``threading.Thread.setDaemon`` => :attr:`threading.Thread.daemon` - (Contributed by Jelle Zijlstra in :issue:`21574`.) + (Contributed by Jelle Zijlstra in :gh:`87889`.) * :meth:`pathlib.Path.link_to` is deprecated and slated for removal in Python 3.12. Use :meth:`pathlib.Path.hardlink_to` instead. diff --git a/Doc/whatsnew/3.11.rst b/Doc/whatsnew/3.11.rst index 6518eea4c7ba55..256d5822e5196e 100644 --- a/Doc/whatsnew/3.11.rst +++ b/Doc/whatsnew/3.11.rst @@ -306,23 +306,23 @@ Kumar Srinivasan and Graham Bleaney.) PEP 681: Data Class Transforms ------------------------------ -The new :data:`~typing.dataclass_transform` annotation may be used to -decorate a function that is itself a decorator, a class, or a metaclass. +:data:`~typing.dataclass_transform` may be used to +decorate a class, metaclass, or a function that is itself a decorator. 
The presence of ``@dataclass_transform()`` tells a static type checker that the -decorated function, class, or metaclass performs runtime "magic" that -transforms a class, endowing it with dataclass-like behaviors. +decorated object performs runtime "magic" that +transforms a class, giving it :func:`dataclasses.dataclass`-like behaviors. For example:: - # The ``create_model`` decorator is defined by a library. + # The create_model decorator is defined by a library. @typing.dataclass_transform() - def create_model(cls: Type[_T]) -> Type[_T]: + def create_model(cls: Type[T]) -> Type[T]: cls.__init__ = ... cls.__eq__ = ... cls.__ne__ = ... return cls - # The ``create_model`` decorator can now be used to create new model + # The create_model decorator can now be used to create new model # classes, like this: @create_model class CustomerModel: @@ -373,6 +373,9 @@ Other Language Changes the current directory, the script's directory or an empty string. (Contributed by Victor Stinner in :gh:`57684`.) +* A ``"z"`` option was added to the format specification mini-language that + coerces negative zero to zero after rounding to the format precision. See + :pep:`682` for more details. (Contributed by John Belmonte in :gh:`90153`.) Other CPython Implementation Changes ==================================== @@ -403,6 +406,11 @@ Other CPython Implementation Changes instead of prepending them. (Contributed by Bastian Neuburger in :issue:`44934`.) +* The :c:member:`PyConfig.module_search_paths_set` field must now be set to 1 for + initialization to use :c:member:`PyConfig.module_search_paths` to initialize + :data:`sys.path`. Otherwise, initialization will recalculate the path and replace + any values added to ``module_search_paths``. + New Modules =========== @@ -431,6 +439,10 @@ asyncio existing stream-based connections to TLS. (Contributed by Ian Good in :issue:`34975`.) +* Add :class:`~asyncio.Barrier` class to the synchronization primitives of + the asyncio library. (Contributed by Yves Duprat and Andrew Svetlov in + :gh:`87518`.) + datetime -------- @@ -726,6 +738,12 @@ For major changes, see :ref:`new-feat-related-type-hints-311`. the given type. At runtime it simply returns the received value. (Contributed by Jelle Zijlstra in :gh:`90638`.) +* :data:`typing.TypedDict` types can now be generic. (Contributed by + Samodya Abeysiriwardane in :gh:`89026`.) + +* :class:`~typing.NamedTuple` types can now be generic. + (Contributed by Serhiy Storchaka in :issue:`43923`.) + * Allow subclassing of :class:`typing.Any`. This is useful for avoiding type checker errors related to highly dynamic class, such as mocks. (Contributed by Shantanu Jain in :gh:`91154`.) @@ -739,11 +757,33 @@ For major changes, see :ref:`new-feat-related-type-hints-311`. to clear all registered overloads of a function. (Contributed by Jelle Zijlstra in :gh:`89263`.) -* :data:`typing.TypedDict` subclasses can now be generic. (Contributed by - Samodya Abey in :gh:`89026`.) +* The :meth:`__init__` method of :class:`~typing.Protocol` subclasses + is now preserved. (Contributed by Adrian Garcia Badarasco in :gh:`88970`.) -* :class:`~typing.NamedTuple` subclasses can now be generic. - (Contributed by Serhiy Storchaka in :issue:`43923`.) +* The representation of empty tuple types (``Tuple[()]``) is simplified. + This affects introspection, e.g. ``get_args(Tuple[()])`` now evaluates + to ``()`` instead of ``((),)``. + (Contributed by Serhiy Storchaka in :gh:`91137`.) 
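  The last point can be checked at the interactive prompt (a small
  illustrative snippet)::

     >>> from typing import Tuple, get_args
     >>> get_args(Tuple[()])   # Python 3.11; earlier versions returned ((),)
     ()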
+ +* Loosen runtime requirements for type annotations by removing the callable + check in the private ``typing._type_check`` function. (Contributed by + Gregory Beauregard in :gh:`90802`.) + +* :func:`typing.get_type_hints` now supports evaluating strings as forward + references in :ref:`PEP 585 generic aliases `. + (Contributed by Niklas Rosenstein in :gh:`85542`.) + +* :func:`typing.get_type_hints` no longer adds :data:`~typing.Optional` + to parameters with ``None`` as a default. (Contributed by Nikita Sobolev + in :gh:`90353`.) + +* :func:`typing.get_type_hints` now supports evaluating bare stringified + :data:`~typing.ClassVar` annotations. (Contributed by Gregory Beauregard + in :gh:`90711`.) + +* :func:`typing.no_type_check` no longer modifies external classes and functions. + It also now correctly marks classmethods as not to be type checked. (Contributed + by Nikita Sobolev in :gh:`90729`.) tkinter @@ -1116,8 +1156,8 @@ Deprecated that was added in Python 3.10. (Contributed by Raymond Hettinger in :gh:`89519`.) -* Octal escapes with value larger than ``0o377`` now produce - a :exc:`DeprecationWarning`. +* Octal escapes in string and bytes literals with value larger than ``0o377`` now + produce :exc:`DeprecationWarning`. In a future Python version they will be a :exc:`SyntaxWarning` and eventually a :exc:`SyntaxError`. (Contributed by Serhiy Storchaka in :gh:`81548`.) @@ -1184,7 +1224,11 @@ Deprecated removed in Python 3.13. Use :func:`locale.setlocale`, :func:`locale.getpreferredencoding(False) ` and :func:`locale.getlocale` functions instead. - (Contributed by Victor Stinner in :issue:`46659`.) + (Contributed by Victor Stinner in :gh:`90817`.) + +* The :func:`locale.resetlocale` function is deprecated and will be + removed in Python 3.13. Use ``locale.setlocale(locale.LC_ALL, "")`` instead. + (Contributed by Victor Stinner in :gh:`90817`.) * The :mod:`asynchat`, :mod:`asyncore` and :mod:`smtpd` modules have been deprecated since at least Python 3.6. Their documentation and deprecation @@ -1232,6 +1276,15 @@ Deprecated wherever possible. (Contributed by Alex Waygood in :gh:`92332`.) +* The keyword argument syntax for constructing :data:`~typing.TypedDict` types + is now deprecated. Support will be removed in Python 3.13. (Contributed by + Jingchen Ye in :gh:`90224`.) + +* The :func:`re.template` function and the corresponding :const:`re.TEMPLATE` + and :const:`re.T` flags are deprecated, as they were undocumented and + lacked an obvious purpose. They will be removed in Python 3.13. + (Contributed by Serhiy Storchaka and Miro Hrončok in :gh:`92728`.) + Pending Removal in Python 3.12 ============================== @@ -1699,6 +1752,21 @@ Porting to Python 3.11 which are not available in the limited C API. (Contributed by Victor Stinner in :issue:`46007`.) +* The following frame functions and type are now directly available with + ``#include ``, it's no longer needed to add + ``#include ``: + + * :c:func:`PyFrame_Check` + * :c:func:`PyFrame_GetBack` + * :c:func:`PyFrame_GetBuiltins` + * :c:func:`PyFrame_GetGenerator` + * :c:func:`PyFrame_GetGlobals` + * :c:func:`PyFrame_GetLasti` + * :c:func:`PyFrame_GetLocals` + * :c:type:`PyFrame_Type` + + (Contributed by Victor Stinner in :gh:`93937`.) + .. _pyframeobject-3.11-hiding: * The :c:type:`PyFrameObject` structure members have been removed from the @@ -1825,6 +1893,15 @@ Porting to Python 3.11 * Distributors are encouraged to build Python with the optimized Blake2 library `libb2`_. 
+* The :c:member:`PyConfig.module_search_paths_set` field must now be set to 1 for + initialization to use :c:member:`PyConfig.module_search_paths` to initialize + :data:`sys.path`. Otherwise, initialization will recalculate the path and replace + any values added to ``module_search_paths``. + +* :c:func:`PyConfig_Read` no longer calculates the initial search path, and will not + fill any values into :c:member:`PyConfig.module_search_paths`. To calculate default + paths and then modify them, finish initialization and use :c:func:`PySys_GetObject` + to retrieve :data:`sys.path` as a Python list object and modify it directly. Deprecated ---------- diff --git a/Doc/whatsnew/3.12.rst b/Doc/whatsnew/3.12.rst index abbc76a1cbe675..849a74a8542209 100644 --- a/Doc/whatsnew/3.12.rst +++ b/Doc/whatsnew/3.12.rst @@ -90,6 +90,13 @@ New Modules Improved Modules ================ +os +-- + +* Add :data:`os.PIDFD_NONBLOCK` to open a file descriptor + for a process with :func:`os.pidfd_open` in non-blocking mode. + (Contributed by Kumar Aditya in :gh:`93312`.) + Optimizations ============= @@ -99,10 +106,74 @@ Optimizations (Contributed by Inada Naoki in :gh:`92536`.) +CPython bytecode changes +======================== + +* Removed the :opcode:`LOAD_METHOD` instruction. It has been merged into + :opcode:`LOAD_ATTR`. :opcode:`LOAD_ATTR` will now behave like the old + :opcode:`LOAD_METHOD` instruction if the low bit of its oparg is set. + (Contributed by Ken Jin in :gh:`93429`.) + + Deprecated ========== +Pending Removal in Python 3.13 +============================== + +The following modules and APIs have been deprecated in earlier Python releases, +and will be removed in Python 3.13. + +Modules (see :pep:`594`): + +* :mod:`aifc` +* :mod:`audioop` +* :mod:`cgi` +* :mod:`cgitb` +* :mod:`chunk` +* :mod:`crypt` +* :mod:`imghdr` +* :mod:`mailcap` +* :mod:`msilib` +* :mod:`nis` +* :mod:`nntplib` +* :mod:`ossaudiodev` +* :mod:`pipes` +* :mod:`sndhdr` +* :mod:`spwd` +* :mod:`sunau` +* :mod:`telnetlib` +* :mod:`uu` +* :mod:`xdrlib` + +APIs: + +* :class:`configparser.LegacyInterpolation` (:gh:`90765`) +* :func:`locale.getdefaultlocale` (:gh:`90817`) +* :meth:`turtle.RawTurtle.settiltangle` (:gh:`50096`) +* :func:`unittest.findTestCases` (:gh:`50096`) +* :func:`unittest.makeSuite` (:gh:`50096`) +* :func:`unittest.getTestCaseNames` (:gh:`50096`) +* :class:`webbrowser.MacOSX` (:gh:`86421`) + +Pending Removal in Future Versions +================================== + +The following APIs were deprecated in earlier Python versions and will be removed, +although there is currently no date scheduled for their removal. + +* :class:`typing.Text` (:gh:`92332`) + +* Currently Python accepts numeric literals immediately followed by keywords, + for example ``0in x``, ``1or x``, ``0if 1else 2``. It allows confusing + and ambiguous expressions like ``[0x1for x in y]`` (which can be + interpreted as ``[0x1 for x in y]`` or ``[0x1f or x in y]``). + A syntax warning is raised if the numeric literal is + immediately followed by one of keywords :keyword:`and`, :keyword:`else`, + :keyword:`for`, :keyword:`if`, :keyword:`in`, :keyword:`is` and :keyword:`or`. + In a future release it will be changed to a syntax error. (:gh:`87999`) + Removed ======= @@ -146,6 +217,57 @@ Removed (Contributed by Serhiy Storchaka in :issue:`45162`.) +* Several names deprecated in the :mod:`configparser` way back in 3.2 have + been removed per :gh:`89336`: + + * :class:`configparser.ParsingError` no longer has a ``filename`` attribute + or argument. 
Use the ``source`` attribute and argument instead. + * :mod:`configparser` no longer has a ``SafeConfigParser`` class. Use the + shorter :class:`~configparser.ConfigParser` name instead. + * :class:`configparser.ConfigParser` no longer has a ``readfp`` method. + Use :meth:`~configparser.ConfigParser.read_file` instead. + +* The following undocumented :mod:`sqlite3` features, deprecated in Python + 3.10, are now removed: + + * ``sqlite3.enable_shared_cache()`` + * ``sqlite3.OptimizedUnicode`` + + If a shared cache must be used, open the database in URI mode using the + ``cache=shared`` query parameter. + + The ``sqlite3.OptimizedUnicode`` text factory has been an alias for + :class:`str` since Python 3.3. Code that previously set the text factory to + ``OptimizedUnicode`` can either use ``str`` explicitly, or rely on the + default value which is also ``str``. + + (Contributed by Erlend E. Aasland in :gh:`92548`) + +* The ``--experimental-isolated-subinterpreters`` configure flag + (and corresponding ``EXPERIMENTAL_ISOLATED_SUBINTERPRETERS``) + have been removed. + +* Remove ``io.OpenWrapper`` and ``_pyio.OpenWrapper``, deprecated in Python + 3.10: just use :func:`open` instead. The :func:`open` (:func:`io.open`) + function is a built-in function. Since Python 3.10, :func:`_pyio.open` is + also a static method. + (Contributed by Victor Stinner in :gh:`94169`.) + +* Remove the :func:`ssl.RAND_pseudo_bytes` function, deprecated in Python 3.6: + use :func:`os.urandom` or :func:`ssl.RAND_bytes` instead. + (Contributed by Victor Stinner in :gh:`94199`.) + +* :mod:`gzip`: Remove the ``filename`` attribute of :class:`gzip.GzipFile`, + deprecated since Python 2.6, use the :attr:`~gzip.GzipFile.name` attribute + instead. In write mode, the ``filename`` attribute added ``'.gz'`` file + extension if it was not present. + (Contributed by Victor Stinner in :gh:`94196`.) + +* Remove the :func:`ssl.match_hostname` function. The + :func:`ssl.match_hostname` was deprecated in Python 3.7. OpenSSL performs + hostname matching since Python 3.7, Python no longer uses the + :func:`ssl.match_hostname` function. + (Contributed by Victor Stinner in :gh:`94199`.) Porting to Python 3.12 ====================== @@ -171,10 +293,21 @@ Changes in the Python API select from a larger range than ``randrange(10**25)``. (Originally suggested by Serhiy Storchaka gh-86388.) +* :class:`argparse.ArgumentParser` changed encoding and error handler + for reading arguments from file (e.g. ``fromfile_prefix_chars`` option) + from default text encoding (e.g. :func:`locale.getpreferredencoding(False) `) + to :term:`filesystem encoding and error handler`. + Argument files should be encoded in UTF-8 instead of ANSI Codepage on Windows. + Build Changes ============= +* ``va_start()`` with two parameters, like ``va_start(args, format),`` + is now required to build Python. + ``va_start()`` is no longer called with a single parameter. + (Contributed by Kumar Aditya in :gh:`93207`.) + C API Changes ============= @@ -182,6 +315,11 @@ C API Changes New Features ------------ +* Added the new limited C API function :c:func:`PyType_FromMetaclass`, + which generalizes the existing :c:func:`PyType_FromModuleAndSpec` using + an additional metaclass argument. + (Contributed by Wenzel Jakob in :gh:`93012`.) 
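To make the new :c:func:`PyType_FromMetaclass` entry above a little more concrete, a minimal sketch follows; the spec contents, the ``example.MyType`` name and the ``make_type()`` helper are purely illustrative, and error handling is left to the caller::

    #include <Python.h>

    /* Illustrative spec for a trivial heap type. */
    static PyType_Slot mytype_slots[] = {
        {Py_tp_doc, (void *)"Type created via PyType_FromMetaclass()"},
        {0, NULL},
    };

    static PyType_Spec mytype_spec = {
        .name = "example.MyType",
        .basicsize = sizeof(PyObject),
        .flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE,
        .slots = mytype_slots,
    };

    /* Like PyType_FromModuleAndSpec(), but the extra first argument selects
     * the metaclass of the new type (for example, a metaclass defined by the
     * extension itself). */
    static PyObject *
    make_type(PyTypeObject *metaclass, PyObject *module)
    {
        return PyType_FromMetaclass(metaclass, module, &mytype_spec, NULL);
    }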
+ Porting to Python 3.12 ---------------------- @@ -195,6 +333,35 @@ Porting to Python 3.12 Deprecated ---------- +* Deprecate global configuration variable: + + * :c:var:`Py_DebugFlag`: use :c:member:`PyConfig.parser_debug` + * :c:var:`Py_VerboseFlag`: use :c:member:`PyConfig.verbose` + * :c:var:`Py_QuietFlag`: use :c:member:`PyConfig.quiet` + * :c:var:`Py_InteractiveFlag`: use :c:member:`PyConfig.interactive` + * :c:var:`Py_InspectFlag`: use :c:member:`PyConfig.inspect` + * :c:var:`Py_OptimizeFlag`: use :c:member:`PyConfig.optimization_level` + * :c:var:`Py_NoSiteFlag`: use :c:member:`PyConfig.site_import` + * :c:var:`Py_BytesWarningFlag`: use :c:member:`PyConfig.bytes_warning` + * :c:var:`Py_FrozenFlag`: use :c:member:`PyConfig.pathconfig_warnings` + * :c:var:`Py_IgnoreEnvironmentFlag`: use :c:member:`PyConfig.use_environment` + * :c:var:`Py_DontWriteBytecodeFlag`: use :c:member:`PyConfig.write_bytecode` + * :c:var:`Py_NoUserSiteDirectory`: use :c:member:`PyConfig.user_site_directory` + * :c:var:`Py_UnbufferedStdioFlag`: use :c:member:`PyConfig.buffered_stdio` + * :c:var:`Py_HashRandomizationFlag`: use :c:member:`PyConfig.use_hash_seed` + and :c:member:`PyConfig.hash_seed` + * :c:var:`Py_IsolatedFlag`: use :c:member:`PyConfig.isolated` + * :c:var:`Py_LegacyWindowsFSEncodingFlag`: use :c:member:`PyConfig.legacy_windows_fs_encoding` + * :c:var:`Py_LegacyWindowsStdioFlag`: use :c:member:`PyConfig.legacy_windows_stdio` + * :c:var:`Py_FileSystemDefaultEncoding`: use :c:member:`PyConfig.filesystem_encoding` + * :c:var:`Py_FileSystemDefaultEncodeErrors`: use :c:member:`PyConfig.filesystem_errors` + * :c:var:`Py_UTF8Mode`: use :c:member:`PyPreConfig.utf8_mode` (see :c:func:`Py_PreInitialize`) + + The :c:func:`Py_InitializeFromConfig` API should be used with + :c:type:`PyConfig` instead. + (Contributed by Victor Stinner in :gh:`77782`.) + + Removed ------- diff --git a/Doc/whatsnew/3.2.rst b/Doc/whatsnew/3.2.rst index 09ec58eca5b110..9dd849f9e9e720 100644 --- a/Doc/whatsnew/3.2.rst +++ b/Doc/whatsnew/3.2.rst @@ -48,10 +48,11 @@ This saves the maintainer the effort of going through the SVN log when researching a change. -This article explains the new features in Python 3.2 as compared to 3.1. It +This article explains the new features in Python 3.2 as compared to 3.1. +Python 3.2 was released on February 20, 2011. It focuses on a few highlights and gives a few examples. For full details, see the `Misc/NEWS -`_ +`__ file. .. seealso:: @@ -744,7 +745,8 @@ Two methods have been deprecated: * :meth:`xml.etree.ElementTree.getiterator` use ``Element.iter`` instead. For details of the update, see `Introducing ElementTree -`_ on Fredrik Lundh's website. +`_ +on Fredrik Lundh's website. (Contributed by Florent Xicluna and Fredrik Lundh, :issue:`6472`.) @@ -1645,7 +1647,7 @@ for secure (encrypted, authenticated) internet connections: * The :func:`ssl.wrap_socket` constructor function now takes a *ciphers* argument. The *ciphers* string lists the allowed encryption algorithms using the format described in the `OpenSSL documentation - `__. + `__. * When linked against recent versions of OpenSSL, the :mod:`ssl` module now supports the Server Name Indication extension to the TLS protocol, allowing @@ -2590,10 +2592,12 @@ Changes to Python's build process and to the C API include: longer used and it had never been documented (:issue:`8837`). There were a number of other small changes to the C-API. See the -:source:`Misc/NEWS` file for a complete list. +`Misc/NEWS `__ +file for a complete list. 
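Tying the C API deprecation list above back to its suggested replacement, here is a minimal sketch of initializing the interpreter through :c:type:`PyConfig` and :c:func:`Py_InitializeFromConfig` instead of the deprecated global flags; the field values are arbitrary and ``init_embedded_python()`` is a hypothetical helper::

    #include <Python.h>

    /* Minimal sketch: configure and start an embedded interpreter without
     * touching any of the deprecated Py_*Flag globals. */
    static int
    init_embedded_python(void)
    {
        PyConfig config;
        PyConfig_InitPythonConfig(&config);

        config.verbose = 1;              /* replaces Py_VerboseFlag */
        config.optimization_level = 2;   /* replaces Py_OptimizeFlag */
        config.write_bytecode = 0;       /* replaces Py_DontWriteBytecodeFlag
                                            (note the inverted sense) */

        PyStatus status = Py_InitializeFromConfig(&config);
        PyConfig_Clear(&config);         /* always clear the config when done */
        if (PyStatus_Exception(status)) {
            Py_ExitStatusException(status);   /* does not return */
        }
        return 0;
    }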
Also, there were a number of updates to the Mac OS X build, see -:source:`Mac/BuildScript/README.txt` for details. For users running a 32/64-bit +`Mac/BuildScript/README.txt `_ +for details. For users running a 32/64-bit build, there is a known problem with the default Tcl/Tk on Mac OS X 10.6. Accordingly, we recommend installing an updated alternative such as `ActiveState Tcl/Tk 8.5.9 `_\. diff --git a/Doc/whatsnew/3.4.rst b/Doc/whatsnew/3.4.rst index f4ff143224196b..023736134b2ca9 100644 --- a/Doc/whatsnew/3.4.rst +++ b/Doc/whatsnew/3.4.rst @@ -409,7 +409,7 @@ Some smaller changes made to the core Python language are: evaluating has no elements. (Contributed by Julian Berman in :issue:`18111`.) -* Module objects are now :mod:`weakref`'able. +* Module objects are now :ref:`weakly referenceable `. * Module ``__file__`` attributes (and related values) should now always contain absolute paths by default, with the sole exception of @@ -1113,8 +1113,8 @@ with additional speedups by Antoine Pitrou in :issue:`19219`.) mmap ---- -mmap objects can now be :mod:`weakref`\ ed. (Contributed by Valerie Lambert in -:issue:`4885`.) +mmap objects are now :ref:`weakly referenceable `. +(Contributed by Valerie Lambert in :issue:`4885`.) multiprocessing diff --git a/Doc/whatsnew/3.6.rst b/Doc/whatsnew/3.6.rst index d35a0fd9d9da38..e54475fd83c975 100644 --- a/Doc/whatsnew/3.6.rst +++ b/Doc/whatsnew/3.6.rst @@ -2117,7 +2117,8 @@ API and Feature Removals platform specific ``Lib/plat-*/`` directories, but were chronically out of date, inconsistently available across platforms, and unmaintained. The script that created these modules is still available in the source - distribution at :source:`Tools/scripts/h2py.py`. + distribution at `Tools/scripts/h2py.py + `_. * The deprecated ``asynchat.fifo`` class has been removed. diff --git a/Doc/whatsnew/3.7.rst b/Doc/whatsnew/3.7.rst index 49246be57639d1..b2078f5a82e280 100644 --- a/Doc/whatsnew/3.7.rst +++ b/Doc/whatsnew/3.7.rst @@ -2120,7 +2120,8 @@ Platform Support Removals of other LTS Linux releases (e.g. RHEL/CentOS 7.5, SLES 12-SP3), use OpenSSL 1.0.2 or later, and remain supported in the default build configuration. - CPython's own :source:`CI configuration file <.travis.yml>` provides an + CPython's own `CI configuration file + `_ provides an example of using the SSL :source:`compatibility testing infrastructure ` in CPython's test suite to build and link against OpenSSL 1.1.0 rather than an diff --git a/Doc/whatsnew/3.8.rst b/Doc/whatsnew/3.8.rst index c2f78f1531317a..077de05f34c93e 100644 --- a/Doc/whatsnew/3.8.rst +++ b/Doc/whatsnew/3.8.rst @@ -45,6 +45,7 @@ :Editor: Raymond Hettinger This article explains the new features in Python 3.8, compared to 3.7. +Python 3.8 was released on October 14, 2019. For full details, see the :ref:`changelog `. .. testsetup:: @@ -2007,6 +2008,8 @@ Changes in the Python API ``replace()`` method of :class:`types.CodeType` can be used to make the code future-proof. +* The parameter ``digestmod`` for :func:`hmac.new` no longer uses the MD5 digest + by default. Changes in the C API -------------------- diff --git a/Doc/whatsnew/3.9.rst b/Doc/whatsnew/3.9.rst index 6dee55e5a0e555..6deaede4953bdc 100644 --- a/Doc/whatsnew/3.9.rst +++ b/Doc/whatsnew/3.9.rst @@ -45,7 +45,7 @@ when researching a change. This article explains the new features in Python 3.9, compared to 3.8. -Python 3.9 was released on October 5th, 2020. +Python 3.9 was released on October 5, 2020. For full details, see the :ref:`changelog `. 
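For the "weakly referenceable" wording used in the notes above, a small C-level sketch of what that property means in practice; ``weakref_to_module()`` is an illustrative helper, not an API introduced by this patch::

    #include <Python.h>

    /* Take a weak reference to an imported module without keeping the
     * module itself alive. */
    static PyObject *
    weakref_to_module(const char *name)
    {
        PyObject *module = PyImport_ImportModule(name);   /* new reference */
        if (module == NULL) {
            return NULL;
        }
        PyObject *ref = PyWeakref_NewRef(module, NULL);   /* new reference or NULL */
        Py_DECREF(module);
        return ref;   /* dereference later with PyWeakref_GetObject() while alive */
    }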
diff --git a/Grammar/python.gram b/Grammar/python.gram index 15c40b6bbbacdc..9fd498c4e17db7 100644 --- a/Grammar/python.gram +++ b/Grammar/python.gram @@ -249,7 +249,7 @@ class_def[stmt_ty]: class_def_raw[stmt_ty]: | invalid_class_def_raw - | 'class' a=NAME b=['(' z=[arguments] ')' { z }] &&':' c=block { + | 'class' a=NAME b=['(' z=[arguments] ')' { z }] ':' c=block { _PyAST_ClassDef(a->v.Name.id, (b) ? ((expr_ty) b)->v.Call.args : NULL, (b) ? ((expr_ty) b)->v.Call.keywords : NULL, @@ -379,9 +379,9 @@ while_stmt[stmt_ty]: for_stmt[stmt_ty]: | invalid_for_stmt - | 'for' t=star_targets 'in' ~ ex=star_expressions &&':' tc=[TYPE_COMMENT] b=block el=[else_block] { + | 'for' t=star_targets 'in' ~ ex=star_expressions ':' tc=[TYPE_COMMENT] b=block el=[else_block] { _PyAST_For(t, ex, b, el, NEW_TYPE_COMMENT(p, tc), EXTRA) } - | ASYNC 'for' t=star_targets 'in' ~ ex=star_expressions &&':' tc=[TYPE_COMMENT] b=block el=[else_block] { + | ASYNC 'for' t=star_targets 'in' ~ ex=star_expressions ':' tc=[TYPE_COMMENT] b=block el=[else_block] { CHECK_VERSION(stmt_ty, 5, "Async for loops are", _PyAST_AsyncFor(t, ex, b, el, NEW_TYPE_COMMENT(p, tc), EXTRA)) } | invalid_for_target @@ -471,7 +471,7 @@ or_pattern[pattern_ty]: | patterns[asdl_pattern_seq*]='|'.closed_pattern+ { asdl_seq_LEN(patterns) == 1 ? asdl_seq_GET(patterns, 0) : _PyAST_MatchOr(patterns, EXTRA) } -closed_pattern[pattern_ty]: +closed_pattern[pattern_ty] (memo): | literal_pattern | capture_pattern | wildcard_pattern @@ -558,7 +558,7 @@ maybe_star_pattern[pattern_ty]: | star_pattern | pattern -star_pattern[pattern_ty]: +star_pattern[pattern_ty] (memo): | '*' target=pattern_capture_target { _PyAST_MatchStar(target->v.Name.id, EXTRA) } | '*' wildcard_pattern { @@ -1231,8 +1231,8 @@ invalid_import_from_targets: RAISE_SYNTAX_ERROR("trailing comma not allowed without surrounding parentheses") } invalid_with_stmt: - | [ASYNC] 'with' ','.(expression ['as' star_target])+ &&':' - | [ASYNC] 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' &&':' + | [ASYNC] 'with' ','.(expression ['as' star_target])+ NEWLINE { RAISE_SYNTAX_ERROR("expected ':'") } + | [ASYNC] 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' NEWLINE { RAISE_SYNTAX_ERROR("expected ':'") } invalid_with_stmt_indent: | [ASYNC] a='with' ','.(expression ['as' star_target])+ ':' NEWLINE !INDENT { RAISE_INDENTATION_ERROR("expected an indented block after 'with' statement on line %d", a->lineno) } @@ -1262,11 +1262,11 @@ invalid_except_star_stmt_indent: | a='except' '*' expression ['as' NAME ] ':' NEWLINE !INDENT { RAISE_INDENTATION_ERROR("expected an indented block after 'except*' statement on line %d", a->lineno) } invalid_match_stmt: - | "match" subject_expr !':' { CHECK_VERSION(void*, 10, "Pattern matching is", RAISE_SYNTAX_ERROR("expected ':'") ) } + | "match" subject_expr NEWLINE { CHECK_VERSION(void*, 10, "Pattern matching is", RAISE_SYNTAX_ERROR("expected ':'") ) } | a="match" subject=subject_expr ':' NEWLINE !INDENT { RAISE_INDENTATION_ERROR("expected an indented block after 'match' statement on line %d", a->lineno) } invalid_case_block: - | "case" patterns guard? !':' { RAISE_SYNTAX_ERROR("expected ':'") } + | "case" patterns guard? NEWLINE { RAISE_SYNTAX_ERROR("expected ':'") } | a="case" patterns guard? 
':' NEWLINE !INDENT { RAISE_INDENTATION_ERROR("expected an indented block after 'case' statement on line %d", a->lineno) } invalid_as_pattern: @@ -1295,13 +1295,15 @@ invalid_while_stmt: | a='while' named_expression ':' NEWLINE !INDENT { RAISE_INDENTATION_ERROR("expected an indented block after 'while' statement on line %d", a->lineno) } invalid_for_stmt: + | [ASYNC] 'for' star_targets 'in' star_expressions NEWLINE { RAISE_SYNTAX_ERROR("expected ':'") } | [ASYNC] a='for' star_targets 'in' star_expressions ':' NEWLINE !INDENT { RAISE_INDENTATION_ERROR("expected an indented block after 'for' statement on line %d", a->lineno) } invalid_def_raw: | [ASYNC] a='def' NAME '(' [params] ')' ['->' expression] ':' NEWLINE !INDENT { RAISE_INDENTATION_ERROR("expected an indented block after function definition on line %d", a->lineno) } invalid_class_def_raw: - | a='class' NAME ['('[arguments] ')'] ':' NEWLINE !INDENT { + | 'class' NAME ['(' [arguments] ')'] NEWLINE { RAISE_SYNTAX_ERROR("expected ':'") } + | a='class' NAME ['(' [arguments] ')'] ':' NEWLINE !INDENT { RAISE_INDENTATION_ERROR("expected an indented block after class definition on line %d", a->lineno) } invalid_double_starred_kvpairs: @@ -1312,4 +1314,4 @@ invalid_kvpair: | a=expression !(':') { RAISE_ERROR_KNOWN_LOCATION(p, PyExc_SyntaxError, a->lineno, a->end_col_offset - 1, a->end_lineno, -1, "':' expected after dictionary key") } | expression ':' a='*' bitwise_or { RAISE_SYNTAX_ERROR_STARTING_FROM(a, "cannot use a starred expression in a dictionary value") } - | expression a=':' {RAISE_SYNTAX_ERROR_KNOWN_LOCATION(a, "expression expected after dictionary key and ':'") } \ No newline at end of file + | expression a=':' {RAISE_SYNTAX_ERROR_KNOWN_LOCATION(a, "expression expected after dictionary key and ':'") } diff --git a/Include/abstract.h b/Include/abstract.h index 9e06fbbb749138..576024e09c4101 100644 --- a/Include/abstract.h +++ b/Include/abstract.h @@ -14,9 +14,9 @@ extern "C" { Print an object 'o' on file 'fp'. Returns -1 on error. The flags argument is used to enable certain printing options. The only option currently - supported is Py_Print_RAW. - - (What should be said about Py_Print_RAW?). */ + supported is Py_PRINT_RAW. By default (flags=0), PyObject_Print() formats + the object by calling PyObject_Repr(). If flags equals to Py_PRINT_RAW, it + formats the object by calling PyObject_Str(). */ /* Implemented elsewhere: @@ -88,7 +88,7 @@ extern "C" { -1 on failure. This is the equivalent of the Python statement: del o.attr_name. */ -#define PyObject_DelAttrString(O,A) PyObject_SetAttrString((O),(A), NULL) +#define PyObject_DelAttrString(O, A) PyObject_SetAttrString((O), (A), NULL) /* Implemented as a macro: @@ -98,7 +98,7 @@ extern "C" { Delete attribute named attr_name, for object o. Returns -1 on failure. This is the equivalent of the Python statement: del o.attr_name. */ -#define PyObject_DelAttr(O,A) PyObject_SetAttr((O),(A), NULL) +#define PyObject_DelAttr(O, A) PyObject_SetAttr((O), (A), NULL) /* Implemented elsewhere: @@ -722,7 +722,7 @@ PyAPI_FUNC(PyObject *) PySequence_Fast(PyObject *o, const char* m); /* Return the 'i'-th element of the sequence 'o', assuming that o was returned by PySequence_Fast, and that i is within bounds. */ #define PySequence_Fast_GET_ITEM(o, i)\ - (PyList_Check(o) ? PyList_GET_ITEM(o, i) : PyTuple_GET_ITEM(o, i)) + (PyList_Check(o) ? 
PyList_GET_ITEM((o), (i)) : PyTuple_GET_ITEM((o), (i))) /* Return a pointer to the underlying item array for an object returned by PySequence_Fast */ @@ -802,7 +802,7 @@ PyAPI_FUNC(Py_ssize_t) PyMapping_Length(PyObject *o); failure. This is equivalent to the Python statement: del o[key]. */ -#define PyMapping_DelItemString(O,K) PyObject_DelItemString((O),(K)) +#define PyMapping_DelItemString(O, K) PyObject_DelItemString((O), (K)) /* Implemented as a macro: @@ -812,7 +812,7 @@ PyAPI_FUNC(Py_ssize_t) PyMapping_Length(PyObject *o); Returns -1 on failure. This is equivalent to the Python statement: del o[key]. */ -#define PyMapping_DelItem(O,K) PyObject_DelItem((O),(K)) +#define PyMapping_DelItem(O, K) PyObject_DelItem((O), (K)) /* On success, return 1 if the mapping object 'o' has the key 'key', and 0 otherwise. diff --git a/Include/boolobject.h b/Include/boolobject.h index 28068d1cbe5939..ca21fbfad8e827 100644 --- a/Include/boolobject.h +++ b/Include/boolobject.h @@ -9,7 +9,7 @@ extern "C" { PyAPI_DATA(PyTypeObject) PyBool_Type; -#define PyBool_Check(x) Py_IS_TYPE(x, &PyBool_Type) +#define PyBool_Check(x) Py_IS_TYPE((x), &PyBool_Type) /* Py_False and Py_True are the only two bools in existence. Don't forget to apply Py_INCREF() when returning either!!! */ @@ -19,8 +19,8 @@ PyAPI_DATA(PyLongObject) _Py_FalseStruct; PyAPI_DATA(PyLongObject) _Py_TrueStruct; /* Use these macros */ -#define Py_False ((PyObject *) &_Py_FalseStruct) -#define Py_True ((PyObject *) &_Py_TrueStruct) +#define Py_False _PyObject_CAST(&_Py_FalseStruct) +#define Py_True _PyObject_CAST(&_Py_TrueStruct) // Test if an object is the True singleton, the same as "x is True" in Python. PyAPI_FUNC(int) Py_IsTrue(PyObject *x); diff --git a/Include/bytearrayobject.h b/Include/bytearrayobject.h index ae2bde1c303565..3d53fdba643267 100644 --- a/Include/bytearrayobject.h +++ b/Include/bytearrayobject.h @@ -21,8 +21,8 @@ PyAPI_DATA(PyTypeObject) PyByteArray_Type; PyAPI_DATA(PyTypeObject) PyByteArrayIter_Type; /* Type check macros */ -#define PyByteArray_Check(self) PyObject_TypeCheck(self, &PyByteArray_Type) -#define PyByteArray_CheckExact(self) Py_IS_TYPE(self, &PyByteArray_Type) +#define PyByteArray_Check(self) PyObject_TypeCheck((self), &PyByteArray_Type) +#define PyByteArray_CheckExact(self) Py_IS_TYPE((self), &PyByteArray_Type) /* Direct API functions */ PyAPI_FUNC(PyObject *) PyByteArray_FromObject(PyObject *); diff --git a/Include/bytesobject.h b/Include/bytesobject.h index 4c4dc40d705d71..ee448cd02bdab3 100644 --- a/Include/bytesobject.h +++ b/Include/bytesobject.h @@ -29,7 +29,7 @@ PyAPI_DATA(PyTypeObject) PyBytesIter_Type; #define PyBytes_Check(op) \ PyType_FastSubclass(Py_TYPE(op), Py_TPFLAGS_BYTES_SUBCLASS) -#define PyBytes_CheckExact(op) Py_IS_TYPE(op, &PyBytes_Type) +#define PyBytes_CheckExact(op) Py_IS_TYPE((op), &PyBytes_Type) PyAPI_FUNC(PyObject *) PyBytes_FromStringAndSize(const char *, Py_ssize_t); PyAPI_FUNC(PyObject *) PyBytes_FromString(const char *); diff --git a/Include/ceval.h b/Include/ceval.h index 1b57f6ea20f6f0..ad4d909d6f2b14 100644 --- a/Include/ceval.h +++ b/Include/ceval.h @@ -31,7 +31,7 @@ Py_DEPRECATED(3.9) PyAPI_FUNC(PyObject *) PyEval_CallObjectWithKeywords( /* Deprecated since PyEval_CallObjectWithKeywords is deprecated */ #define PyEval_CallObject(callable, arg) \ - PyEval_CallObjectWithKeywords(callable, arg, (PyObject *)NULL) + PyEval_CallObjectWithKeywords((callable), (arg), _PyObject_CAST(_Py_NULL)) Py_DEPRECATED(3.9) PyAPI_FUNC(PyObject *) PyEval_CallFunction( PyObject *callable, const char 
*format, ...); diff --git a/Include/complexobject.h b/Include/complexobject.h index c7764e43803d65..ebe49a832f7414 100644 --- a/Include/complexobject.h +++ b/Include/complexobject.h @@ -10,8 +10,8 @@ extern "C" { PyAPI_DATA(PyTypeObject) PyComplex_Type; -#define PyComplex_Check(op) PyObject_TypeCheck(op, &PyComplex_Type) -#define PyComplex_CheckExact(op) Py_IS_TYPE(op, &PyComplex_Type) +#define PyComplex_Check(op) PyObject_TypeCheck((op), &PyComplex_Type) +#define PyComplex_CheckExact(op) Py_IS_TYPE((op), &PyComplex_Type) PyAPI_FUNC(PyObject *) PyComplex_FromDoubles(double real, double imag); diff --git a/Include/cpython/abstract.h b/Include/cpython/abstract.h index 161e2cb30fde3e..7038918f018880 100644 --- a/Include/cpython/abstract.h +++ b/Include/cpython/abstract.h @@ -111,9 +111,8 @@ static inline PyObject * PyObject_CallMethodOneArg(PyObject *self, PyObject *name, PyObject *arg) { PyObject *args[2] = {self, arg}; - - assert(arg != NULL); size_t nargsf = 2 | PY_VECTORCALL_ARGUMENTS_OFFSET; + assert(arg != NULL); return PyObject_VectorcallMethod(name, args, nargsf, _Py_NULL); } @@ -160,9 +159,8 @@ static inline PyObject * _PyObject_CallMethodIdOneArg(PyObject *self, _Py_Identifier *name, PyObject *arg) { PyObject *args[2] = {self, arg}; - - assert(arg != NULL); size_t nargsf = 2 | PY_VECTORCALL_ARGUMENTS_OFFSET; + assert(arg != NULL); return _PyObject_VectorcallMethodId(name, args, nargsf, _Py_NULL); } @@ -178,7 +176,7 @@ PyAPI_FUNC(Py_ssize_t) PyObject_LengthHint(PyObject *o, Py_ssize_t); /* Assume tp_as_sequence and sq_item exist and that 'i' does not need to be corrected for a negative index. */ #define PySequence_ITEM(o, i)\ - ( Py_TYPE(o)->tp_as_sequence->sq_item(o, i) ) + ( Py_TYPE(o)->tp_as_sequence->sq_item((o), (i)) ) #define PY_ITERSEARCH_COUNT 1 #define PY_ITERSEARCH_INDEX 2 diff --git a/Include/cpython/bytearrayobject.h b/Include/cpython/bytearrayobject.h index 5114169c280915..9ba176eb2d3ac2 100644 --- a/Include/cpython/bytearrayobject.h +++ b/Include/cpython/bytearrayobject.h @@ -25,14 +25,10 @@ static inline char* PyByteArray_AS_STRING(PyObject *op) } return _PyByteArray_empty_string; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyByteArray_AS_STRING(self) PyByteArray_AS_STRING(_PyObject_CAST(self)) -#endif +#define PyByteArray_AS_STRING(self) PyByteArray_AS_STRING(_PyObject_CAST(self)) static inline Py_ssize_t PyByteArray_GET_SIZE(PyObject *op) { PyByteArrayObject *self = _PyByteArray_CAST(op); return Py_SIZE(self); } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyByteArray_GET_SIZE(self) PyByteArray_GET_SIZE(_PyObject_CAST(self)) -#endif +#define PyByteArray_GET_SIZE(self) PyByteArray_GET_SIZE(_PyObject_CAST(self)) diff --git a/Include/cpython/bytesobject.h b/Include/cpython/bytesobject.h index 53343661f0ec43..e982031c107de2 100644 --- a/Include/cpython/bytesobject.h +++ b/Include/cpython/bytesobject.h @@ -36,17 +36,13 @@ static inline char* PyBytes_AS_STRING(PyObject *op) { return _PyBytes_CAST(op)->ob_sval; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyBytes_AS_STRING(op) PyBytes_AS_STRING(_PyObject_CAST(op)) -#endif +#define PyBytes_AS_STRING(op) PyBytes_AS_STRING(_PyObject_CAST(op)) static inline Py_ssize_t PyBytes_GET_SIZE(PyObject *op) { PyBytesObject *self = _PyBytes_CAST(op); return Py_SIZE(self); } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyBytes_GET_SIZE(self) PyBytes_GET_SIZE(_PyObject_CAST(self)) -#endif +#define PyBytes_GET_SIZE(self) 
PyBytes_GET_SIZE(_PyObject_CAST(self)) /* _PyBytes_Join(sep, x) is like sep.join(x). sep must be PyBytesObject*, x must be an iterable object. */ diff --git a/Include/cpython/cellobject.h b/Include/cpython/cellobject.h index f778f86de74695..47a6a491497ea0 100644 --- a/Include/cpython/cellobject.h +++ b/Include/cpython/cellobject.h @@ -15,29 +15,27 @@ typedef struct { PyAPI_DATA(PyTypeObject) PyCell_Type; -#define PyCell_Check(op) Py_IS_TYPE(op, &PyCell_Type) +#define PyCell_Check(op) Py_IS_TYPE((op), &PyCell_Type) PyAPI_FUNC(PyObject *) PyCell_New(PyObject *); PyAPI_FUNC(PyObject *) PyCell_Get(PyObject *); PyAPI_FUNC(int) PyCell_Set(PyObject *, PyObject *); static inline PyObject* PyCell_GET(PyObject *op) { + PyCellObject *cell; assert(PyCell_Check(op)); - PyCellObject *cell = _Py_CAST(PyCellObject*, op); + cell = _Py_CAST(PyCellObject*, op); return cell->ob_ref; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030c0000 -# define PyCell_GET(op) PyCell_GET(_PyObject_CAST(op)) -#endif +#define PyCell_GET(op) PyCell_GET(_PyObject_CAST(op)) static inline void PyCell_SET(PyObject *op, PyObject *value) { + PyCellObject *cell; assert(PyCell_Check(op)); - PyCellObject *cell = _Py_CAST(PyCellObject*, op); + cell = _Py_CAST(PyCellObject*, op); cell->ob_ref = value; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030c0000 -# define PyCell_SET(op, value) PyCell_SET(_PyObject_CAST(op), (value)) -#endif +#define PyCell_SET(op, value) PyCell_SET(_PyObject_CAST(op), (value)) #ifdef __cplusplus } diff --git a/Include/cpython/classobject.h b/Include/cpython/classobject.h index 80df8842eb4f78..051041965002a3 100644 --- a/Include/cpython/classobject.h +++ b/Include/cpython/classobject.h @@ -19,7 +19,7 @@ typedef struct { PyAPI_DATA(PyTypeObject) PyMethod_Type; -#define PyMethod_Check(op) Py_IS_TYPE(op, &PyMethod_Type) +#define PyMethod_Check(op) Py_IS_TYPE((op), &PyMethod_Type) PyAPI_FUNC(PyObject *) PyMethod_New(PyObject *, PyObject *); @@ -29,9 +29,9 @@ PyAPI_FUNC(PyObject *) PyMethod_Self(PyObject *); /* Macros for direct access to these values. Type checks are *not* done, so use with care. */ #define PyMethod_GET_FUNCTION(meth) \ - (((PyMethodObject *)meth) -> im_func) + (((PyMethodObject *)(meth)) -> im_func) #define PyMethod_GET_SELF(meth) \ - (((PyMethodObject *)meth) -> im_self) + (((PyMethodObject *)(meth)) -> im_self) typedef struct { PyObject_HEAD @@ -40,7 +40,7 @@ typedef struct { PyAPI_DATA(PyTypeObject) PyInstanceMethod_Type; -#define PyInstanceMethod_Check(op) Py_IS_TYPE(op, &PyInstanceMethod_Type) +#define PyInstanceMethod_Check(op) Py_IS_TYPE((op), &PyInstanceMethod_Type) PyAPI_FUNC(PyObject *) PyInstanceMethod_New(PyObject *); PyAPI_FUNC(PyObject *) PyInstanceMethod_Function(PyObject *); @@ -48,7 +48,7 @@ PyAPI_FUNC(PyObject *) PyInstanceMethod_Function(PyObject *); /* Macros for direct access to these values. Type checks are *not* done, so use with care. 
*/ #define PyInstanceMethod_GET_FUNCTION(meth) \ - (((PyInstanceMethodObject *)meth) -> func) + (((PyInstanceMethodObject *)(meth)) -> func) #ifdef __cplusplus } diff --git a/Include/cpython/code.h b/Include/cpython/code.h index ba7324b48d8675..595cd9e94f31a8 100644 --- a/Include/cpython/code.h +++ b/Include/cpython/code.h @@ -29,7 +29,8 @@ typedef uint16_t _Py_CODEUNIT; #endif // Use "unsigned char" instead of "uint8_t" here to avoid illegal aliasing: -#define _Py_SET_OPCODE(word, opcode) (((unsigned char *)&(word))[0] = (opcode)) +#define _Py_SET_OPCODE(word, opcode) \ + do { ((unsigned char *)&(word))[0] = (opcode); } while (0) // To avoid repeating ourselves in deepfreeze.py, all PyCodeObject members are // defined in this macro: @@ -62,7 +63,8 @@ typedef uint16_t _Py_CODEUNIT; PyObject *co_exceptiontable; /* Byte string encoding exception handling \ table */ \ int co_flags; /* CO_..., see below */ \ - int co_warmup; /* Warmup counter for quickening */ \ + short co_warmup; /* Warmup counter for quickening */ \ + short _co_linearray_entry_size; /* Size of each entry in _co_linearray */ \ \ /* The rest are not so impactful on performance. */ \ int co_argcount; /* #arguments, except *args */ \ @@ -73,8 +75,8 @@ typedef uint16_t _Py_CODEUNIT; \ /* redundant values (derived from co_localsplusnames and \ co_localspluskinds) */ \ - int co_nlocalsplus; /* number of local + cell + free variables \ - */ \ + int co_nlocalsplus; /* number of local + cell + free variables */ \ + int co_framesize; /* Size of frame in words */ \ int co_nlocals; /* number of local variables */ \ int co_nplaincellvars; /* number of non-arg cell variables */ \ int co_ncellvars; /* total number of cell variables */ \ @@ -88,6 +90,9 @@ typedef uint16_t _Py_CODEUNIT; PyObject *co_qualname; /* unicode (qualname, for reference) */ \ PyObject *co_linetable; /* bytes object that holds location info */ \ PyObject *co_weakreflist; /* to support weakrefs to code objects */ \ + PyObject *_co_code; /* cached co_code object/attribute */ \ + int _co_firsttraceable; /* index of first traceable instruction */ \ + char *_co_linearray; /* array of line offsets */ \ /* Scratch space for extra data relating to the code object. \ Type is a void* to keep the format private in codeobject.c to force \ people to go through the proper APIs. 
*/ \ @@ -135,7 +140,7 @@ struct PyCodeObject _PyCode_DEF(1); PyAPI_DATA(PyTypeObject) PyCode_Type; -#define PyCode_Check(op) Py_IS_TYPE(op, &PyCode_Type) +#define PyCode_Check(op) Py_IS_TYPE((op), &PyCode_Type) #define PyCode_GetNumFree(op) ((op)->co_nfreevars) #define _PyCode_CODE(CO) ((_Py_CODEUNIT *)(CO)->co_code_adaptive) #define _PyCode_NBYTES(CO) (Py_SIZE(CO) * (Py_ssize_t)sizeof(_Py_CODEUNIT)) diff --git a/Include/cpython/context.h b/Include/cpython/context.h index 4db079f7633f48..9879fc7192ebb8 100644 --- a/Include/cpython/context.h +++ b/Include/cpython/context.h @@ -15,9 +15,9 @@ PyAPI_DATA(PyTypeObject) PyContextToken_Type; typedef struct _pycontexttokenobject PyContextToken; -#define PyContext_CheckExact(o) Py_IS_TYPE(o, &PyContext_Type) -#define PyContextVar_CheckExact(o) Py_IS_TYPE(o, &PyContextVar_Type) -#define PyContextToken_CheckExact(o) Py_IS_TYPE(o, &PyContextToken_Type) +#define PyContext_CheckExact(o) Py_IS_TYPE((o), &PyContext_Type) +#define PyContextVar_CheckExact(o) Py_IS_TYPE((o), &PyContextVar_Type) +#define PyContextToken_CheckExact(o) Py_IS_TYPE((o), &PyContextToken_Type) PyAPI_FUNC(PyObject *) PyContext_New(void); diff --git a/Include/cpython/dictobject.h b/Include/cpython/dictobject.h index f249c0e9ca5e29..565ad791a6cb28 100644 --- a/Include/cpython/dictobject.h +++ b/Include/cpython/dictobject.h @@ -47,13 +47,12 @@ PyAPI_FUNC(int) _PyDict_Next( /* Get the number of items of a dictionary. */ static inline Py_ssize_t PyDict_GET_SIZE(PyObject *op) { + PyDictObject *mp; assert(PyDict_Check(op)); - PyDictObject *mp = _Py_CAST(PyDictObject*, op); + mp = _Py_CAST(PyDictObject*, op); return mp->ma_used; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030c0000 -# define PyDict_GET_SIZE(op) PyDict_GET_SIZE(_PyObject_CAST(op)) -#endif +#define PyDict_GET_SIZE(op) PyDict_GET_SIZE(_PyObject_CAST(op)) PyAPI_FUNC(int) _PyDict_Contains_KnownHash(PyObject *, PyObject *, Py_hash_t); PyAPI_FUNC(int) _PyDict_ContainsId(PyObject *, _Py_Identifier *); diff --git a/Include/cpython/frameobject.h b/Include/cpython/frameobject.h index 9cd711e43559a6..4e19535c656f2c 100644 --- a/Include/cpython/frameobject.h +++ b/Include/cpython/frameobject.h @@ -6,10 +6,6 @@ /* Standard object interface */ -PyAPI_DATA(PyTypeObject) PyFrame_Type; - -#define PyFrame_Check(op) Py_IS_TYPE(op, &PyFrame_Type) - PyAPI_FUNC(PyFrameObject *) PyFrame_New(PyThreadState *, PyCodeObject *, PyObject *, PyObject *); @@ -31,12 +27,3 @@ PyAPI_FUNC(int) _PyFrame_IsEntryFrame(PyFrameObject *frame); PyAPI_FUNC(int) PyFrame_FastToLocalsWithError(PyFrameObject *f); PyAPI_FUNC(void) PyFrame_FastToLocals(PyFrameObject *); - -PyAPI_FUNC(PyFrameObject *) PyFrame_GetBack(PyFrameObject *frame); -PyAPI_FUNC(PyObject *) PyFrame_GetLocals(PyFrameObject *frame); - -PyAPI_FUNC(PyObject *) PyFrame_GetGlobals(PyFrameObject *frame); -PyAPI_FUNC(PyObject *) PyFrame_GetBuiltins(PyFrameObject *frame); - -PyAPI_FUNC(PyObject *) PyFrame_GetGenerator(PyFrameObject *frame); -PyAPI_FUNC(int) PyFrame_GetLasti(PyFrameObject *frame); diff --git a/Include/cpython/funcobject.h b/Include/cpython/funcobject.h index 99ac6008f8b611..05f76656c4bf1d 100644 --- a/Include/cpython/funcobject.h +++ b/Include/cpython/funcobject.h @@ -60,7 +60,7 @@ typedef struct { PyAPI_DATA(PyTypeObject) PyFunction_Type; -#define PyFunction_Check(op) Py_IS_TYPE(op, &PyFunction_Type) +#define PyFunction_Check(op) Py_IS_TYPE((op), &PyFunction_Type) PyAPI_FUNC(PyObject *) PyFunction_New(PyObject *, PyObject *); PyAPI_FUNC(PyObject *) PyFunction_NewWithQualName(PyObject 
*, PyObject *, PyObject *); @@ -82,22 +82,45 @@ PyAPI_FUNC(PyObject *) _PyFunction_Vectorcall( size_t nargsf, PyObject *kwnames); -/* Macros for direct access to these values. Type checks are *not* - done, so use with care. */ -#define PyFunction_GET_CODE(func) \ - (((PyFunctionObject *)func) -> func_code) -#define PyFunction_GET_GLOBALS(func) \ - (((PyFunctionObject *)func) -> func_globals) -#define PyFunction_GET_MODULE(func) \ - (((PyFunctionObject *)func) -> func_module) -#define PyFunction_GET_DEFAULTS(func) \ - (((PyFunctionObject *)func) -> func_defaults) -#define PyFunction_GET_KW_DEFAULTS(func) \ - (((PyFunctionObject *)func) -> func_kwdefaults) -#define PyFunction_GET_CLOSURE(func) \ - (((PyFunctionObject *)func) -> func_closure) -#define PyFunction_GET_ANNOTATIONS(func) \ - (((PyFunctionObject *)func) -> func_annotations) +#define _PyFunction_CAST(func) \ + (assert(PyFunction_Check(func)), _Py_CAST(PyFunctionObject*, func)) + +/* Static inline functions for direct access to these values. + Type checks are *not* done, so use with care. */ +static inline PyObject* PyFunction_GET_CODE(PyObject *func) { + return _PyFunction_CAST(func)->func_code; +} +#define PyFunction_GET_CODE(func) PyFunction_GET_CODE(_PyObject_CAST(func)) + +static inline PyObject* PyFunction_GET_GLOBALS(PyObject *func) { + return _PyFunction_CAST(func)->func_globals; +} +#define PyFunction_GET_GLOBALS(func) PyFunction_GET_GLOBALS(_PyObject_CAST(func)) + +static inline PyObject* PyFunction_GET_MODULE(PyObject *func) { + return _PyFunction_CAST(func)->func_module; +} +#define PyFunction_GET_MODULE(func) PyFunction_GET_MODULE(_PyObject_CAST(func)) + +static inline PyObject* PyFunction_GET_DEFAULTS(PyObject *func) { + return _PyFunction_CAST(func)->func_defaults; +} +#define PyFunction_GET_DEFAULTS(func) PyFunction_GET_DEFAULTS(_PyObject_CAST(func)) + +static inline PyObject* PyFunction_GET_KW_DEFAULTS(PyObject *func) { + return _PyFunction_CAST(func)->func_kwdefaults; +} +#define PyFunction_GET_KW_DEFAULTS(func) PyFunction_GET_KW_DEFAULTS(_PyObject_CAST(func)) + +static inline PyObject* PyFunction_GET_CLOSURE(PyObject *func) { + return _PyFunction_CAST(func)->func_closure; +} +#define PyFunction_GET_CLOSURE(func) PyFunction_GET_CLOSURE(_PyObject_CAST(func)) + +static inline PyObject* PyFunction_GET_ANNOTATIONS(PyObject *func) { + return _PyFunction_CAST(func)->func_annotations; +} +#define PyFunction_GET_ANNOTATIONS(func) PyFunction_GET_ANNOTATIONS(_PyObject_CAST(func)) /* The classmethod and staticmethod types lives here, too */ PyAPI_DATA(PyTypeObject) PyClassMethod_Type; diff --git a/Include/cpython/genobject.h b/Include/cpython/genobject.h index 40eaa19d3fad94..6127ba7babb80f 100644 --- a/Include/cpython/genobject.h +++ b/Include/cpython/genobject.h @@ -37,8 +37,8 @@ typedef struct { PyAPI_DATA(PyTypeObject) PyGen_Type; -#define PyGen_Check(op) PyObject_TypeCheck(op, &PyGen_Type) -#define PyGen_CheckExact(op) Py_IS_TYPE(op, &PyGen_Type) +#define PyGen_Check(op) PyObject_TypeCheck((op), &PyGen_Type) +#define PyGen_CheckExact(op) Py_IS_TYPE((op), &PyGen_Type) PyAPI_FUNC(PyObject *) PyGen_New(PyFrameObject *); PyAPI_FUNC(PyObject *) PyGen_NewWithQualName(PyFrameObject *, @@ -57,7 +57,7 @@ typedef struct { PyAPI_DATA(PyTypeObject) PyCoro_Type; PyAPI_DATA(PyTypeObject) _PyCoroWrapper_Type; -#define PyCoro_CheckExact(op) Py_IS_TYPE(op, &PyCoro_Type) +#define PyCoro_CheckExact(op) Py_IS_TYPE((op), &PyCoro_Type) PyAPI_FUNC(PyObject *) PyCoro_New(PyFrameObject *, PyObject *name, PyObject *qualname); @@ -76,7 +76,7 @@ 
PyAPI_DATA(PyTypeObject) _PyAsyncGenAThrow_Type; PyAPI_FUNC(PyObject *) PyAsyncGen_New(PyFrameObject *, PyObject *name, PyObject *qualname); -#define PyAsyncGen_CheckExact(op) Py_IS_TYPE(op, &PyAsyncGen_Type) +#define PyAsyncGen_CheckExact(op) Py_IS_TYPE((op), &PyAsyncGen_Type) #undef _PyGenObject_HEAD diff --git a/Include/cpython/import.h b/Include/cpython/import.h index ef6be689468ee5..a69b4f34def342 100644 --- a/Include/cpython/import.h +++ b/Include/cpython/import.h @@ -40,3 +40,6 @@ struct _frozen { collection of frozen modules: */ PyAPI_DATA(const struct _frozen *) PyImport_FrozenModules; + +PyAPI_DATA(PyObject *) _PyImport_GetModuleAttr(PyObject *, PyObject *); +PyAPI_DATA(PyObject *) _PyImport_GetModuleAttrString(const char *, const char *); diff --git a/Include/cpython/listobject.h b/Include/cpython/listobject.h index 1add8213e0c092..8fa82122d8d248 100644 --- a/Include/cpython/listobject.h +++ b/Include/cpython/listobject.h @@ -34,18 +34,14 @@ static inline Py_ssize_t PyList_GET_SIZE(PyObject *op) { PyListObject *list = _PyList_CAST(op); return Py_SIZE(list); } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyList_GET_SIZE(op) PyList_GET_SIZE(_PyObject_CAST(op)) -#endif +#define PyList_GET_SIZE(op) PyList_GET_SIZE(_PyObject_CAST(op)) -#define PyList_GET_ITEM(op, index) (_PyList_CAST(op)->ob_item[index]) +#define PyList_GET_ITEM(op, index) (_PyList_CAST(op)->ob_item[(index)]) static inline void PyList_SET_ITEM(PyObject *op, Py_ssize_t index, PyObject *value) { PyListObject *list = _PyList_CAST(op); list->ob_item[index] = value; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 #define PyList_SET_ITEM(op, index, value) \ - PyList_SET_ITEM(_PyObject_CAST(op), index, _PyObject_CAST(value)) -#endif + PyList_SET_ITEM(_PyObject_CAST(op), (index), _PyObject_CAST(value)) diff --git a/Include/cpython/methodobject.h b/Include/cpython/methodobject.h index 54a61cfd077be9..d541e154948041 100644 --- a/Include/cpython/methodobject.h +++ b/Include/cpython/methodobject.h @@ -31,8 +31,8 @@ typedef struct { PyAPI_DATA(PyTypeObject) PyCMethod_Type; -#define PyCMethod_CheckExact(op) Py_IS_TYPE(op, &PyCMethod_Type) -#define PyCMethod_Check(op) PyObject_TypeCheck(op, &PyCMethod_Type) +#define PyCMethod_CheckExact(op) Py_IS_TYPE((op), &PyCMethod_Type) +#define PyCMethod_Check(op) PyObject_TypeCheck((op), &PyCMethod_Type) /* Static inline functions for direct access to these values. 
@@ -40,9 +40,7 @@ PyAPI_DATA(PyTypeObject) PyCMethod_Type; static inline PyCFunction PyCFunction_GET_FUNCTION(PyObject *func) { return _PyCFunctionObject_CAST(func)->m_ml->ml_meth; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyCFunction_GET_FUNCTION(func) PyCFunction_GET_FUNCTION(_PyObject_CAST(func)) -#endif +#define PyCFunction_GET_FUNCTION(func) PyCFunction_GET_FUNCTION(_PyObject_CAST(func)) static inline PyObject* PyCFunction_GET_SELF(PyObject *func_obj) { PyCFunctionObject *func = _PyCFunctionObject_CAST(func_obj); @@ -51,16 +49,12 @@ static inline PyObject* PyCFunction_GET_SELF(PyObject *func_obj) { } return func->m_self; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyCFunction_GET_SELF(func) PyCFunction_GET_SELF(_PyObject_CAST(func)) -#endif +#define PyCFunction_GET_SELF(func) PyCFunction_GET_SELF(_PyObject_CAST(func)) static inline int PyCFunction_GET_FLAGS(PyObject *func) { return _PyCFunctionObject_CAST(func)->m_ml->ml_flags; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyCFunction_GET_FLAGS(func) PyCFunction_GET_FLAGS(_PyObject_CAST(func)) -#endif +#define PyCFunction_GET_FLAGS(func) PyCFunction_GET_FLAGS(_PyObject_CAST(func)) static inline PyTypeObject* PyCFunction_GET_CLASS(PyObject *func_obj) { PyCFunctionObject *func = _PyCFunctionObject_CAST(func_obj); @@ -69,6 +63,4 @@ static inline PyTypeObject* PyCFunction_GET_CLASS(PyObject *func_obj) { } return _Py_NULL; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyCFunction_GET_CLASS(func) PyCFunction_GET_CLASS(_PyObject_CAST(func)) -#endif +#define PyCFunction_GET_CLASS(func) PyCFunction_GET_CLASS(_PyObject_CAST(func)) diff --git a/Include/cpython/modsupport.h b/Include/cpython/modsupport.h index 769eb52bf6e3ae..591dcb132961f8 100644 --- a/Include/cpython/modsupport.h +++ b/Include/cpython/modsupport.h @@ -34,11 +34,13 @@ PyAPI_FUNC(int) _PyArg_NoPositional(const char *funcname, PyObject *args); #define _PyArg_NoPositional(funcname, args) \ ((args) == NULL || _PyArg_NoPositional((funcname), (args))) +#define _Py_ANY_VARARGS(n) ((n) == PY_SSIZE_T_MAX) + PyAPI_FUNC(void) _PyArg_BadArgument(const char *, const char *, const char *, PyObject *); PyAPI_FUNC(int) _PyArg_CheckPositional(const char *, Py_ssize_t, Py_ssize_t, Py_ssize_t); #define _PyArg_CheckPositional(funcname, nargs, min, max) \ - ((!ANY_VARARGS(max) && (min) <= (nargs) && (nargs) <= (max)) \ + ((!_Py_ANY_VARARGS(max) && (min) <= (nargs) && (nargs) <= (max)) \ || _PyArg_CheckPositional((funcname), (nargs), (min), (max))) PyAPI_FUNC(PyObject **) _Py_VaBuildStack( @@ -98,7 +100,7 @@ PyAPI_FUNC(PyObject * const *) _PyArg_UnpackKeywordsWithVararg( #define _PyArg_UnpackKeywords(args, nargs, kwargs, kwnames, parser, minpos, maxpos, minkw, buf) \ (((minkw) == 0 && (kwargs) == NULL && (kwnames) == NULL && \ - (minpos) <= (nargs) && (nargs) <= (maxpos) && args != NULL) ? (args) : \ + (minpos) <= (nargs) && (nargs) <= (maxpos) && (args) != NULL) ? (args) : \ _PyArg_UnpackKeywords((args), (nargs), (kwargs), (kwnames), (parser), \ (minpos), (maxpos), (minkw), (buf))) diff --git a/Include/cpython/object.h b/Include/cpython/object.h index b018dabf9d862f..614d6c18ee0b4a 100644 --- a/Include/cpython/object.h +++ b/Include/cpython/object.h @@ -45,7 +45,7 @@ typedef struct _Py_Identifier { // For now we are keeping _Py_IDENTIFIER for continued use // in non-builtin extensions (and naughty PyPI modules). 
-#define _Py_static_string_init(value) { .string = value, .index = -1 } +#define _Py_static_string_init(value) { .string = (value), .index = -1 } #define _Py_static_string(varname, value) static _Py_Identifier varname = _Py_static_string_init(value) #define _Py_IDENTIFIER(varname) _Py_static_string(PyId_##varname, #varname) @@ -385,9 +385,9 @@ _PyObject_DebugTypeStats(FILE *out); #endif #define _PyObject_ASSERT_WITH_MSG(obj, expr, msg) \ - _PyObject_ASSERT_FROM(obj, expr, msg, __FILE__, __LINE__, __func__) + _PyObject_ASSERT_FROM((obj), expr, (msg), __FILE__, __LINE__, __func__) #define _PyObject_ASSERT(obj, expr) \ - _PyObject_ASSERT_WITH_MSG(obj, expr, NULL) + _PyObject_ASSERT_WITH_MSG((obj), expr, NULL) #define _PyObject_ASSERT_FAILED_MSG(obj, msg) \ _PyObject_AssertFailed((obj), NULL, (msg), __FILE__, __LINE__, __func__) @@ -493,8 +493,8 @@ PyAPI_FUNC(int) _PyTrash_cond(PyObject *op, destructor dealloc); } while (0); #define Py_TRASHCAN_BEGIN(op, dealloc) \ - Py_TRASHCAN_BEGIN_CONDITION(op, \ - _PyTrash_cond(_PyObject_CAST(op), (destructor)dealloc)) + Py_TRASHCAN_BEGIN_CONDITION((op), \ + _PyTrash_cond(_PyObject_CAST(op), (destructor)(dealloc))) /* The following two macros, Py_TRASHCAN_SAFE_BEGIN and * Py_TRASHCAN_SAFE_END, are deprecated since version 3.11 and @@ -505,7 +505,7 @@ Py_DEPRECATED(3.11) typedef int UsingDeprecatedTrashcanMacro; #define Py_TRASHCAN_SAFE_BEGIN(op) \ do { \ UsingDeprecatedTrashcanMacro cond=1; \ - Py_TRASHCAN_BEGIN_CONDITION(op, cond); + Py_TRASHCAN_BEGIN_CONDITION((op), cond); #define Py_TRASHCAN_SAFE_END(op) \ Py_TRASHCAN_END; \ } while(0); diff --git a/Include/cpython/odictobject.h b/Include/cpython/odictobject.h index e070413017d801..3822d554868c10 100644 --- a/Include/cpython/odictobject.h +++ b/Include/cpython/odictobject.h @@ -18,8 +18,8 @@ PyAPI_DATA(PyTypeObject) PyODictKeys_Type; PyAPI_DATA(PyTypeObject) PyODictItems_Type; PyAPI_DATA(PyTypeObject) PyODictValues_Type; -#define PyODict_Check(op) PyObject_TypeCheck(op, &PyODict_Type) -#define PyODict_CheckExact(op) Py_IS_TYPE(op, &PyODict_Type) +#define PyODict_Check(op) PyObject_TypeCheck((op), &PyODict_Type) +#define PyODict_CheckExact(op) Py_IS_TYPE((op), &PyODict_Type) #define PyODict_SIZE(op) PyDict_GET_SIZE((op)) PyAPI_FUNC(PyObject *) PyODict_New(void); @@ -27,13 +27,13 @@ PyAPI_FUNC(int) PyODict_SetItem(PyObject *od, PyObject *key, PyObject *item); PyAPI_FUNC(int) PyODict_DelItem(PyObject *od, PyObject *key); /* wrappers around PyDict* functions */ -#define PyODict_GetItem(od, key) PyDict_GetItem(_PyObject_CAST(od), key) +#define PyODict_GetItem(od, key) PyDict_GetItem(_PyObject_CAST(od), (key)) #define PyODict_GetItemWithError(od, key) \ - PyDict_GetItemWithError(_PyObject_CAST(od), key) -#define PyODict_Contains(od, key) PyDict_Contains(_PyObject_CAST(od), key) + PyDict_GetItemWithError(_PyObject_CAST(od), (key)) +#define PyODict_Contains(od, key) PyDict_Contains(_PyObject_CAST(od), (key)) #define PyODict_Size(od) PyDict_Size(_PyObject_CAST(od)) #define PyODict_GetItemString(od, key) \ - PyDict_GetItemString(_PyObject_CAST(od), key) + PyDict_GetItemString(_PyObject_CAST(od), (key)) #endif diff --git a/Include/cpython/picklebufobject.h b/Include/cpython/picklebufobject.h index 0df2561dceaea0..f3cbaeef919518 100644 --- a/Include/cpython/picklebufobject.h +++ b/Include/cpython/picklebufobject.h @@ -12,7 +12,7 @@ extern "C" { PyAPI_DATA(PyTypeObject) PyPickleBuffer_Type; -#define PyPickleBuffer_Check(op) Py_IS_TYPE(op, &PyPickleBuffer_Type) +#define PyPickleBuffer_Check(op) Py_IS_TYPE((op), 
&PyPickleBuffer_Type) /* Create a PickleBuffer redirecting to the given buffer-enabled object */ PyAPI_FUNC(PyObject *) PyPickleBuffer_FromObject(PyObject *); diff --git a/Include/cpython/pydebug.h b/Include/cpython/pydebug.h index cab799f0b38e0c..f6ebd99ed7e2ff 100644 --- a/Include/cpython/pydebug.h +++ b/Include/cpython/pydebug.h @@ -5,31 +5,31 @@ extern "C" { #endif -PyAPI_DATA(int) Py_DebugFlag; -PyAPI_DATA(int) Py_VerboseFlag; -PyAPI_DATA(int) Py_QuietFlag; -PyAPI_DATA(int) Py_InteractiveFlag; -PyAPI_DATA(int) Py_InspectFlag; -PyAPI_DATA(int) Py_OptimizeFlag; -PyAPI_DATA(int) Py_NoSiteFlag; -PyAPI_DATA(int) Py_BytesWarningFlag; -PyAPI_DATA(int) Py_FrozenFlag; -PyAPI_DATA(int) Py_IgnoreEnvironmentFlag; -PyAPI_DATA(int) Py_DontWriteBytecodeFlag; -PyAPI_DATA(int) Py_NoUserSiteDirectory; -PyAPI_DATA(int) Py_UnbufferedStdioFlag; -PyAPI_DATA(int) Py_HashRandomizationFlag; -PyAPI_DATA(int) Py_IsolatedFlag; +Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_DebugFlag; +Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_VerboseFlag; +Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_QuietFlag; +Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_InteractiveFlag; +Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_InspectFlag; +Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_OptimizeFlag; +Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_NoSiteFlag; +Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_BytesWarningFlag; +Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_FrozenFlag; +Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_IgnoreEnvironmentFlag; +Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_DontWriteBytecodeFlag; +Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_NoUserSiteDirectory; +Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_UnbufferedStdioFlag; +Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_HashRandomizationFlag; +Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_IsolatedFlag; #ifdef MS_WINDOWS -PyAPI_DATA(int) Py_LegacyWindowsFSEncodingFlag; -PyAPI_DATA(int) Py_LegacyWindowsStdioFlag; +Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_LegacyWindowsFSEncodingFlag; +Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_LegacyWindowsStdioFlag; #endif /* this is a wrapper around getenv() that pays attention to Py_IgnoreEnvironmentFlag. 
It should be used for getting variables like PYTHONPATH and PYTHONHOME from the environment */ -PyAPI_DATA(char*) Py_GETENV(const char *name); +PyAPI_FUNC(char*) Py_GETENV(const char *name); #ifdef __cplusplus } diff --git a/Include/cpython/pyerrors.h b/Include/cpython/pyerrors.h index 47d80e3242302d..f33d3caaa2082e 100644 --- a/Include/cpython/pyerrors.h +++ b/Include/cpython/pyerrors.h @@ -176,4 +176,4 @@ PyAPI_FUNC(void) _Py_NO_RETURN _Py_FatalErrorFormat( const char *format, ...); -#define Py_FatalError(message) _Py_FatalErrorFunc(__func__, message) +#define Py_FatalError(message) _Py_FatalErrorFunc(__func__, (message)) diff --git a/Include/cpython/pyframe.h b/Include/cpython/pyframe.h new file mode 100644 index 00000000000000..1dc634ccee9a27 --- /dev/null +++ b/Include/cpython/pyframe.h @@ -0,0 +1,17 @@ +#ifndef Py_CPYTHON_PYFRAME_H +# error "this header file must not be included directly" +#endif + +PyAPI_DATA(PyTypeObject) PyFrame_Type; + +#define PyFrame_Check(op) Py_IS_TYPE((op), &PyFrame_Type) + +PyAPI_FUNC(PyFrameObject *) PyFrame_GetBack(PyFrameObject *frame); +PyAPI_FUNC(PyObject *) PyFrame_GetLocals(PyFrameObject *frame); + +PyAPI_FUNC(PyObject *) PyFrame_GetGlobals(PyFrameObject *frame); +PyAPI_FUNC(PyObject *) PyFrame_GetBuiltins(PyFrameObject *frame); + +PyAPI_FUNC(PyObject *) PyFrame_GetGenerator(PyFrameObject *frame); +PyAPI_FUNC(int) PyFrame_GetLasti(PyFrameObject *frame); + diff --git a/Include/cpython/pystate.h b/Include/cpython/pystate.h index 2bd46067cbbe18..cc3c3eae941933 100644 --- a/Include/cpython/pystate.h +++ b/Include/cpython/pystate.h @@ -279,7 +279,10 @@ PyAPI_FUNC(const PyConfig*) _PyInterpreterState_GetConfig(PyInterpreterState *in for example. Python must be preinitialized to call this method. - The caller must hold the GIL. */ + The caller must hold the GIL. + + Once done with the configuration, PyConfig_Clear() must be called to clear + it. 
*/ PyAPI_FUNC(int) _PyInterpreterState_GetConfigCopy( struct PyConfig *config); diff --git a/Include/cpython/pythonrun.h b/Include/cpython/pythonrun.h index 2e72d0820d34f5..fb617655374026 100644 --- a/Include/cpython/pythonrun.h +++ b/Include/cpython/pythonrun.h @@ -66,8 +66,8 @@ PyAPI_FUNC(PyObject *) Py_CompileStringObject( PyCompilerFlags *flags, int optimize); -#define Py_CompileString(str, p, s) Py_CompileStringExFlags(str, p, s, NULL, -1) -#define Py_CompileStringFlags(str, p, s, f) Py_CompileStringExFlags(str, p, s, f, -1) +#define Py_CompileString(str, p, s) Py_CompileStringExFlags((str), (p), (s), NULL, -1) +#define Py_CompileStringFlags(str, p, s, f) Py_CompileStringExFlags((str), (p), (s), (f), -1) PyAPI_FUNC(const char *) _Py_SourceAsString( @@ -96,23 +96,23 @@ PyAPI_FUNC(PyObject *) PyRun_FileEx(FILE *fp, const char *p, int s, PyObject *g, PyAPI_FUNC(PyObject *) PyRun_FileFlags(FILE *fp, const char *p, int s, PyObject *g, PyObject *l, PyCompilerFlags *flags); /* Use macros for a bunch of old variants */ -#define PyRun_String(str, s, g, l) PyRun_StringFlags(str, s, g, l, NULL) -#define PyRun_AnyFile(fp, name) PyRun_AnyFileExFlags(fp, name, 0, NULL) +#define PyRun_String(str, s, g, l) PyRun_StringFlags((str), (s), (g), (l), NULL) +#define PyRun_AnyFile(fp, name) PyRun_AnyFileExFlags((fp), (name), 0, NULL) #define PyRun_AnyFileEx(fp, name, closeit) \ - PyRun_AnyFileExFlags(fp, name, closeit, NULL) + PyRun_AnyFileExFlags((fp), (name), (closeit), NULL) #define PyRun_AnyFileFlags(fp, name, flags) \ - PyRun_AnyFileExFlags(fp, name, 0, flags) -#define PyRun_SimpleString(s) PyRun_SimpleStringFlags(s, NULL) -#define PyRun_SimpleFile(f, p) PyRun_SimpleFileExFlags(f, p, 0, NULL) -#define PyRun_SimpleFileEx(f, p, c) PyRun_SimpleFileExFlags(f, p, c, NULL) -#define PyRun_InteractiveOne(f, p) PyRun_InteractiveOneFlags(f, p, NULL) -#define PyRun_InteractiveLoop(f, p) PyRun_InteractiveLoopFlags(f, p, NULL) + PyRun_AnyFileExFlags((fp), (name), 0, (flags)) +#define PyRun_SimpleString(s) PyRun_SimpleStringFlags((s), NULL) +#define PyRun_SimpleFile(f, p) PyRun_SimpleFileExFlags((f), (p), 0, NULL) +#define PyRun_SimpleFileEx(f, p, c) PyRun_SimpleFileExFlags((f), (p), (c), NULL) +#define PyRun_InteractiveOne(f, p) PyRun_InteractiveOneFlags((f), (p), NULL) +#define PyRun_InteractiveLoop(f, p) PyRun_InteractiveLoopFlags((f), (p), NULL) #define PyRun_File(fp, p, s, g, l) \ - PyRun_FileExFlags(fp, p, s, g, l, 0, NULL) + PyRun_FileExFlags((fp), (p), (s), (g), (l), 0, NULL) #define PyRun_FileEx(fp, p, s, g, l, c) \ - PyRun_FileExFlags(fp, p, s, g, l, c, NULL) + PyRun_FileExFlags((fp), (p), (s), (g), (l), (c), NULL) #define PyRun_FileFlags(fp, p, s, g, l, flags) \ - PyRun_FileExFlags(fp, p, s, g, l, 0, flags) + PyRun_FileExFlags((fp), (p), (s), (g), (l), 0, (flags)) /* Stuff with no proper home (yet) */ diff --git a/Include/cpython/pytime.h b/Include/cpython/pytime.h index 23d4f16a8fd847..e64f3b13e75ca1 100644 --- a/Include/cpython/pytime.h +++ b/Include/cpython/pytime.h @@ -130,6 +130,10 @@ PyAPI_FUNC(_PyTime_t) _PyTime_FromSeconds(int seconds); /* Create a timestamp from a number of nanoseconds. */ PyAPI_FUNC(_PyTime_t) _PyTime_FromNanoseconds(_PyTime_t ns); +/* Create a timestamp from a number of microseconds. + * Clamp to [_PyTime_MIN; _PyTime_MAX] on overflow. */ +PyAPI_FUNC(_PyTime_t) _PyTime_FromMicrosecondsClamp(_PyTime_t us); + /* Create a timestamp from nanoseconds (Python int). 
*/ PyAPI_FUNC(int) _PyTime_FromNanosecondsObject(_PyTime_t *t, PyObject *obj); diff --git a/Include/cpython/tupleobject.h b/Include/cpython/tupleobject.h index 3d9c1aff588634..f6a1f076e03330 100644 --- a/Include/cpython/tupleobject.h +++ b/Include/cpython/tupleobject.h @@ -23,11 +23,9 @@ static inline Py_ssize_t PyTuple_GET_SIZE(PyObject *op) { PyTupleObject *tuple = _PyTuple_CAST(op); return Py_SIZE(tuple); } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyTuple_GET_SIZE(op) PyTuple_GET_SIZE(_PyObject_CAST(op)) -#endif +#define PyTuple_GET_SIZE(op) PyTuple_GET_SIZE(_PyObject_CAST(op)) -#define PyTuple_GET_ITEM(op, index) (_PyTuple_CAST(op)->ob_item[index]) +#define PyTuple_GET_ITEM(op, index) (_PyTuple_CAST(op)->ob_item[(index)]) /* Function *only* to be used to fill in brand new tuples */ static inline void @@ -35,9 +33,7 @@ PyTuple_SET_ITEM(PyObject *op, Py_ssize_t index, PyObject *value) { PyTupleObject *tuple = _PyTuple_CAST(op); tuple->ob_item[index] = value; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 #define PyTuple_SET_ITEM(op, index, value) \ - PyTuple_SET_ITEM(_PyObject_CAST(op), index, _PyObject_CAST(value)) -#endif + PyTuple_SET_ITEM(_PyObject_CAST(op), (index), _PyObject_CAST(value)) PyAPI_FUNC(void) _PyTuple_DebugMallocStats(FILE *out); diff --git a/Include/cpython/unicodeobject.h b/Include/cpython/unicodeobject.h index 37bb13cbe5397d..3ca6ace24c5f74 100644 --- a/Include/cpython/unicodeobject.h +++ b/Include/cpython/unicodeobject.h @@ -188,17 +188,13 @@ PyAPI_FUNC(int) _PyUnicode_CheckConsistency( static inline unsigned int PyUnicode_CHECK_INTERNED(PyObject *op) { return _PyASCIIObject_CAST(op)->state.interned; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyUnicode_CHECK_INTERNED(op) PyUnicode_CHECK_INTERNED(_PyObject_CAST(op)) -#endif +#define PyUnicode_CHECK_INTERNED(op) PyUnicode_CHECK_INTERNED(_PyObject_CAST(op)) /* For backward compatibility */ -static inline unsigned int PyUnicode_IS_READY(PyObject *op) { +static inline unsigned int PyUnicode_IS_READY(PyObject* Py_UNUSED(op)) { return 1; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyUnicode_IS_READY(op) PyUnicode_IS_READY(_PyObject_CAST(op)) -#endif +#define PyUnicode_IS_READY(op) PyUnicode_IS_READY(_PyObject_CAST(op)) /* Return true if the string contains only ASCII characters, or 0 if not. The string may be compact (PyUnicode_IS_COMPACT_ASCII) or not, but must be @@ -206,27 +202,21 @@ static inline unsigned int PyUnicode_IS_READY(PyObject *op) { static inline unsigned int PyUnicode_IS_ASCII(PyObject *op) { return _PyASCIIObject_CAST(op)->state.ascii; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyUnicode_IS_ASCII(op) PyUnicode_IS_ASCII(_PyObject_CAST(op)) -#endif +#define PyUnicode_IS_ASCII(op) PyUnicode_IS_ASCII(_PyObject_CAST(op)) /* Return true if the string is compact or 0 if not. No type checks or Ready calls are performed. */ static inline unsigned int PyUnicode_IS_COMPACT(PyObject *op) { return _PyASCIIObject_CAST(op)->state.compact; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyUnicode_IS_COMPACT(op) PyUnicode_IS_COMPACT(_PyObject_CAST(op)) -#endif +#define PyUnicode_IS_COMPACT(op) PyUnicode_IS_COMPACT(_PyObject_CAST(op)) /* Return true if the string is a compact ASCII string (use PyASCIIObject structure), or 0 if not. No type checks or Ready calls are performed. 
*/ static inline int PyUnicode_IS_COMPACT_ASCII(PyObject *op) { return (_PyASCIIObject_CAST(op)->state.ascii && PyUnicode_IS_COMPACT(op)); } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyUnicode_IS_COMPACT_ASCII(op) PyUnicode_IS_COMPACT_ASCII(_PyObject_CAST(op)) -#endif +#define PyUnicode_IS_COMPACT_ASCII(op) PyUnicode_IS_COMPACT_ASCII(_PyObject_CAST(op)) enum PyUnicode_Kind { /* Return values of the PyUnicode_KIND() function: */ @@ -236,22 +226,14 @@ enum PyUnicode_Kind { }; // PyUnicode_KIND(): Return one of the PyUnicode_*_KIND values defined above. -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030c0000 +// // gh-89653: Converting this macro to a static inline function would introduce // new compiler warnings on "kind < PyUnicode_KIND(str)" (compare signed and // unsigned numbers) where kind type is an int or on // "unsigned int kind = PyUnicode_KIND(str)" (cast signed to unsigned). // Only declare the function as static inline function in the limited C API // version 3.12 which is stricter. -#define PyUnicode_KIND(op) \ - (_PyASCIIObject_CAST(op)->state.kind) -#else -// Limited C API 3.12 and newer -static inline int PyUnicode_KIND(PyObject *op) { - assert(PyUnicode_IS_READY(op)); - return _PyASCIIObject_CAST(op)->state.kind; -} -#endif +#define PyUnicode_KIND(op) (_PyASCIIObject_CAST(op)->state.kind) /* Return a void pointer to the raw unicode buffer. */ static inline void* _PyUnicode_COMPACT_DATA(PyObject *op) { @@ -262,8 +244,9 @@ static inline void* _PyUnicode_COMPACT_DATA(PyObject *op) { } static inline void* _PyUnicode_NONCOMPACT_DATA(PyObject *op) { + void *data; assert(!PyUnicode_IS_COMPACT(op)); - void *data = _PyUnicodeObject_CAST(op)->data.any; + data = _PyUnicodeObject_CAST(op)->data.any; assert(data != NULL); return data; } @@ -274,9 +257,7 @@ static inline void* PyUnicode_DATA(PyObject *op) { } return _PyUnicode_NONCOMPACT_DATA(op); } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyUnicode_DATA(op) PyUnicode_DATA(_PyObject_CAST(op)) -#endif +#define PyUnicode_DATA(op) PyUnicode_DATA(_PyObject_CAST(op)) /* Return pointers to the canonical representation cast to unsigned char, Py_UCS2, or Py_UCS4 for direct character access. @@ -291,9 +272,7 @@ static inline void* PyUnicode_DATA(PyObject *op) { static inline Py_ssize_t PyUnicode_GET_LENGTH(PyObject *op) { return _PyASCIIObject_CAST(op)->length; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyUnicode_GET_LENGTH(op) PyUnicode_GET_LENGTH(_PyObject_CAST(op)) -#endif +#define PyUnicode_GET_LENGTH(op) PyUnicode_GET_LENGTH(_PyObject_CAST(op)) /* Write into the canonical representation, this function does not do any sanity checks and is intended for usage in loops. The caller should cache the @@ -303,6 +282,7 @@ static inline Py_ssize_t PyUnicode_GET_LENGTH(PyObject *op) { static inline void PyUnicode_WRITE(int kind, void *data, Py_ssize_t index, Py_UCS4 value) { + assert(index >= 0); if (kind == PyUnicode_1BYTE_KIND) { assert(value <= 0xffU); _Py_STATIC_CAST(Py_UCS1*, data)[index] = _Py_STATIC_CAST(Py_UCS1, value); @@ -317,17 +297,16 @@ static inline void PyUnicode_WRITE(int kind, void *data, _Py_STATIC_CAST(Py_UCS4*, data)[index] = value; } } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 #define PyUnicode_WRITE(kind, data, index, value) \ PyUnicode_WRITE(_Py_STATIC_CAST(int, kind), _Py_CAST(void*, data), \ (index), _Py_STATIC_CAST(Py_UCS4, value)) -#endif /* Read a code point from the string's canonical representation. 
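The gh-89653 comment above can be made concrete with a small toy program; KIND_FN is only a stand-in for a hypothetical static inline PyUnicode_KIND() returning int, and no CPython headers are involved. Compiled with -Wsign-compare and -Wsign-conversion, both flagged lines would warn, which is the reason the header keeps PyUnicode_KIND as a macro.

#include <stdio.h>

static inline int KIND_FN(void) { return 2; }   /* signed return, like an inline form would have */

int main(void)
{
    unsigned int kind = KIND_FN();   /* -Wsign-conversion: int implicitly converted to unsigned */
    if (kind < KIND_FN()) {          /* -Wsign-compare: unsigned compared with signed */
        puts("smaller kind");
    }
    return 0;
}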
No checks or ready calls are performed. */ static inline Py_UCS4 PyUnicode_READ(int kind, const void *data, Py_ssize_t index) { + assert(index >= 0); if (kind == PyUnicode_1BYTE_KIND) { return _Py_STATIC_CAST(const Py_UCS1*, data)[index]; } @@ -337,11 +316,10 @@ static inline Py_UCS4 PyUnicode_READ(int kind, assert(kind == PyUnicode_4BYTE_KIND); return _Py_STATIC_CAST(const Py_UCS4*, data)[index]; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 #define PyUnicode_READ(kind, data, index) \ - PyUnicode_READ(_Py_STATIC_CAST(int, kind), _Py_CAST(const void*, data), \ + PyUnicode_READ(_Py_STATIC_CAST(int, kind), \ + _Py_STATIC_CAST(const void*, data), \ (index)) -#endif /* PyUnicode_READ_CHAR() is less efficient than PyUnicode_READ() because it calls PyUnicode_KIND() and might call it twice. For single reads, use @@ -349,7 +327,13 @@ static inline Py_UCS4 PyUnicode_READ(int kind, cache kind and use PyUnicode_READ instead. */ static inline Py_UCS4 PyUnicode_READ_CHAR(PyObject *unicode, Py_ssize_t index) { - int kind = PyUnicode_KIND(unicode); + int kind; + + assert(index >= 0); + // Tolerate reading the NUL character at str[len(str)] + assert(index <= PyUnicode_GET_LENGTH(unicode)); + + kind = PyUnicode_KIND(unicode); if (kind == PyUnicode_1BYTE_KIND) { return PyUnicode_1BYTE_DATA(unicode)[index]; } @@ -359,21 +343,21 @@ static inline Py_UCS4 PyUnicode_READ_CHAR(PyObject *unicode, Py_ssize_t index) assert(kind == PyUnicode_4BYTE_KIND); return PyUnicode_4BYTE_DATA(unicode)[index]; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyUnicode_READ_CHAR(unicode, index) \ - PyUnicode_READ_CHAR(_PyObject_CAST(unicode), (index)) -#endif +#define PyUnicode_READ_CHAR(unicode, index) \ + PyUnicode_READ_CHAR(_PyObject_CAST(unicode), (index)) /* Return a maximum character value which is suitable for creating another string based on op. This is always an approximation but more efficient than iterating over the string. */ static inline Py_UCS4 PyUnicode_MAX_CHAR_VALUE(PyObject *op) { + int kind; + if (PyUnicode_IS_ASCII(op)) { return 0x7fU; } - int kind = PyUnicode_KIND(op); + kind = PyUnicode_KIND(op); if (kind == PyUnicode_1BYTE_KIND) { return 0xffU; } @@ -383,10 +367,8 @@ static inline Py_UCS4 PyUnicode_MAX_CHAR_VALUE(PyObject *op) assert(kind == PyUnicode_4BYTE_KIND); return 0x10ffffU; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyUnicode_MAX_CHAR_VALUE(op) \ - PyUnicode_MAX_CHAR_VALUE(_PyObject_CAST(op)) -#endif +#define PyUnicode_MAX_CHAR_VALUE(op) \ + PyUnicode_MAX_CHAR_VALUE(_PyObject_CAST(op)) /* === Public API ========================================================= */ @@ -401,13 +383,11 @@ PyAPI_FUNC(PyObject*) PyUnicode_New( ); /* For backward compatibility */ -static inline int PyUnicode_READY(PyObject *op) +static inline int PyUnicode_READY(PyObject* Py_UNUSED(op)) { return 0; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyUnicode_READY(op) PyUnicode_READY(_PyObject_CAST(op)) -#endif +#define PyUnicode_READY(op) PyUnicode_READY(_PyObject_CAST(op)) /* Get a copy of a Unicode string. */ PyAPI_FUNC(PyObject*) _PyUnicode_Copy( diff --git a/Include/cpython/warnings.h b/Include/cpython/warnings.h index 2ef8e3ce9435f4..4e3eb88e8ff447 100644 --- a/Include/cpython/warnings.h +++ b/Include/cpython/warnings.h @@ -17,4 +17,4 @@ PyAPI_FUNC(int) PyErr_WarnExplicitFormat( const char *format, ...); // DEPRECATED: Use PyErr_WarnEx() instead. 
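The comments above spell out the intended division of labour: PyUnicode_READ_CHAR() for one-off reads, and PyUnicode_KIND()/PyUnicode_DATA() cached once with PyUnicode_READ() inside loops. A usage sketch of that pattern follows; it is a CPython C API fragment that assumes #include <Python.h>, a valid PyObject *str, and elided error handling.

int kind = PyUnicode_KIND(str);
const void *data = PyUnicode_DATA(str);
Py_ssize_t len = PyUnicode_GET_LENGTH(str);
Py_UCS4 max_seen = 0;
for (Py_ssize_t i = 0; i < len; i++) {
    Py_UCS4 ch = PyUnicode_READ(kind, data, i);   /* kind and data hoisted out of the loop */
    if (ch > max_seen) {
        max_seen = ch;
    }
}
/* For a single character, the heavier helper is fine: */
Py_UCS4 first = (len > 0) ? PyUnicode_READ_CHAR(str, 0) : 0;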
-#define PyErr_Warn(category, msg) PyErr_WarnEx(category, msg, 1) +#define PyErr_Warn(category, msg) PyErr_WarnEx((category), (msg), 1) diff --git a/Include/cpython/weakrefobject.h b/Include/cpython/weakrefobject.h index 2dbef2cea37658..fd79fdc2dcc468 100644 --- a/Include/cpython/weakrefobject.h +++ b/Include/cpython/weakrefobject.h @@ -37,9 +37,11 @@ PyAPI_FUNC(Py_ssize_t) _PyWeakref_GetWeakrefCount(PyWeakReference *head); PyAPI_FUNC(void) _PyWeakref_ClearRef(PyWeakReference *self); static inline PyObject* PyWeakref_GET_OBJECT(PyObject *ref_obj) { + PyWeakReference *ref; + PyObject *obj; assert(PyWeakref_Check(ref_obj)); - PyWeakReference *ref = _Py_CAST(PyWeakReference*, ref_obj); - PyObject *obj = ref->wr_object; + ref = _Py_CAST(PyWeakReference*, ref_obj); + obj = ref->wr_object; // Explanation for the Py_REFCNT() check: when a weakref's target is part // of a long chain of deallocations which triggers the trashcan mechanism, // clearing the weakrefs can be delayed long after the target's refcount @@ -51,6 +53,4 @@ static inline PyObject* PyWeakref_GET_OBJECT(PyObject *ref_obj) { } return Py_None; } -#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyWeakref_GET_OBJECT(ref) PyWeakref_GET_OBJECT(_PyObject_CAST(ref)) -#endif +#define PyWeakref_GET_OBJECT(ref) PyWeakref_GET_OBJECT(_PyObject_CAST(ref)) diff --git a/Include/datetime.h b/Include/datetime.h index bb565201a164d7..b78cc0e8e2e5ac 100644 --- a/Include/datetime.h +++ b/Include/datetime.h @@ -119,39 +119,39 @@ typedef struct // o is a pointer to a time or a datetime object. #define _PyDateTime_HAS_TZINFO(o) (((_PyDateTime_BaseTZInfo *)(o))->hastzinfo) -#define PyDateTime_GET_YEAR(o) ((((PyDateTime_Date*)o)->data[0] << 8) | \ - ((PyDateTime_Date*)o)->data[1]) -#define PyDateTime_GET_MONTH(o) (((PyDateTime_Date*)o)->data[2]) -#define PyDateTime_GET_DAY(o) (((PyDateTime_Date*)o)->data[3]) - -#define PyDateTime_DATE_GET_HOUR(o) (((PyDateTime_DateTime*)o)->data[4]) -#define PyDateTime_DATE_GET_MINUTE(o) (((PyDateTime_DateTime*)o)->data[5]) -#define PyDateTime_DATE_GET_SECOND(o) (((PyDateTime_DateTime*)o)->data[6]) +#define PyDateTime_GET_YEAR(o) ((((PyDateTime_Date*)(o))->data[0] << 8) | \ + ((PyDateTime_Date*)(o))->data[1]) +#define PyDateTime_GET_MONTH(o) (((PyDateTime_Date*)(o))->data[2]) +#define PyDateTime_GET_DAY(o) (((PyDateTime_Date*)(o))->data[3]) + +#define PyDateTime_DATE_GET_HOUR(o) (((PyDateTime_DateTime*)(o))->data[4]) +#define PyDateTime_DATE_GET_MINUTE(o) (((PyDateTime_DateTime*)(o))->data[5]) +#define PyDateTime_DATE_GET_SECOND(o) (((PyDateTime_DateTime*)(o))->data[6]) #define PyDateTime_DATE_GET_MICROSECOND(o) \ - ((((PyDateTime_DateTime*)o)->data[7] << 16) | \ - (((PyDateTime_DateTime*)o)->data[8] << 8) | \ - ((PyDateTime_DateTime*)o)->data[9]) -#define PyDateTime_DATE_GET_FOLD(o) (((PyDateTime_DateTime*)o)->fold) -#define PyDateTime_DATE_GET_TZINFO(o) (_PyDateTime_HAS_TZINFO(o) ? \ + ((((PyDateTime_DateTime*)(o))->data[7] << 16) | \ + (((PyDateTime_DateTime*)(o))->data[8] << 8) | \ + ((PyDateTime_DateTime*)(o))->data[9]) +#define PyDateTime_DATE_GET_FOLD(o) (((PyDateTime_DateTime*)(o))->fold) +#define PyDateTime_DATE_GET_TZINFO(o) (_PyDateTime_HAS_TZINFO((o)) ? \ ((PyDateTime_DateTime *)(o))->tzinfo : Py_None) /* Apply for time instances. 
*/ -#define PyDateTime_TIME_GET_HOUR(o) (((PyDateTime_Time*)o)->data[0]) -#define PyDateTime_TIME_GET_MINUTE(o) (((PyDateTime_Time*)o)->data[1]) -#define PyDateTime_TIME_GET_SECOND(o) (((PyDateTime_Time*)o)->data[2]) +#define PyDateTime_TIME_GET_HOUR(o) (((PyDateTime_Time*)(o))->data[0]) +#define PyDateTime_TIME_GET_MINUTE(o) (((PyDateTime_Time*)(o))->data[1]) +#define PyDateTime_TIME_GET_SECOND(o) (((PyDateTime_Time*)(o))->data[2]) #define PyDateTime_TIME_GET_MICROSECOND(o) \ - ((((PyDateTime_Time*)o)->data[3] << 16) | \ - (((PyDateTime_Time*)o)->data[4] << 8) | \ - ((PyDateTime_Time*)o)->data[5]) -#define PyDateTime_TIME_GET_FOLD(o) (((PyDateTime_Time*)o)->fold) + ((((PyDateTime_Time*)(o))->data[3] << 16) | \ + (((PyDateTime_Time*)(o))->data[4] << 8) | \ + ((PyDateTime_Time*)(o))->data[5]) +#define PyDateTime_TIME_GET_FOLD(o) (((PyDateTime_Time*)(o))->fold) #define PyDateTime_TIME_GET_TZINFO(o) (_PyDateTime_HAS_TZINFO(o) ? \ ((PyDateTime_Time *)(o))->tzinfo : Py_None) /* Apply for time delta instances */ -#define PyDateTime_DELTA_GET_DAYS(o) (((PyDateTime_Delta*)o)->days) -#define PyDateTime_DELTA_GET_SECONDS(o) (((PyDateTime_Delta*)o)->seconds) +#define PyDateTime_DELTA_GET_DAYS(o) (((PyDateTime_Delta*)(o))->days) +#define PyDateTime_DELTA_GET_SECONDS(o) (((PyDateTime_Delta*)(o))->seconds) #define PyDateTime_DELTA_GET_MICROSECONDS(o) \ - (((PyDateTime_Delta*)o)->microseconds) + (((PyDateTime_Delta*)(o))->microseconds) /* Define structure for C API. */ @@ -203,60 +203,60 @@ static PyDateTime_CAPI *PyDateTimeAPI = NULL; #define PyDateTime_TimeZone_UTC PyDateTimeAPI->TimeZone_UTC /* Macros for type checking when not building the Python core. */ -#define PyDate_Check(op) PyObject_TypeCheck(op, PyDateTimeAPI->DateType) -#define PyDate_CheckExact(op) Py_IS_TYPE(op, PyDateTimeAPI->DateType) +#define PyDate_Check(op) PyObject_TypeCheck((op), PyDateTimeAPI->DateType) +#define PyDate_CheckExact(op) Py_IS_TYPE((op), PyDateTimeAPI->DateType) -#define PyDateTime_Check(op) PyObject_TypeCheck(op, PyDateTimeAPI->DateTimeType) -#define PyDateTime_CheckExact(op) Py_IS_TYPE(op, PyDateTimeAPI->DateTimeType) +#define PyDateTime_Check(op) PyObject_TypeCheck((op), PyDateTimeAPI->DateTimeType) +#define PyDateTime_CheckExact(op) Py_IS_TYPE((op), PyDateTimeAPI->DateTimeType) -#define PyTime_Check(op) PyObject_TypeCheck(op, PyDateTimeAPI->TimeType) -#define PyTime_CheckExact(op) Py_IS_TYPE(op, PyDateTimeAPI->TimeType) +#define PyTime_Check(op) PyObject_TypeCheck((op), PyDateTimeAPI->TimeType) +#define PyTime_CheckExact(op) Py_IS_TYPE((op), PyDateTimeAPI->TimeType) -#define PyDelta_Check(op) PyObject_TypeCheck(op, PyDateTimeAPI->DeltaType) -#define PyDelta_CheckExact(op) Py_IS_TYPE(op, PyDateTimeAPI->DeltaType) +#define PyDelta_Check(op) PyObject_TypeCheck((op), PyDateTimeAPI->DeltaType) +#define PyDelta_CheckExact(op) Py_IS_TYPE((op), PyDateTimeAPI->DeltaType) -#define PyTZInfo_Check(op) PyObject_TypeCheck(op, PyDateTimeAPI->TZInfoType) -#define PyTZInfo_CheckExact(op) Py_IS_TYPE(op, PyDateTimeAPI->TZInfoType) +#define PyTZInfo_Check(op) PyObject_TypeCheck((op), PyDateTimeAPI->TZInfoType) +#define PyTZInfo_CheckExact(op) Py_IS_TYPE((op), PyDateTimeAPI->TZInfoType) /* Macros for accessing constructors in a simplified fashion. 
*/ #define PyDate_FromDate(year, month, day) \ - PyDateTimeAPI->Date_FromDate(year, month, day, PyDateTimeAPI->DateType) + PyDateTimeAPI->Date_FromDate((year), (month), (day), PyDateTimeAPI->DateType) #define PyDateTime_FromDateAndTime(year, month, day, hour, min, sec, usec) \ - PyDateTimeAPI->DateTime_FromDateAndTime(year, month, day, hour, \ - min, sec, usec, Py_None, PyDateTimeAPI->DateTimeType) + PyDateTimeAPI->DateTime_FromDateAndTime((year), (month), (day), (hour), \ + (min), (sec), (usec), Py_None, PyDateTimeAPI->DateTimeType) #define PyDateTime_FromDateAndTimeAndFold(year, month, day, hour, min, sec, usec, fold) \ - PyDateTimeAPI->DateTime_FromDateAndTimeAndFold(year, month, day, hour, \ - min, sec, usec, Py_None, fold, PyDateTimeAPI->DateTimeType) + PyDateTimeAPI->DateTime_FromDateAndTimeAndFold((year), (month), (day), (hour), \ + (min), (sec), (usec), Py_None, (fold), PyDateTimeAPI->DateTimeType) #define PyTime_FromTime(hour, minute, second, usecond) \ - PyDateTimeAPI->Time_FromTime(hour, minute, second, usecond, \ + PyDateTimeAPI->Time_FromTime((hour), (minute), (second), (usecond), \ Py_None, PyDateTimeAPI->TimeType) #define PyTime_FromTimeAndFold(hour, minute, second, usecond, fold) \ - PyDateTimeAPI->Time_FromTimeAndFold(hour, minute, second, usecond, \ - Py_None, fold, PyDateTimeAPI->TimeType) + PyDateTimeAPI->Time_FromTimeAndFold((hour), (minute), (second), (usecond), \ + Py_None, (fold), PyDateTimeAPI->TimeType) #define PyDelta_FromDSU(days, seconds, useconds) \ - PyDateTimeAPI->Delta_FromDelta(days, seconds, useconds, 1, \ + PyDateTimeAPI->Delta_FromDelta((days), (seconds), (useconds), 1, \ PyDateTimeAPI->DeltaType) #define PyTimeZone_FromOffset(offset) \ - PyDateTimeAPI->TimeZone_FromTimeZone(offset, NULL) + PyDateTimeAPI->TimeZone_FromTimeZone((offset), NULL) #define PyTimeZone_FromOffsetAndName(offset, name) \ - PyDateTimeAPI->TimeZone_FromTimeZone(offset, name) + PyDateTimeAPI->TimeZone_FromTimeZone((offset), (name)) /* Macros supporting the DB API. 
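The accessor macros above decode fields that the datetime objects store as packed bytes: the year occupies data[0..1] big-endian and the microseconds occupy three bytes, so year 2022 is stored as 0x07, 0xE6 and 123456 microseconds as 0x01, 0xE2, 0x40. A minimal usage sketch of the constructor and accessor macros follows; it assumes an extension module that includes "datetime.h" and has already run PyDateTime_IMPORT (for example in its module init), with error handling reduced to the NULL check.

#include <Python.h>
#include "datetime.h"

/* METH_NOARGS-style helper, shown only as an illustration */
static PyObject *
demo(PyObject *self, PyObject *unused)
{
    PyObject *d = PyDate_FromDate(2022, 6, 26);   /* constructor macro from the hunk above */
    if (d == NULL) {
        return NULL;
    }
    int year  = PyDateTime_GET_YEAR(d);    /* (data[0] << 8) | data[1] == 2022 */
    int month = PyDateTime_GET_MONTH(d);   /* data[2] == 6 */
    Py_DECREF(d);
    return Py_BuildValue("ii", year, month);
}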
*/ #define PyDateTime_FromTimestamp(args) \ PyDateTimeAPI->DateTime_FromTimestamp( \ - (PyObject*) (PyDateTimeAPI->DateTimeType), args, NULL) + (PyObject*) (PyDateTimeAPI->DateTimeType), (args), NULL) #define PyDate_FromTimestamp(args) \ PyDateTimeAPI->Date_FromTimestamp( \ - (PyObject*) (PyDateTimeAPI->DateType), args) + (PyObject*) (PyDateTimeAPI->DateType), (args)) #endif /* !defined(_PY_DATETIME_IMPL) */ diff --git a/Include/dictobject.h b/Include/dictobject.h index a6233d8ae2512a..e7fcb44d0cf9a9 100644 --- a/Include/dictobject.h +++ b/Include/dictobject.h @@ -16,7 +16,7 @@ PyAPI_DATA(PyTypeObject) PyDict_Type; #define PyDict_Check(op) \ PyType_FastSubclass(Py_TYPE(op), Py_TPFLAGS_DICT_SUBCLASS) -#define PyDict_CheckExact(op) Py_IS_TYPE(op, &PyDict_Type) +#define PyDict_CheckExact(op) Py_IS_TYPE((op), &PyDict_Type) PyAPI_FUNC(PyObject *) PyDict_New(void); PyAPI_FUNC(PyObject *) PyDict_GetItem(PyObject *mp, PyObject *key); @@ -67,9 +67,9 @@ PyAPI_DATA(PyTypeObject) PyDictKeys_Type; PyAPI_DATA(PyTypeObject) PyDictValues_Type; PyAPI_DATA(PyTypeObject) PyDictItems_Type; -#define PyDictKeys_Check(op) PyObject_TypeCheck(op, &PyDictKeys_Type) -#define PyDictValues_Check(op) PyObject_TypeCheck(op, &PyDictValues_Type) -#define PyDictItems_Check(op) PyObject_TypeCheck(op, &PyDictItems_Type) +#define PyDictKeys_Check(op) PyObject_TypeCheck((op), &PyDictKeys_Type) +#define PyDictValues_Check(op) PyObject_TypeCheck((op), &PyDictValues_Type) +#define PyDictItems_Check(op) PyObject_TypeCheck((op), &PyDictItems_Type) /* This excludes Values, since they are not sets. */ # define PyDictViewSet_Check(op) \ (PyDictKeys_Check(op) || PyDictItems_Check(op)) diff --git a/Include/fileobject.h b/Include/fileobject.h index 4c983e7b5daa8a..02bd7c915a23f7 100644 --- a/Include/fileobject.h +++ b/Include/fileobject.h @@ -19,14 +19,14 @@ PyAPI_FUNC(int) PyObject_AsFileDescriptor(PyObject *); /* The default encoding used by the platform file system APIs If non-NULL, this is different than the default encoding for strings */ -PyAPI_DATA(const char *) Py_FileSystemDefaultEncoding; +Py_DEPRECATED(3.12) PyAPI_DATA(const char *) Py_FileSystemDefaultEncoding; #if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 >= 0x03060000 -PyAPI_DATA(const char *) Py_FileSystemDefaultEncodeErrors; +Py_DEPRECATED(3.12) PyAPI_DATA(const char *) Py_FileSystemDefaultEncodeErrors; #endif PyAPI_DATA(int) Py_HasFileSystemDefaultEncoding; #if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 >= 0x03070000 -PyAPI_DATA(int) Py_UTF8Mode; +Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_UTF8Mode; #endif /* A routine to check if a file descriptor can be select()-ed. 
*/ diff --git a/Include/floatobject.h b/Include/floatobject.h index 9d2fff3097e8ec..999441ac536e1d 100644 --- a/Include/floatobject.h +++ b/Include/floatobject.h @@ -14,7 +14,7 @@ extern "C" { PyAPI_DATA(PyTypeObject) PyFloat_Type; #define PyFloat_Check(op) PyObject_TypeCheck(op, &PyFloat_Type) -#define PyFloat_CheckExact(op) Py_IS_TYPE(op, &PyFloat_Type) +#define PyFloat_CheckExact(op) Py_IS_TYPE((op), &PyFloat_Type) #define Py_RETURN_NAN return PyFloat_FromDouble(Py_NAN) diff --git a/Include/import.h b/Include/import.h index a87677bb10c7f4..5d5f3425b8e715 100644 --- a/Include/import.h +++ b/Include/import.h @@ -67,7 +67,7 @@ PyAPI_FUNC(PyObject *) PyImport_ImportModuleLevelObject( #endif #define PyImport_ImportModuleEx(n, g, l, f) \ - PyImport_ImportModuleLevel(n, g, l, f, 0) + PyImport_ImportModuleLevel((n), (g), (l), (f), 0) PyAPI_FUNC(PyObject *) PyImport_GetImporter(PyObject *path); PyAPI_FUNC(PyObject *) PyImport_Import(PyObject *name); diff --git a/Include/internal/pycore_asdl.h b/Include/internal/pycore_asdl.h index 5b01c7a66599e9..afeada88d13e24 100644 --- a/Include/internal/pycore_asdl.h +++ b/Include/internal/pycore_asdl.h @@ -91,7 +91,7 @@ asdl_ ## NAME ## _seq *_Py_asdl_ ## NAME ## _seq_new(Py_ssize_t size, PyArena *a (S)->typed_elements[_asdl_i] = (V); \ } while (0) #else -# define asdl_seq_SET(S, I, V) _Py_RVALUE((S)->typed_elements[I] = (V)) +# define asdl_seq_SET(S, I, V) _Py_RVALUE((S)->typed_elements[(I)] = (V)) #endif #ifdef Py_DEBUG @@ -103,7 +103,7 @@ asdl_ ## NAME ## _seq *_Py_asdl_ ## NAME ## _seq_new(Py_ssize_t size, PyArena *a (S)->elements[_asdl_i] = (V); \ } while (0) #else -# define asdl_seq_SET_UNTYPED(S, I, V) _Py_RVALUE((S)->elements[I] = (V)) +# define asdl_seq_SET_UNTYPED(S, I, V) _Py_RVALUE((S)->elements[(I)] = (V)) #endif #ifdef __cplusplus diff --git a/Include/internal/pycore_call.h b/Include/internal/pycore_call.h index 3ccacfa0b8b038..55378e3dfebf24 100644 --- a/Include/internal/pycore_call.h +++ b/Include/internal/pycore_call.h @@ -103,6 +103,7 @@ _PyObject_CallNoArgsTstate(PyThreadState *tstate, PyObject *func) { // Private static inline function variant of public PyObject_CallNoArgs() static inline PyObject * _PyObject_CallNoArgs(PyObject *func) { + EVAL_CALL_STAT_INC_IF_FUNCTION(EVAL_CALL_API, func); PyThreadState *tstate = _PyThreadState_GET(); return _PyObject_VectorcallTstate(tstate, func, NULL, 0, NULL); } @@ -111,6 +112,7 @@ _PyObject_CallNoArgs(PyObject *func) { static inline PyObject * _PyObject_FastCallTstate(PyThreadState *tstate, PyObject *func, PyObject *const *args, Py_ssize_t nargs) { + EVAL_CALL_STAT_INC_IF_FUNCTION(EVAL_CALL_API, func); return _PyObject_VectorcallTstate(tstate, func, args, (size_t)nargs, NULL); } diff --git a/Include/internal/pycore_ceval.h b/Include/internal/pycore_ceval.h index 8dd89c6850794a..dafe31b216bf06 100644 --- a/Include/internal/pycore_ceval.h +++ b/Include/internal/pycore_ceval.h @@ -12,8 +12,14 @@ extern "C" { struct pyruntimestate; struct _ceval_runtime_state; +/* WASI has limited call stack. wasmtime 0.36 can handle sufficient amount of + C stack frames for little more than 750 recursions. 
*/ #ifndef Py_DEFAULT_RECURSION_LIMIT -# define Py_DEFAULT_RECURSION_LIMIT 1000 +# ifdef __wasi__ +# define Py_DEFAULT_RECURSION_LIMIT 750 +# else +# define Py_DEFAULT_RECURSION_LIMIT 1000 +# endif #endif #include "pycore_interp.h" // PyInterpreterState.eval_frame @@ -62,6 +68,7 @@ extern PyObject* _PyEval_BuiltinsFromGlobals( static inline PyObject* _PyEval_EvalFrame(PyThreadState *tstate, struct _PyInterpreterFrame *frame, int throwflag) { + EVAL_CALL_STAT_INC(EVAL_CALL_TOTAL); if (tstate->interp->eval_frame == NULL) { return _PyEval_EvalFrameDefault(tstate, frame, throwflag); } @@ -74,11 +81,7 @@ _PyEval_Vector(PyThreadState *tstate, PyObject* const* args, size_t argcount, PyObject *kwnames); -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS -extern int _PyEval_ThreadsInitialized(PyInterpreterState *interp); -#else extern int _PyEval_ThreadsInitialized(struct pyruntimestate *runtime); -#endif extern PyStatus _PyEval_InitGIL(PyThreadState *tstate); extern void _PyEval_FiniGIL(PyInterpreterState *interp); diff --git a/Include/internal/pycore_code.h b/Include/internal/pycore_code.h index e11d1f05129c67..c975f1cb74f3d7 100644 --- a/Include/internal/pycore_code.h +++ b/Include/internal/pycore_code.h @@ -58,19 +58,18 @@ typedef struct { _Py_CODEUNIT index; } _PyAttrCache; -#define INLINE_CACHE_ENTRIES_LOAD_ATTR CACHE_ENTRIES(_PyAttrCache) - -#define INLINE_CACHE_ENTRIES_STORE_ATTR CACHE_ENTRIES(_PyAttrCache) - typedef struct { _Py_CODEUNIT counter; _Py_CODEUNIT type_version[2]; - _Py_CODEUNIT dict_offset; _Py_CODEUNIT keys_version[2]; _Py_CODEUNIT descr[4]; } _PyLoadMethodCache; -#define INLINE_CACHE_ENTRIES_LOAD_METHOD CACHE_ENTRIES(_PyLoadMethodCache) + +// MUST be the max(_PyAttrCache, _PyLoadMethodCache) +#define INLINE_CACHE_ENTRIES_LOAD_ATTR CACHE_ENTRIES(_PyLoadMethodCache) + +#define INLINE_CACHE_ENTRIES_STORE_ATTR CACHE_ENTRIES(_PyAttrCache) typedef struct { _Py_CODEUNIT counter; @@ -82,15 +81,15 @@ typedef struct { typedef struct { _Py_CODEUNIT counter; -} _PyPrecallCache; +} _PyStoreSubscrCache; -#define INLINE_CACHE_ENTRIES_PRECALL CACHE_ENTRIES(_PyPrecallCache) +#define INLINE_CACHE_ENTRIES_STORE_SUBSCR CACHE_ENTRIES(_PyStoreSubscrCache) typedef struct { _Py_CODEUNIT counter; -} _PyStoreSubscrCache; +} _PyForIterCache; -#define INLINE_CACHE_ENTRIES_STORE_SUBSCR CACHE_ENTRIES(_PyStoreSubscrCache) +#define INLINE_CACHE_ENTRIES_FOR_ITER CACHE_ENTRIES(_PyForIterCache) #define QUICKENING_WARMUP_DELAY 8 @@ -233,9 +232,6 @@ extern void _PyLineTable_InitAddressRange( extern int _PyLineTable_NextAddressRange(PyCodeAddressRange *range); extern int _PyLineTable_PreviousAddressRange(PyCodeAddressRange *range); - -#define ADAPTIVE_CACHE_BACKOFF 64 - /* Specialization functions */ extern int _Py_Specialize_LoadAttr(PyObject *owner, _Py_CODEUNIT *instr, @@ -243,20 +239,17 @@ extern int _Py_Specialize_LoadAttr(PyObject *owner, _Py_CODEUNIT *instr, extern int _Py_Specialize_StoreAttr(PyObject *owner, _Py_CODEUNIT *instr, PyObject *name); extern int _Py_Specialize_LoadGlobal(PyObject *globals, PyObject *builtins, _Py_CODEUNIT *instr, PyObject *name); -extern int _Py_Specialize_LoadMethod(PyObject *owner, _Py_CODEUNIT *instr, - PyObject *name); extern int _Py_Specialize_BinarySubscr(PyObject *sub, PyObject *container, _Py_CODEUNIT *instr); extern int _Py_Specialize_StoreSubscr(PyObject *container, PyObject *sub, _Py_CODEUNIT *instr); extern int _Py_Specialize_Call(PyObject *callable, _Py_CODEUNIT *instr, int nargs, PyObject *kwnames); -extern int _Py_Specialize_Precall(PyObject *callable, _Py_CODEUNIT 
*instr, - int nargs, PyObject *kwnames, int oparg); extern void _Py_Specialize_BinaryOp(PyObject *lhs, PyObject *rhs, _Py_CODEUNIT *instr, int oparg, PyObject **locals); extern void _Py_Specialize_CompareOp(PyObject *lhs, PyObject *rhs, _Py_CODEUNIT *instr, int oparg); extern void _Py_Specialize_UnpackSequence(PyObject *seq, _Py_CODEUNIT *instr, int oparg); +extern void _Py_Specialize_ForIter(PyObject *iter, _Py_CODEUNIT *instr); /* Deallocator function for static codeobjects used in deepfreeze.py */ extern void _PyStaticCode_Dealloc(PyCodeObject *co); @@ -265,63 +258,17 @@ extern int _PyStaticCode_InternStrings(PyCodeObject *co); #ifdef Py_STATS -#define SPECIALIZATION_FAILURE_KINDS 30 - -typedef struct _specialization_stats { - uint64_t success; - uint64_t failure; - uint64_t hit; - uint64_t deferred; - uint64_t miss; - uint64_t deopt; - uint64_t failure_kinds[SPECIALIZATION_FAILURE_KINDS]; -} SpecializationStats; - -typedef struct _opcode_stats { - SpecializationStats specialization; - uint64_t execution_count; - uint64_t pair_count[256]; -} OpcodeStats; - -typedef struct _call_stats { - uint64_t inlined_py_calls; - uint64_t pyeval_calls; - uint64_t frames_pushed; - uint64_t frame_objects_created; -} CallStats; - -typedef struct _object_stats { - uint64_t allocations; - uint64_t allocations512; - uint64_t allocations4k; - uint64_t allocations_big; - uint64_t frees; - uint64_t to_freelist; - uint64_t from_freelist; - uint64_t new_values; - uint64_t dict_materialized_on_request; - uint64_t dict_materialized_new_key; - uint64_t dict_materialized_too_big; - uint64_t dict_materialized_str_subclass; -} ObjectStats; - -typedef struct _stats { - OpcodeStats opcode_stats[256]; - CallStats call_stats; - ObjectStats object_stats; -} PyStats; - -extern PyStats _py_stats; - -#define STAT_INC(opname, name) _py_stats.opcode_stats[opname].specialization.name++ -#define STAT_DEC(opname, name) _py_stats.opcode_stats[opname].specialization.name-- -#define OPCODE_EXE_INC(opname) _py_stats.opcode_stats[opname].execution_count++ -#define CALL_STAT_INC(name) _py_stats.call_stats.name++ -#define OBJECT_STAT_INC(name) _py_stats.object_stats.name++ -#define OBJECT_STAT_INC_COND(name, cond) \ - do { if (cond) _py_stats.object_stats.name++; } while (0) -extern void _Py_PrintSpecializationStats(int to_file); +#define STAT_INC(opname, name) do { if (_py_stats) _py_stats->opcode_stats[opname].specialization.name++; } while (0) +#define STAT_DEC(opname, name) do { if (_py_stats) _py_stats->opcode_stats[opname].specialization.name--; } while (0) +#define OPCODE_EXE_INC(opname) do { if (_py_stats) _py_stats->opcode_stats[opname].execution_count++; } while (0) +#define CALL_STAT_INC(name) do { if (_py_stats) _py_stats->call_stats.name++; } while (0) +#define OBJECT_STAT_INC(name) do { if (_py_stats) _py_stats->object_stats.name++; } while (0) +#define OBJECT_STAT_INC_COND(name, cond) \ + do { if (_py_stats && cond) _py_stats->object_stats.name++; } while (0) +#define EVAL_CALL_STAT_INC(name) do { if (_py_stats) _py_stats->call_stats.eval_calls[name]++; } while (0) +#define EVAL_CALL_STAT_INC_IF_FUNCTION(name, callable) \ + do { if (_py_stats && PyFunction_Check(callable)) _py_stats->call_stats.eval_calls[name]++; } while (0) // Used by the _opcode extension which is built as a shared library PyAPI_FUNC(PyObject*) _Py_GetSpecializationStats(void); @@ -333,6 +280,8 @@ PyAPI_FUNC(PyObject*) _Py_GetSpecializationStats(void); #define CALL_STAT_INC(name) ((void)0) #define OBJECT_STAT_INC(name) ((void)0) #define 
OBJECT_STAT_INC_COND(name, cond) ((void)0) +#define EVAL_CALL_STAT_INC(name) ((void)0) +#define EVAL_CALL_STAT_INC_IF_FUNCTION(name, callable) ((void)0) #endif // !Py_STATS // Cache values are only valid in memory, so use native endianness. @@ -475,6 +424,79 @@ write_location_entry_start(uint8_t *ptr, int code, int length) } +/** Counters + * The first 16-bit value in each inline cache is a counter. + * When counting misses, the counter is treated as a simple unsigned value. + * + * When counting executions until the next specialization attempt, + * exponential backoff is used to reduce the number of specialization failures. + * The high 12 bits store the counter, the low 4 bits store the backoff exponent. + * On a specialization failure, the backoff exponent is incremented and the + * counter set to (2**backoff - 1). + * Backoff == 6 -> starting counter == 63, backoff == 10 -> starting counter == 1023. + */ + +/* With a 16-bit counter, we have 12 bits for the counter value, and 4 bits for the backoff */ +#define ADAPTIVE_BACKOFF_BITS 4 +/* The initial counter value is 31 == 2**ADAPTIVE_BACKOFF_START - 1 */ +#define ADAPTIVE_BACKOFF_START 5 + +#define MAX_BACKOFF_VALUE (16 - ADAPTIVE_BACKOFF_BITS) + + +static inline uint16_t +adaptive_counter_bits(int value, int backoff) { + return (value << ADAPTIVE_BACKOFF_BITS) | + (backoff & ((1< MAX_BACKOFF_VALUE) { + backoff = MAX_BACKOFF_VALUE; + } + unsigned int value = (1 << backoff) - 1; + return adaptive_counter_bits(value, backoff); +} + + +/* Line array cache for tracing */ + +extern int _PyCode_CreateLineArray(PyCodeObject *co); + +static inline int +_PyCode_InitLineArray(PyCodeObject *co) +{ + if (co->_co_linearray) { + return 0; + } + return _PyCode_CreateLineArray(co); +} + +static inline int +_PyCode_LineNumberFromArray(PyCodeObject *co, int index) +{ + assert(co->_co_linearray != NULL); + assert(index >= 0); + assert(index < Py_SIZE(co)); + if (co->_co_linearray_entry_size == 2) { + return ((int16_t *)co->_co_linearray)[index]; + } + else { + assert(co->_co_linearray_entry_size == 4); + return ((int32_t *)co->_co_linearray)[index]; + } +} + + #ifdef __cplusplus } #endif diff --git a/Include/internal/pycore_descrobject.h b/Include/internal/pycore_descrobject.h new file mode 100644 index 00000000000000..76378569df90e3 --- /dev/null +++ b/Include/internal/pycore_descrobject.h @@ -0,0 +1,26 @@ +#ifndef Py_INTERNAL_DESCROBJECT_H +#define Py_INTERNAL_DESCROBJECT_H +#ifdef __cplusplus +extern "C" { +#endif + +#ifndef Py_BUILD_CORE +# error "this header requires Py_BUILD_CORE define" +#endif + +typedef struct { + PyObject_HEAD + PyObject *prop_get; + PyObject *prop_set; + PyObject *prop_del; + PyObject *prop_doc; + PyObject *prop_name; + int getter_doc; +} propertyobject; + +typedef propertyobject _PyPropertyObject; + +#ifdef __cplusplus +} +#endif +#endif /* !Py_INTERNAL_DESCROBJECT_H */ diff --git a/Include/internal/pycore_dict.h b/Include/internal/pycore_dict.h index 24d2a711878ced..c831c4ccbd0cb6 100644 --- a/Include/internal/pycore_dict.h +++ b/Include/internal/pycore_dict.h @@ -154,9 +154,11 @@ struct _dictvalues { 2 : sizeof(int32_t)) #endif #define DK_ENTRIES(dk) \ - (assert(dk->dk_kind == DICT_KEYS_GENERAL), (PyDictKeyEntry*)(&((int8_t*)((dk)->dk_indices))[(size_t)1 << (dk)->dk_log2_index_bytes])) + (assert((dk)->dk_kind == DICT_KEYS_GENERAL), \ + (PyDictKeyEntry*)(&((int8_t*)((dk)->dk_indices))[(size_t)1 << (dk)->dk_log2_index_bytes])) #define DK_UNICODE_ENTRIES(dk) \ - (assert(dk->dk_kind != DICT_KEYS_GENERAL), 
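The counter layout documented above, value in the high 12 bits, backoff exponent in the low 4 bits, counter restarted at 2**backoff - 1 after a specialization failure, can be written out as small helpers. The following is a hedged sketch of that scheme using the constants defined in the hunk; the helper names are illustrative and this is not a verbatim copy of the patch.

#include <stdint.h>

#define ADAPTIVE_BACKOFF_BITS 4
#define ADAPTIVE_BACKOFF_START 5
#define MAX_BACKOFF_VALUE (16 - ADAPTIVE_BACKOFF_BITS)

static inline uint16_t
counter_bits(unsigned int value, unsigned int backoff)
{
    /* value in the high 12 bits, backoff exponent in the low 4 bits */
    return (uint16_t)((value << ADAPTIVE_BACKOFF_BITS) |
                      (backoff & ((1 << ADAPTIVE_BACKOFF_BITS) - 1)));
}

static inline uint16_t
counter_start(void)
{
    /* initial counter: 2**ADAPTIVE_BACKOFF_START - 1 == 31 */
    return counter_bits((1 << ADAPTIVE_BACKOFF_START) - 1, ADAPTIVE_BACKOFF_START);
}

static inline uint16_t
counter_backoff(uint16_t counter)
{
    /* on a specialization failure: bump the exponent (capped) and restart the count,
       so backoff == 6 restarts at 63 and backoff == 10 restarts at 1023 */
    unsigned int backoff = counter & ((1 << ADAPTIVE_BACKOFF_BITS) - 1);
    if (++backoff > MAX_BACKOFF_VALUE) {
        backoff = MAX_BACKOFF_VALUE;
    }
    return counter_bits((1u << backoff) - 1, backoff);
}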
(PyDictUnicodeEntry*)(&((int8_t*)((dk)->dk_indices))[(size_t)1 << (dk)->dk_log2_index_bytes])) + (assert((dk)->dk_kind != DICT_KEYS_GENERAL), \ + (PyDictUnicodeEntry*)(&((int8_t*)((dk)->dk_indices))[(size_t)1 << (dk)->dk_log2_index_bytes])) #define DK_IS_UNICODE(dk) ((dk)->dk_kind != DICT_KEYS_GENERAL) extern uint64_t _pydict_global_version; diff --git a/Include/internal/pycore_frame.h b/Include/internal/pycore_frame.h index 405afd67d2274a..eed26fbb06218a 100644 --- a/Include/internal/pycore_frame.h +++ b/Include/internal/pycore_frame.h @@ -6,6 +6,7 @@ extern "C" { #include #include +#include "pycore_code.h" // STATS /* See Objects/frame_layout.md for an explanation of the frame stack * including explanation of the PyFrameObject and _PyInterpreterFrame @@ -94,20 +95,20 @@ static inline void _PyFrame_StackPush(_PyInterpreterFrame *f, PyObject *value) { void _PyFrame_Copy(_PyInterpreterFrame *src, _PyInterpreterFrame *dest); -/* Consumes reference to func */ +/* Consumes reference to func and locals */ static inline void _PyFrame_InitializeSpecials( _PyInterpreterFrame *frame, PyFunctionObject *func, - PyObject *locals, int nlocalsplus) + PyObject *locals, PyCodeObject *code) { frame->f_func = func; - frame->f_code = (PyCodeObject *)Py_NewRef(func->func_code); + frame->f_code = (PyCodeObject *)Py_NewRef(code); frame->f_builtins = func->func_builtins; frame->f_globals = func->func_globals; - frame->f_locals = Py_XNewRef(locals); - frame->stacktop = nlocalsplus; + frame->f_locals = locals; + frame->stacktop = code->co_nlocalsplus; frame->frame_obj = NULL; - frame->prev_instr = _PyCode_CODE(frame->f_code) - 1; + frame->prev_instr = _PyCode_CODE(code) - 1; frame->is_entry = false; frame->owner = FRAME_OWNED_BY_THREAD; } @@ -172,29 +173,32 @@ _PyFrame_FastToLocalsWithError(_PyInterpreterFrame *frame); void _PyFrame_LocalsToFast(_PyInterpreterFrame *frame, int clear); -extern _PyInterpreterFrame * -_PyThreadState_BumpFramePointerSlow(PyThreadState *tstate, size_t size); -static inline _PyInterpreterFrame * -_PyThreadState_BumpFramePointer(PyThreadState *tstate, size_t size) +static inline bool +_PyThreadState_HasStackSpace(PyThreadState *tstate, int size) { - PyObject **base = tstate->datastack_top; - if (base) { - PyObject **top = base + size; - assert(tstate->datastack_limit); - if (top < tstate->datastack_limit) { - tstate->datastack_top = top; - return (_PyInterpreterFrame *)base; - } - } - return _PyThreadState_BumpFramePointerSlow(tstate, size); + return tstate->datastack_top + size < tstate->datastack_limit; } +extern _PyInterpreterFrame * +_PyThreadState_PushFrame(PyThreadState *tstate, size_t size); + void _PyThreadState_PopFrame(PyThreadState *tstate, _PyInterpreterFrame *frame); -/* Consume reference to func */ -_PyInterpreterFrame * -_PyFrame_Push(PyThreadState *tstate, PyFunctionObject *func); +/* Pushes a frame without checking for space. + * Must be guarded by _PyThreadState_HasStackSpace() + * Consumes reference to func. 
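The comment above states the contract for the helper defined next: _PyFrame_PushUnchecked() skips the space check, so it may only run after _PyThreadState_HasStackSpace() has said yes, with _PyThreadState_PushFrame() as the slower fallback. The caller-side sketch below is a hedged illustration of that contract using the internal declarations from this hunk; it is not taken verbatim from the interpreter, and push_frame_for_call is an invented name.

static _PyInterpreterFrame *
push_frame_for_call(PyThreadState *tstate, PyFunctionObject *func)
{
    PyCodeObject *code = (PyCodeObject *)func->func_code;
    if (_PyThreadState_HasStackSpace(tstate, code->co_framesize)) {
        /* fast path: space already verified, so the unchecked push is safe */
        return _PyFrame_PushUnchecked(tstate, func);
    }
    /* slow path: let the runtime allocate or grow a data stack chunk first */
    _PyInterpreterFrame *frame = _PyThreadState_PushFrame(tstate, code->co_framesize);
    if (frame == NULL) {
        return NULL;
    }
    /* both paths take over the caller's reference to func, per the comments above */
    _PyFrame_InitializeSpecials(frame, func, NULL, code);
    return frame;
}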
*/ +static inline _PyInterpreterFrame * +_PyFrame_PushUnchecked(PyThreadState *tstate, PyFunctionObject *func) +{ + CALL_STAT_INC(frames_pushed); + PyCodeObject *code = (PyCodeObject *)func->func_code; + _PyInterpreterFrame *new_frame = (_PyInterpreterFrame *)tstate->datastack_top; + tstate->datastack_top += code->co_framesize; + assert(tstate->datastack_top < tstate->datastack_limit); + _PyFrame_InitializeSpecials(new_frame, func, NULL, code); + return new_frame; +} int _PyInterpreterFrame_GetLine(_PyInterpreterFrame *frame); diff --git a/Include/internal/pycore_global_strings.h b/Include/internal/pycore_global_strings.h index cfa8ae99d1b6d9..2bf16c30e1bce2 100644 --- a/Include/internal/pycore_global_strings.h +++ b/Include/internal/pycore_global_strings.h @@ -90,6 +90,7 @@ struct _Py_global_strings { STRUCT_FOR_ID(__delete__) STRUCT_FOR_ID(__delitem__) STRUCT_FOR_ID(__dict__) + STRUCT_FOR_ID(__dictoffset__) STRUCT_FOR_ID(__dir__) STRUCT_FOR_ID(__divmod__) STRUCT_FOR_ID(__doc__) @@ -202,9 +203,11 @@ struct _Py_global_strings { STRUCT_FOR_ID(__truediv__) STRUCT_FOR_ID(__trunc__) STRUCT_FOR_ID(__typing_is_unpacked_typevartuple__) + STRUCT_FOR_ID(__typing_prepare_subst__) STRUCT_FOR_ID(__typing_subst__) STRUCT_FOR_ID(__typing_unpacked_tuple_args__) STRUCT_FOR_ID(__warningregistry__) + STRUCT_FOR_ID(__weaklistoffset__) STRUCT_FOR_ID(__weakref__) STRUCT_FOR_ID(__xor__) STRUCT_FOR_ID(_abc_impl) @@ -223,7 +226,6 @@ struct _Py_global_strings { STRUCT_FOR_ID(_showwarnmsg) STRUCT_FOR_ID(_shutdown) STRUCT_FOR_ID(_slotnames) - STRUCT_FOR_ID(_strptime_time) STRUCT_FOR_ID(_uninitialized_submodules) STRUCT_FOR_ID(_warn_unawaited_coroutine) STRUCT_FOR_ID(_xoptions) @@ -249,7 +251,6 @@ struct _Py_global_strings { STRUCT_FOR_ID(difference_update) STRUCT_FOR_ID(dispatch_table) STRUCT_FOR_ID(displayhook) - STRUCT_FOR_ID(enable) STRUCT_FOR_ID(encode) STRUCT_FOR_ID(encoding) STRUCT_FOR_ID(end_lineno) @@ -308,7 +309,6 @@ struct _Py_global_strings { STRUCT_FOR_ID(opcode) STRUCT_FOR_ID(open) STRUCT_FOR_ID(parent) - STRUCT_FOR_ID(partial) STRUCT_FOR_ID(path) STRUCT_FOR_ID(peek) STRUCT_FOR_ID(persistent_id) @@ -354,7 +354,6 @@ struct _Py_global_strings { STRUCT_FOR_ID(warnoptions) STRUCT_FOR_ID(writable) STRUCT_FOR_ID(write) - STRUCT_FOR_ID(zipimporter) } identifiers; struct { PyASCIIObject _ascii; diff --git a/Include/internal/pycore_hamt.h b/Include/internal/pycore_hamt.h index 85e35c5afc90cf..7a674b4ee4ac0e 100644 --- a/Include/internal/pycore_hamt.h +++ b/Include/internal/pycore_hamt.h @@ -5,7 +5,19 @@ # error "this header requires Py_BUILD_CORE define" #endif -#define _Py_HAMT_MAX_TREE_DEPTH 7 + +/* +HAMT tree is shaped by hashes of keys. Every group of 5 bits of a hash denotes +the exact position of the key in one level of the tree. Since we're using +32 bit hashes, we can have at most 7 such levels. Although if there are +two distinct keys with equal hashes, they will have to occupy the same +cell in the 7th level of the tree -- so we'd put them in a "collision" node. +Which brings the total possible tree depth to 8. Read more about the actual +layout of the HAMT tree in `hamt.c`. + +This constant is used to define a datastucture for storing iteration state. +*/ +#define _Py_HAMT_MAX_TREE_DEPTH 8 extern PyTypeObject _PyHamt_Type; @@ -23,7 +35,7 @@ void _PyHamt_Fini(PyInterpreterState *); /* other API */ -#define PyHamt_Check(o) Py_IS_TYPE(o, &_PyHamt_Type) +#define PyHamt_Check(o) Py_IS_TYPE((o), &_PyHamt_Type) /* Abstract tree node. 
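The arithmetic behind the new _Py_HAMT_MAX_TREE_DEPTH value in the comment above is: a 32-bit hash consumed 5 bits per level gives ceil(32/5) = 7 levels (the last level only has 2 bits left), plus one extra level for a collision node, hence 8. A tiny standalone check of that derivation (sketch, not part of the patch):

#include <stdio.h>

int main(void)
{
    int hash_bits = 32, bits_per_level = 5;
    int hash_levels = (hash_bits + bits_per_level - 1) / bits_per_level;  /* ceil(32/5) == 7 */
    int max_depth = hash_levels + 1;   /* + 1 level for keys whose hashes fully collide */
    printf("_Py_HAMT_MAX_TREE_DEPTH should be %d\n", max_depth);   /* prints 8 */
    return 0;
}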
*/ diff --git a/Include/internal/pycore_hashtable.h b/Include/internal/pycore_hashtable.h index 18757abc28c195..2aa23a24c2830a 100644 --- a/Include/internal/pycore_hashtable.h +++ b/Include/internal/pycore_hashtable.h @@ -18,9 +18,9 @@ typedef struct { _Py_slist_item_t *head; } _Py_slist_t; -#define _Py_SLIST_ITEM_NEXT(ITEM) (((_Py_slist_item_t *)ITEM)->next) +#define _Py_SLIST_ITEM_NEXT(ITEM) (((_Py_slist_item_t *)(ITEM))->next) -#define _Py_SLIST_HEAD(SLIST) (((_Py_slist_t *)SLIST)->head) +#define _Py_SLIST_HEAD(SLIST) (((_Py_slist_t *)(SLIST))->head) /* _Py_hashtable: table entry */ diff --git a/Include/internal/pycore_initconfig.h b/Include/internal/pycore_initconfig.h index a2f4cd182e8e92..69f88d7d1d46b8 100644 --- a/Include/internal/pycore_initconfig.h +++ b/Include/internal/pycore_initconfig.h @@ -36,13 +36,13 @@ struct pyruntimestate; ._type = _PyStatus_TYPE_EXIT, \ .exitcode = (EXITCODE)} #define _PyStatus_IS_ERROR(err) \ - (err._type == _PyStatus_TYPE_ERROR) + ((err)._type == _PyStatus_TYPE_ERROR) #define _PyStatus_IS_EXIT(err) \ - (err._type == _PyStatus_TYPE_EXIT) + ((err)._type == _PyStatus_TYPE_EXIT) #define _PyStatus_EXCEPTION(err) \ - (err._type != _PyStatus_TYPE_OK) + ((err)._type != _PyStatus_TYPE_OK) #define _PyStatus_UPDATE_FUNC(err) \ - do { err.func = _PyStatus_GET_FUNC(); } while (0) + do { (err).func = _PyStatus_GET_FUNC(); } while (0) PyObject* _PyErr_SetFromPyStatus(PyStatus status); diff --git a/Include/internal/pycore_interp.h b/Include/internal/pycore_interp.h index d55627908a28f8..bcfcd88d84492d 100644 --- a/Include/internal/pycore_interp.h +++ b/Include/internal/pycore_interp.h @@ -51,9 +51,6 @@ struct _ceval_state { /* Request for dropping the GIL */ _Py_atomic_int gil_drop_request; struct _pending_calls pending; -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - struct _gil_runtime_state gil; -#endif }; diff --git a/Include/internal/pycore_list.h b/Include/internal/pycore_list.h index 860dce1fd5d393..7721f6c2e222a0 100644 --- a/Include/internal/pycore_list.h +++ b/Include/internal/pycore_list.h @@ -56,6 +56,12 @@ _PyList_AppendTakeRef(PyListObject *self, PyObject *newitem) return _PyList_AppendTakeRefListResize(self, newitem); } +typedef struct { + PyObject_HEAD + Py_ssize_t it_index; + PyListObject *it_seq; /* Set to NULL when iterator is exhausted */ +} _PyListIterObject; + #ifdef __cplusplus } #endif diff --git a/Include/internal/pycore_long.h b/Include/internal/pycore_long.h index a33762402491b7..67dd5c3b13ec59 100644 --- a/Include/internal/pycore_long.h +++ b/Include/internal/pycore_long.h @@ -47,6 +47,8 @@ PyObject *_PyLong_Add(PyLongObject *left, PyLongObject *right); PyObject *_PyLong_Multiply(PyLongObject *left, PyLongObject *right); PyObject *_PyLong_Subtract(PyLongObject *left, PyLongObject *right); +int _PyLong_AssignValue(PyObject **target, Py_ssize_t value); + /* Used by Python/mystrtoul.c, _PyBytes_FromHex(), _PyBytes_DecodeEscape(), etc. 
*/ PyAPI_DATA(unsigned char) _PyLong_DigitValue[256]; diff --git a/Include/internal/pycore_object.h b/Include/internal/pycore_object.h index f022f8246989d1..d4f66b65cc9ff2 100644 --- a/Include/internal/pycore_object.h +++ b/Include/internal/pycore_object.h @@ -17,7 +17,7 @@ extern "C" { #define _PyObject_IMMORTAL_INIT(type) \ { \ .ob_refcnt = 999999999, \ - .ob_type = type, \ + .ob_type = (type), \ } #define _PyVarObject_IMMORTAL_INIT(type, size) \ { \ @@ -29,11 +29,13 @@ PyAPI_FUNC(void) _Py_NO_RETURN _Py_FatalRefcountErrorFunc( const char *func, const char *message); -#define _Py_FatalRefcountError(message) _Py_FatalRefcountErrorFunc(__func__, message) +#define _Py_FatalRefcountError(message) \ + _Py_FatalRefcountErrorFunc(__func__, (message)) static inline void _Py_DECREF_SPECIALIZED(PyObject *op, const destructor destruct) { + _Py_DECREF_STAT_INC(); #ifdef Py_REF_DEBUG _Py_RefTotal--; #endif @@ -51,6 +53,7 @@ _Py_DECREF_SPECIALIZED(PyObject *op, const destructor destruct) static inline void _Py_DECREF_NO_DEALLOC(PyObject *op) { + _Py_DECREF_STAT_INC(); #ifdef Py_REF_DEBUG _Py_RefTotal--; #endif @@ -273,7 +276,7 @@ extern PyObject* _PyType_GetSubclasses(PyTypeObject *); // Access macro to the members which are floating "behind" the object #define _PyHeapType_GET_MEMBERS(etype) \ - ((PyMemberDef *)(((char *)etype) + Py_TYPE(etype)->tp_basicsize)) + ((PyMemberDef *)(((char *)(etype)) + Py_TYPE(etype)->tp_basicsize)) PyAPI_FUNC(PyObject *) _PyObject_LookupSpecial(PyObject *, PyObject *); @@ -294,7 +297,7 @@ PyAPI_FUNC(PyObject *) _PyObject_LookupSpecial(PyObject *, PyObject *); #if defined(__EMSCRIPTEN__) && defined(PY_CALL_TRAMPOLINE) #define _PyCFunction_TrampolineCall(meth, self, args) \ _PyCFunctionWithKeywords_TrampolineCall( \ - (*(PyCFunctionWithKeywords)(void(*)(void))meth), self, args, NULL) + (*(PyCFunctionWithKeywords)(void(*)(void))(meth)), (self), (args), NULL) extern PyObject* _PyCFunctionWithKeywords_TrampolineCall( PyCFunctionWithKeywords meth, PyObject *, PyObject *, PyObject *); #else diff --git a/Include/internal/pycore_opcode.h b/Include/internal/pycore_opcode.h index 09f65014671338..b9195f5c3299dc 100644 --- a/Include/internal/pycore_opcode.h +++ b/Include/internal/pycore_opcode.h @@ -44,13 +44,12 @@ const uint8_t _PyOpcode_Caches[256] = { [BINARY_SUBSCR] = 4, [STORE_SUBSCR] = 1, [UNPACK_SEQUENCE] = 1, + [FOR_ITER] = 1, [STORE_ATTR] = 4, - [LOAD_ATTR] = 4, + [LOAD_ATTR] = 9, [COMPARE_OP] = 2, [LOAD_GLOBAL] = 5, [BINARY_OP] = 1, - [LOAD_METHOD] = 10, - [PRECALL] = 1, [CALL] = 4, }; @@ -84,7 +83,22 @@ const uint8_t _PyOpcode_Deopt[256] = { [CACHE] = CACHE, [CALL] = CALL, [CALL_ADAPTIVE] = CALL, + [CALL_BOUND_METHOD_EXACT_ARGS] = CALL, + [CALL_BUILTIN_CLASS] = CALL, + [CALL_BUILTIN_FAST_WITH_KEYWORDS] = CALL, [CALL_FUNCTION_EX] = CALL_FUNCTION_EX, + [CALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS] = CALL, + [CALL_NO_KW_BUILTIN_FAST] = CALL, + [CALL_NO_KW_BUILTIN_O] = CALL, + [CALL_NO_KW_ISINSTANCE] = CALL, + [CALL_NO_KW_LEN] = CALL, + [CALL_NO_KW_LIST_APPEND] = CALL, + [CALL_NO_KW_METHOD_DESCRIPTOR_FAST] = CALL, + [CALL_NO_KW_METHOD_DESCRIPTOR_NOARGS] = CALL, + [CALL_NO_KW_METHOD_DESCRIPTOR_O] = CALL, + [CALL_NO_KW_STR_1] = CALL, + [CALL_NO_KW_TUPLE_1] = CALL, + [CALL_NO_KW_TYPE_1] = CALL, [CALL_PY_EXACT_ARGS] = CALL, [CALL_PY_WITH_DEFAULTS] = CALL, [CHECK_EG_MATCH] = CHECK_EG_MATCH, @@ -110,6 +124,9 @@ const uint8_t _PyOpcode_Deopt[256] = { [EXTENDED_ARG_QUICK] = EXTENDED_ARG, [FORMAT_VALUE] = FORMAT_VALUE, [FOR_ITER] = FOR_ITER, + [FOR_ITER_ADAPTIVE] = FOR_ITER, + 
[FOR_ITER_LIST] = FOR_ITER, + [FOR_ITER_RANGE] = FOR_ITER, [GET_AITER] = GET_AITER, [GET_ANEXT] = GET_ANEXT, [GET_AWAITABLE] = GET_AWAITABLE, @@ -133,8 +150,14 @@ const uint8_t _PyOpcode_Deopt[256] = { [LOAD_ASSERTION_ERROR] = LOAD_ASSERTION_ERROR, [LOAD_ATTR] = LOAD_ATTR, [LOAD_ATTR_ADAPTIVE] = LOAD_ATTR, + [LOAD_ATTR_CLASS] = LOAD_ATTR, [LOAD_ATTR_INSTANCE_VALUE] = LOAD_ATTR, + [LOAD_ATTR_METHOD_LAZY_DICT] = LOAD_ATTR, + [LOAD_ATTR_METHOD_NO_DICT] = LOAD_ATTR, + [LOAD_ATTR_METHOD_WITH_DICT] = LOAD_ATTR, + [LOAD_ATTR_METHOD_WITH_VALUES] = LOAD_ATTR, [LOAD_ATTR_MODULE] = LOAD_ATTR, + [LOAD_ATTR_PROPERTY] = LOAD_ATTR, [LOAD_ATTR_SLOT] = LOAD_ATTR, [LOAD_ATTR_WITH_HINT] = LOAD_ATTR, [LOAD_BUILD_CLASS] = LOAD_BUILD_CLASS, @@ -144,19 +167,13 @@ const uint8_t _PyOpcode_Deopt[256] = { [LOAD_CONST__LOAD_FAST] = LOAD_CONST, [LOAD_DEREF] = LOAD_DEREF, [LOAD_FAST] = LOAD_FAST, + [LOAD_FAST_CHECK] = LOAD_FAST_CHECK, [LOAD_FAST__LOAD_CONST] = LOAD_FAST, [LOAD_FAST__LOAD_FAST] = LOAD_FAST, [LOAD_GLOBAL] = LOAD_GLOBAL, [LOAD_GLOBAL_ADAPTIVE] = LOAD_GLOBAL, [LOAD_GLOBAL_BUILTIN] = LOAD_GLOBAL, [LOAD_GLOBAL_MODULE] = LOAD_GLOBAL, - [LOAD_METHOD] = LOAD_METHOD, - [LOAD_METHOD_ADAPTIVE] = LOAD_METHOD, - [LOAD_METHOD_CLASS] = LOAD_METHOD, - [LOAD_METHOD_MODULE] = LOAD_METHOD, - [LOAD_METHOD_NO_DICT] = LOAD_METHOD, - [LOAD_METHOD_WITH_DICT] = LOAD_METHOD, - [LOAD_METHOD_WITH_VALUES] = LOAD_METHOD, [LOAD_NAME] = LOAD_NAME, [MAKE_CELL] = MAKE_CELL, [MAKE_FUNCTION] = MAKE_FUNCTION, @@ -176,24 +193,6 @@ const uint8_t _PyOpcode_Deopt[256] = { [POP_JUMP_FORWARD_IF_NOT_NONE] = POP_JUMP_FORWARD_IF_NOT_NONE, [POP_JUMP_FORWARD_IF_TRUE] = POP_JUMP_FORWARD_IF_TRUE, [POP_TOP] = POP_TOP, - [PRECALL] = PRECALL, - [PRECALL_ADAPTIVE] = PRECALL, - [PRECALL_BOUND_METHOD] = PRECALL, - [PRECALL_BUILTIN_CLASS] = PRECALL, - [PRECALL_BUILTIN_FAST_WITH_KEYWORDS] = PRECALL, - [PRECALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS] = PRECALL, - [PRECALL_NO_KW_BUILTIN_FAST] = PRECALL, - [PRECALL_NO_KW_BUILTIN_O] = PRECALL, - [PRECALL_NO_KW_ISINSTANCE] = PRECALL, - [PRECALL_NO_KW_LEN] = PRECALL, - [PRECALL_NO_KW_LIST_APPEND] = PRECALL, - [PRECALL_NO_KW_METHOD_DESCRIPTOR_FAST] = PRECALL, - [PRECALL_NO_KW_METHOD_DESCRIPTOR_NOARGS] = PRECALL, - [PRECALL_NO_KW_METHOD_DESCRIPTOR_O] = PRECALL, - [PRECALL_NO_KW_STR_1] = PRECALL, - [PRECALL_NO_KW_TUPLE_1] = PRECALL, - [PRECALL_NO_KW_TYPE_1] = PRECALL, - [PRECALL_PYFUNC] = PRECALL, [PREP_RERAISE_STAR] = PREP_RERAISE_STAR, [PRINT_EXPR] = PRINT_EXPR, [PUSH_EXC_INFO] = PUSH_EXC_INFO, @@ -268,7 +267,22 @@ const uint8_t _PyOpcode_Original[256] = { [CACHE] = CACHE, [CALL] = CALL, [CALL_ADAPTIVE] = CALL, + [CALL_BOUND_METHOD_EXACT_ARGS] = CALL, + [CALL_BUILTIN_CLASS] = CALL, + [CALL_BUILTIN_FAST_WITH_KEYWORDS] = CALL, [CALL_FUNCTION_EX] = CALL_FUNCTION_EX, + [CALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS] = CALL, + [CALL_NO_KW_BUILTIN_FAST] = CALL, + [CALL_NO_KW_BUILTIN_O] = CALL, + [CALL_NO_KW_ISINSTANCE] = CALL, + [CALL_NO_KW_LEN] = CALL, + [CALL_NO_KW_LIST_APPEND] = CALL, + [CALL_NO_KW_METHOD_DESCRIPTOR_FAST] = CALL, + [CALL_NO_KW_METHOD_DESCRIPTOR_NOARGS] = CALL, + [CALL_NO_KW_METHOD_DESCRIPTOR_O] = CALL, + [CALL_NO_KW_STR_1] = CALL, + [CALL_NO_KW_TUPLE_1] = CALL, + [CALL_NO_KW_TYPE_1] = CALL, [CALL_PY_EXACT_ARGS] = CALL, [CALL_PY_WITH_DEFAULTS] = CALL, [CHECK_EG_MATCH] = CHECK_EG_MATCH, @@ -294,6 +308,9 @@ const uint8_t _PyOpcode_Original[256] = { [EXTENDED_ARG_QUICK] = EXTENDED_ARG_QUICK, [FORMAT_VALUE] = FORMAT_VALUE, [FOR_ITER] = FOR_ITER, + [FOR_ITER_ADAPTIVE] = FOR_ITER, + [FOR_ITER_LIST] = FOR_ITER, + 
[FOR_ITER_RANGE] = FOR_ITER, [GET_AITER] = GET_AITER, [GET_ANEXT] = GET_ANEXT, [GET_AWAITABLE] = GET_AWAITABLE, @@ -317,8 +334,14 @@ const uint8_t _PyOpcode_Original[256] = { [LOAD_ASSERTION_ERROR] = LOAD_ASSERTION_ERROR, [LOAD_ATTR] = LOAD_ATTR, [LOAD_ATTR_ADAPTIVE] = LOAD_ATTR, + [LOAD_ATTR_CLASS] = LOAD_ATTR, [LOAD_ATTR_INSTANCE_VALUE] = LOAD_ATTR, + [LOAD_ATTR_METHOD_LAZY_DICT] = LOAD_ATTR, + [LOAD_ATTR_METHOD_NO_DICT] = LOAD_ATTR, + [LOAD_ATTR_METHOD_WITH_DICT] = LOAD_ATTR, + [LOAD_ATTR_METHOD_WITH_VALUES] = LOAD_ATTR, [LOAD_ATTR_MODULE] = LOAD_ATTR, + [LOAD_ATTR_PROPERTY] = LOAD_ATTR, [LOAD_ATTR_SLOT] = LOAD_ATTR, [LOAD_ATTR_WITH_HINT] = LOAD_ATTR, [LOAD_BUILD_CLASS] = LOAD_BUILD_CLASS, @@ -328,19 +351,13 @@ const uint8_t _PyOpcode_Original[256] = { [LOAD_CONST__LOAD_FAST] = LOAD_CONST, [LOAD_DEREF] = LOAD_DEREF, [LOAD_FAST] = LOAD_FAST, + [LOAD_FAST_CHECK] = LOAD_FAST_CHECK, [LOAD_FAST__LOAD_CONST] = LOAD_FAST, [LOAD_FAST__LOAD_FAST] = LOAD_FAST, [LOAD_GLOBAL] = LOAD_GLOBAL, [LOAD_GLOBAL_ADAPTIVE] = LOAD_GLOBAL, [LOAD_GLOBAL_BUILTIN] = LOAD_GLOBAL, [LOAD_GLOBAL_MODULE] = LOAD_GLOBAL, - [LOAD_METHOD] = LOAD_METHOD, - [LOAD_METHOD_ADAPTIVE] = LOAD_METHOD, - [LOAD_METHOD_CLASS] = LOAD_METHOD, - [LOAD_METHOD_MODULE] = LOAD_METHOD, - [LOAD_METHOD_NO_DICT] = LOAD_METHOD, - [LOAD_METHOD_WITH_DICT] = LOAD_METHOD, - [LOAD_METHOD_WITH_VALUES] = LOAD_METHOD, [LOAD_NAME] = LOAD_NAME, [MAKE_CELL] = MAKE_CELL, [MAKE_FUNCTION] = MAKE_FUNCTION, @@ -360,24 +377,6 @@ const uint8_t _PyOpcode_Original[256] = { [POP_JUMP_FORWARD_IF_NOT_NONE] = POP_JUMP_FORWARD_IF_NOT_NONE, [POP_JUMP_FORWARD_IF_TRUE] = POP_JUMP_FORWARD_IF_TRUE, [POP_TOP] = POP_TOP, - [PRECALL] = PRECALL, - [PRECALL_ADAPTIVE] = PRECALL, - [PRECALL_BOUND_METHOD] = PRECALL, - [PRECALL_BUILTIN_CLASS] = PRECALL, - [PRECALL_BUILTIN_FAST_WITH_KEYWORDS] = PRECALL, - [PRECALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS] = PRECALL, - [PRECALL_NO_KW_BUILTIN_FAST] = PRECALL, - [PRECALL_NO_KW_BUILTIN_O] = PRECALL, - [PRECALL_NO_KW_ISINSTANCE] = PRECALL, - [PRECALL_NO_KW_LEN] = PRECALL, - [PRECALL_NO_KW_LIST_APPEND] = PRECALL, - [PRECALL_NO_KW_METHOD_DESCRIPTOR_FAST] = PRECALL, - [PRECALL_NO_KW_METHOD_DESCRIPTOR_NOARGS] = PRECALL, - [PRECALL_NO_KW_METHOD_DESCRIPTOR_O] = PRECALL, - [PRECALL_NO_KW_STR_1] = PRECALL, - [PRECALL_NO_KW_TUPLE_1] = PRECALL, - [PRECALL_NO_KW_TYPE_1] = PRECALL, - [PRECALL_PYFUNC] = PRECALL, [PREP_RERAISE_STAR] = PREP_RERAISE_STAR, [PRINT_EXPR] = PRINT_EXPR, [PUSH_EXC_INFO] = PUSH_EXC_INFO, @@ -451,67 +450,67 @@ static const char *const _PyOpcode_OpName[256] = { [CALL_PY_EXACT_ARGS] = "CALL_PY_EXACT_ARGS", [CALL_PY_WITH_DEFAULTS] = "CALL_PY_WITH_DEFAULTS", [BINARY_SUBSCR] = "BINARY_SUBSCR", - [COMPARE_OP_ADAPTIVE] = "COMPARE_OP_ADAPTIVE", - [COMPARE_OP_FLOAT_JUMP] = "COMPARE_OP_FLOAT_JUMP", - [COMPARE_OP_INT_JUMP] = "COMPARE_OP_INT_JUMP", - [COMPARE_OP_STR_JUMP] = "COMPARE_OP_STR_JUMP", + [CALL_BOUND_METHOD_EXACT_ARGS] = "CALL_BOUND_METHOD_EXACT_ARGS", + [CALL_BUILTIN_CLASS] = "CALL_BUILTIN_CLASS", + [CALL_BUILTIN_FAST_WITH_KEYWORDS] = "CALL_BUILTIN_FAST_WITH_KEYWORDS", + [CALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS] = "CALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS", [GET_LEN] = "GET_LEN", [MATCH_MAPPING] = "MATCH_MAPPING", [MATCH_SEQUENCE] = "MATCH_SEQUENCE", [MATCH_KEYS] = "MATCH_KEYS", - [EXTENDED_ARG_QUICK] = "EXTENDED_ARG_QUICK", + [CALL_NO_KW_BUILTIN_FAST] = "CALL_NO_KW_BUILTIN_FAST", [PUSH_EXC_INFO] = "PUSH_EXC_INFO", [CHECK_EXC_MATCH] = "CHECK_EXC_MATCH", [CHECK_EG_MATCH] = "CHECK_EG_MATCH", - [JUMP_BACKWARD_QUICK] = 
"JUMP_BACKWARD_QUICK", - [LOAD_ATTR_ADAPTIVE] = "LOAD_ATTR_ADAPTIVE", - [LOAD_ATTR_INSTANCE_VALUE] = "LOAD_ATTR_INSTANCE_VALUE", - [LOAD_ATTR_MODULE] = "LOAD_ATTR_MODULE", - [LOAD_ATTR_SLOT] = "LOAD_ATTR_SLOT", - [LOAD_ATTR_WITH_HINT] = "LOAD_ATTR_WITH_HINT", - [LOAD_CONST__LOAD_FAST] = "LOAD_CONST__LOAD_FAST", - [LOAD_FAST__LOAD_CONST] = "LOAD_FAST__LOAD_CONST", - [LOAD_FAST__LOAD_FAST] = "LOAD_FAST__LOAD_FAST", - [LOAD_GLOBAL_ADAPTIVE] = "LOAD_GLOBAL_ADAPTIVE", - [LOAD_GLOBAL_BUILTIN] = "LOAD_GLOBAL_BUILTIN", + [CALL_NO_KW_BUILTIN_O] = "CALL_NO_KW_BUILTIN_O", + [CALL_NO_KW_ISINSTANCE] = "CALL_NO_KW_ISINSTANCE", + [CALL_NO_KW_LEN] = "CALL_NO_KW_LEN", + [CALL_NO_KW_LIST_APPEND] = "CALL_NO_KW_LIST_APPEND", + [CALL_NO_KW_METHOD_DESCRIPTOR_FAST] = "CALL_NO_KW_METHOD_DESCRIPTOR_FAST", + [CALL_NO_KW_METHOD_DESCRIPTOR_NOARGS] = "CALL_NO_KW_METHOD_DESCRIPTOR_NOARGS", + [CALL_NO_KW_METHOD_DESCRIPTOR_O] = "CALL_NO_KW_METHOD_DESCRIPTOR_O", + [CALL_NO_KW_STR_1] = "CALL_NO_KW_STR_1", + [CALL_NO_KW_TUPLE_1] = "CALL_NO_KW_TUPLE_1", + [CALL_NO_KW_TYPE_1] = "CALL_NO_KW_TYPE_1", + [COMPARE_OP_ADAPTIVE] = "COMPARE_OP_ADAPTIVE", [WITH_EXCEPT_START] = "WITH_EXCEPT_START", [GET_AITER] = "GET_AITER", [GET_ANEXT] = "GET_ANEXT", [BEFORE_ASYNC_WITH] = "BEFORE_ASYNC_WITH", [BEFORE_WITH] = "BEFORE_WITH", [END_ASYNC_FOR] = "END_ASYNC_FOR", - [LOAD_GLOBAL_MODULE] = "LOAD_GLOBAL_MODULE", - [LOAD_METHOD_ADAPTIVE] = "LOAD_METHOD_ADAPTIVE", - [LOAD_METHOD_CLASS] = "LOAD_METHOD_CLASS", - [LOAD_METHOD_MODULE] = "LOAD_METHOD_MODULE", - [LOAD_METHOD_NO_DICT] = "LOAD_METHOD_NO_DICT", + [COMPARE_OP_FLOAT_JUMP] = "COMPARE_OP_FLOAT_JUMP", + [COMPARE_OP_INT_JUMP] = "COMPARE_OP_INT_JUMP", + [COMPARE_OP_STR_JUMP] = "COMPARE_OP_STR_JUMP", + [EXTENDED_ARG_QUICK] = "EXTENDED_ARG_QUICK", + [FOR_ITER_ADAPTIVE] = "FOR_ITER_ADAPTIVE", [STORE_SUBSCR] = "STORE_SUBSCR", [DELETE_SUBSCR] = "DELETE_SUBSCR", - [LOAD_METHOD_WITH_DICT] = "LOAD_METHOD_WITH_DICT", - [LOAD_METHOD_WITH_VALUES] = "LOAD_METHOD_WITH_VALUES", - [PRECALL_ADAPTIVE] = "PRECALL_ADAPTIVE", - [PRECALL_BOUND_METHOD] = "PRECALL_BOUND_METHOD", - [PRECALL_BUILTIN_CLASS] = "PRECALL_BUILTIN_CLASS", - [PRECALL_BUILTIN_FAST_WITH_KEYWORDS] = "PRECALL_BUILTIN_FAST_WITH_KEYWORDS", + [FOR_ITER_LIST] = "FOR_ITER_LIST", + [FOR_ITER_RANGE] = "FOR_ITER_RANGE", + [JUMP_BACKWARD_QUICK] = "JUMP_BACKWARD_QUICK", + [LOAD_ATTR_ADAPTIVE] = "LOAD_ATTR_ADAPTIVE", + [LOAD_ATTR_CLASS] = "LOAD_ATTR_CLASS", + [LOAD_ATTR_INSTANCE_VALUE] = "LOAD_ATTR_INSTANCE_VALUE", [GET_ITER] = "GET_ITER", [GET_YIELD_FROM_ITER] = "GET_YIELD_FROM_ITER", [PRINT_EXPR] = "PRINT_EXPR", [LOAD_BUILD_CLASS] = "LOAD_BUILD_CLASS", - [PRECALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS] = "PRECALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS", - [PRECALL_NO_KW_BUILTIN_FAST] = "PRECALL_NO_KW_BUILTIN_FAST", + [LOAD_ATTR_MODULE] = "LOAD_ATTR_MODULE", + [LOAD_ATTR_PROPERTY] = "LOAD_ATTR_PROPERTY", [LOAD_ASSERTION_ERROR] = "LOAD_ASSERTION_ERROR", [RETURN_GENERATOR] = "RETURN_GENERATOR", - [PRECALL_NO_KW_BUILTIN_O] = "PRECALL_NO_KW_BUILTIN_O", - [PRECALL_NO_KW_ISINSTANCE] = "PRECALL_NO_KW_ISINSTANCE", - [PRECALL_NO_KW_LEN] = "PRECALL_NO_KW_LEN", - [PRECALL_NO_KW_LIST_APPEND] = "PRECALL_NO_KW_LIST_APPEND", - [PRECALL_NO_KW_METHOD_DESCRIPTOR_FAST] = "PRECALL_NO_KW_METHOD_DESCRIPTOR_FAST", - [PRECALL_NO_KW_METHOD_DESCRIPTOR_NOARGS] = "PRECALL_NO_KW_METHOD_DESCRIPTOR_NOARGS", + [LOAD_ATTR_SLOT] = "LOAD_ATTR_SLOT", + [LOAD_ATTR_WITH_HINT] = "LOAD_ATTR_WITH_HINT", + [LOAD_ATTR_METHOD_LAZY_DICT] = "LOAD_ATTR_METHOD_LAZY_DICT", + [LOAD_ATTR_METHOD_NO_DICT] = 
"LOAD_ATTR_METHOD_NO_DICT", + [LOAD_ATTR_METHOD_WITH_DICT] = "LOAD_ATTR_METHOD_WITH_DICT", + [LOAD_ATTR_METHOD_WITH_VALUES] = "LOAD_ATTR_METHOD_WITH_VALUES", [LIST_TO_TUPLE] = "LIST_TO_TUPLE", [RETURN_VALUE] = "RETURN_VALUE", [IMPORT_STAR] = "IMPORT_STAR", [SETUP_ANNOTATIONS] = "SETUP_ANNOTATIONS", - [YIELD_VALUE] = "YIELD_VALUE", + [LOAD_CONST__LOAD_FAST] = "LOAD_CONST__LOAD_FAST", [ASYNC_GEN_WRAP] = "ASYNC_GEN_WRAP", [PREP_RERAISE_STAR] = "PREP_RERAISE_STAR", [POP_EXCEPT] = "POP_EXCEPT", @@ -538,7 +537,7 @@ static const char *const _PyOpcode_OpName[256] = { [JUMP_FORWARD] = "JUMP_FORWARD", [JUMP_IF_FALSE_OR_POP] = "JUMP_IF_FALSE_OR_POP", [JUMP_IF_TRUE_OR_POP] = "JUMP_IF_TRUE_OR_POP", - [PRECALL_NO_KW_METHOD_DESCRIPTOR_O] = "PRECALL_NO_KW_METHOD_DESCRIPTOR_O", + [LOAD_FAST__LOAD_CONST] = "LOAD_FAST__LOAD_CONST", [POP_JUMP_FORWARD_IF_FALSE] = "POP_JUMP_FORWARD_IF_FALSE", [POP_JUMP_FORWARD_IF_TRUE] = "POP_JUMP_FORWARD_IF_TRUE", [LOAD_GLOBAL] = "LOAD_GLOBAL", @@ -546,13 +545,13 @@ static const char *const _PyOpcode_OpName[256] = { [CONTAINS_OP] = "CONTAINS_OP", [RERAISE] = "RERAISE", [COPY] = "COPY", - [PRECALL_NO_KW_STR_1] = "PRECALL_NO_KW_STR_1", + [LOAD_FAST__LOAD_FAST] = "LOAD_FAST__LOAD_FAST", [BINARY_OP] = "BINARY_OP", [SEND] = "SEND", [LOAD_FAST] = "LOAD_FAST", [STORE_FAST] = "STORE_FAST", [DELETE_FAST] = "DELETE_FAST", - [PRECALL_NO_KW_TUPLE_1] = "PRECALL_NO_KW_TUPLE_1", + [LOAD_FAST_CHECK] = "LOAD_FAST_CHECK", [POP_JUMP_FORWARD_IF_NOT_NONE] = "POP_JUMP_FORWARD_IF_NOT_NONE", [POP_JUMP_FORWARD_IF_NONE] = "POP_JUMP_FORWARD_IF_NONE", [RAISE_VARARGS] = "RAISE_VARARGS", @@ -566,32 +565,32 @@ static const char *const _PyOpcode_OpName[256] = { [STORE_DEREF] = "STORE_DEREF", [DELETE_DEREF] = "DELETE_DEREF", [JUMP_BACKWARD] = "JUMP_BACKWARD", - [PRECALL_NO_KW_TYPE_1] = "PRECALL_NO_KW_TYPE_1", + [LOAD_GLOBAL_ADAPTIVE] = "LOAD_GLOBAL_ADAPTIVE", [CALL_FUNCTION_EX] = "CALL_FUNCTION_EX", - [PRECALL_PYFUNC] = "PRECALL_PYFUNC", + [LOAD_GLOBAL_BUILTIN] = "LOAD_GLOBAL_BUILTIN", [EXTENDED_ARG] = "EXTENDED_ARG", [LIST_APPEND] = "LIST_APPEND", [SET_ADD] = "SET_ADD", [MAP_ADD] = "MAP_ADD", [LOAD_CLASSDEREF] = "LOAD_CLASSDEREF", [COPY_FREE_VARS] = "COPY_FREE_VARS", - [RESUME_QUICK] = "RESUME_QUICK", + [YIELD_VALUE] = "YIELD_VALUE", [RESUME] = "RESUME", [MATCH_CLASS] = "MATCH_CLASS", - [STORE_ATTR_ADAPTIVE] = "STORE_ATTR_ADAPTIVE", - [STORE_ATTR_INSTANCE_VALUE] = "STORE_ATTR_INSTANCE_VALUE", + [LOAD_GLOBAL_MODULE] = "LOAD_GLOBAL_MODULE", + [RESUME_QUICK] = "RESUME_QUICK", [FORMAT_VALUE] = "FORMAT_VALUE", [BUILD_CONST_KEY_MAP] = "BUILD_CONST_KEY_MAP", [BUILD_STRING] = "BUILD_STRING", + [STORE_ATTR_ADAPTIVE] = "STORE_ATTR_ADAPTIVE", + [STORE_ATTR_INSTANCE_VALUE] = "STORE_ATTR_INSTANCE_VALUE", [STORE_ATTR_SLOT] = "STORE_ATTR_SLOT", [STORE_ATTR_WITH_HINT] = "STORE_ATTR_WITH_HINT", - [LOAD_METHOD] = "LOAD_METHOD", - [STORE_FAST__LOAD_FAST] = "STORE_FAST__LOAD_FAST", [LIST_EXTEND] = "LIST_EXTEND", [SET_UPDATE] = "SET_UPDATE", [DICT_MERGE] = "DICT_MERGE", [DICT_UPDATE] = "DICT_UPDATE", - [PRECALL] = "PRECALL", + [STORE_FAST__LOAD_FAST] = "STORE_FAST__LOAD_FAST", [STORE_FAST__STORE_FAST] = "STORE_FAST__STORE_FAST", [STORE_SUBSCR_ADAPTIVE] = "STORE_SUBSCR_ADAPTIVE", [STORE_SUBSCR_DICT] = "STORE_SUBSCR_DICT", diff --git a/Include/internal/pycore_pymath.h b/Include/internal/pycore_pymath.h index 5c6aee2a23890b..5f3afe4df6865c 100644 --- a/Include/internal/pycore_pymath.h +++ b/Include/internal/pycore_pymath.h @@ -56,17 +56,13 @@ static inline void _Py_ADJUST_ERANGE2(double x, double y) } } -// Return whether integral 
type *type* is signed or not. -#define _Py_IntegralTypeSigned(type) \ - ((type)(-1) < 0) - // Return the maximum value of integral type *type*. #define _Py_IntegralTypeMax(type) \ - ((_Py_IntegralTypeSigned(type)) ? (((((type)1 << (sizeof(type)*CHAR_BIT - 2)) - 1) << 1) + 1) : ~(type)0) + (_Py_IS_TYPE_SIGNED(type) ? (((((type)1 << (sizeof(type)*CHAR_BIT - 2)) - 1) << 1) + 1) : ~(type)0) // Return the minimum value of integral type *type*. #define _Py_IntegralTypeMin(type) \ - ((_Py_IntegralTypeSigned(type)) ? -_Py_IntegralTypeMax(type) - 1 : 0) + (_Py_IS_TYPE_SIGNED(type) ? -_Py_IntegralTypeMax(type) - 1 : 0) // Check whether *v* is in the range of integral type *type*. This is most // useful if *v* is floating-point, since demoting a floating-point *v* to an diff --git a/Include/internal/pycore_pystate.h b/Include/internal/pycore_pystate.h index c4bc53c707fdaf..b4a39b62b76a1d 100644 --- a/Include/internal/pycore_pystate.h +++ b/Include/internal/pycore_pystate.h @@ -64,18 +64,10 @@ _Py_ThreadCanHandlePendingCalls(void) /* Variable and macro for in-line access to current thread and interpreter state */ -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS -PyAPI_FUNC(PyThreadState*) _PyThreadState_GetTSS(void); -#endif - static inline PyThreadState* _PyRuntimeState_GetThreadState(_PyRuntimeState *runtime) { -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - return _PyThreadState_GetTSS(); -#else return (PyThreadState*)_Py_atomic_load_relaxed(&runtime->gilstate.tstate_current); -#endif } /* Get the current Python thread state. @@ -90,11 +82,7 @@ _PyRuntimeState_GetThreadState(_PyRuntimeState *runtime) static inline PyThreadState* _PyThreadState_GET(void) { -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - return _PyThreadState_GetTSS(); -#else return _PyRuntimeState_GetThreadState(&_PyRuntime); -#endif } PyAPI_FUNC(void) _Py_NO_RETURN _Py_FatalError_TstateNULL(const char *func); @@ -109,7 +97,7 @@ _Py_EnsureFuncTstateNotNULL(const char *func, PyThreadState *tstate) // Call Py_FatalError() if tstate is NULL #define _Py_EnsureTstateNotNULL(tstate) \ - _Py_EnsureFuncTstateNotNULL(__func__, tstate) + _Py_EnsureFuncTstateNotNULL(__func__, (tstate)) /* Get the current interpreter state. diff --git a/Include/internal/pycore_range.h b/Include/internal/pycore_range.h new file mode 100644 index 00000000000000..809e89a1e01b60 --- /dev/null +++ b/Include/internal/pycore_range.h @@ -0,0 +1,22 @@ +#ifndef Py_INTERNAL_RANGE_H +#define Py_INTERNAL_RANGE_H +#ifdef __cplusplus +extern "C" { +#endif + +#ifndef Py_BUILD_CORE +# error "this header requires Py_BUILD_CORE define" +#endif + +typedef struct { + PyObject_HEAD + long index; + long start; + long step; + long len; +} _PyRangeIterObject; + +#ifdef __cplusplus +} +#endif +#endif /* !Py_INTERNAL_RANGE_H */ diff --git a/Include/internal/pycore_runtime.h b/Include/internal/pycore_runtime.h index 18191c3771dfcc..ae63ae74afa5fe 100644 --- a/Include/internal/pycore_runtime.h +++ b/Include/internal/pycore_runtime.h @@ -23,9 +23,7 @@ struct _ceval_runtime_state { the main thread of the main interpreter can handle signals: see _Py_ThreadCanHandleSignals(). 
*/ _Py_atomic_int signals_pending; -#ifndef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS struct _gil_runtime_state gil; -#endif }; /* GIL state */ diff --git a/Include/internal/pycore_runtime_init.h b/Include/internal/pycore_runtime_init.h index 737507f07eacce..a7a436011d9b43 100644 --- a/Include/internal/pycore_runtime_init.h +++ b/Include/internal/pycore_runtime_init.h @@ -84,13 +84,13 @@ extern "C" { #define _PyBytes_SIMPLE_INIT(CH, LEN) \ { \ - _PyVarObject_IMMORTAL_INIT(&PyBytes_Type, LEN), \ + _PyVarObject_IMMORTAL_INIT(&PyBytes_Type, (LEN)), \ .ob_shash = -1, \ - .ob_sval = { CH }, \ + .ob_sval = { (CH) }, \ } #define _PyBytes_CHAR_INIT(CH) \ { \ - _PyBytes_SIMPLE_INIT(CH, 1) \ + _PyBytes_SIMPLE_INIT((CH), 1) \ } #define _PyUnicode_ASCII_BASE_INIT(LITERAL, ASCII) \ @@ -101,13 +101,13 @@ extern "C" { .state = { \ .kind = 1, \ .compact = 1, \ - .ascii = ASCII, \ + .ascii = (ASCII), \ }, \ } #define _PyASCIIObject_INIT(LITERAL) \ { \ - ._ascii = _PyUnicode_ASCII_BASE_INIT(LITERAL, 1), \ - ._data = LITERAL \ + ._ascii = _PyUnicode_ASCII_BASE_INIT((LITERAL), 1), \ + ._data = (LITERAL) \ } #define INIT_STR(NAME, LITERAL) \ ._ ## NAME = _PyASCIIObject_INIT(LITERAL) @@ -116,9 +116,9 @@ extern "C" { #define _PyUnicode_LATIN1_INIT(LITERAL) \ { \ ._latin1 = { \ - ._base = _PyUnicode_ASCII_BASE_INIT(LITERAL, 0), \ + ._base = _PyUnicode_ASCII_BASE_INIT((LITERAL), 0), \ }, \ - ._data = LITERAL, \ + ._data = (LITERAL), \ } /* The following is auto-generated by Tools/scripts/generate_global_objects.py. */ @@ -712,6 +712,7 @@ extern "C" { INIT_ID(__delete__), \ INIT_ID(__delitem__), \ INIT_ID(__dict__), \ + INIT_ID(__dictoffset__), \ INIT_ID(__dir__), \ INIT_ID(__divmod__), \ INIT_ID(__doc__), \ @@ -824,9 +825,11 @@ extern "C" { INIT_ID(__truediv__), \ INIT_ID(__trunc__), \ INIT_ID(__typing_is_unpacked_typevartuple__), \ + INIT_ID(__typing_prepare_subst__), \ INIT_ID(__typing_subst__), \ INIT_ID(__typing_unpacked_tuple_args__), \ INIT_ID(__warningregistry__), \ + INIT_ID(__weaklistoffset__), \ INIT_ID(__weakref__), \ INIT_ID(__xor__), \ INIT_ID(_abc_impl), \ @@ -845,7 +848,6 @@ extern "C" { INIT_ID(_showwarnmsg), \ INIT_ID(_shutdown), \ INIT_ID(_slotnames), \ - INIT_ID(_strptime_time), \ INIT_ID(_uninitialized_submodules), \ INIT_ID(_warn_unawaited_coroutine), \ INIT_ID(_xoptions), \ @@ -871,7 +873,6 @@ extern "C" { INIT_ID(difference_update), \ INIT_ID(dispatch_table), \ INIT_ID(displayhook), \ - INIT_ID(enable), \ INIT_ID(encode), \ INIT_ID(encoding), \ INIT_ID(end_lineno), \ @@ -930,7 +931,6 @@ extern "C" { INIT_ID(opcode), \ INIT_ID(open), \ INIT_ID(parent), \ - INIT_ID(partial), \ INIT_ID(path), \ INIT_ID(peek), \ INIT_ID(persistent_id), \ @@ -976,7 +976,6 @@ extern "C" { INIT_ID(warnoptions), \ INIT_ID(writable), \ INIT_ID(write), \ - INIT_ID(zipimporter), \ }, \ .ascii = { \ _PyASCIIObject_INIT("\x00"), \ diff --git a/Include/internal/pycore_symtable.h b/Include/internal/pycore_symtable.h index 28935f4ed55012..2d64aba22ff905 100644 --- a/Include/internal/pycore_symtable.h +++ b/Include/internal/pycore_symtable.h @@ -77,7 +77,7 @@ typedef struct _symtable_entry { extern PyTypeObject PySTEntry_Type; -#define PySTEntry_Check(op) Py_IS_TYPE(op, &PySTEntry_Type) +#define PySTEntry_Check(op) Py_IS_TYPE((op), &PySTEntry_Type) extern long _PyST_GetSymbol(PySTEntryObject *, PyObject *); extern int _PyST_GetScope(PySTEntryObject *, PyObject *); diff --git a/Include/internal/pycore_unionobject.h b/Include/internal/pycore_unionobject.h index a9ed5651a410e8..87264635b6e1cf 100644 --- 
a/Include/internal/pycore_unionobject.h +++ b/Include/internal/pycore_unionobject.h @@ -9,10 +9,10 @@ extern "C" { #endif extern PyTypeObject _PyUnion_Type; -#define _PyUnion_Check(op) Py_IS_TYPE(op, &_PyUnion_Type) +#define _PyUnion_Check(op) Py_IS_TYPE((op), &_PyUnion_Type) extern PyObject *_Py_union_type_or(PyObject *, PyObject *); -#define _PyGenericAlias_Check(op) PyObject_TypeCheck(op, &Py_GenericAliasType) +#define _PyGenericAlias_Check(op) PyObject_TypeCheck((op), &Py_GenericAliasType) extern PyObject *_Py_subs_parameters(PyObject *, PyObject *, PyObject *, PyObject *); extern PyObject *_Py_make_parameters(PyObject *); extern PyObject *_Py_union_args(PyObject *self); diff --git a/Include/iterobject.h b/Include/iterobject.h index 6454611aebef8a..fff30f7176fdeb 100644 --- a/Include/iterobject.h +++ b/Include/iterobject.h @@ -11,12 +11,12 @@ PyAPI_DATA(PyTypeObject) PyCallIter_Type; extern PyTypeObject _PyAnextAwaitable_Type; #endif -#define PySeqIter_Check(op) Py_IS_TYPE(op, &PySeqIter_Type) +#define PySeqIter_Check(op) Py_IS_TYPE((op), &PySeqIter_Type) PyAPI_FUNC(PyObject *) PySeqIter_New(PyObject *); -#define PyCallIter_Check(op) Py_IS_TYPE(op, &PyCallIter_Type) +#define PyCallIter_Check(op) Py_IS_TYPE((op), &PyCallIter_Type) PyAPI_FUNC(PyObject *) PyCallIter_New(PyObject *, PyObject *); diff --git a/Include/listobject.h b/Include/listobject.h index eff42c188f1ff1..6b7041ba0b05d5 100644 --- a/Include/listobject.h +++ b/Include/listobject.h @@ -23,7 +23,7 @@ PyAPI_DATA(PyTypeObject) PyListRevIter_Type; #define PyList_Check(op) \ PyType_FastSubclass(Py_TYPE(op), Py_TPFLAGS_LIST_SUBCLASS) -#define PyList_CheckExact(op) Py_IS_TYPE(op, &PyList_Type) +#define PyList_CheckExact(op) Py_IS_TYPE((op), &PyList_Type) PyAPI_FUNC(PyObject *) PyList_New(Py_ssize_t size); PyAPI_FUNC(Py_ssize_t) PyList_Size(PyObject *); diff --git a/Include/longobject.h b/Include/longobject.h index 81ba1239a69ecf..e559e238ae5a35 100644 --- a/Include/longobject.h +++ b/Include/longobject.h @@ -11,7 +11,7 @@ PyAPI_DATA(PyTypeObject) PyLong_Type; #define PyLong_Check(op) \ PyType_FastSubclass(Py_TYPE(op), Py_TPFLAGS_LONG_SUBCLASS) -#define PyLong_CheckExact(op) Py_IS_TYPE(op, &PyLong_Type) +#define PyLong_CheckExact(op) Py_IS_TYPE((op), &PyLong_Type) PyAPI_FUNC(PyObject *) PyLong_FromLong(long); PyAPI_FUNC(PyObject *) PyLong_FromUnsignedLong(unsigned long); diff --git a/Include/memoryobject.h b/Include/memoryobject.h index 154397ce1e56d2..19aec679a5fad1 100644 --- a/Include/memoryobject.h +++ b/Include/memoryobject.h @@ -11,7 +11,7 @@ PyAPI_DATA(PyTypeObject) _PyManagedBuffer_Type; #endif PyAPI_DATA(PyTypeObject) PyMemoryView_Type; -#define PyMemoryView_Check(op) Py_IS_TYPE(op, &PyMemoryView_Type) +#define PyMemoryView_Check(op) Py_IS_TYPE((op), &PyMemoryView_Type) #ifndef Py_LIMITED_API /* Get a pointer to the memoryview's private copy of the exporter's buffer. 
*/ diff --git a/Include/methodobject.h b/Include/methodobject.h index c971d78a640a68..72af5ad933df7f 100644 --- a/Include/methodobject.h +++ b/Include/methodobject.h @@ -13,8 +13,8 @@ extern "C" { PyAPI_DATA(PyTypeObject) PyCFunction_Type; -#define PyCFunction_CheckExact(op) Py_IS_TYPE(op, &PyCFunction_Type) -#define PyCFunction_Check(op) PyObject_TypeCheck(op, &PyCFunction_Type) +#define PyCFunction_CheckExact(op) Py_IS_TYPE((op), &PyCFunction_Type) +#define PyCFunction_Check(op) PyObject_TypeCheck((op), &PyCFunction_Type) typedef PyObject *(*PyCFunction)(PyObject *, PyObject *); typedef PyObject *(*_PyCFunctionFast) (PyObject *, PyObject *const *, Py_ssize_t); diff --git a/Include/modsupport.h b/Include/modsupport.h index 0e96a5c988846e..4e369bd56b4d20 100644 --- a/Include/modsupport.h +++ b/Include/modsupport.h @@ -37,8 +37,6 @@ PyAPI_FUNC(PyObject *) Py_BuildValue(const char *, ...); PyAPI_FUNC(PyObject *) _Py_BuildValue_SizeT(const char *, ...); -#define ANY_VARARGS(n) (n == PY_SSIZE_T_MAX) - PyAPI_FUNC(PyObject *) Py_VaBuildValue(const char *, va_list); // Add an attribute with name 'name' and value 'obj' to the module 'mod. @@ -58,8 +56,8 @@ PyAPI_FUNC(int) PyModule_AddStringConstant(PyObject *, const char *, const char PyAPI_FUNC(int) PyModule_AddType(PyObject *module, PyTypeObject *type); #endif /* Py_LIMITED_API */ -#define PyModule_AddIntMacro(m, c) PyModule_AddIntConstant(m, #c, c) -#define PyModule_AddStringMacro(m, c) PyModule_AddStringConstant(m, #c, c) +#define PyModule_AddIntMacro(m, c) PyModule_AddIntConstant((m), #c, (c)) +#define PyModule_AddStringMacro(m, c) PyModule_AddStringConstant((m), #c, (c)) #if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 >= 0x03050000 /* New in 3.5 */ @@ -134,10 +132,10 @@ PyAPI_FUNC(PyObject *) PyModule_Create2(PyModuleDef*, int apiver); #ifdef Py_LIMITED_API #define PyModule_Create(module) \ - PyModule_Create2(module, PYTHON_ABI_VERSION) + PyModule_Create2((module), PYTHON_ABI_VERSION) #else #define PyModule_Create(module) \ - PyModule_Create2(module, PYTHON_API_VERSION) + PyModule_Create2((module), PYTHON_API_VERSION) #endif #if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 >= 0x03050000 @@ -148,10 +146,10 @@ PyAPI_FUNC(PyObject *) PyModule_FromDefAndSpec2(PyModuleDef *def, #ifdef Py_LIMITED_API #define PyModule_FromDefAndSpec(module, spec) \ - PyModule_FromDefAndSpec2(module, spec, PYTHON_ABI_VERSION) + PyModule_FromDefAndSpec2((module), (spec), PYTHON_ABI_VERSION) #else #define PyModule_FromDefAndSpec(module, spec) \ - PyModule_FromDefAndSpec2(module, spec, PYTHON_API_VERSION) + PyModule_FromDefAndSpec2((module), (spec), PYTHON_API_VERSION) #endif /* Py_LIMITED_API */ #endif /* New in 3.5 */ diff --git a/Include/moduleobject.h b/Include/moduleobject.h index 75abd2cf2b9050..fbb2c5ae79444b 100644 --- a/Include/moduleobject.h +++ b/Include/moduleobject.h @@ -9,8 +9,8 @@ extern "C" { PyAPI_DATA(PyTypeObject) PyModule_Type; -#define PyModule_Check(op) PyObject_TypeCheck(op, &PyModule_Type) -#define PyModule_CheckExact(op) Py_IS_TYPE(op, &PyModule_Type) +#define PyModule_Check(op) PyObject_TypeCheck((op), &PyModule_Type) +#define PyModule_CheckExact(op) Py_IS_TYPE((op), &PyModule_Type) #if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 >= 0x03030000 PyAPI_FUNC(PyObject *) PyModule_NewObject( diff --git a/Include/object.h b/Include/object.h index f2af428e2bb973..a12b754c8ed8f3 100644 --- a/Include/object.h +++ b/Include/object.h @@ -51,6 +51,8 @@ A standard interface exists for objects that contain an array of items whose size is determined when 
the object is allocated. */ +#include "pystats.h" + /* Py_DEBUG implies Py_REF_DEBUG. */ #if defined(Py_DEBUG) && !defined(Py_REF_DEBUG) # define Py_REF_DEBUG @@ -78,10 +80,10 @@ whose size is determined when the object is allocated. #define PyObject_HEAD_INIT(type) \ { _PyObject_EXTRA_INIT \ - 1, type }, + 1, (type) }, #define PyVarObject_HEAD_INIT(type, size) \ - { PyObject_HEAD_INIT(type) size }, + { PyObject_HEAD_INIT(type) (size) }, /* PyObject_VAR_HEAD defines the initial segment of all variable-size * container objects. These end with a declaration of an array with 1 @@ -150,7 +152,7 @@ static inline int Py_IS_TYPE(PyObject *ob, PyTypeObject *type) { return Py_TYPE(ob) == type; } #if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define Py_IS_TYPE(ob, type) Py_IS_TYPE(_PyObject_CAST(ob), type) +# define Py_IS_TYPE(ob, type) Py_IS_TYPE(_PyObject_CAST(ob), (type)) #endif @@ -158,7 +160,7 @@ static inline void Py_SET_REFCNT(PyObject *ob, Py_ssize_t refcnt) { ob->ob_refcnt = refcnt; } #if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define Py_SET_REFCNT(ob, refcnt) Py_SET_REFCNT(_PyObject_CAST(ob), refcnt) +# define Py_SET_REFCNT(ob, refcnt) Py_SET_REFCNT(_PyObject_CAST(ob), (refcnt)) #endif @@ -174,7 +176,7 @@ static inline void Py_SET_SIZE(PyVarObject *ob, Py_ssize_t size) { ob->ob_size = size; } #if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define Py_SET_SIZE(ob, size) Py_SET_SIZE(_PyVarObject_CAST(ob), size) +# define Py_SET_SIZE(ob, size) Py_SET_SIZE(_PyVarObject_CAST(ob), (size)) #endif @@ -255,6 +257,9 @@ PyAPI_FUNC(void *) PyType_GetModuleState(PyTypeObject *); PyAPI_FUNC(PyObject *) PyType_GetName(PyTypeObject *); PyAPI_FUNC(PyObject *) PyType_GetQualName(PyTypeObject *); #endif +#if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 >= 0x030C0000 +PyAPI_FUNC(PyObject *) PyType_FromMetaclass(PyTypeObject*, PyObject*, PyType_Spec*, PyObject*); +#endif /* Generic type check */ PyAPI_FUNC(int) PyType_IsSubtype(PyTypeObject *, PyTypeObject *); @@ -263,7 +268,7 @@ static inline int PyObject_TypeCheck(PyObject *ob, PyTypeObject *type) { return Py_IS_TYPE(ob, type) || PyType_IsSubtype(Py_TYPE(ob), type); } #if !defined(Py_LIMITED_API) || Py_LIMITED_API+0 < 0x030b0000 -# define PyObject_TypeCheck(ob, type) PyObject_TypeCheck(_PyObject_CAST(ob), type) +# define PyObject_TypeCheck(ob, type) PyObject_TypeCheck(_PyObject_CAST(ob), (type)) #endif PyAPI_DATA(PyTypeObject) PyType_Type; /* built-in 'type' */ @@ -490,6 +495,7 @@ PyAPI_FUNC(void) _Py_DecRef(PyObject *); static inline void Py_INCREF(PyObject *op) { + _Py_INCREF_STAT_INC(); #if defined(Py_REF_DEBUG) && defined(Py_LIMITED_API) && Py_LIMITED_API+0 >= 0x030A0000 // Stable ABI for Python 3.10 built in debug mode. 
_Py_IncRef(op); @@ -506,7 +512,6 @@ static inline void Py_INCREF(PyObject *op) # define Py_INCREF(op) Py_INCREF(_PyObject_CAST(op)) #endif - #if defined(Py_REF_DEBUG) && defined(Py_LIMITED_API) && Py_LIMITED_API+0 >= 0x030A0000 // Stable ABI for limited C API version 3.10 of Python debug build static inline void Py_DECREF(PyObject *op) { @@ -517,6 +522,7 @@ static inline void Py_DECREF(PyObject *op) { #elif defined(Py_REF_DEBUG) static inline void Py_DECREF(const char *filename, int lineno, PyObject *op) { + _Py_DECREF_STAT_INC(); _Py_RefTotal--; if (--op->ob_refcnt != 0) { if (op->ob_refcnt < 0) { @@ -532,6 +538,7 @@ static inline void Py_DECREF(const char *filename, int lineno, PyObject *op) #else static inline void Py_DECREF(PyObject *op) { + _Py_DECREF_STAT_INC(); // Non-limited C API and limited C API for Python 3.9 and older access // directly PyObject.ob_refcnt. if (--op->ob_refcnt == 0) { @@ -772,7 +779,7 @@ PyType_HasFeature(PyTypeObject *type, unsigned long feature) return ((flags & feature) != 0); } -#define PyType_FastSubclass(type, flag) PyType_HasFeature(type, flag) +#define PyType_FastSubclass(type, flag) PyType_HasFeature((type), (flag)) static inline int PyType_Check(PyObject *op) { return PyType_FastSubclass(Py_TYPE(op), Py_TPFLAGS_TYPE_SUBCLASS); diff --git a/Include/objimpl.h b/Include/objimpl.h index 4fa670e71ab46f..140918198dae0e 100644 --- a/Include/objimpl.h +++ b/Include/objimpl.h @@ -135,14 +135,14 @@ PyAPI_FUNC(PyVarObject *) _PyObject_NewVar(PyTypeObject *, Py_ssize_t); // Alias to PyObject_New(). In Python 3.8, PyObject_NEW() called directly // PyObject_MALLOC() with _PyObject_SIZE(). -#define PyObject_NEW(type, typeobj) PyObject_New(type, typeobj) +#define PyObject_NEW(type, typeobj) PyObject_New((type), (typeobj)) #define PyObject_NewVar(type, typeobj, n) \ ( (type *) _PyObject_NewVar((typeobj), (n)) ) // Alias to PyObject_NewVar(). In Python 3.8, PyObject_NEW_VAR() called // directly PyObject_MALLOC() with _PyObject_VAR_SIZE(). 
-#define PyObject_NEW_VAR(type, typeobj, n) PyObject_NewVar(type, typeobj, n) +#define PyObject_NEW_VAR(type, typeobj, n) PyObject_NewVar((type), (typeobj), (n)) /* diff --git a/Include/opcode.h b/Include/opcode.h index 084d34b8c73cd5..7a22d5257a669b 100644 --- a/Include/opcode.h +++ b/Include/opcode.h @@ -42,7 +42,6 @@ extern "C" { #define RETURN_VALUE 83 #define IMPORT_STAR 84 #define SETUP_ANNOTATIONS 85 -#define YIELD_VALUE 86 #define ASYNC_GEN_WRAP 87 #define PREP_RERAISE_STAR 88 #define POP_EXCEPT 89 @@ -82,6 +81,7 @@ extern "C" { #define LOAD_FAST 124 #define STORE_FAST 125 #define DELETE_FAST 126 +#define LOAD_FAST_CHECK 127 #define POP_JUMP_FORWARD_IF_NOT_NONE 128 #define POP_JUMP_FORWARD_IF_NONE 129 #define RAISE_VARARGS 130 @@ -102,17 +102,16 @@ extern "C" { #define MAP_ADD 147 #define LOAD_CLASSDEREF 148 #define COPY_FREE_VARS 149 +#define YIELD_VALUE 150 #define RESUME 151 #define MATCH_CLASS 152 #define FORMAT_VALUE 155 #define BUILD_CONST_KEY_MAP 156 #define BUILD_STRING 157 -#define LOAD_METHOD 160 #define LIST_EXTEND 162 #define SET_UPDATE 163 #define DICT_MERGE 164 #define DICT_UPDATE 165 -#define PRECALL 166 #define CALL 171 #define KW_NAMES 172 #define POP_JUMP_BACKWARD_IF_NOT_NONE 173 @@ -136,52 +135,53 @@ extern "C" { #define CALL_ADAPTIVE 22 #define CALL_PY_EXACT_ARGS 23 #define CALL_PY_WITH_DEFAULTS 24 -#define COMPARE_OP_ADAPTIVE 26 -#define COMPARE_OP_FLOAT_JUMP 27 -#define COMPARE_OP_INT_JUMP 28 -#define COMPARE_OP_STR_JUMP 29 -#define EXTENDED_ARG_QUICK 34 -#define JUMP_BACKWARD_QUICK 38 -#define LOAD_ATTR_ADAPTIVE 39 -#define LOAD_ATTR_INSTANCE_VALUE 40 -#define LOAD_ATTR_MODULE 41 -#define LOAD_ATTR_SLOT 42 -#define LOAD_ATTR_WITH_HINT 43 -#define LOAD_CONST__LOAD_FAST 44 -#define LOAD_FAST__LOAD_CONST 45 -#define LOAD_FAST__LOAD_FAST 46 -#define LOAD_GLOBAL_ADAPTIVE 47 -#define LOAD_GLOBAL_BUILTIN 48 -#define LOAD_GLOBAL_MODULE 55 -#define LOAD_METHOD_ADAPTIVE 56 -#define LOAD_METHOD_CLASS 57 -#define LOAD_METHOD_MODULE 58 -#define LOAD_METHOD_NO_DICT 59 -#define LOAD_METHOD_WITH_DICT 62 -#define LOAD_METHOD_WITH_VALUES 63 -#define PRECALL_ADAPTIVE 64 -#define PRECALL_BOUND_METHOD 65 -#define PRECALL_BUILTIN_CLASS 66 -#define PRECALL_BUILTIN_FAST_WITH_KEYWORDS 67 -#define PRECALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS 72 -#define PRECALL_NO_KW_BUILTIN_FAST 73 -#define PRECALL_NO_KW_BUILTIN_O 76 -#define PRECALL_NO_KW_ISINSTANCE 77 -#define PRECALL_NO_KW_LEN 78 -#define PRECALL_NO_KW_LIST_APPEND 79 -#define PRECALL_NO_KW_METHOD_DESCRIPTOR_FAST 80 -#define PRECALL_NO_KW_METHOD_DESCRIPTOR_NOARGS 81 -#define PRECALL_NO_KW_METHOD_DESCRIPTOR_O 113 -#define PRECALL_NO_KW_STR_1 121 -#define PRECALL_NO_KW_TUPLE_1 127 -#define PRECALL_NO_KW_TYPE_1 141 -#define PRECALL_PYFUNC 143 -#define RESUME_QUICK 150 -#define STORE_ATTR_ADAPTIVE 153 -#define STORE_ATTR_INSTANCE_VALUE 154 -#define STORE_ATTR_SLOT 158 -#define STORE_ATTR_WITH_HINT 159 -#define STORE_FAST__LOAD_FAST 161 +#define CALL_BOUND_METHOD_EXACT_ARGS 26 +#define CALL_BUILTIN_CLASS 27 +#define CALL_BUILTIN_FAST_WITH_KEYWORDS 28 +#define CALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS 29 +#define CALL_NO_KW_BUILTIN_FAST 34 +#define CALL_NO_KW_BUILTIN_O 38 +#define CALL_NO_KW_ISINSTANCE 39 +#define CALL_NO_KW_LEN 40 +#define CALL_NO_KW_LIST_APPEND 41 +#define CALL_NO_KW_METHOD_DESCRIPTOR_FAST 42 +#define CALL_NO_KW_METHOD_DESCRIPTOR_NOARGS 43 +#define CALL_NO_KW_METHOD_DESCRIPTOR_O 44 +#define CALL_NO_KW_STR_1 45 +#define CALL_NO_KW_TUPLE_1 46 +#define CALL_NO_KW_TYPE_1 47 +#define COMPARE_OP_ADAPTIVE 48 +#define 
COMPARE_OP_FLOAT_JUMP 55 +#define COMPARE_OP_INT_JUMP 56 +#define COMPARE_OP_STR_JUMP 57 +#define EXTENDED_ARG_QUICK 58 +#define FOR_ITER_ADAPTIVE 59 +#define FOR_ITER_LIST 62 +#define FOR_ITER_RANGE 63 +#define JUMP_BACKWARD_QUICK 64 +#define LOAD_ATTR_ADAPTIVE 65 +#define LOAD_ATTR_CLASS 66 +#define LOAD_ATTR_INSTANCE_VALUE 67 +#define LOAD_ATTR_MODULE 72 +#define LOAD_ATTR_PROPERTY 73 +#define LOAD_ATTR_SLOT 76 +#define LOAD_ATTR_WITH_HINT 77 +#define LOAD_ATTR_METHOD_LAZY_DICT 78 +#define LOAD_ATTR_METHOD_NO_DICT 79 +#define LOAD_ATTR_METHOD_WITH_DICT 80 +#define LOAD_ATTR_METHOD_WITH_VALUES 81 +#define LOAD_CONST__LOAD_FAST 86 +#define LOAD_FAST__LOAD_CONST 113 +#define LOAD_FAST__LOAD_FAST 121 +#define LOAD_GLOBAL_ADAPTIVE 141 +#define LOAD_GLOBAL_BUILTIN 143 +#define LOAD_GLOBAL_MODULE 153 +#define RESUME_QUICK 154 +#define STORE_ATTR_ADAPTIVE 158 +#define STORE_ATTR_INSTANCE_VALUE 159 +#define STORE_ATTR_SLOT 160 +#define STORE_ATTR_WITH_HINT 161 +#define STORE_FAST__LOAD_FAST 166 #define STORE_FAST__STORE_FAST 167 #define STORE_SUBSCR_ADAPTIVE 168 #define STORE_SUBSCR_DICT 169 diff --git a/Include/py_curses.h b/Include/py_curses.h index b2c7f1bb4309c6..e46b08e9cc414e 100644 --- a/Include/py_curses.h +++ b/Include/py_curses.h @@ -64,7 +64,7 @@ typedef struct { char *encoding; } PyCursesWindowObject; -#define PyCursesWindow_Check(v) Py_IS_TYPE(v, &PyCursesWindow_Type) +#define PyCursesWindow_Check(v) Py_IS_TYPE((v), &PyCursesWindow_Type) #define PyCurses_CAPSULE_NAME "_curses._C_API" diff --git a/Include/pycapsule.h b/Include/pycapsule.h index fb5d503fea73f1..929a9a685259fb 100644 --- a/Include/pycapsule.h +++ b/Include/pycapsule.h @@ -22,7 +22,7 @@ PyAPI_DATA(PyTypeObject) PyCapsule_Type; typedef void (*PyCapsule_Destructor)(PyObject *); -#define PyCapsule_CheckExact(op) Py_IS_TYPE(op, &PyCapsule_Type) +#define PyCapsule_CheckExact(op) Py_IS_TYPE((op), &PyCapsule_Type) PyAPI_FUNC(PyObject *) PyCapsule_New( diff --git a/Include/pyerrors.h b/Include/pyerrors.h index 34e3de3328f410..d5ac6af5b32c6c 100644 --- a/Include/pyerrors.h +++ b/Include/pyerrors.h @@ -62,10 +62,10 @@ PyAPI_FUNC(void) PyException_SetContext(PyObject *, PyObject *); PyAPI_FUNC(const char *) PyExceptionClass_Name(PyObject *); -#define PyExceptionInstance_Class(x) ((PyObject*)Py_TYPE(x)) +#define PyExceptionInstance_Class(x) _PyObject_CAST(Py_TYPE(x)) #define _PyBaseExceptionGroup_Check(x) \ - PyObject_TypeCheck(x, (PyTypeObject *)PyExc_BaseExceptionGroup) + PyObject_TypeCheck((x), (PyTypeObject *)PyExc_BaseExceptionGroup) /* Predefined exceptions */ diff --git a/Include/pyframe.h b/Include/pyframe.h index feac16f69de698..13d52312ea966e 100644 --- a/Include/pyframe.h +++ b/Include/pyframe.h @@ -14,6 +14,12 @@ PyAPI_FUNC(int) PyFrame_GetLineNumber(PyFrameObject *); PyAPI_FUNC(PyCodeObject *) PyFrame_GetCode(PyFrameObject *frame); +#ifndef Py_LIMITED_API +# define Py_CPYTHON_PYFRAME_H +# include "cpython/pyframe.h" +# undef Py_CPYTHON_PYFRAME_H +#endif + #ifdef __cplusplus } #endif diff --git a/Include/pymacconfig.h b/Include/pymacconfig.h index 9dde11bd58e217..00459a03b980be 100644 --- a/Include/pymacconfig.h +++ b/Include/pymacconfig.h @@ -84,18 +84,6 @@ # define HAVE_GCC_ASM_FOR_X87 #endif - /* - * The definition in pyconfig.h is only valid on the OS release - * where configure ran on and not necessarily for all systems where - * the executable can be used on. - * - * Specifically: OSX 10.4 has limited supported for '%zd', while - * 10.5 has full support for '%zd'. 
A binary built on 10.5 won't - * work properly on 10.4 unless we suppress the definition - * of PY_FORMAT_SIZE_T - */ -#undef PY_FORMAT_SIZE_T - #endif /* defined(_APPLE__) */ diff --git a/Include/pymacro.h b/Include/pymacro.h index b959eeb3f58b35..e37cda44c5ebf1 100644 --- a/Include/pymacro.h +++ b/Include/pymacro.h @@ -12,8 +12,10 @@ // static_assert is defined in glibc from version 2.16. Before it requires // compiler support (gcc >= 4.6) and is called _Static_assert. +// In C++ 11 static_assert is a keyword, redefining is undefined behaviour. #if (defined(__GLIBC__) \ && (__GLIBC__ < 2 || (__GLIBC__ == 2 && __GLIBC_MINOR__ <= 16)) \ + && !(defined(__cplusplus) && __cplusplus >= 201103L) \ && !defined(static_assert)) # define static_assert _Static_assert #endif @@ -150,4 +152,9 @@ // For example, "int x; _Py_RVALUE(x) = 1;" fails with a compiler error. #define _Py_RVALUE(EXPR) ((void)0, (EXPR)) +// Return non-zero if the type is signed, return zero if it's unsigned. +// Use "<= 0" rather than "< 0" to prevent the compiler warning: +// "comparison of unsigned expression in '< 0' is always false". +#define _Py_IS_TYPE_SIGNED(type) ((type)(-1) <= 0) + #endif /* Py_PYMACRO_H */ diff --git a/Include/pymem.h b/Include/pymem.h index c15ad10dfcf831..e882645757bfd3 100644 --- a/Include/pymem.h +++ b/Include/pymem.h @@ -82,13 +82,13 @@ PyAPI_FUNC(void) PyMem_Free(void *ptr); // Deprecated aliases only kept for backward compatibility. // PyMem_Del and PyMem_DEL are defined with no parameter to be able to use // them as function pointers (ex: dealloc = PyMem_Del). -#define PyMem_MALLOC(n) PyMem_Malloc(n) -#define PyMem_NEW(type, n) PyMem_New(type, n) -#define PyMem_REALLOC(p, n) PyMem_Realloc(p, n) -#define PyMem_RESIZE(p, type, n) PyMem_Resize(p, type, n) -#define PyMem_FREE(p) PyMem_Free(p) -#define PyMem_Del PyMem_Free -#define PyMem_DEL PyMem_Free +#define PyMem_MALLOC(n) PyMem_Malloc((n)) +#define PyMem_NEW(type, n) PyMem_New(type, (n)) +#define PyMem_REALLOC(p, n) PyMem_Realloc((p), (n)) +#define PyMem_RESIZE(p, type, n) PyMem_Resize((p), type, (n)) +#define PyMem_FREE(p) PyMem_Free((p)) +#define PyMem_Del(p) PyMem_Free((p)) +#define PyMem_DEL(p) PyMem_Free((p)) #ifndef Py_LIMITED_API diff --git a/Include/pyport.h b/Include/pyport.h index 614a2789fb0781..313bc8d21c83fd 100644 --- a/Include/pyport.h +++ b/Include/pyport.h @@ -22,20 +22,59 @@ // _Py_CAST(PyObject*, op) can convert a "const PyObject*" to // "PyObject*". // -// The type argument must not be constant. For example, in C++, -// _Py_CAST(const PyObject*, expr) fails with a compiler error. +// The type argument must not be a constant type. 
#ifdef __cplusplus +#include <cstddef> # define _Py_STATIC_CAST(type, expr) static_cast<type>(expr) -# define _Py_CAST(type, expr) \ - const_cast<type>(reinterpret_cast<const type>(expr)) +extern "C++" { + namespace { + template <typename type> + inline type _Py_CAST_impl(long int ptr) { + return reinterpret_cast<type>(ptr); + } + template <typename type> + inline type _Py_CAST_impl(int ptr) { + return reinterpret_cast<type>(ptr); + } +#if __cplusplus >= 201103 + template <typename type> + inline type _Py_CAST_impl(std::nullptr_t) { + return static_cast<type>(nullptr); + } +#endif + + template <typename type, typename expr_type> + inline type _Py_CAST_impl(expr_type *expr) { + return reinterpret_cast<type>(expr); + } + + template <typename type, typename expr_type> + inline type _Py_CAST_impl(expr_type const *expr) { + return reinterpret_cast<type>(const_cast<expr_type *>(expr)); + } + + template <typename type, typename expr_type> + inline type _Py_CAST_impl(expr_type &expr) { + return static_cast<type>(expr); + } + + template <typename type, typename expr_type> + inline type _Py_CAST_impl(expr_type const &expr) { + return static_cast<type>(const_cast<expr_type &>(expr)); + } + } +} +# define _Py_CAST(type, expr) _Py_CAST_impl<type>(expr) + #else # define _Py_STATIC_CAST(type, expr) ((type)(expr)) # define _Py_CAST(type, expr) ((type)(expr)) #endif // Static inline functions should use _Py_NULL rather than using directly NULL -// to prevent C++ compiler warnings. In C++, _Py_NULL uses nullptr. -#ifdef __cplusplus +// to prevent C++ compiler warnings. On C++11 and newer, _Py_NULL is defined as +// nullptr. +#if defined(__cplusplus) && __cplusplus >= 201103 # define _Py_NULL nullptr #else # define _Py_NULL NULL #endif @@ -162,32 +201,10 @@ typedef Py_ssize_t Py_ssize_clean_t; /* Largest possible value of size_t. */ #define PY_SIZE_MAX SIZE_MAX -/* Macro kept for backward compatibility: use "z" in new code. - * - * PY_FORMAT_SIZE_T is a platform-specific modifier for use in a printf - * format to convert an argument with the width of a size_t or Py_ssize_t. - * C99 introduced "z" for this purpose, but old MSVCs had not supported it. - * Since MSVC supports "z" since (at least) 2015, we can just use "z" - * for new code. - * - * These "high level" Python format functions interpret "z" correctly on - * all platforms (Python interprets the format string itself, and does whatever - * the platform C requires to convert a size_t/Py_ssize_t argument): - * - * PyBytes_FromFormat - * PyErr_Format - * PyBytes_FromFormatV - * PyUnicode_FromFormatV - * - * Lower-level uses require that you interpolate the correct format modifier - * yourself (e.g., calling printf, fprintf, sprintf, PyOS_snprintf); for - * example, - * - * Py_ssize_t index; - * fprintf(stderr, "index %" PY_FORMAT_SIZE_T "d sucks\n", index); +/* Macro kept for backward compatibility: use directly "z" in new code. * - * That will expand to %zd or to something else correct for a Py_ssize_t on - * the platform. + * PY_FORMAT_SIZE_T is a modifier for use in a printf format to convert an + * argument with the width of a size_t or Py_ssize_t: "z" (C99). 
*/ #ifndef PY_FORMAT_SIZE_T # define PY_FORMAT_SIZE_T "z" diff --git a/Include/pystats.h b/Include/pystats.h new file mode 100644 index 00000000000000..87e92aa4f05fa6 --- /dev/null +++ b/Include/pystats.h @@ -0,0 +1,105 @@ + + +#ifndef Py_PYSTATS_H +#define Py_PYSTATS_H +#ifdef __cplusplus +extern "C" { +#endif + +#ifdef Py_STATS + +#define SPECIALIZATION_FAILURE_KINDS 32 + +/* Stats for determining who is calling PyEval_EvalFrame */ +#define EVAL_CALL_TOTAL 0 +#define EVAL_CALL_VECTOR 1 +#define EVAL_CALL_GENERATOR 2 +#define EVAL_CALL_LEGACY 3 +#define EVAL_CALL_FUNCTION_VECTORCALL 4 +#define EVAL_CALL_BUILD_CLASS 5 +#define EVAL_CALL_SLOT 6 +#define EVAL_CALL_FUNCTION_EX 7 +#define EVAL_CALL_API 8 +#define EVAL_CALL_METHOD 9 + +#define EVAL_CALL_KINDS 10 + +typedef struct _specialization_stats { + uint64_t success; + uint64_t failure; + uint64_t hit; + uint64_t deferred; + uint64_t miss; + uint64_t deopt; + uint64_t failure_kinds[SPECIALIZATION_FAILURE_KINDS]; +} SpecializationStats; + +typedef struct _opcode_stats { + SpecializationStats specialization; + uint64_t execution_count; + uint64_t pair_count[256]; +} OpcodeStats; + +typedef struct _call_stats { + uint64_t inlined_py_calls; + uint64_t pyeval_calls; + uint64_t frames_pushed; + uint64_t frame_objects_created; + uint64_t eval_calls[EVAL_CALL_KINDS]; +} CallStats; + +typedef struct _object_stats { + uint64_t increfs; + uint64_t decrefs; + uint64_t interpreter_increfs; + uint64_t interpreter_decrefs; + uint64_t allocations; + uint64_t allocations512; + uint64_t allocations4k; + uint64_t allocations_big; + uint64_t frees; + uint64_t to_freelist; + uint64_t from_freelist; + uint64_t new_values; + uint64_t dict_materialized_on_request; + uint64_t dict_materialized_new_key; + uint64_t dict_materialized_too_big; + uint64_t dict_materialized_str_subclass; +} ObjectStats; + +typedef struct _stats { + OpcodeStats opcode_stats[256]; + CallStats call_stats; + ObjectStats object_stats; +} PyStats; + + +PyAPI_DATA(PyStats) _py_stats_struct; +PyAPI_DATA(PyStats *) _py_stats; + +extern void _Py_StatsClear(void); +extern void _Py_PrintSpecializationStats(int to_file); + +#ifdef _PY_INTERPRETER + +#define _Py_INCREF_STAT_INC() do { if (_py_stats) _py_stats->object_stats.interpreter_increfs++; } while (0) +#define _Py_DECREF_STAT_INC() do { if (_py_stats) _py_stats->object_stats.interpreter_decrefs++; } while (0) + +#else + +#define _Py_INCREF_STAT_INC() do { if (_py_stats) _py_stats->object_stats.increfs++; } while (0) +#define _Py_DECREF_STAT_INC() do { if (_py_stats) _py_stats->object_stats.decrefs++; } while (0) + +#endif + +#else + +#define _Py_INCREF_STAT_INC() ((void)0) +#define _Py_DECREF_STAT_INC() ((void)0) + +#endif // !Py_STATS + +#ifdef __cplusplus +} +#endif +#endif /* !Py_PYSTATs_H */ diff --git a/Include/pythread.h b/Include/pythread.h index a48329085fac51..63714437c496b7 100644 --- a/Include/pythread.h +++ b/Include/pythread.h @@ -20,7 +20,9 @@ PyAPI_FUNC(unsigned long) PyThread_start_new_thread(void (*)(void *), void *); PyAPI_FUNC(void) _Py_NO_RETURN PyThread_exit_thread(void); PyAPI_FUNC(unsigned long) PyThread_get_thread_ident(void); -#if defined(__APPLE__) || defined(__linux__) || defined(__FreeBSD__) || defined(__OpenBSD__) || defined(__NetBSD__) || defined(_WIN32) || defined(_AIX) +#if (defined(__APPLE__) || defined(__linux__) || defined(_WIN32) \ + || defined(__FreeBSD__) || defined(__OpenBSD__) || defined(__NetBSD__) \ + || defined(__DragonFly__) || defined(_AIX)) #define PY_HAVE_THREAD_NATIVE_ID PyAPI_FUNC(unsigned long) 
PyThread_get_thread_native_id(void); #endif diff --git a/Include/rangeobject.h b/Include/rangeobject.h index d6af8473f9e8d3..d46ce7cd41b741 100644 --- a/Include/rangeobject.h +++ b/Include/rangeobject.h @@ -19,7 +19,7 @@ PyAPI_DATA(PyTypeObject) PyRange_Type; PyAPI_DATA(PyTypeObject) PyRangeIter_Type; PyAPI_DATA(PyTypeObject) PyLongRangeIter_Type; -#define PyRange_Check(op) Py_IS_TYPE(op, &PyRange_Type) +#define PyRange_Check(op) Py_IS_TYPE((op), &PyRange_Type) #ifdef __cplusplus } diff --git a/Include/setobject.h b/Include/setobject.h index fdad70644c545c..62c9e6b13f8901 100644 --- a/Include/setobject.h +++ b/Include/setobject.h @@ -20,21 +20,21 @@ PyAPI_FUNC(int) PySet_Discard(PyObject *set, PyObject *key); PyAPI_FUNC(PyObject *) PySet_Pop(PyObject *set); PyAPI_FUNC(Py_ssize_t) PySet_Size(PyObject *anyset); -#define PyFrozenSet_CheckExact(ob) Py_IS_TYPE(ob, &PyFrozenSet_Type) +#define PyFrozenSet_CheckExact(ob) Py_IS_TYPE((ob), &PyFrozenSet_Type) #define PyFrozenSet_Check(ob) \ - (Py_IS_TYPE(ob, &PyFrozenSet_Type) || \ + (Py_IS_TYPE((ob), &PyFrozenSet_Type) || \ PyType_IsSubtype(Py_TYPE(ob), &PyFrozenSet_Type)) #define PyAnySet_CheckExact(ob) \ - (Py_IS_TYPE(ob, &PySet_Type) || Py_IS_TYPE(ob, &PyFrozenSet_Type)) + (Py_IS_TYPE((ob), &PySet_Type) || Py_IS_TYPE((ob), &PyFrozenSet_Type)) #define PyAnySet_Check(ob) \ - (Py_IS_TYPE(ob, &PySet_Type) || Py_IS_TYPE(ob, &PyFrozenSet_Type) || \ + (Py_IS_TYPE((ob), &PySet_Type) || Py_IS_TYPE((ob), &PyFrozenSet_Type) || \ PyType_IsSubtype(Py_TYPE(ob), &PySet_Type) || \ PyType_IsSubtype(Py_TYPE(ob), &PyFrozenSet_Type)) #define PySet_CheckExact(op) Py_IS_TYPE(op, &PySet_Type) #define PySet_Check(ob) \ - (Py_IS_TYPE(ob, &PySet_Type) || \ + (Py_IS_TYPE((ob), &PySet_Type) || \ PyType_IsSubtype(Py_TYPE(ob), &PySet_Type)) #ifndef Py_LIMITED_API diff --git a/Include/sliceobject.h b/Include/sliceobject.h index 2c889508b4b495..c13863f27c2e63 100644 --- a/Include/sliceobject.h +++ b/Include/sliceobject.h @@ -28,7 +28,7 @@ typedef struct { PyAPI_DATA(PyTypeObject) PySlice_Type; PyAPI_DATA(PyTypeObject) PyEllipsis_Type; -#define PySlice_Check(op) Py_IS_TYPE(op, &PySlice_Type) +#define PySlice_Check(op) Py_IS_TYPE((op), &PySlice_Type) PyAPI_FUNC(PyObject *) PySlice_New(PyObject* start, PyObject* stop, PyObject* step); diff --git a/Include/structseq.h b/Include/structseq.h index 4f5c09f7ba05ef..96871155611958 100644 --- a/Include/structseq.h +++ b/Include/structseq.h @@ -35,9 +35,9 @@ PyAPI_FUNC(PyObject *) PyStructSequence_New(PyTypeObject* type); typedef PyTupleObject PyStructSequence; /* Macro, *only* to be used to fill in brand new objects */ -#define PyStructSequence_SET_ITEM(op, i, v) PyTuple_SET_ITEM(op, i, v) +#define PyStructSequence_SET_ITEM(op, i, v) PyTuple_SET_ITEM((op), (i), (v)) -#define PyStructSequence_GET_ITEM(op, i) PyTuple_GET_ITEM(op, i) +#define PyStructSequence_GET_ITEM(op, i) PyTuple_GET_ITEM((op), (i)) #endif PyAPI_FUNC(void) PyStructSequence_SetItem(PyObject*, Py_ssize_t, PyObject*); diff --git a/Include/traceback.h b/Include/traceback.h index 2dfa2ada4f2c37..2b40cc9fc32617 100644 --- a/Include/traceback.h +++ b/Include/traceback.h @@ -11,7 +11,7 @@ PyAPI_FUNC(int) PyTraceBack_Print(PyObject *, PyObject *); /* Reveal traceback type so we can typecheck traceback objects */ PyAPI_DATA(PyTypeObject) PyTraceBack_Type; -#define PyTraceBack_Check(v) Py_IS_TYPE(v, &PyTraceBack_Type) +#define PyTraceBack_Check(v) Py_IS_TYPE((v), &PyTraceBack_Type) #ifndef Py_LIMITED_API diff --git a/Include/tupleobject.h b/Include/tupleobject.h index 
dc68e3fc5c6d88..1f9ab54be65f87 100644 --- a/Include/tupleobject.h +++ b/Include/tupleobject.h @@ -25,7 +25,7 @@ PyAPI_DATA(PyTypeObject) PyTupleIter_Type; #define PyTuple_Check(op) \ PyType_FastSubclass(Py_TYPE(op), Py_TPFLAGS_TUPLE_SUBCLASS) -#define PyTuple_CheckExact(op) Py_IS_TYPE(op, &PyTuple_Type) +#define PyTuple_CheckExact(op) Py_IS_TYPE((op), &PyTuple_Type) PyAPI_FUNC(PyObject *) PyTuple_New(Py_ssize_t size); PyAPI_FUNC(Py_ssize_t) PyTuple_Size(PyObject *); diff --git a/Include/unicodeobject.h b/Include/unicodeobject.h index ed3e8d2c6cc99d..74474f5bb8f976 100644 --- a/Include/unicodeobject.h +++ b/Include/unicodeobject.h @@ -113,7 +113,7 @@ PyAPI_DATA(PyTypeObject) PyUnicodeIter_Type; #define PyUnicode_Check(op) \ PyType_FastSubclass(Py_TYPE(op), Py_TPFLAGS_UNICODE_SUBCLASS) -#define PyUnicode_CheckExact(op) Py_IS_TYPE(op, &PyUnicode_Type) +#define PyUnicode_CheckExact(op) Py_IS_TYPE((op), &PyUnicode_Type) /* --- Constants ---------------------------------------------------------- */ @@ -755,38 +755,22 @@ PyAPI_FUNC(int) PyUnicode_FSConverter(PyObject*, void*); PyAPI_FUNC(int) PyUnicode_FSDecoder(PyObject*, void*); -/* Decode a null-terminated string using Py_FileSystemDefaultEncoding - and the "surrogateescape" error handler. - - If Py_FileSystemDefaultEncoding is not set, fall back to the locale - encoding. - - Use PyUnicode_DecodeFSDefaultAndSize() if the string length is known. -*/ +/* Decode a null-terminated string from the Python filesystem encoding + and error handler. + If the string length is known, use PyUnicode_DecodeFSDefaultAndSize(). */ PyAPI_FUNC(PyObject*) PyUnicode_DecodeFSDefault( const char *s /* encoded string */ ); -/* Decode a string using Py_FileSystemDefaultEncoding - and the "surrogateescape" error handler. - - If Py_FileSystemDefaultEncoding is not set, fall back to the locale - encoding. -*/ - +/* Decode a string from the Python filesystem encoding and error handler. */ PyAPI_FUNC(PyObject*) PyUnicode_DecodeFSDefaultAndSize( const char *s, /* encoded string */ Py_ssize_t size /* size */ ); -/* Encode a Unicode object to Py_FileSystemDefaultEncoding with the - "surrogateescape" error handler, and return bytes. - - If Py_FileSystemDefaultEncoding is not set, fall back to the locale - encoding. -*/ - +/* Encode a Unicode object to the Python filesystem encoding and error handler. + Return bytes. 
*/ PyAPI_FUNC(PyObject*) PyUnicode_EncodeFSDefault( PyObject *unicode ); diff --git a/Include/weakrefobject.h b/Include/weakrefobject.h index f071e9c759a641..8e1fa1b9286a77 100644 --- a/Include/weakrefobject.h +++ b/Include/weakrefobject.h @@ -12,12 +12,12 @@ PyAPI_DATA(PyTypeObject) _PyWeakref_RefType; PyAPI_DATA(PyTypeObject) _PyWeakref_ProxyType; PyAPI_DATA(PyTypeObject) _PyWeakref_CallableProxyType; -#define PyWeakref_CheckRef(op) PyObject_TypeCheck(op, &_PyWeakref_RefType) +#define PyWeakref_CheckRef(op) PyObject_TypeCheck((op), &_PyWeakref_RefType) #define PyWeakref_CheckRefExact(op) \ - Py_IS_TYPE(op, &_PyWeakref_RefType) + Py_IS_TYPE((op), &_PyWeakref_RefType) #define PyWeakref_CheckProxy(op) \ - (Py_IS_TYPE(op, &_PyWeakref_ProxyType) || \ - Py_IS_TYPE(op, &_PyWeakref_CallableProxyType)) + (Py_IS_TYPE((op), &_PyWeakref_ProxyType) \ + || Py_IS_TYPE((op), &_PyWeakref_CallableProxyType)) #define PyWeakref_Check(op) \ (PyWeakref_CheckRef(op) || PyWeakref_CheckProxy(op)) diff --git a/Lib/_pyio.py b/Lib/_pyio.py index 0f647eed99d81b..12510784c8b926 100644 --- a/Lib/_pyio.py +++ b/Lib/_pyio.py @@ -303,22 +303,6 @@ def _open_code_with_warning(path): open_code = _open_code_with_warning -def __getattr__(name): - if name == "OpenWrapper": - # bpo-43680: Until Python 3.9, _pyio.open was not a static method and - # builtins.open was set to OpenWrapper to not become a bound method - # when set to a class variable. _io.open is a built-in function whereas - # _pyio.open is a Python function. In Python 3.10, _pyio.open() is now - # a static method, and builtins.open() is now io.open(). - import warnings - warnings.warn('OpenWrapper is deprecated, use open instead', - DeprecationWarning, stacklevel=2) - global OpenWrapper - OpenWrapper = open - return OpenWrapper - raise AttributeError(f"module {__name__!r} has no attribute {name!r}") - - # In normal operation, both `UnsupportedOperation`s should be bound to the # same object. 
try: @@ -2022,13 +2006,7 @@ def __init__(self, buffer, encoding=None, errors=None, newline=None, encoding = text_encoding(encoding) if encoding == "locale": - try: - import locale - except ImportError: - # Importing locale may fail if Python is being built - encoding = "utf-8" - else: - encoding = locale.getencoding() + encoding = self._get_locale_encoding() if not isinstance(encoding, str): raise ValueError("invalid encoding: %r" % encoding) @@ -2162,7 +2140,7 @@ def reconfigure(self, *, if not isinstance(encoding, str): raise TypeError("invalid encoding: %r" % encoding) if encoding == "locale": - encoding = locale.getencoding() + encoding = self._get_locale_encoding() if newline is Ellipsis: newline = self._readnl @@ -2267,6 +2245,15 @@ def _get_decoded_chars(self, n=None): self._decoded_chars_used += len(chars) return chars + def _get_locale_encoding(self): + try: + import locale + except ImportError: + # Importing locale may fail if Python is being built + return "utf-8" + else: + return locale.getencoding() + def _rewind_decoded_chars(self, n): """Rewind the _decoded_chars buffer.""" if self._decoded_chars_used < n: diff --git a/Lib/argparse.py b/Lib/argparse.py index 1c5520c4b41bd1..02e98bbf920cf1 100644 --- a/Lib/argparse.py +++ b/Lib/argparse.py @@ -2161,7 +2161,9 @@ def _read_args_from_files(self, arg_strings): # replace arguments referencing files with the file content else: try: - with open(arg_string[1:]) as args_file: + with open(arg_string[1:], + encoding=_sys.getfilesystemencoding(), + errors=_sys.getfilesystemencodeerrors()) as args_file: arg_strings = [] for arg_line in args_file.read().splitlines(): for arg in self.convert_arg_line_to_args(arg_line): diff --git a/Lib/ast.py b/Lib/ast.py index e81e28044bc6e4..4e2ae859245f99 100644 --- a/Lib/ast.py +++ b/Lib/ast.py @@ -1335,7 +1335,11 @@ def write_item(item): ) def visit_Tuple(self, node): - with self.require_parens(_Precedence.TUPLE, node): + with self.delimit_if( + "(", + ")", + len(node.elts) == 0 or self.get_precedence(node) > _Precedence.TUPLE + ): self.items_view(self.traverse, node.elts) unop = {"Invert": "~", "Not": "not", "UAdd": "+", "USub": "-"} diff --git a/Lib/asyncio/base_futures.py b/Lib/asyncio/base_futures.py index cd811a788cb6df..7987963bd99349 100644 --- a/Lib/asyncio/base_futures.py +++ b/Lib/asyncio/base_futures.py @@ -1,7 +1,6 @@ __all__ = () import reprlib -from _thread import get_ident from . import format_helpers diff --git a/Lib/asyncio/coroutines.py b/Lib/asyncio/coroutines.py index 0e4b489f30fd7c..7fda0e449d500a 100644 --- a/Lib/asyncio/coroutines.py +++ b/Lib/asyncio/coroutines.py @@ -4,7 +4,6 @@ import inspect import os import sys -import traceback import types diff --git a/Lib/asyncio/locks.py b/Lib/asyncio/locks.py index e71130274dd6f3..42177f1c079948 100644 --- a/Lib/asyncio/locks.py +++ b/Lib/asyncio/locks.py @@ -8,7 +8,6 @@ from . import exceptions from . import mixins -from . 
import tasks class _ContextManagerMixin: async def __aenter__(self): diff --git a/Lib/asyncio/proactor_events.py b/Lib/asyncio/proactor_events.py index 9636c6b4d28fad..ddb9daca026936 100644 --- a/Lib/asyncio/proactor_events.py +++ b/Lib/asyncio/proactor_events.py @@ -113,7 +113,7 @@ def close(self): def __del__(self, _warn=warnings.warn): if self._sock is not None: _warn(f"unclosed transport {self!r}", ResourceWarning, source=self) - self.close() + self._sock.close() def _fatal_error(self, exc, message='Fatal error on pipe transport'): try: diff --git a/Lib/asyncio/runners.py b/Lib/asyncio/runners.py index 065691bed99285..f524c3b8e424f5 100644 --- a/Lib/asyncio/runners.py +++ b/Lib/asyncio/runners.py @@ -5,7 +5,6 @@ import functools import threading import signal -import sys from . import coroutines from . import events from . import exceptions diff --git a/Lib/asyncio/streams.py b/Lib/asyncio/streams.py index a568c4e4b295f0..9a169035de8865 100644 --- a/Lib/asyncio/streams.py +++ b/Lib/asyncio/streams.py @@ -4,7 +4,6 @@ import socket import sys -import warnings import weakref if hasattr(socket, 'AF_UNIX'): diff --git a/Lib/asyncio/taskgroups.py b/Lib/asyncio/taskgroups.py index 6af21f3a15d93a..9e0610deed281f 100644 --- a/Lib/asyncio/taskgroups.py +++ b/Lib/asyncio/taskgroups.py @@ -3,8 +3,6 @@ __all__ = ["TaskGroup"] -import weakref - from . import events from . import exceptions from . import tasks @@ -19,8 +17,7 @@ def __init__(self): self._loop = None self._parent_task = None self._parent_cancel_requested = False - self._tasks = weakref.WeakSet() - self._unfinished_tasks = 0 + self._tasks = set() self._errors = [] self._base_error = None self._on_completed_fut = None @@ -29,8 +26,6 @@ def __repr__(self): info = [''] if self._tasks: info.append(f'tasks={len(self._tasks)}') - if self._unfinished_tasks: - info.append(f'unfinished={self._unfinished_tasks}') if self._errors: info.append(f'errors={len(self._errors)}') if self._aborting: @@ -93,7 +88,7 @@ async def __aexit__(self, et, exc, tb): # can be cancelled multiple times if our parent task # is being cancelled repeatedly (or even once, when # our own cancellation is already in progress) - while self._unfinished_tasks: + while self._tasks: if self._on_completed_fut is None: self._on_completed_fut = self._loop.create_future() @@ -114,7 +109,7 @@ async def __aexit__(self, et, exc, tb): self._on_completed_fut = None - assert self._unfinished_tasks == 0 + assert not self._tasks if self._base_error is not None: raise self._base_error @@ -141,7 +136,7 @@ async def __aexit__(self, et, exc, tb): def create_task(self, coro, *, name=None, context=None): if not self._entered: raise RuntimeError(f"TaskGroup {self!r} has not been entered") - if self._exiting and self._unfinished_tasks == 0: + if self._exiting and not self._tasks: raise RuntimeError(f"TaskGroup {self!r} is finished") if context is None: task = self._loop.create_task(coro) @@ -149,7 +144,6 @@ def create_task(self, coro, *, name=None, context=None): task = self._loop.create_task(coro, context=context) tasks._set_task_name(task, name) task.add_done_callback(self._on_task_done) - self._unfinished_tasks += 1 self._tasks.add(task) return task @@ -169,10 +163,9 @@ def _abort(self): t.cancel() def _on_task_done(self, task): - self._unfinished_tasks -= 1 - assert self._unfinished_tasks >= 0 + self._tasks.discard(task) - if self._on_completed_fut is not None and not self._unfinished_tasks: + if self._on_completed_fut is not None and not self._tasks: if not self._on_completed_fut.done(): 
self._on_completed_fut.set_result(True) diff --git a/Lib/configparser.py b/Lib/configparser.py index de9ee537ac1884..f98e6fb01f97cc 100644 --- a/Lib/configparser.py +++ b/Lib/configparser.py @@ -148,14 +148,14 @@ import sys import warnings -__all__ = ["NoSectionError", "DuplicateOptionError", "DuplicateSectionError", +__all__ = ("NoSectionError", "DuplicateOptionError", "DuplicateSectionError", "NoOptionError", "InterpolationError", "InterpolationDepthError", "InterpolationMissingOptionError", "InterpolationSyntaxError", "ParsingError", "MissingSectionHeaderError", - "ConfigParser", "SafeConfigParser", "RawConfigParser", + "ConfigParser", "RawConfigParser", "Interpolation", "BasicInterpolation", "ExtendedInterpolation", "LegacyInterpolation", "SectionProxy", "ConverterMapping", - "DEFAULTSECT", "MAX_INTERPOLATION_DEPTH"] + "DEFAULTSECT", "MAX_INTERPOLATION_DEPTH") _default_dict = dict DEFAULTSECT = "DEFAULT" @@ -297,41 +297,12 @@ def __init__(self, option, section, rawval): class ParsingError(Error): """Raised when a configuration file does not follow legal syntax.""" - def __init__(self, source=None, filename=None): - # Exactly one of `source'/`filename' arguments has to be given. - # `filename' kept for compatibility. - if filename and source: - raise ValueError("Cannot specify both `filename' and `source'. " - "Use `source'.") - elif not filename and not source: - raise ValueError("Required argument `source' not given.") - elif filename: - source = filename - Error.__init__(self, 'Source contains parsing errors: %r' % source) + def __init__(self, source): + super().__init__(f'Source contains parsing errors: {source!r}') self.source = source self.errors = [] self.args = (source, ) - @property - def filename(self): - """Deprecated, use `source'.""" - warnings.warn( - "The 'filename' attribute will be removed in Python 3.12. " - "Use 'source' instead.", - DeprecationWarning, stacklevel=2 - ) - return self.source - - @filename.setter - def filename(self, value): - """Deprecated, user `source'.""" - warnings.warn( - "The 'filename' attribute will be removed in Python 3.12. " - "Use 'source' instead.", - DeprecationWarning, stacklevel=2 - ) - self.source = value - def append(self, lineno, line): self.errors.append((lineno, line)) self.message += '\n\t[line %2d]: %s' % (lineno, line) @@ -768,15 +739,6 @@ def read_dict(self, dictionary, source=''): elements_added.add((section, key)) self.set(section, key, value) - def readfp(self, fp, filename=None): - """Deprecated, use read_file instead.""" - warnings.warn( - "This method will be removed in Python 3.12. " - "Use 'parser.read_file()' instead.", - DeprecationWarning, stacklevel=2 - ) - self.read_file(fp, source=filename) - def get(self, section, option, *, raw=False, vars=None, fallback=_UNSET): """Get an option value for a given section. @@ -1239,19 +1201,6 @@ def _read_defaults(self, defaults): self._interpolation = hold_interpolation -class SafeConfigParser(ConfigParser): - """ConfigParser alias for backwards compatibility purposes.""" - - def __init__(self, *args, **kwargs): - super().__init__(*args, **kwargs) - warnings.warn( - "The SafeConfigParser class has been renamed to ConfigParser " - "in Python 3.2. This alias will be removed in Python 3.12." 
- " Use ConfigParser directly instead.", - DeprecationWarning, stacklevel=2 - ) - - class SectionProxy(MutableMapping): """A proxy for a single section from a parser.""" diff --git a/Lib/copy.py b/Lib/copy.py index 69bac980be2054..1b276afe08121e 100644 --- a/Lib/copy.py +++ b/Lib/copy.py @@ -258,7 +258,7 @@ def _keep_alive(x, memo): def _reconstruct(x, memo, func, args, state=None, listiter=None, dictiter=None, - deepcopy=deepcopy): + *, deepcopy=deepcopy): deep = memo is not None if deep and args: args = (deepcopy(arg, memo) for arg in args) diff --git a/Lib/ctypes/test/__main__.py b/Lib/ctypes/test/__main__.py deleted file mode 100644 index 362a9ec8cff6c0..00000000000000 --- a/Lib/ctypes/test/__main__.py +++ /dev/null @@ -1,4 +0,0 @@ -from ctypes.test import load_tests -import unittest - -unittest.main() diff --git a/Lib/dataclasses.py b/Lib/dataclasses.py index 4645ebfa71e71c..69cab8c563ea98 100644 --- a/Lib/dataclasses.py +++ b/Lib/dataclasses.py @@ -230,7 +230,7 @@ def __init__(self, type): self.type = type def __repr__(self): - if isinstance(self.type, type) and not isinstance(self.type, GenericAlias): + if isinstance(self.type, type): type_name = self.type.__name__ else: # typing objects, e.g. List[int] @@ -1156,11 +1156,16 @@ def _add_slots(cls, is_frozen, weakref_slot): itertools.chain.from_iterable(map(_get_slots, cls.__mro__[1:-1])) ) # The slots for our class. Remove slots from our base classes. Add - # '__weakref__' if weakref_slot was given. + # '__weakref__' if weakref_slot was given, unless it is already present. cls_dict["__slots__"] = tuple( - itertools.chain( - itertools.filterfalse(inherited_slots.__contains__, field_names), - ("__weakref__",) if weakref_slot else ()) + itertools.filterfalse( + inherited_slots.__contains__, + itertools.chain( + # gh-93521: '__weakref__' also needs to be filtered out if + # already present in inherited_slots + field_names, ('__weakref__',) if weakref_slot else () + ) + ), ) for field_name in field_names: @@ -1243,7 +1248,7 @@ def _is_dataclass_instance(obj): def is_dataclass(obj): """Returns True if obj is a dataclass or an instance of a dataclass.""" - cls = obj if isinstance(obj, type) and not isinstance(obj, GenericAlias) else type(obj) + cls = obj if isinstance(obj, type) else type(obj) return hasattr(cls, _FIELDS) diff --git a/Lib/dis.py b/Lib/dis.py index 046013120b000d..bd87de97fc9ce6 100644 --- a/Lib/dis.py +++ b/Lib/dis.py @@ -37,6 +37,8 @@ LOAD_GLOBAL = opmap['LOAD_GLOBAL'] BINARY_OP = opmap['BINARY_OP'] JUMP_BACKWARD = opmap['JUMP_BACKWARD'] +FOR_ITER = opmap['FOR_ITER'] +LOAD_ATTR = opmap['LOAD_ATTR'] CACHE = opmap["CACHE"] @@ -463,6 +465,10 @@ def _get_instructions_bytes(code, varname_from_oparg=None, argval, argrepr = _get_name_info(arg//2, get_name) if (arg & 1) and argrepr: argrepr = "NULL + " + argrepr + elif deop == LOAD_ATTR: + argval, argrepr = _get_name_info(arg//2, get_name) + if (arg & 1) and argrepr: + argrepr = "NULL|self + " + argrepr else: argval, argrepr = _get_name_info(arg, get_name) elif deop in hasjabs: @@ -471,6 +477,8 @@ def _get_instructions_bytes(code, varname_from_oparg=None, elif deop in hasjrel: signed_arg = -arg if _is_backward_jump(deop) else arg argval = offset + 2 + signed_arg*2 + if deop == FOR_ITER: + argval += 2 argrepr = "to " + repr(argval) elif deop in haslocal or deop in hasfree: argval, argrepr = _get_name_info(arg, varname_from_oparg) @@ -492,17 +500,29 @@ def _get_instructions_bytes(code, varname_from_oparg=None, yield Instruction(_all_opname[op], op, arg, argval, argrepr, offset, 
starts_line, is_jump_target, positions) - if show_caches and _inline_cache_entries[deop]: - for name, caches in _cache_format[opname[deop]].items(): - data = code[offset + 2: offset + 2 + caches * 2] - argrepr = f"{name}: {int.from_bytes(data, sys.byteorder)}" - for _ in range(caches): - offset += 2 - yield Instruction( - "CACHE", 0, 0, None, argrepr, offset, None, False, None - ) - # Only show the actual value for the first cache entry: + caches = _inline_cache_entries[deop] + if not caches: + continue + if not show_caches: + # We still need to advance the co_positions iterator: + for _ in range(caches): + next(co_positions, ()) + continue + for name, size in _cache_format[opname[deop]].items(): + for i in range(size): + offset += 2 + # Only show the fancy argrepr for a CACHE instruction when it's + # the first entry for a particular cache value and the + # instruction using it is actually quickened: + if i == 0 and op != deop: + data = code[offset: offset + 2 * size] + argrepr = f"{name}: {int.from_bytes(data, sys.byteorder)}" + else: argrepr = "" + yield Instruction( + "CACHE", CACHE, 0, None, argrepr, offset, None, False, + Positions(*next(co_positions, ())) + ) def disassemble(co, lasti=-1, *, file=None, show_caches=False, adaptive=False): """Disassemble a code object.""" @@ -592,7 +612,7 @@ def _unpack_opargs(code): caches = _inline_cache_entries[deop] if deop >= HAVE_ARGUMENT: arg = code[i+1] | extended_arg - extended_arg = (arg << 8) if op == EXTENDED_ARG else 0 + extended_arg = (arg << 8) if deop == EXTENDED_ARG else 0 # The oparg is stored as a signed integer # If the value exceeds its upper limit, it will overflow and wrap # to a negative integer @@ -612,11 +632,14 @@ def findlabels(code): labels = [] for offset, op, arg in _unpack_opargs(code): if arg is not None: - if op in hasjrel: - if _is_backward_jump(op): + deop = _deoptop(op) + if deop in hasjrel: + if _is_backward_jump(deop): arg = -arg label = offset + 2 + arg*2 - elif op in hasjabs: + if deop == FOR_ITER: + label += 2 + elif deop in hasjabs: label = arg*2 else: continue diff --git a/Lib/distutils/sysconfig.py b/Lib/distutils/sysconfig.py index 3414a761e76b99..041877512051b4 100644 --- a/Lib/distutils/sysconfig.py +++ b/Lib/distutils/sysconfig.py @@ -9,7 +9,6 @@ Email: """ -import _imp import os import re import sys diff --git a/Lib/distutils/tests/test_bdist.py b/Lib/distutils/tests/test_bdist.py index 241fc9ad75f34b..5676f7f34d4292 100644 --- a/Lib/distutils/tests/test_bdist.py +++ b/Lib/distutils/tests/test_bdist.py @@ -1,5 +1,4 @@ """Tests for distutils.command.bdist.""" -import os import unittest from test.support import run_unittest diff --git a/Lib/distutils/tests/test_build.py b/Lib/distutils/tests/test_build.py index 83a9e4f4dd2f43..71b5e164bae14a 100644 --- a/Lib/distutils/tests/test_build.py +++ b/Lib/distutils/tests/test_build.py @@ -12,6 +12,7 @@ class BuildTestCase(support.TempdirManager, support.LoggingSilencer, unittest.TestCase): + @unittest.skipUnless(sys.executable, "test requires sys.executable") def test_finalize_options(self): pkg_dir, dist = self.create_dist() cmd = build(dist) diff --git a/Lib/distutils/tests/test_sysconfig.py b/Lib/distutils/tests/test_sysconfig.py index e7d435f412db79..270e91a98d0c4d 100644 --- a/Lib/distutils/tests/test_sysconfig.py +++ b/Lib/distutils/tests/test_sysconfig.py @@ -10,9 +10,8 @@ from distutils import sysconfig from distutils.ccompiler import get_default_compiler from distutils.tests import support -from test.support import run_unittest, swap_item, 
requires_subprocess +from test.support import run_unittest, swap_item, requires_subprocess, is_wasi from test.support.os_helper import TESTFN -from test.support.warnings_helper import check_warnings class SysconfigTestCase(support.EnvironGuard, unittest.TestCase): @@ -32,6 +31,7 @@ def cleanup_testfn(self): elif os.path.isdir(TESTFN): shutil.rmtree(TESTFN) + @unittest.skipIf(is_wasi, "Incompatible with WASI mapdir and OOT builds") def test_get_config_h_filename(self): config_h = sysconfig.get_config_h_filename() self.assertTrue(os.path.isfile(config_h), config_h) diff --git a/Lib/doctest.py b/Lib/doctest.py index ed94d15c0e2da4..b2ef2ce63672eb 100644 --- a/Lib/doctest.py +++ b/Lib/doctest.py @@ -1085,19 +1085,21 @@ def _get_test(self, obj, name, module, globs, source_lines): def _find_lineno(self, obj, source_lines): """ - Return a line number of the given object's docstring. Note: - this method assumes that the object has a docstring. + Return a line number of the given object's docstring. + + Returns `None` if the given object does not have a docstring. """ lineno = None + docstring = getattr(obj, '__doc__', None) # Find the line number for modules. - if inspect.ismodule(obj): + if inspect.ismodule(obj) and docstring is not None: lineno = 0 # Find the line number for classes. # Note: this could be fooled if a class is defined multiple # times in a single file. - if inspect.isclass(obj): + if inspect.isclass(obj) and docstring is not None: if source_lines is None: return None pat = re.compile(r'^\s*class\s*%s\b' % @@ -1109,7 +1111,9 @@ def _find_lineno(self, obj, source_lines): # Find the line number for functions & methods. if inspect.ismethod(obj): obj = obj.__func__ - if inspect.isfunction(obj): obj = obj.__code__ + if inspect.isfunction(obj) and getattr(obj, '__doc__', None): + # We don't use `docstring` var here, because `obj` can be changed. + obj = obj.__code__ if inspect.istraceback(obj): obj = obj.tb_frame if inspect.isframe(obj): obj = obj.f_code if inspect.iscode(obj): diff --git a/Lib/email/_header_value_parser.py b/Lib/email/_header_value_parser.py index 8a8fb8bc42a954..e637e6df06612d 100644 --- a/Lib/email/_header_value_parser.py +++ b/Lib/email/_header_value_parser.py @@ -2379,7 +2379,7 @@ def get_section(value): digits += value[0] value = value[1:] if digits[0] == '0' and digits != '0': - section.defects.append(errors.InvalidHeaderError( + section.defects.append(errors.InvalidHeaderDefect( "section number has an invalid leading 0")) section.number = int(digits) section.append(ValueTerminal(digits, 'digits')) diff --git a/Lib/enum.py b/Lib/enum.py index 1df38a6f7e5f50..15c8d880167103 100644 --- a/Lib/enum.py +++ b/Lib/enum.py @@ -480,8 +480,9 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k # check for illegal enum names (any others?) 
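# Hedged illustration of the doctest change above, calling the private helper
# directly: the line-number lookup now returns None for an object that has no
# docstring instead of assuming one exists.
import doctest

def documented():
    """>>> 1 + 1
    2
    """

def undocumented():
    pass

finder = doctest.DocTestFinder()
print(finder._find_lineno(documented, None))     # an int: the code's first line
print(finder._find_lineno(undocumented, None))   # None after this change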
invalid_names = set(member_names) & {'mro', ''} if invalid_names: - raise ValueError('invalid enum member name(s) '.format( - ','.join(repr(n) for n in invalid_names))) + raise ValueError('invalid enum member name(s) %s' % ( + ','.join(repr(n) for n in invalid_names) + )) # # adjust the sunders _order_ = classdict.pop('_order_', None) @@ -535,111 +536,6 @@ def __new__(metacls, cls, bases, classdict, *, boundary=None, _simple=False, **k # update classdict with any changes made by __init_subclass__ classdict.update(enum_class.__dict__) # - # create a default docstring if one has not been provided - if enum_class.__doc__ is None: - if not member_names: - enum_class.__doc__ = classdict['__doc__'] = _dedent("""\ - Create a collection of name/value pairs. - - Example enumeration: - - >>> class Color(Enum): - ... RED = 1 - ... BLUE = 2 - ... GREEN = 3 - - Access them by: - - - attribute access:: - - >>> Color.RED - - - - value lookup: - - >>> Color(1) - - - - name lookup: - - >>> Color['RED'] - - - Enumerations can be iterated over, and know how many members they have: - - >>> len(Color) - 3 - - >>> list(Color) - [, , ] - - Methods can be added to enumerations, and members can have their own - attributes -- see the documentation for details. - """) - else: - member = list(enum_class)[0] - enum_length = len(enum_class) - cls_name = enum_class.__name__ - if enum_length == 1: - list_line = 'list(%s)' % cls_name - list_repr = '[<%s.%s: %r>]' % (cls_name, member.name, member.value) - elif enum_length == 2: - member2 = list(enum_class)[1] - list_line = 'list(%s)' % cls_name - list_repr = '[<%s.%s: %r>, <%s.%s: %r>]' % ( - cls_name, member.name, member.value, - cls_name, member2.name, member2.value, - ) - else: - member2 = list(enum_class)[1] - member3 = list(enum_class)[2] - list_line = 'list(%s)%s' % (cls_name, ('','[:3]')[enum_length > 3]) - list_repr = '[<%s.%s: %r>, <%s.%s: %r>, <%s.%s: %r>]' % ( - cls_name, member.name, member.value, - cls_name, member2.name, member2.value, - cls_name, member3.name, member3.value, - ) - enum_class.__doc__ = classdict['__doc__'] = _dedent("""\ - A collection of name/value pairs. - - Access them by: - - - attribute access:: - - >>> %s.%s - <%s.%s: %r> - - - value lookup: - - >>> %s(%r) - <%s.%s: %r> - - - name lookup: - - >>> %s[%r] - <%s.%s: %r> - - Enumerations can be iterated over, and know how many members they have: - - >>> len(%s) - %r - - >>> %s - %s - - Methods can be added to enumerations, and members can have their own - attributes -- see the documentation for details. - """ - % (cls_name, member.name, - cls_name, member.name, member.value, - cls_name, member.value, - cls_name, member.name, member.value, - cls_name, member.name, - cls_name, member.name, member.value, - cls_name, enum_length, - list_line, list_repr, - )) - # # double check that repr and friends are not the mixin's or various # things break (such as pickle) # however, if the method is defined in the Enum itself, don't replace @@ -798,26 +694,16 @@ def __call__(cls, value, names=None, *, module=None, qualname=None, type=None, s boundary=boundary, ) - def __contains__(cls, member): - """ - Return True if member is a member of this enum - raises TypeError if member is not an enum member + def __contains__(cls, value): + """Return True if `value` is in `cls`. 
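# Quick check (illustrative, not from the patch) of the message fix above: the
# offending names now actually appear in the ValueError text instead of being
# dropped by the old str.format() call that had no placeholder.
from enum import Enum

try:
    Enum('Bad', {'mro': 1})
except ValueError as exc:
    print(exc)        # invalid enum member name(s) 'mro'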
- note: in 3.12 TypeError will no longer be raised, and True will also be - returned if member is the value of a member in this enum + `value` is in `cls` if: + 1) `value` is a member of `cls`, or + 2) `value` is the value of one of the `cls`'s members. """ - if not isinstance(member, Enum): - import warnings - warnings.warn( - "in 3.12 __contains__ will no longer raise TypeError, but will return True or\n" - "False depending on whether the value is a member or the value of a member", - DeprecationWarning, - stacklevel=2, - ) - raise TypeError( - "unsupported operand type(s) for 'in': '%s' and '%s'" % ( - type(member).__qualname__, cls.__class__.__qualname__)) - return isinstance(member, cls) and member._name_ in cls._member_map_ + if isinstance(value, cls): + return True + return value in cls._value2member_map_ or value in cls._unhashable_values_ def __delattr__(cls, attr): # nicer error message when someone tries to delete an attribute @@ -1221,14 +1107,32 @@ def _generate_next_value_(name, start, count, last_values): name: the name of the member start: the initial start value or None count: the number of existing members - last_value: the last value assigned or None + last_values: the list of values assigned """ - for last_value in reversed(last_values): - try: - return last_value + 1 - except TypeError: - pass - else: + if not last_values: + return start + try: + last = last_values[-1] + last_values.sort() + if last == last_values[-1]: + # no difference between old and new methods + return last + 1 + else: + # trigger old method (with warning) + raise TypeError + except TypeError: + import warnings + warnings.warn( + "In 3.13 the default `auto()`/`_generate_next_value_` will require all values to be sortable and support adding +1\n" + "and the value returned will be the largest value in the enum incremented by 1", + DeprecationWarning, + stacklevel=3, + ) + for v in last_values: + try: + return v + 1 + except TypeError: + pass return start @classmethod @@ -1236,7 +1140,7 @@ def _missing_(cls, value): return None def __repr__(self): - v_repr = self.__class__._value_repr_ or self._value_.__class__.__repr__ + v_repr = self.__class__._value_repr_ or repr return "<%s.%s: %s>" % (self.__class__.__name__, self._name_, v_repr(self._value_)) def __str__(self): @@ -1368,6 +1272,9 @@ class Flag(Enum, boundary=STRICT): Support for flags """ + def __reduce_ex__(self, proto): + return self.__class__, (self._value_, ) + _numeric_repr_ = repr def _generate_next_value_(name, start, count, last_values): @@ -1377,7 +1284,7 @@ def _generate_next_value_(name, start, count, last_values): name: the name of the member start: the initial start value or None count: the number of existing members - last_value: the last value assigned or None + last_values: the last value assigned or None """ if not count: return start if start is not None else 1 @@ -1508,7 +1415,7 @@ def __len__(self): def __repr__(self): cls_name = self.__class__.__name__ - v_repr = self.__class__._value_repr_ or self._value_.__class__.__repr__ + v_repr = self.__class__._value_repr_ or repr if self._name_ is None: return "<%s: %s>" % (cls_name, v_repr(self._value_)) else: @@ -1571,7 +1478,7 @@ def __invert__(self): __rxor__ = __xor__ -class IntFlag(int, ReprEnum, Flag, boundary=EJECT): +class IntFlag(int, ReprEnum, Flag, boundary=KEEP): """ Support for integer-based Flags """ @@ -1640,6 +1547,7 @@ def global_str(self): use enum_name instead of class.enum_name """ if self._name_ is None: + cls_name = self.__class__.__name__ return "%s(%r)" % 
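# Sketch of the EnumType.__contains__ behaviour defined above: membership
# tests now accept raw values as well as members instead of raising TypeError.
from enum import Enum

class Color(Enum):
    RED = 1
    BLUE = 2

assert Color.RED in Color     # member check, as before
assert 1 in Color             # value check: True here, TypeError in 3.11
assert 3 not in Color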
(cls_name, self._value_) else: return self._name_ diff --git a/Lib/filecmp.py b/Lib/filecmp.py index 70a4b23c982205..30bd900fa805aa 100644 --- a/Lib/filecmp.py +++ b/Lib/filecmp.py @@ -157,17 +157,17 @@ def phase2(self): # Distinguish files, directories, funnies a_path = os.path.join(self.left, x) b_path = os.path.join(self.right, x) - ok = 1 + ok = True try: a_stat = os.stat(a_path) except OSError: # print('Can\'t stat', a_path, ':', why.args[1]) - ok = 0 + ok = False try: b_stat = os.stat(b_path) except OSError: # print('Can\'t stat', b_path, ':', why.args[1]) - ok = 0 + ok = False if ok: a_type = stat.S_IFMT(a_stat.st_mode) @@ -242,7 +242,7 @@ def report_full_closure(self): # Report on self and subdirs recursively methodmap = dict(subdirs=phase4, same_files=phase3, diff_files=phase3, funny_files=phase3, - common_dirs = phase2, common_files=phase2, common_funny=phase2, + common_dirs=phase2, common_files=phase2, common_funny=phase2, common=phase1, left_only=phase1, right_only=phase1, left_list=phase0, right_list=phase0) diff --git a/Lib/fnmatch.py b/Lib/fnmatch.py index 0f5a41ac06f3e6..d5e296f7748c1c 100644 --- a/Lib/fnmatch.py +++ b/Lib/fnmatch.py @@ -102,7 +102,7 @@ def translate(pat): add('\\[') else: stuff = pat[i:j] - if '--' not in stuff: + if '-' not in stuff: stuff = stuff.replace('\\', r'\\') else: chunks = [] @@ -114,7 +114,16 @@ def translate(pat): chunks.append(pat[i:k]) i = k+1 k = k+3 - chunks.append(pat[i:j]) + chunk = pat[i:j] + if chunk: + chunks.append(chunk) + else: + chunks[-1] += '-' + # Remove empty ranges -- invalid in RE. + for k in range(len(chunks)-1, 0, -1): + if chunks[k-1][-1] > chunks[k][0]: + chunks[k-1] = chunks[k-1][:-1] + chunks[k][1:] + del chunks[k] # Escape backslashes and hyphens for set difference (--). # Hyphens that create ranges shouldn't be escaped. stuff = '-'.join(s.replace('\\', r'\\').replace('-', r'\-') @@ -122,11 +131,18 @@ def translate(pat): # Escape set operations (&&, ~~ and ||). stuff = re.sub(r'([&~|])', r'\\\1', stuff) i = j+1 - if stuff[0] == '!': - stuff = '^' + stuff[1:] - elif stuff[0] in ('^', '['): - stuff = '\\' + stuff - add(f'[{stuff}]') + if not stuff: + # Empty range: never match. + add('(?!)') + elif stuff == '!': + # Negated empty range: match any character. + add('.') + else: + if stuff[0] == '!': + stuff = '^' + stuff[1:] + elif stuff[0] in ('^', '['): + stuff = '\\' + stuff + add(f'[{stuff}]') else: add(re.escape(c)) assert i == n diff --git a/Lib/fractions.py b/Lib/fractions.py index f9ac882ec002fa..738a0d4c301d43 100644 --- a/Lib/fractions.py +++ b/Lib/fractions.py @@ -245,14 +245,16 @@ def limit_denominator(self, max_denominator=1000000): break p0, q0, p1, q1 = p1, q1, p0+a*p1, q2 n, d = d, n-a*d - k = (max_denominator-q0)//q1 - bound1 = Fraction(p0+k*p1, q0+k*q1) - bound2 = Fraction(p1, q1) - if abs(bound2 - self) <= abs(bound1-self): - return bound2 + + # Determine which of the candidates (p0+k*p1)/(q0+k*q1) and p1/q1 is + # closer to self. The distance between them is 1/(q1*(q0+k*q1)), while + # the distance from p1/q1 to self is d/(q1*self._denominator). So we + # need to compare 2*(q0+k*q1) with self._denominator/d. 
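# Sketch of the gh-89973 behaviour implemented above: a reversed character
# range such as [c-a] is treated as an empty range instead of raising re.error
# when the pattern is translated.
import fnmatch

print(fnmatch.translate('[c-a]*'))              # compiles; the set never matches
assert not fnmatch.fnmatch('b.txt', '[c-a]*')
assert fnmatch.fnmatch('x.txt', '[!c-a]*')      # negated empty range matches any char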
+ if 2*d*(q0+k*q1) <= self._denominator: + return Fraction(p1, q1, _normalize=False) else: - return bound1 + return Fraction(p0+k*p1, q0+k*q1, _normalize=False) @property def numerator(a): diff --git a/Lib/functools.py b/Lib/functools.py index cd5666dfa71fd0..43ead512e1ea4e 100644 --- a/Lib/functools.py +++ b/Lib/functools.py @@ -843,12 +843,11 @@ def _is_union_type(cls): return get_origin(cls) in {Union, types.UnionType} def _is_valid_dispatch_type(cls): - if isinstance(cls, type) and not isinstance(cls, GenericAlias): + if isinstance(cls, type): return True from typing import get_args return (_is_union_type(cls) and - all(isinstance(arg, type) and not isinstance(arg, GenericAlias) - for arg in get_args(cls))) + all(isinstance(arg, type) for arg in get_args(cls))) def register(cls, func=None): """generic_func.register(cls, func) -> func diff --git a/Lib/gzip.py b/Lib/gzip.py index 5b20e5ba698ee9..8edcda4493c894 100644 --- a/Lib/gzip.py +++ b/Lib/gzip.py @@ -212,14 +212,6 @@ def __init__(self, filename=None, mode=None, if self.mode == WRITE: self._write_gzip_header(compresslevel) - @property - def filename(self): - import warnings - warnings.warn("use the name attribute", DeprecationWarning, 2) - if self.mode == WRITE and self.name[-3:] != ".gz": - return self.name + ".gz" - return self.name - @property def mtime(self): """Last modification time read from stream, or None""" diff --git a/Lib/html/entities.py b/Lib/html/entities.py index dc508631ac4789..cc59bc314499ad 100644 --- a/Lib/html/entities.py +++ b/Lib/html/entities.py @@ -3,8 +3,7 @@ __all__ = ['html5', 'name2codepoint', 'codepoint2name', 'entitydefs'] -# maps the HTML entity name to the Unicode code point -# from https://html.spec.whatwg.org/multipage/named-characters.html +# maps HTML4 entity name to the Unicode code point name2codepoint = { 'AElig': 0x00c6, # latin capital letter AE = latin capital ligature AE, U+00C6 ISOlat1 'Aacute': 0x00c1, # latin capital letter A with acute, U+00C1 ISOlat1 @@ -261,7 +260,11 @@ } -# maps the HTML5 named character references to the equivalent Unicode character(s) +# HTML5 named character references +# Generated by 'Tools/scripts/parse_html5_entities.py' +# from https://html.spec.whatwg.org/entities.json and +# https://html.spec.whatwg.org/multipage/named-characters.html. +# Map HTML5 named character references to the equivalent Unicode character(s). html5 = { 'Aacute': '\xc1', 'aacute': '\xe1', diff --git a/Lib/http/client.py b/Lib/http/client.py index f54172fd0deeae..4bef50e6474d49 100644 --- a/Lib/http/client.py +++ b/Lib/http/client.py @@ -786,6 +786,20 @@ def getcode(self): ''' return self.status + +def _create_https_context(http_version): + # Function also used by urllib.request to be able to set the check_hostname + # attribute on a context object. 
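# Worked example for the limit_denominator() comparison above (the values are
# the standard documented ones): 2*d*(q0+k*q1) <= self._denominator selects
# whichever continued-fraction candidate lies closer to the original value.
import math
from fractions import Fraction

assert Fraction(math.pi).limit_denominator(10) == Fraction(22, 7)
assert Fraction(math.pi).limit_denominator(1000) == Fraction(355, 113)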
+ context = ssl._create_default_https_context() + # send ALPN extension to indicate HTTP/1.1 protocol + if http_version == 11: + context.set_alpn_protocols(['http/1.1']) + # enable PHA for TLS 1.3 connections if available + if context.post_handshake_auth is not None: + context.post_handshake_auth = True + return context + + class HTTPConnection: _http_vsn = 11 @@ -1418,19 +1432,9 @@ def __init__(self, host, port=None, key_file=None, cert_file=None, self.key_file = key_file self.cert_file = cert_file if context is None: - context = ssl._create_default_https_context() - # send ALPN extension to indicate HTTP/1.1 protocol - if self._http_vsn == 11: - context.set_alpn_protocols(['http/1.1']) - # enable PHA for TLS 1.3 connections if available - if context.post_handshake_auth is not None: - context.post_handshake_auth = True - will_verify = context.verify_mode != ssl.CERT_NONE - if check_hostname is None: - check_hostname = context.check_hostname - if check_hostname and not will_verify: - raise ValueError("check_hostname needs a SSL context with " - "either CERT_OPTIONAL or CERT_REQUIRED") + context = _create_https_context(self._http_vsn) + if check_hostname is not None: + context.check_hostname = check_hostname if key_file or cert_file: context.load_cert_chain(cert_file, key_file) # cert and key file means the user wants to authenticate. @@ -1438,8 +1442,6 @@ def __init__(self, host, port=None, key_file=None, cert_file=None, if context.post_handshake_auth is not None: context.post_handshake_auth = True self._context = context - if check_hostname is not None: - self._context.check_hostname = check_hostname def connect(self): "Connect to a host on a given (SSL) port." diff --git a/Lib/http/cookiejar.py b/Lib/http/cookiejar.py index f19a366496a21a..c514e0d382cbc7 100644 --- a/Lib/http/cookiejar.py +++ b/Lib/http/cookiejar.py @@ -1890,7 +1890,7 @@ def save(self, filename=None, ignore_discard=False, ignore_expires=False): if self.filename is not None: filename = self.filename else: raise ValueError(MISSING_FILENAME_TEXT) - with open(filename, "w") as f: + with os.fdopen(os.open(filename, os.O_CREAT | os.O_WRONLY, 0o600), 'w') as f: # There really isn't an LWP Cookies 2.0 format, but this indicates # that there is extra information in here (domain_dot and # port_spec) while still being compatible with libwww-perl, I hope. @@ -2086,7 +2086,7 @@ def save(self, filename=None, ignore_discard=False, ignore_expires=False): if self.filename is not None: filename = self.filename else: raise ValueError(MISSING_FILENAME_TEXT) - with open(filename, "w") as f: + with os.fdopen(os.open(filename, os.O_CREAT | os.O_WRONLY, 0o600), 'w') as f: f.write(NETSCAPE_HEADER_TEXT) now = time.time() for cookie in self: diff --git a/Lib/http/server.py b/Lib/http/server.py index d115dfc162bb14..8aee31bac2752a 100644 --- a/Lib/http/server.py +++ b/Lib/http/server.py @@ -329,6 +329,13 @@ def parse_request(self): return False self.command, self.path = command, path + # gh-87389: The purpose of replacing '//' with '/' is to protect + # against open redirect attacks possibly triggered if the path starts + # with '//' because http clients treat //path as an absolute URI + # without scheme (similar to http://path) rather than a path. + if self.path.startswith('//'): + self.path = '/' + self.path.lstrip('/') # Reduce to a single / + # Examine the headers and look for a Connection directive. 
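# Hedged sketch of the gh-79096 change above: cookie files written by
# LWPCookieJar.save() / MozillaCookieJar.save() are now created with 0o600
# (owner read/write only); this has no effect on Windows. The path is a
# throwaway temporary file.
import os, stat, tempfile
from http.cookiejar import LWPCookieJar

path = os.path.join(tempfile.mkdtemp(), "cookies.txt")
jar = LWPCookieJar(path)
jar.save()
print(oct(stat.S_IMODE(os.stat(path).st_mode)))   # typically 0o600 on POSIX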
try: self.headers = http.client.parse_headers(self.rfile, @@ -635,6 +642,7 @@ class SimpleHTTPRequestHandler(BaseHTTPRequestHandler): """ + index_pages = ["index.html", "index.htm"] server_version = "SimpleHTTP/" + __version__ extensions_map = _encodings_map_default = { '.gz': 'application/gzip', @@ -643,9 +651,11 @@ class SimpleHTTPRequestHandler(BaseHTTPRequestHandler): '.xz': 'application/x-xz', } - def __init__(self, *args, directory=None, **kwargs): + def __init__(self, *args, directory=None, index_pages=None, **kwargs): if directory is None: directory = os.getcwd() + if index_pages is not None: + self.index_pages = index_pages self.directory = os.fspath(directory) super().__init__(*args, **kwargs) @@ -689,7 +699,7 @@ def send_head(self): self.send_header("Content-Length", "0") self.end_headers() return None - for index in "index.html", "index.htm": + for index in self.index_pages: index = os.path.join(path, index) if os.path.exists(index): path = index diff --git a/Lib/idlelib/configdialog.py b/Lib/idlelib/configdialog.py index d5748a64a798b7..57eaffeef39625 100644 --- a/Lib/idlelib/configdialog.py +++ b/Lib/idlelib/configdialog.py @@ -11,7 +11,7 @@ """ import re -from tkinter import (Toplevel, Listbox, Scale, Canvas, +from tkinter import (Toplevel, Listbox, Canvas, StringVar, BooleanVar, IntVar, TRUE, FALSE, TOP, BOTTOM, RIGHT, LEFT, SOLID, GROOVE, NONE, BOTH, X, Y, W, E, EW, NS, NSEW, NW, diff --git a/Lib/idlelib/idle_test/test_debugger_r.py b/Lib/idlelib/idle_test/test_debugger_r.py index 638ebd36a7405d..cf8af05fe27e77 100644 --- a/Lib/idlelib/idle_test/test_debugger_r.py +++ b/Lib/idlelib/idle_test/test_debugger_r.py @@ -2,28 +2,21 @@ from idlelib import debugger_r import unittest -from test.support import requires -from tkinter import Tk - - -class Test(unittest.TestCase): +# Boilerplate likely to be needed for future test classes. +##from test.support import requires +##from tkinter import Tk +##class Test(unittest.TestCase): ## @classmethod ## def setUpClass(cls): ## requires('gui') ## cls.root = Tk() -## ## @classmethod ## def tearDownClass(cls): ## cls.root.destroy() -## del cls.root - - def test_init(self): - self.assertTrue(True) # Get coverage of import - -# Classes GUIProxy, IdbAdapter, FrameProxy, CodeProxy, DictProxy, -# GUIAdapter, IdbProxy plus 7 module functions. +# GUIProxy, IdbAdapter, FrameProxy, CodeProxy, DictProxy, +# GUIAdapter, IdbProxy, and 7 functions still need tests. class IdbAdapterTest(unittest.TestCase): diff --git a/Lib/idlelib/idle_test/test_editor.py b/Lib/idlelib/idle_test/test_editor.py index 8665d680c0118f..fdb47abf43fb77 100644 --- a/Lib/idlelib/idle_test/test_editor.py +++ b/Lib/idlelib/idle_test/test_editor.py @@ -5,7 +5,6 @@ from collections import namedtuple from test.support import requires from tkinter import Tk -from idlelib.idle_test.mock_idle import Func Editor = editor.EditorWindow diff --git a/Lib/idlelib/idle_test/test_parenmatch.py b/Lib/idlelib/idle_test/test_parenmatch.py index 4a41d8433d5483..2e10d7cd36760f 100644 --- a/Lib/idlelib/idle_test/test_parenmatch.py +++ b/Lib/idlelib/idle_test/test_parenmatch.py @@ -83,12 +83,12 @@ def test_paren_corner(self): """ Test corner cases in flash_paren_event and paren_closed_event. - These cases force conditional expression and alternate paths. + Force execution of conditional expressions and alternate paths. 
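# Sketch of the new index_pages hook shown above; the file names and port are
# illustrative. A per-handler list of index pages can be supplied, e.g. via
# functools.partial.
import functools
from http.server import HTTPServer, SimpleHTTPRequestHandler

Handler = functools.partial(
    SimpleHTTPRequestHandler,
    directory=".",
    index_pages=["default.html", "index.html"],   # checked in this order
)
# HTTPServer(("127.0.0.1", 8000), Handler).serve_forever()   # uncomment to serve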
""" text = self.text pm = self.get_parenmatch() - text.insert('insert', '# this is a commen)') + text.insert('insert', '# Comment.)') pm.paren_closed_event('event') text.insert('insert', '\ndef') diff --git a/Lib/idlelib/iomenu.py b/Lib/idlelib/iomenu.py index ad3109df84096a..327e885203c3ca 100644 --- a/Lib/idlelib/iomenu.py +++ b/Lib/idlelib/iomenu.py @@ -9,18 +9,12 @@ from tkinter import messagebox from tkinter.simpledialog import askstring -import idlelib from idlelib.config import idleConf from idlelib.util import py_extensions py_extensions = ' '.join("*"+ext for ext in py_extensions) - encoding = 'utf-8' -if sys.platform == 'win32': - errors = 'surrogatepass' -else: - errors = 'surrogateescape' - +errors = 'surrogatepass' if sys.platform == 'win32' else 'surrogateescape' class IOBinding: diff --git a/Lib/idlelib/util.py b/Lib/idlelib/util.py index 5480219786bca6..ede670a4db5536 100644 --- a/Lib/idlelib/util.py +++ b/Lib/idlelib/util.py @@ -12,7 +12,6 @@ * std streams (pyshell, run), * warning stuff (pyshell, run). """ -from os import path # .pyw is for Windows; .pyi is for stub files. py_extensions = ('.py', '.pyw', '.pyi') # Order needed for open/save dialogs. diff --git a/Lib/importlib/_bootstrap_external.py b/Lib/importlib/_bootstrap_external.py index 5f67226adfc522..5c72049ffd4a52 100644 --- a/Lib/importlib/_bootstrap_external.py +++ b/Lib/importlib/_bootstrap_external.py @@ -402,9 +402,15 @@ def _write_atomic(path, data, mode=0o666): # add JUMP_BACKWARD_NO_INTERRUPT, make JUMP_NO_INTERRUPT virtual) # Python 3.11a7 3492 (make POP_JUMP_IF_NONE/NOT_NONE/TRUE/FALSE relative) # Python 3.11a7 3493 (Make JUMP_IF_TRUE_OR_POP/JUMP_IF_FALSE_OR_POP relative) -# Python 3.11a7 3494 (New location info table) -# Python 3.12 will start with magic number 3500 +# Python 3.11a7 3494 (New location info table) +# Python 3.12a1 3500 (Remove PRECALL opcode) +# Python 3.12a1 3501 (YIELD_VALUE oparg == stack_depth) +# Python 3.12a1 3502 (LOAD_FAST_CHECK, no NULL-check in LOAD_FAST) +# Python 3.12a1 3503 (Shrink LOAD_METHOD cache) +# Python 3.12a1 3504 (Merge LOAD_METHOD back into LOAD_ATTR) +# Python 3.12a1 3505 (Specialization/Cache for FOR_ITER) +# Python 3.13 will start with 3550 # # MAGIC must change whenever the bytecode emitted by the compiler may no @@ -416,7 +422,7 @@ def _write_atomic(path, data, mode=0o666): # Whenever MAGIC_NUMBER is changed, the ranges in the magic_values array # in PC/launcher.c must also be updated. -MAGIC_NUMBER = (3494).to_bytes(2, 'little') + b'\r\n' +MAGIC_NUMBER = (3505).to_bytes(2, 'little') + b'\r\n' _RAW_MAGIC_NUMBER = int.from_bytes(MAGIC_NUMBER, 'little') # For import.c @@ -1394,7 +1400,9 @@ def invalidate_caches(): """Call the invalidate_caches() method on all path entry finders stored in sys.path_importer_caches (where implemented).""" for name, finder in list(sys.path_importer_cache.items()): - if finder is None: + # Drop entry if finder name is a relative path. The current + # working directory may have changed. + if finder is None or not _path_isabs(name): del sys.path_importer_cache[name] elif hasattr(finder, 'invalidate_caches'): finder.invalidate_caches() @@ -1562,9 +1570,12 @@ def __init__(self, path, *loader_details): loaders.extend((suffix, loader) for suffix in suffixes) self._loaders = loaders # Base (directory) path - self.path = path or '.' 
- if not _path_isabs(self.path): - self.path = _path_join(_os.getcwd(), self.path) + if not path or path == '.': + self.path = _os.getcwd() + elif not _path_isabs(path): + self.path = _path_join(_os.getcwd(), path) + else: + self.path = path self._path_mtime = -1 self._path_cache = set() self._relaxed_path_cache = set() diff --git a/Lib/importlib/metadata/__init__.py b/Lib/importlib/metadata/__init__.py index 9ceae8a0e04b5a..b01de145c33671 100644 --- a/Lib/importlib/metadata/__init__.py +++ b/Lib/importlib/metadata/__init__.py @@ -543,7 +543,7 @@ def locate_file(self, path): """ @classmethod - def from_name(cls, name): + def from_name(cls, name: str): """Return the Distribution for the given package name. :param name: The name of the distribution package to search for. @@ -551,13 +551,13 @@ def from_name(cls, name): package, if found. :raises PackageNotFoundError: When the named package's distribution metadata cannot be found. + :raises ValueError: When an invalid value is supplied for name. """ - for resolver in cls._discover_resolvers(): - dists = resolver(DistributionFinder.Context(name=name)) - dist = next(iter(dists), None) - if dist is not None: - return dist - else: + if not name: + raise ValueError("A distribution name is required.") + try: + return next(cls.discover(name=name)) + except StopIteration: raise PackageNotFoundError(name) @classmethod @@ -945,13 +945,26 @@ def _normalized_name(self): normalized name from the file system path. """ stem = os.path.basename(str(self._path)) - return self._name_from_stem(stem) or super()._normalized_name + return ( + pass_none(Prepared.normalize)(self._name_from_stem(stem)) + or super()._normalized_name + ) - def _name_from_stem(self, stem): - name, ext = os.path.splitext(stem) + @staticmethod + def _name_from_stem(stem): + """ + >>> PathDistribution._name_from_stem('foo-3.0.egg-info') + 'foo' + >>> PathDistribution._name_from_stem('CherryPy-3.0.dist-info') + 'CherryPy' + >>> PathDistribution._name_from_stem('face.egg-info') + 'face' + >>> PathDistribution._name_from_stem('foo.bar') + """ + filename, ext = os.path.splitext(stem) if ext not in ('.dist-info', '.egg-info'): return - name, sep, rest = stem.partition('-') + name, sep, rest = filename.partition('-') return name @@ -991,6 +1004,15 @@ def version(distribution_name): return distribution(distribution_name).version +_unique = functools.partial( + unique_everseen, + key=operator.attrgetter('_normalized_name'), +) +""" +Wrapper for ``distributions`` to return unique distributions by name. +""" + + def entry_points(**params) -> Union[EntryPoints, SelectableGroups]: """Return EntryPoint objects for all installed packages. @@ -1008,10 +1030,8 @@ def entry_points(**params) -> Union[EntryPoints, SelectableGroups]: :return: EntryPoints or SelectableGroups for all installed packages. 
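# Hedged sketch of the importlib.metadata change above: from_name() now
# rejects an empty name with ValueError and is expressed via discover().
# The package name below is hypothetical.
from importlib.metadata import Distribution, PackageNotFoundError

try:
    Distribution.from_name("")
except ValueError as exc:
    print("rejected:", exc)

try:
    Distribution.from_name("surely-not-installed-xyz")
except PackageNotFoundError as exc:
    print("not found:", exc)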
""" - norm_name = operator.attrgetter('_normalized_name') - unique = functools.partial(unique_everseen, key=norm_name) eps = itertools.chain.from_iterable( - dist.entry_points for dist in unique(distributions()) + dist.entry_points for dist in _unique(distributions()) ) return SelectableGroups.load(eps).select(**params) diff --git a/Lib/importlib/resources/_common.py b/Lib/importlib/resources/_common.py index 147ea19188f718..ca1fa8ab2fe6ff 100644 --- a/Lib/importlib/resources/_common.py +++ b/Lib/importlib/resources/_common.py @@ -67,7 +67,10 @@ def from_package(package): @contextlib.contextmanager -def _tempfile(reader, suffix=''): +def _tempfile(reader, suffix='', + # gh-93353: Keep a reference to call os.remove() in late Python + # finalization. + *, _os_remove=os.remove): # Not using tempfile.NamedTemporaryFile as it leads to deeper 'try' # blocks due to the need to close the temporary file to work on Windows # properly. @@ -81,7 +84,7 @@ def _tempfile(reader, suffix=''): yield pathlib.Path(raw_path) finally: try: - os.remove(raw_path) + _os_remove(raw_path) except FileNotFoundError: pass diff --git a/Lib/io.py b/Lib/io.py index a205e00575f7e8..50ce97436ac1d1 100644 --- a/Lib/io.py +++ b/Lib/io.py @@ -57,22 +57,6 @@ IncrementalNewlineDecoder, text_encoding, TextIOWrapper) -def __getattr__(name): - if name == "OpenWrapper": - # bpo-43680: Until Python 3.9, _pyio.open was not a static method and - # builtins.open was set to OpenWrapper to not become a bound method - # when set to a class variable. _io.open is a built-in function whereas - # _pyio.open is a Python function. In Python 3.10, _pyio.open() is now - # a static method, and builtins.open() is now io.open(). - import warnings - warnings.warn('OpenWrapper is deprecated, use open instead', - DeprecationWarning, stacklevel=2) - global OpenWrapper - OpenWrapper = open - return OpenWrapper - raise AttributeError("module {__name__!r} has no attribute {name!r}") - - # Pretend this exception was created here. UnsupportedOperation.__module__ = "io" diff --git a/Lib/locale.py b/Lib/locale.py index 25eb75ac65a32b..7a7694e1bfb71c 100644 --- a/Lib/locale.py +++ b/Lib/locale.py @@ -633,7 +633,17 @@ def resetlocale(category=LC_ALL): getdefaultlocale(). category defaults to LC_ALL. """ - _setlocale(category, _build_localename(getdefaultlocale())) + import warnings + warnings.warn( + 'Use locale.setlocale(locale.LC_ALL, "") instead', + DeprecationWarning, stacklevel=2 + ) + + with warnings.catch_warnings(): + warnings.simplefilter('ignore', category=DeprecationWarning) + loc = getdefaultlocale() + + _setlocale(category, _build_localename(loc)) try: diff --git a/Lib/logging/__init__.py b/Lib/logging/__init__.py index 79e4b7d09bfa43..88953fefb74476 100644 --- a/Lib/logging/__init__.py +++ b/Lib/logging/__init__.py @@ -1,4 +1,4 @@ -# Copyright 2001-2019 by Vinay Sajip. All Rights Reserved. +# Copyright 2001-2022 by Vinay Sajip. All Rights Reserved. # # Permission to use, copy, modify, and distribute this software and its # documentation for any purpose and without fee is hereby granted, @@ -18,7 +18,7 @@ Logging package for Python. Based on PEP 282 and comments thereto in comp.lang.python. -Copyright (C) 2001-2019 Vinay Sajip. All Rights Reserved. +Copyright (C) 2001-2022 Vinay Sajip. All Rights Reserved. To use, simply 'import logging' and log away! 
""" @@ -38,7 +38,8 @@ 'exception', 'fatal', 'getLevelName', 'getLogger', 'getLoggerClass', 'info', 'log', 'makeLogRecord', 'setLoggerClass', 'shutdown', 'warn', 'warning', 'getLogRecordFactory', 'setLogRecordFactory', - 'lastResort', 'raiseExceptions', 'getLevelNamesMapping'] + 'lastResort', 'raiseExceptions', 'getLevelNamesMapping', + 'getHandlerByName', 'getHandlerNames'] import threading @@ -64,20 +65,25 @@ raiseExceptions = True # -# If you don't want threading information in the log, set this to zero +# If you don't want threading information in the log, set this to False # logThreads = True # -# If you don't want multiprocessing information in the log, set this to zero +# If you don't want multiprocessing information in the log, set this to False # logMultiprocessing = True # -# If you don't want process information in the log, set this to zero +# If you don't want process information in the log, set this to False # logProcesses = True +# +# If you don't want asyncio task information in the log, set this to False +# +logAsyncioTasks = True + #--------------------------------------------------------------------------- # Level related stuff #--------------------------------------------------------------------------- @@ -361,6 +367,15 @@ def __init__(self, name, level, pathname, lineno, else: self.process = None + self.taskName = None + if logAsyncioTasks: + asyncio = sys.modules.get('asyncio') + if asyncio: + try: + self.taskName = asyncio.current_task().get_name() + except Exception: + pass + def __repr__(self): return ''%(self.name, self.levelno, self.pathname, self.lineno, self.msg) @@ -566,6 +581,7 @@ class Formatter(object): (typically at application startup time) %(thread)d Thread ID (if available) %(threadName)s Thread name (if available) + %(taskName)s Task name (if available) %(process)d Process ID (if available) %(message)s The result of record.getMessage(), computed just as the record is emitted @@ -817,23 +833,36 @@ def filter(self, record): Determine if a record is loggable by consulting all the filters. The default is to allow the record to be logged; any filter can veto - this and the record is then dropped. Returns a zero value if a record - is to be dropped, else non-zero. + this by returning a falsy value. + If a filter attached to a handler returns a log record instance, + then that instance is used in place of the original log record in + any further processing of the event by that handler. + If a filter returns any other truthy value, the original log record + is used in any further processing of the event by that handler. + + If none of the filters return falsy values, this method returns + a log record. + If any of the filters return a falsy value, this method returns + a falsy value. .. versionchanged:: 3.2 Allow filters to be just callables. + + .. versionchanged:: 3.12 + Allow filters to return a LogRecord instead of + modifying it in place. """ - rv = True for f in self.filters: if hasattr(f, 'filter'): result = f.filter(record) else: result = f(record) # assume callable - will raise if not if not result: - rv = False - break - return rv + return False + if isinstance(result, LogRecord): + record = result + return record #--------------------------------------------------------------------------- # Handler classes and functions @@ -870,6 +899,23 @@ def _addHandlerRef(handler): finally: _releaseLock() + +def getHandlerByName(name): + """ + Get a handler with the specified *name*, or None if there isn't one with + that name. 
+ """ + return _handlers.get(name) + + +def getHandlerNames(): + """ + Return all known handler names as an immutable set. + """ + result = set(_handlers.keys()) + return frozenset(result) + + class Handler(Filterer): """ Handler instances dispatch logging events to specific destinations. @@ -968,10 +1014,14 @@ def handle(self, record): Emission depends on filters which may have been added to the handler. Wrap the actual emission of the record with acquisition/release of - the I/O thread lock. Returns whether the filter passed the record for - emission. + the I/O thread lock. + + Returns an instance of the log record that was emitted + if it passed all filters, otherwise a falsy value is returned. """ rv = self.filter(record) + if isinstance(rv, LogRecord): + record = rv if rv: self.acquire() try: @@ -1483,7 +1533,7 @@ def info(self, msg, *args, **kwargs): To pass exception information, use the keyword argument exc_info with a true value, e.g. - logger.info("Houston, we have a %s", "interesting problem", exc_info=1) + logger.info("Houston, we have a %s", "notable problem", exc_info=1) """ if self.isEnabledFor(INFO): self._log(INFO, msg, args, **kwargs) @@ -1640,8 +1690,14 @@ def handle(self, record): This method is used for unpickled records received from a socket, as well as those created locally. Logger-level filtering is applied. """ - if (not self.disabled) and self.filter(record): - self.callHandlers(record) + if self.disabled: + return + maybe_record = self.filter(record) + if not maybe_record: + return + if isinstance(maybe_record, LogRecord): + record = maybe_record + self.callHandlers(record) def addHandler(self, hdlr): """ diff --git a/Lib/logging/config.py b/Lib/logging/config.py index 86a1e4eaf4cbc9..2b9d90c3ed5b79 100644 --- a/Lib/logging/config.py +++ b/Lib/logging/config.py @@ -1,4 +1,4 @@ -# Copyright 2001-2019 by Vinay Sajip. All Rights Reserved. +# Copyright 2001-2022 by Vinay Sajip. All Rights Reserved. # # Permission to use, copy, modify, and distribute this software and its # documentation for any purpose and without fee is hereby granted, @@ -19,15 +19,17 @@ is based on PEP 282 and comments thereto in comp.lang.python, and influenced by Apache's log4j system. -Copyright (C) 2001-2019 Vinay Sajip. All Rights Reserved. +Copyright (C) 2001-2022 Vinay Sajip. All Rights Reserved. To use, simply 'import logging' and log away! 
""" import errno +import functools import io import logging import logging.handlers +import queue import re import struct import threading @@ -563,7 +565,7 @@ def configure(self): handler.name = name handlers[name] = handler except Exception as e: - if 'target not configured yet' in str(e.__cause__): + if ' not configured yet' in str(e.__cause__): deferred.append(name) else: raise ValueError('Unable to configure handler ' @@ -702,6 +704,21 @@ def add_filters(self, filterer, filters): except Exception as e: raise ValueError('Unable to add filter %r' % f) from e + def _configure_queue_handler(self, klass, **kwargs): + if 'queue' in kwargs: + q = kwargs['queue'] + else: + q = queue.Queue() # unbounded + rhl = kwargs.get('respect_handler_level', False) + if 'listener' in kwargs: + lklass = kwargs['listener'] + else: + lklass = logging.handlers.QueueListener + listener = lklass(q, *kwargs['handlers'], respect_handler_level=rhl) + handler = klass(q) + handler.listener = listener + return handler + def configure_handler(self, config): """Configure a handler from a dictionary.""" config_copy = dict(config) # for restoring in case of error @@ -721,26 +738,83 @@ def configure_handler(self, config): factory = c else: cname = config.pop('class') - klass = self.resolve(cname) - #Special case for handler which refers to another handler + if callable(cname): + klass = cname + else: + klass = self.resolve(cname) if issubclass(klass, logging.handlers.MemoryHandler) and\ 'target' in config: + # Special case for handler which refers to another handler try: - th = self.config['handlers'][config['target']] + tn = config['target'] + th = self.config['handlers'][tn] if not isinstance(th, logging.Handler): config.update(config_copy) # restore for deferred cfg raise TypeError('target not configured yet') config['target'] = th except Exception as e: - raise ValueError('Unable to set target handler ' - '%r' % config['target']) from e + raise ValueError('Unable to set target handler %r' % tn) from e + elif issubclass(klass, logging.handlers.QueueHandler): + # Another special case for handler which refers to other handlers + if 'handlers' not in config: + raise ValueError('No handlers specified for a QueueHandler') + if 'queue' in config: + qspec = config['queue'] + if not isinstance(qspec, queue.Queue): + if isinstance(qspec, str): + q = self.resolve(qspec) + if not callable(q): + raise TypeError('Invalid queue specifier %r' % qspec) + q = q() + elif isinstance(qspec, dict): + if '()' not in qspec: + raise TypeError('Invalid queue specifier %r' % qspec) + q = self.configure_custom(dict(qspec)) + else: + raise TypeError('Invalid queue specifier %r' % qspec) + config['queue'] = q + if 'listener' in config: + lspec = config['listener'] + if isinstance(lspec, type): + if not issubclass(lspec, logging.handlers.QueueListener): + raise TypeError('Invalid listener specifier %r' % lspec) + else: + if isinstance(lspec, str): + listener = self.resolve(lspec) + if isinstance(listener, type) and\ + not issubclass(listener, logging.handlers.QueueListener): + raise TypeError('Invalid listener specifier %r' % lspec) + elif isinstance(lspec, dict): + if '()' not in lspec: + raise TypeError('Invalid listener specifier %r' % lspec) + listener = self.configure_custom(dict(lspec)) + else: + raise TypeError('Invalid listener specifier %r' % lspec) + if not callable(listener): + raise TypeError('Invalid listener specifier %r' % lspec) + config['listener'] = listener + hlist = [] + try: + for hn in config['handlers']: + h = 
self.config['handlers'][hn] + if not isinstance(h, logging.Handler): + config.update(config_copy) # restore for deferred cfg + raise TypeError('Required handler %r ' + 'is not configured yet' % hn) + hlist.append(h) + except Exception as e: + raise ValueError('Unable to set required handler %r' % hn) from e + config['handlers'] = hlist elif issubclass(klass, logging.handlers.SMTPHandler) and\ 'mailhost' in config: config['mailhost'] = self.as_tuple(config['mailhost']) elif issubclass(klass, logging.handlers.SysLogHandler) and\ 'address' in config: config['address'] = self.as_tuple(config['address']) - factory = klass + if issubclass(klass, logging.handlers.QueueHandler): + factory = functools.partial(self._configure_queue_handler, klass) + else: + factory = klass props = config.pop('.', None) kwargs = {k: config[k] for k in config if valid_ident(k)} try: diff --git a/Lib/logging/handlers.py b/Lib/logging/handlers.py index 78e919d195d974..b4c8a3ba9a1896 100644 --- a/Lib/logging/handlers.py +++ b/Lib/logging/handlers.py @@ -1424,6 +1424,7 @@ def __init__(self, queue): """ logging.Handler.__init__(self) self.queue = queue + self.listener = None # will be set to listener if configured via dictConfig() def enqueue(self, record): """ diff --git a/Lib/mailcap.py b/Lib/mailcap.py index 856b6a55475f38..7278ea7051fccf 100644 --- a/Lib/mailcap.py +++ b/Lib/mailcap.py @@ -2,6 +2,7 @@ import os import warnings +import re __all__ = ["getcaps","findmatch"] @@ -19,6 +20,11 @@ def lineno_sort_key(entry): else: return 1, 0 +_find_unsafe = re.compile(r'[^\xa1-\U0010FFFF\w@+=:,./-]').search + +class UnsafeMailcapInput(Warning): + """Warning raised when refusing unsafe input""" + # Part 1: top-level interface. @@ -171,15 +177,22 @@ def findmatch(caps, MIMEtype, key='view', filename="/dev/null", plist=[]): entry to use. """ + if _find_unsafe(filename): + msg = "Refusing to use mailcap with filename %r. Use a safe temporary filename." % (filename,) + warnings.warn(msg, UnsafeMailcapInput) + return None, None entries = lookup(caps, MIMEtype, key) # XXX This code should somehow check for the needsterminal flag. for e in entries: if 'test' in e: test = subst(e['test'], filename, plist) + if test is None: + continue if test and os.system(test) != 0: continue command = subst(e[key], MIMEtype, filename, plist) - return command, e + if command is not None: + return command, e return None, None def lookup(caps, MIMEtype, key=None): @@ -212,6 +225,10 @@ def subst(field, MIMEtype, filename, plist=[]): elif c == 's': res = res + filename elif c == 't': + if _find_unsafe(MIMEtype): + msg = "Refusing to substitute MIME type %r into a shell command." 
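# Hedged sketch of the QueueHandler/QueueListener wiring implemented above: a
# QueueHandler entry lists the handlers its auto-created listener should
# drive. Handler names and the log file name are illustrative; as coded in
# this patch the listener is attached to the handler but not started
# automatically.
import logging, logging.config

logging.config.dictConfig({
    "version": 1,
    "handlers": {
        "file": {"class": "logging.FileHandler", "filename": "app.log"},
        "queued": {
            "class": "logging.handlers.QueueHandler",
            "handlers": ["file"],
            "respect_handler_level": True,
        },
    },
    "root": {"handlers": ["queued"], "level": "INFO"},
})
qh = logging.getHandlerByName("queued")
qh.listener.start()
logging.info("routed through the queue")
qh.listener.stop()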
% (MIMEtype,) + warnings.warn(msg, UnsafeMailcapInput) + return None res = res + MIMEtype elif c == '{': start = i @@ -219,7 +236,12 @@ def subst(field, MIMEtype, filename, plist=[]): i = i+1 name = field[start:i] i = i+1 - res = res + findparam(name, plist) + param = findparam(name, plist) + if _find_unsafe(param): + msg = "Refusing to substitute parameter %r (%s) into a shell command" % (param, name) + warnings.warn(msg, UnsafeMailcapInput) + return None + res = res + param # XXX To do: # %n == number of parts if type is multipart/* # %F == list of alternating type and filename for parts diff --git a/Lib/multiprocessing/context.py b/Lib/multiprocessing/context.py index 8d0525d5d62179..b1960ea296fe20 100644 --- a/Lib/multiprocessing/context.py +++ b/Lib/multiprocessing/context.py @@ -223,6 +223,10 @@ class Process(process.BaseProcess): def _Popen(process_obj): return _default_context.get_context().Process._Popen(process_obj) + @staticmethod + def _after_fork(): + return _default_context.get_context().Process._after_fork() + class DefaultContext(BaseContext): Process = Process @@ -283,6 +287,11 @@ def _Popen(process_obj): from .popen_spawn_posix import Popen return Popen(process_obj) + @staticmethod + def _after_fork(): + # process is spawned, nothing to do + pass + class ForkServerProcess(process.BaseProcess): _start_method = 'forkserver' @staticmethod @@ -326,6 +335,11 @@ def _Popen(process_obj): from .popen_spawn_win32 import Popen return Popen(process_obj) + @staticmethod + def _after_fork(): + # process is spawned, nothing to do + pass + class SpawnContext(BaseContext): _name = 'spawn' Process = SpawnProcess diff --git a/Lib/multiprocessing/pool.py b/Lib/multiprocessing/pool.py index bbe05a550c349c..961d7e5991847a 100644 --- a/Lib/multiprocessing/pool.py +++ b/Lib/multiprocessing/pool.py @@ -203,6 +203,9 @@ def __init__(self, processes=None, initializer=None, initargs=(), processes = os.cpu_count() or 1 if processes < 1: raise ValueError("Number of processes must be at least 1") + if maxtasksperchild is not None: + if not isinstance(maxtasksperchild, int) or maxtasksperchild <= 0: + raise ValueError("maxtasksperchild must be a positive int or None") if initializer is not None and not callable(initializer): raise TypeError('initializer must be a callable') diff --git a/Lib/multiprocessing/process.py b/Lib/multiprocessing/process.py index 3917d2e4fa63ec..c03c859baa795b 100644 --- a/Lib/multiprocessing/process.py +++ b/Lib/multiprocessing/process.py @@ -304,8 +304,7 @@ def _bootstrap(self, parent_sentinel=None): if threading._HAVE_THREAD_NATIVE_ID: threading.main_thread()._set_native_id() try: - util._finalizer_registry.clear() - util._run_after_forkers() + self._after_fork() finally: # delay finalization of the old process object until after # _run_after_forkers() is executed @@ -336,6 +335,13 @@ def _bootstrap(self, parent_sentinel=None): return exitcode + @staticmethod + def _after_fork(): + from . import util + util._finalizer_registry.clear() + util._run_after_forkers() + + # # We subclass bytes to avoid accidental transmission of auth keys over network # diff --git a/Lib/multiprocessing/shared_memory.py b/Lib/multiprocessing/shared_memory.py index 122b3fcebf3fed..881f2001dd5980 100644 --- a/Lib/multiprocessing/shared_memory.py +++ b/Lib/multiprocessing/shared_memory.py @@ -23,6 +23,7 @@ import _posixshmem _USE_POSIX = True +from . 
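# Sketch of the new multiprocessing.Pool argument validation shown above:
# maxtasksperchild must now be a positive int (or None); the check runs before
# any worker processes are created.
from multiprocessing import Pool

try:
    Pool(processes=2, maxtasksperchild=0)
except ValueError as exc:
    print(exc)        # maxtasksperchild must be a positive int or None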
import resource_tracker _O_CREX = os.O_CREAT | os.O_EXCL @@ -116,8 +117,7 @@ def __init__(self, name=None, create=False, size=0): self.unlink() raise - from .resource_tracker import register - register(self._name, "shared_memory") + resource_tracker.register(self._name, "shared_memory") else: @@ -237,9 +237,8 @@ def unlink(self): called once (and only once) across all processes which have access to the shared memory block.""" if _USE_POSIX and self._name: - from .resource_tracker import unregister _posixshmem.shm_unlink(self._name) - unregister(self._name, "shared_memory") + resource_tracker.unregister(self._name, "shared_memory") _encoding = "utf8" diff --git a/Lib/netrc.py b/Lib/netrc.py index c1358aac6ae02f..b285fd8e357ddb 100644 --- a/Lib/netrc.py +++ b/Lib/netrc.py @@ -2,7 +2,7 @@ # Module and documentation by Eric S. Raymond, 21 Dec 1998 -import os, shlex, stat +import os, stat __all__ = ["netrc", "NetrcParseError"] diff --git a/Lib/ntpath.py b/Lib/ntpath.py index 041ebc75cb127c..959bcd09831186 100644 --- a/Lib/ntpath.py +++ b/Lib/ntpath.py @@ -23,6 +23,7 @@ import genericpath from genericpath import * + __all__ = ["normcase","isabs","join","splitdrive","split","splitext", "basename","dirname","commonprefix","getsize","getmtime", "getatime","getctime", "islink","exists","lexists","isdir","isfile", @@ -41,14 +42,39 @@ def _get_bothseps(path): # Other normalizations (such as optimizing '../' away) are not done # (this is done by normpath). -def normcase(s): - """Normalize case of pathname. - - Makes all characters lowercase and all slashes into backslashes.""" - s = os.fspath(s) - if isinstance(s, bytes): - return s.replace(b'/', b'\\').lower() - else: +try: + from _winapi import ( + LCMapStringEx as _LCMapStringEx, + LOCALE_NAME_INVARIANT as _LOCALE_NAME_INVARIANT, + LCMAP_LOWERCASE as _LCMAP_LOWERCASE) + + def normcase(s): + """Normalize case of pathname. + + Makes all characters lowercase and all slashes into backslashes. + """ + s = os.fspath(s) + if not s: + return s + if isinstance(s, bytes): + encoding = sys.getfilesystemencoding() + s = s.decode(encoding, 'surrogateescape').replace('/', '\\') + s = _LCMapStringEx(_LOCALE_NAME_INVARIANT, + _LCMAP_LOWERCASE, s) + return s.encode(encoding, 'surrogateescape') + else: + return _LCMapStringEx(_LOCALE_NAME_INVARIANT, + _LCMAP_LOWERCASE, + s.replace('/', '\\')) +except ImportError: + def normcase(s): + """Normalize case of pathname. + + Makes all characters lowercase and all slashes into backslashes. + """ + s = os.fspath(s) + if isinstance(s, bytes): + return os.fsencode(os.fsdecode(s).replace('/', '\\').lower()) return s.replace('/', '\\').lower() @@ -146,17 +172,23 @@ def splitdrive(p): sep = b'\\' altsep = b'/' colon = b':' + unc_prefix = b'\\\\?\\UNC' else: sep = '\\' altsep = '/' colon = ':' + unc_prefix = '\\\\?\\UNC' normp = p.replace(altsep, sep) if (normp[0:2] == sep*2) and (normp[2:3] != sep): # is a UNC path: # vvvvvvvvvvvvvvvvvvvv drive letter or UNC path # \\machine\mountpoint\directory\etc\... 
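# Illustrative calls into the ntpath changes above: normcase() now case-folds
# with LCMapStringEx when _winapi is available (the pure-Python fallback gives
# the same result for ASCII input), and splitdrive(), whose UNC-prefix
# handling continues just below, recognises the '\\?\UNC\' prefix.
import ntpath

print(ntpath.normcase('C:/Spam/EGGS.txt'))            # c:\spam\eggs.txt
print(ntpath.splitdrive(r'\\?\UNC\server\share\dir'))
# expected: the drive part is '\\?\UNC\server\share', the rest is '\dir'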
# directory ^^^^^^^^^^^^^^^ - index = normp.find(sep, 2) + if normp[:8].upper().rstrip(sep) == unc_prefix: + start = 8 + else: + start = 2 + index = normp.find(sep, start) if index == -1: return p[:0], p index2 = normp.find(sep, index + 1) diff --git a/Lib/opcode.py b/Lib/opcode.py index 310582874dc8f0..f515aaa9fc8ddc 100644 --- a/Lib/opcode.py +++ b/Lib/opcode.py @@ -98,7 +98,7 @@ def jabs_op(name, op): def_op('RETURN_VALUE', 83) def_op('IMPORT_STAR', 84) def_op('SETUP_ANNOTATIONS', 85) -def_op('YIELD_VALUE', 86) + def_op('ASYNC_GEN_WRAP', 87) def_op('PREP_RERAISE_STAR', 88) def_op('POP_EXCEPT', 89) @@ -139,12 +139,14 @@ def jabs_op(name, op): def_op('COPY', 120) def_op('BINARY_OP', 122) jrel_op('SEND', 123) # Number of bytes to skip -def_op('LOAD_FAST', 124) # Local variable number +def_op('LOAD_FAST', 124) # Local variable number, no null check haslocal.append(124) def_op('STORE_FAST', 125) # Local variable number haslocal.append(125) def_op('DELETE_FAST', 126) # Local variable number haslocal.append(126) +def_op('LOAD_FAST_CHECK', 127) # Local variable number +haslocal.append(127) jrel_op('POP_JUMP_FORWARD_IF_NOT_NONE', 128) jrel_op('POP_JUMP_FORWARD_IF_NONE', 129) def_op('RAISE_VARARGS', 130) # Number of raise arguments (1, 2, or 3) @@ -174,21 +176,18 @@ def jabs_op(name, op): def_op('LOAD_CLASSDEREF', 148) hasfree.append(148) def_op('COPY_FREE_VARS', 149) - -def_op('RESUME', 151) +def_op('YIELD_VALUE', 150) +def_op('RESUME', 151) # This must be kept in sync with deepfreeze.py def_op('MATCH_CLASS', 152) def_op('FORMAT_VALUE', 155) def_op('BUILD_CONST_KEY_MAP', 156) def_op('BUILD_STRING', 157) -name_op('LOAD_METHOD', 160) - def_op('LIST_EXTEND', 162) def_op('SET_UPDATE', 163) def_op('DICT_MERGE', 164) def_op('DICT_UPDATE', 165) -def_op('PRECALL', 166) def_op('CALL', 171) def_op('KW_NAMES', 172) @@ -254,6 +253,21 @@ def jabs_op(name, op): "CALL_ADAPTIVE", "CALL_PY_EXACT_ARGS", "CALL_PY_WITH_DEFAULTS", + "CALL_BOUND_METHOD_EXACT_ARGS", + "CALL_BUILTIN_CLASS", + "CALL_BUILTIN_FAST_WITH_KEYWORDS", + "CALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS", + "CALL_NO_KW_BUILTIN_FAST", + "CALL_NO_KW_BUILTIN_O", + "CALL_NO_KW_ISINSTANCE", + "CALL_NO_KW_LEN", + "CALL_NO_KW_LIST_APPEND", + "CALL_NO_KW_METHOD_DESCRIPTOR_FAST", + "CALL_NO_KW_METHOD_DESCRIPTOR_NOARGS", + "CALL_NO_KW_METHOD_DESCRIPTOR_O", + "CALL_NO_KW_STR_1", + "CALL_NO_KW_TUPLE_1", + "CALL_NO_KW_TYPE_1", ], "COMPARE_OP": [ "COMPARE_OP_ADAPTIVE", @@ -264,15 +278,28 @@ def jabs_op(name, op): "EXTENDED_ARG": [ "EXTENDED_ARG_QUICK", ], + "FOR_ITER": [ + "FOR_ITER_ADAPTIVE", + "FOR_ITER_LIST", + "FOR_ITER_RANGE", + ], "JUMP_BACKWARD": [ "JUMP_BACKWARD_QUICK", ], "LOAD_ATTR": [ "LOAD_ATTR_ADAPTIVE", + # These potentially push [NULL, bound method] onto the stack. + "LOAD_ATTR_CLASS", "LOAD_ATTR_INSTANCE_VALUE", "LOAD_ATTR_MODULE", + "LOAD_ATTR_PROPERTY", "LOAD_ATTR_SLOT", "LOAD_ATTR_WITH_HINT", + # These will always push [unbound method, self] onto the stack. 
+ "LOAD_ATTR_METHOD_LAZY_DICT", + "LOAD_ATTR_METHOD_NO_DICT", + "LOAD_ATTR_METHOD_WITH_DICT", + "LOAD_ATTR_METHOD_WITH_VALUES", ], "LOAD_CONST": [ "LOAD_CONST__LOAD_FAST", @@ -286,33 +313,6 @@ def jabs_op(name, op): "LOAD_GLOBAL_BUILTIN", "LOAD_GLOBAL_MODULE", ], - "LOAD_METHOD": [ - "LOAD_METHOD_ADAPTIVE", - "LOAD_METHOD_CLASS", - "LOAD_METHOD_MODULE", - "LOAD_METHOD_NO_DICT", - "LOAD_METHOD_WITH_DICT", - "LOAD_METHOD_WITH_VALUES", - ], - "PRECALL": [ - "PRECALL_ADAPTIVE", - "PRECALL_BOUND_METHOD", - "PRECALL_BUILTIN_CLASS", - "PRECALL_BUILTIN_FAST_WITH_KEYWORDS", - "PRECALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS", - "PRECALL_NO_KW_BUILTIN_FAST", - "PRECALL_NO_KW_BUILTIN_O", - "PRECALL_NO_KW_ISINSTANCE", - "PRECALL_NO_KW_LEN", - "PRECALL_NO_KW_LIST_APPEND", - "PRECALL_NO_KW_METHOD_DESCRIPTOR_FAST", - "PRECALL_NO_KW_METHOD_DESCRIPTOR_NOARGS", - "PRECALL_NO_KW_METHOD_DESCRIPTOR_O", - "PRECALL_NO_KW_STR_1", - "PRECALL_NO_KW_TUPLE_1", - "PRECALL_NO_KW_TYPE_1", - "PRECALL_PYFUNC", - ], "RESUME": [ "RESUME_QUICK", ], @@ -372,31 +372,25 @@ def jabs_op(name, op): "type_version": 2, "func_version": 1, }, + "FOR_ITER": { + "counter": 1, + }, "LOAD_ATTR": { "counter": 1, "version": 2, - "index": 1, + "keys_version": 2, + "descr": 4, }, "STORE_ATTR": { "counter": 1, "version": 2, "index": 1, }, - "LOAD_METHOD": { - "counter": 1, - "type_version": 2, - "dict_offset": 1, - "keys_version": 2, - "descr": 4, - }, "CALL": { "counter": 1, "func_version": 2, "min_args": 1, }, - "PRECALL": { - "counter": 1, - }, "STORE_SUBSCR": { "counter": 1, }, diff --git a/Lib/os.py b/Lib/os.py index 67662ca7ad8588..648188e0f13490 100644 --- a/Lib/os.py +++ b/Lib/os.py @@ -974,15 +974,14 @@ def spawnlpe(mode, file, *args): # command in a shell can't be supported. if sys.platform != 'vxworks': # Supply os.popen() - def popen(cmd, mode="r", buffering=-1, encoding=None): + def popen(cmd, mode="r", buffering=-1): if not isinstance(cmd, str): raise TypeError("invalid cmd type (%s, expected string)" % type(cmd)) if mode not in ("r", "w"): raise ValueError("invalid mode %r" % mode) if buffering == 0 or buffering is None: raise ValueError("popen() does not support unbuffered streams") - import subprocess, io - encoding = io.text_encoding(encoding) + import subprocess if mode == "r": proc = subprocess.Popen(cmd, shell=True, text=True, diff --git a/Lib/pathlib.py b/Lib/pathlib.py index c608ba0954f32c..bb440c9d57216a 100644 --- a/Lib/pathlib.py +++ b/Lib/pathlib.py @@ -120,68 +120,18 @@ class _WindowsFlavour(_Flavour): is_supported = (os.name == 'nt') - drive_letters = set('abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ') - ext_namespace_prefix = '\\\\?\\' - reserved_names = ( {'CON', 'PRN', 'AUX', 'NUL', 'CONIN$', 'CONOUT$'} | {'COM%s' % c for c in '123456789\xb9\xb2\xb3'} | {'LPT%s' % c for c in '123456789\xb9\xb2\xb3'} ) - # Interesting findings about extended paths: - # * '\\?\c:\a' is an extended path, which bypasses normal Windows API - # path processing. Thus relative paths are not resolved and slash is not - # translated to backslash. It has the native NT path limit of 32767 - # characters, but a bit less after resolving device symbolic links, - # such as '\??\C:' => '\Device\HarddiskVolume2'. - # * '\\?\c:/a' looks for a device named 'C:/a' because slash is a - # regular name character in the object namespace. - # * '\\?\c:\foo/bar' is invalid because '/' is illegal in NT filesystems. - # The only path separator at the filesystem level is backslash. 
- # * '//?/c:\a' and '//?/c:/a' are effectively equivalent to '\\.\c:\a' and - # thus limited to MAX_PATH. - # * Prior to Windows 8, ANSI API bytes paths are limited to MAX_PATH, - # even with the '\\?\' prefix. - def splitroot(self, part, sep=sep): - first = part[0:1] - second = part[1:2] - if (second == sep and first == sep): - # XXX extended paths should also disable the collapsing of "." - # components (according to MSDN docs). - prefix, part = self._split_extended_path(part) - first = part[0:1] - second = part[1:2] + drv, rest = self.pathmod.splitdrive(part) + if drv[:1] == sep or rest[:1] == sep: + return drv, sep, rest.lstrip(sep) else: - prefix = '' - third = part[2:3] - if (second == sep and first == sep and third != sep): - # is a UNC path: - # vvvvvvvvvvvvvvvvvvvvv root - # \\machine\mountpoint\directory\etc\... - # directory ^^^^^^^^^^^^^^ - index = part.find(sep, 2) - if index != -1: - index2 = part.find(sep, index + 1) - # a UNC path can't have two slashes in a row - # (after the initial two) - if index2 != index + 1: - if index2 == -1: - index2 = len(part) - if prefix: - return prefix + part[1:index2], sep, part[index2+1:] - else: - return part[:index2], sep, part[index2+1:] - drv = root = '' - if second == ':' and first in self.drive_letters: - drv = part[:2] - part = part[2:] - first = third - if first == sep: - root = first - part = part.lstrip(sep) - return prefix + drv, root, part + return drv, '', rest def casefold(self, s): return s.lower() @@ -192,16 +142,6 @@ def casefold_parts(self, parts): def compile_pattern(self, pattern): return re.compile(fnmatch.translate(pattern), re.IGNORECASE).fullmatch - def _split_extended_path(self, s, ext_prefix=ext_namespace_prefix): - prefix = '' - if s.startswith(ext_prefix): - prefix = s[:4] - s = s[4:] - if s.startswith('UNC\\'): - prefix += s[:3] - s = '\\' + s[3:] - return prefix, s - def is_reserved(self, parts): # NOTE: the rules for reserved names seem somewhat complicated # (e.g. r"..\NUL" is reserved but not r"foo\NUL" if "foo" does not @@ -443,6 +383,8 @@ def __getitem__(self, idx): if idx >= len(self) or idx < -len(self): raise IndexError(idx) + if idx < 0: + idx += len(self) return self._pathcls._from_parsed_parts(self._drv, self._root, self._parts[:-idx - 1]) diff --git a/Lib/pickle.py b/Lib/pickle.py index e7f30f226101f5..f027e0432045b7 100644 --- a/Lib/pickle.py +++ b/Lib/pickle.py @@ -619,7 +619,7 @@ def save_pers(self, pid): "persistent IDs in protocol 0 must be ASCII strings") def save_reduce(self, func, args, state=None, listitems=None, - dictitems=None, state_setter=None, obj=None): + dictitems=None, state_setter=None, *, obj=None): # This API is called by some subclasses if not isinstance(args, tuple): diff --git a/Lib/platform.py b/Lib/platform.py index 3f3f25a2c92d3b..c272c407c77768 100755 --- a/Lib/platform.py +++ b/Lib/platform.py @@ -186,6 +186,10 @@ def libc_ver(executable=None, lib='', version='', chunksize=16384): executable = sys.executable + if not executable: + # sys.executable is not set. + return lib, version + V = _comparable_version # We use os.path.realpath() # here to work around problems with Cygwin not being diff --git a/Lib/pydoc.py b/Lib/pydoc.py index cec9ac89f1cc8d..e96cacbe434454 100755 --- a/Lib/pydoc.py +++ b/Lib/pydoc.py @@ -70,7 +70,6 @@ class or function within a module or module in a package. 
If the import sysconfig import time import tokenize -import types import urllib.parse import warnings from collections import deque @@ -92,16 +91,13 @@ def pathdirs(): normdirs.append(normdir) return dirs -def _isclass(object): - return inspect.isclass(object) and not isinstance(object, types.GenericAlias) - def _findclass(func): cls = sys.modules.get(func.__module__) if cls is None: return None for name in func.__qualname__.split('.')[:-1]: cls = getattr(cls, name) - if not _isclass(cls): + if not inspect.isclass(cls): return None return cls @@ -109,7 +105,7 @@ def _finddoc(obj): if inspect.ismethod(obj): name = obj.__func__.__name__ self = obj.__self__ - if (_isclass(self) and + if (inspect.isclass(self) and getattr(getattr(self, name, None), '__func__') is obj.__func__): # classmethod cls = self @@ -123,7 +119,7 @@ def _finddoc(obj): elif inspect.isbuiltin(obj): name = obj.__name__ self = obj.__self__ - if (_isclass(self) and + if (inspect.isclass(self) and self.__qualname__ + '.' + name == obj.__qualname__): # classmethod cls = self @@ -210,7 +206,7 @@ def classname(object, modname): def isdata(object): """Check if an object is of a type that probably means it's data.""" - return not (inspect.ismodule(object) or _isclass(object) or + return not (inspect.ismodule(object) or inspect.isclass(object) or inspect.isroutine(object) or inspect.isframe(object) or inspect.istraceback(object) or inspect.iscode(object)) @@ -481,7 +477,7 @@ def document(self, object, name=None, *args): # by lacking a __name__ attribute) and an instance. try: if inspect.ismodule(object): return self.docmodule(*args) - if _isclass(object): return self.docclass(*args) + if inspect.isclass(object): return self.docclass(*args) if inspect.isroutine(object): return self.docroutine(*args) except AttributeError: pass @@ -783,7 +779,7 @@ def docmodule(self, object, name=None, mod=None, *ignored): modules = inspect.getmembers(object, inspect.ismodule) classes, cdict = [], {} - for key, value in inspect.getmembers(object, _isclass): + for key, value in inspect.getmembers(object, inspect.isclass): # if __all__ exists, believe it. Otherwise use old heuristic. if (all is not None or (inspect.getmodule(value) or object) is object): @@ -1223,7 +1219,7 @@ def docmodule(self, object, name=None, mod=None): result = result + self.section('DESCRIPTION', desc) classes = [] - for key, value in inspect.getmembers(object, _isclass): + for key, value in inspect.getmembers(object, inspect.isclass): # if __all__ exists, believe it. Otherwise use old heuristic. 
if (all is not None or (inspect.getmodule(value) or object) is object): @@ -1707,7 +1703,7 @@ def describe(thing): return 'member descriptor %s.%s.%s' % ( thing.__objclass__.__module__, thing.__objclass__.__name__, thing.__name__) - if _isclass(thing): + if inspect.isclass(thing): return 'class ' + thing.__name__ if inspect.isfunction(thing): return 'function ' + thing.__name__ @@ -1768,7 +1764,7 @@ def render_doc(thing, title='Python Library Documentation: %s', forceload=0, desc += ' in module ' + module.__name__ if not (inspect.ismodule(object) or - _isclass(object) or + inspect.isclass(object) or inspect.isroutine(object) or inspect.isdatadescriptor(object) or _getdoc(object)): diff --git a/Lib/random.py b/Lib/random.py index a2dfcb574bd5eb..2166474af0554b 100644 --- a/Lib/random.py +++ b/Lib/random.py @@ -50,7 +50,7 @@ from math import sqrt as _sqrt, acos as _acos, cos as _cos, sin as _sin from math import tau as TWOPI, floor as _floor, isfinite as _isfinite from os import urandom as _urandom -from _collections_abc import Set as _Set, Sequence as _Sequence +from _collections_abc import Sequence as _Sequence from operator import index as _index from itertools import accumulate as _accumulate, repeat as _repeat from bisect import bisect as _bisect diff --git a/Lib/re/__init__.py b/Lib/re/__init__.py index c9b511422f1e59..d58c2117ef3e14 100644 --- a/Lib/re/__init__.py +++ b/Lib/re/__init__.py @@ -129,7 +129,7 @@ # public symbols __all__ = [ "match", "fullmatch", "search", "sub", "subn", "split", - "findall", "finditer", "compile", "purge", "escape", + "findall", "finditer", "compile", "purge", "template", "escape", "error", "Pattern", "Match", "A", "I", "L", "M", "S", "X", "U", "ASCII", "IGNORECASE", "LOCALE", "MULTILINE", "DOTALL", "VERBOSE", "UNICODE", "NOFLAG", "RegexFlag", @@ -148,6 +148,8 @@ class RegexFlag: MULTILINE = M = _compiler.SRE_FLAG_MULTILINE # make anchors look for newline DOTALL = S = _compiler.SRE_FLAG_DOTALL # make dot match newline VERBOSE = X = _compiler.SRE_FLAG_VERBOSE # ignore whitespace and comments + # sre extensions (experimental, don't rely on these) + TEMPLATE = T = _compiler.SRE_FLAG_TEMPLATE # unknown purpose, deprecated DEBUG = _compiler.SRE_FLAG_DEBUG # dump pattern after compilation __str__ = object.__str__ _numeric_repr_ = hex @@ -229,6 +231,18 @@ def purge(): _cache.clear() _compile_repl.cache_clear() +def template(pattern, flags=0): + "Compile a template pattern, returning a Pattern object, deprecated" + import warnings + warnings.warn("The re.template() function is deprecated " + "as it is an undocumented function " + "without an obvious purpose. " + "Use re.compile() instead.", + DeprecationWarning) + with warnings.catch_warnings(): + warnings.simplefilter("ignore", DeprecationWarning) # warn just once + return _compile(pattern, flags|T) + # SPECIAL_CHARS # closing ')', '}' and ']' # '-' (a range in character set) @@ -270,6 +284,13 @@ def _compile(pattern, flags): return pattern if not _compiler.isstring(pattern): raise TypeError("first argument must be string or compiled pattern") + if flags & T: + import warnings + warnings.warn("The re.TEMPLATE/re.T flag is deprecated " + "as it is an undocumented flag " + "without an obvious purpose. 
" + "Don't use it.", + DeprecationWarning) p = _compiler.compile(pattern, flags) if not (flags & DEBUG): if len(_cache) >= _MAXCACHE: diff --git a/Lib/re/_compiler.py b/Lib/re/_compiler.py index 63d82025505b74..d8e0d2fdefdcca 100644 --- a/Lib/re/_compiler.py +++ b/Lib/re/_compiler.py @@ -28,21 +28,14 @@ POSSESSIVE_REPEAT: (POSSESSIVE_REPEAT, SUCCESS, POSSESSIVE_REPEAT_ONE), } -class _CompileData: - __slots__ = ('code', 'repeat_count') - def __init__(self): - self.code = [] - self.repeat_count = 0 - def _combine_flags(flags, add_flags, del_flags, TYPE_FLAGS=_parser.TYPE_FLAGS): if add_flags & TYPE_FLAGS: flags &= ~TYPE_FLAGS return (flags | add_flags) & ~del_flags -def _compile(data, pattern, flags): +def _compile(code, pattern, flags): # internal: compile a (sub)pattern - code = data.code emit = code.append _len = len LITERAL_CODES = _LITERAL_CODES @@ -108,12 +101,14 @@ def _compile(data, pattern, flags): else: emit(ANY) elif op in REPEATING_CODES: + if flags & SRE_FLAG_TEMPLATE: + raise error("internal: unsupported template operator %r" % (op,)) if _simple(av[2]): emit(REPEATING_CODES[op][2]) skip = _len(code); emit(0) emit(av[0]) emit(av[1]) - _compile(data, av[2], flags) + _compile(code, av[2], flags) emit(SUCCESS) code[skip] = _len(code) - skip else: @@ -121,11 +116,7 @@ def _compile(data, pattern, flags): skip = _len(code); emit(0) emit(av[0]) emit(av[1]) - # now op is in (MIN_REPEAT, MAX_REPEAT, POSSESSIVE_REPEAT) - if op != POSSESSIVE_REPEAT: - emit(data.repeat_count) - data.repeat_count += 1 - _compile(data, av[2], flags) + _compile(code, av[2], flags) code[skip] = _len(code) - skip emit(REPEATING_CODES[op][1]) elif op is SUBPATTERN: @@ -134,7 +125,7 @@ def _compile(data, pattern, flags): emit(MARK) emit((group-1)*2) # _compile_info(code, p, _combine_flags(flags, add_flags, del_flags)) - _compile(data, p, _combine_flags(flags, add_flags, del_flags)) + _compile(code, p, _combine_flags(flags, add_flags, del_flags)) if group: emit(MARK) emit((group-1)*2+1) @@ -146,7 +137,7 @@ def _compile(data, pattern, flags): # pop their stack if they reach it emit(ATOMIC_GROUP) skip = _len(code); emit(0) - _compile(data, av, flags) + _compile(code, av, flags) emit(SUCCESS) code[skip] = _len(code) - skip elif op in SUCCESS_CODES: @@ -161,7 +152,7 @@ def _compile(data, pattern, flags): if lo != hi: raise error("look-behind requires fixed-width pattern") emit(lo) # look behind - _compile(data, av[1], flags) + _compile(code, av[1], flags) emit(SUCCESS) code[skip] = _len(code) - skip elif op is AT: @@ -180,7 +171,7 @@ def _compile(data, pattern, flags): for av in av[1]: skip = _len(code); emit(0) # _compile_info(code, av, flags) - _compile(data, av, flags) + _compile(code, av, flags) emit(JUMP) tailappend(_len(code)); emit(0) code[skip] = _len(code) - skip @@ -208,12 +199,12 @@ def _compile(data, pattern, flags): emit(op) emit(av[0]-1) skipyes = _len(code); emit(0) - _compile(data, av[1], flags) + _compile(code, av[1], flags) if av[2]: emit(JUMP) skipno = _len(code); emit(0) code[skipyes] = _len(code) - skipyes + 1 - _compile(data, av[2], flags) + _compile(code, av[2], flags) code[skipno] = _len(code) - skipno else: code[skipyes] = _len(code) - skipyes + 1 @@ -580,17 +571,17 @@ def isstring(obj): def _code(p, flags): flags = p.state.flags | flags - data = _CompileData() + code = [] # compile info block - _compile_info(data.code, p, flags) + _compile_info(code, p, flags) # compile the pattern - _compile(data, p.data, flags) + _compile(code, p.data, flags) - data.code.append(SUCCESS) + 
code.append(SUCCESS) - return data + return code def _hex_code(code): return '[%s]' % ', '.join('%#0*x' % (_sre.CODESIZE*2+2, x) for x in code) @@ -691,7 +682,7 @@ def print_2(*args): else: print_(FAILURE) i += 1 - elif op in (REPEAT_ONE, MIN_REPEAT_ONE, + elif op in (REPEAT, REPEAT_ONE, MIN_REPEAT_ONE, POSSESSIVE_REPEAT, POSSESSIVE_REPEAT_ONE): skip, min, max = code[i: i+3] if max == MAXREPEAT: @@ -699,13 +690,6 @@ def print_2(*args): print_(op, skip, min, max, to=i+skip) dis_(i+3, i+skip) i += skip - elif op is REPEAT: - skip, min, max, repeat_index = code[i: i+4] - if max == MAXREPEAT: - max = 'MAXREPEAT' - print_(op, skip, min, max, repeat_index, to=i+skip) - dis_(i+4, i+skip) - i += skip elif op is GROUPREF_EXISTS: arg, skip = code[i: i+2] print_(op, arg, skip, to=i+skip) @@ -760,11 +744,11 @@ def compile(p, flags=0): else: pattern = None - data = _code(p, flags) + code = _code(p, flags) if flags & SRE_FLAG_DEBUG: print() - dis(data.code) + dis(code) # map in either direction groupindex = p.state.groupdict @@ -773,6 +757,7 @@ def compile(p, flags=0): indexgroup[i] = k return _sre.compile( - pattern, flags | p.state.flags, data.code, - p.state.groups-1, groupindex, tuple(indexgroup), - data.repeat_count) + pattern, flags | p.state.flags, code, + p.state.groups-1, + groupindex, tuple(indexgroup) + ) diff --git a/Lib/re/_constants.py b/Lib/re/_constants.py index 71204d903b3226..10ee14bfab46ee 100644 --- a/Lib/re/_constants.py +++ b/Lib/re/_constants.py @@ -13,7 +13,7 @@ # update when constants are added or removed -MAGIC = 20220423 +MAGIC = 20220615 from _sre import MAXREPEAT, MAXGROUPS @@ -204,6 +204,7 @@ def _makecodes(*names): } # flags +SRE_FLAG_TEMPLATE = 1 # template mode (unknown purpose, deprecated) SRE_FLAG_IGNORECASE = 2 # case insensitive SRE_FLAG_LOCALE = 4 # honour system locale SRE_FLAG_MULTILINE = 8 # treat target as multiline string diff --git a/Lib/re/_parser.py b/Lib/re/_parser.py index 33b70973ae6d32..0d9cf632ea7105 100644 --- a/Lib/re/_parser.py +++ b/Lib/re/_parser.py @@ -61,11 +61,12 @@ "x": SRE_FLAG_VERBOSE, # extensions "a": SRE_FLAG_ASCII, + "t": SRE_FLAG_TEMPLATE, "u": SRE_FLAG_UNICODE, } TYPE_FLAGS = SRE_FLAG_ASCII | SRE_FLAG_LOCALE | SRE_FLAG_UNICODE -GLOBAL_FLAGS = SRE_FLAG_DEBUG +GLOBAL_FLAGS = SRE_FLAG_DEBUG | SRE_FLAG_TEMPLATE class State: # keeps track of state for parsing diff --git a/Lib/shutil.py b/Lib/shutil.py index de82453aa56e1a..f4128647861b81 100644 --- a/Lib/shutil.py +++ b/Lib/shutil.py @@ -182,7 +182,8 @@ def _copyfileobj_readinto(fsrc, fdst, length=COPY_BUFSIZE): break elif n < length: with mv[:n] as smv: - fdst.write(smv) + fdst_write(smv) + break else: fdst_write(mv) @@ -896,7 +897,7 @@ def _get_uid(name): return None def _make_tarball(base_name, base_dir, compress="gzip", verbose=0, dry_run=0, - owner=None, group=None, logger=None): + owner=None, group=None, logger=None, root_dir=None): """Create a (possibly compressed) tar file from all the files under 'base_dir'. 
@@ -953,14 +954,20 @@ def _set_uid_gid(tarinfo): if not dry_run: tar = tarfile.open(archive_name, 'w|%s' % tar_compression) + arcname = base_dir + if root_dir is not None: + base_dir = os.path.join(root_dir, base_dir) try: - tar.add(base_dir, filter=_set_uid_gid) + tar.add(base_dir, arcname, filter=_set_uid_gid) finally: tar.close() + if root_dir is not None: + archive_name = os.path.abspath(archive_name) return archive_name -def _make_zipfile(base_name, base_dir, verbose=0, dry_run=0, logger=None): +def _make_zipfile(base_name, base_dir, verbose=0, dry_run=0, + logger=None, owner=None, group=None, root_dir=None): """Create a zip file from all the files under 'base_dir'. The output zip file will be named 'base_name' + ".zip". Returns the @@ -984,42 +991,60 @@ def _make_zipfile(base_name, base_dir, verbose=0, dry_run=0, logger=None): if not dry_run: with zipfile.ZipFile(zip_filename, "w", compression=zipfile.ZIP_DEFLATED) as zf: - path = os.path.normpath(base_dir) - if path != os.curdir: - zf.write(path, path) + arcname = os.path.normpath(base_dir) + if root_dir is not None: + base_dir = os.path.join(root_dir, base_dir) + base_dir = os.path.normpath(base_dir) + if arcname != os.curdir: + zf.write(base_dir, arcname) if logger is not None: - logger.info("adding '%s'", path) + logger.info("adding '%s'", base_dir) for dirpath, dirnames, filenames in os.walk(base_dir): + arcdirpath = dirpath + if root_dir is not None: + arcdirpath = os.path.relpath(arcdirpath, root_dir) + arcdirpath = os.path.normpath(arcdirpath) for name in sorted(dirnames): - path = os.path.normpath(os.path.join(dirpath, name)) - zf.write(path, path) + path = os.path.join(dirpath, name) + arcname = os.path.join(arcdirpath, name) + zf.write(path, arcname) if logger is not None: logger.info("adding '%s'", path) for name in filenames: - path = os.path.normpath(os.path.join(dirpath, name)) + path = os.path.join(dirpath, name) + path = os.path.normpath(path) if os.path.isfile(path): - zf.write(path, path) + arcname = os.path.join(arcdirpath, name) + zf.write(path, arcname) if logger is not None: logger.info("adding '%s'", path) + if root_dir is not None: + zip_filename = os.path.abspath(zip_filename) return zip_filename +# Maps the name of the archive format to a tuple containing: +# * the archiving function +# * extra keyword arguments +# * description +# * does it support the root_dir argument? _ARCHIVE_FORMATS = { - 'tar': (_make_tarball, [('compress', None)], "uncompressed tar file"), + 'tar': (_make_tarball, [('compress', None)], + "uncompressed tar file", True), } if _ZLIB_SUPPORTED: _ARCHIVE_FORMATS['gztar'] = (_make_tarball, [('compress', 'gzip')], - "gzip'ed tar-file") - _ARCHIVE_FORMATS['zip'] = (_make_zipfile, [], "ZIP file") + "gzip'ed tar-file", True) + _ARCHIVE_FORMATS['zip'] = (_make_zipfile, [], "ZIP file", True) if _BZ2_SUPPORTED: _ARCHIVE_FORMATS['bztar'] = (_make_tarball, [('compress', 'bzip2')], - "bzip2'ed tar-file") + "bzip2'ed tar-file", True) if _LZMA_SUPPORTED: _ARCHIVE_FORMATS['xztar'] = (_make_tarball, [('compress', 'xz')], - "xz'ed tar-file") + "xz'ed tar-file", True) def get_archive_formats(): """Returns a list of supported formats for archiving and unarchiving. 
@@ -1050,7 +1075,7 @@ def register_archive_format(name, function, extra_args=None, description=''): if not isinstance(element, (tuple, list)) or len(element) !=2: raise TypeError('extra_args elements are : (arg_name, value)') - _ARCHIVE_FORMATS[name] = (function, extra_args, description) + _ARCHIVE_FORMATS[name] = (function, extra_args, description, False) def unregister_archive_format(name): del _ARCHIVE_FORMATS[name] @@ -1074,36 +1099,38 @@ def make_archive(base_name, format, root_dir=None, base_dir=None, verbose=0, uses the current owner and group. """ sys.audit("shutil.make_archive", base_name, format, root_dir, base_dir) - save_cwd = os.getcwd() - if root_dir is not None: - if logger is not None: - logger.debug("changing into '%s'", root_dir) - base_name = os.path.abspath(base_name) - if not dry_run: - os.chdir(root_dir) - - if base_dir is None: - base_dir = os.curdir - - kwargs = {'dry_run': dry_run, 'logger': logger} - try: format_info = _ARCHIVE_FORMATS[format] except KeyError: raise ValueError("unknown archive format '%s'" % format) from None + kwargs = {'dry_run': dry_run, 'logger': logger, + 'owner': owner, 'group': group} + func = format_info[0] for arg, val in format_info[1]: kwargs[arg] = val - if format != 'zip': - kwargs['owner'] = owner - kwargs['group'] = group + if base_dir is None: + base_dir = os.curdir + + support_root_dir = format_info[3] + save_cwd = None + if root_dir is not None: + if support_root_dir: + kwargs['root_dir'] = root_dir + else: + save_cwd = os.getcwd() + if logger is not None: + logger.debug("changing into '%s'", root_dir) + base_name = os.path.abspath(base_name) + if not dry_run: + os.chdir(root_dir) try: filename = func(base_name, base_dir, **kwargs) finally: - if root_dir is not None: + if save_cwd is not None: if logger is not None: logger.debug("changing back to '%s'", save_cwd) os.chdir(save_cwd) @@ -1216,6 +1243,11 @@ def _unpack_tarfile(filename, extract_dir): finally: tarobj.close() +# Maps the name of the unpack format to a tuple containing: +# * extensions +# * the unpacking function +# * extra keyword arguments +# * description _UNPACK_FORMATS = { 'tar': (['.tar'], _unpack_tarfile, [], "uncompressed tar file"), 'zip': (['.zip'], _unpack_zipfile, [], "ZIP file"), diff --git a/Lib/site.py b/Lib/site.py index b11cd48e69e9a8..69670d9d7f2230 100644 --- a/Lib/site.py +++ b/Lib/site.py @@ -266,8 +266,8 @@ def _getuserbase(): if env_base: return env_base - # VxWorks has no home directories - if sys.platform == "vxworks": + # Emscripten, VxWorks, and WASI have no home directories + if sys.platform in {"emscripten", "vxworks", "wasi"}: return None def joinuser(*args): diff --git a/Lib/socket.py b/Lib/socket.py old mode 100755 new mode 100644 index e08fb620eb1be2..0ed86afb4a9ea5 --- a/Lib/socket.py +++ b/Lib/socket.py @@ -28,6 +28,7 @@ socket.setdefaulttimeout() -- set the default timeout value create_connection() -- connects to an address, with an optional timeout and optional source address. +create_server() -- create a TCP socket and bind it to a specified address. [*] not available on all platforms! @@ -122,7 +123,7 @@ def _intenum_converter(value, enum_klass): errorTab[10014] = "A fault occurred on the network??" # WSAEFAULT errorTab[10022] = "An invalid operation was attempted." errorTab[10024] = "Too many open files." - errorTab[10035] = "The socket operation would block" + errorTab[10035] = "The socket operation would block." errorTab[10036] = "A blocking operation is already in progress." errorTab[10037] = "Operation already in progress." 
errorTab[10038] = "Socket operation on nonsocket." diff --git a/Lib/sqlite3/__init__.py b/Lib/sqlite3/__init__.py index 5a2dbd360fb49f..927267cf0b92ff 100644 --- a/Lib/sqlite3/__init__.py +++ b/Lib/sqlite3/__init__.py @@ -55,17 +55,16 @@ """ from sqlite3.dbapi2 import * +from sqlite3.dbapi2 import (_deprecated_names, + _deprecated_version_info, + _deprecated_version) -# bpo-42264: OptimizedUnicode was deprecated in Python 3.10. It's scheduled -# for removal in Python 3.12. def __getattr__(name): - if name == "OptimizedUnicode": - import warnings - msg = (""" - OptimizedUnicode is deprecated and will be removed in Python 3.12. - Since Python 3.3 it has simply been an alias for 'str'. - """) - warnings.warn(msg, DeprecationWarning, stacklevel=2) - return str - raise AttributeError(f"module 'sqlite3' has no attribute '{name}'") + if name in _deprecated_names: + from warnings import warn + + warn(f"{name} is deprecated and will be removed in Python 3.14", + DeprecationWarning, stacklevel=2) + return globals()[f"_deprecated_{name}"] + raise AttributeError(f"module {__name__!r} has no attribute {name!r}") diff --git a/Lib/sqlite3/dbapi2.py b/Lib/sqlite3/dbapi2.py index 7cf4dd32d541dd..3b6d2f7ba2d595 100644 --- a/Lib/sqlite3/dbapi2.py +++ b/Lib/sqlite3/dbapi2.py @@ -25,6 +25,9 @@ import collections.abc from _sqlite3 import * +from _sqlite3 import _deprecated_version + +_deprecated_names = frozenset({"version", "version_info"}) paramstyle = "qmark" @@ -45,7 +48,7 @@ def TimeFromTicks(ticks): def TimestampFromTicks(ticks): return Timestamp(*time.localtime(ticks)[:6]) -version_info = tuple([int(x) for x in version.split(".")]) +_deprecated_version_info = tuple(map(int, _deprecated_version.split("."))) sqlite_version_info = tuple([int(x) for x in sqlite_version.split(".")]) Binary = memoryview @@ -82,20 +85,15 @@ def convert_timestamp(val): register_adapters_and_converters() -# bpo-24464: enable_shared_cache was deprecated in Python 3.10. It's -# scheduled for removal in Python 3.12. -def enable_shared_cache(enable): - from _sqlite3 import enable_shared_cache as _old_enable_shared_cache - import warnings - msg = ( - "enable_shared_cache is deprecated and will be removed in Python 3.12. " - "Shared cache is strongly discouraged by the SQLite 3 documentation. " - "If shared cache must be used, open the database in URI mode using" - "the cache=shared query parameter." 
- ) - warnings.warn(msg, DeprecationWarning, stacklevel=2) - return _old_enable_shared_cache(enable) - # Clean up namespace del(register_adapters_and_converters) + +def __getattr__(name): + if name in _deprecated_names: + from warnings import warn + + warn(f"{name} is deprecated and will be removed in Python 3.14", + DeprecationWarning, stacklevel=2) + return globals()[f"_deprecated_{name}"] + raise AttributeError(f"module {__name__!r} has no attribute {name!r}") diff --git a/Lib/sqlite3/dump.py b/Lib/sqlite3/dump.py index de9c368be3014e..07b9da10b920f9 100644 --- a/Lib/sqlite3/dump.py +++ b/Lib/sqlite3/dump.py @@ -28,9 +28,16 @@ def _iterdump(connection): ORDER BY "name" """ schema_res = cu.execute(q) + sqlite_sequence = [] for table_name, type, sql in schema_res.fetchall(): if table_name == 'sqlite_sequence': - yield('DELETE FROM "sqlite_sequence";') + rows = cu.execute('SELECT * FROM "sqlite_sequence";').fetchall() + sqlite_sequence = ['DELETE FROM "sqlite_sequence"'] + sqlite_sequence += [ + f'INSERT INTO "sqlite_sequence" VALUES(\'{row[0]}\',{row[1]})' + for row in rows + ] + continue elif table_name == 'sqlite_stat1': yield('ANALYZE "sqlite_master";') elif table_name.startswith('sqlite_'): @@ -67,4 +74,9 @@ def _iterdump(connection): for name, type, sql in schema_res.fetchall(): yield('{0};'.format(sql)) + # gh-79009: Yield statements concerning the sqlite_sequence table at the + # end of the transaction. + for row in sqlite_sequence: + yield('{0};'.format(row)) + yield('COMMIT;') diff --git a/Lib/ssl.py b/Lib/ssl.py index ebac1d60d52de7..02359a18c01e36 100644 --- a/Lib/ssl.py +++ b/Lib/ssl.py @@ -106,7 +106,7 @@ SSLSyscallError, SSLEOFError, SSLCertVerificationError ) from _ssl import txt2obj as _txt2obj, nid2obj as _nid2obj -from _ssl import RAND_status, RAND_add, RAND_bytes, RAND_pseudo_bytes +from _ssl import RAND_status, RAND_add, RAND_bytes try: from _ssl import RAND_egd except ImportError: @@ -373,68 +373,6 @@ def _ipaddress_match(cert_ipaddress, host_ip): return ip == host_ip -def match_hostname(cert, hostname): - """Verify that *cert* (in decoded format as returned by - SSLSocket.getpeercert()) matches the *hostname*. RFC 2818 and RFC 6125 - rules are followed. - - The function matches IP addresses rather than dNSNames if hostname is a - valid ipaddress string. IPv4 addresses are supported on all platforms. - IPv6 addresses are supported on platforms with IPv6 support (AF_INET6 - and inet_pton). - - CertificateError is raised on failure. On success, the function - returns nothing. - """ - warnings.warn( - "ssl.match_hostname() is deprecated", - category=DeprecationWarning, - stacklevel=2 - ) - if not cert: - raise ValueError("empty or no certificate, match_hostname needs a " - "SSL socket or SSL context with either " - "CERT_OPTIONAL or CERT_REQUIRED") - try: - host_ip = _inet_paton(hostname) - except ValueError: - # Not an IP address (common case) - host_ip = None - dnsnames = [] - san = cert.get('subjectAltName', ()) - for key, value in san: - if key == 'DNS': - if host_ip is None and _dnsname_match(value, hostname): - return - dnsnames.append(value) - elif key == 'IP Address': - if host_ip is not None and _ipaddress_match(value, host_ip): - return - dnsnames.append(value) - if not dnsnames: - # The subject is only checked when there is no dNSName entry - # in subjectAltName - for sub in cert.get('subject', ()): - for key, value in sub: - # XXX according to RFC 2818, the most specific Common Name - # must be used. 
- if key == 'commonName': - if _dnsname_match(value, hostname): - return - dnsnames.append(value) - if len(dnsnames) > 1: - raise CertificateError("hostname %r " - "doesn't match either of %s" - % (hostname, ', '.join(map(repr, dnsnames)))) - elif len(dnsnames) == 1: - raise CertificateError("hostname %r " - "doesn't match %r" - % (hostname, dnsnames[0])) - else: - raise CertificateError("no appropriate commonName or " - "subjectAltName fields were found") - - DefaultVerifyPaths = namedtuple("DefaultVerifyPaths", "cafile capath openssl_cafile_env openssl_cafile openssl_capath_env " "openssl_capath") diff --git a/Lib/subprocess.py b/Lib/subprocess.py index 6e61cc2e5e7b08..e10b01047ebef5 100644 --- a/Lib/subprocess.py +++ b/Lib/subprocess.py @@ -74,6 +74,9 @@ else: _mswindows = True +# wasm32-emscripten and wasm32-wasi do not support processes +_can_fork_exec = sys.platform not in {"emscripten", "wasi"} + if _mswindows: import _winapi from _winapi import (CREATE_NEW_CONSOLE, CREATE_NEW_PROCESS_GROUP, @@ -97,13 +100,10 @@ "CREATE_NO_WINDOW", "DETACHED_PROCESS", "CREATE_DEFAULT_ERROR_MODE", "CREATE_BREAKAWAY_FROM_JOB"]) else: - if sys.platform in {"emscripten", "wasi"}: - def _fork_exec(*args, **kwargs): - raise OSError( - errno.ENOTSUP, f"{sys.platform} does not support processes." - ) - else: + if _can_fork_exec: from _posixsubprocess import fork_exec as _fork_exec + else: + _fork_exec = None import select import selectors @@ -801,6 +801,11 @@ def __init__(self, args, bufsize=-1, executable=None, encoding=None, errors=None, text=None, umask=-1, pipesize=-1, process_group=None): """Create new Popen instance.""" + if not _can_fork_exec: + raise OSError( + errno.ENOTSUP, f"{sys.platform} does not support processes." + ) + _cleanup() # Held while anything is calling waitpid before returncode has been # updated to prevent clobbering returncode if wait() or poll() are diff --git a/Lib/symtable.py b/Lib/symtable.py index 75ff0921f4c0db..5dd71ffc6b4f19 100644 --- a/Lib/symtable.py +++ b/Lib/symtable.py @@ -111,7 +111,7 @@ def has_children(self): return bool(self._table.children) def get_identifiers(self): - """Return a list of names of symbols in the table. + """Return a view object containing the names of symbols in the table. """ return self._table.symbols.keys() diff --git a/Lib/sysconfig.py b/Lib/sysconfig.py index e21b7303fecc6b..bf926cf76807b7 100644 --- a/Lib/sysconfig.py +++ b/Lib/sysconfig.py @@ -111,8 +111,8 @@ def _getuserbase(): if env_base: return env_base - # VxWorks has no home directories - if sys.platform == "vxworks": + # Emscripten, VxWorks, and WASI have no home directories + if sys.platform in {"emscripten", "vxworks", "wasi"}: return None def joinuser(*args): diff --git a/Lib/tarfile.py b/Lib/tarfile.py index 8d43d0da7b9880..a08f247f496b3d 100755 --- a/Lib/tarfile.py +++ b/Lib/tarfile.py @@ -336,7 +336,8 @@ class _Stream: _Stream is intended to be used only internally. """ - def __init__(self, name, mode, comptype, fileobj, bufsize): + def __init__(self, name, mode, comptype, fileobj, bufsize, + compresslevel): """Construct a _Stream object. 
""" self._extfileobj = True @@ -371,7 +372,7 @@ def __init__(self, name, mode, comptype, fileobj, bufsize): self._init_read_gz() self.exception = zlib.error else: - self._init_write_gz() + self._init_write_gz(compresslevel) elif comptype == "bz2": try: @@ -383,7 +384,7 @@ def __init__(self, name, mode, comptype, fileobj, bufsize): self.cmp = bz2.BZ2Decompressor() self.exception = OSError else: - self.cmp = bz2.BZ2Compressor() + self.cmp = bz2.BZ2Compressor(compresslevel) elif comptype == "xz": try: @@ -410,13 +411,14 @@ def __del__(self): if hasattr(self, "closed") and not self.closed: self.close() - def _init_write_gz(self): + def _init_write_gz(self, compresslevel): """Initialize for writing with gzip compression. """ - self.cmp = self.zlib.compressobj(9, self.zlib.DEFLATED, - -self.zlib.MAX_WBITS, - self.zlib.DEF_MEM_LEVEL, - 0) + self.cmp = self.zlib.compressobj(compresslevel, + self.zlib.DEFLATED, + -self.zlib.MAX_WBITS, + self.zlib.DEF_MEM_LEVEL, + 0) timestamp = struct.pack(" 60.0: - raise Exception("failed to sync child in %.1f sec" % dt) + # synchronize the subprocess + start_time = time.monotonic() + for _ in support.sleeping_retry(support.LONG_TIMEOUT, error=False): try: lock_func(f, fcntl.LOCK_EX | fcntl.LOCK_NB) lock_func(f, fcntl.LOCK_UN) - time.sleep(0.01) except BlockingIOError: break + else: + dt = time.monotonic() - start_time + raise Exception("failed to sync child in %.1f sec" % dt) + # the child locked the file just a moment ago for 'sleep_time' seconds # that means that the lock below will block for 'sleep_time' minus some # potential context switch delay diff --git a/Lib/test/_test_embed_set_config.py b/Lib/test/_test_embed_set_config.py index e7b7106e17851d..7ff641b37bf188 100644 --- a/Lib/test/_test_embed_set_config.py +++ b/Lib/test/_test_embed_set_config.py @@ -236,10 +236,11 @@ def test_path(self): module_search_paths=['a', 'b', 'c']) self.assertEqual(sys.path, ['a', 'b', 'c']) - # Leave sys.path unchanged if module_search_paths_set=0 + # sys.path is reset if module_search_paths_set=0 self.set_config(module_search_paths_set=0, module_search_paths=['new_path']) - self.assertEqual(sys.path, ['a', 'b', 'c']) + self.assertNotEqual(sys.path, ['a', 'b', 'c']) + self.assertNotEqual(sys.path, ['new_path']) def test_argv(self): self.set_config(parse_argv=0, diff --git a/Lib/test/_test_multiprocessing.py b/Lib/test/_test_multiprocessing.py index 67bb17c0ede364..75969975af35ba 100644 --- a/Lib/test/_test_multiprocessing.py +++ b/Lib/test/_test_multiprocessing.py @@ -5,6 +5,7 @@ import unittest import unittest.mock import queue as pyqueue +import textwrap import time import io import itertools @@ -123,6 +124,8 @@ def _resource_unlink(name, rtype): # BaseManager.shutdown_timeout SHUTDOWN_TIMEOUT = support.SHORT_TIMEOUT +WAIT_ACTIVE_CHILDREN_TIMEOUT = 5.0 + HAVE_GETVALUE = not getattr(_multiprocessing, 'HAVE_BROKEN_SEM_GETVALUE', False) @@ -2860,6 +2863,11 @@ def test_pool_worker_lifetime_early_close(self): for (j, res) in enumerate(results): self.assertEqual(res.get(), sqr(j)) + def test_pool_maxtasksperchild_invalid(self): + for value in [0, -1, 0.5, "12"]: + with self.assertRaises(ValueError): + multiprocessing.Pool(3, maxtasksperchild=value) + def test_worker_finalization_via_atexit_handler_of_multiprocessing(self): # tests cases against bpo-38744 and bpo-39360 cmd = '''if 1: @@ -4312,18 +4320,13 @@ def test_shared_memory_cleaned_after_process_termination(self): p.terminate() p.wait() - deadline = time.monotonic() + support.LONG_TIMEOUT - t = 0.1 - while time.monotonic() 
< deadline: - time.sleep(t) - t = min(t*2, 5) + err_msg = ("A SharedMemory segment was leaked after " + "a process was abruptly terminated") + for _ in support.sleeping_retry(support.LONG_TIMEOUT, err_msg): try: smm = shared_memory.SharedMemory(name, create=False) except FileNotFoundError: break - else: - raise AssertionError("A SharedMemory segment was leaked after" - " a process was abruptly terminated.") if os.name == 'posix': # Without this line it was raising warnings like: @@ -5333,9 +5336,10 @@ def create_and_register_resource(rtype): p.terminate() p.wait() - deadline = time.monotonic() + support.LONG_TIMEOUT - while time.monotonic() < deadline: - time.sleep(.5) + err_msg = (f"A {rtype} resource was leaked after a process was " + f"abruptly terminated") + for _ in support.sleeping_retry(support.SHORT_TIMEOUT, + err_msg): try: _resource_unlink(name2, rtype) except OSError as e: @@ -5343,10 +5347,7 @@ def create_and_register_resource(rtype): # EINVAL self.assertIn(e.errno, (errno.ENOENT, errno.EINVAL)) break - else: - raise AssertionError( - f"A {rtype} resource was leaked after a process was " - f"abruptly terminated.") + err = p.stderr.read().decode('utf-8') p.stderr.close() expected = ('resource_tracker: There appear to be 2 leaked {} ' @@ -5574,18 +5575,18 @@ def wait_proc_exit(self): # but this can take a bit on slow machines, so wait a few seconds # if there are other children too (see #17395). join_process(self.proc) + + timeout = WAIT_ACTIVE_CHILDREN_TIMEOUT start_time = time.monotonic() - t = 0.01 - while len(multiprocessing.active_children()) > 1: - time.sleep(t) - t *= 2 - dt = time.monotonic() - start_time - if dt >= 5.0: - test.support.environment_altered = True - support.print_warning(f"multiprocessing.Manager still has " - f"{multiprocessing.active_children()} " - f"active children after {dt} seconds") + for _ in support.sleeping_retry(timeout, error=False): + if len(multiprocessing.active_children()) <= 1: break + else: + dt = time.monotonic() - start_time + support.environment_altered = True + support.print_warning(f"multiprocessing.Manager still has " + f"{multiprocessing.active_children()} " + f"active children after {dt:.1f} seconds") def run_worker(self, worker, obj): self.proc = multiprocessing.Process(target=worker, args=(obj, )) @@ -5760,6 +5761,35 @@ def test_namespace(self): self.run_worker(self._test_namespace, o) +class TestNamedResource(unittest.TestCase): + def test_global_named_resource_spawn(self): + # + # gh-90549: Check that global named resources in main module + # will not leak by a subprocess, in spawn context. 
+ # + testfn = os_helper.TESTFN + self.addCleanup(os_helper.unlink, testfn) + with open(testfn, 'w', encoding='utf-8') as f: + f.write(textwrap.dedent('''\ + import multiprocessing as mp + + ctx = mp.get_context('spawn') + + global_resource = ctx.Semaphore() + + def submain(): pass + + if __name__ == '__main__': + p = ctx.Process(target=submain) + p.start() + p.join() + ''')) + rc, out, err = test.support.script_helper.assert_python_ok(testfn) + # on error, err = 'UserWarning: resource_tracker: There appear to + # be 1 leaked semaphore objects to clean up at shutdown' + self.assertEqual(err, b'') + + class MiscTestCase(unittest.TestCase): def test__all__(self): # Just make sure names in not_exported are excluded @@ -5853,18 +5883,17 @@ def tearDownClass(cls): # only the manager process should be returned by active_children() # but this can take a bit on slow machines, so wait a few seconds # if there are other children too (see #17395) + timeout = WAIT_ACTIVE_CHILDREN_TIMEOUT start_time = time.monotonic() - t = 0.01 - while len(multiprocessing.active_children()) > 1: - time.sleep(t) - t *= 2 - dt = time.monotonic() - start_time - if dt >= 5.0: - test.support.environment_altered = True - support.print_warning(f"multiprocessing.Manager still has " - f"{multiprocessing.active_children()} " - f"active children after {dt} seconds") + for _ in support.sleeping_retry(timeout, error=False): + if len(multiprocessing.active_children()) <= 1: break + else: + dt = time.monotonic() - start_time + support.environment_altered = True + support.print_warning(f"multiprocessing.Manager still has " + f"{multiprocessing.active_children()} " + f"active children after {dt:.1f} seconds") gc.collect() # do garbage collection if cls.manager._number_of_objects() != 0: diff --git a/Lib/test/_testcppext.cpp b/Lib/test/_testcppext.cpp index dc40f0ee9eb1cb..b6d35407a61edc 100644 --- a/Lib/test/_testcppext.cpp +++ b/Lib/test/_testcppext.cpp @@ -6,6 +6,12 @@ #include "Python.h" +#if __cplusplus >= 201103 +# define NAME _testcpp11ext +#else +# define NAME _testcpp03ext +#endif + PyDoc_STRVAR(_testcppext_add_doc, "add(x, y)\n" "\n" @@ -16,20 +22,42 @@ _testcppext_add(PyObject *Py_UNUSED(module), PyObject *args) { long i, j; if (!PyArg_ParseTuple(args, "ll:foo", &i, &j)) { - return nullptr; + return _Py_NULL; } long res = i + j; return PyLong_FromLong(res); } +// Class to test operator casting an object to PyObject* +class StrongRef +{ +public: + StrongRef(PyObject *obj) : m_obj(obj) { + Py_INCREF(this->m_obj); + } + + ~StrongRef() { + Py_DECREF(this->m_obj); + } + + // Cast to PyObject*: get a borrowed reference + inline operator PyObject*() const { return this->m_obj; } + +private: + PyObject *m_obj; // Strong reference +}; + + static PyObject * test_api_casts(PyObject *Py_UNUSED(module), PyObject *Py_UNUSED(args)) { PyObject *obj = Py_BuildValue("(ii)", 1, 2); - if (obj == nullptr) { - return nullptr; + if (obj == _Py_NULL) { + return _Py_NULL; } + Py_ssize_t refcnt = Py_REFCNT(obj); + assert(refcnt >= 1); // gh-92138: For backward compatibility, functions of Python C API accepts // "const PyObject*". 
Check that using it does not emit C++ compiler @@ -38,22 +66,69 @@ test_api_casts(PyObject *Py_UNUSED(module), PyObject *Py_UNUSED(args)) Py_INCREF(const_obj); Py_DECREF(const_obj); PyTypeObject *type = Py_TYPE(const_obj); - assert(Py_REFCNT(const_obj) >= 1); - + assert(Py_REFCNT(const_obj) == refcnt); assert(type == &PyTuple_Type); assert(PyTuple_GET_SIZE(const_obj) == 2); PyObject *one = PyTuple_GET_ITEM(const_obj, 0); assert(PyLong_AsLong(one) == 1); + // gh-92898: StrongRef doesn't inherit from PyObject but has an operator to + // cast to PyObject*. + StrongRef strong_ref(obj); + assert(Py_TYPE(strong_ref) == &PyTuple_Type); + assert(Py_REFCNT(strong_ref) == (refcnt + 1)); + Py_INCREF(strong_ref); + Py_DECREF(strong_ref); + + // gh-93442: Pass 0 as NULL for PyObject* + Py_XINCREF(0); + Py_XDECREF(0); +#if _cplusplus >= 201103 + // Test nullptr passed as PyObject* + Py_XINCREF(nullptr); + Py_XDECREF(nullptr); +#endif + Py_DECREF(obj); Py_RETURN_NONE; } +static PyObject * +test_unicode(PyObject *Py_UNUSED(module), PyObject *Py_UNUSED(args)) +{ + PyObject *str = PyUnicode_FromString("abc"); + if (str == _Py_NULL) { + return _Py_NULL; + } + + assert(PyUnicode_Check(str)); + assert(PyUnicode_GET_LENGTH(str) == 3); + + // gh-92800: test PyUnicode_READ() + const void* data = PyUnicode_DATA(str); + assert(data != _Py_NULL); + int kind = PyUnicode_KIND(str); + assert(kind == PyUnicode_1BYTE_KIND); + assert(PyUnicode_READ(kind, data, 0) == 'a'); + + // gh-92800: test PyUnicode_READ() casts + const void* const_data = PyUnicode_DATA(str); + unsigned int ukind = static_cast(kind); + assert(PyUnicode_READ(ukind, const_data, 2) == 'c'); + + assert(PyUnicode_READ_CHAR(str, 1) == 'b'); + + Py_DECREF(str); + Py_RETURN_NONE; +} + + static PyMethodDef _testcppext_methods[] = { {"add", _testcppext_add, METH_VARARGS, _testcppext_add_doc}, - {"test_api_casts", test_api_casts, METH_NOARGS, nullptr}, - {nullptr, nullptr, 0, nullptr} /* sentinel */ + {"test_api_casts", test_api_casts, METH_NOARGS, _Py_NULL}, + {"test_unicode", test_unicode, METH_NOARGS, _Py_NULL}, + {_Py_NULL, _Py_NULL, 0, _Py_NULL} /* sentinel */ }; @@ -68,26 +143,32 @@ _testcppext_exec(PyObject *module) static PyModuleDef_Slot _testcppext_slots[] = { {Py_mod_exec, reinterpret_cast(_testcppext_exec)}, - {0, nullptr} + {0, _Py_NULL} }; PyDoc_STRVAR(_testcppext_doc, "C++ test extension."); +#define _STR(NAME) #NAME +#define STR(NAME) _STR(NAME) + static struct PyModuleDef _testcppext_module = { PyModuleDef_HEAD_INIT, // m_base - "_testcppext", // m_name + STR(NAME), // m_name _testcppext_doc, // m_doc 0, // m_size _testcppext_methods, // m_methods _testcppext_slots, // m_slots - nullptr, // m_traverse - nullptr, // m_clear - nullptr, // m_free + _Py_NULL, // m_traverse + _Py_NULL, // m_clear + _Py_NULL, // m_free }; +#define _FUNC_NAME(NAME) PyInit_ ## NAME +#define FUNC_NAME(NAME) _FUNC_NAME(NAME) + PyMODINIT_FUNC -PyInit__testcppext(void) +FUNC_NAME(NAME)(void) { return PyModuleDef_Init(&_testcppext_module); } diff --git a/Lib/test/datetimetester.py b/Lib/test/datetimetester.py index 8e4bcc7c9efdca..311f9db10ffefa 100644 --- a/Lib/test/datetimetester.py +++ b/Lib/test/datetimetester.py @@ -7,7 +7,6 @@ import bisect import copy import decimal -import functools import sys import os import pickle diff --git a/Lib/test/doctest_lineno.py b/Lib/test/doctest_lineno.py new file mode 100644 index 00000000000000..be198513a1502c --- /dev/null +++ b/Lib/test/doctest_lineno.py @@ -0,0 +1,50 @@ +# This module is used in `test_doctest`. 
+# It must not have a docstring. + +def func_with_docstring(): + """Some unrelated info.""" + + +def func_without_docstring(): + pass + + +def func_with_doctest(): + """ + This function really contains a test case. + + >>> func_with_doctest.__name__ + 'func_with_doctest' + """ + return 3 + + +class ClassWithDocstring: + """Some unrelated class information.""" + + +class ClassWithoutDocstring: + pass + + +class ClassWithDoctest: + """This class really has a test case in it. + + >>> ClassWithDoctest.__name__ + 'ClassWithDoctest' + """ + + +class MethodWrapper: + def method_with_docstring(self): + """Method with a docstring.""" + + def method_without_docstring(self): + pass + + def method_with_doctest(self): + """ + This has a doctest! + >>> MethodWrapper.method_with_doctest.__name__ + 'method_with_doctest' + """ diff --git a/Lib/test/fork_wait.py b/Lib/test/fork_wait.py index 4d3dbd8e83f5a9..ebd07e612e5810 100644 --- a/Lib/test/fork_wait.py +++ b/Lib/test/fork_wait.py @@ -54,10 +54,8 @@ def test_wait(self): self.threads.append(thread) # busy-loop to wait for threads - deadline = time.monotonic() + support.SHORT_TIMEOUT - while len(self.alive) < NUM_THREADS: - time.sleep(0.1) - if deadline < time.monotonic(): + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if len(self.alive) >= NUM_THREADS: break a = sorted(self.alive.keys()) diff --git a/Lib/test/leakers/test_ctypes.py b/Lib/test/leakers/test_ctypes.py index 7d7e9ff3a11129..ec09ac36996b0c 100644 --- a/Lib/test/leakers/test_ctypes.py +++ b/Lib/test/leakers/test_ctypes.py @@ -1,5 +1,5 @@ -# Taken from Lib/ctypes/test/test_keeprefs.py, PointerToStructure.test(). +# Taken from Lib/test/test_ctypes/test_keeprefs.py, PointerToStructure.test(). from ctypes import Structure, c_int, POINTER import gc diff --git a/Lib/test/libregrtest/cmdline.py b/Lib/test/libregrtest/cmdline.py index 1ac63af734c412..ebe57920d9185c 100644 --- a/Lib/test/libregrtest/cmdline.py +++ b/Lib/test/libregrtest/cmdline.py @@ -1,5 +1,6 @@ import argparse import os +import shlex import sys from test.support import os_helper @@ -372,8 +373,11 @@ def _parse_args(args, **kwargs): parser.error("-s and -f don't go together!") if ns.use_mp is not None and ns.trace: parser.error("-T and -j don't go together!") - if ns.python is not None and ns.use_mp is None: - parser.error("-p requires -j!") + if ns.python is not None: + if ns.use_mp is None: + parser.error("-p requires -j!") + # The "executable" may be two or more parts, e.g. "node python.js" + ns.python = shlex.split(ns.python) if ns.failfast and not (ns.verbose or ns.verbose3): parser.error("-G/--failfast needs either -v or -W") if ns.pgo and (ns.verbose or ns.verbose2 or ns.verbose3): diff --git a/Lib/test/libregrtest/main.py b/Lib/test/libregrtest/main.py index 0cacccfc0b5e39..655e4d2e56f8f0 100644 --- a/Lib/test/libregrtest/main.py +++ b/Lib/test/libregrtest/main.py @@ -306,13 +306,22 @@ def list_cases(self): printlist(self.skipped, file=sys.stderr) def rerun_failed_tests(self): + self.log() + + if self.ns.python: + # Temp patch for https://github.com/python/cpython/issues/94052 + self.log( + "Re-running failed tests is not supported with --python " + "host runner option." 
+ ) + return + self.ns.verbose = True self.ns.failfast = False self.ns.verbose3 = False self.first_result = self.get_tests_result() - self.log() self.log("Re-running failed tests in verbose mode") rerun_list = list(self.need_rerun) self.need_rerun.clear() @@ -600,6 +609,16 @@ def save_xml_result(self): for s in ET.tostringlist(root): f.write(s) + def fix_umask(self): + if support.is_emscripten: + # Emscripten has default umask 0o777, which breaks some tests. + # see https://github.com/emscripten-core/emscripten/issues/17269 + old_mask = os.umask(0) + if old_mask == 0o777: + os.umask(0o027) + else: + os.umask(old_mask) + def set_temp_dir(self): if self.ns.tempdir: self.tmp_dir = self.ns.tempdir @@ -628,11 +647,16 @@ def create_temp_dir(self): # Define a writable temp dir that will be used as cwd while running # the tests. The name of the dir includes the pid to allow parallel # testing (see the -j option). - pid = os.getpid() + # Emscripten and WASI have stubbed getpid(), Emscripten has only + # milisecond clock resolution. Use randint() instead. + if sys.platform in {"emscripten", "wasi"}: + nounce = random.randint(0, 1_000_000) + else: + nounce = os.getpid() if self.worker_test_name is not None: - test_cwd = 'test_python_worker_{}'.format(pid) + test_cwd = 'test_python_worker_{}'.format(nounce) else: - test_cwd = 'test_python_{}'.format(pid) + test_cwd = 'test_python_{}'.format(nounce) test_cwd += os_helper.FS_NONASCII test_cwd = os.path.join(self.tmp_dir, test_cwd) return test_cwd @@ -655,6 +679,8 @@ def main(self, tests=None, **kwargs): self.set_temp_dir() + self.fix_umask() + if self.ns.cleanup: self.cleanup() sys.exit(0) diff --git a/Lib/test/libregrtest/runtest.py b/Lib/test/libregrtest/runtest.py index 62cf1a3f1175ee..e9bb72a7d77ee1 100644 --- a/Lib/test/libregrtest/runtest.py +++ b/Lib/test/libregrtest/runtest.py @@ -80,6 +80,11 @@ class EnvChanged(Failed): def __str__(self) -> str: return f"{self.name} failed (env changed)" + # Convert Passed to EnvChanged + @staticmethod + def from_passed(other): + return EnvChanged(other.name, other.duration_sec, other.xml_data) + class RefLeak(Failed): def __str__(self) -> str: diff --git a/Lib/test/libregrtest/runtest_mp.py b/Lib/test/libregrtest/runtest_mp.py index 39fab5566a089a..6ebabb86873bcb 100644 --- a/Lib/test/libregrtest/runtest_mp.py +++ b/Lib/test/libregrtest/runtest_mp.py @@ -1,11 +1,11 @@ import faulthandler import json -import os +import os.path import queue -import shlex import signal import subprocess import sys +import tempfile import threading import time import traceback @@ -17,7 +17,8 @@ from test.libregrtest.cmdline import Namespace from test.libregrtest.main import Regrtest from test.libregrtest.runtest import ( - runtest, is_failed, TestResult, Interrupted, Timeout, ChildError, PROGRESS_MIN_TIME) + runtest, is_failed, TestResult, Interrupted, Timeout, ChildError, + PROGRESS_MIN_TIME, Passed, EnvChanged) from test.libregrtest.setup import setup_tests from test.libregrtest.utils import format_duration, print_warning @@ -52,13 +53,12 @@ def parse_worker_args(worker_args) -> tuple[Namespace, str]: return (ns, test_name) -def run_test_in_subprocess(testname: str, ns: Namespace) -> subprocess.Popen: +def run_test_in_subprocess(testname: str, ns: Namespace, tmp_dir: str) -> subprocess.Popen: ns_dict = vars(ns) worker_args = (ns_dict, testname) worker_args = json.dumps(worker_args) if ns.python is not None: - # The "executable" may be two or more parts, e.g. 
"node python.js" - executable = shlex.split(ns.python) + executable = ns.python else: executable = [sys.executable] cmd = [*executable, *support.args_from_interpreter_flags(), @@ -66,10 +66,16 @@ def run_test_in_subprocess(testname: str, ns: Namespace) -> subprocess.Popen: '-m', 'test.regrtest', '--worker-args', worker_args] + env = dict(os.environ) + if tmp_dir is not None: + env['TMPDIR'] = tmp_dir + env['TEMP'] = tmp_dir + env['TMP'] = tmp_dir + # Running the child from the same working directory as regrtest's original # invocation ensures that TEMPDIR for the child is the same when # sysconfig.is_python_build() is true. See issue 15300. - kw = {} + kw = {'env': env} if USE_PROCESS_GROUP: kw['start_new_session'] = True return subprocess.Popen(cmd, @@ -206,12 +212,12 @@ def mp_result_error( test_result.duration_sec = time.monotonic() - self.start_time return MultiprocessResult(test_result, stdout, err_msg) - def _run_process(self, test_name: str) -> tuple[int, str, str]: + def _run_process(self, test_name: str, tmp_dir: str) -> tuple[int, str, str]: self.start_time = time.monotonic() self.current_test_name = test_name try: - popen = run_test_in_subprocess(test_name, self.ns) + popen = run_test_in_subprocess(test_name, self.ns, tmp_dir) self._killed = False self._popen = popen @@ -266,7 +272,23 @@ def _run_process(self, test_name: str) -> tuple[int, str, str]: self.current_test_name = None def _runtest(self, test_name: str) -> MultiprocessResult: - retcode, stdout = self._run_process(test_name) + # Don't check for leaked temporary files and directories if Python is + # run on WASI. WASI don't pass environment variables like TMPDIR to + # worker processes. + if not support.is_wasi: + # gh-93353: Check for leaked temporary files in the parent process, + # since the deletion of temporary files can happen late during + # Python finalization: too late for libregrtest. 
+ tmp_dir = tempfile.mkdtemp(prefix="test_python_") + tmp_dir = os.path.abspath(tmp_dir) + try: + retcode, stdout = self._run_process(test_name, tmp_dir) + finally: + tmp_files = os.listdir(tmp_dir) + os_helper.rmtree(tmp_dir) + else: + retcode, stdout = self._run_process(test_name, None) + tmp_files = () if retcode is None: return self.mp_result_error(Timeout(test_name), stdout) @@ -289,6 +311,14 @@ def _runtest(self, test_name: str) -> MultiprocessResult: if err_msg is not None: return self.mp_result_error(ChildError(test_name), stdout, err_msg) + if tmp_files: + msg = (f'\n\n' + f'Warning -- {test_name} leaked temporary files ' + f'({len(tmp_files)}): {", ".join(sorted(tmp_files))}') + stdout += msg + if isinstance(result, Passed): + result = EnvChanged.from_passed(result) + return MultiprocessResult(result, stdout, err_msg) def run(self) -> None: diff --git a/Lib/test/libregrtest/setup.py b/Lib/test/libregrtest/setup.py index cfc1b82d2785cb..4298c922723407 100644 --- a/Lib/test/libregrtest/setup.py +++ b/Lib/test/libregrtest/setup.py @@ -141,7 +141,7 @@ def _adjust_resource_limits(): """Adjust the system resource limits (ulimit) if needed.""" try: import resource - from resource import RLIMIT_NOFILE, RLIM_INFINITY + from resource import RLIMIT_NOFILE except ImportError: return fd_limit, max_fds = resource.getrlimit(RLIMIT_NOFILE) diff --git a/Lib/test/list_tests.py b/Lib/test/list_tests.py index f7eea88c54a6ae..fe3ee80b8d461f 100644 --- a/Lib/test/list_tests.py +++ b/Lib/test/list_tests.py @@ -3,10 +3,9 @@ """ import sys -import os from functools import cmp_to_key -from test import support, seq_tests +from test import seq_tests from test.support import ALWAYS_EQ, NEVER_EQ diff --git a/Lib/test/lock_tests.py b/Lib/test/lock_tests.py index f16c7ed952cf55..a4f52cb20ad301 100644 --- a/Lib/test/lock_tests.py +++ b/Lib/test/lock_tests.py @@ -2,7 +2,6 @@ Various tests for synchronization primitives. 
""" -import os import gc import sys import time diff --git a/Lib/test/pickletester.py b/Lib/test/pickletester.py index d0ea7d0e55e7d5..21419e11c87497 100644 --- a/Lib/test/pickletester.py +++ b/Lib/test/pickletester.py @@ -3035,6 +3035,67 @@ def check_array(arr): # 2-D, non-contiguous check_array(arr[::2]) + def test_evil_class_mutating_dict(self): + # https://github.com/python/cpython/issues/92930 + from random import getrandbits + + global Bad + class Bad: + def __eq__(self, other): + return ENABLED + def __hash__(self): + return 42 + def __reduce__(self): + if getrandbits(6) == 0: + collection.clear() + return (Bad, ()) + + for proto in protocols: + for _ in range(20): + ENABLED = False + collection = {Bad(): Bad() for _ in range(20)} + for bad in collection: + bad.bad = bad + bad.collection = collection + ENABLED = True + try: + data = self.dumps(collection, proto) + self.loads(data) + except RuntimeError as e: + expected = "changed size during iteration" + self.assertIn(expected, str(e)) + + def test_evil_pickler_mutating_collection(self): + # https://github.com/python/cpython/issues/92930 + if not hasattr(self, "pickler"): + raise self.skipTest(f"{type(self)} has no associated pickler type") + + global Clearer + class Clearer: + pass + + def check(collection): + class EvilPickler(self.pickler): + def persistent_id(self, obj): + if isinstance(obj, Clearer): + collection.clear() + return None + pickler = EvilPickler(io.BytesIO(), proto) + try: + pickler.dump(collection) + except RuntimeError as e: + expected = "changed size during iteration" + self.assertIn(expected, str(e)) + + for proto in protocols: + check([Clearer()]) + check([Clearer(), Clearer()]) + check({Clearer()}) + check({Clearer(), Clearer()}) + check({Clearer(): 1}) + check({Clearer(): 1, Clearer(): 2}) + check({1: Clearer(), 2: Clearer()}) + class BigmemPickleTests: diff --git a/Lib/test/pythoninfo.py b/Lib/test/pythoninfo.py index 28549a645b0f57..2339e0049ef6ea 100644 --- a/Lib/test/pythoninfo.py +++ b/Lib/test/pythoninfo.py @@ -10,6 +10,9 @@ import warnings +MS_WINDOWS = (sys.platform == 'win32') + + def normalize_text(text): if text is None: return None @@ -127,13 +130,21 @@ def collect_sys(info_add): encoding = '%s/%s' % (encoding, errors) info_add('sys.%s.encoding' % name, encoding) - # Were we compiled --with-pydebug or with #define Py_DEBUG? + # Were we compiled --with-pydebug? Py_DEBUG = hasattr(sys, 'gettotalrefcount') if Py_DEBUG: text = 'Yes (sys.gettotalrefcount() present)' else: text = 'No (sys.gettotalrefcount() missing)' - info_add('Py_DEBUG', text) + info_add('build.Py_DEBUG', text) + + # Were we compiled --with-trace-refs? 
+ Py_TRACE_REFS = hasattr(sys, 'getobjects') + if Py_TRACE_REFS: + text = 'Yes (sys.getobjects() present)' + else: + text = 'No (sys.getobjects() missing)' + info_add('build.Py_TRACE_REFS', text) def collect_platform(info_add): @@ -455,6 +466,11 @@ def collect_datetime(info_add): def collect_sysconfig(info_add): + # On Windows, sysconfig is not reliable to get macros used + # to build Python + if MS_WINDOWS: + return + import sysconfig for name in ( @@ -488,6 +504,28 @@ def collect_sysconfig(info_add): value = normalize_text(value) info_add('sysconfig[%s]' % name, value) + PY_CFLAGS = sysconfig.get_config_var('PY_CFLAGS') + NDEBUG = (PY_CFLAGS and '-DNDEBUG' in PY_CFLAGS) + if NDEBUG: + text = 'ignore assertions (macro defined)' + else: + text = 'build assertions (macro not defined)' + info_add('build.NDEBUG', text) + + for name in ( + 'WITH_DOC_STRINGS', + 'WITH_DTRACE', + 'WITH_FREELISTS', + 'WITH_PYMALLOC', + 'WITH_VALGRIND', + ): + value = sysconfig.get_config_var(name) + if value: + text = 'Yes' + else: + text = 'No' + info_add(f'build.{name}', text) + def collect_ssl(info_add): import os @@ -543,10 +581,19 @@ def format_attr(attr, value): def collect_socket(info_add): - import socket + try: + import socket + except ImportError: + return - hostname = socket.gethostname() - info_add('socket.hostname', hostname) + try: + hostname = socket.gethostname() + except OSError: + # WASI SDK 15.0 does not have gethostname(2). + if sys.platform != "wasi": + raise + else: + info_add('socket.hostname', hostname) def collect_sqlite(info_add): @@ -596,7 +643,6 @@ def collect_testcapi(info_add): return call_func(info_add, 'pymem.allocator', _testcapi, 'pymem_getallocatorsname') - copy_attr(info_add, 'pymem.with_pymalloc', _testcapi, 'WITH_PYMALLOC') def collect_resource(info_add): @@ -638,6 +684,13 @@ def collect_test_support(info_add): call_func(info_add, 'test_support._is_gui_available', support, '_is_gui_available') call_func(info_add, 'test_support.python_is_optimized', support, 'python_is_optimized') + info_add('test_support.check_sanitizer(address=True)', + support.check_sanitizer(address=True)) + info_add('test_support.check_sanitizer(memory=True)', + support.check_sanitizer(memory=True)) + info_add('test_support.check_sanitizer(ub=True)', + support.check_sanitizer(ub=True)) + def collect_cc(info_add): import subprocess diff --git a/Lib/test/setup_testcppext.py b/Lib/test/setup_testcppext.py index 780cb7b24a78c9..892e24a6376c5e 100644 --- a/Lib/test/setup_testcppext.py +++ b/Lib/test/setup_testcppext.py @@ -13,16 +13,12 @@ if not MS_WINDOWS: # C++ compiler flags for GCC and clang CPPFLAGS = [ - # Python currently targets C++11 - '-std=c++11', # gh-91321: The purpose of _testcppext extension is to check that building # a C++ extension using the Python C API does not emit C++ compiler # warnings '-Werror', # Warn on old-style cast (C cast) like: (PyObject*)op '-Wold-style-cast', - # Warn when using NULL rather than _Py_NULL in static inline functions - '-Wzero-as-null-pointer-constant', ] else: # Don't pass any compiler flag to MSVC @@ -30,12 +26,27 @@ def main(): + cppflags = list(CPPFLAGS) + if '-std=c++03' in sys.argv: + sys.argv.remove('-std=c++03') + std = 'c++03' + name = '_testcpp03ext' + else: + # Python currently targets C++11 + std = 'c++11' + name = '_testcpp11ext' + + cppflags = [*CPPFLAGS, f'-std={std}'] + if std == 'c++11': + # Warn when using NULL rather than _Py_NULL in static inline functions + cppflags.append('-Wzero-as-null-pointer-constant') + cpp_ext = Extension( -
'_testcppext', + name, sources=[SOURCE], language='c++', - extra_compile_args=CPPFLAGS) - setup(name="_testcppext", ext_modules=[cpp_ext]) + extra_compile_args=cppflags) + setup(name=name, ext_modules=[cpp_ext]) if __name__ == "__main__": diff --git a/Lib/test/signalinterproctester.py b/Lib/test/signalinterproctester.py index bc60b747f71680..cdcd92a8baace6 100644 --- a/Lib/test/signalinterproctester.py +++ b/Lib/test/signalinterproctester.py @@ -28,16 +28,15 @@ def wait_signal(self, child, signame): # (if set) child.wait() - timeout = support.SHORT_TIMEOUT - deadline = time.monotonic() + timeout - - while time.monotonic() < deadline: + start_time = time.monotonic() + for _ in support.busy_retry(support.SHORT_TIMEOUT, error=False): if self.got_signals[signame]: return signal.pause() - - self.fail('signal %s not received after %s seconds' - % (signame, timeout)) + else: + dt = time.monotonic() - start_time + self.fail('signal %s not received after %.1f seconds' + % (signame, dt)) def subprocess_send_signal(self, pid, signame): code = 'import os, signal; os.kill(%s, signal.%s)' % (pid, signame) diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py index 41502cf4e97be8..c51a1f26f29a6b 100644 --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -59,6 +59,7 @@ "run_with_tz", "PGO", "missing_compiler_executable", "ALWAYS_EQ", "NEVER_EQ", "LARGEST", "SMALLEST", "LOOPBACK_TIMEOUT", "INTERNET_TIMEOUT", "SHORT_TIMEOUT", "LONG_TIMEOUT", + "Py_DEBUG", ] @@ -199,6 +200,11 @@ def get_original_stdout(): def _force_run(path, func, *args): try: return func(*args) + except FileNotFoundError as err: + # chmod() won't fix a missing file. + if verbose >= 2: + print('%s: %s' % (err.__class__.__name__, err)) + raise except OSError as err: if verbose >= 2: print('%s: %s' % (err.__class__.__name__, err)) @@ -299,6 +305,8 @@ def requires(resource, msg=None): if msg is None: msg = "Use of the %r resource not enabled" % resource raise ResourceDenied(msg) + if resource in {"network", "urlfetch"} and not has_socket_support: + raise ResourceDenied("No socket support") if resource == 'gui' and not _is_gui_available(): raise ResourceDenied(_is_gui_available.reason) @@ -521,7 +529,7 @@ def requires_subprocess(): """Used for subprocess, os.spawn calls, fd inheritance""" return unittest.skipUnless(has_subprocess_support, "requires subprocess support") -# Emscripten's socket emulation has limitation. WASI doesn't have sockets yet. +# Emscripten's socket emulation and WASI sockets have limitations. has_socket_support = not is_emscripten and not is_wasi def requires_working_socket(*, module=False): @@ -1760,6 +1768,16 @@ def cleanup(): setattr(object_to_patch, attr_name, new_value) +@contextlib.contextmanager +def patch_list(orig): + """Like unittest.mock.patch.dict, but for lists.""" + try: + saved = orig[:] + yield + finally: + orig[:] = saved + + def run_in_subinterp(code): """ Run code in a subinterpreter. 
Raise unittest.SkipTest if the tracemalloc @@ -2064,31 +2082,26 @@ def wait_process(pid, *, exitcode, timeout=None): if timeout is None: timeout = SHORT_TIMEOUT - t0 = time.monotonic() - sleep = 0.001 - max_sleep = 0.1 - while True: + + start_time = time.monotonic() + for _ in sleeping_retry(timeout, error=False): pid2, status = os.waitpid(pid, os.WNOHANG) if pid2 != 0: break - # process is still running - - dt = time.monotonic() - t0 - if dt > SHORT_TIMEOUT: - try: - os.kill(pid, signal.SIGKILL) - os.waitpid(pid, 0) - except OSError: - # Ignore errors like ChildProcessError or PermissionError - pass - - raise AssertionError(f"process {pid} is still running " - f"after {dt:.1f} seconds") + # retry: the process is still running + else: + try: + os.kill(pid, signal.SIGKILL) + os.waitpid(pid, 0) + except OSError: + # Ignore errors like ChildProcessError or PermissionError + pass - sleep = min(sleep * 2, max_sleep) - time.sleep(sleep) + dt = time.monotonic() - start_time + raise AssertionError(f"process {pid} is still running " + f"after {dt:.1f} seconds") else: - # Windows implementation + # Windows implementation: doesn't support timeout :-( pid2, status = os.waitpid(pid, 0) exitcode2 = os.waitstatus_to_exitcode(status) @@ -2183,3 +2196,138 @@ def clear_ignored_deprecations(*tokens: object) -> None: if warnings.filters != new_filters: warnings.filters[:] = new_filters warnings._filters_mutated() + + +# Skip a test if venv with pip is known to not work. +def requires_venv_with_pip(): + # ensurepip requires zlib to open ZIP archives (.whl binary wheel packages) + try: + import zlib + except ImportError: + return unittest.skipIf(True, "venv: ensurepip requires zlib") + + # bpo-26610: pip/pep425tags.py requires ctypes. + # gh-92820: setuptools/windows_support.py uses ctypes (setuptools 58.1). + try: + import ctypes + except ImportError: + ctypes = None + return unittest.skipUnless(ctypes, 'venv: pip requires ctypes') + + +# True if Python is built with the Py_DEBUG macro defined: if +# Python is built in debug mode (./configure --with-pydebug). +Py_DEBUG = hasattr(sys, 'gettotalrefcount') + + +def late_deletion(obj): + """ + Keep a Python object alive as long as possible. + + Create a reference cycle and store the cycle in an object deleted late in + Python finalization. Try to keep the object alive until the very last + garbage collection. + + The function keeps a strong reference by design. It should be called in a + subprocess to not mark a test as "leaking a reference". + """ + + # Late CPython finalization: + # - finalize_interp_clear() + # - _PyInterpreterState_Clear(): Clear PyInterpreterState members + # (ex: codec_search_path, before_forkers) + # - clear os.register_at_fork() callbacks + # - clear codecs.register() callbacks + + ref_cycle = [obj] + ref_cycle.append(ref_cycle) + + # Store a reference in PyInterpreterState.codec_search_path + import codecs + def search_func(encoding): + return None + search_func.reference = ref_cycle + codecs.register(search_func) + + if hasattr(os, 'register_at_fork'): + # Store a reference in PyInterpreterState.before_forkers + def atfork_func(): + pass + atfork_func.reference = ref_cycle + os.register_at_fork(before=atfork_func) + + +def busy_retry(timeout, err_msg=None, /, *, error=True): + """ + Run the loop body until "break" stops the loop. + + After *timeout* seconds, raise an AssertionError if *error* is true, + or just stop if *error* is false.
+ + Example: + + for _ in support.busy_retry(support.SHORT_TIMEOUT): + if check(): + break + + Example of error=False usage: + + for _ in support.busy_retry(support.SHORT_TIMEOUT, error=False): + if check(): + break + else: + raise RuntimeError('my custom error') + + """ + if timeout <= 0: + raise ValueError("timeout must be greater than zero") + + start_time = time.monotonic() + deadline = start_time + timeout + + while True: + yield + + if time.monotonic() >= deadline: + break + + if error: + dt = time.monotonic() - start_time + msg = f"timeout ({dt:.1f} seconds)" + if err_msg: + msg = f"{msg}: {err_msg}" + raise AssertionError(msg) + + +def sleeping_retry(timeout, err_msg=None, /, + *, init_delay=0.010, max_delay=1.0, error=True): + """ + Wait strategy that applies exponential backoff. + + Run the loop body until "break" stops the loop. Sleep at each loop + iteration, but not at the first iteration. The sleep delay is doubled at + each iteration (up to *max_delay* seconds). + + See the busy_retry() documentation for parameter usage. + + Example raising an exception after SHORT_TIMEOUT seconds: + + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if check(): + break + + Example of error=False usage: + + for _ in support.sleeping_retry(support.SHORT_TIMEOUT, error=False): + if check(): + break + else: + raise RuntimeError('my custom error') + """ + + delay = init_delay + for _ in busy_retry(timeout, err_msg, error=error): + yield + + time.sleep(delay) + delay = min(delay * 2, max_delay) diff --git a/Lib/test/support/os_helper.py b/Lib/test/support/os_helper.py index ed4ec15c7cbe92..589ef19dd34cc2 100644 --- a/Lib/test/support/os_helper.py +++ b/Lib/test/support/os_helper.py @@ -141,6 +141,11 @@ try: name.decode(sys.getfilesystemencoding()) except UnicodeDecodeError: + try: + name.decode(sys.getfilesystemencoding(), + sys.getfilesystemencodeerrors()) + except UnicodeDecodeError: + continue TESTFN_UNDECODABLE = os.fsencode(TESTFN_ASCII) + name break @@ -171,9 +176,13 @@ def can_symlink(): global _can_symlink if _can_symlink is not None: return _can_symlink - symlink_path = TESTFN + "can_symlink" + # WASI / wasmtime prevents symlinks with absolute paths, see man + # openat2(2) RESOLVE_BENEATH. Almost all symlink tests use absolute + # paths. Skip symlink tests on WASI for now. + src = os.path.abspath(TESTFN) + symlink_path = src + "can_symlink" try: - os.symlink(TESTFN, symlink_path) + os.symlink(src, symlink_path) can = True except (OSError, NotImplementedError, AttributeError): can = False @@ -233,6 +242,42 @@ def skip_unless_xattr(test): return test if ok else unittest.skip(msg)(test) +_can_chmod = None + +def can_chmod(): + global _can_chmod + if _can_chmod is not None: + return _can_chmod + if not hasattr(os, "chmod"): + _can_chmod = False + return _can_chmod + try: + with open(TESTFN, "wb") as f: + try: + os.chmod(TESTFN, 0o777) + mode1 = os.stat(TESTFN).st_mode + os.chmod(TESTFN, 0o666) + mode2 = os.stat(TESTFN).st_mode + except OSError as e: + can = False + else: + can = stat.S_IMODE(mode1) != stat.S_IMODE(mode2) + finally: + os.unlink(TESTFN) + _can_chmod = can + return can + + +def skip_unless_working_chmod(test): + """Skip tests that require working os.chmod() + + WASI SDK 15.0 cannot change file mode bits.
+ """ + ok = can_chmod() + msg = "requires working os.chmod()" + return test if ok else unittest.skip(msg)(test) + + def unlink(filename): try: _unlink(filename) @@ -459,7 +504,10 @@ def create_empty_file(filename): def open_dir_fd(path): """Open a file descriptor to a directory.""" assert os.path.isdir(path) - dir_fd = os.open(path, os.O_RDONLY) + flags = os.O_RDONLY + if hasattr(os, "O_DIRECTORY"): + flags |= os.O_DIRECTORY + dir_fd = os.open(path, flags) try: yield dir_fd finally: diff --git a/Lib/test/support/socket_helper.py b/Lib/test/support/socket_helper.py index 754af181ec922e..d2960c9e333474 100644 --- a/Lib/test/support/socket_helper.py +++ b/Lib/test/support/socket_helper.py @@ -1,8 +1,10 @@ import contextlib import errno +import os.path import socket -import unittest import sys +import tempfile +import unittest from .. import support from . import warnings_helper @@ -11,6 +13,9 @@ HOSTv4 = "127.0.0.1" HOSTv6 = "::1" +# WASI SDK 15.0 does not provide gethostname, stub raises OSError ENOTSUP. +has_gethostname = not support.is_wasi + def find_unused_port(family=socket.AF_INET, socktype=socket.SOCK_STREAM): """Returns an unused port that should be suitable for binding. This is @@ -267,3 +272,14 @@ def filter_error(err): # __cause__ or __context__? finally: socket.setdefaulttimeout(old_timeout) + + +def create_unix_domain_name(): + """ + Create a UNIX domain name: socket.bind() argument of a AF_UNIX socket. + + Return a path relative to the current directory to get a short path + (around 27 ASCII characters). + """ + return tempfile.mktemp(prefix="test_python_", suffix='.sock', + dir=os.path.curdir) diff --git a/Lib/test/support/threading_helper.py b/Lib/test/support/threading_helper.py index 26cbc6f4d2439c..b9973c8bf5c914 100644 --- a/Lib/test/support/threading_helper.py +++ b/Lib/test/support/threading_helper.py @@ -88,19 +88,17 @@ def wait_threads_exit(timeout=None): yield finally: start_time = time.monotonic() - deadline = start_time + timeout - while True: + for _ in support.sleeping_retry(timeout, error=False): + support.gc_collect() count = _thread._count() if count <= old_count: break - if time.monotonic() > deadline: - dt = time.monotonic() - start_time - msg = (f"wait_threads() failed to cleanup {count - old_count} " - f"threads after {dt:.1f} seconds " - f"(count: {count}, old count: {old_count})") - raise AssertionError(msg) - time.sleep(0.010) - support.gc_collect() + else: + dt = time.monotonic() - start_time + msg = (f"wait_threads() failed to cleanup {count - old_count} " + f"threads after {dt:.1f} seconds " + f"(count: {count}, old count: {old_count})") + raise AssertionError(msg) def join_thread(thread, timeout=None): diff --git a/Lib/test/test__locale.py b/Lib/test/test__locale.py index b3bc54cd551048..0947464bb8c04e 100644 --- a/Lib/test/test__locale.py +++ b/Lib/test/test__locale.py @@ -109,7 +109,8 @@ def numeric_tester(self, calc_type, calc_value, data_type, used_locale): @unittest.skipUnless(nl_langinfo, "nl_langinfo is not available") @unittest.skipIf( - support.is_emscripten, "musl libc issue on Emscripten, bpo-46390" + support.is_emscripten or support.is_wasi, + "musl libc issue on Emscripten, bpo-46390" ) def test_lc_numeric_nl_langinfo(self): # Test nl_langinfo against known values @@ -128,7 +129,8 @@ def test_lc_numeric_nl_langinfo(self): self.skipTest('no suitable locales') @unittest.skipIf( - support.is_emscripten, "musl libc issue on Emscripten, bpo-46390" + support.is_emscripten or support.is_wasi, + "musl libc issue on Emscripten, bpo-46390" ) def 
test_lc_numeric_localeconv(self): # Test localeconv against known values diff --git a/Lib/test/test__xxsubinterpreters.py b/Lib/test/test__xxsubinterpreters.py index 5d0ed9ea14ac7f..f20aae8e21c66f 100644 --- a/Lib/test/test__xxsubinterpreters.py +++ b/Lib/test/test__xxsubinterpreters.py @@ -45,12 +45,11 @@ def _wait_for_interp_to_run(interp, timeout=None): # run subinterpreter eariler than the main thread in multiprocess. if timeout is None: timeout = support.SHORT_TIMEOUT - start_time = time.monotonic() - deadline = start_time + timeout - while not interpreters.is_running(interp): - if time.monotonic() > deadline: - raise RuntimeError('interp is not running') - time.sleep(0.010) + for _ in support.sleeping_retry(timeout, error=False): + if interpreters.is_running(interp): + break + else: + raise RuntimeError('interp is not running') @contextlib.contextmanager diff --git a/Lib/test/test_argparse.py b/Lib/test/test_argparse.py index 273db45c00f7ac..673ef89b5e1cfe 100644 --- a/Lib/test/test_argparse.py +++ b/Lib/test/test_argparse.py @@ -45,6 +45,7 @@ def setUp(self): env['COLUMNS'] = '80' +@os_helper.skip_unless_working_chmod class TempDirMixin(object): def setUp(self): @@ -1504,14 +1505,15 @@ class TestArgumentsFromFile(TempDirMixin, ParserTestCase): def setUp(self): super(TestArgumentsFromFile, self).setUp() file_texts = [ - ('hello', 'hello world!\n'), - ('recursive', '-a\n' - 'A\n' - '@hello'), - ('invalid', '@no-such-path\n'), + ('hello', os.fsencode(self.hello) + b'\n'), + ('recursive', b'-a\n' + b'A\n' + b'@hello'), + ('invalid', b'@no-such-path\n'), + ('undecodable', self.undecodable + b'\n'), ] for path, text in file_texts: - with open(path, 'w', encoding="utf-8") as file: + with open(path, 'wb') as file: file.write(text) parser_signature = Sig(fromfile_prefix_chars='@') @@ -1521,15 +1523,25 @@ def setUp(self): Sig('y', nargs='+'), ] failures = ['', '-b', 'X', '@invalid', '@missing'] + hello = 'hello world!' 
+ os_helper.FS_NONASCII successes = [ ('X Y', NS(a=None, x='X', y=['Y'])), ('X -a A Y Z', NS(a='A', x='X', y=['Y', 'Z'])), - ('@hello X', NS(a=None, x='hello world!', y=['X'])), - ('X @hello', NS(a=None, x='X', y=['hello world!'])), - ('-a B @recursive Y Z', NS(a='A', x='hello world!', y=['Y', 'Z'])), - ('X @recursive Z -a B', NS(a='B', x='X', y=['hello world!', 'Z'])), + ('@hello X', NS(a=None, x=hello, y=['X'])), + ('X @hello', NS(a=None, x='X', y=[hello])), + ('-a B @recursive Y Z', NS(a='A', x=hello, y=['Y', 'Z'])), + ('X @recursive Z -a B', NS(a='B', x='X', y=[hello, 'Z'])), (["-a", "", "X", "Y"], NS(a='', x='X', y=['Y'])), ] + if os_helper.TESTFN_UNDECODABLE: + undecodable = os_helper.TESTFN_UNDECODABLE.lstrip(b'@') + decoded_undecodable = os.fsdecode(undecodable) + successes += [ + ('@undecodable X', NS(a=None, x=decoded_undecodable, y=['X'])), + ('X @undecodable', NS(a=None, x='X', y=[decoded_undecodable])), + ] + else: + undecodable = b'' class TestArgumentsFromFileConverter(TempDirMixin, ParserTestCase): @@ -1538,10 +1550,10 @@ class TestArgumentsFromFileConverter(TempDirMixin, ParserTestCase): def setUp(self): super(TestArgumentsFromFileConverter, self).setUp() file_texts = [ - ('hello', 'hello world!\n'), + ('hello', b'hello world!\n'), ] for path, text in file_texts: - with open(path, 'w', encoding="utf-8") as file: + with open(path, 'wb') as file: file.write(text) class FromFileConverterArgumentParser(ErrorRaisingArgumentParser): diff --git a/Lib/test/test_ast.py b/Lib/test/test_ast.py index 03d9b310f161b9..33df22cb35a9e2 100644 --- a/Lib/test/test_ast.py +++ b/Lib/test/test_ast.py @@ -335,6 +335,41 @@ def test_ast_validation(self): for snippet in snippets_to_validate: tree = ast.parse(snippet) compile(tree, '', 'exec') + + def test_invalid_position_information(self): + invalid_linenos = [ + (10, 1), (-10, -11), (10, -11), (-5, -2), (-5, 1) + ] + + for lineno, end_lineno in invalid_linenos: + with self.subTest(f"Check invalid linenos {lineno}:{end_lineno}"): + snippet = "a = 1" + tree = ast.parse(snippet) + tree.body[0].lineno = lineno + tree.body[0].end_lineno = end_lineno + with self.assertRaises(ValueError): + compile(tree, '', 'exec') + + invalid_col_offsets = [ + (10, 1), (-10, -11), (10, -11), (-5, -2), (-5, 1) + ] + for col_offset, end_col_offset in invalid_col_offsets: + with self.subTest(f"Check invalid col_offset {col_offset}:{end_col_offset}"): + snippet = "a = 1" + tree = ast.parse(snippet) + tree.body[0].col_offset = col_offset + tree.body[0].end_col_offset = end_col_offset + with self.assertRaises(ValueError): + compile(tree, '', 'exec') + + def test_compilation_of_ast_nodes_with_default_end_position_values(self): + tree = ast.Module(body=[ + ast.Import(names=[ast.alias(name='builtins', lineno=1, col_offset=0)], lineno=1, col_offset=0), + ast.Import(names=[ast.alias(name='traceback', lineno=0, col_offset=0)], lineno=0, col_offset=1) + ], type_ignores=[]) + + # Check that compilation doesn't crash. Note: this may crash explicitly only on debug mode. 
+ compile(tree, "", "exec") def test_slice(self): slc = ast.parse("x[::]").body[0].value.slice diff --git a/Lib/test/test_asyncio/test_events.py b/Lib/test/test_asyncio/test_events.py index 05d9107b28e2ae..80d7152128c469 100644 --- a/Lib/test/test_asyncio/test_events.py +++ b/Lib/test/test_asyncio/test_events.py @@ -29,7 +29,6 @@ import asyncio from asyncio import coroutines from asyncio import events -from asyncio import proactor_events from asyncio import selector_events from test.test_asyncio import utils as test_utils from test import support diff --git a/Lib/test/test_asyncio/test_locks.py b/Lib/test/test_asyncio/test_locks.py index 415cbe570fde98..3c94baa56e1649 100644 --- a/Lib/test/test_asyncio/test_locks.py +++ b/Lib/test/test_asyncio/test_locks.py @@ -853,7 +853,7 @@ async def c4(result): self.assertTrue(t1.result()) race_tasks = [t2, t3, t4] done_tasks = [t for t in race_tasks if t.done() and t.result()] - self.assertTrue(2, len(done_tasks)) + self.assertEqual(2, len(done_tasks)) # cleanup locked semaphore sem.release() @@ -1271,7 +1271,7 @@ async def coro(): # catch here waiting tasks results1.append(True) else: - # here drained task ouside the barrier + # here drained task outside the barrier if rest_of_tasks == barrier._count: # tasks outside the barrier await barrier.reset() @@ -1377,7 +1377,7 @@ async def coro(): # last task exited from barrier await barrier.reset() - # wit here to reach the `parties` + # wait here to reach the `parties` await barrier.wait() else: try: diff --git a/Lib/test/test_asyncio/test_runners.py b/Lib/test/test_asyncio/test_runners.py index 0c2092147842ca..7780a5f15e1b44 100644 --- a/Lib/test/test_asyncio/test_runners.py +++ b/Lib/test/test_asyncio/test_runners.py @@ -1,7 +1,6 @@ import _thread import asyncio import contextvars -import gc import re import signal import threading @@ -45,6 +44,9 @@ class BaseTest(unittest.TestCase): def new_loop(self): loop = asyncio.BaseEventLoop() loop._process_events = mock.Mock() + # Mock waking event loop from select + loop._write_to_self = mock.Mock() + loop._write_to_self.return_value = None loop._selector = mock.Mock() loop._selector.select.return_value = () loop.shutdown_ag_run = False @@ -376,9 +378,9 @@ async def coro(): with asyncio.Runner() as runner: with self.assertRaises(asyncio.CancelledError): runner.run(coro()) - + def test_signal_install_not_supported_ok(self): - # signal.signal() can throw if the "main thread" doensn't have signals enabled + # signal.signal() can throw if the "main thread" doesn't have signals enabled assert threading.current_thread() is threading.main_thread() async def coro(): diff --git a/Lib/test/test_asyncio/test_sock_lowlevel.py b/Lib/test/test_asyncio/test_sock_lowlevel.py index db47616d18343e..b829fd4cc69fff 100644 --- a/Lib/test/test_asyncio/test_sock_lowlevel.py +++ b/Lib/test/test_asyncio/test_sock_lowlevel.py @@ -5,7 +5,7 @@ from asyncio import proactor_events from itertools import cycle, islice -from unittest.mock import patch, Mock +from unittest.mock import Mock from test.test_asyncio import utils as test_utils from test import support from test.support import socket_helper diff --git a/Lib/test/test_asyncio/test_tasks.py b/Lib/test/test_asyncio/test_tasks.py index 6458859db2d127..69cecb3ab12070 100644 --- a/Lib/test/test_asyncio/test_tasks.py +++ b/Lib/test/test_asyncio/test_tasks.py @@ -1,22 +1,18 @@ """Tests for tasks.py.""" import collections -import contextlib import contextvars -import functools import gc import io import random import re import sys -import 
textwrap import traceback import unittest from unittest import mock from types import GenericAlias import asyncio -from asyncio import coroutines from asyncio import futures from asyncio import tasks from test.test_asyncio import utils as test_utils diff --git a/Lib/test/test_asyncio/test_timeouts.py b/Lib/test/test_asyncio/test_timeouts.py index ef1ab0acb390d2..9801541e55b7f8 100644 --- a/Lib/test/test_asyncio/test_timeouts.py +++ b/Lib/test/test_asyncio/test_timeouts.py @@ -4,7 +4,6 @@ import time import asyncio -from asyncio import tasks def tearDownModule(): diff --git a/Lib/test/test_asyncio/test_unix_events.py b/Lib/test/test_asyncio/test_unix_events.py index 2f68459d30cd48..9918165909f7f8 100644 --- a/Lib/test/test_asyncio/test_unix_events.py +++ b/Lib/test/test_asyncio/test_unix_events.py @@ -9,7 +9,6 @@ import socket import stat import sys -import tempfile import threading import unittest from unittest import mock @@ -315,11 +314,15 @@ def test_create_unix_connection_pathlib(self): self.loop.run_until_complete(coro) def test_create_unix_server_existing_path_nonsock(self): - with tempfile.NamedTemporaryFile() as file: - coro = self.loop.create_unix_server(lambda: None, file.name) - with self.assertRaisesRegex(OSError, - 'Address.*is already in use'): - self.loop.run_until_complete(coro) + path = test_utils.gen_unix_socket_path() + self.addCleanup(os_helper.unlink, path) + # create the file + open(path, "wb").close() + + coro = self.loop.create_unix_server(lambda: None, path) + with self.assertRaisesRegex(OSError, + 'Address.*is already in use'): + self.loop.run_until_complete(coro) def test_create_unix_server_ssl_bool(self): coro = self.loop.create_unix_server(lambda: None, path='spam', @@ -356,20 +359,18 @@ def test_create_unix_server_path_dgram(self): 'no socket.SOCK_NONBLOCK (linux only)') @socket_helper.skip_unless_bind_unix_socket def test_create_unix_server_path_stream_bittype(self): - sock = socket.socket( - socket.AF_UNIX, socket.SOCK_STREAM | socket.SOCK_NONBLOCK) - with tempfile.NamedTemporaryFile() as file: - fn = file.name - try: - with sock: - sock.bind(fn) - coro = self.loop.create_unix_server(lambda: None, path=None, - sock=sock) - srv = self.loop.run_until_complete(coro) - srv.close() - self.loop.run_until_complete(srv.wait_closed()) - finally: - os.unlink(fn) + fn = test_utils.gen_unix_socket_path() + self.addCleanup(os_helper.unlink, fn) + + sock = socket.socket(socket.AF_UNIX, + socket.SOCK_STREAM | socket.SOCK_NONBLOCK) + with sock: + sock.bind(fn) + coro = self.loop.create_unix_server(lambda: None, path=None, + sock=sock) + srv = self.loop.run_until_complete(coro) + srv.close() + self.loop.run_until_complete(srv.wait_closed()) def test_create_unix_server_ssl_timeout_with_plain_sock(self): coro = self.loop.create_unix_server(lambda: None, path='spam', diff --git a/Lib/test/test_asyncio/utils.py b/Lib/test/test_asyncio/utils.py index c32494d40ccea8..96be5a1c3bcf77 100644 --- a/Lib/test/test_asyncio/utils.py +++ b/Lib/test/test_asyncio/utils.py @@ -11,9 +11,7 @@ import socket import socketserver import sys -import tempfile import threading -import time import unittest import weakref @@ -34,6 +32,7 @@ from asyncio import tasks from asyncio.log import logger from test import support +from test.support import socket_helper from test.support import threading_helper @@ -109,13 +108,14 @@ async def once(): def run_until(loop, pred, timeout=support.SHORT_TIMEOUT): - deadline = time.monotonic() + timeout - while not pred(): - if timeout is not None: - timeout = deadline - 
time.monotonic() - if timeout <= 0: - raise futures.TimeoutError() - loop.run_until_complete(tasks.sleep(0.001)) + delay = 0.001 + for _ in support.busy_retry(timeout, error=False): + if pred(): + break + loop.run_until_complete(tasks.sleep(delay)) + delay = min(delay * 2, 1.0) + else: + raise futures.TimeoutError() def run_once(loop): @@ -250,8 +250,7 @@ class UnixSSLWSGIServer(SSLWSGIServerMixin, SilentUnixWSGIServer): def gen_unix_socket_path(): - with tempfile.NamedTemporaryFile() as file: - return file.name + return socket_helper.create_unix_domain_name() @contextlib.contextmanager diff --git a/Lib/test/test_asyncore.py b/Lib/test/test_asyncore.py index 98ccd3a9304deb..5a037fb9c5d595 100644 --- a/Lib/test/test_asyncore.py +++ b/Lib/test/test_asyncore.py @@ -76,8 +76,7 @@ def capture_server(evt, buf, serv): pass else: n = 200 - start = time.monotonic() - while n > 0 and time.monotonic() - start < 3.0: + for _ in support.busy_retry(support.SHORT_TIMEOUT): r, w, e = select.select([conn], [], [], 0.1) if r: n -= 1 @@ -86,7 +85,8 @@ def capture_server(evt, buf, serv): buf.write(data.replace(b'\n', b'')) if b'\n' in data: break - time.sleep(0.01) + if n <= 0: + break conn.close() finally: diff --git a/Lib/test/test_atexit.py b/Lib/test/test_atexit.py index e0feef7c653606..913b7556be83af 100644 --- a/Lib/test/test_atexit.py +++ b/Lib/test/test_atexit.py @@ -1,6 +1,5 @@ import atexit import os -import sys import textwrap import unittest from test import support @@ -82,6 +81,7 @@ def f(): self.assertEqual(ret, 0) self.assertEqual(atexit._ncallbacks(), n) + @unittest.skipUnless(hasattr(os, "pipe"), "requires os.pipe()") def test_callback_on_subinterpreter_teardown(self): # This tests if a callback is called on # subinterpreter teardown. diff --git a/Lib/test/test_bdb.py b/Lib/test/test_bdb.py index 6ec59531fa88f0..87a5ac308a12df 100644 --- a/Lib/test/test_bdb.py +++ b/Lib/test/test_bdb.py @@ -59,6 +59,7 @@ from itertools import islice, repeat from test.support import import_helper from test.support import os_helper +from test.support import patch_list class BdbException(Exception): pass @@ -713,9 +714,18 @@ def test_until_in_caller_frame(self): with TracerRun(self) as tracer: tracer.runcall(tfunc_main) + @patch_list(sys.meta_path) def test_skip(self): # Check that tracing is skipped over the import statement in # 'tfunc_import()'. + + # Remove all but the standard importers.
+ sys.meta_path[:] = ( + item + for item in sys.meta_path + if item.__module__.startswith('_frozen_importlib') + ) + code = """ def main(): lno = 3 diff --git a/Lib/test/test_binascii.py b/Lib/test/test_binascii.py index 7087d7a471d3f4..a2d7d0293ce1ae 100644 --- a/Lib/test/test_binascii.py +++ b/Lib/test/test_binascii.py @@ -4,7 +4,7 @@ import binascii import array import re -from test.support import bigmemtest, _1G, _4G, warnings_helper +from test.support import bigmemtest, _1G, _4G # Note: "*_hex" functions are aliases for "(un)hexlify" diff --git a/Lib/test/test_bisect.py b/Lib/test/test_bisect.py index 20f8b9d7c0aa87..ba108221ebf454 100644 --- a/Lib/test/test_bisect.py +++ b/Lib/test/test_bisect.py @@ -257,6 +257,12 @@ def test_insort(self): target ) + def test_insort_keynotNone(self): + x = [] + y = {"a": 2, "b": 1} + for f in (self.module.insort_left, self.module.insort_right): + self.assertRaises(TypeError, f, x, y, key = "b") + class TestBisectPython(TestBisect, unittest.TestCase): module = py_bisect diff --git a/Lib/test/test_bufio.py b/Lib/test/test_bufio.py index 17151b13615e94..dc9a82dc635318 100644 --- a/Lib/test/test_bufio.py +++ b/Lib/test/test_bufio.py @@ -1,5 +1,4 @@ import unittest -from test import support from test.support import os_helper import io # C implementation. diff --git a/Lib/test/test_builtin.py b/Lib/test/test_builtin.py index efa9459a58629d..d5eaf291a5ca28 100644 --- a/Lib/test/test_builtin.py +++ b/Lib/test/test_builtin.py @@ -2188,7 +2188,7 @@ def skip_if_readline(self): # the readline implementation. In some cases, the Python readline # callback rlhandler() is called by readline with a string without # non-ASCII characters. Skip tests on non-ASCII characters if the - # readline module is loaded, since test_builtin is not intented to test + # readline module is loaded, since test_builtin is not intended to test # the readline module, but the builtins module. if 'readline' in sys.modules: self.skipTest("the readline module is loaded") diff --git a/Lib/test/test_calendar.py b/Lib/test/test_calendar.py index f76cbc9472a6e8..3d9dcf12f2dad8 100644 --- a/Lib/test/test_calendar.py +++ b/Lib/test/test_calendar.py @@ -564,6 +564,19 @@ def test_locale_calendars(self): new_october = calendar.TextCalendar().formatmonthname(2010, 10, 10) self.assertEqual(old_october, new_october) + def test_locale_calendar_formatweekday(self): + try: + # formatweekday uses different day names based on the available width. + cal = calendar.LocaleTextCalendar(locale='en_US') + # For short widths, a centered, abbreviated name is used. + self.assertEqual(cal.formatweekday(0, 5), " Mon ") + # For really short widths, even the abbreviated name is truncated. + self.assertEqual(cal.formatweekday(0, 2), "Mo") + # For long widths, the full day name is used. + self.assertEqual(cal.formatweekday(0, 10), " Monday ") + except locale.Error: + raise unittest.SkipTest('cannot set the en_US locale') + def test_locale_html_calendar_custom_css_class_month_name(self): try: cal = calendar.LocaleHTMLCalendar(locale='') diff --git a/Lib/test/test_capi.py b/Lib/test/test_capi.py index 6d75895589328c..7e571ab4f9e593 100644 --- a/Lib/test/test_capi.py +++ b/Lib/test/test_capi.py @@ -36,9 +36,6 @@ import _testinternalcapi -# Were we compiled --with-pydebug or with #define Py_DEBUG? 
-Py_DEBUG = hasattr(sys, 'gettotalrefcount') - def decode_stderr(err): return err.decode('utf-8', 'replace').replace('\r', '') @@ -73,16 +70,17 @@ def test_no_FatalError_infinite_loop(self): 'import _testcapi;' '_testcapi.crash_no_current_thread()'], stdout=subprocess.PIPE, - stderr=subprocess.PIPE) + stderr=subprocess.PIPE, + text=True) (out, err) = p.communicate() - self.assertEqual(out, b'') + self.assertEqual(out, '') # This used to cause an infinite loop. - self.assertTrue(err.rstrip().startswith( - b'Fatal Python error: ' - b'PyThreadState_Get: ' - b'the function must be called with the GIL held, ' - b'but the GIL is released ' - b'(the current Python thread state is NULL)'), + msg = ("Fatal Python error: PyThreadState_Get: " + "the function must be called with the GIL held, " + "after Python initialization and before Python finalization, " + "but the GIL is released " + "(the current Python thread state is NULL)") + self.assertTrue(err.rstrip().startswith(msg), err) def test_memoryview_from_NULL_pointer(self): @@ -230,7 +228,7 @@ def test_c_type_with_ipow(self): def test_return_null_without_error(self): # Issue #23571: A function must not return NULL without setting an # error - if Py_DEBUG: + if support.Py_DEBUG: code = textwrap.dedent(""" import _testcapi from test import support @@ -258,7 +256,7 @@ def test_return_null_without_error(self): def test_return_result_with_error(self): # Issue #23571: A function must not return a result with an error set - if Py_DEBUG: + if support.Py_DEBUG: code = textwrap.dedent(""" import _testcapi from test import support @@ -516,7 +514,7 @@ def __del__(self): del subclass_instance # Test that setting __class__ modified the reference counts of the types - if Py_DEBUG: + if support.Py_DEBUG: # gh-89373: In debug mode, _Py_Dealloc() keeps a strong reference # to the type while calling tp_dealloc() self.assertEqual(type_refcnt, B.refcnt_in_del) @@ -586,7 +584,7 @@ def test_c_subclass_of_heap_ctype_with_del_modifying_dunder_class_only_decrefs_o del subclass_instance # Test that setting __class__ modified the reference counts of the types - if Py_DEBUG: + if support.Py_DEBUG: # gh-89373: In debug mode, _Py_Dealloc() keeps a strong reference # to the type while calling tp_dealloc() self.assertEqual(type_refcnt, _testcapi.HeapCTypeSubclassWithFinalizer.refcnt_in_del) @@ -608,6 +606,25 @@ def test_heaptype_with_setattro(self): del obj.value self.assertEqual(obj.pvalue, 0) + def test_heaptype_with_custom_metaclass(self): + self.assertTrue(issubclass(_testcapi.HeapCTypeMetaclass, type)) + self.assertTrue(issubclass(_testcapi.HeapCTypeMetaclassCustomNew, type)) + + t = _testcapi.pytype_fromspec_meta(_testcapi.HeapCTypeMetaclass) + self.assertIsInstance(t, type) + self.assertEqual(t.__name__, "HeapCTypeViaMetaclass") + self.assertIs(type(t), _testcapi.HeapCTypeMetaclass) + + msg = "Metaclasses with custom tp_new are not supported." 
+ with self.assertRaisesRegex(TypeError, msg): + t = _testcapi.pytype_fromspec_meta(_testcapi.HeapCTypeMetaclassCustomNew) + + def test_pytype_fromspec_with_repeated_slots(self): + for variant in range(2): + with self.subTest(variant=variant): + with self.assertRaises(SystemError): + _testcapi.create_type_from_repeated_slots(variant) + def test_pynumber_tobase(self): from _testcapi import pynumber_tobase self.assertEqual(pynumber_tobase(123, 2), '0b1111011') @@ -788,6 +805,7 @@ def test_pendingcalls_non_threaded(self): class SubinterpreterTest(unittest.TestCase): + @unittest.skipUnless(hasattr(os, "pipe"), "requires os.pipe()") def test_subinterps(self): import builtins r, w = os.pipe() @@ -803,6 +821,7 @@ def test_subinterps(self): self.assertNotEqual(pickle.load(f), id(sys.modules)) self.assertNotEqual(pickle.load(f), id(builtins)) + @unittest.skipUnless(hasattr(os, "pipe"), "requires os.pipe()") def test_subinterps_recent_language_features(self): r, w = os.pipe() code = """if 1: @@ -1027,7 +1046,7 @@ class PyMemPymallocDebugTests(PyMemDebugTests): PYTHONMALLOC = 'pymalloc_debug' -@unittest.skipUnless(Py_DEBUG, 'need Py_DEBUG') +@unittest.skipUnless(support.Py_DEBUG, 'need Py_DEBUG') class PyMemDefaultTests(PyMemDebugTests): # test default allocator of Python compiled in debug mode PYTHONMALLOC = '' diff --git a/Lib/test/test_cmd_line.py b/Lib/test/test_cmd_line.py index 5a9e14bbd32007..bc52bbdb0f94d1 100644 --- a/Lib/test/test_cmd_line.py +++ b/Lib/test/test_cmd_line.py @@ -18,9 +18,6 @@ if not support.has_subprocess_support: raise unittest.SkipTest("test module requires subprocess") -# Debug build? -Py_DEBUG = hasattr(sys, "gettotalrefcount") - # XXX (ncoghlan): Move to script_helper and make consistent with run_python def _kill_python_and_exit_code(p): @@ -28,32 +25,55 @@ def _kill_python_and_exit_code(p): returncode = p.wait() return data, returncode + class CmdLineTest(unittest.TestCase): def test_directories(self): assert_python_failure('.') assert_python_failure('< .') def verify_valid_flag(self, cmd_line): - rc, out, err = assert_python_ok(*cmd_line) + rc, out, err = assert_python_ok(cmd_line) self.assertTrue(out == b'' or out.endswith(b'\n')) self.assertNotIn(b'Traceback', out) self.assertNotIn(b'Traceback', err) + return out - def test_optimize(self): - self.verify_valid_flag('-O') - self.verify_valid_flag('-OO') + def test_help(self): + self.verify_valid_flag('-h') + self.verify_valid_flag('-?') + out = self.verify_valid_flag('--help') + lines = out.splitlines() + self.assertIn(b'usage', lines[0]) + self.assertNotIn(b'PYTHONHOME', out) + self.assertNotIn(b'-X dev', out) + self.assertLess(len(lines), 50) - def test_site_flag(self): - self.verify_valid_flag('-S') + def test_help_env(self): + out = self.verify_valid_flag('--help-env') + self.assertIn(b'PYTHONHOME', out) + + def test_help_xoptions(self): + out = self.verify_valid_flag('--help-xoptions') + self.assertIn(b'-X dev', out) - def test_usage(self): - rc, out, err = assert_python_ok('-h') + def test_help_all(self): + out = self.verify_valid_flag('--help-all') lines = out.splitlines() self.assertIn(b'usage', lines[0]) + self.assertIn(b'PYTHONHOME', out) + self.assertIn(b'-X dev', out) + # The first line contains the program name, # but the rest should be ASCII-only b''.join(lines[1:]).decode('ascii') + def test_optimize(self): + self.verify_valid_flag('-O') + self.verify_valid_flag('-OO') + + def test_site_flag(self): + self.verify_valid_flag('-S') + def test_version(self): version = ('Python %d.%d' % 
sys.version_info[:2]).encode("ascii") for switch in '-V', '--version', '-VV': @@ -93,7 +113,7 @@ def get_xoptions(*args): def test_unknown_xoptions(self): rc, out, err = assert_python_failure('-X', 'blech') self.assertIn(b'Unknown value for option -X', err) - msg = b'Fatal Python error: Unknown value for option -X' + msg = b'Fatal Python error: Unknown value for option -X (see --help-xoptions)' self.assertEqual(err.splitlines().count(msg), 1) self.assertEqual(b'', out) @@ -120,7 +140,7 @@ def run_python(*args): # "-X showrefcount" shows the refcount, but only in debug builds rc, out, err = run_python('-I', '-X', 'showrefcount', '-c', code) self.assertEqual(out.rstrip(), b"{'showrefcount': True}") - if Py_DEBUG: + if support.Py_DEBUG: # bpo-46417: Tolerate negative reference count which can occur # because of bugs in C extensions. This test is only about checking # the showrefcount feature. @@ -137,7 +157,6 @@ def test_xoption_frozen_modules(self): } for raw, expected in tests: cmd = ['-X', f'frozen_modules{raw}', - #'-c', 'import os; print(os.__spec__.loader.__name__, end="")'] '-c', 'import os; print(os.__spec__.loader, end="")'] with self.subTest(raw): res = assert_python_ok(*cmd) @@ -170,7 +189,6 @@ def test_relativedir_bug46421(self): # Test `python -m unittest` with a relative directory beginning with ./ # Note: We have to switch to the project's top module's directory, as per # the python unittest wiki. We will switch back when we are done. - defaultwd = os.getcwd() projectlibpath = os.path.dirname(__file__).removesuffix("test") with os_helper.change_cwd(projectlibpath): # Testing with and without ./ @@ -250,7 +268,6 @@ def test_invalid_utf8_arg(self): # # Test with default config, in the C locale, in the Python UTF-8 Mode. code = 'import sys, os; s=os.fsencode(sys.argv[1]); print(ascii(s))' - base_cmd = [sys.executable, '-c', code] def run_default(arg): cmd = [sys.executable, '-c', code, arg] @@ -685,7 +702,7 @@ def test_xdev(self): code = ("import warnings; " "print(' '.join('%s::%s' % (f[0], f[2].__name__) " "for f in warnings.filters))") - if Py_DEBUG: + if support.Py_DEBUG: expected_filters = "default::Warning" else: expected_filters = ("default::Warning " @@ -757,7 +774,7 @@ def test_warnings_filter_precedence(self): expected_filters = ("error::BytesWarning " "once::UserWarning " "always::UserWarning") - if not Py_DEBUG: + if not support.Py_DEBUG: expected_filters += (" " "default::DeprecationWarning " "ignore::DeprecationWarning " @@ -795,10 +812,10 @@ def test_pythonmalloc(self): # Test the PYTHONMALLOC environment variable pymalloc = support.with_pymalloc() if pymalloc: - default_name = 'pymalloc_debug' if Py_DEBUG else 'pymalloc' + default_name = 'pymalloc_debug' if support.Py_DEBUG else 'pymalloc' default_name_debug = 'pymalloc_debug' else: - default_name = 'malloc_debug' if Py_DEBUG else 'malloc' + default_name = 'malloc_debug' if support.Py_DEBUG else 'malloc' default_name_debug = 'malloc_debug' tests = [ @@ -895,6 +912,7 @@ def test_sys_flags_not_set(self): PYTHONSAFEPATH="1", ) + class SyntaxErrorTests(unittest.TestCase): def check_string(self, code): proc = subprocess.run([sys.executable, "-"], input=code, diff --git a/Lib/test/test_cmd_line_script.py b/Lib/test/test_cmd_line_script.py index bb433dc1e73a43..d783af65839ad9 100644 --- a/Lib/test/test_cmd_line_script.py +++ b/Lib/test/test_cmd_line_script.py @@ -558,8 +558,9 @@ def test_non_ascii(self): # Mac OS X denies the creation of a file with an invalid UTF-8 name. 
# Windows allows creating a name with an arbitrary bytes name, but # Python cannot a undecodable bytes argument to a subprocess. + # WASI does not permit invalid UTF-8 names. if (os_helper.TESTFN_UNDECODABLE - and sys.platform not in ('win32', 'darwin')): + and sys.platform not in ('win32', 'darwin', 'emscripten', 'wasi')): name = os.fsdecode(os_helper.TESTFN_UNDECODABLE) elif os_helper.TESTFN_NONASCII: name = os_helper.TESTFN_NONASCII diff --git a/Lib/test/test_code.py b/Lib/test/test_code.py index 1bb138e7f3243b..308141aaf10e55 100644 --- a/Lib/test/test_code.py +++ b/Lib/test/test_code.py @@ -141,6 +141,7 @@ check_impl_detail, requires_debug_ranges, gc_collect) from test.support.script_helper import assert_python_ok +from test.support import threading_helper from opcode import opmap COPY_FREE_VARS = opmap['COPY_FREE_VARS'] @@ -574,6 +575,15 @@ def positions_from_location_table(code): for _ in range(length): yield (line, end_line, col, end_col) +def dedup(lst, prev=object()): + for item in lst: + if item != prev: + yield item + prev = item + +def lines_from_postions(positions): + return dedup(l for (l, _, _, _) in positions) + def misshappen(): """ @@ -606,6 +616,13 @@ def misshappen(): ) else p +def bug93662(): + example_report_generation_message= ( + """ + """ + ).strip() + raise ValueError() + class CodeLocationTest(unittest.TestCase): @@ -616,10 +633,23 @@ def check_positions(self, func): self.assertEqual(l1, l2) self.assertEqual(len(pos1), len(pos2)) - def test_positions(self): self.check_positions(parse_location_table) self.check_positions(misshappen) + self.check_positions(bug93662) + + def check_lines(self, func): + co = func.__code__ + lines1 = list(dedup(l for (_, _, l) in co.co_lines())) + lines2 = list(lines_from_postions(positions_from_location_table(co))) + for l1, l2 in zip(lines1, lines2): + self.assertEqual(l1, l2) + self.assertEqual(len(lines1), len(lines2)) + + def test_lines(self): + self.check_lines(parse_location_table) + self.check_lines(misshappen) + self.check_lines(bug93662) if check_impl_detail(cpython=True) and ctypes is not None: @@ -694,6 +724,7 @@ def test_get_set(self): self.assertEqual(extra.value, 300) del f + @threading_helper.requires_working_threading() def test_free_different_thread(self): # Freeing a code object on a different thread then # where the co_extra was set should be safe. 
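The test_code.py changes above add a dedup() helper and a check_lines() check that compare the line numbers reported by co_lines() with the line numbers recovered from the code object's position table. A minimal standalone sketch of the same consistency idea, built only on the public co_lines()/co_positions() APIs (the lines_agree() helper and sample() function are illustrative, not part of the patch, and this assumes Python 3.11+ where co_positions() is available):

    def dedup(values, prev=object()):
        # Collapse runs of consecutive equal values, mirroring the dedup()
        # helper added to Lib/test/test_code.py above.
        for value in values:
            if value != prev:
                yield value
                prev = value

    def lines_agree(func):
        # Compare the line numbers reported by co_lines() with the line
        # numbers recovered from co_positions(); both are views of the same
        # location table, so the deduplicated sequences should match.
        code = func.__code__
        from_co_lines = list(dedup(line for _, _, line in code.co_lines()))
        from_positions = list(dedup(line for line, _, _, _ in code.co_positions()))
        return from_co_lines == from_positions

    def sample(x):
        y = x + 1
        return y * 2

    print(lines_agree(sample))  # expected to print True for a simple function like this
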
diff --git a/Lib/test/test_codecs.py b/Lib/test/test_codecs.py index 42c600dcb00de1..57f3648eb7017c 100644 --- a/Lib/test/test_codecs.py +++ b/Lib/test/test_codecs.py @@ -9,7 +9,6 @@ from test import support from test.support import os_helper -from test.support import warnings_helper try: import _testcapi diff --git a/Lib/test/test_compile.py b/Lib/test/test_compile.py index 5a9c618786f4e2..d4357243ddabf4 100644 --- a/Lib/test/test_compile.py +++ b/Lib/test/test_compile.py @@ -109,7 +109,9 @@ def __getitem__(self, key): self.assertEqual(d['z'], 12) def test_extended_arg(self): - longexpr = 'x = x or ' + '-x' * 2500 + # default: 1000 * 2.5 = 2500 repetitions + repeat = int(sys.getrecursionlimit() * 2.5) + longexpr = 'x = x or ' + '-x' * repeat g = {} code = ''' def f(x): @@ -839,7 +841,7 @@ def foo(x): instructions = [opcode.opname for opcode in opcodes] self.assertNotIn('LOAD_METHOD', instructions) self.assertIn('LOAD_ATTR', instructions) - self.assertIn('PRECALL', instructions) + self.assertIn('CALL', instructions) def test_lineno_procedure_call(self): def call(): diff --git a/Lib/test/test_compileall.py b/Lib/test/test_compileall.py index 1a9fc973c3aa12..73c83c9bf1efee 100644 --- a/Lib/test/test_compileall.py +++ b/Lib/test/test_compileall.py @@ -434,6 +434,9 @@ class CompileallTestsWithoutSourceEpoch(CompileallTestsBase, pass +# WASI does not have a temp directory and uses cwd instead. The cwd contains +# non-ASCII chars, so _walk_dir() fails to encode self.directory. +@unittest.skipIf(support.is_wasi, "tempdir is not encodable on WASI") class EncodingTest(unittest.TestCase): """Issue 6716: compileall should escape source code when printing errors to stdout.""" diff --git a/Lib/test/test_concurrent_futures.py b/Lib/test/test_concurrent_futures.py index 6f3b4609232bbb..f50255bd575601 100644 --- a/Lib/test/test_concurrent_futures.py +++ b/Lib/test/test_concurrent_futures.py @@ -256,12 +256,13 @@ def test_initializer(self): else: with self.assertRaises(BrokenExecutor): future.result() + # At some point, the executor should break - t1 = time.monotonic() - while not self.executor._broken: - if time.monotonic() - t1 > 5: - self.fail("executor not broken after 5 s.") - time.sleep(0.01) + for _ in support.sleeping_retry(support.SHORT_TIMEOUT, + "executor not broken"): + if self.executor._broken: + break + # ... and from this point submit() is guaranteed to fail with self.assertRaises(BrokenExecutor): self.executor.submit(get_init_status) diff --git a/Lib/test/test_configparser.py b/Lib/test/test_configparser.py index 59c4b275cb46d7..5e2715a96b9943 100644 --- a/Lib/test/test_configparser.py +++ b/Lib/test/test_configparser.py @@ -1612,23 +1612,12 @@ def test_interpolation_depth_error(self): self.assertEqual(error.section, 'section') def test_parsing_error(self): - with self.assertRaises(ValueError) as cm: + with self.assertRaises(TypeError) as cm: configparser.ParsingError() - self.assertEqual(str(cm.exception), "Required argument `source' not " - "given.") - with self.assertRaises(ValueError) as cm: - configparser.ParsingError(source='source', filename='filename') - self.assertEqual(str(cm.exception), "Cannot specify both `filename' " - "and `source'. 
Use `source'.") - error = configparser.ParsingError(filename='source') + error = configparser.ParsingError(source='source') + self.assertEqual(error.source, 'source') + error = configparser.ParsingError('source') self.assertEqual(error.source, 'source') - with warnings.catch_warnings(record=True) as w: - warnings.simplefilter("always", DeprecationWarning) - self.assertEqual(error.filename, 'source') - error.filename = 'filename' - self.assertEqual(error.source, 'filename') - for warning in w: - self.assertTrue(warning.category is DeprecationWarning) def test_interpolation_validation(self): parser = configparser.ConfigParser() @@ -1647,27 +1636,6 @@ def test_interpolation_validation(self): self.assertEqual(str(cm.exception), "bad interpolation variable " "reference '%(()'") - def test_readfp_deprecation(self): - sio = io.StringIO(""" - [section] - option = value - """) - parser = configparser.ConfigParser() - with warnings.catch_warnings(record=True) as w: - warnings.simplefilter("always", DeprecationWarning) - parser.readfp(sio, filename='StringIO') - for warning in w: - self.assertTrue(warning.category is DeprecationWarning) - self.assertEqual(len(parser), 2) - self.assertEqual(parser['section']['option'], 'value') - - def test_safeconfigparser_deprecation(self): - with warnings.catch_warnings(record=True) as w: - warnings.simplefilter("always", DeprecationWarning) - parser = configparser.SafeConfigParser() - for warning in w: - self.assertTrue(warning.category is DeprecationWarning) - def test_legacyinterpolation_deprecation(self): with warnings.catch_warnings(record=True) as w: warnings.simplefilter("always", DeprecationWarning) @@ -1841,7 +1809,7 @@ def test_parsingerror(self): self.assertEqual(e1.source, e2.source) self.assertEqual(e1.errors, e2.errors) self.assertEqual(repr(e1), repr(e2)) - e1 = configparser.ParsingError(filename='filename') + e1 = configparser.ParsingError('filename') e1.append(1, 'line1') e1.append(2, 'line2') e1.append(3, 'line3') diff --git a/Lib/test/test_context.py b/Lib/test/test_context.py index 3132cea668cb1b..b1aece4f5c9c49 100644 --- a/Lib/test/test_context.py +++ b/Lib/test/test_context.py @@ -535,6 +535,41 @@ def test_hamt_collision_1(self): self.assertEqual(len(h4), 2) self.assertEqual(len(h5), 3) + def test_hamt_collision_3(self): + # Test that iteration works with the deepest tree possible. 
+ # https://github.com/python/cpython/issues/93065 + + C = HashKey(0b10000000_00000000_00000000_00000000, 'C') + D = HashKey(0b10000000_00000000_00000000_00000000, 'D') + + E = HashKey(0b00000000_00000000_00000000_00000000, 'E') + + h = hamt() + h = h.set(C, 'C') + h = h.set(D, 'D') + h = h.set(E, 'E') + + # BitmapNode(size=2 count=1 bitmap=0b1): + # NULL: + # BitmapNode(size=2 count=1 bitmap=0b1): + # NULL: + # BitmapNode(size=2 count=1 bitmap=0b1): + # NULL: + # BitmapNode(size=2 count=1 bitmap=0b1): + # NULL: + # BitmapNode(size=2 count=1 bitmap=0b1): + # NULL: + # BitmapNode(size=2 count=1 bitmap=0b1): + # NULL: + # BitmapNode(size=4 count=2 bitmap=0b101): + # : 'E' + # NULL: + # CollisionNode(size=4 id=0x107a24520): + # : 'C' + # : 'D' + + self.assertEqual({k.name for k in h.keys()}, {'C', 'D', 'E'}) + def test_hamt_stress(self): COLLECTION_SIZE = 7000 TEST_ITERS_EVERY = 647 diff --git a/Lib/test/test_contextlib_async.py b/Lib/test/test_contextlib_async.py index 76bd81c7d02ccd..b64673d2c31e05 100644 --- a/Lib/test/test_contextlib_async.py +++ b/Lib/test/test_contextlib_async.py @@ -15,15 +15,12 @@ def _async_test(func): @functools.wraps(func) def wrapper(*args, **kwargs): coro = func(*args, **kwargs) - loop = asyncio.new_event_loop() - asyncio.set_event_loop(loop) - try: - return loop.run_until_complete(coro) - finally: - loop.close() - asyncio.set_event_loop_policy(None) + asyncio.run(coro) return wrapper +def tearDownModule(): + asyncio.set_event_loop_policy(None) + class TestAbstractAsyncContextManager(unittest.TestCase): diff --git a/Lib/test/test_copy.py b/Lib/test/test_copy.py index f1ca8cb254d188..f4d91c16868986 100644 --- a/Lib/test/test_copy.py +++ b/Lib/test/test_copy.py @@ -678,6 +678,28 @@ def __eq__(self, other): self.assertIsNot(x, y) self.assertIsNot(x["foo"], y["foo"]) + def test_reduce_6tuple(self): + def state_setter(*args, **kwargs): + self.fail("shouldn't call this") + class C: + def __reduce__(self): + return C, (), self.__dict__, None, None, state_setter + x = C() + with self.assertRaises(TypeError): + copy.copy(x) + with self.assertRaises(TypeError): + copy.deepcopy(x) + + def test_reduce_6tuple_none(self): + class C: + def __reduce__(self): + return C, (), self.__dict__, None, None, None + x = C() + with self.assertRaises(TypeError): + copy.copy(x) + with self.assertRaises(TypeError): + copy.deepcopy(x) + def test_copy_slots(self): class C(object): __slots__ = ["foo"] diff --git a/Lib/test/test_coroutines.py b/Lib/test/test_coroutines.py index 77944e678c7504..dba5ceffaf1c03 100644 --- a/Lib/test/test_coroutines.py +++ b/Lib/test/test_coroutines.py @@ -2209,7 +2209,8 @@ async def f(): @unittest.skipIf( - support.is_emscripten, "asyncio does not work under Emscripten yet." + support.is_emscripten or support.is_wasi, + "asyncio does not work under Emscripten/WASI yet." 
) class CoroAsyncIOCompatTest(unittest.TestCase): diff --git a/Lib/test/test_cppext.py b/Lib/test/test_cppext.py index 8acf0f1b7c0dc3..8673911ecfae5d 100644 --- a/Lib/test/test_cppext.py +++ b/Lib/test/test_cppext.py @@ -16,20 +16,29 @@ @support.requires_subprocess() class TestCPPExt(unittest.TestCase): + def test_build_cpp11(self): + self.check_build(False) + + def test_build_cpp03(self): + self.check_build(True) + # With MSVC, the linker fails with: cannot open file 'python311.lib' # https://github.com/python/cpython/pull/32175#issuecomment-1111175897 @unittest.skipIf(MS_WINDOWS, 'test fails on Windows') - def test_build(self): + # the test uses venv+pip: skip if it's not available + @support.requires_venv_with_pip() + def check_build(self, std_cpp03): # Build in a temporary directory with os_helper.temp_cwd(): - self._test_build() + self._check_build(std_cpp03) - def _test_build(self): + def _check_build(self, std_cpp03): venv_dir = 'env' + verbose = support.verbose # Create virtual environment to get setuptools cmd = [sys.executable, '-X', 'dev', '-m', 'venv', venv_dir] - if support.verbose: + if verbose: print() print('Run:', ' '.join(cmd)) subprocess.run(cmd, check=True) @@ -44,16 +53,21 @@ def _test_build(self): python = os.path.join(venv_dir, 'bin', python_exe) # Build the C++ extension - cmd = [python, '-X', 'dev', SETUP_TESTCPPEXT, 'build_ext', '--verbose'] - if support.verbose: + cmd = [python, '-X', 'dev', + SETUP_TESTCPPEXT, 'build_ext', '--verbose'] + if std_cpp03: + cmd.append('-std=c++03') + if verbose: print('Run:', ' '.join(cmd)) - proc = subprocess.run(cmd, - stdout=subprocess.PIPE, - stderr=subprocess.STDOUT, - text=True) - if proc.returncode: - print(proc.stdout, end='') - self.fail(f"Build failed with exit code {proc.returncode}") + subprocess.run(cmd, check=True) + else: + proc = subprocess.run(cmd, + stdout=subprocess.PIPE, + stderr=subprocess.STDOUT, + text=True) + if proc.returncode: + print(proc.stdout, end='') + self.fail(f"Build failed with exit code {proc.returncode}") if __name__ == "__main__": diff --git a/Lib/test/test_ctypes.py b/Lib/test/test_ctypes.py deleted file mode 100644 index b0a12c97347490..00000000000000 --- a/Lib/test/test_ctypes.py +++ /dev/null @@ -1,10 +0,0 @@ -import unittest -from test.support.import_helper import import_module - - -ctypes_test = import_module('ctypes.test') - -load_tests = ctypes_test.load_tests - -if __name__ == "__main__": - unittest.main() diff --git a/Lib/ctypes/test/__init__.py b/Lib/test/test_ctypes/__init__.py similarity index 100% rename from Lib/ctypes/test/__init__.py rename to Lib/test/test_ctypes/__init__.py diff --git a/Lib/test/test_ctypes/__main__.py b/Lib/test/test_ctypes/__main__.py new file mode 100644 index 00000000000000..3003d4db890d4d --- /dev/null +++ b/Lib/test/test_ctypes/__main__.py @@ -0,0 +1,4 @@ +from test.test_ctypes import load_tests +import unittest + +unittest.main() diff --git a/Lib/ctypes/test/test_anon.py b/Lib/test/test_ctypes/test_anon.py similarity index 100% rename from Lib/ctypes/test/test_anon.py rename to Lib/test/test_ctypes/test_anon.py diff --git a/Lib/ctypes/test/test_array_in_pointer.py b/Lib/test/test_ctypes/test_array_in_pointer.py similarity index 100% rename from Lib/ctypes/test/test_array_in_pointer.py rename to Lib/test/test_ctypes/test_array_in_pointer.py diff --git a/Lib/ctypes/test/test_arrays.py b/Lib/test/test_ctypes/test_arrays.py similarity index 99% rename from Lib/ctypes/test/test_arrays.py rename to Lib/test/test_ctypes/test_arrays.py index 
14603b7049c92c..415a5785a9c1bb 100644 --- a/Lib/ctypes/test/test_arrays.py +++ b/Lib/test/test_ctypes/test_arrays.py @@ -3,7 +3,7 @@ import sys from ctypes import * -from ctypes.test import need_symbol +from test.test_ctypes import need_symbol formats = "bBhHiIlLqQfd" diff --git a/Lib/ctypes/test/test_as_parameter.py b/Lib/test/test_ctypes/test_as_parameter.py similarity index 98% rename from Lib/ctypes/test/test_as_parameter.py rename to Lib/test/test_ctypes/test_as_parameter.py index f9d27cb89d341b..e9ec9ad847b487 100644 --- a/Lib/ctypes/test/test_as_parameter.py +++ b/Lib/test/test_ctypes/test_as_parameter.py @@ -1,6 +1,6 @@ import unittest from ctypes import * -from ctypes.test import need_symbol +from test.test_ctypes import need_symbol import _ctypes_test dll = CDLL(_ctypes_test.__file__) @@ -122,6 +122,7 @@ def callback(value): result = f(self.wrap(-10), self.wrap(cb)) self.assertEqual(result, -18) + @need_symbol('c_longlong') def test_longlong_callbacks(self): f = dll._testfunc_callback_q_qf diff --git a/Lib/ctypes/test/test_bitfields.py b/Lib/test/test_ctypes/test_bitfields.py similarity index 99% rename from Lib/ctypes/test/test_bitfields.py rename to Lib/test/test_ctypes/test_bitfields.py index 66acd62e6851a1..dad71a0ba7ee4a 100644 --- a/Lib/ctypes/test/test_bitfields.py +++ b/Lib/test/test_ctypes/test_bitfields.py @@ -1,5 +1,5 @@ from ctypes import * -from ctypes.test import need_symbol +from test.test_ctypes import need_symbol from test import support import unittest import os diff --git a/Lib/ctypes/test/test_buffers.py b/Lib/test/test_ctypes/test_buffers.py similarity index 98% rename from Lib/ctypes/test/test_buffers.py rename to Lib/test/test_ctypes/test_buffers.py index 15782be757c853..a9be2023aa0fa0 100644 --- a/Lib/ctypes/test/test_buffers.py +++ b/Lib/test/test_ctypes/test_buffers.py @@ -1,5 +1,5 @@ from ctypes import * -from ctypes.test import need_symbol +from test.test_ctypes import need_symbol import unittest class StringBufferTestCase(unittest.TestCase): diff --git a/Lib/ctypes/test/test_bytes.py b/Lib/test/test_ctypes/test_bytes.py similarity index 100% rename from Lib/ctypes/test/test_bytes.py rename to Lib/test/test_ctypes/test_bytes.py diff --git a/Lib/ctypes/test/test_byteswap.py b/Lib/test/test_ctypes/test_byteswap.py similarity index 100% rename from Lib/ctypes/test/test_byteswap.py rename to Lib/test/test_ctypes/test_byteswap.py diff --git a/Lib/ctypes/test/test_callbacks.py b/Lib/test/test_ctypes/test_callbacks.py similarity index 98% rename from Lib/ctypes/test/test_callbacks.py rename to Lib/test/test_ctypes/test_callbacks.py index 1099cf9a69c6b1..b5bef04e14d208 100644 --- a/Lib/ctypes/test/test_callbacks.py +++ b/Lib/test/test_ctypes/test_callbacks.py @@ -3,7 +3,7 @@ from test import support from ctypes import * -from ctypes.test import need_symbol +from test.test_ctypes import need_symbol from _ctypes import CTYPES_MAX_ARGCOUNT import _ctypes_test @@ -65,10 +65,12 @@ def test_long(self): def test_ulong(self): self.check_type(c_ulong, 42) + @need_symbol('c_longlong') def test_longlong(self): self.check_type(c_longlong, 42) self.check_type(c_longlong, -42) + @need_symbol('c_ulonglong') def test_ulonglong(self): self.check_type(c_ulonglong, 42) @@ -82,6 +84,7 @@ def test_double(self): self.check_type(c_double, 3.14) self.check_type(c_double, -3.14) + @need_symbol('c_longdouble') def test_longdouble(self): self.check_type(c_longdouble, 3.14) self.check_type(c_longdouble, -3.14) diff --git a/Lib/ctypes/test/test_cast.py b/Lib/test/test_ctypes/test_cast.py 
similarity index 98% rename from Lib/ctypes/test/test_cast.py rename to Lib/test/test_ctypes/test_cast.py index 6878f9732826f1..7ee23b16f1b00b 100644 --- a/Lib/ctypes/test/test_cast.py +++ b/Lib/test/test_ctypes/test_cast.py @@ -1,5 +1,5 @@ from ctypes import * -from ctypes.test import need_symbol +from test.test_ctypes import need_symbol import unittest import sys diff --git a/Lib/ctypes/test/test_cfuncs.py b/Lib/test/test_ctypes/test_cfuncs.py similarity index 97% rename from Lib/ctypes/test/test_cfuncs.py rename to Lib/test/test_ctypes/test_cfuncs.py index ac2240fa197d3f..7cba4b0e527f4d 100644 --- a/Lib/ctypes/test/test_cfuncs.py +++ b/Lib/test/test_ctypes/test_cfuncs.py @@ -3,7 +3,7 @@ import unittest from ctypes import * -from ctypes.test import need_symbol +from test.test_ctypes import need_symbol import _ctypes_test @@ -111,24 +111,28 @@ def test_ulong_plus(self): self.assertEqual(self._dll.tf_bL(b' ', 4294967295), 1431655765) self.assertEqual(self.U(), 4294967295) + @need_symbol('c_longlong') def test_longlong(self): self._dll.tf_q.restype = c_longlong self._dll.tf_q.argtypes = (c_longlong, ) self.assertEqual(self._dll.tf_q(-9223372036854775806), -3074457345618258602) self.assertEqual(self.S(), -9223372036854775806) + @need_symbol('c_longlong') def test_longlong_plus(self): self._dll.tf_bq.restype = c_longlong self._dll.tf_bq.argtypes = (c_byte, c_longlong) self.assertEqual(self._dll.tf_bq(0, -9223372036854775806), -3074457345618258602) self.assertEqual(self.S(), -9223372036854775806) + @need_symbol('c_ulonglong') def test_ulonglong(self): self._dll.tf_Q.restype = c_ulonglong self._dll.tf_Q.argtypes = (c_ulonglong, ) self.assertEqual(self._dll.tf_Q(18446744073709551615), 6148914691236517205) self.assertEqual(self.U(), 18446744073709551615) + @need_symbol('c_ulonglong') def test_ulonglong_plus(self): self._dll.tf_bQ.restype = c_ulonglong self._dll.tf_bQ.argtypes = (c_byte, c_ulonglong) @@ -159,12 +163,14 @@ def test_double_plus(self): self.assertEqual(self._dll.tf_bd(0, 42.), 14.) self.assertEqual(self.S(), 42) + @need_symbol('c_longdouble') def test_longdouble(self): self._dll.tf_D.restype = c_longdouble self._dll.tf_D.argtypes = (c_longdouble,) self.assertEqual(self._dll.tf_D(42.), 14.) 
self.assertEqual(self.S(), 42) + @need_symbol('c_longdouble') def test_longdouble_plus(self): self._dll.tf_bD.restype = c_longdouble self._dll.tf_bD.argtypes = (c_byte, c_longdouble) diff --git a/Lib/ctypes/test/test_checkretval.py b/Lib/test/test_ctypes/test_checkretval.py similarity index 95% rename from Lib/ctypes/test/test_checkretval.py rename to Lib/test/test_ctypes/test_checkretval.py index e9567dc3912585..1492099f4b9e99 100644 --- a/Lib/ctypes/test/test_checkretval.py +++ b/Lib/test/test_ctypes/test_checkretval.py @@ -1,7 +1,7 @@ import unittest from ctypes import * -from ctypes.test import need_symbol +from test.test_ctypes import need_symbol class CHECKED(c_int): def _check_retval_(value): diff --git a/Lib/ctypes/test/test_delattr.py b/Lib/test/test_ctypes/test_delattr.py similarity index 100% rename from Lib/ctypes/test/test_delattr.py rename to Lib/test/test_ctypes/test_delattr.py diff --git a/Lib/ctypes/test/test_errno.py b/Lib/test/test_ctypes/test_errno.py similarity index 100% rename from Lib/ctypes/test/test_errno.py rename to Lib/test/test_ctypes/test_errno.py diff --git a/Lib/ctypes/test/test_find.py b/Lib/test/test_ctypes/test_find.py similarity index 100% rename from Lib/ctypes/test/test_find.py rename to Lib/test/test_ctypes/test_find.py diff --git a/Lib/ctypes/test/test_frombuffer.py b/Lib/test/test_ctypes/test_frombuffer.py similarity index 100% rename from Lib/ctypes/test/test_frombuffer.py rename to Lib/test/test_ctypes/test_frombuffer.py diff --git a/Lib/ctypes/test/test_funcptr.py b/Lib/test/test_ctypes/test_funcptr.py similarity index 100% rename from Lib/ctypes/test/test_funcptr.py rename to Lib/test/test_ctypes/test_funcptr.py diff --git a/Lib/ctypes/test/test_functions.py b/Lib/test/test_ctypes/test_functions.py similarity index 99% rename from Lib/ctypes/test/test_functions.py rename to Lib/test/test_ctypes/test_functions.py index f9e92e1cc6b068..a3db0033533fbb 100644 --- a/Lib/ctypes/test/test_functions.py +++ b/Lib/test/test_ctypes/test_functions.py @@ -6,7 +6,7 @@ """ from ctypes import * -from ctypes.test import need_symbol +from test.test_ctypes import need_symbol import sys, unittest try: @@ -128,6 +128,7 @@ def test_doubleresult(self): self.assertEqual(result, -21) self.assertEqual(type(result), float) + @need_symbol('c_longdouble') def test_longdoubleresult(self): f = dll._testfunc_D_bhilfD f.argtypes = [c_byte, c_short, c_int, c_long, c_float, c_longdouble] diff --git a/Lib/ctypes/test/test_incomplete.py b/Lib/test/test_ctypes/test_incomplete.py similarity index 100% rename from Lib/ctypes/test/test_incomplete.py rename to Lib/test/test_ctypes/test_incomplete.py diff --git a/Lib/ctypes/test/test_init.py b/Lib/test/test_ctypes/test_init.py similarity index 100% rename from Lib/ctypes/test/test_init.py rename to Lib/test/test_ctypes/test_init.py diff --git a/Lib/ctypes/test/test_internals.py b/Lib/test/test_ctypes/test_internals.py similarity index 100% rename from Lib/ctypes/test/test_internals.py rename to Lib/test/test_ctypes/test_internals.py diff --git a/Lib/ctypes/test/test_keeprefs.py b/Lib/test/test_ctypes/test_keeprefs.py similarity index 100% rename from Lib/ctypes/test/test_keeprefs.py rename to Lib/test/test_ctypes/test_keeprefs.py diff --git a/Lib/ctypes/test/test_libc.py b/Lib/test/test_ctypes/test_libc.py similarity index 100% rename from Lib/ctypes/test/test_libc.py rename to Lib/test/test_ctypes/test_libc.py diff --git a/Lib/ctypes/test/test_loading.py b/Lib/test/test_ctypes/test_loading.py similarity index 100% rename from 
Lib/ctypes/test/test_loading.py rename to Lib/test/test_ctypes/test_loading.py diff --git a/Lib/ctypes/test/test_macholib.py b/Lib/test/test_ctypes/test_macholib.py similarity index 100% rename from Lib/ctypes/test/test_macholib.py rename to Lib/test/test_ctypes/test_macholib.py diff --git a/Lib/ctypes/test/test_memfunctions.py b/Lib/test/test_ctypes/test_memfunctions.py similarity index 98% rename from Lib/ctypes/test/test_memfunctions.py rename to Lib/test/test_ctypes/test_memfunctions.py index e784b9a7068422..d5c973521177c8 100644 --- a/Lib/ctypes/test/test_memfunctions.py +++ b/Lib/test/test_ctypes/test_memfunctions.py @@ -2,7 +2,7 @@ from test import support import unittest from ctypes import * -from ctypes.test import need_symbol +from test.test_ctypes import need_symbol class MemFunctionsTest(unittest.TestCase): @unittest.skip('test disabled') diff --git a/Lib/ctypes/test/test_numbers.py b/Lib/test/test_ctypes/test_numbers.py similarity index 100% rename from Lib/ctypes/test/test_numbers.py rename to Lib/test/test_ctypes/test_numbers.py diff --git a/Lib/ctypes/test/test_objects.py b/Lib/test/test_ctypes/test_objects.py similarity index 87% rename from Lib/ctypes/test/test_objects.py rename to Lib/test/test_ctypes/test_objects.py index 19e3dc1f2d7621..44a3c61ad79267 100644 --- a/Lib/ctypes/test/test_objects.py +++ b/Lib/test/test_ctypes/test_objects.py @@ -42,7 +42,7 @@ of 'x' ('_b_base_' is either None, or the root object owning the memory block): >>> print(x.array._b_base_) # doctest: +ELLIPSIS -<ctypes.test.test_objects.X object at 0x...> +<test.test_ctypes.test_objects.X object at 0x...> >>> >>> x.array[0] = b'spam spam spam' @@ -56,12 +56,12 @@ import unittest, doctest -import ctypes.test.test_objects +import test.test_ctypes.test_objects class TestCase(unittest.TestCase): def test(self): - failures, tests = doctest.testmod(ctypes.test.test_objects) + failures, tests = doctest.testmod(test.test_ctypes.test_objects) self.assertFalse(failures, 'doctests failed, see output above') if __name__ == '__main__': - doctest.testmod(ctypes.test.test_objects) + doctest.testmod(test.test_ctypes.test_objects) diff --git a/Lib/ctypes/test/test_parameters.py b/Lib/test/test_ctypes/test_parameters.py similarity index 99% rename from Lib/ctypes/test/test_parameters.py rename to Lib/test/test_ctypes/test_parameters.py index 38af7ac13d756c..84839d9c6a96d9 100644 --- a/Lib/ctypes/test/test_parameters.py +++ b/Lib/test/test_ctypes/test_parameters.py @@ -1,11 +1,10 @@ import unittest -from ctypes.test import need_symbol +from test.test_ctypes import need_symbol import test.support class SimpleTypesTestCase(unittest.TestCase): def setUp(self): - import ctypes try: from _ctypes import set_conversion_mode except ImportError: diff --git a/Lib/ctypes/test/test_pep3118.py b/Lib/test/test_ctypes/test_pep3118.py similarity index 100% rename from Lib/ctypes/test/test_pep3118.py rename to Lib/test/test_ctypes/test_pep3118.py diff --git a/Lib/ctypes/test/test_pickling.py b/Lib/test/test_ctypes/test_pickling.py similarity index 100% rename from Lib/ctypes/test/test_pickling.py rename to Lib/test/test_ctypes/test_pickling.py diff --git a/Lib/ctypes/test/test_pointers.py b/Lib/test/test_ctypes/test_pointers.py similarity index 100% rename from Lib/ctypes/test/test_pointers.py rename to Lib/test/test_ctypes/test_pointers.py diff --git a/Lib/ctypes/test/test_prototypes.py b/Lib/test/test_ctypes/test_prototypes.py similarity index 99% rename from Lib/ctypes/test/test_prototypes.py rename to Lib/test/test_ctypes/test_prototypes.py index cd0c649de3e31b..bf27561487ac81 100644 ---
a/Lib/ctypes/test/test_prototypes.py +++ b/Lib/test/test_ctypes/test_prototypes.py @@ -1,5 +1,5 @@ from ctypes import * -from ctypes.test import need_symbol +from test.test_ctypes import need_symbol import unittest # IMPORTANT INFO: diff --git a/Lib/ctypes/test/test_python_api.py b/Lib/test/test_ctypes/test_python_api.py similarity index 100% rename from Lib/ctypes/test/test_python_api.py rename to Lib/test/test_ctypes/test_python_api.py diff --git a/Lib/ctypes/test/test_random_things.py b/Lib/test/test_ctypes/test_random_things.py similarity index 100% rename from Lib/ctypes/test/test_random_things.py rename to Lib/test/test_ctypes/test_random_things.py diff --git a/Lib/ctypes/test/test_refcounts.py b/Lib/test/test_ctypes/test_refcounts.py similarity index 100% rename from Lib/ctypes/test/test_refcounts.py rename to Lib/test/test_ctypes/test_refcounts.py diff --git a/Lib/ctypes/test/test_repr.py b/Lib/test/test_ctypes/test_repr.py similarity index 100% rename from Lib/ctypes/test/test_repr.py rename to Lib/test/test_ctypes/test_repr.py diff --git a/Lib/ctypes/test/test_returnfuncptrs.py b/Lib/test/test_ctypes/test_returnfuncptrs.py similarity index 100% rename from Lib/ctypes/test/test_returnfuncptrs.py rename to Lib/test/test_ctypes/test_returnfuncptrs.py diff --git a/Lib/ctypes/test/test_simplesubclasses.py b/Lib/test/test_ctypes/test_simplesubclasses.py similarity index 100% rename from Lib/ctypes/test/test_simplesubclasses.py rename to Lib/test/test_ctypes/test_simplesubclasses.py diff --git a/Lib/ctypes/test/test_sizes.py b/Lib/test/test_ctypes/test_sizes.py similarity index 100% rename from Lib/ctypes/test/test_sizes.py rename to Lib/test/test_ctypes/test_sizes.py diff --git a/Lib/ctypes/test/test_slicing.py b/Lib/test/test_ctypes/test_slicing.py similarity index 99% rename from Lib/ctypes/test/test_slicing.py rename to Lib/test/test_ctypes/test_slicing.py index a3932f176728af..b3e68f9a822d94 100644 --- a/Lib/ctypes/test/test_slicing.py +++ b/Lib/test/test_ctypes/test_slicing.py @@ -1,6 +1,6 @@ import unittest from ctypes import * -from ctypes.test import need_symbol +from test.test_ctypes import need_symbol import _ctypes_test diff --git a/Lib/ctypes/test/test_stringptr.py b/Lib/test/test_ctypes/test_stringptr.py similarity index 100% rename from Lib/ctypes/test/test_stringptr.py rename to Lib/test/test_ctypes/test_stringptr.py diff --git a/Lib/ctypes/test/test_strings.py b/Lib/test/test_ctypes/test_strings.py similarity index 99% rename from Lib/ctypes/test/test_strings.py rename to Lib/test/test_ctypes/test_strings.py index 12e208828a70db..a9003be3f506e2 100644 --- a/Lib/ctypes/test/test_strings.py +++ b/Lib/test/test_ctypes/test_strings.py @@ -1,6 +1,6 @@ import unittest from ctypes import * -from ctypes.test import need_symbol +from test.test_ctypes import need_symbol class StringArrayTestCase(unittest.TestCase): def test(self): diff --git a/Lib/ctypes/test/test_struct_fields.py b/Lib/test/test_ctypes/test_struct_fields.py similarity index 100% rename from Lib/ctypes/test/test_struct_fields.py rename to Lib/test/test_ctypes/test_struct_fields.py diff --git a/Lib/ctypes/test/test_structures.py b/Lib/test/test_ctypes/test_structures.py similarity index 99% rename from Lib/ctypes/test/test_structures.py rename to Lib/test/test_ctypes/test_structures.py index 97ad2b8ed8a50d..13c0470ba2238c 100644 --- a/Lib/ctypes/test/test_structures.py +++ b/Lib/test/test_ctypes/test_structures.py @@ -2,7 +2,7 @@ import sys import unittest from ctypes import * -from ctypes.test import need_symbol 
+from test.test_ctypes import need_symbol from struct import calcsize import _ctypes_test from test import support diff --git a/Lib/ctypes/test/test_unaligned_structures.py b/Lib/test/test_ctypes/test_unaligned_structures.py similarity index 100% rename from Lib/ctypes/test/test_unaligned_structures.py rename to Lib/test/test_ctypes/test_unaligned_structures.py diff --git a/Lib/ctypes/test/test_unicode.py b/Lib/test/test_ctypes/test_unicode.py similarity index 97% rename from Lib/ctypes/test/test_unicode.py rename to Lib/test/test_ctypes/test_unicode.py index 60c75424b767fa..319cb3b1dcac21 100644 --- a/Lib/ctypes/test/test_unicode.py +++ b/Lib/test/test_ctypes/test_unicode.py @@ -1,6 +1,6 @@ import unittest import ctypes -from ctypes.test import need_symbol +from test.test_ctypes import need_symbol import _ctypes_test diff --git a/Lib/ctypes/test/test_values.py b/Lib/test/test_ctypes/test_values.py similarity index 100% rename from Lib/ctypes/test/test_values.py rename to Lib/test/test_ctypes/test_values.py diff --git a/Lib/ctypes/test/test_varsize_struct.py b/Lib/test/test_ctypes/test_varsize_struct.py similarity index 100% rename from Lib/ctypes/test/test_varsize_struct.py rename to Lib/test/test_ctypes/test_varsize_struct.py diff --git a/Lib/ctypes/test/test_win32.py b/Lib/test/test_ctypes/test_win32.py similarity index 100% rename from Lib/ctypes/test/test_win32.py rename to Lib/test/test_ctypes/test_win32.py diff --git a/Lib/ctypes/test/test_wintypes.py b/Lib/test/test_ctypes/test_wintypes.py similarity index 100% rename from Lib/ctypes/test/test_wintypes.py rename to Lib/test/test_ctypes/test_wintypes.py diff --git a/Lib/test/test_dataclasses.py b/Lib/test/test_dataclasses.py index cf29cd07516f0d..98dfab481bfd14 100644 --- a/Lib/test/test_dataclasses.py +++ b/Lib/test/test_dataclasses.py @@ -3109,6 +3109,54 @@ def test_weakref_slot_make_dataclass(self): "weakref_slot is True but slots is False"): B = make_dataclass('B', [('a', int),], weakref_slot=True) + def test_weakref_slot_subclass_weakref_slot(self): + @dataclass(slots=True, weakref_slot=True) + class Base: + field: int + + # A *can* also specify weakref_slot=True if it wants to (gh-93521) + @dataclass(slots=True, weakref_slot=True) + class A(Base): + ... + + # __weakref__ is in the base class, not A. But an instance of A + # is still weakref-able. + self.assertIn("__weakref__", Base.__slots__) + self.assertNotIn("__weakref__", A.__slots__) + a = A(1) + weakref.ref(a) + + def test_weakref_slot_subclass_no_weakref_slot(self): + @dataclass(slots=True, weakref_slot=True) + class Base: + field: int + + @dataclass(slots=True) + class A(Base): + ... + + # __weakref__ is in the base class, not A. Even though A doesn't + # specify weakref_slot, it should still be weakref-able. + self.assertIn("__weakref__", Base.__slots__) + self.assertNotIn("__weakref__", A.__slots__) + a = A(1) + weakref.ref(a) + + def test_weakref_slot_normal_base_weakref_slot(self): + class Base: + __slots__ = ('__weakref__',) + + @dataclass(slots=True, weakref_slot=True) + class A(Base): + field: int + + # __weakref__ is in the base class, not A. But an instance of + # A is still weakref-able. 
+ self.assertIn("__weakref__", Base.__slots__) + self.assertNotIn("__weakref__", A.__slots__) + a = A(1) + weakref.ref(a) + class TestDescriptors(unittest.TestCase): def test_set_name(self): diff --git a/Lib/test/test_dbm_dumb.py b/Lib/test/test_dbm_dumb.py index 73cff638f1e1a4..a481175b3bfdbd 100644 --- a/Lib/test/test_dbm_dumb.py +++ b/Lib/test/test_dbm_dumb.py @@ -42,6 +42,7 @@ def test_dumbdbm_creation(self): self.read_helper(f) @unittest.skipUnless(hasattr(os, 'umask'), 'test needs os.umask()') + @os_helper.skip_unless_working_chmod def test_dumbdbm_creation_mode(self): try: old_umask = os.umask(0o002) @@ -265,6 +266,7 @@ def test_invalid_flag(self): "'r', 'w', 'c', or 'n'"): dumbdbm.open(_fname, flag) + @os_helper.skip_unless_working_chmod def test_readonly_files(self): with os_helper.temp_dir() as dir: fname = os.path.join(dir, 'db') diff --git a/Lib/test/test_decorators.py b/Lib/test/test_decorators.py index a6baa3ad1dd6b3..4b492178c1581f 100644 --- a/Lib/test/test_decorators.py +++ b/Lib/test/test_decorators.py @@ -1,4 +1,3 @@ -from test import support import unittest from types import MethodType diff --git a/Lib/test/test_defaultdict.py b/Lib/test/test_defaultdict.py index 68fc449780a3db..bdbe9b81e8fb3f 100644 --- a/Lib/test/test_defaultdict.py +++ b/Lib/test/test_defaultdict.py @@ -1,9 +1,7 @@ """Unit tests for collections.defaultdict.""" -import os import copy import pickle -import tempfile import unittest from collections import defaultdict diff --git a/Lib/test/test_descr.py b/Lib/test/test_descr.py index 9a09d8a9e3fe03..b5f1bd4af888d3 100644 --- a/Lib/test/test_descr.py +++ b/Lib/test/test_descr.py @@ -3563,7 +3563,6 @@ class MyInt(int): def test_str_of_str_subclass(self): # Testing __str__ defined in subclass of str ... import binascii - import io class octetstring(str): def __str__(self): diff --git a/Lib/test/test_descrtut.py b/Lib/test/test_descrtut.py index e01a31a74695d5..b4158eb23a56f0 100644 --- a/Lib/test/test_descrtut.py +++ b/Lib/test/test_descrtut.py @@ -9,7 +9,6 @@ # deterministic. 
from test.support import sortdict -import pprint import doctest import unittest @@ -167,6 +166,7 @@ def merge(self, other): You can get the information from the list type: + >>> import pprint >>> pprint.pprint(dir(list)) # like list.__dict__.keys(), but sorted ['__add__', '__class__', diff --git a/Lib/test/test_dis.py b/Lib/test/test_dis.py index af736d6fe96141..255425302f4280 100644 --- a/Lib/test/test_dis.py +++ b/Lib/test/test_dis.py @@ -107,7 +107,6 @@ def _f(a): %3d LOAD_GLOBAL 1 (NULL + print) LOAD_FAST 0 (a) - PRECALL 1 CALL 1 POP_TOP @@ -122,7 +121,6 @@ def _f(a): RESUME 0 LOAD_GLOBAL 1 LOAD_FAST 0 - PRECALL 1 CALL 1 POP_TOP LOAD_CONST 1 @@ -143,13 +141,12 @@ def bug708901(): %3d LOAD_CONST 2 (10) -%3d PRECALL 2 - CALL 2 +%3d CALL 2 GET_ITER - >> FOR_ITER 2 (to 40) + >> FOR_ITER 2 (to 38) STORE_FAST 0 (res) -%3d JUMP_BACKWARD 3 (to 34) +%3d JUMP_BACKWARD 4 (to 30) %3d >> LOAD_CONST 0 (None) RETURN_VALUE @@ -174,13 +171,11 @@ def bug1333982(x=[]): MAKE_FUNCTION 0 LOAD_FAST 0 (x) GET_ITER - PRECALL 0 CALL 0 %3d LOAD_CONST 3 (1) %3d BINARY_OP 0 (+) - PRECALL 0 CALL 0 RAISE_VARARGS 1 """ % (bug1333982.__code__.co_firstlineno, @@ -315,7 +310,6 @@ def bug42562(): 3 PUSH_NULL LOAD_NAME 3 (fun) LOAD_CONST 0 (1) - PRECALL 1 CALL 1 LOAD_NAME 2 (__annotations__) LOAD_CONST 2 ('y') @@ -326,7 +320,6 @@ def bug42562(): PUSH_NULL LOAD_NAME 3 (fun) LOAD_CONST 3 (0) - PRECALL 1 CALL 1 STORE_SUBSCR LOAD_NAME 1 (int) @@ -367,17 +360,17 @@ def bug42562(): --> BINARY_OP 11 (/) POP_TOP -%3d LOAD_FAST 1 (tb) +%3d LOAD_FAST_CHECK 1 (tb) RETURN_VALUE >> PUSH_EXC_INFO %3d LOAD_GLOBAL 0 (Exception) CHECK_EXC_MATCH - POP_JUMP_FORWARD_IF_FALSE 18 (to 72) + POP_JUMP_FORWARD_IF_FALSE 23 (to 82) STORE_FAST 0 (e) %3d LOAD_FAST 0 (e) - LOAD_ATTR 1 (__traceback__) + LOAD_ATTR 2 (__traceback__) STORE_FAST 1 (tb) POP_EXCEPT LOAD_CONST 0 (None) @@ -396,6 +389,7 @@ def bug42562(): POP_EXCEPT RERAISE 1 ExceptionTable: +4 rows """ % (TRACEBACK_CODE.co_firstlineno, TRACEBACK_CODE.co_firstlineno + 1, TRACEBACK_CODE.co_firstlineno + 2, @@ -428,6 +422,133 @@ def _fstring(a, b, c, d): RETURN_VALUE """ % (_fstring.__code__.co_firstlineno, _fstring.__code__.co_firstlineno + 1) +def _with(c): + with c: + x = 1 + y = 2 + +dis_with = """\ +%3d RESUME 0 + +%3d LOAD_FAST 0 (c) + BEFORE_WITH + POP_TOP + +%3d LOAD_CONST 1 (1) + STORE_FAST 1 (x) + +%3d LOAD_CONST 0 (None) + LOAD_CONST 0 (None) + LOAD_CONST 0 (None) + CALL 2 + POP_TOP + +%3d LOAD_CONST 2 (2) + STORE_FAST 2 (y) + LOAD_CONST 0 (None) + RETURN_VALUE + +%3d >> PUSH_EXC_INFO + WITH_EXCEPT_START + POP_JUMP_FORWARD_IF_TRUE 1 (to 46) + RERAISE 2 + >> POP_TOP + POP_EXCEPT + POP_TOP + POP_TOP + +%3d LOAD_CONST 2 (2) + STORE_FAST 2 (y) + LOAD_CONST 0 (None) + RETURN_VALUE + >> COPY 3 + POP_EXCEPT + RERAISE 1 +ExceptionTable: +2 rows +""" % (_with.__code__.co_firstlineno, + _with.__code__.co_firstlineno + 1, + _with.__code__.co_firstlineno + 2, + _with.__code__.co_firstlineno + 1, + _with.__code__.co_firstlineno + 3, + _with.__code__.co_firstlineno + 1, + _with.__code__.co_firstlineno + 3, + ) + +async def _asyncwith(c): + async with c: + x = 1 + y = 2 + +dis_asyncwith = """\ +%3d RETURN_GENERATOR + POP_TOP + RESUME 0 + +%3d LOAD_FAST 0 (c) + BEFORE_ASYNC_WITH + GET_AWAITABLE 1 + LOAD_CONST 0 (None) + >> SEND 3 (to 22) + YIELD_VALUE 3 + RESUME 3 + JUMP_BACKWARD_NO_INTERRUPT 4 (to 14) + >> POP_TOP + +%3d LOAD_CONST 1 (1) + STORE_FAST 1 (x) + +%3d LOAD_CONST 0 (None) + LOAD_CONST 0 (None) + LOAD_CONST 0 (None) + CALL 2 + GET_AWAITABLE 2 + LOAD_CONST 0 (None) + >> SEND 3 (to 56) + 
YIELD_VALUE 2 + RESUME 3 + JUMP_BACKWARD_NO_INTERRUPT 4 (to 48) + >> POP_TOP + +%3d LOAD_CONST 2 (2) + STORE_FAST 2 (y) + LOAD_CONST 0 (None) + RETURN_VALUE + +%3d >> PUSH_EXC_INFO + WITH_EXCEPT_START + GET_AWAITABLE 2 + LOAD_CONST 0 (None) + >> SEND 3 (to 82) + YIELD_VALUE 6 + RESUME 3 + JUMP_BACKWARD_NO_INTERRUPT 4 (to 74) + >> POP_JUMP_FORWARD_IF_TRUE 1 (to 86) + RERAISE 2 + >> POP_TOP + POP_EXCEPT + POP_TOP + POP_TOP + +%3d LOAD_CONST 2 (2) + STORE_FAST 2 (y) + LOAD_CONST 0 (None) + RETURN_VALUE + >> COPY 3 + POP_EXCEPT + RERAISE 1 +ExceptionTable: +2 rows +""" % (_asyncwith.__code__.co_firstlineno, + _asyncwith.__code__.co_firstlineno + 1, + _asyncwith.__code__.co_firstlineno + 2, + _asyncwith.__code__.co_firstlineno + 1, + _asyncwith.__code__.co_firstlineno + 3, + _asyncwith.__code__.co_firstlineno + 1, + _asyncwith.__code__.co_firstlineno + 3, + ) + + def _tryfinally(a, b): try: return a @@ -449,14 +570,12 @@ def _tryfinallyconst(b): %3d PUSH_NULL LOAD_FAST 1 (b) - PRECALL 0 CALL 0 POP_TOP RETURN_VALUE >> PUSH_EXC_INFO PUSH_NULL LOAD_FAST 1 (b) - PRECALL 0 CALL 0 POP_TOP RERAISE 0 @@ -464,6 +583,7 @@ def _tryfinallyconst(b): POP_EXCEPT RERAISE 1 ExceptionTable: +2 rows """ % (_tryfinally.__code__.co_firstlineno, _tryfinally.__code__.co_firstlineno + 1, _tryfinally.__code__.co_firstlineno + 2, @@ -479,7 +599,6 @@ def _tryfinallyconst(b): %3d PUSH_NULL LOAD_FAST 0 (b) - PRECALL 0 CALL 0 POP_TOP LOAD_CONST 1 (1) @@ -487,7 +606,6 @@ def _tryfinallyconst(b): PUSH_EXC_INFO PUSH_NULL LOAD_FAST 0 (b) - PRECALL 0 CALL 0 POP_TOP RERAISE 0 @@ -495,6 +613,7 @@ def _tryfinallyconst(b): POP_EXCEPT RERAISE 1 ExceptionTable: +1 row """ % (_tryfinallyconst.__code__.co_firstlineno, _tryfinallyconst.__code__.co_firstlineno + 1, _tryfinallyconst.__code__.co_firstlineno + 2, @@ -550,7 +669,6 @@ def foo(x): MAKE_FUNCTION 8 (closure) LOAD_DEREF 1 (y) GET_ITER - PRECALL 0 CALL 0 RETURN_VALUE """ % (dis_nested_0, @@ -569,13 +687,13 @@ def foo(x): %3d RESUME 0 BUILD_LIST 0 LOAD_FAST 0 (.0) - >> FOR_ITER 7 (to 24) + >> FOR_ITER 7 (to 26) STORE_FAST 1 (z) LOAD_DEREF 2 (x) LOAD_FAST 1 (z) BINARY_OP 0 (+) LIST_APPEND 2 - JUMP_BACKWARD 8 (to 8) + JUMP_BACKWARD 9 (to 8) >> RETURN_VALUE """ % (dis_nested_1, __file__, @@ -608,31 +726,46 @@ def loop_test(): load_test(i) dis_loop_test_quickened_code = """\ -%3d 0 RESUME_QUICK 0 +%3d RESUME_QUICK 0 + +%3d BUILD_LIST 0 + LOAD_CONST 1 ((1, 2, 3)) + LIST_EXTEND 1 + LOAD_CONST 2 (3) + BINARY_OP_ADAPTIVE 5 (*) + GET_ITER + >> FOR_ITER_LIST 15 (to 50) + STORE_FAST 0 (i) + +%3d LOAD_GLOBAL_MODULE 1 (NULL + load_test) + LOAD_FAST 0 (i) + CALL_PY_WITH_DEFAULTS 1 + POP_TOP + JUMP_BACKWARD_QUICK 17 (to 16) -%3d 2 BUILD_LIST 0 - 4 LOAD_CONST 1 ((1, 2, 3)) - 6 LIST_EXTEND 1 - 8 LOAD_CONST 2 (3) - 10 BINARY_OP_ADAPTIVE 5 (*) - 14 GET_ITER - 16 FOR_ITER 17 (to 52) - 18 STORE_FAST 0 (i) - -%3d 20 LOAD_GLOBAL_MODULE 1 (NULL + load_test) - 32 LOAD_FAST 0 (i) - 34 PRECALL_PYFUNC 1 - 38 CALL_PY_WITH_DEFAULTS 1 - 48 POP_TOP - 50 JUMP_BACKWARD_QUICK 18 (to 16) - -%3d >> 52 LOAD_CONST 0 (None) - 54 RETURN_VALUE +%3d >> LOAD_CONST 0 (None) + RETURN_VALUE """ % (loop_test.__code__.co_firstlineno, loop_test.__code__.co_firstlineno + 1, loop_test.__code__.co_firstlineno + 2, loop_test.__code__.co_firstlineno + 1,) +def extended_arg_quick(): + *_, _ = ... 
+ +dis_extended_arg_quick_code = """\ +%3d 0 RESUME 0 + +%3d 2 LOAD_CONST 1 (Ellipsis) + 4 EXTENDED_ARG_QUICK 1 + 6 UNPACK_EX 256 + 8 STORE_FAST 0 (_) + 10 STORE_FAST 0 (_) + 12 LOAD_CONST 0 (None) + 14 RETURN_VALUE +"""% (extended_arg_quick.__code__.co_firstlineno, + extended_arg_quick.__code__.co_firstlineno + 1,) + QUICKENING_WARMUP_DELAY = 8 class DisTestBase(unittest.TestCase): @@ -675,6 +808,18 @@ def assert_offsets_increasing(self, text, delta): self.assertGreaterEqual(offset, expected_offset, line) expected_offset = offset + delta + def assert_exception_table_increasing(self, lines): + prev_start, prev_end = -1, -1 + count = 0 + for line in lines: + m = re.match(r' (\d+) to (\d+) -> \d+ \[\d+\]', line) + start, end = [int(g) for g in m.groups()] + self.assertGreaterEqual(end, start) + self.assertGreater(start, prev_end) + prev_start, prev_end = start, end + count += 1 + return count + def strip_offsets(self, text): lines = text.splitlines(True) start, end = self.find_offset_column(lines) @@ -688,6 +833,9 @@ def strip_offsets(self, text): res.append(line) else: res.append(line[:start] + line[end:]) + num_rows = self.assert_exception_table_increasing(lines) + if num_rows: + res.append(f"{num_rows} row{'s' if num_rows > 1 else ''}\n") return "".join(res) def do_disassembly_compare(self, got, expected, with_offsets=False): @@ -880,6 +1028,12 @@ def test_disassemble_coroutine(self): def test_disassemble_fstring(self): self.do_disassembly_test(_fstring, dis_fstring) + def test_disassemble_with(self): + self.do_disassembly_test(_with, dis_with) + + def test_disassemble_asyncwith(self): + self.do_disassembly_test(_asyncwith, dis_asyncwith) + def test_disassemble_try_finally(self): self.do_disassembly_test(_tryfinally, dis_tryfinally) self.do_disassembly_test(_tryfinallyconst, dis_tryfinallyconst) @@ -980,7 +1134,7 @@ def test_load_attr_specialize(self): 1 2 LOAD_CONST 0 ('a') 4 LOAD_ATTR_SLOT 0 (__class__) - 14 RETURN_VALUE + 24 RETURN_VALUE """ co = compile("'a'.__class__", "", "eval") self.code_quicken(lambda: exec(co, {}, {})) @@ -990,26 +1144,30 @@ def test_load_attr_specialize(self): @cpython_only def test_call_specialize(self): call_quicken = """\ - 0 RESUME_QUICK 0 + RESUME_QUICK 0 - 1 2 PUSH_NULL - 4 LOAD_NAME 0 (str) - 6 LOAD_CONST 0 (1) - 8 PRECALL_NO_KW_STR_1 1 - 12 CALL_ADAPTIVE 1 - 22 RETURN_VALUE + 1 PUSH_NULL + LOAD_NAME 0 (str) + LOAD_CONST 0 (1) + CALL_NO_KW_STR_1 1 + RETURN_VALUE """ co = compile("str(1)", "", "eval") self.code_quicken(lambda: exec(co, {}, {})) got = self.get_disassembly(co, adaptive=True) - self.do_disassembly_compare(got, call_quicken, True) + self.do_disassembly_compare(got, call_quicken) @cpython_only def test_loop_quicken(self): # Loop can trigger a quicken where the loop is located self.code_quicken(loop_test, 1) got = self.get_disassembly(loop_test, adaptive=True) - self.do_disassembly_compare(got, dis_loop_test_quickened_code, True) + self.do_disassembly_compare(got, dis_loop_test_quickened_code) + + @cpython_only + def test_extended_arg_quick(self): + got = self.get_disassembly(extended_arg_quick) + self.do_disassembly_compare(got, dis_extended_arg_quick_code, True) def get_cached_values(self, quickened, adaptive): def f(): @@ -1039,8 +1197,10 @@ def test_show_caches(self): caches = list(self.get_cached_values(quickened, adaptive)) for cache in caches: self.assertRegex(cache, pattern) - self.assertEqual(caches.count(""), 8) - self.assertEqual(len(caches), 25) + total_caches = 23 + empty_caches = 8 if adaptive and quickened else total_caches + 
self.assertEqual(caches.count(""), empty_caches) + self.assertEqual(len(caches), total_caches) class DisWithFileTests(DisTests): @@ -1319,6 +1479,7 @@ def _prepare_test_cases(): #_prepare_test_cases() Instruction = dis.Instruction + expected_opinfo_outer = [ Instruction(opname='MAKE_CELL', opcode=135, arg=0, argval='a', argrepr='a', offset=0, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='MAKE_CELL', opcode=135, arg=1, argval='b', argrepr='b', offset=2, starts_line=None, is_jump_target=False, positions=None), @@ -1338,11 +1499,10 @@ def _prepare_test_cases(): Instruction(opname='BUILD_LIST', opcode=103, arg=0, argval=0, argrepr='', offset=40, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='BUILD_MAP', opcode=105, arg=0, argval=0, argrepr='', offset=42, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='LOAD_CONST', opcode=100, arg=6, argval='Hello world!', argrepr="'Hello world!'", offset=44, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PRECALL', opcode=166, arg=7, argval=7, argrepr='', offset=46, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL', opcode=171, arg=7, argval=7, argrepr='', offset=50, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=60, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=2, argval='f', argrepr='f', offset=62, starts_line=8, is_jump_target=False, positions=None), - Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=64, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL', opcode=171, arg=7, argval=7, argrepr='', offset=46, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=56, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=2, argval='f', argrepr='f', offset=58, starts_line=8, is_jump_target=False, positions=None), + Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=60, starts_line=None, is_jump_target=False, positions=None), ] expected_opinfo_f = [ @@ -1364,11 +1524,10 @@ def _prepare_test_cases(): Instruction(opname='LOAD_DEREF', opcode=137, arg=4, argval='b', argrepr='b', offset=40, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='LOAD_DEREF', opcode=137, arg=0, argval='c', argrepr='c', offset=42, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='LOAD_DEREF', opcode=137, arg=1, argval='d', argrepr='d', offset=44, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PRECALL', opcode=166, arg=4, argval=4, argrepr='', offset=46, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL', opcode=171, arg=4, argval=4, argrepr='', offset=50, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=60, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=2, argval='inner', argrepr='inner', offset=62, starts_line=6, is_jump_target=False, positions=None), - Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=64, starts_line=None, 
is_jump_target=False, positions=None), + Instruction(opname='CALL', opcode=171, arg=4, argval=4, argrepr='', offset=46, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=56, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=2, argval='inner', argrepr='inner', offset=58, starts_line=6, is_jump_target=False, positions=None), + Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=60, starts_line=None, is_jump_target=False, positions=None), ] expected_opinfo_inner = [ @@ -1381,137 +1540,124 @@ def _prepare_test_cases(): Instruction(opname='LOAD_DEREF', opcode=137, arg=5, argval='d', argrepr='d', offset=22, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='e', argrepr='e', offset=24, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='LOAD_FAST', opcode=124, arg=1, argval='f', argrepr='f', offset=26, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PRECALL', opcode=166, arg=6, argval=6, argrepr='', offset=28, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL', opcode=171, arg=6, argval=6, argrepr='', offset=32, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=42, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=44, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=46, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL', opcode=171, arg=6, argval=6, argrepr='', offset=28, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=38, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=40, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=42, starts_line=None, is_jump_target=False, positions=None), ] expected_opinfo_jumpy = [ Instruction(opname='RESUME', opcode=151, arg=0, argval=0, argrepr='', offset=0, starts_line=1, is_jump_target=False, positions=None), Instruction(opname='LOAD_GLOBAL', opcode=116, arg=1, argval='range', argrepr='NULL + range', offset=2, starts_line=3, is_jump_target=False, positions=None), Instruction(opname='LOAD_CONST', opcode=100, arg=1, argval=10, argrepr='10', offset=14, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PRECALL', opcode=166, arg=1, argval=1, argrepr='', offset=16, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL', opcode=171, arg=1, argval=1, argrepr='', offset=20, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='GET_ITER', opcode=68, arg=None, argval=None, argrepr='', offset=30, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='FOR_ITER', opcode=93, arg=32, argval=98, argrepr='to 98', offset=32, starts_line=None, is_jump_target=True, positions=None), - Instruction(opname='STORE_FAST', opcode=125, arg=0, argval='i', 
argrepr='i', offset=34, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=3, argval='print', argrepr='NULL + print', offset=36, starts_line=4, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=48, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PRECALL', opcode=166, arg=1, argval=1, argrepr='', offset=50, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL', opcode=171, arg=1, argval=1, argrepr='', offset=54, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=64, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=66, starts_line=5, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=2, argval=4, argrepr='4', offset=68, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='COMPARE_OP', opcode=107, arg=0, argval='<', argrepr='<', offset=70, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_JUMP_FORWARD_IF_FALSE', opcode=114, arg=1, argval=80, argrepr='to 80', offset=76, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_BACKWARD', opcode=140, arg=24, argval=32, argrepr='to 32', offset=78, starts_line=6, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=80, starts_line=7, is_jump_target=True, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=3, argval=6, argrepr='6', offset=82, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='COMPARE_OP', opcode=107, arg=4, argval='>', argrepr='>', offset=84, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_JUMP_FORWARD_IF_FALSE', opcode=114, arg=2, argval=96, argrepr='to 96', offset=90, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=92, starts_line=8, is_jump_target=False, positions=None), - Instruction(opname='JUMP_FORWARD', opcode=110, arg=16, argval=128, argrepr='to 128', offset=94, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_BACKWARD', opcode=140, arg=33, argval=32, argrepr='to 32', offset=96, starts_line=7, is_jump_target=True, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=3, argval='print', argrepr='NULL + print', offset=98, starts_line=10, is_jump_target=True, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=4, argval='I can haz else clause?', argrepr="'I can haz else clause?'", offset=110, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PRECALL', opcode=166, arg=1, argval=1, argrepr='', offset=112, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL', opcode=171, arg=1, argval=1, argrepr='', offset=116, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=126, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=128, starts_line=11, is_jump_target=True, positions=None), - 
Instruction(opname='POP_JUMP_FORWARD_IF_FALSE', opcode=114, arg=36, argval=204, argrepr='to 204', offset=130, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=3, argval='print', argrepr='NULL + print', offset=132, starts_line=12, is_jump_target=True, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=144, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PRECALL', opcode=166, arg=1, argval=1, argrepr='', offset=146, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL', opcode=171, arg=1, argval=1, argrepr='', offset=150, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=160, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=162, starts_line=13, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=5, argval=1, argrepr='1', offset=164, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='BINARY_OP', opcode=122, arg=23, argval=23, argrepr='-=', offset=166, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='STORE_FAST', opcode=125, arg=0, argval='i', argrepr='i', offset=170, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=172, starts_line=14, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=3, argval=6, argrepr='6', offset=174, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='COMPARE_OP', opcode=107, arg=4, argval='>', argrepr='>', offset=176, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_JUMP_FORWARD_IF_FALSE', opcode=114, arg=1, argval=186, argrepr='to 186', offset=182, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_BACKWARD', opcode=140, arg=29, argval=128, argrepr='to 128', offset=184, starts_line=15, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=186, starts_line=16, is_jump_target=True, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=2, argval=4, argrepr='4', offset=188, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='COMPARE_OP', opcode=107, arg=0, argval='<', argrepr='<', offset=190, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_JUMP_FORWARD_IF_FALSE', opcode=114, arg=1, argval=200, argrepr='to 200', offset=196, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_FORWARD', opcode=110, arg=17, argval=234, argrepr='to 234', offset=198, starts_line=17, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=200, starts_line=11, is_jump_target=True, positions=None), - Instruction(opname='POP_JUMP_BACKWARD_IF_TRUE', opcode=176, arg=36, argval=132, argrepr='to 132', offset=202, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=3, argval='print', argrepr='NULL + print', offset=204, starts_line=19, is_jump_target=True, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=6, argval='Who let 
lolcatz into this test suite?', argrepr="'Who let lolcatz into this test suite?'", offset=216, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PRECALL', opcode=166, arg=1, argval=1, argrepr='', offset=218, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL', opcode=171, arg=1, argval=1, argrepr='', offset=222, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=232, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='NOP', opcode=9, arg=None, argval=None, argrepr='', offset=234, starts_line=20, is_jump_target=True, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=5, argval=1, argrepr='1', offset=236, starts_line=21, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=7, argval=0, argrepr='0', offset=238, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='BINARY_OP', opcode=122, arg=11, argval=11, argrepr='/', offset=240, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=244, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=246, starts_line=25, is_jump_target=False, positions=None), - Instruction(opname='BEFORE_WITH', opcode=53, arg=None, argval=None, argrepr='', offset=248, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='STORE_FAST', opcode=125, arg=1, argval='dodgy', argrepr='dodgy', offset=250, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=3, argval='print', argrepr='NULL + print', offset=252, starts_line=26, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=8, argval='Never reach this', argrepr="'Never reach this'", offset=264, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PRECALL', opcode=166, arg=1, argval=1, argrepr='', offset=266, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL', opcode=171, arg=1, argval=1, argrepr='', offset=270, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=280, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=282, starts_line=25, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=284, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=286, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PRECALL', opcode=166, arg=2, argval=2, argrepr='', offset=288, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL', opcode=171, arg=2, argval=2, argrepr='', offset=292, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=302, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_FORWARD', opcode=110, arg=11, argval=328, argrepr='to 328', offset=304, starts_line=None, is_jump_target=False, 
positions=None), - Instruction(opname='PUSH_EXC_INFO', opcode=35, arg=None, argval=None, argrepr='', offset=306, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL', opcode=171, arg=1, argval=1, argrepr='', offset=16, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='GET_ITER', opcode=68, arg=None, argval=None, argrepr='', offset=26, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='FOR_ITER', opcode=93, arg=29, argval=90, argrepr='to 90', offset=28, starts_line=None, is_jump_target=True, positions=None), + Instruction(opname='STORE_FAST', opcode=125, arg=0, argval='i', argrepr='i', offset=32, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=3, argval='print', argrepr='NULL + print', offset=34, starts_line=4, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=46, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL', opcode=171, arg=1, argval=1, argrepr='', offset=48, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=58, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=60, starts_line=5, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=2, argval=4, argrepr='4', offset=62, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='COMPARE_OP', opcode=107, arg=0, argval='<', argrepr='<', offset=64, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_JUMP_FORWARD_IF_FALSE', opcode=114, arg=1, argval=74, argrepr='to 74', offset=70, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='JUMP_BACKWARD', opcode=140, arg=23, argval=28, argrepr='to 28', offset=72, starts_line=6, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=74, starts_line=7, is_jump_target=True, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=3, argval=6, argrepr='6', offset=76, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='COMPARE_OP', opcode=107, arg=4, argval='>', argrepr='>', offset=78, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_JUMP_BACKWARD_IF_FALSE', opcode=175, arg=29, argval=28, argrepr='to 28', offset=84, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=86, starts_line=8, is_jump_target=False, positions=None), + Instruction(opname='JUMP_FORWARD', opcode=110, arg=13, argval=116, argrepr='to 116', offset=88, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=3, argval='print', argrepr='NULL + print', offset=90, starts_line=10, is_jump_target=True, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=4, argval='I can haz else clause?', argrepr="'I can haz else clause?'", offset=102, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL', opcode=171, arg=1, argval=1, argrepr='', offset=104, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', 
offset=114, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST_CHECK', opcode=127, arg=0, argval='i', argrepr='i', offset=116, starts_line=11, is_jump_target=True, positions=None), + Instruction(opname='POP_JUMP_FORWARD_IF_FALSE', opcode=114, arg=34, argval=188, argrepr='to 188', offset=118, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=3, argval='print', argrepr='NULL + print', offset=120, starts_line=12, is_jump_target=True, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=132, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL', opcode=171, arg=1, argval=1, argrepr='', offset=134, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=144, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=146, starts_line=13, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=5, argval=1, argrepr='1', offset=148, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='BINARY_OP', opcode=122, arg=23, argval=23, argrepr='-=', offset=150, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='STORE_FAST', opcode=125, arg=0, argval='i', argrepr='i', offset=154, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=156, starts_line=14, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=3, argval=6, argrepr='6', offset=158, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='COMPARE_OP', opcode=107, arg=4, argval='>', argrepr='>', offset=160, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_JUMP_FORWARD_IF_FALSE', opcode=114, arg=1, argval=170, argrepr='to 170', offset=166, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='JUMP_BACKWARD', opcode=140, arg=27, argval=116, argrepr='to 116', offset=168, starts_line=15, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=170, starts_line=16, is_jump_target=True, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=2, argval=4, argrepr='4', offset=172, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='COMPARE_OP', opcode=107, arg=0, argval='<', argrepr='<', offset=174, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_JUMP_FORWARD_IF_FALSE', opcode=114, arg=1, argval=184, argrepr='to 184', offset=180, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='JUMP_FORWARD', opcode=110, arg=15, argval=214, argrepr='to 214', offset=182, starts_line=17, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=184, starts_line=11, is_jump_target=True, positions=None), + Instruction(opname='POP_JUMP_BACKWARD_IF_TRUE', opcode=176, arg=34, argval=120, argrepr='to 120', offset=186, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=3, argval='print', argrepr='NULL + print', offset=188, starts_line=19, is_jump_target=True, 
positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=6, argval='Who let lolcatz into this test suite?', argrepr="'Who let lolcatz into this test suite?'", offset=200, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL', opcode=171, arg=1, argval=1, argrepr='', offset=202, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=212, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='NOP', opcode=9, arg=None, argval=None, argrepr='', offset=214, starts_line=20, is_jump_target=True, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=5, argval=1, argrepr='1', offset=216, starts_line=21, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=7, argval=0, argrepr='0', offset=218, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='BINARY_OP', opcode=122, arg=11, argval=11, argrepr='/', offset=220, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=224, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_FAST', opcode=124, arg=0, argval='i', argrepr='i', offset=226, starts_line=25, is_jump_target=False, positions=None), + Instruction(opname='BEFORE_WITH', opcode=53, arg=None, argval=None, argrepr='', offset=228, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='STORE_FAST', opcode=125, arg=1, argval='dodgy', argrepr='dodgy', offset=230, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=3, argval='print', argrepr='NULL + print', offset=232, starts_line=26, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=8, argval='Never reach this', argrepr="'Never reach this'", offset=244, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL', opcode=171, arg=1, argval=1, argrepr='', offset=246, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=256, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=258, starts_line=25, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=260, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=262, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL', opcode=171, arg=2, argval=2, argrepr='', offset=264, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=274, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=3, argval='print', argrepr='NULL + print', offset=276, starts_line=28, is_jump_target=True, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=288, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL', opcode=171, arg=1, argval=1, argrepr='', offset=290, starts_line=None, is_jump_target=False, positions=None), + 
Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=300, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=302, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=304, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='PUSH_EXC_INFO', opcode=35, arg=None, argval=None, argrepr='', offset=306, starts_line=25, is_jump_target=False, positions=None), Instruction(opname='WITH_EXCEPT_START', opcode=49, arg=None, argval=None, argrepr='', offset=308, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_JUMP_FORWARD_IF_TRUE', opcode=115, arg=4, argval=320, argrepr='to 320', offset=310, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_JUMP_FORWARD_IF_TRUE', opcode=115, arg=1, argval=314, argrepr='to 314', offset=310, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='RERAISE', opcode=119, arg=2, argval=2, argrepr='', offset=312, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='COPY', opcode=120, arg=3, argval=3, argrepr='', offset=314, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=314, starts_line=None, is_jump_target=True, positions=None), Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=316, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RERAISE', opcode=119, arg=1, argval=1, argrepr='', offset=318, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=320, starts_line=None, is_jump_target=True, positions=None), - Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=322, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=324, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=326, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_FORWARD', opcode=110, arg=31, argval=392, argrepr='to 392', offset=328, starts_line=None, is_jump_target=True, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=318, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=320, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='JUMP_BACKWARD', opcode=140, arg=24, argval=276, argrepr='to 276', offset=322, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='COPY', opcode=120, arg=3, argval=3, argrepr='', offset=324, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=326, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RERAISE', opcode=119, arg=1, argval=1, argrepr='', offset=328, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='PUSH_EXC_INFO', opcode=35, arg=None, argval=None, argrepr='', offset=330, starts_line=None, 
is_jump_target=False, positions=None), Instruction(opname='LOAD_GLOBAL', opcode=116, arg=4, argval='ZeroDivisionError', argrepr='ZeroDivisionError', offset=332, starts_line=22, is_jump_target=False, positions=None), Instruction(opname='CHECK_EXC_MATCH', opcode=36, arg=None, argval=None, argrepr='', offset=344, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_JUMP_FORWARD_IF_FALSE', opcode=114, arg=18, argval=384, argrepr='to 384', offset=346, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_JUMP_FORWARD_IF_FALSE', opcode=114, arg=16, argval=380, argrepr='to 380', offset=346, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=348, starts_line=None, is_jump_target=False, positions=None), Instruction(opname='LOAD_GLOBAL', opcode=116, arg=3, argval='print', argrepr='NULL + print', offset=350, starts_line=23, is_jump_target=False, positions=None), Instruction(opname='LOAD_CONST', opcode=100, arg=9, argval='Here we go, here we go, here we go...', argrepr="'Here we go, here we go, here we go...'", offset=362, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PRECALL', opcode=166, arg=1, argval=1, argrepr='', offset=364, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL', opcode=171, arg=1, argval=1, argrepr='', offset=368, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=378, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=380, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='JUMP_FORWARD', opcode=110, arg=4, argval=392, argrepr='to 392', offset=382, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RERAISE', opcode=119, arg=0, argval=0, argrepr='', offset=384, starts_line=22, is_jump_target=True, positions=None), - Instruction(opname='COPY', opcode=120, arg=3, argval=3, argrepr='', offset=386, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=388, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RERAISE', opcode=119, arg=1, argval=1, argrepr='', offset=390, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=3, argval='print', argrepr='NULL + print', offset=392, starts_line=28, is_jump_target=True, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=404, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PRECALL', opcode=166, arg=1, argval=1, argrepr='', offset=406, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL', opcode=171, arg=1, argval=1, argrepr='', offset=410, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=420, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=0, argval=None, argrepr='None', offset=422, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RETURN_VALUE', opcode=83, arg=None, argval=None, argrepr='', offset=424, 
starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PUSH_EXC_INFO', opcode=35, arg=None, argval=None, argrepr='', offset=426, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_GLOBAL', opcode=116, arg=3, argval='print', argrepr='NULL + print', offset=428, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=440, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='PRECALL', opcode=166, arg=1, argval=1, argrepr='', offset=442, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='CALL', opcode=171, arg=1, argval=1, argrepr='', offset=446, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=456, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RERAISE', opcode=119, arg=0, argval=0, argrepr='', offset=458, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='COPY', opcode=120, arg=3, argval=3, argrepr='', offset=460, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=462, starts_line=None, is_jump_target=False, positions=None), - Instruction(opname='RERAISE', opcode=119, arg=1, argval=1, argrepr='', offset=464, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL', opcode=171, arg=1, argval=1, argrepr='', offset=364, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=374, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=376, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='JUMP_BACKWARD', opcode=140, arg=52, argval=276, argrepr='to 276', offset=378, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RERAISE', opcode=119, arg=0, argval=0, argrepr='', offset=380, starts_line=22, is_jump_target=True, positions=None), + Instruction(opname='COPY', opcode=120, arg=3, argval=3, argrepr='', offset=382, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=384, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RERAISE', opcode=119, arg=1, argval=1, argrepr='', offset=386, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='PUSH_EXC_INFO', opcode=35, arg=None, argval=None, argrepr='', offset=388, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='LOAD_GLOBAL', opcode=116, arg=3, argval='print', argrepr='NULL + print', offset=390, starts_line=28, is_jump_target=False, positions=None), + Instruction(opname='LOAD_CONST', opcode=100, arg=10, argval="OK, now we're done", argrepr='"OK, now we\'re done"', offset=402, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='CALL', opcode=171, arg=1, argval=1, argrepr='', offset=404, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_TOP', opcode=1, arg=None, argval=None, argrepr='', offset=414, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RERAISE', opcode=119, 
arg=0, argval=0, argrepr='', offset=416, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='COPY', opcode=120, arg=3, argval=3, argrepr='', offset=418, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='POP_EXCEPT', opcode=89, arg=None, argval=None, argrepr='', offset=420, starts_line=None, is_jump_target=False, positions=None), + Instruction(opname='RERAISE', opcode=119, arg=1, argval=1, argrepr='', offset=422, starts_line=None, is_jump_target=False, positions=None), ] # One last piece of inspect fodder to check the default line number handling @@ -1581,7 +1727,6 @@ def test_co_positions(self): (1, 3, 0, 1), (1, 3, 0, 1), (1, 3, 0, 1), - (1, 3, 0, 1), (1, 3, 0, 1) ] self.assertEqual(positions, expected) @@ -1608,6 +1753,36 @@ def test_co_positions_missing_info(self): self.assertIsNone(positions.col_offset) self.assertIsNone(positions.end_col_offset) + @requires_debug_ranges() + def test_co_positions_with_lots_of_caches(self): + def roots(a, b, c): + d = b**2 - 4 * a * c + yield (-b - cmath.sqrt(d)) / (2 * a) + if d: + yield (-b + cmath.sqrt(d)) / (2 * a) + code = roots.__code__ + ops = code.co_code[::2] + cache_opcode = opcode.opmap["CACHE"] + caches = sum(op == cache_opcode for op in ops) + non_caches = len(ops) - caches + # Make sure we have "lots of caches". If not, roots should be changed: + assert 1 / 3 <= caches / non_caches, "this test needs more caches!" + for show_caches in (False, True): + for adaptive in (False, True): + with self.subTest(f"{adaptive=}, {show_caches=}"): + co_positions = [ + positions + for op, positions in zip(ops, code.co_positions(), strict=True) + if show_caches or op != cache_opcode + ] + dis_positions = [ + instruction.positions + for instruction in dis.get_instructions( + code, adaptive=adaptive, show_caches=show_caches + ) + ] + self.assertEqual(co_positions, dis_positions) + # get_instructions has its own tests above, so can rely on it to validate # the object oriented API class BytecodeTests(InstructionTestCase, DisTestBase): diff --git a/Lib/test/test_doctest.py b/Lib/test/test_doctest.py index 3e7f3782d89f42..7c799697d9c225 100644 --- a/Lib/test/test_doctest.py +++ b/Lib/test/test_doctest.py @@ -4,7 +4,6 @@ from test import support from test.support import import_helper -from test.support import os_helper import doctest import functools import os @@ -14,7 +13,6 @@ import importlib.util import unittest import tempfile -import shutil import types import contextlib @@ -25,6 +23,7 @@ # NOTE: There are some additional tests relating to interaction with # zipimport in the test_zipimport_support test module. +# There are also related tests in `test_doctest2` module. ###################################################################### ## Sample Objects (used by test cases) @@ -460,7 +459,7 @@ def basics(): r""" >>> tests = finder.find(sample_func) >>> print(tests) # doctest: +ELLIPSIS - [] + [] The exact name depends on how test_doctest was invoked, so allow for leading path components. @@ -642,6 +641,26 @@ def basics(): r""" 1 SampleClass.double 1 SampleClass.get +When used with `exclude_empty=False` we are also interested in line numbers +of doctests that are empty. +It used to be broken for quite some time until `bpo-28249`. + + >>> from test import doctest_lineno + >>> tests = doctest.DocTestFinder(exclude_empty=False).find(doctest_lineno) + >>> for t in tests: + ... 
print('%5s %s' % (t.lineno, t.name)) + None test.doctest_lineno + 22 test.doctest_lineno.ClassWithDocstring + 30 test.doctest_lineno.ClassWithDoctest + None test.doctest_lineno.ClassWithoutDocstring + None test.doctest_lineno.MethodWrapper + 39 test.doctest_lineno.MethodWrapper.method_with_docstring + 45 test.doctest_lineno.MethodWrapper.method_with_doctest + None test.doctest_lineno.MethodWrapper.method_without_docstring + 4 test.doctest_lineno.func_with_docstring + 12 test.doctest_lineno.func_with_doctest + None test.doctest_lineno.func_without_docstring + Turning off Recursion ~~~~~~~~~~~~~~~~~~~~~ DocTestFinder can be told not to look for tests in contained objects @@ -2790,6 +2809,8 @@ def test_lineendings(): r""" at least one of the line endings will raise a ValueError for inconsistent whitespace if doctest does not correctly do the newline conversion. + >>> from test.support import os_helper + >>> import shutil >>> dn = tempfile.mkdtemp() >>> pkg = os.path.join(dn, "doctest_testpkg") >>> os.mkdir(pkg) diff --git a/Lib/test/test_dtrace.py b/Lib/test/test_dtrace.py index 8a436ad123b80f..4b971deacc1a5c 100644 --- a/Lib/test/test_dtrace.py +++ b/Lib/test/test_dtrace.py @@ -6,9 +6,14 @@ import types import unittest +from test import support from test.support import findfile +if not support.has_subprocess_support: + raise unittest.SkipTest("test module requires subprocess") + + def abspath(filename): return os.path.abspath(findfile(filename, subdir="dtracedata")) diff --git a/Lib/test/test_email/test_email.py b/Lib/test/test_email/test_email.py index 69f883a3673f26..0bf679b4c56ca0 100644 --- a/Lib/test/test_email/test_email.py +++ b/Lib/test/test_email/test_email.py @@ -7,7 +7,6 @@ import base64 import unittest import textwrap -import warnings from io import StringIO, BytesIO from itertools import chain @@ -19,24 +18,25 @@ import email.policy from email.charset import Charset -from email.header import Header, decode_header, make_header -from email.parser import Parser, HeaderParser from email.generator import Generator, DecodedGenerator, BytesGenerator +from email.header import Header, decode_header, make_header +from email.headerregistry import HeaderRegistry from email.message import Message from email.mime.application import MIMEApplication from email.mime.audio import MIMEAudio -from email.mime.text import MIMEText -from email.mime.image import MIMEImage from email.mime.base import MIMEBase +from email.mime.image import MIMEImage from email.mime.message import MIMEMessage from email.mime.multipart import MIMEMultipart from email.mime.nonmultipart import MIMENonMultipart -from email import utils -from email import errors +from email.mime.text import MIMEText +from email.parser import Parser, HeaderParser +from email import base64mime from email import encoders +from email import errors from email import iterators -from email import base64mime from email import quoprimime +from email import utils from test.support import threading_helper from test.support.os_helper import unlink @@ -44,7 +44,7 @@ # These imports are documented to work, but we are testing them using a # different path, so we import them here just to make sure they are importable. -from email.parser import FeedParser, BytesFeedParser +from email.parser import FeedParser NL = '\n' EMPTYSTRING = '' @@ -5541,7 +5541,12 @@ def test_long_headers_flatten(self): result = fp.getvalue() self._signed_parts_eq(original, result) - +class TestHeaderRegistry(TestEmailBase): + # See issue gh-93010. 
+ def test_HeaderRegistry(self): + reg = HeaderRegistry() + a = reg('Content-Disposition', 'attachment; 0*00="foo"') + self.assertIsInstance(a.defects[0], errors.InvalidHeaderDefect) if __name__ == '__main__': unittest.main() diff --git a/Lib/test/test_embed.py b/Lib/test/test_embed.py index 5ba6e3a43fdc6e..c7e5663566b06f 100644 --- a/Lib/test/test_embed.py +++ b/Lib/test/test_embed.py @@ -22,7 +22,6 @@ MS_WINDOWS = (os.name == 'nt') MACOS = (sys.platform == 'darwin') -Py_DEBUG = hasattr(sys, 'gettotalrefcount') PYMEM_ALLOCATOR_NOT_SET = 0 PYMEM_ALLOCATOR_DEBUG = 2 PYMEM_ALLOCATOR_MALLOC = 3 @@ -68,18 +67,16 @@ def setUp(self): ext = ("_d" if debug_build(sys.executable) else "") + ".exe" exename += ext exepath = builddir - expecteddir = os.path.join(support.REPO_ROOT, builddir) else: exepath = os.path.join(builddir, 'Programs') - expecteddir = os.path.join(support.REPO_ROOT, 'Programs') self.test_exe = exe = os.path.join(exepath, exename) - if exepath != expecteddir or not os.path.exists(exe): + if not os.path.exists(exe): self.skipTest("%r doesn't exist" % exe) # This is needed otherwise we get a fatal error: # "Py_Initialize: Unable to get the locale encoding # LookupError: no codec search functions registered: can't find encoding" self.oldcwd = os.getcwd() - os.chdir(support.REPO_ROOT) + os.chdir(builddir) def tearDown(self): os.chdir(self.oldcwd) @@ -498,7 +495,7 @@ class InitConfigTests(EmbeddingTestsMixin, unittest.TestCase): 'pathconfig_warnings': 1, '_init_main': 1, '_isolated_interpreter': 0, - 'use_frozen_modules': not Py_DEBUG, + 'use_frozen_modules': not support.Py_DEBUG, 'safe_path': 0, '_is_python_build': IGNORE_CONFIG, } @@ -630,7 +627,7 @@ def _get_expected_config(self): return configs def get_expected_config(self, expected_preconfig, expected, - env, api, modify_path_cb=None, cwd=None): + env, api, modify_path_cb=None): configs = self._get_expected_config() pre_config = configs['pre_config'] @@ -673,14 +670,6 @@ def get_expected_config(self, expected_preconfig, expected, expected['base_executable'] = default_executable if expected['program_name'] is self.GET_DEFAULT_CONFIG: expected['program_name'] = './_testembed' - if MS_WINDOWS: - # follow the calculation in getpath.py - tmpname = expected['program_name'] + '.exe' - if cwd: - tmpname = os.path.join(cwd, tmpname) - if os.path.isfile(tmpname): - expected['program_name'] += '.exe' - del tmpname config = configs['config'] for key, value in expected.items(): @@ -710,6 +699,11 @@ def check_pre_config(self, configs, expected): def check_config(self, configs, expected): config = dict(configs['config']) + if MS_WINDOWS: + value = config.get(key := 'program_name') + if value and isinstance(value, str): + ext = '_d.exe' if debug_build(sys.executable) else '.exe' + config[key] = value[:len(value.lower().removesuffix(ext))] for key, value in list(expected.items()): if value is self.IGNORE_CONFIG: config.pop(key, None) @@ -774,7 +768,7 @@ def check_all_configs(self, testname, expected_config=None, self.get_expected_config(expected_preconfig, expected_config, env, - api, modify_path_cb, cwd) + api, modify_path_cb) out, err = self.run_embedded_interpreter(testname, env=env, cwd=cwd) @@ -1206,7 +1200,7 @@ def test_init_setpath_config(self): # The current getpath.c doesn't determine the stdlib dir # in this case. 
'stdlib_dir': '', - 'use_frozen_modules': not Py_DEBUG, + 'use_frozen_modules': not support.Py_DEBUG, # overridden by PyConfig 'program_name': 'conf_program_name', 'base_executable': 'conf_executable', @@ -1304,6 +1298,66 @@ def test_init_setpythonhome(self): self.check_all_configs("test_init_setpythonhome", config, api=API_COMPAT, env=env) + def test_init_is_python_build_with_home(self): + # Test _Py_path_config._is_python_build configuration (gh-91985) + config = self._get_expected_config() + paths = config['config']['module_search_paths'] + paths_str = os.path.pathsep.join(paths) + + for path in paths: + if not os.path.isdir(path): + continue + if os.path.exists(os.path.join(path, 'os.py')): + home = os.path.dirname(path) + break + else: + self.fail(f"Unable to find home in {paths!r}") + + prefix = exec_prefix = home + if MS_WINDOWS: + stdlib = os.path.join(home, "Lib") + # Because we are specifying 'home', module search paths + # are fairly static + expected_paths = [paths[0], stdlib, os.path.join(home, 'DLLs')] + else: + version = f'{sys.version_info.major}.{sys.version_info.minor}' + stdlib = os.path.join(home, sys.platlibdir, f'python{version}') + expected_paths = self.module_search_paths(prefix=home, exec_prefix=home) + + config = { + 'home': home, + 'module_search_paths': expected_paths, + 'prefix': prefix, + 'base_prefix': prefix, + 'exec_prefix': exec_prefix, + 'base_exec_prefix': exec_prefix, + 'pythonpath_env': paths_str, + 'stdlib_dir': stdlib, + } + # The code above is taken from test_init_setpythonhome() + env = {'TESTHOME': home, 'PYTHONPATH': paths_str} + + env['NEGATIVE_ISPYTHONBUILD'] = '1' + config['_is_python_build'] = 0 + self.check_all_configs("test_init_is_python_build", config, + api=API_COMPAT, env=env) + + env['NEGATIVE_ISPYTHONBUILD'] = '0' + config['_is_python_build'] = 1 + exedir = os.path.dirname(sys.executable) + with open(os.path.join(exedir, 'pybuilddir.txt'), encoding='utf8') as f: + expected_paths[2] = os.path.normpath( + os.path.join(exedir, f'{f.read()}\n$'.splitlines()[0])) + if not MS_WINDOWS: + # PREFIX (default) is set when running in build directory + prefix = exec_prefix = sys.prefix + # stdlib calculation (/Lib) is not yet supported + expected_paths[0] = self.module_search_paths(prefix=prefix)[0] + config.update(prefix=prefix, base_prefix=prefix, + exec_prefix=exec_prefix, base_exec_prefix=exec_prefix) + self.check_all_configs("test_init_is_python_build", config, + api=API_COMPAT, env=env) + def copy_paths_by_env(self, config): all_configs = self._get_expected_config() paths = all_configs['config']['module_search_paths'] @@ -1319,10 +1373,11 @@ def test_init_pybuilddir(self): with self.tmpdir_with_python() as tmpdir: # pybuilddir.txt is a sub-directory relative to the current # directory (tmpdir) + vpath = sysconfig.get_config_var("VPATH") or '' subdir = 'libdir' libdir = os.path.join(tmpdir, subdir) # The stdlib dir is dirname(executable) + VPATH + 'Lib' - stdlibdir = os.path.join(tmpdir, 'Lib') + stdlibdir = os.path.normpath(os.path.join(tmpdir, vpath, 'Lib')) os.mkdir(libdir) filename = os.path.join(tmpdir, 'pybuilddir.txt') @@ -1445,12 +1500,12 @@ def test_init_pyvenv_cfg(self): config['base_prefix'] = pyvenv_home config['prefix'] = pyvenv_home config['stdlib_dir'] = os.path.join(pyvenv_home, 'Lib') - config['use_frozen_modules'] = not Py_DEBUG + config['use_frozen_modules'] = not support.Py_DEBUG else: # cannot reliably assume stdlib_dir here because it # depends too much on our build. 
But it ought to be found config['stdlib_dir'] = self.IGNORE_CONFIG - config['use_frozen_modules'] = not Py_DEBUG + config['use_frozen_modules'] = not support.Py_DEBUG env = self.copy_paths_by_env(config) self.check_all_configs("test_init_compat_config", config, @@ -1680,7 +1735,7 @@ def test_frozenmain(self): """).lstrip() self.assertEqual(out, expected) - @unittest.skipUnless(hasattr(sys, 'gettotalrefcount'), + @unittest.skipUnless(support.Py_DEBUG, '-X showrefcount requires a Python debug build') def test_no_memleak(self): # bpo-1635741: Python must release all memory at exit diff --git a/Lib/test/test_enum.py b/Lib/test/test_enum.py index 286d631d793e49..1968c8abd50dcf 100644 --- a/Lib/test/test_enum.py +++ b/Lib/test/test_enum.py @@ -1,3 +1,4 @@ +import copy import enum import doctest import inspect @@ -6,6 +7,7 @@ import sys import unittest import threading +import typing import builtins as bltns from collections import OrderedDict from datetime import date @@ -189,6 +191,12 @@ class HeadlightsC(IntFlag, boundary=enum.CONFORM): FOG_C = auto() +@enum.global_enum +class NoName(Flag): + ONE = 1 + TWO = 2 + + # tests class _EnumTests: @@ -335,20 +343,12 @@ def test_changing_member_fails(self): with self.assertRaises(AttributeError): self.MainEnum.second = 'really first' - @unittest.skipIf( - python_version >= (3, 12), - '__contains__ now returns True/False for all inputs', - ) - @unittest.expectedFailure - def test_contains_er(self): + def test_contains_tf(self): MainEnum = self.MainEnum - self.assertIn(MainEnum.third, MainEnum) - with self.assertRaises(TypeError): - with self.assertWarns(DeprecationWarning): - self.source_values[1] in MainEnum - with self.assertRaises(TypeError): - with self.assertWarns(DeprecationWarning): - 'first' in MainEnum + self.assertIn(MainEnum.first, MainEnum) + self.assertTrue(self.values[0] in MainEnum) + if type(self) is not TestStrEnum: + self.assertFalse('first' in MainEnum) val = MainEnum.dupe self.assertIn(val, MainEnum) # @@ -356,24 +356,43 @@ class OtherEnum(Enum): one = auto() two = auto() self.assertNotIn(OtherEnum.two, MainEnum) - - @unittest.skipIf( - python_version < (3, 12), - '__contains__ works only with enum memmbers before 3.12', - ) - @unittest.expectedFailure - def test_contains_tf(self): + # + if MainEnum._member_type_ is object: + # enums without mixed data types will always be False + class NotEqualEnum(self.enum_type): + this = self.source_values[0] + that = self.source_values[1] + self.assertNotIn(NotEqualEnum.this, MainEnum) + self.assertNotIn(NotEqualEnum.that, MainEnum) + else: + # enums with mixed data types may be True + class EqualEnum(self.enum_type): + this = self.source_values[0] + that = self.source_values[1] + self.assertIn(EqualEnum.this, MainEnum) + self.assertIn(EqualEnum.that, MainEnum) + + def test_contains_same_name_diff_enum_diff_values(self): MainEnum = self.MainEnum - self.assertIn(MainEnum.first, MainEnum) - self.assertTrue(self.source_values[0] in MainEnum) - self.assertFalse('first' in MainEnum) - val = MainEnum.dupe - self.assertIn(val, MainEnum) # class OtherEnum(Enum): - one = auto() - two = auto() - self.assertNotIn(OtherEnum.two, MainEnum) + first = "brand" + second = "new" + third = "values" + # + self.assertIn(MainEnum.first, MainEnum) + self.assertIn(MainEnum.second, MainEnum) + self.assertIn(MainEnum.third, MainEnum) + self.assertNotIn(MainEnum.first, OtherEnum) + self.assertNotIn(MainEnum.second, OtherEnum) + self.assertNotIn(MainEnum.third, OtherEnum) + # + self.assertIn(OtherEnum.first, OtherEnum) + 
self.assertIn(OtherEnum.second, OtherEnum) + self.assertIn(OtherEnum.third, OtherEnum) + self.assertNotIn(OtherEnum.first, MainEnum) + self.assertNotIn(OtherEnum.second, MainEnum) + self.assertNotIn(OtherEnum.third, MainEnum) def test_dir_on_class(self): TE = self.MainEnum @@ -616,6 +635,7 @@ class _PlainOutputTests: def test_str(self): TE = self.MainEnum if self.is_flag: + self.assertEqual(str(TE(0)), "MainEnum(0)") self.assertEqual(str(TE.dupe), "MainEnum.dupe") self.assertEqual(str(self.dupe2), "MainEnum.first|third") else: @@ -727,6 +747,13 @@ def test_format_specs(self): self.assertFormatIsValue('{:5.2}', TE.third) self.assertFormatIsValue('{:f}', TE.third) + def test_copy(self): + TE = self.MainEnum + copied = copy.copy(TE) + self.assertEqual(copied, TE) + deep = copy.deepcopy(TE) + self.assertEqual(deep, TE) + class _FlagTests: @@ -965,6 +992,15 @@ class SpamEnum(Enum): spam = SpamEnumNotInner self.assertEqual(SpamEnum.spam.value, SpamEnumNotInner) + def test_enum_of_generic_aliases(self): + class E(Enum): + a = typing.List[int] + b = list[int] + self.assertEqual(E.a.value, typing.List[int]) + self.assertEqual(E.b.value, list[int]) + self.assertEqual(repr(E.a), '') + self.assertEqual(repr(E.b), '') + @unittest.skipIf( python_version >= (3, 13), 'inner classes are not members', @@ -1090,6 +1126,28 @@ class Huh(Enum): self.assertEqual(Huh.name.name, 'name') self.assertEqual(Huh.name.value, 1) + def test_contains_name_and_value_overlap(self): + class IntEnum1(IntEnum): + X = 1 + class IntEnum2(IntEnum): + X = 1 + class IntEnum3(IntEnum): + X = 2 + class IntEnum4(IntEnum): + Y = 1 + self.assertIn(IntEnum1.X, IntEnum1) + self.assertIn(IntEnum1.X, IntEnum2) + self.assertNotIn(IntEnum1.X, IntEnum3) + self.assertIn(IntEnum1.X, IntEnum4) + + def test_contains_different_types_same_members(self): + class IntEnum1(IntEnum): + X = 1 + class IntFlag1(IntFlag): + X = 1 + self.assertIn(IntEnum1.X, IntFlag1) + self.assertIn(IntFlag1.X, IntEnum1) + def test_inherited_data_type(self): class HexInt(int): __qualname__ = 'HexInt' @@ -2647,6 +2705,26 @@ class MyIntFlag(int, Flag): self.assertTrue(isinstance(MyIntFlag.ONE | MyIntFlag.TWO, MyIntFlag), MyIntFlag.ONE | MyIntFlag.TWO) self.assertTrue(isinstance(MyIntFlag.ONE | 2, MyIntFlag)) + def test_int_flags_copy(self): + class MyIntFlag(IntFlag): + ONE = 1 + TWO = 2 + FOUR = 4 + + flags = MyIntFlag.ONE | MyIntFlag.TWO + copied = copy.copy(flags) + deep = copy.deepcopy(flags) + self.assertEqual(copied, flags) + self.assertEqual(deep, flags) + + flags = MyIntFlag.ONE | MyIntFlag.TWO | 8 + copied = copy.copy(flags) + deep = copy.deepcopy(flags) + self.assertEqual(copied, flags) + self.assertEqual(deep, flags) + self.assertEqual(copied.value, 1 | 2 | 8) + + class TestOrder(unittest.TestCase): "test usage of the `_order_` attribute" @@ -2924,34 +3002,6 @@ def test_pickle(self): test_pickle_dump_load(self.assertIs, FlagStooges.CURLY|FlagStooges.MOE) test_pickle_dump_load(self.assertIs, FlagStooges) - @unittest.skipIf( - python_version >= (3, 12), - '__contains__ now returns True/False for all inputs', - ) - @unittest.expectedFailure - def test_contains_er(self): - Open = self.Open - Color = self.Color - self.assertFalse(Color.BLACK in Open) - self.assertFalse(Open.RO in Color) - with self.assertRaises(TypeError): - with self.assertWarns(DeprecationWarning): - 'BLACK' in Color - with self.assertRaises(TypeError): - with self.assertWarns(DeprecationWarning): - 'RO' in Open - with self.assertRaises(TypeError): - with self.assertWarns(DeprecationWarning): - 1 in 
Color - with self.assertRaises(TypeError): - with self.assertWarns(DeprecationWarning): - 1 in Open - - @unittest.skipIf( - python_version < (3, 12), - '__contains__ only works with enum memmbers before 3.12', - ) - @unittest.expectedFailure def test_contains_tf(self): Open = self.Open Color = self.Color @@ -2959,6 +3009,8 @@ def test_contains_tf(self): self.assertFalse(Open.RO in Color) self.assertFalse('BLACK' in Color) self.assertFalse('RO' in Open) + self.assertTrue(Color.BLACK in Color) + self.assertTrue(Open.RO in Open) self.assertTrue(1 in Color) self.assertTrue(1 in Open) @@ -3242,6 +3294,10 @@ def test_global_repr_conform1(self): '%(m)s.OFF_C' % {'m': SHORT_MODULE}, ) + def test_global_enum_str(self): + self.assertEqual(str(NoName.ONE & NoName.TWO), 'NoName(0)') + self.assertEqual(str(NoName(0)), 'NoName(0)') + def test_format(self): Perm = self.Perm self.assertEqual(format(Perm.R, ''), '4') @@ -3342,7 +3398,10 @@ def test_invert(self): self.assertIs((Open.WO|Open.CE) & ~Open.WO, Open.CE) def test_boundary(self): - self.assertIs(enum.IntFlag._boundary_, EJECT) + self.assertIs(enum.IntFlag._boundary_, KEEP) + class Simple(IntFlag, boundary=KEEP): + SINGLE = 1 + # class Iron(IntFlag, boundary=STRICT): ONE = 1 TWO = 2 @@ -3361,7 +3420,6 @@ class Space(IntFlag, boundary=EJECT): EIGHT = 8 self.assertIs(Space._boundary_, EJECT) # - # class Bizarre(IntFlag, boundary=KEEP): b = 3 c = 4 @@ -3378,6 +3436,12 @@ class Bizarre(IntFlag, boundary=KEEP): self.assertEqual(list(Bizarre), [Bizarre.c]) self.assertIs(Bizarre(3), Bizarre.b) self.assertIs(Bizarre(6), Bizarre.d) + # + simple = Simple.SINGLE | Iron.TWO + self.assertEqual(simple, 3) + self.assertIsInstance(simple, Simple) + self.assertEqual(repr(simple), ': 3>') + self.assertEqual(str(simple), '3') def test_iter(self): Color = self.Color @@ -3486,43 +3550,11 @@ def test_programatic_function_from_empty_tuple(self): self.assertEqual(len(lst), len(Thing)) self.assertEqual(len(Thing), 0, Thing) - @unittest.skipIf( - python_version >= (3, 12), - '__contains__ now returns True/False for all inputs', - ) - @unittest.expectedFailure - def test_contains_er(self): - Open = self.Open - Color = self.Color - self.assertTrue(Color.GREEN in Color) - self.assertTrue(Open.RW in Open) - self.assertFalse(Color.GREEN in Open) - self.assertFalse(Open.RW in Color) - with self.assertRaises(TypeError): - with self.assertWarns(DeprecationWarning): - 'GREEN' in Color - with self.assertRaises(TypeError): - with self.assertWarns(DeprecationWarning): - 'RW' in Open - with self.assertRaises(TypeError): - with self.assertWarns(DeprecationWarning): - 2 in Color - with self.assertRaises(TypeError): - with self.assertWarns(DeprecationWarning): - 2 in Open - - @unittest.skipIf( - python_version < (3, 12), - '__contains__ only works with enum memmbers before 3.12', - ) - @unittest.expectedFailure def test_contains_tf(self): Open = self.Open Color = self.Color self.assertTrue(Color.GREEN in Color) self.assertTrue(Open.RW in Open) - self.assertTrue(Color.GREEN in Open) - self.assertTrue(Open.RW in Color) self.assertFalse('GREEN' in Color) self.assertFalse('RW' in Open) self.assertTrue(2 in Color) @@ -3921,23 +3953,54 @@ class Color(AutoNameEnum): self.assertEqual(Color.blue.value, 'blue') self.assertEqual(Color.green.value, 'green') - def test_auto_garbage(self): - class Color(Enum): - red = 'red' - blue = auto() + @unittest.skipIf( + python_version >= (3, 13), + 'mixed types with auto() no longer supported', + ) + def test_auto_garbage_ok(self): + with 
self.assertWarnsRegex(DeprecationWarning, 'will require all values to be sortable'): + class Color(Enum): + red = 'red' + blue = auto() self.assertEqual(Color.blue.value, 1) - def test_auto_garbage_corrected(self): - class Color(Enum): - red = 'red' - blue = 2 - green = auto() + @unittest.skipIf( + python_version >= (3, 13), + 'mixed types with auto() no longer supported', + ) + def test_auto_garbage_corrected_ok(self): + with self.assertWarnsRegex(DeprecationWarning, 'will require all values to be sortable'): + class Color(Enum): + red = 'red' + blue = 2 + green = auto() self.assertEqual(list(Color), [Color.red, Color.blue, Color.green]) self.assertEqual(Color.red.value, 'red') self.assertEqual(Color.blue.value, 2) self.assertEqual(Color.green.value, 3) + @unittest.skipIf( + python_version < (3, 13), + 'mixed types with auto() will raise in 3.13', + ) + def test_auto_garbage_fail(self): + with self.assertRaisesRegex(TypeError, 'will require all values to be sortable'): + class Color(Enum): + red = 'red' + blue = auto() + + @unittest.skipIf( + python_version < (3, 13), + 'mixed types with auto() will raise in 3.13', + ) + def test_auto_garbage_corrected_fail(self): + with self.assertRaisesRegex(TypeError, 'will require all values to be sortable'): + class Color(Enum): + red = 'red' + blue = 2 + green = auto() + def test_auto_order(self): with self.assertRaises(TypeError): class Color(Enum): @@ -3959,6 +4022,22 @@ def _generate_next_value_(name, start, count, last): self.assertEqual(Color.red.value, 'pathological case') self.assertEqual(Color.blue.value, 'blue') + @unittest.skipIf( + python_version < (3, 13), + 'auto() will return highest value + 1 in 3.13', + ) + def test_auto_with_aliases(self): + class Color(Enum): + red = auto() + blue = auto() + oxford = blue + crimson = red + green = auto() + self.assertIs(Color.crimson, Color.red) + self.assertIs(Color.oxford, Color.blue) + self.assertIsNot(Color.green, Color.red) + self.assertIsNot(Color.green, Color.blue) + def test_duplicate_auto(self): class Dupes(Enum): first = primero = auto() @@ -3975,36 +4054,6 @@ class TestEnumTypeSubclassing(unittest.TestCase): class Color(enum.Enum) | Color(value, names=None, *, module=None, qualname=None, type=None, start=1, boundary=None) |\x20\x20 - | A collection of name/value pairs. - |\x20\x20 - | Access them by: - |\x20\x20 - | - attribute access:: - |\x20\x20 - | >>> Color.CYAN - | - |\x20\x20 - | - value lookup: - |\x20\x20 - | >>> Color(1) - | - |\x20\x20 - | - name lookup: - |\x20\x20 - | >>> Color['CYAN'] - | - |\x20\x20 - | Enumerations can be iterated over, and know how many members they have: - |\x20\x20 - | >>> len(Color) - | 3 - |\x20\x20 - | >>> list(Color) - | [, , ] - |\x20\x20 - | Methods can be added to enumerations, and members can have their own - | attributes -- see the documentation for details. - |\x20\x20 | Method resolution order: | Color | enum.Enum @@ -4030,12 +4079,12 @@ class Color(enum.Enum) | ---------------------------------------------------------------------- | Methods inherited from enum.EnumType: |\x20\x20 - | __contains__(member) from enum.EnumType - | Return True if member is a member of this enum - | raises TypeError if member is not an enum member - |\x20\x20\x20\x20\x20\x20 - | note: in 3.12 TypeError will no longer be raised, and True will also be - | returned if member is the value of a member in this enum + | __contains__(value) from enum.EnumType + | Return True if `value` is in `cls`. 
+ | + | `value` is in `cls` if: + | 1) `value` is a member of `cls`, or + | 2) `value` is the value of one of the `cls`'s members. |\x20\x20 | __getitem__(name) from enum.EnumType | Return the member matching `name`. @@ -4250,77 +4299,13 @@ def test__all__(self): def test_doc_1(self): class Single(Enum): ONE = 1 - self.assertEqual( - Single.__doc__, - dedent("""\ - A collection of name/value pairs. - - Access them by: - - - attribute access:: - - >>> Single.ONE - - - - value lookup: - - >>> Single(1) - - - - name lookup: - - >>> Single['ONE'] - - - Enumerations can be iterated over, and know how many members they have: - - >>> len(Single) - 1 - - >>> list(Single) - [] - - Methods can be added to enumerations, and members can have their own - attributes -- see the documentation for details. - """)) + self.assertEqual(Single.__doc__, None) def test_doc_2(self): class Double(Enum): ONE = 1 TWO = 2 - self.assertEqual( - Double.__doc__, - dedent("""\ - A collection of name/value pairs. - - Access them by: - - - attribute access:: - - >>> Double.ONE - - - - value lookup: - - >>> Double(1) - - - - name lookup: - - >>> Double['ONE'] - - - Enumerations can be iterated over, and know how many members they have: - - >>> len(Double) - 2 - - >>> list(Double) - [, ] - - Methods can be added to enumerations, and members can have their own - attributes -- see the documentation for details. - """)) + self.assertEqual(Double.__doc__, None) def test_doc_1(self): @@ -4328,39 +4313,7 @@ class Triple(Enum): ONE = 1 TWO = 2 THREE = 3 - self.assertEqual( - Triple.__doc__, - dedent("""\ - A collection of name/value pairs. - - Access them by: - - - attribute access:: - - >>> Triple.ONE - - - - value lookup: - - >>> Triple(1) - - - - name lookup: - - >>> Triple['ONE'] - - - Enumerations can be iterated over, and know how many members they have: - - >>> len(Triple) - 3 - - >>> list(Triple) - [, , ] - - Methods can be added to enumerations, and members can have their own - attributes -- see the documentation for details. - """)) + self.assertEqual(Triple.__doc__, None) def test_doc_1(self): class Quadruple(Enum): @@ -4368,39 +4321,7 @@ class Quadruple(Enum): TWO = 2 THREE = 3 FOUR = 4 - self.assertEqual( - Quadruple.__doc__, - dedent("""\ - A collection of name/value pairs. - - Access them by: - - - attribute access:: - - >>> Quadruple.ONE - - - - value lookup: - - >>> Quadruple(1) - - - - name lookup: - - >>> Quadruple['ONE'] - - - Enumerations can be iterated over, and know how many members they have: - - >>> len(Quadruple) - 4 - - >>> list(Quadruple)[:3] - [, , ] - - Methods can be added to enumerations, and members can have their own - attributes -- see the documentation for details. 
- """)) + self.assertEqual(Quadruple.__doc__, None) # These are unordered here on purpose to ensure that declaration order diff --git a/Lib/test/test_exception_group.py b/Lib/test/test_exception_group.py index 2cfd8738304d1c..3ea5dff5c379af 100644 --- a/Lib/test/test_exception_group.py +++ b/Lib/test/test_exception_group.py @@ -1,5 +1,4 @@ import collections.abc -import traceback import types import unittest diff --git a/Lib/test/test_exceptions.py b/Lib/test/test_exceptions.py index ff1a02821a58fc..d91308572b27cc 100644 --- a/Lib/test/test_exceptions.py +++ b/Lib/test/test_exceptions.py @@ -1,7 +1,6 @@ # Python test set -- part 5, built-in exceptions import copy -import gc import os import sys import unittest diff --git a/Lib/test/test_extcall.py b/Lib/test/test_extcall.py index 13265ea0d8ce85..11d39ec63a49e3 100644 --- a/Lib/test/test_extcall.py +++ b/Lib/test/test_extcall.py @@ -8,6 +8,7 @@ We're defining four helper functions + >>> from test import support >>> def e(a,b): ... print(a, b) @@ -522,7 +523,6 @@ import doctest import unittest -from test import support def load_tests(loader, tests, pattern): tests.addTest(doctest.DocTestSuite()) diff --git a/Lib/test/test_fileio.py b/Lib/test/test_fileio.py index e4984d3cd559e3..c26cdc028cc890 100644 --- a/Lib/test/test_fileio.py +++ b/Lib/test/test_fileio.py @@ -9,7 +9,9 @@ from weakref import proxy from functools import wraps -from test.support import cpython_only, swap_attr, gc_collect, is_emscripten +from test.support import ( + cpython_only, swap_attr, gc_collect, is_emscripten, is_wasi +) from test.support.os_helper import (TESTFN, TESTFN_UNICODE, make_bad_fd) from test.support.warnings_helper import check_warnings from collections import UserList @@ -65,6 +67,7 @@ def testAttributes(self): self.assertRaises((AttributeError, TypeError), setattr, f, attr, 'oops') + @unittest.skipIf(is_wasi, "WASI does not expose st_blksize.") def testBlksize(self): # test private _blksize attribute blksize = io.DEFAULT_BUFFER_SIZE diff --git a/Lib/test/test_float.py b/Lib/test/test_float.py index d8c0fe1854eba5..672ec141155530 100644 --- a/Lib/test/test_float.py +++ b/Lib/test/test_float.py @@ -8,7 +8,6 @@ import unittest from test import support -from test.support import import_helper from test.test_grammar import (VALID_UNDERSCORE_LITERALS, INVALID_UNDERSCORE_LITERALS) from math import isinf, isnan, copysign, ldexp diff --git a/Lib/test/test_flufl.py b/Lib/test/test_flufl.py index a81a4d4c8f0e4c..fd264c926bd575 100644 --- a/Lib/test/test_flufl.py +++ b/Lib/test/test_flufl.py @@ -1,6 +1,5 @@ import __future__ import unittest -from test import support class FLUFLTests(unittest.TestCase): diff --git a/Lib/test/test_fnmatch.py b/Lib/test/test_fnmatch.py index ca695d6f3f019b..10ed496d4e2f37 100644 --- a/Lib/test/test_fnmatch.py +++ b/Lib/test/test_fnmatch.py @@ -2,6 +2,7 @@ import unittest import os +import string import warnings from fnmatch import fnmatch, fnmatchcase, translate, filter @@ -91,6 +92,119 @@ def test_sep(self): check('usr/bin', 'usr\\bin', normsep) check('usr\\bin', 'usr\\bin') + def test_char_set(self): + ignorecase = os.path.normcase('ABC') == os.path.normcase('abc') + check = self.check_match + tescases = string.ascii_lowercase + string.digits + string.punctuation + for c in tescases: + check(c, '[az]', c in 'az') + check(c, '[!az]', c not in 'az') + # Case insensitive. 
+ for c in tescases: + check(c, '[AZ]', (c in 'az') and ignorecase) + check(c, '[!AZ]', (c not in 'az') or not ignorecase) + for c in string.ascii_uppercase: + check(c, '[az]', (c in 'AZ') and ignorecase) + check(c, '[!az]', (c not in 'AZ') or not ignorecase) + # Repeated same character. + for c in tescases: + check(c, '[aa]', c == 'a') + # Special cases. + for c in tescases: + check(c, '[^az]', c in '^az') + check(c, '[[az]', c in '[az') + check(c, r'[!]]', c != ']') + check('[', '[') + check('[]', '[]') + check('[!', '[!') + check('[!]', '[!]') + + def test_range(self): + ignorecase = os.path.normcase('ABC') == os.path.normcase('abc') + normsep = os.path.normcase('\\') == os.path.normcase('/') + check = self.check_match + tescases = string.ascii_lowercase + string.digits + string.punctuation + for c in tescases: + check(c, '[b-d]', c in 'bcd') + check(c, '[!b-d]', c not in 'bcd') + check(c, '[b-dx-z]', c in 'bcdxyz') + check(c, '[!b-dx-z]', c not in 'bcdxyz') + # Case insensitive. + for c in tescases: + check(c, '[B-D]', (c in 'bcd') and ignorecase) + check(c, '[!B-D]', (c not in 'bcd') or not ignorecase) + for c in string.ascii_uppercase: + check(c, '[b-d]', (c in 'BCD') and ignorecase) + check(c, '[!b-d]', (c not in 'BCD') or not ignorecase) + # Upper bound == lower bound. + for c in tescases: + check(c, '[b-b]', c == 'b') + # Special cases. + for c in tescases: + check(c, '[!-#]', c not in '-#') + check(c, '[!--.]', c not in '-.') + check(c, '[^-`]', c in '^_`') + if not (normsep and c == '/'): + check(c, '[[-^]', c in r'[\]^') + check(c, r'[\-^]', c in r'\]^') + check(c, '[b-]', c in '-b') + check(c, '[!b-]', c not in '-b') + check(c, '[-b]', c in '-b') + check(c, '[!-b]', c not in '-b') + check(c, '[-]', c in '-') + check(c, '[!-]', c not in '-') + # Upper bound is less that lower bound: error in RE. 
+ for c in tescases: + check(c, '[d-b]', False) + check(c, '[!d-b]', True) + check(c, '[d-bx-z]', c in 'xyz') + check(c, '[!d-bx-z]', c not in 'xyz') + check(c, '[d-b^-`]', c in '^_`') + if not (normsep and c == '/'): + check(c, '[d-b[-^]', c in r'[\]^') + + def test_sep_in_char_set(self): + normsep = os.path.normcase('\\') == os.path.normcase('/') + check = self.check_match + check('/', r'[/]') + check('\\', r'[\]') + check('/', r'[\]', normsep) + check('\\', r'[/]', normsep) + check('[/]', r'[/]', False) + check(r'[\\]', r'[/]', False) + check('\\', r'[\t]') + check('/', r'[\t]', normsep) + check('t', r'[\t]') + check('\t', r'[\t]', False) + + def test_sep_in_range(self): + normsep = os.path.normcase('\\') == os.path.normcase('/') + check = self.check_match + check('a/b', 'a[.-0]b', not normsep) + check('a\\b', 'a[.-0]b', False) + check('a\\b', 'a[Z-^]b', not normsep) + check('a/b', 'a[Z-^]b', False) + + check('a/b', 'a[/-0]b', not normsep) + check(r'a\b', 'a[/-0]b', False) + check('a[/-0]b', 'a[/-0]b', False) + check(r'a[\-0]b', 'a[/-0]b', False) + + check('a/b', 'a[.-/]b') + check(r'a\b', 'a[.-/]b', normsep) + check('a[.-/]b', 'a[.-/]b', False) + check(r'a[.-\]b', 'a[.-/]b', False) + + check(r'a\b', r'a[\-^]b') + check('a/b', r'a[\-^]b', normsep) + check(r'a[\-^]b', r'a[\-^]b', False) + check('a[/-^]b', r'a[\-^]b', False) + + check(r'a\b', r'a[Z-\]b', not normsep) + check('a/b', r'a[Z-\]b', False) + check(r'a[Z-\]b', r'a[Z-\]b', False) + check('a[Z-/]b', r'a[Z-\]b', False) + def test_warnings(self): with warnings.catch_warnings(): warnings.simplefilter('error', Warning) diff --git a/Lib/test/test_fstring.py b/Lib/test/test_fstring.py index 0c3372f0335517..9815b7c28f2f40 100644 --- a/Lib/test/test_fstring.py +++ b/Lib/test/test_fstring.py @@ -590,7 +590,9 @@ def test_format_specifier_expressions(self): self.assertEqual(f'{-10:{"-"}#{1}0{"x"}}', ' -0xa') self.assertEqual(f'{10:#{3 != {4:5} and width}x}', ' 0xa') - self.assertAllRaise(SyntaxError, "f-string: expecting '}'", + self.assertAllRaise(SyntaxError, + """f-string: invalid conversion character 'r{"': """ + """expected 's', 'r', or 'a'""", ["""f'{"s"!r{":10"}}'""", # This looks like a nested format spec. @@ -1012,19 +1014,28 @@ def test_conversions(self): # Not a conversion, but show that ! is allowed in a format spec. self.assertEqual(f'{3.14:!<10.10}', '3.14!!!!!!') - self.assertAllRaise(SyntaxError, 'f-string: invalid conversion character', - ["f'{3!g}'", - "f'{3!A}'", - "f'{3!3}'", - "f'{3!G}'", - "f'{3!!}'", + self.assertAllRaise(SyntaxError, "f-string: expecting '}'", + ["f'{3!'", + "f'{3!s'", + "f'{3!g'", + ]) + + self.assertAllRaise(SyntaxError, 'f-string: missed conversion character', + ["f'{3!}'", + "f'{3!:'", "f'{3!:}'", - "f'{3! s}'", # no space before conversion char ]) - self.assertAllRaise(SyntaxError, "f-string: expecting '}'", - ["f'{x!s{y}}'", - "f'{3!ss}'", + for conv in 'g', 'A', '3', 'G', '!', ' s', 's ', ' s ', 'ä', 'ɐ', 'ª': + self.assertAllRaise(SyntaxError, + "f-string: invalid conversion character %r: " + "expected 's', 'r', or 'a'" % conv, + ["f'{3!" + conv + "}'"]) + + self.assertAllRaise(SyntaxError, + "f-string: invalid conversion character 'ss': " + "expected 's', 'r', or 'a'", + ["f'{3!ss}'", "f'{3!ss:}'", "f'{3!ss:s}'", ]) @@ -1073,6 +1084,7 @@ def test_mismatched_braces(self): "f'{'", "f'x{<'", # See bpo-46762. "f'x{>'", + "f'{i='", # See gh-93418. ]) # But these are just normal strings. 
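For readers tracing the fnmatch changes exercised by test_range and test_sep_in_range earlier in this hunk, here is a minimal sketch of the reversed-range behaviour those tests assert (illustrative only, not part of the patch; it assumes an interpreter that already contains this change):

    from fnmatch import fnmatch

    # A reversed range such as [d-b] is treated as an empty set instead of
    # raising re.error; the rest of the bracket expression keeps working.
    print(fnmatch('c', '[d-b]'))     # False: an empty range matches nothing
    print(fnmatch('c', '[!d-b]'))    # True: a negated empty range matches any character
    print(fnmatch('x', '[d-bx-z]'))  # True: the valid x-z sub-range still applies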
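Similarly, the test_fstring hunk above narrows the error reported for invalid conversion characters; as a quick sketch (not part of the patch), only !s, !r and !a are accepted before the format spec:

    value = "café"
    print(f"{value!s}")   # str(value)   -> café
    print(f"{value!r}")   # repr(value)  -> 'café'
    print(f"{value!a}")   # ascii(value) -> 'caf\xe9'
    # Any other character, e.g. f"{value!g}", is rejected at compile time with
    # "f-string: invalid conversion character 'g': expected 's', 'r', or 'a'"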
diff --git a/Lib/test/test_functools.py b/Lib/test/test_functools.py index 382e7dbffddf9d..767d2a9c1aea96 100644 --- a/Lib/test/test_functools.py +++ b/Lib/test/test_functools.py @@ -13,7 +13,6 @@ import typing import unittest import unittest.mock -import os import weakref import gc from weakref import proxy @@ -21,7 +20,6 @@ from test.support import import_helper from test.support import threading_helper -from test.support.script_helper import assert_python_ok import functools diff --git a/Lib/test/test_future.py b/Lib/test/test_future.py index c3e8420d6672c5..189cbdc4365b79 100644 --- a/Lib/test/test_future.py +++ b/Lib/test/test_future.py @@ -3,7 +3,6 @@ import __future__ import ast import unittest -from test import support from test.support import import_helper from textwrap import dedent import os diff --git a/Lib/test/test_gc.py b/Lib/test/test_gc.py index dbbd67b4fc88a1..087f72768fa4bf 100644 --- a/Lib/test/test_gc.py +++ b/Lib/test/test_gc.py @@ -1440,19 +1440,13 @@ def test_ast_fini(self): code = textwrap.dedent(""" import ast import codecs + from test import support # Small AST tree to keep their AST types alive tree = ast.parse("def f(x, y): return 2*x-y") - x = [tree] - x.append(x) - - # Put the cycle somewhere to survive until the last GC collection. - # Codec search functions are only cleared at the end of - # interpreter_clear(). - def search_func(encoding): - return None - search_func.a = x - codecs.register(search_func) + + # Store the tree somewhere to survive until the last GC collection + support.late_deletion(tree) """) assert_python_ok("-c", code) diff --git a/Lib/test/test_genericpath.py b/Lib/test/test_genericpath.py index 2741adc139bcf6..489044f8090d3b 100644 --- a/Lib/test/test_genericpath.py +++ b/Lib/test/test_genericpath.py @@ -484,7 +484,7 @@ def test_nonascii_abspath(self): # invalid UTF-8 name. Windows allows creating a directory with an # arbitrary bytes name, but fails to enter this directory # (when the bytes name is used). - and sys.platform not in ('win32', 'darwin', 'emscripten')): + and sys.platform not in ('win32', 'darwin', 'emscripten', 'wasi')): name = os_helper.TESTFN_UNDECODABLE elif os_helper.TESTFN_NONASCII: name = os_helper.TESTFN_NONASCII diff --git a/Lib/test/test_getpath.py b/Lib/test/test_getpath.py index 5208374e20013c..1e46cb5780c942 100644 --- a/Lib/test/test_getpath.py +++ b/Lib/test/test_getpath.py @@ -2,7 +2,6 @@ import ntpath import pathlib import posixpath -import sys import unittest from test.support import verbose diff --git a/Lib/test/test_hashlib.py b/Lib/test/test_hashlib.py index 67becdd6d317f3..bc9407dc9e424d 100644 --- a/Lib/test/test_hashlib.py +++ b/Lib/test/test_hashlib.py @@ -26,8 +26,6 @@ from test.support import warnings_helper from http.client import HTTPException -# Were we compiled --with-pydebug or with #define Py_DEBUG? 
-COMPILED_WITH_PYDEBUG = hasattr(sys, 'gettotalrefcount') # default builtin hash module default_builtin_hashes = {'md5', 'sha1', 'sha256', 'sha512', 'sha3', 'blake2'} @@ -109,7 +107,7 @@ class HashLibTestCase(unittest.TestCase): shakes = {'shake_128', 'shake_256'} # Issue #14693: fallback modules are always compiled under POSIX - _warn_on_extension_import = os.name == 'posix' or COMPILED_WITH_PYDEBUG + _warn_on_extension_import = (os.name == 'posix' or support.Py_DEBUG) def _conditional_import_module(self, module_name): """Import a module and return a reference to it or None on failure.""" diff --git a/Lib/test/test_heapq.py b/Lib/test/test_heapq.py index cb1e4505b02a30..1aa8e4e289730d 100644 --- a/Lib/test/test_heapq.py +++ b/Lib/test/test_heapq.py @@ -4,7 +4,6 @@ import unittest import doctest -from test import support from test.support import import_helper from unittest import TestCase, skipUnless from operator import itemgetter diff --git a/Lib/test/test_http_cookiejar.py b/Lib/test/test_http_cookiejar.py index ad2b121fdaf5a1..f8291c2aa32cfe 100644 --- a/Lib/test/test_http_cookiejar.py +++ b/Lib/test/test_http_cookiejar.py @@ -1,8 +1,9 @@ """Tests for http/cookiejar.py.""" import os +import stat +import sys import re -import test.support from test.support import os_helper from test.support import warnings_helper import time @@ -17,6 +18,7 @@ reach, is_HDN, domain_match, user_domain_match, request_path, request_port, request_host) +mswindows = (sys.platform == "win32") class DateTimeTests(unittest.TestCase): @@ -364,10 +366,37 @@ def test_lwp_valueless_cookie(self): c = LWPCookieJar() c.load(filename, ignore_discard=True) finally: - try: os.unlink(filename) - except OSError: pass + os_helper.unlink(filename) self.assertEqual(c._cookies["www.acme.com"]["/"]["boo"].value, None) + @unittest.skipIf(mswindows, "windows file permissions are incompatible with file modes") + @os_helper.skip_unless_working_chmod + def test_lwp_filepermissions(self): + # Cookie file should only be readable by the creator + filename = os_helper.TESTFN + c = LWPCookieJar() + interact_netscape(c, "http://www.acme.com/", 'boo') + try: + c.save(filename, ignore_discard=True) + st = os.stat(filename) + self.assertEqual(stat.S_IMODE(st.st_mode), 0o600) + finally: + os_helper.unlink(filename) + + @unittest.skipIf(mswindows, "windows file permissions are incompatible with file modes") + @os_helper.skip_unless_working_chmod + def test_mozilla_filepermissions(self): + # Cookie file should only be readable by the creator + filename = os_helper.TESTFN + c = MozillaCookieJar() + interact_netscape(c, "http://www.acme.com/", 'boo') + try: + c.save(filename, ignore_discard=True) + st = os.stat(filename) + self.assertEqual(stat.S_IMODE(st.st_mode), 0o600) + finally: + os_helper.unlink(filename) + def test_bad_magic(self): # OSErrors (eg. file doesn't exist) are allowed to propagate filename = os_helper.TESTFN @@ -391,8 +420,7 @@ def test_bad_magic(self): c = cookiejar_class() self.assertRaises(LoadError, c.load, filename) finally: - try: os.unlink(filename) - except OSError: pass + os_helper.unlink(filename) class CookieTests(unittest.TestCase): # XXX @@ -496,7 +524,7 @@ def test_missing_value(self): c = MozillaCookieJar(filename) c.revert(ignore_expires=True, ignore_discard=True) finally: - os.unlink(c.filename) + os_helper.unlink(c.filename) # cookies unchanged apart from lost info re. 
whether path was specified self.assertEqual( repr(c), @@ -1766,8 +1794,7 @@ def test_rejection(self): c = LWPCookieJar(policy=pol) c.load(filename, ignore_discard=True) finally: - try: os.unlink(filename) - except OSError: pass + os_helper.unlink(filename) self.assertEqual(old, repr(c)) @@ -1826,8 +1853,7 @@ def save_and_restore(cj, ignore_discard): DefaultCookiePolicy(rfc2965=True)) new_c.load(ignore_discard=ignore_discard) finally: - try: os.unlink(filename) - except OSError: pass + os_helper.unlink(filename) return new_c new_c = save_and_restore(c, True) diff --git a/Lib/test/test_httpservers.py b/Lib/test/test_httpservers.py index 1f041aa121f974..a937258069ed89 100644 --- a/Lib/test/test_httpservers.py +++ b/Lib/test/test_httpservers.py @@ -334,7 +334,7 @@ class request_handler(NoLogRequestHandler, SimpleHTTPRequestHandler): pass def setUp(self): - BaseTestCase.setUp(self) + super().setUp() self.cwd = os.getcwd() basetempdir = tempfile.gettempdir() os.chdir(basetempdir) @@ -362,7 +362,7 @@ def tearDown(self): except: pass finally: - BaseTestCase.tearDown(self) + super().tearDown() def check_status_and_reason(self, response, status, data=None): def close_conn(): @@ -418,6 +418,55 @@ def test_undecodable_filename(self): self.check_status_and_reason(response, HTTPStatus.OK, data=os_helper.TESTFN_UNDECODABLE) + def test_get_dir_redirect_location_domain_injection_bug(self): + """Ensure //evil.co/..%2f../../X does not put //evil.co/ in Location. + + //netloc/ in a Location header is a redirect to a new host. + https://github.com/python/cpython/issues/87389 + + This checks that a path resolving to a directory on our server cannot + resolve into a redirect to another server. + """ + os.mkdir(os.path.join(self.tempdir, 'existing_directory')) + url = f'/python.org/..%2f..%2f..%2f..%2f..%2f../%0a%0d/../{self.tempdir_name}/existing_directory' + expected_location = f'{url}/' # /python.org.../ single slash single prefix, trailing slash + # Canonicalizes to /tmp/tempdir_name/existing_directory which does + # exist and is a dir, triggering the 301 redirect logic. + response = self.request(url) + self.check_status_and_reason(response, HTTPStatus.MOVED_PERMANENTLY) + location = response.getheader('Location') + self.assertEqual(location, expected_location, msg='non-attack failed!') + + # //python.org... multi-slash prefix, no trailing slash + attack_url = f'/{url}' + response = self.request(attack_url) + self.check_status_and_reason(response, HTTPStatus.MOVED_PERMANENTLY) + location = response.getheader('Location') + self.assertFalse(location.startswith('//'), msg=location) + self.assertEqual(location, expected_location, + msg='Expected Location header to start with a single / and ' + 'end with a / as this is a directory redirect.') + + # ///python.org... triple-slash prefix, no trailing slash + attack3_url = f'//{url}' + response = self.request(attack3_url) + self.check_status_and_reason(response, HTTPStatus.MOVED_PERMANENTLY) + self.assertEqual(response.getheader('Location'), expected_location) + + # If the second word in the http request (Request-URI for the http + # method) is a full URI, we don't worry about it, as that'll be parsed + # and reassembled as a full URI within BaseHTTPRequestHandler.send_head + # so no errant scheme-less //netloc//evil.co/ domain mixup can happen. 
+ attack_scheme_netloc_2slash_url = f'https://pypi.org/{url}' + expected_scheme_netloc_location = f'{attack_scheme_netloc_2slash_url}/' + response = self.request(attack_scheme_netloc_2slash_url) + self.check_status_and_reason(response, HTTPStatus.MOVED_PERMANENTLY) + location = response.getheader('Location') + # We're just ensuring that the scheme and domain make it through, if + # there are or aren't multiple slashes at the start of the path that + # follows that isn't important in this Location: header. + self.assertTrue(location.startswith('https://pypi.org/'), msg=location) + def test_get(self): #constructs the path relative to the root directory of the HTTPServer response = self.request(self.base_url + '/test') diff --git a/Lib/test/test_imaplib.py b/Lib/test/test_imaplib.py index ff13edea2d1e4a..b554bc0c79960d 100644 --- a/Lib/test/test_imaplib.py +++ b/Lib/test/test_imaplib.py @@ -10,9 +10,7 @@ import threading import socket -from test.support import (verbose, - run_with_tz, run_with_locale, cpython_only, - requires_working_socket) +from test.support import verbose, run_with_tz, run_with_locale, cpython_only from test.support import hashlib_helper from test.support import threading_helper from test.support import warnings_helper @@ -939,6 +937,7 @@ def test_with_statement_logout(self): @threading_helper.reap_threads @cpython_only + @unittest.skipUnless(__debug__, "Won't work if __debug__ is False") def test_dump_ur(self): # See: http://bugs.python.org/issue26543 untagged_resp_dict = {'READ-WRITE': [b'']} diff --git a/Lib/test/test_import/__init__.py b/Lib/test/test_import/__init__.py index be88677dc697ee..ab7f887c374ced 100644 --- a/Lib/test/test_import/__init__.py +++ b/Lib/test/test_import/__init__.py @@ -20,7 +20,8 @@ from test.support import os_helper from test.support import ( - STDLIB_DIR, is_jython, swap_attr, swap_item, cpython_only, is_emscripten) + STDLIB_DIR, is_jython, swap_attr, swap_item, cpython_only, is_emscripten, + is_wasi) from test.support.import_helper import ( forget, make_legacy_pyc, unlink, unload, DirsOnSysPath, CleanImport) from test.support.os_helper import ( @@ -535,7 +536,10 @@ class FilePermissionTests(unittest.TestCase): @unittest.skipUnless(os.name == 'posix', "test meaningful only on posix systems") - @unittest.skipIf(is_emscripten, "Emscripten's umask is a stub.") + @unittest.skipIf( + is_emscripten or is_wasi, + "Emscripten's/WASI's umask is a stub." 
+ ) def test_creation_mode(self): mask = 0o022 with temp_umask(mask), _ready_to_import() as (name, path): @@ -553,6 +557,7 @@ def test_creation_mode(self): @unittest.skipUnless(os.name == 'posix', "test meaningful only on posix systems") + @os_helper.skip_unless_working_chmod def test_cached_mode_issue_2051(self): # permissions of .pyc should match those of .py, regardless of mask mode = 0o600 @@ -569,6 +574,7 @@ def test_cached_mode_issue_2051(self): @unittest.skipUnless(os.name == 'posix', "test meaningful only on posix systems") + @os_helper.skip_unless_working_chmod def test_cached_readonly(self): mode = 0o400 with temp_umask(0o022), _ready_to_import() as (name, path): @@ -882,6 +888,8 @@ def test_import_pyc_path(self): @unittest.skipIf(hasattr(os, 'geteuid') and os.geteuid() == 0, "due to varying filesystem permission semantics (issue #11956)") @skip_if_dont_write_bytecode + @os_helper.skip_unless_working_chmod + @unittest.skipIf(is_emscripten, "umask is a stub") def test_unwritable_directory(self): # When the umask causes the new __pycache__ directory to be # unwritable, the import still succeeds but no .pyc file is written. @@ -920,7 +928,7 @@ def test_missing_source_legacy(self): m = __import__(TESTFN) try: self.assertEqual(m.__file__, - os.path.join(os.getcwd(), os.curdir, os.path.relpath(pyc_file))) + os.path.join(os.getcwd(), os.path.relpath(pyc_file))) finally: os.remove(pyc_file) @@ -928,7 +936,7 @@ def test___cached__(self): # Modules now also have an __cached__ that points to the pyc file. m = __import__(TESTFN) pyc_file = importlib.util.cache_from_source(TESTFN + '.py') - self.assertEqual(m.__cached__, os.path.join(os.getcwd(), os.curdir, pyc_file)) + self.assertEqual(m.__cached__, os.path.join(os.getcwd(), pyc_file)) @skip_if_dont_write_bytecode def test___cached___legacy_pyc(self): @@ -944,7 +952,7 @@ def test___cached___legacy_pyc(self): importlib.invalidate_caches() m = __import__(TESTFN) self.assertEqual(m.__cached__, - os.path.join(os.getcwd(), os.curdir, os.path.relpath(pyc_file))) + os.path.join(os.getcwd(), os.path.relpath(pyc_file))) @skip_if_dont_write_bytecode def test_package___cached__(self): @@ -964,10 +972,10 @@ def cleanup(): m = __import__('pep3147.foo') init_pyc = importlib.util.cache_from_source( os.path.join('pep3147', '__init__.py')) - self.assertEqual(m.__cached__, os.path.join(os.getcwd(), os.curdir, init_pyc)) + self.assertEqual(m.__cached__, os.path.join(os.getcwd(), init_pyc)) foo_pyc = importlib.util.cache_from_source(os.path.join('pep3147', 'foo.py')) self.assertEqual(sys.modules['pep3147.foo'].__cached__, - os.path.join(os.getcwd(), os.curdir, foo_pyc)) + os.path.join(os.getcwd(), foo_pyc)) def test_package___cached___from_pyc(self): # Like test___cached__ but ensuring __cached__ when imported from a @@ -991,10 +999,10 @@ def cleanup(): m = __import__('pep3147.foo') init_pyc = importlib.util.cache_from_source( os.path.join('pep3147', '__init__.py')) - self.assertEqual(m.__cached__, os.path.join(os.getcwd(), os.curdir, init_pyc)) + self.assertEqual(m.__cached__, os.path.join(os.getcwd(), init_pyc)) foo_pyc = importlib.util.cache_from_source(os.path.join('pep3147', 'foo.py')) self.assertEqual(sys.modules['pep3147.foo'].__cached__, - os.path.join(os.getcwd(), os.curdir, foo_pyc)) + os.path.join(os.getcwd(), foo_pyc)) def test_recompute_pyc_same_second(self): # Even when the source file doesn't change timestamp, a change in diff --git a/Lib/test/test_importlib/extension/test_finder.py b/Lib/test/test_importlib/extension/test_finder.py index 
b6663a44845502..1d5b6e7a5de94b 100644 --- a/Lib/test/test_importlib/extension/test_finder.py +++ b/Lib/test/test_importlib/extension/test_finder.py @@ -3,7 +3,7 @@ machinery = util.import_importlib('importlib.machinery') import unittest -import warnings +import sys class FinderTests(abc.FinderTests): @@ -13,6 +13,10 @@ class FinderTests(abc.FinderTests): def setUp(self): if not self.machinery.EXTENSION_SUFFIXES: raise unittest.SkipTest("Requires dynamic loading support.") + if util.EXTENSIONS.name in sys.builtin_module_names: + raise unittest.SkipTest( + f"{util.EXTENSIONS.name} is a builtin module" + ) def find_spec(self, fullname): importer = self.machinery.FileFinder(util.EXTENSIONS.path, diff --git a/Lib/test/test_importlib/extension/test_loader.py b/Lib/test/test_importlib/extension/test_loader.py index 5080009bee32ee..8570c6bc90cd07 100644 --- a/Lib/test/test_importlib/extension/test_loader.py +++ b/Lib/test/test_importlib/extension/test_loader.py @@ -20,6 +20,10 @@ class LoaderTests(abc.LoaderTests): def setUp(self): if not self.machinery.EXTENSION_SUFFIXES: raise unittest.SkipTest("Requires dynamic loading support.") + if util.EXTENSIONS.name in sys.builtin_module_names: + raise unittest.SkipTest( + f"{util.EXTENSIONS.name} is a builtin module" + ) self.loader = self.machinery.ExtensionFileLoader(util.EXTENSIONS.name, util.EXTENSIONS.file_path) @@ -97,6 +101,10 @@ def setUp(self): if not self.machinery.EXTENSION_SUFFIXES: raise unittest.SkipTest("Requires dynamic loading support.") self.name = '_testmultiphase' + if self.name in sys.builtin_module_names: + raise unittest.SkipTest( + f"{self.name} is a builtin module" + ) finder = self.machinery.FileFinder(None) self.spec = importlib.util.find_spec(self.name) assert self.spec diff --git a/Lib/test/test_importlib/fixtures.py b/Lib/test/test_importlib/fixtures.py index 803d3738d263f4..e7be77b3957c67 100644 --- a/Lib/test/test_importlib/fixtures.py +++ b/Lib/test/test_importlib/fixtures.py @@ -5,6 +5,7 @@ import pathlib import tempfile import textwrap +import functools import contextlib from test.support.os_helper import FS_NONASCII @@ -296,3 +297,18 @@ def setUp(self): # Add self.zip_name to the front of sys.path. self.resources = contextlib.ExitStack() self.addCleanup(self.resources.close) + + +def parameterize(*args_set): + """Run test method with a series of parameters.""" + + def wrapper(func): + @functools.wraps(func) + def _inner(self): + for args in args_set: + with self.subTest(**args): + func(self, **args) + + return _inner + + return wrapper diff --git a/Lib/test/test_importlib/import_/test_path.py b/Lib/test/test_importlib/import_/test_path.py index 6f1d0cabd28a62..de620842bbc52b 100644 --- a/Lib/test/test_importlib/import_/test_path.py +++ b/Lib/test/test_importlib/import_/test_path.py @@ -202,10 +202,11 @@ def __init__(self): def invalidate_caches(self): self.called = True - cache = {'leave_alone': object(), 'finder_to_invalidate': FakeFinder()} + key = os.path.abspath('finder_to_invalidate') + cache = {'leave_alone': object(), key: FakeFinder()} with util.import_state(path_importer_cache=cache): self.machinery.PathFinder.invalidate_caches() - self.assertTrue(cache['finder_to_invalidate'].called) + self.assertTrue(cache[key].called) def test_invalidate_caches_clear_out_None(self): # Clear out None in sys.path_importer_cache() when invalidating caches. 
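# Illustrative sketch, not from the patch: the sys.path_importer_cache behaviour exercised
# by the hunk below. PathFinder.invalidate_caches() now also drops cache entries keyed by
# a relative path (they can go stale when the working directory changes), which is why the
# test above switched its surviving key to os.path.abspath(). _NoopFinder is a stand-in
# class for the example, not a real importlib object.
import sys
from importlib.machinery import PathFinder

class _NoopFinder:
    def invalidate_caches(self):
        pass

sys.path_importer_cache['some_relative_dir'] = _NoopFinder()
PathFinder.invalidate_caches()
assert 'some_relative_dir' not in sys.path_importer_cache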
@@ -214,6 +215,16 @@ def test_invalidate_caches_clear_out_None(self): self.machinery.PathFinder.invalidate_caches() self.assertEqual(len(cache), 0) + def test_invalidate_caches_clear_out_relative_path(self): + class FakeFinder: + def invalidate_caches(self): + pass + + cache = {'relative_path': FakeFinder()} + with util.import_state(path_importer_cache=cache): + self.machinery.PathFinder.invalidate_caches() + self.assertEqual(cache, {}) + class FindModuleTests(FinderTests): def find(self, *args, **kwargs): diff --git a/Lib/test/test_importlib/test_abc.py b/Lib/test/test_importlib/test_abc.py index 92cb78067d0ebd..d59b663a43ed06 100644 --- a/Lib/test/test_importlib/test_abc.py +++ b/Lib/test/test_importlib/test_abc.py @@ -2,7 +2,6 @@ import marshal import os import sys -from test import support from test.support import import_helper import types import unittest diff --git a/Lib/test/test_importlib/test_api.py b/Lib/test/test_importlib/test_api.py index 1f8f7c00bda536..b3a99dc2dd5731 100644 --- a/Lib/test/test_importlib/test_api.py +++ b/Lib/test/test_importlib/test_api.py @@ -6,7 +6,6 @@ import os.path import sys -from test import support from test.support import import_helper from test.support import os_helper import types @@ -301,7 +300,8 @@ def test_reload_namespace_changed(self): name = 'spam' with os_helper.temp_cwd(None) as cwd: with test_util.uncache('spam'): - with import_helper.DirsOnSysPath(cwd): + with test_util.import_state(path=[cwd]): + self.init._bootstrap_external._install(self.init._bootstrap) # Start as a namespace package. self.init.invalidate_caches() bad_path = os.path.join(cwd, name, '__init.py') @@ -395,7 +395,7 @@ def find_module(self, *args): def invalidate_caches(self): self.called = True - key = 'gobledeegook' + key = os.path.abspath('gobledeegook') meta_ins = InvalidatingNullFinder() path_ins = InvalidatingNullFinder() sys.meta_path.insert(0, meta_ins) diff --git a/Lib/test/test_importlib/test_main.py b/Lib/test/test_importlib/test_main.py index c80a0e4a5a9f91..d9d067c4b23d66 100644 --- a/Lib/test/test_importlib/test_main.py +++ b/Lib/test/test_importlib/test_main.py @@ -1,7 +1,6 @@ import re import json import pickle -import textwrap import unittest import warnings import importlib.metadata @@ -16,6 +15,7 @@ Distribution, EntryPoint, PackageNotFoundError, + _unique, distributions, entry_points, metadata, @@ -51,6 +51,14 @@ def test_package_not_found_mentions_metadata(self): def test_new_style_classes(self): self.assertIsInstance(Distribution, type) + @fixtures.parameterize( + dict(name=None), + dict(name=''), + ) + def test_invalid_inputs_to_from_name(self, name): + with self.assertRaises(Exception): + Distribution.from_name(name) + class ImportTests(fixtures.DistInfoPkg, unittest.TestCase): def test_import_nonexistent_module(self): @@ -78,48 +86,50 @@ def test_resolve_without_attr(self): class NameNormalizationTests(fixtures.OnSysPath, fixtures.SiteDir, unittest.TestCase): @staticmethod - def pkg_with_dashes(site_dir): + def make_pkg(name): """ - Create minimal metadata for a package with dashes - in the name (and thus underscores in the filename). + Create minimal metadata for a dist-info package with + the indicated name on the file system. 
""" - metadata_dir = site_dir / 'my_pkg.dist-info' - metadata_dir.mkdir() - metadata = metadata_dir / 'METADATA' - with metadata.open('w', encoding='utf-8') as strm: - strm.write('Version: 1.0\n') - return 'my-pkg' + return { + f'{name}.dist-info': { + 'METADATA': 'VERSION: 1.0\n', + }, + } def test_dashes_in_dist_name_found_as_underscores(self): """ For a package with a dash in the name, the dist-info metadata uses underscores in the name. Ensure the metadata loads. """ - pkg_name = self.pkg_with_dashes(self.site_dir) - assert version(pkg_name) == '1.0' - - @staticmethod - def pkg_with_mixed_case(site_dir): - """ - Create minimal metadata for a package with mixed case - in the name. - """ - metadata_dir = site_dir / 'CherryPy.dist-info' - metadata_dir.mkdir() - metadata = metadata_dir / 'METADATA' - with metadata.open('w', encoding='utf-8') as strm: - strm.write('Version: 1.0\n') - return 'CherryPy' + fixtures.build_files(self.make_pkg('my_pkg'), self.site_dir) + assert version('my-pkg') == '1.0' def test_dist_name_found_as_any_case(self): """ Ensure the metadata loads when queried with any case. """ - pkg_name = self.pkg_with_mixed_case(self.site_dir) + pkg_name = 'CherryPy' + fixtures.build_files(self.make_pkg(pkg_name), self.site_dir) assert version(pkg_name) == '1.0' assert version(pkg_name.lower()) == '1.0' assert version(pkg_name.upper()) == '1.0' + def test_unique_distributions(self): + """ + Two distributions varying only by non-normalized name on + the file system should resolve as the same. + """ + fixtures.build_files(self.make_pkg('abc'), self.site_dir) + before = list(_unique(distributions())) + + alt_site_dir = self.fixtures.enter_context(fixtures.tempdir()) + self.fixtures.enter_context(self.add_sys_path(alt_site_dir)) + fixtures.build_files(self.make_pkg('ABC'), alt_site_dir) + after = list(_unique(distributions())) + + assert len(after) == len(before) + class NonASCIITests(fixtures.OnSysPath, fixtures.SiteDir, unittest.TestCase): @staticmethod @@ -128,11 +138,12 @@ def pkg_with_non_ascii_description(site_dir): Create minimal metadata for a package with non-ASCII in the description. """ - metadata_dir = site_dir / 'portend.dist-info' - metadata_dir.mkdir() - metadata = metadata_dir / 'METADATA' - with metadata.open('w', encoding='utf-8') as fp: - fp.write('Description: pôrˈtend') + contents = { + 'portend.dist-info': { + 'METADATA': 'Description: pôrˈtend', + }, + } + fixtures.build_files(contents, site_dir) return 'portend' @staticmethod @@ -141,19 +152,15 @@ def pkg_with_non_ascii_description_egg_info(site_dir): Create minimal metadata for an egg-info package with non-ASCII in the description. 
""" - metadata_dir = site_dir / 'portend.dist-info' - metadata_dir.mkdir() - metadata = metadata_dir / 'METADATA' - with metadata.open('w', encoding='utf-8') as fp: - fp.write( - textwrap.dedent( - """ + contents = { + 'portend.dist-info': { + 'METADATA': """ Name: portend - pôrˈtend - """ - ).strip() - ) + pôrˈtend""", + }, + } + fixtures.build_files(contents, site_dir) return 'portend' def test_metadata_loads(self): diff --git a/Lib/test/test_importlib/test_metadata_api.py b/Lib/test/test_importlib/test_metadata_api.py index c86bb4df19c5eb..69c78e9820c044 100644 --- a/Lib/test/test_importlib/test_metadata_api.py +++ b/Lib/test/test_importlib/test_metadata_api.py @@ -89,15 +89,15 @@ def test_entry_points_distribution(self): self.assertIn(ep.dist.name, ('distinfo-pkg', 'egginfo-pkg')) self.assertEqual(ep.dist.version, "1.0.0") - def test_entry_points_unique_packages(self): + def test_entry_points_unique_packages_normalized(self): """ Entry points should only be exposed for the first package - on sys.path with a given name. + on sys.path with a given name (even when normalized). """ alt_site_dir = self.fixtures.enter_context(fixtures.tempdir()) self.fixtures.enter_context(self.add_sys_path(alt_site_dir)) alt_pkg = { - "distinfo_pkg-1.1.0.dist-info": { + "DistInfo_pkg-1.1.0.dist-info": { "METADATA": """ Name: distinfo-pkg Version: 1.1.0 diff --git a/Lib/test/test_importlib/test_threaded_import.py b/Lib/test/test_importlib/test_threaded_import.py index cc1d804f35f913..9aeeb5e686e936 100644 --- a/Lib/test/test_importlib/test_threaded_import.py +++ b/Lib/test/test_importlib/test_threaded_import.py @@ -272,4 +272,4 @@ def setUpModule(): if __name__ == "__main__": - unittets.main() + unittest.main() diff --git a/Lib/test/test_inspect.py b/Lib/test/test_inspect.py index fe0259ab609c7f..ae1842704d37df 100644 --- a/Lib/test/test_inspect.py +++ b/Lib/test/test_inspect.py @@ -842,7 +842,10 @@ def test_nested_class_definition_inside_function(self): self.assertSourceEqual(mod2.cls213, 218, 222) self.assertSourceEqual(mod2.cls213().func219(), 220, 221) - @unittest.skipIf(support.is_emscripten, "socket.accept is broken") + @unittest.skipIf( + support.is_emscripten or support.is_wasi, + "socket.accept is broken" + ) def test_nested_class_definition_inside_async_function(self): import asyncio self.addCleanup(asyncio.set_event_loop_policy, None) diff --git a/Lib/test/test_io.py b/Lib/test/test_io.py index 5528c461e58ae6..24c93b969ea2b7 100644 --- a/Lib/test/test_io.py +++ b/Lib/test/test_io.py @@ -68,7 +68,7 @@ class EmptyStruct(ctypes.Structure): # Does io.IOBase finalizer log the exception if the close() method fails? # The exception is ignored silently by default in release build. 
-IOBASE_EMITS_UNRAISABLE = (hasattr(sys, "gettotalrefcount") or sys.flags.dev_mode) +IOBASE_EMITS_UNRAISABLE = (support.Py_DEBUG or sys.flags.dev_mode) def _default_chunk_size(): @@ -429,6 +429,7 @@ def test_invalid_operations(self): @unittest.skipIf( support.is_emscripten, "fstat() of a pipe fd is not supported" ) + @unittest.skipUnless(hasattr(os, "pipe"), "requires os.pipe()") def test_optional_abilities(self): # Test for OSError when optional APIs are not supported # The purpose of this test is to try fileno(), reading, writing and @@ -3569,6 +3570,10 @@ def seekable(self): return True F.tell = lambda x: 0 t = self.TextIOWrapper(F(), encoding='utf-8') + def test_reconfigure_locale(self): + wrapper = io.TextIOWrapper(io.BytesIO(b"test")) + wrapper.reconfigure(encoding="locale") + def test_reconfigure_encoding_read(self): # latin1 -> utf8 # (latin1 can decode utf-8 encoded string) @@ -3970,6 +3975,7 @@ def test_removed_u_mode(self): @unittest.skipIf( support.is_emscripten, "fstat() of a pipe fd is not supported" ) + @unittest.skipUnless(hasattr(os, "pipe"), "requires os.pipe()") def test_open_pipe_with_append(self): # bpo-27805: Ignore ESPIPE from lseek() in open(). r, w = os.pipe() @@ -4108,6 +4114,7 @@ def cleanup_fds(): with warnings_helper.check_no_resource_warning(self): open(r, *args, closefd=False, **kwargs) + @unittest.skipUnless(hasattr(os, "pipe"), "requires os.pipe()") def test_warn_on_dealloc_fd(self): self._check_warn_on_dealloc_fd("rb", buffering=0) self._check_warn_on_dealloc_fd("rb") @@ -4147,6 +4154,7 @@ def test_nonblock_pipe_write_smallbuf(self): @unittest.skipUnless(hasattr(os, 'set_blocking'), 'os.set_blocking() required for this test') + @unittest.skipUnless(hasattr(os, "pipe"), "requires os.pipe()") def _test_nonblock_pipe_write(self, bufsize): sent = [] received = [] @@ -4293,14 +4301,6 @@ def test_text_encoding(self): proc = assert_python_ok('-X', 'utf8=1', '-c', code) self.assertEqual(b"utf-8", proc.out.strip()) - @support.cpython_only - # Depending if OpenWrapper was already created or not, the warning is - # emitted or not. For example, the attribute is already created when this - # test is run multiple times. 
- @warnings_helper.ignore_warnings(category=DeprecationWarning) - def test_openwrapper(self): - self.assertIs(self.io.OpenWrapper, self.io.open) - class CMiscIOTest(MiscIOTest): io = io @@ -4458,14 +4458,17 @@ def _read(): raise @requires_alarm + @unittest.skipUnless(hasattr(os, "pipe"), "requires os.pipe()") def test_interrupted_write_unbuffered(self): self.check_interrupted_write(b"xy", b"xy", mode="wb", buffering=0) @requires_alarm + @unittest.skipUnless(hasattr(os, "pipe"), "requires os.pipe()") def test_interrupted_write_buffered(self): self.check_interrupted_write(b"xy", b"xy", mode="wb") @requires_alarm + @unittest.skipUnless(hasattr(os, "pipe"), "requires os.pipe()") def test_interrupted_write_text(self): self.check_interrupted_write("xy", b"xy", mode="w", encoding="ascii") diff --git a/Lib/test/test_itertools.py b/Lib/test/test_itertools.py index 238afbbd883d3a..f469bfe185e65b 100644 --- a/Lib/test/test_itertools.py +++ b/Lib/test/test_itertools.py @@ -180,6 +180,7 @@ def test_chain_from_iterable(self): self.assertEqual(list(chain.from_iterable([''])), []) self.assertEqual(take(4, chain.from_iterable(['abc', 'def'])), list('abcd')) self.assertRaises(TypeError, list, chain.from_iterable([2, 3])) + self.assertEqual(list(islice(chain.from_iterable(repeat(range(5))), 2)), [0, 1]) def test_chain_reducible(self): for oper in [copy.deepcopy] + picklecopiers: diff --git a/Lib/test/test_largefile.py b/Lib/test/test_largefile.py index 8f6bec16200534..3c11c59baef6e5 100644 --- a/Lib/test/test_largefile.py +++ b/Lib/test/test_largefile.py @@ -156,6 +156,8 @@ def test_seekable(self): def skip_no_disk_space(path, required): def decorator(fun): def wrapper(*args, **kwargs): + if not hasattr(shutil, "disk_usage"): + raise unittest.SkipTest("requires shutil.disk_usage") if shutil.disk_usage(os.path.realpath(path)).free < required: hsize = int(required / 1024 / 1024) raise unittest.SkipTest( diff --git a/Lib/test/test_launcher.py b/Lib/test/test_launcher.py index 2e10c55e339d5c..cd7b944f98f586 100644 --- a/Lib/test/test_launcher.py +++ b/Lib/test/test_launcher.py @@ -7,7 +7,6 @@ import sys import sysconfig import tempfile -import textwrap import unittest from pathlib import Path from test import support @@ -244,6 +243,17 @@ def script(self, content, encoding="utf-8"): finally: file.unlink() + @contextlib.contextmanager + def fake_venv(self): + venv = Path.cwd() / "Scripts" + venv.mkdir(exist_ok=True, parents=True) + venv_exe = (venv / Path(sys.executable).name) + venv_exe.touch() + try: + yield venv_exe, {"VIRTUAL_ENV": str(venv.parent)} + finally: + shutil.rmtree(venv) + class TestLauncher(unittest.TestCase, RunPyMixin): @classmethod @@ -451,12 +461,8 @@ def test_py_default_in_list(self): self.assertEqual("PythonTestSuite/3.100", default) def test_virtualenv_in_list(self): - venv = Path.cwd() / "Scripts" - venv.mkdir(exist_ok=True, parents=True) - venv_exe = (venv / Path(sys.executable).name) - venv_exe.touch() - try: - data = self.run_py(["-0p"], env={"VIRTUAL_ENV": str(venv.parent)}) + with self.fake_venv() as (venv_exe, env): + data = self.run_py(["-0p"], env=env) for line in data["stdout"].splitlines(): m = re.match(r"\s*\*\s+(.+)$", line) if m: @@ -465,7 +471,7 @@ def test_virtualenv_in_list(self): else: self.fail("did not find active venv path") - data = self.run_py(["-0"], env={"VIRTUAL_ENV": str(venv.parent)}) + data = self.run_py(["-0"], env=env) for line in data["stdout"].splitlines(): m = re.match(r"\s*\*\s+(.+)$", line) if m: @@ -473,8 +479,17 @@ def test_virtualenv_in_list(self): 
break else: self.fail("did not find active venv entry") - finally: - shutil.rmtree(venv) + + def test_virtualenv_with_env(self): + with self.fake_venv() as (venv_exe, env): + data1 = self.run_py([], env={**env, "PY_PYTHON": "PythonTestSuite/3"}) + data2 = self.run_py(["-V:PythonTestSuite/3"], env={**env, "PY_PYTHON": "PythonTestSuite/3"}) + # Compare stdout, because stderr goes via ascii + self.assertEqual(data1["stdout"].strip(), str(venv_exe)) + self.assertEqual(data1["SearchInfo.lowPriorityTag"], "True") + # Ensure passing the argument doesn't trigger the same behaviour + self.assertNotEqual(data2["stdout"].strip(), str(venv_exe)) + self.assertNotEqual(data2["SearchInfo.lowPriorityTag"], "True") def test_py_shebang(self): with self.py_ini(TEST_PY_COMMANDS): diff --git a/Lib/test/test_lib2to3.py b/Lib/test/test_lib2to3.py deleted file mode 100644 index 6ea8aa4a56e52e..00000000000000 --- a/Lib/test/test_lib2to3.py +++ /dev/null @@ -1,9 +0,0 @@ -import unittest -from test.support.import_helper import import_fresh_module -from test.support.warnings_helper import check_warnings - -with check_warnings(("", DeprecationWarning)): - load_tests = import_fresh_module('lib2to3.tests', fresh=['lib2to3']).load_tests - -if __name__ == '__main__': - unittest.main() diff --git a/Lib/lib2to3/tests/__init__.py b/Lib/test/test_lib2to3/__init__.py similarity index 100% rename from Lib/lib2to3/tests/__init__.py rename to Lib/test/test_lib2to3/__init__.py diff --git a/Lib/lib2to3/tests/__main__.py b/Lib/test/test_lib2to3/__main__.py similarity index 100% rename from Lib/lib2to3/tests/__main__.py rename to Lib/test/test_lib2to3/__main__.py diff --git a/Lib/lib2to3/tests/data/README b/Lib/test/test_lib2to3/data/README similarity index 100% rename from Lib/lib2to3/tests/data/README rename to Lib/test/test_lib2to3/data/README diff --git a/Lib/lib2to3/tests/data/bom.py b/Lib/test/test_lib2to3/data/bom.py similarity index 100% rename from Lib/lib2to3/tests/data/bom.py rename to Lib/test/test_lib2to3/data/bom.py diff --git a/Lib/lib2to3/tests/data/crlf.py b/Lib/test/test_lib2to3/data/crlf.py similarity index 100% rename from Lib/lib2to3/tests/data/crlf.py rename to Lib/test/test_lib2to3/data/crlf.py diff --git a/Lib/lib2to3/tests/data/different_encoding.py b/Lib/test/test_lib2to3/data/different_encoding.py similarity index 100% rename from Lib/lib2to3/tests/data/different_encoding.py rename to Lib/test/test_lib2to3/data/different_encoding.py diff --git a/Lib/lib2to3/tests/data/false_encoding.py b/Lib/test/test_lib2to3/data/false_encoding.py similarity index 100% rename from Lib/lib2to3/tests/data/false_encoding.py rename to Lib/test/test_lib2to3/data/false_encoding.py diff --git a/Lib/lib2to3/tests/data/fixers/bad_order.py b/Lib/test/test_lib2to3/data/fixers/bad_order.py similarity index 100% rename from Lib/lib2to3/tests/data/fixers/bad_order.py rename to Lib/test/test_lib2to3/data/fixers/bad_order.py diff --git a/Lib/lib2to3/tests/data/fixers/myfixes/__init__.py b/Lib/test/test_lib2to3/data/fixers/myfixes/__init__.py similarity index 100% rename from Lib/lib2to3/tests/data/fixers/myfixes/__init__.py rename to Lib/test/test_lib2to3/data/fixers/myfixes/__init__.py diff --git a/Lib/lib2to3/tests/data/fixers/myfixes/fix_explicit.py b/Lib/test/test_lib2to3/data/fixers/myfixes/fix_explicit.py similarity index 100% rename from Lib/lib2to3/tests/data/fixers/myfixes/fix_explicit.py rename to Lib/test/test_lib2to3/data/fixers/myfixes/fix_explicit.py diff --git a/Lib/lib2to3/tests/data/fixers/myfixes/fix_first.py 
b/Lib/test/test_lib2to3/data/fixers/myfixes/fix_first.py similarity index 100% rename from Lib/lib2to3/tests/data/fixers/myfixes/fix_first.py rename to Lib/test/test_lib2to3/data/fixers/myfixes/fix_first.py diff --git a/Lib/lib2to3/tests/data/fixers/myfixes/fix_last.py b/Lib/test/test_lib2to3/data/fixers/myfixes/fix_last.py similarity index 100% rename from Lib/lib2to3/tests/data/fixers/myfixes/fix_last.py rename to Lib/test/test_lib2to3/data/fixers/myfixes/fix_last.py diff --git a/Lib/lib2to3/tests/data/fixers/myfixes/fix_parrot.py b/Lib/test/test_lib2to3/data/fixers/myfixes/fix_parrot.py similarity index 100% rename from Lib/lib2to3/tests/data/fixers/myfixes/fix_parrot.py rename to Lib/test/test_lib2to3/data/fixers/myfixes/fix_parrot.py diff --git a/Lib/lib2to3/tests/data/fixers/myfixes/fix_preorder.py b/Lib/test/test_lib2to3/data/fixers/myfixes/fix_preorder.py similarity index 100% rename from Lib/lib2to3/tests/data/fixers/myfixes/fix_preorder.py rename to Lib/test/test_lib2to3/data/fixers/myfixes/fix_preorder.py diff --git a/Lib/lib2to3/tests/data/fixers/no_fixer_cls.py b/Lib/test/test_lib2to3/data/fixers/no_fixer_cls.py similarity index 100% rename from Lib/lib2to3/tests/data/fixers/no_fixer_cls.py rename to Lib/test/test_lib2to3/data/fixers/no_fixer_cls.py diff --git a/Lib/lib2to3/tests/data/fixers/parrot_example.py b/Lib/test/test_lib2to3/data/fixers/parrot_example.py similarity index 100% rename from Lib/lib2to3/tests/data/fixers/parrot_example.py rename to Lib/test/test_lib2to3/data/fixers/parrot_example.py diff --git a/Lib/lib2to3/tests/data/infinite_recursion.py b/Lib/test/test_lib2to3/data/infinite_recursion.py similarity index 100% rename from Lib/lib2to3/tests/data/infinite_recursion.py rename to Lib/test/test_lib2to3/data/infinite_recursion.py diff --git a/Lib/lib2to3/tests/data/py2_test_grammar.py b/Lib/test/test_lib2to3/data/py2_test_grammar.py similarity index 100% rename from Lib/lib2to3/tests/data/py2_test_grammar.py rename to Lib/test/test_lib2to3/data/py2_test_grammar.py diff --git a/Lib/lib2to3/tests/data/py3_test_grammar.py b/Lib/test/test_lib2to3/data/py3_test_grammar.py similarity index 100% rename from Lib/lib2to3/tests/data/py3_test_grammar.py rename to Lib/test/test_lib2to3/data/py3_test_grammar.py diff --git a/Lib/lib2to3/tests/pytree_idempotency.py b/Lib/test/test_lib2to3/pytree_idempotency.py similarity index 96% rename from Lib/lib2to3/tests/pytree_idempotency.py rename to Lib/test/test_lib2to3/pytree_idempotency.py index 2e7e9781d42995..eb2e2aa02ae0ed 100755 --- a/Lib/lib2to3/tests/pytree_idempotency.py +++ b/Lib/test/test_lib2to3/pytree_idempotency.py @@ -17,9 +17,9 @@ import logging # Local imports -from .. import pytree -from .. 
import pgen2 -from ..pgen2 import driver +from lib2to3 import pytree +from lib2to3 import pgen2 +from lib2to3.pgen2 import driver logging.basicConfig() diff --git a/Lib/lib2to3/tests/support.py b/Lib/test/test_lib2to3/support.py similarity index 77% rename from Lib/lib2to3/tests/support.py rename to Lib/test/test_lib2to3/support.py index fe084e8903fc86..9e56273e95992b 100644 --- a/Lib/lib2to3/tests/support.py +++ b/Lib/test/test_lib2to3/support.py @@ -8,12 +8,14 @@ from textwrap import dedent # Local imports +import lib2to3 from lib2to3 import pytree, refactor from lib2to3.pgen2 import driver as pgen2_driver +lib2to3_dir = os.path.dirname(lib2to3.__file__) test_dir = os.path.dirname(__file__) proj_dir = os.path.normpath(os.path.join(test_dir, "..")) -grammar_path = os.path.join(test_dir, "..", "Grammar.txt") +grammar_path = os.path.join(lib2to3_dir, "Grammar.txt") grammar = pgen2_driver.load_grammar(grammar_path) grammar_no_print_statement = pgen2_driver.load_grammar(grammar_path) del grammar_no_print_statement.keywords["print"] @@ -49,10 +51,19 @@ def get_refactorer(fixer_pkg="lib2to3", fixers=None, options=None): options = options or {} return refactor.RefactoringTool(fixers, options, explicit=True) -def all_project_files(): - for dirpath, dirnames, filenames in os.walk(proj_dir): +def _all_project_files(root, files): + for dirpath, dirnames, filenames in os.walk(root): for filename in filenames: - if filename.endswith(".py"): - yield os.path.join(dirpath, filename) + if not filename.endswith(".py"): + continue + files.append(os.path.join(dirpath, filename)) + +def all_project_files(): + files = [] + _all_project_files(lib2to3_dir, files) + _all_project_files(test_dir, files) + # Sort to get more reproducible tests + files.sort() + return files TestCase = unittest.TestCase diff --git a/Lib/lib2to3/tests/test_all_fixers.py b/Lib/test/test_lib2to3/test_all_fixers.py similarity index 99% rename from Lib/lib2to3/tests/test_all_fixers.py rename to Lib/test/test_lib2to3/test_all_fixers.py index a265941490d5dc..d0fca7072482cb 100644 --- a/Lib/lib2to3/tests/test_all_fixers.py +++ b/Lib/test/test_lib2to3/test_all_fixers.py @@ -7,7 +7,6 @@ # Python imports import os.path -import sys import test.support import unittest diff --git a/Lib/lib2to3/tests/test_fixers.py b/Lib/test/test_lib2to3/test_fixers.py similarity index 99% rename from Lib/lib2to3/tests/test_fixers.py rename to Lib/test/test_lib2to3/test_fixers.py index 121ebe68e5402b..68efeee7abb4d5 100644 --- a/Lib/lib2to3/tests/test_fixers.py +++ b/Lib/test/test_lib2to3/test_fixers.py @@ -7,7 +7,7 @@ # Local imports from lib2to3 import pygram, fixer_util -from lib2to3.tests import support +from test.test_lib2to3 import support class FixerTestCase(support.TestCase): @@ -1791,7 +1791,7 @@ def f(): class Test_imports(FixerTestCase, ImportsFixerTests): fixer = "imports" - from ..fixes.fix_imports import MAPPING as modules + from lib2to3.fixes.fix_imports import MAPPING as modules def test_multiple_imports(self): b = """import urlparse, cStringIO""" @@ -1812,16 +1812,16 @@ def test_multiple_imports_as(self): class Test_imports2(FixerTestCase, ImportsFixerTests): fixer = "imports2" - from ..fixes.fix_imports2 import MAPPING as modules + from lib2to3.fixes.fix_imports2 import MAPPING as modules class Test_imports_fixer_order(FixerTestCase, ImportsFixerTests): def setUp(self): super(Test_imports_fixer_order, self).setUp(['imports', 'imports2']) - from ..fixes.fix_imports2 import MAPPING as mapping2 + from lib2to3.fixes.fix_imports2 import MAPPING as 
mapping2 self.modules = mapping2.copy() - from ..fixes.fix_imports import MAPPING as mapping1 + from lib2to3.fixes.fix_imports import MAPPING as mapping1 for key in ('dbhash', 'dumbdbm', 'dbm', 'gdbm'): self.modules[key] = mapping1[key] @@ -1833,7 +1833,7 @@ def test_after_local_imports_refactoring(self): class Test_urllib(FixerTestCase): fixer = "urllib" - from ..fixes.fix_urllib import MAPPING as modules + from lib2to3.fixes.fix_urllib import MAPPING as modules def test_import_module(self): for old, changes in self.modules.items(): diff --git a/Lib/lib2to3/tests/test_main.py b/Lib/test/test_lib2to3/test_main.py similarity index 100% rename from Lib/lib2to3/tests/test_main.py rename to Lib/test/test_lib2to3/test_main.py diff --git a/Lib/lib2to3/tests/test_parser.py b/Lib/test/test_lib2to3/test_parser.py similarity index 99% rename from Lib/lib2to3/tests/test_parser.py rename to Lib/test/test_lib2to3/test_parser.py index e2dddbec577c02..2c798b181fdbda 100644 --- a/Lib/lib2to3/tests/test_parser.py +++ b/Lib/test/test_lib2to3/test_parser.py @@ -26,7 +26,7 @@ # Local imports from lib2to3.pgen2 import driver as pgen2_driver from lib2to3.pgen2 import tokenize -from ..pgen2.parse import ParseError +from lib2to3.pgen2.parse import ParseError from lib2to3.pygram import python_symbols as syms @@ -63,7 +63,7 @@ def test_load_grammar_from_pickle(self): @unittest.skipIf(sys.executable is None, 'sys.executable required') @unittest.skipIf( - sys.platform == 'emscripten', 'requires working subprocess' + sys.platform in {'emscripten', 'wasi'}, 'requires working subprocess' ) def test_load_grammar_from_subprocess(self): tmpdir = tempfile.mkdtemp() diff --git a/Lib/lib2to3/tests/test_pytree.py b/Lib/test/test_lib2to3/test_pytree.py similarity index 100% rename from Lib/lib2to3/tests/test_pytree.py rename to Lib/test/test_lib2to3/test_pytree.py diff --git a/Lib/lib2to3/tests/test_refactor.py b/Lib/test/test_lib2to3/test_refactor.py similarity index 100% rename from Lib/lib2to3/tests/test_refactor.py rename to Lib/test/test_lib2to3/test_refactor.py diff --git a/Lib/lib2to3/tests/test_util.py b/Lib/test/test_lib2to3/test_util.py similarity index 100% rename from Lib/lib2to3/tests/test_util.py rename to Lib/test/test_lib2to3/test_util.py diff --git a/Lib/test/test_lltrace.py b/Lib/test/test_lltrace.py index 63b65e4b2f821a..7cf89846f8a727 100644 --- a/Lib/test/test_lltrace.py +++ b/Lib/test/test_lltrace.py @@ -1,9 +1,9 @@ import dis -import sys import textwrap import unittest -from test.support import os_helper, verbose +from test import support +from test.support import os_helper from test.support.script_helper import assert_python_ok def example(): @@ -14,9 +14,8 @@ def example(): y = "an example" print(x, y) -Py_DEBUG = hasattr(sys, 'gettotalrefcount') -@unittest.skipUnless(Py_DEBUG, "lltrace requires Py_DEBUG") +@unittest.skipUnless(support.Py_DEBUG, "lltrace requires Py_DEBUG") class TestLLTrace(unittest.TestCase): def run_code(self, code): @@ -28,7 +27,7 @@ def run_code(self, code): self.assertEqual(stderr, b"") self.assertEqual(status, 0) result = stdout.decode('utf-8') - if verbose: + if support.verbose: print("\n\n--- code ---") print(code) print("\n--- stdout ---") diff --git a/Lib/test/test_locale.py b/Lib/test/test_locale.py index 5cb6edc52d777c..bc8a7a35fbf2dc 100644 --- a/Lib/test/test_locale.py +++ b/Lib/test/test_locale.py @@ -1,5 +1,5 @@ from decimal import Decimal -from test.support import verbose, is_android, is_emscripten +from test.support import verbose, is_android, is_emscripten, is_wasi 
from test.support.warnings_helper import check_warnings import unittest import locale @@ -373,13 +373,19 @@ def setUp(self): @unittest.skipIf(sys.platform.startswith('aix'), 'bpo-29972: broken test on AIX') - @unittest.skipIf(is_emscripten, "musl libc issue on Emscripten, bpo-46390") + @unittest.skipIf( + is_emscripten or is_wasi, + "musl libc issue on Emscripten/WASI, bpo-46390" + ) def test_strcoll_with_diacritic(self): self.assertLess(locale.strcoll('à', 'b'), 0) @unittest.skipIf(sys.platform.startswith('aix'), 'bpo-29972: broken test on AIX') - @unittest.skipIf(is_emscripten, "musl libc issue on Emscripten, bpo-46390") + @unittest.skipIf( + is_emscripten or is_wasi, + "musl libc issue on Emscripten/WASI, bpo-46390" + ) def test_strxfrm_with_diacritic(self): self.assertLess(locale.strxfrm('à'), locale.strxfrm('b')) diff --git a/Lib/test/test_logging.py b/Lib/test/test_logging.py index e69afae484aa77..0aec0728c0a8a0 100644 --- a/Lib/test/test_logging.py +++ b/Lib/test/test_logging.py @@ -1,4 +1,4 @@ -# Copyright 2001-2021 by Vinay Sajip. All Rights Reserved. +# Copyright 2001-2022 by Vinay Sajip. All Rights Reserved. # # Permission to use, copy, modify, and distribute this software and its # documentation for any purpose and without fee is hereby granted, @@ -16,9 +16,8 @@ """Test harness for the logging module. Run all tests. -Copyright (C) 2001-2021 Vinay Sajip. All Rights Reserved. +Copyright (C) 2001-2022 Vinay Sajip. All Rights Reserved. """ - import logging import logging.handlers import logging.config @@ -30,6 +29,7 @@ import pathlib import pickle import io +import itertools import gc import json import os @@ -50,6 +50,7 @@ from test.support.logging_helper import TestHandler import textwrap import threading +import asyncio import time import unittest import warnings @@ -467,6 +468,51 @@ def log_at_all_levels(self, logger): for lvl in LEVEL_RANGE: logger.log(lvl, self.next_message()) + def test_handler_filter_replaces_record(self): + def replace_message(record: logging.LogRecord): + record = copy.copy(record) + record.msg = "new message!" + return record + + # Set up a logging hierarchy such that "child" and it's handler + # (and thus `replace_message()`) always get called before + # propagating up to "parent". + # Then we can confirm that `replace_message()` was able to + # replace the log record without having a side effect on + # other loggers or handlers. + parent = logging.getLogger("parent") + child = logging.getLogger("parent.child") + stream_1 = io.StringIO() + stream_2 = io.StringIO() + handler_1 = logging.StreamHandler(stream_1) + handler_2 = logging.StreamHandler(stream_2) + handler_2.addFilter(replace_message) + parent.addHandler(handler_1) + child.addHandler(handler_2) + + child.info("original message") + handler_1.flush() + handler_2.flush() + self.assertEqual(stream_1.getvalue(), "original message\n") + self.assertEqual(stream_2.getvalue(), "new message!\n") + + def test_logging_filter_replaces_record(self): + records = set() + + class RecordingFilter(logging.Filter): + def filter(self, record: logging.LogRecord): + records.add(id(record)) + return copy.copy(record) + + logger = logging.getLogger("logger") + logger.setLevel(logging.INFO) + logger.addFilter(RecordingFilter()) + logger.addFilter(RecordingFilter()) + + logger.info("msg") + + self.assertEqual(2, len(records)) + def test_logger_filter(self): # Filter at logger level. 
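# Illustrative usage sketch, not from the patch: the user-facing pattern behind the two
# *_replaces_record tests above. Since gh-92592 a logging filter may return a LogRecord,
# and whatever it returns is what downstream handlers and formatters see; redact() and
# the 'demo' logger name are made up for the example.
import copy
import io
import logging

def redact(record: logging.LogRecord) -> logging.LogRecord:
    clone = copy.copy(record)          # leave the original record untouched
    clone.msg = '[redacted]'
    return clone

stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.addFilter(redact)              # plain callables are accepted as filters
demo = logging.getLogger('demo')
demo.propagate = False                 # keep the example self-contained
demo.addHandler(handler)
demo.warning('secret value')
assert stream.getvalue() == '[redacted]\n'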
self.root_logger.setLevel(VERBOSE) @@ -540,6 +586,12 @@ def test_specific_filters(self): handler.removeFilter(garr) +def make_temp_file(*args, **kwargs): + fd, fn = tempfile.mkstemp(*args, **kwargs) + os.close(fd) + return fn + + class HandlerTest(BaseTest): def test_name(self): h = logging.Handler() @@ -554,8 +606,7 @@ def test_builtin_handlers(self): # but we can try instantiating them with various options if sys.platform in ('linux', 'darwin'): for existing in (True, False): - fd, fn = tempfile.mkstemp() - os.close(fd) + fn = make_temp_file() if not existing: os.unlink(fn) h = logging.handlers.WatchedFileHandler(fn, encoding='utf-8', delay=True) @@ -609,8 +660,7 @@ def test_path_objects(self): See Issue #27493. """ - fd, fn = tempfile.mkstemp() - os.close(fd) + fn = make_temp_file() os.unlink(fn) pfn = pathlib.Path(fn) cases = ( @@ -649,8 +699,7 @@ def remove_loop(fname, tries): self.deletion_time = None for delay in (False, True): - fd, fn = tempfile.mkstemp('.log', 'test_logging-3-') - os.close(fd) + fn = make_temp_file('.log', 'test_logging-3-') remover = threading.Thread(target=remove_loop, args=(fn, del_count)) remover.daemon = True remover.start() @@ -1208,6 +1257,9 @@ class ExceptionFormatter(logging.Formatter): def formatException(self, ei): return "Got a [%s]" % ei[0].__name__ +def closeFileHandler(h, fn): + h.close() + os.remove(fn) class ConfigFileTest(BaseTest): @@ -1591,13 +1643,8 @@ def test_config7_ok(self): def test_config8_ok(self): - def cleanup(h1, fn): - h1.close() - os.remove(fn) - with self.check_no_resource_warning(): - fd, fn = tempfile.mkstemp(".log", "test_logging-X-") - os.close(fd) + fn = make_temp_file(".log", "test_logging-X-") # Replace single backslash with double backslash in windows # to avoid unicode error during string formatting @@ -1610,7 +1657,7 @@ def cleanup(h1, fn): self.apply_config(config8) handler = logging.root.handlers[0] - self.addCleanup(cleanup, handler, fn) + self.addCleanup(closeFileHandler, handler, fn) def test_logger_disabling(self): self.apply_config(self.disable_test) @@ -1781,13 +1828,6 @@ def test_noserver(self): time.sleep(self.sock_hdlr.retryTime - now + 0.001) self.root_logger.error('Nor this') -def _get_temp_domain_socket(): - fd, fn = tempfile.mkstemp(prefix='test_logging_', suffix='.sock') - os.close(fd) - # just need a name - file can't be present, or we'll get an - # 'address already in use' error. 
- os.remove(fn) - return fn @unittest.skipUnless(hasattr(socket, "AF_UNIX"), "Unix sockets required") class UnixSocketHandlerTest(SocketHandlerTest): @@ -1799,13 +1839,10 @@ class UnixSocketHandlerTest(SocketHandlerTest): def setUp(self): # override the definition in the base class - self.address = _get_temp_domain_socket() + self.address = socket_helper.create_unix_domain_name() + self.addCleanup(os_helper.unlink, self.address) SocketHandlerTest.setUp(self) - def tearDown(self): - SocketHandlerTest.tearDown(self) - os_helper.unlink(self.address) - @support.requires_working_socket() @threading_helper.requires_working_threading() class DatagramHandlerTest(BaseTest): @@ -1882,13 +1919,10 @@ class UnixDatagramHandlerTest(DatagramHandlerTest): def setUp(self): # override the definition in the base class - self.address = _get_temp_domain_socket() + self.address = socket_helper.create_unix_domain_name() + self.addCleanup(os_helper.unlink, self.address) DatagramHandlerTest.setUp(self) - def tearDown(self): - DatagramHandlerTest.tearDown(self) - os_helper.unlink(self.address) - @support.requires_working_socket() @threading_helper.requires_working_threading() class SysLogHandlerTest(BaseTest): @@ -1919,7 +1953,7 @@ def setUp(self): self.sl_hdlr = hcls((server.server_address[0], server.port)) else: self.sl_hdlr = hcls(server.server_address) - self.log_output = '' + self.log_output = b'' self.root_logger.removeHandler(self.root_logger.handlers[0]) self.root_logger.addHandler(self.sl_hdlr) self.handled = threading.Event() @@ -1976,13 +2010,10 @@ class UnixSysLogHandlerTest(SysLogHandlerTest): def setUp(self): # override the definition in the base class - self.address = _get_temp_domain_socket() + self.address = socket_helper.create_unix_domain_name() + self.addCleanup(os_helper.unlink, self.address) SysLogHandlerTest.setUp(self) - def tearDown(self): - SysLogHandlerTest.tearDown(self) - os_helper.unlink(self.address) - @unittest.skipUnless(socket_helper.IPV6_ENABLED, 'IPv6 support required for this test.') class IPv6SysLogHandlerTest(SysLogHandlerTest): @@ -2135,8 +2166,7 @@ class EncodingTest(BaseTest): def test_encoding_plain_file(self): # In Python 2.x, a plain file object is treated as having no encoding. log = logging.getLogger("test") - fd, fn = tempfile.mkstemp(".log", "test_logging-1-") - os.close(fd) + fn = make_temp_file(".log", "test_logging-1-") # the non-ascii data we write to the log. 
data = "foo\x80" try: @@ -2233,6 +2263,21 @@ def handlerFunc(): class CustomHandler(logging.StreamHandler): pass +class CustomListener(logging.handlers.QueueListener): + pass + +class CustomQueue(queue.Queue): + pass + +def queueMaker(): + return queue.Queue() + +def listenerMaker(arg1, arg2, respect_handler_level=False): + def func(queue, *handlers, **kwargs): + kwargs.setdefault('respect_handler_level', respect_handler_level) + return CustomListener(queue, *handlers, **kwargs) + return func + class ConfigDictTest(BaseTest): """Reading logging config from a dictionary.""" @@ -2836,7 +2881,7 @@ class ConfigDictTest(BaseTest): }, } - out_of_order = { + bad_format = { "version": 1, "formatters": { "mySimpleFormatter": { @@ -2856,7 +2901,7 @@ class ConfigDictTest(BaseTest): "formatter": "mySimpleFormatter", "target": "fileGlobal", "level": "DEBUG" - } + } }, "loggers": { "mymodule": { @@ -2975,13 +3020,36 @@ class ConfigDictTest(BaseTest): } } + config_queue_handler = { + 'version': 1, + 'handlers' : { + 'h1' : { + 'class': 'logging.FileHandler', + }, + # key is before depended on handlers to test that deferred config works + 'ah' : { + 'class': 'logging.handlers.QueueHandler', + 'handlers': ['h1'] + }, + }, + "root": { + "level": "DEBUG", + "handlers": ["ah"] + } + } + def apply_config(self, conf): logging.config.dictConfig(conf) + def check_handler(self, name, cls): + h = logging.getHandlerByName(name) + self.assertIsInstance(h, cls) + def test_config0_ok(self): # A simple config which overrides the default settings. with support.captured_stdout() as output: self.apply_config(self.config0) + self.check_handler('hand1', logging.StreamHandler) logger = logging.getLogger() # Won't output anything logger.info(self.next_message()) @@ -3028,6 +3096,7 @@ def test_config4_ok(self): # A config specifying a custom formatter class. with support.captured_stdout() as output: self.apply_config(self.config4) + self.check_handler('hand1', logging.StreamHandler) #logger = logging.getLogger() try: raise RuntimeError() @@ -3056,6 +3125,7 @@ def test_config4a_ok(self): def test_config5_ok(self): self.test_config1_ok(config=self.config5) + self.check_handler('hand1', CustomHandler) def test_config6_failure(self): self.assertRaises(Exception, self.apply_config, self.config6) @@ -3075,6 +3145,7 @@ def test_config7_ok(self): self.assert_log_lines([]) with support.captured_stdout() as output: self.apply_config(self.config7) + self.check_handler('hand1', logging.StreamHandler) logger = logging.getLogger("compiler.parser") self.assertTrue(logger.disabled) logger = logging.getLogger("compiler.lexer") @@ -3104,6 +3175,7 @@ def test_config_8_ok(self): self.assert_log_lines([]) with support.captured_stdout() as output: self.apply_config(self.config8) + self.check_handler('hand1', logging.StreamHandler) logger = logging.getLogger("compiler.parser") self.assertFalse(logger.disabled) # Both will output a message @@ -3125,6 +3197,7 @@ def test_config_8_ok(self): def test_config_8a_ok(self): with support.captured_stdout() as output: self.apply_config(self.config1a) + self.check_handler('hand1', logging.StreamHandler) logger = logging.getLogger("compiler.parser") # See issue #11424. 
compiler-hyphenated sorts # between compiler and compiler.xyz and this @@ -3145,6 +3218,7 @@ def test_config_8a_ok(self): self.assert_log_lines([]) with support.captured_stdout() as output: self.apply_config(self.config8a) + self.check_handler('hand1', logging.StreamHandler) logger = logging.getLogger("compiler.parser") self.assertFalse(logger.disabled) # Both will output a message @@ -3168,6 +3242,7 @@ def test_config_8a_ok(self): def test_config_9_ok(self): with support.captured_stdout() as output: self.apply_config(self.config9) + self.check_handler('hand1', logging.StreamHandler) logger = logging.getLogger("compiler.parser") # Nothing will be output since both handler and logger are set to WARNING logger.info(self.next_message()) @@ -3186,6 +3261,7 @@ def test_config_9_ok(self): def test_config_10_ok(self): with support.captured_stdout() as output: self.apply_config(self.config10) + self.check_handler('hand1', logging.StreamHandler) logger = logging.getLogger("compiler.parser") logger.warning(self.next_message()) logger = logging.getLogger('compiler') @@ -3222,13 +3298,8 @@ def test_config14_ok(self): def test_config15_ok(self): - def cleanup(h1, fn): - h1.close() - os.remove(fn) - with self.check_no_resource_warning(): - fd, fn = tempfile.mkstemp(".log", "test_logging-X-") - os.close(fd) + fn = make_temp_file(".log", "test_logging-X-") config = { "version": 1, @@ -3248,7 +3319,7 @@ def cleanup(h1, fn): self.apply_config(config) handler = logging.root.handlers[0] - self.addCleanup(cleanup, handler, fn) + self.addCleanup(closeFileHandler, handler, fn) def setup_via_listener(self, text, verify=None): text = text.encode("utf-8") @@ -3282,6 +3353,7 @@ def setup_via_listener(self, text, verify=None): def test_listen_config_10_ok(self): with support.captured_stdout() as output: self.setup_via_listener(json.dumps(self.config10)) + self.check_handler('hand1', logging.StreamHandler) logger = logging.getLogger("compiler.parser") logger.warning(self.next_message()) logger = logging.getLogger('compiler') @@ -3376,11 +3448,11 @@ def verify_reverse(stuff): ('ERROR', '2'), ], pat=r"^[\w.]+ -> (\w+): (\d+)$") - def test_out_of_order(self): - self.assertRaises(ValueError, self.apply_config, self.out_of_order) + def test_bad_format(self): + self.assertRaises(ValueError, self.apply_config, self.bad_format) - def test_out_of_order_with_dollar_style(self): - config = copy.deepcopy(self.out_of_order) + def test_bad_format_with_dollar_style(self): + config = copy.deepcopy(self.bad_format) config['formatters']['mySimpleFormatter']['format'] = "${asctime} (${name}) ${levelname}: ${message}" self.apply_config(config) @@ -3388,6 +3460,8 @@ def test_out_of_order_with_dollar_style(self): self.assertIsInstance(handler.target, logging.Handler) self.assertIsInstance(handler.formatter._style, logging.StringTemplateStyle) + self.assertEqual(sorted(logging.getHandlerNames()), + ['bufferGlobal', 'fileGlobal']) def test_custom_formatter_class_with_validate(self): self.apply_config(self.custom_formatter_class_validate) @@ -3403,7 +3477,7 @@ def test_custom_formatter_class_with_validate2_with_wrong_fmt(self): config = self.custom_formatter_class_validate.copy() config['formatters']['form1']['style'] = "$" - # Exception should not be raise as we have configured 'validate' to False + # Exception should not be raised as we have configured 'validate' to False self.apply_config(config) handler = logging.getLogger("my_test_logger_custom_formatter").handlers[0] self.assertIsInstance(handler.formatter, ExceptionFormatter) @@ 
-3504,6 +3578,75 @@ class NotAFilter: pass {"version": 1, "root": {"level": "DEBUG", "filters": [filter_]}} ) + def do_queuehandler_configuration(self, qspec, lspec): + cd = copy.deepcopy(self.config_queue_handler) + fn = make_temp_file('.log', 'test_logging-cqh-') + cd['handlers']['h1']['filename'] = fn + if qspec is not None: + cd['handlers']['ah']['queue'] = qspec + if lspec is not None: + cd['handlers']['ah']['listener'] = lspec + qh = None + try: + self.apply_config(cd) + qh = logging.getHandlerByName('ah') + self.assertEqual(sorted(logging.getHandlerNames()), ['ah', 'h1']) + self.assertIsNotNone(qh.listener) + qh.listener.start() + logging.debug('foo') + logging.info('bar') + logging.warning('baz') + + # Need to let the listener thread finish its work + while support.sleeping_retry(support.LONG_TIMEOUT, + "queue not empty"): + if qh.listener.queue.empty(): + break + + # wait until the handler completed its last task + qh.listener.queue.join() + + with open(fn, encoding='utf-8') as f: + data = f.read().splitlines() + self.assertEqual(data, ['foo', 'bar', 'baz']) + finally: + if qh: + qh.listener.stop() + h = logging.getHandlerByName('h1') + if h: + self.addCleanup(closeFileHandler, h, fn) + else: + self.addCleanup(os.remove, fn) + + @threading_helper.requires_working_threading() + def test_config_queue_handler(self): + q = CustomQueue() + dq = { + '()': __name__ + '.CustomQueue', + 'maxsize': 10 + } + dl = { + '()': __name__ + '.listenerMaker', + 'arg1': None, + 'arg2': None, + 'respect_handler_level': True + } + qvalues = (None, __name__ + '.queueMaker', __name__ + '.CustomQueue', dq, q) + lvalues = (None, __name__ + '.CustomListener', dl, CustomListener) + for qspec, lspec in itertools.product(qvalues, lvalues): + self.do_queuehandler_configuration(qspec, lspec) + + # Some failure cases + qvalues = (None, 4, int, '', 'foo') + lvalues = (None, 4, int, '', 'bar') + for qspec, lspec in itertools.product(qvalues, lvalues): + if lspec is None and qspec is None: + continue + with self.assertRaises(ValueError) as ctx: + self.do_queuehandler_configuration(qspec, lspec) + msg = str(ctx.exception) + self.assertEqual(msg, "Unable to configure handler 'ah'") + class ManagerTest(BaseTest): def test_manager_loggerclass(self): @@ -4552,29 +4695,65 @@ def test_multiprocessing(self): import multiprocessing def test_optional(self): - r = logging.makeLogRecord({}) + NONE = self.assertIsNone NOT_NONE = self.assertIsNotNone + + r = logging.makeLogRecord({}) NOT_NONE(r.thread) NOT_NONE(r.threadName) NOT_NONE(r.process) NOT_NONE(r.processName) + NONE(r.taskName) log_threads = logging.logThreads log_processes = logging.logProcesses log_multiprocessing = logging.logMultiprocessing + log_asyncio_tasks = logging.logAsyncioTasks try: logging.logThreads = False logging.logProcesses = False logging.logMultiprocessing = False + logging.logAsyncioTasks = False r = logging.makeLogRecord({}) - NONE = self.assertIsNone + NONE(r.thread) NONE(r.threadName) NONE(r.process) NONE(r.processName) + NONE(r.taskName) finally: logging.logThreads = log_threads logging.logProcesses = log_processes logging.logMultiprocessing = log_multiprocessing + logging.logAsyncioTasks = log_asyncio_tasks + + async def _make_record_async(self, assertion): + r = logging.makeLogRecord({}) + assertion(r.taskName) + + @support.requires_working_socket() + def test_taskName_with_asyncio_imported(self): + try: + make_record = self._make_record_async + with asyncio.Runner() as runner: + logging.logAsyncioTasks = True + 
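# Illustrative sketch, not from the patch: a minimal version of the dictConfig wiring that
# test_config_queue_handler above exercises. A 'logging.handlers.QueueHandler' entry may
# list downstream handlers, and the configurator then builds a QueueListener for them,
# which the caller still has to start; 'queued', 'file' and app.log are placeholder names
# for the example.
import logging
import logging.config
import logging.handlers

logging.config.dictConfig({
    'version': 1,
    'handlers': {
        'file': {'class': 'logging.FileHandler', 'filename': 'app.log'},
        'queued': {'class': 'logging.handlers.QueueHandler', 'handlers': ['file']},
    },
    'root': {'level': 'DEBUG', 'handlers': ['queued']},
})

qh = logging.getHandlerByName('queued')   # same accessor used by check_handler() above
qh.listener.start()                       # the listener is created, but not started, for us
logging.warning('hello via the queue')
qh.listener.stop()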
runner.run(make_record(self.assertIsNotNone)) + logging.logAsyncioTasks = False + runner.run(make_record(self.assertIsNone)) + finally: + asyncio.set_event_loop_policy(None) + + @support.requires_working_socket() + def test_taskName_without_asyncio_imported(self): + try: + make_record = self._make_record_async + with asyncio.Runner() as runner, support.swap_item(sys.modules, 'asyncio', None): + logging.logAsyncioTasks = True + runner.run(make_record(self.assertIsNone)) + logging.logAsyncioTasks = False + runner.run(make_record(self.assertIsNone)) + finally: + asyncio.set_event_loop_policy(None) + class BasicConfigTest(unittest.TestCase): @@ -4853,6 +5032,35 @@ def dummy_handle_error(record): # didn't write anything due to the encoding error self.assertEqual(data, r'') + @support.requires_working_socket() + def test_log_taskName(self): + async def log_record(): + logging.warning('hello world') + + handler = None + log_filename = make_temp_file('.log', 'test-logging-taskname-') + self.addCleanup(os.remove, log_filename) + try: + encoding = 'utf-8' + logging.basicConfig(filename=log_filename, errors='strict', + encoding=encoding, level=logging.WARNING, + format='%(taskName)s - %(message)s') + + self.assertEqual(len(logging.root.handlers), 1) + handler = logging.root.handlers[0] + self.assertIsInstance(handler, logging.FileHandler) + + with asyncio.Runner(debug=True) as runner: + logging.logAsyncioTasks = True + runner.run(log_record()) + with open(log_filename, encoding='utf-8') as f: + data = f.read().strip() + self.assertRegex(data, r'Task-\d+ - hello world') + finally: + asyncio.set_event_loop_policy(None) + if handler: + handler.close() + def _test_log(self, method, level=None): # logging.root has no handlers so basicConfig should be called @@ -5236,8 +5444,7 @@ class BaseFileTest(BaseTest): def setUp(self): BaseTest.setUp(self) - fd, self.fn = tempfile.mkstemp(".log", "test_logging-2-") - os.close(fd) + self.fn = make_temp_file(".log", "test_logging-2-") self.rmfiles = [] def tearDown(self): @@ -5280,6 +5487,7 @@ def test_emit_after_closing_in_write_mode(self): self.assertEqual(fp.read().strip(), '1') class RotatingFileHandlerTest(BaseFileTest): + @unittest.skipIf(support.is_wasi, "WASI does not have /dev/null.") def test_should_not_rollover(self): # If maxbytes is zero rollover never occurs rh = logging.handlers.RotatingFileHandler( @@ -5387,6 +5595,7 @@ def rotator(source, dest): rh.close() class TimedRotatingFileHandlerTest(BaseFileTest): + @unittest.skipIf(support.is_wasi, "WASI does not have /dev/null.") def test_should_not_rollover(self): # See bpo-45401. 
Should only ever rollover regular files fh = logging.handlers.TimedRotatingFileHandler( @@ -5642,7 +5851,7 @@ def test__all__(self): 'logThreads', 'logMultiprocessing', 'logProcesses', 'currentframe', 'PercentStyle', 'StrFormatStyle', 'StringTemplateStyle', 'Filterer', 'PlaceHolder', 'Manager', 'RootLogger', 'root', - 'threading'} + 'threading', 'logAsyncioTasks'} support.check__all__(self, logging, not_exported=not_exported) diff --git a/Lib/test/test_mailbox.py b/Lib/test/test_mailbox.py index 20c460e300cc9d..07c2764dfd1b2f 100644 --- a/Lib/test/test_mailbox.py +++ b/Lib/test/test_mailbox.py @@ -10,12 +10,17 @@ import tempfile from test import support from test.support import os_helper +from test.support import socket_helper import unittest import textwrap import mailbox import glob +if not socket_helper.has_gethostname: + raise unittest.SkipTest("test requires gethostname()") + + class TestBase: all_mailbox_types = (mailbox.Message, mailbox.MaildirMessage, diff --git a/Lib/test/test_mailcap.py b/Lib/test/test_mailcap.py index 97a8fac6e074ad..819dc80a266433 100644 --- a/Lib/test/test_mailcap.py +++ b/Lib/test/test_mailcap.py @@ -3,7 +3,6 @@ import sys import test.support import unittest -import warnings from test.support import os_helper from test.support import warnings_helper @@ -128,7 +127,8 @@ def test_subst(self): (["", "audio/*", "foo.txt"], ""), (["echo foo", "audio/*", "foo.txt"], "echo foo"), (["echo %s", "audio/*", "foo.txt"], "echo foo.txt"), - (["echo %t", "audio/*", "foo.txt"], "echo audio/*"), + (["echo %t", "audio/*", "foo.txt"], None), + (["echo %t", "audio/wav", "foo.txt"], "echo audio/wav"), (["echo \\%t", "audio/*", "foo.txt"], "echo %t"), (["echo foo", "audio/*", "foo.txt", plist], "echo foo"), (["echo %{total}", "audio/*", "foo.txt", plist], "echo 3") @@ -212,7 +212,10 @@ def test_findmatch(self): ('"An audio fragment"', audio_basic_entry)), ([c, "audio/*"], {"filename": fname}, - ("/usr/local/bin/showaudio audio/*", audio_entry)), + (None, None)), + ([c, "audio/wav"], + {"filename": fname}, + ("/usr/local/bin/showaudio audio/wav", audio_entry)), ([c, "message/external-body"], {"plist": plist}, ("showexternal /dev/null default john python.org /tmp foo bar", message_entry)) @@ -221,6 +224,10 @@ def test_findmatch(self): @unittest.skipUnless(os.name == "posix", "Requires 'test' command on system") @unittest.skipIf(sys.platform == "vxworks", "'test' command is not supported on VxWorks") + @unittest.skipUnless( + test.support.has_subprocess_support, + "'test' command needs process support." + ) def test_test(self): # findmatch() will automatically check any "test" conditions and skip # the entry if the check fails. diff --git a/Lib/test/test_marshal.py b/Lib/test/test_marshal.py index aae86cc257d7e1..882a819ca8090f 100644 --- a/Lib/test/test_marshal.py +++ b/Lib/test/test_marshal.py @@ -256,7 +256,7 @@ def test_recursion_limit(self): # The max stack depth should match the value in Python/marshal.c. 
# BUG: https://bugs.python.org/issue33720 # Windows always limits the maximum depth on release and debug builds - #if os.name == 'nt' and hasattr(sys, 'gettotalrefcount'): + #if os.name == 'nt' and support.Py_DEBUG: if os.name == 'nt': MAX_MARSHAL_STACK_DEPTH = 1000 else: diff --git a/Lib/test/test_memoryview.py b/Lib/test/test_memoryview.py index d7e3f0c0effa69..9d1e1f3063ca55 100644 --- a/Lib/test/test_memoryview.py +++ b/Lib/test/test_memoryview.py @@ -545,6 +545,107 @@ def test_pickle(self): with self.assertRaises(TypeError): pickle.dumps(m, proto) + def test_use_released_memory(self): + # gh-92888: Previously it was possible to use a memoryview even after + # backing buffer is freed in certain cases. This tests that those + # cases raise an exception. + size = 128 + def release(): + m.release() + nonlocal ba + ba = bytearray(size) + class MyIndex: + def __index__(self): + release() + return 4 + class MyFloat: + def __float__(self): + release() + return 4.25 + class MyBool: + def __bool__(self): + release() + return True + + ba = None + m = memoryview(bytearray(b'\xff'*size)) + with self.assertRaises(ValueError): + m[MyIndex()] + + ba = None + m = memoryview(bytearray(b'\xff'*size)) + self.assertEqual(list(m[:MyIndex()]), [255] * 4) + + ba = None + m = memoryview(bytearray(b'\xff'*size)) + self.assertEqual(list(m[MyIndex():8]), [255] * 4) + + ba = None + m = memoryview(bytearray(b'\xff'*size)).cast('B', (64, 2)) + with self.assertRaisesRegex(ValueError, "operation forbidden"): + m[MyIndex(), 0] + + ba = None + m = memoryview(bytearray(b'\xff'*size)).cast('B', (2, 64)) + with self.assertRaisesRegex(ValueError, "operation forbidden"): + m[0, MyIndex()] + + ba = None + m = memoryview(bytearray(b'\xff'*size)) + with self.assertRaisesRegex(ValueError, "operation forbidden"): + m[MyIndex()] = 42 + self.assertEqual(ba[:8], b'\0'*8) + + ba = None + m = memoryview(bytearray(b'\xff'*size)) + with self.assertRaisesRegex(ValueError, "operation forbidden"): + m[:MyIndex()] = b'spam' + self.assertEqual(ba[:8], b'\0'*8) + + ba = None + m = memoryview(bytearray(b'\xff'*size)) + with self.assertRaisesRegex(ValueError, "operation forbidden"): + m[MyIndex():8] = b'spam' + self.assertEqual(ba[:8], b'\0'*8) + + ba = None + m = memoryview(bytearray(b'\xff'*size)).cast('B', (64, 2)) + with self.assertRaisesRegex(ValueError, "operation forbidden"): + m[MyIndex(), 0] = 42 + self.assertEqual(ba[8:16], b'\0'*8) + ba = None + m = memoryview(bytearray(b'\xff'*size)).cast('B', (2, 64)) + with self.assertRaisesRegex(ValueError, "operation forbidden"): + m[0, MyIndex()] = 42 + self.assertEqual(ba[:8], b'\0'*8) + + ba = None + m = memoryview(bytearray(b'\xff'*size)) + with self.assertRaisesRegex(ValueError, "operation forbidden"): + m[0] = MyIndex() + self.assertEqual(ba[:8], b'\0'*8) + + for fmt in 'bhilqnBHILQN': + with self.subTest(fmt=fmt): + ba = None + m = memoryview(bytearray(b'\xff'*size)).cast(fmt) + with self.assertRaisesRegex(ValueError, "operation forbidden"): + m[0] = MyIndex() + self.assertEqual(ba[:8], b'\0'*8) + + for fmt in 'fd': + with self.subTest(fmt=fmt): + ba = None + m = memoryview(bytearray(b'\xff'*size)).cast(fmt) + with self.assertRaisesRegex(ValueError, "operation forbidden"): + m[0] = MyFloat() + self.assertEqual(ba[:8], b'\0'*8) + + ba = None + m = memoryview(bytearray(b'\xff'*size)).cast('?') + with self.assertRaisesRegex(ValueError, "operation forbidden"): + m[0] = MyBool() + self.assertEqual(ba[:8], b'\0'*8) if __name__ == "__main__": unittest.main() diff --git 
a/Lib/test/test_modulefinder.py b/Lib/test/test_modulefinder.py index ca1058b8d4087c..b64e684f80599f 100644 --- a/Lib/test/test_modulefinder.py +++ b/Lib/test/test_modulefinder.py @@ -10,9 +10,6 @@ import modulefinder -TEST_DIR = tempfile.mkdtemp() -TEST_PATH = [TEST_DIR, os.path.dirname(tempfile.__file__)] - # Each test description is a list of 5 items: # # 1. a module name that will be imported by modulefinder @@ -23,9 +20,9 @@ # about because they MAY be not found # 5. a string specifying packages to create; the format is obvious imo. # -# Each package will be created in TEST_DIR, and TEST_DIR will be +# Each package will be created in test_dir, and test_dir will be # removed after the tests again. -# Modulefinder searches in a path that contains TEST_DIR, plus +# Modulefinder searches in a path that contains test_dir, plus # the standard Lib directory. maybe_test = [ @@ -300,7 +297,7 @@ def open_file(path): return open(path, 'wb') -def create_package(source): +def create_package(test_dir, source): ofi = None try: for line in source.splitlines(): @@ -313,41 +310,45 @@ def create_package(source): ofi.close() if type(line) == bytes: line = line.decode('utf-8') - ofi = open_file(os.path.join(TEST_DIR, line.strip())) + ofi = open_file(os.path.join(test_dir, line.strip())) finally: if ofi: ofi.close() class ModuleFinderTest(unittest.TestCase): + def setUp(self): + self.test_dir = tempfile.mkdtemp() + self.test_path = [self.test_dir, os.path.dirname(tempfile.__file__)] + + def tearDown(self): + shutil.rmtree(self.test_dir) + def _do_test(self, info, report=False, debug=0, replace_paths=[], modulefinder_class=modulefinder.ModuleFinder): import_this, modules, missing, maybe_missing, source = info - create_package(source) - try: - mf = modulefinder_class(path=TEST_PATH, debug=debug, - replace_paths=replace_paths) - mf.import_hook(import_this) - if report: - mf.report() -## # This wouldn't work in general when executed several times: -## opath = sys.path[:] -## sys.path = TEST_PATH -## try: -## __import__(import_this) -## except: -## import traceback; traceback.print_exc() -## sys.path = opath -## return - modules = sorted(set(modules)) - found = sorted(mf.modules) - # check if we found what we expected, not more, not less - self.assertEqual(found, modules) - - # check for missing and maybe missing modules - bad, maybe = mf.any_missing_maybe() - self.assertEqual(bad, missing) - self.assertEqual(maybe, maybe_missing) - finally: - shutil.rmtree(TEST_DIR) + create_package(self.test_dir, source) + mf = modulefinder_class(path=self.test_path, debug=debug, + replace_paths=replace_paths) + mf.import_hook(import_this) + if report: + mf.report() +## # This wouldn't work in general when executed several times: +## opath = sys.path[:] +## sys.path = self.test_path +## try: +## __import__(import_this) +## except: +## import traceback; traceback.print_exc() +## sys.path = opath +## return + modules = sorted(set(modules)) + found = sorted(mf.modules) + # check if we found what we expected, not more, not less + self.assertEqual(found, modules) + + # check for missing and maybe missing modules + bad, maybe = mf.any_missing_maybe() + self.assertEqual(bad, missing) + self.assertEqual(maybe, maybe_missing) def test_package(self): self._do_test(package_test) @@ -380,7 +381,7 @@ def test_same_name_as_bad(self): self._do_test(same_name_as_bad_test) def test_bytecode(self): - base_path = os.path.join(TEST_DIR, 'a') + base_path = os.path.join(self.test_dir, 'a') source_path = base_path + 
importlib.machinery.SOURCE_SUFFIXES[0] bytecode_path = base_path + importlib.machinery.BYTECODE_SUFFIXES[0] with open_file(source_path) as file: @@ -390,8 +391,8 @@ def test_bytecode(self): self._do_test(bytecode_test) def test_replace_paths(self): - old_path = os.path.join(TEST_DIR, 'a', 'module.py') - new_path = os.path.join(TEST_DIR, 'a', 'spam.py') + old_path = os.path.join(self.test_dir, 'a', 'module.py') + new_path = os.path.join(self.test_dir, 'a', 'spam.py') with support.captured_stdout() as output: self._do_test(maybe_test, debug=2, replace_paths=[(old_path, new_path)]) diff --git a/Lib/test/test_multiprocessing_main_handling.py b/Lib/test/test_multiprocessing_main_handling.py index 510d8d3a7597e1..6b30a89316703b 100644 --- a/Lib/test/test_multiprocessing_main_handling.py +++ b/Lib/test/test_multiprocessing_main_handling.py @@ -40,6 +40,7 @@ import sys import time from multiprocessing import Pool, set_start_method +from test import support # We use this __main__ defined function in the map call below in order to # check that multiprocessing in correctly running the unguarded @@ -59,13 +60,12 @@ def f(x): results = [] with Pool(5) as pool: pool.map_async(f, [1, 2, 3], callback=results.extend) - start_time = time.monotonic() - while not results: - time.sleep(0.05) - # up to 1 min to report the results - dt = time.monotonic() - start_time - if dt > 60.0: - raise RuntimeError("Timed out waiting for results (%.1f sec)" % dt) + + # up to 1 min to report the results + for _ in support.sleeping_retry(support.LONG_TIMEOUT, + "Timed out waiting for results"): + if results: + break results.sort() print(start_method, "->", results) @@ -86,19 +86,18 @@ def f(x): import sys import time from multiprocessing import Pool, set_start_method +from test import support start_method = sys.argv[1] set_start_method(start_method) results = [] with Pool(5) as pool: pool.map_async(int, [1, 4, 9], callback=results.extend) - start_time = time.monotonic() - while not results: - time.sleep(0.05) - # up to 1 min to report the results - dt = time.monotonic() - start_time - if dt > 60.0: - raise RuntimeError("Timed out waiting for results (%.1f sec)" % dt) + # up to 1 min to report the results + for _ in support.sleeping_retry(support.LONG_TIMEOUT, + "Timed out waiting for results"): + if results: + break results.sort() print(start_method, "->", results) diff --git a/Lib/test/test_netrc.py b/Lib/test/test_netrc.py index 3cca1e8ff1ac1a..573d636de956d1 100644 --- a/Lib/test/test_netrc.py +++ b/Lib/test/test_netrc.py @@ -1,6 +1,11 @@ import netrc, os, unittest, sys, textwrap from test.support import os_helper, run_unittest +try: + import pwd +except ImportError: + pwd = None + temp_filename = os_helper.TESTFN class NetrcTestCase(unittest.TestCase): @@ -266,6 +271,8 @@ def test_comment_at_end_of_machine_line_pass_has_hash(self): @unittest.skipUnless(os.name == 'posix', 'POSIX only test') + @unittest.skipIf(pwd is None, 'security check requires pwd module') + @os_helper.skip_unless_working_chmod def test_security(self): # This test is incomplete since we are normally not run as root and # therefore can't test the file ownership being wrong. 
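The rewritten wait loops above use test.support.sleeping_retry(), a CPython test helper that sleeps between attempts until a timeout expires. A minimal plain-Python sketch of the polling pattern it replaces (the wait_for() helper and the sample condition below are illustrative, not part of the patch):

import time

def wait_for(condition, timeout=60.0, interval=0.05):
    # Poll `condition` until it returns true, sleeping briefly between
    # attempts, and give up with an error once `timeout` seconds have elapsed.
    deadline = time.monotonic() + timeout
    while not condition():
        if time.monotonic() > deadline:
            raise RuntimeError("Timed out waiting for results")
        time.sleep(interval)

results = []
results.append(1)  # stands in for work completed by another process or thread
wait_for(lambda: bool(results))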
diff --git a/Lib/test/test_nis.py b/Lib/test/test_nis.py index 797c9023ad34d4..f327ecf3d1de45 100644 --- a/Lib/test/test_nis.py +++ b/Lib/test/test_nis.py @@ -1,4 +1,3 @@ -from test import support from test.support import import_helper import unittest import warnings diff --git a/Lib/test/test_ntpath.py b/Lib/test/test_ntpath.py index 7211ed861762b4..d51946322c8056 100644 --- a/Lib/test/test_ntpath.py +++ b/Lib/test/test_ntpath.py @@ -117,6 +117,31 @@ def test_splitdrive(self): # Issue #19911: UNC part containing U+0130 self.assertEqual(ntpath.splitdrive('//conky/MOUNTPOİNT/foo/bar'), ('//conky/MOUNTPOİNT', '/foo/bar')) + # gh-81790: support device namespace, including UNC drives. + tester('ntpath.splitdrive("//?/c:")', ("//?/c:", "")) + tester('ntpath.splitdrive("//?/c:/")', ("//?/c:", "/")) + tester('ntpath.splitdrive("//?/c:/dir")', ("//?/c:", "/dir")) + tester('ntpath.splitdrive("//?/UNC")', ("", "//?/UNC")) + tester('ntpath.splitdrive("//?/UNC/")', ("", "//?/UNC/")) + tester('ntpath.splitdrive("//?/UNC/server/")', ("//?/UNC/server/", "")) + tester('ntpath.splitdrive("//?/UNC/server/share")', ("//?/UNC/server/share", "")) + tester('ntpath.splitdrive("//?/UNC/server/share/dir")', ("//?/UNC/server/share", "/dir")) + tester('ntpath.splitdrive("//?/VOLUME{00000000-0000-0000-0000-000000000000}/spam")', + ('//?/VOLUME{00000000-0000-0000-0000-000000000000}', '/spam')) + tester('ntpath.splitdrive("//?/BootPartition/")', ("//?/BootPartition", "/")) + + tester('ntpath.splitdrive("\\\\?\\c:")', ("\\\\?\\c:", "")) + tester('ntpath.splitdrive("\\\\?\\c:\\")', ("\\\\?\\c:", "\\")) + tester('ntpath.splitdrive("\\\\?\\c:\\dir")', ("\\\\?\\c:", "\\dir")) + tester('ntpath.splitdrive("\\\\?\\UNC")', ("", "\\\\?\\UNC")) + tester('ntpath.splitdrive("\\\\?\\UNC\\")', ("", "\\\\?\\UNC\\")) + tester('ntpath.splitdrive("\\\\?\\UNC\\server\\")', ("\\\\?\\UNC\\server\\", "")) + tester('ntpath.splitdrive("\\\\?\\UNC\\server\\share")', ("\\\\?\\UNC\\server\\share", "")) + tester('ntpath.splitdrive("\\\\?\\UNC\\server\\share\\dir")', + ("\\\\?\\UNC\\server\\share", "\\dir")) + tester('ntpath.splitdrive("\\\\?\\VOLUME{00000000-0000-0000-0000-000000000000}\\spam")', + ('\\\\?\\VOLUME{00000000-0000-0000-0000-000000000000}', '\\spam')) + tester('ntpath.splitdrive("\\\\?\\BootPartition\\")', ("\\\\?\\BootPartition", "\\")) def test_split(self): tester('ntpath.split("c:\\foo\\bar")', ('c:\\foo', 'bar')) @@ -852,6 +877,8 @@ def _check_function(self, func): def test_path_normcase(self): self._check_function(self.path.normcase) + if sys.platform == 'win32': + self.assertEqual(ntpath.normcase('\u03a9\u2126'), 'ωΩ') def test_path_isabs(self): self._check_function(self.path.isabs) diff --git a/Lib/test/test_os.py b/Lib/test/test_os.py index 36ad587760d701..29f69a8f475b20 100644 --- a/Lib/test/test_os.py +++ b/Lib/test/test_os.py @@ -11,7 +11,6 @@ import fractions import itertools import locale -import mmap import os import pickle import select @@ -59,6 +58,10 @@ except ImportError: INT_MAX = PY_SSIZE_T_MAX = sys.maxsize +try: + import mmap +except ImportError: + mmap = None from test.support.script_helper import assert_python_ok from test.support import unix_shell @@ -183,6 +186,9 @@ def test_access(self): @unittest.skipIf( support.is_emscripten, "Test is unstable under Emscripten." ) + @unittest.skipIf( + support.is_wasi, "WASI does not support dup." 
+ ) def test_closerange(self): first = os.open(os_helper.TESTFN, os.O_CREAT|os.O_RDWR) # We must allocate two consecutive file descriptors, otherwise @@ -1585,7 +1591,10 @@ def test_makedir(self): 'dir5', 'dir6') os.makedirs(path) - @unittest.skipIf(support.is_emscripten, "Emscripten's umask is a stub.") + @unittest.skipIf( + support.is_emscripten or support.is_wasi, + "Emscripten's/WASI's umask is a stub." + ) def test_mode(self): with os_helper.temp_umask(0o002): base = os_helper.TESTFN @@ -1661,7 +1670,7 @@ def tearDown(self): os.removedirs(path) -@unittest.skipUnless(hasattr(os, 'chown'), "Test needs chown") +@os_helper.skip_unless_working_chmod class ChownFileTests(unittest.TestCase): @classmethod @@ -1762,6 +1771,7 @@ def test_remove_nothing(self): self.assertTrue(os.path.exists(os_helper.TESTFN)) +@unittest.skipIf(support.is_wasi, "WASI has no /dev/null") class DevNullTests(unittest.TestCase): def test_devnull(self): with open(os.devnull, 'wb', 0) as f: @@ -2108,6 +2118,7 @@ def test_chmod(self): self.assertRaises(OSError, os.chmod, os_helper.TESTFN, 0) +@unittest.skipIf(support.is_wasi, "Cannot create invalid FD on WASI.") class TestInvalidFD(unittest.TestCase): singles = ["fchdir", "dup", "fdatasync", "fstat", "fstatvfs", "fsync", "tcgetpgrp", "ttyname"] @@ -2167,7 +2178,8 @@ def test_fchown(self): @unittest.skipUnless(hasattr(os, 'fpathconf'), 'test needs os.fpathconf()') @unittest.skipIf( - support.is_emscripten, "musl libc issue on Emscripten, bpo-46390" + support.is_emscripten or support.is_wasi, + "musl libc issue on Emscripten/WASI, bpo-46390" ) def test_fpathconf(self): self.check(os.pathconf, "PC_NAME_MAX") @@ -2460,6 +2472,7 @@ def test_kill_int(self): # os.kill on Windows can take an int which gets set as the exit code self._kill(100) + @unittest.skipIf(mmap is None, "requires mmap") def _kill_with_event(self, event, name): tagname = "test_os_%s" % uuid.uuid1() m = mmap.mmap(-1, 1, tagname) @@ -3771,7 +3784,6 @@ class Str(str): def test_oserror_filename(self): funcs = [ (self.filenames, os.chdir,), - (self.filenames, os.chmod, 0o777), (self.filenames, os.lstat,), (self.filenames, os.open, os.O_RDONLY), (self.filenames, os.rmdir,), @@ -3792,6 +3804,8 @@ def test_oserror_filename(self): (self.filenames, os.rename, "dst"), (self.filenames, os.replace, "dst"), )) + if os_helper.can_chmod(): + funcs.append((self.filenames, os.chmod, 0o777)) if hasattr(os, "chown"): funcs.append((self.filenames, os.chown, 0, 0)) if hasattr(os, "lchown"): @@ -3980,7 +3994,7 @@ class PathTConverterTests(unittest.TestCase): ('access', False, (os.F_OK,), None), ('chflags', False, (0,), None), ('lchflags', False, (0,), None), - ('open', False, (0,), getattr(os, 'close', None)), + ('open', False, (os.O_RDONLY,), getattr(os, 'close', None)), ] def test_path_t_converter(self): @@ -4352,6 +4366,7 @@ def test_fd(self): st = os.stat(entry.name, dir_fd=fd, follow_symlinks=False) self.assertEqual(entry.stat(follow_symlinks=False), st) + @unittest.skipIf(support.is_wasi, "WASI maps '' to cwd") def test_empty_path(self): self.assertRaises(FileNotFoundError, os.scandir, '') diff --git a/Lib/test/test_pathlib.py b/Lib/test/test_pathlib.py index 964cc85d53153b..a42619853399f2 100644 --- a/Lib/test/test_pathlib.py +++ b/Lib/test/test_pathlib.py @@ -13,7 +13,7 @@ from unittest import mock from test.support import import_helper -from test.support import is_emscripten +from test.support import is_emscripten, is_wasi from test.support import os_helper from test.support.os_helper import TESTFN, FakePath @@ -465,6 
+465,9 @@ def test_parents_common(self): self.assertEqual(par[0], P('/a/b')) self.assertEqual(par[1], P('/a')) self.assertEqual(par[2], P('/')) + self.assertEqual(par[-1], P('/')) + self.assertEqual(par[-2], P('/a')) + self.assertEqual(par[-3], P('/a/b')) self.assertEqual(par[0:1], (P('/a/b'),)) self.assertEqual(par[:2], (P('/a/b'), P('/a'))) self.assertEqual(par[:-1], (P('/a/b'), P('/a'))) @@ -472,6 +475,8 @@ def test_parents_common(self): self.assertEqual(par[::2], (P('/a/b'), P('/'))) self.assertEqual(par[::-1], (P('/'), P('/a'), P('/a/b'))) self.assertEqual(list(par), [P('/a/b'), P('/a'), P('/')]) + with self.assertRaises(IndexError): + par[-4] with self.assertRaises(IndexError): par[3] @@ -1530,6 +1535,7 @@ def test_empty_path(self): p = self.cls('') self.assertEqual(p.stat(), os.stat('.')) + @unittest.skipIf(is_wasi, "WASI has no user accounts.") def test_expanduser_common(self): P = self.cls p = P('~') @@ -1896,6 +1902,7 @@ def test_with(self): with p: pass + @os_helper.skip_unless_working_chmod def test_chmod(self): p = self.cls(BASE) / 'fileA' mode = p.stat().st_mode @@ -1910,6 +1917,7 @@ def test_chmod(self): # On Windows, os.chmod does not follow symlinks (issue #15411) @only_posix + @os_helper.skip_unless_working_chmod def test_chmod_follow_symlinks_true(self): p = self.cls(BASE) / 'linkA' q = p.resolve() @@ -1925,6 +1933,7 @@ def test_chmod_follow_symlinks_true(self): # XXX also need a test for lchmod. + @os_helper.skip_unless_working_chmod def test_stat(self): p = self.cls(BASE) / 'fileA' st = p.stat() @@ -2352,6 +2361,9 @@ def test_is_socket_false(self): @unittest.skipIf( is_emscripten, "Unix sockets are not implemented on Emscripten." ) + @unittest.skipIf( + is_wasi, "Cannot create socket on WASI." + ) def test_is_socket_true(self): P = self.cls(BASE, 'mysock') sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) @@ -2508,7 +2520,8 @@ def _check_symlink_loop(self, *args, strict=True): print(path.resolve(strict)) @unittest.skipIf( - is_emscripten, "umask is not implemented on Emscripten." + is_emscripten or is_wasi, + "umask is not implemented on Emscripten/WASI." ) def test_open_mode(self): old_mask = os.umask(0) @@ -2534,7 +2547,8 @@ def test_resolve_root(self): os.chdir(current_directory) @unittest.skipIf( - is_emscripten, "umask is not implemented on Emscripten." + is_emscripten or is_wasi, + "umask is not implemented on Emscripten/WASI." ) def test_touch_mode(self): old_mask = os.umask(0) diff --git a/Lib/test/test_patma.py b/Lib/test/test_patma.py index 57d3b1ec701ca4..db198f77157831 100644 --- a/Lib/test/test_patma.py +++ b/Lib/test/test_patma.py @@ -3151,6 +3151,27 @@ def f(command): # 0 self.assertListEqual(self._trace(f, "go x"), [1, 2, 3]) self.assertListEqual(self._trace(f, "spam"), [1, 2, 3]) + def test_parser_deeply_nested_patterns(self): + # Deeply nested patterns can cause exponential backtracking when parsing. + # See gh-93671 for more information. 
+ + levels = 100 + + patterns = [ + "A" + "(" * levels + ")" * levels, + "{1:" * levels + "1" + "}" * levels, + "[" * levels + "1" + "]" * levels, + ] + + for pattern in patterns: + with self.subTest(pattern): + code = inspect.cleandoc(""" + match None: + case {}: + pass + """.format(pattern)) + compile(code, "", "exec") + if __name__ == "__main__": """ diff --git a/Lib/test/test_pdb.py b/Lib/test/test_pdb.py index 0141b739c25440..f0c15e7b9c5363 100644 --- a/Lib/test/test_pdb.py +++ b/Lib/test/test_pdb.py @@ -1363,7 +1363,50 @@ def test_pdb_issue_43318(): 4 """ +def test_pdb_issue_gh_91742(): + """See GH-91742 + >>> def test_function(): + ... __author__ = "pi" + ... __version__ = "3.14" + ... + ... def about(): + ... '''About''' + ... print(f"Author: {__author__!r}", + ... f"Version: {__version__!r}", + ... sep=" ") + ... + ... import pdb; pdb.Pdb(nosigint=True, readrc=False).set_trace() + ... about() + + + >>> reset_Breakpoint() + >>> with PdbTestInput([ # doctest: +NORMALIZE_WHITESPACE + ... 'step', + ... 'next', + ... 'next', + ... 'jump 5', + ... 'continue' + ... ]): + ... test_function() + > (12)test_function() + -> about() + (Pdb) step + --Call-- + > (5)about() + -> def about(): + (Pdb) next + > (7)about() + -> print(f"Author: {__author__!r}", + (Pdb) next + > (8)about() + -> f"Version: {__version__!r}", + (Pdb) jump 5 + > (5)about() + -> def about(): + (Pdb) continue + Author: 'pi' Version: '3.14' + """ @support.requires_subprocess() class PdbTestCase(unittest.TestCase): def tearDown(self): diff --git a/Lib/test/test_peepholer.py b/Lib/test/test_peepholer.py index 6a2f550d67052a..2c3b1ab65a8f9d 100644 --- a/Lib/test/test_peepholer.py +++ b/Lib/test/test_peepholer.py @@ -1,5 +1,6 @@ import dis from itertools import combinations, product +import sys import textwrap import unittest @@ -43,11 +44,11 @@ def check_jump_targets(self, code): continue tgt = targets[instr.argval] # jump to unconditional jump - if tgt.opname in ('JUMP_ABSOLUTE', 'JUMP_FORWARD'): + if tgt.opname in ('JUMP_BACKWARD', 'JUMP_FORWARD'): self.fail(f'{instr.opname} at {instr.offset} ' f'jumps to {tgt.opname} at {tgt.offset}') # unconditional jump to RETURN_VALUE - if (instr.opname in ('JUMP_ABSOLUTE', 'JUMP_FORWARD') and + if (instr.opname in ('JUMP_BACKWARD', 'JUMP_FORWARD') and tgt.opname == 'RETURN_VALUE'): self.fail(f'{instr.opname} at {instr.offset} ' f'jumps to {tgt.opname} at {tgt.offset}') @@ -352,7 +353,7 @@ def f(cond, true_value, false_value): else false_value) self.check_jump_targets(f) self.assertNotInBytecode(f, 'JUMP_FORWARD') - self.assertNotInBytecode(f, 'JUMP_ABSOLUTE') + self.assertNotInBytecode(f, 'JUMP_BACKWARD') returns = [instr for instr in dis.get_instructions(f) if instr.opname == 'RETURN_VALUE'] self.assertEqual(len(returns), 2) @@ -372,7 +373,7 @@ def f(): self.check_lnotab(f) def test_elim_jump_to_uncond_jump2(self): - # POP_JUMP_IF_FALSE to JUMP_ABSOLUTE --> POP_JUMP_IF_FALSE to non-jump + # POP_JUMP_IF_FALSE to JUMP_BACKWARD --> POP_JUMP_IF_FALSE to non-jump def f(): while a: # Intentionally use two-line expression to test issue37213. 
@@ -417,6 +418,13 @@ def f(a, b, c): self.assertInBytecode(f, 'JUMP_IF_FALSE_OR_POP') self.assertInBytecode(f, 'POP_JUMP_FORWARD_IF_TRUE') + def test_elim_jump_to_uncond_jump4(self): + def f(): + for i in range(5): + if i > 3: + print(i) + self.check_jump_targets(f) + def test_elim_jump_after_return1(self): # Eliminate dead code: jumps immediately after returns can't be reached def f(cond1, cond2): @@ -429,7 +437,7 @@ def f(cond1, cond2): return 5 return 6 self.assertNotInBytecode(f, 'JUMP_FORWARD') - self.assertNotInBytecode(f, 'JUMP_ABSOLUTE') + self.assertNotInBytecode(f, 'JUMP_BACKWARD') returns = [instr for instr in dis.get_instructions(f) if instr.opname == 'RETURN_VALUE'] self.assertLessEqual(len(returns), 6) @@ -675,5 +683,184 @@ def test_bpo_45773_pop_jump_if_false(self): compile("while True or not spam: pass", "", "exec") +class TestMarkingVariablesAsUnKnown(BytecodeTestCase): + + def setUp(self): + self.addCleanup(sys.settrace, sys.gettrace()) + sys.settrace(None) + + def test_load_fast_known_simple(self): + def f(): + x = 1 + y = x + x + self.assertInBytecode(f, 'LOAD_FAST') + + def test_load_fast_unknown_simple(self): + def f(): + if condition(): + x = 1 + print(x) + self.assertInBytecode(f, 'LOAD_FAST_CHECK') + self.assertNotInBytecode(f, 'LOAD_FAST') + + def test_load_fast_unknown_because_del(self): + def f(): + x = 1 + del x + print(x) + self.assertInBytecode(f, 'LOAD_FAST_CHECK') + self.assertNotInBytecode(f, 'LOAD_FAST') + + def test_load_fast_known_because_parameter(self): + def f1(x): + print(x) + self.assertInBytecode(f1, 'LOAD_FAST') + self.assertNotInBytecode(f1, 'LOAD_FAST_CHECK') + + def f2(*, x): + print(x) + self.assertInBytecode(f2, 'LOAD_FAST') + self.assertNotInBytecode(f2, 'LOAD_FAST_CHECK') + + def f3(*args): + print(args) + self.assertInBytecode(f3, 'LOAD_FAST') + self.assertNotInBytecode(f3, 'LOAD_FAST_CHECK') + + def f4(**kwargs): + print(kwargs) + self.assertInBytecode(f4, 'LOAD_FAST') + self.assertNotInBytecode(f4, 'LOAD_FAST_CHECK') + + def f5(x=0): + print(x) + self.assertInBytecode(f5, 'LOAD_FAST') + self.assertNotInBytecode(f5, 'LOAD_FAST_CHECK') + + def test_load_fast_known_because_already_loaded(self): + def f(): + if condition(): + x = 1 + print(x) + print(x) + self.assertInBytecode(f, 'LOAD_FAST_CHECK') + self.assertInBytecode(f, 'LOAD_FAST') + + def test_load_fast_known_multiple_branches(self): + def f(): + if condition(): + x = 1 + else: + x = 2 + print(x) + self.assertInBytecode(f, 'LOAD_FAST') + self.assertNotInBytecode(f, 'LOAD_FAST_CHECK') + + def test_load_fast_unknown_after_error(self): + def f(): + try: + res = 1 / 0 + except ZeroDivisionError: + pass + return res + # LOAD_FAST (known) still occurs in the no-exception branch. + # Assert that it doesn't occur in the LOAD_FAST_CHECK branch. 
+ self.assertInBytecode(f, 'LOAD_FAST_CHECK') + + def test_load_fast_unknown_after_error_2(self): + def f(): + try: + 1 / 0 + except: + print(a, b, c, d, e, f, g) + a = b = c = d = e = f = g = 1 + self.assertInBytecode(f, 'LOAD_FAST_CHECK') + self.assertNotInBytecode(f, 'LOAD_FAST') + + def test_setting_lineno_adds_check(self): + code = textwrap.dedent("""\ + def f(): + x = 2 + L = 3 + L = 4 + for i in range(55): + x + 6 + del x + L = 8 + L = 9 + L = 10 + """) + ns = {} + exec(code, ns) + f = ns['f'] + self.assertInBytecode(f, "LOAD_FAST") + def trace(frame, event, arg): + if event == 'line' and frame.f_lineno == 9: + frame.f_lineno = 2 + sys.settrace(None) + return None + return trace + sys.settrace(trace) + f() + self.assertNotInBytecode(f, "LOAD_FAST") + + def make_function_with_no_checks(self): + code = textwrap.dedent("""\ + def f(): + x = 2 + L = 3 + L = 4 + L = 5 + if not L: + x + 7 + y = 2 + """) + ns = {} + exec(code, ns) + f = ns['f'] + self.assertInBytecode(f, "LOAD_FAST") + self.assertNotInBytecode(f, "LOAD_FAST_CHECK") + return f + + def test_deleting_local_adds_check(self): + f = self.make_function_with_no_checks() + def trace(frame, event, arg): + if event == 'line' and frame.f_lineno == 4: + del frame.f_locals["x"] + sys.settrace(None) + return None + return trace + sys.settrace(trace) + f() + self.assertNotInBytecode(f, "LOAD_FAST") + self.assertInBytecode(f, "LOAD_FAST_CHECK") + + def test_modifying_local_does_not_add_check(self): + f = self.make_function_with_no_checks() + def trace(frame, event, arg): + if event == 'line' and frame.f_lineno == 4: + frame.f_locals["x"] = 42 + sys.settrace(None) + return None + return trace + sys.settrace(trace) + f() + self.assertInBytecode(f, "LOAD_FAST") + self.assertNotInBytecode(f, "LOAD_FAST_CHECK") + + def test_initializing_local_does_not_add_check(self): + f = self.make_function_with_no_checks() + def trace(frame, event, arg): + if event == 'line' and frame.f_lineno == 4: + frame.f_locals["y"] = 42 + sys.settrace(None) + return None + return trace + sys.settrace(trace) + f() + self.assertInBytecode(f, "LOAD_FAST") + self.assertNotInBytecode(f, "LOAD_FAST_CHECK") + + if __name__ == "__main__": unittest.main() diff --git a/Lib/test/test_peg_generator/test_c_parser.py b/Lib/test/test_peg_generator/test_c_parser.py index d25bc112cfdc43..1b3fcbb92f8292 100644 --- a/Lib/test/test_peg_generator/test_c_parser.py +++ b/Lib/test/test_peg_generator/test_c_parser.py @@ -75,7 +75,7 @@ class TestCParser(unittest.TestCase): @classmethod def setUpClass(cls): - # When running under regtest, a seperate tempdir is used + # When running under regtest, a separate tempdir is used # as the current directory and watched for left-overs. # Reusing that as the base for temporary directories # ensures everything is cleaned up properly and diff --git a/Lib/test/test_posix.py b/Lib/test/test_posix.py index 28e5e90297e242..ae25ef55885dd6 100644 --- a/Lib/test/test_posix.py +++ b/Lib/test/test_posix.py @@ -31,8 +31,8 @@ os_helper.TESTFN + '-dummy-symlink') requires_32b = unittest.skipUnless( - # Emscripten has 32 bits pointers, but support 64 bits syscall args. - sys.maxsize < 2**32 and not support.is_emscripten, + # Emscripten/WASI have 32 bits pointers, but support 64 bits syscall args. 
+ sys.maxsize < 2**32 and not (support.is_emscripten or support.is_wasi), 'test is only meaningful on 32-bit builds' ) @@ -547,6 +547,7 @@ def test_readv_overflow_32bits(self): @unittest.skipUnless(hasattr(posix, 'dup'), 'test needs posix.dup()') + @unittest.skipIf(support.is_wasi, "WASI does not have dup()") def test_dup(self): fp = open(os_helper.TESTFN) try: @@ -564,6 +565,7 @@ def test_confstr(self): @unittest.skipUnless(hasattr(posix, 'dup2'), 'test needs posix.dup2()') + @unittest.skipIf(support.is_wasi, "WASI does not have dup2()") def test_dup2(self): fp1 = open(os_helper.TESTFN) fp2 = open(os_helper.TESTFN) @@ -784,7 +786,8 @@ def check_stat(uid, gid): self.assertRaises(TypeError, chown_func, first_param, uid, t(gid)) check_stat(uid, gid) - @unittest.skipUnless(hasattr(posix, 'chown'), "test needs os.chown()") + @os_helper.skip_unless_working_chmod + @unittest.skipIf(support.is_emscripten, "getgid() is a stub") def test_chown(self): # raise an OSError if the file does not exist os.unlink(os_helper.TESTFN) @@ -794,7 +797,9 @@ def test_chown(self): os_helper.create_empty_file(os_helper.TESTFN) self._test_all_chown_common(posix.chown, os_helper.TESTFN, posix.stat) + @os_helper.skip_unless_working_chmod @unittest.skipUnless(hasattr(posix, 'fchown'), "test needs os.fchown()") + @unittest.skipIf(support.is_emscripten, "getgid() is a stub") def test_fchown(self): os.unlink(os_helper.TESTFN) @@ -807,6 +812,7 @@ def test_fchown(self): finally: test_file.close() + @os_helper.skip_unless_working_chmod @unittest.skipUnless(hasattr(posix, 'lchown'), "test needs os.lchown()") def test_lchown(self): os.unlink(os_helper.TESTFN) @@ -1212,6 +1218,7 @@ def test_sched_setaffinity(self): # bpo-47205: does not raise OSError on FreeBSD self.assertRaises(OSError, posix.sched_setaffinity, -1, mask) + @unittest.skipIf(support.is_wasi, "No dynamic linking on WASI") def test_rtld_constants(self): # check presence of major RTLD_* constants posix.RTLD_LAZY @@ -1351,6 +1358,7 @@ def test_chmod_dir_fd(self): @unittest.skipUnless(hasattr(os, 'chown') and (os.chown in os.supports_dir_fd), "test needs dir_fd support in os.chown()") + @unittest.skipIf(support.is_emscripten, "getgid() is a stub") def test_chown_dir_fd(self): with self.prepare_file() as (dir_fd, name, fullname): posix.chown(name, os.getuid(), os.getgid(), dir_fd=dir_fd) @@ -1406,6 +1414,10 @@ def test_utime_dir_fd(self): # whoops! using both together not supported on this platform. pass + @unittest.skipIf( + support.is_wasi, + "WASI: symlink following on path_link is not supported" + ) @unittest.skipUnless( hasattr(os, "link") and os.link in os.supports_dir_fd, "test needs dir_fd support in os.link()" diff --git a/Lib/test/test_posixpath.py b/Lib/test/test_posixpath.py index 5fc4205beb125f..c644f881e460fe 100644 --- a/Lib/test/test_posixpath.py +++ b/Lib/test/test_posixpath.py @@ -193,8 +193,7 @@ def test_ismount_non_existent(self): self.assertIs(posixpath.ismount('/\x00'), False) self.assertIs(posixpath.ismount(b'/\x00'), False) - @unittest.skipUnless(os_helper.can_symlink(), - "Test requires symlink support") + @os_helper.skip_unless_symlink def test_ismount_symlinks(self): # Symlinks are never mountpoints. 
try: @@ -387,8 +386,7 @@ def test_realpath_pardir(self): self.assertEqual(realpath(b'../..'), dirname(dirname(os.getcwdb()))) self.assertEqual(realpath(b'/'.join([b'..'] * 100)), b'/') - @unittest.skipUnless(hasattr(os, "symlink"), - "Missing symlink implementation") + @os_helper.skip_unless_symlink @skip_if_ABSTFN_contains_backslash def test_realpath_basic(self): # Basic operation. @@ -398,8 +396,7 @@ def test_realpath_basic(self): finally: os_helper.unlink(ABSTFN) - @unittest.skipUnless(hasattr(os, "symlink"), - "Missing symlink implementation") + @os_helper.skip_unless_symlink @skip_if_ABSTFN_contains_backslash def test_realpath_strict(self): # Bug #43757: raise FileNotFoundError in strict mode if we encounter @@ -411,8 +408,7 @@ def test_realpath_strict(self): finally: os_helper.unlink(ABSTFN) - @unittest.skipUnless(hasattr(os, "symlink"), - "Missing symlink implementation") + @os_helper.skip_unless_symlink @skip_if_ABSTFN_contains_backslash def test_realpath_relative(self): try: @@ -421,8 +417,7 @@ def test_realpath_relative(self): finally: os_helper.unlink(ABSTFN) - @unittest.skipUnless(hasattr(os, "symlink"), - "Missing symlink implementation") + @os_helper.skip_unless_symlink @skip_if_ABSTFN_contains_backslash def test_realpath_symlink_loops(self): # Bug #930024, return the path unchanged if we get into an infinite @@ -463,8 +458,7 @@ def test_realpath_symlink_loops(self): os_helper.unlink(ABSTFN+"c") os_helper.unlink(ABSTFN+"a") - @unittest.skipUnless(hasattr(os, "symlink"), - "Missing symlink implementation") + @os_helper.skip_unless_symlink @skip_if_ABSTFN_contains_backslash def test_realpath_symlink_loops_strict(self): # Bug #43757, raise OSError if we get into an infinite symlink loop in @@ -505,8 +499,7 @@ def test_realpath_symlink_loops_strict(self): os_helper.unlink(ABSTFN+"c") os_helper.unlink(ABSTFN+"a") - @unittest.skipUnless(hasattr(os, "symlink"), - "Missing symlink implementation") + @os_helper.skip_unless_symlink @skip_if_ABSTFN_contains_backslash def test_realpath_repeated_indirect_symlinks(self): # Issue #6975. 
@@ -520,8 +513,7 @@ def test_realpath_repeated_indirect_symlinks(self): os_helper.unlink(ABSTFN + '/link') safe_rmdir(ABSTFN) - @unittest.skipUnless(hasattr(os, "symlink"), - "Missing symlink implementation") + @os_helper.skip_unless_symlink @skip_if_ABSTFN_contains_backslash def test_realpath_deep_recursion(self): depth = 10 @@ -540,8 +532,7 @@ def test_realpath_deep_recursion(self): os_helper.unlink(ABSTFN + '/%d' % i) safe_rmdir(ABSTFN) - @unittest.skipUnless(hasattr(os, "symlink"), - "Missing symlink implementation") + @os_helper.skip_unless_symlink @skip_if_ABSTFN_contains_backslash def test_realpath_resolve_parents(self): # We also need to resolve any symlinks in the parents of a relative @@ -560,8 +551,7 @@ def test_realpath_resolve_parents(self): safe_rmdir(ABSTFN + "/y") safe_rmdir(ABSTFN) - @unittest.skipUnless(hasattr(os, "symlink"), - "Missing symlink implementation") + @os_helper.skip_unless_symlink @skip_if_ABSTFN_contains_backslash def test_realpath_resolve_before_normalizing(self): # Bug #990669: Symbolic links should be resolved before we @@ -589,8 +579,7 @@ def test_realpath_resolve_before_normalizing(self): safe_rmdir(ABSTFN + "/k") safe_rmdir(ABSTFN) - @unittest.skipUnless(hasattr(os, "symlink"), - "Missing symlink implementation") + @os_helper.skip_unless_symlink @skip_if_ABSTFN_contains_backslash def test_realpath_resolve_first(self): # Bug #1213894: The first component of the path, if not absolute, diff --git a/Lib/test/test_property.py b/Lib/test/test_property.py index d91ad1c191275e..d07b8632aa8722 100644 --- a/Lib/test/test_property.py +++ b/Lib/test/test_property.py @@ -219,6 +219,9 @@ def test_property_set_name_incorrect_args(self): class PropertySub(property): """This is a subclass of property""" +class PropertySubWoDoc(property): + pass + class PropertySubSlots(property): """This is a subclass of property that defines __slots__""" __slots__ = () @@ -237,6 +240,38 @@ def spam(self): else: raise Exception("AttributeError not raised") + @unittest.skipIf(sys.flags.optimize >= 2, + "Docstrings are omitted with -O2 and above") + def test_issue41287(self): + + self.assertEqual(PropertySub.__doc__, "This is a subclass of property", + "Docstring of `property` subclass is ignored") + + doc = PropertySub(None, None, None, "issue 41287 is fixed").__doc__ + self.assertEqual(doc, "issue 41287 is fixed", + "Subclasses of `property` ignores `doc` constructor argument") + + def getter(x): + """Getter docstring""" + + def getter_wo_doc(x): + pass + + for ps in property, PropertySub, PropertySubWoDoc: + doc = ps(getter, None, None, "issue 41287 is fixed").__doc__ + self.assertEqual(doc, "issue 41287 is fixed", + "Getter overrides explicit property docstring (%s)" % ps.__name__) + + doc = ps(getter, None, None, None).__doc__ + self.assertEqual(doc, "Getter docstring", "Getter docstring is not picked-up (%s)" % ps.__name__) + + doc = ps(getter_wo_doc, None, None, "issue 41287 is fixed").__doc__ + self.assertEqual(doc, "issue 41287 is fixed", + "Getter overrides explicit property docstring (%s)" % ps.__name__) + + doc = ps(getter_wo_doc, None, None, None).__doc__ + self.assertIsNone(doc, "Property class doc appears in instance __doc__ (%s)" % ps.__name__) + @unittest.skipIf(sys.flags.optimize >= 2, "Docstrings are omitted with -O2 and above") def test_docstring_copy(self): @@ -249,6 +284,66 @@ def spam(self): Foo.spam.__doc__, "spam wrapped in property subclass") + @unittest.skipIf(sys.flags.optimize >= 2, + "Docstrings are omitted with -O2 and above") + def 
test_docstring_copy2(self): + """ + Property tries to provide the best docstring it finds for its instances. + If a user-provided docstring is available, it is preserved on copies. + If no docstring is available during property creation, the property + will utilize the docstring from the getter if available. + """ + def getter1(self): + return 1 + def getter2(self): + """doc 2""" + return 2 + def getter3(self): + """doc 3""" + return 3 + + # Case-1: user-provided doc is preserved in copies + # of property with undocumented getter + p = property(getter1, None, None, "doc-A") + + p2 = p.getter(getter2) + self.assertEqual(p.__doc__, "doc-A") + self.assertEqual(p2.__doc__, "doc-A") + + # Case-2: user-provided doc is preserved in copies + # of property with documented getter + p = property(getter2, None, None, "doc-A") + + p2 = p.getter(getter3) + self.assertEqual(p.__doc__, "doc-A") + self.assertEqual(p2.__doc__, "doc-A") + + # Case-3: with no user-provided doc the new getter doc + # takes precedence + p = property(getter2, None, None, None) + + p2 = p.getter(getter3) + self.assertEqual(p.__doc__, "doc 2") + self.assertEqual(p2.__doc__, "doc 3") + + # Case-4: A user-provided doc is assigned after property construction + # with documented getter. The doc IS NOT preserved. + # It's an odd behaviour, but it's a strange enough + # use case with no easy solution. + p = property(getter2, None, None, None) + p.__doc__ = "user" + p2 = p.getter(getter3) + self.assertEqual(p.__doc__, "user") + self.assertEqual(p2.__doc__, "doc 3") + + # Case-5: A user-provided doc is assigned after property construction + # with UNdocumented getter. The doc IS preserved. + p = property(getter1, None, None, None) + p.__doc__ = "user" + p2 = p.getter(getter2) + self.assertEqual(p.__doc__, "user") + self.assertEqual(p2.__doc__, "user") + @unittest.skipIf(sys.flags.optimize >= 2, "Docstrings are omitted with -O2 and above") def test_property_setter_copies_getter_docstring(self): diff --git a/Lib/test/test_py_compile.py b/Lib/test/test_py_compile.py index 794d6436b61ab0..f494aed42feae3 100644 --- a/Lib/test/test_py_compile.py +++ b/Lib/test/test_py_compile.py @@ -119,6 +119,7 @@ def test_relative_path(self): 'non-root user required') @unittest.skipIf(os.name == 'nt', 'cannot control directory permissions on Windows') + @os_helper.skip_unless_working_chmod def test_exceptions_propagate(self): # Make sure that exceptions raised thanks to issues with writing # bytecode. diff --git a/Lib/test/test_pyclbr.py b/Lib/test/test_pyclbr.py index ad7b31aef1ddd1..b2de4e8397d6ad 100644 --- a/Lib/test/test_pyclbr.py +++ b/Lib/test/test_pyclbr.py @@ -9,6 +9,7 @@ import pyclbr from unittest import TestCase, main as unittest_main from test.test_importlib import util as test_importlib_util +import warnings StaticMethodType = type(staticmethod(lambda: None)) @@ -218,9 +219,13 @@ def test_others(self): # These were once some of the longest modules.
cm('random', ignore=('Random',)) # from _random import Random as CoreGenerator - cm('cgi', ignore=('log',)) # set with = in module + with warnings.catch_warnings(): + warnings.simplefilter('ignore', DeprecationWarning) + cm('cgi', ignore=('log',)) # set with = in module cm('pickle', ignore=('partial', 'PickleBuffer')) - cm('sre_parse', ignore=('dump', 'groups', 'pos')) # from sre_constants import *; property + with warnings.catch_warnings(): + warnings.simplefilter('ignore', DeprecationWarning) + cm('sre_parse', ignore=('dump', 'groups', 'pos')) # from sre_constants import *; property cm( 'pdb', # pyclbr does not handle elegantly `typing` or properties diff --git a/Lib/test/test_pydoc.py b/Lib/test/test_pydoc.py index 13c77b6fa6822c..89faf0d9141f9d 100644 --- a/Lib/test/test_pydoc.py +++ b/Lib/test/test_pydoc.py @@ -27,7 +27,8 @@ from test.support.script_helper import assert_python_ok, assert_python_failure from test.support import threading_helper from test.support import (reap_children, captured_output, captured_stdout, - captured_stderr, is_emscripten, requires_docstrings) + captured_stderr, is_emscripten, is_wasi, + requires_docstrings) from test.support.os_helper import (TESTFN, rmtree, unlink) from test import pydoc_mod @@ -932,6 +933,8 @@ def test_apropos_with_unreadable_dir(self): self.assertEqual(out.getvalue(), '') self.assertEqual(err.getvalue(), '') + @os_helper.skip_unless_working_chmod + @unittest.skipIf(is_emscripten, "cannot remove x bit") def test_apropos_empty_doc(self): pkgdir = os.path.join(TESTFN, 'walkpkg') os.mkdir(pkgdir) @@ -1356,7 +1359,10 @@ def a_fn_with_https_link(): ) -@unittest.skipIf(is_emscripten, "Socket server not available on Emscripten.") +@unittest.skipIf( + is_emscripten or is_wasi, + "Socket server not available on Emscripten/WASI." 
+) class PydocServerTest(unittest.TestCase): """Tests for pydoc._start_server""" diff --git a/Lib/test/test_pyexpat.py b/Lib/test/test_pyexpat.py index 6e578458a25096..6f0441b66d9b88 100644 --- a/Lib/test/test_pyexpat.py +++ b/Lib/test/test_pyexpat.py @@ -12,7 +12,7 @@ from xml.parsers import expat from xml.parsers.expat import errors -from test.support import sortdict, is_emscripten +from test.support import sortdict, is_emscripten, is_wasi class SetAttributeTest(unittest.TestCase): @@ -469,6 +469,7 @@ def test_exception(self): if (sysconfig.is_python_build() and not (sys.platform == 'win32' and platform.machine() == 'ARM') and not is_emscripten + and not is_wasi ): self.assertIn('call_with_frame("StartElement"', entries[1][3]) diff --git a/Lib/test/test_queue.py b/Lib/test/test_queue.py index e3080376a9de5f..33113a72e6b6a9 100644 --- a/Lib/test/test_queue.py +++ b/Lib/test/test_queue.py @@ -10,6 +10,8 @@ from test.support import import_helper from test.support import threading_helper +# queue module depends on threading primitives +threading_helper.requires_working_threading(module=True) py_queue = import_helper.import_fresh_module('queue', blocked=['_queue']) c_queue = import_helper.import_fresh_module('queue', fresh=['_queue']) @@ -87,7 +89,6 @@ def do_exceptional_blocking_test(self,block_func, block_args, trigger_func, self.fail("trigger thread ended but event never set") -@threading_helper.requires_working_threading() class BaseQueueTestMixin(BlockingTestMixin): def setUp(self): self.cum = 0 @@ -291,7 +292,6 @@ class CPriorityQueueTest(PriorityQueueTest, unittest.TestCase): class FailingQueueException(Exception): pass -@threading_helper.requires_working_threading() class FailingQueueTest(BlockingTestMixin): def setUp(self): @@ -467,7 +467,6 @@ def consume_timeout(self, q, results, sentinel): return results.append(val) - @threading_helper.requires_working_threading() def run_threads(self, n_threads, q, inputs, feed_func, consume_func): results = [] sentinel = None diff --git a/Lib/test/test_re.py b/Lib/test/test_re.py index ba70de4344bd9d..9f734d47c54499 100644 --- a/Lib/test/test_re.py +++ b/Lib/test/test_re.py @@ -1,6 +1,6 @@ from test.support import (gc_collect, bigmemtest, _2G, cpython_only, captured_stdout, - check_disallow_instantiation, is_emscripten) + check_disallow_instantiation, is_emscripten, is_wasi) import locale import re import string @@ -1765,12 +1765,9 @@ def test_dealloc(self): long_overflow = 2**128 self.assertRaises(TypeError, re.finditer, "a", {}) with self.assertRaises(OverflowError): - _sre.compile("abc", 0, [long_overflow], 0, {}, (), 0) + _sre.compile("abc", 0, [long_overflow], 0, {}, ()) with self.assertRaises(TypeError): - _sre.compile({}, 0, [], 0, [], [], 0) - with self.assertRaises(RuntimeError): - # invalid repeat_count -1 - _sre.compile("abc", 0, [1], 0, {}, (), -1) + _sre.compile({}, 0, [], 0, [], []) def test_search_dot_unicode(self): self.assertTrue(re.search("123.*-", '123abc-')) @@ -1943,7 +1940,10 @@ def test_bug_20998(self): # with ignore case. 
self.assertEqual(re.fullmatch('[a-c]+', 'ABC', re.I).span(), (0, 3)) - @unittest.skipIf(is_emscripten, "musl libc issue on Emscripten, bpo-46390") + @unittest.skipIf( + is_emscripten or is_wasi, + "musl libc issue on Emscripten/WASI, bpo-46390" + ) def test_locale_caching(self): # Issue #22410 oldlocale = locale.setlocale(locale.LC_CTYPE) @@ -1980,7 +1980,10 @@ def check_en_US_utf8(self): self.assertIsNone(re.match(b'(?Li)\xc5', b'\xe5')) self.assertIsNone(re.match(b'(?Li)\xe5', b'\xc5')) - @unittest.skipIf(is_emscripten, "musl libc issue on Emscripten, bpo-46390") + @unittest.skipIf( + is_emscripten or is_wasi, + "musl libc issue on Emscripten/WASI, bpo-46390" + ) def test_locale_compiled(self): oldlocale = locale.setlocale(locale.LC_CTYPE) self.addCleanup(locale.setlocale, locale.LC_CTYPE, oldlocale) @@ -2380,6 +2383,30 @@ def test_bug_gh91616(self): self.assertTrue(re.fullmatch(r'(?s:(?>.*?\.).*)\Z', "a.txt")) # reproducer self.assertTrue(re.fullmatch(r'(?s:(?=(?P<g0>.*?\.))(?P=g0).*)\Z', "a.txt")) + def test_template_function_and_flag_is_deprecated(self): + with self.assertWarns(DeprecationWarning) as cm: + template_re1 = re.template(r'a') + self.assertIn('re.template()', str(cm.warning)) + self.assertIn('is deprecated', str(cm.warning)) + self.assertIn('function', str(cm.warning)) + self.assertNotIn('flag', str(cm.warning)) + + with self.assertWarns(DeprecationWarning) as cm: + # we deliberately use more flags here to test that this still + # triggers the warning + # if paranoid, we could test multiple different combinations, + # but it's probably not worth it + template_re2 = re.compile(r'a', flags=re.TEMPLATE|re.UNICODE) + self.assertIn('re.TEMPLATE', str(cm.warning)) + self.assertIn('is deprecated', str(cm.warning)) + self.assertIn('flag', str(cm.warning)) + self.assertNotIn('function', str(cm.warning)) + + # while deprecated, it should still function + self.assertEqual(template_re1, template_re2) + self.assertTrue(template_re1.match('ahoy')) + self.assertFalse(template_re1.match('nope')) + def get_debug_out(pat): with captured_stdout() as out: @@ -2479,27 +2506,6 @@ def test_possesive_repeat(self): 14. SUCCESS ''') - def test_repeat_index(self): - self.assertEqual(get_debug_out(r'(?:ab)*?(?:cd)*'), '''\ -MIN_REPEAT 0 MAXREPEAT - LITERAL 97 - LITERAL 98 -MAX_REPEAT 0 MAXREPEAT - LITERAL 99 - LITERAL 100 - - 0. INFO 4 0b0 0 MAXREPEAT (to 5) - 5: REPEAT 8 0 MAXREPEAT 0 (to 14) -10. LITERAL 0x61 ('a') -12. LITERAL 0x62 ('b') -14: MIN_UNTIL -15. REPEAT 8 0 MAXREPEAT 1 (to 24) -20. LITERAL 0x63 ('c') -22. LITERAL 0x64 ('d') -24: MAX_UNTIL -25.
SUCCESS -''') - class PatternReprTests(unittest.TestCase): def check(self, pattern, expected): @@ -2574,11 +2580,11 @@ def test_flags_repr(self): "re.IGNORECASE|re.DOTALL|re.VERBOSE|0x100000") self.assertEqual( repr(~re.I), - "re.ASCII|re.LOCALE|re.UNICODE|re.MULTILINE|re.DOTALL|re.VERBOSE|re.DEBUG|0x1") + "re.ASCII|re.LOCALE|re.UNICODE|re.MULTILINE|re.DOTALL|re.VERBOSE|re.TEMPLATE|re.DEBUG") self.assertEqual(repr(~(re.I|re.S|re.X)), - "re.ASCII|re.LOCALE|re.UNICODE|re.MULTILINE|re.DEBUG|0x1") + "re.ASCII|re.LOCALE|re.UNICODE|re.MULTILINE|re.TEMPLATE|re.DEBUG") self.assertEqual(repr(~(re.I|re.S|re.X|(1<<20))), - "re.ASCII|re.LOCALE|re.UNICODE|re.MULTILINE|re.DEBUG|0xffe01") + "re.ASCII|re.LOCALE|re.UNICODE|re.MULTILINE|re.TEMPLATE|re.DEBUG|0xffe00") class ImplementationTest(unittest.TestCase): diff --git a/Lib/test/test_regrtest.py b/Lib/test/test_regrtest.py index 15e2f89ee20c1d..a36d18488a5ef7 100644 --- a/Lib/test/test_regrtest.py +++ b/Lib/test/test_regrtest.py @@ -15,7 +15,6 @@ import sysconfig import tempfile import textwrap -import time import unittest from test import libregrtest from test import support @@ -25,7 +24,6 @@ if not support.has_subprocess_support: raise unittest.SkipTest("test module requires subprocess") -Py_DEBUG = hasattr(sys, 'gettotalrefcount') ROOT_DIR = os.path.join(os.path.dirname(__file__), '..', '..') ROOT_DIR = os.path.abspath(os.path.normpath(ROOT_DIR)) LOG_PREFIX = r'[0-9]+:[0-9]+:[0-9]+ (?:load avg: [0-9]+\.[0-9]{2} )?' @@ -665,7 +663,7 @@ def test_tools_buildbot_test(self): test_args.append('-arm32') # 32-bit ARM build elif platform.architecture()[0] == '64bit': test_args.append('-x64') # 64-bit build - if not Py_DEBUG: + if not support.Py_DEBUG: test_args.append('+d') # Release build, use python.exe self.run_batch(script, *test_args, *self.tests) @@ -682,7 +680,7 @@ def test_pcbuild_rt(self): rt_args.append('-arm32') # 32-bit ARM build elif platform.architecture()[0] == '64bit': rt_args.append('-x64') # 64-bit build - if Py_DEBUG: + if support.Py_DEBUG: rt_args.append('-d') # Debug build, use python_d.exe self.run_batch(script, *rt_args, *self.regrtest_args, *self.tests) @@ -903,7 +901,7 @@ def check_leak(self, code, what): reflog = fp.read() self.assertIn(line2, reflog) - @unittest.skipUnless(Py_DEBUG, 'need a debug build') + @unittest.skipUnless(support.Py_DEBUG, 'need a debug build') def test_huntrleaks(self): # test --huntrleaks code = textwrap.dedent(""" @@ -917,7 +915,7 @@ def test_leak(self): """) self.check_leak(code, 'references') - @unittest.skipUnless(Py_DEBUG, 'need a debug build') + @unittest.skipUnless(support.Py_DEBUG, 'need a debug build') def test_huntrleaks_fd_leak(self): # test --huntrleaks for file descriptor leak code = textwrap.dedent(""" @@ -1359,6 +1357,32 @@ def test_cleanup(self): for name in names: self.assertFalse(os.path.exists(name), name) + @unittest.skipIf(support.is_wasi, + 'checking temp files is not implemented on WASI') + def test_leak_tmp_file(self): + code = textwrap.dedent(r""" + import os.path + import tempfile + import unittest + + class FileTests(unittest.TestCase): + def test_leak_tmp_file(self): + filename = os.path.join(tempfile.gettempdir(), 'mytmpfile') + with open(filename, "wb") as fp: + fp.write(b'content') + """) + testnames = [self.create_test(code=code) for _ in range(3)] + + output = self.run_tests("--fail-env-changed", "-v", "-j2", *testnames, exitcode=3) + self.check_executed_tests(output, testnames, + env_changed=testnames, + fail_env_changed=True, + randomize=True) + for testname in testnames: + 
self.assertIn(f"Warning -- {testname} leaked temporary " + f"files (1): mytmpfile", + output) + class TestUtils(unittest.TestCase): def test_format_duration(self): diff --git a/Lib/test/test_robotparser.py b/Lib/test/test_robotparser.py index 3821d66c2db7d4..8d89e2a8224452 100644 --- a/Lib/test/test_robotparser.py +++ b/Lib/test/test_robotparser.py @@ -308,8 +308,9 @@ def log_message(self, format, *args): pass -@unittest.skipIf( - support.is_emscripten, "Socket server not available on Emscripten." +@unittest.skipUnless( + support.has_socket_support, + "Socket server requires working socket." ) class PasswordProtectedSiteTestCase(unittest.TestCase): diff --git a/Lib/test/test_select.py b/Lib/test/test_select.py index ca2a9d9d24dabc..a82584d6904920 100644 --- a/Lib/test/test_select.py +++ b/Lib/test/test_select.py @@ -1,5 +1,4 @@ import errno -import os import select import subprocess import sys diff --git a/Lib/test/test_selectors.py b/Lib/test/test_selectors.py index c927331d438b0c..c2db88c203920a 100644 --- a/Lib/test/test_selectors.py +++ b/Lib/test/test_selectors.py @@ -19,8 +19,8 @@ resource = None -if support.is_emscripten: - raise unittest.SkipTest("Cannot create socketpair on Emscripten.") +if support.is_emscripten or support.is_wasi: + raise unittest.SkipTest("Cannot create socketpair on Emscripten/WASI.") if hasattr(socket, 'socketpair'): diff --git a/Lib/test/test_shelve.py b/Lib/test/test_shelve.py index b9eb007c827966..08c6562f2a273e 100644 --- a/Lib/test/test_shelve.py +++ b/Lib/test/test_shelve.py @@ -1,11 +1,9 @@ import unittest import dbm import shelve -import glob import pickle import os -from test import support from test.support import os_helper from collections.abc import MutableMapping from test.test_dbm import dbm_iterator diff --git a/Lib/test/test_shutil.py b/Lib/test/test_shutil.py index 70033863452805..6fa07399007a3e 100644 --- a/Lib/test/test_shutil.py +++ b/Lib/test/test_shutil.py @@ -51,6 +51,9 @@ except ImportError: _winapi = None +no_chdir = unittest.mock.patch('os.chdir', + side_effect=AssertionError("shouldn't call os.chdir()")) + def _fake_rename(*args, **kwargs): # Pretend the destination path is on a different filesystem. raise OSError(getattr(errno, 'EXDEV', 18), "Invalid cross-device link") @@ -311,6 +314,7 @@ def onerror(*args): "This test can't be run on Cygwin (issue #1071513).") @unittest.skipIf(hasattr(os, 'geteuid') and os.geteuid() == 0, "This test can't be run reliably as root (issue #1076467).") + @os_helper.skip_unless_working_chmod def test_on_error(self): self.errorState = 0 os.mkdir(TESTFN) @@ -1294,6 +1298,10 @@ def test_copyfile_same_file(self): self.assertEqual(read_file(src_file), 'foo') @unittest.skipIf(MACOS or SOLARIS or _winapi, 'On MACOS, Solaris and Windows the errors are not confusing (though different)') + # gh-92670: The test uses a trailing slash to force the OS to consider + # the path as a directory, but on AIX the trailing slash has no effect + # and the path is treated as a file.
+ @unittest.skipIf(AIX, 'Not valid on AIX, see gh-92670') def test_copyfile_nonexistent_dir(self): # Issue 43219 src_dir = self.mkdtemp() @@ -1337,7 +1345,7 @@ def test_make_tarball(self): work_dir = os.path.dirname(tmpdir2) rel_base_name = os.path.join(os.path.basename(tmpdir2), 'archive') - with os_helper.change_cwd(work_dir): + with os_helper.change_cwd(work_dir), no_chdir: base_name = os.path.abspath(rel_base_name) tarball = make_archive(rel_base_name, 'gztar', root_dir, '.') @@ -1351,7 +1359,7 @@ def test_make_tarball(self): './file1', './file2', './sub/file3']) # trying an uncompressed one - with os_helper.change_cwd(work_dir): + with os_helper.change_cwd(work_dir), no_chdir: tarball = make_archive(rel_base_name, 'tar', root_dir, '.') self.assertEqual(tarball, base_name + '.tar') self.assertTrue(os.path.isfile(tarball)) @@ -1387,7 +1395,8 @@ def _create_files(self, base_dir='dist'): def test_tarfile_vs_tar(self): root_dir, base_dir = self._create_files() base_name = os.path.join(self.mkdtemp(), 'archive') - tarball = make_archive(base_name, 'gztar', root_dir, base_dir) + with no_chdir: + tarball = make_archive(base_name, 'gztar', root_dir, base_dir) # check if the compressed tarball was created self.assertEqual(tarball, base_name + '.tar.gz') @@ -1404,13 +1413,15 @@ def test_tarfile_vs_tar(self): self.assertEqual(self._tarinfo(tarball), self._tarinfo(tarball2)) # trying an uncompressed one - tarball = make_archive(base_name, 'tar', root_dir, base_dir) + with no_chdir: + tarball = make_archive(base_name, 'tar', root_dir, base_dir) self.assertEqual(tarball, base_name + '.tar') self.assertTrue(os.path.isfile(tarball)) # now for a dry_run - tarball = make_archive(base_name, 'tar', root_dir, base_dir, - dry_run=True) + with no_chdir: + tarball = make_archive(base_name, 'tar', root_dir, base_dir, + dry_run=True) self.assertEqual(tarball, base_name + '.tar') self.assertTrue(os.path.isfile(tarball)) @@ -1426,7 +1437,7 @@ def test_make_zipfile(self): work_dir = os.path.dirname(tmpdir2) rel_base_name = os.path.join(os.path.basename(tmpdir2), 'archive') - with os_helper.change_cwd(work_dir): + with os_helper.change_cwd(work_dir), no_chdir: base_name = os.path.abspath(rel_base_name) res = make_archive(rel_base_name, 'zip', root_dir) @@ -1439,7 +1450,7 @@ def test_make_zipfile(self): 'dist/file1', 'dist/file2', 'dist/sub/file3', 'outer']) - with os_helper.change_cwd(work_dir): + with os_helper.change_cwd(work_dir), no_chdir: base_name = os.path.abspath(rel_base_name) res = make_archive(rel_base_name, 'zip', root_dir, base_dir) @@ -1457,7 +1468,8 @@ def test_make_zipfile(self): def test_zipfile_vs_zip(self): root_dir, base_dir = self._create_files() base_name = os.path.join(self.mkdtemp(), 'archive') - archive = make_archive(base_name, 'zip', root_dir, base_dir) + with no_chdir: + archive = make_archive(base_name, 'zip', root_dir, base_dir) # check if ZIP file was created self.assertEqual(archive, base_name + '.zip') @@ -1483,7 +1495,8 @@ def test_zipfile_vs_zip(self): def test_unzip_zipfile(self): root_dir, base_dir = self._create_files() base_name = os.path.join(self.mkdtemp(), 'archive') - archive = make_archive(base_name, 'zip', root_dir, base_dir) + with no_chdir: + archive = make_archive(base_name, 'zip', root_dir, base_dir) # check if ZIP file was created self.assertEqual(archive, base_name + '.zip') @@ -1541,7 +1554,7 @@ def test_tarfile_root_owner(self): base_name = os.path.join(self.mkdtemp(), 'archive') group = grp.getgrgid(0)[0] owner = pwd.getpwuid(0)[0] - with 
os_helper.change_cwd(root_dir): + with os_helper.change_cwd(root_dir), no_chdir: archive_name = make_archive(base_name, 'gztar', root_dir, 'dist', owner=owner, group=group) @@ -1559,23 +1572,30 @@ def test_tarfile_root_owner(self): def test_make_archive_cwd(self): current_dir = os.getcwd() + root_dir = self.mkdtemp() def _breaks(*args, **kw): raise RuntimeError() + dirs = [] + def _chdir(path): + dirs.append(path) + orig_chdir(path) register_archive_format('xxx', _breaks, [], 'xxx file') try: - try: - make_archive('xxx', 'xxx', root_dir=self.mkdtemp()) - except Exception: - pass + with support.swap_attr(os, 'chdir', _chdir) as orig_chdir: + try: + make_archive('xxx', 'xxx', root_dir=root_dir) + except Exception: + pass self.assertEqual(os.getcwd(), current_dir) + self.assertEqual(dirs, [root_dir, current_dir]) finally: unregister_archive_format('xxx') def test_make_tarfile_in_curdir(self): # Issue #21280 root_dir = self.mkdtemp() - with os_helper.change_cwd(root_dir): + with os_helper.change_cwd(root_dir), no_chdir: self.assertEqual(make_archive('test', 'tar'), 'test.tar') self.assertTrue(os.path.isfile('test.tar')) @@ -1583,7 +1603,7 @@ def test_make_tarfile_in_curdir(self): def test_make_zipfile_in_curdir(self): # Issue #21280 root_dir = self.mkdtemp() - with os_helper.change_cwd(root_dir): + with os_helper.change_cwd(root_dir), no_chdir: self.assertEqual(make_archive('test', 'zip'), 'test.zip') self.assertTrue(os.path.isfile('test.zip')) @@ -2648,6 +2668,7 @@ def test_stty_match(self): self.assertEqual(expected, actual) + @unittest.skipIf(support.is_wasi, "WASI has no /dev/null") def test_fallback(self): with os_helper.EnvironmentVarGuard() as env: del env['LINES'] diff --git a/Lib/test/test_signal.py b/Lib/test/test_signal.py index 2ba29fc837d449..2562a57ea421ff 100644 --- a/Lib/test/test_signal.py +++ b/Lib/test/test_signal.py @@ -107,6 +107,10 @@ def test_interprocess_signal(self): script = os.path.join(dirname, 'signalinterproctester.py') assert_python_ok(script) + @unittest.skipUnless( + hasattr(signal, "valid_signals"), + "requires signal.valid_signals" + ) def test_valid_signals(self): s = signal.valid_signals() self.assertIsInstance(s, set) @@ -212,6 +216,7 @@ def test_invalid_fd(self): self.assertRaises((ValueError, OSError), signal.set_wakeup_fd, fd) + @unittest.skipUnless(support.has_socket_support, "needs working sockets.") def test_invalid_socket(self): sock = socket.socket() fd = sock.fileno() @@ -222,6 +227,7 @@ def test_invalid_socket(self): # Emscripten does not support fstat on pipes yet. # https://github.com/emscripten-core/emscripten/issues/16414 @unittest.skipIf(support.is_emscripten, "Emscripten cannot fstat pipes.") + @unittest.skipUnless(hasattr(os, "pipe"), "requires os.pipe()") def test_set_wakeup_fd_result(self): r1, w1 = os.pipe() self.addCleanup(os.close, r1) @@ -240,6 +246,7 @@ def test_set_wakeup_fd_result(self): self.assertEqual(signal.set_wakeup_fd(-1), -1) @unittest.skipIf(support.is_emscripten, "Emscripten cannot fstat pipes.") + @unittest.skipUnless(support.has_socket_support, "needs working sockets.") def test_set_wakeup_fd_socket_result(self): sock1 = socket.socket() self.addCleanup(sock1.close) @@ -260,6 +267,7 @@ def test_set_wakeup_fd_socket_result(self): # function to test if a socket is in non-blocking mode. 
@unittest.skipIf(sys.platform == "win32", "tests specific to POSIX") @unittest.skipIf(support.is_emscripten, "Emscripten cannot fstat pipes.") + @unittest.skipUnless(hasattr(os, "pipe"), "requires os.pipe()") def test_set_wakeup_fd_blocking(self): rfd, wfd = os.pipe() self.addCleanup(os.close, rfd) @@ -320,6 +328,7 @@ def check_signum(signals): assert_python_ok('-c', code) @unittest.skipIf(_testcapi is None, 'need _testcapi') + @unittest.skipUnless(hasattr(os, "pipe"), "requires os.pipe()") def test_wakeup_write_error(self): # Issue #16105: write() errors in the C signal handler should not # pass silently. @@ -659,6 +668,7 @@ def handler(signum, frame): @unittest.skipIf(sys.platform == "win32", "Not valid on Windows") @unittest.skipUnless(hasattr(signal, 'siginterrupt'), "needs signal.siginterrupt()") @support.requires_subprocess() +@unittest.skipUnless(hasattr(os, "pipe"), "requires os.pipe()") class SiginterruptTest(unittest.TestCase): def readpipe_interrupted(self, interrupt): @@ -802,15 +812,12 @@ def test_itimer_virtual(self): signal.signal(signal.SIGVTALRM, self.sig_vtalrm) signal.setitimer(self.itimer, 0.3, 0.2) - start_time = time.monotonic() - while time.monotonic() - start_time < 60.0: + for _ in support.busy_retry(support.LONG_TIMEOUT): # use up some virtual time by doing real work _ = pow(12345, 67890, 10000019) if signal.getitimer(self.itimer) == (0.0, 0.0): - break # sig_vtalrm handler stopped this itimer - else: # Issue 8424 - self.skipTest("timeout: likely cause: machine too slow or load too " - "high") + # sig_vtalrm handler stopped this itimer + break # virtual itimer should be (0.0, 0.0) now self.assertEqual(signal.getitimer(self.itimer), (0.0, 0.0)) @@ -822,15 +829,12 @@ def test_itimer_prof(self): signal.signal(signal.SIGPROF, self.sig_prof) signal.setitimer(self.itimer, 0.2, 0.2) - start_time = time.monotonic() - while time.monotonic() - start_time < 60.0: + for _ in support.busy_retry(support.LONG_TIMEOUT): # do some work _ = pow(12345, 67890, 10000019) if signal.getitimer(self.itimer) == (0.0, 0.0): - break # sig_prof handler stopped this itimer - else: # Issue 8424 - self.skipTest("timeout: likely cause: machine too slow or load too " - "high") + # sig_prof handler stopped this itimer + break # profiling itimer should be (0.0, 0.0) now self.assertEqual(signal.getitimer(self.itimer), (0.0, 0.0)) @@ -1297,8 +1301,6 @@ def handler(signum, frame): self.setsig(signal.SIGALRM, handler) # for ITIMER_REAL expected_sigs = 0 - deadline = time.monotonic() + support.SHORT_TIMEOUT - while expected_sigs < N: # Hopefully the SIGALRM will be received somewhere during # initial processing of SIGUSR1. 
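The itimer and wakeup-fd waits in this file (and similar loops in test_ssl and test_support further down) drop their hand-rolled time.monotonic() deadlines in favour of support.busy_retry() and support.sleeping_retry(). The real helpers live in Lib/test/support; the sketch below is only an approximation of their apparent contract — keep yielding until the caller breaks out, otherwise fail once the timeout expires — inferred from how the rewritten tests use them.

import time

def busy_retry(timeout, err_msg=None):
    # Yield repeatedly until the deadline; if the caller never breaks out,
    # fail loudly instead of looping forever (approximation, see above).
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        yield
    raise AssertionError(err_msg or f"timeout ({timeout:.1f} seconds)")

def sleeping_retry(timeout, err_msg=None, *, init_delay=0.010, max_delay=1.0):
    # Same idea, but sleep with exponential back-off between attempts.
    delay = init_delay
    for _ in busy_retry(timeout, err_msg):
        yield
        time.sleep(delay)
        delay = min(delay * 2, max_delay)

# Usage mirrors the rewritten tests (condition() is a stand-in here):
#     for _ in sleeping_retry(5.0):
#         if condition():
#             break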
@@ -1307,8 +1309,9 @@ def handler(signum, frame): expected_sigs += 2 # Wait for handlers to run to avoid signal coalescing - while len(sigs) < expected_sigs and time.monotonic() < deadline: - time.sleep(1e-5) + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if len(sigs) >= expected_sigs: + break # All ITIMER_REAL signals should have been delivered to the # Python handler diff --git a/Lib/test/test_site.py b/Lib/test/test_site.py index dd018d6b38a864..9e701fd847acdf 100644 --- a/Lib/test/test_site.py +++ b/Lib/test/test_site.py @@ -10,10 +10,9 @@ from test.support import os_helper from test.support import socket_helper from test.support import captured_stderr -from test.support.os_helper import TESTFN, EnvironmentVarGuard, change_cwd +from test.support.os_helper import TESTFN, EnvironmentVarGuard import ast import builtins -import encodings import glob import io import os @@ -206,7 +205,7 @@ def test_get_path(self): scheme = 'osx_framework_user' else: scheme = os.name + '_user' - self.assertEqual(site._get_path(site._getuserbase()), + self.assertEqual(os.path.normpath(site._get_path(site._getuserbase())), sysconfig.get_path('purelib', scheme)) @unittest.skipUnless(site.ENABLE_USER_SITE, "requires access to PEP 370 " @@ -214,7 +213,7 @@ def test_get_path(self): @support.requires_subprocess() def test_s_option(self): # (ncoghlan) Change this to use script_helper... - usersite = site.USER_SITE + usersite = os.path.normpath(site.USER_SITE) self.assertIn(usersite, sys.path) env = os.environ.copy() @@ -571,6 +570,8 @@ def _create_underpth_exe(self, lines, exe_pth=True): dll_file = os.path.join(temp_dir, os.path.split(dll_src_file)[1]) shutil.copy(sys.executable, exe_file) shutil.copy(dll_src_file, dll_file) + for fn in glob.glob(os.path.join(os.path.split(dll_src_file)[0], "vcruntime*.dll")): + shutil.copy(fn, os.path.join(temp_dir, os.path.split(fn)[1])) if exe_pth: _pth_file = os.path.splitext(exe_file)[0] + '._pth' else: diff --git a/Lib/test/test_smtpd.py b/Lib/test/test_smtpd.py index 57eb98ebc1d1ef..39ff8793648ba6 100644 --- a/Lib/test/test_smtpd.py +++ b/Lib/test/test_smtpd.py @@ -10,6 +10,9 @@ smtpd = warnings_helper.import_deprecated('smtpd') asyncore = warnings_helper.import_deprecated('asyncore') +if not socket_helper.has_gethostname: + raise unittest.SkipTest("test requires gethostname()") + class DummyServer(smtpd.SMTPServer): def __init__(self, *args, **kwargs): diff --git a/Lib/test/test_socket.py b/Lib/test/test_socket.py old mode 100755 new mode 100644 index 1aaa9e44f90c65..1700b429ab07ca --- a/Lib/test/test_socket.py +++ b/Lib/test/test_socket.py @@ -4,30 +4,30 @@ from test.support import socket_helper from test.support import threading_helper +import _thread as thread +import array +import contextlib import errno import io import itertools -import socket -import select -import tempfile -import time -import traceback -import queue -import sys -import os -import platform -import array -import contextlib -from weakref import proxy -import signal import math +import os import pickle -import struct +import platform +import queue import random -import shutil +import re +import select +import signal +import socket import string -import _thread as thread +import struct +import sys +import tempfile import threading +import time +import traceback +from weakref import proxy try: import multiprocessing except ImportError: @@ -143,6 +143,17 @@ def _have_socket_bluetooth(): return True +def _have_socket_hyperv(): + """Check whether AF_HYPERV sockets are supported on this host.""" 
+ try: + s = socket.socket(socket.AF_HYPERV, socket.SOCK_STREAM, socket.HV_PROTOCOL_RAW) + except (AttributeError, OSError): + return False + else: + s.close() + return True + + @contextlib.contextmanager def socket_setdefaulttimeout(timeout): old_timeout = socket.getdefaulttimeout() @@ -171,6 +182,8 @@ def socket_setdefaulttimeout(timeout): HAVE_SOCKET_BLUETOOTH = _have_socket_bluetooth() +HAVE_SOCKET_HYPERV = _have_socket_hyperv() + # Size in bytes of the int type SIZEOF_INT = array.array("i").itemsize @@ -591,17 +604,18 @@ class SocketTestBase(unittest.TestCase): def setUp(self): self.serv = self.newSocket() + self.addCleanup(self.close_server) self.bindServer() + def close_server(self): + self.serv.close() + self.serv = None + def bindServer(self): """Bind server socket and set self.serv_addr to its address.""" self.bindSock(self.serv) self.serv_addr = self.serv.getsockname() - def tearDown(self): - self.serv.close() - self.serv = None - class SocketListeningTestMixin(SocketTestBase): """Mixin to listen on the server socket.""" @@ -686,15 +700,10 @@ class UnixSocketTestBase(SocketTestBase): # can't send anything that might be problematic for a privileged # user running the tests. - def setUp(self): - self.dir_path = tempfile.mkdtemp() - self.addCleanup(os.rmdir, self.dir_path) - super().setUp() - def bindSock(self, sock): - path = tempfile.mktemp(dir=self.dir_path) - socket_helper.bind_unix_socket(sock, path) + path = socket_helper.create_unix_domain_name() self.addCleanup(os_helper.unlink, path) + socket_helper.bind_unix_socket(sock, path) class UnixStreamBase(UnixSocketTestBase): """Base class for Unix-domain SOCK_STREAM tests.""" @@ -1891,17 +1900,18 @@ def test_socket_fileno(self): self._test_socket_fileno(s, socket.AF_INET6, socket.SOCK_STREAM) if hasattr(socket, "AF_UNIX"): - tmpdir = tempfile.mkdtemp() - self.addCleanup(shutil.rmtree, tmpdir) + unix_name = socket_helper.create_unix_domain_name() + self.addCleanup(os_helper.unlink, unix_name) + s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) - self.addCleanup(s.close) - try: - s.bind(os.path.join(tmpdir, 'socket')) - except PermissionError: - pass - else: - self._test_socket_fileno(s, socket.AF_UNIX, - socket.SOCK_STREAM) + with s: + try: + s.bind(unix_name) + except PermissionError: + pass + else: + self._test_socket_fileno(s, socket.AF_UNIX, + socket.SOCK_STREAM) def test_socket_fileno_rejects_float(self): with self.assertRaises(TypeError): @@ -2459,6 +2469,59 @@ def testCreateScoSocket(self): pass +@unittest.skipUnless(HAVE_SOCKET_HYPERV, + 'Hyper-V sockets required for this test.') +class BasicHyperVTest(unittest.TestCase): + + def testHyperVConstants(self): + socket.HVSOCKET_CONNECT_TIMEOUT + socket.HVSOCKET_CONNECT_TIMEOUT_MAX + socket.HVSOCKET_CONNECTED_SUSPEND + socket.HVSOCKET_ADDRESS_FLAG_PASSTHRU + socket.HV_GUID_ZERO + socket.HV_GUID_WILDCARD + socket.HV_GUID_BROADCAST + socket.HV_GUID_CHILDREN + socket.HV_GUID_LOOPBACK + socket.HV_GUID_LOOPBACK + + def testCreateHyperVSocketWithUnknownProtoFailure(self): + expected = "A protocol was specified in the socket function call " \ + "that does not support the semantics of the socket type requested" + with self.assertRaisesRegex(OSError, expected): + socket.socket(socket.AF_HYPERV, socket.SOCK_STREAM) + + def testCreateHyperVSocketAddrNotTupleFailure(self): + expected = "connect(): AF_HYPERV address must be tuple, not str" + with socket.socket(socket.AF_HYPERV, socket.SOCK_STREAM, socket.HV_PROTOCOL_RAW) as s: + with self.assertRaisesRegex(TypeError, 
re.escape(expected)): + s.connect(socket.HV_GUID_ZERO) + + def testCreateHyperVSocketAddrNotTupleOf2StrsFailure(self): + expected = "AF_HYPERV address must be a str tuple (vm_id, service_id)" + with socket.socket(socket.AF_HYPERV, socket.SOCK_STREAM, socket.HV_PROTOCOL_RAW) as s: + with self.assertRaisesRegex(TypeError, re.escape(expected)): + s.connect((socket.HV_GUID_ZERO,)) + + def testCreateHyperVSocketAddrNotTupleOfStrsFailure(self): + expected = "AF_HYPERV address must be a str tuple (vm_id, service_id)" + with socket.socket(socket.AF_HYPERV, socket.SOCK_STREAM, socket.HV_PROTOCOL_RAW) as s: + with self.assertRaisesRegex(TypeError, re.escape(expected)): + s.connect((1, 2)) + + def testCreateHyperVSocketAddrVmIdNotValidUUIDFailure(self): + expected = "connect(): AF_HYPERV address vm_id is not a valid UUID string" + with socket.socket(socket.AF_HYPERV, socket.SOCK_STREAM, socket.HV_PROTOCOL_RAW) as s: + with self.assertRaisesRegex(ValueError, re.escape(expected)): + s.connect(("00", socket.HV_GUID_ZERO)) + + def testCreateHyperVSocketAddrServiceIdNotValidUUIDFailure(self): + expected = "connect(): AF_HYPERV address service_id is not a valid UUID string" + with socket.socket(socket.AF_HYPERV, socket.SOCK_STREAM, socket.HV_PROTOCOL_RAW) as s: + with self.assertRaisesRegex(ValueError, re.escape(expected)): + s.connect((socket.HV_GUID_ZERO, "00")) + + class BasicTCPTest(SocketConnectedTest): def __init__(self, methodName='runTest'): diff --git a/Lib/test/test_socketserver.py b/Lib/test/test_socketserver.py index 2edb1e0c0e21e2..2fa5069423327a 100644 --- a/Lib/test/test_socketserver.py +++ b/Lib/test/test_socketserver.py @@ -8,7 +8,6 @@ import select import signal import socket -import tempfile import threading import unittest import socketserver @@ -98,8 +97,7 @@ def pickaddr(self, proto): else: # XXX: We need a way to tell AF_UNIX to pick its own name # like AF_INET provides port==0. - dir = None - fn = tempfile.mktemp(prefix='unix_socket.', dir=dir) + fn = socket_helper.create_unix_domain_name() self.test_files.append(fn) return fn diff --git a/Lib/test/test_sqlite3/__init__.py b/Lib/test/test_sqlite3/__init__.py index 099c01e3b3cc70..c3cab8f0acdc0b 100644 --- a/Lib/test/test_sqlite3/__init__.py +++ b/Lib/test/test_sqlite3/__init__.py @@ -3,7 +3,6 @@ # Skip test if _sqlite3 module not installed. import_helper.import_module('_sqlite3') -import unittest import os import sqlite3 diff --git a/Lib/test/test_sqlite3/test_dbapi.py b/Lib/test/test_sqlite3/test_dbapi.py index e132fcdfb0e658..363a308f3e5fec 100644 --- a/Lib/test/test_sqlite3/test_dbapi.py +++ b/Lib/test/test_sqlite3/test_dbapi.py @@ -21,33 +21,22 @@ # 3. This notice may not be removed or altered from any source distribution. 
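The socket and socketserver hunks above replace tempfile.mktemp()/mkdtemp() with socket_helper.create_unix_domain_name(), with the caller still responsible for unlinking the path. A plausible minimal implementation of that helper — an assumption, not the actual test.support.socket_helper code — only needs to hand back a short, unused path that fits within the AF_UNIX address limit:

import os
import tempfile

def create_unix_domain_name():
    # AF_UNIX socket paths are length-limited (~100 bytes), so prefer a
    # short name relative to the current directory over a long tmpdir path.
    return tempfile.mktemp(prefix="test_python_", suffix=".sock",
                           dir=os.path.curdir)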
import contextlib +import os import sqlite3 as sqlite import subprocess import sys import threading import unittest +import urllib.parse from test.support import ( - SHORT_TIMEOUT, - bigmemtest, - check_disallow_instantiation, - threading_helper, + SHORT_TIMEOUT, check_disallow_instantiation, requires_subprocess, + is_emscripten, is_wasi ) +from test.support import threading_helper from _testcapi import INT_MAX, ULLONG_MAX from os import SEEK_SET, SEEK_CUR, SEEK_END -from test.support.os_helper import TESTFN, unlink, temp_dir - - -# Helper for tests using TESTFN -@contextlib.contextmanager -def managed_connect(*args, in_mem=False, **kwargs): - cx = sqlite.connect(*args, **kwargs) - try: - yield cx - finally: - cx.close() - if not in_mem: - unlink(TESTFN) +from test.support.os_helper import TESTFN, TESTFN_UNDECODABLE, unlink, temp_dir, FakePath # Helper for temporary memory databases @@ -71,6 +60,17 @@ def test_api_level(self): self.assertEqual(sqlite.apilevel, "2.0", "apilevel is %s, should be 2.0" % sqlite.apilevel) + def test_deprecated_version(self): + msg = "deprecated and will be removed in Python 3.14" + for attr in "version", "version_info": + with self.subTest(attr=attr): + with self.assertWarnsRegex(DeprecationWarning, msg) as cm: + getattr(sqlite, attr) + self.assertEqual(cm.filename, __file__) + with self.assertWarnsRegex(DeprecationWarning, msg) as cm: + getattr(sqlite.dbapi2, attr) + self.assertEqual(cm.filename, __file__) + def test_thread_safety(self): self.assertIn(sqlite.threadsafety, {0, 1, 3}, "threadsafety is %d, should be 0, 1 or 3" % @@ -242,7 +242,7 @@ def test_module_constants(self): "SQLITE_READONLY_CANTLOCK", "SQLITE_READONLY_RECOVERY", ] - if sqlite.version_info >= (3, 7, 16): + if sqlite.sqlite_version_info >= (3, 7, 16): consts += [ "SQLITE_CONSTRAINT_CHECK", "SQLITE_CONSTRAINT_COMMITHOOK", @@ -255,63 +255,63 @@ def test_module_constants(self): "SQLITE_CONSTRAINT_VTAB", "SQLITE_READONLY_ROLLBACK", ] - if sqlite.version_info >= (3, 7, 17): + if sqlite.sqlite_version_info >= (3, 7, 17): consts += [ "SQLITE_IOERR_MMAP", "SQLITE_NOTICE_RECOVER_ROLLBACK", "SQLITE_NOTICE_RECOVER_WAL", ] - if sqlite.version_info >= (3, 8, 0): + if sqlite.sqlite_version_info >= (3, 8, 0): consts += [ "SQLITE_BUSY_SNAPSHOT", "SQLITE_IOERR_GETTEMPPATH", "SQLITE_WARNING_AUTOINDEX", ] - if sqlite.version_info >= (3, 8, 1): + if sqlite.sqlite_version_info >= (3, 8, 1): consts += ["SQLITE_CANTOPEN_CONVPATH", "SQLITE_IOERR_CONVPATH"] - if sqlite.version_info >= (3, 8, 2): + if sqlite.sqlite_version_info >= (3, 8, 2): consts.append("SQLITE_CONSTRAINT_ROWID") - if sqlite.version_info >= (3, 8, 3): + if sqlite.sqlite_version_info >= (3, 8, 3): consts.append("SQLITE_READONLY_DBMOVED") - if sqlite.version_info >= (3, 8, 7): + if sqlite.sqlite_version_info >= (3, 8, 7): consts.append("SQLITE_AUTH_USER") - if sqlite.version_info >= (3, 9, 0): + if sqlite.sqlite_version_info >= (3, 9, 0): consts.append("SQLITE_IOERR_VNODE") - if sqlite.version_info >= (3, 10, 0): + if sqlite.sqlite_version_info >= (3, 10, 0): consts.append("SQLITE_IOERR_AUTH") - if sqlite.version_info >= (3, 14, 1): + if sqlite.sqlite_version_info >= (3, 14, 1): consts.append("SQLITE_OK_LOAD_PERMANENTLY") - if sqlite.version_info >= (3, 21, 0): + if sqlite.sqlite_version_info >= (3, 21, 0): consts += [ "SQLITE_IOERR_BEGIN_ATOMIC", "SQLITE_IOERR_COMMIT_ATOMIC", "SQLITE_IOERR_ROLLBACK_ATOMIC", ] - if sqlite.version_info >= (3, 22, 0): + if sqlite.sqlite_version_info >= (3, 22, 0): consts += [ "SQLITE_ERROR_MISSING_COLLSEQ", 
"SQLITE_ERROR_RETRY", "SQLITE_READONLY_CANTINIT", "SQLITE_READONLY_DIRECTORY", ] - if sqlite.version_info >= (3, 24, 0): + if sqlite.sqlite_version_info >= (3, 24, 0): consts += ["SQLITE_CORRUPT_SEQUENCE", "SQLITE_LOCKED_VTAB"] - if sqlite.version_info >= (3, 25, 0): + if sqlite.sqlite_version_info >= (3, 25, 0): consts += ["SQLITE_CANTOPEN_DIRTYWAL", "SQLITE_ERROR_SNAPSHOT"] - if sqlite.version_info >= (3, 31, 0): + if sqlite.sqlite_version_info >= (3, 31, 0): consts += [ "SQLITE_CANTOPEN_SYMLINK", "SQLITE_CONSTRAINT_PINNED", "SQLITE_OK_SYMLINK", ] - if sqlite.version_info >= (3, 32, 0): + if sqlite.sqlite_version_info >= (3, 32, 0): consts += [ "SQLITE_BUSY_TIMEOUT", "SQLITE_CORRUPT_INDEX", "SQLITE_IOERR_DATA", ] - if sqlite.version_info >= (3, 34, 0): - const.append("SQLITE_IOERR_CORRUPTFS") + if sqlite.sqlite_version_info >= (3, 34, 0): + consts.append("SQLITE_IOERR_CORRUPTFS") for const in consts: with self.subTest(const=const): self.assertTrue(hasattr(sqlite, const)) @@ -333,7 +333,7 @@ def test_error_code_on_exception(self): @unittest.skipIf(sqlite.sqlite_version_info <= (3, 7, 16), "Requires SQLite 3.7.16 or newer") def test_extended_error_code_on_exception(self): - with managed_connect(":memory:", in_mem=True) as con: + with memory_database() as con: with con: con.execute("create table t(t integer check(t > 0))") errmsg = "constraint failed" @@ -344,15 +344,6 @@ def test_extended_error_code_on_exception(self): sqlite.SQLITE_CONSTRAINT_CHECK) self.assertEqual(exc.sqlite_errorname, "SQLITE_CONSTRAINT_CHECK") - # sqlite3_enable_shared_cache() is deprecated on macOS and calling it may raise - # OperationalError on some buildbots. - @unittest.skipIf(sys.platform == "darwin", "shared cache is deprecated on macOS") - def test_shared_cache_deprecated(self): - for enable in (True, False): - with self.assertWarns(DeprecationWarning) as cm: - sqlite.enable_shared_cache(enable) - self.assertIn("dbapi.py", cm.filename) - def test_disallow_instantiation(self): cx = sqlite.connect(":memory:") check_disallow_instantiation(self, type(cx("select 1"))) @@ -400,7 +391,7 @@ def test_cursor(self): def test_failed_open(self): YOU_CANNOT_OPEN_THIS = "/foo/bar/bla/23534/mydb.db" with self.assertRaises(sqlite.OperationalError): - con = sqlite.connect(YOU_CANNOT_OPEN_THIS) + sqlite.connect(YOU_CANNOT_OPEN_THIS) def test_close(self): self.cx.close() @@ -649,13 +640,6 @@ def test_deserialize_corrupt_database(self): # deserialized database. cx.execute("create table fail(f)") - @unittest.skipUnless(sys.maxsize > 2**32, 'requires 64bit platform') - @bigmemtest(size=2**63, memuse=3, dry_run=False) - def test_deserialize_too_much_data_64bit(self): - with memory_database() as cx: - with self.assertRaisesRegex(OverflowError, "'data' is too large"): - cx.deserialize(b"b" * size) - class OpenTests(unittest.TestCase): _sql = "create table test(id integer)" @@ -663,24 +647,86 @@ class OpenTests(unittest.TestCase): def test_open_with_path_like_object(self): """ Checks that we can successfully connect to a database using an object that is PathLike, i.e. has __fspath__(). 
""" - class Path: - def __fspath__(self): - return TESTFN - path = Path() - with managed_connect(path) as cx: + path = FakePath(TESTFN) + self.addCleanup(unlink, path) + self.assertFalse(os.path.exists(path)) + with contextlib.closing(sqlite.connect(path)) as cx: + self.assertTrue(os.path.exists(path)) + cx.execute(self._sql) + + @unittest.skipIf(sys.platform == "win32", "skipped on Windows") + @unittest.skipIf(sys.platform == "darwin", "skipped on macOS") + @unittest.skipIf(is_emscripten or is_wasi, "not supported on Emscripten/WASI") + @unittest.skipUnless(TESTFN_UNDECODABLE, "only works if there are undecodable paths") + def test_open_with_undecodable_path(self): + path = TESTFN_UNDECODABLE + self.addCleanup(unlink, path) + self.assertFalse(os.path.exists(path)) + with contextlib.closing(sqlite.connect(path)) as cx: + self.assertTrue(os.path.exists(path)) cx.execute(self._sql) def test_open_uri(self): - with managed_connect(TESTFN) as cx: + path = TESTFN + self.addCleanup(unlink, path) + uri = "file:" + urllib.parse.quote(os.fsencode(path)) + self.assertFalse(os.path.exists(path)) + with contextlib.closing(sqlite.connect(uri, uri=True)) as cx: + self.assertTrue(os.path.exists(path)) cx.execute(self._sql) - with managed_connect(f"file:{TESTFN}", uri=True) as cx: + + def test_open_unquoted_uri(self): + path = TESTFN + self.addCleanup(unlink, path) + uri = "file:" + path + self.assertFalse(os.path.exists(path)) + with contextlib.closing(sqlite.connect(uri, uri=True)) as cx: + self.assertTrue(os.path.exists(path)) cx.execute(self._sql) + + def test_open_uri_readonly(self): + path = TESTFN + self.addCleanup(unlink, path) + uri = "file:" + urllib.parse.quote(os.fsencode(path)) + "?mode=ro" + self.assertFalse(os.path.exists(path)) + # Cannot create new DB with self.assertRaises(sqlite.OperationalError): - with managed_connect(f"file:{TESTFN}?mode=ro", uri=True) as cx: + sqlite.connect(uri, uri=True) + self.assertFalse(os.path.exists(path)) + sqlite.connect(path).close() + self.assertTrue(os.path.exists(path)) + # Cannot modify new DB + with contextlib.closing(sqlite.connect(uri, uri=True)) as cx: + with self.assertRaises(sqlite.OperationalError): cx.execute(self._sql) + @unittest.skipIf(sys.platform == "win32", "skipped on Windows") + @unittest.skipIf(sys.platform == "darwin", "skipped on macOS") + @unittest.skipIf(is_emscripten or is_wasi, "not supported on Emscripten/WASI") + @unittest.skipUnless(TESTFN_UNDECODABLE, "only works if there are undecodable paths") + def test_open_undecodable_uri(self): + path = TESTFN_UNDECODABLE + self.addCleanup(unlink, path) + uri = "file:" + urllib.parse.quote(path) + self.assertFalse(os.path.exists(path)) + with contextlib.closing(sqlite.connect(uri, uri=True)) as cx: + self.assertTrue(os.path.exists(path)) + cx.execute(self._sql) + + def test_factory_database_arg(self): + def factory(database, *args, **kwargs): + nonlocal database_arg + database_arg = database + return sqlite.Connection(":memory:", *args, **kwargs) + + for database in (TESTFN, os.fsencode(TESTFN), + FakePath(TESTFN), FakePath(os.fsencode(TESTFN))): + database_arg = None + sqlite.connect(database, factory=factory).close() + self.assertEqual(database_arg, database) + def test_database_keyword(self): - with sqlite.connect(database=":memory:") as cx: + with contextlib.closing(sqlite.connect(database=":memory:")) as cx: self.assertEqual(type(cx), sqlite.Connection) @@ -705,22 +751,44 @@ def test_execute_illegal_sql(self): with self.assertRaises(sqlite.OperationalError): self.cu.execute("select 
asdf") - def test_execute_too_much_sql(self): - self.assertRaisesRegex(sqlite.ProgrammingError, - "You can only execute one statement at a time", - self.cu.execute, "select 5+4; select 4+5") - - def test_execute_too_much_sql2(self): - self.cu.execute("select 5+4; -- foo bar") + def test_execute_multiple_statements(self): + msg = "You can only execute one statement at a time" + dataset = ( + "select 1; select 2", + "select 1; // c++ comments are not allowed", + "select 1; *not a comment", + "select 1; -*not a comment", + "select 1; /* */ a", + "select 1; /**/a", + "select 1; -", + "select 1; /", + "select 1; -\n- select 2", + """select 1; + -- comment + select 2 + """, + ) + for query in dataset: + with self.subTest(query=query): + with self.assertRaisesRegex(sqlite.ProgrammingError, msg): + self.cu.execute(query) - def test_execute_too_much_sql3(self): - self.cu.execute(""" + def test_execute_with_appended_comments(self): + dataset = ( + "select 1; -- foo bar", + "select 1; --", + "select 1; /*", # Unclosed comments ending in \0 are skipped. + """ select 5+4; /* foo */ - """) + """, + ) + for query in dataset: + with self.subTest(query=query): + self.cu.execute(query) def test_execute_wrong_sql_arg(self): with self.assertRaises(TypeError): @@ -857,6 +925,38 @@ def test_rowcount_executemany(self): self.cu.executemany("insert into test(name) values (?)", [(1,), (2,), (3,)]) self.assertEqual(self.cu.rowcount, 3) + @unittest.skipIf(sqlite.sqlite_version_info < (3, 35, 0), + "Requires SQLite 3.35.0 or newer") + def test_rowcount_update_returning(self): + # gh-93421: rowcount is updated correctly for UPDATE...RETURNING queries + self.cu.execute("update test set name='bar' where name='foo' returning 1") + self.assertEqual(self.cu.fetchone()[0], 1) + self.assertEqual(self.cu.rowcount, 1) + + def test_rowcount_prefixed_with_comment(self): + # gh-79579: rowcount is updated even if query is prefixed with comments + self.cu.execute(""" + -- foo + insert into test(name) values ('foo'), ('foo') + """) + self.assertEqual(self.cu.rowcount, 2) + self.cu.execute(""" + /* -- messy *r /* /* ** *- *-- + */ + /* one more */ insert into test(name) values ('messy') + """) + self.assertEqual(self.cu.rowcount, 1) + self.cu.execute("/* bar */ update test set name='bar' where name='foo'") + self.assertEqual(self.cu.rowcount, 3) + + def test_rowcount_vaccuum(self): + data = ((1,), (2,), (3,)) + self.cu.executemany("insert into test(income) values(?)", data) + self.assertEqual(self.cu.rowcount, 3) + self.cx.commit() + self.cu.execute("vacuum") + self.assertEqual(self.cu.rowcount, -1) + def test_total_changes(self): self.cu.execute("insert into test(name) values ('foo')") self.cu.execute("insert into test(name) values ('foo')") @@ -1358,6 +1458,7 @@ def test_blob_closed_db_read(self): blob.read) +@threading_helper.requires_working_threading() class ThreadTests(unittest.TestCase): def setUp(self): self.con = sqlite.connect(":memory:") @@ -1722,6 +1823,7 @@ def test_on_conflict_replace(self): self.assertEqual(self.cu.fetchall(), [('Very different data!', 'foo')]) +@requires_subprocess() class MultiprocessTests(unittest.TestCase): CONNECTION_TIMEOUT = SHORT_TIMEOUT / 1000. 
# Defaults to 30 ms diff --git a/Lib/test/test_sqlite3/test_dump.py b/Lib/test/test_sqlite3/test_dump.py index 1f14c620f0d24b..d0c24b9c60e613 100644 --- a/Lib/test/test_sqlite3/test_dump.py +++ b/Lib/test/test_sqlite3/test_dump.py @@ -2,6 +2,8 @@ import unittest import sqlite3 as sqlite +from .test_dbapi import memory_database + class DumpTests(unittest.TestCase): def setUp(self): @@ -49,6 +51,51 @@ def test_table_dump(self): [self.assertEqual(expected_sqls[i], actual_sqls[i]) for i in range(len(expected_sqls))] + def test_dump_autoincrement(self): + expected = [ + 'CREATE TABLE "t1" (id integer primary key autoincrement);', + 'INSERT INTO "t1" VALUES(NULL);', + 'CREATE TABLE "t2" (id integer primary key autoincrement);', + ] + self.cu.executescript("".join(expected)) + + # the NULL value should now be automatically be set to 1 + expected[1] = expected[1].replace("NULL", "1") + expected.insert(0, "BEGIN TRANSACTION;") + expected.extend([ + 'DELETE FROM "sqlite_sequence";', + 'INSERT INTO "sqlite_sequence" VALUES(\'t1\',1);', + 'COMMIT;', + ]) + + actual = [stmt for stmt in self.cx.iterdump()] + self.assertEqual(expected, actual) + + def test_dump_autoincrement_create_new_db(self): + self.cu.execute("BEGIN TRANSACTION") + self.cu.execute("CREATE TABLE t1 (id integer primary key autoincrement)") + self.cu.execute("CREATE TABLE t2 (id integer primary key autoincrement)") + self.cu.executemany("INSERT INTO t1 VALUES(?)", ((None,) for _ in range(9))) + self.cu.executemany("INSERT INTO t2 VALUES(?)", ((None,) for _ in range(4))) + self.cx.commit() + + with memory_database() as cx2: + query = "".join(self.cx.iterdump()) + cx2.executescript(query) + cu2 = cx2.cursor() + + dataset = ( + ("t1", 9), + ("t2", 4), + ) + for table, seq in dataset: + with self.subTest(table=table, seq=seq): + res = cu2.execute(""" + SELECT "seq" FROM "sqlite_sequence" WHERE "name" == ? 
+ """, (table,)) + rows = res.fetchall() + self.assertEqual(rows[0][0], seq) + def test_unorderable_row(self): # iterdump() should be able to cope with unorderable row types (issue #15545) class UnorderableRow: diff --git a/Lib/test/test_sqlite3/test_factory.py b/Lib/test/test_sqlite3/test_factory.py index 420855ba34b602..71603faa02840f 100644 --- a/Lib/test/test_sqlite3/test_factory.py +++ b/Lib/test/test_sqlite3/test_factory.py @@ -256,18 +256,6 @@ def test_custom(self): self.assertEqual(type(row[0]), str, "type of row[0] must be unicode") self.assertTrue(row[0].endswith("reich"), "column must contain original data") - def test_optimized_unicode(self): - # OptimizedUnicode is deprecated as of Python 3.10 - with self.assertWarns(DeprecationWarning) as cm: - self.con.text_factory = sqlite.OptimizedUnicode - self.assertIn("factory.py", cm.filename) - austria = "Österreich" - germany = "Deutchland" - a_row = self.con.execute("select ?", (austria,)).fetchone() - d_row = self.con.execute("select ?", (germany,)).fetchone() - self.assertEqual(type(a_row[0]), str, "type of non-ASCII row must be str") - self.assertEqual(type(d_row[0]), str, "type of ASCII-only row must be str") - def tearDown(self): self.con.close() diff --git a/Lib/test/test_sqlite3/test_regression.py b/Lib/test/test_sqlite3/test_regression.py index 19bb84bf38a367..0b727cecb0e8cb 100644 --- a/Lib/test/test_sqlite3/test_regression.py +++ b/Lib/test/test_sqlite3/test_regression.py @@ -28,7 +28,7 @@ from test import support from unittest.mock import patch -from test.test_sqlite3.test_dbapi import memory_database, managed_connect, cx_limit +from test.test_sqlite3.test_dbapi import memory_database, cx_limit class RegressionTests(unittest.TestCase): @@ -422,7 +422,7 @@ def test_return_empty_bytestring(self): self.assertEqual(val, b'') def test_table_lock_cursor_replace_stmt(self): - with managed_connect(":memory:", in_mem=True) as con: + with memory_database() as con: cur = con.cursor() cur.execute("create table t(t)") cur.executemany("insert into t values(?)", @@ -433,7 +433,7 @@ def test_table_lock_cursor_replace_stmt(self): con.commit() def test_table_lock_cursor_dealloc(self): - with managed_connect(":memory:", in_mem=True) as con: + with memory_database() as con: con.execute("create table t(t)") con.executemany("insert into t values(?)", ((v,) for v in range(5))) @@ -444,7 +444,7 @@ def test_table_lock_cursor_dealloc(self): con.commit() def test_table_lock_cursor_non_readonly_select(self): - with managed_connect(":memory:", in_mem=True) as con: + with memory_database() as con: con.execute("create table t(t)") con.executemany("insert into t values(?)", ((v,) for v in range(5))) @@ -459,7 +459,7 @@ def dup(v): con.commit() def test_executescript_step_through_select(self): - with managed_connect(":memory:", in_mem=True) as con: + with memory_database() as con: values = [(v,) for v in range(5)] with con: con.execute("create table t(t)") diff --git a/Lib/test/test_sqlite3/test_transactions.py b/Lib/test/test_sqlite3/test_transactions.py index 040ab1ee608cf1..9c3d19e79bd989 100644 --- a/Lib/test/test_sqlite3/test_transactions.py +++ b/Lib/test/test_sqlite3/test_transactions.py @@ -20,38 +20,36 @@ # misrepresented as being the original software. # 3. This notice may not be removed or altered from any source distribution. 
-import os, unittest +import unittest import sqlite3 as sqlite +from test.support import LOOPBACK_TIMEOUT +from test.support.os_helper import TESTFN, unlink + from test.test_sqlite3.test_dbapi import memory_database -def get_db_path(): - return "sqlite_testdb" + +TIMEOUT = LOOPBACK_TIMEOUT / 10 + class TransactionTests(unittest.TestCase): def setUp(self): - try: - os.remove(get_db_path()) - except OSError: - pass - - self.con1 = sqlite.connect(get_db_path(), timeout=0.1) + self.con1 = sqlite.connect(TESTFN, timeout=TIMEOUT) self.cur1 = self.con1.cursor() - self.con2 = sqlite.connect(get_db_path(), timeout=0.1) + self.con2 = sqlite.connect(TESTFN, timeout=TIMEOUT) self.cur2 = self.con2.cursor() def tearDown(self): - self.cur1.close() - self.con1.close() + try: + self.cur1.close() + self.con1.close() - self.cur2.close() - self.con2.close() + self.cur2.close() + self.con2.close() - try: - os.unlink(get_db_path()) - except OSError: - pass + finally: + unlink(TESTFN) def test_dml_does_not_auto_commit_before(self): self.cur1.execute("create table test(i)") @@ -141,6 +139,45 @@ def test_rollback_cursor_consistency(self): con.rollback() self.assertEqual(cur.fetchall(), [(1,), (2,), (3,)]) + def test_multiple_cursors_and_iternext(self): + # gh-94028: statements are cleared and reset in cursor iternext. + + # Provoke the gh-94028 by using a cursor cache. + CURSORS = {} + def sql(cx, sql, *args): + cu = cx.cursor() + cu.execute(sql, args) + CURSORS[id(sql)] = cu + return cu + + self.con1.execute("create table t(t)") + sql(self.con1, "insert into t values (?), (?), (?)", "u1", "u2", "u3") + self.con1.commit() + + # On second connection, verify rows are visible, then delete them. + count = sql(self.con2, "select count(*) from t").fetchone()[0] + self.assertEqual(count, 3) + changes = sql(self.con2, "delete from t").rowcount + self.assertEqual(changes, 3) + self.con2.commit() + + # Back in original connection, create 2 new users. + sql(self.con1, "insert into t values (?)", "u4") + sql(self.con1, "insert into t values (?)", "u5") + + # The second connection cannot see uncommitted changes. + count = sql(self.con2, "select count(*) from t").fetchone()[0] + self.assertEqual(count, 0) + + # First connection can see its own changes. + count = sql(self.con1, "select count(*) from t").fetchone()[0] + self.assertEqual(count, 2) + + # The second connection can now see the changes. 
+ self.con1.commit() + count = sql(self.con2, "select count(*) from t").fetchone()[0] + self.assertEqual(count, 2) + class RollbackTests(unittest.TestCase): """bpo-44092: sqlite3 now leaves it to SQLite to resolve rollback issues""" diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py index fed76378726c92..b41ce98a6d9940 100644 --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -38,8 +38,7 @@ from ssl import TLSVersion, _TLSContentType, _TLSMessageType, _TLSAlertType -Py_DEBUG = hasattr(sys, 'gettotalrefcount') -Py_DEBUG_WIN32 = Py_DEBUG and sys.platform == 'win32' +Py_DEBUG_WIN32 = support.Py_DEBUG and sys.platform == 'win32' PROTOCOLS = sorted(ssl._PROTOCOL_NAMES) HOST = socket_helper.HOST @@ -383,10 +382,6 @@ def test_random(self): % (v, (v and "sufficient randomness") or "insufficient randomness")) - with warnings_helper.check_warnings(): - data, is_cryptographic = ssl.RAND_pseudo_bytes(16) - self.assertEqual(len(data), 16) - self.assertEqual(is_cryptographic, v == 1) if v: data = ssl.RAND_bytes(16) self.assertEqual(len(data), 16) @@ -395,8 +390,6 @@ def test_random(self): # negative num is invalid self.assertRaises(ValueError, ssl.RAND_bytes, -5) - with warnings_helper.check_warnings(): - self.assertRaises(ValueError, ssl.RAND_pseudo_bytes, -5) ssl.RAND_add("this is a random string", 75.0) ssl.RAND_add(b"this is a random bytes object", 75.0) @@ -688,205 +681,6 @@ def test_malformed_key(self): """Wrapping with a badly formatted key (syntax error)""" self.bad_cert_test("badkey.pem") - @ignore_deprecation - def test_match_hostname(self): - def ok(cert, hostname): - ssl.match_hostname(cert, hostname) - def fail(cert, hostname): - self.assertRaises(ssl.CertificateError, - ssl.match_hostname, cert, hostname) - - # -- Hostname matching -- - - cert = {'subject': ((('commonName', 'example.com'),),)} - ok(cert, 'example.com') - ok(cert, 'ExAmple.cOm') - fail(cert, 'www.example.com') - fail(cert, '.example.com') - fail(cert, 'example.org') - fail(cert, 'exampleXcom') - - cert = {'subject': ((('commonName', '*.a.com'),),)} - ok(cert, 'foo.a.com') - fail(cert, 'bar.foo.a.com') - fail(cert, 'a.com') - fail(cert, 'Xa.com') - fail(cert, '.a.com') - - # only match wildcards when they are the only thing - # in left-most segment - cert = {'subject': ((('commonName', 'f*.com'),),)} - fail(cert, 'foo.com') - fail(cert, 'f.com') - fail(cert, 'bar.com') - fail(cert, 'foo.a.com') - fail(cert, 'bar.foo.com') - - # NULL bytes are bad, CVE-2013-4073 - cert = {'subject': ((('commonName', - 'null.python.org\x00example.org'),),)} - ok(cert, 'null.python.org\x00example.org') # or raise an error? - fail(cert, 'example.org') - fail(cert, 'null.python.org') - - # error cases with wildcards - cert = {'subject': ((('commonName', '*.*.a.com'),),)} - fail(cert, 'bar.foo.a.com') - fail(cert, 'a.com') - fail(cert, 'Xa.com') - fail(cert, '.a.com') - - cert = {'subject': ((('commonName', 'a.*.com'),),)} - fail(cert, 'a.foo.com') - fail(cert, 'a..com') - fail(cert, 'a.com') - - # wildcard doesn't match IDNA prefix 'xn--' - idna = 'püthon.python.org'.encode("idna").decode("ascii") - cert = {'subject': ((('commonName', idna),),)} - ok(cert, idna) - cert = {'subject': ((('commonName', 'x*.python.org'),),)} - fail(cert, idna) - cert = {'subject': ((('commonName', 'xn--p*.python.org'),),)} - fail(cert, idna) - - # wildcard in first fragment and IDNA A-labels in sequent fragments - # are supported. 
- idna = 'www*.pythön.org'.encode("idna").decode("ascii") - cert = {'subject': ((('commonName', idna),),)} - fail(cert, 'www.pythön.org'.encode("idna").decode("ascii")) - fail(cert, 'www1.pythön.org'.encode("idna").decode("ascii")) - fail(cert, 'ftp.pythön.org'.encode("idna").decode("ascii")) - fail(cert, 'pythön.org'.encode("idna").decode("ascii")) - - # Slightly fake real-world example - cert = {'notAfter': 'Jun 26 21:41:46 2011 GMT', - 'subject': ((('commonName', 'linuxfrz.org'),),), - 'subjectAltName': (('DNS', 'linuxfr.org'), - ('DNS', 'linuxfr.com'), - ('othername', ''))} - ok(cert, 'linuxfr.org') - ok(cert, 'linuxfr.com') - # Not a "DNS" entry - fail(cert, '') - # When there is a subjectAltName, commonName isn't used - fail(cert, 'linuxfrz.org') - - # A pristine real-world example - cert = {'notAfter': 'Dec 18 23:59:59 2011 GMT', - 'subject': ((('countryName', 'US'),), - (('stateOrProvinceName', 'California'),), - (('localityName', 'Mountain View'),), - (('organizationName', 'Google Inc'),), - (('commonName', 'mail.google.com'),))} - ok(cert, 'mail.google.com') - fail(cert, 'gmail.com') - # Only commonName is considered - fail(cert, 'California') - - # -- IPv4 matching -- - cert = {'subject': ((('commonName', 'example.com'),),), - 'subjectAltName': (('DNS', 'example.com'), - ('IP Address', '10.11.12.13'), - ('IP Address', '14.15.16.17'), - ('IP Address', '127.0.0.1'))} - ok(cert, '10.11.12.13') - ok(cert, '14.15.16.17') - # socket.inet_ntoa(socket.inet_aton('127.1')) == '127.0.0.1' - fail(cert, '127.1') - fail(cert, '14.15.16.17 ') - fail(cert, '14.15.16.17 extra data') - fail(cert, '14.15.16.18') - fail(cert, 'example.net') - - # -- IPv6 matching -- - if socket_helper.IPV6_ENABLED: - cert = {'subject': ((('commonName', 'example.com'),),), - 'subjectAltName': ( - ('DNS', 'example.com'), - ('IP Address', '2001:0:0:0:0:0:0:CAFE\n'), - ('IP Address', '2003:0:0:0:0:0:0:BABA\n'))} - ok(cert, '2001::cafe') - ok(cert, '2003::baba') - fail(cert, '2003::baba ') - fail(cert, '2003::baba extra data') - fail(cert, '2003::bebe') - fail(cert, 'example.net') - - # -- Miscellaneous -- - - # Neither commonName nor subjectAltName - cert = {'notAfter': 'Dec 18 23:59:59 2011 GMT', - 'subject': ((('countryName', 'US'),), - (('stateOrProvinceName', 'California'),), - (('localityName', 'Mountain View'),), - (('organizationName', 'Google Inc'),))} - fail(cert, 'mail.google.com') - - # No DNS entry in subjectAltName but a commonName - cert = {'notAfter': 'Dec 18 23:59:59 2099 GMT', - 'subject': ((('countryName', 'US'),), - (('stateOrProvinceName', 'California'),), - (('localityName', 'Mountain View'),), - (('commonName', 'mail.google.com'),)), - 'subjectAltName': (('othername', 'blabla'), )} - ok(cert, 'mail.google.com') - - # No DNS entry subjectAltName and no commonName - cert = {'notAfter': 'Dec 18 23:59:59 2099 GMT', - 'subject': ((('countryName', 'US'),), - (('stateOrProvinceName', 'California'),), - (('localityName', 'Mountain View'),), - (('organizationName', 'Google Inc'),)), - 'subjectAltName': (('othername', 'blabla'),)} - fail(cert, 'google.com') - - # Empty cert / no cert - self.assertRaises(ValueError, ssl.match_hostname, None, 'example.com') - self.assertRaises(ValueError, ssl.match_hostname, {}, 'example.com') - - # Issue #17980: avoid denials of service by refusing more than one - # wildcard per fragment. 
- cert = {'subject': ((('commonName', 'a*b.example.com'),),)} - with self.assertRaisesRegex( - ssl.CertificateError, - "partial wildcards in leftmost label are not supported"): - ssl.match_hostname(cert, 'axxb.example.com') - - cert = {'subject': ((('commonName', 'www.*.example.com'),),)} - with self.assertRaisesRegex( - ssl.CertificateError, - "wildcard can only be present in the leftmost label"): - ssl.match_hostname(cert, 'www.sub.example.com') - - cert = {'subject': ((('commonName', 'a*b*.example.com'),),)} - with self.assertRaisesRegex( - ssl.CertificateError, - "too many wildcards"): - ssl.match_hostname(cert, 'axxbxxc.example.com') - - cert = {'subject': ((('commonName', '*'),),)} - with self.assertRaisesRegex( - ssl.CertificateError, - "sole wildcard without additional labels are not support"): - ssl.match_hostname(cert, 'host') - - cert = {'subject': ((('commonName', '*.com'),),)} - with self.assertRaisesRegex( - ssl.CertificateError, - r"hostname 'com' doesn't match '\*.com'"): - ssl.match_hostname(cert, 'com') - - # extra checks for _inet_paton() - for invalid in ['1', '', '1.2.3', '256.0.0.1', '127.0.0.1/24']: - with self.assertRaises(ValueError): - ssl._inet_paton(invalid) - for ipaddr in ['127.0.0.1', '192.168.0.1']: - self.assertTrue(ssl._inet_paton(ipaddr)) - if socket_helper.IPV6_ENABLED: - for ipaddr in ['::1', '2001:db8:85a3::8a2e:370:7334']: - self.assertTrue(ssl._inet_paton(ipaddr)) - def test_server_side(self): # server_hostname doesn't work for server sockets ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER) @@ -1657,7 +1451,8 @@ def test_load_default_certs_env(self): self.assertEqual(ctx.cert_store_stats(), {"crl": 0, "x509": 1, "x509_ca": 0}) @unittest.skipUnless(sys.platform == "win32", "Windows specific") - @unittest.skipIf(hasattr(sys, "gettotalrefcount"), "Debug build does not share environment between CRTs") + @unittest.skipIf(support.Py_DEBUG, + "Debug build does not share environment between CRTs") def test_load_default_certs_env_windows(self): ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT) ctx.load_default_certs() @@ -2262,11 +2057,8 @@ def ssl_io_loop(self, sock, incoming, outgoing, func, *args, **kwargs): # A simple IO loop. Call func(*args) depending on the error we get # (WANT_READ or WANT_WRITE) move data between the socket and the BIOs. 
timeout = kwargs.get('timeout', support.SHORT_TIMEOUT) - deadline = time.monotonic() + timeout count = 0 - while True: - if time.monotonic() > deadline: - self.fail("timeout") + for _ in support.busy_retry(timeout): errno = None count += 1 try: diff --git a/Lib/test/test_stable_abi_ctypes.py b/Lib/test/test_stable_abi_ctypes.py index 18c85061ca0893..53e93ab6b9b4c9 100644 --- a/Lib/test/test_stable_abi_ctypes.py +++ b/Lib/test/test_stable_abi_ctypes.py @@ -660,6 +660,7 @@ def test_windows_feature_macros(self): "PyTuple_Size", "PyTuple_Type", "PyType_ClearCache", + "PyType_FromMetaclass", "PyType_FromModuleAndSpec", "PyType_FromSpec", "PyType_FromSpecWithBases", diff --git a/Lib/test/test_stat.py b/Lib/test/test_stat.py index 2e1e2c349c8d09..4ba37aed2dc9db 100644 --- a/Lib/test/test_stat.py +++ b/Lib/test/test_stat.py @@ -113,6 +113,7 @@ def assertS_IS(self, name, mode): else: self.assertFalse(func(mode)) + @os_helper.skip_unless_working_chmod def test_mode(self): with open(TESTFN, 'w'): pass @@ -151,6 +152,7 @@ def test_mode(self): self.assertEqual(self.statmod.S_IFMT(st_mode), self.statmod.S_IFREG) + @os_helper.skip_unless_working_chmod def test_directory(self): os.mkdir(TESTFN) os.chmod(TESTFN, 0o700) @@ -161,7 +163,7 @@ def test_directory(self): else: self.assertEqual(modestr[0], 'd') - @unittest.skipUnless(hasattr(os, 'symlink'), 'os.symlink not available') + @os_helper.skip_unless_symlink def test_link(self): try: os.symlink(os.getcwd(), TESTFN) diff --git a/Lib/test/test_struct.py b/Lib/test/test_struct.py index 94873ff6124d02..8f14ed30588810 100644 --- a/Lib/test/test_struct.py +++ b/Lib/test/test_struct.py @@ -1,10 +1,12 @@ from collections import abc import array +import gc import math import operator import unittest import struct import sys +import weakref from test import support from test.support import import_helper @@ -672,6 +674,21 @@ def __del__(self): self.assertIn(b"Exception ignored in:", stderr) self.assertIn(b"C.__del__", stderr) + def test__struct_reference_cycle_cleaned_up(self): + # Regression test for python/cpython#94207. + + # When we create a new struct module, trigger use of its cache, + # and then delete it ... + _struct_module = import_helper.import_fresh_module("_struct") + module_ref = weakref.ref(_struct_module) + _struct_module.calcsize("b") + del _struct_module + + # Then the module should have been garbage collected. + gc.collect() + self.assertIsNone( + module_ref(), "_struct module was not garbage collected") + def test_issue35714(self): # Embedded null characters should not be allowed in format strings. 
for s in '\0', '2\0i', b'\0': diff --git a/Lib/test/test_support.py b/Lib/test/test_support.py index dce49809385c67..7738ca5e9b433d 100644 --- a/Lib/test/test_support.py +++ b/Lib/test/test_support.py @@ -9,7 +9,6 @@ import sys import tempfile import textwrap -import time import unittest import warnings @@ -461,18 +460,12 @@ def test_reap_children(self): # child process: do nothing, just exit os._exit(0) - t0 = time.monotonic() - deadline = time.monotonic() + support.SHORT_TIMEOUT - was_altered = support.environment_altered try: support.environment_altered = False stderr = io.StringIO() - while True: - if time.monotonic() > deadline: - self.fail("timeout") - + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): with support.swap_attr(support.print_warning, 'orig_stderr', stderr): support.reap_children() @@ -481,9 +474,6 @@ def test_reap_children(self): if support.environment_altered: break - # loop until the child process completed - time.sleep(0.100) - msg = "Warning -- reap_children() reaped child process %s" % pid self.assertIn(msg, stderr.getvalue()) self.assertTrue(support.environment_altered) @@ -664,6 +654,7 @@ def id(self): self.assertTrue(support.match_test(test_chdir)) @unittest.skipIf(support.is_emscripten, "Unstable in Emscripten") + @unittest.skipIf(support.is_wasi, "Unavailable on WASI") def test_fd_count(self): # We cannot test the absolute value of fd_count(): on old Linux # kernel or glibc versions, os.urandom() keeps a FD open on @@ -691,7 +682,7 @@ def test_print_warning(self): 'Warning -- a\nWarning -- b\n') def test_has_strftime_extensions(self): - if support.is_emscripten or support.is_wasi or sys.platform == "win32": + if support.is_emscripten or sys.platform == "win32": self.assertFalse(support.has_strftime_extensions) else: self.assertTrue(support.has_strftime_extensions) diff --git a/Lib/test/test_syntax.py b/Lib/test/test_syntax.py index 96e5c129c6599c..a9193b0b30cb67 100644 --- a/Lib/test/test_syntax.py +++ b/Lib/test/test_syntax.py @@ -607,7 +607,7 @@ >>> class C(x for x in L): ... pass Traceback (most recent call last): -SyntaxError: expected ':' +SyntaxError: invalid syntax >>> def g(*args, **kwargs): ... print(args, sorted(kwargs.items())) @@ -963,17 +963,22 @@ ... SyntaxError: cannot assign to function call here. Maybe you meant '==' instead of '='? - Missing ':' before suites: +Missing ':' before suites: - >>> def f() - ... pass - Traceback (most recent call last): - SyntaxError: expected ':' + >>> def f() + ... pass + Traceback (most recent call last): + SyntaxError: expected ':' - >>> class A - ... pass - Traceback (most recent call last): - SyntaxError: expected ':' + >>> class A + ... pass + Traceback (most recent call last): + SyntaxError: expected ':' + + >>> class R&D: + ... pass + Traceback (most recent call last): + SyntaxError: invalid syntax >>> if 1 ... pass @@ -1007,6 +1012,11 @@ Traceback (most recent call last): SyntaxError: expected ':' + >>> for x in range 10: + ... pass + Traceback (most recent call last): + SyntaxError: invalid syntax + >>> while True ... pass Traceback (most recent call last): @@ -1052,6 +1062,11 @@ Traceback (most recent call last): SyntaxError: expected ':' + >>> with block ad something: + ... pass + Traceback (most recent call last): + SyntaxError: invalid syntax + >>> try ... pass Traceback (most recent call last): @@ -1070,6 +1085,12 @@ Traceback (most recent call last): SyntaxError: expected ':' + >>> match x x: + ... case list(): + ... 
pass + Traceback (most recent call last): + SyntaxError: invalid syntax + >>> match x: ... case list() ... pass diff --git a/Lib/test/test_sys.py b/Lib/test/test_sys.py index 9c0f4a69289d2e..1dc10d8b0a39ac 100644 --- a/Lib/test/test_sys.py +++ b/Lib/test/test_sys.py @@ -1432,9 +1432,10 @@ def get_gen(): yield 1 import re check(re.finditer('',''), size('2P')) # list - samples = [[], [1,2,3], ['1', '2', '3']] - for sample in samples: - check(list(sample), vsize('Pn') + len(sample)*self.P) + check(list([]), vsize('Pn')) + check(list([1]), vsize('Pn') + 2*self.P) + check(list([1, 2]), vsize('Pn') + 2*self.P) + check(list([1, 2, 3]), vsize('Pn') + 4*self.P) # sortwrapper (list) # XXX # cmpwrapper (list) @@ -1538,6 +1539,7 @@ class newstyleclass(object): pass samples = ['1'*100, '\xff'*50, '\u0100'*40, '\uffff'*100, '\U00010000'*30, '\U0010ffff'*100] + # also update field definitions in test_unicode.test_raiseMemError asciifields = "nnb" compactfields = asciifields + "nP" unicodefields = compactfields + "P" diff --git a/Lib/test/test_sys_settrace.py b/Lib/test/test_sys_settrace.py index 9cc6bcfcab5e1c..162fd8328582ca 100644 --- a/Lib/test/test_sys_settrace.py +++ b/Lib/test/test_sys_settrace.py @@ -609,6 +609,58 @@ def run(tracer): self.compare_events(doit_async.__code__.co_firstlineno, tracer.events, events) + def test_async_for_backwards_jump_has_no_line(self): + async def arange(n): + for i in range(n): + yield i + async def f(): + async for i in arange(3): + if i > 100: + break # should never be traced + + tracer = self.make_tracer() + coro = f() + try: + sys.settrace(tracer.trace) + coro.send(None) + except Exception: + pass + finally: + sys.settrace(None) + + events = [ + (0, 'call'), + (1, 'line'), + (-3, 'call'), + (-2, 'line'), + (-1, 'line'), + (-1, 'return'), + (1, 'exception'), + (2, 'line'), + (1, 'line'), + (-1, 'call'), + (-2, 'line'), + (-1, 'line'), + (-1, 'return'), + (1, 'exception'), + (2, 'line'), + (1, 'line'), + (-1, 'call'), + (-2, 'line'), + (-1, 'line'), + (-1, 'return'), + (1, 'exception'), + (2, 'line'), + (1, 'line'), + (-1, 'call'), + (-2, 'line'), + (-2, 'return'), + (1, 'exception'), + (1, 'return'), + ] + self.compare_events(f.__code__.co_firstlineno, + tracer.events, events) + def test_21_repeated_pass(self): def func(): pass diff --git a/Lib/test/test_sysconfig.py b/Lib/test/test_sysconfig.py index f2b93706b270a7..27722fbe11ee17 100644 --- a/Lib/test/test_sysconfig.py +++ b/Lib/test/test_sysconfig.py @@ -5,11 +5,12 @@ import shutil from copy import copy -from test.support import (captured_stdout, PythonSymlink, requires_subprocess) +from test.support import ( + captured_stdout, PythonSymlink, requires_subprocess, is_wasi +) from test.support.import_helper import import_module from test.support.os_helper import (TESTFN, unlink, skip_unless_symlink, change_cwd) -from test.support.warnings_helper import check_warnings import sysconfig from sysconfig import (get_paths, get_platform, get_config_vars, @@ -328,6 +329,7 @@ def test_get_platform(self): # XXX more platforms to tests here + @unittest.skipIf(is_wasi, "Incompatible with WASI mapdir and OOT builds") def test_get_config_h_filename(self): config_h = sysconfig.get_config_h_filename() self.assertTrue(os.path.isfile(config_h), config_h) @@ -367,7 +369,7 @@ def test_user_similar(self): base = base.replace(sys.base_prefix, sys.prefix) if HAS_USER_BASE: user_path = get_path(name, 'posix_user') - expected = global_path.replace(base, user, 1) + expected = os.path.normpath(global_path.replace(base, user, 1)) # 
bpo-44860: platlib of posix_user doesn't use sys.platlibdir, # whereas posix_prefix does. if name == 'platlib': @@ -499,6 +501,7 @@ class MakefileTests(unittest.TestCase): @unittest.skipIf(sys.platform.startswith('win'), 'Test is not Windows compatible') + @unittest.skipIf(is_wasi, "Incompatible with WASI mapdir and OOT builds") def test_get_makefile_filename(self): makefile = sysconfig.get_makefile_filename() self.assertTrue(os.path.isfile(makefile), makefile) diff --git a/Lib/test/test_tarfile.py b/Lib/test/test_tarfile.py index 12850cd635e995..0868d5d6e90915 100644 --- a/Lib/test/test_tarfile.py +++ b/Lib/test/test_tarfile.py @@ -228,6 +228,7 @@ def test_add_dir_getmember(self): def add_dir_and_getmember(self, name): with os_helper.temp_cwd(): with tarfile.open(tmpname, 'w') as tar: + tar.format = tarfile.USTAR_FORMAT try: os.mkdir(name) tar.add(name) @@ -630,6 +631,7 @@ def test_extract_hardlink(self): data = f.read() self.assertEqual(sha256sum(data), sha256_regtype) + @os_helper.skip_unless_working_chmod def test_extractall(self): # Test if extractall() correctly restores directory permissions # and times (see issue1735). @@ -660,6 +662,7 @@ def format_mtime(mtime): tar.close() os_helper.rmtree(DIR) + @os_helper.skip_unless_working_chmod def test_extract_directory(self): dirtype = "ustar/dirtype" DIR = os.path.join(TEMPDIR, "extractdir") @@ -1018,11 +1021,26 @@ def test_header_offset(self): "iso8859-1", "strict") self.assertEqual(tarinfo.type, self.longnametype) + def test_longname_directory(self): + # Test reading a longlink directory. Issue #47231. + longdir = ('a' * 101) + '/' + with os_helper.temp_cwd(): + with tarfile.open(tmpname, 'w') as tar: + tar.format = self.format + try: + os.mkdir(longdir) + tar.add(longdir) + finally: + os.rmdir(longdir.rstrip("/")) + with tarfile.open(tmpname) as tar: + self.assertIsNotNone(tar.getmember(longdir)) + self.assertIsNotNone(tar.getmember(longdir.removesuffix('/'))) class GNUReadTest(LongnameTest, ReadTest, unittest.TestCase): subdir = "gnu" longnametype = tarfile.GNUTYPE_LONGNAME + format = tarfile.GNU_FORMAT # Since 3.2 tarfile is supposed to accurately restore sparse members and # produce files with holes. This is what we actually want to test here. @@ -1082,6 +1100,7 @@ class PaxReadTest(LongnameTest, ReadTest, unittest.TestCase): subdir = "pax" longnametype = tarfile.XHDTYPE + format = tarfile.PAX_FORMAT def test_pax_global_headers(self): tar = tarfile.open(tarname, encoding="iso8859-1") @@ -1498,7 +1517,10 @@ def test_stream_padding(self): @unittest.skipUnless(sys.platform != "win32" and hasattr(os, "umask"), "Missing umask implementation") - @unittest.skipIf(support.is_emscripten, "Emscripten's umask is a stub.") + @unittest.skipIf( + support.is_emscripten or support.is_wasi, + "Emscripten's/WASI's umask is a stub." + ) def test_file_mode(self): # Test for issue #8464: Create files with correct # permissions. @@ -1532,6 +1554,75 @@ class Bz2StreamWriteTest(Bz2Test, StreamWriteTest): class LzmaStreamWriteTest(LzmaTest, StreamWriteTest): decompressor = lzma.LZMADecompressor if lzma else None +class _CompressedWriteTest(TarTest): + # This is not actually a standalone test. 
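# Sketch, not part of the patch: one practical difference between the tar formats
# pinned down in the test_tarfile hunks above.  A plain USTAR header cannot store
# a single path component longer than 100 characters, while GNU (and PAX)
# archives can; the member name below is illustrative.
import io
import tarfile

longname = "d" * 101
with tarfile.open(fileobj=io.BytesIO(), mode="w",
                  format=tarfile.GNU_FORMAT) as tar:
    tar.addfile(tarfile.TarInfo(longname))      # stored via a GNU longname record

try:
    with tarfile.open(fileobj=io.BytesIO(), mode="w",
                      format=tarfile.USTAR_FORMAT) as tar:
        tar.addfile(tarfile.TarInfo(longname))
except ValueError:
    pass                                        # "name is too long" under USTAR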
+ # It does not inherit WriteTest because it only makes sense with gz,bz2 + source = (b"And we move to Bristol where they have a special, " + + b"Very Silly candidate") + + def _compressed_tar(self, compresslevel): + fobj = io.BytesIO() + with tarfile.open(tmpname, self.mode, fobj, + compresslevel=compresslevel) as tarfl: + tarfl.addfile(tarfile.TarInfo("foo"), io.BytesIO(self.source)) + return fobj + + def _test_bz2_header(self, compresslevel): + fobj = self._compressed_tar(compresslevel) + self.assertEqual(fobj.getvalue()[0:10], + b"BZh%d1AY&SY" % compresslevel) + + def _test_gz_header(self, compresslevel): + fobj = self._compressed_tar(compresslevel) + self.assertEqual(fobj.getvalue()[:3], b"\x1f\x8b\x08") + +class Bz2CompressWriteTest(Bz2Test, _CompressedWriteTest, unittest.TestCase): + prefix = "w:" + def test_compression_levels(self): + self._test_bz2_header(1) + self._test_bz2_header(5) + self._test_bz2_header(9) + +class Bz2CompressStreamWriteTest(Bz2Test, _CompressedWriteTest, + unittest.TestCase): + prefix = "w|" + def test_compression_levels(self): + self._test_bz2_header(1) + self._test_bz2_header(5) + self._test_bz2_header(9) + +class GzCompressWriteTest(GzipTest, _CompressedWriteTest, unittest.TestCase): + prefix = "w:" + def test_compression_levels(self): + self._test_gz_header(1) + self._test_gz_header(5) + self._test_gz_header(9) + +class GzCompressStreamWriteTest(GzipTest, _CompressedWriteTest, + unittest.TestCase): + prefix = "w|" + def test_compression_levels(self): + self._test_gz_header(1) + self._test_gz_header(5) + self._test_gz_header(9) + +class CompressLevelRaises(unittest.TestCase): + def test_compresslevel_wrong_modes(self): + compresslevel = 5 + fobj = io.BytesIO() + with self.assertRaises(TypeError): + tarfile.open(tmpname, "w:", fobj, compresslevel=compresslevel) + + @support.requires_bz2() + def test_wrong_compresslevels(self): + # BZ2 checks that the compresslevel is in [1,9]. gz does not + fobj = io.BytesIO() + with self.assertRaises(ValueError): + tarfile.open(tmpname, "w:bz2", fobj, compresslevel=0) + with self.assertRaises(ValueError): + tarfile.open(tmpname, "w:bz2", fobj, compresslevel=10) + with self.assertRaises(ValueError): + tarfile.open(tmpname, "w|bz2", fobj, compresslevel=10) class GNUWriteTest(unittest.TestCase): # This testcase checks for correct creation of GNU Longname diff --git a/Lib/test/test_tcl.py b/Lib/test/test_tcl.py index 548914796ed762..cd79024ab2c8e3 100644 --- a/Lib/test/test_tcl.py +++ b/Lib/test/test_tcl.py @@ -1,10 +1,7 @@ import unittest -import locale -import re import subprocess import sys import os -import warnings from test import support from test.support import import_helper from test.support import os_helper diff --git a/Lib/test/test_tempfile.py b/Lib/test/test_tempfile.py index f056e5ccb17f92..20f88d8c71e89a 100644 --- a/Lib/test/test_tempfile.py +++ b/Lib/test/test_tempfile.py @@ -450,6 +450,7 @@ def test_choose_directory(self): support.gc_collect() # For PyPy or other GCs. 
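# Sketch, not part of the patch: the stream signatures the _CompressedWriteTest
# hunk above checks.  A bz2 stream starts with b"BZh" followed by the compression
# level digit; a gzip stream starts with the magic bytes 0x1f 0x8b plus the
# deflate method byte 0x08, regardless of level.
import bz2
import gzip
import io

assert bz2.compress(b"payload", compresslevel=5)[:4] == b"BZh5"

buf = io.BytesIO()
with gzip.GzipFile(fileobj=buf, mode="wb", compresslevel=5) as f:
    f.write(b"payload")
assert buf.getvalue()[:3] == b"\x1f\x8b\x08"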
os.rmdir(dir) + @os_helper.skip_unless_working_chmod def test_file_mode(self): # _mkstemp_inner creates files with the proper mode @@ -787,6 +788,7 @@ def test_choose_directory(self): finally: os.rmdir(dir) + @os_helper.skip_unless_working_chmod def test_mode(self): # mkdtemp creates directories with the proper mode diff --git a/Lib/test/test_thread.py b/Lib/test/test_thread.py index ed527e7164fd0a..2ae5e9ccb44088 100644 --- a/Lib/test/test_thread.py +++ b/Lib/test/test_thread.py @@ -13,7 +13,6 @@ NUMTASKS = 10 NUMTRIPS = 3 -POLL_SLEEP = 0.010 # seconds = 10 ms _print_mutex = thread.allocate_lock() @@ -121,19 +120,24 @@ def task(): with threading_helper.wait_threads_exit(): thread.start_new_thread(task, ()) - while not started: - time.sleep(POLL_SLEEP) + for _ in support.sleeping_retry(support.LONG_TIMEOUT): + if started: + break self.assertEqual(thread._count(), orig + 1) + # Allow the task to finish. mut.release() + # The only reliable way to be sure that the thread ended from the - # interpreter's point of view is to wait for the function object to be - # destroyed. + # interpreter's point of view is to wait for the function object to + # be destroyed. done = [] wr = weakref.ref(task, lambda _: done.append(None)) del task - while not done: - time.sleep(POLL_SLEEP) + + for _ in support.sleeping_retry(support.LONG_TIMEOUT): + if done: + break support.gc_collect() # For PyPy or other GCs. self.assertEqual(thread._count(), orig) diff --git a/Lib/test/test_threading.py b/Lib/test/test_threading.py index a1bf354e65e2e1..dcd27697bb4839 100644 --- a/Lib/test/test_threading.py +++ b/Lib/test/test_threading.py @@ -33,9 +33,6 @@ # on platforms known to behave badly. platforms_to_skip = ('netbsd5', 'hp-ux11') -# Is Python built with Py_DEBUG macro defined? -Py_DEBUG = hasattr(sys, 'gettotalrefcount') - def restore_default_excepthook(testcase): testcase.addCleanup(setattr, threading, 'excepthook', threading.excepthook) @@ -149,6 +146,7 @@ def test_args_argument(self): with self.subTest(target=target, args=args): t = threading.Thread(target=target, args=args) t.start() + t.join() @cpython_only def test_disallow_instantiation(self): diff --git a/Lib/test/test_threading_local.py b/Lib/test/test_threading_local.py index 9888e631881fa6..48e302930ff7e9 100644 --- a/Lib/test/test_threading_local.py +++ b/Lib/test/test_threading_local.py @@ -4,7 +4,6 @@ from test import support from test.support import threading_helper import weakref -import gc # Modules under test import _thread diff --git a/Lib/test/test_time.py b/Lib/test/test_time.py index dc0bbb0ee29311..884b14231f5737 100644 --- a/Lib/test/test_time.py +++ b/Lib/test/test_time.py @@ -489,6 +489,9 @@ def test_monotonic(self): def test_perf_counter(self): time.perf_counter() + @unittest.skipIf( + support.is_wasi, "process_time not available on WASI" + ) def test_process_time(self): # process_time() should not include time spend during a sleep start = time.process_time() diff --git a/Lib/test/test_tk.py b/Lib/test/test_tk.py deleted file mode 100644 index 8f90cbaba9f7c4..00000000000000 --- a/Lib/test/test_tk.py +++ /dev/null @@ -1,20 +0,0 @@ -import unittest -from test import support -from test.support import import_helper -from test.support import check_sanitizer - -if check_sanitizer(address=True, memory=True): - raise unittest.SkipTest("Tests involvin libX11 can SEGFAULT on ASAN/MSAN builds") - -# Skip test if _tkinter wasn't built. -import_helper.import_module('_tkinter') - -# Skip test if tk cannot be initialized. 
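# Sketch, not part of the patch: the weakref trick the test_thread hunk above
# keeps using to confirm that a function object has really been destroyed before
# the thread count is checked.
import gc
import weakref

def task():
    pass

done = []
wr = weakref.ref(task, lambda _: done.append(None))   # callback fires on collection
del task
gc.collect()           # needed for GCs without reference counting (e.g. PyPy)
assert done            # the callback has run, so the function is gone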
-support.requires('gui') - -def load_tests(loader, tests, pattern): - return loader.discover('tkinter.test.test_tkinter') - - -if __name__ == '__main__': - unittest.main() diff --git a/Lib/tkinter/test/README b/Lib/test/test_tkinter/README similarity index 100% rename from Lib/tkinter/test/README rename to Lib/test/test_tkinter/README diff --git a/Lib/test/test_tkinter/__init__.py b/Lib/test/test_tkinter/__init__.py new file mode 100644 index 00000000000000..edcb44951bde36 --- /dev/null +++ b/Lib/test/test_tkinter/__init__.py @@ -0,0 +1,18 @@ +import os.path +import unittest +from test import support +from test.support import import_helper + + +if support.check_sanitizer(address=True, memory=True): + raise unittest.SkipTest("Tests involving libX11 can SEGFAULT on ASAN/MSAN builds") + +# Skip test if _tkinter wasn't built. +import_helper.import_module('_tkinter') + +# Skip test if tk cannot be initialized. +support.requires('gui') + + +def load_tests(*args): + return support.load_package_tests(os.path.dirname(__file__), *args) diff --git a/Lib/test/test_tkinter/__main__.py b/Lib/test/test_tkinter/__main__.py new file mode 100644 index 00000000000000..40a23a297ec2b4 --- /dev/null +++ b/Lib/test/test_tkinter/__main__.py @@ -0,0 +1,4 @@ +from . import load_tests +import unittest + +unittest.main() diff --git a/Lib/tkinter/test/support.py b/Lib/test/test_tkinter/support.py similarity index 99% rename from Lib/tkinter/test/support.py rename to Lib/test/test_tkinter/support.py index 9e26d04536f227..9154ebac5c48f8 100644 --- a/Lib/tkinter/test/support.py +++ b/Lib/test/test_tkinter/support.py @@ -1,5 +1,4 @@ import functools -import re import tkinter import unittest diff --git a/Lib/tkinter/test/test_tkinter/test_colorchooser.py b/Lib/test/test_tkinter/test_colorchooser.py similarity index 96% rename from Lib/tkinter/test/test_tkinter/test_colorchooser.py rename to Lib/test/test_tkinter/test_colorchooser.py index 488162ff0dd966..9bba21392d8d14 100644 --- a/Lib/tkinter/test/test_tkinter/test_colorchooser.py +++ b/Lib/test/test_tkinter/test_colorchooser.py @@ -1,7 +1,7 @@ import unittest import tkinter from test.support import requires, swap_attr -from tkinter.test.support import AbstractDefaultRootTest, AbstractTkTest +from test.test_tkinter.support import AbstractDefaultRootTest, AbstractTkTest from tkinter import colorchooser from tkinter.colorchooser import askcolor from tkinter.commondialog import Dialog diff --git a/Lib/tkinter/test/test_tkinter/test_font.py b/Lib/test/test_tkinter/test_font.py similarity index 98% rename from Lib/tkinter/test/test_tkinter/test_font.py rename to Lib/test/test_tkinter/test_font.py index 058c53a9023647..563707ddd2fa9b 100644 --- a/Lib/tkinter/test/test_tkinter/test_font.py +++ b/Lib/test/test_tkinter/test_font.py @@ -2,7 +2,7 @@ import tkinter from tkinter import font from test.support import requires, gc_collect, ALWAYS_EQ -from tkinter.test.support import AbstractTkTest, AbstractDefaultRootTest +from test.test_tkinter.support import AbstractTkTest, AbstractDefaultRootTest requires('gui') diff --git a/Lib/tkinter/test/test_tkinter/test_geometry_managers.py b/Lib/test/test_tkinter/test_geometry_managers.py similarity index 99% rename from Lib/tkinter/test/test_tkinter/test_geometry_managers.py rename to Lib/test/test_tkinter/test_geometry_managers.py index c89bc8dbf85759..3663048a145ab1 100644 --- a/Lib/tkinter/test/test_tkinter/test_geometry_managers.py +++ b/Lib/test/test_tkinter/test_geometry_managers.py @@ -4,8 +4,8 @@ from tkinter import TclError from 
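# Sketch, not part of the patch: the discovery hook wired into each relocated
# test package above.  unittest looks for a module-level load_tests(), and
# test.support.load_package_tests() discovers every test module under the
# package directory.
import os.path
from test.support import load_package_tests

def load_tests(*args):
    # args is (loader, standard_tests, pattern), forwarded unchanged.
    return load_package_tests(os.path.dirname(__file__), *args)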
test.support import requires -from tkinter.test.support import pixels_conv -from tkinter.test.widget_tests import AbstractWidgetTest +from test.test_tkinter.support import pixels_conv +from test.test_tkinter.widget_tests import AbstractWidgetTest requires('gui') diff --git a/Lib/tkinter/test/test_tkinter/test_images.py b/Lib/test/test_tkinter/test_images.py similarity index 99% rename from Lib/tkinter/test/test_tkinter/test_images.py rename to Lib/test/test_tkinter/test_images.py index cc69ccac62d742..b6f8b79ae689fa 100644 --- a/Lib/tkinter/test/test_tkinter/test_images.py +++ b/Lib/test/test_tkinter/test_images.py @@ -2,7 +2,7 @@ import tkinter from test import support from test.support import os_helper -from tkinter.test.support import AbstractTkTest, AbstractDefaultRootTest, requires_tcl +from test.test_tkinter.support import AbstractTkTest, AbstractDefaultRootTest, requires_tcl support.requires('gui') diff --git a/Lib/tkinter/test/test_tkinter/test_loadtk.py b/Lib/test/test_tkinter/test_loadtk.py similarity index 100% rename from Lib/tkinter/test/test_tkinter/test_loadtk.py rename to Lib/test/test_tkinter/test_loadtk.py diff --git a/Lib/tkinter/test/test_tkinter/test_messagebox.py b/Lib/test/test_tkinter/test_messagebox.py similarity index 94% rename from Lib/tkinter/test/test_tkinter/test_messagebox.py rename to Lib/test/test_tkinter/test_messagebox.py index d38541a5a45e76..f41bdc98286283 100644 --- a/Lib/tkinter/test/test_tkinter/test_messagebox.py +++ b/Lib/test/test_tkinter/test_messagebox.py @@ -1,7 +1,7 @@ import unittest import tkinter from test.support import requires, swap_attr -from tkinter.test.support import AbstractDefaultRootTest +from test.test_tkinter.support import AbstractDefaultRootTest from tkinter.commondialog import Dialog from tkinter.messagebox import showinfo diff --git a/Lib/tkinter/test/test_tkinter/test_misc.py b/Lib/test/test_tkinter/test_misc.py similarity index 99% rename from Lib/tkinter/test/test_tkinter/test_misc.py rename to Lib/test/test_tkinter/test_misc.py index 620b6ed638c25a..d1aca58d15fbd8 100644 --- a/Lib/tkinter/test/test_tkinter/test_misc.py +++ b/Lib/test/test_tkinter/test_misc.py @@ -3,7 +3,7 @@ import tkinter import enum from test import support -from tkinter.test.support import AbstractTkTest, AbstractDefaultRootTest +from test.test_tkinter.support import AbstractTkTest, AbstractDefaultRootTest support.requires('gui') diff --git a/Lib/tkinter/test/test_tkinter/test_simpledialog.py b/Lib/test/test_tkinter/test_simpledialog.py similarity index 93% rename from Lib/tkinter/test/test_tkinter/test_simpledialog.py rename to Lib/test/test_tkinter/test_simpledialog.py index 18cd2712b0c5ed..502f7f7098a322 100644 --- a/Lib/tkinter/test/test_tkinter/test_simpledialog.py +++ b/Lib/test/test_tkinter/test_simpledialog.py @@ -1,7 +1,7 @@ import unittest import tkinter from test.support import requires, swap_attr -from tkinter.test.support import AbstractDefaultRootTest +from test.test_tkinter.support import AbstractDefaultRootTest from tkinter.simpledialog import Dialog, askinteger requires('gui') diff --git a/Lib/tkinter/test/test_tkinter/test_text.py b/Lib/test/test_tkinter/test_text.py similarity index 96% rename from Lib/tkinter/test/test_tkinter/test_text.py rename to Lib/test/test_tkinter/test_text.py index 482f150df559fc..d1583f0b20aabe 100644 --- a/Lib/tkinter/test/test_tkinter/test_text.py +++ b/Lib/test/test_tkinter/test_text.py @@ -1,7 +1,7 @@ import unittest import tkinter from test.support import requires -from tkinter.test.support import 
AbstractTkTest +from test.test_tkinter.support import AbstractTkTest requires('gui') diff --git a/Lib/tkinter/test/test_tkinter/test_variables.py b/Lib/test/test_tkinter/test_variables.py similarity index 99% rename from Lib/tkinter/test/test_tkinter/test_variables.py rename to Lib/test/test_tkinter/test_variables.py index 427e168454362c..c1d232e2febc7a 100644 --- a/Lib/tkinter/test/test_tkinter/test_variables.py +++ b/Lib/test/test_tkinter/test_variables.py @@ -6,7 +6,7 @@ from tkinter import (Variable, StringVar, IntVar, DoubleVar, BooleanVar, Tcl, TclError) from test.support import ALWAYS_EQ -from tkinter.test.support import AbstractDefaultRootTest +from test.test_tkinter.support import AbstractDefaultRootTest class Var(Variable): diff --git a/Lib/tkinter/test/test_tkinter/test_widgets.py b/Lib/test/test_tkinter/test_widgets.py similarity index 99% rename from Lib/tkinter/test/test_tkinter/test_widgets.py rename to Lib/test/test_tkinter/test_widgets.py index fe8ecfeb326554..1cddbe17ba5207 100644 --- a/Lib/tkinter/test/test_tkinter/test_widgets.py +++ b/Lib/test/test_tkinter/test_widgets.py @@ -4,10 +4,10 @@ import os from test.support import requires -from tkinter.test.support import (requires_tcl, +from test.test_tkinter.support import (requires_tcl, get_tk_patchlevel, widget_eq, AbstractDefaultRootTest) -from tkinter.test.widget_tests import ( +from test.test_tkinter.widget_tests import ( add_standard_options, AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests, setUpModule) diff --git a/Lib/tkinter/test/widget_tests.py b/Lib/test/test_tkinter/widget_tests.py similarity index 99% rename from Lib/tkinter/test/widget_tests.py rename to Lib/test/test_tkinter/widget_tests.py index a450544c3ee6b8..85b0511aba3c7a 100644 --- a/Lib/tkinter/test/widget_tests.py +++ b/Lib/test/test_tkinter/widget_tests.py @@ -1,8 +1,7 @@ # Common tests for test_tkinter/test_widgets.py and test_ttk/test_widgets.py -import unittest import tkinter -from tkinter.test.support import (AbstractTkTest, tcl_version, +from test.test_tkinter.support import (AbstractTkTest, tcl_version, pixels_conv, tcl_obj_eq) import test.support diff --git a/Lib/test/test_tomllib/test_misc.py b/Lib/test/test_tomllib/test_misc.py index 76fa5905fa49ef..378db58f255945 100644 --- a/Lib/test/test_tomllib/test_misc.py +++ b/Lib/test/test_tomllib/test_misc.py @@ -6,6 +6,7 @@ import datetime from decimal import Decimal as D from pathlib import Path +import sys import tempfile import unittest @@ -91,11 +92,13 @@ def test_deepcopy(self): self.assertEqual(obj_copy, expected_obj) def test_inline_array_recursion_limit(self): - nest_count = 470 + # 470 with default recursion limit + nest_count = int(sys.getrecursionlimit() * 0.47) recursive_array_toml = "arr = " + nest_count * "[" + nest_count * "]" tomllib.loads(recursive_array_toml) def test_inline_table_recursion_limit(self): - nest_count = 310 + # 310 with default recursion limit + nest_count = int(sys.getrecursionlimit() * 0.31) recursive_table_toml = nest_count * "key = {" + nest_count * "}" tomllib.loads(recursive_table_toml) diff --git a/Lib/test/test_tools/test_md5sum.py b/Lib/test/test_tools/test_md5sum.py index c5a230e95c2b7e..710461f8819754 100644 --- a/Lib/test/test_tools/test_md5sum.py +++ b/Lib/test/test_tools/test_md5sum.py @@ -1,6 +1,5 @@ """Tests for the md5sum script in the Tools directory.""" -import sys import os import unittest from test.support import os_helper diff --git a/Lib/test/test_tools/test_pathfix.py b/Lib/test/test_tools/test_pathfix.py index 
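# Sketch, not part of the patch: the scaling used in the test_tomllib hunk above.
# The nesting depth is derived from the live recursion limit instead of being
# hard-coded against the default limit of 1000.
import sys
import tomllib

nest_count = int(sys.getrecursionlimit() * 0.47)    # 470 with the default limit
tomllib.loads("arr = " + "[" * nest_count + "]" * nest_count)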
ff61935298b920..aa754bc479b468 100644 --- a/Lib/test/test_tools/test_pathfix.py +++ b/Lib/test/test_tools/test_pathfix.py @@ -2,7 +2,6 @@ import subprocess import sys import unittest -from test import support from test.support import os_helper from test.test_tools import scriptsdir, skip_if_missing diff --git a/Lib/test/test_tools/test_pindent.py b/Lib/test/test_tools/test_pindent.py index 01f13850eae733..61e97fea48cfba 100644 --- a/Lib/test/test_tools/test_pindent.py +++ b/Lib/test/test_tools/test_pindent.py @@ -5,7 +5,6 @@ import unittest import subprocess import textwrap -from test import support from test.support import os_helper from test.support.script_helper import assert_python_ok diff --git a/Lib/test/test_traceback.py b/Lib/test/test_traceback.py index ed5ddf95069e35..722c265a6a8a51 100644 --- a/Lib/test/test_traceback.py +++ b/Lib/test/test_traceback.py @@ -4,6 +4,7 @@ from io import StringIO import linecache import sys +import types import inspect import unittest import re @@ -14,7 +15,6 @@ from test.support.os_helper import TESTFN, unlink from test.support.script_helper import assert_python_ok, assert_python_failure -import os import textwrap import traceback from functools import partial @@ -1129,7 +1129,7 @@ def test_print_exception_bad_type_python(self): class BaseExceptionReportingTests: def get_exception(self, exception_or_callable): - if isinstance(exception_or_callable, Exception): + if isinstance(exception_or_callable, BaseException): return exception_or_callable try: exception_or_callable() @@ -1851,6 +1851,31 @@ def exc(): report = self.get_report(exc) self.assertEqual(report, expected) + def test_KeyboardInterrupt_at_first_line_of_frame(self): + # see GH-93249 + def f(): + return sys._getframe() + + tb_next = None + frame = f() + lasti = 0 + lineno = f.__code__.co_firstlineno + tb = types.TracebackType(tb_next, frame, lasti, lineno) + + exc = KeyboardInterrupt() + exc.__traceback__ = tb + + expected = (f'Traceback (most recent call last):\n' + f' File "{__file__}", line {lineno}, in f\n' + f' def f():\n' + f'\n' + f'KeyboardInterrupt\n') + + report = self.get_report(exc) + # remove trailing writespace: + report = '\n'.join([l.rstrip() for l in report.split('\n')]) + self.assertEqual(report, expected) + class PyExcReportingTests(BaseExceptionReportingTests, unittest.TestCase): # diff --git a/Lib/test/test_ttk_guionly.py b/Lib/test/test_ttk/__init__.py similarity index 68% rename from Lib/test/test_ttk_guionly.py rename to Lib/test/test_ttk/__init__.py index c4919045d75cb7..7ee7ffbd6d7408 100644 --- a/Lib/test/test_ttk_guionly.py +++ b/Lib/test/test_ttk/__init__.py @@ -1,10 +1,11 @@ +import os.path import unittest from test import support from test.support import import_helper -from test.support import check_sanitizer -if check_sanitizer(address=True, memory=True): - raise unittest.SkipTest("Tests involvin libX11 can SEGFAULT on ASAN/MSAN builds") + +if support.check_sanitizer(address=True, memory=True): + raise unittest.SkipTest("Tests involving libX11 can SEGFAULT on ASAN/MSAN builds") # Skip this test if _tkinter wasn't built. import_helper.import_module('_tkinter') @@ -12,6 +13,7 @@ # Skip test if tk cannot be initialized. 
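# Sketch, not part of the patch: traceback objects can be built by hand, as the
# test_traceback hunk above does, by pairing a live frame with a lasti and a
# line number.
import sys
import types

def f():
    return sys._getframe()

frame = f()
tb = types.TracebackType(None, frame, 0, f.__code__.co_firstlineno)
exc = KeyboardInterrupt()
exc.__traceback__ = tb      # the report now points at the first line of f()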
support.requires('gui') + import tkinter from _tkinter import TclError from tkinter import ttk @@ -32,9 +34,6 @@ def setUpModule(): root.destroy() del root -def load_tests(loader, tests, pattern): - return loader.discover('tkinter.test.test_ttk') - -if __name__ == '__main__': - unittest.main() +def load_tests(*args): + return support.load_package_tests(os.path.dirname(__file__), *args) diff --git a/Lib/test/test_ttk/__main__.py b/Lib/test/test_ttk/__main__.py new file mode 100644 index 00000000000000..40a23a297ec2b4 --- /dev/null +++ b/Lib/test/test_ttk/__main__.py @@ -0,0 +1,4 @@ +from . import load_tests +import unittest + +unittest.main() diff --git a/Lib/tkinter/test/test_ttk/test_extensions.py b/Lib/test/test_ttk/test_extensions.py similarity index 99% rename from Lib/tkinter/test/test_ttk/test_extensions.py rename to Lib/test/test_ttk/test_extensions.py index 1220c4831c52f4..6135c49701f08e 100644 --- a/Lib/tkinter/test/test_ttk/test_extensions.py +++ b/Lib/test/test_ttk/test_extensions.py @@ -3,7 +3,7 @@ import tkinter from tkinter import ttk from test.support import requires, gc_collect -from tkinter.test.support import AbstractTkTest, AbstractDefaultRootTest +from test.test_tkinter.support import AbstractTkTest, AbstractDefaultRootTest requires('gui') diff --git a/Lib/tkinter/test/test_ttk/test_style.py b/Lib/test/test_ttk/test_style.py similarity index 98% rename from Lib/tkinter/test/test_ttk/test_style.py rename to Lib/test/test_ttk/test_style.py index 54ad3437168fe1..0ec95cf6b5ffc9 100644 --- a/Lib/tkinter/test/test_ttk/test_style.py +++ b/Lib/test/test_ttk/test_style.py @@ -4,7 +4,7 @@ from tkinter import ttk from test import support from test.support import requires -from tkinter.test.support import AbstractTkTest, get_tk_patchlevel +from test.test_tkinter.support import AbstractTkTest, get_tk_patchlevel requires('gui') diff --git a/Lib/tkinter/test/test_ttk/test_widgets.py b/Lib/test/test_ttk/test_widgets.py similarity index 99% rename from Lib/tkinter/test/test_ttk/test_widgets.py rename to Lib/test/test_ttk/test_widgets.py index c14c321ca2687d..eb0cc93ce270b5 100644 --- a/Lib/tkinter/test/test_ttk/test_widgets.py +++ b/Lib/test/test_ttk/test_widgets.py @@ -5,9 +5,9 @@ import sys from test.test_ttk_textonly import MockTclObj -from tkinter.test.support import (AbstractTkTest, tcl_version, get_tk_patchlevel, +from test.test_tkinter.support import (AbstractTkTest, tcl_version, get_tk_patchlevel, simulate_mouse_click, AbstractDefaultRootTest) -from tkinter.test.widget_tests import (add_standard_options, +from test.test_tkinter.widget_tests import (add_standard_options, AbstractWidgetTest, StandardOptionsTests, IntegerSizeTests, PixelSizeTests, setUpModule) diff --git a/Lib/test/test_type_comments.py b/Lib/test/test_type_comments.py index 71d1430dbc939d..668c1787f64d97 100644 --- a/Lib/test/test_type_comments.py +++ b/Lib/test/test_type_comments.py @@ -1,7 +1,6 @@ import ast import sys import unittest -from test import support funcdef = """\ diff --git a/Lib/test/test_types.py b/Lib/test/test_types.py index cde9dadc5e97fa..8556ca35ca06c1 100644 --- a/Lib/test/test_types.py +++ b/Lib/test/test_types.py @@ -597,6 +597,12 @@ def test_slot_wrapper_types(self): self.assertIsInstance(object.__lt__, types.WrapperDescriptorType) self.assertIsInstance(int.__lt__, types.WrapperDescriptorType) + def test_dunder_get_signature(self): + sig = inspect.signature(object.__init__.__get__) + self.assertEqual(list(sig.parameters), ["instance", "owner"]) + # gh-93021: Second parameter is optional + 
self.assertIs(sig.parameters["owner"].default, None) + def test_method_wrapper_types(self): self.assertIsInstance(object().__init__, types.MethodWrapperType) self.assertIsInstance(object().__str__, types.MethodWrapperType) diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index 2afac235391552..6850b881a188d4 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -51,6 +51,10 @@ c_typing = import_helper.import_fresh_module('typing', fresh=['_typing']) +CANNOT_SUBCLASS_TYPE = 'Cannot subclass special typing classes' +CANNOT_SUBCLASS_INSTANCE = 'Cannot subclass an instance of %s' + + class BaseTestCase(TestCase): def assertIsSubclass(self, cls, class_or_tuple, msg=None): @@ -170,10 +174,11 @@ def test_not_generic(self): self.bottom_type[int] def test_cannot_subclass(self): - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, + 'Cannot subclass ' + re.escape(str(self.bottom_type))): class A(self.bottom_type): pass - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): class A(type(self.bottom_type)): pass @@ -266,10 +271,11 @@ def test_cannot_subscript(self): Self[int] def test_cannot_subclass(self): - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): class C(type(Self)): pass - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, + r'Cannot subclass typing\.Self'): class C(Self): pass @@ -322,10 +328,11 @@ def test_cannot_subscript(self): LiteralString[int] def test_cannot_subclass(self): - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): class C(type(LiteralString)): pass - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, + r'Cannot subclass typing\.LiteralString'): class C(LiteralString): pass @@ -415,15 +422,13 @@ def test_no_redefinition(self): self.assertNotEqual(TypeVar('T'), TypeVar('T')) self.assertNotEqual(TypeVar('T', int, str), TypeVar('T', int, str)) - def test_cannot_subclass_vars(self): - with self.assertRaises(TypeError): - class V(TypeVar('T')): - pass - - def test_cannot_subclass_var_itself(self): - with self.assertRaises(TypeError): - class V(TypeVar): - pass + def test_cannot_subclass(self): + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): + class V(TypeVar): pass + T = TypeVar("T") + with self.assertRaisesRegex(TypeError, + CANNOT_SUBCLASS_INSTANCE % 'TypeVar'): + class V(T): pass def test_cannot_instantiate_vars(self): with self.assertRaises(TypeError): @@ -753,14 +758,11 @@ class C(Generic[*Ts]): pass ('generic[*Ts]', '[*tuple_type[int]]', 'generic[int]'), ('generic[*Ts]', '[*tuple_type[*Ts]]', 'generic[*Ts]'), ('generic[*Ts]', '[*tuple_type[int, str]]', 'generic[int, str]'), + ('generic[*Ts]', '[str, *tuple_type[int, ...], bool]', 'generic[str, *tuple_type[int, ...], bool]'), ('generic[*Ts]', '[tuple_type[int, ...]]', 'generic[tuple_type[int, ...]]'), ('generic[*Ts]', '[tuple_type[int, ...], tuple_type[str, ...]]', 'generic[tuple_type[int, ...], tuple_type[str, ...]]'), ('generic[*Ts]', '[*tuple_type[int, ...]]', 'generic[*tuple_type[int, ...]]'), - - # Technically, multiple unpackings are forbidden by PEP 646, but we - # choose to be less restrictive at runtime, to allow folks room - # to experiment. So all three of these should be valid. 
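# Sketch, not part of the patch: the behaviour the typing hunks above now pin
# down with assertRaisesRegex -- subclassing a special form such as typing.Self
# raises a TypeError whose message names the offending form.
import typing

try:
    class C(typing.Self):
        pass
except TypeError as exc:
    assert "Cannot subclass typing.Self" in str(exc)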
- ('generic[*Ts]', '[*tuple_type[int, ...], *tuple_type[str, ...]]', 'generic[*tuple_type[int, ...], *tuple_type[str, ...]]'), + ('generic[*Ts]', '[*tuple_type[int, ...], *tuple_type[str, ...]]', 'TypeError'), ('generic[*Ts]', '[*Ts]', 'generic[*Ts]'), ('generic[*Ts]', '[T, *Ts]', 'generic[T, *Ts]'), @@ -768,12 +770,24 @@ class C(Generic[*Ts]): pass ('generic[T, *Ts]', '[int]', 'generic[int]'), ('generic[T, *Ts]', '[int, str]', 'generic[int, str]'), ('generic[T, *Ts]', '[int, str, bool]', 'generic[int, str, bool]'), - - ('generic[T, *Ts]', '[*tuple[int, ...]]', 'TypeError'), # Should be generic[int, *tuple[int, ...]] + ('generic[list[T], *Ts]', '[int]', 'generic[list[int]]'), + ('generic[list[T], *Ts]', '[int, str]', 'generic[list[int], str]'), + ('generic[list[T], *Ts]', '[int, str, bool]', 'generic[list[int], str, bool]'), ('generic[*Ts, T]', '[int]', 'generic[int]'), ('generic[*Ts, T]', '[int, str]', 'generic[int, str]'), - ('generic[*Ts, T]', '[int, str, bool]', 'generic[int, str, bool]'), + ('generic[*Ts, T]', '[int, str, bool]', 'generic[int, str, bool]'), + ('generic[*Ts, list[T]]', '[int]', 'generic[list[int]]'), + ('generic[*Ts, list[T]]', '[int, str]', 'generic[int, list[str]]'), + ('generic[*Ts, list[T]]', '[int, str, bool]', 'generic[int, str, list[bool]]'), + + ('generic[T, *Ts]', '[*tuple_type[int, ...]]', 'generic[int, *tuple_type[int, ...]]'), + ('generic[*Ts, T]', '[*tuple_type[int, ...]]', 'generic[*tuple_type[int, ...], int]'), + ('generic[T1, *Ts, T2]', '[*tuple_type[int, ...]]', 'generic[int, *tuple_type[int, ...], int]'), + ('generic[T, str, *Ts]', '[*tuple_type[int, ...]]', 'generic[int, str, *tuple_type[int, ...]]'), + ('generic[*Ts, str, T]', '[*tuple_type[int, ...]]', 'generic[*tuple_type[int, ...], str, int]'), + ('generic[list[T], *Ts]', '[*tuple_type[int, ...]]', 'generic[list[int], *tuple_type[int, ...]]'), + ('generic[*Ts, list[T]]', '[*tuple_type[int, ...]]', 'generic[*tuple_type[int, ...], list[int]]'), ('generic[T, *tuple_type[int, ...]]', '[str]', 'generic[str, *tuple_type[int, ...]]'), ('generic[T1, T2, *tuple_type[int, ...]]', '[str, bool]', 'generic[str, bool, *tuple_type[int, ...]]'), @@ -1007,15 +1021,14 @@ class A(Generic[Unpack[Ts]]): pass self.assertEndsWith(repr(F[float]), 'A[float, *tuple[str, ...]]') self.assertEndsWith(repr(F[float, str]), 'A[float, str, *tuple[str, ...]]') - def test_cannot_subclass_class(self): - with self.assertRaises(TypeError): + def test_cannot_subclass(self): + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): class C(TypeVarTuple): pass - - def test_cannot_subclass_instance(self): Ts = TypeVarTuple('Ts') - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, + CANNOT_SUBCLASS_INSTANCE % 'TypeVarTuple'): class C(Ts): pass - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, r'Cannot subclass \*Ts'): class C(Unpack[Ts]): pass def test_variadic_class_args_are_correct(self): @@ -1402,13 +1415,15 @@ def test_repr(self): self.assertEqual(repr(u), 'typing.Optional[str]') def test_cannot_subclass(self): - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, + r'Cannot subclass typing\.Union'): class C(Union): pass - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): class C(type(Union)): pass - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, + r'Cannot subclass typing\.Union\[int, str\]'): class C(Union[int, str]): pass @@ -3273,7 +3288,8 @@ class C(B[int]): 
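# Sketch, not part of the patch: a concrete reading of two rows of the
# substitution table above (behaviour as updated by that hunk), with builtin
# tuple standing in for "tuple_type".
from typing import Generic, TypeVar, TypeVarTuple, Unpack

T = TypeVar("T")
Ts = TypeVarTuple("Ts")

class C(Generic[Unpack[Ts]]):
    pass

class D(Generic[T, Unpack[Ts]]):
    pass

# 'generic[T, *Ts]' + '[*tuple_type[int, ...]]' -> 'generic[int, *tuple_type[int, ...]]'
D[T, Unpack[Ts]][Unpack[tuple[int, ...]]]

# Two unpacked unbounded tuples in one substitution are now rejected:
try:
    C[Unpack[Ts]][Unpack[tuple[int, ...]], Unpack[tuple[str, ...]]]
except TypeError:
    pass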
self.assertEqual(x.bar, 'abc') self.assertEqual(x.__dict__, {'foo': 42, 'bar': 'abc'}) samples = [Any, Union, Tuple, Callable, ClassVar, - Union[int, str], ClassVar[List], Tuple[int, ...], Callable[[str], bytes], + Union[int, str], ClassVar[List], Tuple[int, ...], Tuple[()], + Callable[[str], bytes], typing.DefaultDict, typing.FrozenSet[int]] for s in samples: for proto in range(pickle.HIGHEST_PROTOCOL + 1): @@ -3291,7 +3307,8 @@ class C(B[int]): def test_copy_and_deepcopy(self): T = TypeVar('T') class Node(Generic[T]): ... - things = [Union[T, int], Tuple[T, int], Callable[..., T], Callable[[int], int], + things = [Union[T, int], Tuple[T, int], Tuple[()], + Callable[..., T], Callable[[int], int], Tuple[Any, Any], Node[T], Node[int], Node[Any], typing.Iterable[T], typing.Iterable[Any], typing.Iterable[int], typing.Dict[int, str], typing.Dict[T, Any], ClassVar[int], ClassVar[List[T]], Tuple['T', 'T'], @@ -3649,10 +3666,10 @@ def test_repr(self): self.assertEqual(repr(cv), 'typing.ClassVar[%s.Employee]' % __name__) def test_cannot_subclass(self): - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): class C(type(ClassVar)): pass - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): class C(type(ClassVar[int])): pass @@ -3691,10 +3708,10 @@ def test_repr(self): self.assertEqual(repr(cv), 'typing.Final[tuple[int]]') def test_cannot_subclass(self): - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): class C(type(Final)): pass - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): class C(type(Final[int])): pass @@ -6197,16 +6214,18 @@ def test_repr(self): self.assertEqual(repr(cv), f'typing.Required[{__name__}.Employee]') def test_cannot_subclass(self): - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): class C(type(Required)): pass - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): class C(type(Required[int])): pass - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, + r'Cannot subclass typing\.Required'): class C(Required): pass - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, + r'Cannot subclass typing\.Required\[int\]'): class C(Required[int]): pass @@ -6243,16 +6262,18 @@ def test_repr(self): self.assertEqual(repr(cv), f'typing.NotRequired[{__name__}.Employee]') def test_cannot_subclass(self): - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): class C(type(NotRequired)): pass - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): class C(type(NotRequired[int])): pass - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, + r'Cannot subclass typing\.NotRequired'): class C(NotRequired): pass - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, + r'Cannot subclass typing\.NotRequired\[int\]'): class C(NotRequired[int]): pass @@ -6668,7 +6689,8 @@ def test_no_issubclass(self): issubclass(TypeAlias, Employee) def test_cannot_subclass(self): - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, + r'Cannot subclass typing\.TypeAlias'): class C(TypeAlias): pass @@ -6870,6 +6892,24 @@ def test_paramspec_gets_copied(self): self.assertEqual(C2[Concatenate[str, P2]].__parameters__, (P2,)) 
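# Sketch, not part of the patch: Tuple[()] joins the pickle/copy samples in the
# typing hunk above -- parameterized aliases are expected to survive both
# pickling and deep-copying unchanged.
import copy
import pickle
from typing import Tuple

alias = Tuple[()]
assert copy.deepcopy(alias) == alias
for proto in range(pickle.HIGHEST_PROTOCOL + 1):
    assert pickle.loads(pickle.dumps(alias, proto)) == alias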
self.assertEqual(C2[Concatenate[T, P2]].__parameters__, (T, P2)) + def test_cannot_subclass(self): + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): + class C(ParamSpec): pass + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): + class C(ParamSpecArgs): pass + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): + class C(ParamSpecKwargs): pass + P = ParamSpec('P') + with self.assertRaisesRegex(TypeError, + CANNOT_SUBCLASS_INSTANCE % 'ParamSpec'): + class C(P): pass + with self.assertRaisesRegex(TypeError, + CANNOT_SUBCLASS_INSTANCE % 'ParamSpecArgs'): + class C(P.args): pass + with self.assertRaisesRegex(TypeError, + CANNOT_SUBCLASS_INSTANCE % 'ParamSpecKwargs'): + class C(P.kwargs): pass + class ConcatenateTests(BaseTestCase): def test_basics(self): @@ -6936,10 +6976,10 @@ def test_repr(self): self.assertEqual(repr(cv), 'typing.TypeGuard[tuple[int]]') def test_cannot_subclass(self): - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): class C(type(TypeGuard)): pass - with self.assertRaises(TypeError): + with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): class C(type(TypeGuard[int])): pass diff --git a/Lib/test/test_unicode.py b/Lib/test/test_unicode.py index c98fabf8bc9b5a..9765ed97a60a44 100644 --- a/Lib/test/test_unicode.py +++ b/Lib/test/test_unicode.py @@ -2370,20 +2370,19 @@ def test_expandtabs_optimization(self): self.assertIs(s.expandtabs(), s) def test_raiseMemError(self): - if struct.calcsize('P') == 8: - # 64 bits pointers - ascii_struct_size = 48 - compact_struct_size = 72 - else: - # 32 bits pointers - ascii_struct_size = 24 - compact_struct_size = 36 + asciifields = "nnb" + compactfields = asciifields + "nP" + ascii_struct_size = support.calcobjsize(asciifields) + compact_struct_size = support.calcobjsize(compactfields) for char in ('a', '\xe9', '\u20ac', '\U0010ffff'): code = ord(char) - if code < 0x100: + if code < 0x80: char_size = 1 # sizeof(Py_UCS1) struct_size = ascii_struct_size + elif code < 0x100: + char_size = 1 # sizeof(Py_UCS1) + struct_size = compact_struct_size elif code < 0x10000: char_size = 2 # sizeof(Py_UCS2) struct_size = compact_struct_size @@ -2395,8 +2394,18 @@ def test_raiseMemError(self): # be allocatable, given enough memory. maxlen = ((sys.maxsize - struct_size) // char_size) alloc = lambda: char * maxlen - self.assertRaises(MemoryError, alloc) - self.assertRaises(MemoryError, alloc) + with self.subTest( + char=char, + struct_size=struct_size, + char_size=char_size + ): + # self-check + self.assertEqual( + sys.getsizeof(char * 42), + struct_size + (char_size * (42 + 1)) + ) + self.assertRaises(MemoryError, alloc) + self.assertRaises(MemoryError, alloc) def test_format_subclass(self): class S(str): diff --git a/Lib/test/test_unicode_file.py b/Lib/test/test_unicode_file.py index 80c22c6cdd1dad..fe25bfe9f88d11 100644 --- a/Lib/test/test_unicode_file.py +++ b/Lib/test/test_unicode_file.py @@ -110,7 +110,7 @@ def _test_single(self, filename): os.unlink(filename) self.assertTrue(not os.path.exists(filename)) # and again with os.open. 
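# Sketch, not part of the patch: the size identity the test_unicode hunk above
# adds as a self-check.  A compact str is a fixed header plus (length + 1) code
# units, so two ASCII strings differ in size exactly by their length difference.
import sys

assert sys.getsizeof("a" * 42) - sys.getsizeof("") == 42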
- f = os.open(filename, os.O_CREAT) + f = os.open(filename, os.O_CREAT | os.O_WRONLY) os.close(f) try: self._do_single(filename) diff --git a/Lib/test/test_unittest.py b/Lib/test/test_unittest.py deleted file mode 100644 index 1079c7df2e51c2..00000000000000 --- a/Lib/test/test_unittest.py +++ /dev/null @@ -1,16 +0,0 @@ -import unittest.test - -from test import support - - -def load_tests(*_): - # used by unittest - return unittest.test.suite() - - -def tearDownModule(): - support.reap_children() - - -if __name__ == "__main__": - unittest.main() diff --git a/Lib/test/test_unittest/__init__.py b/Lib/test/test_unittest/__init__.py new file mode 100644 index 00000000000000..bc502ef32d2916 --- /dev/null +++ b/Lib/test/test_unittest/__init__.py @@ -0,0 +1,6 @@ +import os.path +from test.support import load_package_tests + + +def load_tests(*args): + return load_package_tests(os.path.dirname(__file__), *args) diff --git a/Lib/test/test_unittest/__main__.py b/Lib/test/test_unittest/__main__.py new file mode 100644 index 00000000000000..40a23a297ec2b4 --- /dev/null +++ b/Lib/test/test_unittest/__main__.py @@ -0,0 +1,4 @@ +from . import load_tests +import unittest + +unittest.main() diff --git a/Lib/unittest/test/_test_warnings.py b/Lib/test/test_unittest/_test_warnings.py similarity index 100% rename from Lib/unittest/test/_test_warnings.py rename to Lib/test/test_unittest/_test_warnings.py diff --git a/Lib/unittest/test/dummy.py b/Lib/test/test_unittest/dummy.py similarity index 100% rename from Lib/unittest/test/dummy.py rename to Lib/test/test_unittest/dummy.py diff --git a/Lib/unittest/test/support.py b/Lib/test/test_unittest/support.py similarity index 100% rename from Lib/unittest/test/support.py rename to Lib/test/test_unittest/support.py diff --git a/Lib/unittest/test/test_assertions.py b/Lib/test/test_unittest/test_assertions.py similarity index 100% rename from Lib/unittest/test/test_assertions.py rename to Lib/test/test_unittest/test_assertions.py diff --git a/Lib/unittest/test/test_async_case.py b/Lib/test/test_unittest/test_async_case.py similarity index 100% rename from Lib/unittest/test/test_async_case.py rename to Lib/test/test_unittest/test_async_case.py diff --git a/Lib/unittest/test/test_break.py b/Lib/test/test_unittest/test_break.py similarity index 100% rename from Lib/unittest/test/test_break.py rename to Lib/test/test_unittest/test_break.py diff --git a/Lib/unittest/test/test_case.py b/Lib/test/test_unittest/test_case.py similarity index 99% rename from Lib/unittest/test/test_case.py rename to Lib/test/test_unittest/test_case.py index 9d53db65da8a5e..fae0d1076da948 100644 --- a/Lib/unittest/test/test_case.py +++ b/Lib/test/test_unittest/test_case.py @@ -15,7 +15,7 @@ import unittest -from unittest.test.support import ( +from test.test_unittest.support import ( TestEquality, TestHashing, LoggingResult, LegacyLoggingResult, ResultWithNoStartTestRunStopTestRun ) diff --git a/Lib/unittest/test/test_discovery.py b/Lib/test/test_unittest/test_discovery.py similarity index 99% rename from Lib/unittest/test/test_discovery.py rename to Lib/test/test_unittest/test_discovery.py index 3b58786ec16a10..946fa1258ea25e 100644 --- a/Lib/unittest/test/test_discovery.py +++ b/Lib/test/test_unittest/test_discovery.py @@ -10,7 +10,7 @@ import unittest import unittest.mock -import unittest.test +import test.test_unittest class TestableTestProgram(unittest.TestProgram): @@ -789,7 +789,7 @@ def test_discovery_from_dotted_path(self): loader = unittest.TestLoader() tests = [self] - expectedPath = 
os.path.abspath(os.path.dirname(unittest.test.__file__)) + expectedPath = os.path.abspath(os.path.dirname(test.test_unittest.__file__)) self.wasRun = False def _find_tests(start_dir, pattern): @@ -797,7 +797,7 @@ def _find_tests(start_dir, pattern): self.assertEqual(start_dir, expectedPath) return tests loader._find_tests = _find_tests - suite = loader.discover('unittest.test') + suite = loader.discover('test.test_unittest') self.assertTrue(self.wasRun) self.assertEqual(suite._tests, tests) diff --git a/Lib/unittest/test/test_functiontestcase.py b/Lib/test/test_unittest/test_functiontestcase.py similarity index 99% rename from Lib/unittest/test/test_functiontestcase.py rename to Lib/test/test_unittest/test_functiontestcase.py index 4971729880d6d8..2ebed9564ad9cd 100644 --- a/Lib/unittest/test/test_functiontestcase.py +++ b/Lib/test/test_unittest/test_functiontestcase.py @@ -1,6 +1,6 @@ import unittest -from unittest.test.support import LoggingResult +from test.test_unittest.support import LoggingResult class Test_FunctionTestCase(unittest.TestCase): diff --git a/Lib/unittest/test/test_loader.py b/Lib/test/test_unittest/test_loader.py similarity index 99% rename from Lib/unittest/test/test_loader.py rename to Lib/test/test_unittest/test_loader.py index 6b2954396abdec..bbdfb247ebaada 100644 --- a/Lib/unittest/test/test_loader.py +++ b/Lib/test/test_unittest/test_loader.py @@ -583,7 +583,7 @@ def test_loadTestsFromName__module_not_loaded(self): # We're going to try to load this module as a side-effect, so it # better not be loaded before we try. # - module_name = 'unittest.test.dummy' + module_name = 'test.test_unittest.dummy' sys.modules.pop(module_name, None) loader = unittest.TestLoader() @@ -711,7 +711,7 @@ def test_loadTestsFromNames__unknown_attr_name(self): loader = unittest.TestLoader() suite = loader.loadTestsFromNames( - ['unittest.loader.sdasfasfasdf', 'unittest.test.dummy']) + ['unittest.loader.sdasfasfasdf', 'test.test_unittest.dummy']) error, test = self.check_deferred_error(loader, list(suite)[0]) expected = "module 'unittest.loader' has no attribute 'sdasfasfasdf'" self.assertIn( @@ -1008,7 +1008,7 @@ def test_loadTestsFromNames__module_not_loaded(self): # We're going to try to load this module as a side-effect, so it # better not be loaded before we try. 
# - module_name = 'unittest.test.dummy' + module_name = 'test.test_unittest.dummy' sys.modules.pop(module_name, None) loader = unittest.TestLoader() diff --git a/Lib/unittest/test/test_program.py b/Lib/test/test_unittest/test_program.py similarity index 96% rename from Lib/unittest/test/test_program.py rename to Lib/test/test_unittest/test_program.py index 7c16b93a2dd0d5..3645bcf4b43562 100644 --- a/Lib/unittest/test/test_program.py +++ b/Lib/test/test_unittest/test_program.py @@ -1,12 +1,10 @@ -import io - import os import sys import subprocess from test import support import unittest -import unittest.test -from unittest.test.test_result import BufferedWriter +import test.test_unittest +from test.test_unittest.test_result import BufferedWriter class Test_TestProgram(unittest.TestCase): @@ -15,7 +13,7 @@ def test_discovery_from_dotted_path(self): loader = unittest.TestLoader() tests = [self] - expectedPath = os.path.abspath(os.path.dirname(unittest.test.__file__)) + expectedPath = os.path.abspath(os.path.dirname(test.test_unittest.__file__)) self.wasRun = False def _find_tests(start_dir, pattern): @@ -23,7 +21,7 @@ def _find_tests(start_dir, pattern): self.assertEqual(start_dir, expectedPath) return tests loader._find_tests = _find_tests - suite = loader.discover('unittest.test') + suite = loader.discover('test.test_unittest') self.assertTrue(self.wasRun) self.assertEqual(suite._tests, tests) @@ -93,10 +91,10 @@ def run(self, test): sys.argv = ['faketest'] runner = FakeRunner() program = unittest.TestProgram(testRunner=runner, exit=False, - defaultTest='unittest.test', + defaultTest='test.test_unittest', testLoader=self.FooBarLoader()) sys.argv = old_argv - self.assertEqual(('unittest.test',), program.testNames) + self.assertEqual(('test.test_unittest',), program.testNames) def test_defaultTest_with_iterable(self): class FakeRunner(object): @@ -109,10 +107,10 @@ def run(self, test): runner = FakeRunner() program = unittest.TestProgram( testRunner=runner, exit=False, - defaultTest=['unittest.test', 'unittest.test2'], + defaultTest=['test.test_unittest', 'test.test_unittest2'], testLoader=self.FooBarLoader()) sys.argv = old_argv - self.assertEqual(['unittest.test', 'unittest.test2'], + self.assertEqual(['test.test_unittest', 'test.test_unittest2'], program.testNames) def test_NonExit(self): diff --git a/Lib/unittest/test/test_result.py b/Lib/test/test_unittest/test_result.py similarity index 99% rename from Lib/unittest/test/test_result.py rename to Lib/test/test_unittest/test_result.py index b0cc3d867d6e57..eeb15992fd109c 100644 --- a/Lib/unittest/test/test_result.py +++ b/Lib/test/test_unittest/test_result.py @@ -2,7 +2,7 @@ import sys import textwrap -from test.support import warnings_helper, captured_stdout, captured_stderr +from test.support import warnings_helper, captured_stdout import traceback import unittest diff --git a/Lib/unittest/test/test_runner.py b/Lib/test/test_unittest/test_runner.py similarity index 99% rename from Lib/unittest/test/test_runner.py rename to Lib/test/test_unittest/test_runner.py index 101076a3236665..d396f2bab57871 100644 --- a/Lib/unittest/test/test_runner.py +++ b/Lib/test/test_unittest/test_runner.py @@ -8,7 +8,7 @@ import unittest from unittest.case import _Outcome -from unittest.test.support import (LoggingResult, +from test.test_unittest.support import (LoggingResult, ResultWithNoStartTestRunStopTestRun) diff --git a/Lib/unittest/test/test_setups.py b/Lib/test/test_unittest/test_setups.py similarity index 100% rename from 
Lib/unittest/test/test_setups.py rename to Lib/test/test_unittest/test_setups.py diff --git a/Lib/unittest/test/test_skipping.py b/Lib/test/test_unittest/test_skipping.py similarity index 99% rename from Lib/unittest/test/test_skipping.py rename to Lib/test/test_unittest/test_skipping.py index 64ceeae37ef0a7..f146dcac18ecc0 100644 --- a/Lib/unittest/test/test_skipping.py +++ b/Lib/test/test_unittest/test_skipping.py @@ -1,6 +1,6 @@ import unittest -from unittest.test.support import LoggingResult +from test.test_unittest.support import LoggingResult class Test_TestSkipping(unittest.TestCase): diff --git a/Lib/unittest/test/test_suite.py b/Lib/test/test_unittest/test_suite.py similarity index 99% rename from Lib/unittest/test/test_suite.py rename to Lib/test/test_unittest/test_suite.py index 0551a16996723e..ca52ee9d9c011c 100644 --- a/Lib/unittest/test/test_suite.py +++ b/Lib/test/test_unittest/test_suite.py @@ -3,7 +3,7 @@ import gc import sys import weakref -from unittest.test.support import LoggingResult, TestEquality +from test.test_unittest.support import LoggingResult, TestEquality ### Support code for Test_TestSuite diff --git a/Lib/test/test_unittest/testmock/__init__.py b/Lib/test/test_unittest/testmock/__init__.py new file mode 100644 index 00000000000000..bc502ef32d2916 --- /dev/null +++ b/Lib/test/test_unittest/testmock/__init__.py @@ -0,0 +1,6 @@ +import os.path +from test.support import load_package_tests + + +def load_tests(*args): + return load_package_tests(os.path.dirname(__file__), *args) diff --git a/Lib/unittest/test/testmock/__main__.py b/Lib/test/test_unittest/testmock/__main__.py similarity index 86% rename from Lib/unittest/test/testmock/__main__.py rename to Lib/test/test_unittest/testmock/__main__.py index 45c633a4ee48b5..1e3068b0ddf391 100644 --- a/Lib/unittest/test/testmock/__main__.py +++ b/Lib/test/test_unittest/testmock/__main__.py @@ -6,7 +6,7 @@ def load_tests(loader, standard_tests, pattern): # top level directory cached on loader instance this_dir = os.path.dirname(__file__) pattern = pattern or "test*.py" - # We are inside unittest.test.testmock, so the top-level is three notches up + # We are inside test.test_unittest.testmock, so the top-level is three notches up top_level_dir = os.path.dirname(os.path.dirname(os.path.dirname(this_dir))) package_tests = loader.discover(start_dir=this_dir, pattern=pattern, top_level_dir=top_level_dir) diff --git a/Lib/unittest/test/testmock/support.py b/Lib/test/test_unittest/testmock/support.py similarity index 100% rename from Lib/unittest/test/testmock/support.py rename to Lib/test/test_unittest/testmock/support.py diff --git a/Lib/unittest/test/testmock/testasync.py b/Lib/test/test_unittest/testmock/testasync.py similarity index 100% rename from Lib/unittest/test/testmock/testasync.py rename to Lib/test/test_unittest/testmock/testasync.py diff --git a/Lib/unittest/test/testmock/testcallable.py b/Lib/test/test_unittest/testmock/testcallable.py similarity index 98% rename from Lib/unittest/test/testmock/testcallable.py rename to Lib/test/test_unittest/testmock/testcallable.py index 5eadc007049400..ca88511f63959d 100644 --- a/Lib/unittest/test/testmock/testcallable.py +++ b/Lib/test/test_unittest/testmock/testcallable.py @@ -3,7 +3,7 @@ # http://www.voidspace.org.uk/python/mock/ import unittest -from unittest.test.testmock.support import is_instance, X, SomeClass +from test.test_unittest.testmock.support import is_instance, X, SomeClass from unittest.mock import ( Mock, MagicMock, NonCallableMagicMock, diff --git 
a/Lib/unittest/test/testmock/testhelpers.py b/Lib/test/test_unittest/testmock/testhelpers.py similarity index 100% rename from Lib/unittest/test/testmock/testhelpers.py rename to Lib/test/test_unittest/testmock/testhelpers.py diff --git a/Lib/unittest/test/testmock/testmagicmethods.py b/Lib/test/test_unittest/testmock/testmagicmethods.py similarity index 100% rename from Lib/unittest/test/testmock/testmagicmethods.py rename to Lib/test/test_unittest/testmock/testmagicmethods.py diff --git a/Lib/unittest/test/testmock/testmock.py b/Lib/test/test_unittest/testmock/testmock.py similarity index 99% rename from Lib/unittest/test/testmock/testmock.py rename to Lib/test/test_unittest/testmock/testmock.py index c99098dc4ea86a..8a92490137f9a7 100644 --- a/Lib/unittest/test/testmock/testmock.py +++ b/Lib/test/test_unittest/testmock/testmock.py @@ -5,7 +5,7 @@ from test.support import ALWAYS_EQ import unittest -from unittest.test.testmock.support import is_instance +from test.test_unittest.testmock.support import is_instance from unittest import mock from unittest.mock import ( call, DEFAULT, patch, sentinel, diff --git a/Lib/unittest/test/testmock/testpatch.py b/Lib/test/test_unittest/testmock/testpatch.py similarity index 98% rename from Lib/unittest/test/testmock/testpatch.py rename to Lib/test/test_unittest/testmock/testpatch.py index 8ab63a1317d3df..93ec0ca4bef524 100644 --- a/Lib/unittest/test/testmock/testpatch.py +++ b/Lib/test/test_unittest/testmock/testpatch.py @@ -7,8 +7,8 @@ from collections import OrderedDict import unittest -from unittest.test.testmock import support -from unittest.test.testmock.support import SomeClass, is_instance +from test.test_unittest.testmock import support +from test.test_unittest.testmock.support import SomeClass, is_instance from test.test_importlib.util import uncache from unittest.mock import ( @@ -669,7 +669,7 @@ def test_patch_dict_decorator_resolution(self): # the new dictionary during function call original = support.target.copy() - @patch.dict('unittest.test.testmock.support.target', {'bar': 'BAR'}) + @patch.dict('test.test_unittest.testmock.support.target', {'bar': 'BAR'}) def test(): self.assertEqual(support.target, {'foo': 'BAZ', 'bar': 'BAR'}) @@ -1614,7 +1614,7 @@ def test_patch_with_spec_mock_repr(self): def test_patch_nested_autospec_repr(self): - with patch('unittest.test.testmock.support', autospec=True) as m: + with patch('test.test_unittest.testmock.support', autospec=True) as m: self.assertIn(" name='support.SomeClass.wibble()'", repr(m.SomeClass.wibble())) self.assertIn(" name='support.SomeClass().wibble()'", @@ -1882,7 +1882,7 @@ def foo(x=0): with patch.object(foo, '__module__', "testpatch2"): self.assertEqual(foo.__module__, "testpatch2") - self.assertEqual(foo.__module__, 'unittest.test.testmock.testpatch') + self.assertEqual(foo.__module__, 'test.test_unittest.testmock.testpatch') with patch.object(foo, '__annotations__', dict([('s', 1, )])): self.assertEqual(foo.__annotations__, dict([('s', 1, )])) @@ -1917,16 +1917,16 @@ def test_dotted_but_module_not_loaded(self): # This exercises the AttributeError branch of _dot_lookup. 
# make sure it's there - import unittest.test.testmock.support + import test.test_unittest.testmock.support # now make sure it's not: with patch.dict('sys.modules'): - del sys.modules['unittest.test.testmock.support'] - del sys.modules['unittest.test.testmock'] - del sys.modules['unittest.test'] + del sys.modules['test.test_unittest.testmock.support'] + del sys.modules['test.test_unittest.testmock'] + del sys.modules['test.test_unittest'] del sys.modules['unittest'] # now make sure we can patch based on a dotted path: - @patch('unittest.test.testmock.support.X') + @patch('test.test_unittest.testmock.support.X') def test(mock): pass test() @@ -1943,7 +1943,7 @@ class Foo: def test_cant_set_kwargs_when_passing_a_mock(self): - @patch('unittest.test.testmock.support.X', new=object(), x=1) + @patch('test.test_unittest.testmock.support.X', new=object(), x=1) def test(): pass with self.assertRaises(TypeError): test() diff --git a/Lib/unittest/test/testmock/testsealable.py b/Lib/test/test_unittest/testmock/testsealable.py similarity index 100% rename from Lib/unittest/test/testmock/testsealable.py rename to Lib/test/test_unittest/testmock/testsealable.py diff --git a/Lib/unittest/test/testmock/testsentinel.py b/Lib/test/test_unittest/testmock/testsentinel.py similarity index 100% rename from Lib/unittest/test/testmock/testsentinel.py rename to Lib/test/test_unittest/testmock/testsentinel.py diff --git a/Lib/unittest/test/testmock/testwith.py b/Lib/test/test_unittest/testmock/testwith.py similarity index 99% rename from Lib/unittest/test/testmock/testwith.py rename to Lib/test/test_unittest/testmock/testwith.py index c74d49a63c8928..8dc8eb1137691f 100644 --- a/Lib/unittest/test/testmock/testwith.py +++ b/Lib/test/test_unittest/testmock/testwith.py @@ -1,7 +1,7 @@ import unittest from warnings import catch_warnings -from unittest.test.testmock.support import is_instance +from test.test_unittest.testmock.support import is_instance from unittest.mock import MagicMock, Mock, patch, sentinel, mock_open, call diff --git a/Lib/test/test_univnewlines.py b/Lib/test/test_univnewlines.py index b905491878001c..ed2e0970bac949 100644 --- a/Lib/test/test_univnewlines.py +++ b/Lib/test/test_univnewlines.py @@ -4,7 +4,6 @@ import unittest import os import sys -from test import support from test.support import os_helper diff --git a/Lib/test/test_unparse.py b/Lib/test/test_unparse.py index f999ae8c16ceaf..969aa16678f493 100644 --- a/Lib/test/test_unparse.py +++ b/Lib/test/test_unparse.py @@ -648,6 +648,9 @@ def test_star_expr_assign_target(self): self.check_src_roundtrip(source.format(target=target)) def test_star_expr_assign_target_multiple(self): + self.check_src_roundtrip("() = []") + self.check_src_roundtrip("[] = ()") + self.check_src_roundtrip("() = [a] = c, = [d] = e, f = () = g = h") self.check_src_roundtrip("a = b = c = d") self.check_src_roundtrip("a, b = c, d = e, f = g") self.check_src_roundtrip("[a, b] = [c, d] = [e, f] = g") diff --git a/Lib/test/test_urllib.py b/Lib/test/test_urllib.py index bc6e74c291ac1c..f067560ca6caa1 100644 --- a/Lib/test/test_urllib.py +++ b/Lib/test/test_urllib.py @@ -10,6 +10,7 @@ from unittest.mock import patch from test import support from test.support import os_helper +from test.support import socket_helper from test.support import warnings_helper import os try: @@ -24,6 +25,10 @@ import collections +if not socket_helper.has_gethostname: + raise unittest.SkipTest("test requires gethostname()") + + def hexescape(char): """Escape char as RFC 2396 specifies""" hex_repr = 
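# Sketch, not part of the patch: the round-trips the test_unparse hunk above
# adds.  Empty tuple and list assignment targets are legal Python and, with the
# change under test, survive ast.unparse() unchanged.
import ast

for src in ("() = []", "[] = ()"):
    assert ast.unparse(ast.parse(src)) == src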
hex(ord(char))[2:].upper() diff --git a/Lib/test/test_urllib2.py b/Lib/test/test_urllib2.py index 46a0ab1d83e60e..88270b7537b5e7 100644 --- a/Lib/test/test_urllib2.py +++ b/Lib/test/test_urllib2.py @@ -1,7 +1,6 @@ import unittest from test import support from test.support import os_helper -from test.support import socket_helper from test.support import warnings_helper from test import test_urllib diff --git a/Lib/test/test_urllib2net.py b/Lib/test/test_urllib2net.py index 04cfb492e45492..5da41c37bbfb8e 100644 --- a/Lib/test/test_urllib2net.py +++ b/Lib/test/test_urllib2net.py @@ -3,6 +3,7 @@ from test import support from test.support import os_helper from test.support import socket_helper +from test.support import ResourceDenied from test.test_urllib2 import sanepathname2url import os diff --git a/Lib/test/test_urllib_response.py b/Lib/test/test_urllib_response.py index 73d2ef0424f4af..b76763f4ed824f 100644 --- a/Lib/test/test_urllib_response.py +++ b/Lib/test/test_urllib_response.py @@ -4,6 +4,11 @@ import tempfile import urllib.response import unittest +from test import support + +if support.is_wasi: + raise unittest.SkipTest("Cannot create socket on WASI") + class TestResponse(unittest.TestCase): diff --git a/Lib/test/test_uu.py b/Lib/test/test_uu.py index 316a04af1cdaa7..0493aae4fc67be 100644 --- a/Lib/test/test_uu.py +++ b/Lib/test/test_uu.py @@ -74,6 +74,7 @@ def test_encode(self): with self.assertRaises(TypeError): uu.encode(inp, out, "t1", 0o644, True) + @os_helper.skip_unless_working_chmod def test_decode(self): for backtick in True, False: inp = io.BytesIO(encodedtextwrapped(0o666, "t1", backtick=backtick)) @@ -199,6 +200,8 @@ def test_encode(self): s = fout.read() self.assertEqual(s, encodedtextwrapped(0o644, self.tmpin)) + # decode() calls chmod() + @os_helper.skip_unless_working_chmod def test_decode(self): with open(self.tmpin, 'wb') as f: f.write(encodedtextwrapped(0o644, self.tmpout)) @@ -211,6 +214,7 @@ def test_decode(self): self.assertEqual(s, plaintext) # XXX is there an xp way to verify the mode? + @os_helper.skip_unless_working_chmod def test_decode_filename(self): with open(self.tmpin, 'wb') as f: f.write(encodedtextwrapped(0o644, self.tmpout)) @@ -221,6 +225,7 @@ def test_decode_filename(self): s = f.read() self.assertEqual(s, plaintext) + @os_helper.skip_unless_working_chmod def test_decodetwice(self): # Verify that decode() will refuse to overwrite an existing file with open(self.tmpin, 'wb') as f: @@ -231,6 +236,7 @@ def test_decodetwice(self): with open(self.tmpin, 'rb') as f: self.assertRaises(uu.Error, uu.decode, f) + @os_helper.skip_unless_working_chmod def test_decode_mode(self): # Verify that decode() will set the given mode for the out_file expected_mode = 0o444 diff --git a/Lib/test/test_venv.py b/Lib/test/test_venv.py index d96cf1e6c74930..1545a94affc5e4 100644 --- a/Lib/test/test_venv.py +++ b/Lib/test/test_venv.py @@ -5,18 +5,21 @@ Licensed to the PSF under a contributor agreement. 
""" +import contextlib import ensurepip import os import os.path +import pathlib import re import shutil import struct import subprocess import sys import tempfile -from test.support import (captured_stdout, captured_stderr, requires_zlib, +from test.support import (captured_stdout, captured_stderr, skip_if_broken_multiprocessing_synchronize, verbose, - requires_subprocess, is_emscripten) + requires_subprocess, is_emscripten, is_wasi, + requires_venv_with_pip) from test.support.os_helper import (can_symlink, EnvironmentVarGuard, rmtree) import unittest import venv @@ -34,8 +37,8 @@ or sys._base_executable != sys.executable, 'cannot run venv.create from within a venv on this platform') -if is_emscripten: - raise unittest.SkipTest("venv is not available on Emscripten.") +if is_emscripten or is_wasi: + raise unittest.SkipTest("venv is not available on Emscripten/WASI.") @requires_subprocess() def check_output(cmd, encoding=None): @@ -98,12 +101,23 @@ def isdir(self, *args): fn = self.get_env_file(*args) self.assertTrue(os.path.isdir(fn)) - def test_defaults(self): + def test_defaults_with_str_path(self): """ - Test the create function with default arguments. + Test the create function with default arguments and a str path. """ rmtree(self.env_dir) self.run_with_capture(venv.create, self.env_dir) + self._check_output_of_default_create() + + def test_defaults_with_pathlib_path(self): + """ + Test the create function with default arguments and a pathlib.Path path. + """ + rmtree(self.env_dir) + self.run_with_capture(venv.create, pathlib.Path(self.env_dir)) + self._check_output_of_default_create() + + def _check_output_of_default_create(self): self.isdir(self.bindir) self.isdir(self.include) self.isdir(*self.lib) @@ -473,7 +487,9 @@ def test_pathsep_error(self): the path separator. """ rmtree(self.env_dir) - self.assertRaises(ValueError, venv.create, self.env_dir + os.pathsep) + bad_itempath = self.env_dir + os.pathsep + self.assertRaises(ValueError, venv.create, bad_itempath) + self.assertRaises(ValueError, venv.create, pathlib.Path(bad_itempath)) @requireVenvCreate class EnsurePipTest(BaseTest): @@ -543,16 +559,10 @@ def do_test_with_pip(self, system_site_packages): # Actually run the create command with all that unhelpful # config in place to ensure we ignore it - try: + with self.nicer_error(): self.run_with_capture(venv.create, self.env_dir, system_site_packages=system_site_packages, with_pip=True) - except subprocess.CalledProcessError as exc: - # The output this produces can be a little hard to read, - # but at least it has all the details - details = exc.output.decode(errors="replace") - msg = "{}\n\n**Subprocess Output**\n{}" - self.fail(msg.format(exc, details)) # Ensure pip is available in the virtual environment envpy = os.path.join(os.path.realpath(self.env_dir), self.bindir, self.exe) # Ignore DeprecationWarning since pip code is not part of Python @@ -573,13 +583,14 @@ def do_test_with_pip(self, system_site_packages): # Check the private uninstall command provided for the Windows # installers works (at least in a virtual environment) with EnvironmentVarGuard() as envvars: - # It seems ensurepip._uninstall calls subprocesses which do not - # inherit the interpreter settings. - envvars["PYTHONWARNINGS"] = "ignore" - out, err = check_output([envpy, - '-W', 'ignore::DeprecationWarning', - '-W', 'ignore::ImportWarning', '-I', - '-m', 'ensurepip._uninstall']) + with self.nicer_error(): + # It seems ensurepip._uninstall calls subprocesses which do not + # inherit the interpreter settings. 
+ envvars["PYTHONWARNINGS"] = "ignore" + out, err = check_output([envpy, + '-W', 'ignore::DeprecationWarning', + '-W', 'ignore::ImportWarning', '-I', + '-m', 'ensurepip._uninstall']) # We force everything to text, so unittest gives the detailed diff # if we get unexpected results err = err.decode("latin-1") # Force to text, prevent decoding errors @@ -605,12 +616,30 @@ def do_test_with_pip(self, system_site_packages): if not system_site_packages: self.assert_pip_not_installed() - # Issue #26610: pip/pep425tags.py requires ctypes - @unittest.skipUnless(ctypes, 'pip requires ctypes') - @requires_zlib() + @contextlib.contextmanager + def nicer_error(self): + """ + Capture output from a failed subprocess for easier debugging. + + The output this handler produces can be a little hard to read, + but at least it has all the details. + """ + try: + yield + except subprocess.CalledProcessError as exc: + out = exc.output.decode(errors="replace") + err = exc.stderr.decode(errors="replace") + self.fail( + f"{exc}\n\n" + f"**Subprocess Output**\n{out}\n\n" + f"**Subprocess Error**\n{err}" + ) + + @requires_venv_with_pip() def test_with_pip(self): self.do_test_with_pip(False) self.do_test_with_pip(True) + if __name__ == "__main__": unittest.main() diff --git a/Lib/test/test_wait3.py b/Lib/test/test_wait3.py index 4ec7690ac19bbf..eae885a61300be 100644 --- a/Lib/test/test_wait3.py +++ b/Lib/test/test_wait3.py @@ -4,7 +4,6 @@ import os import subprocess import sys -import time import unittest from test.fork_wait import ForkWait from test import support @@ -20,14 +19,12 @@ def wait_impl(self, cpid, *, exitcode): # This many iterations can be required, since some previously run # tests (e.g. test_ctypes) could have spawned a lot of children # very quickly. - deadline = time.monotonic() + support.SHORT_TIMEOUT - while time.monotonic() <= deadline: + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): # wait3() shouldn't hang, but some of the buildbots seem to hang # in the forking tests. This is an attempt to fix the problem. spid, status, rusage = os.wait3(os.WNOHANG) if spid == cpid: break - time.sleep(0.1) self.assertEqual(spid, cpid) self.assertEqual(os.waitstatus_to_exitcode(status), exitcode) diff --git a/Lib/test/test_wait4.py b/Lib/test/test_wait4.py index 24f1aaec60c56b..67afab1d6f2e3c 100644 --- a/Lib/test/test_wait4.py +++ b/Lib/test/test_wait4.py @@ -2,7 +2,6 @@ """ import os -import time import sys import unittest from test.fork_wait import ForkWait @@ -22,14 +21,12 @@ def wait_impl(self, cpid, *, exitcode): # Issue #11185: wait4 is broken on AIX and will always return 0 # with WNOHANG. option = 0 - deadline = time.monotonic() + support.SHORT_TIMEOUT - while time.monotonic() <= deadline: + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): # wait4() shouldn't hang, but some of the buildbots seem to hang # in the forking tests. This is an attempt to fix the problem. 
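A rough sketch (plain stdlib only, not the test.support implementation) of the polling pattern that the support.sleeping_retry() loops above replace: retry a non-blocking check until it succeeds or a timeout expires, sleeping between attempts.

    import time

    def poll(predicate, timeout, interval=0.1):
        # Retry predicate() until it returns True or the deadline passes.
        deadline = time.monotonic() + timeout
        while True:
            if predicate():
                return True
            if time.monotonic() >= deadline:
                return False
            time.sleep(interval)

    # usage sketch: poll(lambda: os.waitpid(pid, os.WNOHANG)[0] == pid, timeout=30.0)
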
spid, status, rusage = os.wait4(cpid, option) if spid == cpid: break - time.sleep(0.1) self.assertEqual(spid, cpid) self.assertEqual(os.waitstatus_to_exitcode(status), exitcode) self.assertTrue(rusage) diff --git a/Lib/test/test_warnings/__init__.py b/Lib/test/test_warnings/__init__.py index 0f960b82bfaebc..b00ddd5df2f256 100644 --- a/Lib/test/test_warnings/__init__.py +++ b/Lib/test/test_warnings/__init__.py @@ -22,8 +22,6 @@ c_warnings = import_helper.import_fresh_module('warnings', fresh=['_warnings']) -Py_DEBUG = hasattr(sys, 'gettotalrefcount') - @contextmanager def warnings_state(module): """Use a specific warnings implementation in warning_tests.""" @@ -1191,7 +1189,7 @@ def test_conflicting_envvar_and_command_line(self): def test_default_filter_configuration(self): pure_python_api = self.module is py_warnings - if Py_DEBUG: + if support.Py_DEBUG: expected_default_filters = [] else: if pure_python_api: diff --git a/Lib/test/test_weakref.py b/Lib/test/test_weakref.py index 702bb4495e7499..3a9573d6e70559 100644 --- a/Lib/test/test_weakref.py +++ b/Lib/test/test_weakref.py @@ -2154,6 +2154,17 @@ def test_atexit(self): self.assertTrue(b'ZeroDivisionError' in err) +class ModuleTestCase(unittest.TestCase): + def test_names(self): + for name in ('ReferenceType', 'ProxyType', 'CallableProxyType', + 'WeakMethod', 'WeakSet', 'WeakKeyDictionary', 'WeakValueDictionary'): + obj = getattr(weakref, name) + if name != 'WeakSet': + self.assertEqual(obj.__module__, 'weakref') + self.assertEqual(obj.__name__, name) + self.assertEqual(obj.__qualname__, name) + + libreftest = """ Doctest for examples in the library reference: weakref.rst >>> from test.support import gc_collect diff --git a/Lib/test/test_winsound.py b/Lib/test/test_winsound.py index 3c3359b9004fd2..a59d0d24f5db48 100644 --- a/Lib/test/test_winsound.py +++ b/Lib/test/test_winsound.py @@ -1,6 +1,7 @@ # Ridiculously simple test of the winsound module for Windows. import functools +import pathlib import time import unittest @@ -84,6 +85,13 @@ def test_keyword_args(self): safe_MessageBeep(type=winsound.MB_OK) +# A class for testing winsound when the given path resolves +# to bytes rather than str. 
+class BytesPath(pathlib.WindowsPath): + def __fspath__(self): + return bytes(super().__fspath__(), 'UTF-8') + + class PlaySoundTest(unittest.TestCase): def test_errors(self): @@ -116,6 +124,20 @@ def test_snd_filename(self): fn = support.findfile('pluck-pcm8.wav', subdir='audiodata') safe_PlaySound(fn, winsound.SND_FILENAME | winsound.SND_NODEFAULT) + def test_snd_filepath(self): + fn = support.findfile('pluck-pcm8.wav', subdir='audiodata') + path = pathlib.Path(fn) + safe_PlaySound(path, winsound.SND_FILENAME | winsound.SND_NODEFAULT) + + def test_snd_filepath_as_bytes(self): + fn = support.findfile('pluck-pcm8.wav', subdir='audiodata') + self.assertRaises( + TypeError, + winsound.PlaySound, + BytesPath(fn), + winsound.SND_FILENAME | winsound.SND_NODEFAULT + ) + def test_aliases(self): aliases = [ "SystemAsterisk", diff --git a/Lib/test/test_xml_etree.py b/Lib/test/test_xml_etree.py index aea77b192c1006..afa4641e6906b7 100644 --- a/Lib/test/test_xml_etree.py +++ b/Lib/test/test_xml_etree.py @@ -3739,13 +3739,7 @@ def test_write_to_filename_as_unicode(self): tree = ET.ElementTree(ET.XML('''\xf8''')) tree.write(TESTFN, encoding='unicode') with open(TESTFN, 'rb') as f: - data = f.read() - expected = "\xf8".encode(encoding, 'xmlcharrefreplace') - if encoding.lower() in ('utf-8', 'ascii'): - self.assertEqual(data, expected) - else: - self.assertIn(b"\xc3\xb8") def test_write_to_text_file(self): self.addCleanup(os_helper.unlink, TESTFN) @@ -3760,17 +3754,13 @@ def test_write_to_text_file(self): tree.write(f, encoding='unicode') self.assertFalse(f.closed) with open(TESTFN, 'rb') as f: - self.assertEqual(f.read(), convlinesep( - b'''\n''' - b'''ø''')) + self.assertEqual(f.read(), b'''ø''') with open(TESTFN, 'w', encoding='ISO-8859-1') as f: tree.write(f, encoding='unicode') self.assertFalse(f.closed) with open(TESTFN, 'rb') as f: - self.assertEqual(f.read(), convlinesep( - b'''\n''' - b'''\xf8''')) + self.assertEqual(f.read(), b'''\xf8''') def test_write_to_binary_file(self): self.addCleanup(os_helper.unlink, TESTFN) diff --git a/Lib/test/test_yield_from.py b/Lib/test/test_yield_from.py index d105d8c6eb513c..1a60357a1bcd62 100644 --- a/Lib/test/test_yield_from.py +++ b/Lib/test/test_yield_from.py @@ -1049,6 +1049,533 @@ def outer(): g.send((1, 2, 3, 4)) self.assertEqual(v, (1, 2, 3, 4)) +class TestInterestingEdgeCases(unittest.TestCase): + + def assert_stop_iteration(self, iterator): + with self.assertRaises(StopIteration) as caught: + next(iterator) + self.assertIsNone(caught.exception.value) + self.assertIsNone(caught.exception.__context__) + + def assert_generator_raised_stop_iteration(self): + return self.assertRaisesRegex(RuntimeError, r"^generator raised StopIteration$") + + def assert_generator_ignored_generator_exit(self): + return self.assertRaisesRegex(RuntimeError, r"^generator ignored GeneratorExit$") + + def test_close_and_throw_work(self): + + yielded_first = object() + yielded_second = object() + returned = object() + + def inner(): + yield yielded_first + yield yielded_second + return returned + + def outer(): + return (yield from inner()) + + with self.subTest("close"): + g = outer() + self.assertIs(next(g), yielded_first) + g.close() + self.assert_stop_iteration(g) + + with self.subTest("throw GeneratorExit"): + g = outer() + self.assertIs(next(g), yielded_first) + thrown = GeneratorExit() + with self.assertRaises(GeneratorExit) as caught: + g.throw(thrown) + self.assertIs(caught.exception, thrown) + self.assertIsNone(caught.exception.__context__) + 
self.assert_stop_iteration(g) + + with self.subTest("throw StopIteration"): + g = outer() + self.assertIs(next(g), yielded_first) + thrown = StopIteration() + # PEP 479: + with self.assert_generator_raised_stop_iteration() as caught: + g.throw(thrown) + self.assertIs(caught.exception.__context__, thrown) + self.assertIsNone(caught.exception.__context__.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw BaseException"): + g = outer() + self.assertIs(next(g), yielded_first) + thrown = BaseException() + with self.assertRaises(BaseException) as caught: + g.throw(thrown) + self.assertIs(caught.exception, thrown) + self.assertIsNone(caught.exception.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw Exception"): + g = outer() + self.assertIs(next(g), yielded_first) + thrown = Exception() + with self.assertRaises(Exception) as caught: + g.throw(thrown) + self.assertIs(caught.exception, thrown) + self.assertIsNone(caught.exception.__context__) + self.assert_stop_iteration(g) + + def test_close_and_throw_raise_generator_exit(self): + + yielded_first = object() + yielded_second = object() + returned = object() + + def inner(): + try: + yield yielded_first + yield yielded_second + return returned + finally: + raise raised + + def outer(): + return (yield from inner()) + + with self.subTest("close"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = GeneratorExit() + # GeneratorExit is suppressed. This is consistent with PEP 342: + # https://peps.python.org/pep-0342/#new-generator-method-close + g.close() + self.assert_stop_iteration(g) + + with self.subTest("throw GeneratorExit"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = GeneratorExit() + thrown = GeneratorExit() + with self.assertRaises(GeneratorExit) as caught: + g.throw(thrown) + # The raised GeneratorExit is suppressed, but the thrown one + # propagates. 
This is consistent with PEP 380: + # https://peps.python.org/pep-0380/#proposal + self.assertIs(caught.exception, thrown) + self.assertIsNone(caught.exception.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw StopIteration"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = GeneratorExit() + thrown = StopIteration() + with self.assertRaises(GeneratorExit) as caught: + g.throw(thrown) + self.assertIs(caught.exception, raised) + self.assertIs(caught.exception.__context__, thrown) + self.assertIsNone(caught.exception.__context__.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw BaseException"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = GeneratorExit() + thrown = BaseException() + with self.assertRaises(GeneratorExit) as caught: + g.throw(thrown) + self.assertIs(caught.exception, raised) + self.assertIs(caught.exception.__context__, thrown) + self.assertIsNone(caught.exception.__context__.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw Exception"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = GeneratorExit() + thrown = Exception() + with self.assertRaises(GeneratorExit) as caught: + g.throw(thrown) + self.assertIs(caught.exception, raised) + self.assertIs(caught.exception.__context__, thrown) + self.assertIsNone(caught.exception.__context__.__context__) + self.assert_stop_iteration(g) + + def test_close_and_throw_raise_stop_iteration(self): + + yielded_first = object() + yielded_second = object() + returned = object() + + def inner(): + try: + yield yielded_first + yield yielded_second + return returned + finally: + raise raised + + def outer(): + return (yield from inner()) + + with self.subTest("close"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = StopIteration() + # PEP 479: + with self.assert_generator_raised_stop_iteration() as caught: + g.close() + self.assertIs(caught.exception.__context__, raised) + self.assertIsInstance(caught.exception.__context__.__context__, GeneratorExit) + self.assertIsNone(caught.exception.__context__.__context__.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw GeneratorExit"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = StopIteration() + thrown = GeneratorExit() + # PEP 479: + with self.assert_generator_raised_stop_iteration() as caught: + g.throw(thrown) + self.assertIs(caught.exception.__context__, raised) + # This isn't the same GeneratorExit as thrown! 
It's the one created + # by calling inner.close(): + self.assertIsInstance(caught.exception.__context__.__context__, GeneratorExit) + self.assertIsNone(caught.exception.__context__.__context__.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw StopIteration"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = StopIteration() + thrown = StopIteration() + # PEP 479: + with self.assert_generator_raised_stop_iteration() as caught: + g.throw(thrown) + self.assertIs(caught.exception.__context__, raised) + self.assertIs(caught.exception.__context__.__context__, thrown) + self.assertIsNone(caught.exception.__context__.__context__.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw BaseException"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = StopIteration() + thrown = BaseException() + # PEP 479: + with self.assert_generator_raised_stop_iteration() as caught: + g.throw(thrown) + self.assertIs(caught.exception.__context__, raised) + self.assertIs(caught.exception.__context__.__context__, thrown) + self.assertIsNone(caught.exception.__context__.__context__.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw Exception"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = StopIteration() + thrown = Exception() + # PEP 479: + with self.assert_generator_raised_stop_iteration() as caught: + g.throw(thrown) + self.assertIs(caught.exception.__context__, raised) + self.assertIs(caught.exception.__context__.__context__, thrown) + self.assertIsNone(caught.exception.__context__.__context__.__context__) + self.assert_stop_iteration(g) + + def test_close_and_throw_raise_base_exception(self): + + yielded_first = object() + yielded_second = object() + returned = object() + + def inner(): + try: + yield yielded_first + yield yielded_second + return returned + finally: + raise raised + + def outer(): + return (yield from inner()) + + with self.subTest("close"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = BaseException() + with self.assertRaises(BaseException) as caught: + g.close() + self.assertIs(caught.exception, raised) + self.assertIsInstance(caught.exception.__context__, GeneratorExit) + self.assertIsNone(caught.exception.__context__.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw GeneratorExit"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = BaseException() + thrown = GeneratorExit() + with self.assertRaises(BaseException) as caught: + g.throw(thrown) + self.assertIs(caught.exception, raised) + # This isn't the same GeneratorExit as thrown! 
It's the one created + # by calling inner.close(): + self.assertIsInstance(caught.exception.__context__, GeneratorExit) + self.assertIsNone(caught.exception.__context__.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw StopIteration"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = BaseException() + thrown = StopIteration() + with self.assertRaises(BaseException) as caught: + g.throw(thrown) + self.assertIs(caught.exception, raised) + self.assertIs(caught.exception.__context__, thrown) + self.assertIsNone(caught.exception.__context__.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw BaseException"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = BaseException() + thrown = BaseException() + with self.assertRaises(BaseException) as caught: + g.throw(thrown) + self.assertIs(caught.exception, raised) + self.assertIs(caught.exception.__context__, thrown) + self.assertIsNone(caught.exception.__context__.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw Exception"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = BaseException() + thrown = Exception() + with self.assertRaises(BaseException) as caught: + g.throw(thrown) + self.assertIs(caught.exception, raised) + self.assertIs(caught.exception.__context__, thrown) + self.assertIsNone(caught.exception.__context__.__context__) + self.assert_stop_iteration(g) + + def test_close_and_throw_raise_exception(self): + + yielded_first = object() + yielded_second = object() + returned = object() + + def inner(): + try: + yield yielded_first + yield yielded_second + return returned + finally: + raise raised + + def outer(): + return (yield from inner()) + + with self.subTest("close"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = Exception() + with self.assertRaises(Exception) as caught: + g.close() + self.assertIs(caught.exception, raised) + self.assertIsInstance(caught.exception.__context__, GeneratorExit) + self.assertIsNone(caught.exception.__context__.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw GeneratorExit"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = Exception() + thrown = GeneratorExit() + with self.assertRaises(Exception) as caught: + g.throw(thrown) + self.assertIs(caught.exception, raised) + # This isn't the same GeneratorExit as thrown! 
It's the one created + # by calling inner.close(): + self.assertIsInstance(caught.exception.__context__, GeneratorExit) + self.assertIsNone(caught.exception.__context__.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw StopIteration"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = Exception() + thrown = StopIteration() + with self.assertRaises(Exception) as caught: + g.throw(thrown) + self.assertIs(caught.exception, raised) + self.assertIs(caught.exception.__context__, thrown) + self.assertIsNone(caught.exception.__context__.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw BaseException"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = Exception() + thrown = BaseException() + with self.assertRaises(Exception) as caught: + g.throw(thrown) + self.assertIs(caught.exception, raised) + self.assertIs(caught.exception.__context__, thrown) + self.assertIsNone(caught.exception.__context__.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw Exception"): + g = outer() + self.assertIs(next(g), yielded_first) + raised = Exception() + thrown = Exception() + with self.assertRaises(Exception) as caught: + g.throw(thrown) + self.assertIs(caught.exception, raised) + self.assertIs(caught.exception.__context__, thrown) + self.assertIsNone(caught.exception.__context__.__context__) + self.assert_stop_iteration(g) + + def test_close_and_throw_yield(self): + + yielded_first = object() + yielded_second = object() + returned = object() + + def inner(): + try: + yield yielded_first + finally: + yield yielded_second + return returned + + def outer(): + return (yield from inner()) + + with self.subTest("close"): + g = outer() + self.assertIs(next(g), yielded_first) + # No chaining happens. This is consistent with PEP 342: + # https://peps.python.org/pep-0342/#new-generator-method-close + with self.assert_generator_ignored_generator_exit() as caught: + g.close() + self.assertIsNone(caught.exception.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw GeneratorExit"): + g = outer() + self.assertIs(next(g), yielded_first) + thrown = GeneratorExit() + # No chaining happens. 
This is consistent with PEP 342: + # https://peps.python.org/pep-0342/#new-generator-method-close + with self.assert_generator_ignored_generator_exit() as caught: + g.throw(thrown) + self.assertIsNone(caught.exception.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw StopIteration"): + g = outer() + self.assertIs(next(g), yielded_first) + thrown = StopIteration() + self.assertEqual(g.throw(thrown), yielded_second) + # PEP 479: + with self.assert_generator_raised_stop_iteration() as caught: + next(g) + self.assertIs(caught.exception.__context__, thrown) + self.assertIsNone(caught.exception.__context__.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw BaseException"): + g = outer() + self.assertIs(next(g), yielded_first) + thrown = BaseException() + self.assertEqual(g.throw(thrown), yielded_second) + with self.assertRaises(BaseException) as caught: + next(g) + self.assertIs(caught.exception, thrown) + self.assertIsNone(caught.exception.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw Exception"): + g = outer() + self.assertIs(next(g), yielded_first) + thrown = Exception() + self.assertEqual(g.throw(thrown), yielded_second) + with self.assertRaises(Exception) as caught: + next(g) + self.assertIs(caught.exception, thrown) + self.assertIsNone(caught.exception.__context__) + self.assert_stop_iteration(g) + + def test_close_and_throw_return(self): + + yielded_first = object() + yielded_second = object() + returned = object() + + def inner(): + try: + yield yielded_first + yield yielded_second + finally: + return returned + + def outer(): + return (yield from inner()) + + with self.subTest("close"): + g = outer() + self.assertIs(next(g), yielded_first) + # StopIteration is suppressed. This is consistent with PEP 342: + # https://peps.python.org/pep-0342/#new-generator-method-close + g.close() + self.assert_stop_iteration(g) + + with self.subTest("throw GeneratorExit"): + g = outer() + self.assertIs(next(g), yielded_first) + thrown = GeneratorExit() + # StopIteration is suppressed. 
This is consistent with PEP 342: + # https://peps.python.org/pep-0342/#new-generator-method-close + with self.assertRaises(GeneratorExit) as caught: + g.throw(thrown) + self.assertIs(caught.exception, thrown) + self.assertIsNone(caught.exception.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw StopIteration"): + g = outer() + self.assertIs(next(g), yielded_first) + thrown = StopIteration() + with self.assertRaises(StopIteration) as caught: + g.throw(thrown) + self.assertIs(caught.exception.value, returned) + self.assertIsNone(caught.exception.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw BaseException"): + g = outer() + self.assertIs(next(g), yielded_first) + thrown = BaseException() + with self.assertRaises(StopIteration) as caught: + g.throw(thrown) + self.assertIs(caught.exception.value, returned) + self.assertIsNone(caught.exception.__context__) + self.assert_stop_iteration(g) + + with self.subTest("throw Exception"): + g = outer() + self.assertIs(next(g), yielded_first) + thrown = Exception() + with self.assertRaises(StopIteration) as caught: + g.throw(thrown) + self.assertIs(caught.exception.value, returned) + self.assertIsNone(caught.exception.__context__) + self.assert_stop_iteration(g) + if __name__ == '__main__': unittest.main() diff --git a/Lib/test/test_zipapp.py b/Lib/test/test_zipapp.py index 69f2e55d563840..f1c6b2d97621ee 100644 --- a/Lib/test/test_zipapp.py +++ b/Lib/test/test_zipapp.py @@ -9,6 +9,7 @@ import zipapp import zipfile from test.support import requires_zlib +from test.support import os_helper from unittest.mock import patch @@ -54,6 +55,22 @@ def test_create_archive_with_subdirs(self): self.assertIn('foo/', z.namelist()) self.assertIn('bar/', z.namelist()) + def test_create_sorted_archive(self): + # Test that zipapps order their files by name + source = self.tmpdir / 'source' + source.mkdir() + (source / 'zed.py').touch() + (source / 'bin').mkdir() + (source / 'bin' / 'qux').touch() + (source / 'bin' / 'baz').touch() + (source / '__main__.py').touch() + target = io.BytesIO() + zipapp.create_archive(str(source), target) + target.seek(0) + with zipfile.ZipFile(target, 'r') as zf: + self.assertEqual(zf.namelist(), + ["__main__.py", "bin/", "bin/baz", "bin/qux", "zed.py"]) + def test_create_archive_with_filter(self): # Test packing a directory and using filter to specify # which files to include. @@ -301,6 +318,7 @@ def test_content_of_copied_archive(self): # (Unix only) tests that archives with shebang lines are made executable @unittest.skipIf(sys.platform == 'win32', 'Windows does not support an executable bit') + @os_helper.skip_unless_working_chmod def test_shebang_is_executable(self): # Test that an archive with a shebang line is made executable. 
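A small usage sketch (not part of the patch) of the behaviour that the test_create_sorted_archive test added above is checking: with the zipapp.create_archive change later in this patch, which iterates sorted(source.rglob('*')), archive members come out in a deterministic, name-sorted order.

    import io
    import pathlib
    import tempfile
    import zipapp
    import zipfile

    with tempfile.TemporaryDirectory() as tmp:
        src = pathlib.Path(tmp, "source")
        (src / "bin").mkdir(parents=True)
        (src / "__main__.py").touch()
        (src / "zed.py").touch()
        (src / "bin" / "qux").touch()

        buf = io.BytesIO()
        zipapp.create_archive(src, buf)
        buf.seek(0)
        # Expected with the sorted() change: __main__.py, bin/, bin/qux, zed.py
        print(zipfile.ZipFile(buf).namelist())
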
source = self.tmpdir / 'source' diff --git a/Lib/test/test_zipfile.py b/Lib/test/test_zipfile.py index 848bf4f76d4536..f4c11d88c8a09f 100644 --- a/Lib/test/test_zipfile.py +++ b/Lib/test/test_zipfile.py @@ -1740,6 +1740,17 @@ def test_empty_file_raises_BadZipFile(self): fp.write("short file") self.assertRaises(zipfile.BadZipFile, zipfile.ZipFile, TESTFN) + def test_negative_central_directory_offset_raises_BadZipFile(self): + # Zip file containing an empty EOCD record + buffer = bytearray(b'PK\x05\x06' + b'\0'*18) + + # Set the size of the central directory bytes to become 1, + # causing the central directory offset to become negative + for dirsize in 1, 2**32-1: + buffer[12:16] = struct.pack(' 0: + waiter = waiters[0] + try: + waiter.release() + except RuntimeError: + # gh-92530: The previous call of notify() released the lock, + # but was interrupted before removing it from the queue. + # It can happen if a signal handler raises an exception, + # like CTRL+C which raises KeyboardInterrupt. + pass + else: + n -= 1 try: - all_waiters.remove(waiter) + waiters.remove(waiter) except ValueError: pass diff --git a/Lib/tkinter/commondialog.py b/Lib/tkinter/commondialog.py index e595c99defb991..86f5387e001bf2 100644 --- a/Lib/tkinter/commondialog.py +++ b/Lib/tkinter/commondialog.py @@ -10,7 +10,7 @@ __all__ = ["Dialog"] -from tkinter import Frame, _get_temp_root, _destroy_temp_root +from tkinter import _get_temp_root, _destroy_temp_root class Dialog: diff --git a/Lib/tkinter/test/__init__.py b/Lib/tkinter/test/__init__.py deleted file mode 100644 index e69de29bb2d1d6..00000000000000 diff --git a/Lib/tkinter/test/test_tkinter/__init__.py b/Lib/tkinter/test/test_tkinter/__init__.py deleted file mode 100644 index e69de29bb2d1d6..00000000000000 diff --git a/Lib/tkinter/test/test_ttk/__init__.py b/Lib/tkinter/test/test_ttk/__init__.py deleted file mode 100644 index e69de29bb2d1d6..00000000000000 diff --git a/Lib/types.py b/Lib/types.py index 9490da7b9ee3b9..2e73fbc4501337 100644 --- a/Lib/types.py +++ b/Lib/types.py @@ -80,7 +80,7 @@ def resolve_bases(bases): updated = False shift = 0 for i, base in enumerate(bases): - if isinstance(base, type) and not isinstance(base, GenericAlias): + if isinstance(base, type): continue if not hasattr(base, "__mro_entries__"): continue diff --git a/Lib/typing.py b/Lib/typing.py index 306bb9fb6df784..25ae19f71fe94a 100644 --- a/Lib/typing.py +++ b/Lib/typing.py @@ -954,6 +954,9 @@ def __repr__(self): prefix = '~' return prefix + self.__name__ + def __mro_entries__(self, bases): + raise TypeError(f"Cannot subclass an instance of {type(self).__name__}") + class TypeVar(_Final, _Immutable, _BoundVarianceMixin, _PickleUsingNameMixin, _root=True): @@ -1065,6 +1068,45 @@ def __repr__(self): def __typing_subst__(self, arg): raise TypeError("Substitution of bare TypeVarTuple is not supported") + def __typing_prepare_subst__(self, alias, args): + params = alias.__parameters__ + typevartuple_index = params.index(self) + for param in enumerate(params[typevartuple_index + 1:]): + if isinstance(param, TypeVarTuple): + raise TypeError(f"More than one TypeVarTuple parameter in {alias}") + + alen = len(args) + plen = len(params) + left = typevartuple_index + right = plen - typevartuple_index - 1 + var_tuple_index = None + fillarg = None + for k, arg in enumerate(args): + if not isinstance(arg, type): + subargs = getattr(arg, '__typing_unpacked_tuple_args__', None) + if subargs and len(subargs) == 2 and subargs[-1] is ...: + if var_tuple_index is not None: + raise TypeError("More than 
one unpacked arbitrary-length tuple argument") + var_tuple_index = k + fillarg = subargs[0] + if var_tuple_index is not None: + left = min(left, var_tuple_index) + right = min(right, alen - var_tuple_index - 1) + elif left + right > alen: + raise TypeError(f"Too few arguments for {alias};" + f" actual {alen}, expected at least {plen-1}") + + return ( + *args[:left], + *([fillarg]*(typevartuple_index - left)), + tuple(args[left: alen - right]), + *([fillarg]*(plen - right - left - typevartuple_index - 1)), + *args[alen - right:], + ) + + def __mro_entries__(self, bases): + raise TypeError(f"Cannot subclass an instance of {type(self).__name__}") + class ParamSpecArgs(_Final, _Immutable, _root=True): """The args for a ParamSpec object. @@ -1089,6 +1131,9 @@ def __eq__(self, other): return NotImplemented return self.__origin__ == other.__origin__ + def __mro_entries__(self, bases): + raise TypeError(f"Cannot subclass an instance of {type(self).__name__}") + class ParamSpecKwargs(_Final, _Immutable, _root=True): """The kwargs for a ParamSpec object. @@ -1113,6 +1158,9 @@ def __eq__(self, other): return NotImplemented return self.__origin__ == other.__origin__ + def __mro_entries__(self, bases): + raise TypeError(f"Cannot subclass an instance of {type(self).__name__}") + class ParamSpec(_Final, _Immutable, _BoundVarianceMixin, _PickleUsingNameMixin, _root=True): @@ -1184,6 +1232,8 @@ def __typing_subst__(self, arg): f"ParamSpec, or Concatenate. Got {arg}") return arg + def __typing_prepare_subst__(self, alias, args): + return _prepare_paramspec_params(alias, args) def _is_dunder(attr): return attr.startswith('__') and attr.endswith('__') @@ -1255,44 +1305,6 @@ def __dir__(self): + [attr for attr in dir(self.__origin__) if not _is_dunder(attr)])) -def _is_unpacked_tuple(x: Any) -> bool: - # Is `x` something like `*tuple[int]` or `*tuple[int, ...]`? - if not isinstance(x, _UnpackGenericAlias): - return False - # Alright, `x` is `Unpack[something]`. - - # `x` will always have `__args__`, because Unpack[] and Unpack[()] - # aren't legal. - unpacked_type = x.__args__[0] - - return getattr(unpacked_type, '__origin__', None) is tuple - - -def _is_unpacked_arbitrary_length_tuple(x: Any) -> bool: - if not _is_unpacked_tuple(x): - return False - unpacked_tuple = x.__args__[0] - - if not hasattr(unpacked_tuple, '__args__'): - # It's `Unpack[tuple]`. We can't make any assumptions about the length - # of the tuple, so it's effectively an arbitrary-length tuple. - return True - - tuple_args = unpacked_tuple.__args__ - if not tuple_args: - # It's `Unpack[tuple[()]]`. - return False - - last_arg = tuple_args[-1] - if last_arg is Ellipsis: - # It's `Unpack[tuple[something, ...]]`, which is arbitrary-length. - return True - - # If the arguments didn't end with an ellipsis, then it's not an - # arbitrary-length tuple. 
- return False - - # Special typing constructs Union, Optional, Generic, Callable and Tuple # use three special attributes for internal bookkeeping of generic types: # * __parameters__ is a tuple of unique free type parameters of a generic @@ -1385,10 +1397,6 @@ def __getitem__(self, args): args = (args,) args = tuple(_type_convert(p) for p in args) args = _unpack_args(args) - if (self._paramspec_tvars - and any(isinstance(t, ParamSpec) for t in self.__parameters__)): - args = _prepare_paramspec_params(self, args) - new_args = self._determine_new_args(args) r = self.copy_with(new_args) return r @@ -1410,30 +1418,16 @@ def _determine_new_args(self, args): params = self.__parameters__ # In the example above, this would be {T3: str} - new_arg_by_param = {} - typevartuple_index = None - for i, param in enumerate(params): - if isinstance(param, TypeVarTuple): - if typevartuple_index is not None: - raise TypeError(f"More than one TypeVarTuple parameter in {self}") - typevartuple_index = i - + for param in params: + prepare = getattr(param, '__typing_prepare_subst__', None) + if prepare is not None: + args = prepare(self, args) alen = len(args) plen = len(params) - if typevartuple_index is not None: - i = typevartuple_index - j = alen - (plen - i - 1) - if j < i: - raise TypeError(f"Too few arguments for {self};" - f" actual {alen}, expected at least {plen-1}") - new_arg_by_param.update(zip(params[:i], args[:i])) - new_arg_by_param[params[i]] = tuple(args[i: j]) - new_arg_by_param.update(zip(params[i + 1:], args[j:])) - else: - if alen != plen: - raise TypeError(f"Too {'many' if alen > plen else 'few'} arguments for {self};" - f" actual {alen}, expected {plen}") - new_arg_by_param.update(zip(params, args)) + if alen != plen: + raise TypeError(f"Too {'many' if alen > plen else 'few'} arguments for {self};" + f" actual {alen}, expected {plen}") + new_arg_by_param = dict(zip(params, args)) new_args = [] for old_arg in self.__args__: @@ -3368,10 +3362,10 @@ def dataclass_transform( Example usage with a decorator function: - _T = TypeVar("_T") + T = TypeVar("T") @dataclass_transform() - def create_model(cls: type[_T]) -> type[_T]: + def create_model(cls: type[T]) -> type[T]: ... return cls @@ -3400,20 +3394,23 @@ class CustomerModel(ModelBase): id: int name: str - Each of the ``CustomerModel`` classes defined in this example will now - behave similarly to a dataclass created with the ``@dataclasses.dataclass`` - decorator. For example, the type checker will synthesize an ``__init__`` - method. + The ``CustomerModel`` classes defined above will + be treated by type checkers similarly to classes created with + ``@dataclasses.dataclass``. + For example, type checkers will assume these classes have + ``__init__`` methods that accept ``id`` and ``name``. The arguments to this decorator can be used to customize this behavior: - ``eq_default`` indicates whether the ``eq`` parameter is assumed to be - True or False if it is omitted by the caller. + ``True`` or ``False`` if it is omitted by the caller. - ``order_default`` indicates whether the ``order`` parameter is assumed to be True or False if it is omitted by the caller. - ``kw_only_default`` indicates whether the ``kw_only`` parameter is assumed to be True or False if it is omitted by the caller. - ``field_specifiers`` specifies a static list of supported classes or functions that describe fields, similar to ``dataclasses.field()``. + - Arbitrary other keyword arguments are accepted in order to allow for + possible future extensions. 
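A hypothetical illustration (not drawn from this patch) of the keyword arguments described in the list above; model_field is an assumed stand-in for a library's field specifier. typing.dataclass_transform is available from Python 3.11.

    from typing import dataclass_transform

    def model_field(*, default=None, kw_only=False):
        # assumed field specifier, analogous to dataclasses.field()
        return default

    @dataclass_transform(kw_only_default=True, field_specifiers=(model_field,))
    def frozen_model(cls):
        # a real implementation would synthesize __init__ and friends here
        return cls

    @frozen_model
    class Point:
        x: int = model_field(default=0)
        y: int = model_field(default=0)
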
At runtime, this decorator records its arguments in the ``__dataclass_transform__`` attribute on the decorated object. diff --git a/Lib/unittest/__init__.py b/Lib/unittest/__init__.py index 73e74de62bfc0e..5bcbf834840b59 100644 --- a/Lib/unittest/__init__.py +++ b/Lib/unittest/__init__.py @@ -70,16 +70,6 @@ def testMultiply(self): from .loader import makeSuite, getTestCaseNames, findTestCases -# There are no tests here, so don't try to run anything discovered from -# introspecting the symbols (e.g. FunctionTestCase). Instead, all our -# tests come from within unittest.test. -def load_tests(loader, tests, pattern): - import os.path - # top level directory cached on loader instance - this_dir = os.path.dirname(__file__) - return loader.discover(start_dir=this_dir, pattern=pattern) - - # Lazy import of IsolatedAsyncioTestCase from .async_case # It imports asyncio, which is relatively heavy, but most tests # do not need it. diff --git a/Lib/unittest/test/__init__.py b/Lib/unittest/test/__init__.py deleted file mode 100644 index 143f4ab5a3d878..00000000000000 --- a/Lib/unittest/test/__init__.py +++ /dev/null @@ -1,25 +0,0 @@ -import os -import sys -import unittest - - -here = os.path.dirname(__file__) -loader = unittest.defaultTestLoader - -def suite(): - suite = unittest.TestSuite() - for fn in os.listdir(here): - if fn.startswith("test") and fn.endswith(".py"): - modname = "unittest.test." + fn[:-3] - try: - __import__(modname) - except unittest.SkipTest: - continue - module = sys.modules[modname] - suite.addTest(loader.loadTestsFromModule(module)) - suite.addTest(loader.loadTestsFromName('unittest.test.testmock')) - return suite - - -if __name__ == "__main__": - unittest.main(defaultTest="suite") diff --git a/Lib/unittest/test/__main__.py b/Lib/unittest/test/__main__.py deleted file mode 100644 index 44d0591e8473af..00000000000000 --- a/Lib/unittest/test/__main__.py +++ /dev/null @@ -1,18 +0,0 @@ -import os -import unittest - - -def load_tests(loader, standard_tests, pattern): - # top level directory cached on loader instance - this_dir = os.path.dirname(__file__) - pattern = pattern or "test_*.py" - # We are inside unittest.test, so the top-level is two notches up - top_level_dir = os.path.dirname(os.path.dirname(this_dir)) - package_tests = loader.discover(start_dir=this_dir, pattern=pattern, - top_level_dir=top_level_dir) - standard_tests.addTests(package_tests) - return standard_tests - - -if __name__ == '__main__': - unittest.main() diff --git a/Lib/unittest/test/testmock/__init__.py b/Lib/unittest/test/testmock/__init__.py deleted file mode 100644 index 87d7ae994d5cd3..00000000000000 --- a/Lib/unittest/test/testmock/__init__.py +++ /dev/null @@ -1,17 +0,0 @@ -import os -import sys -import unittest - - -here = os.path.dirname(__file__) -loader = unittest.defaultTestLoader - -def load_tests(*args): - suite = unittest.TestSuite() - for fn in os.listdir(here): - if fn.startswith("test") and fn.endswith(".py"): - modname = "unittest.test.testmock." 
+ fn[:-3] - __import__(modname) - module = sys.modules[modname] - suite.addTest(loader.loadTestsFromModule(module)) - return suite diff --git a/Lib/urllib/parse.py b/Lib/urllib/parse.py index d70a6943f0a739..fd6d9f44c6268c 100644 --- a/Lib/urllib/parse.py +++ b/Lib/urllib/parse.py @@ -30,7 +30,6 @@ from collections import namedtuple import functools import re -import sys import types import warnings diff --git a/Lib/urllib/request.py b/Lib/urllib/request.py index 6d580a434a7bea..7878daacb52d08 100644 --- a/Lib/urllib/request.py +++ b/Lib/urllib/request.py @@ -88,7 +88,6 @@ import http.client import io import os -import posixpath import re import socket import string @@ -1384,12 +1383,16 @@ class HTTPSHandler(AbstractHTTPHandler): def __init__(self, debuglevel=0, context=None, check_hostname=None): AbstractHTTPHandler.__init__(self, debuglevel) + if context is None: + http_version = http.client.HTTPSConnection._http_vsn + context = http.client._create_https_context(http_version) + if check_hostname is not None: + context.check_hostname = check_hostname self._context = context - self._check_hostname = check_hostname def https_open(self, req): return self.do_open(http.client.HTTPSConnection, req, - context=self._context, check_hostname=self._check_hostname) + context=self._context) https_request = AbstractHTTPHandler.do_request_ diff --git a/Lib/venv/__init__.py b/Lib/venv/__init__.py index 7bfbadda7b4970..6032f3648e15ff 100644 --- a/Lib/venv/__init__.py +++ b/Lib/venv/__init__.py @@ -116,7 +116,7 @@ def create_if_needed(d): elif os.path.islink(d) or os.path.isfile(d): raise ValueError('Unable to create directory %r' % d) - if os.pathsep in env_dir: + if os.pathsep in os.fspath(env_dir): raise ValueError(f'Refusing to create a venv in {env_dir} because ' f'it contains the PATH separator {os.pathsep}.') if os.path.exists(env_dir) and self.clear: diff --git a/Lib/xml/etree/ElementTree.py b/Lib/xml/etree/ElementTree.py index a5cc65e789c004..1dc80351bf7ddd 100644 --- a/Lib/xml/etree/ElementTree.py +++ b/Lib/xml/etree/ElementTree.py @@ -731,6 +731,7 @@ def write(self, file_or_filename, with _get_writer(file_or_filename, encoding) as (write, declared_encoding): if method == "xml" and (xml_declaration or (xml_declaration is None and + encoding.lower() != "unicode" and declared_encoding.lower() not in ("utf-8", "us-ascii"))): write("\n" % ( declared_encoding,)) @@ -757,13 +758,10 @@ def _get_writer(file_or_filename, encoding): except AttributeError: # file_or_filename is a file name if encoding.lower() == "unicode": - file = open(file_or_filename, "w", - errors="xmlcharrefreplace") - else: - file = open(file_or_filename, "w", encoding=encoding, - errors="xmlcharrefreplace") - with file: - yield file.write, file.encoding + encoding="utf-8" + with open(file_or_filename, "w", encoding=encoding, + errors="xmlcharrefreplace") as file: + yield file.write, encoding else: # file_or_filename is a file-like object # encoding determines if it is a text or binary writer diff --git a/Lib/zipapp.py b/Lib/zipapp.py index ce77632516c646..d8ebfcb6c73b0c 100644 --- a/Lib/zipapp.py +++ b/Lib/zipapp.py @@ -136,7 +136,7 @@ def create_archive(source, target=None, interpreter=None, main=None, compression = (zipfile.ZIP_DEFLATED if compressed else zipfile.ZIP_STORED) with zipfile.ZipFile(fd, 'w', compression=compression) as z: - for child in source.rglob('*'): + for child in sorted(source.rglob('*')): arcname = child.relative_to(source) if filter is None or filter(arcname): z.write(child, arcname.as_posix()) diff --git 
a/Lib/zipfile.py b/Lib/zipfile.py index dc02011084329a..fc6ca65e5ed1e9 100644 --- a/Lib/zipfile.py +++ b/Lib/zipfile.py @@ -1381,6 +1381,8 @@ def _RealGetContents(self): print("given, inferred, offset", offset_cd, inferred, concat) # self.start_dir: Position of start of central directory self.start_dir = offset_cd + concat + if self.start_dir < 0: + raise BadZipFile("Bad offset for central directory") fp.seek(self.start_dir, 0) data = fp.read(size_cd) fp = io.BytesIO(data) @@ -1564,7 +1566,7 @@ def open(self, name, mode="r", pwd=None, *, force_zip64=False): fname = zef_file.read(fheader[_FH_FILENAME_LENGTH]) if fheader[_FH_EXTRA_FIELD_LENGTH]: - zef_file.read(fheader[_FH_EXTRA_FIELD_LENGTH]) + zef_file.seek(fheader[_FH_EXTRA_FIELD_LENGTH], whence=1) if zinfo.flag_bits & _MASK_COMPRESSED_PATCH: # Zip 2.7: compressed patched data diff --git a/Mac/BuildScript/scripts/postflight.framework b/Mac/BuildScript/scripts/postflight.framework index 0f2e52c4ca1926..a63909358f1719 100755 --- a/Mac/BuildScript/scripts/postflight.framework +++ b/Mac/BuildScript/scripts/postflight.framework @@ -8,12 +8,12 @@ FWK="/Library/Frameworks/Python.framework/Versions/@PYVER@" "${FWK}/bin/python@PYVER@" -E -s -Wi \ "${FWK}/lib/python${PYVER}/compileall.py" -q -j0 \ - -f -x 'bad_coding|badsyntax|site-packages|lib2to3/tests/data' \ + -f -x 'bad_coding|badsyntax|site-packages|test/test_lib2to3/data' \ "${FWK}/lib/python${PYVER}" "${FWK}/bin/python@PYVER@" -E -s -Wi -O \ "${FWK}/lib/python${PYVER}/compileall.py" -q -j0 \ - -f -x 'bad_coding|badsyntax|site-packages|lib2to3/tests/data' \ + -f -x 'bad_coding|badsyntax|site-packages|test/test_lib2to3/data' \ "${FWK}/lib/python${PYVER}" "${FWK}/bin/python@PYVER@" -E -s -Wi \ diff --git a/Makefile.pre.in b/Makefile.pre.in index 515c18cc216664..102cd752c39cdf 100644 --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -247,8 +247,8 @@ SRCDIRS= @SRCDIRS@ SUBDIRSTOO= Include Lib Misc # assets for Emscripten browser builds -WASM_ASSETS_DIR=".$(prefix)" -WASM_STDLIB="$(WASM_ASSETS_DIR)/local/lib/python$(VERSION)/os.py" +WASM_ASSETS_DIR=.$(prefix) +WASM_STDLIB=$(WASM_ASSETS_DIR)/lib/python$(VERSION)/os.py # Files and directories to be distributed CONFIGFILES= configure configure.ac acconfig.h pyconfig.h.in Makefile.pre.in @@ -290,8 +290,11 @@ HOSTRUNNER= @HOSTRUNNER@ PYTHON_FOR_REGEN?=@PYTHON_FOR_REGEN@ UPDATE_FILE=$(PYTHON_FOR_REGEN) $(srcdir)/Tools/scripts/update_file.py PYTHON_FOR_BUILD=@PYTHON_FOR_BUILD@ +# Single-platform builds depend on $(BUILDPYTHON). Cross builds use an +# external "build Python" and have an empty PYTHON_FOR_BUILD_DEPS. +PYTHON_FOR_BUILD_DEPS=@PYTHON_FOR_BUILD_DEPS@ -# Normal builds use Programs/_freeze_module.c for bootstrapping and +# Single-platform builds use Programs/_freeze_module.c for bootstrapping and # ./_bootstrap_python Programs/_freeze_module.py for remaining modules # Cross builds use an external "build Python" for all modules. PYTHON_FOR_FREEZE=@PYTHON_FOR_FREEZE@ @@ -580,8 +583,8 @@ LIBEXPAT_HEADERS= \ # Default target all: @DEF_MAKE_ALL_RULE@ -build_all: check-clean-src $(BUILDPYTHON) oldsharedmods sharedmods gdbhooks \ - Programs/_testembed python-config +build_all: check-clean-src $(BUILDPYTHON) platform oldsharedmods sharedmods \ + gdbhooks Programs/_testembed python-config build_wasm: check-clean-src $(BUILDPYTHON) platform oldsharedmods python-config # Check that the source is clean when building out of source. 
@@ -700,7 +703,7 @@ clinic: check-clean-src $(srcdir)/Modules/_blake2/blake2s_impl.c $(BUILDPYTHON): Programs/python.o $(LINK_PYTHON_DEPS) $(LINKCC) $(PY_CORE_LDFLAGS) $(LINKFORSHARED) -o $@ Programs/python.o $(LINK_PYTHON_OBJS) $(LIBS) $(MODLIBS) $(SYSLIBS) -platform: $(BUILDPYTHON) pybuilddir.txt +platform: $(PYTHON_FOR_BUILD_DEPS) pybuilddir.txt $(RUNSHARED) $(PYTHON_FOR_BUILD) -c 'import sys ; from sysconfig import get_platform ; print("%s-%d.%d" % (get_platform(), *sys.version_info[:2]))' >platform # Create build directory and generate the sysconfig build-time data there. @@ -710,7 +713,7 @@ platform: $(BUILDPYTHON) pybuilddir.txt # problems by creating a dummy pybuilddir.txt just to allow interpreter # initialization to succeed. It will be overwritten by generate-posix-vars # or removed in case of failure. -pybuilddir.txt: $(BUILDPYTHON) +pybuilddir.txt: $(PYTHON_FOR_BUILD_DEPS) @echo "none" > ./pybuilddir.txt $(RUNSHARED) $(PYTHON_FOR_BUILD) -S -m sysconfig --generate-posix-vars ;\ if test $$? -ne 0 ; then \ @@ -729,7 +732,7 @@ $(srcdir)/Modules/_blake2/blake2s_impl.c: $(srcdir)/Modules/_blake2/blake2b_impl # -s, --silent or --quiet is always the first char. # Under BSD make, MAKEFLAGS might be " -s -v x=y". # Ignore macros passed by GNU make, passed after -- -sharedmods: $(BUILDPYTHON) pybuilddir.txt @LIBMPDEC_INTERNAL@ @LIBEXPAT_INTERNAL@ +sharedmods: $(PYTHON_FOR_BUILD_DEPS) pybuilddir.txt @LIBMPDEC_INTERNAL@ @LIBEXPAT_INTERNAL@ @case "`echo X $$MAKEFLAGS | sed 's/^X //;s/ -- .*//'`" in \ *\ -s*|s*) quiet="-q";; \ *) quiet="";; \ @@ -818,7 +821,7 @@ $(WASM_STDLIB): $(srcdir)/Lib/*.py $(srcdir)/Lib/*/*.py \ Makefile pybuilddir.txt Modules/Setup.local \ python.html python.worker.js $(PYTHON_FOR_BUILD) $(srcdir)/Tools/wasm/wasm_assets.py \ - --builddir . --prefix $(prefix) + --buildroot . 
--prefix $(prefix) python.html: $(srcdir)/Tools/wasm/python.html python.worker.js @cp $(srcdir)/Tools/wasm/python.html $@ @@ -1198,6 +1201,14 @@ regen-global-objects: $(srcdir)/Tools/scripts/generate_global_objects.py ############################################################################ # ABI +regen-abidump: all + @$(MKDIR_P) $(srcdir)/Doc/data/ + abidw "libpython$(LDVERSION).so" --no-architecture --out-file $(srcdir)/Doc/data/python$(LDVERSION).abi.new + @$(UPDATE_FILE) --create $(srcdir)/Doc/data/python$(LDVERSION).abi $(srcdir)/Doc/data/python$(LDVERSION).abi.new + +check-abidump: all + abidiff $(srcdir)/Doc/data/python$(LDVERSION).abi "libpython$(LDVERSION).so" --drop-private-types --no-architecture --no-added-syms + regen-limited-abi: all $(RUNSHARED) ./$(BUILDPYTHON) $(srcdir)/Tools/scripts/stable_abi.py --generate-all $(srcdir)/Misc/stable_abi.toml @@ -1509,6 +1520,7 @@ PYTHON_HEADERS= \ $(srcdir)/Include/pymem.h \ $(srcdir)/Include/pyport.h \ $(srcdir)/Include/pystate.h \ + $(srcdir)/Include/pystats.h \ $(srcdir)/Include/pystrcmp.h \ $(srcdir)/Include/pystrtod.h \ $(srcdir)/Include/pythonrun.h \ @@ -1563,6 +1575,7 @@ PYTHON_HEADERS= \ $(srcdir)/Include/cpython/pydebug.h \ $(srcdir)/Include/cpython/pyerrors.h \ $(srcdir)/Include/cpython/pyfpe.h \ + $(srcdir)/Include/cpython/pyframe.h \ $(srcdir)/Include/cpython/pylifecycle.h \ $(srcdir)/Include/cpython/pymem.h \ $(srcdir)/Include/cpython/pystate.h \ @@ -1594,6 +1607,7 @@ PYTHON_HEADERS= \ $(srcdir)/Include/internal/pycore_condvar.h \ $(srcdir)/Include/internal/pycore_context.h \ $(srcdir)/Include/internal/pycore_dict.h \ + $(srcdir)/Include/internal/pycore_descrobject.h \ $(srcdir)/Include/internal/pycore_dtoa.h \ $(srcdir)/Include/internal/pycore_exceptions.h \ $(srcdir)/Include/internal/pycore_fileutils.h \ @@ -1622,6 +1636,7 @@ PYTHON_HEADERS= \ $(srcdir)/Include/internal/pycore_pylifecycle.h \ $(srcdir)/Include/internal/pycore_pymem.h \ $(srcdir)/Include/internal/pycore_pystate.h \ + $(srcdir)/Include/internal/pycore_range.h \ $(srcdir)/Include/internal/pycore_runtime.h \ $(srcdir)/Include/internal/pycore_runtime_init.h \ $(srcdir)/Include/internal/pycore_signal.h \ @@ -1662,7 +1677,7 @@ cleantest: all # Run a basic set of regression tests. # This excludes some tests that are particularly resource-intensive. -test: @DEF_MAKE_RULE@ platform +test: all $(TESTRUNNER) $(TESTOPTS) # Run the full test suite twice - once without .pyc files, and once with. @@ -1672,7 +1687,7 @@ test: @DEF_MAKE_RULE@ platform # the bytecode read from a .pyc file had the bug, sometimes the directly # generated bytecode. This is sometimes a very shy bug needing a lot of # sample data. -testall: @DEF_MAKE_RULE@ platform +testall: all -find $(srcdir)/Lib -name '*.py[co]' -print | xargs rm -f $(TESTPYTHON) -E $(srcdir)/Lib/compileall.py -find $(srcdir)/Lib -name '*.py[co]' -print | xargs rm -f @@ -1681,7 +1696,7 @@ testall: @DEF_MAKE_RULE@ platform # Run the test suite for both architectures in a Universal build on OSX. # Must be run on an Intel box. -testuniversal: @DEF_MAKE_RULE@ platform +testuniversal: all @if [ `arch` != 'i386' ]; then \ echo "This can only be used on OSX/i386" ;\ exit 1 ;\ @@ -1692,7 +1707,7 @@ testuniversal: @DEF_MAKE_RULE@ platform # Like testall, but with only one pass and without multiple processes. # Run an optional script to include information about the build environment. 
-buildbottest: all platform +buildbottest: all -@if which pybuildbot.identify >/dev/null 2>&1; then \ pybuildbot.identify "CC='$(CC)'" "CXX='$(CXX)'"; \ fi @@ -1707,7 +1722,7 @@ QUICKTESTOPTS= $(TESTOPTS) -x test_subprocess test_io test_lib2to3 \ test_multiprocessing_forkserver \ test_mailbox test_nntplib test_socket test_poll \ test_select test_zipfile test_concurrent_futures -quicktest: @DEF_MAKE_RULE@ platform +quicktest: all $(TESTRUNNER) $(QUICKTESTOPTS) # SSL tests @@ -1718,6 +1733,10 @@ multisslcompile: all multissltest: all $(RUNSHARED) ./$(BUILDPYTHON) $(srcdir)/Tools/ssl/multissltests.py +# All install targets use the "all" target as synchronization point to +# prevent race conditions with PGO builds. PGO builds use recursive make, +# which can lead to two parallel `./python setup.py build` processes that +# step on each others toes. install: @FRAMEWORKINSTALLFIRST@ commoninstall bininstall maninstall @FRAMEWORKINSTALLLAST@ if test "x$(ENSUREPIP)" != "xno" ; then \ case $(ENSUREPIP) in \ @@ -1746,7 +1765,7 @@ commoninstall: check-clean-src @FRAMEWORKALTINSTALLFIRST@ \ # Install shared libraries enabled by Setup DESTDIRS= $(exec_prefix) $(LIBDIR) $(BINLIBDEST) $(DESTSHARED) -oldsharedinstall: $(DESTSHARED) $(SHAREDMODS) +oldsharedinstall: $(DESTSHARED) all @for i in X $(SHAREDMODS); do \ if test $$i != X; then \ echo $(INSTALL_SHARED) $$i $(DESTSHARED)/`basename $$i`; \ @@ -1908,13 +1927,8 @@ LIBSUBDIRS= asyncio \ xmlrpc \ zoneinfo \ __phello__ -TESTSUBDIRS= ctypes/test \ - distutils/tests \ +TESTSUBDIRS= distutils/tests \ idlelib/idle_test \ - lib2to3/tests \ - lib2to3/tests/data \ - lib2to3/tests/data/fixers \ - lib2to3/tests/data/fixers/myfixes \ test test/audiodata \ test/capath test/cjkencodings \ test/data test/decimaltestdata \ @@ -1923,6 +1937,7 @@ TESTSUBDIRS= ctypes/test \ test/libregrtest test/sndhdrdata \ test/subprocessdata test/support \ test/test_asyncio \ + test/test_ctypes \ test/test_email test/test_email/data \ test/test_import \ test/test_import/data \ @@ -1975,16 +1990,20 @@ TESTSUBDIRS= ctypes/test \ test/test_importlib/zipdata01 \ test/test_importlib/zipdata02 \ test/test_json \ + test/test_lib2to3 \ + test/test_lib2to3/data \ + test/test_lib2to3/data/fixers \ + test/test_lib2to3/data/fixers/myfixes \ test/test_peg_generator \ + test/test_tkinter \ test/test_tools \ + test/test_ttk \ test/test_warnings test/test_warnings/data \ test/test_zoneinfo test/test_zoneinfo/data \ + test/test_unittest test/test_unittest/testmock \ test/tracedmodules \ test/xmltestdata test/xmltestdata/c14n-20 \ - test/ziptestdata \ - tkinter/test tkinter/test/test_tkinter \ - tkinter/test/test_ttk \ - unittest/test unittest/test/testmock + test/ziptestdata TEST_MODULES=@TEST_MODULES@ libinstall: all $(srcdir)/Modules/xxmodule.c @@ -2062,17 +2081,17 @@ libinstall: all $(srcdir)/Modules/xxmodule.c -PYTHONPATH=$(DESTDIR)$(LIBDEST) $(RUNSHARED) \ $(PYTHON_FOR_BUILD) -Wi $(DESTDIR)$(LIBDEST)/compileall.py \ -j0 -d $(LIBDEST) -f \ - -x 'bad_coding|badsyntax|site-packages|lib2to3/tests/data' \ + -x 'bad_coding|badsyntax|site-packages|test/test_lib2to3/data' \ $(DESTDIR)$(LIBDEST) -PYTHONPATH=$(DESTDIR)$(LIBDEST) $(RUNSHARED) \ $(PYTHON_FOR_BUILD) -Wi -O $(DESTDIR)$(LIBDEST)/compileall.py \ -j0 -d $(LIBDEST) -f \ - -x 'bad_coding|badsyntax|site-packages|lib2to3/tests/data' \ + -x 'bad_coding|badsyntax|site-packages|test/test_lib2to3/data' \ $(DESTDIR)$(LIBDEST) -PYTHONPATH=$(DESTDIR)$(LIBDEST) $(RUNSHARED) \ $(PYTHON_FOR_BUILD) -Wi -OO $(DESTDIR)$(LIBDEST)/compileall.py \ -j0 -d $(LIBDEST) 
-f \ - -x 'bad_coding|badsyntax|site-packages|lib2to3/tests/data' \ + -x 'bad_coding|badsyntax|site-packages|test/test_lib2to3/data' \ $(DESTDIR)$(LIBDEST) -PYTHONPATH=$(DESTDIR)$(LIBDEST) $(RUNSHARED) \ $(PYTHON_FOR_BUILD) -Wi $(DESTDIR)$(LIBDEST)/compileall.py \ @@ -2152,7 +2171,7 @@ LIBPL= @LIBPL@ # pkgconfig directory LIBPC= $(LIBDIR)/pkgconfig -libainstall: @DEF_MAKE_RULE@ python-config +libainstall: all python-config @for i in $(LIBDIR) $(LIBPL) $(LIBPC) $(BINDIR); \ do \ if test ! -d $(DESTDIR)$$i; then \ @@ -2206,7 +2225,7 @@ libainstall: @DEF_MAKE_RULE@ python-config # Install the dynamically loadable modules # This goes into $(exec_prefix) -sharedinstall: sharedmods +sharedinstall: all $(RUNSHARED) $(PYTHON_FOR_BUILD) $(srcdir)/setup.py install \ --prefix=$(prefix) \ --install-scripts=$(BINDIR) \ @@ -2382,7 +2401,7 @@ clean-retain-profile: pycremoval -rm -f Lib/lib2to3/*Grammar*.pickle -rm -f _bootstrap_python -rm -f python.html python*.js python.data python*.symbols python*.map - -rm -rf $(WASM_STDLIB) + -rm -f $(WASM_STDLIB) -rm -f Programs/_testembed Programs/_freeze_module -rm -f Python/deepfreeze/*.[co] -rm -f Python/frozen_modules/*.h @@ -2436,7 +2455,7 @@ distclean: clobber -exec rm -f {} ';' # Check that all symbols exported by libpython start with "Py" or "_Py" -smelly: @DEF_MAKE_RULE@ +smelly: all $(RUNSHARED) ./$(BUILDPYTHON) $(srcdir)/Tools/scripts/smelly.py # Find files with funny names @@ -2471,7 +2490,7 @@ funny: -o -print # Perform some verification checks on any modified files. -patchcheck: @DEF_MAKE_RULE@ +patchcheck: all $(RUNSHARED) ./$(BUILDPYTHON) $(srcdir)/Tools/scripts/patchcheck.py check-limited-abi: all diff --git a/Misc/ACKS b/Misc/ACKS index f3d8924ea62af5..b6340414cf7012 100644 --- a/Misc/ACKS +++ b/Misc/ACKS @@ -548,6 +548,7 @@ Nils Fischbeck Frederik Fix Tom Flanagan Matt Fleming +Sean Fleming Hernán Martínez Foffani Benjamin Fogle Artem Fokin @@ -933,6 +934,7 @@ Ron Klatchko Reid Kleckner Carsten Klein Bastian Kleineidam +Joel Klimont Bob Kline Matthias Klose Jeremy Kloth @@ -1062,6 +1064,7 @@ Robert Li Xuanji Li Zekun Li Zheao Li +Eli Libman Dan Lidral-Porter Robert van Liere Ross Light @@ -1454,6 +1457,7 @@ Edward K. Ream Chris Rebert Marc Recht John Redford +Kalyan Reddy Terry J. Reedy Gareth Rees John Reese diff --git a/Misc/NEWS.d/3.11.0b1.rst b/Misc/NEWS.d/3.11.0b1.rst index 0def806185e5a2..c135eff4598e40 100644 --- a/Misc/NEWS.d/3.11.0b1.rst +++ b/Misc/NEWS.d/3.11.0b1.rst @@ -1373,6 +1373,7 @@ Suppress expression chaining for more :mod:`re` parsing errors. Remove undocumented and never working function ``re.template()`` and flag ``re.TEMPLATE``. +This was later reverted in 3.11.0b2 and deprecated instead. .. diff --git a/Misc/NEWS.d/next/Build/2018-08-21-11-10-18.bpo-34449.Z3qm3c.rst b/Misc/NEWS.d/next/Build/2018-08-21-11-10-18.bpo-34449.Z3qm3c.rst new file mode 100644 index 00000000000000..53b75aee9cc272 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2018-08-21-11-10-18.bpo-34449.Z3qm3c.rst @@ -0,0 +1 @@ +Drop invalid compiler switch ``-fPIC`` for HP aCC on HP-UX. Patch by Michael Osipov. diff --git a/Misc/NEWS.d/next/Build/2022-05-25-05-46-00.gh-issue-93202.T37jtj.rst b/Misc/NEWS.d/next/Build/2022-05-25-05-46-00.gh-issue-93202.T37jtj.rst new file mode 100644 index 00000000000000..6018e400c15d91 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-05-25-05-46-00.gh-issue-93202.T37jtj.rst @@ -0,0 +1,4 @@ +Python now always use the ``%zu`` and ``%zd`` printf formats to format a +``size_t`` or ``Py_ssize_t`` number. 
Building Python 3.12 requires a C11 +compiler, so these printf formats are now always supported. Patch by Victor +Stinner. diff --git a/Misc/NEWS.d/next/Build/2022-05-25-13-56-00.gh-issue-93207.B9Rubf.rst b/Misc/NEWS.d/next/Build/2022-05-25-13-56-00.gh-issue-93207.B9Rubf.rst new file mode 100644 index 00000000000000..cd462bb232f043 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-05-25-13-56-00.gh-issue-93207.B9Rubf.rst @@ -0,0 +1,3 @@ +``va_start()`` with two parameters, like ``va_start(args, format),`` +is now required to build Python. ``va_start()`` is no longer called with a single parameter. +Patch by Kumar Aditya. diff --git a/Misc/NEWS.d/next/Build/2022-05-31-18-04-58.gh-issue-69093.6lSa0C.rst b/Misc/NEWS.d/next/Build/2022-05-31-18-04-58.gh-issue-69093.6lSa0C.rst new file mode 100644 index 00000000000000..b061d6c2186c10 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-05-31-18-04-58.gh-issue-69093.6lSa0C.rst @@ -0,0 +1 @@ +Fix ``Modules/Setup.stdlib.in`` rule for ``_sqlite3`` extension. diff --git a/Misc/NEWS.d/next/Build/2022-06-04-12-53-53.gh-issue-93491.ehM211.rst b/Misc/NEWS.d/next/Build/2022-06-04-12-53-53.gh-issue-93491.ehM211.rst new file mode 100644 index 00000000000000..b3560fac81d2a9 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-06-04-12-53-53.gh-issue-93491.ehM211.rst @@ -0,0 +1 @@ +``configure`` now detects and reports :pep:`11` support tiers. diff --git a/Misc/NEWS.d/next/Build/2022-06-08-14-28-03.gh-issue-93584.0xfHOK.rst b/Misc/NEWS.d/next/Build/2022-06-08-14-28-03.gh-issue-93584.0xfHOK.rst new file mode 100644 index 00000000000000..07ca5fad592b2d --- /dev/null +++ b/Misc/NEWS.d/next/Build/2022-06-08-14-28-03.gh-issue-93584.0xfHOK.rst @@ -0,0 +1,2 @@ +Address race condition in ``Makefile`` when installing a PGO build. All +``test`` and ``install`` targets now depend on ``all`` target. diff --git a/Misc/NEWS.d/next/C API/2021-10-05-21-59-43.bpo-45383.TVClgf.rst b/Misc/NEWS.d/next/C API/2021-10-05-21-59-43.bpo-45383.TVClgf.rst new file mode 100644 index 00000000000000..ca1b7a43d29481 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2021-10-05-21-59-43.bpo-45383.TVClgf.rst @@ -0,0 +1,3 @@ +The :c:func:`PyType_FromSpec` API will now find and use a metaclass +based on the provided bases. +An error will be raised if there is a metaclass conflict. diff --git a/Misc/NEWS.d/next/C API/2022-05-13-18-17-48.gh-issue-92781.TVDr3-.rst b/Misc/NEWS.d/next/C API/2022-05-13-18-17-48.gh-issue-92781.TVDr3-.rst new file mode 100644 index 00000000000000..6442ba619450f0 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2022-05-13-18-17-48.gh-issue-92781.TVDr3-.rst @@ -0,0 +1,3 @@ +Avoid mixing declarations and code in the C API to fix the compiler warning: +"ISO C90 forbids mixed declarations and code" +[-Werror=declaration-after-statement]. Patch by Victor Stinner. 
diff --git a/Misc/NEWS.d/next/C API/2022-05-19-18-05-51.gh-issue-92913.Ass1Hv.rst b/Misc/NEWS.d/next/C API/2022-05-19-18-05-51.gh-issue-92913.Ass1Hv.rst new file mode 100644 index 00000000000000..c448c64029d825 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2022-05-19-18-05-51.gh-issue-92913.Ass1Hv.rst @@ -0,0 +1,2 @@ +Ensure changes to :c:member:`PyConfig.module_search_paths` are ignored +unless :c:member:`PyConfig.module_search_paths_set` is set. diff --git a/Misc/NEWS.d/next/C API/2022-05-23-12-31-04.gh-issue-77782.ugC8dn.rst b/Misc/NEWS.d/next/C API/2022-05-23-12-31-04.gh-issue-77782.ugC8dn.rst new file mode 100644 index 00000000000000..d212f6b977ef93 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2022-05-23-12-31-04.gh-issue-77782.ugC8dn.rst @@ -0,0 +1,3 @@ +Deprecate global configuration variables like +:c:var:`Py_IgnoreEnvironmentFlag`: the :c:func:`Py_InitializeFromConfig` API +should be used instead. Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/C API/2022-05-23-13-33-18.gh-issue-93103.ooD3Eb.rst b/Misc/NEWS.d/next/C API/2022-05-23-13-33-18.gh-issue-93103.ooD3Eb.rst new file mode 100644 index 00000000000000..404f0089cd0b79 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2022-05-23-13-33-18.gh-issue-93103.ooD3Eb.rst @@ -0,0 +1,4 @@ +Deprecate global configuration variables, like +:c:var:`Py_IgnoreEnvironmentFlag`, in the documentation: the +:c:func:`Py_InitializeFromConfig` API should be used instead. Patch by Victor +Stinner. diff --git a/Misc/NEWS.d/next/C API/2022-05-23-15-22-18.gh-issue-92898.Qjc9d3.rst b/Misc/NEWS.d/next/C API/2022-05-23-15-22-18.gh-issue-92898.Qjc9d3.rst new file mode 100644 index 00000000000000..01eca1db1f18c9 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2022-05-23-15-22-18.gh-issue-92898.Qjc9d3.rst @@ -0,0 +1,2 @@ +Fix C++ compiler warnings when casting function arguments to ``PyObject*``. +Patch by Serge Guelton. diff --git a/Misc/NEWS.d/next/C API/2022-06-03-14-54-41.gh-issue-93466.DDtH0X.rst b/Misc/NEWS.d/next/C API/2022-06-03-14-54-41.gh-issue-93466.DDtH0X.rst new file mode 100644 index 00000000000000..0bb65ea2b38786 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2022-06-03-14-54-41.gh-issue-93466.DDtH0X.rst @@ -0,0 +1,3 @@ +Slot IDs in PyType_Spec may not be repeated. The documentation was updated +to mention this. For some cases of repeated slots, PyType_FromSpec and +related functions will now raise an exception. diff --git a/Misc/NEWS.d/next/C API/2022-06-04-13-15-41.gh-issue-93442.4M4NDb.rst b/Misc/NEWS.d/next/C API/2022-06-04-13-15-41.gh-issue-93442.4M4NDb.rst new file mode 100644 index 00000000000000..f48ed37c814454 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2022-06-04-13-15-41.gh-issue-93442.4M4NDb.rst @@ -0,0 +1,3 @@ +Add C++ overloads for _Py_CAST_impl() to handle 0/NULL. This will allow C++ +extensions that pass 0 or NULL to macros using _Py_CAST() to continue to +compile. diff --git a/Misc/NEWS.d/next/C API/2022-06-10-16-50-27.gh-issue-89546.mX1f10.rst b/Misc/NEWS.d/next/C API/2022-06-10-16-50-27.gh-issue-89546.mX1f10.rst new file mode 100644 index 00000000000000..8e6b6d95c7b4ab --- /dev/null +++ b/Misc/NEWS.d/next/C API/2022-06-10-16-50-27.gh-issue-89546.mX1f10.rst @@ -0,0 +1,4 @@ +:c:func:`PyType_FromMetaclass` (and other ``PyType_From*`` functions) now +check that offsets and the base class's +:c:member:`~PyTypeObject.tp_basicsize` fit in the new class's +``tp_basicsize``.
diff --git a/Misc/NEWS.d/next/C API/2022-06-10-23-41-48.gh-issue-91731.fhYUQG.rst b/Misc/NEWS.d/next/C API/2022-06-10-23-41-48.gh-issue-91731.fhYUQG.rst new file mode 100644 index 00000000000000..185671ca4fcfb1 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2022-06-10-23-41-48.gh-issue-91731.fhYUQG.rst @@ -0,0 +1,3 @@ +Avoid defining the ``static_assert`` when compiling with C++11, where this +is a keyword and redefining it can lead to undefined behavior. Patch by +Pablo Galindo. diff --git a/Misc/NEWS.d/next/C API/2022-06-13-21-37-31.gh-issue-91321.DgJFvS.rst b/Misc/NEWS.d/next/C API/2022-06-13-21-37-31.gh-issue-91321.DgJFvS.rst new file mode 100644 index 00000000000000..57c39bc8d83c87 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2022-06-13-21-37-31.gh-issue-91321.DgJFvS.rst @@ -0,0 +1,2 @@ +Fix the compatibility of the Python C API with C++ older than C++11. Patch by +Victor Stinner. diff --git a/Misc/NEWS.d/next/C API/2022-06-17-13-41-38.gh-issue-93937.uKVTEh.rst b/Misc/NEWS.d/next/C API/2022-06-17-13-41-38.gh-issue-93937.uKVTEh.rst new file mode 100644 index 00000000000000..c0a0745aa0dd25 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2022-06-17-13-41-38.gh-issue-93937.uKVTEh.rst @@ -0,0 +1,14 @@ +The following frame functions and type are now directly available with +``#include <Python.h>``; it is no longer needed to add ``#include +<frameobject.h>``: + +* :c:func:`PyFrame_Check` +* :c:func:`PyFrame_GetBack` +* :c:func:`PyFrame_GetBuiltins` +* :c:func:`PyFrame_GetGenerator` +* :c:func:`PyFrame_GetGlobals` +* :c:func:`PyFrame_GetLasti` +* :c:func:`PyFrame_GetLocals` +* :c:type:`PyFrame_Type` + +Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-01-02-14-53-59.bpo-46142.WayjgT.rst b/Misc/NEWS.d/next/Core and Builtins/2022-01-02-14-53-59.bpo-46142.WayjgT.rst new file mode 100644 index 00000000000000..3c62c54289946c --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-01-02-14-53-59.bpo-46142.WayjgT.rst @@ -0,0 +1,3 @@ +Make ``--help`` output shorter by moving some info to the new +``--help-env`` and ``--help-xoptions`` command-line options. +Also add ``--help-all`` option to print complete usage. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-04-24-02-22-10.gh-issue-91432.YPJAK6.rst b/Misc/NEWS.d/next/Core and Builtins/2022-04-24-02-22-10.gh-issue-91432.YPJAK6.rst new file mode 100644 index 00000000000000..081dc0f5c334c5 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-04-24-02-22-10.gh-issue-91432.YPJAK6.rst @@ -0,0 +1 @@ +Specialized the :opcode:`FOR_ITER` opcode using the PEP 659 machinery. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-13-00-57-18.gh-issue-92658.YdhFE2.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-13-00-57-18.gh-issue-92658.YdhFE2.rst new file mode 100644 index 00000000000000..887b3d61596609 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-13-00-57-18.gh-issue-92658.YdhFE2.rst @@ -0,0 +1 @@ +Add support for connecting and binding to Hyper-V sockets on Windows Hyper-V hosts and guests. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-13-12-36-10.gh-issue-92777.Odo4vP.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-13-12-36-10.gh-issue-92777.Odo4vP.rst new file mode 100644 index 00000000000000..19416590a8e9ac --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-13-12-36-10.gh-issue-92777.Odo4vP.rst @@ -0,0 +1 @@ +Specialize ``LOAD_METHOD`` for objects with lazy dictionaries. Patch by Ken Jin.
diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-14-13-22-11.gh-issue-92804.rAqpI2.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-14-13-22-11.gh-issue-92804.rAqpI2.rst new file mode 100644 index 00000000000000..7a5fd3f6568eaf --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-14-13-22-11.gh-issue-92804.rAqpI2.rst @@ -0,0 +1 @@ +Fix memory leak in ``memoryview`` iterator as it was not finalized at exit. Patch by Kumar Aditya. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-15-15-25-05.gh-issue-90473.MoPHYW.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-15-15-25-05.gh-issue-90473.MoPHYW.rst new file mode 100644 index 00000000000000..1f9f45a511fbae --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-15-15-25-05.gh-issue-90473.MoPHYW.rst @@ -0,0 +1 @@ +Decrease default recursion limit on WASI to address limited call stack size. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-17-20-41-43.gh-issue-92858.eIXJTn.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-17-20-41-43.gh-issue-92858.eIXJTn.rst new file mode 100644 index 00000000000000..fa91d941b1759a --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-17-20-41-43.gh-issue-92858.eIXJTn.rst @@ -0,0 +1 @@ +Improve error message for some suites with syntax error before ':' diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-18-08-32-33.gh-issue-92914.tJUeTD.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-18-08-32-33.gh-issue-92914.tJUeTD.rst new file mode 100644 index 00000000000000..1242a15c029dc1 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-18-08-32-33.gh-issue-92914.tJUeTD.rst @@ -0,0 +1 @@ +Always round the allocated size for lists up to the nearest even number. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-18-12-55-35.gh-issue-90690.TKuoTa.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-18-12-55-35.gh-issue-90690.TKuoTa.rst new file mode 100644 index 00000000000000..6c5aa91dfb9d3d --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-18-12-55-35.gh-issue-90690.TKuoTa.rst @@ -0,0 +1,2 @@ +The PRECALL instruction has been removed. It offered only a small advantage +for specialization and is not needed in the vast majority of cases. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-18-18-34-45.gh-issue-92930.kpYPOb.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-18-18-34-45.gh-issue-92930.kpYPOb.rst new file mode 100644 index 00000000000000..cd5d7b3214ea43 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-18-18-34-45.gh-issue-92930.kpYPOb.rst @@ -0,0 +1 @@ +Fixed a crash in ``_pickle.c`` from mutating collections during ``__reduce__`` or ``persistent_id``. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-19-13-25-50.gh-issue-92955.kmNV33.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-19-13-25-50.gh-issue-92955.kmNV33.rst new file mode 100644 index 00000000000000..09f03e520c436a --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-19-13-25-50.gh-issue-92955.kmNV33.rst @@ -0,0 +1 @@ +Fix memory leak in code object's lines and positions iterators as they were not finalized at exit. Patch by Kumar Aditya. 
diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-19-15-29-53.gh-issue-89914.8bAffH.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-19-15-29-53.gh-issue-89914.8bAffH.rst new file mode 100644 index 00000000000000..d2156f8bbf3d20 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-19-15-29-53.gh-issue-89914.8bAffH.rst @@ -0,0 +1,2 @@ +The operand of the ``YIELD_VALUE`` instruction is set to the stack depth. +This is done to help frame handling on ``yield`` and may assist debuggers. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-20-09-25-34.gh-issue-93021.k3Aji2.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-20-09-25-34.gh-issue-93021.k3Aji2.rst new file mode 100644 index 00000000000000..8fdd8dfb4229a7 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-20-09-25-34.gh-issue-93021.k3Aji2.rst @@ -0,0 +1,2 @@ +Fix the :attr:`__text_signature__` for :meth:`__get__` methods implemented +in C. Patch by Jelle Zijlstra. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-20-13-32-24.gh-issue-93012.e9B-pv.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-20-13-32-24.gh-issue-93012.e9B-pv.rst new file mode 100644 index 00000000000000..8de0f000647dc8 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-20-13-32-24.gh-issue-93012.e9B-pv.rst @@ -0,0 +1,8 @@ +Added the new function :c:func:`PyType_FromMetaclass`, which generalizes the +existing :c:func:`PyType_FromModuleAndSpec` using an additional metaclass +argument. This is useful for language binding tools, where it can be used to +intercept type-related operations like subclassing or static attribute access +by specifying a metaclass with custom slots. + +Importantly, :c:func:`PyType_FromMetaclass` is available in the Limited API, +which provides a path towards migrating more binding tools onto the Stable ABI. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-21-23-21-37.gh-issue-93065.5I18WC.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-21-23-21-37.gh-issue-93065.5I18WC.rst new file mode 100644 index 00000000000000..ea801653f75025 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-21-23-21-37.gh-issue-93065.5I18WC.rst @@ -0,0 +1,5 @@ +Fix contextvars HAMT implementation to handle iteration over deep trees. + +The bug was discovered and fixed by Eli Libman. See +`MagicStack/immutables#84 `_ +for more details. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-22-02-37-50.gh-issue-93061.r70Imp.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-22-02-37-50.gh-issue-93061.r70Imp.rst new file mode 100644 index 00000000000000..d41e59028ad570 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-22-02-37-50.gh-issue-93061.r70Imp.rst @@ -0,0 +1 @@ +Backward jumps after ``async for`` loops are no longer given dubious line numbers. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-23-18-36-07.gh-issue-93143.X1Yqxm.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-23-18-36-07.gh-issue-93143.X1Yqxm.rst new file mode 100644 index 00000000000000..03994bcfdb56f8 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-23-18-36-07.gh-issue-93143.X1Yqxm.rst @@ -0,0 +1 @@ +Avoid ``NULL`` checks for uninitialized local variables by determining at compile time which variables must be initialized. 
diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-24-14-35-48.gh-issue-93040.9X6Ofu.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-24-14-35-48.gh-issue-93040.9X6Ofu.rst new file mode 100644 index 00000000000000..b2e527446ea7a0 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-24-14-35-48.gh-issue-93040.9X6Ofu.rst @@ -0,0 +1 @@ +Wraps unused parameters in ``Objects/obmalloc.c`` with ``Py_UNUSED``. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-25-04-07-22.gh-issue-91924.-UyO4q.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-25-04-07-22.gh-issue-91924.-UyO4q.rst new file mode 100644 index 00000000000000..44866a03cf48b1 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-25-04-07-22.gh-issue-91924.-UyO4q.rst @@ -0,0 +1,2 @@ +Fix ``__lltrace__`` debug feature if the stdout encoding is not UTF-8. Patch +by Victor Stinner. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-25-12-30-12.gh-issue-84694.5sjy2w.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-25-12-30-12.gh-issue-84694.5sjy2w.rst new file mode 100644 index 00000000000000..c062d28b294868 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-25-12-30-12.gh-issue-84694.5sjy2w.rst @@ -0,0 +1,2 @@ +The ``--experimental-isolated-subinterpreters`` configure option and +``EXPERIMENTAL_ISOLATED_SUBINTERPRETERS`` macro have been removed. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-25-21-56-25.gh-issue-93223.gTOGVZ.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-25-21-56-25.gh-issue-93223.gTOGVZ.rst new file mode 100644 index 00000000000000..cac30e11215b4c --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-25-21-56-25.gh-issue-93223.gTOGVZ.rst @@ -0,0 +1 @@ +When a bytecode instruction jumps to an unconditional jump instruction, the first instruction can often be optimized to target the unconditional jump's target directly. For tracing reasons, this would previously only occur if both instructions have the same line number. This also now occurs if the unconditional jump is artificial, i.e., if it has no associated line number. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-30-10-22-46.gh-issue-93345.gi1A4L.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-30-10-22-46.gh-issue-93345.gi1A4L.rst new file mode 100644 index 00000000000000..4cdb37cfe46981 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-30-10-22-46.gh-issue-93345.gi1A4L.rst @@ -0,0 +1,2 @@ +Fix a crash in substitution of a ``TypeVar`` in nested generic alias after +``TypeVarTuple``. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-30-14-50-03.gh-issue-93283.XDO2ZQ.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-30-14-50-03.gh-issue-93283.XDO2ZQ.rst new file mode 100644 index 00000000000000..550e86988b25a7 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-30-14-50-03.gh-issue-93283.XDO2ZQ.rst @@ -0,0 +1,2 @@ +Improve error message for invalid syntax of conversion character in f-string +expressions. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-30-15-35-42.gh-issue-93354.RZk8gs.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-30-15-35-42.gh-issue-93354.RZk8gs.rst new file mode 100644 index 00000000000000..dcfe6a9b6ba3a5 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-30-15-35-42.gh-issue-93354.RZk8gs.rst @@ -0,0 +1,3 @@ +Use exponential backoff for specialization counters in the interpreter. 
Can +reduce the number of failed specializations significantly and avoid slowdown +for those parts of a program that are not suitable for specialization. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-30-15-51-11.gh-issue-93356.l5wnzW.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-30-15-51-11.gh-issue-93356.l5wnzW.rst new file mode 100644 index 00000000000000..9bb58468197e2d --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-30-15-51-11.gh-issue-93356.l5wnzW.rst @@ -0,0 +1 @@ +Code for exception handlers is emitted at the end of the code unit's bytecode. This avoids one jump when no exception is raised. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-30-19-00-38.gh-issue-93359.zXV3A0.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-30-19-00-38.gh-issue-93359.zXV3A0.rst new file mode 100644 index 00000000000000..36e5e52390d7b4 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-30-19-00-38.gh-issue-93359.zXV3A0.rst @@ -0,0 +1,2 @@ +Ensure that custom :mod:`ast` nodes without explicit end positions can be +compiled. Patch by Pablo Galindo. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-05-31-16-36-30.gh-issue-93382.Jf6gAj.rst b/Misc/NEWS.d/next/Core and Builtins/2022-05-31-16-36-30.gh-issue-93382.Jf6gAj.rst new file mode 100644 index 00000000000000..1fe821edf5a148 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-05-31-16-36-30.gh-issue-93382.Jf6gAj.rst @@ -0,0 +1,2 @@ +Cache the result of :c:func:`PyCode_GetCode` function to restore the O(1) +lookup of the :attr:`~types.CodeType.co_code` attribute. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-06-01-17-47-40.gh-issue-93418.24dJuc.rst b/Misc/NEWS.d/next/Core and Builtins/2022-06-01-17-47-40.gh-issue-93418.24dJuc.rst new file mode 100644 index 00000000000000..74ad06bfeee7c4 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-06-01-17-47-40.gh-issue-93418.24dJuc.rst @@ -0,0 +1,2 @@ +Fixed an assert where an f-string has an equal sign '=' following an +expression, but there's no trailing brace. For example, f"{i=". diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-06-02-08-28-55.gh-issue-93429.DZTWHx.rst b/Misc/NEWS.d/next/Core and Builtins/2022-06-02-08-28-55.gh-issue-93429.DZTWHx.rst new file mode 100644 index 00000000000000..02efedaa9645f0 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-06-02-08-28-55.gh-issue-93429.DZTWHx.rst @@ -0,0 +1 @@ +``LOAD_METHOD`` instruction has been removed. It was merged back into ``LOAD_ATTR``. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-06-02-23-00-08.gh-issue-93444.m63DIs.rst b/Misc/NEWS.d/next/Core and Builtins/2022-06-02-23-00-08.gh-issue-93444.m63DIs.rst new file mode 100644 index 00000000000000..23cc1bd3e0b4d2 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-06-02-23-00-08.gh-issue-93444.m63DIs.rst @@ -0,0 +1 @@ +Removed redundant fields from the compiler's basicblock struct: ``b_nofallthrough``, ``b_exit``, ``b_return``. They can be easily calculated from the opcode of the last instruction of the block. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-06-06-14-28-24.gh-issue-93533.lnC0CC.rst b/Misc/NEWS.d/next/Core and Builtins/2022-06-06-14-28-24.gh-issue-93533.lnC0CC.rst new file mode 100644 index 00000000000000..2d8ac7068af673 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-06-06-14-28-24.gh-issue-93533.lnC0CC.rst @@ -0,0 +1 @@ +Reduce the size of the inline cache for ``LOAD_METHOD`` by 2 bytes. 
diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-06-09-09-08-29.gh-issue-93621.-_Pn1d.rst b/Misc/NEWS.d/next/Core and Builtins/2022-06-09-09-08-29.gh-issue-93621.-_Pn1d.rst new file mode 100644 index 00000000000000..388540982fd1ff --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-06-09-09-08-29.gh-issue-93621.-_Pn1d.rst @@ -0,0 +1 @@ +Change order of bytecode instructions emitted for :keyword:`with` and :keyword:`async with` to reduce the number of entries in the exception table. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-06-09-19-19-02.gh-issue-93461.5DqP1e.rst b/Misc/NEWS.d/next/Core and Builtins/2022-06-09-19-19-02.gh-issue-93461.5DqP1e.rst new file mode 100644 index 00000000000000..f6ed14887eae1b --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-06-09-19-19-02.gh-issue-93461.5DqP1e.rst @@ -0,0 +1,6 @@ +:func:`importlib.invalidate_caches` now drops entries from +:data:`sys.path_importer_cache` with a relative path as name. This solves a +caching issue when a process changes its current working directory. + +``FileFinder`` no longer inserts a dot in the path, e.g. +``/egg/./spam`` is now ``/egg/spam``. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-06-10-10-31-18.gh-issue-93662.-7RSC1.rst b/Misc/NEWS.d/next/Core and Builtins/2022-06-10-10-31-18.gh-issue-93662.-7RSC1.rst new file mode 100644 index 00000000000000..e444a00cf7ecc0 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-06-10-10-31-18.gh-issue-93662.-7RSC1.rst @@ -0,0 +1,2 @@ +Make sure that the end column offsets are correct in multi-line method +calls. Previously, the end column could precede the column offset. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-06-10-12-03-17.gh-issue-93671.idkQqG.rst b/Misc/NEWS.d/next/Core and Builtins/2022-06-10-12-03-17.gh-issue-93671.idkQqG.rst new file mode 100644 index 00000000000000..a7757153359517 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-06-10-12-03-17.gh-issue-93671.idkQqG.rst @@ -0,0 +1,2 @@ +Fix some exponential backtrace case happening with deeply nested sequence +patterns in match statements. Patch by Pablo Galindo diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-06-10-16-57-35.gh-issue-93678.1WBnHt.rst b/Misc/NEWS.d/next/Core and Builtins/2022-06-10-16-57-35.gh-issue-93678.1WBnHt.rst new file mode 100644 index 00000000000000..24a0d1042d81ae --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-06-10-16-57-35.gh-issue-93678.1WBnHt.rst @@ -0,0 +1 @@ +Refactor the compiler to reduce boilerplate and repetition. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-06-12-19-31-56.gh-issue-89828.bq02M7.rst b/Misc/NEWS.d/next/Core and Builtins/2022-06-12-19-31-56.gh-issue-89828.bq02M7.rst new file mode 100644 index 00000000000000..14ca99e1458ce3 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-06-12-19-31-56.gh-issue-89828.bq02M7.rst @@ -0,0 +1,2 @@ +:class:`types.GenericAlias` no longer relays the ``__class__`` attribute. +For example, ``isinstance(list[int], type)`` no longer returns ``True``. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-06-13-10-48-09.gh-issue-93516.yJSait.rst b/Misc/NEWS.d/next/Core and Builtins/2022-06-13-10-48-09.gh-issue-93516.yJSait.rst new file mode 100644 index 00000000000000..5c22c7a67b6e51 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-06-13-10-48-09.gh-issue-93516.yJSait.rst @@ -0,0 +1,2 @@ +Lazily create a table mapping bytecode offsets to line numbers to speed up +calculation of line numbers when tracing. 
diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-06-13-13-55-34.gh-issue-93516.HILrDl.rst b/Misc/NEWS.d/next/Core and Builtins/2022-06-13-13-55-34.gh-issue-93516.HILrDl.rst new file mode 100644 index 00000000000000..a324c2dbcbe8a6 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-06-13-13-55-34.gh-issue-93516.HILrDl.rst @@ -0,0 +1,2 @@ +Store offset of first traceable instruction in code object to avoid having +to recompute it for each instruction when tracing. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-06-15-11-16-13.gh-issue-93841.06zqX3.rst b/Misc/NEWS.d/next/Core and Builtins/2022-06-15-11-16-13.gh-issue-93841.06zqX3.rst new file mode 100644 index 00000000000000..179d3808e3b4d5 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-06-15-11-16-13.gh-issue-93841.06zqX3.rst @@ -0,0 +1,3 @@ +When built with ``-enable-pystats``, ``sys._stats_on()``, +``sys._stats_off()``, ``sys._stats_clear()`` and ``sys._stats_dump()`` +functions have been added to enable gathering stats for parts of programs. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-06-15-16-45-53.gh-issue-93678.1I_ZT3.rst b/Misc/NEWS.d/next/Core and Builtins/2022-06-15-16-45-53.gh-issue-93678.1I_ZT3.rst new file mode 100644 index 00000000000000..3c99fd2ecf6633 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-06-15-16-45-53.gh-issue-93678.1I_ZT3.rst @@ -0,0 +1 @@ +Refactor compiler optimisation code so that it no longer needs the ``struct assembler`` and ``struct compiler`` passed around. Instead, each function takes the CFG and other data that it actually needs. This will make it possible to test this code directly. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-06-16-16-53-22.gh-issue-93911.RDwIiK.rst b/Misc/NEWS.d/next/Core and Builtins/2022-06-16-16-53-22.gh-issue-93911.RDwIiK.rst new file mode 100644 index 00000000000000..9efa994c1916dd --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-06-16-16-53-22.gh-issue-93911.RDwIiK.rst @@ -0,0 +1 @@ +Specialize ``LOAD_ATTR`` for ``property()`` attributes. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-06-17-16-30-24.gh-issue-93955.LmiAe9.rst b/Misc/NEWS.d/next/Core and Builtins/2022-06-17-16-30-24.gh-issue-93955.LmiAe9.rst new file mode 100644 index 00000000000000..3b2f0e8c32d745 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-06-17-16-30-24.gh-issue-93955.LmiAe9.rst @@ -0,0 +1 @@ +Improve performance of attribute lookups on objects with custom ``__getattribute__`` and ``__getattr__``. Patch by Ken Jin. diff --git a/Misc/NEWS.d/next/Core and Builtins/2022-06-20-13-48-57.gh-issue-94021.o78q3G.rst b/Misc/NEWS.d/next/Core and Builtins/2022-06-20-13-48-57.gh-issue-94021.o78q3G.rst new file mode 100644 index 00000000000000..0724c517b2ba06 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2022-06-20-13-48-57.gh-issue-94021.o78q3G.rst @@ -0,0 +1 @@ +Fix unreachable code warning in ``Python/specialize.c``. diff --git a/Misc/NEWS.d/next/Documentation/2022-01-13-16-03-15.bpo-40838.k3NVCf.rst b/Misc/NEWS.d/next/Documentation/2022-01-13-16-03-15.bpo-40838.k3NVCf.rst new file mode 100644 index 00000000000000..0f071ab64dbecc --- /dev/null +++ b/Misc/NEWS.d/next/Documentation/2022-01-13-16-03-15.bpo-40838.k3NVCf.rst @@ -0,0 +1,2 @@ +Document that :func:`inspect.getdoc`, :func:`inspect.getmodule`, and +:func:`inspect.getsourcefile` might return ``None``. 
diff --git a/Misc/NEWS.d/next/Documentation/2022-03-30-17-56-01.bpo-47161.gesHfS.rst b/Misc/NEWS.d/next/Documentation/2022-03-30-17-56-01.bpo-47161.gesHfS.rst new file mode 100644 index 00000000000000..6b552daa7c13af --- /dev/null +++ b/Misc/NEWS.d/next/Documentation/2022-03-30-17-56-01.bpo-47161.gesHfS.rst @@ -0,0 +1,2 @@ +Document that :class:`pathlib.PurePath` does not collapse +initial double slashes because they denote UNC paths. diff --git a/Misc/NEWS.d/next/Documentation/2022-05-18-23-58-26.gh-issue-92240.bHvYiz.rst b/Misc/NEWS.d/next/Documentation/2022-05-18-23-58-26.gh-issue-92240.bHvYiz.rst new file mode 100644 index 00000000000000..53b2a66c9779c2 --- /dev/null +++ b/Misc/NEWS.d/next/Documentation/2022-05-18-23-58-26.gh-issue-92240.bHvYiz.rst @@ -0,0 +1,2 @@ +Added release dates for +"What's New in Python 3.X" for 3.0, 3.1, 3.2, 3.8 and 3.10. diff --git a/Misc/NEWS.d/next/Documentation/2022-05-26-11-33-23.gh-issue-86438.kEGGmK.rst b/Misc/NEWS.d/next/Documentation/2022-05-26-11-33-23.gh-issue-86438.kEGGmK.rst new file mode 100644 index 00000000000000..75abfdd63b8b2e --- /dev/null +++ b/Misc/NEWS.d/next/Documentation/2022-05-26-11-33-23.gh-issue-86438.kEGGmK.rst @@ -0,0 +1,3 @@ +Clarify that :option:`-W` and :envvar:`PYTHONWARNINGS` are matched literally +and case-insensitively, rather than as regular expressions, in +:mod:`warnings`. diff --git a/Misc/NEWS.d/next/Documentation/2022-05-26-14-51-25.gh-issue-88831.5Cccr5.rst b/Misc/NEWS.d/next/Documentation/2022-05-26-14-51-25.gh-issue-88831.5Cccr5.rst new file mode 100644 index 00000000000000..983bea981a9b20 --- /dev/null +++ b/Misc/NEWS.d/next/Documentation/2022-05-26-14-51-25.gh-issue-88831.5Cccr5.rst @@ -0,0 +1 @@ +Augmented documentation of asyncio.create_task(). Clarified the need to keep strong references to tasks and added a code snippet detailing how to do this. diff --git a/Misc/NEWS.d/next/Documentation/2022-05-29-21-22-54.gh-issue-86986.lFXw8j.rst b/Misc/NEWS.d/next/Documentation/2022-05-29-21-22-54.gh-issue-86986.lFXw8j.rst new file mode 100644 index 00000000000000..1db028c30f67a4 --- /dev/null +++ b/Misc/NEWS.d/next/Documentation/2022-05-29-21-22-54.gh-issue-86986.lFXw8j.rst @@ -0,0 +1 @@ +The minimum Sphinx version required to build the documentation is now 3.2. diff --git a/Misc/NEWS.d/next/Documentation/2022-06-15-12-12-49.gh-issue-87260.epyI7D.rst b/Misc/NEWS.d/next/Documentation/2022-06-15-12-12-49.gh-issue-87260.epyI7D.rst new file mode 100644 index 00000000000000..4c6cee86ca115f --- /dev/null +++ b/Misc/NEWS.d/next/Documentation/2022-06-15-12-12-49.gh-issue-87260.epyI7D.rst @@ -0,0 +1 @@ +Align :mod:`sqlite3` argument specs with the actual implementation. diff --git a/Misc/NEWS.d/next/Documentation/2022-06-16-10-10-59.gh-issue-61162.1ypkG8.rst b/Misc/NEWS.d/next/Documentation/2022-06-16-10-10-59.gh-issue-61162.1ypkG8.rst new file mode 100644 index 00000000000000..c8b3a222232189 --- /dev/null +++ b/Misc/NEWS.d/next/Documentation/2022-06-16-10-10-59.gh-issue-61162.1ypkG8.rst @@ -0,0 +1 @@ +Clarify :mod:`sqlite3` behavior when :ref:`sqlite3-connection-context-manager`. diff --git a/Misc/NEWS.d/next/Library/2017-07-31-13-35-28.bpo-26253.8v_sCs.rst b/Misc/NEWS.d/next/Library/2017-07-31-13-35-28.bpo-26253.8v_sCs.rst new file mode 100644 index 00000000000000..fa0dc95b7d62b3 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2017-07-31-13-35-28.bpo-26253.8v_sCs.rst @@ -0,0 +1,2 @@ +Allow adjustable compression level for tarfile streams in +:func:`tarfile.open`.
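The tarfile entry above (bpo-26253) is easier to picture with a snippet. This is only a hedged sketch: it assumes ``compresslevel`` is now honoured for stream modes such as ``'w|gz'``, and the file names are invented placeholders::

    import tarfile

    # Write a gzip-compressed tar *stream* with the lowest compression level.
    # Passing compresslevel for the 'w|gz' stream mode is the new behaviour
    # this entry describes; "backup.tar.gz" and "notes.txt" are placeholders.
    with tarfile.open("backup.tar.gz", mode="w|gz", compresslevel=1) as tar:
        tar.add("notes.txt")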
diff --git a/Misc/NEWS.d/next/Library/2018-09-28-22-18-03.bpo-34828.5Zyi_S.rst b/Misc/NEWS.d/next/Library/2018-09-28-22-18-03.bpo-34828.5Zyi_S.rst new file mode 100644 index 00000000000000..b0e10a158b5b19 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-09-28-22-18-03.bpo-34828.5Zyi_S.rst @@ -0,0 +1 @@ +:meth:`sqlite3.Connection.iterdump` now handles databases that use ``AUTOINCREMENT`` in one or more tables. diff --git a/Misc/NEWS.d/next/Library/2020-10-15-18-37-12.bpo-42047.XDdoSF.rst b/Misc/NEWS.d/next/Library/2020-10-15-18-37-12.bpo-42047.XDdoSF.rst new file mode 100644 index 00000000000000..4c23763cf8d89b --- /dev/null +++ b/Misc/NEWS.d/next/Library/2020-10-15-18-37-12.bpo-42047.XDdoSF.rst @@ -0,0 +1 @@ +Add :func:`threading.get_native_id` support for DragonFly BSD. Patch by David Carlier. diff --git a/Misc/NEWS.d/next/Library/2022-01-09-14-23-00.bpo-28249.4dzB80.rst b/Misc/NEWS.d/next/Library/2022-01-09-14-23-00.bpo-28249.4dzB80.rst new file mode 100644 index 00000000000000..b5f1312d768669 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-01-09-14-23-00.bpo-28249.4dzB80.rst @@ -0,0 +1,2 @@ +Set :attr:`doctest.DocTest.lineno` to ``None`` when object does not have +:attr:`__doc__`. diff --git a/Misc/NEWS.d/next/Library/2022-02-05-18-46-54.bpo-46642.YI6nHQ.rst b/Misc/NEWS.d/next/Library/2022-02-05-18-46-54.bpo-46642.YI6nHQ.rst new file mode 100644 index 00000000000000..2d2815c1e4b00d --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-02-05-18-46-54.bpo-46642.YI6nHQ.rst @@ -0,0 +1 @@ +Improve error message when trying to subclass an instance of :data:`typing.TypeVar`, :data:`typing.ParamSpec`, :data:`typing.TypeVarTuple`, etc. Based on patch by Gregory Beauregard. diff --git a/Misc/NEWS.d/next/Library/2022-02-09-23-44-27.bpo-45393.9v5Y8U.rst b/Misc/NEWS.d/next/Library/2022-02-09-23-44-27.bpo-45393.9v5Y8U.rst new file mode 100644 index 00000000000000..0a239b07d76bd1 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-02-09-23-44-27.bpo-45393.9v5Y8U.rst @@ -0,0 +1,2 @@ +Fix the formatting for ``await x`` and ``not x`` in the operator precedence +table when using the :func:`help` system. diff --git a/Misc/NEWS.d/next/Library/2022-03-08-04-46-44.bpo-46951.SWAz97.rst b/Misc/NEWS.d/next/Library/2022-03-08-04-46-44.bpo-46951.SWAz97.rst new file mode 100644 index 00000000000000..cd7601aa8c4de7 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-03-08-04-46-44.bpo-46951.SWAz97.rst @@ -0,0 +1 @@ +Order the contents of zipapp archives, to make builds more reproducible. diff --git a/Misc/NEWS.d/next/Library/2022-03-19-04-41-42.bpo-47063.nwRfUo.rst b/Misc/NEWS.d/next/Library/2022-03-19-04-41-42.bpo-47063.nwRfUo.rst new file mode 100644 index 00000000000000..b889d3c6520753 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-03-19-04-41-42.bpo-47063.nwRfUo.rst @@ -0,0 +1 @@ +Add an index_pages parameter to support using non-default index page names. diff --git a/Misc/NEWS.d/next/Library/2022-04-03-11-25-02.bpo-41287.8CTdwf.rst b/Misc/NEWS.d/next/Library/2022-04-03-11-25-02.bpo-41287.8CTdwf.rst new file mode 100644 index 00000000000000..ef80ec664c4a86 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-04-03-11-25-02.bpo-41287.8CTdwf.rst @@ -0,0 +1 @@ +Fix handling of the ``doc`` argument in subclasses of :func:`property`. 
diff --git a/Misc/NEWS.d/next/Library/2022-04-03-19-40-09.bpo-39064.76PbIz.rst b/Misc/NEWS.d/next/Library/2022-04-03-19-40-09.bpo-39064.76PbIz.rst new file mode 100644 index 00000000000000..34d31527e332dc --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-04-03-19-40-09.bpo-39064.76PbIz.rst @@ -0,0 +1,2 @@ +:class:`zipfile.ZipFile` now raises :exc:`zipfile.BadZipFile` instead of ``ValueError`` when reading a +corrupt zip file in which the central directory offset is negative. diff --git a/Misc/NEWS.d/next/Library/2022-04-08-22-12-11.bpo-47231.lvyglt.rst b/Misc/NEWS.d/next/Library/2022-04-08-22-12-11.bpo-47231.lvyglt.rst new file mode 100644 index 00000000000000..ee05c5e2856756 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-04-08-22-12-11.bpo-47231.lvyglt.rst @@ -0,0 +1 @@ +Fixed an issue with inconsistent trailing slashes in tarfile longname directories. diff --git a/Misc/NEWS.d/next/Library/2022-04-11-16-55-41.gh-issue-91456.DK3KKl.rst b/Misc/NEWS.d/next/Library/2022-04-11-16-55-41.gh-issue-91456.DK3KKl.rst new file mode 100644 index 00000000000000..a4c853149bdf02 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-04-11-16-55-41.gh-issue-91456.DK3KKl.rst @@ -0,0 +1,3 @@ +Deprecate current default auto() behavior: In 3.13 the default will be +for auto() to always return the largest member value incremented by +1, and to raise if incompatible value types are used. diff --git a/Misc/NEWS.d/next/Library/2022-04-15-17-38-55.gh-issue-91577.Ah7cLL.rst b/Misc/NEWS.d/next/Library/2022-04-15-17-38-55.gh-issue-91577.Ah7cLL.rst new file mode 100644 index 00000000000000..0f44f34011f9c7 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-04-15-17-38-55.gh-issue-91577.Ah7cLL.rst @@ -0,0 +1 @@ +Move imports in :class:`~multiprocessing.SharedMemory` methods to module level so that they can be executed late in Python finalization. diff --git a/Misc/NEWS.d/next/Library/2022-04-24-22-26-45.gh-issue-81790.M5Rvpm.rst b/Misc/NEWS.d/next/Library/2022-04-24-22-26-45.gh-issue-81790.M5Rvpm.rst new file mode 100644 index 00000000000000..8894493e97410f --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-04-24-22-26-45.gh-issue-81790.M5Rvpm.rst @@ -0,0 +1,2 @@ +:func:`os.path.splitdrive` now understands DOS device paths with UNC +links (beginning ``\\?\UNC\``). Contributed by Barney Gale. diff --git a/Misc/NEWS.d/next/Library/2022-05-08-18-51-14.gh-issue-89336.TL6ip7.rst b/Misc/NEWS.d/next/Library/2022-05-08-18-51-14.gh-issue-89336.TL6ip7.rst new file mode 100644 index 00000000000000..b4c58c0e48b25e --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-08-18-51-14.gh-issue-89336.TL6ip7.rst @@ -0,0 +1,4 @@ +Removed :mod:`configparser` module APIs: +the ``SafeConfigParser`` class alias, the ``ParsingError.filename`` +property and parameter, and the ``ConfigParser.readfp`` method, all +of which were deprecated since Python 3.2. diff --git a/Misc/NEWS.d/next/Library/2022-05-09-09-28-02.gh-issue-92530.M4Q1RS.rst b/Misc/NEWS.d/next/Library/2022-05-09-09-28-02.gh-issue-92530.M4Q1RS.rst new file mode 100644 index 00000000000000..8bb8ca0488c962 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-09-09-28-02.gh-issue-92530.M4Q1RS.rst @@ -0,0 +1,2 @@ +Fix an issue that occurred after interrupting +:func:`threading.Condition.notify`.
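As a hedged illustration of the ``os.path.splitdrive`` entry above (gh-issue-81790): the split shown below is inferred from the entry, and the path itself is made up::

    import ntpath  # ntpath is os.path on Windows; importing it directly keeps the demo portable

    # DOS device path containing a UNC link: the drive part is now expected
    # to include the whole \\?\UNC\server\share prefix.
    drive, tail = ntpath.splitdrive(r"\\?\UNC\server\share\folder\file.txt")
    print(drive)  # expected: \\?\UNC\server\share
    print(tail)   # expected: \folder\file.txt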
diff --git a/Misc/NEWS.d/next/Library/2022-05-09-11-55-04.gh-issue-92547.CzVZft.rst b/Misc/NEWS.d/next/Library/2022-05-09-11-55-04.gh-issue-92547.CzVZft.rst new file mode 100644 index 00000000000000..52626974c40198 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-09-11-55-04.gh-issue-92547.CzVZft.rst @@ -0,0 +1,6 @@ +Remove undocumented :mod:`sqlite3` features deprecated in Python 3.10: + +* ``sqlite3.enable_shared_cache()`` +* ``sqlite3.OptimizedUnicode`` + +Patch by Erlend E. Aasland. diff --git a/Misc/NEWS.d/next/Library/2022-05-09-22-27-11.gh-issue-92591.V7RCk2.rst b/Misc/NEWS.d/next/Library/2022-05-09-22-27-11.gh-issue-92591.V7RCk2.rst new file mode 100644 index 00000000000000..cd9b598d1dbca1 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-09-22-27-11.gh-issue-92591.V7RCk2.rst @@ -0,0 +1,3 @@ +Allow :mod:`logging` filters to return a :class:`logging.LogRecord` instance +so that filters attached to :class:`logging.Handler`\ s can enrich records without +side effects on other handlers. diff --git a/Misc/NEWS.d/next/Library/2022-05-11-19-33-27.gh-issue-92671.KE4v6a.rst b/Misc/NEWS.d/next/Library/2022-05-11-19-33-27.gh-issue-92671.KE4v6a.rst new file mode 100644 index 00000000000000..b50677ab5ca105 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-11-19-33-27.gh-issue-92671.KE4v6a.rst @@ -0,0 +1 @@ +Fixed :func:`ast.unparse` for empty tuples in the assignment target context. diff --git a/Misc/NEWS.d/next/Library/2022-05-14-11-41-23.gh-issue-90473.kPdOZl.rst b/Misc/NEWS.d/next/Library/2022-05-14-11-41-23.gh-issue-90473.kPdOZl.rst new file mode 100644 index 00000000000000..bf5ee542182e02 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-14-11-41-23.gh-issue-90473.kPdOZl.rst @@ -0,0 +1,2 @@ +:mod:`subprocess` now fails early on Emscripten and WASI platforms to work +around missing :func:`os.pipe` on WASI. diff --git a/Misc/NEWS.d/next/Library/2022-05-16-14-35-39.gh-issue-92839.owSMyo.rst b/Misc/NEWS.d/next/Library/2022-05-16-14-35-39.gh-issue-92839.owSMyo.rst new file mode 100644 index 00000000000000..b425bd9c47bc91 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-16-14-35-39.gh-issue-92839.owSMyo.rst @@ -0,0 +1 @@ +Fixed crash resulting from calling bisect.insort() or bisect.insort_left() with the key argument not equal to None. diff --git a/Misc/NEWS.d/next/Library/2022-05-18-17-18-41.gh-issue-91922.DwWIsJ.rst b/Misc/NEWS.d/next/Library/2022-05-18-17-18-41.gh-issue-91922.DwWIsJ.rst new file mode 100644 index 00000000000000..30f7bba115606d --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-18-17-18-41.gh-issue-91922.DwWIsJ.rst @@ -0,0 +1,3 @@ +Fix function :func:`sqlite.connect` and the :class:`sqlite.Connection` +constructor on non-UTF-8 locales. Also, they now support bytes paths +non-decodable with the current FS encoding. diff --git a/Misc/NEWS.d/next/Library/2022-05-18-21-04-09.gh-issue-87901.lnf041.rst b/Misc/NEWS.d/next/Library/2022-05-18-21-04-09.gh-issue-87901.lnf041.rst new file mode 100644 index 00000000000000..3488541eb3d77c --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-18-21-04-09.gh-issue-87901.lnf041.rst @@ -0,0 +1,2 @@ +Removed the ``encoding`` argument from :func:`os.popen` that was added in +3.11b1. 
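The logging change recorded above (gh-issue-92591) lets a filter hand back a replacement record. A minimal sketch, assuming the returned record simply substitutes for the original on that handler only; the redaction logic is invented::

    import copy
    import logging

    def redact(record):
        # Return a modified *copy*; other handlers keep seeing the original record.
        clone = copy.copy(record)
        clone.msg = str(record.msg).replace("secret", "***")
        return clone

    handler = logging.StreamHandler()
    handler.addFilter(redact)  # plain callables are accepted as filters
    logging.getLogger().addHandler(handler)
    logging.getLogger().warning("the secret token was rotated")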
diff --git a/Misc/NEWS.d/next/Library/2022-05-19-13-33-18.gh-issue-92675.ZeerMZ.rst b/Misc/NEWS.d/next/Library/2022-05-19-13-33-18.gh-issue-92675.ZeerMZ.rst new file mode 100644 index 00000000000000..6adc024fc54154 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-19-13-33-18.gh-issue-92675.ZeerMZ.rst @@ -0,0 +1,2 @@ +Fix :func:`venv.ensure_directories` to accept :class:`pathlib.Path` arguments +in addition to :class:`str` paths. Patch by David Foster. diff --git a/Misc/NEWS.d/next/Library/2022-05-19-17-49-58.gh-issue-92932.o2peTh.rst b/Misc/NEWS.d/next/Library/2022-05-19-17-49-58.gh-issue-92932.o2peTh.rst new file mode 100644 index 00000000000000..cb76ac5cbd60e1 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-19-17-49-58.gh-issue-92932.o2peTh.rst @@ -0,0 +1,3 @@ +Now :func:`~dis.dis` and :func:`~dis.get_instructions` handle operand values +for instructions prefixed by ``EXTENDED_ARG_QUICK``. +Patch by Sam Gross and Dong-hee Na. diff --git a/Misc/NEWS.d/next/Library/2022-05-20-15-52-43.gh-issue-93010.WF-cAc.rst b/Misc/NEWS.d/next/Library/2022-05-20-15-52-43.gh-issue-93010.WF-cAc.rst new file mode 100644 index 00000000000000..24208b5160ed52 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-20-15-52-43.gh-issue-93010.WF-cAc.rst @@ -0,0 +1 @@ +In a very special case, the email package tried to append the nonexistent ``InvalidHeaderError`` to the defect list. It should have been ``InvalidHeaderDefect``. diff --git a/Misc/NEWS.d/next/Library/2022-05-21-13-16-16.gh-issue-93044.eJ_XkZ.rst b/Misc/NEWS.d/next/Library/2022-05-21-13-16-16.gh-issue-93044.eJ_XkZ.rst new file mode 100644 index 00000000000000..c9df8676bcdda0 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-21-13-16-16.gh-issue-93044.eJ_XkZ.rst @@ -0,0 +1,2 @@ +No longer convert the database argument of :func:`sqlite3.connect` to bytes +before passing it to the factory. diff --git a/Misc/NEWS.d/next/Library/2022-05-22-16-08-01.gh-issue-89973.jc-Q4g.rst b/Misc/NEWS.d/next/Library/2022-05-22-16-08-01.gh-issue-89973.jc-Q4g.rst new file mode 100644 index 00000000000000..7e61fd7d46a0bb --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-22-16-08-01.gh-issue-89973.jc-Q4g.rst @@ -0,0 +1,3 @@ +Fix :exc:`re.error` raised in :mod:`fnmatch` if the pattern contains a +character range with upper bound lower than lower bound (e.g. ``[c-a]``). +Now such ranges are interpreted as empty ranges. diff --git a/Misc/NEWS.d/next/Library/2022-05-22-23-46-18.gh-issue-93033.wZfiL-.rst b/Misc/NEWS.d/next/Library/2022-05-22-23-46-18.gh-issue-93033.wZfiL-.rst new file mode 100644 index 00000000000000..3cee530339fb51 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-22-23-46-18.gh-issue-93033.wZfiL-.rst @@ -0,0 +1 @@ +Search in some strings (platform dependent i.e [U+0xFFFF, U+0x0100] on Windows or [U+0xFFFFFFFF, U+0x00010000] on Linux 64-bit) are now up to 10 times faster. diff --git a/Misc/NEWS.d/next/Library/2022-05-24-10-59-02.gh-issue-92728.zxTifq.rst b/Misc/NEWS.d/next/Library/2022-05-24-10-59-02.gh-issue-92728.zxTifq.rst new file mode 100644 index 00000000000000..b39609be2c4cf5 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-24-10-59-02.gh-issue-92728.zxTifq.rst @@ -0,0 +1,3 @@ +The :func:`re.template` function and the corresponding :const:`re.TEMPLATE` +and :const:`re.T` flags are restored after they were removed in 3.11.0b1, +but they are now deprecated, so they might be removed from Python 3.13. 
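A short sketch of the :mod:`fnmatch` entry above (gh-issue-89973); the expected results are read off the entry rather than anything new::

    import fnmatch

    # An inverted range like [c-a] used to raise re.error; it is now treated
    # as an empty range, so it matches nothing.
    print(fnmatch.fnmatch("b", "[c-a]"))  # False (previously re.error)
    print(fnmatch.fnmatch("b", "[a-c]"))  # True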
diff --git a/Misc/NEWS.d/next/Library/2022-05-24-11-19-04.gh-issue-74696.-cnf-A.rst b/Misc/NEWS.d/next/Library/2022-05-24-11-19-04.gh-issue-74696.-cnf-A.rst new file mode 100644 index 00000000000000..5b2e460e9ea075 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-24-11-19-04.gh-issue-74696.-cnf-A.rst @@ -0,0 +1,2 @@ +:func:`shutil.make_archive` no longer temporarily changes the current +working directory during creation of standard ``.zip`` or tar archives. diff --git a/Misc/NEWS.d/next/Library/2022-05-25-00-21-28.gh-issue-91513.9VyCT4.rst b/Misc/NEWS.d/next/Library/2022-05-25-00-21-28.gh-issue-91513.9VyCT4.rst new file mode 100644 index 00000000000000..f9f93767ed173d --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-25-00-21-28.gh-issue-91513.9VyCT4.rst @@ -0,0 +1 @@ +Added ``taskName`` attribute to :mod:`logging` module for use with :mod:`asyncio` tasks. diff --git a/Misc/NEWS.d/next/Library/2022-05-25-02-45-41.gh-issue-90817.yxANgU.rst b/Misc/NEWS.d/next/Library/2022-05-25-02-45-41.gh-issue-90817.yxANgU.rst new file mode 100644 index 00000000000000..06937e88691725 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-25-02-45-41.gh-issue-90817.yxANgU.rst @@ -0,0 +1,3 @@ +The :func:`locale.resetlocale` function is deprecated and will be removed in +Python 3.13. Use ``locale.setlocale(locale.LC_ALL, "")`` instead. Patch by +Victor Stinner. diff --git a/Misc/NEWS.d/next/Library/2022-05-26-09-24-41.gh-issue-93162.W1VuhU.rst b/Misc/NEWS.d/next/Library/2022-05-26-09-24-41.gh-issue-93162.W1VuhU.rst new file mode 100644 index 00000000000000..4d916a1df5e065 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-26-09-24-41.gh-issue-93162.W1VuhU.rst @@ -0,0 +1,4 @@ +Add the ability for :func:`logging.config.dictConfig` to usefully configure +:class:`~logging.handlers.QueueHandler` and :class:`~logging.handlers.QueueListener` +as a pair, and add :func:`logging.getHandlerByName` and :func:`logging.getHandlerNames` +APIs to allow access to handlers by name. diff --git a/Misc/NEWS.d/next/Library/2022-05-26-23-10-55.gh-issue-93156.4XfDVN.rst b/Misc/NEWS.d/next/Library/2022-05-26-23-10-55.gh-issue-93156.4XfDVN.rst new file mode 100644 index 00000000000000..165baa08aaab14 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-26-23-10-55.gh-issue-93156.4XfDVN.rst @@ -0,0 +1,2 @@ +Accessing the :attr:`pathlib.PurePath.parents` sequence of an absolute path +using negative index values produced incorrect results. diff --git a/Misc/NEWS.d/next/Library/2022-05-27-10-52-06.gh-issue-85308.K6r-tJ.rst b/Misc/NEWS.d/next/Library/2022-05-27-10-52-06.gh-issue-85308.K6r-tJ.rst new file mode 100644 index 00000000000000..4574264dd4d433 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-27-10-52-06.gh-issue-85308.K6r-tJ.rst @@ -0,0 +1,4 @@ +Changed :class:`argparse.ArgumentParser` to use :term:`filesystem encoding +and error handler` instead of default text encoding to read arguments from +file (e.g. ``fromfile_prefix_chars`` option). This change affects Windows; +argument file should be encoded with UTF-8 instead of ANSI Codepage. 
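To give the gh-issue-93162 entry above some shape, here is a rough sketch of a ``dictConfig`` that pairs a ``QueueHandler`` with an implicitly created ``QueueListener``. The schema keys used under the queue handler and the handler names are assumptions drawn from the entry, not a verified configuration::

    import logging
    import logging.config

    config = {
        "version": 1,
        "handlers": {
            "console": {"class": "logging.StreamHandler"},
            # QueueHandler whose paired QueueListener forwards to "console".
            "queued": {
                "class": "logging.handlers.QueueHandler",
                "handlers": ["console"],
            },
        },
        "root": {"level": "INFO", "handlers": ["queued"]},
    }
    logging.config.dictConfig(config)

    # The lookup APIs added by the same change.
    print(logging.getHandlerNames())           # e.g. names of all configured handlers
    print(logging.getHandlerByName("queued"))  # the configured QueueHandler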
diff --git a/Misc/NEWS.d/next/Library/2022-05-27-13-18-18.gh-issue-93297.e2zuHz.rst b/Misc/NEWS.d/next/Library/2022-05-27-13-18-18.gh-issue-93297.e2zuHz.rst new file mode 100644 index 00000000000000..a8e4cd93d3047a --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-27-13-18-18.gh-issue-93297.e2zuHz.rst @@ -0,0 +1 @@ +Make asyncio task groups prevent child tasks from being GCed diff --git a/Misc/NEWS.d/next/Library/2022-05-27-22-17-11.gh-issue-88123.mkYl5q.rst b/Misc/NEWS.d/next/Library/2022-05-27-22-17-11.gh-issue-88123.mkYl5q.rst new file mode 100644 index 00000000000000..46bd37a85a7ca1 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-27-22-17-11.gh-issue-88123.mkYl5q.rst @@ -0,0 +1,2 @@ +Implement Enum __contains__ that returns True or False to replace the +deprecated behaviour that would sometimes raise a TypeError. diff --git a/Misc/NEWS.d/next/Library/2022-05-28-08-02-55.gh-issue-93312.HY0Uzj.rst b/Misc/NEWS.d/next/Library/2022-05-28-08-02-55.gh-issue-93312.HY0Uzj.rst new file mode 100644 index 00000000000000..f11d04f63532f3 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-28-08-02-55.gh-issue-93312.HY0Uzj.rst @@ -0,0 +1,3 @@ +Add :data:`os.PIDFD_NONBLOCK` flag to open a file descriptor +for a process with :func:`os.pidfd_open` in non-blocking mode. +Patch by Kumar Aditya. diff --git a/Misc/NEWS.d/next/Library/2022-05-30-21-42-50.gh-issue-83658.01Ntx0.rst b/Misc/NEWS.d/next/Library/2022-05-30-21-42-50.gh-issue-83658.01Ntx0.rst new file mode 100644 index 00000000000000..a1873095409803 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-30-21-42-50.gh-issue-83658.01Ntx0.rst @@ -0,0 +1 @@ +Make :class:`multiprocessing.Pool` raise an exception if ``maxtasksperchild`` is not ``None`` or a positive int. diff --git a/Misc/NEWS.d/next/Library/2022-05-31-14-58-40.gh-issue-93353.9Hvm6o.rst b/Misc/NEWS.d/next/Library/2022-05-31-14-58-40.gh-issue-93353.9Hvm6o.rst new file mode 100644 index 00000000000000..67be3c68f47cba --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-31-14-58-40.gh-issue-93353.9Hvm6o.rst @@ -0,0 +1,3 @@ +Fix the :func:`importlib.resources.as_file` context manager to remove the +temporary file if destroyed late during Python finalization: keep a local +reference to the :func:`os.remove` function. Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Library/2022-06-01-11-24-13.gh-issue-91162.NxvU_u.rst b/Misc/NEWS.d/next/Library/2022-06-01-11-24-13.gh-issue-91162.NxvU_u.rst new file mode 100644 index 00000000000000..09fa47c0d23840 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-01-11-24-13.gh-issue-91162.NxvU_u.rst @@ -0,0 +1,5 @@ +Support splitting of unpacked arbitrary-length tuple over ``TypeVar`` and +``TypeVarTuple`` parameters. For example: + +* ``A[T, *Ts][*tuple[int, ...]]`` -> ``A[int, *tuple[int, ...]]`` +* ``A[*Ts, T][*tuple[int, ...]]`` -> ``A[*tuple[int, ...], int]`` diff --git a/Misc/NEWS.d/next/Library/2022-06-02-08-40-58.gh-issue-91810.Gtk44w.rst b/Misc/NEWS.d/next/Library/2022-06-02-08-40-58.gh-issue-91810.Gtk44w.rst new file mode 100644 index 00000000000000..e40005886afc3e --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-02-08-40-58.gh-issue-91810.Gtk44w.rst @@ -0,0 +1,2 @@ +Suppress writing an XML declaration in open files in ``ElementTree.write()`` +with ``encoding='unicode'`` and ``xml_declaration=None``. 
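A tiny sketch of the Enum ``__contains__`` entry above (gh-issue-88123); the results are inferred from the described behaviour::

    from enum import Enum

    class Color(Enum):
        RED = 1
        GREEN = 2

    print(Color.RED in Color)  # True: an actual member
    print(1 in Color)          # True: matches a member *value* instead of raising
    print("purple" in Color)   # False instead of the old TypeError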
diff --git a/Misc/NEWS.d/next/Library/2022-06-03-22-13-28.gh-issue-93370.tjfu9L.rst b/Misc/NEWS.d/next/Library/2022-06-03-22-13-28.gh-issue-93370.tjfu9L.rst new file mode 100644 index 00000000000000..bd531503800319 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-03-22-13-28.gh-issue-93370.tjfu9L.rst @@ -0,0 +1 @@ +Deprecate :data:`sqlite3.version` and :data:`sqlite3.version_info`. diff --git a/Misc/NEWS.d/next/Library/2022-06-04-00-11-54.gh-issue-93475.vffFw1.rst b/Misc/NEWS.d/next/Library/2022-06-04-00-11-54.gh-issue-93475.vffFw1.rst new file mode 100644 index 00000000000000..efe7ff3e9b4fb6 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-04-00-11-54.gh-issue-93475.vffFw1.rst @@ -0,0 +1,2 @@ +Expose ``FICLONE`` and ``FICLONERANGE`` constants in :mod:`fcntl`. Patch by +Illia Volochii. diff --git a/Misc/NEWS.d/next/Library/2022-06-05-22-22-42.gh-issue-93421.43UO_8.rst b/Misc/NEWS.d/next/Library/2022-06-05-22-22-42.gh-issue-93421.43UO_8.rst new file mode 100644 index 00000000000000..9e1d6554e0ab2b --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-05-22-22-42.gh-issue-93421.43UO_8.rst @@ -0,0 +1,3 @@ +Update :data:`sqlite3.Cursor.rowcount` when a DML statement has run to +completion. This fixes the row count for SQL queries like +``UPDATE ... RETURNING``. Patch by Erlend E. Aasland. diff --git a/Misc/NEWS.d/next/Library/2022-06-06-12-58-27.gh-issue-79579.e8rB-M.rst b/Misc/NEWS.d/next/Library/2022-06-06-12-58-27.gh-issue-79579.e8rB-M.rst new file mode 100644 index 00000000000000..82b1a1c28a6001 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-06-12-58-27.gh-issue-79579.e8rB-M.rst @@ -0,0 +1,2 @@ +:mod:`sqlite3` now correctly detects DML queries with leading comments. +Patch by Erlend E. Aasland. diff --git a/Misc/NEWS.d/next/Library/2022-06-06-13-19-43.gh-issue-93521._vE8m9.rst b/Misc/NEWS.d/next/Library/2022-06-06-13-19-43.gh-issue-93521._vE8m9.rst new file mode 100644 index 00000000000000..3a3ff4736d2940 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-06-13-19-43.gh-issue-93521._vE8m9.rst @@ -0,0 +1,4 @@ +Fixed a case where dataclasses would try to add ``__weakref__`` into the +``__slots__`` for a dataclass that specified ``weakref_slot=True`` when it was +already defined in one of its bases. This resulted in a ``TypeError`` upon the +new class being created. diff --git a/Misc/NEWS.d/next/Library/2022-06-07-14-53-46.gh-issue-90549.T4FMKY.rst b/Misc/NEWS.d/next/Library/2022-06-07-14-53-46.gh-issue-90549.T4FMKY.rst new file mode 100644 index 00000000000000..6ebdc394900e63 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-07-14-53-46.gh-issue-90549.T4FMKY.rst @@ -0,0 +1,2 @@ +Fix a multiprocessing bug where a global named resource (such as a semaphore) +could leak when a child process is spawned (as opposed to forked). diff --git a/Misc/NEWS.d/next/Library/2022-06-08-20-11-02.gh-issue-90494.LIZT85.rst b/Misc/NEWS.d/next/Library/2022-06-08-20-11-02.gh-issue-90494.LIZT85.rst new file mode 100644 index 00000000000000..95416768793eaf --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-08-20-11-02.gh-issue-90494.LIZT85.rst @@ -0,0 +1,3 @@ +:func:`copy.copy` and :func:`copy.deepcopy` now always raise a TypeError if +``__reduce__()`` returns a tuple with length 6 instead of silently ignore +the 6th item or produce incorrect result. 
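To make the gh-issue-90494 entry above concrete, here is a hedged sketch of the new failure mode; the class is invented purely for illustration::

    import copy

    class Sixth:
        def __reduce__(self):
            # Six items: one more than the five that copy.copy() understands.
            return (Sixth, (), None, None, None, None)

    try:
        copy.copy(Sixth())
    except TypeError as exc:
        # Previously the sixth item was silently ignored or misused.
        print("rejected:", exc)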
diff --git a/Misc/NEWS.d/next/Library/2022-06-09-10-12-55.gh-issue-90473.683m_C.rst b/Misc/NEWS.d/next/Library/2022-06-09-10-12-55.gh-issue-90473.683m_C.rst new file mode 100644 index 00000000000000..b053a8e9a08135 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-09-10-12-55.gh-issue-90473.683m_C.rst @@ -0,0 +1,2 @@ +Emscripten and WASI have no home directory and cannot provide a :pep:`370` +user site directory. diff --git a/Misc/NEWS.d/next/Library/2022-06-09-17-15-26.gh-issue-91389.OE4vS5.rst b/Misc/NEWS.d/next/Library/2022-06-09-17-15-26.gh-issue-91389.OE4vS5.rst new file mode 100644 index 00000000000000..0a126551e4110b --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-09-17-15-26.gh-issue-91389.OE4vS5.rst @@ -0,0 +1,2 @@ +Fix an issue where :mod:`dis` utilities could report missing or incorrect +position information in the presence of ``CACHE`` entries. diff --git a/Misc/NEWS.d/next/Library/2022-06-11-13-32-17.gh-issue-79512.A1KTDr.rst b/Misc/NEWS.d/next/Library/2022-06-11-13-32-17.gh-issue-79512.A1KTDr.rst new file mode 100644 index 00000000000000..5393fb52e93c30 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-11-13-32-17.gh-issue-79512.A1KTDr.rst @@ -0,0 +1,3 @@ +Fixed names and ``__module__`` value of :mod:`weakref` classes +:class:`~weakref.ReferenceType`, :class:`~weakref.ProxyType`, +:class:`~weakref.CallableProxyType`. This makes them pickleable. diff --git a/Misc/NEWS.d/next/Library/2022-06-15-21-20-02.gh-issue-93820.FAMLY8.rst b/Misc/NEWS.d/next/Library/2022-06-15-21-20-02.gh-issue-93820.FAMLY8.rst new file mode 100644 index 00000000000000..e06d897e7d8e6b --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-15-21-20-02.gh-issue-93820.FAMLY8.rst @@ -0,0 +1,2 @@ +Fixed a regression when :func:`copy.copy`-ing :class:`enum.Flag` with +multiple flag members. diff --git a/Misc/NEWS.d/next/Library/2022-06-15-21-35-11.gh-issue-91404.39TZzW.rst b/Misc/NEWS.d/next/Library/2022-06-15-21-35-11.gh-issue-91404.39TZzW.rst new file mode 100644 index 00000000000000..e20b15c7b75864 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-15-21-35-11.gh-issue-91404.39TZzW.rst @@ -0,0 +1,3 @@ +Revert the fix for the :mod:`re` memory leak when a match is terminated by a signal or +memory allocation failure, as the implemented fix caused a major performance +regression. diff --git a/Misc/NEWS.d/next/Library/2022-06-16-09-24-50.gh-issue-93847.kuv8bN.rst b/Misc/NEWS.d/next/Library/2022-06-16-09-24-50.gh-issue-93847.kuv8bN.rst new file mode 100644 index 00000000000000..c6947575e67e1c --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-16-09-24-50.gh-issue-93847.kuv8bN.rst @@ -0,0 +1 @@ +Fix the repr of enums of generic aliases. diff --git a/Misc/NEWS.d/next/Library/2022-06-20-23-14-43.gh-issue-94028.UofEcX.rst b/Misc/NEWS.d/next/Library/2022-06-20-23-14-43.gh-issue-94028.UofEcX.rst new file mode 100644 index 00000000000000..5775b2276d70b4 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-20-23-14-43.gh-issue-94028.UofEcX.rst @@ -0,0 +1,3 @@ +Fix a regression in the :mod:`sqlite3` module where statement objects were not +properly cleared and reset after use in cursor iterators. The regression was +introduced by PR 27884 in Python 3.11a1. Patch by Erlend E. Aasland. 
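Editorial note on the :mod:`weakref` naming entry above — a minimal sketch, assuming an interpreter that includes this change so the classes pickle by reference::

    import pickle
    import weakref

    # With corrected __module__ and qualified names, these classes
    # round-trip through pickle by reference.
    for cls in (weakref.ReferenceType, weakref.ProxyType,
                weakref.CallableProxyType):
        assert pickle.loads(pickle.dumps(cls)) is cls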
diff --git a/Misc/NEWS.d/next/Library/2022-06-22-11-16-11.gh-issue-94101.V9vDG8.rst b/Misc/NEWS.d/next/Library/2022-06-22-11-16-11.gh-issue-94101.V9vDG8.rst new file mode 100644 index 00000000000000..bcef0ca07470a6 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-22-11-16-11.gh-issue-94101.V9vDG8.rst @@ -0,0 +1,3 @@ +Manual instantiation of :class:`ssl.SSLSession` objects is no longer allowed as it led to misconfigured instances that crashed the interpreter when attributes were accessed on them. diff --git a/Misc/NEWS.d/next/Library/2022-06-23-13-12-05.gh-issue-91742.sNytVX.rst b/Misc/NEWS.d/next/Library/2022-06-23-13-12-05.gh-issue-91742.sNytVX.rst new file mode 100644 index 00000000000000..30c92363b10b36 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-23-13-12-05.gh-issue-91742.sNytVX.rst @@ -0,0 +1 @@ +Fix :mod:`pdb` crash after jump caused by a null pointer dereference. Patch by Kumar Aditya. diff --git a/Misc/NEWS.d/next/Library/2022-06-23-14-35-10.gh-issue-94169.jeba90.rst b/Misc/NEWS.d/next/Library/2022-06-23-14-35-10.gh-issue-94169.jeba90.rst new file mode 100644 index 00000000000000..40c1fc10bc0e46 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-23-14-35-10.gh-issue-94169.jeba90.rst @@ -0,0 +1,4 @@ +Remove ``io.OpenWrapper`` and ``_pyio.OpenWrapper``, deprecated in Python +3.10: just use :func:`open` instead. The :func:`open` (:func:`io.open`) +function is a built-in function. Since Python 3.10, :func:`_pyio.open` is +also a static method. Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Library/2022-06-24-09-41-41.gh-issue-94196.r2KyfS.rst b/Misc/NEWS.d/next/Library/2022-06-24-09-41-41.gh-issue-94196.r2KyfS.rst new file mode 100644 index 00000000000000..e22776f1b45e6f --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-24-09-41-41.gh-issue-94196.r2KyfS.rst @@ -0,0 +1,4 @@ +:mod:`gzip`: Remove the ``filename`` attribute of :class:`gzip.GzipFile`, +deprecated since Python 2.6; use the :attr:`~gzip.GzipFile.name` attribute +instead. In write mode, the ``filename`` attribute added a ``'.gz'`` file +extension if it was not present. Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Library/2022-06-24-10-29-19.gh-issue-94199.pfehmz.rst b/Misc/NEWS.d/next/Library/2022-06-24-10-29-19.gh-issue-94199.pfehmz.rst new file mode 100644 index 00000000000000..ed325c0f6886f5 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-24-10-29-19.gh-issue-94199.pfehmz.rst @@ -0,0 +1,3 @@ +Remove the :func:`ssl.RAND_pseudo_bytes` function, deprecated in Python 3.6: +use :func:`os.urandom` or :func:`ssl.RAND_bytes` instead. Patch by Victor +Stinner. diff --git a/Misc/NEWS.d/next/Library/2022-06-24-17-11-33.gh-issue-94199.7releN.rst b/Misc/NEWS.d/next/Library/2022-06-24-17-11-33.gh-issue-94199.7releN.rst new file mode 100644 index 00000000000000..68bd283b990741 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-24-17-11-33.gh-issue-94199.7releN.rst @@ -0,0 +1,4 @@ +Remove the :func:`ssl.match_hostname` function. The +:func:`ssl.match_hostname` function was deprecated in Python 3.7. OpenSSL performs +hostname matching since Python 3.7, so Python no longer uses the +:func:`ssl.match_hostname` function. Patch by Victor Stinner. 
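Editorial note on the removed :func:`ssl.RAND_pseudo_bytes` above — the documented replacements in a minimal sketch (the 16-byte length is an arbitrary example)::

    import os
    import ssl

    token = os.urandom(16)       # OS-provided random bytes
    token2 = ssl.RAND_bytes(16)  # bytes from OpenSSL's CSPRNG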
diff --git a/Misc/NEWS.d/next/Library/2022-06-24-19-23-59.gh-issue-94207.VhS1eS.rst b/Misc/NEWS.d/next/Library/2022-06-24-19-23-59.gh-issue-94207.VhS1eS.rst new file mode 100644 index 00000000000000..3d38524ac0e83c --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-24-19-23-59.gh-issue-94207.VhS1eS.rst @@ -0,0 +1,2 @@ +Made :class:`_struct.Struct` GC-tracked in order to fix a reference leak in +the :mod:`_struct` module. diff --git a/Misc/NEWS.d/next/Library/2022-06-25-13-38-53.gh-issue-93259.FAGw-2.rst b/Misc/NEWS.d/next/Library/2022-06-25-13-38-53.gh-issue-93259.FAGw-2.rst new file mode 100644 index 00000000000000..d346b65836b7a3 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-06-25-13-38-53.gh-issue-93259.FAGw-2.rst @@ -0,0 +1,2 @@ +Now raise ``ValueError`` when ``None`` or an empty string is passed to +``Distribution.from_name`` (and other callers). diff --git a/Misc/NEWS.d/next/Security/2022-04-27-18-25-30.gh-issue-68966.gjS8zs.rst b/Misc/NEWS.d/next/Security/2022-04-27-18-25-30.gh-issue-68966.gjS8zs.rst new file mode 100644 index 00000000000000..da81a1f6993dbe --- /dev/null +++ b/Misc/NEWS.d/next/Security/2022-04-27-18-25-30.gh-issue-68966.gjS8zs.rst @@ -0,0 +1,4 @@ +The deprecated mailcap module now refuses to inject unsafe text (filenames, +MIME types, parameters) into shell commands. Instead of using such text, it +will warn and act as if a match was not found (or for test commands, as if +the test failed). diff --git a/Misc/NEWS.d/next/Security/2022-05-19-08-53-07.gh-issue-92888.TLtR9W.rst b/Misc/NEWS.d/next/Security/2022-05-19-08-53-07.gh-issue-92888.TLtR9W.rst new file mode 100644 index 00000000000000..4841b8a90a0879 --- /dev/null +++ b/Misc/NEWS.d/next/Security/2022-05-19-08-53-07.gh-issue-92888.TLtR9W.rst @@ -0,0 +1,2 @@ +Fix ``memoryview`` use after free when accessing the backing buffer in certain cases. + diff --git a/Misc/NEWS.d/next/Security/2022-06-03-12-52-53.gh-issue-79096.YVoxgC.rst b/Misc/NEWS.d/next/Security/2022-06-03-12-52-53.gh-issue-79096.YVoxgC.rst new file mode 100644 index 00000000000000..9ec3335dc71b92 --- /dev/null +++ b/Misc/NEWS.d/next/Security/2022-06-03-12-52-53.gh-issue-79096.YVoxgC.rst @@ -0,0 +1 @@ +LWPCookieJar and MozillaCookieJar create files with file mode 600 instead of 644 (Microsoft Windows is not affected). diff --git a/Misc/NEWS.d/next/Security/2022-06-15-20-09-23.gh-issue-87389.QVaC3f.rst b/Misc/NEWS.d/next/Security/2022-06-15-20-09-23.gh-issue-87389.QVaC3f.rst new file mode 100644 index 00000000000000..029d437190deb5 --- /dev/null +++ b/Misc/NEWS.d/next/Security/2022-06-15-20-09-23.gh-issue-87389.QVaC3f.rst @@ -0,0 +1,3 @@ +:mod:`http.server`: Fix an open redirection vulnerability in the HTTP server +when a URI path starts with ``//``. Vulnerability discovered, and initial +fix proposed, by Hamza Avvan. diff --git a/Misc/NEWS.d/next/Tests/2022-03-14-23-28-17.bpo-47016.K-t2QX.rst b/Misc/NEWS.d/next/Tests/2022-03-14-23-28-17.bpo-47016.K-t2QX.rst new file mode 100644 index 00000000000000..774bfafc021efc --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-03-14-23-28-17.bpo-47016.K-t2QX.rst @@ -0,0 +1,2 @@ +Create a GitHub Actions workflow for verifying bundled pip and setuptools. +Patch by Illia Volochii and Adam Turner. 
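Editorial note on the cookie-file permission entry above — a minimal sketch for checking the new mode, assuming a POSIX system with this change applied and a default umask (``cookies.txt`` is an arbitrary file name)::

    import os
    import stat
    from http.cookiejar import MozillaCookieJar

    jar = MozillaCookieJar("cookies.txt")
    jar.save()  # creates the file
    mode = stat.S_IMODE(os.stat("cookies.txt").st_mode)
    print(oct(mode))  # expected 0o600 rather than 0o644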
diff --git a/Misc/NEWS.d/next/Tests/2022-05-12-05-51-06.gh-issue-92670.7L43Z_.rst b/Misc/NEWS.d/next/Tests/2022-05-12-05-51-06.gh-issue-92670.7L43Z_.rst new file mode 100644 index 00000000000000..c1349519e7c37c --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-05-12-05-51-06.gh-issue-92670.7L43Z_.rst @@ -0,0 +1,3 @@ +Skip the ``test_shutil.TestCopy.test_copyfile_nonexistent_dir`` test on AIX, as the test uses a trailing +slash to force the OS to consider the path as a directory, but on AIX the +trailing slash has no effect and the path is treated as a file. diff --git a/Misc/NEWS.d/next/Tests/2022-05-25-23-00-35.gh-issue-92886.Y-vrWj.rst b/Misc/NEWS.d/next/Tests/2022-05-25-23-00-35.gh-issue-92886.Y-vrWj.rst new file mode 100644 index 00000000000000..93c1ffe33f28a6 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-05-25-23-00-35.gh-issue-92886.Y-vrWj.rst @@ -0,0 +1 @@ +Fix tests that fail when running with optimizations (``-O``) in ``test_zipimport.py``. diff --git a/Misc/NEWS.d/next/Tests/2022-05-25-23-07-15.gh-issue-92886.Aki63_.rst b/Misc/NEWS.d/next/Tests/2022-05-25-23-07-15.gh-issue-92886.Aki63_.rst new file mode 100644 index 00000000000000..581f6bfea24b6e --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-05-25-23-07-15.gh-issue-92886.Aki63_.rst @@ -0,0 +1 @@ +Fix tests that fail when running with optimizations (``-O``) in ``test_imaplib.py``. diff --git a/Misc/NEWS.d/next/Tests/2022-06-03-12-22-44.gh-issue-89858.ftBvjE.rst b/Misc/NEWS.d/next/Tests/2022-06-03-12-22-44.gh-issue-89858.ftBvjE.rst new file mode 100644 index 00000000000000..ef806a93c018cc --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-06-03-12-22-44.gh-issue-89858.ftBvjE.rst @@ -0,0 +1 @@ +Fix ``test_embed`` for out-of-tree builds. Patch by Kumar Aditya. diff --git a/Misc/NEWS.d/next/Tests/2022-06-03-14-18-37.gh-issue-90473.7iXVRK.rst b/Misc/NEWS.d/next/Tests/2022-06-03-14-18-37.gh-issue-90473.7iXVRK.rst new file mode 100644 index 00000000000000..a3165a01111fb9 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-06-03-14-18-37.gh-issue-90473.7iXVRK.rst @@ -0,0 +1,2 @@ +Skip symlink tests on WASI. wasmtime uses ``openat2(2)`` with the +``RESOLVE_BENEATH`` flag, which prevents symlinks with absolute paths. diff --git a/Misc/NEWS.d/next/Tests/2022-06-03-16-26-04.gh-issue-57539.HxWgYO.rst b/Misc/NEWS.d/next/Tests/2022-06-03-16-26-04.gh-issue-57539.HxWgYO.rst new file mode 100644 index 00000000000000..0734b599f4ad95 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-06-03-16-26-04.gh-issue-57539.HxWgYO.rst @@ -0,0 +1 @@ +Increase calendar test coverage for :meth:`calendar.LocaleTextCalendar.formatweekday`. diff --git a/Misc/NEWS.d/next/Tests/2022-06-04-12-05-31.gh-issue-90473.RSpjF7.rst b/Misc/NEWS.d/next/Tests/2022-06-04-12-05-31.gh-issue-90473.RSpjF7.rst new file mode 100644 index 00000000000000..07d579995c0911 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-06-04-12-05-31.gh-issue-90473.RSpjF7.rst @@ -0,0 +1 @@ +Skip tests on WASI that require symlinks with absolute paths. diff --git a/Misc/NEWS.d/next/Tests/2022-06-05-10-16-45.gh-issue-90473.QMu7A8.rst b/Misc/NEWS.d/next/Tests/2022-06-05-10-16-45.gh-issue-90473.QMu7A8.rst new file mode 100644 index 00000000000000..6c76b7f4990e4f --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-06-05-10-16-45.gh-issue-90473.QMu7A8.rst @@ -0,0 +1,2 @@ +WASI does not have a ``chmod(2)`` syscall. :func:`os.chmod` is now a dummy +function on WASI. Skip all tests that depend on a working :func:`os.chmod`. 
diff --git a/Misc/NEWS.d/next/Tests/2022-06-08-14-17-59.gh-issue-93575.Xb2LNB.rst b/Misc/NEWS.d/next/Tests/2022-06-08-14-17-59.gh-issue-93575.Xb2LNB.rst new file mode 100644 index 00000000000000..98d15328a087ae --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-06-08-14-17-59.gh-issue-93575.Xb2LNB.rst @@ -0,0 +1,4 @@ +Fix an issue with test_unicode test_raiseMemError. The test case now uses +``test.support.calcobjsize`` to calculate the size of PyUnicode structs. +:func:`sys.getsizeof` may return a different size when the string has UTF-8 +memory. diff --git a/Misc/NEWS.d/next/Tests/2022-06-08-22-32-56.gh-issue-93616.e5Kkx2.rst b/Misc/NEWS.d/next/Tests/2022-06-08-22-32-56.gh-issue-93616.e5Kkx2.rst new file mode 100644 index 00000000000000..de8ec52cc88b2a --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-06-08-22-32-56.gh-issue-93616.e5Kkx2.rst @@ -0,0 +1,2 @@ +``test_modulefinder`` now creates a temporary directory in +``ModuleFinderTest.setUp()`` instead of at module scope. diff --git a/Misc/NEWS.d/next/Tests/2022-06-10-21-18-14.gh-issue-84461.9TAb26.rst b/Misc/NEWS.d/next/Tests/2022-06-10-21-18-14.gh-issue-84461.9TAb26.rst new file mode 100644 index 00000000000000..7cdf5bee721870 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-06-10-21-18-14.gh-issue-84461.9TAb26.rst @@ -0,0 +1,2 @@ +``run_tests.py`` now handles cross-compiling env vars correctly and passes +``HOSTRUNNER`` to regression tests. diff --git a/Misc/NEWS.d/next/Tests/2022-06-16-17-50-58.gh-issue-93353.JdpATx.rst b/Misc/NEWS.d/next/Tests/2022-06-16-17-50-58.gh-issue-93353.JdpATx.rst new file mode 100644 index 00000000000000..4e232948f49ee7 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-06-16-17-50-58.gh-issue-93353.JdpATx.rst @@ -0,0 +1,2 @@ +regrtest now checks if a test leaks temporary files or directories when run +with the -jN option. Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Tests/2022-06-16-21-38-18.gh-issue-93852.U_Hl6s.rst b/Misc/NEWS.d/next/Tests/2022-06-16-21-38-18.gh-issue-93852.U_Hl6s.rst new file mode 100644 index 00000000000000..ce86eead02660c --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-06-16-21-38-18.gh-issue-93852.U_Hl6s.rst @@ -0,0 +1,4 @@ +test_asyncio, test_logging, test_socket and test_socketserver now create +AF_UNIX sockets in the current directory to no longer fail with +``OSError("AF_UNIX path too long")`` if the temporary directory (the +:envvar:`TMPDIR` environment variable) is too long. Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Tests/2022-06-17-13-55-11.gh-issue-93957.X4ovYV.rst b/Misc/NEWS.d/next/Tests/2022-06-17-13-55-11.gh-issue-93957.X4ovYV.rst new file mode 100644 index 00000000000000..2719933f6b94c7 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-06-17-13-55-11.gh-issue-93957.X4ovYV.rst @@ -0,0 +1,2 @@ +Provide nicer error reporting from subprocesses in +test_venv.EnsurePipTest.test_with_pip. diff --git a/Misc/NEWS.d/next/Tests/2022-06-17-15-20-09.gh-issue-93951.CW1Vv4.rst b/Misc/NEWS.d/next/Tests/2022-06-17-15-20-09.gh-issue-93951.CW1Vv4.rst new file mode 100644 index 00000000000000..b627466b4b221c --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-06-17-15-20-09.gh-issue-93951.CW1Vv4.rst @@ -0,0 +1 @@ +In test_bdb.StateTestCase.test_skip, avoid including auxiliary importers. 
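Editorial note on the ``AF_UNIX path too long`` entry above — a minimal sketch that reproduces the failure mode on a POSIX system (the 200-character path is an arbitrary example)::

    import socket

    sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    try:
        # sun_path is small (about 100 bytes on Linux), so a long
        # temporary-directory path makes bind() fail up front.
        sock.bind("/tmp/" + "x" * 200)
    except OSError as exc:
        print(exc)  # e.g. "AF_UNIX path too long"
    finally:
        sock.close()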
diff --git a/Misc/NEWS.d/next/Tests/2022-06-20-23-04-52.gh-issue-93839.OE3Ybk.rst b/Misc/NEWS.d/next/Tests/2022-06-20-23-04-52.gh-issue-93839.OE3Ybk.rst new file mode 100644 index 00000000000000..121b64b13307ae --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-06-20-23-04-52.gh-issue-93839.OE3Ybk.rst @@ -0,0 +1,2 @@ +Move ``Lib/ctypes/test/`` to ``Lib/test/test_ctypes/``. Patch by Victor +Stinner. diff --git a/Misc/NEWS.d/next/Tests/2022-06-21-17-37-46.gh-issue-54781.BjVAVg.rst b/Misc/NEWS.d/next/Tests/2022-06-21-17-37-46.gh-issue-54781.BjVAVg.rst new file mode 100644 index 00000000000000..f7ae7b976af95b --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2022-06-21-17-37-46.gh-issue-54781.BjVAVg.rst @@ -0,0 +1,2 @@ +Rename test_tk to test_tkinter, and rename test_ttk_guionly to test_ttk. +Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Tools-Demos/2022-06-19-14-56-33.gh-issue-86087.R8MkRy.rst b/Misc/NEWS.d/next/Tools-Demos/2022-06-19-14-56-33.gh-issue-86087.R8MkRy.rst new file mode 100644 index 00000000000000..a10d4c76884522 --- /dev/null +++ b/Misc/NEWS.d/next/Tools-Demos/2022-06-19-14-56-33.gh-issue-86087.R8MkRy.rst @@ -0,0 +1,2 @@ +The ``Tools/scripts/parseentities.py`` script used to parse HTML4 entities +has been removed. diff --git a/Misc/NEWS.d/next/Windows/2020-01-10-23-33-03.bpo-38704.2Idtdn.rst b/Misc/NEWS.d/next/Windows/2020-01-10-23-33-03.bpo-38704.2Idtdn.rst new file mode 100644 index 00000000000000..519338f27a47ae --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2020-01-10-23-33-03.bpo-38704.2Idtdn.rst @@ -0,0 +1 @@ +Prevent installation on unsupported Windows versions. diff --git a/Misc/NEWS.d/next/Windows/2022-03-20-15-47-35.bpo-42658.16eXtb.rst b/Misc/NEWS.d/next/Windows/2022-03-20-15-47-35.bpo-42658.16eXtb.rst new file mode 100644 index 00000000000000..852cc77676a31d --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2022-03-20-15-47-35.bpo-42658.16eXtb.rst @@ -0,0 +1,3 @@ +Support native Windows case-insensitive path comparisons by using +``LCMapStringEx`` instead of :func:`str.lower` in :func:`ntpath.normcase`. +Add ``LCMapStringEx`` to the :mod:`_winapi` module. diff --git a/Misc/NEWS.d/next/Windows/2022-04-12-18-35-20.gh-issue-91061.x40hSK.rst b/Misc/NEWS.d/next/Windows/2022-04-12-18-35-20.gh-issue-91061.x40hSK.rst new file mode 100644 index 00000000000000..aadc3034420251 --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2022-04-12-18-35-20.gh-issue-91061.x40hSK.rst @@ -0,0 +1 @@ +Accept os.PathLike for the argument to winsound.PlaySound diff --git a/Misc/NEWS.d/next/Windows/2022-05-16-11-45-06.gh-issue-92841.NQx107.rst b/Misc/NEWS.d/next/Windows/2022-05-16-11-45-06.gh-issue-92841.NQx107.rst new file mode 100644 index 00000000000000..5e1897e6ba1bc6 --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2022-05-16-11-45-06.gh-issue-92841.NQx107.rst @@ -0,0 +1,2 @@ +:mod:`asyncio` no longer throws ``RuntimeError: Event loop is closed`` on +interpreter exit after asynchronous socket activity. Patch by Oleg Iarygin. 
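Editorial note on the :func:`ntpath.normcase` entry above — a minimal sketch of case-insensitive path comparison (the paths are arbitrary examples)::

    import ntpath

    # On Windows, normcase() now case-folds with LCMapStringEx; on other
    # platforms it keeps the str.lower()-based fallback.  Either way the
    # two spellings compare equal after normalisation.
    a = ntpath.normcase(r"C:\Users\Admin\Report.TXT")
    b = ntpath.normcase(r"c:\users\admin\report.txt")
    print(a == b)  # True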
diff --git a/Misc/NEWS.d/next/Windows/2022-05-19-14-01-30.gh-issue-92984.Dsxnlr.rst b/Misc/NEWS.d/next/Windows/2022-05-19-14-01-30.gh-issue-92984.Dsxnlr.rst new file mode 100644 index 00000000000000..2d3071b00b7618 --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2022-05-19-14-01-30.gh-issue-92984.Dsxnlr.rst @@ -0,0 +1 @@ +Explicitly disable incremental linking for non-Debug builds diff --git a/Misc/NEWS.d/next/Windows/2022-05-19-21-44-25.gh-issue-92817.Jrf-Kv.rst b/Misc/NEWS.d/next/Windows/2022-05-19-21-44-25.gh-issue-92817.Jrf-Kv.rst new file mode 100644 index 00000000000000..16acba1d6274c9 --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2022-05-19-21-44-25.gh-issue-92817.Jrf-Kv.rst @@ -0,0 +1,3 @@ +Ensures that :file:`py.exe` will prefer an active virtual environment over +default tags specified with environment variables or through a +:file:`py.ini` file. diff --git a/Misc/NEWS.d/next/Windows/2022-06-15-01-03-52.gh-issue-93824.mR4mxu.rst b/Misc/NEWS.d/next/Windows/2022-06-15-01-03-52.gh-issue-93824.mR4mxu.rst new file mode 100644 index 00000000000000..cabe983847c891 --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2022-06-15-01-03-52.gh-issue-93824.mR4mxu.rst @@ -0,0 +1,2 @@ +Drag and drop of files onto Python files in Windows Explorer has been +enabled for Windows ARM64. diff --git a/Misc/python.man b/Misc/python.man index 69dab58a9aac26..1705eeb0c9c126 100644 --- a/Misc/python.man +++ b/Misc/python.man @@ -69,10 +69,10 @@ python \- an interpreted, interactive, object-oriented programming language .B \-x ] [ -[ .B \-X .I option ] +[ .B \-? ] .br @@ -84,6 +84,19 @@ python \- an interpreted, interactive, object-oriented programming language | .I never ] +.br + [ +.B \--help +] +[ +.B \--help-env +] +[ +.B \--help-xoptions +] +[ +.B \--help-all +] .br [ .B \-c @@ -149,6 +162,16 @@ the behavior of the interpreter. .B \-h ", " \-? ", "\-\-help Prints the usage for the interpreter executable and exits. .TP +.B "\-\-help\-env" +Prints help about Python-specific environment variables and exits. +.TP +.B "\-\-help\-xoptions" +Prints help about implementation-specific \fB\-X\fP options and exits. +.TP +.TP +.B "\-\-help\-all" +Prints complete usage information and exits. +.TP .B \-i When a script is passed as first argument or the \fB\-c\fP option is used, enter interactive mode after executing the script or the @@ -287,7 +310,7 @@ a regular expression on the warning message. .TP .BI "\-X " option -Set implementation specific option. The following options are available: +Set implementation-specific option. The following options are available: -X faulthandler: enable faulthandler @@ -310,7 +333,8 @@ Set implementation specific option. The following options are available: more verbose than the default if the code is correct: new warnings are only emitted when an issue is detected. Effect of the developer mode: * Add default warning filter, as -W default - * Install debug hooks on memory allocators: see the PyMem_SetupDebugHooks() C function + * Install debug hooks on memory allocators: see the PyMem_SetupDebugHooks() + C function * Enable the faulthandler module to dump the Python traceback on a crash * Enable asyncio debug mode * Set the dev_mode attribute of sys.flags to True @@ -321,7 +345,19 @@ Set implementation specific option. The following options are available: otherwise activate automatically). See PYTHONUTF8 for more details -X pycache_prefix=PATH: enable writing .pyc files to a parallel tree rooted at the - given directory instead of to the code tree. 
+ given directory instead of to the code tree. + + -X warn_default_encoding: enable opt-in EncodingWarning for 'encoding=None' + + -X no_debug_ranges: disable the inclusion of the tables mapping extra location + information (end line, start column offset and end column offset) to every + instruction in code objects. This is useful when smaller code objects and pyc + files are desired as well as suppressing the extra visual location indicators + when the interpreter displays tracebacks. + + -X frozen_modules=[on|off]: whether or not frozen modules should be used. + The default is "on" (or "off" if you are running a local build). + .TP .B \-x Skip the first line of the source. This is intended for a DOS diff --git a/Misc/stable_abi.toml b/Misc/stable_abi.toml index d848f18d68ff64..84bec827096050 100644 --- a/Misc/stable_abi.toml +++ b/Misc/stable_abi.toml @@ -2275,3 +2275,5 @@ added = '3.11' [function.PyErr_SetHandledException] added = '3.11' +[function.PyType_FromMetaclass] + added = '3.12' diff --git a/Modules/Setup.stdlib.in b/Modules/Setup.stdlib.in index 22c0b147c1b894..2730030a156506 100644 --- a/Modules/Setup.stdlib.in +++ b/Modules/Setup.stdlib.in @@ -143,7 +143,7 @@ # needs -lncurses and -lpanel #@MODULE__CURSES_PANEL_TRUE@_curses_panel _curses_panel.c -@MODULE__SQLITE3_TRUE@_sqlite3 _sqlite/connection.c _sqlite/cursor.c _sqlite/microprotocols.c _sqlite/module.c _sqlite/prepare_protocol.c _sqlite/row.c _sqlite/statement.c _sqlite/util.c +@MODULE__SQLITE3_TRUE@_sqlite3 _sqlite/blob.c _sqlite/connection.c _sqlite/cursor.c _sqlite/microprotocols.c _sqlite/module.c _sqlite/prepare_protocol.c _sqlite/row.c _sqlite/statement.c _sqlite/util.c # needs -lssl and -lcrypt @MODULE__SSL_TRUE@_ssl _ssl.c diff --git a/Modules/_asynciomodule.c b/Modules/_asynciomodule.c index 5fa722305757d4..bb7c5a7c3ec550 100644 --- a/Modules/_asynciomodule.c +++ b/Modules/_asynciomodule.c @@ -384,7 +384,7 @@ call_soon(PyObject *loop, PyObject *func, PyObject *arg, PyObject *ctx) nargs++; } stack[nargs] = (PyObject *)ctx; - + EVAL_CALL_STAT_INC_IF_FUNCTION(EVAL_CALL_API, callable); handle = PyObject_Vectorcall(callable, stack, nargs, context_kwname); Py_DECREF(callable); } @@ -2643,11 +2643,7 @@ task_set_error_soon(TaskObj *task, PyObject *et, const char *format, ...) 
PyObject* msg; va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif msg = PyUnicode_FromFormatV(format, vargs); va_end(vargs); @@ -2954,6 +2950,7 @@ task_step_impl(TaskObj *task, PyObject *exc) PyObject *stack[2]; stack[0] = wrapper; stack[1] = (PyObject *)task->task_context; + EVAL_CALL_STAT_INC_IF_FUNCTION(EVAL_CALL_API, add_cb); tmp = PyObject_Vectorcall(add_cb, stack, 1, context_kwname); Py_DECREF(add_cb); Py_DECREF(wrapper); diff --git a/Modules/_bisectmodule.c b/Modules/_bisectmodule.c index 19e9cd2d46f109..0caa92b2dc6e02 100644 --- a/Modules/_bisectmodule.c +++ b/Modules/_bisectmodule.c @@ -128,7 +128,7 @@ _bisect_insort_right_impl(PyObject *module, PyObject *a, PyObject *x, index = internal_bisect_right(a, x, lo, hi, key); } else { key_x = PyObject_CallOneArg(key, x); - if (x == NULL) { + if (key_x == NULL) { return NULL; } index = internal_bisect_right(a, key_x, lo, hi, key); @@ -256,7 +256,7 @@ _bisect_insort_left_impl(PyObject *module, PyObject *a, PyObject *x, index = internal_bisect_left(a, x, lo, hi, key); } else { key_x = PyObject_CallOneArg(key, x); - if (x == NULL) { + if (key_x == NULL) { return NULL; } index = internal_bisect_left(a, key_x, lo, hi, key); diff --git a/Modules/_ctypes/_ctypes.c b/Modules/_ctypes/_ctypes.c index d6fa11d3481305..2c629d76beb3bd 100644 --- a/Modules/_ctypes/_ctypes.c +++ b/Modules/_ctypes/_ctypes.c @@ -396,9 +396,9 @@ _ctypes_alloc_format_string_with_shape(int ndim, const Py_ssize_t *shape, strcat(new_prefix, "("); for (k = 0; k < ndim; ++k) { if (k < ndim-1) { - sprintf(buf, "%"PY_FORMAT_SIZE_T"d,", shape[k]); + sprintf(buf, "%zd,", shape[k]); } else { - sprintf(buf, "%"PY_FORMAT_SIZE_T"d)", shape[k]); + sprintf(buf, "%zd)", shape[k]); } strcat(new_prefix, buf); } diff --git a/Modules/_ctypes/callbacks.c b/Modules/_ctypes/callbacks.c index e1e0225f67b341..2a668c0ca0cc97 100644 --- a/Modules/_ctypes/callbacks.c +++ b/Modules/_ctypes/callbacks.c @@ -10,7 +10,6 @@ #endif #include "pycore_call.h" // _PyObject_CallNoArgs() -#include "frameobject.h" #include @@ -472,24 +471,17 @@ static void LoadPython(void) long Call_GetClassObject(REFCLSID rclsid, REFIID riid, LPVOID *ppv) { - PyObject *mod, *func, *result; + PyObject *func, *result; long retval; static PyObject *context; if (context == NULL) context = PyUnicode_InternFromString("_ctypes.DllGetClassObject"); - mod = PyImport_ImportModule("ctypes"); - if (!mod) { - PyErr_WriteUnraisable(context ? context : Py_None); - /* There has been a warning before about this already */ - return E_FAIL; - } - - func = PyObject_GetAttrString(mod, "DllGetClassObject"); - Py_DECREF(mod); + func = _PyImport_GetModuleAttrString("ctypes", "DllGetClassObject"); if (!func) { PyErr_WriteUnraisable(context ? context : Py_None); + /* There has been a warning before about this already */ return E_FAIL; } diff --git a/Modules/_ctypes/ctypes.h b/Modules/_ctypes/ctypes.h index da1941caf3927d..88eb9f59922a04 100644 --- a/Modules/_ctypes/ctypes.h +++ b/Modules/_ctypes/ctypes.h @@ -19,7 +19,11 @@ * to avoid allocating a massive buffer on the stack. 
*/ #ifndef CTYPES_MAX_ARGCOUNT - #define CTYPES_MAX_ARGCOUNT 1024 + #ifdef __EMSCRIPTEN__ + #define CTYPES_MAX_ARGCOUNT 1000 + #else + #define CTYPES_MAX_ARGCOUNT 1024 + #endif #endif typedef struct tagPyCArgObject PyCArgObject; diff --git a/Modules/_datetimemodule.c b/Modules/_datetimemodule.c index 06dff8fdd9eed9..ba24e3c124846f 100644 --- a/Modules/_datetimemodule.c +++ b/Modules/_datetimemodule.c @@ -1718,17 +1718,17 @@ wrap_strftime(PyObject *object, PyObject *format, PyObject *timetuple, goto Done; { PyObject *format; - PyObject *time = PyImport_ImportModule("time"); + PyObject *strftime = _PyImport_GetModuleAttrString("time", "strftime"); - if (time == NULL) + if (strftime == NULL) goto Done; format = PyUnicode_FromString(PyBytes_AS_STRING(newfmt)); if (format != NULL) { - result = _PyObject_CallMethodIdObjArgs(time, &PyId_strftime, + result = PyObject_CallFunctionObjArgs(strftime, format, timetuple, NULL); Py_DECREF(format); } - Py_DECREF(time); + Py_DECREF(strftime); } Done: Py_XDECREF(freplacement); @@ -1748,12 +1748,10 @@ static PyObject * time_time(void) { PyObject *result = NULL; - PyObject *time = PyImport_ImportModule("time"); + PyObject *time = _PyImport_GetModuleAttrString("time", "time"); if (time != NULL) { - _Py_IDENTIFIER(time); - - result = _PyObject_CallMethodIdNoArgs(time, &PyId_time); + result = PyObject_CallNoArgs(time); Py_DECREF(time); } return result; @@ -1765,31 +1763,21 @@ time_time(void) static PyObject * build_struct_time(int y, int m, int d, int hh, int mm, int ss, int dstflag) { - PyObject *time; + PyObject *struct_time; PyObject *result; - _Py_IDENTIFIER(struct_time); - PyObject *args; - - time = PyImport_ImportModule("time"); - if (time == NULL) { + struct_time = _PyImport_GetModuleAttrString("time", "struct_time"); + if (struct_time == NULL) { return NULL; } - args = Py_BuildValue("iiiiiiiii", + result = PyObject_CallFunction(struct_time, "((iiiiiiiii))", y, m, d, hh, mm, ss, weekday(y, m, d), days_before_month(y, m) + d, dstflag); - if (args == NULL) { - Py_DECREF(time); - return NULL; - } - - result = _PyObject_CallMethodIdOneArg(time, &PyId_struct_time, args); - Py_DECREF(time); - Py_DECREF(args); + Py_DECREF(struct_time); return result; } diff --git a/Modules/_elementtree.c b/Modules/_elementtree.c index 453e57a94f2c6b..87f18d6e671a63 100644 --- a/Modules/_elementtree.c +++ b/Modules/_elementtree.c @@ -4370,7 +4370,7 @@ static struct PyModuleDef elementtreemodule = { PyMODINIT_FUNC PyInit__elementtree(void) { - PyObject *m, *temp; + PyObject *m; elementtreestate *st; m = PyState_FindModule(&elementtreemodule); @@ -4394,11 +4394,7 @@ PyInit__elementtree(void) return NULL; st = get_elementtree_state(m); - if (!(temp = PyImport_ImportModule("copy"))) - return NULL; - st->deepcopy_obj = PyObject_GetAttrString(temp, "deepcopy"); - Py_XDECREF(temp); - + st->deepcopy_obj = _PyImport_GetModuleAttrString("copy", "deepcopy"); if (st->deepcopy_obj == NULL) { return NULL; } diff --git a/Modules/_hashopenssl.c b/Modules/_hashopenssl.c index 203366e380d4e7..82398547f9b372 100644 --- a/Modules/_hashopenssl.c +++ b/Modules/_hashopenssl.c @@ -255,11 +255,7 @@ _setException(PyObject *exc, const char* altmsg, ...) 
const char *lib, *func, *reason; va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, altmsg); -#else - va_start(vargs); -#endif if (!errcode) { if (altmsg == NULL) { PyErr_SetString(exc, "no reason supplied"); @@ -2002,7 +1998,7 @@ _hashlib_compare_digest_impl(PyObject *module, PyObject *a, PyObject *b) PyUnicode_GET_LENGTH(a), PyUnicode_GET_LENGTH(b)); } - /* fallback to buffer interface for bytes, bytesarray and other */ + /* fallback to buffer interface for bytes, bytearray and other */ else { Py_buffer view_a; Py_buffer view_b; diff --git a/Modules/_io/bufferedio.c b/Modules/_io/bufferedio.c index ac30d1d632bff8..4a4a1992dbbb7a 100644 --- a/Modules/_io/bufferedio.c +++ b/Modules/_io/bufferedio.c @@ -328,7 +328,7 @@ _enter_buffered_busy(buffered *self) : buffered_closed(self))) #define CHECK_CLOSED(self, error_msg) \ - if (IS_CLOSED(self) & (Py_SAFE_DOWNCAST(READAHEAD(self), Py_off_t, Py_ssize_t) == 0)) { \ + if (IS_CLOSED(self) && (Py_SAFE_DOWNCAST(READAHEAD(self), Py_off_t, Py_ssize_t) == 0)) { \ PyErr_SetString(PyExc_ValueError, error_msg); \ return NULL; \ } \ diff --git a/Modules/_io/textio.c b/Modules/_io/textio.c index 3cbaca3ef46069..660396b8b03ead 100644 --- a/Modules/_io/textio.c +++ b/Modules/_io/textio.c @@ -19,9 +19,9 @@ /*[clinic input] module _io class _io.IncrementalNewlineDecoder "nldecoder_object *" "&PyIncrementalNewlineDecoder_Type" -class _io.TextIOWrapper "textio *" "&TextIOWrapper_TYpe" +class _io.TextIOWrapper "textio *" "&TextIOWrapper_Type" [clinic start generated code]*/ -/*[clinic end generated code: output=da39a3ee5e6b4b0d input=2097a4fc85670c26]*/ +/*[clinic end generated code: output=da39a3ee5e6b4b0d input=ed072384f8aada2c]*/ /* TextIOBase */ diff --git a/Modules/_operator.c b/Modules/_operator.c index 1af4a4feb7728a..51ca74eeeaee21 100644 --- a/Modules/_operator.c +++ b/Modules/_operator.c @@ -839,7 +839,7 @@ _operator__compare_digest_impl(PyObject *module, PyObject *a, PyObject *b) PyUnicode_GET_LENGTH(a), PyUnicode_GET_LENGTH(b)); } - /* fallback to buffer interface for bytes, bytesarray and other */ + /* fallback to buffer interface for bytes, bytearray and other */ else { Py_buffer view_a; Py_buffer view_b; @@ -1752,16 +1752,11 @@ methodcaller_reduce(methodcallerobject *mc, PyObject *Py_UNUSED(ignored)) return Py_BuildValue("ON", Py_TYPE(mc), newargs); } else { - PyObject *functools; PyObject *partial; PyObject *constructor; PyObject *newargs[2]; - functools = PyImport_ImportModule("functools"); - if (!functools) - return NULL; - partial = PyObject_GetAttr(functools, &_Py_ID(partial)); - Py_DECREF(functools); + partial = _PyImport_GetModuleAttrString("functools", "partial"); if (!partial) return NULL; diff --git a/Modules/_pickle.c b/Modules/_pickle.c index 0adfa4103cc0d0..1c5de30b07d620 100644 --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -232,8 +232,6 @@ _Pickle_InitState(PickleState *st) { PyObject *copyreg = NULL; PyObject *compat_pickle = NULL; - PyObject *codecs = NULL; - PyObject *functools = NULL; st->getattr = _PyEval_GetBuiltin(&_Py_ID(getattr)); if (st->getattr == NULL) @@ -329,10 +327,7 @@ _Pickle_InitState(PickleState *st) } Py_CLEAR(compat_pickle); - codecs = PyImport_ImportModule("codecs"); - if (codecs == NULL) - goto error; - st->codecs_encode = PyObject_GetAttrString(codecs, "encode"); + st->codecs_encode = _PyImport_GetModuleAttrString("codecs", "encode"); if (st->codecs_encode == NULL) { goto error; } @@ -342,23 +337,16 @@ _Pickle_InitState(PickleState *st) Py_TYPE(st->codecs_encode)->tp_name); goto error; } - 
Py_CLEAR(codecs); - functools = PyImport_ImportModule("functools"); - if (!functools) - goto error; - st->partial = PyObject_GetAttrString(functools, "partial"); + st->partial = _PyImport_GetModuleAttrString("functools", "partial"); if (!st->partial) goto error; - Py_CLEAR(functools); return 0; error: Py_CLEAR(copyreg); Py_CLEAR(compat_pickle); - Py_CLEAR(codecs); - Py_CLEAR(functools); _Pickle_ClearState(st); return -1; } @@ -3006,7 +2994,10 @@ batch_list_exact(PicklerObject *self, PyObject *obj) if (PyList_GET_SIZE(obj) == 1) { item = PyList_GET_ITEM(obj, 0); - if (save(self, item, 0) < 0) + Py_INCREF(item); + int err = save(self, item, 0); + Py_DECREF(item); + if (err < 0) return -1; if (_Pickler_Write(self, &append_op, 1) < 0) return -1; @@ -3021,7 +3012,10 @@ batch_list_exact(PicklerObject *self, PyObject *obj) return -1; while (total < PyList_GET_SIZE(obj)) { item = PyList_GET_ITEM(obj, total); - if (save(self, item, 0) < 0) + Py_INCREF(item); + int err = save(self, item, 0); + Py_DECREF(item); + if (err < 0) return -1; total++; if (++this_batch == BATCHSIZE) @@ -3259,10 +3253,16 @@ batch_dict_exact(PicklerObject *self, PyObject *obj) /* Special-case len(d) == 1 to save space. */ if (dict_size == 1) { PyDict_Next(obj, &ppos, &key, &value); - if (save(self, key, 0) < 0) - return -1; - if (save(self, value, 0) < 0) - return -1; + Py_INCREF(key); + Py_INCREF(value); + if (save(self, key, 0) < 0) { + goto error; + } + if (save(self, value, 0) < 0) { + goto error; + } + Py_CLEAR(key); + Py_CLEAR(value); if (_Pickler_Write(self, &setitem_op, 1) < 0) return -1; return 0; @@ -3274,10 +3274,16 @@ batch_dict_exact(PicklerObject *self, PyObject *obj) if (_Pickler_Write(self, &mark_op, 1) < 0) return -1; while (PyDict_Next(obj, &ppos, &key, &value)) { - if (save(self, key, 0) < 0) - return -1; - if (save(self, value, 0) < 0) - return -1; + Py_INCREF(key); + Py_INCREF(value); + if (save(self, key, 0) < 0) { + goto error; + } + if (save(self, value, 0) < 0) { + goto error; + } + Py_CLEAR(key); + Py_CLEAR(value); if (++i == BATCHSIZE) break; } @@ -3292,6 +3298,10 @@ batch_dict_exact(PicklerObject *self, PyObject *obj) } while (i == BATCHSIZE); return 0; +error: + Py_XDECREF(key); + Py_XDECREF(value); + return -1; } static int @@ -3409,7 +3419,10 @@ save_set(PicklerObject *self, PyObject *obj) if (_Pickler_Write(self, &mark_op, 1) < 0) return -1; while (_PySet_NextEntry(obj, &ppos, &item, &hash)) { - if (save(self, item, 0) < 0) + Py_INCREF(item); + int err = save(self, item, 0); + Py_CLEAR(item); + if (err < 0) return -1; if (++i == BATCHSIZE) break; diff --git a/Modules/_posixsubprocess.c b/Modules/_posixsubprocess.c index 9132f13e8166a7..44e60d7c14954d 100644 --- a/Modules/_posixsubprocess.c +++ b/Modules/_posixsubprocess.c @@ -612,8 +612,10 @@ child_exec(char *const exec_array[], #endif #ifdef HAVE_SETPGID - if (pgid_to_set >= 0) + static_assert(_Py_IS_TYPE_SIGNED(pid_t), "pid_t is unsigned"); + if (pgid_to_set >= 0) { POSIX_CALL(setpgid(0, pgid_to_set)); + } #endif #ifdef HAVE_SETGROUPS diff --git a/Modules/_sqlite/clinic/connection.c.h b/Modules/_sqlite/clinic/connection.c.h index 1e27c5e0afb126..62d31b787ad717 100644 --- a/Modules/_sqlite/clinic/connection.c.h +++ b/Modules/_sqlite/clinic/connection.c.h @@ -3,9 +3,9 @@ preserve [clinic start generated code]*/ static int -pysqlite_connection_init_impl(pysqlite_Connection *self, - const char *database, double timeout, - int detect_types, const char *isolation_level, +pysqlite_connection_init_impl(pysqlite_Connection *self, PyObject *database, + 
double timeout, int detect_types, + const char *isolation_level, int check_same_thread, PyObject *factory, int cache_size, int uri); @@ -19,7 +19,7 @@ pysqlite_connection_init(PyObject *self, PyObject *args, PyObject *kwargs) PyObject * const *fastargs; Py_ssize_t nargs = PyTuple_GET_SIZE(args); Py_ssize_t noptargs = nargs + (kwargs ? PyDict_GET_SIZE(kwargs) : 0) - 1; - const char *database = NULL; + PyObject *database; double timeout = 5.0; int detect_types = 0; const char *isolation_level = ""; @@ -32,9 +32,7 @@ pysqlite_connection_init(PyObject *self, PyObject *args, PyObject *kwargs) if (!fastargs) { goto exit; } - if (!clinic_fsconverter(fastargs[0], &database)) { - goto exit; - } + database = fastargs[0]; if (!noptargs) { goto skip_optional_pos; } @@ -102,9 +100,6 @@ pysqlite_connection_init(PyObject *self, PyObject *args, PyObject *kwargs) return_value = pysqlite_connection_init_impl((pysqlite_Connection *)self, database, timeout, detect_types, isolation_level, check_same_thread, factory, cache_size, uri); exit: - /* Cleanup for database */ - PyMem_Free((void *)database); - return return_value; } @@ -253,7 +248,9 @@ PyDoc_STRVAR(pysqlite_connection_close__doc__, "close($self, /)\n" "--\n" "\n" -"Closes the connection."); +"Close the database connection.\n" +"\n" +"Any pending transaction is not committed implicitly."); #define PYSQLITE_CONNECTION_CLOSE_METHODDEF \ {"close", (PyCFunction)pysqlite_connection_close, METH_NOARGS, pysqlite_connection_close__doc__}, @@ -271,7 +268,9 @@ PyDoc_STRVAR(pysqlite_connection_commit__doc__, "commit($self, /)\n" "--\n" "\n" -"Commit the current transaction."); +"Commit any pending transaction to the database.\n" +"\n" +"If there is no open transaction, this method is a no-op."); #define PYSQLITE_CONNECTION_COMMIT_METHODDEF \ {"commit", (PyCFunction)pysqlite_connection_commit, METH_NOARGS, pysqlite_connection_commit__doc__}, @@ -289,7 +288,9 @@ PyDoc_STRVAR(pysqlite_connection_rollback__doc__, "rollback($self, /)\n" "--\n" "\n" -"Roll back the current transaction."); +"Roll back to the start of any pending transaction.\n" +"\n" +"If there is no open transaction, this method is a no-op."); #define PYSQLITE_CONNECTION_ROLLBACK_METHODDEF \ {"rollback", (PyCFunction)pysqlite_connection_rollback, METH_NOARGS, pysqlite_connection_rollback__doc__}, @@ -1236,4 +1237,4 @@ getlimit(pysqlite_Connection *self, PyObject *arg) #ifndef DESERIALIZE_METHODDEF #define DESERIALIZE_METHODDEF #endif /* !defined(DESERIALIZE_METHODDEF) */ -/*[clinic end generated code: output=d21767843c480a10 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=8818c1c3ec9425aa input=a9049054013a1b77]*/ diff --git a/Modules/_sqlite/clinic/module.c.h b/Modules/_sqlite/clinic/module.c.h index 8f7008adef2b1d..ef0dd4d1c103ea 100644 --- a/Modules/_sqlite/clinic/module.c.h +++ b/Modules/_sqlite/clinic/module.c.h @@ -43,9 +43,7 @@ pysqlite_connect(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyOb if (!args) { goto exit; } - if (!PyUnicode_FSConverter(args[0], &database)) { - goto exit; - } + database = args[0]; if (!noptargs) { goto skip_optional_pos; } @@ -158,51 +156,11 @@ pysqlite_complete_statement(PyObject *module, PyObject *const *args, Py_ssize_t return return_value; } -PyDoc_STRVAR(pysqlite_enable_shared_cache__doc__, -"enable_shared_cache($module, /, do_enable)\n" -"--\n" -"\n" -"Enable or disable shared cache mode for the calling thread.\n" -"\n" -"This method is deprecated and will be removed in Python 3.12.\n" -"Shared cache is strongly discouraged by the 
SQLite 3 documentation.\n" -"If shared cache must be used, open the database in URI mode using\n" -"the cache=shared query parameter."); - -#define PYSQLITE_ENABLE_SHARED_CACHE_METHODDEF \ - {"enable_shared_cache", _PyCFunction_CAST(pysqlite_enable_shared_cache), METH_FASTCALL|METH_KEYWORDS, pysqlite_enable_shared_cache__doc__}, - -static PyObject * -pysqlite_enable_shared_cache_impl(PyObject *module, int do_enable); - -static PyObject * -pysqlite_enable_shared_cache(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObject *kwnames) -{ - PyObject *return_value = NULL; - static const char * const _keywords[] = {"do_enable", NULL}; - static _PyArg_Parser _parser = {NULL, _keywords, "enable_shared_cache", 0}; - PyObject *argsbuf[1]; - int do_enable; - - args = _PyArg_UnpackKeywords(args, nargs, NULL, kwnames, &_parser, 1, 1, 0, argsbuf); - if (!args) { - goto exit; - } - do_enable = _PyLong_AsInt(args[0]); - if (do_enable == -1 && PyErr_Occurred()) { - goto exit; - } - return_value = pysqlite_enable_shared_cache_impl(module, do_enable); - -exit: - return return_value; -} - PyDoc_STRVAR(pysqlite_register_adapter__doc__, -"register_adapter($module, type, caster, /)\n" +"register_adapter($module, type, adapter, /)\n" "--\n" "\n" -"Registers an adapter with sqlite3\'s adapter registry."); +"Register a function to adapt Python objects to SQLite values."); #define PYSQLITE_REGISTER_ADAPTER_METHODDEF \ {"register_adapter", _PyCFunction_CAST(pysqlite_register_adapter), METH_FASTCALL, pysqlite_register_adapter__doc__}, @@ -230,10 +188,10 @@ pysqlite_register_adapter(PyObject *module, PyObject *const *args, Py_ssize_t na } PyDoc_STRVAR(pysqlite_register_converter__doc__, -"register_converter($module, name, converter, /)\n" +"register_converter($module, typename, converter, /)\n" "--\n" "\n" -"Registers a converter with sqlite3."); +"Register a function to convert SQLite values to Python objects."); #define PYSQLITE_REGISTER_CONVERTER_METHODDEF \ {"register_converter", _PyCFunction_CAST(pysqlite_register_converter), METH_FASTCALL, pysqlite_register_converter__doc__}, @@ -334,4 +292,4 @@ pysqlite_adapt(PyObject *module, PyObject *const *args, Py_ssize_t nargs) exit: return return_value; } -/*[clinic end generated code: output=d846459943008a9c input=a9049054013a1b77]*/ +/*[clinic end generated code: output=9ac18606b0eaec03 input=a9049054013a1b77]*/ diff --git a/Modules/_sqlite/connection.c b/Modules/_sqlite/connection.c index 333847f0fafb96..b6b7417fc0a6c1 100644 --- a/Modules/_sqlite/connection.c +++ b/Modules/_sqlite/connection.c @@ -92,32 +92,6 @@ isolation_level_converter(PyObject *str_or_none, const char **result) return 1; } -static int -clinic_fsconverter(PyObject *pathlike, const char **result) -{ - PyObject *bytes = NULL; - Py_ssize_t len; - char *str; - - if (!PyUnicode_FSConverter(pathlike, &bytes)) { - goto error; - } - if (PyBytes_AsStringAndSize(bytes, &str, &len) < 0) { - goto error; - } - if ((*result = (const char *)PyMem_Malloc(len+1)) == NULL) { - goto error; - } - - memcpy((void *)(*result), str, len+1); - Py_DECREF(bytes); - return 1; - -error: - Py_XDECREF(bytes); - return 0; -} - #define clinic_state() (pysqlite_get_state_by_type(Py_TYPE(self))) #include "clinic/connection.c.h" #undef clinic_state @@ -159,25 +133,17 @@ new_statement_cache(pysqlite_Connection *self, pysqlite_state *state, } /*[python input] -class FSConverter_converter(CConverter): - type = "const char *" - converter = "clinic_fsconverter" - def converter_init(self): - self.c_default = "NULL" - def 
cleanup(self): - return f"PyMem_Free((void *){self.name});\n" - class IsolationLevel_converter(CConverter): type = "const char *" converter = "isolation_level_converter" [python start generated code]*/ -/*[python end generated code: output=da39a3ee5e6b4b0d input=be142323885672ab]*/ +/*[python end generated code: output=da39a3ee5e6b4b0d input=cbcfe85b253061c2]*/ /*[clinic input] _sqlite3.Connection.__init__ as pysqlite_connection_init - database: FSConverter + database: object timeout: double = 5.0 detect_types: int = 0 isolation_level: IsolationLevel = "" @@ -188,14 +154,19 @@ _sqlite3.Connection.__init__ as pysqlite_connection_init [clinic start generated code]*/ static int -pysqlite_connection_init_impl(pysqlite_Connection *self, - const char *database, double timeout, - int detect_types, const char *isolation_level, +pysqlite_connection_init_impl(pysqlite_Connection *self, PyObject *database, + double timeout, int detect_types, + const char *isolation_level, int check_same_thread, PyObject *factory, int cache_size, int uri) -/*[clinic end generated code: output=7d640ae1d83abfd4 input=342173993434ba1e]*/ +/*[clinic end generated code: output=839eb2fee4293bda input=b8ce63dc6f70a383]*/ { - if (PySys_Audit("sqlite3.connect", "s", database) < 0) { + if (PySys_Audit("sqlite3.connect", "O", database) < 0) { + return -1; + } + + PyObject *bytes; + if (!PyUnicode_FSConverter(database, &bytes)) { return -1; } @@ -210,7 +181,7 @@ pysqlite_connection_init_impl(pysqlite_Connection *self, sqlite3 *db; int rc; Py_BEGIN_ALLOW_THREADS - rc = sqlite3_open_v2(database, &db, + rc = sqlite3_open_v2(PyBytes_AS_STRING(bytes), &db, SQLITE_OPEN_READWRITE | SQLITE_OPEN_CREATE | (uri ? SQLITE_OPEN_URI : 0), NULL); if (rc == SQLITE_OK) { @@ -218,6 +189,7 @@ pysqlite_connection_init_impl(pysqlite_Connection *self, } Py_END_ALLOW_THREADS + Py_DECREF(bytes); if (db == NULL && rc == SQLITE_NOMEM) { PyErr_NoMemory(); return -1; @@ -498,12 +470,14 @@ blobopen_impl(pysqlite_Connection *self, const char *table, const char *col, /*[clinic input] _sqlite3.Connection.close as pysqlite_connection_close -Closes the connection. +Close the database connection. + +Any pending transaction is not committed implicitly. [clinic start generated code]*/ static PyObject * pysqlite_connection_close_impl(pysqlite_Connection *self) -/*[clinic end generated code: output=a546a0da212c9b97 input=3d58064bbffaa3d3]*/ +/*[clinic end generated code: output=a546a0da212c9b97 input=b3ed5b74f6fefc06]*/ { if (!pysqlite_check_thread(self)) { return NULL; @@ -550,12 +524,14 @@ int pysqlite_check_connection(pysqlite_Connection* con) /*[clinic input] _sqlite3.Connection.commit as pysqlite_connection_commit -Commit the current transaction. +Commit any pending transaction to the database. + +If there is no open transaction, this method is a no-op. [clinic start generated code]*/ static PyObject * pysqlite_connection_commit_impl(pysqlite_Connection *self) -/*[clinic end generated code: output=3da45579e89407f2 input=39c12c04dda276a8]*/ +/*[clinic end generated code: output=3da45579e89407f2 input=c8793c97c3446065]*/ { if (!pysqlite_check_thread(self) || !pysqlite_check_connection(self)) { return NULL; @@ -585,12 +561,14 @@ pysqlite_connection_commit_impl(pysqlite_Connection *self) /*[clinic input] _sqlite3.Connection.rollback as pysqlite_connection_rollback -Roll back the current transaction. +Roll back to the start of any pending transaction. + +If there is no open transaction, this method is a no-op. 
[clinic start generated code]*/ static PyObject * pysqlite_connection_rollback_impl(pysqlite_Connection *self) -/*[clinic end generated code: output=b66fa0d43e7ef305 input=12d4e8d068942830]*/ +/*[clinic end generated code: output=b66fa0d43e7ef305 input=7f60a2f1076f16b3]*/ { if (!pysqlite_check_thread(self) || !pysqlite_check_connection(self)) { return NULL; @@ -1610,9 +1588,8 @@ static PyObject* pysqlite_connection_get_total_changes(pysqlite_Connection* self { if (!pysqlite_check_connection(self)) { return NULL; - } else { - return Py_BuildValue("i", sqlite3_total_changes(self->db)); } + return PyLong_FromLong(sqlite3_total_changes(self->db)); } static PyObject* pysqlite_connection_get_in_transaction(pysqlite_Connection* self, void* unused) @@ -1869,43 +1846,21 @@ static PyObject * pysqlite_connection_iterdump_impl(pysqlite_Connection *self) /*[clinic end generated code: output=586997aaf9808768 input=1911ca756066da89]*/ { - PyObject* retval = NULL; - PyObject* module = NULL; - PyObject* module_dict; - PyObject* pyfn_iterdump; - if (!pysqlite_check_connection(self)) { - goto finally; - } - - module = PyImport_ImportModule(MODULE_NAME ".dump"); - if (!module) { - goto finally; - } - - module_dict = PyModule_GetDict(module); - if (!module_dict) { - goto finally; + return NULL; } - PyObject *meth = PyUnicode_InternFromString("_iterdump"); - if (meth == NULL) { - goto finally; - } - pyfn_iterdump = PyDict_GetItemWithError(module_dict, meth); - Py_DECREF(meth); - if (!pyfn_iterdump) { + PyObject *iterdump = _PyImport_GetModuleAttrString(MODULE_NAME ".dump", "_iterdump"); + if (!iterdump) { if (!PyErr_Occurred()) { PyErr_SetString(self->OperationalError, "Failed to obtain _iterdump() reference"); } - goto finally; + return NULL; } - retval = PyObject_CallOneArg(pyfn_iterdump, (PyObject *)self); - -finally: - Py_XDECREF(module); + PyObject *retval = PyObject_CallOneArg(iterdump, (PyObject *)self); + Py_DECREF(iterdump); return retval; } @@ -2119,7 +2074,7 @@ serialize_impl(pysqlite_Connection *self, const char *name) name); return NULL; } - PyObject *res = PyBytes_FromStringAndSize(data, size); + PyObject *res = PyBytes_FromStringAndSize(data, (Py_ssize_t)size); if (!(flags & SQLITE_SERIALIZE_NOCOPY)) { sqlite3_free((void *)data); } diff --git a/Modules/_sqlite/cursor.c b/Modules/_sqlite/cursor.c index c58def5f0362f1..8d9f20da530b23 100644 --- a/Modules/_sqlite/cursor.c +++ b/Modules/_sqlite/cursor.c @@ -830,17 +830,12 @@ _pysqlite_query_execute(pysqlite_Cursor* self, int multiple, PyObject* operation } } - if (self->statement != NULL) { - /* There is an active statement */ - stmt_reset(self->statement); - } - - /* reset description and rowcount */ + /* reset description */ Py_INCREF(Py_None); Py_SETREF(self->description, Py_None); - self->rowcount = 0L; if (self->statement) { + // Reset pending statements on this cursor. (void)stmt_reset(self->statement); } @@ -867,6 +862,7 @@ _pysqlite_query_execute(pysqlite_Cursor* self, int multiple, PyObject* operation stmt_reset(self->statement); stmt_mark_dirty(self->statement); + self->rowcount = self->statement->is_dml ? 0L : -1L; /* We start a transaction implicitly before a DML statement. SELECT is the only exception. See #9924. 
*/ @@ -879,6 +875,7 @@ _pysqlite_query_execute(pysqlite_Cursor* self, int multiple, PyObject* operation } } + assert(!sqlite3_stmt_busy(self->statement->st)); while (1) { parameters = PyIter_Next(parameters_iter); if (!parameters) { @@ -902,7 +899,6 @@ _pysqlite_query_execute(pysqlite_Cursor* self, int multiple, PyObject* operation PyErr_Clear(); } } - (void)stmt_reset(self->statement); _pysqlite_seterror(state, self->connection->db); goto error; } @@ -944,18 +940,10 @@ _pysqlite_query_execute(pysqlite_Cursor* self, int multiple, PyObject* operation } } - if (self->statement->is_dml) { - self->rowcount += (long)sqlite3_changes(self->connection->db); - } else { - self->rowcount= -1L; - } - - if (rc == SQLITE_DONE && !multiple) { - stmt_reset(self->statement); - Py_CLEAR(self->statement); - } - - if (multiple) { + if (rc == SQLITE_DONE) { + if (self->statement->is_dml) { + self->rowcount += (long)sqlite3_changes(self->connection->db); + } stmt_reset(self->statement); } Py_XDECREF(parameters); @@ -980,11 +968,17 @@ _pysqlite_query_execute(pysqlite_Cursor* self, int multiple, PyObject* operation self->locked = 0; if (PyErr_Occurred()) { + if (self->statement) { + (void)stmt_reset(self->statement); + Py_CLEAR(self->statement); + } self->rowcount = -1L; return NULL; - } else { - return Py_NewRef((PyObject *)self); } + if (self->statement && !sqlite3_stmt_busy(self->statement->st)) { + Py_CLEAR(self->statement); + } + return Py_NewRef((PyObject *)self); } /*[clinic input] @@ -1111,11 +1105,7 @@ pysqlite_cursor_iternext(pysqlite_Cursor *self) sqlite3_stmt *stmt = self->statement->st; assert(stmt != NULL); - if (sqlite3_data_count(stmt) == 0) { - (void)stmt_reset(self->statement); - Py_CLEAR(self->statement); - return NULL; - } + assert(sqlite3_data_count(stmt) != 0); self->locked = 1; // GH-80254: Prevent recursive use of cursors. 
PyObject *row = _pysqlite_fetch_one_row(self); @@ -1125,11 +1115,17 @@ pysqlite_cursor_iternext(pysqlite_Cursor *self) } int rc = stmt_step(stmt); if (rc == SQLITE_DONE) { + if (self->statement->is_dml) { + self->rowcount = (long)sqlite3_changes(self->connection->db); + } (void)stmt_reset(self->statement); + Py_CLEAR(self->statement); } else if (rc != SQLITE_ROW) { (void)_pysqlite_seterror(self->connection->state, self->connection->db); + (void)stmt_reset(self->statement); + Py_CLEAR(self->statement); Py_DECREF(row); return NULL; } diff --git a/Modules/_sqlite/microprotocols.c b/Modules/_sqlite/microprotocols.c index a79f0067b17e8c..148220d0f91f96 100644 --- a/Modules/_sqlite/microprotocols.c +++ b/Modules/_sqlite/microprotocols.c @@ -57,7 +57,7 @@ pysqlite_microprotocols_add(pysqlite_state *state, PyTypeObject *type, assert(type != NULL); assert(proto != NULL); - key = Py_BuildValue("(OO)", (PyObject*)type, proto); + key = PyTuple_Pack(2, (PyObject *)type, proto); if (!key) { return -1; } @@ -81,7 +81,7 @@ pysqlite_microprotocols_adapt(pysqlite_state *state, PyObject *obj, way to get a quotable object to be its instance */ /* look for an adapter in the registry */ - key = Py_BuildValue("(OO)", (PyObject*)Py_TYPE(obj), proto); + key = PyTuple_Pack(2, (PyObject *)Py_TYPE(obj), proto); if (!key) { return NULL; } diff --git a/Modules/_sqlite/module.c b/Modules/_sqlite/module.c index fbc57c7cc739ee..eca25b94a4817d 100644 --- a/Modules/_sqlite/module.c +++ b/Modules/_sqlite/module.c @@ -46,7 +46,7 @@ module _sqlite3 /*[clinic input] _sqlite3.connect as pysqlite_connect - database: object(converter='PyUnicode_FSConverter') + database: object timeout: double = 5.0 detect_types: int = 0 isolation_level: object = NULL @@ -66,7 +66,7 @@ pysqlite_connect_impl(PyObject *module, PyObject *database, double timeout, int detect_types, PyObject *isolation_level, int check_same_thread, PyObject *factory, int cached_statements, int uri) -/*[clinic end generated code: output=450ac9078b4868bb input=ea6355ba55a78e12]*/ +/*[clinic end generated code: output=450ac9078b4868bb input=e16914663ddf93ce]*/ { if (isolation_level == NULL) { isolation_level = PyUnicode_FromString(""); @@ -81,7 +81,6 @@ pysqlite_connect_impl(PyObject *module, PyObject *database, double timeout, timeout, detect_types, isolation_level, check_same_thread, factory, cached_statements, uri); - Py_DECREF(database); // needed bco. the AC FSConverter Py_DECREF(isolation_level); return res; } @@ -105,50 +104,20 @@ pysqlite_complete_statement_impl(PyObject *module, const char *statement) } } -/*[clinic input] -_sqlite3.enable_shared_cache as pysqlite_enable_shared_cache - - do_enable: int - -Enable or disable shared cache mode for the calling thread. - -This method is deprecated and will be removed in Python 3.12. -Shared cache is strongly discouraged by the SQLite 3 documentation. -If shared cache must be used, open the database in URI mode using -the cache=shared query parameter. 
-[clinic start generated code]*/ - -static PyObject * -pysqlite_enable_shared_cache_impl(PyObject *module, int do_enable) -/*[clinic end generated code: output=259c74eedee1516b input=26e40d5971d3487d]*/ -{ - int rc; - - rc = sqlite3_enable_shared_cache(do_enable); - - if (rc != SQLITE_OK) { - pysqlite_state *state = pysqlite_get_state(module); - PyErr_SetString(state->OperationalError, "Changing the shared_cache flag failed"); - return NULL; - } else { - Py_RETURN_NONE; - } -} - /*[clinic input] _sqlite3.register_adapter as pysqlite_register_adapter type: object(type='PyTypeObject *') - caster: object + adapter as caster: object / -Registers an adapter with sqlite3's adapter registry. +Register a function to adapt Python objects to SQLite values. [clinic start generated code]*/ static PyObject * pysqlite_register_adapter_impl(PyObject *module, PyTypeObject *type, PyObject *caster) -/*[clinic end generated code: output=a287e8db18e8af23 input=b4bd87afcadc535d]*/ +/*[clinic end generated code: output=a287e8db18e8af23 input=29a5e0f213030242]*/ { int rc; @@ -173,17 +142,17 @@ pysqlite_register_adapter_impl(PyObject *module, PyTypeObject *type, /*[clinic input] _sqlite3.register_converter as pysqlite_register_converter - name as orig_name: unicode + typename as orig_name: unicode converter as callable: object / -Registers a converter with sqlite3. +Register a function to convert SQLite values to Python objects. [clinic start generated code]*/ static PyObject * pysqlite_register_converter_impl(PyObject *module, PyObject *orig_name, PyObject *callable) -/*[clinic end generated code: output=a2f2bfeed7230062 input=90f645419425d6c4]*/ +/*[clinic end generated code: output=a2f2bfeed7230062 input=159a444971b40378]*/ { PyObject* name = NULL; PyObject* retval = NULL; @@ -258,14 +227,8 @@ static int converters_init(PyObject* module) static int load_functools_lru_cache(PyObject *module) { - PyObject *functools = PyImport_ImportModule("functools"); - if (functools == NULL) { - return -1; - } - pysqlite_state *state = pysqlite_get_state(module); - state->lru_cache = PyObject_GetAttrString(functools, "lru_cache"); - Py_DECREF(functools); + state->lru_cache = _PyImport_GetModuleAttrString("functools", "lru_cache"); if (state->lru_cache == NULL) { return -1; } @@ -277,7 +240,6 @@ static PyMethodDef module_methods[] = { PYSQLITE_COMPLETE_STATEMENT_METHODDEF PYSQLITE_CONNECT_METHODDEF PYSQLITE_ENABLE_CALLBACK_TRACE_METHODDEF - PYSQLITE_ENABLE_SHARED_CACHE_METHODDEF PYSQLITE_REGISTER_ADAPTER_METHODDEF PYSQLITE_REGISTER_CONVERTER_METHODDEF {NULL, NULL} @@ -739,7 +701,7 @@ module_exec(PyObject *module) goto error; } - if (PyModule_AddStringConstant(module, "version", PYSQLITE_VERSION) < 0) { + if (PyModule_AddStringConstant(module, "_deprecated_version", PYSQLITE_VERSION) < 0) { goto error; } diff --git a/Modules/_sqlite/statement.c b/Modules/_sqlite/statement.c index f9cb70f0ef146c..aee460747b45f4 100644 --- a/Modules/_sqlite/statement.c +++ b/Modules/_sqlite/statement.c @@ -26,16 +26,7 @@ #include "util.h" /* prototypes */ -static int pysqlite_check_remaining_sql(const char* tail); - -typedef enum { - LINECOMMENT_1, - IN_LINECOMMENT, - COMMENTSTART_1, - IN_COMMENT, - COMMENTEND_1, - NORMAL -} parse_remaining_sql_state; +static const char *lstrip_sql(const char *sql); pysqlite_Statement * pysqlite_statement_create(pysqlite_Connection *connection, PyObject *sql) @@ -73,7 +64,7 @@ pysqlite_statement_create(pysqlite_Connection *connection, PyObject *sql) return NULL; } - if (pysqlite_check_remaining_sql(tail)) { + if 
(lstrip_sql(tail) != NULL) { PyErr_SetString(connection->ProgrammingError, "You can only execute one statement at a time."); goto error; @@ -82,20 +73,12 @@ pysqlite_statement_create(pysqlite_Connection *connection, PyObject *sql) /* Determine if the statement is a DML statement. SELECT is the only exception. See #9924. */ int is_dml = 0; - for (const char *p = sql_cstr; *p != 0; p++) { - switch (*p) { - case ' ': - case '\r': - case '\n': - case '\t': - continue; - } - + const char *p = lstrip_sql(sql_cstr); + if (p != NULL) { is_dml = (PyOS_strnicmp(p, "insert", 6) == 0) || (PyOS_strnicmp(p, "update", 6) == 0) || (PyOS_strnicmp(p, "delete", 6) == 0) || (PyOS_strnicmp(p, "replace", 7) == 0); - break; } pysqlite_Statement *self = PyObject_GC_New(pysqlite_Statement, @@ -139,73 +122,61 @@ stmt_traverse(pysqlite_Statement *self, visitproc visit, void *arg) } /* - * Checks if there is anything left in an SQL string after SQLite compiled it. - * This is used to check if somebody tried to execute more than one SQL command - * with one execute()/executemany() command, which the DB-API and we don't - * allow. + * Strip leading whitespace and comments from incoming SQL (null terminated C + * string) and return a pointer to the first non-whitespace, non-comment + * character. * - * Returns 1 if there is more left than should be. 0 if ok. + * This is used to check if somebody tries to execute more than one SQL query + * with one execute()/executemany() command, which the DB-API don't allow. + * + * It is also used to harden DML query detection. */ -static int pysqlite_check_remaining_sql(const char* tail) +static inline const char * +lstrip_sql(const char *sql) { - const char* pos = tail; - - parse_remaining_sql_state state = NORMAL; - - for (;;) { + // This loop is borrowed from the SQLite source code. + for (const char *pos = sql; *pos; pos++) { switch (*pos) { - case 0: - return 0; - case '-': - if (state == NORMAL) { - state = LINECOMMENT_1; - } else if (state == LINECOMMENT_1) { - state = IN_LINECOMMENT; - } - break; case ' ': case '\t': - break; + case '\f': case '\n': - case 13: - if (state == IN_LINECOMMENT) { - state = NORMAL; - } + case '\r': + // Skip whitespace. break; - case '/': - if (state == NORMAL) { - state = COMMENTSTART_1; - } else if (state == COMMENTEND_1) { - state = NORMAL; - } else if (state == COMMENTSTART_1) { - return 1; + case '-': + // Skip line comments. + if (pos[1] == '-') { + pos += 2; + while (pos[0] && pos[0] != '\n') { + pos++; + } + if (pos[0] == '\0') { + return NULL; + } + continue; } - break; - case '*': - if (state == NORMAL) { - return 1; - } else if (state == LINECOMMENT_1) { - return 1; - } else if (state == COMMENTSTART_1) { - state = IN_COMMENT; - } else if (state == IN_COMMENT) { - state = COMMENTEND_1; + return pos; + case '/': + // Skip C style comments. 
+ if (pos[1] == '*') { + pos += 2; + while (pos[0] && (pos[0] != '*' || pos[1] != '/')) { + pos++; + } + if (pos[0] == '\0') { + return NULL; + } + pos++; + continue; } - break; + return pos; default: - if (state == COMMENTEND_1) { - state = IN_COMMENT; - } else if (state == IN_LINECOMMENT) { - } else if (state == IN_COMMENT) { - } else { - return 1; - } + return pos; } - - pos++; } - return 0; + return NULL; } static PyType_Slot stmt_slots[] = { diff --git a/Modules/_sre/clinic/sre.c.h b/Modules/_sre/clinic/sre.c.h index e243c756e1f971..048a494f1bc7c6 100644 --- a/Modules/_sre/clinic/sre.c.h +++ b/Modules/_sre/clinic/sre.c.h @@ -764,7 +764,7 @@ PyDoc_STRVAR(_sre_SRE_Pattern___deepcopy____doc__, PyDoc_STRVAR(_sre_compile__doc__, "compile($module, /, pattern, flags, code, groups, groupindex,\n" -" indexgroup, repeat_count)\n" +" indexgroup)\n" "--\n" "\n"); @@ -774,24 +774,23 @@ PyDoc_STRVAR(_sre_compile__doc__, static PyObject * _sre_compile_impl(PyObject *module, PyObject *pattern, int flags, PyObject *code, Py_ssize_t groups, PyObject *groupindex, - PyObject *indexgroup, Py_ssize_t repeat_count); + PyObject *indexgroup); static PyObject * _sre_compile(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObject *kwnames) { PyObject *return_value = NULL; - static const char * const _keywords[] = {"pattern", "flags", "code", "groups", "groupindex", "indexgroup", "repeat_count", NULL}; + static const char * const _keywords[] = {"pattern", "flags", "code", "groups", "groupindex", "indexgroup", NULL}; static _PyArg_Parser _parser = {NULL, _keywords, "compile", 0}; - PyObject *argsbuf[7]; + PyObject *argsbuf[6]; PyObject *pattern; int flags; PyObject *code; Py_ssize_t groups; PyObject *groupindex; PyObject *indexgroup; - Py_ssize_t repeat_count; - args = _PyArg_UnpackKeywords(args, nargs, NULL, kwnames, &_parser, 7, 7, 0, argsbuf); + args = _PyArg_UnpackKeywords(args, nargs, NULL, kwnames, &_parser, 6, 6, 0, argsbuf); if (!args) { goto exit; } @@ -827,19 +826,7 @@ _sre_compile(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObject goto exit; } indexgroup = args[5]; - { - Py_ssize_t ival = -1; - PyObject *iobj = _PyNumber_Index(args[6]); - if (iobj != NULL) { - ival = PyLong_AsSsize_t(iobj); - Py_DECREF(iobj); - } - if (ival == -1 && PyErr_Occurred()) { - goto exit; - } - repeat_count = ival; - } - return_value = _sre_compile_impl(module, pattern, flags, code, groups, groupindex, indexgroup, repeat_count); + return_value = _sre_compile_impl(module, pattern, flags, code, groups, groupindex, indexgroup); exit: return return_value; @@ -1129,4 +1116,4 @@ _sre_SRE_Scanner_search(ScannerObject *self, PyTypeObject *cls, PyObject *const } return _sre_SRE_Scanner_search_impl(self, cls); } -/*[clinic end generated code: output=97e7ce058366760b input=a9049054013a1b77]*/ +/*[clinic end generated code: output=fd2f45c941620e6e input=a9049054013a1b77]*/ diff --git a/Modules/_sre/sre.c b/Modules/_sre/sre.c index bd9204da428af8..bcb30848d9a592 100644 --- a/Modules/_sre/sre.c +++ b/Modules/_sre/sre.c @@ -427,12 +427,6 @@ state_init(SRE_STATE* state, PatternObject* pattern, PyObject* string, state->lastmark = -1; state->lastindex = -1; - state->repeats_array = PyMem_New(SRE_REPEAT, pattern->repeat_count); - if (!state->repeats_array) { - PyErr_NoMemory(); - goto err; - } - state->buffer.buf = NULL; ptr = getstring(string, &length, &isbytes, &charsize, &state->buffer); if (!ptr) @@ -482,9 +476,6 @@ state_init(SRE_STATE* state, PatternObject* pattern, PyObject* string, safely casted to `void*`, 
see bpo-39943 for details. */ PyMem_Free((void*) state->mark); state->mark = NULL; - PyMem_Free(state->repeats_array); - state->repeats_array = NULL; - if (state->buffer.buf) PyBuffer_Release(&state->buffer); return NULL; @@ -500,8 +491,6 @@ state_fini(SRE_STATE* state) /* See above PyMem_Del for why we explicitly cast here. */ PyMem_Free((void*) state->mark); state->mark = NULL; - PyMem_Free(state->repeats_array); - state->repeats_array = NULL; } /* calculate offset from start of string */ @@ -771,22 +760,12 @@ _sre_SRE_Pattern_search_impl(PatternObject *self, PyTypeObject *cls, static PyObject* call(const char* module, const char* function, PyObject* args) { - PyObject* name; - PyObject* mod; PyObject* func; PyObject* result; if (!args) return NULL; - name = PyUnicode_FromString(module); - if (!name) - return NULL; - mod = PyImport_Import(name); - Py_DECREF(name); - if (!mod) - return NULL; - func = PyObject_GetAttrString(mod, function); - Py_DECREF(mod); + func = _PyImport_GetModuleAttrString(module, function); if (!func) return NULL; result = PyObject_CallObject(func, args); @@ -1323,6 +1302,7 @@ pattern_repr(PatternObject *obj) const char *name; int value; } flag_names[] = { + {"re.TEMPLATE", SRE_FLAG_TEMPLATE}, {"re.IGNORECASE", SRE_FLAG_IGNORECASE}, {"re.LOCALE", SRE_FLAG_LOCALE}, {"re.MULTILINE", SRE_FLAG_MULTILINE}, @@ -1417,15 +1397,14 @@ _sre.compile groups: Py_ssize_t groupindex: object(subclass_of='&PyDict_Type') indexgroup: object(subclass_of='&PyTuple_Type') - repeat_count: Py_ssize_t [clinic start generated code]*/ static PyObject * _sre_compile_impl(PyObject *module, PyObject *pattern, int flags, PyObject *code, Py_ssize_t groups, PyObject *groupindex, - PyObject *indexgroup, Py_ssize_t repeat_count) -/*[clinic end generated code: output=922af562d51b1657 input=77e39c322501ec2a]*/ + PyObject *indexgroup) +/*[clinic end generated code: output=ef9c2b3693776404 input=0a68476dbbe5db30]*/ { /* "compile" pattern descriptor to pattern object */ @@ -1483,8 +1462,8 @@ _sre_compile_impl(PyObject *module, PyObject *pattern, int flags, self->pattern = pattern; self->flags = flags; + self->groups = groups; - self->repeat_count = repeat_count; if (PyDict_GET_SIZE(groupindex) > 0) { Py_INCREF(groupindex); @@ -1656,7 +1635,7 @@ _validate_charset(SRE_CODE *code, SRE_CODE *end) } static int -_validate_inner(SRE_CODE *code, SRE_CODE *end, PatternObject *self) +_validate_inner(SRE_CODE *code, SRE_CODE *end, Py_ssize_t groups) { /* Some variables are manipulated by the macros above */ SRE_CODE op; @@ -1677,8 +1656,8 @@ _validate_inner(SRE_CODE *code, SRE_CODE *end, PatternObject *self) sre_match() code is robust even if they don't, and the worst you can get is nonsensical match results. 
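The call() helper above (and later hunks in faulthandler, _zoneinfo, arraymodule and cjkcodecs) collapses the import-then-getattr dance into the internal helper _PyImport_GetModuleAttrString(). A sketch of the two spellings; the helper is CPython-internal, so the old shape is what code outside the core would still write:

#include <Python.h>

/* Old shape: import the module, fetch the attribute, drop the module. */
static PyObject *
get_module_attr(const char *module, const char *name)
{
    PyObject *mod = PyImport_ImportModule(module);
    if (mod == NULL) {
        return NULL;
    }
    PyObject *attr = PyObject_GetAttrString(mod, name);
    Py_DECREF(mod);
    return attr;
}

/* New shape used throughout this patch (internal API; same strong
 * reference and error behaviour, one call):
 *     PyObject *attr = _PyImport_GetModuleAttrString(module, name);
 */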
*/ GET_ARG; - if (arg > 2 * (size_t)self->groups + 1) { - VTRACE(("arg=%d, groups=%d\n", (int)arg, (int)self->groups)); + if (arg > 2 * (size_t)groups + 1) { + VTRACE(("arg=%d, groups=%d\n", (int)arg, (int)groups)); FAIL; } break; @@ -1807,7 +1786,7 @@ _validate_inner(SRE_CODE *code, SRE_CODE *end, PatternObject *self) if (skip == 0) break; /* Stop 2 before the end; we check the JUMP below */ - if (!_validate_inner(code, code+skip-3, self)) + if (!_validate_inner(code, code+skip-3, groups)) FAIL; code += skip-3; /* Check that it ends with a JUMP, and that each JUMP @@ -1836,7 +1815,7 @@ _validate_inner(SRE_CODE *code, SRE_CODE *end, PatternObject *self) FAIL; if (max > SRE_MAXREPEAT) FAIL; - if (!_validate_inner(code, code+skip-4, self)) + if (!_validate_inner(code, code+skip-4, groups)) FAIL; code += skip-4; GET_OP; @@ -1848,7 +1827,7 @@ _validate_inner(SRE_CODE *code, SRE_CODE *end, PatternObject *self) case SRE_OP_REPEAT: case SRE_OP_POSSESSIVE_REPEAT: { - SRE_CODE op1 = op, min, max, repeat_index; + SRE_CODE op1 = op, min, max; GET_SKIP; GET_ARG; min = arg; GET_ARG; max = arg; @@ -1856,17 +1835,9 @@ _validate_inner(SRE_CODE *code, SRE_CODE *end, PatternObject *self) FAIL; if (max > SRE_MAXREPEAT) FAIL; - if (op1 == SRE_OP_REPEAT) { - GET_ARG; repeat_index = arg; - if (repeat_index >= (size_t)self->repeat_count) - FAIL; - skip -= 4; - } else { - skip -= 3; - } - if (!_validate_inner(code, code+skip, self)) + if (!_validate_inner(code, code+skip-3, groups)) FAIL; - code += skip; + code += skip-3; GET_OP; if (op1 == SRE_OP_POSSESSIVE_REPEAT) { if (op != SRE_OP_SUCCESS) @@ -1882,7 +1853,7 @@ _validate_inner(SRE_CODE *code, SRE_CODE *end, PatternObject *self) case SRE_OP_ATOMIC_GROUP: { GET_SKIP; - if (!_validate_inner(code, code+skip-2, self)) + if (!_validate_inner(code, code+skip-2, groups)) FAIL; code += skip-2; GET_OP; @@ -1896,7 +1867,7 @@ _validate_inner(SRE_CODE *code, SRE_CODE *end, PatternObject *self) case SRE_OP_GROUPREF_UNI_IGNORE: case SRE_OP_GROUPREF_LOC_IGNORE: GET_ARG; - if (arg >= (size_t)self->groups) + if (arg >= (size_t)groups) FAIL; break; @@ -1905,7 +1876,7 @@ _validate_inner(SRE_CODE *code, SRE_CODE *end, PatternObject *self) 'group' is either an integer group number or a group name, 'then' and 'else' are sub-regexes, and 'else' is optional. */ GET_ARG; - if (arg >= (size_t)self->groups) + if (arg >= (size_t)groups) FAIL; GET_SKIP_ADJ(1); code--; /* The skip is relative to the first arg! 
*/ @@ -1938,17 +1909,17 @@ _validate_inner(SRE_CODE *code, SRE_CODE *end, PatternObject *self) code[skip-3] == SRE_OP_JUMP) { VTRACE(("both then and else parts present\n")); - if (!_validate_inner(code+1, code+skip-3, self)) + if (!_validate_inner(code+1, code+skip-3, groups)) FAIL; code += skip-2; /* Position after JUMP, at */ GET_SKIP; - if (!_validate_inner(code, code+skip-1, self)) + if (!_validate_inner(code, code+skip-1, groups)) FAIL; code += skip-1; } else { VTRACE(("only a then part present\n")); - if (!_validate_inner(code+1, code+skip-1, self)) + if (!_validate_inner(code+1, code+skip-1, groups)) FAIL; code += skip-1; } @@ -1962,7 +1933,7 @@ _validate_inner(SRE_CODE *code, SRE_CODE *end, PatternObject *self) if (arg & 0x80000000) FAIL; /* Width too large */ /* Stop 1 before the end; we check the SUCCESS below */ - if (!_validate_inner(code+1, code+skip-2, self)) + if (!_validate_inner(code+1, code+skip-2, groups)) FAIL; code += skip-2; GET_OP; @@ -1981,19 +1952,18 @@ _validate_inner(SRE_CODE *code, SRE_CODE *end, PatternObject *self) } static int -_validate_outer(SRE_CODE *code, SRE_CODE *end, PatternObject *self) +_validate_outer(SRE_CODE *code, SRE_CODE *end, Py_ssize_t groups) { - if (self->groups < 0 || (size_t)self->groups > SRE_MAXGROUPS || - self->repeat_count < 0 || + if (groups < 0 || (size_t)groups > SRE_MAXGROUPS || code >= end || end[-1] != SRE_OP_SUCCESS) FAIL; - return _validate_inner(code, end-1, self); + return _validate_inner(code, end-1, groups); } static int _validate(PatternObject *self) { - if (!_validate_outer(self->code, self->code+self->codesize, self)) + if (!_validate_outer(self->code, self->code+self->codesize, self->groups)) { PyErr_SetString(PyExc_RuntimeError, "invalid SRE code"); return 0; diff --git a/Modules/_sre/sre.h b/Modules/_sre/sre.h index aff064d343ec4c..52ae3e11b5f750 100644 --- a/Modules/_sre/sre.h +++ b/Modules/_sre/sre.h @@ -29,8 +29,6 @@ typedef struct { Py_ssize_t groups; /* must be first! */ PyObject* groupindex; /* dict */ PyObject* indexgroup; /* tuple */ - /* the number of REPEATs */ - Py_ssize_t repeat_count; /* compatibility */ PyObject* pattern; /* pattern source (or None) */ int flags; /* flags used when compiling pattern source */ @@ -85,8 +83,6 @@ typedef struct { size_t data_stack_base; /* current repeat context */ SRE_REPEAT *repeat; - /* repeat contexts array */ - SRE_REPEAT *repeats_array; } SRE_STATE; typedef struct { diff --git a/Modules/_sre/sre_constants.h b/Modules/_sre/sre_constants.h index d5de650b7025af..c6335147368626 100644 --- a/Modules/_sre/sre_constants.h +++ b/Modules/_sre/sre_constants.h @@ -11,7 +11,7 @@ * See the sre.c file for information on usage and redistribution. 
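With the repeat bookkeeping removed, the bytecode validator above only carries the group count. A tiny sketch of the deliberately loose MARK bound it enforces (the function name is illustrative; as the comment earlier in this hunk notes, sre_match() tolerates stray marks anyway):

#include <Python.h>

/* Mirrors the `arg > 2 * groups + 1` rejection in _validate_inner(). */
static int
mark_arg_ok(size_t arg, Py_ssize_t groups)
{
    return groups >= 0 && arg <= 2 * (size_t)groups + 1;
}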
*/ -#define SRE_MAGIC 20220423 +#define SRE_MAGIC 20220615 #define SRE_OP_FAILURE 0 #define SRE_OP_SUCCESS 1 #define SRE_OP_ANY 2 @@ -85,6 +85,7 @@ #define SRE_CATEGORY_UNI_NOT_WORD 15 #define SRE_CATEGORY_UNI_LINEBREAK 16 #define SRE_CATEGORY_UNI_NOT_LINEBREAK 17 +#define SRE_FLAG_TEMPLATE 1 #define SRE_FLAG_IGNORECASE 2 #define SRE_FLAG_LOCALE 4 #define SRE_FLAG_MULTILINE 8 diff --git a/Modules/_sre/sre_lib.h b/Modules/_sre/sre_lib.h index 1e5b50170ae76e..fb4c18b63d643d 100644 --- a/Modules/_sre/sre_lib.h +++ b/Modules/_sre/sre_lib.h @@ -1079,12 +1079,17 @@ SRE(match)(SRE_STATE* state, const SRE_CODE* pattern, int toplevel) by the UNTIL operator (MAX_UNTIL, MIN_UNTIL) */ /* <1=min> <2=max> <3=repeat_index> item tail */ - TRACE(("|%p|%p|REPEAT %d %d %d\n", pattern, ptr, - pattern[1], pattern[2], pattern[3])); - - /* install repeat context */ - ctx->u.rep = &state->repeats_array[pattern[3]]; + TRACE(("|%p|%p|REPEAT %d %d\n", pattern, ptr, + pattern[1], pattern[2])); + /* install new repeat context */ + /* TODO(https://github.com/python/cpython/issues/67877): Fix this + * potential memory leak. */ + ctx->u.rep = (SRE_REPEAT*) PyObject_Malloc(sizeof(*ctx->u.rep)); + if (!ctx->u.rep) { + PyErr_NoMemory(); + RETURN_FAILURE; + } ctx->u.rep->count = -1; ctx->u.rep->pattern = pattern; ctx->u.rep->prev = state->repeat; @@ -1094,6 +1099,7 @@ SRE(match)(SRE_STATE* state, const SRE_CODE* pattern, int toplevel) state->ptr = ptr; DO_JUMP(JUMP_REPEAT, jump_repeat, pattern+pattern[0]); state->repeat = ctx->u.rep->prev; + PyObject_Free(ctx->u.rep); if (ret) { RETURN_ON_ERROR(ret); @@ -1103,8 +1109,7 @@ SRE(match)(SRE_STATE* state, const SRE_CODE* pattern, int toplevel) TARGET(SRE_OP_MAX_UNTIL): /* maximizing repeat */ - /* <1=min> <2=max> - <3=repeat_index> item tail */ + /* <1=min> <2=max> item tail */ /* FIXME: we probably need to deal with zero-width matches in here... 
*/ @@ -1124,7 +1129,7 @@ SRE(match)(SRE_STATE* state, const SRE_CODE* pattern, int toplevel) /* not enough matches */ ctx->u.rep->count = ctx->count; DO_JUMP(JUMP_MAX_UNTIL_1, jump_max_until_1, - ctx->u.rep->pattern+4); + ctx->u.rep->pattern+3); if (ret) { RETURN_ON_ERROR(ret); RETURN_SUCCESS; @@ -1146,7 +1151,7 @@ SRE(match)(SRE_STATE* state, const SRE_CODE* pattern, int toplevel) DATA_PUSH(&ctx->u.rep->last_ptr); ctx->u.rep->last_ptr = state->ptr; DO_JUMP(JUMP_MAX_UNTIL_2, jump_max_until_2, - ctx->u.rep->pattern+4); + ctx->u.rep->pattern+3); DATA_POP(&ctx->u.rep->last_ptr); if (ret) { MARK_POP_DISCARD(ctx->lastmark); @@ -1171,8 +1176,7 @@ SRE(match)(SRE_STATE* state, const SRE_CODE* pattern, int toplevel) TARGET(SRE_OP_MIN_UNTIL): /* minimizing repeat */ - /* <1=min> <2=max> - <3=repeat_index> item tail */ + /* <1=min> <2=max> item tail */ ctx->u.rep = state->repeat; if (!ctx->u.rep) @@ -1189,7 +1193,7 @@ SRE(match)(SRE_STATE* state, const SRE_CODE* pattern, int toplevel) /* not enough matches */ ctx->u.rep->count = ctx->count; DO_JUMP(JUMP_MIN_UNTIL_1, jump_min_until_1, - ctx->u.rep->pattern+4); + ctx->u.rep->pattern+3); if (ret) { RETURN_ON_ERROR(ret); RETURN_SUCCESS; @@ -1232,7 +1236,7 @@ SRE(match)(SRE_STATE* state, const SRE_CODE* pattern, int toplevel) DATA_PUSH(&ctx->u.rep->last_ptr); ctx->u.rep->last_ptr = state->ptr; DO_JUMP(JUMP_MIN_UNTIL_3,jump_min_until_3, - ctx->u.rep->pattern+4); + ctx->u.rep->pattern+3); DATA_POP(&ctx->u.rep->last_ptr); if (ret) { RETURN_ON_ERROR(ret); diff --git a/Modules/_ssl.c b/Modules/_ssl.c index e67ab42050b26c..f19ee6815af394 100644 --- a/Modules/_ssl.c +++ b/Modules/_ssl.c @@ -5067,7 +5067,8 @@ static PyType_Spec PySSLSession_spec = { .name = "_ssl.SSLSession", .basicsize = sizeof(PySSLSession), .flags = (Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | - Py_TPFLAGS_IMMUTABLETYPE), + Py_TPFLAGS_IMMUTABLETYPE | + Py_TPFLAGS_DISALLOW_INSTANTIATION), .slots = PySSLSession_slots, }; @@ -5157,24 +5158,6 @@ _ssl_RAND_bytes_impl(PyObject *module, int n) return PySSL_RAND(module, n, 0); } -/*[clinic input] -_ssl.RAND_pseudo_bytes - n: int - / - -Generate n pseudo-random bytes. - -Return a pair (bytes, is_cryptographic). is_cryptographic is True -if the bytes generated are cryptographically strong. 
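The REPEAT hunks above trade the preallocated repeats_array for one heap allocation per repeat context, released once the repeated sub-pattern returns (with a TODO noting the potential leak on non-local exits). A minimal sketch of that allocate/use/free pattern; the struct is an illustrative stand-in for SRE_REPEAT, not the real one:

#include <Python.h>

typedef struct {
    int count;
} repeat_ctx;

static int
with_repeat_ctx(void)
{
    repeat_ctx *rep = PyObject_Malloc(sizeof(*rep));
    if (rep == NULL) {
        PyErr_NoMemory();
        return -1;
    }
    rep->count = -1;
    /* ... run the repeated sub-pattern with rep installed ... */
    PyObject_Free(rep);
    return 0;
}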
-[clinic start generated code]*/ - -static PyObject * -_ssl_RAND_pseudo_bytes_impl(PyObject *module, int n) -/*[clinic end generated code: output=b1509e937000e52d input=58312bd53f9bbdd0]*/ -{ - PY_SSL_DEPRECATED("ssl.RAND_pseudo_bytes() is deprecated", 1, NULL); - return PySSL_RAND(module, n, 1); -} /*[clinic input] _ssl.RAND_status @@ -5633,7 +5616,6 @@ static PyMethodDef PySSL_methods[] = { _SSL__TEST_DECODE_CERT_METHODDEF _SSL_RAND_ADD_METHODDEF _SSL_RAND_BYTES_METHODDEF - _SSL_RAND_PSEUDO_BYTES_METHODDEF _SSL_RAND_STATUS_METHODDEF _SSL_GET_DEFAULT_VERIFY_PATHS_METHODDEF _SSL_ENUM_CERTIFICATES_METHODDEF diff --git a/Modules/_struct.c b/Modules/_struct.c index 4daf9529d882c5..20307ad15be814 100644 --- a/Modules/_struct.c +++ b/Modules/_struct.c @@ -1496,10 +1496,26 @@ Struct___init___impl(PyStructObject *self, PyObject *format) return ret; } +static int +s_clear(PyStructObject *s) +{ + Py_CLEAR(s->s_format); + return 0; +} + +static int +s_traverse(PyStructObject *s, visitproc visit, void *arg) +{ + Py_VISIT(Py_TYPE(s)); + Py_VISIT(s->s_format); + return 0; +} + static void s_dealloc(PyStructObject *s) { PyTypeObject *tp = Py_TYPE(s); + PyObject_GC_UnTrack(s); if (s->weakreflist != NULL) PyObject_ClearWeakRefs((PyObject *)s); if (s->s_codes != NULL) { @@ -2078,13 +2094,15 @@ static PyType_Slot PyStructType_slots[] = { {Py_tp_getattro, PyObject_GenericGetAttr}, {Py_tp_setattro, PyObject_GenericSetAttr}, {Py_tp_doc, (void*)s__doc__}, + {Py_tp_traverse, s_traverse}, + {Py_tp_clear, s_clear}, {Py_tp_methods, s_methods}, {Py_tp_members, s_members}, {Py_tp_getset, s_getsetlist}, {Py_tp_init, Struct___init__}, {Py_tp_alloc, PyType_GenericAlloc}, {Py_tp_new, s_new}, - {Py_tp_free, PyObject_Del}, + {Py_tp_free, PyObject_GC_Del}, {0, 0}, }; @@ -2092,7 +2110,7 @@ static PyType_Spec PyStructType_spec = { "_struct.Struct", sizeof(PyStructObject), 0, - Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE, + Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC | Py_TPFLAGS_BASETYPE, PyStructType_slots }; diff --git a/Modules/_testcapimodule.c b/Modules/_testcapimodule.c index 363dbbb666ca5e..f427a490471762 100644 --- a/Modules/_testcapimodule.c +++ b/Modules/_testcapimodule.c @@ -21,7 +21,6 @@ #define PY_SSIZE_T_CLEAN #include "Python.h" -#include "frameobject.h" // PyFrame_Check() #include "datetime.h" // PyDateTimeAPI #include "marshal.h" // PyMarshal_WriteLongToFile #include "structmember.h" // PyMemberDef @@ -308,6 +307,32 @@ test_dict_inner(int count) } } +static PyObject *pytype_fromspec_meta(PyObject* self, PyObject *meta) +{ + if (!PyType_Check(meta)) { + PyErr_SetString( + TestError, + "pytype_fromspec_meta: must be invoked with a type argument!"); + return NULL; + } + + PyType_Slot HeapCTypeViaMetaclass_slots[] = { + {0}, + }; + + PyType_Spec HeapCTypeViaMetaclass_spec = { + "_testcapi.HeapCTypeViaMetaclass", + sizeof(PyObject), + 0, + Py_TPFLAGS_DEFAULT, + HeapCTypeViaMetaclass_slots + }; + + return PyType_FromMetaclass( + (PyTypeObject *) meta, NULL, &HeapCTypeViaMetaclass_spec, NULL); +} + + static PyObject* test_dict_iteration(PyObject* self, PyObject *Py_UNUSED(ignored)) { @@ -1182,6 +1207,162 @@ test_get_type_name(PyObject *self, PyObject *Py_UNUSED(ignored)) } +static PyType_Slot empty_type_slots[] = { + {0, 0}, +}; + +static PyType_Spec MinimalMetaclass_spec = { + .name = "_testcapi.MinimalMetaclass", + .basicsize = sizeof(PyHeapTypeObject), + .flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE, + .slots = empty_type_slots, +}; + +static PyType_Spec MinimalType_spec = { + .name = "_testcapi.MinimalSpecType", + 
.basicsize = 0, // Updated later + .flags = Py_TPFLAGS_DEFAULT, + .slots = empty_type_slots, +}; + +static PyObject * +test_from_spec_metatype_inheritance(PyObject *self, PyObject *Py_UNUSED(ignored)) +{ + PyObject *metaclass = NULL; + PyObject *class = NULL; + PyObject *new = NULL; + PyObject *subclasses = NULL; + PyObject *result = NULL; + int r; + + metaclass = PyType_FromSpecWithBases(&MinimalMetaclass_spec, (PyObject*)&PyType_Type); + if (metaclass == NULL) { + goto finally; + } + class = PyObject_CallFunction(metaclass, "s(){}", "TestClass"); + if (class == NULL) { + goto finally; + } + + MinimalType_spec.basicsize = (int)(((PyTypeObject*)class)->tp_basicsize); + new = PyType_FromSpecWithBases(&MinimalType_spec, class); + if (new == NULL) { + goto finally; + } + if (Py_TYPE(new) != (PyTypeObject*)metaclass) { + PyErr_SetString(PyExc_AssertionError, + "Metaclass not set properly!"); + goto finally; + } + + /* Assert that __subclasses__ is updated */ + subclasses = PyObject_CallMethod(class, "__subclasses__", ""); + if (!subclasses) { + goto finally; + } + r = PySequence_Contains(subclasses, new); + if (r < 0) { + goto finally; + } + if (r == 0) { + PyErr_SetString(PyExc_AssertionError, + "subclasses not set properly!"); + goto finally; + } + + result = Py_NewRef(Py_None); + +finally: + Py_XDECREF(metaclass); + Py_XDECREF(class); + Py_XDECREF(new); + Py_XDECREF(subclasses); + return result; +} + + +static PyObject * +test_from_spec_invalid_metatype_inheritance(PyObject *self, PyObject *Py_UNUSED(ignored)) +{ + PyObject *metaclass_a = NULL; + PyObject *metaclass_b = NULL; + PyObject *class_a = NULL; + PyObject *class_b = NULL; + PyObject *bases = NULL; + PyObject *new = NULL; + PyObject *meta_error_string = NULL; + PyObject *exc_type = NULL; + PyObject *exc_value = NULL; + PyObject *exc_traceback = NULL; + PyObject *result = NULL; + + metaclass_a = PyType_FromSpecWithBases(&MinimalMetaclass_spec, (PyObject*)&PyType_Type); + if (metaclass_a == NULL) { + goto finally; + } + metaclass_b = PyType_FromSpecWithBases(&MinimalMetaclass_spec, (PyObject*)&PyType_Type); + if (metaclass_b == NULL) { + goto finally; + } + class_a = PyObject_CallFunction(metaclass_a, "s(){}", "TestClassA"); + if (class_a == NULL) { + goto finally; + } + + class_b = PyObject_CallFunction(metaclass_b, "s(){}", "TestClassB"); + if (class_b == NULL) { + goto finally; + } + + bases = PyTuple_Pack(2, class_a, class_b); + if (bases == NULL) { + goto finally; + } + + /* + * The following should raise a TypeError due to a MetaClass conflict. 
+ */ + new = PyType_FromSpecWithBases(&MinimalType_spec, bases); + if (new != NULL) { + PyErr_SetString(PyExc_AssertionError, + "MetaType conflict not recognized by PyType_FromSpecWithBases"); + goto finally; + } + + // Assert that the correct exception was raised + if (PyErr_ExceptionMatches(PyExc_TypeError)) { + PyErr_Fetch(&exc_type, &exc_value, &exc_traceback); + + meta_error_string = PyUnicode_FromString("metaclass conflict:"); + if (meta_error_string == NULL) { + goto finally; + } + int res = PyUnicode_Contains(exc_value, meta_error_string); + if (res < 0) { + goto finally; + } + if (res == 0) { + PyErr_SetString(PyExc_AssertionError, + "TypeError did not inlclude expected message."); + goto finally; + } + result = Py_NewRef(Py_None); + } +finally: + Py_XDECREF(metaclass_a); + Py_XDECREF(metaclass_b); + Py_XDECREF(bases); + Py_XDECREF(new); + Py_XDECREF(meta_error_string); + Py_XDECREF(exc_type); + Py_XDECREF(exc_value); + Py_XDECREF(exc_traceback); + Py_XDECREF(class_a); + Py_XDECREF(class_b); + return result; +} + + static PyObject * simple_str(PyObject *self) { return PyUnicode_FromString(""); @@ -1221,7 +1402,7 @@ test_type_from_ephemeral_spec(PyObject *self, PyObject *Py_UNUSED(ignored)) memcpy(name, NAME, sizeof(NAME)); doc = PyMem_New(char, sizeof(DOC)); - if (name == NULL) { + if (doc == NULL) { PyErr_NoMemory(); goto finally; } @@ -1301,6 +1482,63 @@ test_type_from_ephemeral_spec(PyObject *self, PyObject *Py_UNUSED(ignored)) return result; } +PyType_Slot repeated_doc_slots[] = { + {Py_tp_doc, "A class used for tests·"}, + {Py_tp_doc, "A class used for tests"}, + {0, 0}, +}; + +PyType_Spec repeated_doc_slots_spec = { + .name = "RepeatedDocSlotClass", + .basicsize = sizeof(PyObject), + .slots = repeated_doc_slots, +}; + +typedef struct { + PyObject_HEAD + int data; +} HeapCTypeWithDataObject; + + +static struct PyMemberDef members_to_repeat[] = { + {"T_INT", T_INT, offsetof(HeapCTypeWithDataObject, data), 0, NULL}, + {NULL} +}; + +PyType_Slot repeated_members_slots[] = { + {Py_tp_members, members_to_repeat}, + {Py_tp_members, members_to_repeat}, + {0, 0}, +}; + +PyType_Spec repeated_members_slots_spec = { + .name = "RepeatedMembersSlotClass", + .basicsize = sizeof(HeapCTypeWithDataObject), + .slots = repeated_members_slots, +}; + +static PyObject * +create_type_from_repeated_slots(PyObject *self, PyObject *variant_obj) +{ + PyObject *class = NULL; + int variant = PyLong_AsLong(variant_obj); + if (PyErr_Occurred()) { + return NULL; + } + switch (variant) { + case 0: + class = PyType_FromSpec(&repeated_doc_slots_spec); + break; + case 1: + class = PyType_FromSpec(&repeated_members_slots_spec); + break; + default: + PyErr_SetString(PyExc_ValueError, "bad test variant"); + break; + } + return class; +} + static PyObject * test_get_type_qualname(PyObject *self, PyObject *Py_UNUSED(ignored)) @@ -5832,6 +6070,44 @@ settrace_to_record(PyObject *self, PyObject *list) Py_RETURN_NONE; } + +static PyObject * +test_macros(PyObject *self, PyObject *Py_UNUSED(args)) +{ + struct MyStruct { + int x; + }; + wchar_t array[3]; + + // static_assert(), Py_BUILD_ASSERT() + static_assert(1 == 1, "bug"); + Py_BUILD_ASSERT(1 == 1); + + + // Py_MIN(), Py_MAX(), Py_ABS() + assert(Py_MIN(5, 11) == 5); + assert(Py_MAX(5, 11) == 11); + assert(Py_ABS(-5) == 5); + + // Py_STRINGIFY() + assert(strcmp(Py_STRINGIFY(123), "123") == 0); + + // Py_MEMBER_SIZE(), Py_ARRAY_LENGTH() + assert(Py_MEMBER_SIZE(struct MyStruct, x) == sizeof(int)); + assert(Py_ARRAY_LENGTH(array) == 3); + + // Py_CHARMASK() + int c = 0xab00 
| 7; + assert(Py_CHARMASK(c) == 7); + + // _Py_IS_TYPE_SIGNED() + assert(_Py_IS_TYPE_SIGNED(int)); + assert(!_Py_IS_TYPE_SIGNED(unsigned int)); + + Py_RETURN_NONE; +} + + static PyObject *negative_dictoffset(PyObject *, PyObject *); static PyObject *test_buildvalue_issue38913(PyObject *, PyObject *); static PyObject *getargs_s_hash_int(PyObject *, PyObject *, PyObject*); @@ -5886,6 +6162,7 @@ static PyMethodDef TestMethods[] = { {"test_long_numbits", test_long_numbits, METH_NOARGS}, {"test_k_code", test_k_code, METH_NOARGS}, {"test_empty_argparse", test_empty_argparse, METH_NOARGS}, + {"pytype_fromspec_meta", pytype_fromspec_meta, METH_O}, {"parse_tuple_and_keywords", parse_tuple_and_keywords, METH_VARARGS}, {"pyobject_repr_from_null", pyobject_repr_from_null, METH_NOARGS}, {"pyobject_str_from_null", pyobject_str_from_null, METH_NOARGS}, @@ -5911,6 +6188,13 @@ static PyMethodDef TestMethods[] = { {"test_get_type_name", test_get_type_name, METH_NOARGS}, {"test_get_type_qualname", test_get_type_qualname, METH_NOARGS}, {"test_type_from_ephemeral_spec", test_type_from_ephemeral_spec, METH_NOARGS}, + {"create_type_from_repeated_slots", + create_type_from_repeated_slots, METH_O}, + {"test_from_spec_metatype_inheritance", test_from_spec_metatype_inheritance, + METH_NOARGS}, + {"test_from_spec_invalid_metatype_inheritance", + test_from_spec_invalid_metatype_inheritance, + METH_NOARGS}, {"get_kwargs", _PyCFunction_CAST(get_kwargs), METH_VARARGS|METH_KEYWORDS}, {"getargs_tuple", getargs_tuple, METH_VARARGS}, @@ -6122,6 +6406,7 @@ static PyMethodDef TestMethods[] = { {"get_feature_macros", get_feature_macros, METH_NOARGS, NULL}, {"test_code_api", test_code_api, METH_NOARGS, NULL}, {"settrace_to_record", settrace_to_record, METH_O, NULL}, + {"test_macros", test_macros, METH_NOARGS, NULL}, {NULL, NULL} /* sentinel */ }; @@ -7078,6 +7363,38 @@ static PyType_Spec HeapCTypeSubclassWithFinalizer_spec = { HeapCTypeSubclassWithFinalizer_slots }; +static PyType_Slot HeapCTypeMetaclass_slots[] = { + {0}, +}; + +static PyType_Spec HeapCTypeMetaclass_spec = { + "_testcapi.HeapCTypeMetaclass", + sizeof(PyHeapTypeObject), + sizeof(PyMemberDef), + Py_TPFLAGS_DEFAULT, + HeapCTypeMetaclass_slots +}; + +static PyObject * +heap_ctype_metaclass_custom_tp_new(PyTypeObject *tp, PyObject *args, PyObject *kwargs) +{ + return PyType_Type.tp_new(tp, args, kwargs); +} + +static PyType_Slot HeapCTypeMetaclassCustomNew_slots[] = { + { Py_tp_new, heap_ctype_metaclass_custom_tp_new }, + {0}, +}; + +static PyType_Spec HeapCTypeMetaclassCustomNew_spec = { + "_testcapi.HeapCTypeMetaclassCustomNew", + sizeof(PyHeapTypeObject), + sizeof(PyMemberDef), + Py_TPFLAGS_DEFAULT, + HeapCTypeMetaclassCustomNew_slots +}; + + typedef struct { PyObject_HEAD PyObject *dict; @@ -7591,6 +7908,20 @@ PyInit__testcapi(void) Py_DECREF(subclass_with_finalizer_bases); PyModule_AddObject(m, "HeapCTypeSubclassWithFinalizer", HeapCTypeSubclassWithFinalizer); + PyObject *HeapCTypeMetaclass = PyType_FromMetaclass( + &PyType_Type, m, &HeapCTypeMetaclass_spec, (PyObject *) &PyType_Type); + if (HeapCTypeMetaclass == NULL) { + return NULL; + } + PyModule_AddObject(m, "HeapCTypeMetaclass", HeapCTypeMetaclass); + + PyObject *HeapCTypeMetaclassCustomNew = PyType_FromMetaclass( + &PyType_Type, m, &HeapCTypeMetaclassCustomNew_spec, (PyObject *) &PyType_Type); + if (HeapCTypeMetaclassCustomNew == NULL) { + return NULL; + } + PyModule_AddObject(m, "HeapCTypeMetaclassCustomNew", HeapCTypeMetaclassCustomNew); + if (PyType_Ready(&ContainerNoGC_type) < 0) { return NULL; } 
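The _testcapi additions above exercise the new metaclass support: PyType_FromMetaclass() and metaclass inheritance through PyType_FromSpecWithBases(), with the tests checking Py_TYPE() of the resulting class. A minimal sketch of creating a heap type under an explicit metaclass (the spec name is illustrative):

#include <Python.h>

static PyType_Slot WidgetSlots[] = {
    {0, NULL},
};

static PyType_Spec WidgetSpec = {
    .name = "_example.Widget",
    .basicsize = sizeof(PyObject),
    .flags = Py_TPFLAGS_DEFAULT,
    .slots = WidgetSlots,
};

/* `meta` must be a subtype of type; NULL bases means "inherit from
 * object".  Py_TYPE() of the returned class is `meta`. */
static PyObject *
make_widget_type(PyTypeObject *meta, PyObject *module)
{
    return PyType_FromMetaclass(meta, module, &WidgetSpec, NULL);
}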
diff --git a/Modules/_testinternalcapi.c b/Modules/_testinternalcapi.c index 914b20b36f146b..238de749fffc5d 100644 --- a/Modules/_testinternalcapi.c +++ b/Modules/_testinternalcapi.c @@ -307,7 +307,7 @@ check_edit_cost(const char *a, const char *b, Py_ssize_t expected) goto exit; } b_obj = PyUnicode_FromString(b); - if (a_obj == NULL) { + if (b_obj == NULL) { goto exit; } Py_ssize_t result = _Py_UTF8_Edit_Cost(a_obj, b_obj, -1); diff --git a/Modules/_winapi.c b/Modules/_winapi.c index 3e24d512cac384..4845b4e6d4ad7c 100644 --- a/Modules/_winapi.c +++ b/Modules/_winapi.c @@ -1512,6 +1512,50 @@ _winapi_PeekNamedPipe_impl(PyObject *module, HANDLE handle, int size) } } +/*[clinic input] +_winapi.LCMapStringEx + + locale: LPCWSTR + flags: DWORD + src: LPCWSTR + +[clinic start generated code]*/ + +static PyObject * +_winapi_LCMapStringEx_impl(PyObject *module, LPCWSTR locale, DWORD flags, + LPCWSTR src) +/*[clinic end generated code: output=cf4713d80e2b47c9 input=9fe26f95d5ab0001]*/ +{ + if (flags & (LCMAP_SORTHANDLE | LCMAP_HASH | LCMAP_BYTEREV | + LCMAP_SORTKEY)) { + return PyErr_Format(PyExc_ValueError, "unsupported flags"); + } + + int dest_size = LCMapStringEx(locale, flags, src, -1, NULL, 0, + NULL, NULL, 0); + if (dest_size == 0) { + return PyErr_SetFromWindowsErr(0); + } + + wchar_t* dest = PyMem_NEW(wchar_t, dest_size); + if (dest == NULL) { + return PyErr_NoMemory(); + } + + int nmapped = LCMapStringEx(locale, flags, src, -1, dest, dest_size, + NULL, NULL, 0); + if (nmapped == 0) { + DWORD error = GetLastError(); + PyMem_DEL(dest); + return PyErr_SetFromWindowsErr(error); + } + + PyObject *ret = PyUnicode_FromWideChar(dest, dest_size - 1); + PyMem_DEL(dest); + + return ret; +} + /*[clinic input] _winapi.ReadFile @@ -2023,6 +2067,7 @@ static PyMethodDef winapi_functions[] = { _WINAPI_OPENFILEMAPPING_METHODDEF _WINAPI_OPENPROCESS_METHODDEF _WINAPI_PEEKNAMEDPIPE_METHODDEF + _WINAPI_LCMAPSTRINGEX_METHODDEF _WINAPI_READFILE_METHODDEF _WINAPI_SETNAMEDPIPEHANDLESTATE_METHODDEF _WINAPI_TERMINATEPROCESS_METHODDEF @@ -2160,6 +2205,22 @@ static int winapi_exec(PyObject *m) WINAPI_CONSTANT(F_DWORD, FILE_TYPE_PIPE); WINAPI_CONSTANT(F_DWORD, FILE_TYPE_REMOTE); + WINAPI_CONSTANT("u", LOCALE_NAME_INVARIANT); + WINAPI_CONSTANT(F_DWORD, LOCALE_NAME_MAX_LENGTH); + WINAPI_CONSTANT("u", LOCALE_NAME_SYSTEM_DEFAULT); + WINAPI_CONSTANT("u", LOCALE_NAME_USER_DEFAULT); + + WINAPI_CONSTANT(F_DWORD, LCMAP_FULLWIDTH); + WINAPI_CONSTANT(F_DWORD, LCMAP_HALFWIDTH); + WINAPI_CONSTANT(F_DWORD, LCMAP_HIRAGANA); + WINAPI_CONSTANT(F_DWORD, LCMAP_KATAKANA); + WINAPI_CONSTANT(F_DWORD, LCMAP_LINGUISTIC_CASING); + WINAPI_CONSTANT(F_DWORD, LCMAP_LOWERCASE); + WINAPI_CONSTANT(F_DWORD, LCMAP_SIMPLIFIED_CHINESE); + WINAPI_CONSTANT(F_DWORD, LCMAP_TITLECASE); + WINAPI_CONSTANT(F_DWORD, LCMAP_TRADITIONAL_CHINESE); + WINAPI_CONSTANT(F_DWORD, LCMAP_UPPERCASE); + WINAPI_CONSTANT("i", NULL); return 0; diff --git a/Modules/_xxsubinterpretersmodule.c b/Modules/_xxsubinterpretersmodule.c index 9b1f186c5b6c7c..f40601ad3a1a72 100644 --- a/Modules/_xxsubinterpretersmodule.c +++ b/Modules/_xxsubinterpretersmodule.c @@ -6,7 +6,6 @@ #endif #include "Python.h" -#include "frameobject.h" #include "pycore_frame.h" #include "pycore_pystate.h" // _PyThreadState_GET() #include "pycore_interpreteridobject.h" @@ -1932,20 +1931,6 @@ _run_script_in_interpreter(PyInterpreterState *interp, const char *codestr, return -1; } -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - // Switch to interpreter. 
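The _testinternalcapi hunk above fixes a copy-and-paste bug: the result of the second PyUnicode_FromString() call was being checked against the first. A small sketch of the corrected shape (the function name is illustrative):

#include <Python.h>

/* Convert two C strings, checking each conversion against its own result. */
static int
convert_pair(const char *a, const char *b, PyObject **pa, PyObject **pb)
{
    *pa = PyUnicode_FromString(a);
    if (*pa == NULL) {
        return -1;
    }
    *pb = PyUnicode_FromString(b);
    if (*pb == NULL) {          /* previously this re-tested *pa */
        Py_CLEAR(*pa);
        return -1;
    }
    return 0;
}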
- PyThreadState *new_tstate = PyInterpreterState_ThreadHead(interp); - PyThreadState *save1 = PyEval_SaveThread(); - - (void)PyThreadState_Swap(new_tstate); - - // Run the script. - _sharedexception *exc = NULL; - int result = _run_script(interp, codestr, shared, &exc); - - // Switch back. - PyEval_RestoreThread(save1); -#else // Switch to interpreter. PyThreadState *save_tstate = NULL; if (interp != PyInterpreterState_Get()) { @@ -1963,7 +1948,6 @@ _run_script_in_interpreter(PyInterpreterState *interp, const char *codestr, if (save_tstate != NULL) { PyThreadState_Swap(save_tstate); } -#endif // Propagate any exception out to the caller. if (exc != NULL) { @@ -2338,7 +2322,7 @@ channel_list_all(PyObject *self, PyObject *Py_UNUSED(ignored)) ids = NULL; break; } - PyList_SET_ITEM(ids, i, id); + PyList_SET_ITEM(ids, (Py_ssize_t)i, id); } finally: diff --git a/Modules/_zoneinfo.c b/Modules/_zoneinfo.c index 1535721b026d1f..207340adec152f 100644 --- a/Modules/_zoneinfo.c +++ b/Modules/_zoneinfo.c @@ -659,14 +659,8 @@ zoneinfo_reduce(PyObject *obj_self, PyObject *unused) PyZoneInfo_ZoneInfo *self = (PyZoneInfo_ZoneInfo *)obj_self; if (self->source == SOURCE_FILE) { // Objects constructed from files cannot be pickled. - PyObject *pickle = PyImport_ImportModule("pickle"); - if (pickle == NULL) { - return NULL; - } - PyObject *pickle_error = - PyObject_GetAttrString(pickle, "PicklingError"); - Py_DECREF(pickle); + _PyImport_GetModuleAttrString("pickle", "PicklingError"); if (pickle_error == NULL) { return NULL; } @@ -2492,14 +2486,13 @@ clear_strong_cache(const PyTypeObject *const type) static PyObject * new_weak_cache(void) { - PyObject *weakref_module = PyImport_ImportModule("weakref"); - if (weakref_module == NULL) { + PyObject *WeakValueDictionary = + _PyImport_GetModuleAttrString("weakref", "WeakValueDictionary"); + if (WeakValueDictionary == NULL) { return NULL; } - - PyObject *weak_cache = - PyObject_CallMethod(weakref_module, "WeakValueDictionary", ""); - Py_DECREF(weakref_module); + PyObject *weak_cache = PyObject_CallNoArgs(WeakValueDictionary); + Py_DECREF(WeakValueDictionary); return weak_cache; } @@ -2656,25 +2649,13 @@ zoneinfomodule_exec(PyObject *m) PyModule_AddObject(m, "ZoneInfo", (PyObject *)&PyZoneInfo_ZoneInfoType); /* Populate imports */ - PyObject *_tzpath_module = PyImport_ImportModule("zoneinfo._tzpath"); - if (_tzpath_module == NULL) { - goto error; - } - _tzpath_find_tzfile = - PyObject_GetAttrString(_tzpath_module, "find_tzfile"); - Py_DECREF(_tzpath_module); + _PyImport_GetModuleAttrString("zoneinfo._tzpath", "find_tzfile"); if (_tzpath_find_tzfile == NULL) { goto error; } - PyObject *io_module = PyImport_ImportModule("io"); - if (io_module == NULL) { - goto error; - } - - io_open = PyObject_GetAttrString(io_module, "open"); - Py_DECREF(io_module); + io_open = _PyImport_GetModuleAttrString("io", "open"); if (io_open == NULL) { goto error; } diff --git a/Modules/arraymodule.c b/Modules/arraymodule.c index a04e6a4e070d9e..924fbf29bfb889 100644 --- a/Modules/arraymodule.c +++ b/Modules/arraymodule.c @@ -60,7 +60,6 @@ typedef struct { PyObject *str_read; PyObject *str_write; - PyObject *str__array_reconstructor; PyObject *str___dict__; PyObject *str_iter; } array_state; @@ -2199,13 +2198,8 @@ array_array___reduce_ex___impl(arrayobject *self, PyTypeObject *cls, assert(state != NULL); if (array_reconstructor == NULL) { - PyObject *array_module = PyImport_ImportModule("array"); - if (array_module == NULL) - return NULL; - array_reconstructor = PyObject_GetAttr( - array_module, 
- state->str__array_reconstructor); - Py_DECREF(array_module); + array_reconstructor = _PyImport_GetModuleAttrString( + "array", "_array_reconstructor"); if (array_reconstructor == NULL) return NULL; } @@ -3029,7 +3023,6 @@ array_clear(PyObject *module) Py_CLEAR(state->ArrayIterType); Py_CLEAR(state->str_read); Py_CLEAR(state->str_write); - Py_CLEAR(state->str__array_reconstructor); Py_CLEAR(state->str___dict__); Py_CLEAR(state->str_iter); return 0; @@ -3075,7 +3068,6 @@ array_modexec(PyObject *m) /* Add interned strings */ ADD_INTERNED(state, read); ADD_INTERNED(state, write); - ADD_INTERNED(state, _array_reconstructor); ADD_INTERNED(state, __dict__); ADD_INTERNED(state, iter); @@ -3089,13 +3081,8 @@ array_modexec(PyObject *m) return -1; } - PyObject *abc_mod = PyImport_ImportModule("collections.abc"); - if (!abc_mod) { - Py_DECREF((PyObject *)state->ArrayType); - return -1; - } - PyObject *mutablesequence = PyObject_GetAttrString(abc_mod, "MutableSequence"); - Py_DECREF(abc_mod); + PyObject *mutablesequence = _PyImport_GetModuleAttrString( + "collections.abc", "MutableSequence"); if (!mutablesequence) { Py_DECREF((PyObject *)state->ArrayType); return -1; diff --git a/Modules/binascii.c b/Modules/binascii.c index afe49885491714..ffc2c59413613b 100644 --- a/Modules/binascii.c +++ b/Modules/binascii.c @@ -1024,10 +1024,7 @@ binascii_a2b_qp_impl(PyObject *module, Py_buffer *data, int header) out++; } } - if ((rv = PyBytes_FromStringAndSize((char *)odata, out)) == NULL) { - PyMem_Free(odata); - return NULL; - } + rv = PyBytes_FromStringAndSize((char *)odata, out); PyMem_Free(odata); return rv; } @@ -1232,10 +1229,7 @@ binascii_b2a_qp_impl(PyObject *module, Py_buffer *data, int quotetabs, } } } - if ((rv = PyBytes_FromStringAndSize((char *)odata, out)) == NULL) { - PyMem_Free(odata); - return NULL; - } + rv = PyBytes_FromStringAndSize((char *)odata, out); PyMem_Free(odata); return rv; } diff --git a/Modules/cjkcodecs/cjkcodecs.h b/Modules/cjkcodecs/cjkcodecs.h index ba8fad26055a8b..d9aeec2ff40b08 100644 --- a/Modules/cjkcodecs/cjkcodecs.h +++ b/Modules/cjkcodecs/cjkcodecs.h @@ -245,14 +245,7 @@ static const struct dbcs_map *mapping_list; static PyObject * getmultibytecodec(void) { - PyObject *mod = PyImport_ImportModuleNoBlock("_multibytecodec"); - if (mod == NULL) { - return NULL; - } - - PyObject *cofunc = PyObject_GetAttrString(mod, "__create_codec"); - Py_DECREF(mod); - return cofunc; + return _PyImport_GetModuleAttrString("_multibytecodec", "__create_codec"); } static PyObject * diff --git a/Modules/cjkcodecs/mappings_hk.h b/Modules/cjkcodecs/mappings_hk.h index 1b1d70e7c1750c..9012ae350c4def 100644 --- a/Modules/cjkcodecs/mappings_hk.h +++ b/Modules/cjkcodecs/mappings_hk.h @@ -1,3 +1,4 @@ +// AUTO-GENERATED FILE FROM genmap_tchinese.py: DO NOT EDIT static const ucs2_t __big5hkscs_decmap[6219] = { 17392,19506,17923,17830,17784,29287,19831,17843,31921,19682,31941,15253,18230, 18244,19527,19520,17087,13847,29522,28299,28882,19543,41809,18255,17882,19589, diff --git a/Modules/cjkcodecs/mappings_tw.h b/Modules/cjkcodecs/mappings_tw.h index ec3f9f7468e41b..ceb4bc56a218f2 100644 --- a/Modules/cjkcodecs/mappings_tw.h +++ b/Modules/cjkcodecs/mappings_tw.h @@ -1,3 +1,4 @@ +// AUTO-GENERATED FILE FROM genmap_tchinese.py: DO NOT EDIT static const ucs2_t __big5_decmap[16702] = { 12288,65292,12289,12290,65294,8226,65307,65306,65311,65281,65072,8230,8229, 65104,65380,65106,183,65108,65109,65110,65111,65372,8211,65073,8212,65075, @@ -2631,3 +2632,4 @@ static const struct unim_index cp950ext_encmap[256] 
= { 0,0},{0,0,0},{0,0,0},{0,0,0},{0,0,0},{0,0,0},{__cp950ext_encmap+342,81,104},{ __cp950ext_encmap+366,15,229}, }; + diff --git a/Modules/clinic/_ssl.c.h b/Modules/clinic/_ssl.c.h index 67b125f3d76167..24604dd43687c5 100644 --- a/Modules/clinic/_ssl.c.h +++ b/Modules/clinic/_ssl.c.h @@ -1090,37 +1090,6 @@ _ssl_RAND_bytes(PyObject *module, PyObject *arg) return return_value; } -PyDoc_STRVAR(_ssl_RAND_pseudo_bytes__doc__, -"RAND_pseudo_bytes($module, n, /)\n" -"--\n" -"\n" -"Generate n pseudo-random bytes.\n" -"\n" -"Return a pair (bytes, is_cryptographic). is_cryptographic is True\n" -"if the bytes generated are cryptographically strong."); - -#define _SSL_RAND_PSEUDO_BYTES_METHODDEF \ - {"RAND_pseudo_bytes", (PyCFunction)_ssl_RAND_pseudo_bytes, METH_O, _ssl_RAND_pseudo_bytes__doc__}, - -static PyObject * -_ssl_RAND_pseudo_bytes_impl(PyObject *module, int n); - -static PyObject * -_ssl_RAND_pseudo_bytes(PyObject *module, PyObject *arg) -{ - PyObject *return_value = NULL; - int n; - - n = _PyLong_AsInt(arg); - if (n == -1 && PyErr_Occurred()) { - goto exit; - } - return_value = _ssl_RAND_pseudo_bytes_impl(module, n); - -exit: - return return_value; -} - PyDoc_STRVAR(_ssl_RAND_status__doc__, "RAND_status($module, /)\n" "--\n" @@ -1361,4 +1330,4 @@ _ssl_enum_crls(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObje #ifndef _SSL_ENUM_CRLS_METHODDEF #define _SSL_ENUM_CRLS_METHODDEF #endif /* !defined(_SSL_ENUM_CRLS_METHODDEF) */ -/*[clinic end generated code: output=2a488dd0cbc777df input=a9049054013a1b77]*/ +/*[clinic end generated code: output=9d806f8ff4a06ed3 input=a9049054013a1b77]*/ diff --git a/Modules/clinic/_winapi.c.h b/Modules/clinic/_winapi.c.h index 4d89888af9054a..486029a6300304 100644 --- a/Modules/clinic/_winapi.c.h +++ b/Modules/clinic/_winapi.c.h @@ -820,6 +820,43 @@ _winapi_PeekNamedPipe(PyObject *module, PyObject *const *args, Py_ssize_t nargs) return return_value; } +PyDoc_STRVAR(_winapi_LCMapStringEx__doc__, +"LCMapStringEx($module, /, locale, flags, src)\n" +"--\n" +"\n"); + +#define _WINAPI_LCMAPSTRINGEX_METHODDEF \ + {"LCMapStringEx", _PyCFunction_CAST(_winapi_LCMapStringEx), METH_FASTCALL|METH_KEYWORDS, _winapi_LCMapStringEx__doc__}, + +static PyObject * +_winapi_LCMapStringEx_impl(PyObject *module, LPCWSTR locale, DWORD flags, + LPCWSTR src); + +static PyObject * +_winapi_LCMapStringEx(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObject *kwnames) +{ + PyObject *return_value = NULL; + static const char * const _keywords[] = {"locale", "flags", "src", NULL}; + static _PyArg_Parser _parser = {"O&kO&:LCMapStringEx", _keywords, 0}; + LPCWSTR locale; + DWORD flags; + LPCWSTR src; + + if (!_PyArg_ParseStackAndKeywords(args, nargs, kwnames, &_parser, + _PyUnicode_WideCharString_Converter, &locale, &flags, _PyUnicode_WideCharString_Converter, &src)) { + goto exit; + } + return_value = _winapi_LCMapStringEx_impl(module, locale, flags, src); + +exit: + /* Cleanup for locale */ + PyMem_Free((void *)locale); + /* Cleanup for src */ + PyMem_Free((void *)src); + + return return_value; +} + PyDoc_STRVAR(_winapi_ReadFile__doc__, "ReadFile($module, /, handle, size, overlapped=False)\n" "--\n" @@ -1164,4 +1201,4 @@ _winapi__mimetypes_read_windows_registry(PyObject *module, PyObject *const *args exit: return return_value; } -/*[clinic end generated code: output=b007dde2e7f2fff8 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=6cdefec63a1d7f12 input=a9049054013a1b77]*/ diff --git a/Modules/faulthandler.c b/Modules/faulthandler.c index 
08c40834c45f20..b36268782bda9f 100644 --- a/Modules/faulthandler.c +++ b/Modules/faulthandler.c @@ -5,8 +5,6 @@ #include "pycore_signal.h" // Py_NSIG #include "pycore_traceback.h" // _Py_DumpTracebackThreads -#include "frameobject.h" - #include #include #include @@ -1340,13 +1338,13 @@ PyInit_faulthandler(void) static int faulthandler_init_enable(void) { - PyObject *module = PyImport_ImportModule("faulthandler"); - if (module == NULL) { + PyObject *enable = _PyImport_GetModuleAttrString("faulthandler", "enable"); + if (enable == NULL) { return -1; } - PyObject *res = PyObject_CallMethodNoArgs(module, &_Py_ID(enable)); - Py_DECREF(module); + PyObject *res = PyObject_CallNoArgs(enable); + Py_DECREF(enable); if (res == NULL) { return -1; } diff --git a/Modules/fcntlmodule.c b/Modules/fcntlmodule.c index ea9b2bc14a9f24..9a8ec8dc9858d7 100644 --- a/Modules/fcntlmodule.c +++ b/Modules/fcntlmodule.c @@ -8,6 +8,9 @@ #ifdef HAVE_SYS_FILE_H #include #endif +#ifdef HAVE_LINUX_FS_H +#include +#endif #include #include @@ -572,6 +575,12 @@ all_ins(PyObject* m) #ifdef F_GETPIPE_SZ if (PyModule_AddIntMacro(m, F_GETPIPE_SZ)) return -1; #endif +#ifdef FICLONE + if (PyModule_AddIntMacro(m, FICLONE)) return -1; +#endif +#ifdef FICLONERANGE + if (PyModule_AddIntMacro(m, FICLONERANGE)) return -1; +#endif /* OS X specifics */ #ifdef F_FULLFSYNC diff --git a/Modules/gcmodule.c b/Modules/gcmodule.c index 45dad8f0824df8..3bda6e4bb8c021 100644 --- a/Modules/gcmodule.c +++ b/Modules/gcmodule.c @@ -1195,14 +1195,6 @@ gc_collect_main(PyThreadState *tstate, int generation, assert(gcstate->garbage != NULL); assert(!_PyErr_Occurred(tstate)); -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - if (tstate->interp->config._isolated_interpreter) { - // bpo-40533: The garbage collector must not be run on parallel on - // Python objects shared by multiple interpreters. - return 0; - } -#endif - if (gcstate->debug & DEBUG_STATS) { PySys_WriteStderr("gc: collecting generation %d...\n", generation); show_stats_each_generations(gcstate); diff --git a/Modules/getnameinfo.c b/Modules/getnameinfo.c index f014c11ae157e4..db3e8eedd217c0 100644 --- a/Modules/getnameinfo.c +++ b/Modules/getnameinfo.c @@ -104,8 +104,8 @@ getnameinfo(sa, salen, host, hostlen, serv, servlen, flags) u_long v4a; #ifdef ENABLE_IPV6 u_char pfx; -#endif int h_error; +#endif char numserv[512]; char numaddr[512]; @@ -181,7 +181,6 @@ getnameinfo(sa, salen, host, hostlen, serv, servlen, flags) hp = getipnodebyaddr(addr, gni_afd->a_addrlen, gni_afd->a_af, &h_error); #else hp = gethostbyaddr(addr, gni_afd->a_addrlen, gni_afd->a_af); - h_error = h_errno; #endif if (hp) { diff --git a/Modules/getpath.py b/Modules/getpath.py index 9aff19c0af7edd..dceeed7702c0be 100644 --- a/Modules/getpath.py +++ b/Modules/getpath.py @@ -228,6 +228,7 @@ def search_up(prefix, *landmarks, test=isfile): use_environment = config.get('use_environment', 1) pythonpath = config.get('module_search_paths') +pythonpath_was_set = config.get('module_search_paths_set') real_executable_dir = None stdlib_dir = None @@ -460,7 +461,8 @@ def search_up(prefix, *landmarks, test=isfile): build_prefix = None -if not home_was_set and real_executable_dir and not py_setpath: +if ((not home_was_set and real_executable_dir and not py_setpath) + or config.get('_is_python_build', 0) > 0): # Detect a build marker and use it to infer prefix, exec_prefix, # stdlib_dir and the platstdlib_dir directories. 
try: @@ -626,8 +628,8 @@ def search_up(prefix, *landmarks, test=isfile): config['module_search_paths'] = py_setpath.split(DELIM) config['module_search_paths_set'] = 1 -elif not pythonpath: - # If pythonpath was already set, we leave it alone. +elif not pythonpath_was_set: + # If pythonpath was already explicitly set or calculated, we leave it alone. # This won't matter in normal use, but if an embedded host is trying to # recalculate paths while running then we do not want to change it. pythonpath = [] diff --git a/Modules/main.c b/Modules/main.c index cca669bdbf4b84..90f237293b8565 100644 --- a/Modules/main.c +++ b/Modules/main.c @@ -479,12 +479,23 @@ pymain_run_interactive_hook(int *exitcode) } +static void +pymain_set_inspect(PyConfig *config, int inspect) +{ + config->inspect = inspect; +_Py_COMP_DIAG_PUSH +_Py_COMP_DIAG_IGNORE_DEPR_DECLS + Py_InspectFlag = inspect; +_Py_COMP_DIAG_POP +} + + static int pymain_run_stdin(PyConfig *config) { if (stdin_is_interactive(config)) { - config->inspect = 0; - Py_InspectFlag = 0; /* do exit on SystemExit */ + // do exit on SystemExit + pymain_set_inspect(config, 0); int exitcode; if (pymain_run_startup(config, &exitcode)) { @@ -517,16 +528,14 @@ pymain_repl(PyConfig *config, int *exitcode) /* Check this environment variable at the end, to give programs the opportunity to set it from Python. */ if (!config->inspect && _Py_GetEnv(config->use_environment, "PYTHONINSPECT")) { - config->inspect = 1; - Py_InspectFlag = 1; + pymain_set_inspect(config, 1); } if (!(config->inspect && stdin_is_interactive(config) && config_run_code(config))) { return; } - config->inspect = 0; - Py_InspectFlag = 0; + pymain_set_inspect(config, 0); if (pymain_run_interactive_hook(exitcode)) { return; } diff --git a/Modules/mathmodule.c b/Modules/mathmodule.c index aa93e756c606ba..48625c8c18d70e 100644 --- a/Modules/mathmodule.c +++ b/Modules/mathmodule.c @@ -55,13 +55,14 @@ raised for division by zero and mod by zero. #ifndef Py_BUILD_CORE_BUILTIN # define Py_BUILD_CORE_MODULE 1 #endif -#define NEEDS_PY_IDENTIFIER #include "Python.h" #include "pycore_bitutils.h" // _Py_bit_length() #include "pycore_call.h" // _PyObject_CallNoArgs() #include "pycore_dtoa.h" // _Py_dg_infinity() #include "pycore_long.h" // _PyLong_GetZero() +#include "pycore_moduleobject.h" // _PyModule_GetState() +#include "pycore_object.h" // _PyObject_LookupSpecial() #include "pycore_pymath.h" // _PY_SHORT_FLOAT_REPR /* For DBL_EPSILON in _math.h */ #include @@ -76,6 +77,20 @@ module math /*[clinic end generated code: output=da39a3ee5e6b4b0d input=76bc7002685dd942]*/ +typedef struct { + PyObject *str___ceil__; + PyObject *str___floor__; + PyObject *str___trunc__; +} math_module_state; + +static inline math_module_state* +get_math_module_state(PyObject *module) +{ + void *state = _PyModule_GetState(module); + assert(state != NULL); + return (math_module_state *)state; +} + /* sin(pi*x), giving accurate results for all finite x (especially x integral or close to an integer). 
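The mathmodule hunks starting above (and continuing below) move the __ceil__/__floor__/__trunc__ lookups from _Py_IDENTIFIER statics into per-module state holding interned strings. A minimal sketch of that pattern using the public PyModule_GetState(); the struct and function names are illustrative:

#include <Python.h>

typedef struct {
    PyObject *str___ceil__;
} demo_state;

static int
demo_exec(PyObject *module)
{
    demo_state *st = (demo_state *)PyModule_GetState(module);
    st->str___ceil__ = PyUnicode_InternFromString("__ceil__");
    return (st->str___ceil__ == NULL) ? -1 : 0;
}

static int
demo_clear(PyObject *module)
{
    demo_state *st = (demo_state *)PyModule_GetState(module);
    Py_CLEAR(st->str___ceil__);
    return 0;
}

/* The module definition then sets m_size = sizeof(demo_state),
 * m_clear = demo_clear, and an m_free that forwards to the clear
 * function, as the math module does below. */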
This is here for use in the @@ -1215,10 +1230,10 @@ static PyObject * math_ceil(PyObject *module, PyObject *number) /*[clinic end generated code: output=6c3b8a78bc201c67 input=2725352806399cab]*/ { - _Py_IDENTIFIER(__ceil__); if (!PyFloat_CheckExact(number)) { - PyObject *method = _PyObject_LookupSpecialId(number, &PyId___ceil__); + math_module_state *state = get_math_module_state(module); + PyObject *method = _PyObject_LookupSpecial(number, state->str___ceil__); if (method != NULL) { PyObject *result = _PyObject_CallNoArgs(method); Py_DECREF(method); @@ -1283,14 +1298,13 @@ math_floor(PyObject *module, PyObject *number) { double x; - _Py_IDENTIFIER(__floor__); - if (PyFloat_CheckExact(number)) { x = PyFloat_AS_DOUBLE(number); } else { - PyObject *method = _PyObject_LookupSpecialId(number, &PyId___floor__); + math_module_state *state = get_math_module_state(module); + PyObject *method = _PyObject_LookupSpecial(number, state->str___floor__); if (method != NULL) { PyObject *result = _PyObject_CallNoArgs(method); Py_DECREF(method); @@ -2156,7 +2170,6 @@ static PyObject * math_trunc(PyObject *module, PyObject *x) /*[clinic end generated code: output=34b9697b707e1031 input=2168b34e0a09134d]*/ { - _Py_IDENTIFIER(__trunc__); PyObject *trunc, *result; if (PyFloat_CheckExact(x)) { @@ -2168,7 +2181,8 @@ math_trunc(PyObject *module, PyObject *x) return NULL; } - trunc = _PyObject_LookupSpecialId(x, &PyId___trunc__); + math_module_state *state = get_math_module_state(module); + trunc = _PyObject_LookupSpecial(x, state->str___trunc__); if (trunc == NULL) { if (!PyErr_Occurred()) PyErr_Format(PyExc_TypeError, @@ -2312,7 +2326,7 @@ math_modf_impl(PyObject *module, double x) in that int is larger than PY_SSIZE_T_MAX. */ static PyObject* -loghelper(PyObject* arg, double (*func)(double), const char *funcname) +loghelper(PyObject* arg, double (*func)(double)) { /* If it is int, do it ourselves. 
*/ if (PyLong_Check(arg)) { @@ -2372,11 +2386,11 @@ math_log_impl(PyObject *module, PyObject *x, int group_right_1, PyObject *num, *den; PyObject *ans; - num = loghelper(x, m_log, "log"); + num = loghelper(x, m_log); if (num == NULL || base == NULL) return num; - den = loghelper(base, m_log, "log"); + den = loghelper(base, m_log); if (den == NULL) { Py_DECREF(num); return NULL; @@ -2402,7 +2416,7 @@ static PyObject * math_log2(PyObject *module, PyObject *x) /*[clinic end generated code: output=5425899a4d5d6acb input=08321262bae4f39b]*/ { - return loghelper(x, m_log2, "log2"); + return loghelper(x, m_log2); } @@ -2419,7 +2433,7 @@ static PyObject * math_log10(PyObject *module, PyObject *x) /*[clinic end generated code: output=be72a64617df9c6f input=b2469d02c6469e53]*/ { - return loghelper(x, m_log10, "log10"); + return loghelper(x, m_log10); } @@ -3825,6 +3839,20 @@ math_ulp_impl(PyObject *module, double x) static int math_exec(PyObject *module) { + + math_module_state *state = get_math_module_state(module); + state->str___ceil__ = PyUnicode_InternFromString("__ceil__"); + if (state->str___ceil__ == NULL) { + return -1; + } + state->str___floor__ = PyUnicode_InternFromString("__floor__"); + if (state->str___floor__ == NULL) { + return -1; + } + state->str___trunc__ = PyUnicode_InternFromString("__trunc__"); + if (state->str___trunc__ == NULL) { + return -1; + } if (PyModule_AddObject(module, "pi", PyFloat_FromDouble(Py_MATH_PI)) < 0) { return -1; } @@ -3846,6 +3874,22 @@ math_exec(PyObject *module) return 0; } +static int +math_clear(PyObject *module) +{ + math_module_state *state = get_math_module_state(module); + Py_CLEAR(state->str___ceil__); + Py_CLEAR(state->str___floor__); + Py_CLEAR(state->str___trunc__); + return 0; +} + +static void +math_free(void *module) +{ + math_clear((PyObject *)module); +} + static PyMethodDef math_methods[] = { {"acos", math_acos, METH_O, math_acos_doc}, {"acosh", math_acosh, METH_O, math_acosh_doc}, @@ -3918,9 +3962,11 @@ static struct PyModuleDef mathmodule = { PyModuleDef_HEAD_INIT, .m_name = "math", .m_doc = module_doc, - .m_size = 0, + .m_size = sizeof(math_module_state), .m_methods = math_methods, .m_slots = math_slots, + .m_clear = math_clear, + .m_free = math_free, }; PyMODINIT_FUNC diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c index 0a72aca8d51fa2..40229bce0f4033 100644 --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -3282,6 +3282,10 @@ os_chmod_impl(PyObject *module, path_t *path, int mode, int dir_fd, { #ifdef HAVE_CHMOD result = chmod(path->narrow, mode); +#elif defined(__wasi__) + // WASI SDK 15.0 does not support chmod. + // Ignore missing syscall for now. 
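For orientation, the caller-visible effect of this stub is that os.chmod() is accepted but does nothing on WASI builds; a small sketch (the temporary-file handling is illustrative only):

    import os
    import stat
    import tempfile

    fd, path = tempfile.mkstemp()
    os.close(fd)
    # On a WASI build the stubbed chmod() accepts this call and changes
    # nothing; on platforms with a real chmod(2) it applies the mode bits.
    os.chmod(path, stat.S_IRUSR | stat.S_IWUSR)
    os.remove(path)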
+ result = 0; #else result = -1; errno = ENOSYS; @@ -8275,11 +8279,7 @@ wait_helper(PyObject *module, pid_t pid, int status, struct rusage *ru) memset(ru, 0, sizeof(*ru)); } - PyObject *m = PyImport_ImportModule("resource"); - if (m == NULL) - return NULL; - struct_rusage = PyObject_GetAttr(m, get_posix_state(module)->struct_rusage); - Py_DECREF(m); + struct_rusage = _PyImport_GetModuleAttrString("resource", "struct_rusage"); if (struct_rusage == NULL) return NULL; @@ -15255,6 +15255,9 @@ all_ins(PyObject *m) #ifdef P_PIDFD if (PyModule_AddIntMacro(m, P_PIDFD)) return -1; #endif +#ifdef PIDFD_NONBLOCK + if (PyModule_AddIntMacro(m, PIDFD_NONBLOCK)) return -1; +#endif #endif #ifdef WEXITED if (PyModule_AddIntMacro(m, WEXITED)) return -1; diff --git a/Modules/pyexpat.c b/Modules/pyexpat.c index b8535dfe7115a1..678347331ef4dd 100644 --- a/Modules/pyexpat.c +++ b/Modules/pyexpat.c @@ -2,7 +2,6 @@ #include #include "structmember.h" // PyMemberDef -#include "frameobject.h" #include "expat.h" #include "pyexpat.h" diff --git a/Modules/signalmodule.c b/Modules/signalmodule.c index dfb1f923cdccf9..e3b37f179312d4 100644 --- a/Modules/signalmodule.c +++ b/Modules/signalmodule.c @@ -189,8 +189,8 @@ compare_handler(PyObject *func, PyObject *dfl_ign_handler) return PyObject_RichCompareBool(func, dfl_ign_handler, Py_EQ) == 1; } -#ifdef HAVE_GETITIMER -/* auxiliary functions for setitimer */ +#ifdef HAVE_SETITIMER +/* auxiliary function for setitimer */ static int timeval_from_double(PyObject *obj, struct timeval *tv) { @@ -206,7 +206,10 @@ timeval_from_double(PyObject *obj, struct timeval *tv) } return _PyTime_AsTimeval(t, tv, _PyTime_ROUND_CEILING); } +#endif +#if defined(HAVE_SETITIMER) || defined(HAVE_GETITIMER) +/* auxiliary functions for get/setitimer */ Py_LOCAL_INLINE(double) double_from_timeval(struct timeval *tv) { diff --git a/Modules/socketmodule.c b/Modules/socketmodule.c index f376513fead1b8..55e19ffab5cba4 100644 --- a/Modules/socketmodule.c +++ b/Modules/socketmodule.c @@ -271,6 +271,9 @@ shutdown(how) -- shut down traffic in one or both directions\n\ # include # endif +/* Helpers needed for AF_HYPERV */ +# include + /* Macros based on the IPPROTO enum, see: https://bugs.python.org/issue29515 */ #ifdef MS_WINDOWS #define IPPROTO_ICMP IPPROTO_ICMP @@ -1013,6 +1016,7 @@ init_sockobject(PySocketSockObject *s, } +#ifdef HAVE_SOCKETPAIR /* Create a new socket object. This just creates the object and initializes it. If the creation fails, return NULL and set an exception (implicit @@ -1032,6 +1036,7 @@ new_sockobject(SOCKET_T fd, int family, int type, int proto) } return s; } +#endif /* Lock to allow python interpreter to continue, but only allow one @@ -1579,6 +1584,35 @@ makesockaddr(SOCKET_T sockfd, struct sockaddr *addr, size_t addrlen, int proto) } #endif /* HAVE_SOCKADDR_ALG */ +#ifdef HAVE_AF_HYPERV + case AF_HYPERV: + { + SOCKADDR_HV *a = (SOCKADDR_HV *) addr; + + wchar_t *guidStr; + RPC_STATUS res = UuidToStringW(&a->VmId, &guidStr); + if (res != RPC_S_OK) { + PyErr_SetFromWindowsErr(res); + return 0; + } + PyObject *vmId = PyUnicode_FromWideChar(guidStr, -1); + res = RpcStringFreeW(&guidStr); + assert(res == RPC_S_OK); + + res = UuidToStringW(&a->ServiceId, &guidStr); + if (res != RPC_S_OK) { + Py_DECREF(vmId); + PyErr_SetFromWindowsErr(res); + return 0; + } + PyObject *serviceId = PyUnicode_FromWideChar(guidStr, -1); + res = RpcStringFreeW(&guidStr); + assert(res == RPC_S_OK); + + return Py_BuildValue("NN", vmId, serviceId); + } +#endif /* AF_HYPERV */ + /* More cases here... 
*/ default: @@ -2375,6 +2409,76 @@ getsockaddrarg(PySocketSockObject *s, PyObject *args, return 1; } #endif /* HAVE_SOCKADDR_ALG */ +#ifdef HAVE_AF_HYPERV + case AF_HYPERV: + { + switch (s->sock_proto) { + case HV_PROTOCOL_RAW: + { + PyObject *vm_id_obj = NULL; + PyObject *service_id_obj = NULL; + + SOCKADDR_HV *addr = &addrbuf->hv; + + memset(addr, 0, sizeof(*addr)); + addr->Family = AF_HYPERV; + + if (!PyTuple_Check(args)) { + PyErr_Format(PyExc_TypeError, + "%s(): AF_HYPERV address must be tuple, not %.500s", + caller, Py_TYPE(args)->tp_name); + return 0; + } + if (!PyArg_ParseTuple(args, + "UU;AF_HYPERV address must be a str tuple (vm_id, service_id)", + &vm_id_obj, &service_id_obj)) + { + return 0; + } + + wchar_t *guid_str = PyUnicode_AsWideCharString(vm_id_obj, NULL); + if (guid_str == NULL) { + PyErr_Format(PyExc_ValueError, + "%s(): AF_HYPERV address vm_id is not a valid UUID string", + caller); + return 0; + } + RPC_STATUS rc = UuidFromStringW(guid_str, &addr->VmId); + PyMem_Free(guid_str); + if (rc != RPC_S_OK) { + PyErr_Format(PyExc_ValueError, + "%s(): AF_HYPERV address vm_id is not a valid UUID string", + caller); + return 0; + } + + guid_str = PyUnicode_AsWideCharString(service_id_obj, NULL); + if (guid_str == NULL) { + PyErr_Format(PyExc_ValueError, + "%s(): AF_HYPERV address service_id is not a valid UUID string", + caller); + return 0; + } + rc = UuidFromStringW(guid_str, &addr->ServiceId); + PyMem_Free(guid_str); + if (rc != RPC_S_OK) { + PyErr_Format(PyExc_ValueError, + "%s(): AF_HYPERV address service_id is not a valid UUID string", + caller); + return 0; + } + + *len_ret = sizeof(*addr); + return 1; + } + default: + PyErr_Format(PyExc_OSError, + "%s(): unsupported AF_HYPERV protocol: %d", + caller, s->sock_proto); + return 0; + } + } +#endif /* HAVE_AF_HYPERV */ /* More cases here... */ @@ -2524,6 +2628,13 @@ getsockaddrlen(PySocketSockObject *s, socklen_t *len_ret) return 1; } #endif /* HAVE_SOCKADDR_ALG */ +#ifdef HAVE_AF_HYPERV + case AF_HYPERV: + { + *len_ret = sizeof (SOCKADDR_HV); + return 1; + } +#endif /* HAVE_AF_HYPERV */ /* More cases here... 
*/ @@ -7351,6 +7462,27 @@ PyInit__socket(void) /* Linux LLC */ PyModule_AddIntMacro(m, AF_LLC); #endif +#ifdef HAVE_AF_HYPERV + /* Hyper-V sockets */ + PyModule_AddIntMacro(m, AF_HYPERV); + + /* for proto */ + PyModule_AddIntMacro(m, HV_PROTOCOL_RAW); + + /* for setsockopt() */ + PyModule_AddIntMacro(m, HVSOCKET_CONNECT_TIMEOUT); + PyModule_AddIntMacro(m, HVSOCKET_CONNECT_TIMEOUT_MAX); + PyModule_AddIntMacro(m, HVSOCKET_CONNECTED_SUSPEND); + PyModule_AddIntMacro(m, HVSOCKET_ADDRESS_FLAG_PASSTHRU); + + /* for bind() or connect() */ + PyModule_AddStringConstant(m, "HV_GUID_ZERO", "00000000-0000-0000-0000-000000000000"); + PyModule_AddStringConstant(m, "HV_GUID_WILDCARD", "00000000-0000-0000-0000-000000000000"); + PyModule_AddStringConstant(m, "HV_GUID_BROADCAST", "FFFFFFFF-FFFF-FFFF-FFFF-FFFFFFFFFFFF"); + PyModule_AddStringConstant(m, "HV_GUID_CHILDREN", "90DB8B89-0D35-4F79-8CE9-49EA0AC8B7CD"); + PyModule_AddStringConstant(m, "HV_GUID_LOOPBACK", "E0E16197-DD56-4A10-9195-5EE7A155A838"); + PyModule_AddStringConstant(m, "HV_GUID_PARENT", "A42E7CDA-D03F-480C-9CC2-A4DE20ABB878"); +#endif /* HAVE_AF_HYPERV */ #ifdef USE_BLUETOOTH PyModule_AddIntMacro(m, AF_BLUETOOTH); diff --git a/Modules/socketmodule.h b/Modules/socketmodule.h index 1b35b11cdee6af..f31ba532a6c60d 100644 --- a/Modules/socketmodule.h +++ b/Modules/socketmodule.h @@ -76,6 +76,15 @@ struct SOCKADDR_BTH_REDEF { # else typedef int socklen_t; # endif /* IPPROTO_IPV6 */ + +/* Remove ifdef once Py_WINVER >= 0x0604 + * socket.h only defines AF_HYPERV if _WIN32_WINNT is at that level or higher + * so for now it's just manually defined. + */ +# ifndef AF_HYPERV +# define AF_HYPERV 34 +# endif +# include #endif /* MS_WINDOWS */ #ifdef HAVE_SYS_UN_H @@ -240,6 +249,11 @@ typedef int SOCKET_T; #define PyLong_AsSocket_t(fd) (SOCKET_T)PyLong_AsLongLong(fd) #endif +// AF_HYPERV is only supported on Windows +#if defined(AF_HYPERV) && defined(MS_WINDOWS) +# define HAVE_AF_HYPERV +#endif + /* Socket address */ typedef union sock_addr { struct sockaddr_in in; @@ -288,6 +302,9 @@ typedef union sock_addr { #ifdef HAVE_LINUX_TIPC_H struct sockaddr_tipc tipc; #endif +#ifdef HAVE_AF_HYPERV + SOCKADDR_HV hv; +#endif } sock_addr_t; /* The object holding a socket. 
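A hedged usage sketch of the AF_HYPERV support added above, assuming a Windows host with Hyper-V and a service actually listening (the service GUID below is a placeholder, not a registered service):

    import socket

    SERVICE_ID = "00000000-0000-0000-0000-000000000001"  # placeholder GUID

    with socket.socket(socket.AF_HYPERV, socket.SOCK_STREAM,
                       socket.HV_PROTOCOL_RAW) as s:
        # AF_HYPERV addresses are (vm_id, service_id) tuples of GUID strings,
        # mirroring the SOCKADDR_HV conversion in getsockaddrarg() above.
        s.connect((socket.HV_GUID_PARENT, SERVICE_ID))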
It holds some extra information, diff --git a/Modules/timemodule.c b/Modules/timemodule.c index 7475ef344b72b2..11c888af03e82d 100644 --- a/Modules/timemodule.c +++ b/Modules/timemodule.c @@ -910,14 +910,9 @@ is not present, current time as returned by localtime() is used.\n\ static PyObject * time_strptime(PyObject *self, PyObject *args) { - PyObject *module, *func, *result; + PyObject *func, *result; - module = PyImport_ImportModule("_strptime"); - if (!module) - return NULL; - - func = PyObject_GetAttr(module, &_Py_ID(_strptime_time)); - Py_DECREF(module); + func = _PyImport_GetModuleAttrString("_strptime", "_strptime_time"); if (!func) { return NULL; } @@ -1481,7 +1476,7 @@ _PyTime_GetThreadTimeWithInfo(_PyTime_t *tp, _Py_clock_info_t *info) #elif defined(HAVE_CLOCK_GETTIME) && \ defined(CLOCK_PROCESS_CPUTIME_ID) && \ - !defined(__EMSCRIPTEN__) + !defined(__EMSCRIPTEN__) && !defined(__wasi__) #define HAVE_THREAD_TIME #if defined(__APPLE__) && defined(__has_attribute) && __has_attribute(availability) diff --git a/Modules/xxlimited_35.c b/Modules/xxlimited_35.c index 647abf6721276c..8d29c71951768a 100644 --- a/Modules/xxlimited_35.c +++ b/Modules/xxlimited_35.c @@ -124,7 +124,7 @@ static PyType_Slot Xxo_Type_slots[] = { }; static PyType_Spec Xxo_Type_spec = { - "xxlimited.Xxo", + "xxlimited_35.Xxo", sizeof(XxoObject), 0, Py_TPFLAGS_DEFAULT | Py_TPFLAGS_HAVE_GC, @@ -189,7 +189,7 @@ static PyType_Slot Str_Type_slots[] = { }; static PyType_Spec Str_Type_spec = { - "xxlimited.Str", + "xxlimited_35.Str", 0, 0, Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE, @@ -212,7 +212,7 @@ static PyType_Slot Null_Type_slots[] = { }; static PyType_Spec Null_Type_spec = { - "xxlimited.Null", + "xxlimited_35.Null", 0, /* basicsize */ 0, /* itemsize */ Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE, @@ -248,40 +248,50 @@ xx_modexec(PyObject *m) Null_Type_slots[1].pfunc = PyType_GenericNew; Str_Type_slots[0].pfunc = &PyUnicode_Type; - Xxo_Type = PyType_FromSpec(&Xxo_Type_spec); - if (Xxo_Type == NULL) - goto fail; - /* Add some symbolic constants to the module */ if (ErrorObject == NULL) { - ErrorObject = PyErr_NewException("xxlimited.error", NULL, NULL); - if (ErrorObject == NULL) - goto fail; + ErrorObject = PyErr_NewException("xxlimited_35.error", NULL, NULL); + if (ErrorObject == NULL) { + return -1; + } } Py_INCREF(ErrorObject); - PyModule_AddObject(m, "error", ErrorObject); + if (PyModule_AddObject(m, "error", ErrorObject) < 0) { + Py_DECREF(ErrorObject); + return -1; + } /* Add Xxo */ - o = PyType_FromSpec(&Xxo_Type_spec); - if (o == NULL) - goto fail; - PyModule_AddObject(m, "Xxo", o); + Xxo_Type = PyType_FromSpec(&Xxo_Type_spec); + if (Xxo_Type == NULL) { + return -1; + } + if (PyModule_AddObject(m, "Xxo", Xxo_Type) < 0) { + Py_DECREF(Xxo_Type); + return -1; + } /* Add Str */ o = PyType_FromSpec(&Str_Type_spec); - if (o == NULL) - goto fail; - PyModule_AddObject(m, "Str", o); + if (o == NULL) { + return -1; + } + if (PyModule_AddObject(m, "Str", o) < 0) { + Py_DECREF(o); + return -1; + } /* Add Null */ o = PyType_FromSpec(&Null_Type_spec); - if (o == NULL) - goto fail; - PyModule_AddObject(m, "Null", o); + if (o == NULL) { + return -1; + } + if (PyModule_AddObject(m, "Null", o) < 0) { + Py_DECREF(o); + return -1; + } + return 0; - fail: - Py_XDECREF(m); - return -1; } diff --git a/Modules/xxmodule.c b/Modules/xxmodule.c index edcd62157c02f3..a6e5071d1d6303 100644 --- a/Modules/xxmodule.c +++ b/Modules/xxmodule.c @@ -358,31 +358,32 @@ xx_exec(PyObject *m) /* Finalize the type object including setting type of the 
new type * object; doing it here is required for portability, too. */ - if (PyType_Ready(&Xxo_Type) < 0) - goto fail; + if (PyType_Ready(&Xxo_Type) < 0) { + return -1; + } /* Add some symbolic constants to the module */ if (ErrorObject == NULL) { ErrorObject = PyErr_NewException("xx.error", NULL, NULL); - if (ErrorObject == NULL) - goto fail; + if (ErrorObject == NULL) { + return -1; + } + } + int rc = PyModule_AddType(m, (PyTypeObject *)ErrorObject); + Py_DECREF(ErrorObject); + if (rc < 0) { + return -1; } - Py_INCREF(ErrorObject); - PyModule_AddObject(m, "error", ErrorObject); - - /* Add Str */ - if (PyType_Ready(&Str_Type) < 0) - goto fail; - PyModule_AddObject(m, "Str", (PyObject *)&Str_Type); - - /* Add Null */ - if (PyType_Ready(&Null_Type) < 0) - goto fail; - PyModule_AddObject(m, "Null", (PyObject *)&Null_Type); + + /* Add Str and Null types */ + if (PyModule_AddType(m, &Str_Type) < 0) { + return -1; + } + if (PyModule_AddType(m, &Null_Type) < 0) { + return -1; + } + return 0; - fail: - Py_XDECREF(m); - return -1; } static struct PyModuleDef_Slot xx_slots[] = { diff --git a/Objects/abstract.c b/Objects/abstract.c index 93987c201b5e25..5d50491b2dd347 100644 --- a/Objects/abstract.c +++ b/Objects/abstract.c @@ -526,18 +526,12 @@ _Py_add_one_to_index_C(int nd, Py_ssize_t *index, const Py_ssize_t *shape) Py_ssize_t PyBuffer_SizeFromFormat(const char *format) { - PyObject *structmodule = NULL; PyObject *calcsize = NULL; PyObject *res = NULL; PyObject *fmt = NULL; Py_ssize_t itemsize = -1; - structmodule = PyImport_ImportModule("struct"); - if (structmodule == NULL) { - return itemsize; - } - - calcsize = PyObject_GetAttrString(structmodule, "calcsize"); + calcsize = _PyImport_GetModuleAttrString("struct", "calcsize"); if (calcsize == NULL) { goto done; } @@ -558,7 +552,6 @@ PyBuffer_SizeFromFormat(const char *format) } done: - Py_DECREF(structmodule); Py_XDECREF(calcsize); Py_XDECREF(fmt); Py_XDECREF(res); diff --git a/Objects/bytearrayobject.c b/Objects/bytearrayobject.c index b9436d9aa5b1a1..d0179626414874 100644 --- a/Objects/bytearrayobject.c +++ b/Objects/bytearrayobject.c @@ -1096,6 +1096,7 @@ bytearray_dealloc(PyByteArrayObject *self) #define STRINGLIB_ISSPACE Py_ISSPACE #define STRINGLIB_ISLINEBREAK(x) ((x == '\n') || (x == '\r')) #define STRINGLIB_CHECK_EXACT PyByteArray_CheckExact +#define STRINGLIB_FAST_MEMCHR memchr #define STRINGLIB_MUTABLE 1 #include "stringlib/fastsearch.h" diff --git a/Objects/bytes_methods.c b/Objects/bytes_methods.c index 994fb8a73c6cd3..6b8166385d375b 100644 --- a/Objects/bytes_methods.c +++ b/Objects/bytes_methods.c @@ -431,6 +431,7 @@ _Py_bytes_maketrans(Py_buffer *frm, Py_buffer *to) #define STRINGLIB(F) stringlib_##F #define STRINGLIB_CHAR char #define STRINGLIB_SIZEOF_CHAR 1 +#define STRINGLIB_FAST_MEMCHR memchr #include "stringlib/fastsearch.h" #include "stringlib/count.h" diff --git a/Objects/bytesobject.c b/Objects/bytesobject.c index 1dfa1d57cf65a1..80660881920fb7 100644 --- a/Objects/bytesobject.c +++ b/Objects/bytesobject.c @@ -377,11 +377,7 @@ PyBytes_FromFormat(const char *format, ...) 
PyObject* ret; va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif ret = PyBytes_FromFormatV(format, vargs); va_end(vargs); return ret; diff --git a/Objects/call.c b/Objects/call.c index 678d16269f2169..ed168c9c4796e2 100644 --- a/Objects/call.c +++ b/Objects/call.c @@ -1,11 +1,10 @@ #include "Python.h" #include "pycore_call.h" // _PyObject_CallNoArgsTstate() -#include "pycore_ceval.h" // _PyEval_EvalFrame() -#include "pycore_object.h" // _PyObject_GC_TRACK() +#include "pycore_ceval.h" // _Py_EnterRecursiveCallTstate() +#include "pycore_object.h" // _PyCFunctionWithKeywords_TrampolineCall() #include "pycore_pyerrors.h" // _PyErr_Occurred() #include "pycore_pystate.h" // _PyThreadState_GET() #include "pycore_tuple.h" // _PyTuple_ITEMS() -#include "frameobject.h" // _PyFrame_New_NoTrack() static PyObject *const * @@ -109,7 +108,9 @@ _Py_CheckSlotResult(PyObject *obj, const char *slot_name, int success) PyObject * PyObject_CallNoArgs(PyObject *func) { - return _PyObject_CallNoArgs(func); + EVAL_CALL_STAT_INC_IF_FUNCTION(EVAL_CALL_API, func); + PyThreadState *tstate = _PyThreadState_GET(); + return _PyObject_VectorcallTstate(tstate, func, NULL, 0, NULL); } @@ -322,7 +323,7 @@ _PyObject_Call(PyThreadState *tstate, PyObject *callable, assert(!_PyErr_Occurred(tstate)); assert(PyTuple_Check(args)); assert(kwargs == NULL || PyDict_Check(kwargs)); - + EVAL_CALL_STAT_INC_IF_FUNCTION(EVAL_CALL_API, callable); vectorcallfunc vector_func = _PyVectorcall_Function(callable); if (vector_func != NULL) { return _PyVectorcall_Call(tstate, vector_func, callable, args, kwargs); @@ -367,6 +368,7 @@ PyCFunction_Call(PyObject *callable, PyObject *args, PyObject *kwargs) PyObject * PyObject_CallOneArg(PyObject *func, PyObject *arg) { + EVAL_CALL_STAT_INC_IF_FUNCTION(EVAL_CALL_API, func); assert(arg != NULL); PyObject *_args[2]; PyObject **args = _args + 1; // For PY_VECTORCALL_ARGUMENTS_OFFSET @@ -389,6 +391,7 @@ _PyFunction_Vectorcall(PyObject *func, PyObject* const* stack, assert(nargs >= 0); PyThreadState *tstate = _PyThreadState_GET(); assert(nargs == 0 || stack != NULL); + EVAL_CALL_STAT_INC(EVAL_CALL_FUNCTION_VECTORCALL); if (((PyCodeObject *)f->func_code)->co_flags & CO_OPTIMIZED) { return _PyEval_Vector(tstate, f, NULL, stack, nargs, kwnames); } @@ -520,7 +523,7 @@ _PyObject_CallFunctionVa(PyThreadState *tstate, PyObject *callable, if (stack == NULL) { return NULL; } - + EVAL_CALL_STAT_INC_IF_FUNCTION(EVAL_CALL_API, callable); if (nargs == 1 && PyTuple_Check(stack[0])) { /* Special cases for backward compatibility: - PyObject_CallFunction(func, "O", tuple) calls func(*tuple) @@ -815,6 +818,11 @@ object_vacall(PyThreadState *tstate, PyObject *base, stack[i] = va_arg(vargs, PyObject *); } +#ifdef Py_STATS + if (PyFunction_Check(callable)) { + EVAL_CALL_STAT_INC(EVAL_CALL_API); + } +#endif /* Call the function */ result = _PyObject_VectorcallTstate(tstate, callable, stack, nargs, NULL); @@ -852,6 +860,7 @@ PyObject_VectorcallMethod(PyObject *name, PyObject *const *args, args++; nargsf--; } + EVAL_CALL_STAT_INC_IF_FUNCTION(EVAL_CALL_METHOD, callable); PyObject *result = _PyObject_VectorcallTstate(tstate, callable, args, nargsf, kwnames); Py_DECREF(callable); diff --git a/Objects/codeobject.c b/Objects/codeobject.c index c2b29be1fe8693..c38c51b45321c1 100644 --- a/Objects/codeobject.c +++ b/Objects/codeobject.c @@ -4,6 +4,7 @@ #include "opcode.h" #include "structmember.h" // PyMemberDef #include "pycore_code.h" // _PyCodeConstructor +#include "pycore_frame.h" 
// FRAME_SPECIALS_SIZE #include "pycore_interp.h" // PyInterpreterState.co_extra_freefuncs #include "pycore_opcode.h" // _PyOpcode_Deopt #include "pycore_pystate.h" // _PyInterpreterState_GET() @@ -327,6 +328,7 @@ init_code(PyCodeObject *co, struct _PyCodeConstructor *con) /* derived values */ co->co_nlocalsplus = nlocalsplus; co->co_nlocals = nlocals; + co->co_framesize = nlocalsplus + con->stacksize + FRAME_SPECIALS_SIZE; co->co_nplaincellvars = nplaincellvars; co->co_ncellvars = ncellvars; co->co_nfreevars = nfreevars; @@ -334,10 +336,19 @@ init_code(PyCodeObject *co, struct _PyCodeConstructor *con) /* not set */ co->co_weakreflist = NULL; co->co_extra = NULL; + co->_co_code = NULL; co->co_warmup = QUICKENING_INITIAL_WARMUP_VALUE; + co->_co_linearray_entry_size = 0; + co->_co_linearray = NULL; memcpy(_PyCode_CODE(co), PyBytes_AS_STRING(con->code), PyBytes_GET_SIZE(con->code)); + int entry_point = 0; + while (entry_point < Py_SIZE(co) && + _Py_OPCODE(_PyCode_CODE(co)[entry_point]) != RESUME) { + entry_point++; + } + co->_co_firsttraceable = entry_point; } static int @@ -694,6 +705,50 @@ PyCode_NewEmpty(const char *filename, const char *funcname, int firstlineno) lnotab_notes.txt for the details of the lnotab representation. */ +int +_PyCode_CreateLineArray(PyCodeObject *co) +{ + assert(co->_co_linearray == NULL); + PyCodeAddressRange bounds; + int size; + int max_line = 0; + _PyCode_InitAddressRange(co, &bounds); + while(_PyLineTable_NextAddressRange(&bounds)) { + if (bounds.ar_line > max_line) { + max_line = bounds.ar_line; + } + } + if (max_line < (1 << 15)) { + size = 2; + } + else { + size = 4; + } + co->_co_linearray = PyMem_Malloc(Py_SIZE(co)*size); + if (co->_co_linearray == NULL) { + PyErr_NoMemory(); + return -1; + } + co->_co_linearray_entry_size = size; + _PyCode_InitAddressRange(co, &bounds); + while(_PyLineTable_NextAddressRange(&bounds)) { + int start = bounds.ar_start / sizeof(_Py_CODEUNIT); + int end = bounds.ar_end / sizeof(_Py_CODEUNIT); + for (int index = start; index < end; index++) { + assert(index < (int)Py_SIZE(co)); + if (size == 2) { + assert(((int16_t)bounds.ar_line) == bounds.ar_line); + ((int16_t *)co->_co_linearray)[index] = bounds.ar_line; + } + else { + assert(size == 4); + ((int32_t *)co->_co_linearray)[index] = bounds.ar_line; + } + } + } + return 0; +} + int PyCode_Addr2Line(PyCodeObject *co, int addrq) { @@ -701,6 +756,9 @@ PyCode_Addr2Line(PyCodeObject *co, int addrq) return co->co_firstlineno; } assert(addrq >= 0 && addrq < _PyCode_NBYTES(co)); + if (co->_co_linearray) { + return _PyCode_LineNumberFromArray(co, addrq / sizeof(_Py_CODEUNIT)); + } PyCodeAddressRange bounds; _PyCode_InitAddressRange(co, &bounds); return _PyCode_CheckLineNumber(addrq, &bounds); @@ -768,6 +826,11 @@ next_code_delta(PyCodeAddressRange *bounds) static int previous_code_delta(PyCodeAddressRange *bounds) { + if (bounds->ar_start == 0) { + // If we are looking at the first entry, the + // "previous" entry has an implicit length of 1.
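The offset-to-line mapping that the new _co_linearray caches is the one already exposed to Python as co_lines(); a small orientation sketch:

    def f():
        x = 1
        return x

    # Each triple is (start_offset, end_offset, line); PyCode_Addr2Line()
    # answers "which line covers this bytecode offset?" against this mapping.
    for start, end, line in f.__code__.co_lines():
        print(start, end, line)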
+ return 1; + } const uint8_t *ptr = bounds->opaque.lo_next-1; while (((*ptr) & 128) == 0) { ptr--; @@ -811,7 +874,7 @@ static void retreat(PyCodeAddressRange *bounds) { ASSERT_VALID_BOUNDS(bounds); - assert(bounds->ar_start > 0); + assert(bounds->ar_start >= 0); do { bounds->opaque.lo_next--; } while (((*bounds->opaque.lo_next) & 128) == 0); @@ -1096,7 +1159,7 @@ lineiter_next(lineiterator *li) return result; } -static PyTypeObject LineIterator = { +PyTypeObject _PyLineIterator = { PyVarObject_HEAD_INIT(&PyType_Type, 0) "line_iterator", /* tp_name */ sizeof(lineiterator), /* tp_basicsize */ @@ -1142,7 +1205,7 @@ static PyTypeObject LineIterator = { static lineiterator * new_linesiterator(PyCodeObject *code) { - lineiterator *li = (lineiterator *)PyType_GenericAlloc(&LineIterator, 0); + lineiterator *li = (lineiterator *)PyType_GenericAlloc(&_PyLineIterator, 0); if (li == NULL) { return NULL; } @@ -1196,7 +1259,7 @@ positionsiter_next(positionsiterator* pi) _source_offset_converter, &pi->pi_endcolumn); } -static PyTypeObject PositionsIterator = { +PyTypeObject _PyPositionsIterator = { PyVarObject_HEAD_INIT(&PyType_Type, 0) "positions_iterator", /* tp_name */ sizeof(positionsiterator), /* tp_basicsize */ @@ -1242,7 +1305,7 @@ static PyTypeObject PositionsIterator = { static PyObject* code_positionsiterator(PyCodeObject* code, PyObject* Py_UNUSED(args)) { - positionsiterator* pi = (positionsiterator*)PyType_GenericAlloc(&PositionsIterator, 0); + positionsiterator* pi = (positionsiterator*)PyType_GenericAlloc(&_PyPositionsIterator, 0); if (pi == NULL) { return NULL; } @@ -1367,12 +1430,17 @@ deopt_code(_Py_CODEUNIT *instructions, Py_ssize_t len) PyObject * _PyCode_GetCode(PyCodeObject *co) { + if (co->_co_code != NULL) { + return Py_NewRef(co->_co_code); + } PyObject *code = PyBytes_FromStringAndSize((const char *)_PyCode_CODE(co), _PyCode_NBYTES(co)); if (code == NULL) { return NULL; } deopt_code((_Py_CODEUNIT *)PyBytes_AS_STRING(code), Py_SIZE(co)); + assert(co->_co_code == NULL); + co->_co_code = Py_NewRef(code); return code; } @@ -1531,9 +1599,13 @@ code_dealloc(PyCodeObject *co) Py_XDECREF(co->co_qualname); Py_XDECREF(co->co_linetable); Py_XDECREF(co->co_exceptiontable); + Py_XDECREF(co->_co_code); if (co->co_weakreflist != NULL) { PyObject_ClearWeakRefs((PyObject*)co); } + if (co->_co_linearray) { + PyMem_Free(co->_co_linearray); + } if (co->co_warmup == 0) { _Py_QuickenedCount--; } @@ -2085,11 +2157,16 @@ _PyStaticCode_Dealloc(PyCodeObject *co) deopt_code(_PyCode_CODE(co), Py_SIZE(co)); co->co_warmup = QUICKENING_INITIAL_WARMUP_VALUE; PyMem_Free(co->co_extra); + Py_CLEAR(co->_co_code); co->co_extra = NULL; if (co->co_weakreflist != NULL) { PyObject_ClearWeakRefs((PyObject *)co); co->co_weakreflist = NULL; } + if (co->_co_linearray) { + PyMem_Free(co->_co_linearray); + co->_co_linearray = NULL; + } } int diff --git a/Objects/descrobject.c b/Objects/descrobject.c index c3c541bf3c3212..8ef6a82d7c6d96 100644 --- a/Objects/descrobject.c +++ b/Objects/descrobject.c @@ -6,6 +6,7 @@ #include "pycore_pystate.h" // _PyThreadState_GET() #include "pycore_tuple.h" // _PyTuple_ITEMS() #include "structmember.h" // PyMemberDef +#include "pycore_descrobject.h" /*[clinic input] class mappingproxy "mappingproxyobject *" "&PyDictProxy_Type" @@ -1501,16 +1502,6 @@ class property(object): */ -typedef struct { - PyObject_HEAD - PyObject *prop_get; - PyObject *prop_set; - PyObject *prop_del; - PyObject *prop_doc; - PyObject *prop_name; - int getter_doc; -} propertyobject; - static PyObject * 
property_copy(PyObject *, PyObject *, PyObject *, PyObject *); @@ -1677,6 +1668,7 @@ property_descr_set(PyObject *self, PyObject *obj, PyObject *value) res = PyObject_CallOneArg(func, obj); } else { + EVAL_CALL_STAT_INC_IF_FUNCTION(EVAL_CALL_API, func); PyObject *args[] = { obj, value }; res = PyObject_Vectorcall(func, args, 2, NULL); } @@ -1781,38 +1773,55 @@ property_init_impl(propertyobject *self, PyObject *fget, PyObject *fset, Py_XINCREF(fget); Py_XINCREF(fset); Py_XINCREF(fdel); - Py_XINCREF(doc); Py_XSETREF(self->prop_get, fget); Py_XSETREF(self->prop_set, fset); Py_XSETREF(self->prop_del, fdel); - Py_XSETREF(self->prop_doc, doc); + Py_XSETREF(self->prop_doc, NULL); Py_XSETREF(self->prop_name, NULL); self->getter_doc = 0; + PyObject *prop_doc = NULL; + if (doc != NULL && doc != Py_None) { + prop_doc = doc; + Py_XINCREF(prop_doc); + } /* if no docstring given and the getter has one, use that one */ - if ((doc == NULL || doc == Py_None) && fget != NULL) { - PyObject *get_doc; - int rc = _PyObject_LookupAttr(fget, &_Py_ID(__doc__), &get_doc); + else if (fget != NULL) { + int rc = _PyObject_LookupAttr(fget, &_Py_ID(__doc__), &prop_doc); if (rc <= 0) { return rc; } - if (Py_IS_TYPE(self, &PyProperty_Type)) { - Py_XSETREF(self->prop_doc, get_doc); + if (prop_doc == Py_None) { + prop_doc = NULL; + Py_DECREF(Py_None); } - else { - /* If this is a property subclass, put __doc__ - in dict of the subclass instance instead, - otherwise it gets shadowed by __doc__ in the - class's dict. */ - int err = PyObject_SetAttr( - (PyObject *)self, &_Py_ID(__doc__), get_doc); - Py_DECREF(get_doc); - if (err < 0) - return -1; + if (prop_doc != NULL){ + self->getter_doc = 1; + } + } + + /* At this point `prop_doc` is either NULL or + a non-None object with incremented ref counter */ + + if (Py_IS_TYPE(self, &PyProperty_Type)) { + Py_XSETREF(self->prop_doc, prop_doc); + } else { + /* If this is a property subclass, put __doc__ + in dict of the subclass instance instead, + otherwise it gets shadowed by __doc__ in the + class's dict. */ + + if (prop_doc == NULL) { + prop_doc = Py_None; + Py_INCREF(prop_doc); } - self->getter_doc = 1; + int err = PyObject_SetAttr( + (PyObject *)self, &_Py_ID(__doc__), prop_doc); + Py_XDECREF(prop_doc); + if (err < 0) + return -1; } return 0; diff --git a/Objects/exceptions.c b/Objects/exceptions.c index cf8258b0e244bb..e212e02352efb3 100644 --- a/Objects/exceptions.c +++ b/Objects/exceptions.c @@ -3816,11 +3816,7 @@ _PyErr_TrySetFromCause(const char *format, ...) Py_DECREF(tb); } -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif msg_prefix = PyUnicode_FromFormatV(format, vargs); va_end(vargs); if (msg_prefix == NULL) { diff --git a/Objects/fileobject.c b/Objects/fileobject.c index 8dba5b9aea6db1..cbc57412134d3e 100644 --- a/Objects/fileobject.c +++ b/Objects/fileobject.c @@ -32,16 +32,16 @@ PyObject * PyFile_FromFd(int fd, const char *name, const char *mode, int buffering, const char *encoding, const char *errors, const char *newline, int closefd) { - PyObject *io, *stream; + PyObject *open, *stream; /* import _io in case we are being used to open io.py */ - io = PyImport_ImportModule("_io"); - if (io == NULL) + open = _PyImport_GetModuleAttrString("_io", "open"); + if (open == NULL) return NULL; - stream = _PyObject_CallMethod(io, &_Py_ID(open), "isisssO", fd, mode, + stream = PyObject_CallFunction(open, "isisssO", fd, mode, buffering, encoding, errors, newline, closefd ? 
Py_True : Py_False); - Py_DECREF(io); + Py_DECREF(open); if (stream == NULL) return NULL; /* ignore name attribute because the name attribute of _BufferedIOMixin @@ -490,7 +490,7 @@ PyFile_SetOpenCodeHook(Py_OpenCodeHookFunction hook, void *userData) { PyObject * PyFile_OpenCodeObject(PyObject *path) { - PyObject *iomod, *f = NULL; + PyObject *f = NULL; if (!PyUnicode_Check(path)) { PyErr_Format(PyExc_TypeError, "'path' must be 'str', not '%.200s'", @@ -502,10 +502,10 @@ PyFile_OpenCodeObject(PyObject *path) if (hook) { f = hook(path, _PyRuntime.open_code_userdata); } else { - iomod = PyImport_ImportModule("_io"); - if (iomod) { - f = _PyObject_CallMethod(iomod, &_Py_ID(open), "Os", path, "rb"); - Py_DECREF(iomod); + PyObject *open = _PyImport_GetModuleAttrString("_io", "open"); + if (open) { + f = PyObject_CallFunction(open, "Os", path, "rb"); + Py_DECREF(open); } } diff --git a/Objects/floatobject.c b/Objects/floatobject.c index 86861b7e28dceb..47d308b8eb3702 100644 --- a/Objects/floatobject.c +++ b/Objects/floatobject.c @@ -71,7 +71,7 @@ static PyStructSequence_Field floatinfo_fields[] = { {"min_exp", "DBL_MIN_EXP -- minimum int e such that radix**(e-1) " "is a normalized float"}, {"min_10_exp", "DBL_MIN_10_EXP -- minimum int e such that 10**e is " - "a normalized"}, + "a normalized float"}, {"dig", "DBL_DIG -- maximum number of decimal digits that " "can be faithfully represented in a float"}, {"mant_dig", "DBL_MANT_DIG -- mantissa digits"}, diff --git a/Objects/frameobject.c b/Objects/frameobject.c index 60f0f2f4edd388..9ff0443e4558f4 100644 --- a/Objects/frameobject.c +++ b/Objects/frameobject.c @@ -278,7 +278,7 @@ mark_stacks(PyCodeObject *code_obj, int len) { int64_t target_stack = pop_value(next_stack); stacks[i+1] = push_value(next_stack, Object); - j = get_arg(code, i) + i + 1; + j = get_arg(code, i) + 1 + INLINE_CACHE_ENTRIES_FOR_ITER + i; assert(j < len); assert(stacks[j] == UNINITIALIZED || stacks[j] == target_stack); stacks[j] = target_stack; @@ -418,7 +418,7 @@ static void frame_stack_pop(PyFrameObject *f) { PyObject *v = _PyFrame_StackPop(f->f_frame); - Py_DECREF(v); + Py_XDECREF(v); } static PyFrameState @@ -455,6 +455,34 @@ _PyFrame_GetState(PyFrameObject *frame) Py_UNREACHABLE(); } +static void +add_load_fast_null_checks(PyCodeObject *co) +{ + int changed = 0; + _Py_CODEUNIT *instructions = _PyCode_CODE(co); + for (Py_ssize_t i = 0; i < Py_SIZE(co); i++) { + switch (_Py_OPCODE(instructions[i])) { + case LOAD_FAST: + case LOAD_FAST__LOAD_FAST: + case LOAD_FAST__LOAD_CONST: + changed = 1; + _Py_SET_OPCODE(instructions[i], LOAD_FAST_CHECK); + break; + case LOAD_CONST__LOAD_FAST: + changed = 1; + _Py_SET_OPCODE(instructions[i], LOAD_CONST); + break; + case STORE_FAST__LOAD_FAST: + changed = 1; + _Py_SET_OPCODE(instructions[i], STORE_FAST); + break; + } + } + if (changed) { + // invalidate cached co_code object + Py_CLEAR(co->_co_code); + } +} /* Setter for f_lineno - you can set f_lineno from within a trace function in * order to jump to a given line of code, subject to some restrictions. Most @@ -545,6 +573,8 @@ frame_setlineno(PyFrameObject *f, PyObject* p_new_lineno, void *Py_UNUSED(ignore return -1; } + add_load_fast_null_checks(f->f_frame->f_code); + /* PyCode_NewWithPosOnlyArgs limits co_code to be under INT_MAX so this * should never overflow. 
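The f_lineno setter being modified here is what backs line jumps from a trace function (the same entry point pdb's jump command uses); a minimal sketch of driving it from Python, assuming a plain script context where co_firstlineno arithmetic is meaningful:

    import sys

    def f():
        print("one")
        print("two")    # the tracer jumps over this line
        print("three")

    FIRST = f.__code__.co_firstlineno

    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is f.__code__:
            if frame.f_lineno == FIRST + 2:   # about to run print("two")
                frame.f_lineno = FIRST + 3    # resume at print("three") instead
        return tracer

    sys.settrace(tracer)
    f()                                       # prints "one" then "three"
    sys.settrace(None)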
*/ int len = (int)Py_SIZE(f->f_frame->f_code); @@ -828,8 +858,9 @@ init_frame(_PyInterpreterFrame *frame, PyFunctionObject *func, PyObject *locals) { /* _PyFrame_InitializeSpecials consumes reference to func */ Py_INCREF(func); + Py_XINCREF(locals); PyCodeObject *code = (PyCodeObject *)func->func_code; - _PyFrame_InitializeSpecials(frame, func, locals, code->co_nlocalsplus); + _PyFrame_InitializeSpecials(frame, func, locals, code); for (Py_ssize_t i = 0; i < code->co_nlocalsplus; i++) { frame->localsplus[i] = NULL; } @@ -1047,6 +1078,7 @@ _PyFrame_LocalsToFast(_PyInterpreterFrame *frame, int clear) } fast = _PyFrame_GetLocalsArray(frame); co = frame->f_code; + bool added_null_checks = false; PyErr_Fetch(&error_type, &error_value, &error_traceback); for (int i = 0; i < co->co_nlocalsplus; i++) { @@ -1066,6 +1098,10 @@ _PyFrame_LocalsToFast(_PyInterpreterFrame *frame, int clear) } } PyObject *oldvalue = fast[i]; + if (!added_null_checks && oldvalue != NULL && value == NULL) { + add_load_fast_null_checks(co); + added_null_checks = true; + } PyObject *cell = NULL; if (kind == CO_FAST_FREE) { // The cell was set when the frame was created from diff --git a/Objects/genericaliasobject.c b/Objects/genericaliasobject.c index 39fd70999ecbe5..b2636d5475dbb9 100644 --- a/Objects/genericaliasobject.c +++ b/Objects/genericaliasobject.c @@ -269,7 +269,7 @@ _Py_make_parameters(PyObject *args) a non-empty tuple, return a new reference to obj. */ static PyObject * subs_tvars(PyObject *obj, PyObject *params, - PyObject **argitems, Py_ssize_t nargs, Py_ssize_t varparam) + PyObject **argitems, Py_ssize_t nargs) { PyObject *subparams; if (_PyObject_LookupAttr(obj, &_Py_ID(__parameters__), &subparams) < 0) { @@ -283,28 +283,28 @@ subs_tvars(PyObject *obj, PyObject *params, Py_DECREF(subparams); return NULL; } - for (Py_ssize_t i = 0, j = 0; i < nsubargs; ++i) { + Py_ssize_t j = 0; + for (Py_ssize_t i = 0; i < nsubargs; ++i) { PyObject *arg = PyTuple_GET_ITEM(subparams, i); Py_ssize_t iparam = tuple_index(params, nparams, arg); - if (iparam == varparam) { - j = tuple_extend(&subargs, j, - argitems + iparam, nargs - nparams + 1); - if (j < 0) { - return NULL; - } - } - else { - if (iparam >= 0) { - if (iparam > varparam) { - iparam += nargs - nsubargs; + if (iparam >= 0) { + PyObject *param = PyTuple_GET_ITEM(params, iparam); + arg = argitems[iparam]; + if (Py_TYPE(param)->tp_iter && PyTuple_Check(arg)) { // TypeVarTuple + j = tuple_extend(&subargs, j, + &PyTuple_GET_ITEM(arg, 0), + PyTuple_GET_SIZE(arg)); + if (j < 0) { + return NULL; } - arg = argitems[iparam]; + continue; } - Py_INCREF(arg); - PyTuple_SET_ITEM(subargs, j, arg); - j++; } + Py_INCREF(arg); + PyTuple_SET_ITEM(subargs, j, arg); + j++; } + assert(j == PyTuple_GET_SIZE(subargs)); obj = PyObject_GetItem(obj, subargs); @@ -409,39 +409,37 @@ _Py_subs_parameters(PyObject *self, PyObject *args, PyObject *parameters, PyObje self); } item = _unpack_args(item); - int is_tuple = PyTuple_Check(item); - Py_ssize_t nitems = is_tuple ? PyTuple_GET_SIZE(item) : 1; - PyObject **argitems = is_tuple ? 
&PyTuple_GET_ITEM(item, 0) : &item; - Py_ssize_t varparam = nparams; for (Py_ssize_t i = 0; i < nparams; i++) { PyObject *param = PyTuple_GET_ITEM(parameters, i); - if (Py_TYPE(param)->tp_iter) { // TypeVarTuple - if (varparam < nparams) { - Py_DECREF(item); - return PyErr_Format(PyExc_TypeError, - "More than one TypeVarTuple parameter in %S", - self); - } - varparam = i; - } - } - if (varparam < nparams) { - if (nitems < nparams - 1) { + PyObject *prepare, *tmp; + if (_PyObject_LookupAttr(param, &_Py_ID(__typing_prepare_subst__), &prepare) < 0) { Py_DECREF(item); - return PyErr_Format(PyExc_TypeError, - "Too few arguments for %R", - self); + return NULL; } - } - else { - if (nitems != nparams) { - Py_DECREF(item); - return PyErr_Format(PyExc_TypeError, - "Too %s arguments for %R; actual %zd, expected %zd", - nitems > nparams ? "many" : "few", - self, nitems, nparams); + if (prepare && prepare != Py_None) { + if (PyTuple_Check(item)) { + tmp = PyObject_CallFunction(prepare, "OO", self, item); + } + else { + tmp = PyObject_CallFunction(prepare, "O(O)", self, item); + } + Py_DECREF(prepare); + Py_SETREF(item, tmp); + if (item == NULL) { + return NULL; + } } } + int is_tuple = PyTuple_Check(item); + Py_ssize_t nitems = is_tuple ? PyTuple_GET_SIZE(item) : 1; + PyObject **argitems = is_tuple ? &PyTuple_GET_ITEM(item, 0) : &item; + if (nitems != nparams) { + Py_DECREF(item); + return PyErr_Format(PyExc_TypeError, + "Too %s arguments for %R; actual %zd, expected %zd", + nitems > nparams ? "many" : "few", + self, nitems, nparams); + } /* Replace all type variables (specified by parameters) with corresponding values specified by argitems. t = list[T]; t[int] -> newargs = [int] @@ -471,22 +469,11 @@ _Py_subs_parameters(PyObject *self, PyObject *args, PyObject *parameters, PyObje if (subst) { Py_ssize_t iparam = tuple_index(parameters, nparams, arg); assert(iparam >= 0); - if (iparam == varparam) { - Py_DECREF(subst); - Py_DECREF(newargs); - Py_DECREF(item); - PyErr_SetString(PyExc_TypeError, - "Substitution of bare TypeVarTuple is not supported"); - return NULL; - } - if (iparam > varparam) { - iparam += nitems - nparams; - } arg = PyObject_CallOneArg(subst, argitems[iparam]); Py_DECREF(subst); } else { - arg = subs_tvars(arg, parameters, argitems, nitems, varparam); + arg = subs_tvars(arg, parameters, argitems, nitems); } if (arg == NULL) { Py_DECREF(newargs); @@ -596,6 +583,7 @@ ga_vectorcall(PyObject *self, PyObject *const *args, } static const char* const attr_exceptions[] = { + "__class__", "__origin__", "__args__", "__unpacked__", diff --git a/Objects/genobject.c b/Objects/genobject.c index b9a0c30c411a00..2b45e28cbf16df 100644 --- a/Objects/genobject.c +++ b/Objects/genobject.c @@ -1,5 +1,7 @@ /* Generator object implementation */ +#define _PY_INTERPRETER + #include "Python.h" #include "pycore_call.h" // _PyObject_CallNoArgs() #include "pycore_ceval.h" // _PyEval_EvalFrame() @@ -11,6 +13,7 @@ #include "pycore_pystate.h" // _PyThreadState_GET() #include "structmember.h" // PyMemberDef #include "opcode.h" // SEND +#include "pystats.h" static PyObject *gen_close(PyGenObject *, PyObject *); static PyObject *async_gen_asend_new(PyAsyncGenObject *, PyObject *); @@ -216,6 +219,7 @@ gen_send_ex2(PyGenObject *gen, PyObject *arg, PyObject **presult, } gen->gi_frame_state = FRAME_EXECUTING; + EVAL_CALL_STAT_INC(EVAL_CALL_GENERATOR); result = _PyEval_EvalFrame(tstate, frame, exc); if (gen->gi_frame_state == FRAME_EXECUTING) { gen->gi_frame_state = FRAME_COMPLETED; diff --git a/Objects/listobject.c 
b/Objects/listobject.c index b50623ed73d940..83dfb7da01dfc3 100644 --- a/Objects/listobject.c +++ b/Objects/listobject.c @@ -3,7 +3,7 @@ #include "Python.h" #include "pycore_abstract.h" // _PyIndex_Check() #include "pycore_interp.h" // PyInterpreterState.list -#include "pycore_list.h" // struct _Py_list_state +#include "pycore_list.h" // struct _Py_list_state, _PyListIterObject #include "pycore_object.h" // _PyObject_GC_TRACK() #include "pycore_tuple.h" // _PyTuple_FromArray() #include @@ -94,6 +94,12 @@ list_preallocate_exact(PyListObject *self, Py_ssize_t size) assert(self->ob_item == NULL); assert(size > 0); + /* Since the Python memory allocator has granularity of 16 bytes on 64-bit + * platforms (8 on 32-bit), there is no benefit of allocating space for + * the odd number of items, and there is no drawback of rounding the + * allocated size up to the nearest even number. + */ + size = (size + 1) & ~(size_t)1; PyObject **items = PyMem_New(PyObject*, size); if (items == NULL) { PyErr_NoMemory(); @@ -3182,19 +3188,13 @@ PyTypeObject PyList_Type = { /*********************** List Iterator **************************/ -typedef struct { - PyObject_HEAD - Py_ssize_t it_index; - PyListObject *it_seq; /* Set to NULL when iterator is exhausted */ -} listiterobject; - -static void listiter_dealloc(listiterobject *); -static int listiter_traverse(listiterobject *, visitproc, void *); -static PyObject *listiter_next(listiterobject *); -static PyObject *listiter_len(listiterobject *, PyObject *); +static void listiter_dealloc(_PyListIterObject *); +static int listiter_traverse(_PyListIterObject *, visitproc, void *); +static PyObject *listiter_next(_PyListIterObject *); +static PyObject *listiter_len(_PyListIterObject *, PyObject *); static PyObject *listiter_reduce_general(void *_it, int forward); -static PyObject *listiter_reduce(listiterobject *, PyObject *); -static PyObject *listiter_setstate(listiterobject *, PyObject *state); +static PyObject *listiter_reduce(_PyListIterObject *, PyObject *); +static PyObject *listiter_setstate(_PyListIterObject *, PyObject *state); PyDoc_STRVAR(length_hint_doc, "Private method returning an estimate of len(list(it))."); PyDoc_STRVAR(reduce_doc, "Return state information for pickling."); @@ -3210,7 +3210,7 @@ static PyMethodDef listiter_methods[] = { PyTypeObject PyListIter_Type = { PyVarObject_HEAD_INIT(&PyType_Type, 0) "list_iterator", /* tp_name */ - sizeof(listiterobject), /* tp_basicsize */ + sizeof(_PyListIterObject), /* tp_basicsize */ 0, /* tp_itemsize */ /* methods */ (destructor)listiter_dealloc, /* tp_dealloc */ @@ -3244,13 +3244,13 @@ PyTypeObject PyListIter_Type = { static PyObject * list_iter(PyObject *seq) { - listiterobject *it; + _PyListIterObject *it; if (!PyList_Check(seq)) { PyErr_BadInternalCall(); return NULL; } - it = PyObject_GC_New(listiterobject, &PyListIter_Type); + it = PyObject_GC_New(_PyListIterObject, &PyListIter_Type); if (it == NULL) return NULL; it->it_index = 0; @@ -3261,7 +3261,7 @@ list_iter(PyObject *seq) } static void -listiter_dealloc(listiterobject *it) +listiter_dealloc(_PyListIterObject *it) { _PyObject_GC_UNTRACK(it); Py_XDECREF(it->it_seq); @@ -3269,14 +3269,14 @@ listiter_dealloc(listiterobject *it) } static int -listiter_traverse(listiterobject *it, visitproc visit, void *arg) +listiter_traverse(_PyListIterObject *it, visitproc visit, void *arg) { Py_VISIT(it->it_seq); return 0; } static PyObject * -listiter_next(listiterobject *it) +listiter_next(_PyListIterObject *it) { PyListObject *seq; PyObject *item; @@ 
-3300,7 +3300,7 @@ listiter_next(listiterobject *it) } static PyObject * -listiter_len(listiterobject *it, PyObject *Py_UNUSED(ignored)) +listiter_len(_PyListIterObject *it, PyObject *Py_UNUSED(ignored)) { Py_ssize_t len; if (it->it_seq) { @@ -3312,13 +3312,13 @@ listiter_len(listiterobject *it, PyObject *Py_UNUSED(ignored)) } static PyObject * -listiter_reduce(listiterobject *it, PyObject *Py_UNUSED(ignored)) +listiter_reduce(_PyListIterObject *it, PyObject *Py_UNUSED(ignored)) { return listiter_reduce_general(it, 1); } static PyObject * -listiter_setstate(listiterobject *it, PyObject *state) +listiter_setstate(_PyListIterObject *it, PyObject *state) { Py_ssize_t index = PyLong_AsSsize_t(state); if (index == -1 && PyErr_Occurred()) @@ -3493,7 +3493,7 @@ listiter_reduce_general(void *_it, int forward) /* the objects are not the same, index is of different types! */ if (forward) { - listiterobject *it = (listiterobject *)_it; + _PyListIterObject *it = (_PyListIterObject *)_it; if (it->it_seq) { return Py_BuildValue("N(O)n", _PyEval_GetBuiltin(&_Py_ID(iter)), it->it_seq, it->it_index); diff --git a/Objects/longobject.c b/Objects/longobject.c index 7f60866d2eb280..4a1e5caf5a2843 100644 --- a/Objects/longobject.c +++ b/Objects/longobject.c @@ -262,6 +262,38 @@ _PyLong_FromSTwoDigits(stwodigits x) return _PyLong_FromLarge(x); } +int +_PyLong_AssignValue(PyObject **target, Py_ssize_t value) +{ + PyObject *old = *target; + if (IS_SMALL_INT(value)) { + *target = get_small_int(Py_SAFE_DOWNCAST(value, Py_ssize_t, sdigit)); + Py_XDECREF(old); + return 0; + } + else if (old != NULL && PyLong_CheckExact(old) && + Py_REFCNT(old) == 1 && Py_SIZE(old) == 1 && + (size_t)value <= PyLong_MASK) + { + // Mutate in place if there are no other references the old + // object. This avoids an allocation in a common case. + // Since the primary use-case is iterating over ranges, which + // are typically positive, only do this optimization + // for positive integers (for now). + ((PyLongObject *)old)->ob_digit[0] = + Py_SAFE_DOWNCAST(value, Py_ssize_t, digit); + return 0; + } + else { + *target = PyLong_FromSsize_t(value); + Py_XDECREF(old); + if (*target == NULL) { + return -1; + } + return 0; + } +} + /* If a freshly-allocated int is already shared, it must be a small integer, so negating it must go to PyLong_FromLong */ Py_LOCAL_INLINE(void) @@ -4842,7 +4874,7 @@ long_lshift1(PyLongObject *a, Py_ssize_t wordshift, digit remshift) for (i = 0; i < wordshift; i++) z->ob_digit[i] = 0; accum = 0; - for (i = wordshift, j = 0; j < oldsize; i++, j++) { + for (j = 0; j < oldsize; i++, j++) { accum |= (twodigits)a->ob_digit[j] << remshift; z->ob_digit[i] = (digit)(accum & PyLong_MASK); accum >>= PyLong_SHIFT; diff --git a/Objects/memoryobject.c b/Objects/memoryobject.c index 45fe8985c2adb4..d29e35c2bc34de 100644 --- a/Objects/memoryobject.c +++ b/Objects/memoryobject.c @@ -193,6 +193,11 @@ PyTypeObject _PyManagedBuffer_Type = { return -1; \ } +/* See gh-92888. These macros signal that we need to check the memoryview + again due to possible read after frees. */ +#define CHECK_RELEASED_AGAIN(mv) CHECK_RELEASED(mv) +#define CHECK_RELEASED_INT_AGAIN(mv) CHECK_RELEASED_INT(mv) + #define CHECK_LIST_OR_TUPLE(v) \ if (!PyList_Check(v) && !PyTuple_Check(v)) { \ PyErr_SetString(PyExc_TypeError, \ @@ -381,8 +386,9 @@ copy_rec(const Py_ssize_t *shape, Py_ssize_t ndim, Py_ssize_t itemsize, /* Faster copying of one-dimensional arrays. 
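The CHECK_RELEASED_*_AGAIN macros introduced above exist because a conversion hook can release the view in the middle of an operation; a hedged sketch of that scenario (the Sneaky class is illustrative, not taken from the test suite):

    import array

    buf = array.array("d", [0.0])
    view = memoryview(buf)

    class Sneaky:
        def __float__(self):
            view.release()       # release the view while it is being written to
            return 1.0

    try:
        view[0] = Sneaky()       # re-checked after __float__(), so it fails cleanly
    except ValueError as exc:
        print(exc)               # the released view is rejected with ValueError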
*/ static int -copy_single(const Py_buffer *dest, const Py_buffer *src) +copy_single(PyMemoryViewObject *self, const Py_buffer *dest, const Py_buffer *src) { + CHECK_RELEASED_INT_AGAIN(self); char *mem = NULL; assert(dest->ndim == 1); @@ -1677,7 +1683,7 @@ pylong_as_zu(PyObject *item) module syntax. This function is very sensitive to small changes. With this layout gcc automatically generates a fast jump table. */ static inline PyObject * -unpack_single(const char *ptr, const char *fmt) +unpack_single(PyMemoryViewObject *self, const char *ptr, const char *fmt) { unsigned long long llu; unsigned long lu; @@ -1689,6 +1695,8 @@ unpack_single(const char *ptr, const char *fmt) unsigned char uc; void *p; + CHECK_RELEASED_AGAIN(self); + switch (fmt[0]) { /* signed integers and fast path for 'B' */ @@ -1767,7 +1775,7 @@ unpack_single(const char *ptr, const char *fmt) /* Pack a single item. 'fmt' can be any native format character in struct module syntax. */ static int -pack_single(char *ptr, PyObject *item, const char *fmt) +pack_single(PyMemoryViewObject *self, char *ptr, PyObject *item, const char *fmt) { unsigned long long llu; unsigned long lu; @@ -1784,6 +1792,7 @@ pack_single(char *ptr, PyObject *item, const char *fmt) ld = pylong_as_ld(item); if (ld == -1 && PyErr_Occurred()) goto err_occurred; + CHECK_RELEASED_INT_AGAIN(self); switch (fmt[0]) { case 'b': if (ld < SCHAR_MIN || ld > SCHAR_MAX) goto err_range; @@ -1804,6 +1813,7 @@ pack_single(char *ptr, PyObject *item, const char *fmt) lu = pylong_as_lu(item); if (lu == (unsigned long)-1 && PyErr_Occurred()) goto err_occurred; + CHECK_RELEASED_INT_AGAIN(self); switch (fmt[0]) { case 'B': if (lu > UCHAR_MAX) goto err_range; @@ -1824,12 +1834,14 @@ pack_single(char *ptr, PyObject *item, const char *fmt) lld = pylong_as_lld(item); if (lld == -1 && PyErr_Occurred()) goto err_occurred; + CHECK_RELEASED_INT_AGAIN(self); PACK_SINGLE(ptr, lld, long long); break; case 'Q': llu = pylong_as_llu(item); if (llu == (unsigned long long)-1 && PyErr_Occurred()) goto err_occurred; + CHECK_RELEASED_INT_AGAIN(self); PACK_SINGLE(ptr, llu, unsigned long long); break; @@ -1838,12 +1850,14 @@ pack_single(char *ptr, PyObject *item, const char *fmt) zd = pylong_as_zd(item); if (zd == -1 && PyErr_Occurred()) goto err_occurred; + CHECK_RELEASED_INT_AGAIN(self); PACK_SINGLE(ptr, zd, Py_ssize_t); break; case 'N': zu = pylong_as_zu(item); if (zu == (size_t)-1 && PyErr_Occurred()) goto err_occurred; + CHECK_RELEASED_INT_AGAIN(self); PACK_SINGLE(ptr, zu, size_t); break; @@ -1852,6 +1866,7 @@ pack_single(char *ptr, PyObject *item, const char *fmt) d = PyFloat_AsDouble(item); if (d == -1.0 && PyErr_Occurred()) goto err_occurred; + CHECK_RELEASED_INT_AGAIN(self); if (fmt[0] == 'f') { PACK_SINGLE(ptr, d, float); } @@ -1865,6 +1880,7 @@ pack_single(char *ptr, PyObject *item, const char *fmt) ld = PyObject_IsTrue(item); if (ld < 0) return -1; /* preserve original error */ + CHECK_RELEASED_INT_AGAIN(self); PACK_SINGLE(ptr, ld, _Bool); break; @@ -1882,6 +1898,7 @@ pack_single(char *ptr, PyObject *item, const char *fmt) p = PyLong_AsVoidPtr(item); if (p == NULL && PyErr_Occurred()) goto err_occurred; + CHECK_RELEASED_INT_AGAIN(self); PACK_SINGLE(ptr, p, void *); break; @@ -1950,18 +1967,12 @@ unpacker_free(struct unpacker *x) static struct unpacker * struct_get_unpacker(const char *fmt, Py_ssize_t itemsize) { - PyObject *structmodule; /* XXX cache these two */ - PyObject *Struct = NULL; /* XXX in globals? */ + PyObject *Struct = NULL; /* XXX cache it in globals? 
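As a rough Python-level analogy for _PyImport_GetModuleAttrString() as used above (get_module_attr is a hypothetical name, not a CPython API):

    from importlib import import_module

    def get_module_attr(modname, attrname):
        # Import the module if needed, then pull a single attribute from it,
        # which is the import-plus-getattr pattern the C helper replaces.
        return getattr(import_module(modname), attrname)

    Struct = get_module_attr("struct", "Struct")
    print(Struct("<d").size)  # -> 8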
*/ PyObject *structobj = NULL; PyObject *format = NULL; struct unpacker *x = NULL; - structmodule = PyImport_ImportModule("struct"); - if (structmodule == NULL) - return NULL; - - Struct = PyObject_GetAttrString(structmodule, "Struct"); - Py_DECREF(structmodule); + Struct = _PyImport_GetModuleAttrString("struct", "Struct"); if (Struct == NULL) return NULL; @@ -2048,7 +2059,7 @@ adjust_fmt(const Py_buffer *view) /* Base case for multi-dimensional unpacking. Assumption: ndim == 1. */ static PyObject * -tolist_base(const char *ptr, const Py_ssize_t *shape, +tolist_base(PyMemoryViewObject *self, const char *ptr, const Py_ssize_t *shape, const Py_ssize_t *strides, const Py_ssize_t *suboffsets, const char *fmt) { @@ -2061,7 +2072,7 @@ tolist_base(const char *ptr, const Py_ssize_t *shape, for (i = 0; i < shape[0]; ptr+=strides[0], i++) { const char *xptr = ADJUST_PTR(ptr, suboffsets, 0); - item = unpack_single(xptr, fmt); + item = unpack_single(self, xptr, fmt); if (item == NULL) { Py_DECREF(lst); return NULL; @@ -2075,7 +2086,7 @@ tolist_base(const char *ptr, const Py_ssize_t *shape, /* Unpack a multi-dimensional array into a nested list. Assumption: ndim >= 1. */ static PyObject * -tolist_rec(const char *ptr, Py_ssize_t ndim, const Py_ssize_t *shape, +tolist_rec(PyMemoryViewObject *self, const char *ptr, Py_ssize_t ndim, const Py_ssize_t *shape, const Py_ssize_t *strides, const Py_ssize_t *suboffsets, const char *fmt) { @@ -2087,7 +2098,7 @@ tolist_rec(const char *ptr, Py_ssize_t ndim, const Py_ssize_t *shape, assert(strides != NULL); if (ndim == 1) - return tolist_base(ptr, shape, strides, suboffsets, fmt); + return tolist_base(self, ptr, shape, strides, suboffsets, fmt); lst = PyList_New(shape[0]); if (lst == NULL) @@ -2095,7 +2106,7 @@ tolist_rec(const char *ptr, Py_ssize_t ndim, const Py_ssize_t *shape, for (i = 0; i < shape[0]; ptr+=strides[0], i++) { const char *xptr = ADJUST_PTR(ptr, suboffsets, 0); - item = tolist_rec(xptr, ndim-1, shape+1, + item = tolist_rec(self, xptr, ndim-1, shape+1, strides+1, suboffsets ? 
suboffsets+1 : NULL, fmt); if (item == NULL) { @@ -2129,15 +2140,15 @@ memoryview_tolist_impl(PyMemoryViewObject *self) if (fmt == NULL) return NULL; if (view->ndim == 0) { - return unpack_single(view->buf, fmt); + return unpack_single(self, view->buf, fmt); } else if (view->ndim == 1) { - return tolist_base(view->buf, view->shape, + return tolist_base(self, view->buf, view->shape, view->strides, view->suboffsets, fmt); } else { - return tolist_rec(view->buf, view->ndim, view->shape, + return tolist_rec(self, view->buf, view->ndim, view->shape, view->strides, view->suboffsets, fmt); } @@ -2345,7 +2356,7 @@ memory_item(PyMemoryViewObject *self, Py_ssize_t index) char *ptr = ptr_from_index(view, index); if (ptr == NULL) return NULL; - return unpack_single(ptr, fmt); + return unpack_single(self, ptr, fmt); } PyErr_SetString(PyExc_NotImplementedError, @@ -2376,7 +2387,7 @@ memory_item_multi(PyMemoryViewObject *self, PyObject *tup) ptr = ptr_from_tuple(view, tup); if (ptr == NULL) return NULL; - return unpack_single(ptr, fmt); + return unpack_single(self, ptr, fmt); } static inline int @@ -2463,7 +2474,7 @@ memory_subscript(PyMemoryViewObject *self, PyObject *key) const char *fmt = adjust_fmt(view); if (fmt == NULL) return NULL; - return unpack_single(view->buf, fmt); + return unpack_single(self, view->buf, fmt); } else if (key == Py_Ellipsis) { Py_INCREF(self); @@ -2538,7 +2549,7 @@ memory_ass_sub(PyMemoryViewObject *self, PyObject *key, PyObject *value) if (key == Py_Ellipsis || (PyTuple_Check(key) && PyTuple_GET_SIZE(key)==0)) { ptr = (char *)view->buf; - return pack_single(ptr, value, fmt); + return pack_single(self, ptr, value, fmt); } else { PyErr_SetString(PyExc_TypeError, @@ -2560,7 +2571,7 @@ memory_ass_sub(PyMemoryViewObject *self, PyObject *key, PyObject *value) ptr = ptr_from_index(view, index); if (ptr == NULL) return -1; - return pack_single(ptr, value, fmt); + return pack_single(self, ptr, value, fmt); } /* one-dimensional: fast path */ if (PySlice_Check(key) && view->ndim == 1) { @@ -2583,7 +2594,7 @@ memory_ass_sub(PyMemoryViewObject *self, PyObject *key, PyObject *value) goto end_block; dest.len = dest.shape[0] * dest.itemsize; - ret = copy_single(&dest, &src); + ret = copy_single(self, &dest, &src); end_block: PyBuffer_Release(&src); @@ -2599,7 +2610,7 @@ memory_ass_sub(PyMemoryViewObject *self, PyObject *key, PyObject *value) ptr = ptr_from_tuple(view, key); if (ptr == NULL) return -1; - return pack_single(ptr, value, fmt); + return pack_single(self, ptr, value, fmt); } if (PySlice_Check(key) || is_multislice(key)) { /* Call memory_subscript() to produce a sliced lvalue, then copy @@ -3156,7 +3167,7 @@ static PyMethodDef memory_methods[] = { /* Memoryview Iterator */ /**************************************************************************/ -static PyTypeObject PyMemoryIter_Type; +PyTypeObject _PyMemoryIter_Type; typedef struct { PyObject_HEAD @@ -3200,7 +3211,7 @@ memoryiter_next(memoryiterobject *it) if (ptr == NULL) { return NULL; } - return unpack_single(ptr, it->it_fmt); + return unpack_single(seq, ptr, it->it_fmt); } it->it_seq = NULL; @@ -3233,7 +3244,7 @@ memory_iter(PyObject *seq) } memoryiterobject *it; - it = PyObject_GC_New(memoryiterobject, &PyMemoryIter_Type); + it = PyObject_GC_New(memoryiterobject, &_PyMemoryIter_Type); if (it == NULL) { return NULL; } @@ -3246,7 +3257,7 @@ memory_iter(PyObject *seq) return (PyObject *)it; } -static PyTypeObject PyMemoryIter_Type = { +PyTypeObject _PyMemoryIter_Type = { PyVarObject_HEAD_INIT(&PyType_Type, 0) .tp_name = 
"memory_iterator", .tp_basicsize = sizeof(memoryiterobject), diff --git a/Objects/object.c b/Objects/object.c index 8339ab392fcaee..75aee909a3b496 100644 --- a/Objects/object.c +++ b/Objects/object.c @@ -16,7 +16,6 @@ #include "pycore_symtable.h" // PySTEntry_Type #include "pycore_typeobject.h" // _PyTypes_InitSlotDefs() #include "pycore_unionobject.h" // _PyUnion_Type -#include "frameobject.h" // PyFrame_Type #include "pycore_interpreteridobject.h" // _PyInterpreterID_Type #ifdef Py_LIMITED_API @@ -274,11 +273,8 @@ PyObject_Print(PyObject *op, FILE *fp, int flags) } else { if (Py_REFCNT(op) <= 0) { - /* XXX(twouters) cast refcount to long until %zd is - universally available */ Py_BEGIN_ALLOW_THREADS - fprintf(fp, "", - (long)Py_REFCNT(op), (void *)op); + fprintf(fp, "", Py_REFCNT(op), (void *)op); Py_END_ALLOW_THREADS } else { @@ -371,9 +367,7 @@ _PyObject_Dump(PyObject* op) /* first, write fields which are the least likely to crash */ fprintf(stderr, "object address : %p\n", (void *)op); - /* XXX(twouters) cast refcount to long until %zd is - universally available */ - fprintf(stderr, "object refcount : %ld\n", (long)Py_REFCNT(op)); + fprintf(stderr, "object refcount : %zd\n", Py_REFCNT(op)); fflush(stderr); PyTypeObject *type = Py_TYPE(op); @@ -1845,6 +1839,9 @@ _PyTypes_InitState(PyInterpreterState *interp) extern PyTypeObject PyHKEY_Type; #endif extern PyTypeObject _Py_GenericAliasIterType; +extern PyTypeObject _PyMemoryIter_Type; +extern PyTypeObject _PyLineIterator; +extern PyTypeObject _PyPositionsIterator; static PyTypeObject* static_types[] = { // The two most important base types: must be initialized first and @@ -1943,11 +1940,14 @@ static PyTypeObject* static_types[] = { &_PyHamt_CollisionNode_Type, &_PyHamt_Type, &_PyInterpreterID_Type, + &_PyLineIterator, &_PyManagedBuffer_Type, + &_PyMemoryIter_Type, &_PyMethodWrapper_Type, &_PyNamespace_Type, &_PyNone_Type, &_PyNotImplemented_Type, + &_PyPositionsIterator, &_PyUnicodeASCIIIter_Type, &_PyUnion_Type, &_PyWeakref_CallableProxyType, diff --git a/Objects/obmalloc.c b/Objects/obmalloc.c index 823855ca6d8e8c..78a6f01a0964ed 100644 --- a/Objects/obmalloc.c +++ b/Objects/obmalloc.c @@ -90,7 +90,7 @@ struct _PyTraceMalloc_Config _Py_tracemalloc_config = _PyTraceMalloc_Config_INIT static void * -_PyMem_RawMalloc(void *ctx, size_t size) +_PyMem_RawMalloc(void *Py_UNUSED(ctx), size_t size) { /* PyMem_RawMalloc(0) means malloc(1). Some systems would return NULL for malloc(0), which would be treated as an error. Some platforms would @@ -102,7 +102,7 @@ _PyMem_RawMalloc(void *ctx, size_t size) } static void * -_PyMem_RawCalloc(void *ctx, size_t nelem, size_t elsize) +_PyMem_RawCalloc(void *Py_UNUSED(ctx), size_t nelem, size_t elsize) { /* PyMem_RawCalloc(0, 0) means calloc(1, 1). Some systems would return NULL for calloc(0, 0), which would be treated as an error. 
Some platforms @@ -116,7 +116,7 @@ _PyMem_RawCalloc(void *ctx, size_t nelem, size_t elsize) } static void * -_PyMem_RawRealloc(void *ctx, void *ptr, size_t size) +_PyMem_RawRealloc(void *Py_UNUSED(ctx), void *ptr, size_t size) { if (size == 0) size = 1; @@ -124,7 +124,7 @@ _PyMem_RawRealloc(void *ctx, void *ptr, size_t size) } static void -_PyMem_RawFree(void *ctx, void *ptr) +_PyMem_RawFree(void *Py_UNUSED(ctx), void *ptr) { free(ptr); } @@ -132,21 +132,22 @@ _PyMem_RawFree(void *ctx, void *ptr) #ifdef MS_WINDOWS static void * -_PyObject_ArenaVirtualAlloc(void *ctx, size_t size) +_PyObject_ArenaVirtualAlloc(void *Py_UNUSED(ctx), size_t size) { return VirtualAlloc(NULL, size, MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE); } static void -_PyObject_ArenaVirtualFree(void *ctx, void *ptr, size_t size) +_PyObject_ArenaVirtualFree(void *Py_UNUSED(ctx), void *ptr, + size_t Py_UNUSED(size)) { VirtualFree(ptr, 0, MEM_RELEASE); } #elif defined(ARENAS_USE_MMAP) static void * -_PyObject_ArenaMmap(void *ctx, size_t size) +_PyObject_ArenaMmap(void *Py_UNUSED(ctx), size_t size) { void *ptr; ptr = mmap(NULL, size, PROT_READ|PROT_WRITE, @@ -158,20 +159,20 @@ _PyObject_ArenaMmap(void *ctx, size_t size) } static void -_PyObject_ArenaMunmap(void *ctx, void *ptr, size_t size) +_PyObject_ArenaMunmap(void *Py_UNUSED(ctx), void *ptr, size_t size) { munmap(ptr, size); } #else static void * -_PyObject_ArenaMalloc(void *ctx, size_t size) +_PyObject_ArenaMalloc(void *Py_UNUSED(ctx), size_t size) { return malloc(size); } static void -_PyObject_ArenaFree(void *ctx, void *ptr, size_t size) +_PyObject_ArenaFree(void *Py_UNUSED(ctx), void *ptr, size_t Py_UNUSED(size)) { free(ptr); } @@ -1684,7 +1685,7 @@ new_arena(void) pymalloc. When the radix tree is used, 'poolp' is unused. */ static bool -address_in_range(void *p, poolp pool) +address_in_range(void *p, poolp Py_UNUSED(pool)) { return arena_map_is_used(p); } @@ -1945,7 +1946,7 @@ allocate_from_new_pool(uint size) or when the max memory limit has been reached. */ static inline void* -pymalloc_alloc(void *ctx, size_t nbytes) +pymalloc_alloc(void *Py_UNUSED(ctx), size_t nbytes) { #ifdef WITH_VALGRIND if (UNLIKELY(running_on_valgrind == -1)) { @@ -2215,7 +2216,7 @@ insert_to_freepool(poolp pool) Return 1 if it was freed. Return 0 if the block was not allocated by pymalloc_alloc(). */ static inline int -pymalloc_free(void *ctx, void *p) +pymalloc_free(void *Py_UNUSED(ctx), void *p) { assert(p != NULL); diff --git a/Objects/rangeobject.c b/Objects/rangeobject.c index 5d583b2edf0e94..f0b06de1c7d2a0 100644 --- a/Objects/rangeobject.c +++ b/Objects/rangeobject.c @@ -2,6 +2,7 @@ #include "Python.h" #include "pycore_abstract.h" // _PyIndex_Check() +#include "pycore_range.h" #include "pycore_long.h" // _PyLong_GetZero() #include "pycore_tuple.h" // _PyTuple_ITEMS() #include "structmember.h" // PyMemberDef @@ -762,16 +763,8 @@ PyTypeObject PyRange_Type = { in the normal case, but possible for any numeric value. 
*/ -typedef struct { - PyObject_HEAD - long index; - long start; - long step; - long len; -} rangeiterobject; - static PyObject * -rangeiter_next(rangeiterobject *r) +rangeiter_next(_PyRangeIterObject *r) { if (r->index < r->len) /* cast to unsigned to avoid possible signed overflow @@ -782,7 +775,7 @@ rangeiter_next(rangeiterobject *r) } static PyObject * -rangeiter_len(rangeiterobject *r, PyObject *Py_UNUSED(ignored)) +rangeiter_len(_PyRangeIterObject *r, PyObject *Py_UNUSED(ignored)) { return PyLong_FromLong(r->len - r->index); } @@ -791,7 +784,7 @@ PyDoc_STRVAR(length_hint_doc, "Private method returning an estimate of len(list(it))."); static PyObject * -rangeiter_reduce(rangeiterobject *r, PyObject *Py_UNUSED(ignored)) +rangeiter_reduce(_PyRangeIterObject *r, PyObject *Py_UNUSED(ignored)) { PyObject *start=NULL, *stop=NULL, *step=NULL; PyObject *range; @@ -821,7 +814,7 @@ rangeiter_reduce(rangeiterobject *r, PyObject *Py_UNUSED(ignored)) } static PyObject * -rangeiter_setstate(rangeiterobject *r, PyObject *state) +rangeiter_setstate(_PyRangeIterObject *r, PyObject *state) { long index = PyLong_AsLong(state); if (index == -1 && PyErr_Occurred()) @@ -850,8 +843,8 @@ static PyMethodDef rangeiter_methods[] = { PyTypeObject PyRangeIter_Type = { PyVarObject_HEAD_INIT(&PyType_Type, 0) - "range_iterator", /* tp_name */ - sizeof(rangeiterobject), /* tp_basicsize */ + "range_iterator", /* tp_name */ + sizeof(_PyRangeIterObject), /* tp_basicsize */ 0, /* tp_itemsize */ /* methods */ (destructor)PyObject_Del, /* tp_dealloc */ @@ -915,7 +908,7 @@ get_len_of_range(long lo, long hi, long step) static PyObject * fast_range_iter(long start, long stop, long step, long len) { - rangeiterobject *it = PyObject_New(rangeiterobject, &PyRangeIter_Type); + _PyRangeIterObject *it = PyObject_New(_PyRangeIterObject, &PyRangeIter_Type); if (it == NULL) return NULL; it->start = start; diff --git a/Objects/setobject.c b/Objects/setobject.c index 4b6a8b8dfb679d..dd55a943010ab8 100644 --- a/Objects/setobject.c +++ b/Objects/setobject.c @@ -1735,13 +1735,13 @@ set_issubset(PySetObject *so, PyObject *other) int rv; if (!PyAnySet_Check(other)) { - PyObject *tmp, *result; - tmp = make_new_set(&PySet_Type, other); - if (tmp == NULL) + PyObject *tmp = set_intersection(so, other); + if (tmp == NULL) { return NULL; - result = set_issubset(so, tmp); + } + int result = (PySet_GET_SIZE(tmp) == PySet_GET_SIZE(so)); Py_DECREF(tmp); - return result; + return PyBool_FromLong(result); } if (PySet_GET_SIZE(so) > PySet_GET_SIZE(other)) Py_RETURN_FALSE; diff --git a/Objects/stringlib/asciilib.h b/Objects/stringlib/asciilib.h index eebe888e411e04..b3016bfbbb0dd2 100644 --- a/Objects/stringlib/asciilib.h +++ b/Objects/stringlib/asciilib.h @@ -21,6 +21,7 @@ #define STRINGLIB_CHECK PyUnicode_Check #define STRINGLIB_CHECK_EXACT PyUnicode_CheckExact #define STRINGLIB_MUTABLE 0 +#define STRINGLIB_FAST_MEMCHR memchr #define STRINGLIB_TOSTR PyObject_Str #define STRINGLIB_TOASCII PyObject_ASCII diff --git a/Objects/stringlib/fastsearch.h b/Objects/stringlib/fastsearch.h index b91082bd523cb6..2949d00ad223ee 100644 --- a/Objects/stringlib/fastsearch.h +++ b/Objects/stringlib/fastsearch.h @@ -39,7 +39,7 @@ #define STRINGLIB_BLOOM(mask, ch) \ ((mask & (1UL << ((ch) & (STRINGLIB_BLOOM_WIDTH -1))))) -#if STRINGLIB_SIZEOF_CHAR == 1 +#ifdef STRINGLIB_FAST_MEMCHR # define MEMCHR_CUT_OFF 15 #else # define MEMCHR_CUT_OFF 40 @@ -53,8 +53,8 @@ STRINGLIB(find_char)(const STRINGLIB_CHAR* s, Py_ssize_t n, STRINGLIB_CHAR ch) p = s; e = s + n; if (n > 
MEMCHR_CUT_OFF) { -#if STRINGLIB_SIZEOF_CHAR == 1 - p = memchr(s, ch, n); +#ifdef STRINGLIB_FAST_MEMCHR + p = STRINGLIB_FAST_MEMCHR(s, ch, n); if (p != NULL) return (p - s); return -1; @@ -102,16 +102,26 @@ STRINGLIB(find_char)(const STRINGLIB_CHAR* s, Py_ssize_t n, STRINGLIB_CHAR ch) return -1; } +#undef MEMCHR_CUT_OFF + +#if STRINGLIB_SIZEOF_CHAR == 1 +# define MEMRCHR_CUT_OFF 15 +#else +# define MEMRCHR_CUT_OFF 40 +#endif + + Py_LOCAL_INLINE(Py_ssize_t) STRINGLIB(rfind_char)(const STRINGLIB_CHAR* s, Py_ssize_t n, STRINGLIB_CHAR ch) { const STRINGLIB_CHAR *p; #ifdef HAVE_MEMRCHR - /* memrchr() is a GNU extension, available since glibc 2.1.91. - it doesn't seem as optimized as memchr(), but is still quite - faster than our hand-written loop below */ + /* memrchr() is a GNU extension, available since glibc 2.1.91. it + doesn't seem as optimized as memchr(), but is still quite + faster than our hand-written loop below. There is no wmemrchr + for 4-byte chars. */ - if (n > MEMCHR_CUT_OFF) { + if (n > MEMRCHR_CUT_OFF) { #if STRINGLIB_SIZEOF_CHAR == 1 p = memrchr(s, ch, n); if (p != NULL) @@ -139,11 +149,11 @@ STRINGLIB(rfind_char)(const STRINGLIB_CHAR* s, Py_ssize_t n, STRINGLIB_CHAR ch) if (*p == ch) return n; /* False positive */ - if (n1 - n > MEMCHR_CUT_OFF) + if (n1 - n > MEMRCHR_CUT_OFF) continue; - if (n <= MEMCHR_CUT_OFF) + if (n <= MEMRCHR_CUT_OFF) break; - s1 = p - MEMCHR_CUT_OFF; + s1 = p - MEMRCHR_CUT_OFF; while (p > s1) { p--; if (*p == ch) @@ -151,7 +161,7 @@ STRINGLIB(rfind_char)(const STRINGLIB_CHAR* s, Py_ssize_t n, STRINGLIB_CHAR ch) } n = p - s; } - while (n > MEMCHR_CUT_OFF); + while (n > MEMRCHR_CUT_OFF); } #endif } @@ -165,7 +175,7 @@ STRINGLIB(rfind_char)(const STRINGLIB_CHAR* s, Py_ssize_t n, STRINGLIB_CHAR ch) return -1; } -#undef MEMCHR_CUT_OFF +#undef MEMRCHR_CUT_OFF /* Change to a 1 to see logging comments walk through the algorithm. 
*/ #if 0 && STRINGLIB_SIZEOF_CHAR == 1 @@ -345,7 +355,7 @@ STRINGLIB(_preprocess)(const STRINGLIB_CHAR *needle, Py_ssize_t len_needle, } // Fill up a compressed Boyer-Moore "Bad Character" table Py_ssize_t not_found_shift = Py_MIN(len_needle, MAX_SHIFT); - for (Py_ssize_t i = 0; i < TABLE_SIZE; i++) { + for (Py_ssize_t i = 0; i < (Py_ssize_t)TABLE_SIZE; i++) { p->table[i] = Py_SAFE_DOWNCAST(not_found_shift, Py_ssize_t, SHIFT_TYPE); } diff --git a/Objects/stringlib/replace.h b/Objects/stringlib/replace.h index ef318ed6dd5736..123c9f850f5a0e 100644 --- a/Objects/stringlib/replace.h +++ b/Objects/stringlib/replace.h @@ -29,9 +29,9 @@ STRINGLIB(replace_1char_inplace)(STRINGLIB_CHAR* s, STRINGLIB_CHAR* end, if (!--attempts) { /* if u1 was not found for attempts iterations, use FASTSEARCH() or memchr() */ -#if STRINGLIB_SIZEOF_CHAR == 1 +#ifdef STRINGLIB_FAST_MEMCHR s++; - s = memchr(s, u1, end - s); + s = STRINGLIB_FAST_MEMCHR(s, u1, end - s); if (s == NULL) return; #else diff --git a/Objects/stringlib/stringdefs.h b/Objects/stringlib/stringdefs.h index 88641b25d47c6f..484b98b7291309 100644 --- a/Objects/stringlib/stringdefs.h +++ b/Objects/stringlib/stringdefs.h @@ -24,4 +24,5 @@ #define STRINGLIB_CHECK_EXACT PyBytes_CheckExact #define STRINGLIB_TOSTR PyObject_Str #define STRINGLIB_TOASCII PyObject_Repr +#define STRINGLIB_FAST_MEMCHR memchr #endif /* !STRINGLIB_STRINGDEFS_H */ diff --git a/Objects/stringlib/ucs1lib.h b/Objects/stringlib/ucs1lib.h index 026ab11f1f7b82..1b9b65ecbaadb3 100644 --- a/Objects/stringlib/ucs1lib.h +++ b/Objects/stringlib/ucs1lib.h @@ -20,6 +20,7 @@ #define STRINGLIB_NEW _PyUnicode_FromUCS1 #define STRINGLIB_CHECK PyUnicode_Check #define STRINGLIB_CHECK_EXACT PyUnicode_CheckExact +#define STRINGLIB_FAST_MEMCHR memchr #define STRINGLIB_MUTABLE 0 #define STRINGLIB_TOSTR PyObject_Str diff --git a/Objects/stringlib/ucs2lib.h b/Objects/stringlib/ucs2lib.h index 75f11bc290508a..4b49bbb31d7cbe 100644 --- a/Objects/stringlib/ucs2lib.h +++ b/Objects/stringlib/ucs2lib.h @@ -21,6 +21,10 @@ #define STRINGLIB_CHECK PyUnicode_Check #define STRINGLIB_CHECK_EXACT PyUnicode_CheckExact #define STRINGLIB_MUTABLE 0 +#if SIZEOF_WCHAR_T == 2 +#define STRINGLIB_FAST_MEMCHR(s, c, n) \ + (Py_UCS2 *)wmemchr((const wchar_t *)(s), c, n) +#endif #define STRINGLIB_TOSTR PyObject_Str #define STRINGLIB_TOASCII PyObject_ASCII diff --git a/Objects/stringlib/ucs4lib.h b/Objects/stringlib/ucs4lib.h index 57344f235b659d..def4ca5d17d504 100644 --- a/Objects/stringlib/ucs4lib.h +++ b/Objects/stringlib/ucs4lib.h @@ -21,6 +21,10 @@ #define STRINGLIB_CHECK PyUnicode_Check #define STRINGLIB_CHECK_EXACT PyUnicode_CheckExact #define STRINGLIB_MUTABLE 0 +#if SIZEOF_WCHAR_T == 4 +#define STRINGLIB_FAST_MEMCHR(s, c, n) \ + (Py_UCS4 *)wmemchr((const wchar_t *)(s), c, n) +#endif #define STRINGLIB_TOSTR PyObject_Str #define STRINGLIB_TOASCII PyObject_ASCII diff --git a/Objects/stringlib/undef.h b/Objects/stringlib/undef.h index bf32298505ed71..cc873a2ec4e022 100644 --- a/Objects/stringlib/undef.h +++ b/Objects/stringlib/undef.h @@ -8,3 +8,4 @@ #undef STRINGLIB_NEW #undef STRINGLIB_IS_UNICODE #undef STRINGLIB_MUTABLE +#undef STRINGLIB_FAST_MEMCHR diff --git a/Objects/typeobject.c b/Objects/typeobject.c index 1daf2b8d3b0ff8..5ebff6084f4a97 100644 --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -11,7 +11,6 @@ #include "pycore_pystate.h" // _PyThreadState_GET() #include "pycore_typeobject.h" // struct type_cache #include "pycore_unionobject.h" // _Py_union_type_or -#include "frameobject.h" // PyFrameObject 
#include "pycore_frame.h" // _PyInterpreterFrame #include "opcode.h" // MAKE_CELL #include "structmember.h" // PyMemberDef @@ -54,11 +53,6 @@ typedef struct PySlot_Offset { } PySlot_Offset; -/* bpo-40521: Interned strings are shared by all subinterpreters */ -#ifndef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS -# define INTERN_NAME_STRINGS -#endif - static PyObject * slot_tp_new(PyTypeObject *type, PyObject *args, PyObject *kwds); @@ -1653,6 +1647,7 @@ vectorcall_unbound(PyThreadState *tstate, int unbound, PyObject *func, args++; nargsf = nargsf - 1 + PY_VECTORCALL_ARGUMENTS_OFFSET; } + EVAL_CALL_STAT_INC_IF_FUNCTION(EVAL_CALL_SLOT, func); return _PyObject_VectorcallTstate(tstate, func, args, nargsf, NULL); } @@ -3365,29 +3360,125 @@ static const PySlot_Offset pyslot_offsets[] = { #include "typeslots.inc" }; -PyObject * -PyType_FromSpecWithBases(PyType_Spec *spec, PyObject *bases) +/* Given a PyType_FromMetaclass `bases` argument (NULL, type, or tuple of + * types), return a tuple of types. + */ +inline static PyObject * +get_bases_tuple(PyObject *bases_in, PyType_Spec *spec) +{ + if (!bases_in) { + /* Default: look in the spec, fall back to (type,). */ + PyTypeObject *base = &PyBaseObject_Type; // borrowed ref + PyObject *bases = NULL; // borrowed ref + const PyType_Slot *slot; + for (slot = spec->slots; slot->slot; slot++) { + switch (slot->slot) { + case Py_tp_base: + base = slot->pfunc; + break; + case Py_tp_bases: + bases = slot->pfunc; + break; + } + } + if (!bases) { + return PyTuple_Pack(1, base); + } + if (PyTuple_Check(bases)) { + return Py_NewRef(bases); + } + PyErr_SetString(PyExc_SystemError, "Py_tp_bases is not a tuple"); + return NULL; + } + if (PyTuple_Check(bases_in)) { + return Py_NewRef(bases_in); + } + // Not a tuple, should be a single type + return PyTuple_Pack(1, bases_in); +} + +static inline int +check_basicsize_includes_size_and_offsets(PyTypeObject* type) { - return PyType_FromModuleAndSpec(NULL, spec, bases); + if (type->tp_alloc != PyType_GenericAlloc) { + // Custom allocators can ignore tp_basicsize + return 1; + } + Py_ssize_t max = (Py_ssize_t)type->tp_basicsize; + + if (type->tp_base && type->tp_base->tp_basicsize > type->tp_basicsize) { + PyErr_Format(PyExc_TypeError, + "tp_basicsize for type '%s' (%d) is too small for base '%s' (%d)", + type->tp_name, type->tp_basicsize, + type->tp_base->tp_name, type->tp_base->tp_basicsize); + return 0; + } + if (type->tp_weaklistoffset + (Py_ssize_t)sizeof(PyObject*) > max) { + PyErr_Format(PyExc_TypeError, + "weaklist offset %d is out of bounds for type '%s' (tp_basicsize = %d)", + type->tp_weaklistoffset, + type->tp_name, type->tp_basicsize); + return 0; + } + if (type->tp_dictoffset + (Py_ssize_t)sizeof(PyObject*) > max) { + PyErr_Format(PyExc_TypeError, + "dict offset %d is out of bounds for type '%s' (tp_basicsize = %d)", + type->tp_dictoffset, + type->tp_name, type->tp_basicsize); + return 0; + } + if (type->tp_vectorcall_offset + (Py_ssize_t)sizeof(vectorcallfunc*) > max) { + PyErr_Format(PyExc_TypeError, + "vectorcall offset %d is out of bounds for type '%s' (tp_basicsize = %d)", + type->tp_vectorcall_offset, + type->tp_name, type->tp_basicsize); + return 0; + } + return 1; } PyObject * -PyType_FromModuleAndSpec(PyObject *module, PyType_Spec *spec, PyObject *bases) +PyType_FromMetaclass(PyTypeObject *metaclass, PyObject *module, + PyType_Spec *spec, PyObject *bases_in) { - PyHeapTypeObject *res; - PyObject *modname; - PyTypeObject *type, *base; + /* Invariant: A non-NULL value in one of these means this function holds 
+ * a strong reference or owns allocated memory. + * These get decrefed/freed/returned at the end, on both success and error. + */ + PyHeapTypeObject *res = NULL; + PyTypeObject *type; + PyObject *bases = NULL; + char *tp_doc = NULL; + PyObject *ht_name = NULL; + char *_ht_tpname = NULL; + int r; + /* Prepare slots that need special handling. + * Keep in mind that a slot can be given multiple times: + * if that would cause trouble (leaks, UB, ...), raise an exception. + */ + const PyType_Slot *slot; - Py_ssize_t nmembers, weaklistoffset, dictoffset, vectorcalloffset; + Py_ssize_t nmembers = 0; + Py_ssize_t weaklistoffset, dictoffset, vectorcalloffset; char *res_start; - short slot_offset, subslot_offset; nmembers = weaklistoffset = dictoffset = vectorcalloffset = 0; for (slot = spec->slots; slot->slot; slot++) { - if (slot->slot == Py_tp_members) { - nmembers = 0; + if (slot->slot < 0 + || (size_t)slot->slot >= Py_ARRAY_LENGTH(pyslot_offsets)) { + PyErr_SetString(PyExc_RuntimeError, "invalid slot offset"); + goto finally; + } + switch (slot->slot) { + case Py_tp_members: + if (nmembers != 0) { + PyErr_SetString( + PyExc_SystemError, + "Multiple Py_tp_members slots are not supported."); + goto finally; + } for (const PyMemberDef *memb = slot->pfunc; memb->name != NULL; memb++) { nmembers++; if (strcmp(memb->name, "__weaklistoffset__") == 0) { @@ -3409,25 +3500,41 @@ PyType_FromModuleAndSpec(PyObject *module, PyType_Spec *spec, PyObject *bases) vectorcalloffset = memb->offset; } } + break; + case Py_tp_doc: + /* For the docstring slot, which usually points to a static string + literal, we need to make a copy */ + if (tp_doc != NULL) { + PyErr_SetString( + PyExc_SystemError, + "Multiple Py_tp_doc slots are not supported."); + goto finally; + } + if (slot->pfunc == NULL) { + PyObject_Free(tp_doc); + tp_doc = NULL; + } + else { + size_t len = strlen(slot->pfunc)+1; + tp_doc = PyObject_Malloc(len); + if (tp_doc == NULL) { + PyErr_NoMemory(); + goto finally; + } + memcpy(tp_doc, slot->pfunc, len); + } + break; } } - res = (PyHeapTypeObject*)PyType_GenericAlloc(&PyType_Type, nmembers); - if (res == NULL) - return NULL; - res_start = (char*)res; + /* Prepare the type name and qualname */ if (spec->name == NULL) { PyErr_SetString(PyExc_SystemError, "Type spec does not define the name field."); - goto fail; + goto finally; } - type = &res->ht_type; - /* The flags must be initialized early, before the GC traverses us */ - type->tp_flags = spec->flags | Py_TPFLAGS_HEAPTYPE; - - /* Set the type name and qualname */ const char *s = strrchr(spec->name, '.'); if (s == NULL) { s = spec->name; @@ -3436,11 +3543,10 @@ PyType_FromModuleAndSpec(PyObject *module, PyType_Spec *spec, PyObject *bases) s++; } - res->ht_name = PyUnicode_FromString(s); - if (!res->ht_name) { - goto fail; + ht_name = PyUnicode_FromString(s); + if (!ht_name) { + goto finally; } - res->ht_qualname = Py_NewRef(res->ht_name); /* Copy spec->name to a buffer we own. * @@ -3452,119 +3558,132 @@ PyType_FromModuleAndSpec(PyObject *module, PyType_Spec *spec, PyObject *bases) * deallocated with the type (if it's non-NULL). 
*/ Py_ssize_t name_buf_len = strlen(spec->name) + 1; - res->_ht_tpname = PyMem_Malloc(name_buf_len); - if (res->_ht_tpname == NULL) { - goto fail; + _ht_tpname = PyMem_Malloc(name_buf_len); + if (_ht_tpname == NULL) { + goto finally; } - type->tp_name = memcpy(res->_ht_tpname, spec->name, name_buf_len); + memcpy(_ht_tpname, spec->name, name_buf_len); - res->ht_module = Py_XNewRef(module); - - /* Adjust for empty tuple bases */ + /* Get a tuple of bases. + * bases is a strong reference (unlike bases_in). + */ + bases = get_bases_tuple(bases_in, spec); if (!bases) { - base = &PyBaseObject_Type; - /* See whether Py_tp_base(s) was specified */ - for (slot = spec->slots; slot->slot; slot++) { - if (slot->slot == Py_tp_base) - base = slot->pfunc; - else if (slot->slot == Py_tp_bases) { - bases = slot->pfunc; - } - } - if (!bases) { - bases = PyTuple_Pack(1, base); - if (!bases) - goto fail; - } - else if (!PyTuple_Check(bases)) { - PyErr_SetString(PyExc_SystemError, "Py_tp_bases is not a tuple"); - goto fail; - } - else { - Py_INCREF(bases); - } + goto finally; } - else if (!PyTuple_Check(bases)) { - bases = PyTuple_Pack(1, bases); - if (!bases) - goto fail; + + /* Calculate the metaclass */ + + if (!metaclass) { + metaclass = &PyType_Type; } - else { - Py_INCREF(bases); + metaclass = _PyType_CalculateMetaclass(metaclass, bases); + if (metaclass == NULL) { + goto finally; + } + if (!PyType_Check(metaclass)) { + PyErr_Format(PyExc_TypeError, + "Metaclass '%R' is not a subclass of 'type'.", + metaclass); + goto finally; + } + if (metaclass->tp_new != PyType_Type.tp_new) { + PyErr_SetString(PyExc_TypeError, + "Metaclasses with custom tp_new are not supported."); + goto finally; } /* Calculate best base, and check that all bases are type objects */ - base = best_base(bases); + PyTypeObject *base = best_base(bases); // borrowed ref if (base == NULL) { - Py_DECREF(bases); - goto fail; - } - if (!_PyType_HasFeature(base, Py_TPFLAGS_BASETYPE)) { - PyErr_Format(PyExc_TypeError, - "type '%.100s' is not an acceptable base type", - base->tp_name); - Py_DECREF(bases); - goto fail; + goto finally; + } + // best_base should check Py_TPFLAGS_BASETYPE & raise a proper exception, + // here we just check its work + assert(_PyType_HasFeature(base, Py_TPFLAGS_BASETYPE)); + + /* Allocate the new type + * + * Between here and PyType_Ready, we should limit: + * - calls to Python code + * - raising exceptions + * - memory allocations + */ + + res = (PyHeapTypeObject*)metaclass->tp_alloc(metaclass, nmembers); + if (res == NULL) { + goto finally; } + res_start = (char*)res; + + type = &res->ht_type; + /* The flags must be initialized early, before the GC traverses us */ + type->tp_flags = spec->flags | Py_TPFLAGS_HEAPTYPE; + + res->ht_module = Py_XNewRef(module); /* Initialize essential fields */ + type->tp_as_async = &res->as_async; type->tp_as_number = &res->as_number; type->tp_as_sequence = &res->as_sequence; type->tp_as_mapping = &res->as_mapping; type->tp_as_buffer = &res->as_buffer; - /* Set tp_base and tp_bases */ + + /* Set slots we have prepared */ + + type->tp_base = (PyTypeObject *)Py_NewRef(base); type->tp_bases = bases; - Py_INCREF(base); - type->tp_base = base; + bases = NULL; // We give our reference to bases to the type + + type->tp_doc = tp_doc; + tp_doc = NULL; // Give ownership of the allocated memory to the type + + res->ht_qualname = Py_NewRef(ht_name); + res->ht_name = ht_name; + ht_name = NULL; // Give our reference to to the type + + type->tp_name = _ht_tpname; + res->_ht_tpname = _ht_tpname; + 
_ht_tpname = NULL; // Give ownership to to the type + + /* Copy the sizes */ type->tp_basicsize = spec->basicsize; type->tp_itemsize = spec->itemsize; + /* Copy all the ordinary slots */ + for (slot = spec->slots; slot->slot; slot++) { - if (slot->slot < 0 - || (size_t)slot->slot >= Py_ARRAY_LENGTH(pyslot_offsets)) { - PyErr_SetString(PyExc_RuntimeError, "invalid slot offset"); - goto fail; - } - else if (slot->slot == Py_tp_base || slot->slot == Py_tp_bases) { + switch (slot->slot) { + case Py_tp_base: + case Py_tp_bases: + case Py_tp_doc: /* Processed above */ - continue; - } - else if (slot->slot == Py_tp_doc) { - /* For the docstring slot, which usually points to a static string - literal, we need to make a copy */ - if (slot->pfunc == NULL) { - type->tp_doc = NULL; - continue; - } - size_t len = strlen(slot->pfunc)+1; - char *tp_doc = PyObject_Malloc(len); - if (tp_doc == NULL) { - type->tp_doc = NULL; - PyErr_NoMemory(); - goto fail; + break; + case Py_tp_members: + { + /* Move the slots to the heap type itself */ + size_t len = Py_TYPE(type)->tp_itemsize * nmembers; + memcpy(_PyHeapType_GET_MEMBERS(res), slot->pfunc, len); + type->tp_members = _PyHeapType_GET_MEMBERS(res); } - memcpy(tp_doc, slot->pfunc, len); - type->tp_doc = tp_doc; - } - else if (slot->slot == Py_tp_members) { - /* Move the slots to the heap type itself */ - size_t len = Py_TYPE(type)->tp_itemsize * nmembers; - memcpy(_PyHeapType_GET_MEMBERS(res), slot->pfunc, len); - type->tp_members = _PyHeapType_GET_MEMBERS(res); - } - else { - /* Copy other slots directly */ - PySlot_Offset slotoffsets = pyslot_offsets[slot->slot]; - slot_offset = slotoffsets.slot_offset; - if (slotoffsets.subslot_offset == -1) { - *(void**)((char*)res_start + slot_offset) = slot->pfunc; - } else { - void *parent_slot = *(void**)((char*)res_start + slot_offset); - subslot_offset = slotoffsets.subslot_offset; - *(void**)((char*)parent_slot + subslot_offset) = slot->pfunc; + break; + default: + { + /* Copy other slots directly */ + PySlot_Offset slotoffsets = pyslot_offsets[slot->slot]; + short slot_offset = slotoffsets.slot_offset; + if (slotoffsets.subslot_offset == -1) { + *(void**)((char*)res_start + slot_offset) = slot->pfunc; + } + else { + void *procs = *(void**)((char*)res_start + slot_offset); + short subslot_offset = slotoffsets.subslot_offset; + *(void**)((char*)procs + subslot_offset) = slot->pfunc; + } } + break; } } if (type->tp_dealloc == NULL) { @@ -3574,12 +3693,25 @@ PyType_FromModuleAndSpec(PyObject *module, PyType_Spec *spec, PyObject *bases) type->tp_dealloc = subtype_dealloc; } - if (vectorcalloffset) { - type->tp_vectorcall_offset = vectorcalloffset; + /* Set up offsets */ + + type->tp_vectorcall_offset = vectorcalloffset; + type->tp_weaklistoffset = weaklistoffset; + type->tp_dictoffset = dictoffset; + + /* Ready the type (which includes inheritance). + * + * After this call we should generally only touch up what's + * accessible to Python code, like __dict__. 
+ */ + + if (PyType_Ready(type) < 0) { + goto finally; } - if (PyType_Ready(type) < 0) - goto fail; + if (!check_basicsize_includes_size_and_offsets(type)) { + goto finally; + } if (type->tp_flags & Py_TPFLAGS_MANAGED_DICT) { res->ht_cached_keys = _PyDict_NewKeysForClass(); @@ -3587,62 +3719,83 @@ PyType_FromModuleAndSpec(PyObject *module, PyType_Spec *spec, PyObject *bases) if (type->tp_doc) { PyObject *__doc__ = PyUnicode_FromString(_PyType_DocWithoutSignature(type->tp_name, type->tp_doc)); - if (!__doc__) - goto fail; + if (!__doc__) { + goto finally; + } r = PyDict_SetItem(type->tp_dict, &_Py_ID(__doc__), __doc__); Py_DECREF(__doc__); - if (r < 0) - goto fail; + if (r < 0) { + goto finally; + } } if (weaklistoffset) { - type->tp_weaklistoffset = weaklistoffset; - if (PyDict_DelItemString((PyObject *)type->tp_dict, "__weaklistoffset__") < 0) - goto fail; + if (PyDict_DelItem((PyObject *)type->tp_dict, &_Py_ID(__weaklistoffset__)) < 0) { + goto finally; + } } if (dictoffset) { - type->tp_dictoffset = dictoffset; - if (PyDict_DelItemString((PyObject *)type->tp_dict, "__dictoffset__") < 0) - goto fail; + if (PyDict_DelItem((PyObject *)type->tp_dict, &_Py_ID(__dictoffset__)) < 0) { + goto finally; + } } /* Set type.__module__ */ r = PyDict_Contains(type->tp_dict, &_Py_ID(__module__)); if (r < 0) { - goto fail; + goto finally; } if (r == 0) { s = strrchr(spec->name, '.'); if (s != NULL) { - modname = PyUnicode_FromStringAndSize( + PyObject *modname = PyUnicode_FromStringAndSize( spec->name, (Py_ssize_t)(s - spec->name)); if (modname == NULL) { - goto fail; + goto finally; } r = PyDict_SetItem(type->tp_dict, &_Py_ID(__module__), modname); Py_DECREF(modname); - if (r != 0) - goto fail; - } else { + if (r != 0) { + goto finally; + } + } + else { if (PyErr_WarnFormat(PyExc_DeprecationWarning, 1, "builtin type %.200s has no __module__ attribute", spec->name)) - goto fail; + goto finally; } } assert(_PyType_CheckConsistency(type)); + + finally: + if (PyErr_Occurred()) { + Py_CLEAR(res); + } + Py_XDECREF(bases); + PyObject_Free(tp_doc); + Py_XDECREF(ht_name); + PyMem_Free(_ht_tpname); return (PyObject*)res; +} - fail: - Py_DECREF(res); - return NULL; +PyObject * +PyType_FromModuleAndSpec(PyObject *module, PyType_Spec *spec, PyObject *bases) +{ + return PyType_FromMetaclass(NULL, module, spec, bases); +} + +PyObject * +PyType_FromSpecWithBases(PyType_Spec *spec, PyObject *bases) +{ + return PyType_FromMetaclass(NULL, NULL, spec, bases); } PyObject * PyType_FromSpec(PyType_Spec *spec) { - return PyType_FromSpecWithBases(spec, NULL); + return PyType_FromMetaclass(NULL, NULL, spec, NULL); } PyObject * @@ -4009,7 +4162,7 @@ type_setattro(PyTypeObject *type, PyObject *name, PyObject *value) if (name == NULL) return -1; } -#ifdef INTERN_NAME_STRINGS + /* bpo-40521: Interned strings are shared by all subinterpreters */ if (!PyUnicode_CHECK_INTERNED(name)) { PyUnicode_InternInPlace(&name); if (!PyUnicode_CHECK_INTERNED(name)) { @@ -4019,7 +4172,6 @@ type_setattro(PyTypeObject *type, PyObject *name, PyObject *value) return -1; } } -#endif } else { /* Will fail in _PyObject_GenericSetAttrWithDict. 
*/ @@ -6518,8 +6670,11 @@ add_subclass(PyTypeObject *base, PyTypeObject *type) PyObject *subclasses = base->tp_subclasses; if (subclasses == NULL) { base->tp_subclasses = subclasses = PyDict_New(); - if (subclasses == NULL) + if (subclasses == NULL) { + Py_DECREF(key); + Py_DECREF(ref); return -1; + } } assert(PyDict_CheckExact(subclasses)); @@ -7029,7 +7184,7 @@ wrap_descr_get(PyObject *self, PyObject *args, void *wrapped) obj = NULL; if (type == Py_None) type = NULL; - if (type == NULL &&obj == NULL) { + if (type == NULL && obj == NULL) { PyErr_SetString(PyExc_TypeError, "__get__(None, None) is invalid"); return NULL; @@ -7629,10 +7784,17 @@ slot_tp_getattro(PyObject *self, PyObject *name) return vectorcall_method(&_Py_ID(__getattribute__), stack, 2); } -static PyObject * +static inline PyObject * call_attribute(PyObject *self, PyObject *attr, PyObject *name) { PyObject *res, *descr = NULL; + + if (_PyType_HasFeature(Py_TYPE(attr), Py_TPFLAGS_METHOD_DESCRIPTOR)) { + PyObject *args[] = { self, name }; + res = PyObject_Vectorcall(attr, args, 2, NULL); + return res; + } + descrgetfunc f = Py_TYPE(attr)->tp_descr_get; if (f != NULL) { @@ -8047,7 +8209,7 @@ static slotdef slotdefs[] = { TPSLOT("__next__", tp_iternext, slot_tp_iternext, wrap_next, "__next__($self, /)\n--\n\nImplement next(self)."), TPSLOT("__get__", tp_descr_get, slot_tp_descr_get, wrap_descr_get, - "__get__($self, instance, owner, /)\n--\n\nReturn an attribute of instance, which is of type owner."), + "__get__($self, instance, owner=None, /)\n--\n\nReturn an attribute of instance, which is of type owner."), TPSLOT("__set__", tp_descr_set, slot_tp_descr_set, wrap_descr_set, "__set__($self, instance, value, /)\n--\n\nSet an attribute of instance to value."), TPSLOT("__delete__", tp_descr_set, slot_tp_descr_set, @@ -8456,17 +8618,11 @@ _PyTypes_InitSlotDefs(void) for (slotdef *p = slotdefs; p->name; p++) { /* Slots must be ordered by their offset in the PyHeapTypeObject. */ assert(!p[1].name || p->offset <= p[1].offset); -#ifdef INTERN_NAME_STRINGS + /* bpo-40521: Interned strings are shared by all subinterpreters */ p->name_strobj = PyUnicode_InternFromString(p->name); if (!p->name_strobj || !PyUnicode_CHECK_INTERNED(p->name_strobj)) { return _PyStatus_NO_MEMORY(); } -#else - p->name_strobj = PyUnicode_FromString(p->name); - if (!p->name_strobj) { - return _PyStatus_NO_MEMORY(); - } -#endif } slotdefs_initialized = 1; return _PyStatus_OK(); @@ -8491,24 +8647,17 @@ update_slot(PyTypeObject *type, PyObject *name) int offset; assert(PyUnicode_CheckExact(name)); -#ifdef INTERN_NAME_STRINGS assert(PyUnicode_CHECK_INTERNED(name)); -#endif assert(slotdefs_initialized); pp = ptrs; for (p = slotdefs; p->name; p++) { assert(PyUnicode_CheckExact(p->name_strobj)); assert(PyUnicode_CheckExact(name)); -#ifdef INTERN_NAME_STRINGS + /* bpo-40521: Using interned strings. */ if (p->name_strobj == name) { *pp++ = p; } -#else - if (p->name_strobj == name || _PyUnicode_EQ(p->name_strobj, name)) { - *pp++ = p; - } -#endif } *pp = NULL; for (pp = ptrs; *pp; pp++) { diff --git a/Objects/unicodeobject.c b/Objects/unicodeobject.c index e9358290724832..d0e4bcd920e401 100644 --- a/Objects/unicodeobject.c +++ b/Objects/unicodeobject.c @@ -190,11 +190,6 @@ extern "C" { # define OVERALLOCATE_FACTOR 4 #endif -/* bpo-40521: Interned strings are shared by all interpreters. */ -#ifndef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS -# define INTERNED_STRINGS -#endif - /* This dictionary holds all interned unicode strings. 
Note that references to strings in this dictionary are *not* counted in the string's ob_refcnt. When the interned string reaches a refcnt of 0 the string deallocation @@ -203,9 +198,7 @@ extern "C" { Another way to look at this is that to say that the actual reference count of a string is: s->ob_refcnt + (s->state ? 2 : 0) */ -#ifdef INTERNED_STRINGS static PyObject *interned = NULL; -#endif /* Forward declaration */ static inline int @@ -454,7 +447,14 @@ unicode_check_encoding_errors(const char *encoding, const char *errors) return 0; } - if (encoding != NULL) { + if (encoding != NULL + // Fast path for the most common built-in encodings. Even if the codec + // is cached, _PyCodec_Lookup() decodes the bytes string from UTF-8 to + // create a temporary Unicode string (the key in the cache). + && strcmp(encoding, "utf-8") != 0 + && strcmp(encoding, "utf8") != 0 + && strcmp(encoding, "ascii") != 0) + { PyObject *handler = _PyCodec_Lookup(encoding); if (handler == NULL) { return -1; @@ -462,7 +462,14 @@ unicode_check_encoding_errors(const char *encoding, const char *errors) Py_DECREF(handler); } - if (errors != NULL) { + if (errors != NULL + // Fast path for the most common built-in error handlers. + && strcmp(errors, "strict") != 0 + && strcmp(errors, "ignore") != 0 + && strcmp(errors, "replace") != 0 + && strcmp(errors, "surrogateescape") != 0 + && strcmp(errors, "surrogatepass") != 0) + { PyObject *handler = PyCodec_LookupError(errors); if (handler == NULL) { return -1; @@ -1516,7 +1523,6 @@ unicode_dealloc(PyObject *unicode) } #endif -#ifdef INTERNED_STRINGS if (PyUnicode_CHECK_INTERNED(unicode)) { /* Revive the dead object temporarily. PyDict_DelItem() removes two references (key and value) which were ignored by @@ -1532,7 +1538,6 @@ unicode_dealloc(PyObject *unicode) assert(Py_REFCNT(unicode) == 1); Py_SET_REFCNT(unicode, 0); } -#endif if (_PyUnicode_HAS_UTF8_MEMORY(unicode)) { PyObject_Free(_PyUnicode_UTF8(unicode)); @@ -2694,11 +2699,7 @@ PyUnicode_FromFormat(const char *format, ...) PyObject* ret; va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif ret = PyUnicode_FromFormatV(format, vargs); va_end(vargs); return ret; @@ -10558,13 +10559,11 @@ _PyUnicode_EqualToASCIIId(PyObject *left, _Py_Identifier *right) if (PyUnicode_CHECK_INTERNED(left)) return 0; -#ifdef INTERNED_STRINGS assert(_PyUnicode_HASH(right_uni) != -1); Py_hash_t hash = _PyUnicode_HASH(left); if (hash != -1 && hash != _PyUnicode_HASH(right_uni)) { return 0; } -#endif return unicode_compare_eq(left, right_uni); } @@ -14635,7 +14634,6 @@ PyUnicode_InternInPlace(PyObject **p) return; } -#ifdef INTERNED_STRINGS if (interned == NULL) { interned = PyDict_New(); if (interned == NULL) { @@ -14661,11 +14659,6 @@ PyUnicode_InternInPlace(PyObject **p) this. */ Py_SET_REFCNT(s, Py_REFCNT(s) - 2); _PyUnicode_STATE(s).interned = 1; -#else - // PyDict expects that interned strings have their hash - // (PyASCIIObject.hash) already computed. - (void)unicode_hash(s); -#endif } // Function kept for the stable ABI. 
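A minimal sketch, not part of the patch itself, of how the PyType_FromMetaclass() API introduced in the Objects/typeobject.c hunks above might be used from an extension module. ExampleObject, example_slots, example_spec and the mod/base arguments are hypothetical names invented for this illustration:

#include <Python.h>

/* Hypothetical heap type; nothing below is defined by the patch. */
typedef struct {
    PyObject_HEAD
    int value;
} ExampleObject;

static PyType_Slot example_slots[] = {
    {Py_tp_doc, "Heap type created through PyType_FromMetaclass()."},
    {0, NULL},
};

static PyType_Spec example_spec = {
    .name = "example.Example",
    .basicsize = sizeof(ExampleObject),
    .flags = Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE,
    .slots = example_slots,
};

static PyObject *
make_example_type(PyObject *mod, PyObject *base)
{
    /* Passing NULL as the metaclass starts from &PyType_Type; the actual
     * metaclass is then calculated from the bases, mirroring what tp_new
     * does for classes defined in Python.  `base` may be NULL, a single
     * type, or a tuple of types (see get_bases_tuple() above), and
     * metaclasses with a custom tp_new are rejected. */
    return PyType_FromMetaclass(NULL, mod, &example_spec, base);
}

PyType_FromSpec(), PyType_FromSpecWithBases() and PyType_FromModuleAndSpec() are reimplemented in the same hunk as thin wrappers around PyType_FromMetaclass(), so all four entry points share the single cleanup path at the `finally` label.
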
diff --git a/Objects/weakrefobject.c b/Objects/weakrefobject.c index 1712533a39d802..2b4361e63ca413 100644 --- a/Objects/weakrefobject.c +++ b/Objects/weakrefobject.c @@ -371,7 +371,7 @@ static PyMethodDef weakref_methods[] = { PyTypeObject _PyWeakref_RefType = { PyVarObject_HEAD_INIT(&PyType_Type, 0) - .tp_name = "weakref", + .tp_name = "weakref.ReferenceType", .tp_basicsize = sizeof(PyWeakReference), .tp_dealloc = weakref_dealloc, .tp_vectorcall_offset = offsetof(PyWeakReference, vectorcall), @@ -719,7 +719,7 @@ static PyMappingMethods proxy_as_mapping = { PyTypeObject _PyWeakref_ProxyType = { PyVarObject_HEAD_INIT(&PyType_Type, 0) - "weakproxy", + "weakref.ProxyType", sizeof(PyWeakReference), 0, /* methods */ @@ -754,7 +754,7 @@ _PyWeakref_ProxyType = { PyTypeObject _PyWeakref_CallableProxyType = { PyVarObject_HEAD_INIT(&PyType_Type, 0) - "weakcallableproxy", + "weakref.CallableProxyType", sizeof(PyWeakReference), 0, /* methods */ diff --git a/PC/launcher2.c b/PC/launcher2.c index 35c932aa329c8e..ae11f4f024a904 100644 --- a/PC/launcher2.c +++ b/PC/launcher2.c @@ -386,6 +386,12 @@ typedef struct { int tagLength; // if true, treats 'tag' as a non-PEP 514 filter bool oldStyleTag; + // if true, ignores 'tag' when a high priority environment is found + // gh-92817: This is currently set when a tag is read from configuration or + // the environment, rather than the command line or a shebang line, and the + // only currently possible high priority environment is an active virtual + // environment + bool lowPriorityTag; // if true, we had an old-style tag with '-64' suffix, and so do not // want to match tags like '3.x-32' bool exclude32Bit; @@ -475,6 +481,7 @@ dumpSearchInfo(SearchInfo *search) DEBUG_2(company, companyLength); DEBUG_2(tag, tagLength); DEBUG_BOOL(oldStyleTag); + DEBUG_BOOL(lowPriorityTag); DEBUG_BOOL(exclude32Bit); DEBUG_BOOL(only32Bit); DEBUG_BOOL(allowDefaults); @@ -972,6 +979,9 @@ checkDefaults(SearchInfo *search) search->tagLength = n - (search->companyLength + 1); search->oldStyleTag = false; } + // gh-92817: allow a high priority env to be selected even if it + // doesn't match the tag + search->lowPriorityTag = true; } return 0; @@ -995,7 +1005,7 @@ typedef struct EnvironmentInfo { const wchar_t *executableArgs; const wchar_t *architecture; const wchar_t *displayName; - bool isActiveVenv; + bool highPriority; } EnvironmentInfo; @@ -1481,7 +1491,7 @@ virtualenvSearch(const SearchInfo *search, EnvironmentInfo **result) if (!env) { return RC_NO_MEMORY; } - env->isActiveVenv = true; + env->highPriority = true; env->internalSortKey = 20; exitCode = copyWstr(&env->displayName, L"Active venv"); if (exitCode) { @@ -1821,6 +1831,15 @@ _selectEnvironment(const SearchInfo *search, EnvironmentInfo *env, EnvironmentIn return 0; } + if (env->highPriority && search->lowPriorityTag) { + // This environment is marked high priority, and the search allows + // it to be selected even though a tag is specified, so select it + // gh-92817: this allows an active venv to be selected even when a + // default tag has been found in py.ini or the environment + *best = env; + return 0; + } + if (!search->oldStyleTag) { if (_companyMatches(search, env) && _tagMatches(search, env)) { // Because of how our sort tree is set up, we will walk up the diff --git a/PC/layout/main.py b/PC/layout/main.py index 8e69ecc2591214..923483ad4a3f71 100644 --- a/PC/layout/main.py +++ b/PC/layout/main.py @@ -8,11 +8,8 @@ __version__ = "3.8" import argparse -import functools import os -import re import shutil -import 
subprocess import sys import tempfile import zipfile diff --git a/PC/layout/support/appxmanifest.py b/PC/layout/support/appxmanifest.py index 427a36f31c8f9e..4850fad9b56d8b 100644 --- a/PC/layout/support/appxmanifest.py +++ b/PC/layout/support/appxmanifest.py @@ -6,13 +6,11 @@ __version__ = "3.8" -import collections import ctypes import io import os -import sys -from pathlib import Path, PureWindowsPath +from pathlib import PureWindowsPath from xml.etree import ElementTree as ET from .constants import * diff --git a/PC/layout/support/catalog.py b/PC/layout/support/catalog.py index 43121187ed180c..c9d87cb89484c2 100644 --- a/PC/layout/support/catalog.py +++ b/PC/layout/support/catalog.py @@ -6,8 +6,6 @@ __version__ = "3.8" -import sys - __all__ = ["PYTHON_CAT_NAME", "PYTHON_CDF_NAME"] diff --git a/PC/layout/support/constants.py b/PC/layout/support/constants.py index 6efd8bcd5cbb5e..8195c3dc30cdc7 100644 --- a/PC/layout/support/constants.py +++ b/PC/layout/support/constants.py @@ -6,7 +6,6 @@ __version__ = "3.8" import os -import re import struct import sys diff --git a/PC/pyconfig.h b/PC/pyconfig.h index 9dfe71bacabd14..87d55fa07e51e3 100644 --- a/PC/pyconfig.h +++ b/PC/pyconfig.h @@ -113,20 +113,30 @@ WIN32 is still required for the locale module. #define MS_WIN64 #endif -/* set the COMPILER */ +/* set the COMPILER and support tier + * + * win_amd64 MSVC (x86_64-pc-windows-msvc): 1 + * win32 MSVC (i686-pc-windows-msvc): 1 + * win_arm64 MSVC (aarch64-pc-windows-msvc): 3 + * other archs and ICC: 0 + */ #ifdef MS_WIN64 #if defined(_M_X64) || defined(_M_AMD64) #if defined(__INTEL_COMPILER) #define COMPILER ("[ICC v." _Py_STRINGIZE(__INTEL_COMPILER) " 64 bit (amd64) with MSC v." _Py_STRINGIZE(_MSC_VER) " CRT]") +#define PY_SUPPORT_TIER 0 #else #define COMPILER _Py_PASTE_VERSION("64 bit (AMD64)") +#define PY_SUPPORT_TIER 1 #endif /* __INTEL_COMPILER */ #define PYD_PLATFORM_TAG "win_amd64" #elif defined(_M_ARM64) #define COMPILER _Py_PASTE_VERSION("64 bit (ARM64)") +#define PY_SUPPORT_TIER 3 #define PYD_PLATFORM_TAG "win_arm64" #else #define COMPILER _Py_PASTE_VERSION("64 bit (Unknown)") +#define PY_SUPPORT_TIER 0 #endif #endif /* MS_WIN64 */ @@ -173,15 +183,19 @@ typedef _W64 int Py_ssize_t; #if defined(_M_IX86) #if defined(__INTEL_COMPILER) #define COMPILER ("[ICC v." _Py_STRINGIZE(__INTEL_COMPILER) " 32 bit (Intel) with MSC v." _Py_STRINGIZE(_MSC_VER) " CRT]") +#define PY_SUPPORT_TIER 0 #else #define COMPILER _Py_PASTE_VERSION("32 bit (Intel)") +#define PY_SUPPORT_TIER 1 #endif /* __INTEL_COMPILER */ #define PYD_PLATFORM_TAG "win32" #elif defined(_M_ARM) #define COMPILER _Py_PASTE_VERSION("32 bit (ARM)") #define PYD_PLATFORM_TAG "win_arm32" +#define PY_SUPPORT_TIER 0 #else #define COMPILER _Py_PASTE_VERSION("32 bit (Unknown)") +#define PY_SUPPORT_TIER 0 #endif #endif /* MS_WIN32 && !MS_WIN64 */ @@ -569,9 +583,6 @@ Py_NO_ENABLE_SHARED to find out. Also support MS_NO_COREDLL for b/w compat */ /* Define to 1 if you have the <signal.h> header file. */ #define HAVE_SIGNAL_H 1 -/* Define if you have the <stdarg.h> prototypes. */ -#define HAVE_STDARG_PROTOTYPES /* Define if you have the <stddef.h> header file. 
*/ #define HAVE_STDDEF_H 1 diff --git a/PC/python3dll.c b/PC/python3dll.c index 50e7a9607bec95..024ec49d68d797 100755 --- a/PC/python3dll.c +++ b/PC/python3dll.c @@ -599,6 +599,7 @@ EXPORT_FUNC(PyTuple_Pack) EXPORT_FUNC(PyTuple_SetItem) EXPORT_FUNC(PyTuple_Size) EXPORT_FUNC(PyType_ClearCache) +EXPORT_FUNC(PyType_FromMetaclass) EXPORT_FUNC(PyType_FromModuleAndSpec) EXPORT_FUNC(PyType_FromSpec) EXPORT_FUNC(PyType_FromSpecWithBases) diff --git a/PC/winsound.c b/PC/winsound.c index fd04e1e55b5fc8..65025ddc5e1f51 100644 --- a/PC/winsound.c +++ b/PC/winsound.c @@ -94,17 +94,25 @@ winsound_PlaySound_impl(PyObject *module, PyObject *sound, int flags) return NULL; } wsound = (wchar_t *)view.buf; + } else if (PyBytes_Check(sound)) { + PyErr_Format(PyExc_TypeError, + "'sound' must be str, os.PathLike, or None, not '%s'", + Py_TYPE(sound)->tp_name); + return NULL; } else { - if (!PyUnicode_Check(sound)) { + PyObject *obj = PyOS_FSPath(sound); + // Either is unicode/bytes/NULL, or a helpful message + // has been surfaced to the user about how they gave a non-path. + if (obj == NULL) return NULL; + if (PyBytes_Check(obj)) { PyErr_Format(PyExc_TypeError, - "'sound' must be str or None, not '%s'", - Py_TYPE(sound)->tp_name); - return NULL; - } - wsound = PyUnicode_AsWideCharString(sound, NULL); - if (wsound == NULL) { + "'sound' must resolve to str, not bytes"); + Py_DECREF(obj); return NULL; } + wsound = PyUnicode_AsWideCharString(obj, NULL); + Py_DECREF(obj); + if (wsound == NULL) return NULL; } diff --git a/PCbuild/_socket.vcxproj b/PCbuild/_socket.vcxproj index 8fd75f90e7ee1e..78fa4d6729abb9 100644 --- a/PCbuild/_socket.vcxproj +++ b/PCbuild/_socket.vcxproj @@ -93,7 +93,7 @@ - ws2_32.lib;iphlpapi.lib;%(AdditionalDependencies) + ws2_32.lib;iphlpapi.lib;Rpcrt4.lib;%(AdditionalDependencies) diff --git a/PCbuild/lib.pyproj b/PCbuild/lib.pyproj index 43c570f1dab37a..0556efe1a77d2b 100644 --- a/PCbuild/lib.pyproj +++ b/PCbuild/lib.pyproj @@ -83,59 +83,6 @@ - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - @@ -614,33 +561,6 @@ - - - - - - - - - - - - - - - - - - - - - - - - - - - @@ -944,7 +864,59 @@ - + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + @@ -1158,7 +1130,33 @@ - + + + + + + + + + + + + + + + + + + + + + + + + + + + @@ -1325,7 +1323,17 @@ - + + + + + + + + + + + @@ -1341,8 +1349,11 @@ - + + + + @@ -1355,7 +1366,33 @@ - + + + + + + + + + + + + + + + + + + + + + + + + + + + @@ -1434,22 +1471,6 @@ - - - - - - - - - - - - - - - - @@ -1492,33 +1513,6 @@ - - - - - - - - - - - - - - - - - - - - - - - - - - - @@ -1725,7 +1719,6 @@ - @@ -1745,10 +1738,6 @@ - - - - @@ -1769,6 +1758,7 @@ + @@ -1803,19 +1793,22 @@ + + + + + + + + - - - - - diff --git a/PCbuild/pyproject.props b/PCbuild/pyproject.props index 8d24393aa1861e..df3efc631d154f 100644 --- a/PCbuild/pyproject.props +++ b/PCbuild/pyproject.props @@ -18,6 +18,7 @@ true false false + false diff --git a/PCbuild/pythoncore.vcxproj b/PCbuild/pythoncore.vcxproj index a35884b3c35887..09acce9d0c894a 100644 --- a/PCbuild/pythoncore.vcxproj +++ b/PCbuild/pythoncore.vcxproj @@ -167,6 +167,7 @@ + @@ -207,6 +208,7 @@ + @@ -236,6 +238,7 @@ + @@ -280,6 +283,7 @@ + diff --git a/PCbuild/pythoncore.vcxproj.filters b/PCbuild/pythoncore.vcxproj.filters index ff42cc92c4bd23..4d91a7ad0b3e75 100644 --- a/PCbuild/pythoncore.vcxproj.filters +++ b/PCbuild/pythoncore.vcxproj.filters @@ -174,6 +174,9 @@ Include + + Include + Include @@ -432,6 +435,9 @@ 
Include\cpython + + Include\cpython + Include\cpython @@ -525,6 +531,9 @@ Include\internal + + Include\internal + Include\internal @@ -609,6 +618,9 @@ Include\internal + + Include\internal + Include\internal diff --git a/Parser/asdl_c.py b/Parser/asdl_c.py index 3bfe320fe3b70e..bf391a3ae16533 100755 --- a/Parser/asdl_c.py +++ b/Parser/asdl_c.py @@ -1,7 +1,6 @@ #! /usr/bin/env python """Generate C code from an ASDL description.""" -import os import sys import textwrap import types @@ -488,6 +487,12 @@ def visitProduct(self, prod, name): class Obj2ModVisitor(PickleVisitor): + + attribute_special_defaults = { + "end_lineno": "lineno", + "end_col_offset": "col_offset", + } + @contextmanager def recursive_call(self, node, level): self.emit('if (_Py_EnterRecursiveCall(" while traversing \'%s\' node")) {' % node, level, reflow=False) @@ -637,7 +642,13 @@ def visitField(self, field, name, sum=None, prod=None, depth=0): self.emit("if (tmp == NULL || tmp == Py_None) {", depth) self.emit("Py_CLEAR(tmp);", depth+1) if self.isNumeric(field): - self.emit("%s = 0;" % field.name, depth+1) + if field.name in self.attribute_special_defaults: + self.emit( + "%s = %s;" % (field.name, self.attribute_special_defaults[field.name]), + depth+1, + ) + else: + self.emit("%s = 0;" % field.name, depth+1) elif not self.isSimpleType(field): self.emit("%s = NULL;" % field.name, depth+1) else: diff --git a/Parser/myreadline.c b/Parser/myreadline.c index b10d306255bb67..d55fcefbb6f206 100644 --- a/Parser/myreadline.c +++ b/Parser/myreadline.c @@ -247,7 +247,8 @@ PyOS_StdioReadline(FILE *sys_stdin, FILE *sys_stdout, const char *prompt) assert(tstate != NULL); #ifdef MS_WINDOWS - if (!Py_LegacyWindowsStdioFlag && sys_stdin == stdin) { + const PyConfig *config = _PyInterpreterState_GetConfig(tstate->interp); + if (!config->legacy_windows_stdio && sys_stdin == stdin) { HANDLE hStdIn, hStdErr; hStdIn = _Py_get_osfhandle_noraise(fileno(sys_stdin)); diff --git a/Parser/parser.c b/Parser/parser.c index adc8d509eb7d7d..73fd66cf423890 100644 --- a/Parser/parser.c +++ b/Parser/parser.c @@ -2,12 +2,16 @@ #include "pegen.h" #if defined(Py_DEBUG) && defined(Py_BUILD_CORE) -# define D(x) if (Py_DebugFlag) x; +# define D(x) if (p->debug) { x; } #else # define D(x) #endif -# define MAXSTACK 6000 +#ifdef __wasi__ +# define MAXSTACK 4000 +#else +# define MAXSTACK 6000 +#endif static const int n_keyword_lists = 9; static KeywordToken *reserved_keywords[] = { (KeywordToken[]) {{NULL, -1}}, @@ -15,15 +19,15 @@ static KeywordToken *reserved_keywords[] = { (KeywordToken[]) { {"if", 634}, {"as", 632}, - {"in", 641}, + {"in", 643}, {"or", 574}, {"is", 582}, {NULL, -1}, }, (KeywordToken[]) { {"del", 603}, - {"def", 642}, - {"for", 640}, + {"def", 644}, + {"for", 642}, {"try", 618}, {"and", 575}, {"not", 581}, @@ -43,7 +47,7 @@ static KeywordToken *reserved_keywords[] = { {"raise", 522}, {"yield", 573}, {"break", 508}, - {"class", 643}, + {"class", 646}, {"while", 639}, {"False", 602}, {NULL, -1}, @@ -515,9 +519,9 @@ static char *soft_keywords[] = { #define _tmp_211_type 1439 #define _tmp_212_type 1440 #define _tmp_213_type 1441 -#define _loop0_215_type 1442 -#define _gather_214_type 1443 -#define _tmp_216_type 1444 +#define _tmp_214_type 1442 +#define _loop0_216_type 1443 +#define _gather_215_type 1444 #define _tmp_217_type 1445 #define _tmp_218_type 1446 #define _tmp_219_type 1447 @@ -547,8 +551,9 @@ static char *soft_keywords[] = { #define _tmp_243_type 1471 #define _tmp_244_type 1472 #define _tmp_245_type 1473 -#define _loop1_246_type 1474 
+#define _tmp_246_type 1474 #define _loop1_247_type 1475 +#define _loop1_248_type 1476 static mod_ty file_rule(Parser *p); static mod_ty interactive_rule(Parser *p); @@ -992,9 +997,9 @@ static void *_tmp_210_rule(Parser *p); static void *_tmp_211_rule(Parser *p); static void *_tmp_212_rule(Parser *p); static void *_tmp_213_rule(Parser *p); -static asdl_seq *_loop0_215_rule(Parser *p); -static asdl_seq *_gather_214_rule(Parser *p); -static void *_tmp_216_rule(Parser *p); +static void *_tmp_214_rule(Parser *p); +static asdl_seq *_loop0_216_rule(Parser *p); +static asdl_seq *_gather_215_rule(Parser *p); static void *_tmp_217_rule(Parser *p); static void *_tmp_218_rule(Parser *p); static void *_tmp_219_rule(Parser *p); @@ -1024,8 +1029,9 @@ static void *_tmp_242_rule(Parser *p); static void *_tmp_243_rule(Parser *p); static void *_tmp_244_rule(Parser *p); static void *_tmp_245_rule(Parser *p); -static asdl_seq *_loop1_246_rule(Parser *p); +static void *_tmp_246_rule(Parser *p); static asdl_seq *_loop1_247_rule(Parser *p); +static asdl_seq *_loop1_248_rule(Parser *p); // file: statements? $ @@ -4204,7 +4210,7 @@ class_def_rule(Parser *p) return _res; } -// class_def_raw: invalid_class_def_raw | 'class' NAME ['(' arguments? ')'] &&':' block +// class_def_raw: invalid_class_def_raw | 'class' NAME ['(' arguments? ')'] ':' block static stmt_ty class_def_raw_rule(Parser *p) { @@ -4246,30 +4252,30 @@ class_def_raw_rule(Parser *p) D(fprintf(stderr, "%*c%s class_def_raw[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "invalid_class_def_raw")); } - { // 'class' NAME ['(' arguments? ')'] &&':' block + { // 'class' NAME ['(' arguments? ')'] ':' block if (p->error_indicator) { p->level--; return NULL; } - D(fprintf(stderr, "%*c> class_def_raw[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'class' NAME ['(' arguments? ')'] &&':' block")); + D(fprintf(stderr, "%*c> class_def_raw[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'class' NAME ['(' arguments? ')'] ':' block")); Token * _keyword; Token * _literal; expr_ty a; void *b; asdl_stmt_seq* c; if ( - (_keyword = _PyPegen_expect_token(p, 643)) // token='class' + (_keyword = _PyPegen_expect_token(p, 646)) // token='class' && (a = _PyPegen_name_token(p)) // NAME && (b = _tmp_33_rule(p), !p->error_indicator) // ['(' arguments? ')'] && - (_literal = _PyPegen_expect_forced_token(p, 11, ":")) // forced_token=':' + (_literal = _PyPegen_expect_token(p, 11)) // token=':' && (c = block_rule(p)) // block ) { - D(fprintf(stderr, "%*c+ class_def_raw[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'class' NAME ['(' arguments? ')'] &&':' block")); + D(fprintf(stderr, "%*c+ class_def_raw[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'class' NAME ['(' arguments? ')'] ':' block")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { p->level--; @@ -4289,7 +4295,7 @@ class_def_raw_rule(Parser *p) } p->mark = _mark; D(fprintf(stderr, "%*c%s class_def_raw[%d-%d]: %s failed!\n", p->level, ' ', - p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'class' NAME ['(' arguments? ')'] &&':' block")); + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'class' NAME ['(' arguments? 
')'] ':' block")); } _res = NULL; done: @@ -4424,7 +4430,7 @@ function_def_raw_rule(Parser *p) void *params; void *tc; if ( - (_keyword = _PyPegen_expect_token(p, 642)) // token='def' + (_keyword = _PyPegen_expect_token(p, 644)) // token='def' && (n = _PyPegen_name_token(p)) // NAME && @@ -4484,7 +4490,7 @@ function_def_raw_rule(Parser *p) if ( (async_var = _PyPegen_expect_token(p, ASYNC)) // token='ASYNC' && - (_keyword = _PyPegen_expect_token(p, 642)) // token='def' + (_keyword = _PyPegen_expect_token(p, 644)) // token='def' && (n = _PyPegen_name_token(p)) // NAME && @@ -6225,8 +6231,8 @@ while_stmt_rule(Parser *p) // for_stmt: // | invalid_for_stmt -// | 'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block? -// | ASYNC 'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block? +// | 'for' star_targets 'in' ~ star_expressions ':' TYPE_COMMENT? block else_block? +// | ASYNC 'for' star_targets 'in' ~ star_expressions ':' TYPE_COMMENT? block else_block? // | invalid_for_target static stmt_ty for_stmt_rule(Parser *p) @@ -6269,12 +6275,12 @@ for_stmt_rule(Parser *p) D(fprintf(stderr, "%*c%s for_stmt[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "invalid_for_stmt")); } - { // 'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block? + { // 'for' star_targets 'in' ~ star_expressions ':' TYPE_COMMENT? block else_block? if (p->error_indicator) { p->level--; return NULL; } - D(fprintf(stderr, "%*c> for_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block?")); + D(fprintf(stderr, "%*c> for_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'for' star_targets 'in' ~ star_expressions ':' TYPE_COMMENT? block else_block?")); int _cut_var = 0; Token * _keyword; Token * _keyword_1; @@ -6285,17 +6291,17 @@ for_stmt_rule(Parser *p) expr_ty t; void *tc; if ( - (_keyword = _PyPegen_expect_token(p, 640)) // token='for' + (_keyword = _PyPegen_expect_token(p, 642)) // token='for' && (t = star_targets_rule(p)) // star_targets && - (_keyword_1 = _PyPegen_expect_token(p, 641)) // token='in' + (_keyword_1 = _PyPegen_expect_token(p, 643)) // token='in' && (_cut_var = 1) && (ex = star_expressions_rule(p)) // star_expressions && - (_literal = _PyPegen_expect_forced_token(p, 11, ":")) // forced_token=':' + (_literal = _PyPegen_expect_token(p, 11)) // token=':' && (tc = _PyPegen_expect_token(p, TYPE_COMMENT), !p->error_indicator) // TYPE_COMMENT? && @@ -6304,7 +6310,7 @@ for_stmt_rule(Parser *p) (el = else_block_rule(p), !p->error_indicator) // else_block? ) { - D(fprintf(stderr, "%*c+ for_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block?")); + D(fprintf(stderr, "%*c+ for_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'for' star_targets 'in' ~ star_expressions ':' TYPE_COMMENT? block else_block?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { p->level--; @@ -6324,18 +6330,18 @@ for_stmt_rule(Parser *p) } p->mark = _mark; D(fprintf(stderr, "%*c%s for_stmt[%d-%d]: %s failed!\n", p->level, ' ', - p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block?")); + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'for' star_targets 'in' ~ star_expressions ':' TYPE_COMMENT? 
block else_block?")); if (_cut_var) { p->level--; return NULL; } } - { // ASYNC 'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block? + { // ASYNC 'for' star_targets 'in' ~ star_expressions ':' TYPE_COMMENT? block else_block? if (p->error_indicator) { p->level--; return NULL; } - D(fprintf(stderr, "%*c> for_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC 'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block?")); + D(fprintf(stderr, "%*c> for_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC 'for' star_targets 'in' ~ star_expressions ':' TYPE_COMMENT? block else_block?")); int _cut_var = 0; Token * _keyword; Token * _keyword_1; @@ -6349,17 +6355,17 @@ for_stmt_rule(Parser *p) if ( (async_var = _PyPegen_expect_token(p, ASYNC)) // token='ASYNC' && - (_keyword = _PyPegen_expect_token(p, 640)) // token='for' + (_keyword = _PyPegen_expect_token(p, 642)) // token='for' && (t = star_targets_rule(p)) // star_targets && - (_keyword_1 = _PyPegen_expect_token(p, 641)) // token='in' + (_keyword_1 = _PyPegen_expect_token(p, 643)) // token='in' && (_cut_var = 1) && (ex = star_expressions_rule(p)) // star_expressions && - (_literal = _PyPegen_expect_forced_token(p, 11, ":")) // forced_token=':' + (_literal = _PyPegen_expect_token(p, 11)) // token=':' && (tc = _PyPegen_expect_token(p, TYPE_COMMENT), !p->error_indicator) // TYPE_COMMENT? && @@ -6368,7 +6374,7 @@ for_stmt_rule(Parser *p) (el = else_block_rule(p), !p->error_indicator) // else_block? ) { - D(fprintf(stderr, "%*c+ for_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "ASYNC 'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block?")); + D(fprintf(stderr, "%*c+ for_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "ASYNC 'for' star_targets 'in' ~ star_expressions ':' TYPE_COMMENT? block else_block?")); Token *_token = _PyPegen_get_last_nonnwhitespace_token(p); if (_token == NULL) { p->level--; @@ -6388,7 +6394,7 @@ for_stmt_rule(Parser *p) } p->mark = _mark; D(fprintf(stderr, "%*c%s for_stmt[%d-%d]: %s failed!\n", p->level, ' ', - p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "ASYNC 'for' star_targets 'in' ~ star_expressions &&':' TYPE_COMMENT? block else_block?")); + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "ASYNC 'for' star_targets 'in' ~ star_expressions ':' TYPE_COMMENT? 
block else_block?")); if (_cut_var) { p->level--; return NULL; @@ -7941,6 +7947,10 @@ closed_pattern_rule(Parser *p) return NULL; } pattern_ty _res = NULL; + if (_PyPegen_is_memoized(p, closed_pattern_type, &_res)) { + p->level--; + return _res; + } int _mark = p->mark; { // literal_pattern if (p->error_indicator) { @@ -8096,6 +8106,7 @@ closed_pattern_rule(Parser *p) } _res = NULL; done: + _PyPegen_insert_memo(p, _mark, closed_pattern_type, _res); p->level--; return _res; } @@ -9619,6 +9630,10 @@ star_pattern_rule(Parser *p) return NULL; } pattern_ty _res = NULL; + if (_PyPegen_is_memoized(p, star_pattern_type, &_res)) { + p->level--; + return _res; + } int _mark = p->mark; if (p->mark == p->fill && _PyPegen_fill_token(p) < 0) { p->error_indicator = 1; @@ -9703,6 +9718,7 @@ star_pattern_rule(Parser *p) } _res = NULL; done: + _PyPegen_insert_memo(p, _mark, star_pattern_type, _res); p->level--; return _res; } @@ -12185,7 +12201,7 @@ notin_bitwise_or_rule(Parser *p) if ( (_keyword = _PyPegen_expect_token(p, 581)) // token='not' && - (_keyword_1 = _PyPegen_expect_token(p, 641)) // token='in' + (_keyword_1 = _PyPegen_expect_token(p, 643)) // token='in' && (a = bitwise_or_rule(p)) // bitwise_or ) @@ -12232,7 +12248,7 @@ in_bitwise_or_rule(Parser *p) Token * _keyword; expr_ty a; if ( - (_keyword = _PyPegen_expect_token(p, 641)) // token='in' + (_keyword = _PyPegen_expect_token(p, 643)) // token='in' && (a = bitwise_or_rule(p)) // bitwise_or ) @@ -16017,11 +16033,11 @@ for_if_clause_rule(Parser *p) if ( (async_var = _PyPegen_expect_token(p, ASYNC)) // token='ASYNC' && - (_keyword = _PyPegen_expect_token(p, 640)) // token='for' + (_keyword = _PyPegen_expect_token(p, 642)) // token='for' && (a = star_targets_rule(p)) // star_targets && - (_keyword_1 = _PyPegen_expect_token(p, 641)) // token='in' + (_keyword_1 = _PyPegen_expect_token(p, 643)) // token='in' && (_cut_var = 1) && @@ -16060,11 +16076,11 @@ for_if_clause_rule(Parser *p) expr_ty b; asdl_expr_seq* c; if ( - (_keyword = _PyPegen_expect_token(p, 640)) // token='for' + (_keyword = _PyPegen_expect_token(p, 642)) // token='for' && (a = star_targets_rule(p)) // star_targets && - (_keyword_1 = _PyPegen_expect_token(p, 641)) // token='in' + (_keyword_1 = _PyPegen_expect_token(p, 643)) // token='in' && (_cut_var = 1) && @@ -21420,7 +21436,7 @@ invalid_for_target_rule(Parser *p) if ( (_opt_var = _PyPegen_expect_token(p, ASYNC), !p->error_indicator) // ASYNC? && - (_keyword = _PyPegen_expect_token(p, 640)) // token='for' + (_keyword = _PyPegen_expect_token(p, 642)) // token='for' && (a = star_expressions_rule(p)) // star_expressions ) @@ -21578,8 +21594,8 @@ invalid_import_from_targets_rule(Parser *p) } // invalid_with_stmt: -// | ASYNC? 'with' ','.(expression ['as' star_target])+ &&':' -// | ASYNC? 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' &&':' +// | ASYNC? 'with' ','.(expression ['as' star_target])+ NEWLINE +// | ASYNC? 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' NEWLINE static void * invalid_with_stmt_rule(Parser *p) { @@ -21593,17 +21609,17 @@ invalid_with_stmt_rule(Parser *p) } void * _res = NULL; int _mark = p->mark; - { // ASYNC? 'with' ','.(expression ['as' star_target])+ &&':' + { // ASYNC? 'with' ','.(expression ['as' star_target])+ NEWLINE if (p->error_indicator) { p->level--; return NULL; } - D(fprintf(stderr, "%*c> invalid_with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 
'with' ','.(expression ['as' star_target])+ &&':'")); + D(fprintf(stderr, "%*c> invalid_with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'with' ','.(expression ['as' star_target])+ NEWLINE")); asdl_seq * _gather_194_var; Token * _keyword; - Token * _literal; void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings + Token * newline_var; if ( (_opt_var = _PyPegen_expect_token(p, ASYNC), !p->error_indicator) // ASYNC? && @@ -21611,32 +21627,37 @@ invalid_with_stmt_rule(Parser *p) && (_gather_194_var = _gather_194_rule(p)) // ','.(expression ['as' star_target])+ && - (_literal = _PyPegen_expect_forced_token(p, 11, ":")) // forced_token=':' + (newline_var = _PyPegen_expect_token(p, NEWLINE)) // token='NEWLINE' ) { - D(fprintf(stderr, "%*c+ invalid_with_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "ASYNC? 'with' ','.(expression ['as' star_target])+ &&':'")); - _res = _PyPegen_dummy_name(p, _opt_var, _keyword, _gather_194_var, _literal); + D(fprintf(stderr, "%*c+ invalid_with_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "ASYNC? 'with' ','.(expression ['as' star_target])+ NEWLINE")); + _res = RAISE_SYNTAX_ERROR ( "expected ':'" ); + if (_res == NULL && PyErr_Occurred()) { + p->error_indicator = 1; + p->level--; + return NULL; + } goto done; } p->mark = _mark; D(fprintf(stderr, "%*c%s invalid_with_stmt[%d-%d]: %s failed!\n", p->level, ' ', - p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "ASYNC? 'with' ','.(expression ['as' star_target])+ &&':'")); + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "ASYNC? 'with' ','.(expression ['as' star_target])+ NEWLINE")); } - { // ASYNC? 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' &&':' + { // ASYNC? 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' NEWLINE if (p->error_indicator) { p->level--; return NULL; } - D(fprintf(stderr, "%*c> invalid_with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' &&':'")); + D(fprintf(stderr, "%*c> invalid_with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' NEWLINE")); asdl_seq * _gather_196_var; Token * _keyword; Token * _literal; Token * _literal_1; - Token * _literal_2; void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings void *_opt_var_1; UNUSED(_opt_var_1); // Silence compiler warnings + Token * newline_var; if ( (_opt_var = _PyPegen_expect_token(p, ASYNC), !p->error_indicator) // ASYNC? && @@ -21650,16 +21671,21 @@ invalid_with_stmt_rule(Parser *p) && (_literal_1 = _PyPegen_expect_token(p, 8)) // token=')' && - (_literal_2 = _PyPegen_expect_forced_token(p, 11, ":")) // forced_token=':' + (newline_var = _PyPegen_expect_token(p, NEWLINE)) // token='NEWLINE' ) { - D(fprintf(stderr, "%*c+ invalid_with_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "ASYNC? 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' &&':'")); - _res = _PyPegen_dummy_name(p, _opt_var, _keyword, _literal, _gather_196_var, _opt_var_1, _literal_1, _literal_2); + D(fprintf(stderr, "%*c+ invalid_with_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "ASYNC? 'with' '(' ','.(expressions ['as' star_target])+ ','? 
')' NEWLINE")); + _res = RAISE_SYNTAX_ERROR ( "expected ':'" ); + if (_res == NULL && PyErr_Occurred()) { + p->error_indicator = 1; + p->level--; + return NULL; + } goto done; } p->mark = _mark; D(fprintf(stderr, "%*c%s invalid_with_stmt[%d-%d]: %s failed!\n", p->level, ' ', - p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "ASYNC? 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' &&':'")); + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "ASYNC? 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' NEWLINE")); } _res = NULL; done: @@ -22272,7 +22298,7 @@ invalid_except_star_stmt_indent_rule(Parser *p) } // invalid_match_stmt: -// | "match" subject_expr !':' +// | "match" subject_expr NEWLINE // | "match" subject_expr ':' NEWLINE !INDENT static void * invalid_match_stmt_rule(Parser *p) @@ -22287,23 +22313,24 @@ invalid_match_stmt_rule(Parser *p) } void * _res = NULL; int _mark = p->mark; - { // "match" subject_expr !':' + { // "match" subject_expr NEWLINE if (p->error_indicator) { p->level--; return NULL; } - D(fprintf(stderr, "%*c> invalid_match_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "\"match\" subject_expr !':'")); + D(fprintf(stderr, "%*c> invalid_match_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "\"match\" subject_expr NEWLINE")); expr_ty _keyword; + Token * newline_var; expr_ty subject_expr_var; if ( (_keyword = _PyPegen_expect_soft_keyword(p, "match")) // soft_keyword='"match"' && (subject_expr_var = subject_expr_rule(p)) // subject_expr && - _PyPegen_lookahead_with_int(0, _PyPegen_expect_token, p, 11) // token=':' + (newline_var = _PyPegen_expect_token(p, NEWLINE)) // token='NEWLINE' ) { - D(fprintf(stderr, "%*c+ invalid_match_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "\"match\" subject_expr !':'")); + D(fprintf(stderr, "%*c+ invalid_match_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "\"match\" subject_expr NEWLINE")); _res = CHECK_VERSION ( void* , 10 , "Pattern matching is" , RAISE_SYNTAX_ERROR ( "expected ':'" ) ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -22314,7 +22341,7 @@ invalid_match_stmt_rule(Parser *p) } p->mark = _mark; D(fprintf(stderr, "%*c%s invalid_match_stmt[%d-%d]: %s failed!\n", p->level, ' ', - p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "\"match\" subject_expr !':'")); + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "\"match\" subject_expr NEWLINE")); } { // "match" subject_expr ':' NEWLINE !INDENT if (p->error_indicator) { @@ -22358,7 +22385,7 @@ invalid_match_stmt_rule(Parser *p) } // invalid_case_block: -// | "case" patterns guard? !':' +// | "case" patterns guard? NEWLINE // | "case" patterns guard? ':' NEWLINE !INDENT static void * invalid_case_block_rule(Parser *p) @@ -22373,15 +22400,16 @@ invalid_case_block_rule(Parser *p) } void * _res = NULL; int _mark = p->mark; - { // "case" patterns guard? !':' + { // "case" patterns guard? NEWLINE if (p->error_indicator) { p->level--; return NULL; } - D(fprintf(stderr, "%*c> invalid_case_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "\"case\" patterns guard? !':'")); + D(fprintf(stderr, "%*c> invalid_case_block[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "\"case\" patterns guard? 
NEWLINE")); expr_ty _keyword; void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings + Token * newline_var; pattern_ty patterns_var; if ( (_keyword = _PyPegen_expect_soft_keyword(p, "case")) // soft_keyword='"case"' @@ -22390,10 +22418,10 @@ invalid_case_block_rule(Parser *p) && (_opt_var = guard_rule(p), !p->error_indicator) // guard? && - _PyPegen_lookahead_with_int(0, _PyPegen_expect_token, p, 11) // token=':' + (newline_var = _PyPegen_expect_token(p, NEWLINE)) // token='NEWLINE' ) { - D(fprintf(stderr, "%*c+ invalid_case_block[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "\"case\" patterns guard? !':'")); + D(fprintf(stderr, "%*c+ invalid_case_block[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "\"case\" patterns guard? NEWLINE")); _res = RAISE_SYNTAX_ERROR ( "expected ':'" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -22404,7 +22432,7 @@ invalid_case_block_rule(Parser *p) } p->mark = _mark; D(fprintf(stderr, "%*c%s invalid_case_block[%d-%d]: %s failed!\n", p->level, ' ', - p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "\"case\" patterns guard? !':'")); + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "\"case\" patterns guard? NEWLINE")); } { // "case" patterns guard? ':' NEWLINE !INDENT if (p->error_indicator) { @@ -22951,7 +22979,9 @@ invalid_while_stmt_rule(Parser *p) return _res; } -// invalid_for_stmt: ASYNC? 'for' star_targets 'in' star_expressions ':' NEWLINE !INDENT +// invalid_for_stmt: +// | ASYNC? 'for' star_targets 'in' star_expressions NEWLINE +// | ASYNC? 'for' star_targets 'in' star_expressions ':' NEWLINE !INDENT static void * invalid_for_stmt_rule(Parser *p) { @@ -22965,6 +22995,46 @@ invalid_for_stmt_rule(Parser *p) } void * _res = NULL; int _mark = p->mark; + { // ASYNC? 'for' star_targets 'in' star_expressions NEWLINE + if (p->error_indicator) { + p->level--; + return NULL; + } + D(fprintf(stderr, "%*c> invalid_for_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'for' star_targets 'in' star_expressions NEWLINE")); + Token * _keyword; + Token * _keyword_1; + void *_opt_var; + UNUSED(_opt_var); // Silence compiler warnings + Token * newline_var; + expr_ty star_expressions_var; + expr_ty star_targets_var; + if ( + (_opt_var = _PyPegen_expect_token(p, ASYNC), !p->error_indicator) // ASYNC? + && + (_keyword = _PyPegen_expect_token(p, 642)) // token='for' + && + (star_targets_var = star_targets_rule(p)) // star_targets + && + (_keyword_1 = _PyPegen_expect_token(p, 643)) // token='in' + && + (star_expressions_var = star_expressions_rule(p)) // star_expressions + && + (newline_var = _PyPegen_expect_token(p, NEWLINE)) // token='NEWLINE' + ) + { + D(fprintf(stderr, "%*c+ invalid_for_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "ASYNC? 'for' star_targets 'in' star_expressions NEWLINE")); + _res = RAISE_SYNTAX_ERROR ( "expected ':'" ); + if (_res == NULL && PyErr_Occurred()) { + p->error_indicator = 1; + p->level--; + return NULL; + } + goto done; + } + p->mark = _mark; + D(fprintf(stderr, "%*c%s invalid_for_stmt[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "ASYNC? 'for' star_targets 'in' star_expressions NEWLINE")); + } { // ASYNC? 'for' star_targets 'in' star_expressions ':' NEWLINE !INDENT if (p->error_indicator) { p->level--; @@ -22982,11 +23052,11 @@ invalid_for_stmt_rule(Parser *p) if ( (_opt_var = _PyPegen_expect_token(p, ASYNC), !p->error_indicator) // ASYNC? 
&& - (a = _PyPegen_expect_token(p, 640)) // token='for' + (a = _PyPegen_expect_token(p, 642)) // token='for' && (star_targets_var = star_targets_rule(p)) // star_targets && - (_keyword = _PyPegen_expect_token(p, 641)) // token='in' + (_keyword = _PyPegen_expect_token(p, 643)) // token='in' && (star_expressions_var = star_expressions_rule(p)) // star_expressions && @@ -23052,7 +23122,7 @@ invalid_def_raw_rule(Parser *p) if ( (_opt_var = _PyPegen_expect_token(p, ASYNC), !p->error_indicator) // ASYNC? && - (a = _PyPegen_expect_token(p, 642)) // token='def' + (a = _PyPegen_expect_token(p, 644)) // token='def' && (name_var = _PyPegen_name_token(p)) // NAME && @@ -23090,7 +23160,9 @@ invalid_def_raw_rule(Parser *p) return _res; } -// invalid_class_def_raw: 'class' NAME ['(' arguments? ')'] ':' NEWLINE !INDENT +// invalid_class_def_raw: +// | 'class' NAME ['(' arguments? ')'] NEWLINE +// | 'class' NAME ['(' arguments? ')'] ':' NEWLINE !INDENT static void * invalid_class_def_raw_rule(Parser *p) { @@ -23104,6 +23176,40 @@ invalid_class_def_raw_rule(Parser *p) } void * _res = NULL; int _mark = p->mark; + { // 'class' NAME ['(' arguments? ')'] NEWLINE + if (p->error_indicator) { + p->level--; + return NULL; + } + D(fprintf(stderr, "%*c> invalid_class_def_raw[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'class' NAME ['(' arguments? ')'] NEWLINE")); + Token * _keyword; + void *_opt_var; + UNUSED(_opt_var); // Silence compiler warnings + expr_ty name_var; + Token * newline_var; + if ( + (_keyword = _PyPegen_expect_token(p, 646)) // token='class' + && + (name_var = _PyPegen_name_token(p)) // NAME + && + (_opt_var = _tmp_213_rule(p), !p->error_indicator) // ['(' arguments? ')'] + && + (newline_var = _PyPegen_expect_token(p, NEWLINE)) // token='NEWLINE' + ) + { + D(fprintf(stderr, "%*c+ invalid_class_def_raw[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'class' NAME ['(' arguments? ')'] NEWLINE")); + _res = RAISE_SYNTAX_ERROR ( "expected ':'" ); + if (_res == NULL && PyErr_Occurred()) { + p->error_indicator = 1; + p->level--; + return NULL; + } + goto done; + } + p->mark = _mark; + D(fprintf(stderr, "%*c%s invalid_class_def_raw[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'class' NAME ['(' arguments? ')'] NEWLINE")); + } { // 'class' NAME ['(' arguments? ')'] ':' NEWLINE !INDENT if (p->error_indicator) { p->level--; @@ -23117,11 +23223,11 @@ invalid_class_def_raw_rule(Parser *p) expr_ty name_var; Token * newline_var; if ( - (a = _PyPegen_expect_token(p, 643)) // token='class' + (a = _PyPegen_expect_token(p, 646)) // token='class' && (name_var = _PyPegen_name_token(p)) // NAME && - (_opt_var = _tmp_213_rule(p), !p->error_indicator) // ['(' arguments? ')'] + (_opt_var = _tmp_214_rule(p), !p->error_indicator) // ['(' arguments? 
')'] && (_literal = _PyPegen_expect_token(p, 11)) // token=':' && @@ -23172,11 +23278,11 @@ invalid_double_starred_kvpairs_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> invalid_double_starred_kvpairs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.double_starred_kvpair+ ',' invalid_kvpair")); - asdl_seq * _gather_214_var; + asdl_seq * _gather_215_var; Token * _literal; void *invalid_kvpair_var; if ( - (_gather_214_var = _gather_214_rule(p)) // ','.double_starred_kvpair+ + (_gather_215_var = _gather_215_rule(p)) // ','.double_starred_kvpair+ && (_literal = _PyPegen_expect_token(p, 12)) // token=',' && @@ -23184,7 +23290,7 @@ invalid_double_starred_kvpairs_rule(Parser *p) ) { D(fprintf(stderr, "%*c+ invalid_double_starred_kvpairs[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','.double_starred_kvpair+ ',' invalid_kvpair")); - _res = _PyPegen_dummy_name(p, _gather_214_var, _literal, invalid_kvpair_var); + _res = _PyPegen_dummy_name(p, _gather_215_var, _literal, invalid_kvpair_var); goto done; } p->mark = _mark; @@ -23237,7 +23343,7 @@ invalid_double_starred_kvpairs_rule(Parser *p) && (a = _PyPegen_expect_token(p, 11)) // token=':' && - _PyPegen_lookahead(1, _tmp_216_rule, p) + _PyPegen_lookahead(1, _tmp_217_rule, p) ) { D(fprintf(stderr, "%*c+ invalid_double_starred_kvpairs[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression ':' &('}' | ',')")); @@ -23777,7 +23883,7 @@ _tmp_7_rule(Parser *p) D(fprintf(stderr, "%*c> _tmp_7[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'def'")); Token * _keyword; if ( - (_keyword = _PyPegen_expect_token(p, 642)) // token='def' + (_keyword = _PyPegen_expect_token(p, 644)) // token='def' ) { D(fprintf(stderr, "%*c+ _tmp_7[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'def'")); @@ -23854,7 +23960,7 @@ _tmp_8_rule(Parser *p) D(fprintf(stderr, "%*c> _tmp_8[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'class'")); Token * _keyword; if ( - (_keyword = _PyPegen_expect_token(p, 643)) // token='class' + (_keyword = _PyPegen_expect_token(p, 646)) // token='class' ) { D(fprintf(stderr, "%*c+ _tmp_8[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'class'")); @@ -23970,7 +24076,7 @@ _tmp_10_rule(Parser *p) D(fprintf(stderr, "%*c> _tmp_10[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'for'")); Token * _keyword; if ( - (_keyword = _PyPegen_expect_token(p, 640)) // token='for' + (_keyword = _PyPegen_expect_token(p, 642)) // token='for' ) { D(fprintf(stderr, "%*c+ _tmp_10[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'for'")); @@ -24199,12 +24305,12 @@ _loop1_14_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop1_14[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(star_targets '=')")); - void *_tmp_217_var; + void *_tmp_218_var; while ( - (_tmp_217_var = _tmp_217_rule(p)) // star_targets '=' + (_tmp_218_var = _tmp_218_rule(p)) // star_targets '=' ) { - _res = _tmp_217_var; + _res = _tmp_218_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -24781,12 +24887,12 @@ _loop0_24_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop0_24[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('.' | '...')")); - void *_tmp_218_var; + void *_tmp_219_var; while ( - (_tmp_218_var = _tmp_218_rule(p)) // '.' | '...' + (_tmp_219_var = _tmp_219_rule(p)) // '.' | '...' 
) { - _res = _tmp_218_var; + _res = _tmp_219_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -24850,12 +24956,12 @@ _loop1_25_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop1_25[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('.' | '...')")); - void *_tmp_219_var; + void *_tmp_220_var; while ( - (_tmp_219_var = _tmp_219_rule(p)) // '.' | '...' + (_tmp_220_var = _tmp_220_rule(p)) // '.' | '...' ) { - _res = _tmp_219_var; + _res = _tmp_220_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -25258,12 +25364,12 @@ _loop1_32_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop1_32[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('@' named_expression NEWLINE)")); - void *_tmp_220_var; + void *_tmp_221_var; while ( - (_tmp_220_var = _tmp_220_rule(p)) // '@' named_expression NEWLINE + (_tmp_221_var = _tmp_221_rule(p)) // '@' named_expression NEWLINE ) { - _res = _tmp_220_var; + _res = _tmp_221_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -28347,12 +28453,12 @@ _loop1_80_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop1_80[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(',' expression)")); - void *_tmp_221_var; + void *_tmp_222_var; while ( - (_tmp_221_var = _tmp_221_rule(p)) // ',' expression + (_tmp_222_var = _tmp_222_rule(p)) // ',' expression ) { - _res = _tmp_221_var; + _res = _tmp_222_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -28421,12 +28527,12 @@ _loop1_81_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop1_81[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(',' star_expression)")); - void *_tmp_222_var; + void *_tmp_223_var; while ( - (_tmp_222_var = _tmp_222_rule(p)) // ',' star_expression + (_tmp_223_var = _tmp_223_rule(p)) // ',' star_expression ) { - _res = _tmp_222_var; + _res = _tmp_223_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -28615,12 +28721,12 @@ _loop1_84_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop1_84[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('or' conjunction)")); - void *_tmp_223_var; + void *_tmp_224_var; while ( - (_tmp_223_var = _tmp_223_rule(p)) // 'or' conjunction + (_tmp_224_var = _tmp_224_rule(p)) // 'or' conjunction ) { - _res = _tmp_223_var; + _res = _tmp_224_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -28689,12 +28795,12 @@ _loop1_85_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop1_85[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('and' inversion)")); - void *_tmp_224_var; + void *_tmp_225_var; while ( - (_tmp_224_var = _tmp_224_rule(p)) // 'and' inversion + (_tmp_225_var = _tmp_225_rule(p)) // 'and' inversion ) { - _res = _tmp_224_var; + _res = _tmp_225_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -28886,7 +28992,7 @@ _loop0_89_rule(Parser *p) while ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (elem = _tmp_225_rule(p)) // slice | starred_expression + (elem = 
_tmp_226_rule(p)) // slice | starred_expression ) { _res = elem; @@ -28952,7 +29058,7 @@ _gather_88_rule(Parser *p) void *elem; asdl_seq * seq; if ( - (elem = _tmp_225_rule(p)) // slice | starred_expression + (elem = _tmp_226_rule(p)) // slice | starred_expression && (seq = _loop0_89_rule(p)) // _loop0_89 ) @@ -30656,12 +30762,12 @@ _loop0_114_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop0_114[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('if' disjunction)")); - void *_tmp_226_var; + void *_tmp_227_var; while ( - (_tmp_226_var = _tmp_226_rule(p)) // 'if' disjunction + (_tmp_227_var = _tmp_227_rule(p)) // 'if' disjunction ) { - _res = _tmp_226_var; + _res = _tmp_227_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -30725,12 +30831,12 @@ _loop0_115_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop0_115[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('if' disjunction)")); - void *_tmp_227_var; + void *_tmp_228_var; while ( - (_tmp_227_var = _tmp_227_rule(p)) // 'if' disjunction + (_tmp_228_var = _tmp_228_rule(p)) // 'if' disjunction ) { - _res = _tmp_227_var; + _res = _tmp_228_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -30859,7 +30965,7 @@ _loop0_118_rule(Parser *p) while ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (elem = _tmp_228_rule(p)) // starred_expression | (assignment_expression | expression !':=') !'=' + (elem = _tmp_229_rule(p)) // starred_expression | (assignment_expression | expression !':=') !'=' ) { _res = elem; @@ -30926,7 +31032,7 @@ _gather_117_rule(Parser *p) void *elem; asdl_seq * seq; if ( - (elem = _tmp_228_rule(p)) // starred_expression | (assignment_expression | expression !':=') !'=' + (elem = _tmp_229_rule(p)) // starred_expression | (assignment_expression | expression !':=') !'=' && (seq = _loop0_118_rule(p)) // _loop0_118 ) @@ -31502,12 +31608,12 @@ _loop0_128_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop0_128[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(',' star_target)")); - void *_tmp_229_var; + void *_tmp_230_var; while ( - (_tmp_229_var = _tmp_229_rule(p)) // ',' star_target + (_tmp_230_var = _tmp_230_rule(p)) // ',' star_target ) { - _res = _tmp_229_var; + _res = _tmp_230_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -31691,12 +31797,12 @@ _loop1_131_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop1_131[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(',' star_target)")); - void *_tmp_230_var; + void *_tmp_231_var; while ( - (_tmp_230_var = _tmp_230_rule(p)) // ',' star_target + (_tmp_231_var = _tmp_231_rule(p)) // ',' star_target ) { - _res = _tmp_230_var; + _res = _tmp_231_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -33066,12 +33172,12 @@ _loop0_153_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop0_153[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(star_targets '=')")); - void *_tmp_231_var; + void *_tmp_232_var; while ( - (_tmp_231_var = _tmp_231_rule(p)) // star_targets '=' + (_tmp_232_var = _tmp_232_rule(p)) // star_targets '=' ) { - _res = _tmp_231_var; + _res = _tmp_232_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = 
PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -33135,12 +33241,12 @@ _loop0_154_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop0_154[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(star_targets '=')")); - void *_tmp_232_var; + void *_tmp_233_var; while ( - (_tmp_232_var = _tmp_232_rule(p)) // star_targets '=' + (_tmp_233_var = _tmp_233_rule(p)) // star_targets '=' ) { - _res = _tmp_232_var; + _res = _tmp_233_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -34190,15 +34296,15 @@ _tmp_170_rule(Parser *p) } D(fprintf(stderr, "%*c> _tmp_170[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (')' | '**')")); Token * _literal; - void *_tmp_233_var; + void *_tmp_234_var; if ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (_tmp_233_var = _tmp_233_rule(p)) // ')' | '**' + (_tmp_234_var = _tmp_234_rule(p)) // ')' | '**' ) { D(fprintf(stderr, "%*c+ _tmp_170[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' (')' | '**')")); - _res = _PyPegen_dummy_name(p, _literal, _tmp_233_var); + _res = _PyPegen_dummy_name(p, _literal, _tmp_234_var); goto done; } p->mark = _mark; @@ -35374,15 +35480,15 @@ _tmp_188_rule(Parser *p) } D(fprintf(stderr, "%*c> _tmp_188[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (':' | '**')")); Token * _literal; - void *_tmp_234_var; + void *_tmp_235_var; if ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (_tmp_234_var = _tmp_234_rule(p)) // ':' | '**' + (_tmp_235_var = _tmp_235_rule(p)) // ':' | '**' ) { D(fprintf(stderr, "%*c+ _tmp_188[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' (':' | '**')")); - _res = _PyPegen_dummy_name(p, _literal, _tmp_234_var); + _res = _PyPegen_dummy_name(p, _literal, _tmp_235_var); goto done; } p->mark = _mark; @@ -35769,7 +35875,7 @@ _loop0_195_rule(Parser *p) while ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (elem = _tmp_235_rule(p)) // expression ['as' star_target] + (elem = _tmp_236_rule(p)) // expression ['as' star_target] ) { _res = elem; @@ -35835,7 +35941,7 @@ _gather_194_rule(Parser *p) void *elem; asdl_seq * seq; if ( - (elem = _tmp_235_rule(p)) // expression ['as' star_target] + (elem = _tmp_236_rule(p)) // expression ['as' star_target] && (seq = _loop0_195_rule(p)) // _loop0_195 ) @@ -35889,7 +35995,7 @@ _loop0_197_rule(Parser *p) while ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (elem = _tmp_236_rule(p)) // expressions ['as' star_target] + (elem = _tmp_237_rule(p)) // expressions ['as' star_target] ) { _res = elem; @@ -35955,7 +36061,7 @@ _gather_196_rule(Parser *p) void *elem; asdl_seq * seq; if ( - (elem = _tmp_236_rule(p)) // expressions ['as' star_target] + (elem = _tmp_237_rule(p)) // expressions ['as' star_target] && (seq = _loop0_197_rule(p)) // _loop0_197 ) @@ -36009,7 +36115,7 @@ _loop0_199_rule(Parser *p) while ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (elem = _tmp_237_rule(p)) // expression ['as' star_target] + (elem = _tmp_238_rule(p)) // expression ['as' star_target] ) { _res = elem; @@ -36075,7 +36181,7 @@ _gather_198_rule(Parser *p) void *elem; asdl_seq * seq; if ( - (elem = _tmp_237_rule(p)) // expression ['as' star_target] + (elem = _tmp_238_rule(p)) // expression ['as' star_target] && (seq = _loop0_199_rule(p)) // _loop0_199 ) @@ -36129,7 +36235,7 @@ _loop0_201_rule(Parser *p) while ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (elem = _tmp_238_rule(p)) // 
expressions ['as' star_target] + (elem = _tmp_239_rule(p)) // expressions ['as' star_target] ) { _res = elem; @@ -36195,7 +36301,7 @@ _gather_200_rule(Parser *p) void *elem; asdl_seq * seq; if ( - (elem = _tmp_238_rule(p)) // expressions ['as' star_target] + (elem = _tmp_239_rule(p)) // expressions ['as' star_target] && (seq = _loop0_201_rule(p)) // _loop0_201 ) @@ -36361,13 +36467,13 @@ _tmp_204_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _tmp_204[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(except_block+ except_star_block)")); - void *_tmp_239_var; + void *_tmp_240_var; if ( - (_tmp_239_var = _tmp_239_rule(p)) // except_block+ except_star_block + (_tmp_240_var = _tmp_240_rule(p)) // except_block+ except_star_block ) { D(fprintf(stderr, "%*c+ _tmp_204[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "(except_block+ except_star_block)")); - _res = _tmp_239_var; + _res = _tmp_240_var; goto done; } p->mark = _mark; @@ -36380,13 +36486,13 @@ _tmp_204_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _tmp_204[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(except_star_block+ except_block)")); - void *_tmp_240_var; + void *_tmp_241_var; if ( - (_tmp_240_var = _tmp_240_rule(p)) // except_star_block+ except_block + (_tmp_241_var = _tmp_241_rule(p)) // except_star_block+ except_block ) { D(fprintf(stderr, "%*c+ _tmp_204[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "(except_star_block+ except_block)")); - _res = _tmp_240_var; + _res = _tmp_241_var; goto done; } p->mark = _mark; @@ -36824,9 +36930,55 @@ _tmp_213_rule(Parser *p) return _res; } -// _loop0_215: ',' double_starred_kvpair +// _tmp_214: '(' arguments? ')' +static void * +_tmp_214_rule(Parser *p) +{ + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } + if (p->error_indicator) { + p->level--; + return NULL; + } + void * _res = NULL; + int _mark = p->mark; + { // '(' arguments? ')' + if (p->error_indicator) { + p->level--; + return NULL; + } + D(fprintf(stderr, "%*c> _tmp_214[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' arguments? ')'")); + Token * _literal; + Token * _literal_1; + void *_opt_var; + UNUSED(_opt_var); // Silence compiler warnings + if ( + (_literal = _PyPegen_expect_token(p, 7)) // token='(' + && + (_opt_var = arguments_rule(p), !p->error_indicator) // arguments? + && + (_literal_1 = _PyPegen_expect_token(p, 8)) // token=')' + ) + { + D(fprintf(stderr, "%*c+ _tmp_214[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'(' arguments? ')'")); + _res = _PyPegen_dummy_name(p, _literal, _opt_var, _literal_1); + goto done; + } + p->mark = _mark; + D(fprintf(stderr, "%*c%s _tmp_214[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'(' arguments? 
')'")); + } + _res = NULL; + done: + p->level--; + return _res; +} + +// _loop0_216: ',' double_starred_kvpair static asdl_seq * -_loop0_215_rule(Parser *p) +_loop0_216_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36853,7 +37005,7 @@ _loop0_215_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_215[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' double_starred_kvpair")); + D(fprintf(stderr, "%*c> _loop0_216[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' double_starred_kvpair")); Token * _literal; KeyValuePair* elem; while ( @@ -36884,7 +37036,7 @@ _loop0_215_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_215[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_216[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "',' double_starred_kvpair")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -36897,14 +37049,14 @@ _loop0_215_rule(Parser *p) } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); - _PyPegen_insert_memo(p, _start_mark, _loop0_215_type, _seq); + _PyPegen_insert_memo(p, _start_mark, _loop0_216_type, _seq); p->level--; return _seq; } -// _gather_214: double_starred_kvpair _loop0_215 +// _gather_215: double_starred_kvpair _loop0_216 static asdl_seq * -_gather_214_rule(Parser *p) +_gather_215_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36916,27 +37068,27 @@ _gather_214_rule(Parser *p) } asdl_seq * _res = NULL; int _mark = p->mark; - { // double_starred_kvpair _loop0_215 + { // double_starred_kvpair _loop0_216 if (p->error_indicator) { p->level--; return NULL; } - D(fprintf(stderr, "%*c> _gather_214[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "double_starred_kvpair _loop0_215")); + D(fprintf(stderr, "%*c> _gather_215[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "double_starred_kvpair _loop0_216")); KeyValuePair* elem; asdl_seq * seq; if ( (elem = double_starred_kvpair_rule(p)) // double_starred_kvpair && - (seq = _loop0_215_rule(p)) // _loop0_215 + (seq = _loop0_216_rule(p)) // _loop0_216 ) { - D(fprintf(stderr, "%*c+ _gather_214[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "double_starred_kvpair _loop0_215")); + D(fprintf(stderr, "%*c+ _gather_215[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "double_starred_kvpair _loop0_216")); _res = _PyPegen_seq_insert_in_front(p, elem, seq); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _gather_214[%d-%d]: %s failed!\n", p->level, ' ', - p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "double_starred_kvpair _loop0_215")); + D(fprintf(stderr, "%*c%s _gather_215[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "double_starred_kvpair _loop0_216")); } _res = NULL; done: @@ -36944,9 +37096,9 @@ _gather_214_rule(Parser *p) return _res; } -// _tmp_216: '}' | ',' +// _tmp_217: '}' | ',' static void * -_tmp_216_rule(Parser *p) +_tmp_217_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36963,18 +37115,18 @@ _tmp_216_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_216[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'}'")); + D(fprintf(stderr, "%*c> _tmp_217[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'}'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 26)) // token='}' ) { - D(fprintf(stderr, "%*c+ _tmp_216[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'}'")); + D(fprintf(stderr, "%*c+ _tmp_217[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'}'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_216[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_217[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'}'")); } { // ',' @@ -36982,18 +37134,18 @@ _tmp_216_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_216[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c> _tmp_217[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' ) { - D(fprintf(stderr, "%*c+ _tmp_216[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c+ _tmp_217[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_216[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_217[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "','")); } _res = NULL; @@ -37002,9 +37154,9 @@ _tmp_216_rule(Parser *p) return _res; } -// _tmp_217: star_targets '=' +// _tmp_218: star_targets '=' static void * -_tmp_217_rule(Parser *p) +_tmp_218_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37021,7 +37173,7 @@ _tmp_217_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_217[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_targets '='")); + D(fprintf(stderr, "%*c> _tmp_218[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_targets '='")); Token * _literal; expr_ty z; if ( @@ -37030,7 +37182,7 @@ _tmp_217_rule(Parser *p) (_literal = _PyPegen_expect_token(p, 22)) // token='=' ) { - D(fprintf(stderr, "%*c+ _tmp_217[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_targets '='")); + D(fprintf(stderr, "%*c+ _tmp_218[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_targets '='")); _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -37040,7 +37192,7 @@ _tmp_217_rule(Parser *p) goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_217[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_218[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "star_targets '='")); } _res = NULL; @@ -37049,9 +37201,9 @@ _tmp_217_rule(Parser *p) return _res; } -// _tmp_218: '.' | '...' +// _tmp_219: '.' | '...' 
static void * -_tmp_218_rule(Parser *p) +_tmp_219_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37068,18 +37220,18 @@ _tmp_218_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_218[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'.'")); + D(fprintf(stderr, "%*c> _tmp_219[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'.'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 23)) // token='.' ) { - D(fprintf(stderr, "%*c+ _tmp_218[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'.'")); + D(fprintf(stderr, "%*c+ _tmp_219[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'.'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_218[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_219[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'.'")); } { // '...' @@ -37087,18 +37239,18 @@ _tmp_218_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_218[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'...'")); + D(fprintf(stderr, "%*c> _tmp_219[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'...'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 52)) // token='...' ) { - D(fprintf(stderr, "%*c+ _tmp_218[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'...'")); + D(fprintf(stderr, "%*c+ _tmp_219[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'...'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_218[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_219[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'...'")); } _res = NULL; @@ -37107,9 +37259,9 @@ _tmp_218_rule(Parser *p) return _res; } -// _tmp_219: '.' | '...' +// _tmp_220: '.' | '...' static void * -_tmp_219_rule(Parser *p) +_tmp_220_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37126,18 +37278,18 @@ _tmp_219_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_219[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'.'")); + D(fprintf(stderr, "%*c> _tmp_220[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'.'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 23)) // token='.' ) { - D(fprintf(stderr, "%*c+ _tmp_219[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'.'")); + D(fprintf(stderr, "%*c+ _tmp_220[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'.'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_219[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_220[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'.'")); } { // '...' @@ -37145,18 +37297,18 @@ _tmp_219_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_219[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'...'")); + D(fprintf(stderr, "%*c> _tmp_220[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'...'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 52)) // token='...' 
) { - D(fprintf(stderr, "%*c+ _tmp_219[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'...'")); + D(fprintf(stderr, "%*c+ _tmp_220[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'...'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_219[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_220[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'...'")); } _res = NULL; @@ -37165,9 +37317,9 @@ _tmp_219_rule(Parser *p) return _res; } -// _tmp_220: '@' named_expression NEWLINE +// _tmp_221: '@' named_expression NEWLINE static void * -_tmp_220_rule(Parser *p) +_tmp_221_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37184,7 +37336,7 @@ _tmp_220_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_220[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'@' named_expression NEWLINE")); + D(fprintf(stderr, "%*c> _tmp_221[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'@' named_expression NEWLINE")); Token * _literal; expr_ty f; Token * newline_var; @@ -37196,7 +37348,7 @@ _tmp_220_rule(Parser *p) (newline_var = _PyPegen_expect_token(p, NEWLINE)) // token='NEWLINE' ) { - D(fprintf(stderr, "%*c+ _tmp_220[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'@' named_expression NEWLINE")); + D(fprintf(stderr, "%*c+ _tmp_221[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'@' named_expression NEWLINE")); _res = f; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -37206,7 +37358,7 @@ _tmp_220_rule(Parser *p) goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_220[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_221[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'@' named_expression NEWLINE")); } _res = NULL; @@ -37215,9 +37367,9 @@ _tmp_220_rule(Parser *p) return _res; } -// _tmp_221: ',' expression +// _tmp_222: ',' expression static void * -_tmp_221_rule(Parser *p) +_tmp_222_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37234,7 +37386,7 @@ _tmp_221_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_221[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' expression")); + D(fprintf(stderr, "%*c> _tmp_222[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' expression")); Token * _literal; expr_ty c; if ( @@ -37243,7 +37395,7 @@ _tmp_221_rule(Parser *p) (c = expression_rule(p)) // expression ) { - D(fprintf(stderr, "%*c+ _tmp_221[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' expression")); + D(fprintf(stderr, "%*c+ _tmp_222[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' expression")); _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -37253,7 +37405,7 @@ _tmp_221_rule(Parser *p) goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_221[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_222[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "',' expression")); } _res = NULL; @@ -37262,9 +37414,9 @@ _tmp_221_rule(Parser *p) return _res; } -// _tmp_222: ',' star_expression +// _tmp_223: ',' star_expression static void * -_tmp_222_rule(Parser *p) +_tmp_223_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37281,7 +37433,7 @@ _tmp_222_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_222[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_expression")); + D(fprintf(stderr, "%*c> _tmp_223[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_expression")); Token * _literal; expr_ty c; if ( @@ -37290,7 +37442,7 @@ _tmp_222_rule(Parser *p) (c = star_expression_rule(p)) // star_expression ) { - D(fprintf(stderr, "%*c+ _tmp_222[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' star_expression")); + D(fprintf(stderr, "%*c+ _tmp_223[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' star_expression")); _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -37300,7 +37452,7 @@ _tmp_222_rule(Parser *p) goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_222[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_223[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "',' star_expression")); } _res = NULL; @@ -37309,9 +37461,9 @@ _tmp_222_rule(Parser *p) return _res; } -// _tmp_223: 'or' conjunction +// _tmp_224: 'or' conjunction static void * -_tmp_223_rule(Parser *p) +_tmp_224_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37328,7 +37480,7 @@ _tmp_223_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_223[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'or' conjunction")); + D(fprintf(stderr, "%*c> _tmp_224[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'or' conjunction")); Token * _keyword; expr_ty c; if ( @@ -37337,7 +37489,7 @@ _tmp_223_rule(Parser *p) (c = conjunction_rule(p)) // conjunction ) { - D(fprintf(stderr, "%*c+ _tmp_223[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'or' conjunction")); + D(fprintf(stderr, "%*c+ _tmp_224[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'or' conjunction")); _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -37347,7 +37499,7 @@ _tmp_223_rule(Parser *p) goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_223[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_224[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'or' conjunction")); } _res = NULL; @@ -37356,9 +37508,9 @@ _tmp_223_rule(Parser *p) return _res; } -// _tmp_224: 'and' inversion +// _tmp_225: 'and' inversion static void * -_tmp_224_rule(Parser *p) +_tmp_225_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37375,7 +37527,7 @@ _tmp_224_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_224[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'and' inversion")); + D(fprintf(stderr, "%*c> _tmp_225[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'and' inversion")); Token * _keyword; expr_ty c; if ( @@ -37384,7 +37536,7 @@ _tmp_224_rule(Parser *p) (c = inversion_rule(p)) // inversion ) { - D(fprintf(stderr, "%*c+ _tmp_224[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'and' inversion")); + D(fprintf(stderr, "%*c+ _tmp_225[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'and' inversion")); _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -37394,7 +37546,7 @@ _tmp_224_rule(Parser *p) goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_224[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_225[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'and' inversion")); } _res = NULL; @@ -37403,9 +37555,9 @@ _tmp_224_rule(Parser *p) return _res; } -// _tmp_225: slice | starred_expression +// _tmp_226: slice | starred_expression static void * -_tmp_225_rule(Parser *p) +_tmp_226_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37422,18 +37574,18 @@ _tmp_225_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_225[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slice")); + D(fprintf(stderr, "%*c> _tmp_226[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slice")); expr_ty slice_var; if ( (slice_var = slice_rule(p)) // slice ) { - D(fprintf(stderr, "%*c+ _tmp_225[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "slice")); + D(fprintf(stderr, "%*c+ _tmp_226[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "slice")); _res = slice_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_225[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_226[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "slice")); } { // starred_expression @@ -37441,18 +37593,18 @@ _tmp_225_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_225[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "starred_expression")); + D(fprintf(stderr, "%*c> _tmp_226[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "starred_expression")); expr_ty starred_expression_var; if ( (starred_expression_var = starred_expression_rule(p)) // starred_expression ) { - D(fprintf(stderr, "%*c+ _tmp_225[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "starred_expression")); + D(fprintf(stderr, "%*c+ _tmp_226[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "starred_expression")); _res = starred_expression_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_225[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_226[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "starred_expression")); } _res = NULL; @@ -37461,9 +37613,9 @@ _tmp_225_rule(Parser *p) return _res; } -// _tmp_226: 'if' disjunction +// _tmp_227: 'if' disjunction static void * -_tmp_226_rule(Parser *p) +_tmp_227_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37480,7 +37632,7 @@ _tmp_226_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_226[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); + D(fprintf(stderr, "%*c> _tmp_227[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); Token * _keyword; expr_ty z; if ( @@ -37489,7 +37641,7 @@ _tmp_226_rule(Parser *p) (z = disjunction_rule(p)) // disjunction ) { - D(fprintf(stderr, "%*c+ _tmp_226[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); + D(fprintf(stderr, "%*c+ _tmp_227[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -37499,7 +37651,7 @@ _tmp_226_rule(Parser *p) goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_226[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_227[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'if' disjunction")); } _res = NULL; @@ -37508,9 +37660,9 @@ _tmp_226_rule(Parser *p) return _res; } -// _tmp_227: 'if' disjunction +// _tmp_228: 'if' disjunction static void * -_tmp_227_rule(Parser *p) +_tmp_228_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37527,7 +37679,7 @@ _tmp_227_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_227[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); + D(fprintf(stderr, "%*c> _tmp_228[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); Token * _keyword; expr_ty z; if ( @@ -37536,7 +37688,7 @@ _tmp_227_rule(Parser *p) (z = disjunction_rule(p)) // disjunction ) { - D(fprintf(stderr, "%*c+ _tmp_227[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); + D(fprintf(stderr, "%*c+ _tmp_228[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -37546,7 +37698,7 @@ _tmp_227_rule(Parser *p) goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_227[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_228[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'if' disjunction")); } _res = NULL; @@ -37555,9 +37707,9 @@ _tmp_227_rule(Parser *p) return _res; } -// _tmp_228: starred_expression | (assignment_expression | expression !':=') !'=' +// _tmp_229: starred_expression | (assignment_expression | expression !':=') !'=' static void * -_tmp_228_rule(Parser *p) +_tmp_229_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37574,18 +37726,18 @@ _tmp_228_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_228[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "starred_expression")); + D(fprintf(stderr, "%*c> _tmp_229[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "starred_expression")); expr_ty starred_expression_var; if ( (starred_expression_var = starred_expression_rule(p)) // starred_expression ) { - D(fprintf(stderr, "%*c+ _tmp_228[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "starred_expression")); + D(fprintf(stderr, "%*c+ _tmp_229[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "starred_expression")); _res = starred_expression_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_228[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_229[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "starred_expression")); } { // (assignment_expression | expression !':=') !'=' @@ -37593,20 +37745,20 @@ _tmp_228_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_228[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(assignment_expression | expression !':=') !'='")); - void *_tmp_241_var; + D(fprintf(stderr, "%*c> _tmp_229[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(assignment_expression | expression !':=') !'='")); + void *_tmp_242_var; if ( - (_tmp_241_var = _tmp_241_rule(p)) // assignment_expression | expression !':=' + (_tmp_242_var = _tmp_242_rule(p)) // assignment_expression | expression !':=' && _PyPegen_lookahead_with_int(0, _PyPegen_expect_token, p, 22) // token='=' ) { - D(fprintf(stderr, "%*c+ _tmp_228[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "(assignment_expression | expression !':=') !'='")); - _res = _tmp_241_var; + D(fprintf(stderr, "%*c+ _tmp_229[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "(assignment_expression | expression !':=') !'='")); + _res = _tmp_242_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_228[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_229[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "(assignment_expression | expression !':=') !'='")); } _res = NULL; @@ -37615,9 +37767,9 @@ _tmp_228_rule(Parser *p) return _res; } -// _tmp_229: ',' star_target +// _tmp_230: ',' star_target static void * -_tmp_229_rule(Parser *p) +_tmp_230_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37634,7 +37786,7 @@ _tmp_229_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_229[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_target")); + D(fprintf(stderr, "%*c> _tmp_230[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_target")); Token * _literal; expr_ty c; if ( @@ -37643,7 +37795,7 @@ _tmp_229_rule(Parser *p) (c = star_target_rule(p)) // star_target ) { - D(fprintf(stderr, "%*c+ _tmp_229[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' star_target")); + D(fprintf(stderr, "%*c+ _tmp_230[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' star_target")); _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -37653,7 +37805,7 @@ _tmp_229_rule(Parser *p) goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_229[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_230[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "',' star_target")); } _res = NULL; @@ -37662,9 +37814,9 @@ _tmp_229_rule(Parser *p) return _res; } -// _tmp_230: ',' star_target +// _tmp_231: ',' star_target static void * -_tmp_230_rule(Parser *p) +_tmp_231_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37681,7 +37833,7 @@ _tmp_230_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_230[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_target")); + D(fprintf(stderr, "%*c> _tmp_231[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_target")); Token * _literal; expr_ty c; if ( @@ -37690,7 +37842,7 @@ _tmp_230_rule(Parser *p) (c = star_target_rule(p)) // star_target ) { - D(fprintf(stderr, "%*c+ _tmp_230[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' star_target")); + D(fprintf(stderr, "%*c+ _tmp_231[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' star_target")); _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -37700,7 +37852,7 @@ _tmp_230_rule(Parser *p) goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_230[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_231[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "',' star_target")); } _res = NULL; @@ -37709,9 +37861,9 @@ _tmp_230_rule(Parser *p) return _res; } -// _tmp_231: star_targets '=' +// _tmp_232: star_targets '=' static void * -_tmp_231_rule(Parser *p) +_tmp_232_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37728,7 +37880,7 @@ _tmp_231_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_231[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_targets '='")); + D(fprintf(stderr, "%*c> _tmp_232[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_targets '='")); Token * _literal; expr_ty star_targets_var; if ( @@ -37737,12 +37889,12 @@ _tmp_231_rule(Parser *p) (_literal = _PyPegen_expect_token(p, 22)) // token='=' ) { - D(fprintf(stderr, "%*c+ _tmp_231[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_targets '='")); + D(fprintf(stderr, "%*c+ _tmp_232[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_targets '='")); _res = _PyPegen_dummy_name(p, star_targets_var, _literal); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_231[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_232[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "star_targets '='")); } _res = NULL; @@ -37751,9 +37903,9 @@ _tmp_231_rule(Parser *p) return _res; } -// _tmp_232: star_targets '=' +// _tmp_233: star_targets '=' static void * -_tmp_232_rule(Parser *p) +_tmp_233_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37770,7 +37922,7 @@ _tmp_232_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_232[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_targets '='")); + D(fprintf(stderr, "%*c> _tmp_233[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_targets '='")); Token * _literal; expr_ty star_targets_var; if ( @@ -37779,12 +37931,12 @@ _tmp_232_rule(Parser *p) (_literal = _PyPegen_expect_token(p, 22)) // token='=' ) { - D(fprintf(stderr, "%*c+ _tmp_232[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_targets '='")); + D(fprintf(stderr, "%*c+ _tmp_233[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_targets '='")); _res = _PyPegen_dummy_name(p, star_targets_var, _literal); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_232[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_233[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "star_targets '='")); } _res = NULL; @@ -37793,9 +37945,9 @@ _tmp_232_rule(Parser *p) return _res; } -// _tmp_233: ')' | '**' +// _tmp_234: ')' | '**' static void * -_tmp_233_rule(Parser *p) +_tmp_234_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37812,18 +37964,18 @@ _tmp_233_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_233[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "')'")); + D(fprintf(stderr, "%*c> _tmp_234[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "')'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 8)) // token=')' ) { - D(fprintf(stderr, "%*c+ _tmp_233[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "')'")); + D(fprintf(stderr, "%*c+ _tmp_234[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "')'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_233[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_234[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "')'")); } { // '**' @@ -37831,18 +37983,18 @@ _tmp_233_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_233[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**'")); + D(fprintf(stderr, "%*c> _tmp_234[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 35)) // token='**' ) { - D(fprintf(stderr, "%*c+ _tmp_233[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'**'")); + D(fprintf(stderr, "%*c+ _tmp_234[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'**'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_233[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_234[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'**'")); } _res = NULL; @@ -37851,9 +38003,9 @@ _tmp_233_rule(Parser *p) return _res; } -// _tmp_234: ':' | '**' +// _tmp_235: ':' | '**' static void * -_tmp_234_rule(Parser *p) +_tmp_235_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37870,18 +38022,18 @@ _tmp_234_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_234[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); + D(fprintf(stderr, "%*c> _tmp_235[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 11)) // token=':' ) { - D(fprintf(stderr, "%*c+ _tmp_234[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "':'")); + D(fprintf(stderr, "%*c+ _tmp_235[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "':'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_234[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_235[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "':'")); } { // '**' @@ -37889,18 +38041,18 @@ _tmp_234_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_234[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**'")); + D(fprintf(stderr, "%*c> _tmp_235[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 35)) // token='**' ) { - D(fprintf(stderr, "%*c+ _tmp_234[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'**'")); + D(fprintf(stderr, "%*c+ _tmp_235[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'**'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_234[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_235[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'**'")); } _res = NULL; @@ -37909,9 +38061,9 @@ _tmp_234_rule(Parser *p) return _res; } -// _tmp_235: expression ['as' star_target] +// _tmp_236: expression ['as' star_target] static void * -_tmp_235_rule(Parser *p) +_tmp_236_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37928,22 +38080,22 @@ _tmp_235_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_235[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); + D(fprintf(stderr, "%*c> _tmp_236[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings expr_ty expression_var; if ( (expression_var = expression_rule(p)) // expression && - (_opt_var = _tmp_242_rule(p), !p->error_indicator) // ['as' star_target] + (_opt_var = _tmp_243_rule(p), !p->error_indicator) // ['as' star_target] ) { - D(fprintf(stderr, "%*c+ _tmp_235[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); + D(fprintf(stderr, "%*c+ _tmp_236[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); _res = _PyPegen_dummy_name(p, expression_var, _opt_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_235[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_236[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "expression ['as' star_target]")); } _res = NULL; @@ -37952,9 +38104,9 @@ _tmp_235_rule(Parser *p) return _res; } -// _tmp_236: expressions ['as' star_target] +// _tmp_237: expressions ['as' star_target] static void * -_tmp_236_rule(Parser *p) +_tmp_237_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37971,22 +38123,22 @@ _tmp_236_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_236[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); + D(fprintf(stderr, "%*c> _tmp_237[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings expr_ty expressions_var; if ( (expressions_var = expressions_rule(p)) // expressions && - (_opt_var = _tmp_243_rule(p), !p->error_indicator) // ['as' star_target] + (_opt_var = _tmp_244_rule(p), !p->error_indicator) // ['as' star_target] ) { - D(fprintf(stderr, "%*c+ _tmp_236[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); + D(fprintf(stderr, "%*c+ _tmp_237[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); _res = _PyPegen_dummy_name(p, expressions_var, _opt_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_236[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_237[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "expressions ['as' star_target]")); } _res = NULL; @@ -37995,9 +38147,9 @@ _tmp_236_rule(Parser *p) return _res; } -// _tmp_237: expression ['as' star_target] +// _tmp_238: expression ['as' star_target] static void * -_tmp_237_rule(Parser *p) +_tmp_238_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38014,22 +38166,22 @@ _tmp_237_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_237[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); + D(fprintf(stderr, "%*c> _tmp_238[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings expr_ty expression_var; if ( (expression_var = expression_rule(p)) // expression && - (_opt_var = _tmp_244_rule(p), !p->error_indicator) // ['as' star_target] + (_opt_var = _tmp_245_rule(p), !p->error_indicator) // ['as' star_target] ) { - D(fprintf(stderr, "%*c+ _tmp_237[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); + D(fprintf(stderr, "%*c+ _tmp_238[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); _res = _PyPegen_dummy_name(p, expression_var, _opt_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_237[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_238[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "expression ['as' star_target]")); } _res = NULL; @@ -38038,9 +38190,9 @@ _tmp_237_rule(Parser *p) return _res; } -// _tmp_238: expressions ['as' star_target] +// _tmp_239: expressions ['as' star_target] static void * -_tmp_238_rule(Parser *p) +_tmp_239_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38057,22 +38209,22 @@ _tmp_238_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_238[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); + D(fprintf(stderr, "%*c> _tmp_239[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings expr_ty expressions_var; if ( (expressions_var = expressions_rule(p)) // expressions && - (_opt_var = _tmp_245_rule(p), !p->error_indicator) // ['as' star_target] + (_opt_var = _tmp_246_rule(p), !p->error_indicator) // ['as' star_target] ) { - D(fprintf(stderr, "%*c+ _tmp_238[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); + D(fprintf(stderr, "%*c+ _tmp_239[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); _res = _PyPegen_dummy_name(p, expressions_var, _opt_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_238[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_239[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "expressions ['as' star_target]")); } _res = NULL; @@ -38081,9 +38233,9 @@ _tmp_238_rule(Parser *p) return _res; } -// _tmp_239: except_block+ except_star_block +// _tmp_240: except_block+ except_star_block static void * -_tmp_239_rule(Parser *p) +_tmp_240_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38100,21 +38252,21 @@ _tmp_239_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_239[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_block+ except_star_block")); - asdl_seq * _loop1_246_var; + D(fprintf(stderr, "%*c> _tmp_240[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_block+ except_star_block")); + asdl_seq * _loop1_247_var; excepthandler_ty except_star_block_var; if ( - (_loop1_246_var = _loop1_246_rule(p)) // except_block+ + (_loop1_247_var = _loop1_247_rule(p)) // except_block+ && (except_star_block_var = except_star_block_rule(p)) // except_star_block ) { - D(fprintf(stderr, "%*c+ _tmp_239[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "except_block+ except_star_block")); - _res = _PyPegen_dummy_name(p, _loop1_246_var, except_star_block_var); + D(fprintf(stderr, "%*c+ _tmp_240[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "except_block+ except_star_block")); + _res = _PyPegen_dummy_name(p, _loop1_247_var, except_star_block_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_239[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_240[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "except_block+ except_star_block")); } _res = NULL; @@ -38123,9 +38275,9 @@ _tmp_239_rule(Parser *p) return _res; } -// _tmp_240: except_star_block+ except_block +// _tmp_241: except_star_block+ except_block static void * -_tmp_240_rule(Parser *p) +_tmp_241_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38142,21 +38294,21 @@ _tmp_240_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_240[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_star_block+ except_block")); - asdl_seq * _loop1_247_var; + D(fprintf(stderr, "%*c> _tmp_241[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_star_block+ except_block")); + asdl_seq * _loop1_248_var; excepthandler_ty except_block_var; if ( - (_loop1_247_var = _loop1_247_rule(p)) // except_star_block+ + (_loop1_248_var = _loop1_248_rule(p)) // except_star_block+ && (except_block_var = except_block_rule(p)) // except_block ) { - D(fprintf(stderr, "%*c+ _tmp_240[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "except_star_block+ except_block")); - _res = _PyPegen_dummy_name(p, _loop1_247_var, except_block_var); + D(fprintf(stderr, "%*c+ _tmp_241[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "except_star_block+ except_block")); + _res = _PyPegen_dummy_name(p, _loop1_248_var, except_block_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_240[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_241[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "except_star_block+ except_block")); } _res = NULL; @@ -38165,9 +38317,9 @@ _tmp_240_rule(Parser *p) return _res; } -// _tmp_241: assignment_expression | expression !':=' +// _tmp_242: assignment_expression | expression !':=' static void * -_tmp_241_rule(Parser *p) +_tmp_242_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38184,18 +38336,18 @@ _tmp_241_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_241[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "assignment_expression")); + D(fprintf(stderr, "%*c> _tmp_242[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "assignment_expression")); expr_ty assignment_expression_var; if ( (assignment_expression_var = assignment_expression_rule(p)) // assignment_expression ) { - D(fprintf(stderr, "%*c+ _tmp_241[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "assignment_expression")); + D(fprintf(stderr, "%*c+ _tmp_242[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "assignment_expression")); _res = assignment_expression_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_241[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_242[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "assignment_expression")); } { // expression !':=' @@ -38203,7 +38355,7 @@ _tmp_241_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_241[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression !':='")); + D(fprintf(stderr, "%*c> _tmp_242[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression !':='")); expr_ty expression_var; if ( (expression_var = expression_rule(p)) // expression @@ -38211,12 +38363,12 @@ _tmp_241_rule(Parser *p) _PyPegen_lookahead_with_int(0, _PyPegen_expect_token, p, 53) // token=':=' ) { - D(fprintf(stderr, "%*c+ _tmp_241[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression !':='")); + D(fprintf(stderr, "%*c+ _tmp_242[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression !':='")); _res = expression_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_241[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_242[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "expression !':='")); } _res = NULL; @@ -38225,9 +38377,9 @@ _tmp_241_rule(Parser *p) return _res; } -// _tmp_242: 'as' star_target +// _tmp_243: 'as' star_target static void * -_tmp_242_rule(Parser *p) +_tmp_243_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38244,7 +38396,7 @@ _tmp_242_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_242[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); + D(fprintf(stderr, "%*c> _tmp_243[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); Token * _keyword; expr_ty star_target_var; if ( @@ -38253,12 +38405,12 @@ _tmp_242_rule(Parser *p) (star_target_var = star_target_rule(p)) // star_target ) { - D(fprintf(stderr, "%*c+ _tmp_242[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' star_target")); + D(fprintf(stderr, "%*c+ _tmp_243[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' star_target")); _res = _PyPegen_dummy_name(p, _keyword, star_target_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_242[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_243[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'as' star_target")); } _res = NULL; @@ -38267,9 +38419,9 @@ _tmp_242_rule(Parser *p) return _res; } -// _tmp_243: 'as' star_target +// _tmp_244: 'as' star_target static void * -_tmp_243_rule(Parser *p) +_tmp_244_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38286,7 +38438,7 @@ _tmp_243_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_243[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); + D(fprintf(stderr, "%*c> _tmp_244[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); Token * _keyword; expr_ty star_target_var; if ( @@ -38295,12 +38447,12 @@ _tmp_243_rule(Parser *p) (star_target_var = star_target_rule(p)) // star_target ) { - D(fprintf(stderr, "%*c+ _tmp_243[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' star_target")); + D(fprintf(stderr, "%*c+ _tmp_244[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' star_target")); _res = _PyPegen_dummy_name(p, _keyword, star_target_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_243[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_244[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'as' star_target")); } _res = NULL; @@ -38309,9 +38461,9 @@ _tmp_243_rule(Parser *p) return _res; } -// _tmp_244: 'as' star_target +// _tmp_245: 'as' star_target static void * -_tmp_244_rule(Parser *p) +_tmp_245_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38328,7 +38480,7 @@ _tmp_244_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_244[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); + D(fprintf(stderr, "%*c> _tmp_245[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); Token * _keyword; expr_ty star_target_var; if ( @@ -38337,12 +38489,12 @@ _tmp_244_rule(Parser *p) (star_target_var = star_target_rule(p)) // star_target ) { - D(fprintf(stderr, "%*c+ _tmp_244[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' star_target")); + D(fprintf(stderr, "%*c+ _tmp_245[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' star_target")); _res = _PyPegen_dummy_name(p, _keyword, star_target_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_244[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_245[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'as' star_target")); } _res = NULL; @@ -38351,9 +38503,9 @@ _tmp_244_rule(Parser *p) return _res; } -// _tmp_245: 'as' star_target +// _tmp_246: 'as' star_target static void * -_tmp_245_rule(Parser *p) +_tmp_246_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38370,7 +38522,7 @@ _tmp_245_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_245[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); + D(fprintf(stderr, "%*c> _tmp_246[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); Token * _keyword; expr_ty star_target_var; if ( @@ -38379,12 +38531,12 @@ _tmp_245_rule(Parser *p) (star_target_var = star_target_rule(p)) // star_target ) { - D(fprintf(stderr, "%*c+ _tmp_245[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' star_target")); + D(fprintf(stderr, "%*c+ _tmp_246[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' star_target")); _res = _PyPegen_dummy_name(p, _keyword, star_target_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_245[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_246[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'as' star_target")); } _res = NULL; @@ -38393,9 +38545,9 @@ _tmp_245_rule(Parser *p) return _res; } -// _loop1_246: except_block +// _loop1_247: except_block static asdl_seq * -_loop1_246_rule(Parser *p) +_loop1_247_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38422,7 +38574,7 @@ _loop1_246_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop1_246[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_block")); + D(fprintf(stderr, "%*c> _loop1_247[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_block")); excepthandler_ty except_block_var; while ( (except_block_var = except_block_rule(p)) // except_block @@ -38444,7 +38596,7 @@ _loop1_246_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop1_246[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop1_247[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "except_block")); } if (_n == 0 || p->error_indicator) { @@ -38462,14 +38614,14 @@ _loop1_246_rule(Parser *p) } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); - _PyPegen_insert_memo(p, _start_mark, _loop1_246_type, _seq); + _PyPegen_insert_memo(p, _start_mark, _loop1_247_type, _seq); p->level--; return _seq; } -// _loop1_247: except_star_block +// _loop1_248: except_star_block static asdl_seq * -_loop1_247_rule(Parser *p) +_loop1_248_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38496,7 +38648,7 @@ _loop1_247_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop1_247[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_star_block")); + D(fprintf(stderr, "%*c> _loop1_248[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_star_block")); excepthandler_ty except_star_block_var; while ( (except_star_block_var = except_star_block_rule(p)) // except_star_block @@ -38518,7 +38670,7 @@ _loop1_247_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop1_247[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop1_248[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "except_star_block")); } if (_n == 0 || p->error_indicator) { @@ -38536,7 +38688,7 @@ _loop1_247_rule(Parser *p) } for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); PyMem_Free(_children); - _PyPegen_insert_memo(p, _start_mark, _loop1_247_type, _seq); + _PyPegen_insert_memo(p, _start_mark, _loop1_248_type, _seq); p->level--; return _seq; } diff --git a/Parser/pegen.c b/Parser/pegen.c index 143461d44a1a4a..dbf105aedcf427 100644 --- a/Parser/pegen.c +++ b/Parser/pegen.c @@ -77,13 +77,7 @@ init_normalization(Parser *p) if (p->normalize) { return 1; } - PyObject *m = PyImport_ImportModule("unicodedata"); - if (!m) - { - return 0; - } - p->normalize = PyObject_GetAttrString(m, "normalize"); - Py_DECREF(m); + p->normalize = _PyImport_GetModuleAttrString("unicodedata", "normalize"); if (!p->normalize) { return 0; @@ -774,6 +768,9 @@ _PyPegen_Parser_New(struct tok_state *tok, int start_rule, int flags, p->known_err_token = NULL; p->level = 0; p->call_invalid_rules = 0; +#ifdef Py_DEBUG + p->debug = _Py_GetConfig()->parser_debug; +#endif return p; } diff --git a/Parser/pegen.h b/Parser/pegen.h index d6a6e4e1eeb2f9..d8ac7e8cb918f7 100644 --- a/Parser/pegen.h +++ b/Parser/pegen.h @@ -78,6 +78,7 @@ typedef struct { Token *known_err_token; int level; int call_invalid_rules; + int debug; } Parser; typedef struct { diff --git a/Parser/string_parser.c b/Parser/string_parser.c index 9c12d8ca101d00..984c05d39c1a17 100644 --- a/Parser/string_parser.c +++ b/Parser/string_parser.c @@ -756,7 +756,9 @@ fstring_find_expr(Parser *p, const char **str, const char *end, int raw, int rec while (Py_ISSPACE(**str)) { *str += 1; } - + if (*str >= end) { + goto unexpected_end_of_string; + } /* Set *expr_text to the text of the expression. */ *expr_text = PyUnicode_FromStringAndSize(expr_start, *str-expr_start); if (!*expr_text) { @@ -767,27 +769,43 @@ fstring_find_expr(Parser *p, const char **str, const char *end, int raw, int rec /* Check for a conversion char, if present. */ if (**str == '!') { *str += 1; - if (*str >= end) { - goto unexpected_end_of_string; + const char *conv_start = *str; + while (1) { + if (*str >= end) { + goto unexpected_end_of_string; + } + if (**str == '}' || **str == ':') { + break; + } + *str += 1; + } + if (*str == conv_start) { + RAISE_SYNTAX_ERROR( + "f-string: missed conversion character"); + goto error; } - conversion = (unsigned char)**str; - *str += 1; - + conversion = (unsigned char)*conv_start; /* Validate the conversion. */ - if (!(conversion == 's' || conversion == 'r' || conversion == 'a')) { - RAISE_SYNTAX_ERROR( - "f-string: invalid conversion character: " - "expected 's', 'r', or 'a'"); + if ((*str != conv_start + 1) || + !(conversion == 's' || conversion == 'r' || conversion == 'a')) + { + PyObject *conv_obj = PyUnicode_FromStringAndSize(conv_start, + *str-conv_start); + if (conv_obj) { + RAISE_SYNTAX_ERROR( + "f-string: invalid conversion character %R: " + "expected 's', 'r', or 'a'", + conv_obj); + Py_DECREF(conv_obj); + } goto error; } } /* Check for the format spec, if present. 
*/ - if (*str >= end) { - goto unexpected_end_of_string; - } + assert(*str < end); if (**str == ':') { *str += 1; if (*str >= end) { diff --git a/Parser/tokenizer.c b/Parser/tokenizer.c index 7c797180956d54..952265eb923f9d 100644 --- a/Parser/tokenizer.c +++ b/Parser/tokenizer.c @@ -88,6 +88,9 @@ tok_new(void) tok->async_def_nl = 0; tok->interactive_underflow = IUNDERFLOW_NORMAL; tok->str = NULL; +#ifdef Py_DEBUG + tok->debug = _Py_GetConfig()->parser_debug; +#endif return tok; } @@ -418,7 +421,7 @@ tok_readline_recode(struct tok_state *tok) { static int fp_setreadl(struct tok_state *tok, const char* enc) { - PyObject *readline, *io, *stream; + PyObject *readline, *open, *stream; int fd; long pos; @@ -435,13 +438,13 @@ fp_setreadl(struct tok_state *tok, const char* enc) return 0; } - io = PyImport_ImportModule("io"); - if (io == NULL) { + open = _PyImport_GetModuleAttrString("io", "open"); + if (open == NULL) { return 0; } - stream = _PyObject_CallMethod(io, &_Py_ID(open), "isisOOO", + stream = PyObject_CallFunction(open, "isisOOO", fd, "r", -1, enc, Py_None, Py_None, Py_False); - Py_DECREF(io); + Py_DECREF(open); if (stream == NULL) { return 0; } @@ -1021,7 +1024,7 @@ tok_nextc(struct tok_state *tok) rc = tok_underflow_file(tok); } #if defined(Py_DEBUG) - if (Py_DebugFlag) { + if (tok->debug) { fprintf(stderr, "line[%d] = ", tok->lineno); print_escape(stderr, tok->cur, tok->inp - tok->cur); fprintf(stderr, " tok->done = %d\n", tok->done); @@ -1102,11 +1105,7 @@ static int syntaxerror(struct tok_state *tok, const char *format, ...) { va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif int ret = _syntaxerror_range(tok, format, -1, -1, vargs); va_end(vargs); return ret; @@ -1118,11 +1117,7 @@ syntaxerror_known_range(struct tok_state *tok, const char *format, ...) { va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif int ret = _syntaxerror_range(tok, format, col_offset, end_col_offset, vargs); va_end(vargs); return ret; @@ -1143,11 +1138,7 @@ parser_warn(struct tok_state *tok, PyObject *category, const char *format, ...) { PyObject *errmsg; va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif errmsg = PyUnicode_FromFormatV(format, vargs); va_end(vargs); if (!errmsg) { diff --git a/Parser/tokenizer.h b/Parser/tokenizer.h index dba71bd60fefe4..5ac64a99b7d661 100644 --- a/Parser/tokenizer.h +++ b/Parser/tokenizer.h @@ -84,6 +84,9 @@ struct tok_state { NEWLINE token after it. */ /* How to proceed when asked for a new token in interactive mode */ enum interactive_underflow_t interactive_underflow; +#ifdef Py_DEBUG + int debug; +#endif }; extern struct tok_state *_PyTokenizer_FromString(const char *, int); diff --git a/Programs/_testembed.c b/Programs/_testembed.c index 9d3d0cbddf0e53..542e46968ce564 100644 --- a/Programs/_testembed.c +++ b/Programs/_testembed.c @@ -1550,6 +1550,46 @@ static int test_init_setpythonhome(void) } +static int test_init_is_python_build(void) +{ + // gh-91985: in-tree builds fail to check for build directory landmarks + // under the effect of 'home' or PYTHONHOME environment variable. 
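The string_parser.c hunk above changes f-string conversion handling: instead of consuming a single character after '!', the parser now collects everything up to the next '}' or ':' and only then checks that it is exactly one of 's', 'r' or 'a', so that input like f"{x!repr}" can be reported together with the offending text. Below is a minimal standalone sketch of that scan-then-validate approach; the function name, error wording and the simplified end-of-field handling are illustrative assumptions, not CPython's internal API.

#include <stdio.h>
#include <string.h>

/* Scan an f-string replacement field just past the '!' marker.  `p` points
 * after '!', `end` is one past the last character of the field.  On success
 * stores the conversion character in *conv and returns 1; on error prints a
 * diagnostic and returns 0. */
static int
scan_conversion(const char *p, const char *end, char *conv)
{
    const char *start = p;
    /* Collect everything up to the next '}' or the start of a format spec. */
    while (p < end && *p != '}' && *p != ':') {
        p++;
    }
    if (p == start) {
        fprintf(stderr, "missed conversion character\n");
        return 0;
    }
    if (p != start + 1 || (*start != 's' && *start != 'r' && *start != 'a')) {
        fprintf(stderr, "invalid conversion character '%.*s': "
                        "expected 's', 'r', or 'a'\n",
                (int)(p - start), start);
        return 0;
    }
    *conv = *start;
    return 1;
}

int main(void)
{
    char c;
    const char *ok = "r}";      /* as in f"{x!r}"    */
    const char *bad = "repr}";  /* as in f"{x!repr}" */
    printf("%d\n", scan_conversion(ok, ok + strlen(ok), &c));    /* prints 1 */
    printf("%d\n", scan_conversion(bad, bad + strlen(bad), &c)); /* prints 0 */
    return 0;
}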
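The pegen.c and tokenizer.c hunks above also collapse the usual import-module-then-getattr sequence into one internal helper that returns a single attribute of a named module. A rough equivalent built only from the public C API could look like the sketch below; the helper name is made up for illustration and is not the internal function the patch uses.

#include <Python.h>

/* Hypothetical helper: import `module` and return a new reference to its
 * attribute `attr`, or NULL with an exception set.  Equivalent in spirit to
 * what the patch centralizes, e.g.
 *     normalize = get_module_attr("unicodedata", "normalize");
 */
PyObject *
get_module_attr(const char *module, const char *attr)
{
    PyObject *mod = PyImport_ImportModule(module);
    if (mod == NULL) {
        return NULL;
    }
    PyObject *value = PyObject_GetAttrString(mod, attr);
    Py_DECREF(mod);  /* the attribute keeps the module alive if it needs to */
    return value;
}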
+ char *env = getenv("TESTHOME"); + if (!env) { + error("missing TESTHOME env var"); + return 1; + } + wchar_t *home = Py_DecodeLocale(env, NULL); + if (home == NULL) { + error("failed to decode TESTHOME"); + return 1; + } + + PyConfig config; + _PyConfig_InitCompatConfig(&config); + config_set_program_name(&config); + config_set_string(&config, &config.home, home); + PyMem_RawFree(home); + putenv("TESTHOME="); + + // Use an impossible value so we can detect whether it isn't updated + // during initialization. + config._is_python_build = INT_MAX; + env = getenv("NEGATIVE_ISPYTHONBUILD"); + if (env && strcmp(env, "0") != 0) { + config._is_python_build++; + } + init_from_config_clear(&config); + Py_Finalize(); + // Second initialization + config._is_python_build = -1; + init_from_config_clear(&config); + dump_config(); // home and _is_python_build are cached in _Py_path_config + Py_Finalize(); + return 0; +} + + static int test_init_warnoptions(void) { putenv("PYTHONWARNINGS=ignore:::env1,ignore:::env2"); @@ -1965,6 +2005,7 @@ static struct TestCase TestCases[] = { {"test_init_setpath", test_init_setpath}, {"test_init_setpath_config", test_init_setpath_config}, {"test_init_setpythonhome", test_init_setpythonhome}, + {"test_init_is_python_build", test_init_is_python_build}, {"test_init_warnoptions", test_init_warnoptions}, {"test_init_set_config", test_init_set_config}, {"test_run_main", test_run_main}, diff --git a/Programs/test_frozenmain.h b/Programs/test_frozenmain.h index 1c279134e94dc9..55d461c8beb809 100644 --- a/Programs/test_frozenmain.h +++ b/Programs/test_frozenmain.h @@ -1,42 +1,42 @@ // Auto-generated by Programs/freeze_test_frozenmain.py unsigned char M_test_frozenmain[] = { 227,0,0,0,0,0,0,0,0,0,0,0,0,8,0,0, - 0,0,0,0,0,115,176,0,0,0,151,0,100,0,100,1, + 0,0,0,0,0,243,182,0,0,0,151,0,100,0,100,1, 108,0,90,0,100,0,100,1,108,1,90,1,2,0,101,2, - 100,2,166,1,0,0,171,1,0,0,0,0,0,0,0,0, - 1,0,2,0,101,2,100,3,101,0,106,3,0,0,0,0, - 0,0,0,0,166,2,0,0,171,2,0,0,0,0,0,0, - 0,0,1,0,2,0,101,1,106,4,0,0,0,0,0,0, - 0,0,166,0,0,0,171,0,0,0,0,0,0,0,0,0, - 100,4,25,0,0,0,0,0,0,0,0,0,90,5,100,5, - 68,0,93,25,90,6,2,0,101,2,100,6,101,6,155,0, - 100,7,101,5,101,6,25,0,0,0,0,0,0,0,0,0, - 155,0,157,4,166,1,0,0,171,1,0,0,0,0,0,0, - 0,0,1,0,140,26,100,1,83,0,41,8,233,0,0,0, - 0,78,122,18,70,114,111,122,101,110,32,72,101,108,108,111, - 32,87,111,114,108,100,122,8,115,121,115,46,97,114,103,118, - 218,6,99,111,110,102,105,103,41,5,218,12,112,114,111,103, - 114,97,109,95,110,97,109,101,218,10,101,120,101,99,117,116, - 97,98,108,101,218,15,117,115,101,95,101,110,118,105,114,111, - 110,109,101,110,116,218,17,99,111,110,102,105,103,117,114,101, - 95,99,95,115,116,100,105,111,218,14,98,117,102,102,101,114, - 101,100,95,115,116,100,105,111,122,7,99,111,110,102,105,103, - 32,122,2,58,32,41,7,218,3,115,121,115,218,17,95,116, - 101,115,116,105,110,116,101,114,110,97,108,99,97,112,105,218, - 5,112,114,105,110,116,218,4,97,114,103,118,218,11,103,101, - 116,95,99,111,110,102,105,103,115,114,2,0,0,0,218,3, - 107,101,121,169,0,243,0,0,0,0,250,18,116,101,115,116, - 95,102,114,111,122,101,110,109,97,105,110,46,112,121,250,8, - 60,109,111,100,117,108,101,62,114,17,0,0,0,1,0,0, - 0,115,152,0,0,0,248,240,6,0,1,11,128,10,128,10, - 128,10,216,0,24,208,0,24,208,0,24,208,0,24,224,0, - 5,128,5,208,6,26,209,0,27,212,0,27,208,0,27,216, - 0,5,128,5,128,106,144,35,148,40,209,0,27,212,0,27, - 208,0,27,216,9,38,208,9,26,212,9,38,209,9,40,212, - 9,40,168,24,212,9,50,128,6,240,2,6,12,2,240,0, - 
7,1,42,240,0,7,1,42,128,67,240,14,0,5,10,128, - 69,208,10,40,144,67,208,10,40,208,10,40,152,54,160,35, - 156,59,208,10,40,208,10,40,209,4,41,212,4,41,208,4, - 41,208,4,41,240,15,7,1,42,240,0,7,1,42,114,15, - 0,0,0, + 100,2,171,1,0,0,0,0,0,0,0,0,1,0,2,0, + 101,2,100,3,101,0,106,6,0,0,0,0,0,0,0,0, + 0,0,0,0,0,0,0,0,0,0,171,2,0,0,0,0, + 0,0,0,0,1,0,2,0,101,1,106,8,0,0,0,0, + 0,0,0,0,0,0,0,0,0,0,0,0,0,0,171,0, + 0,0,0,0,0,0,0,0,100,4,25,0,0,0,0,0, + 0,0,0,0,90,5,100,5,68,0,93,23,0,0,90,6, + 2,0,101,2,100,6,101,6,155,0,100,7,101,5,101,6, + 25,0,0,0,0,0,0,0,0,0,155,0,157,4,171,1, + 0,0,0,0,0,0,0,0,1,0,140,25,100,1,83,0, + 41,8,233,0,0,0,0,78,122,18,70,114,111,122,101,110, + 32,72,101,108,108,111,32,87,111,114,108,100,122,8,115,121, + 115,46,97,114,103,118,218,6,99,111,110,102,105,103,41,5, + 218,12,112,114,111,103,114,97,109,95,110,97,109,101,218,10, + 101,120,101,99,117,116,97,98,108,101,218,15,117,115,101,95, + 101,110,118,105,114,111,110,109,101,110,116,218,17,99,111,110, + 102,105,103,117,114,101,95,99,95,115,116,100,105,111,218,14, + 98,117,102,102,101,114,101,100,95,115,116,100,105,111,122,7, + 99,111,110,102,105,103,32,122,2,58,32,41,7,218,3,115, + 121,115,218,17,95,116,101,115,116,105,110,116,101,114,110,97, + 108,99,97,112,105,218,5,112,114,105,110,116,218,4,97,114, + 103,118,218,11,103,101,116,95,99,111,110,102,105,103,115,114, + 3,0,0,0,218,3,107,101,121,169,0,243,0,0,0,0, + 250,18,116,101,115,116,95,102,114,111,122,101,110,109,97,105, + 110,46,112,121,250,8,60,109,111,100,117,108,101,62,114,18, + 0,0,0,1,0,0,0,115,145,0,0,0,248,240,6,0, + 1,11,128,10,128,10,128,10,216,0,24,208,0,24,208,0, + 24,208,0,24,224,0,5,128,5,208,6,26,212,0,27,208, + 0,27,216,0,5,128,5,128,106,144,35,151,40,145,40,212, + 0,27,208,0,27,216,9,38,208,9,26,215,9,38,209,9, + 38,212,9,40,168,24,212,9,50,128,6,240,2,6,12,2, + 240,0,7,1,42,241,0,7,1,42,128,67,240,14,0,5, + 10,128,69,208,10,40,144,67,208,10,40,208,10,40,152,54, + 160,35,156,59,208,10,40,208,10,40,212,4,41,208,4,41, + 208,4,41,240,15,7,1,42,240,0,7,1,42,114,16,0, + 0,0, }; diff --git a/Python/Python-ast.c b/Python/Python-ast.c index 3861eaf978a38f..e52a72d43bcbd8 100644 --- a/Python/Python-ast.c +++ b/Python/Python-ast.c @@ -5697,7 +5697,7 @@ obj2ast_stmt(struct ast_state *state, PyObject* obj, stmt_ty* out, PyArena* } if (tmp == NULL || tmp == Py_None) { Py_CLEAR(tmp); - end_lineno = 0; + end_lineno = lineno; } else { int res; @@ -5714,7 +5714,7 @@ obj2ast_stmt(struct ast_state *state, PyObject* obj, stmt_ty* out, PyArena* } if (tmp == NULL || tmp == Py_None) { Py_CLEAR(tmp); - end_col_offset = 0; + end_col_offset = col_offset; } else { int res; @@ -8114,7 +8114,7 @@ obj2ast_expr(struct ast_state *state, PyObject* obj, expr_ty* out, PyArena* } if (tmp == NULL || tmp == Py_None) { Py_CLEAR(tmp); - end_lineno = 0; + end_lineno = lineno; } else { int res; @@ -8131,7 +8131,7 @@ obj2ast_expr(struct ast_state *state, PyObject* obj, expr_ty* out, PyArena* } if (tmp == NULL || tmp == Py_None) { Py_CLEAR(tmp); - end_col_offset = 0; + end_col_offset = col_offset; } else { int res; @@ -10291,7 +10291,7 @@ obj2ast_excepthandler(struct ast_state *state, PyObject* obj, excepthandler_ty* } if (tmp == NULL || tmp == Py_None) { Py_CLEAR(tmp); - end_lineno = 0; + end_lineno = lineno; } else { int res; @@ -10308,7 +10308,7 @@ obj2ast_excepthandler(struct ast_state *state, PyObject* obj, excepthandler_ty* } if (tmp == NULL || tmp == Py_None) { Py_CLEAR(tmp); - end_col_offset = 0; + end_col_offset = col_offset; } else { int res; @@ -10755,7 +10755,7 @@ obj2ast_arg(struct 
ast_state *state, PyObject* obj, arg_ty* out, PyArena* arena) } if (tmp == NULL || tmp == Py_None) { Py_CLEAR(tmp); - end_lineno = 0; + end_lineno = lineno; } else { int res; @@ -10772,7 +10772,7 @@ obj2ast_arg(struct ast_state *state, PyObject* obj, arg_ty* out, PyArena* arena) } if (tmp == NULL || tmp == Py_None) { Py_CLEAR(tmp); - end_col_offset = 0; + end_col_offset = col_offset; } else { int res; @@ -10877,7 +10877,7 @@ obj2ast_keyword(struct ast_state *state, PyObject* obj, keyword_ty* out, } if (tmp == NULL || tmp == Py_None) { Py_CLEAR(tmp); - end_lineno = 0; + end_lineno = lineno; } else { int res; @@ -10894,7 +10894,7 @@ obj2ast_keyword(struct ast_state *state, PyObject* obj, keyword_ty* out, } if (tmp == NULL || tmp == Py_None) { Py_CLEAR(tmp); - end_col_offset = 0; + end_col_offset = col_offset; } else { int res; @@ -10999,7 +10999,7 @@ obj2ast_alias(struct ast_state *state, PyObject* obj, alias_ty* out, PyArena* } if (tmp == NULL || tmp == Py_None) { Py_CLEAR(tmp); - end_lineno = 0; + end_lineno = lineno; } else { int res; @@ -11016,7 +11016,7 @@ obj2ast_alias(struct ast_state *state, PyObject* obj, alias_ty* out, PyArena* } if (tmp == NULL || tmp == Py_None) { Py_CLEAR(tmp); - end_col_offset = 0; + end_col_offset = col_offset; } else { int res; diff --git a/Python/_warnings.c b/Python/_warnings.c index 942308b357e338..601dae8fb46558 100644 --- a/Python/_warnings.c +++ b/Python/_warnings.c @@ -4,7 +4,6 @@ #include "pycore_long.h" // _PyLong_GetZero() #include "pycore_pyerrors.h" #include "pycore_pystate.h" // _PyThreadState_GET() -#include "frameobject.h" // PyFrame_GetBack() #include "pycore_frame.h" #include "clinic/_warnings.c.h" @@ -1136,11 +1135,7 @@ PyErr_WarnFormat(PyObject *category, Py_ssize_t stack_level, int res; va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif res = _PyErr_WarnFormatV(NULL, category, stack_level, format, vargs); va_end(vargs); return res; @@ -1153,11 +1148,7 @@ _PyErr_WarnFormat(PyObject *source, PyObject *category, Py_ssize_t stack_level, int res; va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif res = _PyErr_WarnFormatV(source, category, stack_level, format, vargs); va_end(vargs); return res; @@ -1170,11 +1161,7 @@ PyErr_ResourceWarning(PyObject *source, Py_ssize_t stack_level, int res; va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif res = _PyErr_WarnFormatV(source, PyExc_ResourceWarning, stack_level, format, vargs); va_end(vargs); @@ -1274,11 +1261,7 @@ PyErr_WarnExplicitFormat(PyObject *category, goto exit; } -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif message = PyUnicode_FromFormatV(format, vargs); if (message != NULL) { PyObject *res; diff --git a/Python/ast.c b/Python/ast.c index 607281e2685535..a0321b58ba8cff 100644 --- a/Python/ast.c +++ b/Python/ast.c @@ -22,6 +22,27 @@ static int validate_stmt(struct validator *, stmt_ty); static int validate_expr(struct validator *, expr_ty, expr_context_ty); static int validate_pattern(struct validator *, pattern_ty, int); +#define VALIDATE_POSITIONS(node) \ + if (node->lineno > node->end_lineno) { \ + PyErr_Format(PyExc_ValueError, \ + "AST node line range (%d, %d) is not valid", \ + node->lineno, node->end_lineno); \ + return 0; \ + } \ + if ((node->lineno < 0 && node->end_lineno != node->lineno) || \ + (node->col_offset < 0 && node->col_offset != node->end_col_offset)) { \ + 
PyErr_Format(PyExc_ValueError, \ + "AST node column range (%d, %d) for line range (%d, %d) is not valid", \ + node->col_offset, node->end_col_offset, node->lineno, node->end_lineno); \ + return 0; \ + } \ + if (node->lineno == node->end_lineno && node->col_offset > node->end_col_offset) { \ + PyErr_Format(PyExc_ValueError, \ + "line %d, column %d-%d is not a valid range", \ + node->lineno, node->col_offset, node->end_col_offset); \ + return 0; \ + } + static int validate_name(PyObject *name) { @@ -75,6 +96,7 @@ validate_args(struct validator *state, asdl_arg_seq *args) Py_ssize_t i; for (i = 0; i < asdl_seq_LEN(args); i++) { arg_ty arg = asdl_seq_GET(args, i); + VALIDATE_POSITIONS(arg); if (arg->annotation && !validate_expr(state, arg->annotation, Load)) return 0; } @@ -183,6 +205,7 @@ validate_constant(struct validator *state, PyObject *value) static int validate_expr(struct validator *state, expr_ty exp, expr_context_ty ctx) { + VALIDATE_POSITIONS(exp); int ret = -1; if (++state->recursion_depth > state->recursion_limit) { PyErr_SetString(PyExc_RecursionError, @@ -505,6 +528,7 @@ validate_capture(PyObject *name) static int validate_pattern(struct validator *state, pattern_ty p, int star_ok) { + VALIDATE_POSITIONS(p); int ret = -1; if (++state->recursion_depth > state->recursion_limit) { PyErr_SetString(PyExc_RecursionError, @@ -674,6 +698,7 @@ validate_body(struct validator *state, asdl_stmt_seq *body, const char *owner) static int validate_stmt(struct validator *state, stmt_ty stmt) { + VALIDATE_POSITIONS(stmt); int ret = -1; Py_ssize_t i; if (++state->recursion_depth > state->recursion_limit) { @@ -807,6 +832,7 @@ validate_stmt(struct validator *state, stmt_ty stmt) } for (i = 0; i < asdl_seq_LEN(stmt->v.Try.handlers); i++) { excepthandler_ty handler = asdl_seq_GET(stmt->v.Try.handlers, i); + VALIDATE_POSITIONS(handler); if ((handler->v.ExceptHandler.type && !validate_expr(state, handler->v.ExceptHandler.type, Load)) || !validate_body(state, handler->v.ExceptHandler.body, "ExceptHandler")) diff --git a/Python/bltinmodule.c b/Python/bltinmodule.c index 072bf75bf8d697..6284bbdf97468e 100644 --- a/Python/bltinmodule.c +++ b/Python/bltinmodule.c @@ -198,6 +198,7 @@ builtin___build_class__(PyObject *self, PyObject *const *args, Py_ssize_t nargs, goto error; } PyThreadState *tstate = _PyThreadState_GET(); + EVAL_CALL_STAT_INC(EVAL_CALL_BUILD_CLASS); cell = _PyEval_Vector(tstate, (PyFunctionObject *)func, ns, NULL, 0, NULL); if (cell != NULL) { if (bases != orig_bases) { diff --git a/Python/ceval.c b/Python/ceval.c index c73218fcf307ef..47dd1008c35ced 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -5,6 +5,8 @@ XXX document it! 
*/ +#define _PY_INTERPRETER + #include "Python.h" #include "pycore_abstract.h" // _PyIndex_Check() #include "pycore_call.h" // _PyObject_FastCallDictTstate() @@ -20,13 +22,13 @@ #include "pycore_pylifecycle.h" // _PyErr_Print() #include "pycore_pymem.h" // _PyMem_IsPtrFreed() #include "pycore_pystate.h" // _PyInterpreterState_GET() +#include "pycore_range.h" // _PyRangeIterObject #include "pycore_sysmodule.h" // _PySys_Audit() #include "pycore_tuple.h" // _PyTuple_ITEMS() #include "pycore_emscripten_signal.h" // _Py_CHECK_EMSCRIPTEN_SIGNALS #include "pycore_dict.h" #include "dictobject.h" -#include "frameobject.h" #include "pycore_frame.h" #include "opcode.h" #include "pydtrace.h" @@ -55,6 +57,7 @@ #undef Py_DECREF #define Py_DECREF(arg) \ do { \ + _Py_DECREF_STAT_INC(); \ PyObject *op = _PyObject_CAST(arg); \ if (--op->ob_refcnt == 0) { \ destructor dealloc = Py_TYPE(op)->tp_dealloc; \ @@ -78,6 +81,7 @@ #undef _Py_DECREF_SPECIALIZED #define _Py_DECREF_SPECIALIZED(arg, dealloc) \ do { \ + _Py_DECREF_STAT_INC(); \ PyObject *op = _PyObject_CAST(arg); \ if (--op->ob_refcnt == 0) { \ destructor d = (destructor)(dealloc); \ @@ -103,7 +107,6 @@ static PyObject * do_call_core( PyObject *callargs, PyObject *kwdict, int use_tracing); #ifdef LLTRACE -static int lltrace; static void dump_stack(_PyInterpreterFrame *frame, PyObject **stack_pointer) { @@ -344,26 +347,11 @@ void _Py_NO_RETURN _Py_FatalError_TstateNULL(const char *func) { _Py_FatalErrorFunc(func, - "the function must be called with the GIL held, " - "but the GIL is released " - "(the current Python thread state is NULL)"); -} - -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS -int -_PyEval_ThreadsInitialized(PyInterpreterState *interp) -{ - return gil_created(&interp->ceval.gil); + "the function must be called with the GIL held, " + "after Python initialization and before Python finalization, " + "but the GIL is released (the current Python thread state is NULL)"); } -int -PyEval_ThreadsInitialized(void) -{ - // Fatal error if there is no current interpreter - PyInterpreterState *interp = PyInterpreterState_Get(); - return _PyEval_ThreadsInitialized(interp); -} -#else int _PyEval_ThreadsInitialized(_PyRuntimeState *runtime) { @@ -376,25 +364,18 @@ PyEval_ThreadsInitialized(void) _PyRuntimeState *runtime = &_PyRuntime; return _PyEval_ThreadsInitialized(runtime); } -#endif PyStatus _PyEval_InitGIL(PyThreadState *tstate) { -#ifndef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS if (!_Py_IsMainInterpreter(tstate->interp)) { /* Currently, the GIL is shared by all interpreters, and only the main interpreter is responsible to create and destroy it. */ return _PyStatus_OK(); } -#endif -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - struct _gil_runtime_state *gil = &tstate->interp->ceval.gil; -#else struct _gil_runtime_state *gil = &tstate->interp->runtime->ceval.gil; -#endif assert(!gil_created(gil)); PyThread_init_thread(); @@ -409,20 +390,14 @@ _PyEval_InitGIL(PyThreadState *tstate) void _PyEval_FiniGIL(PyInterpreterState *interp) { -#ifndef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS if (!_Py_IsMainInterpreter(interp)) { /* Currently, the GIL is shared by all interpreters, and only the main interpreter is responsible to create and destroy it. */ return; } -#endif -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - struct _gil_runtime_state *gil = &interp->ceval.gil; -#else struct _gil_runtime_state *gil = &interp->runtime->ceval.gil; -#endif if (!gil_created(gil)) { /* First Py_InitializeFromConfig() call: the GIL doesn't exist yet: do nothing. 
*/ @@ -486,13 +461,9 @@ PyEval_AcquireThread(PyThreadState *tstate) take_gil(tstate); struct _gilstate_runtime_state *gilstate = &tstate->interp->runtime->gilstate; -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - (void)_PyThreadState_Swap(gilstate, tstate); -#else if (_PyThreadState_Swap(gilstate, tstate) != NULL) { Py_FatalError("non-NULL old thread state"); } -#endif } void @@ -519,11 +490,7 @@ _PyEval_ReInitThreads(PyThreadState *tstate) { _PyRuntimeState *runtime = tstate->interp->runtime; -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - struct _gil_runtime_state *gil = &tstate->interp->ceval.gil; -#else struct _gil_runtime_state *gil = &runtime->ceval.gil; -#endif if (!gil_created(gil)) { return _PyStatus_OK(); } @@ -555,21 +522,12 @@ PyThreadState * PyEval_SaveThread(void) { _PyRuntimeState *runtime = &_PyRuntime; -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - PyThreadState *old_tstate = _PyThreadState_GET(); - PyThreadState *tstate = _PyThreadState_Swap(&runtime->gilstate, old_tstate); -#else PyThreadState *tstate = _PyThreadState_Swap(&runtime->gilstate, NULL); -#endif _Py_EnsureTstateNotNULL(tstate); struct _ceval_runtime_state *ceval = &runtime->ceval; struct _ceval_state *ceval2 = &tstate->interp->ceval; -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - assert(gil_created(&ceval2->gil)); -#else assert(gil_created(&ceval->gil)); -#endif drop_gil(ceval, ceval2, tstate); return tstate; } @@ -833,9 +791,7 @@ Py_MakePendingCalls(void) void _PyEval_InitRuntimeState(struct _ceval_runtime_state *ceval) { -#ifndef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS _gil_initialize(&ceval->gil); -#endif } void @@ -845,10 +801,6 @@ _PyEval_InitState(struct _ceval_state *ceval, PyThread_type_lock pending_lock) assert(pending->lock == NULL); pending->lock = pending_lock; - -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - _gil_initialize(&ceval->gil); -#endif } void @@ -1204,6 +1156,7 @@ PyEval_EvalCode(PyObject *co, PyObject *globals, PyObject *locals) if (func == NULL) { return NULL; } + EVAL_CALL_STAT_INC(EVAL_CALL_LEGACY); PyObject *res = _PyEval_Vector(tstate, func, locals, NULL, 0, NULL); Py_DECREF(func); return res; @@ -1263,13 +1216,9 @@ eval_frame_handle_pending(PyThreadState *tstate) take_gil(tstate); -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - (void)_PyThreadState_Swap(&runtime->gilstate, tstate); -#else if (_PyThreadState_Swap(&runtime->gilstate, tstate) != NULL) { Py_FatalError("orphan tstate"); } -#endif } /* Check for asynchronous exception. */ @@ -1361,7 +1310,7 @@ eval_frame_handle_pending(PyThreadState *tstate) do { \ frame->prev_instr = next_instr++; \ OPCODE_EXE_INC(op); \ - _py_stats.opcode_stats[lastopcode].pair_count[op]++; \ + if (_py_stats) _py_stats->opcode_stats[lastopcode].pair_count[op]++; \ lastopcode = op; \ } while (0) #else @@ -1435,10 +1384,6 @@ eval_frame_handle_pending(PyThreadState *tstate) #define JUMPTO(x) (next_instr = first_instr + (x)) #define JUMPBY(x) (next_instr += (x)) -// Skip from a PRECALL over a CALL to the next instruction: -#define SKIP_CALL() \ - JUMPBY(INLINE_CACHE_ENTRIES_PRECALL + 1 + INLINE_CACHE_ENTRIES_CALL) - /* Get opcode and oparg from original instructions, not quickened form. 
*/ #define TRACING_NEXTOPARG() do { \ NEXTOPARG(); \ @@ -1557,22 +1502,6 @@ eval_frame_handle_pending(PyThreadState *tstate) /* Shared opcode macros */ -// shared by LOAD_ATTR_MODULE and LOAD_METHOD_MODULE -#define LOAD_MODULE_ATTR_OR_METHOD(attr_or_method) \ - _PyAttrCache *cache = (_PyAttrCache *)next_instr; \ - DEOPT_IF(!PyModule_CheckExact(owner), LOAD_##attr_or_method); \ - PyDictObject *dict = (PyDictObject *)((PyModuleObject *)owner)->md_dict; \ - assert(dict != NULL); \ - DEOPT_IF(dict->ma_keys->dk_version != read_u32(cache->version), \ - LOAD_##attr_or_method); \ - assert(dict->ma_keys->dk_kind == DICT_KEYS_UNICODE); \ - assert(cache->index < dict->ma_keys->dk_nentries); \ - PyDictUnicodeEntry *ep = DK_UNICODE_ENTRIES(dict->ma_keys) + cache->index; \ - res = ep->me_value; \ - DEOPT_IF(res == NULL, LOAD_##attr_or_method); \ - STAT_INC(LOAD_##attr_or_method, hit); \ - Py_INCREF(res); - #define TRACE_FUNCTION_EXIT() \ if (cframe.use_tracing) { \ if (trace_function_exit(tstate, frame, retval)) { \ @@ -1616,7 +1545,11 @@ eval_frame_handle_pending(PyThreadState *tstate) dtrace_function_entry(frame); \ } +#define ADAPTIVE_COUNTER_IS_ZERO(cache) \ + (cache)->counter < (1<counter -= (1<interp->ceval.eval_breaker; +#ifdef LLTRACE + int lltrace = 0; +#endif _PyCFrame cframe; CallShape call_shape; @@ -1861,7 +1797,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DISPATCH(); } - TARGET(LOAD_FAST) { + TARGET(LOAD_FAST_CHECK) { PyObject *value = GETLOCAL(oparg); if (value == NULL) { goto unbound_local_error; @@ -1871,6 +1807,14 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DISPATCH(); } + TARGET(LOAD_FAST) { + PyObject *value = GETLOCAL(oparg); + assert(value != NULL); + Py_INCREF(value); + PUSH(value); + DISPATCH(); + } + TARGET(LOAD_CONST) { PREDICTED(LOAD_CONST); PyObject *value = GETITEM(consts, oparg); @@ -1880,7 +1824,6 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int } TARGET(STORE_FAST) { - PREDICTED(STORE_FAST); PyObject *value = POP(); SETLOCAL(oparg, value); DISPATCH(); @@ -1888,17 +1831,13 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int TARGET(LOAD_FAST__LOAD_FAST) { PyObject *value = GETLOCAL(oparg); - if (value == NULL) { - goto unbound_local_error; - } + assert(value != NULL); NEXTOPARG(); next_instr++; Py_INCREF(value); PUSH(value); value = GETLOCAL(oparg); - if (value == NULL) { - goto unbound_local_error; - } + assert(value != NULL); Py_INCREF(value); PUSH(value); NOTRACE_DISPATCH(); @@ -1906,9 +1845,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int TARGET(LOAD_FAST__LOAD_CONST) { PyObject *value = GETLOCAL(oparg); - if (value == NULL) { - goto unbound_local_error; - } + assert(value != NULL); NEXTOPARG(); next_instr++; Py_INCREF(value); @@ -1925,9 +1862,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int NEXTOPARG(); next_instr++; value = GETLOCAL(oparg); - if (value == NULL) { - goto unbound_local_error; - } + assert(value != NULL); Py_INCREF(value); PUSH(value); NOTRACE_DISPATCH(); @@ -1950,9 +1885,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int Py_INCREF(value); PUSH(value); value = GETLOCAL(oparg); - if (value == NULL) { - goto unbound_local_error; - } + assert(value != NULL); Py_INCREF(value); PUSH(value); NOTRACE_DISPATCH(); @@ -2208,7 +2141,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int 
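The ceval.c hunks around here stop comparing `cache->counter` to zero and decrementing it directly, and instead go through ADAPTIVE_COUNTER_IS_ZERO and DECREMENT_ADAPTIVE_COUNTER. A minimal sketch of how such backoff-scaled counters can be defined and used is below; it assumes the shift width is a constant named ADAPTIVE_BACKOFF_BITS as referenced in the hunk above, and the exact CPython definitions and field widths may differ.

#include <stdio.h>
#include <stdint.h>

/* Illustrative only: the "remaining tries" live in the high bits of the
 * counter, leaving the low ADAPTIVE_BACKOFF_BITS bits free for backoff
 * state, so one logical decrement subtracts 1 << ADAPTIVE_BACKOFF_BITS. */
#define ADAPTIVE_BACKOFF_BITS 4

typedef struct {
    uint16_t counter;
} DemoCache;

#define ADAPTIVE_COUNTER_IS_ZERO(cache) \
    ((cache)->counter < (1 << ADAPTIVE_BACKOFF_BITS))

#define DECREMENT_ADAPTIVE_COUNTER(cache) \
    ((cache)->counter -= (1 << ADAPTIVE_BACKOFF_BITS))

int main(void)
{
    /* Start with three deferred attempts before specializing. */
    DemoCache cache = { .counter = 3 << ADAPTIVE_BACKOFF_BITS };
    while (!ADAPTIVE_COUNTER_IS_ZERO(&cache)) {
        printf("deferred, counter=%u\n", (unsigned)cache.counter);
        DECREMENT_ADAPTIVE_COUNTER(&cache);
    }
    printf("counter exhausted: specialize now\n");
    return 0;
}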
TARGET(BINARY_SUBSCR_ADAPTIVE) { _PyBinarySubscrCache *cache = (_PyBinarySubscrCache *)next_instr; - if (cache->counter == 0) { + if (ADAPTIVE_COUNTER_IS_ZERO(cache)) { PyObject *sub = TOP(); PyObject *container = SECOND(); next_instr--; @@ -2219,7 +2152,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int } else { STAT_INC(BINARY_SUBSCR, deferred); - cache->counter--; + DECREMENT_ADAPTIVE_COUNTER(cache); JUMP_TO_INSTRUCTION(BINARY_SUBSCR); } } @@ -2306,16 +2239,12 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int PyFunctionObject *getitem = (PyFunctionObject *)cached; DEOPT_IF(getitem->func_version != cache->func_version, BINARY_SUBSCR); PyCodeObject *code = (PyCodeObject *)getitem->func_code; - size_t size = code->co_nlocalsplus + code->co_stacksize + FRAME_SPECIALS_SIZE; assert(code->co_argcount == 2); - _PyInterpreterFrame *new_frame = _PyThreadState_BumpFramePointer(tstate, size); - if (new_frame == NULL) { - goto error; - } - CALL_STAT_INC(frames_pushed); + DEOPT_IF(!_PyThreadState_HasStackSpace(tstate, code->co_framesize), BINARY_SUBSCR); + STAT_INC(BINARY_SUBSCR, hit); Py_INCREF(getitem); - _PyFrame_InitializeSpecials(new_frame, getitem, - NULL, code->co_nlocalsplus); + _PyInterpreterFrame *new_frame = _PyFrame_PushUnchecked(tstate, getitem); + CALL_STAT_INC(inlined_py_calls); STACK_SHRINK(2); new_frame->localsplus[0] = container; new_frame->localsplus[1] = sub; @@ -2373,7 +2302,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int TARGET(STORE_SUBSCR_ADAPTIVE) { _PyStoreSubscrCache *cache = (_PyStoreSubscrCache *)next_instr; - if (cache->counter == 0) { + if (ADAPTIVE_COUNTER_IS_ZERO(cache)) { PyObject *sub = TOP(); PyObject *container = SECOND(); next_instr--; @@ -2384,7 +2313,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int } else { STAT_INC(STORE_SUBSCR, deferred); - cache->counter--; + DECREMENT_ADAPTIVE_COUNTER(cache); JUMP_TO_INSTRUCTION(STORE_SUBSCR); } } @@ -2698,6 +2627,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int } TARGET(YIELD_VALUE) { + assert(oparg == STACK_LEVEL()); assert(frame->is_entry); PyObject *retval = POP(); _PyFrame_GetGenerator(frame)->gi_frame_state = FRAME_SUSPENDED; @@ -2866,7 +2796,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int TARGET(UNPACK_SEQUENCE_ADAPTIVE) { assert(cframe.use_tracing == 0); _PyUnpackSequenceCache *cache = (_PyUnpackSequenceCache *)next_instr; - if (cache->counter == 0) { + if (ADAPTIVE_COUNTER_IS_ZERO(cache)) { PyObject *seq = TOP(); next_instr--; _Py_Specialize_UnpackSequence(seq, next_instr, oparg); @@ -2874,7 +2804,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int } else { STAT_INC(UNPACK_SEQUENCE, deferred); - cache->counter--; + DECREMENT_ADAPTIVE_COUNTER(cache); JUMP_TO_INSTRUCTION(UNPACK_SEQUENCE); } } @@ -3107,7 +3037,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int TARGET(LOAD_GLOBAL_ADAPTIVE) { assert(cframe.use_tracing == 0); _PyLoadGlobalCache *cache = (_PyLoadGlobalCache *)next_instr; - if (cache->counter == 0) { + if (ADAPTIVE_COUNTER_IS_ZERO(cache)) { PyObject *name = GETITEM(names, oparg>>1); next_instr--; if (_Py_Specialize_LoadGlobal(GLOBALS(), BUILTINS(), next_instr, name) < 0) { @@ -3117,7 +3047,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int } else { STAT_INC(LOAD_GLOBAL, deferred); - 
cache->counter--; + DECREMENT_ADAPTIVE_COUNTER(cache); JUMP_TO_INSTRUCTION(LOAD_GLOBAL); } } @@ -3516,8 +3446,43 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int TARGET(LOAD_ATTR) { PREDICTED(LOAD_ATTR); - PyObject *name = GETITEM(names, oparg); + PyObject *name = GETITEM(names, oparg >> 1); PyObject *owner = TOP(); + if (oparg & 1) { + /* Designed to work in tandem with CALL. */ + PyObject* meth = NULL; + + int meth_found = _PyObject_GetMethod(owner, name, &meth); + + if (meth == NULL) { + /* Most likely attribute wasn't found. */ + goto error; + } + + if (meth_found) { + /* We can bypass temporary bound method object. + meth is unbound method and obj is self. + + meth | self | arg1 | ... | argN + */ + SET_TOP(meth); + PUSH(owner); // self + } + else { + /* meth is not an unbound method (but a regular attr, or + something was returned by a descriptor protocol). Set + the second element of the stack to NULL, to signal + CALL that it's not a method call. + + NULL | meth | arg1 | ... | argN + */ + SET_TOP(NULL); + Py_DECREF(owner); + PUSH(meth); + } + JUMPBY(INLINE_CACHE_ENTRIES_LOAD_ATTR); + DISPATCH(); + } PyObject *res = PyObject_GetAttr(owner, name); if (res == NULL) { goto error; @@ -3531,9 +3496,9 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int TARGET(LOAD_ATTR_ADAPTIVE) { assert(cframe.use_tracing == 0); _PyAttrCache *cache = (_PyAttrCache *)next_instr; - if (cache->counter == 0) { + if (ADAPTIVE_COUNTER_IS_ZERO(cache)) { PyObject *owner = TOP(); - PyObject *name = GETITEM(names, oparg); + PyObject *name = GETITEM(names, oparg>>1); next_instr--; if (_Py_Specialize_LoadAttr(owner, next_instr, name) < 0) { goto error; @@ -3542,7 +3507,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int } else { STAT_INC(LOAD_ATTR, deferred); - cache->counter--; + DECREMENT_ADAPTIVE_COUNTER(cache); JUMP_TO_INSTRUCTION(LOAD_ATTR); } } @@ -3564,6 +3529,8 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DEOPT_IF(res == NULL, LOAD_ATTR); STAT_INC(LOAD_ATTR, hit); Py_INCREF(res); + SET_TOP(NULL); + STACK_GROW((oparg & 1)); SET_TOP(res); Py_DECREF(owner); JUMPBY(INLINE_CACHE_ENTRIES_LOAD_ATTR); @@ -3572,10 +3539,23 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int TARGET(LOAD_ATTR_MODULE) { assert(cframe.use_tracing == 0); - // shared with LOAD_METHOD_MODULE PyObject *owner = TOP(); PyObject *res; - LOAD_MODULE_ATTR_OR_METHOD(ATTR); + _PyAttrCache *cache = (_PyAttrCache *)next_instr; + DEOPT_IF(!PyModule_CheckExact(owner), LOAD_ATTR); + PyDictObject *dict = (PyDictObject *)((PyModuleObject *)owner)->md_dict; + assert(dict != NULL); + DEOPT_IF(dict->ma_keys->dk_version != read_u32(cache->version), + LOAD_ATTR); + assert(dict->ma_keys->dk_kind == DICT_KEYS_UNICODE); + assert(cache->index < dict->ma_keys->dk_nentries); + PyDictUnicodeEntry *ep = DK_UNICODE_ENTRIES(dict->ma_keys) + cache->index; + res = ep->me_value; + DEOPT_IF(res == NULL, LOAD_ATTR); + STAT_INC(LOAD_ATTR, hit); + Py_INCREF(res); + SET_TOP(NULL); + STACK_GROW((oparg & 1)); SET_TOP(res); Py_DECREF(owner); JUMPBY(INLINE_CACHE_ENTRIES_LOAD_ATTR); @@ -3595,7 +3575,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int PyDictObject *dict = *(PyDictObject **)_PyObject_ManagedDictPointer(owner); DEOPT_IF(dict == NULL, LOAD_ATTR); assert(PyDict_CheckExact((PyObject *)dict)); - PyObject *name = GETITEM(names, oparg); + PyObject *name = GETITEM(names, 
oparg>>1); uint16_t hint = cache->index; DEOPT_IF(hint >= (size_t)dict->ma_keys->dk_nentries, LOAD_ATTR); if (DK_IS_UNICODE(dict->ma_keys)) { @@ -3611,6 +3591,8 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DEOPT_IF(res == NULL, LOAD_ATTR); STAT_INC(LOAD_ATTR, hit); Py_INCREF(res); + SET_TOP(NULL); + STACK_GROW((oparg & 1)); SET_TOP(res); Py_DECREF(owner); JUMPBY(INLINE_CACHE_ENTRIES_LOAD_ATTR); @@ -3631,16 +3613,78 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DEOPT_IF(res == NULL, LOAD_ATTR); STAT_INC(LOAD_ATTR, hit); Py_INCREF(res); + SET_TOP(NULL); + STACK_GROW((oparg & 1)); SET_TOP(res); Py_DECREF(owner); JUMPBY(INLINE_CACHE_ENTRIES_LOAD_ATTR); NOTRACE_DISPATCH(); } + TARGET(LOAD_ATTR_CLASS) { + assert(cframe.use_tracing == 0); + _PyLoadMethodCache *cache = (_PyLoadMethodCache *)next_instr; + + PyObject *cls = TOP(); + DEOPT_IF(!PyType_Check(cls), LOAD_ATTR); + uint32_t type_version = read_u32(cache->type_version); + DEOPT_IF(((PyTypeObject *)cls)->tp_version_tag != type_version, + LOAD_ATTR); + assert(type_version != 0); + + STAT_INC(LOAD_ATTR, hit); + PyObject *res = read_obj(cache->descr); + assert(res != NULL); + Py_INCREF(res); + SET_TOP(NULL); + STACK_GROW((oparg & 1)); + SET_TOP(res); + Py_DECREF(cls); + JUMPBY(INLINE_CACHE_ENTRIES_LOAD_ATTR); + NOTRACE_DISPATCH(); + } + + TARGET(LOAD_ATTR_PROPERTY) { + assert(cframe.use_tracing == 0); + DEOPT_IF(tstate->interp->eval_frame, LOAD_ATTR); + _PyLoadMethodCache *cache = (_PyLoadMethodCache *)next_instr; + + PyObject *owner = TOP(); + PyTypeObject *cls = Py_TYPE(owner); + uint32_t type_version = read_u32(cache->type_version); + DEOPT_IF(cls->tp_version_tag != type_version, LOAD_ATTR); + assert(type_version != 0); + PyObject *fget = read_obj(cache->descr); + PyFunctionObject *f = (PyFunctionObject *)fget; + uint32_t func_version = read_u32(cache->keys_version); + assert(func_version != 0); + DEOPT_IF(f->func_version != func_version, LOAD_ATTR); + PyCodeObject *code = (PyCodeObject *)f->func_code; + assert(code->co_argcount == 1); + DEOPT_IF(!_PyThreadState_HasStackSpace(tstate, code->co_framesize), LOAD_ATTR); + STAT_INC(LOAD_ATTR, hit); + Py_INCREF(fget); + _PyInterpreterFrame *new_frame = _PyFrame_PushUnchecked(tstate, f); + SET_TOP(NULL); + int push_null = !(oparg & 1); + STACK_SHRINK(push_null); + new_frame->localsplus[0] = owner; + for (int i = 1; i < code->co_nlocalsplus; i++) { + new_frame->localsplus[i] = NULL; + } + _PyFrame_SetStackPointer(frame, stack_pointer); + JUMPBY(INLINE_CACHE_ENTRIES_LOAD_ATTR); + frame->prev_instr = next_instr - 1; + new_frame->previous = frame; + frame = cframe.current_frame = new_frame; + CALL_STAT_INC(inlined_py_calls); + goto start_frame; + } + TARGET(STORE_ATTR_ADAPTIVE) { assert(cframe.use_tracing == 0); _PyAttrCache *cache = (_PyAttrCache *)next_instr; - if (cache->counter == 0) { + if (ADAPTIVE_COUNTER_IS_ZERO(cache)) { PyObject *owner = TOP(); PyObject *name = GETITEM(names, oparg); next_instr--; @@ -3651,7 +3695,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int } else { STAT_INC(STORE_ATTR, deferred); - cache->counter--; + DECREMENT_ADAPTIVE_COUNTER(cache); JUMP_TO_INSTRUCTION(STORE_ATTR); } } @@ -3770,7 +3814,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int TARGET(COMPARE_OP_ADAPTIVE) { assert(cframe.use_tracing == 0); _PyCompareOpCache *cache = (_PyCompareOpCache *)next_instr; - if (cache->counter == 0) { + if (ADAPTIVE_COUNTER_IS_ZERO(cache)) { 
PyObject *right = TOP(); PyObject *left = SECOND(); next_instr--; @@ -3779,7 +3823,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int } else { STAT_INC(COMPARE_OP, deferred); - cache->counter--; + DECREMENT_ADAPTIVE_COUNTER(cache); JUMP_TO_INSTRUCTION(COMPARE_OP); } } @@ -4349,7 +4393,6 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int SET_TOP(iter); if (iter == NULL) goto error; - PREDICT(FOR_ITER); DISPATCH(); } @@ -4386,16 +4429,10 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int PREDICTED(FOR_ITER); /* before: [iter]; after: [iter, iter()] *or* [] */ PyObject *iter = TOP(); -#ifdef Py_STATS - extern int _PySpecialization_ClassifyIterator(PyObject *); - _py_stats.opcode_stats[FOR_ITER].specialization.failure++; - _py_stats.opcode_stats[FOR_ITER].specialization.failure_kinds[_PySpecialization_ClassifyIterator(iter)]++; -#endif PyObject *next = (*Py_TYPE(iter)->tp_iternext)(iter); if (next != NULL) { PUSH(next); - PREDICT(STORE_FAST); - PREDICT(UNPACK_SEQUENCE); + JUMPBY(INLINE_CACHE_ENTRIES_FOR_ITER); DISPATCH(); } if (_PyErr_Occurred(tstate)) { @@ -4407,13 +4444,70 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int } _PyErr_Clear(tstate); } + iterator_exhausted_no_error: /* iterator ended normally */ - STACK_SHRINK(1); - Py_DECREF(iter); - JUMPBY(oparg); + assert(!_PyErr_Occurred(tstate)); + Py_DECREF(POP()); + JUMPBY(INLINE_CACHE_ENTRIES_FOR_ITER + oparg); DISPATCH(); } + TARGET(FOR_ITER_ADAPTIVE) { + assert(cframe.use_tracing == 0); + _PyForIterCache *cache = (_PyForIterCache *)next_instr; + if (ADAPTIVE_COUNTER_IS_ZERO(cache)) { + next_instr--; + _Py_Specialize_ForIter(TOP(), next_instr); + NOTRACE_DISPATCH_SAME_OPARG(); + } + else { + STAT_INC(FOR_ITER, deferred); + DECREMENT_ADAPTIVE_COUNTER(cache); + JUMP_TO_INSTRUCTION(FOR_ITER); + } + } + + TARGET(FOR_ITER_LIST) { + assert(cframe.use_tracing == 0); + _PyListIterObject *it = (_PyListIterObject *)TOP(); + DEOPT_IF(Py_TYPE(it) != &PyListIter_Type, FOR_ITER); + STAT_INC(FOR_ITER, hit); + PyListObject *seq = it->it_seq; + if (seq == NULL) { + goto iterator_exhausted_no_error; + } + if (it->it_index < PyList_GET_SIZE(seq)) { + PyObject *next = PyList_GET_ITEM(seq, it->it_index++); + Py_INCREF(next); + PUSH(next); + JUMPBY(INLINE_CACHE_ENTRIES_FOR_ITER); + NOTRACE_DISPATCH(); + } + it->it_seq = NULL; + Py_DECREF(seq); + goto iterator_exhausted_no_error; + } + + TARGET(FOR_ITER_RANGE) { + assert(cframe.use_tracing == 0); + _PyRangeIterObject *r = (_PyRangeIterObject *)TOP(); + DEOPT_IF(Py_TYPE(r) != &PyRangeIter_Type, FOR_ITER); + STAT_INC(FOR_ITER, hit); + _Py_CODEUNIT next = next_instr[INLINE_CACHE_ENTRIES_FOR_ITER]; + assert(_PyOpcode_Deopt[_Py_OPCODE(next)] == STORE_FAST); + if (r->index >= r->len) { + goto iterator_exhausted_no_error; + } + long value = (long)(r->start + + (unsigned long)(r->index++) * r->step); + if (_PyLong_AssignValue(&GETLOCAL(_Py_OPARG(next)), value) < 0) { + goto error; + } + // The STORE_FAST is already done. + JUMPBY(INLINE_CACHE_ENTRIES_FOR_ITER + 1); + NOTRACE_DISPATCH(); + } + TARGET(BEFORE_ASYNC_WITH) { PyObject *mgr = TOP(); PyObject *res; @@ -4535,228 +4629,107 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DISPATCH(); } - TARGET(LOAD_METHOD) { - PREDICTED(LOAD_METHOD); - /* Designed to work in tandem with PRECALL. 
*/ - PyObject *name = GETITEM(names, oparg); - PyObject *obj = TOP(); - PyObject *meth = NULL; - - int meth_found = _PyObject_GetMethod(obj, name, &meth); - - if (meth == NULL) { - /* Most likely attribute wasn't found. */ - goto error; - } - - if (meth_found) { - /* We can bypass temporary bound method object. - meth is unbound method and obj is self. - - meth | self | arg1 | ... | argN - */ - SET_TOP(meth); - PUSH(obj); // self - } - else { - /* meth is not an unbound method (but a regular attr, or - something was returned by a descriptor protocol). Set - the second element of the stack to NULL, to signal - PRECALL that it's not a method call. - - NULL | meth | arg1 | ... | argN - */ - SET_TOP(NULL); - Py_DECREF(obj); - PUSH(meth); - } - JUMPBY(INLINE_CACHE_ENTRIES_LOAD_METHOD); - DISPATCH(); - } - - TARGET(LOAD_METHOD_ADAPTIVE) { - assert(cframe.use_tracing == 0); - _PyLoadMethodCache *cache = (_PyLoadMethodCache *)next_instr; - if (cache->counter == 0) { - PyObject *owner = TOP(); - PyObject *name = GETITEM(names, oparg); - next_instr--; - if (_Py_Specialize_LoadMethod(owner, next_instr, name) < 0) { - goto error; - } - NOTRACE_DISPATCH_SAME_OPARG(); - } - else { - STAT_INC(LOAD_METHOD, deferred); - cache->counter--; - JUMP_TO_INSTRUCTION(LOAD_METHOD); - } - } - - TARGET(LOAD_METHOD_WITH_VALUES) { - /* LOAD_METHOD, with cached method object */ + TARGET(LOAD_ATTR_METHOD_WITH_VALUES) { + /* Cached method object */ assert(cframe.use_tracing == 0); PyObject *self = TOP(); PyTypeObject *self_cls = Py_TYPE(self); _PyLoadMethodCache *cache = (_PyLoadMethodCache *)next_instr; uint32_t type_version = read_u32(cache->type_version); assert(type_version != 0); - DEOPT_IF(self_cls->tp_version_tag != type_version, LOAD_METHOD); + DEOPT_IF(self_cls->tp_version_tag != type_version, LOAD_ATTR); assert(self_cls->tp_flags & Py_TPFLAGS_MANAGED_DICT); PyDictObject *dict = *(PyDictObject**)_PyObject_ManagedDictPointer(self); - DEOPT_IF(dict != NULL, LOAD_METHOD); + DEOPT_IF(dict != NULL, LOAD_ATTR); PyHeapTypeObject *self_heap_type = (PyHeapTypeObject *)self_cls; DEOPT_IF(self_heap_type->ht_cached_keys->dk_version != - read_u32(cache->keys_version), LOAD_METHOD); - STAT_INC(LOAD_METHOD, hit); + read_u32(cache->keys_version), LOAD_ATTR); + STAT_INC(LOAD_ATTR, hit); PyObject *res = read_obj(cache->descr); assert(res != NULL); assert(_PyType_HasFeature(Py_TYPE(res), Py_TPFLAGS_METHOD_DESCRIPTOR)); Py_INCREF(res); SET_TOP(res); PUSH(self); - JUMPBY(INLINE_CACHE_ENTRIES_LOAD_METHOD); + JUMPBY(INLINE_CACHE_ENTRIES_LOAD_ATTR); NOTRACE_DISPATCH(); } - TARGET(LOAD_METHOD_WITH_DICT) { - /* LOAD_METHOD, with a dict - Can be either a managed dict, or a tp_dictoffset offset.*/ + TARGET(LOAD_ATTR_METHOD_WITH_DICT) { + /* Can be either a managed dict, or a tp_dictoffset offset.*/ assert(cframe.use_tracing == 0); PyObject *self = TOP(); PyTypeObject *self_cls = Py_TYPE(self); _PyLoadMethodCache *cache = (_PyLoadMethodCache *)next_instr; DEOPT_IF(self_cls->tp_version_tag != read_u32(cache->type_version), - LOAD_METHOD); + LOAD_ATTR); /* Treat index as a signed 16 bit value */ - int dictoffset = *(int16_t *)&cache->dict_offset; + Py_ssize_t dictoffset = self_cls->tp_dictoffset; + assert(dictoffset > 0); PyDictObject **dictptr = (PyDictObject**)(((char *)self)+dictoffset); - assert( - dictoffset == MANAGED_DICT_OFFSET || - (dictoffset == self_cls->tp_dictoffset && dictoffset > 0) - ); PyDictObject *dict = *dictptr; - DEOPT_IF(dict == NULL, LOAD_METHOD); + DEOPT_IF(dict == NULL, LOAD_ATTR); DEOPT_IF(dict->ma_keys->dk_version 
!= read_u32(cache->keys_version), - LOAD_METHOD); - STAT_INC(LOAD_METHOD, hit); + LOAD_ATTR); + STAT_INC(LOAD_ATTR, hit); PyObject *res = read_obj(cache->descr); assert(res != NULL); assert(_PyType_HasFeature(Py_TYPE(res), Py_TPFLAGS_METHOD_DESCRIPTOR)); Py_INCREF(res); SET_TOP(res); PUSH(self); - JUMPBY(INLINE_CACHE_ENTRIES_LOAD_METHOD); + JUMPBY(INLINE_CACHE_ENTRIES_LOAD_ATTR); NOTRACE_DISPATCH(); } - TARGET(LOAD_METHOD_NO_DICT) { + TARGET(LOAD_ATTR_METHOD_NO_DICT) { assert(cframe.use_tracing == 0); PyObject *self = TOP(); PyTypeObject *self_cls = Py_TYPE(self); _PyLoadMethodCache *cache = (_PyLoadMethodCache *)next_instr; uint32_t type_version = read_u32(cache->type_version); - DEOPT_IF(self_cls->tp_version_tag != type_version, LOAD_METHOD); + DEOPT_IF(self_cls->tp_version_tag != type_version, LOAD_ATTR); assert(self_cls->tp_dictoffset == 0); - STAT_INC(LOAD_METHOD, hit); + STAT_INC(LOAD_ATTR, hit); PyObject *res = read_obj(cache->descr); assert(res != NULL); assert(_PyType_HasFeature(Py_TYPE(res), Py_TPFLAGS_METHOD_DESCRIPTOR)); Py_INCREF(res); SET_TOP(res); PUSH(self); - JUMPBY(INLINE_CACHE_ENTRIES_LOAD_METHOD); - NOTRACE_DISPATCH(); - } - - TARGET(LOAD_METHOD_MODULE) { - /* LOAD_METHOD, for module methods */ - assert(cframe.use_tracing == 0); - PyObject *owner = TOP(); - PyObject *res; - LOAD_MODULE_ATTR_OR_METHOD(METHOD); - SET_TOP(NULL); - Py_DECREF(owner); - PUSH(res); - JUMPBY(INLINE_CACHE_ENTRIES_LOAD_METHOD); + JUMPBY(INLINE_CACHE_ENTRIES_LOAD_ATTR); NOTRACE_DISPATCH(); } - TARGET(LOAD_METHOD_CLASS) { - /* LOAD_METHOD, for class methods */ + TARGET(LOAD_ATTR_METHOD_LAZY_DICT) { assert(cframe.use_tracing == 0); + PyObject *self = TOP(); + PyTypeObject *self_cls = Py_TYPE(self); _PyLoadMethodCache *cache = (_PyLoadMethodCache *)next_instr; - - PyObject *cls = TOP(); - DEOPT_IF(!PyType_Check(cls), LOAD_METHOD); uint32_t type_version = read_u32(cache->type_version); - DEOPT_IF(((PyTypeObject *)cls)->tp_version_tag != type_version, - LOAD_METHOD); - assert(type_version != 0); - - STAT_INC(LOAD_METHOD, hit); + DEOPT_IF(self_cls->tp_version_tag != type_version, LOAD_ATTR); + Py_ssize_t dictoffset = self_cls->tp_dictoffset; + assert(dictoffset > 0); + PyObject *dict = *(PyObject **)((char *)self + dictoffset); + /* This object has a __dict__, just not yet created */ + DEOPT_IF(dict != NULL, LOAD_ATTR); + STAT_INC(LOAD_ATTR, hit); PyObject *res = read_obj(cache->descr); assert(res != NULL); + assert(_PyType_HasFeature(Py_TYPE(res), Py_TPFLAGS_METHOD_DESCRIPTOR)); Py_INCREF(res); - SET_TOP(NULL); - Py_DECREF(cls); - PUSH(res); - JUMPBY(INLINE_CACHE_ENTRIES_LOAD_METHOD); + SET_TOP(res); + PUSH(self); + JUMPBY(INLINE_CACHE_ENTRIES_LOAD_ATTR); NOTRACE_DISPATCH(); } - TARGET(PRECALL) { - PREDICTED(PRECALL); - /* Designed to work in tamdem with LOAD_METHOD. */ - /* `meth` is NULL when LOAD_METHOD thinks that it's not - a method call. - - Stack layout: - - ... | NULL | callable | arg1 | ... | argN - ^- TOP() - ^- (-oparg) - ^- (-oparg-1) - ^- (-oparg-2) - - `callable` will be POPed by call_function. - NULL will will be POPed manually later. - If `meth` isn't NULL, it's a method call. Stack layout: - - ... | method | self | arg1 | ... | argN - ^- TOP() - ^- (-oparg) - ^- (-oparg-1) - ^- (-oparg-2) - - `self` and `method` will be POPed by call_function. - We'll be passing `oparg + 1` to call_function, to - make it accept the `self` as a first argument. 
- */ - int is_meth = is_method(stack_pointer, oparg); - int nargs = oparg + is_meth; - /* Move ownership of reference from stack to call_shape - * and make sure that NULL is cleared from stack */ - PyObject *function = PEEK(nargs + 1); - if (!is_meth && Py_TYPE(function) == &PyMethod_Type) { - PyObject *meth = ((PyMethodObject *)function)->im_func; - PyObject *self = ((PyMethodObject *)function)->im_self; - Py_INCREF(meth); - Py_INCREF(self); - PEEK(oparg+1) = self; - PEEK(oparg+2) = meth; - Py_DECREF(function); - } - JUMPBY(INLINE_CACHE_ENTRIES_PRECALL); - DISPATCH(); - } - - TARGET(PRECALL_BOUND_METHOD) { - DEOPT_IF(is_method(stack_pointer, oparg), PRECALL); + TARGET(CALL_BOUND_METHOD_EXACT_ARGS) { + DEOPT_IF(is_method(stack_pointer, oparg), CALL); PyObject *function = PEEK(oparg + 1); - DEOPT_IF(Py_TYPE(function) != &PyMethod_Type, PRECALL); - STAT_INC(PRECALL, hit); + DEOPT_IF(Py_TYPE(function) != &PyMethod_Type, CALL); + STAT_INC(CALL, hit); PyObject *meth = ((PyMethodObject *)function)->im_func; PyObject *self = ((PyMethodObject *)function)->im_self; Py_INCREF(meth); @@ -4764,17 +4737,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int PEEK(oparg + 1) = self; PEEK(oparg + 2) = meth; Py_DECREF(function); - JUMPBY(INLINE_CACHE_ENTRIES_PRECALL); - DISPATCH(); - } - - TARGET(PRECALL_PYFUNC) { - int nargs = oparg + is_method(stack_pointer, oparg); - PyObject *function = PEEK(nargs + 1); - DEOPT_IF(Py_TYPE(function) != &PyFunction_Type, PRECALL); - STAT_INC(PRECALL, hit); - JUMPBY(INLINE_CACHE_ENTRIES_PRECALL); - DISPATCH(); + goto call_exact_args; } TARGET(KW_NAMES) { @@ -4785,16 +4748,27 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int } TARGET(CALL) { - int is_meth; + int total_args, is_meth; call_function: is_meth = is_method(stack_pointer, oparg); - int total_args = oparg + is_meth; - PyObject *function = PEEK(total_args + 1); + PyObject *function = PEEK(oparg + 1); + if (!is_meth && Py_TYPE(function) == &PyMethod_Type) { + PyObject *meth = ((PyMethodObject *)function)->im_func; + PyObject *self = ((PyMethodObject *)function)->im_self; + Py_INCREF(meth); + Py_INCREF(self); + PEEK(oparg+1) = self; + PEEK(oparg+2) = meth; + Py_DECREF(function); + is_meth = 1; + } + total_args = oparg + is_meth; + function = PEEK(total_args + 1); int positional_args = total_args - KWNAMES_LEN(); // Check if the call can be inlined or not if (Py_TYPE(function) == &PyFunction_Type && tstate->interp->eval_frame == NULL) { int code_flags = ((PyCodeObject*)PyFunction_GET_CODE(function))->co_flags; - PyObject *locals = code_flags & CO_OPTIMIZED ? NULL : PyFunction_GET_GLOBALS(function); + PyObject *locals = code_flags & CO_OPTIMIZED ? 
NULL : Py_NewRef(PyFunction_GET_GLOBALS(function)); STACK_SHRINK(total_args); _PyInterpreterFrame *new_frame = _PyEvalFramePushAndInit( tstate, (PyFunctionObject *)function, locals, @@ -4846,30 +4820,9 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DISPATCH(); } - TARGET(PRECALL_ADAPTIVE) { - _PyPrecallCache *cache = (_PyPrecallCache *)next_instr; - if (cache->counter == 0) { - next_instr--; - int is_meth = is_method(stack_pointer, oparg); - int nargs = oparg + is_meth; - PyObject *callable = PEEK(nargs + 1); - int err = _Py_Specialize_Precall(callable, next_instr, nargs, - call_shape.kwnames, oparg); - if (err < 0) { - goto error; - } - NOTRACE_DISPATCH_SAME_OPARG(); - } - else { - STAT_INC(PRECALL, deferred); - cache->counter--; - JUMP_TO_INSTRUCTION(PRECALL); - } - } - TARGET(CALL_ADAPTIVE) { _PyCallCache *cache = (_PyCallCache *)next_instr; - if (cache->counter == 0) { + if (ADAPTIVE_COUNTER_IS_ZERO(cache)) { next_instr--; int is_meth = is_method(stack_pointer, oparg); int nargs = oparg + is_meth; @@ -4883,12 +4836,13 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int } else { STAT_INC(CALL, deferred); - cache->counter--; + DECREMENT_ADAPTIVE_COUNTER(cache); goto call_function; } } TARGET(CALL_PY_EXACT_ARGS) { + call_exact_args: assert(call_shape.kwnames == NULL); DEOPT_IF(tstate->interp->eval_frame, CALL); _PyCallCache *cache = (_PyCallCache *)next_instr; @@ -4900,11 +4854,9 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DEOPT_IF(func->func_version != read_u32(cache->func_version), CALL); PyCodeObject *code = (PyCodeObject *)func->func_code; DEOPT_IF(code->co_argcount != argcount, CALL); + DEOPT_IF(!_PyThreadState_HasStackSpace(tstate, code->co_framesize), CALL); STAT_INC(CALL, hit); - _PyInterpreterFrame *new_frame = _PyFrame_Push(tstate, func); - if (new_frame == NULL) { - goto error; - } + _PyInterpreterFrame *new_frame = _PyFrame_PushUnchecked(tstate, func); CALL_STAT_INC(inlined_py_calls); STACK_SHRINK(argcount); for (int i = 0; i < argcount; i++) { @@ -4936,11 +4888,9 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DEOPT_IF(argcount > code->co_argcount, CALL); int minargs = cache->min_args; DEOPT_IF(argcount < minargs, CALL); + DEOPT_IF(!_PyThreadState_HasStackSpace(tstate, code->co_framesize), CALL); STAT_INC(CALL, hit); - _PyInterpreterFrame *new_frame = _PyFrame_Push(tstate, func); - if (new_frame == NULL) { - goto error; - } + _PyInterpreterFrame *new_frame = _PyFrame_PushUnchecked(tstate, func); CALL_STAT_INC(inlined_py_calls); STACK_SHRINK(argcount); for (int i = 0; i < argcount; i++) { @@ -4964,16 +4914,16 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int goto start_frame; } - TARGET(PRECALL_NO_KW_TYPE_1) { + TARGET(CALL_NO_KW_TYPE_1) { assert(call_shape.kwnames == NULL); assert(cframe.use_tracing == 0); assert(oparg == 1); - DEOPT_IF(is_method(stack_pointer, 1), PRECALL); + DEOPT_IF(is_method(stack_pointer, 1), CALL); PyObject *obj = TOP(); PyObject *callable = SECOND(); - DEOPT_IF(callable != (PyObject *)&PyType_Type, PRECALL); - STAT_INC(PRECALL, hit); - SKIP_CALL(); + DEOPT_IF(callable != (PyObject *)&PyType_Type, CALL); + STAT_INC(CALL, hit); + JUMPBY(INLINE_CACHE_ENTRIES_CALL); PyObject *res = Py_NewRef(Py_TYPE(obj)); Py_DECREF(callable); Py_DECREF(obj); @@ -4982,15 +4932,15 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int NOTRACE_DISPATCH(); } - 
TARGET(PRECALL_NO_KW_STR_1) { + TARGET(CALL_NO_KW_STR_1) { assert(call_shape.kwnames == NULL); assert(cframe.use_tracing == 0); assert(oparg == 1); - DEOPT_IF(is_method(stack_pointer, 1), PRECALL); + DEOPT_IF(is_method(stack_pointer, 1), CALL); PyObject *callable = PEEK(2); - DEOPT_IF(callable != (PyObject *)&PyUnicode_Type, PRECALL); - STAT_INC(PRECALL, hit); - SKIP_CALL(); + DEOPT_IF(callable != (PyObject *)&PyUnicode_Type, CALL); + STAT_INC(CALL, hit); + JUMPBY(INLINE_CACHE_ENTRIES_CALL); PyObject *arg = TOP(); PyObject *res = PyObject_Str(arg); Py_DECREF(arg); @@ -5004,14 +4954,14 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DISPATCH(); } - TARGET(PRECALL_NO_KW_TUPLE_1) { + TARGET(CALL_NO_KW_TUPLE_1) { assert(call_shape.kwnames == NULL); assert(oparg == 1); - DEOPT_IF(is_method(stack_pointer, 1), PRECALL); + DEOPT_IF(is_method(stack_pointer, 1), CALL); PyObject *callable = PEEK(2); - DEOPT_IF(callable != (PyObject *)&PyTuple_Type, PRECALL); - STAT_INC(PRECALL, hit); - SKIP_CALL(); + DEOPT_IF(callable != (PyObject *)&PyTuple_Type, CALL); + STAT_INC(CALL, hit); + JUMPBY(INLINE_CACHE_ENTRIES_CALL); PyObject *arg = TOP(); PyObject *res = PySequence_Tuple(arg); Py_DECREF(arg); @@ -5025,16 +4975,16 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DISPATCH(); } - TARGET(PRECALL_BUILTIN_CLASS) { + TARGET(CALL_BUILTIN_CLASS) { int is_meth = is_method(stack_pointer, oparg); int total_args = oparg + is_meth; int kwnames_len = KWNAMES_LEN(); PyObject *callable = PEEK(total_args + 1); - DEOPT_IF(!PyType_Check(callable), PRECALL); + DEOPT_IF(!PyType_Check(callable), CALL); PyTypeObject *tp = (PyTypeObject *)callable; - DEOPT_IF(tp->tp_vectorcall == NULL, PRECALL); - STAT_INC(PRECALL, hit); - SKIP_CALL(); + DEOPT_IF(tp->tp_vectorcall == NULL, CALL); + STAT_INC(CALL, hit); + JUMPBY(INLINE_CACHE_ENTRIES_CALL); STACK_SHRINK(total_args); PyObject *res = tp->tp_vectorcall((PyObject *)tp, stack_pointer, total_args-kwnames_len, call_shape.kwnames); @@ -5053,18 +5003,18 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DISPATCH(); } - TARGET(PRECALL_NO_KW_BUILTIN_O) { + TARGET(CALL_NO_KW_BUILTIN_O) { assert(cframe.use_tracing == 0); /* Builtin METH_O functions */ assert(call_shape.kwnames == NULL); int is_meth = is_method(stack_pointer, oparg); int total_args = oparg + is_meth; - DEOPT_IF(total_args != 1, PRECALL); + DEOPT_IF(total_args != 1, CALL); PyObject *callable = PEEK(total_args + 1); - DEOPT_IF(!PyCFunction_CheckExact(callable), PRECALL); - DEOPT_IF(PyCFunction_GET_FLAGS(callable) != METH_O, PRECALL); - STAT_INC(PRECALL, hit); - SKIP_CALL(); + DEOPT_IF(!PyCFunction_CheckExact(callable), CALL); + DEOPT_IF(PyCFunction_GET_FLAGS(callable) != METH_O, CALL); + STAT_INC(CALL, hit); + JUMPBY(INLINE_CACHE_ENTRIES_CALL); PyCFunction cfunc = PyCFunction_GET_FUNCTION(callable); // This is slower but CPython promises to check all non-vectorcall // function calls. 
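/* Illustrative sketch of the callable shape CALL_NO_KW_BUILTIN_O (above)
   specializes for: a builtin declared with METH_O, i.e. exactly one positional
   argument and no keywords. The function and table names here are invented for
   the example; only the METH_O convention itself comes from the diff. */
#include <Python.h>

static PyObject *
example_identity(PyObject *Py_UNUSED(module), PyObject *arg)  /* METH_O: one arg */
{
    return Py_NewRef(arg);            /* return a new reference to the argument */
}

static PyMethodDef example_methods[] = {
    {"identity", example_identity, METH_O, "Return the argument unchanged."},
    {NULL, NULL, 0, NULL}             /* sentinel */
};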
@@ -5087,18 +5037,18 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DISPATCH(); } - TARGET(PRECALL_NO_KW_BUILTIN_FAST) { + TARGET(CALL_NO_KW_BUILTIN_FAST) { assert(cframe.use_tracing == 0); /* Builtin METH_FASTCALL functions, without keywords */ assert(call_shape.kwnames == NULL); int is_meth = is_method(stack_pointer, oparg); int total_args = oparg + is_meth; PyObject *callable = PEEK(total_args + 1); - DEOPT_IF(!PyCFunction_CheckExact(callable), PRECALL); + DEOPT_IF(!PyCFunction_CheckExact(callable), CALL); DEOPT_IF(PyCFunction_GET_FLAGS(callable) != METH_FASTCALL, - PRECALL); - STAT_INC(PRECALL, hit); - SKIP_CALL(); + CALL); + STAT_INC(CALL, hit); + JUMPBY(INLINE_CACHE_ENTRIES_CALL); PyCFunction cfunc = PyCFunction_GET_FUNCTION(callable); STACK_SHRINK(total_args); /* res = func(self, args, nargs) */ @@ -5127,17 +5077,17 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DISPATCH(); } - TARGET(PRECALL_BUILTIN_FAST_WITH_KEYWORDS) { + TARGET(CALL_BUILTIN_FAST_WITH_KEYWORDS) { assert(cframe.use_tracing == 0); /* Builtin METH_FASTCALL | METH_KEYWORDS functions */ int is_meth = is_method(stack_pointer, oparg); int total_args = oparg + is_meth; PyObject *callable = PEEK(total_args + 1); - DEOPT_IF(!PyCFunction_CheckExact(callable), PRECALL); + DEOPT_IF(!PyCFunction_CheckExact(callable), CALL); DEOPT_IF(PyCFunction_GET_FLAGS(callable) != - (METH_FASTCALL | METH_KEYWORDS), PRECALL); - STAT_INC(PRECALL, hit); - SKIP_CALL(); + (METH_FASTCALL | METH_KEYWORDS), CALL); + STAT_INC(CALL, hit); + JUMPBY(INLINE_CACHE_ENTRIES_CALL); STACK_SHRINK(total_args); /* res = func(self, args, nargs, kwnames) */ _PyCFunctionFastWithKeywords cfunc = @@ -5166,18 +5116,18 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DISPATCH(); } - TARGET(PRECALL_NO_KW_LEN) { + TARGET(CALL_NO_KW_LEN) { assert(cframe.use_tracing == 0); assert(call_shape.kwnames == NULL); /* len(o) */ int is_meth = is_method(stack_pointer, oparg); int total_args = oparg + is_meth; - DEOPT_IF(total_args != 1, PRECALL); + DEOPT_IF(total_args != 1, CALL); PyObject *callable = PEEK(total_args + 1); PyInterpreterState *interp = _PyInterpreterState_GET(); - DEOPT_IF(callable != interp->callable_cache.len, PRECALL); - STAT_INC(PRECALL, hit); - SKIP_CALL(); + DEOPT_IF(callable != interp->callable_cache.len, CALL); + STAT_INC(CALL, hit); + JUMPBY(INLINE_CACHE_ENTRIES_CALL); PyObject *arg = TOP(); Py_ssize_t len_i = PyObject_Length(arg); if (len_i < 0) { @@ -5196,18 +5146,18 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DISPATCH(); } - TARGET(PRECALL_NO_KW_ISINSTANCE) { + TARGET(CALL_NO_KW_ISINSTANCE) { assert(cframe.use_tracing == 0); assert(call_shape.kwnames == NULL); /* isinstance(o, o2) */ int is_meth = is_method(stack_pointer, oparg); int total_args = oparg + is_meth; PyObject *callable = PEEK(total_args + 1); - DEOPT_IF(total_args != 2, PRECALL); + DEOPT_IF(total_args != 2, CALL); PyInterpreterState *interp = _PyInterpreterState_GET(); - DEOPT_IF(callable != interp->callable_cache.isinstance, PRECALL); - STAT_INC(PRECALL, hit); - SKIP_CALL(); + DEOPT_IF(callable != interp->callable_cache.isinstance, CALL); + STAT_INC(CALL, hit); + JUMPBY(INLINE_CACHE_ENTRIES_CALL); PyObject *cls = POP(); PyObject *inst = TOP(); int retval = PyObject_IsInstance(inst, cls); @@ -5229,18 +5179,18 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DISPATCH(); } - TARGET(PRECALL_NO_KW_LIST_APPEND) { + 
TARGET(CALL_NO_KW_LIST_APPEND) { assert(cframe.use_tracing == 0); assert(call_shape.kwnames == NULL); assert(oparg == 1); PyObject *callable = PEEK(3); PyInterpreterState *interp = _PyInterpreterState_GET(); - DEOPT_IF(callable != interp->callable_cache.list_append, PRECALL); + DEOPT_IF(callable != interp->callable_cache.list_append, CALL); PyObject *list = SECOND(); - DEOPT_IF(!PyList_Check(list), PRECALL); - STAT_INC(PRECALL, hit); - // PRECALL + CALL + POP_TOP - JUMPBY(INLINE_CACHE_ENTRIES_PRECALL + 1 + INLINE_CACHE_ENTRIES_CALL + 1); + DEOPT_IF(!PyList_Check(list), CALL); + STAT_INC(CALL, hit); + // CALL + POP_TOP + JUMPBY(INLINE_CACHE_ENTRIES_CALL + 1); assert(_Py_OPCODE(next_instr[-1]) == POP_TOP); PyObject *arg = POP(); if (_PyList_AppendTakeRef((PyListObject *)list, arg) < 0) { @@ -5252,21 +5202,21 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int NOTRACE_DISPATCH(); } - TARGET(PRECALL_NO_KW_METHOD_DESCRIPTOR_O) { + TARGET(CALL_NO_KW_METHOD_DESCRIPTOR_O) { assert(call_shape.kwnames == NULL); int is_meth = is_method(stack_pointer, oparg); int total_args = oparg + is_meth; PyMethodDescrObject *callable = (PyMethodDescrObject *)PEEK(total_args + 1); - DEOPT_IF(total_args != 2, PRECALL); - DEOPT_IF(!Py_IS_TYPE(callable, &PyMethodDescr_Type), PRECALL); + DEOPT_IF(total_args != 2, CALL); + DEOPT_IF(!Py_IS_TYPE(callable, &PyMethodDescr_Type), CALL); PyMethodDef *meth = callable->d_method; - DEOPT_IF(meth->ml_flags != METH_O, PRECALL); + DEOPT_IF(meth->ml_flags != METH_O, CALL); PyObject *arg = TOP(); PyObject *self = SECOND(); - DEOPT_IF(!Py_IS_TYPE(self, callable->d_common.d_type), PRECALL); - STAT_INC(PRECALL, hit); - SKIP_CALL(); + DEOPT_IF(!Py_IS_TYPE(self, callable->d_common.d_type), CALL); + STAT_INC(CALL, hit); + JUMPBY(INLINE_CACHE_ENTRIES_CALL); PyCFunction cfunc = meth->ml_meth; // This is slower but CPython promises to check all non-vectorcall // function calls. 
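/* Illustrative sketch of what the CALL_NO_KW_LIST_APPEND fast path above amounts
   to at the public-API level: append the popped argument and discard the call's
   None result. _PyList_AppendTakeRef steals the item reference, so the explicit
   Py_DECREF below is the public-API equivalent of that hand-off. The helper name
   is invented for the example. */
#include <Python.h>

static int
append_and_discard(PyObject *list, PyObject *item)   /* consumes `item` */
{
    int result = PyList_Append(list, item);   /* PyList_Append does not steal... */
    Py_DECREF(item);                          /* ...so drop our reference here */
    return result;                            /* 0 on success, -1 on error */
}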
@@ -5288,19 +5238,19 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DISPATCH(); } - TARGET(PRECALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS) { + TARGET(CALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS) { int is_meth = is_method(stack_pointer, oparg); int total_args = oparg + is_meth; PyMethodDescrObject *callable = (PyMethodDescrObject *)PEEK(total_args + 1); - DEOPT_IF(!Py_IS_TYPE(callable, &PyMethodDescr_Type), PRECALL); + DEOPT_IF(!Py_IS_TYPE(callable, &PyMethodDescr_Type), CALL); PyMethodDef *meth = callable->d_method; - DEOPT_IF(meth->ml_flags != (METH_FASTCALL|METH_KEYWORDS), PRECALL); + DEOPT_IF(meth->ml_flags != (METH_FASTCALL|METH_KEYWORDS), CALL); PyTypeObject *d_type = callable->d_common.d_type; PyObject *self = PEEK(total_args); - DEOPT_IF(!Py_IS_TYPE(self, d_type), PRECALL); - STAT_INC(PRECALL, hit); - SKIP_CALL(); + DEOPT_IF(!Py_IS_TYPE(self, d_type), CALL); + STAT_INC(CALL, hit); + JUMPBY(INLINE_CACHE_ENTRIES_CALL); int nargs = total_args-1; STACK_SHRINK(nargs); _PyCFunctionFastWithKeywords cfunc = @@ -5325,20 +5275,20 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DISPATCH(); } - TARGET(PRECALL_NO_KW_METHOD_DESCRIPTOR_NOARGS) { + TARGET(CALL_NO_KW_METHOD_DESCRIPTOR_NOARGS) { assert(call_shape.kwnames == NULL); assert(oparg == 0 || oparg == 1); int is_meth = is_method(stack_pointer, oparg); int total_args = oparg + is_meth; - DEOPT_IF(total_args != 1, PRECALL); + DEOPT_IF(total_args != 1, CALL); PyMethodDescrObject *callable = (PyMethodDescrObject *)SECOND(); - DEOPT_IF(!Py_IS_TYPE(callable, &PyMethodDescr_Type), PRECALL); + DEOPT_IF(!Py_IS_TYPE(callable, &PyMethodDescr_Type), CALL); PyMethodDef *meth = callable->d_method; PyObject *self = TOP(); - DEOPT_IF(!Py_IS_TYPE(self, callable->d_common.d_type), PRECALL); - DEOPT_IF(meth->ml_flags != METH_NOARGS, PRECALL); - STAT_INC(PRECALL, hit); - SKIP_CALL(); + DEOPT_IF(!Py_IS_TYPE(self, callable->d_common.d_type), CALL); + DEOPT_IF(meth->ml_flags != METH_NOARGS, CALL); + STAT_INC(CALL, hit); + JUMPBY(INLINE_CACHE_ENTRIES_CALL); PyCFunction cfunc = meth->ml_meth; // This is slower but CPython promises to check all non-vectorcall // function calls. 
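/* Illustrative sketch of the calling convention checked above for
   CALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS: a METH_FASTCALL | METH_KEYWORDS
   function receives a contiguous argument vector plus a tuple of keyword names,
   which is what the interpreter hands over after STACK_SHRINK in that hunk.
   The function name, body, and table are invented for the example. */
#include <Python.h>

static PyObject *
example_fastcall_kw(PyObject *self, PyObject *const *args,
                    Py_ssize_t nargs, PyObject *kwnames)
{
    /* args[0..nargs-1] are positional; one value per name in kwnames (which may
       be NULL) follows them in the same vector. */
    Py_ssize_t nkw = (kwnames == NULL) ? 0 : PyTuple_GET_SIZE(kwnames);
    (void)self;
    (void)args;
    return PyLong_FromSsize_t(nargs + nkw);   /* total number of arguments */
}

static PyMethodDef example_kw_methods[] = {
    {"count_args", (PyCFunction)(void (*)(void))example_fastcall_kw,
     METH_FASTCALL | METH_KEYWORDS, "Return the total number of arguments passed."},
    {NULL, NULL, 0, NULL}                     /* sentinel */
};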
@@ -5359,20 +5309,20 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int DISPATCH(); } - TARGET(PRECALL_NO_KW_METHOD_DESCRIPTOR_FAST) { + TARGET(CALL_NO_KW_METHOD_DESCRIPTOR_FAST) { assert(call_shape.kwnames == NULL); int is_meth = is_method(stack_pointer, oparg); int total_args = oparg + is_meth; PyMethodDescrObject *callable = (PyMethodDescrObject *)PEEK(total_args + 1); /* Builtin METH_FASTCALL methods, without keywords */ - DEOPT_IF(!Py_IS_TYPE(callable, &PyMethodDescr_Type), PRECALL); + DEOPT_IF(!Py_IS_TYPE(callable, &PyMethodDescr_Type), CALL); PyMethodDef *meth = callable->d_method; - DEOPT_IF(meth->ml_flags != METH_FASTCALL, PRECALL); + DEOPT_IF(meth->ml_flags != METH_FASTCALL, CALL); PyObject *self = PEEK(total_args); - DEOPT_IF(!Py_IS_TYPE(self, callable->d_common.d_type), PRECALL); - STAT_INC(PRECALL, hit); - SKIP_CALL(); + DEOPT_IF(!Py_IS_TYPE(self, callable->d_common.d_type), CALL); + STAT_INC(CALL, hit); + JUMPBY(INLINE_CACHE_ENTRIES_CALL); _PyCFunctionFast cfunc = (_PyCFunctionFast)(void(*)(void))meth->ml_meth; int nargs = total_args-1; @@ -5614,7 +5564,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int TARGET(BINARY_OP_ADAPTIVE) { assert(cframe.use_tracing == 0); _PyBinaryOpCache *cache = (_PyBinaryOpCache *)next_instr; - if (cache->counter == 0) { + if (ADAPTIVE_COUNTER_IS_ZERO(cache)) { PyObject *lhs = SECOND(); PyObject *rhs = TOP(); next_instr--; @@ -5623,7 +5573,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int } else { STAT_INC(BINARY_OP, deferred); - cache->counter--; + DECREMENT_ADAPTIVE_COUNTER(cache); JUMP_TO_INSTRUCTION(BINARY_OP); } } @@ -5662,57 +5612,47 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int case DO_TRACING: #endif { - if (tstate->tracing == 0) { + if (tstate->tracing == 0 && + INSTR_OFFSET() >= frame->f_code->_co_firsttraceable + ) { int instr_prev = _PyInterpreterFrame_LASTI(frame); frame->prev_instr = next_instr; TRACING_NEXTOPARG(); - switch(opcode) { - case COPY_FREE_VARS: - case MAKE_CELL: - case RETURN_GENERATOR: - /* Frame not fully initialized */ - break; - case RESUME: - if (oparg < 2) { - CHECK_EVAL_BREAKER(); - } - /* Call tracing */ - TRACE_FUNCTION_ENTRY(); - DTRACE_FUNCTION_ENTRY(); - break; - case POP_TOP: - if (_Py_OPCODE(next_instr[-1]) == RETURN_GENERATOR) { - /* Frame not fully initialized */ - break; - } - /* fall through */ - default: - /* line-by-line tracing support */ - if (PyDTrace_LINE_ENABLED()) { - maybe_dtrace_line(frame, &tstate->trace_info, instr_prev); - } - - if (cframe.use_tracing && - tstate->c_tracefunc != NULL && !tstate->tracing) { - int err; - /* see maybe_call_line_trace() - for expository comments */ - _PyFrame_SetStackPointer(frame, stack_pointer); - - err = maybe_call_line_trace(tstate->c_tracefunc, - tstate->c_traceobj, - tstate, frame, instr_prev); - if (err) { - /* trace function raised an exception */ - next_instr++; - goto error; - } - /* Reload possibly changed frame fields */ - next_instr = frame->prev_instr; + if (opcode == RESUME) { + if (oparg < 2) { + CHECK_EVAL_BREAKER(); + } + /* Call tracing */ + TRACE_FUNCTION_ENTRY(); + DTRACE_FUNCTION_ENTRY(); + } + else { + /* line-by-line tracing support */ + if (PyDTrace_LINE_ENABLED()) { + maybe_dtrace_line(frame, &tstate->trace_info, instr_prev); + } - stack_pointer = _PyFrame_GetStackPointer(frame); - frame->stacktop = -1; + if (cframe.use_tracing && + tstate->c_tracefunc != NULL && !tstate->tracing) { + int err; + /* 
see maybe_call_line_trace() + for expository comments */ + _PyFrame_SetStackPointer(frame, stack_pointer); + + err = maybe_call_line_trace(tstate->c_tracefunc, + tstate->c_traceobj, + tstate, frame, instr_prev); + if (err) { + /* trace function raised an exception */ + next_instr++; + goto error; } + /* Reload possibly changed frame fields */ + next_instr = frame->prev_instr; + + stack_pointer = _PyFrame_GetStackPointer(frame); + frame->stacktop = -1; + } } } TRACING_NEXTOPARG(); @@ -5751,7 +5691,7 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int assert(adaptive_opcode); _Py_SET_OPCODE(next_instr[-1], adaptive_opcode); STAT_INC(opcode, deopt); - *counter = ADAPTIVE_CACHE_BACKOFF; + *counter = adaptive_counter_start(); } next_instr--; DISPATCH_GOTO(); @@ -6398,20 +6338,19 @@ initialize_locals(PyThreadState *tstate, PyFunctionObject *func, return -1; } -/* Consumes references to func and all the args */ +/* Consumes references to func, locals and all the args */ static _PyInterpreterFrame * _PyEvalFramePushAndInit(PyThreadState *tstate, PyFunctionObject *func, PyObject *locals, PyObject* const* args, size_t argcount, PyObject *kwnames) { PyCodeObject * code = (PyCodeObject *)func->func_code; - size_t size = code->co_nlocalsplus + code->co_stacksize + FRAME_SPECIALS_SIZE; CALL_STAT_INC(frames_pushed); - _PyInterpreterFrame *frame = _PyThreadState_BumpFramePointer(tstate, size); + _PyInterpreterFrame *frame = _PyThreadState_PushFrame(tstate, code->co_framesize); if (frame == NULL) { goto fail; } - _PyFrame_InitializeSpecials(frame, func, locals, code->co_nlocalsplus); + _PyFrame_InitializeSpecials(frame, func, locals, code); PyObject **localsarray = &frame->localsplus[0]; for (int i = 0; i < code->co_nlocalsplus; i++) { localsarray[i] = NULL; @@ -6455,8 +6394,9 @@ _PyEval_Vector(PyThreadState *tstate, PyFunctionObject *func, PyObject *kwnames) { /* _PyEvalFramePushAndInit consumes the references - * to func and all its arguments */ + * to func, locals and all its arguments */ Py_INCREF(func); + Py_XINCREF(locals); for (size_t i = 0; i < argcount; i++) { Py_INCREF(args[i]); } @@ -6471,6 +6411,7 @@ _PyEval_Vector(PyThreadState *tstate, PyFunctionObject *func, if (frame == NULL) { return NULL; } + EVAL_CALL_STAT_INC(EVAL_CALL_VECTOR); PyObject *retval = _PyEval_EvalFrame(tstate, frame, 0); assert( _PyFrame_GetStackPointer(frame) == _PyFrame_Stackbase(frame) || @@ -6546,6 +6487,7 @@ PyEval_EvalCodeEx(PyObject *_co, PyObject *globals, PyObject *locals, if (func == NULL) { goto fail; } + EVAL_CALL_STAT_INC(EVAL_CALL_LEGACY); res = _PyEval_Vector(tstate, func, locals, allargs, argcount, kwnames); @@ -6907,9 +6849,10 @@ call_trace(Py_tracefunc func, PyObject *obj, tstate->tracing_what = what; PyThreadState_EnterTracing(tstate); assert(_PyInterpreterFrame_LASTI(frame) >= 0); - initialize_trace_info(&tstate->trace_info, frame); - int addr = _PyInterpreterFrame_LASTI(frame) * sizeof(_Py_CODEUNIT); - f->f_lineno = _PyCode_CheckLineNumber(addr, &tstate->trace_info.bounds); + if (_PyCode_InitLineArray(frame->f_code)) { + return -1; + } + f->f_lineno = _PyCode_LineNumberFromArray(frame->f_code, _PyInterpreterFrame_LASTI(frame)); result = func(obj, f, what, arg); f->f_lineno = 0; PyThreadState_LeaveTracing(tstate); @@ -6946,21 +6889,17 @@ maybe_call_line_trace(Py_tracefunc func, PyObject *obj, represents a jump backwards, update the frame's line number and then call the trace function if we're tracing source lines. 
*/ - initialize_trace_info(&tstate->trace_info, frame); - int entry_point = 0; - _Py_CODEUNIT *code = _PyCode_CODE(frame->f_code); - while (_PyOpcode_Deopt[_Py_OPCODE(code[entry_point])] != RESUME) { - entry_point++; + if (_PyCode_InitLineArray(frame->f_code)) { + return -1; } int lastline; - if (instr_prev <= entry_point) { + if (instr_prev <= frame->f_code->_co_firsttraceable) { lastline = -1; } else { - lastline = _PyCode_CheckLineNumber(instr_prev*sizeof(_Py_CODEUNIT), &tstate->trace_info.bounds); + lastline = _PyCode_LineNumberFromArray(frame->f_code, instr_prev); } - int addr = _PyInterpreterFrame_LASTI(frame) * sizeof(_Py_CODEUNIT); - int line = _PyCode_CheckLineNumber(addr, &tstate->trace_info.bounds); + int line = _PyCode_LineNumberFromArray(frame->f_code, _PyInterpreterFrame_LASTI(frame)); PyFrameObject *f = _PyFrame_GetFrameObject(frame); if (f == NULL) { return -1; @@ -7338,7 +7277,6 @@ do_call_core(PyThreadState *tstate, ) { PyObject *result; - if (PyCFunction_CheckExact(func) || PyCMethod_CheckExact(func)) { C_TRACE(result, PyObject_Call(func, callargs, kwdict)); return result; @@ -7368,6 +7306,7 @@ do_call_core(PyThreadState *tstate, return result; } } + EVAL_CALL_STAT_INC_IF_FUNCTION(EVAL_CALL_FUNCTION_EX, func); return PyObject_Call(func, callargs, kwdict); } @@ -7851,7 +7790,7 @@ _Py_GetDXProfile(PyObject *self, PyObject *args) PyObject *l = PyList_New(257); if (l == NULL) return NULL; for (i = 0; i < 256; i++) { - PyObject *x = getarray(_py_stats.opcode_stats[i].pair_count); + PyObject *x = getarray(_py_stats_struct.opcode_stats[i].pair_count); if (x == NULL) { Py_DECREF(l); return NULL; @@ -7865,7 +7804,7 @@ _Py_GetDXProfile(PyObject *self, PyObject *args) } for (i = 0; i < 256; i++) { PyObject *x = PyLong_FromUnsignedLongLong( - _py_stats.opcode_stats[i].execution_count); + _py_stats_struct.opcode_stats[i].execution_count); if (x == NULL) { Py_DECREF(counts); Py_DECREF(l); diff --git a/Python/ceval_gil.h b/Python/ceval_gil.h index 9b8b43253f04d2..1b2dc7f8e1dc31 100644 --- a/Python/ceval_gil.h +++ b/Python/ceval_gil.h @@ -144,11 +144,7 @@ static void drop_gil(struct _ceval_runtime_state *ceval, struct _ceval_state *ceval2, PyThreadState *tstate) { -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - struct _gil_runtime_state *gil = &ceval2->gil; -#else struct _gil_runtime_state *gil = &ceval->gil; -#endif if (!_Py_atomic_load_relaxed(&gil->locked)) { Py_FatalError("drop_gil: GIL is not locked"); } @@ -232,11 +228,7 @@ take_gil(PyThreadState *tstate) PyInterpreterState *interp = tstate->interp; struct _ceval_runtime_state *ceval = &interp->runtime->ceval; struct _ceval_state *ceval2 = &interp->ceval; -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - struct _gil_runtime_state *gil = &ceval2->gil; -#else struct _gil_runtime_state *gil = &ceval->gil; -#endif /* Check that _PyEval_InitThreads() was called to create the lock */ assert(gil_created(gil)); @@ -328,22 +320,12 @@ take_gil(PyThreadState *tstate) void _PyEval_SetSwitchInterval(unsigned long microseconds) { -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - PyInterpreterState *interp = PyInterpreterState_Get(); - struct _gil_runtime_state *gil = &interp->ceval.gil; -#else struct _gil_runtime_state *gil = &_PyRuntime.ceval.gil; -#endif gil->interval = microseconds; } unsigned long _PyEval_GetSwitchInterval() { -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - PyInterpreterState *interp = PyInterpreterState_Get(); - struct _gil_runtime_state *gil = &interp->ceval.gil; -#else struct _gil_runtime_state *gil = &_PyRuntime.ceval.gil; 
-#endif return gil->interval; } diff --git a/Python/clinic/sysmodule.c.h b/Python/clinic/sysmodule.c.h index 6ee3bb2a849aab..76b4cc5578263c 100644 --- a/Python/clinic/sysmodule.c.h +++ b/Python/clinic/sysmodule.c.h @@ -965,6 +965,94 @@ sys_is_finalizing(PyObject *module, PyObject *Py_UNUSED(ignored)) return sys_is_finalizing_impl(module); } +#if defined(Py_STATS) + +PyDoc_STRVAR(sys__stats_on__doc__, +"_stats_on($module, /)\n" +"--\n" +"\n" +"Turns on stats gathering (stats gathering is on by default)."); + +#define SYS__STATS_ON_METHODDEF \ + {"_stats_on", (PyCFunction)sys__stats_on, METH_NOARGS, sys__stats_on__doc__}, + +static PyObject * +sys__stats_on_impl(PyObject *module); + +static PyObject * +sys__stats_on(PyObject *module, PyObject *Py_UNUSED(ignored)) +{ + return sys__stats_on_impl(module); +} + +#endif /* defined(Py_STATS) */ + +#if defined(Py_STATS) + +PyDoc_STRVAR(sys__stats_off__doc__, +"_stats_off($module, /)\n" +"--\n" +"\n" +"Turns off stats gathering (stats gathering is on by default)."); + +#define SYS__STATS_OFF_METHODDEF \ + {"_stats_off", (PyCFunction)sys__stats_off, METH_NOARGS, sys__stats_off__doc__}, + +static PyObject * +sys__stats_off_impl(PyObject *module); + +static PyObject * +sys__stats_off(PyObject *module, PyObject *Py_UNUSED(ignored)) +{ + return sys__stats_off_impl(module); +} + +#endif /* defined(Py_STATS) */ + +#if defined(Py_STATS) + +PyDoc_STRVAR(sys__stats_clear__doc__, +"_stats_clear($module, /)\n" +"--\n" +"\n" +"Clears the stats."); + +#define SYS__STATS_CLEAR_METHODDEF \ + {"_stats_clear", (PyCFunction)sys__stats_clear, METH_NOARGS, sys__stats_clear__doc__}, + +static PyObject * +sys__stats_clear_impl(PyObject *module); + +static PyObject * +sys__stats_clear(PyObject *module, PyObject *Py_UNUSED(ignored)) +{ + return sys__stats_clear_impl(module); +} + +#endif /* defined(Py_STATS) */ + +#if defined(Py_STATS) + +PyDoc_STRVAR(sys__stats_dump__doc__, +"_stats_dump($module, /)\n" +"--\n" +"\n" +"Dump stats to file, and clears the stats."); + +#define SYS__STATS_DUMP_METHODDEF \ + {"_stats_dump", (PyCFunction)sys__stats_dump, METH_NOARGS, sys__stats_dump__doc__}, + +static PyObject * +sys__stats_dump_impl(PyObject *module); + +static PyObject * +sys__stats_dump(PyObject *module, PyObject *Py_UNUSED(ignored)) +{ + return sys__stats_dump_impl(module); +} + +#endif /* defined(Py_STATS) */ + #if defined(ANDROID_API_LEVEL) PyDoc_STRVAR(sys_getandroidapilevel__doc__, @@ -1011,7 +1099,23 @@ sys_getandroidapilevel(PyObject *module, PyObject *Py_UNUSED(ignored)) #define SYS_GETTOTALREFCOUNT_METHODDEF #endif /* !defined(SYS_GETTOTALREFCOUNT_METHODDEF) */ +#ifndef SYS__STATS_ON_METHODDEF + #define SYS__STATS_ON_METHODDEF +#endif /* !defined(SYS__STATS_ON_METHODDEF) */ + +#ifndef SYS__STATS_OFF_METHODDEF + #define SYS__STATS_OFF_METHODDEF +#endif /* !defined(SYS__STATS_OFF_METHODDEF) */ + +#ifndef SYS__STATS_CLEAR_METHODDEF + #define SYS__STATS_CLEAR_METHODDEF +#endif /* !defined(SYS__STATS_CLEAR_METHODDEF) */ + +#ifndef SYS__STATS_DUMP_METHODDEF + #define SYS__STATS_DUMP_METHODDEF +#endif /* !defined(SYS__STATS_DUMP_METHODDEF) */ + #ifndef SYS_GETANDROIDAPILEVEL_METHODDEF #define SYS_GETANDROIDAPILEVEL_METHODDEF #endif /* !defined(SYS_GETANDROIDAPILEVEL_METHODDEF) */ -/*[clinic end generated code: output=98efd34fd9b9b6ab input=a9049054013a1b77]*/ +/*[clinic end generated code: output=41122dae1bb7158c input=a9049054013a1b77]*/ diff --git a/Python/compile.c b/Python/compile.c index 51ef8fd17a1067..3946aac8c9943c 100644 --- a/Python/compile.c +++ b/Python/compile.c 
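/* Illustrative sketch of how the SYS__STATS_*_METHODDEF entries generated above
   are normally consumed: listed in the module's method table, they expand to a
   PyMethodDef initializer (with trailing comma) when Py_STATS is defined and to
   nothing otherwise. The table name below is invented; sysmodule.c's real table
   is not shown in this patch. */
static PyMethodDef example_stats_methods[] = {
    SYS__STATS_ON_METHODDEF
    SYS__STATS_OFF_METHODDEF
    SYS__STATS_CLEAR_METHODDEF
    SYS__STATS_DUMP_METHODDEF
    {NULL, NULL, 0, NULL}      /* sentinel */
};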
@@ -84,8 +84,9 @@ #define POP_JUMP_IF_TRUE -8 #define POP_JUMP_IF_NONE -9 #define POP_JUMP_IF_NOT_NONE -10 +#define LOAD_METHOD -11 -#define MIN_VIRTUAL_OPCODE -10 +#define MIN_VIRTUAL_OPCODE -11 #define MAX_ALLOWED_OPCODE 254 #define IS_WITHIN_OPCODE_RANGE(opcode) \ @@ -101,6 +102,15 @@ (opcode) == POP_JUMP_IF_FALSE || \ (opcode) == POP_JUMP_IF_TRUE) +#define IS_JUMP_OPCODE(opcode) \ + (IS_VIRTUAL_JUMP_OPCODE(opcode) || \ + is_bit_set_in_table(_PyOpcode_Jump, opcode)) + +#define IS_BLOCK_PUSH_OPCODE(opcode) \ + ((opcode) == SETUP_FINALLY || \ + (opcode) == SETUP_WITH || \ + (opcode) == SETUP_CLEANUP) + /* opcodes which are not emitted in codegen stage, only by the assembler */ #define IS_ASSEMBLER_OPCODE(opcode) \ ((opcode) == JUMP_FORWARD || \ @@ -124,11 +134,34 @@ (opcode) == POP_JUMP_BACKWARD_IF_TRUE || \ (opcode) == POP_JUMP_BACKWARD_IF_FALSE) +#define IS_UNCONDITIONAL_JUMP_OPCODE(opcode) \ + ((opcode) == JUMP || \ + (opcode) == JUMP_NO_INTERRUPT || \ + (opcode) == JUMP_FORWARD || \ + (opcode) == JUMP_BACKWARD || \ + (opcode) == JUMP_BACKWARD_NO_INTERRUPT) + +#define IS_SCOPE_EXIT_OPCODE(opcode) \ + ((opcode) == RETURN_VALUE || \ + (opcode) == RAISE_VARARGS || \ + (opcode) == RERAISE) #define IS_TOP_LEVEL_AWAIT(c) ( \ (c->c_flags->cf_flags & PyCF_ALLOW_TOP_LEVEL_AWAIT) \ && (c->u->u_ste->ste_type == ModuleBlock)) +struct location { + int lineno; + int end_lineno; + int col_offset; + int end_col_offset; +}; + +#define LOCATION(LNO, END_LNO, COL, END_COL) \ + ((const struct location){(LNO), (END_LNO), (COL), (END_COL)}) + +static struct location NO_LOCATION = {-1, -1, -1, -1}; + struct instr { int i_opcode; int i_oparg; @@ -136,17 +169,9 @@ struct instr { struct basicblock_ *i_target; /* target block when exception is raised, should not be set by front-end. */ struct basicblock_ *i_except; - int i_lineno; - int i_end_lineno; - int i_col_offset; - int i_end_col_offset; + struct location i_loc; }; -typedef struct excepthandler { - struct instr *setup; - int offset; -} ExceptHandler; - typedef struct exceptstack { struct basicblock_ *handlers[CO_MAXBLOCKS+1]; int depth; @@ -178,17 +203,15 @@ is_relative_jump(struct instr *i) } static inline int -is_block_push(struct instr *instr) +is_block_push(struct instr *i) { - int opcode = instr->i_opcode; - return opcode == SETUP_FINALLY || opcode == SETUP_WITH || opcode == SETUP_CLEANUP; + return IS_BLOCK_PUSH_OPCODE(i->i_opcode); } static inline int is_jump(struct instr *i) { - return IS_VIRTUAL_JUMP_OPCODE(i->i_opcode) || - is_bit_set_in_table(_PyOpcode_Jump, i->i_opcode); + return IS_JUMP_OPCODE(i->i_opcode); } static int @@ -248,22 +271,54 @@ typedef struct basicblock_ { int b_ialloc; /* Number of predecssors that a block has. */ int b_predecessors; + /* Number of predecssors that a block has as an exception handler. */ + int b_except_predecessors; /* depth of stack upon entry of block, computed by stackdepth() */ int b_startdepth; /* instruction offset for block, computed by assemble_jump_offsets() */ int b_offset; - /* Basic block has no fall through (it ends with a return, raise or jump) */ - unsigned b_nofallthrough : 1; /* Basic block is an exception handler that preserves lasti */ unsigned b_preserve_lasti : 1; /* Used by compiler passes to mark whether they have visited a basic block. */ unsigned b_visited : 1; - /* Basic block exits scope (it ends with a return or raise) */ - unsigned b_exit : 1; - /* b_return is true if a RETURN_VALUE opcode is inserted. 
*/ - unsigned b_return : 1; + /* b_cold is true if this block is not perf critical (like an exception handler) */ + unsigned b_cold : 1; + /* b_warm is used by the cold-detection algorithm to mark blocks which are definitely not cold */ + unsigned b_warm : 1; } basicblock; + +static struct instr * +basicblock_last_instr(basicblock *b) { + if (b->b_iused) { + return &b->b_instr[b->b_iused - 1]; + } + return NULL; +} + +static inline int +basicblock_returns(basicblock *b) { + struct instr *last = basicblock_last_instr(b); + return last && last->i_opcode == RETURN_VALUE; +} + +static inline int +basicblock_exits_scope(basicblock *b) { + struct instr *last = basicblock_last_instr(b); + return last && IS_SCOPE_EXIT_OPCODE(last->i_opcode); +} + +static inline int +basicblock_nofallthrough(basicblock *b) { + struct instr *last = basicblock_last_instr(b); + return (last && + (IS_SCOPE_EXIT_OPCODE(last->i_opcode) || + IS_UNCONDITIONAL_JUMP_OPCODE(last->i_opcode))); +} + +#define BB_NO_FALLTHROUGH(B) (basicblock_nofallthrough(B)) +#define BB_HAS_FALLTHROUGH(B) (!basicblock_nofallthrough(B)) + /* fblockinfo tracks the current frame block. A frame block is used to handle loops, try/except, and try/finally. @@ -327,13 +382,7 @@ struct compiler_unit { struct fblockinfo u_fblock[CO_MAXBLOCKS]; int u_firstlineno; /* the first lineno of the block */ - int u_lineno; /* the lineno for the current stmt */ - int u_col_offset; /* the offset of the current stmt */ - int u_end_lineno; /* the end line of the current stmt */ - int u_end_col_offset; /* the end offset of the current stmt */ - - /* true if we need to create an implicit basicblock before the next instr */ - int u_need_new_implicit_block; + struct location u_loc; /* line/column info of the current stmt */ }; /* This struct captures the global state of a compilation. @@ -389,14 +438,14 @@ typedef struct { Py_ssize_t on_top; } pattern_context; +static int basicblock_next_instr(basicblock *); + static int compiler_enter_scope(struct compiler *, identifier, int, void *, int); static void compiler_free(struct compiler *); static basicblock *compiler_new_block(struct compiler *); -static int compiler_next_instr(basicblock *); -static int compiler_addop(struct compiler *, int); -static int compiler_addop_i(struct compiler *, int, Py_ssize_t); -static int compiler_addop_j(struct compiler *, int, basicblock *); -static int compiler_addop_j_noline(struct compiler *, int, basicblock *); +static int compiler_addop(struct compiler *, int, bool); +static int compiler_addop_i(struct compiler *, int, Py_ssize_t, bool); +static int compiler_addop_j(struct compiler *, int, basicblock *, bool); static int compiler_error(struct compiler *, const char *, ...); static int compiler_warn(struct compiler *, const char *, ...); static int compiler_nameop(struct compiler *, identifier, expr_context_ty); @@ -805,20 +854,26 @@ compiler_set_qualname(struct compiler *c) /* Allocate a new block and return a pointer to it. Returns NULL on error. */ +static basicblock * +new_basicblock() +{ + basicblock *b = (basicblock *)PyObject_Calloc(1, sizeof(basicblock)); + if (b == NULL) { + PyErr_NoMemory(); + return NULL; + } + return b; +} static basicblock * compiler_new_block(struct compiler *c) { - basicblock *b; - struct compiler_unit *u; - - u = c->u; - b = (basicblock *)PyObject_Calloc(1, sizeof(basicblock)); + basicblock *b = new_basicblock(); if (b == NULL) { - PyErr_NoMemory(); return NULL; } /* Extend the singly linked list of blocks with new block. 
*/ + struct compiler_unit *u = c->u; b->b_list = u->u_blocks; u->u_blocks = b; return b; @@ -830,30 +885,39 @@ compiler_use_next_block(struct compiler *c, basicblock *block) assert(block != NULL); c->u->u_curblock->b_next = block; c->u->u_curblock = block; - c->u->u_need_new_implicit_block = 0; return block; } static basicblock * -compiler_copy_block(struct compiler *c, basicblock *block) +basicblock_new_b_list_successor(basicblock *prev) +{ + basicblock *result = new_basicblock(); + if (result == NULL) { + return NULL; + } + result->b_list = prev->b_list; + prev->b_list = result; + return result; +} + +static basicblock * +copy_basicblock(basicblock *block) { /* Cannot copy a block if it has a fallthrough, since * a block can only have one fallthrough predecessor. */ - assert(block->b_nofallthrough); - basicblock *result = compiler_new_block(c); + assert(BB_NO_FALLTHROUGH(block)); + basicblock *result = basicblock_new_b_list_successor(block); if (result == NULL) { return NULL; } for (int i = 0; i < block->b_iused; i++) { - int n = compiler_next_instr(result); + int n = basicblock_next_instr(result); if (n < 0) { return NULL; } result->b_instr[n] = block->b_instr[i]; } - result->b_exit = block->b_exit; - result->b_nofallthrough = 1; return result; } @@ -863,7 +927,7 @@ compiler_copy_block(struct compiler *c, basicblock *block) */ static int -compiler_next_instr(basicblock *b) +basicblock_next_instr(basicblock *b) { assert(b != NULL); if (b->b_instr == NULL) { @@ -912,24 +976,19 @@ compiler_next_instr(basicblock *b) - before the "except" and "finally" clauses */ -#define SET_LOC(c, x) \ - (c)->u->u_lineno = (x)->lineno; \ - (c)->u->u_col_offset = (x)->col_offset; \ - (c)->u->u_end_lineno = (x)->end_lineno; \ - (c)->u->u_end_col_offset = (x)->end_col_offset; +#define SET_LOC(c, x) \ + (c)->u->u_loc.lineno = (x)->lineno; \ + (c)->u->u_loc.end_lineno = (x)->end_lineno; \ + (c)->u->u_loc.col_offset = (x)->col_offset; \ + (c)->u->u_loc.end_col_offset = (x)->end_col_offset; // Artificial instructions -#define UNSET_LOC(c) \ - (c)->u->u_lineno = -1; \ - (c)->u->u_col_offset = -1; \ - (c)->u->u_end_lineno = -1; \ - (c)->u->u_end_col_offset = -1; - -#define COPY_INSTR_LOC(old, new) \ - (new).i_lineno = (old).i_lineno; \ - (new).i_col_offset = (old).i_col_offset; \ - (new).i_end_lineno = (old).i_end_lineno; \ - (new).i_end_col_offset = (old).i_end_col_offset; +#define UNSET_LOC(c) \ + (c)->u->u_loc.lineno = -1; \ + (c)->u->u_loc.end_lineno = -1; \ + (c)->u->u_loc.col_offset = -1; \ + (c)->u->u_loc.end_col_offset = -1; + /* Return the stack effect of opcode with argument oparg. 
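The struct location added above folds the four separate lineno/end_lineno/col_offset/end_col_offset fields into a single value, so the current position can be saved, cleared for artificial instructions, and restored with plain struct assignments. A minimal, self-contained sketch of that pattern (illustration only; it borrows the struct location, LOCATION and NO_LOCATION names from this patch but none of the compiler types):

#include <stdio.h>

struct location {
    int lineno;
    int end_lineno;
    int col_offset;
    int end_col_offset;
};

#define LOCATION(LNO, END_LNO, COL, END_COL) \
    ((const struct location){(LNO), (END_LNO), (COL), (END_COL)})

static const struct location NO_LOCATION = {-1, -1, -1, -1};

int main(void)
{
    struct location cur = LOCATION(10, 10, 4, 9);  /* position of the current stmt */
    struct location saved = cur;   /* one assignment instead of four field copies */
    cur = NO_LOCATION;             /* an artificial instruction carries no location */
    cur = saved;                   /* restore the statement's position */
    printf("restored line %d, cols %d-%d\n",
           cur.lineno, cur.col_offset, cur.end_col_offset);
    return 0;
}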
@@ -1033,7 +1092,7 @@ stack_effect(int opcode, int oparg, int jump) case BUILD_CONST_KEY_MAP: return -oparg; case LOAD_ATTR: - return 0; + return (oparg & 1); case COMPARE_OP: case IS_OP: case CONTAINS_OP: @@ -1103,6 +1162,7 @@ stack_effect(int opcode, int oparg, int jump) return 1; case LOAD_FAST: + case LOAD_FAST_CHECK: return 1; case STORE_FAST: return -1; @@ -1116,12 +1176,10 @@ stack_effect(int opcode, int oparg, int jump) return -oparg; /* Functions and calls */ - case PRECALL: - return -oparg; case KW_NAMES: return 0; case CALL: - return -1; + return -1-oparg; case CALL_FUNCTION_EX: return -2 - ((oparg & 0x01) != 0); @@ -1208,20 +1266,18 @@ PyCompile_OpcodeStackEffect(int opcode, int oparg) return stack_effect(opcode, oparg, -1); } -static int is_end_of_basic_block(struct instr *instr) +static int +is_end_of_basic_block(struct instr *instr) { int opcode = instr->i_opcode; - - return is_jump(instr) || - opcode == RETURN_VALUE || - opcode == RAISE_VARARGS || - opcode == RERAISE; + return is_jump(instr) || IS_SCOPE_EXIT_OPCODE(opcode); } static int compiler_use_new_implicit_block_if_needed(struct compiler *c) { - if (c->u->u_need_new_implicit_block) { + basicblock *b = c->u->u_curblock; + if (b->b_iused && is_end_of_basic_block(basicblock_last_instr(b))) { basicblock *b = compiler_new_block(c); if (b == NULL) { return -1; @@ -1231,64 +1287,48 @@ compiler_use_new_implicit_block_if_needed(struct compiler *c) return 0; } -static void -compiler_check_if_end_of_block(struct compiler *c, struct instr *instr) -{ - if (is_end_of_basic_block(instr)) { - c->u->u_need_new_implicit_block = 1; - } -} - /* Add an opcode with no argument. Returns 0 on failure, 1 on success. */ static int -compiler_addop_line(struct compiler *c, int opcode, int line, - int end_line, int col_offset, int end_col_offset) +basicblock_addop(basicblock *b, int opcode, int oparg, + basicblock *target, const struct location *loc) { assert(IS_WITHIN_OPCODE_RANGE(opcode)); assert(!IS_ASSEMBLER_OPCODE(opcode)); - assert(!HAS_ARG(opcode) || IS_ARTIFICIAL(opcode)); - - if (compiler_use_new_implicit_block_if_needed(c) < 0) { - return -1; - } - - basicblock *b = c->u->u_curblock; - int off = compiler_next_instr(b); + assert(HAS_ARG(opcode) || oparg == 0); + assert(0 <= oparg && oparg < (1 << 30)); + assert((target == NULL) || + IS_JUMP_OPCODE(opcode) || + IS_BLOCK_PUSH_OPCODE(opcode)); + assert(oparg == 0 || target == NULL); + + int off = basicblock_next_instr(b); if (off < 0) { return 0; } struct instr *i = &b->b_instr[off]; i->i_opcode = opcode; - i->i_oparg = 0; - if (opcode == RETURN_VALUE) { - b->b_return = 1; - } - i->i_lineno = line; - i->i_end_lineno = end_line; - i->i_col_offset = col_offset; - i->i_end_col_offset = end_col_offset; + i->i_oparg = oparg; + i->i_target = target; + i->i_loc = loc ? *loc : NO_LOCATION; - compiler_check_if_end_of_block(c, i); return 1; } static int -compiler_addop(struct compiler *c, int opcode) +compiler_addop(struct compiler *c, int opcode, bool line) { - return compiler_addop_line(c, opcode, c->u->u_lineno, c->u->u_end_lineno, - c->u->u_col_offset, c->u->u_end_col_offset); -} + assert(!HAS_ARG(opcode) || IS_ARTIFICIAL(opcode)); + if (compiler_use_new_implicit_block_if_needed(c) < 0) { + return -1; + } -static int -compiler_addop_noline(struct compiler *c, int opcode) -{ - return compiler_addop_line(c, opcode, -1, 0, 0, 0); + const struct location *loc = line ? 
&c->u->u_loc : NULL; + return basicblock_addop(c->u->u_curblock, opcode, 0, NULL, loc); } - static Py_ssize_t compiler_add_o(PyObject *dict, PyObject *o) { @@ -1318,8 +1358,9 @@ compiler_add_o(PyObject *dict, PyObject *o) // Merge const *o* recursively and return constant key object. static PyObject* -merge_consts_recursive(struct compiler *c, PyObject *o) +merge_consts_recursive(PyObject *const_cache, PyObject *o) { + assert(PyDict_CheckExact(const_cache)); // None and Ellipsis are singleton, and key is the singleton. // No need to merge object and key. if (o == Py_None || o == Py_Ellipsis) { @@ -1333,22 +1374,22 @@ merge_consts_recursive(struct compiler *c, PyObject *o) } // t is borrowed reference - PyObject *t = PyDict_SetDefault(c->c_const_cache, key, key); + PyObject *t = PyDict_SetDefault(const_cache, key, key); if (t != key) { - // o is registered in c_const_cache. Just use it. + // o is registered in const_cache. Just use it. Py_XINCREF(t); Py_DECREF(key); return t; } - // We registered o in c_const_cache. + // We registered o in const_cache. // When o is a tuple or frozenset, we want to merge its // items too. if (PyTuple_CheckExact(o)) { Py_ssize_t len = PyTuple_GET_SIZE(o); for (Py_ssize_t i = 0; i < len; i++) { PyObject *item = PyTuple_GET_ITEM(o, i); - PyObject *u = merge_consts_recursive(c, item); + PyObject *u = merge_consts_recursive(const_cache, item); if (u == NULL) { Py_DECREF(key); return NULL; @@ -1391,7 +1432,7 @@ merge_consts_recursive(struct compiler *c, PyObject *o) PyObject *item; Py_hash_t hash; while (_PySet_NextEntry(o, &pos, &item, &hash)) { - PyObject *k = merge_consts_recursive(c, item); + PyObject *k = merge_consts_recursive(const_cache, item); if (k == NULL) { Py_DECREF(tuple); Py_DECREF(key); @@ -1429,7 +1470,7 @@ merge_consts_recursive(struct compiler *c, PyObject *o) static Py_ssize_t compiler_add_const(struct compiler *c, PyObject *o) { - PyObject *key = merge_consts_recursive(c, o); + PyObject *key = merge_consts_recursive(c->c_const_cache, o); if (key == NULL) { return -1; } @@ -1445,7 +1486,7 @@ compiler_addop_load_const(struct compiler *c, PyObject *o) Py_ssize_t arg = compiler_add_const(c, o); if (arg < 0) return 0; - return compiler_addop_i(c, LOAD_CONST, arg); + return compiler_addop_i(c, LOAD_CONST, arg, true); } static int @@ -1455,7 +1496,7 @@ compiler_addop_o(struct compiler *c, int opcode, PyObject *dict, Py_ssize_t arg = compiler_add_o(dict, o); if (arg < 0) return 0; - return compiler_addop_i(c, opcode, arg); + return compiler_addop_i(c, opcode, arg, true); } static int @@ -1471,18 +1512,26 @@ compiler_addop_name(struct compiler *c, int opcode, PyObject *dict, Py_DECREF(mangled); if (arg < 0) return 0; - return compiler_addop_i(c, opcode, arg); + if (opcode == LOAD_ATTR) { + arg <<= 1; + } + if (opcode == LOAD_METHOD) { + opcode = LOAD_ATTR; + arg <<= 1; + arg |= 1; + } + return compiler_addop_i(c, opcode, arg, true); } /* Add an opcode with an integer argument. Returns 0 on failure, 1 on success. */ - static int -compiler_addop_i_line(struct compiler *c, int opcode, Py_ssize_t oparg, - int lineno, int end_lineno, - int col_offset, int end_col_offset) +compiler_addop_i(struct compiler *c, int opcode, Py_ssize_t oparg, bool line) { + if (compiler_use_new_implicit_block_if_needed(c) < 0) { + return -1; + } /* oparg value is unsigned, but a signed C int is usually used to store it in the C code (like Python/ceval.c). 
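The compiler_addop_name() change above folds LOAD_METHOD into LOAD_ATTR by packing a "method load" flag into the low bit of the oparg and the co_names index into the remaining bits; that flag is also why the LOAD_ATTR stack effect earlier in this patch becomes (oparg & 1). A small stand-alone sketch of the encoding (encode_attr_oparg is a hypothetical helper written for illustration, not part of the patch):

#include <assert.h>
#include <stdio.h>

static int encode_attr_oparg(int name_index, int is_method)
{
    /* Low bit: method flag; remaining bits: index into co_names. */
    return (name_index << 1) | (is_method ? 1 : 0);
}

int main(void)
{
    int oparg = encode_attr_oparg(3, 1);  /* names[3], loaded in method form */
    assert((oparg >> 1) == 3);            /* name index comes back out of the high bits */
    assert((oparg & 1) == 1);             /* the flag behind the (oparg & 1) stack effect */
    printf("oparg=%d name_index=%d method_flag=%d\n",
           oparg, oparg >> 1, oparg & 1);
    return 0;
}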
@@ -1491,103 +1540,36 @@ compiler_addop_i_line(struct compiler *c, int opcode, Py_ssize_t oparg, The argument of a concrete bytecode instruction is limited to 8-bit. EXTENDED_ARG is used for 16, 24, and 32-bit arguments. */ - assert(IS_WITHIN_OPCODE_RANGE(opcode)); - assert(!IS_ASSEMBLER_OPCODE(opcode)); - assert(HAS_ARG(opcode)); - assert(0 <= oparg && oparg <= 2147483647); - - if (compiler_use_new_implicit_block_if_needed(c) < 0) { - return -1; - } + int oparg_ = Py_SAFE_DOWNCAST(oparg, Py_ssize_t, int); - basicblock *b = c->u->u_curblock; - int off = compiler_next_instr(b); - if (off < 0) { - return 0; - } - struct instr *i = &b->b_instr[off]; - i->i_opcode = opcode; - i->i_oparg = Py_SAFE_DOWNCAST(oparg, Py_ssize_t, int); - i->i_lineno = lineno; - i->i_end_lineno = end_lineno; - i->i_col_offset = col_offset; - i->i_end_col_offset = end_col_offset; - - compiler_check_if_end_of_block(c, i); - return 1; + const struct location *loc = line ? &c->u->u_loc : NULL; + return basicblock_addop(c->u->u_curblock, opcode, oparg_, NULL, loc); } static int -compiler_addop_i(struct compiler *c, int opcode, Py_ssize_t oparg) +compiler_addop_j(struct compiler *c, int opcode, basicblock *target, bool line) { - return compiler_addop_i_line(c, opcode, oparg, - c->u->u_lineno, c->u->u_end_lineno, - c->u->u_col_offset, c->u->u_end_col_offset); -} - -static int -compiler_addop_i_noline(struct compiler *c, int opcode, Py_ssize_t oparg) -{ - return compiler_addop_i_line(c, opcode, oparg, -1, 0, 0, 0); -} - -static int add_jump_to_block(struct compiler *c, int opcode, - int lineno, int end_lineno, - int col_offset, int end_col_offset, - basicblock *target) -{ - assert(IS_WITHIN_OPCODE_RANGE(opcode)); - assert(!IS_ASSEMBLER_OPCODE(opcode)); - assert(HAS_ARG(opcode) || IS_VIRTUAL_OPCODE(opcode)); - assert(target != NULL); - if (compiler_use_new_implicit_block_if_needed(c) < 0) { return -1; } - - basicblock *b = c->u->u_curblock; - int off = compiler_next_instr(b); - struct instr *i = &b->b_instr[off]; - if (off < 0) { - return 0; - } - i->i_opcode = opcode; - i->i_target = target; - i->i_lineno = lineno; - i->i_end_lineno = end_lineno; - i->i_col_offset = col_offset; - i->i_end_col_offset = end_col_offset; - - compiler_check_if_end_of_block(c, i); - return 1; -} - -static int -compiler_addop_j(struct compiler *c, int opcode, basicblock *b) -{ - return add_jump_to_block(c, opcode, c->u->u_lineno, - c->u->u_end_lineno, c->u->u_col_offset, - c->u->u_end_col_offset, b); -} - -static int -compiler_addop_j_noline(struct compiler *c, int opcode, basicblock *b) -{ - return add_jump_to_block(c, opcode, -1, 0, 0, 0, b); + const struct location *loc = line ? 
&c->u->u_loc : NULL; + assert(target != NULL); + assert(IS_JUMP_OPCODE(opcode) || IS_BLOCK_PUSH_OPCODE(opcode)); + return basicblock_addop(c->u->u_curblock, opcode, 0, target, loc); } #define ADDOP(C, OP) { \ - if (!compiler_addop((C), (OP))) \ + if (!compiler_addop((C), (OP), true)) \ return 0; \ } #define ADDOP_NOLINE(C, OP) { \ - if (!compiler_addop_noline((C), (OP))) \ + if (!compiler_addop((C), (OP), false)) \ return 0; \ } #define ADDOP_IN_SCOPE(C, OP) { \ - if (!compiler_addop((C), (OP))) { \ + if (!compiler_addop((C), (OP), true)) { \ compiler_exit_scope(c); \ return 0; \ } \ @@ -1626,17 +1608,17 @@ compiler_addop_j_noline(struct compiler *c, int opcode, basicblock *b) } #define ADDOP_I(C, OP, O) { \ - if (!compiler_addop_i((C), (OP), (O))) \ + if (!compiler_addop_i((C), (OP), (O), true)) \ return 0; \ } #define ADDOP_I_NOLINE(C, OP, O) { \ - if (!compiler_addop_i_noline((C), (OP), (O))) \ + if (!compiler_addop_i((C), (OP), (O), false)) \ return 0; \ } #define ADDOP_JUMP(C, OP, O) { \ - if (!compiler_addop_j((C), (OP), (O))) \ + if (!compiler_addop_j((C), (OP), (O), true)) \ return 0; \ } @@ -1644,7 +1626,7 @@ compiler_addop_j_noline(struct compiler *c, int opcode, basicblock *b) * Used for artificial jumps that have no corresponding * token in the source code. */ #define ADDOP_JUMP_NOLINE(C, OP, O) { \ - if (!compiler_addop_j_noline((C), (OP), (O))) \ + if (!compiler_addop_j((C), (OP), (O), false)) \ return 0; \ } @@ -1764,10 +1746,7 @@ compiler_enter_scope(struct compiler *c, identifier name, u->u_blocks = NULL; u->u_nfblocks = 0; u->u_firstlineno = lineno; - u->u_lineno = lineno; - u->u_col_offset = 0; - u->u_end_lineno = lineno; - u->u_end_col_offset = 0; + u->u_loc = LOCATION(lineno, lineno, 0, 0); u->u_consts = PyDict_New(); if (!u->u_consts) { compiler_unit_free(u); @@ -1803,7 +1782,7 @@ compiler_enter_scope(struct compiler *c, identifier name, c->u->u_curblock = block; if (u->u_scope_type == COMPILER_SCOPE_MODULE) { - c->u->u_lineno = -1; + c->u->u_loc.lineno = -1; } else { if (!compiler_set_qualname(c)) @@ -1947,7 +1926,6 @@ compiler_call_exit_with_nones(struct compiler *c) { ADDOP_LOAD_CONST(c, Py_None); ADDOP_LOAD_CONST(c, Py_None); ADDOP_LOAD_CONST(c, Py_None); - ADDOP_I(c, PRECALL, 2); ADDOP_I(c, CALL, 2); return 1; } @@ -1965,7 +1943,7 @@ compiler_add_yield_from(struct compiler *c, int await) compiler_use_next_block(c, start); ADDOP_JUMP(c, SEND, exit); compiler_use_next_block(c, resume); - ADDOP(c, YIELD_VALUE); + ADDOP_I(c, YIELD_VALUE, 0); ADDOP_I(c, RESUME, await ? 
3 : 2); ADDOP_JUMP(c, JUMP_NO_INTERRUPT, start); compiler_use_next_block(c, exit); @@ -2175,7 +2153,7 @@ compiler_mod(struct compiler *c, mod_ty mod) mod, 1)) { return NULL; } - c->u->u_lineno = 1; + c->u->u_loc.lineno = 1; switch (mod->kind) { case Module_kind: if (!compiler_body(c, mod->v.Module.body)) { @@ -2319,19 +2297,12 @@ compiler_apply_decorators(struct compiler *c, asdl_expr_seq* decos) if (!decos) return 1; - int old_lineno = c->u->u_lineno; - int old_end_lineno = c->u->u_end_lineno; - int old_col_offset = c->u->u_col_offset; - int old_end_col_offset = c->u->u_end_col_offset; + struct location old_loc = c->u->u_loc; for (Py_ssize_t i = asdl_seq_LEN(decos) - 1; i > -1; i--) { SET_LOC(c, (expr_ty)asdl_seq_GET(decos, i)); - ADDOP_I(c, PRECALL, 0); ADDOP_I(c, CALL, 0); } - c->u->u_lineno = old_lineno; - c->u->u_end_lineno = old_end_lineno; - c->u->u_col_offset = old_col_offset; - c->u->u_end_col_offset = old_end_col_offset; + c->u->u_loc = old_loc; return 1; } @@ -3152,6 +3123,8 @@ compiler_async_for(struct compiler *c, stmt_ty s) /* Success block for __anext__ */ VISIT(c, expr, s->v.AsyncFor.target); VISIT_SEQ(c, stmt, s->v.AsyncFor.body); + /* Mark jump as artificial */ + UNSET_LOC(c); ADDOP_JUMP(c, JUMP, start); compiler_pop_fblock(c, FOR_LOOP, start); @@ -4002,7 +3975,6 @@ compiler_assert(struct compiler *c, stmt_ty s) ADDOP(c, LOAD_ASSERTION_ERROR); if (s->v.Assert.msg) { VISIT(c, expr, s->v.Assert.msg); - ADDOP_I(c, PRECALL, 0); ADDOP_I(c, CALL, 0); } ADDOP_I(c, RAISE_VARARGS, 1); @@ -4198,7 +4170,7 @@ addop_yield(struct compiler *c) { if (c->u->u_ste->ste_generator && c->u->u_ste->ste_coroutine) { ADDOP(c, ASYNC_GEN_WRAP); } - ADDOP(c, YIELD_VALUE); + ADDOP_I(c, YIELD_VALUE, 0); ADDOP_I(c, RESUME, 1); return 1; } @@ -4298,7 +4270,7 @@ compiler_nameop(struct compiler *c, identifier name, expr_context_ty ctx) if (op == LOAD_GLOBAL) { arg <<= 1; } - return compiler_addop_i(c, op, arg); + return compiler_addop_i(c, op, arg, true); } static int @@ -4779,6 +4751,16 @@ is_import_originated(struct compiler *c, expr_ty e) return flags & DEF_IMPORT; } +static void +update_location_to_match_attr(struct compiler *c, expr_ty meth) +{ + if (meth->lineno != meth->end_lineno) { + // Make start location match attribute + c->u->u_loc.lineno = meth->end_lineno; + c->u->u_loc.col_offset = meth->end_col_offset - (int)PyUnicode_GetLength(meth->v.Attribute.attr)-1; + } +} + // Return 1 if the method call was optimized, -1 if not, and 0 on error. static int maybe_optimize_method_call(struct compiler *c, expr_ty e) @@ -4820,8 +4802,8 @@ maybe_optimize_method_call(struct compiler *c, expr_ty e) } /* Alright, we can optimize the code. 
*/ VISIT(c, expr, meth->v.Attribute.value); - int old_lineno = c->u->u_lineno; - c->u->u_lineno = meth->end_lineno; + SET_LOC(c, meth); + update_location_to_match_attr(c, meth); ADDOP_NAME(c, LOAD_METHOD, meth->v.Attribute.attr, names); VISIT_SEQ(c, expr, e->v.Call.args); @@ -4831,9 +4813,9 @@ maybe_optimize_method_call(struct compiler *c, expr_ty e) return 0; }; } - ADDOP_I(c, PRECALL, argsl + kwdsl); + SET_LOC(c, e); + update_location_to_match_attr(c, meth); ADDOP_I(c, CALL, argsl + kwdsl); - c->u->u_lineno = old_lineno; return 1; } @@ -4897,7 +4879,6 @@ compiler_joined_str(struct compiler *c, expr_ty e) VISIT(c, expr, asdl_seq_GET(e->v.JoinedStr.values, i)); ADDOP_I(c, LIST_APPEND, 1); } - ADDOP_I(c, PRECALL, 1); ADDOP_I(c, CALL, 1); } else { @@ -5071,7 +5052,6 @@ compiler_call_helper(struct compiler *c, return 0; }; } - ADDOP_I(c, PRECALL, n + nelts + nkwelts); ADDOP_I(c, CALL, n + nelts + nkwelts); return 1; @@ -5462,7 +5442,6 @@ compiler_comprehension(struct compiler *c, expr_ty e, int type, ADDOP(c, GET_ITER); } - ADDOP_I(c, PRECALL, 0); ADDOP_I(c, CALL, 0); if (is_async_generator && type != COMP_GENEXP) { @@ -5533,20 +5512,26 @@ compiler_visit_keyword(struct compiler *c, keyword_ty k) static int compiler_with_except_finish(struct compiler *c, basicblock * cleanup) { UNSET_LOC(c); - basicblock *exit; - exit = compiler_new_block(c); - if (exit == NULL) + basicblock *suppress = compiler_new_block(c); + if (suppress == NULL) { return 0; - ADDOP_JUMP(c, POP_JUMP_IF_TRUE, exit); + } + ADDOP_JUMP(c, POP_JUMP_IF_TRUE, suppress); ADDOP_I(c, RERAISE, 2); - compiler_use_next_block(c, cleanup); - POP_EXCEPT_AND_RERAISE(c); - compiler_use_next_block(c, exit); + compiler_use_next_block(c, suppress); ADDOP(c, POP_TOP); /* exc_value */ ADDOP(c, POP_BLOCK); ADDOP(c, POP_EXCEPT); ADDOP(c, POP_TOP); ADDOP(c, POP_TOP); + basicblock *exit = compiler_new_block(c); + if (exit == NULL) { + return 0; + } + ADDOP_JUMP(c, JUMP, exit); + compiler_use_next_block(c, cleanup); + POP_EXCEPT_AND_RERAISE(c); + compiler_use_next_block(c, exit); return 1; } @@ -5844,20 +5829,20 @@ compiler_visit_expr1(struct compiler *c, expr_ty e) switch (e->v.Attribute.ctx) { case Load: { - int old_lineno = c->u->u_lineno; - c->u->u_lineno = e->end_lineno; + int old_lineno = c->u->u_loc.lineno; + c->u->u_loc.lineno = e->end_lineno; ADDOP_NAME(c, LOAD_ATTR, e->v.Attribute.attr, names); - c->u->u_lineno = old_lineno; + c->u->u_loc.lineno = old_lineno; break; } case Store: if (forbidden_name(c, e->v.Attribute.attr, e->v.Attribute.ctx)) { return 0; } - int old_lineno = c->u->u_lineno; - c->u->u_lineno = e->end_lineno; + int old_lineno = c->u->u_loc.lineno; + c->u->u_loc.lineno = e->end_lineno; ADDOP_NAME(c, STORE_ATTR, e->v.Attribute.attr, names); - c->u->u_lineno = old_lineno; + c->u->u_loc.lineno = old_lineno; break; case Del: ADDOP_NAME(c, DELETE_ATTR, e->v.Attribute.attr, names); @@ -5894,16 +5879,10 @@ compiler_visit_expr1(struct compiler *c, expr_ty e) static int compiler_visit_expr(struct compiler *c, expr_ty e) { - int old_lineno = c->u->u_lineno; - int old_end_lineno = c->u->u_end_lineno; - int old_col_offset = c->u->u_col_offset; - int old_end_col_offset = c->u->u_end_col_offset; + struct location old_loc = c->u->u_loc; SET_LOC(c, e); int res = compiler_visit_expr1(c, e); - c->u->u_lineno = old_lineno; - c->u->u_end_lineno = old_end_lineno; - c->u->u_col_offset = old_col_offset; - c->u->u_end_col_offset = old_end_col_offset; + c->u->u_loc = old_loc; return res; } @@ -5913,20 +5892,17 @@ compiler_augassign(struct compiler *c, 
stmt_ty s) assert(s->kind == AugAssign_kind); expr_ty e = s->v.AugAssign.target; - int old_lineno = c->u->u_lineno; - int old_end_lineno = c->u->u_end_lineno; - int old_col_offset = c->u->u_col_offset; - int old_end_col_offset = c->u->u_end_col_offset; + struct location old_loc = c->u->u_loc; SET_LOC(c, e); switch (e->kind) { case Attribute_kind: VISIT(c, expr, e->v.Attribute.value); ADDOP_I(c, COPY, 1); - int old_lineno = c->u->u_lineno; - c->u->u_lineno = e->end_lineno; + int old_lineno = c->u->u_loc.lineno; + c->u->u_loc.lineno = e->end_lineno; ADDOP_NAME(c, LOAD_ATTR, e->v.Attribute.attr, names); - c->u->u_lineno = old_lineno; + c->u->u_loc.lineno = old_lineno; break; case Subscript_kind: VISIT(c, expr, e->v.Subscript.value); @@ -5946,10 +5922,7 @@ compiler_augassign(struct compiler *c, stmt_ty s) return 0; } - c->u->u_lineno = old_lineno; - c->u->u_end_lineno = old_end_lineno; - c->u->u_col_offset = old_col_offset; - c->u->u_end_col_offset = old_end_col_offset; + c->u->u_loc = old_loc; VISIT(c, expr, s->v.AugAssign.value); ADDOP_INPLACE(c, s->v.AugAssign.op); @@ -5958,7 +5931,7 @@ compiler_augassign(struct compiler *c, stmt_ty s) switch (e->kind) { case Attribute_kind: - c->u->u_lineno = e->end_lineno; + c->u->u_loc.lineno = e->end_lineno; ADDOP_I(c, SWAP, 2); ADDOP_NAME(c, STORE_ATTR, e->v.Attribute.attr, names); break; @@ -6101,24 +6074,21 @@ static int compiler_error(struct compiler *c, const char *format, ...) { va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif PyObject *msg = PyUnicode_FromFormatV(format, vargs); va_end(vargs); if (msg == NULL) { return 0; } - PyObject *loc = PyErr_ProgramTextObject(c->c_filename, c->u->u_lineno); + PyObject *loc = PyErr_ProgramTextObject(c->c_filename, c->u->u_loc.lineno); if (loc == NULL) { Py_INCREF(Py_None); loc = Py_None; } + struct location u_loc = c->u->u_loc; PyObject *args = Py_BuildValue("O(OiiOii)", msg, c->c_filename, - c->u->u_lineno, c->u->u_col_offset + 1, loc, - c->u->u_end_lineno, c->u->u_end_col_offset + 1); + u_loc.lineno, u_loc.col_offset + 1, loc, + u_loc.end_lineno, u_loc.end_col_offset + 1); Py_DECREF(msg); if (args == NULL) { goto exit; @@ -6138,18 +6108,14 @@ static int compiler_warn(struct compiler *c, const char *format, ...) { va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif PyObject *msg = PyUnicode_FromFormatV(format, vargs); va_end(vargs); if (msg == NULL) { return 0; } if (PyErr_WarnExplicitObject(PyExc_SyntaxWarning, msg, c->c_filename, - c->u->u_lineno, NULL, NULL) < 0) + c->u->u_loc.lineno, NULL, NULL) < 0) { if (PyErr_ExceptionMatches(PyExc_SyntaxWarning)) { /* Replace the SyntaxWarning exception with a SyntaxError @@ -6289,7 +6255,7 @@ emit_and_reset_fail_pop(struct compiler *c, pattern_context *pc) } while (--pc->fail_pop_size) { compiler_use_next_block(c, pc->fail_pop[pc->fail_pop_size]); - if (!compiler_addop(c, POP_TOP)) { + if (!compiler_addop(c, POP_TOP, true)) { pc->fail_pop_size = 0; PyObject_Free(pc->fail_pop); pc->fail_pop = NULL; @@ -6723,7 +6689,7 @@ compiler_pattern_or(struct compiler *c, pattern_ty p, pattern_context *pc) pc->fail_pop = NULL; pc->fail_pop_size = 0; pc->on_top = 0; - if (!compiler_addop_i(c, COPY, 1) || !compiler_pattern(c, alt, pc)) { + if (!compiler_addop_i(c, COPY, 1, true) || !compiler_pattern(c, alt, pc)) { goto error; } // Success! 
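The update_location_to_match_attr() helper added above narrows the reported position of a multi-line method call to the attribute itself: the line becomes the attribute's end line, and the start column is recovered by stepping back over the attribute name plus the '.' in front of it (end_col_offset - len(attr) - 1). A small stand-alone sketch of that column arithmetic (attr_start_col is a hypothetical helper written for illustration; only the subtraction mirrors the patch):

#include <assert.h>
#include <stdio.h>
#include <string.h>

static int attr_start_col(int end_col_offset, const char *attr_name)
{
    /* Step back over the attribute name and the '.' that precedes it. */
    return end_col_offset - (int)strlen(attr_name) - 1;
}

int main(void)
{
    /* Suppose the call's last line is "    .method(arg)": the name "method"
     * occupies columns 5-10, so its 0-based exclusive end column is 11. */
    int col = attr_start_col(11, "method");
    assert(col == 4);  /* column of the '.', right before the attribute name */
    printf("narrowed location starts at column %d\n", col);
    return 0;
}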
@@ -6786,7 +6752,7 @@ compiler_pattern_or(struct compiler *c, pattern_ty p, pattern_context *pc) } } assert(control); - if (!compiler_addop_j(c, JUMP, end) || + if (!compiler_addop_j(c, JUMP, end, true) || !emit_and_reset_fail_pop(c, pc)) { goto error; @@ -6798,7 +6764,7 @@ compiler_pattern_or(struct compiler *c, pattern_ty p, pattern_context *pc) // Need to NULL this for the PyObject_Free call in the error block. old_pc.fail_pop = NULL; // No match. Pop the remaining copy of the subject and fail: - if (!compiler_addop(c, POP_TOP) || !jump_to_fail_pop(c, pc, JUMP)) { + if (!compiler_addop(c, POP_TOP, true) || !jump_to_fail_pop(c, pc, JUMP)) { goto error; } compiler_use_next_block(c, end); @@ -7047,33 +7013,35 @@ compiler_match(struct compiler *c, stmt_ty s) #undef WILDCARD_CHECK #undef WILDCARD_STAR_CHECK -/* End of the compiler section, beginning of the assembler section */ -/* do depth-first search of basic block graph, starting with block. - post records the block indices in post-order. - - XXX must handle implicit jumps from one block to next -*/ +/* End of the compiler section, beginning of the assembler section */ struct assembler { PyObject *a_bytecode; /* bytes containing bytecode */ - PyObject *a_except_table; /* bytes containing exception table */ - basicblock *a_entry; int a_offset; /* offset into bytecode */ - int a_nblocks; /* number of reachable blocks */ + PyObject *a_except_table; /* bytes containing exception table */ int a_except_table_off; /* offset into exception table */ - int a_prevlineno; /* lineno of last emitted line in line table */ - int a_prev_end_lineno; /* end_lineno of last emitted line in line table */ - int a_lineno; /* lineno of last emitted instruction */ - int a_end_lineno; /* end_lineno of last emitted instruction */ - int a_lineno_start; /* bytecode start offset of current lineno */ - int a_end_lineno_start; /* bytecode start offset of current end_lineno */ /* Location Info */ + int a_lineno; /* lineno of last emitted instruction */ PyObject* a_linetable; /* bytes containing location info */ int a_location_off; /* offset of last written location info frame */ }; +static basicblock** +make_cfg_traversal_stack(basicblock *entryblock) { + int nblocks = 0; + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { + b->b_visited = 0; + nblocks++; + } + basicblock **stack = (basicblock **)PyMem_Malloc(sizeof(basicblock *) * nblocks); + if (!stack) { + PyErr_NoMemory(); + } + return stack; +} + Py_LOCAL_INLINE(void) stackdepth_push(basicblock ***sp, basicblock *b, int depth) { @@ -7089,31 +7057,26 @@ stackdepth_push(basicblock ***sp, basicblock *b, int depth) * cycles in the flow graph have no net effect on the stack depth. 
*/ static int -stackdepth(struct compiler *c) +stackdepth(basicblock *entryblock, int code_flags) { - basicblock *b, *entryblock = NULL; - basicblock **stack, **sp; - int nblocks = 0, maxdepth = 0; - for (b = c->u->u_blocks; b != NULL; b = b->b_list) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { b->b_startdepth = INT_MIN; - entryblock = b; - nblocks++; } - assert(entryblock!= NULL); - stack = (basicblock **)PyObject_Malloc(sizeof(basicblock *) * nblocks); + basicblock **stack = make_cfg_traversal_stack(entryblock); if (!stack) { - PyErr_NoMemory(); return -1; } - sp = stack; - if (c->u->u_ste->ste_generator || c->u->u_ste->ste_coroutine) { + int maxdepth = 0; + basicblock **sp = stack; + if (code_flags & (CO_GENERATOR | CO_COROUTINE | CO_ASYNC_GENERATOR)) { stackdepth_push(&sp, entryblock, 1); } else { stackdepth_push(&sp, entryblock, 0); } + while (sp != stack) { - b = *--sp; + basicblock *b = *--sp; int depth = b->b_startdepth; assert(depth >= 0); basicblock *next = b->b_next; @@ -7143,32 +7106,31 @@ stackdepth(struct compiler *c) } depth = new_depth; assert(!IS_ASSEMBLER_OPCODE(instr->i_opcode)); - if (instr->i_opcode == JUMP_NO_INTERRUPT || - instr->i_opcode == JUMP || - instr->i_opcode == RETURN_VALUE || - instr->i_opcode == RAISE_VARARGS || - instr->i_opcode == RERAISE) + if (IS_UNCONDITIONAL_JUMP_OPCODE(instr->i_opcode) || + IS_SCOPE_EXIT_OPCODE(instr->i_opcode)) { /* remaining code is dead */ next = NULL; break; } + if (instr->i_opcode == YIELD_VALUE) { + instr->i_oparg = depth; + } } if (next != NULL) { - assert(b->b_nofallthrough == 0); + assert(BB_HAS_FALLTHROUGH(b)); stackdepth_push(&sp, next, depth); } } - PyObject_Free(stack); + PyMem_Free(stack); return maxdepth; } static int -assemble_init(struct assembler *a, int nblocks, int firstlineno) +assemble_init(struct assembler *a, int firstlineno) { memset(a, 0, sizeof(struct assembler)); - a->a_prevlineno = a->a_lineno = firstlineno; - a->a_prev_end_lineno = a->a_end_lineno = firstlineno; + a->a_lineno = firstlineno; a->a_linetable = NULL; a->a_location_off = 0; a->a_except_table = NULL; @@ -7184,10 +7146,6 @@ assemble_init(struct assembler *a, int nblocks, int firstlineno) if (a->a_except_table == NULL) { goto error; } - if ((size_t)nblocks > SIZE_MAX / sizeof(basicblock *)) { - PyErr_NoMemory(); - goto error; - } return 1; error: Py_XDECREF(a->a_bytecode); @@ -7263,15 +7221,9 @@ copy_except_stack(ExceptStack *stack) { } static int -label_exception_targets(basicblock *entry) { - int nblocks = 0; - for (basicblock *b = entry; b != NULL; b = b->b_next) { - b->b_visited = 0; - nblocks++; - } - basicblock **todo_stack = PyMem_Malloc(sizeof(basicblock *)*nblocks); +label_exception_targets(basicblock *entryblock) { + basicblock **todo_stack = make_cfg_traversal_stack(entryblock); if (todo_stack == NULL) { - PyErr_NoMemory(); return -1; } ExceptStack *except_stack = make_except_stack(); @@ -7281,9 +7233,9 @@ label_exception_targets(basicblock *entry) { return -1; } except_stack->depth = 0; - todo_stack[0] = entry; - entry->b_visited = 1; - entry->b_exceptstack = except_stack; + todo_stack[0] = entryblock; + entryblock->b_visited = 1; + entryblock->b_exceptstack = except_stack; basicblock **todo = &todo_stack[1]; basicblock *handler = NULL; while (todo > todo_stack) { @@ -7316,7 +7268,7 @@ label_exception_targets(basicblock *entry) { instr->i_except = handler; assert(i == b->b_iused -1); if (!instr->i_target->b_visited) { - if (b->b_nofallthrough == 0) { + if (BB_HAS_FALLTHROUGH(b)) { ExceptStack *copy = 
copy_except_stack(except_stack); if (copy == NULL) { goto error; @@ -7336,7 +7288,7 @@ label_exception_targets(basicblock *entry) { instr->i_except = handler; } } - if (b->b_nofallthrough == 0 && !b->b_next->b_visited) { + if (BB_HAS_FALLTHROUGH(b) && !b->b_next->b_visited) { assert(except_stack != NULL); b->b_next->b_exceptstack = except_stack; todo[0] = b->b_next; @@ -7348,7 +7300,7 @@ label_exception_targets(basicblock *entry) { } } #ifdef Py_DEBUG - for (basicblock *b = entry; b != NULL; b = b->b_next) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { assert(b->b_exceptstack == NULL); } #endif @@ -7360,10 +7312,158 @@ label_exception_targets(basicblock *entry) { return -1; } +static int +mark_warm(basicblock *entryblock) { + basicblock **stack = make_cfg_traversal_stack(entryblock); + if (stack == NULL) { + return -1; + } + basicblock **sp = stack; + + *sp++ = entryblock; + entryblock->b_visited = 1; + while (sp > stack) { + basicblock *b = *(--sp); + assert(!b->b_except_predecessors); + b->b_warm = 1; + basicblock *next = b->b_next; + if (next && BB_HAS_FALLTHROUGH(b) && !next->b_visited) { + *sp++ = next; + next->b_visited = 1; + } + for (int i=0; i < b->b_iused; i++) { + struct instr *instr = &b->b_instr[i]; + if (is_jump(instr) && !instr->i_target->b_visited) { + *sp++ = instr->i_target; + instr->i_target->b_visited = 1; + } + } + } + PyMem_Free(stack); + return 0; +} + +static int +mark_cold(basicblock *entryblock) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { + assert(!b->b_cold && !b->b_warm); + } + if (mark_warm(entryblock) < 0) { + return -1; + } + + basicblock **stack = make_cfg_traversal_stack(entryblock); + if (stack == NULL) { + return -1; + } + + basicblock **sp = stack; + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { + if (b->b_except_predecessors) { + assert(b->b_except_predecessors == b->b_predecessors); + assert(!b->b_warm); + *sp++ = b; + b->b_visited = 1; + } + } + + while (sp > stack) { + basicblock *b = *(--sp); + b->b_cold = 1; + basicblock *next = b->b_next; + if (next && BB_HAS_FALLTHROUGH(b)) { + if (!next->b_warm && !next->b_visited) { + *sp++ = next; + next->b_visited = 1; + } + } + for (int i = 0; i < b->b_iused; i++) { + struct instr *instr = &b->b_instr[i]; + if (is_jump(instr)) { + assert(i == b->b_iused-1); + basicblock *target = b->b_instr[i].i_target; + if (!target->b_warm && !target->b_visited) { + *sp++ = target; + target->b_visited = 1; + } + } + } + } + PyMem_Free(stack); + return 0; +} + +static int +push_cold_blocks_to_end(basicblock *entryblock, int code_flags) { + if (entryblock->b_next == NULL) { + /* single basicblock, no need to reorder */ + return 0; + } + if (mark_cold(entryblock) < 0) { + return -1; + } + + /* If we have a cold block with fallthrough to a warm block, add */ + /* an explicit jump instead of fallthrough */ + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { + if (b->b_cold && BB_HAS_FALLTHROUGH(b) && b->b_next && b->b_next->b_warm) { + basicblock *explicit_jump = basicblock_new_b_list_successor(b); + if (explicit_jump == NULL) { + return -1; + } + basicblock_addop(explicit_jump, JUMP, 0, b->b_next, &NO_LOCATION); + + explicit_jump->b_cold = 1; + explicit_jump->b_next = b->b_next; + b->b_next = explicit_jump; + } + } + + assert(!entryblock->b_cold); /* First block can't be cold */ + basicblock *cold_blocks = NULL; + basicblock *cold_blocks_tail = NULL; + + basicblock *b = entryblock; + while(b->b_next) { + assert(!b->b_cold); + while (b->b_next && 
!b->b_next->b_cold) { + b = b->b_next; + } + if (b->b_next == NULL) { + /* no more cold blocks */ + break; + } + + /* b->b_next is the beginning of a cold streak */ + assert(!b->b_cold && b->b_next->b_cold); + + basicblock *b_end = b->b_next; + while (b_end->b_next && b_end->b_next->b_cold) { + b_end = b_end->b_next; + } + + /* b_end is the end of the cold streak */ + assert(b_end && b_end->b_cold); + assert(b_end->b_next == NULL || !b_end->b_next->b_cold); + + if (cold_blocks == NULL) { + cold_blocks = b->b_next; + } + else { + cold_blocks_tail->b_next = b->b_next; + } + cold_blocks_tail = b_end; + b->b_next = b_end->b_next; + b_end->b_next = NULL; + } + assert(b != NULL && b->b_next == NULL); + b->b_next = cold_blocks; + return 0; +} static void -convert_exception_handlers_to_nops(basicblock *entry) { - for (basicblock *b = entry; b != NULL; b = b->b_next) { +convert_exception_handlers_to_nops(basicblock *entryblock) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { for (int i = 0; i < b->b_iused; i++) { struct instr *instr = &b->b_instr[i]; if (is_block_push(instr) || instr->i_opcode == POP_BLOCK) { @@ -7433,13 +7533,13 @@ assemble_emit_exception_table_entry(struct assembler *a, int start, int end, bas } static int -assemble_exception_table(struct assembler *a) +assemble_exception_table(struct assembler *a, basicblock *entryblock) { basicblock *b; int ioffset = 0; basicblock *handler = NULL; int start = -1; - for (b = a->a_entry; b != NULL; b = b->b_next) { + for (b = entryblock; b != NULL; b = b->b_next) { ioffset = b->b_offset; for (int i = 0; i < b->b_iused; i++) { struct instr *instr = &b->b_instr[i]; @@ -7507,6 +7607,7 @@ write_location_info_short_form(struct assembler* a, int length, int column, int int column_low_bits = column & 7; int column_group = column >> 3; assert(column < 80); + assert(end_column >= column); assert(end_column - column < 16); write_location_first_byte(a, PY_CODE_LOCATION_INFO_SHORT0 + column_group, length); write_location_byte(a, (column_low_bits << 4) | (end_column - column)); @@ -7529,11 +7630,11 @@ write_location_info_long_form(struct assembler* a, struct instr* i, int length) { assert(length > 0 && length <= 8); write_location_first_byte(a, PY_CODE_LOCATION_INFO_LONG, length); - write_location_signed_varint(a, i->i_lineno - a->a_lineno); - assert(i->i_end_lineno >= i->i_lineno); - write_location_varint(a, i->i_end_lineno - i->i_lineno); - write_location_varint(a, i->i_col_offset+1); - write_location_varint(a, i->i_end_col_offset+1); + write_location_signed_varint(a, i->i_loc.lineno - a->a_lineno); + assert(i->i_loc.end_lineno >= i->i_loc.lineno); + write_location_varint(a, i->i_loc.end_lineno - i->i_loc.lineno); + write_location_varint(a, i->i_loc.col_offset + 1); + write_location_varint(a, i->i_loc.end_col_offset + 1); } static void @@ -7561,35 +7662,35 @@ write_location_info_entry(struct assembler* a, struct instr* i, int isize) return 0; } } - if (i->i_lineno < 0) { + if (i->i_loc.lineno < 0) { write_location_info_none(a, isize); return 1; } - int line_delta = i->i_lineno - a->a_lineno; - int column = i->i_col_offset; - int end_column = i->i_end_col_offset; + int line_delta = i->i_loc.lineno - a->a_lineno; + int column = i->i_loc.col_offset; + int end_column = i->i_loc.end_col_offset; assert(column >= -1); assert(end_column >= -1); if (column < 0 || end_column < 0) { - if (i->i_end_lineno == i->i_lineno || i->i_end_lineno == -1) { + if (i->i_loc.end_lineno == i->i_loc.lineno || i->i_loc.end_lineno == -1) { 
write_location_info_no_column(a, isize, line_delta); - a->a_lineno = i->i_lineno; + a->a_lineno = i->i_loc.lineno; return 1; } } - else if (i->i_end_lineno == i->i_lineno) { - if (line_delta == 0 && column < 80 && end_column - column < 16) { + else if (i->i_loc.end_lineno == i->i_loc.lineno) { + if (line_delta == 0 && column < 80 && end_column - column < 16 && end_column >= column) { write_location_info_short_form(a, isize, column, end_column); return 1; } if (line_delta >= 0 && line_delta < 3 && column < 128 && end_column < 128) { write_location_info_oneline_form(a, isize, line_delta, column, end_column); - a->a_lineno = i->i_lineno; + a->a_lineno = i->i_loc.lineno; return 1; } } write_location_info_long_form(a, i, isize); - a->a_lineno = i->i_lineno; + a->a_lineno = i->i_loc.lineno; return 1; } @@ -7631,12 +7732,12 @@ assemble_emit(struct assembler *a, struct instr *i) } static void -normalize_jumps(struct assembler *a) +normalize_jumps(basicblock *entryblock) { - for (basicblock *b = a->a_entry; b != NULL; b = b->b_next) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { b->b_visited = 0; } - for (basicblock *b = a->a_entry; b != NULL; b = b->b_next) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { b->b_visited = 1; if (b->b_iused == 0) { continue; @@ -7690,25 +7791,23 @@ normalize_jumps(struct assembler *a) } static void -assemble_jump_offsets(struct assembler *a, struct compiler *c) +assemble_jump_offsets(basicblock *entryblock) { - basicblock *b; int bsize, totsize, extended_arg_recompile; - int i; /* Compute the size of each block and fixup jump args. Replace block pointer with position in bytecode. */ do { totsize = 0; - for (basicblock *b = a->a_entry; b != NULL; b = b->b_next) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { bsize = blocksize(b); b->b_offset = totsize; totsize += bsize; } extended_arg_recompile = 0; - for (b = c->u->u_blocks; b != NULL; b = b->b_list) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { bsize = b->b_offset; - for (i = 0; i < b->b_iused; i++) { + for (int i = 0; i < b->b_iused; i++) { struct instr *instr = &b->b_instr[i]; int isize = instr_size(instr); /* Relative jumps are computed relative to @@ -7755,6 +7854,109 @@ assemble_jump_offsets(struct assembler *a, struct compiler *c) } while (extended_arg_recompile); } + +// Ensure each basicblock is only put onto the stack once. 
+#define MAYBE_PUSH(B) do { \ + if ((B)->b_visited == 0) { \ + *(*stack_top)++ = (B); \ + (B)->b_visited = 1; \ + } \ + } while (0) + +static void +scan_block_for_local(int target, basicblock *b, bool unsafe_to_start, + basicblock ***stack_top) +{ + bool unsafe = unsafe_to_start; + for (int i = 0; i < b->b_iused; i++) { + struct instr *instr = &b->b_instr[i]; + assert(instr->i_opcode != EXTENDED_ARG); + assert(instr->i_opcode != EXTENDED_ARG_QUICK); + assert(instr->i_opcode != LOAD_FAST__LOAD_FAST); + assert(instr->i_opcode != STORE_FAST__LOAD_FAST); + assert(instr->i_opcode != LOAD_CONST__LOAD_FAST); + assert(instr->i_opcode != STORE_FAST__STORE_FAST); + assert(instr->i_opcode != LOAD_FAST__LOAD_CONST); + if (unsafe && instr->i_except != NULL) { + MAYBE_PUSH(instr->i_except); + } + if (instr->i_oparg != target) { + continue; + } + switch (instr->i_opcode) { + case LOAD_FAST_CHECK: + // if this doesn't raise, then var is defined + unsafe = false; + break; + case LOAD_FAST: + if (unsafe) { + instr->i_opcode = LOAD_FAST_CHECK; + } + unsafe = false; + break; + case STORE_FAST: + unsafe = false; + break; + case DELETE_FAST: + unsafe = true; + break; + } + } + if (unsafe) { + // unsafe at end of this block, + // so unsafe at start of next blocks + if (b->b_next && BB_HAS_FALLTHROUGH(b)) { + MAYBE_PUSH(b->b_next); + } + if (b->b_iused > 0) { + struct instr *last = &b->b_instr[b->b_iused-1]; + if (is_jump(last)) { + assert(last->i_target != NULL); + MAYBE_PUSH(last->i_target); + } + } + } +} +#undef MAYBE_PUSH + +static int +add_checks_for_loads_of_unknown_variables(basicblock *entryblock, + struct compiler *c) +{ + basicblock **stack = make_cfg_traversal_stack(entryblock); + if (stack == NULL) { + return -1; + } + Py_ssize_t nparams = PyList_GET_SIZE(c->u->u_ste->ste_varnames); + int nlocals = (int)PyDict_GET_SIZE(c->u->u_varnames); + for (int target = 0; target < nlocals; target++) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { + b->b_visited = 0; + } + basicblock **stack_top = stack; + + // First pass: find the relevant DFS starting points: + // the places where "being uninitialized" originates, + // which are the entry block and any DELETE_FAST statements. + if (target >= nparams) { + // only non-parameter locals start out uninitialized. + *(stack_top++) = entryblock; + entryblock->b_visited = 1; + } + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { + scan_block_for_local(target, b, false, &stack_top); + } + + // Second pass: Depth-first search to propagate uncertainty + while (stack_top > stack) { + basicblock *b = *--stack_top; + scan_block_for_local(target, b, true, &stack_top); + } + } + PyMem_Free(stack); + return 0; +} + static PyObject * dict_keys_inorder(PyObject *dict, Py_ssize_t offset) { @@ -7835,15 +8037,16 @@ compute_code_flags(struct compiler *c) // Merge *obj* with constant cache. // Unlike merge_consts_recursive(), this function doesn't work recursively. 
static int -merge_const_one(struct compiler *c, PyObject **obj) +merge_const_one(PyObject *const_cache, PyObject **obj) { + PyDict_CheckExact(const_cache); PyObject *key = _PyCode_ConstantKey(*obj); if (key == NULL) { return 0; } // t is borrowed reference - PyObject *t = PyDict_SetDefault(c->c_const_cache, key, key); + PyObject *t = PyDict_SetDefault(const_cache, key, key); Py_DECREF(key); if (t == NULL) { return 0; @@ -7914,7 +8117,7 @@ compute_localsplus_info(struct compiler *c, int nlocalsplus, static PyCodeObject * makecode(struct compiler *c, struct assembler *a, PyObject *constslist, - int maxdepth, int nlocalsplus) + int maxdepth, int nlocalsplus, int code_flags) { PyCodeObject *co = NULL; PyObject *names = NULL; @@ -7926,12 +8129,7 @@ makecode(struct compiler *c, struct assembler *a, PyObject *constslist, if (!names) { goto error; } - if (!merge_const_one(c, &names)) { - goto error; - } - - int flags = compute_code_flags(c); - if (flags < 0) { + if (!merge_const_one(c->c_const_cache, &names)) { goto error; } @@ -7939,7 +8137,7 @@ makecode(struct compiler *c, struct assembler *a, PyObject *constslist, if (consts == NULL) { goto error; } - if (!merge_const_one(c, &consts)) { + if (!merge_const_one(c->c_const_cache, &consts)) { goto error; } @@ -7965,7 +8163,7 @@ makecode(struct compiler *c, struct assembler *a, PyObject *constslist, .filename = c->c_filename, .name = c->u->u_name, .qualname = c->u->u_qualname ? c->u->u_qualname : c->u->u_name, - .flags = flags, + .flags = code_flags, .code = a->a_bytecode, .firstlineno = c->u->u_firstlineno, @@ -7990,7 +8188,7 @@ makecode(struct compiler *c, struct assembler *a, PyObject *constslist, goto error; } - if (!merge_const_one(c, &localsplusnames)) { + if (!merge_const_one(c->c_const_cache, &localsplusnames)) { goto error; } con.localsplusnames = localsplusnames; @@ -8023,16 +8221,23 @@ dump_instr(struct instr *i) if (HAS_ARG(i->i_opcode)) { sprintf(arg, "arg: %d ", i->i_oparg); } + if (is_jump(i)) { + sprintf(arg, "target: %p ", i->i_target); + } + if (is_block_push(i)) { + sprintf(arg, "except_target: %p ", i->i_target); + } fprintf(stderr, "line: %d, opcode: %d %s%s%s\n", - i->i_lineno, i->i_opcode, arg, jabs, jrel); + i->i_loc.lineno, i->i_opcode, arg, jabs, jrel); } static void dump_basicblock(const basicblock *b) { - const char *b_return = b->b_return ? "return " : ""; - fprintf(stderr, "used: %d, depth: %d, offset: %d %s\n", - b->b_iused, b->b_startdepth, b->b_offset, b_return); + const char *b_return = basicblock_returns(b) ? 
"return " : ""; + fprintf(stderr, "[%d %d %d %p] used: %d, depth: %d, offset: %d %s\n", + b->b_cold, b->b_warm, BB_NO_FALLTHROUGH(b), b, b->b_iused, + b->b_startdepth, b->b_offset, b_return); if (b->b_instr) { int i; for (i = 0; i < b->b_iused; i++) { @@ -8048,14 +8253,14 @@ static int normalize_basic_block(basicblock *bb); static int -optimize_cfg(struct compiler *c, struct assembler *a, PyObject *consts); +optimize_cfg(basicblock *entryblock, PyObject *consts, PyObject *const_cache); static int -trim_unused_consts(struct compiler *c, struct assembler *a, PyObject *consts); +trim_unused_consts(basicblock *entryblock, PyObject *consts); /* Duplicates exit BBs, so that line numbers can be propagated to them */ static int -duplicate_exits_without_lineno(struct compiler *c); +duplicate_exits_without_lineno(basicblock *entryblock); static int extend_block(basicblock *bb); @@ -8095,7 +8300,7 @@ build_cellfixedoffsets(struct compiler *c) static inline int insert_instruction(basicblock *block, int pos, struct instr *instr) { - if (compiler_next_instr(block) < 0) { + if (basicblock_next_instr(block) < 0) { return -1; } for (int i = block->b_iused-1; i > pos; i--) { @@ -8107,24 +8312,16 @@ insert_instruction(basicblock *block, int pos, struct instr *instr) { static int insert_prefix_instructions(struct compiler *c, basicblock *entryblock, - int *fixed, int nfreevars) + int *fixed, int nfreevars, int code_flags) { - - int flags = compute_code_flags(c); - if (flags < 0) { - return -1; - } assert(c->u->u_firstlineno > 0); /* Add the generator prefix instructions. */ - if (flags & (CO_GENERATOR | CO_COROUTINE | CO_ASYNC_GENERATOR)) { + if (code_flags & (CO_GENERATOR | CO_COROUTINE | CO_ASYNC_GENERATOR)) { struct instr make_gen = { .i_opcode = RETURN_GENERATOR, .i_oparg = 0, - .i_lineno = c->u->u_firstlineno, - .i_col_offset = -1, - .i_end_lineno = c->u->u_firstlineno, - .i_end_col_offset = -1, + .i_loc = LOCATION(c->u->u_firstlineno, c->u->u_firstlineno, -1, -1), .i_target = NULL, }; if (insert_instruction(entryblock, 0, &make_gen) < 0) { @@ -8133,10 +8330,7 @@ insert_prefix_instructions(struct compiler *c, basicblock *entryblock, struct instr pop_top = { .i_opcode = POP_TOP, .i_oparg = 0, - .i_lineno = -1, - .i_col_offset = -1, - .i_end_lineno = -1, - .i_end_col_offset = -1, + .i_loc = NO_LOCATION, .i_target = NULL, }; if (insert_instruction(entryblock, 1, &pop_top) < 0) { @@ -8168,10 +8362,7 @@ insert_prefix_instructions(struct compiler *c, basicblock *entryblock, .i_opcode = MAKE_CELL, // This will get fixed in offset_derefs(). .i_oparg = oldindex, - .i_lineno = -1, - .i_col_offset = -1, - .i_end_lineno = -1, - .i_end_col_offset = -1, + .i_loc = NO_LOCATION, .i_target = NULL, }; if (insert_instruction(entryblock, ncellsused, &make_cell) < 0) { @@ -8186,10 +8377,7 @@ insert_prefix_instructions(struct compiler *c, basicblock *entryblock, struct instr copy_frees = { .i_opcode = COPY_FREE_VARS, .i_oparg = nfreevars, - .i_lineno = -1, - .i_col_offset = -1, - .i_end_lineno = -1, - .i_end_col_offset = -1, + .i_loc = NO_LOCATION, .i_target = NULL, }; if (insert_instruction(entryblock, 0, ©_frees) < 0) { @@ -8206,25 +8394,25 @@ insert_prefix_instructions(struct compiler *c, basicblock *entryblock, * The resulting line number may not be correct according to PEP 626, * but should be "good enough", and no worse than in older versions. 
*/ static void -guarantee_lineno_for_exits(struct assembler *a, int firstlineno) { +guarantee_lineno_for_exits(basicblock *entryblock, int firstlineno) { int lineno = firstlineno; assert(lineno > 0); - for (basicblock *b = a->a_entry; b != NULL; b = b->b_next) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { if (b->b_iused == 0) { continue; } struct instr *last = &b->b_instr[b->b_iused-1]; - if (last->i_lineno < 0) { + if (last->i_loc.lineno < 0) { if (last->i_opcode == RETURN_VALUE) { for (int i = 0; i < b->b_iused; i++) { - assert(b->b_instr[i].i_lineno < 0); + assert(b->b_instr[i].i_loc.lineno < 0); - b->b_instr[i].i_lineno = lineno; + b->b_instr[i].i_loc.lineno = lineno; } } } else { - lineno = last->i_lineno; + lineno = last->i_loc.lineno; } } } @@ -8276,19 +8464,52 @@ fix_cell_offsets(struct compiler *c, basicblock *entryblock, int *fixedmap) } static void -propagate_line_numbers(struct assembler *a); +propagate_line_numbers(basicblock *entryblock); + +static void +eliminate_empty_basic_blocks(basicblock *entryblock); + + +static void +remove_redundant_jumps(basicblock *entryblock) { + /* If a non-empty block ends with a jump instruction, check if the next + * non-empty block reached through normal flow control is the target + * of that jump. If it is, then the jump instruction is redundant and + * can be deleted. + */ + int removed = 0; + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { + if (b->b_iused > 0) { + struct instr *b_last_instr = &b->b_instr[b->b_iused - 1]; + assert(!IS_ASSEMBLER_OPCODE(b_last_instr->i_opcode)); + if (b_last_instr->i_opcode == JUMP || + b_last_instr->i_opcode == JUMP_NO_INTERRUPT) { + if (b_last_instr->i_target == b->b_next) { + assert(b->b_next->b_iused); + b_last_instr->i_opcode = NOP; + removed++; + } + } + } + } + if (removed) { + eliminate_empty_basic_blocks(entryblock); + } +} static PyCodeObject * assemble(struct compiler *c, int addNone) { - basicblock *b, *entryblock; - struct assembler a; - int j, nblocks; PyCodeObject *co = NULL; PyObject *consts = NULL; + int code_flags = compute_code_flags(c); + if (code_flags < 0) { + return NULL; + } + /* Make sure every block that falls off the end returns None. */ - if (!c->u->u_curblock->b_return) { + if (!basicblock_returns(c->u->u_curblock)) { UNSET_LOC(c); if (addNone) ADDOP_LOAD_CONST(c, Py_None); @@ -8307,14 +8528,6 @@ assemble(struct compiler *c, int addNone) } } - nblocks = 0; - entryblock = NULL; - for (b = c->u->u_blocks; b != NULL; b = b->b_list) { - nblocks++; - entryblock = b; - } - assert(entryblock != NULL); - assert(PyDict_GET_SIZE(c->u->u_varnames) < INT_MAX); assert(PyDict_GET_SIZE(c->u->u_cellvars) < INT_MAX); assert(PyDict_GET_SIZE(c->u->u_freevars) < INT_MAX); @@ -8329,10 +8542,22 @@ assemble(struct compiler *c, int addNone) goto error; } + int nblocks = 0; + basicblock *entryblock = NULL; + for (basicblock *b = c->u->u_blocks; b != NULL; b = b->b_list) { + nblocks++; + entryblock = b; + } + assert(entryblock != NULL); + if ((size_t)nblocks > SIZE_MAX / sizeof(basicblock *)) { + PyErr_NoMemory(); + goto error; + } + /* Set firstlineno if it wasn't explicitly set. 
*/ if (!c->u->u_firstlineno) { - if (entryblock->b_instr && entryblock->b_instr->i_lineno) { - c->u->u_firstlineno = entryblock->b_instr->i_lineno; + if (entryblock->b_instr && entryblock->b_instr->i_loc.lineno) { + c->u->u_firstlineno = entryblock->b_instr->i_loc.lineno; } else { c->u->u_firstlineno = 1; @@ -8340,15 +8565,10 @@ assemble(struct compiler *c, int addNone) } // This must be called before fix_cell_offsets(). - if (insert_prefix_instructions(c, entryblock, cellfixedoffsets, nfreevars)) { + if (insert_prefix_instructions(c, entryblock, cellfixedoffsets, nfreevars, code_flags)) { goto error; } - if (!assemble_init(&a, nblocks, c->u->u_firstlineno)) - goto error; - a.a_entry = entryblock; - a.a_nblocks = nblocks; - int numdropped = fix_cell_offsets(c, entryblock, cellfixedoffsets); PyMem_Free(cellfixedoffsets); // At this point we're done with it. cellfixedoffsets = NULL; @@ -8362,18 +8582,19 @@ assemble(struct compiler *c, int addNone) goto error; } - if (optimize_cfg(c, &a, consts)) { + if (optimize_cfg(entryblock, consts, c->c_const_cache)) { goto error; } - if (duplicate_exits_without_lineno(c)) { + if (duplicate_exits_without_lineno(entryblock)) { return NULL; } - if (trim_unused_consts(c, &a, consts)) { + if (trim_unused_consts(entryblock, consts)) { goto error; } - propagate_line_numbers(&a); - guarantee_lineno_for_exits(&a, c->u->u_firstlineno); - int maxdepth = stackdepth(c); + propagate_line_numbers(entryblock); + guarantee_lineno_for_exits(entryblock, c->u->u_firstlineno); + + int maxdepth = stackdepth(entryblock, code_flags); if (maxdepth < 0) { goto error; } @@ -8388,56 +8609,72 @@ assemble(struct compiler *c, int addNone) goto error; } convert_exception_handlers_to_nops(entryblock); - for (basicblock *b = a.a_entry; b != NULL; b = b->b_next) { + + if (push_cold_blocks_to_end(entryblock, code_flags) < 0) { + goto error; + } + + remove_redundant_jumps(entryblock); + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { clean_basic_block(b); } /* Order of basic blocks must have been determined by now */ - normalize_jumps(&a); + normalize_jumps(entryblock); + + if (add_checks_for_loads_of_unknown_variables(entryblock, c) < 0) { + goto error; + } /* Can't modify the bytecode after computing jump offsets. */ - assemble_jump_offsets(&a, c); + assemble_jump_offsets(entryblock); + + + /* Create assembler */ + struct assembler a; + if (!assemble_init(&a, c->u->u_firstlineno)) + goto error; /* Emit code. 
*/ - for(b = entryblock; b != NULL; b = b->b_next) { - for (j = 0; j < b->b_iused; j++) + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { + for (int j = 0; j < b->b_iused; j++) if (!assemble_emit(&a, &b->b_instr[j])) goto error; } /* Emit location info */ a.a_lineno = c->u->u_firstlineno; - for(b = entryblock; b != NULL; b = b->b_next) { - for (j = 0; j < b->b_iused; j++) + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { + for (int j = 0; j < b->b_iused; j++) if (!assemble_emit_location(&a, &b->b_instr[j])) goto error; } - if (!assemble_exception_table(&a)) { + if (!assemble_exception_table(&a, entryblock)) { goto error; } if (_PyBytes_Resize(&a.a_except_table, a.a_except_table_off) < 0) { goto error; } - if (!merge_const_one(c, &a.a_except_table)) { + if (!merge_const_one(c->c_const_cache, &a.a_except_table)) { goto error; } if (_PyBytes_Resize(&a.a_linetable, a.a_location_off) < 0) { goto error; } - if (!merge_const_one(c, &a.a_linetable)) { + if (!merge_const_one(c->c_const_cache, &a.a_linetable)) { goto error; } if (_PyBytes_Resize(&a.a_bytecode, a.a_offset * sizeof(_Py_CODEUNIT)) < 0) { goto error; } - if (!merge_const_one(c, &a.a_bytecode)) { + if (!merge_const_one(c->c_const_cache, &a.a_bytecode)) { goto error; } - co = makecode(c, &a, consts, maxdepth, nlocalsplus); + co = makecode(c, &a, consts, maxdepth, nlocalsplus, code_flags); error: Py_XDECREF(consts); assemble_free(&a); @@ -8472,11 +8709,12 @@ get_const_value(int opcode, int oparg, PyObject *co_consts) Called with codestr pointing to the first LOAD_CONST. */ static int -fold_tuple_on_constants(struct compiler *c, +fold_tuple_on_constants(PyObject *const_cache, struct instr *inst, int n, PyObject *consts) { /* Pre-conditions */ + assert(PyDict_CheckExact(const_cache)); assert(PyList_CheckExact(consts)); assert(inst[n].i_opcode == BUILD_TUPLE); assert(inst[n].i_oparg == n); @@ -8501,7 +8739,7 @@ fold_tuple_on_constants(struct compiler *c, } PyTuple_SET_ITEM(newconst, i, constant); } - if (merge_const_one(c, &newconst) == 0) { + if (merge_const_one(const_cache, &newconst) == 0) { Py_DECREF(newconst); return -1; } @@ -8644,7 +8882,7 @@ next_swappable_instruction(basicblock *block, int i, int lineno) { while (++i < block->b_iused) { struct instr *instruction = &block->b_instr[i]; - if (0 <= lineno && instruction->i_lineno != lineno) { + if (0 <= lineno && instruction->i_loc.lineno != lineno) { // Optimizing across this instruction could cause user-visible // changes in the names bound between line tracing events! 
return -1; @@ -8683,7 +8921,7 @@ apply_static_swaps(basicblock *block, int i) return; } int k = j; - int lineno = block->b_instr[j].i_lineno; + int lineno = block->b_instr[j].i_loc.lineno; for (int count = swap->i_oparg - 1; 0 < count; count--) { k = next_swappable_instruction(block, k, lineno); if (k < 0) { @@ -8709,7 +8947,7 @@ jump_thread(struct instr *inst, struct instr *target, int opcode) assert(is_jump(target)); // bpo-45773: If inst->i_target == target->i_target, then nothing actually // changes (and we fall into an infinite loop): - if (inst->i_lineno == target->i_lineno && + if ((inst->i_loc.lineno == target->i_loc.lineno || target->i_loc.lineno == -1) && inst->i_target != target->i_target) { inst->i_target = target->i_target; @@ -8724,8 +8962,9 @@ jump_thread(struct instr *inst, struct instr *target, int opcode) /* Optimization */ static int -optimize_basic_block(struct compiler *c, basicblock *bb, PyObject *consts) +optimize_basic_block(PyObject *const_cache, basicblock *bb, PyObject *consts) { + assert(PyDict_CheckExact(const_cache)); assert(PyList_CheckExact(consts)); struct instr nop; nop.i_opcode = NOP; @@ -8769,7 +9008,6 @@ optimize_basic_block(struct compiler *c, basicblock *bb, PyObject *consts) jump_if_true = nextop == POP_JUMP_IF_TRUE; if (is_true == jump_if_true) { bb->b_instr[i+1].i_opcode = JUMP; - bb->b_nofallthrough = 1; } else { bb->b_instr[i+1].i_opcode = NOP; @@ -8789,7 +9027,6 @@ optimize_basic_block(struct compiler *c, basicblock *bb, PyObject *consts) jump_if_true = nextop == JUMP_IF_TRUE_OR_POP; if (is_true == jump_if_true) { bb->b_instr[i+1].i_opcode = JUMP; - bb->b_nofallthrough = 1; } else { inst->i_opcode = NOP; @@ -8834,7 +9071,7 @@ optimize_basic_block(struct compiler *c, basicblock *bb, PyObject *consts) } } if (i >= oparg) { - if (fold_tuple_on_constants(c, inst-oparg, oparg, consts)) { + if (fold_tuple_on_constants(const_cache, inst-oparg, oparg, consts)) { goto error; } } @@ -8865,7 +9102,7 @@ optimize_basic_block(struct compiler *c, basicblock *bb, PyObject *consts) break; case JUMP_IF_TRUE_OR_POP: case POP_JUMP_IF_TRUE: - if (inst->i_lineno == target->i_lineno) { + if (inst->i_loc.lineno == target->i_loc.lineno) { // We don't need to bother checking for loops here, // since a block's b_next cannot point to itself: assert(inst->i_target != inst->i_target->b_next); @@ -8887,7 +9124,7 @@ optimize_basic_block(struct compiler *c, basicblock *bb, PyObject *consts) break; case JUMP_IF_FALSE_OR_POP: case POP_JUMP_IF_FALSE: - if (inst->i_lineno == target->i_lineno) { + if (inst->i_loc.lineno == target->i_loc.lineno) { // We don't need to bother checking for loops here, // since a block's b_next cannot point to itself: assert(inst->i_target != inst->i_target->b_next); @@ -8978,17 +9215,16 @@ extend_block(basicblock *bb) { last->i_opcode != JUMP_BACKWARD) { return 0; } - if (last->i_target->b_exit && last->i_target->b_iused <= MAX_COPY_SIZE) { + if (basicblock_exits_scope(last->i_target) && last->i_target->b_iused <= MAX_COPY_SIZE) { basicblock *to_copy = last->i_target; last->i_opcode = NOP; for (int i = 0; i < to_copy->b_iused; i++) { - int index = compiler_next_instr(bb); + int index = basicblock_next_instr(bb); if (index < 0) { return -1; } bb->b_instr[index] = to_copy->b_instr[i]; } - bb->b_exit = 1; } return 0; } @@ -8999,7 +9235,7 @@ clean_basic_block(basicblock *bb) { int dest = 0; int prev_lineno = -1; for (int src = 0; src < bb->b_iused; src++) { - int lineno = bb->b_instr[src].i_lineno; + int lineno = bb->b_instr[src].i_loc.lineno; if 
(bb->b_instr[src].i_opcode == NOP) { /* Eliminate no-op if it doesn't have a line number */ if (lineno < 0) { @@ -9011,9 +9247,9 @@ clean_basic_block(basicblock *bb) { } /* or, if the next instruction has same line number or no line number */ if (src < bb->b_iused - 1) { - int next_lineno = bb->b_instr[src+1].i_lineno; + int next_lineno = bb->b_instr[src+1].i_loc.lineno; if (next_lineno < 0 || next_lineno == lineno) { - COPY_INSTR_LOC(bb->b_instr[src], bb->b_instr[src+1]); + bb->b_instr[src+1].i_loc = bb->b_instr[src].i_loc; continue; } } @@ -9024,7 +9260,7 @@ clean_basic_block(basicblock *bb) { } /* or if last instruction in BB and next BB has same line number */ if (next) { - if (lineno == next->b_instr[0].i_lineno) { + if (lineno == next->b_instr[0].i_loc.lineno) { continue; } } @@ -9046,52 +9282,41 @@ normalize_basic_block(basicblock *bb) { /* Mark blocks as exit and/or nofallthrough. Raise SystemError if CFG is malformed. */ for (int i = 0; i < bb->b_iused; i++) { - assert(!IS_ASSEMBLER_OPCODE(bb->b_instr[i].i_opcode)); - switch(bb->b_instr[i].i_opcode) { - case RETURN_VALUE: - case RAISE_VARARGS: - case RERAISE: - bb->b_exit = 1; - bb->b_nofallthrough = 1; - break; - case JUMP: - case JUMP_NO_INTERRUPT: - bb->b_nofallthrough = 1; - /* fall through */ - case POP_JUMP_IF_NOT_NONE: - case POP_JUMP_IF_NONE: - case POP_JUMP_IF_FALSE: - case POP_JUMP_IF_TRUE: - case JUMP_IF_FALSE_OR_POP: - case JUMP_IF_TRUE_OR_POP: - case FOR_ITER: - if (i != bb->b_iused-1) { - PyErr_SetString(PyExc_SystemError, "malformed control flow graph."); - return -1; - } - /* Skip over empty basic blocks. */ - while (bb->b_instr[i].i_target->b_iused == 0) { - bb->b_instr[i].i_target = bb->b_instr[i].i_target->b_next; - } - + int opcode = bb->b_instr[i].i_opcode; + assert(!IS_ASSEMBLER_OPCODE(opcode)); + int is_jump = IS_JUMP_OPCODE(opcode); + int is_exit = IS_SCOPE_EXIT_OPCODE(opcode); + if (is_exit || is_jump) { + if (i != bb->b_iused-1) { + PyErr_SetString(PyExc_SystemError, "malformed control flow graph."); + return -1; + } + } + if (is_jump) { + /* Skip over empty basic blocks. 
*/ + while (bb->b_instr[i].i_target->b_iused == 0) { + bb->b_instr[i].i_target = bb->b_instr[i].i_target->b_next; + } } } return 0; } static int -mark_reachable(struct assembler *a) { - basicblock **stack, **sp; - sp = stack = (basicblock **)PyObject_Malloc(sizeof(basicblock *) * a->a_nblocks); +mark_reachable(basicblock *entryblock) { + basicblock **stack = make_cfg_traversal_stack(entryblock); if (stack == NULL) { return -1; } - a->a_entry->b_predecessors = 1; - *sp++ = a->a_entry; + basicblock **sp = stack; + entryblock->b_predecessors = 1; + *sp++ = entryblock; while (sp > stack) { basicblock *b = *(--sp); - if (b->b_next && !b->b_nofallthrough) { - if (b->b_next->b_predecessors == 0) { + b->b_visited = 1; + if (b->b_next && BB_HAS_FALLTHROUGH(b)) { + if (!b->b_next->b_visited) { + assert(b->b_next->b_predecessors == 0); *sp++ = b->b_next; } b->b_next->b_predecessors++; @@ -9101,21 +9326,27 @@ mark_reachable(struct assembler *a) { struct instr *instr = &b->b_instr[i]; if (is_jump(instr) || is_block_push(instr)) { target = instr->i_target; - if (target->b_predecessors == 0) { + if (!target->b_visited) { + assert(target->b_predecessors == 0 || target == b->b_next); *sp++ = target; } target->b_predecessors++; + if (is_block_push(instr)) { + target->b_except_predecessors++; + } + assert(target->b_except_predecessors == 0 || + target->b_except_predecessors == target->b_predecessors); } } } - PyObject_Free(stack); + PyMem_Free(stack); return 0; } static void -eliminate_empty_basic_blocks(basicblock *entry) { +eliminate_empty_basic_blocks(basicblock *entryblock) { /* Eliminate empty blocks */ - for (basicblock *b = entry; b != NULL; b = b->b_next) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { basicblock *next = b->b_next; if (next) { while (next->b_iused == 0 && next->b_next) { @@ -9124,16 +9355,19 @@ eliminate_empty_basic_blocks(basicblock *entry) { b->b_next = next; } } - for (basicblock *b = entry; b != NULL; b = b->b_next) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { if (b->b_iused == 0) { continue; } - if (is_jump(&b->b_instr[b->b_iused-1])) { - basicblock *target = b->b_instr[b->b_iused-1].i_target; - while (target->b_iused == 0) { - target = target->b_next; + for (int i = 0; i < b->b_iused; i++) { + struct instr *instr = &b->b_instr[i]; + if (is_jump(instr) || is_block_push(instr)) { + basicblock *target = instr->i_target; + while (target->b_iused == 0) { + target = target->b_next; + } + instr->i_target = target; } - b->b_instr[b->b_iused-1].i_target = target; } } } @@ -9147,39 +9381,32 @@ eliminate_empty_basic_blocks(basicblock *entry) { * but has no impact on the generated line number events. */ static void -propagate_line_numbers(struct assembler *a) { - for (basicblock *b = a->a_entry; b != NULL; b = b->b_next) { +propagate_line_numbers(basicblock *entryblock) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { if (b->b_iused == 0) { continue; } - // Not a real instruction, only to store positions - // from previous instructions and propagate them. 
- struct instr prev_instr = { - .i_lineno = -1, - .i_col_offset = -1, - .i_end_lineno = -1, - .i_end_col_offset = -1, - }; + struct location prev_location = NO_LOCATION; for (int i = 0; i < b->b_iused; i++) { - if (b->b_instr[i].i_lineno < 0) { - COPY_INSTR_LOC(prev_instr, b->b_instr[i]); + if (b->b_instr[i].i_loc.lineno < 0) { + b->b_instr[i].i_loc = prev_location; } else { - COPY_INSTR_LOC(b->b_instr[i], prev_instr); + prev_location = b->b_instr[i].i_loc; } } - if (!b->b_nofallthrough && b->b_next->b_predecessors == 1) { + if (BB_HAS_FALLTHROUGH(b) && b->b_next->b_predecessors == 1) { assert(b->b_next->b_iused); - if (b->b_next->b_instr[0].i_lineno < 0) { - COPY_INSTR_LOC(prev_instr, b->b_next->b_instr[0]); + if (b->b_next->b_instr[0].i_loc.lineno < 0) { + b->b_next->b_instr[0].i_loc = prev_location; } } if (is_jump(&b->b_instr[b->b_iused-1])) { basicblock *target = b->b_instr[b->b_iused-1].i_target; if (target->b_predecessors == 1) { - if (target->b_instr[0].i_lineno < 0) { - COPY_INSTR_LOC(prev_instr, target->b_instr[0]); + if (target->b_instr[0].i_loc.lineno < 0) { + target->b_instr[0].i_loc = prev_location; } } } @@ -9196,70 +9423,46 @@ propagate_line_numbers(struct assembler *a) { */ static int -optimize_cfg(struct compiler *c, struct assembler *a, PyObject *consts) +optimize_cfg(basicblock *entryblock, PyObject *consts, PyObject *const_cache) { - for (basicblock *b = a->a_entry; b != NULL; b = b->b_next) { - if (optimize_basic_block(c, b, consts)) { + assert(PyDict_CheckExact(const_cache)); + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { + if (optimize_basic_block(const_cache, b, consts)) { return -1; } clean_basic_block(b); assert(b->b_predecessors == 0); } - for (basicblock *b = c->u->u_blocks; b != NULL; b = b->b_list) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { if (extend_block(b)) { return -1; } } - if (mark_reachable(a)) { + if (mark_reachable(entryblock)) { return -1; } /* Delete unreachable instructions */ - for (basicblock *b = a->a_entry; b != NULL; b = b->b_next) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { if (b->b_predecessors == 0) { b->b_iused = 0; - b->b_nofallthrough = 0; } } - eliminate_empty_basic_blocks(a->a_entry); - for (basicblock *b = a->a_entry; b != NULL; b = b->b_next) { + eliminate_empty_basic_blocks(entryblock); + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { clean_basic_block(b); } - /* Delete jump instructions made redundant by previous step. If a non-empty - block ends with a jump instruction, check if the next non-empty block - reached through normal flow control is the target of that jump. If it - is, then the jump instruction is redundant and can be deleted. - */ - int maybe_empty_blocks = 0; - for (basicblock *b = a->a_entry; b != NULL; b = b->b_next) { - if (b->b_iused > 0) { - struct instr *b_last_instr = &b->b_instr[b->b_iused - 1]; - assert(!IS_ASSEMBLER_OPCODE(b_last_instr->i_opcode)); - if (b_last_instr->i_opcode == JUMP || - b_last_instr->i_opcode == JUMP_NO_INTERRUPT) { - if (b_last_instr->i_target == b->b_next) { - assert(b->b_next->b_iused); - b->b_nofallthrough = 0; - b_last_instr->i_opcode = NOP; - maybe_empty_blocks = 1; - } - } - } - } - if (maybe_empty_blocks) { - eliminate_empty_basic_blocks(a->a_entry); - } return 0; } // Remove trailing unused constants. 
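/*
 * Illustrative sketch (not from the CPython sources): the hunk above
 * replaces per-field copying of i_lineno/i_col_offset/... with assignment
 * of a single location struct, and propagates that location forward onto
 * instructions that have none of their own. A minimal standalone version
 * of the same idea, using simplified stand-in types (toy_location and
 * toy_instr are assumptions, not CPython's real structs):
 */
#include <stdio.h>

typedef struct { int lineno, end_lineno, col, end_col; } toy_location;
typedef struct { int opcode; toy_location loc; } toy_instr;

#define TOY_NO_LOCATION ((toy_location){-1, -1, -1, -1})

/* Give every instruction with no location (lineno < 0) the location of
   the nearest preceding instruction that has one. */
static void
propagate_locations(toy_instr *instrs, int n)
{
    toy_location prev = TOY_NO_LOCATION;
    for (int i = 0; i < n; i++) {
        if (instrs[i].loc.lineno < 0) {
            instrs[i].loc = prev;       /* inherit the previous location */
        }
        else {
            prev = instrs[i].loc;       /* remember it for later instructions */
        }
    }
}

int main(void)
{
    toy_instr block[3] = {
        {1, {10, 10, 0, 4}},            /* has a location          */
        {2, TOY_NO_LOCATION},           /* synthetic, no location  */
        {3, {12, 12, 0, 8}},
    };
    propagate_locations(block, 3);
    printf("instr 1 now reports line %d\n", block[1].loc.lineno);   /* prints 10 */
    return 0;
}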
static int -trim_unused_consts(struct compiler *c, struct assembler *a, PyObject *consts) +trim_unused_consts(basicblock *entryblock, PyObject *consts) { assert(PyList_CheckExact(consts)); // The first constant may be docstring; keep it always. int max_const_index = 0; - for (basicblock *b = a->a_entry; b != NULL; b = b->b_next) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { for (int i = 0; i < b->b_iused; i++) { if ((b->b_instr[i].i_opcode == LOAD_CONST || b->b_instr[i].i_opcode == KW_NAMES) && @@ -9281,11 +9484,11 @@ trim_unused_consts(struct compiler *c, struct assembler *a, PyObject *consts) static inline int is_exit_without_lineno(basicblock *b) { - if (!b->b_exit) { + if (!basicblock_exits_scope(b)) { return 0; } for (int i = 0; i < b->b_iused; i++) { - if (b->b_instr[i].i_lineno >= 0) { + if (b->b_instr[i].i_loc.lineno >= 0) { return 0; } } @@ -9302,19 +9505,19 @@ is_exit_without_lineno(basicblock *b) { * copy the line number from the sole predecessor block. */ static int -duplicate_exits_without_lineno(struct compiler *c) +duplicate_exits_without_lineno(basicblock *entryblock) { /* Copy all exit blocks without line number that are targets of a jump. */ - for (basicblock *b = c->u->u_blocks; b != NULL; b = b->b_list) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { if (b->b_iused > 0 && is_jump(&b->b_instr[b->b_iused-1])) { basicblock *target = b->b_instr[b->b_iused-1].i_target; if (is_exit_without_lineno(target) && target->b_predecessors > 1) { - basicblock *new_target = compiler_copy_block(c, target); + basicblock *new_target = copy_basicblock(target); if (new_target == NULL) { return -1; } - COPY_INSTR_LOC(b->b_instr[b->b_iused-1], new_target->b_instr[0]); + new_target->b_instr[0].i_loc = b->b_instr[b->b_iused-1].i_loc; b->b_instr[b->b_iused-1].i_target = new_target; target->b_predecessors--; new_target->b_predecessors = 1; @@ -9324,18 +9527,18 @@ duplicate_exits_without_lineno(struct compiler *c) } } /* Eliminate empty blocks */ - for (basicblock *b = c->u->u_blocks; b != NULL; b = b->b_list) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { while (b->b_next && b->b_next->b_iused == 0) { b->b_next = b->b_next->b_next; } } /* Any remaining reachable exit blocks without line number can only be reached by * fall through, and thus can only have a single predecessor */ - for (basicblock *b = c->u->u_blocks; b != NULL; b = b->b_list) { - if (!b->b_nofallthrough && b->b_next && b->b_iused > 0) { + for (basicblock *b = entryblock; b != NULL; b = b->b_next) { + if (BB_HAS_FALLTHROUGH(b) && b->b_next && b->b_iused > 0) { if (is_exit_without_lineno(b->b_next)) { assert(b->b_next->b_iused > 0); - COPY_INSTR_LOC(b->b_instr[b->b_iused-1], b->b_next->b_instr[0]); + b->b_next->b_instr[0].i_loc = b->b_instr[b->b_iused-1].i_loc; } } } diff --git a/Python/condvar.h b/Python/condvar.h index e5df7ff132802f..4ddc5311cf8fad 100644 --- a/Python/condvar.h +++ b/Python/condvar.h @@ -68,9 +68,9 @@ void _PyThread_cond_after(long long us, struct timespec *abs); Py_LOCAL_INLINE(int) PyCOND_TIMEDWAIT(PyCOND_T *cond, PyMUTEX_T *mut, long long us) { - struct timespec abs; - _PyThread_cond_after(us, &abs); - int ret = pthread_cond_timedwait(cond, mut, &abs); + struct timespec abs_timeout; + _PyThread_cond_after(us, &abs_timeout); + int ret = pthread_cond_timedwait(cond, mut, &abs_timeout); if (ret == ETIMEDOUT) { return 1; } diff --git a/Python/errors.c b/Python/errors.c index 3eb8a5ef04d284..b6b5d9b046ce85 100644 --- a/Python/errors.c +++ b/Python/errors.c @@ 
-688,11 +688,7 @@ _PyErr_FormatFromCauseTstate(PyThreadState *tstate, PyObject *exception, const char *format, ...) { va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif _PyErr_FormatVFromCause(tstate, exception, format, vargs); va_end(vargs); return NULL; @@ -703,11 +699,7 @@ _PyErr_FormatFromCause(PyObject *exception, const char *format, ...) { PyThreadState *tstate = _PyThreadState_GET(); va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif _PyErr_FormatVFromCause(tstate, exception, format, vargs); va_end(vargs); return NULL; @@ -1096,11 +1088,7 @@ _PyErr_Format(PyThreadState *tstate, PyObject *exception, const char *format, ...) { va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif _PyErr_FormatV(tstate, exception, format, vargs); va_end(vargs); return NULL; @@ -1112,11 +1100,7 @@ PyErr_Format(PyObject *exception, const char *format, ...) { PyThreadState *tstate = _PyThreadState_GET(); va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif _PyErr_FormatV(tstate, exception, format, vargs); va_end(vargs); return NULL; diff --git a/Python/fileutils.c b/Python/fileutils.c index a38886a17ccbbe..7e5d01f6e63d3b 100644 --- a/Python/fileutils.c +++ b/Python/fileutils.c @@ -603,9 +603,9 @@ _Py_DecodeLocaleEx(const char* arg, wchar_t **wstr, size_t *wlen, return _Py_DecodeUTF8Ex(arg, strlen(arg), wstr, wlen, reason, errors); #else - int use_utf8 = (Py_UTF8Mode == 1); + int use_utf8 = (_PyRuntime.preconfig.utf8_mode >= 1); #ifdef MS_WINDOWS - use_utf8 |= !Py_LegacyWindowsFSEncodingFlag; + use_utf8 |= (_PyRuntime.preconfig.legacy_windows_fs_encoding == 0); #endif if (use_utf8) { return _Py_DecodeUTF8Ex(arg, strlen(arg), wstr, wlen, reason, @@ -795,9 +795,9 @@ encode_locale_ex(const wchar_t *text, char **str, size_t *error_pos, return _Py_EncodeUTF8Ex(text, str, error_pos, reason, raw_malloc, errors); #else - int use_utf8 = (Py_UTF8Mode == 1); + int use_utf8 = (_PyRuntime.preconfig.utf8_mode >= 1); #ifdef MS_WINDOWS - use_utf8 |= !Py_LegacyWindowsFSEncodingFlag; + use_utf8 |= (_PyRuntime.preconfig.legacy_windows_fs_encoding == 0); #endif if (use_utf8) { return _Py_EncodeUTF8Ex(text, str, error_pos, reason, diff --git a/Python/frame.c b/Python/frame.c index c2da123a2bbc15..206ecab44c1b87 100644 --- a/Python/frame.c +++ b/Python/frame.c @@ -1,7 +1,9 @@ +#define _PY_INTERPRETER + #include "Python.h" #include "frameobject.h" -#include "pycore_code.h" // stats +#include "pycore_code.h" // stats #include "pycore_frame.h" #include "pycore_object.h" // _PyObject_GC_UNTRACK() #include "opcode.h" @@ -112,22 +114,6 @@ _PyFrame_Clear(_PyInterpreterFrame *frame) Py_DECREF(frame->f_code); } -/* Consumes reference to func */ -_PyInterpreterFrame * -_PyFrame_Push(PyThreadState *tstate, PyFunctionObject *func) -{ - PyCodeObject *code = (PyCodeObject *)func->func_code; - size_t size = code->co_nlocalsplus + code->co_stacksize + FRAME_SPECIALS_SIZE; - CALL_STAT_INC(frames_pushed); - _PyInterpreterFrame *new_frame = _PyThreadState_BumpFramePointer(tstate, size); - if (new_frame == NULL) { - Py_DECREF(func); - return NULL; - } - _PyFrame_InitializeSpecials(new_frame, func, NULL, code->co_nlocalsplus); - return new_frame; -} - int _PyInterpreterFrame_GetLine(_PyInterpreterFrame *frame) { diff --git a/Python/frozenmain.c b/Python/frozenmain.c index 8743e082b4ff8f..f8be165f7671df 100644 --- a/Python/frozenmain.c +++ 
b/Python/frozenmain.c @@ -53,7 +53,7 @@ Py_FrozenMain(int argc, char **argv) PyWinFreeze_ExeInit(); #endif - if (Py_VerboseFlag) { + if (_Py_GetConfig()->verbose) { fprintf(stderr, "Python %s\n%s\n", Py_GetVersion(), Py_GetCopyright()); } diff --git a/Python/getargs.c b/Python/getargs.c index ed3ffdafe37cdf..fb4a5124beab8a 100644 --- a/Python/getargs.c +++ b/Python/getargs.c @@ -2792,11 +2792,7 @@ PyArg_UnpackTuple(PyObject *args, const char *name, Py_ssize_t min, Py_ssize_t m stack = _PyTuple_ITEMS(args); nargs = PyTuple_GET_SIZE(args); -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, max); -#else - va_start(vargs); -#endif retval = unpack_stack(stack, nargs, name, min, max, vargs); va_end(vargs); return retval; @@ -2809,11 +2805,7 @@ _PyArg_UnpackStack(PyObject *const *args, Py_ssize_t nargs, const char *name, int retval; va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, max); -#else - va_start(vargs); -#endif retval = unpack_stack(args, nargs, name, min, max, vargs); va_end(vargs); return retval; diff --git a/Python/getopt.c b/Python/getopt.c index fcea60759d12cd..4135bf1446ecfc 100644 --- a/Python/getopt.c +++ b/Python/getopt.c @@ -44,8 +44,12 @@ static const wchar_t *opt_ptr = L""; #define SHORT_OPTS L"bBc:dEhiIJm:OPqRsStuvVW:xX:?" static const _PyOS_LongOption longopts[] = { + /* name, has_arg, val (used in switch in initconfig.c) */ {L"check-hash-based-pycs", 1, 0}, - {NULL, 0, 0}, + {L"help-all", 0, 1}, + {L"help-env", 0, 2}, + {L"help-xoptions", 0, 3}, + {NULL, 0, -1}, /* sentinel */ }; diff --git a/Python/hamt.c b/Python/hamt.c index c3cb4e6fda4500..3aa31c625824cb 100644 --- a/Python/hamt.c +++ b/Python/hamt.c @@ -409,14 +409,22 @@ hamt_hash(PyObject *o) return -1; } - /* While it's suboptimal to reduce Python's 64 bit hash to + /* While it's somewhat suboptimal to reduce Python's 64 bit hash to 32 bits via XOR, it seems that the resulting hash function is good enough (this is also how Long type is hashed in Java.) Storing 10, 100, 1000 Python strings results in a relatively shallow and uniform tree structure. - Please don't change this hashing algorithm, as there are many - tests that test some exact tree shape to cover all code paths. + Also it's worth noting that it would be possible to adapt the tree + structure to 64 bit hashes, but that would increase memory pressure + and provide little to no performance benefits for collections with + fewer than billions of key/value pairs. + + Important: do not change this hash reducing function. There are many + tests that need an exact tree shape to cover all code paths and + we do that by specifying concrete values for test data's `__hash__`. + If this function is changed most of the regression tests would + become useless. */ int32_t xored = (int32_t)(hash & 0xffffffffl) ^ (int32_t)(hash >> 32); return xored == -1 ? -2 : xored; @@ -488,11 +496,7 @@ _hamt_dump_format(_PyUnicodeWriter *writer, const char *format, ...) 
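/*
 * Illustrative sketch (not from the CPython sources): the expanded hamt.c
 * comment above explains that Python's 64-bit hash is folded to 32 bits by
 * XOR-ing its two halves (the same reduction Java applies in Long.hashCode)
 * and that -1 must be remapped to -2, since -1 is reserved as CPython's
 * "hash failed" marker. A standalone version of exactly that reduction:
 */
#include <stdint.h>
#include <stdio.h>

/* Fold a 64-bit hash into 32 bits; never return -1. */
static int32_t
fold_hash_64_to_32(int64_t hash)
{
    int32_t xored = (int32_t)(hash & 0xffffffffLL) ^ (int32_t)(hash >> 32);
    return xored == -1 ? -2 : xored;
}

int main(void)
{
    printf("%d\n", fold_hash_64_to_32(0x123456789abcdef0LL));
    /* Halves XOR to -1, so the result is remapped to -2: */
    printf("%d\n", fold_hash_64_to_32(0x00000000ffffffffLL));
    return 0;
}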
int ret; va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif msg = PyUnicode_FromFormatV(format, vargs); va_end(vargs); diff --git a/Python/import.c b/Python/import.c index 4b6d6d16821a94..54c21fa4a56aa9 100644 --- a/Python/import.c +++ b/Python/import.c @@ -48,7 +48,7 @@ module _imp PyStatus _PyImportZip_Init(PyThreadState *tstate) { - PyObject *path_hooks, *zipimport; + PyObject *path_hooks; int err = 0; path_hooks = PySys_GetObject("path_hooks"); @@ -63,32 +63,22 @@ _PyImportZip_Init(PyThreadState *tstate) PySys_WriteStderr("# installing zipimport hook\n"); } - zipimport = PyImport_ImportModule("zipimport"); - if (zipimport == NULL) { - _PyErr_Clear(tstate); /* No zip import module -- okay */ + PyObject *zipimporter = _PyImport_GetModuleAttrString("zipimport", "zipimporter"); + if (zipimporter == NULL) { + _PyErr_Clear(tstate); /* No zipimporter object -- okay */ if (verbose) { - PySys_WriteStderr("# can't import zipimport\n"); + PySys_WriteStderr("# can't import zipimport.zipimporter\n"); } } else { - PyObject *zipimporter = PyObject_GetAttr(zipimport, &_Py_ID(zipimporter)); - Py_DECREF(zipimport); - if (zipimporter == NULL) { - _PyErr_Clear(tstate); /* No zipimporter object -- okay */ - if (verbose) { - PySys_WriteStderr("# can't import zipimport.zipimporter\n"); - } + /* sys.path_hooks.insert(0, zipimporter) */ + err = PyList_Insert(path_hooks, 0, zipimporter); + Py_DECREF(zipimporter); + if (err < 0) { + goto error; } - else { - /* sys.path_hooks.insert(0, zipimporter) */ - err = PyList_Insert(path_hooks, 0, zipimporter); - Py_DECREF(zipimporter); - if (err < 0) { - goto error; - } - if (verbose) { - PySys_WriteStderr("# installed zipimport hook\n"); - } + if (verbose) { + PySys_WriteStderr("# installed zipimport hook\n"); } } @@ -2632,6 +2622,37 @@ PyImport_AppendInittab(const char *name, PyObject* (*initfunc)(void)) return PyImport_ExtendInittab(newtab); } + +PyObject * +_PyImport_GetModuleAttr(PyObject *modname, PyObject *attrname) +{ + PyObject *mod = PyImport_Import(modname); + if (mod == NULL) { + return NULL; + } + PyObject *result = PyObject_GetAttr(mod, attrname); + Py_DECREF(mod); + return result; +} + +PyObject * +_PyImport_GetModuleAttrString(const char *modname, const char *attrname) +{ + PyObject *pmodname = PyUnicode_FromString(modname); + if (pmodname == NULL) { + return NULL; + } + PyObject *pattrname = PyUnicode_FromString(attrname); + if (pattrname == NULL) { + Py_DECREF(pmodname); + return NULL; + } + PyObject *result = _PyImport_GetModuleAttr(pmodname, pattrname); + Py_DECREF(pattrname); + Py_DECREF(pmodname); + return result; +} + #ifdef __cplusplus } #endif diff --git a/Python/initconfig.c b/Python/initconfig.c index a623973f953734..355f9869290bbd 100644 --- a/Python/initconfig.c +++ b/Python/initconfig.c @@ -28,9 +28,10 @@ static const char usage_line[] = "usage: %ls [option] ... [-c cmd | -m mod | file | -] [arg] ...\n"; -/* Long usage message, split into parts < 512 bytes */ -static const char usage_1[] = "\ -Options and arguments (and corresponding environment variables):\n\ +/* Long help message */ +/* Lines sorted by option name; keep in sync with usage_envvars* below */ +static const char usage_help[] = "\ +Options (and corresponding environment variables):\n\ -b : issue warnings about str(bytes_instance), str(bytearray_instance)\n\ and comparing bytes/bytearray with str. 
(-bb: issue errors)\n\ -B : don't write .pyc files on import; also PYTHONDONTWRITEBYTECODE=x\n\ @@ -38,9 +39,7 @@ Options and arguments (and corresponding environment variables):\n\ -d : turn on parser debugging output (for experts only, only works on\n\ debug builds); also PYTHONDEBUG=x\n\ -E : ignore PYTHON* environment variables (such as PYTHONPATH)\n\ --h : print this help message and exit (also --help)\n\ -"; -static const char usage_2[] = "\ +-h : print this help message and exit (also -? or --help)\n\ -i : inspect interactively after running script; forces a prompt even\n\ if stdin does not appear to be a terminal; also PYTHONINSPECT=x\n\ -I : isolate Python from the user's environment (implies -E and -s)\n\ @@ -53,8 +52,6 @@ static const char usage_2[] = "\ -q : don't print version and copyright messages on interactive startup\n\ -s : don't add user site directory to sys.path; also PYTHONNOUSERSITE\n\ -S : don't imply 'import site' on initialization\n\ -"; -static const char usage_3[] = "\ -u : force the stdout and stderr streams to be unbuffered;\n\ this option has no effect on stdin; also PYTHONUNBUFFERED=x\n\ -v : verbose (trace import statements); also PYTHONVERBOSE=x\n\ @@ -64,56 +61,72 @@ static const char usage_3[] = "\ -W arg : warning control; arg is action:message:category:module:lineno\n\ also PYTHONWARNINGS=arg\n\ -x : skip first line of source, allowing use of non-Unix forms of #!cmd\n\ --X opt : set implementation-specific option. The following options are available:\n\ -\n\ - -X faulthandler: enable faulthandler\n\ - -X showrefcount: output the total reference count and number of used\n\ - memory blocks when the program finishes or after each statement in the\n\ - interactive interpreter. This only works on debug builds\n\ - -X tracemalloc: start tracing Python memory allocations using the\n\ - tracemalloc module. By default, only the most recent frame is stored in a\n\ - traceback of a trace. Use -X tracemalloc=NFRAME to start tracing with a\n\ - traceback limit of NFRAME frames\n\ - -X importtime: show how long each import takes. It shows module name,\n\ - cumulative time (including nested imports) and self time (excluding\n\ - nested imports). Note that its output may be broken in multi-threaded\n\ - application. Typical usage is python3 -X importtime -c 'import asyncio'\n\ - -X dev: enable CPython's \"development mode\", introducing additional runtime\n\ - checks which are too expensive to be enabled by default. Effect of the\n\ - developer mode:\n\ - * Add default warning filter, as -W default\n\ - * Install debug hooks on memory allocators: see the PyMem_SetupDebugHooks() C function\n\ - * Enable the faulthandler module to dump the Python traceback on a crash\n\ - * Enable asyncio debug mode\n\ - * Set the dev_mode attribute of sys.flags to True\n\ - * io.IOBase destructor logs close() exceptions\n\ - -X utf8: enable UTF-8 mode for operating system interfaces, overriding the default\n\ - locale-aware mode. -X utf8=0 explicitly disables UTF-8 mode (even when it would\n\ - otherwise activate automatically)\n\ - -X pycache_prefix=PATH: enable writing .pyc files to a parallel tree rooted at the\n\ - given directory instead of to the code tree\n\ - -X warn_default_encoding: enable opt-in EncodingWarning for 'encoding=None'\n\ - -X no_debug_ranges: disable the inclusion of the tables mapping extra location \n\ - information (end line, start column offset and end column offset) to every \n\ - instruction in code objects. 
This is useful when smaller code objects and pyc \n\ - files are desired as well as suppressing the extra visual location indicators \n\ - when the interpreter displays tracebacks.\n\ - -X frozen_modules=[on|off]: whether or not frozen modules should be used.\n\ - The default is \"on\" (or \"off\" if you are running a local build).\n\ -\n\ +-X opt : set implementation-specific option\n\ --check-hash-based-pycs always|default|never:\n\ - control how Python invalidates hash-based .pyc files\n\ -"; -static const char usage_4[] = "\ + control how Python invalidates hash-based .pyc files\n\ +--help-env : print help about Python environment variables and exit\n\ +--help-xoptions : print help about implementation-specific -X options and exit\n\ +--help-all : print complete help information and exit\n\ +Arguments:\n\ file : program read from script file\n\ - : program read from stdin (default; interactive mode if a tty)\n\ -arg ...: arguments passed to program in sys.argv[1:]\n\n\ -Other environment variables:\n\ -PYTHONSTARTUP: file executed on interactive startup (no default)\n\ -PYTHONPATH : '%lc'-separated list of directories prefixed to the\n\ - default module search path. The result is sys.path.\n\ +arg ...: arguments passed to program in sys.argv[1:]\n\ "; -static const char usage_5[] = + +static const char usage_xoptions[] = "\ +The following implementation-specific options are available:\n\ +\n\ +-X faulthandler: enable faulthandler\n\ +\n\ +-X showrefcount: output the total reference count and number of used\n\ + memory blocks when the program finishes or after each statement in the\n\ + interactive interpreter. This only works on debug builds\n\ +\n\ +-X tracemalloc: start tracing Python memory allocations using the\n\ + tracemalloc module. By default, only the most recent frame is stored in a\n\ + traceback of a trace. Use -X tracemalloc=NFRAME to start tracing with a\n\ + traceback limit of NFRAME frames\n\ +\n\ +-X importtime: show how long each import takes. It shows module name,\n\ + cumulative time (including nested imports) and self time (excluding\n\ + nested imports). Note that its output may be broken in multi-threaded\n\ + application. Typical usage is python3 -X importtime -c 'import asyncio'\n\ +\n\ +-X dev: enable CPython's \"development mode\", introducing additional runtime\n\ + checks which are too expensive to be enabled by default. Effect of the\n\ + developer mode:\n\ + * Add default warning filter, as -W default\n\ + * Install debug hooks on memory allocators: see the PyMem_SetupDebugHooks()\n\ + C function\n\ + * Enable the faulthandler module to dump the Python traceback on a crash\n\ + * Enable asyncio debug mode\n\ + * Set the dev_mode attribute of sys.flags to True\n\ + * io.IOBase destructor logs close() exceptions\n\ +\n\ +-X utf8: enable UTF-8 mode for operating system interfaces, overriding the default\n\ + locale-aware mode. -X utf8=0 explicitly disables UTF-8 mode (even when it would\n\ + otherwise activate automatically)\n\ +\n\ +-X pycache_prefix=PATH: enable writing .pyc files to a parallel tree rooted at the\n\ + given directory instead of to the code tree\n\ +\n\ +-X warn_default_encoding: enable opt-in EncodingWarning for 'encoding=None'\n\ +\n\ +-X no_debug_ranges: disable the inclusion of the tables mapping extra location \n\ + information (end line, start column offset and end column offset) to every \n\ + instruction in code objects. 
This is useful when smaller code objects and pyc \n\ + files are desired as well as suppressing the extra visual location indicators \n\ + when the interpreter displays tracebacks.\n\ +\n\ +-X frozen_modules=[on|off]: whether or not frozen modules should be used.\n\ + The default is \"on\" (or \"off\" if you are running a local build)."; + +/* Envvars that don't have equivalent command-line options are listed first */ +static const char usage_envvars[] = +"Environment variables that change behavior:\n" +"PYTHONSTARTUP: file executed on interactive startup (no default)\n" +"PYTHONPATH : '%lc'-separated list of directories prefixed to the\n" +" default module search path. The result is sys.path.\n" "PYTHONSAFEPATH: don't prepend a potentially unsafe path to sys.path.\n" "PYTHONHOME : alternate directory (or %lc).\n" " The default module search path uses %s.\n" @@ -121,8 +134,7 @@ static const char usage_5[] = "PYTHONCASEOK : ignore case in 'import' statements (Windows).\n" "PYTHONUTF8: if set to 1, enable the UTF-8 mode.\n" "PYTHONIOENCODING: Encoding[:errors] used for stdin/stdout/stderr.\n" -"PYTHONFAULTHANDLER: dump the Python traceback on fatal errors.\n"; -static const char usage_6[] = +"PYTHONFAULTHANDLER: dump the Python traceback on fatal errors.\n" "PYTHONHASHSEED: if this variable is set to 'random', a random value is used\n" " to seed the hashes of str and bytes objects. It can also be set to an\n" " integer in the range [0,4294967295] to get hash values with a\n" @@ -141,8 +153,17 @@ static const char usage_6[] = "PYTHONNODEBUGRANGES: If this variable is set, it disables the inclusion of the \n" " tables mapping extra location information (end line, start column offset \n" " and end column offset) to every instruction in code objects. This is useful \n" -" when smaller cothe de objects and pyc files are desired as well as suppressing the \n" -" extra visual location indicators when the interpreter displays tracebacks.\n"; +" when smaller code objects and pyc files are desired as well as suppressing the \n" +" extra visual location indicators when the interpreter displays tracebacks.\n" +"These variables have equivalent command-line parameters (see --help for details):\n" +"PYTHONDEBUG : enable parser debug mode (-d)\n" +"PYTHONDONTWRITEBYTECODE : don't write .pyc files (-B)\n" +"PYTHONINSPECT : inspect interactively after running script (-i)\n" +"PYTHONNOUSERSITE : disable user site directory (-s)\n" +"PYTHONOPTIMIZE : enable level 1 optimizations (-O)\n" +"PYTHONUNBUFFERED : disable stdout/stderr buffering (-u)\n" +"PYTHONVERBOSE : trace import statements (-v)\n" +"PYTHONWARNINGS=arg : warning control (-W arg)\n"; #if defined(MS_WINDOWS) # define PYTHONHOMEHELP "\\python{major}{minor}" @@ -159,7 +180,7 @@ int Py_UTF8Mode = 0; int Py_DebugFlag = 0; /* Needed by parser.c */ int Py_VerboseFlag = 0; /* Needed by import.c */ int Py_QuietFlag = 0; /* Needed by sysmodule.c */ -int Py_InteractiveFlag = 0; /* Needed by Py_FdIsInteractive() below */ +int Py_InteractiveFlag = 0; /* Previously, was used by Py_FdIsInteractive() */ int Py_InspectFlag = 0; /* Needed to determine whether to exit at SystemExit */ int Py_OptimizeFlag = 0; /* Needed by compile.c */ int Py_NoSiteFlag = 0; /* Suppress 'import site' */ @@ -180,6 +201,8 @@ int Py_LegacyWindowsStdioFlag = 0; /* Uses FileIO instead of WindowsConsoleIO */ static PyObject * _Py_GetGlobalVariablesAsDict(void) { +_Py_COMP_DIAG_PUSH +_Py_COMP_DIAG_IGNORE_DEPR_DECLS PyObject *dict, *obj; dict = PyDict_New(); @@ -246,15 +269,19 @@ 
_Py_GetGlobalVariablesAsDict(void) #undef SET_ITEM #undef SET_ITEM_INT #undef SET_ITEM_STR +_Py_COMP_DIAG_POP } char* Py_GETENV(const char *name) { +_Py_COMP_DIAG_PUSH +_Py_COMP_DIAG_IGNORE_DEPR_DECLS if (Py_IgnoreEnvironmentFlag) { return NULL; } return getenv(name); +_Py_COMP_DIAG_POP } /* --- PyStatus ----------------------------------------------- */ @@ -516,8 +543,11 @@ Py_SetStandardStreamEncoding(const char *encoding, const char *errors) } #ifdef MS_WINDOWS if (_Py_StandardStreamEncoding) { +_Py_COMP_DIAG_PUSH +_Py_COMP_DIAG_IGNORE_DEPR_DECLS /* Overriding the stream encoding implies legacy streams */ Py_LegacyWindowsStdioFlag = 1; +_Py_COMP_DIAG_POP } #endif @@ -1422,6 +1452,8 @@ config_get_env_dup(PyConfig *config, static void config_get_global_vars(PyConfig *config) { +_Py_COMP_DIAG_PUSH +_Py_COMP_DIAG_IGNORE_DEPR_DECLS if (config->_config_init != _PyConfig_INIT_COMPAT) { /* Python and Isolated configuration ignore global variables */ return; @@ -1457,6 +1489,7 @@ config_get_global_vars(PyConfig *config) #undef COPY_FLAG #undef COPY_NOT_FLAG +_Py_COMP_DIAG_POP } @@ -1464,6 +1497,8 @@ config_get_global_vars(PyConfig *config) static void config_set_global_vars(const PyConfig *config) { +_Py_COMP_DIAG_PUSH +_Py_COMP_DIAG_IGNORE_DEPR_DECLS #define COPY_FLAG(ATTR, VAR) \ if (config->ATTR != -1) { \ VAR = config->ATTR; \ @@ -1498,6 +1533,7 @@ config_set_global_vars(const PyConfig *config) #undef COPY_FLAG #undef COPY_NOT_FLAG +_Py_COMP_DIAG_POP } @@ -2085,7 +2121,7 @@ config_read(PyConfig *config, int compute_path_config) /* -X options */ const wchar_t* option = _Py_check_xoptions(&config->xoptions, known_xoptions); if (option != NULL) { - return PyStatus_Error("Unknown value for option -X"); + return PyStatus_Error("Unknown value for option -X (see --help-xoptions)"); } if (config_get_xoption(config, L"showrefcount")) { @@ -2238,15 +2274,32 @@ config_usage(int error, const wchar_t* program) if (error) fprintf(f, "Try `python -h' for more information.\n"); else { - fputs(usage_1, f); - fputs(usage_2, f); - fputs(usage_3, f); - fprintf(f, usage_4, (wint_t)DELIM); - fprintf(f, usage_5, (wint_t)DELIM, PYTHONHOMEHELP); - fputs(usage_6, f); + fputs(usage_help, f); } } +static void +config_envvars_usage() +{ + printf(usage_envvars, (wint_t)DELIM, (wint_t)DELIM, PYTHONHOMEHELP); +} + +static void +config_xoptions_usage() +{ + puts(usage_xoptions); +} + +static void +config_complete_usage(const wchar_t* program) +{ + config_usage(0, program); + puts("\n"); + config_envvars_usage(); + puts("\n"); + config_xoptions_usage(); +} + /* Parse the command line arguments */ static PyStatus @@ -2298,9 +2351,9 @@ config_parse_cmdline(PyConfig *config, PyWideStringList *warnoptions, } switch (c) { + // Integers represent long options, see Python/getopt.c case 0: - // Handle long option. - assert(longindex == 0); // Only one long option now. 
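/*
 * Illustrative sketch (not from the CPython sources): the getopt.c hunk
 * earlier in this patch gives every long option an integer `val`
 * (0 = --check-hash-based-pycs, 1 = --help-all, 2 = --help-env,
 * 3 = --help-xoptions, -1 = sentinel), and config_parse_cmdline() dispatches
 * on that value in the surrounding switch (cases 0 through 3). A simplified
 * standalone version of the table-plus-switch pattern (narrow strings
 * instead of wchar_t; the toy_* names are assumptions):
 */
#include <stdio.h>
#include <string.h>

typedef struct { const char *name; int has_arg; int val; } toy_longopt;

static const toy_longopt toy_longopts[] = {
    {"check-hash-based-pycs", 1, 0},
    {"help-all",              0, 1},
    {"help-env",              0, 2},
    {"help-xoptions",         0, 3},
    {NULL,                    0, -1},   /* sentinel */
};

/* Return the option's val, or -1 if the name is unknown. */
static int
toy_lookup(const char *name)
{
    for (const toy_longopt *opt = toy_longopts; opt->name != NULL; opt++) {
        if (strcmp(opt->name, name) == 0) {
            return opt->val;
        }
    }
    return -1;
}

int main(void)
{
    switch (toy_lookup("help-env")) {
        case 0:  puts("would validate the --check-hash-based-pycs argument"); break;
        case 1:  puts("would print the complete help");                       break;
        case 2:  puts("would print the environment-variable help");           break;
        case 3:  puts("would print the -X option help");                      break;
        default: puts("unknown long option");                                 break;
    }
    return 0;
}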
+ // check-hash-based-pycs if (wcscmp(_PyOS_optarg, L"always") == 0 || wcscmp(_PyOS_optarg, L"never") == 0 || wcscmp(_PyOS_optarg, L"default") == 0) @@ -2318,6 +2371,21 @@ config_parse_cmdline(PyConfig *config, PyWideStringList *warnoptions, } break; + case 1: + // help-all + config_complete_usage(program); + return _PyStatus_EXIT(0); + + case 2: + // help-env + config_envvars_usage(); + return _PyStatus_EXIT(0); + + case 3: + // help-xoptions + config_xoptions_usage(); + return _PyStatus_EXIT(0); + case 'b': config->bytes_warning++; break; diff --git a/Python/opcode_targets.h b/Python/opcode_targets.h index d37c1326247185..8a6f1cdbbbb62f 100644 --- a/Python/opcode_targets.h +++ b/Python/opcode_targets.h @@ -25,67 +25,67 @@ static void *opcode_targets[256] = { &&TARGET_CALL_PY_EXACT_ARGS, &&TARGET_CALL_PY_WITH_DEFAULTS, &&TARGET_BINARY_SUBSCR, - &&TARGET_COMPARE_OP_ADAPTIVE, - &&TARGET_COMPARE_OP_FLOAT_JUMP, - &&TARGET_COMPARE_OP_INT_JUMP, - &&TARGET_COMPARE_OP_STR_JUMP, + &&TARGET_CALL_BOUND_METHOD_EXACT_ARGS, + &&TARGET_CALL_BUILTIN_CLASS, + &&TARGET_CALL_BUILTIN_FAST_WITH_KEYWORDS, + &&TARGET_CALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS, &&TARGET_GET_LEN, &&TARGET_MATCH_MAPPING, &&TARGET_MATCH_SEQUENCE, &&TARGET_MATCH_KEYS, - &&TARGET_EXTENDED_ARG_QUICK, + &&TARGET_CALL_NO_KW_BUILTIN_FAST, &&TARGET_PUSH_EXC_INFO, &&TARGET_CHECK_EXC_MATCH, &&TARGET_CHECK_EG_MATCH, - &&TARGET_JUMP_BACKWARD_QUICK, - &&TARGET_LOAD_ATTR_ADAPTIVE, - &&TARGET_LOAD_ATTR_INSTANCE_VALUE, - &&TARGET_LOAD_ATTR_MODULE, - &&TARGET_LOAD_ATTR_SLOT, - &&TARGET_LOAD_ATTR_WITH_HINT, - &&TARGET_LOAD_CONST__LOAD_FAST, - &&TARGET_LOAD_FAST__LOAD_CONST, - &&TARGET_LOAD_FAST__LOAD_FAST, - &&TARGET_LOAD_GLOBAL_ADAPTIVE, - &&TARGET_LOAD_GLOBAL_BUILTIN, + &&TARGET_CALL_NO_KW_BUILTIN_O, + &&TARGET_CALL_NO_KW_ISINSTANCE, + &&TARGET_CALL_NO_KW_LEN, + &&TARGET_CALL_NO_KW_LIST_APPEND, + &&TARGET_CALL_NO_KW_METHOD_DESCRIPTOR_FAST, + &&TARGET_CALL_NO_KW_METHOD_DESCRIPTOR_NOARGS, + &&TARGET_CALL_NO_KW_METHOD_DESCRIPTOR_O, + &&TARGET_CALL_NO_KW_STR_1, + &&TARGET_CALL_NO_KW_TUPLE_1, + &&TARGET_CALL_NO_KW_TYPE_1, + &&TARGET_COMPARE_OP_ADAPTIVE, &&TARGET_WITH_EXCEPT_START, &&TARGET_GET_AITER, &&TARGET_GET_ANEXT, &&TARGET_BEFORE_ASYNC_WITH, &&TARGET_BEFORE_WITH, &&TARGET_END_ASYNC_FOR, - &&TARGET_LOAD_GLOBAL_MODULE, - &&TARGET_LOAD_METHOD_ADAPTIVE, - &&TARGET_LOAD_METHOD_CLASS, - &&TARGET_LOAD_METHOD_MODULE, - &&TARGET_LOAD_METHOD_NO_DICT, + &&TARGET_COMPARE_OP_FLOAT_JUMP, + &&TARGET_COMPARE_OP_INT_JUMP, + &&TARGET_COMPARE_OP_STR_JUMP, + &&TARGET_EXTENDED_ARG_QUICK, + &&TARGET_FOR_ITER_ADAPTIVE, &&TARGET_STORE_SUBSCR, &&TARGET_DELETE_SUBSCR, - &&TARGET_LOAD_METHOD_WITH_DICT, - &&TARGET_LOAD_METHOD_WITH_VALUES, - &&TARGET_PRECALL_ADAPTIVE, - &&TARGET_PRECALL_BOUND_METHOD, - &&TARGET_PRECALL_BUILTIN_CLASS, - &&TARGET_PRECALL_BUILTIN_FAST_WITH_KEYWORDS, + &&TARGET_FOR_ITER_LIST, + &&TARGET_FOR_ITER_RANGE, + &&TARGET_JUMP_BACKWARD_QUICK, + &&TARGET_LOAD_ATTR_ADAPTIVE, + &&TARGET_LOAD_ATTR_CLASS, + &&TARGET_LOAD_ATTR_INSTANCE_VALUE, &&TARGET_GET_ITER, &&TARGET_GET_YIELD_FROM_ITER, &&TARGET_PRINT_EXPR, &&TARGET_LOAD_BUILD_CLASS, - &&TARGET_PRECALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS, - &&TARGET_PRECALL_NO_KW_BUILTIN_FAST, + &&TARGET_LOAD_ATTR_MODULE, + &&TARGET_LOAD_ATTR_PROPERTY, &&TARGET_LOAD_ASSERTION_ERROR, &&TARGET_RETURN_GENERATOR, - &&TARGET_PRECALL_NO_KW_BUILTIN_O, - &&TARGET_PRECALL_NO_KW_ISINSTANCE, - &&TARGET_PRECALL_NO_KW_LEN, - &&TARGET_PRECALL_NO_KW_LIST_APPEND, - &&TARGET_PRECALL_NO_KW_METHOD_DESCRIPTOR_FAST, - 
&&TARGET_PRECALL_NO_KW_METHOD_DESCRIPTOR_NOARGS, + &&TARGET_LOAD_ATTR_SLOT, + &&TARGET_LOAD_ATTR_WITH_HINT, + &&TARGET_LOAD_ATTR_METHOD_LAZY_DICT, + &&TARGET_LOAD_ATTR_METHOD_NO_DICT, + &&TARGET_LOAD_ATTR_METHOD_WITH_DICT, + &&TARGET_LOAD_ATTR_METHOD_WITH_VALUES, &&TARGET_LIST_TO_TUPLE, &&TARGET_RETURN_VALUE, &&TARGET_IMPORT_STAR, &&TARGET_SETUP_ANNOTATIONS, - &&TARGET_YIELD_VALUE, + &&TARGET_LOAD_CONST__LOAD_FAST, &&TARGET_ASYNC_GEN_WRAP, &&TARGET_PREP_RERAISE_STAR, &&TARGET_POP_EXCEPT, @@ -112,7 +112,7 @@ static void *opcode_targets[256] = { &&TARGET_JUMP_FORWARD, &&TARGET_JUMP_IF_FALSE_OR_POP, &&TARGET_JUMP_IF_TRUE_OR_POP, - &&TARGET_PRECALL_NO_KW_METHOD_DESCRIPTOR_O, + &&TARGET_LOAD_FAST__LOAD_CONST, &&TARGET_POP_JUMP_FORWARD_IF_FALSE, &&TARGET_POP_JUMP_FORWARD_IF_TRUE, &&TARGET_LOAD_GLOBAL, @@ -120,13 +120,13 @@ static void *opcode_targets[256] = { &&TARGET_CONTAINS_OP, &&TARGET_RERAISE, &&TARGET_COPY, - &&TARGET_PRECALL_NO_KW_STR_1, + &&TARGET_LOAD_FAST__LOAD_FAST, &&TARGET_BINARY_OP, &&TARGET_SEND, &&TARGET_LOAD_FAST, &&TARGET_STORE_FAST, &&TARGET_DELETE_FAST, - &&TARGET_PRECALL_NO_KW_TUPLE_1, + &&TARGET_LOAD_FAST_CHECK, &&TARGET_POP_JUMP_FORWARD_IF_NOT_NONE, &&TARGET_POP_JUMP_FORWARD_IF_NONE, &&TARGET_RAISE_VARARGS, @@ -140,32 +140,32 @@ static void *opcode_targets[256] = { &&TARGET_STORE_DEREF, &&TARGET_DELETE_DEREF, &&TARGET_JUMP_BACKWARD, - &&TARGET_PRECALL_NO_KW_TYPE_1, + &&TARGET_LOAD_GLOBAL_ADAPTIVE, &&TARGET_CALL_FUNCTION_EX, - &&TARGET_PRECALL_PYFUNC, + &&TARGET_LOAD_GLOBAL_BUILTIN, &&TARGET_EXTENDED_ARG, &&TARGET_LIST_APPEND, &&TARGET_SET_ADD, &&TARGET_MAP_ADD, &&TARGET_LOAD_CLASSDEREF, &&TARGET_COPY_FREE_VARS, - &&TARGET_RESUME_QUICK, + &&TARGET_YIELD_VALUE, &&TARGET_RESUME, &&TARGET_MATCH_CLASS, - &&TARGET_STORE_ATTR_ADAPTIVE, - &&TARGET_STORE_ATTR_INSTANCE_VALUE, + &&TARGET_LOAD_GLOBAL_MODULE, + &&TARGET_RESUME_QUICK, &&TARGET_FORMAT_VALUE, &&TARGET_BUILD_CONST_KEY_MAP, &&TARGET_BUILD_STRING, + &&TARGET_STORE_ATTR_ADAPTIVE, + &&TARGET_STORE_ATTR_INSTANCE_VALUE, &&TARGET_STORE_ATTR_SLOT, &&TARGET_STORE_ATTR_WITH_HINT, - &&TARGET_LOAD_METHOD, - &&TARGET_STORE_FAST__LOAD_FAST, &&TARGET_LIST_EXTEND, &&TARGET_SET_UPDATE, &&TARGET_DICT_MERGE, &&TARGET_DICT_UPDATE, - &&TARGET_PRECALL, + &&TARGET_STORE_FAST__LOAD_FAST, &&TARGET_STORE_FAST__STORE_FAST, &&TARGET_STORE_SUBSCR_ADAPTIVE, &&TARGET_STORE_SUBSCR_DICT, diff --git a/Python/pathconfig.c b/Python/pathconfig.c index 4271928571fa1f..69b7e10a3b0288 100644 --- a/Python/pathconfig.c +++ b/Python/pathconfig.c @@ -36,10 +36,11 @@ typedef struct _PyPathConfig { wchar_t *program_name; /* Set by Py_SetPythonHome() or PYTHONHOME environment variable */ wchar_t *home; + int _is_python_build; } _PyPathConfig; # define _PyPathConfig_INIT \ - {.module_search_path = NULL} + {.module_search_path = NULL, ._is_python_build = 0} _PyPathConfig _Py_path_config = _PyPathConfig_INIT; @@ -72,6 +73,7 @@ _PyPathConfig_ClearGlobal(void) CLEAR(calculated_module_search_path); CLEAR(program_name); CLEAR(home); + _Py_path_config._is_python_build = 0; #undef CLEAR @@ -99,15 +101,25 @@ _PyPathConfig_ReadGlobal(PyConfig *config) } \ } while (0) +#define COPY_INT(ATTR) \ + do { \ + assert(_Py_path_config.ATTR >= 0); \ + if ((_Py_path_config.ATTR >= 0) && (config->ATTR <= 0)) { \ + config->ATTR = _Py_path_config.ATTR; \ + } \ + } while (0) + COPY(prefix); COPY(exec_prefix); COPY(stdlib_dir); COPY(program_name); COPY(home); COPY2(executable, program_full_path); + COPY_INT(_is_python_build); // module_search_path must be initialised - not read #undef COPY 
#undef COPY2 +#undef COPY_INT done: return status; @@ -137,14 +149,23 @@ _PyPathConfig_UpdateGlobal(const PyConfig *config) } \ } while (0) +#define COPY_INT(ATTR) \ + do { \ + if (config->ATTR > 0) { \ + _Py_path_config.ATTR = config->ATTR; \ + } \ + } while (0) + COPY(prefix); COPY(exec_prefix); COPY(stdlib_dir); COPY(program_name); COPY(home); COPY2(program_full_path, executable); + COPY_INT(_is_python_build); #undef COPY #undef COPY2 +#undef COPY_INT PyMem_RawFree(_Py_path_config.module_search_path); _Py_path_config.module_search_path = NULL; diff --git a/Python/preconfig.c b/Python/preconfig.c index afa16cccf32e96..77a86d651eb0f4 100644 --- a/Python/preconfig.c +++ b/Python/preconfig.c @@ -24,6 +24,8 @@ int _Py_HasFileSystemDefaultEncodeErrors = 0; void _Py_ClearFileSystemEncoding(void) { +_Py_COMP_DIAG_PUSH +_Py_COMP_DIAG_IGNORE_DEPR_DECLS if (!Py_HasFileSystemDefaultEncoding && Py_FileSystemDefaultEncoding) { PyMem_RawFree((char*)Py_FileSystemDefaultEncoding); Py_FileSystemDefaultEncoding = NULL; @@ -32,6 +34,7 @@ _Py_ClearFileSystemEncoding(void) PyMem_RawFree((char*)Py_FileSystemDefaultEncodeErrors); Py_FileSystemDefaultEncodeErrors = NULL; } +_Py_COMP_DIAG_POP } @@ -56,11 +59,14 @@ _Py_SetFileSystemEncoding(const char *encoding, const char *errors) _Py_ClearFileSystemEncoding(); +_Py_COMP_DIAG_PUSH +_Py_COMP_DIAG_IGNORE_DEPR_DECLS Py_FileSystemDefaultEncoding = encoding2; Py_HasFileSystemDefaultEncoding = 0; Py_FileSystemDefaultEncodeErrors = errors2; _Py_HasFileSystemDefaultEncodeErrors = 0; +_Py_COMP_DIAG_POP return 0; } @@ -294,17 +300,7 @@ _PyPreConfig_InitCompatConfig(PyPreConfig *config) config->coerce_c_locale_warn = 0; config->dev_mode = -1; -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - /* bpo-40512: pymalloc is not compatible with subinterpreters, - force usage of libc malloc() which is thread-safe. 
*/ -#ifdef Py_DEBUG - config->allocator = PYMEM_ALLOCATOR_MALLOC_DEBUG; -#else - config->allocator = PYMEM_ALLOCATOR_MALLOC; -#endif -#else config->allocator = PYMEM_ALLOCATOR_NOT_SET; -#endif #ifdef MS_WINDOWS config->legacy_windows_fs_encoding = -1; #endif @@ -482,6 +478,8 @@ preconfig_get_global_vars(PyPreConfig *config) config->ATTR = !(VALUE); \ } +_Py_COMP_DIAG_PUSH +_Py_COMP_DIAG_IGNORE_DEPR_DECLS COPY_FLAG(isolated, Py_IsolatedFlag); COPY_NOT_FLAG(use_environment, Py_IgnoreEnvironmentFlag); if (Py_UTF8Mode > 0) { @@ -490,6 +488,7 @@ preconfig_get_global_vars(PyPreConfig *config) #ifdef MS_WINDOWS COPY_FLAG(legacy_windows_fs_encoding, Py_LegacyWindowsFSEncodingFlag); #endif +_Py_COMP_DIAG_POP #undef COPY_FLAG #undef COPY_NOT_FLAG @@ -508,12 +507,15 @@ preconfig_set_global_vars(const PyPreConfig *config) VAR = !config->ATTR; \ } +_Py_COMP_DIAG_PUSH +_Py_COMP_DIAG_IGNORE_DEPR_DECLS COPY_FLAG(isolated, Py_IsolatedFlag); COPY_NOT_FLAG(use_environment, Py_IgnoreEnvironmentFlag); #ifdef MS_WINDOWS COPY_FLAG(legacy_windows_fs_encoding, Py_LegacyWindowsFSEncodingFlag); #endif COPY_FLAG(utf8_mode, Py_UTF8Mode); +_Py_COMP_DIAG_POP #undef COPY_FLAG #undef COPY_NOT_FLAG @@ -826,12 +828,10 @@ _PyPreConfig_Read(PyPreConfig *config, const _PyArgv *args) _Py_SetLocaleFromEnv(LC_CTYPE); } - _PyPreCmdline cmdline = _PyPreCmdline_INIT; - int init_utf8_mode = Py_UTF8Mode; -#ifdef MS_WINDOWS - int init_legacy_encoding = Py_LegacyWindowsFSEncodingFlag; -#endif + PyPreConfig save_runtime_config; + preconfig_copy(&save_runtime_config, &_PyRuntime.preconfig); + _PyPreCmdline cmdline = _PyPreCmdline_INIT; int locale_coerced = 0; int loops = 0; @@ -847,11 +847,9 @@ _PyPreConfig_Read(PyPreConfig *config, const _PyArgv *args) } /* bpo-34207: Py_DecodeLocale() and Py_EncodeLocale() depend - on Py_UTF8Mode and Py_LegacyWindowsFSEncodingFlag. */ - Py_UTF8Mode = config->utf8_mode; -#ifdef MS_WINDOWS - Py_LegacyWindowsFSEncodingFlag = config->legacy_windows_fs_encoding; -#endif + on the utf8_mode and legacy_windows_fs_encoding members + of _PyRuntime.preconfig. */ + preconfig_copy(&_PyRuntime.preconfig, config); if (args) { // Set command line arguments at each iteration. 
If they are bytes @@ -914,14 +912,10 @@ _PyPreConfig_Read(PyPreConfig *config, const _PyArgv *args) status = _PyStatus_OK(); done: - if (init_ctype_locale != NULL) { - setlocale(LC_CTYPE, init_ctype_locale); - PyMem_RawFree(init_ctype_locale); - } - Py_UTF8Mode = init_utf8_mode ; -#ifdef MS_WINDOWS - Py_LegacyWindowsFSEncodingFlag = init_legacy_encoding; -#endif + // Revert side effects + setlocale(LC_CTYPE, init_ctype_locale); + PyMem_RawFree(init_ctype_locale); + preconfig_copy(&_PyRuntime.preconfig, &save_runtime_config); _PyPreCmdline_Clear(&cmdline); return status; } diff --git a/Python/pylifecycle.c b/Python/pylifecycle.c index 8644b5b68b6c57..65e7f23e963b5f 100644 --- a/Python/pylifecycle.c +++ b/Python/pylifecycle.c @@ -1462,7 +1462,7 @@ finalize_restore_builtins(PyThreadState *tstate) } PyDict_Clear(interp->builtins); if (PyDict_Update(interp->builtins, interp->builtins_copy)) { - _PyErr_Clear(tstate); + PyErr_WriteUnraisable(NULL); } Py_XDECREF(dict); } @@ -1981,12 +1981,10 @@ new_interpreter(PyThreadState **tstate_p, int isolated_subinterpreter) /* Copy the current interpreter config into the new interpreter */ const PyConfig *config; -#ifndef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS if (save_tstate != NULL) { config = _PyInterpreterState_GetConfig(save_tstate->interp); } else -#endif { /* No current thread state, copy from the main interpreter */ PyInterpreterState *main_interp = _PyInterpreterState_Main(); @@ -2344,19 +2342,15 @@ create_stdio(const PyConfig *config, PyObject* io, static PyStatus init_set_builtins_open(void) { - PyObject *iomod = NULL, *wrapper; + PyObject *wrapper; PyObject *bimod = NULL; PyStatus res = _PyStatus_OK(); - if (!(iomod = PyImport_ImportModule("io"))) { - goto error; - } - if (!(bimod = PyImport_ImportModule("builtins"))) { goto error; } - if (!(wrapper = PyObject_GetAttrString(iomod, "open"))) { + if (!(wrapper = _PyImport_GetModuleAttrString("io", "open"))) { goto error; } @@ -2373,7 +2367,6 @@ init_set_builtins_open(void) done: Py_XDECREF(bimod); - Py_XDECREF(iomod); return res; } @@ -2838,11 +2831,7 @@ _Py_FatalErrorFormat(const char *func, const char *format, ...) 
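/*
 * Illustrative sketch (not from the CPython sources): init_set_builtins_open()
 * above now fetches io.open via the new private helper
 * _PyImport_GetModuleAttrString("io", "open") instead of importing the module
 * and holding a reference to it. In public C-API terms the helper boils down
 * to "import the module, take one attribute, drop the module" (the real
 * helper goes through PyImport_Import() on a unicode name); a rough
 * equivalent using only public calls:
 */
#include <Python.h>

/* Return a new reference to `attrname` looked up on module `modname`,
   or NULL with an exception set. */
static PyObject *
get_module_attr_string(const char *modname, const char *attrname)
{
    PyObject *mod = PyImport_ImportModule(modname);
    if (mod == NULL) {
        return NULL;
    }
    PyObject *attr = PyObject_GetAttrString(mod, attrname);
    Py_DECREF(mod);                 /* the caller only needs the attribute */
    return attr;
}

int main(void)
{
    Py_Initialize();
    PyObject *open_func = get_module_attr_string("io", "open");
    if (open_func != NULL) {
        printf("got io.open of type %s\n", Py_TYPE(open_func)->tp_name);
        Py_DECREF(open_func);
    }
    else {
        PyErr_Print();
    }
    return Py_FinalizeEx() < 0 ? 1 : 0;
}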
} va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, format); -#else - va_start(vargs); -#endif vfprintf(stream, format, vargs); va_end(vargs); @@ -2949,28 +2938,30 @@ Py_Exit(int sts) int Py_FdIsInteractive(FILE *fp, const char *filename) { - if (isatty((int)fileno(fp))) + if (isatty(fileno(fp))) { return 1; - if (!Py_InteractiveFlag) + } + if (!_Py_GetConfig()->interactive) { return 0; - return (filename == NULL) || - (strcmp(filename, "") == 0) || - (strcmp(filename, "???") == 0); + } + return ((filename == NULL) + || (strcmp(filename, "") == 0) + || (strcmp(filename, "???") == 0)); } int _Py_FdIsInteractive(FILE *fp, PyObject *filename) { - if (isatty((int)fileno(fp))) { + if (isatty(fileno(fp))) { return 1; } - if (!Py_InteractiveFlag) { + if (!_Py_GetConfig()->interactive) { return 0; } - return (filename == NULL) || - (PyUnicode_CompareWithASCIIString(filename, "") == 0) || - (PyUnicode_CompareWithASCIIString(filename, "???") == 0); + return ((filename == NULL) + || (PyUnicode_CompareWithASCIIString(filename, "") == 0) + || (PyUnicode_CompareWithASCIIString(filename, "???") == 0)); } diff --git a/Python/pystate.c b/Python/pystate.c index 3e28a6ab69a989..11cc122185f781 100644 --- a/Python/pystate.c +++ b/Python/pystate.c @@ -1165,14 +1165,6 @@ _PyThreadState_DeleteExcept(_PyRuntimeState *runtime, PyThreadState *tstate) } -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS -PyThreadState* -_PyThreadState_GetTSS(void) { - return PyThread_tss_get(&_PyRuntime.gilstate.autoTSSkey); -} -#endif - - PyThreadState * _PyThreadState_UncheckedGet(void) { @@ -1192,11 +1184,7 @@ PyThreadState_Get(void) PyThreadState * _PyThreadState_Swap(struct _gilstate_runtime_state *gilstate, PyThreadState *newts) { -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - PyThreadState *oldts = _PyThreadState_GetTSS(); -#else PyThreadState *oldts = _PyRuntimeGILState_GetThreadState(gilstate); -#endif _PyRuntimeGILState_SetThreadState(gilstate, newts); /* It should not be possible for more than one thread state @@ -1214,9 +1202,6 @@ _PyThreadState_Swap(struct _gilstate_runtime_state *gilstate, PyThreadState *new Py_FatalError("Invalid thread state for this thread"); errno = err; } -#endif -#ifdef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS - PyThread_tss_set(&gilstate->autoTSSkey, newts); #endif return oldts; } @@ -1665,9 +1650,7 @@ PyGILState_Ensure(void) /* Ensure that _PyEval_InitThreads() and _PyGILState_Init() have been called by Py_Initialize() */ -#ifndef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS assert(_PyEval_ThreadsInitialized(runtime)); -#endif assert(gilstate->autoInterpreterState); PyThreadState *tcur = (PyThreadState *)PyThread_tss_get(&gilstate->autoTSSkey); @@ -2153,6 +2136,7 @@ _Py_GetConfig(void) { assert(PyGILState_Check()); PyThreadState *tstate = _PyThreadState_GET(); + _Py_EnsureTstateNotNULL(tstate); return _PyInterpreterState_GetConfig(tstate->interp); } @@ -2184,7 +2168,7 @@ push_chunk(PyThreadState *tstate, int size) } _PyInterpreterFrame * -_PyThreadState_BumpFramePointerSlow(PyThreadState *tstate, size_t size) +_PyThreadState_PushFrame(PyThreadState *tstate, size_t size) { assert(size < INT_MAX/sizeof(PyObject *)); PyObject **base = tstate->datastack_top; diff --git a/Python/pytime.c b/Python/pytime.c index f49a25bf7bce7c..01c07da074757e 100644 --- a/Python/pytime.c +++ b/Python/pytime.c @@ -406,6 +406,14 @@ _PyTime_FromNanoseconds(_PyTime_t ns) } +_PyTime_t +_PyTime_FromMicrosecondsClamp(_PyTime_t us) +{ + _PyTime_t ns = _PyTime_Mul(us, US_TO_NS); + return pytime_from_nanoseconds(ns); +} + + int 
_PyTime_FromNanosecondsObject(_PyTime_t *tp, PyObject *obj) { diff --git a/Python/specialize.c b/Python/specialize.c index fa42993606fd63..948fb328695bb9 100644 --- a/Python/specialize.c +++ b/Python/specialize.c @@ -8,6 +8,7 @@ #include "pycore_object.h" #include "pycore_opcode.h" // _PyOpcode_Caches #include "structmember.h" // struct PyMemberDef, T_OFFSET_EX +#include "pycore_descrobject.h" #include // rand() @@ -20,20 +21,20 @@ uint8_t _PyOpcode_Adaptive[256] = { [LOAD_ATTR] = LOAD_ATTR_ADAPTIVE, [LOAD_GLOBAL] = LOAD_GLOBAL_ADAPTIVE, - [LOAD_METHOD] = LOAD_METHOD_ADAPTIVE, [BINARY_SUBSCR] = BINARY_SUBSCR_ADAPTIVE, [STORE_SUBSCR] = STORE_SUBSCR_ADAPTIVE, [CALL] = CALL_ADAPTIVE, - [PRECALL] = PRECALL_ADAPTIVE, [STORE_ATTR] = STORE_ATTR_ADAPTIVE, [BINARY_OP] = BINARY_OP_ADAPTIVE, [COMPARE_OP] = COMPARE_OP_ADAPTIVE, [UNPACK_SEQUENCE] = UNPACK_SEQUENCE_ADAPTIVE, + [FOR_ITER] = FOR_ITER_ADAPTIVE, }; Py_ssize_t _Py_QuickenedCount = 0; #ifdef Py_STATS -PyStats _py_stats = { 0 }; +PyStats _py_stats_struct = { 0 }; +PyStats *_py_stats = &_py_stats_struct; #define ADD_STAT_TO_DICT(res, field) \ do { \ @@ -93,7 +94,7 @@ add_stat_dict( int opcode, const char *name) { - SpecializationStats *stats = &_py_stats.opcode_stats[opcode].specialization; + SpecializationStats *stats = &_py_stats_struct.opcode_stats[opcode].specialization; PyObject *d = stats_to_dict(stats); if (d == NULL) { return -1; @@ -113,7 +114,6 @@ _Py_GetSpecializationStats(void) { int err = 0; err += add_stat_dict(stats, LOAD_ATTR, "load_attr"); err += add_stat_dict(stats, LOAD_GLOBAL, "load_global"); - err += add_stat_dict(stats, LOAD_METHOD, "load_method"); err += add_stat_dict(stats, BINARY_SUBSCR, "binary_subscr"); err += add_stat_dict(stats, STORE_SUBSCR, "store_subscr"); err += add_stat_dict(stats, STORE_ATTR, "store_attr"); @@ -121,7 +121,6 @@ _Py_GetSpecializationStats(void) { err += add_stat_dict(stats, BINARY_OP, "binary_op"); err += add_stat_dict(stats, COMPARE_OP, "compare_op"); err += add_stat_dict(stats, UNPACK_SEQUENCE, "unpack_sequence"); - err += add_stat_dict(stats, PRECALL, "precall"); if (err < 0) { Py_DECREF(stats); return NULL; @@ -178,6 +177,9 @@ print_call_stats(FILE *out, CallStats *stats) fprintf(out, "Calls to Python functions inlined: %" PRIu64 "\n", stats->inlined_py_calls); fprintf(out, "Frames pushed: %" PRIu64 "\n", stats->frames_pushed); fprintf(out, "Frame objects created: %" PRIu64 "\n", stats->frame_objects_created); + for (int i = 0; i < EVAL_CALL_KINDS; i++) { + fprintf(out, "Calls via PyEval_EvalFrame[%d] : %" PRIu64 "\n", i, stats->eval_calls[i]); + } } static void @@ -191,6 +193,10 @@ print_object_stats(FILE *out, ObjectStats *stats) fprintf(out, "Object allocations over 4 kbytes: %" PRIu64 "\n", stats->allocations_big); fprintf(out, "Object frees: %" PRIu64 "\n", stats->frees); fprintf(out, "Object new values: %" PRIu64 "\n", stats->new_values); + fprintf(out, "Object interpreter increfs: %" PRIu64 "\n", stats->interpreter_increfs); + fprintf(out, "Object interpreter decrefs: %" PRIu64 "\n", stats->interpreter_decrefs); + fprintf(out, "Object increfs: %" PRIu64 "\n", stats->increfs); + fprintf(out, "Object decrefs: %" PRIu64 "\n", stats->decrefs); fprintf(out, "Object materialize dict (on request): %" PRIu64 "\n", stats->dict_materialized_on_request); fprintf(out, "Object materialize dict (new key): %" PRIu64 "\n", stats->dict_materialized_new_key); fprintf(out, "Object materialize dict (too big): %" PRIu64 "\n", stats->dict_materialized_too_big); @@ -204,9 +210,18 @@ print_stats(FILE *out, 
PyStats *stats) { print_object_stats(out, &stats->object_stats); } +void +_Py_StatsClear(void) +{ + _py_stats_struct = (PyStats) { 0 }; +} + void _Py_PrintSpecializationStats(int to_file) { + if (_py_stats == NULL) { + return; + } FILE *out = stderr; if (to_file) { /* Write to a file instead of stderr. */ @@ -237,7 +252,7 @@ _Py_PrintSpecializationStats(int to_file) else { fprintf(out, "Specialization stats:\n"); } - print_stats(out, &_py_stats); + print_stats(out, _py_stats); if (out != stderr) { fclose(out); } @@ -245,8 +260,12 @@ _Py_PrintSpecializationStats(int to_file) #ifdef Py_STATS -#define SPECIALIZATION_FAIL(opcode, kind) _py_stats.opcode_stats[opcode].specialization.failure_kinds[kind]++ - +#define SPECIALIZATION_FAIL(opcode, kind) \ +do { \ + if (_py_stats) { \ + _py_stats->opcode_stats[opcode].specialization.failure_kinds[kind]++; \ + } \ +} while (0) #endif #endif @@ -316,7 +335,7 @@ _PyCode_Quicken(PyCodeObject *code) } static inline int -initial_counter_value(void) { +miss_counter_start(void) { /* Starting value for the counter. * This value needs to be not too low, otherwise * it would cause excessive de-optimization. @@ -328,6 +347,8 @@ initial_counter_value(void) { return 53; } +#define SIMPLE_FUNCTION 0 + /* Common */ #define SPEC_FAIL_OTHER 0 @@ -355,24 +376,13 @@ initial_counter_value(void) { #define SPEC_FAIL_ATTR_NON_STRING_OR_SPLIT 18 #define SPEC_FAIL_ATTR_MODULE_ATTR_NOT_FOUND 19 -/* Methods */ - -#define SPEC_FAIL_LOAD_METHOD_OVERRIDING_DESCRIPTOR 8 -#define SPEC_FAIL_LOAD_METHOD_NON_OVERRIDING_DESCRIPTOR 9 -#define SPEC_FAIL_LOAD_METHOD_NOT_DESCRIPTOR 10 -#define SPEC_FAIL_LOAD_METHOD_METHOD 11 -#define SPEC_FAIL_LOAD_METHOD_MUTABLE_CLASS 12 -#define SPEC_FAIL_LOAD_METHOD_PROPERTY 13 -#define SPEC_FAIL_LOAD_METHOD_NON_OBJECT_SLOT 14 -#define SPEC_FAIL_LOAD_METHOD_IS_ATTR 15 -#define SPEC_FAIL_LOAD_METHOD_DICT_SUBCLASS 16 -#define SPEC_FAIL_LOAD_METHOD_BUILTIN_CLASS_METHOD 17 -#define SPEC_FAIL_LOAD_METHOD_CLASS_METHOD_OBJ 18 -#define SPEC_FAIL_LOAD_METHOD_OBJECT_SLOT 19 -#define SPEC_FAIL_LOAD_METHOD_HAS_DICT 20 -#define SPEC_FAIL_LOAD_METHOD_HAS_MANAGED_DICT 21 -#define SPEC_FAIL_LOAD_METHOD_INSTANCE_ATTRIBUTE 22 -#define SPEC_FAIL_LOAD_METHOD_METACLASS_ATTRIBUTE 23 +#define SPEC_FAIL_ATTR_SHADOWED 21 +#define SPEC_FAIL_ATTR_BUILTIN_CLASS_METHOD 22 +#define SPEC_FAIL_ATTR_CLASS_METHOD_OBJ 23 +#define SPEC_FAIL_ATTR_OBJECT_SLOT 24 +#define SPEC_FAIL_ATTR_HAS_MANAGED_DICT 25 +#define SPEC_FAIL_ATTR_INSTANCE_ATTRIBUTE 26 +#define SPEC_FAIL_ATTR_METACLASS_ATTRIBUTE 27 /* Binary subscr and store subscr */ @@ -472,6 +482,12 @@ initial_counter_value(void) { #define SPEC_FAIL_FOR_ITER_DICT_ITEMS 21 #define SPEC_FAIL_FOR_ITER_DICT_VALUES 22 #define SPEC_FAIL_FOR_ITER_ENUMERATE 23 +#define SPEC_FAIL_FOR_ITER_MAP 24 +#define SPEC_FAIL_FOR_ITER_ZIP 25 +#define SPEC_FAIL_FOR_ITER_SEQ_ITER 26 +#define SPEC_FAIL_FOR_ITER_REVERSED_LIST 27 +#define SPEC_FAIL_FOR_ITER_CALLABLE 28 +#define SPEC_FAIL_FOR_ITER_ASCII_STRING 29 // UNPACK_SEQUENCE @@ -582,7 +598,9 @@ analyze_descriptor(PyTypeObject *type, PyObject *name, PyObject **descr, int sto return DUNDER_CLASS; } } - return OVERRIDING; + if (store) { + return OVERRIDING; + } } if (desc_cls->tp_descr_get) { if (desc_cls->tp_flags & Py_TPFLAGS_METHOD_DESCRIPTOR) { @@ -649,6 +667,11 @@ specialize_dict_access( return 1; } +static int specialize_attr_loadmethod(PyObject* owner, _Py_CODEUNIT* instr, PyObject* name, + PyObject* descr, DescriptorClassification kind); +static int specialize_class_load_attr(PyObject* owner, _Py_CODEUNIT* 
instr, PyObject* name); +static int function_kind(PyCodeObject *code); + int _Py_Specialize_LoadAttr(PyObject *owner, _Py_CODEUNIT *instr, PyObject *name) { @@ -662,24 +685,74 @@ _Py_Specialize_LoadAttr(PyObject *owner, _Py_CODEUNIT *instr, PyObject *name) } goto success; } + if (PyType_Check(owner)) { + int err = specialize_class_load_attr(owner, instr, name); + if (err) { + goto fail; + } + goto success; + } PyTypeObject *type = Py_TYPE(owner); if (type->tp_dict == NULL) { if (PyType_Ready(type) < 0) { return -1; } } - PyObject *descr; + PyObject *descr = NULL; DescriptorClassification kind = analyze_descriptor(type, name, &descr, 0); + assert(descr != NULL || kind == ABSENT || kind == GETSET_OVERRIDDEN); switch(kind) { case OVERRIDING: SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_ATTR_OVERRIDING_DESCRIPTOR); goto fail; case METHOD: + { + int oparg = _Py_OPARG(*instr); + if (oparg & 1) { + if (specialize_attr_loadmethod(owner, instr, name, descr, kind)) { + goto success; + } + } SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_ATTR_METHOD); goto fail; + } case PROPERTY: - SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_ATTR_PROPERTY); - goto fail; + { + _PyLoadMethodCache *lm_cache = (_PyLoadMethodCache *)(instr + 1); + assert(Py_TYPE(descr) == &PyProperty_Type); + PyObject *fget = ((_PyPropertyObject *)descr)->prop_get; + if (fget == NULL) { + SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_EXPECTED_ERROR); + goto fail; + } + if (Py_TYPE(fget) != &PyFunction_Type) { + SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_ATTR_PROPERTY); + goto fail; + } + PyFunctionObject *func = (PyFunctionObject *)fget; + PyCodeObject *fcode = (PyCodeObject *)func->func_code; + int kind = function_kind(fcode); + if (kind != SIMPLE_FUNCTION) { + SPECIALIZATION_FAIL(LOAD_ATTR, kind); + goto fail; + } + if (fcode->co_argcount != 1) { + SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_WRONG_NUMBER_ARGUMENTS); + goto fail; + } + int version = _PyFunction_GetVersionForCurrentState(func); + if (version == 0) { + SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_OUT_OF_VERSIONS); + goto fail; + } + write_u32(lm_cache->keys_version, version); + assert(type->tp_version_tag != 0); + write_u32(lm_cache->type_version, type->tp_version_tag); + /* borrowed */ + write_obj(lm_cache->descr, fget); + _Py_SET_OPCODE(*instr, LOAD_ATTR_PROPERTY); + goto success; + } case OBJECT_SLOT: { PyMemberDescrObject *member = (PyMemberDescrObject *)descr; @@ -738,12 +811,12 @@ _Py_Specialize_LoadAttr(PyObject *owner, _Py_CODEUNIT *instr, PyObject *name) fail: STAT_INC(LOAD_ATTR, failure); assert(!PyErr_Occurred()); - cache->counter = ADAPTIVE_CACHE_BACKOFF; + cache->counter = adaptive_counter_backoff(cache->counter); return 0; success: STAT_INC(LOAD_ATTR, success); assert(!PyErr_Occurred()); - cache->counter = initial_counter_value(); + cache->counter = miss_counter_start(); return 0; } @@ -820,54 +893,54 @@ _Py_Specialize_StoreAttr(PyObject *owner, _Py_CODEUNIT *instr, PyObject *name) fail: STAT_INC(STORE_ATTR, failure); assert(!PyErr_Occurred()); - cache->counter = ADAPTIVE_CACHE_BACKOFF; + cache->counter = adaptive_counter_backoff(cache->counter); return 0; success: STAT_INC(STORE_ATTR, success); assert(!PyErr_Occurred()); - cache->counter = initial_counter_value(); + cache->counter = miss_counter_start(); return 0; } #ifdef Py_STATS static int -load_method_fail_kind(DescriptorClassification kind) +load_attr_fail_kind(DescriptorClassification kind) { switch (kind) { case OVERRIDING: - return SPEC_FAIL_LOAD_METHOD_OVERRIDING_DESCRIPTOR; + return SPEC_FAIL_ATTR_OVERRIDING_DESCRIPTOR; 
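Note the counter handling in the hunks above: initial_counter_value() becomes miss_counter_start() and the fixed ADAPTIVE_CACHE_BACKOFF reset becomes adaptive_counter_backoff(cache->counter). A rough Python model of the counter lifecycle, assuming a doubling backoff with an arbitrary cap (the real formula lives in the adaptive-counter helpers, not in this patch):

    MISS_COUNTER_START = 53       # matches miss_counter_start() above
    MAX_BACKOFF = 64 * 1024       # illustrative cap, not the real constant

    def adaptive_counter_backoff(counter: int) -> int:
        """After a failed specialization attempt, wait roughly twice as long before retrying."""
        return min(max(2 * counter, MISS_COUNTER_START), MAX_BACKOFF)

    counter = MISS_COUNTER_START                   # set again after a successful specialization
    counter = adaptive_counter_backoff(counter)    # 106 after the first failure
    counter = adaptive_counter_backoff(counter)    # 212, then 424, ... on repeated failures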
case METHOD: - return SPEC_FAIL_LOAD_METHOD_METHOD; + return SPEC_FAIL_ATTR_METHOD; case PROPERTY: - return SPEC_FAIL_LOAD_METHOD_PROPERTY; + return SPEC_FAIL_ATTR_PROPERTY; case OBJECT_SLOT: - return SPEC_FAIL_LOAD_METHOD_OBJECT_SLOT; + return SPEC_FAIL_ATTR_OBJECT_SLOT; case OTHER_SLOT: - return SPEC_FAIL_LOAD_METHOD_NON_OBJECT_SLOT; + return SPEC_FAIL_ATTR_NON_OBJECT_SLOT; case DUNDER_CLASS: return SPEC_FAIL_OTHER; case MUTABLE: - return SPEC_FAIL_LOAD_METHOD_MUTABLE_CLASS; + return SPEC_FAIL_ATTR_MUTABLE_CLASS; case GETSET_OVERRIDDEN: return SPEC_FAIL_OVERRIDDEN; case BUILTIN_CLASSMETHOD: - return SPEC_FAIL_LOAD_METHOD_BUILTIN_CLASS_METHOD; + return SPEC_FAIL_ATTR_BUILTIN_CLASS_METHOD; case PYTHON_CLASSMETHOD: - return SPEC_FAIL_LOAD_METHOD_CLASS_METHOD_OBJ; + return SPEC_FAIL_ATTR_CLASS_METHOD_OBJ; case NON_OVERRIDING: - return SPEC_FAIL_LOAD_METHOD_NON_OVERRIDING_DESCRIPTOR; + return SPEC_FAIL_ATTR_NON_OVERRIDING_DESCRIPTOR; case NON_DESCRIPTOR: - return SPEC_FAIL_LOAD_METHOD_NOT_DESCRIPTOR; + return SPEC_FAIL_ATTR_NOT_DESCRIPTOR; case ABSENT: - return SPEC_FAIL_LOAD_METHOD_INSTANCE_ATTRIBUTE; + return SPEC_FAIL_ATTR_INSTANCE_ATTRIBUTE; } Py_UNREACHABLE(); } #endif static int -specialize_class_load_method(PyObject *owner, _Py_CODEUNIT *instr, +specialize_class_load_attr(PyObject *owner, _Py_CODEUNIT *instr, PyObject *name) { _PyLoadMethodCache *cache = (_PyLoadMethodCache *)(instr + 1); @@ -879,20 +952,20 @@ specialize_class_load_method(PyObject *owner, _Py_CODEUNIT *instr, case NON_DESCRIPTOR: write_u32(cache->type_version, ((PyTypeObject *)owner)->tp_version_tag); write_obj(cache->descr, descr); - _Py_SET_OPCODE(*instr, LOAD_METHOD_CLASS); + _Py_SET_OPCODE(*instr, LOAD_ATTR_CLASS); return 0; #ifdef Py_STATS case ABSENT: if (_PyType_Lookup(Py_TYPE(owner), name) != NULL) { - SPECIALIZATION_FAIL(LOAD_METHOD, SPEC_FAIL_LOAD_METHOD_METACLASS_ATTRIBUTE); + SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_ATTR_METACLASS_ATTRIBUTE); } else { - SPECIALIZATION_FAIL(LOAD_METHOD, SPEC_FAIL_EXPECTED_ERROR); + SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_EXPECTED_ERROR); } return -1; #endif default: - SPECIALIZATION_FAIL(LOAD_METHOD, load_method_fail_kind(kind)); + SPECIALIZATION_FAIL(LOAD_ATTR, load_attr_fail_kind(kind)); return -1; } } @@ -901,50 +974,21 @@ typedef enum { MANAGED_VALUES = 1, MANAGED_DICT = 2, OFFSET_DICT = 3, - NO_DICT = 4 + NO_DICT = 4, + LAZY_DICT = 5, } ObjectDictKind; // Please collect stats carefully before and after modifying. A subtle change // can cause a significant drop in cache hits. A possible test is // python.exe -m test_typing test_re test_dis test_zlib. 
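With LOAD_METHOD folded into LOAD_ATTR (the low bit of the oparg now marks a method load), the effect can be observed from Python with dis after warming a function up. A hedged sketch; the specialized names actually printed depend on the interpreter build and version:

    import dis

    class C:
        def ping(self):
            return 1

    def hot(obj, n=1000):
        for _ in range(n):
            obj.ping()            # compiled as LOAD_ATTR with the method bit set

    hot(C())                      # warm up so the adaptive interpreter can specialize
    dis.dis(hot, adaptive=True)   # may show LOAD_ATTR_METHOD_* forms instead of LOAD_ATTR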
-int -_Py_Specialize_LoadMethod(PyObject *owner, _Py_CODEUNIT *instr, PyObject *name) +static int +specialize_attr_loadmethod(PyObject *owner, _Py_CODEUNIT *instr, PyObject *name, +PyObject *descr, DescriptorClassification kind) { - assert(_PyOpcode_Caches[LOAD_METHOD] == INLINE_CACHE_ENTRIES_LOAD_METHOD); _PyLoadMethodCache *cache = (_PyLoadMethodCache *)(instr + 1); PyTypeObject *owner_cls = Py_TYPE(owner); - if (PyModule_CheckExact(owner)) { - assert(INLINE_CACHE_ENTRIES_LOAD_ATTR <= - INLINE_CACHE_ENTRIES_LOAD_METHOD); - int err = specialize_module_load_attr(owner, instr, name, LOAD_METHOD, - LOAD_METHOD_MODULE); - if (err) { - goto fail; - } - goto success; - } - if (owner_cls->tp_dict == NULL) { - if (PyType_Ready(owner_cls) < 0) { - return -1; - } - } - if (PyType_Check(owner)) { - int err = specialize_class_load_method(owner, instr, name); - if (err) { - goto fail; - } - goto success; - } - - PyObject *descr = NULL; - DescriptorClassification kind = 0; - kind = analyze_descriptor(owner_cls, name, &descr, 0); - assert(descr != NULL || kind == ABSENT || kind == GETSET_OVERRIDDEN); - if (kind != METHOD) { - SPECIALIZATION_FAIL(LOAD_METHOD, load_method_fail_kind(kind)); - goto fail; - } + assert(kind == METHOD && descr != NULL); ObjectDictKind dictkind; PyDictKeysObject *keys; if (owner_cls->tp_flags & Py_TPFLAGS_MANAGED_DICT) { @@ -960,7 +1004,7 @@ _Py_Specialize_LoadMethod(PyObject *owner, _Py_CODEUNIT *instr, PyObject *name) else { Py_ssize_t dictoffset = owner_cls->tp_dictoffset; if (dictoffset < 0 || dictoffset > INT16_MAX) { - SPECIALIZATION_FAIL(LOAD_METHOD, SPEC_FAIL_OUT_OF_RANGE); + SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_OUT_OF_RANGE); goto fail; } if (dictoffset == 0) { @@ -970,41 +1014,46 @@ _Py_Specialize_LoadMethod(PyObject *owner, _Py_CODEUNIT *instr, PyObject *name) else { PyObject *dict = *(PyObject **) ((char *)owner + dictoffset); if (dict == NULL) { - SPECIALIZATION_FAIL(LOAD_METHOD, SPEC_FAIL_NO_DICT); - goto fail; + // This object will have a dict if user access __dict__ + dictkind = LAZY_DICT; + keys = NULL; + } + else { + keys = ((PyDictObject *)dict)->ma_keys; + dictkind = OFFSET_DICT; } - keys = ((PyDictObject *)dict)->ma_keys; - dictkind = OFFSET_DICT; } } - if (dictkind != NO_DICT) { + if (dictkind == MANAGED_VALUES || dictkind == OFFSET_DICT) { Py_ssize_t index = _PyDictKeys_StringLookup(keys, name); if (index != DKIX_EMPTY) { - SPECIALIZATION_FAIL(LOAD_METHOD, SPEC_FAIL_LOAD_METHOD_IS_ATTR); + SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_ATTR_SHADOWED); goto fail; } uint32_t keys_version = _PyDictKeys_GetVersionForCurrentState(keys); if (keys_version == 0) { - SPECIALIZATION_FAIL(LOAD_METHOD, SPEC_FAIL_OUT_OF_VERSIONS); + SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_OUT_OF_VERSIONS); goto fail; } write_u32(cache->keys_version, keys_version); } switch(dictkind) { case NO_DICT: - _Py_SET_OPCODE(*instr, LOAD_METHOD_NO_DICT); + _Py_SET_OPCODE(*instr, LOAD_ATTR_METHOD_NO_DICT); break; case MANAGED_VALUES: - _Py_SET_OPCODE(*instr, LOAD_METHOD_WITH_VALUES); + _Py_SET_OPCODE(*instr, LOAD_ATTR_METHOD_WITH_VALUES); break; case MANAGED_DICT: - *(int16_t *)&cache->dict_offset = (int16_t)MANAGED_DICT_OFFSET; - _Py_SET_OPCODE(*instr, LOAD_METHOD_WITH_DICT); - break; + SPECIALIZATION_FAIL(LOAD_ATTR, SPEC_FAIL_ATTR_HAS_MANAGED_DICT); + goto fail; case OFFSET_DICT: assert(owner_cls->tp_dictoffset > 0 && owner_cls->tp_dictoffset <= INT16_MAX); - cache->dict_offset = (uint16_t)owner_cls->tp_dictoffset; - _Py_SET_OPCODE(*instr, LOAD_METHOD_WITH_DICT); + _Py_SET_OPCODE(*instr, 
LOAD_ATTR_METHOD_WITH_DICT); + break; + case LAZY_DICT: + assert(owner_cls->tp_dictoffset > 0 && owner_cls->tp_dictoffset <= INT16_MAX); + _Py_SET_OPCODE(*instr, LOAD_ATTR_METHOD_LAZY_DICT); break; } /* `descr` is borrowed. This is safe for methods (even inherited ones from @@ -1024,17 +1073,9 @@ _Py_Specialize_LoadMethod(PyObject *owner, _Py_CODEUNIT *instr, PyObject *name) write_u32(cache->type_version, owner_cls->tp_version_tag); write_obj(cache->descr, descr); // Fall through. -success: - STAT_INC(LOAD_METHOD, success); - assert(!PyErr_Occurred()); - cache->counter = initial_counter_value(); - return 0; + return 1; fail: - STAT_INC(LOAD_METHOD, failure); - assert(!PyErr_Occurred()); - cache->counter = ADAPTIVE_CACHE_BACKOFF; return 0; - } int @@ -1110,12 +1151,12 @@ _Py_Specialize_LoadGlobal( fail: STAT_INC(LOAD_GLOBAL, failure); assert(!PyErr_Occurred()); - cache->counter = ADAPTIVE_CACHE_BACKOFF; + cache->counter = adaptive_counter_backoff(cache->counter); return 0; success: STAT_INC(LOAD_GLOBAL, success); assert(!PyErr_Occurred()); - cache->counter = initial_counter_value(); + cache->counter = miss_counter_start(); return 0; } @@ -1159,9 +1200,6 @@ binary_subscr_fail_kind(PyTypeObject *container_type, PyObject *sub) } #endif - -#define SIMPLE_FUNCTION 0 - static int function_kind(PyCodeObject *code) { int flags = code->co_flags; @@ -1226,7 +1264,8 @@ _Py_Specialize_BinarySubscr( write_u32(cache->type_version, cls->tp_version_tag); int version = _PyFunction_GetVersionForCurrentState(func); if (version == 0 || version != (uint16_t)version) { - SPECIALIZATION_FAIL(BINARY_SUBSCR, SPEC_FAIL_OUT_OF_VERSIONS); + SPECIALIZATION_FAIL(BINARY_SUBSCR, version == 0 ? + SPEC_FAIL_OUT_OF_VERSIONS : SPEC_FAIL_OUT_OF_RANGE); goto fail; } cache->func_version = version; @@ -1239,12 +1278,12 @@ _Py_Specialize_BinarySubscr( fail: STAT_INC(BINARY_SUBSCR, failure); assert(!PyErr_Occurred()); - cache->counter = ADAPTIVE_CACHE_BACKOFF; + cache->counter = adaptive_counter_backoff(cache->counter); return 0; success: STAT_INC(BINARY_SUBSCR, success); assert(!PyErr_Occurred()); - cache->counter = initial_counter_value(); + cache->counter = miss_counter_start(); return 0; } @@ -1343,49 +1382,50 @@ _Py_Specialize_StoreSubscr(PyObject *container, PyObject *sub, _Py_CODEUNIT *ins fail: STAT_INC(STORE_SUBSCR, failure); assert(!PyErr_Occurred()); - cache->counter = ADAPTIVE_CACHE_BACKOFF; + cache->counter = adaptive_counter_backoff(cache->counter); return 0; success: STAT_INC(STORE_SUBSCR, success); assert(!PyErr_Occurred()); - cache->counter = initial_counter_value(); + cache->counter = miss_counter_start(); return 0; } static int specialize_class_call(PyObject *callable, _Py_CODEUNIT *instr, int nargs, - PyObject *kwnames, int oparg) + PyObject *kwnames) { - assert(_Py_OPCODE(*instr) == PRECALL_ADAPTIVE); + assert(_Py_OPCODE(*instr) == CALL_ADAPTIVE); PyTypeObject *tp = _PyType_CAST(callable); if (tp->tp_new == PyBaseObject_Type.tp_new) { - SPECIALIZATION_FAIL(PRECALL, SPEC_FAIL_CALL_PYTHON_CLASS); + SPECIALIZATION_FAIL(CALL, SPEC_FAIL_CALL_PYTHON_CLASS); return -1; } if (tp->tp_flags & Py_TPFLAGS_IMMUTABLETYPE) { + int oparg = _Py_OPARG(*instr); if (nargs == 1 && kwnames == NULL && oparg == 1) { if (tp == &PyUnicode_Type) { - _Py_SET_OPCODE(*instr, PRECALL_NO_KW_STR_1); + _Py_SET_OPCODE(*instr, CALL_NO_KW_STR_1); return 0; } else if (tp == &PyType_Type) { - _Py_SET_OPCODE(*instr, PRECALL_NO_KW_TYPE_1); + _Py_SET_OPCODE(*instr, CALL_NO_KW_TYPE_1); return 0; } else if (tp == &PyTuple_Type) { - _Py_SET_OPCODE(*instr, 
PRECALL_NO_KW_TUPLE_1); + _Py_SET_OPCODE(*instr, CALL_NO_KW_TUPLE_1); return 0; } } if (tp->tp_vectorcall != NULL) { - _Py_SET_OPCODE(*instr, PRECALL_BUILTIN_CLASS); + _Py_SET_OPCODE(*instr, CALL_BUILTIN_CLASS); return 0; } - SPECIALIZATION_FAIL(PRECALL, tp == &PyUnicode_Type ? + SPECIALIZATION_FAIL(CALL, tp == &PyUnicode_Type ? SPEC_FAIL_CALL_STR : SPEC_FAIL_CALL_CLASS_NO_VECTORCALL); return -1; } - SPECIALIZATION_FAIL(PRECALL, SPEC_FAIL_CALL_CLASS_MUTABLE); + SPECIALIZATION_FAIL(CALL, SPEC_FAIL_CALL_CLASS_MUTABLE); return -1; } @@ -1415,11 +1455,11 @@ builtin_call_fail_kind(int ml_flags) static int specialize_method_descriptor(PyMethodDescrObject *descr, _Py_CODEUNIT *instr, - int nargs, PyObject *kwnames, int oparg) + int nargs, PyObject *kwnames) { - assert(_Py_OPCODE(*instr) == PRECALL_ADAPTIVE); + assert(_Py_OPCODE(*instr) == CALL_ADAPTIVE); if (kwnames) { - SPECIALIZATION_FAIL(PRECALL, SPEC_FAIL_CALL_KWNAMES); + SPECIALIZATION_FAIL(CALL, SPEC_FAIL_CALL_KWNAMES); return -1; } @@ -1428,45 +1468,45 @@ specialize_method_descriptor(PyMethodDescrObject *descr, _Py_CODEUNIT *instr, METH_KEYWORDS | METH_METHOD)) { case METH_NOARGS: { if (nargs != 1) { - SPECIALIZATION_FAIL(PRECALL, SPEC_FAIL_WRONG_NUMBER_ARGUMENTS); + SPECIALIZATION_FAIL(CALL, SPEC_FAIL_WRONG_NUMBER_ARGUMENTS); return -1; } - _Py_SET_OPCODE(*instr, PRECALL_NO_KW_METHOD_DESCRIPTOR_NOARGS); + _Py_SET_OPCODE(*instr, CALL_NO_KW_METHOD_DESCRIPTOR_NOARGS); return 0; } case METH_O: { if (nargs != 2) { - SPECIALIZATION_FAIL(PRECALL, SPEC_FAIL_WRONG_NUMBER_ARGUMENTS); + SPECIALIZATION_FAIL(CALL, SPEC_FAIL_WRONG_NUMBER_ARGUMENTS); return -1; } PyInterpreterState *interp = _PyInterpreterState_GET(); PyObject *list_append = interp->callable_cache.list_append; - _Py_CODEUNIT next = instr[INLINE_CACHE_ENTRIES_PRECALL + 1 - + INLINE_CACHE_ENTRIES_CALL + 1]; + _Py_CODEUNIT next = instr[INLINE_CACHE_ENTRIES_CALL + 1]; bool pop = (_Py_OPCODE(next) == POP_TOP); + int oparg = _Py_OPARG(*instr); if ((PyObject *)descr == list_append && oparg == 1 && pop) { - _Py_SET_OPCODE(*instr, PRECALL_NO_KW_LIST_APPEND); + _Py_SET_OPCODE(*instr, CALL_NO_KW_LIST_APPEND); return 0; } - _Py_SET_OPCODE(*instr, PRECALL_NO_KW_METHOD_DESCRIPTOR_O); + _Py_SET_OPCODE(*instr, CALL_NO_KW_METHOD_DESCRIPTOR_O); return 0; } case METH_FASTCALL: { - _Py_SET_OPCODE(*instr, PRECALL_NO_KW_METHOD_DESCRIPTOR_FAST); + _Py_SET_OPCODE(*instr, CALL_NO_KW_METHOD_DESCRIPTOR_FAST); return 0; } case METH_FASTCALL|METH_KEYWORDS: { - _Py_SET_OPCODE(*instr, PRECALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS); + _Py_SET_OPCODE(*instr, CALL_METHOD_DESCRIPTOR_FAST_WITH_KEYWORDS); return 0; } } - SPECIALIZATION_FAIL(PRECALL, builtin_call_fail_kind(descr->d_method->ml_flags)); + SPECIALIZATION_FAIL(CALL, builtin_call_fail_kind(descr->d_method->ml_flags)); return -1; } static int specialize_py_call(PyFunctionObject *func, _Py_CODEUNIT *instr, int nargs, - PyObject *kwnames) + PyObject *kwnames, bool bound_method) { _PyCallCache *cache = (_PyCallCache *)(instr + 1); assert(_Py_OPCODE(*instr) == CALL_ADAPTIVE); @@ -1508,7 +1548,11 @@ specialize_py_call(PyFunctionObject *func, _Py_CODEUNIT *instr, int nargs, write_u32(cache->func_version, version); cache->min_args = min_args; if (argcount == nargs) { - _Py_SET_OPCODE(*instr, CALL_PY_EXACT_ARGS); + _Py_SET_OPCODE(*instr, bound_method ? 
CALL_BOUND_METHOD_EXACT_ARGS : CALL_PY_EXACT_ARGS); + } + else if (bound_method) { + SPECIALIZATION_FAIL(CALL, SPEC_FAIL_CALL_BOUND_METHOD); + return -1; } else { _Py_SET_OPCODE(*instr, CALL_PY_WITH_DEFAULTS); @@ -1520,7 +1564,7 @@ static int specialize_c_call(PyObject *callable, _Py_CODEUNIT *instr, int nargs, PyObject *kwnames) { - assert(_Py_OPCODE(*instr) == PRECALL_ADAPTIVE); + assert(_Py_OPCODE(*instr) == CALL_ADAPTIVE); if (PyCFunction_GET_FUNCTION(callable) == NULL) { return 1; } @@ -1529,44 +1573,44 @@ specialize_c_call(PyObject *callable, _Py_CODEUNIT *instr, int nargs, METH_KEYWORDS | METH_METHOD)) { case METH_O: { if (kwnames) { - SPECIALIZATION_FAIL(PRECALL, SPEC_FAIL_CALL_KWNAMES); + SPECIALIZATION_FAIL(CALL, SPEC_FAIL_CALL_KWNAMES); return -1; } if (nargs != 1) { - SPECIALIZATION_FAIL(PRECALL, SPEC_FAIL_WRONG_NUMBER_ARGUMENTS); + SPECIALIZATION_FAIL(CALL, SPEC_FAIL_WRONG_NUMBER_ARGUMENTS); return 1; } /* len(o) */ PyInterpreterState *interp = _PyInterpreterState_GET(); if (callable == interp->callable_cache.len) { - _Py_SET_OPCODE(*instr, PRECALL_NO_KW_LEN); + _Py_SET_OPCODE(*instr, CALL_NO_KW_LEN); return 0; } - _Py_SET_OPCODE(*instr, PRECALL_NO_KW_BUILTIN_O); + _Py_SET_OPCODE(*instr, CALL_NO_KW_BUILTIN_O); return 0; } case METH_FASTCALL: { if (kwnames) { - SPECIALIZATION_FAIL(PRECALL, SPEC_FAIL_CALL_KWNAMES); + SPECIALIZATION_FAIL(CALL, SPEC_FAIL_CALL_KWNAMES); return -1; } if (nargs == 2) { /* isinstance(o1, o2) */ PyInterpreterState *interp = _PyInterpreterState_GET(); if (callable == interp->callable_cache.isinstance) { - _Py_SET_OPCODE(*instr, PRECALL_NO_KW_ISINSTANCE); + _Py_SET_OPCODE(*instr, CALL_NO_KW_ISINSTANCE); return 0; } } - _Py_SET_OPCODE(*instr, PRECALL_NO_KW_BUILTIN_FAST); + _Py_SET_OPCODE(*instr, CALL_NO_KW_BUILTIN_FAST); return 0; } case METH_FASTCALL | METH_KEYWORDS: { - _Py_SET_OPCODE(*instr, PRECALL_BUILTIN_FAST_WITH_KEYWORDS); + _Py_SET_OPCODE(*instr, CALL_BUILTIN_FAST_WITH_KEYWORDS); return 0; } default: - SPECIALIZATION_FAIL(PRECALL, + SPECIALIZATION_FAIL(CALL, builtin_call_fail_kind(PyCFunction_GET_FLAGS(callable))); return 1; } @@ -1614,62 +1658,39 @@ call_fail_kind(PyObject *callable) #endif +/* TODO: + - Specialize calling classes. 
+*/ int -_Py_Specialize_Precall(PyObject *callable, _Py_CODEUNIT *instr, int nargs, - PyObject *kwnames, int oparg) +_Py_Specialize_Call(PyObject *callable, _Py_CODEUNIT *instr, int nargs, + PyObject *kwnames) { - assert(_PyOpcode_Caches[PRECALL] == INLINE_CACHE_ENTRIES_PRECALL); - _PyPrecallCache *cache = (_PyPrecallCache *)(instr + 1); + assert(_PyOpcode_Caches[CALL] == INLINE_CACHE_ENTRIES_CALL); + _PyCallCache *cache = (_PyCallCache *)(instr + 1); int fail; if (PyCFunction_CheckExact(callable)) { fail = specialize_c_call(callable, instr, nargs, kwnames); } else if (PyFunction_Check(callable)) { - _Py_SET_OPCODE(*instr, PRECALL_PYFUNC); - fail = 0; + fail = specialize_py_call((PyFunctionObject *)callable, instr, nargs, + kwnames, false); } else if (PyType_Check(callable)) { - fail = specialize_class_call(callable, instr, nargs, kwnames, oparg); + fail = specialize_class_call(callable, instr, nargs, kwnames); } else if (Py_IS_TYPE(callable, &PyMethodDescr_Type)) { fail = specialize_method_descriptor((PyMethodDescrObject *)callable, - instr, nargs, kwnames, oparg); + instr, nargs, kwnames); } else if (Py_TYPE(callable) == &PyMethod_Type) { - _Py_SET_OPCODE(*instr, PRECALL_BOUND_METHOD); - fail = 0; - } - else { - SPECIALIZATION_FAIL(PRECALL, call_fail_kind(callable)); - fail = -1; - } - if (fail) { - STAT_INC(PRECALL, failure); - assert(!PyErr_Occurred()); - cache->counter = ADAPTIVE_CACHE_BACKOFF; - } - else { - STAT_INC(PRECALL, success); - assert(!PyErr_Occurred()); - cache->counter = initial_counter_value(); - } - return 0; -} - - -/* TODO: - - Specialize calling classes. -*/ -int -_Py_Specialize_Call(PyObject *callable, _Py_CODEUNIT *instr, int nargs, - PyObject *kwnames) -{ - assert(_PyOpcode_Caches[CALL] == INLINE_CACHE_ENTRIES_CALL); - _PyCallCache *cache = (_PyCallCache *)(instr + 1); - int fail; - if (PyFunction_Check(callable)) { - fail = specialize_py_call((PyFunctionObject *)callable, instr, nargs, - kwnames); + PyObject *func = ((PyMethodObject *)callable)->im_func; + if (PyFunction_Check(func)) { + fail = specialize_py_call((PyFunctionObject *)func, + instr, nargs+1, kwnames, true); + } else { + SPECIALIZATION_FAIL(CALL, SPEC_FAIL_CALL_BOUND_METHOD); + fail = -1; + } } else { SPECIALIZATION_FAIL(CALL, call_fail_kind(callable)); @@ -1678,12 +1699,12 @@ _Py_Specialize_Call(PyObject *callable, _Py_CODEUNIT *instr, int nargs, if (fail) { STAT_INC(CALL, failure); assert(!PyErr_Occurred()); - cache->counter = ADAPTIVE_CACHE_BACKOFF; + cache->counter = adaptive_counter_backoff(cache->counter); } else { STAT_INC(CALL, success); assert(!PyErr_Occurred()); - cache->counter = initial_counter_value(); + cache->counter = miss_counter_start(); } return 0; } @@ -1831,11 +1852,11 @@ _Py_Specialize_BinaryOp(PyObject *lhs, PyObject *rhs, _Py_CODEUNIT *instr, } SPECIALIZATION_FAIL(BINARY_OP, binary_op_fail_kind(oparg, lhs, rhs)); STAT_INC(BINARY_OP, failure); - cache->counter = ADAPTIVE_CACHE_BACKOFF; + cache->counter = adaptive_counter_backoff(cache->counter); return; success: STAT_INC(BINARY_OP, success); - cache->counter = initial_counter_value(); + cache->counter = miss_counter_start(); } @@ -1905,13 +1926,14 @@ _Py_Specialize_CompareOp(PyObject *lhs, PyObject *rhs, _Py_CODEUNIT *instr, #ifndef Py_STATS _Py_SET_OPCODE(*instr, COMPARE_OP); return; -#endif +#else if (next_opcode == EXTENDED_ARG) { SPECIALIZATION_FAIL(COMPARE_OP, SPEC_FAIL_COMPARE_OP_EXTENDED_ARG); goto failure; } SPECIALIZATION_FAIL(COMPARE_OP, SPEC_FAIL_COMPARE_OP_NOT_FOLLOWED_BY_COND_JUMP); goto failure; +#endif } 
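_Py_Specialize_Precall() is gone: classes, method descriptors, C functions and bound methods are all handled by _Py_Specialize_Call() now, and a bound method whose underlying Python function takes exactly the supplied number of arguments is rewritten to CALL_BOUND_METHOD_EXACT_ARGS. A hedged illustration of the call shapes involved; whether a given site actually specializes is decided by the C code above, not by anything visible at the Python level:

    class Greeter:
        def greet(self, name):
            return f"hi {name}"

    g = Greeter()

    g.greet("world")          # bound method, exact positional args:
                              # eligible for CALL_BOUND_METHOD_EXACT_ARGS after warm-up
    len("abc")                # METH_O builtin: CALL_NO_KW_BUILTIN_O path
    isinstance(g, Greeter)    # two-argument fast call: CALL_NO_KW_ISINSTANCE path
    g.greet(name="world")     # keyword arguments: SPEC_FAIL_CALL_KWNAMES, stays generic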
assert(oparg <= Py_GE); int when_to_jump_mask = compare_masks[oparg]; @@ -1957,11 +1979,11 @@ _Py_Specialize_CompareOp(PyObject *lhs, PyObject *rhs, _Py_CODEUNIT *instr, SPECIALIZATION_FAIL(COMPARE_OP, compare_op_fail_kind(lhs, rhs)); failure: STAT_INC(COMPARE_OP, failure); - cache->counter = ADAPTIVE_CACHE_BACKOFF; + cache->counter = adaptive_counter_backoff(cache->counter); return; success: STAT_INC(COMPARE_OP, success); - cache->counter = initial_counter_value(); + cache->counter = miss_counter_start(); } #ifdef Py_STATS @@ -2007,11 +2029,11 @@ _Py_Specialize_UnpackSequence(PyObject *seq, _Py_CODEUNIT *instr, int oparg) SPECIALIZATION_FAIL(UNPACK_SEQUENCE, unpack_sequence_fail_kind(seq)); failure: STAT_INC(UNPACK_SEQUENCE, failure); - cache->counter = ADAPTIVE_CACHE_BACKOFF; + cache->counter = adaptive_counter_backoff(cache->counter); return; success: STAT_INC(UNPACK_SEQUENCE, success); - cache->counter = initial_counter_value(); + cache->counter = miss_counter_start(); } #ifdef Py_STATS @@ -2059,11 +2081,59 @@ int if (t == &PyEnum_Type) { return SPEC_FAIL_FOR_ITER_ENUMERATE; } - - if (strncmp(t->tp_name, "itertools", 8) == 0) { + if (t == &PyMap_Type) { + return SPEC_FAIL_FOR_ITER_MAP; + } + if (t == &PyZip_Type) { + return SPEC_FAIL_FOR_ITER_ZIP; + } + if (t == &PySeqIter_Type) { + return SPEC_FAIL_FOR_ITER_SEQ_ITER; + } + if (t == &PyListRevIter_Type) { + return SPEC_FAIL_FOR_ITER_REVERSED_LIST; + } + if (t == &_PyUnicodeASCIIIter_Type) { + return SPEC_FAIL_FOR_ITER_ASCII_STRING; + } + const char *name = t->tp_name; + if (strncmp(name, "itertools", 9) == 0) { return SPEC_FAIL_FOR_ITER_ITERTOOLS; } + if (strncmp(name, "callable_iterator", 17) == 0) { + return SPEC_FAIL_FOR_ITER_CALLABLE; + } return SPEC_FAIL_OTHER; } #endif + +void +_Py_Specialize_ForIter(PyObject *iter, _Py_CODEUNIT *instr) +{ + assert(_PyOpcode_Caches[FOR_ITER] == INLINE_CACHE_ENTRIES_FOR_ITER); + _PyForIterCache *cache = (_PyForIterCache *)(instr + 1); + PyTypeObject *tp = Py_TYPE(iter); + _Py_CODEUNIT next = instr[1+INLINE_CACHE_ENTRIES_FOR_ITER]; + int next_op = _PyOpcode_Deopt[_Py_OPCODE(next)]; + if (tp == &PyListIter_Type) { + _Py_SET_OPCODE(*instr, FOR_ITER_LIST); + goto success; + } + else if (tp == &PyRangeIter_Type && next_op == STORE_FAST) { + _Py_SET_OPCODE(*instr, FOR_ITER_RANGE); + goto success; + } + else { + SPECIALIZATION_FAIL(FOR_ITER, + _PySpecialization_ClassifyIterator(iter)); + goto failure; + } +failure: + STAT_INC(FOR_ITER, failure); + cache->counter = adaptive_counter_backoff(cache->counter); + return; +success: + STAT_INC(FOR_ITER, success); + cache->counter = miss_counter_start(); +} diff --git a/Python/suggestions.c b/Python/suggestions.c index b84acaaed6b1fd..c336ec8ffffc9c 100644 --- a/Python/suggestions.c +++ b/Python/suggestions.c @@ -1,5 +1,4 @@ #include "Python.h" -#include "frameobject.h" #include "pycore_frame.h" #include "pycore_pyerrors.h" diff --git a/Python/sysmodule.c b/Python/sysmodule.c index 4f8b4cc17f2c1a..444042f82d9473 100644 --- a/Python/sysmodule.c +++ b/Python/sysmodule.c @@ -31,7 +31,7 @@ Data members: #include "pycore_structseq.h" // _PyStructSequence_InitType() #include "pycore_tuple.h" // _PyTuple_FromArray() -#include "frameobject.h" // PyFrame_GetBack() +#include "frameobject.h" // PyFrame_FastToLocalsWithError() #include "pydtrace.h" #include "osdefs.h" // DELIM #include "stdlib_module_names.h" // _Py_stdlib_module_names @@ -292,11 +292,7 @@ _PySys_Audit(PyThreadState *tstate, const char *event, const char *argFormat, ...) 
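The new _Py_Specialize_ForIter() above only rewrites two iterator shapes so far: list iterators (FOR_ITER_LIST) and range iterators whose loop variable lands in a STORE_FAST (FOR_ITER_RANGE); every other iterator type is merely classified for the failure statistics. A small, hedged illustration of loops that do and do not match those shapes:

    def over_list(items):
        total = 0
        for x in items:          # list iterator (when items is a list): FOR_ITER_LIST candidate
            total += x
        return total

    def over_range(n):
        total = 0
        for i in range(n):       # FOR_ITER followed by STORE_FAST i: FOR_ITER_RANGE candidate
            total += i
        return total

    def over_zip(a, b):
        total = 0
        for x, y in zip(a, b):   # zip iterator: recorded as SPEC_FAIL_FOR_ITER_ZIP for now
            total += x * y
        return total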
{ va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, argFormat); -#else - va_start(vargs); -#endif int res = sys_audit_tstate(tstate, event, argFormat, vargs); va_end(vargs); return res; @@ -307,11 +303,7 @@ PySys_Audit(const char *event, const char *argFormat, ...) { PyThreadState *tstate = _PyThreadState_GET(); va_list vargs; -#ifdef HAVE_STDARG_PROTOTYPES va_start(vargs, argFormat); -#else - va_start(vargs); -#endif int res = sys_audit_tstate(tstate, event, argFormat, vargs); va_end(vargs); return res; @@ -1917,6 +1909,66 @@ sys_is_finalizing_impl(PyObject *module) return PyBool_FromLong(_Py_IsFinalizing()); } +#ifdef Py_STATS +/*[clinic input] +sys._stats_on + +Turns on stats gathering (stats gathering is on by default). +[clinic start generated code]*/ + +static PyObject * +sys__stats_on_impl(PyObject *module) +/*[clinic end generated code: output=aca53eafcbb4d9fe input=8ddc6df94e484f3a]*/ +{ + _py_stats = &_py_stats_struct; + Py_RETURN_NONE; +} + +/*[clinic input] +sys._stats_off + +Turns off stats gathering (stats gathering is on by default). +[clinic start generated code]*/ + +static PyObject * +sys__stats_off_impl(PyObject *module) +/*[clinic end generated code: output=1534c1ee63812214 input=b3e50e71ecf29f66]*/ +{ + _py_stats = NULL; + Py_RETURN_NONE; +} + +/*[clinic input] +sys._stats_clear + +Clears the stats. +[clinic start generated code]*/ + +static PyObject * +sys__stats_clear_impl(PyObject *module) +/*[clinic end generated code: output=fb65a2525ee50604 input=3e03f2654f44da96]*/ +{ + _Py_StatsClear(); + Py_RETURN_NONE; +} + +/*[clinic input] +sys._stats_dump + +Dump stats to file, and clears the stats. +[clinic start generated code]*/ + +static PyObject * +sys__stats_dump_impl(PyObject *module) +/*[clinic end generated code: output=79f796fb2b4ddf05 input=92346f16d64f6f95]*/ +{ + _Py_PrintSpecializationStats(1); + _Py_StatsClear(); + Py_RETURN_NONE; +} + +#endif + #ifdef ANDROID_API_LEVEL /*[clinic input] sys.getandroidapilevel @@ -1986,6 +2038,12 @@ static PyMethodDef sys_methods[] = { SYS_GET_ASYNCGEN_HOOKS_METHODDEF SYS_GETANDROIDAPILEVEL_METHODDEF SYS_UNRAISABLEHOOK_METHODDEF +#ifdef Py_STATS + SYS__STATS_ON_METHODDEF + SYS__STATS_OFF_METHODDEF + SYS__STATS_CLEAR_METHODDEF + SYS__STATS_DUMP_METHODDEF +#endif {NULL, NULL} // sentinel }; diff --git a/Python/thread_pthread.h b/Python/thread_pthread.h index 2237018e9cbc2c..c310d72abd2d58 100644 --- a/Python/thread_pthread.h +++ b/Python/thread_pthread.h @@ -23,6 +23,8 @@ # include /* thread_self() */ #elif defined(__NetBSD__) # include /* _lwp_self() */ +#elif defined(__DragonFly__) +# include /* lwp_gettid() */ #endif /* The POSIX spec requires that use of pthread_attr_setstacksize @@ -111,19 +113,6 @@ #endif -#define MICROSECONDS_TO_TIMESPEC(microseconds, ts) \ -do { \ - struct timeval tv; \ - gettimeofday(&tv, NULL); \ - tv.tv_usec += microseconds % 1000000; \ - tv.tv_sec += microseconds / 1000000; \ - tv.tv_sec += tv.tv_usec / 1000000; \ - tv.tv_usec %= 1000000; \ - ts.tv_sec = tv.tv_sec; \ - ts.tv_nsec = tv.tv_usec * 1000; \ -} while(0) - - /* * pthread_cond support */ @@ -154,23 +143,23 @@ _PyThread_cond_init(PyCOND_T *cond) return pthread_cond_init(cond, condattr_monotonic); } + void _PyThread_cond_after(long long us, struct timespec *abs) { + _PyTime_t timeout = _PyTime_FromMicrosecondsClamp(us); + _PyTime_t t; #ifdef CONDATTR_MONOTONIC if (condattr_monotonic) { - clock_gettime(CLOCK_MONOTONIC, abs); - abs->tv_sec += us / 1000000; - abs->tv_nsec += (us % 1000000) * 1000; - abs->tv_sec += abs->tv_nsec / 
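The four sys._stats_* clinic entries above expose the Py_STATS machinery at run time. On an interpreter built with Py_STATS defined they could be driven like this (a sketch; these helpers are private and absent from normal builds, and prepare_workload()/run_workload() are hypothetical placeholders):

    import sys

    if hasattr(sys, "_stats_clear"):      # only present in Py_STATS builds
        sys._stats_off()                  # stop counting while setting up
        prepare_workload()                # hypothetical setup
        sys._stats_clear()                # drop anything gathered so far
        sys._stats_on()
        run_workload()                    # hypothetical code under measurement
        sys._stats_dump()                 # write the stats to file and clear them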
1000000000; - abs->tv_nsec %= 1000000000; - return; + t = _PyTime_GetMonotonicClock(); } + else #endif - - struct timespec ts; - MICROSECONDS_TO_TIMESPEC(us, ts); - *abs = ts; + { + t = _PyTime_GetSystemClock(); + } + t = _PyTime_Add(t, timeout); + _PyTime_AsTimespec_clamp(t, abs); } @@ -349,6 +338,9 @@ PyThread_get_thread_native_id(void) #elif defined(__NetBSD__) lwpid_t native_id; native_id = _lwp_self(); +#elif defined(__DragonFly__) + lwpid_t native_id; + native_id = lwp_gettid(); #endif return (unsigned long) native_id; } @@ -433,22 +425,15 @@ PyThread_acquire_lock_timed(PyThread_type_lock lock, PY_TIMEOUT_T microseconds, _PyTime_t timeout; // relative timeout if (microseconds >= 0) { - _PyTime_t ns; - if (microseconds <= _PyTime_MAX / 1000) { - ns = microseconds * 1000; - } - else { - // bpo-41710: PyThread_acquire_lock_timed() cannot report timeout - // overflow to the caller, so clamp the timeout to - // [_PyTime_MIN, _PyTime_MAX]. - // - // _PyTime_MAX nanoseconds is around 292.3 years. - // - // _thread.Lock.acquire() and _thread.RLock.acquire() raise an - // OverflowError if microseconds is greater than PY_TIMEOUT_MAX. - ns = _PyTime_MAX; - } - timeout = _PyTime_FromNanoseconds(ns); + // bpo-41710: PyThread_acquire_lock_timed() cannot report timeout + // overflow to the caller, so clamp the timeout to + // [_PyTime_MIN, _PyTime_MAX]. + // + // _PyTime_MAX nanoseconds is around 292.3 years. + // + // _thread.Lock.acquire() and _thread.RLock.acquire() raise an + // OverflowError if microseconds is greater than PY_TIMEOUT_MAX. + timeout = _PyTime_FromMicrosecondsClamp(microseconds); } else { timeout = _PyTime_FromNanoseconds(-1); @@ -499,7 +484,7 @@ PyThread_acquire_lock_timed(PyThread_type_lock lock, PY_TIMEOUT_T microseconds, #ifndef HAVE_SEM_CLOCKWAIT if (timeout > 0) { /* wait interrupted by a signal (EINTR): recompute the timeout */ - _PyTime_t timeout = _PyDeadline_Get(deadline); + timeout = _PyDeadline_Get(deadline); if (timeout < 0) { status = ETIMEDOUT; break; @@ -621,63 +606,79 @@ PyThread_acquire_lock_timed(PyThread_type_lock lock, PY_TIMEOUT_T microseconds, if (microseconds == 0) { status = pthread_mutex_trylock( &thelock->mut ); - if (status != EBUSY) + if (status != EBUSY) { CHECK_STATUS_PTHREAD("pthread_mutex_trylock[1]"); + } } else { status = pthread_mutex_lock( &thelock->mut ); CHECK_STATUS_PTHREAD("pthread_mutex_lock[1]"); } - if (status == 0) { - if (thelock->locked == 0) { - success = PY_LOCK_ACQUIRED; - } - else if (microseconds != 0) { - struct timespec abs; - if (microseconds > 0) { - _PyThread_cond_after(microseconds, &abs); + if (status != 0) { + goto done; + } + + if (thelock->locked == 0) { + success = PY_LOCK_ACQUIRED; + goto unlock; + } + if (microseconds == 0) { + goto unlock; + } + + struct timespec abs_timeout; + if (microseconds > 0) { + _PyThread_cond_after(microseconds, &abs_timeout); + } + // Continue trying until we get the lock + + // mut must be locked by me -- part of the condition protocol + while (1) { + if (microseconds > 0) { + status = pthread_cond_timedwait(&thelock->lock_released, + &thelock->mut, &abs_timeout); + if (status == 1) { + break; } - /* continue trying until we get the lock */ - - /* mut must be locked by me -- part of the condition - * protocol */ - while (success == PY_LOCK_FAILURE) { - if (microseconds > 0) { - status = pthread_cond_timedwait( - &thelock->lock_released, - &thelock->mut, &abs); - if (status == 1) { - break; - } - if (status == ETIMEDOUT) - break; - CHECK_STATUS_PTHREAD("pthread_cond_timedwait"); - } - 
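_PyThread_cond_after() now goes through the _PyTime API: clamp the relative timeout, read the monotonic (or system) clock, add, and convert to a timespec with clamping, instead of hand-rolled timeval arithmetic. A hedged Python model of that computation, with time.monotonic_ns()/time.time_ns() standing in for the C clock helpers and the overflow corner cases ignored:

    import time

    PYTIME_MAX = 2**63 - 1

    def cond_after(timeout_us: int, monotonic: bool = True):
        """Absolute (tv_sec, tv_nsec) deadline for a relative timeout in microseconds."""
        timeout_ns = min(timeout_us * 1_000, PYTIME_MAX)            # clamped conversion
        now_ns = time.monotonic_ns() if monotonic else time.time_ns()
        deadline = min(now_ns + timeout_ns, PYTIME_MAX)             # clamped add
        return divmod(deadline, 1_000_000_000)

    sec, nsec = cond_after(250_000)    # deadline 250 ms from now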
else { - status = pthread_cond_wait( - &thelock->lock_released, - &thelock->mut); - CHECK_STATUS_PTHREAD("pthread_cond_wait"); - } - - if (intr_flag && status == 0 && thelock->locked) { - /* We were woken up, but didn't get the lock. We probably received - * a signal. Return PY_LOCK_INTR to allow the caller to handle - * it and retry. */ - success = PY_LOCK_INTR; - break; - } - else if (status == 0 && !thelock->locked) { - success = PY_LOCK_ACQUIRED; - } + if (status == ETIMEDOUT) { + break; } + CHECK_STATUS_PTHREAD("pthread_cond_timedwait"); + } + else { + status = pthread_cond_wait( + &thelock->lock_released, + &thelock->mut); + CHECK_STATUS_PTHREAD("pthread_cond_wait"); + } + + if (intr_flag && status == 0 && thelock->locked) { + // We were woken up, but didn't get the lock. We probably received + // a signal. Return PY_LOCK_INTR to allow the caller to handle + // it and retry. + success = PY_LOCK_INTR; + break; + } + + if (status == 0 && !thelock->locked) { + success = PY_LOCK_ACQUIRED; + break; } - if (success == PY_LOCK_ACQUIRED) thelock->locked = 1; - status = pthread_mutex_unlock( &thelock->mut ); - CHECK_STATUS_PTHREAD("pthread_mutex_unlock[1]"); + + // Wait got interrupted by a signal: retry } - if (error) success = PY_LOCK_FAILURE; +unlock: + if (success == PY_LOCK_ACQUIRED) { + thelock->locked = 1; + } + status = pthread_mutex_unlock( &thelock->mut ); + CHECK_STATUS_PTHREAD("pthread_mutex_unlock[1]"); + +done: + if (error) { + success = PY_LOCK_FAILURE; + } return success; } diff --git a/Python/traceback.c b/Python/traceback.c index e76c9aa1a14c5e..439689b32aea8a 100644 --- a/Python/traceback.c +++ b/Python/traceback.c @@ -3,7 +3,6 @@ #include "Python.h" -#include "frameobject.h" // PyFrame_GetBack() #include "pycore_ast.h" // asdl_seq_* #include "pycore_call.h" // _PyObject_CallMethodFormat() #include "pycore_compile.h" // _PyAST_Optimize @@ -15,7 +14,9 @@ #include "pycore_pyerrors.h" // _PyErr_Fetch() #include "pycore_pystate.h" // _PyThreadState_GET() #include "pycore_traceback.h" // EXCEPTION_TB_HEADER + #include "../Parser/pegen.h" // _PyPegen_byte_offset_to_character_offset() +#include "frameobject.h" // PyFrame_New() #include "structmember.h" // PyMemberDef #include "osdefs.h" // SEP #ifdef HAVE_FCNTL_H diff --git a/README.rst b/README.rst index c3a83bddf05e04..90061a886a6ece 100644 --- a/README.rst +++ b/README.rst @@ -238,7 +238,7 @@ All current PEPs, as well as guidelines for submitting a new PEP, are listed at Release Schedule ---------------- -See :pep:`664` for Python 3.12 release details. +See :pep:`693` for Python 3.12 release details. 
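The rewritten PyThread_acquire_lock_timed() wait loop above replaces the nested conditionals with a flat retry loop and goto-based cleanup: re-check the locked flag after every wakeup, stop on timeout, and report PY_LOCK_INTR when a signal woke the thread without handing it the lock. A simplified Python model of the same wait-until-deadline pattern (threading has no direct PY_LOCK_INTR equivalent, so that branch is omitted):

    import threading
    import time

    def acquire_timed(state: dict, cond: threading.Condition, timeout_s: float) -> str:
        """state = {"locked": bool}; cond guards it, like thelock->mut in the C code."""
        deadline = time.monotonic() + timeout_s
        with cond:                                  # pthread_mutex_lock
            while state["locked"]:
                remaining = deadline - time.monotonic()
                if remaining <= 0:
                    return "PY_LOCK_FAILURE"        # like ETIMEDOUT
                cond.wait(remaining)                # may wake spuriously; loop re-checks
            state["locked"] = True
            return "PY_LOCK_ACQUIRED"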
Copyright and License Information diff --git a/Tools/c-analyzer/cpython/globals-to-fix.tsv b/Tools/c-analyzer/cpython/globals-to-fix.tsv index 7e939277f1e05a..c92f64df1c5de5 100644 --- a/Tools/c-analyzer/cpython/globals-to-fix.tsv +++ b/Tools/c-analyzer/cpython/globals-to-fix.tsv @@ -18,8 +18,8 @@ Objects/capsule.c - PyCapsule_Type - Objects/cellobject.c - PyCell_Type - Objects/classobject.c - PyInstanceMethod_Type - Objects/classobject.c - PyMethod_Type - -Objects/codeobject.c - LineIterator - -Objects/codeobject.c - PositionsIterator - +Objects/codeobject.c - _PyLineIterator - +Objects/codeobject.c - _PyPositionsIterator - Objects/codeobject.c - PyCode_Type - Objects/complexobject.c - PyComplex_Type - Objects/descrobject.c - PyClassMethodDescr_Type - @@ -51,6 +51,7 @@ Objects/frameobject.c - PyFrame_Type - Objects/funcobject.c - PyClassMethod_Type - Objects/funcobject.c - PyFunction_Type - Objects/funcobject.c - PyStaticMethod_Type - +Objects/genericaliasobject.c - _Py_GenericAliasIterType - Objects/genericaliasobject.c - Py_GenericAliasType - Objects/genobject.c - PyAsyncGen_Type - Objects/genobject.c - PyCoro_Type - @@ -68,7 +69,7 @@ Objects/listobject.c - PyListRevIter_Type - Objects/listobject.c - PyList_Type - Objects/longobject.c - Int_InfoType - Objects/longobject.c - PyLong_Type - -Objects/memoryobject.c - PyMemoryIter_Type - +Objects/memoryobject.c - _PyMemoryIter_Type - Objects/memoryobject.c - PyMemoryView_Type - Objects/memoryobject.c - _PyManagedBuffer_Type - Objects/methodobject.c - PyCFunction_Type - diff --git a/Tools/clinic/clinic.py b/Tools/clinic/clinic.py index 53e29df8a8e40f..f0f2d205c419db 100755 --- a/Tools/clinic/clinic.py +++ b/Tools/clinic/clinic.py @@ -22,7 +22,6 @@ import shlex import string import sys -import tempfile import textwrap import traceback import types diff --git a/Tools/msi/bundle/Default.wxl b/Tools/msi/bundle/Default.wxl index 2eddd1f749ec5a..8b2633fe89f726 100644 --- a/Tools/msi/bundle/Default.wxl +++ b/Tools/msi/bundle/Default.wxl @@ -121,7 +121,6 @@ Feel free to email <a href="mailto:python-list@python.org">python-list@pyt You must restart your computer to complete the rollback of the software. &Restart Unable to install [WixBundleName] due to an existing install. Use Programs and Features to modify, repair or remove [WixBundleName]. - At least Windows 8.1 or Windows Server 2012 are required to install [WixBundleName] Visit <a href="https://www.python.org/">python.org</a> to download an earlier version of Python. 
diff --git a/Tools/msi/bundle/packagegroups/launcher.wxs b/Tools/msi/bundle/packagegroups/launcher.wxs index 7dae8ca7a68c18..a6922758f31f14 100644 --- a/Tools/msi/bundle/packagegroups/launcher.wxs +++ b/Tools/msi/bundle/packagegroups/launcher.wxs @@ -11,7 +11,11 @@ EnableFeatureSelection="yes" Permanent="yes" Visible="yes" - InstallCondition="(InstallAllUsers or InstallLauncherAllUsers) and Include_launcher and not DetectedLauncher" /> + InstallCondition="(InstallAllUsers or InstallLauncherAllUsers) and Include_launcher and not DetectedLauncher"> + + + + + InstallCondition="not (InstallAllUsers or InstallLauncherAllUsers) and Include_launcher and not DetectedLauncher"> + + + + \ No newline at end of file diff --git a/Tools/msi/launcher/launcher.wxs b/Tools/msi/launcher/launcher.wxs index d001fe53ea3811..b83058c63bf6d9 100644 --- a/Tools/msi/launcher/launcher.wxs +++ b/Tools/msi/launcher/launcher.wxs @@ -3,13 +3,18 @@ - + + + + + @@ -17,14 +22,14 @@ - + - + - + NOT Installed AND NOT ALLUSERS=1 NOT Installed AND ALLUSERS=1 diff --git a/Tools/msi/launcher/launcher_files.wxs b/Tools/msi/launcher/launcher_files.wxs index 2c6c808137a6ff..d9a230e2d35ace 100644 --- a/Tools/msi/launcher/launcher_files.wxs +++ b/Tools/msi/launcher/launcher_files.wxs @@ -22,26 +22,23 @@ - VersionNT64 + VersionNT64 AND NOT ARM64_SHELLEXT - NOT VersionNT64 + NOT VersionNT64 AND NOT ARM64_SHELLEXT - + diff --git a/Tools/peg_generator/Makefile b/Tools/peg_generator/Makefile index d010f19d58892c..084da154919e3e 100644 --- a/Tools/peg_generator/Makefile +++ b/Tools/peg_generator/Makefile @@ -88,7 +88,7 @@ time_stdlib: $(CPYTHON) venv -d $(CPYTHON) \ $(TESTFLAGS) \ --exclude "*/bad*" \ - --exclude "*/lib2to3/tests/data/*" + --exclude "*/test/test_lib2to3/data/*" mypy: regen-metaparser $(MYPY) # For list of files, see mypy.ini diff --git a/Tools/peg_generator/pegen/c_generator.py b/Tools/peg_generator/pegen/c_generator.py index 56a1e5a5a14fb0..31bb505983329e 100644 --- a/Tools/peg_generator/pegen/c_generator.py +++ b/Tools/peg_generator/pegen/c_generator.py @@ -32,12 +32,16 @@ #include "pegen.h" #if defined(Py_DEBUG) && defined(Py_BUILD_CORE) -# define D(x) if (Py_DebugFlag) x; +# define D(x) if (p->debug) { x; } #else # define D(x) #endif -# define MAXSTACK 6000 +#ifdef __wasi__ +# define MAXSTACK 4000 +#else +# define MAXSTACK 6000 +#endif """ diff --git a/Tools/peg_generator/pegen/grammar.py b/Tools/peg_generator/pegen/grammar.py index fa47b98201c0fd..03d60d01026f85 100644 --- a/Tools/peg_generator/pegen/grammar.py +++ b/Tools/peg_generator/pegen/grammar.py @@ -1,23 +1,16 @@ from __future__ import annotations -from abc import abstractmethod from typing import ( - TYPE_CHECKING, AbstractSet, Any, - Dict, Iterable, Iterator, List, Optional, - Set, Tuple, Union, ) -if TYPE_CHECKING: - from pegen.parser_generator import ParserGenerator - class GrammarError(Exception): pass diff --git a/Tools/peg_generator/scripts/benchmark.py b/Tools/peg_generator/scripts/benchmark.py index 4a063bf10034c8..053f8ef06d42c5 100644 --- a/Tools/peg_generator/scripts/benchmark.py +++ b/Tools/peg_generator/scripts/benchmark.py @@ -78,7 +78,7 @@ def run_benchmark_stdlib(subcommand): verbose=False, excluded_files=[ "*/bad*", - "*/lib2to3/tests/data/*", + "*/test/test_lib2to3/data/*", ], short=True, mode=modes[subcommand], diff --git a/Tools/peg_generator/scripts/grammar_grapher.py b/Tools/peg_generator/scripts/grammar_grapher.py index 4a41dfaa3da0ff..e5271a0e4810f9 100755 --- a/Tools/peg_generator/scripts/grammar_grapher.py +++ 
b/Tools/peg_generator/scripts/grammar_grapher.py @@ -30,7 +30,6 @@ Alt, Cut, Forced, - Grammar, Group, Leaf, Lookahead, diff --git a/Tools/peg_generator/scripts/test_parse_directory.py b/Tools/peg_generator/scripts/test_parse_directory.py index a5e26f0a0feda5..f5cf198be8dabf 100755 --- a/Tools/peg_generator/scripts/test_parse_directory.py +++ b/Tools/peg_generator/scripts/test_parse_directory.py @@ -5,7 +5,6 @@ import os import sys import time -import traceback import tokenize from glob import glob, escape from pathlib import PurePath @@ -13,7 +12,6 @@ from typing import List, Optional, Any, Tuple sys.path.insert(0, os.getcwd()) -from pegen.ast_dump import ast_dump from pegen.testutil import print_memstats SUCCESS = "\033[92m" diff --git a/Tools/peg_generator/scripts/test_pypi_packages.py b/Tools/peg_generator/scripts/test_pypi_packages.py index e2eaef9de26f73..01ccc3d21ef5a5 100755 --- a/Tools/peg_generator/scripts/test_pypi_packages.py +++ b/Tools/peg_generator/scripts/test_pypi_packages.py @@ -9,11 +9,10 @@ import pathlib import sys -from typing import Generator, Any +from typing import Generator sys.path.insert(0, ".") -from pegen import build from scripts import test_parse_directory HERE = pathlib.Path(__file__).resolve().parent diff --git a/Tools/scripts/deepfreeze.py b/Tools/scripts/deepfreeze.py index ac2076708a156f..f9fd4e36a81baa 100644 --- a/Tools/scripts/deepfreeze.py +++ b/Tools/scripts/deepfreeze.py @@ -1,7 +1,9 @@ """Deep freeze -The script is executed by _bootstrap_python interpreter. Shared library -extension modules are not available. +The script may be executed by _bootstrap_python interpreter. +Shared library extension modules are not available in that case. +On Windows, and in cross-compilation cases, it is executed +by Python 3.10, and 3.11 features are not available. 
""" import argparse import ast @@ -20,6 +22,9 @@ verbose = False identifiers, strings = get_identifiers_and_strings() +# This must be kept in sync with opcode.py +RESUME = 151 + def isprintable(b: bytes) -> bool: return all(0x20 <= c < 0x7f for c in b) @@ -115,6 +120,7 @@ def __init__(self, file: TextIO) -> None: self.write('#include "Python.h"') self.write('#include "internal/pycore_gc.h"') self.write('#include "internal/pycore_code.h"') + self.write('#include "internal/pycore_frame.h"') self.write('#include "internal/pycore_long.h"') self.write("") @@ -250,9 +256,11 @@ def generate_code(self, name: str, code: types.CodeType) -> str: self.write(f".co_exceptiontable = {co_exceptiontable},") self.field(code, "co_flags") self.write(".co_warmup = QUICKENING_INITIAL_WARMUP_VALUE,") + self.write("._co_linearray_entry_size = 0,") self.field(code, "co_argcount") self.field(code, "co_posonlyargcount") self.field(code, "co_kwonlyargcount") + self.write(f".co_framesize = {code.co_stacksize + len(localsplusnames)} + FRAME_SPECIALS_SIZE,") self.field(code, "co_stacksize") self.field(code, "co_firstlineno") self.write(f".co_nlocalsplus = {len(localsplusnames)},") @@ -266,7 +274,13 @@ def generate_code(self, name: str, code: types.CodeType) -> str: self.write(f".co_name = {co_name},") self.write(f".co_qualname = {co_qualname},") self.write(f".co_linetable = {co_linetable},") + self.write(f"._co_code = NULL,") + self.write("._co_linearray = NULL,") self.write(f".co_code_adaptive = {co_code_adaptive},") + for i, op in enumerate(code.co_code[::2]): + if op == RESUME: + self.write(f"._co_firsttraceable = {i},") + break name_as_code = f"(PyCodeObject *)&{name}" self.deallocs.append(f"_PyStaticCode_Dealloc({name_as_code});") self.interns.append(f"_PyStaticCode_InternStrings({name_as_code})") diff --git a/Tools/scripts/freeze_modules.py b/Tools/scripts/freeze_modules.py index dd208c78471943..aa1e4fe2ea0f44 100644 --- a/Tools/scripts/freeze_modules.py +++ b/Tools/scripts/freeze_modules.py @@ -8,7 +8,6 @@ import os import ntpath import posixpath -import sys import argparse from update_file import updating_file_with_tmpfile diff --git a/Tools/scripts/generate_re_casefix.py b/Tools/scripts/generate_re_casefix.py index 00b048b5d716c3..625b0658d97d1b 100755 --- a/Tools/scripts/generate_re_casefix.py +++ b/Tools/scripts/generate_re_casefix.py @@ -2,7 +2,6 @@ # This script generates Lib/re/_casefix.py. import collections -import re import sys import unicodedata diff --git a/Tools/scripts/generate_stdlib_module_names.py b/Tools/scripts/generate_stdlib_module_names.py index fe1e429ebce175..6f864c317da6c8 100644 --- a/Tools/scripts/generate_stdlib_module_names.py +++ b/Tools/scripts/generate_stdlib_module_names.py @@ -35,7 +35,6 @@ '_xxtestfuzz', 'distutils.tests', 'idlelib.idle_test', - 'lib2to3.tests', 'test', 'xxlimited', 'xxlimited_35', diff --git a/Tools/scripts/parse_html5_entities.py b/Tools/scripts/parse_html5_entities.py index c011328b0101bf..1e5bdad2165548 100755 --- a/Tools/scripts/parse_html5_entities.py +++ b/Tools/scripts/parse_html5_entities.py @@ -2,10 +2,14 @@ """ Utility for parsing HTML5 entity definitions available from: - http://dev.w3.org/html5/spec/entities.json + https://html.spec.whatwg.org/entities.json + https://html.spec.whatwg.org/multipage/named-characters.html -Written by Ezio Melotti and Iuliia Proskurnia. +The page now contains the following note: + + "This list is static and will not be expanded or changed in the future." +Written by Ezio Melotti and Iuliia Proskurnia. 
""" import os @@ -14,7 +18,9 @@ from urllib.request import urlopen from html.entities import html5 -entities_url = 'http://dev.w3.org/html5/spec/entities.json' +PAGE_URL = 'https://html.spec.whatwg.org/multipage/named-characters.html' +ENTITIES_URL = 'https://html.spec.whatwg.org/entities.json' +HTML5_SECTION_START = '# HTML5 named character references' def get_json(url): """Download the json file from the url and returns a decoded object.""" @@ -62,9 +68,15 @@ def write_items(entities, file=sys.stdout): # be before their equivalent lowercase version. keys = sorted(entities.keys()) keys = sorted(keys, key=str.lower) + print(HTML5_SECTION_START, file=file) + print(f'# Generated by {sys.argv[0]!r}\n' + f'# from {ENTITIES_URL} and\n' + f'# {PAGE_URL}.\n' + f'# Map HTML5 named character references to the ' + f'equivalent Unicode character(s).', file=file) print('html5 = {', file=file) for name in keys: - print(' {!r}: {!a},'.format(name, entities[name]), file=file) + print(f' {name!r}: {entities[name]!a},', file=file) print('}', file=file) @@ -72,11 +84,8 @@ def write_items(entities, file=sys.stdout): # without args print a diff between html.entities.html5 and new_html5 # with --create print the new html5 dict # with --patch patch the Lib/html/entities.py file - new_html5 = create_dict(get_json(entities_url)) + new_html5 = create_dict(get_json(ENTITIES_URL)) if '--create' in sys.argv: - print('# map the HTML5 named character references to the ' - 'equivalent Unicode character(s)') - print('# Generated by {}. Do not edit manually.'.format(__file__)) write_items(new_html5) elif '--patch' in sys.argv: fname = 'Lib/html/entities.py' @@ -84,7 +93,7 @@ def write_items(entities, file=sys.stdout): with open(fname) as f1, open(temp_fname, 'w') as f2: skip = False for line in f1: - if line.startswith('html5 = {'): + if line.startswith(HTML5_SECTION_START): write_items(new_html5, file=f2) skip = True continue diff --git a/Tools/scripts/parseentities.py b/Tools/scripts/parseentities.py deleted file mode 100755 index 0229d3af86ba78..00000000000000 --- a/Tools/scripts/parseentities.py +++ /dev/null @@ -1,64 +0,0 @@ -#!/usr/bin/env python3 -""" Utility for parsing HTML entity definitions available from: - - http://www.w3.org/ as e.g. - http://www.w3.org/TR/REC-html40/HTMLlat1.ent - - Input is read from stdin, output is written to stdout in form of a - Python snippet defining a dictionary "entitydefs" mapping literal - entity name to character or numeric entity. - - Marc-Andre Lemburg, mal@lemburg.com, 1999. - Use as you like. NO WARRANTIES. 
- -""" -import re,sys - -entityRE = re.compile(r'') - -def parse(text,pos=0,endpos=None): - - pos = 0 - if endpos is None: - endpos = len(text) - d = {} - while 1: - m = entityRE.search(text,pos,endpos) - if not m: - break - name,charcode,comment = m.groups() - d[name] = charcode,comment - pos = m.end() - return d - -def writefile(f,defs): - - f.write("entitydefs = {\n") - items = sorted(defs.items()) - for name, (charcode,comment) in items: - if charcode[:2] == '&#': - code = int(charcode[2:-1]) - if code < 256: - charcode = r"'\%o'" % code - else: - charcode = repr(charcode) - else: - charcode = repr(charcode) - comment = ' '.join(comment.split()) - f.write(" '%s':\t%s, \t# %s\n" % (name,charcode,comment)) - f.write('\n}\n') - -if __name__ == '__main__': - if len(sys.argv) > 1: - with open(sys.argv[1]) as infile: - text = infile.read() - else: - text = sys.stdin.read() - - defs = parse(text) - - if len(sys.argv) > 2: - with open(sys.argv[2],'w') as outfile: - writefile(outfile, defs) - else: - writefile(sys.stdout, defs) diff --git a/Tools/scripts/pathfix.py b/Tools/scripts/pathfix.py index d252321a21a172..f957b11547179d 100755 --- a/Tools/scripts/pathfix.py +++ b/Tools/scripts/pathfix.py @@ -27,7 +27,6 @@ # into a program for a different change to Python programs... import sys -import re import os from stat import * import getopt diff --git a/Tools/scripts/run_tests.py b/Tools/scripts/run_tests.py index 48feb3f778ee8d..445a34ae3e8eee 100644 --- a/Tools/scripts/run_tests.py +++ b/Tools/scripts/run_tests.py @@ -8,7 +8,9 @@ """ import os +import shlex import sys +import sysconfig import test.support @@ -19,15 +21,37 @@ def is_multiprocess_flag(arg): def is_resource_use_flag(arg): return arg.startswith('-u') or arg.startswith('--use') +def is_python_flag(arg): + return arg.startswith('-p') or arg.startswith('--python') + def main(regrtest_args): args = [sys.executable, '-u', # Unbuffered stdout and stderr '-W', 'default', # Warnings set to 'default' '-bb', # Warnings about bytes/bytearray - '-E', # Ignore environment variables ] + cross_compile = '_PYTHON_HOST_PLATFORM' in os.environ + if (hostrunner := os.environ.get("_PYTHON_HOSTRUNNER")) is None: + hostrunner = sysconfig.get_config_var("HOSTRUNNER") + if cross_compile: + # emulate -E, but keep PYTHONPATH + cross compile env vars, so + # test executable can load correct sysconfigdata file. + keep = { + '_PYTHON_PROJECT_BASE', + '_PYTHON_HOST_PLATFORM', + '_PYTHON_SYSCONFIGDATA_NAME', + 'PYTHONPATH' + } + environ = { + name: value for name, value in os.environ.items() + if not name.startswith(('PYTHON', '_PYTHON')) or name in keep + } + else: + environ = os.environ.copy() + args.append("-E") + # Allow user-specified interpreter options to override our defaults. args.extend(test.support.args_from_interpreter_flags()) @@ -38,16 +62,30 @@ def main(regrtest_args): if sys.platform == 'win32': args.append('-n') # Silence alerts under Windows if not any(is_multiprocess_flag(arg) for arg in regrtest_args): - args.extend(['-j', '0']) # Use all CPU cores + if cross_compile and hostrunner: + # For now use only two cores for cross-compiled builds; + # hostrunner can be expensive. + args.extend(['-j', '2']) + else: + args.extend(['-j', '0']) # Use all CPU cores if not any(is_resource_use_flag(arg) for arg in regrtest_args): args.extend(['-u', 'all,-largefile,-audio,-gui']) + + if cross_compile and hostrunner: + # If HOSTRUNNER is set and -p/--python option is not given, then + # use hostrunner to execute python binary for tests. 
+ if not any(is_python_flag(arg) for arg in regrtest_args): + buildpython = sysconfig.get_config_var("BUILDPYTHON") + args.extend(["--python", f"{hostrunner} {buildpython}"]) + args.extend(regrtest_args) - print(' '.join(args)) + + print(shlex.join(args)) if sys.platform == 'win32': from subprocess import call sys.exit(call(args)) else: - os.execv(sys.executable, args) + os.execve(sys.executable, args, environ) if __name__ == '__main__': diff --git a/Tools/scripts/stable_abi.py b/Tools/scripts/stable_abi.py index f5a9f8d2dd617b..d557e102ea8e69 100755 --- a/Tools/scripts/stable_abi.py +++ b/Tools/scripts/stable_abi.py @@ -16,7 +16,6 @@ import textwrap import tomllib import difflib -import shutil import pprint import sys import os diff --git a/Tools/scripts/summarize_stats.py b/Tools/scripts/summarize_stats.py index bc528ca316f931..26e9c4ecb1679f 100644 --- a/Tools/scripts/summarize_stats.py +++ b/Tools/scripts/summarize_stats.py @@ -7,7 +7,7 @@ import opcode from datetime import date import itertools -import argparse +import sys if os.name == "nt": DEFAULT_DIR = "c:\\temp\\py_stats\\" @@ -88,7 +88,11 @@ def gather_stats(): for filename in os.listdir(DEFAULT_DIR): with open(os.path.join(DEFAULT_DIR, filename)) as fd: for line in fd: - key, value = line.split(":") + try: + key, value = line.split(":") + except ValueError: + print (f"Unparsable line: '{line.strip()}' in {filename}", file=sys.stderr) + continue key = key.strip() value = int(value) stats[key] += value @@ -103,13 +107,14 @@ def extract_opcode_stats(stats): opcode_stats[int(n)][rest.strip(".")] = value return opcode_stats -def parse_kinds(spec_src): +def parse_kinds(spec_src, prefix="SPEC_FAIL"): defines = collections.defaultdict(list) + start = "#define " + prefix + "_" for line in spec_src: line = line.strip() - if not line.startswith("#define SPEC_FAIL_"): + if not line.startswith(start): continue - line = line[len("#define SPEC_FAIL_"):] + line = line[len(start):] name, val = line.split() defines[int(val.strip())].append(name.strip()) return defines @@ -124,8 +129,6 @@ def kind_to_text(kind, defines, opname): opname = "ATTR" if opname.endswith("SUBSCR"): opname = "SUBSCR" - if opname.startswith("PRECALL"): - opname = "CALL" for name in defines[kind]: if name.startswith(opname): return pretty(name[len(opname)+1:]) @@ -184,6 +187,12 @@ def __exit__(*args): print("") print() +def to_str(x): + if isinstance(x, int): + return format(x, ",d") + else: + return str(x) + def emit_table(header, rows): width = len(header) header_line = "|" @@ -199,8 +208,8 @@ def emit_table(header, rows): print(under_line) for row in rows: if width is not None and len(row) != width: - raise ValueError("Wrong number of elements in row '" + str(rows) + "'") - print("|", " | ".join(str(i) for i in row), "|") + raise ValueError("Wrong number of elements in row '" + str(row) + "'") + print("|", " | ".join(to_str(i) for i in row), "|") print() def emit_execution_counts(opcode_stats, total): @@ -247,8 +256,23 @@ def emit_specialization_overview(opcode_stats, total): ("Not specialized", not_specialized, f"{not_specialized*100/total:0.1f}%"), ("Specialized", specialized, f"{specialized*100/total:0.1f}%"), )) + for title, field in (("Deferred", "specialization.deferred"), ("Misses", "specialization.miss")): + total = 0 + counts = [] + for i, opcode_stat in enumerate(opcode_stats): + value = opcode_stat.get(field, 0) + counts.append((value, opname[i])) + total += value + counts.sort(reverse=True) + if total: + with Section(f"{title} by instruction", 3): + rows = [ 
(name, count, f"{100*count/total:0.1f}%") for (count, name) in counts[:10] ] + emit_table(("Name", "Count:", "Ratio:"), rows) def emit_call_stats(stats): + stats_path = os.path.join(os.path.dirname(__file__), "../../Include/pystats.h") + with open(stats_path) as stats_src: + defines = parse_kinds(stats_src, prefix="EVAL_CALL") with Section("Call stats", summary="Inlined calls and frame stats"): total = 0 for key, value in stats.items(): @@ -258,6 +282,11 @@ def emit_call_stats(stats): for key, value in stats.items(): if "Calls to" in key: rows.append((key, value, f"{100*value/total:0.1f}%")) + elif key.startswith("Calls "): + name, index = key[:-1].split("[") + index = int(index) + label = name + " (" + pretty(defines[index][0]) + ")" + rows.append((label, value, f"{100*value/total:0.1f}%")) for key, value in stats.items(): if key.startswith("Frame"): rows.append((key, value, f"{100*value/total:0.1f}%")) @@ -265,17 +294,26 @@ def emit_call_stats(stats): def emit_object_stats(stats): with Section("Object stats", summary="allocations, frees and dict materializatons"): - total = stats.get("Object new values") + total_materializations = stats.get("Object new values") + total_allocations = stats.get("Object allocations") + stats.get("Object allocations from freelist") + total_increfs = stats.get("Object interpreter increfs") + stats.get("Object increfs") + total_decrefs = stats.get("Object interpreter decrefs") + stats.get("Object decrefs") rows = [] for key, value in stats.items(): if key.startswith("Object"): if "materialize" in key: - materialize = f"{100*value/total:0.1f}%" + ratio = f"{100*value/total_materializations:0.1f}%" + elif "allocations" in key: + ratio = f"{100*value/total_allocations:0.1f}%" + elif "increfs" in key: + ratio = f"{100*value/total_increfs:0.1f}%" + elif "decrefs" in key: + ratio = f"{100*value/total_decrefs:0.1f}%" else: - materialize = "" + ratio = "" label = key[6:].strip() label = label[0].upper() + label[1:] - rows.append((label, value, materialize)) + rows.append((label, value, ratio)) emit_table(("", "Count:", "Ratio:"), rows) def get_total(opcode_stats): @@ -307,7 +345,7 @@ def emit_pair_counts(opcode_stats, total): emit_table(("Pair", "Count:", "Self:", "Cumulative:"), rows ) - with Section("Predecessor/Successor Pairs", summary="Top 3 predecessors and successors of each opcode"): + with Section("Predecessor/Successor Pairs", summary="Top 5 predecessors and successors of each opcode"): predecessors = collections.defaultdict(collections.Counter) successors = collections.defaultdict(collections.Counter) total_predecessors = collections.Counter() @@ -326,10 +364,10 @@ def emit_pair_counts(opcode_stats, total): pred_rows = succ_rows = () if total1: pred_rows = [(opname[pred], count, f"{count/total1:.1%}") - for (pred, count) in predecessors[i].most_common(3)] + for (pred, count) in predecessors[i].most_common(5)] if total2: succ_rows = [(opname[succ], count, f"{count/total2:.1%}") - for (succ, count) in successors[i].most_common(3)] + for (succ, count) in successors[i].most_common(5)] with Section(name, 3, f"Successors and predecessors for {name}"): emit_table(("Predecessors", "Count:", "Percentage:"), pred_rows diff --git a/Tools/scripts/verify_ensurepip_wheels.py b/Tools/scripts/verify_ensurepip_wheels.py new file mode 100755 index 00000000000000..044d1fd6b3cf2d --- /dev/null +++ b/Tools/scripts/verify_ensurepip_wheels.py @@ -0,0 +1,98 @@ +#! /usr/bin/env python3 + +""" +Compare checksums for wheels in :mod:`ensurepip` against the Cheeseshop. 
+ +When GitHub Actions executes the script, output is formatted accordingly. +https://docs.github.com/en/actions/using-workflows/workflow-commands-for-github-actions#setting-a-notice-message +""" + +import hashlib +import json +import os +import re +from pathlib import Path +from urllib.request import urlopen + +PACKAGE_NAMES = ("pip", "setuptools") +ENSURE_PIP_ROOT = Path(__file__).parent.parent.parent / "Lib/ensurepip" +WHEEL_DIR = ENSURE_PIP_ROOT / "_bundled" +ENSURE_PIP_INIT_PY_TEXT = (ENSURE_PIP_ROOT / "__init__.py").read_text(encoding="utf-8") +GITHUB_ACTIONS = os.getenv("GITHUB_ACTIONS") == "true" + + +def print_notice(file_path: str, message: str) -> None: + if GITHUB_ACTIONS: + message = f"::notice file={file_path}::{message}" + print(message, end="\n\n") + + +def print_error(file_path: str, message: str) -> None: + if GITHUB_ACTIONS: + message = f"::error file={file_path}::{message}" + print(message, end="\n\n") + + +def verify_wheel(package_name: str) -> bool: + # Find the package on disk + package_path = next(WHEEL_DIR.glob(f"{package_name}*.whl"), None) + if not package_path: + print_error("", f"Could not find a {package_name} wheel on disk.") + return False + + print(f"Verifying checksum for {package_path}.") + + # Find the version of the package used by ensurepip + package_version_match = re.search( + f'_{package_name.upper()}_VERSION = "([^"]+)', ENSURE_PIP_INIT_PY_TEXT + ) + if not package_version_match: + print_error( + package_path, + f"No {package_name} version found in Lib/ensurepip/__init__.py.", + ) + return False + package_version = package_version_match[1] + + # Get the SHA 256 digest from the Cheeseshop + try: + raw_text = urlopen(f"https://pypi.org/pypi/{package_name}/json").read() + except (OSError, ValueError): + print_error(package_path, f"Could not fetch JSON metadata for {package_name}.") + return False + + release_files = json.loads(raw_text)["releases"][package_version] + for release_info in release_files: + if package_path.name != release_info["filename"]: + continue + expected_digest = release_info["digests"].get("sha256", "") + break + else: + print_error(package_path, f"No digest for {package_name} found from PyPI.") + return False + + # Compute the SHA 256 digest of the wheel on disk + actual_digest = hashlib.sha256(package_path.read_bytes()).hexdigest() + + print(f"Expected digest: {expected_digest}") + print(f"Actual digest: {actual_digest}") + + if actual_digest != expected_digest: + print_error( + package_path, f"Failed to verify the checksum of the {package_name} wheel." 
+ ) + return False + + print_notice( + package_path, + f"Successfully verified the checksum of the {package_name} wheel.", + ) + return True + + +if __name__ == "__main__": + exit_status = 0 + for package_name in PACKAGE_NAMES: + if not verify_wheel(package_name): + exit_status = 1 + raise SystemExit(exit_status) diff --git a/Tools/ssl/multissltests.py b/Tools/ssl/multissltests.py index 82076808bfd372..fc6b2d0f8becf8 100755 --- a/Tools/ssl/multissltests.py +++ b/Tools/ssl/multissltests.py @@ -35,7 +35,6 @@ from urllib2 import urlopen, HTTPError import re import shutil -import string import subprocess import sys import tarfile diff --git a/Tools/unicode/genmap_tchinese.py b/Tools/unicode/genmap_tchinese.py new file mode 100644 index 00000000000000..a416cf3d1faf9b --- /dev/null +++ b/Tools/unicode/genmap_tchinese.py @@ -0,0 +1,239 @@ +# +# genmap_tchinese.py: Traditional Chinese Codecs Map Generator +# +# Original Author: Hye-Shik Chang +# +import os + +from genmap_support import * + + +# ranges for (lead byte, follower byte) +BIG5_C1 = (0xa1, 0xfe) +BIG5_C2 = (0x40, 0xfe) +BIG5HKSCS_C1 = (0x87, 0xfe) +BIG5HKSCS_C2 = (0x40, 0xfe) + +MAPPINGS_BIG5 = 'https://unicode.org/Public/MAPPINGS/OBSOLETE/EASTASIA/OTHER/BIG5.TXT' +MAPPINGS_CP950 = 'https://www.unicode.org/Public/MAPPINGS/VENDORS/MICSFT/WINDOWS/CP950.TXT' + +HKSCS_VERSION = '2004' +# The files for HKSCS mappings are available under a restrictive license. +# Users of the script need to download the files from the HKSARG CCLI website: +MAPPINGS_HKSCS = f'https://www.ccli.gov.hk/en/archive/terms_hkscs-{HKSCS_VERSION}-big5-iso.html' + + +def bh2s(code): + return ((code >> 8) - 0x87) * (0xfe - 0x40 + 1) + ((code & 0xff) - 0x40) + + +def split_bytes(code): + """Split 0xABCD into 0xAB, 0xCD""" + return code >> 8, code & 0xff + + +def parse_hkscs_map(fo): + fo.seek(0, 0) + table = [] + for line in fo: + line = line.split('#', 1)[0].strip() + # We expect 4 columns in supported HKSCS files: + # [1999]: unsupported + # [2001]: unsupported + # [2004]: Big-5; iso10646-1:1993; iso10646-1:2000; iso10646:2003+amd1 + # [2008]: Big-5; iso10646-1:1993; iso10646-1:2000; iso10646:2003+amd6 + # [2016]: not supported here--uses a json file instead + # + # In both supported cases, we only need the first and last column: + # * Big-5 is a hex string (always 4 digits) + # * iso10646:2003 is either a hex string (4 or 5 digits) or a sequence + # of hex strings like: `` + try: + hkscs_col, _, _, uni_col = line.split() + hkscs = int(hkscs_col, 16) + seq = tuple(int(cp, 16) for cp in uni_col.strip('<>').split(',')) + except ValueError: + continue + table.append((hkscs, seq)) + return table + + +def make_hkscs_map(table): + decode_map = {} + encode_map_bmp, encode_map_notbmp = {}, {} + is_bmp_map = {} + sequences = [] + beginnings = {} + single_cp_table = [] + # Determine multi-codepoint sequences, and sequence beginnings that encode + # multiple multibyte (i.e. Big-5) codes. + for mbcode, cp_seq in table: + cp, *_ = cp_seq + if len(cp_seq) == 1: + single_cp_table.append((mbcode, cp)) + else: + sequences.append((mbcode, cp_seq)) + beginnings.setdefault(cp, []).append(mbcode) + # Decode table only cares about single code points (no sequences) currently + for mbcode, cp in single_cp_table: + b1, b2 = split_bytes(mbcode) + decode_map.setdefault(b1, {}) + decode_map[b1][b2] = cp & 0xffff + # Encode table needs to mark code points beginning a sequence as tuples. 
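+    # A value in the encode maps is either a single Big5 code (int) or, for a
+    # code point that begins more than one sequence, a tuple of Big5 codes.
+    # Entries are split by Unicode plane (0 = BMP, 2 = SIP); plane-2 codes are
+    # flagged in is_bmp_map via the bh2s() index of their first Big5 code,
+    # which feeds the hint tables emitted later by HintsWriter.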
+ for cp, mbcodes in beginnings.items(): + plane = cp >> 16 + if plane == 0: + encode_map = encode_map_bmp + elif plane == 2: + encode_map = encode_map_notbmp + is_bmp_map[bh2s(mbcodes[0])] = 1 + else: + assert False, 'only plane 0 (BMP) and plane 2 (SIP) allowed' + if len(mbcodes) == 1: + encode_value = mbcodes[0] + else: + encode_value = tuple(mbcodes) + uni_b1, uni_b2 = split_bytes(cp & 0xffff) + encode_map.setdefault(uni_b1, {}) + encode_map[uni_b1][uni_b2] = encode_value + + return decode_map, encode_map_bmp, encode_map_notbmp, is_bmp_map + + +def load_big5_map(): + mapfile = open_mapping_file('python-mappings/BIG5.txt', MAPPINGS_BIG5) + with mapfile: + big5decmap = loadmap(mapfile) + # big5 mapping fix: use the cp950 mapping for these characters as the file + # provided by unicode.org doesn't define a mapping. See notes in BIG5.txt. + # Since U+5341, U+5345, U+FF0F, U+FF3C already have a big5 mapping, no + # roundtrip compatibility is guaranteed for those. + for m in """\ + 0xA15A 0x2574 + 0xA1C3 0xFFE3 + 0xA1C5 0x02CD + 0xA1FE 0xFF0F + 0xA240 0xFF3C + 0xA2CC 0x5341 + 0xA2CE 0x5345""".splitlines(): + bcode, ucode = list(map(eval, m.split())) + big5decmap[bcode >> 8][bcode & 0xff] = ucode + # encoding map + big5encmap = {} + for c1, m in list(big5decmap.items()): + for c2, code in list(m.items()): + big5encmap.setdefault(code >> 8, {}) + if code & 0xff not in big5encmap[code >> 8]: + big5encmap[code >> 8][code & 0xff] = c1 << 8 | c2 + # fix unicode->big5 priority for the above-mentioned duplicate characters + big5encmap[0xFF][0x0F] = 0xA241 + big5encmap[0xFF][0x3C] = 0xA242 + big5encmap[0x53][0x41] = 0xA451 + big5encmap[0x53][0x45] = 0xA4CA + + return big5decmap, big5encmap + + +def load_cp950_map(): + mapfile = open_mapping_file('python-mappings/CP950.TXT', MAPPINGS_CP950) + with mapfile: + cp950decmap = loadmap(mapfile) + cp950encmap = {} + for c1, m in list(cp950decmap.items()): + for c2, code in list(m.items()): + cp950encmap.setdefault(code >> 8, {}) + if code & 0xff not in cp950encmap[code >> 8]: + cp950encmap[code >> 8][code & 0xff] = c1 << 8 | c2 + # fix unicode->big5 duplicated mapping priority + cp950encmap[0x53][0x41] = 0xA451 + cp950encmap[0x53][0x45] = 0xA4CA + return cp950decmap, cp950encmap + + +def main_tw(): + big5decmap, big5encmap = load_big5_map() + cp950decmap, cp950encmap = load_cp950_map() + + # CP950 extends Big5, and the codec can use the Big5 lookup tables + # for most entries. 
So the CP950 tables should only include entries + # that are not in Big5: + for c1, m in list(cp950encmap.items()): + for c2, code in list(m.items()): + if (c1 in big5encmap and c2 in big5encmap[c1] + and big5encmap[c1][c2] == code): + del cp950encmap[c1][c2] + for c1, m in list(cp950decmap.items()): + for c2, code in list(m.items()): + if (c1 in big5decmap and c2 in big5decmap[c1] + and big5decmap[c1][c2] == code): + del cp950decmap[c1][c2] + + with open('mappings_tw.h', 'w') as fp: + print_autogen(fp, os.path.basename(__file__)) + write_big5_maps(fp, 'BIG5', 'big5', big5decmap, big5encmap) + write_big5_maps(fp, 'CP950', 'cp950ext', cp950decmap, cp950encmap) + + +def write_big5_maps(fp, display_name, table_name, decode_map, encode_map): + print(f'Generating {display_name} decode map...') + writer = DecodeMapWriter(fp, table_name, decode_map) + writer.update_decode_map(BIG5_C1, BIG5_C2) + writer.generate() + print(f'Generating {display_name} encode map...') + writer = EncodeMapWriter(fp, table_name, encode_map) + writer.generate() + + +class HintsWriter: + filler_class = BufferedFiller + + def __init__(self, fp, prefix, isbmpmap): + self.fp = fp + self.prefix = prefix + self.isbmpmap = isbmpmap + self.filler = self.filler_class() + + def fillhints(self, hintfrom, hintto): + name = f'{self.prefix}_phint_{hintfrom}' + self.fp.write(f'static const unsigned char {name}[] = {{\n') + for msbcode in range(hintfrom, hintto+1, 8): + v = 0 + for c in range(msbcode, msbcode+8): + v |= self.isbmpmap.get(c, 0) << (c - msbcode) + self.filler.write('%d,' % v) + self.filler.printout(self.fp) + self.fp.write('};\n\n') + + +def main_hkscs(): + filename = f'python-mappings/hkscs-{HKSCS_VERSION}-big5-iso.txt' + with open_mapping_file(filename, MAPPINGS_HKSCS) as f: + table = parse_hkscs_map(f) + hkscsdecmap, hkscsencmap_bmp, hkscsencmap_nonbmp, isbmpmap = ( + make_hkscs_map(table) + ) + with open('mappings_hk.h', 'w') as fp: + print('Generating BIG5HKSCS decode map...') + print_autogen(fp, os.path.basename(__file__)) + writer = DecodeMapWriter(fp, 'big5hkscs', hkscsdecmap) + writer.update_decode_map(BIG5HKSCS_C1, BIG5HKSCS_C2) + writer.generate() + + print('Generating BIG5HKSCS decode map Unicode plane hints...') + writer = HintsWriter(fp, 'big5hkscs', isbmpmap) + writer.fillhints(bh2s(0x8740), bh2s(0xa0fe)) + writer.fillhints(bh2s(0xc6a1), bh2s(0xc8fe)) + writer.fillhints(bh2s(0xf9d6), bh2s(0xfefe)) + + print('Generating BIG5HKSCS encode map (BMP)...') + writer = EncodeMapWriter(fp, 'big5hkscs_bmp', hkscsencmap_bmp) + writer.generate() + + print('Generating BIG5HKSCS encode map (non-BMP)...') + writer = EncodeMapWriter(fp, 'big5hkscs_nonbmp', hkscsencmap_nonbmp) + writer.generate() + + +if __name__ == '__main__': + main_tw() + main_hkscs() diff --git a/Tools/wasm/README.md b/Tools/wasm/README.md index 83806f0581ace3..40f23a396f6711 100644 --- a/Tools/wasm/README.md +++ b/Tools/wasm/README.md @@ -98,9 +98,11 @@ popd ``` ```shell -node --experimental-wasm-threads --experimental-wasm-bulk-memory builddir/emscripten-node/python.js +node --experimental-wasm-threads --experimental-wasm-bulk-memory --experimental-wasm-bigint builddir/emscripten-node/python.js ``` +(``--experimental-wasm-bigint`` is not needed with recent NodeJS versions) + # wasm32-emscripten limitations and issues Emscripten before 3.1.8 has known bugs that can cause memory corruption and @@ -173,6 +175,8 @@ functions. [bpo-46390](https://bugs.python.org/issue46390). - Python's object allocator ``obmalloc`` is disabled by default. 
 - ``ensurepip`` is not available.
+- Some ``ctypes`` features like ``c_longlong`` and ``c_longdouble`` may need
+  NodeJS option ``--experimental-wasm-bigint``.
 
 ## wasm32-emscripten in browsers
 
@@ -220,10 +224,47 @@ AddType application/wasm wasm
 
 # WASI (wasm32-wasi)
 
-WASI builds require [WASI SDK](https://github.com/WebAssembly/wasi-sdk) and
-currently [wasix](https://github.com/singlestore-labs/wasix) for POSIX
+WASI builds require [WASI SDK](https://github.com/WebAssembly/wasi-sdk) 15.0+
+and currently [wasix](https://github.com/singlestore-labs/wasix) for POSIX
 compatibility stubs.
 
+## WASI limitations and issues (WASI SDK 15.0)
+
+A lot of Emscripten limitations also apply to WASI. Noticeable restrictions
+are:
+
+- Call stack size is limited. Default recursion limit and parser stack size
+  are smaller than in regular Python builds.
+- ``socket(2)`` cannot create new socket file descriptors. WASI programs can
+  call read/write/accept on a file descriptor that is passed into the process.
+- ``socket.gethostname()`` and host name resolution APIs like
+  ``socket.gethostbyname()`` are not implemented and always fail.
+- ``open(2)`` checks flags more strictly. Caller must pass either
+  ``O_RDONLY``, ``O_RDWR``, or ``O_WRONLY`` to ``os.open``. Directory file
+  descriptors must be created with flags ``O_RDONLY | O_DIRECTORY``.
+- ``chmod(2)`` is not available. It's not possible to modify file permissions
+  yet. A future version of WASI may provide a limited ``set_permissions`` API.
+- User/group related features like ``os.chown()``, ``os.getuid``, etc. are
+  stubs or fail with ``ENOTSUP``.
+- File locking (``fcntl``) is not available.
+- ``os.pipe()``, ``os.mkfifo()``, and ``os.mknod()`` are not supported.
+- ``process_time`` does not work as expected because it's implemented using
+  the wall clock.
+- ``os.umask()`` is a stub.
+- ``sys.executable`` is empty.
+- ``/dev/null`` / ``os.devnull`` may not be available.
+- ``os.utime*()`` is buggy in WASI SDK 15.0, see
+  [utimensat() with timespec=NULL sets wrong time](https://github.com/bytecodealliance/wasmtime/issues/4184)
+- ``os.symlink()`` fails with ``PermissionError`` when attempting to create a
+  symlink with an absolute path with wasmtime 0.36.0. The wasmtime runtime
+  uses the ``openat2(2)`` syscall with flag ``RESOLVE_BENEATH`` to open files.
+  The flag causes the syscall to reject symlinks with absolute paths.
+- ``os.curdir`` (aka ``.``) seems to behave differently, which breaks some
+  ``importlib`` tests that add ``.`` to ``sys.path`` and indirectly
+  ``sys.path_importer_cache``.
+- WASI runtime environments may not provide a dedicated temp directory.
+
+
 # Detect WebAssembly builds
 
 ## Python code
@@ -301,3 +342,81 @@ Feature detection flags:
 * ``__wasm_bulk_memory__``
 * ``__wasm_atomics__``
 * ``__wasm_mutable_globals__``
+
+## Install SDKs and dependencies manually
+
+In some cases (e.g. build bots) you may prefer to install build dependencies
+directly on the system instead of using the container image. Total disk size
+of SDKs and cached libraries is about 1.6 GB.
+
+### Install OS dependencies
+
+```shell
+# Debian/Ubuntu
+apt update
+apt install -y git make xz-utils bzip2 curl python3-minimal ccache
+```
+
+```shell
+# Fedora
+dnf install -y git make xz bzip2 which ccache
+```
+
+### Install [Emscripten SDK](https://emscripten.org/docs/getting_started/downloads.html)
+
+**NOTE**: Follow the on-screen instructions on how to add the SDK to ``PATH``.
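Editorial aside before the SDK installation commands continue below (not part of the patch): to make the WASI/Emscripten restrictions listed above concrete, here is a minimal sketch of guarding platform-specific code. It assumes ``sys.platform`` reports ``emscripten`` or ``wasi`` on WebAssembly builds, as the "Detect WebAssembly builds" section above describes.

```python
import sys

# Minimal sketch (assumption: sys.platform is "emscripten" or "wasi" on
# WebAssembly builds, per the "Detect WebAssembly builds" section).
def is_wasm_build() -> bool:
    return sys.platform in {"emscripten", "wasi"}

if is_wasm_build():
    # e.g. chmod(2), fcntl file locking and socket(2) are unavailable
    # according to the limitations listed above.
    print("skipping tests that need chmod()/socket()/file locking")
```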
+ +```shell +git clone https://github.com/emscripten-core/emsdk.git /opt/emsdk +/opt/emsdk/emsdk install latest +/opt/emsdk/emsdk activate latest +``` + +### Optionally: pre-build and cache static libraries + +Emscripten SDK provides static builds of core libraries without PIC +(position-independent code). Python builds with ``dlopen`` support require +PIC. To populate the build cache, run: + +```shell +. /opt/emsdk/emsdk_env.sh +embuilder build --force zlib bzip2 +embuilder build --force --pic \ + zlib bzip2 libc-mt libdlmalloc-mt libsockets-mt \ + libstubs libcompiler_rt libcompiler_rt-mt crtbegin libhtml5 \ + libc++-mt-noexcept libc++abi-mt-noexcept \ + libal libGL-mt libstubs-debug libc-mt-debug +``` + +### Install [WASI-SDK](https://github.com/WebAssembly/wasi-sdk) + +**NOTE**: WASI-SDK's clang may show a warning on Fedora: +``/lib64/libtinfo.so.6: no version information available``, +[RHBZ#1875587](https://bugzilla.redhat.com/show_bug.cgi?id=1875587). + +```shell +export WASI_VERSION=16 +export WASI_VERSION_FULL=${WASI_VERSION}.0 +curl -sSf -L -O https://github.com/WebAssembly/wasi-sdk/releases/download/wasi-sdk-${WASI_VERSION}/wasi-sdk-${WASI_VERSION_FULL}-linux.tar.gz +mkdir -p /opt/wasi-sdk +tar --strip-components=1 -C /opt/wasi-sdk -xvf wasi-sdk-${WASI_VERSION_FULL}-linux.tar.gz +rm -f wasi-sdk-${WASI_VERSION_FULL}-linux.tar.gz +``` + +### Install [wasmtime](https://github.com/bytecodealliance/wasmtime) WASI runtime + +**NOTE**: wasmtime 0.37 has a bug. Newer versions should be fine again. + +```shell +curl -sSf -L -o ~/install-wasmtime.sh https://wasmtime.dev/install.sh +chmod +x ~/install-wasmtime.sh +~/install-wasmtime.sh --version v0.36.0 +ln -srf -t /usr/local/bin/ ~/.wasmtime/bin/wasmtime +``` + +### Install [WASIX](https://github.com/singlestore-labs/wasix) + +```shell +git clone https://github.com/singlestore-labs/wasix.git ~/wasix +make install -C ~/wasix +``` diff --git a/Tools/wasm/config.site-wasm32-emscripten b/Tools/wasm/config.site-wasm32-emscripten index 33636648eaa520..a31d60d05dd770 100644 --- a/Tools/wasm/config.site-wasm32-emscripten +++ b/Tools/wasm/config.site-wasm32-emscripten @@ -43,6 +43,13 @@ ac_cv_func_symlinkat=no ac_cv_func_lchmod=no ac_cv_func_lchown=no +# geteuid / getegid are stubs and always return 0 (root). The stub breaks +# code that assume effective user root has special permissions. +ac_cv_func_geteuid=no +ac_cv_func_getegid=no +ac_cv_func_seteuid=no +ac_cv_func_setegid=no + # Syscalls not implemented in emscripten # [Errno 52] Function not implemented ac_cv_func_preadv2=no @@ -89,5 +96,8 @@ ac_cv_func_setresgid=no ac_cv_func_link=no ac_cv_func_linkat=no +# Emscripten's faccessat does not accept AT_* flags. +ac_cv_func_faccessat=no + # alarm signal is not delivered, may need a callback into the event loop? ac_cv_func_alarm=no diff --git a/Tools/wasm/config.site-wasm32-wasi b/Tools/wasm/config.site-wasm32-wasi index 255e99c279a0a3..f151b7bc5ab0c9 100644 --- a/Tools/wasm/config.site-wasm32-wasi +++ b/Tools/wasm/config.site-wasm32-wasi @@ -17,3 +17,30 @@ ac_cv_header_sys_resource_h=no # undefined symbols / unsupported features ac_cv_func_eventfd=no + +# WASI SDK 15.0 has no pipe syscall. +ac_cv_func_pipe=no + +# WASI SDK 15.0 cannot create fifos and special files. +ac_cv_func_mkfifo=no +ac_cv_func_mkfifoat=no +ac_cv_func_mknod=no +ac_cv_func_mknodat=no +ac_cv_func_makedev=no + +# fdopendir() fails on SDK 15.0, +# OSError: [Errno 28] Invalid argument: '.' +ac_cv_func_fdopendir=no + +# WASIX stubs we don't want to use. 
+ac_cv_func_kill=no + +# WASI SDK 15.0 does not have chmod. +# Ignore WASIX stubs for now. +ac_cv_func_chmod=no +ac_cv_func_fchmod=no + +# WASI sockets are limited to operations on given socket fd and inet sockets. +# Disable AF_UNIX and AF_PACKET support, see socketmodule.h. +ac_cv_header_sys_un_h=no +ac_cv_header_netpacket_packet_h=no diff --git a/Tools/wasm/wasm_assets.py b/Tools/wasm/wasm_assets.py index fba70b9c9d0426..a59db9db7cdd20 100755 --- a/Tools/wasm/wasm_assets.py +++ b/Tools/wasm/wasm_assets.py @@ -13,18 +13,13 @@ import pathlib import shutil import sys +import sysconfig import zipfile # source directory SRCDIR = pathlib.Path(__file__).parent.parent.parent.absolute() SRCDIR_LIB = SRCDIR / "Lib" -# sysconfig data relative to build dir. -SYSCONFIGDATA = pathlib.PurePath( - "build", - f"lib.emscripten-wasm32-{sys.version_info.major}.{sys.version_info.minor}", - "_sysconfigdata__emscripten_wasm32-emscripten.py", -) # Library directory relative to $(prefix). WASM_LIB = pathlib.PurePath("lib") @@ -114,12 +109,21 @@ "_zoneinfo": ["zoneinfo/"], } -# regression test sub directories -OMIT_SUBDIRS = ( - "ctypes/test/", - "tkinter/test/", - "unittest/test/", -) +def get_builddir(args: argparse.Namespace) -> pathlib.Path: + """Get builddir path from pybuilddir.txt + """ + with open("pybuilddir.txt", encoding="utf-8") as f: + builddir = f.read() + return pathlib.Path(builddir) + + +def get_sysconfigdata(args: argparse.Namespace) -> pathlib.Path: + """Get path to sysconfigdata relative to build root + """ + data_name = sysconfig._get_sysconfigdata_name() + assert "emscripten_wasm32" in data_name + filename = data_name + ".py" + return args.builddir / filename def create_stdlib_zip( @@ -127,9 +131,6 @@ def create_stdlib_zip( *, optimize: int = 0, ) -> None: - def filterfunc(name: str) -> bool: - return not name.startswith(args.omit_subdirs_absolute) - with zipfile.PyZipFile( args.wasm_stdlib_zip, mode="w", compression=args.compression, optimize=optimize ) as pzf: @@ -143,14 +144,14 @@ def filterfunc(name: str) -> bool: continue if entry.name.endswith(".py") or entry.is_dir(): # writepy() writes .pyc files (bytecode). - pzf.writepy(entry, filterfunc=filterfunc) + pzf.writepy(entry) def detect_extension_modules(args: argparse.Namespace): modules = {} # disabled by Modules/Setup.local ? 
- with open(args.builddir / "Makefile") as f: + with open(args.buildroot / "Makefile") as f: for line in f: if line.startswith("MODDISABLED_NAMES="): disabled = line.split("=", 1)[1].strip().split() @@ -183,8 +184,8 @@ def path(val: str) -> pathlib.Path: parser = argparse.ArgumentParser() parser.add_argument( - "--builddir", - help="absolute build directory", + "--buildroot", + help="absolute path to build root", default=pathlib.Path(".").absolute(), type=path, ) @@ -202,7 +203,7 @@ def main(): relative_prefix = args.prefix.relative_to(pathlib.Path("/")) args.srcdir = SRCDIR args.srcdir_lib = SRCDIR_LIB - args.wasm_root = args.builddir / relative_prefix + args.wasm_root = args.buildroot / relative_prefix args.wasm_stdlib_zip = args.wasm_root / WASM_STDLIB_ZIP args.wasm_stdlib = args.wasm_root / WASM_STDLIB args.wasm_dynload = args.wasm_root / WASM_DYNLOAD @@ -212,9 +213,10 @@ def main(): args.compression = zipfile.ZIP_DEFLATED args.compresslevel = 9 - args.sysconfig_data = args.builddir / SYSCONFIGDATA + args.builddir = get_builddir(args) + args.sysconfig_data = get_sysconfigdata(args) if not args.sysconfig_data.is_file(): - raise ValueError(f"sysconfigdata file {SYSCONFIGDATA} missing.") + raise ValueError(f"sysconfigdata file {args.sysconfig_data} missing.") extmods = detect_extension_modules(args) omit_files = list(OMIT_FILES) @@ -224,9 +226,6 @@ def main(): omit_files.extend(modfiles) args.omit_files_absolute = {args.srcdir_lib / name for name in omit_files} - args.omit_subdirs_absolute = tuple( - str(args.srcdir_lib / name) for name in OMIT_SUBDIRS - ) # Empty, unused directory for dynamic libs, but required for site initialization. args.wasm_dynload.mkdir(parents=True, exist_ok=True) diff --git a/configure b/configure index 02810880e914f9..9f15e492b997c2 100755 --- a/configure +++ b/configure @@ -949,6 +949,7 @@ CONFIG_ARGS SOVERSION VERSION PYTHON_FOR_REGEN +PYTHON_FOR_BUILD_DEPS FREEZE_MODULE_DEPS FREEZE_MODULE FREEZE_MODULE_BOOTSTRAP_DEPS @@ -1061,7 +1062,6 @@ with_openssl with_openssl_rpath with_ssl_default_suites with_builtin_hashlib_hashes -with_experimental_isolated_subinterpreters enable_test_modules ' ac_precious_vars='build_alias @@ -1861,9 +1861,6 @@ Optional Packages: --with-builtin-hashlib-hashes=md5,sha1,sha256,sha512,sha3,blake2 builtin hash modules, md5, sha1, sha256, sha512, sha3 (with shake), blake2 - --with-experimental-isolated-subinterpreters - better isolate subinterpreters, experimental build - mode (default is no) Some influential environment variables: PKG_CONFIG path to pkg-config utility @@ -3286,6 +3283,7 @@ if test "x$cross_compiling" = xyes; then : FREEZE_MODULE_BOOTSTRAP_DEPS='$(srcdir)/Programs/_freeze_module.py' FREEZE_MODULE='$(FREEZE_MODULE_BOOTSTRAP)' FREEZE_MODULE_DEPS='$(FREEZE_MODULE_BOOTSTRAP_DEPS)' + PYTHON_FOR_BUILD_DEPS='' else @@ -3293,6 +3291,7 @@ else FREEZE_MODULE_BOOTSTRAP_DEPS="Programs/_freeze_module" FREEZE_MODULE='$(PYTHON_FOR_FREEZE) $(srcdir)/Programs/_freeze_module.py' FREEZE_MODULE_DEPS="_bootstrap_python \$(srcdir)/Programs/_freeze_module.py" + PYTHON_FOR_BUILD_DEPS='$(BUILDPYTHON)' fi @@ -3301,6 +3300,7 @@ fi + for ac_prog in python$PACKAGE_VERSION python3.10 python3.9 python3.8 python3.7 python3.6 python3 python do # Extract the first word of "$ac_prog", so it can be a program name with args. @@ -5217,6 +5217,39 @@ $as_echo "$ac_cv_path_EGREP" >&6; } +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for CC compiler name" >&5 +$as_echo_n "checking for CC compiler name... 
" >&6; } +if ${ac_cv_cc_name+:} false; then : + $as_echo_n "(cached) " >&6 +else + +cat > conftest.c <conftest.out 2>/dev/null; then + ac_cv_cc_name=`grep -v '^#' conftest.out | grep -v '^ *$' | tr -d ' '` +else + ac_cv_cc_name="unknown" +fi +rm -f conftest.c conftest.out + +fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_cc_name" >&5 +$as_echo "$ac_cv_cc_name" >&6; } + # checks for UNIX variants that set C preprocessor variables # may set _GNU_SOURCE, __EXTENSIONS__, _POSIX_PTHREAD_SEMANTICS, # _POSIX_SOURCE, _POSIX_1_SOURCE, and more @@ -6176,6 +6209,66 @@ if test x$MULTIARCH != x; then fi +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for PEP 11 support tier" >&5 +$as_echo_n "checking for PEP 11 support tier... " >&6; } +case $host/$ac_cv_cc_name in #( + x86_64-*-linux-gnu/gcc) : + PY_SUPPORT_TIER=1 ;; #( + x86_64-apple-darwin*/clang) : + PY_SUPPORT_TIER=1 ;; #( + i686-pc-windows-msvc/msvc) : + PY_SUPPORT_TIER=1 ;; #( + x86_64-pc-windows-msvc/msvc) : + PY_SUPPORT_TIER=1 ;; #( + + aarch64-apple-darwin*/clang) : + PY_SUPPORT_TIER=2 ;; #( + aarch64-*-linux-gnu/gcc) : + PY_SUPPORT_TIER=2 ;; #( + aarch64-*-linux-gnu/clang) : + PY_SUPPORT_TIER=2 ;; #( + powerpc64le-*-linux-gnu/gcc) : + PY_SUPPORT_TIER=2 ;; #( + x86_64-*-linux-gnu/clang) : + PY_SUPPORT_TIER=2 ;; #( + + aarch64-pc-windows-msvc/msvc) : + PY_SUPPORT_TIER=3 ;; #( + armv7l-*-linux-gnueabihf/gcc) : + PY_SUPPORT_TIER=3 ;; #( + powerpc64le-*-linux-gnu/clang) : + PY_SUPPORT_TIER=3 ;; #( + s390x-*-linux-gnu/gcc) : + PY_SUPPORT_TIER=3 ;; #( + x86_64-*-freebsd/clang) : + PY_SUPPORT_TIER=3 ;; #( + *) : + PY_SUPPORT_TIER=0 + ;; +esac + +case $PY_SUPPORT_TIER in #( + 1) : + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $host/$ac_cv_cc_name has tier 1 (supported)" >&5 +$as_echo "$host/$ac_cv_cc_name has tier 1 (supported)" >&6; } ;; #( + 2) : + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $host/$ac_cv_cc_name has tier 2 (supported)" >&5 +$as_echo "$host/$ac_cv_cc_name has tier 2 (supported)" >&6; } ;; #( + 3) : + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $host/$ac_cv_cc_name has tier 3 (partially supported)" >&5 +$as_echo "$host/$ac_cv_cc_name has tier 3 (partially supported)" >&6; } ;; #( + *) : + { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: $host/$ac_cv_cc_name is not supported" >&5 +$as_echo "$as_me: WARNING: $host/$ac_cv_cc_name is not supported" >&2;} + ;; +esac + + +cat >>confdefs.h <<_ACEOF +#define PY_SUPPORT_TIER $PY_SUPPORT_TIER +_ACEOF + + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for -Wl,--no-as-needed" >&5 $as_echo_n "checking for -Wl,--no-as-needed... 
" >&6; } if ${ac_cv_wl_no_as_needed+:} false; then : @@ -6682,16 +6775,16 @@ then case $ac_sys_system/$ac_sys_emscripten_target in #( Emscripten/node*) : + # bigint for ctypes c_longlong, c_longdouble + HOSTRUNNER="node --experimental-wasm-bigint" if test "x$enable_wasm_pthreads" = xyes; then : - HOSTRUNNER='node --experimental-wasm-threads --experimental-wasm-bulk-memory' - -else - - HOSTRUNNER='node' + HOSTRUNNER="$HOSTRUNNER --experimental-wasm-threads --experimental-wasm-bulk-memory" fi ;; #( + WASI/*) : + HOSTRUNNER='wasmtime run --env PYTHONPATH=/$(shell realpath --relative-to $(abs_srcdir) $(abs_builddir))/$(shell cat pybuilddir.txt):/Lib --mapdir /::$(srcdir) --' ;; #( *) : HOSTRUNNER='' ;; @@ -6701,6 +6794,10 @@ fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: $HOSTRUNNER" >&5 $as_echo "$HOSTRUNNER" >&6; } +if test -n "$HOSTRUNNER"; then + PYTHON_FOR_BUILD="_PYTHON_HOSTRUNNER='$HOSTRUNNER' $PYTHON_FOR_BUILD" +fi + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $LDLIBRARY" >&5 $as_echo "$LDLIBRARY" >&6; } @@ -7856,6 +7953,8 @@ fi as_fn_append LDFLAGS_NODIST " -sALLOW_MEMORY_GROWTH -sTOTAL_MEMORY=20971520" + as_fn_append LDFLAGS_NODIST " -sWASM_BIGINT" + as_fn_append LDFLAGS_NODIST " -sFORCE_FILESYSTEM -lidbfs.js -lnodefs.js -lproxyfs.js -lworkerfs.js" if test "x$enable_wasm_dynamic_linking" = xyes; then : @@ -8851,7 +8950,8 @@ $as_echo "#define STDC_HEADERS 1" >>confdefs.h # checks for header files for ac_header in \ alloca.h asm/types.h bluetooth.h conio.h crypt.h direct.h dlfcn.h endian.h errno.h fcntl.h grp.h \ - ieeefp.h io.h langinfo.h libintl.h libutil.h linux/auxvec.h sys/auxv.h linux/memfd.h linux/random.h linux/soundcard.h \ + ieeefp.h io.h langinfo.h libintl.h libutil.h linux/auxvec.h sys/auxv.h linux/fs.h linux/memfd.h \ + linux/random.h linux/soundcard.h \ linux/tipc.h linux/wait.h netinet/in.h netpacket/packet.h poll.h process.h pthread.h pty.h \ sched.h setjmp.h shadow.h signal.h spawn.h stropts.h sys/audioio.h sys/bsdtty.h sys/devpoll.h \ sys/endian.h sys/epoll.h sys/event.h sys/eventfd.h sys/file.h sys/ioctl.h sys/kern_control.h \ @@ -18416,52 +18516,6 @@ if test "x$ac_cv_function_prototypes" = xyes; then : $as_echo "#define HAVE_PROTOTYPES 1" >>confdefs.h -fi - -works=no -{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for variable length prototypes and stdarg.h" >&5 -$as_echo_n "checking for variable length prototypes and stdarg.h... " >&6; } -if ${ac_cv_stdarg_prototypes+:} false; then : - $as_echo_n "(cached) " >&6 -else - -cat confdefs.h - <<_ACEOF >conftest.$ac_ext -/* end confdefs.h. */ - -#include -int foo(int x, ...) { - va_list va; - va_start(va, x); - va_arg(va, int); - va_arg(va, char *); - va_arg(va, double); - return 0; -} - -int -main () -{ -return foo(10, "", 3.14); - ; - return 0; -} -_ACEOF -if ac_fn_c_try_compile "$LINENO"; then : - ac_cv_stdarg_prototypes=yes -else - ac_cv_stdarg_prototypes=no -fi -rm -f core conftest.err conftest.$ac_objext conftest.$ac_ext - -fi -{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_stdarg_prototypes" >&5 -$as_echo "$ac_cv_stdarg_prototypes" >&6; } -if test "x$ac_cv_stdarg_prototypes" = xyes; then : - - -$as_echo "#define HAVE_STDARG_PROTOTYPES 1" >>confdefs.h - - fi @@ -21150,72 +21204,6 @@ then LIBS="$LIBS -framework CoreFoundation" fi -{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for %zd printf() format support" >&5 -$as_echo_n "checking for %zd printf() format support... 
" >&6; } -if ${ac_cv_have_size_t_format+:} false; then : - $as_echo_n "(cached) " >&6 -else - if test "$cross_compiling" = yes; then : - ac_cv_have_size_t_format="cross -- assuming yes" - -else - cat confdefs.h - <<_ACEOF >conftest.$ac_ext -/* end confdefs.h. */ - -#include -#include -#include - -#ifdef HAVE_SYS_TYPES_H -#include -#endif - -#ifdef HAVE_SSIZE_T -typedef ssize_t Py_ssize_t; -#elif SIZEOF_VOID_P == SIZEOF_LONG -typedef long Py_ssize_t; -#else -typedef int Py_ssize_t; -#endif - -int main() -{ - char buffer[256]; - - if(sprintf(buffer, "%zd", (size_t)123) < 0) - return 1; - - if (strcmp(buffer, "123")) - return 1; - - if (sprintf(buffer, "%zd", (Py_ssize_t)-123) < 0) - return 1; - - if (strcmp(buffer, "-123")) - return 1; - - return 0; -} - -_ACEOF -if ac_fn_c_try_run "$LINENO"; then : - ac_cv_have_size_t_format=yes -else - ac_cv_have_size_t_format=no -fi -rm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \ - conftest.$ac_objext conftest.beam conftest.$ac_ext -fi - -fi -{ $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_cv_have_size_t_format" >&5 -$as_echo "$ac_cv_have_size_t_format" >&6; } -if test "$ac_cv_have_size_t_format" != no ; then - -$as_echo "#define PY_FORMAT_SIZE_T \"z\"" >>confdefs.h - -fi - ac_fn_c_check_type "$LINENO" "socklen_t" "ac_cv_type_socklen_t" " #ifdef HAVE_SYS_TYPES_H #include @@ -22475,30 +22463,6 @@ fi fi -# --with-experimental-isolated-subinterpreters - -{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for --with-experimental-isolated-subinterpreters" >&5 -$as_echo_n "checking for --with-experimental-isolated-subinterpreters... " >&6; } - -# Check whether --with-experimental-isolated-subinterpreters was given. -if test "${with_experimental_isolated_subinterpreters+set}" = set; then : - withval=$with_experimental_isolated_subinterpreters; -if test "$withval" != no -then - { $as_echo "$as_me:${as_lineno-$LINENO}: result: yes" >&5 -$as_echo "yes" >&6; }; - $as_echo "#define EXPERIMENTAL_ISOLATED_SUBINTERPRETERS 1" >>confdefs.h - -else - { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 -$as_echo "no" >&6; }; -fi -else - { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 -$as_echo "no" >&6; } -fi - - # Check whether to disable test modules. Once set, setup.py will not build # test extension modules and "make install" will not install test suites. { $as_echo "$as_me:${as_lineno-$LINENO}: checking for --disable-test-modules" >&5 @@ -22611,6 +22575,10 @@ case $ac_sys_system in #( py_cv_module__ctypes_test=n/a + py_cv_module_fcntl=n/a + py_cv_module_mmap=n/a + py_cv_module_resource=n/a + py_cv_module_termios=n/a py_cv_module_=n/a @@ -24684,7 +24652,7 @@ $as_echo_n "checking for stdlib extension module _ctypes_test... " >&6; } if test "$py_cv_module__ctypes_test" != "n/a"; then : if test "$TEST_MODULES" = yes; then : - if true; then : + if test "$ac_cv_func_dlopen" = yes; then : py_cv_module__ctypes_test=yes else py_cv_module__ctypes_test=missing @@ -26429,3 +26397,16 @@ If you want a release build with all stable optimizations active (PGO, etc), please run ./configure --enable-optimizations " >&6;} fi + +if test "x$PY_SUPPORT_TIER" = x0; then : + { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: + +Platform \"$host\" with compiler \"$ac_cv_cc_name\" is not supported by the +CPython core team, see https://peps.python.org/pep-0011/ for more information. 
+" >&5 +$as_echo "$as_me: WARNING: + +Platform \"$host\" with compiler \"$ac_cv_cc_name\" is not supported by the +CPython core team, see https://peps.python.org/pep-0011/ for more information. +" >&2;} +fi diff --git a/configure.ac b/configure.ac index eab326232b14d7..f9abd851ea5cbf 100644 --- a/configure.ac +++ b/configure.ac @@ -143,6 +143,7 @@ AS_VAR_IF([cross_compiling], [yes], FREEZE_MODULE_BOOTSTRAP_DEPS='$(srcdir)/Programs/_freeze_module.py' FREEZE_MODULE='$(FREEZE_MODULE_BOOTSTRAP)' FREEZE_MODULE_DEPS='$(FREEZE_MODULE_BOOTSTRAP_DEPS)' + PYTHON_FOR_BUILD_DEPS='' ], [ dnl internal build tools also depend on Programs/_freeze_module and _bootstrap_python. @@ -150,12 +151,14 @@ AS_VAR_IF([cross_compiling], [yes], FREEZE_MODULE_BOOTSTRAP_DEPS="Programs/_freeze_module" FREEZE_MODULE='$(PYTHON_FOR_FREEZE) $(srcdir)/Programs/_freeze_module.py' FREEZE_MODULE_DEPS="_bootstrap_python \$(srcdir)/Programs/_freeze_module.py" + PYTHON_FOR_BUILD_DEPS='$(BUILDPYTHON)' ] ) AC_SUBST([FREEZE_MODULE_BOOTSTRAP]) AC_SUBST([FREEZE_MODULE_BOOTSTRAP_DEPS]) AC_SUBST([FREEZE_MODULE]) AC_SUBST([FREEZE_MODULE_DEPS]) +AC_SUBST([PYTHON_FOR_BUILD_DEPS]) AC_CHECK_PROGS([PYTHON_FOR_REGEN], [python$PACKAGE_VERSION python3.10 python3.9 python3.8 python3.7 python3.6 python3 python], @@ -770,6 +773,35 @@ AC_PROG_GREP AC_PROG_SED AC_PROG_EGREP +dnl detect compiler name +dnl check for xlc before clang, newer xlc's can use clang as frontend. +dnl check for GCC last, other compilers set __GNUC__, too. +dnl msvc is listed for completeness. +AC_CACHE_CHECK([for CC compiler name], [ac_cv_cc_name], [ +cat > conftest.c <conftest.out 2>/dev/null; then + ac_cv_cc_name=`grep -v '^#' conftest.out | grep -v '^ *$' | tr -d ' '` +else + ac_cv_cc_name="unknown" +fi +rm -f conftest.c conftest.out +]) + # checks for UNIX variants that set C preprocessor variables # may set _GNU_SOURCE, __EXTENSIONS__, _POSIX_PTHREAD_SEMANTICS, # _POSIX_SOURCE, _POSIX_1_SOURCE, and more @@ -1031,6 +1063,42 @@ if test x$MULTIARCH != x; then fi AC_SUBST(MULTIARCH_CPPFLAGS) +dnl Support tiers according to https://peps.python.org/pep-0011/ +dnl +dnl NOTE: Windows support tiers are defined in PC/pyconfig.h. 
+dnl +AC_MSG_CHECKING([for PEP 11 support tier]) +AS_CASE([$host/$ac_cv_cc_name], + [x86_64-*-linux-gnu/gcc], [PY_SUPPORT_TIER=1], dnl Linux on AMD64, any vendor, glibc, gcc + [x86_64-apple-darwin*/clang], [PY_SUPPORT_TIER=1], dnl macOS on Intel, any version + [i686-pc-windows-msvc/msvc], [PY_SUPPORT_TIER=1], dnl 32bit Windows on Intel, MSVC + [x86_64-pc-windows-msvc/msvc], [PY_SUPPORT_TIER=1], dnl 64bit Windows on AMD64, MSVC + + [aarch64-apple-darwin*/clang], [PY_SUPPORT_TIER=2], dnl macOS on M1, any version + [aarch64-*-linux-gnu/gcc], [PY_SUPPORT_TIER=2], dnl Linux ARM64, glibc, gcc+clang + [aarch64-*-linux-gnu/clang], [PY_SUPPORT_TIER=2], + [powerpc64le-*-linux-gnu/gcc], [PY_SUPPORT_TIER=2], dnl Linux on PPC64 little endian, glibc, gcc + [x86_64-*-linux-gnu/clang], [PY_SUPPORT_TIER=2], dnl Linux on AMD64, any vendor, glibc, clang + + [aarch64-pc-windows-msvc/msvc], [PY_SUPPORT_TIER=3], dnl Windows ARM64, MSVC + [armv7l-*-linux-gnueabihf/gcc], [PY_SUPPORT_TIER=3], dnl ARMv7 LE with hardware floats, any vendor, glibc, gcc + [powerpc64le-*-linux-gnu/clang], [PY_SUPPORT_TIER=3], dnl Linux on PPC64 little endian, glibc, clang + [s390x-*-linux-gnu/gcc], [PY_SUPPORT_TIER=3], dnl Linux on 64bit s390x (big endian), glibc, gcc + dnl [wasm32-unknown-emscripten/clang], [PY_SUPPORT_TIER=3], dnl WebAssembly Emscripten + dnl [wasm32-unknown-wasi/clang], [PY_SUPPORT_TIER=3], dnl WebAssembly System Interface + [x86_64-*-freebsd/clang], [PY_SUPPORT_TIER=3], dnl FreeBSD on AMD64 + [PY_SUPPORT_TIER=0] +) + +AS_CASE([$PY_SUPPORT_TIER], + [1], [AC_MSG_RESULT([$host/$ac_cv_cc_name has tier 1 (supported)])], + [2], [AC_MSG_RESULT([$host/$ac_cv_cc_name has tier 2 (supported)])], + [3], [AC_MSG_RESULT([$host/$ac_cv_cc_name has tier 3 (partially supported)])], + [AC_MSG_WARN([$host/$ac_cv_cc_name is not supported])] +) + +AC_DEFINE_UNQUOTED([PY_SUPPORT_TIER], [$PY_SUPPORT_TIER], [PEP 11 Support tier (1, 2, 3 or 0 for unsupported)]) + AC_CACHE_CHECK([for -Wl,--no-as-needed], [ac_cv_wl_no_as_needed], [ save_LDFLAGS="$LDFLAGS" AS_VAR_APPEND([LDFLAGS], [-Wl,--no-as-needed]) @@ -1418,18 +1486,27 @@ if test -z "$HOSTRUNNER" then AS_CASE([$ac_sys_system/$ac_sys_emscripten_target], [Emscripten/node*], [ + # bigint for ctypes c_longlong, c_longdouble + HOSTRUNNER="node --experimental-wasm-bigint" AS_VAR_IF([enable_wasm_pthreads], [yes], [ - HOSTRUNNER='node --experimental-wasm-threads --experimental-wasm-bulk-memory' - ], [ - HOSTRUNNER='node' + HOSTRUNNER="$HOSTRUNNER --experimental-wasm-threads --experimental-wasm-bulk-memory" ]) ], + dnl TODO: support other WASI runtimes + dnl wasmtime starts the proces with "/" as CWD. For OOT builds add the + dnl directory containing _sysconfigdata to PYTHONPATH. + [WASI/*], [HOSTRUNNER='wasmtime run --env PYTHONPATH=/$(shell realpath --relative-to $(abs_srcdir) $(abs_builddir))/$(shell cat pybuilddir.txt):/Lib --mapdir /::$(srcdir) --'], [HOSTRUNNER=''] ) fi AC_SUBST([HOSTRUNNER]) AC_MSG_RESULT([$HOSTRUNNER]) +if test -n "$HOSTRUNNER"; then + dnl Pass hostrunner variable as env var in order to expand shell expressions. + PYTHON_FOR_BUILD="_PYTHON_HOSTRUNNER='$HOSTRUNNER' $PYTHON_FOR_BUILD" +fi + AC_MSG_RESULT($LDLIBRARY) # LIBRARY_DEPS, LINK_PYTHON_OBJS and LINK_PYTHON_DEPS variable @@ -1947,10 +2024,13 @@ AS_CASE([$ac_sys_system], dnl build with WASM debug info if either Py_DEBUG is set or the target is dnl node-debug or browser-debug. 
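Editorial aside, not part of the patch: the ``HOSTRUNNER`` chosen above is forwarded to build-time Python through the ``_PYTHON_HOSTRUNNER`` environment variable set on ``PYTHON_FOR_BUILD``. A minimal sketch of a helper consuming it, mirroring the ``Tools/scripts/run_tests.py`` hunk at the top of this section; the variable names come from this patch, everything else is illustrative.

```python
import os
import shlex
import sysconfig

# Sketch; _PYTHON_HOSTRUNNER is exported via PYTHON_FOR_BUILD in configure.ac,
# BUILDPYTHON is the config var used by run_tests.py earlier in this patch.
hostrunner = os.environ.get("_PYTHON_HOSTRUNNER", "")
buildpython = sysconfig.get_config_var("BUILDPYTHON") or "python"
cmd = [*shlex.split(hostrunner), buildpython, "-m", "test"]
print(shlex.join(cmd))
```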
AS_VAR_IF([Py_DEBUG], [yes], [wasm_debug=yes], [wasm_debug=no]) - + dnl Start with 20 MB and allow to grow AS_VAR_APPEND([LDFLAGS_NODIST], [" -sALLOW_MEMORY_GROWTH -sTOTAL_MEMORY=20971520"]) + dnl map int64_t and uint64_t to JS bigint + AS_VAR_APPEND([LDFLAGS_NODIST], [" -sWASM_BIGINT"]) + dnl Include file system support AS_VAR_APPEND([LDFLAGS_NODIST], [" -sFORCE_FILESYSTEM -lidbfs.js -lnodefs.js -lproxyfs.js -lworkerfs.js"]) @@ -2504,7 +2584,8 @@ AC_DEFINE(STDC_HEADERS, 1, [Define to 1 if you have the ANSI C header files.]) # checks for header files AC_CHECK_HEADERS([ \ alloca.h asm/types.h bluetooth.h conio.h crypt.h direct.h dlfcn.h endian.h errno.h fcntl.h grp.h \ - ieeefp.h io.h langinfo.h libintl.h libutil.h linux/auxvec.h sys/auxv.h linux/memfd.h linux/random.h linux/soundcard.h \ + ieeefp.h io.h langinfo.h libintl.h libutil.h linux/auxvec.h sys/auxv.h linux/fs.h linux/memfd.h \ + linux/random.h linux/soundcard.h \ linux/tipc.h linux/wait.h netinet/in.h netpacket/packet.h poll.h process.h pthread.h pty.h \ sched.h setjmp.h shadow.h signal.h spawn.h stropts.h sys/audioio.h sys/bsdtty.h sys/devpoll.h \ sys/endian.h sys/epoll.h sys/event.h sys/eventfd.h sys/file.h sys/ioctl.h sys/kern_control.h \ @@ -5022,27 +5103,6 @@ AS_VAR_IF([ac_cv_function_prototypes], [yes], [ [Define if your compiler supports function prototype]) ]) -works=no -AC_CACHE_CHECK([for variable length prototypes and stdarg.h], [ac_cv_stdarg_prototypes], [ -AC_COMPILE_IFELSE([AC_LANG_PROGRAM([[ -#include -int foo(int x, ...) { - va_list va; - va_start(va, x); - va_arg(va, int); - va_arg(va, char *); - va_arg(va, double); - return 0; -} -]], [[return foo(10, "", 3.14);]])], - [ac_cv_stdarg_prototypes=yes], [ac_cv_stdarg_prototypes=no]) -]) -AS_VAR_IF([ac_cv_stdarg_prototypes], [yes], [ - AC_DEFINE(HAVE_STDARG_PROTOTYPES, 1, - [Define if your compiler supports variable length function prototypes - (e.g. 
void fprintf(FILE *, char *, ...);) *and* ]) -]) - # check for socketpair PY_CHECK_FUNC([socketpair], [ @@ -6004,52 +6064,6 @@ then LIBS="$LIBS -framework CoreFoundation" fi -AC_CACHE_CHECK([for %zd printf() format support], ac_cv_have_size_t_format, [dnl -AC_RUN_IFELSE([AC_LANG_SOURCE([[ -#include -#include -#include - -#ifdef HAVE_SYS_TYPES_H -#include -#endif - -#ifdef HAVE_SSIZE_T -typedef ssize_t Py_ssize_t; -#elif SIZEOF_VOID_P == SIZEOF_LONG -typedef long Py_ssize_t; -#else -typedef int Py_ssize_t; -#endif - -int main() -{ - char buffer[256]; - - if(sprintf(buffer, "%zd", (size_t)123) < 0) - return 1; - - if (strcmp(buffer, "123")) - return 1; - - if (sprintf(buffer, "%zd", (Py_ssize_t)-123) < 0) - return 1; - - if (strcmp(buffer, "-123")) - return 1; - - return 0; -} -]])], - [ac_cv_have_size_t_format=yes], - [ac_cv_have_size_t_format=no], - [ac_cv_have_size_t_format="cross -- assuming yes" -])]) -if test "$ac_cv_have_size_t_format" != no ; then - AC_DEFINE(PY_FORMAT_SIZE_T, "z", - [Define to printf format modifier for Py_ssize_t]) -fi - AC_CHECK_TYPE(socklen_t,, AC_DEFINE(socklen_t,int, [Define to `int' if does not define.]),[ @@ -6607,23 +6621,6 @@ AS_VAR_IF([with_builtin_blake2], [yes], [ ], [have_libb2=no]) ]) -# --with-experimental-isolated-subinterpreters -AH_TEMPLATE(EXPERIMENTAL_ISOLATED_SUBINTERPRETERS, - [Better isolate subinterpreters, experimental build mode.]) -AC_MSG_CHECKING(for --with-experimental-isolated-subinterpreters) -AC_ARG_WITH(experimental-isolated-subinterpreters, - AS_HELP_STRING([--with-experimental-isolated-subinterpreters], - [better isolate subinterpreters, experimental build mode (default is no)]), -[ -if test "$withval" != no -then - AC_MSG_RESULT(yes); - AC_DEFINE(EXPERIMENTAL_ISOLATED_SUBINTERPRETERS) -else - AC_MSG_RESULT(no); -fi], -[AC_MSG_RESULT(no)]) - # Check whether to disable test modules. Once set, setup.py will not build # test extension modules and "make install" will not install test suites. AC_MSG_CHECKING([for --disable-test-modules]) @@ -6690,8 +6687,13 @@ AS_CASE([$ac_sys_system], ], [Emscripten/node*], [], [WASI/*], [ + dnl WASI SDK 15.0 does not support file locking, mmap, and more. PY_STDLIB_MOD_SET_NA( [_ctypes_test], + [fcntl], + [mmap], + [resource], + [termios], ) ] ) @@ -6912,7 +6914,7 @@ PY_STDLIB_MOD([_testbuffer], [test "$TEST_MODULES" = yes]) PY_STDLIB_MOD([_testimportmultiple], [test "$TEST_MODULES" = yes], [test "$ac_cv_func_dlopen" = yes]) PY_STDLIB_MOD([_testmultiphase], [test "$TEST_MODULES" = yes], [test "$ac_cv_func_dlopen" = yes]) PY_STDLIB_MOD([_xxtestfuzz], [test "$TEST_MODULES" = yes]) -PY_STDLIB_MOD([_ctypes_test], [test "$TEST_MODULES" = yes], [], [], [-lm]) +PY_STDLIB_MOD([_ctypes_test], [test "$TEST_MODULES" = yes], [test "$ac_cv_func_dlopen" = yes], [], [-lm]) dnl Limited API template modules. dnl The limited C API is not compatible with the Py_TRACE_REFS macro. @@ -6952,3 +6954,9 @@ If you want a release build with all stable optimizations active (PGO, etc), please run ./configure --enable-optimizations ]) fi + +AS_VAR_IF([PY_SUPPORT_TIER], [0], [AC_MSG_WARN([ + +Platform "$host" with compiler "$ac_cv_cc_name" is not supported by the +CPython core team, see https://peps.python.org/pep-0011/ for more information. +])]) diff --git a/pyconfig.h.in b/pyconfig.h.in index 383fd47dd43c81..a09652ec15e536 100644 --- a/pyconfig.h.in +++ b/pyconfig.h.in @@ -44,9 +44,6 @@ /* Define if --enable-ipv6 is specified */ #undef ENABLE_IPV6 -/* Better isolate subinterpreters, experimental build mode. 
*/
-#undef EXPERIMENTAL_ISOLATED_SUBINTERPRETERS
-
 /* Define to 1 if your system stores words within floats with the most
    significant word first */
 #undef FLOAT_WORDS_BIGENDIAN
@@ -688,6 +685,9 @@
 /* Define if compiling using Linux 4.1 or later. */
 #undef HAVE_LINUX_CAN_RAW_JOIN_FILTERS
 
+/* Define to 1 if you have the <linux/fs.h> header file. */
+#undef HAVE_LINUX_FS_H
+
 /* Define to 1 if you have the <linux/memfd.h> header file. */
 #undef HAVE_LINUX_MEMFD_H
 
@@ -1115,10 +1115,6 @@
 /* Define if you have struct stat.st_mtimensec */
 #undef HAVE_STAT_TV_NSEC2
 
-/* Define if your compiler supports variable length function prototypes (e.g.
-   void fprintf(FILE *, char *, ...);) *and* <stdarg.h> */
-#undef HAVE_STDARG_PROTOTYPES
-
 /* Define to 1 if you have the <stdint.h> header file. */
 #undef HAVE_STDINT_H
 
@@ -1506,9 +1502,6 @@
 /* Define if you want to coerce the C locale to a UTF-8 based locale */
 #undef PY_COERCE_C_LOCALE
 
-/* Define to printf format modifier for Py_ssize_t */
-#undef PY_FORMAT_SIZE_T
-
 /* Define to 1 to build the sqlite module with loadable extensions support. */
 #undef PY_SQLITE_ENABLE_LOAD_EXTENSION
 
@@ -1522,6 +1515,9 @@
 /* Cipher suite string for PY_SSL_DEFAULT_CIPHERS=0 */
 #undef PY_SSL_DEFAULT_CIPHER_STRING
 
+/* PEP 11 Support tier (1, 2, 3 or 0 for unsupported) */
+#undef PY_SUPPORT_TIER
+
 /* Define if you want to build an interpreter with many run-time checks. */
 #undef Py_DEBUG
 
diff --git a/setup.py b/setup.py
index f45cd6de33749b..bba344c3af07ca 100644
--- a/setup.py
+++ b/setup.py
@@ -538,7 +538,6 @@ def print_three_column(lst):
 
         if self.missing:
             print()
-            print("Python build finished successfully!")
             print("The necessary bits to build these optional modules were not "
                   "found:")
             print_three_column(self.missing)
@@ -1408,9 +1407,6 @@ def detect_ctypes(self):
             # finding some -z option for the Sun compiler.
             extra_link_args.append('-mimpure-text')
 
-        elif HOST_PLATFORM.startswith('hp-ux'):
-            extra_link_args.append('-fPIC')
-
         ext = Extension('_ctypes', include_dirs=include_dirs,
                         extra_compile_args=extra_compile_args,
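Editorial appendix, not part of the patch: the ``parse_kinds()`` helper changed in ``Tools/scripts/summarize_stats.py`` near the top of this section gains a ``prefix`` parameter so it can also parse the ``EVAL_CALL`` defines in ``Include/pystats.h``. Below is a condensed sketch of that helper exercised on a toy input line; the ``EVAL_CALL_TOTAL`` define shown is hypothetical and used only for illustration.

```python
import collections

def parse_kinds(spec_src, prefix="SPEC_FAIL"):
    # Collect "#define <prefix>_<NAME> <value>" lines, keyed by value.
    defines = collections.defaultdict(list)
    start = "#define " + prefix + "_"
    for line in spec_src:
        line = line.strip()
        if not line.startswith(start):
            continue
        name, val = line[len(start):].split()
        defines[int(val.strip())].append(name.strip())
    return defines

print(parse_kinds(["#define EVAL_CALL_TOTAL 0"], prefix="EVAL_CALL"))
# -> defaultdict(<class 'list'>, {0: ['TOTAL']})
```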