diff --git a/Doc/c-api/buffer.rst b/Doc/c-api/buffer.rst index e572815ffd6259..1e1cabdf242bd1 100644 --- a/Doc/c-api/buffer.rst +++ b/Doc/c-api/buffer.rst @@ -29,7 +29,7 @@ without intermediate copying. Python provides such a facility at the C level in the form of the :ref:`buffer protocol `. This protocol has two sides: -.. index:: single: PyBufferProcs +.. index:: single: PyBufferProcs (C type) - on the producer side, a type can export a "buffer interface" which allows objects of that type to expose information about their underlying buffer. diff --git a/Doc/c-api/code.rst b/Doc/c-api/code.rst index 5082b0cb6ad3f3..382cfbff864072 100644 --- a/Doc/c-api/code.rst +++ b/Doc/c-api/code.rst @@ -22,12 +22,13 @@ bound into a function. .. c:var:: PyTypeObject PyCode_Type This is an instance of :c:type:`PyTypeObject` representing the Python - :class:`code` type. + :ref:`code object `. .. c:function:: int PyCode_Check(PyObject *co) - Return true if *co* is a :class:`code` object. This function always succeeds. + Return true if *co* is a :ref:`code object `. + This function always succeeds. .. c:function:: int PyCode_GetNumFree(PyCodeObject *co) @@ -48,7 +49,7 @@ bound into a function. .. versionchanged:: 3.11 Added ``qualname`` and ``exceptiontable`` parameters. - .. index:: single: PyCode_New + .. index:: single: PyCode_New (C function) .. versionchanged:: 3.12 @@ -61,7 +62,7 @@ bound into a function. Similar to :c:func:`PyUnstable_Code_New`, but with an extra "posonlyargcount" for positional-only arguments. The same caveats that apply to ``PyUnstable_Code_New`` also apply to this function. - .. index:: single: PyCode_NewWithPosOnlyArgs + .. index:: single: PyCode_NewWithPosOnlyArgs (C function) .. versionadded:: 3.8 as ``PyCode_NewWithPosOnlyArgs`` @@ -220,7 +221,7 @@ may change without deprecation warnings. *free* will be called on non-``NULL`` data stored under the new index. Use :c:func:`Py_DecRef` when storing :c:type:`PyObject`. - .. index:: single: _PyEval_RequestCodeExtraIndex + .. index:: single: _PyEval_RequestCodeExtraIndex (C function) .. versionadded:: 3.6 as ``_PyEval_RequestCodeExtraIndex`` @@ -238,7 +239,7 @@ may change without deprecation warnings. If no data was set under the index, set *extra* to ``NULL`` and return 0 without setting an exception. - .. index:: single: _PyCode_GetExtra + .. index:: single: _PyCode_GetExtra (C function) .. versionadded:: 3.6 as ``_PyCode_GetExtra`` @@ -253,7 +254,7 @@ may change without deprecation warnings. Set the extra data stored under the given index to *extra*. Return 0 on success. Set an exception and return -1 on failure. - .. index:: single: _PyCode_SetExtra + .. index:: single: _PyCode_SetExtra (C function) .. versionadded:: 3.6 as ``_PyCode_SetExtra`` diff --git a/Doc/c-api/exceptions.rst b/Doc/c-api/exceptions.rst index c7e3cd9463e5d7..e6309ae7614d34 100644 --- a/Doc/c-api/exceptions.rst +++ b/Doc/c-api/exceptions.rst @@ -180,7 +180,7 @@ For convenience, some of these functions will always return a .. c:function:: PyObject* PyErr_SetFromErrno(PyObject *type) - .. index:: single: strerror() + .. index:: single: strerror (C function) This is a convenience function to raise an exception when a C library function has returned an error and set the C variable :c:data:`errno`. It constructs a @@ -396,7 +396,7 @@ an error value). .. c:function:: int PyErr_ResourceWarning(PyObject *source, Py_ssize_t stack_level, const char *format, ...) 
Function similar to :c:func:`PyErr_WarnFormat`, but *category* is - :exc:`ResourceWarning` and it passes *source* to :func:`warnings.WarningMessage`. + :exc:`ResourceWarning` and it passes *source* to :class:`!warnings.WarningMessage`. .. versionadded:: 3.6 @@ -635,7 +635,7 @@ Signal Handling .. index:: pair: module; signal - single: SIGINT + single: SIGINT (C macro) single: KeyboardInterrupt (built-in exception) This function interacts with Python's signal handling. @@ -666,7 +666,7 @@ Signal Handling .. index:: pair: module; signal - single: SIGINT + single: SIGINT (C macro) single: KeyboardInterrupt (built-in exception) Simulate the effect of a :c:macro:`!SIGINT` signal arriving. @@ -732,7 +732,7 @@ Exception Classes This creates a class object derived from :exc:`Exception` (accessible in C as :c:data:`PyExc_Exception`). - The :attr:`__module__` attribute of the new class is set to the first part (up + The :attr:`!__module__` attribute of the new class is set to the first part (up to the last dot) of the *name* argument, and the class name is set to the last part (after the last dot). The *base* argument can be used to specify alternate base classes; it can either be only one class or a tuple of classes. The *dict* @@ -904,8 +904,8 @@ because the :ref:`call protocol ` takes care of recursion handling. Marks a point where a recursive C-level call is about to be performed. - If :c:macro:`USE_STACKCHECK` is defined, this function checks if the OS - stack overflowed using :c:func:`PyOS_CheckStack`. In this is the case, it + If :c:macro:`!USE_STACKCHECK` is defined, this function checks if the OS + stack overflowed using :c:func:`PyOS_CheckStack`. If this is the case, it sets a :exc:`MemoryError` and returns a nonzero value. The function then checks if the recursion limit is reached. If this is the @@ -968,59 +968,59 @@ All standard Python exceptions are available as global variables whose names are the variables: .. 
index:: - single: PyExc_BaseException - single: PyExc_Exception - single: PyExc_ArithmeticError - single: PyExc_AssertionError - single: PyExc_AttributeError - single: PyExc_BlockingIOError - single: PyExc_BrokenPipeError - single: PyExc_BufferError - single: PyExc_ChildProcessError - single: PyExc_ConnectionAbortedError - single: PyExc_ConnectionError - single: PyExc_ConnectionRefusedError - single: PyExc_ConnectionResetError - single: PyExc_EOFError - single: PyExc_FileExistsError - single: PyExc_FileNotFoundError - single: PyExc_FloatingPointError - single: PyExc_GeneratorExit - single: PyExc_ImportError - single: PyExc_IndentationError - single: PyExc_IndexError - single: PyExc_InterruptedError - single: PyExc_IsADirectoryError - single: PyExc_KeyError - single: PyExc_KeyboardInterrupt - single: PyExc_LookupError - single: PyExc_MemoryError - single: PyExc_ModuleNotFoundError - single: PyExc_NameError - single: PyExc_NotADirectoryError - single: PyExc_NotImplementedError - single: PyExc_OSError - single: PyExc_OverflowError - single: PyExc_PermissionError - single: PyExc_ProcessLookupError - single: PyExc_RecursionError - single: PyExc_ReferenceError - single: PyExc_RuntimeError - single: PyExc_StopAsyncIteration - single: PyExc_StopIteration - single: PyExc_SyntaxError - single: PyExc_SystemError - single: PyExc_SystemExit - single: PyExc_TabError - single: PyExc_TimeoutError - single: PyExc_TypeError - single: PyExc_UnboundLocalError - single: PyExc_UnicodeDecodeError - single: PyExc_UnicodeEncodeError - single: PyExc_UnicodeError - single: PyExc_UnicodeTranslateError - single: PyExc_ValueError - single: PyExc_ZeroDivisionError + single: PyExc_BaseException (C var) + single: PyExc_Exception (C var) + single: PyExc_ArithmeticError (C var) + single: PyExc_AssertionError (C var) + single: PyExc_AttributeError (C var) + single: PyExc_BlockingIOError (C var) + single: PyExc_BrokenPipeError (C var) + single: PyExc_BufferError (C var) + single: PyExc_ChildProcessError (C var) + single: PyExc_ConnectionAbortedError (C var) + single: PyExc_ConnectionError (C var) + single: PyExc_ConnectionRefusedError (C var) + single: PyExc_ConnectionResetError (C var) + single: PyExc_EOFError (C var) + single: PyExc_FileExistsError (C var) + single: PyExc_FileNotFoundError (C var) + single: PyExc_FloatingPointError (C var) + single: PyExc_GeneratorExit (C var) + single: PyExc_ImportError (C var) + single: PyExc_IndentationError (C var) + single: PyExc_IndexError (C var) + single: PyExc_InterruptedError (C var) + single: PyExc_IsADirectoryError (C var) + single: PyExc_KeyError (C var) + single: PyExc_KeyboardInterrupt (C var) + single: PyExc_LookupError (C var) + single: PyExc_MemoryError (C var) + single: PyExc_ModuleNotFoundError (C var) + single: PyExc_NameError (C var) + single: PyExc_NotADirectoryError (C var) + single: PyExc_NotImplementedError (C var) + single: PyExc_OSError (C var) + single: PyExc_OverflowError (C var) + single: PyExc_PermissionError (C var) + single: PyExc_ProcessLookupError (C var) + single: PyExc_RecursionError (C var) + single: PyExc_ReferenceError (C var) + single: PyExc_RuntimeError (C var) + single: PyExc_StopAsyncIteration (C var) + single: PyExc_StopIteration (C var) + single: PyExc_SyntaxError (C var) + single: PyExc_SystemError (C var) + single: PyExc_SystemExit (C var) + single: PyExc_TabError (C var) + single: PyExc_TimeoutError (C var) + single: PyExc_TypeError (C var) + single: PyExc_UnboundLocalError (C var) + single: PyExc_UnicodeDecodeError (C var) + single: 
PyExc_UnicodeEncodeError (C var) + single: PyExc_UnicodeError (C var) + single: PyExc_UnicodeTranslateError (C var) + single: PyExc_ValueError (C var) + single: PyExc_ZeroDivisionError (C var) +-----------------------------------------+---------------------------------+----------+ | C Name | Python Name | Notes | @@ -1151,18 +1151,18 @@ the variables: These are compatibility aliases to :c:data:`PyExc_OSError`: .. index:: - single: PyExc_EnvironmentError - single: PyExc_IOError - single: PyExc_WindowsError + single: PyExc_EnvironmentError (C var) + single: PyExc_IOError (C var) + single: PyExc_WindowsError (C var) +-------------------------------------+----------+ | C Name | Notes | +=====================================+==========+ -| :c:data:`PyExc_EnvironmentError` | | +| :c:data:`!PyExc_EnvironmentError` | | +-------------------------------------+----------+ -| :c:data:`PyExc_IOError` | | +| :c:data:`!PyExc_IOError` | | +-------------------------------------+----------+ -| :c:data:`PyExc_WindowsError` | [2]_ | +| :c:data:`!PyExc_WindowsError` | [2]_ | +-------------------------------------+----------+ .. versionchanged:: 3.3 @@ -1188,17 +1188,17 @@ names are ``PyExc_`` followed by the Python exception name. These have the type the variables: .. index:: - single: PyExc_Warning - single: PyExc_BytesWarning - single: PyExc_DeprecationWarning - single: PyExc_FutureWarning - single: PyExc_ImportWarning - single: PyExc_PendingDeprecationWarning - single: PyExc_ResourceWarning - single: PyExc_RuntimeWarning - single: PyExc_SyntaxWarning - single: PyExc_UnicodeWarning - single: PyExc_UserWarning + single: PyExc_Warning (C var) + single: PyExc_BytesWarning (C var) + single: PyExc_DeprecationWarning (C var) + single: PyExc_FutureWarning (C var) + single: PyExc_ImportWarning (C var) + single: PyExc_PendingDeprecationWarning (C var) + single: PyExc_ResourceWarning (C var) + single: PyExc_RuntimeWarning (C var) + single: PyExc_SyntaxWarning (C var) + single: PyExc_UnicodeWarning (C var) + single: PyExc_UserWarning (C var) +------------------------------------------+---------------------------------+----------+ | C Name | Python Name | Notes | diff --git a/Doc/c-api/file.rst b/Doc/c-api/file.rst index d3a78c588454e8..e9019a0d500f7e 100644 --- a/Doc/c-api/file.rst +++ b/Doc/c-api/file.rst @@ -96,7 +96,7 @@ the :mod:`io` APIs instead. .. c:function:: int PyFile_WriteObject(PyObject *obj, PyObject *p, int flags) - .. index:: single: Py_PRINT_RAW + .. index:: single: Py_PRINT_RAW (C macro) Write object *obj* to file object *p*. The only supported flag for *flags* is :c:macro:`Py_PRINT_RAW`; if given, the :func:`str` of the object is written diff --git a/Doc/c-api/gcsupport.rst b/Doc/c-api/gcsupport.rst index 6b2494ee4f0ed4..621da3eb069949 100644 --- a/Doc/c-api/gcsupport.rst +++ b/Doc/c-api/gcsupport.rst @@ -83,10 +83,15 @@ rules: .. versionadded:: 3.12 -.. c:function:: TYPE* PyObject_GC_Resize(TYPE, PyVarObject *op, Py_ssize_t newsize) +.. c:macro:: PyObject_GC_Resize(TYPE, op, newsize) - Resize an object allocated by :c:macro:`PyObject_NewVar`. Returns the - resized object or ``NULL`` on failure. *op* must not be tracked by the collector yet. + Resize an object allocated by :c:macro:`PyObject_NewVar`. + Returns the resized object of type ``TYPE*`` (refers to any C type) + or ``NULL`` on failure. + + *op* must be of type :c:expr:`PyVarObject *` + and must not be tracked by the collector yet. + *newsize* must be of type :c:type:`Py_ssize_t`. .. 
c:function:: void PyObject_GC_Track(PyObject *op) diff --git a/Doc/c-api/init.rst b/Doc/c-api/init.rst index f8fd48e781d6da..e7199ad5e0c1b1 100644 --- a/Doc/c-api/init.rst +++ b/Doc/c-api/init.rst @@ -332,7 +332,7 @@ Initializing and finalizing the interpreter pair: module; __main__ pair: module; sys triple: module; search; path - single: Py_FinalizeEx() + single: Py_FinalizeEx (C function) Initialize the Python interpreter. In an application embedding Python, this should be called before using any other Python/C API functions; see @@ -661,7 +661,7 @@ operations could cause problems in a multi-threaded program: for example, when two threads simultaneously increment the reference count of the same object, the reference count could end up being incremented only once instead of twice. -.. index:: single: setswitchinterval() (in module sys) +.. index:: single: setswitchinterval (in module sys) Therefore, the rule exists that only the thread that has acquired the :term:`GIL` may operate on Python objects or call Python/C API functions. @@ -671,8 +671,7 @@ released around potentially blocking I/O operations like reading or writing a file, so that other Python threads can run in the meantime. .. index:: - single: PyThreadState - single: PyThreadState + single: PyThreadState (C type) The Python interpreter keeps some thread-specific bookkeeping information inside a data structure called :c:type:`PyThreadState`. There's also one @@ -698,8 +697,8 @@ This is so common that a pair of macros exists to simplify it:: Py_END_ALLOW_THREADS .. index:: - single: Py_BEGIN_ALLOW_THREADS - single: Py_END_ALLOW_THREADS + single: Py_BEGIN_ALLOW_THREADS (C macro) + single: Py_END_ALLOW_THREADS (C macro) The :c:macro:`Py_BEGIN_ALLOW_THREADS` macro opens a new block and declares a hidden local variable; the :c:macro:`Py_END_ALLOW_THREADS` macro closes the @@ -714,8 +713,8 @@ The block above expands to the following code:: PyEval_RestoreThread(_save); .. index:: - single: PyEval_RestoreThread() - single: PyEval_SaveThread() + single: PyEval_RestoreThread (C function) + single: PyEval_SaveThread (C function) Here is how these functions work: the global interpreter lock is used to protect the pointer to the current thread state. When releasing the lock and saving the thread state, @@ -1399,8 +1398,8 @@ function. You can create and destroy them using the following functions: may be stored internally on the :c:type:`PyInterpreterState`. .. index:: - single: Py_FinalizeEx() - single: Py_Initialize() + single: Py_FinalizeEx (C function) + single: Py_Initialize (C function) Extension modules are shared between (sub-)interpreters as follows: @@ -1428,7 +1427,7 @@ function. You can create and destroy them using the following functions: As with multi-phase initialization, this means that only C-level static and global variables are shared between these modules. - .. index:: single: close() (in module os) + .. index:: single: close (in module os) .. c:function:: PyThreadState* Py_NewInterpreter(void) @@ -1451,7 +1450,7 @@ function. You can create and destroy them using the following functions: .. c:function:: void Py_EndInterpreter(PyThreadState *tstate) - .. index:: single: Py_FinalizeEx() + .. index:: single: Py_FinalizeEx (C function) Destroy the (sub-)interpreter represented by the given thread state. The given thread state must be the current thread state. See the @@ -1543,8 +1542,6 @@ pointer and a void pointer argument. .. c:function:: int Py_AddPendingCall(int (*func)(void *), void *arg) - .. 
index:: single: Py_AddPendingCall() - Schedule a function to be called from the main interpreter thread. On success, ``0`` is returned and *func* is queued for being called in the main thread. On failure, ``-1`` is returned without setting any exception. diff --git a/Doc/c-api/intro.rst b/Doc/c-api/intro.rst index dcda1071a58f35..8ef463e3f88ca8 100644 --- a/Doc/c-api/intro.rst +++ b/Doc/c-api/intro.rst @@ -325,8 +325,8 @@ objects that reference each other here; for now, the solution is "don't do that.") .. index:: - single: Py_INCREF() - single: Py_DECREF() + single: Py_INCREF (C function) + single: Py_DECREF (C function) Reference counts are always manipulated explicitly. The normal way is to use the macro :c:func:`Py_INCREF` to take a new reference to an @@ -401,8 +401,8 @@ function, that function assumes that it now owns that reference, and you are not responsible for it any longer. .. index:: - single: PyList_SetItem() - single: PyTuple_SetItem() + single: PyList_SetItem (C function) + single: PyTuple_SetItem (C function) Few functions steal references; the two notable exceptions are :c:func:`PyList_SetItem` and :c:func:`PyTuple_SetItem`, which steal a reference @@ -491,8 +491,8 @@ using :c:func:`PySequence_GetItem` (which happens to take exactly the same arguments), you do own a reference to the returned object. .. index:: - single: PyList_GetItem() - single: PySequence_GetItem() + single: PyList_GetItem (C function) + single: PySequence_GetItem (C function) Here is an example of how you could write a function that computes the sum of the items in a list of integers; once using :c:func:`PyList_GetItem`, and once @@ -587,7 +587,7 @@ caller, then to the caller's caller, and so on, until they reach the top-level interpreter, where they are reported to the user accompanied by a stack traceback. -.. index:: single: PyErr_Occurred() +.. index:: single: PyErr_Occurred (C function) For C programmers, however, error checking always has to be explicit. All functions in the Python/C API can raise exceptions, unless an explicit claim is @@ -601,8 +601,8 @@ ambiguous return value, and require explicit testing for errors with :c:func:`PyErr_Occurred`. These exceptions are always explicitly documented. .. index:: - single: PyErr_SetString() - single: PyErr_Clear() + single: PyErr_SetString (C function) + single: PyErr_Clear (C function) Exception state is maintained in per-thread storage (this is equivalent to using global storage in an unthreaded application). A thread can be in one of @@ -624,7 +624,7 @@ an exception is being passed on between C functions until it reaches the Python bytecode interpreter's main loop, which takes care of transferring it to ``sys.exc_info()`` and friends. -.. index:: single: exc_info() (in module sys) +.. index:: single: exc_info (in module sys) Note that starting with Python 1.5, the preferred, thread-safe way to access the exception state from Python code is to call the function :func:`sys.exc_info`, @@ -709,9 +709,9 @@ Here is the corresponding C code, in all its glory:: .. index:: single: incr_item() .. index:: - single: PyErr_ExceptionMatches() - single: PyErr_Clear() - single: Py_XDECREF() + single: PyErr_ExceptionMatches (C function) + single: PyErr_Clear (C function) + single: Py_XDECREF (C function) This example represents an endorsed use of the ``goto`` statement in C! It illustrates the use of :c:func:`PyErr_ExceptionMatches` and @@ -735,7 +735,7 @@ the finalization, of the Python interpreter. 
Most functionality of the interpreter can only be used after the interpreter has been initialized. .. index:: - single: Py_Initialize() + single: Py_Initialize (C function) pair: module; builtins pair: module; __main__ pair: module; sys @@ -770,10 +770,10 @@ environment variable :envvar:`PYTHONHOME`, or insert additional directories in front of the standard path by setting :envvar:`PYTHONPATH`. .. index:: - single: Py_GetPath() - single: Py_GetPrefix() - single: Py_GetExecPrefix() - single: Py_GetProgramFullPath() + single: Py_GetPath (C function) + single: Py_GetPrefix (C function) + single: Py_GetExecPrefix (C function) + single: Py_GetProgramFullPath (C function) The embedding application can steer the search by setting :c:member:`PyConfig.program_name` *before* calling @@ -784,7 +784,7 @@ control has to provide its own implementation of :c:func:`Py_GetPath`, :c:func:`Py_GetPrefix`, :c:func:`Py_GetExecPrefix`, and :c:func:`Py_GetProgramFullPath` (all defined in :file:`Modules/getpath.c`). -.. index:: single: Py_IsInitialized() +.. index:: single: Py_IsInitialized (C function) Sometimes, it is desirable to "uninitialize" Python. For instance, the application may want to start over (make another call to diff --git a/Doc/c-api/long.rst b/Doc/c-api/long.rst index 9167e9d140f5e2..06bb2435565804 100644 --- a/Doc/c-api/long.rst +++ b/Doc/c-api/long.rst @@ -139,7 +139,7 @@ distinguished from a number. Use :c:func:`PyErr_Occurred` to disambiguate. .. c:function:: long PyLong_AsLong(PyObject *obj) .. index:: - single: LONG_MAX + single: LONG_MAX (C macro) single: OverflowError (built-in exception) Return a C :c:expr:`long` representation of *obj*. If *obj* is not an @@ -232,7 +232,7 @@ distinguished from a number. Use :c:func:`PyErr_Occurred` to disambiguate. .. c:function:: Py_ssize_t PyLong_AsSsize_t(PyObject *pylong) .. index:: - single: PY_SSIZE_T_MAX + single: PY_SSIZE_T_MAX (C macro) single: OverflowError (built-in exception) Return a C :c:type:`Py_ssize_t` representation of *pylong*. *pylong* must @@ -247,7 +247,7 @@ distinguished from a number. Use :c:func:`PyErr_Occurred` to disambiguate. .. c:function:: unsigned long PyLong_AsUnsignedLong(PyObject *pylong) .. index:: - single: ULONG_MAX + single: ULONG_MAX (C macro) single: OverflowError (built-in exception) Return a C :c:expr:`unsigned long` representation of *pylong*. *pylong* @@ -263,7 +263,7 @@ distinguished from a number. Use :c:func:`PyErr_Occurred` to disambiguate. .. c:function:: size_t PyLong_AsSize_t(PyObject *pylong) .. index:: - single: SIZE_MAX + single: SIZE_MAX (C macro) single: OverflowError (built-in exception) Return a C :c:type:`size_t` representation of *pylong*. *pylong* must be diff --git a/Doc/c-api/memory.rst b/Doc/c-api/memory.rst index c05282ffc59521..9da09a21607f61 100644 --- a/Doc/c-api/memory.rst +++ b/Doc/c-api/memory.rst @@ -41,10 +41,10 @@ buffers is performed on demand by the Python memory manager through the Python/C API functions listed in this document. .. 
index:: - single: malloc() - single: calloc() - single: realloc() - single: free() + single: malloc (C function) + single: calloc (C function) + single: realloc (C function) + single: free (C function) To avoid memory corruption, extension writers should never try to operate on Python objects with the functions exported by the C library: :c:func:`malloc`, diff --git a/Doc/c-api/structures.rst b/Doc/c-api/structures.rst index 0032da9659636c..77f2b6991d770e 100644 --- a/Doc/c-api/structures.rst +++ b/Doc/c-api/structures.rst @@ -561,9 +561,9 @@ The following flags can be used with :c:member:`PyMemberDef.flags`: :c:member:`PyMemberDef.offset` to the offset from the ``PyObject`` struct. .. index:: - single: READ_RESTRICTED - single: WRITE_RESTRICTED - single: RESTRICTED + single: READ_RESTRICTED (C macro) + single: WRITE_RESTRICTED (C macro) + single: RESTRICTED (C macro) .. versionchanged:: 3.10 @@ -574,7 +574,7 @@ The following flags can be used with :c:member:`PyMemberDef.flags`: :c:macro:`Py_AUDIT_READ`; :c:macro:`!WRITE_RESTRICTED` does nothing. .. index:: - single: READONLY + single: READONLY (C macro) .. versionchanged:: 3.12 @@ -637,24 +637,24 @@ Macro name C type Python type Reading a ``NULL`` pointer raises :py:exc:`AttributeError`. .. index:: - single: T_BYTE - single: T_SHORT - single: T_INT - single: T_LONG - single: T_LONGLONG - single: T_UBYTE - single: T_USHORT - single: T_UINT - single: T_ULONG - single: T_ULONGULONG - single: T_PYSSIZET - single: T_FLOAT - single: T_DOUBLE - single: T_BOOL - single: T_CHAR - single: T_STRING - single: T_STRING_INPLACE - single: T_OBJECT_EX + single: T_BYTE (C macro) + single: T_SHORT (C macro) + single: T_INT (C macro) + single: T_LONG (C macro) + single: T_LONGLONG (C macro) + single: T_UBYTE (C macro) + single: T_USHORT (C macro) + single: T_UINT (C macro) + single: T_ULONG (C macro) + single: T_ULONGULONG (C macro) + single: T_PYSSIZET (C macro) + single: T_FLOAT (C macro) + single: T_DOUBLE (C macro) + single: T_BOOL (C macro) + single: T_CHAR (C macro) + single: T_STRING (C macro) + single: T_STRING_INPLACE (C macro) + single: T_OBJECT_EX (C macro) single: structmember.h .. versionadded:: 3.12 diff --git a/Doc/c-api/sys.rst b/Doc/c-api/sys.rst index e3c54b075114ff..d6fca1a0b0a219 100644 --- a/Doc/c-api/sys.rst +++ b/Doc/c-api/sys.rst @@ -5,6 +5,7 @@ Operating System Utilities ========================== + .. c:function:: PyObject* PyOS_FSPath(PyObject *path) Return the file system representation for *path*. If the object is a @@ -97,27 +98,30 @@ Operating System Utilities .. c:function:: int PyOS_CheckStack() + .. index:: single: USE_STACKCHECK (C macro) + Return true when the interpreter runs out of stack space. This is a reliable - check, but is only available when :c:macro:`USE_STACKCHECK` is defined (currently + check, but is only available when :c:macro:`!USE_STACKCHECK` is defined (currently on certain versions of Windows using the Microsoft Visual C++ compiler). - :c:macro:`USE_STACKCHECK` will be defined automatically; you should never + :c:macro:`!USE_STACKCHECK` will be defined automatically; you should never change the definition in your own code. +.. c:type:: void (*PyOS_sighandler_t)(int) + + .. c:function:: PyOS_sighandler_t PyOS_getsig(int i) Return the current signal handler for signal *i*. This is a thin wrapper around either :c:func:`!sigaction` or :c:func:`!signal`. Do not call those functions - directly! :c:type:`PyOS_sighandler_t` is a typedef alias for :c:expr:`void - (\*)(int)`. + directly! .. 
c:function:: PyOS_sighandler_t PyOS_setsig(int i, PyOS_sighandler_t h) Set the signal handler for signal *i* to be *h*; return the old signal handler. This is a thin wrapper around either :c:func:`!sigaction` or :c:func:`!signal`. Do - not call those functions directly! :c:type:`PyOS_sighandler_t` is a typedef - alias for :c:expr:`void (\*)(int)`. + not call those functions directly! .. c:function:: wchar_t* Py_DecodeLocale(const char* arg, size_t *size) @@ -342,10 +346,8 @@ accessible to C code. They all work with the current interpreter thread's silently abort the operation by raising an error subclassed from :class:`Exception` (other errors will not be silenced). - The hook function is of type :c:expr:`int (*)(const char *event, PyObject - *args, void *userData)`, where *args* is guaranteed to be a - :c:type:`PyTupleObject`. The hook function is always called with the GIL - held by the Python interpreter that raised the event. + The hook function is always called with the GIL held by the Python + interpreter that raised the event. See :pep:`578` for a detailed description of auditing. Functions in the runtime and standard library that raise events are listed in the @@ -354,12 +356,21 @@ accessible to C code. They all work with the current interpreter thread's .. audit-event:: sys.addaudithook "" c.PySys_AddAuditHook - If the interpreter is initialized, this function raises a auditing event + If the interpreter is initialized, this function raises an auditing event ``sys.addaudithook`` with no arguments. If any existing hooks raise an exception derived from :class:`Exception`, the new hook will not be added and the exception is cleared. As a result, callers cannot assume that their hook has been added unless they control all existing hooks. + .. c:namespace:: NULL + .. c:type:: int (*Py_AuditHookFunction) (const char *event, PyObject *args, void *userData) + + The type of the hook function. + *event* is the C string event argument passed to :c:func:`PySys_Audit` or + :c:func:`PySys_AuditTuple`. + *args* is guaranteed to be a :c:type:`PyTupleObject`. + *userData* is the argument passed to PySys_AddAuditHook(). + .. versionadded:: 3.8 @@ -371,7 +382,7 @@ Process Control .. c:function:: void Py_FatalError(const char *message) - .. index:: single: abort() + .. index:: single: abort (C function) Print a fatal error message and kill the process. No cleanup is performed. This function should only be invoked when a condition is detected that would @@ -391,8 +402,8 @@ Process Control .. c:function:: void Py_Exit(int status) .. index:: - single: Py_FinalizeEx() - single: exit() + single: Py_FinalizeEx (C function) + single: exit (C function) Exit the current process. This calls :c:func:`Py_FinalizeEx` and then calls the standard C library function ``exit(status)``. If :c:func:`Py_FinalizeEx` @@ -405,7 +416,7 @@ Process Control .. c:function:: int Py_AtExit(void (*func) ()) .. index:: - single: Py_FinalizeEx() + single: Py_FinalizeEx (C function) single: cleanup functions Register a cleanup function to be called by :c:func:`Py_FinalizeEx`. The cleanup diff --git a/Doc/c-api/veryhigh.rst b/Doc/c-api/veryhigh.rst index 324518c035096b..67167444d0a685 100644 --- a/Doc/c-api/veryhigh.rst +++ b/Doc/c-api/veryhigh.rst @@ -322,7 +322,7 @@ the same library that the Python runtime is using. .. c:var:: int Py_eval_input - .. index:: single: Py_CompileString() + .. 
index:: single: Py_CompileString (C function) The start symbol from the Python grammar for isolated expressions; for use with :c:func:`Py_CompileString`. @@ -330,7 +330,7 @@ the same library that the Python runtime is using. .. c:var:: int Py_file_input - .. index:: single: Py_CompileString() + .. index:: single: Py_CompileString (C function) The start symbol from the Python grammar for sequences of statements as read from a file or other source; for use with :c:func:`Py_CompileString`. This is @@ -339,7 +339,7 @@ the same library that the Python runtime is using. .. c:var:: int Py_single_input - .. index:: single: Py_CompileString() + .. index:: single: Py_CompileString (C function) The start symbol from the Python grammar for a single statement; for use with :c:func:`Py_CompileString`. This is the symbol used for the interactive diff --git a/Doc/extending/extending.rst b/Doc/extending/extending.rst index 745fc10a22d161..b70e1b1fe57e67 100644 --- a/Doc/extending/extending.rst +++ b/Doc/extending/extending.rst @@ -547,7 +547,7 @@ reference count of an object and are safe in the presence of ``NULL`` pointers (but note that *temp* will not be ``NULL`` in this context). More info on them in section :ref:`refcounts`. -.. index:: single: PyObject_CallObject() +.. index:: single: PyObject_CallObject (C function) Later, when it is time to call the function, you call the C function :c:func:`PyObject_CallObject`. This function has two arguments, both pointers to @@ -638,7 +638,7 @@ the above example, we use :c:func:`Py_BuildValue` to construct the dictionary. : Extracting Parameters in Extension Functions ============================================ -.. index:: single: PyArg_ParseTuple() +.. index:: single: PyArg_ParseTuple (C function) The :c:func:`PyArg_ParseTuple` function is declared as follows:: @@ -730,7 +730,7 @@ Some example calls:: Keyword Parameters for Extension Functions ========================================== -.. index:: single: PyArg_ParseTupleAndKeywords() +.. index:: single: PyArg_ParseTupleAndKeywords (C function) The :c:func:`PyArg_ParseTupleAndKeywords` function is declared as follows:: diff --git a/Doc/extending/newtypes.rst b/Doc/extending/newtypes.rst index 7a92b3257c6cd3..473a418809cff1 100644 --- a/Doc/extending/newtypes.rst +++ b/Doc/extending/newtypes.rst @@ -89,8 +89,8 @@ If your type supports garbage collection, the destructor should call } .. index:: - single: PyErr_Fetch() - single: PyErr_Restore() + single: PyErr_Fetch (C function) + single: PyErr_Restore (C function) One important requirement of the deallocator function is that it leaves any pending exceptions alone. This is important since deallocators are frequently diff --git a/Doc/howto/index.rst b/Doc/howto/index.rst index a835bb5f13bd1c..bb507953582639 100644 --- a/Doc/howto/index.rst +++ b/Doc/howto/index.rst @@ -13,7 +13,6 @@ Currently, the HOWTOs are: .. 
toctree:: :maxdepth: 1 - pyporting.rst cporting.rst curses.rst descriptor.rst diff --git a/Doc/howto/logging-cookbook.rst b/Doc/howto/logging-cookbook.rst index 80147e31fcbae1..f7d885ec88483d 100644 --- a/Doc/howto/logging-cookbook.rst +++ b/Doc/howto/logging-cookbook.rst @@ -1744,13 +1744,11 @@ to the above, as in the following example:: return self.fmt.format(*self.args) class StyleAdapter(logging.LoggerAdapter): - def __init__(self, logger, extra=None): - super().__init__(logger, extra or {}) - - def log(self, level, msg, /, *args, **kwargs): + def log(self, level, msg, /, *args, stacklevel=1, **kwargs): if self.isEnabledFor(level): msg, kwargs = self.process(msg, kwargs) - self.logger._log(level, Message(msg, args), (), **kwargs) + self.logger.log(level, Message(msg, args), **kwargs, + stacklevel=stacklevel+1) logger = StyleAdapter(logging.getLogger(__name__)) @@ -1762,7 +1760,7 @@ to the above, as in the following example:: main() The above script should log the message ``Hello, world!`` when run with -Python 3.2 or later. +Python 3.8 or later. .. currentmodule:: logging diff --git a/Doc/howto/pyporting.rst b/Doc/howto/pyporting.rst index 501b16d82d4d6f..d560364107bd12 100644 --- a/Doc/howto/pyporting.rst +++ b/Doc/howto/pyporting.rst @@ -1,3 +1,5 @@ +:orphan: + .. _pyporting-howto: ************************************* @@ -6,423 +8,30 @@ How to port Python 2 Code to Python 3 :author: Brett Cannon -.. topic:: Abstract - - Python 2 reached its official end-of-life at the start of 2020. This means - that no new bug reports, fixes, or changes will be made to Python 2 - it's - no longer supported. - - This guide is intended to provide you with a path to Python 3 for your - code, that includes compatibility with Python 2 as a first step. - - If you are looking to port an extension module instead of pure Python code, - please see :ref:`cporting-howto`. - - The archived python-porting_ mailing list may contain some useful guidance. - - -The Short Explanation -===================== - -To achieve Python 2/3 compatibility in a single code base, the basic steps -are: - -#. Only worry about supporting Python 2.7 -#. Make sure you have good test coverage (coverage.py_ can help; - ``python -m pip install coverage``) -#. Learn the differences between Python 2 and 3 -#. Use Futurize_ (or Modernize_) to update your code (e.g. ``python -m pip install future``) -#. Use Pylint_ to help make sure you don't regress on your Python 3 support - (``python -m pip install pylint``) -#. Use caniusepython3_ to find out which of your dependencies are blocking your - use of Python 3 (``python -m pip install caniusepython3``) -#. Once your dependencies are no longer blocking you, use continuous integration - to make sure you stay compatible with Python 2 and 3 (tox_ can help test - against multiple versions of Python; ``python -m pip install tox``) -#. Consider using optional :term:`static type checking ` - to make sure your type usage - works in both Python 2 and 3 (e.g. use mypy_ to check your typing under both - Python 2 and Python 3; ``python -m pip install mypy``). - -.. note:: - - Note: Using ``python -m pip install`` guarantees that the ``pip`` you invoke - is the one installed for the Python currently in use, whether it be - a system-wide ``pip`` or one installed within a - :ref:`virtual environment `. 
- -Details -======= - -Even if other factors - say, dependencies over which you have no control - -still require you to support Python 2, that does not prevent you taking the -step of including Python 3 support. - -Most changes required to support Python 3 lead to cleaner code using newer -practices even in Python 2 code. - - -Different versions of Python 2 ------------------------------- - -Ideally, your code should be compatible with Python 2.7, which was the -last supported version of Python 2. - -Some of the tools mentioned in this guide will not work with Python 2.6. - -If absolutely necessary, the six_ project can help you support Python 2.5 and -3 simultaneously. Do realize, though, that nearly all the projects listed in -this guide will not be available to you. - -If you are able to skip Python 2.5 and older, the required changes to your -code will be minimal. At worst you will have to use a function instead of a -method in some instances or have to import a function instead of using a -built-in one. - - -Make sure you specify the proper version support in your ``setup.py`` file --------------------------------------------------------------------------- - -In your ``setup.py`` file you should have the proper `trove classifier`_ -specifying what versions of Python you support. As your project does not support -Python 3 yet you should at least have -``Programming Language :: Python :: 2 :: Only`` specified. Ideally you should -also specify each major/minor version of Python that you do support, e.g. -``Programming Language :: Python :: 2.7``. - - -Have good test coverage ------------------------ - -Once you have your code supporting the oldest version of Python 2 you want it -to, you will want to make sure your test suite has good coverage. A good rule of -thumb is that if you want to be confident enough in your test suite that any -failures that appear after having tools rewrite your code are actual bugs in the -tools and not in your code. If you want a number to aim for, try to get over 80% -coverage (and don't feel bad if you find it hard to get better than 90% -coverage). If you don't already have a tool to measure test coverage then -coverage.py_ is recommended. - - -Be aware of the differences between Python 2 and 3 --------------------------------------------------- - -Once you have your code well-tested you are ready to begin porting your code to -Python 3! But to fully understand how your code is going to change and what -you want to look out for while you code, you will want to learn what changes -Python 3 makes in terms of Python 2. - -Some resources for understanding the differences and their implications for you -code: - -* the :ref:`"What's New" ` doc for each release of Python 3 -* the `Porting to Python 3`_ book (which is free online) -* the handy `cheat sheet`_ from the Python-Future project. - - -Update your code ----------------- - -There are tools available that can port your code automatically. - -Futurize_ does its best to make Python 3 idioms and practices exist in Python -2, e.g. backporting the ``bytes`` type from Python 3 so that you have -semantic parity between the major versions of Python. This is the better -approach for most cases. - -Modernize_, on the other hand, is more conservative and targets a Python 2/3 -subset of Python, directly relying on six_ to help provide compatibility. - -A good approach is to run the tool over your test suite first and visually -inspect the diff to make sure the transformation is accurate. 
After you have -transformed your test suite and verified that all the tests still pass as -expected, then you can transform your application code knowing that any tests -which fail is a translation failure. - -Unfortunately the tools can't automate everything to make your code work under -Python 3, and you will also need to read the tools' documentation in case some -options you need are turned off by default. - -Key issues to be aware of and check for: - -Division -++++++++ - -In Python 3, ``5 / 2 == 2.5`` and not ``2`` as it was in Python 2; all -division between ``int`` values result in a ``float``. This change has -actually been planned since Python 2.2 which was released in 2002. Since then -users have been encouraged to add ``from __future__ import division`` to any -and all files which use the ``/`` and ``//`` operators or to be running the -interpreter with the ``-Q`` flag. If you have not been doing this then you -will need to go through your code and do two things: - -#. Add ``from __future__ import division`` to your files -#. Update any division operator as necessary to either use ``//`` to use floor - division or continue using ``/`` and expect a float - -The reason that ``/`` isn't simply translated to ``//`` automatically is that if -an object defines a ``__truediv__`` method but not ``__floordiv__`` then your -code would begin to fail (e.g. a user-defined class that uses ``/`` to -signify some operation but not ``//`` for the same thing or at all). +Python 2 reached its official end-of-life at the start of 2020. This means +that no new bug reports, fixes, or changes will be made to Python 2 - it's +no longer supported: see :pep:`373` and +`status of Python versions `_. +If you are looking to port an extension module instead of pure Python code, +please see :ref:`cporting-howto`. -Text versus binary data -+++++++++++++++++++++++ +The archived python-porting_ mailing list may contain some useful guidance. -In Python 2 you could use the ``str`` type for both text and binary data. -Unfortunately this confluence of two different concepts could lead to brittle -code which sometimes worked for either kind of data, sometimes not. It also -could lead to confusing APIs if people didn't explicitly state that something -that accepted ``str`` accepted either text or binary data instead of one -specific type. This complicated the situation especially for anyone supporting -multiple languages as APIs wouldn't bother explicitly supporting ``unicode`` -when they claimed text data support. +Since Python 3.13 the original porting guide was discontinued. +You can find the old guide in the +`archive `_. -Python 3 made text and binary data distinct types that cannot simply be mixed -together. For any code that deals only with text or only binary data, this -separation doesn't pose an issue. But for code that has to deal with both, it -does mean you might have to now care about when you are using text compared -to binary data, which is why this cannot be entirely automated. -Decide which APIs take text and which take binary (it is **highly** recommended -you don't design APIs that can take both due to the difficulty of keeping the -code working; as stated earlier it is difficult to do well). In Python 2 this -means making sure the APIs that take text can work with ``unicode`` and those -that work with binary data work with the ``bytes`` type from Python 3 -(which is a subset of ``str`` in Python 2 and acts as an alias for ``bytes`` -type in Python 2). 
Usually the biggest issue is realizing which methods exist -on which types in Python 2 and 3 simultaneously (for text that's ``unicode`` -in Python 2 and ``str`` in Python 3, for binary that's ``str``/``bytes`` in -Python 2 and ``bytes`` in Python 3). +Third-party guides +================== -The following table lists the **unique** methods of each data type across -Python 2 and 3 (e.g., the ``decode()`` method is usable on the equivalent binary -data type in either Python 2 or 3, but it can't be used by the textual data -type consistently between Python 2 and 3 because ``str`` in Python 3 doesn't -have the method). Do note that as of Python 3.5 the ``__mod__`` method was -added to the bytes type. +There are also multiple third-party guides that might be useful: -======================== ===================== -**Text data** **Binary data** ------------------------- --------------------- -\ decode ------------------------- --------------------- -encode ------------------------- --------------------- -format ------------------------- --------------------- -isdecimal ------------------------- --------------------- -isnumeric -======================== ===================== +- `Guide by Fedora `_ +- `PyCon 2020 tutorial `_ +- `Guide by DigitalOcean `_ +- `Guide by ActiveState `_ -Making the distinction easier to handle can be accomplished by encoding and -decoding between binary data and text at the edge of your code. This means that -when you receive text in binary data, you should immediately decode it. And if -your code needs to send text as binary data then encode it as late as possible. -This allows your code to work with only text internally and thus eliminates -having to keep track of what type of data you are working with. -The next issue is making sure you know whether the string literals in your code -represent text or binary data. You should add a ``b`` prefix to any -literal that presents binary data. For text you should add a ``u`` prefix to -the text literal. (There is a :mod:`__future__` import to force all unspecified -literals to be Unicode, but usage has shown it isn't as effective as adding a -``b`` or ``u`` prefix to all literals explicitly) - -You also need to be careful about opening files. Possibly you have not always -bothered to add the ``b`` mode when opening a binary file (e.g., ``rb`` for -binary reading). Under Python 3, binary files and text files are clearly -distinct and mutually incompatible; see the :mod:`io` module for details. -Therefore, you **must** make a decision of whether a file will be used for -binary access (allowing binary data to be read and/or written) or textual access -(allowing text data to be read and/or written). You should also use :func:`io.open` -for opening files instead of the built-in :func:`open` function as the :mod:`io` -module is consistent from Python 2 to 3 while the built-in :func:`open` function -is not (in Python 3 it's actually :func:`io.open`). Do not bother with the -outdated practice of using :func:`codecs.open` as that's only necessary for -keeping compatibility with Python 2.5. - -The constructors of both ``str`` and ``bytes`` have different semantics for the -same arguments between Python 2 and 3. Passing an integer to ``bytes`` in Python 2 -will give you the string representation of the integer: ``bytes(3) == '3'``. -But in Python 3, an integer argument to ``bytes`` will give you a bytes object -as long as the integer specified, filled with null bytes: -``bytes(3) == b'\x00\x00\x00'``. 
A similar worry is necessary when passing a -bytes object to ``str``. In Python 2 you just get the bytes object back: -``str(b'3') == b'3'``. But in Python 3 you get the string representation of the -bytes object: ``str(b'3') == "b'3'"``. - -Finally, the indexing of binary data requires careful handling (slicing does -**not** require any special handling). In Python 2, -``b'123'[1] == b'2'`` while in Python 3 ``b'123'[1] == 50``. Because binary data -is simply a collection of binary numbers, Python 3 returns the integer value for -the byte you index on. But in Python 2 because ``bytes == str``, indexing -returns a one-item slice of bytes. The six_ project has a function -named ``six.indexbytes()`` which will return an integer like in Python 3: -``six.indexbytes(b'123', 1)``. - -To summarize: - -#. Decide which of your APIs take text and which take binary data -#. Make sure that your code that works with text also works with ``unicode`` and - code for binary data works with ``bytes`` in Python 2 (see the table above - for what methods you cannot use for each type) -#. Mark all binary literals with a ``b`` prefix, textual literals with a ``u`` - prefix -#. Decode binary data to text as soon as possible, encode text as binary data as - late as possible -#. Open files using :func:`io.open` and make sure to specify the ``b`` mode when - appropriate -#. Be careful when indexing into binary data - - -Use feature detection instead of version detection -++++++++++++++++++++++++++++++++++++++++++++++++++ - -Inevitably you will have code that has to choose what to do based on what -version of Python is running. The best way to do this is with feature detection -of whether the version of Python you're running under supports what you need. -If for some reason that doesn't work then you should make the version check be -against Python 2 and not Python 3. To help explain this, let's look at an -example. - -Let's pretend that you need access to a feature of :mod:`importlib` that -is available in Python's standard library since Python 3.3 and available for -Python 2 through importlib2_ on PyPI. You might be tempted to write code to -access e.g. the :mod:`importlib.abc` module by doing the following:: - - import sys - - if sys.version_info[0] == 3: - from importlib import abc - else: - from importlib2 import abc - -The problem with this code is what happens when Python 4 comes out? It would -be better to treat Python 2 as the exceptional case instead of Python 3 and -assume that future Python versions will be more compatible with Python 3 than -Python 2:: - - import sys - - if sys.version_info[0] > 2: - from importlib import abc - else: - from importlib2 import abc - -The best solution, though, is to do no version detection at all and instead rely -on feature detection. That avoids any potential issues of getting the version -detection wrong and helps keep you future-compatible:: - - try: - from importlib import abc - except ImportError: - from importlib2 import abc - - -Prevent compatibility regressions ---------------------------------- - -Once you have fully translated your code to be compatible with Python 3, you -will want to make sure your code doesn't regress and stop working under -Python 3. This is especially true if you have a dependency which is blocking you -from actually running under Python 3 at the moment. 
- -To help with staying compatible, any new modules you create should have -at least the following block of code at the top of it:: - - from __future__ import absolute_import - from __future__ import division - from __future__ import print_function - -You can also run Python 2 with the ``-3`` flag to be warned about various -compatibility issues your code triggers during execution. If you turn warnings -into errors with ``-Werror`` then you can make sure that you don't accidentally -miss a warning. - -You can also use the Pylint_ project and its ``--py3k`` flag to lint your code -to receive warnings when your code begins to deviate from Python 3 -compatibility. This also prevents you from having to run Modernize_ or Futurize_ -over your code regularly to catch compatibility regressions. This does require -you only support Python 2.7 and Python 3.4 or newer as that is Pylint's -minimum Python version support. - - -Check which dependencies block your transition ----------------------------------------------- - -**After** you have made your code compatible with Python 3 you should begin to -care about whether your dependencies have also been ported. The caniusepython3_ -project was created to help you determine which projects --- directly or indirectly -- are blocking you from supporting Python 3. There -is both a command-line tool as well as a web interface at -https://caniusepython3.com. - -The project also provides code which you can integrate into your test suite so -that you will have a failing test when you no longer have dependencies blocking -you from using Python 3. This allows you to avoid having to manually check your -dependencies and to be notified quickly when you can start running on Python 3. - - -Update your ``setup.py`` file to denote Python 3 compatibility --------------------------------------------------------------- - -Once your code works under Python 3, you should update the classifiers in -your ``setup.py`` to contain ``Programming Language :: Python :: 3`` and to not -specify sole Python 2 support. This will tell anyone using your code that you -support Python 2 **and** 3. Ideally you will also want to add classifiers for -each major/minor version of Python you now support. - - -Use continuous integration to stay compatible ---------------------------------------------- - -Once you are able to fully run under Python 3 you will want to make sure your -code always works under both Python 2 and 3. Probably the best tool for running -your tests under multiple Python interpreters is tox_. You can then integrate -tox with your continuous integration system so that you never accidentally break -Python 2 or 3 support. - -You may also want to use the ``-bb`` flag with the Python 3 interpreter to -trigger an exception when you are comparing bytes to strings or bytes to an int -(the latter is available starting in Python 3.5). By default type-differing -comparisons simply return ``False``, but if you made a mistake in your -separation of text/binary data handling or indexing on bytes you wouldn't easily -find the mistake. This flag will raise an exception when these kinds of -comparisons occur, making the mistake much easier to track down. - - -Consider using optional static type checking --------------------------------------------- - -Another way to help port your code is to use a :term:`static type checker` like -mypy_ or pytype_ on your code. 
These tools can be used to analyze your code as -if it's being run under Python 2, then you can run the tool a second time as if -your code is running under Python 3. By running a static type checker twice like -this you can discover if you're e.g. misusing binary data type in one version -of Python compared to another. If you add optional type hints to your code you -can also explicitly state whether your APIs use textual or binary data, helping -to make sure everything functions as expected in both versions of Python. - - -.. _caniusepython3: https://pypi.org/project/caniusepython3 -.. _cheat sheet: https://python-future.org/compatible_idioms.html -.. _coverage.py: https://pypi.org/project/coverage -.. _Futurize: https://python-future.org/automatic_conversion.html -.. _importlib2: https://pypi.org/project/importlib2 -.. _Modernize: https://python-modernize.readthedocs.io/ -.. _mypy: https://mypy-lang.org/ -.. _Porting to Python 3: http://python3porting.com/ -.. _Pylint: https://pypi.org/project/pylint - -.. _Python 3 Q & A: https://ncoghlan-devs-python-notes.readthedocs.io/en/latest/python3/questions_and_answers.html - -.. _pytype: https://github.com/google/pytype -.. _python-future: https://python-future.org/ .. _python-porting: https://mail.python.org/pipermail/python-porting/ -.. _six: https://pypi.org/project/six -.. _tox: https://pypi.org/project/tox -.. _trove classifier: https://pypi.org/classifiers - -.. _Why Python 3 exists: https://snarky.ca/why-python-3-exists diff --git a/Doc/library/array.rst b/Doc/library/array.rst index a0e8bb20a098fd..043badf05ffc12 100644 --- a/Doc/library/array.rst +++ b/Doc/library/array.rst @@ -215,6 +215,13 @@ The module defines the following type: Remove the first occurrence of *x* from the array. + .. method:: clear() + + Remove all elements from the array. + + .. versionadded:: 3.13 + + .. method:: reverse() Reverse the order of the items in the array. diff --git a/Doc/library/asyncio-protocol.rst b/Doc/library/asyncio-protocol.rst index 3f734f544afe21..ecd8cdc709af7d 100644 --- a/Doc/library/asyncio-protocol.rst +++ b/Doc/library/asyncio-protocol.rst @@ -417,8 +417,8 @@ Subprocess Transports Stop the subprocess. - On POSIX systems, this method sends SIGTERM to the subprocess. - On Windows, the Windows API function TerminateProcess() is called to + On POSIX systems, this method sends :py:const:`~signal.SIGTERM` to the subprocess. + On Windows, the Windows API function :c:func:`!TerminateProcess` is called to stop the subprocess. See also :meth:`subprocess.Popen.terminate`. diff --git a/Doc/library/asyncio-subprocess.rst b/Doc/library/asyncio-subprocess.rst index bf35b1cb798aee..817a6ff3052f4a 100644 --- a/Doc/library/asyncio-subprocess.rst +++ b/Doc/library/asyncio-subprocess.rst @@ -240,7 +240,7 @@ their completion. .. note:: - On Windows, :py:data:`SIGTERM` is an alias for :meth:`terminate`. + On Windows, :py:const:`~signal.SIGTERM` is an alias for :meth:`terminate`. ``CTRL_C_EVENT`` and ``CTRL_BREAK_EVENT`` can be sent to processes started with a *creationflags* parameter which includes ``CREATE_NEW_PROCESS_GROUP``. @@ -249,10 +249,10 @@ their completion. Stop the child process. - On POSIX systems this method sends :py:const:`signal.SIGTERM` to the + On POSIX systems this method sends :py:const:`~signal.SIGTERM` to the child process. - On Windows the Win32 API function :c:func:`TerminateProcess` is + On Windows the Win32 API function :c:func:`!TerminateProcess` is called to stop the child process. .. 
method:: kill() diff --git a/Doc/library/bdb.rst b/Doc/library/bdb.rst index 52f0ca7c013482..7bf4308a96d0f5 100644 --- a/Doc/library/bdb.rst +++ b/Doc/library/bdb.rst @@ -148,8 +148,8 @@ The :mod:`bdb` module also defines two classes: .. method:: reset() - Set the :attr:`botframe`, :attr:`stopframe`, :attr:`returnframe` and - :attr:`quitting` attributes with values ready to start debugging. + Set the :attr:`!botframe`, :attr:`!stopframe`, :attr:`!returnframe` and + :attr:`quitting <Bdb.quitting>` attributes with values ready to start debugging. .. method:: trace_dispatch(frame, event, arg) @@ -182,7 +182,7 @@ The :mod:`bdb` module also defines two classes: If the debugger should stop on the current line, invoke the :meth:`user_line` method (which should be overridden in subclasses). - Raise a :exc:`BdbQuit` exception if the :attr:`Bdb.quitting` flag is set + Raise a :exc:`BdbQuit` exception if the :attr:`quitting <Bdb.quitting>` flag is set (which can be set from :meth:`user_line`). Return a reference to the :meth:`trace_dispatch` method for further tracing in that scope. @@ -190,7 +190,7 @@ The :mod:`bdb` module also defines two classes: If the debugger should stop on this function call, invoke the :meth:`user_call` method (which should be overridden in subclasses). - Raise a :exc:`BdbQuit` exception if the :attr:`Bdb.quitting` flag is set + Raise a :exc:`BdbQuit` exception if the :attr:`quitting <Bdb.quitting>` flag is set (which can be set from :meth:`user_call`). Return a reference to the :meth:`trace_dispatch` method for further tracing in that scope. @@ -198,7 +198,7 @@ The :mod:`bdb` module also defines two classes: If the debugger should stop on this function return, invoke the :meth:`user_return` method (which should be overridden in subclasses). - Raise a :exc:`BdbQuit` exception if the :attr:`Bdb.quitting` flag is set + Raise a :exc:`BdbQuit` exception if the :attr:`quitting <Bdb.quitting>` flag is set (which can be set from :meth:`user_return`). Return a reference to the :meth:`trace_dispatch` method for further tracing in that scope. @@ -206,7 +206,7 @@ The :mod:`bdb` module also defines two classes: If the debugger should stop at this exception, invokes the :meth:`user_exception` method (which should be overridden in subclasses). - Raise a :exc:`BdbQuit` exception if the :attr:`Bdb.quitting` flag is set + Raise a :exc:`BdbQuit` exception if the :attr:`quitting <Bdb.quitting>` flag is set (which can be set from :meth:`user_exception`). Return a reference to the :meth:`trace_dispatch` method for further tracing in that scope. @@ -293,7 +293,9 @@ The :mod:`bdb` module also defines two classes: .. method:: set_quit() - Set the :attr:`quitting` attribute to ``True``. This raises :exc:`BdbQuit` in + .. index:: single: quitting (bdb.Bdb attribute) + + Set the :attr:`!quitting` attribute to ``True``. This raises :exc:`BdbQuit` in the next call to one of the :meth:`!dispatch_\*` methods. @@ -383,7 +385,7 @@ The :mod:`bdb` module also defines two classes: .. method:: run(cmd, globals=None, locals=None) Debug a statement executed via the :func:`exec` function. *globals* - defaults to :attr:`__main__.__dict__`, *locals* defaults to *globals*. + defaults to :attr:`!__main__.__dict__`, *locals* defaults to *globals*. .. method:: runeval(expr, globals=None, locals=None) diff --git a/Doc/library/datetime.rst b/Doc/library/datetime.rst index 930af6cbbe9e8d..a46eed35ee2329 100644 --- a/Doc/library/datetime.rst +++ b/Doc/library/datetime.rst @@ -619,11 +619,27 @@ Notes: (4) :class:`date` objects are equal if they represent the same date. 
+ :class:`!date` objects that are not also :class:`.datetime` instances + are never equal to :class:`!datetime` objects, even if they represent + the same date. + (5) *date1* is considered less than *date2* when *date1* precedes *date2* in time. In other words, ``date1 < date2`` if and only if ``date1.toordinal() < date2.toordinal()``. + Order comparison between a :class:`!date` object that is not also a + :class:`.datetime` instance and a :class:`!datetime` object raises + :exc:`TypeError`. + +.. versionchanged:: 3.13 + Comparison between a :class:`.datetime` object and an instance of + the :class:`date` subclass that is not a :class:`!datetime` subclass + no longer converts the latter to :class:`!date`, ignoring the time part + and the time zone. + The default behavior can be changed by overriding the special comparison + methods in subclasses. + In Boolean contexts, all :class:`date` objects are considered to be true. Instance methods: @@ -1192,9 +1208,6 @@ Supported operations: and time, taking into account the time zone. Naive and aware :class:`!datetime` objects are never equal. - :class:`!datetime` objects are never equal to :class:`date` objects - that are not also :class:`!datetime` instances, even if they represent - the same date. If both comparands are aware and have different :attr:`~.datetime.tzinfo` attributes, the comparison acts as comparands were first converted to UTC @@ -1206,9 +1219,8 @@ Supported operations: *datetime1* is considered less than *datetime2* when *datetime1* precedes *datetime2* in time, taking into account the time zone. - Order comparison between naive and aware :class:`.datetime` objects, - as well as a :class:`!datetime` object and a :class:`!date` object - that is not also a :class:`!datetime` instance, raises :exc:`TypeError`. + Order comparison between naive and aware :class:`.datetime` objects + raises :exc:`TypeError`. If both comparands are aware and have different :attr:`~.datetime.tzinfo` attributes, the comparison acts as comparands were first converted to UTC @@ -1218,6 +1230,14 @@ Supported operations: Equality comparisons between aware and naive :class:`.datetime` instances don't raise :exc:`TypeError`. +.. versionchanged:: 3.13 + Comparison between a :class:`.datetime` object and an instance of + the :class:`date` subclass that is not a :class:`!datetime` subclass + no longer converts the latter to :class:`!date`, ignoring the time part + and the time zone. + The default behavior can be changed by overriding the special comparison + methods in subclasses. + Instance methods: .. method:: datetime.date() diff --git a/Doc/library/enum.rst b/Doc/library/enum.rst index 534939943d3326..30d80ce8d488cc 100644 --- a/Doc/library/enum.rst +++ b/Doc/library/enum.rst @@ -286,6 +286,19 @@ Data Types appropriate value will be chosen for you. See :class:`auto` for the details. + .. attribute:: Enum._name_ + + Name of the member. + + .. attribute:: Enum._value_ + + Value of the member; can be set in :meth:`~object.__new__`. + + .. attribute:: Enum._order_ + + No longer used, kept for backward compatibility. + (class attribute, removed during class creation). + ..
attribute:: Enum._ignore_ ``_ignore_`` is only used during creation and is removed from the @@ -823,8 +836,8 @@ Supported ``_sunder_`` names - :attr:`~Enum._ignore_` -- a list of names, either as a :class:`list` or a :class:`str`, that will not be transformed into members, and will be removed from the final class -- :attr:`~Enum._order_` -- used in Python 2/3 code to ensure member order is - consistent (class attribute, removed during class creation) +- :attr:`~Enum._order_` -- no longer used, kept for backward + compatibility (class attribute, removed during class creation) - :meth:`~Enum._generate_next_value_` -- used to get an appropriate value for an enum member; may be overridden diff --git a/Doc/library/msvcrt.rst b/Doc/library/msvcrt.rst index 2a6d980ab78a60..ac3458c86fd4c4 100644 --- a/Doc/library/msvcrt.rst +++ b/Doc/library/msvcrt.rst @@ -252,3 +252,18 @@ Other Functions .. data:: CRTDBG_REPORT_MODE Returns current *mode* for the specified *type*. + + +.. data:: CRT_ASSEMBLY_VERSION + + The CRT Assembly version, from the :file:`crtassem.h` header file. + + +.. data:: VC_ASSEMBLY_PUBLICKEYTOKEN + + The VC Assembly public key token, from the :file:`crtassem.h` header file. + + +.. data:: LIBRARIES_ASSEMBLY_NAME_PREFIX + + The Libraries Assembly name prefix, from the :file:`crtassem.h` header file. diff --git a/Doc/library/multiprocessing.rst b/Doc/library/multiprocessing.rst index b104a6483b70e6..d570d4eb0dae78 100644 --- a/Doc/library/multiprocessing.rst +++ b/Doc/library/multiprocessing.rst @@ -649,8 +649,8 @@ The :mod:`multiprocessing` package mostly replicates the API of the .. method:: terminate() - Terminate the process. On POSIX this is done using the ``SIGTERM`` signal; - on Windows :c:func:`TerminateProcess` is used. Note that exit handlers and + Terminate the process. On POSIX this is done using the :py:const:`~signal.SIGTERM` signal; + on Windows :c:func:`!TerminateProcess` is used. Note that exit handlers and finally clauses, etc., will not be executed. Note that descendant processes of the process will *not* be terminated -- diff --git a/Doc/library/queue.rst b/Doc/library/queue.rst index b2b787c5a8260c..1421fc2e552f0e 100644 --- a/Doc/library/queue.rst +++ b/Doc/library/queue.rst @@ -93,6 +93,14 @@ The :mod:`queue` module defines the following classes and exceptions: on a :class:`Queue` object which is full. +.. exception:: ShutDown + + Exception raised when :meth:`~Queue.put` or :meth:`~Queue.get` is called on + a :class:`Queue` object which has been shut down. + + .. versionadded:: 3.13 + + .. _queueobjects: Queue Objects @@ -135,6 +143,8 @@ provide the public methods described below. immediately available, else raise the :exc:`Full` exception (*timeout* is ignored in that case). + Raises :exc:`ShutDown` if the queue has been shut down. + .. method:: Queue.put_nowait(item) @@ -155,6 +165,9 @@ provide the public methods described below. an uninterruptible wait on an underlying lock. This means that no exceptions can occur, and in particular a SIGINT will not trigger a :exc:`KeyboardInterrupt`. + Raises :exc:`ShutDown` if the queue has been shut down and is empty, or if + the queue has been shut down immediately. + .. method:: Queue.get_nowait() @@ -177,6 +190,8 @@ fully processed by daemon consumer threads. Raises a :exc:`ValueError` if called more times than there were items placed in the queue. + Raises :exc:`ShutDown` if the queue has been shut down immediately. + .. method:: Queue.join() @@ -187,6 +202,8 @@ fully processed by daemon consumer threads. 
indicate that the item was retrieved and all work on it is complete. When the count of unfinished tasks drops to zero, :meth:`join` unblocks. + Raises :exc:`ShutDown` if the queue has been shut down immediately. + Example of how to wait for enqueued tasks to be completed:: @@ -214,6 +231,27 @@ Example of how to wait for enqueued tasks to be completed:: print('All work completed') +Terminating queues +^^^^^^^^^^^^^^^^^^ + +:class:`Queue` objects can be made to prevent further interaction by shutting +them down. + +.. method:: Queue.shutdown(immediate=False) + + Shut down the queue, making :meth:`~Queue.get` and :meth:`~Queue.put` raise + :exc:`ShutDown`. + + By default, :meth:`~Queue.get` on a shut down queue will only raise once the + queue is empty. Set *immediate* to true to make :meth:`~Queue.get` raise + immediately instead. + + All blocked callers of :meth:`~Queue.put` will be unblocked. If *immediate* + is true, also unblock callers of :meth:`~Queue.get` and :meth:`~Queue.join`. + + .. versionadded:: 3.13 + + SimpleQueue Objects ------------------- diff --git a/Doc/library/re.rst b/Doc/library/re.rst index 0a8c88b50cdeec..a5bd5c73f2fac7 100644 --- a/Doc/library/re.rst +++ b/Doc/library/re.rst @@ -1597,7 +1597,7 @@ To find out what card the pair consists of, one could use the Simulating scanf() ^^^^^^^^^^^^^^^^^^ -.. index:: single: scanf() +.. index:: single: scanf (C function) Python does not currently have an equivalent to :c:func:`!scanf`. Regular expressions are generally more powerful, though also more verbose, than diff --git a/Doc/library/subprocess.rst b/Doc/library/subprocess.rst index f63ca73b3ec067..1dcfea58a8e89f 100644 --- a/Doc/library/subprocess.rst +++ b/Doc/library/subprocess.rst @@ -857,8 +857,8 @@ Instances of the :class:`Popen` class have the following methods: .. method:: Popen.terminate() - Stop the child. On POSIX OSs the method sends SIGTERM to the - child. On Windows the Win32 API function :c:func:`TerminateProcess` is called + Stop the child. On POSIX OSs the method sends :py:const:`~signal.SIGTERM` to the + child. On Windows the Win32 API function :c:func:`!TerminateProcess` is called to stop the child. diff --git a/Doc/library/sys.rst b/Doc/library/sys.rst index a97a369b77b88a..ad8857fc2807f7 100644 --- a/Doc/library/sys.rst +++ b/Doc/library/sys.rst @@ -195,6 +195,17 @@ always available. This function should be used for internal and specialized purposes only. + .. deprecated:: 3.13 + Use the more general :func:`_clear_internal_caches` function instead. + + +.. function:: _clear_internal_caches() + + Clear all internal performance-related caches. Use this function *only* to + release unnecessary references and memory blocks when hunting for leaks. + + .. versionadded:: 3.13 + .. function:: _current_frames() @@ -724,7 +735,7 @@ always available. regardless of their size. This function is mainly useful for tracking and debugging memory leaks. Because of the interpreter's internal caches, the result can vary from call to call; you may have to call - :func:`_clear_type_cache()` and :func:`gc.collect()` to get more + :func:`_clear_internal_caches()` and :func:`gc.collect()` to get more predictable results. If a Python build or implementation cannot reasonably compute this diff --git a/Doc/reference/datamodel.rst b/Doc/reference/datamodel.rst index 0a1c1d58558e94..885ee825c12296 100644 --- a/Doc/reference/datamodel.rst +++ b/Doc/reference/datamodel.rst @@ -1988,8 +1988,8 @@ access (use of, assignment to, or deletion of ``x.name``) for class instances. .. 
method:: object.__dir__(self) - Called when :func:`dir` is called on the object. A sequence must be - returned. :func:`dir` converts the returned sequence to a list and sorts it. + Called when :func:`dir` is called on the object. An iterable must be + returned. :func:`dir` converts the returned iterable to a list and sorts it. Customizing module attribute access @@ -2009,7 +2009,7 @@ not found on a module object through the normal lookup, i.e. the module ``__dict__`` before raising an :exc:`AttributeError`. If found, it is called with the attribute name and the result is returned. -The ``__dir__`` function should accept no arguments, and return a sequence of +The ``__dir__`` function should accept no arguments, and return an iterable of strings that represents the names accessible on module. If present, this function overrides the standard :func:`dir` search on a module. diff --git a/Doc/tools/.nitignore b/Doc/tools/.nitignore index f96478b45e44c0..33129e898e51d6 100644 --- a/Doc/tools/.nitignore +++ b/Doc/tools/.nitignore @@ -3,15 +3,12 @@ # Keep lines sorted lexicographically to help avoid merge conflicts. Doc/c-api/descriptor.rst -Doc/c-api/exceptions.rst Doc/c-api/float.rst -Doc/c-api/gcsupport.rst Doc/c-api/init.rst Doc/c-api/init_config.rst Doc/c-api/intro.rst Doc/c-api/module.rst Doc/c-api/stable.rst -Doc/c-api/sys.rst Doc/c-api/type.rst Doc/c-api/typeobj.rst Doc/extending/extending.rst @@ -22,7 +19,6 @@ Doc/library/ast.rst Doc/library/asyncio-extending.rst Doc/library/asyncio-policy.rst Doc/library/asyncio-subprocess.rst -Doc/library/bdb.rst Doc/library/collections.rst Doc/library/dbm.rst Doc/library/decimal.rst @@ -31,7 +27,6 @@ Doc/library/email.compat32-message.rst Doc/library/email.errors.rst Doc/library/email.parser.rst Doc/library/email.policy.rst -Doc/library/enum.rst Doc/library/exceptions.rst Doc/library/faulthandler.rst Doc/library/fcntl.rst diff --git a/Doc/whatsnew/2.6.rst b/Doc/whatsnew/2.6.rst index 7d3769a22286e2..05c21d313aae03 100644 --- a/Doc/whatsnew/2.6.rst +++ b/Doc/whatsnew/2.6.rst @@ -2388,11 +2388,11 @@ changes, or look through the Subversion logs for all the details. using the format character ``'?'``. (Contributed by David Remahl.) -* The :class:`Popen` objects provided by the :mod:`subprocess` module - now have :meth:`terminate`, :meth:`kill`, and :meth:`send_signal` methods. - On Windows, :meth:`send_signal` only supports the :const:`SIGTERM` +* The :class:`~subprocess.Popen` objects provided by the :mod:`subprocess` module + now have :meth:`~subprocess.Popen.terminate`, :meth:`~subprocess.Popen.kill`, and :meth:`~subprocess.Popen.send_signal` methods. + On Windows, :meth:`!send_signal` only supports the :py:const:`~signal.SIGTERM` signal, and all these methods are aliases for the Win32 API function - :c:func:`TerminateProcess`. + :c:func:`!TerminateProcess`. (Contributed by Christian Heimes.) * A new variable in the :mod:`sys` module, :attr:`float_info`, is an diff --git a/Doc/whatsnew/2.7.rst b/Doc/whatsnew/2.7.rst index ada05aa22b46f6..2a42664c02852c 100644 --- a/Doc/whatsnew/2.7.rst +++ b/Doc/whatsnew/2.7.rst @@ -196,7 +196,7 @@ A partial list of 3.1 features that were backported to 2.7: Other new Python3-mode warnings include: -* :func:`operator.isCallable` and :func:`operator.sequenceIncludes`, +* :func:`!operator.isCallable` and :func:`!operator.sequenceIncludes`, which are not supported in 3.x, now trigger warnings. 
* The :option:`!-3` switch now automatically enables the :option:`!-Qwarn` switch that causes warnings @@ -455,11 +455,11 @@ a varying number of handlers. All this flexibility can require a lot of configuration. You can write Python statements to create objects and set their properties, but a complex set-up requires verbose but boring code. -:mod:`logging` also supports a :func:`~logging.fileConfig` +:mod:`logging` also supports a :func:`~logging.config.fileConfig` function that parses a file, but the file format doesn't support configuring filters, and it's messier to generate programmatically. -Python 2.7 adds a :func:`~logging.dictConfig` function that +Python 2.7 adds a :func:`~logging.config.dictConfig` function that uses a dictionary to configure logging. There are many ways to produce a dictionary from different sources: construct one with code; parse a file containing JSON; or use a YAML parsing library if one is @@ -533,7 +533,7 @@ implemented by Vinay Sajip, are: ``getLogger('app.network.listen')``. * The :class:`~logging.LoggerAdapter` class gained an - :meth:`~logging.LoggerAdapter.isEnabledFor` method that takes a + :meth:`~logging.Logger.isEnabledFor` method that takes a *level* and returns whether the underlying logger would process a message of that level of importance. @@ -554,8 +554,8 @@ called a :dfn:`view` instead of a fully materialized list. It's not possible to change the return values of :meth:`~dict.keys`, :meth:`~dict.values`, and :meth:`~dict.items` in Python 2.7 because too much code would break. Instead the 3.x versions were added -under the new names :meth:`~dict.viewkeys`, :meth:`~dict.viewvalues`, -and :meth:`~dict.viewitems`. +under the new names :meth:`!viewkeys`, :meth:`!viewvalues`, +and :meth:`!viewitems`. :: @@ -720,7 +720,7 @@ Some smaller changes made to the core Python language are: with B() as b: ... suite of statements ... - The :func:`contextlib.nested` function provides a very similar + The :func:`!contextlib.nested` function provides a very similar function, so it's no longer necessary and has been deprecated. (Proposed in https://codereview.appspot.com/53094; implemented by @@ -785,7 +785,7 @@ Some smaller changes made to the core Python language are: implemented by Mark Dickinson; :issue:`1811`.) * Implicit coercion for complex numbers has been removed; the interpreter - will no longer ever attempt to call a :meth:`__coerce__` method on complex + will no longer ever attempt to call a :meth:`!__coerce__` method on complex objects. (Removed by Meador Inge and Mark Dickinson; :issue:`5211`.) * The :meth:`str.format` method now supports automatic numbering of the replacement @@ -817,7 +817,7 @@ Some smaller changes made to the core Python language are: A low-level change: the :meth:`object.__format__` method now triggers a :exc:`PendingDeprecationWarning` if it's passed a format string, - because the :meth:`__format__` method for :class:`object` converts + because the :meth:`!__format__` method for :class:`object` converts the object to a string representation and formats that. Previously the method silently applied the format string to the string representation, but that could hide mistakes in Python code. If @@ -825,7 +825,7 @@ Some smaller changes made to the core Python language are: precision, presumably you're expecting the formatting to be applied in some object-specific way. (Fixed by Eric Smith; :issue:`7994`.) 
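As an illustrative aside (not part of the patch), a minimal sketch of the automatic field numbering in :meth:`str.format` described in the hunk above; the sample strings are arbitrary::

    >>> '{} becomes {}'.format('2.6', '2.7')   # fields are numbered automatically
    '2.6 becomes 2.7'
    >>> '{0}-{1}-{0}'.format('a', 'b')          # explicit indexes still work
    'a-b-a'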
-* The :func:`int` and :func:`long` types gained a ``bit_length`` +* The :func:`int` and :func:`!long` types gained a ``bit_length`` method that returns the number of bits necessary to represent its argument in binary:: @@ -848,8 +848,8 @@ Some smaller changes made to the core Python language are: statements that were only working by accident. (Fixed by Meador Inge; :issue:`7902`.) -* It's now possible for a subclass of the built-in :class:`unicode` type - to override the :meth:`__unicode__` method. (Implemented by +* It's now possible for a subclass of the built-in :class:`!unicode` type + to override the :meth:`!__unicode__` method. (Implemented by Victor Stinner; :issue:`1583863`.) * The :class:`bytearray` type's :meth:`~bytearray.translate` method now accepts @@ -876,7 +876,7 @@ Some smaller changes made to the core Python language are: Forgeot d'Arc in :issue:`1616979`; CP858 contributed by Tim Hatch in :issue:`8016`.) -* The :class:`file` object will now set the :attr:`filename` attribute +* The :class:`!file` object will now set the :attr:`!filename` attribute on the :exc:`IOError` exception when trying to open a directory on POSIX platforms (noted by Jan Kaliszewski; :issue:`4764`), and now explicitly checks for and forbids writing to read-only file objects @@ -966,7 +966,7 @@ Several performance enhancements have been added: Apart from the performance improvements this change should be invisible to end users, with one exception: for testing and - debugging purposes there's a new structseq :data:`sys.long_info` that + debugging purposes there's a new structseq :data:`!sys.long_info` that provides information about the internal format, giving the number of bits per digit and the size in bytes of the C type used to store each digit:: @@ -1005,8 +1005,8 @@ Several performance enhancements have been added: conversion function that supports arbitrary bases. (Patch by Gawain Bolton; :issue:`6713`.) -* The :meth:`split`, :meth:`replace`, :meth:`rindex`, - :meth:`rpartition`, and :meth:`rsplit` methods of string-like types +* The :meth:`!split`, :meth:`!replace`, :meth:`!rindex`, + :meth:`!rpartition`, and :meth:`!rsplit` methods of string-like types (strings, Unicode strings, and :class:`bytearray` objects) now use a fast reverse-search algorithm instead of a character-by-character scan. This is sometimes faster by a factor of 10. (Added by @@ -1044,7 +1044,7 @@ changes, or look through the Subversion logs for all the details. used with :class:`memoryview` instances and other similar buffer objects. (Backported from 3.x by Florent Xicluna; :issue:`7703`.) -* Updated module: the :mod:`bsddb` module has been updated from 4.7.2devel9 +* Updated module: the :mod:`!bsddb` module has been updated from 4.7.2devel9 to version 4.8.4 of `the pybsddb package `__. The new version features better Python 3.x compatibility, various bug fixes, @@ -1129,7 +1129,7 @@ changes, or look through the Subversion logs for all the details. (Added by Raymond Hettinger; :issue:`1818`.) - Finally, the :class:`~collections.Mapping` abstract base class now + Finally, the :class:`~collections.abc.Mapping` abstract base class now returns :const:`NotImplemented` if a mapping is compared to another type that isn't a :class:`Mapping`. (Fixed by Daniel Stutzbach; :issue:`8729`.) @@ -1158,7 +1158,7 @@ changes, or look through the Subversion logs for all the details. (Contributed by Mats Kindahl; :issue:`7005`.) 
-* Deprecated function: :func:`contextlib.nested`, which allows +* Deprecated function: :func:`!contextlib.nested`, which allows handling more than one context manager with a single :keyword:`with` statement, has been deprecated, because the :keyword:`!with` statement now supports multiple context managers. @@ -1184,7 +1184,7 @@ changes, or look through the Subversion logs for all the details. * New method: the :class:`~decimal.Decimal` class gained a :meth:`~decimal.Decimal.from_float` class method that performs an exact - conversion of a floating-point number to a :class:`~decimal.Decimal`. + conversion of a floating-point number to a :class:`!Decimal`. This exact conversion strives for the closest decimal approximation to the floating-point representation's value; the resulting decimal value will therefore still include the inaccuracy, @@ -1198,9 +1198,9 @@ changes, or look through the Subversion logs for all the details. of the operands. Previously such comparisons would fall back to Python's default rules for comparing objects, which produced arbitrary results based on their type. Note that you still cannot combine - :class:`Decimal` and floating-point in other operations such as addition, + :class:`!Decimal` and floating-point in other operations such as addition, since you should be explicitly choosing how to convert between float and - :class:`~decimal.Decimal`. (Fixed by Mark Dickinson; :issue:`2531`.) + :class:`!Decimal`. (Fixed by Mark Dickinson; :issue:`2531`.) The constructor for :class:`~decimal.Decimal` now accepts floating-point numbers (added by Raymond Hettinger; :issue:`8257`) @@ -1218,7 +1218,7 @@ changes, or look through the Subversion logs for all the details. more sensible for numeric types. (Changed by Mark Dickinson; :issue:`6857`.) Comparisons involving a signaling NaN value (or ``sNAN``) now signal - :const:`InvalidOperation` instead of silently returning a true or + :const:`~decimal.InvalidOperation` instead of silently returning a true or false value depending on the comparison operator. Quiet NaN values (or ``NaN``) are now hashable. (Fixed by Mark Dickinson; :issue:`7279`.) @@ -1235,13 +1235,13 @@ changes, or look through the Subversion logs for all the details. created some new files that should be included. (Fixed by Tarek Ziadé; :issue:`8688`.) -* The :mod:`doctest` module's :const:`IGNORE_EXCEPTION_DETAIL` flag +* The :mod:`doctest` module's :const:`~doctest.IGNORE_EXCEPTION_DETAIL` flag will now ignore the name of the module containing the exception being tested. (Patch by Lennart Regebro; :issue:`7490`.) * The :mod:`email` module's :class:`~email.message.Message` class will now accept a Unicode-valued payload, automatically converting the - payload to the encoding specified by :attr:`output_charset`. + payload to the encoding specified by :attr:`!output_charset`. (Added by R. David Murray; :issue:`1368247`.) * The :class:`~fractions.Fraction` class now accepts a single float or @@ -1268,10 +1268,10 @@ changes, or look through the Subversion logs for all the details. :issue:`6845`.) * New class decorator: :func:`~functools.total_ordering` in the :mod:`functools` - module takes a class that defines an :meth:`__eq__` method and one of - :meth:`__lt__`, :meth:`__le__`, :meth:`__gt__`, or :meth:`__ge__`, + module takes a class that defines an :meth:`~object.__eq__` method and one of + :meth:`~object.__lt__`, :meth:`~object.__le__`, :meth:`~object.__gt__`, or :meth:`~object.__ge__`, and generates the missing comparison methods. 
Since the - :meth:`__cmp__` method is being deprecated in Python 3.x, + :meth:`!__cmp__` method is being deprecated in Python 3.x, this decorator makes it easier to define ordered classes. (Added by Raymond Hettinger; :issue:`5479`.) @@ -1300,7 +1300,7 @@ changes, or look through the Subversion logs for all the details. :mod:`gzip` module will now consume these trailing bytes. (Fixed by Tadek Pietraszek and Brian Curtin; :issue:`2846`.) -* New attribute: the :mod:`hashlib` module now has an :attr:`~hashlib.hashlib.algorithms` +* New attribute: the :mod:`hashlib` module now has an :attr:`!algorithms` attribute containing a tuple naming the supported algorithms. In Python 2.7, ``hashlib.algorithms`` contains ``('md5', 'sha1', 'sha224', 'sha256', 'sha384', 'sha512')``. @@ -1348,10 +1348,10 @@ changes, or look through the Subversion logs for all the details. * Updated module: The :mod:`io` library has been upgraded to the version shipped with Python 3.1. For 3.1, the I/O library was entirely rewritten in C and is 2 to 20 times faster depending on the task being performed. The - original Python version was renamed to the :mod:`_pyio` module. + original Python version was renamed to the :mod:`!_pyio` module. One minor resulting change: the :class:`io.TextIOBase` class now - has an :attr:`errors` attribute giving the error setting + has an :attr:`~io.TextIOBase.errors` attribute giving the error setting used for encoding and decoding errors (one of ``'strict'``, ``'replace'``, ``'ignore'``). @@ -1423,10 +1423,10 @@ changes, or look through the Subversion logs for all the details. passed to the callable. (Contributed by lekma; :issue:`5585`.) - The :class:`~multiprocessing.Pool` class, which controls a pool of worker processes, + The :class:`~multiprocessing.pool.Pool` class, which controls a pool of worker processes, now has an optional *maxtasksperchild* parameter. Worker processes will perform the specified number of tasks and then exit, causing the - :class:`~multiprocessing.Pool` to start a new worker. This is useful if tasks may leak + :class:`!Pool` to start a new worker. This is useful if tasks may leak memory or other resources, or if some tasks will cause the worker to become very large. (Contributed by Charles Cazabon; :issue:`6963`.) @@ -1498,7 +1498,7 @@ changes, or look through the Subversion logs for all the details. global site-packages directories, :func:`~site.getusersitepackages` returns the path of the user's site-packages directory, and - :func:`~site.getuserbase` returns the value of the :envvar:`USER_BASE` + :func:`~site.getuserbase` returns the value of the :data:`~site.USER_BASE` environment variable, giving the path to a directory that can be used to store data. (Contributed by Tarek Ziadé; :issue:`6693`.) @@ -1540,11 +1540,11 @@ changes, or look through the Subversion logs for all the details. * The :mod:`ssl` module's :class:`~ssl.SSLSocket` objects now support the buffer API, which fixed a test suite failure (fix by Antoine Pitrou; :issue:`7133`) and automatically set - OpenSSL's :c:macro:`SSL_MODE_AUTO_RETRY`, which will prevent an error + OpenSSL's :c:macro:`!SSL_MODE_AUTO_RETRY`, which will prevent an error code being returned from :meth:`recv` operations that trigger an SSL renegotiation (fix by Antoine Pitrou; :issue:`8222`). 
- The :func:`ssl.wrap_socket` constructor function now takes a + The :func:`~ssl.SSLContext.wrap_socket` constructor function now takes a *ciphers* argument that's a string listing the encryption algorithms to be allowed; the format of the string is described `in the OpenSSL documentation @@ -1568,8 +1568,8 @@ changes, or look through the Subversion logs for all the details. code (one of ``bBhHiIlLqQ``); it now always raises a :exc:`struct.error` exception. (Changed by Mark Dickinson; :issue:`1523`.) The :func:`~struct.pack` function will also - attempt to use :meth:`__index__` to convert and pack non-integers - before trying the :meth:`__int__` method or reporting an error. + attempt to use :meth:`~object.__index__` to convert and pack non-integers + before trying the :meth:`~object.__int__` method or reporting an error. (Changed by Mark Dickinson; :issue:`8300`.) * New function: the :mod:`subprocess` module's @@ -1590,7 +1590,7 @@ changes, or look through the Subversion logs for all the details. (Contributed by Gregory P. Smith.) The :mod:`subprocess` module will now retry its internal system calls - on receiving an :const:`EINTR` signal. (Reported by several people; final + on receiving an :const:`~errno.EINTR` signal. (Reported by several people; final patch by Gregory P. Smith in :issue:`1068268`.) * New function: :func:`~symtable.Symbol.is_declared_global` in the :mod:`symtable` module @@ -1602,16 +1602,16 @@ changes, or look through the Subversion logs for all the details. identifier instead of the previous default value of ``'python'``. (Changed by Sean Reifschneider; :issue:`8451`.) -* The ``sys.version_info`` value is now a named tuple, with attributes - named :attr:`major`, :attr:`minor`, :attr:`micro`, - :attr:`releaselevel`, and :attr:`serial`. (Contributed by Ross +* The :attr:`sys.version_info` value is now a named tuple, with attributes + named :attr:`!major`, :attr:`!minor`, :attr:`!micro`, + :attr:`!releaselevel`, and :attr:`!serial`. (Contributed by Ross Light; :issue:`4285`.) :func:`sys.getwindowsversion` also returns a named tuple, - with attributes named :attr:`major`, :attr:`minor`, :attr:`build`, - :attr:`platform`, :attr:`service_pack`, :attr:`service_pack_major`, - :attr:`service_pack_minor`, :attr:`suite_mask`, and - :attr:`product_type`. (Contributed by Brian Curtin; :issue:`7766`.) + with attributes named :attr:`!major`, :attr:`!minor`, :attr:`!build`, + :attr:`!platform`, :attr:`!service_pack`, :attr:`!service_pack_major`, + :attr:`!service_pack_minor`, :attr:`!suite_mask`, and + :attr:`!product_type`. (Contributed by Brian Curtin; :issue:`7766`.) * The :mod:`tarfile` module's default error handling has changed, to no longer suppress fatal errors. The default error level was previously 0, @@ -1691,7 +1691,7 @@ changes, or look through the Subversion logs for all the details. (Originally implemented in Python 3.x by Raymond Hettinger, and backported to 2.7 by Michael Foord.) -* The ElementTree library, :mod:`xml.etree`, no longer escapes +* The :mod:`xml.etree.ElementTree` library no longer escapes ampersands and angle brackets when outputting an XML processing instruction (which looks like ````) or comment (which looks like ````). @@ -1701,8 +1701,8 @@ changes, or look through the Subversion logs for all the details. :mod:`SimpleXMLRPCServer ` modules, have improved performance by supporting HTTP/1.1 keep-alive and by optionally using gzip encoding to compress the XML being exchanged.
The gzip compression is - controlled by the :attr:`encode_threshold` attribute of - :class:`SimpleXMLRPCRequestHandler`, which contains a size in bytes; + controlled by the :attr:`!encode_threshold` attribute of + :class:`~xmlrpc.server.SimpleXMLRPCRequestHandler`, which contains a size in bytes; responses larger than this will be compressed. (Contributed by Kristján Valur Jónsson; :issue:`6267`.) @@ -1713,7 +1713,8 @@ changes, or look through the Subversion logs for all the details. :mod:`zipfile` now also supports archiving empty directories and extracts them correctly. (Fixed by Kuba Wieczorek; :issue:`4710`.) Reading files out of an archive is faster, and interleaving - :meth:`~zipfile.ZipFile.read` and :meth:`~zipfile.ZipFile.readline` now works correctly. + :meth:`read() ` and + :meth:`readline() ` now works correctly. (Contributed by Nir Aides; :issue:`7610`.) The :func:`~zipfile.is_zipfile` function now @@ -1807,14 +1808,14 @@ closely resemble the native platform's widgets. This widget set was originally called Tile, but was renamed to Ttk (for "themed Tk") on being added to Tcl/Tck release 8.5. -To learn more, read the :mod:`ttk` module documentation. You may also +To learn more, read the :mod:`~tkinter.ttk` module documentation. You may also wish to read the Tcl/Tk manual page describing the Ttk theme engine, available at -https://www.tcl.tk/man/tcl8.5/TkCmd/ttk_intro.htm. Some +https://www.tcl.tk/man/tcl8.5/TkCmd/ttk_intro.html. Some screenshots of the Python/Ttk code in use are at https://code.google.com/archive/p/python-ttk/wikis/Screenshots.wiki. -The :mod:`ttk` module was written by Guilherme Polo and added in +The :mod:`tkinter.ttk` module was written by Guilherme Polo and added in :issue:`2983`. An alternate version called ``Tile.py``, written by Martin Franklin and maintained by Kevin Walzer, was proposed for inclusion in :issue:`2618`, but the authors argued that Guilherme @@ -1830,7 +1831,7 @@ The :mod:`unittest` module was greatly enhanced; many new features were added. Most of these features were implemented by Michael Foord, unless otherwise noted. The enhanced version of the module is downloadable separately for use with Python versions 2.4 to 2.6, -packaged as the :mod:`unittest2` package, from +packaged as the :mod:`!unittest2` package, from https://pypi.org/project/unittest2. When used from the command line, the module can automatically discover @@ -1938,19 +1939,20 @@ GvR worked on merging them into Python's version of :mod:`unittest`. differences in the two strings. This comparison is now used by default when Unicode strings are compared with :meth:`~unittest.TestCase.assertEqual`. -* :meth:`~unittest.TestCase.assertRegexpMatches` and - :meth:`~unittest.TestCase.assertNotRegexpMatches` checks whether the +* :meth:`assertRegexpMatches() ` and + :meth:`assertNotRegexpMatches() ` checks whether the first argument is a string matching or not matching the regular expression provided as the second argument (:issue:`8038`). -* :meth:`~unittest.TestCase.assertRaisesRegexp` checks whether a particular exception +* :meth:`assertRaisesRegexp() ` checks + whether a particular exception is raised, and then also checks that the string representation of the exception matches the provided regular expression. * :meth:`~unittest.TestCase.assertIn` and :meth:`~unittest.TestCase.assertNotIn` tests whether *first* is or is not in *second*. 
-* :meth:`~unittest.TestCase.assertItemsEqual` tests whether two provided sequences +* :meth:`assertItemsEqual() ` tests whether two provided sequences contain the same elements. * :meth:`~unittest.TestCase.assertSetEqual` compares whether two sets are equal, and @@ -1966,7 +1968,7 @@ GvR worked on merging them into Python's version of :mod:`unittest`. * :meth:`~unittest.TestCase.assertDictEqual` compares two dictionaries and reports the differences; it's now used by default when you compare two dictionaries - using :meth:`~unittest.TestCase.assertEqual`. :meth:`~unittest.TestCase.assertDictContainsSubset` checks whether + using :meth:`~unittest.TestCase.assertEqual`. :meth:`!assertDictContainsSubset` checks whether all of the key/value pairs in *first* are found in *second*. * :meth:`~unittest.TestCase.assertAlmostEqual` and :meth:`~unittest.TestCase.assertNotAlmostEqual` test @@ -2023,8 +2025,8 @@ version 1.3. Some of the new features are: p = ET.XMLParser(encoding='utf-8') t = ET.XML("""""", parser=p) - Errors in parsing XML now raise a :exc:`ParseError` exception, whose - instances have a :attr:`position` attribute + Errors in parsing XML now raise a :exc:`~xml.etree.ElementTree.ParseError` exception, whose + instances have a :attr:`!position` attribute containing a (*line*, *column*) tuple giving the location of the problem. * ElementTree's code for converting trees to a string has been @@ -2034,7 +2036,8 @@ version 1.3. Some of the new features are: "xml" (the default), "html", or "text". HTML mode will output empty elements as ```` instead of ````, and text mode will skip over elements and only output the text chunks. If - you set the :attr:`tag` attribute of an element to ``None`` but + you set the :attr:`~xml.etree.ElementTree.Element.tag` attribute of an + element to ``None`` but leave its children in place, the element will be omitted when the tree is written out, so you don't need to do more extensive rearrangement to remove a single element. @@ -2064,14 +2067,14 @@ version 1.3. Some of the new features are: # Outputs 1... print ET.tostring(new) -* New :class:`Element` method: +* New :class:`~xml.etree.ElementTree.Element` method: :meth:`~xml.etree.ElementTree.Element.iter` yields the children of the element as a generator. It's also possible to write ``for child in elem:`` to loop over an element's children. The existing method - :meth:`getiterator` is now deprecated, as is :meth:`getchildren` + :meth:`!getiterator` is now deprecated, as is :meth:`!getchildren` which constructs and returns a list of children. -* New :class:`Element` method: +* New :class:`~xml.etree.ElementTree.Element` method: :meth:`~xml.etree.ElementTree.Element.itertext` yields all chunks of text that are descendants of the element. For example:: @@ -2227,7 +2230,7 @@ Changes to Python's build process and to the C API include: (Fixed by Thomas Wouters; :issue:`1590864`.) * The :c:func:`Py_Finalize` function now calls the internal - :func:`threading._shutdown` function; this prevents some exceptions from + :func:`!threading._shutdown` function; this prevents some exceptions from being raised when an interpreter shuts down. (Patch by Adam Olsen; :issue:`1722344`.) @@ -2242,7 +2245,7 @@ Changes to Python's build process and to the C API include: Heller; :issue:`3102`.) * New configure option: the :option:`!--with-system-expat` switch allows - building the :mod:`pyexpat` module to use the system Expat library. + building the :mod:`pyexpat ` module to use the system Expat library. 
(Contributed by Arfrever Frehtes Taifersar Arahesis; :issue:`7609`.) * New configure option: the @@ -2329,9 +2332,9 @@ Port-Specific Changes: Windows * The :mod:`msvcrt` module now contains some constants from the :file:`crtassem.h` header file: - :data:`CRT_ASSEMBLY_VERSION`, - :data:`VC_ASSEMBLY_PUBLICKEYTOKEN`, - and :data:`LIBRARIES_ASSEMBLY_NAME_PREFIX`. + :data:`~msvcrt.CRT_ASSEMBLY_VERSION`, + :data:`~msvcrt.VC_ASSEMBLY_PUBLICKEYTOKEN`, + and :data:`~msvcrt.LIBRARIES_ASSEMBLY_NAME_PREFIX`. (Contributed by David Cournapeau; :issue:`4365`.) * The :mod:`_winreg ` module for accessing the registry now implements @@ -2342,21 +2345,21 @@ Port-Specific Changes: Windows were also tested and documented. (Implemented by Brian Curtin: :issue:`7347`.) -* The new :c:func:`_beginthreadex` API is used to start threads, and +* The new :c:func:`!_beginthreadex` API is used to start threads, and the native thread-local storage functions are now used. (Contributed by Kristján Valur Jónsson; :issue:`3582`.) * The :func:`os.kill` function now works on Windows. The signal value - can be the constants :const:`CTRL_C_EVENT`, - :const:`CTRL_BREAK_EVENT`, or any integer. The first two constants + can be the constants :const:`~signal.CTRL_C_EVENT`, + :const:`~signal.CTRL_BREAK_EVENT`, or any integer. The first two constants will send :kbd:`Control-C` and :kbd:`Control-Break` keystroke events to - subprocesses; any other value will use the :c:func:`TerminateProcess` + subprocesses; any other value will use the :c:func:`!TerminateProcess` API. (Contributed by Miki Tebeka; :issue:`1220212`.) * The :func:`os.listdir` function now correctly fails for an empty path. (Fixed by Hirokazu Yamamoto; :issue:`5913`.) -* The :mod:`mimelib` module will now read the MIME database from +* The :mod:`mimetypes` module will now read the MIME database from the Windows registry when initializing. (Patch by Gabriel Genellina; :issue:`4969`.) @@ -2385,7 +2388,7 @@ Port-Specific Changes: Mac OS X Port-Specific Changes: FreeBSD ----------------------------------- -* FreeBSD 7.1's :const:`SO_SETFIB` constant, used with the :func:`~socket.socket` methods +* FreeBSD 7.1's :const:`!SO_SETFIB` constant, used with the :func:`~socket.socket` methods :func:`~socket.socket.getsockopt`/:func:`~socket.socket.setsockopt` to select an alternate routing table, is now available in the :mod:`socket` module. (Added by Kyle VanderBeek; :issue:`8235`.) @@ -2441,7 +2444,7 @@ This section lists previously described changes and other bugfixes that may require changes to your code: * The :func:`range` function processes its arguments more - consistently; it will now call :meth:`__int__` on non-float, + consistently; it will now call :meth:`~object.__int__` on non-float, non-integer arguments that are supplied to it. (Fixed by Alexander Belopolsky; :issue:`1533`.) @@ -2486,13 +2489,13 @@ In the standard library: (or ``NaN``) are now hashable. (Fixed by Mark Dickinson; :issue:`7279`.) -* The ElementTree library, :mod:`xml.etree`, no longer escapes +* The :mod:`xml.etree.ElementTree` library no longer escapes ampersands and angle brackets when outputting an XML processing instruction (which looks like ````) or comment (which looks like ````). (Patch by Neil Muller; :issue:`2746`.) -* The :meth:`~StringIO.StringIO.readline` method of :class:`~StringIO.StringIO` objects now does +* The :meth:`!readline` method of :class:`~io.StringIO` objects now does nothing when a negative length is requested, as other file-like objects do. (:issue:`7348`). 
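As an illustrative aside (not part of the patch), a hedged sketch of the Windows-only :func:`os.kill` behaviour noted in the port-specific Windows section above; it assumes a Windows host, an arbitrary child command, and a child started in its own process group so the keystroke event can reach it::

    import os
    import signal
    import subprocess
    import sys

    # Start a child in a new process group (Windows only) so that a
    # Control-Break keystroke event can be delivered to it.
    child = subprocess.Popen(
        [sys.executable, "-c", "import time; time.sleep(60)"],
        creationflags=subprocess.CREATE_NEW_PROCESS_GROUP,
    )
    os.kill(child.pid, signal.CTRL_BREAK_EVENT)  # keystroke event, not TerminateProcess
    child.wait()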
@@ -2577,11 +2580,11 @@ Two new environment variables for debug mode -------------------------------------------- In debug mode, the ``[xxx refs]`` statistic is not written by default, the -:envvar:`PYTHONSHOWREFCOUNT` environment variable now must also be set. +:envvar:`!PYTHONSHOWREFCOUNT` environment variable now must also be set. (Contributed by Victor Stinner; :issue:`31733`.) When Python is compiled with ``COUNT_ALLOC`` defined, allocation counts are no -longer dumped by default anymore: the :envvar:`PYTHONSHOWALLOCCOUNT` environment +longer dumped by default anymore: the :envvar:`!PYTHONSHOWALLOCCOUNT` environment variable must now also be set. Moreover, allocation counts are now dumped into stderr, rather than stdout. (Contributed by Victor Stinner; :issue:`31692`.) @@ -2712,7 +2715,8 @@ PEP 476: Enabling certificate verification by default for stdlib http clients ----------------------------------------------------------------------------- :pep:`476` updated :mod:`httplib ` and modules which use it, such as -:mod:`urllib2 ` and :mod:`xmlrpclib`, to now verify that the server +:mod:`urllib2 ` and :mod:`xmlrpclib `, to now +verify that the server presents a certificate which is signed by a Certificate Authority in the platform trust store and whose hostname matches the hostname being requested by default, significantly improving security for many applications. This @@ -2753,7 +2757,7 @@ entire Python process back to the default permissive behaviour of Python 2.7.8 and earlier. For cases where the connection establishment code can't be modified, but the -overall application can be, the new :func:`ssl._https_verify_certificates` +overall application can be, the new :func:`!ssl._https_verify_certificates` function can be used to adjust the default behaviour at runtime. diff --git a/Doc/whatsnew/3.1.rst b/Doc/whatsnew/3.1.rst index e237179f4b1829..c912a928ee4597 100644 --- a/Doc/whatsnew/3.1.rst +++ b/Doc/whatsnew/3.1.rst @@ -169,7 +169,7 @@ Some smaller changes made to the core Python language are: ... if '' in line: ... outfile.write(line) - With the new syntax, the :func:`contextlib.nested` function is no longer + With the new syntax, the :func:`!contextlib.nested` function is no longer needed and is now deprecated. (Contributed by Georg Brandl and Mattias Brändström; diff --git a/Doc/whatsnew/3.13.rst b/Doc/whatsnew/3.13.rst index 10bb1502ab90af..c187af6be12b04 100644 --- a/Doc/whatsnew/3.13.rst +++ b/Doc/whatsnew/3.13.rst @@ -101,6 +101,17 @@ Improved Error Messages variables. See also :ref:`using-on-controlling-color`. (Contributed by Pablo Galindo Salgado in :gh:`112730`.) +* When an incorrect keyword argument is passed to a function, the error message + now potentially suggests the correct keyword argument. + (Contributed by Pablo Galindo Salgado and Shantanu Jain in :gh:`107944`.) + + >>> "better error messages!".split(max_split=1) + Traceback (most recent call last): + File "", line 1, in + "better error messages!".split(max_split=1) + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^ + TypeError: split() got an unexpected keyword argument 'max_split'. Did you mean 'maxsplit'? + Other Language Changes ====================== @@ -174,6 +185,9 @@ array It can be used instead of ``'u'`` type code, which is deprecated. (Contributed by Inada Naoki in :gh:`80480`.) +* Add ``clear()`` method in order to implement ``MutableSequence``. + (Contributed by Mike Zimin in :gh:`114894`.) + ast --- @@ -392,6 +406,13 @@ pdb command line option or :envvar:`PYTHONSAFEPATH` environment variable). 
(Contributed by Tian Gao and Christian Walther in :gh:`111762`.) +queue +----- + +* Add :meth:`queue.Queue.shutdown` (along with :exc:`queue.ShutDown`) for queue + termination. + (Contributed by Laurie Opperman and Yves Duprat in :gh:`104750`.) + re -- * Rename :exc:`!re.error` to :exc:`re.PatternError` for improved clarity. @@ -448,6 +469,12 @@ tkinter a dict instead of a tuple. (Contributed by Serhiy Storchaka in :gh:`43457`.) +* Add new optional keyword-only parameter *return_ints* in + the :meth:`!Text.count` method. + Passing ``return_ints=True`` makes it always return the single count + as an integer instead of a 1-tuple or ``None``. + (Contributed by Serhiy Storchaka in :gh:`97928`.) + * Add support of the "vsapi" element type in the :meth:`~tkinter.ttk.Style.element_create` method of :class:`tkinter.ttk.Style`. @@ -1266,13 +1293,6 @@ that may require changes to your code. Changes in the Python API ------------------------- -* :meth:`!tkinter.Text.count` now always returns an integer if one or less - counting options are specified. - Previously it could return a single count as a 1-tuple, an integer (only if - option ``"update"`` was specified) or ``None`` if no items found. - The result is now the same if ``wantobjects`` is set to ``0``. - (Contributed by Serhiy Storchaka in :gh:`97928`.) - * Functions :c:func:`PyDict_GetItem`, :c:func:`PyDict_GetItemString`, :c:func:`PyMapping_HasKey`, :c:func:`PyMapping_HasKeyString`, :c:func:`PyObject_HasAttr`, :c:func:`PyObject_HasAttrString`, and @@ -1329,6 +1349,12 @@ Build Changes :ref:`limited C API `. (Contributed by Victor Stinner in :gh:`85283`.) +* ``wasm32-wasi`` is now a tier 2 platform. + (Contributed by Brett Cannon in :gh:`115192`.) + +* ``wasm32-emscripten`` is no longer a supported platform. + (Contributed by Brett Cannon in :gh:`115192`.) + C API Changes ============= diff --git a/Doc/whatsnew/3.2.rst b/Doc/whatsnew/3.2.rst index 9834bc03dc4b74..4f70d902243d4d 100644 --- a/Doc/whatsnew/3.2.rst +++ b/Doc/whatsnew/3.2.rst @@ -743,8 +743,8 @@ Several new and useful functions and methods have been added: Two methods have been deprecated: -* :meth:`xml.etree.ElementTree.getchildren` use ``list(elem)`` instead. -* :meth:`xml.etree.ElementTree.getiterator` use ``Element.iter`` instead. +* :meth:`!xml.etree.ElementTree.getchildren` use ``list(elem)`` instead. +* :meth:`!xml.etree.ElementTree.getiterator` use ``Element.iter`` instead. For details of the update, see `Introducing ElementTree `_ @@ -2682,7 +2682,7 @@ require changes to your code: (Contributed by Georg Brandl; :issue:`5675`.) -* The previously deprecated :func:`contextlib.nested` function has been removed +* The previously deprecated :func:`!contextlib.nested` function has been removed in favor of a plain :keyword:`with` statement which can accept multiple context managers. The latter technique is faster (because it is built-in), and it does a better job finalizing multiple context managers when one of them diff --git a/Include/cpython/optimizer.h b/Include/cpython/optimizer.h index 5a9ccaea3b2209..3928eca583ba5b 100644 --- a/Include/cpython/optimizer.h +++ b/Include/cpython/optimizer.h @@ -24,9 +24,10 @@ typedef struct { uint8_t opcode; uint8_t oparg; uint8_t valid; - uint8_t linked; + int index; // Index of ENTER_EXECUTOR (if code isn't NULL, below). _PyBloomFilter bloom; _PyExecutorLinkListNode links; + PyCodeObject *code; // Weak (NULL if no corresponding ENTER_EXECUTOR).
} _PyVMData; typedef struct { diff --git a/Include/cpython/pystats.h b/Include/cpython/pystats.h index bf0cfe4cb695b4..0f50439b73848e 100644 --- a/Include/cpython/pystats.h +++ b/Include/cpython/pystats.h @@ -133,6 +133,9 @@ typedef struct _rare_event_stats { uint64_t builtin_dict; /* Modifying a function, e.g. func.__defaults__ = ..., etc. */ uint64_t func_modification; + /* Modifying a dict that is being watched */ + uint64_t watched_dict_modification; + uint64_t watched_globals_modification; } RareEventStats; typedef struct _stats { diff --git a/Include/internal/pycore_brc.h b/Include/internal/pycore_brc.h new file mode 100644 index 00000000000000..3453d83b57ca97 --- /dev/null +++ b/Include/internal/pycore_brc.h @@ -0,0 +1,74 @@ +#ifndef Py_INTERNAL_BRC_H +#define Py_INTERNAL_BRC_H + +#include +#include "pycore_llist.h" // struct llist_node +#include "pycore_lock.h" // PyMutex +#include "pycore_object_stack.h" // _PyObjectStack + +#ifdef __cplusplus +extern "C" { +#endif + +#ifndef Py_BUILD_CORE +# error "this header requires Py_BUILD_CORE define" +#endif + +#ifdef Py_GIL_DISABLED + +// Prime number to avoid correlations with memory addresses. +#define _Py_BRC_NUM_BUCKETS 257 + +// Hash table bucket +struct _brc_bucket { + // Mutex protects both the bucket and thread state queues in this bucket. + PyMutex mutex; + + // Linked list of _PyThreadStateImpl objects hashed to this bucket. + struct llist_node root; +}; + +// Per-interpreter biased reference counting state +struct _brc_state { + // Hash table of thread states by thread-id. Thread states within a bucket + // are chained using a doubly-linked list. + struct _brc_bucket table[_Py_BRC_NUM_BUCKETS]; +}; + +// Per-thread biased reference counting state +struct _brc_thread_state { + // Linked-list of thread states per hash bucket + struct llist_node bucket_node; + + // Thread-id as determined by _PyThread_Id() + uintptr_t tid; + + // Objects with refcounts to be merged (protected by bucket mutex) + _PyObjectStack objects_to_merge; + + // Local stack of objects to be merged (not accessed by other threads) + _PyObjectStack local_objects_to_merge; +}; + +// Initialize/finalize the per-thread biased reference counting state +void _Py_brc_init_thread(PyThreadState *tstate); +void _Py_brc_remove_thread(PyThreadState *tstate); + +// Initialize per-interpreter state +void _Py_brc_init_state(PyInterpreterState *interp); + +void _Py_brc_after_fork(PyInterpreterState *interp); + +// Enqueues an object to be merged by its owning thread (tid). This +// steals a reference to the object. +void _Py_brc_queue_object(PyObject *ob); + +// Merge the refcounts of queued objects for the current thread.
+void _Py_brc_merge_refcounts(PyThreadState *tstate); + +#endif /* Py_GIL_DISABLED */ + +#ifdef __cplusplus +} +#endif +#endif /* !Py_INTERNAL_BRC_H */ diff --git a/Include/internal/pycore_ceval.h b/Include/internal/pycore_ceval.h index a66af1389541dd..b158fc9ff5ebc1 100644 --- a/Include/internal/pycore_ceval.h +++ b/Include/internal/pycore_ceval.h @@ -206,6 +206,7 @@ void _PyEval_FrameClearAndPop(PyThreadState *tstate, _PyInterpreterFrame *frame) #define _PY_ASYNC_EXCEPTION_BIT 3 #define _PY_GC_SCHEDULED_BIT 4 #define _PY_EVAL_PLEASE_STOP_BIT 5 +#define _PY_EVAL_EXPLICIT_MERGE_BIT 6 /* Reserve a few bits for future use */ #define _PY_EVAL_EVENTS_BITS 8 diff --git a/Include/internal/pycore_context.h b/Include/internal/pycore_context.h index 3284efba2b6f4c..ae5c47f195eb7f 100644 --- a/Include/internal/pycore_context.h +++ b/Include/internal/pycore_context.h @@ -14,7 +14,6 @@ extern PyTypeObject _PyContextTokenMissing_Type; /* runtime lifecycle */ PyStatus _PyContext_Init(PyInterpreterState *); -void _PyContext_Fini(_PyFreeListState *); /* other API */ diff --git a/Include/internal/pycore_dict.h b/Include/internal/pycore_dict.h index 233da058f464d1..0ebe701bc16f81 100644 --- a/Include/internal/pycore_dict.h +++ b/Include/internal/pycore_dict.h @@ -209,6 +209,7 @@ static inline PyDictUnicodeEntry* DK_UNICODE_ENTRIES(PyDictKeysObject *dk) { #define DICT_VERSION_INCREMENT (1 << (DICT_MAX_WATCHERS + DICT_WATCHED_MUTATION_BITS)) #define DICT_WATCHER_MASK ((1 << DICT_MAX_WATCHERS) - 1) +#define DICT_WATCHER_AND_MODIFICATION_MASK ((1 << (DICT_MAX_WATCHERS + DICT_WATCHED_MUTATION_BITS)) - 1) #ifdef Py_GIL_DISABLED #define DICT_NEXT_VERSION(INTERP) \ @@ -236,10 +237,10 @@ _PyDict_NotifyEvent(PyInterpreterState *interp, assert(Py_REFCNT((PyObject*)mp) > 0); int watcher_bits = mp->ma_version_tag & DICT_WATCHER_MASK; if (watcher_bits) { + RARE_EVENT_STAT_INC(watched_dict_modification); _PyDict_SendEvent(watcher_bits, event, mp, key, value); - return DICT_NEXT_VERSION(interp) | watcher_bits; } - return DICT_NEXT_VERSION(interp); + return DICT_NEXT_VERSION(interp) | (mp->ma_version_tag & DICT_WATCHER_AND_MODIFICATION_MASK); } extern PyObject *_PyObject_MakeDictFromInstanceAttributes(PyObject *obj, PyDictValues *values); diff --git a/Include/internal/pycore_floatobject.h b/Include/internal/pycore_floatobject.h index 038578e1f9680a..3767df5506d43f 100644 --- a/Include/internal/pycore_floatobject.h +++ b/Include/internal/pycore_floatobject.h @@ -15,7 +15,6 @@ extern "C" { extern void _PyFloat_InitState(PyInterpreterState *); extern PyStatus _PyFloat_InitTypes(PyInterpreterState *); -extern void _PyFloat_Fini(_PyFreeListState *); extern void _PyFloat_FiniType(PyInterpreterState *); diff --git a/Include/internal/pycore_freelist.h b/Include/internal/pycore_freelist.h index 82a42300991ecc..1bc551914794f0 100644 --- a/Include/internal/pycore_freelist.h +++ b/Include/internal/pycore_freelist.h @@ -125,6 +125,16 @@ typedef struct _Py_freelist_state { struct _Py_object_stack_state object_stacks; } _PyFreeListState; +extern void _PyObject_ClearFreeLists(_PyFreeListState *state, int is_finalization); +extern void _PyTuple_ClearFreeList(_PyFreeListState *state, int is_finalization); +extern void _PyFloat_ClearFreeList(_PyFreeListState *state, int is_finalization); +extern void _PyList_ClearFreeList(_PyFreeListState *state, int is_finalization); +extern void _PySlice_ClearFreeList(_PyFreeListState *state, int is_finalization); +extern void _PyDict_ClearFreeList(_PyFreeListState *state, int is_finalization); +extern void 
_PyAsyncGen_ClearFreeLists(_PyFreeListState *state, int is_finalization); +extern void _PyContext_ClearFreeList(_PyFreeListState *state, int is_finalization); +extern void _PyObjectStackChunk_ClearFreeList(_PyFreeListState *state, int is_finalization); + #ifdef __cplusplus } #endif diff --git a/Include/internal/pycore_gc.h b/Include/internal/pycore_gc.h index 8d0bc2a218e48d..582a16bf5218ce 100644 --- a/Include/internal/pycore_gc.h +++ b/Include/internal/pycore_gc.h @@ -279,14 +279,6 @@ extern PyObject *_PyGC_GetReferrers(PyInterpreterState *interp, PyObject *objs); // Functions to clear types free lists extern void _PyGC_ClearAllFreeLists(PyInterpreterState *interp); -extern void _Py_ClearFreeLists(_PyFreeListState *state, int is_finalization); -extern void _PyTuple_ClearFreeList(_PyFreeListState *state, int is_finalization); -extern void _PyFloat_ClearFreeList(_PyFreeListState *state, int is_finalization); -extern void _PyList_ClearFreeList(_PyFreeListState *state, int is_finalization); -extern void _PySlice_ClearCache(_PyFreeListState *state); -extern void _PyDict_ClearFreeList(_PyFreeListState *state, int is_finalization); -extern void _PyAsyncGen_ClearFreeLists(_PyFreeListState *state, int is_finalization); -extern void _PyContext_ClearFreeList(_PyFreeListState *state, int is_finalization); extern void _Py_ScheduleGC(PyInterpreterState *interp); extern void _Py_RunGC(PyThreadState *tstate); diff --git a/Include/internal/pycore_genobject.h b/Include/internal/pycore_genobject.h index 5ad63658051e86..b2aa017598409f 100644 --- a/Include/internal/pycore_genobject.h +++ b/Include/internal/pycore_genobject.h @@ -26,10 +26,6 @@ extern PyTypeObject _PyCoroWrapper_Type; extern PyTypeObject _PyAsyncGenWrappedValue_Type; extern PyTypeObject _PyAsyncGenAThrow_Type; -/* runtime lifecycle */ - -extern void _PyAsyncGen_Fini(_PyFreeListState *); - #ifdef __cplusplus } #endif diff --git a/Include/internal/pycore_interp.h b/Include/internal/pycore_interp.h index f7c332ed747cfa..31d88071e19d0c 100644 --- a/Include/internal/pycore_interp.h +++ b/Include/internal/pycore_interp.h @@ -201,6 +201,7 @@ struct _is { #if defined(Py_GIL_DISABLED) struct _mimalloc_interp_state mimalloc; + struct _brc_state brc; // biased reference counting state #endif // Per-interpreter state for the obmalloc allocator. For the main diff --git a/Include/internal/pycore_list.h b/Include/internal/pycore_list.h index 4536f90e414493..50dc13c4da4487 100644 --- a/Include/internal/pycore_list.h +++ b/Include/internal/pycore_list.h @@ -13,12 +13,6 @@ extern "C" { extern PyObject* _PyList_Extend(PyListObject *, PyObject *); extern void _PyList_DebugMallocStats(FILE *out); - -/* runtime lifecycle */ - -extern void _PyList_Fini(_PyFreeListState *); - - #define _PyList_ITEMS(op) _Py_RVALUE(_PyList_CAST(op)->ob_item) extern int diff --git a/Include/internal/pycore_object_stack.h b/Include/internal/pycore_object_stack.h index 1dc1c1591525de..fc130b1e9920b4 100644 --- a/Include/internal/pycore_object_stack.h +++ b/Include/internal/pycore_object_stack.h @@ -1,6 +1,8 @@ #ifndef Py_INTERNAL_OBJECT_STACK_H #define Py_INTERNAL_OBJECT_STACK_H +#include "pycore_freelist.h" // _PyFreeListState + #ifdef __cplusplus extern "C" { #endif @@ -32,9 +34,6 @@ _PyObjectStackChunk_New(void); extern void _PyObjectStackChunk_Free(_PyObjectStackChunk *); -extern void -_PyObjectStackChunk_ClearFreeList(_PyFreeListState *state, int is_finalization); - // Push an item onto the stack. Return -1 on allocation failure, 0 on success. 
static inline int _PyObjectStack_Push(_PyObjectStack *stack, PyObject *obj) @@ -74,6 +73,10 @@ _PyObjectStack_Pop(_PyObjectStack *stack) return obj; } +// Merge src into dst, leaving src empty +extern void +_PyObjectStack_Merge(_PyObjectStack *dst, _PyObjectStack *src); + // Remove all items from the stack extern void _PyObjectStack_Clear(_PyObjectStack *stack); diff --git a/Include/internal/pycore_sliceobject.h b/Include/internal/pycore_sliceobject.h index 0c72d3ee6225c5..89086f67683a2f 100644 --- a/Include/internal/pycore_sliceobject.h +++ b/Include/internal/pycore_sliceobject.h @@ -11,8 +11,6 @@ extern "C" { /* runtime lifecycle */ -extern void _PySlice_Fini(_PyFreeListState *); - extern PyObject * _PyBuildSlice_ConsumeRefs(PyObject *start, PyObject *stop); diff --git a/Include/internal/pycore_symtable.h b/Include/internal/pycore_symtable.h index 1d782ca2c96e05..b44393b5644673 100644 --- a/Include/internal/pycore_symtable.h +++ b/Include/internal/pycore_symtable.h @@ -109,18 +109,18 @@ extern PyObject* _Py_Mangle(PyObject *p, PyObject *name); /* Flags for def-use information */ -#define DEF_GLOBAL 1 /* global stmt */ -#define DEF_LOCAL 2 /* assignment in code block */ -#define DEF_PARAM 2<<1 /* formal parameter */ -#define DEF_NONLOCAL 2<<2 /* nonlocal stmt */ -#define USE 2<<3 /* name is used */ -#define DEF_FREE 2<<4 /* name used but not defined in nested block */ -#define DEF_FREE_CLASS 2<<5 /* free variable from class's method */ -#define DEF_IMPORT 2<<6 /* assignment occurred via import */ -#define DEF_ANNOT 2<<7 /* this name is annotated */ -#define DEF_COMP_ITER 2<<8 /* this name is a comprehension iteration variable */ -#define DEF_TYPE_PARAM 2<<9 /* this name is a type parameter */ -#define DEF_COMP_CELL 2<<10 /* this name is a cell in an inlined comprehension */ +#define DEF_GLOBAL 1 /* global stmt */ +#define DEF_LOCAL 2 /* assignment in code block */ +#define DEF_PARAM (2<<1) /* formal parameter */ +#define DEF_NONLOCAL (2<<2) /* nonlocal stmt */ +#define USE (2<<3) /* name is used */ +#define DEF_FREE (2<<4) /* name used but not defined in nested block */ +#define DEF_FREE_CLASS (2<<5) /* free variable from class's method */ +#define DEF_IMPORT (2<<6) /* assignment occurred via import */ +#define DEF_ANNOT (2<<7) /* this name is annotated */ +#define DEF_COMP_ITER (2<<8) /* this name is a comprehension iteration variable */ +#define DEF_TYPE_PARAM (2<<9) /* this name is a type parameter */ +#define DEF_COMP_CELL (2<<10) /* this name is a cell in an inlined comprehension */ #define DEF_BOUND (DEF_LOCAL | DEF_PARAM | DEF_IMPORT) diff --git a/Include/internal/pycore_tstate.h b/Include/internal/pycore_tstate.h index 472fa08154e8f9..77a1dc59163d21 100644 --- a/Include/internal/pycore_tstate.h +++ b/Include/internal/pycore_tstate.h @@ -10,6 +10,7 @@ extern "C" { #include "pycore_freelist.h" // struct _Py_freelist_state #include "pycore_mimalloc.h" // struct _mimalloc_thread_state +#include "pycore_brc.h" // struct _brc_thread_state // Every PyThreadState is actually allocated as a _PyThreadStateImpl. 
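Note on the pycore_symtable.h hunk above: parenthesizing the shifted DEF_* flags is standard macro hygiene, because << binds more loosely than + or -, so an unparenthesized expansion can silently change meaning at the use site. Python's operators have the same relative precedence, so a two-line illustration:

print(2 << 2 + 1)      # 16: the '+' is folded into the shift amount, as "2<<2 + 1" would be in C
print((2 << 2) + 1)    # 9: what the parenthesized macro guarantees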
The @@ -22,6 +23,7 @@ typedef struct _PyThreadStateImpl { #ifdef Py_GIL_DISABLED struct _mimalloc_thread_state mimalloc; struct _Py_freelist_state freelist_state; + struct _brc_thread_state brc; #endif } _PyThreadStateImpl; diff --git a/Include/internal/pycore_tuple.h b/Include/internal/pycore_tuple.h index b348339a505b0f..4605f355ccbc38 100644 --- a/Include/internal/pycore_tuple.h +++ b/Include/internal/pycore_tuple.h @@ -14,7 +14,6 @@ extern void _PyTuple_DebugMallocStats(FILE *out); /* runtime lifecycle */ extern PyStatus _PyTuple_InitGlobalObjects(PyInterpreterState *); -extern void _PyTuple_Fini(_PyFreeListState *); /* other API */ diff --git a/Lib/_pydatetime.py b/Lib/_pydatetime.py index 54c12d3b2f3f16..b7d569cc41740e 100644 --- a/Lib/_pydatetime.py +++ b/Lib/_pydatetime.py @@ -556,10 +556,6 @@ def _check_tzinfo_arg(tz): if tz is not None and not isinstance(tz, tzinfo): raise TypeError("tzinfo argument must be None or of a tzinfo subclass") -def _cmperror(x, y): - raise TypeError("can't compare '%s' to '%s'" % ( - type(x).__name__, type(y).__name__)) - def _divide_and_round(a, b): """divide a by b and round result to the nearest integer @@ -1113,32 +1109,33 @@ def replace(self, year=None, month=None, day=None): # Comparisons of date objects with other. def __eq__(self, other): - if isinstance(other, date): + if isinstance(other, date) and not isinstance(other, datetime): return self._cmp(other) == 0 return NotImplemented def __le__(self, other): - if isinstance(other, date): + if isinstance(other, date) and not isinstance(other, datetime): return self._cmp(other) <= 0 return NotImplemented def __lt__(self, other): - if isinstance(other, date): + if isinstance(other, date) and not isinstance(other, datetime): return self._cmp(other) < 0 return NotImplemented def __ge__(self, other): - if isinstance(other, date): + if isinstance(other, date) and not isinstance(other, datetime): return self._cmp(other) >= 0 return NotImplemented def __gt__(self, other): - if isinstance(other, date): + if isinstance(other, date) and not isinstance(other, datetime): return self._cmp(other) > 0 return NotImplemented def _cmp(self, other): assert isinstance(other, date) + assert not isinstance(other, datetime) y, m, d = self._year, self._month, self._day y2, m2, d2 = other._year, other._month, other._day return _cmp((y, m, d), (y2, m2, d2)) @@ -2137,42 +2134,32 @@ def dst(self): def __eq__(self, other): if isinstance(other, datetime): return self._cmp(other, allow_mixed=True) == 0 - elif not isinstance(other, date): - return NotImplemented else: - return False + return NotImplemented def __le__(self, other): if isinstance(other, datetime): return self._cmp(other) <= 0 - elif not isinstance(other, date): - return NotImplemented else: - _cmperror(self, other) + return NotImplemented def __lt__(self, other): if isinstance(other, datetime): return self._cmp(other) < 0 - elif not isinstance(other, date): - return NotImplemented else: - _cmperror(self, other) + return NotImplemented def __ge__(self, other): if isinstance(other, datetime): return self._cmp(other) >= 0 - elif not isinstance(other, date): - return NotImplemented else: - _cmperror(self, other) + return NotImplemented def __gt__(self, other): if isinstance(other, datetime): return self._cmp(other) > 0 - elif not isinstance(other, date): - return NotImplemented else: - _cmperror(self, other) + return NotImplemented def _cmp(self, other, allow_mixed=False): assert isinstance(other, datetime) diff --git a/Lib/fractions.py b/Lib/fractions.py index 
389ab386b6a8a4..f8c6c9c438c737 100644 --- a/Lib/fractions.py +++ b/Lib/fractions.py @@ -579,7 +579,8 @@ def __format__(self, format_spec, /): f"for object of type {type(self).__name__!r}" ) - def _operator_fallbacks(monomorphic_operator, fallback_operator): + def _operator_fallbacks(monomorphic_operator, fallback_operator, + handle_complex=True): """Generates forward and reverse operators given a purely-rational operator and a function from the operator module. @@ -666,7 +667,7 @@ def forward(a, b): return monomorphic_operator(a, Fraction(b)) elif isinstance(b, float): return fallback_operator(float(a), b) - elif isinstance(b, complex): + elif handle_complex and isinstance(b, complex): return fallback_operator(complex(a), b) else: return NotImplemented @@ -679,7 +680,7 @@ def reverse(b, a): return monomorphic_operator(Fraction(a), b) elif isinstance(a, numbers.Real): return fallback_operator(float(a), float(b)) - elif isinstance(a, numbers.Complex): + elif handle_complex and isinstance(a, numbers.Complex): return fallback_operator(complex(a), complex(b)) else: return NotImplemented @@ -830,7 +831,7 @@ def _floordiv(a, b): """a // b""" return (a.numerator * b.denominator) // (a.denominator * b.numerator) - __floordiv__, __rfloordiv__ = _operator_fallbacks(_floordiv, operator.floordiv) + __floordiv__, __rfloordiv__ = _operator_fallbacks(_floordiv, operator.floordiv, False) def _divmod(a, b): """(a // b, a % b)""" @@ -838,14 +839,14 @@ def _divmod(a, b): div, n_mod = divmod(a.numerator * db, da * b.numerator) return div, Fraction(n_mod, da * db) - __divmod__, __rdivmod__ = _operator_fallbacks(_divmod, divmod) + __divmod__, __rdivmod__ = _operator_fallbacks(_divmod, divmod, False) def _mod(a, b): """a % b""" da, db = a.denominator, b.denominator return Fraction((a.numerator * db) % (b.numerator * da), da * db) - __mod__, __rmod__ = _operator_fallbacks(_mod, operator.mod) + __mod__, __rmod__ = _operator_fallbacks(_mod, operator.mod, False) def __pow__(a, b): """a ** b diff --git a/Lib/glob.py b/Lib/glob.py index 4a335a10766cf4..343be78a73b20a 100644 --- a/Lib/glob.py +++ b/Lib/glob.py @@ -132,7 +132,8 @@ def glob1(dirname, pattern): def _glob2(dirname, pattern, dir_fd, dironly, include_hidden=False): assert _isrecursive(pattern) - yield pattern[:0] + if not dirname or _isdir(dirname, dir_fd): + yield pattern[:0] yield from _rlistdir(dirname, dir_fd, dironly, include_hidden=include_hidden) diff --git a/Lib/idlelib/sidebar.py b/Lib/idlelib/sidebar.py index ff77b568a786e0..aa19a24e3edef2 100644 --- a/Lib/idlelib/sidebar.py +++ b/Lib/idlelib/sidebar.py @@ -27,7 +27,7 @@ def get_displaylines(text, index): """Display height, in lines, of a logical line in a Tk text widget.""" return text.count(f"{index} linestart", f"{index} lineend", - "displaylines") + "displaylines", return_ints=True) def get_widget_padding(widget): """Get the total padding of a Tk widget, including its border.""" diff --git a/Lib/logging/__init__.py b/Lib/logging/__init__.py index 684b58d5548f91..fcec9e76b98661 100644 --- a/Lib/logging/__init__.py +++ b/Lib/logging/__init__.py @@ -1949,18 +1949,11 @@ def hasHandlers(self): """ return self.logger.hasHandlers() - def _log(self, level, msg, args, exc_info=None, extra=None, stack_info=False): + def _log(self, level, msg, args, **kwargs): """ Low-level log implementation, proxied to allow nested logger adapters. 
""" - return self.logger._log( - level, - msg, - args, - exc_info=exc_info, - extra=extra, - stack_info=stack_info, - ) + return self.logger._log(level, msg, args, **kwargs) @property def manager(self): diff --git a/Lib/multiprocessing/connection.py b/Lib/multiprocessing/connection.py index dbbf106f680964..c6a66a1bc963c3 100644 --- a/Lib/multiprocessing/connection.py +++ b/Lib/multiprocessing/connection.py @@ -19,7 +19,6 @@ import tempfile import itertools -import _multiprocessing from . import util @@ -28,6 +27,7 @@ _ForkingPickler = reduction.ForkingPickler try: + import _multiprocessing import _winapi from _winapi import WAIT_OBJECT_0, WAIT_ABANDONED_0, WAIT_TIMEOUT, INFINITE except ImportError: diff --git a/Lib/pathlib/__init__.py b/Lib/pathlib/__init__.py index 65ce836765c42b..46834b1a76a6eb 100644 --- a/Lib/pathlib/__init__.py +++ b/Lib/pathlib/__init__.py @@ -587,9 +587,13 @@ def iterdir(self): def _scandir(self): return os.scandir(self) - def _make_child_entry(self, entry): + def _direntry_str(self, entry): + # Transform an entry yielded from _scandir() into a path string. + return entry.name if str(self) == '.' else entry.path + + def _make_child_direntry(self, entry): # Transform an entry yielded from _scandir() into a path object. - path_str = entry.name if str(self) == '.' else entry.path + path_str = self._direntry_str(entry) path = self.with_segments(path_str) path._str = path_str path._drv = self.drive diff --git a/Lib/pathlib/_abc.py b/Lib/pathlib/_abc.py index e4b1201a3703c3..27c6b4e367a050 100644 --- a/Lib/pathlib/_abc.py +++ b/Lib/pathlib/_abc.py @@ -86,19 +86,29 @@ def _select_children(parent_paths, dir_only, follow_symlinks, match): continue except OSError: continue - if match(entry.name): - yield parent_path._make_child_entry(entry) + # Avoid cost of making a path object for non-matching paths by + # matching against the os.DirEntry.name string. + if match is None or match(entry.name): + yield parent_path._make_child_direntry(entry) -def _select_recursive(parent_paths, dir_only, follow_symlinks): - """Yield given paths and all their subdirectories, recursively.""" +def _select_recursive(parent_paths, dir_only, follow_symlinks, match): + """Yield given paths and all their children, recursively, filtering by + string and type. + """ if follow_symlinks is None: follow_symlinks = False for parent_path in parent_paths: + if match is not None: + # If we're filtering paths through a regex, record the length of + # the parent path. We'll pass it to match(path, pos=...) later. + parent_len = len(str(parent_path._make_child_relpath('_'))) - 1 paths = [parent_path._make_child_relpath('')] while paths: path = paths.pop() - yield path + if match is None or match(str(path), parent_len): + # Yield *directory* path that matches pattern (if any). + yield path try: # We must close the scandir() object before proceeding to # avoid exhausting file descriptors when globbing deep trees. @@ -108,14 +118,22 @@ def _select_recursive(parent_paths, dir_only, follow_symlinks): pass else: for entry in entries: + # Handle directory entry. try: if entry.is_dir(follow_symlinks=follow_symlinks): - paths.append(path._make_child_entry(entry)) + # Recurse into this directory. + paths.append(path._make_child_direntry(entry)) continue except OSError: pass + + # Handle file entry. if not dir_only: - yield path._make_child_entry(entry) + # Avoid cost of making a path object for non-matching + # files by matching against the os.DirEntry object. 
+ if match is None or match(path._direntry_str(entry), parent_len): + # Yield *file* path that matches pattern (if any). + yield path._make_child_direntry(entry) def _select_unique(paths): @@ -750,8 +768,14 @@ def _scandir(self): from contextlib import nullcontext return nullcontext(self.iterdir()) - def _make_child_entry(self, entry): + def _direntry_str(self, entry): + # Transform an entry yielded from _scandir() into a path string. + # PathBase._scandir() yields PathBase objects, so use str(). + return str(entry) + + def _make_child_direntry(self, entry): # Transform an entry yielded from _scandir() into a path object. + # PathBase._scandir() yields PathBase objects, so this is a no-op. return entry def _make_child_relpath(self, name): @@ -769,43 +793,49 @@ def glob(self, pattern, *, case_sensitive=None, follow_symlinks=None): stack = pattern._pattern_stack specials = ('', '.', '..') - filter_paths = False deduplicate_paths = False sep = self.pathmod.sep paths = iter([self] if self.is_dir() else []) while stack: part = stack.pop() if part in specials: + # Join special component (e.g. '..') onto paths. paths = _select_special(paths, part) + elif part == '**': - # Consume adjacent '**' components. + # Consume following '**' components, which have no effect. while stack and stack[-1] == '**': stack.pop() - # Consume adjacent non-special components and enable post-walk - # regex filtering, provided we're treating symlinks consistently. + # Consume following non-special components, provided we're + # treating symlinks consistently. Each component is joined + # onto 'part', which is used to generate an re.Pattern object. if follow_symlinks is not None: while stack and stack[-1] not in specials: - filter_paths = True - stack.pop() + part += sep + stack.pop() - dir_only = bool(stack) - paths = _select_recursive(paths, dir_only, follow_symlinks) + # If the previous loop consumed pattern components, compile an + # re.Pattern object based on those components. + match = _compile_pattern(part, sep, case_sensitive) if part != '**' else None + + # Recursively walk directories, filtering by type and regex. + paths = _select_recursive(paths, bool(stack), follow_symlinks, match) + + # De-duplicate if we've already seen a '**' component. if deduplicate_paths: - # De-duplicate if we've already seen a '**' component. paths = _select_unique(paths) deduplicate_paths = True + elif '**' in part: raise ValueError("Invalid pattern: '**' can only be an entire path component") + else: - dir_only = bool(stack) - match = _compile_pattern(part, sep, case_sensitive) - paths = _select_children(paths, dir_only, follow_symlinks, match) - if filter_paths: - # Filter out paths that don't match pattern. - prefix_len = len(str(self._make_child_relpath('_'))) - 1 - match = _compile_pattern(pattern._pattern_str, sep, case_sensitive) - paths = (path for path in paths if match(path._pattern_str, prefix_len)) + # If the pattern component isn't '*', compile an re.Pattern + # object based on the component. + match = _compile_pattern(part, sep, case_sensitive) if part != '*' else None + + # Iterate over directories' children filtering by type and regex. 
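Note on the pathlib glob rework above: the pattern components that follow '**' are compiled into a single regular expression and matched against each candidate path string starting at the length of the parent prefix, so no relative path object has to be built. A minimal sketch of that match(string, pos) idea, using a hypothetical translation helper (this is not pathlib's actual _compile_pattern, just an illustration):

import re

def compile_glob(fragment):
    # Hypothetical stand-in for the pattern compiler used by pathlib.
    regex = re.escape(fragment).replace(r"\*\*/", r"(?:.+/)?").replace(r"\*", r"[^/]*")
    return re.compile(regex + r"\Z").match

match = compile_glob("**/sub/*.py")
parent = "/home/user/project"
parent_len = len(parent) + 1                              # index just past "parent/"
print(bool(match(parent + "/pkg/sub/mod.py", parent_len)))   # True: only the tail is tested
print(bool(match(parent + "/other.txt", parent_len)))        # False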
+ paths = _select_children(paths, bool(stack), follow_symlinks, match) return paths def rglob(self, pattern, *, case_sensitive=None, follow_symlinks=None): @@ -854,7 +884,7 @@ def walk(self, top_down=True, on_error=None, follow_symlinks=False): if is_dir: if not top_down: - paths.append(path._make_child_entry(entry)) + paths.append(path._make_child_direntry(entry)) dirnames.append(entry.name) else: filenames.append(entry.name) diff --git a/Lib/pickletools.py b/Lib/pickletools.py index 95a77aeb2afe2a..51ee4a7a2632ac 100644 --- a/Lib/pickletools.py +++ b/Lib/pickletools.py @@ -1253,7 +1253,7 @@ def __init__(self, name, code, arg, stack_before=[], stack_after=[pyint], proto=2, - doc="""Long integer using found-byte length. + doc="""Long integer using four-byte length. A more efficient encoding of a Python long; the long4 encoding says it all."""), diff --git a/Lib/pydoc.py b/Lib/pydoc.py index 96aa1dfc1aacf6..17f7346e5cc619 100755 --- a/Lib/pydoc.py +++ b/Lib/pydoc.py @@ -225,6 +225,19 @@ def classname(object, modname): name = object.__module__ + '.' + name return name +def parentname(object, modname): + """Get a name of the enclosing class (qualified it with a module name + if necessary) or module.""" + if '.' in object.__qualname__: + name = object.__qualname__.rpartition('.')[0] + if object.__module__ != modname: + return object.__module__ + '.' + name + else: + return name + else: + if object.__module__ != modname: + return object.__module__ + def isdata(object): """Check if an object is of a type that probably means it's data.""" return not (inspect.ismodule(object) or inspect.isclass(object) or @@ -319,13 +332,15 @@ def visiblename(name, all=None, obj=None): return not name.startswith('_') def classify_class_attrs(object): - """Wrap inspect.classify_class_attrs, with fixup for data descriptors.""" + """Wrap inspect.classify_class_attrs, with fixup for data descriptors and bound methods.""" results = [] for (name, kind, cls, value) in inspect.classify_class_attrs(object): if inspect.isdatadescriptor(value): kind = 'data descriptor' if isinstance(value, property) and value.fset is None: kind = 'readonly property' + elif kind == 'method' and _is_bound_method(value): + kind = 'static method' results.append((name, kind, cls, value)) return results @@ -681,6 +696,25 @@ def classlink(self, object, modname): module.__name__, name, classname(object, modname)) return classname(object, modname) + def parentlink(self, object, modname): + """Make a link for the enclosing class or module.""" + link = None + name, module = object.__name__, sys.modules.get(object.__module__) + if hasattr(module, name) and getattr(module, name) is object: + if '.' 
in object.__qualname__: + name = object.__qualname__.rpartition('.')[0] + if object.__module__ != modname: + link = '%s.html#%s' % (module.__name__, name) + else: + link = '#%s' % name + else: + if object.__module__ != modname: + link = '%s.html' % module.__name__ + if link: + return '%s' % (link, parentname(object, modname)) + else: + return parentname(object, modname) + def modulelink(self, object): """Make a link for a module.""" return '%s' % (object.__name__, object.__name__) @@ -925,7 +959,7 @@ def spill(msg, attrs, predicate): push(self.docdata(value, name, mod)) else: push(self.document(value, name, mod, - funcs, classes, mdict, object)) + funcs, classes, mdict, object, homecls)) push('\n') return attrs @@ -1043,24 +1077,44 @@ def formatvalue(self, object): return self.grey('=' + self.repr(object)) def docroutine(self, object, name=None, mod=None, - funcs={}, classes={}, methods={}, cl=None): + funcs={}, classes={}, methods={}, cl=None, homecls=None): """Produce HTML documentation for a function or method object.""" realname = object.__name__ name = name or realname - anchor = (cl and cl.__name__ or '') + '-' + name + if homecls is None: + homecls = cl + anchor = ('' if cl is None else cl.__name__) + '-' + name note = '' - skipdocs = 0 + skipdocs = False + imfunc = None if _is_bound_method(object): - imclass = object.__self__.__class__ - if cl: - if imclass is not cl: - note = ' from ' + self.classlink(imclass, mod) + imself = object.__self__ + if imself is cl: + imfunc = getattr(object, '__func__', None) + elif inspect.isclass(imself): + note = ' class method of %s' % self.classlink(imself, mod) else: - if object.__self__ is not None: - note = ' method of %s instance' % self.classlink( - object.__self__.__class__, mod) - else: - note = ' unbound %s method' % self.classlink(imclass,mod) + note = ' method of %s instance' % self.classlink( + imself.__class__, mod) + elif (inspect.ismethoddescriptor(object) or + inspect.ismethodwrapper(object)): + try: + objclass = object.__objclass__ + except AttributeError: + pass + else: + if cl is None: + note = ' unbound %s method' % self.classlink(objclass, mod) + elif objclass is not homecls: + note = ' from ' + self.classlink(objclass, mod) + else: + imfunc = object + if inspect.isfunction(imfunc) and homecls is not None and ( + imfunc.__module__ != homecls.__module__ or + imfunc.__qualname__ != homecls.__qualname__ + '.' + realname): + pname = self.parentlink(imfunc, mod) + if pname: + note = ' from %s' % pname if (inspect.iscoroutinefunction(object) or inspect.isasyncgenfunction(object)): @@ -1071,10 +1125,13 @@ def docroutine(self, object, name=None, mod=None, if name == realname: title = '%s' % (anchor, realname) else: - if cl and inspect.getattr_static(cl, realname, []) is object: + if (cl is not None and + inspect.getattr_static(cl, realname, []) is object): reallink = '%s' % ( cl.__name__ + '-' + realname, realname) - skipdocs = 1 + skipdocs = True + if note.startswith(' from '): + note = '' else: reallink = realname title = '%s = %s' % ( @@ -1102,7 +1159,7 @@ def docroutine(self, object, name=None, mod=None, doc = doc and '
<dd><span class="code">%s</span></dd>' % doc return '<dl><dt>%s</dt>%s</dl>
\n' % (decl, doc) - def docdata(self, object, name=None, mod=None, cl=None): + def docdata(self, object, name=None, mod=None, cl=None, *ignored): """Produce html documentation for a data descriptor.""" results = [] push = results.append @@ -1213,7 +1270,7 @@ def formattree(self, tree, modname, parent=None, prefix=''): entry, modname, c, prefix + ' ') return result - def docmodule(self, object, name=None, mod=None): + def docmodule(self, object, name=None, mod=None, *ignored): """Produce text documentation for a given module object.""" name = object.__name__ # ignore the passed-in name synop, desc = splitdoc(getdoc(object)) @@ -1392,7 +1449,7 @@ def spill(msg, attrs, predicate): push(self.docdata(value, name, mod)) else: push(self.document(value, - name, mod, object)) + name, mod, object, homecls)) return attrs def spilldescriptors(msg, attrs, predicate): @@ -1467,23 +1524,43 @@ def formatvalue(self, object): """Format an argument default value as text.""" return '=' + self.repr(object) - def docroutine(self, object, name=None, mod=None, cl=None): + def docroutine(self, object, name=None, mod=None, cl=None, homecls=None): """Produce text documentation for a function or method object.""" realname = object.__name__ name = name or realname + if homecls is None: + homecls = cl note = '' - skipdocs = 0 + skipdocs = False + imfunc = None if _is_bound_method(object): - imclass = object.__self__.__class__ - if cl: - if imclass is not cl: - note = ' from ' + classname(imclass, mod) + imself = object.__self__ + if imself is cl: + imfunc = getattr(object, '__func__', None) + elif inspect.isclass(imself): + note = ' class method of %s' % classname(imself, mod) else: - if object.__self__ is not None: - note = ' method of %s instance' % classname( - object.__self__.__class__, mod) - else: - note = ' unbound %s method' % classname(imclass,mod) + note = ' method of %s instance' % classname( + imself.__class__, mod) + elif (inspect.ismethoddescriptor(object) or + inspect.ismethodwrapper(object)): + try: + objclass = object.__objclass__ + except AttributeError: + pass + else: + if cl is None: + note = ' unbound %s method' % classname(objclass, mod) + elif objclass is not homecls: + note = ' from ' + classname(objclass, mod) + else: + imfunc = object + if inspect.isfunction(imfunc) and homecls is not None and ( + imfunc.__module__ != homecls.__module__ or + imfunc.__qualname__ != homecls.__qualname__ + '.' 
+ realname): + pname = parentname(imfunc, mod) + if pname: + note = ' from %s' % pname if (inspect.iscoroutinefunction(object) or inspect.isasyncgenfunction(object)): @@ -1494,8 +1571,11 @@ def docroutine(self, object, name=None, mod=None, cl=None): if name == realname: title = self.bold(realname) else: - if cl and inspect.getattr_static(cl, realname, []) is object: - skipdocs = 1 + if (cl is not None and + inspect.getattr_static(cl, realname, []) is object): + skipdocs = True + if note.startswith(' from '): + note = '' title = self.bold(name) + ' = ' + realname argspec = None @@ -1517,7 +1597,7 @@ def docroutine(self, object, name=None, mod=None, cl=None): doc = getdoc(object) or '' return decl + '\n' + (doc and self.indent(doc).rstrip() + '\n') - def docdata(self, object, name=None, mod=None, cl=None): + def docdata(self, object, name=None, mod=None, cl=None, *ignored): """Produce text documentation for a data descriptor.""" results = [] push = results.append @@ -1533,7 +1613,8 @@ def docdata(self, object, name=None, mod=None, cl=None): docproperty = docdata - def docother(self, object, name=None, mod=None, parent=None, maxlen=None, doc=None): + def docother(self, object, name=None, mod=None, parent=None, *ignored, + maxlen=None, doc=None): """Produce text documentation for a data object.""" repr = self.repr(object) if maxlen: diff --git a/Lib/queue.py b/Lib/queue.py index 55f50088460f9e..467ff4fcecb134 100644 --- a/Lib/queue.py +++ b/Lib/queue.py @@ -25,6 +25,10 @@ class Full(Exception): pass +class ShutDown(Exception): + '''Raised when put/get with shut-down queue.''' + + class Queue: '''Create a queue object with a given maximum size. @@ -54,6 +58,9 @@ def __init__(self, maxsize=0): self.all_tasks_done = threading.Condition(self.mutex) self.unfinished_tasks = 0 + # Queue shutdown state + self.is_shutdown = False + def task_done(self): '''Indicate that a formerly enqueued task is complete. @@ -67,6 +74,8 @@ def task_done(self): Raises a ValueError if called more times than there were items placed in the queue. + + Raises ShutDown if the queue has been shut down immediately. ''' with self.all_tasks_done: unfinished = self.unfinished_tasks - 1 @@ -84,6 +93,8 @@ def join(self): to indicate the item was retrieved and all work on it is complete. When the count of unfinished tasks drops to zero, join() unblocks. + + Raises ShutDown if the queue has been shut down immediately. ''' with self.all_tasks_done: while self.unfinished_tasks: @@ -129,8 +140,12 @@ def put(self, item, block=True, timeout=None): Otherwise ('block' is false), put an item on the queue if a free slot is immediately available, else raise the Full exception ('timeout' is ignored in that case). + + Raises ShutDown if the queue has been shut down. 
''' with self.not_full: + if self.is_shutdown: + raise ShutDown if self.maxsize > 0: if not block: if self._qsize() >= self.maxsize: @@ -138,6 +153,8 @@ def put(self, item, block=True, timeout=None): elif timeout is None: while self._qsize() >= self.maxsize: self.not_full.wait() + if self.is_shutdown: + raise ShutDown elif timeout < 0: raise ValueError("'timeout' must be a non-negative number") else: @@ -147,6 +164,8 @@ def put(self, item, block=True, timeout=None): if remaining <= 0.0: raise Full self.not_full.wait(remaining) + if self.is_shutdown: + raise ShutDown self._put(item) self.unfinished_tasks += 1 self.not_empty.notify() @@ -161,14 +180,21 @@ def get(self, block=True, timeout=None): Otherwise ('block' is false), return an item if one is immediately available, else raise the Empty exception ('timeout' is ignored in that case). + + Raises ShutDown if the queue has been shut down and is empty, + or if the queue has been shut down immediately. ''' with self.not_empty: + if self.is_shutdown and not self._qsize(): + raise ShutDown if not block: if not self._qsize(): raise Empty elif timeout is None: while not self._qsize(): self.not_empty.wait() + if self.is_shutdown and not self._qsize(): + raise ShutDown elif timeout < 0: raise ValueError("'timeout' must be a non-negative number") else: @@ -178,6 +204,8 @@ def get(self, block=True, timeout=None): if remaining <= 0.0: raise Empty self.not_empty.wait(remaining) + if self.is_shutdown and not self._qsize(): + raise ShutDown item = self._get() self.not_full.notify() return item @@ -198,6 +226,28 @@ def get_nowait(self): ''' return self.get(block=False) + def shutdown(self, immediate=False): + '''Shut-down the queue, making queue gets and puts raise. + + By default, gets will only raise once the queue is empty. Set + 'immediate' to True to make gets raise immediately instead. + + All blocked callers of put() will be unblocked, and also get() + and join() if 'immediate'. The ShutDown exception is raised. + ''' + with self.mutex: + self.is_shutdown = True + if immediate: + n_items = self._qsize() + while self._qsize(): + self._get() + if self.unfinished_tasks > 0: + self.unfinished_tasks -= 1 + self.not_empty.notify_all() + # release all blocked threads in `join()` + self.all_tasks_done.notify_all() + self.not_full.notify_all() + # Override these methods to implement other queue organizations # (e.g. stack or priority queue). # These will only be called with appropriate locks held diff --git a/Lib/tarfile.py b/Lib/tarfile.py index 9775040cbe372c..f4dd0fdab4a3e4 100755 --- a/Lib/tarfile.py +++ b/Lib/tarfile.py @@ -2411,7 +2411,7 @@ def _extract_member(self, tarinfo, targetpath, set_attrs=True, if upperdirs and not os.path.exists(upperdirs): # Create directories that are not part of the archive with # default permissions. 
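Note on the queue.py hunk above, which adds Queue.shutdown() and the ShutDown exception: after shutdown(), put() raises immediately, while get() keeps draining queued items and raises only once the queue is empty (or right away with immediate=True). A small usage sketch of the API exactly as introduced here:

import queue, threading

q = queue.Queue()
for i in range(3):
    q.put(i)
q.shutdown()                     # closes the queue to producers

try:
    q.put(99)
except queue.ShutDown:
    print("put() after shutdown raises ShutDown")

def worker():
    while True:
        try:
            item = q.get()       # keeps draining queued items...
        except queue.ShutDown:   # ...and raises once the queue is empty
            return
        print("processed", item)

t = threading.Thread(target=worker)
t.start()
t.join()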
- os.makedirs(upperdirs) + os.makedirs(upperdirs, exist_ok=True) if tarinfo.islnk() or tarinfo.issym(): self._dbg(1, "%s -> %s" % (tarinfo.name, tarinfo.linkname)) diff --git a/Lib/test/archiver_tests.py b/Lib/test/archiver_tests.py index 1a4bbb9e5706c5..24745941b08923 100644 --- a/Lib/test/archiver_tests.py +++ b/Lib/test/archiver_tests.py @@ -3,6 +3,7 @@ import os import sys +from test.support import swap_attr from test.support import os_helper class OverwriteTests: @@ -153,3 +154,24 @@ def test_overwrite_broken_dir_symlink_as_implicit_dir(self): self.extractall(ar) self.assertTrue(os.path.islink(target)) self.assertFalse(os.path.exists(target2)) + + def test_concurrent_extract_dir(self): + target = os.path.join(self.testdir, 'test') + def concurrent_mkdir(*args, **kwargs): + orig_mkdir(*args, **kwargs) + orig_mkdir(*args, **kwargs) + with swap_attr(os, 'mkdir', concurrent_mkdir) as orig_mkdir: + with self.open(self.ar_with_dir) as ar: + self.extractall(ar) + self.assertTrue(os.path.isdir(target)) + + def test_concurrent_extract_implicit_dir(self): + target = os.path.join(self.testdir, 'test') + def concurrent_mkdir(*args, **kwargs): + orig_mkdir(*args, **kwargs) + orig_mkdir(*args, **kwargs) + with swap_attr(os, 'mkdir', concurrent_mkdir) as orig_mkdir: + with self.open(self.ar_with_implicit_dir) as ar: + self.extractall(ar) + self.assertTrue(os.path.isdir(target)) + self.assertTrue(os.path.isfile(os.path.join(target, 'file'))) diff --git a/Lib/test/datetimetester.py b/Lib/test/datetimetester.py index 53ad5e57ada017..31fc383e29707a 100644 --- a/Lib/test/datetimetester.py +++ b/Lib/test/datetimetester.py @@ -1723,11 +1723,24 @@ def test_replace(self): def test_subclass_replace(self): class DateSubclass(self.theclass): - pass + def __new__(cls, *args, **kwargs): + result = self.theclass.__new__(cls, *args, **kwargs) + result.extra = 7 + return result dt = DateSubclass(2012, 1, 1) - self.assertIs(type(dt.replace(year=2013)), DateSubclass) - self.assertIs(type(copy.replace(dt, year=2013)), DateSubclass) + + test_cases = [ + ('self.replace', dt.replace(year=2013)), + ('copy.replace', copy.replace(dt, year=2013)), + ] + + for name, res in test_cases: + with self.subTest(name): + self.assertIs(type(res), DateSubclass) + self.assertEqual(res.year, 2013) + self.assertEqual(res.month, 1) + self.assertEqual(res.extra, 7) def test_subclass_date(self): @@ -3025,6 +3038,26 @@ def __new__(cls, *args, **kwargs): self.assertIsInstance(dt, DateTimeSubclass) self.assertEqual(dt.extra, 7) + def test_subclass_replace_fold(self): + class DateTimeSubclass(self.theclass): + pass + + dt = DateTimeSubclass(2012, 1, 1) + dt2 = DateTimeSubclass(2012, 1, 1, fold=1) + + test_cases = [ + ('self.replace', dt.replace(year=2013), 0), + ('self.replace', dt2.replace(year=2013), 1), + ('copy.replace', copy.replace(dt, year=2013), 0), + ('copy.replace', copy.replace(dt2, year=2013), 1), + ] + + for name, res, fold in test_cases: + with self.subTest(name, fold=fold): + self.assertIs(type(res), DateTimeSubclass) + self.assertEqual(res.year, 2013) + self.assertEqual(res.fold, fold) + def test_fromisoformat_datetime(self): # Test that isoformat() is reversible base_dates = [ @@ -3705,11 +3738,28 @@ def test_replace(self): def test_subclass_replace(self): class TimeSubclass(self.theclass): - pass + def __new__(cls, *args, **kwargs): + result = self.theclass.__new__(cls, *args, **kwargs) + result.extra = 7 + return result ctime = TimeSubclass(12, 30) - self.assertIs(type(ctime.replace(hour=10)), TimeSubclass) - 
self.assertIs(type(copy.replace(ctime, hour=10)), TimeSubclass) + ctime2 = TimeSubclass(12, 30, fold=1) + + test_cases = [ + ('self.replace', ctime.replace(hour=10), 0), + ('self.replace', ctime2.replace(hour=10), 1), + ('copy.replace', copy.replace(ctime, hour=10), 0), + ('copy.replace', copy.replace(ctime2, hour=10), 1), + ] + + for name, res, fold in test_cases: + with self.subTest(name, fold=fold): + self.assertIs(type(res), TimeSubclass) + self.assertEqual(res.hour, 10) + self.assertEqual(res.minute, 30) + self.assertEqual(res.extra, 7) + self.assertEqual(res.fold, fold) def test_subclass_time(self): @@ -5435,42 +5485,50 @@ def fromutc(self, dt): class Oddballs(unittest.TestCase): - def test_bug_1028306(self): + def test_date_datetime_comparison(self): + # bpo-1028306, bpo-5516 (gh-49766) # Trying to compare a date to a datetime should act like a mixed- # type comparison, despite that datetime is a subclass of date. as_date = date.today() as_datetime = datetime.combine(as_date, time()) - self.assertTrue(as_date != as_datetime) - self.assertTrue(as_datetime != as_date) - self.assertFalse(as_date == as_datetime) - self.assertFalse(as_datetime == as_date) - self.assertRaises(TypeError, lambda: as_date < as_datetime) - self.assertRaises(TypeError, lambda: as_datetime < as_date) - self.assertRaises(TypeError, lambda: as_date <= as_datetime) - self.assertRaises(TypeError, lambda: as_datetime <= as_date) - self.assertRaises(TypeError, lambda: as_date > as_datetime) - self.assertRaises(TypeError, lambda: as_datetime > as_date) - self.assertRaises(TypeError, lambda: as_date >= as_datetime) - self.assertRaises(TypeError, lambda: as_datetime >= as_date) - - # Nevertheless, comparison should work with the base-class (date) - # projection if use of a date method is forced. - self.assertEqual(as_date.__eq__(as_datetime), True) - different_day = (as_date.day + 1) % 20 + 1 - as_different = as_datetime.replace(day= different_day) - self.assertEqual(as_date.__eq__(as_different), False) + date_sc = SubclassDate(as_date.year, as_date.month, as_date.day) + datetime_sc = SubclassDatetime(as_date.year, as_date.month, + as_date.day, 0, 0, 0) + for d in (as_date, date_sc): + for dt in (as_datetime, datetime_sc): + for x, y in (d, dt), (dt, d): + self.assertTrue(x != y) + self.assertFalse(x == y) + self.assertRaises(TypeError, lambda: x < y) + self.assertRaises(TypeError, lambda: x <= y) + self.assertRaises(TypeError, lambda: x > y) + self.assertRaises(TypeError, lambda: x >= y) # And date should compare with other subclasses of date. If a # subclass wants to stop this, it's up to the subclass to do so. - date_sc = SubclassDate(as_date.year, as_date.month, as_date.day) - self.assertEqual(as_date, date_sc) - self.assertEqual(date_sc, as_date) - # Ditto for datetimes. - datetime_sc = SubclassDatetime(as_datetime.year, as_datetime.month, - as_date.day, 0, 0, 0) - self.assertEqual(as_datetime, datetime_sc) - self.assertEqual(datetime_sc, as_datetime) + for x, y in ((as_date, date_sc), + (date_sc, as_date), + (as_datetime, datetime_sc), + (datetime_sc, as_datetime)): + self.assertTrue(x == y) + self.assertFalse(x != y) + self.assertFalse(x < y) + self.assertFalse(x > y) + self.assertTrue(x <= y) + self.assertTrue(x >= y) + + # Nevertheless, comparison should work if other object is an instance + # of date or datetime class with overridden comparison operators. + # So special methods should return NotImplemented, as if + # date and datetime were independent classes. 
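Note on the rewritten Oddballs test above: together with the _pydatetime changes earlier in this patch, date and datetime now compare like unrelated types, because their special methods return NotImplemented for the mixed case. For example:

from datetime import date, datetime, time

d = date(2024, 1, 1)
dt = datetime.combine(d, time())

print(d == dt)           # False on both sides, since both __eq__ methods return NotImplemented
print(d.__eq__(dt))      # NotImplemented (previously compared just the year/month/day)
try:
    d < dt
except TypeError as exc:
    print(exc)           # ordering between date and datetime is rejected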
+ for x, y in (as_date, as_datetime), (as_datetime, as_date): + self.assertEqual(x.__eq__(y), NotImplemented) + self.assertEqual(x.__ne__(y), NotImplemented) + self.assertEqual(x.__lt__(y), NotImplemented) + self.assertEqual(x.__gt__(y), NotImplemented) + self.assertEqual(x.__gt__(y), NotImplemented) + self.assertEqual(x.__ge__(y), NotImplemented) def test_extra_attributes(self): with self.assertWarns(DeprecationWarning): diff --git a/Lib/test/libregrtest/refleak.py b/Lib/test/libregrtest/refleak.py index 7da16cf721f097..71a70af6882d16 100644 --- a/Lib/test/libregrtest/refleak.py +++ b/Lib/test/libregrtest/refleak.py @@ -201,8 +201,8 @@ def dash_R_cleanup(fs, ps, pic, zdc, abcs): # Clear caches clear_caches() - # Clear type cache at the end: previous function calls can modify types - sys._clear_type_cache() + # Clear other caches last (previous function calls can re-populate them): + sys._clear_internal_caches() def warm_caches(): diff --git a/Lib/test/pydocfodder.py b/Lib/test/pydocfodder.py index a3ef2231243954..27037e048db819 100644 --- a/Lib/test/pydocfodder.py +++ b/Lib/test/pydocfodder.py @@ -2,6 +2,12 @@ import types +def global_func(x, y): + """Module global function""" + +def global_func2(x, y): + """Module global function 2""" + class A: "A class." @@ -26,7 +32,7 @@ def A_classmethod(cls, x): "A class method defined in A." A_classmethod = classmethod(A_classmethod) - def A_staticmethod(): + def A_staticmethod(x, y): "A static method defined in A." A_staticmethod = staticmethod(A_staticmethod) @@ -61,6 +67,28 @@ def BD_method(self): def BCD_method(self): "Method defined in B, C and D." + @classmethod + def B_classmethod(cls, x): + "A class method defined in B." + + global_func = global_func # same name + global_func_alias = global_func + global_func2_alias = global_func2 + B_classmethod_alias = B_classmethod + A_classmethod_ref = A.A_classmethod + A_staticmethod = A.A_staticmethod # same name + A_staticmethod_alias = A.A_staticmethod + A_method_ref = A().A_method + A_method_alias = A.A_method + B_method_alias = B_method + __repr__ = object.__repr__ # same name + object_repr = object.__repr__ + get = {}.get # same name + dict_get = {}.get + +B.B_classmethod_ref = B.B_classmethod + + class C(A): "A class, derived from A." 
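Note on the pydocfodder aliases above: they feed the new "from <parent>" notes that pydoc derives from a function's __qualname__ and __module__ via the parentname() helper added earlier in this patch. A restatement of that lookup in isolation, for illustration only:

def parentname(object, modname):
    # Mirrors the helper added to pydoc above; not a public API.
    if '.' in object.__qualname__:
        name = object.__qualname__.rpartition('.')[0]
        if object.__module__ != modname:
            return object.__module__ + '.' + name
        return name
    if object.__module__ != modname:
        return object.__module__

class A:
    def A_method(self):
        "Method defined in A."

class B(A):
    A_method_alias = A.A_method      # alias stored under a different class

print(parentname(B.A_method_alias, __name__))   # 'A': the defining class, rendered as "from A"
print(parentname(B.A_method_alias, 'other'))    # '__main__.A' when run as a script and viewed from another module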
@@ -136,3 +164,21 @@ def __call__(self, inst): submodule = types.ModuleType(__name__ + '.submodule', """A submodule, which should appear in its parent's summary""") + +global_func_alias = global_func +A_classmethod = A.A_classmethod # same name +A_classmethod2 = A.A_classmethod +A_classmethod3 = B.A_classmethod +A_staticmethod = A.A_staticmethod # same name +A_staticmethod_alias = A.A_staticmethod +A_staticmethod_ref = A().A_staticmethod +A_staticmethod_ref2 = B().A_staticmethod +A_method = A().A_method # same name +A_method2 = A().A_method +A_method3 = B().A_method +B_method = B.B_method # same name +B_method2 = B.B_method +count = list.count # same name +list_count = list.count +get = {}.get # same name +dict_get = {}.get diff --git a/Lib/test/test_array.py b/Lib/test/test_array.py index a219fa365e7f20..95383be9659eb9 100755 --- a/Lib/test/test_array.py +++ b/Lib/test/test_array.py @@ -1014,6 +1014,29 @@ def test_pop(self): array.array(self.typecode, self.example[3:]+self.example[:-1]) ) + def test_clear(self): + a = array.array(self.typecode, self.example) + with self.assertRaises(TypeError): + a.clear(42) + a.clear() + self.assertEqual(len(a), 0) + self.assertEqual(a.typecode, self.typecode) + + a = array.array(self.typecode) + a.clear() + self.assertEqual(len(a), 0) + self.assertEqual(a.typecode, self.typecode) + + a = array.array(self.typecode, self.example) + a.clear() + a.append(self.example[2]) + a.append(self.example[3]) + self.assertEqual(a, array.array(self.typecode, self.example[2:4])) + + with memoryview(a): + with self.assertRaises(BufferError): + a.clear() + def test_reverse(self): a = array.array(self.typecode, self.example) self.assertRaises(TypeError, a.reverse, 42) diff --git a/Lib/test/test_builtin.py b/Lib/test/test_builtin.py index fcddd147bac63e..9a0bf524e3943f 100644 --- a/Lib/test/test_builtin.py +++ b/Lib/test/test_builtin.py @@ -308,14 +308,13 @@ class C3(C2): pass self.assertTrue(callable(c3)) def test_chr(self): + self.assertEqual(chr(0), '\0') self.assertEqual(chr(32), ' ') self.assertEqual(chr(65), 'A') self.assertEqual(chr(97), 'a') self.assertEqual(chr(0xff), '\xff') - self.assertRaises(ValueError, chr, 1<<24) - self.assertEqual(chr(sys.maxunicode), - str('\\U0010ffff'.encode("ascii"), 'unicode-escape')) self.assertRaises(TypeError, chr) + self.assertRaises(TypeError, chr, 65.0) self.assertEqual(chr(0x0000FFFF), "\U0000FFFF") self.assertEqual(chr(0x00010000), "\U00010000") self.assertEqual(chr(0x00010001), "\U00010001") @@ -327,7 +326,11 @@ def test_chr(self): self.assertEqual(chr(0x0010FFFF), "\U0010FFFF") self.assertRaises(ValueError, chr, -1) self.assertRaises(ValueError, chr, 0x00110000) - self.assertRaises((OverflowError, ValueError), chr, 2**32) + self.assertRaises(ValueError, chr, 1<<24) + self.assertRaises(ValueError, chr, 2**32-1) + self.assertRaises(ValueError, chr, -2**32) + self.assertRaises(ValueError, chr, 2**1000) + self.assertRaises(ValueError, chr, -2**1000) def test_cmp(self): self.assertTrue(not hasattr(builtins, "cmp")) @@ -611,6 +614,14 @@ def __dir__(self): self.assertIsInstance(res, list) self.assertTrue(res == ["a", "b", "c"]) + # dir(obj__dir__iterable) + class Foo(object): + def __dir__(self): + return {"b", "c", "a"} + res = dir(Foo()) + self.assertIsInstance(res, list) + self.assertEqual(sorted(res), ["a", "b", "c"]) + # dir(obj__dir__not_sequence) class Foo(object): def __dir__(self): diff --git a/Lib/test/test_call.py b/Lib/test/test_call.py index 3c8fc35e3c116d..2a6a5d287b04ee 100644 --- a/Lib/test/test_call.py +++ 
b/Lib/test/test_call.py @@ -155,7 +155,7 @@ def test_varargs16_kw(self): min, 0, default=1, key=2, foo=3) def test_varargs17_kw(self): - msg = r"'foo' is an invalid keyword argument for print\(\)$" + msg = r"print\(\) got an unexpected keyword argument 'foo'$" self.assertRaisesRegex(TypeError, msg, print, 0, sep=1, end=2, file=3, flush=4, foo=5) @@ -928,7 +928,7 @@ def check_suggestion_includes(self, message): self.assertIn(f"Did you mean '{message}'?", str(cm.exception)) @contextlib.contextmanager - def check_suggestion_not_pressent(self): + def check_suggestion_not_present(self): with self.assertRaises(TypeError) as cm: yield self.assertNotIn("Did you mean", str(cm.exception)) @@ -946,7 +946,7 @@ def foo(blech=None, /, aaa=None, *args, late1=None): for keyword, suggestion in cases: with self.subTest(keyword): - ctx = self.check_suggestion_includes(suggestion) if suggestion else self.check_suggestion_not_pressent() + ctx = self.check_suggestion_includes(suggestion) if suggestion else self.check_suggestion_not_present() with ctx: foo(**{keyword:None}) @@ -987,6 +987,32 @@ def case_change_over_substitution(BLuch=None, Luch = None, fluch = None): with self.check_suggestion_includes(suggestion): func(bluch=None) + def test_unexpected_keyword_suggestion_via_getargs(self): + with self.check_suggestion_includes("maxsplit"): + "foo".split(maxsplt=1) + + self.assertRaisesRegex( + TypeError, r"split\(\) got an unexpected keyword argument 'blech'$", + "foo".split, blech=1 + ) + with self.check_suggestion_not_present(): + "foo".split(blech=1) + with self.check_suggestion_not_present(): + "foo".split(more_noise=1, maxsplt=1) + + # Also test the vgetargskeywords path + with self.check_suggestion_includes("name"): + ImportError(namez="oops") + + self.assertRaisesRegex( + TypeError, r"ImportError\(\) got an unexpected keyword argument 'blech'$", + ImportError, blech=1 + ) + with self.check_suggestion_not_present(): + ImportError(blech=1) + with self.check_suggestion_not_present(): + ImportError(blech=1, namez="oops") + @cpython_only class TestRecursion(unittest.TestCase): diff --git a/Lib/test/test_capi/test_getargs.py b/Lib/test/test_capi/test_getargs.py index 9b6aef27625ad0..12039803ba543e 100644 --- a/Lib/test/test_capi/test_getargs.py +++ b/Lib/test/test_capi/test_getargs.py @@ -667,7 +667,7 @@ def test_invalid_keyword(self): try: getargs_keywords((1,2),3,arg5=10,arg666=666) except TypeError as err: - self.assertEqual(str(err), "'arg666' is an invalid keyword argument for this function") + self.assertEqual(str(err), "this function got an unexpected keyword argument 'arg666'") else: self.fail('TypeError should have been raised') @@ -675,7 +675,7 @@ def test_surrogate_keyword(self): try: getargs_keywords((1,2), 3, (4,(5,6)), (7,8,9), **{'\uDC80': 10}) except TypeError as err: - self.assertEqual(str(err), "'\udc80' is an invalid keyword argument for this function") + self.assertEqual(str(err), "this function got an unexpected keyword argument '\udc80'") else: self.fail('TypeError should have been raised') @@ -742,12 +742,12 @@ def test_too_many_args(self): def test_invalid_keyword(self): # extraneous keyword arg with self.assertRaisesRegex(TypeError, - "'monster' is an invalid keyword argument for this function"): + "this function got an unexpected keyword argument 'monster'"): getargs_keyword_only(1, 2, monster=666) def test_surrogate_keyword(self): with self.assertRaisesRegex(TypeError, - "'\udc80' is an invalid keyword argument for this function"): + "this function got an unexpected keyword argument 
'\udc80'"): getargs_keyword_only(1, 2, **{'\uDC80': 10}) def test_weird_str_subclass(self): @@ -761,7 +761,7 @@ def __hash__(self): "invalid keyword argument for this function"): getargs_keyword_only(1, 2, **{BadStr("keyword_only"): 3}) with self.assertRaisesRegex(TypeError, - "invalid keyword argument for this function"): + "this function got an unexpected keyword argument"): getargs_keyword_only(1, 2, **{BadStr("monster"): 666}) def test_weird_str_subclass2(self): @@ -774,7 +774,7 @@ def __hash__(self): "invalid keyword argument for this function"): getargs_keyword_only(1, 2, **{BadStr("keyword_only"): 3}) with self.assertRaisesRegex(TypeError, - "invalid keyword argument for this function"): + "this function got an unexpected keyword argument"): getargs_keyword_only(1, 2, **{BadStr("monster"): 666}) @@ -807,7 +807,7 @@ def test_required_args(self): def test_empty_keyword(self): with self.assertRaisesRegex(TypeError, - "'' is an invalid keyword argument for this function"): + "this function got an unexpected keyword argument ''"): self.getargs(1, 2, **{'': 666}) @@ -1204,7 +1204,7 @@ def test_basic(self): "function missing required argument 'a'"): parse((), {}, 'O', ['a']) with self.assertRaisesRegex(TypeError, - "'b' is an invalid keyword argument"): + "this function got an unexpected keyword argument 'b'"): parse((), {'b': 1}, '|O', ['a']) with self.assertRaisesRegex(TypeError, fr"argument for function given by name \('a'\) " @@ -1278,10 +1278,10 @@ def test_nonascii_keywords(self): fr"and position \(1\)"): parse((1,), {name: 2}, 'O|O', [name, 'b']) with self.assertRaisesRegex(TypeError, - f"'{name}' is an invalid keyword argument"): + f"this function got an unexpected keyword argument '{name}'"): parse((), {name: 1}, '|O', ['b']) with self.assertRaisesRegex(TypeError, - "'b' is an invalid keyword argument"): + "this function got an unexpected keyword argument 'b'"): parse((), {'b': 1}, '|O', [name]) invalid = name.encode() + (name.encode()[:-1] or b'\x80') @@ -1301,17 +1301,17 @@ def test_nonascii_keywords(self): for name2 in ('b', 'ë', 'ĉ', 'Ɐ', '𐀁'): with self.subTest(name2=name2): with self.assertRaisesRegex(TypeError, - f"'{name2}' is an invalid keyword argument"): + f"this function got an unexpected keyword argument '{name2}'"): parse((), {name2: 1}, '|O', [name]) name2 = name.encode().decode('latin1') if name2 != name: with self.assertRaisesRegex(TypeError, - f"'{name2}' is an invalid keyword argument"): + f"this function got an unexpected keyword argument '{name2}'"): parse((), {name2: 1}, '|O', [name]) name3 = name + '3' with self.assertRaisesRegex(TypeError, - f"'{name2}' is an invalid keyword argument"): + f"this function got an unexpected keyword argument '{name2}'"): parse((), {name2: 1, name3: 2}, '|OO', [name, name3]) def test_nested_tuple(self): diff --git a/Lib/test/test_capi/test_opt.py b/Lib/test/test_capi/test_opt.py index 5c8c0596610303..e6b1b554c9af10 100644 --- a/Lib/test/test_capi/test_opt.py +++ b/Lib/test/test_capi/test_opt.py @@ -1,5 +1,6 @@ import contextlib import opcode +import sys import textwrap import unittest @@ -181,6 +182,21 @@ def f(): _testinternalcapi.invalidate_executors(f.__code__) self.assertFalse(exe.is_valid()) + def test_sys__clear_internal_caches(self): + def f(): + for _ in range(1000): + pass + opt = _testinternalcapi.get_uop_optimizer() + with temporary_optimizer(opt): + f() + exe = get_first_executor(f) + self.assertIsNotNone(exe) + self.assertTrue(exe.is_valid()) + sys._clear_internal_caches() + self.assertFalse(exe.is_valid()) + exe = 
get_first_executor(f) + self.assertIsNone(exe) + class TestUops(unittest.TestCase): def test_basic_loop(self): diff --git a/Lib/test/test_capi/test_structmembers.py b/Lib/test/test_capi/test_structmembers.py index a294c3b13a5c30..08ca1f828529cf 100644 --- a/Lib/test/test_capi/test_structmembers.py +++ b/Lib/test/test_capi/test_structmembers.py @@ -81,36 +81,22 @@ def _test_int_range(self, name, minval, maxval, *, hardlimit=None, self._test_warn(name, maxval+1, minval) self._test_warn(name, hardmaxval) - if indexlimit is None: - indexlimit = hardlimit - if not indexlimit: + if indexlimit is False: self.assertRaises(TypeError, setattr, ts, name, Index(minval)) self.assertRaises(TypeError, setattr, ts, name, Index(maxval)) else: - hardminindexval, hardmaxindexval = indexlimit self._test_write(name, Index(minval), minval) - if minval < hardminindexval: - self._test_write(name, Index(hardminindexval), hardminindexval) - if maxval < hardmaxindexval: - self._test_write(name, Index(maxval), maxval) - else: - self._test_write(name, Index(hardmaxindexval), hardmaxindexval) - self._test_overflow(name, Index(hardminindexval-1)) - if name in ('T_UINT', 'T_ULONG'): - self.assertRaises(TypeError, setattr, self.ts, name, - Index(hardmaxindexval+1)) - self.assertRaises(TypeError, setattr, self.ts, name, - Index(2**1000)) - else: - self._test_overflow(name, Index(hardmaxindexval+1)) - self._test_overflow(name, Index(2**1000)) + self._test_write(name, Index(maxval), maxval) + self._test_overflow(name, Index(hardminval-1)) + self._test_overflow(name, Index(hardmaxval+1)) + self._test_overflow(name, Index(2**1000)) self._test_overflow(name, Index(-2**1000)) - if hardminindexval < minval and name != 'T_ULONGLONG': - self._test_warn(name, Index(hardminindexval)) - self._test_warn(name, Index(minval-1)) - if maxval < hardmaxindexval: - self._test_warn(name, Index(maxval+1)) - self._test_warn(name, Index(hardmaxindexval)) + if hardminval < minval: + self._test_warn(name, Index(hardminval)) + self._test_warn(name, Index(minval-1), maxval) + if maxval < hardmaxval: + self._test_warn(name, Index(maxval+1), minval) + self._test_warn(name, Index(hardmaxval)) def test_bool(self): ts = self.ts @@ -138,14 +124,12 @@ def test_int(self): self._test_int_range('T_INT', INT_MIN, INT_MAX, hardlimit=(LONG_MIN, LONG_MAX)) self._test_int_range('T_UINT', 0, UINT_MAX, - hardlimit=(LONG_MIN, ULONG_MAX), - indexlimit=(LONG_MIN, LONG_MAX)) + hardlimit=(LONG_MIN, ULONG_MAX)) def test_long(self): self._test_int_range('T_LONG', LONG_MIN, LONG_MAX) self._test_int_range('T_ULONG', 0, ULONG_MAX, - hardlimit=(LONG_MIN, ULONG_MAX), - indexlimit=(LONG_MIN, LONG_MAX)) + hardlimit=(LONG_MIN, ULONG_MAX)) def test_py_ssize_t(self): self._test_int_range('T_PYSSIZET', PY_SSIZE_T_MIN, PY_SSIZE_T_MAX, indexlimit=False) @@ -153,7 +137,7 @@ def test_py_ssize_t(self): def test_longlong(self): self._test_int_range('T_LONGLONG', LLONG_MIN, LLONG_MAX) self._test_int_range('T_ULONGLONG', 0, ULLONG_MAX, - indexlimit=(LONG_MIN, LONG_MAX)) + hardlimit=(LONG_MIN, ULLONG_MAX)) def test_bad_assignments(self): ts = self.ts diff --git a/Lib/test/test_code.py b/Lib/test/test_code.py index d8fb826edeb681..46bebfc7af675b 100644 --- a/Lib/test/test_code.py +++ b/Lib/test/test_code.py @@ -865,6 +865,7 @@ def __init__(self, f, test): self.test = test def run(self): del self.f + gc_collect() self.test.assertEqual(LAST_FREED, 500) SetExtra(f.__code__, FREE_INDEX, ctypes.c_voidp(500)) diff --git a/Lib/test/test_collections.py b/Lib/test/test_collections.py index 
7e6f811e17cfa2..1fb492ecebd668 100644 --- a/Lib/test/test_collections.py +++ b/Lib/test/test_collections.py @@ -1,5 +1,6 @@ """Unit tests for collections.py.""" +import array import collections import copy import doctest @@ -1972,6 +1973,7 @@ def test_MutableSequence(self): for sample in [list, bytearray, deque]: self.assertIsInstance(sample(), MutableSequence) self.assertTrue(issubclass(sample, MutableSequence)) + self.assertTrue(issubclass(array.array, MutableSequence)) self.assertFalse(issubclass(str, MutableSequence)) self.validate_abstract_methods(MutableSequence, '__contains__', '__iter__', '__len__', '__getitem__', '__setitem__', '__delitem__', 'insert') diff --git a/Lib/test/test_concurrent_futures/executor.py b/Lib/test/test_concurrent_futures/executor.py index 1e7d4344740943..6a79fe69ec37cf 100644 --- a/Lib/test/test_concurrent_futures/executor.py +++ b/Lib/test/test_concurrent_futures/executor.py @@ -1,8 +1,10 @@ import threading import time +import unittest import weakref from concurrent import futures from test import support +from test.support import Py_GIL_DISABLED def mul(x, y): @@ -83,10 +85,21 @@ def test_no_stale_references(self): my_object_collected = threading.Event() my_object_callback = weakref.ref( my_object, lambda obj: my_object_collected.set()) - # Deliberately discarding the future. - self.executor.submit(my_object.my_method) + fut = self.executor.submit(my_object.my_method) del my_object + if Py_GIL_DISABLED: + # Due to biased reference counting, my_object might only be + # deallocated while the thread that created it runs -- if the + # thread is paused waiting on an event, it may not merge the + # refcount of the queued object. For that reason, we wait for the + # task to finish (so that it's no longer referenced) and force a + # GC to ensure that it is collected. + fut.result() # Wait for the task to finish. + support.gc_collect() + else: + del fut # Deliberately discard the future. + collected = my_object_collected.wait(timeout=support.SHORT_TIMEOUT) self.assertTrue(collected, "Stale reference not collected within timeout.") diff --git a/Lib/test/test_concurrent_futures/test_process_pool.py b/Lib/test/test_concurrent_futures/test_process_pool.py index 3e61b0c9387c6f..7fc59a05f3deac 100644 --- a/Lib/test/test_concurrent_futures/test_process_pool.py +++ b/Lib/test/test_concurrent_futures/test_process_pool.py @@ -98,6 +98,7 @@ def test_ressources_gced_in_workers(self): # explicitly destroy the object to ensure that EventfulGCObj.__del__() # is called while manager is still running. + support.gc_collect() obj = None support.gc_collect() diff --git a/Lib/test/test_decimal.py b/Lib/test/test_decimal.py index 1423bc61c7f690..f23ea8af0c8772 100644 --- a/Lib/test/test_decimal.py +++ b/Lib/test/test_decimal.py @@ -1110,6 +1110,13 @@ def test_formatting(self): ('z>z6.1f', '-0.', 'zzz0.0'), ('x>z6.1f', '-0.', 'xxx0.0'), ('🖤>z6.1f', '-0.', '🖤🖤🖤0.0'), # multi-byte fill char + ('\x00>z6.1f', '-0.', '\x00\x00\x000.0'), # null fill char + + # issue 114563 ('z' format on F type in cdecimal) + ('z3,.10F', '-6.24E-323', '0.0000000000'), + + # issue 91060 ('#' format in cdecimal) + ('#', '0', '0.'), # issue 6850 ('a=-7.0', '0.12345', 'aaaa0.1'), @@ -5726,6 +5733,21 @@ def test_c_signaldict_segfault(self): with self.assertRaisesRegex(ValueError, err_msg): sd.copy() + def test_format_fallback_capitals(self): + # Fallback to _pydecimal formatting (triggered by `#` format which + # is unsupported by mpdecimal) should honor the current context. 
+ x = C.Decimal('6.09e+23') + self.assertEqual(format(x, '#'), '6.09E+23') + with C.localcontext(capitals=0): + self.assertEqual(format(x, '#'), '6.09e+23') + + def test_format_fallback_rounding(self): + y = C.Decimal('6.09') + self.assertEqual(format(y, '#.1f'), '6.1') + with C.localcontext(rounding=C.ROUND_DOWN): + self.assertEqual(format(y, '#.1f'), '6.0') + + @requires_docstrings @requires_cdecimal class SignatureTest(unittest.TestCase): diff --git a/Lib/test/test_descr.py b/Lib/test/test_descr.py index beeab6cb7f254c..5404d8d3b99d5d 100644 --- a/Lib/test/test_descr.py +++ b/Lib/test/test_descr.py @@ -1594,7 +1594,11 @@ def f(cls, arg): cm = classmethod(f) cm_dict = {'__annotations__': {}, - '__doc__': "f docstring", + '__doc__': ( + "f docstring" + if support.HAVE_DOCSTRINGS + else None + ), '__module__': __name__, '__name__': 'f', '__qualname__': f.__qualname__} diff --git a/Lib/test/test_enum.py b/Lib/test/test_enum.py index 39c1ae0ad5a078..5d7dae8829574b 100644 --- a/Lib/test/test_enum.py +++ b/Lib/test/test_enum.py @@ -4851,22 +4851,22 @@ class Color(enum.Enum) | The value of the Enum member. | | ---------------------------------------------------------------------- - | Methods inherited from enum.EnumType: + | Static methods inherited from enum.EnumType: | - | __contains__(value) from enum.EnumType + | __contains__(value) | Return True if `value` is in `cls`. | | `value` is in `cls` if: | 1) `value` is a member of `cls`, or | 2) `value` is the value of one of the `cls`'s members. | - | __getitem__(name) from enum.EnumType + | __getitem__(name) | Return the member matching `name`. | - | __iter__() from enum.EnumType + | __iter__() | Return members in definition order. | - | __len__() from enum.EnumType + | __len__() | Return the number of members (no aliases) | | ---------------------------------------------------------------------- @@ -4891,11 +4891,11 @@ class Color(enum.Enum) | | Data and other attributes defined here: | - | YELLOW = + | CYAN = | | MAGENTA = | - | CYAN = + | YELLOW = | | ---------------------------------------------------------------------- | Data descriptors inherited from enum.Enum: @@ -4905,7 +4905,18 @@ class Color(enum.Enum) | value | | ---------------------------------------------------------------------- - | Data descriptors inherited from enum.EnumType: + | Methods inherited from enum.EnumType: + | + | __contains__(value) from enum.EnumType + | + | __getitem__(name) from enum.EnumType + | + | __iter__() from enum.EnumType + | + | __len__() from enum.EnumType + | + | ---------------------------------------------------------------------- + | Readonly properties inherited from enum.EnumType: | | __members__""" diff --git a/Lib/test/test_exceptions.py b/Lib/test/test_exceptions.py index c57488e44aecc6..c7e76414ff0715 100644 --- a/Lib/test/test_exceptions.py +++ b/Lib/test/test_exceptions.py @@ -1917,7 +1917,7 @@ def test_attributes(self): self.assertEqual(exc.name, 'somename') self.assertEqual(exc.path, 'somepath') - msg = "'invalid' is an invalid keyword argument for ImportError" + msg = r"ImportError\(\) got an unexpected keyword argument 'invalid'" with self.assertRaisesRegex(TypeError, msg): ImportError('test', invalid='keyword') diff --git a/Lib/test/test_fractions.py b/Lib/test/test_fractions.py index af3cb214ab0ac1..b45bd098a36684 100644 --- a/Lib/test/test_fractions.py +++ b/Lib/test/test_fractions.py @@ -1314,6 +1314,33 @@ def test_float_format_testfile(self): self.assertEqual(float(format(f, fmt2)), float(rhs)) self.assertEqual(float(format(-f, 
fmt2)), float('-' + rhs)) + def test_complex_handling(self): + # See issue gh-102840 for more details. + + a = F(1, 2) + b = 1j + message = "unsupported operand type(s) for %s: '%s' and '%s'" + # test forward + self.assertRaisesMessage(TypeError, + message % ("%", "Fraction", "complex"), + operator.mod, a, b) + self.assertRaisesMessage(TypeError, + message % ("//", "Fraction", "complex"), + operator.floordiv, a, b) + self.assertRaisesMessage(TypeError, + message % ("divmod()", "Fraction", "complex"), + divmod, a, b) + # test reverse + self.assertRaisesMessage(TypeError, + message % ("%", "complex", "Fraction"), + operator.mod, b, a) + self.assertRaisesMessage(TypeError, + message % ("//", "complex", "Fraction"), + operator.floordiv, b, a) + self.assertRaisesMessage(TypeError, + message % ("divmod()", "complex", "Fraction"), + divmod, b, a) + if __name__ == '__main__': unittest.main() diff --git a/Lib/test/test_glob.py b/Lib/test/test_glob.py index aa5fac8eca1354..8b2ea8f89f5daf 100644 --- a/Lib/test/test_glob.py +++ b/Lib/test/test_glob.py @@ -333,6 +333,17 @@ def test_recursive_glob(self): eq(glob.glob('**', recursive=True, include_hidden=True), [join(*i) for i in full+rec]) + def test_glob_non_directory(self): + eq = self.assertSequencesEqual_noorder + eq(self.rglob('EF'), self.joins(('EF',))) + eq(self.rglob('EF', ''), []) + eq(self.rglob('EF', '*'), []) + eq(self.rglob('EF', '**'), []) + eq(self.rglob('nonexistent'), []) + eq(self.rglob('nonexistent', ''), []) + eq(self.rglob('nonexistent', '*'), []) + eq(self.rglob('nonexistent', '**'), []) + def test_glob_many_open_files(self): depth = 30 base = os.path.join(self.tempdir, 'deep') diff --git a/Lib/test/test_io.py b/Lib/test/test_io.py index 73669ecc792776..cc387afa391909 100644 --- a/Lib/test/test_io.py +++ b/Lib/test/test_io.py @@ -2497,6 +2497,28 @@ def test_interleaved_read_write(self): f.flush() self.assertEqual(raw.getvalue(), b'a2c') + def test_read1_after_write(self): + with self.BytesIO(b'abcdef') as raw: + with self.tp(raw, 3) as f: + f.write(b"1") + self.assertEqual(f.read1(1), b'b') + f.flush() + self.assertEqual(raw.getvalue(), b'1bcdef') + with self.BytesIO(b'abcdef') as raw: + with self.tp(raw, 3) as f: + f.write(b"1") + self.assertEqual(f.read1(), b'bcd') + f.flush() + self.assertEqual(raw.getvalue(), b'1bcdef') + with self.BytesIO(b'abcdef') as raw: + with self.tp(raw, 3) as f: + f.write(b"1") + # XXX: read(100) returns different numbers of bytes + # in Python and C implementations. + self.assertEqual(f.read1(100)[:3], b'bcd') + f.flush() + self.assertEqual(raw.getvalue(), b'1bcdef') + def test_interleaved_readline_write(self): with self.BytesIO(b'ab\ncdef\ng\n') as raw: with self.tp(raw) as f: diff --git a/Lib/test/test_logging.py b/Lib/test/test_logging.py index 888523227c2ac4..cf09bad4c9187b 100644 --- a/Lib/test/test_logging.py +++ b/Lib/test/test_logging.py @@ -5478,6 +5478,7 @@ def test_critical(self): self.assertEqual(record.levelno, logging.CRITICAL) self.assertEqual(record.msg, msg) self.assertEqual(record.args, (self.recording,)) + self.assertEqual(record.funcName, 'test_critical') def test_is_enabled_for(self): old_disable = self.adapter.logger.manager.disable @@ -5496,15 +5497,9 @@ def test_has_handlers(self): self.assertFalse(self.adapter.hasHandlers()) def test_nested(self): - class Adapter(logging.LoggerAdapter): - prefix = 'Adapter' - - def process(self, msg, kwargs): - return f"{self.prefix} {msg}", kwargs - msg = 'Adapters can be nested, yo.' 
- adapter = Adapter(logger=self.logger, extra=None) - adapter_adapter = Adapter(logger=adapter, extra=None) + adapter = PrefixAdapter(logger=self.logger, extra=None) + adapter_adapter = PrefixAdapter(logger=adapter, extra=None) adapter_adapter.prefix = 'AdapterAdapter' self.assertEqual(repr(adapter), repr(adapter_adapter)) adapter_adapter.log(logging.CRITICAL, msg, self.recording) @@ -5513,6 +5508,7 @@ def process(self, msg, kwargs): self.assertEqual(record.levelno, logging.CRITICAL) self.assertEqual(record.msg, f"Adapter AdapterAdapter {msg}") self.assertEqual(record.args, (self.recording,)) + self.assertEqual(record.funcName, 'test_nested') orig_manager = adapter_adapter.manager self.assertIs(adapter.manager, orig_manager) self.assertIs(self.logger.manager, orig_manager) @@ -5528,6 +5524,61 @@ def process(self, msg, kwargs): self.assertIs(adapter.manager, orig_manager) self.assertIs(self.logger.manager, orig_manager) + def test_styled_adapter(self): + # Test an example from the Cookbook. + records = self.recording.records + adapter = StyleAdapter(self.logger) + adapter.warning('Hello, {}!', 'world') + self.assertEqual(str(records[-1].msg), 'Hello, world!') + self.assertEqual(records[-1].funcName, 'test_styled_adapter') + adapter.log(logging.WARNING, 'Goodbye {}.', 'world') + self.assertEqual(str(records[-1].msg), 'Goodbye world.') + self.assertEqual(records[-1].funcName, 'test_styled_adapter') + + def test_nested_styled_adapter(self): + records = self.recording.records + adapter = PrefixAdapter(self.logger) + adapter.prefix = '{}' + adapter2 = StyleAdapter(adapter) + adapter2.warning('Hello, {}!', 'world') + self.assertEqual(str(records[-1].msg), '{} Hello, world!') + self.assertEqual(records[-1].funcName, 'test_nested_styled_adapter') + adapter2.log(logging.WARNING, 'Goodbye {}.', 'world') + self.assertEqual(str(records[-1].msg), '{} Goodbye world.') + self.assertEqual(records[-1].funcName, 'test_nested_styled_adapter') + + def test_find_caller_with_stacklevel(self): + the_level = 1 + trigger = self.adapter.warning + + def innermost(): + trigger('test', stacklevel=the_level) + + def inner(): + innermost() + + def outer(): + inner() + + records = self.recording.records + outer() + self.assertEqual(records[-1].funcName, 'innermost') + lineno = records[-1].lineno + the_level += 1 + outer() + self.assertEqual(records[-1].funcName, 'inner') + self.assertGreater(records[-1].lineno, lineno) + lineno = records[-1].lineno + the_level += 1 + outer() + self.assertEqual(records[-1].funcName, 'outer') + self.assertGreater(records[-1].lineno, lineno) + lineno = records[-1].lineno + the_level += 1 + outer() + self.assertEqual(records[-1].funcName, 'test_find_caller_with_stacklevel') + self.assertGreater(records[-1].lineno, lineno) + def test_extra_in_records(self): self.adapter = logging.LoggerAdapter(logger=self.logger, extra={'foo': '1'}) @@ -5569,6 +5620,30 @@ def test_extra_merged_log_call_has_precedence(self): self.assertEqual(record.foo, '2') +class PrefixAdapter(logging.LoggerAdapter): + prefix = 'Adapter' + + def process(self, msg, kwargs): + return f"{self.prefix} {msg}", kwargs + + +class Message: + def __init__(self, fmt, args): + self.fmt = fmt + self.args = args + + def __str__(self): + return self.fmt.format(*self.args) + + +class StyleAdapter(logging.LoggerAdapter): + def log(self, level, msg, /, *args, stacklevel=1, **kwargs): + if self.isEnabledFor(level): + msg, kwargs = self.process(msg, kwargs) + self.logger.log(level, Message(msg, args), **kwargs, + stacklevel=stacklevel+1) + + 
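The PrefixAdapter, Message and StyleAdapter helpers added above mirror the LoggerAdapter recipe from the Logging Cookbook that this patch also corrects (see the gh-115233 NEWS entry further down). A minimal usage sketch reusing those helpers; the basicConfig() call and the logger name are illustrative assumptions, not part of the patch:

    import logging

    logging.basicConfig(level=logging.INFO, format='%(funcName)s: %(message)s')
    adapter = StyleAdapter(logging.getLogger('demo'))
    # Message defers str.format() until the record is actually rendered, and
    # stacklevel=stacklevel+1 in StyleAdapter.log() keeps funcName pointing at
    # the caller of warning()/log() rather than at the adapter machinery.
    adapter.warning('Hello, {}!', 'world')
    adapter.log(logging.WARNING, 'Goodbye {}.', 'world')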
class LoggerTest(BaseTest, AssertErrorMessage): def setUp(self): diff --git a/Lib/test/test_mailbox.py b/Lib/test/test_mailbox.py index c52c014185bec7..d4628f91daf7e8 100644 --- a/Lib/test/test_mailbox.py +++ b/Lib/test/test_mailbox.py @@ -10,6 +10,7 @@ import tempfile from test import support from test.support import os_helper +from test.support import refleak_helper from test.support import socket_helper import unittest import textwrap @@ -2443,6 +2444,9 @@ def test__all__(self): def tearDownModule(): support.reap_children() + # reap_children may have re-populated caches: + if refleak_helper.hunting_for_refleaks(): + sys._clear_internal_caches() if __name__ == '__main__': diff --git a/Lib/test/test_optimizer.py b/Lib/test/test_optimizer.py index b56bf3cfd9560e..dfea8be3c6956f 100644 --- a/Lib/test/test_optimizer.py +++ b/Lib/test/test_optimizer.py @@ -1,9 +1,15 @@ -import _testinternalcapi import unittest import types +from test.support import import_helper + + +_testinternalcapi = import_helper.import_module("_testinternalcapi") class TestRareEventCounters(unittest.TestCase): + def setUp(self): + _testinternalcapi.reset_rare_event_counters() + def test_set_class(self): class A: pass diff --git a/Lib/test/test_pathlib/test_pathlib.py b/Lib/test/test_pathlib/test_pathlib.py index 2b166451243775..c0dcf314da4bfc 100644 --- a/Lib/test/test_pathlib/test_pathlib.py +++ b/Lib/test/test_pathlib/test_pathlib.py @@ -1250,6 +1250,19 @@ def test_glob_pathlike(self): self.assertEqual(expect, set(p.glob(P(pattern)))) self.assertEqual(expect, set(p.glob(FakePath(pattern)))) + @needs_symlinks + def test_glob_dot(self): + P = self.cls + with os_helper.change_cwd(P(self.base, "dirC")): + self.assertEqual( + set(P('.').glob('*')), {P("fileC"), P("novel.txt"), P("dirD")}) + self.assertEqual( + set(P('.').glob('**')), {P("fileC"), P("novel.txt"), P("dirD"), P("dirD/fileD"), P(".")}) + self.assertEqual( + set(P('.').glob('**/*')), {P("fileC"), P("novel.txt"), P("dirD"), P("dirD/fileD")}) + self.assertEqual( + set(P('.').glob('**/*/*')), {P("dirD/fileD")}) + def test_rglob_pathlike(self): P = self.cls p = P(self.base, "dirC") diff --git a/Lib/test/test_property.py b/Lib/test/test_property.py index c12c908d2ee32d..8ace9fd17ab96e 100644 --- a/Lib/test/test_property.py +++ b/Lib/test/test_property.py @@ -224,6 +224,7 @@ class PropertySubSlots(property): class PropertySubclassTests(unittest.TestCase): + @support.requires_docstrings def test_slots_docstring_copy_exception(self): # A special case error that we preserve despite the GH-98963 behavior # that would otherwise silently ignore this error. diff --git a/Lib/test/test_pydoc.py b/Lib/test/test_pydoc.py index 99b19d01783a10..f3c26624c624f5 100644 --- a/Lib/test/test_pydoc.py +++ b/Lib/test/test_pydoc.py @@ -35,6 +35,7 @@ requires_docstrings, MISSING_C_DOCSTRINGS) from test.support.os_helper import (TESTFN, rmtree, unlink) from test import pydoc_mod +from test import pydocfodder class nonascii: @@ -102,7 +103,7 @@ class C(builtins.object) | ---------------------------------------------------------------------- | Class methods defined here: | - | __class_getitem__(item) from builtins.type + | __class_getitem__(item) | | ---------------------------------------------------------------------- | Data descriptors defined here: @@ -166,7 +167,7 @@ class A(builtins.object) Methods defined here: __init__() Wow, I have no function! 
- + ---------------------------------------------------------------------- Data descriptors defined here: __dict__ dictionary for instance variables @@ -179,6 +180,7 @@ class B(builtins.object) dictionary for instance variables __weakref__ list of weak references to the object + ---------------------------------------------------------------------- Data and other attributes defined here: NO_MEANING = 'eggs' __annotations__ = {'NO_MEANING': } @@ -191,8 +193,10 @@ class C(builtins.object) is_it_true(self) Return self.get_answer() say_no(self) + ---------------------------------------------------------------------- Class methods defined here: - __class_getitem__(item) from builtins.type + __class_getitem__(item) + ---------------------------------------------------------------------- Data descriptors defined here: __dict__ dictionary for instance variables @@ -330,6 +334,10 @@ def get_pydoc_html(module): loc = "
Module Docs" return output.strip(), loc +def clean_text(doc): + # clean up the extra text formatting that pydoc performs + return re.sub('\b.', '', doc) + def get_pydoc_link(module): "Returns a documentation web link of a module" abspath = os.path.abspath @@ -347,10 +355,7 @@ def get_pydoc_text(module): loc = "\nMODULE DOCS\n " + loc + "\n" output = doc.docmodule(module) - - # clean up the extra text formatting that pydoc performs - patt = re.compile('\b.') - output = patt.sub('', output) + output = clean_text(output) return output.strip(), loc def get_html_title(text): @@ -367,6 +372,7 @@ def html2text(html): Tailored for pydoc tests only. """ html = html.replace("
", "\n") + html = html.replace("
", "-"*70) html = re.sub("<.*?>", "", html) html = pydoc.replace(html, " ", " ", ">", ">", "<", "<") return html @@ -798,8 +804,7 @@ def itemconfigure(self, tagOrId, cnf=None, **kw): b_size = A.a_size doc = pydoc.render_doc(B) - # clean up the extra text formatting that pydoc performs - doc = re.sub('\b.', '', doc) + doc = clean_text(doc) self.assertEqual(doc, '''\ Python Library Documentation: class B in module %s @@ -887,8 +892,7 @@ def __init__(self, ... doc = pydoc.render_doc(A) - # clean up the extra text formatting that pydoc performs - doc = re.sub('\b.', '', doc) + doc = clean_text(doc) self.assertEqual(doc, '''Python Library Documentation: class A in module %s class A(builtins.object) @@ -925,8 +929,7 @@ def func( ... doc = pydoc.render_doc(func) - # clean up the extra text formatting that pydoc performs - doc = re.sub('\b.', '', doc) + doc = clean_text(doc) self.assertEqual(doc, '''Python Library Documentation: function func in module %s func( @@ -942,8 +945,7 @@ def function_with_really_long_name_so_annotations_can_be_rather_small( ... doc = pydoc.render_doc(function_with_really_long_name_so_annotations_can_be_rather_small) - # clean up the extra text formatting that pydoc performs - doc = re.sub('\b.', '', doc) + doc = clean_text(doc) self.assertEqual(doc, '''Python Library Documentation: function function_with_really_long_name_so_annotations_can_be_rather_small in module %s function_with_really_long_name_so_annotations_can_be_rather_small( @@ -957,8 +959,7 @@ def function_with_really_long_name_so_annotations_can_be_rather_small( second_very_long_parameter_name: ... doc = pydoc.render_doc(does_not_have_name) - # clean up the extra text formatting that pydoc performs - doc = re.sub('\b.', '', doc) + doc = clean_text(doc) self.assertEqual(doc, '''Python Library Documentation: function in module %s lambda very_long_parameter_name_that_should_not_fit_into_a_single_line, second_very_long_parameter_name @@ -1244,7 +1245,7 @@ def test_unbound_python_method(self): @requires_docstrings def test_unbound_builtin_method(self): self.assertEqual(self._get_summary_line(_pickle.Pickler.dump), - "dump(self, obj, /)") + "dump(self, obj, /) unbound _pickle.Pickler method") # these no longer include "self" def test_bound_python_method(self): @@ -1296,7 +1297,7 @@ def test_module_level_callable_o(self): def test_unbound_builtin_method_noargs(self): self.assertEqual(self._get_summary_line(str.lower), - "lower(self, /)") + "lower(self, /) unbound builtins.str method") def test_bound_builtin_method_noargs(self): self.assertEqual(self._get_summary_line(''.lower), @@ -1304,7 +1305,7 @@ def test_bound_builtin_method_noargs(self): def test_unbound_builtin_method_o(self): self.assertEqual(self._get_summary_line(set.add), - "add(self, object, /)") + "add(self, object, /) unbound builtins.set method") def test_bound_builtin_method_o(self): self.assertEqual(self._get_summary_line(set().add), @@ -1312,7 +1313,7 @@ def test_bound_builtin_method_o(self): def test_unbound_builtin_method_coexist_o(self): self.assertEqual(self._get_summary_line(set.__contains__), - "__contains__(self, object, /)") + "__contains__(self, object, /) unbound builtins.set method") def test_bound_builtin_method_coexist_o(self): self.assertEqual(self._get_summary_line(set().__contains__), @@ -1320,19 +1321,19 @@ def test_bound_builtin_method_coexist_o(self): def test_unbound_builtin_classmethod_noargs(self): self.assertEqual(self._get_summary_line(datetime.datetime.__dict__['utcnow']), - "utcnow(type, /)") + "utcnow(type, /) unbound 
datetime.datetime method") def test_bound_builtin_classmethod_noargs(self): self.assertEqual(self._get_summary_line(datetime.datetime.utcnow), - "utcnow() method of builtins.type instance") + "utcnow() class method of datetime.datetime") def test_unbound_builtin_classmethod_o(self): self.assertEqual(self._get_summary_line(dict.__dict__['__class_getitem__']), - "__class_getitem__(type, object, /)") + "__class_getitem__(type, object, /) unbound builtins.dict method") def test_bound_builtin_classmethod_o(self): self.assertEqual(self._get_summary_line(dict.__class_getitem__), - "__class_getitem__(object, /) method of builtins.type instance") + "__class_getitem__(object, /) class method of builtins.dict") @support.cpython_only @requires_docstrings @@ -1356,11 +1357,13 @@ def test_builtin_staticmethod_unrepresentable_default(self): @requires_docstrings def test_unbound_builtin_method_unrepresentable_default(self): self.assertEqual(self._get_summary_line(dict.pop), - "pop(self, key, default=, /)") + "pop(self, key, default=, /) " + "unbound builtins.dict method") import _testcapi cls = _testcapi.DocStringUnrepresentableSignatureTest self.assertEqual(self._get_summary_line(cls.meth), - "meth(self, /, a, b=)") + "meth(self, /, a, b=) unbound " + "_testcapi.DocStringUnrepresentableSignatureTest method") @support.cpython_only @requires_docstrings @@ -1381,7 +1384,8 @@ def test_unbound_builtin_classmethod_unrepresentable_default(self): cls = _testcapi.DocStringUnrepresentableSignatureTest descr = cls.__dict__['classmeth'] self.assertEqual(self._get_summary_line(descr), - "classmeth(type, /, a, b=)") + "classmeth(type, /, a, b=) unbound " + "_testcapi.DocStringUnrepresentableSignatureTest method") @support.cpython_only @requires_docstrings @@ -1389,7 +1393,8 @@ def test_bound_builtin_classmethod_unrepresentable_default(self): import _testcapi cls = _testcapi.DocStringUnrepresentableSignatureTest self.assertEqual(self._get_summary_line(cls.classmeth), - "classmeth(a, b=) method of builtins.type instance") + "classmeth(a, b=) class method of " + "_testcapi.DocStringUnrepresentableSignatureTest") def test_overridden_text_signature(self): class C: @@ -1423,7 +1428,7 @@ def smeth(*args, **kwargs): "meth" + bound + " method of test.test_pydoc.C instance") C.cmeth.__func__.__text_signature__ = text_signature self.assertEqual(self._get_summary_line(C.cmeth), - "cmeth" + bound + " method of builtins.type instance") + "cmeth" + bound + " class method of test.test_pydoc.C") C.smeth.__text_signature__ = text_signature self.assertEqual(self._get_summary_line(C.smeth), "smeth" + unbound) @@ -1460,13 +1465,13 @@ def cm(cls, x): 'cm(...)\n' ' A class method\n') self.assertEqual(self._get_summary_lines(X.cm), """\ -cm(x) method of builtins.type instance +cm(x) class method of test.test_pydoc.X A class method """) self.assertIn(""" | Class methods defined here: | - | cm(x) from builtins.type + | cm(x) | A class method """, pydoc.plain(pydoc.render_doc(X))) @@ -1623,6 +1628,128 @@ def a_fn_with_https_link(): ) +class PydocFodderTest(unittest.TestCase): + + def getsection(self, text, beginline, endline): + lines = text.splitlines() + beginindex, endindex = 0, None + if beginline is not None: + beginindex = lines.index(beginline) + if endline is not None: + endindex = lines.index(endline, beginindex) + return lines[beginindex:endindex] + + def test_text_doc_routines_in_class(self, cls=pydocfodder.B): + doc = pydoc.TextDoc() + result = doc.docclass(cls) + result = clean_text(result) + where = 'defined here' if cls is 
pydocfodder.B else 'inherited from B' + lines = self.getsection(result, f' | Methods {where}:', ' | ' + '-'*70) + self.assertIn(' | A_method_alias = A_method(self)', lines) + self.assertIn(' | B_method_alias = B_method(self)', lines) + self.assertIn(' | A_staticmethod(x, y) from test.pydocfodder.A', lines) + self.assertIn(' | A_staticmethod_alias = A_staticmethod(x, y)', lines) + self.assertIn(' | global_func(x, y) from test.pydocfodder', lines) + self.assertIn(' | global_func_alias = global_func(x, y)', lines) + self.assertIn(' | global_func2_alias = global_func2(x, y) from test.pydocfodder', lines) + self.assertIn(' | __repr__(self, /) from builtins.object', lines) + self.assertIn(' | object_repr = __repr__(self, /)', lines) + + lines = self.getsection(result, f' | Static methods {where}:', ' | ' + '-'*70) + self.assertIn(' | A_classmethod_ref = A_classmethod(x) class method of test.pydocfodder.A', lines) + note = '' if cls is pydocfodder.B else ' class method of test.pydocfodder.B' + self.assertIn(' | B_classmethod_ref = B_classmethod(x)' + note, lines) + self.assertIn(' | A_method_ref = A_method() method of test.pydocfodder.A instance', lines) + self.assertIn(' | get(key, default=None, /) method of builtins.dict instance', lines) + self.assertIn(' | dict_get = get(key, default=None, /) method of builtins.dict instance', lines) + + lines = self.getsection(result, f' | Class methods {where}:', ' | ' + '-'*70) + self.assertIn(' | B_classmethod(x)', lines) + self.assertIn(' | B_classmethod_alias = B_classmethod(x)', lines) + + def test_html_doc_routines_in_class(self, cls=pydocfodder.B): + doc = pydoc.HTMLDoc() + result = doc.docclass(cls) + result = html2text(result) + where = 'defined here' if cls is pydocfodder.B else 'inherited from B' + lines = self.getsection(result, f'Methods {where}:', '-'*70) + self.assertIn('A_method_alias = A_method(self)', lines) + self.assertIn('B_method_alias = B_method(self)', lines) + self.assertIn('A_staticmethod(x, y) from test.pydocfodder.A', lines) + self.assertIn('A_staticmethod_alias = A_staticmethod(x, y)', lines) + self.assertIn('global_func(x, y) from test.pydocfodder', lines) + self.assertIn('global_func_alias = global_func(x, y)', lines) + self.assertIn('global_func2_alias = global_func2(x, y) from test.pydocfodder', lines) + self.assertIn('__repr__(self, /) from builtins.object', lines) + self.assertIn('object_repr = __repr__(self, /)', lines) + + lines = self.getsection(result, f'Static methods {where}:', '-'*70) + self.assertIn('A_classmethod_ref = A_classmethod(x) class method of test.pydocfodder.A', lines) + note = '' if cls is pydocfodder.B else ' class method of test.pydocfodder.B' + self.assertIn('B_classmethod_ref = B_classmethod(x)' + note, lines) + self.assertIn('A_method_ref = A_method() method of test.pydocfodder.A instance', lines) + + lines = self.getsection(result, f'Class methods {where}:', '-'*70) + self.assertIn('B_classmethod(x)', lines) + self.assertIn('B_classmethod_alias = B_classmethod(x)', lines) + + def test_text_doc_inherited_routines_in_class(self): + self.test_text_doc_routines_in_class(pydocfodder.D) + + def test_html_doc_inherited_routines_in_class(self): + self.test_html_doc_routines_in_class(pydocfodder.D) + + def test_text_doc_routines_in_module(self): + doc = pydoc.TextDoc() + result = doc.docmodule(pydocfodder) + result = clean_text(result) + lines = self.getsection(result, 'FUNCTIONS', 'FILE') + # function alias + self.assertIn(' global_func_alias = global_func(x, y)', lines) + self.assertIn(' 
A_staticmethod(x, y)', lines) + self.assertIn(' A_staticmethod_alias = A_staticmethod(x, y)', lines) + # bound class methods + self.assertIn(' A_classmethod(x) class method of A', lines) + self.assertIn(' A_classmethod2 = A_classmethod(x) class method of A', lines) + self.assertIn(' A_classmethod3 = A_classmethod(x) class method of B', lines) + # bound methods + self.assertIn(' A_method() method of A instance', lines) + self.assertIn(' A_method2 = A_method() method of A instance', lines) + self.assertIn(' A_method3 = A_method() method of B instance', lines) + self.assertIn(' A_staticmethod_ref = A_staticmethod(x, y)', lines) + self.assertIn(' A_staticmethod_ref2 = A_staticmethod(y) method of B instance', lines) + self.assertIn(' get(key, default=None, /) method of builtins.dict instance', lines) + self.assertIn(' dict_get = get(key, default=None, /) method of builtins.dict instance', lines) + # unbound methods + self.assertIn(' B_method(self)', lines) + self.assertIn(' B_method2 = B_method(self)', lines) + + def test_html_doc_routines_in_module(self): + doc = pydoc.HTMLDoc() + result = doc.docmodule(pydocfodder) + result = html2text(result) + lines = self.getsection(result, ' Functions', None) + # function alias + self.assertIn(' global_func_alias = global_func(x, y)', lines) + self.assertIn(' A_staticmethod(x, y)', lines) + self.assertIn(' A_staticmethod_alias = A_staticmethod(x, y)', lines) + # bound class methods + self.assertIn('A_classmethod(x) class method of A', lines) + self.assertIn(' A_classmethod2 = A_classmethod(x) class method of A', lines) + self.assertIn(' A_classmethod3 = A_classmethod(x) class method of B', lines) + # bound methods + self.assertIn(' A_method() method of A instance', lines) + self.assertIn(' A_method2 = A_method() method of A instance', lines) + self.assertIn(' A_method3 = A_method() method of B instance', lines) + self.assertIn(' A_staticmethod_ref = A_staticmethod(x, y)', lines) + self.assertIn(' A_staticmethod_ref2 = A_staticmethod(y) method of B instance', lines) + self.assertIn(' get(key, default=None, /) method of builtins.dict instance', lines) + self.assertIn(' dict_get = get(key, default=None, /) method of builtins.dict instance', lines) + # unbound methods + self.assertIn(' B_method(self)', lines) + self.assertIn(' B_method2 = B_method(self)', lines) + + @unittest.skipIf( is_emscripten or is_wasi, "Socket server not available on Emscripten/WASI." diff --git a/Lib/test/test_queue.py b/Lib/test/test_queue.py index 33113a72e6b6a9..d308a212999429 100644 --- a/Lib/test/test_queue.py +++ b/Lib/test/test_queue.py @@ -2,6 +2,7 @@ # to ensure the Queue locks remain stable. 
import itertools import random +import sys import threading import time import unittest @@ -241,6 +242,386 @@ def test_shrinking_queue(self): with self.assertRaises(self.queue.Full): q.put_nowait(4) + def test_shutdown_empty(self): + q = self.type2test() + q.shutdown() + with self.assertRaises(self.queue.ShutDown): + q.put("data") + with self.assertRaises(self.queue.ShutDown): + q.get() + + def test_shutdown_nonempty(self): + q = self.type2test() + q.put("data") + q.shutdown() + q.get() + with self.assertRaises(self.queue.ShutDown): + q.get() + + def test_shutdown_immediate(self): + q = self.type2test() + q.put("data") + q.shutdown(immediate=True) + with self.assertRaises(self.queue.ShutDown): + q.get() + + def test_shutdown_allowed_transitions(self): + # allowed transitions would be from alive via shutdown to immediate + q = self.type2test() + self.assertFalse(q.is_shutdown) + + q.shutdown() + self.assertTrue(q.is_shutdown) + + q.shutdown(immediate=True) + self.assertTrue(q.is_shutdown) + + q.shutdown(immediate=False) + + def _shutdown_all_methods_in_one_thread(self, immediate): + q = self.type2test(2) + q.put("L") + q.put_nowait("O") + q.shutdown(immediate) + + with self.assertRaises(self.queue.ShutDown): + q.put("E") + with self.assertRaises(self.queue.ShutDown): + q.put_nowait("W") + if immediate: + with self.assertRaises(self.queue.ShutDown): + q.get() + with self.assertRaises(self.queue.ShutDown): + q.get_nowait() + with self.assertRaises(ValueError): + q.task_done() + q.join() + else: + self.assertIn(q.get(), "LO") + q.task_done() + self.assertIn(q.get(), "LO") + q.task_done() + q.join() + # on shutdown(immediate=False) + # when queue is empty, should raise ShutDown Exception + with self.assertRaises(self.queue.ShutDown): + q.get() # p.get(True) + with self.assertRaises(self.queue.ShutDown): + q.get_nowait() # p.get(False) + with self.assertRaises(self.queue.ShutDown): + q.get(True, 1.0) + + def test_shutdown_all_methods_in_one_thread(self): + return self._shutdown_all_methods_in_one_thread(False) + + def test_shutdown_immediate_all_methods_in_one_thread(self): + return self._shutdown_all_methods_in_one_thread(True) + + def _write_msg_thread(self, q, n, results, delay, + i_when_exec_shutdown, + event_start, event_end): + event_start.wait() + for i in range(1, n+1): + try: + q.put((i, "YDLO")) + results.append(True) + except self.queue.ShutDown: + results.append(False) + # triggers shutdown of queue + if i == i_when_exec_shutdown: + event_end.set() + time.sleep(delay) + # end of all puts + q.join() + + def _read_msg_thread(self, q, nb, results, delay, event_start): + event_start.wait() + block = True + while nb: + time.sleep(delay) + try: + # Get at least one message + q.get(block) + block = False + q.task_done() + results.append(True) + nb -= 1 + except self.queue.ShutDown: + results.append(False) + nb -= 1 + except self.queue.Empty: + pass + q.join() + + def _shutdown_thread(self, q, event_end, immediate): + event_end.wait() + q.shutdown(immediate) + q.join() + + def _join_thread(self, q, delay, event_start): + event_start.wait() + time.sleep(delay) + q.join() + + def _shutdown_all_methods_in_many_threads(self, immediate): + q = self.type2test() + ps = [] + ev_start = threading.Event() + ev_exec_shutdown = threading.Event() + res_puts = [] + res_gets = [] + delay = 1e-4 + read_process = 4 + nb_msgs = read_process * 16 + nb_msgs_r = nb_msgs // read_process + when_exec_shutdown = nb_msgs // 2 + lprocs = ( + (self._write_msg_thread, 1, (q, nb_msgs, res_puts, delay, + 
when_exec_shutdown, + ev_start, ev_exec_shutdown)), + (self._read_msg_thread, read_process, (q, nb_msgs_r, + res_gets, delay*2, + ev_start)), + (self._join_thread, 2, (q, delay*2, ev_start)), + (self._shutdown_thread, 1, (q, ev_exec_shutdown, immediate)), + ) + # start all threds + for func, n, args in lprocs: + for i in range(n): + ps.append(threading.Thread(target=func, args=args)) + ps[-1].start() + # set event in order to run q.shutdown() + ev_start.set() + + if not immediate: + assert(len(res_gets) == len(res_puts)) + assert(res_gets.count(True) == res_puts.count(True)) + else: + assert(len(res_gets) <= len(res_puts)) + assert(res_gets.count(True) <= res_puts.count(True)) + + for thread in ps[1:]: + thread.join() + + @unittest.skipIf(sys.platform == "win32", "test times out (gh-115258)") + def test_shutdown_all_methods_in_many_threads(self): + return self._shutdown_all_methods_in_many_threads(False) + + @unittest.skipIf(sys.platform == "win32", "test times out (gh-115258)") + def test_shutdown_immediate_all_methods_in_many_threads(self): + return self._shutdown_all_methods_in_many_threads(True) + + def _get(self, q, go, results, shutdown=False): + go.wait() + try: + msg = q.get() + results.append(not shutdown) + return not shutdown + except self.queue.ShutDown: + results.append(shutdown) + return shutdown + + def _get_shutdown(self, q, go, results): + return self._get(q, go, results, True) + + def _get_task_done(self, q, go, results): + go.wait() + try: + msg = q.get() + q.task_done() + results.append(True) + return msg + except self.queue.ShutDown: + results.append(False) + return False + + def _put(self, q, msg, go, results, shutdown=False): + go.wait() + try: + q.put(msg) + results.append(not shutdown) + return not shutdown + except self.queue.ShutDown: + results.append(shutdown) + return shutdown + + def _put_shutdown(self, q, msg, go, results): + return self._put(q, msg, go, results, True) + + def _join(self, q, results, shutdown=False): + try: + q.join() + results.append(not shutdown) + return not shutdown + except self.queue.ShutDown: + results.append(shutdown) + return shutdown + + def _join_shutdown(self, q, results): + return self._join(q, results, True) + + def _shutdown_get(self, immediate): + q = self.type2test(2) + results = [] + go = threading.Event() + q.put("Y") + q.put("D") + # queue full + + if immediate: + thrds = ( + (self._get_shutdown, (q, go, results)), + (self._get_shutdown, (q, go, results)), + ) + else: + thrds = ( + # on shutdown(immediate=False) + # one of these threads shoud raise Shutdown + (self._get, (q, go, results)), + (self._get, (q, go, results)), + (self._get, (q, go, results)), + ) + threads = [] + for func, params in thrds: + threads.append(threading.Thread(target=func, args=params)) + threads[-1].start() + q.shutdown(immediate) + go.set() + for t in threads: + t.join() + if immediate: + self.assertListEqual(results, [True, True]) + else: + self.assertListEqual(sorted(results), [False] + [True]*(len(thrds)-1)) + + def test_shutdown_get(self): + return self._shutdown_get(False) + + def test_shutdown_immediate_get(self): + return self._shutdown_get(True) + + def _shutdown_put(self, immediate): + q = self.type2test(2) + results = [] + go = threading.Event() + q.put("Y") + q.put("D") + # queue fulled + + thrds = ( + (self._put_shutdown, (q, "E", go, results)), + (self._put_shutdown, (q, "W", go, results)), + ) + threads = [] + for func, params in thrds: + threads.append(threading.Thread(target=func, args=params)) + threads[-1].start() + q.shutdown() 
+ go.set() + for t in threads: + t.join() + + self.assertEqual(results, [True]*len(thrds)) + + def test_shutdown_put(self): + return self._shutdown_put(False) + + def test_shutdown_immediate_put(self): + return self._shutdown_put(True) + + def _shutdown_join(self, immediate): + q = self.type2test() + results = [] + q.put("Y") + go = threading.Event() + nb = q.qsize() + + thrds = ( + (self._join, (q, results)), + (self._join, (q, results)), + ) + threads = [] + for func, params in thrds: + threads.append(threading.Thread(target=func, args=params)) + threads[-1].start() + if not immediate: + res = [] + for i in range(nb): + threads.append(threading.Thread(target=self._get_task_done, args=(q, go, res))) + threads[-1].start() + q.shutdown(immediate) + go.set() + for t in threads: + t.join() + + self.assertEqual(results, [True]*len(thrds)) + + def test_shutdown_immediate_join(self): + return self._shutdown_join(True) + + def test_shutdown_join(self): + return self._shutdown_join(False) + + def _shutdown_put_join(self, immediate): + q = self.type2test(2) + results = [] + go = threading.Event() + q.put("Y") + nb = q.qsize() + # queue not fulled + + thrds = ( + (self._put_shutdown, (q, "E", go, results)), + (self._join, (q, results)), + ) + threads = [] + for func, params in thrds: + threads.append(threading.Thread(target=func, args=params)) + threads[-1].start() + self.assertEqual(q.unfinished_tasks, nb) + for i in range(nb): + t = threading.Thread(target=q.task_done) + t.start() + threads.append(t) + q.shutdown(immediate) + go.set() + for t in threads: + t.join() + + self.assertEqual(results, [True]*len(thrds)) + + def test_shutdown_immediate_put_join(self): + return self._shutdown_put_join(True) + + def test_shutdown_put_join(self): + return self._shutdown_put_join(False) + + def test_shutdown_get_task_done_join(self): + q = self.type2test(2) + results = [] + go = threading.Event() + q.put("Y") + q.put("D") + self.assertEqual(q.unfinished_tasks, q.qsize()) + + thrds = ( + (self._get_task_done, (q, go, results)), + (self._get_task_done, (q, go, results)), + (self._join, (q, results)), + (self._join, (q, results)), + ) + threads = [] + for func, params in thrds: + threads.append(threading.Thread(target=func, args=params)) + threads[-1].start() + go.set() + q.shutdown(False) + for t in threads: + t.join() + + self.assertEqual(results, [True]*len(thrds)) + + class QueueTest(BaseQueueTestMixin): def setUp(self): diff --git a/Lib/test/test_tkinter/test_text.py b/Lib/test/test_tkinter/test_text.py index f809c4510e3a1f..b26956930d3402 100644 --- a/Lib/test/test_tkinter/test_text.py +++ b/Lib/test/test_tkinter/test_text.py @@ -52,27 +52,47 @@ def test_count(self): options = ('chars', 'indices', 'lines', 'displaychars', 'displayindices', 'displaylines', 'xpixels', 'ypixels') + self.assertEqual(len(text.count('1.0', 'end', *options, return_ints=True)), 8) self.assertEqual(len(text.count('1.0', 'end', *options)), 8) - self.assertEqual(text.count('1.0', 'end', 'chars', 'lines'), (124, 4)) + self.assertEqual(text.count('1.0', 'end', 'chars', 'lines', return_ints=True), + (124, 4)) self.assertEqual(text.count('1.3', '4.5', 'chars', 'lines'), (92, 3)) + self.assertEqual(text.count('4.5', '1.3', 'chars', 'lines', return_ints=True), + (-92, -3)) self.assertEqual(text.count('4.5', '1.3', 'chars', 'lines'), (-92, -3)) + self.assertEqual(text.count('1.3', '1.3', 'chars', 'lines', return_ints=True), + (0, 0)) self.assertEqual(text.count('1.3', '1.3', 'chars', 'lines'), (0, 0)) - self.assertEqual(text.count('1.0', 
'end', 'lines'), 4) - self.assertEqual(text.count('end', '1.0', 'lines'), -4) - self.assertEqual(text.count('1.3', '1.5', 'lines'), 0) - self.assertEqual(text.count('1.3', '1.3', 'lines'), 0) - self.assertEqual(text.count('1.0', 'end'), 124) # 'indices' by default - self.assertEqual(text.count('1.0', 'end', 'indices'), 124) + self.assertEqual(text.count('1.0', 'end', 'lines', return_ints=True), 4) + self.assertEqual(text.count('1.0', 'end', 'lines'), (4,)) + self.assertEqual(text.count('end', '1.0', 'lines', return_ints=True), -4) + self.assertEqual(text.count('end', '1.0', 'lines'), (-4,)) + self.assertEqual(text.count('1.3', '1.5', 'lines', return_ints=True), 0) + self.assertEqual(text.count('1.3', '1.5', 'lines'), None) + self.assertEqual(text.count('1.3', '1.3', 'lines', return_ints=True), 0) + self.assertEqual(text.count('1.3', '1.3', 'lines'), None) + # Count 'indices' by default. + self.assertEqual(text.count('1.0', 'end', return_ints=True), 124) + self.assertEqual(text.count('1.0', 'end'), (124,)) + self.assertEqual(text.count('1.0', 'end', 'indices', return_ints=True), 124) + self.assertEqual(text.count('1.0', 'end', 'indices'), (124,)) self.assertRaises(tkinter.TclError, text.count, '1.0', 'end', 'spam') self.assertRaises(tkinter.TclError, text.count, '1.0', 'end', '-lines') - self.assertIsInstance(text.count('1.3', '1.5', 'ypixels'), int) + self.assertIsInstance(text.count('1.3', '1.5', 'ypixels', return_ints=True), int) + self.assertIsInstance(text.count('1.3', '1.5', 'ypixels'), tuple) + self.assertIsInstance(text.count('1.3', '1.5', 'update', 'ypixels', return_ints=True), int) self.assertIsInstance(text.count('1.3', '1.5', 'update', 'ypixels'), int) - self.assertEqual(text.count('1.3', '1.3', 'update', 'ypixels'), 0) + self.assertEqual(text.count('1.3', '1.3', 'update', 'ypixels', return_ints=True), 0) + self.assertEqual(text.count('1.3', '1.3', 'update', 'ypixels'), None) + self.assertEqual(text.count('1.3', '1.5', 'update', 'indices', return_ints=True), 2) self.assertEqual(text.count('1.3', '1.5', 'update', 'indices'), 2) - self.assertEqual(text.count('1.3', '1.3', 'update', 'indices'), 0) - self.assertEqual(text.count('1.3', '1.5', 'update'), 2) - self.assertEqual(text.count('1.3', '1.3', 'update'), 0) + self.assertEqual(text.count('1.3', '1.3', 'update', 'indices', return_ints=True), 0) + self.assertEqual(text.count('1.3', '1.3', 'update', 'indices'), None) + self.assertEqual(text.count('1.3', '1.5', 'update', return_ints=True), 2) + self.assertEqual(text.count('1.3', '1.5', 'update'), (2,)) + self.assertEqual(text.count('1.3', '1.3', 'update', return_ints=True), 0) + self.assertEqual(text.count('1.3', '1.3', 'update'), None) if __name__ == "__main__": diff --git a/Lib/test/test_traceback.py b/Lib/test/test_traceback.py index 372fc48bf81a6a..dd9b1850adf086 100644 --- a/Lib/test/test_traceback.py +++ b/Lib/test/test_traceback.py @@ -3124,10 +3124,13 @@ def test_smoke_user_exception(self): class MyException(Exception): pass - self.do_test_smoke( - MyException('bad things happened'), - ('test.test_traceback.TestTracebackException.' - 'test_smoke_user_exception..MyException')) + if __name__ == '__main__': + expected = ('TestTracebackException.' + 'test_smoke_user_exception..MyException') + else: + expected = ('test.test_traceback.TestTracebackException.' + 'test_smoke_user_exception..MyException') + self.do_test_smoke(MyException('bad things happened'), expected) def test_from_exception(self): # Check all the parameters are accepted. 
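The tkinter Text.count assertions above capture the behaviour described by the gh-97928 NEWS entry further down: by default count() keeps its legacy return shapes, while the new return_ints flag opts into plain integers. A rough sketch of the difference; the widget setup is illustrative (it needs a display), and only the return shapes, not specific counts, are taken from the tests:

    import tkinter

    text = tkinter.Text(tkinter.Tk())
    text.insert('1.0', 'first line\nsecond line\n')

    text.count('1.0', 'end', 'lines')                     # legacy form: a 1-tuple
    text.count('1.0', 'end', 'lines', return_ints=True)   # plain int
    text.count('1.0', '1.0', 'lines')                     # zero count -> None
    text.count('1.0', '1.0', 'lines', return_ints=True)   # zero count -> 0
    text.count('1.0', 'end', 'chars', 'lines')            # two or more options -> tuple in both modes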
diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index b684af4f33ed71..176623171c9888 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -4323,6 +4323,16 @@ class C(B[int]): c.bar = 'abc' self.assertEqual(c.__dict__, {'bar': 'abc'}) + def test_setattr_exceptions(self): + class Immutable[T]: + def __setattr__(self, key, value): + raise RuntimeError("immutable") + + # gh-115165: This used to cause RuntimeError to be raised + # when we tried to set `__orig_class__` on the `Immutable` instance + # returned by the `Immutable[int]()` call + self.assertIsInstance(Immutable[int](), Immutable) + def test_subscripted_generics_as_proxies(self): T = TypeVar('T') class C(Generic[T]): @@ -4920,6 +4930,75 @@ class B(Generic[S]): ... class C(List[int], B): ... self.assertEqual(C.__mro__, (C, list, B, Generic, object)) + def test_multiple_inheritance_non_type_with___mro_entries__(self): + class GoodEntries: + def __mro_entries__(self, bases): + return (object,) + + class A(List[int], GoodEntries()): ... + + self.assertEqual(A.__mro__, (A, list, Generic, object)) + + def test_multiple_inheritance_non_type_without___mro_entries__(self): + # Error should be from the type machinery, not from typing.py + with self.assertRaisesRegex(TypeError, r"^bases must be types"): + class A(List[int], object()): ... + + def test_multiple_inheritance_non_type_bad___mro_entries__(self): + class BadEntries: + def __mro_entries__(self, bases): + return None + + # Error should be from the type machinery, not from typing.py + with self.assertRaisesRegex( + TypeError, + r"^__mro_entries__ must return a tuple", + ): + class A(List[int], BadEntries()): ... + + def test_multiple_inheritance___mro_entries___returns_non_type(self): + class BadEntries: + def __mro_entries__(self, bases): + return (object(),) + + # Error should be from the type machinery, not from typing.py + with self.assertRaisesRegex( + TypeError, + r"^bases must be types", + ): + class A(List[int], BadEntries()): ... + + def test_multiple_inheritance_with_genericalias(self): + class A(typing.Sized, list[int]): ... + + self.assertEqual( + A.__mro__, + (A, collections.abc.Sized, Generic, list, object), + ) + + def test_multiple_inheritance_with_genericalias_2(self): + T = TypeVar("T") + + class BaseSeq(typing.Sequence[T]): ... + class MySeq(List[T], BaseSeq[T]): ... 
+ + self.assertEqual( + MySeq.__mro__, + ( + MySeq, + list, + BaseSeq, + collections.abc.Sequence, + collections.abc.Reversible, + collections.abc.Collection, + collections.abc.Sized, + collections.abc.Iterable, + collections.abc.Container, + Generic, + object, + ), + ) + def test_init_subclass_super_called(self): class FinalException(Exception): pass @@ -6077,8 +6156,6 @@ def test_overload_registry_repeated(self): self.assertEqual(list(get_overloads(impl)), overloads) -# Definitions needed for features introduced in Python 3.6 - from test.typinganndata import ( ann_module, ann_module2, ann_module3, ann_module5, ann_module6, ) @@ -8492,6 +8569,17 @@ def test_instantiate_generic(self): self.assertEqual(MyCount([4, 4, 5]), {4: 2, 5: 1}) self.assertEqual(MyCount[int]([4, 4, 5]), {4: 2, 5: 1}) + def test_instantiate_immutable(self): + class C: + def __setattr__(self, key, value): + raise Exception("should be ignored") + + A = Annotated[C, "a decoration"] + # gh-115165: This used to cause RuntimeError to be raised + # when we tried to set `__orig_class__` on the `C` instance + # returned by the `A()` call + self.assertIsInstance(A(), C) + def test_cannot_instantiate_forward(self): A = Annotated["int", (5, 6)] with self.assertRaises(TypeError): diff --git a/Lib/test/test_unittest/testmock/testpatch.py b/Lib/test/test_unittest/testmock/testpatch.py index 833d7da1f31a20..d0046d702a53f4 100644 --- a/Lib/test/test_unittest/testmock/testpatch.py +++ b/Lib/test/test_unittest/testmock/testpatch.py @@ -1912,7 +1912,7 @@ def foo(x=0): with patch.object(foo, '__module__', "testpatch2"): self.assertEqual(foo.__module__, "testpatch2") - self.assertEqual(foo.__module__, 'test.test_unittest.testmock.testpatch') + self.assertEqual(foo.__module__, __name__) with patch.object(foo, '__annotations__', dict([('s', 1, )])): self.assertEqual(foo.__annotations__, dict([('s', 1, )])) diff --git a/Lib/test/test_xml_etree.py b/Lib/test/test_xml_etree.py index a435ec7822ea0c..c535d631bb646f 100644 --- a/Lib/test/test_xml_etree.py +++ b/Lib/test/test_xml_etree.py @@ -13,6 +13,7 @@ import operator import os import pickle +import pyexpat import sys import textwrap import types @@ -120,6 +121,10 @@ """ +fails_with_expat_2_6_0 = (unittest.expectedFailure + if pyexpat.version_info >= (2, 6, 0) else + lambda test: test) + def checkwarnings(*filters, quiet=False): def decorator(test): def newtest(*args, **kwargs): @@ -1480,28 +1485,37 @@ def assert_event_tags(self, parser, expected, max_events=None): self.assertEqual([(action, elem.tag) for action, elem in events], expected) - def test_simple_xml(self): - for chunk_size in (None, 1, 5): - with self.subTest(chunk_size=chunk_size): - parser = ET.XMLPullParser() - self.assert_event_tags(parser, []) - self._feed(parser, "\n", chunk_size) - self.assert_event_tags(parser, []) - self._feed(parser, - "\n text\n", chunk_size) - self.assert_event_tags(parser, [('end', 'element')]) - self._feed(parser, "texttail\n", chunk_size) - self._feed(parser, "\n", chunk_size) - self.assert_event_tags(parser, [ - ('end', 'element'), - ('end', 'empty-element'), - ]) - self._feed(parser, "\n", chunk_size) - self.assert_event_tags(parser, [('end', 'root')]) - self.assertIsNone(parser.close()) + def test_simple_xml(self, chunk_size=None): + parser = ET.XMLPullParser() + self.assert_event_tags(parser, []) + self._feed(parser, "\n", chunk_size) + self.assert_event_tags(parser, []) + self._feed(parser, + "\n text\n", chunk_size) + self.assert_event_tags(parser, [('end', 'element')]) + self._feed(parser, 
"texttail\n", chunk_size) + self._feed(parser, "\n", chunk_size) + self.assert_event_tags(parser, [ + ('end', 'element'), + ('end', 'empty-element'), + ]) + self._feed(parser, "\n", chunk_size) + self.assert_event_tags(parser, [('end', 'root')]) + self.assertIsNone(parser.close()) + + @fails_with_expat_2_6_0 + def test_simple_xml_chunk_1(self): + self.test_simple_xml(chunk_size=1) + + @fails_with_expat_2_6_0 + def test_simple_xml_chunk_5(self): + self.test_simple_xml(chunk_size=5) + + def test_simple_xml_chunk_22(self): + self.test_simple_xml(chunk_size=22) def test_feed_while_iterating(self): parser = ET.XMLPullParser() diff --git a/Lib/tkinter/__init__.py b/Lib/tkinter/__init__.py index 2be9da2cfb9299..175bfbd7d912d2 100644 --- a/Lib/tkinter/__init__.py +++ b/Lib/tkinter/__init__.py @@ -3745,7 +3745,7 @@ def compare(self, index1, op, index2): return self.tk.getboolean(self.tk.call( self._w, 'compare', index1, op, index2)) - def count(self, index1, index2, *options): # new in Tk 8.5 + def count(self, index1, index2, *options, return_ints=False): # new in Tk 8.5 """Counts the number of relevant things between the two indices. If INDEX1 is after INDEX2, the result will be a negative number @@ -3753,19 +3753,26 @@ def count(self, index1, index2, *options): # new in Tk 8.5 The actual items which are counted depends on the options given. The result is a tuple of integers, one for the result of each - counting option given, if more than one option is specified, - otherwise it is an integer. Valid counting options are "chars", - "displaychars", "displayindices", "displaylines", "indices", - "lines", "xpixels" and "ypixels". The default value, if no - option is specified, is "indices". There is an additional possible - option "update", which if given then all subsequent options ensure - that any possible out of date information is recalculated.""" + counting option given, if more than one option is specified or + return_ints is false (default), otherwise it is an integer. + Valid counting options are "chars", "displaychars", + "displayindices", "displaylines", "indices", "lines", "xpixels" + and "ypixels". The default value, if no option is specified, is + "indices". There is an additional possible option "update", + which if given then all subsequent options ensure that any + possible out of date information is recalculated. + """ options = ['-%s' % arg for arg in options] res = self.tk.call(self._w, 'count', *options, index1, index2) if not isinstance(res, int): res = self._getints(res) if len(res) == 1: res, = res + if not return_ints: + if not res: + res = None + elif len(options) <= 1: + res = (res,) return res def debug(self, boolean=None): diff --git a/Lib/typing.py b/Lib/typing.py index d278b4effc7eba..914ddeaf504cd0 100644 --- a/Lib/typing.py +++ b/Lib/typing.py @@ -1127,7 +1127,9 @@ def __call__(self, *args, **kwargs): result = self.__origin__(*args, **kwargs) try: result.__orig_class__ = self - except AttributeError: + # Some objects raise TypeError (or something even more exotic) + # if you try to set attributes on them; we guard against that here + except Exception: pass return result @@ -1135,9 +1137,29 @@ def __mro_entries__(self, bases): res = [] if self.__origin__ not in bases: res.append(self.__origin__) + + # Check if any base that occurs after us in `bases` is either itself a + # subclass of Generic, or something which will add a subclass of Generic + # to `__bases__` via its `__mro_entries__`. If not, add Generic + # ourselves. 
The goal is to ensure that Generic (or a subclass) will + # appear exactly once in the final bases tuple. If we let it appear + # multiple times, we risk "can't form a consistent MRO" errors. i = bases.index(self) for b in bases[i+1:]: - if isinstance(b, _BaseGenericAlias) or issubclass(b, Generic): + if isinstance(b, _BaseGenericAlias): + break + if not isinstance(b, type): + meth = getattr(b, "__mro_entries__", None) + new_bases = meth(bases) if meth else None + if ( + isinstance(new_bases, tuple) and + any( + isinstance(b2, type) and issubclass(b2, Generic) + for b2 in new_bases + ) + ): + break + elif issubclass(b, Generic): break else: res.append(Generic) diff --git a/Lib/zipfile/__init__.py b/Lib/zipfile/__init__.py index 8005b4b34ccf76..cc08f602fe44e0 100644 --- a/Lib/zipfile/__init__.py +++ b/Lib/zipfile/__init__.py @@ -1802,11 +1802,15 @@ def _extract_member(self, member, targetpath, pwd): # Create all upper directories if necessary. upperdirs = os.path.dirname(targetpath) if upperdirs and not os.path.exists(upperdirs): - os.makedirs(upperdirs) + os.makedirs(upperdirs, exist_ok=True) if member.is_dir(): if not os.path.isdir(targetpath): - os.mkdir(targetpath) + try: + os.mkdir(targetpath) + except FileExistsError: + if not os.path.isdir(targetpath): + raise return targetpath with self.open(member, pwd=pwd) as source, \ diff --git a/Makefile.pre.in b/Makefile.pre.in index 07b2ec7adde78a..4dabe328ce0362 100644 --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -405,6 +405,7 @@ PYTHON_OBJS= \ Python/ast_opt.o \ Python/ast_unparse.o \ Python/bltinmodule.o \ + Python/brc.o \ Python/ceval.o \ Python/codecs.o \ Python/compile.o \ @@ -1081,6 +1082,7 @@ PYTHON_HEADERS= \ $(srcdir)/Include/internal/pycore_atexit.h \ $(srcdir)/Include/internal/pycore_bitutils.h \ $(srcdir)/Include/internal/pycore_blocks_output_buffer.h \ + $(srcdir)/Include/internal/pycore_brc.h \ $(srcdir)/Include/internal/pycore_bytes_methods.h \ $(srcdir)/Include/internal/pycore_bytesobject.h \ $(srcdir)/Include/internal/pycore_call.h \ diff --git a/Misc/ACKS b/Misc/ACKS index 466023f390a421..8a80e02ecba26a 100644 --- a/Misc/ACKS +++ b/Misc/ACKS @@ -1051,6 +1051,7 @@ Mark Lawrence Chris Laws Michael Layzell Michael Lazar +Peter Lazorchak Brian Leair Mathieu Leduc-Hamel Amandine Lee diff --git a/Misc/NEWS.d/3.12.0b1.rst b/Misc/NEWS.d/3.12.0b1.rst index 211513d05d0040..21f2c748f40548 100644 --- a/Misc/NEWS.d/3.12.0b1.rst +++ b/Misc/NEWS.d/3.12.0b1.rst @@ -2371,7 +2371,7 @@ Add a new C-API function to eagerly assign a version tag to a PyTypeObject: .. nonce: _paFIF .. section: C API -:c:func:`PyObject_GC_Resize` should calculate preheader size if needed. +:c:macro:`PyObject_GC_Resize` should calculate preheader size if needed. Patch by Donghee Na. .. diff --git a/Misc/NEWS.d/3.5.3.rst b/Misc/NEWS.d/3.5.3.rst index c3fcb67a4563f9..25db389ba5734f 100644 --- a/Misc/NEWS.d/3.5.3.rst +++ b/Misc/NEWS.d/3.5.3.rst @@ -3,5 +3,6 @@ .. no changes: True .. nonce: zYPqUK .. release date: 2017-01-17 +.. section: Library There were no code changes between 3.5.3rc1 and 3.5.3 final. diff --git a/Misc/NEWS.d/3.6.0.rst b/Misc/NEWS.d/3.6.0.rst index f9805cab28615e..d5c41f38838d93 100644 --- a/Misc/NEWS.d/3.6.0.rst +++ b/Misc/NEWS.d/3.6.0.rst @@ -3,5 +3,6 @@ .. no changes: True .. nonce: F9ENBV .. release date: 2016-12-23 +.. 
section: Library No changes since release candidate 2 diff --git a/Misc/NEWS.d/3.6.2.rst b/Misc/NEWS.d/3.6.2.rst index dba43d146df954..ee50670bd9f442 100644 --- a/Misc/NEWS.d/3.6.2.rst +++ b/Misc/NEWS.d/3.6.2.rst @@ -3,5 +3,6 @@ .. no changes: True .. nonce: F9ENBV .. release date: 2017-07-17 +.. section: Library No changes since release candidate 2 diff --git a/Misc/NEWS.d/next/Build/2024-02-08-17-38-56.gh-issue-113632.y9KIGb.rst b/Misc/NEWS.d/next/Build/2024-02-08-17-38-56.gh-issue-113632.y9KIGb.rst new file mode 100644 index 00000000000000..8b02b1b2cd08c9 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2024-02-08-17-38-56.gh-issue-113632.y9KIGb.rst @@ -0,0 +1,2 @@ +Promote WASI to a tier 2 platform and drop Emscripten from tier 3 in +configure.ac. diff --git a/Misc/NEWS.d/next/Build/2024-02-08-19-36-20.gh-issue-115167.LB9nDK.rst b/Misc/NEWS.d/next/Build/2024-02-08-19-36-20.gh-issue-115167.LB9nDK.rst new file mode 100644 index 00000000000000..c60c4a93fe8906 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2024-02-08-19-36-20.gh-issue-115167.LB9nDK.rst @@ -0,0 +1 @@ +Avoid vendoring ``vcruntime140_threads.dll`` when building with Visual Studio 2022 version 17.8. diff --git a/Misc/NEWS.d/next/Core and Builtins/2024-01-31-09-10-10.gh-issue-107944.XWm1B-.rst b/Misc/NEWS.d/next/Core and Builtins/2024-01-31-09-10-10.gh-issue-107944.XWm1B-.rst new file mode 100644 index 00000000000000..8e3fb786c11055 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2024-01-31-09-10-10.gh-issue-107944.XWm1B-.rst @@ -0,0 +1 @@ +Improve error message for function calls with bad keyword arguments via getargs diff --git a/Misc/NEWS.d/next/Core and Builtins/2024-02-01-23-43-49.gh-issue-76763.o_2J6i.rst b/Misc/NEWS.d/next/Core and Builtins/2024-02-01-23-43-49.gh-issue-76763.o_2J6i.rst new file mode 100644 index 00000000000000..d35d3d87073ddd --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2024-02-01-23-43-49.gh-issue-76763.o_2J6i.rst @@ -0,0 +1,3 @@ +The :func:`chr` builtin function now always raises :exc:`ValueError` for +values outside the valid range. Previously it raised :exc:`OverflowError` for +very large or small values. diff --git a/Misc/NEWS.d/next/Core and Builtins/2024-02-05-12-40-26.gh-issue-115011.L1AKF5.rst b/Misc/NEWS.d/next/Core and Builtins/2024-02-05-12-40-26.gh-issue-115011.L1AKF5.rst new file mode 100644 index 00000000000000..cf91a4f818bd44 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2024-02-05-12-40-26.gh-issue-115011.L1AKF5.rst @@ -0,0 +1,3 @@ +Setters for members with an unsigned integer type now support the same range +of valid values for objects that has a :meth:`~object.__index__` method as +for :class:`int`. diff --git a/Misc/NEWS.d/next/Core and Builtins/2024-02-07-00-18-42.gh-issue-112069.jRDRR5.rst b/Misc/NEWS.d/next/Core and Builtins/2024-02-07-00-18-42.gh-issue-112069.jRDRR5.rst new file mode 100644 index 00000000000000..51ba6bd1ddaac3 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2024-02-07-00-18-42.gh-issue-112069.jRDRR5.rst @@ -0,0 +1 @@ +Adapt :class:`set` and :class:`frozenset` methods to Argument Clinic. 
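The gh-107944 entry above ("Improve error message for function calls with bad keyword arguments via getargs") pairs with the test_exceptions.py change earlier in this patch; a quick before/after sketch, with the new wording taken from that test and the old wording from the assertion it replaces:

    >>> ImportError('test', invalid='keyword')
    Traceback (most recent call last):
      ...
    TypeError: ImportError() got an unexpected keyword argument 'invalid'

Before this change the message read "'invalid' is an invalid keyword argument for ImportError".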
diff --git a/Misc/NEWS.d/next/Core and Builtins/2024-02-07-18-04-36.gh-issue-114695.o9wP5P.rst b/Misc/NEWS.d/next/Core and Builtins/2024-02-07-18-04-36.gh-issue-114695.o9wP5P.rst new file mode 100644 index 00000000000000..a1db4de393eecb --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2024-02-07-18-04-36.gh-issue-114695.o9wP5P.rst @@ -0,0 +1,3 @@ +Add :func:`sys._clear_internal_caches`, which clears all internal +performance-related caches (and deprecate the less-general +:func:`sys._clear_type_cache` function). diff --git a/Misc/NEWS.d/next/Documentation/2024-02-12-12-26-17.gh-issue-115233.aug6r9.rst b/Misc/NEWS.d/next/Documentation/2024-02-12-12-26-17.gh-issue-115233.aug6r9.rst new file mode 100644 index 00000000000000..f37f94d12d4cf1 --- /dev/null +++ b/Misc/NEWS.d/next/Documentation/2024-02-12-12-26-17.gh-issue-115233.aug6r9.rst @@ -0,0 +1 @@ +Fix an example for :class:`~logging.LoggerAdapter` in the Logging Cookbook. diff --git a/Misc/NEWS.d/next/Library/2023-05-06-04-57-10.gh-issue-96471.C9wAU7.rst b/Misc/NEWS.d/next/Library/2023-05-06-04-57-10.gh-issue-96471.C9wAU7.rst new file mode 100644 index 00000000000000..0bace8d8bd425c --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-05-06-04-57-10.gh-issue-96471.C9wAU7.rst @@ -0,0 +1 @@ +Add :py:class:`queue.Queue` termination with :py:meth:`~queue.Queue.shutdown`. diff --git a/Misc/NEWS.d/next/Library/2023-12-18-20-10-50.gh-issue-89039.gqFdtU.rst b/Misc/NEWS.d/next/Library/2023-12-18-20-10-50.gh-issue-89039.gqFdtU.rst new file mode 100644 index 00000000000000..d1998d75e9fd76 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-12-18-20-10-50.gh-issue-89039.gqFdtU.rst @@ -0,0 +1,6 @@ +When the replace() method is called on a subclass of datetime, date or time, +the derived constructor is now called properly. Previously, only the base class's +constructor was called. + +Also, make sure to pass non-zero fold values when creating subclasses in +various methods. Previously, fold was silently ignored. diff --git a/Misc/NEWS.d/next/Library/2024-01-11-15-10-53.gh-issue-97959.UOj6d4.rst b/Misc/NEWS.d/next/Library/2024-01-11-15-10-53.gh-issue-97959.UOj6d4.rst new file mode 100644 index 00000000000000..a317271947dc37 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2024-01-11-15-10-53.gh-issue-97959.UOj6d4.rst @@ -0,0 +1,7 @@ +Fix rendering of class methods, bound methods, and method and function aliases in +:mod:`pydoc`. Class methods no longer have the "method of builtins.type +instance" note. Corresponding notes are now added for class and unbound +methods. Method and function aliases now have references to the module or +the class where the origin was defined if it differs from the current one. Bound +methods are now listed in the static methods section. Methods of builtin +classes are now supported as well as methods of Python classes. diff --git a/Misc/NEWS.d/next/Library/2024-01-30-22-10-50.gh-issue-49766.yulJL_.rst b/Misc/NEWS.d/next/Library/2024-01-30-22-10-50.gh-issue-49766.yulJL_.rst new file mode 100644 index 00000000000000..eaaa3ba1cb6f09 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2024-01-30-22-10-50.gh-issue-49766.yulJL_.rst @@ -0,0 +1,8 @@ +Fix :class:`~datetime.date`-:class:`~datetime.datetime` comparison. Now the +special comparison methods like ``__eq__`` and ``__lt__`` return +:data:`NotImplemented` if one of the comparands is :class:`!date` and the other is +:class:`!datetime`, instead of ignoring the time part and the time zone or +forcefully returning "not equal" or raising :exc:`TypeError`.
This makes the comparison +of :class:`!date` and :class:`!datetime` subclasses more symmetric and +allows changing the default behavior by overriding the special comparison +methods in subclasses. diff --git a/Misc/NEWS.d/next/Library/2024-02-02-15-50-13.gh-issue-114894.DF-dSd.rst b/Misc/NEWS.d/next/Library/2024-02-02-15-50-13.gh-issue-114894.DF-dSd.rst new file mode 100644 index 00000000000000..ec620f2aae3f03 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2024-02-02-15-50-13.gh-issue-114894.DF-dSd.rst @@ -0,0 +1 @@ +Add :meth:`array.array.clear`. diff --git a/Misc/NEWS.d/next/Library/2024-02-04-02-28-37.gh-issue-85984.NHZVTQ.rst b/Misc/NEWS.d/next/Library/2024-02-04-02-28-37.gh-issue-85984.NHZVTQ.rst new file mode 100644 index 00000000000000..bfa7e676f92306 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2024-02-04-02-28-37.gh-issue-85984.NHZVTQ.rst @@ -0,0 +1 @@ +Added ``_POSIX_VDISABLE`` from C's ``<unistd.h>`` to :mod:`termios`. diff --git a/Misc/NEWS.d/next/Library/2024-02-05-16-48-06.gh-issue-97928.JZCies.rst b/Misc/NEWS.d/next/Library/2024-02-05-16-48-06.gh-issue-97928.JZCies.rst new file mode 100644 index 00000000000000..24fed926a95513 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2024-02-05-16-48-06.gh-issue-97928.JZCies.rst @@ -0,0 +1,5 @@ +Partially revert the behavior of :meth:`tkinter.Text.count`. By default it +preserves the behavior of older Python versions, except that setting +``wantobjects`` to 0 no longer has an effect. Add a new parameter *return_ints*: +specifying ``return_ints=True`` makes ``Text.count()`` always return the +single count as an integer instead of a 1-tuple or ``None``. diff --git a/Misc/NEWS.d/next/Library/2024-02-06-03-55-46.gh-issue-115060.EkWRpP.rst b/Misc/NEWS.d/next/Library/2024-02-06-03-55-46.gh-issue-115060.EkWRpP.rst new file mode 100644 index 00000000000000..b358eeb569626f --- /dev/null +++ b/Misc/NEWS.d/next/Library/2024-02-06-03-55-46.gh-issue-115060.EkWRpP.rst @@ -0,0 +1 @@ +Speed up :meth:`pathlib.Path.glob` by removing redundant regex matching. diff --git a/Misc/NEWS.d/next/Library/2024-02-06-15-16-28.gh-issue-67837._JKa73.rst b/Misc/NEWS.d/next/Library/2024-02-06-15-16-28.gh-issue-67837._JKa73.rst new file mode 100644 index 00000000000000..340b65f1883942 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2024-02-06-15-16-28.gh-issue-67837._JKa73.rst @@ -0,0 +1,2 @@ +Avoid race conditions in the creation of directories during concurrent +extraction in :mod:`tarfile` and :mod:`zipfile`. diff --git a/Misc/NEWS.d/next/Library/2024-02-07-12-37-52.gh-issue-79382.Yz_5WB.rst b/Misc/NEWS.d/next/Library/2024-02-07-12-37-52.gh-issue-79382.Yz_5WB.rst new file mode 100644 index 00000000000000..5eb1888943186a --- /dev/null +++ b/Misc/NEWS.d/next/Library/2024-02-07-12-37-52.gh-issue-79382.Yz_5WB.rst @@ -0,0 +1,2 @@ +Trailing ``**`` no longer matches files and non-existing paths in +recursive :func:`~glob.glob`. diff --git a/Misc/NEWS.d/next/Library/2024-02-08-13-26-14.gh-issue-115059.DqP9dr.rst b/Misc/NEWS.d/next/Library/2024-02-08-13-26-14.gh-issue-115059.DqP9dr.rst new file mode 100644 index 00000000000000..331baedd3b24c5 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2024-02-08-13-26-14.gh-issue-115059.DqP9dr.rst @@ -0,0 +1 @@ +:meth:`io.BufferedRandom.read1` now flushes the underlying write buffer.
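(Illustration only, not part of the patch: a small script showing the :meth:`io.BufferedRandom.read1` behaviour described in the entry above; the expected output assumes a build that contains the fix::)

    import os
    import tempfile

    # Prepare a six-byte file, then overwrite the first two bytes through a
    # BufferedRandom object and immediately call read1() without seeking.
    fd, path = tempfile.mkstemp()
    os.close(fd)
    try:
        with open(path, "wb") as f:
            f.write(b"abcdef")
        with open(path, "rb+") as f:   # BufferedRandom
            f.write(b"XY")             # buffered, not yet written to the file
            # With the fix, the pending b"XY" is flushed first, so the stream
            # position is 2 and this is expected to return b"cdef".
            print(f.read1(4))
    finally:
        os.unlink(path)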
diff --git a/Misc/NEWS.d/next/Library/2024-02-08-14-21-28.gh-issue-115133.ycl4ko.rst b/Misc/NEWS.d/next/Library/2024-02-08-14-21-28.gh-issue-115133.ycl4ko.rst new file mode 100644 index 00000000000000..6f1015235cc25d --- /dev/null +++ b/Misc/NEWS.d/next/Library/2024-02-08-14-21-28.gh-issue-115133.ycl4ko.rst @@ -0,0 +1,2 @@ +Fix tests for :class:`~xml.etree.ElementTree.XMLPullParser` with Expat +2.6.0. diff --git a/Misc/NEWS.d/next/Library/2024-02-08-17-04-58.gh-issue-112903.SN_vUs.rst b/Misc/NEWS.d/next/Library/2024-02-08-17-04-58.gh-issue-112903.SN_vUs.rst new file mode 100644 index 00000000000000..e27f5832553c13 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2024-02-08-17-04-58.gh-issue-112903.SN_vUs.rst @@ -0,0 +1,2 @@ +Fix "issubclass() arg 1 must be a class" errors in certain cases of multiple +inheritance with generic aliases (regression in early 3.13 alpha releases). diff --git a/Misc/NEWS.d/next/Library/2024-02-09-07-20-16.gh-issue-115165.yfJLXA.rst b/Misc/NEWS.d/next/Library/2024-02-09-07-20-16.gh-issue-115165.yfJLXA.rst new file mode 100644 index 00000000000000..73d3d001f07f3f --- /dev/null +++ b/Misc/NEWS.d/next/Library/2024-02-09-07-20-16.gh-issue-115165.yfJLXA.rst @@ -0,0 +1,4 @@ +Most exceptions are now ignored when attempting to set the ``__orig_class__`` +attribute on objects returned when calling :mod:`typing` generic aliases +(including generic aliases created using :data:`typing.Annotated`). +Previously, only :exc:`AttributeError` was ignored. Patch by Dave Shawley. diff --git a/Misc/NEWS.d/next/Library/2024-02-10-15-24-20.gh-issue-102840.4mnDq1.rst b/Misc/NEWS.d/next/Library/2024-02-10-15-24-20.gh-issue-102840.4mnDq1.rst new file mode 100644 index 00000000000000..52668a9424a976 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2024-02-10-15-24-20.gh-issue-102840.4mnDq1.rst @@ -0,0 +1,3 @@ +Fix confusing traceback when floordiv, mod, or divmod operations happen +between instances of :class:`fractions.Fraction` and :class:`complex`. + diff --git a/Misc/NEWS.d/next/Library/2024-02-11-20-23-36.gh-issue-114563.RzxNYT.rst b/Misc/NEWS.d/next/Library/2024-02-11-20-23-36.gh-issue-114563.RzxNYT.rst new file mode 100644 index 00000000000000..013b6db8e6dbd7 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2024-02-11-20-23-36.gh-issue-114563.RzxNYT.rst @@ -0,0 +1,4 @@ +Fix several :func:`format()` bugs when using the C implementation of :class:`~decimal.Decimal`: +* memory leak in some rare cases when using the ``z`` format option (coerce negative 0) +* incorrect output when applying the ``z`` format option to type ``F`` (fixed-point with capital ``NAN`` / ``INF``) +* incorrect output when applying the ``#`` format option (alternate form) diff --git a/Misc/NEWS.d/next/Windows/2024-02-08-21-37-22.gh-issue-115049.X1ObpJ.rst b/Misc/NEWS.d/next/Windows/2024-02-08-21-37-22.gh-issue-115049.X1ObpJ.rst new file mode 100644 index 00000000000000..a679391857dcb3 --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2024-02-08-21-37-22.gh-issue-115049.X1ObpJ.rst @@ -0,0 +1 @@ +Fix the ``py.exe`` launcher failing when run as users without user profiles.
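(Illustration only, not part of the patch: the ``z`` format option, PEP 682's "coerce negative zero", exercised on the C-accelerated :class:`~decimal.Decimal`; the output shown assumes a release with the fixes above::)

    from decimal import Decimal

    # "z" coerces a negative zero produced by rounding to plain zero.
    print(format(Decimal("-0.0001"), "z.2f"))   # -> 0.00
    # Non-zero values keep their sign; "z" only affects negative zero.
    print(format(Decimal("-1.2345"), "z.2f"))   # -> -1.23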
diff --git a/Misc/NEWS.d/next/macOS/2022-11-18-10-05-35.gh-issue-87804.rhlDmD.rst b/Misc/NEWS.d/next/macOS/2022-11-18-10-05-35.gh-issue-87804.rhlDmD.rst new file mode 100644 index 00000000000000..e6554d5c9f1e1e --- /dev/null +++ b/Misc/NEWS.d/next/macOS/2022-11-18-10-05-35.gh-issue-87804.rhlDmD.rst @@ -0,0 +1 @@ +On macOS the result of ``os.statvfs`` and ``os.fstatvfs`` now correctly report the size of very large disks, in previous versions the reported number of blocks was wrong for disks with at least 2**32 blocks. diff --git a/Modules/_datetimemodule.c b/Modules/_datetimemodule.c index 9b8e0a719d9048..014ccdd3f6effe 100644 --- a/Modules/_datetimemodule.c +++ b/Modules/_datetimemodule.c @@ -1045,6 +1045,40 @@ new_datetime_ex(int year, int month, int day, int hour, int minute, new_datetime_ex2(y, m, d, hh, mm, ss, us, tzinfo, fold, \ &PyDateTime_DateTimeType) +static PyObject * +call_subclass_fold(PyObject *cls, int fold, const char *format, ...) +{ + PyObject *kwargs = NULL, *res = NULL; + va_list va; + + va_start(va, format); + PyObject *args = Py_VaBuildValue(format, va); + va_end(va); + if (args == NULL) { + return NULL; + } + if (fold) { + kwargs = PyDict_New(); + if (kwargs == NULL) { + goto Done; + } + PyObject *obj = PyLong_FromLong(fold); + if (obj == NULL) { + goto Done; + } + int err = PyDict_SetItemString(kwargs, "fold", obj); + Py_DECREF(obj); + if (err < 0) { + goto Done; + } + } + res = PyObject_Call(cls, args, kwargs); +Done: + Py_DECREF(args); + Py_XDECREF(kwargs); + return res; +} + static PyObject * new_datetime_subclass_fold_ex(int year, int month, int day, int hour, int minute, int second, int usecond, PyObject *tzinfo, @@ -1054,17 +1088,11 @@ new_datetime_subclass_fold_ex(int year, int month, int day, int hour, int minute // Use the fast path constructor dt = new_datetime(year, month, day, hour, minute, second, usecond, tzinfo, fold); - } else { + } + else { // Subclass - dt = PyObject_CallFunction(cls, "iiiiiiiO", - year, - month, - day, - hour, - minute, - second, - usecond, - tzinfo); + dt = call_subclass_fold(cls, fold, "iiiiiiiO", year, month, day, + hour, minute, second, usecond, tzinfo); } return dt; @@ -1120,6 +1148,24 @@ new_time_ex(int hour, int minute, int second, int usecond, #define new_time(hh, mm, ss, us, tzinfo, fold) \ new_time_ex2(hh, mm, ss, us, tzinfo, fold, &PyDateTime_TimeType) +static PyObject * +new_time_subclass_fold_ex(int hour, int minute, int second, int usecond, + PyObject *tzinfo, int fold, PyObject *cls) +{ + PyObject *t; + if ((PyTypeObject*)cls == &PyDateTime_TimeType) { + // Use the fast path constructor + t = new_time(hour, minute, second, usecond, tzinfo, fold); + } + else { + // Subclass + t = call_subclass_fold(cls, fold, "iiiiO", hour, minute, second, + usecond, tzinfo); + } + + return t; +} + /* Create a timedelta instance. Normalize the members iff normalize is * true. Passing false is a speed optimization, if you know for sure * that seconds and microseconds are already in their proper ranges. In any @@ -1816,16 +1862,6 @@ diff_to_bool(int diff, int op) Py_RETURN_RICHCOMPARE(diff, 0, op); } -/* Raises a "can't compare" TypeError and returns NULL. */ -static PyObject * -cmperror(PyObject *a, PyObject *b) -{ - PyErr_Format(PyExc_TypeError, - "can't compare %s to %s", - Py_TYPE(a)->tp_name, Py_TYPE(b)->tp_name); - return NULL; -} - /* --------------------------------------------------------------------------- * Class implementations. 
*/ @@ -3448,7 +3484,15 @@ date_isocalendar(PyDateTime_Date *self, PyObject *Py_UNUSED(ignored)) static PyObject * date_richcompare(PyObject *self, PyObject *other, int op) { - if (PyDate_Check(other)) { + /* Since DateTime is a subclass of Date, if the other object is + * a DateTime, it would compute an equality testing or an ordering + * based on the date part alone, and we don't want that. + * So return NotImplemented here in that case. + * If a subclass wants to change this, it's up to the subclass to do so. + * The behavior is the same as if Date and DateTime were independent + * classes. + */ + if (PyDate_Check(other) && !PyDateTime_Check(other)) { int diff = memcmp(((PyDateTime_Date *)self)->data, ((PyDateTime_Date *)other)->data, _PyDateTime_DATE_DATASIZE); @@ -3482,7 +3526,7 @@ datetime_date_replace_impl(PyDateTime_Date *self, int year, int month, int day) /*[clinic end generated code: output=2a9430d1e6318aeb input=0d1f02685b3e90f6]*/ { - return new_date_ex(year, month, day, Py_TYPE(self)); + return new_date_subclass_ex(year, month, day, (PyObject *)Py_TYPE(self)); } static Py_hash_t @@ -4591,8 +4635,8 @@ datetime_time_replace_impl(PyDateTime_Time *self, int hour, int minute, int fold) /*[clinic end generated code: output=0b89a44c299e4f80 input=9b6a35b1e704b0ca]*/ { - return new_time_ex2(hour, minute, second, microsecond, tzinfo, fold, - Py_TYPE(self)); + return new_time_subclass_fold_ex(hour, minute, second, microsecond, tzinfo, + fold, (PyObject *)Py_TYPE(self)); } static PyObject * @@ -5880,21 +5924,7 @@ datetime_richcompare(PyObject *self, PyObject *other, int op) PyObject *offset1, *offset2; int diff; - if (! PyDateTime_Check(other)) { - if (PyDate_Check(other)) { - /* Prevent invocation of date_richcompare. We want to - return NotImplemented here to give the other object - a chance. But since DateTime is a subclass of - Date, if the other object is a Date, it would - compute an ordering based on the date part alone, - and we don't want that. So force unequal or - uncomparable here in that case. 
*/ - if (op == Py_EQ) - Py_RETURN_FALSE; - if (op == Py_NE) - Py_RETURN_TRUE; - return cmperror(self, other); - } + if (!PyDateTime_Check(other)) { Py_RETURN_NOTIMPLEMENTED; } @@ -6055,8 +6085,9 @@ datetime_datetime_replace_impl(PyDateTime_DateTime *self, int year, int fold) /*[clinic end generated code: output=00bc96536833fddb input=9b38253d56d9bcad]*/ { - return new_datetime_ex2(year, month, day, hour, minute, second, - microsecond, tzinfo, fold, Py_TYPE(self)); + return new_datetime_subclass_fold_ex(year, month, day, hour, minute, + second, microsecond, tzinfo, fold, + (PyObject *)Py_TYPE(self)); } static PyObject * diff --git a/Modules/_decimal/_decimal.c b/Modules/_decimal/_decimal.c index 127f5f2887d4cd..5b053c73e20bc9 100644 --- a/Modules/_decimal/_decimal.c +++ b/Modules/_decimal/_decimal.c @@ -82,6 +82,9 @@ typedef struct { /* Convert rationals for comparison */ PyObject *Rational; + /* Invariant: NULL or pointer to _pydecimal.Decimal */ + PyObject *PyDecimal; + PyObject *SignalTuple; struct DecCondMap *signal_map; @@ -3336,56 +3339,6 @@ dotsep_as_utf8(const char *s) return utf8; } -/* copy of libmpdec _mpd_round() */ -static void -_mpd_round(mpd_t *result, const mpd_t *a, mpd_ssize_t prec, - const mpd_context_t *ctx, uint32_t *status) -{ - mpd_ssize_t exp = a->exp + a->digits - prec; - - if (prec <= 0) { - mpd_seterror(result, MPD_Invalid_operation, status); - return; - } - if (mpd_isspecial(a) || mpd_iszero(a)) { - mpd_qcopy(result, a, status); - return; - } - - mpd_qrescale_fmt(result, a, exp, ctx, status); - if (result->digits > prec) { - mpd_qrescale_fmt(result, result, exp+1, ctx, status); - } -} - -/* Locate negative zero "z" option within a UTF-8 format spec string. - * Returns pointer to "z", else NULL. - * The portion of the spec we're working with is [[fill]align][sign][z] */ -static const char * -format_spec_z_search(char const *fmt, Py_ssize_t size) { - char const *pos = fmt; - char const *fmt_end = fmt + size; - /* skip over [[fill]align] (fill may be multi-byte character) */ - pos += 1; - while (pos < fmt_end && *pos & 0x80) { - pos += 1; - } - if (pos < fmt_end && strchr("<>=^", *pos) != NULL) { - pos += 1; - } else { - /* fill not present-- skip over [align] */ - pos = fmt; - if (pos < fmt_end && strchr("<>=^", *pos) != NULL) { - pos += 1; - } - } - /* skip over [sign] */ - if (pos < fmt_end && strchr("+- ", *pos) != NULL) { - pos += 1; - } - return pos < fmt_end && *pos == 'z' ? pos : NULL; -} - static int dict_get_item_string(PyObject *dict, const char *key, PyObject **valueobj, const char **valuestr) { @@ -3411,6 +3364,48 @@ dict_get_item_string(PyObject *dict, const char *key, PyObject **valueobj, const return 0; } +/* + * Fallback _pydecimal formatting for new format specifiers that mpdecimal does + * not yet support. 
As documented, libmpdec follows the PEP-3101 format language: + * https://www.bytereef.org/mpdecimal/doc/libmpdec/assign-convert.html#to-string + */ +static PyObject * +pydec_format(PyObject *dec, PyObject *context, PyObject *fmt, decimal_state *state) +{ + PyObject *result; + PyObject *pydec; + PyObject *u; + + if (state->PyDecimal == NULL) { + state->PyDecimal = _PyImport_GetModuleAttrString("_pydecimal", "Decimal"); + if (state->PyDecimal == NULL) { + return NULL; + } + } + + u = dec_str(dec); + if (u == NULL) { + return NULL; + } + + pydec = PyObject_CallOneArg(state->PyDecimal, u); + Py_DECREF(u); + if (pydec == NULL) { + return NULL; + } + + result = PyObject_CallMethod(pydec, "__format__", "(OO)", fmt, context); + Py_DECREF(pydec); + + if (result == NULL && PyErr_ExceptionMatches(PyExc_ValueError)) { + /* Do not confuse users with the _pydecimal exception */ + PyErr_Clear(); + PyErr_SetString(PyExc_ValueError, "invalid format string"); + } + + return result; +} + /* Formatted representation of a PyDecObject. */ static PyObject * dec_format(PyObject *dec, PyObject *args) @@ -3423,16 +3418,11 @@ dec_format(PyObject *dec, PyObject *args) PyObject *fmtarg; PyObject *context; mpd_spec_t spec; - char const *fmt; - char *fmt_copy = NULL; + char *fmt; char *decstring = NULL; uint32_t status = 0; int replace_fillchar = 0; - int no_neg_0 = 0; Py_ssize_t size; - mpd_t *mpd = MPD(dec); - mpd_uint_t dt[MPD_MINALLOC_MAX]; - mpd_t tmp = {MPD_STATIC|MPD_STATIC_DATA,0,0,0,MPD_MINALLOC_MAX,dt}; decimal_state *state = get_module_state_by_def(Py_TYPE(dec)); @@ -3442,7 +3432,7 @@ dec_format(PyObject *dec, PyObject *args) } if (PyUnicode_Check(fmtarg)) { - fmt = PyUnicode_AsUTF8AndSize(fmtarg, &size); + fmt = (char *)PyUnicode_AsUTF8AndSize(fmtarg, &size); if (fmt == NULL) { return NULL; } @@ -3454,35 +3444,15 @@ dec_format(PyObject *dec, PyObject *args) } } - /* NOTE: If https://github.com/python/cpython/pull/29438 lands, the - * format string manipulation below can be eliminated by enhancing - * the forked mpd_parse_fmt_str(). */ if (size > 0 && fmt[0] == '\0') { /* NUL fill character: must be replaced with a valid UTF-8 char before calling mpd_parse_fmt_str(). */ replace_fillchar = 1; - fmt = fmt_copy = dec_strdup(fmt, size); - if (fmt_copy == NULL) { + fmt = dec_strdup(fmt, size); + if (fmt == NULL) { return NULL; } - fmt_copy[0] = '_'; - } - /* Strip 'z' option, which isn't understood by mpd_parse_fmt_str(). - * NOTE: fmt is always null terminated by PyUnicode_AsUTF8AndSize() */ - char const *z_position = format_spec_z_search(fmt, size); - if (z_position != NULL) { - no_neg_0 = 1; - size_t z_index = z_position - fmt; - if (fmt_copy == NULL) { - fmt = fmt_copy = dec_strdup(fmt, size); - if (fmt_copy == NULL) { - return NULL; - } - } - /* Shift characters (including null terminator) left, - overwriting the 'z' option. 
*/ - memmove(fmt_copy + z_index, fmt_copy + z_index + 1, size - z_index); - size -= 1; + fmt[0] = '_'; } } else { @@ -3492,10 +3462,13 @@ dec_format(PyObject *dec, PyObject *args) } if (!mpd_parse_fmt_str(&spec, fmt, CtxCaps(context))) { - PyErr_SetString(PyExc_ValueError, - "invalid format string"); - goto finish; + if (replace_fillchar) { + PyMem_Free(fmt); + } + + return pydec_format(dec, context, fmtarg, state); } + if (replace_fillchar) { /* In order to avoid clobbering parts of UTF-8 thousands separators or decimal points when the substitution is reversed later, the actual @@ -3548,45 +3521,8 @@ dec_format(PyObject *dec, PyObject *args) } } - if (no_neg_0 && mpd_isnegative(mpd) && !mpd_isspecial(mpd)) { - /* Round into a temporary (carefully mirroring the rounding - of mpd_qformat_spec()), and check if the result is negative zero. - If so, clear the sign and format the resulting positive zero. */ - mpd_ssize_t prec; - mpd_qcopy(&tmp, mpd, &status); - if (spec.prec >= 0) { - switch (spec.type) { - case 'f': - mpd_qrescale(&tmp, &tmp, -spec.prec, CTX(context), &status); - break; - case '%': - tmp.exp += 2; - mpd_qrescale(&tmp, &tmp, -spec.prec, CTX(context), &status); - break; - case 'g': - prec = (spec.prec == 0) ? 1 : spec.prec; - if (tmp.digits > prec) { - _mpd_round(&tmp, &tmp, prec, CTX(context), &status); - } - break; - case 'e': - if (!mpd_iszero(&tmp)) { - _mpd_round(&tmp, &tmp, spec.prec+1, CTX(context), &status); - } - break; - } - } - if (status & MPD_Errors) { - PyErr_SetString(PyExc_ValueError, "unexpected error when rounding"); - goto finish; - } - if (mpd_iszero(&tmp)) { - mpd_set_positive(&tmp); - mpd = &tmp; - } - } - decstring = mpd_qformat_spec(mpd, &spec, CTX(context), &status); + decstring = mpd_qformat_spec(MPD(dec), &spec, CTX(context), &status); if (decstring == NULL) { if (status & MPD_Malloc_error) { PyErr_NoMemory(); @@ -3609,7 +3545,7 @@ dec_format(PyObject *dec, PyObject *args) Py_XDECREF(grouping); Py_XDECREF(sep); Py_XDECREF(dot); - if (fmt_copy) PyMem_Free(fmt_copy); + if (replace_fillchar) PyMem_Free(fmt); if (decstring) mpd_free(decstring); return result; } @@ -5987,6 +5923,9 @@ _decimal_exec(PyObject *m) Py_CLEAR(collections_abc); Py_CLEAR(MutableMapping); + /* For format specifiers not yet supported by libmpdec */ + state->PyDecimal = NULL; + /* Add types to the module */ CHECK_INT(PyModule_AddType(m, state->PyDec_Type)); CHECK_INT(PyModule_AddType(m, state->PyDecContext_Type)); @@ -6192,6 +6131,7 @@ decimal_clear(PyObject *module) Py_CLEAR(state->extended_context_template); Py_CLEAR(state->Rational); Py_CLEAR(state->SignalTuple); + Py_CLEAR(state->PyDecimal); PyMem_Free(state->signal_map); PyMem_Free(state->cond_map); diff --git a/Modules/_io/bufferedio.c b/Modules/_io/bufferedio.c index f02207ace9f3d2..8ebe9ec7095586 100644 --- a/Modules/_io/bufferedio.c +++ b/Modules/_io/bufferedio.c @@ -1050,6 +1050,16 @@ _io__Buffered_read1_impl(buffered *self, Py_ssize_t n) Py_DECREF(res); return NULL; } + /* Flush the write buffer if necessary */ + if (self->writable) { + PyObject *r = buffered_flush_and_rewind_unlocked(self); + if (r == NULL) { + LEAVE_BUFFERED(self) + Py_DECREF(res); + return NULL; + } + Py_DECREF(r); + } _bufferedreader_reset_buf(self); r = _bufferedreader_raw_read(self, PyBytes_AS_STRING(res), n); LEAVE_BUFFERED(self) diff --git a/Modules/_testbuffer.c b/Modules/_testbuffer.c index 5101834cfe1387..5084bcadb10f85 100644 --- a/Modules/_testbuffer.c +++ b/Modules/_testbuffer.c @@ -2816,70 +2816,91 @@ static struct PyModuleDef 
_testbuffermodule = { NULL }; - -PyMODINIT_FUNC -PyInit__testbuffer(void) +static int +_testbuffer_exec(PyObject *mod) { - PyObject *m; - - m = PyModule_Create(&_testbuffermodule); - if (m == NULL) - return NULL; - Py_SET_TYPE(&NDArray_Type, &PyType_Type); - Py_INCREF(&NDArray_Type); - PyModule_AddObject(m, "ndarray", (PyObject *)&NDArray_Type); + if (PyModule_AddType(mod, &NDArray_Type) < 0) { + return -1; + } Py_SET_TYPE(&StaticArray_Type, &PyType_Type); - Py_INCREF(&StaticArray_Type); - PyModule_AddObject(m, "staticarray", (PyObject *)&StaticArray_Type); + if (PyModule_AddType(mod, &StaticArray_Type) < 0) { + return -1; + } structmodule = PyImport_ImportModule("struct"); - if (structmodule == NULL) - return NULL; + if (structmodule == NULL) { + return -1; + } Struct = PyObject_GetAttrString(structmodule, "Struct"); + if (Struct == NULL) { + return -1; + } calcsize = PyObject_GetAttrString(structmodule, "calcsize"); - if (Struct == NULL || calcsize == NULL) - return NULL; + if (calcsize == NULL) { + return -1; + } simple_format = PyUnicode_FromString(simple_fmt); - if (simple_format == NULL) - return NULL; - - PyModule_AddIntMacro(m, ND_MAX_NDIM); - PyModule_AddIntMacro(m, ND_VAREXPORT); - PyModule_AddIntMacro(m, ND_WRITABLE); - PyModule_AddIntMacro(m, ND_FORTRAN); - PyModule_AddIntMacro(m, ND_SCALAR); - PyModule_AddIntMacro(m, ND_PIL); - PyModule_AddIntMacro(m, ND_GETBUF_FAIL); - PyModule_AddIntMacro(m, ND_GETBUF_UNDEFINED); - PyModule_AddIntMacro(m, ND_REDIRECT); - - PyModule_AddIntMacro(m, PyBUF_SIMPLE); - PyModule_AddIntMacro(m, PyBUF_WRITABLE); - PyModule_AddIntMacro(m, PyBUF_FORMAT); - PyModule_AddIntMacro(m, PyBUF_ND); - PyModule_AddIntMacro(m, PyBUF_STRIDES); - PyModule_AddIntMacro(m, PyBUF_INDIRECT); - PyModule_AddIntMacro(m, PyBUF_C_CONTIGUOUS); - PyModule_AddIntMacro(m, PyBUF_F_CONTIGUOUS); - PyModule_AddIntMacro(m, PyBUF_ANY_CONTIGUOUS); - PyModule_AddIntMacro(m, PyBUF_FULL); - PyModule_AddIntMacro(m, PyBUF_FULL_RO); - PyModule_AddIntMacro(m, PyBUF_RECORDS); - PyModule_AddIntMacro(m, PyBUF_RECORDS_RO); - PyModule_AddIntMacro(m, PyBUF_STRIDED); - PyModule_AddIntMacro(m, PyBUF_STRIDED_RO); - PyModule_AddIntMacro(m, PyBUF_CONTIG); - PyModule_AddIntMacro(m, PyBUF_CONTIG_RO); - - PyModule_AddIntMacro(m, PyBUF_READ); - PyModule_AddIntMacro(m, PyBUF_WRITE); - - return m; -} + if (simple_format == NULL) { + return -1; + } +#define ADD_INT_MACRO(mod, macro) \ + do { \ + if (PyModule_AddIntConstant(mod, #macro, macro) < 0) { \ + return -1; \ + } \ + } while (0) + + ADD_INT_MACRO(mod, ND_MAX_NDIM); + ADD_INT_MACRO(mod, ND_VAREXPORT); + ADD_INT_MACRO(mod, ND_WRITABLE); + ADD_INT_MACRO(mod, ND_FORTRAN); + ADD_INT_MACRO(mod, ND_SCALAR); + ADD_INT_MACRO(mod, ND_PIL); + ADD_INT_MACRO(mod, ND_GETBUF_FAIL); + ADD_INT_MACRO(mod, ND_GETBUF_UNDEFINED); + ADD_INT_MACRO(mod, ND_REDIRECT); + + ADD_INT_MACRO(mod, PyBUF_SIMPLE); + ADD_INT_MACRO(mod, PyBUF_WRITABLE); + ADD_INT_MACRO(mod, PyBUF_FORMAT); + ADD_INT_MACRO(mod, PyBUF_ND); + ADD_INT_MACRO(mod, PyBUF_STRIDES); + ADD_INT_MACRO(mod, PyBUF_INDIRECT); + ADD_INT_MACRO(mod, PyBUF_C_CONTIGUOUS); + ADD_INT_MACRO(mod, PyBUF_F_CONTIGUOUS); + ADD_INT_MACRO(mod, PyBUF_ANY_CONTIGUOUS); + ADD_INT_MACRO(mod, PyBUF_FULL); + ADD_INT_MACRO(mod, PyBUF_FULL_RO); + ADD_INT_MACRO(mod, PyBUF_RECORDS); + ADD_INT_MACRO(mod, PyBUF_RECORDS_RO); + ADD_INT_MACRO(mod, PyBUF_STRIDED); + ADD_INT_MACRO(mod, PyBUF_STRIDED_RO); + ADD_INT_MACRO(mod, PyBUF_CONTIG); + ADD_INT_MACRO(mod, PyBUF_CONTIG_RO); + + ADD_INT_MACRO(mod, PyBUF_READ); + ADD_INT_MACRO(mod, PyBUF_WRITE); + 
+#undef ADD_INT_MACRO + return 0; +} +PyMODINIT_FUNC +PyInit__testbuffer(void) +{ + PyObject *mod = PyModule_Create(&_testbuffermodule); + if (mod == NULL) { + return NULL; + } + if (_testbuffer_exec(mod) < 0) { + Py_DECREF(mod); + return NULL; + } + return mod; +} diff --git a/Modules/_testinternalcapi.c b/Modules/_testinternalcapi.c index 0bb739b5398b11..3834f00009cea4 100644 --- a/Modules/_testinternalcapi.c +++ b/Modules/_testinternalcapi.c @@ -1650,6 +1650,20 @@ get_rare_event_counters(PyObject *self, PyObject *type) ); } +static PyObject * +reset_rare_event_counters(PyObject *self, PyObject *Py_UNUSED(type)) +{ + PyInterpreterState *interp = PyInterpreterState_Get(); + + interp->rare_events.set_class = 0; + interp->rare_events.set_bases = 0; + interp->rare_events.set_eval_frame_func = 0; + interp->rare_events.builtin_dict = 0; + interp->rare_events.func_modification = 0; + + return Py_None; +} + #ifdef Py_GIL_DISABLED static PyObject * @@ -1727,6 +1741,7 @@ static PyMethodDef module_functions[] = { _TESTINTERNALCAPI_TEST_LONG_NUMBITS_METHODDEF {"get_type_module_name", get_type_module_name, METH_O}, {"get_rare_event_counters", get_rare_event_counters, METH_NOARGS}, + {"reset_rare_event_counters", reset_rare_event_counters, METH_NOARGS}, #ifdef Py_GIL_DISABLED {"py_thread_id", get_py_thread_id, METH_NOARGS}, #endif diff --git a/Modules/_xxtestfuzz/fuzzer.c b/Modules/_xxtestfuzz/fuzzer.c index e133b4d3c44480..6ea9f64d628530 100644 --- a/Modules/_xxtestfuzz/fuzzer.c +++ b/Modules/_xxtestfuzz/fuzzer.c @@ -502,7 +502,6 @@ static int fuzz_elementtree_parsewhole(const char* data, size_t size) { } #define MAX_PYCOMPILE_TEST_SIZE 16384 -static char pycompile_scratch[MAX_PYCOMPILE_TEST_SIZE]; static const int start_vals[] = {Py_eval_input, Py_single_input, Py_file_input}; const size_t NUM_START_VALS = sizeof(start_vals) / sizeof(start_vals[0]); @@ -531,6 +530,8 @@ static int fuzz_pycompile(const char* data, size_t size) { unsigned char optimize_idx = (unsigned char) data[1]; int optimize = optimize_vals[optimize_idx % NUM_OPTIMIZE_VALS]; + char pycompile_scratch[MAX_PYCOMPILE_TEST_SIZE]; + // Create a NUL-terminated C string from the remaining input memcpy(pycompile_scratch, data + 2, size - 2); // Put a NUL terminator just after the copied data. (Space was reserved already.) @@ -549,7 +550,13 @@ static int fuzz_pycompile(const char* data, size_t size) { PyObject *result = Py_CompileStringExFlags(pycompile_scratch, "", start, flags, optimize); if (result == NULL) { - /* compilation failed, most likely from a syntax error */ + /* Compilation failed, most likely from a syntax error. If it was a + SystemError we abort. There's no non-bug reason to raise a + SystemError. */ + if (PyErr_Occurred() && PyErr_ExceptionMatches(PyExc_SystemError)) { + PyErr_Print(); + abort(); + } PyErr_Clear(); } else { Py_DECREF(result); diff --git a/Modules/arraymodule.c b/Modules/arraymodule.c index b97ade6126fa08..df09d9d84789f7 100644 --- a/Modules/arraymodule.c +++ b/Modules/arraymodule.c @@ -868,6 +868,21 @@ array_slice(arrayobject *a, Py_ssize_t ilow, Py_ssize_t ihigh) return (PyObject *)np; } +/*[clinic input] +array.array.clear + +Remove all items from the array. 
+[clinic start generated code]*/ + +static PyObject * +array_array_clear_impl(arrayobject *self) +/*[clinic end generated code: output=5efe0417062210a9 input=5dffa30e94e717a4]*/ +{ + if (array_resize(self, 0) == -1) { + return NULL; + } + Py_RETURN_NONE; +} /*[clinic input] array.array.__copy__ @@ -2342,6 +2357,7 @@ static PyMethodDef array_methods[] = { ARRAY_ARRAY_APPEND_METHODDEF ARRAY_ARRAY_BUFFER_INFO_METHODDEF ARRAY_ARRAY_BYTESWAP_METHODDEF + ARRAY_ARRAY_CLEAR_METHODDEF ARRAY_ARRAY___COPY___METHODDEF ARRAY_ARRAY_COUNT_METHODDEF ARRAY_ARRAY___DEEPCOPY___METHODDEF diff --git a/Modules/clinic/arraymodule.c.h b/Modules/clinic/arraymodule.c.h index 0b764e43e19437..60a03fe012550e 100644 --- a/Modules/clinic/arraymodule.c.h +++ b/Modules/clinic/arraymodule.c.h @@ -5,6 +5,24 @@ preserve #include "pycore_abstract.h" // _PyNumber_Index() #include "pycore_modsupport.h" // _PyArg_CheckPositional() +PyDoc_STRVAR(array_array_clear__doc__, +"clear($self, /)\n" +"--\n" +"\n" +"Remove all items from the array."); + +#define ARRAY_ARRAY_CLEAR_METHODDEF \ + {"clear", (PyCFunction)array_array_clear, METH_NOARGS, array_array_clear__doc__}, + +static PyObject * +array_array_clear_impl(arrayobject *self); + +static PyObject * +array_array_clear(arrayobject *self, PyObject *Py_UNUSED(ignored)) +{ + return array_array_clear_impl(self); +} + PyDoc_STRVAR(array_array___copy____doc__, "__copy__($self, /)\n" "--\n" @@ -667,4 +685,4 @@ PyDoc_STRVAR(array_arrayiterator___setstate____doc__, #define ARRAY_ARRAYITERATOR___SETSTATE___METHODDEF \ {"__setstate__", (PyCFunction)array_arrayiterator___setstate__, METH_O, array_arrayiterator___setstate____doc__}, -/*[clinic end generated code: output=3be987238a4bb431 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=52c55d9b1d026c1c input=a9049054013a1b77]*/ diff --git a/Modules/getpath.c b/Modules/getpath.c index a3c8fc269d1c3c..abed139028244a 100644 --- a/Modules/getpath.c +++ b/Modules/getpath.c @@ -262,6 +262,10 @@ getpath_joinpath(PyObject *Py_UNUSED(self), PyObject *args) } /* Convert all parts to wchar and accumulate max final length */ wchar_t **parts = (wchar_t **)PyMem_Malloc(n * sizeof(wchar_t *)); + if (parts == NULL) { + PyErr_NoMemory(); + return NULL; + } memset(parts, 0, n * sizeof(wchar_t *)); Py_ssize_t cchFinal = 0; Py_ssize_t first = 0; diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c index e26265fc874ebb..ef6d65623bf038 100644 --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -52,6 +52,12 @@ # define EX_OK EXIT_SUCCESS #endif +#ifdef __APPLE__ + /* Needed for the implementation of os.statvfs */ +# include +# include +#endif + /* On android API level 21, 'AT_EACCESS' is not declared although * HAVE_FACCESSAT is defined. */ #ifdef __ANDROID__ @@ -637,6 +643,10 @@ PyOS_AfterFork_Child(void) tstate->native_thread_id = PyThread_get_thread_native_id(); #endif +#ifdef Py_GIL_DISABLED + _Py_brc_after_fork(tstate->interp); +#endif + status = _PyEval_ReInitThreads(tstate); if (_PyStatus_EXCEPTION(status)) { goto fatal_error; @@ -12882,6 +12892,62 @@ os_WSTOPSIG_impl(PyObject *module, int status) #endif #include +#ifdef __APPLE__ +/* On macOS struct statvfs uses 32-bit integers for block counts, + * resulting in overflow when filesystems are larger than 4TB. Therefore + * os.statvfs is implemented in terms of statfs(2). 
+ */ + +static PyObject* +_pystatvfs_fromstructstatfs(PyObject *module, struct statfs st) { + PyObject *StatVFSResultType = get_posix_state(module)->StatVFSResultType; + PyObject *v = PyStructSequence_New((PyTypeObject *)StatVFSResultType); + if (v == NULL) { + return NULL; + } + + long flags = 0; + if (st.f_flags & MNT_RDONLY) { + flags |= ST_RDONLY; + } + if (st.f_flags & MNT_NOSUID) { + flags |= ST_NOSUID; + } + + _Static_assert(sizeof(st.f_blocks) == sizeof(long long), "assuming large file"); + +#define SET_ITEM(SEQ, INDEX, EXPR) \ + do { \ + PyObject *obj = (EXPR); \ + if (obj == NULL) { \ + Py_DECREF((SEQ)); \ + return NULL; \ + } \ + PyStructSequence_SET_ITEM((SEQ), (INDEX), obj); \ + } while (0) + + SET_ITEM(v, 0, PyLong_FromLong((long) st.f_iosize)); + SET_ITEM(v, 1, PyLong_FromLong((long) st.f_bsize)); + SET_ITEM(v, 2, PyLong_FromLongLong((long long) st.f_blocks)); + SET_ITEM(v, 3, PyLong_FromLongLong((long long) st.f_bfree)); + SET_ITEM(v, 4, PyLong_FromLongLong((long long) st.f_bavail)); + SET_ITEM(v, 5, PyLong_FromLongLong((long long) st.f_files)); + SET_ITEM(v, 6, PyLong_FromLongLong((long long) st.f_ffree)); + SET_ITEM(v, 7, PyLong_FromLongLong((long long) st.f_ffree)); + SET_ITEM(v, 8, PyLong_FromLong((long) flags)); + + SET_ITEM(v, 9, PyLong_FromLong((long) NAME_MAX)); + SET_ITEM(v, 10, PyLong_FromUnsignedLong(st.f_fsid.val[0])); + +#undef SET_ITEM + + return v; +} + +#else + + + static PyObject* _pystatvfs_fromstructstatvfs(PyObject *module, struct statvfs st) { PyObject *StatVFSResultType = get_posix_state(module)->StatVFSResultType; @@ -12933,6 +12999,8 @@ _pystatvfs_fromstructstatvfs(PyObject *module, struct statvfs st) { return v; } +#endif + /*[clinic input] os.fstatvfs @@ -12950,6 +13018,22 @@ os_fstatvfs_impl(PyObject *module, int fd) { int result; int async_err = 0; +#ifdef __APPLE__ + struct statfs st; + /* On macOS os.fstatvfs is implemented using fstatfs(2) because + * the former uses 32-bit values for block counts. + */ + do { + Py_BEGIN_ALLOW_THREADS + result = fstatfs(fd, &st); + Py_END_ALLOW_THREADS + } while (result != 0 && errno == EINTR && + !(async_err = PyErr_CheckSignals())); + if (result != 0) + return (!async_err) ? posix_error() : NULL; + + return _pystatvfs_fromstructstatfs(module, st); +#else struct statvfs st; do { @@ -12962,6 +13046,7 @@ os_fstatvfs_impl(PyObject *module, int fd) return (!async_err) ? posix_error() : NULL; return _pystatvfs_fromstructstatvfs(module, st); +#endif } #endif /* defined(HAVE_FSTATVFS) && defined(HAVE_SYS_STATVFS_H) */ @@ -12985,6 +13070,28 @@ os_statvfs_impl(PyObject *module, path_t *path) /*[clinic end generated code: output=87106dd1beb8556e input=3f5c35791c669bd9]*/ { int result; + +#ifdef __APPLE__ + /* On macOS os.statvfs is implemented using statfs(2)/fstatfs(2) because + * the former uses 32-bit values for block counts. 
+ */ + struct statfs st; + + Py_BEGIN_ALLOW_THREADS + if (path->fd != -1) { + result = fstatfs(path->fd, &st); + } + else + result = statfs(path->narrow, &st); + Py_END_ALLOW_THREADS + + if (result) { + return path_error(path); + } + + return _pystatvfs_fromstructstatfs(module, st); + +#else struct statvfs st; Py_BEGIN_ALLOW_THREADS @@ -13002,6 +13109,7 @@ os_statvfs_impl(PyObject *module, path_t *path) } return _pystatvfs_fromstructstatvfs(module, st); +#endif } #endif /* defined(HAVE_STATVFS) && defined(HAVE_SYS_STATVFS_H) */ diff --git a/Modules/termios.c b/Modules/termios.c index 69dbd88be5fcc2..4635fefb8f3f5a 100644 --- a/Modules/termios.c +++ b/Modules/termios.c @@ -27,9 +27,7 @@ #include #include -#if defined(__sun) && defined(__SVR4) -# include // ioctl() -#endif +#include // _POSIX_VDISABLE /* HP-UX requires that this be included to pick up MDCD, MCTS, MDSR, * MDTR, MRI, and MRTS (apparently used internally by some things @@ -1315,6 +1313,9 @@ static struct constant { #ifdef TIOCTTYGSTRUCT {"TIOCTTYGSTRUCT", TIOCTTYGSTRUCT}, #endif +#ifdef _POSIX_VDISABLE + {"_POSIX_VDISABLE", _POSIX_VDISABLE}, +#endif /* sentinel */ {NULL, 0} diff --git a/Objects/clinic/setobject.c.h b/Objects/clinic/setobject.c.h new file mode 100644 index 00000000000000..f3c96995ede60d --- /dev/null +++ b/Objects/clinic/setobject.c.h @@ -0,0 +1,414 @@ +/*[clinic input] +preserve +[clinic start generated code]*/ + +#include "pycore_modsupport.h" // _PyArg_CheckPositional() + +PyDoc_STRVAR(set_pop__doc__, +"pop($self, /)\n" +"--\n" +"\n" +"Remove and return an arbitrary set element.\n" +"\n" +"Raises KeyError if the set is empty."); + +#define SET_POP_METHODDEF \ + {"pop", (PyCFunction)set_pop, METH_NOARGS, set_pop__doc__}, + +static PyObject * +set_pop_impl(PySetObject *so); + +static PyObject * +set_pop(PySetObject *so, PyObject *Py_UNUSED(ignored)) +{ + return set_pop_impl(so); +} + +PyDoc_STRVAR(set_update__doc__, +"update($self, /, *others)\n" +"--\n" +"\n" +"Update the set, adding elements from all others."); + +#define SET_UPDATE_METHODDEF \ + {"update", _PyCFunction_CAST(set_update), METH_FASTCALL, set_update__doc__}, + +static PyObject * +set_update_impl(PySetObject *so, PyObject *args); + +static PyObject * +set_update(PySetObject *so, PyObject *const *args, Py_ssize_t nargs) +{ + PyObject *return_value = NULL; + PyObject *__clinic_args = NULL; + + if (!_PyArg_CheckPositional("update", nargs, 0, PY_SSIZE_T_MAX)) { + goto exit; + } + __clinic_args = PyTuple_New(nargs - 0); + if (!__clinic_args) { + goto exit; + } + for (Py_ssize_t i = 0; i < nargs - 0; ++i) { + PyTuple_SET_ITEM(__clinic_args, i, Py_NewRef(args[0 + i])); + } + return_value = set_update_impl(so, __clinic_args); + +exit: + Py_XDECREF(__clinic_args); + return return_value; +} + +PyDoc_STRVAR(set_copy__doc__, +"copy($self, /)\n" +"--\n" +"\n" +"Return a shallow copy of a set."); + +#define SET_COPY_METHODDEF \ + {"copy", (PyCFunction)set_copy, METH_NOARGS, set_copy__doc__}, + +static PyObject * +set_copy_impl(PySetObject *so); + +static PyObject * +set_copy(PySetObject *so, PyObject *Py_UNUSED(ignored)) +{ + return set_copy_impl(so); +} + +PyDoc_STRVAR(frozenset_copy__doc__, +"copy($self, /)\n" +"--\n" +"\n" +"Return a shallow copy of a set."); + +#define FROZENSET_COPY_METHODDEF \ + {"copy", (PyCFunction)frozenset_copy, METH_NOARGS, frozenset_copy__doc__}, + +static PyObject * +frozenset_copy_impl(PySetObject *so); + +static PyObject * +frozenset_copy(PySetObject *so, PyObject *Py_UNUSED(ignored)) +{ + return frozenset_copy_impl(so); +} + 
+PyDoc_STRVAR(set_clear__doc__, +"clear($self, /)\n" +"--\n" +"\n" +"Remove all elements from this set."); + +#define SET_CLEAR_METHODDEF \ + {"clear", (PyCFunction)set_clear, METH_NOARGS, set_clear__doc__}, + +static PyObject * +set_clear_impl(PySetObject *so); + +static PyObject * +set_clear(PySetObject *so, PyObject *Py_UNUSED(ignored)) +{ + return set_clear_impl(so); +} + +PyDoc_STRVAR(set_union__doc__, +"union($self, /, *others)\n" +"--\n" +"\n" +"Return a new set with elements from the set and all others."); + +#define SET_UNION_METHODDEF \ + {"union", _PyCFunction_CAST(set_union), METH_FASTCALL, set_union__doc__}, + +static PyObject * +set_union_impl(PySetObject *so, PyObject *args); + +static PyObject * +set_union(PySetObject *so, PyObject *const *args, Py_ssize_t nargs) +{ + PyObject *return_value = NULL; + PyObject *__clinic_args = NULL; + + if (!_PyArg_CheckPositional("union", nargs, 0, PY_SSIZE_T_MAX)) { + goto exit; + } + __clinic_args = PyTuple_New(nargs - 0); + if (!__clinic_args) { + goto exit; + } + for (Py_ssize_t i = 0; i < nargs - 0; ++i) { + PyTuple_SET_ITEM(__clinic_args, i, Py_NewRef(args[0 + i])); + } + return_value = set_union_impl(so, __clinic_args); + +exit: + Py_XDECREF(__clinic_args); + return return_value; +} + +PyDoc_STRVAR(set_intersection_multi__doc__, +"intersection($self, /, *others)\n" +"--\n" +"\n" +"Return a new set with elements common to the set and all others."); + +#define SET_INTERSECTION_MULTI_METHODDEF \ + {"intersection", _PyCFunction_CAST(set_intersection_multi), METH_FASTCALL, set_intersection_multi__doc__}, + +static PyObject * +set_intersection_multi_impl(PySetObject *so, PyObject *args); + +static PyObject * +set_intersection_multi(PySetObject *so, PyObject *const *args, Py_ssize_t nargs) +{ + PyObject *return_value = NULL; + PyObject *__clinic_args = NULL; + + if (!_PyArg_CheckPositional("intersection", nargs, 0, PY_SSIZE_T_MAX)) { + goto exit; + } + __clinic_args = PyTuple_New(nargs - 0); + if (!__clinic_args) { + goto exit; + } + for (Py_ssize_t i = 0; i < nargs - 0; ++i) { + PyTuple_SET_ITEM(__clinic_args, i, Py_NewRef(args[0 + i])); + } + return_value = set_intersection_multi_impl(so, __clinic_args); + +exit: + Py_XDECREF(__clinic_args); + return return_value; +} + +PyDoc_STRVAR(set_intersection_update_multi__doc__, +"intersection_update($self, /, *others)\n" +"--\n" +"\n" +"Update the set, keeping only elements found in it and all others."); + +#define SET_INTERSECTION_UPDATE_MULTI_METHODDEF \ + {"intersection_update", _PyCFunction_CAST(set_intersection_update_multi), METH_FASTCALL, set_intersection_update_multi__doc__}, + +static PyObject * +set_intersection_update_multi_impl(PySetObject *so, PyObject *args); + +static PyObject * +set_intersection_update_multi(PySetObject *so, PyObject *const *args, Py_ssize_t nargs) +{ + PyObject *return_value = NULL; + PyObject *__clinic_args = NULL; + + if (!_PyArg_CheckPositional("intersection_update", nargs, 0, PY_SSIZE_T_MAX)) { + goto exit; + } + __clinic_args = PyTuple_New(nargs - 0); + if (!__clinic_args) { + goto exit; + } + for (Py_ssize_t i = 0; i < nargs - 0; ++i) { + PyTuple_SET_ITEM(__clinic_args, i, Py_NewRef(args[0 + i])); + } + return_value = set_intersection_update_multi_impl(so, __clinic_args); + +exit: + Py_XDECREF(__clinic_args); + return return_value; +} + +PyDoc_STRVAR(set_isdisjoint__doc__, +"isdisjoint($self, other, /)\n" +"--\n" +"\n" +"Return True if two sets have a null intersection."); + +#define SET_ISDISJOINT_METHODDEF \ + {"isdisjoint", (PyCFunction)set_isdisjoint, 
METH_O, set_isdisjoint__doc__}, + +PyDoc_STRVAR(set_difference_update__doc__, +"difference_update($self, /, *others)\n" +"--\n" +"\n" +"Update the set, removing elements found in others."); + +#define SET_DIFFERENCE_UPDATE_METHODDEF \ + {"difference_update", _PyCFunction_CAST(set_difference_update), METH_FASTCALL, set_difference_update__doc__}, + +static PyObject * +set_difference_update_impl(PySetObject *so, PyObject *args); + +static PyObject * +set_difference_update(PySetObject *so, PyObject *const *args, Py_ssize_t nargs) +{ + PyObject *return_value = NULL; + PyObject *__clinic_args = NULL; + + if (!_PyArg_CheckPositional("difference_update", nargs, 0, PY_SSIZE_T_MAX)) { + goto exit; + } + __clinic_args = PyTuple_New(nargs - 0); + if (!__clinic_args) { + goto exit; + } + for (Py_ssize_t i = 0; i < nargs - 0; ++i) { + PyTuple_SET_ITEM(__clinic_args, i, Py_NewRef(args[0 + i])); + } + return_value = set_difference_update_impl(so, __clinic_args); + +exit: + Py_XDECREF(__clinic_args); + return return_value; +} + +PyDoc_STRVAR(set_difference_multi__doc__, +"difference($self, /, *others)\n" +"--\n" +"\n" +"Return a new set with elements in the set that are not in the others."); + +#define SET_DIFFERENCE_MULTI_METHODDEF \ + {"difference", _PyCFunction_CAST(set_difference_multi), METH_FASTCALL, set_difference_multi__doc__}, + +static PyObject * +set_difference_multi_impl(PySetObject *so, PyObject *args); + +static PyObject * +set_difference_multi(PySetObject *so, PyObject *const *args, Py_ssize_t nargs) +{ + PyObject *return_value = NULL; + PyObject *__clinic_args = NULL; + + if (!_PyArg_CheckPositional("difference", nargs, 0, PY_SSIZE_T_MAX)) { + goto exit; + } + __clinic_args = PyTuple_New(nargs - 0); + if (!__clinic_args) { + goto exit; + } + for (Py_ssize_t i = 0; i < nargs - 0; ++i) { + PyTuple_SET_ITEM(__clinic_args, i, Py_NewRef(args[0 + i])); + } + return_value = set_difference_multi_impl(so, __clinic_args); + +exit: + Py_XDECREF(__clinic_args); + return return_value; +} + +PyDoc_STRVAR(set_symmetric_difference_update__doc__, +"symmetric_difference_update($self, other, /)\n" +"--\n" +"\n" +"Update the set, keeping only elements found in either set, but not in both."); + +#define SET_SYMMETRIC_DIFFERENCE_UPDATE_METHODDEF \ + {"symmetric_difference_update", (PyCFunction)set_symmetric_difference_update, METH_O, set_symmetric_difference_update__doc__}, + +PyDoc_STRVAR(set_symmetric_difference__doc__, +"symmetric_difference($self, other, /)\n" +"--\n" +"\n" +"Return a new set with elements in either the set or other but not both."); + +#define SET_SYMMETRIC_DIFFERENCE_METHODDEF \ + {"symmetric_difference", (PyCFunction)set_symmetric_difference, METH_O, set_symmetric_difference__doc__}, + +PyDoc_STRVAR(set_issubset__doc__, +"issubset($self, other, /)\n" +"--\n" +"\n" +"Report whether another set contains this set."); + +#define SET_ISSUBSET_METHODDEF \ + {"issubset", (PyCFunction)set_issubset, METH_O, set_issubset__doc__}, + +PyDoc_STRVAR(set_issuperset__doc__, +"issuperset($self, other, /)\n" +"--\n" +"\n" +"Report whether this set contains another set."); + +#define SET_ISSUPERSET_METHODDEF \ + {"issuperset", (PyCFunction)set_issuperset, METH_O, set_issuperset__doc__}, + +PyDoc_STRVAR(set_add__doc__, +"add($self, object, /)\n" +"--\n" +"\n" +"Add an element to a set.\n" +"\n" +"This has no effect if the element is already present."); + +#define SET_ADD_METHODDEF \ + {"add", (PyCFunction)set_add, METH_O, set_add__doc__}, + +PyDoc_STRVAR(set___contains____doc__, +"__contains__($self, object, 
/)\n" +"--\n" +"\n" +"x.__contains__(y) <==> y in x."); + +#define SET___CONTAINS___METHODDEF \ + {"__contains__", (PyCFunction)set___contains__, METH_O|METH_COEXIST, set___contains____doc__}, + +PyDoc_STRVAR(set_remove__doc__, +"remove($self, object, /)\n" +"--\n" +"\n" +"Remove an element from a set; it must be a member.\n" +"\n" +"If the element is not a member, raise a KeyError."); + +#define SET_REMOVE_METHODDEF \ + {"remove", (PyCFunction)set_remove, METH_O, set_remove__doc__}, + +PyDoc_STRVAR(set_discard__doc__, +"discard($self, object, /)\n" +"--\n" +"\n" +"Remove an element from a set if it is a member.\n" +"\n" +"Unlike set.remove(), the discard() method does not raise\n" +"an exception when an element is missing from the set."); + +#define SET_DISCARD_METHODDEF \ + {"discard", (PyCFunction)set_discard, METH_O, set_discard__doc__}, + +PyDoc_STRVAR(set___reduce____doc__, +"__reduce__($self, /)\n" +"--\n" +"\n" +"Return state information for pickling."); + +#define SET___REDUCE___METHODDEF \ + {"__reduce__", (PyCFunction)set___reduce__, METH_NOARGS, set___reduce____doc__}, + +static PyObject * +set___reduce___impl(PySetObject *so); + +static PyObject * +set___reduce__(PySetObject *so, PyObject *Py_UNUSED(ignored)) +{ + return set___reduce___impl(so); +} + +PyDoc_STRVAR(set___sizeof____doc__, +"__sizeof__($self, /)\n" +"--\n" +"\n" +"S.__sizeof__() -> size of S in memory, in bytes."); + +#define SET___SIZEOF___METHODDEF \ + {"__sizeof__", (PyCFunction)set___sizeof__, METH_NOARGS, set___sizeof____doc__}, + +static PyObject * +set___sizeof___impl(PySetObject *so); + +static PyObject * +set___sizeof__(PySetObject *so, PyObject *Py_UNUSED(ignored)) +{ + return set___sizeof___impl(so); +} +/*[clinic end generated code: output=34a30591148da884 input=a9049054013a1b77]*/ diff --git a/Objects/codeobject.c b/Objects/codeobject.c index dc46b773c26528..30336fa86111a7 100644 --- a/Objects/codeobject.c +++ b/Objects/codeobject.c @@ -1489,27 +1489,19 @@ PyCode_GetFreevars(PyCodeObject *code) static void clear_executors(PyCodeObject *co) { + assert(co->co_executors); for (int i = 0; i < co->co_executors->size; i++) { - Py_CLEAR(co->co_executors->executors[i]); + if (co->co_executors->executors[i]) { + _Py_ExecutorClear(co->co_executors->executors[i]); + } } PyMem_Free(co->co_executors); co->co_executors = NULL; } void -_PyCode_Clear_Executors(PyCodeObject *code) { - int code_len = (int)Py_SIZE(code); - for (int i = 0; i < code_len; i += _PyInstruction_GetLength(code, i)) { - _Py_CODEUNIT *instr = &_PyCode_CODE(code)[i]; - uint8_t opcode = instr->op.code; - uint8_t oparg = instr->op.arg; - if (opcode == ENTER_EXECUTOR) { - _PyExecutorObject *exec = code->co_executors->executors[oparg]; - assert(exec->vm_data.opcode != ENTER_EXECUTOR); - instr->op.code = exec->vm_data.opcode; - instr->op.arg = exec->vm_data.oparg; - } - } +_PyCode_Clear_Executors(PyCodeObject *code) +{ clear_executors(code); } @@ -2360,10 +2352,10 @@ _PyCode_ConstantKey(PyObject *op) void _PyStaticCode_Fini(PyCodeObject *co) { - deopt_code(co, _PyCode_CODE(co)); if (co->co_executors != NULL) { clear_executors(co); } + deopt_code(co, _PyCode_CODE(co)); PyMem_Free(co->co_extra); if (co->_co_cached != NULL) { Py_CLEAR(co->_co_cached->_co_code); diff --git a/Objects/dictobject.c b/Objects/dictobject.c index 2df95e977a180f..9b1defa5cbc609 100644 --- a/Objects/dictobject.c +++ b/Objects/dictobject.c @@ -5989,6 +5989,18 @@ _PyObject_MakeDictFromInstanceAttributes(PyObject *obj, PyDictValues *values) return 
make_dict_from_instance_attributes(interp, keys, values); } +static bool +has_unique_reference(PyObject *op) +{ +#ifdef Py_GIL_DISABLED + return (_Py_IsOwnedByCurrentThread(op) && + op->ob_ref_local == 1 && + _Py_atomic_load_ssize_relaxed(&op->ob_ref_shared) == 0); +#else + return Py_REFCNT(op) == 1; +#endif +} + // Return true if the dict was dematerialized, false otherwise. bool _PyObject_MakeInstanceAttributesFromDict(PyObject *obj, PyDictOrValues *dorv) @@ -6005,7 +6017,9 @@ _PyObject_MakeInstanceAttributesFromDict(PyObject *obj, PyDictOrValues *dorv) return false; } assert(_PyType_HasFeature(Py_TYPE(obj), Py_TPFLAGS_HEAPTYPE)); - if (dict->ma_keys != CACHED_KEYS(Py_TYPE(obj)) || Py_REFCNT(dict) != 1) { + if (dict->ma_keys != CACHED_KEYS(Py_TYPE(obj)) || + !has_unique_reference((PyObject *)dict)) + { return false; } assert(dict->ma_values); diff --git a/Objects/floatobject.c b/Objects/floatobject.c index c440e0dab0e79f..9b322c52d4daea 100644 --- a/Objects/floatobject.c +++ b/Objects/floatobject.c @@ -2010,16 +2010,6 @@ _PyFloat_ClearFreeList(_PyFreeListState *freelist_state, int is_finalization) #endif } -void -_PyFloat_Fini(_PyFreeListState *state) -{ - // With Py_GIL_DISABLED: - // the freelists for the current thread state have already been cleared. -#ifndef Py_GIL_DISABLED - _PyFloat_ClearFreeList(state, 1); -#endif -} - void _PyFloat_FiniType(PyInterpreterState *interp) { diff --git a/Objects/genobject.c b/Objects/genobject.c index ab523e46cceaa3..59ab7abf6180bd 100644 --- a/Objects/genobject.c +++ b/Objects/genobject.c @@ -1682,17 +1682,6 @@ _PyAsyncGen_ClearFreeLists(_PyFreeListState *freelist_state, int is_finalization #endif } -void -_PyAsyncGen_Fini(_PyFreeListState *state) -{ - // With Py_GIL_DISABLED: - // the freelists for the current thread state have already been cleared. -#ifndef Py_GIL_DISABLED - _PyAsyncGen_ClearFreeLists(state, 1); -#endif -} - - static PyObject * async_gen_unwrap_value(PyAsyncGenObject *gen, PyObject *result) { diff --git a/Objects/listobject.c b/Objects/listobject.c index 307b8f1bd76cac..7fdb91eab890b5 100644 --- a/Objects/listobject.c +++ b/Objects/listobject.c @@ -135,16 +135,6 @@ _PyList_ClearFreeList(_PyFreeListState *freelist_state, int is_finalization) #endif } -void -_PyList_Fini(_PyFreeListState *state) -{ - // With Py_GIL_DISABLED: - // the freelists for the current thread state have already been cleared. 
-#ifndef Py_GIL_DISABLED - _PyList_ClearFreeList(state, 1); -#endif -} - /* Print summary info about the state of the optimized allocator */ void _PyList_DebugMallocStats(FILE *out) diff --git a/Objects/mimalloc/heap.c b/Objects/mimalloc/heap.c index 164b28f0fab240..154dad0b128480 100644 --- a/Objects/mimalloc/heap.c +++ b/Objects/mimalloc/heap.c @@ -538,7 +538,6 @@ bool _mi_heap_area_visit_blocks(const mi_heap_area_t* area, mi_page_t *page, mi_ mi_assert(page != NULL); if (page == NULL) return true; - _mi_page_free_collect(page,true); mi_assert_internal(page->local_free == NULL); if (page->used == 0) return true; @@ -635,6 +634,7 @@ bool _mi_heap_area_visit_blocks(const mi_heap_area_t* area, mi_page_t *page, mi_ typedef bool (mi_heap_area_visit_fun)(const mi_heap_t* heap, const mi_heap_area_ex_t* area, void* arg); void _mi_heap_area_init(mi_heap_area_t* area, mi_page_t* page) { + _mi_page_free_collect(page,true); const size_t bsize = mi_page_block_size(page); const size_t ubsize = mi_page_usable_block_size(page); area->reserved = page->reserved * bsize; diff --git a/Objects/object.c b/Objects/object.c index bbf7f98ae3daf9..275aa6713c8c21 100644 --- a/Objects/object.c +++ b/Objects/object.c @@ -2,6 +2,7 @@ /* Generic object operations; and implementation of None */ #include "Python.h" +#include "pycore_brc.h" // _Py_brc_queue_object() #include "pycore_call.h" // _PyObject_CallNoArgs() #include "pycore_ceval.h" // _Py_EnterRecursiveCallTstate() #include "pycore_context.h" // _PyContextTokenMissing_Type @@ -344,12 +345,10 @@ _Py_DecRefSharedDebug(PyObject *o, const char *filename, int lineno) &shared, new_shared)); if (should_queue) { - // TODO: the inter-thread queue is not yet implemented. For now, - // we just merge the refcount here. - Py_ssize_t refcount = _Py_ExplicitMergeRefcount(o, -1); - if (refcount == 0) { - _Py_Dealloc(o); - } +#ifdef Py_REF_DEBUG + _Py_IncRefTotal(_PyInterpreterState_GET()); +#endif + _Py_brc_queue_object(o); } else if (new_shared == _Py_REF_MERGED) { // refcount is zero AND merged @@ -399,10 +398,6 @@ _Py_ExplicitMergeRefcount(PyObject *op, Py_ssize_t extra) Py_ssize_t shared = _Py_atomic_load_ssize_relaxed(&op->ob_ref_shared); do { refcnt = Py_ARITHMETIC_RIGHT_SHIFT(Py_ssize_t, shared, _Py_REF_SHARED_SHIFT); - if (_Py_REF_IS_MERGED(shared)) { - return refcnt; - } - refcnt += (Py_ssize_t)op->ob_ref_local; refcnt += extra; @@ -410,6 +405,10 @@ _Py_ExplicitMergeRefcount(PyObject *op, Py_ssize_t extra) } while (!_Py_atomic_compare_exchange_ssize(&op->ob_ref_shared, &shared, new_shared)); +#ifdef Py_REF_DEBUG + _Py_AddRefTotal(_PyInterpreterState_GET(), extra); +#endif + _Py_atomic_store_uint32_relaxed(&op->ob_ref_local, 0); _Py_atomic_store_uintptr_relaxed(&op->ob_tid, 0); return refcnt; @@ -794,6 +793,21 @@ PyObject_Bytes(PyObject *v) return PyBytes_FromObject(v); } +void +_PyObject_ClearFreeLists(_PyFreeListState *state, int is_finalization) +{ + // In the free-threaded build, freelists are per-PyThreadState and cleared in PyThreadState_Clear() + // In the default build, freelists are per-interpreter and cleared in finalize_interp_types() + _PyFloat_ClearFreeList(state, is_finalization); + _PyTuple_ClearFreeList(state, is_finalization); + _PyList_ClearFreeList(state, is_finalization); + _PyDict_ClearFreeList(state, is_finalization); + _PyContext_ClearFreeList(state, is_finalization); + _PyAsyncGen_ClearFreeLists(state, is_finalization); + // Only be cleared if is_finalization is true. 
+ _PyObjectStackChunk_ClearFreeList(state, is_finalization); + _PySlice_ClearFreeList(state, is_finalization); +} /* def _PyObject_FunctionStr(x): diff --git a/Objects/obmalloc.c b/Objects/obmalloc.c index bea4ea85332bdd..6a12c3dca38b36 100644 --- a/Objects/obmalloc.c +++ b/Objects/obmalloc.c @@ -1073,7 +1073,12 @@ get_mimalloc_allocated_blocks(PyInterpreterState *interp) mi_heap_visit_blocks(heap, false, &count_blocks, &allocated_blocks); } } - // TODO(sgross): count blocks in abandoned segments. + + mi_abandoned_pool_t *pool = &interp->mimalloc.abandoned_pool; + for (uint8_t tag = 0; tag < _Py_MIMALLOC_HEAP_COUNT; tag++) { + _mi_abandoned_pool_visit_blocks(pool, tag, false, &count_blocks, + &allocated_blocks); + } #else // TODO(sgross): this only counts the current thread's blocks. mi_heap_t *heap = mi_heap_get_default(); @@ -1189,6 +1194,7 @@ get_num_global_allocated_blocks(_PyRuntimeState *runtime) } } else { + _PyEval_StopTheWorldAll(&_PyRuntime); HEAD_LOCK(runtime); PyInterpreterState *interp = PyInterpreterState_Head(); assert(interp != NULL); @@ -1208,6 +1214,7 @@ get_num_global_allocated_blocks(_PyRuntimeState *runtime) } } HEAD_UNLOCK(runtime); + _PyEval_StartTheWorldAll(&_PyRuntime); #ifdef Py_DEBUG assert(got_main); #endif diff --git a/Objects/setobject.c b/Objects/setobject.c index 3acf2a7a74890b..6a4c8c45f0836d 100644 --- a/Objects/setobject.c +++ b/Objects/setobject.c @@ -40,6 +40,19 @@ #include "pycore_pyerrors.h" // _PyErr_SetKeyError() #include "pycore_setobject.h" // _PySet_NextEntry() definition #include // offsetof() +#include "clinic/setobject.c.h" + +/*[clinic input] +class set "PySetObject *" "&PySet_Type" +class frozenset "PySetObject *" "&PyFrozenSet_Type" +[clinic start generated code]*/ +/*[clinic end generated code: output=da39a3ee5e6b4b0d input=97ad1d3e9f117079]*/ + +/*[python input] +class setobject_converter(self_converter): + type = "PySetObject *" +[python start generated code]*/ +/*[python end generated code: output=da39a3ee5e6b4b0d input=33a44506d4d57793]*/ /* Object used as dummy key to fill deleted entries */ static PyObject _dummy_struct; @@ -631,8 +644,18 @@ set_merge(PySetObject *so, PyObject *otherset) return 0; } +/*[clinic input] +set.pop + so: setobject + +Remove and return an arbitrary set element. + +Raises KeyError if the set is empty. +[clinic start generated code]*/ + static PyObject * -set_pop(PySetObject *so, PyObject *Py_UNUSED(ignored)) +set_pop_impl(PySetObject *so) +/*[clinic end generated code: output=4d65180f1271871b input=4a3f5552e660a260]*/ { /* Make sure the search finger is in bounds */ setentry *entry = so->table + (so->finger & so->mask); @@ -656,9 +679,6 @@ set_pop(PySetObject *so, PyObject *Py_UNUSED(ignored)) return key; } -PyDoc_STRVAR(pop_doc, "Remove and return an arbitrary set element.\n\ -Raises KeyError if the set is empty."); - static int set_traverse(PySetObject *so, visitproc visit, void *arg) { @@ -935,8 +955,18 @@ set_update_internal(PySetObject *so, PyObject *other) return 0; } +/*[clinic input] +set.update + so: setobject + *others as args: object + / + +Update the set, adding elements from all others. 
+[clinic start generated code]*/ + static PyObject * -set_update(PySetObject *so, PyObject *args) +set_update_impl(PySetObject *so, PyObject *args) +/*[clinic end generated code: output=34f6371704974c8a input=eb47c4fbaeb3286e]*/ { Py_ssize_t i; @@ -948,12 +978,6 @@ set_update(PySetObject *so, PyObject *args) Py_RETURN_NONE; } -PyDoc_STRVAR(update_doc, -"update($self, /, *others)\n\ ---\n\ -\n\ -Update the set, adding elements from all others."); - /* XXX Todo: If aligned memory allocations become available, make the set object 64 byte aligned so that most of the fields @@ -1101,14 +1125,30 @@ set_swap_bodies(PySetObject *a, PySetObject *b) } } +/*[clinic input] +set.copy + so: setobject + +Return a shallow copy of a set. +[clinic start generated code]*/ + static PyObject * -set_copy(PySetObject *so, PyObject *Py_UNUSED(ignored)) +set_copy_impl(PySetObject *so) +/*[clinic end generated code: output=c9223a1e1cc6b041 input=2b80b288d47b8cf1]*/ { return make_new_set_basetype(Py_TYPE(so), (PyObject *)so); } +/*[clinic input] +frozenset.copy + so: setobject + +Return a shallow copy of a set. +[clinic start generated code]*/ + static PyObject * -frozenset_copy(PySetObject *so, PyObject *Py_UNUSED(ignored)) +frozenset_copy_impl(PySetObject *so) +/*[clinic end generated code: output=b356263526af9e70 input=3dc65577d344eff7]*/ { if (PyFrozenSet_CheckExact(so)) { return Py_NewRef(so); @@ -1116,19 +1156,33 @@ frozenset_copy(PySetObject *so, PyObject *Py_UNUSED(ignored)) return set_copy(so, NULL); } -PyDoc_STRVAR(copy_doc, "Return a shallow copy of a set."); +/*[clinic input] +set.clear + so: setobject + +Remove all elements from this set. +[clinic start generated code]*/ static PyObject * -set_clear(PySetObject *so, PyObject *Py_UNUSED(ignored)) +set_clear_impl(PySetObject *so) +/*[clinic end generated code: output=4e71d5a83904161a input=74ac19794da81a39]*/ { set_clear_internal(so); Py_RETURN_NONE; } -PyDoc_STRVAR(clear_doc, "Remove all elements from this set."); +/*[clinic input] +set.union + so: setobject + *others as args: object + / + +Return a new set with elements from the set and all others. +[clinic start generated code]*/ static PyObject * -set_union(PySetObject *so, PyObject *args) +set_union_impl(PySetObject *so, PyObject *args) +/*[clinic end generated code: output=2c83d05a446a1477 input=2e2024fa1e40ac84]*/ { PySetObject *result; PyObject *other; @@ -1150,12 +1204,6 @@ set_union(PySetObject *so, PyObject *args) return (PyObject *)result; } -PyDoc_STRVAR(union_doc, -"union($self, /, *others)\n\ ---\n\ -\n\ -Return a new set with elements from the set and all others."); - static PyObject * set_or(PySetObject *so, PyObject *other) { @@ -1270,8 +1318,18 @@ set_intersection(PySetObject *so, PyObject *other) return NULL; } +/*[clinic input] +set.intersection as set_intersection_multi + so: setobject + *others as args: object + / + +Return a new set with elements common to the set and all others. 
+[clinic start generated code]*/ + static PyObject * -set_intersection_multi(PySetObject *so, PyObject *args) +set_intersection_multi_impl(PySetObject *so, PyObject *args) +/*[clinic end generated code: output=2406ef3387adbe2f input=04108ea6d7f0532b]*/ { Py_ssize_t i; @@ -1291,12 +1349,6 @@ set_intersection_multi(PySetObject *so, PyObject *args) return result; } -PyDoc_STRVAR(intersection_doc, -"intersection($self, /, *others)\n\ ---\n\ -\n\ -Return a new set with elements common to the set and all others."); - static PyObject * set_intersection_update(PySetObject *so, PyObject *other) { @@ -1310,12 +1362,22 @@ set_intersection_update(PySetObject *so, PyObject *other) Py_RETURN_NONE; } +/*[clinic input] +set.intersection_update as set_intersection_update_multi + so: setobject + *others as args: object + / + +Update the set, keeping only elements found in it and all others. +[clinic start generated code]*/ + static PyObject * -set_intersection_update_multi(PySetObject *so, PyObject *args) +set_intersection_update_multi_impl(PySetObject *so, PyObject *args) +/*[clinic end generated code: output=251c1f729063609d input=ff8f119f97458d16]*/ { PyObject *tmp; - tmp = set_intersection_multi(so, args); + tmp = set_intersection_multi_impl(so, args); if (tmp == NULL) return NULL; set_swap_bodies(so, (PySetObject *)tmp); @@ -1323,12 +1385,6 @@ set_intersection_update_multi(PySetObject *so, PyObject *args) Py_RETURN_NONE; } -PyDoc_STRVAR(intersection_update_doc, -"intersection_update($self, /, *others)\n\ ---\n\ -\n\ -Update the set, keeping only elements found in it and all others."); - static PyObject * set_and(PySetObject *so, PyObject *other) { @@ -1351,8 +1407,18 @@ set_iand(PySetObject *so, PyObject *other) return Py_NewRef(so); } +/*[clinic input] +set.isdisjoint + so: setobject + other: object + / + +Return True if two sets have a null intersection. +[clinic start generated code]*/ + static PyObject * set_isdisjoint(PySetObject *so, PyObject *other) +/*[clinic end generated code: output=a92bbf9a2db6a3da input=c254ddec8a2326e3]*/ { PyObject *key, *it, *tmp; int rv; @@ -1410,9 +1476,6 @@ set_isdisjoint(PySetObject *so, PyObject *other) Py_RETURN_TRUE; } -PyDoc_STRVAR(isdisjoint_doc, -"Return True if two sets have a null intersection."); - static int set_difference_update_internal(PySetObject *so, PyObject *other) { @@ -1471,8 +1534,18 @@ set_difference_update_internal(PySetObject *so, PyObject *other) return set_table_resize(so, so->used>50000 ? so->used*2 : so->used*4); } +/*[clinic input] +set.difference_update + so: setobject + *others as args: object + / + +Update the set, removing elements found in others. +[clinic start generated code]*/ + static PyObject * -set_difference_update(PySetObject *so, PyObject *args) +set_difference_update_impl(PySetObject *so, PyObject *args) +/*[clinic end generated code: output=28685b2fc63e41c4 input=e7abb43c9f2c5a73]*/ { Py_ssize_t i; @@ -1484,12 +1557,6 @@ set_difference_update(PySetObject *so, PyObject *args) Py_RETURN_NONE; } -PyDoc_STRVAR(difference_update_doc, -"difference_update($self, /, *others)\n\ ---\n\ -\n\ -Update the set, removing elements found in others."); - static PyObject * set_copy_and_difference(PySetObject *so, PyObject *other) { @@ -1580,8 +1647,18 @@ set_difference(PySetObject *so, PyObject *other) return result; } +/*[clinic input] +set.difference as set_difference_multi + so: setobject + *others as args: object + / + +Return a new set with elements in the set that are not in the others. 
+[clinic start generated code]*/ + static PyObject * -set_difference_multi(PySetObject *so, PyObject *args) +set_difference_multi_impl(PySetObject *so, PyObject *args) +/*[clinic end generated code: output=3130c3bb3cac873d input=d8ae9bb6d518ab95]*/ { Py_ssize_t i; PyObject *result, *other; @@ -1604,11 +1681,6 @@ set_difference_multi(PySetObject *so, PyObject *args) return result; } -PyDoc_STRVAR(difference_doc, -"difference($self, /, *others)\n\ ---\n\ -\n\ -Return a new set with elements in the set that are not in the others."); static PyObject * set_sub(PySetObject *so, PyObject *other) { @@ -1654,8 +1726,18 @@ set_symmetric_difference_update_dict(PySetObject *so, PyObject *other) Py_RETURN_NONE; } +/*[clinic input] +set.symmetric_difference_update + so: setobject + other: object + / + +Update the set, keeping only elements found in either set, but not in both. +[clinic start generated code]*/ + static PyObject * set_symmetric_difference_update(PySetObject *so, PyObject *other) +/*[clinic end generated code: output=fbb049c0806028de input=a50acf0365e1f0a5]*/ { PySetObject *otherset; PyObject *key; @@ -1708,14 +1790,18 @@ set_symmetric_difference_update(PySetObject *so, PyObject *other) Py_RETURN_NONE; } -PyDoc_STRVAR(symmetric_difference_update_doc, -"symmetric_difference_update($self, other, /)\n\ ---\n\ -\n\ -Update the set, keeping only elements found in either set, but not in both."); +/*[clinic input] +set.symmetric_difference + so: setobject + other: object + / + +Return a new set with elements in either the set or other but not both. +[clinic start generated code]*/ static PyObject * set_symmetric_difference(PySetObject *so, PyObject *other) +/*[clinic end generated code: output=f95364211b88775a input=f18af370ad72ebac]*/ { PyObject *rv; PySetObject *otherset; @@ -1732,12 +1818,6 @@ set_symmetric_difference(PySetObject *so, PyObject *other) return (PyObject *)otherset; } -PyDoc_STRVAR(symmetric_difference_doc, -"symmetric_difference($self, other, /)\n\ ---\n\ -\n\ -Return a new set with elements in either the set or other but not both."); - static PyObject * set_xor(PySetObject *so, PyObject *other) { @@ -1760,8 +1840,18 @@ set_ixor(PySetObject *so, PyObject *other) return Py_NewRef(so); } +/*[clinic input] +set.issubset + so: setobject + other: object + / + +Report whether another set contains this set. +[clinic start generated code]*/ + static PyObject * set_issubset(PySetObject *so, PyObject *other) +/*[clinic end generated code: output=78aef1f377aedef1 input=37fbc579b609db0c]*/ { setentry *entry; Py_ssize_t pos = 0; @@ -1794,14 +1884,18 @@ set_issubset(PySetObject *so, PyObject *other) Py_RETURN_TRUE; } -PyDoc_STRVAR(issubset_doc, -"issubset($self, other, /)\n\ ---\n\ -\n\ -Test whether every element in the set is in other."); +/*[clinic input] +set.issuperset + so: setobject + other: object + / + +Report whether this set contains another set. 
+[clinic start generated code]*/ static PyObject * set_issuperset(PySetObject *so, PyObject *other) +/*[clinic end generated code: output=7d2b71dd714a7ec7 input=fd5dab052f2e9bb3]*/ { if (PyAnySet_Check(other)) { return set_issubset((PySetObject *)other, (PyObject *)so); @@ -1830,12 +1924,6 @@ set_issuperset(PySetObject *so, PyObject *other) Py_RETURN_TRUE; } -PyDoc_STRVAR(issuperset_doc, -"issuperset($self, other, /)\n\ ---\n\ -\n\ -Test whether every element in other is in the set."); - static PyObject * set_richcompare(PySetObject *v, PyObject *w, int op) { @@ -1879,19 +1967,26 @@ set_richcompare(PySetObject *v, PyObject *w, int op) Py_RETURN_NOTIMPLEMENTED; } +/*[clinic input] +set.add + so: setobject + object as key: object + / + +Add an element to a set. + +This has no effect if the element is already present. +[clinic start generated code]*/ + static PyObject * set_add(PySetObject *so, PyObject *key) +/*[clinic end generated code: output=cd9c2d5c2069c2ba input=96f1efe029e47972]*/ { if (set_add_key(so, key)) return NULL; Py_RETURN_NONE; } -PyDoc_STRVAR(add_doc, -"Add an element to a set.\n\ -\n\ -This has no effect if the element is already present."); - static int set_contains(PySetObject *so, PyObject *key) { @@ -1912,8 +2007,19 @@ set_contains(PySetObject *so, PyObject *key) return rv; } +/*[clinic input] +@coexist +set.__contains__ + so: setobject + object as key: object + / + +x.__contains__(y) <==> y in x. +[clinic start generated code]*/ + static PyObject * -set_direct_contains(PySetObject *so, PyObject *key) +set___contains__(PySetObject *so, PyObject *key) +/*[clinic end generated code: output=b5948bc5c590d3ca input=cf4c72db704e4cf0]*/ { long result; @@ -1923,10 +2029,20 @@ set_direct_contains(PySetObject *so, PyObject *key) return PyBool_FromLong(result); } -PyDoc_STRVAR(contains_doc, "x.__contains__(y) <==> y in x."); +/*[clinic input] +set.remove + so: setobject + object as key: object + / + +Remove an element from a set; it must be a member. + +If the element is not a member, raise a KeyError. +[clinic start generated code]*/ static PyObject * set_remove(PySetObject *so, PyObject *key) +/*[clinic end generated code: output=08ae496d0cd2b8c1 input=10132515dfe8ebd7]*/ { PyObject *tmpkey; int rv; @@ -1952,13 +2068,21 @@ set_remove(PySetObject *so, PyObject *key) Py_RETURN_NONE; } -PyDoc_STRVAR(remove_doc, -"Remove an element from a set; it must be a member.\n\ -\n\ -If the element is not a member, raise a KeyError."); +/*[clinic input] +set.discard + so: setobject + object as key: object + / + +Remove an element from a set if it is a member. + +Unlike set.remove(), the discard() method does not raise +an exception when an element is missing from the set. +[clinic start generated code]*/ static PyObject * set_discard(PySetObject *so, PyObject *key) +/*[clinic end generated code: output=9181b60d7bb7d480 input=82a689eba94d5ad9]*/ { PyObject *tmpkey; int rv; @@ -1979,14 +2103,16 @@ set_discard(PySetObject *so, PyObject *key) Py_RETURN_NONE; } -PyDoc_STRVAR(discard_doc, -"Remove an element from a set if it is a member.\n\ -\n\ -Unlike set.remove(), the discard() method does not raise\n\ -an exception when an element is missing from the set."); +/*[clinic input] +set.__reduce__ + so: setobject + +Return state information for pickling. 
+[clinic start generated code]*/ static PyObject * -set_reduce(PySetObject *so, PyObject *Py_UNUSED(ignored)) +set___reduce___impl(PySetObject *so) +/*[clinic end generated code: output=9af7d0e029df87ee input=531375e87a24a449]*/ { PyObject *keys=NULL, *args=NULL, *result=NULL, *state=NULL; @@ -2007,8 +2133,16 @@ set_reduce(PySetObject *so, PyObject *Py_UNUSED(ignored)) return result; } +/*[clinic input] +set.__sizeof__ + so: setobject + +S.__sizeof__() -> size of S in memory, in bytes. +[clinic start generated code]*/ + static PyObject * -set_sizeof(PySetObject *so, PyObject *Py_UNUSED(ignored)) +set___sizeof___impl(PySetObject *so) +/*[clinic end generated code: output=4bfa3df7bd38ed88 input=0f214fc2225319fc]*/ { size_t res = _PyObject_SIZE(Py_TYPE(so)); if (so->table != so->smalltable) { @@ -2017,7 +2151,6 @@ set_sizeof(PySetObject *so, PyObject *Py_UNUSED(ignored)) return PyLong_FromSize_t(res); } -PyDoc_STRVAR(sizeof_doc, "S.__sizeof__() -> size of S in memory, in bytes"); static int set_init(PySetObject *self, PyObject *args, PyObject *kwds) { @@ -2071,46 +2204,26 @@ static PySequenceMethods set_as_sequence = { /* set object ********************************************************/ static PyMethodDef set_methods[] = { - {"add", (PyCFunction)set_add, METH_O, - add_doc}, - {"clear", (PyCFunction)set_clear, METH_NOARGS, - clear_doc}, - {"__contains__",(PyCFunction)set_direct_contains, METH_O | METH_COEXIST, - contains_doc}, - {"copy", (PyCFunction)set_copy, METH_NOARGS, - copy_doc}, - {"discard", (PyCFunction)set_discard, METH_O, - discard_doc}, - {"difference", (PyCFunction)set_difference_multi, METH_VARARGS, - difference_doc}, - {"difference_update", (PyCFunction)set_difference_update, METH_VARARGS, - difference_update_doc}, - {"intersection",(PyCFunction)set_intersection_multi, METH_VARARGS, - intersection_doc}, - {"intersection_update",(PyCFunction)set_intersection_update_multi, METH_VARARGS, - intersection_update_doc}, - {"isdisjoint", (PyCFunction)set_isdisjoint, METH_O, - isdisjoint_doc}, - {"issubset", (PyCFunction)set_issubset, METH_O, - issubset_doc}, - {"issuperset", (PyCFunction)set_issuperset, METH_O, - issuperset_doc}, - {"pop", (PyCFunction)set_pop, METH_NOARGS, - pop_doc}, - {"__reduce__", (PyCFunction)set_reduce, METH_NOARGS, - reduce_doc}, - {"remove", (PyCFunction)set_remove, METH_O, - remove_doc}, - {"__sizeof__", (PyCFunction)set_sizeof, METH_NOARGS, - sizeof_doc}, - {"symmetric_difference",(PyCFunction)set_symmetric_difference, METH_O, - symmetric_difference_doc}, - {"symmetric_difference_update",(PyCFunction)set_symmetric_difference_update, METH_O, - symmetric_difference_update_doc}, - {"union", (PyCFunction)set_union, METH_VARARGS, - union_doc}, - {"update", (PyCFunction)set_update, METH_VARARGS, - update_doc}, + SET_ADD_METHODDEF + SET_CLEAR_METHODDEF + SET___CONTAINS___METHODDEF + SET_COPY_METHODDEF + SET_DISCARD_METHODDEF + SET_DIFFERENCE_MULTI_METHODDEF + SET_DIFFERENCE_UPDATE_METHODDEF + SET_INTERSECTION_MULTI_METHODDEF + SET_INTERSECTION_UPDATE_MULTI_METHODDEF + SET_ISDISJOINT_METHODDEF + SET_ISSUBSET_METHODDEF + SET_ISSUPERSET_METHODDEF + SET_POP_METHODDEF + SET___REDUCE___METHODDEF + SET_REMOVE_METHODDEF + SET___SIZEOF___METHODDEF + SET_SYMMETRIC_DIFFERENCE_METHODDEF + SET_SYMMETRIC_DIFFERENCE_UPDATE_METHODDEF + SET_UNION_METHODDEF + SET_UPDATE_METHODDEF {"__class_getitem__", Py_GenericAlias, METH_O|METH_CLASS, PyDoc_STR("See PEP 585")}, {NULL, NULL} /* sentinel */ }; @@ -2203,28 +2316,17 @@ PyTypeObject PySet_Type = { static PyMethodDef 
frozenset_methods[] = { - {"__contains__",(PyCFunction)set_direct_contains, METH_O | METH_COEXIST, - contains_doc}, - {"copy", (PyCFunction)frozenset_copy, METH_NOARGS, - copy_doc}, - {"difference", (PyCFunction)set_difference_multi, METH_VARARGS, - difference_doc}, - {"intersection", (PyCFunction)set_intersection_multi, METH_VARARGS, - intersection_doc}, - {"isdisjoint", (PyCFunction)set_isdisjoint, METH_O, - isdisjoint_doc}, - {"issubset", (PyCFunction)set_issubset, METH_O, - issubset_doc}, - {"issuperset", (PyCFunction)set_issuperset, METH_O, - issuperset_doc}, - {"__reduce__", (PyCFunction)set_reduce, METH_NOARGS, - reduce_doc}, - {"__sizeof__", (PyCFunction)set_sizeof, METH_NOARGS, - sizeof_doc}, - {"symmetric_difference",(PyCFunction)set_symmetric_difference, METH_O, - symmetric_difference_doc}, - {"union", (PyCFunction)set_union, METH_VARARGS, - union_doc}, + SET___CONTAINS___METHODDEF + FROZENSET_COPY_METHODDEF + SET_DIFFERENCE_MULTI_METHODDEF + SET_INTERSECTION_MULTI_METHODDEF + SET_ISDISJOINT_METHODDEF + SET_ISSUBSET_METHODDEF + SET_ISSUPERSET_METHODDEF + SET___REDUCE___METHODDEF + SET___SIZEOF___METHODDEF + SET_SYMMETRIC_DIFFERENCE_METHODDEF + SET_UNION_METHODDEF {"__class_getitem__", Py_GenericAlias, METH_O|METH_CLASS, PyDoc_STR("See PEP 585")}, {NULL, NULL} /* sentinel */ }; diff --git a/Objects/sliceobject.c b/Objects/sliceobject.c index 8b9d6bbfd858b7..9880c123c80f95 100644 --- a/Objects/sliceobject.c +++ b/Objects/sliceobject.c @@ -103,8 +103,11 @@ PyObject _Py_EllipsisObject = _PyObject_HEAD_INIT(&PyEllipsis_Type); /* Slice object implementation */ -void _PySlice_ClearCache(_PyFreeListState *state) +void _PySlice_ClearFreeList(_PyFreeListState *state, int is_finalization) { + if (!is_finalization) { + return; + } #ifdef WITH_FREELISTS PySliceObject *obj = state->slices.slice_cache; if (obj != NULL) { @@ -114,13 +117,6 @@ void _PySlice_ClearCache(_PyFreeListState *state) #endif } -void _PySlice_Fini(_PyFreeListState *state) -{ -#ifdef WITH_FREELISTS - _PySlice_ClearCache(state); -#endif -} - /* start, stop, and step are python objects with None indicating no index is present. */ diff --git a/Objects/tupleobject.c b/Objects/tupleobject.c index b9bf6cd48f6129..7d73c3fb0f7f2c 100644 --- a/Objects/tupleobject.c +++ b/Objects/tupleobject.c @@ -964,11 +964,6 @@ _PyTuple_Resize(PyObject **pv, Py_ssize_t newsize) static void maybe_freelist_clear(_PyFreeListState *, int); -void -_PyTuple_Fini(_PyFreeListState *state) -{ - maybe_freelist_clear(state, 1); -} void _PyTuple_ClearFreeList(_PyFreeListState *state, int is_finalization) diff --git a/PC/launcher2.c b/PC/launcher2.c index e426eccd700044..90b0fdebd3bdfb 100644 --- a/PC/launcher2.c +++ b/PC/launcher2.c @@ -1594,6 +1594,7 @@ _registryReadLegacyEnvironment(const SearchInfo *search, HKEY root, EnvironmentI int count = swprintf_s(realTag, tagLength + 4, L"%s-32", env->tag); if (count == -1) { + debug(L"# Failed to generate 32bit tag\n"); free(realTag); return RC_INTERNAL_ERROR; } @@ -1749,10 +1750,18 @@ appxSearch(const SearchInfo *search, EnvironmentInfo **result, const wchar_t *pa exeName = search->windowed ? L"pythonw.exe" : L"python.exe"; } - if (FAILED(SHGetFolderPathW(NULL, CSIDL_LOCAL_APPDATA, NULL, 0, buffer)) || - !join(buffer, MAXLEN, L"Microsoft\\WindowsApps") || + // Failure to get LocalAppData may just mean we're running as a user who + // doesn't have a profile directory. + // In this case, return "not found", but don't fail. + // Chances are they can't launch Store installs anyway. 
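// (When SHGetFolderPathW() succeeds, the join() calls below assemble the App
//  Execution Alias path, i.e.
//  %LOCALAPPDATA%\Microsoft\WindowsApps\<packageFamilyName>\<exeName>.)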
+ if (FAILED(SHGetFolderPathW(NULL, CSIDL_LOCAL_APPDATA, NULL, 0, buffer))) { + return RC_NO_PYTHON; + } + + if (!join(buffer, MAXLEN, L"Microsoft\\WindowsApps") || !join(buffer, MAXLEN, packageFamilyName) || !join(buffer, MAXLEN, exeName)) { + debug(L"# Failed to construct App Execution Alias path\n"); return RC_INTERNAL_ERROR; } @@ -1982,6 +1991,7 @@ collectEnvironments(const SearchInfo *search, EnvironmentInfo **result) EnvironmentInfo *env = NULL; if (!result) { + debug(L"# collectEnvironments() was passed a NULL result\n"); return RC_INTERNAL_ERROR; } *result = NULL; @@ -2276,6 +2286,7 @@ int selectEnvironment(const SearchInfo *search, EnvironmentInfo *root, EnvironmentInfo **best) { if (!best) { + debug(L"# selectEnvironment() was passed a NULL best\n"); return RC_INTERNAL_ERROR; } if (!root) { diff --git a/PCbuild/_freeze_module.vcxproj b/PCbuild/_freeze_module.vcxproj index 35788ec4503e8f..49f529ebbc2f9b 100644 --- a/PCbuild/_freeze_module.vcxproj +++ b/PCbuild/_freeze_module.vcxproj @@ -191,6 +191,7 @@ + diff --git a/PCbuild/_freeze_module.vcxproj.filters b/PCbuild/_freeze_module.vcxproj.filters index 7a44179e356105..5b1bd7552b4cd9 100644 --- a/PCbuild/_freeze_module.vcxproj.filters +++ b/PCbuild/_freeze_module.vcxproj.filters @@ -46,6 +46,9 @@ Source Files + + Python + Source Files diff --git a/PCbuild/pyproject.props b/PCbuild/pyproject.props index fd5fbc9e910eee..9c85e5efa4af4a 100644 --- a/PCbuild/pyproject.props +++ b/PCbuild/pyproject.props @@ -250,7 +250,7 @@ public override bool Execute() { - + diff --git a/PCbuild/pythoncore.vcxproj b/PCbuild/pythoncore.vcxproj index e1ff97659659ee..4cc0ca4b9af8de 100644 --- a/PCbuild/pythoncore.vcxproj +++ b/PCbuild/pythoncore.vcxproj @@ -206,6 +206,7 @@ + @@ -553,6 +554,7 @@ + diff --git a/PCbuild/pythoncore.vcxproj.filters b/PCbuild/pythoncore.vcxproj.filters index 4c55f23006b2f0..ceaa21217267cf 100644 --- a/PCbuild/pythoncore.vcxproj.filters +++ b/PCbuild/pythoncore.vcxproj.filters @@ -546,6 +546,9 @@ Include\internal + + Include\internal + Include\internal @@ -1253,6 +1256,9 @@ Python + + Python + Python diff --git a/Python/bltinmodule.c b/Python/bltinmodule.c index 31c1bf07e8fb91..b0074962b73799 100644 --- a/Python/bltinmodule.c +++ b/Python/bltinmodule.c @@ -703,17 +703,34 @@ builtin_format_impl(PyObject *module, PyObject *value, PyObject *format_spec) /*[clinic input] chr as builtin_chr - i: int + i: object / Return a Unicode string of one character with ordinal i; 0 <= i <= 0x10ffff. [clinic start generated code]*/ static PyObject * -builtin_chr_impl(PyObject *module, int i) -/*[clinic end generated code: output=c733afcd200afcb7 input=3f604ef45a70750d]*/ +builtin_chr(PyObject *module, PyObject *i) +/*[clinic end generated code: output=d34f25b8035a9b10 input=f919867f0ba2f496]*/ { - return PyUnicode_FromOrdinal(i); + int overflow; + long v = PyLong_AsLongAndOverflow(i, &overflow); + if (v == -1 && PyErr_Occurred()) { + return NULL; + } + if (overflow) { + v = overflow < 0 ? INT_MIN : INT_MAX; + /* Allow PyUnicode_FromOrdinal() to raise an exception */ + } +#if SIZEOF_INT < SIZEOF_LONG + else if (v < INT_MIN) { + v = INT_MIN; + } + else if (v > INT_MAX) { + v = INT_MAX; + } +#endif + return PyUnicode_FromOrdinal(v); } diff --git a/Python/brc.c b/Python/brc.c new file mode 100644 index 00000000000000..f1fd57a2964cf5 --- /dev/null +++ b/Python/brc.c @@ -0,0 +1,198 @@ +// Implementation of biased reference counting inter-thread queue. 
+// +// Biased reference counting maintains two refcount fields in each object: +// ob_ref_local and ob_ref_shared. The true refcount is the sum of these two +// fields. In some cases, when refcounting operations are split across threads, +// the ob_ref_shared field can be negative (although the total refcount must +// be at least zero). In this case, the thread that decremented the refcount +// requests that the owning thread give up ownership and merge the refcount +// fields. This file implements the mechanism for doing so. +// +// Each thread state maintains a queue of objects whose refcounts it should +// merge. The thread states are stored in a per-interpreter hash table by +// thread id. The hash table has a fixed size and uses a linked list to store +// thread states within each bucket. +// +// The queueing thread uses the eval breaker mechanism to notify the owning +// thread that it has objects to merge. Additionally, all queued objects are +// merged during GC. +#include "Python.h" +#include "pycore_object.h" // _Py_ExplicitMergeRefcount +#include "pycore_brc.h" // struct _brc_thread_state +#include "pycore_ceval.h" // _Py_set_eval_breaker_bit +#include "pycore_llist.h" // struct llist_node +#include "pycore_pystate.h" // _PyThreadStateImpl + +#ifdef Py_GIL_DISABLED + +// Get the hashtable bucket for a given thread id. +static struct _brc_bucket * +get_bucket(PyInterpreterState *interp, uintptr_t tid) +{ + return &interp->brc.table[tid % _Py_BRC_NUM_BUCKETS]; +} + +// Find the thread state in a hash table bucket by thread id. +static _PyThreadStateImpl * +find_thread_state(struct _brc_bucket *bucket, uintptr_t thread_id) +{ + struct llist_node *node; + llist_for_each(node, &bucket->root) { + // Get the containing _PyThreadStateImpl from the linked-list node. + _PyThreadStateImpl *ts = llist_data(node, _PyThreadStateImpl, + brc.bucket_node); + if (ts->brc.tid == thread_id) { + return ts; + } + } + return NULL; +} + +// Enqueue an object to be merged by the owning thread. This steals a +// reference to the object. +void +_Py_brc_queue_object(PyObject *ob) +{ + PyInterpreterState *interp = _PyInterpreterState_GET(); + + uintptr_t ob_tid = _Py_atomic_load_uintptr(&ob->ob_tid); + if (ob_tid == 0) { + // The owning thread may have concurrently decided to merge the + // refcount fields. + Py_DECREF(ob); + return; + } + + struct _brc_bucket *bucket = get_bucket(interp, ob_tid); + PyMutex_Lock(&bucket->mutex); + _PyThreadStateImpl *tstate = find_thread_state(bucket, ob_tid); + if (tstate == NULL) { + // If we didn't find the owning thread then it must have already exited. + // It's safe (and necessary) to merge the refcount. Subtract one when + // merging because we've stolen a reference. + Py_ssize_t refcount = _Py_ExplicitMergeRefcount(ob, -1); + PyMutex_Unlock(&bucket->mutex); + if (refcount == 0) { + _Py_Dealloc(ob); + } + return; + } + + if (_PyObjectStack_Push(&tstate->brc.objects_to_merge, ob) < 0) { + PyMutex_Unlock(&bucket->mutex); + + // Fall back to stopping all threads and manually merging the refcount + // if we can't enqueue the object to be merged.
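// (With every other thread paused, nothing else can touch ob's refcount
//  fields, so the explicit merge below is safe; the -1 again accounts for
//  the reference this function stole from its caller.)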
+ _PyEval_StopTheWorld(interp); + Py_ssize_t refcount = _Py_ExplicitMergeRefcount(ob, -1); + _PyEval_StartTheWorld(interp); + + if (refcount == 0) { + _Py_Dealloc(ob); + } + return; + } + + // Notify owning thread + _Py_set_eval_breaker_bit(interp, _PY_EVAL_EXPLICIT_MERGE_BIT, 1); + + PyMutex_Unlock(&bucket->mutex); +} + +static void +merge_queued_objects(_PyObjectStack *to_merge) +{ + PyObject *ob; + while ((ob = _PyObjectStack_Pop(to_merge)) != NULL) { + // Subtract one when merging because the queue had a reference. + Py_ssize_t refcount = _Py_ExplicitMergeRefcount(ob, -1); + if (refcount == 0) { + _Py_Dealloc(ob); + } + } +} + +// Process this thread's queue of objects to merge. +void +_Py_brc_merge_refcounts(PyThreadState *tstate) +{ + struct _brc_thread_state *brc = &((_PyThreadStateImpl *)tstate)->brc; + struct _brc_bucket *bucket = get_bucket(tstate->interp, brc->tid); + + // Append all objects into a local stack. We don't want to hold the lock + // while calling destructors. + PyMutex_Lock(&bucket->mutex); + _PyObjectStack_Merge(&brc->local_objects_to_merge, &brc->objects_to_merge); + PyMutex_Unlock(&bucket->mutex); + + // Process the local stack until it's empty + merge_queued_objects(&brc->local_objects_to_merge); +} + +void +_Py_brc_init_state(PyInterpreterState *interp) +{ + struct _brc_state *brc = &interp->brc; + for (Py_ssize_t i = 0; i < _Py_BRC_NUM_BUCKETS; i++) { + llist_init(&brc->table[i].root); + } +} + +void +_Py_brc_init_thread(PyThreadState *tstate) +{ + struct _brc_thread_state *brc = &((_PyThreadStateImpl *)tstate)->brc; + brc->tid = _Py_ThreadId(); + + // Add ourself to the hashtable + struct _brc_bucket *bucket = get_bucket(tstate->interp, brc->tid); + PyMutex_Lock(&bucket->mutex); + llist_insert_tail(&bucket->root, &brc->bucket_node); + PyMutex_Unlock(&bucket->mutex); +} + +void +_Py_brc_remove_thread(PyThreadState *tstate) +{ + struct _brc_thread_state *brc = &((_PyThreadStateImpl *)tstate)->brc; + struct _brc_bucket *bucket = get_bucket(tstate->interp, brc->tid); + + // We need to fully process any objects to merge before removing ourself + // from the hashtable. It is not safe to perform any refcount operations + // after we are removed. After that point, other threads treat our objects + // as abandoned and may merge the objects' refcounts directly. + bool empty = false; + while (!empty) { + // Process the local stack until it's empty + merge_queued_objects(&brc->local_objects_to_merge); + + PyMutex_Lock(&bucket->mutex); + empty = (brc->objects_to_merge.head == NULL); + if (empty) { + llist_remove(&brc->bucket_node); + } + else { + _PyObjectStack_Merge(&brc->local_objects_to_merge, + &brc->objects_to_merge); + } + PyMutex_Unlock(&bucket->mutex); + } + + assert(brc->local_objects_to_merge.head == NULL); + assert(brc->objects_to_merge.head == NULL); +} + +void +_Py_brc_after_fork(PyInterpreterState *interp) +{ + // Unlock all bucket mutexes. Some of the buckets may be locked because + // locks can be handed off to a parked thread (see lock.c). We don't have + // to worry about consistency here, because no thread can be actively + // modifying a bucket, but it might be paused (not yet woken up) on a + // PyMutex_Lock while holding that lock.
+ for (Py_ssize_t i = 0; i < _Py_BRC_NUM_BUCKETS; i++) { + _PyMutex_at_fork_reinit(&interp->brc.table[i].mutex); + } +} + +#endif /* Py_GIL_DISABLED */ diff --git a/Python/bytecodes.c b/Python/bytecodes.c index 6fb4d719e43991..197dff4b9888ce 100644 --- a/Python/bytecodes.c +++ b/Python/bytecodes.c @@ -2370,23 +2370,12 @@ dummy_func( CHECK_EVAL_BREAKER(); PyCodeObject *code = _PyFrame_GetCode(frame); - _PyExecutorObject *executor = code->co_executors->executors[oparg & 255]; - if (executor->vm_data.valid) { - Py_INCREF(executor); - current_executor = executor; - GOTO_TIER_TWO(); - } - else { - /* ENTER_EXECUTOR will be the first code unit of the instruction */ - assert(oparg < 256); - code->co_executors->executors[oparg] = NULL; - opcode = this_instr->op.code = executor->vm_data.opcode; - this_instr->op.arg = executor->vm_data.oparg; - oparg = executor->vm_data.oparg; - Py_DECREF(executor); - next_instr = this_instr; - DISPATCH_GOTO(); - } + current_executor = code->co_executors->executors[oparg & 255]; + assert(current_executor->vm_data.index == INSTR_OFFSET() - 1); + assert(current_executor->vm_data.code == code); + assert(current_executor->vm_data.valid); + Py_INCREF(current_executor); + GOTO_TIER_TWO(); } replaced op(_POP_JUMP_IF_FALSE, (cond -- )) { diff --git a/Python/ceval_gil.c b/Python/ceval_gil.c index ad90359318761a..deb9741291fca7 100644 --- a/Python/ceval_gil.c +++ b/Python/ceval_gil.c @@ -980,6 +980,14 @@ _Py_HandlePending(PyThreadState *tstate) } } +#ifdef Py_GIL_DISABLED + /* Objects with refcounts to merge */ + if (_Py_eval_breaker_bit_is_set(interp, _PY_EVAL_EXPLICIT_MERGE_BIT)) { + _Py_set_eval_breaker_bit(interp, _PY_EVAL_EXPLICIT_MERGE_BIT, 0); + _Py_brc_merge_refcounts(tstate); + } +#endif + /* GC scheduled to run */ if (_Py_eval_breaker_bit_is_set(interp, _PY_GC_SCHEDULED_BIT)) { _Py_set_eval_breaker_bit(interp, _PY_GC_SCHEDULED_BIT, 0); diff --git a/Python/clinic/bltinmodule.c.h b/Python/clinic/bltinmodule.c.h index 8d40e659b54a57..3898f987cd61ea 100644 --- a/Python/clinic/bltinmodule.c.h +++ b/Python/clinic/bltinmodule.c.h @@ -233,25 +233,6 @@ PyDoc_STRVAR(builtin_chr__doc__, #define BUILTIN_CHR_METHODDEF \ {"chr", (PyCFunction)builtin_chr, METH_O, builtin_chr__doc__}, -static PyObject * -builtin_chr_impl(PyObject *module, int i); - -static PyObject * -builtin_chr(PyObject *module, PyObject *arg) -{ - PyObject *return_value = NULL; - int i; - - i = PyLong_AsInt(arg); - if (i == -1 && PyErr_Occurred()) { - goto exit; - } - return_value = builtin_chr_impl(module, i); - -exit: - return return_value; -} - PyDoc_STRVAR(builtin_compile__doc__, "compile($module, /, source, filename, mode, flags=0,\n" " dont_inherit=False, optimize=-1, *, _feature_version=-1)\n" @@ -1212,4 +1193,4 @@ builtin_issubclass(PyObject *module, PyObject *const *args, Py_ssize_t nargs) exit: return return_value; } -/*[clinic end generated code: output=31bded5d08647a57 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=643a8d5f900e0c36 input=a9049054013a1b77]*/ diff --git a/Python/clinic/sysmodule.c.h b/Python/clinic/sysmodule.c.h index 93b8385a5b4097..13f4ea81eb8984 100644 --- a/Python/clinic/sysmodule.c.h +++ b/Python/clinic/sysmodule.c.h @@ -1131,6 +1131,24 @@ sys__clear_type_cache(PyObject *module, PyObject *Py_UNUSED(ignored)) return sys__clear_type_cache_impl(module); } +PyDoc_STRVAR(sys__clear_internal_caches__doc__, +"_clear_internal_caches($module, /)\n" +"--\n" +"\n" +"Clear all internal performance-related caches."); + +#define SYS__CLEAR_INTERNAL_CACHES_METHODDEF \ + 
{"_clear_internal_caches", (PyCFunction)sys__clear_internal_caches, METH_NOARGS, sys__clear_internal_caches__doc__}, + +static PyObject * +sys__clear_internal_caches_impl(PyObject *module); + +static PyObject * +sys__clear_internal_caches(PyObject *module, PyObject *Py_UNUSED(ignored)) +{ + return sys__clear_internal_caches_impl(module); +} + PyDoc_STRVAR(sys_is_finalizing__doc__, "is_finalizing($module, /)\n" "--\n" @@ -1486,4 +1504,4 @@ sys__get_cpu_count_config(PyObject *module, PyObject *Py_UNUSED(ignored)) #ifndef SYS_GETANDROIDAPILEVEL_METHODDEF #define SYS_GETANDROIDAPILEVEL_METHODDEF #endif /* !defined(SYS_GETANDROIDAPILEVEL_METHODDEF) */ -/*[clinic end generated code: output=3dc3b2cb0ce38ebb input=a9049054013a1b77]*/ +/*[clinic end generated code: output=b8b1c53e04c3b20c input=a9049054013a1b77]*/ diff --git a/Python/context.c b/Python/context.c index 793dfa2b72c7e3..e44fef705c36e0 100644 --- a/Python/context.c +++ b/Python/context.c @@ -1284,17 +1284,6 @@ _PyContext_ClearFreeList(_PyFreeListState *freelist_state, int is_finalization) } -void -_PyContext_Fini(_PyFreeListState *state) -{ - // With Py_GIL_DISABLED: - // the freelists for the current thread state have already been cleared. -#ifndef Py_GIL_DISABLED - _PyContext_ClearFreeList(state, 1); -#endif -} - - PyStatus _PyContext_Init(PyInterpreterState *interp) { diff --git a/Python/gc_free_threading.c b/Python/gc_free_threading.c index 8fbcdb15109b76..93e1168002b6f7 100644 --- a/Python/gc_free_threading.c +++ b/Python/gc_free_threading.c @@ -1,5 +1,6 @@ // Cyclic garbage collector implementation for free-threaded build. #include "Python.h" +#include "pycore_brc.h" // struct _brc_thread_state #include "pycore_ceval.h" // _Py_set_eval_breaker_bit() #include "pycore_context.h" #include "pycore_dict.h" // _PyDict_MaybeUntrack() @@ -152,8 +153,7 @@ gc_decref(PyObject *op) op->ob_tid -= 1; } -// Merge refcounts while the world is stopped. -static void +static Py_ssize_t merge_refcount(PyObject *op, Py_ssize_t extra) { assert(_PyInterpreterState_GET()->stoptheworld.world_stopped); @@ -169,6 +169,7 @@ merge_refcount(PyObject *op, Py_ssize_t extra) op->ob_tid = 0; op->ob_ref_local = 0; op->ob_ref_shared = _Py_REF_SHARED(refcount, _Py_REF_MERGED); + return refcount; } static void @@ -282,6 +283,41 @@ gc_visit_heaps(PyInterpreterState *interp, mi_block_visit_fun *visitor, return err; } +static void +merge_queued_objects(_PyThreadStateImpl *tstate, struct collection_state *state) +{ + struct _brc_thread_state *brc = &tstate->brc; + _PyObjectStack_Merge(&brc->local_objects_to_merge, &brc->objects_to_merge); + + PyObject *op; + while ((op = _PyObjectStack_Pop(&brc->local_objects_to_merge)) != NULL) { + // Subtract one when merging because the queue had a reference. + Py_ssize_t refcount = merge_refcount(op, -1); + + if (!_PyObject_GC_IS_TRACKED(op) && refcount == 0) { + // GC objects with zero refcount are handled subsequently by the + // GC as if they were cyclic trash, but we have to handle dead + // non-GC objects here. Add one to the refcount so that we can + // decref and deallocate the object once we start the world again. 
+ op->ob_ref_shared += (1 << _Py_REF_SHARED_SHIFT); +#ifdef Py_REF_DEBUG + _Py_IncRefTotal(_PyInterpreterState_GET()); +#endif + worklist_push(&state->objs_to_decref, op); + } + } +} + +static void +merge_all_queued_objects(PyInterpreterState *interp, struct collection_state *state) +{ + HEAD_LOCK(&_PyRuntime); + for (PyThreadState *p = interp->threads.head; p != NULL; p = p->next) { + merge_queued_objects((_PyThreadStateImpl *)p, state); + } + HEAD_UNLOCK(&_PyRuntime); +} + // Subtract an incoming reference from the computed "gc_refs" refcount. static int visit_decref(PyObject *op, void *arg) @@ -927,6 +963,9 @@ static void gc_collect_internal(PyInterpreterState *interp, struct collection_state *state) { _PyEval_StopTheWorld(interp); + // merge refcounts for all queued objects + merge_all_queued_objects(interp, state); + // Find unreachable objects int err = deduce_unreachable_heap(interp, state); if (err < 0) { @@ -946,6 +985,9 @@ gc_collect_internal(PyInterpreterState *interp, struct collection_state *state) clear_weakrefs(state); _PyEval_StartTheWorld(interp); + // Deallocate any object from the refcount merge step + cleanup_worklist(&state->objs_to_decref); + // Call weakref callbacks and finalizers after unpausing other threads to // avoid potential deadlocks. call_weakref_callbacks(state); @@ -1679,7 +1721,7 @@ _PyGC_ClearAllFreeLists(PyInterpreterState *interp) HEAD_LOCK(&_PyRuntime); _PyThreadStateImpl *tstate = (_PyThreadStateImpl *)interp->threads.head; while (tstate != NULL) { - _Py_ClearFreeLists(&tstate->freelist_state, 0); + _PyObject_ClearFreeLists(&tstate->freelist_state, 0); tstate = (_PyThreadStateImpl *)tstate->base.next; } HEAD_UNLOCK(&_PyRuntime); diff --git a/Python/gc_gil.c b/Python/gc_gil.c index 4e2aa8f7af746c..5f1365f509deb0 100644 --- a/Python/gc_gil.c +++ b/Python/gc_gil.c @@ -11,7 +11,7 @@ void _PyGC_ClearAllFreeLists(PyInterpreterState *interp) { - _Py_ClearFreeLists(&interp->freelist_state, 0); + _PyObject_ClearFreeLists(&interp->freelist_state, 0); } #endif diff --git a/Python/generated_cases.c.h b/Python/generated_cases.c.h index 16f1db30620d72..e5244147d499af 100644 --- a/Python/generated_cases.c.h +++ b/Python/generated_cases.c.h @@ -2363,29 +2363,18 @@ } TARGET(ENTER_EXECUTOR) { - _Py_CODEUNIT *this_instr = frame->instr_ptr = next_instr; + frame->instr_ptr = next_instr; next_instr += 1; INSTRUCTION_STATS(ENTER_EXECUTOR); TIER_ONE_ONLY CHECK_EVAL_BREAKER(); PyCodeObject *code = _PyFrame_GetCode(frame); - _PyExecutorObject *executor = code->co_executors->executors[oparg & 255]; - if (executor->vm_data.valid) { - Py_INCREF(executor); - current_executor = executor; - GOTO_TIER_TWO(); - } - else { - /* ENTER_EXECUTOR will be the first code unit of the instruction */ - assert(oparg < 256); - code->co_executors->executors[oparg] = NULL; - opcode = this_instr->op.code = executor->vm_data.opcode; - this_instr->op.arg = executor->vm_data.oparg; - oparg = executor->vm_data.oparg; - Py_DECREF(executor); - next_instr = this_instr; - DISPATCH_GOTO(); - } + current_executor = code->co_executors->executors[oparg & 255]; + assert(current_executor->vm_data.index == INSTR_OFFSET() - 1); + assert(current_executor->vm_data.code == code); + assert(current_executor->vm_data.valid); + Py_INCREF(current_executor); + GOTO_TIER_TWO(); DISPATCH(); } diff --git a/Python/getargs.c b/Python/getargs.c index 0c4ce282f48764..08e97ee3e627b5 100644 --- a/Python/getargs.c +++ b/Python/getargs.c @@ -8,6 +8,7 @@ #include "pycore_modsupport.h" // export _PyArg_NoKeywords() #include 
"pycore_pylifecycle.h" // _PyArg_Fini #include "pycore_tuple.h" // _PyTuple_ITEMS() +#include "pycore_pyerrors.h" // _Py_CalculateSuggestions() /* Export Stable ABIs (abi only) */ PyAPI_FUNC(int) _PyArg_Parse_SizeT(PyObject *, const char *, ...); @@ -1424,12 +1425,31 @@ error_unexpected_keyword_arg(PyObject *kwargs, PyObject *kwnames, PyObject *kwtu int match = PySequence_Contains(kwtuple, keyword); if (match <= 0) { if (!match) { - PyErr_Format(PyExc_TypeError, - "'%S' is an invalid keyword " - "argument for %.200s%s", - keyword, - (fname == NULL) ? "this function" : fname, - (fname == NULL) ? "" : "()"); + PyObject *kwlist = PySequence_List(kwtuple); + if (!kwlist) { + return; + } + PyObject *suggestion_keyword = _Py_CalculateSuggestions(kwlist, keyword); + Py_DECREF(kwlist); + + if (suggestion_keyword) { + PyErr_Format(PyExc_TypeError, + "%.200s%s got an unexpected keyword argument '%S'." + " Did you mean '%S'?", + (fname == NULL) ? "this function" : fname, + (fname == NULL) ? "" : "()", + keyword, + suggestion_keyword); + Py_DECREF(suggestion_keyword); + } + else { + PyErr_Format(PyExc_TypeError, + "%.200s%s got an unexpected keyword argument '%S'", + (fname == NULL) ? "this function" : fname, + (fname == NULL) ? "" : "()", + keyword); + } + } return; } @@ -1457,6 +1477,9 @@ PyArg_ValidateKeywordArguments(PyObject *kwargs) return 1; } +static PyObject * +new_kwtuple(const char * const *keywords, int total, int pos); + #define IS_END_OF_FORMAT(c) (c == '\0' || c == ';' || c == ':') static int @@ -1722,12 +1745,35 @@ vgetargskeywords(PyObject *args, PyObject *kwargs, const char *format, } } if (!match) { - PyErr_Format(PyExc_TypeError, - "'%U' is an invalid keyword " - "argument for %.200s%s", - key, - (fname == NULL) ? "this function" : fname, - (fname == NULL) ? "" : "()"); + PyObject *_pykwtuple = new_kwtuple(kwlist, len, pos); + if (!_pykwtuple) { + return cleanreturn(0, &freelist); + } + PyObject *pykwlist = PySequence_List(_pykwtuple); + Py_DECREF(_pykwtuple); + if (!pykwlist) { + return cleanreturn(0, &freelist); + } + PyObject *suggestion_keyword = _Py_CalculateSuggestions(pykwlist, key); + Py_DECREF(pykwlist); + + if (suggestion_keyword) { + PyErr_Format(PyExc_TypeError, + "%.200s%s got an unexpected keyword argument '%S'." + " Did you mean '%S'?", + (fname == NULL) ? "this function" : fname, + (fname == NULL) ? "" : "()", + key, + suggestion_keyword); + Py_DECREF(suggestion_keyword); + } + else { + PyErr_Format(PyExc_TypeError, + "%.200s%s got an unexpected keyword argument '%S'", + (fname == NULL) ? "this function" : fname, + (fname == NULL) ? 
"" : "()", + key); + } return cleanreturn(0, &freelist); } } diff --git a/Python/object_stack.c b/Python/object_stack.c index 8544892eb71dcb..ced4460da00f44 100644 --- a/Python/object_stack.c +++ b/Python/object_stack.c @@ -67,6 +67,27 @@ _PyObjectStack_Clear(_PyObjectStack *queue) } } +void +_PyObjectStack_Merge(_PyObjectStack *dst, _PyObjectStack *src) +{ + if (src->head == NULL) { + return; + } + + if (dst->head != NULL) { + // First, append dst to the bottom of src + _PyObjectStackChunk *last = src->head; + while (last->prev != NULL) { + last = last->prev; + } + last->prev = dst->head; + } + + // Now that src has all the chunks, set dst to src + dst->head = src->head; + src->head = NULL; +} + void _PyObjectStackChunk_ClearFreeList(_PyFreeListState *free_lists, int is_finalization) { diff --git a/Python/optimizer.c b/Python/optimizer.c index d71ca0aef0e11a..ad9ac382d300ef 100644 --- a/Python/optimizer.c +++ b/Python/optimizer.c @@ -73,25 +73,21 @@ insert_executor(PyCodeObject *code, _Py_CODEUNIT *instr, int index, _PyExecutorO Py_INCREF(executor); if (instr->op.code == ENTER_EXECUTOR) { assert(index == instr->op.arg); - _PyExecutorObject *old = code->co_executors->executors[index]; - executor->vm_data.opcode = old->vm_data.opcode; - executor->vm_data.oparg = old->vm_data.oparg; - old->vm_data.opcode = 0; - code->co_executors->executors[index] = executor; - Py_DECREF(old); + _Py_ExecutorClear(code->co_executors->executors[index]); } else { assert(code->co_executors->size == index); assert(code->co_executors->capacity > index); - executor->vm_data.opcode = instr->op.code; - executor->vm_data.oparg = instr->op.arg; - code->co_executors->executors[index] = executor; - assert(index < MAX_EXECUTORS_SIZE); - instr->op.code = ENTER_EXECUTOR; - instr->op.arg = index; code->co_executors->size++; } - return; + executor->vm_data.opcode = instr->op.code; + executor->vm_data.oparg = instr->op.arg; + executor->vm_data.code = code; + executor->vm_data.index = (int)(instr - _PyCode_CODE(code)); + code->co_executors->executors[index] = executor; + assert(index < MAX_EXECUTORS_SIZE); + instr->op.code = ENTER_EXECUTOR; + instr->op.arg = index; } int @@ -1071,7 +1067,7 @@ link_executor(_PyExecutorObject *executor) } head->vm_data.links.next = executor; } - executor->vm_data.linked = true; + executor->vm_data.valid = true; /* executor_list_head must be first in list */ assert(interp->executor_list_head->vm_data.links.previous == NULL); } @@ -1079,7 +1075,7 @@ link_executor(_PyExecutorObject *executor) static void unlink_executor(_PyExecutorObject *executor) { - if (!executor->vm_data.linked) { + if (!executor->vm_data.valid) { return; } _PyExecutorLinkListNode *links = &executor->vm_data.links; @@ -1097,7 +1093,7 @@ unlink_executor(_PyExecutorObject *executor) assert(interp->executor_list_head == executor); interp->executor_list_head = next; } - executor->vm_data.linked = false; + executor->vm_data.valid = false; } /* This must be called by optimizers before using the executor */ @@ -1116,12 +1112,24 @@ void _Py_ExecutorClear(_PyExecutorObject *executor) { unlink_executor(executor); + PyCodeObject *code = executor->vm_data.code; + if (code == NULL) { + return; + } + _Py_CODEUNIT *instruction = &_PyCode_CODE(code)[executor->vm_data.index]; + assert(instruction->op.code == ENTER_EXECUTOR); + int index = instruction->op.arg; + assert(code->co_executors->executors[index] == executor); + instruction->op.code = executor->vm_data.opcode; + instruction->op.arg = executor->vm_data.oparg; + executor->vm_data.code = 
NULL; + Py_CLEAR(code->co_executors->executors[index]); } void _Py_Executor_DependsOn(_PyExecutorObject *executor, void *obj) { - assert(executor->vm_data.valid = true); + assert(executor->vm_data.valid); _Py_BloomFilter_Add(&executor->vm_data.bloom, obj); } @@ -1140,8 +1148,7 @@ _Py_Executors_InvalidateDependency(PyInterpreterState *interp, void *obj) assert(exec->vm_data.valid); _PyExecutorObject *next = exec->vm_data.links.next; if (bloom_filter_may_contain(&exec->vm_data.bloom, &obj_filter)) { - exec->vm_data.valid = false; - unlink_executor(exec); + _Py_ExecutorClear(exec); } exec = next; } @@ -1151,15 +1158,14 @@ void _Py_Executors_InvalidateAll(PyInterpreterState *interp) { - /* Walk the list of executors */ - for (_PyExecutorObject *exec = interp->executor_list_head; exec != NULL;) { - assert(exec->vm_data.valid); - _PyExecutorObject *next = exec->vm_data.links.next; - exec->vm_data.links.next = NULL; - exec->vm_data.links.previous = NULL; - exec->vm_data.valid = false; - exec->vm_data.linked = false; - exec = next; + while (interp->executor_list_head) { + _PyExecutorObject *executor = interp->executor_list_head; + if (executor->vm_data.code) { + // Clear the entire code object so its co_executors array can be freed: + _PyCode_Clear_Executors(executor->vm_data.code); + } + else { + _Py_ExecutorClear(executor); + } } - interp->executor_list_head = NULL; } diff --git a/Python/optimizer_analysis.c b/Python/optimizer_analysis.c index 2cfbf4b349d0f5..b14e6950b4a06b 100644 --- a/Python/optimizer_analysis.c +++ b/Python/optimizer_analysis.c @@ -28,25 +28,23 @@ increment_mutations(PyObject* dict) { d->ma_version_tag += (1 << DICT_MAX_WATCHERS); } +/* The first two dict watcher IDs are reserved for CPython, + * so we don't need to check that they haven't been used */ +#define BUILTINS_WATCHER_ID 0 +#define GLOBALS_WATCHER_ID 1 + static int globals_watcher_callback(PyDict_WatchEvent event, PyObject* dict, PyObject* key, PyObject* new_value) { - if (event == PyDict_EVENT_CLONED) { - return 0; - } - uint64_t watched_mutations = get_mutations(dict); - if (watched_mutations < _Py_MAX_ALLOWED_GLOBALS_MODIFICATIONS) { - _Py_Executors_InvalidateDependency(_PyInterpreterState_GET(), dict); - increment_mutations(dict); - } - else { - PyDict_Unwatch(1, dict); - } + RARE_EVENT_STAT_INC(watched_globals_modification); + assert(get_mutations(dict) < _Py_MAX_ALLOWED_GLOBALS_MODIFICATIONS); + _Py_Executors_InvalidateDependency(_PyInterpreterState_GET(), dict); + increment_mutations(dict); + PyDict_Unwatch(GLOBALS_WATCHER_ID, dict); return 0; } - static void global_to_const(_PyUOpInstruction *inst, PyObject *obj) { @@ -82,11 +80,6 @@ incorrect_keys(_PyUOpInstruction *inst, PyObject *obj) return 0; } -/* The first two dict watcher IDs are reserved for CPython, - * so we don't need to check that they haven't been used */ -#define BUILTINS_WATCHER_ID 0 -#define GLOBALS_WATCHER_ID 1 - /* Returns 1 if successfully optimized * 0 if the trace is not suitable for optimization (yet) * -1 if there was an error.
*/ @@ -117,8 +110,8 @@ remove_globals(_PyInterpreterFrame *frame, _PyUOpInstruction *buffer, uint32_t builtins_watched = 0; uint32_t globals_checked = 0; uint32_t globals_watched = 0; - if (interp->dict_state.watchers[1] == NULL) { - interp->dict_state.watchers[1] = globals_watcher_callback; + if (interp->dict_state.watchers[GLOBALS_WATCHER_ID] == NULL) { + interp->dict_state.watchers[GLOBALS_WATCHER_ID] = globals_watcher_callback; } for (int pc = 0; pc < buffer_size; pc++) { _PyUOpInstruction *inst = &buffer[pc]; diff --git a/Python/pylifecycle.c b/Python/pylifecycle.c index 0cac7109340129..230018068d751c 100644 --- a/Python/pylifecycle.c +++ b/Python/pylifecycle.c @@ -611,7 +611,7 @@ static int builtins_dict_watcher(PyDict_WatchEvent event, PyObject *dict, PyObject *key, PyObject *new_value) { PyInterpreterState *interp = _PyInterpreterState_GET(); - if (event != PyDict_EVENT_CLONED && interp->rare_events.builtin_dict < _Py_MAX_ALLOWED_BUILTINS_MODIFICATIONS) { + if (interp->rare_events.builtin_dict < _Py_MAX_ALLOWED_BUILTINS_MODIFICATIONS) { _Py_Executors_InvalidateAll(interp); } RARE_EVENT_INTERP_INC(interp, builtin_dict); @@ -1790,16 +1790,14 @@ finalize_interp_types(PyInterpreterState *interp) // a dict internally. _PyUnicode_ClearInterned(interp); - _PyDict_Fini(interp); _PyUnicode_Fini(interp); +#ifndef Py_GIL_DISABLED + // With Py_GIL_DISABLED: + // the freelists for the current thread state have already been cleared. _PyFreeListState *state = _PyFreeListState_GET(); - _PyTuple_Fini(state); - _PyList_Fini(state); - _PyFloat_Fini(state); - _PySlice_Fini(state); - _PyContext_Fini(state); - _PyAsyncGen_Fini(state); + _PyObject_ClearFreeLists(state, 1); +#endif #ifdef Py_DEBUG _PyStaticObjects_CheckRefcnt(interp); diff --git a/Python/pystate.c b/Python/pystate.c index e77e5bfa7e2df8..937c43033b068d 100644 --- a/Python/pystate.c +++ b/Python/pystate.c @@ -611,6 +611,9 @@ init_interpreter(PyInterpreterState *interp, _PyGC_InitState(&interp->gc); PyConfig_InitPythonConfig(&interp->config); _PyType_InitCache(interp); +#ifdef Py_GIL_DISABLED + _Py_brc_init_state(interp); +#endif for (int i = 0; i < _PY_MONITORING_UNGROUPED_EVENTS; i++) { interp->monitors.tools[i] = 0; } @@ -1336,6 +1339,11 @@ init_threadstate(_PyThreadStateImpl *_tstate, tstate->datastack_limit = NULL; tstate->what_event = -1; +#ifdef Py_GIL_DISABLED + // Initialize biased reference counting inter-thread queue + _Py_brc_init_thread(tstate); +#endif + if (interp->stoptheworld.requested || _PyRuntime.stoptheworld.requested) { // Start in the suspended state if there is an ongoing stop-the-world. 
tstate->state = _Py_THREAD_SUSPENDED; @@ -1460,20 +1468,6 @@ clear_datastack(PyThreadState *tstate) } } -void -_Py_ClearFreeLists(_PyFreeListState *state, int is_finalization) -{ - // In the free-threaded build, freelists are per-PyThreadState and cleared in PyThreadState_Clear() - // In the default build, freelists are per-interpreter and cleared in finalize_interp_types() - _PyFloat_ClearFreeList(state, is_finalization); - _PyTuple_ClearFreeList(state, is_finalization); - _PyList_ClearFreeList(state, is_finalization); - _PyDict_ClearFreeList(state, is_finalization); - _PyContext_ClearFreeList(state, is_finalization); - _PyAsyncGen_ClearFreeLists(state, is_finalization); - _PyObjectStackChunk_ClearFreeList(state, is_finalization); -} - void PyThreadState_Clear(PyThreadState *tstate) { @@ -1558,9 +1552,11 @@ PyThreadState_Clear(PyThreadState *tstate) } #ifdef Py_GIL_DISABLED // Each thread should clear own freelists in free-threading builds. - _PyFreeListState *freelist_state = &((_PyThreadStateImpl*)tstate)->freelist_state; - _Py_ClearFreeLists(freelist_state, 1); - _PySlice_ClearCache(freelist_state); + _PyFreeListState *freelist_state = _PyFreeListState_GET(); + _PyObject_ClearFreeLists(freelist_state, 1); + + // Remove ourself from the biased reference counting table of threads. + _Py_brc_remove_thread(tstate); #endif _PyThreadState_ClearMimallocHeaps(tstate); diff --git a/Python/specialize.c b/Python/specialize.c index e38e3556a6d642..ea2638570f22d0 100644 --- a/Python/specialize.c +++ b/Python/specialize.c @@ -275,6 +275,8 @@ print_rare_event_stats(FILE *out, RareEventStats *stats) fprintf(out, "Rare event (set_eval_frame_func): %" PRIu64 "\n", stats->set_eval_frame_func); fprintf(out, "Rare event (builtin_dict): %" PRIu64 "\n", stats->builtin_dict); fprintf(out, "Rare event (func_modification): %" PRIu64 "\n", stats->func_modification); + fprintf(out, "Rare event (watched_dict_modification): %" PRIu64 "\n", stats->watched_dict_modification); + fprintf(out, "Rare event (watched_globals_modification): %" PRIu64 "\n", stats->watched_globals_modification); } static void diff --git a/Python/structmember.c b/Python/structmember.c index c9f03a464078d0..ba881d18a0973d 100644 --- a/Python/structmember.c +++ b/Python/structmember.c @@ -2,6 +2,8 @@ /* Map C struct members to Python object attributes */ #include "Python.h" +#include "pycore_abstract.h" // _PyNumber_Index() +#include "pycore_long.h" // _PyLong_IsNegative() PyObject * @@ -200,27 +202,22 @@ PyMember_SetOne(char *addr, PyMemberDef *l, PyObject *v) case Py_T_UINT: { /* XXX: For compatibility, accept negative int values as well. 
diff --git a/Python/specialize.c b/Python/specialize.c
index e38e3556a6d642..ea2638570f22d0 100644
--- a/Python/specialize.c
+++ b/Python/specialize.c
@@ -275,6 +275,8 @@ print_rare_event_stats(FILE *out, RareEventStats *stats)
     fprintf(out, "Rare event (set_eval_frame_func): %" PRIu64 "\n", stats->set_eval_frame_func);
     fprintf(out, "Rare event (builtin_dict): %" PRIu64 "\n", stats->builtin_dict);
     fprintf(out, "Rare event (func_modification): %" PRIu64 "\n", stats->func_modification);
+    fprintf(out, "Rare event (watched_dict_modification): %" PRIu64 "\n", stats->watched_dict_modification);
+    fprintf(out, "Rare event (watched_globals_modification): %" PRIu64 "\n", stats->watched_globals_modification);
 }
 
 static void
diff --git a/Python/structmember.c b/Python/structmember.c
index c9f03a464078d0..ba881d18a0973d 100644
--- a/Python/structmember.c
+++ b/Python/structmember.c
@@ -2,6 +2,8 @@
 /* Map C struct members to Python object attributes */
 
 #include "Python.h"
+#include "pycore_abstract.h"      // _PyNumber_Index()
+#include "pycore_long.h"          // _PyLong_IsNegative()
 
 
 PyObject *
@@ -200,27 +202,22 @@ PyMember_SetOne(char *addr, PyMemberDef *l, PyObject *v)
     case Py_T_UINT: {
         /* XXX: For compatibility, accept negative int values
            as well. */
-        int overflow;
-        long long_val = PyLong_AsLongAndOverflow(v, &overflow);
-        if (long_val == -1 && PyErr_Occurred()) {
-            return -1;
-        }
-        if (overflow < 0) {
-            PyErr_SetString(PyExc_OverflowError,
-                            "Python int too large to convert to C long");
+        v = _PyNumber_Index(v);
+        if (v == NULL) {
             return -1;
         }
-        else if (!overflow) {
-            *(unsigned int *)addr = (unsigned int)(unsigned long)long_val;
-            if (long_val < 0) {
-                WARN("Writing negative value into unsigned field");
-            }
-            else if ((unsigned long)long_val > UINT_MAX) {
-                WARN("Truncation of value to unsigned short");
+        if (_PyLong_IsNegative((PyLongObject *)v)) {
+            long long_val = PyLong_AsLong(v);
+            Py_DECREF(v);
+            if (long_val == -1 && PyErr_Occurred()) {
+                return -1;
             }
+            *(unsigned int *)addr = (unsigned int)(unsigned long)long_val;
+            WARN("Writing negative value into unsigned field");
         }
         else {
             unsigned long ulong_val = PyLong_AsUnsignedLong(v);
+            Py_DECREF(v);
             if (ulong_val == (unsigned long)-1 && PyErr_Occurred()) {
                 return -1;
             }
@@ -240,24 +237,22 @@ PyMember_SetOne(char *addr, PyMemberDef *l, PyObject *v)
     case Py_T_ULONG: {
         /* XXX: For compatibility, accept negative int values
            as well. */
-        int overflow;
-        long long_val = PyLong_AsLongAndOverflow(v, &overflow);
-        if (long_val == -1 && PyErr_Occurred()) {
-            return -1;
-        }
-        if (overflow < 0) {
-            PyErr_SetString(PyExc_OverflowError,
-                            "Python int too large to convert to C long");
+        v = _PyNumber_Index(v);
+        if (v == NULL) {
             return -1;
         }
-        else if (!overflow) {
-            *(unsigned long *)addr = (unsigned long)long_val;
-            if (long_val < 0) {
-                WARN("Writing negative value into unsigned field");
+        if (_PyLong_IsNegative((PyLongObject *)v)) {
+            long long_val = PyLong_AsLong(v);
+            Py_DECREF(v);
+            if (long_val == -1 && PyErr_Occurred()) {
+                return -1;
             }
+            *(unsigned long *)addr = (unsigned long)long_val;
+            WARN("Writing negative value into unsigned field");
         }
         else {
             unsigned long ulong_val = PyLong_AsUnsignedLong(v);
+            Py_DECREF(v);
             if (ulong_val == (unsigned long)-1 && PyErr_Occurred()) {
                 return -1;
             }
@@ -313,18 +308,30 @@ PyMember_SetOne(char *addr, PyMemberDef *l, PyObject *v)
             return -1;
         break;
         }
-    case Py_T_ULONGLONG:{
-        unsigned long long value;
-        /* ??? PyLong_AsLongLong accepts an int, but PyLong_AsUnsignedLongLong
-           doesn't ??? */
-        if (PyLong_Check(v))
-            *(unsigned long long*)addr = value = PyLong_AsUnsignedLongLong(v);
-        else
-            *(unsigned long long*)addr = value = PyLong_AsLong(v);
-        if ((value == (unsigned long long)-1) && PyErr_Occurred())
+    case Py_T_ULONGLONG: {
+        v = _PyNumber_Index(v);
+        if (v == NULL) {
             return -1;
-        break;
         }
+        if (_PyLong_IsNegative((PyLongObject *)v)) {
+            long long_val = PyLong_AsLong(v);
+            Py_DECREF(v);
+            if (long_val == -1 && PyErr_Occurred()) {
+                return -1;
+            }
+            *(unsigned long long *)addr = (unsigned long long)(long long)long_val;
+            WARN("Writing negative value into unsigned field");
+        }
+        else {
+            unsigned long long ulonglong_val = PyLong_AsUnsignedLongLong(v);
+            Py_DECREF(v);
+            if (ulonglong_val == (unsigned long long)-1 && PyErr_Occurred()) {
+                return -1;
+            }
+            *(unsigned long long*)addr = ulonglong_val;
+        }
+        break;
+    }
     default:
         PyErr_Format(PyExc_SystemError,
                      "bad memberdescr type for %s", l->name);
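The three rewritten PyMember_SetOne() branches (Py_T_UINT, Py_T_ULONG, Py_T_ULONGLONG) now normalize the value with _PyNumber_Index() first and keep the long-standing allowance for negative ints, emitting a warning whenever one is written into an unsigned field. As a reminder of where this code path is reached from, here is a minimal sketch of a member table exposing a C unsigned int; it is illustrative only, and the ThingObject / flags names are hypothetical.

```c
/* Illustrative only (not part of the patch); type and member names are
 * hypothetical. */
#include <Python.h>
#include <stddef.h>               // offsetof()

typedef struct {
    PyObject_HEAD
    unsigned int flags;           // written through PyMember_SetOne()
} ThingObject;

static PyMemberDef thing_members[] = {
    // Attribute assignment to "flags" goes through the Py_T_UINT branch
    // above: any object implementing __index__() is accepted, and negative
    // values are still allowed for backward compatibility, with a warning.
    {"flags", Py_T_UINT, offsetof(ThingObject, flags), 0,
     PyDoc_STR("bit flags stored as a C unsigned int")},
    {NULL}  // sentinel
};
```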
diff --git a/Python/sysmodule.c b/Python/sysmodule.c
index 437d7f8dfc4958..69b6d886ccc3e9 100644
--- a/Python/sysmodule.c
+++ b/Python/sysmodule.c
@@ -2127,6 +2127,22 @@ sys__clear_type_cache_impl(PyObject *module)
     Py_RETURN_NONE;
 }
 
+/*[clinic input]
+sys._clear_internal_caches
+
+Clear all internal performance-related caches.
+[clinic start generated code]*/
+
+static PyObject *
+sys__clear_internal_caches_impl(PyObject *module)
+/*[clinic end generated code: output=0ee128670a4966d6 input=253e741ca744f6e8]*/
+{
+    PyInterpreterState *interp = _PyInterpreterState_GET();
+    _Py_Executors_InvalidateAll(interp);
+    PyType_ClearCache();
+    Py_RETURN_NONE;
+}
+
 /* Note that, for now, we do not have a per-interpreter equivalent
    for sys.is_finalizing(). */
 
@@ -2461,6 +2477,7 @@ static PyMethodDef sys_methods[] = {
     {"audit", _PyCFunction_CAST(sys_audit), METH_FASTCALL, audit_doc },
     {"breakpointhook", _PyCFunction_CAST(sys_breakpointhook),
      METH_FASTCALL | METH_KEYWORDS, breakpointhook_doc},
+    SYS__CLEAR_INTERNAL_CACHES_METHODDEF
     SYS__CLEAR_TYPE_CACHE_METHODDEF
     SYS__CURRENT_FRAMES_METHODDEF
     SYS__CURRENT_EXCEPTIONS_METHODDEF
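The new sys._clear_internal_caches() entry point invalidates the tier-2 executors and clears the type cache in one call. A minimal, hypothetical embedding snippet (not part of the patch) that reaches it through the public C API could look like this:

```c
/* Hypothetical embedding snippet (not part of the patch); assumes the
 * interpreter is already initialized. */
#include <Python.h>

static int
clear_internal_caches(void)
{
    PyObject *sys = PyImport_ImportModule("sys");
    if (sys == NULL) {
        return -1;
    }
    // Equivalent to running "sys._clear_internal_caches()" in Python.
    PyObject *res = PyObject_CallMethod(sys, "_clear_internal_caches", NULL);
    Py_DECREF(sys);
    if (res == NULL) {
        return -1;
    }
    Py_DECREF(res);  // the call returns None
    return 0;
}
```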
diff --git a/README.rst b/README.rst
index fbfae16a7dbb0b..1145fd43755840 100644
--- a/README.rst
+++ b/README.rst
@@ -161,15 +161,6 @@ For information about building Python's documentation, refer to
 `Doc/README.rst `_.
 
 
-Converting From Python 2.x to 3.x
----------------------------------
-
-Significant backward incompatible changes were made for the release of Python
-3.0, which may cause programs written for Python 2 to fail when run with Python
-3. For more information about porting your code from Python 2 to Python 3, see
-the `Porting HOWTO `_.
-
-
 Testing
 -------
 
diff --git a/Tools/scripts/summarize_stats.py b/Tools/scripts/summarize_stats.py
index 9b7e7b999ea7c7..7891b9cf923d33 100644
--- a/Tools/scripts/summarize_stats.py
+++ b/Tools/scripts/summarize_stats.py
@@ -415,7 +415,7 @@ def get_histogram(self, prefix: str) -> list[tuple[int, int]]:
     def get_rare_events(self) -> list[tuple[str, int]]:
         prefix = "Rare event "
         return [
-            (key[len(prefix) + 1:-1], val)
+            (key[len(prefix) + 1:-1].replace("_", " "), val)
             for key, val in self._data.items()
             if key.startswith(prefix)
         ]
diff --git a/configure b/configure
index 0375565c294552..705a778cafced3 100755
--- a/configure
+++ b/configure
@@ -6805,6 +6805,8 @@ case $host/$ac_cv_cc_name in #(
   aarch64-*-linux-gnu/clang) :
     PY_SUPPORT_TIER=2 ;; #(
   powerpc64le-*-linux-gnu/gcc) :
+    PY_SUPPORT_TIER=2 ;; #(
+  wasm32-unknown-wasi/clang) :
     PY_SUPPORT_TIER=2 ;; #(
   x86_64-*-linux-gnu/clang) :
     PY_SUPPORT_TIER=2 ;; #(
@@ -6817,10 +6819,6 @@ case $host/$ac_cv_cc_name in #(
     PY_SUPPORT_TIER=3 ;; #(
   s390x-*-linux-gnu/gcc) :
     PY_SUPPORT_TIER=3 ;; #(
-  wasm32-unknown-emscripten/clang) :
-    PY_SUPPORT_TIER=3 ;; #(
-  wasm32-unknown-wasi/clang) :
-    PY_SUPPORT_TIER=3 ;; #(
   x86_64-*-freebsd*/clang) :
     PY_SUPPORT_TIER=3 ;; #(
   *) :
diff --git a/configure.ac b/configure.ac
index e121e893a1d0d9..dee7ed552b370f 100644
--- a/configure.ac
+++ b/configure.ac
@@ -973,14 +973,13 @@ AS_CASE([$host/$ac_cv_cc_name],
   [aarch64-*-linux-gnu/gcc], [PY_SUPPORT_TIER=2], dnl Linux ARM64, glibc, gcc+clang
   [aarch64-*-linux-gnu/clang], [PY_SUPPORT_TIER=2],
   [powerpc64le-*-linux-gnu/gcc], [PY_SUPPORT_TIER=2], dnl Linux on PPC64 little endian, glibc, gcc
+  [wasm32-unknown-wasi/clang], [PY_SUPPORT_TIER=2], dnl WebAssembly System Interface, clang
   [x86_64-*-linux-gnu/clang], [PY_SUPPORT_TIER=2], dnl Linux on AMD64, any vendor, glibc, clang
 
   [aarch64-pc-windows-msvc/msvc], [PY_SUPPORT_TIER=3], dnl Windows ARM64, MSVC
   [armv7l-*-linux-gnueabihf/gcc], [PY_SUPPORT_TIER=3], dnl ARMv7 LE with hardware floats, any vendor, glibc, gcc
   [powerpc64le-*-linux-gnu/clang], [PY_SUPPORT_TIER=3], dnl Linux on PPC64 little endian, glibc, clang
   [s390x-*-linux-gnu/gcc], [PY_SUPPORT_TIER=3], dnl Linux on 64bit s390x (big endian), glibc, gcc
-  [wasm32-unknown-emscripten/clang], [PY_SUPPORT_TIER=3], dnl WebAssembly Emscripten
-  [wasm32-unknown-wasi/clang], [PY_SUPPORT_TIER=3], dnl WebAssembly System Interface
   [x86_64-*-freebsd*/clang], [PY_SUPPORT_TIER=3], dnl FreeBSD on AMD64
 
   [PY_SUPPORT_TIER=0]
 )