
Commit

Merge branch '1.5.x' of https://github.com/pandas-dev/pandas into 1.5.x
rhshadrach committed Nov 11, 2022
2 parents 3f6b0e4 + c9252cf commit 1b8e4db
Showing 33 changed files with 454 additions and 59 deletions.
2 changes: 0 additions & 2 deletions .github/workflows/32-bit-linux.yml
@@ -5,12 +5,10 @@ on:
branches:
- main
- 1.5.x
- 1.4.x
pull_request:
branches:
- main
- 1.5.x
- 1.4.x
paths-ignore:
- "doc/**"

2 changes: 0 additions & 2 deletions .github/workflows/code-checks.yml
@@ -5,12 +5,10 @@ on:
branches:
- main
- 1.5.x
- 1.4.x
pull_request:
branches:
- main
- 1.5.x
- 1.4.x

env:
ENV_FILE: environment.yml
10 changes: 4 additions & 6 deletions .github/workflows/docbuild-and-upload.yml
@@ -5,14 +5,12 @@ on:
branches:
- main
- 1.5.x
- 1.4.x
tags:
- '*'
pull_request:
branches:
- main
- 1.5.x
- 1.4.x

env:
ENV_FILE: environment.yml
@@ -66,22 +64,22 @@ jobs:
mkdir -m 700 -p ~/.ssh
echo "${{ secrets.server_ssh_key }}" > ~/.ssh/id_rsa
chmod 600 ~/.ssh/id_rsa
echo "${{ secrets.server_ip }} ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBE1Kkopomm7FHG5enATf7SgnpICZ4W2bw+Ho+afqin+w7sMcrsa0je7sbztFAV8YchDkiBKnWTG4cRT+KZgZCaY=" > ~/.ssh/known_hosts
echo "${{ secrets.server_ip }} ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFjYkJBk7sos+r7yATODogQc3jUdW1aascGpyOD4bohj8dWjzwLJv/OJ/fyOQ5lmj81WKDk67tGtqNJYGL9acII=" > ~/.ssh/known_hosts
if: github.event_name == 'push' && (github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/tags/'))

- name: Copy cheatsheets into site directory
run: cp doc/cheatsheet/Pandas_Cheat_Sheet* web/build/

- name: Upload web
run: rsync -az --delete --exclude='pandas-docs' --exclude='docs' web/build/ docs@${{ secrets.server_ip }}:/usr/share/nginx/pandas
run: rsync -az --delete --exclude='pandas-docs' --exclude='docs' web/build/ web@${{ secrets.server_ip }}:/var/www/html
if: github.event_name == 'push' && github.ref == 'refs/heads/main'

- name: Upload dev docs
run: rsync -az --delete doc/build/html/ docs@${{ secrets.server_ip }}:/usr/share/nginx/pandas/pandas-docs/dev
run: rsync -az --delete doc/build/html/ web@${{ secrets.server_ip }}:/var/www/html/pandas-docs/dev
if: github.event_name == 'push' && github.ref == 'refs/heads/main'

- name: Upload prod docs
run: rsync -az --delete doc/build/html/ docs@${{ secrets.server_ip }}:/usr/share/nginx/pandas/pandas-docs/version/${GITHUB_REF_NAME:1}
run: rsync -az --delete doc/build/html/ web@${{ secrets.server_ip }}:/var/www/html/pandas-docs/version/${GITHUB_REF_NAME:1}
if: github.event_name == 'push' && startsWith(github.ref, 'refs/tags/')

- name: Move docs into site directory
2 changes: 0 additions & 2 deletions .github/workflows/macos-windows.yml
@@ -5,12 +5,10 @@ on:
branches:
- main
- 1.5.x
- 1.4.x
pull_request:
branches:
- main
- 1.5.x
- 1.4.x
paths-ignore:
- "doc/**"

2 changes: 0 additions & 2 deletions .github/workflows/python-dev.yml
@@ -25,12 +25,10 @@ on:
branches:
- main
- 1.5.x
- 1.4.x
pull_request:
branches:
- main
- 1.5.x
- 1.4.x
paths-ignore:
- "doc/**"

8 changes: 4 additions & 4 deletions .github/workflows/sdist.yml
@@ -5,12 +5,10 @@ on:
branches:
- main
- 1.5.x
- 1.4.x
pull_request:
branches:
- main
- 1.5.x
- 1.4.x
types: [labeled, opened, synchronize, reopened]
paths-ignore:
- "doc/**"
@@ -30,7 +28,7 @@ jobs:
strategy:
fail-fast: false
matrix:
python-version: ["3.8", "3.9", "3.10"]
python-version: ["3.8", "3.9", "3.10", "3.11"]
concurrency:
# https://github.community/t/concurrecy-not-work-for-push/183068/7
group: ${{ github.event_name == 'push' && github.run_number || github.ref }}-${{matrix.python-version}}-sdist
@@ -42,7 +40,7 @@ jobs:
fetch-depth: 0

- name: Set up Python
uses: actions/setup-python@v3
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}

@@ -86,6 +84,8 @@ jobs:
pip install numpy==1.20.3 ;;
3.10)
pip install numpy==1.21.2 ;;
3.11)
pip install numpy==1.23.2 ;;
esac
- name: Import pandas
2 changes: 0 additions & 2 deletions .github/workflows/ubuntu.yml
@@ -5,12 +5,10 @@ on:
branches:
- main
- 1.5.x
- 1.4.x
pull_request:
branches:
- main
- 1.5.x
- 1.4.x
paths-ignore:
- "doc/**"

2 changes: 1 addition & 1 deletion doc/source/development/contributing_environment.rst
@@ -10,7 +10,7 @@ To test out code changes, you'll need to build pandas from source, which
requires a C/C++ compiler and Python environment. If you're making documentation
changes, you can skip to :ref:`contributing to the documentation <contributing_documentation>` but if you skip
creating the development environment you won't be able to build the documentation
locally before pushing your changes.
locally before pushing your changes. It's recommended to also install the :ref:`pre-commit hooks <contributing.pre-commit>`.

.. contents:: Table of contents:
:local:
2 changes: 1 addition & 1 deletion doc/source/getting_started/install.rst
@@ -20,7 +20,7 @@ Instructions for installing from source,
Python version support
----------------------

Officially Python 3.8, 3.9 and 3.10.
Officially Python 3.8, 3.9, 3.10 and 3.11.

Installing pandas
-----------------
2 changes: 1 addition & 1 deletion doc/source/whatsnew/v0.13.0.rst
@@ -733,7 +733,7 @@ Enhancements
.. _scipy: http://www.scipy.org
.. _documentation: http://docs.scipy.org/doc/scipy/reference/interpolate.html#univariate-interpolation
.. _guide: http://docs.scipy.org/doc/scipy/reference/tutorial/interpolate.html
.. _guide: https://docs.scipy.org/doc/scipy/tutorial/interpolate.html

- ``to_csv`` now takes a ``date_format`` keyword argument that specifies how
output datetime objects should be formatted. Datetimes encountered in the
8 changes: 6 additions & 2 deletions doc/source/whatsnew/v1.5.2.rst
@@ -13,15 +13,19 @@ including other versions of pandas.

Fixed regressions
~~~~~~~~~~~~~~~~~
-
- Fixed regression in :meth:`MultiIndex.join` for extension array dtypes (:issue:`49277`)
- Fixed regression in :meth:`Series.replace` raising ``RecursionError`` with numeric dtype and when specifying ``value=None`` (:issue:`45725`)
- Fixed regression in :meth:`DataFrame.plot` preventing :class:`~matplotlib.colors.Colormap` instance
from being passed using the ``colormap`` argument if Matplotlib 3.6+ is used (:issue:`49374`)
- Fixed regression in :func:`date_range` returning an invalid set of periods for ``CustomBusinessDay`` frequency and ``start`` date with timezone (:issue:`49441`)
-

.. ---------------------------------------------------------------------------
.. _whatsnew_152.bug_fixes:

Bug fixes
~~~~~~~~~
-
- Bug in the Copy-on-Write implementation losing track of views in certain chained indexing cases (:issue:`48996`)
-

.. ---------------------------------------------------------------------------
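A quick illustration of the plotting regression noted above (GH 49374): a Colormap instance passed through ``colormap`` should work again on Matplotlib 3.6+. A minimal sketch, assuming Matplotlib 3.5+ (for the ``matplotlib.colormaps`` registry) and a working plotting backend:

    import matplotlib
    import pandas as pd

    df = pd.DataFrame({"a": [1, 2, 3], "b": [3, 2, 1]})

    # Passing a Colormap *instance* (not a string name) via ``colormap`` was
    # the case that broke under Matplotlib 3.6+ before this fix.
    ax = df.plot(colormap=matplotlib.colormaps["viridis"])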
12 changes: 9 additions & 3 deletions pandas/_libs/internals.pyx
@@ -676,8 +676,9 @@ cdef class BlockManager:
public bint _known_consolidated, _is_consolidated
public ndarray _blknos, _blklocs
public list refs
public object parent

def __cinit__(self, blocks=None, axes=None, refs=None, verify_integrity=True):
def __cinit__(self, blocks=None, axes=None, refs=None, parent=None, verify_integrity=True):
# None as defaults for unpickling GH#42345
if blocks is None:
# This adds 1-2 microseconds to DataFrame(np.array([]))
@@ -690,6 +691,7 @@
self.blocks = blocks
self.axes = axes.copy() # copy to make sure we are not remotely-mutable
self.refs = refs
self.parent = parent

# Populate known_consolidate, blknos, and blklocs lazily
self._known_consolidated = False
@@ -805,7 +807,9 @@ cdef class BlockManager:
nrefs.append(weakref.ref(blk))

new_axes = [self.axes[0], self.axes[1]._getitem_slice(slobj)]
mgr = type(self)(tuple(nbs), new_axes, nrefs, verify_integrity=False)
mgr = type(self)(
tuple(nbs), new_axes, nrefs, parent=self, verify_integrity=False
)

# We can avoid having to rebuild blklocs/blknos
blklocs = self._blklocs
@@ -827,4 +831,6 @@
new_axes = list(self.axes)
new_axes[axis] = new_axes[axis]._getitem_slice(slobj)

return type(self)(tuple(new_blocks), new_axes, new_refs, verify_integrity=False)
return type(self)(
tuple(new_blocks), new_axes, new_refs, parent=self, verify_integrity=False
)
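The new ``parent`` attribute lets a sliced BlockManager remember the manager it was sliced from, which is the bookkeeping Copy-on-Write needs for chained views (see the GH 48996 note in the v1.5.2 whatsnew above). A minimal sketch of the behaviour this supports, assuming the experimental ``mode.copy_on_write`` option from pandas 1.5; the exact reproducer in GH 48996 may differ:

    import pandas as pd

    pd.set_option("mode.copy_on_write", True)

    df = pd.DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})
    view = df[:2]       # slice of df: its manager records df's manager as parent
    subview = view[:1]  # view of a view: the parent chain keeps the link to df

    # Under Copy-on-Write, writing into the chained view should trigger a
    # copy rather than silently mutating ``view`` or ``df``.
    subview.iloc[0, 0] = 100
    assert df.iloc[0, 0] == 1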
4 changes: 3 additions & 1 deletion pandas/_libs/tslibs/offsets.pyx
@@ -258,7 +258,9 @@ cdef _to_dt64D(dt):
if getattr(dt, 'tzinfo', None) is not None:
# Get the nanosecond timestamp,
# equiv `Timestamp(dt).value` or `dt.timestamp() * 10**9`
naive = dt.astimezone(None)
# The `naive` must be the `dt` naive wall time
# instead of the naive absolute time (GH#49441)
naive = dt.replace(tzinfo=None)
dt = np.datetime64(naive, "D")
else:
dt = np.datetime64(dt)
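A small standard-library illustration of the wall-time versus absolute-time distinction the new comment describes (a sketch, not the pandas code path itself):

    from datetime import datetime, timedelta, timezone

    tz = timezone(timedelta(hours=-5))
    dt = datetime(2022, 11, 11, 23, 30, tzinfo=tz)

    # Wall time: drop tzinfo but keep the local clock reading -> still Nov 11.
    wall = dt.replace(tzinfo=None)                               # 2022-11-11 23:30
    # Absolute time: convert to UTC first -> the date rolls over to Nov 12.
    absolute = dt.astimezone(timezone.utc).replace(tzinfo=None)  # 2022-11-12 04:30

Truncating the absolute reading to "D" can land on the wrong calendar day, which is what produced the invalid set of periods for ``CustomBusinessDay`` ranges in GH 49441.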
8 changes: 8 additions & 0 deletions pandas/core/frame.py
@@ -86,6 +86,7 @@
function as nv,
np_percentile_argname,
)
from pandas.errors import InvalidIndexError
from pandas.util._decorators import (
Appender,
Substitution,
@@ -4220,6 +4221,13 @@ def _set_value(
self.loc[index, col] = value
self._item_cache.pop(col, None)

except InvalidIndexError as ii_err:
# GH48729: Seems like you are trying to assign a value to a
# row when only scalar options are permitted
raise InvalidIndexError(
f"You can only assign a scalar value not a {type(value)}"
) from ii_err

def _ensure_valid_index(self, value) -> None:
"""
Ensure that if we don't have an index, that we can create one from the
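The re-raise above keeps the original ``InvalidIndexError`` as the cause while surfacing a clearer message. A minimal sketch with a hypothetical trigger (the exact reproducer in GH 48729 may differ):

    import pandas as pd
    from pandas.errors import InvalidIndexError

    df = pd.DataFrame({"a": [1, 2], "b": [3, 4]})

    try:
        # Hypothetical trigger: pushing a non-scalar through the scalar
        # fast path used by ``.at`` is the kind of case the message targets.
        df.at[0, "a"] = [10, 20]
    except InvalidIndexError as err:
        # ``raise ... from ii_err`` preserves the original error as __cause__.
        print(err, "| caused by:", type(err.__cause__).__name__)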
6 changes: 4 additions & 2 deletions pandas/core/indexes/base.py
@@ -4701,8 +4701,10 @@ def join(
return self._join_non_unique(other, how=how)
elif not self.is_unique or not other.is_unique:
if self.is_monotonic_increasing and other.is_monotonic_increasing:
if self._can_use_libjoin:
if not is_interval_dtype(self.dtype):
# otherwise we will fall through to _join_via_get_indexer
# GH#39133
# go through object dtype for ea till engine is supported properly
return self._join_monotonic(other, how=how)
else:
return self._join_non_unique(other, how=how)
@@ -5079,7 +5081,7 @@ def _wrap_joined_index(self: _IndexT, joined: ArrayLike, other: _IndexT) -> _Ind
return self._constructor(joined, name=name) # type: ignore[return-value]
else:
name = get_op_result_name(self, other)
return self._constructor._with_infer(joined, name=name)
return self._constructor._with_infer(joined, name=name, dtype=self.dtype)

@cache_readonly
def _can_use_libjoin(self) -> bool:
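Passing ``dtype=self.dtype`` into ``_with_infer`` keeps extension dtypes intact when wrapping the joined result; this is the flat-Index side of the MultiIndex.join regression noted in the v1.5.2 whatsnew above (GH 49277). A minimal sketch, assuming nullable ``Int64`` support; the issue's own reproducer uses a MultiIndex:

    import pandas as pd

    left = pd.Index([1, 2, 3], dtype="Int64")
    right = pd.Index([2, 3, 4], dtype="Int64")

    joined = left.join(right, how="inner")

    # With the dtype threaded through _wrap_joined_index, the extension
    # dtype should survive the join instead of degrading.
    print(joined.dtype)  # expected: Int64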
6 changes: 4 additions & 2 deletions pandas/core/internals/blocks.py
@@ -569,7 +569,6 @@ def replace(
# Note: the checks we do in NDFrame.replace ensure we never get
# here with listlike to_replace or value, as those cases
# go through replace_list

values = self.values

if isinstance(values, Categorical):
@@ -608,7 +607,10 @@
return blocks

elif self.ndim == 1 or self.shape[0] == 1:
blk = self.coerce_to_target_dtype(value)
if value is None:
blk = self.astype(np.dtype(object))
else:
blk = self.coerce_to_target_dtype(value)
return blk.replace(
to_replace=to_replace,
value=value,
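Casting to object before replacing when ``value is None`` avoids the coercion loop behind the ``RecursionError`` noted in the v1.5.2 whatsnew above (GH 45725). A minimal sketch; the issue's own reproducer may differ slightly:

    import pandas as pd

    s = pd.Series([1, 2, 3])

    # Explicitly passing ``value=None`` on a numeric Series previously
    # recursed while coercing the block; the fix casts to object first.
    out = s.replace(to_replace=1, value=None)
    print(out)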
