This happens while checking several different files. Only one report is included here, since they all show the same error.
Configuration
No response
Command used
pylint ert ert3
Pylint output
Exception on node <Name.record l.406 at 0x7f764e8a9040> in file '/home/runner/work/ert/ert/ert/storage/_storage.py'
Can't write the issue template for the crash in /home/runner/.cache/pylint/pylint-crash-2022-04-20-08.txt because of: '[Errno 2] No such file or directory: '/home/runner/.cache/pylint/pylint-crash-2022-04-20-08.txt''
Here's the content anyway:
First, please verify that the bug is not already filled:
https://github.com/PyCQA/pylint/issues/

Then create a new crash issue:
https://github.com/PyCQA/pylint/issues/new?assignees=&labels=crash%2Cneeds+triage&template=BUG-REPORT.yml


Issue title:
Crash ``'Values' object has no attribute 'additional_builtins'`` (if possible, be more specific about what made pylint crash)
Content:
When parsing the following file:

<!--
If sharing the code is not an option, please state so,
but providing only the stacktrace would still be helpful.
-->

import io
import logging
import json
from functools import partial
from http import HTTPStatus
from typing import (
    Any,
    Awaitable,
    Dict,
    Iterable,
    List,
    Optional,
    Set,
    Tuple,
    Union,
)
import httpx
import pandas as pd
import ert
from ert_shared.async_utils import get_event_loop
from ert_shared.services import Storage

logger = logging.getLogger(__name__)
read_csv = partial(pd.read_csv, index_col=0, float_precision="round_trip")

_ENSEMBLE_RECORDS = "__ensemble_records__"
_SPECIAL_KEYS = (_ENSEMBLE_RECORDS,)

# Character used as separator for parameter record names. This is used as a
# workaround for webviz-ert, which expects each parameter record to have exactly
# one value per realisation.
_PARAMETER_RECORD_SEPARATOR = "."
_OCTET_STREAM = "application/octet-stream"
_CSV = "text/csv"


class StorageRecordTransmitter(ert.data.RecordTransmitter):
    def __init__(self, name: str, storage_url: str, iens: Optional[int] = None):
        super().__init__(ert.data.RecordTransmitterType.ert_storage)
        self._name: str = name
        self._uri = f"{storage_url}/{name}"
        self._real_id: Optional[int] = iens

    async def _get_recordtree_transmitters(
        self,
        trans_records: Dict[str, str],
        record_type: ert.data.RecordType,
        path: Optional[str] = None,
    ) -> Dict[str, ert.data.RecordTransmitter]:
        _storage_url = self._uri[: self._uri.rfind("/")]
        transmitters: Dict[str, ert.data.RecordTransmitter] = {}
        for record_path, record_uri in trans_records.items():
            if path is None or path in record_path:
                record_name = record_path.split("/")[-1]
                transmitter = StorageRecordTransmitter(record_name, _storage_url)
                transmitter.set_transmitted(record_uri, record_type)
                transmitters[record_path] = transmitter
        return transmitters

    @property
    def uri(self) -> str:
        if not self.is_transmitted():
            raise RuntimeError(f"Record {self._name} not transmitted")
        return self._uri

    @property
    def record_type(self) -> Optional[ert.data.RecordType]:
        assert self.is_transmitted()
        return self._record_type

    def set_transmitted(self, uri: str, record_type: ert.data.RecordType) -> None:
        self._set_transmitted_state(uri, record_type)

    async def _transmit_numerical_record(self, record: ert.data.NumericalRecord) -> str:
        url = f"{self._uri}/matrix"
        if self._real_id is not None:
            url = f"{url}?realization_index={self._real_id}"
            self._uri = f"{self._uri}?realization_index={self._real_id}"
        await add_record(url, record)
        return self._uri

    async def _transmit_blob_record(self, record: ert.data.BlobRecord) -> str:
        url = f"{self._uri}/file"
        if self._real_id is not None:
            url = f"{url}?realization_index={self._real_id}"
            self._uri = f"{self._uri}?realization_index={self._real_id}"
        await add_record(url, record)
        return self._uri

    async def _transmit_recordtree(
        self, record: Union[ert.data.NumericalRecordTree, ert.data.BlobRecordTree]
    ) -> str:
        data: Dict[str, str] = {}
        storage_url = self._uri[: self._uri.rfind("/")]
        for record_path in record.flat_record_dict:
            record_name = record_path.split("/")[-1]
            transmitter = StorageRecordTransmitter(
                record_name, storage_url, iens=self._real_id
            )
            await transmitter.transmit_record(record.flat_record_dict[record_path])
            data[record_path] = transmitter._uri
        await self._transmit_blob_record(
            ert.data.BlobRecord(data=json.dumps(data).encode("utf-8"))
        )
        # Since metadata is stored only for the record with real_id == 0, within
        # async processing we make sure that only that realization writes the metadata
        if self._real_id == 0:
            await add_record_metadata(
                storage_url, self._name, {"record_type": record.record_type}
            )
        return self._uri

    async def _load_numerical_record(self) -> ert.data.NumericalRecord:
        assert self._record_type
        record = await load_record(self._uri, self._record_type)
        if not isinstance(record, ert.data.NumericalRecord):
            raise TypeError(f"unexpected blob record for numerical {self._uri}")
        return record

    async def _load_blob_record(self) -> ert.data.BlobRecord:
        assert self._record_type
        record = await load_record(self._uri, self._record_type)
        if not isinstance(record, ert.data.BlobRecord):
            raise TypeError(f"unexpected numerical record for blob {self._uri}")
        return record


async def _get_record_storage_transmitters(
    records_url: str,
    record_name: str,
    record_source: Optional[str] = None,
    ensemble_size: Optional[int] = None,
) -> Tuple[List[StorageRecordTransmitter], ert.data.RecordCollectionType]:
    if record_source is None:
        record_source = record_name
    uri = f"{records_url}/{record_source}"
    metadata = await get_record_metadata(uri)
    record_type = metadata["record_type"]
    collection_type: ert.data.RecordCollectionType = metadata["collection_type"]
    uris = metadata["uris"]
    # We expect the number of uris in the record metadata to match the size of
    # the ensemble, or to be equal to 1 in the case of a uniform record
    if ensemble_size is not None:
        if collection_type == ert.data.RecordCollectionType.UNIFORM and len(uris) != 1:
            raise ert.exceptions.ErtError(
                "Ensemble is uniform but stores multiple records"
            )
        if (
            collection_type != ert.data.RecordCollectionType.UNIFORM
            and len(uris) != ensemble_size
        ):
            raise ert.exceptions.ErtError(
                f"Ensemble size {ensemble_size} does not match stored record ensemble "
                + f"for {record_name} of size {len(uris)}"
            )

    transmitters = []
    for record_uri in uris:
        transmitter = StorageRecordTransmitter(record_source, records_url)
        # Record data has already been stored; here we only set the transmitter
        # uri and record type
        transmitter.set_transmitted(record_uri, record_type)
        transmitters.append(transmitter)

    return transmitters, collection_type


async def get_record_storage_transmitters(
    records_url: str,
    record_name: str,
    record_source: Optional[str] = None,
    ensemble_size: Optional[int] = None,
) -> Dict[int, Dict[str, ert.data.RecordTransmitter]]:
    transmitters, collection_type = await _get_record_storage_transmitters(
        records_url, record_name, record_source, ensemble_size
    )
    if (
        ensemble_size is not None
        and collection_type == ert.data.RecordCollectionType.UNIFORM
    ):
        return {iens: {record_name: transmitters[0]} for iens in range(ensemble_size)}
    return {
        iens: {record_name: transmitter}
        for iens, transmitter in enumerate(transmitters)
    }


def _get(url: str, headers: Dict[str, Any]) -> httpx.Response:
    with Storage.session() as session:
        return session.get(url, headers=headers, timeout=60)


async def _get_from_server_async(
    url: str,
    headers: Dict[str, str],
    **kwargs: Any,
) -> httpx.Response:
    loop = get_event_loop()

    # Using sync code because one of the httpx dependencies (anyio) throws an
    # AttributeError: module 'anyio._backends._asyncio' has no attribute 'current_time'
    # Refactor and try to use aiohttp or httpx once the issue above is fixed
    future = loop.run_in_executor(
        None,
        partial(_get, url, headers),
    )
    resp = await future

    if resp.status_code != HTTPStatus.OK:
        logger.error("Failed to fetch from %s. Response: %s", url, resp.text)
        if resp.status_code == HTTPStatus.NOT_FOUND:
            raise ert.exceptions.ElementMissingError(resp.text)
        raise ert.exceptions.StorageError(resp.text)
    return resp

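The comment above points at the anyio issue that forces the synchronous client here. The offloading pattern itself can be sketched independently of `ert`; `blocking_get` below is a hypothetical stand-in for the synchronous `Storage.session()` call, used only to show the `run_in_executor` workaround:

```python
import asyncio
from functools import partial


def blocking_get(url: str) -> str:
    # Hypothetical stand-in for the synchronous HTTP call made via
    # Storage.session(); a real call would block the event loop.
    return f"response from {url}"


async def get_async(url: str) -> str:
    loop = asyncio.get_running_loop()
    # Same workaround as in the report: run the sync client in the default
    # thread-pool executor instead of using native async HTTP support.
    return await loop.run_in_executor(None, partial(blocking_get, url))


print(asyncio.run(get_async("/experiments")))
# response from /experiments
```

This keeps the event loop responsive while the blocking call runs in a worker thread, at the cost of one thread per in-flight request.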
def _post(url: str, headers: Dict[str, Any], **kwargs: Any) -> httpx.Response:
    with Storage.session() as session:
        return session.post(url=url, headers=headers, timeout=60, **kwargs)


async def _post_to_server_async(
    url: str,
    headers: Optional[Dict[str, str]] = None,
    **kwargs: Any,
) -> httpx.Response:
    if headers is None:
        headers = {}

    loop = get_event_loop()
    # Using sync code because one of the httpx dependencies (anyio) throws an
    # AttributeError: module 'anyio._backends._asyncio' has no attribute 'current_time'
    # Refactor and try to use aiohttp or httpx once the issue above is fixed
    future = loop.run_in_executor(
        None,
        partial(_post, url, headers, **kwargs),
    )
    resp = await future

    if resp.status_code != HTTPStatus.OK:
        logger.error("Failed to post to %s. Response: %s", url, resp.text)
        if resp.status_code == HTTPStatus.CONFLICT:
            raise ert.exceptions.ElementExistsError(resp.text)
        raise ert.exceptions.StorageError(resp.text)
    return resp


def _put(url: str, headers: Dict[str, Any], **kwargs: Any) -> httpx.Response:
    with Storage.session() as session:
        return session.put(url=url, headers=headers, timeout=60, **kwargs)


async def _put_to_server_async(
    url: str,
    headers: Dict[str, str],
    **kwargs: Any,
) -> httpx.Response:
    loop = get_event_loop()

    # Using sync code because one of the httpx dependencies (anyio) throws an
    # AttributeError: module 'anyio._backends._asyncio' has no attribute 'current_time'
    # Refactor and try to use aiohttp or httpx once the issue above is fixed
    future = loop.run_in_executor(
        None,
        partial(_put, url, headers, **kwargs),
    )
    resp = await future

    if resp.status_code != HTTPStatus.OK:
        logger.error("Failed to put to %s. Response: %s", url, resp.text)
        if resp.status_code == HTTPStatus.CONFLICT:
            raise ert.exceptions.ElementExistsError(resp.text)
        raise ert.exceptions.StorageError(resp.text)

    return resp


def _set_content_header(
    header: str,
    record_type: ert.data.RecordType,
    headers: Optional[Dict[str, str]] = None,
) -> Dict[str, str]:
    content_type = _OCTET_STREAM if record_type == ert.data.RecordType.BYTES else _CSV
    if headers is None:
        return {header: content_type}
    headers_ = headers.copy()
    headers_[header] = content_type
    return headers_


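Read in isolation, `_set_content_header` just maps the record type to a MIME type and returns a copy of the caller's headers. A standalone sketch of the same behaviour, substituting a minimal stand-in enum for `ert.data.RecordType` (an assumption — the real enum lives inside `ert`):

```python
from enum import Enum
from typing import Dict, Optional

_OCTET_STREAM = "application/octet-stream"
_CSV = "text/csv"


class RecordType(Enum):
    # Minimal stand-in for ert.data.RecordType; only these members are used.
    BYTES = "BYTES"
    LIST_FLOAT = "LIST_FLOAT"


def set_content_header(
    header: str,
    record_type: RecordType,
    headers: Optional[Dict[str, str]] = None,
) -> Dict[str, str]:
    # BYTES records travel as raw octet streams; everything else as CSV.
    content_type = _OCTET_STREAM if record_type == RecordType.BYTES else _CSV
    if headers is None:
        return {header: content_type}
    headers_ = headers.copy()  # copy so the caller's dict is not mutated
    headers_[header] = content_type
    return headers_


base = {"authorization": "token abc"}
print(set_content_header("accept", RecordType.BYTES, base))
# {'authorization': 'token abc', 'accept': 'application/octet-stream'}
```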
async def add_record(url: str, record: ert.data.Record) -> None:
    assert record.record_type
    if record.record_type != ert.data.RecordType.BYTES:
        headers = _set_content_header(
            header="content-type", record_type=record.record_type
        )
        data = pd.DataFrame([record.data]).to_csv().encode()
        await _post_to_server_async(url=url, headers=headers, data=data)
    else:
        assert isinstance(record.data, bytes)
        data = {"file": io.BytesIO(record.data)}
        await _post_to_server_async(url=url, files=data)


def _interpret_series(row: pd.Series, record_type: ert.data.RecordType) -> Any:
    if record_type not in {item.value for item in ert.data.RecordType}:
        raise ValueError(
            f"Unexpected record type when loading numerical record: {record_type}"
        )

    if record_type == ert.data.RecordType.MAPPING_INT_FLOAT:
        return {int(k): v for k, v in row.to_dict().items()}
    return (
        row.to_list()
        if record_type == ert.data.RecordType.LIST_FLOAT
        else row.to_dict()
    )

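For reviewers without the full `ert` package, `_interpret_series` can be exercised on its own. The sketch below substitutes a minimal str-valued stand-in for `ert.data.RecordType` (an assumption; a plain `Enum` member would fail the `item.value` membership check in the original):

```python
from enum import Enum

import pandas as pd


class RecordType(str, Enum):
    # Minimal str-valued stand-in for ert.data.RecordType, so that the
    # membership check against item.value behaves as in the original.
    LIST_FLOAT = "LIST_FLOAT"
    MAPPING_STR_FLOAT = "MAPPING_STR_FLOAT"
    MAPPING_INT_FLOAT = "MAPPING_INT_FLOAT"


def interpret_series(row: pd.Series, record_type: RecordType):
    if record_type not in {item.value for item in RecordType}:
        raise ValueError(
            f"Unexpected record type when loading numerical record: {record_type}"
        )
    if record_type == RecordType.MAPPING_INT_FLOAT:
        # CSV column labels come back as strings; restore integer keys.
        return {int(k): v for k, v in row.to_dict().items()}
    return (
        row.to_list()
        if record_type == RecordType.LIST_FLOAT
        else row.to_dict()
    )


row = pd.Series({"0": 1.0, "1": 2.5})
print(interpret_series(row, RecordType.LIST_FLOAT))         # [1.0, 2.5]
print(interpret_series(row, RecordType.MAPPING_INT_FLOAT))  # {0: 1.0, 1: 2.5}
```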

async def load_record(url: str, record_type: ert.data.RecordType) -> ert.data.Record:
    if record_type in (
        ert.data.RecordType.NUMERICAL_TREE,
        ert.data.RecordType.BLOB_TREE,
    ):
        headers = _set_content_header(
            header="accept", record_type=ert.data.RecordType.BYTES
        )
    else:
        headers = _set_content_header(header="accept", record_type=record_type)
    response = await _get_from_server_async(url=url, headers=headers)
    content = response.content
    if record_type in (
        ert.data.RecordType.LIST_FLOAT,
        ert.data.RecordType.MAPPING_STR_FLOAT,
        ert.data.RecordType.MAPPING_INT_FLOAT,
    ):
        dataframe: pd.DataFrame = read_csv(io.BytesIO(content))
        for _, row in dataframe.iterrows():  # pylint: disable=no-member
            return ert.data.NumericalRecord(
                data=_interpret_series(row=row, record_type=record_type)
            )
    return ert.data.BlobRecord(data=content)


async def get_record_metadata(record_url: str) -> Dict[Any, Any]:
    # TODO: once storage returns proper record metadata information, add proper
    # support for metadata
    url = f"{record_url}/userdata?realization_index=0"
    resp = await _get_from_server_async(url, {})
    ret: Dict[Any, Any] = resp.json()
    return ret


async def add_record_metadata(
    record_urls: str, record_name: str, metadata: Dict[Any, Any]
) -> None:
    url = f"{record_urls}/{record_name}/userdata?realization_index=0"
    await _put_to_server_async(url, {}, json=metadata)


async def transmit_awaitable_record_collection(
    record_awaitable: Awaitable[ert.data.RecordCollection],
    record_name: str,
    workspace_name: str,
    experiment_name: Optional[str] = None,
) -> Dict[int, Dict[str, ert.data.RecordTransmitter]]:
    record_coll = await record_awaitable
    return await transmit_record_collection(
        record_coll, record_name, workspace_name, experiment_name
    )


async def transmit_record_collection(
    record_coll: ert.data.RecordCollection,
    record_name: str,
    workspace_name: str,
    experiment_name: Optional[str] = None,
) -> Dict[int, Dict[str, ert.data.RecordTransmitter]]:
    record: ert.data.Record
    metadata: Dict[Any, Any] = {
        "record_type": record_coll.record_type,
        "collection_type": record_coll.collection_type,
        "uris": [],
    }

    records_url = await get_records_url_async(workspace_name, experiment_name)
    if experiment_name is not None:
        ensemble_id = await _get_ensemble_id_async(workspace_name, experiment_name)
        ensemble_size = await _get_ensemble_size(ensemble_id=ensemble_id)
    else:
        ensemble_size = len(record_coll)

    if len(record_coll) != ensemble_size:
        raise ert.exceptions.ErtError(
            f"Experiment ensemble size {ensemble_size} does not match"
            f" data size {len(record_coll)}"
        )

    # Handle the special case of a uniform record collection
    if record_coll.collection_type == ert.data.RecordCollectionType.UNIFORM:
        record = record_coll.records[0]
        transmitter = ert.storage.StorageRecordTransmitter(
            name=record_name, storage_url=records_url, iens=0
        )
        await transmitter.transmit_record(record)
        metadata["uris"].append(transmitter.uri)
        await add_record_metadata(records_url, record_name, metadata)
        return {iens: {record_name: transmitter} for iens in range(ensemble_size)}

    transmitters: Dict[int, Dict[str, ert.data.RecordTransmitter]] = {}
    transmitter_list = []
    for iens, record in enumerate(record_coll.records):
        transmitter = StorageRecordTransmitter(record_name, records_url, iens=iens)
        await transmitter.transmit_record(record)
        transmitter_list.append(transmitter)
        transmitters[iens] = {record_name: transmitter}

    for transmitter in transmitter_list:
        metadata["uris"].append(transmitter.uri)
    await add_record_metadata(records_url, record_name, metadata)

    return transmitters


def _get_from_server(
    path: str,
    headers: Optional[Dict[Any, Any]] = None,
    status_code: int = 200,
    **kwargs: Any,
) -> httpx.Response:
    if not headers:
        headers = {}

    with Storage.session() as session:
        resp = session.get(path, headers=headers, timeout=60, **kwargs)
    if resp.status_code != status_code:
        logger.error("Failed to fetch from %s. Response: %s", path, resp.text)

    return resp


def get_records_url(workspace_name: str, experiment_name: Optional[str] = None) -> str:
    if experiment_name is None:
        experiment_name = f"{workspace_name}.{_ENSEMBLE_RECORDS}"
    experiment = _get_experiment_by_name(experiment_name)
    if experiment is None:
        raise ert.exceptions.NonExistentExperiment(
            f"Experiment {experiment_name} does not exist"
        )

    ensemble_id = experiment["ensemble_ids"][0]  # currently just one ens per exp
    return f"/ensembles/{ensemble_id}/records"


async def get_records_url_async(
    workspace_name: str, experiment_name: Optional[str] = None
) -> str:
    ensemble_id = await _get_ensemble_id_async(workspace_name, experiment_name)
    return f"/ensembles/{ensemble_id}/records"


def _delete_on_server(
    path: str, headers: Optional[Dict[Any, Any]] = None, status_code: int = 200
) -> httpx.Response:
    if not headers:
        headers = {}
    with Storage.session() as session:
        resp = session.delete(
            path,
            headers=headers,
            timeout=60,
        )
    if resp.status_code != status_code:
        logger.error("Failed to delete %s. Response: %s", path, resp.text)

    return resp


def _post_to_server(
    path: str,
    headers: Optional[Dict[Any, Any]] = None,
    status_code: int = 200,
    **kwargs: Any,
) -> httpx.Response:
    if not headers:
        headers = {}
    with Storage.session() as session:
        resp = session.post(path, headers=headers, timeout=60, **kwargs)
    if resp.status_code != status_code:
        logger.error("Failed to post to %s. Response: %s", path, resp.text)

    return resp


def _put_to_server(
    path: str,
    headers: Optional[Dict[Any, Any]] = None,
    status_code: int = 200,
    **kwargs: Any,
) -> httpx.Response:
    if not headers:
        headers = {}
    with Storage.session() as session:
        resp = session.put(path, headers=headers, timeout=60, **kwargs)
    if resp.status_code != status_code:
        logger.error("Failed to put to %s. Response: %s", path, resp.text)
    return resp


def _get_experiment_by_name(experiment_name: str) -> Dict[str, Any]:
    response = _get_from_server(path="experiments")
    if response.status_code != 200:
        raise ert.exceptions.StorageError(response.text)
    experiments = {exp["name"]: exp for exp in response.json()}
    return experiments.get(experiment_name, None)


2022-04-20T08:48:25.7515717Z async def _get_ensemble_id_async(
2022-04-20T08:48:25.7516044Z workspace_name: str, experiment_name: Optional[str] = None
2022-04-20T08:48:25.7516361Z ) -> str:
2022-04-20T08:48:25.7516588Z url = "/experiments"
2022-04-20T08:48:25.7517076Z if experiment_name is None:
2022-04-20T08:48:25.7517366Z experiment_name = f"{workspace_name}.{_ENSEMBLE_RECORDS}"
2022-04-20T08:48:25.7517564Z
2022-04-20T08:48:25.7517702Z response = await _get_from_server_async(url, {})
2022-04-20T08:48:25.7518031Z experiments = {exp["name"]: exp forexpinresponse.json()}
2022-04-20T08:48:25.7518368Z experiment = experiments.get(experiment_name, None)
2022-04-20T08:48:25.7518650Z if experiment is not None:
2022-04-20T08:48:25.7518928Z return str(experiment["ensemble_ids"][0])
2022-04-20T08:48:25.7519258Z raise ert.exceptions.NonExistentExperiment(
2022-04-20T08:48:25.7519583Z f"Experiment {experiment_name} does not exist"
2022-04-20T08:48:25.7519829Z )
2022-04-20T08:48:25.7519944Z
2022-04-20T08:48:25.7519951Z
2022-04-20T08:48:25.7520181Z async def _get_ensemble_size(ensemble_id: str) -> int:
2022-04-20T08:48:25.7520473Z url = f"/ensembles/{ensemble_id}"
2022-04-20T08:48:25.7520750Z response = await _get_from_server_async(url, {})
2022-04-20T08:48:25.7521038Z response_json = response.json()
2022-04-20T08:48:25.7521309Z return int(response_json["size"])
2022-04-20T08:48:25.7521454Z
2022-04-20T08:48:25.7521473Z
2022-04-20T08:48:25.7521650Z def init(*, workspace_name: str) -> None:
2022-04-20T08:48:25.7522048Z response = _get_from_server(path="experiments")
2022-04-20T08:48:25.7522407Z experiment_names = {exp["name"]: exp["ensemble_ids"] forexpinresponse.json()}
2022-04-20T08:48:25.7522631Z
2022-04-20T08:48:25.7522746Z forspecial_keyin _SPECIAL_KEYS:
2022-04-20T08:48:25.7523045Z if f"{workspace_name}.{special_key}"in experiment_names:
2022-04-20T08:48:25.7523332Z raise RuntimeError(
2022-04-20T08:48:25.7523640Z f"Workspace {workspace_name} already registered in storage"
2022-04-20T08:48:25.7523897Z )
2022-04-20T08:48:25.7524108Z _init_experiment(
2022-04-20T08:48:25.7524390Z experiment_name=f"{workspace_name}.{special_key}",
2022-04-20T08:48:25.7524654Z parameters={},
2022-04-20T08:48:25.7524960Z ensemble_size=-1,
2022-04-20T08:48:25.7525195Z responses=[],
2022-04-20T08:48:25.7525387Z )
2022-04-20T08:48:25.7525503Z
2022-04-20T08:48:25.7525512Z
2022-04-20T08:48:25.7525757Z def assert_storage_initialized(workspace_name: str) -> None:
2022-04-20T08:48:25.7526092Z response = _get_from_server(path="experiments")
2022-04-20T08:48:25.7526453Z experiment_names = {exp["name"]: exp["ensemble_ids"] forexpinresponse.json()}
2022-04-20T08:48:25.7526663Z
2022-04-20T08:48:25.7526781Z forspecial_keyin _SPECIAL_KEYS:
2022-04-20T08:48:25.7527096Z if f"{workspace_name}.{special_key}" not in experiment_names:
2022-04-20T08:48:25.7527430Z raise ert.exceptions.StorageError(
2022-04-20T08:48:25.7527729Z "Storage is not initialized properly. "
2022-04-20T08:48:25.7528043Z + "The workspace needs to be reinitialized"
2022-04-20T08:48:25.7528289Z )
2022-04-20T08:48:25.7528410Z
2022-04-20T08:48:25.7528417Z
2022-04-20T08:48:25.7528589Z def init_experiment(
2022-04-20T08:48:25.7528799Z *,
2022-04-20T08:48:25.7529007Z experiment_name: str,
2022-04-20T08:48:25.7529255Z parameters: Iterable[str],
2022-04-20T08:48:25.7529481Z ensemble_size: int,
2022-04-20T08:48:25.7529716Z responses: Iterable[str],
2022-04-20T08:48:25.7529987Z ) -> None:
2022-04-20T08:48:25.7532745Z if ensemble_size <= 0:
2022-04-20T08:48:25.7533117Z raise ValueError("Ensemble cannot have a size <= 0")
2022-04-20T08:48:25.7533319Z
2022-04-20T08:48:25.7533422Z _init_experiment(
2022-04-20T08:48:25.7533805Z experiment_name=experiment_name,
2022-04-20T08:48:25.7534081Z parameters=parameters,
2022-04-20T08:48:25.7534345Z ensemble_size=ensemble_size,
2022-04-20T08:48:25.7534590Z responses=responses,
2022-04-20T08:48:25.7534810Z )
2022-04-20T08:48:25.7534930Z
2022-04-20T08:48:25.7534936Z
2022-04-20T08:48:25.7535033Z def _init_experiment(
2022-04-20T08:48:25.7535350Z *,
2022-04-20T08:48:25.7535551Z experiment_name: str,
2022-04-20T08:48:25.7535795Z parameters: Iterable[str],
2022-04-20T08:48:25.7536034Z ensemble_size: int,
2022-04-20T08:48:25.7536261Z responses: Iterable[str],
2022-04-20T08:48:25.7536576Z ) -> None:
2022-04-20T08:48:25.7536794Z if not experiment_name:
2022-04-20T08:48:25.7537093Z raise ValueError("Cannot initialize experiment without a name")
2022-04-20T08:48:25.7537301Z
2022-04-20T08:48:25.7537456Z if _get_experiment_by_name(experiment_name) is not None:
2022-04-20T08:48:25.7537796Z raise ert.exceptions.ElementExistsError(
2022-04-20T08:48:25.7538139Z f"Cannot initialize existing experiment: {experiment_name}"
2022-04-20T08:48:25.7538412Z )
2022-04-20T08:48:25.7538530Z
2022-04-20T08:48:25.7538684Z if len(set(parameters).intersection(responses)) > 0:
2022-04-20T08:48:25.7539001Z raise ert.exceptions.StorageError(
2022-04-20T08:48:25.7539336Z "Experiment parameters and responses cannot have a name in common"
2022-04-20T08:48:25.7539623Z )
2022-04-20T08:48:25.7539741Z
2022-04-20T08:48:25.7539931Z exp_response = _post_to_server(path="experiments", json={"name": experiment_name})
2022-04-20T08:48:25.7557803Z exp_id = exp_response.json()["id"]
2022-04-20T08:48:25.7558103Z
2022-04-20T08:48:25.7558227Z response = _post_to_server(
2022-04-20T08:48:25.7558506Z f"experiments/{exp_id}/ensembles",
2022-04-20T08:48:25.7558749Z json={
2022-04-20T08:48:25.7558991Z "parameter_names": list(parameters),
2022-04-20T08:48:25.7559284Z "response_names": list(responses),
2022-04-20T08:48:25.7559544Z "size": ensemble_size,
2022-04-20T08:48:25.7559802Z "userdata": {"name": experiment_name},
2022-04-20T08:48:25.7560041Z },
2022-04-20T08:48:25.7560226Z )
2022-04-20T08:48:25.7560439Z if response.status_code != 200:
2022-04-20T08:48:25.7560766Z raise ert.exceptions.StorageError(response.text)
2022-04-20T08:48:25.7560968Z
2022-04-20T08:48:25.7560982Z
2022-04-20T08:48:25.7561308Z def get_experiment_names(*, workspace_name: str) -> Set[str]:
2022-04-20T08:48:25.7561753Z response = _get_from_server(path="experiments")
2022-04-20T08:48:25.7562107Z experiment_names = {exp["name"] forexpinresponse.json()}
2022-04-20T08:48:25.7562421Z forspecial_keyin _SPECIAL_KEYS:
2022-04-20T08:48:25.7562721Z key = f"{workspace_name}.{special_key}"
2022-04-20T08:48:25.7562989Z if key in experiment_names:
2022-04-20T08:48:25.7563266Z experiment_names.remove(key)
2022-04-20T08:48:25.7563535Z return experiment_names
2022-04-20T08:48:25.7563690Z
2022-04-20T08:48:25.7563697Z
2022-04-20T08:48:25.7563966Z def _get_experiment_parameters(experiment_name: str) -> Iterable[str]:
2022-04-20T08:48:25.7564471Z experiment = _get_experiment_by_name(experiment_name)
2022-04-20T08:48:25.7564775Z if experiment is None:
2022-04-20T08:48:25.7565106Z raise ert.exceptions.NonExistentExperiment(
2022-04-20T08:48:25.7565811Z f"Cannot get parameters from non-existing experiment: {experiment_name}"
2022-04-20T08:48:25.7566157Z )
2022-04-20T08:48:25.7566288Z
2022-04-20T08:48:25.7566534Z ensemble_id = experiment["ensemble_ids"][0] # currently just one ens per exp
2022-04-20T08:48:25.7566927Z response = _get_from_server(f"ensembles/{ensemble_id}/parameters")
2022-04-20T08:48:25.7567148Z
2022-04-20T08:48:25.7567393Z if response.status_code != 200:
2022-04-20T08:48:25.7567727Z raise ert.exceptions.StorageError(response.text)
2022-04-20T08:48:25.7567935Z
2022-04-20T08:48:25.7568050Z returnlist(response.json())
2022-04-20T08:48:25.7568197Z
2022-04-20T08:48:25.7568203Z
2022-04-20T08:48:25.7568319Z async def _get_record_collection(
2022-04-20T08:48:25.7568571Z records_url: str,
2022-04-20T08:48:25.7568801Z record_name: str,
2022-04-20T08:48:25.7569017Z ensemble_size: int,
2022-04-20T08:48:25.7569282Z record_source: Optional[str] = None,
2022-04-20T08:48:25.7569780Z ) -> Tuple[List[StorageRecordTransmitter], ert.data.RecordCollectionType]:
2022-04-20T08:48:25.7570167Z return await _get_record_storage_transmitters(
2022-04-20T08:48:25.7570507Z records_url, record_name, record_source, ensemble_size
2022-04-20T08:48:25.7570773Z )
2022-04-20T08:48:25.7570893Z
2022-04-20T08:48:25.7570900Z
2022-04-20T08:48:25.7571073Z def get_ensemble_record(
2022-04-20T08:48:25.7571325Z *,
2022-04-20T08:48:25.7571738Z workspace_name: str,
2022-04-20T08:48:25.7572068Z record_name: str,
2022-04-20T08:48:25.7572376Z ensemble_size: int,
2022-04-20T08:48:25.7572670Z experiment_name: Optional[str] = None,
2022-04-20T08:48:25.7573023Z source: Optional[str] = None,
2022-04-20T08:48:25.7573531Z ) -> ert.data.RecordCollection:
2022-04-20T08:48:25.7573868Z records_url = ert.storage.get_records_url(
2022-04-20T08:48:25.7574300Z workspace_name=workspace_name, experiment_name=experiment_name
2022-04-20T08:48:25.7574664Z )
2022-04-20T08:48:25.7598647Z
2022-04-20T08:48:25.7598887Z transmitters, collection_type = get_event_loop().run_until_complete(
2022-04-20T08:48:25.7599222Z _get_record_collection(
2022-04-20T08:48:25.7599670Z records_url=records_url,
2022-04-20T08:48:25.7599926Z record_name=record_name,
2022-04-20T08:48:25.7600898Z record_source=source,
2022-04-20T08:48:25.7601193Z ensemble_size=ensemble_size,
2022-04-20T08:48:25.7601440Z )
2022-04-20T08:48:25.7601638Z )
2022-04-20T08:48:25.7601838Z records = tuple(
2022-04-20T08:48:25.7602151Z get_event_loop().run_until_complete(transmitter.load())
2022-04-20T08:48:25.7602480Z fortransmitterin transmitters
2022-04-20T08:48:25.7602708Z )
2022-04-20T08:48:25.7602954Z return ert.data.RecordCollection(
2022-04-20T08:48:25.7603325Z records=records, length=ensemble_size, collection_type=collection_type
2022-04-20T08:48:25.7603634Z )
2022-04-20T08:48:25.7603746Z
2022-04-20T08:48:25.7603754Z
2022-04-20T08:48:25.7603878Z def get_ensemble_record_names(
2022-04-20T08:48:25.7604468Z *, workspace_name: str, experiment_name: Optional[str] = None, _flatten: bool = True
2022-04-20T08:48:25.7604895Z ) -> Iterable[str]:
2022-04-20T08:48:25.7605203Z # _flatten is a parameter used only for testing separated parameter records
2022-04-20T08:48:25.7605526Z if experiment_name is None:
2022-04-20T08:48:25.7605831Z experiment_name = f"{workspace_name}.{_ENSEMBLE_RECORDS}"
2022-04-20T08:48:25.7606157Z experiment = _get_experiment_by_name(experiment_name)
2022-04-20T08:48:25.7606442Z if experiment is None:
2022-04-20T08:48:25.7606749Z raise ert.exceptions.NonExistentExperiment(
2022-04-20T08:48:25.7607245Z f"Cannot get record names of non-existing experiment: {experiment_name}"
2022-04-20T08:48:25.7607526Z )
2022-04-20T08:48:25.7607646Z
2022-04-20T08:48:25.7607825Z ensemble_id = experiment["ensemble_ids"][0] # currently just one ens per exp
2022-04-20T08:48:25.7608399Z response = _get_from_server(path=f"ensembles/{ensemble_id}/records")
2022-04-20T08:48:25.7608731Z if response.status_code != 200:
2022-04-20T08:48:25.7609059Z raise ert.exceptions.StorageError(response.text)
2022-04-20T08:48:25.7609260Z
2022-04-20T08:48:25.7609400Z # Flatten any parameter records that were split
2022-04-20T08:48:25.7609655Z if _flatten:
2022-04-20T08:48:25.7609955Z return {x.split(_PARAMETER_RECORD_SEPARATOR)[0] forxinresponse.json().keys()}
2022-04-20T08:48:25.7610291Z returnlist(response.json().keys())
2022-04-20T08:48:25.7610454Z
2022-04-20T08:48:25.7610460Z
2022-04-20T08:48:25.7610753Z def get_experiment_parameters(*, experiment_name: str) -> Iterable[str]:
2022-04-20T08:48:25.7611102Z return _get_experiment_parameters(experiment_name)
2022-04-20T08:48:25.7611290Z
2022-04-20T08:48:25.7611296Z
2022-04-20T08:48:25.7611560Z def get_experiment_responses(*, experiment_name: str) -> Iterable[str]:
2022-04-20T08:48:25.7611922Z experiment = _get_experiment_by_name(experiment_name)
2022-04-20T08:48:25.7612205Z if experiment is None:
2022-04-20T08:48:25.7612499Z raise ert.exceptions.NonExistentExperiment(
2022-04-20T08:48:25.7612983Z f"Cannot get responses from non-existing experiment: {experiment_name}"
2022-04-20T08:48:25.7613275Z )
2022-04-20T08:48:25.7613394Z
2022-04-20T08:48:25.7613563Z ensemble_id = experiment["ensemble_ids"][0] # currently just one ens per exp
2022-04-20T08:48:25.7613934Z response = _get_from_server(f"ensembles/{ensemble_id}/responses")
2022-04-20T08:48:25.7614137Z
2022-04-20T08:48:25.7614251Z if response.status_code != 200:
2022-04-20T08:48:25.7614568Z raise ert.exceptions.StorageError(response.text)
2022-04-20T08:48:25.7614755Z
2022-04-20T08:48:25.7614909Z # The ensemble responses are sent in the following form:
2022-04-20T08:48:25.7615166Z # {
2022-04-20T08:48:25.7615441Z # "polynomial_output": {"id": id, "name": name, "userdata": {}}
2022-04-20T08:48:25.7615694Z # }
2022-04-20T08:48:25.7615927Z # therefore we extract only the keys
2022-04-20T08:48:25.7616318Z
2022-04-20T08:48:25.7616443Z returnlist(response.json().keys())
2022-04-20T08:48:25.7616612Z
2022-04-20T08:48:25.7616618Z
2022-04-20T08:48:25.7616870Z def delete_experiment(*, experiment_name: str) -> None:
2022-04-20T08:48:25.7617204Z experiment = _get_experiment_by_name(experiment_name)
2022-04-20T08:48:25.7617496Z if experiment is None:
2022-04-20T08:48:25.7617815Z raise ert.exceptions.NonExistentExperiment(
2022-04-20T08:48:25.7618161Z f"Experiment does not exist: {experiment_name}"
2022-04-20T08:48:25.7618421Z )
2022-04-20T08:48:25.7618813Z response = _delete_on_server(path=f"experiments/{experiment['id']}")
2022-04-20T08:48:25.7619034Z
2022-04-20T08:48:25.7619140Z if response.status_code != 200:
2022-04-20T08:48:25.7619569Z raise ert.exceptions.StorageError(response.text)
2022-04-20T08:48:25.7619767Z
2022-04-20T08:48:25.7619844Z ```2022-04-20T08:48:25.7619957Z 2022-04-20T08:48:25.7620140Z pylint crashed with a ``AttributeError`` and with the following stacktrace:2022-04-20T08:48:25.7620415Z ```
Traceback (most recent call last):
  File "/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pylint/lint/pylinter.py", line 1111, in _check_files
    self._check_file(get_ast, check_astroid_module, file)
  File "/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pylint/lint/pylinter.py", line 1146, in _check_file
    check_astroid_module(ast_node)
  File "/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pylint/lint/pylinter.py", line 1298, in check_astroid_module
    retval = self._check_astroid_module(
  File "/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pylint/lint/pylinter.py", line 1345, in _check_astroid_module
    walker.walk(node)
  File "/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pylint/utils/ast_walker.py", line 76, in walk
    self.walk(child)
  File "/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pylint/utils/ast_walker.py", line 76, in walk
    self.walk(child)
  File "/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pylint/utils/ast_walker.py", line 76, in walk
    self.walk(child)
  [Previous line repeated 3 more times]
  File "/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pylint/utils/ast_walker.py", line 73, in walk
    callback(astroid)
  File "/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pylint/checkers/variables.py", line 1377, in visit_name
    self._undefined_and_used_before_checker(node, stmt)
  File "/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pylint/checkers/variables.py", line 1419, in _undefined_and_used_before_checker
    action, nodes_to_consume = self._check_consumer(
  File "/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pylint/checkers/variables.py", line 1680, in _check_consumer
    elif self._is_only_type_assignment(node, defstmt):
  File "/opt/hostedtoolcache/Python/3.8.12/x64/lib/python3.8/site-packages/pylint/checkers/variables.py", line 2072, in _is_only_type_assignment
    if node.name in self.linter.config.additional_builtins or utils.is_builtin(
AttributeError: 'Values' object has no attribute 'additional_builtins'
```
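The crash boils down to a plain attribute lookup on pylint's option container. A minimal sketch (not pylint's actual code, and using a hypothetical option set) reproduces the error class: pylint's config here is an optparse-style `Values` namespace, and accessing an option that was never registered raises exactly this `AttributeError`.

```python
from optparse import Values

# Hypothetical option set missing "additional_builtins", mirroring the
# state pylint's config object was in when the checker touched it.
config = Values({"jobs": 1})

try:
    # Direct attribute access, as in _is_only_type_assignment, blows up.
    config.additional_builtins
except AttributeError as err:
    print(err)  # 'Values' object has no attribute 'additional_builtins'

# A defensive lookup with a default would avoid the crash:
print(getattr(config, "additional_builtins", ()))  # ()
```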
### Expected behavior
pylint should report success or failure of the linting, and exit accordingly, with no crash.
### Pylint version
```shell
pylint 2.13.6
```

### OS / Environment
Ubuntu 20.04.4 LTS

### Additional dependencies
No response
### Bug description
Using the latest release (2.13.6) we get this crash in pylint. It works with the previous version. This was run on https://github.com/equinor/ert according to the GitHub action in https://github.com/equinor/ert/blob/main/.github/workflows/style.yml
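Until the crash is fixed upstream, one possible workaround is to pin pylint to the last release reported to work. Assuming 2.13.5 is the immediately preceding release, a pinned requirements fragment would look like:

```
pylint==2.13.5
```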