Security fixes and Python 2.7.16 bug-fix update
* Fix #525: Update to Python 2.7.16, which fixes a bunch of potential crashes, and brings the pyenv setup in line with the Dockerfile.
* Fix #511: update `urllib3` from 1.23 to 1.24.2 (not to 1.25.1, which is incompatible with this version of `requests`). https://nvd.nist.gov/vuln/detail/CVE-2019-11324
* Fix #527: update `Jinja2` from 2.10 to 2.10.1. https://nvd.nist.gov/vuln/detail/CVE-2019-10906
* Fix #478: update `pyyaml` from 3.13 to 5.1, which deprecates `yaml.load()` but doesn't actually fix the vulnerability. It looks like they tried to close the arbitrary-code-execution vulnerability in 4.0 but reverted that incompatible change. Still, this update should appease GitHub's security scanner. Change our code to call `yaml.safe_load()` instead of `yaml.load()` (a sketch of the change follows this list), although I think most of our YAML loading actually goes through `ruamel.yaml`. https://nvd.nist.gov/vuln/detail/CVE-2017-18342
* Fix #479: update `requests` from 2.19.1 to 2.21.0. It's used by FireWorks, bokeh, confluent_kafka, and ipython. https://nvd.nist.gov/vuln/detail/CVE-2018-18074
* Update `ruamel.yaml` from 0.15.43 to 0.15.94. That's 51 new bug fix releases! Clearly YAML is over-complicated.
* Update NumPy from 1.14.5 to 1.14.6 for a thread safety bug fix. (Releases 1.15 & 1.16 are substantial changes, and 1.17 will drop support for Python 2.7.)
* Add the `typing` and `mypy` pips while we're updating pyenvs. We'll need these when we start adding static type checking.
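A minimal sketch of the `yaml.safe_load()` change that the runscripts diffs below apply (the file name here is an illustrative placeholder, not a path from the repo):

```python
import yaml

# yaml.load() without an explicit Loader can construct arbitrary Python objects
# from tagged YAML input -- the code-execution risk behind CVE-2017-18342.
# yaml.safe_load() only builds plain data (dicts, lists, strings, numbers),
# which is all these scripts need.
with open("ecocyc_dump.json") as f:  # illustrative placeholder path
    data = yaml.safe_load(f)         # was: yaml.load(f)
```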
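Similarly, the new `typing` and `mypy` pips support comment-style annotations on Python 2.7; a hypothetical sketch of the annotation style that `mypy --py2` checks (this function doesn't exist in the repo, it's only an illustration):

```python
from typing import Dict, List

def count_molecules(names):
    # type: (List[str]) -> Dict[str, int]
    # Made-up helper: tally how many times each molecule name appears.
    counts = {}  # type: Dict[str, int]
    for name in names:
        counts[name] = counts.get(name, 0) + 1
    return counts
```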
1fish2 committed Apr 27, 2019
1 parent f758fee commit 2187021
Showing 8 changed files with 26 additions and 23 deletions.
2 changes: 1 addition & 1 deletion agent/README.md
@@ -4,7 +4,7 @@ Distributed simulation of whole cell agents relative to a shared environment.

## Setup

-The simulation is written in [Python 2.7.15](https://www.python.org/), and depends on [Kafka](https://kafka.apache.org/) for mediating communication between the different processes.
+The simulation is written in [Python 2.7](https://www.python.org/), and depends on [Kafka](https://kafka.apache.org/) for mediating communication between the different processes.

Kafka is a message passing system that allows decoupling of message senders and message receivers. It does this by providing two abstractions, a Consumer and a Producer. A Consumer can subscribe to any number of "topics" it will receive messages on, and a Producer can send to any topics it wishes. Topics are communication "channels" between processes that otherwise do not need to know who is sending and receiving these messages.
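A minimal sketch of that Producer/Consumer pattern using `confluent_kafka` (one of the pips this project installs); the broker address, topic name, and group id below are illustrative placeholders, not values from this repo:

```python
from confluent_kafka import Consumer, Producer

# A Producer sends to a topic without knowing who (if anyone) is listening.
producer = Producer({'bootstrap.servers': 'localhost:9092'})    # placeholder broker
producer.produce('environment-updates', value='{"time": 0.0}')  # placeholder topic
producer.flush()

# A Consumer subscribes to topics without knowing who produced the messages.
consumer = Consumer({
    'bootstrap.servers': 'localhost:9092',
    'group.id': 'example-agent',  # placeholder group id
    'auto.offset.reset': 'earliest',
})
consumer.subscribe(['environment-updates'])
message = consumer.poll(timeout=1.0)
if message is not None and message.error() is None:
    print(message.value())
consumer.close()
```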

8 changes: 5 additions & 3 deletions cloud/docker/runtime/Dockerfile
@@ -15,12 +15,14 @@
# > gcloud builds submit --timeout=2h --tag gcr.io/allen-discovery-center-mcovert/wcm-runtime .
# > rm requirements.txt

-# DO NOT USE python:2.7.15-alpine since the simulation math comes out different!
+# DO NOT USE an alpine base since the simulation math comes out different!
FROM python:2.7.16

RUN apt-get update \
&& apt-get install -y glpk-utils libglpk-dev glpk-doc swig python-cvxopt \
-gfortran llvm cmake ncurses-dev libreadline7 libreadline-dev
+gfortran llvm cmake ncurses-dev libreadline7 libreadline-dev nano

+RUN echo "alias ls='ls --color=auto'" >> ~/.bashrc

# Install openblas v0.3.5 (not buggy v0.3.3).
# Future: Install openblas-dev via apt-get.
@@ -47,7 +49,7 @@ ENV CVXOPT_BUILD_GLPK=1 \

# Install all the pips within one Docker layer and don't cache the downloads.
COPY requirements.txt /
-RUN pip install --no-cache-dir 'numpy==1.14.5' --no-binary numpy \
+RUN pip install --no-cache-dir 'numpy==1.14.6' --no-binary numpy \
&& pip install --no-cache-dir -r requirements.txt --no-binary numpy,scipy,cvxopt

CMD ["/bin/bash"]
6 changes: 3 additions & 3 deletions docs/create-pyenv.md
@@ -65,7 +65,7 @@ This page goes through the Python environment setup steps in more detail and wit
2. Install the required version of Python via `pyenv`, and _remember to enable it as a shared library_ so Theano can call into it:

```bash
-PYTHON_CONFIGURE_OPTS="--enable-shared" pyenv install 2.7.15
+PYTHON_CONFIGURE_OPTS="--enable-shared" pyenv install 2.7.16
```


@@ -75,7 +75,7 @@ This page goes through the Python environment setup steps in more detail and wit

```bash
cd ~/dev/wcEcoli # or wherever you cloned the `wcEcoli` project to
-pyenv local 2.7.15
+pyenv local 2.7.16
pyenv virtualenv wcEcoli2
pyenv local wcEcoli2
```
@@ -204,7 +204,7 @@ This page goes through the Python environment setup steps in more detail and wit
nosetests
```

-If the unit tests fail with an error message saying the loader can't load `.../pyenv/versions/.../lib/libpython2.7.a`, that means you didn't successfully `--enable-shared` when installing python. Go back to that step, run `PYTHON_CONFIGURE_OPTS="--enable-shared" pyenv install 2.7.15 --force`, and repeat all the steps after it.
+If the unit tests fail with an error message saying the loader can't load `.../pyenv/versions/.../lib/libpython2.7.a`, that means you didn't successfully `--enable-shared` when installing python. Go back to that step, run `PYTHON_CONFIGURE_OPTS="--enable-shared" pyenv install 2.7.16 --force`, and repeat all the steps after it.


## Sherlock SCRATCH directory setup
25 changes: 13 additions & 12 deletions requirements.txt
@@ -9,14 +9,14 @@
#
## Install the required version of Python via pyenv, and remember to enable it as
## a shared library:
-# PYTHON_CONFIGURE_OPTS="--enable-shared" pyenv install 2.7.15
+# PYTHON_CONFIGURE_OPTS="--enable-shared" pyenv install 2.7.16
#
## Create the "wcEcoli2" python virtual environment and select it for your project
## Create the "wcEcoli3" python virtual environment and select it for your project
## directory:
# cd ~/dev/wcEcoli
-# pyenv local 2.7.15
-# pyenv virtualenv wcEcoli2
-# pyenv local wcEcoli2
+# pyenv local 2.7.16
+# pyenv virtualenv wcEcoli3
+# pyenv local wcEcoli3
#
## Upgrade this virtual environment's installers:
# pip install --upgrade pip setuptools virtualenv virtualenvwrapper virtualenv-clone wheel
@@ -36,7 +36,7 @@
# include_dirs = /usr/local/opt/openblas/include
#
## Install NumPy and SciPy, linked to this OpenBLAS thanks to ~/.numpy-site.cfg:
-# pip install numpy==1.14.5 scipy==1.0.1 --no-binary numpy,scipy --force-reinstall
+# pip install numpy==1.14.6 scipy==1.0.1 --no-binary numpy,scipy --force-reinstall
#
## Install the packages listed in this file:
# CVXOPT_BUILD_GLPK=1 pip install -r requirements.txt --no-binary numpy,scipy,cvxopt
@@ -88,7 +88,7 @@ ipython==5.7.0 # 6.0+ are for Python 3
ipython-genutils==0.2.0
isort==4.3.4
itsdangerous==0.24
-Jinja2==2.10
+Jinja2==2.10.1
jsonschema==2.6.0
jupyter-core==4.4.0
kiwisolver==1.0.1
@@ -106,7 +106,7 @@ mpmath==1.0.0
multiprocess==0.70.6.1
nbformat==4.4.0
nose==1.3.7
-numpy==1.14.5
+numpy==1.14.6
optlang==1.4.2
osqp==0.4.1
packaging==17.1
@@ -131,10 +131,10 @@ pytest==3.6.3
pytest-benchmark==3.1.1
python-dateutil==2.7.3
pytz==2018.5
-PyYAML==3.13
-requests==2.19.1
+PyYAML==5.1
+requests==2.21.0
ruamel.ordereddict==0.4.13
-ruamel.yaml==0.15.43
+ruamel.yaml==0.15.94
scandir==1.7
scipy==1.0.1
scs==2.0.2
@@ -152,8 +152,9 @@ toolz==0.9.0
tornado==5.0.2
tqdm==4.23.0
traitlets==4.3.2
+typing==3.6.6
Unum==4.1.4
-urllib3==1.23
+urllib3==1.24.2
virtualenv>=16.4.1
virtualenv-clone>=0.5.1
virtualenvwrapper>=4.8.4
2 changes: 1 addition & 1 deletion runscripts/fireworks/fw_queue.py
@@ -253,7 +253,7 @@ def get_environment(variable, default):

# Create launchpad
with open(LAUNCHPAD_FILE) as f:
-lpad = LaunchPad(**yaml.load(f))
+lpad = LaunchPad(**yaml.safe_load(f))

# Store list of FireWorks
wf_fws = []
2 changes: 1 addition & 1 deletion runscripts/reconstruction/build_complexation_reactions.py
@@ -177,7 +177,7 @@ def fixCompartments(reaction, ourLocations):
idMass.update(rnaMass)
ourLocations = getMonomerLocationsFromOurData()

-jsonData = yaml.load(open(ECOCYC_DUMP, "r"))
+jsonData = yaml.safe_load(open(ECOCYC_DUMP, "r"))
reactionDataFiltered = removeBlaclistedReactions(jsonData["complexations"])
idLocation = getLocations(reactionDataFiltered)
getMasses(idMass, reactionDataFiltered)
2 changes: 1 addition & 1 deletion runscripts/reconstruction/reaction_enzymes.py
@@ -72,7 +72,7 @@ def addFilteredEntries(rxnNamesEnzymes):
reactionStoich = getReactionStoich()
reactionReversibility = getReactionReversibility()

-jsonData = yaml.load(open(ECOCYC_DUMP, "r"))
+jsonData = yaml.safe_load(open(ECOCYC_DUMP, "r"))

rxnNamesEnzymes = dict([(x["name"], x["annotation"]["enzymes"]) for x in jsonData["reactions"] if "enzymes" in x["annotation"]])

2 changes: 1 addition & 1 deletion runscripts/reconstruction/update_proteins_location.py
@@ -47,7 +47,7 @@ def getLocationDifferences(ourLocations, ecocycLocations):



-jsonData = yaml.load(open(ECOCYC_DUMP, "r"))
+jsonData = yaml.safe_load(open(ECOCYC_DUMP, "r"))

ourLocations = getMonomerLocationsFromOurData()
ecocycLocations = getMonomerLocationsFromEcocyc(jsonData["complexations"])
