Build manylinux wheels (including aarch64) with Zuul #5386

Merged · 1 commit · Aug 27, 2020
37 changes: 36 additions & 1 deletion .zuul.d/jobs.yaml
@@ -2,7 +2,7 @@
name: pyca-cryptography-base
abstract: true
description: Run pyca/cryptography unit testing
-run: .zuul.playbooks/playbooks/main.yaml
+run: .zuul.playbooks/playbooks/tox/main.yaml

- job:
name: pyca-cryptography-ubuntu-focal-py38-arm64
@@ -31,3 +31,38 @@
nodeset: centos-8-arm64
vars:
tox_envlist: py27

- job:
name: pyca-cryptography-build-wheel
abstract: true
run: .zuul.playbooks/playbooks/wheel/main.yaml

- job:
name: pyca-cryptography-build-wheel-arm64
parent: pyca-cryptography-build-wheel
nodeset: ubuntu-bionic-arm64
vars:
wheel_builds:
- platform: manylinux2014_aarch64
image: pyca/cryptography-manylinux2014_aarch64
pythons:
- cp35-cp35m

- job:
Member: Do we need this? We currently build our x86-64 wheels elsewhere.

ianw (Contributor, Author), Aug 27, 2020: This was a deliberate choice on my behalf, to make sure the playbooks/roles are (and remain) suitable for multi-arch and multi-python builds in general.
name: pyca-cryptography-build-wheel-x86_64
parent: pyca-cryptography-build-wheel
nodeset: ubuntu-bionic
vars:
wheel_builds:
- platform: manylinux1_x86_64
image: pyca/cryptography-manylinux1:x86_64
pythons:
- cp27-cp27m
- cp27-cp27mu
- cp35-cp35m
- platform: manylinux2010_x86_64
image: pyca/cryptography-manylinux2010:x86_64
pythons:
- cp27-cp27m
- cp27-cp27mu
- cp35-cp35m
6 changes: 6 additions & 0 deletions .zuul.d/project.yaml
@@ -1,7 +1,13 @@
- project:
check:
jobs:
- pyca-cryptography-build-wheel-arm64
- pyca-cryptography-build-wheel-x86_64
Comment on lines +4 to +5
Member: We don't usually run our wheel builder tasks in CI... though I guess there's no reason we couldn't. WDYT @reaperhulk?

Contributor (Author): I would say that this ensures that when it comes time to push a tag, there are no surprises from something having broken in the meantime.

Member: I don't have an objection to this happening, provided we have the concurrency for it not to hurt our CI turnaround time.

- pyca-cryptography-ubuntu-focal-py38-arm64
- pyca-cryptography-ubuntu-bionic-py36-arm64
- pyca-cryptography-centos-8-py36-arm64
- pyca-cryptography-centos-8-py27-arm64
release:
jobs:
- pyca-cryptography-build-wheel-arm64
- pyca-cryptography-build-wheel-x86_64
6 changes: 6 additions & 0 deletions .zuul.playbooks/playbooks/wheel/main.yaml
@@ -0,0 +1,6 @@
- hosts: all
tasks:

- name: Build wheel
include_role:
name: build-wheel-manylinux
@@ -0,0 +1 @@
Build manylinux wheels for cryptography
@@ -0,0 +1,51 @@
#!/bin/bash -ex

# Compile wheels
cd /io

mkdir -p wheelhouse.final

for P in ${PYTHONS}; do

PYBIN=/opt/python/${P}/bin

"${PYBIN}"/python -m virtualenv .venv

.venv/bin/pip install cffi six ipaddress "enum34; python_version < '3'"

REGEX="cp3([0-9])*"
if [[ "${PYBIN}" =~ $REGEX ]]; then
PY_LIMITED_API="--py-limited-api=cp3${BASH_REMATCH[1]}"
fi

LDFLAGS="-L/opt/pyca/cryptography/openssl/lib" \
CFLAGS="-I/opt/pyca/cryptography/openssl/include -Wl,--exclude-libs,ALL" \
.venv/bin/python setup.py bdist_wheel $PY_LIMITED_API

auditwheel repair --plat ${PLAT} -w wheelhouse/ dist/cryptography*.whl

# Sanity checks
# NOTE(ianw) : no execstack on aarch64, comes from
# prelink, which was never supported. CentOS 8 does
# have it separate, skip for now.
if [[ "${PLAT}" != "manylinux2014_aarch64" ]]; then
for f in wheelhouse/*.whl; do
unzip $f -d execstack.check

results=$(execstack execstack.check/cryptography/hazmat/bindings/*.so)
count=$(echo "$results" | grep -c '^X' || true)
if [ "$count" -ne 0 ]; then
exit 1
fi
rm -rf execstack.check
done
fi

.venv/bin/pip install cryptography --no-index -f wheelhouse/
.venv/bin/python -c "from cryptography.hazmat.backends.openssl.backend import backend;print('Loaded: ' + backend.openssl_version_text());print('Linked Against: ' + backend._ffi.string(backend._lib.OPENSSL_VERSION_TEXT).decode('ascii'))"

# Cleanup
mv wheelhouse/* wheelhouse.final
rm -rf .venv dist wheelhouse

done
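The abi3 flag derivation in the script above hinges on a bash regex capture against the Python directory name. A standalone sketch of just that step (the `derive_limited_api` function name is illustrative, not part of the PR):

```shell
#!/bin/bash
# Sketch of the PY_LIMITED_API derivation used in build-wheels.sh above.
# derive_limited_api is an illustrative helper name, not used in the PR.
derive_limited_api() {
    local pybin="$1" regex="cp3([0-9])*" flag=""
    if [[ "${pybin}" =~ ${regex} ]]; then
        # BASH_REMATCH[1] holds the captured minor-version digit
        flag="--py-limited-api=cp3${BASH_REMATCH[1]}"
    fi
    echo "${flag}"
}

derive_limited_api /opt/python/cp35-cp35m/bin   # cp3 matches, abi3 flag set
derive_limited_api /opt/python/cp27-cp27mu/bin  # no cp3 in path, empty flag
```

Only the CPython 3.x interpreters get the `--py-limited-api` flag; the Python 2 builds fall through with it unset, matching the per-Python loop in the script.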
@@ -0,0 +1,145 @@
# Wheel builds is a list of dicts, with keys
#
# platform: the manylinux platform name
# image: the docker image to build in
# pythons: list of pythons in the image to build wheels for
- name: Sanity check build list
assert:
that: wheel_builds is defined

- name: Ensure pip installed
include_role:
name: ensure-pip

- name: Run ensure-docker
include_role:
name: ensure-docker

- name: Workaround Linaro aarch64 cloud MTU issues
# NOTE(ianw) : Docker default networking, the Linaro NAT setup and
# *insert random things here* cause PMTU issues, resulting in hung
# connections, particularly to fastly CDN (particularly annoying
# because pypi and pythonhosted live behind that). Can remove after
# upstream changes merge, or we otherwise find a solution in the
# upstream cloud.
# https://review.opendev.org/747062
# https://review.opendev.org/746833
# https://review.opendev.org/747064
when: ansible_architecture == 'aarch64'
block:
- name: Install jq
package:
name: jq
state: present
become: yes

- name: Reset docker MTU
shell: |
jq --arg mtu 1400 '. + {mtu: $mtu|tonumber}' /etc/docker/daemon.json > /etc/docker/daemon.json.new
cat /etc/docker/daemon.json.new
mv /etc/docker/daemon.json.new /etc/docker/daemon.json
service docker restart
become: yes

# We build an sdist of the checkout, and then build wheels from the
# sdist. This ensures that nothing is left out of the sdist.
- name: Install sdist required packages
package:
name:
- build-essential
- libssl-dev
- libffi-dev
- python3-dev
become: yes
when: ansible_distribution in ['Debian', 'Ubuntu']

- name: Create sdist
command: |
python3 setup.py sdist
args:
chdir: '{{ ansible_user_dir }}/{{ zuul.project.src_dir }}'

- name: Find output file
find:
paths: '{{ ansible_user_dir }}/{{ zuul.project.src_dir }}/dist'
file_type: file
patterns: "*.tar.gz"
register: _sdist

- assert:
that:
- _sdist.matched == 1

- name: Create a build area
file:
path: '{{ ansible_user_dir }}/build'
state: directory

- name: Create build area from sdist
unarchive:
src: '{{ _sdist.files[0].path }}'
dest: '{{ ansible_user_dir }}/build'
remote_src: yes

- name: Find cryptography subdir from sdist build dir
set_fact:
_build_dir: "{{ ansible_user_dir }}/build/{{ _sdist.files[0].path | basename | replace('.tar.gz', '') }}"

- name: Show _build_dir
debug:
var: _build_dir

- name: Install build script
copy:
src: build-wheels.sh
dest: '{{ _build_dir }}'
mode: 0755

- name: Pre-pull containers
command: >-
docker pull {{ item.image }}
become: yes
loop: '{{ wheel_builds }}'

- name: Run builds
command: |
docker run --rm \
-e PLAT={{ item.platform }} \
-e PYTHONS="{{ item.pythons | join(' ') }}" \
-v {{ _build_dir }}:/io \
{{ item.image }} \
/io/build-wheels.sh
become: yes
loop: '{{ wheel_builds }}'

- name: Copy sdist to output
synchronize:
src: '{{ _sdist.files[0].path }}'
dest: '{{ zuul.executor.log_root }}'
mode: pull

- name: Return sdist artifact
zuul_return:
data:
zuul:
artifacts:
- name: '{{ _sdist.files[0].path | basename }}'
url: 'sdist/{{ _sdist.files[0].path }}'
metadata:
type: sdist

- name: Copy wheels to output
synchronize:
src: '{{ _build_dir }}/wheelhouse.final/'
dest: '{{ zuul.executor.log_root }}/wheelhouse'
mode: pull

- name: Return wheelhouse artifact
zuul_return:
data:
zuul:
artifacts:
- name: "Wheelhouse"
url: "wheelhouse"
metadata:
type: wheelhouse
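The execstack sanity check in build-wheels.sh above counts shared objects whose stack segment is flagged executable (lines starting with `X` in execstack output). A standalone sketch of that counting idiom, with the execstack output mocked since the tool itself may not be installed:

```shell
#!/bin/bash
# Mocked execstack output: one clean .so ('-') and one flagged .so ('X').
# The .so names are illustrative, not taken from a real wheel.
results=$'- execstack.check/cryptography/hazmat/bindings/_openssl.so\nX execstack.check/cryptography/hazmat/bindings/_padding.so'

# Same counting idiom as build-wheels.sh: grep -c '^X' counts flagged
# libraries; '|| true' keeps a zero count from aborting under 'set -e'.
count=$(echo "$results" | grep -c '^X' || true)
echo "count=${count}"

if [ "$count" -ne 0 ]; then
    echo "would fail the build"
fi
```

Any nonzero count makes the build script `exit 1`, so a wheel shipping an executable-stack shared object can never reach the wheelhouse.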