AP-1039 Revamp CI (transferwise#54)
Samira-El committed Aug 6, 2021
1 parent 53b4bbf commit e060d44
Showing 10 changed files with 153 additions and 146 deletions.
65 changes: 0 additions & 65 deletions .circleci/config.yml

This file was deleted.

44 changes: 44 additions & 0 deletions .github/workflows/ci.yml
@@ -0,0 +1,44 @@
name: CI

on:
  pull_request:
  push:
    branches:
      - master

jobs:
  lint_and_test:
    name: Linting and Testing
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: [ 3.6, 3.7, 3.8 ]

    steps:
      - name: Checkout repository
        uses: actions/checkout@v2

      - name: Start PG test container
        run: docker-compose up -d --build db

      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}

      - name: Setup virtual environment
        run: make venv

      - name: Pylinting
        run: make pylint

      - name: Unit Tests
        run: make unit_test

      - name: Integration Tests
        env:
          LOGGING_CONF_FILE: ./sample_logging.conf
        run: make integration_test

      - name: Shutdown PG test container
        run: docker-compose down
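The `strategy.matrix` above fans the `lint_and_test` job out into one run per listed interpreter. A small illustrative sketch of that expansion (GitHub Actions does this internally; the code below is not part of the repo):

```python
from itertools import product

# Mirror of the workflow's matrix; each combination becomes one job run.
matrix = {"python-version": ["3.6", "3.7", "3.8"]}

keys = list(matrix)
jobs = [dict(zip(keys, combo)) for combo in product(*matrix.values())]
print(jobs)  # → [{'python-version': '3.6'}, {'python-version': '3.7'}, {'python-version': '3.8'}]
```

With more than one matrix axis, `product` yields the full cross product, which is exactly how Actions multiplies job counts.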
25 changes: 25 additions & 0 deletions Makefile
@@ -0,0 +1,25 @@
venv:
	python3 -m venv venv ;\
	. ./venv/bin/activate ;\
	pip install --upgrade pip setuptools wheel ;\
	pip install -e .[test]

pylint:
	. ./venv/bin/activate ;\
	pylint --rcfile .pylintrc target_postgres/

unit_test:
	. ./venv/bin/activate ;\
	pytest --cov=target_postgres --cov-fail-under=44 tests/unit -v

env:
	export TARGET_POSTGRES_PORT=5432
	export TARGET_POSTGRES_DBNAME=target_db
	export TARGET_POSTGRES_USER=my_user
	export TARGET_POSTGRES_PASSWORD=secret
	export TARGET_POSTGRES_HOST=localhost
	export TARGET_POSTGRES_SCHEMA=public

integration_test: env
	. ./venv/bin/activate ;\
	pytest tests/integration --cov=target_postgres --cov-fail-under=87 -v
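One caveat worth knowing: each line of a Make recipe runs in its own shell, so the `export` lines in the `env` target apply only within that recipe and do not automatically reach the `integration_test` recipe. A test process can set the same variables directly; a sketch with values mirroring the Makefile above (this helper is illustrative, not part of the repo):

```python
import os

# The same variables the `env` target exports, applied to this process's
# environment so child processes (e.g. pytest) inherit them.
TEST_ENV = {
    "TARGET_POSTGRES_PORT": "5432",
    "TARGET_POSTGRES_DBNAME": "target_db",
    "TARGET_POSTGRES_USER": "my_user",
    "TARGET_POSTGRES_PASSWORD": "secret",
    "TARGET_POSTGRES_HOST": "localhost",
    "TARGET_POSTGRES_SCHEMA": "public",
}
os.environ.update(TEST_ENV)
print(os.environ["TARGET_POSTGRES_DBNAME"])  # → target_db
```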
70 changes: 33 additions & 37 deletions README.md
@@ -23,43 +23,43 @@ installation instructions for [Mac](http://docs.python-guide.org/en/latest/start
It's recommended to use a virtualenv:

```bash
python3 -m venv venv
. venv/bin/activate
pip install pipelinewise-target-postgres
```

or

```bash
make venv
```

### To run

Like any other target that follows the Singer specification:

`some-singer-tap | target-postgres --config [config.json]`

It reads incoming messages from STDIN and uses the properties in `config.json` to upload data into Postgres.

**Note**: To avoid version conflicts run `tap` and `targets` in separate virtual environments.
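A minimal sketch of that STDIN contract, with a simulated message stream (the message shapes follow the Singer spec; the stream and data below are made up for illustration):

```python
import io
import json

# Simulated tap output: one JSON message per line, as a tap writes to stdout.
stream = io.StringIO(
    '{"type": "SCHEMA", "stream": "users", "schema": {"properties": {"id": {"type": "integer"}}}, "key_properties": ["id"]}\n'
    '{"type": "RECORD", "stream": "users", "record": {"id": 1}}\n'
    '{"type": "STATE", "value": {"users": 1}}\n'
)

records = []
for line in stream:
    msg = json.loads(line)
    if msg["type"] == "RECORD":
        records.append(msg["record"])

print(records)  # → [{'id': 1}]
```

A real target also applies the SCHEMA message to create tables and emits/forwards STATE once records are durably loaded.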

#### Spin up a PG DB

Make use of the available docker-compose file to spin up a PG DB.

```bash
docker-compose up -d --build db
```

### Configuration settings

Running the target connector requires a `config.json` file. An example with the minimal settings:

```json
{
  "host": "localhost",
  "port": 5432,
  "user": "my_user",
  "password": "secret",
  "dbname": "target_db",
  "default_target_schema": "public"
}
```
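As a sanity check, the connection-related keys map directly onto a libpq-style connection string (`default_target_schema` is used by the target itself rather than the connection); a sketch:

```python
import json

# The minimal config from the README, parsed as the target would receive it.
CONFIG = json.loads("""
{
  "host": "localhost",
  "port": 5432,
  "user": "my_user",
  "password": "secret",
  "dbname": "target_db",
  "default_target_schema": "public"
}
""")

# Assemble a libpq-style DSN from the connection keys.
dsn = "host={host} port={port} user={user} password={password} dbname={dbname}".format(**CONFIG)
print(dsn)  # → host=localhost port=5432 user=my_user password=secret dbname=target_db
```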

Full list of options in `config.json`:

@@ -96,33 +96,29 @@ Full list of options in `config.json`:
export TARGET_POSTGRES_SCHEMA=<postgres-schema>
```

**PS**: You can run `make env` to export pre-defined environment variables


2. Install python dependencies in a virtual env and run unit and integration tests
```
make venv
```

3. To run unit tests:
```
make unit_test
```

4. To run integration tests:
```
make integration_test
```

### To run pylint:

1. Install python dependencies and run python linter
```
make venv pylint
```

## License
11 changes: 11 additions & 0 deletions docker-compose.yml
@@ -0,0 +1,11 @@
version: "3"

services:
  db:
    image: postgres:12-alpine
    environment:
      POSTGRES_DB: "target_db"
      POSTGRES_USER: "my_user"
      POSTGRES_PASSWORD: "secret"
    ports:
      - 5432:5432
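Because `docker-compose up -d` returns before Postgres is actually accepting connections, scripts that run tests immediately afterwards can benefit from a readiness poll. A stdlib-only sketch (the host/port mirror the compose file; this helper is not part of the repo):

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 30.0) -> bool:
    """Poll until a TCP port accepts connections, or give up after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # A successful TCP connect means something is listening.
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.5)
    return False

# e.g. wait_for_port("localhost", 5432) after `docker-compose up -d --build db`
```

Note that a TCP connect only proves the port is open; a stricter check would also run a trivial query against the database.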
3 changes: 0 additions & 3 deletions requirements.txt

This file was deleted.

15 changes: 7 additions & 8 deletions setup.py
@@ -3,7 +3,7 @@
from setuptools import setup

with open('README.md') as f:
    long_description = f.read()

setup(name="pipelinewise-target-postgres",
      version="2.1.1",
@@ -25,17 +25,16 @@
      ],
      extras_require={
          "test": [
              'pytest==6.2.1',
              'pylint==2.6.0',
              'pytest-cov==2.10.1',
          ]
      },
      entry_points="""
      [console_scripts]
      target-postgres=target_postgres:main
      """,
      packages=["target_postgres"],
      package_data={},
      include_package_data=True,
      )
6 changes: 3 additions & 3 deletions target_postgres/__init__.py
@@ -94,7 +94,6 @@ def persist_lines(config, lines) -> None:
    stream_to_sync = {}
    total_row_count = {}
    batch_size_rows = config.get('batch_size_rows', DEFAULT_BATCH_SIZE_ROWS)

    # Loop over lines from stdin
    for line in lines:
@@ -127,8 +126,9 @@ def persist_lines(config, lines) -> None:
                    raise InvalidValidationOperationException(
                        f"Data validation failed and cannot load to destination. RECORD: {o['record']}\n"
                        "multipleOf validations that allows long precisions are not supported (i.e. with 15 digits"
                        "or more) Try removing 'multipleOf' methods from JSON schema.") from ex
                raise RecordValidationException(
                    f"Record does not pass schema validation. RECORD: {o['record']}") from ex

            primary_key_string = stream_to_sync[stream].record_primary_key_string(o['record'])
            if not primary_key_string:
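The change above switches the re-raises to `raise ... from ex`, which keeps the original validation error attached as `__cause__` instead of discarding it. A standalone sketch of the pattern (the exception class and `validate` helper here are stand-ins, not the repo's real code):

```python
class RecordValidationException(Exception):
    """Stand-in for the target's record validation error type."""

def validate(record):
    try:
        int(record["id"])
    except (KeyError, ValueError) as ex:
        # Chaining preserves the original error for debugging and tracebacks.
        raise RecordValidationException(
            f"Record does not pass schema validation. RECORD: {record}") from ex

try:
    validate({"id": "not-a-number"})
except RecordValidationException as err:
    print(type(err.__cause__).__name__)  # → ValueError
```

Without `from ex`, a bare `raise` inside an `except` block still sets an implicit `__context__`, but `from ex` makes the causal link explicit and survives re-raising outside the handler.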
