
New add_load_date_suffix import option. #681

Merged
3 commits merged into GoogleCloudPlatform:main on Aug 18, 2021

Conversation

@bmenasha (Contributor) commented Aug 16, 2021

With the new add_load_date_suffix import option, the pipeline will now create
a new BigQuery table for each load date. This handles API-incompatible schema
changes better, since each day's imports only need to be consistent with each
other rather than with all prior imports.
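For readers unfamiliar with the option, here is a minimal sketch of the behavior described above. The option name comes from the PR title; the function and argument names are illustrative only, not the pipeline's actual identifiers.

```python
import datetime


def table_name_for_load(base_table, add_load_date_suffix, load_time):
    """Return the BigQuery table name a load should target.

    When add_load_date_suffix is enabled, each load date gets its own
    table (e.g. instance_20210818), so a schema change only has to be
    consistent within that day's import, not with every prior import.
    """
    if not add_load_date_suffix:
        return base_table
    return '{}_{}'.format(base_table, load_time.strftime('%Y%m%d'))


print(table_name_for_load('instance', True, datetime.datetime(2021, 8, 18)))
# -> instance_20210818
```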

@pull-request-size pull-request-size bot added the size/M Denotes a PR that changes 30-99 lines. label Aug 16, 2021
@google-cla google-cla bot added the cla: yes All committers have signed a CLA label Aug 16, 2021
@boredabdel boredabdel self-requested a review August 17, 2021 07:55
@boredabdel (Member)

/gcbrun

@bmenasha (Contributor, Author) commented Aug 18, 2021

Those failures aren't related to this commit. I would fix them myself, but I'd rather @TheLanceLord take a look and resolve them in a different PR, since that may help find bugs in that tool. I resolved them manually on my machine and the cloud-build pipeline was clean afterwards.

Thanks @TheLanceLord for writing that bot; I've had customers asking for that very thing in the past and am very happy you took the effort to write it.

@TheLanceLord (Member)

Yep, I'll take a look and see what I broke.

@boredabdel (Member)

@TheLanceLord This is the PR that broke our CI: #673

Can you have a look, please?

@boredabdel (Member)

/gcbrun

@boredabdel (Member)

@TheLanceLord I ran the linter locally; here are the errors I get:

./google_cloud_support_slackbot.py:182:5: F841 local variable 'token' is assigned to but never used
./google_cloud_support_slackbot.py:255:13: F841 local variable 'e' is assigned to but never used
./google_cloud_support_slackbot.py:284:13: F841 local variable 'e' is assigned to but never used
./google_cloud_support_slackbot.py:560:9: F841 local variable 'escalation_mask' is assigned to but never used
./google_cloud_support_slackbot.py:564:9: F841 local variable 'e' is assigned to but never used
./google_cloud_support_slackbot.py:959:13: F841 local variable 'data' is assigned to but never used
Some files need to be formatted in ./tools/google-cloud-support-slackbot - FAIL

make: *** [Makefile:28: test] Error 1

I will add tools/google-cloud-support-slackbot to the exclusion list for now so it doesn't break CI for everything else. I will open an issue and assign it to you.
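For context on the F841 findings above, here is an illustrative snippet (not the slackbot's actual source) showing what the rule flags and a typical fix: either use the bound variable or drop the binding.

```python
import logging


def risky_call():
    raise ValueError('boom')


# Triggers F841: 'e' is bound but never referenced.
# try:
#     risky_call()
# except ValueError as e:
#     pass

# Fixed: reference the binding, as below, or write 'except ValueError:' with no 'as e'.
try:
    risky_call()
except ValueError as e:
    logging.warning('risky_call failed: %s', e)
```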

@boredabdel (Member)

/gcbrun

@boredabdel (Member) left a comment


LGTM

@boredabdel boredabdel merged commit f3d57a3 into GoogleCloudPlatform:main Aug 18, 2021
rosmo pushed a commit to rosmo/professional-services that referenced this pull request Mar 17, 2022

Co-authored-by: Abdel SGHIOUAR <abdelfettah@google.com>