
cannot connect #2961

Answered by sehrope
RonOrgad asked this question in Q&A
Oct 5, 2023 · 1 comment · 7 replies
I think you have a missing "s" in "postgresql" in: "/Users/ronorgad/Desktop/testing_pyspark/postgreql-42.6.0.jar"

Whatever path you have there, copy it from your code and make sure you can run it from a terminal: ls -l "/Users/ronorgad/Desktop/testing_pyspark/postgreql-42.6.0.jar"

If that does not work, a Spark-related forum would be more appropriate for this type of question, as it's not an issue with the driver. This is a Spark configuration issue.
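The same existence check can also be done in Python before handing the jar to Spark, so a bad path fails with a clear error instead of an opaque classpath problem later. This is a minimal sketch, not from the thread: the helper name and the SparkSession lines in the comment are illustrative assumptions; the path is the one from the question, with the jar name corrected to "postgresql".

```python
import os

def check_driver_jar(jar_path: str) -> str:
    """Return jar_path unchanged if the file exists; raise a clear error otherwise."""
    if not os.path.isfile(jar_path):
        raise FileNotFoundError(f"JDBC driver jar not found: {jar_path}")
    return jar_path

# Hypothetical usage with PySpark (path from the question, jar name corrected):
#
#   from pyspark.sql import SparkSession
#   jar = check_driver_jar(
#       "/Users/ronorgad/Desktop/testing_pyspark/postgresql-42.6.0.jar")
#   spark = (SparkSession.builder
#            .config("spark.jars", jar)
#            .getOrCreate())
```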

Replies: 1 comment 7 replies

Answer selected by RonOrgad
3 participants