Use OAuth to authenticate against BigQuery #42
Comments
With "default credentials", you can use https://cloud.google.com/sdk/gcloud/reference/auth/application-default/login, but you'll need to provide a project ID in the connection string. Note: You'll encounter a warning from the google-auth library when using this method. It can safely be ignored, but if you want to avoid it, it's possible to create a key file associated with your user credentials. I started a PR to make such a utility to do this in pydata/pydata-google-auth#28 (Once you create a key file, set the GOOGLE_APPLICATION_DEFAULT environment variable) |
Admittedly, the experience for this use case could be better. We should consider using the pydata-google-auth package here, which falls back to browser-based login when no other credentials are found.
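A rough sketch of what that fallback looks like with pydata-google-auth on its own; the scope and project ID are placeholders, and since wiring the resulting credentials into this dialect is exactly what hasn't been implemented yet, the sketch hands them straight to google.cloud.bigquery.Client:

```python
# Sketch: obtain end-user credentials with pydata-google-auth, which opens
# a browser login when no cached credentials are found.
import pydata_google_auth
from google.cloud import bigquery

SCOPES = ["https://www.googleapis.com/auth/bigquery"]

credentials = pydata_google_auth.get_user_credentials(SCOPES)

# pybigquery does not accept a credentials object yet, so for now the
# credentials are used with the BigQuery client directly.
client = bigquery.Client(project="my-project", credentials=credentials)
print(list(client.query("SELECT 1").result()))
```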
OK, I'll dig into this, thanks. In fact, I'm targeting an update in Superset, using OAuth and providing the list of BigQuery tables available to the connected user across a whole organisation. Seems a bit tricky to me, but I sense there is a way.
For a shared Superset instance, it's a little trickier than I described. #28 is an attempt at implementing it, but it used super-user permissions with a service account, which isn't ideal. Does Superset have a means of providing a connection string per user?
Need to check where create_engine is called. I guess it would be better if they did...
Could support be added to pass a credentials object directly here (https://github.com/mxmzdlv/pybigquery/blob/master/pybigquery/sqlalchemy_bigquery.py#L327), using the credentials object returned by the OAuth authentication process to create the client? Alternatively, if the provided credentials path accepted not just service account credentials but also OAuth2 client credentials, that would work too.
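To illustrate the request, a hedged sketch of where such a credentials object would come from; the credentials= keyword on create_engine shown at the end is hypothetical and does not exist in pybigquery today, and the client secrets path is a placeholder:

```python
# Hypothetical illustration: run an OAuth "installed app" flow and pass the
# resulting user credentials object to the dialect.
from google_auth_oauthlib.flow import InstalledAppFlow

SCOPES = ["https://www.googleapis.com/auth/bigquery"]

# "client_secret.json" is a placeholder path to an OAuth client config.
flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)
credentials = flow.run_local_server(port=0)  # google.oauth2.credentials.Credentials

# Desired (not yet existing) API, roughly:
# engine = create_engine("bigquery://my-project", credentials=credentials)
```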
Hi @tswast, if you can provide direction on what a reasonable path to allow this would be, @choprashweta or I can prepare a PR.
The fact that the connection args have to be parsed from a URL string makes this a little more difficult. The closest comparison I can think of is how the Simba ODBC driver is configured: it has an optional refresh token setting. If a refresh token were specified in the URL, the credentials could be constructed as: credentials = google.oauth2.credentials.Credentials(None, refresh_token=refresh_token)
To support non-"offline" credentials, it may also be desirable to add an access token option as well.
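A small sketch of the refresh-token approach described above, assuming the OAuth client ID and secret are known to the application; every concrete value here is a placeholder:

```python
# Sketch: rebuild user credentials from a stored refresh token, similar to
# the refresh-token approach described above. All values are placeholders.
import google.oauth2.credentials
from google.cloud import bigquery

credentials = google.oauth2.credentials.Credentials(
    token=None,  # no current access token; google-auth refreshes it on demand
    refresh_token="stored-refresh-token",
    token_uri="https://oauth2.googleapis.com/token",
    client_id="your-oauth-client-id",
    client_secret="your-oauth-client-secret",
)

client = bigquery.Client(project="my-project", credentials=credentials)
print(list(client.query("SELECT 1").result()))
```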
@YannBrrd @tswast are any workarounds available for Superset? Superset uses this library for connecting to BigQuery. We are using a service account JSON key for the connection, and in the current scenario a user can query all the available datasets.
Can anyone provide some pointers for the implementation so I can work on this?
Has anyone worked on this? If anyone can provide any direction, I can look into it.
Adding a
Has anyone tried using domain-wide delegation, so that the service account impersonates the user and only uses the user's permission scope when accessing BigQuery?
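For context, a hedged sketch of what domain-wide delegation looks like with google-auth: the service account credentials are re-scoped with with_subject() so that requests run as the named user. This assumes a Workspace admin has enabled the delegation; the key path, user email, and project ID are placeholders.

```python
# Sketch of domain-wide delegation: the service account impersonates a
# specific user, so BigQuery sees that user's permissions.
from google.oauth2 import service_account
from google.cloud import bigquery

SCOPES = ["https://www.googleapis.com/auth/bigquery"]

sa_credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
delegated = sa_credentials.with_subject("user@example.com")

client = bigquery.Client(project="my-project", credentials=delegated)
print(list(client.query("SELECT 1").result()))
```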
Due to conflicting priorities, closing as Won't Fix.
Is anyone now able to use an access token and refresh token with SQLAlchemy?
Hi,
Is it possible to use OAuth to authenticate against BigQuery, using end-user identity?
Thanks.
Yann