setup.py
Outdated
| # 'Development Status :: 5 - Production/Stable'
| release_status = "Development Status :: 4 - Beta"
| dependencies = [
|     "bigframes >= 1.17.0",
Please don't. In the design I shared, this should be an optional dependency. That means it should be an "extra".
Ack. Moved it to an extra.
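For reference, a minimal sketch of what the extras change might look like in setup.py. Only the bigframes pin comes from this diff; the core dependency shown is an illustrative placeholder:

```python
# Sketch only: dependency names other than bigframes are placeholders.
dependencies = [
    "google-cloud-bigquery >= 3.0.0",  # illustrative core dependency
]
extras = {
    # bigframes is optional, so it belongs in an extra rather than
    # in install_requires; the pin is the one from this PR.
    "bigframes": ["bigframes >= 1.17.0"],
}

# In setup.py these feed into:
#   setuptools.setup(..., install_requires=dependencies, extras_require=extras)
```

Users would then opt in with `pip install bigquery-magics[bigframes]`, while a plain install stays lean.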
Force-pushed e1dfe18 to fa25c72.
Force-pushed 5376177 to 23630a8.
owlbot.py
Outdated
| # Multi-processing note isn't relevant, as bigquery-magics is responsible for
| # creating clients, not the end user.
| "docs/multiprocessing.rst",
| "noxfile.py",
Please don't modify the noxfile in this way. There is a feature in the template that does what we want.
Use unit_test_extras_by_python to ensure we test without extras on some Python versions and with them on others.
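A hedged sketch of how owlbot.py might wire this up; the parameter name comes from the comment above, but the version-to-extras mapping below is an assumed example, so check the exact shape against the synthtool template documentation:

```python
# owlbot.py sketch (the mapping is an assumption, not this PR's actual config).
from synthtool import gcp

templated_files = gcp.CommonTemplates().py_library(
    unit_test_extras_by_python={
        # Test with the bigframes extra on one version; versions omitted
        # from the mapping are tested without extras.
        "3.12": ["bigframes"],
    },
)
```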
noxfile.py
Outdated
| session.install("protobuf<4")
|
| if install_bigframes:
|     session.install("bigframes >= 1.17.0")
Please use the "extras". Don't install this manually.
Sounds good. Updated owlbot.py instead.
| "grpcio >= 1.47.0, < 2.0dev",
| "grpcio >= 1.49.1, < 2.0dev; python_version>='3.11'",
| ],
| "bigframes": ["bigframes >= 1.17.0"],
pip will always install the latest version of bigframes. Let's make sure we test our compatibility with the minimum supported bigframes by adding bigframes==1.17.0 to https://github.com/googleapis/python-bigquery-magics/blob/main/testing/constraints-3.9.txt
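The constraints change would be a one-line addition. A sketch of the relevant part of testing/constraints-3.9.txt (any neighboring pins are illustrative):

```
# testing/constraints-3.9.txt (excerpt)
bigframes==1.17.0
```

Constraints files like this pin the oldest supported versions on the oldest supported Python, so CI catches accidental use of newer-only APIs.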
| bpd.options.bigquery.project = context.project
| bpd.options.bigquery.credentials = context.credentials
I'm worried these might fail if the bigframes session has already started. Maybe that's what we want to happen?
Alternatively, we could explicitly create a bigframes Session object, but then whatever DataFrame we produce won't be compatible with other DataFrames in the notebook.
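To make the concern concrete: bigframes options generally become immutable once the global session starts, so the assignment could be wrapped defensively. A hypothetical sketch; the helper name and the assumption that bigframes raises ValueError on post-session mutation are mine, not from the PR, and should be verified against the bigframes version in use:

```python
def apply_bigquery_context(options, context):
    """Copy project/credentials from the magics context onto a
    bigframes-style options object.

    Returns True on success, False if the options object refused the
    change (e.g. because a session already started). The exception
    type caught here is an assumption.
    """
    try:
        options.bigquery.project = context.project
        options.bigquery.credentials = context.credentials
        return True
    except ValueError:
        # Assumed bigframes behavior: options are frozen once the
        # global session is running.
        return False
```

The caller could then decide whether a False return is an error or simply means "keep using the already-configured session".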
context.engine to 'bigframes' to support query results larger than 10 GB