
feat: Add deferred data uploading #1720


Open

TrevorBergeron wants to merge 6 commits into main

Conversation

TrevorBergeron
Contributor

Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:

  • Make sure to open an issue as a bug/issue before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
  • Ensure the tests and linter pass
  • Code coverage does not decrease (if any source code was changed)
  • Appropriate docs were updated (if necessary)

Fixes #<issue_number_goes_here> 🦕

@product-auto-label bot added labels size: l (Pull request size is large) and api: bigquery (Issues related to the googleapis/python-bigquery-dataframes API) on May 9, 2025
@TrevorBergeron requested a review from tswast on May 14, 2025 00:33
@TrevorBergeron marked this pull request as ready for review on May 14, 2025 00:34
@TrevorBergeron requested review from a team as code owners on May 14, 2025 00:34
):
    mapping = {
        local_data.schema.items[i].column: bq_data.table.physical_schema[i].name
        for i in range(len(local_data.schema))
    }
Collaborator

Should we add any safeguards here in case the lengths don't match?

TrevorBergeron
Contributor Author

Added an assert that the schema lengths match exactly.
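A minimal sketch of what such a guard could look like, reusing the internal names from the snippet above; the PR's actual assertion may be worded differently:

# Hypothetical sketch, not the PR's exact code: fail fast if the uploaded
# table's physical schema does not line up positionally with the local schema.
assert len(local_data.schema) == len(bq_data.table.physical_schema), (
    "Uploaded table schema length does not match local schema"
)
mapping = {
    local_data.schema.items[i].column: bq_data.table.physical_schema[i].name
    for i in range(len(local_data.schema))
}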

# Step 1: Upload all previously un-uploaded data
for leaf in original_root.unique_nodes():
    if isinstance(leaf, nodes.ReadLocalNode):
        if leaf.local_data_source.metadata.total_bytes > 5000:
Collaborator

Let's create a constant for this 5000.

TrevorBergeron
Contributor Author

Done
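One way the refactor might look; MAX_INLINE_BYTES is a hypothetical name, and the sketch reuses the internal names from the snippet above rather than the PR's actual code:

# Hypothetical constant name replacing the magic number 5000: local data
# larger than this many bytes gets uploaded instead of being inlined.
MAX_INLINE_BYTES = 5000

# Step 1: Upload all previously un-uploaded data
for leaf in original_root.unique_nodes():
    if isinstance(leaf, nodes.ReadLocalNode):
        if leaf.local_data_source.metadata.total_bytes > MAX_INLINE_BYTES:
            ...  # upload this leaf's local data to BigQuery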

Comment on lines 549 to 554
if node.offsets_col is not None:
    scan_list = scan_list.append(
        bq_source.table.physical_schema[-1].name,
        bigframes.dtypes.INT_DTYPE,
        node.offsets_col,
    )
Collaborator

What's the reason that offsets_col isn't already in source_mapping? Could you add a comment explaining it?

TrevorBergeron
Contributor Author

It's a bit clumsy because offsets_col is only really a concept on the local node. When we upload, we set the final physical column to hold those offsets, and then we just append them to the end of the scan_list for the ReadTableNode. Added a comment to explain. There's probably some room to re-spec the leaf nodes for more of a 1:1 mapping.
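A sketch of how that comment could sit on the snippet above; the comment wording is illustrative rather than the PR's exact text:

if node.offsets_col is not None:
    # offsets_col only exists on the local node. On upload, the offsets are
    # written as the final physical column of the table, so they are not in
    # source_mapping and must be appended to the end of the ReadTableNode's
    # scan_list instead.
    scan_list = scan_list.append(
        bq_source.table.physical_schema[-1].name,
        bigframes.dtypes.INT_DTYPE,
        node.offsets_col,
    )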

@TrevorBergeron requested a review from tswast on May 16, 2025 18:04