feat: add multipart upload support with progress callbacks #3
Merged
Conversation
dtfiedler (Contributor) commented on Jan 21, 2026
- Add ChunkedUploader class for multipart upload lifecycle
- Add ChunkingParams for configurable chunk size (5-500 MiB), concurrency, and chunking mode (auto/force/disabled)
- Extend upload() with on_progress callback and chunking params
- Add custom upload_url/payment_url support to Turbo client
- Add TurboUploadStatus, ChunkedUploadInit types
- Add UnderfundedError, UploadValidationError, UploadFinalizationError
- Add unit tests for chunking logic (28 tests)
- Add performance benchmark tests (skipped by default)
- Update README with multipart upload documentation
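The PR description gives the chunking bounds (5-500 MiB) and a per-chunk progress callback, but not the exact signatures. The sketch below models only that behavior: splitting a payload into clamped chunks and reporting progress after each one. `plan_chunks` and `drive_progress` are illustrative names, not the SDK's API.

```python
from typing import Callable

# Chunk-size bounds taken from the ChunkingParams description above.
MIN_CHUNK = 5 * 1024 * 1024    # 5 MiB
MAX_CHUNK = 500 * 1024 * 1024  # 500 MiB

def plan_chunks(total_size: int, chunk_size: int) -> list[tuple[int, int]]:
    """Split a payload into (offset, length) chunks, clamping the requested
    chunk size into the allowed 5-500 MiB window."""
    chunk_size = max(MIN_CHUNK, min(chunk_size, MAX_CHUNK))
    return [(off, min(chunk_size, total_size - off))
            for off in range(0, total_size, chunk_size)]

def drive_progress(total_size: int, chunk_size: int,
                   on_progress: Callable[[int, int], None]) -> None:
    """Call on_progress(bytes_done, bytes_total) after each chunk -- the shape
    a per-chunk progress callback typically takes (assumed here)."""
    done = 0
    for _offset, length in plan_chunks(total_size, chunk_size):
        # ... upload the bytes at [_offset, _offset + length) here ...
        done += length
        on_progress(done, total_size)
```

Clamping means a caller who asks for 1 MiB chunks still gets valid 5 MiB parts, and the final chunk simply carries the remainder.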
- Add StreamingDataItem class that wraps data streams with DataItem headers
- Add create_data_header() for building headers separately from data
- Use sign_stream to compute signatures without loading the entire file into memory
- Update _upload_chunked() to use streaming by default for all inputs
- Wrap bytes inputs in BytesIO for a unified code path

This enables uploading large files with constant memory usage regardless of file size. The stream is read twice: once for signing (computing the deep hash), and once for the actual upload.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
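The two-pass scheme described above can be sketched in a few lines. This is not the SDK's `sign_stream`; a bare SHA-384 stands in for the real deep-hash-and-sign step, and the function names are illustrative. The point is that both passes hold only one block in memory at a time.

```python
import hashlib
from io import BytesIO
from typing import BinaryIO, Callable

def sign_stream_sketch(open_stream: Callable[[], BinaryIO],
                       block_size: int = 64 * 1024) -> bytes:
    """Pass 1: hash the stream block by block. Memory use is O(block_size),
    independent of payload size."""
    h = hashlib.sha384()
    stream = open_stream()
    try:
        while block := stream.read(block_size):
            h.update(block)
    finally:
        stream.close()
    return h.digest()

def upload_stream_sketch(open_stream: Callable[[], BinaryIO],
                         send_block: Callable[[bytes], None],
                         block_size: int = 64 * 1024) -> int:
    """Pass 2: reopen the stream and push blocks to the network."""
    sent = 0
    stream = open_stream()
    try:
        while block := stream.read(block_size):
            send_block(block)
            sent += len(block)
    finally:
        stream.close()
    return sent
```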
Replace the data_stream parameter with a stream_factory callable throughout the streaming upload code. This allows:

- Non-seekable streams (generators, network streams, etc.)
- Clean separation between the signing pass and the upload pass
- Easy retries by creating a new stream from the factory

Changes:

- StreamingDataItem now takes stream_factory: Callable[[], BinaryIO]
- Turbo.upload() accepts an optional stream_factory parameter
- Internal _upload_single and _upload_chunked use stream_factory
- Added comprehensive tests for bytes, streams, and stream_factory inputs

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
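The retry benefit of the factory pattern is worth making concrete. In this sketch (illustrative names, not the SDK's internals), each attempt asks the factory for a brand-new stream, so an attempt that consumed or closed its stream cannot poison the next one.

```python
from io import BytesIO
from typing import BinaryIO, Callable

def upload_with_retries(stream_factory: Callable[[], BinaryIO],
                        try_upload: Callable[[BinaryIO], None],
                        max_attempts: int = 3) -> int:
    """Retry an upload, drawing a fresh stream from the factory each time.
    Returns the 1-based attempt number that succeeded."""
    for attempt in range(1, max_attempts + 1):
        stream = stream_factory()
        try:
            try_upload(stream)
            return attempt
        except IOError:
            if attempt == max_attempts:
                raise  # out of attempts: surface the last error
        finally:
            stream.close()
```

Because nothing here calls `seek()`, the same loop works for non-seekable sources, as long as the factory can recreate the stream from scratch.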
…ataItem

StreamingDataItem now accepts the signer via sign(signer) instead of at construction time, matching the DataItem sign() pattern. It also exposes identity properties (id, raw_id, signature, owner, etc.) after signing.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
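A minimal model of that sign-after-construction pattern, under stated assumptions: `StreamingItemSketch` and its reversing signer are invented for illustration, and a plain SHA-384 stands in for the deep hash. The one detail taken as fact is the ANS-104 rule that a DataItem's id is the SHA-256 of its signature; identity properties raise until `sign()` has run.

```python
import hashlib
from io import BytesIO
from typing import BinaryIO, Callable

class StreamingItemSketch:
    """Illustrative stand-in for a streaming data item: construct with a
    stream factory, sign later, then read identity fields."""

    def __init__(self, stream_factory: Callable[[], BinaryIO]):
        self._factory = stream_factory
        self._signature = None

    def sign(self, signer: Callable[[bytes], bytes]) -> None:
        stream = self._factory()
        try:
            digest = hashlib.sha384(stream.read()).digest()  # stand-in hash
        finally:
            stream.close()
        self._signature = signer(digest)

    @property
    def signature(self) -> bytes:
        if self._signature is None:
            raise RuntimeError("call sign() before reading identity fields")
        return self._signature

    @property
    def id(self) -> bytes:
        # Per ANS-104, the data item id is the SHA-256 of the signature.
        return hashlib.sha256(self.signature).digest()
```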
The stream_factory lambda for BinaryIO inputs was returning the same stream object (after seek(0)). Since sign() closes its signing stream, the second factory call would fail on a closed file. Now we read the bytes upfront and create independent BytesIO instances.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
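The fix described in that commit message can be sketched as follows (`make_stream_factory` is an illustrative name): materialize the bytes once, then hand out a fresh `BytesIO` per call, so closing one stream never affects the next.

```python
from io import BytesIO
from typing import BinaryIO, Callable, Union

def make_stream_factory(source: Union[BinaryIO, bytes]) -> Callable[[], BytesIO]:
    """Return a factory yielding an independent stream per call.

    Broken version (for contrast):
        lambda: (source.seek(0), source)[1]
    That returns the SAME object every time, so once the signing pass
    closes it, the upload pass gets a closed file."""
    data = source if isinstance(source, bytes) else source.read()
    return lambda: BytesIO(data)  # each call: a new, open, independent stream
```

The trade-off is that `BinaryIO` inputs are buffered fully in memory; truly large inputs should go through a factory the caller provides instead.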