fix(js-sdk): buffer template upload to avoid S3 chunked PUT 501 #1285

truffle-dev wants to merge 6 commits into e2b-dev:main
Conversation
`uploadFile` passed a Node `Readable` directly to `fetch`, which undici turns into `Transfer-Encoding: chunked` when `Content-Length` is unknown. S3 presigned PUT URLs reject chunked requests with `501 NotImplemented`, which breaks template uploads against S3-compatible storage in self-hosted deployments (issue e2b-dev#1243).

Buffer the archive via `stream/consumers.buffer()` before the PUT so `fetch` sets `Content-Length` automatically. This aligns with the Python SDK, which already materializes the tar into `io.BytesIO` and uploads the bytes via httpx.

Adds a regression test that spins up a local HTTP server, runs `uploadFile` against it, and asserts `Content-Length` is set and matches the received body length.

Fixes e2b-dev#1243
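For reference, a minimal sketch of the buffering approach (illustrative names, not the SDK's actual code; it assumes a presigned `url` and the gzipped tar as a Node `Readable`):

```ts
import { buffer } from 'node:stream/consumers'
import type { Readable } from 'node:stream'

// Hypothetical helper: materialize the stream so fetch can derive
// Content-Length from the byte length instead of chunking the body.
async function putArchive(url: string, archive: Readable): Promise<void> {
  const body = await buffer(archive)
  const res = await fetch(url, { method: 'PUT', body })
  if (!res.ok) {
    throw new Error(`upload failed with ${res.status}`)
  }
}
```

Because `body` is a `Buffer` (a `Uint8Array`), undici sets `Content-Length` from its byte length and never falls back to chunked transfer encoding.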
🦋 Changeset detected

Latest commit: 3bac316

The changes in this PR will be included in the next version bump. This PR includes changesets to release 1 package.
Use `mkdtemp` for an atomic, unique temp dir and lift the server/fixture lifecycle to `beforeAll`/`afterAll` since there's only one test. Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
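A hedged sketch of that lifecycle shape (assumed structure and names, not the repo's test verbatim):

```ts
import { mkdtemp, rm } from 'node:fs/promises'
import { tmpdir } from 'node:os'
import { join } from 'node:path'
import { createServer, type Server } from 'node:http'
import { afterAll, beforeAll } from 'vitest'

let fixtureDir: string
let server: Server

beforeAll(async () => {
  // mkdtemp creates a uniquely named directory atomically, so parallel
  // runs can't race on a fixed temp path.
  fixtureDir = await mkdtemp(join(tmpdir(), 'upload-file-test-'))
  server = createServer((req, res) => res.end())
  await new Promise<void>((resolve) => server.listen(0, resolve))
})

afterAll(async () => {
  server.close()
  await rm(fixtureDir, { recursive: true, force: true })
})
```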
Codex Review: Didn't find any major issues. 👍
The CLI's typecheck (without `skipLibCheck` on the js-sdk source it imports via path mapping) didn't narrow tar's `Pack` to the `AsyncIterable<Buffer>` overload of `node:stream/consumers`' `buffer()`. Cast via `unknown` to satisfy both the CLI and js-sdk typechecks. Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
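Roughly like this (a sketch using tar-stream so the example runs; the SDK's actual tar dependency and variable names may differ):

```ts
import { buffer } from 'node:stream/consumers'
import * as tar from 'tar-stream'

// Stand-in for the archive stream being uploaded.
const pack = tar.pack()
pack.entry({ name: 'hello.txt' }, 'hi')
pack.finalize()

// Pack is a Readable, but without skipLibCheck a stricter typecheck may not
// match it to buffer()'s AsyncIterable<Buffer> overload; routing the cast
// through `unknown` satisfies both projects.
const content = await buffer(pack as unknown as AsyncIterable<Buffer>)
```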
Static import broke the browser build: vite externalized the module and threw at import time for browser test files that transitively pulled in `buildApi`. Defer the import inside `uploadFile` like `tarFileStream` already does for the tar package. Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
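The shape of the deferred import, sketched (simplified signature, not the SDK's actual one):

```ts
import type { Readable } from 'node:stream'

async function uploadFile(url: string, archive: Readable): Promise<void> {
  // Loaded lazily so browser bundles never resolve node:stream/consumers
  // at module-evaluation time; only the Node path actually executes this.
  const { buffer } = await import('node:stream/consumers')
  const body = await buffer(archive)
  const res = await fetch(url, { method: 'PUT', body })
  if (!res.ok) {
    throw new Error(`upload failed with ${res.status}`)
  }
}
```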
moved to #1285
…n tests (#1294)

## Summary

Consolidates the fix and tests from #1285 and #1293 into a single PR.

- **js-sdk**: `uploadFile` used to pass a Node `Readable` directly to `fetch`, causing undici to fall back to `Transfer-Encoding: chunked`. S3 presigned PUT URLs reject chunked with 501 NotImplemented. The fix buffers the archive first so `Content-Length` is set. Includes:
  - Regression test that spins up a local HTTP server and asserts `Content-Length` is set and matches the body, and `Transfer-Encoding` is not chunked (see the sketch after this summary).
  - Type-fix for the CLI's typecheck (cast `Pack` → `AsyncIterable<Buffer>`).
  - Dynamic import of `node:stream/consumers` so the browser bundle doesn't pull it in.
- **python-sdk**: Adds sync + async regression tests for `upload_file` that guard against the same class of bug (someone swapping `tar_buffer.getvalue()` for a stream/generator). No Python code change: the current implementation already passes bytes to `httpx.put(..., content=...)`.

Authorship of the original JS fix commit preserved (truffle-dev). Closes #1243.

## Test plan

- [x] `pnpm run test tests/template/uploadFile.test.ts` passes
- [x] `pnpm run typecheck` / `lint` clean across js-sdk and cli
- [x] `poetry run pytest tests/sync/template_sync/test_upload_file.py tests/async/template_async/test_upload_file.py -v` both pass
- [x] `poetry run make format` / `make lint` / `make typecheck` clean

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---

Co-authored-by: truffle <truffleagent@gmail.com>
Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
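A sketch of the assertion pattern described above (illustrative only: it drives Node's built-in `fetch` directly, whereas the real regression test drives `uploadFile`):

```ts
import { createServer, type IncomingHttpHeaders } from 'node:http'
import type { AddressInfo } from 'node:net'
import { expect, test } from 'vitest'

test('buffered PUT sets Content-Length and is not chunked', async () => {
  const chunks: Buffer[] = []
  let headers: IncomingHttpHeaders = {}
  const server = createServer((req, res) => {
    headers = req.headers
    req.on('data', (chunk: Buffer) => chunks.push(chunk))
    req.on('end', () => res.end())
  })
  await new Promise<void>((resolve) => server.listen(0, resolve))
  const { port } = server.address() as AddressInfo

  const body = Buffer.from('pretend this is the gzipped tar')
  await fetch(`http://127.0.0.1:${port}/`, { method: 'PUT', body })

  // The two properties the regression test guards.
  expect(headers['transfer-encoding']).toBeUndefined()
  expect(Number(headers['content-length'])).toBe(Buffer.concat(chunks).length)
  server.close()
})
```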
Fixes #1243.
`uploadFile` passed the gzipped tar `Readable` directly to `fetch`. With no known `Content-Length`, undici falls back to `Transfer-Encoding: chunked`, which S3 presigned PUT URLs reject with `501 NotImplemented`. Template uploads against S3-compatible storage in self-hosted deployments were broken.

Buffer the archive with `stream/consumers.buffer()` before the PUT so `fetch` sets `Content-Length` automatically. This aligns the JS SDK with the Python SDK, which already materializes the tar into `io.BytesIO` and uploads bytes via httpx (`packages/python-sdk/e2b/template_sync/build_api.py:upload_file`).

## Verification
Added `tests/template/uploadFile.test.ts`: spins up a local HTTP server, runs `uploadFile`, asserts `Content-Length` is set and matches the received body length, and asserts no chunked transfer encoding.

Stash-bisected against the fix:
All 51 JS SDK unit tests (`tests/template/utils` + new file) pass. The 22 failures in the full `tests/template` run are integration tests that require `E2B_API_KEY`; they are unrelated.

## Trade-off note
Like the Python SDK, this materializes the archive in memory before PUT. Template build contexts are bounded (the SDK targets Docker-build-sized payloads) and aligning with the Python SDK keeps behavior consistent across clients. If memory pressure becomes an issue for very large uploads, a streaming alternative would need coordinated changes on both SDKs and the server to compute Content-Length up front.
Issue: #1243