fix(js-sdk): buffer template upload to avoid S3 chunked PUT 501 #1285

Closed

truffle-dev wants to merge 6 commits into e2b-dev:main from truffle-dev:fix/js-sdk-upload-content-length


Conversation

@truffle-dev
Contributor

Fixes #1243.

uploadFile passed the gzipped tar Readable directly to fetch. With no known Content-Length, undici falls back to Transfer-Encoding: chunked, which S3 presigned PUT URLs reject with 501 NotImplemented. Template uploads against S3-compatible storage in self-hosted deployments were broken.

Buffer the archive with stream/consumers.buffer() before PUT so fetch sets Content-Length automatically. This aligns the JS SDK with the Python SDK, which already materializes the tar into io.BytesIO and uploads bytes via httpx (packages/python-sdk/e2b/template_sync/build_api.py:upload_file).
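
For illustration, a minimal sketch of the before/after shape, assuming a hypothetical `uploadFile(presignedUrl, tarStream)` signature rather than the SDK's real one:

```ts
import { Readable } from 'node:stream'
import { buffer } from 'node:stream/consumers'

// Hypothetical signature for illustration; the SDK's actual function
// lives in the template build API and takes different arguments.
async function uploadFile(presignedUrl: string, tarStream: Readable): Promise<void> {
  // Before: `body: tarStream` left the length unknown, so undici fell
  // back to Transfer-Encoding: chunked and S3 answered 501.
  // After: materialize the stream; with a fixed-size Buffer body, fetch
  // sets Content-Length automatically and never uses chunked framing.
  const body = await buffer(tarStream)
  const res = await fetch(presignedUrl, { method: 'PUT', body })
  if (!res.ok) {
    throw new Error(`upload failed: ${res.status} ${res.statusText}`)
  }
}
```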

Verification

Added tests/template/uploadFile.test.ts: spins up a local HTTP server, runs uploadFile, asserts Content-Length is set and matches the received body length, and asserts no chunked transfer encoding.
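
In condensed form, the shape of that test (vitest assumed; `uploadFile` here is the hypothetical helper sketched above, standing in for the SDK's real function):

```ts
import { createServer } from 'node:http'
import type { AddressInfo } from 'node:net'
import { Readable } from 'node:stream'
import { expect, test } from 'vitest'

test('sets Content-Length and does not use chunked transfer encoding', async () => {
  let contentLength: string | undefined
  let transferEncoding: string | undefined
  let receivedBytes = 0

  // Local stand-in for the S3 presigned PUT endpoint: record the request
  // headers and count the body bytes actually received.
  const server = createServer((req, res) => {
    contentLength = req.headers['content-length']
    transferEncoding = req.headers['transfer-encoding']
    req.on('data', (chunk: Buffer) => (receivedBytes += chunk.length))
    req.on('end', () => res.end())
  })
  await new Promise<void>((resolve) => server.listen(0, resolve))
  const { port } = server.address() as AddressInfo

  // `uploadFile` is the buffered-PUT sketch from above.
  const fakeArchive = Readable.from([Buffer.from('pretend gzipped tar')])
  await uploadFile(`http://127.0.0.1:${port}/upload`, fakeArchive)

  expect(transferEncoding).not.toBe('chunked')
  expect(contentLength).toBeDefined()
  expect(Number(contentLength)).toBe(receivedBytes)

  server.close()
})
```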

Stash-bisected against the fix:

# pre-fix (body = Readable)
× expected undefined to be defined  (Content-Length was never sent)

# post-fix (body = Buffer)
✓ sets Content-Length and does not use chunked transfer encoding  (68ms)

All 51 JS SDK unit tests (tests/template/utils + new file) pass. The 22 failures in the full tests/template run are integration tests that require E2B_API_KEY; they are unrelated.

Trade-off note

Like the Python SDK, this materializes the archive in memory before PUT. Template build contexts are bounded (the SDK targets Docker-build-sized payloads) and aligning with the Python SDK keeps behavior consistent across clients. If memory pressure becomes an issue for very large uploads, a streaming alternative would need coordinated changes on both SDKs and the server to compute Content-Length up front.

Issue: #1243

uploadFile passed a Node Readable directly to fetch, which undici
turns into Transfer-Encoding: chunked when Content-Length is unknown.
S3 presigned PUT URLs reject chunked requests with 501 NotImplemented,
which breaks template uploads against S3-compatible storage in
self-hosted deployments (issue e2b-dev#1243).

Buffer the archive via stream/consumers.buffer() before PUT so fetch
sets Content-Length automatically. Aligns with the Python SDK, which
already materializes the tar into io.BytesIO and uploads bytes via
httpx.

Adds a regression test that spins up a local HTTP server, runs
uploadFile against it, and asserts Content-Length is set and matches
the received body length.

Fixes e2b-dev#1243

@claude Bot left a comment


Claude Code Review

This pull request is from a fork — automated review is disabled. A repository maintainer can comment @claude review to run a one-time review.

changeset-bot Bot commented Apr 22, 2026

🦋 Changeset detected

Latest commit: 3bac316

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package:

| Name | Type  |
| ---- | ----- |
| e2b  | Patch |


@mishushakov
Member

@cursor review
@codex review
@claude review

cursor Bot commented Apr 22, 2026

Bugbot couldn't run

Bugbot is not enabled for your user on this team.

Ask your team administrator to increase your team's hard limit for Bugbot seats or add you to the allowlist in the Cursor dashboard.

Use mkdtemp for atomic unique temp dir and lift server/fixture lifecycle
to beforeAll/afterAll since there's only one test.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
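
The shape of that cleanup, sketched with hypothetical fixture names:

```ts
import { mkdtemp, rm } from 'node:fs/promises'
import { tmpdir } from 'node:os'
import { join } from 'node:path'
import { afterAll, beforeAll } from 'vitest'

let fixtureDir: string

beforeAll(async () => {
  // mkdtemp creates a directory with a random suffix in a single call,
  // so concurrent runs can never race on a fixed path.
  fixtureDir = await mkdtemp(join(tmpdir(), 'upload-file-test-'))
  // ...write fixture files and start the local HTTP server here...
})

afterAll(async () => {
  // ...close the server here...
  await rm(fixtureDir, { recursive: true, force: true })
})
```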
@chatgpt-codex-connector

Codex Review: Didn't find any major issues. 👍

mishushakov and others added 2 commits April 22, 2026 14:46
The cli's typecheck (without skipLibCheck on the js-sdk source it imports
via path mapping) didn't narrow tar's Pack to the AsyncIterable<Buffer>
overload of node:stream/consumers' buffer(). Cast via unknown to satisfy
both the cli and js-sdk typechecks.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
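
Roughly the cast in question, with the tar library's concrete `Pack` type elided:

```ts
import { buffer } from 'node:stream/consumers'

// Stand-in for the tar Pack stream built from the template context; the
// concrete type comes from the tar library the SDK uses.
declare const pack: NodeJS.ReadableStream

// Pack is a Readable and async-iterable at runtime, but the cli's
// stricter typecheck didn't narrow it to buffer()'s
// AsyncIterable<Buffer> overload, so the cast goes through `unknown`
// to satisfy both the cli and js-sdk configurations.
const body = await buffer(pack as unknown as AsyncIterable<Buffer>)
```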
Static import broke the browser build — vite externalized the module and
threw at import time for browser test files that transitively pulled in
buildApi. Defer the import inside uploadFile like tarFileStream already
does for the tar package.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
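
A sketch of the deferral, again with an illustrative signature:

```ts
// Hypothetical signature for illustration.
async function uploadFile(url: string, tarStream: AsyncIterable<Buffer>): Promise<void> {
  // A static `import { buffer } from 'node:stream/consumers'` at module
  // scope is evaluated by every bundle that transitively includes this
  // file, which is what broke the vite browser build. Importing lazily
  // keeps the Node-only module out of browser code paths, which never
  // call uploadFile.
  const { buffer } = await import('node:stream/consumers')
  const body = await buffer(tarStream)
  await fetch(url, { method: 'PUT', body })
}
```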
@mishushakov
Member

moved to #1285

mishushakov added a commit that referenced this pull request Apr 24, 2026
…n tests (#1294)

## Summary
Consolidates the fix and tests from #1285 and #1293 into a single PR.

- **js-sdk**: `uploadFile` used to pass a Node `Readable` directly to `fetch`, causing undici to fall back to `Transfer-Encoding: chunked`. S3 presigned PUT URLs reject chunked with 501 NotImplemented. The fix buffers the archive first so `Content-Length` is set. Includes:
  - Regression test that spins up a local HTTP server and asserts `Content-Length` is set and matches the body, and `Transfer-Encoding` is not chunked.
  - Type-fix for the CLI's typecheck (cast `Pack` → `AsyncIterable<Buffer>`).
  - Dynamic import of `node:stream/consumers` so the browser bundle doesn't pull it in.
- **python-sdk**: Adds sync + async regression tests for `upload_file` that guard against the same class of bug (someone swapping `tar_buffer.getvalue()` for a stream/generator). No Python code change: the current implementation already passes bytes to `httpx.put(..., content=...)`.

Authorship of the original JS fix commit preserved (truffle-dev).

Closes #1243.

## Test plan
- [x] `pnpm run test tests/template/uploadFile.test.ts` — passes
- [x] `pnpm run typecheck` / `lint` clean across js-sdk and cli
- [x] `poetry run pytest tests/sync/template_sync/test_upload_file.py tests/async/template_async/test_upload_file.py -v` — both pass
- [x] `poetry run make format` / `make lint` / `make typecheck` clean

🤖 Generated with [Claude Code](https://claude.com/claude-code)

---------

Co-authored-by: truffle <truffleagent@gmail.com>
Co-authored-by: Claude Opus 4.7 <noreply@anthropic.com>
truffle-dev deleted the fix/js-sdk-upload-content-length branch on April 30, 2026 at 22:25
Successfully merging this pull request may close these issues.

[JS SDK] uploadFile uses chunked transfer encoding, causing 501 NotImplemented on S3 presigned PUT URLs
