
fix(sdk): buffer template upload to set Content-Length, add regression tests#1294

Merged

mishushakov merged 6 commits into main from mishushakov/s3-upload-fix on Apr 24, 2026

Conversation

@mishushakov
Member

Summary

Consolidates the fix and tests from #1285 and #1293 into a single PR.

  • js-sdk: uploadFile used to pass a Node Readable directly to fetch, causing undici to fall back to Transfer-Encoding: chunked. S3 presigned PUT URLs reject chunked requests with 501 NotImplemented. The fix buffers the archive first so that Content-Length is set. Includes:
    • Regression test that spins up a local HTTP server and asserts Content-Length is set and matches the body, and Transfer-Encoding is not chunked.
    • Type fix for the CLI's typecheck (cast tar's Pack to AsyncIterable<Buffer>).
    • Dynamic import of node:stream/consumers so the browser bundle doesn't pull it in.
  • python-sdk: Adds sync + async regression tests for upload_file that guard against the same class of bug (someone swapping tar_buffer.getvalue() for a stream/generator). No Python code change — the current implementation already passes bytes to httpx.put(..., content=...).

Authorship of the original JS fix commit preserved (truffle-dev).

Closes #1243.

Test plan

  • pnpm run test tests/template/uploadFile.test.ts — passes
  • pnpm run typecheck / lint clean across js-sdk and cli
  • poetry run pytest tests/sync/template_sync/test_upload_file.py tests/async/template_async/test_upload_file.py -v — both pass
  • poetry run make format / make lint / make typecheck clean

🤖 Generated with Claude Code

truffle-dev and others added 6 commits April 24, 2026 15:40
uploadFile passed a Node Readable directly to fetch, which undici
turns into Transfer-Encoding: chunked when Content-Length is unknown.
S3 presigned PUT URLs reject chunked requests with 501 NotImplemented,
which breaks template uploads against S3-compatible storage in
self-hosted deployments (issue #1243).

Buffer the archive via stream/consumers.buffer() before PUT so fetch
sets Content-Length automatically. Aligns with the Python SDK, which
already materializes the tar into io.BytesIO and uploads bytes via
httpx.

Adds a regression test that spins up a local HTTP server, runs
uploadFile against it, and asserts Content-Length is set and matches
the received body length.

Fixes #1243
Use mkdtemp for atomic unique temp dir and lift server/fixture lifecycle
to beforeAll/afterAll since there's only one test.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
The cli's typecheck (without skipLibCheck on the js-sdk source it imports
via path mapping) didn't narrow tar's Pack to the AsyncIterable<Buffer>
overload of node:stream/consumers' buffer(). Cast via unknown to satisfy
both the cli and js-sdk typechecks.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
Static import broke the browser build — vite externalized the module and
threw at import time for browser test files that transitively pulled in
buildApi. Defer the import inside uploadFile like tarFileStream already
does for the tar package.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
…ngth

Guards against future changes swapping the buffered bytes for a
generator/stream, which would trigger Transfer-Encoding: chunked and
break S3 presigned PUT uploads with 501 NotImplemented.

Co-Authored-By: Claude Opus 4.7 <noreply@anthropic.com>
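The failure mode these tests guard against can be demonstrated directly: when fetch is handed a stream instead of buffered bytes, undici cannot know the size up front and falls back to chunked encoding. A sketch under Node 18+ (demonstration only, not SDK code):

```typescript
// Demonstration of the guarded-against failure mode: a streamed request body
// makes fetch/undici use Transfer-Encoding: chunked instead of Content-Length.
import { createServer, type IncomingHttpHeaders } from 'node:http'

export async function headersForStreamedPut(): Promise<IncomingHttpHeaders> {
  let headers: IncomingHttpHeaders = {}
  const server = createServer((req, res) => {
    headers = req.headers
    req.resume()
    req.on('end', () => res.end())
  })
  await new Promise<void>((resolve) => server.listen(0, () => resolve()))
  const { port } = server.address() as { port: number }
  const body = new ReadableStream<Uint8Array>({
    start(controller) {
      controller.enqueue(new TextEncoder().encode('streamed payload'))
      controller.close()
    },
  })
  // undici requires duplex: 'half' for streaming request bodies.
  await fetch(`http://127.0.0.1:${port}/`, { method: 'PUT', body, duplex: 'half' } as RequestInit)
  server.close()
  server.closeAllConnections()
  return headers
}
```

An S3 presigned PUT endpoint answers such a request with 501 NotImplemented, which is the behavior issue #1243 reported.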
@cursor

cursor Bot commented Apr 24, 2026

PR Summary

Medium Risk
Changes template upload behavior in the JS SDK from streaming to fully-buffered PUT requests, which can affect memory usage and performance for large templates and could surface new upload-size edge cases.

Overview
Fixes JS SDK template uploads to S3-compatible presigned PUT URLs by buffering the generated tar archive before calling fetch, ensuring a Content-Length header is sent and avoiding Transfer-Encoding: chunked (which S3 rejects with 501 NotImplemented). The implementation uses a dynamic import of node:stream/consumers and a cast to treat the tar stream as an async iterable so the browser bundle doesn’t pull in Node stream modules.

Adds regression tests: a new JS test that spins up a local HTTP server to assert Content-Length matches the received body and chunked encoding isn’t used, plus equivalent sync/async Python tests to guard against future regressions. Includes a changeset bump for e2b (patch).

Reviewed by Cursor Bugbot for commit 1323a27. Bugbot is set up for automated code reviews on this repo. Configure here.

@changeset-bot

changeset-bot Bot commented Apr 24, 2026

🦋 Changeset detected

Latest commit: 1323a27

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package:
  • e2b (patch)


@github-actions
Contributor

Package Artifacts

Built from 0b91e8f. Download artifacts from this workflow run.

JS SDK (e2b@2.19.1-mishushakov-s3-upload-fix.0):

npm install ./e2b-2.19.1-mishushakov-s3-upload-fix.0.tgz

CLI (@e2b/cli@2.9.1-mishushakov-s3-upload-fix.0):

npm install ./e2b-cli-2.9.1-mishushakov-s3-upload-fix.0.tgz

Python SDK (e2b==2.20.0+mishushakov-s3-upload-fix):

pip install ./e2b-2.20.0+mishushakov.s3.upload.fix-py3-none-any.whl


@chatgpt-codex-connector Bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 1323a2767a


Comment thread: packages/js-sdk/src/template/buildApi.ts

@claude Bot left a comment


LGTM — small, well-scoped fix with solid regression coverage.


Overview

The PR changes uploadFile in packages/js-sdk/src/template/buildApi.ts to buffer the tar archive via node:stream/consumers.buffer before calling fetch, so undici sets Content-Length instead of falling back to Transfer-Encoding: chunked (which S3 presigned PUT URLs reject with 501). It adds a JS regression test that spins up a local HTTP server and asserts the headers, plus sync+async Python regression tests guarding the existing Python behavior (no Python code change).

Security risks

None identified. This is a client-side HTTP behavior fix — no auth/crypto/permission surface touched. The dynamic import of node:stream/consumers is a hardcoded module string, not user-controlled.

Level of scrutiny

Low-to-medium. The change is ~10 lines of production code in a single SDK function with a clearly stated motivation (GitHub issue #1243) and an obvious matching fix pattern already used by the Python SDK. The tradeoff flagged by cursor[bot] — buffering large templates in memory — is real but already the status quo on the Python side, so this PR simply aligns the two SDKs rather than introducing a novel design decision.

Other factors

The cast uploadStream as unknown as AsyncIterable<Buffer> is explained in-code with a clear justification tied to preserveSymlinks in the CLI's tsconfig; tar's Pack is indeed async-iterable at runtime via Minipass. The dynamic import keeps the browser bundle clean. No outstanding human review comments, and the bug-hunting system found no issues.

@mishushakov mishushakov enabled auto-merge (squash) April 24, 2026 13:50
@mishushakov mishushakov merged commit 3167e19 into main Apr 24, 2026
24 checks passed
@mishushakov mishushakov deleted the mishushakov/s3-upload-fix branch April 24, 2026 14:14


Development

Successfully merging this pull request may close these issues.

[JS SDK] uploadFile uses chunked transfer encoding, causing 501 NotImplemented on S3 presigned PUT URLs
