
Add optional maxDepth and maxAliases execution options#4666

Open
eddieran wants to merge 2 commits into graphql:16.x.x from eddieran:fix/execution-depth-alias-limits

Conversation

@eddieran

Summary

Fixes #4662 — the execution engine has no built-in depth or complexity limits, allowing deep recursive queries and alias-bombing to cause denial-of-service via resolver amplification.

This PR adds two opt-in execution options:

  • maxDepth — limits the field nesting depth during execution. When a field at depth > maxDepth is encountered, a GraphQLError is raised and the parent field resolves to null (standard nullable error handling). List indices do not count toward depth.

  • maxAliases — limits the number of response keys (including aliases) per selection set. This prevents alias-bombing attacks where thousands of aliases for the same field bypass depth-based defenses. When exceeded, a GraphQLError is raised before the selection set executes.

Both options are undefined by default — no limits are applied and existing behavior is fully preserved. They are passed via ExecutionArgs.options alongside the existing maxCoercionErrors:

const result = await execute({
  schema,
  document,
  options: {
    maxDepth: 10,
    maxAliases: 50,
  },
});

Implementation details

  • Depth checking happens in executeField by walking the Path linked list (only counting string keys, not list indices)
  • Alias counting happens in executeOperation (root fields) and completeObjectValue (sub-selections) after field collection
  • Errors follow standard GraphQL error propagation — nullable parent fields are nulled, non-null fields propagate upward
  • 12 new tests covering: within-limit queries, exceeded limits, no-op when unset, list index handling, nested alias limits, combined usage
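The Path walk mentioned in the first bullet can be sketched as follows. This is a self-contained illustration with hypothetical names (`fieldDepth`, a simplified `Path` shape modeled on graphql-js's internal path list), not the PR's actual code:

```typescript
// Hypothetical sketch (not the PR's actual code) of the depth check:
// a simplified Path linked list, walked leaf-to-root counting only
// string keys so that list indices do not contribute to depth.
interface Path {
  readonly prev: Path | undefined;
  readonly key: string | number; // string = field response key, number = list index
}

function fieldDepth(path: Path | undefined): number {
  let depth = 0;
  for (let p = path; p !== undefined; p = p.prev) {
    if (typeof p.key === 'string') {
      depth++;
    }
  }
  return depth;
}

// A field reached via friends[3].name sits at depth 2, not 3,
// because the list index is skipped.
const friends: Path = { prev: undefined, key: 'friends' };
const item3: Path = { prev: friends, key: 3 };
const nameField: Path = { prev: item3, key: 'name' };
```

This matches the semantics described above: list indices never deepen the query, so paginating over a list does not trip the limit.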

Test plan

  • All 12 new tests pass
  • Full test suite (1972 tests) passes with no regressions
  • TypeScript compilation clean
  • ESLint clean

Add opt-in depth and alias limits to the execution engine to mitigate
denial-of-service attacks via deeply nested queries and alias bombing.

- maxDepth: limits the field nesting depth during execution. When a
  field exceeds the configured depth, a GraphQLError is raised and the
  parent field resolves to null (standard error handling).

- maxAliases: limits the number of response keys (including aliases)
  per selection set. When exceeded, a GraphQLError is raised before
  the selection set is executed.

Both options are undefined by default, preserving full backwards
compatibility. They are passed via ExecutionArgs.options alongside the
existing maxCoercionErrors option.

Fixes graphql#4662
@linux-foundation-easycla

linux-foundation-easycla Bot commented Apr 14, 2026

CLA Not Signed

@vercel

vercel Bot commented Apr 14, 2026

@eddieran is attempting to deploy a commit to The GraphQL Foundation Team on Vercel.

A member of the Team first needs to authorize it.

Contributor

@yaacovCR yaacovCR left a comment


Maybe we could consider adding a fieldDepth property to FieldDetails of type number? Then we wouldn't have to keep recalculating the path length?

Addresses review feedback: rather than walking the Path linked list
at every executeField invocation, thread the current field depth as
an explicit parameter through executeFields/executeField/completeValue
and its helpers. completeObjectValue increments depth when descending
into sub-selections; list items and abstract-type resolution keep the
same depth as their parent field.

This changes the depth check from O(depth) per field resolution to
O(1), while preserving identical semantics (list indices still do not
count toward depth).
@eddieran
Author

Thanks for the review, @yaacovCR! Since FieldDetails doesn't exist in the 16.x.x branch this PR targets (collected fields are plain Map<string, ReadonlyArray<FieldNode>>), I took the equivalent approach: thread a depth: number parameter through executeFields/executeField/completeValue and its helpers. completeObjectValue increments depth when descending into sub-selections; list items and abstract-type resolution keep the same depth as their parent field.

This turns the depth check into O(1) per field resolution instead of O(depth) walk of the Path linked list, while preserving identical semantics (list indices still don't count).

All 12 new tests and the full 1972-test suite still pass. Pushed in f9d67dc. Happy to take further suggestions, e.g. if you'd prefer a different naming or placement.
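The threading approach can be sketched with a toy recursion. Names and shapes here are hypothetical and deliberately simplified (not graphql-js internals); the point is only to show how object sub-selections increment depth while list items inherit their parent field's depth:

```typescript
// Toy model: a value is either an object with sub-selections or a list
// of items. Descending into an object's fields increments depth; list
// items keep their parent field's depth, so list indices never count.
type ToyValue = { fields?: Record<string, ToyValue>; items?: ToyValue[] };

function completeToyValue(
  value: ToyValue,
  depth: number,
  maxDepth: number,
  seenDepths: number[],
): void {
  if (value.items) {
    for (const item of value.items) {
      // Same depth as the parent field: list indices do not count.
      completeToyValue(item, depth, maxDepth, seenDepths);
    }
  } else if (value.fields) {
    const childDepth = depth + 1; // descend into a sub-selection
    for (const key of Object.keys(value.fields)) {
      if (childDepth > maxDepth) {
        throw new Error(`Depth ${childDepth} exceeds limit ${maxDepth}`);
      }
      seenDepths.push(childDepth); // O(1) check, no path walk
      completeToyValue(value.fields[key], childDepth, maxDepth, seenDepths);
    }
  }
}
```

Because the depth arrives as a parameter, each field performs a single comparison instead of an O(depth) walk of the path list.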

Contributor

@yaacovCR yaacovCR left a comment


This looks great to me, comments/suggestions inline.

Looping in @benjie with regard to naming as he is driving the Golden Path initiative (graphql/graphql-wg#1887, https://github.com/graphql/golden-path-wg).

Gentle nudge also about the CLA!

(CI also notes there is a prettier issue).

From benchmarking on my machine ran from the 17.x.x branch:

npm run benchmark -- benchmark/introspectionFromSchema-benchmark.js benchmark/list-sync-benchmark.js benchmark/list-async-benchmark.js benchmark/object-async-benchmark.js --revs pr/4666 v16.13.2

[Image: benchmark results comparing pr/4666 against v16.13.2]

Comment thread src/execution/execute.ts
path: Path,
depth: number,
): PromiseOrValue<unknown> {
// Check depth limit before resolving the field.
Contributor


If the depth limit check were in completeObjectValue (and at the root) rather than at the field, right after field collection, we could return all the collected fields/subfields. This change would also allow setting a limit of 0, which would fail everything. That is not so useful in itself, but it protects us in the edge case where the error message would read "Query depth limit of 0 exceeded, found depth: 2." when it might be expected to read "Query depth limit of 0 exceeded, found depth: 1."

Comment thread src/execution/execute.ts
* When exceeded, a GraphQLError is thrown for the offending field.
* No limit is applied when undefined (the default).
*/
maxDepth?: number;
Contributor


We could consider calling this option maxResultDepth as we may later wish to introduce a separate maximum inline/named fragment depth with respect to field collection.

Comment thread src/execution/execute.ts
* When exceeded, a GraphQLError is thrown before executing the selection set.
* No limit is applied when undefined (the default).
*/
maxAliases?: number;
Contributor


Suggested change
maxAliases?: number;
maxResponseNames?: number;

The idea is that we want a maximum breadth at an object level, whether or not aliases are used.

I think "response name" would be the official term => https://spec.graphql.org/draft/#sec-Field-Alias

(This would need to be renamed throughout the code/comments.)
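A minimal sketch of the breadth check under the suggested name, assuming the collected-fields `Map<string, ReadonlyArray<FieldNode>>` shape described earlier in the thread (`checkResponseNames` is a hypothetical helper, not code from the PR):

```typescript
// Hypothetical helper showing the breadth check under the suggested
// name. Collected fields map each response name (alias or field name)
// to the field nodes coalesced under it, so the map's size is the
// selection set's breadth in response names, aliased or not.
function checkResponseNames(
  collectedFields: ReadonlyMap<string, ReadonlyArray<unknown>>,
  maxResponseNames: number | undefined,
): void {
  if (maxResponseNames !== undefined && collectedFields.size > maxResponseNames) {
    throw new Error(
      `Selection set contains ${collectedFields.size} response names, ` +
        `exceeding the limit of ${maxResponseNames}.`,
    );
  }
}
```

Counting map entries rather than alias nodes means the limit applies uniformly whether breadth comes from aliases or from many distinct fields.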

Comment thread src/execution/execute.ts
if (exeContext.maxAliases !== undefined) {
const aliasCount = fields.size;
if (aliasCount > exeContext.maxAliases) {
// Collect nodes for the error location from the first field node of
Contributor


Not sure, but I think we would want to include all nodes, not just those above the maximum: it's not the fault of a later node if a node added earlier in the operation causes it to fail, and we would not be able to tell which is the real offending node. Alternatively, we could just list the parent with too many sub-response names. I'm not sure how much value we actually add by including all of the nodes that cause the response-names limit to be breached; it may be more helpful to highlight the offending parent (considering the list of nodes would be expected to be quite long).

Comment thread src/execution/execute.ts
// each response key beyond the limit.
const nodes: Array<FieldNode> = [];
for (const [, fieldNodes] of fields) {
nodes.push(fieldNodes[0]);
Contributor


If we include all the offending response names, then for consistency I would assume we need to include all of the fieldNodes, not just the first. This array is the set of overlapping field nodes that have been coalesced into one response name; if we think we're helping by pointing out all the nodes in the operation that cause this limit to be breached, we would just confuse things by not listing every node that is part of the problem. If we include just the first, removing that node would not avoid the error when other nodes share the same response name.

(As above, not sure if we actually need to include all of the offending nodes rather than just the parent.)

@benjie
Member

benjie commented Apr 24, 2026

Is there a reason for doing this at resolve-time rather than validation-time?

Returning null for data past a given depth is unsafe; a client might update their normalized cache thinking null is the true data. An error must be thrown instead.

I'd recommend doing this as a validation rule instead.

I'd strongly recommend that you factor in list depth; it's a much more significant concern than selection-set depth, since selection sets grow cost linearly while lists grow cost exponentially.
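The linear-vs-exponential point can be illustrated with a rough cost model. The function and the assumed fan-out are hypothetical, not part of the PR; it is only a back-of-the-envelope sketch:

```typescript
// Rough cost model: with an assumed per-list fan-out, a chain of plain
// selection sets adds resolver calls linearly, while each nested list
// level multiplies them, i.e. cost grows exponentially in list depth.
function estimateResolverCalls(
  selectionDepth: number,
  listDepth: number,
  assumedListSize: number,
): number {
  return selectionDepth * Math.pow(assumedListSize, listDepth);
}
```

Under this model, ten plain levels cost on the order of ten resolver calls, while three nested 100-item lists already multiply every call beneath them by a million, which is why list depth dominates the worst case.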



Development

Successfully merging this pull request may close these issues.

[Security] No built-in query depth/complexity limit + alias bombing DoS
