
How short-circuiting works in Java Streams

Understand how limit() short-circuits and how a stream pipeline executes lazily and element-by-element.


🔄 Example Code

import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

List<String> names = Arrays.asList("Alice", "Bob", "Charlie", "David", "Eve");

List<String> result = names.stream()
        .filter(name -> {
            System.out.println("Filtering: " + name);
            return name.length() > 3;
        })
        .map(name -> {
            System.out.println("Mapping: " + name);
            return name.toUpperCase();
        })
        .limit(2)
        .collect(Collectors.toList());

System.out.println("Result: " + result);

🧠 What Will Happen?

Even though there are 5 names, only the minimum number of elements needed to satisfy limit(2) will be processed.


šŸ” Expected Output

Filtering: Alice
Mapping: Alice
Filtering: Bob
Filtering: Charlie
Mapping: Charlie
Result: [ALICE, CHARLIE]

✅ Step-by-Step Explanation

Let’s understand this by simulating how the stream pipeline works.

🔗 Stream pipeline:

Source → filter → map → limit → collect

Java Streams work by pulling data through this pipeline one element at a time.
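
A quick way to verify this laziness yourself: building the pipeline alone triggers no work, and elements only flow once a terminal operation asks for them. Here is a minimal sketch (the class name `LazyDemo` is just for illustration):

```java
import java.util.List;
import java.util.stream.Stream;

public class LazyDemo {
    public static void main(String[] args) {
        // Building the pipeline does no work: nothing is printed here.
        Stream<String> pipeline = List.of("Alice", "Bob").stream()
                .filter(name -> {
                    System.out.println("Filtering: " + name);
                    return name.length() > 3;
                });

        System.out.println("Pipeline built, nothing filtered yet");

        // Only the terminal operation pulls elements through the stages.
        System.out.println("Count: " + pipeline.count());
    }
}
```

Running this prints "Pipeline built, nothing filtered yet" *before* any "Filtering: …" line, confirming that `filter()` runs only when `count()` starts pulling.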


šŸ” Iteration 1:

  • Alice is passed into filter → passes (length > 3)
  • Alice is passed to map() → becomes ALICE
  • Collected in result list
    ✅ 1 of 2 done

šŸ” Iteration 2:

  • Bob is passed into filter → fails (length <= 3)
  • Not passed to map or limit
  • Skipped

šŸ” Iteration 3:

  • Charlie passes filter → becomes CHARLIE
  • Collected
    āœ… 2 of 2 done

šŸ” Iteration 4 and 5:

Skipped entirely
Why? Because limit(2) has already been fulfilled — it’s a short-circuiting terminal step.
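
This short-circuiting is exactly what makes infinite streams usable. A sketch (the variable name `firstThreeEvens` is just for illustration):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class InfiniteStreamDemo {
    public static void main(String[] args) {
        // Stream.iterate produces an unbounded sequence 1, 2, 3, ...
        // Without limit(), collect() would never terminate.
        List<Integer> firstThreeEvens = Stream.iterate(1, n -> n + 1)
                .filter(n -> n % 2 == 0) // keep even numbers only
                .limit(3)                // short-circuit after 3 matches
                .collect(Collectors.toList());

        System.out.println(firstThreeEvens); // prints [2, 4, 6]
    }
}
```

The pipeline pulls just enough elements (1 through 6) to satisfy limit(3), then stops, even though the source is infinite.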


💡 So How Does limit() Short-Circuit?

Internally:

  • limit(n) wraps the downstream Sink (here, the collector)
  • It counts how many elements it has passed downstream
  • Once the limit is reached, it signals cancellation, and the pipeline stops requesting elements from upstream

It’s like saying: “I’ve got 2 items, I’m done. Stop pulling more!”
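
To make the mechanism concrete, here is a hand-rolled sketch of the idea, not the JDK's actual internals. The `Sink` interface and `cancellationRequested()` method below are simplified stand-ins modeled loosely on the JDK's internal `java.util.stream.Sink`; `limitedCollect` is a hypothetical helper:

```java
import java.util.ArrayList;
import java.util.List;

public class LimitSinkSketch {
    // Simplified stand-in for the JDK's internal Sink abstraction.
    interface Sink<T> {
        void accept(T t);
        boolean cancellationRequested();
    }

    static <T> List<T> limitedCollect(Iterable<T> source, int limit) {
        List<T> out = new ArrayList<>();
        Sink<T> limitSink = new Sink<T>() {
            int count = 0;
            public void accept(T t) {
                out.add(t);   // pass downstream (here: collect)
                count++;
            }
            public boolean cancellationRequested() {
                return count >= limit; // "I'm done, stop pulling"
            }
        };
        // The pull loop: fetch the next element only while the
        // downstream sink has not requested cancellation.
        for (T t : source) {
            if (limitSink.cancellationRequested()) break;
            limitSink.accept(t);
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(limitedCollect(List.of(1, 2, 3, 4, 5), 2)); // [1, 2]
    }
}
```

The key point: cancellation flows *upward*. The loop that pulls from the source checks the sink before each pull, so no element beyond the limit is ever fetched.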


📌 Why This Matters

  • Efficiency: even though the source has 5 items, the filter ran only 3 times and the mapper only 2 times
  • You save CPU cycles and memory
  • It’s especially useful with large datasets, infinite streams, or expensive filters/mappings
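
You can measure these savings directly by counting invocations. This sketch reuses the article's example, with hypothetical `AtomicInteger` counters added inside the lambdas:

```java
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.Collectors;

public class EvaluationCountDemo {
    public static void main(String[] args) {
        List<String> names = Arrays.asList("Alice", "Bob", "Charlie", "David", "Eve");
        AtomicInteger filterCalls = new AtomicInteger();
        AtomicInteger mapCalls = new AtomicInteger();

        List<String> result = names.stream()
                .filter(n -> { filterCalls.incrementAndGet(); return n.length() > 3; })
                .map(n -> { mapCalls.incrementAndGet(); return n.toUpperCase(); })
                .limit(2)
                .collect(Collectors.toList());

        System.out.println(result);                          // [ALICE, CHARLIE]
        System.out.println("filter ran: " + filterCalls);    // 3
        System.out.println("map ran: " + mapCalls);          // 2
    }
}
```

With an expensive mapper (say, a network call per element), the difference between 2 invocations and 5 is real money and real latency.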
