
How Short-Circuiting Works in Java Streams

Understand how limit() short-circuits and how a stream pipeline executes lazily, element by element.


πŸ”„ Example Code

import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

List<String> names = Arrays.asList("Alice", "Bob", "Charlie", "David", "Eve");

List<String> result = names.stream()
        .filter(name -> {
            System.out.println("Filtering: " + name);
            return name.length() > 3;
        })
        .map(name -> {
            System.out.println("Mapping: " + name);
            return name.toUpperCase();
        })
        .limit(2)
        .collect(Collectors.toList());

System.out.println("Result: " + result);

🧠 What Will Happen?

Even though there are 5 names, only the minimum number of elements needed to satisfy limit(2) will be processed.


πŸ” Expected Output

Filtering: Alice
Mapping: Alice
Filtering: Bob
Filtering: Charlie
Mapping: Charlie
Result: [ALICE, CHARLIE]

βœ… Step-by-Step Explanation

Let’s understand this by simulating how the stream pipeline works.

πŸ”— Stream pipeline:

Source β†’ filter β†’ map β†’ limit β†’ collect

Java Streams work by pulling data through this pipeline one element at a time.
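You can observe this pull order yourself with peek(). This is a minimal sketch (class name PullOrderDemo is illustrative): each element travels through the whole pipeline before the next one is pulled, so the log interleaves "pulled" and "mapped" rather than printing all "pulled" lines first.

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class PullOrderDemo {
    public static void main(String[] args) {
        List<Integer> result = Arrays.asList(1, 2, 3).stream()
                .peek(n -> System.out.println("pulled: " + n)) // logs each pull from the source
                .map(n -> n * 10)
                .peek(n -> System.out.println("mapped: " + n)) // logs each mapped value
                .collect(Collectors.toList());
        System.out.println(result); // [10, 20, 30]
    }
}
```

The log reads pulled: 1, mapped: 10, pulled: 2, mapped: 20, pulled: 3, mapped: 30 β€” vertical, per-element execution, not one stage at a time.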


πŸ” Iteration 1:

  • Alice is passed into filter β†’ passes (length > 3)
  • Alice is passed to map() β†’ becomes ALICE
  • Collected in result list
    βœ… 1 of 2 done

πŸ” Iteration 2:

  • Bob is passed into filter β†’ fails (length <= 3)
  • Not passed to map or limit
  • Skipped

πŸ” Iteration 3:

  • Charlie passes filter β†’ becomes CHARLIE
  • Collected
    βœ… 2 of 2 done

πŸ” Iteration 4 and 5:

Skipped entirely
Why? Because limit(2) has already been fulfilled β€” it’s a short-circuiting terminal step.
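The same short-circuiting is what makes limit() safe on unbounded sources. A small sketch (class name InfiniteLimitDemo is illustrative): Stream.iterate produces an endless sequence, yet the pipeline terminates because limit(3) stops pulling once it has three matches.

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class InfiniteLimitDemo {
    public static void main(String[] args) {
        // Stream.iterate yields an unbounded source: 1, 2, 3, ...
        // Without short-circuiting, this pipeline would never finish.
        List<Integer> firstThreeEven = Stream.iterate(1, n -> n + 1)
                .filter(n -> n % 2 == 0)  // keep even numbers
                .limit(3)                 // stop after 3 matches
                .collect(Collectors.toList());
        System.out.println(firstThreeEven); // [2, 4, 6]
    }
}
```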


πŸ’‘ So How Does limit() Short-Circuit?

Internally:

  • limit(n) wraps the downstream Sink (the next stage in the pipeline)
  • The wrapper counts how many elements it has passed downstream
  • Once n elements have gone through, it signals cancellation upstream, so the source stops being pulled

It’s like: β€œI’ve got 2 items, I’m done. Stop pulling more!”
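To make the mechanism concrete, here is a toy stand-in for that counting wrapper. This is a sketch, not the real JDK API: the names LimitingSink and wantsMore() are invented for illustration (the JDK's internal equivalent is java.util.stream.Sink with cancellationRequested()).

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Toy model of limit(n)'s internal wrapper: forwards elements downstream
// until n have passed, then tells the upstream loop to stop pulling.
class LimitingSink<T> implements Consumer<T> {
    private final long n;
    private final Consumer<T> downstream;
    private long seen = 0;

    LimitingSink(long n, Consumer<T> downstream) {
        this.n = n;
        this.downstream = downstream;
    }

    @Override
    public void accept(T element) {
        if (seen < n) {
            seen++;
            downstream.accept(element);
        }
    }

    // Upstream checks this before pulling the next element
    // (mirrors the JDK's Sink.cancellationRequested()).
    boolean wantsMore() {
        return seen < n;
    }
}

public class SinkDemo {
    public static void main(String[] args) {
        List<String> out = new ArrayList<>();
        LimitingSink<String> sink = new LimitingSink<>(2, out::add);
        for (String s : List.of("ALICE", "CHARLIE", "DAVID", "EVE")) {
            if (!sink.wantsMore()) break; // short-circuit: stop pulling
            sink.accept(s);
        }
        System.out.println(out); // [ALICE, CHARLIE]
    }
}
```

DAVID and EVE are never touched β€” exactly what happens to iterations 4 and 5 in the walkthrough above.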


πŸ“Œ Why This Matters

  • Efficiency: Even though the source has 5 items, only 3 elements were filtered and only 2 were mapped.
  • You save CPU cycles and memory
  • It’s especially useful with large datasets, infinite streams, or expensive filters/mappings
