Providing a limiting condition for Stream generation

2022-01-22 · lambda, java-8, java, java-stream

I am writing code to calculate Fibonacci numbers. With this code I can generate the first n numbers of the Fibonacci sequence.

Stream.generate(new Supplier<Long>() {
    private long n1 = 1;
    private long n2 = 2;

    @Override
    public Long get() {
        long fibonacci = n1;
        long n3 = n2 + n1;
        n1 = n2;
        n2 = n3;
        return fibonacci;
    }
}).limit(50).forEach(System.out::println);

The limit method returns a Stream holding at most the number of elements passed to it. I want to stop generating the Stream once the Fibonacci number reaches some value.

I mean, if I want to list all Fibonacci numbers less than 1000, then I cannot use limit, because I don't know in advance how many such Fibonacci numbers there are.

Is there any way to do this using lambda expressions?

Answer

The best solution using the Stream's built-in features that I could find is:

LongStream.generate(new LongSupplier() {
    private long n1 = 1, n2 = 2;

    @Override
    public long getAsLong() {
        long fibonacci = n1;
        long n3 = n2 + n1;
        n1 = n2;
        n2 = n3;
        return fibonacci;
    }
}).peek(System.out::println).filter(x -> x > 1000).findFirst();

It has the disadvantage of also processing (and printing) the first item that is >= 1000, though. This can be prevented by making the print statement conditional, e.g.

.peek(x -> { if (x <= 1000) System.out.println(x); }).filter(x -> x > 1000).findFirst();

But I don't like evaluating the same condition (greater than one thousand or not) twice. Still, one of these two solutions may be practical enough for real-life tasks where a limit based on the resulting value is needed.
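For completeness, the conditional-peek variant above can be assembled into a self-contained, runnable sketch (the class name is just for illustration; the pipeline logic is the same as in the answer):

```java
import java.util.function.LongSupplier;
import java.util.stream.LongStream;

public class FibonacciConditionalPeek {
    public static void main(String[] args) {
        LongStream.generate(new LongSupplier() {
            private long n1 = 1, n2 = 2;

            @Override
            public long getAsLong() {
                long fibonacci = n1;
                long n3 = n2 + n1;
                n1 = n2;
                n2 = n3;
                return fibonacci;
            }
        })
        // print only values within the limit, so the terminating element
        // (the first one greater than 1000) is consumed silently
        .peek(x -> { if (x <= 1000) System.out.println(x); })
        // findFirst() on this filtered stream short-circuits the infinite
        // generation as soon as a value exceeds 1000
        .filter(x -> x > 1000)
        .findFirst();
    }
}
```

This prints the 15 Fibonacci numbers from 1 through 987 and then stops.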

I think it's clear that the entire construct is not parallel-capable…
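As a side note beyond the original answer: on Java 9 or later, takeWhile expresses this kind of value-based limit directly, without the peek/filter workaround. A minimal sketch, assuming Java 9+ (the class name is illustrative):

```java
import java.util.stream.Stream;

public class FibonacciTakeWhile {
    public static void main(String[] args) {
        // Stream.iterate carries the pair of previous values in a two-element
        // array; takeWhile (Java 9+) cuts the stream off at the first value
        // that fails the condition, so the 1597 element is never printed.
        Stream.iterate(new long[]{1, 2}, f -> new long[]{f[1], f[0] + f[1]})
              .mapToLong(f -> f[0])
              .takeWhile(x -> x < 1000)
              .forEach(System.out::println);
    }
}
```

This evaluates the condition only once per element and short-circuits the infinite stream cleanly.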
