Filter and map in the same iteration


Problem Description



I have this simple situation where I want to filter and map to the same value, like so:

 const files = results.filter(function(r){
      return r.file;
    })
    .map(function(r){
       return r.file;
    });

To save lines of code, as well as increase performance, I am looking for:

const files = results.filterAndMap(function(r){
  return r.file;
});

does this exist, or should I write something myself? I have wanted such functionality in a few places, just never bothered to look into it before.

Solution

Transducers

In its most generic form, the answer to your question lies in transducers. But before we go too abstract, let's see some basics first – below, we implement a couple transducers mapReduce, filterReduce, and tapReduce; you can add any others that you need.

const mapReduce = map => reduce =>
  (acc, x) => reduce (acc, map (x))
  
const filterReduce = filter => reduce =>
  (acc, x) => filter (x) ? reduce (acc, x) : acc
  
const tapReduce = tap => reduce =>
  (acc, x) => (tap (x), reduce (acc, x))

const tcomp = (f,g) =>
  k => f (g (k))

const concat = (xs,ys) =>
  xs.concat(ys)
  
const transduce = (...ts) => xs =>
  xs.reduce (ts.reduce (tcomp, k => k) (concat), [])

const main =
  transduce (
    tapReduce (x => console.log('with:', x)),
    filterReduce (x => x.file),
    tapReduce (x => console.log('has file:', x.file)),
    mapReduce (x => x.file),
    tapReduce (x => console.log('final:', x)))
      
const data =
  [{file: 1}, {file: undefined}, {}, {file: 2}]
  
console.log (main (data))
// with: { file: 1 }
// has file: 1
// final: 1
// with: { file: undefined }
// with: {}
// with: { file: 2 }
// has file: 2
// final: 2
// => [ 1, 2 ]
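To make the composition concrete, here is what `ts.reduce (tcomp, k => k) (concat)` boils down to for a two-transducer pipeline, wired by hand (a small sketch reusing the definitions above):

```javascript
// Hand-wiring filterReduce and mapReduce without the transduce helper,
// to show what the tcomp fold expands to for two transducers.
const mapReduce = map => reduce =>
  (acc, x) => reduce (acc, map (x))

const filterReduce = filter => reduce =>
  (acc, x) => filter (x) ? reduce (acc, x) : acc

const concat = (xs, ys) =>
  xs.concat (ys)

// filterReduce wraps mapReduce(...)(concat): the filter runs first,
// then the map, then concat accumulates the result – all in one pass
const step =
  filterReduce (r => r.file) (mapReduce (r => r.file) (concat))

const results =
  [{file: 'a.txt'}, {}, {file: 'b.txt'}]

console.log (results.reduce (step, []))
// => [ 'a.txt', 'b.txt' ]
```

A single `.reduce` over the data performs both the filter and the map, which is exactly what `transduce` automates for any number of transducers.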

Chainable API

Maybe you're satisfied with the simplicity of the code, but you're unhappy with the somewhat unconventional API. If you want to preserve the ability to chain .map, .filter, .whatever calls without adding undue iterations, we can make a generic interface for transducing and build our chainable API on top of it. This answer is adapted from the link I shared above and from other answers I have written about transducers.

// Trans Monoid
const Trans = f => ({
  runTrans: f,
  concat: ({runTrans: g}) =>
    Trans (k => f (g (k)))
})

Trans.empty = () =>
  Trans(k => k)

// transducer "primitives"
const mapper = f =>
  Trans (k => (acc, x) => k (acc, f (x)))
  
const filterer = f =>
  Trans (k => (acc, x) => f (x) ? k (acc, x) : acc)
  
const tapper = f =>
  Trans (k => (acc, x) => (f (x), k (acc, x)))
  
// chainable API
const Transduce = (t = Trans.empty()) => ({
  map: f =>
    Transduce (t.concat (mapper (f))),
  filter: f =>
    Transduce (t.concat (filterer (f))),
  tap: f =>
    Transduce (t.concat (tapper (f))),
  run: xs =>
    xs.reduce (t.runTrans ((xs,ys) => xs.concat(ys)), [])
})

// demo
const main = data =>
  Transduce()
    .tap (x => console.log('with:', x))
    .filter (x => x.file)
    .tap (x => console.log('has file:', x.file))
    .map (x => x.file)
    .tap (x => console.log('final:', x))
    .run (data)
    
const data =
  [{file: 1}, {file: undefined}, {}, {file: 2}]

console.log (main (data))
// with: { file: 1 }
// has file: 1
// final: 1
// with: { file: undefined }
// with: {}
// with: { file: 2 }
// has file: 2
// final: 2
// => [ 1, 2 ]

Chainable API, take 2

As an exercise to implement the chaining API with as little dependency ceremony as possible, I rewrote the code snippet without relying upon the Trans monoid implementation or the primitive transducers mapper, filterer, etc – thanks for the comment @ftor.

This is a definite downgrade in terms of overall readability. We lose the ability to just look at it and understand what's happening. We also lose the monoid interface, which made it easy to reason about our transducers in other expressions. A big gain here, though, is that the definition of Transduce is contained in 10 lines of source code, compared to 28 before – so while the expressions are more complex, you can probably finish reading the entire definition before your brain starts struggling.

// chainable API only (no external dependencies)
const Transduce = (t = k => k) => ({
  map: f =>
    Transduce (k => t ((acc, x) => k (acc, f (x)))),
  filter: f =>
    Transduce (k => t ((acc, x) => f (x) ? k (acc, x) : acc)),
  tap: f =>
    Transduce (k => t ((acc, x) => (f (x), k (acc, x)))),
  run: xs =>
    xs.reduce (t ((xs,ys) => xs.concat(ys)), [])
})

// demo (this stays the same)
const main = data =>
  Transduce()
    .tap (x => console.log('with:', x))
    .filter (x => x.file)
    .tap (x => console.log('has file:', x.file))
    .map (x => x.file)
    .tap (x => console.log('final:', x))
    .run (data)
    
const data =
  [{file: 1}, {file: undefined}, {}, {file: 2}]

console.log (main (data))
// with: { file: 1 }
// has file: 1
// final: 1
// with: { file: undefined }
// with: {}
// with: { file: 2 }
// has file: 2
// final: 2
// => [ 1, 2 ]

A word about performance

When it comes to speed, no functional variant of this is ever going to beat a static for loop that combines all of your program statements in a single loop body. However, the transducers above do have the potential to be faster than a series of .map/.filter/.whatever calls, where multiple iterations through a large data set would be expensive.
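For reference, the hand-fused loop that paragraph alludes to might look like this (a sketch, not code from the answer above):

```javascript
// Single-pass for loop fusing the filter step and the map step
// into one loop body – the performance baseline mentioned above
const filterMapLoop = results => {
  const files = []
  for (let i = 0; i < results.length; i++) {
    const r = results[i]
    if (r.file)           // filter step
      files.push (r.file) // map step
  }
  return files
}

console.log (filterMapLoop ([{file: 1}, {file: undefined}, {}, {file: 2}]))
// => [ 1, 2 ]
```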

Coding style & implementation

The very essence of the transducer lies in mapReduce, which is why I chose to introduce it first. If you can understand how to take multiple mapReduce calls and sequence them together, you'll understand transducers.
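The sequencing idea can be seen in isolation: compose two `mapReduce` calls by hand and notice that the outer transformation runs first (a minimal sketch):

```javascript
// Two mapReduce calls sequenced by hand: the outer transducer
// transforms each element first, then hands the result to the inner one
const mapReduce = map => reduce =>
  (acc, x) => reduce (acc, map (x))

const concat = (xs, ys) =>
  xs.concat (ys)

// add one first, then double
const step =
  mapReduce (x => x + 1) (mapReduce (x => x * 2) (concat))

console.log ([1, 2, 3].reduce (step, []))
// => [ 4, 6, 8 ]
```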

Of course you can implement transducers in any number of ways, but I found Brian's approach the most useful as it encodes transducers as a monoid – having a monoid allows us to make all sorts of convenient assumptions about them. And once we can transduce an Array (one type of monoid), you might wonder how to transduce any other monoid... in such a case, get reading that article!
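As a taste of that generalization, here is a sketch of running a transducer into a monoid other than Array – note that `runWith` is a hypothetical helper, not part of the snippets above:

```javascript
// Trans monoid and mapper, as defined earlier in the answer
const Trans = f => ({
  runTrans: f,
  concat: ({runTrans: g}) =>
    Trans (k => f (g (k)))
})

const mapper = f =>
  Trans (k => (acc, x) => k (acc, f (x)))

// Hypothetical runWith: takes the target monoid's concat and empty
// value instead of hardcoding Array concat and []
const runWith = (t, concat, empty) => xs =>
  xs.reduce (t.runTrans (concat), empty)

// The same transducer, run into two different monoids
const double = mapper (x => x * 2)

console.log (runWith (double, (xs, y) => xs.concat ([y]), []) ([1, 2, 3]))
// => [ 2, 4, 6 ]

console.log (runWith (double, (s, y) => s + y, '') ([1, 2, 3]))
// => '246'
```

Only the accumulation step changes; the transducer itself never mentions which monoid it targets.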
