Error when removing duplicates from Javascript Array


Question


UPDATE

Here is a fiddle of the problem: https://jsfiddle.net/q9c5fku3/. When I run that code and look at the console, I see it console.logging a different number in the array.

Thanks for your replies. Sorry I'm getting downvotes, but this is really confusing me.

I've tried it again using different numbers. I'm wondering, can you guys test these numbers on your end and see if you get a different result?

    var myArray = [621608617992776, 621608617992776, 10156938936550295, 621608617992776, 10156938936550295, 10156938936550295, 621608617992776, 10156938936550295];
    console.log(myArray);

    var myArrayTrimmed = [];

    for(var i in myArray){
        if(myArrayTrimmed.indexOf(myArray[i]) === -1){
            myArrayTrimmed.push(myArray[i]);
        }
    }
    console.log(myArrayTrimmed);

This is giving me the following array in the console:

[621608617992776, 10156938936550296]

For some reason the second number has increased by 1.
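
For reference, this does not appear to be caused by the dedup loop at all, but by number precision: 10156938936550295 is above Number.MAX_SAFE_INTEGER, so the literal itself is rounded to the nearest value a JavaScript double can hold before any of the code runs. A minimal check, assuming any standard JS console:

    console.log(Number.MAX_SAFE_INTEGER);                 // 9007199254740991 (2^53 - 1)
    console.log(Number.isSafeInteger(10156938936550295)); // false
    console.log(10156938936550295);                       // 10156938936550296, matching the output above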

====================

Original Question:

I have this array:

var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200];

I'm trying to create a new array named myArrayTrimmed that will be the same as the above array, except it will have duplicates removed. This should result in:

var myArrayTrimmed = [100, 200];

This is the code I'm using to try to achieve this:

var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200];
var myArrayTrimmed = [];

for(var i in myArray){
    if(myArrayTrimmed.indexOf(myArray[i]) === -1){
        myArrayTrimmed.push(myArray[i]);
    }
}
console.log(myArrayTrimmed);

This isn't working correctly. While it is removing duplicates, for some reason it's subtracting the number 1 from 200, so the output in the console is:

[100, 199]

I think this must be due to the -1 in the code, but I don't know how else to remove the duplicates.
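
A note on that -1: indexOf returns -1 when the value is not found in myArrayTrimmed, so the === -1 comparison is only a "not seen yet" test; it never subtracts anything from the values themselves. A minimal illustration:

    var myArrayTrimmed = [100];
    console.log(myArrayTrimmed.indexOf(100)); //  0 -> already present, so it is skipped
    console.log(myArrayTrimmed.indexOf(200)); // -1 -> not found, so it gets pushed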

Solution

I believe this is the best way to do it:

var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200],
    reduced = Object.keys(myArray.reduce((p,c) => (p[c] = true,p),{}));
console.log(reduced);

OK.. even though this one is O(n) and the others are O(n^2), I was curious to see a benchmark comparison between this reduce / lookup table approach and the filter/indexOf combo (I chose Jeetendra's very nice implementation http://stackoverflow.com/a/37441144/4543207). I prepared a 100K-item array filled with random positive integers in the range 0-9999 and removed the duplicates. I repeated the test 10 times, and the average of the results shows that the two are no match in performance:

  • In Firefox v47, reduce & LUT: 14.85ms vs filter & indexOf: 2836ms
  • In Chrome v51, reduce & LUT: 23.90ms vs filter & indexOf: 1066ms
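
For reference, the filter/indexOf combination on the other side of this benchmark is essentially the following pattern (a sketch of the general approach; the linked answer may differ in its details): an element is kept only if its first occurrence index equals its current index.

var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200],
    uniques = myArray.filter((value, index) => myArray.indexOf(value) === index);
console.log(uniques); // [100, 200]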

Well, OK, so far so good. But let's do it properly this time, in ES6 style. It looks so cool..! But as of now, how it will perform against the powerful LUT solution is a mystery to me. Let's first see the code and then benchmark it.

var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200],
    reduced = [...myArray.reduce((p,c) => p.set(c,true),new Map()).keys()];
console.log(reduced);

Wow, that was short..! But how about the performance..? It's beautiful... Since the heavy weight of filter/indexOf has been lifted off our shoulders, I can now test an array of 1M random items of positive integers in the range 0..99999 and get an average from 10 consecutive tests. I can say this time it's a real match. See the result for yourself :)

var ranar = [],
     // reduce & LUT: collect values as object keys, then read the keys back
     red1 = a => Object.keys(a.reduce((p,c) => (p[c] = true,p),{})),
     // Map & spread: collect values as Map keys, then spread them back into an array
     red2 = a => [...a.reduce((p,c) => p.set(c,true),new Map()).keys()],
     avg1 = [],
     avg2 = [],
       ts = 0,
       te = 0,
     res1 = [],
     res2 = [],
     count= 10;
for (var i = 0; i<count; i++){
  ranar = (new Array(1000000).fill(true)).map(e => Math.floor(Math.random()*100000));
  ts = performance.now();
  res1 = red1(ranar);
  te = performance.now();
  avg1.push(te-ts);
  ts = performance.now();
  res2 = red2(ranar);
  te = performance.now();
  avg2.push(te-ts);
}

avg1 = avg1.reduce((p,c) => p+c)/count;
avg2 = avg2.reduce((p,c) => p+c)/count;

console.log("reduce & lut took: " + avg1 + "msec");
console.log("map & spread took: " + avg2 + "msec");

Which one would you use..? Well, not so fast...! Don't be deceived: Map has been playing away from home so far. Now look... in all of the above cases we fill an array of size n with numbers from a range < n. I mean, we have an array of size 100 and we fill it with random numbers 0..9, so there are definite duplicates and "almost" definitely each number has a duplicate. How about if we fill an array of size 100 with random numbers 0..9999? Let's now see Map playing at home. This time it's an array of 100K items, but the random number range is 0..100M. We will do 100 consecutive tests and average the results. OK, let's see the bets..! <- no typo

var ranar = [],
     // reduce & LUT: collect values as object keys, then read the keys back
     red1 = a => Object.keys(a.reduce((p,c) => (p[c] = true,p),{})),
     // Map & spread: collect values as Map keys, then spread them back into an array
     red2 = a => [...a.reduce((p,c) => p.set(c,true),new Map()).keys()],
     avg1 = [],
     avg2 = [],
       ts = 0,
       te = 0,
     res1 = [],
     res2 = [],
     count= 100;
for (var i = 0; i<count; i++){
  ranar = (new Array(100000).fill(true)).map(e => Math.floor(Math.random()*100000000));
  ts = performance.now();
  res1 = red1(ranar);
  te = performance.now();
  avg1.push(te-ts);
  ts = performance.now();
  res2 = red2(ranar);
  te = performance.now();
  avg2.push(te-ts);
}

avg1 = avg1.reduce((p,c) => p+c)/count;
avg2 = avg2.reduce((p,c) => p+c)/count;

console.log("reduce & lut took: " + avg1 + "msec");
console.log("map & spread took: " + avg2 + "msec");

Now this is the spectacular comeback of Map()..! Maybe now you can make a better decision when you want to remove the dupes.
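
One caveat that is easy to miss when choosing between them: the reduce & LUT version goes through Object.keys, so it hands the values back as strings, whereas the Map (and Set) based versions keep the original number type. A quick check:

var myArray = [100, 200, 100, 200];
// object keys are always strings, so the LUT result is ["100", "200"]
console.log(Object.keys(myArray.reduce((p,c) => (p[c] = true,p),{})));
// Map keys keep their type, so the spread result is [100, 200]
console.log([...myArray.reduce((p,c) => p.set(c,true),new Map()).keys()]);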

Well, OK, we are all happy now. But the lead role always comes last, with some applause. I am sure some of you wonder what the Set object would do. Now that we are open to ES6 and we know Map is the winner of the previous games, let us compare Map with Set as a final. A typical Real Madrid vs Barcelona game this time... or is it? Let's see who will win El Clásico :)

var ranar = [],
     // Map & spread: collect values as Map keys, then spread them back into an array
     red1 = a => [...a.reduce((p,c) => p.set(c,true),new Map()).keys()],
     // Set: the constructor drops duplicates, Array.from turns it back into an array
     red2 = a => Array.from(new Set(a)),
     avg1 = [],
     avg2 = [],
       ts = 0,
       te = 0,
     res1 = [],
     res2 = [],
     count= 100;
for (var i = 0; i<count; i++){
  ranar = (new Array(100000).fill(true)).map(e => Math.floor(Math.random()*10000000));
  ts = performance.now();
  res1 = red1(ranar);
  te = performance.now();
  avg1.push(te-ts);
  ts = performance.now();
  res2 = red2(ranar);
  te = performance.now();
  avg2.push(te-ts);
}

avg1 = avg1.reduce((p,c) => p+c)/count;
avg2 = avg2.reduce((p,c) => p+c)/count;

console.log("map & spread took: " + avg1 + "msec");
console.log("set & A.from took: " + avg2 + "msec");

Wow.. man..! Well, unexpectedly it didn't turn out to be an El Clásico at all. More like FC Barcelona against CA Osasuna :))
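
As a closing note, if you just need the duplicates gone and the input is not huge, the winning Set approach collapses into a one-liner equivalent to the Array.from(new Set(a)) used above:

var myArray = [100, 200, 100, 200, 100, 100, 200, 200, 200, 200],
    myArrayTrimmed = [...new Set(myArray)];
console.log(myArrayTrimmed); // [100, 200]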
