Remove duplicated lists in list of lists in Python
Question
I've seen some closely related questions here, but their answers don't work for me. I have a list of lists where some sublists are repeated, but their elements may be in a different order. For example:
g = [[1, 2, 3], [3, 2, 1], [1, 3, 2], [9, 0, 1], [4, 3, 2]]
Given my question, the output should naturally be:
g = [[1, 2, 3], [9, 0, 1], [4, 3, 2]]
I've tried with set, but it only removes lists that are equal (I thought it should work because sets are, by definition, unordered). Other questions I've visited only have examples with lists that are exactly duplicated or repeated, like this one: Python : How to remove duplicate lists in a list of list?. For now, the order of the output (for the list and the sublists) is not a problem.
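To illustrate why the set-based attempts fail, here is a minimal sketch (my own, not from the question): converting sublists to tuples preserves element order, so [1, 2, 3] and [3, 2, 1] hash to different values, while converting them to frozensets ignores order:

```python
g = [[1, 2, 3], [3, 2, 1], [1, 3, 2], [9, 0, 1], [4, 3, 2]]

# Tuples keep element order, so reordered sublists survive deduplication.
tuple_dedup = {tuple(x) for x in g}

# frozensets ignore order, so reordered sublists collapse into one entry.
fset_dedup = {frozenset(x) for x in g}

print(len(tuple_dedup))  # 5 -- nothing was merged
print(len(fset_dedup))   # 3 -- the three permutations of [1, 2, 3] merged
```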
Answer
(Ab)using a side-effecting list comprehension:
seen = set()
[x for x in g if frozenset(x) not in seen and not seen.add(frozenset(x))]
Out[4]: [[1, 2, 3], [9, 0, 1], [4, 3, 2]]
For those (unlike myself) who don't like using side effects in this manner:
res = []
seen = set()
for x in g:
    x_set = frozenset(x)
    if x_set not in seen:
        res.append(x)
        seen.add(x_set)
The reason you add frozensets to the set is that you can only add hashable objects to a set, and vanilla sets are not hashable.
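The loop above can be wrapped in a small reusable helper (dedupe_unordered is a name of my choosing, not from the original answer). One caveat worth knowing: frozenset discards duplicate elements within a sublist, so e.g. [1, 1, 2] and [1, 2] would be treated as the same list.

```python
def dedupe_unordered(lists):
    """Keep the first occurrence of each sublist, ignoring element order."""
    seen = set()
    result = []
    for sub in lists:
        key = frozenset(sub)  # order-insensitive, hashable key
        if key not in seen:
            seen.add(key)
            result.append(sub)
    return result

g = [[1, 2, 3], [3, 2, 1], [1, 3, 2], [9, 0, 1], [4, 3, 2]]
print(dedupe_unordered(g))  # [[1, 2, 3], [9, 0, 1], [4, 3, 2]]
```

If element multiplicity matters for your data, tuple(sorted(sub)) could be used as the key instead; it is order-insensitive but keeps repeated elements distinct.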