Efficiently loop through all possible graphs of weighted nodes and calculate the probability that the maximal clique size is >k


Problem description

This question is a follow-up to this other one: Maximal full-mesh in a graph - python code is very slow.

That one was about finding the maximal clique size in a weighted sub-graph. My ultimate objective is to find the probability that a graph whose connections are each active with probability q, and whose nodes all carry weights, has a maximal weighted clique size greater than k. For this, I loop over all possible graphs on the n nodes and add to the probability whenever the maximal weighted clique size is >= k. However, @Dillon Davis mentioned in the other question that there is a way to make this more efficient. So I am posting this question to see if anyone can help make the looping over graphs more efficient by re-using graphs computed earlier. My code doing the naive loop is posted below for reference.

import numpy as np

# to_binary and the FullMesh class are defined in the linked question
# ("Maximal full-mesh in a graph - python code is very slow").
def networking_resiliency(k=4, q=0.5, wts=np.ones(4), graph_type="full_mesh"):
    # Enumerate every edge of the complete graph on the n weighted nodes.
    edges = []
    n = len(wts)
    for i in range(n):
        for j in range(i+1, n):
            edges.append((i, j))
    edges = np.array(edges)
    ans = 0.0
    # Loop over every subset of edges; each bit of e_idx marks an edge as up or broken.
    for e_idx in range(2**len(edges)):
        arr = to_binary(e_idx, len(edges))
        broken_edges = edges[arr == 0]
        if graph_type == "full_mesh":  # the original code compared the builtin `type`; a parameter is assumed here
            fm = FullMesh(broken_edges, wts)
            # Add this subgraph's probability if its maximal weighted clique is large enough.
            if fm.find_max()[0] >= k:
                up_edges = sum(arr)
                ans += q**up_edges * (1-q)**(len(edges)-up_edges)
    return ans
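
The helper to_binary is defined in the linked question and not shown here. Purely for completeness, below is a minimal sketch of what it is assumed to do (the exact bit ordering does not matter, since every edge subset is enumerated anyway):

import numpy as np

# Hypothetical sketch of the to_binary helper from the linked question:
# return the lowest `width` bits of `idx` as a numpy array of 0s and 1s,
# one entry per edge (1 = edge up, 0 = edge broken).
def to_binary(idx, width):
    return np.array([(idx >> i) & 1 for i in range(width)])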

Answer

It is possible to compute the optimal full-mesh of some graphs from other already-computed graphs. For a given full-mesh, if we note which nodes were cut from the original graph to produce the current subgraph, then any other edge connecting to one of these cut nodes can be freely cut without consequence. Using this information, we can take the Cartesian product of the cut nodes and all other nodes, remove any equivalent duplicates, and obtain the set of all edges that can be cut freely. We then take the powerset of these edges and concatenate each element with the current list of broken edges, which determines the other subgraphs that will inevitably result in the same optimal full-mesh. The modified FullMesh class that implements this behavior is shown below:

from itertools import chain, combinations, product

class FullMesh:
    def __init__(self, weights, pairs=[]):
        self.weights = weights
        self.elements = set(range(len(weights)))
        self.set_pairs(pairs)

    def set_pairs(self, pairs):
        # pairs are the broken edges; skips[e] holds the indices of broken edges touching node e.
        self.pairs = pairs
        self.skips = {e:set() for e in self.elements}
        for i, (a, b) in enumerate(pairs):
            self.skips[a].add(i)
            self.skips[b].add(i)

    def powerset(self, elements):
        return chain.from_iterable(combinations(elements, r) for r in range(len(elements)+1))

    def find_all(self):
        # Enumerate every subset of edges of the complete graph, reusing results where possible.
        to_search = self.powerset(list(combinations(self.elements, 2)))
        pairs_searched = dict()
        for pairs in to_search:
            if pairs in pairs_searched: continue
            self.set_pairs(pairs)
            val, nums = self.find_max()
            # Every edge incident to a node excluded from the optimal clique can also be broken
            # without changing the optimum, so register all such extensions with the same result.
            new_pairs = set(product(set(self.elements) - set(nums), set(self.elements))) - set(pairs)
            new_pairs = self.powerset({(x, y) for x, y in new_pairs if x < y})
            pairs_searched.update({tuple(sorted(pairs + np)):(val,nums) for np in new_pairs})
        return pairs_searched

    def find_max(self):
        # Returns (weight of the maximal weighted clique, its nodes) for the current broken edges.
        max_score = sum(self.weights)
        val, nums = self.exclude(0, max_score + 1, set(range(len(self.pairs))))
        return max_score - val, sorted(self.elements - set(nums))

    def exclude(self, curr_score, min_score, search):
        # Branch and bound: find the minimum total weight of nodes to exclude so that
        # no broken edge has both endpoints remaining in the clique.
        if not search or min_score <= curr_score:
            return curr_score, []

        min_nums = []
        for e in self.pairs[next(iter(search))]:
            score, nums = self.exclude(curr_score + self.weights[e], min_score, search - self.skips[e])
            if score < min_score:
                min_score, min_nums = score, nums + [e]
        return min_score, min_nums

This code took ~50 seconds to produce all optimal full-meshes for all subgraphs of a randomly weighted, 7-node fully-connected graph.
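
As a rough sketch of how the table produced by find_all might be driven and folded back into the resiliency probability (the variable names and parameter values here are illustrative, not taken from the original posts):

import numpy as np

# Illustrative usage: build the table of optimal full-meshes once, then reuse it
# to accumulate the probability that the maximal weighted clique size is >= k.
n, k, q = 5, 3, 0.5
wts = np.ones(n)               # unit weights: clique weight equals clique size

fm = FullMesh(wts)
results = fm.find_all()        # sorted tuple of broken edges -> (max clique weight, clique nodes)

total_edges = n * (n - 1) // 2
ans = 0.0
for broken, (val, _) in results.items():
    if val >= k:
        up = total_edges - len(broken)
        ans += q**up * (1 - q)**len(broken)
print(ans)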
