Is the poor performance of Excel VBA auto-instancing a myth?


Question


    The accepted wisdom is that using a construct like Dim dict As New Dictionary is poorer in performance than Dim dict As Dictionary / Set dict = New Dictionary.

    The explanation is that the former example - auto-instantiation - defers instantiation until the first usage of the variable dict. And thus, every time dict is referenced, the compiled code must first check whether dict is equal to Nothing.
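Concretely, the two declaration styles under discussion look like this (Dictionary assumes a project reference to Microsoft Scripting Runtime):

```vba
' Auto-instantiation: no explicit Set; the object is created
' implicitly the first time dict is referenced.
Dim dict As New Dictionary

' Explicit instantiation: dict2 is Nothing until the Set line runs.
Dim dict2 As Dictionary
Set dict2 = New Dictionary
```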

    But it occurs to me that compiled code does this anyway. You will get an error any time you try to make use of an object reference that is Nothing.

    So, in tribute to science, I ran some tests. And the results suggest there is no performance difference between the two approaches. (Run on Excel 2007)

    Call "create dictionary & add 2 items" 100,000 times.

    • Explicit: 16,891ms / Auto: 16,797ms (Auto 94ms faster)
    • Explicit: 16,797ms / Auto: 16,781ms (Auto 16ms faster)

    Reverse the order of test calls:

    • Auto: 16,766ms / Explicit: 16,812ms (Auto 46ms faster)
    • Auto: 16,828ms / Explicit: 16,813ms (Explicit 15ms faster)

    Call "create dictionary & add 6 items" 100,000 times.

    • Auto: 17,437ms / Explicit: 17,407ms (Explicit 30ms faster)
    • Auto: 17,343ms / Explicit: 17,360ms (Auto 17ms faster)

    Create dictionary and add 100,000 items.

    • Auto: 391ms / Explicit: 391ms (Same)

    Create dictionary and add 1,000,000 items.

    • Auto: 57,609ms / Explicit: 58,172ms (Auto 563ms faster)
    • Explicit: 57,343ms / Auto: 57,422ms (Explicit 79ms faster)
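For reference, a harness along these lines (my sketch, not necessarily the exact code behind the timings above; the millisecond figures come from VBA's Timer function, which has limited resolution) reproduces the "add 2 items, 100,000 times" test:

```vba
' Sketch of a timing harness; assumes a reference to
' Microsoft Scripting Runtime for the Dictionary class.
Sub TimeAutoVsExplicit()
    Dim t As Single, i As Long

    t = Timer
    For i = 1 To 100000
        AddTwoItemsAuto
    Next i
    Debug.Print "Auto: "; Format$((Timer - t) * 1000, "0"); "ms"

    t = Timer
    For i = 1 To 100000
        AddTwoItemsExplicit
    Next i
    Debug.Print "Explicit: "; Format$((Timer - t) * 1000, "0"); "ms"
End Sub

Sub AddTwoItemsAuto()
    Dim d As New Dictionary   ' instantiated on first reference
    d.Add "a", 1
    d.Add "b", 2
End Sub

Sub AddTwoItemsExplicit()
    Dim d As Dictionary
    Set d = New Dictionary    ' instantiated explicitly
    d.Add "a", 1
    d.Add "b", 2
End Sub
```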

    I see nothing to indicate that auto-instantiation performs poorly compared to explicit instantiation. (To be clear, I would avoid auto-instantiation for other reasons, but I'm only interested in the performance angle here.)

    So is this a myth?

    UPDATE

    Let me lay out why the performance argument doesn't make sense to me. It is said that

    x.Add "Key", "Item"
    

    in an auto-instantiated object is equivalent to the following:

    If x Is Nothing Then
        Set x = New Dictionary
    End If
    x.Add "Key", "Item"
    

    which makes it look like "frightening overhead" if you're calling this thousands of times. But in the explicit instantiation case, it's exactly the form of logic generated in the compiled version of the code:

    If x Is Nothing Then
        Err.Raise 91  ' "Object variable or With block variable not set"
    End If
    x.Add "Key", "Item"
    

    It doesn't necessarily follow that auto is longer, which is why I'm asking whether there was any truth to this. I wonder if I've identified another one of the many untested performance myths.

    Solution

    I know there is a holy war over whether or not it's okay to Dim As New, but I've never heard it said to cause poor performance. The short answer is: not really. Yes, it does litter your code with unnecessary checks to see whether the object is Nothing, but you wouldn't notice the speed difference on today's machines. It's like saying that looping over 10,000 characters is faster than looping over 10,001. To start seeing any difference, you need to run your test loops at much higher counts, like millions or tens of millions.

    That being said, Dim As New is frowned upon, but not for performance reasons.

    • You lose the ability to control when it's initialized
    • You lose the ability to check if an object is Nothing
    • Speed difference or not, it does litter your code with unnecessary checking
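The second bullet is the classic gotcha. A minimal sketch (again assuming a reference to Microsoft Scripting Runtime): with As New, merely referencing the variable re-instantiates it, so an Is Nothing test can never succeed.

```vba
Sub NothingCheckGotcha()
    Dim d As New Dictionary
    d.Add "a", 1
    Set d = Nothing              ' try to release the object...

    ' ...but referencing d in the test below re-creates it on the spot,
    ' so "d Is Nothing" is never True for an As New variable.
    If d Is Nothing Then
        Debug.Print "never reached"
    Else
        Debug.Print d.Count      ' a fresh, empty Dictionary: Count is 0
    End If
End Sub
```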

    Granted, if you are just using VBA to automate some worksheet tasks or manipulate data, you probably won't care about any of this. But the moment you move on to more sophisticated code, losing the ability to test whether an object is Nothing, and losing control over when it's initialized, is a big deal and can produce unexpected behavior, not to mention make testing a pain in the butt. All that to save a few lines of code.

    Then there are the micro-optimizers who will argue that adding anything to your code that isn't needed makes for poor performance. While they are right in some ways, you'd most likely be saving on the order of 0.000000001 seconds in this case.

