How to define LTI systems with Time delay in Scipy?


Question


The transfer function of an LTI system with time delay has a numerator term exp(-Td * s), where Td is the time delay. In Matlab, one can create such an LTI system in several ways (e.g. using the "s" operator and setting the exponential term directly, or by setting the inputdelay/outputdelay properties of tf objects). However, I cannot find any way to do this with Scipy Signal LTI objects. I also checked the Python Control Systems Library, but still couldn't find a way.
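As a small illustration (my addition, using an arbitrary first-order plant, not from the original question), the rational part of such a transfer function can be built with scipy.signal, but the object has no attribute for the exp(-Td * s) factor:

# The rational part 2/(4s + 1) is easy to express; the delay term is not.
from scipy import signal
plant = signal.lti([2], [4, 1])   # 2/(4s + 1) -- no way to attach exp(-Td*s) here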


I do not want to use the Pade approximation for the time delay; I want to set the exact time delay on the LTI system.


Does anyone know how to achieve this in Scipy or in any other external Python library?

Answer


I checked out the ltisys module on GitHub and attempted to create an LTI class with a time delay. Introducing an input time delay in the state equation should be straightforward if we replace BU(t) by BU(t - Td), where Td is the time delay, i.e. the state equation becomes dx/dt = A·x(t) + B·u(t - Td). The following approach works for a single-input single-output system. It may not be bug-free, but it served my purpose.

# Imports needed by the snippets below (the original answer omitted them; the code
# follows the older scipy.signal API in which lti objects expose A, B, C and D directly)
from numpy import zeros, linspace, atleast_1d, dot, squeeze, transpose, nan_to_num
from scipy import interpolate
from scipy.integrate import odeint
from scipy.signal import lti, lsim2
from matplotlib.pyplot import plot

# Inherit from the lti class to create an LTI class that carries an input time delay
class ltidelay(lti):
    def __init__(self, inputdelay, *args, **kwargs):
        super(ltidelay, self).__init__(*args, **kwargs)
        self.d = inputdelay  # input time delay Td

# Define a function to simulate an LTI system with a time delay. It is essentially a copy
# of scipy.signal.lsim2 with two changes: 1. the delay is read from the ltidelay object,
# and 2. the state equation evaluates the input at t - delay.


def lsim3(system, U=None, T=None, X0=None, **kwargs):
    if isinstance(system, lti):
        sys = system
    else:
        sys = lti(*system)
    delay = sys.d
    if X0 is None:
        X0 = zeros(sys.B.shape[0], sys.A.dtype)
    if T is None:
        T = linspace(0, 10, 101)
    T = atleast_1d(T)
    if len(T.shape) != 1:
        raise ValueError("T must be a rank-1 array")
    if U is not None:
        U = atleast_1d(U)
        if len(U.shape) == 1:
            U = U.reshape(-1, 1)
        sU = U.shape
        if sU[0] != len(T):
            raise ValueError("U must have the same number of rows as elements in T")
        if sU[1] != sys.inputs:
            raise ValueError("The number of inputs in U is not compatible")
        # Interpolate the input so it can be evaluated at the shifted time t - delay
        ufunc = interpolate.interp1d(T, U, kind='linear', axis=0, bounds_error=False)

        def fprime(x, t, sys, ufunc):
            # State equation with the delayed input: dx/dt = A x(t) + B u(t - delay)
            return dot(sys.A, x) + squeeze(dot(sys.B, nan_to_num(ufunc([t - delay]))))

        xout = odeint(fprime, X0, T, args=(sys, ufunc), **kwargs)
        yout = dot(sys.C, transpose(xout))
    else:
        # No input: simulate the autonomous system dx/dt = A x(t)
        def fprime(x, t, sys):
            return dot(sys.A, x)

        xout = odeint(fprime, X0, T, args=(sys,), **kwargs)
        yout = dot(sys.C, transpose(xout))
    return T, squeeze(transpose(yout)), xout

# Create the LTI system 2/(4s + 1) with an input delay of 10
tf = ltidelay(10, 2, [4, 1])

# Create a step input and a time vector to simulate the system
u = zeros(100)
u[50:100] = 1
t = linspace(1, 100, 100)

# Simulate with the time delay
y = lsim3(tf, u, t, X0=0)
plot(y[1])

# Compare with lsim2, which ignores the delay attribute
y1 = lsim2(tf, u, t, X0=0)
plot(y1[1])

# The delayed response lags the undelayed one by 10 time units, so the delay works
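
As a quick sanity check (my addition, not part of the original answer), the delayed response should simply be the lsim2 response shifted by the delay; with the unit time grid above that is about a 10-sample lag:

# Rough check, reusing y and y1 from the snippet above (the DC gain is 2)
import numpy as np
i_undelayed = np.argmax(y1[1] > 0.5)   # index where the undelayed output crosses 0.5
i_delayed = np.argmax(y[1] > 0.5)      # index where the delayed output crosses 0.5
print(i_delayed - i_undelayed)         # roughly 10 samples, i.e. the input delay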
