sqlalchemy autoloaded orm persistence
Question
We are using SQLAlchemy's autoload feature to do column mapping, to prevent hardcoding column definitions in our code.
class users(Base):
    __tablename__ = 'users'
    __table_args__ = {
        'autoload': True,
        'mysql_engine': 'InnoDB',
        'mysql_charset': 'utf8'
    }
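For illustration, here is a minimal, self-contained sketch of what table reflection does: it discovers the column definitions from a live database instead of having them hardcoded. An in-memory SQLite database and a hypothetical `users` table stand in for the real MySQL schema:

```python
from sqlalchemy import create_engine, MetaData, Table, text

# a throwaway database with a pre-existing table to reflect
engine = create_engine('sqlite://')
with engine.begin() as conn:
    conn.execute(text('CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)'))

# reflect the table instead of declaring its columns by hand
metadata = MetaData()
users = Table('users', metadata, autoload_with=engine)

# the column objects were discovered from the database, not typed in
column_names = [c.name for c in users.columns]
```

`autoload_with` is the modern spelling of the `autoload=True` option used in the question; both trigger the same reflection step against the given engine.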
Is there a way to serialize or cache the autoloaded metadata/ORM classes, so we don't have to go through the autoload process every time we need to reference our ORM classes from other scripts/functions?
I have looked at Beaker caching and pickle, but haven't found a clear answer on whether it is possible or how to do it.
Ideally, we would run the autoload mapping script only when we have committed changes to our database structure, and reference a non-autoloaded/persistent/cached version of our database mapping from all other scripts/functions.
Any ideas?
Answer
What I am doing now is to pickle the metadata after running the reflection through a database connection (MySQL), and once a pickle is available, use that pickled metadata to reflect on the schema, with the metadata bound to an SQLite engine.
import pickle
from sqlalchemy import create_engine, MetaData, Table
from sqlalchemy.ext.declarative import declarative_base

cachefile = 'orm.p'
dbfile = 'database'
engine_dev = create_engine('...', echo=True)   # db connect string for MySQL
engine_meta = create_engine('sqlite:///%s' % dbfile, echo=True)
Base = declarative_base()
Base.metadata.bind = engine_dev
metadata = MetaData(bind=engine_dev)

# load from pickle
try:
    with open(cachefile, 'rb') as cache:
        metadata2 = pickle.load(cache)
    metadata2.bind = engine_meta

    class Users(Base):
        __table__ = Table('users', metadata2, autoload=True)

    print "ORM loaded from pickle"

# if no pickle yet, reflect through the database connection
except IOError:
    class Users(Base):
        __table__ = Table('users', metadata, autoload=True)

    print "ORM through database autoload"

    # create metapickle
    metadata.create_all()
    with open(cachefile, 'wb') as cache:
        pickle.dump(metadata, cache)
Any comments on whether this is alright (it works), or whether there is something I can improve?
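The pickle round-trip at the heart of this approach can be demonstrated in isolation. In this sketch an in-memory SQLite database and a hypothetical `users` table stand in for the MySQL source; the point is that a reflected `MetaData` object survives serialization, so the table definitions can be rebuilt later without touching the database:

```python
import pickle
from sqlalchemy import create_engine, MetaData, Table, text

# reflect a schema once over a live connection...
source = create_engine('sqlite://')
with source.begin() as conn:
    conn.execute(text('CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)'))

metadata = MetaData()
Table('users', metadata, autoload_with=source)

# ...serialize the reflected MetaData...
blob = pickle.dumps(metadata)

# ...and later rebuild the table definitions without a database connection
restored = pickle.loads(blob)
restored_columns = sorted(restored.tables['users'].columns.keys())
```

Pickling `MetaData` like this is the same trick the answer relies on; in a real setup `blob` would be written to the cache file and loaded by the other scripts.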