How to configure a database connection for the production environment in a Pentaho Data Integration (Kettle) transformation


Question

I designed a .ktr file for a transformation. I need to configure the database connection details for the production environment. How can I do this? Any suggestions?

Answer

I use environment variables:

KETTLE_HOME
KETTLE_JNDI_ROOT
PATH=$PATH:$KETTLE_HOME
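
A minimal sketch of how these might be set in /etc/profile; the paths follow the layout described below, and treating KETTLE_JNDI_ROOT as pointing at the simple-jndi directory is my assumption:

# sketch of /etc/profile entries -- adjust paths to your own layout
export KETTLE_HOME=/opt/kettle/data-integration
export KETTLE_JNDI_ROOT=$KETTLE_HOME/simple-jndi
export PATH=$PATH:$KETTLE_HOME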

Kettle home is just a link to a directory. By default I have a directory dedicated to the Data Integration suite; it contains several versions of Kettle.

Example

/opt/kettle/data-integration-4.4.0 (a few old jobs made several years ago)
/opt/kettle/data-integration-5.2.0 (currently used)
/opt/kettle/data-integration-6.0.0 (on test)

Then there is a link pointing to the current default (something like alternatives in Debian). This makes it easy to keep several versions of Kettle on the same machine.

ln -s /opt/kettle/data-integration-5.2.0 /opt/kettle/data-integration
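
To change the default later, the link can simply be repointed to another install (a sketch using the versions listed above):

# switch the default suite to the 6.0.0 install
ln -sfn /opt/kettle/data-integration-6.0.0 /opt/kettle/data-integration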

Now

/opt/kettle/data-integration 

is a link to the main suite, which is used by default. This is done for JDK compatibility reasons: 4.4.0 is JDK 6 based; 5.2.0 is JDK 7 based (and does not work on JDK 8 because of a bug in the XStream XML library, though that may have been fixed by now); the 6.0 branch works well on JDK 8 even though it is built on JDK 7.

The variables are configured in /etc/profile. The file jdbc.properties is located in

$KETTLE_HOME/simple-jndi 

and is shared by all jobs. This file, with its connection settings, is regenerated by Puppet whenever the network configuration changes; if the administrators change something, e.g. during a migration, Puppet automatically creates a new file.

For connection definitions I use naming rules:

 main       - connection to the production database
 main_slave - connection to the slave database (read-only)
 test       - connection to the test database (on a separate machine)
 test_slave

and so on...
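
For illustration, a sketch of what the shared $KETTLE_HOME/simple-jndi/jdbc.properties could contain for these names, using the standard simple-jndi entry format; the driver class, hosts, database names and credentials are placeholders:

# jdbc.properties sketch -- hosts, database names and credentials are placeholders
main/type=javax.sql.DataSource
main/driver=org.postgresql.Driver
main/url=jdbc:postgresql://prod-db:5432/dwh
main/user=etl
main/password=secret

main_slave/type=javax.sql.DataSource
main_slave/driver=org.postgresql.Driver
main_slave/url=jdbc:postgresql://prod-db-replica:5432/dwh
main_slave/user=etl_ro
main_slave/password=secret

In the transformation, the database connection is then defined with access type JNDI and the JNDI name (for example main), so the same .ktr file runs against production or test depending only on which jdbc.properties is deployed on that machine.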
