Spring boot with spring batch and jpa - configuration

Problem description
I have a simple batch application that reads a CSV file into a Postgres database.
I have uploaded the code to the repo below:
https://github.com/soasathish/spring-batch-with-jpa.git
I have problems configuring the database writer using Spring Data JPA.
I am getting a "managed bean not found" issue.
The same Spring Data JPA configuration works in a different project; when I tried to integrate it with Spring Batch, it fails with "managed bean not found".
The batch config has a Spring job with only one step:
1) Reader - reads from the CSV file.
2) Processor - applies some rules (Drools) to the records.
3) Writer - uses Spring Data JPA to write to the DB.
Please run schema-postgresql.sql to set up the database.
Could anyone help?
I know it's a minor issue, but any direction or help would be appreciated.
Code from the repo:
=======================
package uk.gov.iebr.batch.config;
import static uk.gov.iebr.batch.config.AppProperties.DRIVER_CLASS_NAME;
import static uk.gov.iebr.batch.config.AppProperties.IEBR_DB_PASSWORD_KEY;
import static uk.gov.iebr.batch.config.AppProperties.IEBR_DB_URL_KEY;
import static uk.gov.iebr.batch.config.AppProperties.IEBR_DB_USER_KEY;
import java.util.Properties;
import javax.sql.DataSource;
import org.hibernate.jpa.HibernatePersistenceProvider;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.context.annotation.PropertySource;
import org.springframework.core.env.Environment;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.jdbc.datasource.DriverManagerDataSource;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;
@Configuration
@PropertySource({"classpath:application.properties"})
@EnableJpaRepositories({"uk.gov.iebr.batch.repository"})
@EnableTransactionManagement
@ComponentScan(basePackages="uk.gov.iebr.batch.repository")
public class DataSourceConfiguration {
@Autowired
Environment env;
@Bean(name = "allsparkEntityMF")
public LocalContainerEntityManagerFactoryBean allsparkEntityMF() {
final LocalContainerEntityManagerFactoryBean em = new LocalContainerEntityManagerFactoryBean();
em.setDataSource(allsparkDS());
em.setPersistenceUnitName("allsparkEntityMF");
// note: calling setPackagesToScan twice overwrites the first call, so pass both packages at once
em.setPackagesToScan(new String[] { "uk.gov.iebr.batch", "uk.gov.iebr.batch.repository" });
em.setPersistenceProvider(new HibernatePersistenceProvider());
HibernateJpaVendorAdapter a = new HibernateJpaVendorAdapter();
em.setJpaVendorAdapter(a);
Properties p = hibernateSpecificProperties();
p.setProperty("hibernate.ejb.entitymanager_factory_name", "allsparkEntityMF");
em.setJpaProperties(p);
return em;
}
@Bean(name = "allsparkDS")
public DataSource allsparkDS() {
final DriverManagerDataSource dataSource = new DriverManagerDataSource();
dataSource.setDriverClassName(env.getProperty(DRIVER_CLASS_NAME));
dataSource.setUrl(env.getProperty(IEBR_DB_URL_KEY));
dataSource.setUsername(env.getProperty(IEBR_DB_USER_KEY));
dataSource.setPassword(env.getProperty(IEBR_DB_PASSWORD_KEY));
return dataSource;
}
@Bean
public Properties hibernateSpecificProperties(){
final Properties p = new Properties();
p.setProperty("hibernate.hbm2ddl.auto", env.getProperty("spring.jpa.hibernate.ddl-auto"));
p.setProperty("hibernate.dialect", env.getProperty("spring.jpa.hibernate.dialect"));
p.setProperty("hibernate.show_sql", env.getProperty("spring.jpa.show-sql"));
p.setProperty("hibernate.cache.use_second_level_cache", env.getProperty("spring.jpa.hibernate.cache.use_second_level_cache"));
p.setProperty("hibernate.cache.use_query_cache", env.getProperty("spring.jpa.hibernate.cache.use_query_cache"));
return p;
}
@Bean(name = "defaultTm")
public PlatformTransactionManager transactionManager() {
JpaTransactionManager txManager = new JpaTransactionManager();
txManager.setEntityManagerFactory(allsparkEntityMF().getObject());
return txManager;
}
}
Batch config file:
package uk.gov.iebr.batch;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;
import org.springframework.context.annotation.PropertySource;
import uk.gov.iebr.batch.config.AllSparkDataSourceConfiguration;
import uk.gov.iebr.batch.config.DataSourceConfiguration;
import uk.gov.iebr.batch.dao.PersonDao;
import uk.gov.iebr.batch.model.Person;
import uk.gov.iebr.batch.step.Listener;
import uk.gov.iebr.batch.step.Processor;
import uk.gov.iebr.batch.step.Reader;
import uk.gov.iebr.batch.step.Writer;
@Configuration
@EnableBatchProcessing
//spring boot configuration
@EnableAutoConfiguration
//file that contains the properties
@PropertySource("classpath:application.properties")
@Import({DataSourceConfiguration.class, AllSparkDataSourceConfiguration.class})
public class BatchConfig {
private static final Logger log = LoggerFactory.getLogger(BatchConfig.class);
@Autowired
public JobBuilderFactory jobBuilderFactory;
@Autowired
public StepBuilderFactory stepBuilderFactory;
@Autowired
public PersonDao PersonDao;
@Autowired
public DataSourceConfiguration dataSourceConfiguration;
@Bean
public Job job() {
long startTime = System.currentTimeMillis();
log.info("START OF BATCH ========================================================================" +startTime);
return jobBuilderFactory.get("job").incrementer(new RunIdIncrementer())
//.listener(new Listener(PersonDao))
.flow(step1()).end().build();
}
@Bean
public Step step1() {
return stepBuilderFactory.get("step1").<Person, Person>chunk(10)
.reader(Reader.reader("tram-data.csv"))
.processor(new Processor()).writer(new Writer(PersonDao)).build();
}
}
Writer calls this PersonDaoImpl:
public class PersonDaoImpl implements PersonDao {
@Autowired
DataSourceConfiguration dataSource;
@Autowired
PersonRepository personrepo;
@Override
public void insert(List<? extends Person> persons) {
personrepo.save(persons);
}
}
Solution

Based on the code you provided and the stack trace in your comment:
It's complaining that it can't find a @Bean named entityManagerFactory.
The reason this is happening is that you are using @EnableJpaRepositories, and its entityManagerFactoryRef attribute defaults to entityManagerFactory. This attribute defines the name of the @Bean for the EntityManagerFactory.
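For example (a sketch, assuming the bean names from the question's DataSourceConfiguration), the repositories can be pointed at the custom-named factory explicitly instead of relying on the default:

```java
import org.springframework.context.annotation.Configuration;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;

// Hypothetical sketch: wire @EnableJpaRepositories to the custom-named
// EntityManagerFactory and transaction manager beans from the question,
// rather than the default "entityManagerFactory" / "transactionManager" names.
@Configuration
@EnableJpaRepositories(
        basePackages = "uk.gov.iebr.batch.repository",
        entityManagerFactoryRef = "allsparkEntityMF", // matches @Bean(name = "allsparkEntityMF")
        transactionManagerRef = "defaultTm")          // matches @Bean(name = "defaultTm")
public class JpaRepositoryConfig {
}
```

With that, Spring Data no longer looks for a bean literally named entityManagerFactory.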
I think your application configuration is preventing spring-boot's normal auto-configuration from being processed.
I would recommend removing the IEBRFileProcessApplication class and following this example to configure your spring-boot application (you could use ServletInitializer if you want).
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.boot.web.support.SpringBootServletInitializer;
@SpringBootApplication
public class Application extends SpringBootServletInitializer {
@Override
protected SpringApplicationBuilder configure(SpringApplicationBuilder application) {
return application.sources(Application.class);
}
public static void main(String[] args) throws Exception {
SpringApplication.run(Application.class, args);
}
}
I also can't really see a need for DataSourceConfiguration and AllSparkDataSourceConfiguration, so I would recommend removing them. If you really need to specify your own DataSource, let me know and I can provide an additional example.
Between the @SpringBootApplication and @EnableBatchProcessing annotations, everything that is necessary will be bootstrapped for you.
All you need on BatchConfig is @Configuration and @EnableBatchProcessing.
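A minimal sketch of what BatchConfig could be reduced to (assuming Boot's auto-configured DataSource backs both the batch metadata tables and JPA; the step bean stays as in the question):

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Sketch only: these two annotations are all the batch bootstrapping needed
// once @SpringBootApplication drives the rest of the configuration.
@Configuration
@EnableBatchProcessing
public class BatchConfig {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Bean
    public Job job(Step step1) { // step1 is injected, defined as in the question
        return jobBuilderFactory.get("job")
                .incrementer(new RunIdIncrementer())
                .flow(step1).end().build();
    }
}
```

No @EnableAutoConfiguration, @PropertySource, or @Import is required here; @SpringBootApplication already covers them.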
If you make these changes to simplify your code base, then your problems should disappear.
UPDATE:
I created a pull request here: https://github.com/soasathish/spring-batch-with-jpa/pull/1
Please take a look at the javadoc for an explanation of how @EnableBatchProcessing works: http://docs.spring.io/spring-batch/apidocs/org/springframework/batch/core/configuration/annotation/EnableBatchProcessing.html