Spring + Hibernate save() not working


Problem description


I'm attaching the (problematic) Spring service code below:

@Async
    public void performSeismicOperations(Integer sessionUserId,
            int seismicFileId, String seismicFileName, ShClstr targetCluster,
            Collection<String> listOperations, String processedFolderName,
            Map<String, Object[]> args, String userNotes) throws IOException {

            .
            .
            .
            /*some code*/
            .
            .
        Date currentDate = new Date(System.currentTimeMillis());

            /*IMMEDIATE JOB ENTRY*/    
        log.info("Start : Inserting in sh_job to assure user");
        ShJob shJob = new ShJob(user, ClusterConstants.JOB_SUBMITTED,
                currentDate, null, null, null);
        shJobDAO.save(shJob);
        log.info("End : Inserting in sh_job to assure user");

        /*some time-consuming operation - 1*/

        SeismicFiles processedSeismicFile = new SeismicFiles(user,
                processedFolderName, 0, HDFSConstants.PROCESSED, currentDate);
        seismicFilesDAO.persist(processedSeismicFile);

        /*some time-consuming operation - 2*/

        log.info("Start : Updating the Hadoop job id");
        shJob.setShjHadoopJobId(hadoopJobId);
        shJobDAO.attachDirty(shJob);
        log.info("End : Updating the Hadoop job id");

            .
            .
            .
            /*some code*/
            .
            .

        log.info("Returning from SeismicHadoopServiceImpl.performSeismicOperations()");
    }

DAO code

import java.util.List;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.hibernate.LockMode;
import org.hibernate.Query;
import org.hibernate.SessionFactory;
import org.hibernate.criterion.Example;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Repository;

import com.lnt.seismichadoop.pojo.ShJob;

@Repository
public class ShJobDAO {

    private static final Log log = LogFactory.getLog(ShJobDAO.class);

    @Autowired
    private SessionFactory sessionFactory;

    public void setSessionFactory(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    public void persist(ShJob transientInstance) {
        log.debug("persisting ShJob instance");
        try {
            sessionFactory.getCurrentSession().persist(transientInstance);
            log.debug("persist successful");
        } catch (RuntimeException re) {
            log.error("persist failed", re);
            throw re;
        }
    }

    public void save(ShJob transientInstance) {
        log.debug("SAVING ShJob instance");
        try {
            sessionFactory.getCurrentSession().save(transientInstance);
            log.debug("save successful");
        } catch (RuntimeException re) {
            log.error("save failed", re);
            throw re;
        }
    }

    public void attachDirty(ShJob instance) {
        log.debug("attaching dirty ShJob instance");
        try {
            sessionFactory.getCurrentSession().saveOrUpdate(instance);
            log.debug("attach successful");
        } catch (RuntimeException re) {
            log.error("attach failed", re);
            throw re;
        }
    }

    public void attachClean(ShJob instance) {
        log.debug("attaching clean ShJob instance");
        try {
            sessionFactory.getCurrentSession().lock(instance, LockMode.NONE);
            log.debug("attach successful");
        } catch (RuntimeException re) {
            log.error("attach failed", re);
            throw re;
        }
    }

    public void delete(ShJob persistentInstance) {
        log.debug("deleting ShJob instance");
        try {
            sessionFactory.getCurrentSession().delete(persistentInstance);
            log.debug("delete successful");
        } catch (RuntimeException re) {
            log.error("delete failed", re);
            throw re;
        }
    }

    public ShJob merge(ShJob detachedInstance) {
        log.debug("merging ShJob instance");
        try {
            ShJob result = (ShJob) sessionFactory.getCurrentSession().merge(
                    detachedInstance);
            log.debug("merge successful");
            return result;
        } catch (RuntimeException re) {
            log.error("merge failed", re);
            throw re;
        }
    }

    public ShJob findById(java.lang.Integer id) {
        log.debug("getting ShJob instance with id: " + id);
        try {
            ShJob instance = (ShJob) sessionFactory.getCurrentSession().get(
                    "com.lnt.seismic.dao.ShJob", id);
            if (instance == null) {
                log.debug("get successful, no instance found");
            } else {
                log.debug("get successful, instance found");
            }
            return instance;
        } catch (RuntimeException re) {
            log.error("get failed", re);
            throw re;
        }
    }

    public List findByExample(ShJob instance) {
        log.debug("finding ShJob instance by example");
        try {
            List results = sessionFactory.getCurrentSession()
                    .createCriteria("com.lnt.seismic.dao.ShJob")
                    .add(Example.create(instance)).list();
            log.debug("find by example successful, result size: "
                    + results.size());
            return results;
        } catch (RuntimeException re) {
            log.error("find by example failed", re);
            throw re;
        }
    }

    public List<ShJob> findAll() {
        log.debug("finding ShJob instances via findAll");
        try {
            Query query = sessionFactory.getCurrentSession().createQuery(
                    "from ShJob");
            List<ShJob> results = query.list();
            log.debug("find by findAll successful, result size: "
                    + results.size());
            return results;
        } catch (RuntimeException re) {
            log.error("findAll failed", re);
            throw re;
        }
    }
}

My requirement is that an entry must go into a job table as soon as the processing initiates (/*IMMEDIATE JOB ENTRY*/ in the code). After /*some time-consuming operation - 2*/, I will update the same entry (with an appropriate status). Although I have read about the difference between save() and persist(), my save() still defers the insert until /*some time-consuming operation - 2*/, which, in turn, means the entry shows up very late in the front end.

Please guide me as to where I'm making a blunder.

1st EDIT

In my case, the user submits an operation request which reaches the above service method, which is marked @Async - the user must see a page showing his request as 'SUBMITTED' while the operation is still going on in the service method. In this case, shall I use session.flush(), or do I need to make other code changes?

Solution

save() and persist(), and in general every operation you perform on persistent entities, are deferred until really necessary, in order to avoid needless round trips to the database.
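To make this deferral concrete, here is a toy illustration (plain Java, not Hibernate itself) of the write-behind behavior described above: a "session" that only queues an INSERT on save() and executes the queued statements at flush time, much like Hibernate's internal action queue.

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of Hibernate's deferred writes: save() merely registers the
// insert; nothing reaches the "database" until flush() runs the queue.
public class DeferredSession {
    private final List<String> pendingSql = new ArrayList<>();
    private final List<String> database = new ArrayList<>();

    // Queue the insert instead of executing it immediately.
    public void save(String entity) {
        pendingSql.add("INSERT INTO sh_job VALUES ('" + entity + "')");
    }

    // Execute every queued statement against the "database".
    public void flush() {
        database.addAll(pendingSql);
        pendingSql.clear();
    }

    public int pendingCount() { return pendingSql.size(); }
    public int rowCount()     { return database.size(); }

    public static void main(String[] args) {
        DeferredSession session = new DeferredSession();
        session.save("shJob");
        System.out.println("rows after save():  " + session.rowCount());  // 0
        session.flush();
        System.out.println("rows after flush(): " + session.rowCount()); // 1
    }
}
```

This is why a log line printed right after shJobDAO.save(shJob) does not imply the row is already in the table.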

You can make Hibernate write every pending change to the database by using session.flush(), but that won't make the entity visible to the front-end, because the front-end doesn't use the same transaction to read the data as the one performing the long operation and persisting the entities.

And since transactions run in isolation (the default isolation level being READ_COMMITTED most of the time), a transaction won't see anything written by another transaction until that other transaction commits to the database.
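The visibility rule above can be sketched with another toy model (plain Java, not a real database): each "transaction" buffers its own writes and publishes them to the shared committed store only on commit(), so a concurrent reader sees nothing until then.

```java
import java.util.HashMap;
import java.util.Map;

// Toy model of READ_COMMITTED visibility: writes stay private to a
// transaction until commit() publishes them to the shared store.
public class IsolationDemo {
    private final Map<String, String> committed = new HashMap<>();

    public class Tx {
        private final Map<String, String> writeSet = new HashMap<>();

        public void put(String key, String value) { writeSet.put(key, value); }

        // A read sees this transaction's own writes, else only committed data.
        public String get(String key) {
            String own = writeSet.get(key);
            return own != null ? own : committed.get(key);
        }

        public void commit() {
            committed.putAll(writeSet);
            writeSet.clear();
        }
    }

    public Tx begin() { return new Tx(); }

    public static void main(String[] args) {
        IsolationDemo db = new IsolationDemo();
        Tx writer = db.begin();
        Tx reader = db.begin();
        writer.put("sh_job:1", "SUBMITTED");
        System.out.println("reader before commit: " + reader.get("sh_job:1")); // null
        writer.commit();
        System.out.println("reader after commit:  " + reader.get("sh_job:1")); // SUBMITTED
    }
}
```

This is why even an explicit flush() inside the long-running @Async transaction does not help the front-end: the flushed row is still uncommitted.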

If you want to see the inserted entity immediately, save it in a separate transaction from the rest of the long-running operation, or change the isolation level to READ_UNCOMMITTED.
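The separate-transaction approach could look roughly like the following sketch, assuming Spring's declarative transaction management is configured; JobEntryService and recordSubmission are hypothetical names introduced here for illustration, not part of the original code:

```java
// Hypothetical sketch: move the initial insert into its own bean and run it
// with REQUIRES_NEW, so it commits immediately and independently of the
// long-running @Async operation.
@Service
public class JobEntryService {

    @Autowired
    private ShJobDAO shJobDAO;

    // Suspends any surrounding transaction, commits this insert on return,
    // so the front-end's separate read transaction can see the row at once.
    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public ShJob recordSubmission(User user, Date currentDate) {
        ShJob shJob = new ShJob(user, ClusterConstants.JOB_SUBMITTED,
                currentDate, null, null, null);
        shJobDAO.save(shJob);
        return shJob;
    }
}
```

Note that REQUIRES_NEW only takes effect when the method is invoked through the Spring proxy, i.e. from another bean; calling it from within the same class (self-invocation) bypasses the transactional proxy.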
