How to parse json efficiently in Oracle 18c?

Problem Description

I am trying to parse a large JSON document using the JSON_OBJECT_T and JSON_ARRAY_T APIs. It is working fine, but I would like expert suggestions on whether it is efficient or not.

I am adding my JSON file and the parsing code below.

SampleJson

Code

SET SERVEROUTPUT ON;
DECLARE
   l_clob clob;
   l_time timestamp;

   l_json json_object_t;
   l_stops_array json_array_t;
   l_stops_arr json_array_t;

   routeInfoObj json_object_t;
   routeStopArr json_array_t;
BEGIN
   SELECT LOG_CLOB INTO l_clob FROM ITV_DEV_LOGS WHERE LOG_ID = 1435334;
   l_time := systimestamp;

   l_json := json_object_t.parse( l_clob );

   dbms_output.put_line( 'Parsing Time: ' || extract(second from( systimestamp - l_time ) ) );
   l_stops_array := l_json.get_array('data');
   DBMS_OUTPUT.PUT_LINE('Data array: '||l_stops_array.get_size);
   FOR i in 0..l_stops_array.get_size-1 loop
      l_stops_arr := TREAT(l_stops_array.get(i) AS JSON_OBJECT_T).get_array('routedStops');
      DBMS_OUTPUT.PUT_LINE('stops array: '||l_stops_arr.get_size);
      FOR j in 0..l_stops_arr.get_size - 1 loop
         routeInfoObj := TREAT(l_stops_arr.get(j) AS JSON_OBJECT_T).get_object('routingInfo');
         DBMS_OUTPUT.PUT_LINE('Stop : ' || routeInfoObj.get_number('stop'));
         routeStopArr := TREAT(l_stops_arr.get(j) AS JSON_OBJECT_T).get_array('routedJobs');
         FOR k in 0..routeStopArr.get_size - 1 loop
            DBMS_OUTPUT.PUT_LINE('JobRef : ' || TREAT(routeStopArr.get(k) AS JSON_OBJECT_T).get_string('jobRef'));
         -- update query to update stop value to respective jobRef
         end loop;
      end loop;
   end loop;
END;
/
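One small refinement worth considering (a sketch only, not benchmarked against the real data): the inner loop evaluates `TREAT(l_stops_arr.get(j) AS JSON_OBJECT_T)` twice per stop. Caching the cast result once per iteration in a local variable (here a hypothetical `l_stop_obj` declared as `json_object_t`) avoids the duplicated work:

```sql
-- Hypothetical refactor of the inner j-loop; requires one extra declaration:
--    l_stop_obj json_object_t;
FOR j in 0..l_stops_arr.get_size - 1 loop
   l_stop_obj   := TREAT(l_stops_arr.get(j) AS JSON_OBJECT_T);  -- cast once
   routeInfoObj := l_stop_obj.get_object('routingInfo');
   routeStopArr := l_stop_obj.get_array('routedJobs');
   -- rest of the loop body unchanged
end loop;
```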

It's working fine, but is there a way to improve this implementation? This is just a sample JSON, and the count of objects inside may grow to 2000. Also, instead of updating records one by one, is there a way to update all records in one statement?

Recommended Answer

You can use json_table() to turn the JSON value into a relational representation. That in turn can be used within a MERGE statement.

For example, the following query:

select j.*
from itv_dev_logs
  cross join json_table(log_clob, '$.data.routedStops[*]'
                        columns stop_id integer path '$.stopId',
                                zone_ltf integer path '$.zoneLTF', 
                                info_stop_nr integer path '$.routingInfo.stop',
                                info_route_ref varchar(20) path '$.routingInfo.routeRef',
                                info_eta varchar(20) path '$.routingInfo.eta',
                                info_eta_dt timestamp path '$.routingInfo.etaDateTime',
                                info_stop_time number path '$.routingInfo.stopTime'
                        ) j
where log_id = 1435334;

returns something like this:

STOP_ID       | ZONE_LTF | INFO_STOP_NR | INFO_ROUTE_REF | INFO_ETA | INFO_ETA_DT             | INFO_STOP_TIME | INFO_DIST_PREV_STOP | INFO_BREAK_TIME | INFO_BREAK_DURATION
--------------+----------+--------------+----------------+----------+-------------------------+----------------+---------------------+-----------------+--------------------
1554383571432 |        1 |            1 | R119           | 11:01    | 2019-04-16 11:01:00.000 |           0.08 |                0.27 | 00:00           | 00:00              
1554383571515 |        1 |            2 | R119           | 11:07    | 2019-04-16 11:07:00.000 |           0.08 |                0.34 | 00:00           | 00:00              
1554383571601 |        1 |            3 | R119           | 11:13    | 2019-04-16 11:13:00.000 |           0.08 |                   0 | 00:00           | 00:00              
1554383571671 |        1 |            4 | R119           | 11:19    | 2019-04-16 11:19:00.000 |           0.08 |                   0 | 00:00           | 00:00              
1554383571739 |        1 |            5 | R119           | 11:25    | 2019-04-16 11:25:00.000 |           0.08 |                   0 | 00:00           | 00:00              
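The sample JSON also nests a routedJobs array under each stop, which the PL/SQL loop above reads to get each jobRef. If those values are needed in the same relational view, json_table supports a NESTED PATH clause that emits one row per job; the following is a sketch under the assumption that the structure matches the PL/SQL code (paths and column names are guesses):

```sql
select j.*
from itv_dev_logs
  cross join json_table(log_clob, '$.data[*].routedStops[*]'
                        columns stop_id      integer path '$.stopId',
                                info_stop_nr integer path '$.routingInfo.stop',
                                -- one output row per element of routedJobs
                                nested path '$.routedJobs[*]'
                                  columns (job_ref varchar(20) path '$.jobRef')
                        ) j
where log_id = 1435334;
```

Each stop row is repeated once per job, so (stop_id, job_ref) pairs become available for joining in a single statement.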


That could be used as the source of a MERGE statement to update your target table:

merge into your_target_table tg
using (
    select j.*
    from itv_dev_logs
      cross join json_table(log_clob, '$.data.routedStops[*]'
                            columns stop_id integer path '$.stopId',
                                    zone_ltf integer path '$.zoneLTF', 
                                    info_stop_nr integer path '$.routingInfo.stop',
                                    info_route_ref varchar(20) path '$.routingInfo.routeRef',
                                    info_eta varchar(20) path '$.routingInfo.eta',
                                    info_eta_dt timestamp path '$.routingInfo.etaDateTime',
                                    info_stop_time number path '$.routingInfo.stopTime'
                            ) j
    where log_id = 1435334
) t on (t.stop_id = tg.stop_id and ... more join conditions ...)
when matched then update
    set stop_nr = t.info_stop_nr, 
        eta_timestamp = t.info_eta_dt, 
        ...
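Since the goal of the original loop was to update a stop value for each jobRef, the MERGE could equally be keyed on the job reference, exposing one row per job via a NESTED PATH clause inside json_table. This is only a sketch: your_target_table, its columns, and the JSON paths are all guesses that need to be replaced with the real names:

```sql
merge into your_target_table tg
using (
    select j.info_stop_nr, j.job_ref
    from itv_dev_logs
      cross join json_table(log_clob, '$.data[*].routedStops[*]'
                            columns info_stop_nr integer path '$.routingInfo.stop',
                                    nested path '$.routedJobs[*]'
                                      columns (job_ref varchar(20) path '$.jobRef')
                            ) j
    where log_id = 1435334
) t on (tg.job_ref = t.job_ref)
when matched then update
    set stop_nr = t.info_stop_nr;
```

This replaces the row-by-row updates inside the nested loops with a single set-based statement.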


As you provided neither the structure of the target table nor information about which JSON keys should be matched to which table columns, all column names are just guesses; you need to replace them with the correct names.
