The Oracle JSON functions are very useful for generating JSON from a query, but developing with them requires understanding the limitations of the string data types they return.
Unless otherwise specified, they return a VARCHAR2 with a maximum of 4000 bytes. If your query might return more than this, you must either specify a larger length, e.g. RETURNING VARCHAR2(32767), or request a CLOB, e.g. RETURNING CLOB.
If the data exceeds the limit, calls to JSON_OBJECT, JSON_OBJECTAGG, JSON_ARRAYAGG, and JSON_TRANSFORM will fail at runtime with an exception, as in this example:
select
    json_object(
        'name-is-twenty-chars' : rpad('x',3974,'x')
    )
from dual;
ORA-40478: output value too large (maximum: 4000)
The error occurs here because the representation of the entire JSON object (attribute name, quotes, braces and all) requires more than 4000 bytes. No-one likes to see errors, but this is better than the alternative of silently returning NULL, because the error alerts you to the problem so you can fix it.
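For example, this variation of the same query succeeds, because the result is returned as a CLOB instead of the default VARCHAR2(4000):
select
    json_object(
        'name-is-twenty-chars' : rpad('x',3974,'x')
        returning clob
    )
from dual;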
You may have noticed I missed one of the JSON functions from the list above – JSON_MERGEPATCH. By default, this function does not raise an exception if the size limit is exceeded. Instead, it merely returns NULL at runtime. This behaviour can cause confusion when debugging a complex query, so it’s something to be aware of.
Note that even if both JSON objects being merged specify RETURNING CLOB, if the clause is missed on the JSON_MERGEPATCH itself, the merge is limited to the default 4000 bytes, causing it to return NULL. The fix is to add RETURNING CLOB to the JSON_MERGEPATCH as well.
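A minimal sketch of the corrected pattern (the attribute names and values here are made up for illustration):
select
    json_mergepatch(
        json_object('name'  : rpad('x',3974,'x') returning clob),
        json_object('notes' : 'patch value' returning clob)
        returning clob
    )
from dual;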
I have been working with some code that uses JSON, sometimes fairly large documents of the stuff, and it's often necessary to send this to the debug log (e.g. via DBMS_OUTPUT); however, the built-in functions that convert a JSON object to a string (or CLOB) return the JSON document in one big long line, like this:
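{"customer":{"name":"Jane Citizen","address":{"street":"1 Example St","city":"Perth"},"orders":[{"id":1001,"total":59.95},{"id":1002,"total":17.5}]}}
(the data above is made up for illustration, but you get the idea: no line breaks, no indentation)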
To show this formatted, I added the following function, which uses JSON_SERIALIZE with the "PRETTY" option, to my utility package:
function format_json (p_clob in clob) return clob is
    l_blob blob;
    l_clob clob;

    -- convert the CLOB to a BLOB so it can be fed to json_serialize
    function clob_to_blob (p_clob in clob) return blob is
        l_blob blob;
        o1 integer := 1;  -- destination offset
        o2 integer := 1;  -- source offset
        c  integer := 0;  -- language context
        w  integer := 0;  -- warning
    begin
        sys.dbms_lob.createtemporary(l_blob, true);
        sys.dbms_lob.converttoblob(l_blob, p_clob, length(p_clob), o1, o2, 0, c, w);
        return l_blob;
    end clob_to_blob;
begin
    l_blob := clob_to_blob(p_clob);
    -- re-serialize the JSON with line breaks and indentation
    select json_serialize(l_blob returning clob pretty) into l_clob from dual;
    return l_clob;
end format_json;
Note that my function takes a CLOB, not a JSON object, because sometimes I receive the data already as a CLOB and I don’t want to require conversion to JSON before passing it to my formatting function.
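Usage is then straightforward; for example (utility_pkg is just a placeholder name for the package containing format_json):
declare
    l_json clob;
begin
    select json_object('name' : 'Jane', 'tags' : json_array('a','b','c') returning clob)
    into   l_json
    from   dual;
    -- send the pretty-printed JSON to the debug log
    dbms_output.put_line(utility_pkg.format_json(l_json));
end;
/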
A good question – how to load fairly largish GeoJSON documents into a Google Map in APEX?
To investigate this I started by downloading a source of GeoJSON data for test purposes – one containing the borders of countries around the world: https://datahub.io/core/geo-countries. This file is 23.5MB in size and contains a JSON array of features, like this:
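{"type": "FeatureCollection",
 "features": [
   {"type": "Feature",
    "properties": {"ADMIN": "Australia", "ISO_A3": "AUS"},
    "geometry": {"type": "MultiPolygon", "coordinates": [ ... ]}
   },
   ...
 ]}
(the snippet above is abbreviated: I've shown just one country as an example and omitted the very long coordinate arrays; the ADMIN, ISO_A3 and geometry attributes are the ones I use below)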
(the data does not appear to be very accurate for a lot of countries, but it will do just fine for my purposes)
Uploading the file to the database
To load this file into my database I copied the file to the server and ran this to load the data into a temporary table:
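Something like the following would work (this is only a sketch; the directory object and file name are placeholders, but the table and column names are the ones the later queries expect):
create table import_lob_tmp (the_clob clob);

declare
    -- the directory object and file name are placeholders
    l_bfile   bfile := bfilename('DATA_PUMP_DIR', 'countries.geojson');
    l_clob    clob;
    l_dest    integer := 1;
    l_src     integer := 1;
    l_context integer := dbms_lob.default_lang_ctx;
    l_warning integer;
begin
    -- create the row first, then load the file contents into its CLOB
    insert into import_lob_tmp (the_clob) values (empty_clob())
    returning the_clob into l_clob;
    dbms_lob.open(l_bfile, dbms_lob.lob_readonly);
    dbms_lob.loadclobfromfile(
        dest_lob     => l_clob,
        src_bfile    => l_bfile,
        amount       => dbms_lob.lobmaxsize,
        dest_offset  => l_dest,
        src_offset   => l_src,
        bfile_csid   => dbms_lob.default_csid,
        lang_context => l_context,
        warning      => l_warning);
    dbms_lob.close(l_bfile);
    commit;
end;
/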
Alternatively, I could also have created a temporary APEX application with a File Browse item to upload the file and insert it into the import_lob_tmp table.
Parsing the JSON
I wanted to get the array of features as a table with one row per country, so I used json_table; after a fair bit of muddling around, this is what I ended up with:
create table country_borders as
select j.*
from   import_lob_tmp,
       json_table(the_clob, '$.features[*]'
           columns (
               country  varchar2(255) path '$.properties.ADMIN',
               iso_a3   varchar2(255) path '$.properties.ISO_A3',
               geometry clob format json
           )) j;
alter table country_borders modify country not null;
alter table country_borders modify iso_a3 not null;
alter table country_borders modify geometry not null;
alter table country_borders add
constraint country_border_name_uk unique (country);
alter table country_borders add
constraint geometry_is_json check (geometry is json);
The first JSON path expression allowed me to drill down from the document root ($) to the features node; this is an array so I added [*] to get one row for each entry.
The COLUMNS list then breaks down each entry into the columns I'm interested in. Each entry consists of a type attribute (which I don't need), followed by a more interesting properties node whose attributes are extracted using relative JSON path expressions, followed by the geometry node containing the GeoJSON fragment representing the country borders, which I store "as is" in a CLOB column.
Now if I query this table, it's interesting to see which countries are likely to have the most complex coastlines (at least, as far as the data quality in this file allows):
select country,
       iso_a3,
       dbms_lob.getlength(geometry) geometry_size,
       geometry
from   country_borders
order by 3 desc;
It should be noted that since I've extracted the geometry node from each feature, the contents of the geometry column are not, on their own, valid GeoJSON documents. However, it's easy to construct a valid GeoJSON document by surrounding the geometry with a suitable JSON wrapper, e.g.:
'{"type":"Feature","geometry":' || geometry || '}'
Showing the GeoJSON on a map
The next step is to load this border data onto a map for display. I recently released version 1.1 of my Report Map Google Map plugin, which adds support for loading and manipulating GeoJSON strings, so I started by importing region_type_plugin_com_jk64_report_google_map_r1 into my APEX application.
I created a page with a region using this plugin. I set the map region Static ID to testmap. On the same page I added a text item, P1_GEOJSON, to hold the GeoJSON data; and a Select List item P1_COUNTRY with the following query as its source:
select country
       || ' ('
       || ceil(dbms_lob.getlength(geometry)/1024)
       || 'KB)' as d,
       country
from   country_borders
order by country
I added a dynamic action to the Select List item on the Change event to load the geometry from the table into the map. Initially, I added the following actions:
A Set Value action that sets P1_GEOJSON to the result of the query: select geometry from country_borders where country = :P1_COUNTRY
An Execute JavaScript action that loads the GeoJSON into the map (after first clearing any previously loaded features).
This technique works ok, but only for smallish countries whose border GeoJSON is less than 4K in size. For countries with more border detail than can fit within that limit, the Set Value action query only loads part of the JSON data, resulting in an invalid JSON string that the map fails to load. The Set Value action was therefore unsuitable for my purpose.
To load the entire CLOB data I used another plugin. There are a few CLOB load plugins available for APEX – search the Plugins list at apex.world for “clob”. I chose APEX CLOB Load 2 by Ronny Weiß.
I imported the plugin dynamic_action_plugin_apex_clob_load_2 into my application, then replaced the Set Value action with the action APEX CLOB Load 2 [Plug-In]. I set SQL Source to:
select /* Element type dom - for jQuery selector e.g. body or #region-id,
          item - for item name e.g. P1_MY_ITEM */
       'item' as element_type,
       /* jQuery selector or item name */
       'P1_GEOJSON' as element_selector,
       geometry as clob_value
from   country_borders
where  country = :P1_COUNTRY
I set Items to Submit = P1_COUNTRY and Sanitise HTML = No. I also set Selection Type = Region and selected the map region so that the spinner is shown while the data is loaded.
Result
The plugin works well. The border for any country can be loaded and drawn on the map (for some countries it takes a few extra seconds).
If you would like to see this in action, you may play with it here: https://jk64.dev/apex/f?p=JK64_REPORT_MAP:GEOJSON