
SQL Problem Solved: Merge History Tables

Yesterday I published a SQL problem (one that I had solved – or at least, thought I had solved!) and invited readers to submit their own solutions.

SPOILER ALERT: if you haven’t had a chance to have a go yet, go back and see if you can solve it yourself first (and don’t read the comments yet).

I was very pleased to see two excellent submissions, both correct as far as I could tell, and both (in my opinion) simple, elegant and efficient. Note that this was not a competition and I will not be ranking or rating them. If you need to solve a similar problem, I suggest you try all of these solutions out and pick the one that works best for you.

Solution #1

Stew Ashton supplied the first solution which was very simple and elegant:

with dates as (
  select distinct deptid, dte from (
    select deptid, from_date, to_date+1 to_date from t1
    union all
    select deptid, from_date, to_date+1 to_date from t2
  )
  unpivot(dte for col in(from_date, to_date))
)
select * from (
  select distinct a.deptid, dte from_date,
  lead(dte) over(partition by a.deptid order by dte) - 1 to_date,
  data1, data2 from dates a
  left join t1 b on a.deptid = b.deptid and a.dte between b.from_date and b.to_date
  left join t2 c on a.deptid = c.deptid and a.dte between c.from_date and c.to_date
)
where data1 is not null or data2 is not null
order by 1,2;

SQL Fiddle

Stew’s solution makes a quick pass over the two tables to generate a “date dimension” in memory, then left joins back to each table to produce the result rows. One possible disadvantage is that it requires two passes over the tables, which might be mitigated by suitable indexing. As Matthias Rogel pointed out, this solution is easily generalised to more than two tables, which is a plus.
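
The idea can be sketched in Python (a rough illustration of the technique, not a translation of Stew’s SQL; single deptid, with hypothetical hard-coded data matching Test Case 4 from the problem post):

```python
from datetime import date, timedelta

# Hypothetical hard-coded stand-ins for T1 and T2 (single deptid):
# each row is (from_date, to_date, data).
t1 = [(date(2015, 1, 1), date(2015, 1, 31), "jan data 1")]
t2 = [(date(2015, 1, 10), date(2015, 1, 20), "jan data int 2")]

def merge_histories(t1, t2):
    one = timedelta(days=1)
    # The "date dimension": every from_date, plus every to_date + 1
    # (the unpivot of from_date and to_date + 1 in the SQL).
    bounds = sorted({d for f, t, _ in t1 + t2 for d in (f, t + one)})
    out = []
    for start, nxt in zip(bounds, bounds[1:]):
        end = nxt - one                       # lead(dte) - 1
        d1 = next((x for f, t, x in t1 if f <= start <= t), None)
        d2 = next((x for f, t, x in t2 if f <= start <= t), None)
        if d1 is not None or d2 is not None:  # drop uncovered gaps
            out.append((start, end, d1, d2))
    return out

for row in merge_histories(t1, t2):
    print(row)
```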

Solution #2

The second solution was provided by Kim Berg Hansen; it is a bit longer but, in my opinion, still simple and elegant:

with v1 as (
   select t.deptid
        , case r.l
             when 1 then t.from_date
             when 2 then date '0001-01-01'
             when 3 then t.max_to + 1
          end from_date
        , case r.l
             when 1 then t.to_date
             when 2 then t.min_from -1
             when 3 then date '9999-01-01'
          end to_date
        , case r.l
             when 1 then t.data1
          end data1
     from (
      select deptid
           , from_date
           , to_date
           , data1
           , row_number() over (partition by deptid order by from_date) rn
           , min(from_date) over (partition by deptid) min_from
           , max(to_date) over (partition by deptid) max_to
        from t1
     ) t, lateral( 
         select level l
           from dual
         connect by level <= case t.rn when 1 then 3 else 1 end
     ) r
), v2 as (
   select t.deptid
        , case r.l
             when 1 then t.from_date
             when 2 then date '0001-01-01'
             when 3 then t.max_to + 1
          end from_date
        , case r.l
             when 1 then t.to_date
             when 2 then t.min_from -1
             when 3 then date '9999-01-01'
          end to_date
        , case r.l
             when 1 then t.data2
          end data2
     from (
      select deptid
           , from_date
           , to_date
           , data2
           , row_number() over (partition by deptid order by from_date) rn
           , min(from_date) over (partition by deptid) min_from
           , max(to_date) over (partition by deptid) max_to
        from t2
     ) t, lateral( 
         select level l
           from dual
         connect by level <= case t.rn when 1 then 3 else 1 end
     ) r
)
select v1.deptid
     , greatest(v1.from_date, v2.from_date) from_date
     , least(v1.to_date, v2.to_date) to_date
     , v1.data1
     , v2.data2
  from v1
  join v2
      on v2.deptid = v1.deptid
     and v2.to_date >= v1.from_date
     and v2.from_date <= v1.to_date
 where v2.data2 is not null or v1.data1 is not null
 order by v1.deptid
        , greatest(v1.from_date, v2.from_date)

Since it takes advantage of 12c’s new LATERAL feature, I wasn’t able to test it on my system, but I was able to test it on Oracle’s new Live SQL and it passed with flying colours. I backported it to 11g and it worked just fine on my system as well. An advantage of Kim’s solution is that it should only make one pass over each of the two tables.

Oracle Live SQL: merge-history-kimberghansen (this is the 12c version – Oracle login required)

SQL Fiddle (this is my attempt to backport to 11g – you can judge for yourself if I was faithful to the original)
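
The padding trick at the heart of Kim’s solution can be sketched in Python (again a hypothetical illustration with hard-coded data for a single deptid, not the SQL itself): each table gets extra NULL-data rows covering the dates before and after its real history, after which a plain overlap join produces every output row.

```python
from datetime import date, timedelta

EPOCH, DOOMSDAY = date(1, 1, 1), date(9999, 1, 1)
ONE = timedelta(days=1)

def pad(rows):
    # Mirror the CONNECT BY LEVEL <= 3 trick: alongside the real rows,
    # emit a NULL-data row before the earliest from_date and another
    # after the latest to_date, so every date is covered by some row.
    lo = min(f for f, t, _ in rows)
    hi = max(t for f, t, _ in rows)
    return rows + [(EPOCH, lo - ONE, None), (hi + ONE, DOOMSDAY, None)]

def merge_histories_padded(t1, t2):
    out = []
    for f1, e1, d1 in pad(t1):
        for f2, e2, d2 in pad(t2):
            # Overlap test, then greatest(from)/least(to), as in the
            # final join; rows where both sides are NULL are discarded.
            if f2 <= e1 and f1 <= e2 and not (d1 is None and d2 is None):
                out.append((max(f1, f2), min(e1, e2), d1, d2))
    return sorted(out)

t1 = [(date(2015, 1, 1), date(2015, 1, 31), "jan data 1")]
t2 = [(date(2015, 1, 10), date(2015, 1, 20), "jan data int 2")]
for row in merge_histories_padded(t1, t2):
    print(row)
```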

Solution #3

When my colleague came to me with this problem, I started by creating a test suite including as many test cases as we could think of. Developing and testing the solution against these test cases proved invaluable and allowed a number of ideas to be tested and thrown away. Big thanks go to Kim Berg Hansen for providing an additional test case, which revealed a small bug in my original solution.

with q1 as (
select nvl(t1.deptid, t2.deptid) as deptid
      ,greatest(nvl(t1.from_date, t2.from_date), nvl(t2.from_date, t1.from_date)) as gt_from_date
      ,least(nvl(t1.to_date, t2.to_date), nvl(t2.to_date, t1.to_date))            as lt_to_date
      ,t1.from_date AS t1_from_date
      ,t1.to_date   AS t1_to_date
      ,t2.from_date AS t2_from_date
      ,t2.to_date   AS t2_to_date
      ,min(greatest(t1.from_date, t2.from_date)) over (partition by t1.deptid) - 1 as fr_to_date
      ,max(least(t1.to_date, t2.to_date)) over (partition by t2.deptid)        + 1 as lr_from_date
      ,t1.data1
      ,t2.data2
      ,row_number() over (partition by t1.deptid, t2.deptid order by t1.from_date) as rn_fr1
      ,row_number() over (partition by t1.deptid, t2.deptid order by t2.from_date) as rn_fr2
      ,row_number() over (partition by t1.deptid, t2.deptid order by t1.to_date desc) as rn_lr1
      ,row_number() over (partition by t1.deptid, t2.deptid order by t2.to_date desc) as rn_lr2
from   t1
full outer join t2
on     t1.deptid = t2.deptid
and    (t1.from_date between t2.from_date and t2.to_date
        or t1.to_date between t2.from_date and t2.to_date
        or t2.from_date between t1.from_date and t1.to_date
        or t2.to_date between t1.from_date and t1.to_date)
order by 1, 2
)
select deptid
      ,gt_from_date AS from_date
      ,lt_to_date   AS to_date
      ,data1
      ,data2
from   q1
union all
select deptid
      ,t1_from_date as from_date
      ,fr_to_date AS to_date
      ,data1
      ,'' AS data2
from   q1
where  fr_to_date between t1_from_date and t1_to_date
and    rn_fr1 = 1
union all
select deptid
      ,t2_from_date as from_date
      ,fr_to_date AS to_date
      ,'' AS data1
      ,data2
from   q1
where  fr_to_date between t2_from_date and t2_to_date
and    rn_fr2 = 1
union all
select deptid
      ,lr_from_date as from_date
      ,t2_to_date AS to_date
      ,'' AS data1
      ,data2
from   q1
where  lr_from_date between t2_from_date and t2_to_date
and    rn_lr2 = 1
union all
select deptid
      ,lr_from_date as from_date
      ,t1_to_date AS to_date
      ,data1
      ,'' AS data2
from   q1
where  lr_from_date between t1_from_date and t1_to_date
and    rn_lr1 = 1
order by deptid, from_date

SQL Fiddle

This solution is probably the ugliest of the three, and though it only requires one pass over the two tables, it materializes all the data in a temp table so probably requires a fair amount of memory for large data sets.

I’m so glad I posted this problem – I received two quite different solutions, which show different ways of approaching problems like this. Not only that, but it highlighted the value of test-driven development: instead of groping in the dark with a quick-and-dirty solution (that might or might not work), starting with comprehensive test cases proved invaluable for both developing and validating the candidate solutions.

SQL Problem: Merge record history tables

A colleague asked me if I could help solve a problem in SQL, and I was grateful because I love solving problems in SQL! So we sat down, built a test suite, and worked together to solve it.

I will first present the problem and invite you to have a go at solving it for yourself; tomorrow I will post the solution we came up with. I think our solution is reasonably simple and elegant, but I’m certain there are other solutions as well, some of which may be more elegant or efficient.

I have two “record history” tables, from different data sources, that record the history of changes to two sets of columns describing the same entity. Here are the example tables:

T1 (deptid, from_date, to_date, data1)
T2 (deptid, from_date, to_date, data2)

So T1 records the changes in the column “data1” over time for each deptid; and T2 records the changes in the column “data2” over time for each deptid.

Assumptions/Constraints on T1 and T2:

  • All columns are NOT NULL.
  • All dates are just dates (no time component).
  • All date ranges are valid (i.e. from_date <= to_date).
  • Within each table, the date ranges for a particular deptid are gap-free and have no overlaps.
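
These assumptions can be verified mechanically against any test data; here is a small Python sketch (a hypothetical helper, not part of the problem itself) that raises an error on an invalid, gapped or overlapping history:

```python
from datetime import date, timedelta

ONE = timedelta(days=1)

def check_history(rows):
    # rows: (deptid, from_date, to_date). Raises AssertionError if any
    # range is invalid, or if a deptid's ranges have a gap or an overlap.
    by_dept = {}
    for deptid, f, t in rows:
        assert f <= t, f"invalid range {f}..{t}"
        by_dept.setdefault(deptid, []).append((f, t))
    for deptid, ranges in by_dept.items():
        ranges.sort()
        for (_, t1), (f2, _) in zip(ranges, ranges[1:]):
            assert f2 == t1 + ONE, f"gap or overlap in deptid {deptid}"

# e.g. the T1 rows for deptid 4001 from Test Case 1 pass the check:
check_history([
    (4001, date(2015, 1, 1), date(2015, 1, 31)),
    (4001, date(2015, 2, 1), date(2015, 2, 28)),
])
```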

We want to produce a query that returns the following structure:

T3 (deptid, from_date, to_date, data1, data2)

Where the dates for a particular deptid overlap between T1 and T2, we want to see the corresponding data1 and data2 values. Where the dates do not overlap (e.g. where we have a value for data1 in T1 but not in T2), we want to see the corresponding data1 but NULL for data2; and vice versa.

Below is the setup and test cases you can use if you wish. Alternatively, you can use the SQL Fiddle here.

create table t1
  (deptid number not null
  ,from_date date not null
  ,to_date date not null
  ,data1 varchar2(30) not null
  ,primary key (deptid, from_date)
  );

create table t2
  (deptid number not null
  ,from_date date not null
  ,to_date date not null
  ,data2 varchar2(30) not null
  ,primary key (deptid, from_date)
  );

--Test Case 1 (4001)
--T1  |-------------||-------------|
--T2         |-------------||-------------||-------------|

insert into t1 values (4001, date'2015-01-01', date'2015-01-31', 'jan data 1');
insert into t1 values (4001, date'2015-02-01', date'2015-02-28', 'feb data 1');
insert into t2 values (4001, date'2015-01-15', date'2015-02-14', 'jan-feb data 2');
insert into t2 values (4001, date'2015-02-15', date'2015-03-14', 'feb-mar data 2');
insert into t2 values (4001, date'2015-03-15', date'2015-04-14', 'mar-apr data 2');

--Expected Output:
--DEPTID  FROM_DATE    TO_DATE      DATA1           DATA2
--======  ===========  ===========  ==============  ==============
--4001    01-JAN-2015  14-JAN-2015  jan data 1
--4001    15-JAN-2015  31-JAN-2015  jan data 1      jan-feb data 2
--4001    01-FEB-2015  14-FEB-2015  feb data 1      jan-feb data 2
--4001    15-FEB-2015  28-FEB-2015  feb data 1      feb-mar data 2
--4001    01-MAR-2015  14-MAR-2015                  feb-mar data 2
--4001    15-MAR-2015  14-APR-2015                  mar-apr data 2

--Test Case 2 (4002)
--T1         |-------------||-------------||-------------|
--T2  |-------------||-------------|

insert into t1 values (4002, date'2015-01-15', date'2015-02-14', 'jan-feb data 1');
insert into t1 values (4002, date'2015-02-15', date'2015-03-14', 'feb-mar data 1');
insert into t1 values (4002, date'2015-03-15', date'2015-04-14', 'mar-apr data 1');
insert into t2 values (4002, date'2015-01-01', date'2015-01-31', 'jan data 2');
insert into t2 values (4002, date'2015-02-01', date'2015-02-28', 'feb data 2');

--Expected Output:
--DEPTID  FROM_DATE    TO_DATE      DATA1           DATA2
--======  ===========  ===========  ==============  ==============
--4002    01-JAN-2015  14-JAN-2015                  jan data 2
--4002    15-JAN-2015  31-JAN-2015  jan-feb data 1  jan data 2
--4002    01-FEB-2015  14-FEB-2015  jan-feb data 1  feb data 2
--4002    15-FEB-2015  28-FEB-2015  feb-mar data 1  feb data 2
--4002    01-MAR-2015  14-MAR-2015  feb-mar data 1
--4002    15-MAR-2015  14-APR-2015  mar-apr data 1

--Test Case 3 (4003)
--T1  |-------------|
--T2  |-------------|

insert into t1 values (4003, date'2015-01-01', date'2015-01-31', 'jan data 1');
insert into t2 values (4003, date'2015-01-01', date'2015-01-31', 'jan data 2');

--Expected Output:
--DEPTID  FROM_DATE    TO_DATE      DATA1           DATA2
--======  ===========  ===========  ==============  ==============
--4003    01-JAN-2015  31-JAN-2015  jan data 1      jan data 2

--Test Case 4 (4004)
--T1  |-------------|
--T2      |-----|

insert into t1 values (4004, date'2015-01-01', date'2015-01-31', 'jan data 1');
insert into t2 values (4004, date'2015-01-10', date'2015-01-20', 'jan data int 2');

--Expected Output:
--DEPTID  FROM_DATE    TO_DATE      DATA1           DATA2
--======  ===========  ===========  ==============  ==============
--4004    01-JAN-2015  09-JAN-2015  jan data 1
--4004    10-JAN-2015  20-JAN-2015  jan data 1      jan data int 2
--4004    21-JAN-2015  31-JAN-2015  jan data 1

--Test Case 5 (4005)
--T1      |-----|
--T2  |-------------|

insert into t1 values (4005, date'2015-01-10', date'2015-01-20', 'jan data int 1');
insert into t2 values (4005, date'2015-01-01', date'2015-01-31', 'jan data 2');

--Expected Output:
--DEPTID  FROM_DATE    TO_DATE      DATA1           DATA2
--======  ===========  ===========  ==============  ==============
--4005    01-JAN-2015  09-JAN-2015                  jan data 2
--4005    10-JAN-2015  20-JAN-2015  jan data int 1  jan data 2
--4005    21-JAN-2015  31-JAN-2015                  jan data 2

--Test Case 6 (4006)
--T1  ||------------|
--T2  |-------------|

insert into t1 values (4006, date'2015-01-01', date'2015-01-01', 'jan data a 1');
insert into t1 values (4006, date'2015-01-02', date'2015-01-31', 'jan data b 1');
insert into t2 values (4006, date'2015-01-01', date'2015-01-31', 'jan data 2');

--Expected Output:
--DEPTID  FROM_DATE    TO_DATE      DATA1           DATA2
--======  ===========  ===========  ==============  ==============
--4006    01-JAN-2015  01-JAN-2015  jan data a 1    jan data 2
--4006    02-JAN-2015  31-JAN-2015  jan data b 1    jan data 2

--Test Case 7 (4007)
--T1  |-------------|
--T2  ||------------|

insert into t1 values (4007, date'2015-01-01', date'2015-01-31', 'jan data 1');
insert into t2 values (4007, date'2015-01-01', date'2015-01-01', 'jan data a 2');
insert into t2 values (4007, date'2015-01-02', date'2015-01-31', 'jan data b 2');

--Expected Output:
--DEPTID  FROM_DATE    TO_DATE      DATA1           DATA2
--======  ===========  ===========  ==============  ==============
--4007    01-JAN-2015  01-JAN-2015  jan data 1      jan data a 2
--4007    02-JAN-2015  31-JAN-2015  jan data 1      jan data b 2

--Test Case 8 (4008)
--T1  |-------------|
--T2    |---||---||---|
 
insert into t1 values (4008, date'2015-01-01', date'2015-01-31', 'jan data 1');
insert into t2 values (4008, date'2015-01-09', date'2015-01-17', 'jan data a 2');
insert into t2 values (4008, date'2015-01-18', date'2015-01-26', 'jan data b 2');
insert into t2 values (4008, date'2015-01-27', date'2015-02-04', 'jan-feb data 2');
 
--Expected Output:
--DEPTID  FROM_DATE    TO_DATE      DATA1           DATA2
--======  ===========  ===========  ==============  ==============
--4008    01-JAN-2015  08-JAN-2015  jan data 1
--4008    09-JAN-2015  17-JAN-2015  jan data 1      jan data a 2
--4008    18-JAN-2015  26-JAN-2015  jan data 1      jan data b 2
--4008    27-JAN-2015  31-JAN-2015  jan data 1      jan-feb data 2
--4008    01-FEB-2015  04-FEB-2015                  jan-feb data 2

--Test Case 9 (4009)
--T1    |---||---||---|
--T2  |-------------|
 
insert into t1 values (4009, date'2015-01-09', date'2015-01-17', 'jan data a 1');
insert into t1 values (4009, date'2015-01-18', date'2015-01-26', 'jan data b 1');
insert into t1 values (4009, date'2015-01-27', date'2015-02-04', 'jan-feb data 1');
insert into t2 values (4009, date'2015-01-01', date'2015-01-31', 'jan data 2');
 
--Expected Output:
--DEPTID  FROM_DATE    TO_DATE      DATA1           DATA2
--======  ===========  ===========  ==============  ==============
--4009    01-JAN-2015  08-JAN-2015                  jan data 2
--4009    09-JAN-2015  17-JAN-2015  jan data a 1    jan data 2
--4009    18-JAN-2015  26-JAN-2015  jan data b 1    jan data 2
--4009    27-JAN-2015  31-JAN-2015  jan-feb data 1  jan data 2
--4009    01-FEB-2015  04-FEB-2015  jan-feb data 1

EDIT 15/10/2015: added Kim Berg Hansen’s test cases

UPDATE 15/10/2015: solutions (spoiler alert!)

Static File not updating in runtime Apex environment

The UAT environment is a runtime Apex installation (4.2.4.00.08) and all deployments are done via SQL scripts. My application uses a small number of static files that, for convenience, we are serving from Apex (at least for now); to deploy changes to these static files, I export f100.sql and static_file.sql from Apex in dev, and we run them in the target environment after setting a few variables, like this:

declare
  v_workspace CONSTANT VARCHAR2(100) := 'MYWORKSPACE';
  v_workspace_id NUMBER;
begin
  select workspace_id into v_workspace_id
  from apex_workspaces where workspace = v_workspace;
  apex_application_install.set_workspace_id (v_workspace_id);
  apex_util.set_security_group_id
    (p_security_group_id => apex_application_install.get_workspace_id);
  apex_application_install.set_schema('MYSCHEMA');
  apex_application_install.set_application_id(100);
end;
/

@f100.sql
@static_file.sql

Many months after this application went live, and after multiple deployments in all the environments, we suddenly had an issue where the static files being served from one instance (UAT) were an older version. The logs showed the correct files had been deployed, and re-deploying into DEV seemed to work fine. I got the DBA to temporarily change the schema password in UAT so I could log in and see what was going on.

When I ran this query in DEV, I got the expected two records:

select * from apex_workspace_files
where file_name in ('myapp.css', 'myapp.js');

When I ran it in UAT, I got four records – two copies of each file, and the BLOB contents showed that the older copies were the ones being served to the clients. I have no idea how the extra copies got created in that environment. It must have been due to a failed deployment but the deployment logs didn’t seem to show any errors or anomalies.

Killing the Zombie Static File

I tried editing the static_file.sql script to remove the files (as below), but it only ever removed the new files that were created; re-running it never dropped the old file copies.

...
declare
  l_name    varchar2(255);
begin
  l_name := 'myapp.css';
  wwv_flow_html_api.remove_html(
    p_html_name => l_name,
    p_flow_id   => nvl(wwv_flow.g_flow_id, 0) );
end;
/
...

The next thing I tried was something I picked up from here:

NOTE: run this at your own risk! It is not supported by Oracle.

declare
  v_workspace CONSTANT VARCHAR2(100) := 'MYWORKSPACE';
  v_workspace_id NUMBER;
begin
  -- *** WARNING: DO NOT RUN THIS UNLESS YOU KNOW WHAT YOU ARE DOING ***
  select workspace_id into v_workspace_id
  from apex_workspaces where workspace = v_workspace;
  apex_application_install.set_workspace_id (v_workspace_id);
  apex_util.set_security_group_id
    (p_security_group_id => apex_application_install.get_workspace_id);
  delete from wwv_flow_files where filename like 'myapp.%';
  commit;
end;
/

That seemed to do the trick. Thankfully this problem only occurred in a test environment – I would be uncomfortable running this in Prod.

“Smart quotes” showing as “?” in emails

When some of my users were using my system to send emails, they’d often copy-and-paste their messages from their favourite word processor, but when my system sent the emails they’d have question marks dotted around, e.g.

“Why doesn’t this work?”

would get changed to

?Why doesn?t this work??

The simple fix was to detect and replace those fancy-pants quote characters with the equivalent HTML entities, e.g.:

function enc_chars (m in varchar2) return varchar2 is
begin
  return replace(replace(replace(replace(m
    ,chr(14844060),'&#8220;')/*left double quote*/
    ,chr(14844061),'&#8221;')/*right double quote*/
    ,chr(14844056),'&#8216;')/*left single quote*/
    ,chr(14844057),'&#8217;')/*right single quote*/
    ;
end enc_chars;

P.S. Stupid WordPress keeps mucking around with my code, trying to replace the html entities with the unencoded versions. In case this doesn’t work, here’s an image of what the above code is supposed to look like:
[image: enc_chars]
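
For reference, the same idea sketched in Python (a hypothetical rendition, not the deployed PL/SQL). The entity numbers match the Unicode code points (8220 = U+201C and so on); the chr() values in the PL/SQL version are the numeric values of those characters’ UTF-8 byte sequences in an AL32UTF8 database (e.g. U+201C = bytes E2 80 9C = 14844060):

```python
# Map each typographic quote character to its HTML entity.
ENTITIES = {
    "\u201c": "&#8220;",  # left double quotation mark
    "\u201d": "&#8221;",  # right double quotation mark
    "\u2018": "&#8216;",  # left single quotation mark
    "\u2019": "&#8217;",  # right single quotation mark
}

def enc_chars(m: str) -> str:
    for ch, entity in ENTITIES.items():
        m = m.replace(ch, entity)
    return m

print(enc_chars("\u201cWhy doesn\u2019t this work?\u201d"))
# → &#8220;Why doesn&#8217;t this work?&#8221;
```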

Help for your keyboard users

Apex’s Blue Responsive Theme 25 is a great theme for building a user-friendly website, and unlike many other themes which make the item labels clickable to get help, it renders little question-mark icons to the right of each item that has a help message defined.

[image: theme-25-help-icon]

One issue with this design, however, is that a keyboard-savvy user (such as myself) hates to switch between keyboard and mouse, so they Tab from field to field as they fill in a form. With this theme, those little question-mark icons are in the tab order, so the user has to Tab twice between each field. If they’re not looking at the page they might forget; and if one item doesn’t happen to have a Help icon, they have to remember to Tab only once. All of this adds up to a poor user experience.

To fix this, we tell the browser to take the Help icons out of the tab order. Yet again, jQuery comes to the rescue: we can pick out all the elements with the class itemHelpButton and set their tabindex to “-1”:

$(".itemHelpButton").attr('tabindex',-1);

Put this code in the page attribute “Execute when Page Loads”; it sets the tabindex on all the Help icons on the page, so the user can now tab through the form without interruption.