I’ve added this script to our toolbelt for future upgrades. We have a friendly “System is under maintenance, sorry for any inconvenience” web page that we want to show to users while we run upgrades, even if we’re only making database schema changes.
So I took the script from here and adapted it slightly; here’s our version:
declare
  pragma autonomous_transaction;
  v_workspace    CONSTANT VARCHAR2(100) := 'MYSCHEMA';
  v_workspace_id NUMBER;
begin
  select workspace_id into v_workspace_id
  from apex_workspaces where workspace = v_workspace;
  apex_application_install.set_workspace_id (v_workspace_id);
  apex_util.set_security_group_id
    (p_security_group_id => apex_application_install.get_workspace_id);
  wwv_flow_api.set_flow_status
    (p_flow_id             => 100
    ,p_flow_status         => 'UNAVAILABLE_URL'
    ,p_flow_status_message => 'http://www.example.com/system_unavailable.html'
    );
  commit;
end;
/
It uses an autonomous transaction because we want the system to be unavailable immediately for all users while the deployment is running.
Warning: WWV_FLOW_API is an undocumented package so this is not supported.
The opposite script to make the application available again is:
declare
  pragma autonomous_transaction;
  v_workspace    CONSTANT VARCHAR2(100) := 'MYSCHEMA';
  v_workspace_id NUMBER;
begin
  select workspace_id into v_workspace_id
  from apex_workspaces where workspace = v_workspace;
  apex_application_install.set_workspace_id (v_workspace_id);
  apex_util.set_security_group_id
    (p_security_group_id => apex_application_install.get_workspace_id);
  wwv_flow_api.set_flow_status
    (p_flow_id     => 100
    ,p_flow_status => 'AVAILABLE'
    );
  commit;
end;
/
However, if we run the f100.sql script to deploy a new version of the application, we don’t need to run the “set available” script since the redeployment of the application (which would have been exported in an “available” state already) will effectively make it available straight away.
UPDATE FOR APEX 5.1 and later
The APEX API now includes a documented and supported call for this, APEX_UTIL.SET_APPLICATION_STATUS, which should be used instead in APEX 5.1 and later.
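For reference, here is a minimal sketch of the supported approach, assuming the same application ID (100) and unavailable-message URL as above; check the APEX_UTIL documentation for your version for the exact parameters.

```sql
begin
  -- requires the workspace context to be set first, as in the scripts above
  apex_util.set_application_status (
    p_application_id     => 100,
    p_application_status => 'UNAVAILABLE_URL',
    p_unavailable_value  => 'http://www.example.com/system_unavailable.html');
end;
/

-- and to make the application available again:
begin
  apex_util.set_application_status (
    p_application_id     => 100,
    p_application_status => 'AVAILABLE');
end;
/
```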
You’ve finished the design for an APEX application, and the manager asks you “when will you have it ready to test?”. You resist the temptation to respond snarkily “how long is a piece of string?” – which, by the way, is often the only appropriate answer if they ask for an estimate before the design work has started.
Since you have a design and a clear idea of exactly what the application will do, you can build a reasonable estimate of development time. The starting point is to break the design down into small, discrete chunks of work, where each chunk is something you can realistically estimate at between half a day and a maximum of three days. Where a chunk involves something you haven’t done before, you can reduce the uncertainty by building a small proof-of-concept application or code snippet to see how the abstract ideas in the design might work in reality. This is where APEX comes in handy – by the time the design is complete, you’ll have a database schema (you created the schema in a tool like SQL Developer, didn’t you – so you can generate the DDL for it in a jiffy), and you can simply point APEX at it and use the New Application wizard to create a simple application.
I plan all my projects using Trello, recording estimates using the Scrum for Trello extension. I don’t know much about the “Scrum” method but this extension allows me to assign “points” to each task (blue numbers on each card, with a total at the top of each list). I used to use a 1 point = 1 day (8 hours) convention, but I was finding most of the tasks were more fine grained (I was assigning 0.5 and 0.25 points to most tasks) so I’ve now switched to a convention of 1 point = 1 hour (more or less). In other words, I’d report my estimates with the assumption that my “velocity” would be 8 points per day.
(note: the blue numbers are the Scrum for Trello points, and the orange numbers are simply the number of cards in the list)
My points system looks roughly like this:
- Simple report, form or process = 4 points
- Complex report or form or process = 8 points
- Very complex form or process = 24 points
- Simple bug fix / tweak = 1 point
- Complex bug fix / enhancement = 2 to 8 points depending on complexity
In addition to that, I keep a daily diary of work done in Evernote, which has allowed me to do something I’ve wanted to do for a while now: measure my actuals vs. estimates, in order to calibrate my estimates for future projects. I made up a quick spreadsheet covering the development work for two projects, showing the original estimate, the start and finish dates, and the actual development days worked (accurate roughly to the nearest half day, not counting non-development time such as meetings and other project work). This lets me see how my actual development time compares to my original estimates.
SAM Development (16 Jun 2014 to 11 Dec 2014):
- Estimate: 715 points (“18 weeks”)
- Actual: 103.5 days (21 weeks)
- Avg. points per day (“velocity”): 6.9
SAM Support (12 Dec 2014 to 29 Jul 2015):
- Estimate: 460 points (“12 weeks”)
- Actual: 64.5 days (13 weeks)
- Avg. points per day (“velocity”): 7.1
SUM Development (4 Jun 2015 to present):
- Estimate: 238 points (“6 weeks”)
- Actual: 31 days (6 weeks)
- Avg. points per day (“velocity”): 7.7
Details: EstimatesCalibration.xlsx
Since my reported estimates were roughly 8 points = 1 day, it seems I tend to underestimate. It may seem that my estimates are getting better (since my Points per Day is approaching 8), but that trend is more likely a result of SUM involving a lot less uncertainty and risk and being a smaller project overall. SAM was in a new environment and new client, whereas SUM is merely an additional application in the same environment for the same client. I was also able to re-use or adapt a lot of infrastructure-type code and Apex components from SAM in SUM.
The other thing that I can see from the details is that my “velocity” (points per day) is higher in the earlier stages of development, probably because I have plenty of work planned out, and I can work for several days in a row, uninterrupted by meetings and other distractions. In later stages, my attention gets divided by bug fixes, enhancement requests, meetings, doing design work for the next project, and general waiting-for-other-people-to-do-their-jobs.
For my next project I’ll estimate using the same points system as above, but factor in a 7 points-per-day to the estimates that I report.
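To see what that calibration does to a reported estimate, here is the arithmetic using the SUM figures above (illustrative only):

```sql
select round(238 / 8, 1) as days_at_8_ppd  -- 29.8 days: the "6 weeks" I reported
     , round(238 / 7, 1) as days_at_7_ppd  -- 34.0 days: much closer to the 31 days actually worked
from dual;
```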
Do you have a system of your own you use to create estimates and measure their accuracy? If not, why not try this for yourself? Do you keep track of your estimates and progress? If not, I recommend you start 🙂
I have a simple tabular form with a numeric “sort order” column. I want the value of this column to be defaulted automatically, based on the maximum value of the other records on the screen. Unfortunately, the built-in APEX default type for columns in a tabular form can only be based on an item, or a PL/SQL expression or function. I didn’t want to make a database call to get the maximum value, because the user may have added multiple records that have not yet been saved to the database.
I tried basing the default on a hidden page item that I kept updated as values were entered, but it seems the tabular form only reads the item’s value on page load and doesn’t re-examine it when you click the “Add” button. Instead, I had to turn to JavaScript and jQuery to get the job done.
1. Add a class to the tabular form column by setting the Element CSS Classes column attribute (I’ve used “sortorder” as the class name).
2. Add a javascript function to the page that searches for all the “sortorder” items, calculates the maximum value and adds 10, and assigns it to the last sortorder item on the page.
function setNextSortOrder() {
  var highest = 0;
  $(".sortorder").each(function(){
    if (this.value != "") {
      highest = Math.max(highest, parseFloat(this.value));
    }
  });
  $(".sortorder").last().val(highest + 10);
}
3. Modify the javascript for the “Add” button to call the setNextSortOrder function immediately after adding a row.
javascript:{apex.widget.tabular.addRow();setNextSortOrder();}
It’s a bit hackish and might not work correctly in some cases but it’ll do for now.
If you are using the APEX built-in Data Loading feature to allow your users to upload CSV files, you may have encountered this error.
(Note: the error may appear differently in your application as I have built a custom error handling function)
It’s not a particularly useful message, and the logs don’t seem to shed much light on the problem either – reporting only the following:
DATA_LOAD - Final collection is created
...Execute Statement: select 1 from "DEMO"."MY_TABLE" where "RECORD_ID" = :uk_1
Add error onto error stack
...Error data:
......message: Data Loading Failed
......additional_info: ORA-01403: no data found
...
......ora_sqlerrm: ORA-01403: no data found
......error_backtrace:
ORA-06512: at "APEX_040200.WWV_FLOW_DATA_UPLOAD", line 4115
ORA-06512: at "APEX_040200.WWV_FLOW_PROCESS_NATIVE", line 213
ORA-06512: at "APEX_040200.WWV_FLOW_PROCESS_NATIVE", line 262
ORA-06512: at "APEX_040200.WWV_FLOW_PLUGIN", line 1808
ORA-06512: at "APEX_040200.WWV_FLOW_PROCESS", line 453
After some trial and error I tracked down one potential cause of this error, so I thought I’d share it in case it happens again. No doubt I’ll come across it again later, having forgotten the solution, and find this article.
In my case (APEX 4.2.4), the problem was caused by an invalid entry in the Column Name Aliases list of values. I was using a custom List of Values so that alternative names for the columns would be automatically mapped without the user having to select them every time. To do this, I had to edit the List of Values directly to add the alternative names; but I had mistyped one of the Return Values which must map to a real column name on the target table. Whenever I picked this column for an import, I’d get the “Data Loading Failed” error message. Correcting the return value resolved the issue.
In order to stop this happening again, I added the following check to my Apex QA script (this is run whenever the application is deployed):
PROMPT Invalid dataload column mappings (expected: none)
SELECT REPLACE(lt.owner,'#OWNER#',USER) AS owner
,lt.table_name
,le.return_value AS target_column_not_found
,le.list_of_values_name
,le.display_value AS col_alias
,lt.application_id
,lt.application_name
,lt.name AS dataload_definition
FROM apex_appl_load_tables lt
JOIN apex_application_lov_entries le
ON le.lov_id = lt.column_names_lov_id
WHERE NOT EXISTS (
SELECT NULL
FROM all_tab_columns tc
WHERE tc.owner = REPLACE(lt.owner,'#OWNER#',USER)
AND tc.table_name = lt.table_name
AND tc.column_name = le.return_value)
ORDER BY lt.name, le.display_sequence;
If this query returns any rows, there is a broken column mapping that will cause the “Data Loading Failed” error.
I’m a morning person, and my mind is usually sharpest on Monday or Tuesday mornings, so these are the best times for me to work on fiddly javascript stuff. Today was one of those mornings and here are the results, just in case I want to refer back to them later on.
I had many items dotted around an APEX application where the user is allowed to enter “Codes” – values that must be uppercase and contain no spaces or other punctuation characters, with the exception of underscores (_).
To make things easier for the user, I wanted the page to automatically strip these characters out when they exit the field, instead of just giving validation errors (Note: I still included the validations, but the javascript just makes the process a bit smoother for the user doing the data entry).
My APEX application already has a global .js file that is loaded with each page, so all I had to do was add the following code to it:
function cleanCode (c) {
  return c.replace(/[^A-Za-z0-9_]/g,"");
}
$(document).ready(function() {
  //automatically remove non-code characters from
  //items with the "edit_code" class
  $(document).on('change', '.edit_code', function(){
    var i = "#" + $(this).attr("id");
    $(i).val( cleanCode( $(i).val() ) );
  });
});
EDIT: greatly simplified regexp based on the excellent contribution by Jacopo 🙂
EDIT #2: corrected, thanks to Sentinel
Finally, on each “Code” page item, I set the following attribute (or append, if other classes have already been added):
HTML Form Element CSS Classes = edit_code
For code items within a tabular form, I set the following column attribute:
Element CSS Classes = edit_code
The UAT environment is a runtime APEX installation (4.2.4.00.08) and all deployments are done via SQL scripts. My application uses a small number of static files that, for convenience, we are serving from APEX (at least for now). To deploy changes to these static files, I export f100.sql and static_file.sql from APEX in DEV, and we run them in the target environment after setting a few variables, like this:
declare
  v_workspace    CONSTANT VARCHAR2(100) := 'MYWORKSPACE';
  v_workspace_id NUMBER;
begin
  select workspace_id into v_workspace_id
  from apex_workspaces where workspace = v_workspace;
  apex_application_install.set_workspace_id (v_workspace_id);
  apex_util.set_security_group_id
    (p_security_group_id => apex_application_install.get_workspace_id);
  apex_application_install.set_schema('MYSCHEMA');
  apex_application_install.set_application_id(100);
end;
/
@f100.sql
@static_file.sql
Many months after this application went live, and after multiple deployments in all the environments, we suddenly had an issue where the static files being served from one instance (UAT) were an older version. The logs showed the correct files had been deployed, and re-deploying into DEV seemed to work fine. I got the DBA to temporarily change the schema password in UAT so I could login to see what was going on.
When I ran this query in DEV, I got the expected two records:
select * from apex_workspace_files
where file_name in ('myapp.css', 'myapp.js');
When I ran it in UAT, I got four records – two copies of each file, and the BLOB contents showed that the older copies were the ones being served to the clients. I have no idea how the extra copies got created in that environment. It must have been due to a failed deployment but the deployment logs didn’t seem to show any errors or anomalies.
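In future, a quick way to detect this situation (my own check, not part of the standard deployment scripts) is to look for duplicate file names in the same view:

```sql
-- any static file with more than one copy in the workspace is suspect
select file_name, count(*) as copies
from apex_workspace_files
where file_name in ('myapp.css', 'myapp.js')
group by file_name
having count(*) > 1;
```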
Killing the Zombie Static File
I tried editing the static_file.sql script to remove the files (as below), but it only ever removed the newly created file copies; re-running it never dropped the old copies.
...
declare
  l_name varchar2(255);
begin
  l_name := 'myapp.css';
  wwv_flow_html_api.remove_html(
    p_html_name => l_name,
    p_flow_id   => nvl(wwv_flow.g_flow_id, 0) );
end;
/
...
The next thing I tried was something I picked up from here:
NOTE: run this at your own risk! It is not supported by Oracle.
declare
  v_workspace    CONSTANT VARCHAR2(100) := 'MYWORKSPACE';
  v_workspace_id NUMBER;
begin
  -- *** WARNING: DO NOT RUN THIS UNLESS YOU KNOW WHAT YOU ARE DOING ***
  select workspace_id into v_workspace_id
  from apex_workspaces where workspace = v_workspace;
  apex_application_install.set_workspace_id (v_workspace_id);
  apex_util.set_security_group_id
    (p_security_group_id => apex_application_install.get_workspace_id);
  delete from wwv_flow_files where filename like 'myapp.%';
  commit;
end;
/
That seemed to do the trick. Thankfully this problem only occurred in a test environment – I would be uncomfortable running this in Prod.