AUSOUG Perth Conference 2011 Day One

I had a most enjoyable* day today at the Oracle conference at Burswood, Perth.

* This is significant given that the start of the day was quite hectic: I had to drop the car off to get a new alternator (the electrics died on the way home from work yesterday, so this morning I called the RAC out to get the car started and charge the battery up a bit; I managed to drive to within 100m of Ultratune before the car died completely, so I left it in the traffic, walked the rest of the way, and borrowed some kind of black box that was like a CPR machine for cars). After that I went with Rosalie to drop the eldest at her school, so she could then give me a lift to Burswood.

I got to the conference in time to hear Graham Wood from Oracle talk about DB Time-based Oracle Performance Tuning, which was excellent, then Tim Hall on Edition-Based Redefinition which led to some excellent discussions with both Tim and colleagues from my current client.

My talk on a simple ETL method using just SQL seemed to go well and there were some good questions, comments and suggestions. If you missed something it was probably because I talk too fast – if you want to look something up from the talk you can look at the slides here, or have a play with a full SQL*Plus demo []. I also blogged about this technique back in February.

I finished the day with Graham Wood’s second talk, on Oracle Hidden Features – which indeed included info on some Oracle features that I’d either not yet come across, or had forgotten about. Doug Burns blogged about this presentation earlier. The bit about external tables, especially the new 11g feature for running the source file through a preprocessor, was particularly interesting.

I’m looking forward to the second day tomorrow.



  1. Hi Jeff,

    Have you had to deal with CLOBs when doing a transfer? We had a data transfer from a VARCHAR2(4000) column to a CLOB column in another table. The transfer was extremely slow. We ended up having to change the CLOB to VARCHAR2 to do the transfer and then alter the column afterwards. This was dealing with hundreds of thousands of rows.

    • Hi Quinny,

      I haven’t had to deal with any CLOBs in these projects.

      I expect your approach would be fine; an alternative might have been to set the CLOBs to EMPTY_CLOB() in the MERGE statement, then do an ordinary UPDATE to transfer the CLOB data. I don't know whether this would be any faster, though. I'll have to try it out some time 🙂
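
      The alternative I have in mind might be sketched like this (the table and column names are hypothetical, not from either project, and I haven't benchmarked it):

      ```sql
      -- Hypothetical tables: src, with a VARCHAR2(4000) column text_col,
      -- and tgt, with a CLOB column big_text.

      -- Step 1: MERGE the rows across, initialising the CLOB column to
      -- EMPTY_CLOB() so each row gets a LOB locator without moving any data.
      MERGE INTO tgt t
      USING src s
      ON (t.id = s.id)
      WHEN MATCHED THEN
        UPDATE SET t.big_text = EMPTY_CLOB()
      WHEN NOT MATCHED THEN
        INSERT (id, big_text)
        VALUES (s.id, EMPTY_CLOB());

      -- Step 2: transfer the text with an ordinary UPDATE; Oracle converts
      -- VARCHAR2 to CLOB implicitly on assignment.
      UPDATE tgt t
      SET t.big_text = (SELECT s.text_col FROM src s WHERE s.id = t.id)
      WHERE EXISTS (SELECT 1 FROM src s WHERE s.id = t.id);
      ```

      Whether splitting it this way beats altering the column type would depend on the LOB storage settings and row counts, so it's worth testing on a realistic volume first.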

