
OOW2006 - Day 1 (Monday): Worst Practices Day

For me this Monday started with a (Developer) Keynote by Thomas Kurian: The next application platform. Thomas pointed out three main trends: SOA, Information Driven Architecture and Grid Computing Architecture. For the developers (about 1200 in the Grand Ballroom of the Hilton) he mentioned the tools Oracle offers for building applications on the 3-tier architecture (of course: JDev, SOA Suite and SQL Developer). The most important announcement was the availability of the Oracle Developer Depot, where you can easily download and install Java applications to facilitate code reuse and simplify the development process. Of course you can upload your own work to this community. You can even win a meet and greet with Larry (or an HD TV) if your software is selected as "the best".

For the next session I headed over to Moscone for The Future of DB Technology by Andy Mendelsohn. He addressed (among others) these interesting new products / features / options:
      Information Lifecycle Management
      How to match storage to the information lifecycle to minimize costs (put data that you need less often - or with lower performance requirements - on less expensive storage) by using partitioning; a SQL sketch follows after this list. For decision support Oracle offers the Oracle ILM Assistant, a free downloadable program that shows the gains of implementing ILM and the required migration.

      Database Vault
      Audit and manage the use of data by other users (even the DBA) under the motto "Keep your DBA out of the database".

      Online Application Upgrade (or Online Hot Patching)
      To assure 24x7 availability, in Oracle 11 you can upgrade your database while users are using the application. Currently logged-in users continue working in the pre-upgrade version of the application, while new logins use the upgraded application. They even showed an impressive live demo of this feature!

      Database Capture (I think that was the name he used)
      A tool to capture SQL statements from one environment (e.g. Production), run them in another (e.g. Test) and localize the differences in the CBO. You can also capture statements in a pre-upgrade version (e.g. Oracle 10g) and replay them in an upgraded version (e.g. Oracle 11) to spot differences in execution plans - facilitating tuning before and after the upgrade.
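
To make the ILM idea above a bit more concrete, here is a minimal SQL sketch of the partitioning approach; the table, partition and tablespace names are invented for illustration and were not part of the presentation.

-- Older partitions live on a cheaper tablespace, current data on fast storage
CREATE TABLE orders
( order_id   NUMBER
, order_date DATE
, amount     NUMBER
)
PARTITION BY RANGE (order_date)
( PARTITION orders_2004 VALUES LESS THAN (TO_DATE('01-01-2005','DD-MM-YYYY'))
    TABLESPACE cheap_archive_ts   -- rarely used data on low-cost storage
, PARTITION orders_2005 VALUES LESS THAN (TO_DATE('01-01-2006','DD-MM-YYYY'))
    TABLESPACE cheap_archive_ts
, PARTITION orders_2006 VALUES LESS THAN (MAXVALUE)
    TABLESPACE fast_online_ts     -- current data on high-performance storage
);

-- When a partition "ages", move it to the cheaper storage tier:
ALTER TABLE orders MOVE PARTITION orders_2006 TABLESPACE cheap_archive_ts;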

The third session this day was Developing PL/SQL Programs Using Automated Unit Testing by my honourable colleague Andrew Clarke. He used the utPLSQL framework (http://utplsql.sourceforge.net, http://utplsql.oracledeveloper.nl) to facilitate Test Driven Development - an approach that is also strongly supported by Quest's PL/SQL Evangelist Steven Feuerstein.
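
For those who have never seen utPLSQL: a test package in the framework follows a naming convention (ut_setup, ut_teardown and one ut_ procedure per test case) and uses utAssert for the checks. Below is a minimal sketch as I understand the framework - the package under test (emp_api) and the values are hypothetical, so treat it as an illustration rather than code from the session.

CREATE OR REPLACE PACKAGE ut_emp_api IS
   PROCEDURE ut_setup;
   PROCEDURE ut_teardown;
   PROCEDURE ut_get_sal;
END ut_emp_api;
/

CREATE OR REPLACE PACKAGE BODY ut_emp_api IS
   PROCEDURE ut_setup IS
   BEGIN
      -- create a known starting situation for the test
      INSERT INTO emp (empno, ename, sal, deptno)
      VALUES (9999, 'UTTEST', 1000, 10);
   END ut_setup;

   PROCEDURE ut_teardown IS
   BEGIN
      DELETE FROM emp WHERE empno = 9999;
   END ut_teardown;

   PROCEDURE ut_get_sal IS
   BEGIN
      -- compare the outcome of the code under test with the expected value
      utAssert.eq ('Salary of test employee'
                  , emp_api.get_sal (9999)
                  , 1000);
   END ut_get_sal;
END ut_emp_api;
/

-- and run it with something like:
-- exec utplsql.test ('emp_api')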

The fourth session was Database Worst Practices by Thomas Kyte. A very popular session: even though it was sold out, 200 people stood in line to get in! Luckily for those who missed it, Tom repeated this gig on Thursday. In his own special way, with lots of humour, Tom gave a tongue-in-cheek presentation (which is not - yet - available on the OpenWorld Presentation Download site, but it is on asktom!).
The most important "worst practices" were:
      Never ever question authority
      You do not need bind variables
      You don't want to expose end users to errors (exception when others then null - see the code sketch after this list)
      Generic is better
      You don't need a design
      Create as many instances per server as possible
      Reinvent database features
      No need to test
      Only use varchar
      Commit frequently
      No scalability needed, because nothing ever changes
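
Two of these worst practices lend themselves to a small code illustration (the sketch referred to in the list): skipping bind variables and swallowing every error with WHEN OTHERS THEN NULL. The procedures below are invented examples, not Tom's code; the first one demonstrates both anti-patterns, the second a corrected version.

-- Worst practice: literals instead of binds, and errors silently ignored
CREATE OR REPLACE PROCEDURE show_ename_bad (p_empno IN NUMBER) IS
   l_ename emp.ename%TYPE;
BEGIN
   EXECUTE IMMEDIATE
      'select ename from emp where empno = ' || p_empno  -- hard parse per value
      INTO l_ename;
   dbms_output.put_line (l_ename);
EXCEPTION
   WHEN OTHERS THEN NULL;  -- nobody will ever know something went wrong
END;
/

-- Better: bind the value and only handle the errors you expect
CREATE OR REPLACE PROCEDURE show_ename (p_empno IN NUMBER) IS
   l_ename emp.ename%TYPE;
BEGIN
   SELECT ename INTO l_ename FROM emp WHERE empno = p_empno;  -- bind variable
   dbms_output.put_line (l_ename);
EXCEPTION
   WHEN NO_DATA_FOUND THEN
      dbms_output.put_line ('No employee ' || p_empno);
END;
/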

The fifth and last session (who says that visiting OOW isn't hard work!) was called Unleashing the Power of Oracle Streams by Patricia McElroy. I wasn't familiar with Streams (a little with AQ), but was quite impressed with the capabilities of this feature (option?). Streams facilitates an asynchronous information-sharing architecture through the capture, staging and consumption of data. IMHO the functionality is similar to the ESB, but on the database tier instead of the middle tier. Because the processes run close to the data, I expect the throughput of Streams to be much higher (compared to using the ESB).
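
Based on what I picked up in the session, a minimal capture/staging/apply setup could look roughly like the sketch below. I am new to Streams, so consider this an approximation of the DBMS_STREAMS_ADM calls rather than a recipe; all names (queue, table, streams processes, source database) are invented.

BEGIN
   -- Staging: a queue that holds the captured changes (logical change records)
   dbms_streams_adm.set_up_queue (
      queue_table => 'strmadmin.emp_q_tab',
      queue_name  => 'strmadmin.emp_q');

   -- Capture: pick up DML on SCOTT.EMP and stage it in the queue
   dbms_streams_adm.add_table_rules (
      table_name   => 'scott.emp',
      streams_type => 'capture',
      streams_name => 'capture_emp',
      queue_name   => 'strmadmin.emp_q',
      include_dml  => TRUE,
      include_ddl  => FALSE);

   -- Consumption: an apply process (normally on the destination database)
   -- dequeues the changes and applies them
   dbms_streams_adm.add_table_rules (
      table_name      => 'scott.emp',
      streams_type    => 'apply',
      streams_name    => 'apply_emp',
      queue_name      => 'strmadmin.emp_q',
      include_dml     => TRUE,
      include_ddl     => FALSE,
      source_database => 'PROD.EXAMPLE.COM');
END;
/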

After all this hard work the evening was reserved for the OTN Night in the St Francis (just across the street from our hotel). By accident I first visited a Quest party on the 32nd floor of the hotel, with good food, free drinks and a splendid view of SF. When I came down to the 2nd floor the OTN Jeopardy game was still going on, where the contestants excelled in giving wrong (or no) answers to difficult Oracle-related questions (I did not get one right answer...) and everybody was having a good time eating, drinking, talking, dancing and looking at the belly dancers (with snake).
