Oakies Blog Aggregator

11gR2: new algorithm for fast refresh of on-commit materialized views

This post investigates the improvements made in 11gR2 to the fast refresh engine for materialized views (MVs) that are set to be automatically refreshed at commit time. We focus on join-only materialized views in this post, as always with the help of a test case.
As noted in the post of mine [...]

Collection cardinality utility

A utility to provide the CBO with the correct cardinality of small collections used in TABLE() queries (such as variable in-lists). This helps to achieve better execution plans without having to use the undocumented CARDINALITY hint. Supports versions from 10g onwards. November 2009
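
For context, here is a minimal sketch of the variable in-list pattern this utility targets; the collection type name (number_ntt) and the EMPLOYEES table are illustrative assumptions, not part of the utility's API.

-- Hypothetical nested table type for a variable in-list.
CREATE OR REPLACE TYPE number_ntt AS TABLE OF NUMBER;
/

-- Left to itself, the CBO guesses a block-size-based cardinality for the
-- TABLE() expression (roughly 8168 rows with an 8K block size), which can
-- produce a poor plan for a 3-element collection:
SELECT e.*
FROM   employees e
WHERE  e.department_id IN (SELECT column_value
                           FROM   TABLE(number_ntt(10, 20, 30)));

-- The undocumented workaround that the utility aims to replace:
SELECT e.*
FROM   employees e
WHERE  e.department_id IN (SELECT /*+ CARDINALITY(t 3) */ column_value
                           FROM   TABLE(number_ntt(10, 20, 30)) t);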

Collection sorter utility

A package of two functions to sort collections, including support for descending and distinct sorts. Works with versions from 9i Release 2 onwards but can be easily edited to support 8i. November 2009
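
As a rough illustration of the kind of sort such a package performs (the type and function names below are hypothetical, not the utility's actual API), a collection can be sorted by pushing it through a SQL ORDER BY; a descending variant would simply add DESC to the ORDER BY.

CREATE OR REPLACE TYPE varchar2_ntt AS TABLE OF VARCHAR2(4000);
/

-- Hypothetical sketch: sort a collection via SQL, optionally removing duplicates.
CREATE OR REPLACE FUNCTION sort_collection (
   p_collection IN varchar2_ntt,
   p_distinct   IN BOOLEAN DEFAULT FALSE
) RETURN varchar2_ntt IS
   v_sorted varchar2_ntt;
BEGIN
   IF p_distinct THEN
      SELECT DISTINCT column_value BULK COLLECT INTO v_sorted
      FROM   TABLE(p_collection)
      ORDER  BY column_value;
   ELSE
      SELECT column_value BULK COLLECT INTO v_sorted
      FROM   TABLE(p_collection)
      ORDER  BY column_value;
   END IF;
   RETURN v_sorted;
END sort_collection;
/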

RAC system load testing and test plan

Yesterday I was browsing through some docs in Metalink regarding RAC on Windows… and again came across the “RAC Starter Kit and Best Practices” notes…

RAC Assurance Support Team: RAC Starter Kit and Best Practices (Generic) (Doc ID 810394.1)
RAC Assurance Support Team: RAC Starter Kit and Best Practices (Windows) (Doc ID 811271.1)

In the Generic note, they've just recently uploaded a new version of the RAC System Load Testing white paper… among the attached docs you can also find the RAC System Test Plan.

Both of these docs are good stuff… we all know that a critical part of any RAC install is testing the right components in the stack, whether from the perspective of performance, scalability, or reliability…

So these docs will guide you through the tools (yes, you'll have choices), steps, and structure needed to ensure a highly available RAC setup…


Performance Optimization with Global Entry. Or Not?

As I entered the 30-minute "U.S. Citizens" queue for immigration back into the U.S. last week, the helpful "queue manager" handed me a brochure. This is a great place to hand me something to read, because I'm captive for the next 30 minutes as I await my turn with the immigration officer at the Passport Control desk. The brochure said "Roll through Customs faster."

Ok. I'm listening.

Inside the brochure, the first page lays out the main benefits:

  • bypass the passport lines
  • no paper Customs declaration
  • in most major U.S. airports

Well, that's pretty cool. Especially as I'm standing only 5% deep in a queue with a couple hundred people in it. And look, there's a Global Entry kiosk right there with its own special queue, with nobody—nobody!—in it.

If I had this Global Entry thing, I'd have a superpower that would enable me to zap past the couple hundred people in front of me, and get out of the Passport Control queue right now. Fantastic.

So what does this thing cost? It's right there in the brochure:

  1. Apply online at www.globalentry.gov. There is a non-refundable $100 application fee. Membership is valid for five years. That's $20 a year for the queue-bypassing superpower. Not bad. Still listening.
  2. Schedule an in-person interview. Next, I have to book an appointment to meet someone at the airport for a brief interview.
  3. Complete the interview and enrollment. I give my interview, get my photo taken, have my docs verified, and that's it, I'm done.

So, all in all, it doesn't cost too much: a hundred bucks and probably a couple hours one day next month sometime.

What's the benefit of the queue-bypassing superpower? Well, it's clearly going to knock a half-hour off my journey through Passport Control. I immigrate three or four times per year on average, and today's queue is one of the shorter ones I've seen, so that's at least a couple hours per year that I'd save... Wow, that would be spectacular: a couple more hours each year in my family's arms instead of waiting like a lamb at the abattoir to have my passport controlled.

But getting me into my family's arms 30 minutes earlier is not really what happens. The problem is a kind of logic that people I meet get hung up in all the time. When you think about subsystem (or resource) optimization, it looks like your latency savings for the subsystem should go straight to your system's bottom line, but that's often not what happens. That's why I really don't care about subsystem optimization; I care about response time. I could say that a thousand times, but my statement is too abstract to really convey what I mean unless you already know what I mean.

What really happens in the airport story is this: if I had used Global Entry on my recent arrival, it would have saved me only a minute or two. Not half an hour, not even close.

It sounds crazy, doesn't it? How can a service that cuts half an hour off my Passport Control time not get me home at least a half hour earlier?

You'll understand once I show you a sequence diagram of my arrival. Here it is (at right). You can click the image to embiggen it, if you need.

To read this sequence diagram, start at the top. Time flows downward. This sequence diagram shows two competing scenarios. The multicolored bar on the left-hand side represents the timeline of my actual recent arrival at DFW Airport, without using the Global Entry service. The right-hand timeline is what my arrival would have looked like had I been endowed with the Global Entry superpower.

You can see at the very bottom of the timeline on the right that the time I would have saved with Global Entry is minuscule: only a minute or two.

The real problem is easy to see in the diagram: Queue for Baggage Claim is the great equalizer in this system. No matter whether I'm a Global Entrant or not, I'm going to get my baggage when the good people outside with the Day-Glo Orange vests send it up to me. My status in the Global Entry system has absolutely no influence over what time that will occur.

Once I've gotten my baggage, the Global Entry superpower would have again swung into effect, allowing me to pass through the zero-length queue at the Global Entry kiosk instead of waiting behind two families at the Customs queue. And that's the only net benefit I would have received.

Wait: there were only two families in the Customs queue? What about the hundreds of people I was standing behind in the Passport Control queue? Well, many of them were gone already (either they had hand-carry bags only, or their bags had come off earlier than mine). Many others were still awaiting their bags on the Baggage Claim carousel. Because bags trickle out of the baggage claim process, there isn't the huge all-at-once surge of demand at Customs that there is at Passport Control when a plane unloads. So the queues are shorter.

At any rate, there were four queues at Customs, and none of them was longer than three or four families.

So the benefit of Global Entry—in exchange for the $100 and the time spent doing the interview—for me, this day, would have been only the savings of a couple of minutes.

Now, if—if, mind you—I had been able to travel with only carry-on luggage, then Global Entry would have provided me significantly more value. But when I'm returning to the U.S. from abroad, I'm almost never allowed to carry on any bag other than my briefcase. Furthermore, I don't remember ever clearing Passport Control to find my bag waiting for me at Baggage Claim. So the typical benefit to me of enrolling in Global Entry, unfortunately, appears to be only a fraction of the duration required to clear Customs, which in my case is almost always approximately zero.

The problem causing the low value (to me) of the Global Entry program is that the Passport Control resource hides the latency of the Baggage Claim resource. No amount of tuning upon the Passport Control resource will affect the timing of the Baggage In Hand milestone; the time at which that milestone occurs is entirely independent of the Passport Control resource. And that milestone—as long as it occurs after I queue for Baggage Claim—is a direct determinant of when I can exit the airport. (Gantt or PERT chart optimizers would say that Queue for Baggage Claim is on the critical path.)

How could a designer make the airport experience better for the customer? Here are a few ideas:

  • Let me carry on more baggage. This idea would allow me to trot right through Baggage Claim without waiting for my bag. In this environment, the value of Global Entry would be tremendous. Well, nice theory; but allowing more carry-on baggage wouldn't work too well in the aggregate. The overhead bins on my flight were already stuffed to maximum capacity, and we don't need more flight delays induced by passengers who bring more stuff onboard than the cabin can physically accommodate.
  • Improve the latency of the baggage claim process. The sequence diagram shows clearly that this is where the big win is. It's easy to complain about baggage claim, because it's nearly always noticeably slower than we want it to be, and we can't see what's going on down there. Our imaginations inform us that there's all sorts of horrible waste going on.
  • Use latency hiding to mask the pain of the baggage claim process. Put TV sets in the Baggage Claim area, and tune them to something interesting instead of infinite loops of advertising. At CPH, they have a Danish hot dog stand in the baggage claim area. They also have a currency exchange office in there. Excellent latency hiding ideas if you need a snack or some DKK walkin'-around-money.

Latency hiding is a weak substitute for improving the speed of the baggage claim process. The killer app would certainly be to make Baggage Claim faster. Note, however, that just making Baggage Claim a little bit faster wouldn't make the Global Entry program any more valuable. To make Global Entry any more valuable, you'd have to make Baggage Claim fast enough that your bag would be waiting for anyone who cleared the full Passport Control queue.

So, my message today: when you optimize, you must first know your goal. So many people optimize subsystems (resources) that they think are important, but optimizing subsystems is often not a path to optimizing what you really want. At the airport, I really don't give a rip about getting out of the Passport Control queue if it just means I'm going to be dumped earlier into a room where I'll have to wait until a fixed time for my baggage.

Once you know what your real optimization goal is (that's Method R step 1), then the sequence diagram is often all you need to get the breakthrough insight that helps you either (a) solve your problem or (b) understand when there's nothing further that you can really do about it.

4th Dutch Planboard Oracle DBA Symposium

Last Tuesday I presented at, and attended, the 4th Dutch Planboard Oracle DBA Symposium and here are my impressions about this wonderful event. The symposium offered ten presentations, divided into two parallel tracks with each presentation taking approximately one hour. All presentations featured hard-core DBA topics or topics very closely related to DBA work. The [...]

RAC Perf Tuning Seminar in Istanbul, Turkey

For those who attended my 2-day training event in Istanbul, I wish to express my sincere thanks for your participation. For me, or for any speaker, the privilege of having your attention for 2 days, away from your work and family, means a lot. Some folks came all the way from Ankara and had to go back to work the following day. I hope you all got to learn something worth your time.

The scripts are located at http://www.proligence.com/racperf_istanbul/scripts.zip and the userid and password are the ones I gave you in the class.

I also want to reiterate my earlier request to send me your detailed and honest feedback at arup@proligence.com. You may want to write it in Turkish, if that is more convenient. The important thing is to provide the feedback; I can always translate using Google.

Thanks to Hande and Madalina from Oracle University for arranging it. Much appreciated. Now I am off to Estonia.

Comparative Window Functions...

I've been known as a huge fan of Analytic functions (as evidenced by the Rock and Roll linkability!)

And - they could be getting better in the near future. Read this document for a proposal that would allow analytic functions to compare the current row's value against any other row value in a defined window.

I've already supplied them with my feedback (which started with "this is an awesome idea") - and you can too - by posting it here. They'll be checking back to see what you say.

Also, this is being proposed as well:

Another window function extension, not contained in the attached proposal, is the notion of VALUE based windows. Currently, we have ROW based (or physical) and RANGE based (logical) windows. A RANGE window has the limitation that there can only be one sort key in the window ORDER BY. A ROW based window, on the other hand, is agnostic to column values and can be non-deterministic.

The new VALUE based window allows one to include all rows with "n" values before or after the current row's value. For example, VALUE 2 PRECEDING and 3 FOLLOWING would include all rows with 2 values that are prior to current row's value and all rows with 3 values that come after the current row's value in sort order.

ticker txndate volume
orcl 1 10
orcl 2 10 <--------------------------- start of window for (orcl,6,12)
orcl 2 11
orcl 2 11
orcl 3 11
orcl 6 12 <=== assume this is current row
orcl 7 12
orcl 11 11
orcl 11 12
orcl 11 12
orcl 13 11 <------------------------- end of window for (orcl,6,12)

A similar RANGE window would have rows [orcl,6,12] through [orcl,7,12]. A similar ROW window would include rows [orcl,3,11] through [orcl,11,11].
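
For comparison, here is a rough sketch of how those two existing framings are written today against the sample data above (the table name stock_trades is an assumption for illustration):

SELECT ticker, txndate, volume,
       -- logical frame: every row whose txndate falls in [txndate-2, txndate+3]
       MAX(volume) OVER (PARTITION BY ticker ORDER BY txndate
                         RANGE BETWEEN 2 PRECEDING AND 3 FOLLOWING) AS range_max,
       -- physical frame: exactly 2 rows before and 3 rows after the current row
       MAX(volume) OVER (PARTITION BY ticker ORDER BY txndate
                         ROWS BETWEEN 2 PRECEDING AND 3 FOLLOWING) AS rows_max
FROM   stock_trades;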

The VALUE based window would be useful when there are gaps in the dataset. For example, consider a query like "find the intra-day maximum for a stock over the past three trading days". Today, to do this one has to aggregate on trading date and then compute the moving max over the past 3 days, as sketched below.
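
A minimal sketch of that two-step workaround in today's syntax follows; the table and column names (stock_trades, ticker, txndate, price) are assumptions for illustration:

SELECT ticker,
       txndate,
       -- step 2: moving max over the current trading day and the two before it
       MAX(day_max) OVER (PARTITION BY ticker
                          ORDER BY txndate
                          ROWS BETWEEN 2 PRECEDING AND CURRENT ROW) AS max_3_trading_days
FROM  (
        -- step 1: collapse the ticks to one row per ticker per trading day
        SELECT ticker, txndate, MAX(price) AS day_max
        FROM   stock_trades
        GROUP  BY ticker, txndate
      );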

VALUE based window can have multiple keys in ORDER BY.

Thanks in advance for any feedback or ideas you might have on this.

Born Again Classic Metalink

On November 6th MyOracleSupport went into production to replace Metalink. MyOracleSupport is built using Flash technology, which isn't fully accessible to visually impaired people who rely on screen readers. Although Flash can be made accessible, it remains difficult to use in my opinion, and I prefer an HTML interface where available. When logging in to [...]

Data Modeling

Most readers of this blog are probably DBAs, or do DBA work along with development or other duties.
Though my title is DBA, data modeling is something I really like to do.
When first learning Oracle, I cut my teeth on data modeling, and used CASE 5.1 on unix to model a database system. True, CASE 5.0 used an Oracle Forms 3.x based interface, and the GUI modeling was unix only.
That was alright with me, as the Form interface allowed manual changes to be made quite quickly.
And the graphic modeling tool was fairly decent, even on a PC running Hummingbird X Server.
When Designer 2000 came out, it was clearly a more capable tool. Not only did it do everything that CASE 5.1 could do, it could do more. I won't make any silly claim that I was ever able to fully exploit D2K, as it was an end-to-end tool that could do much more than model data and databases.
What it could do with just the databases however was quite good.  Data models could be created, and then a physical database could be generated from the model.
Changes in the physical database could be reverse engineered back into the model, and changes in the model could be forward engineered into the physical database. D2K could truly separate logical and physical models, and allow changes to be migrated back and forth between the two.
There are other high end tools such as Erwin which can no doubt accomplish the same thing, but I have not used them.  
One important differentiation for me between D2K and other tools was that D2K worked with Barker Notation, which is the notation I first learned, and the one I still prefer.  
I should not speak of Designer 2000 in the past tense, I guess, as it is still available from Oracle as part of the Oracle Development Suite, but is now called Oracle Designer. It just hasn't received much attention in the past few years, as I think many people have come to see data modeling as too much overhead.
I've tried several low end tools in the past few years, and while some claim to separate logical and physical models, those that I have tried actually do a rather poor job of it.
All this leads to some new (at least, new to me) developments from, of all places, Microsoft.
Maybe you have heard of Oslo, Microsoft's Data Modeling toolset that has been in development for the past couple years.
If you're just now hearing about it, you will likely be hearing much more. The bit I have read has made me think this will be a very impressive tool.
If you have done data modeling, you have likely used traditional tools that allow you to define entities, drop them on a graphical model, and define relationships.
The tool you used may even have allowed you to create domains that could be used to provide data consistency among the entities.
Oslo is different.  
Oslo incorporates a data definition language called M. M definitions can be translated to T-SQL, which in turn can be used to create the physical aspects of the model. M also allows easy creation of strongly typed data types, which are carried over into the model.
Whether Oslo will allow round trip engineering ala D2K, I don't yet know.
I do however think this is a very innovative approach to modeling data. 
Here are a few Oslo-related links to peruse:
You may be thinking that I have given SQL Developer Data Modeler short shrift.
Along with a lot of other folks, I eagerly anticipated the arrival of SQL Developer Data Modeler.
And along with many others, I was disappointed to learn that this add-on to SQL Developer would set us back a cool $3,000 US per seat. That seems a pretty steep price for a tool that is nowhere near as capable as Oracle Designer, which is included as part of the Oracle Internet Developer Suite. True, at $5,800 the Suite costs nearly double what SQL Developer Data Modeler does, but you get quite a bit more than just Designer with the Suite.
As for the cost of Oslo, it's probably too early to tell.
Some reading suggests that it will be included as part of SQL Server 2008, but that isn't certain yet.
Why all the talk about a SQL Server specific tool?
Because data modeling has been in a rut for quite some time, and Microsoft seems to have broken out of that rut.  It's time for Oracle to take notice and provide better tools for modeling, rather than upholding the status quo.