nVision Performance Tuning: 8. Interval Partitioning and Statistics Maintenance of Tree Selector Tables

This blog post is part of a series that discusses how to get optimal performance from PeopleSoft nVision reporting as used in General Ledger.

The decision to use interval partitioning on the tree selector tables came from the need to have accurate statistics for the optimizer when parsing nVision queries.  It is not possible to introduce hints into nVision SQL, and the dynamic nature of the code means it is not viable to consider any of the forms of database plan stability across the whole application (although it may be possible to use SQL Profiles in limited cases).  Therefore, as far as possible, the optimizer has to choose the best plan on its own.  Without accurate optimizer statistics, I have found that the optimizer will usually not choose to use a Bloom filter.
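
As a rough sketch of the idea (the column list below is simplified, and the partition, owner and statistics-gathering details are purely illustrative rather than the exact DDL from this series), a tree selector table such as PSTREESELECT10 is range partitioned on SELECTOR_NUM with an interval of 1, so that every selector number lands in its own partition and can carry its own statistics:

-- Illustrative sketch only: simplified column list; INTERVAL (1) means each
-- SELECTOR_NUM value maps to its own automatically created partition.
CREATE TABLE pstreeselect10
(selector_num   NUMBER        NOT NULL
,tree_node_num  NUMBER        NOT NULL
,range_from_10  VARCHAR2(10)  NOT NULL
,range_to_10    VARCHAR2(10)  NOT NULL)
PARTITION BY RANGE (selector_num) INTERVAL (1)
(PARTITION selector_lt_2 VALUES LESS THAN (2));

-- Statistics can then be maintained per selector, e.g. on just the partition
-- holding a newly populated selector.  SYSADM and SYS_P1234 are placeholders;
-- the system-generated partition name would be looked up from USER_TAB_PARTITIONS.
BEGIN
  dbms_stats.gather_table_stats(ownname     => 'SYSADM',
                                tabname     => 'PSTREESELECT10',
                                granularity => 'PARTITION',
                                partname    => 'SYS_P1234');
END;
/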

Using Two Temporary Tablespaces in PeopleSoft

A longer version of this posting, with all necessary code and a demonstration test, is available as a technical note on the Go-Faster website.  I am working with two different PeopleSoft customers who have had challenges with the size of the temporary tablespaces.  Critical batch processes have failed because they have exhausted space in the temporary tablespace.
ORA-01652 unable to extend temporary segment...
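
As a sketch of the underlying idea (tablespace name, file path, sizes and the SYSADM user below are placeholders; the technical note describes the complete approach), a second temporary tablespace is created and assigned to the user running the heavy batch processes, so that their sort and hash work areas no longer compete with everything else in the default TEMP tablespace:

-- Illustrative only: names, paths and sizes are placeholders.
CREATE TEMPORARY TABLESPACE pstemp2
  TEMPFILE '/u02/oradata/ORCL/pstemp2_01.dbf'
  SIZE 8G AUTOEXTEND ON NEXT 1G MAXSIZE 32G;

ALTER USER sysadm TEMPORARY TABLESPACE pstemp2;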

Dynamic sampling and set current_schema anomaly

Sometimes when I'm asked to check a particular SQL statement in a production-like environment, I initially only get rather limited access, e.g. a read-only user that has the privileges required to run the SQL statement but doesn't own the objects.

If I'm then supposed to analyse a rather complex SQL statement with dozens of table and/or view references, I like to use the ALTER SESSION SET CURRENT_SCHEMA command, so that any unqualified object references are resolved against the schema set via CURRENT_SCHEMA rather than having to edit the SQL and qualify all the object references.
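
For example (APP_OWNER is just a placeholder for whatever schema actually owns the objects):

-- All unqualified object names in this session now resolve against APP_OWNER,
-- while privileges are still checked against the logged-in user.
ALTER SESSION SET CURRENT_SCHEMA = app_owner;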

This basically works just fine, but I recently came across an interesting anomaly in 10.2.0.4: if the optimizer would use dynamic sampling as part of parsing the statement, it no longer does so once CURRENT_SCHEMA has been altered to a non-default schema.

The following simple test case demonstrates the issue in 10.2.0.4 Win32:

SQL>
SQL> drop user dynsamp_test cascade;
drop user dynsamp_test cascade
*
ERROR at line 1:
ORA-01918: user 'DYNSAMP_TEST' does not exist

SQL>
SQL> create user dynsamp_test identified by dynsamp_test;

User created.

SQL>
SQL> grant create session, alter session to dynsamp_test;

Grant succeeded.

SQL>
SQL> drop table dynsamp_test_table purge;
drop table dynsamp_test_table purge
*
ERROR at line 1:
ORA-00942: table or view does not exist

SQL>
SQL> create table dynsamp_test_table
2 as
3 select
4 *
5 from
6 all_objects
7 where
8 rownum <= 1000;

Table created.

SQL>
SQL> grant select on dynsamp_test_table to dynsamp_test;

Grant succeeded.

SQL>
SQL> -- this is the plan we get as schema owner
SQL> explain plan for
2 select /*+ dynamic_sampling(t 4) */
3 *
4 from
5 dynsamp_test_table
6 where
7 owner = 'NOT_A_VALID_OWNER_SINCE_TOO_LONG';

Explained.

SQL>
SQL> -- It obviously uses dynamic sampling as instructed
SQL> select * from table(dbms_xplan.display);

PLAN_TABLE_OUTPUT
----------------------------------------------------------------------------------------------------------------------------------
Plan hash value: 1573551846

----------------------------------------------------------------------------------------
| Id  | Operation         | Name                | Rows | Bytes | Cost (%CPU)| Time     |
----------------------------------------------------------------------------------------
|   0 | SELECT STATEMENT  |                     |    1 |   128 |     5   (0)| 00:00:01 |
|*  1 |  TABLE ACCESS FULL| DYNSAMP_TEST_TABLE |    1 |   128 |     5   (0)| 00:00:01 |
----------------------------------------------------------------------------------------

Predicate Information (identified by operation id):
---------------------------------------------------

1 - filter("OWNER"='NOT_A_VALID_OWNER_SINCE_TOO_LONG')

Note
-----
- dynamic sampling used for this statement

17 rows selected.

SQL>
SQL> rollback;

Rollback complete.

SQL>
SQL> -- We get the same from a different schema with a qualified reference (or valid synonym)
SQL> connect dynsamp_test/dynsamp_test@orcl
Connected.
SQL>
SQL> alter session set nls_language = 'AMERICAN';

Session altered.

SQL>
SQL> explain plan for
2 select /*+ dynamic_sampling(t 4) */
3 *
4 from
5 cbo_test.dynsamp_test_table
6 where
7 owner = 'NOT_A_VALID_OWNER_SINCE_TOO_LONG';

Explained.

SQL>
SQL> -- Still uses dynamic sampling
SQL> select * from table(dbms_xplan.display);

PLAN_TABLE_OUTPUT
----------------------------------------------------------------------------------------------------------------------------------
Plan hash value: 1573551846

----------------------------------------------------------------------------------------
| Id  | Operation         | Name                | Rows | Bytes | Cost (%CPU)| Time     |
----------------------------------------------------------------------------------------
|   0 | SELECT STATEMENT  |                     |    1 |   128 |     5   (0)| 00:00:01 |
|*  1 |  TABLE ACCESS FULL| DYNSAMP_TEST_TABLE |    1 |   128 |     5   (0)| 00:00:01 |
----------------------------------------------------------------------------------------

Predicate Information (identified by operation id):
---------------------------------------------------

1 - filter("OWNER"='NOT_A_VALID_OWNER_SINCE_TOO_LONG')

Note
-----
- dynamic sampling used for this statement

17 rows selected.

SQL>
SQL> rollback;

Rollback complete.

SQL>
SQL> -- Now switch the current_schema
SQL> alter session set current_schema = cbo_test;

Session altered.

SQL>
SQL> explain plan for
2 select /*+ dynamic_sampling(t 4) */
3 *
4 from
5 dynsamp_test_table
6 where
7 owner = 'NOT_A_VALID_OWNER_SINCE_TOO_LONG';

Explained.

SQL>
SQL> -- No more dynamic sampling!
SQL> select * from table(dbms_xplan.display);

PLAN_TABLE_OUTPUT
----------------------------------------------------------------------------------------------------------------------------------
Plan hash value: 1573551846

----------------------------------------------------------------------------------------
| Id  | Operation         | Name                | Rows | Bytes | Cost (%CPU)| Time     |
----------------------------------------------------------------------------------------
|   0 | SELECT STATEMENT  |                     |   10 |  1280 |     5   (0)| 00:00:01 |
|*  1 |  TABLE ACCESS FULL| DYNSAMP_TEST_TABLE |   10 |  1280 |     5   (0)| 00:00:01 |
----------------------------------------------------------------------------------------

Predicate Information (identified by operation id):
---------------------------------------------------

1 - filter("OWNER"='NOT_A_VALID_OWNER_SINCE_TOO_LONG')

13 rows selected.

SQL>
SQL> rollback;

Rollback complete.

SQL>
SQL> -- Check the 10053 trace file
SQL> -- It will show that the dynamic sampling query fails
SQL> -- and therefore no dynamic sampling will be performed
SQL> -- presumably because the current_schema is not obeyed
SQL> -- by the recursively executed query
SQL> alter session set tracefile_identifier = 'dynamic_sampling_current_schema_10053';

Session altered.

SQL>
SQL> alter session set events '10053 trace name context forever, level 1';

Session altered.

SQL>
SQL> explain plan for
2 select /*+ dynamic_sampling(t 4) */
3 *
4 from
5 dynsamp_test_table
6 where
7 owner = 'NOT_A_VALID_OWNER_SINCE_TOO_LONG';

Explained.

SQL>
SQL> alter session set events '10053 trace name context off';

Session altered.

SQL> -- Check the 10046 trace file
SQL> -- which confirms an ORA-00942 while parsing the dynamic sampling query
SQL> alter session set tracefile_identifier = 'dynamic_sampling_current_schema_10046';

Session altered.

SQL>
SQL> alter session set sql_trace = true;

Session altered.

SQL>
SQL> explain plan for
2 select /*+ dynamic_sampling(t 4) */
3 *
4 from
5 dynsamp_test_table
6 where
7 owner = 'NOT_A_VALID_OWNER_SINCE_TOO_LONG';

Explained.

SQL>
SQL> alter session set sql_trace = false;

Session altered.

SQL>
SQL> rollback;

Rollback complete.

SQL>
SQL> spool off

So as you can see, the simple query uses dynamic sampling as instructed (and in 10.2 it would use dynamic sampling by default anyway, since the table has no statistics), but when using the CURRENT_SCHEMA trick together with an unqualified reference, dynamic sampling is not performed and the estimate is based on hardcoded defaults.

Checking the 10053 optimizer trace file confirms that the dynamic sampling is attempted, but the recursive query fails, and the 10046 SQL trace shows that an "ORA-00942 table or view does not exist" error is encountered while parsing the recursive dynamic sampling query.
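
If you want to check this yourself on 10.2, the trace files are written to the directory pointed to by USER_DUMP_DEST (assuming you are allowed to query V$PARAMETER), and the TRACEFILE_IDENTIFIER set above makes them easy to find:

-- locate the trace directory for this instance
SELECT value FROM v$parameter WHERE name = 'user_dump_dest';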

So be careful when using this (rather unusual, I have to admit) setup: the execution plans might be different if you're using a non-default CURRENT_SCHEMA and dynamic sampling is involved.

The issue seems to be fixed in 11.1.0.7 (dynamic sampling is performed as expected when using a non-default CURRENT_SCHEMA); however, I couldn't identify a corresponding bug on Metalink. If anyone comes across a bug number or document ID, that would be helpful.