Tuesday, July 14, 2009

Oracle9i: IMP-20 on Import of Table with TIMESTAMP Column that was Created via Database Link

The information in this article applies to:
- Oracle Server - Enterprise Edition - Version: 8.1.7.0 to 9.2.0.6
- Oracle Server - Personal Edition - Version: 8.1.7.0 to 9.2.0.6
- Oracle Server - Standard Edition - Version: 8.1.7.0 to 9.2.0.6
This problem can occur on any platform.


ERRORS
------

IMP-00020: long column too large for column buffer size (7)


SYMPTOMS
--------

In a remote Oracle8i or higher release database, you have a table with
a column of the TIMESTAMP data type:

SQL> connect scott/tiger
SQL> alter session set nls_timestamp_format = 'YYYY-MM-DD HH24:MI:SSXFF';
SQL> create table my_original_tab
(
nr number,
stamp_col timestamp(3)
);
SQL> insert into my_original_tab values (1, '2004-10-20 18:34:31.123456');
SQL> commit;
SQL> select * from my_original_tab;

NR STAMP_COL
---------- --------------------------------
1 2004-10-20 18:34:31.123


In your local Oracle8i or Oracle9i database you have created a database link:

SQL> connect system/manager
SQL> grant create database link to scott;

SQL> connect scott/tiger
SQL> alter session set nls_timestamp_format = 'MON DD, YYYY HH:MI:SSXFF AM';
SQL> create database link orcl.oracle.com
connect to scott identified by tiger using 'orcl';
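
Optionally, before running the CTAS, you can verify that the database link
resolves, for example:

SQL> select sysdate from dual@orcl.oracle.com;

If this query fails (e.g. with ORA-12154), correct the link definition or
the TNS alias before continuing.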


Now you create a new table in this database with the CREATE TABLE AS SELECT
statement (CTAS), accessing the original table via the database link:

SQL> create table my_tab as
select * from my_original_tab@orcl.oracle.com;
SQL> select * from my_tab;

NR STAMP_COL
---------- --------------------------------
1 OCT 20, 2004 06:34:31.123 PM


You export this table:

% exp scott/tiger file=exp_tab.dmp log=exp_t.log tables=my_tab

About to export specified tables via Conventional Path ...
. . exporting table MY_TAB 1 rows exported
Export terminated successfully without warnings.


And you try to import from this export dump file, e.g. into a different user:

% imp system/manager file=exp_tab.dmp log=imp_t.log fromuser=scott touser=hugo

. importing SCOTT's objects into HUGO
. . importing table "MY_TAB"
IMP-00020: long column too large for column buffer size (7)
Import terminated successfully with warnings.


- Increasing the value for the Import BUFFER parameter does not solve the
IMP-20 error.
- Pre-creating the table in the target schema, and running import with
IGNORE=Y does not solve the IMP-20 error.
- Specifying Import parameter COMMIT=Y does not import any row for this table.


CAUSE
-----

If you create a table with the CREATE TABLE ... AS SELECT (CTAS) syntax,
and the original table is located on a remote database that you access
via a database link, and the table has a column of the TIMESTAMP data type,
then the newly created table has an incorrect precision in the data dictionary.
Example:

If you query the view dba_tab_cols for the original table in the
remote database, you see:

SQL> connect system/manager
SQL> select substr (data_type, 1, 15) data_type, data_length, data_precision,
data_scale, column_id, substr (column_name, 1, 15) column_name
from dba_tab_cols
where owner = 'SCOTT' and table_name = 'MY_ORIGINAL_TAB'
order by column_id;

DATA_TYPE DATA_LENGTH DATA_PRECISION DATA_SCALE COLUMN_ID COLUMN_NAME
--------------- ----------- -------------- ---------- ---------- ------------
NUMBER 22 1 NR
TIMESTAMP(3) 11 3 2 STAMP_COL


However, a similar query in the local database (where you ran the CTAS),
gives a different result:

SQL> connect system/manager
SQL> select substr (data_type, 1, 15) data_type, data_length, data_precision,
data_scale, column_id, substr (column_name, 1, 15) column_name
from dba_tab_cols
where owner = 'SCOTT' and table_name = 'MY_TAB'
order by column_id;

DATA_TYPE DATA_LENGTH DATA_PRECISION DATA_SCALE COLUMN_ID COLUMN_NAME
--------------- ----------- -------------- ---------- ---------- ------------
NUMBER 22 1 NR
TIMESTAMP(0) 11 0 2 STAMP_COL


If you export this table, the metadata of the table in the export dump file
contains the incorrect precision. As a result, import fails: it expects a
field of 7 bytes (used to store the value of a timestamp without fractional
seconds), but actually encounters values of 11 bytes (used to store
timestamps with fractional-second precision), and produces the error:

IMP-00020: long column too large for column buffer size (7)
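
You can observe the mismatch at the row level with the DUMP function, which
shows the internal length of the stored value (output abbreviated; the exact
byte values depend on the stored timestamp):

SQL> connect scott/tiger
SQL> select dump(stamp_col) from my_tab;

DUMP(STAMP_COL)
---------------------------------------------------------------
Typ=180 Len=11: ...

The Len=11 shows that the column data is physically stored with fractional
seconds, even though the data dictionary claims TIMESTAMP(0).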


FIX
---

This defect has been fixed in Oracle10g. This means that if the table is
created in Oracle10g via a database link and exported afterwards, the
import of that table with a TIMESTAMP column will not result in an
IMP-20 error.

In Oracle8i and Oracle9i, work around the problem by correcting the precision
of the TIMESTAMP column in the data dictionary. Example:

1. In the local database, determine the correct value for the precision
of the column with the timestamp data type. You can determine the correct
value with a simple DESCRIBE command:

SQL> connect scott/tiger
SQL> desc my_tab

Name Null? Type
----------------------------------------- -------- -------------
NR NUMBER
STAMP_COL TIMESTAMP(3)

2. Modify the column:

SQL> connect scott/tiger
SQL> alter table my_tab modify (stamp_col timestamp(3));

If you receive the error:
ORA-01866: the datetime class is invalid
please refer to:
Note 372638.1 "ORA-1866 on ALTER TABLE MODIFY when trying to change
precision of timestamp column"

3. Check for the correct output (scale = 3 in this case):

SQL> connect system/manager
SQL> select substr (data_type, 1, 15) data_type, data_length, data_precision,
data_scale, column_id, substr (column_name, 1, 15) column_name
from dba_tab_cols
where owner = 'SCOTT' and table_name = 'MY_TAB'
order by column_id;

DATA_TYPE DATA_LENGTH DATA_PRECISION DATA_SCALE COLUMN_ID COLUMN_NAME
--------------- ----------- -------------- ---------- ---------- ------------
NUMBER 22 1 NR
TIMESTAMP(3) 11 3 2 STAMP_COL

4. Afterwards, re-create the export dumpfile:

% exp scott/tiger file=exp_tab.dmp log=exp_t.log tables=my_tab

About to export specified tables via Conventional Path ...
. . exporting table MY_TAB 1 rows exported
Export terminated successfully without warnings.

5. And re-run the import from the new export dump file:

% imp system/manager file=exp_tab.dmp log=imp_t.log fromuser=scott touser=hugo

. importing SCOTT's objects into HUGO
. . importing table "MY_TAB" 1 rows imported
Import terminated successfully without warnings.

REFERENCE
---------
Bug 2417643 "CTAS VIA DBLINK WITH TIMESTAMP COLUMN CHANGES PRECISION TO 0
IN DATA DICTIONARY".
Note 372638.1 "ORA-1866 on ALTER TABLE MODIFY when trying to change
precision of timestamp column"

How to Integrate Java With Oracle Forms 6i and 9i/10g?

PURPOSE
-------

Where to look for support articles, white papers, demo code which discuss /
illustrate how to integrate Java with Oracle Forms 6i and 9i/10g

SCOPE & APPLICATION
-------------------

Target Audience: Developers, Programmers, Consultants

How to Integrate Java With Oracle Forms 6i and 9i/10g?
------------------------------------------------------

This document serves as a pointer to useful support articles, white papers, and
demo code.

Support Articles
-----------------
Useful Support Articles can be found on Metalink

White Papers
-------------
White papers can be found at http://otn.oracle.com


Forms 9i/10g:


Oracle9i Forms in a Java World
( http://otn.oracle.com/products/forms/pdf/forms_in_java_world.pdf )

Oracle9i Forms - How to Debug Pluggable Java Components
( http://otn.oracle.com/products/forms/htdocs/howto_debug_pjc.html )

Oracle9iForms JAR File Signing for JInitiator 1.3
( http://otn.oracle.com/products/forms/pdf/SigningJint13.pdf )

Forms 6i:

Oracle9iAS Forms Services 6i Release 2 - Use Java from Oracle9iAS Forms
Services with the Java Importer
( http://otn.oracle.com/products/forms/pdf/javaimporter.pdf )

Using Java Components in Oracle Forms Applications
( http://otn.oracle.com/products/forms/pdf/269054.pdf )

Demos
-----
Demos can also be found on otn.oracle.com

Forms 9i/10g : http://otn.oracle.com/sample_code/products/forms/content.html

Scroll down and look for section 'Forms Builder Demos'

[Viewlet] Java Importer
[Viewlet] JDeveloper PJC Wizard
[Viewlet] Debugging PJCs

Also see section 'JavaBeans / PJC Samples'
e.g.

[Viewlet] Spell Checking With JSpell 2.0
[Viewlet] File Upload Utility

etc

Forms 6i : http://otn.oracle.com/sample_code/products/forms/6idemos.html

See sections 'Java Beans' and 'Pluggable Java Components'

WebUtil
-------
There is also a new utility called WebUtil which contains some commonly
required forms / java integration functionality (including some functionality
illustrated by the demos above). The production version is available on
http://otn.oracle.com/products/forms
and is fully supported on Forms 10g (9.0.4).

Reference:

Note 237545.1 WebUtil - Forms 10g Desktop Integration 'Out of the Box' on the Web:

KEYWORDS
--------

forms6i forms9i 6i 9i 9iAS 9iDS 10g ias webutil integration integrate integrating
interface java jdk sun J2EE oc4j jdev jdeveloper bean pjc pluggable component
jinitiator applet jar javabean sample example demo code c/s client-server
migrating migrate upgrade upgrading www internet web webforms notes papers doc
documentation info information plug plug-in

Sunday, July 12, 2009

How to make case insensitivity work in Oracle SQL Developer

Applies to:
Oracle SQL Developer - Version: 1.1.2.25 to 1.5.3.57.83
Information in this document applies to any platform.

Goal
----
How to make case insensitivity work in Oracle SQL Developer

Solution
--------

1. Connect to the Scott schema using Oracle SQL Developer.

2. Run the SQL queries below:

SQL> SELECT * FROM emp WHERE ename='james';
No rows selected.
SQL> SELECT * FROM emp WHERE ename='JAMES';
This returns the matching rows from the EMP table.

3. Now set the following session parameters:

ALTER SESSION SET NLS_COMP=ANSI;
ALTER SESSION SET NLS_SORT='BINARY_CI';

4. Repeat step 2. Both queries will now return the matching rows.
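
If you want these settings applied automatically on every connection rather
than per session, one possible approach is an AFTER LOGON trigger (a sketch
only; the trigger name here is made up, and the CREATE TRIGGER privilege is
required):

SQL> create or replace trigger scott_ci_logon
  2  after logon on scott.schema
  3  begin
  4    execute immediate 'alter session set nls_comp = ANSI';
  5    execute immediate 'alter session set nls_sort = BINARY_CI';
  6  end;
  7  /

Note that case-insensitive (BINARY_CI) comparisons may prevent the optimizer
from using ordinary binary indexes on the compared columns.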
Keywords
CONFIGURATION ; SESSION~PARAMETERS ; NLS_COMP=ANSI ; NLS_SORT='BINARY_CI' ;