Wednesday, 22 January 2014

T-SQL v PL/SQL

How to Migrate T-SQL to Oracle PL/SQL
Oracle SQL Developer, available for free on OTN, provides a Migrate to Oracle feature.
Migrate to Oracle helps automate the task of migrating a SQL Server or Sybase database, its tables, data and T-SQL objects (Views, Triggers, Functions, Procedures) to a corresponding Oracle database and PL/SQL objects.

T-SQL v PL/SQL
Both are procedural languages which provide standard programming language features like control flow, variables and conditions, and both support embedded DML SQL statements (INSERT, UPDATE, DELETE, SELECT).

Many of the structures found in T-SQL have a one to one mapping to similar structures in PL/SQL.
Sometimes there are differences in syntax and features.

Using SQL Developer's Scratch Editor, T-SQL can be automatically converted to the corresponding Oracle PL/SQL. Typically procedures are migrated using the Migrate to Oracle feature, but the Scratch Editor provides a neat editor in which you can type T-SQL in the left-hand pane and have it automatically converted to Oracle PL/SQL on the right.

One of the obvious differences is that T-SQL variables/parameters are denoted with '@', which makes them distinct from column names. Oracle variables/parameters do not use this syntax.

Another obvious difference is that a T-SQL query result set is passed back implicitly, without the need for an explicit return value. Oracle's normal practice is to return the cursor explicitly through the procedure's parameters.
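As a simple illustration, the comment below shows a made-up T-SQL procedure and then roughly the PL/SQL the conversion produces for it (the names are illustrative and the exact output can differ between SQL Developer releases):

-- Original T-SQL (illustrative):
--   CREATE PROCEDURE get_emps @deptid INT
--   AS
--     SELECT * FROM emp WHERE deptid = @deptid
-- Converted PL/SQL: the @ prefix is dropped and the result set is
-- returned explicitly through an OUT SYS_REFCURSOR parameter.
CREATE OR REPLACE PROCEDURE get_emps(
  v_deptid IN NUMBER,
  cv_1     OUT SYS_REFCURSOR)
AS
BEGIN
  OPEN cv_1 FOR
    SELECT * FROM emp WHERE deptid = v_deptid;
END get_emps;
/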

In Oracle 12c, implicitly returned cursors are now supported through DBMS_SQL.RETURN_RESULT.
This makes migrating procedures a lot simpler as the procedure header does not change and therefore neither do its calls.
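A minimal sketch of the 12c approach, keeping the parameterless header of the original T-SQL procedure (table and procedure names are made up):

CREATE OR REPLACE PROCEDURE get_emps_implicit(v_deptid IN NUMBER)
AS
  cv SYS_REFCURSOR;
BEGIN
  OPEN cv FOR SELECT * FROM emp WHERE deptid = v_deptid;
  -- return the result set to the client implicitly, as T-SQL does
  DBMS_SQL.RETURN_RESULT(cv);
END get_emps_implicit;
/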

Many of the mappings between T-SQL and PL/SQL are handled automatically by Oracle SQL Developer.

Sometimes it's not obvious why one construct is converted to another. The following outlines some of the more common conversions.

Object Naming
SQL Server allows identifiers (table names, column names, ...) to be up to 128 characters. Oracle has a limit of 30 bytes. SQL Developer will truncate names and resolve collisions so that valid object names are referenced in the PL/SQL.

Embedded DDL 
DDL, e.g. CREATE TABLE, can be embedded and then referenced within T-SQL.
DDL cannot be directly embedded in PL/SQL, but there are two choices.
1) Take the DDL out of the procedure and create the object up front as a separate object.
2) Use dynamic SQL (EXECUTE IMMEDIATE) to run the DDL from within the procedure. But then all references to the object have to be within dynamic SQL as well, as in the sketch below.
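A minimal sketch of option 2, using a made-up work table; note that every later reference to the table must also go through dynamic SQL:

CREATE OR REPLACE PROCEDURE build_worktab AS
  v_count NUMBER;
BEGIN
  -- create the table with dynamic SQL (note: DDL commits the transaction)
  EXECUTE IMMEDIATE 'CREATE TABLE worktab (id NUMBER, name VARCHAR2(30))';
  -- all later references must be dynamic as well
  EXECUTE IMMEDIATE 'INSERT INTO worktab (id, name) VALUES (:1, :2)' USING 1, 'abc';
  EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM worktab' INTO v_count;
  DBMS_OUTPUT.PUT_LINE('rows: ' || v_count);
  EXECUTE IMMEDIATE 'DROP TABLE worktab';
END build_worktab;
/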

Temporary Tables tend to be the most common DDL defined within T-SQL.
SQL Developer takes these definitions out of the procedures and converts them into Oracle Global Temporary Tables.
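For example, a T-SQL temporary table such as the one in the comment below would typically become a Global Temporary Table created once, up front (the exact data type mappings and ON COMMIT option depend on the migration preferences):

-- T-SQL (illustrative): CREATE TABLE #results (id INT, total MONEY)
-- Oracle equivalent, created outside the procedure:
CREATE GLOBAL TEMPORARY TABLE tt_results (
  id    NUMBER(10),
  total NUMBER(19,4)
) ON COMMIT PRESERVE ROWS;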

String Concatenation
T-SQL uses the + operator for string concatenation and Oracle uses ||.
When converting from T-SQL to PL/SQL the expression data types need to be known to choose the correct operator in Oracle. SQL Developer can look up the table definition to decide what data type the columns are.
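For example (column names are made up), + on character expressions becomes ||, while + on numeric expressions is left alone:

-- T-SQL (illustrative): SELECT firstname + ' ' + lastname, salary + bonus FROM emp
-- Oracle:
SELECT firstname || ' ' || lastname, salary + bonus FROM emp;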

UPDATE/DELETE FROM
T-SQL allows UPDATE and DELETE statements to have a FROM clause.
Oracle SQL does not allow a FROM clause in an UPDATE statement.
But the same functionality can be achieved using a sub-query or a MERGE statement.
SQL Developer will automatically convert these types of statements to a MERGE statement.
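A sketch of the kind of rewrite involved (table and column names are made up; the generated MERGE may differ):

-- T-SQL (illustrative):
--   UPDATE e SET deptname = d.deptname
--   FROM emp e JOIN dept d ON e.deptid = d.deptid
-- One Oracle equivalent using MERGE:
MERGE INTO emp e
USING dept d
ON (e.deptid = d.deptid)
WHEN MATCHED THEN UPDATE SET e.deptname = d.deptname;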

Empty String
An empty string literal, '', means different things in different databases.
Sybase treats an empty string literal '' as a single space ' '.
SQL Server treats an empty string literal '' as an empty string literal ''.
Oracle treats an empty string literal '' as a NULL.
During the conversion of T-SQL, if an empty string literal is used, it is converted to a single space string literal.
Empty Strings are also converted to single spaces when moving data from SQL Server to Oracle.
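The Oracle behaviour is easy to verify:

-- In Oracle an empty string literal is NULL
SELECT CASE WHEN '' IS NULL THEN 'empty string is NULL' ELSE 'not null' END AS result
FROM dual;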

Dynamic SQL
Dynamic SQL, i.e. SQL defined within strings, is not converted. Dynamic SQL typically isn't a complete statement; instead it tends to be strings built from constants, parameter values, column values and so on. These substrings of a SQL statement are not translated automatically by SQL Developer. Instead they have to be reviewed and manually converted.

SQL Developer can be of some help though.
If you can manually put together an example of the complete T-SQL statement, the Scratch Editor can be used to convert it to PL/SQL.

This helps you manually convert the procedure.

CREATE OR REPLACE PROCEDURE proc1( v_cursor  OUT SYS_REFCURSOR)
AS
BEGIN
  OPEN v_cursor FOR  'select ' || ' * from t1 FETCH FIRST 2 ROWS ONLY';
END;
/


If the dynamic SQL is not directly available to translate, 12c provides a new feature, the SQL Translation Framework, which allows you to translate non-Oracle SQL statements to Oracle on the fly.

It requires installing SQL Developer's translator into the Oracle database and creating a new SQL Translation Profile. The procedure can then use DBMS_SQL to parse and execute the query and have the dynamic SQL translated by the translator on the fly.

create or replace PROCEDURE proc1( v_cursor  OUT SYS_REFCURSOR)
AS
  c integer;
  n number;
BEGIN
  c := dbms_sql.open_cursor;
  -- switch this session to the SQL Server translation profile
  execute immediate 'ALTER SESSION SET SQL_TRANSLATION_PROFILE=SQLSERVER_PROFILE';
  -- parse the T-SQL text; FOREIGN_SYNTAX asks the profile to translate it
  dbms_sql.parse(c, 'select TOP 2 * from t1', dbms_sql.foreign_syntax);
  n := dbms_sql.execute(c);
  -- convert the executed DBMS_SQL cursor into a ref cursor for the caller
  v_cursor := DBMS_SQL.TO_REFCURSOR(c);
END;
/



Monday, 7 October 2013

Inspecting a Temporary Table or Uncommitted Table rows in a Debug Session

Debugging a procedure using SQL Developer is really neat. But I ran into an issue where I wanted to inspect the rows of a temporary table while debugging a stored procedure. I was unable to browse and inspect the rows in the temporary table because the debugger runs in a different session. I had to add some debug code to the procedure to see something useful in the debug Data tab. I ended up with the little procedure below, which you may find useful. Basically it queries the table in question and places the result in a CLOB, which can be viewed in the debug window.

1) Create this procedure in your schema
--COMPILE THIS WITH DEBUG AND STEP INTO IT IF YOU WISH TO SEE RUN TIME VALUES
CREATE OR REPLACE PROCEDURE DEBUGINFO(p_tablename VARCHAR2,p_numrowstoinspect NUMBER DEFAULT 10) AS
  v_count INT;
  v_qryCtx DBMS_XMLGEN.ctxHandle;
  v_result CLOB;
BEGIN
  EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM '||p_tablename INTO v_count;
  DBMS_OUTPUT.PUT_LINE('tablename='||p_tablename||' : rowcount='||v_count);
  --query table rows into XML
  v_qryCtx := DBMS_XMLGEN.newContext('SELECT * FROM ' ||p_tablename || ' WHERE ROWNUM <='||p_numrowstoinspect);
  -- Get the result
  v_result := DBMS_XMLGEN.getXML(v_qryCtx);
  DBMS_OUTPUT.PUT_LINE('--BEGIN DEBUGINFO RESULT FOR '||p_tablename||CHR(10)||CHR(10)||v_result||CHR(10)||'--END DEBUGINFO RESULT FOR '||p_tablename);
END DEBUGINFO;
/
2) Call DEBUGINFO from within the procedure you're testing.
Pass the tablename and number of rows to inspect
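For example, assuming a temporary table called GTT_ORDERS (the name is illustrative), add a call like this at the point in the procedure you want to inspect:

-- inspect the first 5 rows of GTT_ORDERS while debugging
DEBUGINFO('GTT_ORDERS', 5);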



3) Step into DEBUGINFO during debug to view "live" values.
V_COUNT contains the number of rows in the table
V_RESULT contains the table data (for the number of rows to inspect)



4) Review the log window for DBMS_OUTPUT
Result available after the procedure finishes.


Tuesday, 20 August 2013

Offline Capture

Sometimes it's not possible to directly connect over JDBC to a database you want to migrate to Oracle.
Sometimes you want someone else to perform the migration, but would rather just email them the metadata to carry out the migration without having them onsite.
In these cases SQL Developer offers Offline Capture.

Instead of making a connection to your database using a SQL Developer JDBC connection and then extracting the database metadata over the connection, you can run a script provided by SQL Developer which extracts the database metadata to files. These files can be used to perform the capture, making it easy to provide the database metadata to others without a connection.

Script Creation

The Offline Capture scripts can be acquired using SQL Developer.
Tools > Migration > Create Database Capture Scripts ...


Choose where you want to place the scripts, Windows or Linux format, and which platform (SQL Server, MySQL, Sybase, DB2, Teradata).


Running Scripts

Prerequisites 

The scripts use the tools of the non-Oracle database (e.g. BCP for Sybase and SQL Server, db2 for DB2, ...) to export the metadata (table definitions, procedure definitions, ..., not the table data) out to flat dat files on disk. So it is essential that those "client" tools are available when running the scripts.

The scripts also require a highly privileged database owner (e.g. sa for Sybase and SQL Server) so that the metadata tables can be read and their data extracted.

Notes
  • Disk space is usually not an issue as only the metadata of the database is extracted at this stage, not the table data.
  • SH files may need to be made executable before being run

Running Scripts

The scripts have to be run by hand from the command line. The user supplies a privileged database login and the name of the database whose metadata is to be extracted as arguments.

SYBASE 15 Windows Example
Arguments: username password databasename servername

Note that for SQL Server and Sybase two directories will be created:
  • Metadata from the databasename chosen
  • Metadata from the master database
This master database metadata is the same for all databases on this server.
If you want to offline capture databases from different servers, then you should create a directory for each server and copy/run the offline capture scripts from each directory.

Extract One Database
SYB15_OFFLINE_CAPTURE sa sa_password pubs2  servername

The result should be something like this

Extract Multiple Databases From the Same Server
SYB15_OFFLINE_CAPTURE sa sa_password database1  servername
SYB15_OFFLINE_CAPTURE sa sa_password database2  servername

The result should be something like this.

Extract Multiple Databases From Different Servers
From Directory server1
SYB15_OFFLINE_CAPTURE sa sa_password database1  servername1
SYB15_OFFLINE_CAPTURE sa sa_password database2  servername1

The result should be something like this.

From Directory server2
SYB15_OFFLINE_CAPTURE sa sa_password database3  servername2
SYB15_OFFLINE_CAPTURE sa sa_password database4  servername2

The result should be something like this.

The following are the expected files in a database directory and the master directory for Sybase 15.
master DIR

database DIR (ex: pubs2)

Now that the metadata is on disk, the next step is to package it up to send it to the migrator.

Packaging Metadata

The file names, directory names and structure are important. Do not change the names of any files or directories, or move their positions. All the files are required for SQL Developer to perform the offline capture. It is best to zip up the entire directory, including scripts, metadata and subdirectories, without any changes.
When zipping up the files, make sure to keep the directory structure, so that when unzipped the same directory structure is reproduced.

Offline Capture of Files

Unzip the files provided so that they are accessible to SQL Developer.
Create a brand new migration project. Tools > Migration > Migrate to Oracle

Follow the migration wizard. On Step 4 (Source Database):
  • Choose Mode Offline.
  • Choose the *.ocp file. 
Continue through the migration.

Note that SQL Developer requires a new project for each server. So if you are migrating multiple servers, you will have to create separate migration projects and perform the offline capture for each server.

Hopefully this makes offline capture a little easier to follow. I'll put up some examples for different databases soon.

Thursday, 14 March 2013

Multibyte Offline Data Move

Moving data from a non-Oracle database to Oracle can be a bit tricky when different character sets are at play.
For this example I'll move data from a Russian (Cyrillic_General_CI_AS) SQL Server database on Windows to a UTF8 (AL32UTF8) Oracle database on Linux.


SQL Server 2008 Cyrillic_General_CI_AS database
CREATE TABLE multibyte1(col1 VARCHAR(10),col2 CHAR(10));
CREATE TABLE multibyte2(col1 NVARCHAR(10),col2 NCHAR(100));
INSERT INTO multibyte1 VALUES('фис','фис');
INSERT INTO multibyte2 VALUES('фис','фис');
SELECT * FROM multibyte1;
SELECT * FROM multibyte2;




SQL Developer can migrate the two tables to Oracle

CREATE TABLE multibyte1(col1 VARCHAR2(10),col2 CHAR(10));
CREATE TABLE multibyte2(col1 NVARCHAR2(10),col2 NCHAR(100));




SQL Developer will also generate offline data move scripts which can perform the data move using SQL Server's BCP and Oracle SQL*Loader.

There are two issues with the scripts generated by SQL Developer (I'll fix them in the next release).

  • The shell files are not in linux format
  • The NLS_LANG setting is specified incorrectly as NLS_LANGUAGE
The solution is to 
  • run dos2unix on the .sh files
  • change NLS_LANGUAGE to NLS_LANG in the "bottom" oracle_ctl.sh file
Once that's done, the data move is performed in two steps. 

Dump the Data From SQL Server using BCP
The data from SQL Server can be dumped out to dat files using unload_script.sh or unload_script.bat.
The data inside the dat files can be verified as correct.
The dat files will be encoded in Cyrillic_General_CI_AS (Cyrillic Windows-1251).





You can now move all the files to your linux machine to load into Oracle.

Load the Data To Oracle using SQL*Loader
SQL*Loader needs to be able to recognize the characters in the dat files.
In the "bottom" oracle_ctl.sh, export NLS_LANG should be set to RUSSIAN_CIS.CL8MSWIN1251.
SQL*Loader will recognize the characters and insert them correctly.
The Oracle database is set up to save them in UTF8.



Verify the data in Oracle
On Linux, if you open SQL*Plus and query the tables, you may get question marks (?) instead of the Russian characters. This is because Oracle has saved the characters in UTF-8, but the client character set has not yet been set to match.

On Linux, set NLS_LANG to UTF8 and query the tables again.

export NLS_LANG=_.UTF8
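To double-check what was actually stored, independently of the client character set, you can dump the stored bytes (using the example tables above):

-- DUMP with format 1016 shows the stored bytes in hex plus the character set
SELECT col1, DUMP(col1, 1016) FROM multibyte1;
SELECT col1, DUMP(col1, 1016) FROM multibyte2;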


Monday, 16 July 2012

DBMS_OUTPUT manipulation and filtering in SQL*Plus

A small script to filter and manipulate DBMS_OUTPUT lines from PL/SQL in SQL*Plus.

CLEAR SCREEN;
SET LONG 1000000000 ;
SET SERVEROUTPUT ON;

VARIABLE outputclob CLOB ;
VARIABLE maxlines   NUMBER;
--set the max lines
EXECUTE  :maxlines  :=10;

DECLARE
  --array to save the output lines into
  outtab dbms_output.chararr;
  outstr VARCHAR2(255);
BEGIN
  -- initialize the output clob
  dbms_lob.createtemporary (:outputclob, TRUE);
  -- add some lines to the output buffer, added some "special" lines to filter on
  dbms_output.put_line('special 1');
  dbms_output.put_line('nothing special 2');
  dbms_output.put_line('special 3');
  dbms_output.put_line('nothing special 4');
  dbms_output.put_line('special 5');
  dbms_output.put_line('nothing special 6');
  
  --get the output lines into the local array
  dbms_output.get_lines(outtab, :maxlines);--maxlines is changed here if there are fewer rows
  -- note that this string will not be part of the output clob, set serveroutput on will pick it up
  dbms_output.put_line('maxlines after:'||:maxlines);
  
  --loop through the array and build up the output clob
  FOR i IN 1 .. :maxlines LOOP
      outstr := outtab(i);
      -- filter the output
      IF INSTR(outstr,'special') = 1 THEN
        --modify the output
        dbms_lob.writeappend(:outputclob,
                            length(outstr||chr(10)||'my message'||chr(10)),
                                   outstr||chr(10)||'my message'||chr(10)); 
      END IF;                    
  END LOOP;
END;
/
print outputclob;

Tuesday, 20 March 2012

Migrate to Existing Oracle Users

Problem
By default, SQL Developer will migrate a Sybase, SQL Server, ... database to a brand new target user in Oracle. This new user is defined (CREATE USER ...) at the start of the generation script.

For Sybase and SQL Server, we combine the owner name and the database name (e.g. dbo_Northwind) to come up with a new Oracle target user name. This is done to create the same separation of objects. At the moment (SQL Developer 3.1) we define the password to be the same as the user name. It's always best to perform the generation phase "offline" so you can inspect the generation script before it is run.

Generation Script : Notice the CREATE USER ....
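The top of the script looks something like the sketch below (based on the Northwind example; the tablespace clauses and grants shown here are illustrative and the exact statements vary by SQL Developer release):

-- A new target user, password defaulted to the user name, created and connected to
CREATE USER dbo_Northwind IDENTIFIED BY dbo_Northwind
  DEFAULT TABLESPACE USERS TEMPORARY TABLESPACE TEMP;
GRANT CONNECT, RESOURCE, CREATE VIEW TO dbo_Northwind;
CONNECT dbo_Northwind/dbo_Northwind
-- ... CREATE TABLE / CREATE OR REPLACE PROCEDURE statements follow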

In the above example I migrated the SQL Server Northwind database with default preferences.
Default Preferences


Solution
If you wish to migrate to an existing Oracle User then there are two methods
  1. Least Privilege Schema Migration: run the generation script directly in the chosen connection, with no users created
  2. Object Naming: specify what target user name you want

Least Privilege Schema Migration
Tools>Preferences>Migration>Generation Options

Choose Least Privilege Schema Migration.

Once checked, you can reconvert a Captured Database Objects model. As long as there is only one target Oracle user, the generation script will be created without any CREATE USER or CONNECT statements.
The script will then create the migrated objects in the target connection specified.

Note that if the migration project has more than one target user (e.g. dbo_Northwind and user2_Northwind), SQL Developer will not create a least privilege script. It will create a standard script with CREATE USER, CONNECT ... for each user.

Object Naming: Specify what target user name you want
If you have more than one existing target Oracle user and you want the objects to be migrated into them, then you have to explicitly state which Oracle users you want to use. SQL Developer needs to know which user to connect to when creating the objects. SQL Developer also needs to know the target user names when converting Views, Triggers, Procedures and Functions, so that cross-database references are resolved correctly.

Any object name change, including user/schema name change, should be performed in the convert page of the Migration Wizard.

To see the Object Naming tab, you must capture the source database first. Then perform the convert. Otherwise SQL Developer does not know the name of the source objects.

Just choose "Proceed to Summary Page" on the Capture page of the Migration Wizard

You can right click on the Captured Database Objects node in the Migration Projects navigator and choose Convert.

Object Naming Tab : Only visible after the source database has been captured.

You can filter by object type, to show just the Users/Schemas:
SRC_TYPE='MD_SCHEMAS'

Change the New Identifier to your target Oracle user and commit the change.

The generation script will now use the new target name, and all cross-database references will be correct.

Note that you may want to comment out the CREATE USER as it will fail if the user already exists, or you can just ignore the error.

Note that you should review the CONNECT statements to make sure the existing password is used; possibly prompt for it using a substitution variable.
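One way to do that in SQL*Plus (a sketch, using the dbo_Northwind user from the example):

-- prompt for the existing user's password instead of hard-coding it
ACCEPT target_pwd CHAR PROMPT 'Password for dbo_Northwind: ' HIDE
CONNECT dbo_Northwind/&target_pwd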

Note that using the above two methods allows SQL Developer to perform the data move correctly as the target users are identified.




Friday, 27 January 2012

Sybase and SQL Server Image Data Move

SQL Developer can move data online and offline.
Online is really only for small amounts of data (<100 MB).
For larger sets of data with greater performance and much better logging, the Offline Data Move feature should be used.



SQL Developer will create two sets of scripts.
One set dumps out the data to DAT files on disk using the Sybase/SQL Server BCP tool.
The second set reads the DAT files and loads the data into the Oracle tables using Oracle SQL*Loader.
As the scripts are just plain text files they should be inspected and can be modified to suit your needs.

Because SQL Developer knows the names of all the tables and each column's name/datatype, it can do a good job of defining these scripts, so very little has to be modified.

Currently there is an issue moving large Images from Sybase/SQL Server to Oracle using the offline data move scripts SQL Developer generates. But with a little change you can get them to move correctly.

For this little test, the first thing to do is insert an image into Sybase using BCP.
  • Create a table with one column in Sybase of type IMAGE
CREATE TABLE loadimage (col1 IMAGE)
go
  • Make sure the Sybase database can INSERT from BCP (note this database is called loadimage)
use master
go
sp_dboption "loadimage", "select into", true
go
use loadimage
go
checkpoint
go
  • The bcp.fmt needs to have the correct size of the image file specified, in this case 1551333.
10.0
1
1 SYBIMAGE 0 1551333 "" 1 col1
  • From the command line load an image into the Sybase table using Sybase BCP
bcp loadimage..loadimage in image3.jpg -Usa -P -S -f bcp.fmt -T2000000

This should load up the image3.jpg into the Sybase table.
It can be viewed from SQL Developer.
Just browse to the Sybase table, click on the data tab. Double click the (BLOB) item and choose "View As Image".

Now we need to move it to Oracle.
Migrate the Sybase database to Oracle, but choose Offline Data Move in step 8 of the migration wizard.

The following changes need to be performed against the scripts SQL Developer generates for you.

The unload_script.bat needs to be changed as BCP by default will truncate to 32K.
mkdir data
mkdir log
bcp "[loadimage].[dbo].[loadimage]" out "data\dbo_loadimage.loadimage.dat" -T 1552503 -c -t "" -r "" -U%2 -P%3 -S%1


The -T 1552503 denotes the largest Image in that column.
This can be found by querying the Sybase table.
select max(datalength(col1)) from loadimage

Now the DAT file will include all the IMAGE data in HEX format, and it will not be truncated.
Next we have to load it into Oracle and those scripts need a little change as well.
SQL Developer creates scripts which
1) Create a new CLOB column to hold the HEX string
2) Process the CLOB HEX string into a BLOB using a procedure
3) Remove the CLOB column
This is necessary as you cannot load a large HEX string directly into a BLOB column in Oracle.
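For reference, the CLOB-to-BLOB step (step 2 above) boils down to something like the following; this is a minimal sketch, not the exact procedure SQL Developer generates, and the names are illustrative:

CREATE OR REPLACE PROCEDURE hexclob_to_blob(p_clob IN CLOB, p_blob IN OUT BLOB) AS
  c_chunk  CONSTANT PLS_INTEGER := 16000; -- an even number of hex characters per read
  v_offset PLS_INTEGER := 1;
  v_len    PLS_INTEGER := DBMS_LOB.GETLENGTH(p_clob);
  v_raw    RAW(8000);
BEGIN
  DBMS_LOB.CREATETEMPORARY(p_blob, TRUE);
  WHILE v_offset <= v_len LOOP
    -- convert each chunk of hex characters to RAW and append it to the BLOB
    v_raw := HEXTORAW(DBMS_LOB.SUBSTR(p_clob, c_chunk, v_offset));
    DBMS_LOB.WRITEAPPEND(p_blob, UTL_RAW.LENGTH(v_raw), v_raw);
    v_offset := v_offset + c_chunk;
  END LOOP;
END hexclob_to_blob;
/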
This action is performed for you, but there is a default limit to the size of the HEX string SQL*Loader will read, and this needs to be changed.

Open the control/dbo_loadimage.loadimage.ctl file
load data
infile 'data/dbo_loadimage.loadimage.dat'
"str ''"
into table dbo_loadimage.loadimage
fields terminated by ''
trailing nullcols
(
SQLDEVELOPER_CLOB_1 CHAR(20000000)
)

Change the size of the CHAR column. This is the temporary column the scripts will create for you to load in the HEX String.

Once this is done you can run the script to load the image data into the Oracle table.
oracle_ctl.bat system oracle@11GR1

Notice that the rows are loaded fairly fast into the CLOB columns as HEX strings. But it does take some time to process these HEX strings into BLOBs.

Once the load is finished the Images can be inspected in SQL Developer




Wednesday, 12 October 2011

JDeveloper Group By Insight Preference

During Oracle OpenWorld a customer asked how to turn off the automatic generation of the GROUP BY clause when using JDeveloper.

SQL Developer has a preference


But JDeveloper is missing this preference, and it is turned on by default.
This will be rectified in a future release of JDeveloper, but in the meantime there is a workaround.

1) Close JDeveloper.
2) Open JDeveloper's product-preferences.xml file.
This can be found under
c:\Users\\AppData\Roaming\JDeveloper\\o.jdeveloper\
on Windows 7, or in the usual place applications persist preferences on your OS.
3) Add the following after the ide tag.

<hash n="DBConfig">
<value n="AUTOGENERATEGROUPBY" v="false"/>
<hash n="EXTENSION"/>
<value n="INDENT_CACHE_NUMBER" v="2"/>
<value n="INDENT_CACHE_USE_TAB" v="false"/>
</hash>



4) Restart JDeveloper, and Autogenerate GROUP BY should now be turned off.


Friday, 26 August 2011

Very Large Migrations

Most of the time SQL Developer does a good job of migrating a database from SQL Server, Sybase, ... to Oracle. But there are some tricks to help make very large migrations run smoother.

This post talks about large migrations in terms of object numbers, not the amount of data in tables.

If you're migrating 50 databases at once, and each has 1,000 tables, 1,000 views and 1,000 procedures, that is a lot of metadata to churn through, and sometimes we find memory issues, open cursors, etc. can be a problem.

We are always working to reduce the amount of resources required to perform the migrations, but for the moment (SQL Dev 3.0) here are a few tricks which just might make the difference between the migration failing and completing.


Memory
By default SQL Developer starts up with:
  • AddVMOption -XX:MaxPermSize=128M (found in sqldeveloper\sqldeveloper\bin\sqldeveloper.conf)
  • AddVMOption -Xmx640M (found in sqldeveloper\ide\bin\ide.conf)
  • AddVMOption -Xms128M (found in sqldeveloper\ide\bin\ide.conf)
This is usually fine, but if you have more memory, SQL Developer may be able to use it.
When using SQL Developer 32 bit on a 32 bit machine with a 32 bit JDK, the following is OK:
  • AddVMOption -XX:MaxPermSize=128M (found in sqldeveloper\sqldeveloper\bin\sqldeveloper.conf)
  • AddVMOption -Xmx1024M (found in sqldeveloper\ide\bin\ide.conf)
  • AddVMOption -Xms256M (found in sqldeveloper\ide\bin\ide.conf)
When using SQL Developer 64 bit on a 64 bit machine with a 64 bit JDK, the following is OK, as long as you have the memory:
  • AddVMOption -XX:MaxPermSize=512M (found in sqldeveloper\sqldeveloper\bin\sqldeveloper.conf)
  • AddVMOption -Xmx4096M (found in sqldeveloper\ide\bin\ide.conf)
  • AddVMOption -Xms2048M (found in sqldeveloper\ide\bin\ide.conf)
Oracle Database
By default Oracle allows only a small maximum number of open cursors. When migrating large databases SQL Developer does tend to open a lot of cursors against the Oracle database which holds the migration repository. We have worked on minimizing this and will continue to do so, but the workaround at the moment is to set this limit higher in the Oracle database you create your migration repository in.

ALTER SYSTEM SET open_cursors=10000 COMMENT='' SCOPE=BOTH;

Note that if you are migrating multibyte databases (object names in Chinese, Japanese, ...) you should create the migration repository in a UTF-8 Oracle database. Otherwise the object names may not be saved/migrated correctly.

SQL Developer
SQL Developer obviously interacts with the migration repository. For example, to convert a stored procedure it queries the migration repository to retrieve the captured SQL and then inserts the converted SQL back into the repository. Not only that, SQL Developer also queries the migration repository to identify name changes, the data types of columns and so on. These round trips back and forth to the migration repository can slow down a migration.

If you are migrating large numbers of Stored Procedures, Triggers, Views and Functions, it is best to install SQL Developer on the same machine as the Oracle database that contains the migration repository. This will remove the network overhead and speed up the migration.


Migrate Incrementally
SQL Developer's Migration Wizard allows you to define the entire migration up front and then kick it off. This is OK for smaller migrations, but I think it is best to perform most large migrations incrementally.
The migration wizard allows you to perform each step of the migration by itself. This enables you to review each step of the migration before proceeding to the next.

Just click the "Proceed to Summary" checkbox at the bottom left of the wizard and then click Next. The wizard will then only perform the steps you have chosen so far.

When this step is complete you will get a chance to review the result and then kick off the wizard again, jumping straight back in at the correct step.

Not only that, but breaking it into steps also helps with memory/cursor issues.

Redundant Objects
Sometimes a database which is being migrated contains a lot of objects which are redundant or not required to be migrated. If there is a substantial number of these objects, it's best to remove them from the captured model so that they do not consume resources going forward.
Right-click the objects you do not want in the captured model and choose Delete.
Note that this is a model and not connected to your source SQL Server, Sybase, ... database.
So those source objects are not deleted, just the items in your model saved in the migration repository.



Generation Script
Very large databases will have a very large generation script. By default SQL Developer generates one big script which is then run to create all the target objects in Oracle.
But there is a preference (Migration > Generation Options > A File Per Object) to split each object (table, trigger, view, ...) definition into a separate file.
Then a master script calls each file individually to create the objects. This means a much smaller master file.
It is also handy if you want to source control the object definitions.