Moving Data Between Oracle Instances with Data Pump

Last weekend I took our annual trip up north to open my grandpa’s cabin up in Oneida County.  There are various tasks to get it ready for use in the summer, such as raking leaves, cleaning the gutters, and putting in the pier.  When I arrived I found most of the work had already been done by a group of people who had come up to do some deep cleaning a few weeks earlier.  This gave me extra time to fish!  I ended up bringing my five-year-old son with me this time, so much of my fishing time was spent with him.  He’s a pretty good fisherman, but I still spent most of my time helping him rather than fishing myself.  It was a great weekend and he caught a ton of fish, including his first northern pike.  He was incredibly excited, and I was relieved I got it into the net before it bit through the end of the line.  Taking your children fishing when the bite is really on is one of life’s greatest thrills.

[Photo: son’s first northern]

I didn’t get more than a couple hours of muskie fishing in, and I didn’t catch any.  I did have one follow from a small fish, probably around 25″, and I didn’t get a great look, so it may have even been an aggressive northern pike.  However, I did see a different muskie that took my breath away.  It wasn’t following my lure, because it came from a different direction than I had been casting.  It came in slowly and lazily and then gulped some air before slowly swimming off.  I’ve read that muskies sometimes gulp air to aid in digestion, so I assume it had just finished a large meal.  Since it was moving so slowly and came completely out of the water, I got a really good look at it.  It had a large gash on its back right above the dorsal fin that must have been a result of the recent spawn.  But its most distinct feature was its size.  It was unquestionably the largest muskie I have ever seen.  Not having seen many fish that big, it’s hard to give a really good estimate, but it must have been a mid-50″ fish.  I wouldn’t be surprised in the least if it was a full 60″.  You don’t get to see many fish that size, and I felt very blessed to get such a great look at it.

Data Pump is an effective way to move data in and out of an Oracle database instance.  It can easily be used to move data from one instance to another by first exporting the data from instance A and then importing it into instance B.  It can also be an effective means of taking a snapshot of the database for simple backup purposes.  I was recently tasked with recovering data in a particular table in an Oracle database, and I was able to get that data by loading a nightly export.  In my case, the export had been created with the old, deprecated way of importing and exporting data, which uses the imp and exp commands instead of impdp and expdp.  The command used to export the data was:

[Screenshot: 20160607 data pump export]

You can see the command run is exp.  The next part, f21/$my_f21pwd@my_sid, is the login to the database.  The file=$exportfile parameter names the dump file; $exportfile is a variable, defined earlier in the shell script, whose value includes the day of the week.  The script was set up to run nightly and replace the existing export for that day of the week, so you always have a rolling seven days of export “backups”.  The buffer parameter specifies the size, in bytes, of the buffer used to fetch rows; larger buffers generally perform better.  full=y indicates that the entire database will be exported.  consistent=y makes the export consistent to a single point in time, which ensures relationships between tables stay consistent and keeps the exported data usable.  Lastly, the log parameter specifies a file to record what was done.  It is helpful to have a log for each import and export so you can see what was run and review any errors or notices that may arise.
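Putting those parameters together, the export command would have looked something like this (a sketch reconstructed from the description above – the buffer size and the $logfile variable are assumptions):

exp f21/$my_f21pwd@my_sid file=$exportfile buffer=10000000 full=y consistent=y log=$logfile

Both $exportfile and $logfile would be set earlier in the shell script to include the day of the week.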

Next it came time to import a specific table from this export file.  I needed a table called FMSH_IMAGE, and I wanted to put it into my own user’s schema.  From there I could grab only the specific column I wanted, using an UPDATE statement against the table in the real schema.  I ran this:

[Screenshot: 20160607 data pump import]

You can see the command run is imp; you can’t use the newer Data Pump impdp with a file created by the old exp.  The rest is pretty self-explanatory.  I include a username and password to access the database.  I specify the dump file to use, this one from the most recent Thursday, and a log file to collect any messages.  The last three parameters are a little more interesting.  fromuser tells the program which schema was exported, and touser tells it which schema I want the import to go into.  These are important because I didn’t want to simply overwrite the FMSH_IMAGE table in the source schema – that would wipe out any newly imported data.  I only needed to update a single column with the data from the previous day.  Lastly, the tables parameter let me import just the single table I needed.
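Based on that description, the import command would have been along these lines (a sketch; the account name and the file and log names are assumptions):

imp myuser/$my_pwd@my_sid file=$exportfile log=imp_fmsh_image.log fromuser=f21 touser=myuser tables=FMSH_IMAGE

Here myuser stands in for my own account, and fromuser=f21 matches the schema the export was taken from.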

Using this example, you can see Oracle’s import and export programs are very useful for getting data into and out of the database.  Be sure to check out the Oracle documentation on the newer impdp and expdp programs, as these have different parameters and improved functionality.
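For comparison, rough Data Pump equivalents of the two commands above might look like this (a sketch – the directory object and file names are assumptions; flashback_time plays the role the old consistent=y did, and remap_schema replaces fromuser/touser):

expdp f21/$my_f21pwd@my_sid directory=DATA_PUMP_DIR dumpfile=full_thu.dmp logfile=exp_full_thu.log full=y flashback_time=systimestamp

impdp myuser/$my_pwd@my_sid directory=DATA_PUMP_DIR dumpfile=full_thu.dmp logfile=imp_fmsh_image.log remap_schema=f21:myuser tables=FMSH_IMAGE

Note that Data Pump reads and writes files through a directory object on the database server rather than a client-side path.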

Further reading:

Oracle Data Pump Import

Oracle Data Pump Export


Moving Data Between Oracle Instances with a Database Link

We are officially one week away from the open of the general fishing season here in southeast Wisconsin. I am incredibly excited to get back on the water and try to catch some big fish. I haven’t yet decided where I’m headed for the opener, but I had great luck on the east side of Pewaukee Lake last year, so I’m thinking about heading back there.
In the meantime, there are a few opportunities to get fishing in before the general opener. Last weekend I went up to the Wolf River near Fremont to try to catch some walleyes. The entire Winnebago system, which includes the Wolf and Fox rivers and several lakes including Winnebago, Butte des Morts, Partridge, and Poygan, is open all year round. Additionally, the walleye and white bass from the lakes head up and down the rivers to spawn around this time of year, and the fishing can be nothing short of amazing. Unfortunately, we didn’t happen to be there at the right time and ended up catching no fish. It was still cool to try it once, especially since the fishing is legendary, though it does get rather crowded. We fished about a half-mile stretch of the Wolf River just north of Partridge Lake, and we must have been among at least 40 boats. I’m personally not a big fan of crowded fishing, but it wasn’t as bad as I thought it would be.

There are a few ways to move data between different instances of Oracle database. In this post I will demonstrate how to set up a database link and use that to transport data from one environment to the other. The reason I did this recently was to update a small subset of our Test database with data from Prod. We didn’t want to update an entire database, tablespace, or even table. We simply wanted to update a single column on one table in the Test environment with data from Prod. The first thing I did was to create a database link from the Test environment to Production.

CREATE DATABASE LINK myProdDB
  CONNECT TO myUser IDENTIFIED BY myPassword  -- placeholder credentials; the host was elided in the original
  USING '(DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = ...)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = myProdDB)(UR = A)))';

A list of database links can be found by querying the DBA_DB_LINKS view:
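For example, a quick look at each link, its owner, and its target (a minimal sketch):

SELECT owner, db_link, username, host, created
FROM   dba_db_links;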


Next, to physically move the data from the Prod to the Test instance, I create a new table populated with the data I want from Prod:

CREATE TABLE prod_Copied_Table AS
SELECT  *
FROM    mySchema.CopiedTable@myProdDB;

Immediately afterward I drop my database link – I don’t need it anymore and don’t want anyone else to use it.
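The drop itself is a one-liner, using the link name from the connect descriptor above:

DROP DATABASE LINK myProdDB;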


Now I have all the data under my user’s schema in Test. The last step is simply to update data I want changed:

UPDATE SCHEMA.TestTable test
SET test.myDate = (SELECT prod.myDate
                     FROM prod_Copied_Table prod
                    WHERE test.myCOMPANY_ID = prod.myCOMPANY_ID
                      AND test.myVENDOR_ID = prod.myVENDOR_ID
                      AND test.mySKU = prod.mySKU
                      AND test.myVENDOR_PART_ID = prod.myVENDOR_PART_ID)
WHERE test.myDate = TO_DATE('2016-04-29','yyyy-mm-dd');

Now my data migration is complete. I drop the unneeded table from my user and I’m finished.

DROP TABLE prod_Copied_Table;

I hope you have found this example of moving data from one instance to another helpful.