Use Oracle Flashback Query to restore data fast

Winter has dragged into Spring here in Southern Wisconsin.  We were teased with some 70 degree weather in early March, but since then we’ve been hovering around 40 degrees.  I was expecting the fish to get active a little earlier this year, but now I’m not so sure.

One activity that I enjoy doing with my kids while waiting for fishing to start up is making a trip to Cabela’s.  We are blessed to have a Cabela’s about 10 minutes away from our house, so making a trip down is easy and a good way to enjoy the feeling of the outdoors without actually venturing out into the chilly weather.  For those that don’t know, Cabela’s is a large retailer that specializes in outdoors activities.  They have huge stores stocked with fishing and hunting gear, camping equipment, boats, guns, and outdoor apparel.  They also have some very impressive displays that include mounts of just about every kind of game animal, and three large aquariums that have all kinds of Wisconsin fish.  These fish include trout, sturgeon, walleye, bass, catfish, gar, and several kinds of panfish.  A few years ago they had an old and fairly large, sickly looking muskie, but the last several times we went we couldn’t find it.  I assume it had died.  During our recent trip, we saw two new muskies – small ones, probably about two years old.  The kids love seeing the fish swimming around, and I get to browse through the great fishing gear that I can’t afford.  A trip to Cabela’s is a great way to spend a few hours on a cold, blustery spring day.

Spending a few hours on a data recovery can have big implications for a DBA.  Let’s face it – the expression time is money may have been invented with our profession in mind.  Taking a database offline so we can recover consistent data can cost our companies any amount of money.  Imagine Amazon going offline during Black Friday or Cyber Monday.  The amount of money lost could be measured in the millions.  With these kinds of pressures confronting us on a daily basis, it is important to use every tool in our arsenal to do our job effectively.

I recently discovered that Oracle has a very clever way to recover data quickly without using RMAN.  This trick has been available since version 9i and uses the Undo tablespace to see old versions of data.  As a reminder, the Undo tablespace stores the previous version of data while transactions are in progress.  So if a transaction is performed and then rolled back, the previous version of the data is grabbed from Undo and put back into the active data in the table.  Once the transaction has been either committed or rolled back, that information is no longer needed and will eventually be overwritten.  However, using the UNDO_RETENTION initialization parameter, we can ask Oracle to keep that old information as long as possible so it can be used for other purposes, such as the flashback query I am going to demonstrate.  As a further clarification, flashback query can be used in any edition of Oracle Database, while the similar but different Flashback Table feature is only available in Enterprise Edition.
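In its simplest form, a flashback query is just an AS OF clause on a SELECT.  Here is a minimal sketch; the employees table and the ten-minute interval are hypothetical:

```sql
-- View the table as it existed ten minutes ago (table name illustrative)
SELECT *
FROM   employees AS OF TIMESTAMP (SYSTIMESTAMP - INTERVAL '10' MINUTE);

-- You can also flash back to a specific SCN instead of a timestamp
SELECT *
FROM   employees AS OF SCN 123456;
```

Either form reads the old row versions out of Undo, so it only works as far back as the undo data survives.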

Flashback query is a perfect way to recover accidentally deleted data.  Let's look at an example.  First, I'll show that the Enterprise Edition Flashback Database feature is turned off, so it plays no part in what follows:

[Screenshot: Flashback database is off]

Next I’ll create a table with about 100 rows of dummy data.

[Screenshot: Create table script]
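My setup looked roughly like the following sketch.  The column list and the use of all_objects as a dummy row source are my assumptions, not necessarily what the screenshot showed; the row cap explains why the count lands near, but not exactly at, 100:

```sql
-- Create a test table with up to 100 rows of dummy data pulled from
-- the data dictionary (any convenient row source works here)
CREATE TABLE fbq_test AS
SELECT object_id   AS id,
       object_name AS name
FROM   all_objects
WHERE  ROWNUM <= 100;
```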

This created a table with 98 rows:

[Screenshot: Initial table rowcount]

Now I’m going to delete half of the rows:

[Screenshot: Delete and new rowcount]
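The delete step can be sketched like this.  The predicate is illustrative; any WHERE clause that removes about half the rows works, and the id column comes from my assumed table layout:

```sql
-- Remove roughly half the rows, then commit the damage
DELETE FROM fbq_test
WHERE  MOD(id, 2) = 0;

COMMIT;

-- Confirm the new, smaller rowcount
SELECT COUNT(*) FROM fbq_test;
```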

The next step is to grab a copy of the table from before the delete.  In this case I deleted the data after 1:50 PM, so I’m going to go back to that time.

[Screenshots: Create recovered table]
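The step that builds the recovery table can be sketched as follows.  The recovery table name matches the insert statement later in the post; the exact timestamp and format mask are my assumptions, chosen to match the 1:50 PM cutoff described above:

```sql
-- Pull the pre-delete version of the table out of Undo into a new table
CREATE TABLE fbq_test_recovery AS
SELECT *
FROM   fbq_test AS OF TIMESTAMP
       TO_TIMESTAMP('2016-04-04 13:50:00', 'YYYY-MM-DD HH24:MI:SS');
```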

I now double check and make sure I have all the data I need.  I’ll do this by checking the row count.

[Screenshot: Recovered table rowcount]

The last step is to add the missing rows back into the original table.  This can be done with a simple left join; the id join column here is an assumption, so substitute your table's primary key:

INSERT INTO fbq_test
SELECT r.*
FROM   fbq_test_recovery r
       LEFT OUTER JOIN fbq_test o
       ON r.id = o.id      -- join on the primary key
WHERE  o.id IS NULL;       -- keep only the rows missing from the original

The thing to keep in mind when using this method is to catch the “old” data as soon as possible.  Once the Undo tablespace needs more room, the old versions will be overwritten.  Use this method to recover from accidental deletes and updates without having to bother with RMAN.
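If you want more headroom, you can raise the undo retention target.  The value and the tablespace name below are examples, and keep in mind that retention is only a target, not a promise, unless the undo tablespace is set to RETENTION GUARANTEE:

```sql
-- Ask Oracle to keep committed undo for at least an hour (in seconds)
ALTER SYSTEM SET UNDO_RETENTION = 3600;

-- Optionally enforce the target; note that DML can start failing
-- if the undo tablespace runs out of space (tablespace name assumed)
ALTER TABLESPACE undotbs1 RETENTION GUARANTEE;
```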


Steve Stedman’s Corruption Challenge #3

This Saturday is the fishing opener here in Wisconsin.  The DNR designates some time in spring when the only fishing allowed on most inland waters is panfishing.  This rule was created to protect the fish, since most species spawn in the spring.  I can’t wait to get out there on Saturday and try to catch some opening day fish!  This is my first opener owning my own boat, so I can certainly say I’ve never been looking forward to a fishing opener this much.  Hopefully my next post will have a picture or two of a beautiful 2015 muskie.

SQL Server blogger Steve Stedman recently started a weekly database corruption challenge. The challenges each involve the simulation of some sort of corruption or database failure, and we are tasked with recovering without losing any data. I missed the first two weeks, but decided to spend a bit of time last Saturday morning working on the third challenge. Here is the scenario Steve set up:

Here is the timeline as things occurred

At 2:53pm a full backup was performed. After that full backup, users continued to do work and the database continued to grow.

At 2:54pm a transaction log backup was performed. Followed by more changes to the database.

At 3:01pm another transaction log backup was performed. Followed by more changes to the database.

At 3:12 another transaction log backup was performed. Followed by more changes to the database.

At 3:19 a catastrophic failure occurred causing the operating system running SQL Server to reboot. Let’s blame it on a power surge, or maybe it was just a DBA who accidentally bumped the power cord. I like the power surge idea, so that I don’t get blamed for kicking the power cord.

After the server rebooted, several other databases on this SQL Server recovered just fine, however the CorruptionChallenge3 database would not come on line. After examining the SQL Server data directory, it was discovered that the single .mdf file for this database was gone, missing, just not there. The only thing that remained on this SQL Server was the .ldf file for this database.

[Screenshot: CorruptionChallenge3 database stuck in recovery]

Steve included five files needed to recover all the data – the last full backup, the three transaction log backups that were taken between the last full backup and the disaster event, and the log file.

[Screenshot: Files provided for the challenge]

After thinking about the problem briefly, I formulated a recovery plan. I would restore the full backup and each of the transaction log backups, bringing the database current through the time of the last log backup. Then I would detach the database, swap out the log file with the one provided by Steve, and take another log backup. I could then use this new log backup to bring the database online with all transactions accounted for. The first part went exactly as planned:

USE [master]
RESTORE DATABASE [CorruptionChallenge3]
   FROM DISK = N'\\server\Gugg\CorruptionChallenge3_Full.bak'
   WITH FILE = 1,
        MOVE N'CorruptionChallenge3' TO N'X:\SQLData\CorruptionChallenge3.mdf',
        MOVE N'CorruptionChallenge3_log' TO N'Y:\SQLLogs\CorruptionChallenge3_log.LDF',
        NORECOVERY, NOUNLOAD, STATS = 5;
RESTORE LOG [CorruptionChallenge3]
   FROM DISK = N'\\server\Gugg\TransLog_CorruptionChallenge30.trn'
   WITH FILE = 1, NORECOVERY, NOUNLOAD, STATS = 5;
RESTORE LOG [CorruptionChallenge3]
   FROM DISK = N'\\server\Gugg\TransLog_CorruptionChallenge31.trn'
   WITH FILE = 1, NORECOVERY, NOUNLOAD, STATS = 5;
RESTORE LOG [CorruptionChallenge3]
   FROM DISK = N'\\server\Gugg\TransLog_CorruptionChallenge32.trn'
   WITH FILE = 1, RECOVERY, NOUNLOAD, STATS = 5;
EXEC master.dbo.sp_detach_db @dbname = N'CorruptionChallenge3';

However, when I tried to re-attach the database after replacing the log file, I got the following message:
One or more files do not match the primary file of the database. If you are attempting to attach a database, retry the operation with the correct files. If this is an existing database, the file may be corrupted and should be restored from a backup.
Log file ‘R:\SQLLogs\CorruptionChallenge3_log.LDF’ does not match the primary file. It may be from a different database or the log may have been rebuilt previously.
At this point I remembered some training I’d done regarding Log Sequence Numbers (LSNs). When attaching a database, the LSN in the log file has to match the LSN in the data file. I couldn’t think of what else to do: there was no way to get the data out of the log file without taking a backup of it, and I couldn’t take a backup without first attaching it. I thought about it for another fifteen minutes and then turned my attention to the other tasks I had to get done that Saturday.
When Steve revealed the answer on Tuesday, I was disappointed by how close I had come; I missed one important detail. In order to take a tail-log backup without a valid data file, you need to create a “dummy” version of the database, set it offline, delete its data file, and replace the log file. Then, when you bring it back online, you’ll get an error message, but you will still be able to take a tail-log backup. (See this post by Paul Randal for details.) So I should have done this:

USE [master]
CREATE DATABASE [CorruptionChallenge3] ON
( NAME = 'CorruptionChallenge3', FILENAME = N'S:\SQLData\CorruptionChallenge3.mdf' ),
( NAME = 'CorruptionChallenge3_log', FILENAME = N'R:\SQLLogs\CorruptionChallenge3_log.LDF' );

ALTER DATABASE [CorruptionChallenge3] SET OFFLINE;

--Delete the mdf file and replace the ldf file with the one from the challenge

--Bringing the database back online fails with an error since the mdf
--is gone, but the tail-log backup below still works
ALTER DATABASE [CorruptionChallenge3] SET ONLINE;

[Screenshot: Files inaccessible while the database is offline]

BACKUP LOG CorruptionChallenge3 TO DISK = N'\\server\Gugg\Final_Log_Backup.trn' WITH INIT, NO_TRUNCATE;

The key here is to use the NO_TRUNCATE option. With this I am able to take a log backup, and now have one full backup and four log backups, enough to restore the database with no data loss!
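Putting it all together, the final recovery sequence with the tail-log backup would look something like this sketch, reusing the paths from the earlier scripts:

```sql
USE [master]
RESTORE DATABASE [CorruptionChallenge3]
   FROM DISK = N'\\server\Gugg\CorruptionChallenge3_Full.bak'
   WITH FILE = 1,
        MOVE N'CorruptionChallenge3' TO N'X:\SQLData\CorruptionChallenge3.mdf',
        MOVE N'CorruptionChallenge3_log' TO N'Y:\SQLLogs\CorruptionChallenge3_log.LDF',
        NORECOVERY;
RESTORE LOG [CorruptionChallenge3]
   FROM DISK = N'\\server\Gugg\TransLog_CorruptionChallenge30.trn' WITH NORECOVERY;
RESTORE LOG [CorruptionChallenge3]
   FROM DISK = N'\\server\Gugg\TransLog_CorruptionChallenge31.trn' WITH NORECOVERY;
RESTORE LOG [CorruptionChallenge3]
   FROM DISK = N'\\server\Gugg\TransLog_CorruptionChallenge32.trn' WITH NORECOVERY;
-- The tail-log backup taken with NO_TRUNCATE brings us fully current
RESTORE LOG [CorruptionChallenge3]
   FROM DISK = N'\\server\Gugg\Final_Log_Backup.trn' WITH RECOVERY;
```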

I have a busy weekend coming up since it’s the fishing opener on Saturday, but I hope to be able to work on Corruption Challenge #4, coming up on Friday.