Upgrade Oracle Database from 11gR2 to 12c

The fishing opener has come and gone.  I was able to get out for the morning on the day of the opener, from about 6 AM to 11:30 AM.  I went with my brother-in-law down to Pewaukee Lake, where we tried to find some early season muskies.  I ended up with two follows, one from a fish that fled as soon as it saw the boat, and another from a beauty of a fish that was very aggressive.  It followed the lure in hot, right on its tail.  It was a bigger fish (somewhere between 40 and 45 inches by my estimation), so instead of doing a figure 8 I went with a big continuous circle next to the boat.  I ended up circling somewhere around 10 times with the fish following closely.  Each time I swung the lure out away from the boat, the fish would cut the turn and make like it was going to t-bone the lure, but stopped just short.  Finally, as I swung the lure back toward the boat, the large fish simply nudged it without even opening its mouth.  It clearly was wary, and didn’t want to commit without investigating a bit more.  After the nudge the fish took off, I’m sure because it felt the difference between a real fish and my plastic Shallow Raider.  The follow was a great start to the fishing season, but I’m hoping for more catches this year, not just follows.  I did get this largemouth bass as a consolation catch:

Open Day LM Bass.jpg

Last week I completed an upgrade of the Oracle database in our Production environment.  We only have a single application running on our Oracle database as compared to numerous applications running against SQL Server, so I’m not nearly as familiar with working in Oracle.  To put it bluntly, I was quite nervous to do the upgrade, and practiced several times in our Test environment before proceeding with the Production upgrade.  I was nervous for good reason because the upgrade did not go smoothly at all.

In our Test environment, it took me 3 or 4 tries before I was able to successfully complete the upgrade.  I found a few major hurdles to clear:

  • I needed a new service user created.  We had been running the existing Oracle database service under a domain user, but the installer complained that this user had some flavor of domain admin rights, and it would not let me proceed with that user.
  • Our startup parameter file had a parameter called os_authent_prefix, which was set to domain\.  When the upgrade ran I kept getting error messages caused by the \.  I guess the upgrade assistant was not smart enough to deal with it, so I removed that parameter before the upgrade, then added it back in afterward.  This is an important note!  If you are doing an upgrade in the Windows environment, you will probably run into this issue.
  • I had to set the job_queue_processes parameter to a higher number, 100.
  • I dropped a few deprecated startup parameters, remote_os_authent and cursor_space_for_time.  I ended up adding the remote_os_authent back in after the upgrade had completed.
  • Lastly, before the upgrade I compiled invalid objects with the UTLRP.SQL script and emptied the Oracle recycle bin (a rough sketch of these pre-upgrade commands follows this list).
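For reference, the pre-upgrade preparation can be scripted along these lines.  This is a sketch rather than my exact session, so treat the parameter scopes and the utlrp path as illustrative and adjust for your own environment:

-- Remove the parameter the upgrade assistant chokes on; add it back after the upgrade
ALTER SYSTEM RESET os_authent_prefix SCOPE=SPFILE SID='*';

-- Raise job_queue_processes ahead of the upgrade
ALTER SYSTEM SET job_queue_processes=100 SCOPE=BOTH;

-- Drop the deprecated parameters (remote_os_authent gets added back afterward)
ALTER SYSTEM RESET remote_os_authent SCOPE=SPFILE SID='*';
ALTER SYSTEM RESET cursor_space_for_time SCOPE=SPFILE SID='*';

-- Recompile invalid objects and empty the recycle bin
@?/rdbms/admin/utlrp.sql
PURGE DBA_RECYCLEBIN;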

In addition to those issues, which were causing the upgrade process to completely bomb, I had some cleanup to do once I fixed them and got the upgrade to complete: the PATH and TNS_ADMIN environment variables had to point to the new Oracle home folder structure, and the tnsnames.ora and listener.ora files in the new Oracle home needed to be updated.  By the last practice attempt I was getting through the upgrade process in around an hour.
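The environment variable piece of that cleanup can also be scripted.  Here is a minimal Powershell sketch, assuming a hypothetical new Oracle home path; substitute your actual 12c home:

#Hypothetical new 12c Oracle home
$newHome = 'E:\oracle\product\12.1.0\dbhome_1'

#Point TNS_ADMIN at the new home's network\admin folder (machine-wide)
[Environment]::SetEnvironmentVariable('TNS_ADMIN', "$newHome\network\admin", 'Machine')

#Put the new home's bin folder at the front of the machine PATH
$path = [Environment]::GetEnvironmentVariable('Path', 'Machine')
[Environment]::SetEnvironmentVariable('Path', "$newHome\bin;$path", 'Machine')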

Finally, the night arrived for the Production upgrade.  The application is only used during business hours, so I started it at 5 PM once the normal users were finished for the day.  The first thing I noticed was a difference in environment variables between our Production and Test environments.  Production has the ORACLE_HOME environment variable set, and Test does not.  This somehow caused the first upgrade attempt to completely fail.  Not only did it fail, but it also erased the OracleDB service from the list of services in Windows.  It took me quite a while to get it back and working again so I could make a second attempt at the upgrade.  Although I received some error messages, the second attempt did complete successfully.  The one thing I wasn’t expecting, though, was the amount of time it took.  A database of the same size on inferior hardware took less than an hour to upgrade in the Test environment, but the Production upgrade took well over three hours.

I had to perform similar cleanup in the Production environment as in Test, but I also ran into one additional hiccup.  After the upgrade I found all accounts except for sys were expired and locked.  Now, unlocking an account is not a big problem, but there is no good way to unexpire an account.  This is a big problem because many of the accounts are service accounts, where no actual user is signing in, so there is no prompt for a new password and no way to unexpire.  Fortunately I found a great workaround in a blog post by Simon Krenger.  The method replaces each account’s existing encrypted password with that same encrypted password, which resets the expiration without changing the password.  Once I executed the SQL output by his method, all the accounts were unexpired.
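The gist of the workaround is to generate an ALTER USER ... IDENTIFIED BY VALUES statement for each expired account using its existing password hash.  This is a rough sketch based on my recollection of Krenger's approach; the hash columns in sys.user$ vary by version, so check it against your own environment before running the output:

SELECT 'ALTER USER "' || u.name || '" IDENTIFIED BY VALUES ''' || u.spare4 || ';' || u.password || ''';' AS fix_sql
FROM   sys.user$ u
       JOIN dba_users d ON d.username = u.name
WHERE  d.account_status LIKE 'EXPIRED%';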

The last step was to gather stats on the relevant databases, and the upgrade was complete.
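For anyone following along, the stats gathering can be as simple as the following; which of these calls you actually need depends on your environment and Oracle's post-upgrade recommendations:

EXEC DBMS_STATS.GATHER_DICTIONARY_STATS;
EXEC DBMS_STATS.GATHER_FIXED_OBJECTS_STATS;
EXEC DBMS_STATS.GATHER_DATABASE_STATS;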

Removing Schema from Oracle

I took my kids to the park on the local lake here this weekend.  While we went mainly to play on the playground, we also brought along a fishing pole and some worms.  I didn’t think the fish had moved up in the shallows yet, but we brought the pole just in case.  Unfortunately, I was right.  The weeds were still brown and low, and there was no sign of any fish.  They should be coming up pretty soon though.  We have a little over two weeks until the fishing opener!

We recently stopped using a third party software that had a back-end in our Oracle database.  After waiting a sufficient amount of time, I was finally able to remove the user and data from the database.  I’d never done a large data removal like this from Oracle before.  My first step was to export a copy of the data using data pump.  Although the data should never be needed again, it doesn’t hurt to grab one last copy before deleting it into oblivion.

20170418 expdp
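The screenshot above shows the command I actually ran.  A schema-level export of a user like this one would look something like the following, with hypothetical directory, schema, and file names:

expdp system/password@my_sid schemas=USERTODROP directory=DATA_PUMP_DIR dumpfile=usertodrop.dmp logfile=usertodrop_exp.log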

I zipped that up and placed it into archive storage, probably never to be touched again.

Next I dropped the user.  To drop a user that owns objects, you must include the CASCADE option.

DROP USER usertodrop CASCADE;

Lastly, I noticed that the tablespaces created to house that schema’s objects were still there, so I needed to drop them as well. I made a mistake the first time around by omitting INCLUDING CONTENTS AND DATAFILES. Without that clause, the datafiles stay on disk and you have to delete them manually through the OS. In my case, I had to wait several hours for the Oracle service to relinquish its lock on the files. The second time around I used the full command, and the disk space was returned to the server immediately:

DROP TABLESPACE ts_usertodrop_pd INCLUDING CONTENTS AND DATAFILES;
DROP TABLESPACE ts_usertodrop_td INCLUDING CONTENTS AND DATAFILES;

After this, the user was gone!

db_recovery_file_dest_size of N bytes is X% used and has R remaining bytes available

The boat launch can be a tough place, especially if you are a beginner on a busy lake.  People are anxious to get onto and off of the lake.  You don’t want people to be sitting there waiting on you, so the temptation is to go as fast as you can.  This can be a huge mistake for many though.  Even experienced boaters can make rookie mistakes that are downright embarrassing in front of all the other boaters.  Personally, I’ve made two mistakes that were somewhat embarrassing.  My first wasn’t a big deal – I forgot to put in the boat plug.  I’d guess most boaters have made this mistake at one time or another.  Water started gushing in the boat.  Once I saw it I quickly jammed in the plug and started the bilge pump, which drained out the water in about five minutes.  No real harm done.  My other big mistake was while I was pulling off a river.  The launch was gravel and had no dock.  I walked out on the tongue of the trailer, but while I was adjusting the boat I lost my balance.  Unfortunately my shoe got stuck on a bracket so instead of stepping down and getting my leg wet up to my knee, I fell back first and ended up totally submerged in 1 1/2 feet of water.  I’m the kind of guy who is able to laugh at my mistakes, so even though I ended up soaking wet and embarrassed, I still had a good laugh at my own expense.

A DBA can certainly see the lesson in this.  If you go too fast you make mistakes.  You need to be very careful and pay close attention to details.  One way to cover yourself is to maintain backups.  I recently ran into a warning in Oracle Enterprise Manager that said I was using 97% of the recovery area free space.  A quick search showed that if the used space reached 100%, the database would stop processing transactions.

20161122-recovery-file-size-oem

I wanted to see within the database what my db_recovery_file_dest_size parameter was set to.  I ran a quick query and found that it was set to 350 GB.

20161122-recovery-file-size-parameter
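If you want to run the same check, either of these works from SQLPlus:

SHOW PARAMETER db_recovery_file_dest_size

SELECT name, value FROM v$parameter WHERE name = 'db_recovery_file_dest_size';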

Next I queried v$recovery_file_dest to see the total space, the used space, and the directory where I was storing the backupsets.

20161122-recovery-file-size-recovery-dest
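The query is along these lines; the columns are the standard v$recovery_file_dest layout:

SELECT name,
       space_limit / 1024 / 1024 / 1024       AS space_limit_gb,
       space_used / 1024 / 1024 / 1024        AS space_used_gb,
       space_reclaimable / 1024 / 1024 / 1024 AS space_reclaimable_gb,
       number_of_files
FROM   v$recovery_file_dest;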

I checked the directory and found it quite full of backup files.

20161122-recovery-file-size-backups

My next step was to attempt to delete any expired archivelogs.  Unfortunately, this didn’t do anything:

20161122-recovery-file-size-delete-expired-results
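For context, deleting expired archivelogs is normally a two-step affair in RMAN (a sketch, not my exact session); in my case it simply came back with nothing to delete:

RMAN> CROSSCHECK ARCHIVELOG ALL;
RMAN> DELETE EXPIRED ARCHIVELOG ALL;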

I needed to back up the archivelogs before I could delete any of them, so I ran the following command in RMAN:

20161122-recovery-file-size-backup-archivelog-all-delete-input
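The command in the screenshot is essentially this one-liner (any format or channel options aside):

RMAN> BACKUP ARCHIVELOG ALL DELETE INPUT;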

This ran a VERY long time, but it eventually succeeded and fixed my problem.  After running the command I checked my free space and found I’d gained plenty:

20161122-recovery-file-size-fixed-size

Taking backups is not enough – you need to be aware of what is happening to those backups.  Testing recoverability is also a great idea!

Using Powershell to create a nightly SQL health check

I had a great opportunity through work to go on a hunting weekend last week.  I usually prefer to fish, especially at this time of year, but I found the hunting to be a lot of fun.  I saw a total of 16 deer, 3 of them bucks.  The only deer in shooting range were two small does, and I let them go.  It was very cool to see the deer all around me but not aware of my presence.  I’d love to get back out, but this coming weekend starts gun hunting here in Wisconsin, and I’d prefer not to be in the woods with all the bullets that will be flying around.

hunt-guard

My last two posts showed how to query SQL Server and Oracle using Powershell.  In this post I’m going to share my Powershell script that I use to create a nightly monitoring workbook in Excel.  Each morning when I get in I open that workbook to be sure everything is well in my databases.  The workbook is divided into four tabs:

  1. Scheduled Jobs
  2. Backups
  3. Database file space
  4. Server disk space

I use the information in these worksheets to investigate failed jobs, make sure my backups are up to date, grow any database files that need to be grown, and make sure my servers aren’t running out of space.  Please note I found a script similar to this online and adapted it for my purposes.  The idea behind this monitoring is not original, but my implementation is.

#Create a new Excel object using COM 
$Excel = New-Object -ComObject Excel.Application
$excel.Visible = $true
$excel.DisplayAlerts = $false

$WB = $Excel.Workbooks.Add()
$WB.Worksheets.Add()
$WB.Worksheets.Add()
$WB.Worksheets.Add()
$now = [datetime]::now.ToString('yyyyMMdd')
$a = "\\Servername\IT\Gugg\Powershell\ServerMonitoring\ServerMonitoring$now.xlsx"
#write-host $a
$WB.SaveAs($a)

#################################################################################
#####                             SQL Agent Jobs                                #####

$Sheet = $WB.Worksheets.Item(1)
$Sheet.Name = "SQL Agent Jobs"

#Counter variable for rows
$intRow = 1

#Header for SQL Server
$Sheet.Cells.Item($intRow,1) = "SQL Server"
$Sheet.Cells.Item($intRow,1).Font.Bold = $True
$Sheet.Cells.Item($intRow,1).Font.ColorIndex = 10
$intRow++

#Read thru the contents of the SQL_Servers.txt file
foreach ($instance in get-content "\\Servername\IT\Gugg\Powershell\SQL_Servers.txt")
{

     #Create column headers
     $Sheet.Cells.Item($intRow,1) = "INSTANCE NAME:"
     $Sheet.Cells.Item($intRow,2) = $instance
     $Sheet.Cells.Item($intRow,1).Font.Bold = $True
     $Sheet.Cells.Item($intRow,2).Font.Bold = $True

     $intRow++

      $Sheet.Cells.Item($intRow,1) = "JOB NAME"
      $Sheet.Cells.Item($intRow,2) = "LAST RUN OUTCOME"
      $Sheet.Cells.Item($intRow,3) = "LAST RUN DATE"
      $Sheet.Cells.Item($intRow,4) = "ENABLED"

     #Format the column headers
     for ($col = 1; $col -le 4; $col++)
     {
          $Sheet.Cells.Item($intRow,$col).Font.Bold = $True
          $Sheet.Cells.Item($intRow,$col).Interior.ColorIndex = 48
          $Sheet.Cells.Item($intRow,$col).Font.ColorIndex = 34
     }

     $intRow++
      #######################################################
     #This script gets SQL Server Agent job status information using PowerShell

     [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null

     # Create an SMO connection to the instance
     $srv = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $instance

     $jobs=$srv.JobServer.Jobs

     #Formatting using Excel


ForEach ($job in $jobs)  
{ 

       # Formatting for the failed jobs 
       if ($job.LastRunOutcome -eq 0) 
       { 
           $fgColor = 3 
       } 
       else 
       { 
           $fgColor = 0 
       } 
   

       $Sheet.Cells.Item($intRow, 1) =  $job.Name 
       $Sheet.Cells.Item($intRow, 2) =  $job.LastRunOutcome.ToString() 
       $Sheet.Cells.item($intRow, 2).Interior.ColorIndex = $fgColor 
       $Sheet.Cells.Item($intRow, 3) =  $job.LastRunDate
       $Sheet.Cells.Item($intRow, 4) =  $job.IsEnabled

    
            
       $intRow ++ 
   
} 
   $intRow ++ 


}

#Header for Oracle
$Sheet.Cells.Item($intRow,1) = "ORACLE"
$Sheet.Cells.Item($intRow,1).Font.Bold = $True
$Sheet.Cells.Item($intRow,1).Font.ColorIndex = 30
$intRow++

#Read thru the contents of the Oracle_Servers.txt file
foreach ($instance in get-content "\\Servername\IT\Gugg\Powershell\Oracle_Servers.txt")
{

#Create column headers
     $Sheet.Cells.Item($intRow,1) = "DATABASE NAME:"
     $Sheet.Cells.Item($intRow,2) = $instance
     $Sheet.Cells.Item($intRow,1).Font.Bold = $True
     $Sheet.Cells.Item($intRow,2).Font.Bold = $True

     $intRow++

      $Sheet.Cells.Item($intRow,1) = "JOB NAME"
      $Sheet.Cells.Item($intRow,2) = "OWNER"
      $Sheet.Cells.Item($intRow,3) = "LAST RUN OUTCOME"
      $Sheet.Cells.Item($intRow,4) = "LAST RUN DATE"
      $Sheet.Cells.Item($intRow,5) = "ENABLED"

     #Format the column headers
     for ($col = 1; $col -le 5; $col++)
     {
          $Sheet.Cells.Item($intRow,$col).Font.Bold = $True
          $Sheet.Cells.Item($intRow,$col).Interior.ColorIndex = 48
          $Sheet.Cells.Item($intRow,$col).Font.ColorIndex = 34
     }

     $intRow++
     
     #Create connection to Oracle
     $OracleconnectionString = "Data Source=OracleDataSource;User Id=myUser;Password=myPassword;Integrated Security=No"
     $OracleSchedulerQueryString = "with cte as (SELECT sjrd.job_name ,sjrd.owner ,sjrd.status ,TO_CHAR(sjrd.actual_start_date ,'yyyy-mm-dd HH24:MI:SS') as ACTUAL_START_DATE ,sjrd.error# ,asj.ENABLED ,ROW_NUMBER() OVER (Partition by sjrd.JOB_NAME ORDER BY sjrd.ACTUAL_START_DATE DESC) as rn FROM dba_scheduler_job_run_details sjrd INNER JOIN ALL_SCHEDULER_JOBS asj on sjrd.JOB_NAME = asj.JOB_NAME) select JoB_Name ,owner ,Status ,ACTUAL_START_DATE ,ENABLED from cte where rn = 1"
     [System.Reflection.Assembly]::LoadWithPartialName("System.Data.OracleClient") | Out-Null
     $OracleConnection = New-Object System.Data.OracleClient.OracleConnection($OracleConnectionString)
     $OracleCommand = New-Object System.Data.OracleClient.OracleCommand($OracleSchedulerQueryString, $OracleConnection)
     $OracleConnection.Open()
     $OraSchedJobs = $OracleCommand.ExecuteReader()
     $OraCounter = $OraSchedJobs.FieldCount
     #Read the data out onto the Excel worksheet
     While ($OraSchedJobs.Read()) {
        for ($o=0; $o -lt $OraCounter; $o = $o + 5) {
            $Sheet.Cells.Item($intRow, 1) = $OraSchedJobs.GetValue($o)
            $Sheet.Cells.Item($intRow, 2) = $OraSchedJobs.GetValue($o+1)
            $Sheet.Cells.Item($intRow, 3) = $OraSchedJobs.GetValue($o+2)
            if ($OraSchedJobs.GetValue($o+2) -ne "SUCCEEDED") 
                { 
                    $Sheet.Cells.item($intRow, 3).Interior.ColorIndex = 3 
                } 
            $Sheet.Cells.Item($intRow, 4) = $OraSchedJobs.GetValue($o+3)
            $Sheet.Cells.Item($intRow, 5) = $OraSchedJobs.GetValue($o+4)
            $intRow++
        }
    }
    $OracleConnection.Close()
    $intRow++
}

$Sheet.UsedRange.EntireColumn.AutoFit()

##############################################################################
#####                    Database Backups                                #####

#Select the second worksheet
$Sheet = $WB.Worksheets.Item(2)
$Sheet.Name = "Database Backups"

#Counter variable for rows
$intRow = 1

#Read thru the contents of the SQL_Servers.txt file
foreach ($instance in get-content "\\Servername\IT\Gugg\Powershell\SQL_Servers.txt")
{

     #Create column headers
     $Sheet.Cells.Item($intRow,1) = "INSTANCE NAME:"
     $Sheet.Cells.Item($intRow,2) = $instance
     $Sheet.Cells.Item($intRow,1).Font.Bold = $True
     $Sheet.Cells.Item($intRow,2).Font.Bold = $True

     $intRow++

      $Sheet.Cells.Item($intRow,1) = "DATABASE NAME"
      $Sheet.Cells.Item($intRow,2) = "LAST FULL BACKUP"
      $Sheet.Cells.Item($intRow,3) = "LAST DIFF BACKUP"
      $Sheet.Cells.Item($intRow,4) = "LAST LOG BACKUP"
      $Sheet.Cells.Item($intRow,5) = "LAST BACKUP AGE(DAYS)"

     #Format the column headers
     for ($col = 1; $col -le 5; $col++)
     {
          $Sheet.Cells.Item($intRow,$col).Font.Bold = $True
          $Sheet.Cells.Item($intRow,$col).Interior.ColorIndex = 48
          $Sheet.Cells.Item($intRow,$col).Font.ColorIndex = 34
     }

     $intRow++
      #######################################################
     #This script gets SQL Server database information using PowerShell

     [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null

     # Create an SMO connection to the instance
     $s = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $instance

     $dbs = $s.Databases

     #Formatting using Excel


ForEach ($db in $dbs)  
{ 
   if ($db.Name -ne "tempdb") #We do not need the backup information for the tempdb database 
   {         
       if($db.LastBackupDate -eq "1/1/0001 12:00 AM") #This is the default dateTime value for databases that have not had any backups 
       { 
           $fullBackupDate="Never been backed up" 
           $fgColor3=3  #ColorIndex 3 is red
       } 
       elseif(($db.LastBackupDate) -lt ((Get-Date).AddDays(-7)))
       {
              $fullBackupDate="{0:g}" -f  $db.LastBackupDate
           $fgColor3=3  #ColorIndex 3 is red
       }
       else 
       { 
           $fullBackupDate="{0:g}" -f  $db.LastBackupDate
           $fgColor3 = -4142  #xlColorIndexNone (no fill)
       } 
       
       $Sheet.Cells.Item($intRow, 1) = $db.Name 
       $Sheet.Cells.Item($intRow, 2) = $fullBackupDate
       $Sheet.Cells.item($intRow, 2).Interior.ColorIndex = $fgColor3
       
       if($db.LastDifferentialBackupDate -eq "1/1/0001 12:00 AM") #This is the default dateTime value for databases that have not had any backups 
       { 
           $diffBackupDate="Never had differential taken"
           $fgColor4 = 15
       } 
       elseif(($db.LastDifferentialBackupDate) -lt ((Get-Date).AddDays(-2)))
       {
              $diffBackupDate="{0:g}" -f  $db.LastDifferentialBackupDate
           $fgColor4=3
       }
       else 
       { 
           $diffBackupDate="{0:g}" -f  $db.LastDifferentialBackupDate 
           $fgColor4=1
       }
    
         
       $Sheet.Cells.Item($intRow, 3) = $diffBackupDate
       $Sheet.Cells.item($intRow, 3).Font.ColorIndex = $fgColor4
   
       #We use the .ToString() Method to convert the value of the Recovery model to string and ignore Log backups for databases with Simple recovery model 
       if ($db.RecoveryModel.Tostring() -eq "SIMPLE")
       { 
           $logBackupDate="N/A"
           $fgColor5 = 15
           $fgColor6 = -4142  #xlColorIndexNone (no fill)
       } 
       else 
       { 
           if($db.LastLogBackupDate -eq "1/1/0001 12:00 AM")  
           { 
               $logBackupDate="Never been backed up" 
               $fgColor5 = 1
                  $fgColor6 = 3
           } 
           elseif(($db.LastLogBackupDate) -lt ((Get-Date).AddHours(-5)))
           {
                  $logBackupDate= "{0:g}" -f $db.LastLogBackupDate
               $fgColor5 = 1
                  $fgColor6 = 3
           }
           else 
           { 
               $logBackupDate= "{0:g}" -f $db.LastLogBackupDate
               $fgColor5 = 1
                $fgColor6 = -4142  #xlColorIndexNone (no fill)
           } 
            
       } 
    
       $Sheet.Cells.Item($intRow, 4) = $logBackupDate
       $Sheet.Cells.Item($intRow, 4).Font.ColorIndex = $fgColor5
       $Sheet.Cells.Item($intRow, 4).Interior.ColorIndex = $fgColor6
    
       #Define your service-level agreement in terms of days here 
       if ($db.LastBackupDate -gt $db.LastDifferentialBackupDate)
           {
            if ($db.LastBackupDate -gt $db.LastLogBackupDate)
                {
                    $LatestBackup = $db.LastBackupDate
                }
            else
                {
                    $LatestBackup = $db.LastLogBackupDate
                }
        }
        elseif ($db.LastDifferentialBackupDate -gt $db.LastBackupDate)
        {
            if ($db.LastDifferentialBackupDate -gt $db.LastLogBackupDate)
                {
                    $LatestBackup = $db.LastDifferentialBackupDate
                }
            else
                {
                    $LatestBackup = $db.LastLogBackupDate
                }
        }
        else
        {
            $LatestBackup = $db.LastLogBackupDate
        }
        
        
       $NumDaysSinceLastBackup = ((Get-Date) - $LatestBackup).Days #We use Date Math to extract the number of days since the last full backup  
       if ($NumDaysSinceLastBackup -gt 1) 
       { 
           $fgColor = 3 
       } 
       else 
       { 
           $fgColor = 0 
       } 
    
       $Sheet.Cells.Item($intRow, 5) = $NumDaysSinceLastBackup 
       $Sheet.Cells.item($intRow, 5).Interior.ColorIndex = $fgColor 
    
    
            
       $intRow ++ 
    
       } 
   } 
   $intRow ++ 


}

#Read thru the contents of the Oracle_Servers.txt file
foreach ($instance in get-content "\\Servername\IT\Gugg\Powershell\Oracle_Servers.txt")
{
    #Create column headers
     $Sheet.Cells.Item($intRow,1) = "INSTANCE NAME:"
     $Sheet.Cells.Item($intRow,2) = $instance
     $Sheet.Cells.Item($intRow,1).Font.Bold = $True
     $Sheet.Cells.Item($intRow,2).Font.Bold = $True

     $intRow++
     
     $Sheet.Cells.Item($intRow,1) = "DATAFILE BACKUPS"
     $Sheet.Cells.Item($intRow,1).Interior.ColorIndex = 50
     $Sheet.Cells.Item($intRow,1).Font.Bold = $True
     
     $intRow++

      $Sheet.Cells.Item($intRow,1) = "TABLESPACE"
      $Sheet.Cells.Item($intRow,2) = "FILE TYPE"
      $Sheet.Cells.Item($intRow,3) = "BACKUP TYPE"
      $Sheet.Cells.Item($intRow,4) = "FILE NAME"
      $Sheet.Cells.Item($intRow,5) = "BACKUPSET TIME"


     #Format the column headers
     for ($col = 1; $col -le 5; $col++)
     {
          $Sheet.Cells.Item($intRow,$col).Font.Bold = $True
          $Sheet.Cells.Item($intRow,$col).Interior.ColorIndex = 48
          $Sheet.Cells.Item($intRow,$col).Font.ColorIndex = 34
     }

     $intRow++
     
     #Create connection to Oracle
     $OracleconnectionString = "Data Source=OracleDataSource;User Id=myUser;Password=myPassword;Integrated Security=No"
     $OracleDFBUQueryString = "with cte as (select DF_TABLESPACE,FILE_TYPE,BACKUP_TYPE,FNAME,BS_COMPLETION_TIME, ROW_NUMBER () OVER (PARTITION BY DF_TABLESPACE,FILE_TYPE,BACKUP_TYPE,FNAME ORDER BY BS_COMPLETION_TIME DESC) rn from V`$BACKUP_FILES WHERE FILE_TYPE = 'DATAFILE') SELECT DF_TABLESPACE, FILE_TYPE, BACKUP_TYPE, FNAME, BS_COMPLETION_TIME FROM cte WHERE rn = 1"
     [System.Reflection.Assembly]::LoadWithPartialName("System.Data.OracleClient") | Out-Null
     $OracleConnection = New-Object System.Data.OracleClient.OracleConnection($OracleConnectionString)
     $OracleCommand = New-Object System.Data.OracleClient.OracleCommand($OracleDFBUQueryString, $OracleConnection)
     $OracleConnection.Open()
     $OraDFBU = $OracleCommand.ExecuteReader()
     $OraCounter = $OraDFBU.FieldCount
     #Read the data out onto the Excel worksheet
     While ($OraDFBU.Read()) {
        for ($o=0; $o -lt $OraCounter; $o = $o + 5) {
            $Sheet.Cells.Item($intRow, 1) = $OraDFBU.GetValue($o)
            $Sheet.Cells.Item($intRow, 2) = $OraDFBU.GetValue($o+1)
            $Sheet.Cells.Item($intRow, 3) = $OraDFBU.GetValue($o+2)
            $Sheet.Cells.Item($intRow, 4) = $OraDFBU.GetValue($o+3)
            $Sheet.Cells.Item($intRow, 5) = $OraDFBU.GetValue($o+4)
            $intRow++
        }
    }
    $intRow++
    $Sheet.Cells.Item($intRow,1) = "OTHER FILE BACKUPS"
     $Sheet.Cells.Item($intRow,1).Interior.ColorIndex = 54
     $Sheet.Cells.Item($intRow,1).Font.Bold = $True
     
     $intRow++

      $Sheet.Cells.Item($intRow,1) = "BACKUP TYPE"
      $Sheet.Cells.Item($intRow,2) = "FILE TYPE"
      $Sheet.Cells.Item($intRow,3) = "BACKUP TIME"
      
      
      #Format the column headers
     for ($col = 1; $col -le 3; $col++)
     {
          $Sheet.Cells.Item($intRow,$col).Font.Bold = $True
          $Sheet.Cells.Item($intRow,$col).Interior.ColorIndex = 48
          $Sheet.Cells.Item($intRow,$col).Font.ColorIndex = 34
     }
    
    $intRow++
    #We're going to use the existing connection, which is already open, but we need a new query string and command object
    $OracleOFBUQueryString = "with cte as ( select BACKUP_TYPE, FILE_TYPE, NVL(COMPLETION_TIME,BS_COMPLETION_TIME) COMPLETION_TIME, ROW_NUMBER () OVER (PARTITION BY BACKUP_TYPE, FILE_TYPE ORDER BY NVL(COMPLETION_TIME,BS_COMPLETION_TIME) DESC) rn from V`$BACKUP_FILES WHERE FILE_TYPE<> 'DATAFILE') SELECT BACKUP_TYPE, FILE_TYPE, COMPLETION_TIME FROM cte WHERE rn = 1"
    $OracleCommandOF = New-Object System.Data.OracleClient.OracleCommand($OracleOFBUQueryString, $OracleConnection)
    $OraOFBU = $OracleCommandOF.ExecuteReader()
     $OraCounterOF = $OraOFBU.FieldCount
     #Read the data out onto the Excel worksheet
     While ($OraOFBU.Read()) {
        for ($o=0; $o -lt $OraCounterOF; $o = $o + 3) {
            $Sheet.Cells.Item($intRow, 1) = $OraOFBU.GetValue($o)
            $Sheet.Cells.Item($intRow, 2) = $OraOFBU.GetValue($o+1)
            $Sheet.Cells.Item($intRow, 3) = $OraOFBU.GetValue($o+2)
            $intRow++
        }
    }
    
    $OracleConnection.Close()
    $intRow++
     
}

$Sheet.UsedRange.EntireColumn.AutoFit()

#############################################################################################################
#####                                    Database Disk Space                                                #####

#Select the third worksheet
$Sheet = $WB.Worksheets.Item(3)
$Sheet.Name = "Database File Space"

#Counter variable for rows
$intRow = 1



#Read thru the contents of the SQL_Servers.txt file
foreach ($instance in get-content "\\Servername\IT\Gugg\Powershell\SQL_Servers.txt")
{

     # Create an SMO connection to the instance
     $s = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $instance

     $dbs = $s.Databases

     #Create column headers
     $Sheet.Cells.Item($intRow,1) = "INSTANCE NAME:"
     $Sheet.Cells.Item($intRow,2) = $instance
     $Sheet.Cells.Item($intRow,1).Font.Bold = $True
     $Sheet.Cells.Item($intRow,2).Font.Bold = $True

     $intRow++

      $Sheet.Cells.Item($intRow,1) = "DATABASE NAME"
      $Sheet.Cells.Item($intRow,2) = "AUTOSHRINK"
      $Sheet.Cells.Item($intRow,3) = "RECOVERY MODEL"
      $Sheet.Cells.Item($intRow,4) = "FILE NAME"
      $Sheet.Cells.Item($intRow,5) = "SIZE (MB)"
      $Sheet.Cells.Item($intRow,6) = "SPACE USED (MB)"
      $Sheet.Cells.Item($intRow,7) = "PERCENT FREE"

     #Format the column headers
     for ($col = 1; $col -le 7; $col++)
     {
          $Sheet.Cells.Item($intRow,$col).Font.Bold = $True
          $Sheet.Cells.Item($intRow,$col).Interior.ColorIndex = 48
          $Sheet.Cells.Item($intRow,$col).Font.ColorIndex = 34
     }

     $intRow++
      #######################################################
     #This script gets SQL Server database information using PowerShell

#     [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null

     # Create an SMO connection to the instance
#     $s = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $instance

#     $dbs = $s.Databases

     #$dbs | SELECT Name, Collation, CompatibilityLevel, AutoShrink, RecoveryModel, Size, SpaceAvailable

     #Formatting using Excel

     ForEach ($db in $dbs) 
     {
     $FileGroups = $db.FileGroups
        ForEach($FileGroup in $FileGroups)
        {
        $Files = $FileGroup.Files
            ForEach($File in $Files)
            {
#              #Divide the value of SpaceAvailable by 1KB 
#              $FileSpaceAvailable = $File.SpaceAvailable/1KB 

              #Format the results to a number with three decimal places 
    #          $dbSpaceAvailable = "{0:N3}" -f $dbSpaceAvailable

              $Sheet.Cells.Item($intRow, 1) = $db.Name
     
               #Change the background color of the Cell depending on the AutoShrink property value 
               if ($db.AutoShrink -eq "True")
              {
                   $fgColor = 3
              }
              else
              {
                   $fgColor = 0
              }

              $Sheet.Cells.Item($intRow, 2) = $db.AutoShrink 
              $Sheet.Cells.item($intRow, 2).Interior.ColorIndex = $fgColor

              $Recovery = $db.RecoveryModel
              Switch ($Recovery)
              {
                 Full { $RecoveryText = 'Full' }
                 BulkLogged { $RecoveryText = 'Bulk-Logged' }
                 Simple { $RecoveryText = 'Simple' }
                 Default { $RecoveryText = 'unable to determine' }
                }

              $Sheet.Cells.Item($intRow, 3) = $RecoveryText
              $Sheet.Cells.Item($intRow, 4) = $File.FileName
              $Sheet.Cells.Item($intRow, 5) = "{0:N0}" -f ($File.Size/1024)

              #Change the background color of the Cell depending on the SpaceAvailable property value 
              if (($File.UsedSpace / $File.Size * 100) -gt 95.00)
              {
                   $fgColor = 3
              }
               else
              {
                   $fgColor = 0
              }

              $Sheet.Cells.Item($intRow, 6) = "{0:N0}" -f ($File.UsedSpace/1024 )
              $Sheet.Cells.Item($intRow, 7) = 1-($File.UsedSpace / $File.Size)
              $Sheet.Cells.item($intRow, 7).Interior.ColorIndex = $fgColor
              $Sheet.Cells.item($intRow, 7).NumberFormat = "0.00%"

              $intRow ++
                  }
            }
        #Continue with log files
        $LogFiles = $db.LogFiles
        ForEach ($LogFile in $LogFiles)
            {
                $Sheet.Cells.Item($intRow, 1) = $db.Name
     
               #Change the background color of the Cell depending on the AutoShrink property value 
               if ($db.AutoShrink -eq "True")
              {
                   $fgColor = 3
              }
              else
              {
                   $fgColor = 0
              }

              $Sheet.Cells.Item($intRow, 2) = $db.AutoShrink 
              $Sheet.Cells.item($intRow, 2).Interior.ColorIndex = $fgColor

              $Recovery = $db.RecoveryModel
              Switch ($Recovery)
              {
                 Full { $RecoveryText = 'Full' }
                 BulkLogged { $RecoveryText = 'Bulk-Logged' }
                 Simple { $RecoveryText = 'Simple' }
                 Default { $RecoveryText = 'unable to determine' }
                }

              $Sheet.Cells.Item($intRow, 3) = $RecoveryText
              $Sheet.Cells.Item($intRow, 4) = $LogFile.FileName
              $Sheet.Cells.Item($intRow, 5) = "{0:N0}" -f ($LogFile.Size/1024)

              #Change the background color of the Cell depending on the SpaceAvailable property value 
              if (($LogFile.UsedSpace / $LogFile.Size * 100) -gt 95.00)
              {
                   $fgColor = 3
              }
               else
              {
                   $fgColor = 0
              }

              $Sheet.Cells.Item($intRow, 6) = "{0:N0}" -f ($LogFile.UsedSpace/1024 )
              $Sheet.Cells.Item($intRow, 7) = 1-($LogFile.UsedSpace / $LogFile.Size)
              $Sheet.Cells.item($intRow, 7).Interior.ColorIndex = $fgColor
              $Sheet.Cells.item($intRow, 7).NumberFormat = "0.00%"

              $intRow ++
            }

     }

$intRow ++

}

#Read thru the contents of the Oracle_Servers.txt file
foreach ($instance in get-content "\\Servername\IT\Gugg\Powershell\Oracle_Servers.txt")
{

     #Create column headers
     $Sheet.Cells.Item($intRow,1) = "INSTANCE NAME:"
     $Sheet.Cells.Item($intRow,2) = $instance
     $Sheet.Cells.Item($intRow,1).Font.Bold = $True
     $Sheet.Cells.Item($intRow,2).Font.Bold = $True

     $intRow++

      $Sheet.Cells.Item($intRow,1) = "DATABASE NAME"
      $Sheet.Cells.Item($intRow,2) = "SPACE MANAGEMENT"
      $Sheet.Cells.Item($intRow,3) = "ARCHIVELOG MODE"
      $Sheet.Cells.Item($intRow,4) = "FILE NAME"
      $Sheet.Cells.Item($intRow,5) = "SIZE (MB)"
      $Sheet.Cells.Item($intRow,6) = "SPACE USED (MB)"
      $Sheet.Cells.Item($intRow,7) = "PERCENT FREE"
      
      #Format the column headers
     for ($col = 1; $col -le 7; $col++)
     {
          $Sheet.Cells.Item($intRow,$col).Font.Bold = $True
          $Sheet.Cells.Item($intRow,$col).Interior.ColorIndex = 48
          $Sheet.Cells.Item($intRow,$col).Font.ColorIndex = 34
     }

     $intRow++
     
     #Create connection to Oracle
     $OracleconnectionString = "Data Source=OracleDataSource;User Id=myUser;Password=myPassword;Integrated Security=No"
     $OracleSchedulerQueryString = "SELECT  Substr(df.tablespace_name,1,20) , dt.SEGMENT_SPACE_MANAGEMENT, vd.LOG_MODE, Substr(df.file_name,1,80), Round(df.bytes/1024/1024,0), decode(e.used_bytes,NULL,0,Round(e.used_bytes/1024/1024,0)), 1 - decode(e.used_bytes,NULL,0,Round((e.used_bytes/df.bytes),3)) FROM  DBA_DATA_FILES DF INNER JOIN dba_tablespaces dt on DF.TABLESPACE_NAME = dt.tablespace_name LEFT OUTER JOIN (SELECT file_id, sum(bytes) used_bytes FROM dba_extents GROUP by file_id) E ON  e.file_id = df.file_id LEFT OUTER JOIN (SELECT Max(bytes) free_bytes, file_id FROM dba_free_space GROUP BY file_id) f ON df.file_id  = f.file_id CROSS JOIN v`$database vd ORDER BY df.tablespace_name, df.file_name"
     [System.Reflection.Assembly]::LoadWithPartialName("System.Data.OracleClient") | Out-Null
     $OracleConnection = New-Object System.Data.OracleClient.OracleConnection($OracleConnectionString)
     $OracleCommand = New-Object System.Data.OracleClient.OracleCommand($OracleSchedulerQueryString, $OracleConnection)
     $OracleConnection.Open()
     $OraSchedJobs = $OracleCommand.ExecuteReader()
     $OraCounter = $OraSchedJobs.FieldCount
     #Read the data out onto the Excel worksheet
     While ($OraSchedJobs.Read()) {
        for ($o=0; $o -lt $OraCounter; $o = $o + 7) {
            $Sheet.Cells.Item($intRow, 1) = $OraSchedJobs.GetValue($o)
            $Sheet.Cells.Item($intRow, 2) = $OraSchedJobs.GetValue($o+1)
            $Sheet.Cells.Item($intRow, 3) = $OraSchedJobs.GetValue($o+2)
            $Sheet.Cells.Item($intRow, 4) = $OraSchedJobs.GetValue($o+3)
            $Sheet.Cells.Item($intRow, 5) = $OraSchedJobs.GetValue($o+4)
            $Sheet.Cells.Item($intRow, 6) = $OraSchedJobs.GetValue($o+5)
            $Sheet.Cells.Item($intRow, 7) = $OraSchedJobs.GetValue($o+6)
            $Sheet.Cells.item($intRow, 7).NumberFormat = "0.00%"
            if ($OraSchedJobs.GetValue($o+6) -lt 0.05) 
                { 
                    $Sheet.Cells.item($intRow, 7).Interior.ColorIndex = 3 
                } 
            $intRow++
        }
    }
    $OracleConnection.Close()
    $intRow++
}

$Sheet.UsedRange.EntireColumn.AutoFit()

############################################################################################################################
######################                    Server disk space                                     ############################

#Select the fourth worksheet
$Sheet = $WB.Worksheets.Item(4)
$Sheet.Name = "Server Disk Space"

#Counter variable for rows
$intRow = 1

#Read thru the contents of the SQL_Servers.txt file
foreach ($instance in get-content "\\Servername\IT\Gugg\Powershell\SQL_Servers.txt")
{
    #Create column headers
     $Sheet.Cells.Item($intRow,1) = "INSTANCE NAME:"
     $Sheet.Cells.Item($intRow,2) = $instance
     $Sheet.Cells.Item($intRow,1).Font.Bold = $True
     $Sheet.Cells.Item($intRow,2).Font.Bold = $True

     $intRow++

      $Sheet.Cells.Item($intRow,1) = "DRIVE"
      $Sheet.Cells.Item($intRow,2) = "SIZE GB"
      $Sheet.Cells.Item($intRow,3) = "FREE SPACE GB"
      $Sheet.Cells.Item($intRow,4) = "PERCENT FREE"

     #Format the column headers
     for ($col = 1; $col -le 4; $col++)
     {
          $Sheet.Cells.Item($intRow,$col).Font.Bold = $True
          $Sheet.Cells.Item($intRow,$col).Interior.ColorIndex = 48
          $Sheet.Cells.Item($intRow,$col).Font.ColorIndex = 34
     }

     $intRow++
    
    
    $disks = Get-WmiObject -ComputerName $instance -Class Win32_LogicalDisk -Filter "DriveType = 3";
    
    foreach($disk in $disks)
    {
         $percentFree = ($disk.FreeSpace / $disk.Size)
         if ($percentFree -lt .15)
         {
             if ($percentFree -lt .05)
            {
                $fgColor7 = 3
            }
            else
            {
                $fgColor7 = 6
            }
         }
         else
         {
             $fgColor7 = -4142  #xlColorIndexNone (no fill)
         }
        
         $Sheet.Cells.Item($intRow,1) = $disk.DeviceID
         $Sheet.Cells.Item($intRow,2) = "{0:N1}" -f ($disk.Size / 1073741824)
         $Sheet.Cells.Item($intRow,3) = "{0:N1}" -f ($disk.FreeSpace / 1073741824)
         $Sheet.Cells.Item($intRow,4) = $percentFree
         $Sheet.Cells.item($intRow, 4).NumberFormat = "0.00%"
         $Sheet.Cells.Item($intRow,4).Interior.ColorIndex = $fgColor7
         
         $intRow++
    }
    
    $intRow++
}

#Read thru the contents of the Oracle_Servers.txt file
foreach ($instance in get-content "\\Servername\IT\Gugg\Powershell\Oracle_Servers.txt")
{
    #Create column headers
     $Sheet.Cells.Item($intRow,1) = "INSTANCE NAME:"
     $Sheet.Cells.Item($intRow,2) = $instance
     $Sheet.Cells.Item($intRow,1).Font.Bold = $True
     $Sheet.Cells.Item($intRow,2).Font.Bold = $True

     $intRow++

      $Sheet.Cells.Item($intRow,1) = "DRIVE"
      $Sheet.Cells.Item($intRow,2) = "SIZE GB"
      $Sheet.Cells.Item($intRow,3) = "FREE SPACE GB"
      $Sheet.Cells.Item($intRow,4) = "PERCENT FREE"

     #Format the column headers
     for ($col = 1; $col -le 4; $col++)
     {
          $Sheet.Cells.Item($intRow,$col).Font.Bold = $True
          $Sheet.Cells.Item($intRow,$col).Interior.ColorIndex = 48
          $Sheet.Cells.Item($intRow,$col).Font.ColorIndex = 34
     }

     $intRow++
    
    
    $disks = Get-WmiObject -ComputerName $instance -Class Win32_LogicalDisk -Filter "DriveType = 3";
    
    foreach($disk in $disks)
    {
         $percentFree = ($disk.FreeSpace / $disk.Size)
         if ($percentFree -lt .15)
         {
             if ($percentFree -lt .05)
            {
                $fgColor7 = 3
            }
            else
            {
                $fgColor7 = 6
            }
         }
         else
         {
             $fgColor7 = -4142  #xlColorIndexNone (no fill)
         }
        
         $Sheet.Cells.Item($intRow,1) = $disk.DeviceID
         $Sheet.Cells.Item($intRow,2) = "{0:N1}" -f ($disk.Size / 1073741824)
         $Sheet.Cells.Item($intRow,3) = "{0:N1}" -f ($disk.FreeSpace / 1073741824)
         $Sheet.Cells.Item($intRow,4) = $percentFree
         $Sheet.Cells.item($intRow, 4).NumberFormat = "0.00%"
         $Sheet.Cells.Item($intRow,4).Interior.ColorIndex = $fgColor7
         
         $intRow++
    }
    
    $intRow++
}

$Sheet.UsedRange.EntireColumn.AutoFit()

$WB.Save()
$WB.Close()
$Excel.Application.Quit()
cls

There are a few things to note:

  • I keep the list of SQL Servers and Oracle Servers in text files on a folder share.  Those are the \\Servername\IT\Gugg\Powershell\Oracle_Servers.txt and \\Servername\IT\Gugg\Powershell\SQL_Servers.txt paths that you see over and over.

  • You’ll notice several places where the Excel cells are shaded red or yellow, depending upon conditions.  It helps to see potential problems highlighted when going through the new workbook each morning.
  • I use Windows Task Scheduler to kick this Powershell script off each morning at 6 AM (a rough sketch of the scheduling command is shown below).
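Registering the script from the command line looks roughly like this; the task name and the .ps1 path are hypothetical:

schtasks /Create /TN "NightlySQLHealthCheck" /SC DAILY /ST 06:00 /TR "powershell.exe -ExecutionPolicy Bypass -File \\Servername\IT\Gugg\Powershell\ServerMonitoring\ServerMonitoring.ps1"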

Querying Oracle through Powershell

I’ve been dealing with some back pain this summer/fall that has made fishing uncomfortable.  I’m still able to do it, but when you’re in constant pain, it takes the fun out of doing pretty much everything but laying down.  That, along with the general busyness of the summer and fall, has kept the fishing to a minimum.  However, I did get out for a weekend and found that while musky fishing was difficult, slip-bobber fishing was better.  I’d never slip-bobber fished until this year, but I have found it’s really enjoyable.  You can get the bait right above the bottom at any depth, and that presentation seems to work well when the fish aren’t as aggressive.

This brings me to my second note – I love catching smallmouth bass.  They fight really hard for their size, with deep runs and long battles.  Below is one of two that I caught at dusk on a weedy finger that stuck out into the main basin of the lake.

late-summer-smallmouth-bass

As you’d expect, querying an Oracle database from Powershell isn’t as easy as querying SQL Server.  Powershell and SQL Server are both Microsoft technologies, so they play together pretty nicely.  With Oracle, there are a couple of ways to connect to the database.  You can use the OLE DB database connector or the Oracle Client version.  I tried using the OLE DB connector first, but found I was unable to get it going.  So I switched over to the Oracle Client, which was a bit more complicated to set up, but worked.

In order to use the Oracle Client, you have to install it on your system first.  I will warn you that this is not a minor install.  The installer itself is 600 MB, and you need to make sure you grab the correct version (32 vs 64 bit), depending on your OS.  Once installed you should be ready to go.  Here is the basic pattern for creating the connection and querying data:

#Create connection to Oracle
 $OracleconnectionString = "Data Source=OracleDatabaseServer;User Id=dgugg;Password=myPass;Integrated Security=No"
 $OracleSchedulerQueryString = "select IMAGE_ID, DESCRIPTION from FMSH_IMAGE WHERE Rownum < 11 ORDER BY IMAGE_ID DESC"
 [System.Reflection.Assembly]::LoadWithPartialName("System.Data.OracleClient") | Out-Null
 $OracleConnection = New-Object System.Data.OracleClient.OracleConnection($OracleConnectionString)
 $OracleCommand = New-Object System.Data.OracleClient.OracleCommand($OracleSchedulerQueryString, $OracleConnection)
 $OracleConnection.Open()
 $OraImages = $OracleCommand.ExecuteReader()
 $OraCounter = $OraImages.FieldCount
 While ($OraImages.Read()) {
    for ($o=0; $o -lt $OraCounter; $o++ ) {
        #Write out each column name and its value for the current row
        Write-Host ("{0}: {1}" -f $OraImages.GetName($o), $OraImages.GetValue($o))
        }
    }
$OracleConnection.Close()

Here is the gist of the code:

  • Define a connection string to the Oracle database.
  • Define your SQL statement
  • Load the Oracle Client Assembly into this instance of Powershell
  • Define and create your Oracle Connection Object
  • Define and create your Oracle command
  • Open your connection to Oracle
  • Execute your SQL into a recordset
  • Loop through the recordset from beginning to end
    • In each loop, print out the field name and field value

There are a few gotchas to keep in mind:

  1. Don’t end your SQL statement with a semicolon.  I did this and couldn’t figure out why it wasn’t working.  Once I took it out it was smooth sailing.
  2. Don’t forget to escape Powershell special characters.  I was doing a query from the V$Database view and again ran into problems.  Once I escaped the dollar sign ($) with a back tick (`), everything once again started working (see the short example below).
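Here is a quick illustration of that second gotcha:

#Without the back tick, Powershell tries to expand $database as a variable inside the double-quoted string
$query = "SELECT name, open_mode FROM v`$database"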

Stay tuned for my final blog in this series where I will use this connection to Oracle as well as the previously posted connection to SQL Server to create a workbook that has daily monitoring of my database health stats.

Oracle not available: shared memory realm does not exist

Last month I got the opportunity to do some fishing that I have never done before.  Through a work event, I was able to go out and do some trolling for salmon in Lake Michigan.  Usually I’m not a big fan of trolling.  You don’t get to feel the fish hit and there is no technique when working the bait.  We were basically just dragging 9 – 11 lines behind the boat with varying baits and depths, and waiting for the fish to bite.  I did end up enjoying it quite a bit though.  It was much more relaxing – I could just sit back in a deck chair and wait for one of the poles to start frantically bobbing.  My arms and sides were already sore from a weekend of muskie fishing, so it was nice to be able to take a break and do some easy fishing.  We ended up catching a bunch of coho salmon.  They put up great fights and were delicious table fare.

Coho Salmon

Every so often I run into an issue where Oracle doesn’t come back up after a server reboot.  Attempting to log in via SQLPlus gives a message indicating that Oracle is not available and the shared memory realm does not exist.

20160711 Oracle not available error

I’m not sure exactly what causes this; if it happened regularly I would check the logs to investigate further.  However, for now I’m just interested in bringing the database back up.  A server reboot can be attempted, but there is a quicker way.  First, open SQLPlus without attempting to log into an instance.

20160711 Oracle not available sqlplus nolog

Next, connect to the idle instance as sysdba.

20160711 Oracle not available conn to idle instancwe

Now shut down the instance.

20160711 Oracle not available shutdown abort

I use the abort option, which will close any connections immediately.  Since the database was unavailable anyway, I’m not really concerned about any in-process transactions.  The last step is to start the instance back up.

20160711 Oracle not available startup
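Put together, the whole sequence looks roughly like this; your connect syntax may differ depending on how you authenticate as sysdba:

sqlplus /nolog
SQL> CONNECT / AS SYSDBA
SQL> SHUTDOWN ABORT
SQL> STARTUP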

Now the database is back online and ready for use.  To test this I exit out of SQLPlus and log back in with my standard user.

20160711 Oracle not available all better

No error message received and I’m good to go.

Moving Data Between Oracle Instances with Data Pump

Last weekend I took our annual trip up north to open my grandpa’s cabin in Oneida County.  There are various tasks to get it ready for use in the summer, such as raking leaves, cleaning the gutters, and putting in the pier.  When I arrived I found most of the work had already been done by a group of people who had come up to do some deep cleaning a few weeks earlier.  This gave me extra time to fish!  I ended up bringing my five-year-old son with me this time, so much of my fishing time was spent with him.  He’s a pretty good fisherman, but I still spent most of my time helping him rather than fishing myself.  It was a great weekend and he caught a ton of fish, including his first northern pike.  He was incredibly excited and I was relieved I got it into the net before it bit off the end of the line.  Taking your children fishing when the bite is really on is one of life’s greatest thrills.

Sons first northern

I didn’t get more than a couple hours of muskie fishing in, and I didn’t catch any.  I did have one follow from a small fish, probably around 25″, and I didn’t get a great look, so it may have even been an aggressive northern pike.  However, I did see a different muskie that took my breath away.  It wasn’t following my lure, because it came from a different direction than I had been casting.  It came in slowly and lazily and then gulped some air before slowly swimming off.  I’ve read that muskies sometimes will gulp air to aid in digestion, so I assume it had just finished a large meal.  Since it was moving so slowly and came completely out of the water, I got a really good look at it.  It had a large gash on its back right above the dorsal fin that must have been a result of the recent spawn.  But its most distinct feature was its size.  It was unquestionably the largest muskie I have ever seen.  Not having seen many fish that big, it’s hard to give a really good estimate, but it must have been a mid-50″ fish.  I wouldn’t be surprised in the least if it was a full 60″.  You don’t get to see many fish that size, and I felt very blessed to get such a great look at it.

Data pump is an effective way to move data in and out of an Oracle database instance.  It can easily be used to move data from one instance to another by first exporting the data from instance A and then importing it into instance B.  It can also be an effective means of taking a snapshot of the database for simple backup purposes.  I was recently tasked with recovering data in a particular table in an Oracle database and was able to get that data by loading a nightly export.  In my case, the export had been created with the old, deprecated exp and imp utilities rather than the newer data pump expdp and impdp.  The command used to export the data was:

20160607 data pump export

You can see the command run is exp.  The next section, f21/$my_f21pwd@my_sid, is the login to the database.  The file=$exportfile parameter uses a variable containing the file name, which includes the day of the week; this was defined earlier in the shell script.  The script was set up to do a nightly export and replace the current export for that day of the week, so you always have a rolling seven days of export “backups”.  The buffer parameter specifies the size, in bytes, of the buffer used to fetch rows; bigger buffer settings perform better.  Full=y indicates that the entire database will be exported.  Consistent=y makes sure the export is taken from a consistent state, which keeps relationships between tables consistent and makes the data usable.  Lastly, the log parameter specifies a file to hold the log of what was done.  It is helpful to have a log for each import and export so you can see what was run and view any errors or notices that may arise.
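Pieced together from the parameters described above, the export command looked something like this; the buffer size and log file variable here are illustrative rather than the actual values from the script:

exp f21/$my_f21pwd@my_sid file=$exportfile buffer=10000000 full=y consistent=y log=$exportlog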

Next came time to import a specific table from this export file.  I needed a table called FMSH_IMAGE, and I wanted to put it into my own user’s schema.  From there I could grab only the specific column I wanted, using an update statement against the table in the real schema.  I ran this:

20160607 data pump import

You can see the command run is imp.  You can’t use the new data pump impdp with a file exported by the old exp.  The rest is pretty self-explanatory.  I include a username and password to access the database.  I specify the file to use, this one from the most recent Thursday.  I specify a log file to put any messages into.  The last three parameters are a little more interesting.  Fromuser tells the program which schema was exported.  Touser tells the program which schema I want the import to go into.  These are important because I didn’t want to just overwrite the FMSH_IMAGE table in the source schema – that would get rid of any newly imported data.  I only needed to update a single column with the data from the previous day.  Lastly, the tables parameter let me specify that I only wanted to import the single table.
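Here is a sketch of what that import looks like; the dump file name, password, and target schema are placeholders rather than the exact values from my run:

imp dgugg/mypassword@my_sid file=expThu.dmp log=imp_fmsh_image.log fromuser=f21 touser=dgugg tables=FMSH_IMAGE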

Using this example, you can see Oracle’s import and export programs are very useful for getting data into and out of the database.  Be sure to check out the Oracle documentation on the newer impdp and expdp programs, as these have different parameters and improved functionality.

Oracle data pump import

Oracle data pump export