Moving UNDO and TEMP files in Oracle

This summer I had the pleasure of reading a couple of really good muskie fishing books. The first was Musky Strategy by row-trolling legend Tom Gelb. There were a number of cool things in this book, but the main thing that struck me was how scientific Tom’s approach to muskie fishing is. He went so far as to test the depth his lures ran while trolling by rowing parallel to a shoreline over and over, from deep to shallow, to see when the lure started contacting the bottom. The book was a fun read, and showed how you can succeed without all the modern electronic tools as long as you’re willing to put in the time. The second was Time on the Water by Bill Gardner. This book is a narrative rather than a nonfiction tactics manual. It tells the story of one man’s quest to catch a big muskie while fishing an entire season in Northern Wisconsin. It was a fun read, and the main takeaway was just how difficult it is to catch a big muskie, even when fishing for them every day. If you like fishing, check out the books!
Loss of free space on a disk is something that we as DBAs are always dealing with. Sometimes we can ask our SAN admin for more space, but in many cases we cannot. In those cases, moving database files from one logical drive to another can answer the challenge. Note that care should be taken when considering moving database files: drives can have different speeds, and there may be a reason a database file is on one drive and not another.
I recently needed to move a couple of Oracle system tablespace files (UNDO and TEMP) from the C: drive (where they never should have been put in the first place) to a different drive.
[Image: 20171215 OracleDB Drive Space]
The method for doing this is relatively simple. Create a second tablespace for each on the “other” drive, make it the default tablespace, and drop the original tablespace.
CREATE TEMPORARY TABLESPACE temp2 TEMPFILE 'D:\APP\ORACLE\ORADATA\LEGENDOPRD\TEMP03.DBF' size 30G AUTOEXTEND ON NEXT 1G MAXSIZE unlimited;
ALTER DATABASE DEFAULT TEMPORARY TABLESPACE temp2;
DROP TABLESPACE TEMP INCLUDING CONTENTS AND DATAFILES;

CREATE UNDO TABLESPACE undotbs2 datafile 'D:\APP\ORACLE\ORADATA\LEGENDOPRD\UNDO02.DBF' size 20G;
ALTER SYSTEM SET undo_tablespace=undotbs2;
DROP TABLESPACE UNDOTBS1 INCLUDING CONTENTS;
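
At any point during this process, you can confirm which files belong to which tablespace by checking the standard data dictionary views (adjust the tablespace names to match your environment):

SELECT tablespace_name, file_name FROM dba_temp_files;

SELECT tablespace_name, file_name
FROM dba_data_files
WHERE tablespace_name LIKE 'UNDO%';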

I found that after doing this I still needed to delete the old files from the OS folder, and I couldn’t do that until the Oracle database service had been recycled. After that, though, my drive space was much healthier.

Custom Roles in SSRS

Our world continues to drift into virtual reality. I don’t mean actual virtual reality, of course, but rather that we are growing further and further away from the great outdoors and closer and closer to spending all our time immersed in technology. A Nielsen Company audience report from 2016 indicated that adults in the US devoted about 10 hours and 39 minutes each day to consuming media (tablets, smartphones, computers, TVs, etc.). A BBC News article reports that kids aged 5–16 spend an average of 6.5 hours each day in front of a screen, compared to just three hours back in 1995. This trend is scary for a number of reasons, but it is clear that many people, especially kids, are missing out on the great outdoors. There is nothing in the virtual world that can compare to feeling a big fish tugging on the end of your line, or sitting in a tree stand as a deer stealthily approaches, or even just taking a walk around a lake and enjoying the fresh air and the beautiful view. It’s up to us adults to teach our children the joy of outdoor sports.
Somehow, up until this point in my DBA career, I’ve never had to mess with custom SSRS permissions. SSRS installs with five permission roles by default.
[Image: 20171205 Default Roles]
As such, I was completely unaware that you can create your own custom roles. As far as I could find in my SSRS 2008 R2 version, you can’t do this from Report Manager; I had to connect to the Report Server with SSMS. As a side note, logging in with Windows Authentication using localhost as the server name didn’t work. I actually had to enter http://servername/reportserver to use Windows Authentication, even when connecting from the server itself.
Once I had successfully logged in, I was able to add a new role.
[Image: 20171205 Add New Role]
I wanted to create a role that would allow a user to manage subscriptions for others. The only built-in role with that permission is Content Manager, but that role also has several other permissions that I didn’t want to confer on my target user. Remember, a user can have multiple roles, so there is no reason not to get as granular as necessary. Below are the tasks available for assignment to custom roles:
[Image: 20171205 Subscription Manager]
Similarly, System Roles can also be customized. By default there is only System Administrator and System User. Here are the options for creating a new System Role.
[Image: 20171205 Job Manager]
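
As a side note, if you ever want a quick inventory of the roles defined on an instance, you can query the ReportServer catalog database directly. This is just a read-only peek (assuming the default catalog database name), since the catalog is not meant to be modified by hand:

SELECT RoleName, Description
FROM ReportServer.dbo.Roles
ORDER BY RoleName;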
Using custom roles can make your job of managing the permissions on Reporting Services easier and more precise.

Vendor Woes

The season has gotten late. I don’t have any recent fishing stories to share; I’ve spent most of this fall hunting instead. I’ll have to take the boat back down to the lake one more time to winterize the motor, but then it will be moved into the garage to await the thaw in spring. It’s always sad when the fishing season ends, but this year I’ve had so much fun bow hunting that I haven’t even noticed.

[Image: Two does within range while hunting]

The lesson from this blog post will not be technical. Rather, it will be about setting proper expectations for your vendors. This is probably most important for DBAs considering cloud infrastructure.
The company I currently work for uses a third-party cloud platform for our website hosting. We have no retail stores, and over 90% of our business comes from internet sales. Additionally, we are very seasonal, doing well over half our sales in the final two months of the calendar year. We spend the other ten months preparing for this busy sales period. Any technical issues during November or December can have catastrophic consequences for our bottom line. As you can imagine, it is extremely important that our website remain functional from late fall through winter.
I’m sure you can see where I’m going with this. Our website hosting provider, a very well-known company, has been abysmal so far this fall. We’ve had several unexpected outages, several scheduled outages, and very little explanation for any of it. We’ve had periods where our website has been unavailable for over 12 hours at a time. Our management has grown increasingly frustrated with this third-party provider and, to a lesser degree, with the IT department as well. In examining our contract, we were shocked to find no promise of uptime. I’m not sure how this was missed when we signed up for the service.
We will most likely seek to end our relationship with this company before our contract expires. Unfortunately, there is no time to procure another solution before the end of the year, so we’re just going to have to live with the results. We may also be sued for breach of contract for seeking an early release, but with all the problems we’ve had, I’m not sure that will happen.
Either way, the lesson here is to thoroughly investigate any company you will be depending on. If you are looking at cloud services, find out ahead of time what the expectations are for uptime and maintenance outages. I would also be ready to impose financial penalties on the vendor if those expectations are not met.

Inserting into SQL Server from Powershell using parameters

Sometimes fishing, especially muskie fishing, just doesn’t go your way. I recently took a trip to Northern Wisconsin to do some hunting and fishing. I spent five days beating the water to a froth with my lures, but ended up seeing only a single follow. When you’re faced with that level of failure, it’s common to search everywhere for reasons. Was the weather too warm? Was I fishing in the wrong spots? Was I using the wrong types of lures? Was I fishing too fast? Was I fishing too slow? Am I just a bad fisherman? Muskie fishing is all about developing patterns, but it’s awfully tough to find a pattern when you aren’t seeing any fish. I’m going to chalk my failure on this trip up to poor weather. Although it’s now fall, the temperatures have been setting record highs. I’m thinking the fish are just waiting for the water to start cooling so they can binge feed and put on the winter fat. But who knows? Maybe I am just a bad fisherman.

[Image: Bad Weather]

Bad weather chased me off the water after just two hours of fishing on one day.

I recently constructed an IT dashboard. This dashboard, built in SSRS, compiled data from several sources into a SQL Server database where it could be quickly retrieved by SSRS. The part of this task that was new for me was grabbing performance counters from an array of remote servers and inserting them into a SQL Server table. I was able to make use of a PowerShell System.Data.SqlClient.SqlCommand for this. Below I’ll show how I did it.
First I need a table to hold the data. For this specific set of metrics I’m going to be collecting memory used by the servers.

CREATE TABLE [dbo].[IT_DB_Memory](
   [ServerName] [VARCHAR](255) NOT NULL,
   [MemoryUsed_GB] [DECIMAL](5, 2) NOT NULL,
   [CaptureDateTime] [DATETIME2](7) NOT NULL,
 CONSTRAINT [PK_IT_DB_Memory] PRIMARY KEY CLUSTERED 
(
   [ServerName] ASC,
   [CaptureDateTime] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

Now that I have a table, I will build the PowerShell script. This is going to run against multiple servers, so first I need to set up an array with the list of servers I plan to survey.

$serverarray = @('Muskie', 'NorthernPike', 'Walleye')

That’s right, I name my servers after fish. Next I’ll create a connection to SQL Server.

$sqlConn = New-Object System.Data.SqlClient.SqlConnection
$sqlConn.ConnectionString = "Server=SmallmouthBass;Database=DBAReporting;Integrated Security=True;"
$sqlConn.Open()

Now I create the command that will be run. Note the SQL Server parameters, distinguished by the @ prefix.

$sqlCmnd = New-Object System.Data.SqlClient.SqlCommand
$sqlCmnd.Connection = $sqlConn
$sqlCmnd.CommandText = "
SET NOCOUNT ON;

INSERT INTO DBAReporting.dbo.IT_DB_Memory
    ( ServerName
    , MemoryUsed_GB
    , CaptureDateTime )
VALUES
    ( @ServerName
    , @MemoryUsed / 1073741824.0 -- bytes to GB
    , GETDATE());"

Next I’ll actually create those parameters in the PowerShell SQL command. Note that I’m declaring @MemoryUsed as a float rather than a decimal: the counter returns raw bytes, a value far too large for DECIMAL(5,2), and the conversion to gigabytes happens inside the INSERT statement.

$sqlCmnd.Parameters.Add((New-Object Data.SqlClient.SqlParameter("@ServerName", [Data.SqlDbType]::VarChar, 255))) | Out-Null
$sqlCmnd.Parameters.Add((New-Object Data.SqlClient.SqlParameter("@MemoryUsed", [Data.SqlDbType]::Float))) | Out-Null

This next step is what does the actual work. I’ll loop through the array and use the Get-Counter cmdlet to get the memory used on each server. The way I have it set up, it will sample the memory five times, once per second, and then return the average of those five samples.

foreach ($server in $serverarray) {
    $sqlCmnd.Parameters[0].Value = $server
    $Memory = Get-Counter -Counter "\Memory\Committed Bytes" -SampleInterval 1 -MaxSamples 5 -ComputerName $server |
        Select-Object -ExpandProperty CounterSamples | Select-Object -ExpandProperty CookedValue | Measure-Object -Average
    $sqlCmnd.Parameters[1].Value = $Memory.Average
    $sqlCmnd.ExecuteNonQuery() | Out-Null
}

The last step in PowerShell is simply to close the database connection.

$sqlConn.Close()

Now I can set this to run on a regular basis using Windows Task Scheduler, and I’ll have a history of how my application servers are using memory throughout the day.
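
For the dashboard itself, a simple aggregation over this table is all SSRS needs. As a sketch (using the table created above), this query averages memory use per server per hour:

SELECT ServerName
     , DATEADD(HOUR, DATEDIFF(HOUR, 0, CaptureDateTime), 0) AS CaptureHour
     , AVG(MemoryUsed_GB) AS AvgMemoryUsed_GB
FROM dbo.IT_DB_Memory
GROUP BY ServerName
       , DATEADD(HOUR, DATEDIFF(HOUR, 0, CaptureDateTime), 0)
ORDER BY ServerName
       , CaptureHour;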

Tally Tables

Occasionally I like to take a break from Muskie fishing and spend time catching some easier species. This is especially true when I’m taking friends out fishing. Not many people like to cast for hours with only a few follows to show for it. Last month I took my brother Steve out onto a smaller lake about five minutes from my house. This lake is overrun with invasive weeds, and I tend to think of it as a garbage lake. However, we had a great time catching fish. My brother caught several bass and a bonus walleye, while I managed this fat 30″ pike. The pike took a good 5 minutes to get in the boat since I was using fairly light tackle and we had no net.

[Image: Little Cedar northern pike]

SQL is a set-based language. It is built on the idea that the engine will handle any looping in the background, without the author needing to specify how to loop. There are a few rare exceptions, but if you are writing a loop in SQL, you are usually either doing something wrong or doing it much less efficiently than you could. One great way to avoid loops is a Tally Table. Popularized by SQL Server legend Jeff Moden in 2008, the Tally Table is simply a table with a single, well-indexed column of sequential numbers.
If you’re a programmer or developer, you’re probably going to think of something like this to build a Tally Table:

--Create the Tally Table
CREATE TABLE #Tally
(
    N INT
  , CONSTRAINT PK_Tally_N
        PRIMARY KEY CLUSTERED (N)
);

--Set up an increment counter
DECLARE @TallyCounter INT;
SET @TallyCounter = 1;

--Fill the Tally Table with a Loop
WHILE @TallyCounter <= 11000
BEGIN
    INSERT INTO #Tally
    (
        N
    )
    VALUES (@TallyCounter);

    SET @TallyCounter = @TallyCounter + 1;
END;

Running on my server, this code took an average of 432 milliseconds, requiring 22,426 reads and 407 ms of CPU time. A more efficient way to generate the table looks like this:

--Create and populate table
SELECT TOP 11000
    IDENTITY(INT, 1, 1) AS N
INTO #Tally
FROM MASTER.sys.syscolumns sc1
   , MASTER.sys.syscolumns sc2;

--Add Primary Key Clustered
ALTER TABLE #Tally
ADD CONSTRAINT PK_Tally_N 
PRIMARY KEY CLUSTERED (N) WITH FILLFACTOR = 100;

This took me only 73 milliseconds to run, and required only 885 reads and 78 ms of CPU.
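
If you want to reproduce these measurements on your own server, the simplest approach is to wrap each version with the standard statistics switches (Profiler or Extended Events would also work):

SET STATISTICS IO ON;
SET STATISTICS TIME ON;

--Run either version of the tally table population here

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;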
In Oracle this is even easier to create:

CREATE TABLE tempTally AS
SELECT LEVEL AS N
FROM DUAL
CONNECT BY LEVEL <= 11000
ORDER BY LEVEL;

So now we’ve got a table full of sequential numbers from 1 to 11,000. What can we use this for?
From a programmer or developer perspective, loops are often used with strings. Let’s say we want to step through and display each character in a string. With a loop, you’d do something like this:

DECLARE @StepThroughMe VARCHAR(100), @i INT;
SELECT @StepThroughMe = 'Looping through this string is a waste of time.', @i = 1;

WHILE @i <= LEN(@StepThroughMe)
BEGIN
    SELECT @i, SUBSTRING(@StepThroughMe, @i, 1);
    SELECT @i = @i + 1;
END;

Using a Tally table, you can do it in a way that is simpler to write and runs in less than a tenth of the time:

DECLARE @TallyThroughMe VARCHAR(100);
SELECT @TallyThroughMe = 'Using a Tally Table is an efficient use of time.';

SELECT t.N, SUBSTRING(@TallyThroughMe, t.N, 1)
FROM #Tally AS t
WHERE t.N <= LEN(@TallyThroughMe);

One other way I used this was to create the Date dimension table in my data warehouse.

WITH cte
AS (SELECT DATEADD(DAY, N - 1, '2000-01-01') AS Date
    FROM #Tally
   )
SELECT YEAR(cte.Date) * 10000 + MONTH(cte.Date) * 100 + DAY(cte.Date) AS DateKey
     , cte.Date
     , YEAR(cte.Date) AS YEAR
     , DATEPART(QUARTER, cte.Date) AS Quarter
     , MONTH(cte.Date) AS MONTH
     , RIGHT('0' + CAST(MONTH(cte.Date) AS VARCHAR(2)), 2) + '. ' + DATENAME(MONTH, cte.Date) AS MonthName
     , DATEPART(ww, cte.Date) + 1 - DATEPART(ww, CAST(DATEPART(mm, cte.Date) AS VARCHAR) + '/1/' + CAST(DATEPART(yy, cte.Date) AS VARCHAR)) AS WeekOfMonth
     , DATEPART(wk, cte.Date) AS WeekOfYear
     , DATEPART(dw, cte.Date) AS DayOfWeek
     , RIGHT('0' + CAST(DATEPART(dw, cte.Date) AS VARCHAR(2)), 2) + '. ' + DATENAME(dw, cte.Date) AS DayOfWeekName
     , DAY(cte.Date) AS DayOfMonth
     , DATEPART(DAYOFYEAR, cte.Date) AS DayOfYear
     , CASE
           WHEN DATEPART(QUARTER, cte.Date) IN ( 1, 2 ) THEN
               'Spring'
           ELSE
               'Fall'
       END AS RetailSeason
FROM cte;

This worked for loading my permanent table, but you could also use it to load a temp table or table variable that can be joined to a data set to produce a full range of dates, even when the data set is missing rows for some of them.
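
As a sketch of that idea (the dbo.Sales table and its columns here are hypothetical), a calendar built from the tally table can be left-joined to the data so that dates with no sales still show up:

WITH Calendar
AS (SELECT DATEADD(DAY, N - 1, '2017-01-01') AS SaleDate
    FROM #Tally
    WHERE N <= 365
   )
SELECT c.SaleDate
     , ISNULL(SUM(s.Amount), 0) AS TotalSales
FROM Calendar AS c
LEFT JOIN dbo.Sales AS s
    ON s.SaleDate = c.SaleDate
GROUP BY c.SaleDate
ORDER BY c.SaleDate;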

Tally tables can be used to improve performance in a number of different scenarios. Next time you’re not sure whether you may need a loop, stop and consider whether your situation may benefit from a Tally Table.

SQL Agent Properties

It’s been a rather windy summer thus far, making it less fun to be out on the water.  I don’t know many guys who enjoyed being blasted by the wind while bobbing up and down on the waves for hours on end.  I went out on Pewaukee Lake a few weeks ago with a buddy from work.  We had picked the day in advance since it was supposed to be dry and calm.  We got the dry, but not the calm.  We had a stiff wind blowing out of the west that drove us back into the western half of the lake after trying to fish the narrows in the middle.

I spent the day focusing my fishing efforts on making the bait look real.  I tried hard to avoid retrieving the lure in a rhythmic fashion.  It paid off with a nice upper-30s muskie:

[Image: June 2017 Pewaukee muskie]

My fishing buddy hooked one a short time later, but couldn’t keep it pinned and we lost it.

Recently, I blogged about migrating the SQL Server installation onto a different drive.  I did find one problem after this move that I had to address.  If I remember correctly, it was actually an SSRS subscription that failed, and I needed details to find out why.  It turns out the SQL Agent has its own set of properties, and its error log was still pointing back at the previous location on the C: drive, which no longer existed.  There is a stored procedure you can execute to see those properties, in addition to looking at them in the SSMS UI:

EXEC msdb..sp_get_sqlagent_properties

[Image: 20170712 SQL Agent Properties]

Lastly, just update the value with the corresponding SET stored procedure and restart the SQL Agent:

EXEC msdb..sp_set_sqlagent_properties 
   @errorlog_file = N'D:\SQLSERVERINSTALL\SQLAGENTERRORLOG.log'

Now your SQL Agent properties have been updated.

Addressing login trigger failures in SQL Server

As I get older I have come to enjoy watching others fish, especially my children.  The thrill of catching a big fish is magnified by seeing the smile on someone else’s face when he or she is the one bringing it in.  Below is a nice-sized largemouth bass my son caught on a recent fishing trip.

[Image: Two Sisters largemouth bass]

In my previous post I showed how to create a login trigger to log sysadmin access to a SQL Server instance.  Almost immediately I received a comment describing how the failure of the trigger could almost completely prevent anyone from logging into the instance.  This is a major problem!

The reason this occurs makes sense if you think about it.  While attempting to log in, the user executes the code in the trigger.  If that code is invalid, the trigger will fail and abort.  When that happens, the login aborts as well.  What could cause the trigger to fail?  Well, if the table (or any other object) accessed within the trigger is inaccessible to the user, or if it doesn’t even exist, the trigger will fail.

I tested this by using my working trigger, which logged sysadmin logins to a table called dbo.sysadminLogging.  Next I renamed the table to dbo.sysadminLogging1.

[Image: 20170606 Renamed table]

Next I tried to log in from a new window in SSMS:

[Image: 20170606 Failed login]

First, let’s talk about how to get back into a server that has this issue.  We need to log into SQL Server using SQLCMD over the dedicated administrator connection (DAC), then disable the trigger:

[Image: 20170606 Disable trigger]
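
The screenshot shows the commands; in essence they look like this (a sketch, where the -A switch requests the dedicated administrator connection and the trigger name matches the one defined below):

sqlcmd -S YourServerName -A
DISABLE TRIGGER [servertrigger_CheckForSysAdminLogin] ON ALL SERVER;
GO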

After doing this everyone should now be able to log back into SQL Server as normal.

Now to prevent this type of event from happening, I suggest a small edit to my original trigger.  This edit will make sure the referenced objects are valid.  If not, the trigger does nothing.  It may also be a good idea to send an email to the DBA so they can investigate, and I’ve noted that in the comments.

CREATE TRIGGER [servertrigger_CheckForSysAdminLogin] ON ALL SERVER
    FOR LOGON
AS
    BEGIN
        IF OBJECT_ID('DBMaint.dbo.sysadminLogging') IS NULL
            BEGIN
                --Possibly send an email to the DBA, indicating the trigger is not working as expected
                GOTO Abort; --Do nothing
            END;

        IF IS_SRVROLEMEMBER('sysadmin') = 1
            BEGIN
                INSERT  INTO DBMaint.dbo.sysadminLogging
                        ( [Login] , LoginDate )
                VALUES  ( ORIGINAL_LOGIN() , GETDATE() );
            END;

Abort:

    END;

GO

This newer version of the trigger should cut down on the chances that this functionality will come back to bite you. Special thanks to james youkhanis for pointing this out.