Import XML into SQL Server

One thing any responsible fisherman needs is a pair of pliers.  You may need different sizes for different types of fish.  For panfish, something small and skinny is going to help get into their small mouths to extract deep hooks.  For muskie, you’re going to want an extra-long pair to get to the hooks while keeping your hands away from those vicious teeth.  I’ve found that Northern Pike are particularly good at swallowing hooks into the back of their throats.  The last thing to remember is that a cut in the gills is almost always a death sentence for the fish.  Since the gills are what re-oxygenate the blood, there is a high volume of blood flow there, and even a small cut can cause a fish to bleed out in minutes.  If you aren’t confident you can remove a deep hook without damaging the fish, it’s usually best to just cut the line as close to the hook as possible.  Studies have shown that fish are adept at shaking hooks off once they are free of the fisherman.

[Image: northern pike]

I was recently provided a file that I needed to extract a bunch of data from.  The problem was that the file was XML, which is not the easiest format to import into SQL Server.  I took a two-step approach to solve this problem.  The first step was to import the data into SQL Server as a blob:

-- Staging table to hold each imported file as a single XML value
CREATE TABLE dbo.MarketplaceDecemberXML
(
    id INT IDENTITY PRIMARY KEY
    , XMLData XML
    , LoadedDate DATETIME
);

-- Read the whole file as one blob and convert it to the XML type
INSERT INTO dbo.MarketplaceDecemberXML
    ( XMLData
    , LoadedDate
    )
SELECT  CONVERT(XML , BulkColumn) AS BulkColumn
    , GETDATE()
FROM    OPENROWSET(BULK '\\ServerName\DataImport\All Orders.txt' , SINGLE_BLOB) AS oro;

The next step is to pull out any data that is needed.  There are a few ways to do this, but the easiest for me was to use the OPENXML function.  This function required me to load the XML data into an XML variable; I could then read different data based on the depth at which I did the initial opening.

DECLARE @XML AS XML
    , @hDoc AS INT;

-- Load the stored XML into a variable and parse it into an internal document handle
-- (assumes a single row in the staging table)
SELECT  @XML = adx.XMLData
FROM    dbo.MarketplaceDecemberXML adx;
EXEC sys.sp_xml_preparedocument @hDoc OUTPUT , @XML;

-- Open the rowset at the Address node; ../ in the column patterns climbs back up the tree
SELECT  VendorOrderID
    , FulfillmentChannel
    , PurchaseDate
    , CustState
    , OrderStatus
FROM    OPENXML(@hDoc , 'VendorEnvelope/Message/Order/FulfillmentData/Address')
WITH
(
    VendorOrderID VARCHAR(100) '../../VendorOrderID',
    OrderStatus VARCHAR(100) '../../OrderStatus',
    FulfillmentChannel VARCHAR(100) '../FulfillmentChannel',
    PurchaseDate DATETIME '../../PurchaseDate',
    CustState VARCHAR(100) 'State'
);

EXEC sys.sp_xml_removedocument @hDoc;
GO

Here is a screenshot of some of the data I was working with:

[Screenshot: sample of the data extracted from the XML]
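
Based on the XPath patterns in the query above, the shape of the file would have been roughly the following (element names come from the query; the values here are invented for illustration):

<VendorEnvelope>
  <Message>
    <Order>
      <VendorOrderID>123-4567890</VendorOrderID>
      <OrderStatus>Shipped</OrderStatus>
      <PurchaseDate>2016-12-01T00:00:00</PurchaseDate>
      <FulfillmentData>
        <FulfillmentChannel>Merchant</FulfillmentChannel>
        <Address>
          <State>WI</State>
        </Address>
      </FulfillmentData>
    </Order>
  </Message>
</VendorEnvelope>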

The important thing to note is that I used the ../ syntax to move up in the hierarchy.  Using this method I was able to extract all the data I needed and place it in a table where I could query it at will.

Experimenting with SQL Server Memory

In my “career” as a muskie fisherman, I’ve caught or seen fish in some very unlikely places. I’ve caught fish over very deep water with seemingly no nearby structure, as well as in water as shallow as just a few feet. I’ve had muskies strike or chase northern pike and walleye as I brought them into the boat. I’ve had bites when I was expecting nothing. The key to catching more fish is being willing to test new locations, techniques, and lures.

[Image: muskies in shallow water]

I recently wondered what would happen if I were forced to drastically reduce the memory available to SQL Server. Most DBAs know that if SQL Server is low on available memory and there is RAM to spare on the server, you can increase the memory SQL Server can use in the server properties. But what happens if you reduce the memory SQL Server can use below what it is currently using? I decided to try it out in our Test environment.

The initial maximum server memory setting was 7 GB:

[Screenshot: initial max server memory setting of 7 GB]
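
The same setting can be checked without the GUI by querying sys.configurations (a quick sketch, not the dialog in the screenshot; 7 GB shows up as 7168 MB):

-- Check both the configured value and the value currently in effect
SELECT  name
    , value          -- configured value
    , value_in_use   -- running value
FROM    sys.configurations
WHERE   name = 'max server memory (MB)';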

Looking at the SQL Server process in Windows Task Manager, I could see it was using all of that plus a bit:

[Screenshot: Task Manager showing SQL Server’s memory usage]
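
Task Manager isn’t the only way to see this; sys.dm_os_process_memory reports roughly the same figure from inside SQL Server (a sketch under the same setup):

-- Memory the SQL Server process currently has in use, in MB
SELECT  physical_memory_in_use_kb / 1024 AS physical_memory_in_use_mb
FROM    sys.dm_os_process_memory;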

I turned the max down to 4.5 GB:

[Screenshot: max server memory lowered to 4.5 GB]
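
I made the change through the server properties dialog, but the same change can be scripted with sp_configure (a sketch; 4608 MB is the 4.5 GB above):

-- max server memory is an advanced option, so expose it first
EXEC sys.sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Lower the cap below current usage and apply it immediately
EXEC sys.sp_configure 'max server memory (MB)', 4608;
RECONFIGURE;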

Looking at Windows Task Manager shortly after the change, I could see the amount of memory being used by SQL Server had been reduced as expected.

[Screenshot: Task Manager showing the reduced memory usage]

If I look at my Memory Usage tracking at that time, I can see the buffer pool was the area that was directly affected.  The buffer pool is where data pages in SQL Server are cached for quick reading and writing.

[Screenshot: memory usage tracking showing the drop in buffer pool size]
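
One way to watch this directly is to count the pages currently in the buffer pool; each page is 8 KB (a sketch, and note that scanning this DMV can itself take a while on a large instance):

-- Approximate buffer pool size, based on the number of cached 8 KB pages
SELECT  COUNT(*) * 8 / 1024 AS buffer_pool_mb
FROM    sys.dm_os_buffer_descriptors;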

So next time you need to adjust the max memory downward, be aware this will immediately reduce the number of pages in the buffer pool.

db_recovery_file_dest_size of N bytes is X% used and has R remaining bytes available

The boat launch can be a tough place, especially if you are a beginner on a busy lake.  People are anxious to get onto and off of the lake.  You don’t want people to be sitting there waiting on you, so the temptation is to go as fast as you can.  This can be a huge mistake, though.  Even experienced boaters can make rookie mistakes that are downright embarrassing in front of all the other boaters.  Personally, I’ve made two mistakes that were somewhat embarrassing.  My first wasn’t a big deal – I forgot to put in the boat plug.  I’d guess most boaters have made this mistake at one time or another.  Water started gushing into the boat.  Once I saw it I quickly jammed in the plug and started the bilge pump, which drained out the water in about five minutes.  No real harm done.  My other big mistake was while I was pulling off a river.  The launch was gravel and had no dock.  I walked out on the tongue of the trailer, but while I was adjusting the boat I lost my balance.  Unfortunately my shoe got stuck on a bracket, so instead of stepping down and getting my leg wet up to my knee, I fell back first and ended up totally submerged in 1 1/2 feet of water.  I’m the kind of guy who is able to laugh at my mistakes, so even though I ended up soaking wet and embarrassed, I still had a good laugh at my own expense.

A DBA can certainly see the lesson in this.  If you go too fast you make mistakes.  You need to be very careful and pay close attention to details.  One way to cover yourself is to maintain backups.  I recently ran into a warning in Oracle Enterprise Manager that said my recovery area was 97% used.  A quick search found that if the used space reached 100%, the database would cease to process transactions.

[Screenshot: OEM warning that the recovery area is 97% used]

I wanted to see within the database what my db_recovery_file_dest_size parameter was set to.  I ran a quick query and found that it was set to 350 GB.

[Screenshot: db_recovery_file_dest_size parameter set to 350 GB]
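
The query was along these lines (either form works from SQL*Plus):

-- Show the recovery area size limit
SHOW PARAMETER db_recovery_file_dest_size;

SELECT  name , value
FROM    v$parameter
WHERE   name = 'db_recovery_file_dest_size';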

Next I queried the v$recovery_file_dest view to see the total space, used space, and the directory where I was storing the backupsets.

[Screenshot: v$recovery_file_dest showing space limit, space used, and location]
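
A query along these lines returns those figures (the space columns are reported in bytes):

-- Recovery area location, size limit, usage, and what could be reclaimed
SELECT  name
    , space_limit
    , space_used
    , space_reclaimable
    , number_of_files
FROM    v$recovery_file_dest;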

I checked the directory and found it quite full of backup files.

[Screenshot: recovery area directory full of backup files]

My next step was to attempt to delete any expired archivelogs.  Unfortunately, this didn’t do anything:

[Screenshot: DELETE EXPIRED output showing nothing was deleted]
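
The attempt looked something like this in RMAN.  DELETE EXPIRED only removes archivelogs that a CROSSCHECK has found missing from disk; since all of mine were still present, there was nothing for it to do:

# Flag archivelogs missing from disk as EXPIRED, then delete those entries
CROSSCHECK ARCHIVELOG ALL;
DELETE EXPIRED ARCHIVELOG ALL;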

I needed to first back up the archivelogs before I could delete any of them, so I ran the following command in RMAN:

[Screenshot: RMAN BACKUP ARCHIVELOG ALL DELETE INPUT output]
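
For reference, the command was:

# Back up every archivelog, then remove the originals from the recovery area
BACKUP ARCHIVELOG ALL DELETE INPUT;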

This ran for a VERY long time, but eventually succeeded, and it fixed my problem.  After running this command I checked my free space and found I’d gained plenty:

[Screenshot: recovery area showing plenty of free space after the cleanup]

Taking backups is not enough – you need to be aware of what is happening to the backups.  Testing recoverability is also a great idea!