Check Old Database for Page Verification Method – Torn Page versus Checksum

I always keep an eye out for used gear that I can pick up at a good price. Muskie fishing is an expensive sport, and having top of the line, brand new gear isn’t always necessary. I watch Craigslist and keep my eyes open at rummage sales to try to find lures, rods/reels, nets, and anything else that may be able to help me catch fish. One important thing to remember whenever buying used gear is to tune it up. If they are lures, sharpen the hooks and make sure they run straight. If it’s a reel, lube the applicable parts.
One important routine in SQL Server is DBCC CHECKDB. This command runs a consistency check on the database by examining each page for inconsistencies. Starting in SQL Server 2005, the database engine can use a CHECKSUM for this check. Once CHECKSUM page verification is enabled on a database (it is the default for databases created in SQL Server 2005 and above), every time a page is written to disk a checksum is calculated over the contents of the whole page and placed in the page header. When a page is read from disk, and when DBCC CHECKDB is run, the checksum is recalculated and compared to the value stored in the page header. If the values don’t match, you’ve got an indication of data corruption, most likely caused by the storage subsystem.
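
For reference, a basic consistency check can be run like this (a minimal sketch using the DB_Test database from the example further down; the WITH options are common choices, not requirements):

DBCC CHECKDB (N'DB_Test') WITH NO_INFOMSGS, ALL_ERRORMSGS;
GO

If the check finds a page whose recalculated checksum no longer matches the stored value, it will flag the damaged page as an error.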

This is not good.

There are a few things to keep in mind regarding this consistency check. First, databases created in SQL Server 2000 or earlier will be using torn-page detection for verification instead of CHECKSUM. Even after such a database is upgraded to SQL Server 2005 or later, you have to manually go into the database settings and change the page verify option to CHECKSUM; the upgrade will not do it for you. This is important because CHECKSUM is far better at detecting inconsistencies than torn-page detection. Next, the checksum is only added to a page when that page is written to disk. This means existing pages that haven’t been rewritten will not have a checksum in the page header, and DBCC CHECKDB will not be able to use CHECKSUM to verify their integrity. A good way to get the checksum into every page header is to rebuild the table (clustered index or heap) and all the nonclustered indexes (a sketch follows the ALTER DATABASE example below). Lastly, DBCC CHECKDB only checks pages that are allocated. This means that if there is a problem with a page that is not being used, you will not see an error. This can be confusing if you run DBCC CHECKDB overnight and then re-index afterward, because the bad page may no longer be allocated by the time you are able to inspect the system. This doesn’t mean the I/O system fixed itself – you still have a problem that needs to be looked at.

USE [master]
GO
ALTER DATABASE [DB_Test] SET PAGE_VERIFY CHECKSUM  WITH NO_WAIT
GO
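
To stamp checksums onto existing pages after switching the setting, the data has to be rewritten. A minimal sketch, assuming a hypothetical table named dbo.SalesOrders in DB_Test (a heap with no clustered index would need ALTER TABLE ... REBUILD instead, on SQL Server 2008 and later):

USE [DB_Test]
GO
ALTER INDEX ALL ON dbo.SalesOrders REBUILD;
GO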

The takeaway from this is to check each database that may have been created in SQL Server 2000 or earlier to make sure CHECKSUM page verification is being used, and to make sure you are running DBCC CHECKDB on a regular basis.
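
A quick way to see which databases still need attention is to query sys.databases (SQL Server 2005 and later):

SELECT name, page_verify_option_desc
FROM sys.databases
WHERE page_verify_option_desc <> 'CHECKSUM';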

Much better!


The FIRST and LAST functions in Oracle – Using the KEEP Clause

This weekend I attended the Milwaukee Muskie Expo at State Fair Park. It’s a fun way to stay connected to muskie fishing during the winter months when we can’t fish for them (the muskie season in WI goes from May through December). Basically, there are a lot of lure manufacturers, rod/reel makers, boat dealers, taxidermists, and fishing guides that set up in booths to try and get you to consider their products. My favorites to look at are the Lax replicas and Fittante replicas – it’s great to see some big fish that other fishermen have caught. I also keep my eye out for new and exciting lures – muskie often get used to seeing the same things over and over, so it can help to give them something new. This year I picked out a new Bondy Bait, the Royal Orba. I’ve never thought Bondy Baits looked that great, but I know they’ve gotten results. This new one has a nice tail and trails double blades, so it should be something the fish are not used to seeing. The only sad thing is that I’ve got to wait until May to test it out.
Finding a guide can be a challenging experience. Most guides cost $200 – $400 a day, so there is pressure to find a good one who really knows his stuff. Word of mouth is the main way to find a reliable guide, but if you’re going to an area where you don’t know too many people, like on a vacation in another part of the state or country, you’re going to have to find another reference. It’d be helpful if we had an internet database where people could rate the guides based on how the trip went. If such a database existed, I’d use Oracle’s FIRST and LAST functions to find the right guide.
Oracle’s documentation describes the functions this way: “When you need a value from the first or last row of a sorted group, but the needed value is not the sort key, the FIRST and LAST functions eliminate the need for self-joins or views and enable better performance.” In other words, when you are selecting with an aggregate function and you want to return columns that go along with the aggregated row, but you don’t want those columns in the GROUP BY clause, the FIRST and LAST functions get you there in a single pass. As a very simple example, let’s say I have this table:

CREATE TABLE groupbyexample (lake VARCHAR2(25), rating INT, otherdata VARCHAR2(25));
INSERT ALL
  INTO groupbyexample (lake, rating, otherdata) VALUES ('Pewaukee',3,'get me')
  INTO groupbyexample (lake, rating, otherdata) VALUES ('Pewaukee',4,'this value')
  INTO groupbyexample (lake, rating, otherdata) VALUES ('Okauchee',3,'return')
  INTO groupbyexample (lake, rating, otherdata) VALUES ('Okauchee',4,'shallow')
  INTO groupbyexample (lake, rating, otherdata) VALUES ('Okauchee',5,'trophy')
SELECT * FROM dual;

Now I want to get the record with the highest rating for each lake. This is a very simple aggregate query:

SELECT lake, MAX(rating)
FROM groupbyexample
GROUP BY lake;

However, if I also want to display the otherdata field with my results, the common solution would be to join my result set back to the initial table like this:

SELECT g.lake,
  g.rating,
  g.otherdata
FROM groupbyexample g
INNER JOIN
  ( SELECT lake, MAX(rating) AS rating FROM groupbyexample GROUP BY lake
  ) ij
ON g.lake    = ij.lake
AND g.rating = ij.rating;

[Screenshot: results of the simple example queries]

This self-join is the standard way to do it in SQL Server 2008 R2 as well (later versions have their own versions of the FIRST and LAST functions). This example was pretty simple, so let’s look at a more complex example using fishing guides as the data set. In this example each guide is assigned to a certain region of Wisconsin and specializes in two fish, a primary and a secondary species. Each guide is also rated based on past user experience. These guide names are all made up, and any names that happen to really exist are coincidental.

CREATE TABLE FishSpecies
  (FishID INT, FishName VARCHAR2(50)
  );
CREATE TABLE Regions
  (RegionID INT, RegionName VARCHAR2(100)
  );
CREATE TABLE FishingGuides
  (
    GuideID          INT,
    GuideName        VARCHAR2(50),
    PrimarySpecies   INT,
    SecondarySpecies INT,
    Region           INT,
    Rating           NUMBER(3,0)
  );

INSERT ALL
  INTO FishSpecies (FishID, Fishname) VALUES (1,'Musky')
  INTO FishSpecies (FishID, Fishname) VALUES (2,'Largemouth Bass')
  INTO FishSpecies (FishID, Fishname) VALUES (3,'Smallmouth Bass')
  INTO FishSpecies (FishID, Fishname) VALUES (4,'Panfish')
  INTO FishSpecies (FishID, Fishname) VALUES (5,'Crappie')
  INTO FishSpecies (FishID, Fishname) VALUES (6,'Perch')
  INTO FishSpecies (FishID, Fishname) VALUES (7,'Walleye')
  INTO FishSpecies (FishID, Fishname) VALUES (8,'Northern Pike')
  INTO FishSpecies (FishID, Fishname) VALUES (9,'Steelhead')
  INTO FishSpecies (FishID, Fishname) VALUES (10,'Rainbow Trout')
  INTO FishSpecies (FishID, Fishname) VALUES (11,'Brown Trout')
  INTO FishSpecies (FishID, Fishname) VALUES (12,'Lake Trout')
SELECT * FROM DUAL;

INSERT ALL
  INTO Regions (RegionID, RegionName) VALUES (1,'Southeast WI')
  INTO Regions (RegionID, RegionName) VALUES (2,'Southwest WI')
  INTO Regions (RegionID, RegionName) VALUES (3,'Central WI')
  INTO Regions (RegionID, RegionName) VALUES (4,'Northeast WI')
  INTO Regions (RegionID, RegionName) VALUES (5,'Northwest WI')
  INTO Regions (RegionID, RegionName) VALUES (6,'Southern Lake Michigan')
  INTO Regions (RegionID, RegionName) VALUES (7,'Northern Lake Michigan/Green Bay')
  INTO Regions (RegionID, RegionName) VALUES (8, 'Lake Winnebago')
SELECT * FROM DUAL;  

INSERT ALL 
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (1,'Dave''s Muskie Guide Service', 1,8,1,5)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (2,'Madison Action', 2,8,1,4)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (3,'Big Fish Guide Service', 7,1,5,3)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (4,'Good Eatin''', 4,7,2,4)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (5,'Muskie Mike', 1,8,4,2)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (6,'WTD Inc.', 2,8,5,4)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (7,'Fred''s Fishing Charters', 7,3,3,5)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (8,'Joe Woz', 6,5,4,4)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (9,'Walt and Willy', 1,8,5,3)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (10,'Robo Fishing', 10,11,6,4)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (11,'The Wiz Fishery', 7,3,8,3)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (12,'H and H', 12,8,6,5)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (13,'Rhinelander''s Best', 3,8,4,4)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (14,'Iron Mountain Fishing', 5,4,5,3)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (15,'Milwaukee Mike''s', 7,3,1,2)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (16,'Eagle River Guide Service', 2,3,4,1)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (17,'Green Bay Charters', 11,9,7,4)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (18,'Hayward Trophy Fishing', 5,4,5,5)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (19,'Winnebago Fishing', 7,8,8,4)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (20,'Lake Michigan Fishing Charters', 9,12,6,4)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (21,'Tom at Lake Tom', 1,2,4,2)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (22,'HTG Fishing', 1,3,4,3)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (23,'Greg and Gavin', 7,1,4,4)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (24,'Captain Hook', 1,8,4,5)
  INTO FishingGuides (GuideID, GuideName, PrimarySpecies, SecondarySpecies, Region, Rating) VALUES (25,'The Better Netter', 2,1,4,4)
SELECT * FROM DUAL;

Here is what a select from each of the tables looks like:
[Screenshot: sample rows from FishSpecies, Regions, and FishingGuides]
Let’s say I want to get the highest rated guide in each region, regardless of the specialized fish type. Using the “old way”, my query would look like this:

WITH cte AS
  (SELECT r.RegionName,
    fg.Rating,
    f.Fishname  AS PrimarySpecies,
    f2.Fishname AS SecondarySpecies,
    fg.GuideName,
    row_number() OVER (partition BY fg.region ORDER BY fg.rating DESC) rn
  FROM FishingGuides fg
  INNER JOIN Regions r ON fg.Region = r.RegionID
  INNER JOIN FishSpecies f ON fg.PrimarySpecies = f.FishID
  INNER JOIN FishSpecies f2 ON fg.SecondarySpecies = f2.FishID
  )
SELECT * FROM cte WHERE rn=1 ORDER BY RegionName;

I use a windowing function to get a row number within each region, ordered by rating descending, then just look at the rows where the row number is 1. This is easier to understand than the self-join in my simple example, but it is still not as efficient as it could be. Using the LAST function, I can rewrite the query like this:

SELECT  r.RegionName, 
        MAX(fg.Rating) AS Rating, 
        MAX(f.Fishname) KEEP (DENSE_RANK LAST ORDER BY fg.rating, fg.PRIMARYSPECIES, fg.SECONDARYSPECIES) AS PrimarySpecies, 
        MAX(f2.Fishname) KEEP (DENSE_RANK LAST ORDER BY fg.rating, fg.PRIMARYSPECIES, fg.SECONDARYSPECIES) AS SecondarySpecies, 
        MAX(fg.GuideName) KEEP (DENSE_RANK LAST ORDER BY fg.rating, fg.PRIMARYSPECIES, fg.SECONDARYSPECIES) AS Guidename
FROM FishingGuides fg
INNER JOIN Regions r ON fg.Region = r.RegionID
INNER JOIN FishSpecies f ON fg.PrimarySpecies = f.FishID
INNER JOIN FishSpecies f2 ON fg.SecondarySpecies = f2.FishID
GROUP BY r.RegionName
ORDER BY r.Regionname;

[Screenshot: highest rated guide in each region]

Both queries return the exact same results. Let’s say I now want to find the top rated guide in each region who has musky as either their primary or secondary fish. I just add a WHERE clause to get this info:

--Highest rated musky guide in each region
SELECT  r.RegionName, 
        MAX(fg.Rating) AS Rating, 
        MAX(f.Fishname) KEEP (DENSE_RANK LAST ORDER BY fg.rating, fg.PRIMARYSPECIES, fg.SECONDARYSPECIES) AS PrimarySpecies, 
        MAX(f2.Fishname) KEEP (DENSE_RANK LAST ORDER BY fg.rating, fg.PRIMARYSPECIES, fg.SECONDARYSPECIES) AS SecondarySpecies, 
        MAX(fg.GuideName) KEEP (DENSE_RANK LAST ORDER BY fg.rating, fg.PRIMARYSPECIES, fg.SECONDARYSPECIES) AS Guidename
FROM FishingGuides fg
INNER JOIN Regions r ON fg.Region = r.RegionID
INNER JOIN FishSpecies f ON fg.PrimarySpecies = f.FishID
INNER JOIN FishSpecies f2 ON fg.SecondarySpecies = f2.FishID
WHERE fg.PRIMARYSPECIES = 1 OR fg.SECONDARYSPECIES = 1
GROUP BY r.RegionName
ORDER BY r.Regionname;

[Screenshot: highest rated musky guide in each region]

For my last example, let’s say we’re taking a trip up north and we want to find the highest rated guide for each primary species. From this I’ll decide which species to target and which guide to use. I’ll change the query to look only at northern regions, then change the GROUP BY field to the primary species:

SELECT  f.Fishname AS PrimarySpecies, 
        MAX(fg.Rating) AS Rating, 
        MAX(r.RegionName) KEEP (DENSE_RANK LAST ORDER BY fg.rating, fg.PRIMARYSPECIES, fg.SECONDARYSPECIES) AS Region,
        MAX(f2.Fishname) KEEP (DENSE_RANK LAST ORDER BY fg.rating, fg.PRIMARYSPECIES, fg.SECONDARYSPECIES) AS SecondarySpecies, 
        MAX(fg.GuideName) KEEP (DENSE_RANK LAST ORDER BY fg.rating, fg.PRIMARYSPECIES, fg.SECONDARYSPECIES) AS Guidename
FROM FishingGuides fg
INNER JOIN Regions r ON fg.Region = r.RegionID
INNER JOIN FishSpecies f ON fg.PrimarySpecies = f.FishID
INNER JOIN FishSpecies f2 ON fg.SecondarySpecies = f2.FishID
WHERE r.REGIONNAME LIKE '%North%'
GROUP BY f.Fishname
ORDER BY f.Fishname;

[Screenshot: highest rated guide by primary species in the northern regions]
Now I can see if I want to go for musky and northern, I can get a five star guide in Captain Hook. However, if I’d rather go for musky and walleye, I can book Greg and Gavin and go with a four star guide instead.

I hope this has helped explain how to use the FIRST and LAST functions available in Oracle. As I stated, SQL Server has similar functionality available starting in 2012 with the FIRST_VALUE and LAST_VALUE analytic functions.
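
For comparison, here is a rough sketch of how the top rated guide per region could be pulled in SQL Server 2012 or later with FIRST_VALUE. This assumes the same FishingGuides and Regions tables exist on the SQL Server side, and it is not a drop-in equivalent of the KEEP clause:

SELECT DISTINCT r.RegionName,
       FIRST_VALUE(fg.Rating)    OVER (PARTITION BY fg.Region ORDER BY fg.Rating DESC, fg.GuideID) AS Rating,
       FIRST_VALUE(fg.GuideName) OVER (PARTITION BY fg.Region ORDER BY fg.Rating DESC, fg.GuideID) AS GuideName
FROM FishingGuides fg
INNER JOIN Regions r ON fg.Region = r.RegionID;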

Send Database Mail in HTML Table Format from Oracle

As a fishing trip approaches, I like to plan out my expected strategy for my first day. I look at the expected weather (especially wind direction and speed), and try to decide where active fish may be biting. However, once I get out there plans can change very quickly. I remember a trip up north quite a few years ago where, once out on the water, I was facing 25 – 30 mph winds. I was also fishing in a rowboat, and was unable to fish where I wanted because I was drifting through the prime areas too quickly. I had to make an adjustment and focus on spots on the upwind side of the lake.
Database administrators also need to be able to make adjustments. It is fairly common to try something one way, find it doesn’t work well or doesn’t work as expected, and have to find another way. In my last post, I demonstrated how to use ListAgg to make a comma-delimited list in Oracle. I had planned to use this function to build a stored procedure that would take a query as its parameters and send an email in an HTML table format with the results of that query. Upon building the query, I found ListAgg has a severe limitation: it returns its results only as VARCHAR2. This means the result set, which is one long string of HTML, is limited to 4,000 bytes, the VARCHAR2 maximum in SQL. That may seem like a lot, but for an entire result set wrapped in HTML tags it often isn’t sufficient. Instead, I used some of Oracle’s XML functions to complete the procedure. First I’ll show the procedure and a sample of its use, then I’ll briefly describe the XML functions used.
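
To see the limitation in action, here is a contrived sketch that aggregates strings of HTML past the 4,000-byte mark; running something like this raises ORA-01489 (result of string concatenation is too long):

SELECT LISTAGG('<tr><td>' || RPAD('x', 3000, 'x') || '</td></tr>')
         WITHIN GROUP (ORDER BY LEVEL) AS html_rows
FROM dual
CONNECT BY LEVEL <= 5;
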
Here is the SQL for the stored procedure:

CREATE OR REPLACE PROCEDURE       SCOTT.SENDDBEMAILWITHTABULARQUERY 
(
  QSELECT IN VARCHAR2
, FIELDLIST IN VARCHAR2
, QFROM IN VARCHAR2
, QWHERE IN VARCHAR2 
, QGROUPBY IN VARCHAR2 
, QHAVING IN VARCHAR2 
, QORDERBY IN VARCHAR2 
, RECIPIENTS IN VARCHAR2 
, SUBJECT IN VARCHAR2 
, TITLE IN VARCHAR2 
, SENDEMAILWITHNORESULTS IN VARCHAR2 
) AS 

  XML CLOB;
  body CLOB;
  varsql      VARCHAR2(4000);
  tblfieldheader NVARCHAR2(2000) := '';
  tempfield NVARCHAR2(2000)      := '';
  i          INT                          := 1;
  j          INT                          := 1;
  SendEmail  INT;
  splitcnt   INT;
  fieldcount INT ;
TYPE AllFieldNames
IS
  TABLE OF VARCHAR2(2000);
  TempFields AllFieldNames;
BEGIN
  --Find the number of fields in the query
  splitcnt := LENGTH(fieldlist)-LENGTH(REPLACE(fieldlist,'|',''));
  --Split the pipe-delimited field list into the TempFields collection, one entry per field
  SELECT x.s BULK COLLECT
  INTO TempFields
  FROM
    (SELECT regexp_substr (fieldlist, '[^|]+', 1, rownum) s,
      rownum rn
    FROM dual
      CONNECT BY LEVEL <= LENGTH (regexp_replace (fieldlist, '[^|]+')) + 1
    ) x;
  --SELECT fieldcount = splitcnt + 1 --Will be the splitcnt + 1, otherwise MAX(ID) FROM TempFields
  --Start setting up the sql statement for the query.
  varsql := qSELECT || ' xmlagg(xmlelement("tr",xmlforest(';
  --Loop through the #Fields table to get the field list
  FOR i IN TempFields.first..TempFields.last
  LOOP
    --------------------------------------------------------------------------------------------------------------------------------------------------------------
    --This next section is required in case a field is aliased.  For the xml, we need to get rid of the aliases, the table header will only require the aliases.
    --NULL values need to be shown as a string = 'NULL' or the html table will just skip the cell and all values after that in the row will be shifted left.
    ---------------------------------------------------------------------------------------------------------------------------------------------------------------
    IF SUBSTR(TempFields(i),-1) = ']' OR INSTR(UPPER(TempFields(i)),' AS ') = 0 THEN
      --Set the xml field to be the entire field name
      varsql := varsql || 'NVL(CAST(' || TempFields(i) || ' AS VARCHAR2(2000)),''NULL'') as "td", ';
      --Set the table header field to be the entire field name
      tblfieldheader := tblfieldheader || '<th>' || TempFields(i) || '</th>';
    ELSE
      --Set the xml field to be the field name minus the alias
      varsql := varsql || 'NVL(CAST(' || SUBSTR(TempFields(i),1,(INSTR(UPPER(TempFields(i)),' AS ',1))-1) || ' AS VARCHAR2(2000)),''NULL'') as "td", ';
      --Set the table header field to be the field name's alias
      tblfieldheader := tblfieldheader || '<th>' || SUBSTR(TempFields(i),INSTR(UPPER(TempFields(i)),' AS ',-1)+4) || '</th>';
    END IF;
    --Increment the counter.
  END LOOP;
  --Trim the extra two characters of the end of sql.
  varsql := SUBSTR(varsql,1, LENGTH(varsql)-2);
  --Add the end of the table tag
  varsql := varsql || '))).GetClobVal() ';
  --Add the from, where, group by, having, and order by clause to the select statement.
  varsql := varsql || qFROM || ' ' || qWHERE || ' ' || qGroupBy || ' ' || qHaving || ' ' || qOrderBy;
  --Run the sql that will create the xml.
  EXECUTE IMMEDIATE varsql INTO XML;
  --Capture whether or not any rows were returned
  IF LENGTH(XML) > 0 THEN
    --Create the body of the email, which contains the xml results of the query.
    body := to_clob('<html><body><H3>') || Title || '</H3><table border = 1><tr>' || tblfieldheader || '</tr>' || XML || '</table></body></html>';
    --If rows were returned, send the email.
    sys.utl_mail.SEND( 'Oracledbmail@domain.com', recipients, NULL, NULL, subject, body, 'text/html', 3);
    --If no results, check whether user wants an email anyway
  ELSIF SendEmailWithNoResults != '0' THEN
    sys.utl_mail.SEND( 'Oracledbmail@domain.com', recipients,NULL, NULL, subject, 'Query returned no results', 'text/html', 3);
  END IF;
END SENDDBEMAILWITHTABULARQUERY;

To call the stored procedure, we’d use this syntax:

DECLARE
  QSELECT VARCHAR2(200);
  FIELDLIST VARCHAR2(200);
  QFROM VARCHAR2(200);
  QWHERE VARCHAR2(200);
  QGROUPBY VARCHAR2(200);
  QHAVING VARCHAR2(200);
  QORDERBY VARCHAR2(200);
  RECIPIENTS VARCHAR2(200);
  SUBJECT VARCHAR2(200);
  TITLE VARCHAR2(200);
  SENDEMAILWITHNORESULTS VARCHAR2(200);
BEGIN
  QSELECT := 'SELECT';
  FIELDLIST := 'emp.ename AS Employee|emp.job|emp.sal as Salary|dept.dname AS Department|dept.loc';
  QFROM := 'FROM scott.emp
            INNER JOIN scott.dept
            ON emp.deptno = dept.deptno';
  QWHERE := 'WHERE emp.sal > 1000';
  QGROUPBY := NULL;
  QHAVING := NULL;
  QORDERBY := NULL;
  RECIPIENTS := 'dgugg@domain.com';
  SUBJECT := 'Scott''s Employees';
  TITLE := 'Salary Greater Than 1,000';
  SENDEMAILWITHNORESULTS := '1';

  Scott.SENDDBEMAILWITHTABULARQUERY(
    QSELECT => QSELECT,
    FIELDLIST => FIELDLIST,
    QFROM => QFROM,
    QWHERE => QWHERE,
    QGROUPBY => QGROUPBY,
    QHAVING => QHAVING,
    QORDERBY => QORDERBY,
    RECIPIENTS => RECIPIENTS,
    SUBJECT => SUBJECT,
    TITLE => TITLE,
    SENDEMAILWITHNORESULTS => SENDEMAILWITHNORESULTS
  );
END;

Here’s how the resulting email looks:
[Screenshot: the received email with the query results in an HTML table]
A couple of things to be aware of:

  • You first need to configure database mail before you can send email from the database.  You can do this in Database/Enterprise manager by going to Setup -> Notification Methods

[Screenshot: configuring database mail under Setup -> Notification Methods]

  • You need to set up an Access Control List (ACL) so your user can reach the email server.  This can be done with the following SQL statements, run as SYS:
--Create a new ACL
BEGIN
  --Create the list
  DBMS_NETWORK_ACL_ADMIN.CREATE_ACL(
    acl         => 'utl_mail.xml',
    description => 'utl mail for Scott',
    principal   => 'Scott',
    is_grant    => true,
    privilege   => 'connect',
    start_date  => NULL,
    end_date    => NULL);
  --Add an initial privilege to it
  DBMS_NETWORK_ACL_ADMIN.ADD_PRIVILEGE(
    acl         => 'utl_mail.xml',
    principal   => 'Scott',
    is_grant    => true,
    privilege   => 'connect',
    start_date  => NULL,
    end_date    => NULL);
  --Assign a host to the ACL
  DBMS_NETWORK_ACL_ADMIN.ASSIGN_ACL(
    acl        => 'utl_mail.xml',
    host       => 'hostname',
    lower_port => 25,
    upper_port => 25);
END;
/
COMMIT;

Without this you will get the following error:
[Screenshot: the error returned when no ACL grants network access]

Now I’ll briefly touch on the functionality of the XML functions.

  1. xmlagg – This takes multiple rows of xml and aggregates them into a single value.  This is needed for this stored procedure because we want the html table to control the line breaks, not line break characters created by SQL.
  2. xmlelement – This simply turns a returned column into an xml element.  The first argument is the xml element name and the second argument is the field.
  3. xmlforest – This behaves similarly to xmlelement, but it lets you do multiple columns at the same time, separated by commas.  Confusingly, it works opposite from xmlelement: you specify the field first, then the xml element name second.  A small standalone example is sketched below.
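
Here is a minimal sketch of the three functions working together, run directly against the scott.emp sample table (assuming it exists); each row becomes a <tr> element with one <td> per column, and xmlagg rolls all of the rows into a single fragment:

SELECT XMLAGG(
         XMLELEMENT("tr",
           XMLFOREST(e.ename AS "td",
                     e.job   AS "td"))
       ).GetClobVal() AS html_rows
FROM scott.emp e;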

When I first got the requirement to send emails from Oracle, I took a look around the web and couldn’t find any ready-made solutions for sending out query results in an html table.  Hopefully this post will help some of you who are looking for the same thing.

Creating a comma separated list with ListAgg in Oracle

When moving to a new location on a lake to fish, I do a mental inventory of my lures to identify which ones would work for that location. First I’ll categorize my lures by type (spinner, topwater, jerkbait, crankbait), then I’ll think about which lures in each category will work in that location. If there are weeds coming up I don’t want anything that’ll dive too deep and get fouled each cast. If there are no weeds and I’m working bottom structure I want a lure that will get down deep enough, closer to the bottom. If I’m in clear water I want natural-colored lures; if I’m in stained water I want brightly colored lures. If the fish have been active I’ll want lures that can be worked fast, so I can cover as much water as possible. If they’ve been passive I’ll choose a lure that can be worked slowly, giving the fish plenty of time to decide to strike. Once my mental inventory is finished, I have a list of lures to choose from that will give me the best chance to catch a fish.
I’ve already demonstrated how to create a comma separated list in SQL Server here. Creating one in Oracle is a bit different, and in my opinion, easier. This method works starting in 11g, which most people are on by now.  To demonstrate, let’s start by creating a set of sample data:

CREATE TABLE DGUGG.LureNames
  (ID NUMBER PRIMARY KEY , Lure VARCHAR2(25), PriceRange VARCHAR2(8));
  
CREATE sequence DGUGG.LureNameID_seq;

CREATE OR REPLACE TRIGGER DGUGG.LureNamesID_trg
  BEFORE INSERT ON DGUGG.LureNames
  FOR EACH ROW
BEGIN
  SELECT LureNameID_seq.nextval INTO :new.ID FROM dual;
END;
/

INSERT ALL
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Mepps Muskie Killer', '$10-$15' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Harasser', '$10-$15' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Buchertail 700','$10-$15')
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Reef Runner', '$15-$20' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Creek Chub Pike Minnow', '$10-$15' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Grandma', '$15-$20' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Jake', '$15-$20' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Shallow Raider', '$15-$20' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Bulldawg', '$10-$15' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Suick Thriller', '$15-$20' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Believer', '$15-$20' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Swim Whizz', '$15-$20' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Double Cowgirl', '$20-$25' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Medussa', '$20-$25' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Doubledawg', '$15-$20' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Shallow Invader', '$15-$20' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Top Raider', '$15-$20' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Depth Raider', '$15-$20' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Pacemaker', '$15-$20' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Hawg Wobbler', '$25-$30' )
  INTO DGUGG.LureNames ( Lure, PriceRange ) VALUES  ( 'Softtail Phantom', '$25-$30' )
SELECT * FROM dual;

I’ve created a table called LureNames and inserted several lures names into it, as well as an approximate price range. A simple example would get all the lure names into a comma separated list:

SELECT listagg(Lure,', ') WITHIN GROUP (ORDER BY Lure) AllLures FROM DGUGG.LureNames;

The Listagg() function takes two arguments: the source of the data (the Lure field in this example) and the delimiter. I have made the delimiter a comma and a space. After WITHIN GROUP we choose how to order the values within the list; here I order by the lure name itself.
[Screenshot: a single comma-separated list of all lures]
A slightly more complex example will group all the lures together into a comma separated list within each price range bucket:

SELECT PriceRange, listagg(Lure,', ') WITHIN GROUP (ORDER BY Lure DESC) LureList 
FROM DGUGG.LureNames 
GROUP BY PriceRange;

This time I added the PriceRange field, so I also had to group by it. I also ordered each list by Lure descending, so the lists start with the last lure name alphabetically and work backward.
[Screenshot: comma-separated lure lists grouped by price range]
So that is how to create a comma-separated list in Oracle.  Next post I will be using this functionality to create a database email with an HTML table of query results in Oracle.  Compared to SQL Server, it is a bit easier since it only requires one function, not a second function to clip off the leading delimiter.
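
For comparison, here is a sketch of the SQL Server pattern I’m referring to, which combines FOR XML PATH with STUFF to clip the leading delimiter (this assumes a comparable LureNames table exists on the SQL Server side):

SELECT STUFF(
         (SELECT ', ' + Lure
          FROM dbo.LureNames
          ORDER BY Lure
          FOR XML PATH('')), 1, 2, '') AS AllLures;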