Inserting into SQL Server from Powershell using parameters

Sometimes fishing, especially muskie fishing, just doesn’t go your way. I recently took a trip to Northern Wisconsin to do some hunting and fishing. I spent five days beating the water to a froth with my lures, but ended up seeing only a single follow. When you’re faced with that level of failure, it’s common to search everywhere for reasons. Was the weather too warm? Was I fishing in the wrong spots? Was I using the wrong types of lures? Was I fishing too fast? Was I fishing too slow? Am I just a bad fisherman? Muskie fishing is all about developing patterns, but it’s awfully tough to find a pattern when you aren’t seeing any fish. I’m going to chalk my failure on this trip up to poor weather. Although it’s now fall, the temperatures have been setting record highs. I’m thinking the fish are just waiting for the water to start cooling so they can binge feed and put on the winter fat. But who knows? Maybe I am just a bad fisherman.

[Image: Bad weather chased me off the water after just two hours of fishing on one day.]

I recently constructed an IT dashboard. This dashboard, built in SSRS, compiled data from several sources into a SQL Server database where it could be quickly retrieved for the reports. The part of this task that was new for me was grabbing performance counters from an array of remote servers and inserting them into a SQL Server table. I was able to make use of the System.Data.SqlClient.SqlCommand class from PowerShell for this. Below I’ll show how I did it.
First I need a table to hold the data. For this specific set of metrics I’m going to be collecting memory used by the servers.

CREATE TABLE [dbo].[IT_DB_Memory](
   [ServerName] [VARCHAR](255) NOT NULL,
   [MemoryUsed_GB] [DECIMAL](5, 2) NOT NULL,
   [CaptureDateTime] [DATETIME2](7) NOT NULL,
 CONSTRAINT [PK_IT_DB_Memory] PRIMARY KEY CLUSTERED 
(
   [ServerName] ASC,
   [CaptureDateTime] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

Now that I have a table, I will build the PowerShell script. This is going to run against multiple servers, so first I need to set up an array with the list of servers I plan to survey.

$serverarray = @('Muskie', 'NorthernPike', 'Walleye')

That’s right, I name my servers after fish. Next I’ll create a connection to SQL Server.

$sqlConn = New-Object System.Data.SqlClient.SqlConnection
$sqlConn.ConnectionString = "Server=SmallmouthBass;Database=DBAReporting;Integrated Security=True;"
$sqlConn.Open()

Now I create the command that will be run. Note the SQL Server parameters, distinguished by the @ at the front of their names.

$sqlCmnd = New-Object System.Data.SqlClient.SqlCommand
$sqlCmnd.Connection = $sqlConn
$sqlCmnd.CommandText = "
SET NOCOUNT ON;

INSERT INTO DBAReporting.dbo.IT_DB_Memory
    ( ServerName
    , MemoryUsed_GB
    , CaptureDateTime )
VALUES
    ( @ServerName
    , @MemoryUsed / 1073741824
    , GETDATE());"

Next I’ll actually create those parameters on the PowerShell SqlCommand object.

$sqlCmnd.Parameters.Add((New-Object Data.SqlClient.SqlParameter("@ServerName", [Data.SqlDbType]::VarChar, 255))) | Out-Null
$sqlCmnd.Parameters.Add((New-Object Data.SqlClient.SqlParameter("@MemoryUsed", [Data.SqlDbType]::Decimal, 5, 2))) | Out-Null

This next step is what does the actual work. I’ll loop through the array and use the Get-Counter cmdlet to get the memory used. The way I have it set up will sample the memory five times, once per second, and then return the average of those five samples.

foreach ($server in $serverarray) {
    $sqlCmnd.Parameters[0].Value = $server
    # Sample committed memory five times, one second apart, and average the samples
    $Memory = Get-Counter -Counter "\Memory\Committed Bytes" -SampleInterval 1 -MaxSamples 5 -ComputerName $server |
        Select-Object -ExpandProperty CounterSamples | Select-Object -ExpandProperty CookedValue | Measure-Object -Average
    $sqlCmnd.Parameters[1].Value = $Memory.Average
    $sqlCmnd.ExecuteNonQuery() | Out-Null
}

The last step in PowerShell is simply to close the database connection.

$sqlConn.Close()

Now I can set this to run on a regular basis using Windows Task Scheduler, and I’ll have a history of how my application servers are using memory throughout the day.
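
Once the scheduled task has run a few times, a quick query confirms data is landing as expected. This is just a sanity-check sketch against the dbo.IT_DB_Memory table created above, not part of the original process:

--Most recent memory sample collected for each server
SELECT m.ServerName
     , m.MemoryUsed_GB
     , m.CaptureDateTime
FROM dbo.IT_DB_Memory AS m
WHERE m.CaptureDateTime = (SELECT MAX(m2.CaptureDateTime)
                           FROM dbo.IT_DB_Memory AS m2
                           WHERE m2.ServerName = m.ServerName);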

Tally Tables

Occasionally I like to take a break from Muskie fishing and spend time catching some easier species. This is especially true when I’m taking friends out fishing. Not many people like to cast for hours with only a few follows to show for it. Last month I took my brother Steve out onto a smaller lake about five minutes from my house. This lake is overrun with invasive weeds, and I tend to think of it as a garbage lake. However, we had a great time catching fish. My brother caught several bass and a bonus walleye, while I managed this fat 30″ pike. The pike took a good 5 minutes to get in the boat since I was using fairly light tackle and we had no net.

[Image: Little Cedar Northern Pike]

SQL is a set-based language. It is built with the idea that the engine will handle any looping in the background, without the author needing to specify the best way to loop. There are a few rare exceptions, but if you are writing a loop in SQL, you are usually doing something wrong, or at least doing it much less efficiently than you could. One great way to get around loops is to use a Tally Table. Originally defined by SQL Server legend Jeff Moden in 2008, the Tally Table is simply a table with a single column of well-indexed sequential numbers.
If you’re a programmer or developer, you’re probably going to think of something like this to build a Tally Table:

--Create the Tally Table
CREATE TABLE #Tally
(
    N INT
  , CONSTRAINT PK_Tally_N
        PRIMARY KEY CLUSTERED (N)
);

--Set up an increment counter
DECLARE @TallyCounter INT;
SET @TallyCounter = 1;

--Fill the Tally Table with a Loop
WHILE @TallyCounter <= 11000
BEGIN
    INSERT INTO #Tally
    (
        N
    )
    VALUES (@TallyCounter);

    SET @TallyCounter = @TallyCounter + 1;
END;

Running on my server, this code took an average of 432 milliseconds while requiring 22,426 reads and 407 CPU. A more efficient way to generate the table is this:

--Create and populate table
SELECT TOP 11000
    IDENTITY(INT, 1, 1) AS N
INTO #Tally
FROM MASTER.sys.syscolumns sc1
   , MASTER.sys.syscolumns sc2;

--Add Primary Key Clustered
ALTER TABLE #Tally
ADD CONSTRAINT PK_Tally_N 
PRIMARY KEY CLUSTERED (N) WITH FILLFACTOR = 100;

This took me only 73 milliseconds to run, and required only 885 reads and 78 CPU.
In Oracle this is even easier to create:

CREATE TABLE tempTally AS
SELECT LEVEL AS N
FROM DUAL
CONNECT BY LEVEL <= 11000
ORDER BY LEVEL;

So now we’ve got a table full of sequential numbers from 1 to 11,000. What can we use this for?
From a programmer or developer perspective, loops are often used with strings. Let’s say we want to step through and display each character in a string. With a loop, you’d do something like this:

DECLARE @StepThroughMe VARCHAR(100), @i INT;
SELECT @StepThroughMe = 'Looping through this string is a waste of time.', @i = 1;

WHILE @i <= LEN(@StepThroughMe)
BEGIN
    SELECT @i, SUBSTRING(@StepThroughMe, @i, 1);
   SELECT @i = @i+1
END;

Using a Tally table, you can do it in a way that is simpler to write and runs in less than a tenth of the time:

DECLARE @TallyThroughMe VARCHAR(100);
SELECT @TallyThroughMe = 'Using a Tally Table is an efficient use of time.'

SELECT t.N, SUBSTRING(@TallyThroughMe, t.N, 1)
FROM #Tally AS t
WHERE t.N <= LEN(@TallyThroughMe);
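
Another classic use of the same idea, not from the original example but worth a quick sketch, is splitting a delimited string without a loop. The sample string and delimiter here are just for illustration:

--Split a comma-delimited string using the Tally Table
DECLARE @SplitMe VARCHAR(100) = 'Muskie,Pike,Walleye,Perch';

SELECT SUBSTRING(@SplitMe, t.N, CHARINDEX(',', @SplitMe + ',', t.N) - t.N) AS Element
FROM #Tally AS t
WHERE t.N <= LEN(@SplitMe)
  AND SUBSTRING(',' + @SplitMe, t.N, 1) = ',';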

One other way I used this was to create the Date table in my data warehouse.

WITH cte
AS (SELECT DATEADD(DAY, N - 1, '2000-01-01') AS Date
    FROM #Tally
   )
SELECT YEAR(cte.Date) * 10000 + MONTH(cte.Date) * 100 + DAY(cte.Date) AS DateKey
     , cte.Date
     , YEAR(cte.Date) AS YEAR
     , DATEPART(QUARTER, cte.Date) AS Quarter
     , MONTH(cte.Date) AS MONTH
     , RIGHT('0' + CAST(MONTH(cte.Date) AS VARCHAR(2)), 2) + '. ' + DATENAME(MONTH, cte.Date) AS MonthName
     , DATEPART(ww, cte.Date) + 1 - DATEPART(ww, CAST(DATEPART(mm, cte.Date) AS VARCHAR) + '/1/' + CAST(DATEPART(yy, cte.Date) AS VARCHAR)) AS WeekOfMonth
     , DATEPART(wk, cte.Date) AS WeekOfYear
     , DATEPART(dw, cte.Date) AS DayOfWeek
     , RIGHT('0' + CAST(DATEPART(dw, cte.Date) AS VARCHAR(2)), 2) + '. ' + DATENAME(dw, cte.Date) AS DayOfWeekName
     , DAY(cte.Date) AS DayOfMonth
     , DATEPART(DAYOFYEAR, cte.Date) AS DayOfYear
     , CASE
           WHEN DATEPART(QUARTER, cte.Date) IN ( 1, 2 ) THEN
               'Spring'
           ELSE
               'Fall'
       END AS RetailSeason
FROM cte;

This worked for loading my permanent table, but you could also use it to load a temp table or table variable and join it to a data set to get a full range of dates, even when your data set is missing data on some of the dates. A rough sketch of that pattern is below.
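
This example is not from the original post; it just illustrates the gap-filling idea. The dbo.Orders table and its columns are made up for the sketch:

--Report an order count for every day in January, even days with no orders
WITH DateRange
AS (SELECT DATEADD(DAY, N - 1, '2017-01-01') AS OrderDate
    FROM #Tally
    WHERE N <= 31
   )
SELECT dr.OrderDate
     , COUNT(o.OrderID) AS OrderCount
FROM DateRange AS dr
LEFT JOIN dbo.Orders AS o
    ON o.OrderDate = dr.OrderDate
GROUP BY dr.OrderDate
ORDER BY dr.OrderDate;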

Tally tables can be used to improve performance in a number of different scenarios. Next time you’re not sure whether you may need a loop, stop and consider whether your situation may benefit from a Tally Table.

SQL Agent Properties

It’s been a rather windy summer thus far, making it less fun to be out on the water.  I don’t know many guys who enjoyed being blasted by the wind while bobbing up and down on the waves for hours on end.  I went out on Pewaukee Lake a few weeks ago with a buddy from work.  We had picked the day in advance since it was supposed to be dry and calm.  We got the dry, but not the calm.  We had a stiff wind blowing out of the west that drove us back into the western half of the lake after trying to fish the narrows in the middle.

I spent the day focusing my fishing efforts on making the bait look real.  I tried hard to avoid retrieving the lure in a rhythmic fashion.  I was paid off with a nice upper 30s muskie:

[Image: June 2017 Pewaukee Muskie]

My fishing buddy hooked one a short time later, but couldn’t keep it pinned and we lost it.

Recently, I blogged about migrating the SQL Server installation onto a different drive.  I did find one problem after this move that I had to address: an issue with the SQL Agent that I wasn’t initially able to diagnose.  If I remember correctly it was actually an SSRS subscription that failed, and I needed details to find out why.  I found that the SQL Agent has its own properties, and the error log was still pointing back at the previous location on the C: drive, which no longer existed.  There is a stored procedure you can execute to see those properties, in addition to looking at them in the SSMS UI:

EXEC msdb..sp_get_sqlagent_properties

[Image: SQL Agent properties]

Lastly, just update the value with the corresponding SET stored procedure and restart the SQL Agent:

EXEC msdb..sp_set_sqlagent_properties 
   @errorlog_file = N'D:\SQLSERVERINSTALL\SQLAGENTERRORLOG.log'

Now your SQL Agent properties have been updated.

Addressing login trigger failures in SQL Server

As I get older I have come to enjoy watching others fish, especially my children.  The thrill of catching a big fish is magnified by seeing the smile on someone else’s face when he/she is the one bringing it in.  Below is a nice sized largemouth bass my son caught on a recent fishing trip.

[Image: Two Sisters largemouth bass]

In my previous post I showed how to create a login trigger to log sysadmin access to a SQL Server instance.  Almost immediately I received a comment describing how the failure of the trigger could almost completely prevent anyone from logging into the instance.  This is a major problem!

The reason this occurs makes sense if you think about it.  While attempting to log in, the user executes some code in a trigger.  If that code is invalid, the trigger will fail and abort.  When that happens, the login aborts as well.  What could cause the trigger to fail?  Well, if the table (or other object) you are accessing within the trigger is inaccessible to the user, or if it doesn’t even exist, the trigger will fail.

I tested this by using my working trigger, which logged sysadmin logins to a table called dbo.sysadminLogging.  Next I renamed the table to dbo.sysadminLogging1.

[Image: dbo.sysadminLogging renamed to dbo.sysadminLogging1]
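
For reference, the rename itself is a one-liner; this is just the T-SQL equivalent of what the screenshot shows, assuming the logging table lives in the DBMaint database used by the trigger below:

USE DBMaint;
GO
--Rename the logging table so the trigger's INSERT target no longer exists
EXEC sp_rename 'dbo.sysadminLogging', 'sysadminLogging1';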

Next I tried to log in from a new window in SSMS:

[Image: Failed login]

First, let’s talk about how to get back into a server that has this issue.  We need to log into SQL Server using SQLCMD over the dedicated administrator connection, then disable the trigger:

[Image: Disable trigger]
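
Roughly, the commands in that screenshot look like this (the server name is a placeholder and this is a sketch rather than a transcript).  The -A switch on sqlcmd requests the dedicated administrator connection:

--From a command prompt, connect over the dedicated administrator connection:
--  sqlcmd -S YourServerName -A

--Then, in that session, disable the server-level logon trigger:
DISABLE TRIGGER [servertrigger_CheckForSysAdminLogin] ON ALL SERVER;
GO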

After doing this everyone should now be able to log back into SQL Server as normal.

Now to prevent this type of event from happening, I suggest a small edit to my original trigger.  This edit will make sure the referenced objects are valid.  If not, the trigger does nothing.  It may also be a good idea to send an email to the DBA so they can investigate, and I’ve noted that in the comments.

CREATE TRIGGER [servertrigger_CheckForSysAdminLogin] ON ALL SERVER
    FOR LOGON
AS
    BEGIN
       IF OBJECT_ID('DBMaint.dbo.sysadminLogging') IS NULL
           BEGIN
               --Possibly send an email to the DBA, indicating the trigger is not working as expected
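               --For example, something like this (commented out; the mail profile and
               --recipient are placeholders for illustration, not from the original post):
               --EXEC msdb.dbo.sp_send_dbmail
               --    @profile_name = 'DBA Mail Profile'
               --  , @recipients = 'dba@example.com'
               --  , @subject = 'Logon trigger could not find DBMaint.dbo.sysadminLogging'
               --  , @body = 'Sysadmin logins are not currently being recorded.';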
               GOTO Abort;--Do nothing
           END

        IF IS_SRVROLEMEMBER('sysadmin') = 1
            BEGIN
                INSERT  INTO DBMaint.dbo.sysadminLogging
                        ( [Login] , LoginDate )
                VALUES  ( ORIGINAL_LOGIN() , GETDATE() );
            END;

Abort:

    END;

GO

This newer version of the trigger should cut down on the chances that this functionality will come back to bite you. Special thanks to james youkhanis for pointing this out.

Logging sysadmin logins in SQL Server

Our yearly cabin opening “men’s weekend” was last weekend. The fishing was a bit below average, but we still had a great time. I brought my six-year-old son up, and two of my six-year-old nephews were brought up by their dads as well. The first day of fishing we went over to the Rainbow Flowage, which was a bit of a bust. We ended up hooking five or six northern pike, but only landed one because they kept biting off the line. The boys in the other boat caught some bluegill and bass, but overall the fishing wasn’t as hot as in years past. This was probably due to all the rain they got up there. There were places where the rivers were over the road. Below is the one pike we managed to land.
[Image: Rainbow Flowage northern pike]
Security these days is as important as ever. SQL Server provides lots of ways to improve security of your data and environment. One small way I like to keep an eye on my system is to log sysadmin logins to the server. Sysadmin is able to do anything in SQL Server, and by reviewing my log from time to time I can be sure that no one is using this type of login to compromise the system.
The first thing to do is to create a table to hold the log data:

USE [DBA_DB]
GO

CREATE TABLE [dbo].[sysadminLogging](
   [SAL_id] [INT] IDENTITY(1,1) NOT NULL,
   [Login] [VARCHAR](255) NOT NULL,
   [LoginDate] [DATETIME2](7) NOT NULL,
 CONSTRAINT [PK_sysadminLogging] PRIMARY KEY NONCLUSTERED 
(
   [SAL_id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

GO

All that is left is to create a server level login trigger to record any sysadmin logins to that table:

CREATE TRIGGER [servertrigger_CheckForSysAdminLogin] ON ALL SERVER
    FOR LOGON
AS
    BEGIN
        IF IS_SRVROLEMEMBER('sysadmin') = 1
            BEGIN
                INSERT  INTO DBA_DB.dbo.sysadminLogging
                        ( [Login] , LoginDate )
                VALUES  ( ORIGINAL_LOGIN() , GETDATE() );
            END;
    END;
GO

Check this table regularly to get a good idea of who is logging into your SQL Server with these higher privileges.
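
This is just one way you might eyeball recent activity; adjust the date range to taste:

--Sysadmin logins recorded over the last seven days
SELECT sal.[Login]
     , sal.LoginDate
FROM DBA_DB.dbo.sysadminLogging AS sal
WHERE sal.LoginDate >= DATEADD(DAY, -7, GETDATE())
ORDER BY sal.LoginDate DESC;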

EDIT:

In the comments, james youkhanis points out a problem that can occur.  This problem could make logging in quite difficult.  I have posted a follow-up where I explain the problem, provide a workaround to allow logging in (as james demonstrates below), and provide an updated trigger to minimize the risk of this occurring.

Cross Database Certificates – Trouble with Triggers

The weather has been awesome here for the last few days.  Sixty-plus degree temperatures have made it feel more like May than February.  It isn’t supposed to last much longer, but I have enjoyed it.  I took the boat in for an engine tune-up this weekend, which means I should get it back just in time for most of the ice to be coming off the lakes.  I’m hoping to take a couple more shots at the Wolf River walleye run this spring.  Last year didn’t provide good results.

I took my sons to a park on the edge of a lake this past weekend and happened to be watching while an unfortunate ice fisherman’s ATV fell through the ice.  I’m not sure how these ice fishermen know what ice is good versus what ice is bad, but you can see from the main picture above that not all of them know either.  Fortunately, only the front tires went through and another ATV came over and pulled him out.

I ran into an issue with cross database certificates recently.  I have blogged about how to set these certificates up here – they are a handy way to enable permissions across databases.  However, I ran into a problem where the permission chain failed due to a trigger on the original table that updated a separate table.  Here is the SQL to replicate the issue:

CREATE LOGIN [GuggTest] WITH PASSWORD=N'abcd', DEFAULT_DATABASE=[master], CHECK_EXPIRATION=OFF, CHECK_POLICY=OFF

CREATE DATABASE A;
CREATE DATABASE B;

USE A;

CREATE TABLE dbo.SPtoUpdate
    (
      ID INT
    , ILoveFishing VARCHAR(255)
    );
INSERT INTO dbo.SPtoUpdate
        ( ID , ILoveFishing )
VALUES  ( 1,'Musky'),( 2,'Pike'),( 3,'Yellow Perch');
CREATE TABLE dbo.TriggerToInsert
    (
      ID INT
    , ILoveFishing VARCHAR(255)
    , ChangeDate DATETIME2
    );
GO

CREATE TRIGGER dbo.SPtoUpdateTrigger ON dbo.SPtoUpdate
    FOR UPDATE
AS
    DECLARE @datetime DATETIME2;
    SELECT  @datetime = GETDATE()

    INSERT  INTO dbo.TriggerToInsert
            ( ID , ILoveFishing , ChangeDate )
    VALUES  ( 1 , 'Yes' , @datetime );
GO

CREATE CERTIFICATE BExecutor
   ENCRYPTION BY PASSWORD = 'Obfuscated'
   WITH SUBJECT = 'Execute sp from B to A',
   START_DATE = '20140101', EXPIRY_DATE = '20300101'
GO

BACKUP CERTIFICATE BExecutor TO FILE = 'C:\temp\crossdbcert.cer'
WITH PRIVATE KEY (FILE = 'C:\temp\crossdbcert.pvk' ,
                  ENCRYPTION BY PASSWORD = 'Obfuscated',
                  DECRYPTION BY PASSWORD = 'Obfuscated')
GO

CREATE USER BExecutor FROM CERTIFICATE BExecutor

GRANT UPDATE ON dbo.SPtoUpdate TO BExecutor
GRANT SELECT ON dbo.SPtoUpdate TO BExecutor
--Also give insert on dbo.TriggerToInsert
GRANT INSERT ON dbo.TriggerToInsert TO BExecutor

USE B
GO

CREATE USER [GuggTest] FOR LOGIN [GuggTest];
EXEC sp_addrolemember N'db_owner', N'GuggTest'
GO

CREATE PROCEDURE dbo.UpdateTableInA
AS
    BEGIN
        UPDATE  A.dbo.SPtoUpdate
        SET     ILoveFishing = 'Walleye'
        WHERE   ID = 2;
    END

GO


CREATE CERTIFICATE BExecutor FROM FILE = 'C:\temp\crossdbcert.cer'
WITH PRIVATE KEY (FILE = 'C:\temp\crossdbcert.pvk' ,
                  ENCRYPTION BY PASSWORD = 'Obfuscated',
                  DECRYPTION BY PASSWORD = 'Obfuscated')
GO

EXEC MASTER..xp_cmdshell 'DEL C:\temp\crossdbcert.*', 'no_output'
GO

ADD SIGNATURE TO dbo.UpdateTableInA BY CERTIFICATE BExecutor
    WITH PASSWORD = 'Obfuscated'
GO

--Log In or Change execution context to GuggTest, then EXEC dbo.UpdateTableInA
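
To reproduce the failure, you can switch execution context to the new login and call the procedure from database B; a rough sketch (not in the original post) looks like this:

--Run from database B: impersonate the test login and call the procedure.
--With the trigger in place, the call fails when the trigger tries to
--insert into dbo.TriggerToInsert in database A.
EXECUTE AS LOGIN = 'GuggTest';
EXEC dbo.UpdateTableInA;
REVERT;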

It turns out you can countersign a trigger with the certificate, and this will allow the permission chain to succeed. By doing this, you don’t even need to grant the certificate user permission to the second table. Here is the syntax to do that:

ADD COUNTER SIGNATURE TO dbo.SPtoUpdateTrigger
BY CERTIFICATE BExecutor
WITH PASSWORD = 'Obfuscated';

Use this technique to work with cross database permissions that have to access tables with triggers.

Recursive Common Table Expressions

Wind can be an ally or an enemy of the fisherman.  Both in terms of comfort and in changing the mood and location of the fish, wind is something that can’t be ignored.  As it relates to the fish, wind can often turn fish on.  The term “muskie chop” refers to medium-sized waves that can help create good conditions for fishing.  The wind does a couple things: it restricts the light by creating waves that break up the sun, and it also creates a current that can move fish to specific locations that can be targeted.  The other factor to consider related to wind is fisherman comfort.  I love fishing the colder months, but you’d better make sure you’re dressed for the weather.  There is no indoors in a fishing boat, so if it’s going to be windy and cold, bundle up.  At the same time on those hot, sunny, humid July days, you may not want to even be out unless there is some wind to cool you down.  Keeping all these factors in mind, it’s important to remember that wind is strongest when it has a large open space to build up its force.  If you want to avoid the wind, head to the upwind side of the lake.  If you want to embrace the wind, head to the downwind side.

In SQL Server, a recursive common table expression (CTE) could be compared to wind building up power as it moves over the lake.  A recursive CTE calls itself, and in doing so uses the previous results to build up a final result set.

I recently had a perfect use case for this concept.  I had to take dollars given to me at a monthly level and distribute them across each day within the month.  Using a recursive CTE, I told SQL Server to give me the monthly total divided by the number of days in the month, for each day in the month.  Below is an example of how I set it up:

CREATE TABLE #SalesTotalsByMonth
    (
      FirstOfMonth DATE
    , Channel VARCHAR(10)
    , SalesTotal DECIMAL(10 , 2)
    );
INSERT  INTO #SalesTotalsByMonth
        ( FirstOfMonth , Channel , SalesTotal )
VALUES  ( '2016-01-01' , 'Web' , 165473.99 ),
        ( '2016-01-01' , 'In-store' , 56998.45 ),
        ( '2016-01-01' , 'Mail' , 4645.85 )
,       ( '2016-02-01' , 'Web' , 27463.56 ),
        ( '2016-02-01' , 'In-store' , 61423.78 ),
        ( '2016-02-01' , 'Mail' , 5341.56 )
,       ( '2016-03-01' , 'Web' , 487356.67 ),
        ( '2016-03-01' , 'In-store' , 15734.56 ),
        ( '2016-03-01' , 'Mail' , 3104.85 )
,       ( '2016-04-01' , 'Web' , 478236.78 ),
        ( '2016-04-01' , 'In-store' , 24675.67 ),
        ( '2016-04-01' , 'Mail' , 1024.56 )
,       ( '2016-05-01' , 'Web' , 167524.89 ),
        ( '2016-05-01' , 'In-store' , 31672.78 ),
        ( '2016-05-01' , 'Mail' , 1798.67 )
,       ( '2016-06-01' , 'Web' , 347652.19 ),
        ( '2016-06-01' , 'In-store' , 41675.19 ),
        ( '2016-06-01' , 'Mail' , 801.78 )
,       ( '2016-07-01' , 'Web' , 247653.02 ),
        ( '2016-07-01' , 'In-store' , 59713.02 ),
        ( '2016-07-01' , 'Mail' , 2097.19 )
,       ( '2016-08-01' , 'Web' , 891642.23 ),
        ( '2016-08-01' , 'In-store' , 67134.23 ),
        ( '2016-08-01' , 'Mail' , 3752.02 )
,       ( '2016-09-01' , 'Web' , 342591.24 ),
        ( '2016-09-01' , 'In-store' , 77123.24 ),
        ( '2016-09-01' , 'Mail' , 2406.23 )
,       ( '2016-10-01' , 'Web' , 246758.25 ),
        ( '2016-10-01' , 'In-store' , 81214.24 ),
        ( '2016-10-01' , 'Mail' , 3012.24 )
,       ( '2016-11-01' , 'Web' , 267423.26 ),
        ( '2016-11-01' , 'In-store' , 91023.26 ),
        ( '2016-11-01' , 'Mail' , 2034.24 )
,       ( '2016-12-01' , 'Web' , 265219.56 ),
        ( '2016-12-01' , 'In-store' , 34167.02 ),
        ( '2016-12-01' , 'Mail' , 1010.26 );

WITH    recurse
          AS ( SELECT   stbm.Channel
                      , stbm.SalesTotal / DATEDIFF(DAY , stbm.FirstOfMonth , DATEADD(MONTH , 1 , stbm.FirstOfMonth)) AS Revenue
                      , DATEDIFF(DAY , stbm.FirstOfMonth , DATEADD(MONTH , 1 , stbm.FirstOfMonth)) AS daysleft
                      , stbm.FirstOfMonth AS [Sales Day]
               FROM     #SalesTotalsByMonth stbm
               UNION ALL
               SELECT   recurse.Channel
                      , recurse.Revenue
                      , recurse.daysleft - 1
                      , DATEADD(DAY , 1 , recurse.[Sales Day])
               FROM     recurse
               WHERE    recurse.daysleft > 1
             )
    SELECT  recurse.[Sales Day]
          , recurse.Channel
          , SUM(recurse.Revenue) AS Revenue
    FROM    recurse
    GROUP BY recurse.Channel
          , recurse.[Sales Day];

DROP TABLE #SalesTotalsByMonth;

The important thing to note here is the general pattern for a recursive CTE – an anchor query, followed by a UNION ALL, followed by a recursive member that references the CTE itself.  Be sure to put the upper limit in the WHERE clause of the recursive member to avoid infinite recursion.

My final results gave me the total per day.
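
For reference, here is a stripped-down sketch of the same pattern, separate from the example above, that just counts from 1 to 10:

WITH Numbers
AS (SELECT 1 AS N            --anchor member
    UNION ALL
    SELECT Numbers.N + 1     --recursive member references the CTE itself
    FROM Numbers
    WHERE Numbers.N < 10     --upper limit prevents infinite recursion
   )
SELECT Numbers.N
FROM Numbers;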