GE Centricity EMR Performance Tuning

I have been working with multiple clients using the GE Centricity EMR product, which makes extensive use of Microsoft SQL Server. The performance issues that come up with Centricity are very similar to what I see working with other clients on a daily basis. Centricity is a solid product; when you do run into performance issues, they can often be addressed by performance tuning the SQL Server.

GE Centricity Performance Tuning

Here are 5 of my recommendations for IT administrators responsible for the GE Centricity SQL Server.

  1. Stay on top of SQL Server updates, specifically the service packs and security patches. Some of these can have a significant stability or performance impact on your system. This doesn’t mean jumping to the latest version of SQL Server, which may not be supported by GE; it just means keeping up with applying the bug fixes that Microsoft supplies in service packs, hotfixes, and security patches.
  2. Understand the speed of your storage. For all drives that hold data, log files, or backups, have a good understanding of their speed. Without knowing what drives are fast or slow, it’s hard to make the right storage decisions. Get a good understanding of your I/O bottlenecks.
  3. Reboot your SQL Server less often. There is a common belief that after a reboot, a server will run faster, better, and more stably. The opposite is true with SQL Server. As SQL Server runs, it brings a great deal of data from disk into memory; every time you restart the SQL Server process or reboot the server, all that data is thrown out of memory and has to be reloaded. A reboot can impact the overall performance of a SQL Server for the next 24 to 36 hours. Reboot when you need to, but never reboot a SQL Server as a regularly scheduled job.
  4. Take a look at Database Health Monitor, a free tool that I have spent the last 5 years building to help track down and repair performance problems. Several of the features in Database Health Monitor came from tracking down specific performance issues with Centricity. It’s free, so why not use it?  http://DatabaseHealth.com/download
  5. Sign up for a database health assessment, where Stedman Solutions, LLC can help track down and adjust the SQL Server database to improve performance. http://stedmansolutions.com/centricity
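To put some numbers behind tips 2 and 3, a couple of DMV queries can help. This is a sketch of my own, not part of the Centricity product: the first query reports average read and write latency per database file from sys.dm_io_virtual_file_stats, and the second shows how much of the buffer pool each database is using, which is exactly the cached data you throw away on every reboot.

```sql
-- Average I/O latency per database file since the SQL Server service started.
SELECT DB_NAME(vfs.database_id) AS database_name,
       mf.physical_name,
       vfs.num_of_reads,
       CASE WHEN vfs.num_of_reads = 0 THEN 0
            ELSE vfs.io_stall_read_ms / vfs.num_of_reads END AS avg_read_ms,
       vfs.num_of_writes,
       CASE WHEN vfs.num_of_writes = 0 THEN 0
            ELSE vfs.io_stall_write_ms / vfs.num_of_writes END AS avg_write_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
INNER JOIN sys.master_files AS mf
        ON vfs.database_id = mf.database_id
       AND vfs.file_id = mf.file_id
ORDER BY avg_read_ms DESC;

-- Buffer pool usage per database: the cache that is lost on a reboot.
SELECT DB_NAME(database_id) AS database_name,
       COUNT(*) * 8 / 1024 AS buffer_pool_mb
FROM sys.dm_os_buffer_descriptors
GROUP BY database_id
ORDER BY buffer_pool_mb DESC;
```

Both DMVs reset when the service restarts, so the latency numbers are cumulative averages since startup; capture deltas over time for a more accurate picture.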


Posted in Performance Tuning

RAID10 Saved The Day

Reliable DBA Services is one of the things I talk about on my business website at http://StedmanSolutions.com, but reliability can mean many things to different people. Today, the reliability of having RAID10 on my home office computer saved the day.

 

Drive Failure Last Year

A year ago, not long after starting full time at Stedman Solutions, LLC, I had a drive failure on the boot drive of my home office computer. I lost my entire C: drive. Fortunately I had things backed up, but it still cost me about 2 days of work replacing drives, reinstalling the OS, reinstalling all my applications, and restoring things from backup. At that point I decided NEVER AGAIN would I allow that to happen.

Sure, I had my laptop, but I didn’t have everything I needed to do my work on it.

After that I built out another home office computer with the goal of making everything overly redundant. My C: boot drive is now RAID10 with 4 SSDs, and my data drive is made up of 3 drives using the two-way mirror feature of Windows Storage Spaces.

Drive Failure Today

Today one of the SSDs in my RAID10 C: drive configuration failed. No big deal, just a few minutes of working time lost. First, after a reboot, I was able to get the failed drive to rebuild, which will hold me for a few days. Then I spent 10 minutes on Amazon.com ordering a replacement drive, plus a second one to keep as a spare. Total elapsed time today: 10 minutes. In 2 days when the new drives arrive, it will take me another 10 minutes to shut down my computer, replace the failed drive, start the computer back up, and kick off the rebuild process.

Savings

Total savings from having the RAID10 boot drive: about 16 billable hours, and plenty of headaches. Total cost: about $600 for the RAID controller and SSDs, plus another $190 for replacements. Peace of mind from having RAID10: priceless.

 

All this and I didn’t even mention the performance benefits with RAID10.

Posted in Uncategorized

SQL Server 2005 End of Life – End of Support Today

Today marks the end of life of SQL Server 2005. Originally released in November 2005, the product has finally reached its end-of-support point.

One of my favorite features introduced in SQL Server 2005 was Common Table Expressions. Another big item added in SQL Server 2005 was SQL Server Management Studio.

It’s hard to believe it’s been 11 years since SQL Server 2005 was released.

So for anyone still using SQL Server 2005: you may want to think about upgrading to a newer version. There are many new features, bug fixes, and some major enhancements to the product from the last 11 years. If you need help with this, Stedman Solutions, LLC can help with the upgrade.

 

SQL Server 2005, Rest In Peace.

Posted in SQL Server

Visualizing Log File VLF Sizing

One of my favorite queries this week is the following query, which creates a text-based bar chart to quickly help visualize the VLFs in any database log file. In the last month I have given the “TempDB Do This and Don’t Do That” presentation twice, once at the Bellingham SQL Server users group and again at SQL Saturday Redmond. One of the questions that came up after the users group meeting was about an easy way to visualize the Virtual Log Files. It’s one thing to just get a count, but seeing the size of each VLF is helpful for understanding how VLFs work.

If you are not familiar with VLFs, these are Virtual Log Files: the chunks that make up your entire log file. When your log file grows, VLFs are added to help manage the file space. Each time the log grows, between 4 and 16 VLFs are added (except on SQL Server 2014 and newer, which modifies the sizing algorithm a bit).
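The full query is behind the link below; as a sketch of the idea (not necessarily the exact query from the post), you can load the DBCC LOGINFO output into a table variable and build a text bar with REPLICATE:

```sql
-- Sketch: capture DBCC LOGINFO and draw a text bar per VLF, with each '#'
-- representing roughly 4 MB of VLF size. Run in the database whose log you
-- want to inspect. Columns match SQL Server 2012+, which added RecoveryUnitId.
DECLARE @vlfs TABLE
(
    RecoveryUnitId BIGINT,      -- SQL Server 2012 and newer only
    FileId         TINYINT,
    FileSize       BIGINT,
    StartOffset    BIGINT,
    FSeqNo         INTEGER,
    [Status]       TINYINT,
    Parity         TINYINT,
    CreateLSN      NUMERIC(38)
);

INSERT INTO @vlfs
EXEC sp_executesql N'DBCC LOGINFO WITH NO_INFOMSGS';

SELECT FSeqNo,
       FileSize / 1048576.0 AS SizeMB,
       [Status],                -- 2 = in use, 0 = reusable
       REPLICATE('#', CEILING(FileSize / 4194304.0)) AS SizeBar
FROM @vlfs
ORDER BY StartOffset;
```

The scale factor (4194304.0 bytes per '#') is arbitrary; adjust it so the bars fit your log file sizes.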

Read more ›

Posted in Performance

TempDB – Do This and Don’t Do That

Today I am presenting my “TempDB – Do This and Don’t Do That” session at SQL Saturday Redmond.

Here is the download of the presentation and samples.

StedmanTempDbPresentation.zip

 

Some of the sample code from the session:

DECLARE @logInfoResults AS TABLE
(
    [RecoveryUnitId] BIGINT, -- only on SQL Server 2012 and newer
    [FileId]         TINYINT,
    [FileSize]       BIGINT,
    [StartOffset]    BIGINT,
    [FSeqNo]         INTEGER,
    [Status]         TINYINT,
    [Parity]         TINYINT,
    [CreateLSN]      BIGINT
);

INSERT INTO @logInfoResults
EXEC sp_executesql N'DBCC LOGINFO WITH NO_INFOMSGS';

--SELECT AVG(FileSize) / 1024.0 / 1024.0 AS AvgSizeMB, COUNT(*) AS VLFs FROM @logInfoResults;
--SELECT FileSize / 1024 / 1024, [Status] FROM @logInfoResults;
SELECT * FROM @logInfoResults;

SELECT ISNULL(CAST([Status] AS NVARCHAR(10)), 'Total') AS [Status],
       COUNT(*) AS VLFCount
FROM @logInfoResults
GROUP BY [Status] WITH ROLLUP;

 

I hope you enjoy the session.

Posted in SSIS Performance

A year in review – The first year of Stedman Solutions, LLC.

The TL;DR summary: Life is good, business is good, and I am loving it. Ready to roll with Year 2 of Stedman Solutions, LLC.

The Details

It has been an interesting year. I officially formed the business about 3 years ago, but never worked at it regularly until April 1, 2015. Prior to April 1, 2015, I had a full-time position at a company in Bellingham.

My long term plan had originally been to go full time at Stedman Solutions, LLC in June of 2016 at which point my wife would have completed her schooling to become a nurse, and we would have had some additional family income to help while I got things rolling with Stedman Solutions, LLC.

“the best laid plans of mice and men often go awry”

In life, things don’t always work the way you plan. At the end of March 2015, my full-time position came to an end due to what the company called a “lay-off”: one person in the lay-off, and that was me. I learned later that the employees there had been told by management that I had left of my own accord. Can you smell something fishy in that? Things are awesome today, and I am so glad every single day that I no longer work there.

So there I was on March 31st, 2015, in a position that felt much like that of a historic military leader who burned the ships after landing on shore to prevent the troops from retreating. The point of no return; I was all in (or out, depending on the perspective). I knew what I wanted to do, but I certainly wasn’t entirely ready: no business structure, no marketing plan, and no client leads. It was a very scary point in time, but I knew what I wanted to do: I was going to be an independent consultant focused on SQL Server.

With my 25 years of SQL Server database and programming background, I was certainly up to the challenge technically. I was an overly skilled, self-proclaimed consultant with no clients.

Read more ›

Posted in Uncategorized

TSQL Script to Display Agent Job History

It can be a bit time consuming to work through the SQL Server Management Studio (SSMS) user interface to check the agent job history for specific agent jobs that have been run. To help speed that process up, I created the following TSQL script to query the job history directly.

This query uses a common table expression to first collect the details from the sysjobhistory and sysjobs tables, with some formatting.

Agent Job History


;WITH jobListCTE AS
(
    SELECT j.name AS job_name,
           msdb.dbo.agent_datetime(run_date, run_time) AS run_datetime,
           RIGHT('000000' + CONVERT(VARCHAR(6), run_duration), 6) AS run_duration,
           message
    FROM msdb.dbo.sysjobhistory h
    INNER JOIN msdb.dbo.sysjobs j ON h.job_id = j.job_id
    WHERE h.step_name = '(Job outcome)'
)
SELECT job_name AS [JobName],
       run_datetime AS [StartDateTime],
       SUBSTRING(run_duration, 1, 2) + ':' +
       SUBSTRING(run_duration, 3, 2) + ':' +
       SUBSTRING(run_duration, 5, 2) AS [Duration],
       message
FROM jobListCTE
ORDER BY run_datetime DESC, job_name;

The output includes one row per job execution, showing the job name, the start date and time, the duration in HH:MM:SS format, and the outcome message.

You can now modify this query to drill down on just the specific jobs in the history that you are interested in.

Another option is to take a look at the job history report in the Database Health Monitor application.

Posted in DBA

CTE to Split a String in a function SplitString

Back in 2012, when I was writing my Common Table Expressions book, I came up with the following CTE in a function to split a string into rows.

CREATE FUNCTION dbo.Split (@sep char(1), @s varchar(8000))
RETURNS table
AS
RETURN (
    WITH splitter_cte AS (
      SELECT CHARINDEX(@sep, @s) as pos, 0 as lastPos
      UNION ALL
      SELECT CHARINDEX(@sep, @s, pos + 1), pos
      FROM splitter_cte
      WHERE pos > 0
    )
    SELECT SUBSTRING(@s, lastPos + 1,
                     case when pos = 0 then 80000
                     else pos - lastPos -1 end) as chunk
    FROM splitter_cte
  )
GO
 
SELECT *
  FROM dbo.Split(' ', 'the quick brown dog jumped over the lazy fox')
OPTION(MAXRECURSION 0);

Original Post: http://stevestedman.com/2012/04/using-a-cte-to-split-a-string-into-rows/

At the time, I thought it was a pretty handy function. I ended up including it as an example in my CTE book, and I have used it dozens of times over the last couple of years.

SQL Server 2016

In SQL Server 2016, Microsoft has introduced a built-in function called STRING_SPLIT which is very similar to this example. If you are using SQL Server 2016, you may want to consider the STRING_SPLIT function instead. The examples shown here work great and have been tested on SQL Server 2005, SQL Server 2008, SQL Server 2008R2, SQL Server 2012, SQL Server 2014, and SQL Server 2016. If Microsoft has done it right in SQL Server 2016, I would expect their version of STRING_SPLIT to have better performance.
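For reference, the built-in function returns one row per element in a single column named value:

```sql
-- Built-in STRING_SPLIT, available on SQL Server 2016 and newer
-- (requires database compatibility level 130 or higher).
SELECT value
FROM STRING_SPLIT('the quick brown dog jumped over the lazy fox', ' ');
```

One caveat: STRING_SPLIT does not guarantee the order of its output rows, so if position in the list matters you will need to handle ordering yourself.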

Performance with splitting strings

This last week I was working with a client who had a performance issue with an SSRS report that took over an hour to run; the person working on the report did an excellent job tracking it down. In SSRS, when you have the option to select multiple items from a list with checkboxes, the IDs of those items are passed in to the report as a comma-separated string. Something like this:

“12334, 34234, 2342332, 234234, 23423”

In order to use that in a query, it needs to be split out into something that can easily be joined. The way I have typically seen this done is with a split function like the one above. They had a split function, but it didn’t use a CTE; instead, the function used a loop to iterate through the list and break it up into rows.

SELECT * FROM originalSplitFunction('12334, 34234, 2342332, 234234, 23423', ',')


Run stand-alone, it worked great; it completed in far less than a second.

The person working on the report had narrowed down the query to something like this:

SELECT some columns
FROM Table1 as t1
INNER JOIN Table2 as t2 on t1.id = t2.foreignKeyValue
INNER JOIN originalSplitFunction('12334, 34234, 2342332, 234234, 23423', ',') as s 
           on s.id = t2.id;

Which took over an hour to run. Through some testing with hard-coded values, he had also determined that the following ran in just a second.

SELECT some columns
FROM Table1 as t1
INNER JOIN Table2 as t2 on t1.id = t2.foreignKeyValue
WHERE t2.id IN (12334, 34234, 2342332, 234234, 23423);

In this case, Table1 and Table2 were both very large tables with some great indexes.

 

Why did it take so long to run with the originalSplitFunction?

 

Here is what I knew as we started looking into it:

  • The originalSplitFunction when included in the query took over an hour to run.
  • The originalSplitFunction contained multiple statements with a loop and some work to fill up the result table.
  • The SQL Server query optimizer treats multiple statement table valued functions as a black box when analyzing the plan. The optimizer doesn’t know anything about how the function works, or how many results will be returned.
  • The SQL Server query optimizer is able to expand single statement (inline) table valued functions into the calling query's plan; therefore, if we have a string split function that is a single statement, we should be able to end up with a better plan.
  • The use of the originalSplitFunction was causing a full table scan on a very large table.
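To illustrate the difference, here is a hypothetical pair of functions; the names and bodies are my own illustration, not the client's code. The first builds its result with a loop across multiple statements, so the optimizer treats it as a black box; the second is a single SELECT that the optimizer can expand inline, much like a parameterized view.

```sql
-- Multi-statement TVF: the optimizer sees only a black box and guesses
-- at the row count.
CREATE FUNCTION dbo.LoopNumbers (@n INT)
RETURNS @result TABLE (id INT)
AS
BEGIN
    DECLARE @i INT = 1;
    WHILE @i <= @n
    BEGIN
        INSERT INTO @result (id) VALUES (@i);
        SET @i += 1;
    END;
    RETURN;
END;
GO

-- Inline TVF: a single SELECT the optimizer can fold into the calling
-- query's plan. (Row source capped by the size of sys.all_objects,
-- which is plenty for an illustration.)
CREATE FUNCTION dbo.InlineNumbers (@n INT)
RETURNS TABLE
AS
RETURN (
    SELECT TOP (@n) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS id
    FROM sys.all_objects
);
GO
```

Both return the same rows, but only the inline version gives the optimizer visibility into what is being joined.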

The new and improved function

That’s when I remembered my CTE-based split function, which uses only a single statement. We grabbed the split function from my blog and, with some minor adjustments to handle VARCHAR(MAX), ended up with the following function.

CREATE FUNCTION dbo.SplitString (@s VARCHAR(MAX), @sep CHAR(1))
RETURNS TABLE
AS
-- SplitString function created by Steve Stedman
RETURN (
WITH splitter_cte AS
(
  SELECT CHARINDEX(@sep, @s) as pos, cast(0 AS BIGINT) as lastPos
  UNION ALL
  SELECT CHARINDEX(@sep, @s, pos + 1), pos
  FROM splitter_cte
  WHERE pos > 0
)
, splitter2_cte AS
(
  SELECT ltrim(rtrim(SUBSTRING(@s, lastPos + 1,
  case when pos = 0 then 80000000000
  else pos - lastPos -1 end))) as chunk
  FROM splitter_cte
)
SELECT chunk as chunk
  FROM splitter2_cte
 WHERE NULLIF(chunk, '') IS NOT NULL
)
GO

SELECT *
FROM dbo.SplitString('the quick brown dog jumped over the lazy fox', ' ')
OPTION(MAXRECURSION 0);

When we replaced the originalSplitFunction with the SplitString function, the query ran in about a second. From 1 hour down to 1 second of run time for a report. I hope someone notices that improvement.

Related Links:

Posted in CTE Book

SQL Server – sysmaintplan_logdetail

The check for excessive sysmaintplan_logdetail history has been added to the Database Health Monitor Quick Scan Report.

If you haven’t had a maintenance plan to clean up maintenance plan history, it can grow excessively over time. I recently worked on a SQL Server that had 6 years' worth of maintenance plan history, and the msdb.dbo.sysmaintplan_logdetail table was approaching nearly 1 GB in size. It would be a tough argument to claim that you really need 6 years of maintenance plan history; if you really do, you might want to copy it off to another database outside of msdb.
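If you do want to keep the old history, a sketch of the copy-off approach looks like this. MaintHistoryArchive is a hypothetical archive database name; create it (or substitute an existing one) first.

```sql
-- Sketch: archive maintenance plan history older than a year to another
-- database before purging it from msdb. SELECT INTO creates the target
-- table, so this form only works the first time; after that, switch to
-- INSERT INTO the existing archive table.
SELECT *
INTO MaintHistoryArchive.dbo.sysmaintplan_logdetail_archive
FROM msdb.dbo.sysmaintplan_logdetail
WHERE start_time < GETDATE() - 365;
```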

The problem with this type of excessive growth is that it bloats out the size of msdb, which should be a pretty small database under almost any circumstances.

Here are some queries to examine the contents of the msdb.dbo.sysmaintplan_logdetail table.

SELECT TOP 100 *
FROM msdb.dbo.sysmaintplan_logdetail WITH (NOLOCK)


SELECT count(*)
FROM msdb.dbo.sysmaintplan_logdetail WITH (NOLOCK)


SELECT TOP 1 start_time, count(1) OVER() as numOlder
FROM msdb.dbo.sysmaintplan_logdetail WITH (NOLOCK)
WHERE start_time < GETDATE() - 365
ORDER BY start_time ASC

The following stored procedure, sp_maintplan_delete_log, can be used to clean up the excessive history. You could also use the maintenance plan history cleanup task.

EXEC msdb.dbo.sp_maintplan_delete_log @oldest_time = '3/19/2016';

The recommendation is to clean out this table regularly. As it grows, it can add excess overhead when running maintenance plan jobs.

If you haven’t tried Database Health Monitor, now might be a good time to give it a try. It’s a free application created by Stedman Solutions, LLC, available at http://DatabaseHealth.com

Posted in Database Health

SQL Server – Cost Threshold For Parallelism

The Cost Threshold for Parallelism check has been added to the Database Health Monitor Quick Scan Report.

The Cost Threshold for Parallelism setting in SQL Server controls the query cost level at which the query optimizer decides to process a query using parallelism, or multiple threads.

The default setting is 5, which may have made sense 15 years ago on SQL Server 2000 or older; but on more modern versions of SQL Server, with much faster processors, more memory, and major all-around improvements, the default of 5 no longer makes sense.

I usually set it to 50, then evaluate system performance and adjust it from there. Here is a SQL Script to set the cost threshold for parallelism to 50.

sp_configure 'show advanced options', 1;
GO
reconfigure;
GO
sp_configure 'cost threshold for parallelism', 50;
GO
reconfigure;
GO

You may need to set this to something much higher than 50; it really depends on your overall SQL Server performance and workload.

If you are adjusting Cost Threshold for Parallelism you may also want to take a look at adjusting the Max Degree of Parallelism at the same time.
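Max Degree of Parallelism is changed the same way; as a sketch, where the value 8 is just an illustration and the right setting depends on your core count and NUMA configuration:

```sql
sp_configure 'show advanced options', 1;
GO
RECONFIGURE;
GO
-- 8 is an example value only; size to your cores and NUMA layout.
sp_configure 'max degree of parallelism', 8;
GO
RECONFIGURE;
GO
```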

If you haven’t tried Database Health Monitor, now might be a good time to give it a try. It’s a free application created by Stedman Solutions, LLC, available at http://DatabaseHealth.com

Posted in Database Health

SQL Server Performance Tuning

Need help with SQL Server performance tuning? Contact Steve at Stedman Solutions, LLC, or take a look at the performance tuning page on the SteveStedman.com blog. We can help with tuning a single query or with figuring out why your SQL Server is running slow. Free 30-minute consultation.

Newsletter Signup

Stay informed of Database Corruption Challenge events, and other things happening at SteveStedman.com with my newsletter.
