
Detrimental effects of predicates on cardinality estimation by the query processor

• Queries with predicates that use comparison operators between different columns of the same table.

• Queries with predicates that use operators, and any one of the following is true:
   o There are no statistics on the columns involved on either side of the operators.
   o The distribution of values in the statistics is not uniform, but the query seeks a highly selective value set. This is especially likely if the operator is anything other than the equality (=) operator.
   o The predicate uses the not equal to (!=) comparison operator or the NOT logical operator.

• Queries that use any of the SQL Server built-in functions or a scalar-valued, user-defined function whose argument is not a constant value.

• Queries that involve joining columns through arithmetic or string concatenation operators.

• Queries that compare variables whose values are not known when the query is compiled and optimized.
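
A minimal sketch of a few of these patterns, assuming hypothetical Sales.Orders and Sales.Shipments tables (none of these names come from the source):

-- Comparing two columns of the same table:
SELECT * FROM Sales.Orders WHERE ShipDate > OrderDate;

-- Built-in function applied to a column (the predicate also becomes non-sargable):
SELECT * FROM Sales.Orders WHERE YEAR(OrderDate) = 2014;

-- Joining columns through string concatenation:
SELECT *
FROM Sales.Orders o
JOIN Sales.Shipments s ON o.Region + '-' + o.OrderNo = s.ShipKey;

-- Variable whose value is not known when the query is compiled:
DECLARE @cutoff date = DATEADD(DAY, -30, GETDATE());
SELECT * FROM Sales.Orders WHERE OrderDate > @cutoff;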


table variable myths

Table variables are not “in memory” tables; they behave essentially the same as #temporary tables and even create objects in tempdb.

The difference is that table variables are “lightweight” temporary tables: they exist only for the scope of the batch, and no statistics are maintained on them (the optimizer assumes the table holds a single row). Also, you cannot create indexes on them, aside from those backing primary key and unique constraints.

So table variables can improve performance and consume fewer resources with small data sets; with larger data sets, however, temporary tables often perform better.
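
You can see the tempdb claim for yourself. A minimal sketch (run as one batch; tempdb assigns the object a system-generated name):

DECLARE @t TABLE (id int PRIMARY KEY);

-- On a quiet instance, the newest user object in tempdb is the table variable
SELECT TOP (1) name, type_desc, create_date
FROM tempdb.sys.objects
WHERE type = 'U'
ORDER BY create_date DESC;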


SQL SERVER 2014 DATA TOOLS

For reasons as yet unknown, Microsoft decided to stop shipping SQL Server Data Tools with the installation media. Adding to the confusion, the tools are now split into different versions.

There is one version named just SQL Server Data Tools (SSDT), which is an add-on to Visual Studio. This version adds the toolset that was formerly known as Data Dude to Visual Studio. These SSDT tools include data and schema comparison, as well as support for refactoring databases and creating views, stored procedures, functions, and triggers. This version of SSDT provides both a live development mode and an offline project mode that supports team development by integrating your databases with Visual Studio source control and change tracking. There are different versions of these SSDT tools depending on the version of Visual Studio that you are using.

• SQL Server tooling in Visual Studio 2013—Visual Studio 2013 Express for Web, Express for Windows Desktop, Professional, Premium, and Ultimate include the SSDT tools, so no separate download is needed. To check for the latest version of SSDT, open Visual Studio 2013, choose Tools > Extensions and Updates, and check the Updates section for Microsoft SQL Server Update for database tooling.

• SSDT Visual Studio 2012—there is a standalone install experience as well as an integrated install for the Visual Studio Professional, Premium, and Ultimate SKUs.

• SSDT Visual Studio 2010—this version of SSDT is no longer being updated.

The other version of SSDT is confusingly called SQL Server Data Tools – Business Intelligence (SSDT-BI). Although their names are almost identical, SSDT-BI is a distinctly different toolset from SSDT. SSDT-BI is the replacement for BIDS (Business Intelligence Development Studio), and it enables development of Integration Services packages, Analysis Services cubes, and Reporting Services reports. Both versions of SSDT are no-cost downloads for SQL Server users.

Source: 55144BC SQL SERVER PERFORMANCE TUNING AND OPTIMIZATION


i am the very model of a modern data professional

I am the very model of a modern Data-Professional,
I’ve information vegetable, animal, and mineral,
I know the kings of England, and I quote the fights historical
From Marathon to Waterloo, in order categorical;
I’m very well acquainted, too, with matters mathematical,
I understand equations, both the simple and quadratical,
About binomial theorem I’m teeming with a lot o’ news,
With many cheerful facts about the square of the hypotenuse.
I’m very good at integral and differential calculus;
I know the scientific names of beings animalculous:
In short, in matters vegetable, animal, and mineral,
I am the very model of a modern Data-Professional.

I know our mythic history, King Arthur’s and Sir Caradoc’s;
I answer hard acrostics, I’ve a pretty taste for paradox,
I quote in elegiacs all the crimes of Heliogabalus,
In conics I can floor peculiarities parabolous;
I can tell undoubted Raphaels from Gerard Dows and Zoffanies,
I know the croaking chorus from The Frogs of Aristophanes!
Then I can hum a fugue of which I’ve heard the music’s din afore,
And whistle all the airs from that infernal nonsense Pinafore.
Then I can write a washing bill in Babylonic cuneiform,
And tell you ev’ry detail of Caractacus’s uniform:
In short, in matters vegetable, animal, and mineral,
I am the very model of a modern Data-Professional.

In fact, when I know what is meant by “mamelon” and “ravelin”,
When I can tell at sight a Mauser rifle from a javelin,
When such affairs as sorties and surprises I’m more wary at,
And when I know precisely what is meant by “commissariat”,
When I have learnt what progress has been made in modern gunnery,
When I know more of tactics than a novice in a nunnery –
In short, when I’ve a smattering of elemental strategy –
You’ll say a better Data-Professional has never sat a gee.
For my military knowledge, though I’m plucky and adventury,
Has only been brought down to the beginning of the century;
But still, in matters vegetable, animal, and mineral,
I am the very model of a modern Data-Professional.


before restarting that instance…

Check the number of VLFs – virtual log file counts in the 1000s will mean longer recovery time (the sketch at the end of this post shows one way to check)

Check size of database and transaction log files – the bigger the file sizes, the longer it will take to recover

Check size of tempdb file definitions – tempdb will be rebuilt; is it properly sized and in the right location? Also, the bigger the file sizes, the longer it will take to allocate the space.

When was the last Full / Incremental / Transaction log backup taken? If possible, kick off another transaction log backup before restarting.

Check SQL Server Agent jobs and make sure no big jobs are running, then stop the Agent

Update your resume and/or LinkedIn profile 🙂

After restart:
– If the databases are small and the impact is minimal, run DBCC CHECKDB WITH PHYSICAL_ONLY
– Check SQL Server Agent; start it if it has not started automatically
– Trace flags not defined in startup parameters were lost
– Cumulative statistics in the DMVs (wait stats, etc.) were reset
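
A minimal sketch of the checks above (DBCC LOGINFO returns one row per VLF for the current database; sys.master_files stores size in 8 KB pages; the database name in the CHECKDB line is a placeholder):

-- VLF count for the current database: one row per virtual log file
DBCC LOGINFO;

-- Data and log file sizes and locations for every database
SELECT DB_NAME(database_id) AS database_name,
       name AS logical_name,
       type_desc,
       size * 8 / 1024 AS size_mb,
       physical_name
FROM sys.master_files
ORDER BY database_name, type_desc;

-- After restart, for smaller databases with minimal impact
DBCC CHECKDB (N'db_name') WITH PHYSICAL_ONLY;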


Fix Logins -> DB Users

When a database is restored from one server to another, the database users are still mapped to the SIDs of the logins on the original server. To remap the database users to the logins on the destination server, you can drop and recreate each user, or run sp_change_users_login. The script below runs this for all SQL users in the database.

USE [db_name]
GO

SET NOCOUNT ON
GO

declare @tab TABLE ( id int identity(1,1), sqlcmd nvarchar(4000) )
declare @id int, @sqlcmd nvarchar(4000)

-- Build one sp_change_users_login command per SQL user whose name
-- matches a login name (case-insensitive)
insert into @tab ( sqlcmd )
select 'exec sp_change_users_login ''update_one'', ''' + A.name + ''',''' + B.name + ''''
from sys.sysusers A
inner join sys.syslogins B
on lower(A.name) collate SQL_Latin1_General_CP1_CI_AS = lower(B.name) collate SQL_Latin1_General_CP1_CI_AS
where issqluser = 1

-- Execute each generated command
select @id = max(id) from @tab

while @id > 0
begin
   select @sqlcmd = sqlcmd from @tab where id = @id
   print @sqlcmd
   exec sp_executesql @sqlcmd
   set @id = @id - 1
end

GO
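
Note: sp_change_users_login is deprecated in later versions of SQL Server. The documented replacement is ALTER USER … WITH LOGIN, which remaps one user at a time (the name below is a placeholder):

ALTER USER [app_user] WITH LOGIN = [app_user];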


Cost Threshold for Parallelism

Was experiencing high CPU usage (95%+) on a 4 vCPU server I had “inherited”.
The server runs SQL Server, Reporting Services, and Integration Services.
Cost Threshold for Parallelism was set to 25. Increased it to 120 and immediately saw a drop in CPU contention.

Average CPU usage dropped to 25%-30%, with short-lived spikes instead of long-running contention.

Lowered it to 60 and the contention returned, so I increased it in steps of 10 up to 100, at which point the contention was again relieved.
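
For reference, a minimal sketch of changing the setting (100 is where this server ended up; the right value varies by workload):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'cost threshold for parallelism', 100;
RECONFIGURE;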


sp_recompile

Causes stored procedures, triggers, and user-defined functions to be recompiled the next time that they are run. It does this by dropping the existing plan from the procedure cache, forcing a new plan to be created the next time that the procedure or trigger is run. In a SQL Server Profiler collection, the event SP:CacheInsert is logged instead of the event SP:Recompile.

https://msdn.microsoft.com/en-us/library/ms181647(v=sql.110).aspx
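
Usage is a one-liner (the procedure name here is a placeholder); passing a table or view name instead marks every stored procedure and trigger that references it for recompilation:

EXEC sp_recompile N'dbo.usp_MyProc';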


Log Reuse Waits

   SELECT name
        , log_reuse_wait_desc
     FROM sys.databases;

  • NOTHING – VLFs available for reuse
  • CHECKPOINT – dirty pages in the buffer pool are awaiting a checkpoint; issue a CHECKPOINT and/or adjust the recovery interval setting
  • LOG_BACKUP – waiting for the next transaction log backup
  • ACTIVE_BACKUP_OR_RESTORE – a data backup or restore is in progress or is “hung”
  • ACTIVE_TRANSACTION – an open transaction; use DBCC OPENTRAN to identify it (see the sketch below)
  • DATABASE_MIRRORING – the mirror has fallen behind
  • REPLICATION – replication (the log reader) has fallen behind
  • DATABASE_SNAPSHOT_CREATION – transient; a database snapshot is being created
  • LOG_SCAN – transient; a log scan is in progress (fn_dblog, the scan a checkpoint initiates to synchronize log sequence numbers, or a similar process)
  • OTHER_TRANSIENT – currently not in use
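
If you see ACTIVE_TRANSACTION, a quick way to find the oldest open transaction (the database name is a placeholder):

   DBCC OPENTRAN ('db_name');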