Sunday, January 03, 2016

A challenge for 2016....... accepted!

I was reading a challenge for 2016 post by Aaron Bertrand, and in that post Aaron mentioned that Tim Ford had challenged the SQL community to write one introductory-level post per month.


I am not an MVP anymore, but I decided to jump on the bandwagon as well. I have noticed with some of our new hires that things I take for granted are not always clear or obvious to them.

Here are some of the things I will blog about

Selecting all the rows from one table that don't exist in another table
How to make SSMS rock so that you are more productive
Get the max/first/latest/oldest row based on some criteria

I will think of more as time passes. The summer, when we get our interns in the office, will also be a good time to gather some additional ideas.


Saturday, October 31, 2015

TempDB Database changes in SQL Server 2016

According to Books Online, there are several changes in SQL Server 2016 with regard to TempDB. Here is what Books Online has on the topic:

There are several enhancements to TempDB:
  • Trace Flags 1117 and 1118 are not required for tempdb anymore. If there are multiple tempdb database files all files will grow at the same time depending on growth settings. In addition, all allocations in tempdb will use uniform extents.
  • By default, setup adds as many tempdb files as the CPU count or 8, whichever is lower.
  • During setup, you can configure the number of tempdb database files, initial size, autogrowth and directory placement using the new UI input control on the Database Engine Configuration - TempDB section of SQL Server Installation Wizard.
  • The default initial size is 8MB and the default autogrowth is 64MB.
  • You can specify multiple volumes for tempdb database files. If multiple directories are specified tempdb data files will be spread across the directories in a round-robin fashion.

I decided to check this out. I downloaded and installed SQL Server 2016 Community Technology Preview 3.0; during the install I did not make any changes to TempDB.

Once the install was done, I executed the following to see how many files TempDB had on my laptop

exec sp_helpdb 'tempdb'

Here is what I saw in the output

name     fileid  filename                    filegroup  size     maxsize    growth    usage
tempdev  1       C:\DATA\tempdb.mdf          PRIMARY    8192 KB  Unlimited  65536 KB  data only
templog  2       C:\DATA\templog.ldf         NULL       8192 KB  Unlimited  65536 KB  log only
temp2    3       C:\DATA\tempdb_mssql_2.ndf  PRIMARY    8192 KB  Unlimited  65536 KB  data only
temp3    4       C:\DATA\tempdb_mssql_3.ndf  PRIMARY    8192 KB  Unlimited  65536 KB  data only
temp4    5       C:\DATA\tempdb_mssql_4.ndf  PRIMARY    8192 KB  Unlimited  65536 KB  data only


As you can see, there are 4 data files, presumably because my laptop reports 4 logical CPUs. When I looked at my SQL Server 2014 instance, which was installed with defaults, there is only 1 data file. Another difference is that SQL Server 2014 uses 10% growth as the default, while SQL Server 2016 uses 65536 KB (64 MB).
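
If you prefer not to use sp_helpdb, a query against sys.master_files should show the same information; here is a minimal sketch (the MB conversions are mine):

SELECT name,
       physical_name,
       size * 8 / 1024 AS size_mb,      -- size is stored in 8 KB pages
       CASE WHEN is_percent_growth = 1
            THEN CAST(growth AS varchar(10)) + ' percent'
            ELSE CAST(growth * 8 / 1024 AS varchar(10)) + ' MB'
       END AS growth_setting
FROM sys.master_files
WHERE database_id = DB_ID('tempdb');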

This is a good change for all the casual installs by people who just want to experiment and learn; most shops, I would think, have already changed TempDB to have more than 1 file. But then again, maybe Microsoft noticed that a lot of the instances behind their support calls still have only 1 data file for TempDB, and that is why they decided to implement this change.


Thursday, October 29, 2015

SQL Server 2016 Community Technology Preview 3.0 is now available for download

Public service announcement!!!!!

SQL Server 2016 Community Technology Preview 3.0 is now available for download. The download is 2.6 GB; you can get it here: SQL Server 2016 Community Technology Preview 3.0


I am already downloading mine.... what are you waiting for?


The stuff I am most excited about in this CTP is the SQL Server R Services in SQL Server 2016....


To accompany CTP3, the SQL Server team also created an updated version of AdventureWorks, called AdventureWorks2016CTP3, and they have created a number of samples to showcase some of the new functionality in SQL2016. They recommend you download and install the sample databases and play with the sample scripts. Download link: AdventureWorks Sample Databases and Scripts for SQL Server 2016 CTP3

SQL Server 2016 Community Technology Preview 3.0 will be available this week

SQL Server 2016 Community Technology Preview 3.0 will be available this week.

Some of the new and exciting stuff......


Advanced Analytics (RRE integration)

With this release, we are very excited to announce the public availability of SQL Server R Services in SQL Server 2016, an Advanced Analytics capability which supports enterprise-scale data science, significantly reducing the friction for adopting machine learning in your business. SQL Server R Services is all about helping customers embrace the highly popular open source R language in their business. R is the most popular programming language for Advanced Analytics. You can use it to analyze data, uncover patterns and trends and build predictive models. It offers an incredibly rich set of packages and a vibrant and fast-growing developer community. At the same time, embracing R in an enterprise setting presents certain challenges, especially as the volume of data rises and with the switch from modeling to production environments. Microsoft SQL Server R Services with in-database analytics helps customers embrace this technology by supporting several scenarios.
Transactional replication from SQL Server to Azure SQL DB
This is new in CTP3. Now you can set up Azure SQL DB as a subscriber of transactional replication, allowing you to migrate data from a SQL Server instance on-premises or in IaaS to Azure SQL Database without downtime. The replication is one way in this release, and works with SQL Server 2016, SQL Server 2014 and SQL Server 2012. This is the same Transactional Replication technology you have been using for many years on-premises. As you configure a subscriber (from SSMS or by script), instead of entering an instance name, you enter the name of your Azure SQL DB subscription along with the associated login and password. A snapshot (as in a Replication Snapshot) will be used to initialize the subscription, and subsequent data changes will be replicated to your Azure SQL DB in the same transactionally consistent way you are used to. A transactional publication can deliver changes to subscribers both in Azure SQL DB and/or on-premises/Azure VM. There is no Replication service hosted in Azure for this. Everything is driven from on-premises distribution agents. To use this feature, you just need to set it up the way you do to replicate on-premises: install the Replication components, configure the Distributor and the Publisher, and create the Publication, the Articles and the Subscriptions. In this case, one of the subscriptions will be your Azure SQL DB.
In-Memory improvements in this release:
  • In-Memory OLTP
    • FOREIGN KEY constraints
    • CHECK constraints
    • UNIQUE constraints
    • DML triggers (AFTER only)
    • EXECUTE AS CALLER
    • Inline table-valued functions
    • Security built-ins and increased math function support
  • Real-time Operational Analytics
    • Support for in-memory tables
    • Existing nonclustered columnstore indexes (NCCI) are updateable without requiring an index rebuild
    • Parallel Index build of nonclustered columnstore index (NCCI)
    • Performance improvements (INSERT, String pushdown, bypassing delete buffer when processing deleted rows)
  • In-Memory Analytics
    • You can upgrade databases with nonclustered columnstore index and have it updateable without requiring rebuild of the index
    • General performance improvements for analytics queries with columnstore index especially involving aggregates and string predicates
    • Improved supportability with DMVs and XEvents
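
As a quick illustration of the In-Memory OLTP constraint support listed above, a memory-optimized table with UNIQUE and CHECK constraints might look something like this sketch (my own example, not from the announcement; it assumes a database that already has a MEMORY_OPTIMIZED_DATA filegroup):

CREATE TABLE dbo.Customer
(
    CustomerID  int           NOT NULL PRIMARY KEY NONCLUSTERED,
    Email       nvarchar(200) NOT NULL,
    CreditLimit decimal(10,2) NOT NULL,
    -- constraints like these are new for memory-optimized tables in SQL Server 2016
    CONSTRAINT UQ_Customer_Email UNIQUE NONCLUSTERED (Email),
    CONSTRAINT CK_Customer_CreditLimit CHECK (CreditLimit >= 0)
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
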
Stretch Database updates in this release:
  • Engine Update
    • Create/Drop index support
    • AlwaysEncrypted support
    • Improved credential management for remote Stretch database stores
    • Improved performance for joins between stretched tables
    • New external data source integration
  • SSMS Wizard updates
    • Database and Table level fly out menu options were updated to reflect new Stretch functionality
    • Stretch monitor functionality added to allow users to monitor current migration status, including the ability to pause the migration at the table level
    • XEVENT support for diagnostics session support in monitor
    • Updated and simplified stretch wizard flow to reduce the amount of steps required to enable or reconfigure Stretch
    • Help icon links Updated to point to new MSDN content focusing specifically on wizard topic
    • Added functionality that allows users to pause or disable migration at the table level
    • Added ability to Stretch individual tables
    • Added database scoped credential support - for AlwaysOn
    • Ability to enable Stretch on the server using the wizard
    • Updated table level validation error/warning messaging
    • The ability to Stretch to new SQL Azure or existing SQL Azure server
    • Updated SSMS Object Explorer Stretch Databases icons
    • SMO model for Stretch status query and updates
Polybase in CTP3 includes the following new capabilities:
  • Improved PolyBase query performance with scale-out computation on external data (PolyBase scale-out groups)
  • Improved PolyBase query performance with faster data movement from HDFS to SQL Server and between PolyBase Engine and SQL Server
  • Support for exporting data to external data source via INSERT INTO EXTERNAL TABLE SELECT FROM TABLE
  • Support for push-down computation to Hadoop for string operations (compare, LIKE)
  • Support for ALTER EXTERNAL DATA SOURCE statement
Built-in JSON support improvements in this release include:
  • OPENJSON - Table-valued function that parses JSON text and returns a rowset view of the JSON. By default, OPENJSON returns the properties of the object or the elements of the array that is parsed. The advanced version of the OPENJSON function with a defined schema allows the user to define the schema of the resulting rowset, along with mapping rules that define where in the parsed JSON text the values returned in the resulting rowset can be found. It enables developers to easily parse JSON text and import it into relational tables.
  • JSON_VALUE - Scalar function that returns a value from JSON on the specified path. It can be used in any query, view, computed column. It can be also used to define indexes on properties of JSON text stored in table columns.
  • ISJSON - function that validates that JSON is properly formatted. It can be used to define check constraints on the columns that contain JSON text. It is not supported in check constraints defined on in-memory tables.
  • JSON_QUERY - Scalar function that returns a fragment from the JSON text. Unlike JSON_VALUE that returns scalar values, JSON_QUERY returns complex object (i.e. JSON arrays and objects).
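
Here is a small sketch of how these JSON functions fit together (my own example; OPENJSON assumes the database is running at compatibility level 130):

DECLARE @json nvarchar(max) =
    N'{"id": 1, "name": "Widget", "tags": ["red", "large"]}';

-- validate the text, extract a scalar, extract a fragment
SELECT ISJSON(@json)               AS is_valid_json,  -- 1 when well formed
       JSON_VALUE(@json, '$.name') AS name_value,     -- scalar value
       JSON_QUERY(@json, '$.tags') AS tags_fragment;  -- JSON array returned as a fragment

-- shred the JSON into a rowset with an explicit schema
SELECT id, name
FROM OPENJSON(@json)
WITH (id int '$.id', name nvarchar(100) '$.name');
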
Temporal support improvements in this release include:
  • Support for using temporal system-versioning with In-Memory OLTP
    • Combining disk-based table for cost-effective storing of history data with memory-optimized tables for storing latest (actual) data
    • Super-fast DML and current data querying supported from natively compiled code
    • Temporal querying supported from interop mode
    • Internal in-memory table created to minimally impact performance of DML operations
    • Background process that flushes the data from internal in-memory to permanent disk-based history table
  • Direct ALTER for system-versioned temporal tables enables modifying table schema without introducing maintenance window
    • Support for adding/altering/dropping columns while SYSTEM_VERSIONING is ON
    • Support for ADD/DROP HIDDEN for period columns while SYSTEM_VERSIONING is ON
  • Support for temporal querying clause FOR SYSTEM_TIME ALL that enables users to query entire data history easily without specifying period boundaries
  • Optimized CONTAINED IN implementation with minimized locking on current table. If your main case is analysis on historical data only, use CONTAINED IN.
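
A minimal sketch of a system-versioned temporal table and the new FOR SYSTEM_TIME ALL clause mentioned above (my own example, not from the announcement):

CREATE TABLE dbo.Employee
(
    EmployeeID int           NOT NULL PRIMARY KEY CLUSTERED,
    Name       nvarchar(100) NOT NULL,
    ValidFrom  datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo    datetime2 GENERATED ALWAYS AS ROW END   NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.EmployeeHistory));
GO

-- returns current and historical versions of the row in one result set
SELECT EmployeeID, Name, ValidFrom, ValidTo
FROM dbo.Employee
FOR SYSTEM_TIME ALL
WHERE EmployeeID = 1;
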
Query Store improvements in this release include:
  • Performance monitoring supported for natively compiled code from In-Memory OLTP workloads:
    • Collecting queries, plans and compile time statistics enabled for natively compiled queries when Query Store is ON
    • Stored plan is semantically equivalent to one that is produced when SET SHOWPLAN_XML is set to ON with one difference: plans in Query Store are always split and stored per individual statement
    • Runtime statistics collection is controlled with sys.sp_xtp_control_query_exec_stats (not enabled by default)
    • is_natively_compiled field added to sys.query_store_plan to help find queries generated by native code compilation
    • Plan forcing for queries from natively compiled modules is available and forced plans are honored during module recompilation. As for disk-based workloads, Query Store does not guarantee success of plan forcing operation as some plan shapes cannot be forced
    • Memory grants metrics within sys.query_store_runtime_stats are not populated for natively compiled queries – their values are always 0
  • Improving implementation of time-based cleanup (configured with STALE_QUERY_THRESHOLD_DAYS) to run in multiple transactions, holding database lock for a shorter period of time and thus minimize impact on customer workload
  • Hadoop Connector: the SSIS Hadoop connector allows customers to copy data to/from HDFS and trigger Hive/Pig jobs on a Hadoop cluster. This brings in the following components: Hadoop Connection Manager, H
SQL Server Integration Services (SSIS) improvements in this release include:
  • SSIS control flow templates enable customers to save a commonly used control flow task or container to a standalone template file and reuse it multiple times in a package, or in multiple packages in a project. The reusability introduced by templates makes SSIS packages easier to design and maintain.
  • Added Azure blob support for the Import/Export wizard; users can use an Azure blob as the source or destination during the transformation.
  • Relaxed Max Buffer Size of Data Flow Task. The max Default Buffer Size of Data Flow Task is relaxed to 2G-1 from 100M. A new attribute 'AutoAdjustBufferSize' is added to Data Flow Task, which can be set in SSDT. If it is set to true, the Default Buffer Size will be set automatically in runtime according to Default Buffer Max Rows.
SQL Server Analysis Services (SSAS) improvements in this release include the following; please visit the SSAS team blog to learn more.
  • DBCC support
  • The Microsoft.AnalysisServices library has been re-factored to include a second namespace, Microsoft.AnalysisServices.Core. The new namespace separates out common classes like Server, Database, and Role that have broad application in Analysis Services, irrespective of server mode.
  • SSMS and SSDT updates for Tabular
SQL Server Reporting Services (SSRS) improvements in this release include the following; please visit the SSRS team blog to learn more.
  • Pin Reporting Services report items - Including charts, gauge panels, maps, and images - to Power BI dashboards. Dashboard tiles always show up-to-date data thanks to scheduled refresh. Click a dashboard tile to drill through to the complete Reporting Services report.
  • Design reports using Visual Studio 2015 with an updated version of SQL Server Data Tools.
  • Use .NET Framework 4.x code in report expressions, report code, referenced assemblies, and extensions for report security, data processing, rendering, or delivery.
Master Data Services (MDS) improvements in this release include:
  • Entity Change Approve Flow. An admin can mark an entity as requiring approval for changes. Examples include:
    • Admin marks an entity requiring approval for changes in the entity administrator page
    • User needs to save pending change to change set and submit to admin for approval
    • Admin approves or rejects the pending changes
    • The approved pending changes will be committed to the master data services
  • Domain Based Attribute Constrained List. For domain-based attributes, optionally, user can select a parent attribute whose value will constrain the allowed values for this attribute. Examples include:
    • Model has State, City, Account entity
    • Account has a City DBA to City entity and State DBA to State entity
    • City has a State DBA to State entity and a derived hierarchy from State to City
    • A constraint can be added on the Account.City attribute with Account.State as its parent, so the City attribute dropdown list is constrained by the State value
Query memory grant enhancements in CTP3:
  • Updated sort_warning and hash_warning XEvent to include spill IO stats
  • New hash_spill_details XEvent for detailed spill information
  • Statistics XML and SSMS updated to include spill warning
  • sys.dm_exec_query_stats DMV: updated with new memory grant and parallelism info
  • sys.dm_exec_requests DMV: updated with new memory grant and parallelism info
  • sys.dm_exec_query_parallel_workers: new DMV to show worker threads status for parallel queries
  • sys.dm_exec_query_optimizer_memory_gateways: new DMV to show query optimizer compile gates




To see when SQL Server 2016 Community Technology Preview 3.0 will be available visit this page  http://www.microsoft.com/en-us/server-cloud/products/sql-server-2016/default.aspx

Sunday, October 18, 2015

How much memory does a SQL Server connection use?

Someone asked me the other day how much memory a SQL Server connection uses. I told him that I remembered it being about 40 KB or so. We decided to look it up; the SQL Server internals books say it is about 24 KB. I was looking in Books Online, and the only thing I could find was the following on the SQL Server Connection Basics page:

SQL Server sets aside three packet buffers for every connection made from a client. Each buffer is sized according to the default network packet size specified by the sp_configure stored procedure. If the default network packet size is less than 8KB, the memory for these packets comes from SQL Server's buffer pool. If it's 8KB or larger, the memory is allocated from SQL Server's MemToLeave region.
It's worth noting that the default network packet size for the .NET Framework Data Provider for SQL Server is 8KB, so the buffers associated with managed code client connections typically come from SQL Server's MemToLeave region. This contrasts with classic ADO applications, where the default packet size is 4KB, and the buffers are allocated from the SQL Server buffer pool.

So the 24 KB mentioned in the SQL Server internals book makes sense, since 3 * 8 KB is 24 KB. But what if your network packet size is not 8 KB?
You can change the network packet size by using sp_configure. The minimum network packet size is 512 bytes, the maximum is 32767 bytes, and the default is 4096 bytes.
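
For example, bumping the packet size to 8192 bytes could look like this (a sketch; 'network packet size (B)' is an advanced option, so advanced options have to be shown first):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'network packet size', 8192;   -- value is in bytes
RECONFIGURE;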

So my question is: does the amount of memory a SQL Server connection uses change based on the network packet size?

Leave me a comment here or you can answer it on twitter as well

SQLSoldier answered the question.

It is approximately (3 * network_packet_size) + 94 KB. However, when MARS is used it is different: when Multiple Active Result Sets is enabled, the user connection uses approximately ((3 + 3 * num_logical_connections) * network_packet_size) + 94 KB.
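
Plugging in the default 4096-byte packet size gives a quick sanity check (my arithmetic, not from the documentation):

(3 * 4 KB) + 94 KB = 106 KB per connection
With MARS and, say, 2 logical connections: ((3 + 3 * 2) * 4 KB) + 94 KB = 130 KB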

You can find the answer here: Memory Used by SQL Server Objects Specifications

Monday, October 12, 2015

Some reasons your tSQLt test is not executing

This post is part of a Unit Testing series I created, you can find links to all the posts here: Unit testing in SQL Server, a list of all the posts
Today we are going to look at why a test might not be running. Before starting, if you don't have tSQLt setup, follow the instruction here: Unit testing in SQL Server, installing tSQLt and doing a quick test
First, if you didn't create a test database yet, let's create one and name it UnittestExamples; we will also create a simple table in that database.

CREATE DATABASE UnittestExamples
GO

USE UnittestExamples
GO

CREATE TABLE SomeTable (ID int)

Now connect to your tSQLt database
USE tSQLt
GO

Let's create two test classes

EXEC tSQLt.NewTestClass 'MyNewClass'
EXEC tSQLt.NewTestClass 'MyNewestClass'

To see all the test classes in your tSQLt database, you can use the following query

SELECT Name, SchemaId
FROM tSQLt.TestClasses;

Name           SchemaId
MyNewClass     6
MyNewestClass  7

Now we will create a bunch of procs; these procs will test for the existence of a table in the UnittestExamples database.

CREATE PROCEDURE MyNewClass.[ObjectExist]
AS
EXEC tSQLt.AssertObjectExists 'UnittestExamples.dbo.SomeTable'
GO

CREATE PROCEDURE MyNewestClass.[ObjectExist]
AS
EXEC tSQLt.AssertObjectExists 'UnittestExamples.dbo.SomeTable'
GO



CREATE PROCEDURE MyNewClass.[test ObjectExist]
AS
EXEC tSQLt.AssertObjectExists 'UnittestExamples.dbo.SomeTable'
GO

CREATE PROCEDURE MyNewestClass.[test ObjectExist]
AS
EXEC tSQLt.AssertObjectExists 'UnittestExamples.dbo.SomeTable'
GO


CREATE PROCEDURE [test ObjectExist]
AS
EXEC tSQLt.AssertObjectExists 'UnittestExamples.dbo.SomeTable'
GO


Now it is time to run our tests.

EXEC tSQLt.tSQLt.Run 'MyNewestClass'


+----------------------+
|Test Execution Summary|
+----------------------+

|No|Test Case Name                    |Result |
+--+----------------------------------+-------+
|1 |[MyNewestClass].[test ObjectExist]|Success|
-----------------------------------------------------------------------------
Test Case Summary: 1 test case(s) executed, 1 succeeded, 0 failed, 0 errored.
-----------------------------------------------------------------------------



EXEC tSQLt.tSQLt.Run 'MyNewClass'



+----------------------+
|Test Execution Summary|
+----------------------+

|No|Test Case Name                 |Result |
+--+-------------------------------+-------+
|1 |[MyNewClass].[test ObjectExist]|Success|
-----------------------------------------------------------------------------
Test Case Summary: 1 test case(s) executed, 1 succeeded, 0 failed, 0 errored.
-----------------------------------------------------------------------------

Now, let's call all the classes in one shot; will that make a difference?

EXEC tSQLt.tSQLt.RunAll


+----------------------+
|Test Execution Summary|
+----------------------+

|No|Test Case Name                    |Result |
+--+----------------------------------+-------+
|1 |[MyNewClass].[test ObjectExist]   |Success|
|2 |[MyNewestClass].[test ObjectExist]|Success|
-----------------------------------------------------------------------------
Test Case Summary: 2 test case(s) executed, 2 succeeded, 0 failed, 0 errored.
-----------------------------------------------------------------------------

Nope, only two out of five tests executed.
So what happened? Why are only two of the five tests running? The reason is that each test has to be named so that the name starts with test. We created the following four test stored procedures:

CREATE PROCEDURE MyNewClass.[ObjectExist]
CREATE PROCEDURE MyNewestClass.[ObjectExist]

CREATE PROCEDURE MyNewClass.[test ObjectExist]
CREATE PROCEDURE MyNewestClass.[test ObjectExist]

As you can see, the first two are not named so that the name starts with test. Renaming those so that they start with test will fix that. I have made this mistake many times myself in the beginning; hopefully this will save someone some time in the future.
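
For example, the rename could look something like this (a sketch; pick a new name that does not collide with the [test ObjectExist] proc we already created in that class, or simply drop and recreate the proc under the correct name):

EXEC sp_rename 'MyNewClass.ObjectExist', 'test ObjectExist v2';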

What about the 5th proc? Why didn't that run?

CREATE PROCEDURE [test ObjectExist]

As you can see, that proc is named correctly; however, it is not in either of the two test classes that we created. It was created in the default schema, which most likely will be dbo. You can actually execute this test by calling the dbo class:

EXEC tSQLt.tSQLt.Run 'dbo'

+----------------------+
|Test Execution Summary|
+----------------------+

|No|Test Case Name          |Result |
+--+------------------------+-------+
|1 |[dbo].[test ObjectExist]|Success|
-----------------------------------------------------------------------------
Test Case Summary: 1 test case(s) executed, 1 succeeded, 0 failed, 0 errored.
-----------------------------------------------------------------------------


So there you have it: the procs have to be named so that they start with test, and the procs have to be created in one of the test classes returned by the following query

SELECT Name, SchemaId
FROM tSQLt.TestClasses;
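
If you want a quick way to spot procs that will be skipped because of their name, something like this should work (my own helper query, run in the tSQLt database):

SELECT tc.Name AS test_class,
       p.name  AS procedure_name
FROM sys.procedures AS p
JOIN tSQLt.TestClasses AS tc ON tc.SchemaId = p.schema_id
WHERE p.name NOT LIKE 'test%';   -- these will not be picked up by tSQLt.Run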

Time to clean up the mess... Here is how you can quickly clean up; this will drop the test class as well as all the objects in that class.


EXEC tSQLt.DropClass 'MyNewClass'
EXEC tSQLt.DropClass 'MyNewestClass'


This post is part of a Unit Testing series I created, you can find links to all the posts here: Unit testing in SQL Server, a list of all the posts

Saturday, October 10, 2015

Your first step in dealing with SQLState = S1000 I/O error while writing a BCP data-file

The other day one job started to fail every time it ran. The error was

Server: FancyServerName 
Job: SomeJobName 
Step: Step 3) SomeStepName 
Message: Executed as user: SomeDomain\SomeServiceAccount. Starting copy... SQLState = S1000, NativeError = 0 Error = [Microsoft][ODBC Driver 11 for SQL Server]I/O error while writing BCP data-file 19 rows copied. Network packet size (bytes): 4096 Clock Time (ms.) Total : 1 Average : (19000.00 rows per sec.). Process Exit Code 1. The step failed. 
Time: 2015-10-06 08:16:20


This was very interesting, since the file we were writing had only 19 rows and was very small, less than a kilobyte. So how do you investigate what the error is? It could be all kinds of things:


Permissions on the drive
Problems with the drive
Memory issues
Issue with a format file



If you do a search you will get all kinds of things back. However, the first thing you need to run is the following:


EXEC xp_fixeddrives



Here is what I saw after running xp_fixeddrives


drive  MB free
C      235644
D      1479234
E      10123
F      10123
G      10123
L      10123
S      117706
Z      0




Take a look at the Z drive, do you see that? The Z drive is full; there is no space left. It would be nice if the error message indicated that it couldn't write to the file because there was no space left. In a production environment this should not happen, because hopefully you have something in place that sends out an alert if a drive is 90% full.

So, if you ever get this error, the first thing you have to do is check whether you have space left on the drive.


I also want to point out that the xp_fixeddrives stored procedure is undocumented so there is no guarantee it will be available in a future version of SQL Server.
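
Since xp_fixeddrives is undocumented, a documented alternative is sys.dm_os_volume_stats; a sketch of such a query is below (it only reports volumes that host database files of the instance):

SELECT DISTINCT
       vs.volume_mount_point,
       vs.total_bytes     / 1048576 AS total_mb,
       vs.available_bytes / 1048576 AS free_mb
FROM sys.master_files AS mf
CROSS APPLY sys.dm_os_volume_stats(mf.database_id, mf.file_id) AS vs;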

Monday, October 05, 2015

Unit testing in SQL Server, a list of all the posts

This is a list of all the unit test blog posts I created about tSQLt and unit testing.

Unit testing in SQL Server... why?
This post explains why you should be unit testing

Unit testing in SQL Server..what are unit tests
This post explains what unit tests are

Unit testing in SQL Server, taking a first look a tSQLt
This post takes a first look at tSQLt, tSQLt is a unit test framework completely written in T-SQL

Unit testing in SQL Server, installing tSQLt and doing a quick test
This post helps you install tSQLt and run a few tests to get a quick demo of how tSQLt works

Some reasons your tSQLt test is not executing
This post will explain why some unit tests might not be running


Unit testing in SQL Server, installing tSQLt and doing a quick test

In today's post we will install tSQLt and run a simple test. To start, download tSQLt and unzip the zip file into its own folder. Inside that folder you will find several scripts; the one we care about is the tSQLt.class.sql file, which we will execute later on.



Open up SSMS, connect to the database server where you want to create the tSQLt database. We need to enable the CLR on the server where you are running your tests because tSQLt is using some CLR functionality.

Execute the following command to enable the CLR on your box


EXEC sp_configure 'clr enabled', 1;
RECONFIGURE;
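
If you want to verify that the setting took, a quick check against sys.configurations should do it:

SELECT name, value_in_use
FROM sys.configurations
WHERE name = 'clr enabled';   -- value_in_use should now be 1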


Now we need to create the database and make it trustworthy, execute the following



CREATE DATABASE tSQLt
GO

USE tSQLt
GO

ALTER DATABASE tSQLt SET TRUSTWORTHY ON;
GO


Go to your unzipped folder, open up the tSQLt.class.sql file, make sure that you are connected to your tSQLt database, and execute the code.

You should see something like this in the output once all the code has run

Installed at 2015-10-05 17:58:53.360

(1 row(s) affected)

+-----------------------------------------+
|                                         |
| Thank you for using tSQLt.              |
|                                         |
| tSQLt Version: 1.0.5686.18945           |
|                                         |
+-----------------------------------------+


After the code has completed, run the following

EXEC tSQLt.Fail 'congratulation, we are ready to go to the next step'


You should see the following output

Msg 50000, Level 16, State 10, Procedure Fail, Line 51
tSQLt.Failure


Let's create a sample database so that we can do some testing

Execute the following code


CREATE DATABASE UnittestExamples
GO

USE UnittestExamples
GO

CREATE TABLE dbo.test1(id int)
CREATE TABLE dbo.test2(id int)

INSERT dbo.test1  VALUES(1)
INSERT dbo.test2  VALUES(2)

INSERT dbo.test1  VALUES(3)
INSERT dbo.test2  VALUES(3)

INSERT dbo.test1  VALUES(4)
INSERT dbo.test2  VALUES(5)
GO

USE tSQLt
GO

Now let's run a couple of tests

tSQLt ships with an assertion to test if 2 tables are the same; the assertion is named AssertEqualsTable.

So let's see if these two tables are the same, execute the following

EXEC tSQLt.AssertEqualsTable 'UnittestExamples.dbo.test1', 'UnittestExamples.dbo.test2'


Here is the output
Msg 50000, Level 16, State 10, Procedure Fail, Line 73
tSQLt.Failure

As you just saw, you will get an error. Now, what will happen if you compare the table against itself? Run this to find out.

EXEC tSQLt.AssertEqualsTable 'UnittestExamples.dbo.test1', 'UnittestExamples.dbo.test1'

There was no error this time; this is expected.

What happens if we test a table against a non-existent table?

EXEC tSQLt.AssertEqualsTable 'UnittestExamples.dbo.test1', 'UnittestExamples.dbo.test112'

Msg 50000, Level 16, State 10, Procedure Fail, Line 51
tSQLt.Failure
Msg 208, Level 16, State 1, Line 2
Invalid object name 'UnittestExamples.dbo.test112'.

I want to point something important out. Comparing two tables is usually done for small lookup tables: you create a table inside your unit test with the values that you expect to match the table in the database, and if these tables don't match, then you have a problem. You wouldn't want to compare two million-row tables in a test.


Let's do another assertion; let's see if a table exists.

EXEC tSQLt.AssertObjectExists 'UnittestExamples.dbo.SomeTable'

Msg 50000, Level 16, State 10, Procedure Fail, Line 51
tSQLt.Failure

That fails as expected, now let's try it with a table that exists

EXEC tSQLt.AssertObjectExists 'UnittestExamples.dbo.test1'

That ran fine as expected.

Let's clean up by dropping the two tables we created earlier.

DROP TABLE UnittestExamples.dbo.test1, UnittestExamples.dbo.test2


Creating a test class
What you want to do with tSQLt is group all your tests together; for example, if you are testing a stored procedure, you can execute just that set of tests. When creating a test class in tSQLt, a new schema will be created; you can easily verify this. Execute the following:

USE tSQLt
GO

SELECT * FROM sys.schemas
WHERE schema_id BETWEEN 5 AND 1000


name   schema_id  principal_id
tSQLt  5          1

As you can see, the only schema the query returns is the tSQLt schema. Now let's create a new test class. You do this by executing the NewTestClass stored procedure; if we wanted to create a test class named WebFlags, we would do it like this:

EXEC tSQLt.NewTestClass 'WebFlags'


Running the following code again, you will now see an additional schema in the output
USE tSQLt
GO

SELECT * FROM sys.schemas
WHERE schema_id BETWEEN 5 AND 1000


name      schema_id  principal_id
tSQLt     5          1
WebFlags  6          1


Let's create a couple of test stored procedures with the examples we used before

First we need to recreate the test1 table again

USE UnittestExamples
GO

CREATE TABLE dbo.test1(id int)


INSERT dbo.test1  VALUES(1)
INSERT dbo.test1  VALUES(3)
INSERT dbo.test1  VALUES(4)
GO


Now we will create 3 stored procedures. It is very important that you name each stored procedure beginning with 'test '; if you don't do this, then tSQLt will skip the proc. Here is the code to create our 3 stored procedures:



USE tSQLt
GO

CREATE PROCEDURE WebFlags.[test test1 table has all correct values]
AS

CREATE TABLE dbo.test2(id int)

INSERT dbo.test2  VALUES(2)
INSERT dbo.test2  VALUES(3)
INSERT dbo.test2  VALUES(5)

EXEC tSQLt.AssertEqualsTable 'UnittestExamples.dbo.test1', 'tSQLt.dbo.test2'


DROP TABLE dbo.test2
GO

CREATE PROCEDURE WebFlags.[test SomeTable table exists]
AS
EXEC tSQLt.AssertObjectExists 'UnittestExamples.dbo.SomeTable'
GO

CREATE PROCEDURE WebFlags.[test test1 table exists]
AS
EXEC tSQLt.AssertObjectExists 'UnittestExamples.dbo.test1'
GO


To execute these procs, all we need to do is run the following

EXEC tSQLt.tSQLt.Run 'WebFlags'


Here is the output

[WebFlags].[test SomeTable table exists] failed: (Failure) 'UnittestExamples.dbo.SomeTable' does not exist
[WebFlags].[test test1 table has all correct values] failed: (Failure) Unexpected/missing resultset rows!
|_m_|id|
+---+--+
|<  |1 |
|>  |4 |
|=  |3 |
|>  |5 |
|>  |2 |

+----------------------+
|Test Execution Summary|
+----------------------+

|No|Test Case Name                                      |Dur(ms)|Result |
+--+----------------------------------------------------+-------+-------+
|1 |[WebFlags].[test test1 table exists]                |      3|Success|
|2 |[WebFlags].[test SomeTable table exists]            |      6|Failure|
|3 |[WebFlags].[test test1 table has all correct values]|     33|Failure|
-----------------------------------------------------------------------------
Msg 50000, Level 16, State 10, Line 1
Test Case Summary: 3 test case(s) executed, 1 succeeded, 2 failed, 0 errored.
-----------------------------------------------------------------------------



As you can see, at the top you will get the names of the tests that failed; the error message is also included.

[WebFlags].[test SomeTable table exists] failed: (Failure) 'UnittestExamples.dbo.SomeTable' does not exist
[WebFlags].[test test1 table has all correct values] failed: (Failure) Unexpected/missing resultset rows!
|_m_|id|
+---+--+
|<  |1 |
|>  |4 |
|=  |3 |
|>  |5 |
|>  |2 |


Below that you will get a list of each test that ran, with an indication of whether the test was successful or not

|No|Test Case Name                                      |Dur(ms)|Result |
+--+----------------------------------------------------+-------+-------+
|1 |[WebFlags].[test test1 table exists]                |      3|Success|
|2 |[WebFlags].[test SomeTable table exists]            |      6|Failure|
|3 |[WebFlags].[test test1 table has all correct values]|     33|Failure|


Finally you get a summary

Test Case Summary: 3 test case(s) executed, 1 succeeded, 2 failed, 0 errored.


You can also run all these tests by running the following

EXEC tSQLt.tSQLt.RunAll


Right now, since you only have 1 test class, the result is the same whether you run all tests or just the tests for the WebFlags class.

That's all for today; in the next post we will take a look at tSQLt's available assertions and how they are used.

The list of all the unit test posts can be found here: Unit testing in SQL Server, a list of all the posts


Friday, October 02, 2015

Dropping all objects in a schema

I needed to drop all objects that I created in a specific schema as part of a unit test. I decided to test my script on the Adventure Works 2014 sample database and got a couple of errors. Some of these errors were about foreign key constraints from a different schema pointing to a table in this schema; I also got an error complaining about an XML SCHEMA COLLECTION living in my schema.

If you want to follow along by running the scripts, first download the Adventure Works 2014 sample database

After you are done downloading the database, create a folder named Training on your C drive.
Open up SSMS, open a query window, run the following script

USE [master]

--DROP DATABASE AdventureWorks2014
GO

RESTORE DATABASE AdventureWorks2014
FROM DISK = 'C:\Training\AdventureWorks2014.bak'
WITH MOVE 'AdventureWorks2014_data' TO 'C:\Training\AdventureWorks2014.mdf',
     MOVE 'AdventureWorks2014_Log' TO 'C:\Training\AdventureWorks2014.ldf',
     REPLACE, STATS = 20


Now that the DB is available, we can look at what the script looks like. Connect to the database you just created and open up a new query window. Before running the script, make sure that you are in text mode; just hit CTRL + T in your SSMS query window.

Here is what the script looks like

   SET NOCOUNT ON
   DECLARE  @SchemaName NVARCHAR(100) = 'HumanResources'


   SELECT 'ALTER TABLE ' + QUOTENAME(SCHEMA_NAME(t.schema_id))
   +'.'+ QUOTENAME(OBJECT_NAME(s.parent_object_id))
   + ' DROP CONSTRAINT ' +  QUOTENAME(s.name)   
   FROM sys.foreign_keys s
   JOIN sys.tables t on s.parent_object_id = t.object_id
   JOIN sys.tables t2 on s.referenced_object_id = t2.object_id
    WHERE t2.schema_id = SCHEMA_ID(@SchemaName)

    SELECT 'DROP ' +
                    CASE WHEN type IN ('P','PC') THEN 'PROCEDURE'
                         WHEN type =  'U' THEN 'TABLE'
                         WHEN type IN ('IF','TF','FN') THEN 'FUNCTION'
                         WHEN type = 'V' THEN 'VIEW'
                     END +
                   ' ' +  QUOTENAME(SCHEMA_NAME(schema_id))+'.'+QUOTENAME(name)  
                   FROM sys.objects
             WHERE schema_id = SCHEMA_ID(@SchemaName)
    AND type IN('P','PC','U','IF','TF','FN','V')
    ORDER BY  CASE WHEN type IN ('P','PC') THEN 4
                         WHEN type =  'U' THEN 3
                         WHEN type IN ('IF','TF','FN') THEN 1
                         WHEN type = 'V' THEN 2
                       END


  SELECT 'DROP XML SCHEMA COLLECTION '
  + QUOTENAME(SCHEMA_NAME(schema_id))+'.'+QUOTENAME(name)
  FROM sys.xml_schema_collections
  WHERE schema_id = SCHEMA_ID(@SchemaName)

  SELECT 'DROP SCHEMA ' + QUOTENAME(@SchemaName)


As you can see, there are 4 things going on here.

  1. Remove the constraints
  2. Drop views, functions, stored procedures and tables; the ORDER BY makes sure functions go first, then views, then tables, with procs last
  3. Drop XML Schema Collections
  4. Drop the Schema


Of course, if you have a view in another database or schema referencing objects inside this schema, you will have problems... I will get to this later.
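
If you want to check for those outside references up front, a query against sys.sql_expression_dependencies along these lines should surface them (my own helper, not part of the drop script):

DECLARE @SchemaName sysname = 'HumanResources';

SELECT OBJECT_SCHEMA_NAME(d.referencing_id) AS referencing_schema,
       OBJECT_NAME(d.referencing_id)        AS referencing_object,
       d.referenced_schema_name,
       d.referenced_entity_name
FROM sys.sql_expression_dependencies AS d
WHERE d.referenced_schema_name = @SchemaName
  AND OBJECT_SCHEMA_NAME(d.referencing_id) <> @SchemaName;  -- references coming from outside the schema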


Running that script will give you the following output


ALTER TABLE [HumanResources].[EmployeeDepartmentHistory] 
DROP CONSTRAINT [FK_EmployeeDepartmentHistory_Shift_ShiftID]
ALTER TABLE [HumanResources].[EmployeeDepartmentHistory] 
DROP CONSTRAINT [FK_EmployeeDepartmentHistory_Department_DepartmentID]
ALTER TABLE [Production].[Document] 
DROP CONSTRAINT [FK_Document_Employee_Owner]
ALTER TABLE [HumanResources].[EmployeeDepartmentHistory] 
DROP CONSTRAINT [FK_EmployeeDepartmentHistory_Employee_BusinessEntityID]
ALTER TABLE [HumanResources].[EmployeePayHistory] 
DROP CONSTRAINT [FK_EmployeePayHistory_Employee_BusinessEntityID]
ALTER TABLE [HumanResources].[JobCandidate] 
DROP CONSTRAINT [FK_JobCandidate_Employee_BusinessEntityID]
ALTER TABLE [Purchasing].[PurchaseOrderHeader] 
DROP CONSTRAINT [FK_PurchaseOrderHeader_Employee_EmployeeID]
ALTER TABLE [Sales].[SalesPerson] 
DROP CONSTRAINT [FK_SalesPerson_Employee_BusinessEntityID]


DROP VIEW [HumanResources].[vEmployee]
DROP VIEW [HumanResources].[vEmployeeDepartment]
DROP VIEW [HumanResources].[vEmployeeDepartmentHistory]
DROP VIEW [HumanResources].[vJobCandidate]
DROP VIEW [HumanResources].[vJobCandidateEmployment]
DROP VIEW [HumanResources].[vJobCandidateEducation]
DROP TABLE [HumanResources].[Shift]
DROP TABLE [HumanResources].[Department]
DROP TABLE [HumanResources].[Employee]
DROP TABLE [HumanResources].[EmployeeDepartmentHistory]
DROP TABLE [HumanResources].[EmployeePayHistory]
DROP TABLE [HumanResources].[JobCandidate]
DROP PROCEDURE [HumanResources].[uspUpdateEmployeeHireInfo]
DROP PROCEDURE [HumanResources].[uspUpdateEmployeeLogin]
DROP PROCEDURE [HumanResources].[uspUpdateEmployeePersonalInfo]


DROP XML SCHEMA COLLECTION [HumanResources].[HRResumeSchemaCollection]


DROP SCHEMA [HumanResources]

Run that and it should all run fine; the schema should be gone.

Now change @SchemaName NVARCHAR(100) = 'HumanResources' to @SchemaName NVARCHAR(100) = 'Sales'

Run the script again, then run the output from the script.

Finally, change the variable to Person: @SchemaName NVARCHAR(100) = 'Person'

Run the script, then run the output. All 3 schemas we specified should now be gone.



Now restore the database again. This time run the code to remove the Sales schema first, then run the code to remove the Person schema.

Here is the error you will receive


Msg 6328, Level 16, State 1, Line 39
Specified collection 'AdditionalContactInfoSchemaCollection' cannot be dropped because it is used by object 'HumanResources.vEmployee'.

However, it is really used by the Person.Person table, which we just dropped. In this case you will need to drop the view. If you have stuff like this in the database, you probably don't want to just start dropping objects in another schema that point to objects in your schema.

I am not having these kinds of issues, so the code is good enough for me to wipe out objects that have been created as part of my unit testing.