Friday, November 23, 2018

Happy Fibonacci day, here is how to generate a Fibonacci sequence in SQL


Image by Jahobr - Own work, CC0


Since today is Fibonacci day, I decided to do a short post about how to generate a Fibonacci sequence in T-SQL. But first let's take a look at what a Fibonacci sequence actually is.

In mathematics, the Fibonacci numbers are the numbers in the following integer sequence, called the Fibonacci sequence, and characterized by the fact that every number after the first two is the sum of the two preceding ones:

 1, 1, 2, 3, 5, 8, 13, 21, 34, ...

Often, especially in modern usage, the sequence is extended by one more initial term:

0, 1, 1, 2, 3, 5, 8, 13, 21, 34, ...

November 23 is celebrated as Fibonacci day because when the date is written in the mm/dd format (11/23), the digits in the date form a Fibonacci sequence: 1,1,2,3.

So here is how you can generate a Fibonacci sequence in SQL: you can do it by using a recursive common table expression (CTE). Here is what it looks like if you wanted to generate the Fibonacci sequence up to a value of 1 million

;WITH Fibonacci (Prev, Next) AS
(
     -- anchor member: the first two terms of the sequence
     SELECT 1, 1
     UNION ALL
     -- recursive member: shift Next into Prev and add the two preceding terms
     SELECT Next, Prev + Next
     FROM Fibonacci
     WHERE Next < 1000000
)
SELECT Prev AS Fibonacci
     FROM Fibonacci
     WHERE Prev < 1000000
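As a side note, if you wanted to go much higher than 1 million you would want bigint values, and it does not hurt to add a MAXRECURSION hint, since recursive CTEs are capped at 100 recursion levels by default. Here is a rough sketch of what that could look like (the one-trillion cutoff is just an arbitrary example, not something from the original query):

;WITH Fibonacci (Prev, Next) AS
(
     -- CAST the anchor values to bigint so the running values do not overflow int
     SELECT CAST(1 AS bigint), CAST(1 AS bigint)
     UNION ALL
     SELECT Next, Prev + Next
     FROM Fibonacci
     WHERE Next < 1000000000000
)
SELECT Prev AS Fibonacci
     FROM Fibonacci
     WHERE Prev < 1000000000000
OPTION (MAXRECURSION 0) -- 0 removes the default limit of 100 recursion levels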




That will generate a Fibonacci sequence that starts with 1. If you need a Fibonacci sequence that starts with 0, all you have to do is replace the first 1 with 0 in the anchor SELECT statement

;WITH Fibonacci (Prev, Next) AS
(
     SELECT 0, 1
     UNION ALL
     SELECT Next, Prev + Next
     FROM Fibonacci
     WHERE Next < 1000000
)
SELECT Prev AS Fibonacci
     FROM Fibonacci
     WHERE Prev < 1000000


Here is what it looks like in SSMS



Happy Fibonacci day!!

I created the same for PostgreSQL; the only difference is that you need to add the keyword RECURSIVE in the CTE. Here is that post: Happy Fibonacci day, here is how to generate a Fibonacci sequence in PostgreSQL
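For reference, here is a rough sketch of what the PostgreSQL version could look like (not copied from that post), with RECURSIVE added right after WITH and the leading semicolon dropped:

WITH RECURSIVE Fibonacci (Prev, Next) AS
(
     SELECT 1, 1
     UNION ALL
     SELECT Next, Prev + Next
     FROM Fibonacci
     WHERE Next < 1000000
)
SELECT Prev AS Fibonacci
FROM Fibonacci
WHERE Prev < 1000000;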

Wednesday, September 26, 2018

A little less hate for: String or binary data would be truncated in table


The error String or binary data would be truncated is one of the more annoying things in SQL Server: you insert some data, you get the error back, and then you have to spend a good amount of time figuring out what caused it.

I even posted about this as part of  T-SQL Tuesday #86: String or binary data would be truncated

I read the SQL Server 2019 CTP 2 whitepaper and on page 17 it has the following

Improve truncation message for ETL DW scenarios—the error message ID 8152 String or binary data would be truncated is familiar to many SQL Server developers and administrators who develop or maintain data movement workloads; the error is raised during data transfers between a source and a destination with different schemas when the source data is too large to fit into the destination data type. This error message can be time-consuming to troubleshoot because of its lack of specificity. SQL Server 2019 introduces a new, more specific error message for this scenario: ID 2628 String or binary data would be truncated in table '%.*ls', column '%.*ls'. Truncated value: '%.*ls'. The new error message provides more context for the problem, simplifying the troubleshooting process. So that it cannot break existing applications parsing message ID 8152, this new message ID 2628 is an opt-in replacement, which can be enabled with trace flag 460. 


Oh really... they fixed this? Let's take a look

First I downloaded SQL Server 2019 CTP 2 and installed it. Then I created a database with a simple table and inserted some data that wouldn't fit

CREATE DATABASE truncatetest
GO

USE truncatetest
GO

CREATE TABLE TruncateMe(somevalue varchar(5),somevalue2 varchar(5), somedecimal decimal(5,2))
GO

INSERT TruncateMe
SELECT '333333','444444',5.3

I then received the following error message, which is the same as in SQL Server 2017 and earlier, notice message id 8152

Msg 8152, Level 16, State 30, Line 10
String or binary data would be truncated.
The statement has been terminated.

To enable the new functionality, we need to turn on trace flag 460. You can do that by running the DBCC TRACEON command like this

DBCC TRACEON(460)
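As written, that only turns the trace flag on for the current session. If you want it enabled for the whole instance, DBCC TRACEON also accepts -1 as a second argument to set the flag globally:

-- enable trace flag 460 globally rather than only for the current session
DBCC TRACEON(460, -1)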

Now let's try that insert statement again

INSERT TruncateMe
SELECT '333333','444444',5.3

And there we go, you get the table name and the column name as well as the value. Notice that the message id changed from 8152 to 2628

Msg 2628, Level 16, State 1, Line 20
String or binary data would be truncated in table 'truncatetest.dbo.TruncateMe', column 'somevalue'. Truncated value: '33333'.
The statement has been terminated.


So it looks like it only returns the first value that generates the error. Let's change the first value so it fits into the column and execute the insert statement again

INSERT TruncateMe
SELECT '3','444444',5.3

Now you will see that the error is for the somevalue2 column


Msg 2628, Level 16, State 1, Line 27
String or binary data would be truncated in table 'truncatetest.dbo.TruncateMe', column 'somevalue2'. Truncated value: '44444'.
The statement has been terminated.



What will happen if you have more than one row that fails?

INSERT TruncateMe
SELECT '333333','444444',5.3 UNION ALL
SELECT '3','444444',5.3

Here is the error

Msg 2628, Level 16, State 1, Line 37
String or binary data would be truncated in table 'truncatetest.dbo.TruncateMe', column 'somevalue'. Truncated value: '33333'.
The statement has been terminated.

As you can see the error is only for the first row, not the second one

What about Table Variables, will you also get an error with the column and value like for real tables?

DECLARE @table TABLE (somevalue varchar(5), somevalue2 varchar(5), somedecimal decimal(5,2))

INSERT @table
SELECT '333333','444444',5.3


Here is the error

Msg 2628, Level 16, State 1, Line 53
String or binary data would be truncated in table 'tempdb.dbo.#A6AD698B', column 'somevalue'. Truncated value: '33333'.

As you can see you also get the error; the table name shown is the internal name SQL Server uses for the table variable tied to your session


What about Table Valued Parameters?

CREATE TYPE TestTypeTrunc AS TABLE   
( somevalue varchar(5),somevalue2 varchar(5));  
GO

DECLARE @table TestTypeTrunc
INSERT @table
SELECT '555555555','444444'

Here is the error for that 

Msg 2628, Level 16, State 1, Line 58
String or binary data would be truncated in table 'tempdb.dbo.#AC6642E1', column 'somevalue'. Truncated value: '55555'.


Pretty much the same message as for the table variable


I do applaud Microsoft for finally fixing this.

Here it is also in beautiful technicolor





Monday, September 24, 2018

First public preview of SQL Server 2019 is available for download



The first public preview of SQL Server 2019 is available for download

Download it here https://www.microsoft.com/en-us/sql-server/sql-server-2019

Aaron Bertrand blogged about some of the new stuff here: https://sqlperformance.com/2018/09/sql-server-2019/first-public-preview

No new T-SQL that stands out, but that might come in a future CTP

You can install this CTP on Windows, Linux, Docker and Kubernetes

Here is also a small video


Wednesday, May 16, 2018

Azure SQL Data Warehouse, Azure SQL Database and SQL Server sessions from the Build 2018 conference



The Build 2018 conference happened last week, I decided to put all the Azure SQL Data Warehouse, Azure SQL Database and SQL Server sessions from the Build 2018 conference into one place.

Here are the youtube videos, the descriptions as well as download links to video, audio and slides where available

Enjoy



Looking ahead with SQL Operations Studio

Join the SQL Operations Studio Engineering team as we highlight our journey towards our upcoming release announcement. We will cover major features delivered since our Public Preview announcement including developing our Extensibility story through the extension marketplace and extension API’s, improving dashboard experiences, and experimenting with deeper community involvement to help deliver this awesome cross-platform tool. You may even hear special announcements and key roadmap initiatives that have yet to be shared.



You can also download the sessions in different formats here:

MP3 (32.2 MB)
Low Quality MP4 (41.4 MB)
High Quality MP4 (87.9 MB)
Mid Quality MP4 (79.2 MB)



Query Processing Innovations for data intensive, modern applications 

Learn how your application can benefit from new capabilities in the Azure SQL Database and SQL Server platform, including graph data processing to model complex relationships between objects, and advanced self-tuning query processing to solve or avoid performance related problems.

 

You can also download the sessions in different formats here:

MP3 (72.3 MB)
Low Quality MP4 (81.6 MB)
High Quality MP4 (207.6 MB)
Mid Quality MP4 (157.8 MB)


Migrating and modernizing your data estate to Azure with Data Migration Services

Come learn all about Database Migration Service, and how anyone can utilize this service to migrate and modernize your data estate to fully managed services in Azure



You can also download the sessions in different formats here: 


MP3 (76.0 MB)
Low Quality MP4 (82.9 MB)
High Quality MP4 (208.0 MB)
Mid Quality MP4 (160.5 MB)


Azure SQL Database the intelligent database – Your database on Autopilot

Come learn how Azure SQL DB, the most intelligent cloud database, uses machine learning and best practices to ensure your database is always performing at its best. Come learn about features like Adaptive Query Processing, Autotuning and Performance Recommendations, to see how Azure SQL Database can help you spend more time developing applications and less time managing your databases.



You can also download the sessions in different formats here: 


MP3 (70.6 MB)
Low Quality MP4 (80.8 MB)
High Quality MP4 (201.3 MB)
Mid Quality MP4 (155.6 MB)



Azure SQL DB Managed Instances - Built to easily modernize application data layer

Managed Instances is the latest fully managed deployment model for Azure SQL Database that enables friction-free migration for SQL Server applications running on-premises. Come and learn why Managed Instance is the best PaaS destination for all SQL Server workloads and how to start your cloud modernization at scale now, using Azure SQL Database Managed Instances.



You can also download the sessions in different formats here: 


MP3 (72.8 MB)
Low Quality MP4 (83.6 MB)
High Quality MP4 (207.9 MB)
Mid Quality MP4 (158.2 MB)


Build Intelligent Apps with the Microsoft Data & AI Platform

Join Rohan Kumar, Corporate Vice President of Data Platform, to learn how Microsoft provides the most comprehensive data platform for your modern, intelligent applications. Come see our latest innovations that enable you to easily modernize and provide new, differentiated experiences in your applications. Hear from customers like dv01, Finastra, Jet.com and Rubikloud as they share their stories of how they’re each transforming their industries by leveraging Microsoft’s data platform.



You can also download the sessions in different formats here: 


MP3 (73.4 MB)
Low Quality MP4 (97.1 MB)
High Quality MP4 (286.7 MB)
Mid Quality MP4 (209.8 MB)


Develop scalable analytical solutions with Azure Data Factory & Azure SQL Data Warehouse

In this session you will learn how to develop data pipelines in Azure Data Factory and build a Cloud-based analytical solution adopting modern data warehouse approaches with Azure SQL Data Warehouse and implementing incremental ETL orchestration at scale. With the multiple sources and types of data available in an enterprise today Azure Data factory enables full integration of data and enables direct storage in Azure SQL Data Warehouse for powerful and high-performance query workloads which drive a majority of enterprise applications and business intelligence applications.

 

You can also download the sessions in different formats here: 



MP3 (55.1 MB)
Low Quality MP4 (61.7 MB)
High Quality MP4 (151.2 MB)
Mid Quality MP4 (116.3 MB)



Enhancing DevOps with SQL Server on Linux + containers

SQL Server 2017 runs now on both Linux and containers. In this session you will learn how SQL Server 2017 supports Continuous Deployment/Continuous Integration (CD/CI) and other DevOps processes. You will also learn the latest on cross-platform tooling, APIs, and how to develop apps with SQL Server running anywhere. We will also demo how easy it is to use SQL Server in your modern database development workflows.

 


You can also download the sessions in different formats here: 



MP3 (58.7 MB)
Low Quality MP4 (64.1 MB)
High Quality MP4 (162.3 MB)
Mid Quality MP4 (121.0 MB)



Modernizing Mission-Critical Apps with SQL Server

Learn about the features that can help you modernize your mission critical applications, where security and performance can go hand in hand. From the wide range of SQL Server features available, we will take a closer look at In-Memory performance, Automatic Tuning, Advanced Security Features like Always Encrypted, Polybase and integration with Machine Learning through R and Python.

 


You can also download the sessions in different formats here: 



MP3 (70.1 MB)
Low Quality MP4 (82.2 MB)
High Quality MP4 (231.5 MB)
Mid Quality MP4 (169.6 MB)

Slides
View Slides Online


Securing your data with Azure SQL DB

Come learn about new security features like Vulnerability Assessment, Information Protection, Threat Detection and Always Encrypted to see how Azure SQL Database is securing your data in the most secure database on the planet.

 


You can also download the sessions in different formats here: 



MP3 (73.3 MB)
Low Quality MP4 (85.5 MB)
High Quality MP4 (238.8 MB)
Mid Quality MP4 (178.0 MB)

Slides
View Slides Online

Monday, February 19, 2018

Webinar: SQL Server on Linux tools: SQL Server Integration Services


Microsoft is hosting a SQL Server on Linux tools: SQL Server Integration Services webinar/town hall on March 1st, 2018, 10:00-11:00 AM Pacific Time (UTC-8)

In this session, they will demonstrate how easy it is to install, run and monitor SSIS on Linux. They will also explain how you can use ODBC to connect to common data sources on Linux. Specifically, they will cover:


  • Current system architecture
  • Supported/unsupported features
  • A live demo of SSIS installation on Linux
  • A live demo of how to use ODBC to connect data sources like MySQL and Oracle
  • Future roadmap of SSIS on Linux and other platforms


If you are interested in SSIS running on Linux, make sure to register for this webinar

The link to register is here: https://info.microsoft.com/sql-server-on-linux-town-hall-sql-server-integration-services-registration.html

Reinventing the wheel

It has been a while since I wrote some of my best practices posts. I decided to revisit these posts to see if anything has changed, and also to see if I could add some additional info.

In this post we are going to look at something called reinventing the wheel. Just in case you are not familiar with this metaphor, or maybe you are not a native English speaker, I will use Wikipedia's description of what reinventing the wheel means.
To reinvent the wheel is to duplicate a basic method that has already previously been created or optimized by others.
The inspiration for this idiomatic metaphor lies in the fact that the wheel is the archetype of human ingenuity, both by virtue of the added power and flexibility it affords its users, and also in the ancient origins which allow it to underlie much, if not all, of modern technology. As it has already been invented, and is not considered to have any operational flaws, an attempt to reinvent it would be pointless and add no value to the object, and would be a waste of time, diverting the investigator's resources from possibly more worthy goals which his or her skills could advance more substantially.



So now that you have read the paragraph above, how many times did you write some code only to find out that it already exists in the language as part of some library or function? How many times have you written code that you could have easily grabbed from GitHub, CodePlex and other repositories for your own use?


Why write your own solution when you can use something that is robust and tested?


To start let's take a look at the GitHub repositories mentioned in this post: Five great SQL Server GitHub repos that every SQL Server person should check out
You will find code that does index maintenance, helps you with performance issues, setup and more. Check out that post for more details


Find out who the community leaders are for a particular skill set that you are interested in, start following these people, follow them on twitter, subscribe to their blogs and podcasts. Go to their presentations, talk to them, find out what they use, find out if they have made code available for the public to use. You will find out that a good percentage of these people have made available a whole bunch of libraries, stored procedures, functions, maintenance routines and much more for you to use and it is all free.
Don't be scared to ask for help on twitter. If you don't know any of the SQL Server tweeple, use the #sqlhelp hash tag and ask for help; here is an example of what it looks like: #sqlhelp
Here is an image of the replies on twitter after I asked a question with the #sqlhelp tag


Besides twitter, you can also use slack. I like slack more because you are not limited to 280 characters. Here is the link to the relevant slack channel: https://sqlcommunity.slack.com/messages/C1MS1RA4B/

Here is a screen shot of what it looks like

That looks a little better than twitter don't you think?


Some commercial firms will also have community editions of code and tools for you to use. Take advantage of this, these are great; if you like the tools then maybe you will find a need for the pro editions, which have more bells and whistles and are not limited.

Some examples of available solutions:

SQL Server activity
Want to know what is going on right now? Try Adam Machanic's procedure Who Is Active
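Once sp_WhoIsActive is installed (by convention it usually goes into the master database), a basic call looks something like this:

-- show currently running requests using the default options
EXEC sp_WhoIsActive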

Execution Plans
Check out SentryOne's  Plan Explorer. This plan explorer does much more than the one that comes with SQL Server Management Studio

SQL Search and other tools
Red Gate has a bunch of free tools, you can get those here https://www.red-gate.com/products/free-tools. I started to use Red Gate's tools back in 2003; SQL Compare is the one I used the most. SQL Search is free, and if you need to find anything in your DB it is invaluable.
Idera free tools
Idera has a bunch of free tools available for download, you can find those all here: https://www.idera.com/productssolutions/freetools

Get involved

If you have created some cool code and you know there is nothing similar, why not give back to the community? Put it out there, solicit feedback, and in the end the code will be better because more eyes will have looked at it. Accept contributions as well. All of these things will make the community as a whole grow; if the community grows then the platform will grow as well. When the platform grows, there will be more demand for someone with your skill set. You are responsible for making sure your community doesn't turn into a ghost town.

Monday, January 15, 2018

When uncompressing data, make sure to use the correct function



SQL Server has offered data compression for a while now: you could use either PAGE compression or ROW compression, but you had no ability to specify which columns. That changed in SQL Server 2016
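For context, that older style of compression is applied to a whole table or index rather than to individual columns; a typical example looks like this (the table name here is just a placeholder):

-- rebuild an existing table so all of its data is stored PAGE compressed
ALTER TABLE dbo.SomeTable REBUILD WITH (DATA_COMPRESSION = PAGE)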

SQL Server 2016 added the ability to compress data yourself by using the COMPRESS function. The COMPRESS function compresses the input expression using the GZIP algorithm. The result of the compression is a byte array of type varbinary(max).


Let's take a quick look at how this function works

We will create a table, insert one row and then update the SomeOtherColumn value by using the COMPRESS function on the SomeColumn column

CREATE TABLE test (Somecolumn varchar(200), SomeOtherColumn varbinary(max))
go
INSERT test VALUES ('aaaaaaaaaaaaa', null)


UPDATE test
SET SomeOtherColumn = COMPRESS(Somecolumn)
go

SELECT Somecolumn,SomeOtherColumn 
FROM test

Here is what the output looks like


What if I told you now to create a new table by uncompressing the data via a SELECT INTO query, followed by dropping the old table?

No problem, if there is a COMPRESS function then surely there must be an UNCOMPRESS function... right?  Let's find out......


SELECT Somecolumn,CAST( UNCOMPRESS(SomeOtherColumn) AS varchar(max))  AS SomeOtherColumn
INTO SomeNewTable
FROM test

DROP TABLE test

SELECT *
FROM SomeNewTable


Wrong, while the UNCOMPRESS function exists, it is not the correct function

Hopefully you ran a test before just dropping the table after doing the SELECT INTO

The correct function is DECOMPRESS

Here is what it all looks like

SELECT Somecolumn,SomeOtherColumn,
 CAST( DECOMPRESS(SomeOtherColumn) AS varchar(max)) AS CastedDecompressed,
 DECOMPRESS(SomeOtherColumn) as Decompressed FROM test

 SELECT Somecolumn,SomeOtherColumn, 
CAST( UNCOMPRESS(SomeOtherColumn) AS varchar(max)) AS CastedUncompressed,
 UNCOMPRESS(SomeOtherColumn) as Uncompressed FROM test

This is the output



As you can see the UNCOMPRESS function returns a bunch of nonsense while the DECOMPRESS function works as expected

So the real question is, what on earth is the UNCOMPRESS function?