
Wednesday, May 16, 2018

Azure SQL Data Warehouse, Azure SQL Database and SQL Server sessions from the Build 2018 conference



The Build 2018 conference happened last week, so I decided to put all the Azure SQL Data Warehouse, Azure SQL Database and SQL Server sessions from the conference into one place.

Here are the YouTube videos and descriptions, as well as download links for video, audio and slides where available.

Enjoy



Looking ahead with SQL Operations Studio

Join the SQL Operations Studio Engineering team as we highlight our journey towards our upcoming release announcement. We will cover major features delivered since our Public Preview announcement, including developing our extensibility story through the extension marketplace and extension APIs, improving dashboard experiences, and experimenting with deeper community involvement to help deliver this awesome cross-platform tool. You may even hear special announcements and key roadmap initiatives that have yet to be shared.



You can also download the sessions in different formats here:

MP3 (32.2 MB)
Low Quality MP4 (41.4 MB)
High Quality MP4 (87.9 MB)
Mid Quality MP4 (79.2 MB)



Query Processing Innovations for data intensive, modern applications 

Learn how your application can benefit from new capabilities in the Azure SQL Database and SQL Server platform, including graph data processing to model complex relationships between objects, and advanced self-tuning query processing to solve or avoid performance related problems.

 

You can also download the sessions in different formats here:

MP3 (72.3 MB)
Low Quality MP4 (81.6 MB)
High Quality MP4 (207.6 MB)
Mid Quality MP4 (157.8 MB)


Migrating and modernizing your data estate to Azure with Data Migration Services

Come learn all about the Database Migration Service and how you can use it to migrate and modernize your data estate to fully managed services in Azure.



You can also download the sessions in different formats here: 


MP3 (76.0 MB)
Low Quality MP4 (82.9 MB)
High Quality MP4 (208.0 MB)
Mid Quality MP4 (160.5 MB)


Azure SQL Database the intelligent database – Your database on Autopilot

Come learn how Azure SQL DB, the most intelligent cloud database, uses machine learning and best practices to ensure your database is always performing at its best. Come learn about features like Adaptive Query Processing, Autotuning and Performance Recommendations, to see how Azure SQL Database can help you spend more time developing applications and less time managing your databases.



You can also download the sessions in different formats here: 


MP3 (70.6 MB)
Low Quality MP4 (80.8 MB)
High Quality MP4 (201.3 MB)
Mid Quality MP4 (155.6 MB)



Azure SQL DB Managed Instances - Built to easily modernize application data layer

Managed Instances is the latest fully managed deployment model for Azure SQL Database that enables friction-free migration for SQL Server applications running on-premises. Come and learn why Managed Instance is the best PaaS destination for all SQL Server workloads and how to start your cloud modernization at scale now, using Azure SQL Database Managed Instances.



You can also download the sessions in different formats here: 


MP3 (72.8 MB)
Low Quality MP4 (83.6 MB)
High Quality MP4 (207.9 MB)
Mid Quality MP4 (158.2 MB)


Build Intelligent Apps with the Microsoft Data & AI Platform

Join Rohan Kumar, Corporate Vice President of Data Platform, to learn how Microsoft provides the most comprehensive data platform for your modern, intelligent applications. Come see our latest innovations that enable you to easily modernize and provide new, differentiated experiences in your applications. Hear from customers like dv01, Finastra, Jet.com and Rubikloud as they share their stories of how they’re each transforming their industries by leveraging Microsoft’s data platform.



You can also download the sessions in different formats here: 


MP3 (73.4 MB)
Low Quality MP4 (97.1 MB)
High Quality MP4 (286.7 MB)
Mid Quality MP4 (209.8 MB)


Develop scalable analytical solutions with Azure Data Factory & Azure SQL Data Warehouse

In this session you will learn how to develop data pipelines in Azure Data Factory and build a cloud-based analytical solution that adopts modern data warehouse approaches with Azure SQL Data Warehouse, implementing incremental ETL orchestration at scale. With the many sources and types of data available in an enterprise today, Azure Data Factory enables full data integration and direct storage in Azure SQL Data Warehouse for the powerful, high-performance query workloads that drive the majority of enterprise and business intelligence applications.

 

You can also download the sessions in different formats here: 



MP3 (55.1 MB)
Low Quality MP4 (61.7 MB)
High Quality MP4 (151.2 MB)
Mid Quality MP4 (116.3 MB)



Enhancing DevOps with SQL Server on Linux + containers

SQL Server 2017 now runs on Linux and in containers. In this session you will learn how SQL Server 2017 supports Continuous Deployment/Continuous Integration (CD/CI) and other DevOps processes. You will also learn the latest on cross-platform tooling and APIs, and how to develop apps with SQL Server running anywhere. We will also demo how easy it is to use SQL Server in your modern database development workflows.

 


You can also download the sessions in different formats here: 



MP3 (58.7 MB)
Low Quality MP4 (64.1 MB)
High Quality MP4 (162.3 MB)
Mid Quality MP4 (121.0 MB)



Modernizing Mission-Critical Apps with SQL Server

Learn about the features that can help you modernize your mission-critical applications, where security and performance go hand in hand. From the wide range of SQL Server features available, we will take a closer look at in-memory performance, automatic tuning, advanced security features like Always Encrypted, PolyBase, and integration with machine learning through R and Python.

 


You can also download the sessions in different formats here: 



MP3 (70.1 MB)
Low Quality MP4 (82.2 MB)
High Quality MP4 (231.5 MB)
Mid Quality MP4 (169.6 MB)

Slides
View Slides Online


Securing your data with Azure SQL DB

Come learn about new security features like Vulnerability Assessment, Information Protection, Threat Detection and Always Encrypted to see how Azure SQL Database secures your data in the most secure database on the planet.

 


You can also download the sessions in different formats here: 



MP3 (73.3 MB)
Low Quality MP4 (85.5 MB)
High Quality MP4 (238.8 MB)
Mid Quality MP4 (178.0 MB)

Slides
View Slides Online

Monday, April 24, 2017

How to shut down SQL Server

The question of how to shut down SQL Server came up at work last week, and I figured it would be a nice topic for newer DBAs and SQL Server programmers.

Note: I assume SQL Server is not part of a clustered environment here; if your SQL Server instance is part of a cluster, use the Failover Cluster Manager instead!


There are a bunch of ways to shut down SQL Server; here is how you can do it.

Configuration Manager

To shut down SQL Server from SQL Server Configuration Manager, navigate to your instance, right-click and select Stop.

SSMS Object Explorer



The process from SSMS is very similar to Configuration Manager: right-click the instance in Object Explorer and select Stop.

Shutdown command

From SSMS, open a new query window and run SHUTDOWN or SHUTDOWN WITH NOWAIT.

Unless the WITH NOWAIT option is used, SHUTDOWN shuts down SQL Server by:

  • Disabling logins (except for members of the sysadmin and serveradmin fixed server roles).
  • Waiting for currently running Transact-SQL statements or stored procedures to finish. To display a list of all active processes and locks, run sp_who and sp_lock, respectively.
  • Inserting a checkpoint in every database.
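
For reference, the two variants look like this (you need to be a member of the sysadmin or serveradmin fixed server role to run them):

-- graceful: waits for running statements and checkpoints every database
SHUTDOWN;

-- immediate: skips the checkpoints, so recovery has more work to do on the next start
SHUTDOWN WITH NOWAIT;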

That seems kinda dangerous; can someone sneak that command into a proc by masking it?


declare @d varchar(30)= reverse('tiawon htiw nwodtuhs')
exec(@d)

The SHUTDOWN statement cannot be executed within a transaction or by a stored procedure.
Msg 0, Level 11, State 0, Line 3
A severe error occurred on the current command. The results, if any, should be discarded.

That is good to know  :-)


Command prompt

Open a command prompt as administrator and type net stop MSSQLSERVER

Microsoft Windows [Version 10.0.14393]
(c) 2016 Microsoft Corporation. All rights reserved.

C:\WINDOWS\system32>net stop MSSQLSERVER
The SQL Server (MSSQLSERVER) service is stopping.
The SQL Server (MSSQLSERVER) service was stopped successfully.


C:\WINDOWS\system32>net start MSSQLSERVER
The SQL Server (MSSQLSERVER) service is starting.
The SQL Server (MSSQLSERVER) service was started successfully.


C:\WINDOWS\system32>

To start SQL Server again, change net stop to net start


Below are a couple more ways, but you really should not be using those!!

Task Manager


Yes, you can kill SQL Server from task manager

Taskkill

You can use the plain old DOS command taskkill as well

Shut down your server
That is one way of course, but you might get a bunch of people yelling at you

Unplug the power to your server after making sure you have no UPS battery backup plugged in
Great way to test recovery... .. sarcasm people.....

Shutdown the data center
You laugh, but this actually happened to me once: someone did some work and shut down the whole development part of the data center. We had SQL Server 2000 running on NT 4 that had not been restarted in 6 years (meaning it wasn't patched either). It was an isolated box with an internal IP that couldn't be reached from outside the dev network.


There you have all the ways that you can use to shut down SQL Server

Friday, December 09, 2016

Connecting to SQL Server when your default database is not available


One of our databases on the development server went into suspect mode today. This database was the default database for a bunch of logins, so those people could not log in anymore. Someone needed to use a different database, but he couldn't log in because the suspect database was the default for the login he was using.

I told this person to click on the Options button in the connection dialog and specify another database. I guess there was a misunderstanding because he couldn't get it to work, which means it is time for a blog post.

Let's take a look at how this all works.

Here is a script that will create 2 databases

CREATE DATABASE Good
GO
 
CREATE DATABASE OopsBad
GO

Now create a new login named TestLogin with a password of Test. We are also adding the login we just created as a user in the OopsBad database and making it part of the db_owner role.


USE [master]
GO
CREATE LOGIN [TestLogin] WITH PASSWORD=N'Test', DEFAULT_DATABASE=[OopsBad]
USE [OopsBad]
GO
CREATE USER [TestLogin] FOR LOGIN [TestLogin]
GO
USE [OopsBad]
GO
ALTER ROLE [db_owner] ADD MEMBER [TestLogin]
GO

Add the login we just created to the Good database as well and make the login part of the db_owner role



USE [Good]
GO
CREATE USER [TestLogin] FOR LOGIN [TestLogin]
GO
USE [Good]
GO
ALTER ROLE [db_owner] ADD MEMBER [TestLogin]
GO

Make sure that you can log in with the TestLogin account

Now that you know that you can log in with the TestLogin account, use another account and take the OopsBad database offline

Take the database offline:

ALTER DATABASE OopsBad SET OFFLINE
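
To bring the database back once you are done testing (not shown in the original walkthrough, but it is just the matching statement):

ALTER DATABASE OopsBad SET ONLINE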



Now if you try to log in with the TestLogin account, you will see the following error



Here is what you need to do: on the Connect to Server window, click on the Options button



On the next screen, click on the Connection Properties tab. Do not pick Browse server... from the drop-down; if you click that, you might have to wait for a long time.

Instead, just type in the database name; in our case the database name is Good



Click on Connect and you should be in....  That's it, pretty simple, you just have to know where to find it.
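
The same trick works outside of SSMS as well; for example, sqlcmd lets you point the connection at a specific database with the -d switch (the server name below is just a placeholder):

sqlcmd -S YourServerName -d Good -E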

Wednesday, November 16, 2016

SQL Server 2016 SP1 released, SQL Server vNext available for download


Today Microsoft announced the CTP of the next version of SQL Server; you can download it here: https://www.microsoft.com/evalcenter/evaluate-sql-server-vnext-ctp

As you can see I am already downloading this version



What's New in SQL Server vNext https://msdn.microsoft.com/en-US/library/mt788653(SQL.130).aspx

Install SQL Server on Linux https://docs.microsoft.com/en-us/sql/linux/sql-server-linux-setup

SQL Server on Linux Documentation https://docs.microsoft.com/en-us/sql/linux/


SQL Server Service Pack 1

Also announced was Service Pack 1 of SQL Server 2016, you can download that here https://go.microsoft.com/fwlink/?linkid=835368

There are so many cool things in this service pack. For one, all the editions now support all the programmability features like indexed views, columnstore indexes, partitioning and so on; see the image below


You are still bound by the memory and CPU limits of each edition, but at least your code doesn't have to change; this is good news for ISVs.

A couple more tidbits....

  • CREATE OR ALTER (Yes, we heard you !!!) – New CREATE OR ALTER support makes it easier to modify and deploy objects like stored procedures, triggers, user-defined functions, and views. This was one of the features most requested by developers and the SQL community (see the quick example after this list).

  • New DMF sys.dm_exec_query_statistics_xml – Use this DMF to obtain actual query execution showplan XML (with actual number of rows) for a query which is still being executed in a given session (session id as input parameter). The showplan with a snapshot of current execution statistics is returned when profiling infrastructure (legacy or lightweight) is on.
  • New DMF for incremental statistics – New DMF sys.dm_db_incremental_stats_properties exposes per-partition information for incremental stats.
  • Better correlation between diagnostics XE and DMVs – query_hash and query_plan_hash are used to identify a query uniquely. The DMVs define them as varbinary(8), while XEvents define them as UINT64. Since SQL Server does not have an unsigned bigint, casting does not always work. This improvement introduces new XEvent action/filter columns equivalent to query_hash and query_plan_hash, except they are defined as INT64, which helps correlate queries between XE and the DMVs.
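
Here is a quick sketch of the CREATE OR ALTER syntax mentioned in the first bullet; the procedure and table names below are made up for illustration. The statement creates the procedure if it does not exist and alters it in place if it does, so you no longer need the IF EXISTS ... DROP dance and existing permissions are kept.

CREATE OR ALTER PROCEDURE dbo.usp_GetRecentOrders
AS
BEGIN
    -- dbo.Orders is a hypothetical table used only for illustration
    SELECT TOP (100) OrderID, OrderDate
    FROM dbo.Orders
    ORDER BY OrderDate DESC;
END
GO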

There is much more, see all the news here: https://blogs.msdn.microsoft.com/sqlreleaseservices/sql-server-2016-service-pack-1-sp1-released/


I also played around with some of the new stuff after installing vNext, see here:
Playing around with the new stuff in SQL Server vNext and SQL Server 2016 SP1

Friday, October 28, 2016

Some cool SQL Server announcements SQL Graph, Adaptive Query Plan, CTP1 of SQL vNext, SQL Injection detection



There were a bunch of cool announcements at the SQL Q&A and chalk talk sessions at the PASS Summit



Here are the people presenting that session (not all pictured here): Lindsey Allen, Jakub Szymaszek, Jovan Popovic, Kevin Farlee, Tomas Talius, Sunil Agarwal and Joseph Sack

Resumable Index Rebuilds

For now this will be rowstore only; it will not be available for columnstore indexes in the coming release. I assume it will work like SHOWCONTIG back in the day. I need more info and have the following questions....

What happens if you stop the index rebuild and you add more data?
Are statistics updated only at the end, when everything is done?
Is this just a reorg in disguise?
Is this online or offline only?


SQL Graph

Think of this as something like Neo4j, but in SQL Server, and probably better


SQL Graph adds graph processing capabilities to SQL Server, which will help you link different pieces of connected data to help gather powerful insights and increase operational agility. Graphs are well suited for applications where relationships are important, such as fraud detection, risk management, social networks, recommendation engines, predictive analysis, dependence analysis, IoT suites, etc.
Initially, SQL Server will support CRUD graph operations and multi-hop graph navigation, and the following functionality will be available in the private preview:
Create graph objects, that is, nodes to represent entities and edges to represent relationships between any 2 given nodes. Both nodes and edges can have properties associated with them.
SQL language extensions to support join-free, pattern-matching queries for multi-hop navigation.
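
To make that a bit more concrete, here is a rough sketch of node and edge tables with a pattern-matching query, using syntax along the lines of what eventually shipped; the table and column names are made up.

CREATE TABLE dbo.Person (ID INT PRIMARY KEY, Name NVARCHAR(100)) AS NODE;
CREATE TABLE dbo.FriendOf AS EDGE;

-- join-free pattern matching: who are Alice's friends?
SELECT p2.Name
FROM dbo.Person AS p1, dbo.FriendOf AS f, dbo.Person AS p2
WHERE MATCH(p1-(f)->p2)
  AND p1.Name = N'Alice';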


If you would like to sign up for the private preview, go here: http://aka.ms/GraphPreview



Adaptive Query Plans
I was sitting in this session, and this is more or less what I remembered

For starters this is about multi-statement table-valued functions (MSTVFs). The legacy cardinality estimator uses an estimate of 1 row; the new cardinality estimator introduced in 2014 uses 100. Both are, of course, usually wrong. The idea with Adaptive Query Plans is that when the tree is built, the correct row count is grabbed from the MSTVF, injected into the plan and passed down the pipeline.

Adjust the plan on consecutive executions: if you use too much memory and start spilling, the grant will be adjusted and you will stop spilling.
Adaptive joins: the idea is that if the optimizer thinks a nested loop join is appropriate but there is a chance the estimate is wrong, it can revert to a hash join; otherwise it sticks with the original nested loop join.

The plan is changed in flight; there is an intra-plan change, and the same goes for the adaptive join.

For starters, they will just address MSTVFs; other stuff will follow later.
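
To make the MSTVF part concrete, here is a minimal sketch of a multi-statement table-valued function; the table and column names are made up. Because the function body is opaque to the optimizer, it has to guess the row count (1 with the legacy CE, 100 with the 2014 CE), which is exactly the estimate the adaptive plan work described above is meant to correct.

CREATE FUNCTION dbo.ufn_BigOrders()
RETURNS @result TABLE (OrderID INT, Amount MONEY)
AS
BEGIN
    -- dbo.Orders is a hypothetical table; the optimizer cannot see how many rows this returns
    INSERT INTO @result (OrderID, Amount)
    SELECT OrderID, Amount FROM dbo.Orders WHERE Amount > 1000;
    RETURN;
END
GO

-- querying or joining to the MSTVF is where the fixed 1-row/100-row guess used to hurt
SELECT o.OrderID, o.Amount
FROM dbo.ufn_BigOrders() AS o;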


If you would like to sign up for the private preview, go here: http://aka.ms/adaptiveqppreview


SQL Server vNext CTP1 coming next month
CTP1 for SQL Server vNext will be available next month, for both the Linux and the Windows versions. Update: it is available now; here is the link: https://www.microsoft.com/evalcenter/evaluate-sql-server-vnext-ctp

See also: SQL Server 2016 SP1 released, SQL Server vNext available for download

SQL Server Standard Edition announcement on November 15th

There will be a big announcement about SQL Server Standard Edition on November 15th....  mmmm maybe the ability to use more memory? We will have to wait and see what will be announced

SQL Injection detection
The way this will work is that if they detect a query that has been SQL injected with, for example, DROP TABLE, they will block it and send an email to a DL or group that you have defined, explaining that the statement has been blocked



Python coming to SQL Server
It looks like in addition to R, Python will be coming to SQL Server as well




P.S.

If you want to read about my time at SQL Pass 2016, read these posts


SQL Pass Summit 2016... day 2
SQL Pass Summit 2016... day 3
SQL Pass Summit 2016... day 4
SQL Pass Summit 2016... day 5


You might be thinking, where is day 1? The day 1 post is mostly about Seattle; if you are interested in that one, here it is:  SQL Pass Summit 2016... day 1.... just looking around

Wednesday, September 28, 2016

If you are going to PASS Summit 2016, build your schedule today

If you are going to the SQL Server PASS Summit this year, you can start building your schedule already. Don't wait too long, because some events seat only a limited number of people. Take the Chalk Talk with the SQL Server Team [DBA-501-M] session, for example.
Here is what it says on the page for that session:

Speaker(s):  Lindsey Allen 
Duration: 75 minutes
Track: Enterprise Database Administration & Deployment
Join us in an open dialog with members of the SQL Server 2016 Program Management team. Don't miss this unique opportunity to ask the hard questions. Seating is limited to 50 so arrive early. Chalk Talk sessions are not recorded. 
As you can see this session seats only 50 people, so make a note of that; this way you know to get there early.

Some other sessions are also going to be either very popular or they might be in a smaller room, so add those sessions to your schedule first.

You can see all the sessions here: http://www.sqlpass.org/summit/2016/Sessions/Schedule.aspx


That's all

See you at the PASS summit in October


Sunday, February 21, 2016

How to store twitter search results from R into SQL Server

I was messing around with R, grabbing Twitter searches from within R and storing the results in a csv file. But instead of storing the results in a file, what if I want to store that info in a SQL Server database? This is pretty easy to do. If you want to try this out yourself, first head on over to my other post Using twitter from the R console. There you will see how to set up OAuth so that you can use Twitter within R, as well as how to install the R twitteR package.

Time to get started....

First, let's see how to connect to SQL Server from R.
To load data from SQL Server into R, you first need to create an ODBC data source. I created one named LocalSqlServer; this DSN points to my local SQL Server 2016 instance.


After you have created your ODBC data source, you need to load the RODBC package, create a connection, execute a query, display the results and close the connection. Here is what it looks like in the R console:



> require(RODBC)

>db <- odbcConnect("LocalSqlServer")

> Sample <- sqlQuery(db, "select * from sysobjects", stringsAsFactors=FALSE)

> head(Sample)

  name                         id          xtype uid info status base_schema_ver
1 sp_MSalreadyhavegeneration   -1073624922 P     4   0    0      0
2 sp_MSwritemergeperfcounter   -1072815163 P     4   0    0      0
3 TABLE_PRIVILEGES             -1072372588 V     3   0
4 sp_replsetsyncstatus         -1071944761 X     4   0    0      0
5 sp_replshowcmds              -1070913306 P     4   0    0      0
6 sp_publishdb                 -1070573756 P     4   0    0      0

> close(db)  #close RODBC connection

Now that we know how to connect to SQL Server, let's see if we can get the twitteR package to work. Head on over to my other post Using twitter from the R console to see how to install twitteR and how some queries are executed.


Assuming the twitteR package is installed, we need to load it to use it


library(twitteR)

The next command enables OAuth; to see how to set this up, go here: Using twitter from the R console

setup_twitter_oauth("API key", "API secret", "Access token", "Access secret")
[1] "Using direct authentication"

Now it is time to do a simple search

tweets <- searchTwitter('#rstats', n=6) 
tweets





Running that command should have given you some results. However, what we need as output is a data frame so that we can easily load it into a table in SQL Server. You can use the twListToDF function for that.


tweets <- searchTwitter('#rstats', n=6) 
twListToDF(tweets)


The output now looks different



1                             RT @rquintino: @Mairos_B #sqlsatportugal session: all about R in #SqlServer 2016 #rstats https://t.co/DHrqIZrz1e
2                                                      a quick script to use imgcat in #rstats https://t.co/fpUlgWNX33 https://t.co/AhCCMLewCH
3 RT @KirkDBorne: Useful packages (libraries) for Data Analysis in R: https://t.co/haRKopFyly #DataScience #Rstats by @analyticsvidhya https:…
4                      Hey #rstats tweeps, do you have any readings to recommend on sensitivity analysis? Books/articles/websites all welcome.
5 RT @KirkDBorne: A Complete Tutorial on Time Series Modeling in R: https://t.co/7oI6JKyU4E #MachineLearning #DataScience #Rstats by @Analyti…
6 RT @KirkDBorne: A Complete Tutorial on Time Series Modeling in R: https://t.co/7oI6JKyU4E #MachineLearning #DataScience #Rstats by @Analyti…
  favorited favoriteCount replyToSN             created truncated replyToSID
1     FALSE             0        NA 2016-02-20 20:29:54     FALSE         NA
2     FALSE             0        NA 2016-02-20 20:24:50     FALSE         NA
3     FALSE             0        NA 2016-02-20 20:16:25     FALSE         NA
4     FALSE             0        NA 2016-02-20 20:11:08     FALSE         NA
5     FALSE             0        NA 2016-02-20 20:11:06     FALSE         NA
6     FALSE             0        NA 2016-02-20 20:02:05     FALSE         NA
                  id replyToUID
1 701141750161784834         NA
2 701140474019577856         NA
3 701138356466483204         NA
4 701137026075140096         NA
5 701137018508722176         NA
6 701134750296227840         NA
                                                                            statusSource
1                Mobile Web (M5)
2               Tweetbot for Mac
3   Twitter for Android
4                     Twitter Web Client
5     Twitter for iPhone
6                     Twitter Web Client
     screenName retweetCount isRetweet retweeted longitude latitude
1      psousa75            3      TRUE     FALSE        NA       NA
2      millerdl            0     FALSE     FALSE        NA       NA
3   diana_nario           50      TRUE     FALSE        NA       NA
4    emjonaitis            0     FALSE     FALSE        NA       NA
5       caryden           41      TRUE     FALSE        NA       NA
6 ArkangelScrap           41      TRUE     FALSE        NA       NA



The output now has a lot more stuff; you can see if a tweet has been retweeted or favorited, as well as the latitude, longitude and more.

Time to send the result to SQL Server

There is a problem with how sqlSave creates the table: it makes the created column a float and then all the inserts fail.


Run the following DDL statement in your database
CREATE TABLE [dbo].[tweets](
	[rownames] [nvarchar](255) NULL,
	[text] [nvarchar](255) NULL,
	[favorited] [nvarchar](5) NULL,
	[favoriteCount] [float] NULL,
	[replyToSN] [nvarchar](5) NULL,
	[created] [datetime] NULL,
	[truncated] [nvarchar](5) NULL,
	[replyToSID] [nvarchar](5) NULL,
	[id] [nvarchar](255) NULL,
	[replyToUID] [nvarchar](5) NULL,
	[statusSource] [nvarchar](255) NULL,
	[screenName] [nvarchar](255) NULL,
	[retweetCount] [float] NULL,
	[isRetweet] [nvarchar](5) NULL,
	[retweeted] [nvarchar](5) NULL,
	[longitude] [nvarchar](255) NULL,
	[latitude] [nvarchar](255) NULL
) ON [PRIMARY]

Now we are ready to save our twitter data in SQL Server,  here is how you do it.


> sqlSave(db,append = TRUE,colnames = FALSE,verbose = FALSE, addPK = FALSE, twListToDF(tweets))
Here is what the results look like in SSMS



Make sure that you have append = TRUE; if you make it FALSE, the table will be recreated every time, and since the command creates the wrong data type for the created column, you can't use that.
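
If you want to double-check the load from the SQL Server side, a simple query against the tweets table created earlier (column names taken from the DDL above) will do:

SELECT TOP (10) screenName, created, [text]
FROM dbo.tweets
ORDER BY created DESC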


So to recap, here are the commands you need after you have loaded the twitteR package and set up OAuth


> tweets <- searchTwitter('#rstats', n=6) #store the twitter search results
> twListToDF(tweets)  #convert to dataframe

> require(RODBC)  # load RODBC package
> db <- odbcConnect("LocalSqlServer")  #connect
> sqlSave(db,append = TRUE,colnames = FALSE,verbose = FALSE, addPK = FALSE, twListToDF(tweets))  #insert data into your table
> close(db)  #close RODBC connection
 

Thursday, June 26, 2008

Working On SQL Admin Hacks, Tips and Tricks

I haven't posted for a while because I have been working on SQL Admin Hacks, Tips and Tricks lately. It is still a work in progress, but below is what is on the wiki currently. It is not yet categorized, but we will do that once we get more of these hacks done. To see what it will look like when it is done, take a look at the SQL Server Programming Hacks

Can you think of any admin stuff you would like to see? This is what we have right now

Find Primary Keys and Columns Used in SQL Server
Get The Domain Name Of Your SQL Server Machine With T-SQL
Grant Execute/SELECT Permissions For All User Defined Functions To A User
Grant Execute Permissions For All Stored Procedures To A User
Kill All Active Connections To A Database
SQL Server 2008: When Was The Server Last Started?
Check If Auto Update Statistics Is Enabled By Using DATABASEPROPERTY
Three Way To List All Databases On Your Server
Generate A List Of Object Types By Using OBJECTPROPERTY
How to find all the tables and views in a database
Find Out Server Roles For a SQL Server Login
Which Service Pack Is Installed On My SQL Server
Test SQL Server Login Permissions With SETUSER
Login failed for user 'sa'. Reason: Not associated with a trusted SQL Server connection. SQL 2000
Login failed for user 'sa'. Reason: Not associated with a trusted SQL Server connection. SQL 2005
Compare Tables With Tablediff
Find All Tables Without Triggers In SQL Server
Find All Tables With Triggers In SQL Server
Create Stored Procedures That Run At SQL Server Startup
Cycle The SQL Server Error Log
How to read sql server error messages
Use OBJECT_DEFINITION To Track SQL Server Stored Procedure Changes
SQL Compare Without The Price Tag
How To Get The Database Name For The Current User Process
How To Find Out Which Columns Have Defaults And What Those Default Values Are
Fixing Cannot add, update, or delete a job that originated from an MSX Server Error after renaming a server

Wednesday, May 07, 2008

Should SQLServer Have The CREATE [OR REPLACE] PROCEDURE Syntax?

I was asked by a developer at work the other day why SQL Server does not have the CREATE OR REPLACE syntax. I started thinking, and there are some advantages and one big disadvantage

First the advantages

Advantages
When scripting out a database you don’t have to generate if exists.....drop statements

When modifying an object from a file in source control you don’t have to change back and forth between CREATE and ALTER. This is really annoying sometimes; usually you create a proc or script out a proc and store it in Subversion/SourceSafe. Now you take that file, make some changes to the proc and run it. Of course it will tell you that the proc already exists. In general you don’t want to drop the proc and then execute the CREATE PROC statement, because then you have to worry about the permissions (not everyone is running as dbo, you know).

Disadvantage
I can overwrite a proc without even knowing it. What if we are both working on a proc and somehow pick the same name? I create my proc, you work in a remote location, you create yours and mine is gone.

Of course it is all about what you are used to; I am sure the Oracle guys are not overwriting each other’s procs every day either

So what do you think? Would you be in favor of this syntax or not? Can you think of more advantages or disadvantages?

Monday, July 23, 2007

SQL Server Podcast: James Luetkehoelter Talks About Disaster Recovery Planning

SQL Down Under has made available their latest podcast. From the site:

Announcing show 23 with SQL Server MVP James Luetkehoelter. In this show, James discusses disaster recovery planning and technology for SQL Server, clustering, log shipping, mirroring and snapshots.
Download it here: http://www.sqldownunder.com/

Sunday, June 03, 2007

Did You Know SQL Server Has A Black Box Like An Airplane?

Paul Randal writes:
"Kimberly mentioned that SQL Server has a 'black-box' trace, similar to an
aircraft flight-recorder, which I'd never heard of. It's an internal trace that
has the last 5MB of various trace events and it's dumped to a file when SQL
Server crashes. This can be really useful if you're troubleshooting an issue
that causing SQL Server to crash or someone or something is telling SQL Server
to shutdown and its unclear who or what is doing it."


Read here how to turn it on: http://blogs.msdn.com/sqlserverstorageengine/archive/2007/06/03/sql-server-s-black-box.aspx
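
From memory, the short version is that you create a trace with the TRACE_PRODUCE_BLACKBOX option (value 8) and then start it. Treat the following as a rough sketch from memory and follow the link above for the authoritative steps:

DECLARE @TraceID int
EXEC sp_trace_create @TraceID OUTPUT, 8   -- 8 = TRACE_PRODUCE_BLACKBOX (assumed option value)
EXEC sp_trace_setstatus @TraceID, 1       -- start the trace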

Cross-posted from SQLBlog! - http://www.sqlblog.com/

Monday, May 14, 2007

Make Your Case Sensitive Searches 1000 Times Faster

I had a case-sensitive update query that ran in about 9 minutes. Since it was a case-sensitive query, it did an index scan instead of an index seek.
Once I modified my WHERE clause, the update took a little less than 3 seconds.

Let's get started and see what I did

First create this table

CREATE TABLE #CaseSensitiveSearchTemp (Val CHAR(1))

INSERT #CaseSensitiveSearchTemp VALUES('A')
INSERT #CaseSensitiveSearchTemp VALUES('B')
INSERT #CaseSensitiveSearchTemp VALUES('C')
INSERT #CaseSensitiveSearchTemp VALUES('D')
INSERT #CaseSensitiveSearchTemp VALUES('E')
INSERT #CaseSensitiveSearchTemp VALUES('F')
INSERT #CaseSensitiveSearchTemp VALUES('G')
INSERT #CaseSensitiveSearchTemp VALUES('H')

Now we will insert some lowercase characters

INSERT #CaseSensitiveSearchTemp
SELECT LOWER(Val) FROM #CaseSensitiveSearchTemp

Now we will create our real table which will have 65536 rows

CREATE TABLE CaseSensitiveSearch (Val VARCHAR(50))

We will do a couple of cross joins to generate the data for our queries

INSERT CaseSensitiveSearch
SELECT t1.val + t2.val + t3.val + t4.val
FROM #CaseSensitiveSearchTemp t1
CROSS JOIN #CaseSensitiveSearchTemp t2
CROSS JOIN #CaseSensitiveSearchTemp t3
CROSS JOIN #CaseSensitiveSearchTemp t4


This should give you 65536 rows

SELECT * FROM CaseSensitiveSearch


Create an index on the table

CREATE INDEX IX_SearchVal ON CaseSensitiveSearch(Val)


This is how you do a case sensitive search

SELECT * FROM CaseSensitiveSearch
WHERE Val = 'ABCD' COLLATE SQL_Latin1_General_CP1_CS_AS



Now hit CTRL + K (SQL Server 2000) or CTRL + M (SQL Server 2005) to include the execution plan

run these 2 queries in one batch by highlighting them both and hitting F5

SELECT * FROM CaseSensitiveSearch
WHERE Val = 'ABCD' COLLATE SQL_Latin1_General_CP1_CS_AS

SELECT * FROM CaseSensitiveSearch
WHERE Val = 'ABCD' COLLATE SQL_Latin1_General_CP1_CS_AS
AND Val LIKE 'ABCD'

Look at the execution plan: I get 98.71% for the first query and 1.29% for the second query. Just by adding the AND condition, SQL Server is able to do an index seek and run the query many times faster




Now try it with a lowercase a

SELECT * FROM CaseSensitiveSearch
WHERE Val = 'aBCD' COLLATE SQL_Latin1_General_CP1_CS_AS

SELECT * FROM CaseSensitiveSearch
WHERE Val = 'aBCD' COLLATE SQL_Latin1_General_CP1_CS_AS
AND Val LIKE 'aBCD'


You see it all works without a problem, the correct result is returned

Wednesday, April 11, 2007

Does Anyone Use ~ (Bitwise NOT)?

I have used | (bitwise OR) and & (bitwise AND), but I have never used ~ (bitwise NOT).
Let's see what it does

SELECT ~170
this returns -171, OK Fair enough.

SELECT ~convert(tinyint,170)
this returns 85, mmmmmmm

Here is some more interesting stuff

DECLARE @i int, @i2 tinyint,@i3 smallint, @i4 bigint
SELECT @i =170,@i2 = 170,@i3 =170,@i4 = 170

SELECT ~@i int, ~@i2 tinyint,~@i3 smallint, ~@i4 bigint

output
-171 85 -171 -171

So for the tinyint it is 85; for the other data types it is -171.
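
The arithmetic behind this: int, smallint and bigint are signed two's-complement types, where ~x equals -(x + 1), while tinyint is an unsigned 8-bit type, where ~x equals 255 - x. You can verify both rules directly:

SELECT ~170 AS NotInt, -(170 + 1) AS SignedRule                       -- both return -171
SELECT ~CONVERT(tinyint, 170) AS NotTiny, 255 - 170 AS UnsignedRule   -- both return 85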

What does BOL say?

Remarks
The bitwise ~ operator performs a bitwise logical NOT for the expression, taking each corresponding bit. The bits in the result are set to 1 if one bit (for the current bit being resolved) in expression has a value of 0; otherwise, the bit in the result is cleared to a value of 1.

The ~ bitwise operator can be used only on columns of the integer data type category.



Important
When performing any kind of bitwise operation, the storage length of the expression used in the bitwise operation is important. It is recommended that you use the same number of bytes when storing values. For example, storing the decimal value of 5 as a tinyint, smallint, or int produces a value stored with different numbers of bytes. tinyint stores data using 1 byte, smallint stores data using 2 bytes, and int stores data using 4 bytes. Therefore, performing a bitwise operation on an int decimal value can produce different results as compared to a direct binary or hexidecimal translation, especially when the ~ (bitwise NOT) operator is used. The bitwise NOT operation may occur on a variable of a shorter length that, when converted to a longer data type variable, may not have the bits in the upper 8 bits set to the expected value. It is recommended that you convert the smaller data type variable to the larger data type, and then perform the NOT operation on the result.

There you have it.

Sunday, February 25, 2007

Failure Trends in a Large Disk Drive Population

The Google engineers published a paper on Failure Trends in a Large Disk Drive Population. Based on a study of 100,000 disk drives over 5 years they found some interesting stuff:

While drive manufacturers often quote yearly failure rates below 2%, user studies have seen rates as high as 6%.


We find, for example, that after their first scan error, drives are 39 times more likely to fail within 60 days than drives with no such errors. First errors in reallocations, offline reallocations, and probational counts are also strongly correlated to higher failure probabilities.



Six percent, that is higher than I expected. I must say (and I am knocking on wood as I write this) that I have only seen a drive die once (within a month of deploying), on a blade server. The only major problem I had was when consulting for a client in NYC. They had a SQL Server box which had been running for 2 years without a problem. We upgraded the machine to an active/passive cluster, and a week later the motherboard died (downtime 20 seconds ;-) ); talk about good timing.....

So what failure rates do you see? Does stuff break down a lot?

Thursday, February 22, 2007

How To Find Out Which Columns Have Defaults And What Those Default Values Are

Okay so many many moons ago you created a bunch of tables and those tables have columns of course. You want to know how to find the columns that have defaults.
There are a couple of ways to do this
Below is a list:

1 INFORMATION_SCHEMA.COLUMNS view (2000 and 2005)
2 sysobjects,syscolumns and syscomments (2000 only)
3 sys.default_constraints and sys.sysobjects (2005 only)
4 sp_help (2000 only)


So let's get started with some code
CREATE TABLE blah(id INT DEFAULT 0,
SomeDate DATETIME DEFAULT CURRENT_TIMESTAMP)

INSERT blah DEFAULT VALUES

SELECT * FROM blah


--SQL 2000/2005
SELECT COLUMN_DEFAULT,*
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME ='blah'

--SQL 2000
SELECT Text,*
FROM sysobjects o
INNER JOIN syscolumns c
ON o.parent_obj = c.id
AND o.info = c.colid
INNER JOIN syscomments s
ON o.id = s.id
WHERE o.xtype = 'D'
AND OBJECT_NAME(parent_obj) = 'blah'

--SQL 2005
SELECT * FROM
sys.default_constraints d
JOIN sys.sysobjects o ON d.parent_object_id = o.id
WHERE o.name = 'blah'


See what happens when you don't specify a name (we will do this later)? You will get wacky names like DF__blah__id__15A53433 and DF__blah__SomeDate__1699586C.
Instead of specifying the default when creating the table, use an ALTER TABLE ... ADD CONSTRAINT statement. Let's see this in action.

DROP TABLE blah
CREATE TABLE blah(id INT,SomeDate DATETIME)

ALTER TABLE blah
ADD CONSTRAINT DF_Blah_ID_0
DEFAULT 0 FOR id

ALTER TABLE blah
ADD CONSTRAINT DF_Blah_SomeDate_Now
DEFAULT CURRENT_TIMESTAMP FOR SomeDate


INSERT blah DEFAULT VALUES

SELECT * FROM blah


--SQL 2000/2005
SELECT COLUMN_DEFAULT,*
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME ='blah'

--SQL 2000
SELECT Text,*
FROM sysobjects o
INNER JOIN syscolumns c
ON o.parent_obj = c.id
AND o.info = c.colid
INNER JOIN syscomments s
ON o.id = s.id
WHERE o.xtype = 'D'
AND OBJECT_NAME(parent_obj) = 'blah'

--SQL 2005
SELECT * FROM
sys.default_constraints d
JOIN sys.sysobjects o ON d.parent_object_id = o.id
WHERE o.name = 'blah'

And last we have sp_help.
You can use sp_help in SQL Server 2000 (you can also use it in SQL Server 2005, but it doesn't return the defaults).
Execute the following
sp_help 'blah'

The defaults will be in the last resultset (the one where the first column name = constraint_type)


And last but not least, did you notice that we used CURRENT_TIMESTAMP but when we queried the table we saw getdate()? This is kind of strange, since CURRENT_TIMESTAMP is ANSI compliant but GETDATE() is not.
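
They behave identically at runtime; CURRENT_TIMESTAMP is simply the ANSI spelling that SQL Server stores as getdate() internally, which you can confirm by running both side by side:

SELECT CURRENT_TIMESTAMP AS AnsiStyle, GETDATE() AS TsqlStyle   -- same value in both columns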