Showing posts with label Howto. Show all posts

Tuesday, February 12, 2019

Using SonarQube, SonarQube Scanner and the sonar-tsql-plugin to run static code analysis

In the previous post Scripting out procs and user defined functions with PowerShell, we scripted out some procs and functions from the Adventureworks database so that we can run some static code analysis. Today we will install SonarQube, SonarQube Scanner and the sonar-tsql-plugin. First thing we need is to grab Java if you don't have it installed. I know I know... I uninstalled it as well.. sigh

Anyway.. after you have Java installed, you can grab SonarQube here: https://www.sonarqube.org/downloads/

Create a folder named sonarqube-7.6 on the C drive, download SonarQube and extract it in C:\sonarqube-7.6\

Next we need SonarQube Scanner. You can download it here: https://docs.sonarqube.org/display/SCAN/Analyzing+with+SonarQube+Scanner

Create a folder named C:\sonar-scanner-cli-3.3.0.1492-windows and extract the SonarQube Scanner file there

Finally we need the sonar-tsql-plugin, you can download that here https://github.com/gretard/sonar-tsql-plugin/releases
Grab the file named sonar-tsqlopen-plugin-0.9.0.jar
Place the jar file in the folder C:\sonarqube-7.6\sonarqube-7.6\extensions\plugins\



Now it's time to create some environment variables. In an Explorer window, paste this into the address bar

Control Panel\System and Security\System
Click on Advanced System Settings, click on Environment Variables, then click on New

In the variable name add SONAR_RUNNER_HOME
In the variable value add C:\sonar-scanner-cli-3.3.0.1492-windows

It will look like this

There is one more thing to do, we need to add something to the path
On older versions of Windows, add the line below at the end of the Path variable; on newer versions, just click on New and paste the line below

;%SONAR_RUNNER_HOME%\bin;



Ok time to run (and fail) SonarQube finally

Go to the folder C:\sonarqube-7.6\sonarqube-7.6\bin\windows-x86-32 and kick off the script StartSonar.bat

If you get an error about 32 or 64 bit, then run the script from the windows-x86-64 folder

When you run the script, if you are lucky you won't get an error, but if you do, is it this one?


jvm 1    | Error: missing `server' JVM at `C:\Program Files (x86)\Java\jre1.8.0_201\bin\server\jvm.dll'.
jvm 1    | Please install or use the JRE or JDK that contains these missing components.

C:\sonarqube-7.6\sonarqube-7.6\bin\windows-x86-32>StartSonar.bat
wrapper  | ERROR: Another instance of the SonarQube application is already running.
Press any key to continue . . .
C:\sonarqube-7.6\sonarqube-7.6\bin\windows-x86-32>StartSonar.bat
wrapper  | --> Wrapper Started as Console
wrapper  | Launching a JVM...
jvm 1    | Wrapper (Version 3.2.3) http://wrapper.tanukisoftware.org
jvm 1    |   Copyright 1999-2006 Tanuki Software, Inc.  All Rights Reserved.
jvm 1    |
jvm 1    | 2019.02.12 13:22:02 INFO  app[][o.s.a.AppFileSystem] Cleaning or creating temp directory C:\sonarqube-7.6\sonarqube-7.6\temp
jvm 1    | 2019.02.12 13:22:02 INFO  app[][o.s.a.es.EsSettings] Elasticsearch listening on /127.0.0.1:9001
jvm 1    | 2019.02.12 13:22:02 INFO  app[][o.s.a.p.ProcessLauncherImpl] Launch process[[key='es', ipcIndex=1, logFilenamePrefix=es]] from [C:\sonarqube-7.6\sonarqube-7.6\elasticsearch]: C:\Program Files (x86)\Java\jre1.8.0_201\bin\java -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -XX:+AlwaysPreTouch -server -Xss1m -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djna.nosys=true -Djdk.io.permissionsUseCanonicalPath=true -Dio.netty.noUnsafe=true -Dio.netty.noKeySetOptimization=true -Dio.netty.recycler.maxCapacityPerThread=0 -Dlog4j.shutdownHookEnabled=false -Dlog4j2.disable.jmx=true -Dlog4j.skipJansi=true -Xms512m -Xmx512m -XX:+HeapDumpOnOutOfMemoryError -Delasticsearch -Des.path.home=C:\sonarqube-7.6\sonarqube-7.6\elasticsearch -cp lib/* org.elasticsearch.bootstrap.Elasticsearch -Epath.conf=C:\sonarqube-7.6\sonarqube-7.6\temp\conf\es
jvm 1    | 2019.02.12 13:22:02 INFO  app[][o.s.a.SchedulerImpl] Waiting for Elasticsearch to be up and running
jvm 1    | Error: missing `server' JVM at `C:\Program Files (x86)\Java\jre1.8.0_201\bin\server\jvm.dll'.
jvm 1    | Please install or use the JRE or JDK that contains these missing components.


So to quickly fix this, create a server folder in the Java bin location from the error message



Now grab the files from the client folder and copy them to the server folder


Ok, we are ready to run the file again; rerun the command and the last 2 lines should look something like this

jvm 1    | 2019.02.12 13:25:53 INFO  app[][o.s.a.SchedulerImpl] Process[ce] is up
jvm 1    | 2019.02.12 13:25:53 INFO  app[][o.s.a.SchedulerImpl] SonarQube is up

Navigate to http://localhost:9000/ and log in with admin for both the username and password

Now we need to do one more thing and we are ready. Open Notepad or your favorite text editor and paste the following

# Required metadata 
sonar.projectKey=StaticCodeAnalysis.project
sonar.projectName=Static Code Analysis project
sonar.projectVersion=1.0
sonar.sources=StoredProcedures,UserDefinedFunctions
sonar.host.url=http://localhost:9000
#sonar.exclusions=**/bin/**/*.*,**/obj/**/*.*,**/*.sqlproj
# Comma-separated paths to directories of source codes to be analyzed. 
# Path is relative to the sonar-project.properties file. 
# Replace "\" by "/" on Windows. 
# Since SonarQube 4.2, this property is optional. 
# If not set, SonarQube starts looking for source code 
# from the directory containing the sonar-project.properties file. 
# Language 
sonar.language=tsql
#Encoding of the source code 
#sonar.sourceEncoding=UTF-8

Save that as sonar-project.properties in the folder where your code is located, in our case it is in C:\temp





Alright.. it's time to run the static code analysis...

Open a command window, cd to the C:\temp folder, and paste the following

C:\sonar-scanner-cli-3.3.0.1492-windows\sonar-scanner-3.3.0.1492-windows\bin\sonar-scanner.bat

You should see something like the following

C:\temp>C:\sonar-scanner-cli-3.3.0.1492-windows\sonar-scanner-3.3.0.1492-windows\bin\sonar-scanner.bat
INFO: Scanner configuration file: C:\sonar-scanner-cli-3.3.0.1492-windows\sonar-scanner-3.3.0.1492-windows\bin\..\conf\sonar-scanner.properties
INFO: Project root configuration file: C:\temp\sonar-project.properties
INFO: SonarQube Scanner 3.3.0.1492
INFO: Java 1.8.0_121 Oracle Corporation (64-bit)
INFO: Windows 10 10.0 amd64
INFO: User cache: C:\Users\denis\.sonar\cache
INFO: SonarQube server 7.6.0
INFO: Default locale: "en_US", source code encoding: "windows-1252" (analysis is platform dependent)
INFO: Load global settings
INFO: Load global settings (done) | time=78ms
INFO: Server id: BF41A1F2-AWji9AZ8kkLV5J16bA1i
INFO: User cache: C:\Users\denis\.sonar\cache
INFO: Load/download plugins
INFO: Load plugins index
INFO: Load plugins index (done) | time=31ms
INFO: Load/download plugins (done) | time=47ms
INFO: Process project properties
INFO: Execute project builders
INFO: Execute project builders (done) | time=0ms
INFO: Project key: StaticCodeAnalysis.project
INFO: Base dir: C:\temp
INFO: Working dir: C:\temp\.scannerwork
INFO: Load project settings
INFO: Load project settings (done) | time=16ms
INFO: Load project repositories
INFO: Load project repositories (done) | time=47ms
INFO: Load quality profiles
INFO: Load quality profiles (done) | time=63ms
INFO: Load active rules
INFO: Load active rules (done) | time=1922ms
INFO: Load metrics repository
INFO: Load metrics repository (done) | time=32ms
WARN: SCM provider autodetection failed. Please use "sonar.scm.provider" to define SCM of your project, or disable the SCM Sensor in the project settings.
INFO: Language is forced to tsql
INFO: Indexing files...
INFO: Project configuration:
INFO: 23 files indexed
INFO: Quality profile for tsql: Sonar Way
INFO: ------------- Run sensors on module Static Code Analysis project
INFO: Sensor JaCoCo XML Report Importer [jacoco]
INFO: Sensor JaCoCo XML Report Importer [jacoco] (done) | time=0ms
INFO: Sensor MsIssuesLoaderSensor [tsqlopen]
INFO: Found 0 issues
INFO: Sensor MsIssuesLoaderSensor [tsqlopen] (done) | time=15ms
INFO: Sensor CodeGuardIssuesLoaderSensor [tsqlopen]
INFO: SQL Code guard path is empty, trying to search directories instead
INFO: Found 0 issues
INFO: Sensor CodeGuardIssuesLoaderSensor [tsqlopen] (done) | time=0ms
INFO: Sensor CustomChecksSensor [tsqlopen]
WARN: Property 'sonar.tsql.customrules.paths' is not declared as multi-values/property set but was read using 'getStringArray' method. The SonarQube plugin declaring this property should be updated.
INFO: Total 1 custom rules repositories with total 15 checks
INFO: Sensor CustomChecksSensor [tsqlopen] (done) | time=21548ms
INFO: Sensor CoverageSensor [tsqlopen]
INFO: Sensor CoverageSensor [tsqlopen] (done) | time=16ms
INFO: Sensor JavaXmlSensor [java]
INFO: Sensor JavaXmlSensor [java] (done) | time=0ms
INFO: Sensor HTML [web]
INFO: Sensor HTML [web] (done) | time=15ms
INFO: Sensor Zero Coverage Sensor
INFO: Sensor Zero Coverage Sensor (done) | time=16ms
INFO: ------------- Run sensors on project
INFO: No SCM system was detected. You can use the 'sonar.scm.provider' property to explicitly specify it.
INFO: 21 files had no CPD blocks
INFO: Calculating CPD for 2 files
INFO: CPD calculation finished
INFO: Analysis report generated in 250ms, dir size=127 KB
INFO: Analysis report compressed in 51ms, zip size=40 KB
INFO: Analysis report uploaded in 47ms
INFO: ANALYSIS SUCCESSFUL, you can browse http://localhost:9000/dashboard?id=StaticCodeAnalysis.project
INFO: Note that you will be able to access the updated dashboard once the server has processed the submitted analysis report
INFO: More about the report processing at http://localhost:9000/api/ce/task?id=AWjjFKvAkkLV5J16bDDx
INFO: Analysis total time: 26.228 s
INFO: ------------------------------------------------------------------------
INFO: EXECUTION SUCCESS
INFO: ------------------------------------------------------------------------
INFO: Total time: 28.017s
INFO: Final Memory: 36M/1173M
INFO: ------------------------------------------------------------------------

C:\temp>


When you get the prompt back, it's time to go to the http://localhost:9000/projects URL

You should have 1 project there, the name matches what we had in our properties file

sonar.projectName=Static Code Analysis project



When you click on the project, you will get some information about bugs, code smells and duplication


Clicking on code smells brings back the following.. you can then act on those or not



I added 2 of my own bad procs to see what it would flag

create proc BadProc
as
select * from Person.Address
order by 2,3


I also added this one Unique_Login_IPs, you can grab it here https://social.msdn.microsoft.com/Forums/en-US/55cfe1b0-402a-4468-bf7a-cc0966d4a487/faster-way-to-do-this

As you can see we got some warnings for those procs

SELECT *..   No need to comment on this one
No ASC/DESC in the order by... this defaults to ASC anyway but I guess for clarity it's better to specify ASC
Positional reference is used... I do this all the time with ad-hoc queries but I don't do it with procs
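To address those first three warnings from BadProc, a cleaned-up version might look like the sketch below. The column list and the BadProcFixed name are my own picks for illustration (the columns do exist in AdventureWorks' Person.Address), and CREATE OR ALTER needs SQL Server 2016 SP1 or later:

```sql
create or alter proc BadProcFixed
as
-- explicit column list instead of SELECT *
select AddressID, AddressLine1, City
from Person.Address
-- column names instead of positional references, with explicit ASC
order by AddressLine1 asc, City asc
```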

Non-sargeable argument found - column referenced in a function.

That is this line

WHERE (method = 'LOGIN') AND (YEAR(logged) = @year1) AND (MONTH(logged) = 3)) as tmpy1_3

This you would ideally rewrite by first creating a variable @startdate and giving it the value of March 1st of @year1, in other words '20190301'

Then the WHERE clause would look something like this

WHERE (method = 'LOGIN') AND logged >= @startdate and logged < dateadd(mm,1,@startdate) 
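Putting that together, here is a sketch of the sargable version. I am assuming @year1 holds the year, and dbo.logins with its method and logged columns is the hypothetical table from the linked forum thread:

```sql
DECLARE @year1 int = 2019;
-- build March 1st of the given year once, so the WHERE clause
-- can compare the logged column directly (no function on the column)
DECLARE @startdate date = DATEFROMPARTS(@year1, 3, 1); -- '20190301'

SELECT COUNT(*) AS LoginCount
FROM dbo.logins              -- table name assumed from the linked post
WHERE method = 'LOGIN'
  AND logged >= @startdate
  AND logged < DATEADD(mm, 1, @startdate);
```

Because the predicate now compares the bare logged column against a range, SQL Server can seek on an index on logged instead of scanning and evaluating YEAR() and MONTH() per row.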

That is all for this post. Of course you can do some of this stuff with other tools and even with policy management. But if you use SQL Server and many languages, you could do static code analysis from one tool





Scripting out procs and user defined functions with PowerShell



I decided to take SonarQube for a spin to run some static code analysis. In order to do that I needed some code that I can display here without breaking laws.

In the real world, you would just point SonarQube at your code repository, but so that you can follow along, I decided to go this route

I decided to use the AdventureWorks database to script out some procs and functions. If you want to follow what I am doing here, grab the backup from here: https://github.com/Microsoft/sql-server-samples/releases/tag/adventureworks

On your C drive create a folder named DB, in the DB folder create 2 folders, one name data and the other log

To restore the DB you can run this command (assuming your backup is in the C:\temp directory)

USE [master]
RESTORE DATABASE [AdventureWorks2017] 
FROM  DISK = N'C:\temp\AdventureWorks2017.bak' WITH  FILE = 1,  
MOVE N'AdventureWorks2017' 
TO N'C:\DB\DATA\AdventureWorks2017.mdf',  
MOVE N'AdventureWorks2017_log' 
TO N'C:\DB\LOG\AdventureWorks2017_log.ldf',  
NOUNLOAD,  STATS = 5

Now it's time to script out the procs and user defined functions

But first we need a place to store them

In my case I decided to create 2 folders in the c:\temp directory
StoredProcedures
UserDefinedFunctions

Now it's time to script out the procs and user defined functions. I will be using PowerShell for that; you will need to install the SQL Server module, which you can download here

https://docs.microsoft.com/en-us/sql/powershell/download-sql-server-ps-module?view=sql-server-2017

Follow the instructions if you get any errors.

Now it's time to run the script; you will need to run PowerShell as an administrator

You will need to change Machinename\sql2019 to your machine and instance name

The script is simple.. it's for a one time use.. if you need to run the script repeatedly, you probably want to make it so you can rerun it for different databases and servers, so no hardcoding  :-)


Import-Module SqlServer -Version 21.0.17279

cd SQLSERVER:\SQL\Machinename\sql2019\Databases\AdventureWorks2017\StoredProcedures



foreach ($tbl in Get-ChildItem  )

{
$k="C:\Temp\StoredProcedures\" + $($tbl.Schema) + "." + $($tbl.name) + ".SQL"
$tbl.Script() > $k
}

cd ..\UserDefinedFunctions

foreach ($tbl in Get-ChildItem)
{
$k="C:\Temp\UserDefinedFunctions\" + $($tbl.Schema) + "." + $($tbl.name) + ".SQL"
$tbl.Script() > $k
}

That's all for the script. If it ran successfully, you should see a bunch of procs and user defined functions in their directories. Here is a view of the functions


Next up..installing SonarQube, SonarQube Scanner and the sonar-tsql-plugin

Monday, February 04, 2019

After 20+ years in IT .. I finally discovered this...

Originally I was not going to write this post, but after I found out that several other people didn't know this I figured what the heck, why not, maybe this will help someone else as well

Last week I was on a remote session with 2 clients, each running the Advent program. The team I am part of provides a script to run the Advent (APX or Axys) executable. This will then generate the portfolios, composites, price, security master, splits and other files. We then zip it up and sftp it over for ingestion so that we can run analytics and attribution

During these calls I interact with system administrators because usually they need to grant permissions so that the script runs correctly

None of these admins knew that what I am about to show you existed. None of the co-workers I asked knew it either (this could be because they are developers and not admins)



Back in the day (Win 98 or perhaps NT 4), there was a Windows PowerToy that you could install, and if you right-clicked on a folder you would get an option to open a command window in the path that you right-clicked on

Those PowerToys don't exist anymore, and while you could do the same by hacking the registry, it's like a 16-step process

But there is a faster way.....

So what I usually did before 2 months ago is that I would select the path


And then I would open a command prompt, type CD and then paste the path...not too complicated

But here is the faster way.... instead of copying the path...just type in cmd in the address bar and hit enter


Boom shakalaka... a command prompt is opened immediately and you are in the same path



Did you know this also works with PowerShell? Eric Darling left me a comment on Twitter informing me that typing PowerShell in the address bar works as well

Here is what you see after typing it




So there you have it... hopefully it will save you some minutes of valuable time in a year

Also, whether you knew about this or not, leave a comment and let me know





Tuesday, January 08, 2019

Print.. the disruptor of batch deletes in SQL



Someone had an issue where a batched delete script was not deleting anything. I looked over some code in our repository and noticed two patterns in the way queries are written to handle batch deletes

One is a while loop that runs while @@rowcount is greater than 0

WHILE @@rowcount > 0
 BEGIN
  DELETE TOP (5000)
  FROM SomeTable
 END


The other way is to run a while loop which is always true and then check if @@rowcount is 0; if it is, break out of the loop

 WHILE 1 = 1  
    BEGIN  
        DELETE TOP(5000)  
        FROM SomeTable

        IF @@ROWCOUNT = 0  
     BREAK  
     END


I have always used WHILE @@rowcount > 0 but you have to be careful because @@rowcount could be 0 when your while loop starts

Let's take a look at an example. This is a simplified example without a WHERE clause, but let's say you have to delete several million rows from a table with many more millions of rows, and the table is replicated. In that case you want to batch the deletes so that your log file doesn't fill up, replication has a chance to catch up, and in general the deletes run faster

SELECT TOP 20000 row_number() OVER(ORDER BY t1.id) AS SomeId,
  getutcdate() AS SomeDate, newid() AS SomeValue
INTO SomeTable
FROM sys.sysobjects t1
CROSS JOIN sys.sysobjects t2

SELECT COUNT(*) from SomeTable 

SELECT * FROM SomeTable WHERE 1= 0

WHILE @@rowcount > 0
 BEGIN
  DELETE TOP (5000)
  FROM SomeTable
 END

SELECT COUNT(*) from SomeTable 

DROP TABLE SomeTable -- Added here as cleanup in case people run the example

This is of course a silly example because why would you do a count like that against a different table before a delete

But what if you had this instead, where you put a nice PRINT statement in so that from the output you can see when it started, along with the rowcounts?

SELECT TOP 20000 row_number() OVER(ORDER BY t1.id) AS SomeId,
  getutcdate() AS SomeDate, newid() AS SomeValue
INTO SomeTable
FROM sys.sysobjects t1
CROSS JOIN sys.sysobjects t2

SELECT COUNT(*) from SomeTable 

PRINT' Starting my update now....'

WHILE @@rowcount > 0
 BEGIN
  DELETE TOP (5000)
  FROM SomeTable
 END

SELECT COUNT(*) from SomeTable 

DROP TABLE SomeTable -- Added here as cleanup in case people run the example



The count is 20000 before and after the loop; nothing got deleted. This is because a PRINT statement will reset @@rowcount to 0

Take a look by running this simple set of queries

SELECT 1 UNION ALL SELECT 2
SELECT @@rowcount as 'Rowcount'
PRINT '1'
SELECT @@rowcount as 'RowcountAfterPrint'

Here is what the output looks like


After the PRINT line, @@rowcount is reset back to 0

So if you want to use a while loop while checking @@rowcount, do this instead by running the delete first once outside the loop



SELECT TOP 20000 row_number() OVER(ORDER BY t1.id) AS SomeId,
  getutcdate() AS SomeDate, newid() AS SomeValue
INTO SomeTable
FROM sys.sysobjects t1
CROSS JOIN sys.sysobjects t2

SELECT COUNT(*) from SomeTable 

PRINT' Starting my update now....'

DELETE TOP (5000)
FROM SomeTable

WHILE @@rowcount > 0
 BEGIN
  DELETE TOP (5000)
  FROM SomeTable
 END

SELECT COUNT(*) from SomeTable 

DROP TABLE SomeTable -- Added here as cleanup in case people run the example

If you run the delete this way, the while loop is entered when there was something to delete; if the table was empty, there would be no need to enter the while loop

Also keep in mind that it is not just PRINT that will reset @@rowcount back to 0.

From Books On Line:

Statements such as USE, SET <option>, DEALLOCATE CURSOR, CLOSE CURSOR, BEGIN TRANSACTION, or COMMIT TRANSACTION reset the ROWCOUNT value to 0.
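Because so many statements reset @@ROWCOUNT, a more defensive pattern is to capture it into a local variable immediately after the DELETE; then a PRINT (or anything else) can't interfere with the loop condition. Here is a sketch using the same SomeTable from the examples above:

```sql
DECLARE @rc int = 1;  -- seed so the loop is entered at least once

WHILE @rc > 0
BEGIN
    DELETE TOP (5000)
    FROM SomeTable;

    -- capture @@ROWCOUNT right away, before anything can reset it
    SET @rc = @@ROWCOUNT;

    -- now the PRINT is harmless, @rc already holds the count
    PRINT 'Deleted ' + CAST(@rc AS varchar(10)) + ' rows...';
END
```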

That's all... hopefully this helps someone out in the future if they notice nothing gets deleted

Friday, November 23, 2018

Happy Fibonacci day, here is how to generate a Fibonacci sequence in SQL


Image by Jahobr - Own work, CC0, Link


Since today is Fibonacci day I decided to do a short post about how to generate a Fibonacci sequence in T-SQL. But first let's take a look at what a Fibonacci sequence actually is.

In mathematics, the Fibonacci numbers are the numbers in the following integer sequence, called the Fibonacci sequence, and characterized by the fact that every number after the first two is the sum of the two preceding ones:

 1, 1, 2, 3, 5, 8, 13, 21, 34, ...

Often, especially in modern usage, the sequence is extended by one more initial term:

0, 1, 1, 2, 3, 5, 8, 13, 21, 34, ...

November 23 is celebrated as Fibonacci day because when the date is written in the mm/dd format (11/23), the digits in the date form a Fibonacci sequence: 1,1,2,3.

So here is how you can generate a Fibonacci sequence in SQL: you can do it by using a recursive common table expression. Here is what it looks like if you wanted to generate the Fibonacci sequence up to a value of 1 million

;WITH Fibonacci (Prev, Next) as
(
     SELECT 1, 1
     UNION ALL
     SELECT Next, Prev + Next
     FROM Fibonacci
     WHERE Next < 1000000
)
SELECT Prev as Fibonacci
     FROM Fibonacci
     WHERE Prev < 1000000




That will generate a Fibonacci sequence that starts with 1; if you need a Fibonacci sequence that starts with 0, all you have to do is replace the first 1 with 0 in the anchor SELECT statement

;WITH Fibonacci (Prev, Next) as
(
     SELECT 0, 1
     UNION ALL
     SELECT Next, Prev + Next
     FROM Fibonacci
     WHERE Next < 1000000
)
SELECT Prev as Fibonacci
     FROM Fibonacci
     WHERE Prev < 1000000


Here is what it looks like in SSMS



Happy Fibonacci day!!

I created the same for PostgreSQL, the only difference is that you need to add the keyword RECURSIVE in the CTE, here is that post  Happy Fibonacci day, here is how to generate a Fibonacci sequence in PostgreSQL
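For reference, here is a sketch of what that PostgreSQL version looks like; the query body is the same, with the RECURSIVE keyword added (and without the leading semicolon T-SQL folks like to put in front of a CTE):

```sql
WITH RECURSIVE Fibonacci (Prev, Next) AS
(
     SELECT 1, 1
     UNION ALL
     SELECT Next, Prev + Next
     FROM Fibonacci
     WHERE Next < 1000000
)
SELECT Prev AS Fibonacci
FROM Fibonacci
WHERE Prev < 1000000;
```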

Saturday, May 13, 2017

All the Build 2017 data related sessions in 1 spot, SQL Server, CosmosDB, Azure SQL DB and more




Are you bored this weekend? No problem, I have got you covered: I have put together a page with all the data related sessions from the Build 2017 conference. I have embedded the videos and also added links to the pages on the Channel 9 site. I have also added links to the presentations and source code where available. I have arranged this in 4 categories:

SQL Server
Azure SQL DB
CosmosDB
The rest of the sessions


This way the sessions are not all intermingled

SQL Server


Securing your app data with Microsoft SQL Server 2017 
by Tommy Mullaney

Join us for an overview of strategy and best practices to build secure applications using SQL Server. SQL Server offers a versatile toolset of security capabilities, which enable you to protect sensitive data and meet stringent compliance requirements. You’ll leave with an understanding of how these capabilities fit together to protect your application as a whole, including the new security enhancements available in the latest version of SQL Server.




Link to Build site: https://channel9.msdn.com/Events/Build/2017/P4018



Evolving SQL workloads from software to SaaS: Data for devs immersion
by Tara Shankar Jana

In this modern era; applications must recover from user errors efficiently, must be optimized for transactional performance, provide real-time insights into app transactions, must be secure and complaint, and support seamless integration to cloud, hybrid scenarios and other data types/systems. These applications must also learn and adapt to deliver more powerful experiences using cloud-based data services, applications, and intelligence. In this session we use a scenario-based hands-on lab program (immersion), to offer you a cohesive and consistent experience to build modern on-premises/cloud based applications with intelligence. Build hack-free, high performance, scalable applications using Microsoft SQL Server, Microsoft ML and Azure data services.



Link to Build site: https://channel9.msdn.com/Events/Build/2017/P4013



Advanced SQL Server on Linux (Docker) via Entity Framework (EF)
by Tara Shankar Jana

It’s SQL Server as you already know it today, but it’s now native to Linux. It works with your data, your favorite tools, application frameworks (Entity Framework) and programming languages. In this session, we pull the code for an application running in cloud to a Linux-based machine. We test and enhance the application for performance (using EF, in-memory) and security against SQL vNext (Linux). Then we re-deploy the changes using DevOps to Azure with just a flip of a switch, show how the application works across platforms, on-premises, and in the cloud. Best part, all of it will be done in ten minutes.



Link to Build site: https://channel9.msdn.com/Events/Build/2017/P4001


SQL Unplugged
by Scott Klein, Rohan Kumar

In this Channel 9 Live session we'll discuss all things SQL



Link to Build site: https://channel9.msdn.com/Events/Build/2017/C9L20


Serving AI with data: How to use R, Python, and machine learning with Microsoft SQL Server 2017
by Umachandar Jayachandran, Nellie Gustafsson

In this session, learn how to use R and Python to perform statistical and machine learning tasks in SQL Server 2017. Also, learn how to operationalize your R or Python scripts using the SQL Server integration.




Download: Slides | View Slides Online




Microsoft SQL Server 2017 in the modern datacenter: how to run on Linux, Docker, OpenShift, and Kubernetes
by Travis Wright, Tobias Ternstrom

With the on-premises data center evolving into a private cloud, the fast pace of innovation in public clouds, and the strong interest in container technology it begs the question: How do you run a database in this new environment? In this session we examine what it means that SQL Server is evolving to support both Windows and Linux, including running inside a container on both platforms. Should you run SQL Server inside a Docker container? How does it play with orchestration technologies like OpenShift and Kubernetes? The brief answer is that we believe SQL Server should be ubiquitous and run in our customers environment of choice. Come to the session to hear the long answer with a bunch of demos!



Link to Build site: https://channel9.msdn.com/Events/Build/2017/B8080


How to do Predictive Modelling using R and SQL Server ML Services
by Umachandar Jayachandran, Nellie Gustafsson

Learn how to use R scripts from T-SQL to perform training and scoring and leverage parallelism and streaming capabilities to get better performance.



Link to Build site: https://channel9.msdn.com/Events/Build/2017/T6070


Built-in machine learning in Microsoft SQL Server 2017 with Python
by Sumit Kumar

Machine learning services in SQL Server 2017 provides Python support for in-database machine learning, now. In this session we show the basics of how to run Python code in SQL Server. We then discuss how any app that can talk to SQL Server can get intelligence from machine learning models running in SQL Server. We showcase an app that uses a Python-based deep learning model built and deployed in SQL Server. The model leverages an open source deep learning framework running with SQL Server and utilizes GPU for boosting model training performance.




Link to Build site: https://channel9.msdn.com/Events/Build/2017/T6067



Modernize your database development lifecycle with SQL Server Data Tools in Visual Studio
by Tara Raj

Learn how SQL Server Data Tools (SSDT) turns Visual Studio into a powerful environment for database development. Easily build, debug, maintain, and refactor databases inside Visual Studio with a declarative model that spans all the phases of database development and easily enables continuous integration and deployment for your databases. Work offline with a database project, or work directly with a connected database instance in Azure SQL Database, Azure SQL Data Warehouse, and SQL Server running on Windows, Linux, or Docker, on-premises or in any cloud.


Link to Build site: https://channel9.msdn.com/Events/Build/2017/P4009

Azure SQL DB

Get to the cloud faster with Azure SQLDB Managed Instance and Database Migration Service
by Lindsey Allen, Harini Gupta


A new, expanded Azure SQL Database offers fully-managed server instance with greater compatibility with SQL Server application features and more, and the ability to move hundreds of databases at once using the Database Migration Service. It provides security isolation with Azure Virtual Network, along with built-in HADR, built-in intelligent performance tuning, and intelligent security services that are already available in Azure SQL Database. To reduce the friction of transitioning your relational database estate to public cloud, expanded Azure SQL Database provides SQL Server application compatibility including commonly used cross database references using three-part names, CLR, SQL Agent, Transactional Replication, Change Data Capture, Service Broker, and a lot more. We showcase a five-step seamless migration experience from your on-premises SQL Server instances to expanded Azure SQL Database using the newly announced Azure Database Migration Service.



Link to Build site: https://channel9.msdn.com/Events/Build/2017/P4008


Migrating Oracle database to the Azure SQL DB with Database Migration Service
by Shamik Ghosh, Alexander Ivanov

In this virtual session, you learn how to assess, remediate and migrate the schema, data, and server artifacts from on-premises Oracle instances to Azure SQL DB. We walk you through an end-to-end experience using Azure data migration tools and services to help solve your database migration needs.




Link to Build site: https://channel9.msdn.com/Events/Build/2017/P4182 

Download: Slides | View Slides Online



How to build global-scale applications with Microsoft Azure SQL Database
by Rohan Kumar, Bob Strudwick

Join us in this session to learn how to build a global-scale IoT application by walking through a sample application and real-world customer case study. Gain insight on building an IoT application on a fully managed Azure database as a service, with built-in intelligence for performance, cost and security, dynamic scaling and hassle-free HADR, and infrastructure maintenance. We show you how to use Azure SQL Database enterprise grade in-memory engines to handle high volume and high concurrent transactions; while running real-time analytics queries on the same database. New features are released in Azure SQL Database in fast cadence; we show you the new Graph processing in the context of an IoT scenario, Adaptive Query Processing, predictive analytics, and the first in market built-in homomorphic data encryption feature for securing data at rest and in motion. It will be a fun learning hour.


Link to Build site: https://channel9.msdn.com/Events/Build/2017/B8018


Design Patterns for SaaS applications on Azure SQL Database
by Julie Strauss, Bill Gibson


Experience the power of building multi-tenant SaaS applications on Azure SQL Database, Microsoft’s fully managed database as a service platform: Using a sample SaaS application, we walk through a series of SaaS-focused design and management patterns that have been distilled from work with a multitude of customers. Patterns spans from multi-tenant provisioning, schema management, performance monitoring and management to operational analytics. The code for the sample application, plus management scripts, ARM templates and tutorials, will be available for download in an easy-to-explore “SaaS-in-a-Box” package, enabling you to jump-start your own SaaS application.


Link to Build site: https://channel9.msdn.com/Events/Build/2017/T6025

CosmosDB


CosmosDB
by Seth Juarez, Rimma Nehme

In this session we meet with Rimma Nehme and Seth Juarez to discuss some of the latest Build 2017 announcements around CosmosDB



Link to Build site: https://channel9.msdn.com/Events/Build/2017/C9L08


Azure Cosmos DB: Build planet scale mobile apps in minutes
by Kirill Gavrylyuk

This session has no description, but the video is below



Link to Build site: https://channel9.msdn.com/Events/Build/2017/P4012


Azure Cosmos DB: API for MongoDB
by Andrew Hoh

Azure Cosmos DB (formerly known as Azure DocumentDB) natively supports multiple APIs; one of which is the API for MongoDB. Use existing code, applications, drivers, and tools to work with Azure Cosmos DB. Benefit from the fully managed and scalable Azure database, while continuing to use familiar skills and tools for MongoDB. Come and watch this video to learn about the feature and how to migrate to this cosmic-scale database, Azure Cosmos DB.




Link to Build site: https://channel9.msdn.com/Events/Build/2017/P4011


Azure Cosmos DB: NoSQL capabilities everyone should know about
by Aravind Ramachandran

Microsoft Azure provides a fully managed NoSQL database service built for fast and predictable performance, high availability, elastic scaling, global distribution, and ease of development. As a schema-free NoSQL database, the service provides rich and familiar SQL query capabilities with consistent low latencies on JSON data - ensuring that 99% of your reads are served under 10 milliseconds and 99% of your writes are served under 15 milliseconds. These unique benefits make it a great fit for web, mobile, gaming, IoT, AI, and many other applications that need seamless scale and global replication. Come and learn about the NoSQL capabilities in Azure Cosmos DB that every developer should know about.



Link to Build site: https://channel9.msdn.com/Events/Build/2017/T6058



A lap around Azure HDInsight and Cosmos DB Open Source Analytics + NoSQL
by Andrew Liu, Raghav Mohan


Recently, we released the Spark Connector for our distributed NoSQL service – Azure Cosmos DB (formerly known as Azure DocumentDB). By connecting Apache Spark running on top of Azure HDInsight to Azure Cosmos DB, you can accelerate your ability to solve fast-moving data science and machine learning problems. The Spark to Azure Cosmos DB connector efficiently exploits the native Cosmos DB managed indexes and enables updateable columns when performing analytics, as well as push-down predicate filtering against fast-changing globally-distributed data, across IoT, data science, and analytics scenarios. Come learn how you can perform blazing fast planet-scale data processing with Azure Cosmos DB and HDInsight.


Link to Build site: https://channel9.msdn.com/Events/Build/2017/P4010


Data lakes, U-SQL, Azure DBMS and the rest


How to get started with the new Azure DBMS for PostgreSQL.
by Eric Spear, Irakliy Khaburzaniya, Sunil Kamath

Join this session to learn about Microsoft’s latest announcement of managed PostgreSQL database on Azure. In this breakout, we will learn from two early adopting customers, how they've leveraged this latest database service to innovate faster. We will learn from their experience of using the managed PostgreSQL service, including migrating to the service, and discuss next steps in their application journey. We will walk through some of the key service features and discuss how you as a developer can migrate your existing applications or develop new applications that use this managed PostgreSQL in Azure. If you’re a developer with applications that use PostgreSQL today, whether on-premises or cloud, and want to learn about how this new managed service can help, this session is for you!



Link to Build site: https://channel9.msdn.com/Events/Build/2017/B8046
download: Slides View Slides Online



How to get started with the new Azure DBMS for MySQL.
by Gebi Liang, Jason M. Anderson, Matt Williams


Join this session to learn about Microsoft’s managed MySQL offering in Azure. We’ll walk through Microsoft’s strategy for supporting Open-Source database systems in Azure and what it means to you as a developer as you look to develop or deploy applications that use MySQL in Azure. An overview of the architecture of the service along with how Azure Database for MySQL is integrated with other Azure Services such as Web Apps will also be discussed and demo’d. If you’re a developer with applications using MySQL today, whether on-prem or already in Azure today, this session is for you!



Link to Build site: https://channel9.msdn.com/Events/Build/2017/B8045
download: Slides View Slides Online Source Code



How to serve AI with data: The future of the Microsoft Data Platform
by Joseph Sirosh, Alex Miller

The cloud is truly becoming the “brain” for our connected planet. You’re not just running algorithms in the cloud, rather you’re connecting that with data from sensors from around the world. By bringing data into the cloud, you can integrate all of the information, apply machine learning and AI on top of that, and deliver apps that are continuously learning and evolving in the cloud. Consequently, the devices and applications connected to cloud are learning and becoming increasingly intelligent. Please join us on a journey through the core new patterns that are emerging as we bring advanced intelligence, the cloud, IoT, and big data together.




Link to Build site: https://channel9.msdn.com/Events/Build/2017/B8081


How to run AI at Petabyte Scale with cognitive functions in the Azure Data Lake
by Wee Hyong Tok

In this session, learn how you can use Azure Data Lake (ADL) for doing Big Cognition on raw customer support data. Learn how you can use ADL to perform key phrase extraction, sentiment analysis, and how you can use R/Python scripts for support volume forecasting. In addition, learn how you can use federated queries in ADL with Azure SQL Database. Discover how you can pull all these insights into an Azure SQL Data Warehouse, and use Azure Analysis Services to enable interactive analysis of the processed data. Join us for this exciting session as we show how you can develop intelligent applications by using the insights derived from processing massive Petabyte-scale datasets.


Link to Build site: https://channel9.msdn.com/Events/Build/2017/B8065
download: Slides View Slides Online Source Code



Image processing at scale using U-SQL in Azure Data Lake
by Saveen Reddy

Making use of cognitive capabilities such as Image OCR or Sentiment Analysis of text is straightforward with small datasets of a few terabytes. But at the scale of hundreds of terabytes or even a petabyte, you need a different approach that can massively scale out *AND* be simple to build. Azure Data Lake offers a straightforward way of programming using .NET code against these massive Petabyte-scale datasets without the need to become a deep expert in distributed computing, big data technologies, or machine learning.


Link to Build site: https://channel9.msdn.com/Events/Build/2017/T6040



Lift and shift any runtime into U-SQL for large scale processing
by Michael Rys

One of U-SQL’s major strengths is to scale out user code over your large data lake assets. This presentation first shows how we have used the U-SQL extensibility model to give you access to Python and R and then shows how to run Scala from within U-SQL.


Link to Build site: https://channel9.msdn.com/Events/Build/2017/P4004


There you have it, a bunch of data-related sessions for your viewing pleasure




Monday, April 24, 2017

How to shut down SQL Server

The question of how to shut down SQL Server came up at work last week; I figured it would be a nice topic for newer DBAs and SQL Server programmers

Note: I assume here that SQL Server is not part of a clustered environment. If your SQL Server instance is part of a cluster, use Failover Cluster Manager instead!


There are a bunch of ways to shut down SQL Server; here is how you can do it

Configuration Manager

To shut down SQL Server from SQL Server Configuration Manager, navigate to your instance, right click and select Stop

SSMS Object Explorer



The process from SSMS is very similar to Configuration Manager, right click on the instance and select Stop

Shutdown command

From SSMS, open a new query window and type SHUTDOWN or SHUTDOWN WITH NOWAIT

Unless the WITH NOWAIT option is used, SHUTDOWN shuts down SQL Server by:

  • Disabling logins (except for members of the sysadmin and serveradmin fixed server roles).
  • Waiting for currently running Transact-SQL statements or stored procedures to finish. To display a list of all active processes and locks, run sp_who and sp_lock, respectively.
  • Inserting a checkpoint in every database.

That seems kinda dangerous. Could someone sneak that command past you by masking it inside dynamic SQL and passing it to a proc?


declare @d varchar(30)= reverse('tiawon htiw nwodtuhs')
exec(@d)

The SHUTDOWN statement cannot be executed within a transaction or by a stored procedure.
Msg 0, Level 11, State 0, Line 3
A severe error occurred on the current command.  The results, if any, should be discarded.

That is good to know  :-)
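Another safety net is that SHUTDOWN requires membership in the sysadmin or serveradmin fixed server roles. A quick T-SQL sketch to check whether your current login could even run it (the PRINT messages are just for illustration):

```sql
-- Sketch: SHUTDOWN requires membership in the sysadmin or serveradmin fixed server roles
IF IS_SRVROLEMEMBER('sysadmin') = 1 OR IS_SRVROLEMEMBER('serveradmin') = 1
    PRINT 'This login has permission to run SHUTDOWN';
ELSE
    PRINT 'This login cannot run SHUTDOWN';
```

If the login isn't in either role, SHUTDOWN fails with a permissions error instead of stopping the instance.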


Command prompt

Open a command prompt as administrator and type net stop MSSQLSERVER

Microsoft Windows [Version 10.0.14393]
(c) 2016 Microsoft Corporation. All rights reserved.

C:\WINDOWS\system32>net stop MSSQLSERVER
The SQL Server (MSSQLSERVER) service is stopping.
The SQL Server (MSSQLSERVER) service was stopped successfully.


C:\WINDOWS\system32>net start MSSQLSERVER
The SQL Server (MSSQLSERVER) service is starting.
The SQL Server (MSSQLSERVER) service was started successfully.


C:\WINDOWS\system32>

To start SQL Server again, change net stop to net start


Below are a couple more ways, but you really should not be using those!!

Task Manager


Yes, you can kill SQL Server from task manager

Taskkill

You can use the plain old DOS command taskkill as well

Shutdown your server
That is one way of course but you might get a bunch of people yelling at you

Unplug the power to your server after making sure you have no UPS battery backup plugged in
Great way to test recovery... sarcasm, people...

Shutdown the data center
You laugh, but this actually happened to me once: someone did some work and shut down the whole development part of the data center. We had SQL Server 2000 running on NT 4 that had not been restarted in 6 years (meaning it wasn't patched either). It was an isolated box with an internal IP that couldn't be reached from outside the dev network


There you have it, all the ways that you can use to shut down SQL Server

Friday, December 09, 2016

Connecting to SQL Server when your default database is not available


One of our databases on the development server went into suspect mode today. This database was the default for a bunch of logins, and those people could no longer log in. One person needed to use a different database, but he couldn't log in because the database in suspect mode was the default database for the login he was using.

I told this person to click on the Options button in the connection dialog and specify another database. I guess there was a misunderstanding, because this person couldn't get it to work. This means it is time for a blog post.

Let's take a look at how this all works

Here is a script that will create 2 databases

CREATE DATABASE Good
GO
 
CREATE DATABASE OopsBad
GO

Now create a new login named TestLogin with a password of Test. We are also adding the login we just created as a user in the OopsBad database and making the login part of the db_owner role


USE [master]
GO
CREATE LOGIN [TestLogin] WITH PASSWORD=N'Test', DEFAULT_DATABASE=[OopsBad]
USE [OopsBad]
GO
CREATE USER [TestLogin] FOR LOGIN [TestLogin]
GO
USE [OopsBad]
GO
ALTER ROLE [db_owner] ADD MEMBER [TestLogin]
GO

Add the login we just created to the Good database as well and make the login part of the db_owner role



USE [Good]
GO
CREATE USER [TestLogin] FOR LOGIN [TestLogin]
GO
USE [Good]
GO
ALTER ROLE [db_owner] ADD MEMBER [TestLogin]
GO

Make sure that you can login with the TestLogin account
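If you'd rather verify the setup with a query than by logging in, here is a sketch using the system catalog views (TestLogin is the login we just created):

```sql
-- Check the login's default database at the server level
SELECT name, default_database_name
FROM sys.server_principals
WHERE name = N'TestLogin';

-- Check that the login is mapped into the Good database
USE Good;
SELECT name, type_desc
FROM sys.database_principals
WHERE name = N'TestLogin';
```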

Now that you know that you can log in with the TestLogin account, use another account and take the OopsBad database offline

ALTER DATABASE OopsBad SET OFFLINE



Now if you try to login with the TestLogin account, you will see the following error



Here is what you need to do, on the connect to server window, click on the Options button



On the next screen, click on the Connection Properties tab; do not pick Browse server... from the drop-down. If you click that, you might have to wait for a long time

Instead just type in the database name, in our case the database name is Good



Click on Connect and you should be in....  That's it, pretty simple, you just have to know where to find it.
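By the way, picking the database in the dialog only fixes that one connection. If the default database is going to stay unavailable for a while, a more permanent fix is to repoint the login (a sketch, assuming you have permission to alter logins):

```sql
-- Point TestLogin at a database that is actually available
ALTER LOGIN [TestLogin] WITH DEFAULT_DATABASE = [Good];
```

The same idea works from the command line: sqlcmd's -d switch specifies the database to connect to, e.g. sqlcmd -S YourServer -U TestLogin -d Good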

Sunday, November 27, 2016

Changing the SQL Server port on Linux


Today, I decided to see how I can change the port SQL Server is listening on



To see what port SQL Server is listening on, you can use the netstat command, here is what the output looks like when I run netstat -tulpn | grep sqlservr


I used grep to limit the output to lines that contain sqlservr only
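If you only want the port number itself, you can parse that netstat line a bit further with awk. A sketch using a captured sample line, since the real output depends on your system:

```shell
# Sample "netstat -tulpn" line for sqlservr (captured output; yours will differ)
sample="tcp        0      0 0.0.0.0:1433            0.0.0.0:*               LISTEN      1234/sqlservr"

# Field 4 is the local address; the port is the part after the colon
port=$(echo "$sample" | awk '{split($4, a, ":"); print a[2]}')
echo "$port"   # prints 1433 for this sample line
```

On a live system you would pipe the real netstat output through the same awk expression instead of using a saved sample.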

Next, to change the port that SQL Server is listening on to 3000, I ran the following command

sudo /opt/mssql/bin/mssql-conf set tcpport 3000


I restarted SQL Server by running the command systemctl restart mssql-server
Then I ran netstat -tulpn | grep sqlservr again and, as you can see, the output now has 3000 as the port number

Now I tried using sqlcmd, I ran the following sqlcmd -S localhost -U SA


I got a timeout error

Finally I ran the sqlcmd command again but I added port number 3000, like this
sqlcmd -S localhost,3000 -U SA


As you can see, this worked and I could connect to SQL Server listening on port 3000 when supplying that to the sqlcmd command


How do we reset SQL Server back to the default port?
To reset SQL Server back to the default port, run the command below and then restart SQL Server again

sudo /opt/mssql/bin/mssql-conf unset tcpport

Here is a screenshot of running netstat, resetting the port, restarting SQL Server and running netstat again





You want to learn a little more?


We used tulpn as options in our netstat command. So what do all these options do?

[-t]
Show TCP sockets

[-u]
Show UDP sockets

[-l]
Show only listening sockets

[-p]
Show the PID (process identifier) and name of the program to which each socket belongs.

[-n] 
Show numerical addresses instead of trying to determine symbolic host, port or user names.



On a Linux system, to get help for a command, use man; this will bring up the manual page for that command

So if you run man netstat, you will see the following (I only pasted 1 page here, there are more)

man netstat

--------------------------

NETSTAT(8)                 Linux Programmer's Manual                NETSTAT(8)

NAME
       netstat  - Print network connections, routing tables, interface statis‐
       tics, masquerade connections, and multicast memberships

SYNOPSIS
       netstat  [address_family_options]  [--tcp|-t]   [--udp|-u]   [--raw|-w]
       [--listening|-l]     [--all|-a]     [--numeric|-n]    [--numeric-hosts]
       [--numeric-ports]           [--numeric-users]           [--symbolic|-N]
       [--extend|-e[--extend|-e]]  [--timers|-o] [--program|-p] [--verbose|-v]
       [--continuous|-c]

       netstat              {--route|-r}              [address_family_options]
       [--extend|-e[--extend|-e]]         [--verbose|-v]        [--numeric|-n]
       [--numeric-hosts] [--numeric-ports] [--numeric-users] [--continuous|-c]

       netstat {--interfaces|-i} [--all|-a] [--extend|-e[--extend|-e]] [--ver‐
       bose|-v]  [--program|-p]  [--numeric|-n]  [--numeric-hosts] [--numeric-
       ports] [--numeric-users] [--continuous|-c]

       netstat      {--groups|-g}       [--numeric|-n]       [--numeric-hosts]
       [--numeric-ports] [--numeric-users] [--continuous|-c]
 Manual page netstat(8) line 1/348 15% (press h for help or q to quit)


To see all my SQL Server on Linux posts, click here: SQL Server on Linux