The SQL Diagnostic (jupyter) Book


Welcome to 2020! I wanted to start this year by giving all my fellow consultants another way to troubleshoot our beloved SQL Servers; I’ve already talked about diagnostic notebooks in the past, and now that Azure Data Studio has implemented the feature, I wanted to group them into a Diagnostic Book.

As the name implies, a jupyter book is nothing more than a collection of notebooks (and markdown files) grouped into a coherent space, complete with an index and navigation options.
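For reference, a jupyter book is just a folder with a couple of well-known files in it. A minimal sketch of the layout (the file names here follow the Jupyter Book convention that ADS adopted; check the repository for the Diagnostic Book's actual structure):

```
DiagnosticBook/
├── _config.yml        # book title and settings
├── _data/
│   └── toc.yml        # the index and navigation ADS renders
└── content/           # the notebooks and markdown files themselves
    ├── readme.md
    └── FirstRespondersKit.ipynb
```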

The index for the 1st release

The process is easy: just download the Book’s folder from my GitHub (here is a direct link for the folder only, thanks DownGit !) to wherever you like on your machine and open it in Azure Data Studio by clicking “Open Folder…”; voilà, if you are on the December 2019 release of ADS or later, you’ll get the new “jupyter Books” icon you can see in the screenshot above.

Update: for the folks having issues opening the book, try restarting ADS and waiting a couple of minutes before opening the folder; some services start up asynchronously in ADS, and the notebook service may still be sleeping. If everything else fails, try the insiders build of ADS.

How it was made

I’ve already explained in a separate article how to automatically create notebooks from Glenn Berry’s diagnostic scripts.

I’ve applied a similar concept in creating the Notebook to install the FirstRespondersKit procedures:

function DownloadFilesFromRepo {
    Param(
        [string]$Owner,
        [string]$Repository,
        [string]$Path,
        [string]$DestinationPath
        )
    
        $baseUri = "https://api.github.com/"
        $ar = "repos/$Owner/$Repository/contents/$Path"
        # Enable TLS 1.2, which the GitHub API requires
        [Net.ServicePointManager]::SecurityProtocol = "tls12, tls11, tls"
        $wr = Invoke-WebRequest -Uri $($baseUri+$ar)
        $objects = $wr.Content | ConvertFrom-Json
        # Keep only the sp_* installation scripts and grab their raw download URLs
        $files = $objects | Where-Object {$_.type -eq "file" -and $_.name -like "sp_*" } | Select-Object -exp download_url
        
        if (-not (Test-Path $DestinationPath)) {
            # Destination path does not exist, let's create it
            try {
                New-Item -Path $DestinationPath -ItemType Directory -ErrorAction Stop
            } catch {
                throw "Could not create path '$DestinationPath'!"
            }
        }
    
        foreach ($file in $files) {
            $fileDestination = Join-Path $DestinationPath (Split-Path $file -Leaf) | ForEach-Object { $_ -replace '%20',' '}
            try {
                Invoke-WebRequest -Uri $file -OutFile $fileDestination -ErrorAction Stop -Verbose
                "Grabbed '$($file)' to '$fileDestination'"
            } catch {
                throw "Unable to download '$file'"
            }
        }
    
    }

#Download Diagnostic Scripts
DownloadFilesFromRepo -Owner "BrentOzarULTD" -Repository "SQL-Server-First-Responder-Kit" -Path "" -DestinationPath "TempFolderForThisStuff"

$files = Get-ChildItem -Path "TempFolderForThisStuff\*.sql"
$filename = "FirstRespondersKit.ipynb"

$cells = @()
$cells += [pscustomobject]@{cell_type = "markdown"; source = "# First Responders Kit Installation Notebook" }

$cells += [pscustomobject]@{cell_type = "markdown"; source = "You're a DBA, sysadmin, or developer who manages Microsoft SQL Servers. It's your fault if they're down or slow. These tools help you understand what's going on in your server.

- When you want an overall health check, run <b>sp_Blitz</b>.
- To learn which queries have been using the most resources, run <b>sp_BlitzCache</b>.
- To analyze which indexes are missing or slowing you down, run <b>sp_BlitzIndex</b>.
- To find out why the server is slow right now, run <b>sp_BlitzFirst</b>." }

$preamble = @"
    {
        "metadata": {
            "kernelspec": {
                "name": "SQL",
                "display_name": "SQL",
                "language": "sql"
            },
            "language_info": {
                "name": "sql",
                "version": ""
            }
        },
        "nbformat_minor": 2,
        "nbformat": 4,
        "cells":
"@


$preamble | Out-File $filename -Force

foreach ($file in $files) {
    $text = Get-Content $file -Raw
    $cells += [pscustomobject]@{cell_type = "markdown"; source = "## Install $($file.BaseName)`n" }
    $cells += [pscustomobject]@{cell_type = "code"; source = $text.ToString(); metadata = [PSCustomObject]@{
        tags = @("hide_input","dummytag")}}
}

$cells | ConvertTo-Json -Depth 5 | Out-File -FilePath $filename -Append
"}}" | Out-File -FilePath $filename -Append

Remove-Item "TempFolderForThisStuff" -Force -Recurse

This time it was even easier, since I just needed to download all the files and include each SQL script in a separate cell, as opposed to parsing the code out of a single SQL file as was the case before.
The only new thing is that ADS now supports collapsed code cells, so I added the metadata that collapses the cell (the hide_input tag) to each code cell in order not to end up with an extra-long notebook to scroll.
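To make the output of the loop concrete, here is a minimal sketch (in Python, purely for illustration) of the notebook JSON that the script assembles; the kernel metadata and the hide_input tag mirror the PowerShell above, while the sp_Blitz cell source is a placeholder, not the real script:

```python
import json

# Minimal sketch of the .ipynb structure the PowerShell script emits:
# a SQL-kernel notebook where each installation script sits in a code
# cell tagged "hide_input" so ADS renders it collapsed.
notebook = {
    "metadata": {
        "kernelspec": {"name": "SQL", "display_name": "SQL", "language": "sql"},
        "language_info": {"name": "sql", "version": ""},
    },
    "nbformat_minor": 2,
    "nbformat": 4,
    "cells": [
        {"cell_type": "markdown", "source": "# First Responders Kit Installation Notebook"},
        {"cell_type": "markdown", "source": "## Install sp_Blitz\n"},
        {
            "cell_type": "code",
            # Placeholder: the script puts the whole .sql file content here
            "source": "/* contents of sp_Blitz.sql go here */",
            "metadata": {"tags": ["hide_input", "dummytag"]},
        },
    ],
}

print(notebook["cells"][2]["metadata"]["tags"])  # → ['hide_input', 'dummytag']
```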

The script pasted above simply goes to Brent’s repository, downloads the latest installation files, and creates a notebook with each installation script in its own (collapsed) code cell.
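The heart of it is the GitHub contents API call and the sp_* filter. A quick sketch of that same filtering logic (Python here purely for illustration, and the sample payload below is made up, not a real API response):

```python
def pick_install_scripts(contents):
    """Given a parsed GitHub contents-API response (a list of items),
    return the raw download URLs of the sp_* installation scripts,
    mirroring the Where-Object filter in the PowerShell above."""
    return [item["download_url"]
            for item in contents
            if item["type"] == "file" and item["name"].startswith("sp_")]

# Made-up sample shaped like the API response
sample = [
    {"type": "file", "name": "sp_Blitz.sql", "download_url": "https://example/sp_Blitz.sql"},
    {"type": "file", "name": "README.md", "download_url": "https://example/README.md"},
    {"type": "dir", "name": "Deprecated", "download_url": None},
]
print(pick_install_scripts(sample))  # → ['https://example/sp_Blitz.sql']
```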

As far as the BPCheck Notebook goes, the Tiger Team was cool enough to create a script that deploys the notebook themselves, so you can just refer to the page on their repository here .

Summing Up

This is a first iteration of what can be created with jupyter books: a centralized document repository for notebooks and markdown files alike.

Of course, the potential of using this as a runbook platform for all kinds of business scenarios is very real.

If you have ideas on how to improve on this by adding scripts or changing things up, contact me right away using the contact form here !
