Dear 20 Year Old Self

This month’s T-SQL Tuesday invitation comes from Mohammad Darab, who asks us to write a letter to our 20-year-old selves. I love this idea for a theme and am really looking forward to other people’s write-ups. To date this in time: I’m currently 35, so this will be a 15-year flashback.

I’ll be honest, I don’t have many regrets in life. Some things haven’t gone the way I expected, but it has all come together to bring me to where I am now.

Some pieces of advice I would have sent back to my 20 year old self:

  • That degree you’re doing? Yeah, complete that because you’ll be paying it off for the next 15 years.
  • Don’t stress about your career yet, that will sort itself out in due time.
  • Don’t stop playing Rugby, you’ll get fat.
  • Enjoy your hair while you have it.
  • Finally: don’t worry, you’ll grow into your face.

There aren’t a lot of professional suggestions there; that’s probably a mix of how happy I am with my career as it currently stands and how much my life has changed over the last 15 years.

I quite like the introspective nature of this post, so see you in another 5 years for another update!

Full Backups

Have you ever heard the phrase ‘don’t keep all of your eggs in one basket’? Yeah, people have learned that particular lesson the hard way. Don’t make the same mistake.

When you do a full backup you’re taking a copy of your database (and all of the data within) and storing it somewhere nice and safe. If something tragic happens to your database you will have a copy of it stored elsewhere. 

You can take a full backup manually using the GUI, just right click on your database and select Tasks -> Back Up…

You get to select things like what database you’re backing up, the type of backup you’re taking and the location you’re saving it to. To save it to the default location you can just click OK and watch it take a full backup.

If you have a little time, have a read through the available options. The main suggestion is to save this backup to a different drive from the one your database is stored on (if your database is on D: then save your backups to another drive, if you have one available).

You should also consider backup compression. It’s going to save you disk space, which adds up quickly once you’re scheduling backups regularly.
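If you’d rather script the backup than click through the GUI, the T-SQL is short. Here’s a minimal sketch (the database name and path are just examples, point them at your own):

BACKUP DATABASE [MyDatabase]
TO DISK = N'E:\Backups\MyDatabase_Full.bak'   -- a different drive to the data files
WITH COMPRESSION, CHECKSUM, STATS = 10;       -- compress, add checksums, report progress every 10%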

Talking of scheduled backups, this is something that should be happening regularly. You’ll need to consider your own RTO/RPO goals to see how much data you’re willing to lose (in minutes). A good default is to set your full backups to nightly and your log backups to that figure (e.g. if you’re willing to lose up to 15 minutes of data then take log backups every 15 minutes).
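The log backup that the 15 minute job would run looks much the same. A sketch, again with my own example names, and bearing in mind the database needs to be in the FULL (or BULK_LOGGED) recovery model for log backups:

BACKUP LOG [MyDatabase]
TO DISK = N'E:\Backups\MyDatabase_Log.trn'
WITH COMPRESSION, CHECKSUM;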

Do you want an easy way to create the agent job to schedule that backup? At the top of the GUI when taking a backup there’s a little Script button, and you can send the script straight to an agent job.

From here you can set up your own SQL Server Agent job; the main thing you need to do is give it a schedule (based upon your own RTO/RPO goals). Once that’s done you’re golden. Check back tomorrow and make sure your backups are working.
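If you’d rather build the job by hand than use the Script button, a rough sketch looks like the below (the job name, backup command and schedule are all mine to illustrate, so adjust them for your environment):

-- Create the job
EXEC msdb.dbo.sp_add_job
    @job_name = N'Nightly full backup - MyDatabase';

-- Add the backup step
EXEC msdb.dbo.sp_add_jobstep
    @job_name      = N'Nightly full backup - MyDatabase',
    @step_name     = N'Full backup',
    @subsystem     = N'TSQL',
    @database_name = N'master',
    @command       = N'BACKUP DATABASE [MyDatabase] TO DISK = N''E:\Backups\MyDatabase_Full.bak'' WITH COMPRESSION, CHECKSUM;';

-- Schedule it nightly at 01:00
EXEC msdb.dbo.sp_add_schedule
    @schedule_name     = N'Nightly at 01:00',
    @freq_type         = 4,       -- daily
    @freq_interval     = 1,       -- every day
    @active_start_time = 010000;  -- 01:00:00

EXEC msdb.dbo.sp_attach_schedule
    @job_name      = N'Nightly full backup - MyDatabase',
    @schedule_name = N'Nightly at 01:00';

-- Tell the job which server to run on (the local one)
EXEC msdb.dbo.sp_add_jobserver
    @job_name = N'Nightly full backup - MyDatabase';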

Don’t forget to practice restoring these backups elsewhere to make sure that they’re working without anything silly like corruption!

The Final Countdown

Do do do dooooo, do do do do do……..

You know that SQL Server 2008 or 2008 R2 box you’ve got sitting around on an old dusty server somewhere? You’ve got 3 months to upgrade this to a version of SQL Server that was released in the last decade.

Just in case you’ve missed it, on July 9th this year (2019) Microsoft is ending support for these versions of SQL Server.

But what does that mean for you?

Well, some of you might just not care. And that’s fine if it’s a decision you’ve come to after considering the risks of staying on an unsupported version of SQL Server. But bear in mind:

  1. You won’t be getting any security updates for your box.
  2. Microsoft won’t even touch you if you try to raise a support case with them.
  3. You know those bugs they find and patch? Nope, you’re stuck with them now.
  4. You’re missing out on all of the features in later versions of SQL Server that used to be Enterprise Edition only.

Worth the risk? It’s totally your decision. I certainly wouldn’t want to be running this risk.

If you’re like me and don’t want to accept these risks, consider this the 3 month mark to have these old instances upgraded or taken out of commission.
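If you’re not sure which instances are affected, a quick check against each one will tell you; 10.0.x is 2008 and 10.50.x is 2008 R2:

SELECT @@VERSION                        AS VersionString,
       SERVERPROPERTY('ProductVersion') AS ProductVersion;  -- 10.0.x = 2008, 10.50.x = 2008 R2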

Good Luck!

Azure Data Studio – Extensions

Azure Data Studio supports installing extensions and has its own marketplace where you can get the full install details.

On the left-hand bar, choose the funny little Extensions icon.

It will show the extensions you already have installed as well as the marketplace, which you can sort and scroll through.

Click on any that takes your fancy and it’ll open a page about it, along with animations showing how it works (if the publisher has included them).

You can also build your own or download other extensions directly from source. Let’s go and grab Phil Scott’s pre-release of queryplan.show:

https://github.com/phil-scott-78/azure-data-studio-queryplan-show/releases/tag/v0.0.1

Go ahead and download the .vsix file. Once you’ve got that, open ADS, open the command palette (Ctrl+Shift+P) and find Extensions: Install from VSIX…

Navigate to your download folder and select the vsix file you’ve just downloaded.

If it’s not on the store you’ll probably see the following error message. I’m going to click Yes, but you need to be sure that you trust the publisher (or that you like living life dangerously, you rebel).

YOLO right?

Once it’s installed you’ll have to reload Azure Data Studio to enable it.

Once it’s reloaded you’ll see that your new extension is enabled.

Nice work, you can now add Master With Azure Data Studio to your C.V.

Each extension will have different install instructions that will be shown on their GitHub page. Follow these and you’ll have access to those sweet extensions.

Azure Data Studio – Command Palette

Ever wondered how many things Azure Data Studio can do? Open the command palette and have a scroll through.

Press Ctrl+Shift+P and you’ll see the command palette at the top of the screen. Your recently used commands appear at the top; the rest are scrollable.

There’s a whole section for your installed extensions, a whole bunch for source control and loads of others. Have a dig through and see what’s interesting to you.

You’ll use the command palette a lot; get comfortable with it and your life will be a lot easier.

Azure Data Studio – Execution Plans

If you’ve ever had to get involved in query performance you’ll have used execution plans. Azure Data Studio gives you execution plans too, but they’re a little tricky to get at.

Let’s build a quick query and gather our execution plans; the query just needs to be something that will work on any database.
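Something like this against the system catalog views would do; the query itself is just a stand-in for whatever you actually want to tune:

-- Any query works for grabbing a plan; this one only touches catalog views
SELECT o.name AS ObjectName,
       o.type_desc,
       c.name AS ColumnName
FROM sys.objects AS o
JOIN sys.columns AS c
    ON c.object_id = o.object_id
ORDER BY o.name, c.column_id;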

In your query editor window the obvious button is the ‘Explain’ button. This works and will give you the estimated query plan.

You’ll recognise the execution plan that you see in ADS as it looks very much like the one in SSMS.

Estimated plan in Azure Data Studio

Here’s what it would have looked like in SSMS

The same plan in SSMS

One advantage ADS has over SSMS is that you can also see the Top Operations natively.

This has been possible in Sentry One Plan Explorer for a long time but it’s also now native in ADS. It’s great when you’re looking at a massive plan and want to drill down into the major pain points quickly.

Getting the actual execution plan is a little more complicated. It’s not a nice easy button so you’ll want to get used to shortcuts.

Press Ctrl+Shift+P to open the command palette and type ‘run’. You’ll notice the command ‘Run Current Query with Actual Plan’. That’s the ticket.

You’ll also notice that there’s an even better shortcut. Ctrl+M is going to execute the query and give you the actual plan.

There is currently a gotcha with ADS actual execution plans where only the last code block gets rendered. There’s an open GitHub issue for this, so keep an eye on it as there’s new stuff being released every month.

Azure Data Studio – Server Management

Let’s look at how you can connect to your servers and group them up using Azure Data Studio. 

You’ll see icons down the left navigation bar. The one we care about is at the top and will take you to the Servers area of the app.

You may as well jump in and connect a server to see what it’s like.

You’ll be given a connection popup. Assuming you can connect using Windows credentials then you’ll just need to put the name of your instance in the server box.

You’ll see the connection appear in your server list. Have a click around: you can see the databases, security and database objects, all of which you’ll be used to from SSMS.

You’ll be able to connect to all of your servers here, just add them one at a time.

Once you’ve added a few you’ll probably want to start organising them into folders. Go ahead and add a new server group.

You get to choose a name for the group, as well as a description that pops up like a tooltip. You also get to choose a funky colour for it.

Personally, I’ve separated out Live from Dev from QA but do whatever is best in your environment.

If you have instances stacked on the same box then you can create subfolders for these. Just drag and drop folders within folders and instances in those folders.

Look at that, all pretty and organised.

What is Your “Why?”

This month Andy Leonard has asked: What is your “why”?

Well, here’s my Why.

I love SQL Server and the community that surrounds it. It’s so welcoming, open and accessible.

I’ve had a sort of organic progression through Microsoft products in my career. I’ve gone Excel Developer -> Access Developer -> SQL Server Developer -> SQL Server DBA (there are some other products in there like SSRS, but that’s the main path). I’ve never really felt comfortable with any of the communities around those other products, but SQL Server is a different kettle of fish completely.

Finding the SQL Server Community Slack was a great thing. I am the only DBA where I work (alongside loads of developers) and having people to chat to about DBA stuff is such a pressure release.

Also, check out the call for speakers at most conferences. It’s not unusual to have a ‘first timers’ track for people who want to get into speaking. Doing this isn’t a necessity but it shows how inclusive the community is.

I didn’t choose to stay with SQL Server because of the technology specifically (although I do enjoy focusing on performance tuning) but rather the community around it.

Generate Test Data with Faker & Python within SQL Server

Make sure you’ve done these steps first:

  1. You’ve installed SQL Server with Python
  2. You’ve then installed pip
  3. You’ve also installed Pandas using pip

Then let’s get started

We’re going to use a Python library called Faker, which is designed to generate test data. You’ll need to open a command line in the folder where pip is installed. In my standard installation of SQL Server 2019 it’s here (adjust for your own installation):

C:\Program Files\Microsoft SQL Server\MSSQL15.SQL2019PYTHON\PYTHON_SERVICES\Scripts

From here you want to run a quick command to install Faker.
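Assuming the pip in that Scripts folder is the one tied to SQL Server’s Python, the command is just:

pip install Faker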

Once that’s done and it’s installed, we can open SSMS and get started with our test data.

We’re going to get started with the sample code from the official documentation, but we have to add a print statement to see our results because we’re using SSMS.
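Here’s a minimal sketch of that: the Faker quickstart wrapped in sp_execute_external_script, with print() added so the output shows up in SSMS.

-- Faker quickstart wrapped in sp_execute_external_script;
-- print() sends the output to the Messages tab in SSMS
EXEC sp_execute_external_script
    @language = N'Python',
    @script = N'
from faker import Faker

fake = Faker()
print(fake.name())
print(fake.address())
';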

If you run this in SSMS you’ll see the output in the Messages window.

This guy loves quality legwear

Now we know that works, let’s put this into a usable format within SQL Server.

Now we need our block of Python, which we’ll wrap in T-SQL.

For the purposes of this example, we’re going to make a temp table to store the data so we can view what we’ve done.
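Wrapping the Python into T-SQL gives us something like the sketch below (the temp table and column names are my own, and I’m leaning on pandas for the output data frame):

-- Temp table to hold the generated rows
DROP TABLE IF EXISTS #FakePeople;
CREATE TABLE #FakePeople
(
    FullName    NVARCHAR(200),
    FullAddress NVARCHAR(500)
);

-- 100 fake names and addresses, returned from Python as a data frame
INSERT INTO #FakePeople (FullName, FullAddress)
EXEC sp_execute_external_script
    @language = N'Python',
    @script = N'
import pandas as pd
from faker import Faker

fake = Faker()
rows = [{"FullName": fake.name(), "FullAddress": fake.address()} for _ in range(100)]
OutputDataSet = pd.DataFrame(rows)
';

-- Have a look at what we generated
SELECT FullName, FullAddress FROM #FakePeople;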

Go ahead and run it; you should see a sample of 100 names and addresses now stored in your temp table.

There are far more options when using Faker. Looking at the official documentation you’ll see the list of different data types you can generate, as well as options such as region-specific data.

Go have fun trying this, it’s a small setup for a large amount of time saved.

Azure Data Studio Themes

This is one of the features of Azure Data Studio that is great for accessibility as well as just being cool.

The default theme is your basic light theme. It’s fine, but it isn’t the only theme available.

Use Ctrl+K Ctrl+T to open the theme options.

Have a click through and see how they look when you’re editing code. It’s a case of choosing something that suits your style. My preference is the default dark theme but go nuts and choose one you like.

Oh, and if you’re a sadist, check out the Red theme.