The tools in my day

Another month, another writing assignment for T-SQL Tuesday – although I did also have a guest post go live recently, in which I wrote about ways to put queries together when I have separate date and time columns.

This month, with the #tsql2sday theme being tools, hosted by Jens Vestergaard (@vestergaardj), I get to write about other methods I use to do my job.

In case you hadn’t realised, I’m a consultant, and tend to be on-site at my customers’ offices as I help them with their data. Sometimes I’m helping them with advanced analytics, such as making predictions with Machine Learning. Sometimes I’m in standard BI/DW projects, as they try to make sense of their data. Sometimes I’m troubleshooting performance issues, corruption issues, or whatever else they need. Or I’m teaching them about Power BI, querying, tuning, whatever. And more often than not, I’m using a machine that they’ve allocated to me – and using it alongside the customers.

I don’t always (or even typically) get to install new software on the machines I use. So the tools I use most are the ones already there – SSMS, ML Studio, Visual Studio, DAX Studio, Power BI Desktop, PowerShell Studio, and so on – along with various scripts that I bring with me. If I’m doing a salvage operation on a corrupt database, I might be able to start thinking about other tools, such as the HxD hex editor, or Plan Explorer (actually, the whole Sentry One suite is a great set of tools, and I love finding it at customer sites when I do).

I want to wax lyrical a little about one of the tools that I feel like I’ve got to know better over the past couple of years, since SQL 2014 came around.

I wrote about Live Query Statistics within SSMS a while back – and even presented at conferences about how useful it is for understanding how queries run…

…but what I love is that at customers where I have long-running queries to deal with, I can keep an eye on the queries as they execute. I can see how the Actuals are forming, and quickly notice whether the iterations in a Nested Loop are going to be unreasonable, or whether I’m happy enough with things. I don’t always want to spend time tuning a once-off query, but if I run it with LQS turned on, I can easily notice if it’s going to be ages before I see any rows back, spot any blocking operators that are going to frustrate me, and so on. Using TOP might introduce row goals that twist my plan more than I’d like, but using LQS lets me feel how long a query will take overall, as well as giving me good insight into it without too much cost. I figure the overhead of having the server store the execution profile (though I wouldn’t do this on a production server) is probably well worth it compared with staring at an estimated plan for a while and assessing whether I should do a bit of a rewrite or introduce some indexes.
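Under the covers, SSMS reads that execution profile from the sys.dm_exec_query_profiles DMV, introduced in SQL 2014, and you can poll it yourself from a second session. A minimal sketch – the session_id value 53 is just a placeholder for whichever session is running the slow query, and on SQL 2014/2016 that session needs profiling enabled (for example via SET STATISTICS PROFILE ON) before the DMV is populated:

```sql
-- In the session running the long query (SQL 2014/2016):
SET STATISTICS PROFILE ON;
-- ...then run the long-running query as normal.

-- In a second session, poll the profile while it runs
-- (53 is a placeholder for the monitored session_id):
SELECT node_id,
       physical_operator_name,
       row_count,            -- actual rows produced so far
       estimate_row_count,   -- the optimiser's estimate
       elapsed_time_ms
FROM sys.dm_exec_query_profiles
WHERE session_id = 53
ORDER BY node_id;
```

Re-running that SELECT every few seconds shows the row counts ticking up operator by operator, which is essentially what the LQS window animates for you.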

LQS is a great teaching tool, but it also serves as a progress bar for me on queries. I recognise blocking operators, such as Hash Aggregate or Top N Sort, and I can tell when an estimate on an operator is way off – to the point that I can quickly get an idea about whether I should let the query keep going or not. To be able to sit at someone’s desk and show them that their query will start returning rows fairly soon (or not) is excellent, and being able to make a better guess about whether it’s going to be another five minutes or another hour is also tremendously useful.
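That “way off” check can be expressed against the same DMV. A hypothetical sketch that flags operators whose actuals have already blown well past the estimate (again, 53 stands in for the monitored session, and the 10x threshold is an arbitrary choice):

```sql
-- Flag operators in a running query whose actual row counts have
-- already exceeded the optimiser's estimate by more than 10x.
SELECT node_id,
       physical_operator_name,
       row_count,
       estimate_row_count,
       CAST(row_count AS float)
           / NULLIF(estimate_row_count, 0) AS actual_vs_estimate
FROM sys.dm_exec_query_profiles
WHERE session_id = 53                        -- placeholder session_id
  AND row_count > estimate_row_count * 10    -- arbitrary 10x threshold
ORDER BY actual_vs_estimate DESC;
```

If a Nested Loop's inner side shows up here early on, that's usually my cue to kill the query and rethink, rather than wait out the hour.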

…because that way I can figure out whether to get some more coffee.