T-SQL Tuesday 119 – Changing your mind

First let me say that Alex Yates’ blog has an excellent title / sentiment: “Working with devs – databases should not be bottlenecks”. It’s close to my heart for a number of reasons, not least because I used to be very much a developer and now I help solve database bottlenecks. I was in primary school when I learned about conditions and loops and modules. I dabbled at home with BASIC, got into the Unix space at university, and spent a number of years writing code in various versions of VB and then .Net languages. Throughout this time, I’ve had my mind changed a number of times, so Alex’s invitation for T-SQL Tuesday this month is a welcome one.

He wants us to write about a time when our mind has changed on something, and although I can think of plenty of times in my life when I’ve learned more about a situation and found that my earlier opinion was lacking somewhat, I’m going to focus on a pretty significant set of changes. Oh, and it’s worth pointing out that these days I hope I’ve learned that my opinions are shaped by my own limited understanding, and that there’s therefore every chance that I’m wrong about all kinds of stuff. On top of that, other people have their own opinions, shaped by their own limited understanding. So whenever I find myself disagreeing with someone, I try to remember that each of us is probably forming the most sensible opinion based on what we know. Most things aren’t worth fighting over.

The situation I’m going to describe is about databases – funnily enough.

Like Alex, I often work with developers to help them with the database they’re using. Developers typically have skills that I no longer try to maintain, and I have skills with data. And it’s good that we have different skill sets, because I don’t want to have to be a .Net (or Java or web) developer any more.

There was a time when I wanted to know everything about every type of coding. At university I enjoyed the subject Programming Paradigms, in which we had to learn about different approaches to coding, including functional programming and logic programming. I remember an assignment that had to be done in Prolog, solving puzzles where each letter represented a number and coming up with heuristics so that it didn’t just use brute force. It was eye-opening to realise that by adopting a different approach, you could achieve much better results.

Wind the clock forward to my early consulting days, and I was discovering that programming in the real world involved languages that were evolving fast. Whether it was VB3 through to VB6 and then the .Net framework, or PL/SQL with Oracle Web Services, programming was adopting generics and model-based architectures, and I was wanting to keep up, as well as moving up in management.

And then I changed my mind. About all kinds of things.

Firstly, I realised that I wasn’t going to be able to keep up with everything. But I could keep up with a subset. I changed my mind and decided not to try to keep up with the .Net world.

Secondly, I realised that I didn’t have to move up in management to grow my career. Leadership doesn’t mean management, and I was more interested in leading. I changed my mind about pursuing roles in management, and focused on serving, increasing my influence and establishing my profile.

Thirdly, I realised that data was what mattered. Applications could come and go, but the database had to be strong. I changed my mind about doing application development and consciously moved towards the database. I’d always been strong with data – but I changed my mind about where my technical focus could be. I think my first community presentation wasn’t about data at all. I’m pretty sure it was about asynchronous calls from web browsers – AJAX stuff. I only started presenting about data later, after I’d had the change of mind.

These changes were the things that led me to involvement in the SQL community, and to setting up a company that lets me hire other people who are passionate about providing consulting services in data. Before then, I’d been moving through a career progression that was essentially fine, but might not have left me doing things I enjoy.

I’m not saying that everyone should jump into data. While I consider it to be very significant, there are plenty of things that are more significant – if you can cure diseases then please go and do that. But don’t be afraid of changing your mind about things. Understand that the path that you’re on may not be in the direction you want. Make choices. Understand others. And change your mind when you realise you need to.


The SQL feature I’m still waiting for

This month Kevin Chant (@kevchant) challenges us to write about our fantasy SQL feature. And so I’m going to reiterate something that I had listed as a Connect item many years ago. It got quite a lot of upvotes, but was never picked up.

That feature was about making more predicates SARGable, by recognising when helper predicates could be leveraged to help performance. I wrote about it at http://blogs.lobsterpot.com.au/2010/01/22/sargable-functions-in-sql-server/, and the Connect item can be seen on the ‘wayback machine’ here.

The idea of this is to address the problem that people have when they write a predicate involving two date columns like “WHERE DATEDIFF(day, s.SomeDate, o.OtherDate) > 3”. This means that the number of days between the values in those two date columns must be more than 3, but could also be written as “WHERE s.SomeDate < DATEADD(day, -3, o.OtherDate)” or “WHERE o.OtherDate > DATEADD(day, 3, s.SomeDate)”. I appreciate that if you’re considering columns that also involve a time component then these aren’t exactly equivalent (in the first I’d need to convert o.OtherDate to a date type first, in the second I’d need to convert s.SomeDate to a date type, use 4 instead of 3, and make it >=), but I’m thinking about this from an index usage scenario.

If we have an index on s.SomeDate and we know o.OtherDate before the join is performed, then we might get a nice Index Seek operation to quickly find the rows in table ‘s’ that have SomeDate earlier than 3 days prior to o.OtherDate, but only if we’re using the first of those alternatives that I mentioned. If we know s.SomeDate and have an index on o.OtherDate, we might get a seek if we use the second alternative.

One of the query-tuning tricks I do is to introduce extra predicates to help indexes be used better. If I see a predicate that isn’t SARGable, I can often introduce helper predicates myself by adding them to the query. If I had all three of these predicates in my query, I haven’t changed the logic at all (again, assuming I understand the data types), but might have given the SQL Query Optimizer enough to do a better job of running this query.
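As a sketch (the table and join names here are just placeholders around the example columns above, and this assumes the columns are pure dates with no time component), a query carrying the original predicate plus both helper predicates might look like this:

```sql
-- Original predicate plus two redundant 'helper' predicates.
-- The helpers don't change the logic, but they are SARGable,
-- so either one can drive an Index Seek on its side of the join.
SELECT s.SomeDate, o.OtherDate
FROM dbo.SomeTable AS s
JOIN dbo.OtherTable AS o ON o.SomeKey = s.SomeKey
WHERE DATEDIFF(day, s.SomeDate, o.OtherDate) > 3   -- original predicate
  AND s.SomeDate < DATEADD(day, -3, o.OtherDate)   -- helper for an index on s.SomeDate
  AND o.OtherDate > DATEADD(day, 3, s.SomeDate);   -- helper for an index on o.OtherDate
```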

And these helper predicates I add don’t have to be exact – they might just reduce the range. For example, even if I just filtered s.SomeDate < o.OtherDate, it might help significantly. If I can’t figure out an exact-enough predicate, well that’s not a problem, because I still have the original predicate. So long as I’m not filtering out more than the original one, then my logic is okay.
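For instance (again with placeholder names), a deliberately loose helper is still safe, because the original predicate is left in place to do the exact filtering:

```sql
-- The helper only narrows the range; it never rejects a row
-- that the original predicate would have kept.
WHERE DATEDIFF(day, s.SomeDate, o.OtherDate) > 3   -- exact logic lives here
  AND s.SomeDate < o.OtherDate                     -- loose but SARGable helper
```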

I just think that Microsoft could build some of this into the product.


A quick tuning win with Memory-Optimized Tables

I hate writing ‘optimised’ with a ‘z’, but as a feature, I’m okay with writing about Memory-Optimized Tables as a way of optimising your SQL environment.

Let me start by saying that if you really want to get the most out of this feature, you will dive deep into questions like durability and natively-compiled stored procedures, which can really make your database fly if the conditions are right. Arguably, any process you’re doing (such as ETL) where the data doesn’t have to survive a system restart should be considered for Memory-Optimized Tables with durability set to SCHEMA_ONLY (I say ‘considered’ because the answer isn’t always obvious – at the moment inserting into memory-optimised tables won’t run in parallel, and this could be a show-stopper for you).

But today I’m going to mention one of the quick wins available: Table Variables that use User-defined Table Types.

User-defined table types are table definitions that you’ve created for the purpose of table variables that you’re going to pass between stored procedures. You can use them for any table variables, but typically I find user-defined table types are only used if the table variable is being used as a parameter somewhere. Still, this happens remarkably often these days, for lists of things. Typically small lists, where operations are quick, and the need for parallelism is less.

For example, a list of Products:
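Something like this, say (the type and column names are purely illustrative):

```sql
-- A simple user-defined table type for passing a list of products around.
CREATE TYPE dbo.ProductList AS TABLE
(
    ProductID int NOT NULL,
    Quantity  int NOT NULL
);
```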

…which is then used like this:
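Perhaps along these lines (hypothetical procedure and table names, but the pattern is the standard table-valued parameter one):

```sql
-- A procedure that accepts the table type as a parameter.
-- Table-valued parameters must be declared READONLY.
CREATE PROCEDURE dbo.GetProductDetails
    @Products dbo.ProductList READONLY
AS
BEGIN
    SELECT p.ProductID, p.Name, pl.Quantity
    FROM @Products AS pl
    JOIN dbo.Products AS p ON p.ProductID = pl.ProductID;
END;
GO

-- Calling it: populate a table variable of the type and pass it in.
DECLARE @list dbo.ProductList;
INSERT INTO @list (ProductID, Quantity) VALUES (1, 5), (2, 3);
EXEC dbo.GetProductDetails @Products = @list;
```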

The key thing here is that these tables already don’t survive a restart. They’re only scoped to the session, because they’re variables. In fact, a common misconception about table variables is that they’re automatically in memory only, but that’s simply not true.

…so why not change your type definition and make it that way? It’s really easy…

First, if you’re on-prem you need a filegroup and file that supports Memory-Optimized tables.
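Something along these lines (the database name and path are placeholders for your own):

```sql
-- A database needs a filegroup marked for memory-optimized data,
-- plus a 'file' (really a folder) within it.
ALTER DATABASE MyDatabase
ADD FILEGROUP MemOptFG CONTAINS MEMORY_OPTIMIZED_DATA;

ALTER DATABASE MyDatabase
ADD FILE (NAME = 'MemOptData', FILENAME = 'C:\Data\MemOptData')
TO FILEGROUP MemOptFG;
```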

This creates a folder for it to use – despite what the command looks like, it’s actually a folder, not a file. It’s just a thing that’s needed. You don’t need to do this on Azure SQL DB.

Then you simply have to change your definition. You’ll need to add an index, and use the WITH option.
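Continuing the hypothetical Products example, the memory-optimised version of the type might look like this (memory-optimized table types require at least one index; the bucket count here is just a plausible number for a small list):

```sql
-- The same type, recreated as memory-optimized.
CREATE TYPE dbo.ProductList AS TABLE
(
    ProductID int NOT NULL,
    Quantity  int NOT NULL,
    INDEX ix_ProductID HASH (ProductID) WITH (BUCKET_COUNT = 1024)  -- an index is mandatory
)
WITH (MEMORY_OPTIMIZED = ON);
```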

Frustratingly you’ll need to script any objects that use your type, because you can’t drop the type until you’ve dropped any objects that refer to it – but this change should be going through source control and proper DevOps practices and testing anyway, in case it turns out that performance doesn’t improve.

Make sure you’re patched to CU3 of SQL 2016 SP1, which allows the table variable to be scanned in parallel and reduces the risk of things slowing down at all (chances are you’re already seeing serial inserts on your table variables, but you don’t need them when reading back), and to CU7 to avoid some other annoying errors.

But it’s really as simple as that. For systems where data is frequently being pushed between procedures using TVPs, this is a low-risk change that can often have a noticeable impact.


(Thanks to Steve Jones for hosting this month!)

SQL on Linux – why bother?

It’s easy for someone who has been working with Microsoft technologies to wonder why they should bother learning about Linux so that they can run SQL Server on it, when it can run just fine on Windows machines.

There was a time when my main PC at home ran Linux. But that was before I graduated and got into the ‘real world’ and started creating applications that ran on Windows machines. Sure, many of the databases I interacted with were on Unix servers (which generally meant they were Oracle), but most of the systems I used were Microsoft-based. As time went on, my career moved further away from Unix servers and more towards Microsoft systems, until I was working completely within the Microsoft space. Windows Servers and SQL Server. The cloud changed things in regard to the physical infrastructure, but I was still within the Microsoft space for everything from the OS up.

I’m NOT about to rebuild my laptop using Ubuntu or Red Hat.

So why should I learn about Linux? Or why should you for that matter?

I’m not sure many of my customers care where their SQL Servers live. Only a few of them even care these days whether they live in the cloud or on-prem, and the operating system is becoming less and less important. I’m sure I could avoid Linux easily enough from that perspective. They’re more likely to move towards having no operating system to manage at all (by using a PaaS solution like Azure SQL DB) than towards an operating system they haven’t had to support before.

But as long as there are on-prem customers, there will be those who are concerned with the cost of licensing their operating systems, and that makes a strong case for Linux. Even those who are in the cloud but thinking more of VMs than SQL DB are starting to opt for the cheaper licensing of Linux.

And then there’s the concept of provisioning machines on-prem for a variety of reasons, particularly in containers – the idea of having instances of SQL that don’t need a whole machine to run, but can run within subsystems like Docker, able to be spun up and torn down without having to set up a whole server.

One major benefit of the cloud has been that we now think of provisioning new systems, rather than having to purchase servers. This had started in the virtual world even before the cloud, when new servers could be placed onto hosts that had other servers on them already – but it was still complex and the realm of IT departments. Developers wouldn’t provision servers. We’d install multiple named instances of SQL Developer Edition, but the overhead of having a number of Windows VMs was often painful (not to mention the licensing). Docker with Linux gives us access to a different world. We can provision containers running SQL Server easily, rip them down easily, reconfigure them easily, set up HA using them, and so much more.
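For instance, spinning up a SQL Server instance in a Linux container is only a couple of commands (the password here is obviously a placeholder, and the image tag may differ in your environment):

```shell
# Pull the SQL Server on Linux image and start a container,
# accepting the EULA and mapping the default port.
docker pull mcr.microsoft.com/mssql/server:2019-latest
docker run -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=YourStrong!Passw0rd" \
    -p 1433:1433 --name sql1 -d mcr.microsoft.com/mssql/server:2019-latest

# Tear it down again just as easily.
docker stop sql1 && docker rm sql1
```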

So why not embrace SQL on Linux? It doesn’t mean having to give up Windows. You can start by appreciating the ease of provisioning new systems with Linux – inside VMs and containers. Try it out, and see how easy it is. You don’t need to give up your copy of Windows, or even your version of SSMS. But the future isn’t going away, and the future includes SQL on Linux.


Thanks to Tracy Boggiano for hosting this month’s T-SQL Tuesday.

A letter to a former me

Hmm. This month’s T-SQL Tuesday topic (hosted by Mohammad Darab) is to write a letter to an earlier version of myself – the 20yo version. Obviously a lot has changed since then (more kids, less hair)… I’m going to focus on the career-related stuff though. Sometimes I think if I had it to do over again I wouldn’t get into something IT-related at all, because I’m typically much more interested in the people-consulting than technical-consulting. I do technical because people care about it, not for the sake of technology.

The pieces of advice that I would give to myself are about the significance of data, the significance of community, and the risk of not having a full-time job.

The significance of data

When I left university, I started right away in a consulting firm. I was offered the job half way through my final (honours) year, and I turned up not really having a clue about how it all worked. I knew enough about how to write software, and fumbled my way through the consulting side, working out that I could solve clients’ problems in code, and help them be happier with the bespoke applications we were producing. Over the years through various jobs in various cities I started to realise that the only important thing was the database. I only wish I’d realised that earlier, and realised that it was possible to specialise in data. In 1998 I was offered the chance to get into OLAP Services (as it was then) but was distracted by the rest of life, and a mindset that customers mostly wanted applications.

The significance of community

I only realised the significance of the technical community in the 2000s. It was this that helped me realise you could specialise in data, and that the world was so much bigger if you were involved in user groups. It was 2004 when I started to attend user group meetings, and 2005 when I was asked to take over the leadership of the Adelaide SQL Server User Group. I had no problem presenting, but before 2004 I didn’t even know that there were meetings. I’d attended TechEd Australia 1999 (in Brisbane), and never realised that I could’ve attended user group meetings in the city I was living. If I’d realised this earlier, then maybe I could’ve had the chance to impact the third thing…

The risk of not having a full-time job

It took a big leap of faith to quit permanent employment and set up my own thing, and in hindsight I wish I’d done it earlier – although it had taken me a long time to realise the significance of data and of community, such that I built up a profile that formed a foundation capable of attracting customers. It had worked in my favour in London in 2001 when Enron collapsed and the city became full of IT contractors, because my permanent role was secure. But until I was over 30 I had always chosen the stability of permanent employment.

So what I would be telling my younger self would be to open my eyes to the opportunities in data and in the technical community, so that doors to self-employment could open. I do what I do for the sake of my customers, but I have the chance to serve them only because I realised those first things.


Puzzling times

It’s T-SQL Tuesday again today! I’m feeling a little distracted because I’m in Poland for SQLDay, suffering from jet lag, and with my mind turning to the presentations I have to give this afternoon and tomorrow. Yesterday I gave a full workshop on SQL Internals and Performance without turning on my computer even once, but I can’t let the day go past without writing a post.

The topic for the month is hosted by Matt McGiffen (@mattmcgiffen) and is on puzzles. It takes me back to the second ever T-SQL Tuesday, almost a decade ago in January 2010. Back then I wrote about an interesting puzzle I had come up with to explain how GROUP BY and HAVING work (the answer is here – and as my blog has moved in recent years, as has Adam’s post, I can’t guarantee any of the links there).

This month I was reminded of something from years back (2007!), when Itzik Ben-Gan challenged people to solve a palindrome puzzle – how to find palindromes made up of words in a list. Even that post has moved – originally being at SQLMag, and now being at ITProToday. I took on the challenge, and was one of the solutions listed by Itzik in his follow-up post. I’ve never tried to create code to solve sudoku puzzles or other things, although when I was at university I did a subject that included using Prolog and needed to solve number-letter substitution puzzles, which I enjoyed in a strangely geeky way.

The reason why I mention the palindrome puzzle is that the approach I took was very similar to the approach that I’d take if I were trying to do it by hand – looking for words that start with the letters that I’m missing from the end. I’ve learned (and taught many times) over the years that if I can try to persuade the SQL engine to run a query the same way that I would in real life on paper, then when the computer does it, I’ve got a strong chance of it performing well.

These days the puzzles that I spend time on are the ones that my clients need me to solve, although I’m often tempted to pick up puzzle books and solve some things. We should never fail to find puzzles that exercise our minds – for what else will keep us sharp?


SQLDay in Poland

Today I’m sitting in the historic city of Wrocław in Poland. I’m not sure I ever would have thought to visit, except that over a year ago I was asked to be one of the special guest speakers at SQL Day – one of the largest SQL Server-centric events in the world.

The city is a charming place, and despite the rain that’s falling today and my jet lag from travelling for over 27 hours, this place has fascinated me, from the architectural mix of plain v elegant and old v new to its friendly people. I knew of Silesia from a board game, but didn’t know of Wrocław, and certainly didn’t know how to say it (which is roughly “Vrotz-waff”). I’ve walked around, enjoyed some coffee in a Czekoladziarnia, looked around the museum in the Old Town Hall, and will enjoy seeing the attendees at the event which is in the Centennial Hall – which I’m told is the second most significant building here (I even saw photos of it in the museum) and is on the UNESCO World Heritage List. The image below is from VisitWroclaw.eu.


Along with the PASS Summit (USA), SQLBits (UK), and the Data Platform Summit (India), I would consider this to complete the other Grand Slam of SQL events around the world. I’ve done precon seminars at the other three, and tomorrow rounds it off. It’s a great honour to be invited to be one of the special guests – when I look at the list of others they’ve had over the years I know I’m in excellent company.

I’m probably the only speaker here who won’t be using a computer. I’ve discovered over the past couple of years that if I can explain concepts without using one, it exercises a different part of the brain and people understand in a deeper way. And so I’m going to spend a day talking about SQL Internals and Performance, using a flip chart, and using audience members to help demonstrate concepts. Hopefully they’ll enjoy it and be able to approach their data in a different way. I have two other presentations to give later in the week which will also be different to the rest – one will hopefully involve audience discussion, and the other is a session that I’ve done as a keynote before, about story-telling and seeing yourself in movies. I hope to be thought-provoking and entertaining. If I can inspire the data community in any way, then I’ll be happy.


My personal databases

This month’s T-SQL Tuesday sees Todd Kleinhans (@toddkleinhans) ask about what we use databases for – outside work. Because obviously as database professionals, we ought to be putting what we preach into practice in our personal lives. We should be leveraging our expertise to make sure that the data with which we live, not just the data with which our clients live, is taking advantage of everything we have to offer.


Except that I’m a consultant. Sure, I’m a consultant specialising in the Microsoft Data Platform & Analytics space, but my main priority is looking after the best interests of our customers. Ultimately, that’s what they pay us for – it’s just convenient when it overlaps with our expertise in the Microsoft ecosystem.

Certainly I don’t like to tell my clients to pay for things they shouldn’t be paying for, and that includes my time in building a solution when there’s an off-the-shelf solution.

At home, there are all kinds of data that I’m interested in. I’m interested in my financial transactions, in my car usage, and more. And that’s before I even consider the things that affect the company, which obviously I need to track.

But I have applications for those things. My bank keeps track of the financial transactions, and I use Xero for the business. We use Harvest for timesheets, and I track my car usage with an app on my phone. And most of these things provide the analytics that I’m after too. I can look for changes in my spending, or in how project time is tracking against budget, and generally manage my data WITHOUT having to spin up my own databases and create my own applications. I do this type of work for our customers, but I also know that the best use of my time is not creating applications, but tuning and troubleshooting, analytics and insight.

I treat my family and my company (and myself) like my customers, and don’t reinvent the wheel for no reason. If there’s a product out there that already does what I need it to do, I’m not going to roll my own. It’s just not in their best interest.


T-SQL Tuesday – the confidence ‘cookie jar’

This month, Shane O’Neill (@SOZDBA) challenges us to write about the “cookie jar” that we reach into to re-energise, to give us back confidence when we’re feeling empty of it. He says we all have times of success that we look back upon to help convince ourselves that we can get through our current difficulties. Past achievements that signal that we can still achieve today.

Hi, my name is Rob Farley, and I’m not that person.

For me, this isn’t how life works. I don’t look back at times that I’ve been successful and therefore assume that things will be okay going forward. I’ve had enough experience to be confident that my past successes are no guarantee of anything in the future. I remember a time in my life when job interviews seemed to go well – I felt quite hireable – and yet not so long afterwards I couldn’t seem to get through an interview at all. Going in on the strength of past experiences wasn’t enough for a while, and those experiences then changed and became a negative influence on my thought patterns.

From a technical perspective, I’ve had to learn that the ways that things were done in the past are no indication of how they will happen in the future, and again I find myself looking at past experiences and realising they have no bearing. Just because I knew something about a particular technology some years ago doesn’t mean that I’ll have the faintest clue next time that area comes up, because I could just as easily find that my past patterns demonstrate that I’m now simply archaic.

Luckily, this isn’t how I work either.

I don’t look back at achievements and base my opinion of the future on that. Nor do I look at other people’s achievements and base my opinion of their future on that. I’d rather look at myself, or at the qualities of other people, to make those judgement calls.

If I put my confident hat on, I know that I’m good. I’m not overly comfortable saying that. I’m naturally self-deprecating, and concerned about the self-worth of other people, and find that I have to figuratively put on a different hat to make those kinds of statements. It feels like it’s bordering on arrogance, except that arrogance implies that I’m devaluing those around me. Really it’s just self-assurance. Knowing that I will be able to beat whatever the thing is that I’m going through. And I don’t always feel it, I just have to decide that I’m going to get up in the morning anyway.

I know I’m good in that I can solve whatever technical thing I’m given. I’ve always had that ability. Sometimes solving something might need me to ask others for help, but I’m okay with that. Sometimes solving something may be the decision to not do various things, maybe finding a workaround instead. But I’m not afraid of technical things.

I’m not afraid of new experiences. I did stand-up comedy a few years ago, and it was good. I don’t mean that I was funny – I mean that the experience was generally positive. Public speaking is nerve-racking, but if the alternative is to not do it, then I’d be worse off.

I am afraid of hurting people, and will choose to avoid those things that I think will tread on people, or subject people to unnecessary risk. In that regard, maybe I’m not much of a risk-taker. I can step out over an edge, but will hold back if the person with me doesn’t want to go for that journey.

I have no problem looking at the past and seeing things that I’ve achieved, but I don’t value those things as much as the achievements of other people. I’m immensely proud of the people I’ve been able to help along their journey, but I don’t look back at particular presentations or blog posts or projects with a sense of pride – that’s reserved for the things that make other people smile.

For my own “cookie jar”, I stand on who I am, on who God says I am, and figure that I can get through somehow. Not because I have done in the past, but because I will in the future.


Why we do what we do

This month, Andy Leonard (@andyleonard) asks why we do what we do. He talks about the fact that money is rarely the only reason people become entrepreneurs, and challenges us to explain what makes us get up in the morning.

Money is definitely not the driving force in my life. I do feel a burden of providing for my family, and don’t feel like money has ever really been a thing going spare, but I also see money as a tool to bless others rather than as something to hold onto.

For me, I created LobsterPot Solutions to help people. It’s a vehicle so I can serve people in whatever ways that I can – typically through consulting about their data, be it strategy, analytics, intelligence, data integrity, whatever. I feel like people could always do what they do better if only they had better data, if they didn’t have to worry about it, or if they could ask questions of it… all stuff that I can help with. And I have employees because it increases the reach I can have. It means we can help more people.

Of course, the company does have to be financially viable. And the employees need to be looked after in whatever ways they need. This takes money. But all of it is to provide a service to people, to help enable those people through data. Our customers do good things, whether they’re in government agencies, or health care, or charities, or mining, or whatever it is – and they’re trying to do the best they can do. So if we can help them make that best a little better, then we’re doing something right.

That’s my why.