On containers

Anthony Nocentino (@nocentino) has invited us to write about what we're doing with containers. It's kinda his thing. It's good to have a thing. I got him to speak to the Adelaide User Group about it in February 2020, which turned out to be our last in-person meeting.

(We could switch back to in-person, but the room at Microsoft only holds 15 people under the current restrictions, and I don't think we've ever had attendance that low in the 16 years I've been running it.)

Two years ago, I wrote about how I'm running SQL on Linux using Docker, and I find this really useful (because I don't have to spin up a whole VM or make my machine dual-boot as I did back in the 1990s). I've also used containers for various Machine Learning tasks. In both situations, containers give me the flexibility to do things without involving my local operating system. In the last couple of years I've replaced my main laptop, and the impact on my containers has been really small.
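For anyone who hasn't tried it, getting a SQL Server container going really is a one-liner (the password here is just a placeholder, and you may want a different tag):

```shell
# Run SQL Server 2019 in a Linux container; the SA password is a placeholder
docker run -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=YourStrong!Passw0rd" \
    -p 1433:1433 --name sql1 -d mcr.microsoft.com/mssql/server:2019-latest
```

Once it's running, it connects like any other instance – SSMS or Azure Data Studio against localhost,1433.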

I know I'm not getting as much out of my containers as Anthony is. I don't have a large testing suite letting me spin up umpteen versions of SQL. But I know that containers will let me do this as and when I need it. I know that I can use containers to spin up things without their having a lasting impact on my local operating system. And that's really useful.

I encourage you – get used to containers. They're here to stay.


A hybrid world

This month, Ben Weissman (@bweissman) challenges us to write about hybrid solutions. He says:

It has become pretty evident to me, that we'll be living in a hybrid world for a long time – potentially indefinitely.

Ben Weissman

And this is what I want to address in this post.

I like what Ben says here, because for a long time it felt like hybrid was only the domain of the partially migrated – those organisations that had dabbled with the cloud but weren't prepared to commit; a halfway house for people who were on the journey towards the cloud but had a few things they weren't prepared to give up.

In some ways, I suspect this is actually the rationale behind hybrid solutions like Azure Arc. There are many organisations that like the features Azure offers, but simply aren't prepared to have their data in the cloud. It's a stance I've seen many times, although with multiple Azure data regions in Australia now, the reluctance towards the cloud is not what it was.

Power BI has also caused organisations to have some amount of data in the cloud even while their data proper still resides on-site. And of course, this leads to the situation where they push data into cloud storage to make refreshes easier.

I agree with Ben. I think many organisations will continue to have a large amount of their data within their own data centres for a long time yet. As much as I like the idea of everyone moving fully to the cloud, there might always be reasons why that's not practical. And so hybrid solutions will continue to improve. Azure Arc is an excellent example of how to think smarter around hybrid, so that some of the benefits of Azure can be realised without the need to compromise on having data in external data centres.

As was discussed last month, technology keeps moving on, and the features available in a cloud-only world are developing quicker than ever. But hybrid is one of the ways to make sure that people aren't left behind, and I really like the fact that hybrid is such a strong story.


Ever-changing technology

This month, Andy Leonard (@andyleonard) describes a situation where a technology change made his book inaccurate between its completion and when it was published. Ouch – that must've been a pain. He asks the blogosphere how we respond to changes in technology.

My answer is acceptance.

It happens. Technology moves on, generally quicker than we can keep up with. And so I don't try to pretend that I know everything. I like to think I know a lot, and that I keep up better than someone who doesn't do what I do – but I'm happy to admit that there are things I'm not across, particularly if they're recent changes.

I'm sure if I were a doctor, I'd need to be across a lot of things very thoroughly. But I also suspect that's why doctors tend to prescribe the same meds over and over – because it means they have fewer advisories to be across. If my doctor gives me some antibiotics, I hope they're reading journals about them and have considered whether they're right or not. I care slightly less about whether there might be a better alternative, but I do want to know that the one I'm being given is okay.

And so it is with tech. If I'm teaching someone about how to do something, and there's a new option that's available in the latest release (looking at you, Power BI and DAX, with your frequent updates – I love them, but every so often I miss a detail), then I might look foolish for a moment. But that foolishness is probably short-lived, because I'm happy to admit that I didn't know. I learn, as do my students, and we're all better for it.

There are times when I feel like I've missed quite a lot of things. Once upon a time, data was a relatively small area of expertise, and I felt like I had a good grasp on just about every aspect of the SQL Server product. Except that the more I looked into it, the more I realised there were areas in which I couldn't call myself an expert at all. As time goes on, I know I could spend all my time learning if I wanted. So instead, I spend most of my time doing, and only part of my time learning.

And I just accept that technology will change in the moments when I'm not looking.



Invoke-SqlNotebook

My old friend Aaron Nelson (@sqlvariant) told me about this back in the middle of last year – the concept of executing a Jupyter Notebook (.ipynb file) against a SQL instance and storing a copy of it (complete with the results of the various queries) in a separate file. Aaron and I have had a lot of conversations about SQL and PowerShell over the years, and Invoke-SqlNotebook fits nicely.

I don't think I was particularly convinced right away. For a start, not many of my customers were using Azure Data Studio, which is probably the easiest way to look at Notebook files. But mostly, I didn't consider that I really had a gap to fill.

The concept is that if I have a notebook with a bunch of queries in it, I can easily call that using Invoke-SqlNotebook, and get the results of the queries stored in an easily-viewable file. But I can also just call Invoke-Sqlcmd and get the results stored. Or I can create an RDL to create something that will email me based on a subscription. And I wasn't sure I needed another method for running something.

But the materialised notebook fits into a useful spot in between these. It gets created in an environment that is convenient for running queries, which is harder to argue for RDLs. And it contains multiple queries complete with nicely-formatted documentation, where the results of queries are displayed in amongst the text. SQL files created in SSMS don't fit that description.

So these days one useful option that's open to me is to schedule a call to Invoke-SqlNotebook, telling it to materialise a notebook with a bunch of useful queries, so that it can be stored somewhere in a file system and easily opened complete with the results. I can then give that to just about anyone to digest, because the combination of documentation and results can easily be made understandable and useful.
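As a sketch, the scheduled step can be as simple as this (Invoke-SqlNotebook ships in the SqlServer PowerShell module; the instance name and file paths here are placeholders):

```powershell
Import-Module SqlServer

# Materialise the notebook, embedding each query's results in the output file
Invoke-SqlNotebook -ServerInstance "MyServer" -Database "master" `
    -InputFile "C:\Notebooks\DailyChecks.ipynb" `
    -OutputFile "C:\Notebooks\Output\DailyChecks_$(Get-Date -Format yyyyMMdd).ipynb"
```

The output file is a regular .ipynb, so anyone with Azure Data Studio can open it and see the queries, the documentation, and the results together.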

If there are actions to be made based on the results of queries, I'm still going to use PowerShell to run the queries, and then take further actions based on the results. But when I'm just wanting to create something that shows data and explains what it is – Invoke-SqlNotebook does the job. Well done Aaron.

This post fits into Steve Jones (@way0utwest)'s T-SQL Tuesday this month. For more on notebooks, go to Steve's post and read some of the other posts linked to via the comments.


datetimeoffset Uniqueness

Datetimeoffset is an interesting data type, because there are multiple ways to represent the same value. And I don't mean like using a string to store "1" and "1.0" – I mean the fact that I might have inserted a value in one time zone, but need to query it in a different one.

CREATE TABLE dbo.TimesOffsets (TimeAndDateAndOffset datetimeoffset, CONSTRAINT pkTimesOffset PRIMARY KEY (TimeAndDateAndOffset));

INSERT dbo.TimesOffsets(TimeAndDateAndOffset) 
VALUES ('20210101 00:00 +10:30');

SELECT TimeAndDateAndOffset
FROM dbo.TimesOffsets
WHERE TimeAndDateAndOffset = '20210101 00:30 +11:00';
Insert in one time zone, query in another

Notice that I inserted a value in the time zone UTC+10:30, and queried it back in UTC+11, and the value that it gave me in the results is in UTC+10:30. Clearly the system knows that the value in the table was stored in UTC+10:30, and would store something different if I had told it to insert the value in UTC+11 (feel free to try it and see that it'll return the same value that you inserted, whichever time zone your WHERE clause value is in).

Sort order understands this too. If I insert other rows, it knows what order they are actually in, regardless of where in the world those times happen.
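To see it, I can insert a few more rows in assorted time zones (the values here are just ones I picked) and sort them:

```sql
INSERT dbo.TimesOffsets (TimeAndDateAndOffset)
VALUES ('20210101 09:00 +10:30'),   -- 2020-12-31 22:30 UTC
       ('20210101 06:00 +05:00'),   -- 2021-01-01 01:00 UTC
       ('20201231 20:00 -04:00');   -- 2021-01-01 00:00 UTC

SELECT TimeAndDateAndOffset
FROM dbo.TimesOffsets
ORDER BY TimeAndDateAndOffset;      -- ordered by the underlying UTC moment
```

The local times look jumbled, but the rows come back in order of the actual moment in time.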

As weird as it looks, this is in chronological order

And as I have a unique index on this, it won't let me insert 00:30 in UTC+11, because 00:00 in UTC+10:30 is already there. It tells me "Msg 2627, Level 14, State 1, Line 19. Violation of PRIMARY KEY constraint 'pkTimesOffset'. Cannot insert duplicate key in object 'dbo.TimesOffsets'. The duplicate key value is (2021-01-01 00:30:00.0000000 +11:00)."

Errors if I try to insert the same time but translated to a different time zone

I'm okay with this. They are definitely the same value, even if they are written differently. The same thing happens with strings in different collations, because they sort to the same value. Müller and Mueller will clash in a table with a German PhoneBook collation (and yes, that's what the German collations are called, complete with a capital 'B' in the middle of 'PhoneBook').

CREATE TABLE dbo.Names (LastName varchar(100) COLLATE German_PhoneBook_CI_AI, CONSTRAINT pkGermanNames PRIMARY KEY(LastName));
INSERT dbo.Names (LastName) VALUES ('Müller');
INSERT dbo.Names (LastName) VALUES ('Mueller'); -- fails with Msg 2627, a primary key violation

So you see, the time zone thing isn't about whether the values are actually identical. It's about whether they are considered the same based on what the SQL engine uses to compare them. When you're dealing with data that is case-insensitive, it will consider that 'FARLEY' and 'Farley' are the same, and complain if you try to have both those values in a unique index.
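The same demo works with a case-insensitive collation (the table name here is just for illustration):

```sql
CREATE TABLE dbo.Surnames (LastName varchar(100) COLLATE Latin1_General_CI_AS,
    CONSTRAINT pkSurnames PRIMARY KEY (LastName));
INSERT dbo.Surnames (LastName) VALUES ('Farley');
INSERT dbo.Surnames (LastName) VALUES ('FARLEY'); -- Msg 2627: duplicate key
```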

And for a fun trick, let's consider what happens with GROUP BY or DISTINCT.

For this, I'm going to first recreate those tables with non-unique indexes. Still using indexes, just non-unique ones.

DROP TABLE IF EXISTS dbo.TimesOffsets;
CREATE TABLE dbo.TimesOffsets (TimeAndDateAndOffset datetimeoffset, INDEX cixTDO CLUSTERED (TimeAndDateAndOffset));

INSERT dbo.TimesOffsets(TimeAndDateAndOffset) 
VALUES ('20210101 00:00 +10:30');
INSERT dbo.TimesOffsets(TimeAndDateAndOffset) 
VALUES ('20210101 00:30 +11:00');

SELECT DISTINCT TimeAndDateAndOffset
FROM dbo.TimesOffsets
ORDER BY TimeAndDateAndOffset ASC;

SELECT DISTINCT TimeAndDateAndOffset
FROM dbo.TimesOffsets
ORDER BY TimeAndDateAndOffset DESC;

And the results show different values according to whether we sort the data ascending or descending.

Different order, different results

All those times someone told you the order of the data didn't affect what the results were, just what order they were in… yeah.

And if I run the script again but insert the rows in the opposite order…

Same, but different

You see, the rows are in the table in the order they were inserted. And as we move through the index either forwards or backwards, we come across values, and the version of the value that gets seen first is the one returned. If it moves through forwards, it'll (probably) come across the first row that was inserted first. If it moves through backwards, it'll (probably) come across the last row first. I say 'probably', because there are times when it won't quite work like this. Parallelism could make this different. There may have been index rebuilds, or page splits, or the execution plan might decide to run things differently.

But the idea is there – those two values are the same, even though they're not.


This post fits into T-SQL Tuesday, hosted this month by Brent Ozar (@brento) on the topic of data types.

Beware the width of the covering range

I'm sure you've heard by now that casting a datetime to a date is still SARGable – that if you have an index on a datetime column (such as DateTimeCol in dbo.SomeTable), you will get a Seek even if you use a predicate like "WHERE CAST(DateTimeCol as date) = @DateParam". Sargability is all about whether your predicates can be used for a Seek, and it's something I've been preaching about for a very long time.

The trouble with this is that casting a datetime column to a date isn't actually a Sargable predicate. It feels like one, but it's not. Either way, I thought I'd write about this for T-SQL Tuesday, which is hosted this month by Brent Ozar (@brento). He asks us to write about our favourite data types, and I figure that talking about the pseudo-sargability of datetime-date conversion fits. (It was either that or write about how a unique index on datetimeoffset complains that '20210101 00:00 +10:30' and '20210101 00:30 +11:00' are the same value, even though they are clearly a little different. Maybe I'll do both… Edit: I did.)

Sargability is about the fact that an index is on the values in a column, and that if you're actually looking for something else, then the index doesn't work. So converting a column from one datatype to another doesn't cut it.

It doesn't even work to add zero to an integer value. That's enough to confuse the Query Optimizer into thinking that an index might not be right (and this is still the case if we use OPTION (RECOMPILE) so that it doesn't try to parameterise the query). It's easy for us to see that the order of things in the index isn't changed by adding a constant value, but the Query Optimizer doesn't look for this. I had a Connect item open for years about this.
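As a sketch (the table and column here are hypothetical, assuming an index on IntCol):

```sql
SELECT IntCol
FROM dbo.SomeTable
WHERE IntCol + 0 = 42;   -- the "+ 0" is enough to turn the Seek into a Scan
```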

The operation on the index is a Scan when I add a "plus zero" to the predicate

So what's special about converting from datetime to date? Well, it's about being able to add a helper predicate to the mix. (This is also how LIKE works against strings.) The Query Optimizer knows that all the values must be in a certain range, so that it can seek to that range. It works out this range using an internal function called GetRangeThroughConvert. It does a good job on this for LIKE, but doesn't do such a good job with dates.

Let me show you.

First I'm going to create a table called Times and put 100,000 rows in it, one per minute from the start of the year until roughly now. It's actually in a couple of days' time, mid-morning on March 11th.


CREATE TABLE dbo.Times (TimeAndDate datetime, CONSTRAINT pkTimes PRIMARY KEY (TimeAndDate));

INSERT dbo.Times (TimeAndDate)
SELECT TOP (100000) DATEADD(minute,ROW_NUMBER() OVER (ORDER BY (SELECT 1))-1,'20210101') 
FROM master..spt_values t1, master..spt_values t2;

Now when I query this table to find all the rows from today, I see an Index Seek and can be happy. "Obviously" this is sargable.
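The query in question is along these lines:

```sql
SELECT TimeAndDate
FROM dbo.Times
WHERE CAST(TimeAndDate AS date) = '20210309';
```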

Looks like we've got ourselves some sargability!

But it's not really. The properties of that Clustered Index Seek show that our WHERE clause is being used as the Residual Predicate, and that our Seek Predicate is based on the Expressions created in that Compute Scalar operator.

The properties show there's something else going on

Now, you'll notice that the operators used in that Seek predicate are > and <. Greater-than and less-than. Not Greater-than-or-equal and less-than. This is a range which is exclusive at both ends. And this is interesting because if I want to write out a date range myself, I do it using an inclusive start and an exclusive end, like:

SELECT TimeAndDate
FROM dbo.Times
WHERE TimeAndDate >= '20210309'
AND TimeAndDate < '20210310'
So I got to thinking about this. I wasn't quite curious enough to pull up the debugger to try to see what the values were, but I did do some investigation.

Notice in that tooltip I showed earlier that the "Actual Number of Rows for All Executions" property says 1440. That's the number of minutes in a day, so that makes sense. But the "Number of Rows Read" property is 2879. This means that the Seek Predicate is returning 2879 rows, and then the Residual Predicate (shown as just "Predicate") is filtering these down to the 1440 that actually produce the correct value when converted to the date type.

2879 is one less than the number of minutes in two days. I figure the "one less" bit is because it's an exclusive-exclusive range. But still, it's a range which is twice as big. It's not "a little bit bigger" – it's twice as big as it needs to be. To compare this with LIKE:

When LIKE knows what values are being used, it doesn't use those Compute Scalar & Constant Scan operators, and we see helper predicates that do an inclusive-exclusive range:

Inclusive-exclusive range

But when we use a parameter, we have the expressions and an exclusive-exclusive range, but the same number of rows read as returned.

Exclusive-exclusive range, but still the right number of rows read

The Query Optimizer clearly understands how to do the range for LIKE.

But with the date conversion, the range is twice as big.

Back to what that time period range is…

To figure this out, I started deleting rows from my table. I figured I'd be able to see the range by when the Number of Rows Read value started dropping.

First I deleted rows from mid-morning March 10th on:

DELETE FROM dbo.Times
WHERE TimeAndDate >= '20210310 10:00';

And the size of the range didn't change.

I removed all the datetime values from the 10th and the range didn't change. And all the datetime values from before the 8th. Only when I deleted '20210308 00:01' did the range start shrinking.

Without 00:01 on the 8th, my range to find values from the 9th was smaller.

So the range of my helper predicate hits the day before my date as well as the day I actually want. TWO days. Twice as big.

It certainly sounds like I'd be better off running my query using my own range predicate, typing out my inclusive-exclusive range instead of relying on the supposed sargability of casting to date. When I write this out, my plan becomes a lot simpler-looking, because it doesn't have to work out the range – but the plans tell me the cost is greater!

Looks simpler, but apparently the cost is greater?

It's lying though. It thinks the first query is cheaper because it does a bad job of estimating. It thinks there's only one row being produced when there are actually way more.

It estimates badly to make us think it's cheaper.

To try to even things up, let's try using a parameter for the date in both queries.
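That is, something along these lines (using a variable here to mimic the parameter):

```sql
DECLARE @DateParam date = '20210309';

-- The cast version
SELECT TimeAndDate
FROM dbo.Times
WHERE CAST(TimeAndDate AS date) = @DateParam;

-- The explicit inclusive-exclusive range
SELECT TimeAndDate
FROM dbo.Times
WHERE TimeAndDate >= CAST(@DateParam AS datetime)
AND TimeAndDate < DATEADD(day, 1, CAST(@DateParam AS datetime));
```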

And whoa! It's even worse. Now when we do the range ourselves, it thinks it's 93% of the batch, and the simplicity of the plan has gone out the window.

Ugliness prevails

And yet this "more expensive" plan is actually better. The Index Seek looks through 1440 rows to find the 1440 rows, while the CAST query looks through 2879 rows. The estimate on this second version says that over 16,000 rows might be produced (which is what causes it to be 93% of the batch). But if I'm going to be passing 1440 rows through a query, I'd rather guess 16,000 than guess just 1.

The conclusion of all this is to point out that although we often refer casually to the idea that casting a datetime column to date is sargable, it's actually not quite, and you're better off handling your ranges yourself.


Adapting the tools of my trade

It's been a while since there was a T-SQL Tuesday about the tools we use, so I figured my response to Mikey Bronowski (@MikeyBronowski)'s T-SQL Tuesday invite should be about things that have become more pertinent in the last twelve months.

I should point out that I rarely need to work from home 2020-style. I'm almost always back at client sites now, because our response to the pandemic here in Australia has been so good and we have no cases in my city. Massive thanks to my compatriots here, who have almost all been very compliant in following the rules so that we can now be back in offices.

But still there were adjustments, and SQL Community activity is still different.

These days I carry an external webcam with me in my bag, because I want one that's better than the one built into my laptop, and I want to be able to adjust its angle separately from my laptop screen. It's a Logitech C615 HD Webcam. My headphones for meetings are Plantronics Focus, which were a speaker gift from one of the big conferences years ago. They're pretty good at not picking up background noise, probably because of the mic arm.

I carry them around these days because more of my meetings are still remote, and I want my camera on. Back in 2019, if a meeting was remote it didn't tend to involve webcams. That's changed now, perhaps forever.

Without travel being an option, I've found myself thinking more about how I present online. Presenting online used to be just an occasional thing, and typically something that I would only do when needed. But now I find myself talking into my webcam really often. I'm not quite at the stage of buying googly eyes for my webcam (the webcam in that picture is also a Logitech C615) to make it seem more like a person, but I do have my webcam output showing on my laptop screen so that I have at least my own face to talk at. I'm sure talking to yourself is a sign of madness, but really I'm just giving myself a face to look at, rather than a bare camera lens.

I grabbed a whiteboard for home, because I find that writing on a whiteboard helps me work through ideas. I didn't have much wallspace, so it's actually a small whiteboard – similar size to a monitor. But it still helps a bit.

For presenting, I found myself really missing having a large whiteboard or flip chart. These things had become a bit of a staple of my presentation style, and I found they didn't really lend themselves to the online world. I know there are virtual whiteboards available in Teams and Zoom and so on, but I didn't like having to shrink my face down to the bottom corner of the screen when using it – it makes me feel like I'm connecting with the audience less. So I recently worked out a way of using OBS Studio to let me draw on my laptop screen like a glassboard. It also works for displaying PowerPoint bullet points that hover in the air like magic so that people can keep just watching my webcam feed (and all my hand waving), but still see my annotations.

And the idea has caught the imagination of a few people, including Brian Hogan and Scott Hanselman (who promptly made a video about it).

So now I'm still not using slides in my presentations, but I am using PowerPoint, because it responds nicely to drawing with my stylus.

Online presentations are still not really my thing. I'm looking forward to the day when I can safely travel to events and present to rooms of people, and interact with people without a webcam. Being able to draw on the screen doesn't replace being able to draw on a massive piece of paper, and I'm still stuck in one place for the whole thing. But changing the tools of my trade is helping to compensate a bit.


Scrolling credits in Teams or Zoom – for free and so quick and easy

I made a video about how to do this too, at https://youtu.be/JbRZoAqA1rw

The basic concept is that once you have a greenscreen application background in place, you can easily show text or drawings or whatever on the screen, superimposed over your webcam feed. I show how to do this in this post: http://blogs.lobsterpot.com.au/2021/01/30/presentation-trickery-online-glassboard-like-lightboard-but-using-just-free-software/. Scott Hanselman also made a video about it at https://youtu.be/-oaikJCR6ec, and used the idea to make scrolling credits, which is a really neat idea that everyone loves.

So once you have the greenscreen part set up…

I figure we all know how to make scrolling text in PowerPoint – you just make a textbox containing what you want to display, move it to the side (or bottom or wherever), and give it an Exit animation so that it passes through the screen. Make it slow enough to read, and then you just hit the Preview button.

So quick… so easy… and OBS is free (get it from obsproject.com) and you might already have PowerPoint somewhere…

EDIT – STOP THE PRESS, it's even easier to do the credits natively inside OBS…

I can add a Text source that I set to invisible…

…with the credits in the properties…

…add a Scroll filter with vertical or horizontal speed…

…and then when I make this layer visible, the credits scroll past!

So that's even easier than using PowerPoint! But using PowerPoint means I can draw with my stylus, which is the winning feature for me, so I'll use a combination. A PowerPoint layer for drawing and bullet points, and then the OBS Layer for credits.


Presentation trickery: Online glassboard (like Lightboard) but using just free software

I don't know if you've ever seen me present. I like to use whiteboards or flip charts, and that doesn't necessarily translate well to online presentations.

It was at least ten years ago when I had an idea about giving online presentations with a whiteboard, but where the whiteboard would be between me and the camera. A glassboard rather than a whiteboard, obviously, so that I would still be visible through the glass but not obscuring the text. And the image would be mirrored so that the things I wrote would be readable to the audience, given that I'd be drawing on the other side of it. Recently I've found out this is a real thing called Lightboard, using ultra-clear glass and lights to make sure what's drawn glows enough. And considering this seems to have only appeared around 2015 (a good five years after I was musing about the concept), I really should've explored things further.

This picture is from a site belonging to the National University of Singapore, but an image search gives plenty of examples

In a world where technical presentations are more online than in-person, I've struggled a bit with how to give my usual style of presentation. I know I could set up something like this, and I've been tempted, but it would be a lot of effort, and it's not exactly portable, and I just haven't (although I know at least one person who has…)

I know what you're thinking – and that's that tools like Teams do have whiteboards in their meetings. That's not what I want though – in those situations the main screen becomes white (or whatever colour the whiteboard background is), and the bit showing me disappears or shrinks to the corner. It's like sharing a screen. And my style isn't just writing on a whiteboard, it's pointing at the whiteboard, it's gesturing, it's all of that stuff that doesn't work if I'm constrained to the corner of the screen. If I had a greenscreen behind me I could do the "weatherperson" trick of moving in front of the screen to point things out, etc, but it's still not quite what I want.

And so I got to thinking about what could be done.

Enter OBS – that free piece of software that many people use now. From version 26 on, it can act as a Virtual Camera, so that Teams (or Zoom or GoTo or whatever) can show whatever OBS is doing. So I figured if I could get this to behave the right way, I would be able to use it in live presentations. The button that appeared in version 26 is in the Controls pane in the bottom right of the OBS window.

There is no mode where I can just use my stylus to draw on the screen where my face is. At least I don't believe there is, but I found a workaround.

The basic concept here is that I use my webcam as a source, but overlay a window capture from an application where I can draw with my stylus (PowerPoint will do, and is good for other reasons too). Then I set the application's background to a chroma key colour (like bright green #00FF00) and filter that out like a traditional greenscreen. I stretch that source in OBS a bit so that the main drawing area is over my webcam feed, and all the menus and stuff are outside.

To explain with pictures:

I started with a plain PowerPoint presentation and set the background to bright green. I'm going to leave the PowerPoint application in this mode because I find it behaves better. And as I'm about to draw on it with a stylus, I don't want it to do anything weird by switching into 'presenter mode' or 'annotation mode' or anything like that.

Then in OBS, with the webcam as the bottom layer of the Sources, I add a Window Capture and use the PowerPoint screen.

Now select the Window Capture source and resize it to match the slide to the webcam. This slide is going to disappear, though, when we add the filter.

Right-clicking on the "Window Capture" source, I can go to Filters and add a Chroma Key filter, using the default Green. You'll notice that the bits that are green in PowerPoint have gone grey, indicating that they'll be invisible.

Now back in OBS, I see myself again…

…but when I write in PowerPoint, it appears in front of me. (Oh, because I want to point at the things I draw, I flip the webcam horizontally. You might have noticed the writing on my shirt is backwards.)

And it's not just text that works. I can have standard bullet-point text.

And I can write in other colours too, just not green.

But that brings me to an interesting set of tricks.

The first one is to grab a screenshot of me on the webcam, and make it very saturated, and green. I can do this using just about any image editing tool. So now I have an image that looks like this.

I set this to my PowerPoint background, and I can easily see where I can draw and where I can't (assuming I don't move around too much).

I can also add gridlines to help my handwriting stay neater, and help me make sure I don't run out of room.

But in many ways the biggest trick is around the flow of my presentation.

Part of why I use whiteboards and flipcharts is because I feel like they help the audience connect with me better. There's something about writing live that means that people feel like I'm doing it on the fly, being more responsive to how the audience is responding, changing tack as I go.

But it's not like that at all. When I present without slides, or even without a computer, I have to know my material really well. I have to know where I'm going. I don't have the crutch of a slide deck of bullet points. I need to know what I'm planning to write. How I'm going to make the various points. Even how the audience is likely to respond to various things. By knowing the narrative of my presentation really well, it gives me the freedom to move around the content if I feel like I need to, but I have to know my anchor points in my head, because I don't have them on the screen.

So… as long as they're still green, and therefore invisible to the audience, I can indeed have them on the screen now. And whether I use consecutive slides with different colours (first green, then white/black/whatever) to make it look like it's building, or whether I trace over the top using my stylus, I can have all the notes I like – even notes to myself that the audience can't see. I can have as many slides as I want and can move around them just like regular folk do.

When it comes to giving demos, I still need to add a separate screen or do traditional screensharing through Teams or GoTo or Zoom or whatever. But with a bit more effort I could do my demo in the background and still draw over the top. Or get a physical greenscreen behind me so that I can have a solid background or demo screen behind me and still have my drawings in the foreground. This can definitely go a lot further.

But for my whiteboard-based presentations, this should work nicely.


PS: I just made a short video at https://youtu.be/LdtmEf2XhSU to show drawing on the screen. It would've been better if I had taken a few minutes to sort out my lighting and background, but you can see the rough concept there.

My other job

I don't tend to take many holidays. So when James McGillivray (@JamesMcG_MSBI) prompted us this month to write about taking breaks, I felt a little underqualified. I could write about how one of the reasons I enjoy my car is the way that I can separate from everything for a bit, but I think I'm going to write about the week before Christmas when I didn't attend client sites at all, and instead took on a different type of work.

In recent years I've got to know some people who run a company providing Father Christmases (I feel like that should be "Fathers Christmas", but that sounds odd – maybe just "Santas") for shopping centres, department stores, and so on. At a Christmas party last year, I mentioned that I had filled in as Father Christmas to give out the "Secret Santa" gifts at a client, and commented that maybe I should look into being a proper Santa. I say "proper Santa" as if Santa is real, but you know what I mean. They said I should remind them in the middle of the year.

Of course, 2020 was, well, you know. Most places didn't have a Santa, even though Adelaide had no Covid cases at all between April and November. I had said I could still be available, but the opportunities were scarce. In mid-November we had some cases hit Adelaide and we were in lockdown for a bit. But then by early December we were clear of the virus again, and I got asked if I could do a 3-hour shift as Santa on the 18th. And then that was followed up with the opportunity to do a 3-hour shift on each day Dec 21-24.

I said yes, and arranged to take a vacation from clients.

I still did some client work that week, but it was definitely a break from normality. And although I got paid for my work as Santa (meaning I could update my LinkedIn profile with "Professional Santa") that all went into a donation to a children's charity.

I wasn't a "sit on a couch and talk to the children that have queued up for a photo" Santa – I was a "walk around and greet people" Santa. Photos were selfies, not staged. At first I was a little disappointed, because it wasn't what I had imagined, but in many ways it was better this way, for quite an unexpected reason.

It wasn't a photo of me on the sign advertising that I'd be there…

I hadn't really noticed this before, but people that pre-book to have photos with Father Christmas are quite privileged. I don't mean that they're lucky to have the honour of sitting with him, I mean that they've been able to pick up the phone and make a booking. Or they've been able to take the time to queue for a long time. That they've been able to spend the money to get a photo.

The people that I was dealing with were just people that were at the shopping centre. People who needed to pick up a few things ahead of Christmas. Or who were stressed about getting some presents for people because they weren't ready yet. On the 24th many of them were dads with kids in tow, muttering "I don't know what we're going to get her".

And so I interacted with people. My elf and I walked around every part of the shopping centre for three hours each day. I waved at everyone I saw. With no cases of the virus in the community anywhere in the state, let alone the country, kids ran up and hugged me. Many of them were totally enchanted, and the wonder in their eyes made it worthwhile.

A picture of my elf and me.

But what caught me by surprise was the adults who wanted to interact with me. Not the parents or the shop-owners, who understood the "game", but the ones who were there with their carers because of some disability. The ones who still believed in Father Christmas. Who would never have been able to get a photo with a sit-down Santa, but who were just as excited to see me as any of the children (oh, except one kid who was absolutely beside herself – no one was more excited than her). Some of them I saw every day I was there – I guess an outing to the shops was part of their routine. One gave me a Christmas card on my last day.

The band Wizzard sings "I wish it could be Christmas every day", and part of me would like to recreate that feeling of being Santa more often. The theme from Cheers reminded us that we like to go where everybody knows our name, and when my name was "Santa" everyone really did. Walking through the shopping centre afterwards was strange because people couldn't see me. I would see those same faces, but wasn't able to talk to them. I missed it immediately, and totally understood the character Bob Parr (from The Incredibles) and his feeling of powerlessness when not able to help people to the extent that he wanted.

The annual PASS conference felt a little like that (as will events like SQLBits when they become in-person again), because I would see a lot of people I knew, but the level of enchantment isn't the same. People are never as excited to see Rob (well, not this Rob anyway) as children are to see Santa. And having lost Gareth Swanepoel to Covid this week, I'm feeling that we all need to see each other in the way that children see Santa. I was on a call with Gareth in the last couple of weeks, thinking he had recovered, and we were all able to tell him how good it was that he was still around. We never thought it might be the last time.

So for those people who ask if I'll work as Santa again – yes, I hope so. It's tiring work, and it's so hot in the suit, but the rewards are huge.

I pretended to come down a slide