Where projects often go wrong

I don’t write about our customers in a negative way. I just don’t. I don’t ever want a customer (or potential customer) to see what I’ve written about someone else and to worry about what I might be saying about them. So I don’t tweet about the times when something’s driving me crazy, or about code that I’m seeing – I don’t even tweet about the times that I’ve tuned a process from 15 minutes down to a second or two in case someone considers that I’ve just broadcast that they had a process that was taking 15 minutes.

But I do write for T-SQL Tuesday every month. Without fail. And this month, the topic is failed projects. Thanks to Jeff Mlakar (@jmlakar) for hosting.

And so I want to write about something I see which commonly makes projects fail, and that is:

Misaligned expectations and feedback loops

There’s a blog post I often show to customers that dates back to 2006. It’s Kathy Sierra’s post called “Don’t Make The Demo Look Too Done”, and it brings out a lot of good points.

The gist of it is that customer expectations largely depend on how finished the demo looks, because they don’t see what’s happening behind the scenes. They’re measuring the completeness of the project based on the tip of the iceberg, and not really listening to any commentary about how much work there is to do beneath the surface. I was thinking of putting a picture here of an iceberg, but I think you can imagine. You know what’s going on out of sight of the customer, and how the user interface is just a small part. It’s just that the customer might not. They’ll imagine, and either get excited about how much progress they think has been made, or concerned about how little progress seems to have happened.

Kathy also points out that the depth of feedback tends to reflect how complete the thing appears to be. If I asked for feedback on this post, you might say “You know, you really should’ve put a good picture of an iceberg in that last paragraph”. Or you’ll point out the fact that I have a mispelled word in this sentence. You probably won’t tell me whether you think Kathy’s post is a good way of making my argument, or whether I should continue to write blog posts every month. You’ll probably assume that the window for that is gone. (Of course, having said this, there are four types of comments that will almost certainly appear below – no prizes for the bleeding obvious.)

As a consultant who tends to get involved in customers’ businesses, not just their data (because honestly – there’s a reason why it’s called “Business Intelligence”), I do sometimes find myself saying “Now, please tell me if the opportunity for this level of feedback has passed…”, and then asking questions that probably belonged much earlier in the project. I’ll then see people taking deep breaths and wondering whether they should’ve either not invited me to that meeting or perhaps got me into meetings back when the opportunity for that level of feedback hadn’t passed.

Project-wise, I see that developers, whether database developers or user interface developers or whatever, tend to work based on their understanding of how things work, and customers won’t ask certain questions until they see how things actually look. Because before then, they don’t really get a sense that it’s being worked on. And then suddenly things look like they’re done, and they feel that the window for deeper feedback has passed them by. As much as I love the rich visuals available with Power BI, the speed with which data can be presented (and the same occurs even in SSRS) can often cause people to give the wrong level of feedback early on, and only later look at fundamental questions like whether the metrics that are being shown are being measured in the right way for the business.

All of this can lead to projects which are poorly estimated.

Customers will feel like things are going well if they’re seeing things that look pretty, and developers will come unstuck when they realise that they didn’t have key pieces of information early on because they hadn’t asked for feedback at the right times.

Sadly, there’s no easy solution, except to manage both sides better. Customer expectations need to be managed so that they have a clear picture of where things are at, and developers need to be managed so that stories are being presented in ways that invite the right level of feedback. Methodologies can help, including rapid prototyping (although this should be taken with a grain of salt, because prototypes can often form the tip of an iceberg and mislead – everyone should understand that prototypes are designed to be learned from and thrown away), but ultimately it’s about perspective and empathy. By understanding the difficulties faced by each side, a technical manager ought to be able to manage the expectations of both customers and developers, and advocate for each in getting the right level of feedback at each point along the way.

It doesn’t surprise me that our T-SQL Tuesday host Jeff sees continuing behaviour around poor project estimation. I see it too. I’m guilty of misjudging the quality of feedback that I’ll need.

And I frequently show Kathy’s blog post to the customers and developers that I interact with, in the hope that we can all improve and produce better estimates going forward.

@rob_farley
