The second main type of bad data that you may encounter is fluff. These are hypotheticals. These are generic statements. Whenever you're in the middle of a user interview and you start getting into hypotheticals, you know, "Oh, here's what the product may look like in the future," try to steer it back to specifics. Again, you're conducting a user interview not to pitch your product but to learn about problems or issues that the user has faced in their past so that you can improve your product in the future. That's it. That was meant to be a quick dive into "Talking To Users." I don't know if we have any time for questions. Cool. Awesome. Well, I'd love to answer any questions, but other than that, thank you very much.
The third thing to do during user interviews at this stage is to remember to discard bad data. Some of the worst bad data you may encounter is compliments. People may say, "Oh, I love the new design." Or, "Man, this thing is really useful." You may love hearing that during the course of your user interviews, but compliments are actually not useful information because they're not specific. They're general statements about your product, and they're not tactical. They don't give you concrete information on what you can change or what you can improve about your product.
The second one is: don't design by committee. You can't simply ask your users what features they want. You have to begin to understand whether those features are truly going to help make your product more sticky and more useful. You can do this through the advice that the Superhuman CEO lays out in his blog post, or you can ask other tactical questions. For example, instead of asking, "Would you be interested in using this new product or this new feature?" say, "Here's an upgrade flow. If you want this new product, put in your credit card information." Or, "If you want this new feature, put in your credit card information or pay more." Even before you've actually built out the feature, this can give you information about whether the feature you're working on is actually something that users are going to use.
He read some analysis that said that if 40% or more of your user base reports, on a weekly basis, that they would be very disappointed if your product went away, that's the signal. That's the differentiation point: if you get past it, your product will just grow exponentially. He evaluated a number of other successful companies and realized that the answer to this question was always around or above 40%. So, again, I probably won't be able to go into it in much more detail, but I would recommend reading this blog post. If you're at the stage where you're iterating and you actively have users you can ask this question of, it can be an immensely useful way of quantitatively determining whether the features you worked on in the previous week were actually benefiting or adding to your product-market fit, or potentially detracting from it.
He wrote a great blog post on this. You can just Google it. I'm only going to touch on it here, but I would highly recommend reading the entire thing because it is fantastic. In it, he describes a process where, on a weekly basis, he asks pretty much all of his customers a critical question (it doesn't even have to be your entire customer base; it could just be, you know, 30 or 40 users): "How would you feel if you could no longer use Superhuman?" Three answers: very disappointed, somewhat disappointed, not disappointed. He then measured the percentage of users who answered "very disappointed." These are the users who most value your product. These are the users for whom your product has become a key part of their life. It's kind of weaseled its way into their daily habits.
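To make that measurement concrete, here's a minimal sketch of how you might score such a weekly survey. This is illustrative only: the function name, the response strings, and the sample data are all hypothetical, not from the blog post itself.

```python
# Hypothetical sketch of scoring a Superhuman-style product-market fit survey.
# Each response answers: "How would you feel if you could no longer use the product?"

def pmf_score(responses):
    """Return the percentage of respondents answering 'very disappointed'."""
    if not responses:
        return 0.0
    very = sum(1 for r in responses if r == "very disappointed")
    return 100.0 * very / len(responses)

# Made-up example: 30 users surveyed this week.
responses = (
    ["very disappointed"] * 13
    + ["somewhat disappointed"] * 12
    + ["not disappointed"] * 5
)
score = pmf_score(responses)
print(f"{score:.1f}% very disappointed")  # 43.3% -- above the ~40% benchmark
```

Tracking this one number week over week is what lets you tell whether last week's feature work moved you toward or away from product-market fit.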