I'm curious, though, what metrics they use to evaluate effectiveness against spam. The index could have just as much (or as little) spam as it had five years ago, and by some comparisons that would be valid; but what if much of the spam had moved from being evenly distributed throughout the results to being concentrated in the top positions? Then one could say total spam was lower than ever, yet the user experience would be even worse.
That said, I agree with Google: users' expectations have skyrocketed, and it is tough to keep pace with them.
I actually feel quite comfortable with our metrics. Back in 2003 or so, we had pretty primitive measures of webspam levels. But the case that you're wondering about (more spam, but in different positions) wouldn't slip past the current metrics.
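A toy illustration of the distinction being discussed (not Google's actual metric, just a sketch): a raw spam count is position-blind, while a rank-discounted score, here using DCG-style 1/log2(rank+2) weighting, penalizes spam near the top more heavily and so catches exactly the shift described above.

```python
import math

def spam_count(results):
    """Raw spam count, position-blind: the 'total quantity' view."""
    return sum(results)

def position_weighted_spam(results):
    """Rank-discounted spam score (DCG-style weighting).

    `results` is a list of 0/1 spam flags, index 0 = top-ranked result.
    Spam at the top contributes more, approximating user-experience harm.
    """
    return sum(flag / math.log2(rank + 2) for rank, flag in enumerate(results))

# Same total spam (3 of 10 results), very different user experience:
spread_out = [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]  # spam at ranks 3, 6, 9
top_heavy  = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0]  # spam at ranks 1, 2, 3

print(spam_count(spread_out), spam_count(top_heavy))
# 3 3  -> the raw count can't tell them apart

print(round(position_weighted_spam(spread_out), 2),
      round(position_weighted_spam(top_heavy), 2))
# 1.16 2.13 -> the weighted score flags the top-heavy case as worse
```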
How do you interpret the recent backlash from users? In your eyes, have we become more used to "perfect" results, or are the fewer bad results that remain more insidious, and thus more harmful, despite the overall level of quality being higher?
Personally, I tend to find what I'm looking for by adding a few more words, but in the case of reviews and tech stuff it doesn't always work, and I often have to rewrite my query one or more times to get something valuable.
in the case of reviews and tech stuff it doesn't always work
This is one of my pet peeves. If I search for "product X review", most of the results I get back are of the form "be the first to review product X", which is absolutely not what I want.