Sometimes I wonder if we obsess too much about things like the supplemental index. Predictably, the Google post announcing that the supplemental label is gone has resulted in a firestorm of commentary. Here are some posts on the topic from Danny, Barry, and Lisa Barone.
Actually, I do understand why people liked having the information. It was a way of measuring where a site stood in Google's eyes. If the work you were doing as an SEO made the number go down, you knew you were making progress. Having a form of measurement is very useful.
But you really don't need a supplemental label to identify a poor quality page. It's a page that's thin on original content, and/or one without much in the way of link juice (note that you should scale this latter factor in relation to the overall authority of the site on which the page resides). When you are an SEO and you encounter these types of pages, you then need to determine how large a percentage of the site is made up of these poor quality pages. If the percentage is large, 10% or more (perhaps even 5%), then Houston, we have a problem.
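To make the rule of thumb above concrete, here is a minimal sketch of how you might flag thin pages from crawl data and check what share of a site they represent. The word-count and inbound-link cutoffs, and the sample page data, are illustrative assumptions on my part, not thresholds Google has published.

```python
# Rough heuristic for flagging "thin" pages and measuring what share of
# a site they make up. All thresholds below are assumed for illustration.

MIN_WORDS = 150          # assumed cutoff for "thin on original content"
MIN_INBOUND_LINKS = 1    # assumed cutoff for "not much link juice"
PROBLEM_SHARE = 0.05     # 5% of the site, per the rule of thumb above

def is_thin(page):
    """A page looks thin if it has little content and few links pointing at it."""
    return (page["word_count"] < MIN_WORDS
            and page["inbound_links"] < MIN_INBOUND_LINKS)

def thin_share(pages):
    """Fraction of the site's pages that look thin."""
    if not pages:
        return 0.0
    return sum(is_thin(p) for p in pages) / len(pages)

# Hypothetical crawl data for a small site
site = [
    {"url": "/guide",            "word_count": 1200, "inbound_links": 14},
    {"url": "/tag/red-widgets",  "word_count": 40,   "inbound_links": 0},
    {"url": "/about",            "word_count": 500,  "inbound_links": 3},
    {"url": "/tag/blue-widgets", "word_count": 35,   "inbound_links": 0},
]

share = thin_share(site)
if share >= PROBLEM_SHARE:
    print(f"{share:.0%} of pages look thin -- Houston, we have a problem")
```

In practice you would feed this from a real crawl and tune the cutoffs to the site's overall authority, as noted above; the point is simply that the percentage is easy to compute once you can score individual pages.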
You are going to need to work on the problem, regardless of how it's labelled. A large number of poor quality pages is always a problem. However, if you want some ideas on how to see if a page is in supplemental, check out Jim Boykin's post on the topic.
In the meantime, my suggestion to Google is that the Webmaster Tools team implement some way for webmasters to see, within Webmaster Tools, the pages that are being assigned a low quality score. That way SEOs get the kind of tool they crave, and Google no longer has to carry around the supplemental index bogey. Better still, this would help SEOs focus on reducing the number of poor quality pages on the web, something I suspect Google wants. Sounds like a win-win, doesn't it?