We have a manual blacklist for the SC list that I use to nudge records without enough hits onto SC (sooner than they otherwise would be):
http://spamcheck.freeapp.net/blacklist-domains
I've pruned that list, removing domain names that have had no NS records for three successive checks over about five weeks. That reduced the manual blacklist by about 270 records. Before removing those, I checked how many DNS hits they were getting; only notinuse.biz had any of significance, so we kept that one on the manual blacklist.
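To give a rough idea of the kind of check involved (this is just a minimal sketch, not the actual pruning script; it assumes the dnspython library, and the strike-counting structure and names are made up):

    # Sketch only: a domain becomes a removal candidate once it has
    # returned no NS records on three successive checks.
    import dns.resolver
    import dns.exception

    STRIKE_LIMIT = 3  # successive empty-NS checks before removal

    def has_ns(domain):
        """Return True if the domain currently has any NS records."""
        try:
            dns.resolver.resolve(domain, "NS")
            return True
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer,
                dns.resolver.NoNameservers, dns.exception.Timeout):
            return False

    def update_strikes(domains, strikes):
        """Increment the strike count for domains with no NS records,
        reset it for domains that resolve, and report removal candidates."""
        candidates = []
        for domain in domains:
            if has_ns(domain):
                strikes[domain] = 0
            else:
                strikes[domain] = strikes.get(domain, 0) + 1
                if strikes[domain] >= STRIKE_LIMIT:
                    candidates.append(domain)
        return candidates

Run something like that once a week or so against the manual list, carry the strike counts between runs, and anything that hits the limit goes on the removal-candidate list for a final DNS-hit check before it's actually dropped.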
SC will probably be manually pruned again some time in the future, but the list (both automatic and manual) is so small and effective to begin with that it tends not to need much manual maintenance.
Note that cleaning up the manual blacklist is not the same as whitelisting. Any record that comes off the manual blacklist can be manually or automatically blacklisted again at any time in the future if it appears in spams again. So it's not at all a "free pass" for spammers or anything like that. It just means some domains that are unusable and don't appear in any recent spams are no longer included on SC for now. If they get used again, they'll get listed again.
I still plan to rewrite the SC-specific engine and the general data engine used for processing the SURBL data. The main features will be more uniform processing of the data, including public logs of additions, deletions, whitelists, etc. Aside from SC, the results should be identical, and I will run them in parallel for a while to make sure that's the case. The SC data will be handled more efficiently and will probably have a resolved-IP-based biasing system to get the domains of persistent spammers onto SC sooner, automatically.
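To sketch what I mean by resolved-IP-based biasing (nothing here is an actual implementation, since this part doesn't exist yet; the thresholds, names, and data structures are all hypothetical):

    # Illustrative only: lower the listing threshold for domains that
    # resolve to IPs shared with already-listed spammer domains.
    import dns.resolver
    import dns.exception

    NORMAL_THRESHOLD = 10   # hypothetical number of spam reports to list
    BIASED_THRESHOLD = 3    # lower bar for domains on known spammer IPs

    def resolved_ips(domain):
        """Return the set of A-record IPs for a domain (empty if none)."""
        try:
            return {rr.address for rr in dns.resolver.resolve(domain, "A")}
        except dns.exception.DNSException:
            return set()

    def listing_threshold(domain, known_spammer_ips):
        """Use a lower report threshold when the domain shares an IP
        with domains that are already listed."""
        if resolved_ips(domain) & known_spammer_ips:
            return BIASED_THRESHOLD
        return NORMAL_THRESHOLD

The idea is simply that a new domain hosted on the same IPs as persistent spammers' existing domains shouldn't need as many reports before it gets onto SC.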
Cheers,
Jeff C. -- "If it appears in hams, then don't list it."