[SURBL-Discuss] Re: Revised DMOZ data, got Wikipedia domains too

Daniel Quinlan quinlan at pathname.com
Fri Oct 8 20:01:13 CEST 2004


Chris Santerre <csanterre at MerchantsOverseas.com> writes:

> I think this is just plain nuts to whitelist all of these! Why? If we
> don't try to whitelist the most popular sites, then what the heck is
> the point? We could whitelist millions of legit domains forever. The
> popular ones are the most important.

The points:

  - whitelisting legitimate domains limits the effectiveness of joe job
    attacks that cause false positives (FPs) in various SURBL blacklists
  - whitelisting could be used as negative points for MAIL FROM when
    combined with SPF, and the more domains we whitelist, the better
    (see the sketch after this list)
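
To make the second point concrete, here is a rough Python sketch of how
a whitelist could feed negative points once SPF verifies the envelope
sender (purely illustrative: the function name, the "pass" string, and
the -1.5 value are my own inventions, not SpamAssassin or SURBL rules):

    def whitelist_bonus(mail_from_domain, spf_result, whitelist):
        """Award a negative (ham) score when SPF verifies the MAIL FROM
        domain and that domain is whitelisted.  The SPF check matters:
        without it, forging a whitelisted sender would be trivial."""
        if spf_result == "pass" and mail_from_domain in whitelist:
            return -1.5  # illustrative value only
        return 0.0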

In addition:

  - I would only whitelist those domains that are (a) subject to
    editorial removal, (b) registered long enough ago, and (c) able to
    pass other criteria such as no SBL listing for NS->A.
  - I would maintain the automated whitelist separately from the
    human-edited whitelist and handle it differently.  For example,
    perhaps an automated whitelist entry can only remove a single
    blacklist hit (like SpamCop), while removing two independent
    blacklist hits requires a human decision (see the sketch after
    this list).
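
As a sketch of how those two rules could fit together (again
illustrative Python; the names, the 90-day threshold, and the data
structures are my assumptions, not an existing implementation):

    MIN_REGISTRATION_AGE_DAYS = 90  # assumed threshold, pick your own

    def eligible_for_auto_whitelist(subject_to_editorial_removal,
                                    registration_age_days,
                                    ns_a_sbl_listed):
        # (a) editorial removal, (b) registration age, and (c) no SBL
        # listing for the domain's NS->A records
        return (subject_to_editorial_removal
                and registration_age_days >= MIN_REGISTRATION_AGE_DAYS
                and not ns_a_sbl_listed)

    def effective_hits(domain, hits, auto_whitelist, human_whitelist):
        """hits: the set of independent blacklists listing the domain."""
        if domain in human_whitelist:
            return set()   # human-edited entries override everything
        if domain in auto_whitelist and len(hits) <= 1:
            return set()   # an automated entry may cancel one hit
        return hits        # two independent hits need a human decision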
 
> so: 
> -1 for adding all those intersected to WL
> +1 for whitelisting the blacklist hits.

I think there are other options available due to the miracle of
programming.  ;-)

Daniel

-- 
Daniel Quinlan                     ApacheCon! 13-17 November (3 SpamAssassin
http://www.pathname.com/~quinlan/  http://www.apachecon.com/  sessions & more)

