>-----Original Message-----
>From: Jeff Chan [mailto:jeffc@surbl.org]
>Sent: Saturday, October 09, 2004 9:06 PM
>To: SURBL Discussion list
>Subject: Re: [SURBL-Discuss] Revised DMOZ data, got Wikipedia domains too
>
>
>On Saturday, October 9, 2004, 1:09:43 PM, Patrik Nilsson wrote:
>
>> Just create a separate "TLDs or treat as TLDs" zone that can be
>> checked and cached client side.
>
>> Or even better - give "TLDs or treat as TLDs" a distinguished A
>> value in existing lists.
>> If a lookup returns XXX.XXX.XXX.XXX, it is a "TLD or treat as TLD"
>> and should be further recursed.
>> If we think there is a risk that some bad client implementations
>> treat any returned A record as a hit, use TXT records.
>
>This is still an interesting idea, but I'd be somewhat
>concerned about putting out a list that looks like a regular
>SURBL, since it could get misused.
>
>But perhaps the larger issue is that the hard core spammers
>don't seem to use *subdomains of legitimate shared-domain
>hosting providers*. They just register their own full domain
>names and use those (lots of them).
>
>If some legitimate hosting provider has an abuse issue,
>then it's in their own interest to stop the abuse.
>
>SURBLs are arguably best suited for cases where the ISP
>is spam-friendly and allows spam hosting on custom domains.
>The reality is that's a much larger and tougher problem
>than shared, common-domain hosting, like a GeoCities or
>Tripod.
>
>The best use of our time is to focus on the biggest spammers
>first, and we're not catching all of those yet.
>
Big surprise...I disagree :-)
I think we are already reviewing these domains anyway, so why not just add
them and catch the smaller spammers too, instead of throwing the data away?
But I see your vision, and I can bend like the reed ;)
But given enough time, and perhaps enough chocolate, I think I can turn you
around :-)
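
Anyway, just to make the sentinel idea concrete, here's a rough sketch of
how a client-side lookup could recurse on a "treat as TLD" marker. The zone
name and sentinel address below are made up for illustration, not anything
SURBL actually publishes:

import socket

# Hypothetical zone and sentinel value -- purely illustrative.
ZONE = "multi.surbl.example"
SENTINEL = "127.0.0.255"   # made-up "TLD or treat as TLD" marker

def lookup(name):
    """Query name.ZONE; return the A record, or None on NXDOMAIN."""
    try:
        return socket.gethostbyname("%s.%s" % (name, ZONE))
    except socket.gaierror:
        return None

def check_host(host, max_labels=4):
    """Walk the host from the registered-domain end, going one label
    deeper each time the list answers with the sentinel value."""
    labels = host.lower().rstrip(".").split(".")
    for n in range(2, min(len(labels), max_labels) + 1):
        candidate = ".".join(labels[-n:])
        answer = lookup(candidate)
        if answer is None:
            return None          # not listed at this depth: clean
        if answer != SENTINEL:
            return candidate     # a real hit at this depth
        # Sentinel: candidate is a shared parent (co.uk, a GeoCities,
        # etc.), so loop around and try one more label.
    return None

# e.g. check_host("spammer.geocities.com") queries geocities.com first,
# sees the sentinel, then queries spammer.geocities.com.

The TXT-record variant Patrik mentions would work the same way, just
matching on record contents instead of a magic A value.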
--Chris