Folks,
If I were a spammer monitoring this list's traffic (there have got to be some), I would buy up a bunch of domains that were registered a few years ago but expired, throw up a bunch of bogus "legitimate looking" content, send out a bunch of spam using those "legitimate" domain names, and then complain to Jeff et al. that SURBL is generating false positives. According to current policies, my sites would be whitelisted, "yay!".
It's my opinion that you have to draw the line somewhere because of this, and hosting entities that don't have adequate AUPs, or that don't enforce their AUPs promptly, need to be listed somehow.
Jeff, you really should consider creating a separate "semi-legitimate" list for entities such as greatnow.com, if only to appease those of us who don't necessarily keep often-updated private blacklists and whitelists for SURBL queries/hits.
Thanks, Matthew Wilson.
-----Original Message----- From: discuss-bounces@lists.surbl.org [mailto:discuss-bounces@lists.surbl.org] On Behalf Of Jeff Chan Sent: Friday, October 22, 2004 1:29 PM To: SURBL Discuss Subject: Re: [SURBL-Discuss] free host: greatnow.com
On Friday, October 22, 2004, 11:27:25 AM, Jeff Chan wrote:
There's a difference between removing the entire list and checking them carefully before using them.
We can use the data if we check it first.
Can you post or link the list so we can all see the data and comment on it?
Jeff C. -- "If it appears in hams, then don't list it."
Matthew Wilson writes:
Folks,
If I were a spammer monitoring this list's traffic (there have got to be some), I would buy up a bunch of domains that were registered a few years ago but expired, throw up a bunch of bogus "legitimate looking" content,
yep, the google-spammers are doing that already.
send out a bunch of spam using those "legitimate" domain names, and then complain to Jeff et al. that SURBL is generating false positives. According to current policies, my sites would be whitelisted, "yay!".
this is a possible problem, alright. But as far as I can see Jeff has been saying to *check* the possible false positive domains, not to just blindly whitelist them.
--j.
If I were a spammer monitoring this list's traffic (there have got to be some), I would buy up a bunch of domains that were registered a few years ago but expired, throw up a bunch of bogus "legitimate looking" content,
yep, the google-spammers are doing that already.
send out a bunch of spam using those "legitimate" domain names, and then complain to Jeff et al. that SURBL is generating false positives. According to current policies, my sites would be whitelisted, "yay!".
this is a possible problem, alright. But as far as I can see Jeff has been saying to *check* the possible false positive domains, not to just blindly whitelist them.
And this exposes a serious problem with the current SURBL design. If I can manage to host one legitimate page in one subdomain and have 100 other subdomains that are purely for spam, you can't list me as a spammer.
Maybe we need to rethink this policy of only listing root domains. Maybe we could use a special return address to mean "check next level" and have the plugin append the next subdomain label and check that. It would increase traffic, but only for those domains where subdomain checking is needed.
Certainly it would be better than ignoring a major spammer host just because a newsletter referred to some code on one page in one subdomain.
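Here's a minimal sketch (in Python, just to make the idea concrete) of how a plugin-side lookup like that could work. The 127.0.0.254 sentinel, the zone handling, and the naive two-label split are all assumptions for illustration, not anything SURBL actually defines:

    import socket

    SURBL_ZONE = "multi.surbl.org"
    CHECK_NEXT_LEVEL = "127.0.0.254"   # hypothetical sentinel: "query one more label"

    def surbl_query(name):
        # Return the A record for <name>.<zone>, or None if it isn't listed.
        try:
            return socket.gethostbyname(name + "." + SURBL_ZONE)
        except socket.gaierror:
            return None

    def check_hostname(hostname, max_labels=4):
        # Start at the two-label "root" domain (naive: ignores two-part ccTLDs).
        # If the zone answers with the sentinel, retry with one more label from
        # the original hostname.
        labels = hostname.lower().rstrip(".").split(".")
        depth = 2
        while depth <= min(max_labels, len(labels)):
            answer = surbl_query(".".join(labels[-depth:]))
            if answer is None:
                return False                 # not listed at this level
            if answer != CHECK_NEXT_LEVEL:
                return True                  # listed: treat as a hit
            depth += 1                       # sentinel seen: check the next level
        return False

Only domains that return the sentinel trigger the extra lookups, so the added DNS traffic stays confined to the shared-hosting cases where subdomain resolution is actually wanted.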
Bret
On Friday, October 22, 2004, 1:05:39 PM, Bret Miller wrote:
And this exposes a serious problem with the current SURBL design. If I can manage to host one legitimate page in one subdomain and have 100 other subdomains that are purely for spam, you can't list me as a spammer.
If that were the case, we'd probably list the domain.
However, it doesn't tend to happen. Professional spammers tend to register their own domain names (and a lot of them). They don't tend to use shared or subdomain hosting sites to host their spam web sites. When was the last time you saw a pill/warez/mortgage spam on geocities or even terra.es?
Jeff C. -- "If it appears in hams, then don't list it."
On Friday, October 22, 2004, 12:25:12 PM, Matthew Wilson wrote:
If I were a spammer monitoring this list's traffic (there have got to be some), I would buy up a bunch of domains that were registered a few years ago but expired, throw up a bunch of bogus "legitimate looking" content, send out a bunch of spam using those "legitimate" domain names, and then complain to Jeff et al. that SURBL is generating false positives. According to current policies, my sites would be whitelisted, "yay!".
We don't just look at the sites. We also look at the spams, other sites in the same domain, hams, inclusion in SBL, NANAS hits, DMOZ hits, etc.
Using expired domains won't help them, because when a domain is re-registered its creation date gets reset. We look at registrar data, not historical records of (old) web sites.
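To make the creation-date point concrete, here's a rough sketch (not SURBL's actual tooling) that shells out to the whois command and flags domains whose registry creation date is recent. The 90-day threshold and the "Creation Date:" field pattern are assumptions, and whois output formats vary by registry:

    import re
    import subprocess
    from datetime import datetime, timedelta

    def creation_date(domain):
        # Parse a "Creation Date:" line out of the whois CLI output, if present.
        out = subprocess.run(["whois", domain], capture_output=True, text=True).stdout
        m = re.search(r"Creation Date:\s*(\d{4}-\d{2}-\d{2})", out)
        return datetime.strptime(m.group(1), "%Y-%m-%d") if m else None

    def looks_recently_registered(domain, days=90):
        # A re-registered "aged" domain gets a fresh creation date at the registry,
        # so it still shows up as new here regardless of the old site's history.
        created = creation_date(domain)
        return created is not None and datetime.utcnow() - created < timedelta(days=days)

That's only one of the signals mentioned above; the SBL, NANAS, and DMOZ checks would layer on top of it.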
It's my opinion that you have to draw the line somewhere because of this, and hosting entities that don't have adequate AUPs, or that don't enforce their AUPs promptly, need to be listed somehow.
I don't see legitimate ISPs who have reasonable AUPs as being a major problem. Sure, they may get some abuse, but it's relatively minor and they shut it down eventually. These kinds of sites don't tend to be used by the people using zombies to send out millions of spams per day. They need something more reliable for their hosting, and that means custom domains at China Telecom, Hanaro, Brazil, etc.
Jeff C. -- "If it appears in hams, then don't list it."