> BigEvil.cf and MidEvil.cf are now available in SURBL form as
> be.surbl.org, for use with the SpamCopURI (SpamAssassin 2.63) and
> URIDNSBL (SpamAssassin 3.0) plugins. Thanks Chris, Paul and
> Gary Funck!
>
> Here's an excerpt about the new list from the Quick Start
> section:
>
>
> http://www.surbl.org/
>
> Chris Santerre and Paul Barbeau's BigEvil and MidEvil
> SpamAssassin rules are now available as an SURBL for use with
> plugins and programs such as those mentioned above which can
> extract message body URI domains and compare them against
> name-based RBLs. The name of the list is be.surbl.org, and some
> sample rules and scores to use it appear below. The well-known
> and popular BigEvil and MidEvil SA rulesets are used to block
> messages based on domains that have occurred in spam message body
> URIs. Using this as an SURBL instead allows you to remove this
> relatively large ruleset from SA memory and lets DNS cache the
> data in a zone file instead, querying SURBL hits from DNS as
> needed.
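[Editor's note: mechanically, a client builds a DNS name by appending the list zone to each body-URI domain and checks for an A record. A minimal sketch of that lookup in Python follows; the zone name is real, but the helper names and example domain are mine.]

```python
import socket

def surbl_query_name(domain, zone="be.surbl.org"):
    """Build the DNS name to look up: URI domain prepended to the zone."""
    return "%s.%s" % (domain, zone)

def check_surbl(domain, zone="be.surbl.org"):
    """Return the A record string (typically '127.0.0.2') if the domain
    is listed, or None when the lookup gets NXDOMAIN (not listed)."""
    try:
        return socket.gethostbyname(surbl_query_name(domain, zone))
    except socket.gaierror:
        return None
```

DNS caching then does the heavy lifting: repeat queries for the same spammed domain are answered locally instead of rescanning a large in-memory ruleset.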
>
> An SA 2.63 rule and score using SpamCopURI (but not the SpamCop
> data!) looks like this:
>
> uri BE_URI_RBL eval:check_spamcop_uri_rbl('be.surbl.org','127.0.0.2')
> describe BE_URI_RBL URI's domain appears in BigEvil
> tflags BE_URI_RBL net
>
> score BE_URI_RBL 3.0
>
> An SA 3.0 rule and score using URIDNSBL's urirhsbl looks like this:
>
> urirhsbl URIBL_BE_SURBL be.surbl.org. A
> header URIBL_BE_SURBL eval:check_uridnsbl('URIBL_BE_SURBL')
> describe URIBL_BE_SURBL Contains a URL listed in BigEvil
> tflags URIBL_BE_SURBL net
>
> score URIBL_BE_SURBL 3.0
>
> be.surbl.org can be used alone or with other SURBL lists; all
> that's needed are different rule and score names, as we've shown
> in the samples. More information about be.surbl.org can be found
> in the Additional SURBLs section.
>
> http://www.surbl.org/additional.html
>
>
> be.surbl.org joins Bill Stearns' sa-blacklist-based ws.surbl.org
> and my own SpamCop URI-based sc.surbl.org SURBLs. All are
> described more at the site.
>
> Please send me any questions, comments, corrections, updates,
> etc.
>
> Cheers,
>
> Jeff C.
>
> P.S. We will probably offer a combined list at some point.
> We're still working out the details of that. Until then it's
> quite possible to use one or more of the lists simply by using
> separate SA rules for each one that you want to use, as shown
> in the Quick Start samples.
>
> P.P.S. The sample rules have been updated to mention "SpamCop"
> only in the descriptions of rules that actually use SpamCop data.
> --
> Jeff Chan
Hi,
You seem to put a lot of emphasis on the memory taken up by these two lists. When I removed them, spamd's memory utilisation went down only 1.9MB (from 33.5MB to 31.6MB), so unless you are really strapped for memory, I don't see this as a great advantage.
What's quicker execution-wise: a regex of the list in memory, or a DNS lookup/eval? I would imagine the latter, but does anybody know?
The obvious advantage is that one doesn't have to update the cf files manually.
What's the TTL for entries in this database?
Cheers
Scott
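[Editor's note: for the regex half of Scott's question, one can get a rough number by timing a BigEvil-sized alternation against a clean message body; the DNS half depends on whether the answer is already in the local cache, so only a live measurement settles it. A sketch, with made-up domain names and arbitrary sizes:]

```python
import re
import timeit

# Hypothetical BigEvil-style ruleset: a large alternation of literal
# domains, matched against every message body.
domains = ["evil%04d.com" % i for i in range(2000)]
pattern = re.compile("|".join(re.escape(d) for d in domains))

# A clean body is the worst case: the whole alternation fails at
# every position before the scan can give up.
body = "please visit http://example.org/ for details " * 50

per_scan = timeit.timeit(lambda: pattern.search(body), number=100) / 100
print("one regex scan: %.3f ms" % (per_scan * 1000.0))
```

A cached DNS answer is typically well under a millisecond, while an uncached one costs a network round trip, so neither approach wins unconditionally; the clear win of the SURBL form is memory and maintenance, not raw speed.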
[Update, Chris wrote off list that he's put up a quick be.htm
page to be prettified later.]
On Wednesday, April 21, 2004, 7:36:08 AM, Chris Santerre wrote:
>> Sounds good. Can you let me know what kind of TTL I should set?
> Well I am now trying to update at least every other day. This way I won't
> fall behind. But I'm now updating every day. I always test overnight, because
> too many people rely on the list now. I usually post before noon EST.
OK sounds like an 8 or 12 hour TTL is appropriate then; setting
to 8 now.
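[Editor's note: in the zone file that decision is a one-line setting. A hypothetical excerpt follows; the record name is a placeholder, and only the $TTL value reflects this thread.]

```
$TTL 28800   ; 8 hours: caches expire well within the daily update cycle
evil-example.com   IN A    127.0.0.2
evil-example.com   IN TXT  "Blocked in BigEvil. See: http://www.rulesemporium.com/"
```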
Any idea how often Paul updates MidEvil?
>> Basically I'd like to set the lifetime of the zone info to
>> something relevant towards how often you and Paul usually
>> update the lists. Nothing too specific is needed, just a
>> general idea. Like is it daily, twice a day, every other
>> day on average, etc.
>>
>> Also does this TXT record work for you guys:
>>
>> "Blocked in BigEvil. See: http://www.rulesemporium.com/"
>>
>> It was just a generic placeholder. I'd like
>> comments/improvements on it.
> How about www.rulesemporium.com/be.htm ? I can make a page just for that
> error? Otherwise it is fine.
Done. Please set up a page when you get a chance... :-)
>> > 1) BigEvil wildcards. Not sure how you would handle these. Something like
>> > evil\d{2,4}spam\.com is a general wildcard. Some of those domains don't
>> > even exist. Not sure how SURBL will handle that.
>>
>> Yes, I should have mentioned that I'm simply discarding them.
>> Unfortunately there's no easy way to deal with them. Domains
>> without any patterns in them, which are a majority, come right
>> through. The script is at:
> Can we make sure that when you announce this to the public that they know
> this! :)
> I can see the flurry of emails now.
Definitely will mention the differences in the announcement and
web site!
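[Editor's note: the discarding step Jeff describes can be sketched as a small filter: unescape literal dots, keep entries that are plain domain names, and drop anything still carrying regex metacharacters. This only illustrates the idea, not Jeff's actual script, and the sample entries are invented.]

```python
import re

# Made-up entries in BigEvil style: literal domains plus one wildcard.
entries = [
    r"evilspammer\.com",
    r"evil\d{2,4}spam\.com",   # pattern; matches domains that may not exist
    r"cheap-pills\.biz",
]

# Anything with these regex constructs can't become a DNS record.
METACHARS = re.compile(r"[\[\]{}()*+?|^$]|\\[dws]")

def to_zone_names(entries):
    kept = []
    for entry in entries:
        literal = entry.replace(r"\.", ".")   # unescape literal dots
        if METACHARS.search(literal):
            continue                           # wildcard entry: discard
        kept.append(literal)
    return kept

print(to_zone_names(entries))   # ['evilspammer.com', 'cheap-pills.biz']
```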
>> > 3) What is the quickest way to check a domain against the other SURBL
>> > lists? Basically I see no reason to duplicate the listings. *gulp* and on
>> > a Windowze machine? (Don't ask!)
>>
>> I wouldn't worry too much about that for now. For now we just
>> want to get an accurate record of everything. We're working on
>> ways to merge things next.
>>
> Well ok, but I still want to look others up if I have a domain in question
> :) Will there be a quick web page to look up a domain? Or do I do an
> NSLOOKUP using the SURBL?
You can find the domains currently going into the SURBL lists at:
sc: http://spamcheck.freeapp.net/top-sites-domains
ws: http://spamcheck.freeapp.net/sa-blacklist.current.domains.afterwhitelist
be: http://spamcheck.freeapp.net/bigevil.domains.afterwhitelist
But frankly I like the fact that there is some overlap in the
lists. In a sense that represents multiple reporting; i.e.
a domain in more than one list is more likely a bad guy.
I don't think we should lose that signal.
YMMV, but I'd say keep any overlap in BE. It's a feature not
a bug.
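[Editor's note: on the ad-hoc lookup question, no web page is needed, since any DNS client works, nslookup included on Windows. A sketch with standard tools; the domain being checked is a placeholder.]

```shell
domain=spammer-example.com    # placeholder domain to check
for zone in sc.surbl.org ws.surbl.org be.surbl.org; do
    echo "querying $domain.$zone"
    # host "$domain.$zone"    # uncomment to query: an answer of
                              # 127.0.0.2 means listed, NXDOMAIN means not
done
```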
>> > 4) Has there been any talk with the sendmail people? It would be
>> > interesting to actually block at the MTA level based on an evil URL. I
>> > realise the inherent dangers in this ;)
>>
>> Yes, there is talk about sendmail milters using SURBLs. I
>> haven't heard of anyone doing one yet, but they're feasible.
>> The limiting factor is the FP rate. FPs must be as close to
>> zero as possible before people will dare to reject spams at the
>> MTA level using SURBLs, other than perhaps for personal servers,
>> etc.
> Dangerous, but so very fun!
Hehe! ;-) Messing with spammers is always fun!
Jeff C.
Just browsing through my spam folder and noticed a spam with the following URL:
http://yahoo.com.collectiza.com-munged/vp9
(without the -munged of course)
Looks like they might think that putting yahoo.com on the front will fool a
simple parser? :) Have we been "noticed" already or am I just being
paranoid ;)
That particular spam didn't match on that test, but did match on another
different URL in the same message...
Regards,
Simon
There's the problem - Sticking -munged doesn't help, since it just yanks
the domain right out of the middle.
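[Editor's note: that "yank it out of the middle" step is easy to sketch: split on dots and keep the last two labels. The heuristic below is mine and deliberately ignores ccTLD complications like .co.uk, which real parsers must special-case.]

```python
def registered_domain(hostname):
    """Return the last two labels of a hostname, however many decoy
    labels a spammer stacks in front (naive: ignores ccTLDs)."""
    labels = hostname.lower().rstrip(".").split(".")
    return ".".join(labels[-2:])

# The decoy prefix changes nothing: the lookup key is the same.
print(registered_domain("yahoo.com.collectiza.com"))   # collectiza.com
print(registered_domain("collectiza.com"))             # collectiza.com
```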
-------- Original Message --------
Subject: Re: [SURBL-Discuss] First attempt to subvert surbl approach ? :)
Date: Wed, 21 Apr 2004 17:58:44 -0700
From: Jeff Chan <jeffc(a)surbl.org>
To: discuss(a)lists.surbl.org
Spam detection software, running on the system "umlcoop", has
identified this incoming email as possible spam. The original message
has been attached to this so you can view it (if it isn't spam) or block
similar future email. If you have any questions, see
the administrator of that system for details.
Content preview: On Wednesday, April 21, 2004, 5:30:37 PM, Simon Byrnand
wrote: > Just browsing through my spam folder and noticed a spam with
the following URL: >
http://yahoo-MUNGED.com-EVEN-MORE.collectiza.com-munged/vp9 [...]
Content analysis details: (8.8 points, 6.0 required)
 pts rule name              description
---- ---------------------- --------------------------------------------------
0.8 RATWR9_MESSID Message-ID has ratware pattern (9999.99999999@)
3.0 WS_URI_RBL URI's domain appears in spamcop database at
ws.surbl.org
[yahoo-MUNGED.com-EVEN-MORE.collectiza.com is]
[blacklisted in SpamCop RBL at ws.surbl.org]
5.0 SC_URI_RBL URI's domain appears in spamcop database at
sc.surbl.org
[yahoo-MUNGED.com-EVEN-MORE.collectiza.com is]
[blacklisted in SpamCop RBL at sc.surbl.org]
Initially, when I released spamcopuri I decided to pretty much ignore
whether the TLD was a country code or not. This resulted in about
twice as many queries as necessary, but guaranteed you would get
hits if the domain was listed.
Now that people are pointing this at other RBLs besides just SURBL,
should we continue to do second- and third-level queries? Or just
the query that we assume to be necessary? My concern is that not
all RBLs will process the domains according to a list such as
http://www.bestregistrar.com/help/ccTLD.htm. I suppose the worst
case scenario is we end up getting a miss when we should be getting
a hit because one side presumes that say TLD .za has a subdomain 'foo',
when the server doesn't. The server side would expect a second level, while
the client would do a third level query (this is why I wanted the wildcard
records). I guess this really isn't that great a consequence considering
the savings and the fact that this shouldn't occur very often.
I will go ahead and make this change if everyone is comfortable with the
known risk.
thanks,
--eric
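[Editor's note: Eric's trade-off can be made concrete with a toy version of the client-side decision: query three labels when the trailing pair is a two-level ccTLD registry, two labels otherwise. The table below is illustrative only; the real ccTLD list is far longer.]

```python
# Illustrative subset of ccTLDs that register at the second level.
TWO_LEVEL_CCTLDS = {"co.uk", "com.au", "co.za"}

def query_domain(hostname):
    """Pick the labels to send to the RBL: three for two-level ccTLD
    registries, two for everything else."""
    labels = hostname.lower().split(".")
    if ".".join(labels[-2:]) in TWO_LEVEL_CCTLDS:
        return ".".join(labels[-3:])
    return ".".join(labels[-2:])

print(query_domain("spam.badsite.co.uk"))   # badsite.co.uk
print(query_domain("www.badsite.com"))      # badsite.com
```

The risk Eric names is exactly a disagreement between this table and the server's: the client sends three labels where the server listed two (or vice versa), and a hit quietly becomes a miss.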
Got a question.
What is the best way to use all the surbl.org zones with SA?
uri SPAMCOP_URI_RBL eval:check_spamcop_uri_rbl('sc.surbl.org','127.0.0.2')
describe SPAMCOP_URI_RBL URI's domain appears in spamcop database at sc.surbl.org
tflags SPAMCOP_URI_RBL net
score SPAMCOP_URI_RBL 3.0
uri SPAMCOP_URI_RBL eval:check_spamcop_uri_rbl('be.surbl.org','127.0.0.2')
describe SPAMCOP_URI_RBL URI's domain appears in spamcop database at be.surbl.org
tflags SPAMCOP_URI_RBL net
score SPAMCOP_URI_RBL 3.0
uri SPAMCOP_URI_RBL eval:check_spamcop_uri_rbl('ws.surbl.org','127.0.0.2')
describe SPAMCOP_URI_RBL URI's domain appears in spamcop database at ws.surbl.org
tflags SPAMCOP_URI_RBL net
score SPAMCOP_URI_RBL 3.0
Or do I need to change something? I just have it now doing the sc zones,
but would like to have it parse through them all.
Thanks,
--
-Doc
---
MomNDoc Online Consultants
http://www.maddoc.net/
momndoc(a)maddoc.net
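[Editor's note: a hedged sketch of what Doc is likely after. SpamAssassin keys rules by name, so reusing SPAMCOP_URI_RBL for all three zones means the later definitions replace the earlier ones and only one zone is checked. Distinct names per zone, as in the Quick Start samples, query all three; the names and scores below are only examples.]

```
uri      SC_URI_RBL eval:check_spamcop_uri_rbl('sc.surbl.org','127.0.0.2')
describe SC_URI_RBL URI's domain appears in sc.surbl.org
tflags   SC_URI_RBL net
score    SC_URI_RBL 3.0

uri      BE_URI_RBL eval:check_spamcop_uri_rbl('be.surbl.org','127.0.0.2')
describe BE_URI_RBL URI's domain appears in be.surbl.org
tflags   BE_URI_RBL net
score    BE_URI_RBL 3.0

uri      WS_URI_RBL eval:check_spamcop_uri_rbl('ws.surbl.org','127.0.0.2')
describe WS_URI_RBL URI's domain appears in ws.surbl.org
tflags   WS_URI_RBL net
score    WS_URI_RBL 3.0
```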
> -----Original Message-----
> From: Jeff Chan [mailto:jeffc@surbl.org]
> Sent: Wednesday, April 21, 2004 9:54 AM
> To: Chris Santerre
> Cc: SURBL Discussion list
> Subject: Re: [SURBL-Discuss] BigEvil + MidEvil as SURBL
>
>
> > 2) Where would I send updates? As single domains, or a txt
> list? How would I
> > remove an FP?
>
> In case it's not clear, FPs will come out of be.surbl.org
> automatically when they come out of bigevil.cf and midevil.cf.
>
> If you need to manually whitelist a domain, just send a message
> to us at whitelist at surbl dot org and we'll do that ASAP.
>
> Jeff C.
Now that I see how you are doing this, let me just reiterate....FREAKIN
KEWL!!!
Well then, I see what I have to do with Paul. And This is so so very cool!
--Chris
Just released 0.12 to fix a test some users may have had errors with
during make test. No real need to grab this unless you want a clean make
test.
--eric
Trying to install 0.11 over an existing (and working) 0.10 installation
on a redhat 9 box.
make test gives the following errors (all other tests are ok):
| t/open_redirect....NOK 5# Failed test (t/open_redirect.t at line 43)
| t/open_redirect....ok 7/7# Looks like you failed 1 tests of 7.
| t/open_redirect....dubious
| Test returned status 1 (wstat 256, 0x100)
| DIED. FAILED test 5
| Failed 1/7 tests, 85.71% okay
Any ideas?
John.
--
-- Over 2400 webcams from ski resorts around the world - www.snoweye.com
-- Translate your technical documents and web pages - www.tradoc.fr