On Wed, 29 Sep 2004, Jeff Chan wrote:
On Wednesday, September 29, 2004, 4:48:36 PM, Raymond Dijkxhoorn wrote:
Hi!
I have a similar idea. Would it be possible to have a running list of the top 20 (or so... 50? 100?) most often queried URIs that are blocked by SURBL (and which should be blocked)? This way, we could take additional pressure off SURBL DNS servers by blacklisting these domains locally BEFORE doing SURBL checking on such messages?
I have a feeling that this has already been requested and implemented??
It's something that has been suggested, and we are looking into ways of getting it into, for example, SA 3.1; the SA guys also had some suggestions.
So yes, excellent idea... ;)
Yes, SA is adding a feature to 3.1 or 3.0.1 to hardcode, or keep a database of, the 125 most often hit whitelist domains. This will prevent domains like w3.org, yahoo.com, etc. from even being queried.
One issue with the top spammer domains is that, unlike the whitehats, the big spammer domains tend to change over time. The biggest spammers also seem to be the most dynamic.
So local listing may work better for the whitehats than for the blackhats.
This is certainly a good idea though. Note that Eric Kolve also built in local black and whitelists to SpamCopURI.
Jeff C.
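A minimal sketch of that local-list idea, assuming a SURBL-style zone at multi.surbl.org; the list contents, domain names, and function here are illustrative, not SpamCopURI's or SA's actual interface:

    import socket

    LOCAL_WHITELIST = {"w3.org", "yahoo.com"}    # top whitehat domains, never queried
    LOCAL_BLACKLIST = {"spammer.example.com"}    # stable blackhats, blocked with no query

    def is_listed(domain, zone="multi.surbl.org"):
        if domain in LOCAL_WHITELIST:
            return False                 # skip the DNS query entirely
        if domain in LOCAL_BLACKLIST:
            return True                  # likewise, no query needed
        try:
            socket.gethostbyname("%s.%s" % (domain, zone))
            return True                  # an A record means the domain is listed
        except socket.gaierror:
            return False                 # NXDOMAIN: not listed

The two local sets just short-circuit the lookup in both directions; keeping the blackhat set current is the hard part, per the note above about spammer domains changing over time.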
A far better way to effect this is to just increase the TTL on those long-term blackhat domains. (A static list is effectively an infinitely large TTL: query once and keep forever.)
The static whitelist is necessary because it is not possible to tune the NAK (negative-answer) TTL per query, and you want the general NAK TTL to be low to improve responsiveness of "add" events. (Hmm, there's a thought: modify a DNS server to hand out customized TTLs on particular NAK responses.)
You -can- tune the positive TTL on a per-entry basis. So for the long-term blackhats, just give them a large TTL (say 24 hours) and their queries will drop way down.
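For example, a sketch of a BIND-style zone fragment for a SURBL-like list (names and values are illustrative). Per RFC 2308 the negative-cache (NAK) TTL comes from the SOA minimum field, one value for the whole zone, while each positive entry can carry its own TTL:

    $TTL 180            ; default positive TTL for ordinary, fast-changing entries
    @  IN SOA ns1.example.org. hostmaster.example.org. (
              2004092901   ; serial
              3600         ; refresh
              600          ; retry
              604800       ; expire
              300 )        ; minimum: the zone-wide negative (NAK) TTL, kept low
    ; long-term blackhat: per-record TTL raised to 24 hours
    stable-spammer.example.com    86400  IN A 127.0.0.2
    ; fleeting blackhat: inherits the 180-second default
    fleeting-spammer.example.com         IN A 127.0.0.2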
This presupposes that all client sites are running a locally caching DNS server. Anybody -not- doing that should be banished and disallowed from using SURBL.