Hi
Have you thought about checking the log data from caching proxies against this data and the other SURBL lists? A caching proxy's logfiles give a very good view of which domains are visited while surfing. Not every visited domain is legitimate, but I would be surprised if active spam domains had been visited frequently over a period in the past. Those frequently visited domains shouldn't be whitelisted by default, but high traffic could be a trigger for a closer look before placing them on a list.
A very nice thing about this data is that it stays useful for a long time: even data from a few months ago will be very useful, if not more useful than data from today or the past week.
A caching proxy could also make good use of the SURBL lists. Not caching spammers' domains, adding a delay of several seconds, or even inserting a warning page before returning the page could be useful, especially in non-ISP settings (organisations, companies, etc.) where blocking content is less of a problem.
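A minimal sketch of the log-mining idea, assuming squid's native access.log format (where the requested URL is the seventh whitespace-separated field); the field position, the hit threshold, and the function names are illustrative assumptions, not a finished tool:

```python
from collections import Counter
from urllib.parse import urlsplit

def domain_of(url):
    """Extract the host from a logged URL; squid logs CONNECT
    requests as a bare host:port rather than a full URL."""
    if "://" in url:
        return (urlsplit(url).hostname or "").lower()
    return url.split(":")[0].lower()  # CONNECT host:port

def frequent_domains(log_lines, min_hits=100):
    """Count hits per domain and return those seen at least
    min_hits times -- candidates for a closer look before
    whitelisting, not an automatic pass onto any list."""
    counts = Counter()
    for line in log_lines:
        fields = line.split()
        if len(fields) >= 7:
            d = domain_of(fields[6])
            if d:
                counts[d] += 1
    return {d: n for d, n in counts.items() if n >= min_hits}
```

The output would then be compared against the SURBL data by hand, since (as noted above) heavy traffic alone does not make a domain legitimate.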
Alain
On Saturday, March 26, 2005, 2:18:07 AM, Alain Alain wrote:
> Have you thought about checking the log data from caching proxies against this data and the other SURBL lists? A caching proxy's logfiles give a very good view of which domains are visited while surfing. Not every visited domain is legitimate, but I would be surprised if active spam domains had been visited frequently over a period in the past. Those frequently visited domains shouldn't be whitelisted by default, but high traffic could be a trigger for a closer look before placing them on a list.
> A very nice thing about this data is that it stays useful for a long time: even data from a few months ago will be very useful, if not more useful than data from today or the past week.
It's an interesting idea to try to generate whitelist data, or at least to get an idea of the magnitude of web site hosts visited. Does anyone have such data in usable form already? :-)
> A caching proxy could also make good use of the SURBL lists. Not caching spammers' domains, adding a delay of several seconds, or even inserting a warning page before returning the page could be useful, especially in non-ISP settings (organisations, companies, etc.) where blocking content is less of a problem.
> Alain
Yes, there has been talk of using SURBLs with squid caching proxies, etc. Your ideas about warning users before they visit listed sites are interesting. Perhaps someone reading this will try to implement some of them.
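For the record, a SURBL check works the same way from a proxy as from a mail filter: a domain is listed if a DNS A query for the domain prepended to the list zone (e.g. `example.com.multi.surbl.org`) resolves. The `multi.surbl.org` zone is real; the helper names below are illustrative, and the warning/delay step is only sketched in comments:

```python
import socket

SURBL_ZONE = "multi.surbl.org"

def query_name(domain, zone=SURBL_ZONE):
    """Build the DNS name to look up:
    example.com -> example.com.multi.surbl.org"""
    return f"{domain.strip('.')}.{zone}"

def surbl_listed(domain):
    """True if the domain resolves in the SURBL zone;
    NXDOMAIN (gaierror) means 'not listed'."""
    try:
        socket.gethostbyname(query_name(domain))
        return True
    except socket.gaierror:
        return False

# A proxy hook could then act on the result, e.g.:
#   if surbl_listed(host):
#       time.sleep(5)            # delay the response, or
#       return warning_page()    # interpose a warning (hypothetical helper)
```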
Jeff C. -- "If it appears in hams, then don't list it."