Author Topic: How often may I download the MDL hosts file?  (Read 16400 times)

March 07, 2009, 10:47:51 pm

JohnnyZb1UKXat

  • Guest
Hello MalwareDomainList,

I have written a shell script to automatically download some hosts files from the internet for a few OS X machines in-house. One of them is the hosts file you provide.

Could you tell me: how often may I download the MDL hosts file?

~Good day

March 08, 2009, 01:56:05 pm
Reply #1

DiFor

  • Jr. Member

March 08, 2009, 04:39:18 pm
Reply #2

JohnnyZb1UKXat

  • Guest

March 08, 2009, 04:51:08 pm
Reply #3

SysAdMini

  • Administrator
  • Hero Member

Quote
Could you tell me: how often may I download the MDL hosts file?


Usually we add new URLs once or twice per day, so it is pointless to poll the list every few minutes.
I suggest polling the list at 4-hour intervals. Better still, you could subscribe to the RSS feed, analyze its contents,
and download the list only when changes occur.
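
A minimal sketch of that approach in shell, assuming a cron job every 4 hours; the feed and hosts file URLs below are placeholders rather than confirmed endpoints, so substitute whatever MDL actually publishes:

Code:
#!/bin/sh
# Check the MDL RSS feed and download the hosts file only when the feed has
# changed since the last run. Both URLs are assumptions for illustration.
FEED_URL="http://www.malwaredomainlist.com/hostslist/mdl.xml"
HOSTS_URL="http://www.malwaredomainlist.com/hostslist/hosts.txt"
STATE_DIR="$HOME/.mdl"
mkdir -p "$STATE_DIR"

# Hash the feed and compare it to the hash saved by the previous run.
NEW_SUM=$(curl -s "$FEED_URL" | md5)          # on Linux: md5sum | awk '{print $1}'
OLD_SUM=$(cat "$STATE_DIR/feed.md5" 2>/dev/null)

if [ "$NEW_SUM" != "$OLD_SUM" ]; then
    echo "Feed changed - downloading hosts file"
    curl -s -o "$STATE_DIR/mdl-hosts.txt" "$HOSTS_URL"
    echo "$NEW_SUM" > "$STATE_DIR/feed.md5"
else
    echo "No change - skipping download"
fi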

Regards,

 SysAdMini
Ruining the bad guy's day

March 10, 2009, 12:04:04 am
Reply #4

JohnnyZb1UKXat

  • Guest
I think it would be a good idea to include, as a comment in the hosts file itself, how often you feel it is acceptable for someone to download it.
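
Purely as an illustration of that suggestion, such a header comment might look something like the following; the interval shown is hypothetical, not stated MDL policy:

Code:
# MalwareDomainList.com hosts file
# Please do not download this file more often than once every 4 hours.
# Check the RSS feed first and fetch the list only when it has changed.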

Thanks for your reply, the hosts file, and MalwareDomainList.

April 21, 2009, 04:49:12 pm
Reply #5

RS-232

  • Special Access
  • Sr. Member

Quote
It looks like whatever was causing the high pull-the-plug rate on the hosts is over now.
Sad but true...

Quote
I think my advice is to tell people to merge this file with whatever they have with HostsMan on Windows...
.....
Just live with the dead hosts in the file (they are very good at removing them now) and realize that they don't have all of the problem hosts out there.

Calculations - part 1:
In late 2008, it was estimated that there were about 300 million porn sites out there on the net.
Now, just how many of those are the supposedly "well-recognized" legitimate sites of the porn industry?
50-100 at most, I would dare say... but let's add the lesser-known ones to those - say 400 more.
Add to these roughly 500 webmasters about 2,000 commercially affiliated partner sites, and we're counting 2,500 in total.
Let's be generous and assume that each of the above webmasters actually runs 500 different sites, which brings us up to 1,250,000.
But I'll even round that up to 2 million... thereby leaving 298 million sites unaccounted for.

Calculations - part 2:
Does anyone really think I'm being way too strict with the above?
Feel free to multiply by 10... that would still leave us with 280 million "suspicious" sites, lol...
And even so, I still feel like I should be even more "generous" than that, heh:
let's say that merely 5% of those serve nasties and the like... which still leaves us with 14 million sites.

Now, what's the point of all the above? MDL lists about 7,000 sites at the moment.
Other blocklists have 50,000 entries, and some others maybe even 100,000 in total.
Even if a blocklist file had 1 million entries, one can easily see that it is more or less a "poor man's AV".

So what makes a blocklist effective is obviously not the number of domains listed.
It's not even so much how often it is updated per day or week, although that does play a fairly major role.
What actually matters is the "quality" of the malware it can block at a given moment in time:
i.e. samples with a low detection rate that represent a newer threat and mutate/spread within a small fraction of time,
pages that can act as a carrier/dropper for multiple drive-by infections in a row, etc.

If the whole process of blacklisting domains were organized in a 100% optimal way,
I'd dare say that most of the "new and important" malware threats could be prevented from spreading
with a daily updated blocklist containing only 4,000, or maybe 5,000, entries.
"Prevented from spreading" means until the AV products fully dissect the newer threats presented by the sites in question,
add the samples to their databases, improve their engines to deal proactively with possible future variants, and so on:
for decent products this is usually just a matter of a few days, and the same goes for browsers' and search engines' proactive filters.
Now, if only more ISPs/netops also somehow made use of such blocklists during those few days...
Spamhaus seems to be the only notable example of a blacklisting service that is widely utilized for the time being.
Only for the "fun" of it...rs-232 aka sowhat-x aka younameit ;-)
http://www.youtube.com/watch?v=fADjY97_KTw

July 27, 2009, 09:04:38 pm
Reply #6

hhhobbit

  • Special Access
  • Full Member

I am providing the following files at hostsfile.org & securemecca.com for synchronization and to minimize downloads to only what is necessary:

http://HostsFile.org/Downloads/hdate.txt
http://HostsFile.org/Downloads/pdate.txt
http://SecureMecca.com/Downloads/hdate.txt
http://SecureMecca.com/Downloads/pdate.txt

WARNING: just because the web site shows a given date, that does not mean what you have is current, especially for the hosts file. Right now I am putting the hosts file up fairly often. The contents of the hdate.txt and pdate.txt files are literally a date and time stamp, and they are the authoritative record of when the files were updated (hdate.txt for the hosts file, pdate.txt for the PAC filter). Here is a shell script that will compare what you have with what is on the server and tell you whether the hosts file has changed (it will also prime the pump, but on Windows you need to install UnixUtils or Cygwin to make it work):

http://www.securemecca.com/public/SecureMeccaUpdated.sh
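
The mechanism is simple enough to sketch. The following is not the actual SecureMeccaUpdated.sh, just a minimal illustration of the date-stamp comparison; the hdate.txt URL is the one listed above, while the archive name and local paths are assumptions:

Code:
#!/bin/sh
# Sketch of the hdate.txt mechanism: fetch the tiny date-stamp file, compare it
# to the copy saved on the previous run, and only pull the full hosts file when
# the stamp has changed. Not the real SecureMeccaUpdated.sh.
BASE="http://securemecca.com/Downloads"
HOSTS_URL="$BASE/hosts.zip"          # assumption: the real archive name may differ
STATE_DIR="$HOME/.securemecca"
mkdir -p "$STATE_DIR"

curl -s -o "$STATE_DIR/hdate.new" "$BASE/hdate.txt"

if cmp -s "$STATE_DIR/hdate.new" "$STATE_DIR/hdate.txt" 2>/dev/null; then
    echo "hdate.txt unchanged - hosts file is already current"
    rm "$STATE_DIR/hdate.new"
    exit 0
fi

echo "hdate.txt changed - fetching new hosts file"
curl -s -o "$STATE_DIR/hosts.zip" "$HOSTS_URL"
mv "$STATE_DIR/hdate.new" "$STATE_DIR/hdate.txt"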

All of the Auto*.sh files that do an automated download and install have been changed to depend on these files. If nothing has changed, they tell you so and then exit. You can click on the Auto*.sh and SecureMeccaUpdated.sh script files thousands of times per day and I won't mind. You can even automate it and I won't care. It is highly unlikely that an update will follow the previous one by less than four hours for the hosts file, or by less than a week for the PAC filter. I will say that for the next few weeks you can count on the hosts file changing several times per day, though (after I get Fedora 11 and OpenBSD loaded on the new system).

The pdate.txt file used with the PAC filter is there not so much to prevent server overload as to put off updating the PAC filter until it has actually changed. That way you preserve your white-list rules as long as possible. But I do listen to anybody and will white-list domains that most people may go to. I will also downgrade some rules from URL to host scope, or alter other rules that cause problems. But the PAC filter isn't perfect (nearly so for my uses), so as with NoScript, white-listing a domain is the encouraged first choice for resolving a problem.

So click away all you want. I won't mind! More to the point, the web servers will thank you.

PS: I have put a lot of hosts from WOT (Web of Trust) into the hosts file lately.

July 30, 2009, 10:58:20 am
Reply #7

hhhobbit

  • Special Access
  • Full Member

My whole point in this is to come up with something that will reduce the load on MDL long before it ever becomes a problem. I just modified this script (hopefully for the last time):

http://www.SecureMecca.com/public/SecureMeccaUpdated.sh
(which uses the following file):
http://www.SecureMecca.com/Downloads/hdate.txt

In addition to the date stamp in the hdate.txt file, I have added the reason for the volatility: ScareWare. You may just want to add the hosts in the hdate.txt file! To give you an idea of how bad it is, I went to work and left a script running that did the test about 3 hours after I left. When I came back home seven hours later, the host used in the exploit four hours earlier was already dead and had been replaced by another host. That is why I say "host of the hour". By replaced I don't just mean it was no longer used - it was no longer even in DNS. It was gone! This is their modus operandi. I am putting these discarded hosts into my add.Dead file because they may come back to life. A detection rate of only 0/41 at VirusTotal for almost every binary I got from them (and they have hundreds of versions) is something to take seriously.

I would appreciate you tossing a coin. If it comes up tails, change "securemecca.com" to "hostsfile.org" in my scripts. If you feel strongly about which web server you want: HostsFile.org is IIS (getting them to allow a download of a 7-Zip file was very difficult), while SecureMecca.com is a modified Apache. HostsFile.org used to have the bigger load; they are about even now.

The main reason I wrote all of this is just to throw out some ideas that can reduce the load on MDL and the other servers out there delivering lists of bad hosts (malware, trackers, etcetera). I really do think a tiny date-stamp file is a good mechanism for those who want to automate their updates while not loading down servers and inconveniencing other people. If anybody has a better idea, now is the time: before MDL gets used by everybody instead of just the security people, something is needed to reduce the load to a minimum. Things are getting pretty bad out there when somebody at SANS (Northcutt) makes a statement about hosts files as a security measure. He has also mentioned NoScript. There is even an entire Math department using NoScript on mostly Unix-type systems.

Anyway, if somebody has a better idea of how to reduce the workload while making sure that busy shops depending on being as current as possible aren't left unprotected, please speak up now. RSS is okay, but you have to be there to see it. What I just provided makes it possible for the person who posted to pull the hdate.txt file down anywhere from every 15-20 minutes to every several hours. If it changes, then they need to take action. In fact, a smart admin can set things up so that everything is automated.
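
For instance, the whole check can be driven from cron. A hypothetical crontab entry (the script path and log location are just illustrations) might look like this:

Code:
# Run the date-stamp check every 4 hours; the script itself decides whether
# anything actually needs to be downloaded (the path is illustrative).
0 */4 * * * /usr/local/bin/check_hosts_update.sh >> /var/log/hosts_update.log 2>&1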