[sanesecurity] Re: Updated script: unofficial-sig.sh

  • From: Chuck Fisher <cfisher@xxxxxxxxxxxx>
  • To: sanesecurity@xxxxxxxxxxxxx
  • Date: Mon, 02 Feb 2009 20:12:31 -0600

Bill Landry wrote:
> Chuck Fisher wrote:
>   
>> Burt wrote:
>>     
>>> Bill Landry wrote:
>>>       
>>>> I know that Malcolm has claimed that my script should not be making
>>>> separate connections for each file download, however, from my research
>>>> and testing there is no other way to give users the ability to select
>>>> which database files they wish to download and use.
>>>>
>>>> I just took a look at Malcolm's script for reference, and it appears
>>>> that his script downloads all files at every check, whether they have
>>>> been updated or not.  The check for updates is done after all files have
>>>> been downloaded and then compared to the existing files running in
>>>> production.
>>>>
>>>> If we really should only be downloading files if they have been updated,
>>>> then this does not accomplish that goal, whereas my script does, and
>>>> also gives users the flexibility to choose which files they want to
>>>> download and use.
>>>>         
>>> I agree that the separate connections are a bad idea -- it will cause
>>> downloads to be blocked much sooner than otherwise.  rsync has the
>>> ability to do multiple specified files in a single connection.  This
>>> is covered in the "Advanced Usage" section of the rsync man page.  The
>>> script should end up executing a command along the lines of the
>>> following:
>>>
>>> rsync -av host:'dir1/file1 dir2/file2' /dest
>>>
>>> Constructing that 'dir1/file1 dir2/file2' string from a list of
>>> desired sig files should not be that difficult.  rsync3 added a
>>> different format for specifying multiple files, but it still supports
>>> this older method, and there seem to be plenty of rsync2.x installs
>>> still out there.
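Purely as an illustration of Burt's suggestion, building that string from a list of wanted signature files could look roughly like the sketch below. The host, module name, file names, and destination are all placeholders, not taken from any actual script; the quoted-space word-splitting shown is the older multi-file syntax documented in the rsync man page.

  # hypothetical list of wanted signature databases
  WANTED="junk.ndb phish.ndb scam.ndb"

  # build the space-separated 'module/file1 module/file2 ...' source string
  SRC=""
  for f in $WANTED; do
      SRC="${SRC:+$SRC }sanesecurity/$f"
  done

  # one rsync connection for all requested files (older quoted-space style)
  rsync -av rsync.example.org::"$SRC" /var/clamav-mirror/
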
>>>
>>> As far as the fetch-sanesecurity-sigs script goes, the point of rsync
>>> is that it will only pull down differences, especially when dealing
>>> with entire directories at the one time.  If the files are the same,
>>> nothing is really transferred.  On the flip side, that script also
>>> assumes that all sig files are wanted -- no simple method for choosing
>>> files either.
>>>
>>> Happy trails,
>>>
>>>       
>> Maybe the answer/logic I'm using can be of some help. I'm using rsync
>> exactly as mentioned and keeping two directories mirrored. One is a
>> mirror of the sanesecurity files, the other of MSRBL. While the
>> sanesecurity directory is currently only 4.6 MB, MSRBL is 37 MB. Even
>> so, in my opinion it makes things easier to keep the files locally.
>> It's from these directories that I choose which files to use with Clam.
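A minimal sketch of how those two local mirrors might be maintained; the host names, module names, and paths here are placeholders only, not the real mirror addresses.

  # mirror the Sanesecurity and MSRBL signature sets locally
  rsync -av --delete rsync.example.org::sanesecurity/ /var/mirror/sanesecurity/
  rsync -av --delete rsync.example.net::msrbl/        /var/mirror/msrbl/
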
>> Using a list of wanted files, I compare the wanted files to the files in
>> use, then copy (not move, so rsync does not need to download them again
>> until they change) the files that need updating into a temp directory.
>> I change ownership etc. while they're in the temp directory, then do a
>> bulk forced move from the temp directory to the Clam DB directory, and
>> tell Clam to restart. I've experimented with letting Clam pick up the
>> updates on its own, but I've noticed (I think) that a removed .?db file
>> does not trigger an update.
>> In the mix, though, are a few system-specific tweaks I'm using. If there
>> are files to be updated, my script tells the MTA to stop, a sleep loop
>> watches for my MTA's .pid file to know when it's safe, and then the
>> script reloads Clam and starts the MTA back up. This way there's little
>> chance of Clam crashing due to a forced reload while it's busy. The MTA
>> is only down maybe 5 seconds on average for an update, and if there are
>> no files to update, nothing is stopped or reloaded.
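Very roughly, and only as a sketch of the flow Chuck describes (every path, user, pid file, and service command below is a stand-in, not taken from his actual scripts), the update step might look like:

  #!/bin/sh
  # stand-in locations -- adjust for the real system
  MIRROR=/var/mirror/sanesecurity
  CLAMDB=/var/lib/clamav
  STAGING=/tmp/clam-staging
  WANTED="junk.ndb phish.ndb scam.ndb"
  MTA_PID=/var/run/mta.pid             # hypothetical MTA pid file

  mkdir -p "$STAGING"
  updated=0

  # copy (not move) any wanted file that differs from the one in use
  for f in $WANTED; do
      if ! cmp -s "$MIRROR/$f" "$CLAMDB/$f"; then
          cp "$MIRROR/$f" "$STAGING/" && updated=1
      fi
  done

  if [ "$updated" -eq 1 ]; then
      chown clamav:clamav "$STAGING"/*  # fix ownership while staged

      /etc/init.d/mta stop              # placeholder MTA init script
      while [ -f "$MTA_PID" ]; do       # wait until the MTA is really down
          sleep 1
      done

      mv -f "$STAGING"/* "$CLAMDB"/     # bulk forced move into the Clam DB dir
      /etc/init.d/clamd restart         # placeholder reload/restart of clamd

      /etc/init.d/mta start
  fi
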
>>
>> Chuck
>>     
>
> Issuing a "kill -USR2" to the clamd pid notifies clamd to reload its
> databases without performing a restart of the clamd service.
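In other words, something along these lines; the pid file path varies by system, so the one shown is just an example.

  kill -USR2 "$(cat /var/run/clamav/clamd.pid)"
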
>
> Bill
>
>   
Bill,
I know/understand that. I'm being overcautious, and it also works out
this way because I'm reusing scripts I've written for other purposes.
I described the logic behind my update scripts the way I did because
that's what really happens on my system, not just the bare minimum that
would be needed.
In a way I'm being lazy and not doing all the work from one script,
which is why I'm not able to "share with the class" at this time and
publish a viable script that others can use.
Just wanted to share what's working for me. And why.

Chuck
