> I'm not much of a scripter or a programmer (if you know C++,
> HELP!) but my thought was to have them all pipe to the same
> file; then you only have one file in the end rather than a
> file per computer.
>
> Of course with your way, it would only happen once per
> computer, which would probably be easier in the end.
>
> Then you could just make another script to run against the
> directory to make one big file.
>
> Greg
I've done that before, but even though I used >> to append rather than overwrite, the file still got clobbered when many machines tried to write to it at the same time. I'd start with a list of 100 computers, and two minutes later only 25 entries would be left.
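The race Ray describes is real: `>>` opens the file in append mode, but when many clients append to one file over a network share, their writes can interleave or clobber each other. A safer pattern, in the spirit of Greg's suggestion, is one file per computer followed by a single merge pass. A minimal POSIX-shell sketch (the `out/` directory name and the use of `hostname` as the inventory command are illustrative placeholders; on Windows the equivalent would key off %COMPUTERNAME%):

```shell
#!/bin/sh
# Each machine writes only its own file -- no shared-file contention.
mkdir -p out
# Placeholder for the real data-collection command; here we just
# record the machine name so each writer produces a unique file.
hostname > "out/$(hostname).txt"

# Later, one merge pass (run once, from one machine) produces the
# single combined file Greg wanted:
cat out/*.txt > combined.txt
```

Because every machine owns its own output file, no two writers ever touch the same file, and the merge step runs serially, so nothing is lost.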
Ray at work
Chris Berry
compjma@xxxxxxxxxxx
Systems Administrator
JM Associates