[THIN] Re: Home directory and Profile Script

  • From: "Andrew Wood" <andrew.wood@xxxxxxxxxxxxxxxx>
  • To: <thin@xxxxxxxxxxxxx>
  • Date: Thu, 20 Oct 2005 22:13:45 +0100

Mike,

You still here ?!?!

I'd have thought you'd have used Rick's script ages ago while we got on with
who's got the most efficient handbag :)

-----Original Message-----
From: thin-bounce@xxxxxxxxxxxxx [mailto:thin-bounce@xxxxxxxxxxxxx] On Behalf
Of msemon@xxxxxxx
Sent: 20 October 2005 21:57
To: thin@xxxxxxxxxxxxx
Subject: [THIN] Re: Home directory and Profile Script

Thank you! Just looking for a simple way to do this. I never thought a script
could bring out such passion in admins! Just goes to show there are
different ways to approach the same problem, especially with scripting.

Mike

Original Message:
-----------------
From: Joe Shonk joe.shonk@xxxxxxxxx
Date: Thu, 20 Oct 2005 13:46:26 -0700
To: thin@xxxxxxxxxxxxx
Subject: [THIN] Re: Home directory and Profile Script


If I may, the whole point of the script is to extract the TS Profile Path
for all users in AD. Correct?  Is it really worth trying to optimize the
code and resources used?  Especially when a script like this is used once
every full moon?  Shouldn't the goal be to get the job done?  So it takes a
few extra seconds to process or an extra megabyte of RAM on your PC doing
one way vs. the other.  Most people use the ADO method because that's what
Microsoft has posted on the script site.  A simple cut and paste, change a
few parameters and you're up and running...  Granted, shifting the workload
from your workstation to the DC isn't the best idea.

Using a recursive routine is not going to kill anything.  It's quite
effective when sorting through hierarchical data.  The amount of memory used
is going to be the first invocation's variables plus the variables for each
recursion. The total memory used at a given time may actually be less than
that of a single filtered query, as ALL of the filtered objects will be
returned in one large array.  The disadvantage of using a recursive routine
is that you will perform a query for each recursion.

Now on the flip side... Recursive routines are difficult to troubleshoot and
understand... Sure, it's easy for me to understand and I use them from time
to time, but it's not so easy for most people out there.  (These are admins,
not programmers... Remember, they are working on Microsoft products, not
Linux)  Sure it's cute to do things like use foo and bar for variables, but
do it for yourself... If you're writing the code for a customer, write it so
that they can understand and support it.

Efficiency is more than just how much memory or CPU is utilized, or even how
short the code is.  It also includes readability and supportability (if
that's a word).

Joe

-----Original Message-----
From: thin-bounce@xxxxxxxxxxxxx [mailto:thin-bounce@xxxxxxxxxxxxx] On Behalf
Of Braebaum, Neil
Sent: Thursday, October 20, 2005 12:53 AM
To: thin@xxxxxxxxxxxxx
Subject: [THIN] Re: Home directory and Profile Script

> -----Original Message-----
> From: thin-bounce@xxxxxxxxxxxxx
> [mailto:thin-bounce@xxxxxxxxxxxxx] On Behalf Of Andrew Wood
> Sent: 19 October 2005 17:54
> To: thin@xxxxxxxxxxxxx
> Subject: [THIN] Re: Home directory and Profile Script
> 
> Recursion is an inefficient process. I have not only suggested it, I 
> believe it to be. I believe this because in terms of raw computing 
> resource a recursive process will take more resource than a linear 
> one.

So - what makes that inefficient, *inherently* compared with any other
approach?

What if that's exactly what's wanted - what if that represents a good thing?

> What I've asked is that you
> think about it too - have you done that yet? I used the quote as an 
> example as he put it quite succinctly
> 
> 'Do they just, like, stay hanging around until everything finishes? ' 
> 
> Yep I'm saying exactly that - because that's what happens. 
> The memory states have to be stored while your subroutine works its 
> way through.
> 
> Here's a bit of recursion -                   
> 
> Recursive procedure 1
> [memory variables stored]
> Call recursive 
>       recursive procedure (2)
>       [memory variables stored]
>       call recursive
>               recursive procedure (3)
>               [memory variables stored]
>               return
>       return
> end
> 
> So - the first lot 'hang around' until (2) finishes; (2) hangs around 
> until
> (3) finishes, and so on and so forth. It's all memory sitting around 
> being used. Not a problem in the example code above - but you go and 
> start creating big arrays and populating data objects and that's a lot 
> of hanging around.

My point was that not every iteration of recursion is left hanging around -
only the currently instantiated ones.

And considering how AD should be structured, there should only be a handful
of iterations concurrently running. Any more than that, and
*any* searches would be wildly inefficient.
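
That point can be checked with a quick sketch (Python for illustration; the
wide tree below is hypothetical): recursing over a hierarchy keeps at most
depth-many frames alive, no matter how many objects it visits.

```python
def walk(node, depth=0, stats=None):
    """Visit every node, tracking how deep the live call chain ever gets."""
    if stats is None:
        stats = {"visited": 0, "max_depth": 0}
    stats["visited"] += 1
    stats["max_depth"] = max(stats["max_depth"], depth)
    for child in node.get("children", []):
        walk(child, depth + 1, stats)
    return stats

# A wide, shallow tree: 1 root with 100 children -> 101 nodes visited,
# but the call chain never goes deeper than 1.
wide = {"children": [{"children": []} for _ in range(100)]}
```

A sensibly structured directory is wide and shallow, so the number of
concurrently live frames stays small.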

Now accepted, recursing *many* times *could* be an inefficient manner of
doing things, but as it stands, recursing a small number of instances is
probably no more, and quite possibly less, resource usage than your bloated
ADO search.

> Maybe you don't care, maybe your servers have loads of memory. Just 
> because it's out there and it's being used doesn't mean it's the most 
> 'efficient'
> thing since sliced bread.

Are you honestly trying to lecture me that doing this by ADO is the most
resource efficient manner?

Look at all the objects you instantiate - and what do you think happens once
you submit your structured query? That it just disappears into some
wondrous, magic cloud, that's the most efficient thing since Babbage's
engine? How many objects, and memory used do you think occurs whilst the ADO
search occurs? What do you imagine is happening at that point?

If you really had any point to make on this WRT efficiency, you'd be
advocating searching the GC as I mentioned earlier.

You know why most *scripters* use ADO searches? Because they don't have any
other method of doing it. When they first had to face using the LDAP
provider, they searched on t'internet for some manner of getting all users
via LDAP, and came across ADO examples, downloaded scripts, and have copied
and pasted ever since, never once thinking about it.

You know why I don't *necessarily* use ADO searches? Because when I first
had to do this, not every client platform could do this, because it's often
a sledgehammer to crack a peanut, and because I can be more selective and
optimum using what I alluded to in my posted example - passing arguments to
the recursive subroutine.
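
Passing arguments to the recursive subroutine might look like this sketch
(Python rather than VBScript, with a made-up container structure standing in
for real LDAP containers; names like search_users are hypothetical):

```python
def search_users(container, max_depth, depth=0):
    """Recursively walk containers for users, refusing to descend past max_depth."""
    found = []
    for obj in container["objects"]:
        if obj["class"] == "user":
            found.append(obj["name"])
        elif obj["class"] == "organizationalUnit" and depth < max_depth:
            # The depth argument is what lets the caller limit the search.
            found.extend(search_users(obj, max_depth, depth + 1))
    return found

# Hypothetical directory fragment: users nested in OUs.
root = {"objects": [
    {"class": "user", "name": "alice"},
    {"class": "organizationalUnit", "name": "Sales", "objects": [
        {"class": "user", "name": "bob"},
        {"class": "organizationalUnit", "name": "EMEA", "objects": [
            {"class": "user", "name": "carol"},
        ]},
    ]},
]}
```

The same mechanism works for other limits: a starting container instead of
the root, or a condition checked before each descent.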

> You can get into some real pickles
> with recursion.

Agreed.

> Recursion allows for a succinct coding method, but it will be a trade 
> off on memory and resources.

*Can* be a trade off in memory and resources.

In this example, I highly suspect your ADO search uses more resources than
my lightweight recursive search.

> Recursive procedures can
> be more resource intensive than linear code.

Yes - agreed *can* be.

> Recursive
> procedures are typically harder to debug.

In the examples given, and for somebody relatively new to vbscripting as the
original poster appears, which do you imagine would be easier to debug?

> With all this
> information I believe recursion is a useful tool, but not necessarily 
> efficient.

Agreed, *not necessarily*. It's all about context and alternatives.

> Your original statement was that a recursive process was 'the most 
> efficient'

No it wasn't.

My original statement on efficiency was that I find it a more efficient
approach, as *I* can limit the degree of searching, either on depth, some
other condition, or on the starting place.

And given the two code examples given, I stand by my comments that mine is
more efficient.

> I do not believe that is correct - and you've still not convinced me 
> otherwise. 'The process solved the problem ergo its efficient' 
> [raspberry noise] ( in a friendly way ;) )

That's not what I said.

I qualified, previously, why in context, I find my approach more efficient.

And the way yours is coded certainly doesn't make me believe
that efficiency is an aspiration.

If you were truly bothered about efficiency, you'd be searching the GC.

Neil



****************************************************************************
*
This email and its attachments are confidential and are intended for the
above named recipient only. If this has come to you in error, please notify
the sender immediately and delete this email from your system. You must take
no action based on this, nor must you copy or disclose it or any part of its
contents to any person or organisation. Statements and opinions contained in
this email may not necessarily represent those of Littlewoods Shop Direct
Group Limited or its subsidiaries. Please note that email communications may
be monitored. The registered office of Littlewoods Shop Direct Group Limited
is 100 Old Hall Street Liverpool L70 1AB registered number 5059352
****************************************************************************
*




This message has been scanned for viruses by BlackSpider MailControl -
www.blackspider.com
********************************************************
This Weeks Sponsor: Cesura, Inc.
Know about Citrix end-user slowdowns before they know.
Know the probable cause, immediately.
Know it all now with this free white paper.
http://www.cesurasolutions.com/landing/WPBCForCitrix.htm?mc=TBCC
********************************************************
Useful Thin Client Computing Links are available at:
http://thin.net/links.cfm
ThinWiki community - Excellent SBC Search Capabilities!
http://www.thinwiki.com
***********************************************************
For Archives, to Unsubscribe, Subscribe or set Digest or Vacation mode use
the below link:
http://thin.net/citrixlist.cfm



--------------------------------------------------------------------
mail2web - Check your email from the web at http://mail2web.com/ .



