Guillermo Alan Bort wrote, on my timestamp of 6/04/2009 10:16 AM: Embedded comments.
However, I must say that one without the other (be it either theoretical knowledge or experience) is not enough.
Precisely. 100% agreed.
Certifications are a necessary evil.
I disagree, they most certainly are not necessary. The majority of people that directly benefit from certifications are those making money out of providing them.
I actually got a raise after I got certified.
And that is a perfect example of why not. Raises should never be linked to certifications. Education, yes, absolutely. Certifications provided by the industry itself and recognized only by said industry? Never. That is precisely what is wrong with the whole enchilada.
It was important for me since I don't have a college degree yet, and am working as a Sr. DBA... and many juniors are actually computer engineers... so I had to have some 'valid' way of showing I deserve to be Sr. (actually after a few weeks at the job it shows, but it's a nice addition to my wall ;-))
Independent of your personal motivation, a certification is useless as a means of assuring me that the person flaunting it is motivated. Ten years on the job, on the other hand, tells me that person has, if nothing else, a commitment to the profession. Of course, it might be a case of a "*saurus". But detecting that is my duty of care, and a certification is the last place I'd look for help.
You are of course referring to the dreaded dbasaurus.
No I am not. In this day and age of instant gratification, 10 years might be considered a lifetime. It is far from it.
I think that a long-time DBA who has become stale is the worst DBA.
There is a fundamental difference between a stale dba and the job itself being stale. This might come as a surprise to those used to the clickety-click of OEM, but dbas do not actually need graphics to work out how to manage disk space in a database. The subject has been part and parcel of dba work for more than three decades, since long before Oracle itself existed. In that sense the job itself is stale. Of course, we may flash it up with spiffy graphics to make it look new, but it's still fundamentally the same thing.
I actually had a discussion with a senior dba (senior due to age, not knowledge) about locally vs dictionary managed tablespaces in 10g. He maintained that dictionary managed was better because local extent management was buggy. I had a similar argument with a 9i OCP about ASM... he said that using Veritas' cluster filesystem was better because ASM was buggy (though true, the result was a failed RAC implementation due to Veritas' limitations and a new project using ASM)
Not so fast. Without a doubt ASM is horribly buggy on anything other than Linux. Anyone recommending it for the sake of ASM itself is doing a disservice to the client. That needs to be said and pointed out clearly; hiding reality serves no purpose whatsoever. Of course, if RAC is involved the picture changes somewhat: both are horribly buggy anyway on anything other than Linux. But I digress... As for DMT vs LMT: I can certainly vouch for the terrible bugs in LMT from its introduction all the way up to the early 9i releases. Even in 10g: two months ago I installed the latest patch to 10.2.0.3 to avoid db file corruption with LMT and ASSM. It is still that buggy! Guess what: the same patch is needed for 11g! Of course, with 10g I wouldn't even think of creating DMTs! Come to think of it, I haven't used DMTs since I moved to 8i. But I certainly haven't stopped checking for patches. You know why? Experience. The very thing missing in those who think that using the latest and brightest is proof enough of safety and reliability. Check with the dbas you mentioned whether that was not what they meant. If it was, there is your answer. If it wasn't, you have my full support to brand them as pure dbasaurus! ;)
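To the point about not needing graphics for disk-space management: the classic way to eyeball free space from SQL*Plus is a simple query against the data dictionary. A minimal sketch, assuming the standard DBA_FREE_SPACE view (the rounding and column aliases are purely illustrative):

```sql
-- Free space per tablespace, the way it has been done from
-- SQL*Plus for decades -- no GUI required.
select tablespace_name,
       round(sum(bytes) / 1024 / 1024) as free_mb,
       count(*)                        as free_extents
  from dba_free_space
 group by tablespace_name
 order by free_mb;
```

Join it to DBA_DATA_FILES for allocated-versus-free percentages and you have most of what the OEM space page shows, in plain text.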
That's what reporting tools are for.
No it most definitely is not. Reports don't replace analysis and reasoning. They just make it easier to visualize what is, for all intents and purposes, numbers. We still need to analyze the report. And understand its meaning and the meaning of the data behind it.
It's like telling engineering students they can't use a periodic table because in the old days they didn't have them.
Actually, it's like telling chemical engineering students they can deduce the totality of possible chemical reactions between elements by observing the periodic table. Any chemical engineering student will immediately explain that is simply impossible.
If you learn to use the new tools (OEM?) you might save a great deal of time and get the same information (plus AWR/statspack reports in a neat easy-to-copy-and-paste-into-a-report html)
I have learned to use the "new" tools. OEM has been around since release 7; that's over 15 years. The new html version is just a different presentation of the same information, not a new tool. Sure, it has new functionality. It had better! But that doesn't make it a "new" tool. Grid is, though. But that's a different kettle of fish.
I've utterly destroyed a few databases in my time (though through the OS and not Oracle itself... like running fsck on a mounted filesystem)
Ah yes. You should have tried "rm -rf .*" in the old days of basic AT&T SystemV. Nothing destroyed ALL file systems more efficiently... LOL!
I've used both EM (both DC and GC) and sql*plus scripts (on aix). For most tasks I take sql*plus over EM every time... except for a quick overview of the database. There's something about EM's performance page that just makes db tuning easier (without using the advisors, due to license restrictions)... it's like reading the head of a statspack report and knowing where to drill down. EM's performance page gives you that, and I think that's about all I'd use EM for... at least db control (grid control has other features).
I find it easier and faster to go with the head of Statspack plus a home brew script to check v$session for runaways. EM is still too misleading in the overall status page. What I find it incredibly useful for, with the tuning pack, is to quickly work out a better plan for a tricky statement. I'm sick and tired of manually figuring out three-page-long Peoplesoft SQL with umpteen levels of views on views and impenetrable execution plans. On a good day. That is precisely where an automatic tool like the tuning pack becomes invaluable. It saves heaps of time. But it's still not perfect: all it can do is fix that particular iteration or instance of that statement. If the problem is caused by some other factor than just that plain SQL, it won't go away that easy and will require lots of analysis and reflection.
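The "home brew script to check v$session for runaways" mentioned above would be something along these lines. A hedged sketch only: the columns come from the standard V$SESSION view, but the 10-minute threshold and the exact filters are illustrative, not the author's actual script:

```sql
-- Flag active user sessions whose current call has been running
-- for more than 10 minutes (LAST_CALL_ET is in seconds).
select sid, serial#, username, sql_id, last_call_et
  from v$session
 where type = 'USER'
   and status = 'ACTIVE'
   and last_call_et > 600
 order by last_call_et desc;
```

The SQL_ID can then be fed to the statspack output, or to the tuning pack if it is licensed, to work out what the runaway is actually doing.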
Oh, and perhaps the dbms_scheduler interface... since I really, really don't remember any of its member functions and procedures :-P
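For context, this is the sort of DBMS_SCHEDULER call being referred to. A hedged sketch: the CREATE_JOB parameters are from the standard package, but the job name, action and schedule are purely illustrative:

```sql
-- The kind of call that is hard to remember offhand.
begin
  dbms_scheduler.create_job(
    job_name        => 'NIGHTLY_STATS',       -- illustrative name
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'begin dbms_stats.gather_schema_stats(user); end;',
    start_date      => systimestamp,
    repeat_interval => 'FREQ=DAILY;BYHOUR=2', -- calendaring syntax
    enabled         => true);
end;
/
```

It is easy to see why a point-and-click front end, or a set of saved scripts, beats typing this from memory.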
Indeed! It's taken me a while but I finally got a set of scripts that more or less keeps that beast under control. Without a doubt the OEM front-end makes it heaps easier!
--
Cheers
Nuno Souto
dbvision@xxxxxxxxxxxx
--
http://www.freelists.org/webpage/oracle-l