Monday, October 15, 2007

Database Tourette's

Man, you ever have one of those customers who gives you Tourette's for like a week? You know the one I mean... he calls you up with something ridiculous he's promised a customer and it falls to you to get it done, only it's a month-long task that he's promised in 12 days. Or the one who keeps getting you out of bed because he won't quit monkeying around in the system and he keeps deleting the wrong data and you have to recover every 3 days.
And what do you do when that happens? You walk around for a week muttering obscenities to yourself. You find yourself telling him off in the shower. You see his face in stop signs. You hear his voice asking for a restore every night just as you fall asleep. Then you wake up and mutter more obscenities to yourself. He's like a bad song. Those of you who are parents know what I'm talking about. How often have you tried not to hear the Oobi theme song in your head, or that god-awful map song from Dora?

Oddly enough, I just finished a bout of Tourette's myself. I'm not going to go into the details, but it's something that got me muttering for over a week. Now I'm feeling much better and back to the good ole days falling asleep to the Oobi theme.
Tuesday, August 14, 2007

SQL Server Service Packs

I think it should be easier to find the service packs on the SQL site. There used to be a link on the front page, and then it moved to the download page. Now you have to search for it, and while the search does eventually take you to the download page, it's not as easy to find as it used to be.

I think I actually found it this time by accident while looking at an article on SP hotfixes... and that had a link to the SP2 download page. Come on MS, make it easy for us to find these things, and let us know the last time the SP was refreshed.

OK, so here's the SP2 page for Yukon. I found it from the SQL Support page here.
Tuesday, August 07, 2007

Gill People

I'll tell ya man, every now and then you get one of those users who just tests every part of you.  I've currently got this lady who calls about 3x/wk with one issue or another.  She's a report writer, and not really a very good one, but anyway.  This lady actually never shuts up while we're on the phone.  I took special notice the other day when she called to ask me why her ODBC wasn't working all of a sudden, and she actually didn't breathe for like 2mins.  It was amazing.  I swear to god she's actually grown gills so she doesn't have to stop talking.

It's really hard to deal with users like that too.  I mean, how can you tell someone to shut up and retain any kind of good customer service?  It's tough really because I don't have time to turn a quick troubleshooting call into a 30min dissertation on the merits of using one font or another in an insignificant report. 

What I've done is taken to putting her on the defensive right away.  Whenever she calls, I'm always on my way to a meeting, but I've got 2mins to help her if she'll make it fast.  So far it's working really well.  Let's just hope she doesn't read my blog.

Tuesday, July 31, 2007

No Ticket, No Problem

One of the hardest things to manage as a DBA is all the different sources we receive requests through. We get walk-ups, emails, calls, tickets, rumors, pigeons, and sometimes even smoke signals. And while it's easy for the boss to say 'no ticket, no problem', as a DBA in the trenches you can't exactly be a dick to your users. You have to take the request in whatever form it manifests itself. Sure, you can try to push everyone to a ticketing system, but it's just not worth it in the long run. Too many times the request is something small like "can I have access to this table?" or "can I get a backup real quick so I can push this new code?" Using a ticketing system just isn't always practical or even conducive to good relations.

So what can you do then? I mean, you can't exactly just have people shouting requests at you from down the hall and assuming you heard them. What I tend to do is push everyone to email. After all, I get email on my hip, and it's easy enough that it doesn't get in the way of even the simplest requests. When I'm in a shop that insists on having tickets for everything, I'll take requests through email and just create the ticket myself. And once I get all these requests coming in, I keep them organized in a master list of some kind. I used to use Excel, which has its advantages with sorting and the like. Currently I use OneNote, and I'm liking it just fine. In the past I've also written small web apps to manage my task list.

The point, though, is that it really doesn't matter what you use. Use whatever is easiest for you and won't hold you back. Just make sure that whatever you use, you use it all the time. If you want to manage tasks in Outlook, then do so. Like I said though, I like OneNote these days, mainly because it's easy to copy emails over from Outlook. That makes it really easy to reply once the task is done, and it gives me a record of exactly what was requested.

Thursday, July 05, 2007

The Bathroom DBA

I was watching this guy in the bathroom the other day (no, not like that… be nice!), and something struck me as a bit pointless. This guy had one of those large PDA phones and nowhere to put it, so he laid it on top of the urinal while he was doing his business. When he was done, he picked it back up, went to the sink, washed his hands, and left. Since he laid his phone on the urinal and didn't wash it, I was wondering why he bothered washing his hands at all. Doesn't he know that unless he actually pees directly on his hands, they're not in any more danger than his phone is, and it's actually highly likely that his phone has whatever his hands do? And where do you think all that yuckiness will go the next time he uses his phone?

I saw this episode of MythBusters where they tested this theory with toothbrushes. They had a control brush under glass down the hall, and they had a couple more sitting in the bathroom, etc. At the end, what they found was that there was just as much fecal matter on the brush under glass down the hall as there was on the one in the bathroom itself. Pretty gross, huh? The point is that there’s a lot of conjecture that goes on when it comes to bathroom habits and the like. The MythBusters however, pulled hard numbers and proved that your toothbrush is going to get nasty stuff on it regardless.

In DBs we deal with numbers, not conjecture and not feelings. Numbers don't lie. So the next time you step into a meeting, or go to your boss to ask for a change to the DB for whatever reason, perhaps it would be best if you pulled some numbers first so you have something solid to back you up. Otherwise, despite all your trouble, you might just end up with piss on your face.
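
And if you're wondering what kind of numbers I mean, here's a sketch of the sort of thing you can walk in with (SQL 2005 and up; tune the ORDER BY to whatever you're actually arguing about):

-- Top statements by logical reads, straight from the plan cache.
-- Hard numbers, no conjecture.
select top 10
    qs.execution_count,
    qs.total_logical_reads,
    qs.total_worker_time / 1000 as total_cpu_ms,
    qs.total_elapsed_time / 1000 as total_elapsed_ms,
    substring(st.text, (qs.statement_start_offset / 2) + 1,
        ((case qs.statement_end_offset when -1 then datalength(st.text)
          else qs.statement_end_offset end - qs.statement_start_offset) / 2) + 1) as statement_text
from sys.dm_exec_query_stats qs
cross apply sys.dm_exec_sql_text(qs.sql_handle) st
order by qs.total_logical_reads desc
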
Tuesday, June 19, 2007

New RSS Feed Ready!!!

OK, I've been promising this for a long time now, but I finally got the RSS ready for IT Bookworm. So you guys go on and subscribe and I'll update it with book and video reviews as well as press releases for new books and videos. I may even put some training discounts up there if they come my way.
Monday, June 18, 2007

Managing Jobs in SQL2K

I'm actually double posting this here and at Underground just to help get the word out.

OK, from time to time this topic comes up so I thought I'd publish this info in the hopes that someone will find it useful.

The problem is giving non-admins the ability to manage jobs in SQL2K. It's widely known that only SAs can do this, but there's a special DB role in msdb that lets you do it w/o having to be an admin. Rather than give you a big explanation, here's the code so you can test it yourself. This will let anyone in this group admin SQL jobs just as if they were SA.


use msdb

-- Create a test login (password 'testme', default DB msdb) and add it as a user
-- in msdb, dropping it into the TargetServersRole role.
exec master..sp_addlogin 'testme', 'testme', 'msdb'
exec msdb..sp_adduser 'testme', 'testme', 'TargetServersRole'

-- Grant execute on the job procs to the role so its members can manage jobs
-- without being SA.
grant execute on sp_add_job to TargetServersRole
grant execute on sp_add_jobstep to TargetServersRole
grant execute on sp_update_job to TargetServersRole
grant execute on sp_add_jobserver to TargetServersRole
grant execute on sp_add_jobschedule to TargetServersRole
grant execute on sp_delete_jobschedule to TargetServersRole
grant execute on sp_delete_job to TargetServersRole
grant execute on sp_start_job to TargetServersRole
grant execute on sp_stop_job to TargetServersRole

I hope this helps somebody. If you want to give the ability to admin only specific jobs though, you'll still have to use one of the intermediate steps you did before, or make the user the owner of the job.
Too bad groups can't own jobs.
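
And for the single-job case, making the user the owner is just one call. Something like this should do it (the job name here is just a placeholder):

-- Hand a single job to a specific login by making them the owner.
exec msdb..sp_update_job
    @job_name = 'Nightly ETL',
    @owner_login_name = 'testme'
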
Tuesday, June 12, 2007

Do You Care About the New Cert?

Today on SSWUG, Steven talked about the new DB Architect cert from MS. I have no problem with what he said, and in fact, I even agree with most of it. However, it's really going to be hard to make any of us care about it when pretty much every company out there acts like DB work is so easy that pretty much anyone can do it. Companies go through DBAs like candy with this laissez-faire attitude that any DBA is as good as the next.

It's really getting kinda sickening. Somehow they think that their IT staff is so replaceable that they can do anything to us they want and just go get another one. Well, I don't know about any of you guys, but I don't remember seeing any DBAs standing on the work wall at 7-11.

I really think that most of the problem is that companies still don't see the real benefit of putting the time in on their apps before they hit production. Most companies still just rush things together and test their apps under fire. They have the coders write the DB schema and architect the entire solution. Well, I'm sure that reading a couple .Net books qualifies you as an expert in networks, storage platforms, DB schemas, indexing, etc. Until companies start to realize that all of these things can't be done by one person, and that they're all a lot harder than they sound, we won't get any real respect. See, once they make that leap and finally see that these things matter, they'll actually start listening to their DBAs and treating them like they're valued members of the company with something very important to contribute.

Now, nothing has really spawned this post except me thinking about how trivial this new cert is going to be. If companies don't even see the need for this stuff, then our precious new cert certainly isn't going to mean anything to them. So basically, we're getting certs to impress other geeks at conferences. It's almost like we're displaying our Klingon ranks for each other to see.

And I know what you're thinking... I'm completely wrong and companies are really into architecting real solutions. Well, if that were true, then we wouldn't need compliance audits for basic IT processes. Hell, we have to fight companies for even an extra week to benchmark a new app, much less for the money to buy the benchmark software. It's pathetic. I realize that it's a pain in the ass to have to hold off on your deployment for something as worthless as benchmarking, but you're just gonna have to start sucking it up.

Anyway, that's all I've got.
Friday, June 01, 2007

Pathetic Vendor Scripts

We're currently setting up Informatica and we got some scripts from them to create the DB. There were 3 scripts in all. The first script went ok. The 2nd had some problems, but it was small enough that I was able to just fix them and get past it in fairly short order.

The 3rd however was just pathetic. Not only was it huge, but it was plagued with dozens upon dozens of errors. The header of the scripts says that they were written and tested in SQL Server 7.0. WHAT? You haven't updated your scripts in 10yrs? Not only that, but with the errors I was getting, there's just NO WAY they ever even ran on SQL 7. It's mathematically impossible. Here are the problems I remember... there may have been more.

1. Dropping tables before they've even been created. They should have wrapped the drops in an IF EXISTS check.
2. They were dropping PK (parent) tables before the FK (child) tables that reference them were dropped. Excuse me guys... have you ever heard of referential integrity?
3. They were creating relationships between columns with data types that didn't match, so I got TONS of errors that the two columns in the relationship had different data types. I don't even see how this is possible.
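
Just to show it's not hard, here's roughly what those drops and creates should have looked like (the table names are made up):

-- 1. Only drop a table if it actually exists.
if object_id('dbo.OrderDetail') is not null
    drop table dbo.OrderDetail
if object_id('dbo.Orders') is not null
    drop table dbo.Orders

-- 2. Notice the FK (child) table goes first -- you don't yank the PK (parent)
--    table out from under the tables that reference it.

-- 3. And both sides of the relationship use the same data type.
create table dbo.Orders (
    OrderID int not null primary key
)
create table dbo.OrderDetail (
    OrderDetailID int not null primary key,
    OrderID int not null references dbo.Orders (OrderID)  -- int to int, like it should be
)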

Frankly, I'm not only disappointed, but somewhat shocked at the lack of skill a company like Informatica displays here. My mother could write better SQL than this. There's no way this code ever ran. I've even got a VMWare image with SQL 7 on it, and I ran it in there, and it bombed too, so don't give me any of that wrong version of SQL crap.

It really does make you wonder though. How many people are actually using Informatica? Because for them to send a customer such pathetic code, it clearly hasn't been tested... EVER.

And this isn't the only bad code I've ever received from a vendor. It seems like somehow none of them can write SQL that runs worth a damn. It really doesn't seem that hard, does it? I've written hundreds of scripts, and when I'm done, I run them. They either run well or they don't, and if they don't run without errors, I fix them. Go figure! Some mistakes are understandable, but like I said, there's no way that dropping a table before it's created has ever run without an error. The code was simply never run against a DB.

I can probably tell you what happened though. The developer already had the schema in his DB and he was getting errors when he needed to create the tables so he added drop statements. I guess it never crossed his mind that someone might be installing for the first time. The other errors however... there's just no way he ever got the rest of that script to run.
Tuesday, February 06, 2007

Work with me here

I'm sitting at home really sick today, so do me a favor and just assume I would've said something really inspiring and groundbreaking about backup or benchmarking.

Hopefully I'll be back online tomorrow.

Monday, February 05, 2007

MultiCast Mirroring

I was talking to a vendor the other day, and he said he was working on a 3rd-party solution that would allow you to mirror a single DB to several servers at once. I've officially coined the phrase MultiCast Mirroring. I think this is the next logical step in the mirroring process, and while it wouldn't be practical to run every mirror with safety on, you could use it to keep several types of systems more or less in sync. Of course, I would expect that at least one of the mirroring sessions would have safety on, but not all of them.
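
Just to anchor the safety part of that, here's how it's toggled on today's one-to-one mirroring sessions (the DB name and endpoint address are placeholders):

-- Point the principal at its mirror partner.
alter database Sales set partner = 'TCP://mirrorbox.mydomain.com:5022'

-- High-performance (asynchronous) mode -- what most of the extra mirrors would run:
alter database Sales set partner safety off

-- High-safety (synchronous) mode -- what you'd want on at least one of them:
alter database Sales set partner safety full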

I know what you're all thinking... if you need to do something like that, then why not just use replication? Well, that's actually a fair question, and as we've all seen, repl and mirroring aren't the same thing at all. They're different approaches to similar problems, but they do have major differences. I'm not going to go into all the tired arguments right now; most of us know what they are. Let me just say that repl adds a complication that doesn't need to be there.

One interesting thing: I've been going over the architecture of what would make this possible, and under the covers it would need to look a lot like repl. To avoid any read conflicts with the primary log, you'd probably end up shipping the single log to something like a log distribution server, where any server that wants to attach itself as a mirror could do so. So now we're looking at kind of a publisher/subscriber model. It's still easier to keep up with than repl, though.

Anyway, this is all just my rambling, but it would still be interesting to see something like that.
Tuesday, January 30, 2007

It Pays to be Paranoid

You know, sometimes it just really pays to be overly paranoid, especially about backups. Here's the deal...

I've got my home web server and database. I had an array controller start acting up on me about a month ago and I knew I was gonna have to replace it. So I started the process of making sure I had good backups of all my data in case the controller went out before I had a chance to fix it. I copied my DBs and websites to a different partition on the same box. I also backed them up to my NAS. I also copied them over to a different server. And just to be sure, in case the backup was corrupt (which is a real possibility with Windows backup), I copied the raw files to the NAS as well. OK, so I've got all of this paranoia going on here, but this is very important data to me and I just can't afford to lose it.

So I got my array back up and everything up and running again. No problems. Now, this weekend, I lost a drive on that array. And of course, I'm using Raid-0. Why? Well, because disk is expensive and this is my home system, and sometimes it's hard to justify the extra cost of making things redundant. Well, I've been running backups to the NAS pretty much every day, so I knew I was protected. Unfortunately, I found that my NAS had lost a disk too... and you guessed it... Raid-0 as well. OK, I could ordinarily go to that other server that I copied it to from the previous outage, but I'm in the middle of rebuilding that server for my new Exchange server. So that data's no longer there... great.

Now it looks like I've lost all my data. I can't get the array back, and I've lost something like 12yrs of data. It's a complete disaster. Well, last night after talking to one of those disk recovery folks, I was about to pack up my array and send it to them for the $3,000 data retrieval process, when I looked at the other partition on my server and there it was... the data I had copied over from the previous meltdown! I couldn't believe it. Fortunately, my data doesn't change often enough to really worry about a couple weeks going by between backups, so I haven't lost anything at all.

Let me just say that when it comes to backups, it pays to be extremely paranoid. If your data's that important, act like it. I thought that my NAS and other server would be good enough, but they clearly weren't. So always plan for a multiple failure if it's at all possible. You could have electrical problems that knock out more than one server, you could have a flood or other natural disaster, etc. I'm serious here. If the data you're protecting is that important, treat it as such. In fact, treat it like a newborn baby. Protect it and never let anything happen to it.

I'll definitely be revamping my backup strategy at home so nothing like this ever happens again. I guess the first part of that is to stop being so cheap and get off Raid-0. I just can't afford that anymore. So you guys learn from my almost misfortune and take a good hard look at your backups with a paranoid eye. Develop impossible scenarios and see if your strategy can overcome them.
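
If you want it in T-SQL terms, here's a minimal sketch of what paranoid looks like (SQL 2005 syntax; the DB name and paths are made up, and yes, the NAS share won't save you if it's RAID-0 too):

-- Two separate destinations, checksums on the way out...
backup database HomeWeb
    to disk = 'd:\backups\HomeWeb.bak'
    with init, checksum

backup database HomeWeb
    to disk = '\\nas\sqlbackups\HomeWeb.bak'
    with init, checksum

-- ...and don't just assume the files are good. Verify them.
restore verifyonly
    from disk = '\\nas\sqlbackups\HomeWeb.bak'
    with checksum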

Good Luck!
Thursday, January 25, 2007

Cut and Paste Code

This is just a little reminder to not be so lazy.  I know it's quick and easy to cut and paste code, especially for reports.  It saves a lot of time and you can really burn through projects that way.  But you just have to consider what you're doing.  Just because a section of code worked well before, that doesn't mean that it'll work well here.  Look before you leap.  If you want to cut and paste code, go ahead, but at least look at it to make sure it's actually what you need. 

Recently, I ran across a group of SPs that were clearly from the same template.  They all pulled 150+ cols into #tables and used only a couple of them.  The theory behind it was that the original author needed a few cols, and the rest of the authors only needed a couple, but since those were included in the select * the original guy pulled in, they decided to just leave it at that.  It's nothing but laziness, people.  Look at what you need and get it.  A DB is like a buffet: take all you want, but eat all you take.  If you need logic from a query, that's fine, but you should only use that query to help you cut some of your work, not to replace it.
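
To put it in concrete terms, this is the difference I'm talking about (table and column names made up):

-- What the cloned SPs were all doing: dragging 150+ columns into a temp table
-- and then using two of them.
select *
into #Everything
from dbo.CustomerOrders

-- What each report actually needed:
select CustomerID, OrderDate, OrderTotal
into #JustWhatINeed
from dbo.CustomerOrders
where OrderDate >= '20070101'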

So, do the work, think about what you need, and do what's right.  And another thing... cut-and-paste code leads to some other really bad things, like in the case of this code I just fixed.  Not only did the SP get cloned, it wasn't good code to begin with.  So instead of the other report writers figuring out how to write the queries themselves, and possibly stumbling on a better way, they just pushed the same bad code out there.  Don't just assume that because code is published in the DB, it's good.  Take the code if you need it, but only take what you can use, and definitely check it somehow to make sure it's written well to begin with.  If you don't know how to check it, there are plenty of resources.  Start with your own DBAs, and if you can't get any love there, try the internet... books... training CDs... user groups... seminars... newsgroups, etc.

Tuesday, January 23, 2007

Database Spam

I love this. I logged into my Blogger account about 3 weeks ago to post a blog and they had a message up saying that they had a new version with all these new features. I upgraded because it sounded cool, and the first thing they did was mark my blog as spam and disable it.

I filled out the form to get it turned back on and it just came up today. It always pays to upgrade.
Thursday, January 04, 2007

Don't Blow it Off

From time to time I get saved only by the grace of good practice. I've got this server that has had fairly unreliable backups for a while now. It's really hit or miss as to whether the whole thing will be backed up on any given day. Yesterday was one of those days when the backups failed. It was very tempting to just let it go and have it pick up the backups the next day, but I bit the bullet and did what it took to run the backups by hand instead. What can I tell you... last night, the NOC guy who runs the ETL process messed up and hosed 2 DBs on that server. At 2:30am I got a call that both of those DBs had to be restored. I was able to restore and get things back on track in just a few mins.
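
For what it's worth, the whole episode boiled down to a couple of statements like these (the DB name and paths are placeholders):

-- Running the failed backup by hand earlier that day:
backup database ETL_Stage
    to disk = 'e:\backups\ETL_Stage.bak'
    with init

-- The 2:30am call:
restore database ETL_Stage
    from disk = 'e:\backups\ETL_Stage.bak'
    with replace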

Take a page out of my book and never blow off backups. You never know when you'll be called on to restore. Anything can happen at any time and you need to be ready. I know sometimes it's a beating to make sure the troublesome backups complete, but one day it'll really save the day... and maybe your job.

Remember, a DBA is only as good as his last backup.

About Me

Sean McCown
I am a Contributing Editor for InfoWorld Magazine, and a frequent contributor to SQLServerCentral.com as well as SSWUG.org. I live with my wife and 3 kids, and have practiced and taught Kenpo for 22yrs now.
