T-SQL Tuesday – SQL Family and community

This month’s T-SQL Tuesday is being hosted by Jeffrey Verheul (B|T). Our topic of the month is SQL Family. Check out the host page here.

Here is our task:

“How do you feel about SQL Family? Did they help you, or did you help someone in the SQL Family?”

I feel that the SQL Family is a great online community that helps each other out wherever and whenever they can. That said, I don’t actually like to use the term “SQL Family” myself; I prefer to keep work and personal life separate. Still, when talking about my job and why I like what I do, one of the things I always bring up is the great online community.

When I first started learning about databases, I lived on sites like SQL Server Central and MSSQLTips. I loved that people had tons of questions and other people just answered them…ACTIVELY. What was even better was that people weren’t jerks about answering basic or n00b questions.

Then I was introduced to Twitter. I started using it when I was working in New Zealand. My boss at the time encouraged us to use Twitter, but I never really thought it was useful. Then I met the #SQLHelp hashtag. That is when I really understood what it was all about. For me, it’s like an ongoing discussion in a huge room of DBAs, developers, and sysadmins. Great use of Twitter, and almost always a good discussion.

As for me helping someone, I am currently on the committee that puts together Pittsburgh’s SQL Saturday (did you sign up yet?!). For the last 3 years I have helped organize this event, and it’s been a great experience working with some great people. We periodically meet to discuss the event. It takes quite a bit of work to pull these off: from organizing speakers and figuring out what’s for lunch (no pizza!) to actually getting people to come. The reward comes the day of the event, though. It’s great seeing the last couple months’ worth of work all come together.

Overall, I think the community as a whole is the best part about being a SQL DBA. Most of the time, every problem you’ve ever had has already been seen by someone else, and 99% of the time that person will be willing to help you out with it. And if you do hit a genuinely new problem, there will always be about 50 people on #SQLHelp or Stack Exchange to give you some ideas of what to check. Thanks, SQL peoples!

 


T-SQL Tuesday – Assumptions

This is my first attempt at contributing to T-SQL2sDay. The topic of the month is Assumptions. 

Assignment – Write about a big assumption you encounter at work, one that people are uncomfortable talking about.

I work for a vendor and our application(s) don’t live on a single server but on many different servers. We have message boxes, SQL servers, application servers, and so on.

When we get a call/email about our applications being slow or timing out, we usually log into the server and try to take a look and see where the pain point is. A lot of the blame is usually put onto SQL or the application.

Here is where the assumption comes in.

As database administrators working for a vendor and dealing a lot with the support team, we assume that the hardware was specced out and properly configured. Many times during conference calls this uneasy topic comes up. Sometimes these conversations go something like:

“Your application isn’t working, it’s so slow. We keep getting timeouts”

“We might need to give it a bit more memory, my laptop has more resources than the database server”

It’s not a conversation a lot of people want to have but sometimes it really isn’t the application or it really isn’t SQL’s fault.

Many users of the system, the administrators, engineers, and executives all assume that all the magical hardware is the appropriate speed, size, etc. But is it really? Is your hardware adequate for what your application is trying to do?

 


SQL Saturday #250

I’m currently helping to plan Pittsburgh’s next SQL Saturday. It should be a good day with lots of good sessions by some new (see Steve Hood‘s post here) and some returning speakers.

I’m personally looking forward to the session on Statistical Semantic Search. It’s something I haven’t used before.

Here is the current brochure. If you haven’t signed up yet, what are you waiting for?!

Welcome to SQLSaturday #250 in Pittsburgh

http://www.sqlsaturday.com/250/

SQLSaturday is a full day FREE training event for SQL Server professionals and those wanting to learn more about SQL Server and network with the SQL Server community. Our event will be held Saturday, September 14, 2013 at ITT Technical Institute, 5460 Campbells Run Road, Pittsburgh, PA 15205. Admittance to this event is free, but we charge a fee of $10.00 to help defray the cost of lunch, coffee/tea, and water, which will be provided to attendees throughout the day.

Pick from more than 30 different sessions and learn from the experts!! Check the schedule to take a peek at the sessions and the speakers. Some of our sponsors are interested in hiring, so don’t forget to stop by the sponsors’ table during the event.
Please register soon as seating is limited, and let friends and colleagues know about the event.

New this year!! An entire track for “Wanna be DBAs”. Don’t know anything about SQL Server? This track is for you!! Explore the life of a DBA and see if it can be yours too. It is mainly focused towards students and novices. Once you complete your registration at http://www.sqlsaturday.com/250/eventhome.aspx, also register via Eventbrite at https://wannabeadba.eventbrite.com/ (only if you want to attend the “Wanna Be a DBA” track).

Check “Track 0” under the schedule at http://sqlsaturday.com/250/schedule.aspx for session details.

Don’t forget to follow us on Twitter @SqlSaturdayPitt and join the conversation with the #SqlSat250 hashtag



PreConference Training (Optional and prepaid) – Friday Sept 13, 2013

Introduction to Performance Tuning in SQL Server 2012 presented by John Sterrett and Konstantin Melamud
Have you always wanted to dig into performance tuning but weren’t sure where to start? During this all-day pre-con you will find out which SQL statements are causing your performance problems and be able to apply basic concepts to improve your general performance. You will also get all the free tools and scripts the pros use to tune their systems.

Eventbrite - Introduction to Performance Tuning with SQL Server 2012


Tempdb Destroyer

In one of my previous posts, I covered how to create a really big database full of garbage data. Recently, I was given a task to blow up tempdb. I thought about using a similar script and tried it out.

Of course it worked and it blew up tempdb, but it also errored out. This wasn’t acceptable. I needed to find a way to blow up tempdb without it erroring out.

I tried a couple little things but ended up using the following script:

SET nocount ON
 
USE tempdb
 
IF OBJECT_ID('tempdb..#freespace') IS NOT NULL
DROP TABLE #freespace
IF OBJECT_ID('tempdb..#BigGuy1') IS NOT NULL
DROP TABLE #BigGuy1
IF OBJECT_ID('tempdb..#BigGuy2') IS NOT NULL
DROP TABLE #BigGuy2
IF OBJECT_ID('tempdb..#BigGuy3') IS NOT NULL
DROP TABLE #BigGuy3
IF OBJECT_ID('tempdb..#BigGuy4') IS NOT NULL
DROP TABLE #BigGuy4
IF OBJECT_ID('tempdb..#BigGuy5') IS NOT NULL
DROP TABLE #BigGuy5
GO
 
CREATE TABLE #BigGuy1 ( column1 CHAR(7100), column2 CHAR(900))
CREATE TABLE #BigGuy2 ( column1 CHAR(7100), column2 CHAR(900))
CREATE TABLE #BigGuy3 ( column1 CHAR(7100), column2 CHAR(900))
CREATE TABLE #BigGuy4 ( column1 CHAR(7100), column2 CHAR(900))
CREATE TABLE #BigGuy5 ( column1 CHAR(7100), column2 CHAR(900))
 
CREATE TABLE #freespace
( drive VARCHAR(1),
[MB Free] INT)
 
DECLARE @drive VARCHAR(1),
@free INT
 
SELECT @drive = LEFT(physical_name, 1)
FROM sys.master_files
WHERE database_id = DB_ID('tempdb')
AND TYPE = 0
 
INSERT INTO #freespace
EXEC master..xp_fixeddrives
 
SELECT @free = [MB Free]
FROM #freespace
WHERE drive = @drive
 
WHILE @free > 1
BEGIN
INSERT INTO #BigGuy1 VALUES('test', 'test')
INSERT INTO #BigGuy2 VALUES('test', 'test')
INSERT INTO #BigGuy3 VALUES('test', 'test')
INSERT INTO #BigGuy4 VALUES('test', 'test')
INSERT INTO #BigGuy5 VALUES('test', 'test')
INSERT INTO #BigGuy1 VALUES('test', 'test')
INSERT INTO #BigGuy2 VALUES('test', 'test')
INSERT INTO #BigGuy3 VALUES('test', 'test')
INSERT INTO #BigGuy4 VALUES('test', 'test')
INSERT INTO #BigGuy5 VALUES('test', 'test')
INSERT INTO #BigGuy1 VALUES('test', 'test')
INSERT INTO #BigGuy2 VALUES('test', 'test')
INSERT INTO #BigGuy3 VALUES('test', 'test')
INSERT INTO #BigGuy4 VALUES('test', 'test')
INSERT INTO #BigGuy5 VALUES('test', 'test')
INSERT INTO #BigGuy1 VALUES('test', 'test')
INSERT INTO #BigGuy2 VALUES('test', 'test')
INSERT INTO #BigGuy3 VALUES('test', 'test')
INSERT INTO #BigGuy4 VALUES('test', 'test')
INSERT INTO #BigGuy5 VALUES('test', 'test')
INSERT INTO #BigGuy1 VALUES('test', 'test')
INSERT INTO #BigGuy2 VALUES('test', 'test')
INSERT INTO #BigGuy3 VALUES('test', 'test')
INSERT INTO #BigGuy4 VALUES('test', 'test')
INSERT INTO #BigGuy5 VALUES('test', 'test')
 
DELETE #freespace
 
INSERT INTO #freespace
EXEC master..xp_fixeddrives
 
SELECT @free = [MB Free]
FROM #freespace
WHERE drive = @drive
 
PRINT @free
 
END

This one uses a loop to check how much free space you have left, and you can set the threshold. For example, if you want to leave 100MB free, just change @free > 1 to @free > 100.
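As a quick sketch of that tweak (same @free variable and #freespace refresh as in the script above):

```sql
-- stop filling once less than 100 MB remains on tempdb's drive
WHILE @free > 100
BEGIN
    INSERT INTO #BigGuy1 VALUES('test', 'test')
    -- ...the rest of the inserts and the #freespace refresh from the script above...
END
```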

A word of caution: do NOT run this on your production system. We used this for a training session in a TEST environment.

The nice part is that when you are finished expanding your tempdb, you can just restart your SQL services and tempdb will be recreated at its original size.
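If you want to watch the growth while the script runs, or confirm the files came back down after the restart, a quick query against tempdb’s file metadata does the trick (the size column is stored in 8 KB pages):

```sql
-- current size of each tempdb file, in MB
SELECT name, size * 8 / 1024 AS size_mb
FROM tempdb.sys.database_files;
```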

Have fun!

 


#sqlsat171 – behind the scenes

This year was the first year Pittsburgh had a SQL Saturday. Luckily, I found out about it a couple months beforehand and was able to get involved. I didn’t help out that much, but I was still able to lend a hand. It was the first time I was able to be part of this kind of event. In the past I have always been the one who went to the sessions, but this time I got to see behind the scenes.

It started when I signed up (one of the first days!) and checked the “would you like to volunteer” checkbox during the registration process (which was very easy, btw). I then received a series of emails about in-person meetings or phone conference meetings. I was able to attend a couple of the phone calls but didn’t really have much input to offer. The only in-person meeting I was able to attend was one after our local user group meeting. It was a great one to go to because it was where we picked out all the sessions. It was a long process, but I think we were all satisfied with the outcome. We had a solid, jam-packed day planned.

After that, I helped out a bit by contacting the former Chicago SQL user group president, Bill Lescher, and asking for some words of wisdom. He had some good advice about getting sponsors. I also contacted Pluralsight.com about sponsoring the event (which they did!!!)

So then some more emails and then finally the day came. Volunteers had to be there at 6am so that meant I had to wake up at 5am to get ready and drive up to the north side of Pittsburgh. Of course I had to stop on the way and pick up some caffeine in Monster form.

ready?????
ready!

 

My duty for the day was to be a room proctor. I was in charge of getting the speakers what they needed, handing out and collecting evaluations, and making sure we stayed within the time limits. It was an easy job because I got to attend sessions at the same time.

So how were the sessions? GREAT! It was nice to spend a whole day listening to and learning about a technology that you find interesting.

I started off my day with a nice intro to Service Broker by Adam Belebczuk. Service Broker can get a bit complicated if you have never used it before, and I thought Adam did a great job cramming a ton of info into 1 hour. Great presentation skills as well (ZOOM IT!).

Next up was Dean Richards talking about VMware. It was another good introduction to VMware; if you had never used it before, this was your course. For me it was a nice recap.

The final morning session was Mike Hillwig’s session on 7 SQL Agent jobs you should be running. It was a very entertaining session that showed some great methods for DB maintenance. Personally, I like to use PowerShell for monitoring errors across multiple servers, but Mike’s ideas were also very good. I hope to see more sessions by him in the future.

LUNCH BREAK – The only takeaway I got from lunch was that college students don’t get dressed on weekends. The event was held at La Roche College, and we didn’t eat until around 12:30. While I waited in line to get food with my fellow SQL Sat-ers, I noticed that every single college student I saw was still in their PJs or sweatpants. It was fall break, though. Must be nice.

After lunch I went to Justin Dearing’s session on MongoDB. It was very cool to be exposed to NoSQL. I’ve read about it before but had never seen it up close. It’s something I doubt I will ever use, but now I feel I have a better understanding of it.

The next session was Michael John’s Top Development and Design Mistakes. It was a packed room for some anecdotes from Michael’s experience in the field.

My final session of the day might have been my favorite: Sam Nasr’s 14 new SQL functions in 2012. It was exactly what the title says it was. We went over them one by one in a session that felt more like a class, with the whole group engaged. We actually had some time left over and explored the functions a bit more in depth.

After the sessions finished everybody filed back into the main hall for the giving out of the prizes and the final closing remarks.

FREE STUFF!!

After the day was over, some people headed to the bar for a couple of SQL drinks. It was a nice way to finish off a great day. Overall, I enjoyed helping out (the little bit I did) and thought it was a great thing for the city of Pittsburgh. I hope we have more of these and that the community continues to support community training.

Thank you to everybody involved!!

 

 

 


Changing physical locations of all DBs

I got a request this week from an application engineer. The request was to move ALL physical database files from one physical drive to another physical drive on the same machine.

Hmmmm, I had never moved ALL the .mdf files from one place to another at one time. One of my teammates told the application engineer that it might take some time. He wanted to detach/move/attach all the DBs.

I knew this wouldn’t work because:

1) it would take forever

2) how are you going to detach the master DB?

No Detach for YOU!

So my solution contained 3 parts:

  1. Change the location of all the files inside SQL
  2. shutdown and move the files
  3. Change SQL’s startup parameters for the master database

Easy as π.

I used some good old dynamic SQL to create an ALTER DATABASE statement for each DB (except master):

--first, let's just take a look and see where our files are living
SELECT name, physical_name,
CASE
WHEN TYPE = 0 THEN 'DATA'
WHEN TYPE = 1 THEN 'LOG'
END AS FileType
FROM sys.master_Files
 
--Now let's create those ALTER DB statements
SELECT 'ALTER DATABASE ' + d.name + ' MODIFY FILE ( NAME = ' + f.name + ', FILENAME = ' +
 
--here is where you want to put where you want to move your files
 
' ''C:\NewFolder\' +
RIGHT(physical_name, CHARINDEX('\', REVERSE(physical_name)) - 1) + ''' ) '
FROM sys.databases d
INNER JOIN sys.master_files f ON d.database_id = f.database_id
WHERE
 --remove this if you want both the DB AND the log file
 TYPE = 0
 --and keep this part, dbid 1 is master
 AND d.database_id <> 1
 
--Check again to see if the physical name is changed
SELECT name, physical_name,
CASE
WHEN TYPE = 0 THEN 'DATA'
WHEN TYPE = 1 THEN 'LOG'
END AS FileType
FROM sys.master_files

After you run this, you should see this message a bunch of times:

The file “tempdev” has been modified in the system catalog. The new path will be used the next time the database is started.

Now you can shut down your SQL services and move the physical files to their new location.

Once you have all the files moved over (including the master) you’ll need to change the startup parameters.

To do this:

  1. Open up SQL Server Configuration Manager
  2. Right click on your SQL instance –> Properties
  3. Click on Advanced –> Startup Parameters
  4. Change the location of the master .mdf
Change the location here
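For reference, the startup parameters are one semicolon-separated list, where -d points at master’s data file, -l at master’s log file, and -e at the error log. Using the hypothetical C:\NewFolder location from the script above, the updated values would look something like this (your instance’s actual paths will differ):

```
-dC:\NewFolder\master.mdf;-eC:\Program Files\Microsoft SQL Server\MSSQL\Log\ERRORLOG;-lC:\NewFolder\mastlog.ldf
```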
Once you have updated your startup parameters, you should be good to go. I want to stress “should” here because you have to make sure that your service account has access to the new location. If it doesn’t, SQL won’t start.
I tried this a couple times on my test machine and then in the production environment. I was hoping to “wow” the customer with the speed of completing this task; unfortunately, their disks were really slow and moving the files took the majority of the time.

WTF is the DL on COTS EMR DBs?

Well, it seems I have missed April and allllllmost May. I had a nice 2.5 month vacation from work, but just this week I started a new position in Pittsburgh. During my time off I monitored #sqlhelp, read some articles, and tried to keep up with the ever-changing world of databases. I also did a lot of relaxing, played too much Skyrim, and traveled quite a bit.

The company I started working for is having a conference for their users this week so almost no one is in the office. It gave me a good chance to read through their documentation.

It’s very easy to get lost in documentation when you are new to a company, especially when most companies these days use lots and lots of acronyms. Luckily, a company acronym dictionary was given to me in my intro packet. It came in handy while reading through documents and meeting with various people. Even so, it’s easy to get lost in conversations about even simple things.

Do you use a lot of acronyms at your company? Do you document them? Love em/Hate em?


1 year old sandwich

It’s been an interesting day. It was the 1 year anniversary of SQLSandwiches today. It was also my last day working in New Zealand. That’s right, I decided to leave my current role as DBA at Foodstuffs. Mrs. Sandwiches and I are headed back to the States to be closer to our family.

Foodstuffs was my second job in New Zealand. I enjoyed my time working there. My managers were great and let me database my day away. I learned a lot and worked with great people. There is a reason people stay there for a long time, and after only a short while, I could see why. I also had the best ID ever there.

you got something better?

Back to the sandwiches. I’ve totally slacked off on the blog so far this year in preparation for this big move. Now that I will have some downtime, I can catch up on all the half-written blogs I have in my backlog. I’ll be taking the next couple weeks off to travel around Japan and then to the States! Until then it’s going to be lots of reading, running, blogging, and studying for some exams.

So here’s to 1 year blogging under SQLSandwiches and the end of a short but great job in Auckland.

BEER!!!

Final words to Foodstuffs…David, you can keep DTM! HA!

w...t...f....

Find failed SQL jobs with powershell

This weekend I went to SQL Saturday in Auckland. It was nice to interact with other DBAs again. Kent Chenery (blog | twitter) did a nice session on PowerShell for n00bs, and it got me thinking that I need to start posting some of my PowerShell stuff up here.

When I started my current job, I was given a handover document that had a list of standard operating procedures, some links to useful info, and a “daily morning checks” list. When I first read this list I was amazed. It went something like this:

  1. Log on to server A
  2. Open up management studio
  3. Open up SQL Agent’s jobs
  4. Order by last outcome
  5. Rerun the jobs that failed
  6. Repeat for servers B, C, D….and on

Ouch…this was all done manually, and it would take way too long just to “check” whether these jobs failed. To fix this issue I turned to my good friend – PowerShell. I came up with a little script to hit all my servers, check the SQL jobs, and mail me if a job failed in the last 24 hours.

#Find Failed SQL Jobs with Powershell
#by Adam Mikolaj
#www.sqlsandwiches.com
 
 
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | out-null;
 
 
#let's get our list of servers. For this, create a .txt file with all the server names you want to check.
$sqlservers = Get-Content "C:\DevWork\scripts\computers.txt";
 
#we'll get the long date and toss that in a variable
$datefull = Get-Date
#and shorten it
$today = $datefull.ToShortDateString()
 
#let's set up the email stuff
$msg = new-object Net.Mail.MailMessage
$smtp = new-object Net.Mail.SmtpClient("ExchangeServerNameHere")
$msg.Body = "Here is a list of failed SQL Jobs for $today (the last 24 hours)"
 
 
#here, we will begin with a foreach loop. We'll be checking all servers in the .txt referenced above.
foreach($sqlserver in $sqlservers)
{
 
    #here we need to set which server we are going to check in this loop
    $srv = New-Object "Microsoft.SqlServer.Management.Smo.Server" $sqlserver;
 
        #now let's loop through all the jobs
        foreach ($job in $srv.Jobserver.Jobs)
        {
            #now we are going to set up some variables. 
            #These values come from the information in $srv.Jobserver.Jobs
            $jobName = $job.Name;
            $jobEnabled = $job.IsEnabled;
            $jobLastRunOutcome = $job.LastRunOutcome;
            $jobLastRun = $job.LastRunDate;
 
 
            #we are only concerned about jobs that are enabled and have run before. 
            #POSH is weird with nulls so you check by just calling the var
            #if we wanted to check isnull() we would use !$jobLastRun  
            if($jobEnabled -eq $true -and $jobLastRun)
                {  
                   # we need to find out how many days ago that job ran
                   # (use the DateTime object, not the short date string)
                   $datediff = New-TimeSpan $jobLastRun $datefull
                   #now we need to take the value of days in $datediff
                   $days = $datediff.days
 
 
                       #gotta check to make sure the job ran in the last 24 hours     
                       if($days -le 1 )                    
                         {       
                            #and make sure the job failed
                            IF($jobLastRunOutcome -eq "Failed")
                            {
                                #now we add the job info to our email body. use `n for a new line
                                $msg.Body = $msg.Body + "`n `n FAILED JOB INFO: 
                                 SERVER = $sqlserver 
                                 JOB = $jobName 
                                 LASTRUN = $jobLastRunOutcome
                                 LASTRUNDATE = $jobLastRun"
 
                            }    
                          } 
                }
 
 
        }
}
 
#once all that loops through and builds our $msg.Body, we are ready to send
 
#who is this coming from
$msg.From = "adam@sqlsandwiches.com"
#and going to
$msg.To.Add("adam@sqlsandwiches.com")
#and a nice pretty title
$msg.Subject = "FAILED SQL Jobs for $today"
#and BOOM! send that bastard!
$smtp.Send($msg)

Just set this script up to run every morning as a scheduled task and you can enjoy a nice SQL sandwich instead of plowing through SQL Agent jobs.
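If you haven’t created a scheduled task from the command line before, a one-liner like this gets it going (the script path, task name, and start time here are just placeholders):

```
schtasks /create /tn "Check Failed SQL Jobs" /tr "powershell.exe -File C:\DevWork\scripts\FailedJobs.ps1" /sc daily /st 06:30
```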

Next up, I’ll do the same job but use the SQLPSX extensions. Wonder which is easier…


Bring it on 2012

Well, I’m about a week and a half late with the whole New Year blog post but I figure better late than never.

So just a little recap of my year. 2011 was the first year I started blogging about SQL. It was pretty rewarding and I hope to do more of it this year. When I got in the zone, things just flowed, but when work or other life distractions got in the way, things slowed down a bit.

Last year I was able to learn a good bit about:

  • Powershell, Powershell, Powershell – I am still at the beginning of my journey but some of the things I have done so far have saved me so much time.
  • Access (unfortunately) – some critical systems got tossed my way that were dependent on Access. I had to learn how to deal with them. It wasn’t fun but it was awesome once everything worked (for that week).
  • Big DBs – This year I got to work with some monsters. It’s been a good opportunity so far. It makes tuning even more fun.
  • Indexes – used, needed, overlapping, and of course…fragmented.
  • DBCC commands – Just when you think you know some of these, you learn a new one that totally changes how you work.
  • Twitter – this was the first year I ever used Twitter. I used to think it was the stupidest thing in the world. Then I started using it a bit and holy crap, a whole new world opened up. I can’t believe the people I have interacted with on there. SQL has a great community and I hope I can give back a little more this year.
  • Double Hopping and SPNs – every DBA needs to go through this at least once.
  • Extended Events – Profiler? Who needs that?

Overall it’s been another great year to be a DBA. I like what I do and just want to get better and increase the awesome meter.

This year I’ll be focusing a lot on PowerShell, puffy clouds of computers, SQL Server 2012 of course, Extended Events, and probably getting another cert or 3. I WILL also present 1 or 2 times this year – something I have never done before. If you’ve got advice, send it my way.

The other area I am hoping to dip more into is BI. I have a feeling that once SQL Server 2012 comes out, there will be a big industry swing toward really good BI people. I’m hoping to jump on that wagon.

I will also be moving to the states in the near future. That is going to be a whole new world of adventure that I am ready for.

So 2012, bring it.
