Sorry I haven’t written in the past 2 weeks, but this is a very busy time of year in my house. There are a lot of topics about which I want to write, so I’ll be back very soon doing that. In the meantime, I wish you all a very merry Christmas and a joyous new year.
Recently, someone posted a question to Ask SQL Server Central:
The issue was that the last row in a text file contained the row count, so he needed to bulk import all but that last row.
My solution was to set MAXERRORS to 1 so the import would ignore the last row due to its error; any other row with an error would still fail the import. This reminded me of one of my least favorite defaults in SQL Server, so I decided to write about it here as well. A lot of people don’t realize this, but by default a bulk insert only fails after 10 errors. Why not 0? I’ve never understood that. If you keep the default and bulk insert 20 rows of data from a file, the import will only fail if more than half of the rows cause an error.
I generally recommend setting MAXERRORS to 0. In this case, I suggested setting it to 1. He can expect the last line to fail, so setting it to 1 means he’ll be okay as long as that’s the only bad line. Rows with errors, up to the limit, are simply skipped rather than imported.
Here is how to use the option for each of the 3 bulk insert methods.
BULK INSERT … WITH (MAXERRORS = 1);
SELECT * FROM OPENROWSET(BULK …, MAXERRORS = 1);
bcp -m 1
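To put that in context, here is a sketch of the original scenario using BULK INSERT. The table name, file path, and terminators are hypothetical, so adjust them to your environment:

```sql
-- Hypothetical staging table for the imported file.
CREATE TABLE dbo.ImportStaging
(
    Col1 int,
    Col2 varchar(50)
);

-- The trailer row (the row count) fails to parse and consumes the one
-- allowed error; a second bad row would fail the entire import.
BULK INSERT dbo.ImportStaging
FROM 'C:\imports\datafile.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    MAXERRORS = 1
);
```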
I suggest you use this option (with 0 or another appropriate value) in all bulk inserts to override the default of 10 … unless you’re okay ignoring the first 10 errors.
I have several topics I want to write about later this week, but first I want to quickly talk about the SQL Saturday DC event I attended a few days ago. For those of you who don’t already know, SQL Saturday events are free networking and training events held throughout the year all around the world.
This year was extra-special for me because I attended my first in-person training session with Brent Ozar at his “Performance Tuning When You Can’t Fix The Queries” pre-con on Friday. Without getting into the details, I’ll just say that it was informative and thought-provoking but light and humorous at times as well. If you have the opportunity to attend one, I highly recommend it.
The actual SQL Saturday event was a wonderful opportunity for me to learn and network. One of the highlights was finally meeting Steve Jones. As some of you know, I’ve been helping people at Ask SQL Server Central for years and was recently made a moderator there. Well, I can thank Steve for that opportunity because he is one of the founders of SQL Server Central. I also saw Monica Rathbun, whom I had previously only known online. I had the pleasure of meeting and having lengthy conversations with Chris Bell, William Wolf, and Reeves Smith. I won’t list everyone else here, but I met several other people and saw some friends I hadn’t seen in a while. All of that was in addition to sitting in some great sessions on SQL tuning (Brent Ozar), Service Broker (William Wolf), tSQLt (Steve Jones), and test-driven SSIS with Biml (Reeves Smith).
Special thanks to Chris Bell, Matt Velic, Paul Rizza, Andrew Notarian, and all of the others who helped make the event a success.
(I could have also titled this “Never Deny Yourself a Learning Opportunity”, but more on that later.)
This past Thursday, we celebrated Thanksgiving here in the United States. I have many things to be thankful for, but Thomas LaRock’s recent Twitter post helped me realize I have even more. This is what he said:
I am #thankful for the many failures I’ve had in life, as they have made me a better person and professional.
Wow. What a great thought. It reminds me of the old Michael Jordan commercial in which he listed his many “failures” as the number of shots missed, games lost, etc. Not only have we all failed many times in our lives, but I believe one of the keys to success is recognizing those failures and accepting them as important learning opportunities. A coworker of mine jokingly refers to bugs as “opportunities for improvement.” I think he may be on to something.
So, in that spirit, I will share with you an early and very memorable technical failure of mine. It was a long, dark night way back in the computing Mesozoic Era of 1996. I had worked with networks before (anyone remember LANtastic?) but had never used this exciting and relatively new product, Windows NT. My boss was happy to have me learn by configuring the new network overnight, but he insisted it was essential that no one have access to the system drive for fear someone could hack in through the modem. Okay. No problem. In fact, I quickly learned how easy it was. I simply denied read access to the C drive to this group I found called “Everyone.” Soon after that, I learned that the operating system accounts are part of Everyone, and I had effectively turned our server into a very expensive space heater. Well, after panicking and trying to figure out how to contact support, I eventually gave up and started over. After reinstalling the server OS, I proceeded to configure the clients. By the time it was all done, it was well after midnight, but my on-the-job training had been a success. I may have done it the hard way, but I had also learned the power of “deny.” That lesson proved invaluable years later when I began working with SQL Server security.
So, thank you, Thomas LaRock, for your inspirational post. And thank you, Tom Staab, for so many learning opportunities.
A coworker recently came to me with a problem. His query returned the set of data he expected … until he added a certain condition. Then, instead of returning a subset of the original result, it returned nothing at all. I sat down to take a look and noticed he was dynamically generating a SQL statement. He demonstrated his process of executing his script to text for review. Then he added his condition to one of his subqueries, and the output went blank.
Fortunately for both of us, I’ve dealt with this sort of problem many times in my own code as well as others’. The trouble was that his condition made the subquery sometimes return null, and all of the results (including the null) were being concatenated into a single SQL string. In case you don’t already know this, nulls are world-class cleaners, the kind your parents wanted you to be. Concatenating a single null to a string cleans up everything very neatly. In fact, everything disappears and is replaced with nothing (i.e., null). To solve his problem, I simply wrapped his subquery in an ISNULL function, and then it worked as expected with or without the filter.
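You can see the behavior for yourself with a pair of one-line queries (under the default CONCAT_NULL_YIELDS_NULL setting):

```sql
-- One NULL wipes out the entire concatenated string.
SELECT 'ABC' + NULL + 'XYZ';             -- NULL
-- Guarding the nullable piece with ISNULL preserves the rest.
SELECT 'ABC' + ISNULL(NULL, '') + 'XYZ'; -- 'ABCXYZ'
```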
The important lesson here is to always ensure each and every piece of a string cannot possibly be null. Prior to SQL Server 2012, this was typically done using the ISNULL function, but we now have an alternative. Rather than “adding” strings together and wrapping each in ISNULL, we can use the CONCAT function with substrings separated by commas. Let’s take a look at the 2 options:
DECLARE @str1 varchar(10) = 'ABC';
DECLARE @str2 varchar(10);
DECLARE @str3 varchar(10) = 'XYZ';
DECLARE @mystring varchar(20);
-- Prior to SQL Server 2012
SET @mystring = ISNULL(@str1, '')
	+ ISNULL(@str2, '')
	+ ISNULL(@str3, '');
-- SQL Server 2012 and newer
SET @mystring = CONCAT(@str1, @str2, @str3);
Both methods guarantee a null won’t eliminate your string value, but as you can see, the newer CONCAT function is much simpler to use. If any value is null, the function automatically (implicitly) converts it to an empty string. As an added bonus, it will also convert any non-string value that supports implicit conversion to a string.
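For example, numbers and dates can be passed straight to CONCAT without a CAST or CONVERT. The variable here is just for illustration:

```sql
DECLARE @qty int = 42;
-- CONCAT implicitly converts the int and the datetime to strings.
SELECT CONCAT('Shipped ', @qty, ' units on ', GETDATE());
-- The pre-2012 style would need CONVERT(varchar(10), @qty), etc.
```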
So remember, contrary to what your parents may have told you about your bedroom, it is possible to be too clean. When that long string you’ve been building suddenly looks a little empty, it may have been swept up by a sneaky little null.
Everyone has questions. It’s just important not to be afraid to ask, and then to know whose advice is worth listening to. Fortunately, I know just the place for you to go with those pesky SQL Server questions. The name says it all: Ask SQL Server Central. I joined the site just over 6 years ago and jumped right in. It follows the typical style of someone asking a question, people submitting answers and/or comments, and people voting up or down based on the quality of a question, answer, or comment. I enjoyed it, both as a learning tool and as a way to give back some of what I had learned. I was committed for about a year, but then I just stopped.
For several reasons (or mostly excuses), I didn’t do much there or anywhere else in the SQL Server community for about 4 years until early this year. I’m sure many of you can relate. I had the desire, but once I stopped, it just became “too hard” or “took too much time” for me to get started again. But that finally changed this past January. I had missed the local SQL Saturday event in DC the month before due to vacation, and something just clicked inside. I was disappointed I had missed it, and I had to get back out there in the SQL community. That may sound silly if you don’t know me, but those who do know how much I love learning, teaching, and sharing.
Fast-forward 10 months, and I’m happy and proud to say that I was recently promoted to be a moderator at AskSSC. This is an honor and privilege I do not take lightly. In the past 6 years, over 15,000 questions have been asked for which over 27,000 answers were provided. The site is frequented by such prominent members of the SQL Server community as Grant Fritchey, Kev Riley, Thomas Rushton, Tim Radney, and many others. In fact, we have a lot of people with centuries of combined experience who contribute to the success of AskSSC. So come join us to ask your questions, provide some answers, or maybe do a little bit of both. I hope to see you there.
I considered not writing my original topic now and instead writing about my reaction to the tragedy in Paris Friday night. The words shocked, horrified, and scared all come quickly to mind whenever I think about it. I think about the victims and their families. I wonder about the psychological effects on the survivors that can weigh on them long after the physical structures have been repaired. How long does it take one to feel safe going to a concert again when your friends were shot dead in front of you the last time? I struggle to think about anything else important enough to write about. But then I think about the terrorists responsible for this. I don’t want to give them the power. Of course I want to pay my respect to the victims of this tragedy, both dead and alive. But then I want to continue living if for no other reason than because that is exactly what the terrorists don’t want. Yes, I will remember November 13th with sadness for a very long time if not the rest of my life, but these terrorists will not chain me with fear. In fact, my original topic seems more important than ever now. Life is not a guarantee. It is a gift and a treasure, and every day our mission should be to give some of it back.
I still remember reading in Hakeem Olajuwon’s autobiography, Living the Dream, how important effort was to him. He believed that his talents were a gift from Allah and not using them to the best of his ability would be disgraceful. I frequently remind myself of that and ask myself if I could be doing more. Another way I sometimes look at it feels a little more powerful this weekend: If I die tonight, how did I help the Earth during my last day on it?
With that in mind, I want to talk about the importance of knowledge sharing. First, consider this question: What is the purpose of knowledge if not to share it? Perhaps your first reaction is that it could be used to help oneself. But how? By enabling you to be better at your job? Well, aren’t you then at the very least indirectly helping others through your knowledge? Could that be considered a form of sharing?
I believe the primary goal of knowledge is to teach and share. Of course I’m not talking about exposing trade secrets or confidential information. I’m referring to the concepts we work with every day that are free for anyone to learn. We all have something to teach and something to learn. Unshared knowledge = untapped potential. What can we lose by sharing our knowledge to help others? If you are truly concerned about job security, don’t be. If you really do possess knowledge that no one else in your organization has, why wouldn’t they reward that? And, yes, I know it’s not always that simple with budget cuts, outsourcing, etc., but I’m thinking big picture here. If a company doesn’t reward knowledge sharing, I don’t want to work there anyway because they probably don’t have a very bright future.
Now let’s reconsider what I said earlier. Unshared knowledge = untapped potential. What can we gain by sharing our knowledge? Well, if you’re anything like me, personal satisfaction is definitely one answer. My most rewarding days are the ones when I know I was able to teach someone something. Anything. You will gain the respect of others, and if you work in a good organization, that can lead to positive outcomes during your review. Others will gain the knowledge you are sharing thereby enabling them to do more (and eventually share more) themselves.
If you’re one of the many people already doing what you can to share your knowledge (not to mention your time), thank you. I have already had the pleasure of thanking some of you in person because I had the good fortune of being one of those with whom you shared your knowledge. There are many others I have not met but from whom I have still learned a tremendous amount through books, videos, online articles, etc. Thank you all. If you’re not doing it yet, today might just be your day to start. And then, when you ask yourself tonight, “If I die tonight, how did I help the Earth during my last day on it?”, hopefully you can smile and sleep peacefully. And then get up the next morning and do it again.