MCTS SQL Server 2008 beta exam 71-432 taken

I just got off from taking exam 71-432: TS: Microsoft SQL Server 2008, Implementation and Maintenance – which will become 70-432 when it goes live and is the successor of 70-431 (for SQL 2005) – and here are just a few thoughts [more]

This is sort of the SQL Server 2008 foundation exam and is required to move on to MCTS SQL 2008 DBA, Dev, and BI Dev and their MCITP counterparts.

And my advice is always: read the prep guide to guide you 🙂

I'm glad the weights/percent of coverage are IMHO close to the prep guide, so take this into consideration and work on your weak points. Also, there were no rocket-science questions unrelated to the objectives lined up, so still review – but I guess that's good news.

I think knowledge of SQL 2005 and a little review of what's new for SQL 2008 would be fine. A good number of questions are specific to SQL 2008, but I think you can often eliminate answers based on foundation/SQL 2005 knowledge. I scanned Apress Accelerated SQL Server 2008 and most of the topics there were helpful (though again I just scanned it, so it would have been more helpful if I had actually read it). That's the only material I've gotten hold of so far, but it will definitely be helpful.

Overall the exam was a good experience and didn't leave me staring at the ceiling like others (e.g. WPF), and although I don't really expect to pass this one, I at least know what I need to review more when I get the chance. (clue: resource management and recovery models 😛 – that's in the prep guide anyway)

Get Ancestor and Descendant IDs or info using SQL 2005 Common Table Expressions

Need some recursive processing in TSQL (SQL Server 2005)?

Needed some T-SQL code again to retrieve descendant and ancestor entries in a self-referencing table. I've done this a couple of times already, and although I can write it off the top of my head, sometimes you just want to make life easier and have a script you can just modify a bit to fit the new requirement. [more]

So what I usually use for recursion in SQL Server 2005 are Common Table Expressions (the "WITH" keyword). These are only available starting in SQL Server 2005 and are very easy and efficient for recursive queries.

To read more about it check this link from MSDN : Using Common Table Expressions

And of course the ever helpful Books Online (search for maybe WITH or Common Table Expressions) 

Nevertheless, here are my basic examples to get "Descendants and Self" and "Ancestors and Self" information. In this case I only involved the ID, ParentID, and Sequence columns, but you may add more (ID and ParentID are required for this to work). Also, you might want to retrieve just the ID and perform a JOIN afterwards.

-- NOTE: replace these accordingly
-- replace columns : MyEntityID, MyParentEntityID
-- replace parameter : @MyEntityIDParameter
-- replace table : MyTable
-- also note that the condition for the anchor and recursive parts
-- and the final consumption (SELECT statement after the WITH block)
-- can be changed depending on requirements
WITH DescendantsAndSelf([ID], [ParentID], [Sequence]) AS
(
    -- self (anchor part)
    SELECT
        [ID] = MyEntityID,
        [ParentID] = MyParentEntityID,
        [Sequence] = 1
    FROM MyTable
    WHERE MyEntityID = @MyEntityIDParameter

    UNION ALL

    -- descendants (recursive part)
    SELECT
        [ID] = T1.MyEntityID,
        [ParentID] = T1.MyParentEntityID,
        [Sequence] = CTE.[Sequence] + 1
    FROM DescendantsAndSelf AS CTE
        INNER JOIN MyTable AS T1 ON CTE.[ID] = T1.MyParentEntityID
)
SELECT *
FROM DescendantsAndSelf

Download above code as text file > CTE_descendantsAndSelf.txt (947.00 bytes)
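To see the same pattern in action outside SQL Server, here's a rough equivalent using Python's built-in sqlite3 module (SQLite also supports recursive CTEs, via WITH RECURSIVE). The table, column names, and sample rows are made up for this demo, not from the script above.

```python
import sqlite3

# Hypothetical self-referencing table for the demo
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE MyTable (MyEntityID INTEGER, MyParentEntityID INTEGER)")
conn.executemany(
    "INSERT INTO MyTable VALUES (?, ?)",
    [(1, None), (2, 1), (3, 1), (4, 2)],  # 1 is the root; 4 is a grandchild
)

# Same shape as the T-SQL version: anchor = self, recursive part = children
rows = conn.execute("""
    WITH RECURSIVE DescendantsAndSelf(ID, ParentID, Sequence) AS (
        SELECT MyEntityID, MyParentEntityID, 1
        FROM MyTable
        WHERE MyEntityID = ?
        UNION ALL
        SELECT T1.MyEntityID, T1.MyParentEntityID, CTE.Sequence + 1
        FROM DescendantsAndSelf AS CTE
        JOIN MyTable AS T1 ON CTE.ID = T1.MyParentEntityID
    )
    SELECT ID, Sequence FROM DescendantsAndSelf ORDER BY ID
""", (1,)).fetchall()

print(rows)  # [(1, 1), (2, 2), (3, 2), (4, 3)]
```

Note that SQLite requires the RECURSIVE keyword, whereas SQL Server infers it from the self-reference.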

-- NOTE: replace these accordingly
-- replace columns : MyEntityID, MyParentEntityID
-- replace parameter : @MyEntityIDParameter
-- replace table : MyTable
-- also note that the condition for the anchor and recursive parts
-- and the final consumption (SELECT statement after the WITH block)
-- can be changed depending on requirements
WITH AncestorsAndSelf([ID], [ParentID], [Sequence]) AS
(
    -- self (anchor part)
    SELECT
        [ID] = MyEntityID,
        [ParentID] = MyParentEntityID,
        [Sequence] = 1
    FROM MyTable
    WHERE MyEntityID = @MyEntityIDParameter

    UNION ALL

    -- ancestors (recursive part)
    SELECT
        [ID] = T1.MyEntityID,
        [ParentID] = T1.MyParentEntityID,
        [Sequence] = CTE.[Sequence] + 1
    FROM AncestorsAndSelf AS CTE
        INNER JOIN MyTable AS T1 ON CTE.[ParentID] = T1.MyEntityID
)
SELECT *
FROM AncestorsAndSelf

Download above code as text file > CTE_ancestorsAndSelf.txt (939.00 bytes)

Please let me know if I missed anything.

Hope you and I will find this useful someday.

LINQ to SQL and Visual C# Developer Express Data Source Connections

While I was evaluating a couple of things, including LINQ to SQL (or L2S to some) for a small website, I tried to look into whether it was possible to have the LINQ to SQL classes in a separate project (e.g. a DAL) rather than in the website itself. [more]

Since Visual Web Developer Express Edition (you need the 2008 version for LINQ to SQL) doesn't support having more than one project inside a "solution", I downloaded Visual C# Express Edition. Note that these are not the SP1 beta versions; I haven't looked into those.

I created the C# class library project, then opened the database explorer and tried to add a connection. You will notice though that only three types of data source are available:

They are: Access Database File, SQL Server Compact 3.5, and SQL Server Database File.

Since the LINQ to SQL ORM Designer (the visual designer for DBML files) needs the database explorer to drag and drop tables and other objects from, this becomes a bit of a problem if you're working against a SQL Server instance rather than a database file (which is common in shared hosting, since most if not all hosts no longer support running/attaching user instances).

One way to work around this, however, is to detach the data file (*.mdf) from the SQL Server instance and then add a connection to the file from the database explorer (detaching is likely needed, otherwise you might encounter file locking issues). This creates a new connection string in your app.config that attaches the data file. You can then edit it to point to the SQL Server instance containing your database instead without a problem (just make sure you have attached the file back to SQL Server first).
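For illustration, here's roughly what the before and after connection strings look like (the server name, file path, and database name here are made up for the example):

```xml
<connectionStrings>
  <!-- what the designer generates: attaches the .mdf directly as a user instance -->
  <add name="TestConnectionString"
       connectionString="Data Source=.\SQLEXPRESS;AttachDbFilename=C:\Data\Test.mdf;Integrated Security=True;User Instance=True"
       providerName="System.Data.SqlClient" />
  <!-- edited by hand: points at the database attached to a server instance -->
  <add name="TestConnectionString"
       connectionString="Data Source=MYSERVER\SQLEXPRESS;Initial Catalog=Test;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```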

So basically, it is still definitely possible for a class library in Visual C# Express 2008 to have LINQ to SQL classes targeting a database on a SQL Server instance rather than a database file, despite not having that data source option when creating a new connection in the database explorer. The limitation simply makes designing more of a hassle.

Not very convenient, but you can always design/code against the data file during development and just attach that file to SQL Server and change the connection string afterwards.

There could be a way to add the SQL Server option (not the SQL Server Database File one) when adding new database explorer connections, but I haven't looked into that. I'll be sure to update this if I do.

Also, there are a lot of discussions going on regarding discontinued support for LINQ to SQL in favor of the ADO.NET Entity Framework (EF), so you might want to read a little on those first before making decisions. It's probably not a problem for small projects, but then again, it doesn't hurt to be aware of the issues. It may or may not be true, and I would definitely still look into L2S. But knowing that it was created by the C# team while EF was created by the data programmability team, that EF has more features/flexibility, and that the ADO.NET MCTS exam prep guide has a section for EF but only a mention of LINQ (and not LINQ to SQL specifically), I'm more inclined to think EF will gain more support moving forward.

Link: SQL Server 2008: new data types and .Net 2 with and without SP1

Here's an interesting article I came across in the recent Simple Talk newsletter (by Red Gate Software).

It talks about issues with the new SQL 2008 data types (date/time related) and their effect on .NET 2.0 datetime data types depending on whether SP1 is installed or not. Something to be aware of, and one of those times you're glad you have a web rather than a desktop application (you only have to install SP1 on the server(s)).

SQL Server 2008 has introduced a few new data types, among others the new date types, like date, time, datetime2 and datetimeoffset. Because .Net 2 was released before SQL Server 2008 introduced these data types, there are no classes that map to these new types in .Net 2. But this has changed with .Net 2 SP1, which introduces the DateTimeOffset structure.

Read the full article from the following link: SQL Server 2008: new data types and .Net 2 with and without SP1

Link: Testing SQL Stored Procedures Using LINQ

Just ran into this interesting article on MSDN Magazine about using LINQ to test stored procedures. It would be a bad idea to test stored procedures using other stored procedures (e.g. insert with one SP then verify the presence of the inserted value using a select/load SP), so traditionally we do it using ADO.NET objects (connections, commands) to build commands. But this LINQ approach seems very interesting: ease of building commands, IntelliSense, and compile-time checking, among others. That's another good use for LINQ, which I'm really starting to like (hopefully even after trying it for n-tier projects).
The article also comes with a virtual lab for hands-on experience.

 

The need to test a program that accesses and manipulates a back-end SQL Server® database is very common. In many such cases, the application interacts with the back-end data through the use of SQL stored procedures. In this type of scenario, you can think of the stored procedures as auxiliary methods of the system under test; and they must therefore be tested just like any other module in the system.
 


Read full article here

sp_executesql error with DDL statements (preventing SQL injection)

One of the requirements for an application I'm currently working on is for the end user of a web application/site to be able to create objects in the database.

There are a number of objects that must be created or manipulated, but for the sake of simplicity let's take as an example creating a table with one column. The SQL statement would look like:

CREATE TABLE [MyTable]
( [MyColumn] INT NOT NULL )

To minimize SQL injection, I was hoping I could use the sp_executesql stored procedure to come up with a parameterized query (in addition to other preventive measures like validating input). Something like:

DECLARE @SQLString NVARCHAR(MAX);
DECLARE @ParamDefinition NVARCHAR(256);

SET @SQLString =
  N'CREATE TABLE @TableName (@ColumnName INT NOT NULL)';
SET @ParamDefinition =
  N'@TableName VARCHAR(128), @ColumnName VARCHAR(128)';

EXECUTE sp_executesql
  @SQLString,
  @ParamDefinition,
  @TableName = 'MyTable',
  @ColumnName = 'MyColumn';

When I executed the statement it returned the error "Incorrect syntax near '@TableName'". Though I haven't worked with sp_executesql with Data Definition Language (DDL) statements, I have been using it for Data Manipulation Language (DML) statements such as conditional selects and others, so I'm quite sure there is nothing wrong with the syntax near @TableName by itself. What could be wrong? [more]

It turns out that using sp_executesql parameters in DDL statements is not supported – parameters can stand in for values, but not for identifiers such as table or column names.

So in this particular case, to minimize SQL injection you have to use concatenation to dynamically build the SQL statement.

DECLARE @SQLString NVARCHAR(MAX)
DECLARE @TableName VARCHAR(128)
DECLARE @ColumnName VARCHAR(128)
SET @TableName = 'MyTable'
SET @ColumnName = 'MyColumn'
SET @SQLString =
  'CREATE TABLE ' + @TableName + '( ' + @ColumnName + ' INT NOT NULL)'
EXEC (@SQLString)

But wait, there is something else you can do to prevent SQL injection, so all is not really hopeless. Meet QUOTENAME. As you might have noticed before, one way to create a table with spaces in the name is to use quotes or brackets, like CREATE TABLE [hello world how are you]. So anything between the delimiters (e.g. quotes, brackets) is considered part of the table name rather than a command/statement.

When you experiment more you will notice that the following statement will create a table named as indicated, instead of throwing an error or doing something messy:

CREATE TABLE [Hello; Drop Master; ] ( [MyColumn] INT NOT NULL)

Yes, I know that's a very clever SQL injection attempt, but here's another example:

DECLARE @SQLString NVARCHAR(MAX)
DECLARE @TableName VARCHAR(128)
DECLARE @ColumnName VARCHAR(128)
SET @TableName = 'MyTable (ColA INT NULL); PRINT ''HELLO''; -- '
SET @ColumnName = 'MyColumn'
SET @SQLString =
  'CREATE TABLE ' + @TableName + '( ' + @ColumnName + ' INT NOT NULL)'
EXEC (@SQLString)
PRINT (@SQLString)

Try executing that in your database, and along with creating the table named MyTable you will see HELLO printed in the output/messages window. If you replace the PRINT statement with a more malicious statement such as DROP DATABASE master, and you happen to be running under a highly privileged account, then you're in trouble.

But try the statement again, this time using this @SQLString:

SET @SQLString =
  'CREATE TABLE ' + QUOTENAME(@TableName) + '( ' + QUOTENAME(@ColumnName) + ' INT NOT NULL)'

As you might have expected, you should get a table named [MyTable (ColA INT NULL); PRINT 'HELLO'; -- ] instead of HELLO being printed. Should there be a ']' in your variable value, it will be changed to ']]', just like how a single quote is doubled to two single quotes, to avoid issues.

Now that seems a lot better.

So, in addition to other ways of protecting your database such as (but not limited to) validating input (very well), running a least-privileged SQL Server account, and using stored procedures (along with determining proper parameter data types and lengths): when faced with the need to perform dynamic queries where you cannot use sp_executesql, like in this case with DDL statements, see if QUOTENAME would be applicable and if it would help.

** Note that QUOTENAME('dbo.Employee') will not result in [dbo].[Employee] but rather [dbo.Employee], so for fully qualified names, quote the server/database/schema and table names individually.

TSQL SubQuery requires a Table Alias

Subqueries are essentially a query (or queries) inside another query. That is for example a SELECT statement inside another SELECT statement.

There is already enough information available on the internet and in written materials about subqueries, including their common uses and when and how to use them (see bottom of post), so I will not elaborate on that further. However, I would like to share my experience where I almost abandoned an elegant solution using a subquery in favor of another approach because I couldn't get it to work for some reason.

Note that the scripts below merely illustrate the point and in no way demonstrate best practices, especially considerations of execution performance; the result could actually be achieved without a subquery, but again, they are only there to simplify the point. [more]

I had a table, for example, named "Products" with a few columns in it, so the subquery should have been as simple as:

SELECT * FROM (SELECT * FROM Products)

But for some reason I was getting "Incorrect syntax near ')'". I know I have done and seen this a couple of times. Or so I thought – during those earlier times I really had a "FROM <an actual table> INNER JOIN <the subquery>", so rather than selecting from a subquery, I was selecting from an actual table/view which was joined to a subquery. But even if that was the case, clearly this should be possible, and I was quite sure I was missing something.

After a short walk and a little stretching, it turns out that I was indeed missing something: an alias for the subquery (or rather, for the result set of the subquery).

Changing the query to the following worked: 

SELECT * FROM (SELECT * FROM Products) <Alias>
or
SELECT * FROM (SELECT * FROM Products) AS <Alias>

where <Alias> is any valid text/string that can be used as a table alias. (enclose in [ and ] if you have spaces)

That's it. I'm not sure why an alias is required – presumably the derived table needs a name so it and its columns can be referenced – but I guess that's up for another discussion (or an update sometime).

Googling T-SQL subqueries comes up with this good article on Using a Subquery in a T-SQL statement, which contains some of the common uses for subqueries. But while you're at it, have a look at similar techniques like using JOINs instead when appropriate, or Common Table Expressions (CTEs), or even the new CROSS APPLY and OUTER APPLY in SQL 2005.

TSQL : Concatenate values from a datatable column into a delimited string

I think it's common to come across a requirement where you have to retrieve values from a column in a database table and concatenate them into a single delimited string. For this example, however, we will use a table variable instead.

 

DECLARE @EmailTable TABLE
(
  Email VARCHAR(64)
)

-- then we insert sample values
INSERT INTO @EmailTable VALUES ('test@ryangaraygay.com')
INSERT INTO @EmailTable VALUES ('test@yahoo.com')
INSERT INTO @EmailTable VALUES ('test@gmail.com')

 

The common solution of course is to append/concatenate the values returned from the query.

** Note that you have to set @Emails to '' before the actual concatenation [more]

 

DECLARE @Emails VARCHAR(MAX)
SET @Emails = ''
SELECT @Emails = @Emails + Email + ','
  FROM @EmailTable

PRINT @Emails

would result in: test@ryangaraygay.com,test@yahoo.com,test@gmail.com,

 

But notice that you will have an extra ',' (or whatever delimiter you are using) at the end of the resulting value. To remove this, one way is to use SUBSTRING to return the value minus the trailing delimiter. But that in turn also means you have to check that @Emails is actually not empty (and maybe whether it ends with the delimiter or not), making such a relatively simple task involve more code. (I won't even include it here since you don't really want to know about it)

 

This however, I believe is a much simpler way to do it with less code and complexity.

 

DECLARE @Emails VARCHAR(MAX)
SELECT @Emails =
  ISNULL(@Emails + ',', '') + Email
  FROM @EmailTable

PRINT @Emails

would result in: test@ryangaraygay.com,test@yahoo.com,test@gmail.com

 

Almost the same result, except that there is no trailing "," (or whatever delimiter) at the end, and none at the start either. Try it!

 

The trick here is that ISNULL(@Emails + ',', '') returns '' the first time the statement is executed.

Which also brings up an interesting point: if you add a string/text value to a NULL, the result is NULL (I believe this is governed by the CONCAT_NULL_YIELDS_NULL setting). Since @Emails was not initialized to any value before it was used, the expression returns '' the first time it is encountered in the SELECT query. The first value would simply be the email, and then all succeeding values are appended as ",<Email>".
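The same NULL-propagation trick can be mimicked in Python, with None playing the role of NULL (the addresses are the sample values from above):

```python
emails = ["test@ryangaraygay.com", "test@yahoo.com", "test@gmail.com"]

result = None  # like an uninitialized @Emails variable
for email in emails:
    # ISNULL(@Emails + ',', '') + Email:
    # None + ',' would be NULL in SQL, so fall back to ''
    result = ("" if result is None else result + ",") + email

print(result)  # test@ryangaraygay.com,test@yahoo.com,test@gmail.com
```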

 

I'm quite sure I didn't come up with this myself but rather got it from some source/code I've seen before, which unfortunately I can't recall well enough to acknowledge properly. I know it's a common approach, but I still see the old way (SUBSTRING) of doing it even now, so I might as well post it. And to be honest, the first time I encountered the requirement, I didn't view it as being as simple as this.

Cannot create Scheduled Job on SQL 2005

Unable to cast object of type 'Microsoft.SqlServer.Management.Smo.SimpleObjectKey' to type 'Microsoft.SqlServer.Management.Smo.Agent.JobObjectKey'. (Microsoft.SqlServer.Smo)

I ran into this error for the first time a few hours ago [more] and I have no idea what's causing it, since I had been creating SQL jobs without a problem until the past few days. Maybe some patch or update or an installation somewhere, but I really can't tell what. Sorry about that.

The SqlServer.Management namespace somehow indicates that the issue is with the management (client) tool – and you'd be right.

The good news though is that there is a good amount of information available online suggesting that this is usually encountered when you're trying to run an SSIS package inside a SQL scheduled job. The funny thing is that I wasn't trying to do that; I was getting the error simply by trying to create a job (with or without steps). The cause is generally associated with having the SQL Server patched with SP2 while the client tool used for creating the job was not. Once again, the funny thing is that I was trying to create the job on the same machine, and if I had applied the patch, I'm sure I would have applied it to the client components too.

So the issues discussed in the articles/posts did not exactly match my scenario, but I figured I should check and install SP2 anyway. When I ran the SP2 executable, it displayed the components it detected and could patch – in my case, all of them (server and client side). In short, SP2 had never been run on that machine. The mismatch between the client and server versions may well have been the cause elsewhere, but this shows that there can be other causes.

Nevertheless, I installed SP2 and the error went away. Not a scientific discovery, but just an FYI for those who might ignore that possible fix because the posts mention only running SSIS packages inside a job. Then again, as always, explore all possible options, and if you can't really be certain it will be the fix, make it a last resort. Although it is always recommended to apply service packs anyway, so whether it fixes your issue or not, plan on applying it regardless.

Now I can finish some things I'm preparing for my next posts.

Unit Testing Data Access with .NET TransactionScope

I was curious how to keep my database in a consistent state while unit testing (technically it's already integration testing, but since it uses a unit testing framework, as they say, the lines between the terminologies are getting blurry), so I looked around for "database unit testing" write-ups.

It turns out that [more] mbUnit has [Rollback], which makes use of COM+, born out of Roy Osherove's article on simplified database unit testing using Enterprise Services (and a number of related articles).

But I'm using NUnit (at least until I've checked out mbUnit), and although there are a number of articles on extending NUnit (using Roy's approaches, among others), it's a little messy to extend it, create new DLLs, and all that. Seeing that the approach made use of Enterprise Services, it made me wonder whether it's possible to use .NET 2.0's TransactionScope instead (and possibly other upcoming enhancements to System.Transactions).

I tried to come up with a simple test project to see if it would work, and it did the job, so I'm posting it (there's also a link to the compressed sample project at the bottom).

using System.Data.SqlClient;
using System.Transactions;
using NUnit.Framework;

namespace TransactionScopeNUnitTest
{
    [TestFixture]
    public class TestClass
    {
        private const string connString = @"Server=.\sql2005;database=test;uid=<yourUID>;pwd=<yourPassword>";

        private TransactionScope scope;

        [TestFixtureSetUp]
        public void Setup()
        {
            scope = new TransactionScope(TransactionScopeOption.RequiresNew);
        }

        [TestFixtureTearDown]
        public void TearDown()
        {
            if (scope != null)
            {
                scope.Dispose();
            }
        }

        [Test]
        public void Delete()
        {
            // the test runner could run the tests in any order
            // so we insert a known entry first
            Insert();

            using (SqlConnection conn = new SqlConnection(connString))
            {
                using (SqlCommand cmd = new SqlCommand(
                    "DELETE FROM Table1 WHERE Col1 = 1"))
                {
                    cmd.Connection = conn;
                    conn.Open();
                    cmd.ExecuteNonQuery();
                    Verify();
                }
            }
        }

        [Test]
        public void Insert()
        {
            using (SqlConnection conn = new SqlConnection(connString))
            {
                using (SqlCommand cmd = new SqlCommand(
                    "INSERT INTO Table1 (Col1, Col2) VALUES (2, '2')"))
                {
                    cmd.Connection = conn;
                    conn.Open();
                    int retval = cmd.ExecuteNonQuery();
                    Assert.Greater(retval, 0);
                }
            }
        }

        [Test]
        public void Verify()
        {
            using (SqlConnection conn = new SqlConnection(connString))
            {
                using (SqlCommand cmd = new SqlCommand(
                    "SELECT * FROM Table1 WHERE Col1 = 2"))
                {
                    cmd.Connection = conn;
                    conn.Open();
                    SqlDataReader rdr = cmd.ExecuteReader();
                    Assert.IsTrue(rdr.HasRows, "Inserted Row Not Found");
                }
            }
        }
    }
}

Here's the simple SQL script for the test 'Table1' too

CREATE TABLE [dbo].[Table1](
    [Col1] [int] NOT NULL,
    [Col2] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL
) ON [PRIMARY]

 

NOTE: When I ran this, despite using just one connection string, it required the "Distributed Transaction Coordinator" service to be running on the machine (Start > Run > type "services.msc"). I used to think it was only needed if the transaction escalated to a distributed transaction, but more on that next time.

Also, System.Transactions is not referenced by default in new projects (at least for the class library template), so you would need to explicitly add a reference to it.

This is really not a discovery so to speak, and I'm sure there are other articles out there on this, as it's only brought about by the introduction of TransactionScope as an alternative to previous approaches. But for those running on .NET 2.0 who want something simpler than extensions or more complicated stuff, I think this will suffice.
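The same keep-the-database-clean idea can be sketched with Python's sqlite3 instead of NUnit and TransactionScope – an illustration of the rollback-per-test pattern, not a port of the C# above (the helper names are made up):

```python
import sqlite3

# In-memory stand-in for the test database (schema as in the script below)
conn = sqlite3.connect(":memory:", isolation_level=None)  # manual transactions
conn.execute("CREATE TABLE Table1 (Col1 INTEGER NOT NULL, Col2 TEXT NOT NULL)")

def run_in_rolled_back_transaction(test):
    """Run a test inside a transaction and roll it back afterwards,
    like disposing a TransactionScope without calling Complete()."""
    conn.execute("BEGIN")
    try:
        test()
    finally:
        conn.execute("ROLLBACK")

def insert_test():
    conn.execute("INSERT INTO Table1 VALUES (2, '2')")
    rows = conn.execute("SELECT * FROM Table1 WHERE Col1 = 2").fetchall()
    assert rows, "Inserted Row Not Found"

run_in_rolled_back_transaction(insert_test)

# after the rollback the table is clean again
print(conn.execute("SELECT COUNT(*) FROM Table1").fetchone()[0])  # 0
```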

Download sample – TransopeNUnitTest.zip (59.95 kb)

Related Links:

TransactionScope on MSDN

TransactionScope's advantages over ADO.NET and Enterprise Services

Test-Driven Development in Microsoft.NET book