Running Old DTS Packages on a Pure SQL Server 2014 Server

So you’re digging on a SQL 2014 upgrade but have some dirty, dirty old DTS packages that you don’t want to touch? No shame there, brothers and sisters; who wants to do that? So here’s the deal:

Do your DTS packages use ActiveX Script Tasks?

Yes
Bad news. You’re fudged, man, and looking at either installing an older version of SSIS or getting down and dirty to convert those packages.

No
Smooth, you’re cruising on easy street, albeit in a poor-quality ride. Just install the Microsoft SQL Server 2005 Backward Compatibility Components and, by some old-school Jedi-like magic, DTSRun will be returned to you. No need to install a lower SSIS or pay for another license (installing a lower version of SSIS on a 2014 server will require that it is also licensed). No chance of any design-time tools, but the beasts will at least run (keep an old 2005/2008 development-licensed VM for editing and converting them).
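Once the components are in, the old warhorse runs from a command prompt or a SQL Agent CmdExec job step just like it always did. A hedged example, with the server, package, and file names all made up:

REM package saved as a structured storage file, run with a trusted connection
DTSRun /F "D:\DTS\LoadSales.dts" /N "LoadSales" /E

REM same package stored in msdb on the server itself
DTSRun /S "MYSERVER" /N "LoadSales" /E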

Word up, this is a SHORT-TERM SOLUTION and you should still be upgrading ALL those old packages to SSIS, but sometimes needs must. This just buys you time without blocking the upgrade process. You dig? Sweet.

Learn, Speak, Learn, Repeat.

I started speaking a year ago and yesterday at SQL Saturday Exeter was my sixth session. Looking back, I tried to cram way too much into the allocated fifty minutes. I rammed it in, had to deliver the material too fast, got flustered, and struggled with some of the questions under the time pressure. This annoys me, as I feel I didn’t do the best that I could. In future the session will be less than half as big, with plenty of room for questions. Learn, speak, learn, repeat.

Looking back, I think I confused Hugo Kornelis the most whilst trying to explain how Always On works with in-memory data. Hugo, I apologise for this. It has annoyed me so much that I felt compelled to write this quick post to clear things up.

Always On AGs and In-Memory OLTP

  • Log records are transferred to the secondary replicas and replayed for all types of tables: disk-based data to disk, in-memory data to memory. So on failover there is no need to reload the in-memory data; it is already there and ready to rock (see the sketch below).
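To be clear, nothing special is needed on the table side for any of this to work; a durable memory-optimised table in an availability-group database logs its changes like any other table, and those log records are exactly what the secondaries replay. A minimal sketch (database name, file path, filegroup, and table are all made up for illustration):

-- the database needs a memory-optimised filegroup and container first
ALTER DATABASE [SalesDB] ADD FILEGROUP [imoltp_fg] CONTAINS MEMORY_OPTIMIZED_DATA;
ALTER DATABASE [SalesDB] ADD FILE (NAME = 'imoltp_dir', FILENAME = 'D:\Data\imoltp_dir')
TO FILEGROUP [imoltp_fg];
GO

-- DURABILITY = SCHEMA_AND_DATA means changes are fully logged, so they
-- flow to the secondaries and are redone straight into memory there
CREATE TABLE dbo.ShoppingBasket
(
BasketID INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
Payload NVARCHAR(1000) NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

SCHEMA_ONLY tables, by contrast, are not logged at all, so those come up empty on the new primary after a failover.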

Oh, and talking about bleeding-edge stuff you’re testing but haven’t finished yet, such as controlled failovers for in-memory maintenance? Get them sorted and tested first, talk about them second. I learn….

This was where I possibly confused Hugo and every other attendee by talking about “memory-mapping”. Fool. Why did I plump for that phrase? This will NOT be used in future.

I’m going to have a week off thinking of sessions now but when I get back to it they’re going to be smaller and more focused. Less is more. Quality over quantity. I learn, I speak, I learn, I repeat (the speaking, minus the crap bits and with more good bits).

Wondering what deprecated features you’re rocking?

Is yo SQL box a dinosaur? Thinking about pimping the version? No idea where to start? Well, forget traces for now and dig on this:

SELECT [object_name], [instance_name] AS [deprecated_feature], [cntr_value] AS [usage_count]
FROM sys.dm_os_performance_counters
WHERE [object_name] LIKE '%Deprecated Features%' -- LIKE also copes with named instances (MSSQL$InstanceName:...)
AND [cntr_value] > 0
ORDER BY [cntr_value] DESC;

Boom! So simple it hurts. It’s your starter for ten.

Now go search and destroy.

Note: This is a STARTER FOR TEN. Only a total fool would rely solely on this for upgrade planning. Check yourself before you wreck yourself. Damn straight.
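If the counters tell you a deprecated feature is in use but not where, Extended Events can catch the offenders in the act. A hedged sketch, with the session and file names mine, not gospel:

CREATE EVENT SESSION [DeprecationWatch] ON SERVER
-- announcement = going in a future version; final_support = the next version kills it
ADD EVENT sqlserver.deprecation_announcement
	(ACTION (sqlserver.sql_text, sqlserver.database_name)),
ADD EVENT sqlserver.deprecation_final_support
	(ACTION (sqlserver.sql_text, sqlserver.database_name))
ADD TARGET package0.event_file (SET filename = N'DeprecationWatch');
GO
ALTER EVENT SESSION [DeprecationWatch] ON SERVER STATE = START;

The sql_text action hands you the exact offending statement, which makes the search part of search and destroy a whole lot easier.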

Up the EF Junction

Let’s face it, Entity Framework does not really like tables that lack a single, unique, numeric primary key. Not the best, I know, but what are you going to do? I have just had to convert a large number of junction tables to work with an EF application and it was a pain. In short, any table like this:


CREATE TABLE [dbo].[ProductCategory]
(ProductID INT,
CategoryID INT,
CONSTRAINT [PK_ProductCategory] PRIMARY KEY CLUSTERED ([ProductID] ASC, [CategoryID] ASC),
CONSTRAINT [FK_ProductCategory_Product] FOREIGN KEY ([ProductID]) REFERENCES [dbo].[Product] ([ProductID]),
CONSTRAINT [FK_ProductCategory_Groups] FOREIGN KEY ([CategoryID]) REFERENCES [dbo].[Category] ([CategoryID]));
GO

Needs to become like this:


CREATE TABLE [dbo].[ProductCategory]
(ProductCategoryID INT IDENTITY(1,1) NOT NULL,
ProductID INT NOT NULL,
CategoryID INT NOT NULL,
CONSTRAINT [PK_ProductCategory] PRIMARY KEY NONCLUSTERED ([ProductCategoryID] ASC),
CONSTRAINT [FK_ProductCategory_Product] FOREIGN KEY ([ProductID]) REFERENCES [dbo].[Product] ([ProductID]),
CONSTRAINT [FK_ProductCategory_Groups] FOREIGN KEY ([CategoryID]) REFERENCES [dbo].[Category] ([CategoryID]));
GO

CREATE UNIQUE CLUSTERED INDEX [IX_UQ_ProductCategory]
ON [dbo].[ProductCategory] ([ProductID] ASC, [CategoryID] ASC);
GO

Simple, if not entirely sensible. Note the nonclustered PK and the unique clustered index on the previous PK columns: the table behaves exactly as before but carries an extra column simply for the benefit of EF. With the resurgence of EF and OData this is a handy thing to know, and maybe to design in at the start should your shop or client be going the OData route.
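If you want proof that the pair is still protected, here’s a quick sanity check (assuming a Product with ID 1 and a Category with ID 2 already exist; the values are purely illustrative):

-- the first insert succeeds
INSERT INTO [dbo].[ProductCategory] (ProductID, CategoryID) VALUES (1, 2);
-- the second fails with error 2601: duplicate key row in unique index IX_UQ_ProductCategory
INSERT INTO [dbo].[ProductCategory] (ProductID, CategoryID) VALUES (1, 2);

Same junction semantics as before, then; EF just gets the single numeric key it craves.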

Laters.

MCM Y’All! Now the hard work really starts….

Well, it’s just over ten months ago that I did my first QuickStart MCM session with Robert L Davis (@SQLSoldier) and now, finally, I am a full-blown MCM. I must be honest and say that I have never had to work so hard in my entire life as I have had to do to obtain this certification. I thought I knew everything after working with SQL Server for twelve years, and it was a shock to realise how much I didn’t know. Yes, I had done all these things before, but a working knowledge is not enough for the MCM: you have to REALLY know what you are doing, and be able to do it under pressure with no scripts or Google.
I breezed through the knowledge exam in May last year – well, I say breezed; I didn’t actually get a score breakdown from Prometric, so I may have just scraped through – and went straight onto the lab in September full of confidence. Big mistake right there. The lab is a monster. It really is the best test of both what you know and what you can do. It also tests how well you can concentrate under pressure, managing both your time and your priorities for the full 5.5 hours.
Well, I failed. Not badly, but I failed. However, boB’s email was very encouraging and I decided to go again as soon as the 90-day waiting period was up.
I know others have said this and I will too: failing was the best thing that ever happened to me. I knuckled back down to study with more confidence than before. I plugged the gaps where I thought I knew things but, under pressure, couldn’t actually do them. This I did with practice. And more practice. And yet more practice. I retook the lab on the 10th of January and found out on the 29th that I had passed. I don’t think I would have felt as good if I’d passed the first time, as I would still be thinking there was some luck involved. Now, however, I feel great. I am fully confident in my abilities and ready to rock.
Becoming an MCM is no easy task but it is well worth the effort. The support you get from the SQL community is awesome and the feeling at the end of the process is fantastic. It’s hard-core, but who doesn’t want to be hard-core? I do. And I am. It’s all good baby, all good.
To round off a great week, my Saturday talk for SQLBits in May was also voted for! Get in. I will be blogging about Table-based database object factories – what I’m presenting on – in the next few weeks, so stay tuned, folks. I’ve got the bit between my teeth now and have hit top gear. Happy days! More blogging, speaking and the MCSM 2012 upgrade are all planned in for this year. No point slowing down now.

Schema Removal: All Objects Must Be Obliterated!

‘Sup people. Whilst playing with SQL Azure Data Sync I often end up with databases rammed full of Data Sync schema objects I no longer need, mainly when I’m done with a particular database or when it has issues. Removing this schema and its objects by hand is a ball ache and besides, I don’t do manual.

So I had a quick Google, as you do, as I normally find that some helpful cat has already done what I need and blogged it. Big up the SQL Family community spirit. I found an article by Ranjith Kumar S that was my starting point, but there was way too much temp table and cursor stuff going on there for my liking, and no deletion of user-defined table types. Also, the dropping of tables wasn’t Azure-friendly: in Azure you can’t drop a clustered PK before its table goes, because a table without a clustered index is a no-no there. So I tweaked it to use no temp tables or cursors and added some error trapping. The cursor-free trick is nothing exotic, by the way:
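T-SQL will happily aggregate rows into a variable with SELECT @var = @var + …, so the whole drop script can be built in a single pass over the catalog views. A minimal sketch of the pattern, with a purely illustrative Staging schema:

DECLARE @sql nvarchar(max) = N'';

-- one pass over sys.tables, no cursor required
SELECT @sql = @sql + N'DROP TABLE ' + QUOTENAME(SCHEMA_NAME(schema_id))
	+ N'.' + QUOTENAME([name]) + N';' + CHAR(10)
FROM sys.tables
WHERE schema_id = SCHEMA_ID(N'Staging');

PRINT @sql; -- always eyeball the script before you EXEC it

Scale that up across the object types, bolt on the error handling, and you end up with this: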

IF  EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[obliterateSchema]') AND type in (N'P', N'PC'))
DROP PROCEDURE [dbo].[obliterateSchema]
GO

CREATE PROCEDURE dbo.obliterateSchema
(
	@schemaName varchar(100),
	@debug char(1) = 'Y' -- defaults to debug to avoid accidental deletion
)
AS

BEGIN TRY

	IF (@schemaName = 'dbo')
	BEGIN
		RAISERROR ('Obliteration of the dbo schema is forbidden.', 16, 1)
	END

	DECLARE @removal_sql nvarchar(max) = ''
 
	-- 01: removes all the foreign keys that reference a PK in the target schema
	SET @removal_sql = '-- 01: drop FKs that reference PKs in the removing schema;' + char(10)
	
	SELECT @removal_sql = @removal_sql + 'ALTER TABLE ['+ SCHEMA_NAME(fk.schema_id) 
		+ '].[' + OBJECT_NAME(fk.parent_object_id) + '] DROP CONSTRAINT [' + fk.name + '];' + CHAR(10)
	FROM sys.foreign_keys fk
	INNER JOIN sys.tables t 
		ON t.object_id = fk.referenced_object_id
	WHERE t.schema_id = SCHEMA_ID(@schemaName)
	-- deliberately no fk.schema_id filter: FKs in other schemas that reference
	-- these tables must be dropped too, or the DROP TABLEs in step 03 will fail
	ORDER BY fk.name DESC
 
	-- 02: remove all FKs and constraints
	SET @removal_sql = @removal_sql + '-- 02: remove all FKs and constraints;' + char(10)
	
	SELECT @removal_sql = @removal_sql + 'ALTER TABLE [' + schema_name(t.schema_id) + '].['
		+ OBJECT_NAME(fk.parent_object_id) + '] DROP CONSTRAINT ['+ fk.[Name] + '];' + char(10)
	FROM sys.objects fk
	 join sys.tables t on t.object_id = fk.parent_object_id
	 where t.schema_id = schema_id(@SchemaName)
	and fk.type IN ('D', 'C', 'F')
	 
	-- 02.5: if the removing schema is DataSync then drop any dss triggers
	IF (@SchemaName = 'DataSync')
	BEGIN
	
		SET @removal_sql = @removal_sql + '-- 02.5: remove all dss triggers;' + char(10)
	
		SELECT @removal_sql = @removal_sql + 'DROP TRIGGER [' + schema_name(so.schema_id) 
			+ '].[' + st.[name] + '];' + char(10)
		FROM sys.triggers st
		inner join sys.objects so
			on st.parent_id = so.object_id
		WHERE st.[name] like '%dss_update_trigger'
			or st.[name] like '%dss_insert_trigger'
			or st.[name] like '%dss_delete_trigger'
	
	END
 
	-- 03: drop all other objects in order
	SET @removal_sql = @removal_sql + '-- 03: remove all PKs, tables, views, procedures, triggers, and functions;' + char(10)
	
	SELECT @removal_sql = @removal_sql + 
		CASE 
			WHEN SO.type = 'PK' THEN 'ALTER TABLE [' + SCHEMA_NAME(SO.schema_id) + '].[' + OBJECT_NAME(SO.parent_object_id) + '] DROP CONSTRAINT [' + SO.name + '];' + char(10)
			WHEN SO.type = 'U' THEN 'DROP TABLE [' + SCHEMA_NAME(SO.schema_id) + '].['+ SO.[Name] + '];' + char(10)
			WHEN SO.type = 'V' THEN 'DROP VIEW [' + SCHEMA_NAME(SO.schema_id) + '].[' + SO.[Name] + '];' + char(10)
			WHEN SO.type = 'P' THEN 'DROP PROCEDURE [' + SCHEMA_NAME(SO.schema_id)+ '].[' + SO.[Name] + '];' + char(10)
			WHEN SO.type = 'TR' THEN 'DROP TRIGGER [' + SCHEMA_NAME(SO.schema_id) + '].[' + SO.[Name] + '];' + char(10)
			WHEN SO.type IN ('FN', 'TF','IF','FS','FT') THEN 'DROP FUNCTION [' + SCHEMA_NAME(SO.schema_id) + '].[' + SO.[Name] + '];' + char(10)
		END
	 FROM SYS.OBJECTS SO
	 WHERE SO.schema_id = schema_id(@SchemaName)
		AND SO.type IN
			(
			select 'FN' union all
			select 'TF' union all
			select 'TR' union all
			select 'V' union all
			select 'U' union all
			select 'P' union all
			select 'PK' where charindex('Azure', @@VERSION) = 0
			-- no table can exist in Azure without a clustered index, so we can't
			-- drop PKs separately there; the DROP TABLE takes them out anyway
			)
	 ORDER BY 
		CASE 
			WHEN type = 'PK' THEN 1
			WHEN type in ('FN', 'TF', 'P','IF','FS','FT') THEN 2
			WHEN type = 'TR' THEN 3
			WHEN type = 'V' THEN 4
			WHEN type = 'U' THEN 5
			ELSE 6
		END

	-- 04: drop user-defined types
	SET @removal_sql = @removal_sql + '-- 04: drop user-defined types;' + char(10)
	
	SELECT @removal_sql = @removal_sql + 'DROP TYPE [' + schema_name(schema_id) 
		+ '].['+ [name] + '];' + CHAR(10)
	from sys.types
	where schema_id = schema_id(@SchemaName)
	 
	-- if debugging, just print the SQL and bail out
	-- (nb: PRINT truncates at 4,000 nvarchar characters, so inspect very big scripts another way)
	IF (@debug = 'Y')
	BEGIN
		PRINT @removal_sql
		RETURN
	END 
	
	BEGIN TRAN

	-- actually remove all the schema's objects
	EXEC (@removal_sql)

	COMMIT TRAN

END TRY

BEGIN CATCH

		DECLARE @ErrorMessage nvarchar(4000)
		DECLARE @ErrorSeverity int
		DECLARE @ErrorState int

		SELECT	@ErrorMessage = ERROR_MESSAGE(), 
				@ErrorSeverity = ERROR_SEVERITY(), 
				@ErrorState = ERROR_STATE();
				
		IF @@TRANCOUNT > 0 ROLLBACK TRAN

		RAISERROR (@ErrorMessage, @ErrorSeverity, @ErrorState);

END CATCH;
GO

-- exec dbo.obliterateSchema 'DataSync', 'N'

Tidy. Does the job for me and can be run for any schema. Dig that it was created for removing the MS Data Sync schema, which only consists of tables, triggers, stored procedures, and user-defined table types (plus the schema itself, which the procedure deliberately leaves standing). You may need to extend it a bit, as there’s probably stuff I’ve missed, but this’ll get you rolling; the obvious extension is sketched below.
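Here’s a hedged sketch of a final step you could bolt on after the EXEC, once all the objects really have gone (the step number is mine):

	-- hypothetical step 05: drop the now-empty schema itself
	IF EXISTS (SELECT * FROM sys.schemas WHERE [name] = @schemaName)
	BEGIN
		DECLARE @drop_schema nvarchar(300) = N'DROP SCHEMA ' + QUOTENAME(@schemaName) + N';'
		EXEC (@drop_schema)
	END

DROP SCHEMA fails if anything still lives in it, so it doubles as a sanity check that the obliteration actually worked.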

Word.