Learn, Speak, Learn, Repeat. Update.

Following on from my previous post, Learn, Speak, Learn, Repeat, I’m happy to say that I have learnt and am repeating (the speaking, that is). I am doing another SQL 2014 In-Memory session at SQLBits in July, but this time it won’t be so wide-ranging. Instead of trying to cover the whole of 2014 In-Memory in an hour, I’m going to spend the time digging into one design pattern: using it for a super-hot (“flaming”) partition. Bugger all slides and a whopping demo. Tidy. It’s onwards and upwards, amigos, onwards and upwards. Always.

Best get on with actually creating the session now, hey.


Running Old DTS Packages on a Pure SQL Server 2014 Server

So you’re digging on a SQL 2014 upgrade but have some dirty, dirty old DTS packages that you don’t want to touch? No shame there, brothers and sisters; who wants to do that? So here’s the deal:

Do your DTS packages use ActiveX Script Tasks?

Yes? Ass burgers. You’re fudged, man, and looking at installing an older version of SSIS or getting down and dirty to convert those packages.

No? Smooth, you’re cruising on easy street, albeit in a poor quality ride. Just install the Microsoft SQL Server 2005 Backward Compatibility Components and, by some old school Jedi-like magic, DTSRun will be returned to you. No need to install a lower version of SSIS or pay for another license (installing a lower version of SSIS on a 2014 server will require that it is also licensed). You get no design-time tools, but the beasts will at least run (keep an old licensed 2005/2008 development VM for editing and converting them).
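To give you the rough shape of it (a sketch only: the job, server and package names below are all made up for this example), once the Backward Compatibility Components are in place you can schedule an old server-stored DTS package through a SQL Agent CmdExec job step that calls DTSRun:

```sql
-- Sketch: wrap DTSRun in a SQL Agent CmdExec job step.
-- MyServer and MyOldDtsPackage are placeholder names.
-- DTSRun switches: /S = server, /E = Windows authentication, /N = package name.
USE [msdb];
GO
EXEC dbo.sp_add_job @job_name = N'Run old DTS package';
EXEC dbo.sp_add_jobstep
    @job_name  = N'Run old DTS package',
    @step_name = N'DTSRun step',
    @subsystem = N'CmdExec',
    @command   = N'DTSRun /S MyServer /E /N MyOldDtsPackage';
EXEC dbo.sp_add_jobserver @job_name = N'Run old DTS package';
```

You can of course also run the same DTSRun command interactively from a command prompt on the server if you just want to smoke-test a package first.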

Word up: this is a SHORT-TERM SOLUTION and you should be pimping ALL those old packages, but sometimes needs must. This just buys you time without blocking the upgrade process. You dig? Sweet.

Learn, Speak, Learn, Repeat.

I started speaking a year ago, and yesterday at SQL Saturday Exeter was my sixth session. Looking back on it, I tried to cram way too much into the allocated fifty minutes. I rammed it in, had to deliver the material too fast, got flustered, and struggled with some of the questions due to the time pressure. This annoys me, as I feel I didn’t do the best that I could. In future the session will be less than half the size, with plenty of room for questions. Learn, speak, learn, repeat.

Thinking back, I suspect I confused Hugo Kornelis the most whilst trying to explain how Always On worked with in-memory data. Hugo, I apologise for this. It has annoyed me so much that I felt compelled to write this quick post to clear things up.

Always On AGs and In-Memory OLTP

  • Log records are transferred to the secondary replicas and replayed for all types of table: disk-based data to disk, in-memory data to memory (so on failover there is no need to reload the in-memory data; it is already there and ready to rock)
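If you want to see that for yourself, a quick sanity check is to query the In-Memory OLTP DMVs on a readable secondary; a minimal sketch, assuming a database name of [MyInMemDB] (made up for this example):

```sql
-- Run against a readable secondary replica: the in-memory data is
-- already loaded there, so this returns rows without any failover.
USE [MyInMemDB];

SELECT OBJECT_NAME(tms.[object_id]) AS [table_name],
       tms.[memory_used_by_table_kb]
FROM sys.dm_db_xtp_table_memory_stats AS tms
WHERE tms.[object_id] > 0;  -- exclude system allocations
```

If your memory-optimized tables show up with memory allocated on the secondary, the replay is doing its thing.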

Oh, and talking about bleeding-edge stuff you’re testing but haven’t finished yet, such as controlled failovers for in-memory maintenance? Get them sorted and tested first, talk about them second. I learn….

This was where I possibly confused Hugo and every other attendee by talking about “memory-mapping”. Fool. Why did I plump for that phrase? This will NOT be used in future.

I’m going to have a week off thinking of sessions now but when I get back to it they’re going to be smaller and more focused. Less is more. Quality over quantity. I learn, I speak, I learn, I repeat (the speaking, minus the crap bits and with more good bits).

Wondering what deprecated features you’re rocking?

Is yo SQL box a dinosaur? Thinking about pimping the version? No idea where to start? Well forget traces for now and dig on this:

-- the counter object names are space-padded, hence LIKE rather than an exact match
SELECT RTRIM([instance_name]) AS [deprecated_feature],
       [cntr_value] AS [usage_count]
FROM sys.dm_os_performance_counters
WHERE [object_name] LIKE '%Deprecated Features%'
AND [cntr_value] > 0;

Boom! So simple it hurts. It’s your starter for ten.

Now go search and destroy.

Note: This is a STARTER FOR TEN. Only a total fool would rely solely on this as upgrade planning. Check yourself before you wreck yourself. Damn straight.
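One more thing worth knowing: these counters are cumulative since the last instance restart, so a single look only tells you so much. If you want evidence over time, snapshot them into a table before and after a representative business cycle; a rough sketch, where [dbo].[DeprecationSnapshot] is a name I’ve invented for the example:

```sql
-- Snapshot deprecated-feature usage so you can diff it over time.
-- [dbo].[DeprecationSnapshot] is a made-up holding table name.
SELECT SYSDATETIME()          AS [captured_at],
       RTRIM([instance_name]) AS [deprecated_feature],
       [cntr_value]           AS [usage_count]
INTO [dbo].[DeprecationSnapshot]
FROM sys.dm_os_performance_counters
WHERE [object_name] LIKE '%Deprecated Features%';
```

Run it again (with INSERT INTO the second time) after a month of real workload and compare the counts: anything that moved is genuinely in use.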

Up the EF Junction

Let’s face it: Entity Framework does not really like tables that do not have a single-column numeric primary key. Not the best, I know, but what are you going to do? I have just had to convert a large number of junction tables to work with an EF application, and it was a pain. In short, any table like this:

CREATE TABLE [dbo].[ProductCategory]
(
    ProductID INT NOT NULL,
    CategoryID INT NOT NULL,
    CONSTRAINT [PK_ProductCategory] PRIMARY KEY CLUSTERED ([ProductID] ASC, [CategoryID] ASC),
    CONSTRAINT [FK_ProductCategory_Product] FOREIGN KEY ([ProductID]) REFERENCES [dbo].[Product] ([ProductID]),
    CONSTRAINT [FK_ProductCategory_Groups] FOREIGN KEY ([CategoryID]) REFERENCES [dbo].[Category] ([CategoryID])
);

Needs to become like this:

CREATE TABLE [dbo].[ProductCategory]
(
    ProductCategoryID INT IDENTITY(1,1) NOT NULL,
    ProductID INT NOT NULL,
    CategoryID INT NOT NULL,
    CONSTRAINT [PK_ProductCategory] PRIMARY KEY NONCLUSTERED ([ProductCategoryID] ASC),
    CONSTRAINT [FK_ProductCategory_Product] FOREIGN KEY ([ProductID]) REFERENCES [dbo].[Product] ([ProductID]),
    CONSTRAINT [FK_ProductCategory_Groups] FOREIGN KEY ([CategoryID]) REFERENCES [dbo].[Category] ([CategoryID])
);

CREATE UNIQUE CLUSTERED INDEX [IX_ProductCategory_ProductID_CategoryID]
ON [dbo].[ProductCategory] ([ProductID] ASC, [CategoryID] ASC);

Simple, if not entirely sensible. Note the non-clustered primary key and the unique clustered index on the previous primary key columns. It does exactly the same job as before but has an extra column purely for the benefit of EF. With the resurgence of EF and OData this is a handy thing to know, and maybe to design in at the start, should your shop or client be going the OData route.
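If you are converting existing junction tables rather than creating new ones, the change can be scripted in place along these lines; a sketch only, with assumed constraint and index names, so check yours and test on a copy first:

```sql
-- In-place conversion of an existing junction table for EF.
-- Constraint and index names are assumptions for this sketch.
BEGIN TRANSACTION;

ALTER TABLE [dbo].[ProductCategory]
    DROP CONSTRAINT [PK_ProductCategory];           -- drop the composite clustered PK

ALTER TABLE [dbo].[ProductCategory]
    ADD ProductCategoryID INT IDENTITY(1,1) NOT NULL;

ALTER TABLE [dbo].[ProductCategory]
    ADD CONSTRAINT [PK_ProductCategory]
    PRIMARY KEY NONCLUSTERED (ProductCategoryID);   -- new surrogate PK for EF

CREATE UNIQUE CLUSTERED INDEX [IX_ProductCategory_ProductID_CategoryID]
    ON [dbo].[ProductCategory] (ProductID ASC, CategoryID ASC);  -- keeps the old key's uniqueness

COMMIT TRANSACTION;
```

The unique clustered index is the important bit: it preserves both the uniqueness guarantee and the physical ordering of the old composite key, so existing queries carry on as before.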


MCM Y’All! Now the hard work really starts….

Well, it’s just over ten months since I did my first QuickStart MCM session with Robert L Davis (@SQLSoldier) and now, finally, I am a full-blown MCM. I must be honest and say that I have never had to work so hard in my entire life as I did to obtain this certification. I thought I knew everything after working with SQL Server for twelve years, and it was a shock to realise how much I didn’t know. Yes, I had done all these things before, but a working knowledge is not enough for the MCM: you have to REALLY know what you are doing, and be able to do it under pressure with no scripts and no Google.
I breezed through the knowledge exam in May last year (well, I say breezed; I didn’t actually get a score breakdown from Prometric, so I may have just scraped through) and went straight on to the lab in September, full of confidence. Big mistake right there. The lab is a monster. It really is the best test of both what you know and what you can do. It also tests how well you can concentrate under pressure, managing both your time and your priorities for the full 5.5 hours.
Well, I failed. Not badly, but I failed. However, boB’s email was very encouraging and I decided to go again as soon as the 90-day period was up.
I know others have said this, and I will too: failing was the best thing that ever happened to me. I knuckled back down to study with more confidence than before. I plugged the gaps in my knowledge: the things I thought I knew but couldn’t actually do when put under pressure. This I did with practice. And more practice. And yet more practice. I retook the exam on the 10th of January and found out on the 29th that I had passed. I don’t think I would have felt as good if I’d passed first time, as I would still be thinking there was some luck involved. Now, however, I feel great. I am fully confident in my abilities and ready to rock.
Becoming an MCM is no easy task but is well worth the effort. The support you get from the SQL Community is awesome and the feeling at the end of the process is fantastic. It’s hard-core, but who doesn’t want to be hard-core? I do. And I am. It’s all good baby, all good.
To round off a great week, my Saturday talk for SQLBits in May was also voted for! Get in. I will be blogging about table-based database object factories (which is what I’m presenting on) in the next few weeks, so stay tuned, folks. I’ve got the bit between my teeth now and have hit top gear. Happy days! More blogging, more speaking and the MCSM 2012 upgrade are all planned in for this year. No point slowing down now.