Tuesday, September 20, 2016

Scribe Insight DTS linking error

It is possible to chain DTS files with Scribe Insight.  The rule is that all DTS files must be in the same folder.  If that's the case you can have one DTS call the next one, and so on.  This is very helpful in sequential processes and a vital tool in my toolbox.

I had a problem with Scribe not letting me link DTS files.  I had a set of three files in my Collaborations share, all three in the same Collaboration folder.  When I opened the first file and browsed to the DTS to run next, I received the following message:

"The selected file must be located in the same folder as the current DTS."

As I said, all three files are in the same Collaboration folder.  When I click to browse I can even see the next file without having to change folders.

I worked with Scribe Support on this and we tried many different things.  Apparently having UAC on when you install Insight can cause a lot of problems, but I don't think this was one of them.


We solved this by moving the three DTS files to the Default collaboration folder, where we were able to link them.  Apparently something was wrong with the folder these DTS files were in.  We were never able to determine the source of the problem, so it remains a mystery, but using a different folder worked around it.

Wednesday, March 9, 2016

Scribe Online error connecting to CRM Online

This was my first escapade into the world of Scribe Online.  While working through a training exercise I ran into an issue that took some figuring, but it was resolved as follows.

First the error.  When I tried to create a connection to my CRM Online Trial I got an error message thrown by the Manage Connections window that read:

Error - Preconnect failed.  ERROR: The following error has occurred in the Dynamics CRM Connector: Could not load file or assembly 'Microsoft.IdentityModel, Version=3.5.0.0, Culture=neutral, PublicKeyToken=blahblah' or one of its dependencies.  The system cannot find the file specified.

The clue here is the Microsoft.IdentityModel Version 3.5.  This refers to Windows Identity Foundation 3.5, which was installed when I installed the Scribe Online local agent.  It turns out the install completes, but you still have to go into the Add Roles and Features Wizard and enable Windows Identity Foundation.

Once that is enabled, restart the Scribe Online Agent and you can then create a connection to CRM Online.
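On Windows Server the same feature can also be enabled from PowerShell instead of the wizard.  This is just a sketch - the agent service display name is an assumption here, and the exact feature name can vary by OS version:

```powershell
# Windows Server 2012+: enable the Windows Identity Foundation 3.5 feature
Install-WindowsFeature Windows-Identity-Foundation

# On client editions of Windows the equivalent is:
# Enable-WindowsOptionalFeature -Online -FeatureName Windows-Identity-Foundation

# Restart the agent service so it picks up the assembly
# (display name assumed; check services.msc for the exact name on your machine)
Restart-Service -DisplayName "Scribe Online Agent*"
```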

Monday, March 7, 2016

SQL database table rowcounts

SELECT o.name, ddps.row_count
FROM sys.indexes AS i
INNER JOIN sys.objects AS o
  ON i.OBJECT_ID = o.OBJECT_ID
INNER JOIN sys.dm_db_partition_stats AS ddps
  ON i.OBJECT_ID = ddps.OBJECT_ID
 AND i.index_id = ddps.index_id
WHERE i.index_id < 2 
  AND o.is_ms_shipped = 0
ORDER BY o.NAME

Wednesday, March 2, 2016

How to write solid integrations

If you are integrating two or more systems together how do you manage this?  What is the best practice for making integrations robust, error-proof and efficient?

In my years of writing integrations there is only one way to do this:  Staging.

Source data must be passed through staging for validation and cleansing before being passed on to the target.  This can happen any number of ways, from capturing the source data into a table for further work to simply building a number of "views" that shape the data to meet the expected target specification.

Of course I'm suggesting the integration use a SQL staging database as the "middleware".  I don't know of any other way to do this, but I'd love to hear of any.

Staging allows capturing source data in its original form if needed.  This can provide an important audit trail that will help in troubleshooting.
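As a concrete sketch, the staging pattern might look something like this.  All schema, table, and column names below are hypothetical, and the validation rule is just an illustration - real cleansing rules depend on the target system:

```sql
-- Hypothetical staging table: capture source rows as-is, plus audit columns
CREATE TABLE stg.Contact
(
    SourceId     varchar(50)   NOT NULL,
    FirstName    nvarchar(100) NULL,
    LastName     nvarchar(100) NULL,
    Email        varchar(200)  NULL,
    LoadedOn     datetime      NOT NULL DEFAULT (GETDATE()),
    ErrorMessage varchar(500)  NULL  -- populated by validation; NULL = clean
);
GO

-- Hypothetical cleansing view: only validated rows flow on to the target
CREATE VIEW stg.Contact_Ready AS
SELECT SourceId,
       LTRIM(RTRIM(FirstName)) AS FirstName,
       LTRIM(RTRIM(LastName))  AS LastName,
       LOWER(Email)            AS Email
FROM stg.Contact
WHERE ErrorMessage IS NULL
  AND Email LIKE '%_@_%._%';  -- crude format check, for illustration only
GO
```

The raw rows stay in the staging table untouched, which is what provides the audit trail; only the view reshapes them for the target.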

Oh, did you think it was possible to avoid errors?  Users will make sure every possible error condition is tested, but it won't happen all at once; it can only happen over time.  That's why it's important to create an integration that is adjustable.  It can break from bad data, but it should only break once.  When the bad data is identified, which should be easy with staging, an error trap can be created to catch it.  Thus the next time the error happens the integration catches it.

This points to the next important topic - notifications and alerts.  It's critical to notify users of error conditions trapped by the integration.  There are two tiers of notifications: expected and unexpected issues.  Expected issues are designed for up front (or trapped later), and user notification is required to allow users to fix the problem so the integration can proceed.

Unexpected issues are technical in nature - errors that are sudden and unexpected or are environmental.  These would go to technical resources, such as the user or team responsible for maintaining the integration.

I hope to be able to expand on these concepts in later posts.

Wednesday, February 3, 2016

SQL tables that contain a named column

Sometimes you want to see which tables in a database contain a particular column.  Here is a script that will show them to you.

SELECT a.name
FROM sys.objects AS a
INNER JOIN sys.columns AS b
  ON a.object_id = b.object_id
WHERE b.name = 'ownerid'
  AND a.type = 'U'
ORDER BY a.name

If you change the "type" to 'V' it will show you the views that reference that column, which is useful when you're planning changes that touch it.

SQL Server exposes its metadata via the sys catalog views, so hopefully this makes it easier for you to do maintenance.

Wednesday, January 27, 2016

Removing RTF codes from SQL data

A client requested data be migrated from one system to another.  Data in the source system included text data that had embedded RTF codes as below.  These codes were used for application rendering in what was likely an unstructured note field.

{\rtf1\ansi\ansicpg1252\deff0\deflang1033{\fonttbl{\f0\fnil\fcharset0 Microsoft Sans Serif;}}
{\colortbl ;\red0\green0\blue0;}\viewkind4\uc1\pard\cf1\f0\fs16 bunch of text here \par more text here. \cf0\fs17\par
}


Searching did not yield much in the way of help on this, so I had to figure it out with my own noodle.  The way I solved it was by cataloging the list of control codes that start with a backslash, then removing them using the SQL REPLACE function.  For example:

UPDATE <table> SET <column> = REPLACE(<column>,'\rtf1','')

That removes the "\rtf1" code from the data completely.  I ended up with a very long list of control codes and REPLACE statements - you can see from the snippet above that there are a lot of codes, and this is just a short example.

Another thing to watch out for was short codes that could cause issues.  For example, if I had removed the code "\fs1" before removing "\fs16" and "\fs17", I would have left spurious characters behind - not good.  So I sorted the REPLACE functions by length of control code, from longest to shortest.

Of course I also had to remove the opening and closing braces, as well as an assortment of font names and semicolons.  Especially tricky was getting rid of an accumulation of CR/LF codes at the start of the text once the control codes were removed.  The SQL SUBSTRING function took care of those.
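The longest-first rule is easy to get right if you sort the catalog of codes before generating the REPLACE statements.  Here is a small sketch of the same idea in Python - the code list below is only a sample, not the full catalog I used:

```python
# Sample catalog of RTF control codes (the real list ran to 200+ entries)
codes = ["\\fs1", "\\fs16", "\\fs17", "\\rtf1", "\\ansi",
         "\\pard", "\\par", "\\cf0", "\\cf1", "\\f0"]

def strip_rtf(text, codes):
    # Sort longest-first so "\fs16" is removed before "\fs1"
    for code in sorted(codes, key=len, reverse=True):
        text = text.replace(code, "")
    # Drop the braces, then trim the CR/LF and spaces left at the start
    return text.replace("{", "").replace("}", "").lstrip("\r\n ")

sample = "{\\rtf1\\ansi\\pard\\cf1\\f0\\fs16 bunch of text here \\par\\cf0\\fs17}"
print(strip_rtf(sample, codes))
```

The same sort can be used to emit the SQL REPLACE statements in the right order.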

The end result was over 200 REPLACE statements and some very clean text to migrate.

Another happy client!