Difference in mapping between BizTalk 2010 and 2013

Difference in mapping 2010 versus 2013

 

Artifacts description

In a BizTalk 2010 project we have a function that passes the current XmlNode as an XPathNodeIterator to a helper component. In BizTalk 2010 the map looks like this:

Note that we are using custom XSLT here. The content of the XSLT is shown below:

<?xml version="1.0" encoding="UTF-16"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:msxsl="urn:schemas-microsoft-com:xslt" xmlns:var="http://schemas.microsoft.com/BizTalk/2003/var" exclude-result-prefixes="msxsl var userCSharp ScriptNS0" version="1.0" xmlns:ns0="http://TestMap.SomeSchema" xmlns:userCSharp="http://schemas.microsoft.com/BizTalk/2003/userCSharp" xmlns:ScriptNS0="http://schemas.microsoft.com/BizTalk/2003/ScriptNS0">
  <xsl:output omit-xml-declaration="yes" method="xml" version="1.0" />
  <xsl:template match="/">
    <xsl:apply-templates select="/ns0:Root" />
  </xsl:template>
  <xsl:template match="/ns0:Root">
    <ns0:Root>
      <xsl:for-each select="RepeatingNode">
        <xsl:variable name="var:v1" select="userCSharp:StringTrimRight(string(SomeNode1/text()))" />
        <xsl:variable name="var:v3" select="userCSharp:StringTrimRight(string(SomeNode2/text()))" />
        <RepeatingNode>
          <xsl:variable name="var:v2" select="ScriptNS0:WriteNode(string($var:v1) , .)" />
          <SomeNode1>
            <xsl:value-of select="$var:v2" />
          </SomeNode1>
          <xsl:variable name="var:v4" select="ScriptNS0:WriteNode(string($var:v3) , .)" />
          <SomeNode2>
            <xsl:value-of select="$var:v4" />
          </SomeNode2>
        </RepeatingNode>
      </xsl:for-each>
    </ns0:Root>
  </xsl:template>
  <msxsl:script language="C#" implements-prefix="userCSharp"><![CDATA[
public string StringTrimRight(string str)
{
    if (str == null)
    {
        return "";
    }
    return str.TrimEnd(null);
}
]]></msxsl:script>
</xsl:stylesheet>

 

The content of the Extension XML is shown below:

<ExtensionObjects>
  <ExtensionObject Namespace="http://schemas.microsoft.com/BizTalk/2003/ScriptNS0" AssemblyName="MapHelper, Version=1.0.0.0, Culture=neutral, PublicKeyToken=94092251336a29ea" ClassName="MapHelper.mapHelperClass" />
</ExtensionObjects>

 

The code of the function in the helper class is shown below:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Xml.XPath;
using System.Xml;

namespace MapHelper
{
    public class mapHelperClass
    {
        public static bool WriteNode(string somedata, XPathNodeIterator node)
        {
            XPathNavigator xpn = node.Current;
            XmlDocument xdoc = new XmlDocument();
            xdoc.LoadXml(xpn.OuterXml);
            System.Diagnostics.Trace.WriteLine("In function ->" + xdoc.OuterXml);
            return true;
        }
    }
}

Expected behavior BizTalk 2010

If we run this map in BizTalk 2010 and watch for the output in DebugView, we get the expected output.

The map test also succeeds in BizTalk 2010, as shown below:

Invoking component…
F:\Projects\TestMap\TestMap\SomeMap.btm: The compilation is using the CustomXslt and CustomExtensionXml tags to generate the output.  The map content is ignored.
TestMap used the following file: <file:///C:\Users\Administrator\AppData\Local\Temp\inputfile.xml> as input to the map.
Test Map success for map file F:\Projects\TestMap\TestMap\SomeMap.btm. The output is stored in the following file:
<file:///C:\Users\Administrator\AppData\Local\Temp\_MapData\TestMap\SomeMap_output.xml>
Component invocation succeeded.


Observed behavior BizTalk 2013

If we run this in BizTalk 2013, we get the following output:

Invoking component…

C:\Projects\TestMap2013\TestMap\TestMap\SomeMap.btm: The compilation is using the CustomXslt and CustomExtensionXml tags to generate the output.  The map content is ignored.
TestMap used the following file: <file:///C:\Users\Administrator\AppData\Local\Temp\inputfile.xml> as input to the map.
C:\Projects\TestMap2013\TestMap\TestMap\SomeMap.btm: error btm1050: XSL transform error: Unable to write output instance to the following <file:///C:\Users\Administrator\AppData\Local\Temp\_MapData\TestMap\SomeMap_output.xml>.

Exception has been thrown by the target of an invocation. An error occurred during a call to extension function 'WriteNode'. See InnerException for a complete description of the error. Enumeration has not started. Call MoveNext.
Test Map failure for map file <file:///C:\Projects\TestMap2013\TestMap\TestMap\SomeMap.btm>. The output is stored in the following file:
<file:///C:\Users\Administrator\AppData\Local\Temp\_MapData\TestMap\SomeMap_output.xml>
Component invocation succeeded.

So the map failed because the behavior of the compiled XSLT is different from that of the interpreted XSLT.

Desired Behavior

The behavior should be the same as in BizTalk 2010.

 

You can find the test project in the attachment.
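As a side note (this is not part of the original bug report): the "Enumeration has not started. Call MoveNext." message suggests that the compiled XSLT engine used by BizTalk 2013 hands the XPathNodeIterator to the extension function before it has been positioned on a node, whereas the older interpreted engine positioned it first. A defensive version of the helper, sketched below, advances the iterator before touching Current; this is an untested sketch of a possible workaround, not a confirmed fix.

using System.Xml;
using System.Xml.XPath;

namespace MapHelper
{
    public class mapHelperClass
    {
        public static bool WriteNode(string somedata, XPathNodeIterator node)
        {
            // In BizTalk 2013 the iterator can arrive before enumeration has
            // started (CurrentPosition == 0); advance it so Current is valid.
            if (node.CurrentPosition == 0 && !node.MoveNext())
            {
                return false; // empty node set, nothing to trace
            }

            XPathNavigator xpn = node.Current;
            XmlDocument xdoc = new XmlDocument();
            xdoc.LoadXml(xpn.OuterXml);
            System.Diagnostics.Trace.WriteLine("In function ->" + xdoc.OuterXml);
            return true;
        }
    }
}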

MULTILINE EXEC: BizTalk deployment (BTDF), or a gotcha in MSBuild? I don't know, but there is a workaround…

I use the BizTalk Deployment Framework (BTDF) a lot for deploying BizTalk solutions.
BTDF uses MSBuild tasks to deploy an assembly.

Last week we found a problem with the deployment of a specific solution: MSBuild was not performing as expected.
We had a task that looked like this:

<Exec
     Command="&quot;$(BtsDir)Tracking\bm.exe&quot; add-account -View:&quot;%(BAMViewsAndAccountsGroup.viewName)&quot; -AccountName:&quot;%(BAMViewsAndAccountsGroup.groupNames)&quot;"
     ContinueOnError="true" Condition="'$(BAMViewsAndAccounts)' != ''" />

The command would fail, but since ContinueOnError was set to true the build would continue as if nothing had happened.
However… this build was called by another build, and the original top-level build would still fail!

I have looked everywhere, and after a lot of googling I finally found that the MSBuild Exec task is by default MULTILINE…

This solved my problem very quickly. I changed the command to:

<Exec
     Command="&quot;$(BtsDir)Tracking\bm.exe&quot; add-account -View:&quot;%(BAMViewsAndAccountsGroup.viewName)&quot; -AccountName:&quot;%(BAMViewsAndAccountsGroup.groupNames)&quot; > null
                             Exit 0"
     Condition="'$(BAMViewsAndAccounts)' != ''" />

And now the build completes with success.
Not the most elegant solution, but it does the job…

 

Something you REALLY should know about dates! (being Kind to DateTime.Kind)

Here we go, some important stuff about dates…..(it’s not only BizTalk related)

I had a very simple scenario.

  1. A web service receives a request (with several datetime fields in it)
  2. The received message is sent to SQL via the WCF adapter

For some obscure reason, some datetime values originating from the same web request got modified once in the database and some did not.
After some tracing we found that the dates were serialized in different ways once inside the web service.
Some of the formats we saw:

  • 2011-08-29T18:00:00.826;
  • 2011-08-29T18:00:00
  • 2011-08-29T18:00:00.3983714
  • 2011-08-29T18:00:00Z

All of these got to the database unchanged, but we also saw datetimes like the one below.

And these datetime fields were calculated back to their UTC time and then stored in the database.

After some experimenting we found the following interesting stuff…..

DateTime.Kind (but for sure, everybody already knew that)… So what is this Kind?

Kind (public property): Gets a value that indicates whether the time represented by this instance is based on local time, Coordinated Universal Time (UTC), or neither.

So what happens is the following:

  • Create a DateTime in code (like DateTime current = DateTime.Now) and Kind is Local
  • Assign a DateTime from a database field (like DateTime fromdb = <datetime field from the database>) and Kind is Unspecified

And what's the difference?

Well, the difference shows when the WCF adapter stores the date into the database field:

  • when Kind != Unspecified, the value is calculated back to UTC
  • when Kind == Unspecified, it is stored as-is…
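To see the Kind behaviour in practice, here is a minimal console sketch (the database value is simulated by constructing a DateTime from its components, which is essentially what a data reader hands you):

using System;
using System.Xml;

class KindDemo
{
    static void Main()
    {
        // Created in code: Kind is Local
        DateTime current = DateTime.Now;
        Console.WriteLine(current.Kind);   // Local

        // A value read back from a database column carries no time zone info:
        // Kind is Unspecified (simulated here with a constructed DateTime).
        DateTime fromDb = new DateTime(2011, 8, 29, 18, 0, 0);
        Console.WriteLine(fromDb.Kind);    // Unspecified

        // Serialization reflects the Kind: the Local value gets a local-time offset,
        // the Unspecified value is written without any time zone designator.
        Console.WriteLine(XmlConvert.ToString(current, XmlDateTimeSerializationMode.RoundtripKind));
        Console.WriteLine(XmlConvert.ToString(fromDb, XmlDateTimeSerializationMode.RoundtripKind));
    }
}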

 

Since I really wanted unified DateTimes in the complete solution I created a pipeline that did the following trick:

        static string CreateUnspecifiedDate(string someXsDateTimeString)
        {
           DateTimeOffset dto = DateTimeOffset.Parse(someXsDateTimeString);
           DateTime res = dto.LocalDateTime;
           return System.Xml.XmlConvert.ToString(res,System.Xml.XmlDateTimeSerializationMode.Unspecified);
        }
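For example, on a machine in the UTC+2 time zone, calling CreateUnspecifiedDate("2011-08-29T18:00:00+02:00") returns "2011-08-29T18:00:00": the offset is folded into local time and the result is serialized without any time zone information, so its Kind is Unspecified and the WCF adapter stores it as-is (the exact output of course depends on the time zone of the machine running the pipeline).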

And the problem was solved. But I did learn something about datetimes.

I think this is a bug in the orchestration engine but I am not really sure.

Today I had a very weird problem.
I submitted a message via a WCF web service and had an orchestration listening on the MessageBox for that particular message type.

The orchestration does some serious data massaging and needs to store the converted message a couple of times. For this I created some helper components that store the changed message in a database (all the logic is in a callable orchestration).
Below is a sample of the orchestration that went wrong.

So it's nothing too fancy, and I have used this pattern a lot and never had any problems. Until today…
When the message is sent to the MessageBox there could be a failure (no subscribers); this kicks in the error handling and the message is sent to the FaultPortal, where you can resubmit it.
But today things went a little differently than I was used to…

Because there was no subscription I would expect ONE routing failure, but I got TWO routing failures… So I did some investigation on those routing failures… Below is a screenshot of this:

The field marked with the red square is the ID that I get back from the database insert (callable orchestration). It is a distinguished field, so it sits in the context of a message.
That's why I can see these are two different routing failures, and that two calls were made to the database to store the message (so the callable orchestration was called twice!).
And both routing failures are from one orchestration; see the instance ID in the header of the window.

For sure I have been looking half a day in the MessageBox for that hidden subscription somewhere, but finally I gave up; there was nothing, everything seemed to be all right…

After a day of looking, I think I have found it. First let's describe what should happen:

  1. Message is received
  2. Some transformations are done
  3. New message is stored in the DB and we get the ID back
  4. Message is sent off to the MessageBox
  5. Subscription fails
  6. Error handler kicks in and writes the message to the FaultPortal.

Now let's rewrite this with the things that actually happen:

  1. Message is received
  2. Some transformations are done
  3. New message is stored in the DB and we get the ID back
  4. Message is sent off to the MessageBox (first routing failure)
  5. Subscription fails
  6. (For some reason we start at step 2 again, but this time with the knowledge that publishing will fail)
  7. Some transformations are done (step 2 again!)
  8. New message is stored in the DB and we get the ID back (step 3, so we get a new ID)
  9. Message is sent off to the MessageBox (second routing failure)
  10. Error handler kicks in and writes the message to the FaultPortal.

This is definitely not what I was expecting.
So I had a close look at my other orchestrations that work with almost the same construct, and I did find a very small difference that made all the difference.
Below is a screenshot of this.

Now the expected flow and the flow that actually happens are in sync again:

  1. Message is received
  2. Some transformations are done
  3. New message is stored in the DB and we get the ID back
  4. Message is sent off to the MessageBox
  5. Subscription fails
  6. Error handler kicks in and writes the message to the FaultPortal.

I think this is a bug in BizTalk, but I am not really sure about it, so if you think you know why this happens please leave a message on my blog.

It ran on a machine with BizTalk 2006 R2 SP1, but since the XLANG engine has not changed that much, this could happen in all BizTalk versions.

NTrace is available for Visual Studio 2010

NTrace is a great tool for logging. It has virtually no impact and is blazing fast. Finally it has been migrated to Visual Studio 2010.
So now I can start building my BizTalk best-practice toolbox with it. The author of NTrace can probably explain best what it is…

What is this ETW thing?

Event Tracing for Windows is a kernel-level tracing service that has been around since Windows 2000. Since it’s baked right into the kernel, it is extremely fast. Most of the developers that use ETW are writing drivers, but why should they have all the fun?

Why should I use ETW?

ETW Tracing has several benefits over the tracing classes provided with the .NET Framework. Most importantly, ETW tracing can be turned on and off without having to restart the application, but it also has features like built-in high performance circular logging (a circular log is one that never grows above a specified size by flushing out older trace messages), and the ability for you to capture the logs from multiple sources into a single trace session.

What is this preprocessor and why do we need it?

Put simply, to maximize application performance when tracing is not enabled. In a perfect world, an application’s performance when tracing is disabled would be identical to one where tracing wasn’t included at all. The problem is that your code is only compiled once; if those trace calls are in there, they’re GOING to get called, and while the ETW functions return quickly when tracing is disabled, the runtime still has to evaluate trace arguments, allocate memory, construct method call stacks, and so on. The application performance would be even faster if the functions were never called in the first place. How much faster is it? Here’s an example: Let’s write a simple application that has a function named DoSomething.

static int DoSomething(String arg0, int arg1, long arg2, DateTime arg3)

As you can see, DoSomething in this case simply returns a value and does no other calculations. It should be blazingly fast, right? Well, it is, but there’s still the overhead of the method call. To demonstrate this, let’s run two loops: one that ends up in a call to DoSomething one million times, and another that will shortcut the call. To do this, we’ll create a method named DoRun that will call DoSomething unless the caller has specified that it should bypass the call entirely. If the value passed to shortcut is true, we’ll skip the DoSomething call altogether.

static void DoRun(bool shortcut)
{
  DateTime start, stop;

  start = DateTime.Now;
  for (int index = 0; index < 1000000; index++)
  {
    if (!shortcut)
    {
      Program.DoSomething("Hi there!", 42, 42L, DateTime.Now);
    }
  }
  stop = DateTime.Now;

  Console.WriteLine(
    "Shortcut {0}: {1} milliseconds", 
    shortcut, 
    (stop - start).TotalMilliseconds);
}

To get our results, we will now call DoRun twice; once with shortcut set to False, and again with shortcut set to True. On my machine (a 2GHz Core2Duo running Windows Vista x86), the results are:

Shortcut False: 546 milliseconds
Shortcut True: 15.6 milliseconds

In other words, it was 35 times faster to completely bypass the function call. That’s nearly two orders of magnitude! Now, imagine if you were doing something even more complicated there such as calling ToString() on an exception or dumping a string containing the values of all of the properties of the object you are working on and you should be able to see that you get a rather large performance boost by skipping those calls completely. NTrace’s preprocessor provides your code with this boost by automatically injecting conditional statements around your trace calls for you, allowing you to focus on writing your code instead of remembering which conditions to check.
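To make that concrete, here is a tiny sketch using a made-up trace class (this is not NTrace's actual API): without the guard the arguments are still evaluated and the call is still made even when tracing is off; with the guard, which is what the preprocessor effectively injects for you, nothing happens at all.

using System;

// Hypothetical trace class for illustration only; NTrace's real API differs.
static class MyTrace
{
    public static bool IsEnabled = false;  // toggled at runtime by a trace controller

    public static void Write(string format, params object[] args)
    {
        if (!IsEnabled) return;            // returns quickly, but the arguments were already evaluated
        Console.WriteLine(format, args);
    }
}

class Program
{
    static void Main()
    {
        string user = "demo";

        // Unguarded: DateTime.Now is evaluated and the object[] is allocated even though tracing is off.
        MyTrace.Write("Processing user {0} at {1}", user, DateTime.Now);

        // Guarded (what the preprocessor effectively generates): nothing is evaluated when tracing is off.
        if (MyTrace.IsEnabled)
        {
            MyTrace.Write("Processing user {0} at {1}", user, DateTime.Now);
        }
    }
}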

 

I have been waiting for this tool to be upgraded to 2010 for a long time. So see what it is and grab your copy of NTrace >>here<<

Removing unwanted namespaces from a BizTalk Map

BizTalk is a great tool, but sometimes the output is a little unreadable because of the number of namespaces in a document.
This post is related to a previous post where I had to import a zillion schemas. Once I had those schemas imported I could map them, but the output looked like the picture below.

So about 120 namespace declarations and then some data (the namespace declarations were about 10K and the message itself only 5K). This was not what the customer desired, and they wanted to get rid of some (most) of those namespaces.
So I created a nice little XSLT that would get rid of those namespaces. I started a discussion on MSDN <click to see> and Greg Forsythe came up with the following solution.
Create a custom XSLT like this:

<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
   <xsl:template match="*">
      <xsl:element name="{local-name()}" namespace="{namespace-uri()}">
         <xsl:apply-templates select="@* | node()"/>
      </xsl:element>
   </xsl:template>
   <xsl:template match="@* | text() | comment() | processing-instruction()">
      <xsl:copy/>
   </xsl:template>
</xsl:stylesheet>

And surely that did the trick.
But there was a problem: I now had to execute two maps, one to map to the desired output and one to get rid of all those namespaces.
Furthermore, this script had the problem that instead of declaring all the namespaces once at the top of the document, it repeated the namespaces on every node, resulting in a much less readable message.

<Leg xmlns="urn:fec:florecom:xml:data:draft:ReusableAggregateBusinessInformationEntity:3">
      <TransportMeans>
        <TypeCode xmlns="urn:un:unece:uncefact:data:standard:ReusableAggregateBusinessInformationEntity:3" listID="Recommendation 28" listAgencyID="6" listVersionID="2007"/>
      </TransportMeans>
      <SpecifiedLoadingLocation>
        <CountryID schemeVersionID="second edition 2006"/>
      </SpecifiedLoadingLocation>

After googling around I came to a solution that solved the following problems:

  • No extra map required
  • Namespaces declared only once per message
  • A high level of control over which namespaces to declare and which to omit

Here is how I did it:

  • First create the map as you normally do.
  • Then, after the map is finished, select Validate Map.
  • You will be presented with something like this:

 D:Lokale BestandenOntwikkelingCEFH.CEFH.CCE.CCX.Plugins.XMLLB_v0006FH.CCE.CCX.Plugins.XMLLB_v0006MapsGeleverdePartij_v0006_To_Delivery_0p2.btm: warning btm1004: The destination node “TypeCode” has multiple inputs but none of its ancestors is connected to a looping functoid.
D:Lokale BestandenOntwikkelingCEFH.CEFH.CCE.CCX.Plugins.XMLLB_v0006FH.CCE.CCX.Plugins.XMLLB_v0006MapsGeleverdePartij_v0006_To_Delivery_0p2.btm: The output XSLT is stored in the following file:
file:///C:Documents and SettingspkwkLocal SettingsTemp1_MapDataGeleverdePartij_v0006_To_Delivery_0p2.xsl
D:Lokale BestandenOntwikkelingCEFH.CEFH.CCE.CCX.Plugins.XMLLB_v0006FH.CCE.CCX.Plugins.XMLLB_v0006MapsGeleverdePartij_v0006_To_Delivery_0p2.btm: The Extension Object XML is stored in the following file: <file:///C:Documents and SettingspkwkLocal SettingsTemp1_MapDataGeleverdePartij_v0006_To_Delivery_0p2_extxml.xml>

  • Now test the map and see if it still gives the same result (it should).
  • Then open the custom XSLT and look for "exclude-result-prefixes".
  • Then I added all the namespaces I wanted to exclude, so it looked like this:

exclude-result-prefixes="msxsl var s2 s0 s1 userCSharp udt qdt ns1 ns2 ns3 ns4 ns5 ns6 ns7 ns8 ns9 ns10 ns11 ns12 ns13 ns14 ns15 ns16 ns17 ns18 ns19 ns20 ns21 ns22 ns23 ns24 ns25 ns26 ns27 ns28 ns29 ns30 ns31 ns32 ns33 ns34 ns35 ns36 ns37 ns38 ns39 ns40 ns41 ns42 ns43 ns44 ns45 ns46 ns47 ns48 ns49 ns50 ns51 ns52 ns53 ns54 ns55 ns56 ns57 ns58 ns59 ns60 ns61 ns62 ns63 ns64 ns65 ns66 ns67 ns68 ns69 ns70 ns71 ns72 ns73 ns74 ns75 ns76 ns77 ns78 ns79 ns80 ns81 ns82 ns83 ns84 ns85 ns86 ns87 ns88 ns89 ns90 ns91 ns92 ns93 ns94 ns95 ns96 ns97 ns98 ns99 ns100 ns101 ns102 ns103 ns104 ns105 ns106 ns107 ns108 ns109 ns110 ns111 ns112"

  • That's quite a list. And now if I test the map, the result looks like the picture below:

 

  • And that’s exactly what the customer wanted.

I hope this blog post will help other people who are struggling with the same problem. If it does, please leave a comment below.

XSLT Distinct: another way to determine distinct values in XSLT 1.0

I had a requirement to map a buyer only if it was the same buyer throughout the entire document.
The reason for this was that in the source document the buyer was defined in a sub-sub-sub node of the document, while in the destination it occurred only once.
So I ended up with several choices:

  • Only map the first buyer
  • Don’t map
  • Only map if they were the same throughout the entire document

For sure the first option would be a bad thing.
The second option would work for all parties involved (it's an optional element in the output of the map), but the parties really want their buyer information if it's there.
The third option seemed the best solution. I quickly googled XSLT and distinct and there were some results, so I told the customer that implementing a distinct wouldn't be too hard (it already existed in XSLT).
(I wish I had looked a bit harder, because then I would have seen that the distinct function comes with XSLT 2.0, and sadly BizTalk still uses XSLT 1.0.)

After some thinking I came up with the following solution for this problem:

  1. Count the number of buyers in the document
  2. Get the first buyer (buyer is mandatory in the input document)
  3. Count the number of buyers where the buyer != the buyer found in step 2

If the number in step 3 is 0, we know all the buyers are the same. Below is the XSLT I used to perform this different distinct approach.

<xsl:template name="Buyerparty_DocTemplate">
  <xsl:param name="var1" />
  <xsl:param name="var2" />
  <xsl:param name="dbg" />
  <xsl:variable name="buyers" select="count(/s0:Request/GeleverdePartij/LeveringsBericht/Levering[*]/Ladingdrager[*]/Goederen[*]/Koper/kop_gln)" />
  <xsl:variable name="firstBuyer" select="/s0:Request/GeleverdePartij/LeveringsBericht/Levering[1]/Ladingdrager[1]/Goederen[1]/Koper/kop_gln" />
  <xsl:variable name="otherBuyers" select="count(/s0:Request/GeleverdePartij/LeveringsBericht/Levering[*]/Ladingdrager[*]/Goederen[*]/Koper[not(kop_gln=$firstBuyer)])" />
  <xsl:if test="$dbg=1">
    <xsl:element name="BuyerInfo">
      <xsl:element name="TotalBuyers">
        <xsl:value-of select="$buyers" />
      </xsl:element>
      <xsl:element name="FirstBuyer">
        <xsl:value-of select="$firstBuyer" />
      </xsl:element>
      <xsl:element name="OtherBuyers">
        <xsl:value-of select="$otherBuyers" />
      </xsl:element>
    </xsl:element>
  </xsl:if>
  <xsl:if test="$otherBuyers=0">
    <xsl:if test="string-length($firstBuyer) > 0">
      <xsl:element name="BuyerParty">
        <xsl:element name="PrimaryID">
          <xsl:value-of select="$firstBuyer" />
        </xsl:element>
        <xsl:element name="schemeID">
          <xsl:value-of select="$var1" />
        </xsl:element>
        <xsl:element name="schemeAgencyName">
          <xsl:value-of select="$var2" />
        </xsl:element>
      </xsl:element>
    </xsl:if>
  </xsl:if>
</xsl:template>

xs:int and xs:integer, what’s the difference….

I am busy creating schemas and exposing them as a web service.
I always generate a client and try to post some messages, and this time I was again surprised by BizTalk (or should I say XML).
When creating a schema you can choose several types for an element. Some of these are xs:int and xs:integer.
I had noticed these two before but didn't bother too much.
But now, for the first time, I see there is a clear difference in the way they are treated by .NET. Below is a screenshot of a node with the type xs:integer.

I also have some regular elements of type xs:int. Below is a screenshot of that.

Now after I generated the WCF service for this schema, I imported the WSDL into VS 2005 and I was quite surprised to see what IntelliSense did to these elements in Visual Studio:

So IntelliSense showed me it was actually a string! And the other node of type xs:int was the .NET type I expected to see.
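For illustration, here is roughly what the generated proxy members look like for these two element types (the class and member names are made up, but the type mapping is the standard XmlSerializer behaviour: xs:integer is arbitrary precision, so it can only be exposed as a string, while xs:int maps one-to-one to System.Int32):

// Hypothetical fragment of a generated proxy class, names invented for illustration.
public partial class SomeRequest
{
    // xs:integer has no fixed size, so the serializer exposes it as a string
    [System.Xml.Serialization.XmlElementAttribute(DataType = "integer")]
    public string AmountInteger;

    // xs:int maps directly to System.Int32
    public int AmountInt;
}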

So what did I learn today: stay away from xs:integer and use xs:int instead.
I hope this will help someone in the future; if it does, leave a comment.

 

Millions of records in the BAMAlertsApplication and how to get rid of them (NSVacuum to the rescue)

As a BizTalk consultant I always implement BAM to do basic auditing. This is in addition to the BAM a business analyst would want to see. The basic functionality of this audit trail is:

  • When was the message received
  • Where did it come from
  • Where did it go
  • What happened to the message
  • Any important business decisions made in an orchestration

I write this audit data into a BAM view, and it has proven to be valuable information. From time to time you will get questions regarding messages, and it's always nice to have this information. Besides that, you can set nice alerts (if you use them) on specific events and have people mailed in case things go wrong. So this is nice, and I wouldn't want to do an implementation without it.

But there is a downside to it. All these audit records also write some data into a not very well documented database, BAMAlertsApplication. And over time there can be millions of rows in it. Below is a sample of the BAMAlertsApplication.

The records are just piling up and are consuming more and more resources from your SQL Server. I had noticed this behaviour before and posted a question about it on the MSDN forums, see "How to clean up the BAMAlertsApplication database", and the answer of an MS employee was to open a case with Microsoft. Yesterday the BizTalk administrator of the customer that I work for asked me if there was anything I could do about the size of this database that just kept on growing and growing, even pointing me to my own discussion on MSDN and stating that he indeed wanted to start a case with MS.

This triggered me to have a look at the database, and by looking at the stored procedures (I was looking for remove/archive/delete stored procedures) I noticed the NSVacuum stored procedure. This triggered me to see what it did, so I turned to the almighty Google to see what it knew about NSVacuum. I got only two results! The first post wasn't really encouraging, since there the database had only increased in size.


 

But looking at the code of the stored procedure I was convinced that it did some cleaning. So I dropped the "BizTalk" keyword from the search to have another look…

This was not too encouraging either, only three real results this time. But fortunately the first one pointed me to a Microsoft document with useful information. It turned out that NSVacuum was exactly what we needed. And after some runs the database has now shrunk to a more reasonable size. See the picture below.

So the takeaway is: do NOT forget to schedule NSVacuum if you are using BAM!

I really hope this is useful for other people in the future. If it is, please leave a reaction on my blog; it will keep me motivated to share my BizTalk experiences with the community!
