BizTalk – Stuart Charles – Application Development, Software Integration and Data Architecture Blog

BizTalk Project Structure Guidelines


When I first started developing in BizTalk 2010 I couldn’t find any definitive articles about how best to organise my solutions and projects in Visual Studio. I started with one solution per project, with individual projects containing all the artifacts for a specific area. I soon got into trouble with dependencies throwing errors on deployment, and the end result was generally a bit of a mess.

Here are my recommendations on how to organise a BizTalk solution in Visual Studio.

Separate your solution into individual projects, one for each type of BizTalk artifact, as follows.

  • Schemas
  • Transforms
  • Orchestrations
  • Pipelines
  • Components (C# project for any custom objects)

Create your first project as <CompanyName>.<FunctionalArea>.Schemas and the solution as <CompanyName>.<FunctionalArea>, for example “MyCompany.Sales.Schemas”. This solution should contain all artifacts for interfaces either publishing from your sales system into the BizTalk message box or subscribing to messages in the BizTalk message box.

Also create another solution called <CompanyName>.CommonArtifacts with projects containing the canonical form for each schema, e.g. <CompanyName>.CommonArtifacts.Schemas.Customer might contain Customer.xsd, which would then be referenced by all the other solutions that subscribe to this message type. The advantage of having a separate project for each canonical form is that you create fewer dependencies, so if one schema changes you don’t have to update as many other solutions and can deploy the new schema in isolation.

When you create your new project, the first thing you will want to do is set up your default namespace. My recommendation is that you keep it simple, e.g. <CompanyName>.BizTalk.<FunctionalArea>.<BTArtifactType>, such as MyCompany.BizTalk.Sales.Schemas. Also set your application name in the deployment options, usually the same as the solution name you chose. You will also need to generate a key to sign the assembly.
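If you don’t already have a strong-name key, one quick way to generate one (assuming sn.exe from the Windows SDK is on your path; the file name here is illustrative) is:

sn -k MyCompany.BizTalk.snk

Then point the Signing tab of the project properties at the resulting .snk file.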

An advantage of using this structure is that it provides a standardised layout for all your interfaces and also enables you to deploy each .dll individually should you need to, for example when there is only a minor change to a map and you don’t want to redeploy everything. It also makes your solution compatible with the excellent BizTalk Deployment Framework available on CodePlex.

Where to Store BizTalk Environment Variables


Something that wasn’t immediately obvious when I began developing BizTalk applications was where to store environment variables such as database URLs and web service URLs for custom functoids, passwords for authentication with other systems, and so on.

SSO Storage. Having tried all the options below, my recommendation is to put them in SSO storage, creating a custom library and custom functoid to make them easily accessible from maps and orchestrations. It may seem like hard work but it isn’t as bad as it looks and is definitely worth the effort.

First of all, download the SSO snap-in from Microsoft (it seems odd this isn’t bundled with BizTalk) and set up your applications and key/value pairs. Note you will need to be in the BizTalk SSO Administrators AD group.

Then create a new library in your CommonArtifacts solution which will allow you to access SSO variables very easily, using the SSOClient in the Microsoft download as a guide. At the same time it is worth creating a custom functoid which uses this new library so configuration variables are also easily accessible from maps.
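As a rough guide, the read side of such a library might look like the sketch below. It is modelled on the SSOClientHelper sample included in the Microsoft snap-in download; ConfigurationPropertyBag is the small IPropertyBag implementation that ships with that sample, and the class and method names here are my own.

// Minimal sketch, modelled on the SSOClientHelper sample from the Microsoft
// SSO snap-in download. ConfigurationPropertyBag is the IPropertyBag
// implementation included in that sample; names here are illustrative.
using Microsoft.BizTalk.Interop.SSOClient; // Microsoft.BizTalk.Interop.SSOClient.dll

public static class SSOConfigReader
{
    private const string Identifier = "ConfigProperties";

    public static string Read(string appName, string propName)
    {
        SSOConfigStore ssoStore = new SSOConfigStore();
        ConfigurationPropertyBag propertyBag = new ConfigurationPropertyBag();
        ((ISSOConfigStore)ssoStore).GetConfigInfo(appName, Identifier, SSOFlag.SSO_FLAG_RUNTIME, (IPropertyBag)propertyBag);
        object value;
        propertyBag.Read(propName, out value, 0);
        return (string)value;
    }
}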

There is a great tutorial from Richard Seroter which explains how to do this in more detail.

BizTalk Configuration File. In my initial applications I put environment variables in D:\Program Files\Microsoft BizTalk Server 2010\BTSNTSvc.exe.config (BTSNTSvc64.exe.config for 64-bit installations) under the appSettings tag. I would access them by simply entering the following code into my orchestrations and maps.

public string GetConfigString()
{
    // Reads from the appSettings section of BTSNTSvc.exe.config
    string connString = System.Configuration.ConfigurationSettings.AppSettings.Get("MyDatabaseConnectString");
    return connString;
}

It became clear fairly quickly, however, that there were major drawbacks to this approach:

  • You had to restart the host instances for any changes to take effect
  • Everything was unencrypted
  • You had to change a separate dev.config file for the changes to work in the BizTalk test map tool in Visual Studio

A custom database table is another option for handling environment variables and may occasionally be suitable for storing settings. But of course the URL of this database will need to be stored somewhere anyway, and the variables are not encrypted by default.

The Business Rules Engine should be used when complex rules which are likely to change are required to drive mappings and workflow. I’m still not a huge fan of the BRE but perhaps I am just not using it in the way it is intended just yet.

Debug Tracing in BizTalk Orchestrations and Custom C# Components


It’s not immediately obvious how you can debug code in BizTalk orchestrations, custom pipelines, custom functoids etc. After investigating a couple of options I found the easiest way is to use a tool called DebugView, a simple executable (no installation required) supplied by Microsoft for free.

  1. Download the executable and put it on the desktop or somewhere else that’s easy to get to
  2. In an orchestration, enter the following code in either a Message Assignment or Expression shape, tracing whatever you want to see:
    System.Diagnostics.Trace.WriteLine(System.String.Format("Message content is - {0}",msgMyEmployees.OuterXml));
    
  3. Or in C# you can write the same code in your custom functoid or custom pipeline (see the sketch after this list)
  4. Deploy / GAC / restart host instances as usual, then kick off DbgView.exe.
  5. Ensure Capture Win32 and Capture Global Win32 are both checked under the Capture menu
  6. Click on Capture (the magnifying glass)
  7. Run your test case through BizTalk and the output will be printed to the window. Nice and easy.
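For reference, the equivalent in a custom C# component is just as simple. The helper below is purely illustrative (the class and method names are made up) and could sit in a custom functoid or pipeline component:

// Illustrative helper: logs a value via Trace and passes it through unchanged,
// so it can be dropped into a map or pipeline without altering the output.
public static class TraceHelper
{
    public static string TraceValue(string label, string value)
    {
        System.Diagnostics.Trace.WriteLine(string.Format("{0} - {1}", label, value));
        return value;
    }
}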

Scripting Deployment of WCF Services using BtsWcfServicePublishing


BizTalk makes deploying WCF and ASMX services easy using the BizTalk Web Service Deployment wizard, but it can be tedious and time-consuming deploying new versions whilst developing new services, where the schemas often change and need to be refreshed. Fortunately the task can be scripted using the BtsWcfServicePublishing.exe tool, downloadable from Microsoft, which picks up its values from a predefined XML file. This tutorial talks you through the steps you need to take to do this.

Go through the BizTalk WCF deployment wizard in the normal way. Just before the end of the wizard some XML appears which you can use to automate future deployments. Highlight this text, copy it to the clipboard with Ctrl-C and paste it into a new .xml file in a text editor of your choice before saving to disk. Then publish your service using the wizard.

A slight quirk of the tool (at least on my server deployment) is that I need to change <?xml version="1.0" encoding="utf-16"?> to just <?xml version="1.0"?> at the top of the file. I don’t really know why.

Then whenever your schema or other service properties change, deploy the new schema to BizTalk in the usual way from Visual Studio and then run the BtsWcfServicePublishing.exe tool from the command prompt with one parameter, the path to your saved XML file, e.g.

"D:\Temp\BtsWcfServicePublishing.exe" "D:\Temp\myexportedwcfconfig.xml"

You can alter any of the properties in the XML file; most are self-explanatory. Some of the more useful ones are:

  • CreateReceiveLocations="false". If you are just going to refresh the schema you probably want to set this to false, otherwise all of the receive location properties you have configured will get wiped out.
  • Location="http://localhost/MyService". You might want to change this to another location so that you have different authentication methods, for example. When I am developing certificate-based services I find it very useful to have a parallel service with all authentication disabled so I can use test tools such as SoapUI.
  • Overwrite="true". You probably always want this to be true, otherwise the process will fail.
  • AuthAnonymous="true". You probably always want this to be true too, as authentication is usually managed on the BizTalk side rather than in IIS.

You can also hand-craft the XML to add new Operations and Services, which is very useful when the scope of your service operations expands, as you don’t have to reconfigure all of the existing services by selecting schemas and the like in the wizard.

I would also recommend adding the XML configuration to your Visual Studio BizTalk project and keeping it under source control so that other developers can also take advantage of this.

ESB with a Publish / Subscribe model in BizTalk 2010


I’ve done a lot of work implementing both ESB and Publish/Subscribe patterns and I have come to the conclusion that they are not really compatible with each other, at least not in BizTalk 2010.

  • The Publish / Subscribe model works best when you have one master data source (the publisher) and you need to keep slave datastores in sync with the master.
  • ESB is more suited to the opposite model, where multiple publishers pass messages to one subscriber. For example, a chain of shops taking orders for products from various store outlets into one central order processing system. It can also be used for exposing functionality and data via WCF services using one universal endpoint.

The ESB is not really worth the effort unless you have a lot of clients calling your services which are not within your control, e.g. external suppliers or resellers. In this case the ESB gives you the flexibility to add new services and schemas without having to ask all of your 3rd-party clients to update their code. If you are developing services that will only ever be consumed from within your own organisation then I’d recommend giving it a miss, as it just adds complexity for little or no benefit.

I started a discussion on MSDN a few months ago and it seems others have come to the same conclusion.

BizTalk WCF-SQL Transactions Hanging


I’ve had a lot of trouble recently with MS SQL database transactions that never seem to finish when useAmbientTransaction is set to true in the WCF-SQL adapter. What made the problem more infuriating is that it didn’t always happen, and was much more frequent in my development environment. I was trying to follow the method detailed in Stuart Brierley’s excellent blog article, which describes how to execute multiple procedures/inserts/updates in one message and transaction. I’ve also experienced the same problem when transactions are declared within stored procedures BizTalk is trying to execute.

Whilst all this was going on I had noticed but discounted this error appearing in the windows event viewer.

The following stored procedure call failed: ” { call [dbo].[bts_UpdateMsgbox_SendHost]( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)}”. SQL Server returned error string: “Warning: The join order has been enforced because a local join hint is used.;Warning: The join order has been enforced because a local join hint is used.;Duplicate key was ignored.”.

I had originally ignored the error, mistakenly thinking it wasn’t really relevant. Why would an error on the BizTalk database have any impact on the transaction taking place on the application database?

I eventually found that this error was causing the entire transaction to hang, or at least never actually commit to the database, even though a SQL trace proved that the queries were running all the way through to completion. The connection would just sit there, idle and open forever, and there would be a lock on the database table I was trying to update/insert into. The only way of releasing the transaction was to run sp_who and then KILL the offending SPID.
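For reference, releasing it from SQL Server Management Studio looks something like this (the SPID is illustrative; take it from the sp_who output):

EXEC sp_who;  -- find the idle session holding the lock
KILL 72;      -- replace 72 with the offending SPID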

Reading more into my problems, this error is related to performance issues in the environment, which is entirely plausible given the modest hardware specs of my development VM. The solution for me was to turn off tracking on my send/receive ports. I have also read that running this stored procedure can help:

DECLARE @RC int
DECLARE @fLeaveActSubs int

SET @fLeaveActSubs = 1

EXECUTE @RC = [BizTalkMsgBoxDb].[dbo].[bts_CleanupMsgbox] @fLeaveActSubs
GO

Flat File Assembler Encoding and Charset


I’ve had lots of problems recently with file encodings, including foreign and special characters not appearing properly in target flat files. For example, ø being written as ? and, even worse, an accented é translated as the two characters Ã©, which meant that the column-aligned file got messed up.

The reason is that the source system provides the data in UTF-8 (the best encoding, because in theory it deals with all character sets) but the files were being created as ANSI, which is far more limited and, unless you specify the charset, will cause the issues I have described. However, the target system I am creating files for is a little archaic and does actually require files with an ANSI charset.

BizTalk can handle this issue quite well out of the box, but you have to specify the charset in the flat file assembler component, otherwise it will take a guess which will in all likelihood be wrong. I have also found that you have to specify this at design time in Visual Studio, otherwise the change will not take effect. So although you have the option of specifying the charset in the send port properties, this does not work properly – it must be a bug in BizTalk.

In Visual Studio, open up your pipeline and select the Flat file assembler component.

Then select the file encoding which matches what your target system expects. This might be UTF-8, but it may also be ANSI, in which case you should select the matching charset, e.g. Western European (1252).

You should use a similar process when reading files using the flat file disassembler component. Also be careful when using external tools such as an otherwise excellent PGP pipeline component. I had an issue where the pipeline component was writing a file to a temporary location but didn’t take the encoding into account, so the encoding was lost. Here is the code I used to change the EncryptStream method in the component source code.

        /// <summary>
        /// Encrypts the stream.
        /// </summary>
        /// <param name="inStream">The in stream.</param>
        /// <param name="pubKeyFile">The pub key file.</param>
        /// <param name="sourceFilename">The source filename.</param>
        /// <param name="extension">The extension.</param>
        /// <param name="armorFlag">if set to <c>true</c> [armor flag].</param>
        /// <param name="integrityCheckFlag">if set to <c>true</c> [integrity check flag].</param>
        /// <param name="strCharset">The STR charset.</param>
        /// <param name="targetCharset">The target charset.</param>
        /// <returns></returns>
        public static Stream EncryptStream(Stream inStream, string pubKeyFile, string sourceFilename, string extension, bool armorFlag, bool integrityCheckFlag, String strCharset, String targetCharset)
        {
            string tmpEncryptedFile = sourceFilename + "." + extension;

            // View debug lines with Debugview.exe
            //System.Diagnostics.Debug.WriteLine(strCharset + " is the charset");

            try
            {

                using (StreamWriter sourceStream = new StreamWriter(sourceFilename, true, Encoding.GetEncoding(targetCharset)))
                {
                    using (StreamReader sr = new StreamReader(inStream, Encoding.GetEncoding(strCharset), true))
                    {
                        sourceStream.Write(sr.ReadToEnd());
                    }
                }

                using (FileStream encryptedStream = File.Create(tmpEncryptedFile))
                {
                    PgpPublicKey publicKey = ReadPublicKey(File.OpenRead(pubKeyFile));
                    EncryptFile(encryptedStream, sourceFilename, publicKey, armorFlag, integrityCheckFlag);

                    encryptedStream.Seek(0, SeekOrigin.Begin);
                    int myCapacity = (int)encryptedStream.Length;
                    Byte[] byteArray = new Byte[myCapacity];

                    MemoryStream memStream = new MemoryStream();

                    encryptedStream.Read(byteArray, 0, myCapacity);
                    memStream.Write(byteArray, 0, myCapacity);

                    encryptedStream.Close();

                    memStream.Seek(0, SeekOrigin.Begin);
                    return memStream;
                }
            }
            catch (Exception ex)
            {
                //System.Diagnostics.EventLog.WriteEntry("PGPWrapper[Encrypt] Error", ex.ToString());
                throw; // rethrow, preserving the original stack trace
            }
            finally
            {
                // Clean-up temp files
                if (File.Exists(sourceFilename)) { File.Delete(sourceFilename); }
                if (File.Exists(tmpEncryptedFile)) { File.Delete(tmpEncryptedFile); }

                //FileInfo fileInfo = new FileInfo(sourceFilename);
                //System.Diagnostics.EventLog.WriteEntry("PGP_Debug", fileInfo.FullName + " was deleted");

                //fileInfo = new FileInfo(tmpEncryptedFile);
                //System.Diagnostics.EventLog.WriteEntry("PGP_Debug", fileInfo.FullName + " was deleted");
            }
        }

Search with criteria for value in different XML node in an XSLT Template


There have been a few instances where I have had to filter incoming XML looking for a value from a different node in the document. In the example in this article I had the parent node id, but I wanted to know the name of that parent node.

This excellent article gave me a starting point for the principle of searching for values in different nodes using an XSLT template. This worked splendidly in one map I was working on, where I was summing all values in the current list of node items.


<xsl:template name="OutputSum">
  <xsl:param name="param1" />
  <xsl:element name="TransactionAmount">
    <xsl:variable name="var:vTranAmt"
    select="sum(//ExpenseItem[ReportEntryID=$param1]/TransactionAdjustmentAmount)" />

    <xsl:value-of select="$var:vTranAmt" />

  </xsl:element>
</xsl:template>
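For completeness, the template is then invoked from the point in the map where the total is needed, passing the current entry’s ID in as the parameter (element names as per the schema above):

<xsl:call-template name="OutputSum">
  <xsl:with-param name="param1" select="ReportEntryID" />
</xsl:call-template>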

However, when I tried to use the same principle to select the name value of a parent node for which I only knew the ID, it just wouldn’t work. I kept getting blank values no matter what I tried.

The source schema was from the MS Dynamics CRM v4 web services using the RetrieveMultiple method and is much more complex than the ones I normally use. After a lot of head-scratching I noticed that it was the namespaces that were causing the problems. When I right-clicked the map, selected Validate and viewed the .xsl that Visual Studio was generating, I noticed that BizTalk was adding some bizarre namespace prefixes to the nodes, e.g. s6:RetrieveMultipleResult/s4:BusinessEntities. Eventually I got it to work using these namespaces.


<xsl:template name="ParentCompanyCode">
  <xsl:param name="param1" />
  <xsl:element name="Code">

                <xsl:value-of select="//s4:BusinessEntity[s6:new_geographyid=$param1]/s6:new_name" />

  </xsl:element>
</xsl:template>

Unfortunately I can’t explain exactly why this was required, but my advice is that if you can’t work out why custom XSL isn’t working, create something similar using functoids and look through the XSL it generates via the Validate Map functionality.


Lost Messages in WCF-SQL Polling Receive Location


I’ve been having a problem lately with messages not appearing in a WCF-SQL receive location. The receive location polls for new data every 30 seconds using the polledDataAvailableStatement, and when it finds rows to process, the pollingStatement extracts them and returns them to BizTalk via a very simple stored procedure. This mostly worked fine, but mysteriously, around 25% of the time the rows were set to processed yet no message appeared in the BizTalk message box and the items were lost forever. There were no suspended or failed messages. No messages in the Event Log. Absolutely nothing. Very mysterious.

The solution, I think, is that whenever you have a polling statement that does anything other than a plain select (mine updates the affected rows to PROCESSED = true), you have to set useAmbientTransaction to true. We had originally disabled this because we thought it might have been related to table-locking behaviour we had experienced in our development environments.

This MSDN article seems to support the solution:

Not performing operations in a transactional context is advisable only for operations that do not make changes to the database. For operations that update data in the database, we recommend setting the binding property to true; otherwise you might either experience message loss or duplicate messages, depending on whether you are performing inbound or outbound operations.
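As a rough illustration, the relevant binding settings on the receive location would look something like the fragment below. The attribute names are the SQL adapter’s binding properties; the table and procedure names are placeholders rather than my real statements.

<sqlBinding>
  <binding name="PollingBinding"
           inboundOperationType="Polling"
           pollingIntervalInSeconds="30"
           polledDataAvailableStatement="SELECT COUNT(*) FROM dbo.MyQueue WHERE Processed = 0"
           pollingStatement="EXEC dbo.usp_GetUnprocessedRows"
           useAmbientTransaction="true" />
</sqlBinding>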


Securing a WCF Service in BizTalk by Role/Claim using Windows or Claims Based Authentication


There are loads of articles on the web about how to implement Windows or certificate authentication, but very few that tell you how to actually use the authentication mechanism to restrict access to a WCF service to a particular group of users. One fantastic exception is this blog by Richard Seroter, on which much of this article is based. The major difference is that I also implement claims-based authentication for use with a federated service endpoint, as well as bringing the code samples up to date.

So you’ve got a WCF service working with either Windows or federated security, but now what? How do we authorize particular users/groups? Unfortunately there is no out-of-the-box solution in BizTalk (at least not up to 2010), so you have to create your own custom behavior extension. The good news is that you only need to write this component once and you can then re-use it across all of your BizTalk WCF services.

In Visual Studio create a new class called BizTalkCustomServiceAuthManager.cs in an existing or newly created generic project (I call mine <CompanyName>.BizTalk.CommonArtifacts.Components, which also holds lots of my other common classes). Add references to System.Configuration, System.IdentityModel and System.ServiceModel. Create three fields: m_group, which will be the AD group or the claim value to search for; m_claimtype, which will be the name of the claim to search for (claims only); and m_authenticationType, which will be an enum allowing you to select either Windows or claims-based authentication in the receive port configuration. This could easily be extended to handle another type of authentication mechanism if you wanted.


    public class BizTalkCustomServiceAuthManager : ServiceAuthorizationManager
    {
        private string m_group;
        private string m_claimtype;
        private CustomServiceAuthenticationType m_authenticationType;

        /// <summary>
        /// Claims or Windows authentication.
        /// </summary>
        public enum CustomServiceAuthenticationType
        {
            WINDOWS = 1,
            CLAIMS = 2
        };

        /// <summary>
        /// Initializes a new instance of the class.
        /// </summary>
        /// <param name="group">The AD group or claim value to search for.</param>
        /// <param name="authenticationType">Type of the authentication.</param>
        /// <param name="claimType">The claim type to search for (claims only).</param>
        public BizTalkCustomServiceAuthManager(string group, CustomServiceAuthenticationType authenticationType, string claimType)
        {
            this.m_group = group;
            this.m_authenticationType = authenticationType;
            this.m_claimtype = claimType;
        }

        // The CheckAccessCore and helper methods below also go inside this class.
    }

Now add the following block of code inside the class; it forms the core authorization logic for either Windows or claims authentication, depending on what is chosen. Note that I’ve added a lot of Debug lines so you can see what is going on when we start testing the configuration.

        /// <summary>
        /// Checks authorization for the given operation context based on default policy evaluation.
        /// </summary>
        /// <param name="operationContext">The <see cref="T:System.ServiceModel.OperationContext"/> for the current authorization request.</param>
        /// <returns>
        /// true if access is granted; otherwise, false. The default is true.
        /// </returns>
        protected override bool CheckAccessCore(OperationContext operationContext)
        {
            //check that basic access is ok before checking our custom conditions
            if (!base.CheckAccessCore(operationContext))
            {
                return false;
            }

            if (this.m_authenticationType == CustomServiceAuthenticationType.CLAIMS)
            {
                return performClaimsAuthentication(operationContext);
            }
            if (this.m_authenticationType == CustomServiceAuthenticationType.WINDOWS)
            {
                return performWindowsAuthentication(operationContext);
            }
            else
            {
                System.Diagnostics.Trace.WriteLine("No Authentication Type Selected in BizTalk binding ");
                return false;
            }
        }

        /// <summary>
        /// Performs the Windows authentication.
        /// </summary>
        /// <param name="operationContext">The operation context.</param>
        /// <returns>true if the caller is in one of the configured groups.</returns>
        public bool performWindowsAuthentication(OperationContext operationContext)
        {
            //print out inbound identities recorded by WCF
            System.Diagnostics.Trace.WriteLine("Primary Identity is " +
                operationContext.ServiceSecurityContext.PrimaryIdentity.Name);
            System.Diagnostics.Trace.WriteLine("Windows Identity is " +
                operationContext.ServiceSecurityContext.WindowsIdentity.Name);
            //create Windows principal object from inbound Windows identity
            WindowsPrincipal p = new WindowsPrincipal(operationContext.ServiceSecurityContext.WindowsIdentity);
            //check user in role
            string[] windowsGroups = this.m_group.Split(',');

            foreach (string windowsGroup in windowsGroups)
            {

                bool isInRole = p.IsInRole(windowsGroup);
                if (isInRole)
                {
                    System.Diagnostics.Trace.WriteLine("User is in role. Security Accepted " +
                                    windowsGroup);
                    return true;
                }
                else
                {
                    System.Diagnostics.Trace.WriteLine("User is not in role '" +
                                windowsGroup + "'");
                }
            }
            System.Diagnostics.Trace.WriteLine("Security Rejected Request.");
            return false;
        }

        /// <summary>
        /// Performs the claims authentication.
        /// </summary>
        /// <param name="operationContext">The operation context.</param>
        /// <returns></returns>
        public bool performClaimsAuthentication(OperationContext operationContext)
        {

            string[] claimValues = this.m_group.Split(',');

            foreach (ClaimSet claimset in operationContext.ServiceSecurityContext.AuthorizationContext.ClaimSets)
            {
                foreach (Claim claim in claimset.FindClaims(this.m_claimtype, Rights.PossessProperty))
                {

                    //Add to headers
                    System.Diagnostics.Trace.WriteLine("Current Claim is " +
                                    claim.ClaimType);

                    System.Diagnostics.Trace.WriteLine("Current Claim Resource is " +
                                    claim.Resource.ToString());

                    foreach (string claimValue in claimValues)
                    {
                        if (claim.Resource.ToString() == claimValue)
                        {
                            System.Diagnostics.Trace.WriteLine(String.Format("User is in claim {0} Security Accepted ",claimValue));
                            return true;
                        }
                    }
                }
            }
            System.Diagnostics.Trace.WriteLine("User is not in claim. Security Denied " +
                            this.m_group);

            return false;
        }

Now add a new class called BizTalkCustomServiceBehavior.cs that implements IServiceBehavior. This doesn’t really do very much, but it is necessary: it is what wires the custom ServiceAuthorizationManager into the service at runtime.

public class BizTalkCustomServiceBehavior : IServiceBehavior
    {
        private string m_group;
        private CompanyName.BizTalk.CommonArtifacts.Components.BizTalkCustomServiceAuthManager.CustomServiceAuthenticationType m_authenticationType;
        private string m_claimtype;

        /// <summary>
        /// Initializes a new instance of the class.
        /// </summary>
        /// <param name="group">The AD group or claim value.</param>
        /// <param name="authenticationType">Type of the authentication.</param>
        /// <param name="claimType">The claim type (claims only).</param>
        public BizTalkCustomServiceBehavior(string group, CompanyName.BizTalk.CommonArtifacts.Components.BizTalkCustomServiceAuthManager.CustomServiceAuthenticationType authenticationType, string claimType)
        {
            this.m_group = group;
            this.m_authenticationType = authenticationType;
            this.m_claimtype = claimType;
        }

        /// <summary>
        /// Provides the ability to change run-time property values or insert custom extension objects such as error handlers, message or parameter interceptors, security extensions, and other custom extension objects.
        /// </summary>
        /// <param name="serviceDescription">The service description.</param>
        /// <param name="serviceHostBase">The host that is currently being built.</param>
        public void ApplyDispatchBehavior(ServiceDescription serviceDescription, ServiceHostBase serviceHostBase)
        {
            ServiceAuthorizationBehavior authBehavior =
                serviceDescription.Behaviors.Find<ServiceAuthorizationBehavior>();
            //pass in the Windows group / claim settings captured during config setup
            authBehavior.ServiceAuthorizationManager = new BizTalkCustomServiceAuthManager(this.m_group, this.m_authenticationType, this.m_claimtype);
            ((IServiceBehavior)authBehavior).ApplyDispatchBehavior(serviceDescription, serviceHostBase);
        }

        /// <summary>
        /// Provides the ability to pass custom data to binding elements to support the contract implementation.
        /// </summary>
        /// <param name="serviceDescription">The service description of the service.</param>
        /// <param name="serviceHostBase">The host of the service.</param>
        /// <param name="endpoints">The service endpoints.</param>
        /// <param name="bindingParameters">Custom objects to which binding elements have access.</param>
        public void AddBindingParameters(
                ServiceDescription serviceDescription,
                ServiceHostBase serviceHostBase,
                Collection<ServiceEndpoint> endpoints,
                BindingParameterCollection bindingParameters)
        {
        }

        public void Validate(
                ServiceDescription serviceDescription,
                ServiceHostBase serviceHostBase)
        {
        }
    }

And finally add the behavior element, where we define the custom properties that can be set in the receive port properties.

public class BizTalkCustomBehaviorElement : BehaviorExtensionElement
    {
        /// <summary>
        /// Custom config property that shows up in the BizTalk receive location.
        /// </summary>
        [ConfigurationProperty("group", IsRequired = false, DefaultValue = "")]
        public string Group
        {
            get { return (string)base["group"]; }
            set { base["group"] = value; }
        }

        /// <summary>
        /// Gets or sets the type of the auth.
        /// </summary>
        [ConfigurationProperty("authenticationtype", IsRequired = false, DefaultValue = CompanyName.BizTalk.CommonArtifacts.Components.BizTalkCustomServiceAuthManager.CustomServiceAuthenticationType.WINDOWS)]
        public CompanyName.BizTalk.CommonArtifacts.Components.BizTalkCustomServiceAuthManager.CustomServiceAuthenticationType AuthType
        {
            get { return (CompanyName.BizTalk.CommonArtifacts.Components.BizTalkCustomServiceAuthManager.CustomServiceAuthenticationType)base["authenticationtype"]; }
            set { base["authenticationtype"] = value; }
        }

        /// <summary>
        /// Gets or sets the type of the claim.
        /// </summary>
        [ConfigurationProperty("claimtype", IsRequired = false, DefaultValue = "")]
        public string ClaimType
        {
            get { return (string)base["claimtype"]; }
            set { base["claimtype"] = value; }
        }

        /// <summary>
        /// Creates a behavior extension based on the current configuration settings.
        /// </summary>
        /// <returns>The behavior extension.</returns>
        protected override object CreateBehavior()
        {
            return new BizTalkCustomServiceBehavior(Group, AuthType, ClaimType);
        }

        /// <summary>
        /// Gets the type of behavior.
        /// </summary>
        public override Type BehaviorType
        {
            get
            {
                return typeof(BizTalkCustomServiceBehavior);
            }
        }
    }

Now build your project and add it to the GAC using a command like this:

"C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\bin\NETFX 4.0 Tools\x64\gacutil.exe" /if "D:\Projects\Biztalk\CompanyName.CommonArtifacts\CompanyName.CommonArtifacts.Components\bin\Debug\CompanyName.BizTalk.CommonArtifacts.Components.dll"

Next you need to open your machine.config – I know it sounds a bit scary but it’s the only way. Mine is located in C:\Windows\Microsoft.NET\Framework\v4.0.30319\Config, but it depends on which version of .NET you are using and also whether you are running 32-bit or 64-bit applications.

Search for the behavior extensions tag <behaviorExtensions> and, just before the closing tag, add an entry pointing at your new class in the GAC. If you aren’t sure what the fully qualified type name should look like, you can view the assembly metadata using something like gacutil.exe -l CompanyName.BizTalk.CommonArtifacts.Components

<add name="CustomAuthBehavior" type="CompanyName.BizTalk.CommonArtifacts.Components.BizTalkCustomBehaviorElement, CompanyName.BizTalk.CommonArtifacts.Components, Version=1.0.0.0, Culture=neutral, PublicKeyToken=6dbf238400666825" />
</behaviorExtensions>

Almost done. Now restart your BizTalk host instances and close and re-open the BizTalk Administration Console. Open the properties for any Windows or federated binding WCF-Custom receive location (note that it does have to be WCF-Custom) and click Configure. In the Behavior tab, right-click ServiceBehavior and select “Add extension…”. You should see your new service behavior class appearing there – if it isn’t there, check you edited the correct machine.config (note you will need to edit the machine.config within the relevant Framework64 directory as well if you are running a 64-bit OS) and that the entry definitely references the right assembly.

To configure access for a Windows AD group, select the WINDOWS authentication type and specify the AD group (including the domain). Note that you can add multiple groups by separating them with commas.

To use claims authentication you also need to specify which claim to search on. The group will relate to the value of that claim; e.g. in this example the claim type is membership of a SharePoint 2010 site, http://schemas.company.com/identity/claims/2010/07/teammember, and the value is the name of the site itself, e.g. “Team A”. Again, you can specify multiple values by separating the group parameter with commas.

Now restart IIS as well and put some test cases through. If authorization fails, the caller will receive a Service Authorization fault, and you can watch the corresponding Debug output in DebugView.

So perhaps not the most straightforward of solutions, and I really think Microsoft should have bundled this functionality out of the box. But the pain is only felt once, and at least you get total flexibility over how the authorization is performed.

System.Transactions.TransactionException: The operation is not valid for the state of the transaction


When running large amounts of data into an MS SQL database via a composite WCF-SQL send port in BizTalk, I would eventually get (after about 20 minutes) the error “System.Transactions.TransactionException: The operation is not valid for the state of the transaction”. Smaller amounts of data ran fine.

The solution was to turn off the transaction in the send port. Of course this means you lose the transactional capabilities, but do you really want to lock a table on a live database for 20 minutes anyway? It may be good practice to only turn this feature off when you know you are running a long transaction, and then turn it back on again when you are finished.

(Screenshot: turning off the transaction in the send port configuration)

Active Service Instance Never Finishing in BizTalk 2010


A colleague of mine was trying to get ordered delivery running on a send port but kept seeing a rogue active service instance that never seemed to finish, almost as if it was hanging or waiting for something else to happen. The process overall seemed to complete without error, and there was only ever one of these service instances no matter how many messages were put through BizTalk. Of course you could manually terminate the instance, but we were worried that it might create issues, such as support people wasting time investigating it. The instance itself was blank, in that it had no messages, errors or properties other than an instance Id, a timestamp and a send port name.

We did lots of investigation including detailed orchestration debugging and traffic monitoring in Wireshark but all seemed normal. It had us stumped for hours.

The answer is that it isn’t an issue at all; it is the way Microsoft has designed the port to work with ordered delivery, as described in this article:

Ordered Delivery Send Port instance works as a singleton service. Since start it stays in Running state. It will not recycle if we restart its Host Instance. We could manually terminate it, if we want.

BizTalk Flat File Assembler with Untagged Header and Trailer/Footer


I needed to build an interface which converted an inbound message into a tab-delimited file containing one header line, any number of detail lines and finally one footer line - essentially 3 different schemas/formats in one file, e.g.

1->0000002->20130101->M
00001->A bike->300.00->Sent->true->true
00002->A ball->30.00->Placed->true->false
00003->A skipping rope->5.00->Sent->true->true
1->3->335.00

I originally tried combining the 3 different schemas into one and had some success. However, the footer row was causing issues, and I subsequently found out that BizTalk didn’t know when the repeating record finished and the footer row started, so it was processing all the rows as the repeating row and failing with something like “Native Parsing Error: Unexpected data found while looking for: ’\t’”. One solution was to add a tag identifier to the footer schema, e.g. FOOTER->1->3->335.00, which worked fine except that this was someone else’s schema (a 3rd party’s), so I couldn’t start adding FOOTER-> to the footer of each file, otherwise it wouldn’t get processed.

The clue to the solution was provided in this post by colmac73, which involves executing pipeline components from within an orchestration. This post adds a few more details as to how I achieved it.

  1. First of all create a separate schema for each of the 3 formats, e.g. SchemaHeader, SchemaBody, SchemaTrailer. This is described in detail in a Microsoft article which explains how to do a similar thing but for receiving rather than sending messages.
  2. Create 3 separate maps to transform the source data to each of the SchemaHeader, SchemaBody, SchemaTrailer you have just created
  3. Create a pipeline called ConvertToFlatfile containing nothing more than a flat file assembler. As colmac73 says in his article, leave the header/trailer and document schemas set to (none) as these will automatically be detected at runtime.
  4. Create an orchestration which receives your inbound message however you prefer. I like to use filter-based receive shapes that look for schema types, promoted properties or, in this case, BTS.ReceivePortName == “MyFileReceivePort”
  5. Create 3 messages in the orchestration, one for each of SchemaHeader, SchemaBody and SchemaTrailer
  6. Create a message called CombinedSendPipelineMessage of type System.Xml.XmlDocument. This will be the message you send to the outbound port.
  7. Create a variable called SendPipelineInput of type Microsoft.XLANGs.Pipeline.SendPipelineInputMessages. Note that you will need to reference Microsoft.XLANGs.Pipeline.dll which is located in the BizTalk root installation directory, e.g. D:\Program Files (x86)\Microsoft BizTalk Server 2010
  8. Execute all 3 of your transforms, either in sequence or in parallel
  9. Create a message assignment shape and enter something like this
    CombinedSendPipelineMessage = new System.Xml.XmlDocument();
    
    SendPipelineInput = new Microsoft.XLANGs.Pipeline.SendPipelineInputMessages();
    SendPipelineInput.Add(msgHeader);
    SendPipelineInput.Add(msgBody);
    SendPipelineInput.Add(msgTrailer);
    
    Microsoft.XLANGs.Pipeline.XLANGPipelineManager.ExecuteSendPipeline(typeof(MyCompany.Namespace.Pipelines.ConvertToFlatfile),SendPipelineInput, CombinedSendPipelineMessage);
    
  10. Send the message to a send port. I usually like to use direct subscription / filter-based bindings, but for this I bound the port directly to the orchestration using “Specify later”
  11. Configure the send port as FILE / SFTP depending on your requirements and select PassThruTransmit as the pipeline (as the pipeline has already been executed from within the orchestration).
  12. Now if everything is working correctly, you should be able to output any inbound message into the required header / body / trailer format at your send port.

Your Orchestration might look a bit like this.

BizTalk Flat File Assembler Orchestration with Header and Footer

Hyperion Planning Import Aborts – Required Column ‘Member Name’ not found


I have been involved with writing interfaces into Oracle Hyperion Planning over the past year, although mostly limited to making the data available for our specialist contractors to import into the model. We encountered an error today which had us stumped for hours, as there was little or no logging to indicate where the problem was.

We were trying to import a dimension using EPMA, which we had done many times before. This time, however, the import would “Abort” with no indication why. This is what the output from the EPMA job looked like:

2014-01-16 11:57:02,068 INFO Import job has been successfully submitted with job id number "18,441".
2014-01-16 11:57:02,068 INFO JobID 18,441 submitted. To view detailed job information go to https://myapp.mycompany.com/workspace/?module=awb.appmanager&bpm.docid=bpmarj&bpm.logoff=false&jobId=18441.
2014-01-16 11:57:07,115 INFO The import job status is "Aborted".
2014-01-16 11:57:07,115 INFO Import failed.

We searched everywhere and eventually found this output in the file D:\Oracle\Middleware\user_projects\epmsystem1\diagnostics\logs\epma:

[2014-01-16T11:20:00.898+00:00] [EPMADIM] [INTERNAL_ERROR:32] [EPMADIM-1] [EPMADIM.Hyperion.CommonServices.Exceptions.BaseException] [tid: 15] [ecid: disabled,0] SVR_ERR_IMPORT_COLUMN_NOT_FOUND:Required Column 'Member Name' not found at Hyperion.DimensionServer.ImportHierarchyItem.ValidateColumns()
at Hyperion.DimensionServer.ImportEngine.CleanOutBadImportItems(ImportDimension dim)
at Hyperion.DimensionServer.ImportProfile.IterateDimensions(Action`1 action)
at Hyperion.DimensionServer.ImportEngine.ParseAndExecuteImport()
at Hyperion.DimensionServer.ImportEngine.StartImport(Boolean updateFinalJobStatus)

That still didn’t help us much. Eventually a colleague noticed that in the source data table which feeds the hierarchy there was a row with the Child field set to blank, i.e. an empty string. This is what was causing the issue. So simple, yet so much time to find. Anyway, I hope it might help someone else in the same position.

Filtering Effective Dated Messages in BizTalk


I do a lot of work integrating systems with PeopleSoft using BizTalk. One of the core concepts in PeopleSoft is effective dating, that is, the ability to keep a history of an entity within the database controlled by an effective date field. For example, PeopleSoft may hold an address for a company or person, but we know the address will change on a particular date in the future. So you will have one entry effective dated in the past and another effective dated in the future. The advantage is that on the day the person or company moves address, the system will automatically pick up the correct one. It also helps keep an audit of historical addresses.

When consuming component interfaces exposed as web services using Integration Broker, all historical and future dated rows are returned. So how can you program a BizTalk map to return only the current row? The answer is to apply filtering via an inline C# functoid.

In this example I am using the CUSTOMER_MAIN CI from the Accounts Receivable module in PeopleSoft Financials to filter for current addresses. The incoming message has 2 different addresses (ADDRESS_SEQ_NUM), but one of them has 3 CUST_ADDRESS entries with different effective dates. The end result I want is 2 addresses, the current address for each ADDRESS_SEQ_NUM.

Create an inline C# functoid with the following code. Note that this was based on the tutorial here.

public System.Collections.Generic.List<int> duplicateList = new System.Collections.Generic.List<int>();

public bool IsDuplicate(int addressSeqNo, string effectiveDate, string effectiveStatus)
{
    DateTime dt = Convert.ToDateTime(effectiveDate);

    // Not interested in future effective dates
    if (dt.Date > DateTime.Now.Date)
        return true;

    // Not interested in inactive addresses either
    if (effectiveStatus != "A")
        return true;

    // Only take the first valid row for each address sequence number
    if (duplicateList.Contains(addressSeqNo))
        return true;

    duplicateList.Add(addressSeqNo);
    return false;
}

Connect EFFDT, ADDRESS_SEQ_NUM and EFF_STATUS to it, and connect it to a Logical Equal functoid with false hard-coded as the second parameter. This is then sent to the destination node, as per the screenshot below.

(Screenshot: FilteringAnEffectiveDatedMessage – the functoid wiring in the BizTalk mapper)

Essentially this filters out any future effective dated rows and anything that is not active. It then picks the first valid row, ignoring any others that follow. It relies on the dates being in sequential order, but this is enforced via the PeopleSoft front end.


NUnit Test Cases picking up cached version of an assembly


We all know that Test Driven Development makes development so much quicker and safer. However, I’ve had a problem for a while whereby my NUnit test cases which reference libraries built for BizTalk projects don’t pick up changes when I modify the code and re-run the test case. The only way around it was to restart Visual Studio – a major pain.

Today I did a bit more digging and found out what was wrong. BizTalk needs signed assemblies deployed to the GAC to be picked up at runtime, so I’ve always signed mine automatically and also added a post-build event to GAC my assemblies on a successful build. It was this that was causing my NUnit test cases to pick up the GACed version instead.

The solution is to untick the “sign the assembly” box in the properties of the library and also REM out the post-build events whilst you are in the unit testing phase. This means NUnit will pick up the version of the .dll built within your Visual Studio environment.
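For reference, the post-build event in question was along these lines ($(TargetPath) is the standard Visual Studio macro for the built assembly; the SDK path varies by machine):

"C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\bin\NETFX 4.0 Tools\x64\gacutil.exe" /if "$(TargetPath)"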
