Zulfiqar's weblog

Architecture, security & random .Net

RFID Stock Management–Architecture & Challenges

Posted by zamd on September 12, 2014

I’m currently working on an RFID-based stock management solution. The high-level goals of the solution are to ensure that the stock shown on the computer matches the stock actually available in the store, that the correct products (in the correct quantities) are displayed on the shop floor, and to reduce stock loss through real-time visibility of what passes through the tills before it leaves the store. As a start, we are only doing this for clothing products: our items are source (factory) tagged with RFID tags, and these items are then tracked from delivery to sale & returns using various types of RFID readers, such as handhelds, fixed portals, security gates and Click & Collect readers.

The hardware side of the project is interesting, but it’s mostly off-the-shelf readers & gates supplied by Motorola, Checkpoint & Nedap. These readers do the bulk of the work, and we run a simple integration agent on top of them to connect them to our software backend.

Our software backend is a set of SOA-style (REST) web services built with ASP.NET Web API & hosted on Windows. From a design & implementation perspective, we use CQS, DDD & Event Sourcing, and our domain entities are persisted (using NHibernate) in Oracle (Exadata), which is our master data store. We use SpecFlow/NCrunch to automate our acceptance testing and NUnit for unit testing.

We started this as a typical .NET project and hit interesting challenges around performance & latency on the read path, which pushed us to do more & more work on the asynchronous write path. We started to separate our read & write stores and decided to build the read store entirely in the cache, based on the event stream we capture on the write path. We started with Couchbase, with its memcached Data Bucket, as our first choice for the read store. Couchbase is a great technology: it converges the key-value & document store models into one great product, and I love the power & simplicity of its map/reduce framework.

Couchbase didn’t work out for us, though: our latency was still high because of the computation involved over huge lists of stock data. We needed to bring data streams from the cache into our service, compute variances, categorization etc., and then store the computed results back in the cache.

Our next choice was Redis; the Sorted Set data structure in Redis aligned nicely with the data model we needed to store & compute. I’m extremely impressed with the power of Redis, and the ability to run computation in the cache is exactly what we needed. Most of our computation can be done with a single union or intersection command on sorted sets, which is a sub-millisecond operation. We are actively building on this model and, in future posts, I’ll share more details on our architecture and the specifics of our Redis usage. Our high-level architecture looks like this…
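As a teaser of why the sorted-set model fits so well: conceptually, a stock variance is just a union of two sorted sets with weights 1 and -1. The tiny sketch below (SKU names and quantities are invented for illustration) mimics in plain C# what a single ZUNIONSTORE command with WEIGHTS 1 -1 computes for us inside Redis.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustration only: the per-SKU variance between expected and counted
// stock, i.e. what ZUNIONSTORE dest 2 expected counted WEIGHTS 1 -1 yields.
static class StockVariance
{
    public static Dictionary<string, double> Variance(
        IDictionary<string, double> expected, IDictionary<string, double> counted)
    {
        // Union of members; score = 1 * expected + (-1) * counted
        return expected.Keys.Union(counted.Keys)
            .ToDictionary(
                sku => sku,
                sku => (expected.ContainsKey(sku) ? expected[sku] : 0)
                     - (counted.ContainsKey(sku) ? counted[sku] : 0));
    }
}
```

In Redis itself this collapses into one command over the two sorted sets, which is why the operation stays sub-millisecond even for large stock lists.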



Posted in Architecture, Couchbase, Redis | Leave a Comment »

AD RMS protection with Custom STS – Setting the scene

Posted by zamd on June 19, 2013

In a world where devices are exploding in number, information protection becomes even more important. AD RMS provides an on-premises and cloud platform to protect documents. Protected documents can be freely distributed, and the information protection platform ensures compliance and prevents unauthorized access.

The developer story of AD RMS was significantly simplified with AD RMS SDK 2.x, aka MSIPC. The original MSDRM API was extremely complex and required specialized skills to program against. Version 2.x of the SDK introduced a simple File Protection API which makes it super easy to incorporate IPC in custom solutions. The SDK comes with a native API, and there is a managed sample wrapper available as well.

The on-premises AD RMS server is implemented as a set of web services (asmx based) which provide the keys, certificates & license management infrastructure required to enable information protection.

The flow starts with authenticating the user; once the user is authenticated & authorized, key distribution kicks in to enable protection/un-protection functionality. The default deployment of AD RMS is configured to use Windows Authentication – which means consumers of the protected content must be part of the Active Directory.

Sometimes there are requirements to make protected content (e.g. a protected docx) available to users who don’t live in your Active Directory – for example, sharing protected documents with a partner organization.

Another scenario is where you own the users, but they are stored in custom databases (a SQL membership DB etc.) rather than AD, & you want this user base to access protected content using their existing credentials.

To enable such scenarios, AD RMS supports federated authentication using the standard WS-Federation protocol. There is a useful step-by-step guide on how to use AD FS to enable federated access with partners who don’t have AD RMS deployed. This guide covers the first scenario I mentioned above.

To enable the second scenario, you would need to deploy a custom STS (e.g. Thinktecture Identity Server) and integrate it with the AD RMS infrastructure using standard trust management and claims transformation.

The following figure shows the message flow used to acquire a license to un-protect a protected word document.


In a future post, I’ll explain the details of enabling this scenario using ThinkTecture Identity Server as a custom STS.

Posted in AD RMS, Federation/STS, MSIPC | Leave a Comment »

Azure AD OAuth 2.0 Authorization Grant

Posted by zamd on May 17, 2013

Yesterday I talked about a bug which prevented me from completing the authorization grant flow with Azure AD. It turns out the bug only surfaces when using the Azure Management Portal for relying party registration. In this post, I’ll use Graph Explorer to do the registration, which works fine.

My scenario is to create a simple MVC application which does user authentication against Azure AD.

Once the user is signed in, the web app then acquires an access & refresh token for the Graph API (I’ll work with other resources in future posts) using the 3-legged authorization grant flow.

I started by creating an empty MVC 4.0 application and added a home controller with a simple view displaying the identity & claims of the authenticated user.


Running the app gave me the URL which I would use to register my app with Azure AD using Graph Explorer. Registration instructions are available in this blog post under the ‘Setting up permissions’ section. My registration settings look like this:


Now, back in VS and using the “Identity & Access” tool, I have externalized the authentication of my app to Windows Azure AD.


The tooling does all the magic and generates the required WIF configuration.

<add value="http://localhost:45906/" />
<issuerNameRegistry type="System.IdentityModel.Tokens.ConfigurationBasedIssuerNameRegistry, System.IdentityModel, Version=, Culture=neutral, PublicKeyToken=b77a5c561934e089">
  <add thumbprint="3464C5BDD2BE7F2B6112E2F08E9C0024E33D9FE0" name="https://sts.windows.net/{tenantid}/" />
</issuerNameRegistry>
<cookieHandler requireSsl="false" />
<wsFederation passiveRedirectEnabled="true" issuer="https://login.windows.net/common/wsfed" realm="http://localhost:45906/" reply="http://localhost:45906/" requireHttps="false" />

Now, when I run my application, I’m redirected to Azure AD for authentication and, after signing in, I’m back in my app with the following claims.


The Authorize action simply redirects to the authorization endpoint requesting an ‘authorization code’.

Acquire authorization code
public ActionResult Authorize()
{
    var @params = new NameValueCollection
    {
        {"response_type", "code"},
        {"client_id", "87638e3d-6b56-46b6-946f-8b3b9fa6f04e"},
        {"resource", "https://graph.windows.net"},
        {"redirect_uri", Url.Action("Authorized", null, null, "http")}
    };
    var query = HttpUtility.ParseQueryString("");
    query.Add(@params);
    return Redirect(Constants.AzureADAuthorizationEndpoint + "?" + query);
}


I’m passing the Authorized action as the redirect_uri to Azure AD. That’s where AD will send the ‘authorization code’.

Clicking the ‘Authorize’ link takes me to Azure AD &, after authentication, AD redirects my browser back to the Authorized action with an authorization code.


From the Authorized controller action, I fire a POST request which gives me the access and refresh tokens back. Please note, the redirect_uri MUST be the same for both the authorization code and token requests.

Exchange the code for tokens
public async Task<ActionResult> Authorized(string code)
{
    if (string.IsNullOrEmpty(code))
        return RedirectToAction("Index");

    var data = new Dictionary<string, string>
    {
        {"grant_type", "authorization_code"},
        {"client_id", "87638e3d-6b56-46b6-946f-8b3b9fa6f04e"},
        {"redirect_uri", Url.Action("Authorized", null, null, "http")},
        {"client_secret", "V/7FTVm****************UQi6MkbwhmBqKxz0="},
        {"code", code}
    };

    var client = new HttpClient(new WebRequestHandler());
    var response = await client.PostAsync(Constants.AzureADTokenEndpoint, new FormUrlEncodedContent(data));

    var msg = await response.Content.ReadAsStringAsync();
    var tokenResponse = JsonConvert.DeserializeObject<TokenResponse>(msg);
    return View(tokenResponse);
}


The access & refresh tokens are scoped to the Graph API in this case. I can now attach this access token to my requests to the Graph API to read/write directory data. There are a few samples on this topic already, so I’m not going to cover that in this post.
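For completeness, attaching the token is just a standard Bearer Authorization header. A minimal sketch follows – the tenant domain, api-version value and the AccessToken property name are my assumptions, not taken from the post:

```csharp
using System.Net.Http;
using System.Net.Http.Headers;

// Sketch: build an HttpClient that sends the acquired access token
// as a Bearer credential on every Graph API request.
static class GraphClient
{
    public static HttpClient Create(string accessToken)
    {
        var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);
        return client;
    }
}

// e.g. (illustrative URL):
// var json = await GraphClient.Create(tokenResponse.AccessToken)
//     .GetStringAsync("https://graph.windows.net/mytenant.onmicrosoft.com/users?api-version=2013-04-05");
```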

Source: HomeController.cs

Posted in Azure AD, OAuth 2.0 | Leave a Comment »

Azure AD OAuth 2.0 Client_Credentials Flow

Posted by zamd on May 16, 2013

I was playing with the authorization code grant type recently added to Azure Active Directory; however, there is a bug in the preview implementation which prevents exchanging an ‘authorization code’ for an access token.

I can get the authorization code for the Graph API by using the following URL in the browser.


AAD authenticates me and redirects back with the authorization code below.


At this stage, I should be able to exchange this ‘code’ for an ‘access token’ & a refresh token by issuing the following POST request via fiddler.


User-Agent: Fiddler

Content-Type: application/x-www-form-urlencoded

Host: login.windows.net

Content-Length: 546


However, doing this results in an ‘ACS50000: There was an error issuing a token. ACS70001: Error validating credentials. ACS50012: Invalid client secret is provided’ error. I’ll do a follow-up post when this bug is fixed.

My second choice was to use the simple client_credentials (also known as two-legged) flow.

This time I used fiddler to craft a POST request to acquire a token directly from the AAD OAuth 2.0 endpoint.


User-Agent: Fiddler
Content-Type: application/x-www-form-urlencoded
Host: login.windows.net
Content-Length: 178
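The body of that POST (behind the fiddler capture above) is plain form-encoding. A sketch of how the same body could be built in code – the client_id and client_secret values are placeholders, not real credentials:

```csharp
using System.Collections.Generic;
using System.Net.Http;

// Sketch of the client_credentials request body; credential values
// are placeholders for illustration only.
var body = new FormUrlEncodedContent(new Dictionary<string, string>
{
    { "grant_type", "client_credentials" },
    { "client_id", "<service principal app id>" },
    { "client_secret", "<service principal key>" },
    { "resource", "https://graph.windows.net" }
});
// var response = await new HttpClient()
//     .PostAsync("https://login.windows.net/<tenant>/oauth2/token", body);
```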



I got a 200 OK with a JWT token as the payload. I can now attach this token to requests to my REST services, where I can process it using the WIF JWT token handler extension as shown below:

static void Main(string[] args)
{
    var handler = new JWTSecurityTokenHandler();
    var token = (JWTSecurityToken)
        handler.ReadToken(
            "eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiIsIng1dCI6Ik5HVEZ2ZEstZnl0aEV1THdqcHdBSk9NOW4tQSJ9.eyJhdWQiOiJodHRwOi8vbG9jYWxob3N0LyIsImlzcyI6Imh0dHBzOi8vc3RzLndpbmRvd3MubmV0LzY5MzgzMzU2LTU2ZGQtNGU3OC1hMThlLWE0ZmY1NDUwYzk5NS8iLCJuYmYiOjEzNjg3Mzk2NTYsImV4cCI6MTM2ODc4Mjg1Niwic3ViIjoiY2MyMjQ1NTQtZDhlZC00YTY0LThmZjUtOTJiZjNiYjEzZDkyIiwiYXBwaWQiOiI5ZjAzMGI3NC0xZWMxLTRiNmItODkxMS1mNGU2ZTQ2NWZmOWQiLCJvaWQiOiJjYzIyNDU1NC1kOGVkLTRhNjQtOGZmNS05MmJmM2JiMTNkOTIiLCJ0aWQiOiI2OTM4MzM1Ni01NmRkLTRlNzgtYTE4ZS1hNGZmNTQ1MGM5OTUiLCJpZHAiOiJodHRwczovL3N0cy53aW5kb3dzLm5ldC82OTM4MzM1Ni01NmRkLTRlNzgtYTE4ZS1hNGZmNTQ1MGM5OTUvIn0.BLtqbzU5pEyn5c6ubQxu2UPzoCd_I9Rokycq4LqThWGkdAy9vL3vqptAHXlKOTK-VFPkarfJ1Jui-GaiGZE_BKLFW0x_cxv4bTx_fAktTsDK51iv9wD8jYuftrUWaaqoonD29SQxRmic_r38LBqJwQIJRO4IfMUeMLmgYQ7B1DQs24D9oSx36pyc7CzX3sZH-nfbNPF4z8wUHrX0zzf7KwWCu5RhK6wmXKbiNKaMIw3VzTq6KsEbqFBV-3IuGFSGadrUfpJG0KZrEc3ZhNJ_gEWuBwhwTKtaVrWQ3_1wyxTtdKG1dPuVZmFxKCIfOJkqsvTFZKD4bECv5DJfvyhzlQ");

    var validationParams = new TokenValidationParameters
                               {
                                   AudienceUriMode = AudienceUriMode.Never,
                                   SigningTokenResolver = new HardcodedCertResolver(),
                                   ValidateIssuer = false
                               };
    var ci = handler.ValidateToken(token, validationParams);

    ci.Claims.ToList().ForEach(c =>
        Console.WriteLine("{0} = {1}", c.Type, c.Value));

    Console.WriteLine(token.ValidTo);
    Console.ReadLine();
}


Hope that helps.

Posted in Azure AD, OAuth 2.0 | Leave a Comment »

CRM Online Entity Creation

Posted by zamd on April 30, 2013

Recently I started exploring the world of Dynamics, and specifically Dynamics CRM. Technically, the platform looks fairly simple, with a reasonably clean web services API (mostly SOAP) & SAML-based message security (remember legacy WS-Trust :)) using Live ID as the identity provider.

The helper code from the SDK (\sdk\samplecode\cs\helpercode) hides all the complexity, but under the hood the following flow happens when interacting with the Dynamics Online web services.


Following is a small console application I used to create a new lead in CRM Online.

Code Snippet
    var connection =
        CrmConnection.Parse("Url=https://psfd365.crm4.dynamics.com; User ID=zamd@psfd365.onmicrosoft.com; Password=*password*;");
    var organization = new OrganizationService(connection);

    var who = organization.Execute(new WhoAmIRequest()) as WhoAmIResponse;
    Console.WriteLine("{0}@{1}", who.UserId, who.OrganizationId);
    var user =
        organization.Retrieve("systemuser", who.UserId,
                              new ColumnSet(new[] {"firstname", "lastname"})) as SystemUser;

    Console.WriteLine("creating lead…");

    var newLead = new Lead
                  {
                      Subject = "Interested in dynamics crm…",
                      FirstName = user.FirstName,
                      LastName = user.LastName,
                      MobilePhone = "004412123231212"
                  };

    var leadId = organization.Create(newLead);
    Console.WriteLine("New lead {0} created.", leadId);

That’s it – you can see the new lead created below.


Posted in Dynamics CRM | Leave a Comment »

WF Security Pack Update

Posted by zamd on March 13, 2013

Quite a few folks have asked me about updating the WF Security Pack to .NET 4.5, as WIF is now integrated into .NET 4.5.

Today I managed to spare some time to upgrade WFSP to .NET 4.5/WIF 4.5. I have also pushed the updated source code to GitHub, which you can pull down from https://github.com/zamd/wfsp/

Please note the GitHub version of the codebase is different from the CodePlex one, which was refactored by a WF team member. The GitHub version of the source code came straight from my laptop. I intend to create a NuGet package and potentially a Visual Studio extension as well. Stay tuned…

Posted in WF4.5, WFSP | Leave a Comment »

Federating Office 365 (Azure Active Directory) with a Custom STS

Posted by zamd on February 8, 2013

Let’s start off with a clarification: as of today, federating Office 365 (Azure AD) with a custom STS is NOT supported by Microsoft. Today the only supported STSs are AD FS 2.0, Shibboleth 2, Optimal IDM Federation Services and PingFederate 6.10.

With that cleared up, the Office 365 STS supports both the WS-Federation & SAML protocols for user authentication, which means technically any compatible STS can be used as the identity provider STS for Office 365 services, or for other relying parties with a trust relationship with Azure Active Directory.

Azure AD supports In-cloud & Federated Identities.

With in-cloud identities, all user information, including passwords, is stored in the online directory.

With federated identities, only basic information is stored in the online directory (as shadow accounts), and user identities are mastered in on-premises directories. Passwords are never copied to the online directory, and Azure AD relies on federation for user sign-in.

A key prerequisite for Office 365 SSO is to create federated identities (shadow accounts) in Azure AD and there are different options/tools to do this.

  1. DirSync is the recommended tool, but it only supports Active Directory as the identity source. DirSync & AD FS 2.0 are the primary tools to enable federation between an on-premises AD and Azure AD.
  2. The Graph API is a new RESTful API to manage the online directory and looks very promising for creating cloud-only identities. The Graph API today doesn’t support creating federated identities.
  3. MSOL PowerShell cmdlets: these cmdlets use the SOAP-based Provisioning Service and are functionally quite rich. They support most operations, including the creation of federated identities. I have used these cmdlets for my scenario. A few commercial tools also wrap these cmdlets to perform various Office 365 provisioning operations.
  4. Forefront Identity Manager (FIM) is another potential option, which can create federated accounts from source directories other than AD, but I haven’t explored that in detail.

Now, once you have the federated identities provisioned (or synced from your on-premises user identity store) in Azure AD, the next step is to establish a trust relationship between Azure AD and your custom STS. This assumes you have already done the domain verification etc.

I have used the Set-MsolDomainAuthentication cmdlet for this.

Set-MsolDomainAuthentication –DomainName bccoss.com –federationBrandName bccoss.com -Authentication Federated  -PassiveLogOnUri $url -SigningCertificate $certData -IssuerUri $uri -ActiveLogOnUri $ecpUrl -LogOffUri $logoutUrl -PreferredAuthenticationProtocol WSFed


At this stage, if I browse to the Microsoft Online Services portal (http://portal.microsoftonline.com/) and choose to log in using my federated domain (@bccoss.com) – I get redirected to my custom STS.

In this case, I’m using the Thinktecture STS, but that doesn’t work out of the box with Office 365 / Azure AD, so I had to modify the STS to make it compatible with Azure AD. I’ll explain the Office 365 compatibility requirements for an STS in a future post.

I’ll also try to contribute my Thinktecture modifications back to the git repository at some point.


Posted in Azure AD, Office 365, SSO | 11 Comments »

Custom STS for Sitefinity 5.x

Posted by zamd on February 6, 2013

Sitefinity 5.x introduced claims-based security & single sign-on features based on a simple HTTP-redirect-based token issuance protocol, which I’m going to call the ‘Sitefinity sign-in protocol’ in my posts. Version 5.x has also standardized on Simple Web Token (SWT) as the default token format for user authentication and SSO needs.

Sitefinity 5.x comes with a built-in local STS which authenticates users using standard membership authentication and issues SWT tokens in accordance with the Sitefinity sign-in protocol. Sitefinity doesn’t have a hard dependency on this built-in STS; rather, it relies on its sign-in protocol and SWT token format, which means we can introduce a custom STS into the mix and Sitefinity will happily work with it – provided, obviously, that it adheres to the Sitefinity sign-in protocol and token format.
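For those unfamiliar with SWT: it is just a form-encoded string of key/value pairs whose final HMACSHA256 pair signs everything before it. A rough sketch of the signature check – my own illustration based on the SWT draft spec, not Sitefinity’s actual code:

```csharp
using System;
using System.Net;
using System.Security.Cryptography;
using System.Text;

// Validates the HMACSHA256 signature at the tail of an SWT token
// against a shared symmetric key (like the hex keys registered in
// SecurityConfig.config).
static class SwtValidator
{
    public static bool IsValid(string token, byte[] key)
    {
        const string marker = "&HMACSHA256=";
        var idx = token.LastIndexOf(marker, StringComparison.Ordinal);
        if (idx < 0) return false;

        var signedPart = token.Substring(0, idx);
        var signature = WebUtility.UrlDecode(token.Substring(idx + marker.Length));

        using (var hmac = new HMACSHA256(key))
        {
            var expected = Convert.ToBase64String(
                hmac.ComputeHash(Encoding.UTF8.GetBytes(signedPart)));
            return expected == signature;
        }
    }
}
```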

This STS-based design in Sitefinity 5.x enables many SSO scenarios, some of which I’m going to explore in future posts. The following are a few possibilities:

  • I can create a custom STS and then have multiple applications (RPs :)), including Sitefinity 5.x, trust this single STS, which would enable users to single sign-on across all those applications.
  • I can create a multi-protocol STS which can enable user SSO across workloads/products – for example, SSO between Sitefinity & Office 365, or other portals speaking the SAML protocol.

For now, I’ll show you how to use a custom STS with Sitefinity for user authentication. I have already developed and deployed a Sitefinity-compatible STS @ http://sts.pilesoft.com, while Sitefinity is running @ http://pilesoft.com.

Step 1: Register the custom STS with Sitefinity so that it can trust the tokens issued by the custom STS.

Open the .\App_Data\Sitefinity\Configuration\SecurityConfig.config file, locate the <securityTokenIssuers> element and add the following line to it.

<add key="CD29559E6EDC312272976AC43F7E921C5766D7063DAF6D177F3EEDEB1802FABE" encoding="Hexadecimal" membershipProvider="Default" realm="http://sts.pilesoft.com"/>

Your config should now look like the following:

<securityTokenIssuers>
  <add key="CD29559E6EDC312272976AC43F7E921C5766D7063DAF6D177F3EEDEB1802FABE" encoding="Hexadecimal" membershipProvider="Default" realm="http://sts.pilesoft.com"/>
  <add key="6C4B865442D166796756C8DA1765584F7DD5EC0DE81B1CF29AC5FCE85AE5331D" encoding="Hexadecimal" membershipProvider="Default" realm="http://localhost" />
</securityTokenIssuers>


In most cases, you will need to configure a custom membership provider as well, which I’m going to talk about in a future post.

Step 2: Open the main web.config file and locate the <federatedAuthentication> element under the <microsoft.identityModel> section. This is WIF configuration, and we need to change the <wsFederation> element to point to our custom STS.

Locate the <wsFederation> element & change the issuer attribute to point to our Custom STS as shown below:

    <claimsAuthenticationManager type="Telerik.Sitefinity.Security.Claims.SFClaimsAuthenticationManager, Telerik.Sitefinity" />
    <add type="Telerik.Sitefinity.Security.Claims.SWT.SWTSecurityTokenHandler, Telerik.Sitefinity" />
    <audienceUris mode="Never"></audienceUris>
    ==> <wsFederation passiveRedirectEnabled="true"
                      issuer="http://sts.pilesoft.com/issue/sitefinity" realm="http://localhost" requireHttps="false" />
    <cookieHandler requireSsl="false" />
    <issuerNameRegistry type="Telerik.Sitefinity.Security.Claims.CustomIssuerNameRegistry, Telerik.Sitefinity" />
    <issuerTokenResolver type="Telerik.Sitefinity.Security.Claims.SWT.WrapIssuerTokenResolver, Telerik.Sitefinity" />


Now if I browse to Sitefinity – I get:


When I click the ‘Login to the backend’ link, I’m redirected to my custom STS. The address bar shows the Sitefinity sign-in protocol in action.


When I sign in at the STS, it issues an SWT token & redirects me back to the Sitefinity app.

As this STS is trusted by Sitefinity, it happily accepts the incoming SWT token and logs me in.


I’ll publish the custom STS code after removing the IP-related bits. Ping me if you desperately need it :)

Posted in Federation/STS, Sitefinity, SSO | 2 Comments »

Enabling ‘Import Service Contract’ menu option

Posted by zamd on January 2, 2013

WF 4.5 introduced contract-first development, with which you can generate messaging activities from your existing WCF contracts. Out of the box, this feature is only enabled for the ‘WCF Workflow Service Application’ project type and is exposed via the ‘Import Service Contract’ context menu.


This is a quite useful feature and is certainly needed in other project types as well – for example, a workflow hosted using WorkflowServiceHost in a Windows service or a console application. You can easily enable the context menu option for other project types by including an additional GUID in the <ProjectTypeGuids> element of the csproj file.


  • Unload the project in VS and open the csproj file using the XML editor.
  • Locate the <ProjectTypeGuids> element and insert {349c5851-65df-11da-9384-00065b846f21} into the element’s content, alongside the existing GUIDs.
  • Make sure to put a semicolon at the end of your newly inserted GUID.
  • Reload the project in VS and you should now see the ‘Import Service Contract’ menu option.
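For reference, the edited element ends up looking something like this – the second GUID below is illustrative; keep whatever project-type GUIDs your csproj already contains:

```xml
<ProjectTypeGuids>{349c5851-65df-11da-9384-00065b846f21};{fae04ec0-301f-11d3-bf4b-00c04f79efbc}</ProjectTypeGuids>
```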

Posted in WF4.5 | Tagged: , , | Leave a Comment »

Service Bus Property Promotion Nuget Package

Posted by zamd on July 18, 2012

I have just published a NuGet package which adds property promotion features to the Service Bus WCF programming model.


Once you have added the package to your project, you can use the PromotedProperty attribute to mark your properties as promoted. The package supports promotion from both complex & primitive arguments. In addition to PromotedPropertyAttribute, you also need to stick PropertyPromotionBehavior on each method of your service contract.

The following service contract captures sample usage.

public class Order
{
    public double Amount { get; set; }

    [PromotedProperty]
    public string ShipCity { get; set; }
}

[ServiceContract]
public interface IOrderService
{
    [OperationContract(Name = "SubmitFlat", IsOneWay = true)]
    [PropertyPromotionBehavior]
    void Submit(double amount, [PromotedProperty] string shipCity);

    [OperationContract(IsOneWay = true)]
    [PropertyPromotionBehavior]
    void Submit(Order order);
}


Posted in Uncategorized | Leave a Comment »

