CosmosDB Serialization Overview

Serialization Settings

Since version 1.15.0, Azure’s CosmosDB SDK supports JsonSerializerSettings as a parameter. This gives us the ability to define a custom contract resolver and how null values are handled.

It is as easy as this:

var serializerSettings = new JsonSerializerSettings
{
    NullValueHandling = NullValueHandling.Ignore,
    ContractResolver = new CamelCasePropertyNamesContractResolver()	
};

var client = new DocumentClient(endpointUrl, authorizationKey, serializerSettings);

The base class issue

When you create a document in CosmosDB, you will notice that, in addition to your own properties, some system properties prefixed with an underscore (_) are added and returned when querying the document:

Property   Purpose
_rid       System generated, unique, and hierarchical identifier of the resource
_etag      Entity tag of the resource, required for optimistic concurrency control
_ts        Last updated timestamp of the resource
_self      Unique addressable URI of the resource

To these, add the required id property, which is either set by the user or auto-generated by the system if omitted.

To reduce typing, you could be tempted to inherit your documents from the Microsoft.Azure.Documents.Document class, which is the class returned by most calls to the SDK. Unfortunately, this fails in several ways:

  • Properties are not serialized unless they are marked with the [JsonProperty] attribute.
  • JsonSerializerSettings are not respected!

Persisting this class

public class InheritFromDocument : Microsoft.Azure.Documents.Document
{
    [JsonProperty] 
    public string Firstname { get; set; } = "Inherit";
    [JsonProperty]
    public string Lastname { get; set; } = "Document";
    [JsonProperty]
    public string NotDefinied { get; set; } = null;
}

will result in JSON where our user-defined properties are not camel cased despite the defined contract resolver, and where null values are serialized:

{
  "Firstname": "Inherit",
  "Lastname": "Document",
  "NotDefinied": null,
  "id": "dcf88cbf-d3ee-4558-b698-a94006d0d49c",
  "_rid": "Ho1MAP0xtwqKAAAAAAAAAA==",
  "_self": "dbs/Ho1MAA==/colls/Ho1MAP0xtwo=/docs/Ho1MAP0xtwqKAAAAAAAAAA==/",
  "_etag": "\"00000000-0000-0000-c501-e64f859b01d3\"",
  "_attachments": "attachments/",
  "_ts": 1522069007
}

On the other hand, if you inherit your documents from Microsoft.Azure.Documents.Resource, you get correct serialization.

public class InheritFromResource : Microsoft.Azure.Documents.Resource
{
    public string Firstname { get; set; } = "Inherit";
    public string Lastname { get; set; } = "Resource";
    public string NotDefinied { get; set; } = null;
}

The resulting JSON looks like this:

{
  "firstname": "Inherit",
  "lastname": "Resource",
  "id": "5e1211e8-4589-436a-bfa0-a17dea4accf8",
  "_rid": "Ho1MAP0xtwqLAAAAAAAAAA==",
  "_self": "dbs/Ho1MAA==/colls/Ho1MAP0xtwo=/docs/Ho1MAP0xtwqLAAAAAAAAAA==/",
  "_etag": "\"00000000-0000-0000-c501-e6542d2801d3\"",
  "_attachments": "attachments/",
  "_ts": 1522069007
}

You can also create documents from your own POCOs, but you will have to handle the system properties by hand if you need to retrieve them.

public class POCO 
{
    public string Id { get; set; }
    public string Firstname { get; set; } = "Po";
    public string Lastname { get; set; } = "Co";
    public string NotDefinied { get; set; } = null;
}
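
If you want the system properties back on your POCOs without inheriting from the SDK types, a small base class is enough. Here is a minimal sketch, with property names of my own choosing; the explicit JsonProperty names bypass the camel-case resolver only for these mapped fields:

using Newtonsoft.Json;

public abstract class DocumentBase
{
    [JsonProperty("id")]
    public string Id { get; set; }

    // System properties maintained by CosmosDB; mapped for read access.
    [JsonProperty("_rid")]
    public string ResourceId { get; set; }

    [JsonProperty("_self")]
    public string SelfLink { get; set; }

    [JsonProperty("_etag")]
    public string ETag { get; set; }

    [JsonProperty("_ts")]
    public long Timestamp { get; set; }
}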

Summary

Don’t use Microsoft.Azure.Documents.Document as the base class for storing documents. It will not respect your settings!

Microsoft.Azure.Documents.Resource is a far better alternative. POCOs are, as always, a good fit: they don’t tie you to a specific implementation or SDK, and you can easily create a base class that handles the system properties.


GitHub Pages with gh-pages branch

Today, I wanted to create a website for my new tool DocumentDb Explorer. As the source code is hosted on GitHub, I wanted to use GitHub Pages to host the site as well.

GitHub Pages

With GitHub Pages you can host your website for free from your GitHub repository. You just need to push your static content or Jekyll website to a branch called gh-pages, and it will be published on <your-username>.github.io/<repo-name> a few seconds later.

Create your gh-pages branch

That’s nice, but I started with an existing repo and didn’t want to mix my work on the app with the website. The idea is to create an orphan branch:

git checkout --orphan gh-pages

When you create an orphan branch, git creates a new branch without any parent commits. You can add anything you want to that branch; it is totally separate from the main history, but stored in the same .git directory.

I then cleaned my directory and added the two pieces needed for a nice website: a Jekyll config file (_config.yml) and the index.md file that will be rendered as my home page.

The _config.yml file is very simple:

theme: jekyll-theme-cayman
title: DocumentDb Explorer
show_downloads: "true"
google_analytics: <YOUR-GOOGLE-KEY>

It’s now time to push the new branch to GitHub.

git push -u origin gh-pages

The gh-pages branch is now pushed to GitHub and the site is available. For later changes, a simple git push will be enough.

Worktree

Good, now I have two branches, and I have to git checkout <branch> every time I want to make a change to the website or the application.

My goal is to have both branches in the same local directory tree, with each directory linked to its own remote branch. To achieve that, I need a git feature called worktree, which lets me manage multiple working trees attached to the same repository.

To create my directory structure correctly I need to execute these commands:

mkdir <my-root-folder>
cd <my-root-folder>

git clone https://github.com/<user>/<repo> work # this creates a work folder where I will have my app
cd work
git checkout gh-pages # pull down remote branch and make it a local one
git checkout master # switch back to the branch matching the directory tree

mkdir ..\gh-pages # create the gh-pages folder at the same level as the work folder
git worktree add ..\gh-pages gh-pages # checkout the gh-pages branch into the local gh-pages folder

Well…

Now, switching from one branch to the other is as simple as changing the current directory using cd.

λ  dir


    Directory: C:\Users\sacha\Sources\<my-root-folder>


Mode                LastWriteTime         Length Name
----                -------------         ------ ----
d-----       22.12.2017     10:40                gh-pages
d-----       22.12.2017     10:39                work


C:\Users\sacha\Sources\<my-root-folder>
λ  cd .\gh-pages\
C:\Users\sacha\Sources\<my-root-folder>\gh-pages [gh-pages ]
λ  git status
On branch gh-pages
Your branch is up-to-date with 'origin/gh-pages'.

nothing to commit, working tree clean
λ  cd ..\work\
C:\Users\sacha\Sources\<my-root-folder>\work [master ]
λ  git status
On branch master
Your branch is up-to-date with 'origin/master'.

nothing to commit, working tree clean

Test Azure AD secured API with Postman

Imagine that you have a nice API deployed on Azure and secured by Azure AD. As an example, we will create a simple Azure Function that returns the name of the logged-in user. Here is the code:

using System.Net;
using System.Security.Claims;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req)
{
    var name = ClaimsPrincipal.Current.FindFirst("name")?.Value;

    return name == null
        ? req.CreateResponse(HttpStatusCode.BadRequest, "'name' not found in the claims list!")
        : req.CreateResponse(HttpStatusCode.OK, "Hello " + name);
}

If you try to call the Azure Function from Postman, you will receive a "You do not have permission to view this directory or page." message with a 401 Unauthorized status code.

This is because we didn’t pass an Authorization header with a valid bearer token. As we are using Azure AD, we support OAuth 2.0 authentication, and Postman provides a way to retrieve a valid token without leaving the application. Check https://www.getpostman.com/docs/postman/sending_api_requests/authorization for details.

So far, so good. But what parameters should we give Postman to retrieve a token? First, we will use the Authorization Code grant type. When you select this grant type in Postman, you will see that the following parameters are needed:

  • Callback URL
  • Auth URL
  • Access Token URL
  • Client ID
  • Client Secret

To retrieve this information, open the Azure Active Directory blade and select App registrations.

Client ID

The Client ID parameter is known in Azure AD as the Application ID. Open your registered app and copy the value.

Get Application ID

Client Secret

Go to the Keys settings of the registered app and create a new password. Write down the generated key when saving; you won’t be able to retrieve it later.

Create Key

Retrieve the URLs

The Auth URL and Access Token URL can be found by clicking the Endpoints button. Azure AD requires that you pass the resource you want to access with both URLs, so you will need to append ?resource=[application_id] to each of them.

Postman Azure AD
Auth URL https://login.microsoftonline.com/[tenant_id]/oauth2/authorize?resource=[application_id]
Access Token URL https://login.microsoftonline.com/[tenant_id]/oauth2/token?resource=[application_id]

To get the Callback URL, check the Reply URLs setting. If you have created the Azure AD Application using Azure EasyAuth, you will have a default value looking like this: https://[appservice-name].azurewebsites.net/.auth/login/aad/callback

Grant Permissions

Before you can call the API, you will need to click the “Grant Permissions” button in the “Required permissions” settings. Otherwise, you could get an error message saying:

error=access_denied
error_description=AADSTS65005: Invalid resource.
The client has requested access to a resource which is not listed in the requested permissions in the client’s application registration

Other Parameters

Moreover, you will need to set a Token Name of your choice and set Client Authentication to Send client credentials in body. We can leave the Scope and State parameters empty.

Retrieve a token

You are now ready to get a new access token.

Request Token In Postman

After clicking “Request Token”, a popup window will prompt you for your Azure AD credentials. If you run into an issue, start by looking at the Postman console, and if you don’t get enough information there, launch Fiddler to debug the messages. When everything goes well, you receive a new token that you can add to your request header by clicking the “Preview Request” button.

Conclusion

You are now able to call your API from Postman and get a nice response.

Request Result In Postman
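
For completeness, the same call can be reproduced from code once you have a token. A minimal C# sketch, assuming a hypothetical function URL:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public static async Task CallSecuredApiAsync(string accessToken)
{
    using (var client = new HttpClient())
    {
        // Pass the bearer token exactly as Postman does, in the Authorization header.
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);

        // Hypothetical endpoint; replace with your own function URL.
        var response = await client.GetAsync("https://yourfuncapp.azurewebsites.net/api/hello");
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}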

Don’t forget that an Azure AD token is only valid for 60 minutes. Request a new token when needed…


Secure your Serverless Architecture

There are two main products in the Azure offering for hosting microservices: Azure Functions and Logic Apps. The former is a code-first integration service designed for developers; the latter is a configuration-first integration service that makes it easy to build processes and workflows and to integrate with various SaaS and enterprise applications.

The internet is full of articles showing how to set up a Serverless Architecture on Azure leveraging Azure Functions and/or Logic Apps. But what happens when you need to add authentication to your architecture?

Setup

Start by creating a new Function App and add a simple HttpTrigger C# function to it that we will call SayHello.

Create Azure Function

This will create a default method that returns "Hello " + name, where name is either a URL parameter or a JSON property passed in the body.
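
From memory, the generated code looks roughly like this (a sketch of the classic csx HttpTrigger template):

using System.Net;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    // Parse the query string for a 'name' parameter.
    string name = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
        .Value;

    // Fall back to a 'name' property in the request body.
    dynamic data = await req.Content.ReadAsAsync<object>();
    name = name ?? data?.name;

    return name == null
        ? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a name on the query string or in the request body")
        : req.CreateResponse(HttpStatusCode.OK, "Hello " + name);
}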

Azure Functions Proxies

Functions Proxies is a new feature of Azure Functions. It lets you specify endpoints on your function app that are implemented by another resource. It is useful for breaking a large API into multiple function apps while still presenting a single API surface, and it can also be used to present a nicer URL for a Logic App. The following image shows how to create a homogeneous API while organizing your application in a microservice architecture.

Azure Functions Proxies architecture

Start by enabling Proxies in your Function App. Go to Settings and switch Proxies (preview) to On.

Enable Azure Functions Proxies

We will now create a new endpoint for our SayHello Function. We will limit the verb to GET and define the parameter {name} in the route template. The Backend URL will be the Azure Function URL with two parameters: the name and the code. The name is the value used by the function, and the code is the security token of the Azure Function. See Work with Azure Functions Proxies for more information on proxy creation.

Create Proxy

Try your new proxy by browsing to https://yourfuncapp.azurewebsites.net/api/hello?name=foo and verify that you get this wonderful result:

<string xmlns="http://schemas.microsoft.com/2003/10/Serialization/">
    Hello foo
</string>

How to secure it?

Azure Functions is built on top of Azure App Service, so we can simply enable Easy Auth, a neat feature of App Service. Go to Platform features and click Authentication / Authorization. For the rest of this article to work, select Azure Active Directory with the Express configuration.

Setup Easy Auth

Now, if you try to call an Azure Function from your browser, you will be redirected to the classic AAD login page. Our API is now fully protected from unauthorized calls, whether they hit an Azure Function directly or go through a proxy endpoint. But Easy Auth has a lot more to offer. We are going to modify our Function slightly to say hello to the logged-in user. Replace the code in the SayHello Function:

using System.Net;
using System.Security.Claims;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    var name = ClaimsPrincipal.Current.FindFirst("name")?.Value;

    return name == null
        ? req.CreateResponse(HttpStatusCode.BadRequest, "'name' not found in the claims list!")
        : req.CreateResponse(HttpStatusCode.OK, "Hello " + name);
}

And change the Proxy:

  • Route template: /api/hello
  • Backend URL: https://yourfuncapp.azurewebsites.net/api/SayHello?code=...

Try your new Function by browsing to https://yourfuncapp.azurewebsites.net/api/hello and verify that your name is displayed correctly.

Call a secured Azure Function from Logic App

It is a common scenario to call an Azure Function from a Logic App. For this purpose, we will create the simple Logic App below and hide it behind a Function Proxy to protect it.

{
    "definition": {
        "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
        "actions": {
            "Response": {
                "inputs": {
                    "body": {
                        "whoIam": "@{body('SayHello')}"
                    },
                    "headers": {
                        "Content-Type": "application/json"
                    },
                    "statusCode": "@outputs('SayHello')['statusCode']"
                },
                "runAfter": {
                    "SayHello": [
                        "Succeeded"
                    ]
                },
                "type": "Response"
            },
            "SayHello": {
                "inputs": {
                    "function": {
                        "id": "/subscriptions/<sub-id>/resourceGroups/<resource-group>/providers/Microsoft.Web/sites/<function-app>/functions/SayHello"
                    }
                },
                "runAfter": {},
                "type": "Function"
            }
        },
        "contentVersion": "1.0.0.0",
        "outputs": {},
        "parameters": {},
        "triggers": {
            "manual": {
                "inputs": {
                    "method": "GET",
                    "schema": {}
                },
                "kind": "Http",
                "type": "Request"
            }
        }
    }
}

If you try to call your new proxy, you will receive a beautiful error. Digging into the Logic App run, you will find this as the output of the SayHello action:

{
    "statusCode": 401,
    "headers": {
        "Date": "Fri, 07 Jul 2017 07:50:11 GMT",
        "Server": "Microsoft-IIS/8.0",
        "WWW-Authenticate": "Bearer realm=\"<edited>\"",
        "X-Powered-By": "ASP.NET",
        "Content-Length": "58",
        "Content-Type": "text/html"
    },
    "body": "You do not have permission to view this directory or page."
}

OK, we need to pass authentication for the call to succeed. But the Azure Function connector in Logic Apps does not have authentication parameters like the HTTP connector does. Are we forced to replace the convenient Azure Function connector with the HTTP one? No!

Thanks to Easy Auth, the HTTP calls forwarded by Azure Functions Proxies are enriched with a number of HTTP headers. Take a look at this article from @cgilum about the App Service Token Store.
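
If you are curious about which headers actually reach the backend, you can dump the interesting one from a function. A minimal sketch, assuming the same csx-style signature used above:

using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Net.Http;

public static HttpResponseMessage Run(HttpRequestMessage req)
{
    // Easy Auth adds headers such as X-MS-TOKEN-AAD-ID-TOKEN to authenticated requests.
    IEnumerable<string> values;
    var idToken = req.Headers.TryGetValues("X-MS-TOKEN-AAD-ID-TOKEN", out values)
        ? values.FirstOrDefault()
        : null;

    return req.CreateResponse(HttpStatusCode.OK, idToken ?? "header not present");
}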

Replace your Logic App code with this:

{
    "definition": {
        "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
        "actions": {
            "Response": {
                "inputs": {
                    "body": {
                        "whoIam": "@{body('SayHello')}"
                    },
                    "headers": {
                        "Content-Type": "application/json"
                    },
                    "statusCode": "@outputs('SayHello')['statusCode']"
                },
                "runAfter": {
                    "SayHello": [
                        "Succeeded"
                    ]
                },
                "type": "Response"
            },
            "SayHello": {
                "inputs": {
                    "function": {
                        "id": "/subscriptions/<sub-id>/resourceGroups/<resource-group>/providers/Microsoft.Web/sites/<function-app>/functions/SayHello"
                    },
                    "headers": {
                        "Authorization": "@{concat('Bearer ', triggerOutputs()['headers']['X-MS-TOKEN-AAD-ID-TOKEN'])}"
                    },
                    "method": "GET"
                },
                "runAfter": {},
                "type": "Function"
            }
        },
        "contentVersion": "1.0.0.0",
        "outputs": {},
        "parameters": {},
        "triggers": {
            "manual": {
                "inputs": {
                    "method": "GET",
                    "schema": {}
                },
                "kind": "Http",
                "type": "Request"
            }
        }
    }
}

What we did is simply pass the X-MS-TOKEN-AAD-ID-TOKEN generated by Easy Auth to our Azure Function. If you call your Azure Functions Proxies endpoint from a browser, you will receive a nice greeting.

The request must be authenticated only by Shared Access scheme.

This is the error you will receive if you try to call your endpoint from a SPA application or a tool like Postman. It happens because the Logic App is already protected by a SAS token, and Azure Functions Proxies forwards the Authorization header we send to authenticate our own request.

Let’s try to reproduce it before fixing it! Open Postman and import the following collection:

{
	"variables": [],
	"info": {
		"name": "Test",
		"_postman_id": "08766fd2-ecd3-94c9-9649-54cb6d549136",
		"description": "",
		"schema": "https://schema.getpostman.com/json/collection/v2.0.0/collection.json"
	},
	"item": [
		{
			"name": "HelloLogicApps",
			"request": {
				"url": "https://yourfuncapp.azurewebsites.net/api/helloLogicApps",
				"method": "GET",
				"header": [
					{
						"key": "Authorization",
						"value": "Bearer {{Token}}",
						"description": ""
					}
				],
				"body": {},
				"description": ""
			},
			"response": []
		}
	]
}

Create a new Environment with a Token key. Open your browser, go to https://functionapp.azurewebsites.net/.auth/me, and copy/paste the id_token into Postman. Click Send…

Failed call

The solution is the same as in my previous article, Secure your Logic Apps with Azure AD and API Management: we need to remove the Authorization header from the call the proxy makes to the Logic App.

To do this with Azure Functions Proxies, we need to edit the proxies.json file and add a requestOverrides object to our proxy definition. We will do it from the App Service Editor. Your proxies.json should look like this:

{
    "$schema": "http://json.schemastore.org/proxies",
    "proxies": {
        "Hello": {
            "matchCondition": {
                "route": "/api/hello",
                "methods": [
                    "GET"
                ]
            },
            "backendUri": "https://function-app.azurewebsites.net/api/SayHello&code=<edited>"
        },
        "Function": {
            "matchCondition": {
                "route": "api/helloLogicApps",
                "methods": [
                    "GET"
                ]
            },
            "backendUri": "https://prod-37.westeurope.logic.azure.com:443/workflows/<edited>/triggers/manual/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=<edited>",
            "requestOverrides": {
                "backend.request.headers.Authorization": ""
            }
        }
    }
}

Overriding the Authorization header with an empty string removes the header entry entirely. Go back to Postman and re-send the query. Et voilà!

Successful call

Conclusion

Azure Functions Proxies + Easy Auth is a lightweight solution to secure your Serverless Architecture on Azure. It overlaps with Azure API Management but does not offer all the advanced features you get with APIM, like throttling, caching, and the developer portal.

However, Azure Functions Proxies will be way cheaper than APIM, and as Azure Functions is based on App Service, it is easy to deploy your solution using ARM templates.


Secure your Logic Apps with Azure AD and API Management

JWT Validation

Integrating Azure Active Directory and other OpenID providers with Azure API Management (APIM) is relatively easy. Follow this How To to set up the required configuration. You can then validate a JSON Web Token (JWT) with the APIM access restriction policy. A simple example for Azure Active Directory looks like this:

<validate-jwt header-name="Authorization" require-scheme="Bearer" 
              failed-validation-httpcode="401"
              failed-validation-error-message="Unauthorized. Access token is missing or invalid.">
    <openid-config url="https://login.windows.net/tenant.onmicrosoft.com/.well-known/openid-configuration"/>
    <audiences>
        <audience>https://tenant.onmicrosoft.com/APIMAADDemo</audience>
    </audiences>
</validate-jwt> 

That’s good: we now have an API proxy that verifies that the calling user is authenticated and comes from a known place. But if you try to add this policy to an API calling a Logic App, you will receive this cryptic error:

{
  "error": {
    "code": "DirectApiAuthorizationRequired",
    "message": "The request must be authenticated only by Shared Access scheme."
  }
}

This is due to the fact that Logic Apps cannot handle the Authorization HTTP header. We can easily work around this by adding another policy to our inbound section:

<set-header name="Authorization" exists-action="delete"/>

Pass JWT claims to a Logic App

Now we can call our Logic App successfully. But what if we need to pass information from the JWT token to our workflow? For example, we might need to retrieve data based on the calling user. To do that, we will need to extract the data out of the JWT token. For reference, this is what an Azure AD token looks like. For the sake of this post, I will pass the upn and the name to the Logic App.

{
    "aud": "https://tenant.onmicrosoft.com/APIMAADDemo",
    "iss": "https://sts.windows.net/.../",
    "iat": 1497517398,
    "nbf": 1497517398,
    "exp": 1497521298,
    "acr": "1",
    "aio": "..",
    "amr": [ "pwd" ],
    "appid": "...",
    "appidacr": "1",
    "family_name": "Bruttin",
    "given_name": "Sacha",
    "ipaddr": "0.0.0.0",
    "name": "Sacha Bruttin",
    "oid": "...",
    "platf": "3",
    "scp": "user_impersonation",
    "sub": "...",
    "tid": "...",
    "unique_name": "[email protected]",
    "upn": "[email protected]",
    "ver": "1.0"
}

If you want to know what your token contains, you can copy/paste it into https://jwt.io. The content of a token emitted by Azure AD is documented here.
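
If you prefer to inspect a token locally instead of pasting it into a website, the payload is just base64url-encoded JSON. A minimal C# sketch (the helper name is mine):

using System;
using System.Text;

public static string DecodeJwtPayload(string jwt)
{
    // The payload is the second of the three dot-separated segments.
    var payload = jwt.Split('.')[1].Replace('-', '+').Replace('_', '/');

    // Restore the padding stripped by base64url encoding.
    switch (payload.Length % 4)
    {
        case 2: payload += "=="; break;
        case 3: payload += "="; break;
    }

    return Encoding.UTF8.GetString(Convert.FromBase64String(payload));
}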

We will add another set of policies to our inbound section. First, we need to extract the values from the JWT token. We can access the Authorization property on the Headers object, which is part of the Request. We need to split the content on a space because the token is preceded by the scheme (Bearer). The AsJwt method converts the string into a JWT token object on which we can read each claim by its name. Then we add these values to the HTTP headers using the set-header policy.

<!-- Extract the data into the context variables -->
<set-variable name="x-upn" value="@(context.Request.Headers["Authorization"].First().Split(' ')[1].AsJwt()?.Claims["upn"].FirstOrDefault())"/>
<set-variable name="x-username" value="@(context.Request.Headers["Authorization"].First().Split(' ')[1].AsJwt()?.Claims["name"].FirstOrDefault())"/>

<!-- Add new values to the HTTP Header -->
<set-header name="X-UPN" exists-action="override">
    <value>@((string)context.Variables["x-upn"])</value>
</set-header>
<set-header name="X-Username" exists-action="override">
    <value>@((string)context.Variables["x-username"])</value>
</set-header>

Now we need to read these values in our Logic App. It is in fact quite simple. The Request trigger passes the headers to the next action, so we will add a Parse JSON action right after it. We will use this schema to parse only our custom attributes:

{
  "properties": {
    "X-UPN": {
      "type": "string"
    },
    "X-Username": {
      "type": "string"
    }
  },
  "type": "object"
}

Our properties are now available to the rest of the workflow.

Great Logic Apps

What’s next?

Our Logic App is now ready, and we can test it from the developer portal. As you can see, it is easy to leverage APIM to secure our Logic Apps. The last step is to limit access to our Logic App by restricting calls to the IP addresses of the API Management portal.


Deploy Logic Apps & API Connection with ARM

One of the key questions when developing Logic Apps is how to create an atomic, re-deployable package that you can use to deploy your workflow to another tenant. A typical use case is having distinct tenants for development and production. You need to deploy not only the workflow itself but also the API connections that you created during development.

Imagine you have this wonderful Logic App that creates a blob in Azure Storage for each task you create in Wunderlist.

Great Logic Apps

When you try to export the ARM Template by clicking the Automation script button, you get this message:

API Connections cannot be exported yet and is not included in the template. See error details.

If you try to deploy this template in another resource group or tenant, you will have to recreate the connections manually. This could be a huge task if you have a lot of Logic Apps and API Connections.

Great Logic Apps

Fortunately, it is possible to create the API Connections automatically when you deploy your ARM Template.

Creating API Connection

We need to add a new resource to our ARM Template file. A basic deployment template for a connection looks like this:

{
    "type": "Microsoft.Web/connections",
    "apiVersion": "2016-06-01",
    "location": "[resourceGroup().location]",
    "name": "{ConnectionName}",
    "properties": {
        "api": {
            "id": "[concat('subscriptions/', subscription().subscriptionId, '/providers/Microsoft.Web/locations/', resourceGroup().location, '/managedApis/{Api}')]"
        },
        "displayName": "{DisplayName}",
        "parameterValues": { 
            "{ParameterValues}": "{ParameterValues}"
        }
    }
}

In this template, you have to replace the following tokens:

Token              Description
{ConnectionName}   The technical name of your connection
{Api}              The API kind; in our case ‘wunderlist’
{DisplayName}      The display name of your connection
{ParameterValues}  A key/value list of parameters

Retrieve the API Parameters

The {ParameterValues} depend on the API kind. The question is how to retrieve the needed parameters for a given API. I personally use armclient to get the API metadata:

armclient.exe get https://management.azure.com/subscriptions/{subscriptionId}/providers/Microsoft.Web/locations/{region}/managedApis/{Api}?api-version=2016-06-01

From the response, you can see the parameters you need for your deployment. In our case, for Azure Storage, they are accountName and accessKey.

{
    "properties": {
        "name": "azureblob",
        "connectionParameters": {
            "accountName": {
                "type": "string",
                "uiDefinition": {
                    "displayName": "Azure Storage Account name",
                    "description": "Name of the storage account the connector should use.",
                    "tooltip": "Provide the storage account name",
                    "constraints": {
                        "required": "true"
                    }
                }
            },
            "accessKey": {
                "type": "securestring",
                "uiDefinition": {
                    "displayName": "Azure Storage Account Access Key",
                    "description": "Specify a valid primary/secondary storage account access key.",
                    "tooltip": "Specify a valid primary/secondary storage account access key.",
                    "constraints": {
                        "required": "true"
                    }
                }
            }
        }
        // Cut for brevity
    }
}

Now we can add our connection resources to the ARM Template. We start by creating two parameters to make our template re-usable, then add the API connections to the resources array.

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "workflows_test_name": {
            "defaultValue": "test",
            "type": "string"
        },
        "storage_accountName": {
            "type": "string"
        },
        "storage_accessKey": {
            "type": "securestring"
        }
    },
    "resources": [
        {
            "type": "Microsoft.Web/connections",
            "apiVersion": "2016-06-01",
            "location": "[resourceGroup().location]",
            "name": "azureblob",
            "properties": {
                "api": {
                    "id": "[concat('subscriptions/', subscription().subscriptionId, '/providers/Microsoft.Web/locations/', resourceGroup().location, '/managedApis/azureblob')]"
                },
                "displayName": "test blob",
                "parameterValues": {
                    "accountName": "[parameters('storage_accountName')]",
                    "accessKey": "[parameters('storage_accessKey')]"
                }
            }
        },
        {
            "type": "Microsoft.Web/connections",
            "apiVersion": "2016-06-01",
            "location": "[resourceGroup().location]",
            "name": "wunderlist",
            "properties": {
                "api": {
                    "id": "[concat('subscriptions/', subscription().subscriptionId, '/providers/Microsoft.Web/locations/', resourceGroup().location, '/managedApis/wunderlist')]"
                },
                "displayName": "test list",
                "parameterValues": {
                }
            }
        }
        // Cut for brevity
    ]
}

Note that we did not add any content to parameterValues for the Wunderlist connection because it is an OAuth connection. We then add these two new resources as dependencies in the Logic App’s dependsOn array and reference them in its $connections parameter.

{
    "resources": [
        // Cut for brevity
        {
            "type": "Microsoft.Logic/workflows",
            "name": "[parameters('workflows_test_name')]",
            "apiVersion": "2016-06-01",
            // Cut for brevity
            "properties": {
                // Cut for brevity
                "parameters": {
                    "$connections": {
                        "value": {
                            "azureblob": {
                                "connectionId": "[resourceId('Microsoft.Web/connections', 'azureblob')]",
                                "connectionName": "azureblob",
                                "id": "[reference(concat('Microsoft.Web/connections/', 'azureblob'), '2016-06-01').api.id]"
                            },
                            "wunderlist_1": {
                                "connectionId": "[resourceId('Microsoft.Web/connections', 'wunderlist')]",
                                "connectionName": "wunderlist",
                                "id": "[reference(concat('Microsoft.Web/connections/', 'wunderlist'), '2016-06-01').api.id]"
                            }
                        }
                    }
                }
            },
            "dependsOn": [
                "[resourceId('Microsoft.Web/connections', 'azureblob')]",
                "[resourceId('Microsoft.Web/connections', 'wunderlist')]"
            ]
        }
    ]
}

If you deploy the ARM Template now, you will see that both API Connections have been created. But when you open the Logic App, you will have to update the Wunderlist connection manually by entering your credentials for the service.

If you need to finalize the API Connection creation without opening every Logic App, you can use the PowerShell script LogicAppConnectionAuth. This script retrieves a consent link for a connection of an OAuth Logic Apps connector, opens the consent link, and completes the authorization to enable the connection.

What’s next?

I hope this will help you with your Logic Apps deployments. The experience of creating API Connections for OAuth services is far from ideal, but for Azure resources it works perfectly.


Jekyll on Docker for Windows

This blog is hosted on GitHub Pages and is generated by Jekyll. Hosting a personal blog on GitHub is cool because it is totally free, and it is also very fast since the pages are static.

The only drawback for a Microsoft guy like me is that I need to install Ruby on my machine to be able to test my website. It is relatively easy to install Ruby and Jekyll on Windows, even though it is not officially supported. But every time I try to install Ruby on my machine, I run into nasty HTTPS issues when installing gems.

Last week, my machine crashed and I had to rebuild it. This time I took another route to get Jekyll up & running on my machine: I use Docker!

The Jekyll folks have made a Docker image available on Docker Hub, and the documentation can be found on GitHub. I will show you how to get it running on your Windows machine.

First, we pull the image:

docker pull jekyll/jekyll:pages

Note that we use the pages tag because we are targeting GitHub Pages. This image contains the same gems that are available on GitHub Pages.

Now, we need to run Jekyll from the Docker image:

docker run --rm --volume=$(pwd):/srv/jekyll -p 4000:4000  jekyll/jekyll:pages jekyll serve --watch --incremental --force_polling

You will see something like this:

Configuration file: /srv/jekyll/_config.yaml
Configuration file: /srv/jekyll/_config.yaml
            Source: /srv/jekyll
       Destination: /srv/jekyll/_site
 Incremental build: enabled
      Generating...
                    done in 2.02 seconds.
 Auto-regeneration: enabled for '/srv/jekyll'
Configuration file: /srv/jekyll/_config.yaml
    Server address: http://0.0.0.0:4000/
  Server running... press ctrl-c to stop.

That’s all! Now browse to http://localhost:4000 and your site is running. So far, so cool. But it is a bit cumbersome to type all the command arguments every time you want to build your site. Let’s welcome Docker Compose. Just create a file called docker-compose.yml in your website folder with this content:

jekyll:
    image: jekyll/jekyll:pages
    command: jekyll serve --watch --incremental --force_polling
    ports:
        - 4000:4000
    volumes:
        - .:/srv/jekyll

Here you can define the volume as a relative path. This file also takes care of pulling the image from the internet if it is not present on your machine. Now just run docker-compose up whenever you want to build your website.


Reboot

As a new year has just begun, it is time to make good resolutions. So I decided to give my blog a new life.

Keep Calm and Reboot

Until now, this blog was in French. From now on it will be in English in order to reach a wider audience. It will also deal with the subjects I am currently passionate about, namely cloud computing in general and Azure in particular. But I have not forgotten my previous loves, and I will come back to subjects like UWP, Xamarin and ASP.NET…

Stay tuned for more posts very soon…


Managing content permissions in Orchard

Sometimes you may need to manage permissions on the content of your Orchard site. For a few versions now, a module for this has existed and is part of the source code. Its name is “Content Item Permissions” and it is described as “Allows item-level front end view permissions.”.

Unfortunately, the documentation for this module is almost nonexistent, or at least I could not find it. It is in fact a Content Part, which you therefore have to add to the definition of the Content Type(s) you want to be able to secure.

So let’s see how to set up this module. First, the module must be activated…

Activate Module

Next, the Content Part must be added to the chosen Content Type. For this example we will modify the Page type. Go to “Content Definition” and click the corresponding “Edit” button.

Then click the “Add Parts” button, select “Content Permissions” from the list, and confirm with “Save”. The Content Type edit page is displayed again and you can now see the new part; click the expand button to display the default settings.

Content Permissions

To define the permissions of a page, you now edit it and select “Enable Content Item access control”…

When a page has access rights defined, a padlock is displayed on its item in the “Content Items” list.

Content Items


TechDays 13 - The presentations are available

The presentations from TechDays 13, which took place in Lausanne, are now available online. Feel free to download the presentation I co-hosted with Valerie Alonso and Xavier Bocken:

Windows Store Apps using HTML and JavaScript: Become a Windows App Store developer in 60 minutes


WPF - Custom sort in the DataGrid

This post, http://blogs.msdn.com/jgoldb/archive/2008/08/26/improving-microsoft-datagrid-ctp-sorting-performance.aspx, explains how to get better performance when sorting the WPF DataGrid.

Here is a variant that lets you sort a DataGrid bound to an XmlDataProvider.

public class XmlDataGridComparer<T> : IComparer where T : IComparable
{
  private readonly ListSortDirection m_sortDirection;
  private readonly string m_sortMemberPath;

  public XmlDataGridComparer(DataGridColumn column)
  {
    m_sortMemberPath = column.GetSortMemberPath();
    m_sortDirection = column.ToggleSortDirection();
  }

  public int Compare(object x, object y)
  {
    XmlElement a = (XmlElement)x;
    XmlElement b = (XmlElement)y;

    if (a == null || b == null)
    {
      throw new ArgumentException();
    }

    T da = GetObject(a);
    T db = GetObject(b);

    if (m_sortDirection == ListSortDirection.Ascending)
    {
      return da.CompareTo(db);
    }
    else
    {
      return db.CompareTo(da);
    }
  }

  private T GetObject(XmlElement element)
  {
    string value = element.GetElementsByTagName(m_sortMemberPath).Item(0).InnerXml;
    return (T)Convert.ChangeType(value, typeof(T));
  }
}

This code uses the following two extension methods on DataGridColumn:

public static class DataGridColumnExtension
{
  /// <summary>
  /// Gets the sort member path.
  /// </summary>
  /// <param name="column">The sorted column.</param>
  /// <returns>The sort member path.</returns>
  public static string GetSortMemberPath(this DataGridColumn column)
  {
    string sortPropertyName = column.SortMemberPath;

    if (string.IsNullOrEmpty(sortPropertyName))
    {
      DataGridBoundColumn boundColumn = column as DataGridBoundColumn;

      if (boundColumn != null)
      {
        // DataFieldBinding comes from the WPF DataGrid CTP; later releases renamed it to Binding.
        Binding binding = boundColumn.DataFieldBinding as Binding;

        if (binding != null)
        {
          if (!string.IsNullOrEmpty(binding.XPath))
          {
            sortPropertyName = binding.XPath;
          }
          else if (binding.Path != null)
          {
            sortPropertyName = binding.Path.Path;
          }
        }
      }
    }

    return sortPropertyName;
  }

  /// <summary>
  /// Toggles the sort direction.
  /// </summary>
  /// <param name="column">The sorted column.</param>
  /// <returns>
  ///     <see cref="ListSortDirection.Ascending"/> or <see cref="ListSortDirection.Descending"/>
  ///     depending of the previous sort direction.
  /// </returns>
  public static ListSortDirection ToggleSortDirection(this DataGridColumn column)
  {
    ListSortDirection sortDirection = ListSortDirection.Ascending;
    ListSortDirection? currentSortDirection = column.SortDirection;

    if (currentSortDirection.HasValue && currentSortDirection.Value == ListSortDirection.Ascending)
    {
      sortDirection = ListSortDirection.Descending;
    }

    column.SortDirection = sortDirection;
    return sortDirection;
  }
}

All that remains is to instantiate the XmlDataGridComparer class with the type of object contained in the column, inside the DataGrid’s Sorting event:

ListCollectionView lcv = (ListCollectionView)CollectionViewSource.GetDefaultView(dataGrid.ItemsSource);
lcv.CustomSort = new XmlDataGridComparer<string>(e.Column);
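
A possible Sorting event handler wiring everything together could look like this (a sketch; the handler name is mine):

private void DataGrid_Sorting(object sender, DataGridSortingEventArgs e)
{
    ListCollectionView lcv = (ListCollectionView)CollectionViewSource.GetDefaultView(dataGrid.ItemsSource);
    lcv.CustomSort = new XmlDataGridComparer<string>(e.Column);

    // Mark the event as handled so the DataGrid does not apply its default sorting on top of ours.
    e.Handled = true;
}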

Et voilà…


Tooltip with rounded corners

For those who, like me, are looking to create a Tooltip with rounded corners:

Tooltips personalizadas com WPF by Bruno Sonnino.


Use your code-behind for binding

SerialSeb: WPF Tips’n’Tricks #8: Use your code-behind for binding


Control templates

A very good article describing how to customize WPF controls:

Les modèles de contenus et les Templates des contrôles


Logging EventTopicException exceptions

When using SCSF/CAB, you sometimes run into somewhat obscure exceptions of the form ‘One or more exceptions occurred while firing the topic ….’.

These exceptions are thrown when a function called within a CAB event has a problem. To get more information while debugging, you can simply examine the Exceptions property, which contains an array of the exceptions thrown during the call to the event. But what do you do when the application is no longer under the developer’s control and the log only gives you that wonderful ‘One or more blah blah…’ message?

The answer is to use the EventTopicExceptionFormatter class, which SCSF generates automatically in Infrastructure.Library. You then just have to configure a new exception type in EntLib:

<exceptionHandling>
  <exceptionPolicies>
    <add name="Default Policy">
    <exceptionTypes>
      <add type="Microsoft.Practices.CompositeUI.EventBroker.EventTopicException, Microsoft.Practices.CompositeUI, Version=1.0.51205.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
          postHandlingAction="NotifyRethrow" name="EventTopicException">
          <exceptionHandlers>
            <add logCategory="General" eventId="100" severity="Error" title="GMS Exception Handling"
                 formatterType="YourNamespace.Infrastructure.Library.EntLib.EventTopicExceptionFormatter, Infrastructure.Library"
                 priority="0" type="Microsoft.Practices.EnterpriseLibrary.ExceptionHandling.Logging.LoggingExceptionHandler, Microsoft.Practices.EnterpriseLibrary.ExceptionHandling.Logging, Version=3.1.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"
                 name="Logging Handler" />
          </exceptionHandlers>
      </add>
    </exceptionTypes>
    </add>
  </exceptionPolicies>
 </exceptionHandling>

Et voilà! Now your logs contain all the exceptions thrown during your event.