Application Insights with Azure App Service

Application Insights is a simple way to detect and diagnose exceptions and application performance issues in your web apps and web services. In this post, I will walk you through adding it to an ASP.NET MVC application. To take advantage of this, log into your Azure account, open the App Service that you created, and look under Monitoring, where you will find Application Insights. Open it, create a new resource, and press OK. You'll see that you can automatically instrument your ASP.NET app with a restart. Once that completes, visit your app, refresh it a couple of times, and then go back and take a look at the live stream.

If you add Application Insights to your Visual Studio project by right-clicking the project and choosing Configure Application Insights, you can also add code to collect exception telemetry.
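
As a minimal sketch of what that code can look like (assuming the Microsoft.ApplicationInsights NuGet package is installed; OrderService and ProcessOrder here are hypothetical stand-ins for your own code), exceptions can be sent with TelemetryClient.TrackException:

    using System;
    using Microsoft.ApplicationInsights;

    public class OrderService
    {
        // TelemetryClient comes from the Application Insights SDK.
        private readonly TelemetryClient _telemetry = new TelemetryClient();

        public void PlaceOrder(string orderId)
        {
            try
            {
                ProcessOrder(orderId); // hypothetical business logic
            }
            catch (Exception ex)
            {
                // Record the exception in Application Insights, then rethrow
                // so normal error handling still applies.
                _telemetry.TrackException(ex);
                throw;
            }
        }

        private void ProcessOrder(string orderId)
        {
            // ... your own logic here ...
        }
    }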

Keyboard Shortcuts in Azure

Developers often ask for keyboard shortcuts, whatever environment they work in. The Azure portal has the following keyboard shortcuts:

ACTIONS

CTRL+/ Search blade menu items
G+/ Search resources (global)
G+N Create a new resource
G+B Open the ‘More services’ pane

NAVIGATION

G+, Move focus to command bar
G+. Toggle focus between top bar and side bar

GO TO

G+D Go to dashboard
G+A Go to all resources
G+R Go to resource groups
G+number Open the item pinned to the favorites bar at this position

Load Testing with Azure

Check your web app's performance before you launch it or deploy updates to production. That way, you can better assess whether your app is ready for release and feel more confident that it can handle the traffic at peak use or during your next marketing push. You'll need a Visual Studio Team Services (VSTS) account to keep your performance test history; a suitable account is created automatically when you set up your performance test, or you can create a new one or use an existing account that you own. Deploy your app for testing in a non-production environment, and have it use an App Service plan other than the one used in production. That way, you don't affect any existing customers or slow down your app in production.

Set up and run your performance test
1- Sign in to the Azure Portal. To use a VSTS account that you own, sign in as the account owner.

2- Go to your web app.

3- In the DEVELOPMENT TOOLS section, choose Performance test.

4- Now you’ll link a VSTS account to keep your performance test history. Choose Set Account.

5- If you have a VSTS account to use, select that account. If you don’t, create a new account.

6- Choose + New to create a new performance test.

7- Set the details and run the test. Your web app’s default URL is added automatically. You can change the URL to test other pages (HTTP GET requests only). To simulate local conditions and reduce latency, select a location closest to your users for generating load.

You simulate load on your app by generating virtual users (customers) who visit your website at the same time. This shows how many requests are failing or responding slowly.

 

Azure Leads the Industry in ISO Certifications

Microsoft Azure recently completed a new set of independent third-party ISO and Cloud Security Alliance (CSA) audits to expand its certification portfolio. Azure leads the industry with the most comprehensive compliance coverage, enabling customers to meet a wide range of regulatory obligations. Depth and coverage specific to ISO are especially useful to customers globally, because ISO standards provide baselines for information security management that many other standards across regulated industries and markets worldwide rely on. A combination of our ISO and CSA certifications exists in all four Azure clouds, and that coverage has now been newly expanded.


Achieving the ISO 20000-1:2011 certification specifically underscores Azure's commitment to delivering quality IT service management to customers and demonstrates Azure's capability to monitor, measure, and improve service management processes. The CSA STAR Certification involves a rigorous independent third-party assessment of a cloud provider's security posture that combines ISO 27001 certification with criteria specified in the CSA Cloud Controls Matrix. Azure maintains the highest possible Gold Award for the maturity capability assessment of the CSA STAR Certification, which is now also available in the Azure Government cloud. In addition to the broadest compliance portfolio among enterprise cloud providers, Azure maintains the deepest coverage as measured by how many customer-facing services are in audit scope. For example, recently completed Azure ISO 27001 and ISO 27018 audits have 61 customer-facing services in audit scope, making it possible for customers to build realistic ISO-compliant cloud applications with end-to-end platform coverage.

Host a RESTful API with CORS in Azure App Service

This post explains the required steps to deploy an ASP.NET Core API app to App Service with CORS support. You can configure the app using command-line tools and deploy the app using Git.

Azure App Service provides a highly scalable, self-patching web hosting service. In addition, App Service has built-in support for Cross-Origin Resource Sharing (CORS) for RESTful APIs. 
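
App Service's built-in CORS is configured from the portal or the command line, but if you prefer to handle CORS inside the app itself, here is a minimal ASP.NET Core sketch (the policy name "AllowMyFrontend" and the origin URL are placeholders, not values from this post):

    using Microsoft.AspNetCore.Builder;
    using Microsoft.Extensions.DependencyInjection;

    public class Startup
    {
        public void ConfigureServices(IServiceCollection services)
        {
            // Register a CORS policy that allows one specific origin.
            services.AddCors(options =>
            {
                options.AddPolicy("AllowMyFrontend", policy =>
                    policy.WithOrigins("https://contoso.example.com")
                          .AllowAnyHeader()
                          .AllowAnyMethod());
            });
            services.AddMvc();
        }

        public void Configure(IApplicationBuilder app)
        {
            // Apply the policy before MVC handles the request.
            app.UseCors("AllowMyFrontend");
            app.UseMvc();
        }
    }

Note that if you rely on App Service's built-in CORS feature, you generally should not also enable CORS in the app; pick one layer and stick with it.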

The required steps:

  • Install Git.
  • Install .NET Core.
  • Create a local ASP.NET Core app.

API apps in Azure App Service offer features that make it easier to develop, host, and consume APIs in the cloud and on-premises. With API apps you get enterprise-grade security, simple access control, hybrid connectivity, automatic SDK generation, and seamless integration with Logic Apps. In simple words, it is a platform for hosting web apps that provides the most common API features without requiring you to code them yourself.

We can directly host the application in a web app and leverage all of the services listed below:

  • Inbuilt Swagger integration.
  • Ability to push your API apps into the Azure Marketplace.
  • API definition.
  • Support for creating an Azure API client from Visual Studio.

We are going to create a demo and discuss all four of these features along the way.

Create an API from Visual Studio and host it in Azure API app

Go to Visual Studio -> Visual C# -> Web -> ASP.NET Web Application, enter the name of the API, and click the OK button. Now, select Azure API App from the dialog box. We could select Web API as well and then publish it as an Azure API app, which would serve the same purpose.

Inbuilt Swagger Integration

  • As we have selected an Azure API app, some of the common Web API packages like Newtonsoft.Json and Swashbuckle.Core (Swagger) come directly with the template.
  • Create an API controller by right-clicking on the Controllers folder and choosing Add -> Controller.
  • Now, select the Web API 2 Empty Controller. You can use any of the controllers, but for this post we are going to use the empty one.
  • Name the controller CalculatorController; since we selected the Web API 2 template, it's going to derive from ApiController (a minimal sketch of such a controller appears after this list).
  • Now, we need to publish the API to an Azure API app instance. Right-click on the project and click Publish.
  • Select Azure API App and enter your credentials to authenticate and log in.
  • Enter the name and select a resource group and App Service plan in the appropriate subscription. If you don't have a resource group or an App Service plan, you can create one from the same wizard by clicking the respective New button and passing the appropriate values. Once all the values are in place, click the Create button. This creates an Azure API app in your Azure account.
  • Once that completes, the published metadata file is downloaded, and you can then click the Publish button to push your binaries.
  • Once the publish is complete, the URL opens in your browser.
  • Append /swagger to the URL and you can see all the methods we created in the controller.
  • Swagger UI also lets you test the methods by acting as a client. We tried the Sub method, passed two parameters to it, and clicked the "TRY IT NOW" button, which calls the API and returns the result. This is very useful, because you can test your APIs straight away and see whether they work without writing a single line of code.
  • Now, go to Azure and open your resource. Select the API definition and you can see that it gives you an option to export the metadata to PowerApps and Microsoft Flow.
  • Now, add a new project and select a console application.
  • Right-click on the project and add a REST API client.
  • Now, add the metadata URL or select the Azure asset we created, and click the OK button to download the metadata associated with it. You can then consume the service, as you would in any other client, using the downloaded metadata.
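
As referenced in the list above, here is a minimal sketch of such a controller. The Add and Sub actions and their routes are illustrative choices, and the attribute routes assume the template's WebApiConfig calls config.MapHttpAttributeRoutes() (the default template does):

    using System.Web.Http;

    public class CalculatorController : ApiController
    {
        // GET api/calculator/add/2/3 -> 200 OK with body 5
        [HttpGet]
        [Route("api/calculator/add/{a}/{b}")]
        public IHttpActionResult Add(int a, int b)
        {
            return Ok(a + b);
        }

        // GET api/calculator/sub/5/3 -> 200 OK with body 2
        [HttpGet]
        [Route("api/calculator/sub/{a}/{b}")]
        public IHttpActionResult Sub(int a, int b)
        {
            return Ok(a - b);
        }
    }

Once the app is published, Swagger UI picks these methods up automatically.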

Health Endpoint Monitoring Design Pattern in Azure

Health Endpoint Monitoring Design Pattern

Implement functional checks in an application that external tools can access through exposed endpoints at regular intervals. This can help to verify that applications and services are performing correctly.

Challenge

It’s a good practice, and often a business requirement, to monitor web applications and back-end services, to ensure they’re available and performing correctly. However, it’s more difficult to monitor services running in the cloud than it is to monitor on-premises services. For example, you don’t have full control of the hosting environment, and the services typically depend on other services provided by platform vendors and others.

There are many factors that affect cloud-hosted applications such as network latency, the performance and availability of the underlying compute and storage systems, and the network bandwidth between them. The service can fail entirely or partially due to any of these factors. Therefore, you must verify at regular intervals that the service is performing correctly to ensure the required level of availability, which might be part of your service level agreement (SLA).

Solution

Implement health monitoring by sending requests to an endpoint on the application. The application should perform the necessary checks, and return an indication of its status.
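
As a minimal sketch of such an endpoint in ASP.NET Web API (assuming attribute routing is enabled; CheckDatabase is a hypothetical probe standing in for whatever dependency checks your app needs):

    using System;
    using System.Net;
    using System.Web.Http;

    public class HealthController : ApiController
    {
        // GET /health -> 200 when dependencies look fine, 503 otherwise.
        [HttpGet]
        [Route("health")]
        public IHttpActionResult Get()
        {
            bool healthy;
            try
            {
                healthy = CheckDatabase(); // probe a critical dependency
            }
            catch (Exception)
            {
                healthy = false;
            }

            return healthy
                ? (IHttpActionResult)Ok("Healthy")
                : Content(HttpStatusCode.ServiceUnavailable, "Unhealthy");
        }

        private bool CheckDatabase()
        {
            // e.g. open a connection and run a trivial query such as SELECT 1
            return true;
        }
    }

An external monitoring tool then polls /health at regular intervals and raises an alert when it sees anything other than a 200 response.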

Scheduler Agent Supervisor Design Pattern

Scheduler Agent Supervisor Design Pattern

Coordinate a set of distributed actions as a single operation. If any of the actions fail, try to handle the failures transparently, or else undo the work that was performed, so the entire operation succeeds or fails as a whole. 

Challenge

An application performs tasks that include a number of steps, some of which might invoke remote services or access remote resources. The individual steps might be independent of each other, but they are orchestrated by the application logic that implements the task.

Whenever possible, the application should ensure that the task runs to completion and resolve any failures that might occur when accessing remote services or resources. Failures can occur for many reasons. If the application detects a more permanent fault it can’t easily recover from, it must be able to restore the system to a consistent state and ensure integrity of the entire operation.

Solution

The Scheduler Agent Supervisor pattern defines the following actors. These actors orchestrate the steps to be performed as part of the overall task.

  • The Scheduler arranges for the steps that make up the task to be executed and orchestrates their operation. These steps can be combined into a pipeline or workflow. The Scheduler is responsible for ensuring that the steps in this workflow are performed in the right order. As each step is performed, the Scheduler records the state of the workflow, such as “step not yet started,” “step running,” or “step completed.” The state information should also include an upper limit of the time allowed for the step to finish, called the complete-by time. If a step requires access to a remote service or resource, the Scheduler invokes the appropriate Agent, passing it the details of the work to be performed. The Scheduler typically communicates with an Agent using asynchronous request/response messaging. This can be implemented using queues, although other distributed messaging technologies could be used instead.

  • The Agent contains logic that encapsulates a call to a remote service, or access to a remote resource referenced by a step in a task. Each Agent typically wraps calls to a single service or resource, implementing the appropriate error handling and retry logic (subject to a timeout constraint, described later). If the steps in the workflow being run by the Scheduler use several services and resources across different steps, each step might reference a different Agent.

  • The Supervisor monitors the status of the steps in the task being performed by the Scheduler. It runs periodically (the frequency will be system specific), and examines the status of steps maintained by the Scheduler. If it detects any that have timed out or failed, it arranges for the appropriate Agent to recover the step or execute the appropriate remedial action. Note that the recovery or remedial actions are implemented by the Scheduler and Agents. The Supervisor should simply request that these actions be performed.

The Scheduler, Agent, and Supervisor are logical components and their physical implementation depends on the technology being used. For example, several logical agents might be implemented as part of a single web service.

The Scheduler maintains information about the progress of the task and the state of each step in a durable data store, called the state store. The Supervisor can use this information to help determine whether a step has failed. 
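
As a small sketch of the kind of record the Scheduler might keep in the state store (the field names are illustrative; the pattern only requires a status and the complete-by time that the Supervisor checks):

    using System;

    public enum StepStatus { NotStarted, Running, Completed, Failed }

    public class StepState
    {
        public Guid TaskId { get; set; }
        public int StepNumber { get; set; }
        public StepStatus Status { get; set; }

        // Upper limit of the time allowed for the step to finish.
        public DateTime CompleteBy { get; set; }

        // The Supervisor flags steps still running past their complete-by
        // time and arranges recovery or remedial action for them.
        public bool HasTimedOut(DateTime now) =>
            Status == StepStatus.Running && now > CompleteBy;
    }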

Azure Ambassador Design Pattern

Azure Ambassador Design Pattern

This pattern creates helper services that send network requests on behalf of a consumer service or application. An ambassador service can be thought of as an out-of-process proxy that is co-located with the client.

This pattern can be useful for offloading common client connectivity tasks such as monitoring, logging, routing, security (such as TLS), and resiliency patterns in a language agnostic way. It is often used with legacy applications, or other applications that are difficult to modify, in order to extend their networking capabilities. It can also enable a specialized team to implement those features.

Challenge

Resilient cloud-based applications require features such as circuit breaking, routing, metering and monitoring, and the ability to make network-related configuration updates. It may be difficult or impossible to update legacy applications or existing code libraries to add these features, because the code is no longer maintained or can’t be easily modified by the development team. Network calls may also require substantial configuration for connection, authentication, and authorization. If these calls are used across multiple applications, built using multiple languages and frameworks, the calls must be configured for each of these instances. In addition, network and security functionality may need to be managed by a central team within your organization. With a large code base, it can be risky for that team to update application code they aren’t familiar with.

Solution

Put client frameworks and libraries into an external process that acts as a proxy between your application and external services. Deploy the proxy on the same host environment as your application to allow control over routing, resiliency, security features, and to avoid any host-related access restrictions. You can also use the ambassador pattern to standardize and extend instrumentation. The proxy can monitor performance metrics such as latency or resource usage, and this monitoring happens in the same host environment as the application.
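
As a toy sketch of the idea (the backend address is a placeholder, and a real ambassador would run as its own co-located process rather than a class inside the client): the legacy client calls the ambassador, which adds retry logic before forwarding the request to the real service.

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    public static class Ambassador
    {
        private static readonly HttpClient Backend = new HttpClient
        {
            BaseAddress = new Uri("https://backend.example.com")
        };

        // Forward a GET to the real service, retrying transient failures
        // so the legacy caller never has to implement resiliency itself.
        public static async Task<string> ForwardWithRetry(string path, int maxAttempts = 3)
        {
            for (int attempt = 1; ; attempt++)
            {
                try
                {
                    HttpResponseMessage response = await Backend.GetAsync(path);
                    response.EnsureSuccessStatusCode();
                    return await response.Content.ReadAsStringAsync();
                }
                catch (HttpRequestException) when (attempt < maxAttempts)
                {
                    // Simple linear back-off between attempts.
                    await Task.Delay(TimeSpan.FromSeconds(attempt));
                }
            }
        }
    }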

Azure App Service

The Azure Mobile Services suite, which offered push notification capability, authentication, and data storage, was discontinued in December 2016. Microsoft moved sites from Mobile Services to Azure App Service, which offers similar functionality.

Mobile Services, introduced in 2012, overlapped with App Service. In the mobile backend world, Azure Mobile Services faced competition from the Cognito service offered by public cloud market leader Amazon Web Services, while Google Cloud Platform has Cloud Endpoints and Firebase, and Facebook is in the process of shutting down its Parse mobile backend.

Please note that your existing Mobile Service is safe and will remain supported. However, there are advantages that the Azure App Service platform provides for your mobile app that are not available with Mobile Services:

  • Simpler, easier, and more cost-effective offering for apps that include both web and mobile clients
  • New host features, including WebJobs, custom CNAMEs, and better monitoring
  • Integration with Traffic Manager
  • Connectivity to your on-premises resources and VPNs using VNet, in addition to Hybrid Connections
  • Monitoring, alerting, and troubleshooting for your app using Application Insights
  • Richer spectrum of underlying compute resources and pricing
  • Built-in auto scale, load balancing, and performance monitoring
  • Built-in staging, backup, rollback, and testing-in-production capabilities

 

 

Azure OData Service

What is OData?

SOAP-based web services were a medium for applications to communicate with each other, and RESTful APIs were released afterwards to make these communications easier. OData is the next step. It stands for Open Data Protocol. The general philosophy behind OData is to standardize the usage of RESTful APIs and to make that usage a simpler affair for applications.

Basically, through the OData protocol, you are able to access any particular bit of data in the target system via an HTTP URL.
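
For example, a hypothetical OData service rooted at https://example.com/MyData.svc would expose addresses like these (the host and entity names are placeholders; the $metadata and $filter conventions are part of the protocol):

    https://example.com/MyData.svc/                                    service document
    https://example.com/MyData.svc/$metadata                           full schema
    https://example.com/MyData.svc/Customers                           an entity set
    https://example.com/MyData.svc/Customers(1)                        one entity, by key
    https://example.com/MyData.svc/Customers?$filter=City eq 'Paris'   filtered query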

How do I expose my application data, as an OData URL?

So, Salesforce can consume data via the OData protocol. But how would you expose your back-office data as an OData service in the first place?

Where do I find test OData Services?

 

Public sample OData services are published by the OData organization and are handy for quick experiments. But if you would like to stand out by creating situation-specific data and a personal-looking OData URL, or simply to get a feel for creating a custom OData service, you can build your own, as described below.

Creating an OData Service

Like I said, one of the ways to create an OData service is through the .NET WCF Data Services templates. That is what we will be using here.

Creating a Cloud Database

To create an OData service, there should be some data to read from in the first place. Let's leverage the Microsoft Azure platform to create a hosted SQL database.

Building an OData Service 

Let us now proceed to build an OData service that can read from this cloud database. We are going to use the WCF Data Service template available in Visual Studio. Of course, you can use earlier versions of Visual Studio, but you might need to go through some extra steps to enable the WCF Data Services template. Described below are the steps to create the OData service.

1- Open Visual Studio, click File > New Project, and choose ASP.NET Web Application. Provide a meaningful project name and solution name. Click OK.

2- In the pop-up that is presented, choose "Empty", as we do not require any pre-built templates at this point.

3- Visual Studio sets up a project and solution space for you. It is good to have the Solution Explorer view enabled for easy navigation (Ctrl+Alt+L). From the Solution Explorer, right-click on your project, and Add > New Item.

4- Choose ADO.NET Entity Data Model, since we first have to generate our DB entity diagram within the context of the application we are making. Visual Studio lets us do all of this using point and click.

5- In the choices for Model Contents, choose EF Designer from Database, which tells Visual Studio to design the entity model based on our DB table structure. Click Next.

6- At this point you are asked for the details of the SQL server to which it needs to make the data connection. Click the "New" button to open the dialog that initiates a new connection request. You now need to know the server name of your Azure DB. Where do you find that?

7- Log in to the Azure management portal, go to the detail page of your DB, and there you will find your server name. Copy it for use here.

8- Once you have the server name, paste it in and enter the SQL authentication credentials that you specified when creating this DB. After a short wait, the available databases on the SQL server are displayed. If you get an error pop-up, it is most likely because your current IP is not in the list of trusted IPs at Azure; add your current IP to your DB's trusted list. The database names might take a while to load, so be patient. As long as you did not get an error, you are good. In the next step, choose "Yes, include sensitive data…" and then choose the entity model. That gives you a screen to choose which database elements need to be brought into the context of this app.

9- The next broad task is to add a data service to this application. WCF Data Services enables you to create services that use the Open Data Protocol (OData) to expose and consume data over the web or an intranet.

10- As you did in step 3, right-click on the project and Add > New Item. In the selection pop-up, choose the WCF Data Services template from the Web category. Be sure to give this data service a proper name, since what you enter here appears as the last part of your OData URL.

11- Once you click Add, Visual Studio opens a template-based data service class that has neat instructions about what to do. Notice a bunch of TODO tags within the template; they narrate exactly what needs to be filled in.

12- Fill in the class name on line 2, remove the comment from the line config.SetEntitySetAccessRule("MyEntityset", EntitySetRights.AllRead);, and replace "MyEntityset" with "*". This grants read access to all data via this OData service (a sketch of the completed class appears after these steps).

13- That is it, you are done. Click the browser button on the top pane to check whether everything is all right. If so, you should get an XML output with a local version of your OData service.

14- Suffix the URL with "$metadata" to get the entire metadata structure of your database.
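
As mentioned in step 12, here is a sketch of what the completed class can look like, assuming the entity container generated earlier is named MyDatabaseEntities (yours will match the name you gave your entity model):

    using System.Data.Services;
    using System.Data.Services.Common;

    public class MyODataService : DataService<MyDatabaseEntities>
    {
        // This method is called only once to initialize service-wide policies.
        public static void InitializeService(DataServiceConfiguration config)
        {
            // "*" grants read access to every entity set in the model.
            config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
            config.DataServiceBehavior.MaxProtocolVersion =
                DataServiceProtocolVersion.V3;
        }
    }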

So now we have an OData service running from our local machine. But that might not be enough for other applications to consume. So, let's get this OData URI cloud-ized. How? Azure again.

1- Sign in to Azure and, via the New button, go to New > Web App > Quick Create. Enter a neat-looking URL, as this constitutes the first part of your OData service URL.

2- Once the web app is created, you are taken to the web app dashboard, where you will find the option to "Download the Publish Profile". A publish profile is a collection of settings that lets a third-party app authenticate and publish applications to this hosted web application. Download and save the publish profile to your computer.

3- Go back to Visual Studio, right-click on the project, invoke the menu, and choose "Publish". If you don't see the Publish button enabled, it could be because you are in debug mode; there is a "Stop" button on the header bar to get you out of the debug session, after which you will be able to see the Publish button.

4- Do the web publishing by importing the publish profile file that you saved a while ago. You will notice that most of the parameters for the web application get filled in automatically, which is why we chose the publish profile import method. Now, click Publish.

5- It might take a while for the entire web application to get published to Azure. Once done, you will see a success page. Since there is no landing page or front end defined for our web application, it will be the generic Azure page.

6- Extend the URL by adding <yourODataServiceName.svc>, and you should be able to see the XML.

That’s it! You have just created a personalized custom OData service URL. 
