SKILLS & TRICKS
Setting up Azure MFA as primary authentication with ADFS requires you to move to Azure MFA (the cloud-based version). I have not deployed Azure Multi-Factor Authentication Server (the on-prem/hybrid version) for anyone in a few years, as pretty much everyone I work with has moved to cloud-based Azure MFA. Feature parity is close at this point and, in my opinion, the days of the on-prem Azure MFA Server are numbered; if you're still on it, migrating to cloud-based Azure MFA is very easy. For this step-by-step guide, I'm going to assume you already have a working ADFS environment that is federated with Azure AD using Azure AD Connect.

Configure ADFS and Azure MFA to work

1. Log into your ADFS server. In my example, I am using ADFS 4.0 with a Farm Behavior Level (FBL) of 3, which means Windows Server 2016 and an Active Directory 2016 schema. You can always run the PowerShell cmdlet "Get-AdfsFarmInformation" on your ADFS server to show your FBL version (see the snippet below). Go ahead and open the AD FS console:
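A minimal PowerShell sketch of that check, assuming the ADFS module that ships with Windows Server 2016; verify the property name on your own farm:

```powershell
# Run on an ADFS server to display the farm's Farm Behavior Level (FBL)
Import-Module ADFS
Get-AdfsFarmInformation | Select-Object CurrentFarmBehavior

# A CurrentFarmBehavior of 3 corresponds to Windows Server 2016
```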
Azure Application Gateway is a web traffic load balancer that enables you to manage traffic to your web applications. Traditional load balancers operate at the transport layer (OSI layer 4 – TCP and UDP) and route traffic based on source IP address and port to a destination IP address and port. Application Gateway, by contrast, can make routing decisions based on additional attributes of an HTTP request, such as the URI path or host headers (OSI layer 7).
Prerequisites
This solution is based on letsencrypt-webapp-renewer. It uses the same core library as the Azure Let's Encrypt site extension, but it runs as a WebJob. It can (and should) be installed on its own web app, and it supports multiple target websites.
The author of letsencrypt-webapp-renewer has written thorough instructions, so I won't copy them here. When granting the service principal rights, you may want to add only the Website Contributor and Web Plan Contributor roles instead of the broader Contributor role (see the sketch below). The only thing needed in addition to those instructions is support from the target web app itself.
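A rough sketch of granting those narrower roles with the Azure CLI; the service principal app ID, subscription ID, and resource group are placeholders, and the right scope depends on where your web app and App Service plan live:

```powershell
# Placeholders: <appId>, <subscription-id>, <resource-group>
az role assignment create `
    --assignee "<appId>" `
    --role "Website Contributor" `
    --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"

az role assignment create `
    --assignee "<appId>" `
    --role "Web Plan Contributor" `
    --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
```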
According to the Azure CLI documentation, you need to use "az storage message put" to add a message to a storage queue.
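For example, a minimal sketch; the queue name and account details are placeholders:

```powershell
# Put a message onto an Azure storage queue
az storage message put `
    --queue-name "myqueue" `
    --content "Hello from the CLI" `
    --account-name "mystorageaccount" `
    --account-key "<storage-account-key>"
```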
Let's admit it: migration of any sort is a pain, and migrating something to the cloud for the first time is always challenging. I have been going through this for quite some time and finally had the chance to play with Azure Migrate, and it has been a good experience overall. If you plan to go to Azure, you will have two choices for your current environment:
Azure Migrate

The Azure Migrate service assesses on-premises workloads for migration to Azure. The service assesses the migration suitability of on-premises machines, performs performance-based sizing, and provides cost estimates for running those machines in Azure. If you're contemplating lift-and-shift migrations, or are in the early assessment stages of migration, this service is for you. After the assessment, you can use services such as Azure Site Recovery and Azure Database Migration Service to migrate the machines to Azure.

Why use Azure Migrate?

Azure Migrate helps you to assess Azure readiness, get size recommendations, estimate monthly running costs, and visualize dependencies between machines so you can group and migrate them together with confidence.
Since ASP.NET Core 2.2 was released, I have been working on getting all my different applications updated and using in-process hosting. I pretty quickly hit an issue with an application that uses SQLite. As soon as the application tried to access the database, I ended up with the following error:

SqliteException: SQLite Error 14: 'unable to open database file'
   Microsoft.Data.Sqlite.SqliteException.ThrowExceptionForRC(int rc, sqlite3 db)
   Microsoft.Data.Sqlite.SqliteConnection.Open()
   Microsoft.EntityFrameworkCore.Storage.RelationalConnection.OpenDbConnection(bool errorsExpected)

Issue and Work Around

After some Googling, I found an issue on GitHub that details the problem. It turns out that when the application gets its current directory, it returns the path of the IIS process that is hosting the application instead of the directory where the application is. On another GitHub issue, I found a link to a recommended workaround: add a helper class somewhere in your application, along the lines of the sketch below.
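The class from the linked issue is more involved (it queries the ASP.NET Core Module for the site's physical path); the following is a simpler sketch of the same idea, not the exact code from the issue:

```csharp
using System;
using System.IO;

namespace MyApp
{
    internal static class CurrentDirectoryHelpers
    {
        public static void SetCurrentDirectory()
        {
            // Under IIS in-process hosting, the process current directory
            // points at the IIS process, so relative SQLite paths break.
            // Reset it to the directory the application was loaded from.
            Directory.SetCurrentDirectory(AppContext.BaseDirectory);
        }
    }
}
```

Call CurrentDirectoryHelpers.SetCurrentDirectory() at the top of Main in Program.cs, before the host is built, so it runs before any SQLite connection is opened.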
Given the diversity of operating systems supported by Docker and the differences between .NET Framework and .NET Core, you should target a specific OS and specific versions depending on the framework you are using. For Windows, you can use Windows Server Core or Windows Nano Server. These Windows versions provide different characteristics (IIS in Windows Server Core versus a self-hosted web server like Kestrel in Nano Server) that might be needed by .NET Framework or .NET Core, respectively. For Linux, multiple distros are available and supported in official .NET Docker images (like Debian); which OS versions are possible depends on the .NET framework you use. You can also create your own Docker image in cases where you want to use a different Linux distro or where you want an image with versions not provided by Microsoft. For example, you might create an image with ASP.NET Core running on the traditional .NET Framework and Windows Server Core, which is a not-so-common scenario for Docker.
When you add the image name to your Dockerfile, you can select the operating system and version depending on the tag you use, as in the following examples:
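These tags change over time, so treat them as illustrative of the tagging scheme rather than current recommendations; check the Microsoft Container Registry for up-to-date tags:

```dockerfile
# .NET Core on Linux (Debian-based by default)
FROM mcr.microsoft.com/dotnet/core/aspnet:2.2

# .NET Core on Windows Nano Server
FROM mcr.microsoft.com/dotnet/core/aspnet:2.2-nanoserver-1809

# .NET Framework on Windows Server Core
FROM mcr.microsoft.com/dotnet/framework/aspnet:4.7.2-windowsservercore-ltsc2019
```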
General Overview

You should use .NET Core, with Linux or Windows containers, for your containerized Docker server application when:

- You have cross-platform needs (for example, you want to run on both Linux and Windows containers).
- Your architecture is based on microservices.
- You need the best density and small images, so you can start containers fast.
- You need side-by-side versioning of .NET per application.
You should use .NET Framework for your containerized Docker server application when:

- Your application currently uses .NET Framework and has strong dependencies on Windows.
- You need to use Windows APIs that are not supported by .NET Core.
- You need to use third-party libraries or NuGet packages that are not available for .NET Core.
Using .NET Framework on Docker can improve your deployment experience by minimizing deployment issues. This "lift and shift" scenario is important for containerizing legacy applications that were originally developed with the traditional .NET Framework, like ASP.NET Web Forms, MVC web apps, or WCF (Windows Communication Foundation) services.

After being in preview for quite some time, Azure Storage Explorer is now generally available (GA). You can get it from the Azure Storage Explorer download page.
With Azure Storage Explorer you can directly access your Azure Storage from your preferred client to download and upload content and manage your blobs, files, queues, tables, or even your Cosmos DB entities. To connect to your Azure tenant (public, government, or China clouds), you can use your credentials, a connection string, a shared access signature (SAS) URL, or the storage account key. You can add multiple accounts via the View\Account Management menu.
For the past few weeks, a lot has been spoken, written, and talked about regarding GDPR compliance. I have always held the view that Europeans are much more intelligent when it comes to compliance and regulation of personal data than Americans. I am not going to use the word privacy, because it has been the most mocked-around word in some quarters, and of course in the USA we all know how well it's protected and implemented. :D I'll start by highlighting some key aspects of GDPR.
What is GDPR?

It's not something new: before GDPR we had the Data Protection Act, so if you had that implemented you will go through less pain, since a lot of its elements are partially covered. The whole idea is to know how data about EU citizens is collected, where it resides, and how it is stored, processed, deleted, accessed, and used. This means organizations will be required to show the data flow or lifecycle, to minimize any risk of personal data being leaked, and to show that all required steps are in place under GDPR.
In short, GDPR codifies common-sense data security ideas, especially from the Privacy by Design school of thought: minimize collection of personal data, delete personal data that's no longer necessary, restrict access, and secure data through its entire lifecycle. It also adds requirements for documenting IT procedures, performing risk assessments under certain conditions, notifying consumers and authorities when there is a breach, and strengthening rules for data minimization.