SKILLS & TRICKS
Let's admit it: migration of any sort is a pain, and migrating something to the cloud for the first time is always challenging. I have been going through this for quite some time and finally had the chance to play with Azure Migrate, and it has been a good experience overall.
If you plan to move to Azure, you will have two choices for your current environment:
The Azure Migrate service assesses on-premises workloads for migration to Azure. The service assesses the migration suitability of on-premises machines, performs performance-based sizing, and provides cost estimates for running on-premises machines in Azure. If you're contemplating lift-and-shift migrations, or are in the early assessment stages of migration, this service is for you. After the assessment, you can use services such as Azure Site Recovery and Azure Database Migration Service to migrate the machines to Azure.
Why use Azure Migrate?
Azure Migrate helps you to:
Since ASP.NET Core 2.2 was released, I have been working on updating all my applications to use in-process hosting. I pretty quickly hit an issue with an application that uses SQLite. As soon as the application tried to access the database, I got the following error.
SqliteException: SQLite Error 14: 'unable to open database file'.
Microsoft.Data.Sqlite.SqliteException.ThrowExceptionForRC(int rc, sqlite3 db)
Microsoft.Data.Sqlite.SqliteConnection.Open()
Microsoft.EntityFrameworkCore.Storage.RelationalConnection.OpenDbConnection(bool errorsExpected)
Issue and Workaround
After some Googling, I found an issue on GitHub that details the problem. It turns out that when the application gets its current directory, it returns the path of the IIS process that is hosting the application instead of the directory where the application is.
On another GitHub issue, I found a link to a recommended workaround. Add the following class somewhere in your application.
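The helper class circulated on that GitHub issue has roughly the following shape. Treat this as a sketch: it resolves the site's physical path (from the `ASPNETCORE_IIS_PHYSICAL_PATH` environment variable, falling back to asking the in-process IIS native module) and sets it as the current directory, so relative SQLite paths resolve correctly.

```csharp
using System;
using System.Runtime.InteropServices;

internal static class CurrentDirectoryHelpers
{
    private const string AspNetCoreModuleDll = "aspnetcorev2_inprocess.dll";

    [DllImport("kernel32.dll")]
    private static extern IntPtr GetModuleHandle(string lpModuleName);

    [DllImport(AspNetCoreModuleDll)]
    private static extern int http_get_application_properties(ref IISConfigurationData iisConfigData);

    [StructLayout(LayoutKind.Sequential)]
    private struct IISConfigurationData
    {
        public IntPtr pNativeApplication;
        [MarshalAs(UnmanagedType.BStr)] public string pwzFullApplicationPath;
        [MarshalAs(UnmanagedType.BStr)] public string pwzVirtualApplicationPath;
        public bool fWindowsAuthEnabled;
        public bool fBasicAuthEnabled;
        public bool fAnonymousAuthEnable;
    }

    public static void SetCurrentDirectory()
    {
        try
        {
            // Check if the physical path was provided by the native module
            var sitePhysicalPath = Environment.GetEnvironmentVariable("ASPNETCORE_IIS_PHYSICAL_PATH");
            if (string.IsNullOrEmpty(sitePhysicalPath))
            {
                // Do nothing if we are not hosted in-process in IIS
                if (GetModuleHandle(AspNetCoreModuleDll) == IntPtr.Zero)
                {
                    return;
                }

                IISConfigurationData configurationData = default;
                if (http_get_application_properties(ref configurationData) != 0)
                {
                    return;
                }

                sitePhysicalPath = configurationData.pwzFullApplicationPath;
            }

            // Point the current directory at the application folder
            Environment.CurrentDirectory = sitePhysicalPath;
        }
        catch
        {
            // Ignore failures; worst case the current directory is unchanged
        }
    }
}
```

Then call `CurrentDirectoryHelpers.SetCurrentDirectory();` as the first line of `Main`, before the web host is built.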
Given the diversity of operating systems supported by Docker and the differences between .NET Framework and .NET Core, you should target a specific OS and specific versions depending on the framework you are using.
For Windows, you can use Windows Server Core or Windows Nano Server. These Windows versions provide different characteristics (IIS in Windows Server Core versus a self-hosted web server like Kestrel in Nano Server) that might be needed by .NET Framework or .NET Core, respectively.
For Linux, multiple distros are available and supported in official .NET Docker images (such as Debian). In the image below, you can see the possible OS versions depending on the .NET framework used.
You can also create your own Docker image in cases where you want to use a different Linux distro or where you want an image with versions not provided by Microsoft. For example, you might create an image with ASP.NET Core running on the traditional .NET Framework and Windows Server Core, which is a not-so-common scenario for Docker.
When you add the image name to your Dockerfile file, you can select the operating system and version depending on the tag you use, as in the following examples:
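The original examples did not survive in this copy; the following reconstruction uses the tag scheme of Microsoft's official Docker images from this era (the exact version tags are illustrative):

```dockerfile
# Linux container with the ASP.NET Core 2.2 runtime (Debian-based by default)
FROM microsoft/dotnet:2.2-aspnetcore-runtime

# Windows Nano Server container with the ASP.NET Core 2.2 runtime
FROM microsoft/dotnet:2.2-aspnetcore-runtime-nanoserver-1809

# Windows Server Core container with the full .NET Framework 4.7.2 and IIS
FROM microsoft/aspnet:4.7.2-windowsservercore-ltsc2016
```

The tag after the colon is what selects both the framework version and the underlying OS image.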
You should use .NET Core, with Linux or Windows Containers, for your containerized Docker server application when:
You should use .NET Framework for your containerized Docker server application when:
Using .NET Framework on Docker can improve your deployment experience by minimizing deployment issues. This "lift and shift" scenario is important for containerizing legacy applications that were originally developed with the traditional .NET Framework, like ASP.NET Web Forms, MVC web apps, or WCF (Windows Communication Foundation) services.
After being in preview for quite some time, Azure Storage Explorer is now available in general availability (GA).
You can get it from:
With Azure Storage Explorer, you can directly access your Azure Storage from your preferred client to download/upload content and manage your blobs, files, queues, tables, or even your Cosmos DB entities.
To connect to your Azure tenant (public, government, or China clouds), you can use your credentials, a connection string, a shared access signature URL, or the storage account key.
You can add multiple accounts to connect to your Azure Storage using the View > Account Management menu.
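For reference, a storage-account connection string that you would paste into Storage Explorer has this general shape (the account name and key below are placeholders):

```
DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=<base64-account-key>;EndpointSuffix=core.windows.net
```

You can copy the real value from the Access keys blade of the storage account in the Azure portal.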
For the past few weeks, a lot has been spoken and written about GDPR compliance. I have always held the view that Europeans are much more intelligent when it comes to compliance and regulation of personal data than Americans. I am not going to use the word privacy, because it's been the most mocked-around word in some quarters, and of course in the USA we all know how well it's protected and implemented. :D
I'll start by first highlighting some key aspects of GDPR.
What is GDPR?
It's not something new; before GDPR we had the Data Protection Act, so if you had that implemented, you will go through less pain, since a lot of its elements are partially covered. The whole idea is to know how data on EU citizens is collected, where it resides, how it is stored, processed, and deleted, who can access it, and how it is used. This means that organizations will be required to show the data flow or lifecycle, to minimize any risk of personal data being leaked, and to show that all required steps are in place under GDPR.
In short, GDPR codifies common-sense data security ideas, especially from the Privacy by Design school of thought: minimize the collection of personal data, delete personal data that's no longer necessary, restrict access, and secure data through its entire lifecycle. It also adds requirements for documenting IT procedures, performing risk assessments under certain conditions, and notifying consumers and authorities when there is a breach, as well as strengthening rules for data minimization.
A few weeks back, Pelle (Niteco's CEO) and I had a call with Microsoft Azure Support based in Singapore. We discussed some of the challenges we were facing, and one of the critical issues was making invoicing more structured. It's not always easy to have your customers' hosted apps or backups in your environment under a different resource group and then share the billing details internally with the accounting team for invoicing.
Let's face it, not everyone in an IT company has a technical mind, and people in finance only care about numbers. They ideally like to see how the billing gets calculated and how it matches the invoice they receive from Microsoft every month. Ahhhh, it's really a pain at times to break everything down and present it to them; the time and effort it took was just too much. If you are a CSP, then it's a different story altogether, and I guess CSP Tier 2 partners also have a much simpler dashboard for getting the invoicing details.
Thank You Microsoft for Adding the Billing Reader Role to Azure
After a few back-and-forth calls and some initial testing, I was told by Microsoft Support that they would be adding a new user role in the coming weeks. I was excited about it, and on Tuesday, the 25th of May, this was released, at least for our environment. The new Billing Reader role allows you to delegate access to just billing information, with no access to services such as VMs and storage accounts. Users in this role can perform Azure billing management operations such as viewing subscription-scoped cost reporting data and downloading invoices.
Microsoft has rolled out Azure Managed Disks this month, and it's available to all to simplify the management, scaling, and sizing of disks for VMs.
Since I am still pretty much a .NET guy, I downloaded the Azure Management Libraries for .NET to manage Managed Disks.
GitHub Link: https://github.com/Azure/azure-sdk-for-net/tree/Fluent
Time to play :)
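As a quick taste of the fluent management libraries linked above, creating a standalone managed data disk looks roughly like this (the auth file path, resource names, and region are made up for illustration):

```csharp
using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent.Core;

class Program
{
    static void Main()
    {
        // Authenticate with a service principal credentials file
        var azure = Azure.Authenticate("my.azureauth")
            .WithDefaultSubscription();

        // Create an empty 100 GB managed data disk
        var disk = azure.Disks.Define("my-data-disk")
            .WithRegion(Region.EuropeNorth)
            .WithNewResourceGroup("my-resource-group")
            .WithData()
            .WithSizeInGB(100)
            .Create();
    }
}
```

The same fluent chain style is used for attaching managed disks when defining VMs, which is what makes the library pleasant to work with.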
We are living in an era of cloud computing, and though everyone seems to talk about it, very few seem to understand what it really is and how quickly it's transforming the development of business applications. Guys like me, on the other hand, seem to be a bit more relaxed about putting apps on the cloud, mostly Microsoft Azure, due to its ease and way of getting things done. The one thing which I really like is that Microsoft is building it rapidly and adding new features so fast that it's a bit of a pain at times. Ahh, welcome to the life of a technology enthusiast, in this case a cloud architect.
A few years back, everyone was talking about virtualization and how it changed the IT landscape. Now Microsoft Azure has come up with containers, the next big thing in the cloud computing world. Containers can run on any hardware, cloud, or environment without much modification. What this means is that customers can now write their apps once and deploy them everywhere: test, dev, or production.