
Key Points

  1. Azure offers a free, low-end account tier
  2. Get Office 365 for compatibility, or use LibreOffice and draw.io
  3. Test Hyperledger Fabric on an Azure Linux instance


References

Reference description with linked URLs | Notes
Azure concepts






https://docs.microsoft.com/en-us/azure/azure-functions/functions-overview
https://linuxacademy.com/blog/certifications/azure-certifications-and-roadmap/ - Azure certifications roadmap
https://linuxacademy.com/course/microsoft-azure-fundamentals-az-900-exam-prep/ - Azure AZ-900 fundamentals cert - 16 hrs - free course - 1 exam
Learn Azure in a Month of Lunches ebook
https://www.youtube.com/user/Bryancutube256123/playlists - Bryan Cafferky playlists for Azure software lessons


Azure Cloud pricing options
Use small databases and services with small instances to minimize costs; shut down when not testing.
VMs
https://azure.microsoft.com/en-us/pricing/details/virtual-machines/linux/ - Azure Virtual Server instances - reserved or pay as you go
https://docs.microsoft.com/en-us/azure/virtual-machines/windows/prepay-reserved-vm-instances - Azure Virtual Server instances - prepaid
https://azure.microsoft.com/en-in/pricing/details/virtual-machines/series/ - Azure VMs overview
https://docs.microsoft.com/en-us/azure/cosmos-db/introduction

https://opdhsblobprod01.blob.core.windows.net/contents/4a6d75bb3af747de838e6ccc97c5d978/b90e2ae80f5025c8806d44776b7fa0d4?sv=2015-04-05&sr=b&sig=i68TiisFHGcbWYDd4uAYyzhiioc8IhBlWTAA2dlHOWI%3D&st=2019-11-06T20%3A52%3A17Z&se=2019-11-07T21%3A02%3A17Z&sp=r

https://drive.google.com/open?id=1isN57DY8p2v3DrjKpByKU5Dpj1Irhf2Z

Azure VM configuration doc online
https://azure.microsoft.com/en-us/pricing/details/virtual-machines/linux/ - Azure Linux VM pricing options


Azure Identity Options
https://docs.microsoft.com/en-us/azure/active-directory-b2c/overview - Azure Active Directory B2C identity mgt






Azure DB options
https://docs.microsoft.com/en-us/azure/cosmos-db/introduction - Azure Cosmos DB - NoSQL - multi-model

Azure SQL DB
https://www.youtube.com/watch?v=VnU5-erCIC0 - Azure SQL server DB - video - Bryan Cafferky

BYODB - MySQL etc


Azure serverless functions ( Lambda )
https://azure.microsoft.com/en-us/services/functions/ - Azure serverless Functions
https://serverless.zone/what-aws-lambda-users-should-know-about-azure-functions-and-vice-versa-3b04f8aa05a0 - Using Azure functions effectively
https://build5nines.com/fixing-azure-portal-errors/ - View Azure function errors in portal
https://docs.microsoft.com/en-us/azure/azure-functions/create-function-app-linux-app-service-plan - Create Azure function and test on Linux in Azure app service plan


Azure tools

Visual Studio Enterprise

VSCode

VS App Center

Azure pipelines for CICD








Azure blockchain options
https://docs.microsoft.com/en-us/azure/blockchain/templates/hyperledger-fabric-consortium-blockchain - Fabric net on Azure
https://azure.microsoft.com/en-us/solutions/blockchain/ - MS blockchain
https://azure.microsoft.com/en-us/services/blockchain-service/ - MS blockchain service






https://pages.databricks.com/201811-US-WB-AzureSeries-ty-wb.html?aliId=eyJpIjoiSXJcL2pld0hRSHRNOFVcL3oxIiwidCI6IjJYaDQ3S0VZNkhRS2Q3WldKeW4rQ1E9PSJ9

https://pages.databricks.com/WB-azuretraining-01.html

https://pages.databricks.com/WB-azuretraining-02.html

https://pages.databricks.com/WB-azuretraining-03.html

Databricks video tutorials on Azure - engineering, analytics, ML


Azure Courses
https://www.udemy.com/course/the-complete-walkthrough-of-microsoft-azure-services/learn/lecture/6554090#overview - Microsoft Azure cloud - Beginner Bootcamp
https://www.udemy.com/course/aws-certified-associate-architect-developer-sysops-admin/learn/lecture/6629708#overview - Azure Complete Bootcamp for certifications
https://www.udemy.com/course/aws-certified-solutions-architect-associate/learn/lecture/13885822#overview - Azure Solutions Architect course




Azure signup


https://docs.microsoft.com/en-us/azure/billing/billing-troubleshoot-azure-sign-up - Azure signup troubleshooting, errors


Free Azure Account


https://azure.microsoft.com/en-us/free/free-account-faq/

What is the Azure free account? The Azure free account includes free access to our most popular Azure products for 12 months, $200 credit to spend for the first 30 days of sign up, and access to more than 25 products that are always free.

See details on all the options for the first 12 months:

https://drive.google.com/open?id=1TJwy5S4u9HKt9IG5DEtlCxNPH2MXA4CR

What does Azure really cost?

https://itproguru.com/expert/2013/01/what-does-windows-azure-cloud-computing-really-cost-how-to-save-part-16-of-31-days-iaas-dan-stolts-itproguru/

AWS Lightsail is < 50% of the cost of Azure for a Linux instance

https://aws.amazon.com/lightsail/pricing/


Key Concepts



Azure Fundamentals


Bryan Cafferky on Azure Fundamentals - Data Services and more - Youtube playlists

Play the videos and follow along in Azure.

Bryan Cafferky playlists for Azure software lessons

https://www.youtube.com/user/Bryancutube256123/playlists


Create sample SQL Server DB

https://www.youtube.com/watch?v=VnU5-erCIC0





Azure Certifications

https://linuxacademy.com/blog/certifications/azure-certifications-and-roadmap/

There are currently eight Azure-based certifications spread across three levels (Fundamentals, Associate, and Expert). All of these are new certifications, not refreshes of previous Azure certifications:

  • Microsoft Certified Azure Fundamentals (Fundamentals)
  • Microsoft Certified Azure Administrator (Associate)
  • Microsoft Certified Azure Developer (Associate)
  • Microsoft Certified Azure AI Engineer Associate (Associate)
  • Microsoft Certified Azure Data Engineer Associate (Associate)
  • Microsoft Certified Azure Security Technologies (Associate)
  • Microsoft Certified Azure Solutions Architect (Expert)
  • Microsoft Certified Azure DevOps (Expert)

Free Microsoft online Azure Fundamentals Course

https://docs.microsoft.com/en-us/learn/certifications/azure-fundamentals


https://docs.microsoft.com/en-us/learn/paths/azure-fundamentals/



Azure Active Directory B2C identity mgt

https://docs.microsoft.com/en-us/azure/active-directory-b2c/overview


Azure AD B2C is a white-label authentication solution. You can customize the entire user experience with your brand so that it blends seamlessly with your web and mobile applications.

Customize every page displayed by Azure AD B2C when your users sign up, sign in, and modify their profile information. Customize the HTML, CSS, and JavaScript in your user journeys so that the Azure AD B2C experience looks and feels like it's a native part of your application.

Infographic of Azure AD B2C identity providers and downstream applications

Azure AD Integration with External Identities

https://azure.microsoft.com/en-us/services/active-directory/external-identities/b2c/

Apply security controls and application- or policy-based multi-factor authentication to help protect your customers’ personal data.

Using External Identities

https://azure.microsoft.com/en-us/services/active-directory/external-identities/

Build an identity experience that works for any user, using any identity, on any device. Make it easy for customers and partners to sign up and sign in using their existing social media ID, phone number, or credentials from any standards-based identity provider.




Microsoft Azure DB options


Cosmos DB

https://docs.microsoft.com/en-us/azure/cosmos-db/introduction

Azure Cosmos DB is Microsoft's globally distributed, multi-model database service. With a click of a button, Cosmos DB enables you to elastically and independently scale throughput and storage across any number of Azure regions worldwide. You can take advantage of fast, single-digit-millisecond data access using your favorite API, including SQL, MongoDB, Cassandra, Tables, or Gremlin.

Client SDKs are available for .NET, Java, Node.js, and Python.



SQL DB

https://docs.microsoft.com/en-us/azure/sql-database/

Note - Most Transact-SQL features that applications use are fully supported in both Microsoft SQL Server and Azure SQL Database.

Purchasing model comparison:

  • DTU-based model: Based on a bundled measure of compute, storage, and I/O resources. Compute sizes are expressed in DTUs for single databases and in elastic database transaction units (eDTUs) for elastic pools. For more information, see "What are DTUs and eDTUs?". Best for customers who want simple, preconfigured resource options.
  • vCore-based model: Lets you independently choose compute and storage resources, and also allows you to use Azure Hybrid Benefit for SQL Server to gain cost savings. Best for customers who value flexibility, control, and transparency.



pricing model comparison

Compute costs

Provisioned compute costs

In the provisioned compute tier, the compute cost reflects the total compute capacity that is provisioned for the application.

In the business critical service tier, we automatically allocate at least 3 replicas. To reflect this additional allocation of compute resources, the price in the vCore-based purchasing model is approximately 2.7x higher in the business critical service tier than it is in the general purpose service tier. Likewise, the higher storage price per GB in the business critical service tier reflects the high I/O and low latency of the SSD storage.

The cost of backup storage is the same for the business critical service tier and the general purpose service tier because both tiers use standard storage.
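As a rough sanity check on the 2.7x figure above, here is a minimal shell sketch that estimates business critical compute cost from a general purpose rate. The hourly rate below is a made-up placeholder for illustration, not an actual Azure price.

```shell
# Estimate business critical compute cost from a general purpose rate.
# The 2.7x multiplier is from the pricing description above; the hourly
# rate is a hypothetical placeholder, not a real Azure price.
gp_rate_per_vcore_hour="0.10"   # assumed USD per vCore-hour

bc_estimate() {
  # usage: bc_estimate <vcores> <hours>
  awk -v r="$gp_rate_per_vcore_hour" -v v="$1" -v h="$2" \
    'BEGIN { printf "%.2f\n", r * 2.7 * v * h }'
}

bc_estimate 4 730   # 4 vCores for ~1 month: prints 788.40
```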

Serverless compute costs

For a description of how compute capacity is defined and costs are calculated for the serverless compute tier, see SQL Database serverless.

Storage costs

Different types of storage are billed differently. For data storage, you're charged for the provisioned storage based upon the maximum database or pool size you select. The cost doesn't change unless you reduce or increase that maximum. Backup storage is associated with automated backups of your instance and is allocated dynamically. Increasing your backup-retention period increases the backup storage that’s consumed by your instance.

By default, 7 days of automated backups of your databases are copied to a read-access geo-redundant storage (RA-GRS) standard Blob storage account. This storage is used by weekly full backups, daily differential backups, and transaction log backups, which are copied every 5 minutes. The size of the transaction logs depends on the rate of change of the database. A minimum storage amount equal to 100 percent of the database size is provided at no extra charge. Additional consumption of backup storage is charged in GB per month.
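The free-allowance rule above is easy to express in code. A minimal shell sketch of how the chargeable backup storage would be computed, assuming the allowance is exactly 100 percent of the database size:

```shell
# Chargeable backup storage per the rule above: only consumption beyond
# the free allowance (100% of the database size) is billed, in GB/month.
backup_charge_gb() {
  # usage: backup_charge_gb <db_size_gb> <backup_consumed_gb>
  local free="$1" used="$2"
  if [ "$used" -gt "$free" ]; then
    echo $(( used - free ))
  else
    echo 0
  fi
}

backup_charge_gb 100 160   # prints 60 (60 GB billed)
backup_charge_gb 100 80    # prints 0  (within the free allowance)
```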

For more information about storage prices, see the pricing page.

vCore-based purchasing model

A virtual core (vCore) represents a logical CPU and offers you the option to choose between generations of hardware and the physical characteristics of the hardware (for example, the number of cores, the memory, and the storage size). The vCore-based purchasing model gives you flexibility, control, transparency of individual resource consumption, and a straightforward way to translate on-premises workload requirements to the cloud. This model allows you to choose compute, memory, and storage resources based upon your workload needs.

In the vCore-based purchasing model, you can choose between the general purpose and business critical service tiers for single databases, elastic pools, and managed instances. For single databases, you can also choose the hyperscale service tier.

The vCore-based purchasing model lets you independently choose compute and storage resources, match on-premises performance, and optimize price. In the vCore-based purchasing model, you pay for:

  • Compute resources (the service tier + the number of vCores and the amount of memory + the generation of hardware).
  • The type and amount of data and log storage.
  • Backup storage (RA-GRS).
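The three line items above can be sketched as a simple monthly estimator. All rates below are hypothetical placeholders for illustration, not actual Azure prices:

```shell
# Monthly vCore-model estimate = compute + data/log storage + extra backup.
# All rates are assumed placeholders, not real Azure prices.
compute_rate="0.12"   # USD per vCore-hour (assumed)
storage_rate="0.115"  # USD per GB-month of data/log storage (assumed)
backup_rate="0.10"    # USD per GB-month of RA-GRS backup (assumed)

vcore_monthly_estimate() {
  # usage: vcore_monthly_estimate <vcores> <storage_gb> <extra_backup_gb>
  awk -v c="$compute_rate" -v s="$storage_rate" -v b="$backup_rate" \
      -v v="$1" -v g="$2" -v bg="$3" \
    'BEGIN { printf "%.2f\n", c * v * 730 + s * g + b * bg }'
}

vcore_monthly_estimate 2 250 50   # prints 208.95
```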

Important

Compute resources, I/O, and data and log storage are charged per database or elastic pool. Backup storage is charged per each database. For more information about managed instance charges, see managed instances. Region limitations: For the current list of supported regions, see products available by region. To create a managed instance in a region that currently isn't supported, send a support request via the Azure portal.

If your single database or elastic pool consumes more than 300 DTUs, converting to the vCore-based purchasing model might reduce your costs. You can convert by using your API of choice or by using the Azure portal, with no downtime. However, conversion isn't required and isn't done automatically. If the DTU-based purchasing model meets your performance and business requirements, you should continue using it.

To convert from the DTU-based purchasing model to the vCore-based purchasing model, select the compute size by using the following rules of thumb:

  • Every 100 DTUs in the standard tier require at least 1 vCore in the general purpose service tier.
  • Every 125 DTUs in the premium tier require at least 1 vCore in the business critical service tier.
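The two rules of thumb above can be sketched as a small conversion helper (shell, integer math, rounding up):

```shell
# DTU -> minimum vCores, per the rules of thumb above:
#   standard tier: 1 vCore (general purpose) per 100 DTUs
#   premium tier:  1 vCore (business critical) per 125 DTUs
dtu_to_vcores() {
  # usage: dtu_to_vcores <standard|premium> <dtus>
  local per
  case "$1" in
    standard) per=100 ;;
    premium)  per=125 ;;
    *) echo "unknown tier: $1" >&2; return 1 ;;
  esac
  # round up: a partial block of DTUs still needs a whole vCore
  echo $(( ($2 + per - 1) / per ))
}

dtu_to_vcores standard 300   # prints 3
dtu_to_vcores premium 500    # prints 4
```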

DTU-based purchasing model

A database transaction unit (DTU) represents a blended measure of CPU, memory, reads, and writes. The DTU-based purchasing model offers a set of preconfigured bundles of compute resources and included storage to drive different levels of application performance. If you prefer the simplicity of a preconfigured bundle and fixed payments each month, the DTU-based model might be more suitable for your needs.

In the DTU-based purchasing model, you can choose between the basic, standard, and premium service tiers for both single databases and elastic pools. The DTU-based purchasing model isn't available for managed instances.

Database transaction units (DTUs)

For a single database at a specific compute size within a service tier, Microsoft guarantees a certain level of resources for that database (independent of any other database in the Azure cloud). This guarantee provides a predictable level of performance. The amount of resources allocated for a database is calculated as a number of DTUs and is a bundled measure of compute, storage, and I/O resources.

The ratio among these resources is originally determined by an online transaction processing (OLTP) benchmark workload designed to be typical of real-world OLTP workloads. When your workload exceeds the amount of any of these resources, your throughput is throttled, resulting in slower performance and time-outs.

The resources used by your workload don't impact the resources available to other SQL databases in the Azure cloud. Likewise, the resources used by other workloads don't impact the resources available to your SQL database.

Bounding box

DTUs are most useful for understanding the relative resources that are allocated for Azure SQL databases at different compute sizes and service tiers. For example:

  • Doubling the DTUs by increasing the compute size of a database equates to doubling the set of resources available to that database.
  • A premium service tier P11 database with 1750 DTUs provides 350x more DTU compute power than a basic service tier database with 5 DTUs.
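Since DTUs are a linear measure, the relative power of two compute sizes is just the ratio of their DTUs, as in the P11-vs-basic example above:

```shell
# Relative compute power between two compute sizes is the DTU ratio.
dtu_ratio() {
  # usage: dtu_ratio <dtus_a> <dtus_b>  (integer ratio; a divisible by b here)
  echo $(( $1 / $2 ))
}

dtu_ratio 1750 5   # P11 (1750 DTUs) vs basic (5 DTUs): prints 350
```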

To gain deeper insight into the resource (DTU) consumption of your workload, use Query Performance Insight.


SQL DB service purchase options

https://docs.microsoft.com/en-us/azure/sql-database/sql-database-purchase-models

Single database is one of three deployment options for Azure SQL Database. The other two are elastic pools and managed instance.

Single Database option

The single database deployment option creates a database in Azure SQL Database with its own set of resources and is managed via a SQL Database server. With a single database, each database is isolated from each other and portable, each with its own service tier within the DTU-based purchasing model or vCore-based purchasing model and a guaranteed compute size.


Elastic Pools DB option

A single database can be moved into or out of an elastic pool for resource sharing. For many businesses and applications, being able to create single databases and dial performance up or down on demand is enough, especially if usage patterns are relatively predictable. But if you have unpredictable usage patterns, it can make it hard to manage costs and your business model. Elastic pools are designed to solve this problem.


Managed Instance DB option




SQL documentation

https://opdhsblobprod01.blob.core.windows.net/contents/4a6d75bb3af747de838e6ccc97c5d978/6d6fe5f8cfa71b73a5a76990e95b428c?sv=2015-04-05&sr=b&sig=7PloevGQ%2BCpuzEw5jUR5y3kXu7x0n8TmnQ1oEL4cGdc%3D&st=2019-11-06T19%3A16%3A12Z&se=2019-11-07T19%3A26%3A12Z&sp=r


BYODB - MySQL etc



Azure DB security options

https://docs.microsoft.com/en-us/azure/sql-database/sql-database-security-overview

  1. Network security
  2. Access management
  3. Authorization
  4. Threat protection
  5. Information protection and encryption
  6. Security management



Learn Azure Messaging and Serverless applications

https://docs.microsoft.com/en-us/learn/paths/architect-messaging-serverless/


Serverless function concepts

Containers like Docker provide significant environment isolation and flexibility.

An app in a Docker container only talks to the Docker engine and the configured ports.

It has no idea of the environment or OS it runs in.

Deploying microservices in containers provides major benefits for most use cases:

  • high locality of reference for data and libraries within a microservice, especially when caching is used
  • environment agnostic
  • easy to scale as a unit independent of other services in other containers


FaaS - a single function deployed as a serverless service

The server is conceptually "invisible" to the developer.

It sounds simple until you deal with the details - serverless is still a work in progress in 2019.

The most popular serverless platforms--AWS Lambda, Google Cloud Functions, Azure Functions--all present challenges once data gets involved. Want to talk to local AWS services? Dead simple. But once authenticated APIs get involved, it’s more of a pain. Where do you store tokens? How do you handle OAuth redirects? How do you manage users? Quickly that narrow use of serverless can snowball into a pile of other public cloud services … to the point that you’ve swapped the complexity developers know for some new piles of stuff to learn.




Azure Functions

https://docs.microsoft.com/en-us/azure/azure-functions/



Learn Azure Serverless Function

https://docs.microsoft.com/en-us/learn/modules/create-serverless-logic-with-azure-functions/



Java Azure Function Example

https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-first-java-maven

To develop functions using Java, you must have the JDK and Apache Maven installed.

The JAVA_HOME environment variable must be set to the install location of the JDK to complete this quickstart.

Create Functions project

In an empty folder, run the following command to generate the Functions project from a Maven archetype.

mvn archetype:generate \
-DarchetypeGroupId=com.microsoft.azure \
-DarchetypeArtifactId=azure-functions-archetype

If you're experiencing issues running the command, check which maven-archetype-plugin version is being used.


Maven asks you for values needed to finish generating the project on deployment. Provide the following values when prompted:

  • groupId: A value that uniquely identifies your project across all projects, following the package naming rules for Java. The examples in this quickstart use com.fabrikam.functions.
  • artifactId: A value that is the name of the jar, without a version number. The examples in this quickstart use fabrikam-functions.
  • version: Choose the default value of 1.0-SNAPSHOT.
  • package: A value that is the Java package for the generated function code. Use the default. The examples in this quickstart use com.fabrikam.functions.
  • appName: A globally unique name that identifies your new function app in Azure. Use the default, which is the artifactId appended with a random number. Make a note of this value; you'll need it later.
  • appRegion: Choose a region near you or near other services your functions access. The default is westus. To get a list of all regions, run: az account list-locations --query '[].{Name:name}' -o tsv
  • resourceGroup: Name for the new resource group in which to create your function app. Use myResourceGroup, which is used by the examples in this quickstart. A resource group name must be unique within your Azure subscription.

Maven creates the project files in a new folder with a name of artifactId, which in this example is fabrikam-functions.

Open the new Function.java file from the src/main/java path in a text editor and review the generated code. This code is an HTTP triggered function that echoes the body of the request.

https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-http-webhook?tabs=csharp


Run the Function Locally


Run the following command, which changes the directory to the newly created project folder, then builds and runs the function project:

console
cd fabrikam-functions
mvn clean package
mvn azure-functions:run

You see output like the following from Azure Functions Core Tools when you run the project locally:

Output
...

Now listening on: http://0.0.0.0:7071
Application started. Press Ctrl+C to shut down.

Http Functions:

    HttpTrigger-Java: [GET,POST] http://localhost:7071/api/HttpTrigger-Java
...

Trigger the function from the command line using cURL in a new terminal window:

CMD
curl -w "\n" http://localhost:7071/api/HttpTrigger-Java --data AzureFunctions
Output
Hello AzureFunctions!

The function key isn't required when running locally. Use Ctrl+C in the terminal to stop the function code.


Deploy the Function to Azure

A function app and related resources are created in Azure when you first deploy your function app. Before you can deploy, use the az login Azure CLI command to sign in to your Azure subscription.

Azure CLI
az login

Tip

If your account can access multiple subscriptions, use az account set to set the default subscription for this session.

Use the following Maven command to deploy your project to a new function app.

Azure CLI
mvn azure-functions:deploy

This azure-functions:deploy Maven target creates the following resources in Azure:

  • Resource group. Named with the resourceGroup you supplied.
  • Storage account. Required by Functions. The name is generated randomly based on Storage account name requirements.
  • App service plan. Serverless hosting for your function app in the specified appRegion. The name is generated randomly.
  • Function app. A function app is the deployment and execution unit for your functions. The name is your appName, appended with a randomly generated number.

The deployment also packages the project files and deploys them to the new function app using zip deployment, with run-from-package mode enabled.

After the deployment completes, you see the URL you can use to access your function app endpoints. Because the HTTP trigger we published uses authLevel = AuthorizationLevel.FUNCTION, you need to get the function key to call the function endpoint over HTTP. The easiest way to get the function key is from the Azure portal.


Get the HTTP Trigger URL

You can get the URL required to trigger your function, with the function key, from the Azure portal.

  1. Browse to the Azure portal, sign in, type the appName of your function app into Search at the top of the page, and press enter.

  2. In your function app, expand Functions (Read Only), choose your function, then select </> Get function URL at the top right.

    Copy the function URL from the Azure portal

  3. Choose default (Function key) and select Copy.

You can now use the copied URL to access your function.

Verify the Function in Azure

To verify the function app running on Azure using cURL, replace the URL from the sample below with the URL that you copied from the portal.

Azure CLI
curl -w "\n" https://fabrikam-functions-20190929094703749.azurewebsites.net/api/HttpTrigger-Java?code=zYRohsTwBlZ68YF.... --data AzureFunctions

This sends a POST request to the function endpoint with AzureFunctions in the body of the request. You see the following response.

Output
Hello AzureFunctions!






Azure VM setups

file:///C:/Users/Jim%20Mason/Google%20Drive/_docs/howto/cloud/azure/azure-vm-config-options-2019-document.pdf

create

  1. config files
  2. load balancer
  3. virtual nic
  4. vms - AzAvailabilitySet
    1. Get-Credential to set up admin id / pwd
    2. New-AzVM - select the right image type (Windows, Ubuntu, etc.)
  5. create NSG - network security group to manage traffic in and out of subnet ( see below )
  6. more



Azure VM management

  1. setup RBAC controls
  2. set VM resource policies to provide resources, manage costs
  3. hierarchy
    1. resources < resource groups < subscriptions < management groups
  4. sysprep.exe to remove personal info from VM config
  5. monitor VM changes
  6. update VMs
  7. Security Center - setup and manage security policies and events
  8. Install apps - can install multiple apps in a single VM (e.g. SQL Server, .NET, IIS) if needed
  9. secure web server with SSL certs in MS key vault
  10. more


Azure Container setups



Docker on Azure



Docker Jenkins Build Templates





Azure Arc - Orchestration Service for Kubernetes on multiple platforms

https://docs.microsoft.com/en-us/azure/azure-arc/

Azure Arc extends Azure Resource Manager capabilities to Linux and Windows servers, as well as Kubernetes clusters on any infrastructure across on-premises, multi-cloud, and edge. With Azure Arc, customers can also run Azure data services anywhere, realizing the benefits of cloud innovation, including always up-to-date data capabilities, deployment in seconds (rather than hours), and dynamic scalability on any infrastructure. Azure Arc for servers is currently in public preview.

Arc Overview

https://docs.microsoft.com/en-us/azure/azure-arc/servers/overview

Azure Arc for servers (preview) allows you to manage your Windows and Linux machines hosted outside of Azure on your corporate network or other cloud provider, similarly to how you manage native Azure virtual machines. When a hybrid machine is connected to Azure, it becomes a connected machine and is treated as a resource in Azure. Each connected machine has a Resource ID, is managed as part of a resource group inside a subscription, and benefits from standard Azure constructs such as Azure Policy and applying tags.

To deliver this experience with your hybrid machines hosted outside of Azure, the Azure Connected Machine agent needs to be installed on each machine that you plan on connecting to Azure. This agent does not deliver any other functionality, and it doesn't replace the Azure Log Analytics agent. The Log Analytics agent for Windows and Linux is required when you want to proactively monitor the OS and workloads running on the machine, manage it using Automation runbooks or solutions like Update Management, or use other Azure services like Azure Security Center.

Guides for Arc

  1. Connect machines to Arc through Azure Portal
  2. Connect machines optionally using a Service Principal for auto-scaling
  3. Connect machines using PowerShell DSC ( Desired State Configuration )
  4. Manage Agents

Azure Arc Policy Samples

https://docs.microsoft.com/en-us/azure/azure-arc/servers/policy-samples

Audit, Monitoring and Deployment policies for VMs

A Closer Look At Azure Arc – Microsoft’s Hybrid And Multi-Cloud Platform

https://www.forbes.com/sites/janakirammsv/2020/05/24/a-closer-look-at-azure-arc--microsofts-hybrid-and-multi-cloud-platform/#363e4f921bce


Arc Agent on each machine or node

The Connected Machine agent sends a regular heartbeat message to the service every 5 minutes. If the service stops receiving these heartbeat messages from a machine, that machine is considered offline and the status will automatically be changed to Disconnected in the portal within 15 to 30 minutes. Upon receiving a subsequent heartbeat message from the Connected Machine agent, its status will automatically be changed to Connected.

https://docs.microsoft.com/en-us/azure/azure-arc/servers/agent-overview
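The status rule described above can be sketched as a small check. The 15-minute cutoff below is the low end of the documented 15-30 minute window, chosen here purely for illustration:

```shell
# Connected Machine status per the heartbeat rule above: the agent
# heartbeats every 5 minutes; after ~15-30 minutes of silence the portal
# marks the machine Disconnected. The 15-minute cutoff is an assumption
# (low end of the documented window).
arc_status() {
  # usage: arc_status <minutes_since_last_heartbeat>
  if [ "$1" -ge 15 ]; then
    echo "Disconnected"
  else
    echo "Connected"
  fi
}

arc_status 5    # prints Connected
arc_status 40   # prints Disconnected
```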

Azure Arc delivers three capabilities - managing VMs running outside of Azure, registering and managing Kubernetes clusters deployed within and outside of Azure and running managed data services based on Azure SQL and PostgreSQL Hyperscale in Kubernetes clusters registered with Azure Arc.

As of Build 2020, Microsoft has opened up the first two features of Azure Arc - management of VMs and Kubernetes clusters running outside of Azure. Azure Arc enabled data services is still in private preview.

Adding machines to a group and defining in policies

The Connected Machine agent can be deployed in a variety of OS environments including Windows Server 2012 R2 or higher, Ubuntu 16.04, SUSE Linux Enterprise Server 15, Red Hat Enterprise Linux 7, and even Amazon Linux 2.

The registered machines are listed in the same Azure resource group that has native Azure VMs running in the public cloud. Customers can apply labels to any VM in the resource group to include or exclude them in a policy. Participating machines can be audited by an Azure Policy and an action can be taken based on the outcome.

Arc can manage Kubernetes Clusters

Similar to how VMs can be onboarded to Azure, Kubernetes clusters can be brought into the fold of Azure Arc.

Customers can attach Kubernetes clusters running anywhere outside of Azure to Azure Arc. This includes bare-metal clusters running on-premises, managed clusters such as Amazon EKS and Google Kubernetes Engine, and enterprise PaaS offerings such as Red Hat OpenShift and Tanzu Kubernetes Grid.

Similar to the Connected Machine agent pushed to a VM, Azure Arc deploys an agent under the azure-arc namespace. It does exactly what the VM agent does - watch for configuration requests. Apart from that, the Arc agent running in a Kubernetes cluster can send telemetry to Azure Monitor. The telemetry includes inventory, Kubernetes events, container stdout/stderr logs, and node, container, Kubelet, and GPU performance metrics.

Once the agent is deployed in a Kubernetes cluster, it can participate in the GitOps-based configuration management and policy updates

Azure Arc-enabled Kubernetes ensures that the workloads match the desired state of the configuration by monitoring the drift and automatically applying the required changes. 

Azure Arc-enabled Kubernetes comes with three capabilities:

Global inventory management - You can onboard all the Kubernetes clusters irrespective of their deployment location to manage them from a single location. 

Centralized workload management - With Azure Arc, it is possible to roll out applications and configuration to hundreds of registered clusters with one commit to the source code repository. 

Policy-driven cluster management - Ensure that the cluster runs the policies by centrally governing and auditing the infrastructure. 

Microsoft has partnered with Red Hat, SUSE, and Rancher to officially bring OpenShift, SUSE CaaS and Rancher Kubernetes Engine to Azure Arc.

Microsoft scores additional points for adopting the open source Flux project as the choice of GitOps tool for Azure Arc. It brings transparency to the platform while providing confidence to users.

Azure Arc for Data Services in K8s

With Azure Arc for data services, customers will benefit from the ability to run managed database services in any Kubernetes cluster managed by Azure Arc. This capability will emerge as the key differentiating feature of Azure Arc.

Microsoft DLT service



Managed Fabric Net on Azure



Custom Fabric Net on Azure






Potential Value Opportunities



Potential Challenges



Candidate Solutions



Step-by-step guide for Example



sample code block



Recommended Next Steps


