Cloud and Datacenter Management Blog

Microsoft Hybrid Cloud blogsite about Management



Azure Local Cluster + Azure Cloud + Docker AI Edge

An Azure Local cluster on‑site, working in tandem with the Azure cloud to run Dockerized AI workloads at the edge, is not just viable. It’s exactly the direction modern distributed AI systems are heading.

Let me unpack how these pieces fit together and why the architecture is so compelling.

Azure Local Baseline Reference Architecture

A powerful hybrid model for real‑world AI

Think of this setup as a two‑layer AI fabric:

  • Layer 1: On‑site Azure Local Cluster
    Handles real‑time inference, local decision‑making, and data preprocessing.
    This is where Docker containers shine: predictable, isolated, versioned workloads running close to the data source.
  • Layer 2: Azure Cloud
    Handles heavy lifting: model training, analytics, fleet management, OTA updates, and long‑term storage.

Together, they create a system that is fast, resilient, secure, and scalable.

Why this architecture works so well

  1. Ultra‑low latency inference

Your on‑site Azure Local Cluster can run Dockerized AI models directly on edge hardware (Jetson, x86, ARM).
This eliminates cloud round‑trips for:

  • object detection
  • anomaly detection
  • robotics control
  • industrial automation

Azure Local provides the core platform for hosting and managing virtualized and containerized workloads on-premises or at the edge.

  2. Seamless model lifecycle management

Azure Cloud can:

  • train new models
  • validate them
  • push them as Docker images
  • orchestrate rollouts to thousands of edge nodes

Your local cluster simply pulls the new container and swaps it in.
This is exactly the “atomic update” pattern from the blogpost.
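As a sketch of that pull-and-swap step, an edge node’s update script could look like the following. The registry, image, and container names are hypothetical examples, and the helper defaults to printing the docker commands (DRY_RUN=1) so the pattern can be read without a Docker daemon:

```shell
#!/bin/sh
# Sketch of the "atomic update" pattern on an edge node.
# Image and container names are hypothetical examples.
IMAGE="myregistry.azurecr.io/vision-model:v2"   # new model image pushed from Azure
NAME="vision-model"                             # running inference container

run() {
  # DRY_RUN=1 (the default here) only prints the command;
  # set DRY_RUN=0 on a real edge node to execute it.
  if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi
}

run docker pull "$IMAGE"            # fetch the new model container
run docker stop "$NAME"             # stop the old version
run docker rm "$NAME"               # remove it
run docker run -d --name "$NAME" --restart unless-stopped "$IMAGE"
```

The swap is atomic from the workload’s point of view: either the old container is serving or the new one is, never a half-updated mix.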

  3. Strong separation of concerns

Local cluster = deterministic, real‑time execution
Cloud = dynamic, scalable intelligence

This separation avoids the classic problem of trying to run everything everywhere.

  4. Enterprise‑grade security

Azure Arc, IoT Edge, and Container Registry give you:

  • signed images
  • policy‑based deployments
  • identity‑bound devices
  • encrypted communication

This is critical when edge devices live in factories, stores, or public spaces.
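Two standard Docker mechanisms cover part of the “signed images” story. The sketch below shows them with hypothetical registry and digest placeholders; it only prints what it would pull:

```shell
#!/bin/sh
# Sketch: trusted-deployment basics on an edge node.
# Registry name and digest are hypothetical placeholders.

# 1. Enforce signed images: with content trust enabled,
#    `docker pull` refuses images not signed by a trusted publisher.
export DOCKER_CONTENT_TRUST=1

# 2. Pin by digest: a tag like :v2 is mutable, a digest is not,
#    so the node runs exactly the bytes that were scanned and signed.
REGISTRY="myregistry.azurecr.io"
PINNED="$REGISTRY/vision-model@sha256:<digest-from-your-registry>"
echo "would pull: $PINNED"
```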

  5. Cloud‑assisted intelligence

Even though inference happens locally, the cloud can still:

  • aggregate telemetry
  • retrain models
  • detect drift
  • optimize pipelines
  • coordinate multi‑site deployments

This is how AI systems improve over time. 

How Docker fits into this hybrid world

Docker becomes the unit of deployment across both environments for DevOps and developers.

On the edge:

  • lightweight images
  • hardened images
  • GPU‑enabled containers
  • read‑only root filesystems
  • offline‑capable workloads
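Most of those properties map directly onto standard `docker run` flags. A minimal sketch with a hypothetical image name; the command is printed rather than executed so no edge device is needed to inspect it:

```shell
#!/bin/sh
# Sketch: typical hardening flags for an edge inference container.
# The image name is a hypothetical example.
IMAGE="myregistry.azurecr.io/edge-inference:1.0"

# --read-only            root filesystem is immutable
# --tmpfs /tmp           writable scratch space in memory only
# --restart unless-stopped  survives reboots, offline-friendly
# --gpus all             GPU access (needs the NVIDIA container toolkit)
cmd="docker run -d --read-only --tmpfs /tmp --restart unless-stopped --gpus all $IMAGE"
echo "$cmd"
```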

In the cloud:

  • CI/CD pipelines
  • model registries
  • automated scanning
  • versioned releases

The same container image runs in both places — but with different responsibilities.
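Shipping one image to both x86 cloud nodes and ARM edge boxes usually means a multi-architecture build. With Docker’s buildx that is a single command (the image name below is a hypothetical example, and the command is printed rather than run):

```shell
#!/bin/sh
# Sketch: one multi-arch build serving both cloud (amd64) and
# edge (arm64, e.g. Jetson-class devices) from a single tag.
# The image name is a hypothetical example.
IMAGE="myregistry.azurecr.io/vision-model:v2"

cmd="docker buildx build --platform linux/amd64,linux/arm64 -t $IMAGE --push ."
echo "$cmd"   # each node then pulls the variant matching its own CPU
```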

My take: This is one of the strongest architectures for real‑world AI

If your goal is:

  • real‑time AI
  • high reliability
  • centralized control
  • scalable deployments
  • secure operations
  • hybrid cloud + edge synergy

…then Azure Local Cluster + Azure Cloud + Docker AI Edge is a near‑ideal solution.

It gives you the best of both worlds:
cloud intelligence + edge autonomy.

Here you find more about Microsoft Azure Local 

Here you find more blogposts about Docker, Windows Server 2025, and Azure Cloud Services:

Windows Server 2025 Core and Docker – A Modern Container Host Architecture

Docker Desktop Container Images and Azure Cloud App Services



Creating Dev Environments (Beta) in Docker Desktop for Windows

Exploring Docker Desktop Dev Environments (Beta)

In the ever-evolving landscape of software development, Docker has consistently been at the forefront, providing developers with tools to streamline their workflows. One of the latest additions to Docker’s suite of tools is the Docker Desktop Dev Environments (Beta). This feature promises to revolutionize the way developers collaborate and manage their development environments. Let’s dive into what makes this new feature so exciting.

What is Docker Desktop Dev Environments?

Docker Desktop Dev Environments is a feature designed to simplify the process of setting up and sharing development environments. It allows developers to create, configure, and share their development setups with ease, ensuring consistency across different machines and team members. This is particularly useful in collaborative projects where maintaining identical environments can be challenging.

Key Features

  • Environment Configuration: With Docker Desktop Dev Environments, you can define your development environment using a simple configuration file. This file includes all the necessary dependencies, tools, and settings required for your project. Once defined, the environment can be easily replicated on any machine with Docker Desktop installed.
  • Seamless Sharing: Sharing your development environment with team members has never been easier. Docker Desktop Dev Environments allows you to package your environment configuration and share it via a URL or a file. Team members can then import this configuration and have their environment set up in minutes.
  • Consistency and Reproducibility: One of the biggest challenges in software development is ensuring that all team members are working in the same environment. Docker Desktop Dev Environments addresses this by providing a consistent setup that can be easily reproduced. This reduces the “it works on my machine” problem and ensures that everyone is on the same page.
  • Integration with Docker Hub: Docker Desktop Dev Environments integrates seamlessly with Docker Hub, allowing you to store and manage your environment configurations in the cloud. This makes it easy to access and share your environments from anywhere.

Benefits for Developers

  • Simplified Onboarding: New team members can get up and running quickly by importing the development environment configuration. This reduces the time spent on setting up and troubleshooting environments.
  • Enhanced Collaboration: By providing a consistent environment, Docker Desktop Dev Environments fosters better collaboration among team members. Everyone works with the same tools and settings, reducing discrepancies and integration issues.
  • Improved Productivity: With a standardized environment, developers can focus more on coding and less on environment setup and maintenance. This leads to increased productivity and faster development cycles.

Getting Started

To get started with Docker Desktop Dev Environments (Beta), follow these simple steps:

  1. Install Docker Desktop: Ensure you have the latest version of Docker Desktop installed on your machine.
  2. Create a Dev Environment: Use the Docker Desktop interface to create a new development environment. Define your environment configuration using the provided templates or create your own.
  3. Share Your Environment: Once your environment is set up, share it with your team by generating a URL or exporting the configuration file.
  4. Import an Environment: Team members can import the shared environment configuration and have their setup ready in minutes.

In the following steps, I will create a Dev Environment in Docker Desktop for Windows:

Click on Dev Environments and then on Get Started

Give your environment a name, select your source, and choose your IDE.
Then click Continue

Preparing and creating.

Click on Continue

You’re all set and you can open VSCode or your IDE.

Your Dev Environment in Docker Desktop for Windows.

Your Docker Desktop for Windows Dev Environment in VSCode.

Your Dev environment microservices running in Docker Desktop

 

Conclusion

Docker Desktop Dev Environments (Beta) is a game-changer for developers looking to streamline their workflows and enhance collaboration. By providing a consistent, reproducible, and easily shareable development environment, Docker is once again proving its commitment to making developers’ lives easier. Whether you’re working on a solo project or collaborating with a large team, Docker Desktop Dev Environments is a tool worth exploring.
Here you find more information about Dev environments at Docker.

Happy coding! 🚀



Deploy a 10-Node Azure Service Fabric Standalone Cluster #microservices #Containers

Azure Service Fabric Standalone Cluster

Earlier I wrote a blogpost about the Microsoft Azure Service Fabric standalone cluster for dev testing.
That was a 5-node Azure Service Fabric cluster installed locally, but now I’d like a bigger ASF cluster on my
Windows Server 2019 machine for testing with Visual Studio.

When you have downloaded the Microsoft Azure Service Fabric SDK into a directory:

Here you see the JSON Cluster config files

I used the same JSON template for deploying an Azure Service Fabric standalone cluster:

Creating the cluster, but with a changed JSON template.

Here you find the 10-node Azure Service Fabric cluster config file on GitHub

10-Node Microsoft Azure Service Fabric Standalone Cluster for Dev Testing

Important: Use this Azure Service Fabric standalone cluster only for learning and testing, not for production!

Here you find more information and documentation about Azure Service Fabric for Production.



Microsoft Azure Service Fabric Standalone Cluster for Testing #microservices #Containers #Apps

Microsoft Azure Service Fabric standalone

Azure Service Fabric is a distributed systems platform that makes it easy to package, deploy, and manage scalable and reliable microservices and containers.

To build and run Azure Service Fabric applications on your Windows development machine, install the Service Fabric runtime, SDK, and tools. You also need to enable execution of the Windows PowerShell scripts included in the SDK.

I have installed the latest version :

  • Service Fabric SDK and Tools 4.1.409
  • Service Fabric runtime 7.1.409

Here you find more information about installing the Azure Service Fabric standalone version for testing.
I have installed the Azure Service Fabric cluster on my Windows 10 machine for testing only.

When you want to create your own Azure Service Fabric cluster for production, you have to prepare yourself and make a plan before you build.

When you have your Azure Service Fabric standalone cluster running, you want to deploy your microservices, apps, or containers on it and test your solution. In the following steps, I deploy a web app with Visual Studio to the Azure Service Fabric standalone cluster, version 7.1.409.

Here is a GitHub sample for Azure Service Fabric.

git clone https://github.com/Azure-Samples/service-fabric-dotnet-quickstart

Here you have your Clone from Github.

To deploy this app to the Azure Service Fabric cluster, we use Microsoft Visual Studio.

Once the application is downloaded, you can deploy it to a cluster directly from Visual Studio.

  1. Open Visual Studio
  2. Select File > Open
  3. Navigate to the folder you cloned the git repository to, and select Voting.sln
  4. Right-click on the Voting application project in the Solution Explorer and choose Publish

Click on Publish.

Select connection Endpoint Local Cluster and click on Publish.

The Web App is Published to the Azure Service Fabric Standalone Cluster.

When you open Azure Service Fabric Explorer, you will see your app running.

This sample is for testing only and is not secure enough for production; it’s just to learn how it works 😉

Of course you can also deploy Containers with Visual Studio to your Azure Service Fabric Standalone Cluster.

Deploying Service Fabric Container via Visual Studio.

More Azure Service Fabric information

Here you find the Azure Service Fabric documentation

Here you find the Microsoft Azure Service Fabric website

Here you find the Azure Service Fabric Tech Community Blog

Happy Testing your Apps, microservices, and Containers.

Join the Containers in the Cloud LinkedIn Community Group

 



Backup – Restore – DR strategy in a Fast changing World #Data #Management

The world of data is moving and changing fast, with new IT technologies coming up like leaves on a tree.
Data is everywhere: on servers, workstations, BYOD devices, and in the cloud. But how do you keep your data safe and protected for your business, today and in the future? There are plenty of reasons why you should back up your data:

  • One of your employees accidentally deleted important files, for example.
  • Your data got compromised by a virus.
  • Your server crashed.
  • You have to retain your data for a period of time by law.
  • And there are many more reasons why you should do backups…

A lot of enterprise organizations are moving business workloads to the cloud, but how is your backup and disaster recovery managed today? A lot of data transitions are made, but what if your backup and disaster recovery solution is outdated or reaching end of life? You may have a lot of questions, like:

  • What data should I back up?
  • Should I just upgrade the backup solution?
  • How can I make my data management backup/DR solution cheaper and ready for the future?
  • How can I make my new backup/DR solution vendor-independent (avoiding vendor lock-in)?

And there will be more questions when you are in this scenario where you have to renew your backup/DR solution.
Here we have the following backup solution, which was great back in 2014:

Offsite Microsoft DPM Backup Solution since 2014

Here we have three System Center Data Protection Manager backup pods with a tape library, and one DPM pod connected to a Microsoft Azure Backup vault in the cloud. You apply the security updates and rollups for Windows Server 2012 R2 and System Center Data Protection Manager 2012 to keep the solution safe and running.

Long Time Protection to Tape

DPM 2012 Server with direct attached Storage for Short time protection

The four DPM backup pods have the same storage configuration for short-term protection, with a retention period of 15 days. After that, long-term protection is needed, with backup to tape and backup to a Microsoft Azure Backup vault.
Since 2014, the backup data has depended on these solution configurations.

Tape management costs a lot of time and money

The fourth DPM backup pod got an Azure Backup vault in the cloud to save tape-management time.

DPM Backup to Microsoft Azure Cloud Backup Vault.

So this is the start of the journey to a new data management backup/DR solution. Over the next couple of weeks, I will research the different scenarios and solutions on the internet and talk with the community, looking for best practices. I will run polls on social media and write a series of blog posts on the data management backup/DR solution to keep up business continuity.

Magic Quadrant for Data Center Backup and Recovery Solutions

Will it be a cloud backup/DR solution?
Will it be a hybrid cloud backup/DR solution?
Everything in one management console?
Or more than one backup/DR solution for the right job?

We will see what the journey will bring us based on Best Practices  😉



#Microsoft Hololens 2 Overview Videos #MWC2019 #Hololens #Azure #VR with @satyanadella

Microsoft Keynote HoloLens 2 at Mobile World Congress (MWC) 2019

HoloLens 2

Microsoft HoloLens 2: Partner Spotlight with Philips

Microsoft HoloLens 2: Partner Spotlight with Bentley

Conclusion:

I see Awesome possibilities for Maintenance in Smart Cities and Smart Buildings with Intelligent Cloud and Intelligent Edge together with the Microsoft Hololens 2 and Microsoft Azure. Intelligent Dashboards in your Hololens 2 hybrid with your Azure App for example. Great for Manufacturers, Healthcare, Architects, Maintenance Companies but also for Teachers and Students doing innovative Education 🙂

Here you find more information about Microsoft Hololens 2 and Business Ready Apps



Learn Azure in a Month of Lunches Free E-book #Azure #Cloud #Education

Learn Azure in a Month of Lunches breaks down the most important Azure concepts into bite-sized lessons with exercises and labs—along with project files available in GitHub—to reinforce your skills. Learn how to:
  • Use core Azure infrastructure and platform services, including how to choose which service for which task.
  • Plan appropriately for availability, scale, and security while considering cost and performance.
  • Integrate key technologies, including containers and Kubernetes, artificial intelligence and machine learning, and the Internet of Things.

You can download the Free Learn Azure in a Month of Lunches E-book here



How to monitor your #Kubernetes clusters – Best Practices Series #AKS #AzureMonitor

Get best practices on how to monitor your Kubernetes clusters from field experts in this episode of the Kubernetes Best Practices Series. In this intermediate level deep dive, you will learn about monitoring and logging in Kubernetes from Dennis Zielke, Technology Solutions Professional in the Global Black Belts Cloud Native Applications team at Microsoft.

Multi-cluster view from Azure Monitor

Azure Monitor provides a multi-cluster view showing the health status of all monitored AKS clusters deployed across resource groups in your subscriptions. It also shows discovered AKS clusters that are not monitored by the solution. You can immediately understand cluster health, and from there you can drill down to the node and controller performance page or navigate to the performance charts for the cluster. For AKS clusters discovered and identified as unmonitored, you can enable monitoring at any time.

Understand AKS cluster performance with Azure Monitor for containers

Container Live Logs provides a real-time view into your Azure Kubernetes Service (AKS) container logs (stdout/stderr) without having to run kubectl commands. When you select this option, a new pane appears below the containers performance data table on the Containers view, showing live logging generated by the container engine to further assist in troubleshooting issues in real time.
Live logs supports three different methods to control access to the logs:

  • AKS without Kubernetes RBAC authorization enabled
  • AKS enabled with Kubernetes RBAC authorization
  • AKS enabled with Azure Active Directory (AD) SAML-based single sign-on

You can even search the Container Live Logs for troubleshooting and history.
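For reference, monitoring is enabled per cluster through the Azure CLI monitoring add-on, and the same stdout/stderr stream that Container Live Logs shows can also be followed with kubectl. The resource names below are hypothetical examples, and the commands are printed rather than executed:

```shell
#!/bin/sh
# Sketch: enable Azure Monitor for containers on an AKS cluster,
# plus the kubectl equivalent of Container Live Logs.
# Resource group, cluster, and deployment names are hypothetical.
RG="my-resource-group"
CLUSTER="my-aks-cluster"

enable_cmd="az aks enable-addons --resource-group $RG --name $CLUSTER --addons monitoring"
logs_cmd="kubectl logs -f deployment/my-app --all-containers"

echo "$enable_cmd"
echo "$logs_cmd"
```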

View Container Live logs with Azure Monitoring for AKS | Kubernetes | Containers 



Microsoft #Azure Service Fabric Mesh for your #Microservices and #Container Apps in the #Cloud

Microsoft Service Fabric Mesh

Azure Service Fabric Mesh is a fully managed service that enables developers to deploy microservices applications without managing virtual machines, storage, or networking. Applications hosted on Service Fabric Mesh run and scale without you worrying about the infrastructure powering them. Service Fabric Mesh consists of clusters of thousands of machines. All cluster operations are hidden from the developer. Simply upload your code and specify the resources you need, availability requirements, and resource limits. Service Fabric Mesh automatically allocates the infrastructure and handles infrastructure failures, making sure your applications are highly available. You only need to care about the health and responsiveness of your application, not the infrastructure.

With Service Fabric Mesh you can:

  • “Lift and shift” existing applications into containers to modernize and run your current applications at scale.
  • Build and deploy new microservices applications at scale in Azure. Integrate with other Azure services or existing applications running in containers. Each microservice is part of a secure, network isolated application with resource governance policies defined for CPU cores, memory, disk space, and more.
  • Integrate with and extend existing applications without making changes to those applications. Use your own virtual network to connect existing application to the new application.
  • Modernize your existing Cloud Services applications by migrating to Service Fabric Mesh.

Build high-availability into your application architecture by co-locating your compute, storage, networking, and data resources within a zone and replicating in other zones. Azure services that support Availability Zones fall into two categories:

  • Zonal services – you pin the resource to a specific zone (for example, virtual machines, managed disks, IP addresses)
  • Zone-redundant services – platform replicates automatically across zones (for example, zone-redundant storage, SQL Database).

To achieve comprehensive business continuity on Azure, build your application architecture using the combination of Availability Zones with Azure region pairs. You can synchronously replicate your applications and data using Availability Zones within an Azure region for high-availability and asynchronously replicate across Azure regions for disaster recovery protection.

Store state in an Azure Service Fabric Mesh application by mounting an Azure Files based volume inside the container

Twitter AMA on Service Fabric Mesh :

The Service Fabric team will be hosting an Ask Me Anything (AMA) (more like “ask us anything”!) session for Service Fabric Mesh on Twitter on Tuesday, October 30th, from 9:00 AM to 10:30 AM PST. Tweet to @servicefabric or @AzureSupport using #SFMeshAMA with your questions on Mesh and Service Fabric. More information here

More information about Azure Service Fabric Mesh :

Microsoft Azure Service Fabric Mesh LAB on Github

Get started with Microsoft Azure Service Fabric for your Microservices and Container Apps

Service Fabric Microsoft Ignite 2018 sessions

JOIN Containers in the Cloud Community Group on LinkedIn here



Download the August 2018 #Developers Guide to #Azure #Cloud

If you are a developer or architect who wants to get started with Microsoft Azure, this book is for you! Written by developers for developers, this guide will show you how to get started with Azure and which services you can use to run your applications, store your data, incorporate intelligence, build IoT apps, and deploy your solutions in a more efficient and secure way.

Download the August 2018 Update of Developers Guide to Azure E-book here

Happy Reading and Building in the Microsoft Azure Cloud with this Awesome E-book !