Configuration Management of Non-Azure VMs with Azure Automation – Part 1

One of the first questions asked whenever a system or application goes down is “what changed recently?”  Ill-planned or unplanned changes are often the underlying cause of failures.  And if you live on the Operations side of the IT fence like me, a large portion of your existence is dedicated to mitigating the negative impact of these age-accelerating events.

To help in this ongoing battle, you can leverage Azure Configuration Management.  This feature comprises three tools: Inventory, Change Tracking and State Configuration (DSC).  Inventory gives you a view of the installed software, Windows services, registry keys, files and Linux daemons present on your machine.  Change Tracking monitors and reports changes to these items.  And DSC allows you to compile configuration files to maintain a desired state for these monitored sources.  Configuration Management can of course be used for machines in Azure, but also for machines in an on-premises data center or another cloud service.  For non-Azure machines, the data is gathered by an installed agent and reported back to Azure, where it is stored in a Log Analytics workspace.

This article is the first in a three-part series covering the components of Azure Configuration Management and how they can be used with a virtual machine running in AWS EC2.  In Part 1, I will show you how to get started with Inventory and provide an overview of its capabilities.  Part 2 will cover Change Tracking, and Part 3 will outline State Configuration (DSC).

Prerequisites
Here’s what is needed:

  • An Automation account in Azure
  • A Log Analytics Workspace
  • A non-Azure VM (AWS)

I’m not going to outline the steps for creating the above resources.  Here, I will just focus on the process of enabling and using Inventory.

How to Enable Inventory
In the Azure portal, go to your Automation account and select “Inventory” under Configuration Management.  On the popup screen that appears, select your subscription and Log Analytics workspace from the drop-down menus, and click the Enable button.  This step will also enable Change Tracking.
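If you prefer scripting, the same solution can be enabled with PowerShell.  Here’s a hedged sketch using the Az.OperationalInsights module; the resource group and workspace names are placeholders:

    # Sketch only: enable the ChangeTracking solution (which backs both
    # Inventory and Change Tracking) on an existing workspace.
    Set-AzOperationalInsightsIntelligencePack -ResourceGroupName "my-rg" `
        -WorkspaceName "my-workspace" `
        -IntelligencePackName "ChangeTracking" -Enabled $true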

The next step is to configure the virtual machine to communicate with Log Analytics.

Configure Non-Azure Machine
This Microsoft doc explains the following in more detail.  For this task, I will be using an AWS t2.micro instance running Windows Server 2016.  The first thing that needs to be done is to add registry subkeys to enable TLS 1.2 and make the .NET Framework use strong cryptography.  The Microsoft Monitoring Agent (MMA) that will be installed uses TLS 1.2 to report changes to the Log Analytics service in Azure.  Here are the registry keys for TLS 1.2 and .NET 4.6 that need to be added (a PowerShell sketch follows the list):

  1. HKLM\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Client – you will need to create the TLS 1.2 and Client subkeys.  In the Client subkey, create the following DWORD values:
    • Enabled [Value = 1]
    • DisabledByDefault [Value = 0]
  2. HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v4.0.30319 – no subkeys need to be created; create the DWORD value SchUseStrongCrypto with a value of 1.
  3. HKEY_LOCAL_MACHINE\SOFTWARE\WOW6432Node\Microsoft\.NETFramework\v4.0.30319 – no subkeys need to be created; create the DWORD value SchUseStrongCrypto with a value of 1.
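Here is a minimal PowerShell sketch of the same registry edits, assuming an elevated session (the paths mirror the list above):

    # TLS 1.2 client values; New-Item -Force creates the missing subkeys.
    $tls12Client = 'HKLM:\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.2\Client'
    New-Item -Path $tls12Client -Force | Out-Null
    New-ItemProperty -Path $tls12Client -Name 'Enabled' -Value 1 -PropertyType DWORD -Force | Out-Null
    New-ItemProperty -Path $tls12Client -Name 'DisabledByDefault' -Value 0 -PropertyType DWORD -Force | Out-Null

    # Strong cryptography for .NET Framework 4.x (64-bit and 32-bit).
    foreach ($netKey in 'HKLM:\SOFTWARE\Microsoft\.NETFramework\v4.0.30319',
                        'HKLM:\SOFTWARE\WOW6432Node\Microsoft\.NETFramework\v4.0.30319') {
        New-ItemProperty -Path $netKey -Name 'SchUseStrongCrypto' -Value 1 -PropertyType DWORD -Force | Out-Null
    }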

Now that the registry edits have been completed, the MMA can be installed.  There are three options for installing the agent: the installation wizard, the command line, or PowerShell DSC.  In this case, I will use the command line.  There are also two agent versions: 64-bit and 32-bit.  Once the appropriate agent has been downloaded, open an elevated prompt, navigate to the location of the file, and run a command to extract the MSI file to a folder path.  Finally, install the agent with the extracted MSI file.
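A hedged sketch of those two steps, based on the documented setup switches for the agent; the folder path, workspace ID and key are placeholders you must supply, and the exact flags are worth verifying against the current doc:

    # Extract the installer contents (64-bit package name assumed).
    .\MMASetup-AMD64.exe /c /t:C:\MMA

    # Install silently and point the agent at your Log Analytics workspace.
    C:\MMA\setup.exe /qn NOAPM=1 ADD_OPINSIGHTS_WORKSPACE=1 `
        OPINSIGHTS_WORKSPACE_AZURE_CLOUD_TYPE=0 `
        OPINSIGHTS_WORKSPACE_ID="<workspace-id>" `
        OPINSIGHTS_WORKSPACE_KEY="<workspace-key>" `
        AcceptEndUserLicenseAgreement=1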

Once the agent is installed, you can verify the installation in a couple of ways.  In Control Panel, open the Microsoft Monitoring Agent applet to check that it’s installed, as seen here.

Also, in the Azure portal, you can run a search query to check that the agent has a heartbeat and is communicating with Log Analytics.  Go to the Logs tab on your workspace to run the query.  The following figure shows a heartbeat detected from the AWS EC2 instance.
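A minimal heartbeat query in KQL looks something like this (the computer-name filter is my assumption; drop it to see all reporting agents):

    Heartbeat
    | where Computer contains "EC2"
    | summarize LastHeartbeat = max(TimeGenerated) by Computer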

Now that the VM is communicating with Azure, we can complete the onboarding of the AWS VM.  Go back to Inventory and you will see a new screen with various options: a button to add Azure VMs, a link to a doc on adding non-Azure machines, and another button to manage machines.  Click on Manage Machines.

Now you can see the AWS EC2 instance as an available machine.  You have three options for how to enable Change Tracking and Inventory.  I select the default, “Enable on all available machines”, and click Enable.

Inventory
Now that Inventory has been enabled, the agent has been installed and the machine is reporting to Azure, let’s see what we get out of the box.  First, if you go to the Inventory tab of your Automation account, it will show the machine(s) reporting, along with details about each machine (OS, version, platform, etc.).  Additionally, there are tabs to view the software installed on the node, along with Windows services, Windows registry keys and Linux daemons where applicable.  You will need to configure the files and registry keys to monitor under Edit Settings; I will demonstrate that in the next article on Change Tracking.

To see the installed software, click on the “Software” tab.  It will show the software name, version, publisher, last refreshed time and number of machines it’s installed on.

Also, go to the “Windows Services” tab to see details such as the service name, status, startup type, last refreshed time and number of machines.

Lastly, the Inventory tool gives you the ability to add machines to a machine group.  A default group is created when a machine is added to Inventory.

A machine group is a group of machines defined by a Log Analytics KQL query.  For instance, if you go to Machine groups and select the “MicrosoftDefaultComputerGroup”, you can see the query that was used to generate the group, as well as its member machines.

You can create your own machine group as well.  Click on “Create a Machine Group” on the Inventory screen and fill in the fields on the popup window.  Here, I’m creating a group of machines that contain “EC2” in the name.  Once you enter a query in the definition box, validate the query and then create the group.  A more practical use case would be a group of machines that are missing updates.
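For a group like mine, the definition would be a short KQL query along these lines (a sketch; adjust the filter to your naming convention):

    Heartbeat
    | where Computer contains "EC2"
    | distinct Computer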

This completes Part 1 of Azure Configuration Management.  Here, I covered how to enable Inventory, install the monitoring agent on an AWS VM and use Inventory to view a machine’s configuration.  In Part 2 of this series, I will cover Change Tracking.


C# Intro for Powershellers

Let me begin by stating that I am not a programmer or a C# expert.  I am a systems engineer by trade who has become proficient in PowerShell scripting.  When I first began working in IT, about 13 years ago, the first programming language I tried to learn was Perl.  At the beginning of that process, I immediately discovered that programming was not for me and the relationship was not going to work.  So I dumped Perl after a few “dates” and never looked back.  However, that was then.  Now, after having worked with PowerShell to create scripts to automate tasks, I have developed a renewed interest in coding.  According to Jeffrey Snover, the inventor of PowerShell, the intent behind PowerShell was for it to be a glide path to C#.  I guess you can say: mission accomplished.

In this article, I’m going to create a basic C# application in Visual Studio while using PowerShell as a reference to explain some of the concepts in C#.  As the saying goes, use what you already know to learn new things.  First, though, let’s review a few basics.

FUNDAMENTALS
Before we dive into C# coding, we need to briefly cover the essential elements of the .NET Framework architecture.  There are two main components of the .NET Framework – the Common Language Runtime (CLR) and the Class Library.  In a nutshell, the CLR is responsible for converting compiled C# code into the native code of the underlying machine.  C# code is initially compiled into a format called Intermediate Language (IL).  This IL code is consumed by the CLR and converted into a format the underlying machine can execute.  This process flow is depicted in Figure 1.

The second component of the .NET Framework architecture, the Class Library, is a set of namespaces, classes, interfaces and types that applications use to access the system’s functionality.  Figure 2 shows a small subset of the .NET Class Library.


Figure 2

Due to the sheer number of classes, they need to be organized by function.  This is the purpose of a namespace – a container of related classes.  Figure 3 below illustrates the structure of a .NET Framework namespace called “System”.  The System namespace consists of multiple classes such as Console, DateTime, Math and many more not shown in the diagram.  PowerShell commands and C# code use namespaces to produce and manipulate instantiated instances (objects) of these classes, which contain various methods and properties.

As mentioned, within each namespace are classes.  A class (Figure 4) is a container that consists of various methods and properties.  Methods allow you to perform certain actions against an object built from the class, and properties describe its characteristics.  In essence, a class can be seen as the blueprint used to build objects.  For instance, if you have a class called “Car”, it may have properties such as make, model, color and year.  It will also have methods that determine the actions that can be performed against a “Car” object, such as Start(), Drive(), Stop() and so on.  The properties and methods of a class are known as the class members.
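As a minimal sketch (my own illustration, not part of any library), the Car class might look like this in C#:

    using System;

    // Blueprint: properties describe a Car object, methods act on it.
    public class Car
    {
        public string Make  { get; set; }
        public string Model { get; set; }
        public string Color { get; set; }
        public int    Year  { get; set; }

        public void Start() => Console.WriteLine($"{Year} {Make} {Model} started.");
        public void Drive() => Console.WriteLine($"{Year} {Make} {Model} is driving.");
        public void Stop()  => Console.WriteLine($"{Year} {Make} {Model} stopped.");
    }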

To build a fully functional C# application, we must assemble all of the necessary components that work in conjunction to produce a desired result.  For instance, you may have an application that needs to output to the screen or work with a collection of data.  These tasks will require you to make the System and System.Collections namespaces accessible to your application, along with the relevant classes.  As the number of namespaces your application uses increases, they are packaged into a unit called an “assembly” – basically a DLL or EXE file (Figure 5).


Figure 5  C# Assembly (DLL or EXE)

Now that we have completed our review of the .NET Framework architecture and components, let’s proceed to some basic C# coding.

THE CODE

For this article, I will create a C# application that will retrieve the current date/time and output the results to the console screen.

                                                    PowerShell

As you already know, in PowerShell this is simply done by executing the “Get-Date” command shown in Figure 6.


Figure 6

It’s also important to know how PowerShell reaches into the .NET Framework to produce this output.  If we pipe this command to “Get-Member”, we will see the TypeName – the namespace and class – that the Get-Date command is referencing.  In Figure 7, we see it’s using the DateTime class, which is a member of the “System” namespace.  Therefore, our C# code must also use the System.DateTime class to achieve the same result.  For non-seasoned PowerShellers, I omitted additional output produced by Get-Member in the Figure 7 screenshot.  The Get-Member command also displays the methods and properties of the objectified output that is passed to it via the pipeline.


Figure 7
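For reference, the pipeline in that screenshot is simply:

    Get-Date | Get-Member
    # The first line of output names the type:
    #    TypeName: System.DateTime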

                                                         C#

Now let’s produce the same result in C#.  I will be using Visual Studio Community 2017 to accomplish this task.

First, open Visual Studio and go to File > New > Project (CTRL + Shift + N) as shown below in Figure 8.


Figure 8

In the window that appears (Figure 9), expand Visual C#, select .NET Core and then choose Console App (.NET Core).


Figure 9

Notice you are given several settings to configure:

  • Name:  Give your Project a name.  I will call mine DateApp1.
  • Location:  Choose a location to store your project (or use the default location).
  • Solution:  You can create a new solution or use an existing one.  A solution contains the project and build information.  Also, it can contain multiple projects.
  • Solution Name:  By default, this will be the same name you give the Project.  You can give it a different name but I will leave the default.
  • Create new Git repository:  Check this box to create a Git repository.  I will leave it unchecked.

Once all the settings are configured, hit “OK”.  You will now see, as shown below in Figure 10, a window with the basic structure of your C# program.


Figure 10

Let me explain the above.  By default, certain namespaces are automatically added to a new program, each of which is preceded by the “using” keyword.  But remember, for our project we will only need the “System” namespace; it contains the class (DateTime) that our application requires.  Also, notice that a namespace is created that matches the name of the project, as well as a class called “Program”.  Both can be changed to a different name.  The final piece added is a static method, Main, which accepts an array of string parameters.  It has the “void” keyword, meaning it doesn’t return a value.  A static method is one that can be called without creating an instance of the class it’s a member of.  The following PowerShell examples illustrate the difference:


Figure 11  Powershell static method of the
DateTime class to add 5 days to the current date

Figure 12 Powershell instance method of the
DateTime class variable ($date) to add 5 days
to the current date
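In code, those two figures likely show something along these lines (my sketch of the same idea):

    # "Static" usage: members reached through the class itself with the
    # [Class]::Member syntax; no instance variable is required.
    [DateTime]::Now.AddDays(5)

    # Instance usage: a method called on an object stored in a variable.
    $date = Get-Date
    $date.AddDays(5)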

Figure 13 shows the structure of our DateTime app after removing the unused .NET namespaces, changing the class name and adding the code to print the current date to the screen.  To achieve this, create a variable called “Now” and assign it the DateTime.Now property.  In C#, a variable is declared by preceding the variable name with the name of the .NET type it holds.  Finally, add the Console.WriteLine method with “Now” as its argument.  Also, notice the “System” namespace is now highlighted, which indicates it’s being used.


Figure 13
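In text form, the code in Figure 13 likely resembles the following; the class name DateApp is my guess, since the article only says the default “Program” was renamed:

    using System;

    namespace DateApp1
    {
        class DateApp
        {
            static void Main(string[] args)
            {
                // Declare the variable with its type, then assign the
                // static DateTime.Now property to it.
                DateTime Now = DateTime.Now;
                Console.WriteLine(Now);   // prints the current date and time
            }
        }
    }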

To compile and execute the above code, press CTRL + F5.  If the build is successful, you will get a pop-up command window showing the current date and time (Figure 14).  By default, it’s displayed in a long date and time format.


Figure 14

Visual Studio will also show output detailing the build results.  It displays the location of the application’s executable file and whether the build succeeded or failed.


Figure 15

In summary, in this article I explained the fundamentals of the .NET Framework and created a basic C# application that outputs the current date and time to the console.


MyIgnite – Reflections on Microsoft Ignite 2018

I attended this year’s Microsoft Ignite conference in Orlando, FL, and decided I would share my reflections on the event.  The annual conference provides a plethora of sessions on Microsoft technology offerings and solutions related to Microsoft 365, IoT, containers, DevOps, team collaboration, Azure services and more.  There’s also an expo of various IT vendors, panel discussions on diversity in IT, and hands-on labs for IT skill development.  It’s a huge event, with attendees from all walks of IT around the world.

The conference kicked off with a keynote address from Microsoft CEO Satya Nadella.  In his opening speech, he outlined Microsoft’s vision of the next generation of IT: solutions that revolve around an intelligent cloud and intelligent edge, transforming products (business apps, gaming, infrastructure, etc.) and how IT organizations design their operations.  Traditionally, IT has been slow to adopt new technologies due to security concerns and policies.  Some IT shops are still afraid of the cloud and the perceived risks it presents to business information.  However, this posture is no longer viable, as users and business partners must be given the flexibility to be productive from any device and any location.

What’s new at Microsoft?  Two new features introduced at Ignite are Ideas and a refined Microsoft Search.  Ideas is a cool feature that uses AI to predict what a user will do, offering a set of design suggestions when creating a PowerPoint presentation.  For instance, if you are designing a PowerPoint slide, Ideas will suggest a particular graphic based on the written text in the slide.  It can also find content inconsistencies, such as a word spelled differently in different places, and offer to remediate the differences.  You must have Office 365 ProPlus to use this feature, since it leverages AI capabilities in the cloud.  It was also announced that Microsoft Search has been expanded to search across all Office products and device types.  Using Microsoft Graph and Bing, it intelligently provides customized results based on your previous activities and work.

Microsoft 365 also has a new admin center.  An improvement over the Office 365 Admin Center, it offers a more focused and centralized workplace for managing and securing resources in Microsoft’s cloud ecosystem.  If you are a security administrator for your organization, there is a portal at security.microsoft.com dedicated to security-related responsibilities such as DLP, document classification and permissions.  There’s also a portal at admin.microsoft.com for managing users, groups and resources.  This approach falls in line with the concept of Just Enough Administration (JEA).

Microsoft is clearly mounting a full-court press toward wider adoption of Azure and Office 365.  A majority of the sessions were related to the Azure cloud platform and its myriad offerings.  On-premises enterprise applications such as Exchange Server may not be dead, but they are definitely on the endangered list.  At past conferences, there would have been a variety of sessions around on-premises Exchange and related features, particularly in a year with a new release of Exchange Server.  Not so this year.  There were only a couple of sessions devoted to Exchange 2019, which is currently in preview.  Then again, the handwriting has been on the wall for several years that the focal point of messaging is the cloud.

In addition to technical skill development, a very important part of IT is creating a work environment that is free of sexual harassment and racial bias.  The IT field is very male-dominated, and an unspoken reality is that it’s often a toxic world for women.  It was good to see in the session lineup several discussions highlighting the challenges women face in IT, how to overcome biases, and how to create a more inclusive workplace.

Microsoft has announced that Ignite will be held in Orlando again in 2019, though during the first week of November rather than the last week of September.  This will mark the third year in a row that Ignite is in Orlando.  Although it’s nice to visit different cities, Orlando is a great location for a conference with nearly 30,000 attendees.  The weather is great, it’s close to the Disney theme parks (the closing celebration was at Universal Studios), and it’s not congested like other major cities.

Those are my thoughts and takeaways.  What stood out to you about this year’s Ignite?


Azure AD Attribute Hide and Seek

Azure AD Connect provides organizations with the ability to synchronize their on-premises users and groups to Azure Active Directory.  When synchronizing objects to Azure, administrators can control which users or groups are synchronized to the cloud.  Furthermore, it’s also possible to select which user or group attributes are synchronized.  Some organizations have security policies that prohibit certain information, such as phone numbers and addresses, from appearing in the cloud.  Luckily, attributes can be easily filtered by unchecking them on the AD connector object in Synchronization Service Manager.  But what if an attribute is being synced yet does not appear on the AD connector as a filterable option?  Here’s an example that shows you how to deal with that.

Let’s take a look at a user called TesterB in PowerShell.  Using the Azure AD PowerShell module (or Azure Cloud Shell), we can get the user object and its properties with a command like the one below.  Notice that the City attribute for our user is set to New York.
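Here’s a sketch of that lookup using the AzureAD module; the UPN is a placeholder, and this assumes Connect-AzureAD has already been run:

    Get-AzureADUser -ObjectId "TesterB@contoso.com" | Select-Object DisplayName, City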

We don’t want location information available in Azure AD.  Let’s log on to the Azure AD Connect server and open Synchronization Service Manager to filter this attribute.  Once there, click on the Connectors button.  You will see two connectors: one for Azure AD and the other for on-premises AD.  Select the on-premises AD connector.

On the Properties window for the AD connector, click on “Select Attributes” to see the list of attributes that are available and being synchronized to Azure.

As shown below in the AD connector attributes window, there isn’t a “City” attribute.  (The attributes with a check mark are the ones being synced to Azure AD.)  This view shows the LDAP name for each attribute, which is not always the same as its display name – the display name is what the user property showed above in PowerShell.  To get to the bottom of this, we need to look at the Attribute Editor for the user object in on-premises AD.

Open the TesterB user in ADUC and go to the Attribute Editor tab.  There you will see a list of the attributes that are available.  This view shows the LDAP name for each attribute and its value, if one is set.  Since the value New York appears next to the attribute “l”, we can tell that “l” is the LDAP name for City.

Now if you go back to the AD connector for verification, you will notice the attribute “l” is checked.  This will need to be unchecked.

Once you uncheck it and save the change, run the following command in PowerShell to remove the City information from users in Azure AD and prevent it from being synced in the future.
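Based on standard Azure AD Connect practice, that command is presumably a full sync cycle from the ADSync module:

    # Run on the Azure AD Connect server; a full (initial) cycle re-evaluates
    # attribute flow and clears the now-unselected attribute from Azure AD.
    Start-ADSyncSyncCycle -PolicyType Initial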

A quick look at the City property for TesterB shows the location is no longer displayed.

That’s it!  If you ever have a situation where you can’t find an attribute to filter on the AD connector, remember that it probably has an LDAP name that is different from its display name.


A Guide to Passing Azure Exam 70-533

Back in April of this year, I passed Azure exam 70-533: Implementing Microsoft Azure Infrastructure Solutions.  To be honest, this was actually my second attempt at the exam; I failed on my first try about three weeks earlier.  But who’s counting?  All that matters is that I persisted and eventually passed.  I’m not mentioning this to discourage anyone intending to take the exam.  Rather, my intention is to provide encouragement if you don’t pass the first time around.  No one likes seeing the word “Fail” on the exam printout, but it’s not the end of the world.  With that said, I thought I would write an article outlining the methods I employed to prepare for the test.

Practical Experience

First and foremost, you will need hands-on experience to pass this test.  Azure exam 70-533 is not easy and cannot be passed solely by reading books or articles.  If you do not have access to Azure through your employer or a Visual Studio subscription, Microsoft offers a 30-day free trial, which comes with a $200 credit.  The free trial allows you to create resources in Azure such as VMs, virtual networks, storage accounts, web apps, containers, etc.

Once you set up your account, it’s important to have a strategy for learning the skills needed to pass the exam.  Microsoft publishes a list of objectives and related skills covered by the exam; as of this writing, the objectives were last updated on March 29, 2018.  Under each category of objectives are a number of relevant tasks or exercises.  Go to the exam site and do exercises around all the listed skill areas.  Microsoft has excellent documentation that will help you develop the skills measured by exam 70-533.  Also, it’s very important to learn how to accomplish tasks using PowerShell and ARM templates, instead of only in the portal.  For instance, learn how to deploy VMs and related resources from a script or template, along the lines of the sketch below.  Perform all of the tasks until you feel you have mastered them.
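For example, a minimal template deployment looks something like this (AzureRM-era cmdlets matching the exam’s timeframe; the names and file paths are placeholders):

    # Create a resource group, then deploy an ARM template into it.
    New-AzureRmResourceGroup -Name "rg-70533-lab" -Location "eastus"
    New-AzureRmResourceGroupDeployment -ResourceGroupName "rg-70533-lab" `
        -TemplateFile ".\azuredeploy.json" `
        -TemplateParameterFile ".\azuredeploy.parameters.json"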

Training Courses

Pluralsight courses proved to be a critical component of my training.  The site offers a number of courses covering topics such as Azure infrastructure solutions, storage, networking, application services, ARM templates, identity management and more.  There is also a learning path for exam 70-533 that consists of about 7 or 8 courses.  The training material is excellent and includes demos and exercise files that provide practical training.  Pluralsight courses will give you a solid foundation.  A monthly Pluralsight subscription costs $29, and the site is more than worth the price.  Another site that was helpful is Cloud Ranger.  Its courses are free, but many of them are now outdated since they are designed around the old Classic model.

Practice Exam

I would advise you to get the official MeasureUp practice exams from Mindhub.  Some of the questions are on the Classic model; however, the practice exam was still very helpful.  The real exam is all ARM, with nothing on the Classic model.  The MeasureUp practice exam provides the option of taking the test in Practice mode, which is a customizable format.  For instance, you can select questions from a particular objective, or only questions that you missed during the last practice exam.  A huge benefit of the practice test is that it offers explanations for why an answer is correct and the others are wrong.  Also, each answer has links to documents relevant to the question.  DO NOT memorize the answers; know why an answer is correct.  I retook the full practice exam (nearly 200 questions) until I consistently scored at least 95%.  At that point, I moved on to taking the practice test in Exam mode.  Mindhub currently has a special that offers an exam voucher, the practice test and 2 retakes for $266.00.

Helpful Links

The exam 70-533 reference book has not been updated in a while, but this site has tips that were extracted from the book’s content.  These bullet points are important facts that you will need to remember for the exam.  Also, make sure you know the features and pricing of App Service plans and SQL Database service tiers.

I hope the information I provided was beneficial and will contribute towards you passing exam 70-533.  Good luck!


PowerShell Function to Get Messages

If you are an Exchange Server administrator, you more than likely spend a fair amount of time searching the message tracking logs.  The data in these logs can help you find, say, all messages with a particular subject or sent by a certain user during a specific time frame.  There are two ways to search the message tracking logs: the Exchange Admin Center (EAC) or the Exchange Management Shell (EMS).  Using the GUI is perfectly fine if that is your preference; however, if you have to perform searches on a regular basis, the EMS is the more efficient option.
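For context, the native cmdlet such searches build on looks like this (a sketch; the sender and subject are placeholders):

    # Find messages from one sender with a given subject over the past week.
    Get-MessageTrackingLog -Sender "user@contoso.com" `
        -MessageSubject "Quarterly report" `
        -Start (Get-Date).AddDays(-7) -End (Get-Date) `
        -ResultSize Unlimited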

To help make your job (and mine) a lot easier, I wrote a PowerShell function that can be used to find messages based on various criteria.  Check it out on GitHub, and let me know what you think.

https://github.com/rburrs/Powershell-Toolbox/tree/master/Exchange%20Server
