Error Code 0xC1900208 and Workaround…

2017-07-27T01:15:46+00:00 August 2nd, 2017|blog, Windows 10|

Windows 10 Compatibility checks + Intel Display Adapters + KB4022719 = 0xC1900208

If you’re in the middle of upgrade testing, you may be very well versed in the 0xC1900208 error code, which indicates a compatibility check failure for the Windows 10 in-place upgrade process.  When reviewing the compatibility results, you may find that your system reports you’ll have issues with your display adapter in Windows 10.  This is a hard block, and the upgrade will not proceed.

The failure may be due to the June Rollup KB4022719, which takes one step forward and resolves a compatibility issue with AMD display adapters, but also creates a new compatibility failure for Intel Display adapters.

Microsoft has not noted this in the comments and does not seem to be issuing any fixes yet.  The recommendation is to uninstall the display adapter as opposed to uninstalling the security update.

I was recently at a customer site where this issue presented itself on HP 850 Model G1/G2/G3 devices and a workaround needed to be developed for their in-place upgrades to succeed.  Instead of asking users to uninstall the display adapter and driver manually prior to the upgrade, we decided to take advantage of the devcon.exe file that comes with the Windows Driver Kit.

A Link to DevCon.exe information: https://docs.microsoft.com/en-us/windows-hardware/drivers/devtest/devcon

High-Level Steps

Three steps were required to provide this workaround:

  1. Device Installation settings must be configured to never install driver software from Windows Update (as in the figure below).  This prevents the system from connecting online and reinstalling the driver after you uninstall.
  2. The display adapter needs to be removed.
  3. The OEM driver needs to be deleted.

Additional Details

Step 1 can be automated in a task sequence step using the REG command:

reg add HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\DriverSearching /t REG_DWORD /v SearchOrderConfig /d 0x0 /f

For steps 2 and 3, a custom script was developed in PowerShell that utilizes devcon.exe to remove the associated display adapter and delete the OEM driver associated with the specific hardware.  The commands are as follows:

Devcon.exe remove <hardware ID>

Devcon.exe dp_delete <OEMDriver.INF>

To find the hardware ID for the models that are failing the compatibility checks, we simply opened Device Manager and viewed the properties of the display adapter causing the issue.  On the Details tab, you can review the Hardware IDs property and grab the first ID in the list (see figure below).
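If you’d rather not click through Device Manager on every model, the display adapter’s hardware ID can also be pulled with a quick WMI query.  A sketch, using the standard Win32_VideoController class; the "*Intel*" name filter is just an example and may need adjusting for your hardware:

```powershell
# Sketch: list name and hardware (PNP device) ID for display adapters.
# The "*Intel*" filter is an example; adjust it for the adapter in question.
Get-WmiObject -Class Win32_VideoController |
    Where-Object { $_.Name -like "*Intel*" } |
    Select-Object Name, PNPDeviceID
```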

Once we obtained the hardware ID, we then parsed through all installed drivers using devcon.exe dp_enum and scriptomagically grabbed the appropriate INF file name to delete.
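A rough sketch of how that parsing might look.  The helper name is hypothetical, and it assumes the typical dp_enum layout where each package appears as an oemNN.inf line followed by indented Provider/Class detail lines:

```powershell
# Hypothetical sketch of the "scriptomagic": pick out the OEM INF whose
# Class is "Display adapters" from captured devcon.exe dp_enum output.
# Assumes each package is listed as an oemNN.inf line followed by
# indented "Provider:" and "Class:" detail lines.
function Get-DisplayAdapterInf {
    param([string[]]$DpEnumOutput)
    $currentInf = $null
    foreach ($line in $DpEnumOutput) {
        if ($line -match '^\s*(oem\d+\.inf)') {
            $currentInf = $Matches[1]   # start of a new driver package
        }
        elseif ($currentInf -and $line -match 'Class\s*:\s*Display adapters') {
            $currentInf                 # emit the matching INF name
        }
    }
}

# Usage in the task sequence (assumes devcon.exe is on the path):
# $inf = Get-DisplayAdapterInf -DpEnumOutput (devcon.exe dp_enum)
# devcon.exe dp_delete $inf
```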

So a quick sample of the commands would be as follows:
reg add HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\DriverSearching /t REG_DWORD /v SearchOrderConfig /d 0x0 /f

devcon.exe remove "*pci\ven_8086&dev_1616&subsys_2216103c&rev_09*"

devcon.exe dp_delete OEM56.inf

The reg add command replaces whatever value is in SearchOrderConfig with the appropriate value to tell the system NOT to go to Windows Update for driver updates.  The second command removes the device associated with the hardware ID you specified.  The third command removes the driver associated with the display adapter.  Note that in the above list of commands, OEM56.inf is just an example; you will need to enumerate all your installed drivers to determine which INF file to remove.

So, in summary:

Intel display adapters in at least certain HP models no longer pass the Windows 10 v1703 compatibility checks.  You must turn off automatic driver updates from Windows Update, remove the display adapter, and remove the driver associated with it (so the upgrade will not find the local copy), then run your upgrade.  Doing so will allow your system to pass the compatibility check that has been failing since the June Rollup was deployed.

Hopefully this will save some of you the headaches and troubleshooting steps we ran into.

Happy Upgrading!

 

Which Azure Plan is Right for You?

2017-07-27T00:47:04+00:00 July 27th, 2017|Azure, blog, Cloud, Microsoft, Microsoft Infrastructure, Microsoft Cloud Solution Provider, Office 365|

As you start to explore the world of Microsoft Azure Cloud Services, you will see that there are many options.  Let’s discuss the three types of Microsoft purchasing programs.

#1 – Pay-As-You-Go Subscriptions

Pay-As-You-Go subscriptions are simple to use and simple to set up.  There are no minimum purchases or commitments.  You pay for your consumption by credit card on a monthly basis, and you can cancel anytime.  This use case is primarily for infrastructure environments that are set up for a temporary purpose.  It’s important to understand that organizations using this model pay full list price for consumption and do not have direct support from Microsoft.

#2 – Microsoft Enterprise Agreement (EA)

Microsoft Enterprise Agreements are commitment-based Microsoft Volume Licensing agreements with a contractual term of 3 years.  Enterprise Agreement customers can add Azure to their EA by making an upfront monetary commitment for Azure services.  That commitment is consumed throughout the year by using a combination of the wide variety of Microsoft cloud services, including Azure and Office 365.  This is paid annually in advance, with a quarterly true-up for overages.  Any unused licenses are still charged based on the commitment.  If you are a very large enterprise, the greatest advantage of an EA is having a direct relationship with a Microsoft sales team.  EAs also offer discounts based on your financial commitment.  And while there are many pros to the EA approach, understanding and controlling the cost of consumption can be a challenge for customers within EAs.  Personally, I recently took over the management of our EA and can attest that this can be very complicated.

#3 – Cloud Solution Provider (CSP)

When using Microsoft Cloud Services through the Cloud Solution Provider (CSP) program, you work directly with a partner to design and implement a cloud solution that meets your unique needs.  Cloud Solution Providers support all Microsoft Cloud Services (i.e., Azure, Office 365, Enterprise Mobility Suite, and Dynamics CRM Online) through a single platform.  CSP is similar to the Pay-As-You-Go subscription in that there are no minimum purchases or commitments.  Your consumption is invoiced monthly based on actual usage (either via invoiced PO or credit card, your choice), and you can cancel at any time.  This will significantly simplify your Azure and Office 365 billing process!  CSP offers many advantages over Pay-As-You-Go Subscriptions and Enterprise Agreements, and in most cases can be a more cost-effective solution.

As a CSP, Coretek helps customers optimize their consumption cost by working with them to ensure they have the right Azure server types assigned to their workloads.  We also work with customers to shut down services after business hours, when they are minimally used.  As part of Coretek’s Managed Support, our team provides proactive maintenance, including monitoring and patching of your servers, to ensure your infrastructure is running in an optimal manner.  Coretek’s Azure Management Suite (AMS) Portal enables business users to understand where their consumption cost is going.  The AMS portal can display real-time consumption cost by department and project.  It also enables business users to see, in a simple graphical format, which Microsoft licenses are being utilized and to whom they are assigned.

Coretek Services – Improving End User Experience and IT Efficiency.

Microsoft Azure – Global. Trusted. Hybrid.  This is cloud on your terms.

Office 365 and Bing Maps – Issue and Fix

2017-07-27T00:00:55+00:00 July 21st, 2017|blog, Office 365|

Bing Maps Add-in to Office 365 changes the message body for emails received with addresses in them

Recently, I noticed that most of the emails I open and read prompt me to save, as though the body has changed.  The message text is “The body of the message <subject> has been changed.  Want to save your changes to this message?”

The issue is that I’ve not changed any part of the message.  I’ve simply opened it and then closed it.  After further investigation, I’ve found that the Bing Maps add-in is modifying the body of the message by replacing any address with a link.

To avoid this behavior, and the annoying message for every email with an address that you open, simply open Outlook, navigate to File > Manage Add-ins, log in with your Office 365 account, and disable the Bing Maps add-in.  This may take a few minutes to take effect, but a restart should not be required.

This applies to at least version 1706 (Build 8229.2086) of the Microsoft Office 365 release.  I’ve read this may also happen with some older versions, but I have not tested them.

 

Azure – What tags are we using again…?

2017-07-27T00:00:55+00:00 July 7th, 2017|Azure, blog, PowerShell|

Have you wondered what tags are assigned to all your Azure VMs?  Do you not have ARM Policies in place to enforce your preferred tags yet?

I was in just such a situation the other day.  Just like in my previous post on quick Azure-related scripts, I was working with a customer that just wanted a quick utility to report on what VMs are properly tagged (or not) in their implementation, without having to fish around the Portal.  No ARM Policies there yet…  *yet*.

So I whipped this together.  And much like that previous script, you just paste this into a PS1 file, set the subscription name in the variable, and then run it.

# GetVmTags - Jeremy Pavlov @ Coretek Services 
# Setting the subscription here
$MySubscriptionName = "My Subscription"
# Set some empty arrays
$vmTagsKeys = @()
$vmTagsValues = @()
#
$NeedToLogin = Read-Host "Do you need to log in to Azure? (Y/N)"
if ($NeedToLogin -eq "Y")
{
  Login-AzureRmAccount
  Select-AzureRmSubscription -SubscriptionName $MySubscriptionName
}
elseif ($NeedToLogin -eq "N")
{
  Write-Host "You must already be logged in then.  Fine. Continuing..."
}
else
{
  Write-Host ""
  Write-Host "You made an invalid choice.  Exiting..."
  exit
}
# Get the VMs (after the login check, so the call can succeed)...
$vms = Get-AzureRmVm
#
foreach ($vm in $vms)
{
    Write-Host ""
    $vmname = $vm.Name
    $MyResourceGroup = $vm.ResourceGroupName
    Write-Host "Checking tags for VM $vmname... "
    Start-Sleep 1
    $vmTags = (Get-AzureRmVM -Name $vmname -ResourceGroupName $MyResourceGroup).Tags
    $vmTagsCount = $vmTags.Count
    if ($vmTagsCount -gt 0)
    {
      $vmTagsKeys = $vmTags.Keys -split '[\r\n]'
      $vmTagsValues = $vmTags.Values -split '[\r\n]'
      for ($i = 0; $i -lt $vmTagsCount; $i++)
      {
        $CurrentTagKey = $vmTagsKeys[$i]
        $CurrentTagValue = $vmTagsValues[$i]
        Write-Host -ForegroundColor Green "Key : Value -- $CurrentTagKey : $CurrentTagValue"
      }
    }
    else
    {
      Write-Host -ForegroundColor Yellow "No tags for $vmname"
    }
}

The results should look something like this, except hopefully a lot more serious and business-y:

Have fun with it, change it up, and let me know what you do with it…   I hope it helps.  Enjoy!

Rolling out the Red Carpet… literally!

2017-07-27T00:00:55+00:00 June 22nd, 2017|blog, Headquarters|

On June 8 Coretek Services hosted our New Headquarters Open House complete with TekTalks, networking, and plenty of food and drink. A big thank you to all who helped celebrate with us, as well as our partner sponsors who made the Open House possible! In case you missed them, watch the highlight video below or read the Open House blog post.

A highlight of the event’s fun was the Red Carpet. Below are photos taken as people stepped up to the red carpet throughout the event to have their five seconds of fame in the spotlight. Thank you to Robert Lovelace for the photography!

 

 

Thank you to our Partner Sponsors:

 

 

 

CORETEK SERVICES LAUNCHES NEW HEADQUARTERS WITH INTERACTIVE OPEN HOUSE

2017-07-27T00:00:55+00:00 June 22nd, 2017|blog, Headquarters|

Technical. Modern. Collaborative.

These three words personify Coretek Services’ gorgeous new headquarters, located just west of downtown Farmington at 34900 Grand River Avenue, Farmington Hills, MI.

On June 8, Coretek showcased the New Headquarters with an Open House with over 300 guests from around the nation, including clients, partners, friends, family and members of our community. We cannot express enough gratitude for everyone who was able to make it, as well as the sponsors who made it happen.

The highlight of the event was a Ribbon Cutting Ceremony, featuring speeches from Ron Lisch, CEO of Coretek Services; Drew Coleman, Business Manager of the Michigan Economic Development Corporation; Ken Massey, Mayor of Farmington Hills; and Mary Martin, Executive Director of the Greater Farmington Area Chamber of Commerce; as well as a State Tribute by State Representative Christine Grieg.

Periodic TEK talks focusing on solutions to solve IT business problems were presented throughout the day including: Azure Management Suite, VDI: Clinical Workflows and Cloud Desktops, Post-Breach Response with Win10, Hyperconvergence with Hybrid Cloud, Transforming Outdated Devices, User-Experience Analytics and Optimized Managed Infrastructure.

Additionally, the Open House featured a red carpet (see photos here), networking, food and drinks!

Ron Lisch had a vision to create a headquarters that is a tech center for the Metro Detroit area. He envisioned a place where employees, clients, prospective clients, and partners can come together to collaborate on creative solutions hands-on in a lab. The headquarters would foster the Coretek philosophy of bringing great people together to do great work.

Ron’s vision came to full fruition in May 2017, as the Coretek family now officially lives and breathes in the new Headquarters.  Inside the new HQ is an innovation center, which hosts the different technology solutions Coretek offers and allows visitors to test the technology hands-on.  Additionally, there is a large events center where user-group meetings, training sessions, and other events will be held; a company fitness center for employee use; and a collaboration center that gives employees and visitors a space to work together, network, or relax.

The modern space is bright, with the latest technology implemented throughout and pictures from around our community displayed on canvas in the conference rooms.  The innovative HQ truly is a hub of technology and collaboration for the Metro Detroit area.

We will continue to host upcoming events at our new headquarters and are excited to welcome even more guests in the future to collaborate in the new work space.

Some photographs of the new space and Open House are below to give you a sneak peek at what we have in store for you when you visit!  Thank you to Resa Abbey for the photography!

Ron Lisch – CEO of Coretek Services

Innovation Center – 360 degree view

 

Clint Adkins providing a demo of Coretek Services Cloud Solutions

 

 

 

Thank you to our Partner Sponsors:

Azure – Next Available IP Address…

2017-06-01T19:53:53+00:00 June 15th, 2017|Azure, blog, Cloud, PowerShell|

The other day, I was working at a customer location with one of their IT admins named Jim, designing an Azure IaaS-based deployment.  During our session, he challenged me to make an easy way for him and his coworkers to see if a particular address is available, or find the next available “local” address in their various Azure VNets.

Because while addressing can be handled with ARM automation — or just dynamically by the nature of IaaS server creation — in an Enterprise it is usually necessary to document and review the build information by a committee before deployment.  And as a result, Jim wanted to be able to detect and select the addresses he’d be using as part of build designs, and he wanted his coworkers to be able to do the same without him present.  So, this one is for Jim.

I wanted to write it to be user-friendly, with clear variables, and be easy-to-read, and easy-to-run.  …unlike so much of what you find these days…  😉

Copy the contents below to a file (if you wish, edit the dummy values for subscription, Resource Group, and VNet) and run it with PowerShell.  It will do a mild interview of you, asking about the necessary environment values (and whether you need to change them), and then asking you for an address to validate as available.  If the address you specified is available, it tells you so; if it is not, it returns the next few available values in the subnet in which the address you entered resides.

# TestIpaddress - Jeremy Pavlov @ Coretek Services 
# You just want to know if an internal IP address is available in your Azure VNet/Subnet.
# This script will get you there.  
#
# Pre-reqs:
# You need a recent version of the AzureRm.Network module, which is 4.0.0 at this writing.  
# You can check with this command: 
#     Get-Command -Module azurerm.network
# …and by the way, I had to update my version and overwrite old version with this command:
#     Install-Module AzureRM -AllowClobber
#
# Some things may need to be hard-coded for convenience...
$MySubscriptionName = "My Subscription"
$MyResourceGroup = "My Resource Group"
$MyVnet = "My VNet"
#
Start-Sleep 1
Write-Host ""
Write-Host "Here are the current settings:"
Write-Host "Current subscription: $MySubscriptionName"
Write-Host "Current Resource Group: $MyResourceGroup"
Write-Host "Current VNet: $MyVnet"
Write-Host ""
$ChangeValues = read-host "Do you wish to change these values? (Y/N)"
if ($ChangeValues -eq "Y")
{
  Write-Host ""
  $ChangeSub = read-host "Change subscription? (Y/N)"
  if ($ChangeSub -eq "Y")
  {
    $MySubscriptionName = Read-host "Enter subscription name "
  }
  Write-Host ""
  $ChangeRg = read-host "Change resource group? (Y/N)"
  if ($ChangeRg -eq "Y")
  {
    $MyResourceGroup = Read-host "Enter Resource group "
  }
  Write-Host ""
  $ChangeVnet = read-host "Change Vnet? (Y/N)"
  if ($ChangeVnet -eq "Y")
  {
    $MyVnet = Read-host "Enter VNet "
  }
}
#
try
{
  $MySubs = Get-AzureRmSubscription
}
catch
{
  Write-Host ""
  Write-Host -ForegroundColor Yellow "Unable to retrieve subscriptions."
}
$MySubsName = $MySubs.name
Start-Sleep 1
Write-Host ""
if ($MySubsName -contains "$MySubscriptionName")
{
  Write-Host "You are logged in and have access to `"$MySubscriptionName`"..."
}
else
{
  Write-Host "It appears that you are not logged in."
  Write-Host ""
  $NeedToLogin = Read-Host "Do you need to log in to Azure? (Y/N)"
  if ($NeedToLogin -eq "Y")
  {
    Login-AzureRmAccount
    Select-AzureRmSubscription -SubscriptionName $MySubscriptionName
  }
  elseif ($NeedToLogin -eq "N")
  {
    Write-Host "You must already be logged in then.  Fine. Continuing..."
  }
  else
  {
    Write-Host ""
    Write-Host "You made an invalid choice.  Exiting..."
    exit
  }
}
#
Start-Sleep 1
Write-Host ""
Write-Host "We will now check to see if a given IP address is available from any subnet in VNet `"$MyVnet`" "
Write-Host "...and if it is not available, provide the next few available on that subnet."
Start-Sleep 1
Write-Host ""
$MyTestIpAddress = Read-Host "What address do you wish to test for availability?"
#
$MyNetwork = Get-AzureRmVirtualNetwork -name $MyVnet -ResourceGroupName $MyResourceGroup
$MyResults = Test-AzureRmPrivateIPAddressAvailability -VirtualNetwork $MyNetwork -IPAddress $MyTestIpAddress
$MyResultsAvailableIPAddresses = $MyResults.AvailableIPAddresses
$MyResultsAvailable = $MyResults.Available
#
Start-Sleep 1
if ($MyResultsAvailable -eq $False)
{
  Write-Host ""
  Write-Host -ForegroundColor Yellow "Sorry, but $MyTestIpAddress is not available."
  Write-Host ""
  Write-Host -ForegroundColor Green "However, the following addresses are free to use:"
  Write-Host ""
  $MyResultsAvailableIPAddresses
}
else
{
  Write-Host ""
  Write-Host -ForegroundColor Green "Yes! $MyTestIpAddress is available."
}
Write-Host ""
Write-Host " ...Complete"

Now, if you know a better way to handle it, or have tips for improvement — or if you find a bug — I’d love to hear them (and so would Jim).  I hope it helps you out there…

Thanks, and enjoy!

Mobile Application Management with Intune

2017-07-27T00:00:55+00:00 June 2nd, 2017|blog, Intune, Mobility|

Mobile Application Management (MAM) is not a new feature.  However, Microsoft is always improving the MAM capabilities, and today Intune supports multiple operating systems on mobile devices.  This is no easy feat, since Microsoft is bound by the APIs that these other platforms, such as iOS and Android, offer.  These non-Microsoft operating systems are the most prevalent on mobile devices today; and with greater access to corporate data, this poses a threat of data leakage.

Policy

We’ve all used application policies from Microsoft’s wide range of applications that have been available for many years.  For example:

  • GPOs control where icons are, where data is saved, what drives are mapped, etc.
  • Configuration Manager is used to push software out to authorized users and remove applications from those who are not
  • Active Directory provides a way to secure data on the network with Groups and Users

…And while Microsoft released Intune quite a few years back, I’ve only recently become a real fan since I’ve started using Mobile Application Management without enrollment.  Let’s take a quick look at how MAM allows you to offer access to corporate data without compromising too much of that flexibility that users enjoy by choosing their own device platform and bringing their own devices to work.

BYOD

There’s nothing new about the concept of “Bring your own device” (BYOD); it’s been around for quite some time.  Users can bring their own devices and use them for daily business.  Traditionally, users would log on to a segmented Wi-Fi network that has no access to the corporate network.  This allowed IT admins to avoid managing additional network access to company resources and to provide an open network for these devices as well as for guests visiting their offices.  However, with many companies moving data and apps to “the Cloud”, the focus is no longer on segmenting networks, and is instead on protecting the data.

Traditional office apps like Word, Excel, and PowerPoint have been available on mobile devices for quite some time now too, but they commonly required sending the documents to your phone and then opening them.  With Office 365, SharePoint online, and OneDrive, these apps now have access to a massive amount of your corporate data.  Without protecting this data when accessed on a mobile device, a user could download sensitive company information on their mobile device unencrypted and unprotected from prying eyes.  This is where I think Mobile Application Management really starts to come into play.

A Real-World Example

Intune’s Mobile Application Management provides the capabilities to protect your sensitive information on the device wherever that device is, whether it is in a hotel halfway across the world, left behind in a taxi cab, or picked from the pocket of your CEO.  The device may be compromised, but the data is secure.  This is due to the way application management protects the data on the device.  Let me provide you with an example:

Bob is the CEO of an organization that provides financial information to customers across the financial markets.  The details of those finances could make or break a company’s stock profile if they were leaked.  Bob uses an iPhone to read emails and open documents while riding the subway in New York City.  One busy morning, he’s shuffling to make it to his next appointment and accidentally drops his phone while exiting the train.

Because of a rich set of policies that Bob’s admin has configured with MAM, the data Bob accesses is not allowed to be stored on the device, and after 5 unsuccessful attempts to unlock the phone, the corporate apps and data would be wiped.  Even if someone were to guess the PIN on Bob’s phone, they would still have to guess his credentials, which are required to open any of the company apps that Bob uses.  It’s important to understand that:

  • The data is not on the device
  • There’s a high probability that whoever finds the phone would automatically wipe the device by guessing the PIN wrong 5 times
  • By the time Bob realizes he’s lost his phone, a quick call to his IT department lets the admin send a remote wipe request to his device AND receive confirmation of success

That was just one example and there are many more features that MAM can enable to protect your data.

Bringing MAM Home

Mobile Application Management is easy to enable and deploy to your users.  With proper communication and process, your company data will be secured.  Don’t wait for one of your end users to accidentally leak sensitive information that could make or break your organization’s reputation.  Identify those who are using mobile devices and protect them sooner rather than later.

The Future of Azure is Azure Stack!

2017-07-27T00:00:55+00:00 May 18th, 2017|Azure, blog, Cloud|

[Diagram: components of Public Azure]

[Diagram: components of Private Azure (Azure Stack)]

I realize that the title above might be a bit controversial to some. In this blog post I will attempt to defend that position.

The two diagrams above, taken from recent Microsoft public presentations, symbolically represent the components of Public Azure and Private Azure (Azure Stack).  If you think they have a lot in common, you are right.  Azure Stack is Azure running in your own data center.  Although not every Azure feature will be delivered as part of Azure Stack at its initial release (and some may never be delivered that way because they require enormous scale beyond the reach of most companies), it is fair to say that they are more alike than they are different.

Back in 2012 I wrote a blog post on building cloud-burstable, cloud-portable applications.  My thesis in that post was that customers want to be able to run their applications on local hardware in their data center, on resources provided by a cloud provider, or even on resources provided by more than one cloud provider, and that they would like a high degree of compatibility that allows them to defer the choice of where to run an application and even change their mind as workload dictates.

That thesis is still true today.  Customers want to be able to run an application in their data center. If they run out of capacity in their data center then they would like to shift it to the cloud and later potentially shift it back to on-premises.

That blog post took an architectural approach using encapsulation and modularity of design to build applications that could run anywhere.

The Birth of Azure

A bit of additional perspective might be useful.  Back in 2007, I was working as an Architect for Microsoft when I came across what would eventually become Azure (in fact, that was before it was even called Azure!).  I had worked on an experimental cloud project years before at Bell Labs called Net-1000.  At the time, AT&T was planning on turning every telephone central office into a data center providing compute power and data storage at the wall jack.  That project failed for various reasons, some technical and some political, as documented in the book The Slingshot Syndrome.  The main technical reason was that the computers of the day were minicomputers and mainframes, and the PC was just emerging on the scene.  So the technology that makes today’s cloud possible was not yet available.  Anyway, I can say that I was present at the birth of Azure.  History has proven that attaching myself to Azure was a pretty good decision.

The Azure Appliance

What many do not know is that this is actually Microsoft’s third attempt at providing Azure in the data center.  Back in 2010, Microsoft announced the Azure Appliance, which was to be delivered by a small number of vendors.  It never materialized as a released product.

Azure Pack and the Cloud Platform System

Then came Windows Azure Pack and the Cloud Platform System in 2014, delivered, also in appliance form, by a small number of selected vendors.  Although it met with some success, is still available today, and will be supported going forward, its clear successor will be Azure Stack.  (While Azure Pack is an Azure-like emulator built on top of System Center and Windows Server, Azure Stack is real Azure running in your data center.)

Because of this perspective, I can say that Azure Stack is Microsoft’s third attempt at Azure in the data center, and one that I believe will be very successful.  Third time’s a charm!

Azure Stack

The very first appearance of Azure Stack was in the form of a private preview, and later a public preview: “Azure Stack Technical Preview 1”.  During the preview it became clear that those attempting to install it were experiencing difficulties, many of them related to the use of hardware that did not match the recommended minimum specifications.

Since Azure Stack is so important to the future of Azure, Microsoft decided to release it in the form of an appliance delivered by three vendors (HP, Dell & Lenovo) in the summer of 2017.  According to Microsoft, that does not mean there will be no more technical previews, or that no one will be able to install it on their own hardware.  (It is generally expected that there will be additional Technical Previews, perhaps even one at the upcoming Microsoft Ignite conference later this month.)  It simply means that the first generation will be released in controlled fashion through appliances provided by those vendors, so that Microsoft and those vendors can ensure its early success.

You may not agree with Microsoft (or me), but I am 100% in agreement with that approach.  Azure Stack must succeed if Azure is to continue to succeed.

This article originally posted 9/20/2016 at Bill’s other blog, Cloudy in Nashville.

How to protect against the next Ransomware Worm

2017-07-27T00:00:58+00:00 May 15th, 2017|blog, Ransomware, Security|

Hopefully you were one of the prepared organizations that avoided the latest Ransomware worm that made its way around the globe this past week.  This worm crippled dozens of companies and government entities, impacting over 230K computers in 150 countries.  Most of the infections were in Europe, Asia, and the Middle East, so if you did not get hit, you were either prepared or lucky.  This blog post will help you be prepared for when this happens again, so that you don’t have to rely on luck.

Patch everything you can, as quickly as you can

The exploit at the root of this Ransomware worm was resolved in MS17-010, which was released in March of 2017, giving organizations more than enough time to download, test, pilot through UAT (User Acceptance Testing), and deploy to production.  While introducing new patches and changes to your environment carries a risk of breaking applications, there is far more risk in remaining unpatched – especially for security patches.  Allocate the proper resources to test and roll out patches as quickly as you can.

Run the newest OS that you can

While the EternalBlue exploit patched by MS17-010 was applicable to every Windows OS, you were safe if you were running Windows 10 due to a security feature called ELAM (Early Launch Anti-Malware).  Many of the infected machines were running Windows XP or Server 2003, which did not get the MS17-010 patch (Microsoft has since released a patch for these OS variants; please apply it if you still have them in your environment).  It is not possible to secure Windows XP or Server 2003.  If you insist on running them in your environment, assume that they are already breached and that any information stored on them has already been compromised.  (You don’t have any service accounts with Domain Admin privileges logging into them, right?)

Firewall

Proper perimeter and host firewall rules help stop and contain the spread of worms.  While there were early reports that the initial attack vector was e-mail, these are unconfirmed.  It appears that the worm was able to spread to the 1.3 million Windows devices that have SMB (port 445) open to the Internet.  Once inside the perimeter, the worm was able to spread to any device that had port 445 open without MS17-010 installed.
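As one concrete layer, a host-level rule blocking inbound SMB can be created with the built-in NetSecurity cmdlets on Windows 8.1/Server 2012 R2 and later.  This is a sketch, not a complete firewall policy; only apply it to machines that don’t need to serve file shares:

```powershell
# Sketch: block inbound SMB (TCP 445) at the Windows host firewall.
New-NetFirewallRule -DisplayName "Block inbound SMB (TCP 445)" `
    -Direction Inbound -Protocol TCP -LocalPort 445 -Action Block
```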

Turn off Unnecessary Services

Evaluate the services running in your desktop and server environment, and turn them off if they are no longer necessary.  SMB1 is still enabled by default, even in Windows 10.
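For example, SMB1 can be checked and turned off with built-in cmdlets.  A sketch; test before rolling this out broadly, since some legacy devices and applications still depend on SMB1:

```powershell
# Check whether the SMB1 server protocol is currently enabled...
Get-SmbServerConfiguration | Select-Object EnableSMB1Protocol

# ...disable it on Windows 8.1 / Server 2012 R2 and later...
Set-SmbServerConfiguration -EnableSMB1Protocol $false -Force

# ...and on Windows 10, the SMB1 optional feature can be removed entirely.
Disable-WindowsOptionalFeature -Online -FeatureName SMB1Protocol
```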

Conclusion

These types of attacks are going to be the new normal, as they are extremely lucrative for the organizations behind them.  Proper preparation is key, as boards are starting to hold both the CEO and CIO responsible in the case of a breach.  While you may have cyber-security insurance, it may not pay out if you are negligent by not patching or by running an OS that stopped receiving security updates 3 years ago.  I recommend being prepared for the next attack, as you may not be as lucky next time.

Additional Layers of Defense to Consider

For those over-achievers, additional layers of defense can prove quite helpful in containing a breach.
  1. Office 365 Advanced Threat Protection – protect against bad attachments
  2. Windows Defender Advanced Threat Protection – post-breach response; isolate/quarantine infected machines
  3. OneDrive for Business – block known bad file types from syncing

Good luck out there.
