Category Archives: microsoft

The downside of “Windows as a service”: disappearing features (and why I will miss Paint)

Microsoft has posted a list of features that are “removed or deprecated” in the next major update to Windows 10, called the Fall Creators Update.

The two that caught my eye are Paint, a simple graphics editor whose ancestry goes right back to Windows 1.0 in 1985, and System Image Backup, a means of backing up Windows that preserves applications, settings and documents.

I use Paint constantly. It is ideal for cropping screenshots and photos, where you want a quick result with no need for elaborate image processing. It starts in a blink, lets you resize images while preserving aspect ratio, and supports .BMP, .GIF, .JPG, .PNG and .TIF – all the most important formats.

I used Paint to crop the following screen, of the backup feature to be removed.

[Screenshot: the System Image Backup option in Windows 10]

System Image Backup is the most complete backup Windows offers. It copies your system drive so that you can restore it to another hard drive, complete with applications and data. By contrast, the “modern” Windows 10 backup only backs up files and you will need to reinstall and reconfigure the operating system along with any applications if your hard drive fails and you want to get back where you were before. “We recommend that users use full-disk backup solutions from other vendors,” says Microsoft unhelpfully.
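For what it is worth, the same engine can also be driven from an elevated command prompt with wbadmin, at least while the underlying components remain in Windows. A minimal sketch, where the target drive E: is my own assumption:

wbadmin start backup -backupTarget:E: -include:C: -allCritical -quiet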

If System Image Backup does stop working, take a look at Disk2vhd, which is not entirely dissimilar but copies the drive to a virtual hard drive; or the third-party DriveSnapshot, which can back up and restore entire drives. Or of course one of many other backup systems.

The bigger picture here is that when Microsoft pitched the advantages of “Windows as a service”, it neglected to mention that features might be taken away as well as added.

Microsoft Edge browser crashing soon after launch: this time, it’s IBM Trusteer Rapport to blame

A common problem (I am not sure how common, but there are hundreds of reports) with the Edge browser in Windows 10 is that it gets into the habit of opening and then immediately closing, or closing when you try to browse the web.

I was trying to fix a PC with these symptoms. In the event log, an error was logged “Faulting module name: EMODEL.dll.” Among much useless advice out there, there is one suggestion that has some chance of working. You can reinstall Edge by following a couple of steps, as described in various places and consolidated in the sketch after the list below (though be warned you will lose ALL your Edge settings, favourites etc):

Delete C:\Users\%username%\AppData\Local\Packages\Microsoft.MicrosoftEdge_8wekyb3d8bbwe (a few files may get left behind)

Reboot

Run Powershell then Get-AppXPackage -Name Microsoft.MicrosoftEdge | Foreach {Add-AppxPackage -DisableDevelopmentMode -Register "$($_.InstallLocation)\AppXManifest.xml" -Verbose}
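Here is the same procedure pulled together as a couple of PowerShell lines, as a sketch only (run it as the affected user, and reboot between the two steps):

# Step 1: delete the Edge package folder for the current user (a few files may be left behind)
Remove-Item "$env:LOCALAPPDATA\Packages\Microsoft.MicrosoftEdge_8wekyb3d8bbwe" -Recurse -Force -ErrorAction SilentlyContinue
# Step 2: reboot, then re-register Edge
Get-AppXPackage -Name Microsoft.MicrosoftEdge | Foreach {Add-AppxPackage -DisableDevelopmentMode -Register "$($_.InstallLocation)\AppXManifest.xml" -Verbose}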

However this did not fix the problem – annoying after losing the settings. I was about to give up when I found this thread. The culprit, for some at least, is IBM Trusteer Rapport and its Early Browser Protection feature. I disabled this, rebooted, and Edge now works.

Failing that, you can stop or uninstall Rapport, and that should also fix the problem.
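If you only want to rule Rapport out temporarily, stopping its services may be enough. A sketch, using a wildcard because I have not checked the exact service name:

Get-Service *rapport*                         # find any Rapport services
Get-Service *rapport* | Stop-Service -Force   # stop them until the next reboot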

Unhealthy Identity Synchronization Notification: a trivial solution (and Microsoft’s useless troubleshooter)

If you use Microsoft’s AD Connect, also known as DirSync, you may have received an email like this:

[Screenshot: the “Unhealthy Identity Synchronization Notification” email]

It’s bad news: your Active Directory is not syncing with Office 365. “Azure Active Directory did not register a synchronization attempt from the Identity synchronization tool in the last 24 hours.”

I got this after upgrading AD Connect to the latest version, currently 1.1.553.

The email recommends you run a troubleshooting tool on the AD Connect server. I did that. Nothing wrong. I rebooted, it synced once, then I got another warning.

This is only a test system but I still wanted to find out what was wrong. I tweaked the sync configuration, again without fixing the issue.

Finally I found this post. Somehow, AD Connect had configured itself not to sync. You can get the current setting in PowerShell, using Get-ADSyncScheduler:

[Screenshot: Get-ADSyncScheduler output, with SyncCycleEnabled showing False]

As you can see, SyncCycleEnabled is set to false. The fix is trivial, just type:

Set-ADSyncScheduler -SyncCycleEnabled $true
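For reference, the check and the fix together look like this, run in an elevated PowerShell session on the AD Connect server (a sketch; the final line is optional, to trigger a sync straight away):

Import-Module ADSync                          # usually already available on the AD Connect server
Get-ADSyncScheduler                           # check the current scheduler settings
Set-ADSyncScheduler -SyncCycleEnabled $true   # re-enable the scheduled sync cycle
Start-ADSyncSyncCycle -PolicyType Delta       # optionally kick off a delta sync now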

Well, I am glad to fix it, but should not Microsoft’s troubleshooting tool find this simple configuration problem?

Licensing Azure Stack: it’s complicated (and why Azure Stack is the iPad of servers)

Microsoft’s Azure Stack is a pre-configured, cut-down version of Microsoft’s mighty cloud platform, condensed into an appliance-like box that you can install on your own premises.

Azure Stack is not just a new way to buy a bunch of Windows servers. Both the technical and the business model are different to anything you have seen before from Microsoft. On the technical side, your interaction with Azure Stack is similar to your interaction with Azure. On the business side, you are buying the hardware, but renting the software. There is no way, according to the latest pricing and licensing guide, to purchase a perpetual license for the software, as you can for Windows Server. Instead, there are two broad options:

Pay-as-you-use

In this model, you buy software services on Azure Stack in exactly the same way as you do on Azure. The fact that you have bought your own hardware gets you a discount (probably). The guide says “Azure Stack service fees are typically lower than Azure prices”.

Service and price:
Base virtual machine: $0.008/vCPU/hour ($6/vCPU/month)
Windows Server virtual machine: $0.046/vCPU/hour ($34/vCPU/month)
Azure Blob Storage: $0.006/GB/month (no transaction fee)
Azure Table and Queue: $0.018/GB/month (no transaction fee)
Azure App Service (Web Apps, Mobile Apps, API Apps, Functions): $0.056/vCPU/hour ($42/vCPU/month)
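To make those figures concrete, here is a rough back-of-envelope sum for a single Windows Server VM with 4 vCPUs left running for a month (the vCPU count and the hours are illustrative assumptions on my part):

$vcpu  = 4        # assumed VM size
$rate  = 0.046    # Windows Server VM rate in $/vCPU/hour, from the table above
$hours = 730      # approximate hours in a month
$vcpu * $rate * $hours   # roughly $134, consistent with the quoted $34/vCPU/month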

This has the merit of being easy to understand. It gets more complex if you take the additional option of using existing licenses with Azure Stack. “You may use licenses from any channel (EA, SPLA, Open, and others),” says the guide, “as long as you comply with all software licensing and product terms.” That qualification is key; those documents are not simple. Let’s briefly consider Windows Server 2016 Standard, for example. Licensing is per core. To install Windows Server 2016 Standard on a VM, you have to license all the cores in the physical server, even if your VM only has one virtual CPU. The servers in Azure Stack, I presume, have lots of cores. Even when you have done this, you are only allowed to install it on up to two VMs. If you need it on a third VM, you have to license all the cores again. Here are the relevant words:

Standard Edition provides rights for up to 2 Operating System Environments or Hyper-V containers when all physical cores in the server are licensed. For each additional 1 or 2 VMs, all the physical cores in the server must be licensed again.

Oh yes, and once you have done that, you need to purchase CALs as well, for every user or device accessing a server. Note too that on Azure Stack you always have to pay the “base virtual machine” cost in addition to any licenses you supply.
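As a worked example of that rule (the core count here is my assumption; real Azure Stack hosts may differ):

$physicalCores = 16                          # assumed number of cores in the host
$standardVMs   = 6                           # Standard edition VMs you want to run
$passes = [math]::Ceiling($standardVMs / 2)  # all cores licensed again for each pair of VMs
$passes * $physicalCores                     # 48 core licenses needed in total, before CALs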

This is why the only sane way to license Windows Server 2016 in a virtualized environment is to use the expensive Datacenter edition. Microsoft’s pay-as-you-use pricing will be better for most users.

Capacity model

This is your other option. It is a fixed annual subscription with two variants:

App Service, base virtual machines and Azure Storage: $400 per core per year
Base virtual machines and Azure Storage only: $144 per core per year

The Capacity Model is only available via an Enterprise Agreement (500 or more users or devices required); and you still have to bring your own licenses for Windows Server, SQL Server and any other licensed software required. Microsoft says it expects the capacity model to be more expensive for most users.

SQL Server

There are two ways to use SQL Server on Azure. You can use a SQL database as a service, or you can deploy your own SQL Server in a VM.

The same is true on Azure Stack; but I am not clear how licensing works if you offer SQL databases as a service. In the absence of any other guidance, it looks as if you will have to bring your own SQL Server license, which will make this expensive. However it would not surprise me if this ends up as an option in the pay-as-you-use model.

Using free software

It is worth noting that costs for both Azure and Azure Stack come way down if you use free software, such as Linux rather than Windows Server, and MySQL rather than SQL Server. Since Microsoft is making strenuous efforts to make its .NET application development framework cross-platform, that option is worth watching.

Support

You will have to get support for Azure Stack, since it is not meant to be user-serviceable. And you will need two support contracts, one with Microsoft, and one with your hardware provider. The hardware support is whatever you can negotiate with the hardware vendor. Microsoft support will be part of your Premier, Azure or Partner support in most cases.

Implications of Azure Stack

When Microsoft embarked on its Azure project, it made the decision not to use System Center, its suite of tools for managing servers and “private cloud”, but to create a new way to manage servers that is better automated, more scalable, and easier for end-users. Why would you use System Center if you can use Azure Stack? Well, one obvious reason is that with Azure Stack you are ceding a lot of control to Microsoft (and to your hardware supplier), as well as getting pushed down a subscription path for your software licensing. If you can handle that though, it does seem to me that running Azure Stack is going to be a lot easier and more productive than building your own private cloud, for most organizations.

This presumes of course that it works. The big risk with Azure Stack is that it breaks; and your IT administrators will not know how to fix it, because that responsibility has been outsourced to your hardware vendor and to Microsoft. It is possible, therefore, that an Azure Stack problem will be harder to solve than other typical Windows platform failures. A lot will depend on the quality control achieved both by Microsoft, for the software, and its hardware partners.

Bottom line: this is the iPad of servers. You buy it but don’t really control it, and it is a delight to use provided it works.

No more infrastructure roles for Windows Nano Server, and why I still like Server Core

Microsoft’s General Manager for Windows Server Erin Chapple posted last week about Nano Server (under a meaningless PR-speak headline) to explain that Nano Server, the most stripped-down edition of Windows Server, is being repositioned. When it was introduced, it was presented not only as a lightweight operating system for running within containers, but also for infrastructure roles such as hosting Hyper-V virtual machines, hosting containers, file server, web server and DNS Server (but without AD integration).

In future, Nano Server will be solely for the container role, enabling it to shrink in size (for the base image) by over 50%, according to Chapple. It will no longer be possible to install Nano Server as a standalone operating system on a server or VM. 

This change prompted Microsoft MVP and Hyper-V enthusiast Aidan Finn to declare Nano Server all but dead (which I suppose it is from a Hyper-V perspective) and to repeat his belief that GUI installs of Windows Server are best, even on a server used only for Hyper-V hosting.

Finn writes: “Prepare for a return to an old message from Microsoft, ‘We recommend Server Core for physical infrastructure roles.’ See my counter to Nano Server. PowerShell gurus will repeat their cry that the GUI prevents scripting. Would you like some baloney for your sandwich? I will continue to recommend a full GUI installation. Hopefully, the efforts by Microsoft to diminish the full installation will end with this rollback on Nano Server.”

Finn’s main argument is that the full GUI makes troubleshooting easier. Server Core also introduces a certain amount of friction as most documentation relating to Windows Server (especially from third parties) presumes you have a GUI and you have to do some work to figure out how to do the same thing on Core.
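In practice that work usually amounts to finding the PowerShell equivalent of a Server Manager click. A couple of typical examples, as a sketch (the role and firewall rule group here are chosen purely for illustration):

Install-WindowsFeature FS-FileServer                     # add the File Server role
Enable-NetFirewallRule -DisplayGroup "Remote Desktop"    # allow inbound Remote Desktop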

Nevertheless I like Server Core and use it where possible. The performance overhead of the GUI is small, but running Core does significantly reduce the number of security patches and therefore required reboots. Note that you can run GUI applications on Server Core, if they are written to a subset of the Windows API, so vendors that have taken the trouble to fix their GUI setup applications can support it nicely.

Another advantage of Server Core, in the SMB world where IT policies can be harder to enforce, is that users are not tempted to install other stuff on their Server Core Domain Controllers or Hyper-V hosts. I guess this is also an advantage of VMware. Users log in once, see the command-line UI, and do not try installing file shares, print managers, accounting software, web browsers (I often see Google Chrome on servers because users cannot cope with IE Enhanced Security Configuration), remote access software and so on.

Only developers now need to pay attention to Nano Server, but that is no reason to give up on Server Core.

OneDrive Files on Demand is back – will users get confused? And how does it look to applications?

Microsoft is restoring a much-requested feature to its OneDrive cloud storage: placeholders, or what is now called Files on Demand.

The issue is that when users have files in cloud storage, they want easy access to them at any time, but downloading everything to local storage may use too much disk space. There are also scenarios where you do not want a local copy, for example for confidential documents, especially if you do not enable Bitlocker encryption.

You can use OneDrive through the web browser, but Windows users expect File Explorer integration, the most natural way of working.

Windows 8.1 introduced placeholders, where OneDrive (then SkyDrive) files appeared in File Explorer but were not actually downloaded until you opened them. It was a popular feature, but Microsoft removed it in Windows 10, saying that users found it confusing. I suppose they might have thought a file was on their PC, boarded a plane, and then discovered they could not work on the document because it was not actually there.

This was a user interface issue, but apparently there were other technical issues, particularly for applications using the Windows file APIs. Perhaps the problems were so intricate that the team did not think they could be fixed in the first releases of Windows 10.

Now the feature is back, and I have installed it on the latest Windows Insider build:

[Screenshot: OneDrive Files on Demand enabled on the latest Windows Insider build]

But could users still be confused? Files in OneDrive now have four possible states:

Hidden: You can still choose not to make all folders visible in File Explorer. In fact, hidden seems to be the default for folders previously not synced to the PC, though you can easily check an option to show them all:

[Screenshot: the OneDrive settings option to show all folders]

Online-only: files have a cloud icon and are not downloaded until you open them:

[Screenshot: online-only files shown with a cloud icon in File Explorer]

Locally available:

[Screenshot: locally available files in File Explorer]

Always available:

[Screenshot: always available files in File Explorer]

So what is the difference between “Locally available” and “Always available”? It really is not explained here but my assumption is that locally available files could automatically revert to online-only if there is pressure on disk space. It could catch you out, if you saw that a file was locally available and relied on that, only to find that Windows automatically reverted it without you realising.

If you right-click a file in OneDrive you can change its status or share a link. If you want to make a file online-only, you choose Free up space (I think it would be clearer if this option were called Online-only, but this is a preview so it might change).

[Screenshot: the OneDrive right-click menu, including the Free up space option]

How do online-only files look to applications? I ran up Visual Studio and wrote a utility that iterates through a folder and shows the file name and length:

[Screenshot: the utility’s output, listing file names and sizes]

You will note that the API reported the full size of the file as stored online, not its size on disk. This is the kind of thing that can cause issues, though if the file size were reported as zero bytes – well, that could cause issues too.
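You can see something similar without writing a utility; from PowerShell, for instance (a sketch, assuming the default OneDrive folder location):

# List OneDrive files with their reported size and attributes
Get-ChildItem "$env:USERPROFILE\OneDrive" -File | Select-Object Name, Length, Attributes

If the behaviour matches the utility above, Length shows the online size, and the Attributes column is the main clue that a file is only a placeholder.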

Incidentally, you can also now sort files in File Explorer by Status. I imagine the latest Windows 10 SDK will also have a way to report status so that applications can catch up.

Server shipments decline as customers float towards cloud

Gartner reports that worldwide server shipments have declined by 4.2% in the first quarter of 2017.

Not a surprise considering the growth in cloud adoption but there are several points of interest.

One is that although Hewlett Packard Enterprise (HPE) is still ahead in revenue (over $3 billion revenue and 24% market share), Dell EMC is catching up, still number two with 19% share but posting growth of 4.5% versus 8.7% decline for HPE.

In unit shipments, Dell EMC is now fractionally ahead, with 17.9% market share and growth of 0.5% versus HPE at 16.8% and decline of 16.7%.

Clearly Dell is doing something right where HPE is not, possibly through synergy with its acquisition of storage vendor EMC (announced October 2015, completed September 2016).

The larger picture though is not great for server vendors. Businesses are buying fewer servers since cloud-hosted servers or services are a good alternative. For example, SMBs which in the past might have run Exchange on-premises are tending to migrate to Office 365 or perhaps G Suite (Google Apps). Maybe there is still a local server for Active Directory and file server duties, or maybe just a NAS (Network Attached Storage).

It follows that the big cloud providers are buying more servers, but such is their size that they do not need to buy from Dell or HPE; they can go directly to ODMs (Original Design Manufacturers) and tailor the hardware to their exact needs.

Does that mean you should think twice before buying new servers? Well, it is always a good idea to think twice, but it is worth noting that going cloud is not always the best option. Local servers can be much cheaper than cloud VMs as well as giving you complete control over your environment. Doing the sums is not easy and there are plenty of “it depends”, but it is wrong to assume that cloud is always the right answer.

PowerShell documentation stubs: frustrating for users

I’ve been writing a piece on PowerShell, Microsoft’s generally excellent scripting and automation platform. PowerShell is also largely open source, in its cross-platform, PowerShell Core guise.

Of course I went straight to the official documentation as part of my research. Looks good; but I was puzzled. I would find a promising topic like Object Pipeline, which says:

In this chapter, we will describe how the Windows PowerShell pipeline differs from the pipelines of most popular shells, and then demonstrate some basic tools that you can use to help control pipeline output and also to see how the pipeline operates.

Then I clicked around to read the chapter, and struggled to find the content.

Eventually I figured out the problem, by going to the GitHub repository for the documentation. This content is not yet written. Microsoft’s JuanPablo Jofre has written stubs, either with the intention of completing them later, or perhaps in the hope that the community will step up and help.

Open Source is great, but the user experience of finding stub documents in official documentation, not clearly marked as such, is frustrating.

I do not recall this kind of issue in Microsoft documentation written in the old closed-source world, which makes me wonder if the documentation team is under-resourced – though the most important part of the documentation, the cmdlet reference, is pretty good in my experience.

My view of PowerShell is that it is now a critically important part of the Microsoft platform. It is not only a tool for managing Windows Server, but also for Microsoft’s online services such as Office 365, Azure Active Directory and other Azure services.
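As a flavour of that, and assuming the MSOnline module is installed and you have suitable credentials, a couple of lines is enough to query Office 365 from PowerShell (a sketch only):

Connect-MsolService                                                                      # sign in to Azure AD / Office 365
Get-MsolUser -All | Where-Object {-not $_.IsLicensed} | Select-Object UserPrincipalName  # list unlicensed users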

It is worth getting the documentation right.

Windows S: another go at locking down Windows, but the Store is not ready and making it ready is a challenge

There were two big ideas behind Surface RT and Windows RT, the 2012 Windows 8 project which left Microsoft (and some OEM partners) with a mountain of unsold hardware. One was to compete with iPads and Android tablets by making Windows a touch-friendly operating system. The second was that Windows had to move on from being vulnerable to being damaged or completely broken by applications. Traditional Windows applications have installers that run with full admin rights and there is nothing much to stop them installing files in the wrong places, setting themselves to start up automatically, or bloating the Registry (the central configuration database in Windows). “My PC is so slow” is a common complaint, and the cumulative effect of successive application installs is one of the key reasons. Vulnerability to malware is another problem, and one which anti-virus software can never solve completely.

Windows RT solved these problems by disallowing application installs other than via the Windows Store. At that time, Windows Store apps were also locked down, so that a malware infection was only possible if there were a bug in the operating system.

Why did Surface RT and Windows RT fail? The ARM-based hardware was rather slow, which was one of the issues, but a more serious flaw was the lack of compelling applications in the Store. Why was that? Complex reasons, but the chief one is that Windows RT was caught in a cycle of failure. Developers want to make money, and the Windows 8 Store was not sufficiently popular with users to give them a big market. At the same time, users who tried the Store found few applications worth their time, and therefore rarely used it.

The problem was compounded by the unpopularity of Windows 8, which was an unfamiliar environment for the existing Windows users who formed the primary market.

Nevertheless, the thinking behind Windows 8 and Windows RT was not completely off the mark. If only it could get over the hump of unpopularity and lack of apps, it could usher in a new era of Windows devices that were secure, touch-friendly, and resistant to performance decay.

It never did, and with Windows 10 Microsoft appeared to give up. The desktop was back, mouse and keyboard was again primary, and Store apps now ran in windows on the desktop. A special Tablet Mode attempted to make Windows 10 as touch-friendly as Windows 8, but did not succeed.

Windows still has those problems though, the ones which Windows RT was intended to solve. Could there be another approach which would fix those issues but in a manner more acceptable to users?


Windows S and the Surface Laptop, announced today in New York, are the outcome. It is still Windows 10, but Microsoft has flipped a switch that requires all apps to be installed from the Windows Store. This switch is already in the latest version of Windows 10, the Creators Update, but off by default:

[Screenshot: the Creators Update setting that restricts app installs to the Store]

Microsoft has also taken steps to make the Store more attractive for developers. It is no longer necessary to develop apps on a new platform within Windows, as it was for the Windows 8 Store. Now you can simply take your existing desktop application and wrap it to enable Store download. This feature is called the Desktop Bridge, or Project Centennial. Applications so wrapped are not as secure as Windows 8 Store apps were; they can write to files anywhere that the user has permission. At the same time, Microsoft has taken steps to make Desktop Bridge apps better isolated than normal desktop applications. You can read the details of how this works here. It is arranged that applications install all files to a private location, instead of system locations, and that Windows hides this fact from the application code by using redirection. The same is true of the registry. This approach means that file version problems and registry bloat are much less likely. Such issues are still possible because the Desktop Bridge does not redirect file or registry calls outside the application package; these are allowed if the user has permission, for compatibility reasons. Nevertheless, it is a big advance on old-style Windows desktop application installs.
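The wrapping is done with the Desktop App Converter. I have not verified the exact parameters below, so treat the names and values as assumptions rather than a recipe, but the shape of the command, run from PowerShell, is something like this:

# Package an existing desktop installer as a Store app (.appx); parameter names assumed, check the current docs
DesktopAppConverter.exe -Installer C:\Installer\MyAppSetup.exe -InstallerArguments "/S" `
  -Destination C:\Output -PackageName "MyApp" -Publisher "CN=MyPublisher" -Version 1.0.0.0 -MakeAppx -Verbose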

When the user removes a Desktop Bridge application, in most cases all its files and registry entries are cleanly removed.

An important additional protection is that applications submitted to the Store are vetted by Microsoft, so malicious or badly behaved instances should not get through.

Windows S will be installed by default both on Surface Laptop and on a new generation of low-end laptops aimed mainly at the education market.

The benefits of Windows S are real; but unfortunately Microsoft still has not solved the Store problem. Currently, your favourite Windows applications are not in the Store. Microsoft Office will be there, thanks to the Desktop Bridge, but many others are not.


Microsoft’s big bet is that thanks to Windows S and other initiatives, the Store will be sufficiently attractive to developers, and sufficiently easy to target, that it will soon offer a full range of applications including all your favourites.

Right now though, if you get a Windows S laptop, you will probably end up buying the upgrade to Windows 10 Pro, for $49.00 or equivalent. Then you can install any Windows desktop application. However, by doing so you make it unnecessary for developers to bother using Desktop Bridge to wrap their applications – so they might never do so.

Windows S has a few other limitations:

Microsoft Edge is the default web browser on Windows 10 S. You are able to download another browser that might be available from the Windows Store, but Microsoft Edge will remain the default if, for example, you open an .htm file. Additionally, the default search provider in Microsoft Edge and Internet Explorer cannot be changed.

In addition, it cannot join a local Windows domain (a problem for many businesses), though it can join Azure AD, the Office 365 directory.

Microsoft’s goal here is worthwhile: to move Windows into a new place in terms of security and resilience. Getting it there though will not be easy.

Xamarin Challenge shows bumps in Microsoft’s path to cross-platform mobile

Microsoft ran a Xamarin Challenge over on Paul Thurrott’s site. The idea was to demo how to build a cross-platform mobile app with Microsoft’s cross-platform mobile toolkit.

The challenge was in three steps. You build a weather app, complete with crash analytics on the Visual Studio Mobile Center.

[Screenshot: the completed Xamarin Challenge weather app]

Someone did a lot of work on this, and the app looks pretty and works nicely once you get it running.

Despite this, I am not sure that the challenge was altogether successful. It is a step-by-step guide which in theory requires no developer expertise, as you simply copy and paste code into your project. I am not sure that is the best way to learn, but that is by the by. I doubt that learning how to code for Xamarin was the primary goal of the challenge. I’d guess it was more about showing how easily you can build a cross-platform app (Android, iOS and Windows UWP) using Xamarin, C# and Visual Studio 2017.

Well, in fact a little bit of developer expertise was required to complete the challenge, because the step-by-step instructions did not quite work (in my experience). I did not make a note of all the times I had to do something not in the given steps, but there were many occasions, the main issues being around using the Visual Studio Android emulator, NuGet package management, and a few small tweaks to the code itself. The code as given made no allowance for the cloud services it called being offline, or the connection to the internet not being available, and would simply crash in that case.

The challenge could be an excellent resource for Microsoft and Xamarin if the company drills down into the problems developers experienced trying to complete the challenge, recorded in this forum thread. Here are a few examples:

Myself and 5 other developers in our office attempted the challenge and none of us have been able to get past the first challenge. We are not Microsoft Visual Studio experts so we had hoped following the provided instructions would be sufficient.

The upload was failing on a discrepancy between 2 different versions of the Json package, which somehow had crept into the project. Installing over 40 updates in Nuget resolved this.

Many thanks for running this challenge –this was very useful and worthwhile. I just wish modern development did not feel like trying to dance on a mile high stack of chairs with a leg missing on the bottommost one!

I got a late start on the challenge and was able to complete part 1 pretty quickly but was only able to run the UWP locally. I cannot seem to get either the Windows mobile emulator or Android emulators to run successfully. I can’t deploy to the Window Mobile emulator, it returns an error indicating the emulator failed to start. As for the Android emulator, it launches, but the emulator does not have a connection to the network, so the application encounters an exception.

Note that those posting to the forum were more likely to be the ones with problems; there could in theory be many others who breezed through without any issues. But as one participant writes, “I’d be interested in what percentage of participants actually got to the end of the challenge with no problems.”

I like Xamarin; it does an amazing job in enabling cross-platform development with C# and it would be my tool of choice for cross-platform mobile development. It is not always straightforward though, and the kinds of issues experienced by the challenge participants illustrate what can go wrong.

If you just use the native toolkits, such as Android Studio and Xcode, you will have a smoother experience, but of course miss out on the productivity benefit of cross-platform code. That is the trade-off you make.