Tag Archives: windows

Ford Microsoft car makes an appearance at Mobile World Congress

At the Showstoppers event just before the Mobile World Congress in Barcelona, it was hard to miss the Ford car emblazoned with SYNC, Ford and Microsoft branding.

[Image: the SYNC-branded Ford car at the Showstoppers event]

So what is this all about? Apparently, the European launch of in-car computers that hook up to Ford’s cloud services. Cue all the jokes about “if your car ran Windows.”

You have to provide the connectivity, for example by docking your smartphone. You can then stream music with voice control, make calls (again with voice control) or, if you hear a funny noise, send a diagnostic report on your car to Ford or perhaps your dealer.

Why bother with an in-car computer running Windows Embedded, when you could just dock a smartphone and let that do all the work? That was my question too, though there are integration benefits. Some details are being held back for an announcement tomorrow.

By the way, if you think the picture is rubbish, blame the Samsung Slate 7, which was used to create this entire post.

The confusing state of Microsoft’s TMG and UAG firewall and proxy software

I have been trying out Microsoft’s ForeFront Unified Access Gateway (UAG) recently, partly because it is the only supported way to publish a SharePoint site for Windows Phone. This was my first go with the product, though I am already familiar with the Threat Management Gateway (TMG) and its predecessor Internet Security and Acceleration Server (ISA) – and before that Proxy Server, dubbed “Poxy Server” by admins frustrated with its limitations. All these products are related, and in the case of UAG and TMG, more closely than I realised.

Note that Microsoft has indicated that the current version of TMG, 2010, is the last. What is happening to UAG is less clear.

What I had not realised until now is that TMG installs as part of UAG, though you are not meant to use it except for a few limited purposes. It is mainly there to protect the UAG server. The product positioning seems to be this:

  • Use UAG for publishing applications such as SharePoint, Direct Access (access to Windows file shares over the internet) and Exchange. It is essentially a reverse proxy, a proxy for publishing and protecting server applications.
  • Use TMG for secure internet access for users on your network.

This means that if you want to use Microsoft’s platform for everything possible, you are expected to run both UAG and TMG. That is OK for enterprises but excessive for smaller organisations. It is odd, in that TMG is also a capable reverse proxy. TMG is also easier to use, though that says more about the intricate user interface of UAG than it does about the usability of TMG. Neither product can be described as user friendly.

The complexity of the product is likely to be one of the reasons TMG is now being discontinued. It is a shame, because it is a decent product. The way TMG and ISA are designed to work is that all users have to authenticate against the proxy before being allowed internet access. This gives administrators a high degree of control and visibility over which users access which sites using which protocol.
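
From the client side, that authentication step looks roughly like the sketch below: a minimal C++/WinHTTP illustration, not anything specific to TMG. The proxy address, domain and credentials are placeholders, and real code would send the request once, inspect the 407 challenge and query the schemes the proxy actually offers (WinHttpQueryAuthSchemes) before choosing one, rather than assuming NTLM as this does.

```cpp
// Sketch: fetching a page through an authenticating proxy using WinHTTP.
// The proxy name, domain, user name and password are placeholders.
#include <windows.h>
#include <winhttp.h>
#include <cstdio>
#pragma comment(lib, "winhttp.lib")

int main() {
    // Route traffic through a named proxy, as a machine behind TMG/ISA would.
    HINTERNET hSession = WinHttpOpen(L"ProxyAuthDemo/1.0",
        WINHTTP_ACCESS_TYPE_NAMED_PROXY,
        L"proxy.example.local:8080",            // placeholder proxy address
        WINHTTP_NO_PROXY_BYPASS, 0);
    HINTERNET hConnect = hSession ? WinHttpConnect(hSession,
        L"www.example.com", INTERNET_DEFAULT_HTTPS_PORT, 0) : NULL;
    HINTERNET hRequest = hConnect ? WinHttpOpenRequest(hConnect, L"GET", L"/",
        NULL, WINHTTP_NO_REFERER, WINHTTP_DEFAULT_ACCEPT_TYPES,
        WINHTTP_FLAG_SECURE) : NULL;

    if (hRequest) {
        // Supply proxy credentials up front; production code would first look
        // at the 407 response and at WinHttpQueryAuthSchemes before choosing.
        WinHttpSetCredentials(hRequest, WINHTTP_AUTH_TARGET_PROXY,
            WINHTTP_AUTH_SCHEME_NTLM, L"EXAMPLE\\someuser", L"password", NULL);

        if (WinHttpSendRequest(hRequest, WINHTTP_NO_ADDITIONAL_HEADERS, 0,
                WINHTTP_NO_REQUEST_DATA, 0, 0, 0) &&
            WinHttpReceiveResponse(hRequest, NULL)) {
            DWORD status = 0, size = sizeof(status);
            WinHttpQueryHeaders(hRequest,
                WINHTTP_QUERY_STATUS_CODE | WINHTTP_QUERY_FLAG_NUMBER,
                WINHTTP_HEADER_NAME_BY_INDEX, &status, &size,
                WINHTTP_NO_HEADER_INDEX);
            printf("Status through the proxy: %lu\n", status);
        }
    }
    if (hRequest) WinHttpCloseHandle(hRequest);
    if (hConnect) WinHttpCloseHandle(hConnect);
    if (hSession) WinHttpCloseHandle(hSession);
    return 0;
}
```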

Unfortunately this kind of locked-down internet access is inconvenient, particularly when there are a variety of different types of device in use. In many cases admins have to enable SecureNAT, or in other words unauthenticated access, partly defeating the purpose, but there is little choice.

ISA Server used to be supplied as part of Small Business Server (SBS); but when I spoke to Microsoft about why it was dropped in SBS 2008, I was told that few used it. Businesses preferred a hardware solution, whether a cheap router modem from the likes of Netgear or Linksys, or a security appliance from a company like Sonicwall, Cisco or Juniper.

The hardware companies sell the idea that a hardware appliance is more secure, because it is not vulnerable to Windows or Linux malware. There is something in the argument, but note that all security appliances are more software than hardware, and that a Windows box will be patched more regularly. ISA’s security record was rather good.

My hunch is that ease of use was a bigger factor for small businesses. Getting ISA or TMG to do what you want can be even more challenging than working out the user interface of a typical hardware appliance, though perhaps not with the more complex high-end units.

As for UAG, I have abandoned the idea of testing it for the moment. One of the issues is that my test setup has only one external IP. UAG is too elaborate for a small network like mine. I am sticking with TMG.

Windows on ARM: Microsoft can write Desktop apps, but you cannot

Microsoft’s Windows chief Steven Sinofsky has written a long post describing Windows on ARM (WOA), which he says is a:

new member of the Windows family, much like Windows Server, Windows Embedded, or Windows Phone

There are many points of interest in the post, but the one which stands out for me is that while the traditional Windows desktop exists in WOA, third-party applications will not be allowed there:

Developers with existing code, whether in C, C++, C#, Visual Basic, or JavaScript, are free to incorporate that code into their apps, so long as it targets the WinRT API set for Windows services. The Windows Store can carry, distribute, and service both the ARM and x86/64 implementations of apps (should there be native code in the app requiring two distributions).

says Sinofsky. He writes with extreme care on this issue, since the position for which he argues is finely nuanced. Why have the Windows desktop on WOA at all?

Some have suggested we might remove the desktop from WOA in an effort to be pure, to break from the past, or to be more simplistic or expeditious in our approach. To us, giving up something useful that has little cost to customers was a compromise that we didn’t want to see in the evolution of PCs

he says, while also saying:

WOA (as with Windows 8) is designed so that customers focused on Metro style apps don’t need to spend time in the desktop.

From a developer perspective, the desktop is more than just a different Windows shell. Apps that run on the Windows Runtime (WinRT) are isolated from each other and can call only a limited set of “safe” Windows APIs, protecting users from malware and instability, but also constraining their capabilities. The desktop by contrast is the old Windows, an open operating system. On Windows 8 on Intel, most things that run on Windows 7 today will still work. On WOA though, even recompilation to target the ARM architecture will not help you, since Microsoft will not let desktop apps install:

Consumers obtain all software, including device drivers, through the Windows Store and Microsoft Update or Windows Update.

What if you really want to use WOA, but have some essential desktop application without which you cannot do your work, and which cannot quickly and easily be ported to WinRT? Microsoft’s answer is that you must use Windows on Intel.
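
To make the gap concrete, here is a minimal Win32 sketch (purely illustrative) of the kind of thing an ordinary desktop application does without a second thought: reading an arbitrary registry key and launching another process. As far as I can tell, neither has a counterpart in the sandboxed WinRT API set, which is why a straight recompile for ARM would not get a desktop app into WOA.

```cpp
// A trivial Win32 desktop-style program: it reads a registry value and then
// launches another process. Calls like these illustrate the open desktop API;
// a sandboxed WinRT app is confined to a much narrower, brokered set of APIs.
#include <windows.h>
#include <cstdio>
#pragma comment(lib, "advapi32.lib")

int main() {
    // Read the Windows product name straight from the registry.
    wchar_t product[256] = L"";
    DWORD size = sizeof(product);
    if (RegGetValueW(HKEY_LOCAL_MACHINE,
            L"SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion",
            L"ProductName", RRF_RT_REG_SZ, NULL, product, &size) == ERROR_SUCCESS) {
        wprintf(L"Running on: %ls\n", product);
    }

    // Launch another executable, something a desktop app can do at will.
    STARTUPINFOW si = { sizeof(si) };
    PROCESS_INFORMATION pi = {};
    wchar_t cmd[] = L"notepad.exe";     // CreateProcess may modify this buffer
    if (CreateProcessW(NULL, cmd, NULL, NULL, FALSE, 0, NULL, NULL, &si, &pi)) {
        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
    }
    return 0;
}
```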

That said, Microsoft itself has this problem in the form of Office, its productivity suite. Microsoft’s answer to itself is to run it on the desktop:

Within the Windows desktop, WOA includes desktop versions of the new Microsoft Word, Excel, PowerPoint, and OneNote, codenamed “Office 15”.

No Outlook, which I take to imply that a new WinRT-based Exchange client and PIM (Personal Information Manager) is on the way – a good thing.

Microsoft’s aim is to give customers the security and stability of a locked-down machine, while still offering a full version of Office. If you think of this as something like an Apple iPad but with no-compromise document editing and creation, then it sounds compelling.

At the same time, some users may be annoyed that the solution Microsoft has adopted for its legacy desktop application suite is not also available to them.

The caveat: it is not clear in Sinofsky’s post whether there may be some exceptions, for example for corporate deployments, or for hardware vendors or mobile operators. It will also be intriguing to see how Office 15 on ARM handles extensibility, for example with Office add-ins or Visual Basic macros. I suspect they will not be supported, but if they are, then that would be a route to a kind of desktop programming on WOA.

It will be interesting to see how Microsoft locks down Explorer, which Sinofsky says is present:

You can use Windows Explorer, for example, to connect to external storage devices, transfer and manage files from a network share, or use multiple displays, and do all of this with or without an attached keyboard and mouse—your choice.

By the way, this is a picture of the Windows ARM desktop as it looked at the BUILD conference last September. The SoC (System on a Chip) on this machine is from NVIDIA.

Cross-platform Windows and Mac lifts Delphi sales by 54%

Embarcadero has announced 54% growth in sales of Delphi and C++ Builder, its rapid application development tools, in 2011 vs 2010. These tools primarily target Windows, but in the 2011 XE2 edition also support Mac and iOS applications. XE2 also added a 64-bit compiler, making this the most significant Delphi release for years. The company says that the 2011 figures come on top of 15% year on year growth in the previous three years.

This is encouraging for Delphi developers, and well deserved in that Delphi still offers the most productive environment for native code development on Windows. The cross platform aspect is also interesting, though the FireMonkey framework which enables it is less mature than the old VCL, and there are many other options out there for cross-platform apps. FireMonkey does not yet support Android or other mobile platforms apart from Apple iOS.

2012 is also the year of Windows 8, raising the question of whether Delphi and C++ Builder will support the new Windows Runtime (WinRT) in future, and if they do, whether this will be FireMonkey only, or whether it could work with a XAML-defined user interface.

Nokia results: hope for Windows Phone?

It is almost one year since Nokia’s dramatic announcement that it would transition its smartphone range to Windows Phone. Today the company released its results for the fourth quarter and for the full year 2011, the first since the release of the Lumia range of Windows Phone devices. How is it doing?

This is one you can spin either way. The negative view: Nokia is losing money. Sales are down 21% year on year for the quarter and 9% for the full year, and the company reported an operating loss of just over a billion Euro for the year, most of which was in the last quarter.

If you look at the quarter on quarter device sales, they are down in both smart devices and mobile phones. The Symbian business has not held up as well as the company hoped:

changing market conditions are putting increased pressure on Symbian. In certain markets, there has been an acceleration of the anticipated trend towards lower-priced smartphones with specifications that are different from Symbian’s traditional strengths. As a result of the changing market conditions, combined with our increased focus on Lumia, we now believe that we will sell fewer Symbian devices than we previously anticipated.

says the press release. As for Windows Phone and Lumia, CEO Stephen Elop says that “well over 1 million Lumia devices” have been sold: a start, but still tiny relative to Apple iOS and Google Android. Elop cleverly calls it a “beachhead”, but given the energy Nokia put into the launch I suspect it is disappointed with the numbers.

Put this in context though and there are reasons for hope. First, Nokia’s speed of execution is impressive, from announcement to the first Windows Phones in nine months or so. Further, the Lumia (judging by the Lumia 800 I have been using) does not feel like a device rushed to market. The design is excellent, and within the small world of Windows Phone 7 hardware Nokia has established itself as the brand of first choice.

Second, despite the dismal sales for Windows Phone 7 since its launch, there are signs that Microsoft may yet emerge from the wreckage inflicted on the market by iOS and Android in better shape than others. WebOS is all but gone. RIM has yet to convince us that it has a viable recovery strategy. Intel’s Tizen is just getting started. If Microsoft has a successful launch for Windows 8, Elop’s “third ecosystem” idea may yet come to fruition.

Third, Nokia has already shown that it is better able to market Windows Phone 7 than Microsoft itself, or its other mobile partners. Lumia made a good splash at CES in January, and the platform may gain some market share in the influential US market.

Nokia is not just Windows Phone though, and even if its smartphone strategy starts to work it has those falling Symbian sales to contend with. It will not be easy, even taking an optimistic view.

Nor will it be easy for Windows 8 to succeed in a tablet market owned by Apple at the high end and by Amazon/Android at the low end.

Microsoft financials: Windows under stress, Server and Office making up the difference

If we are really in the post-PC era, then one of two things will happen. Either Microsoft will make a big success of non-PC products, or it will start delivering shocking financial results. Neither is yet true. Here are the results just announced, broken down into a simple table.

Quarter ending December 31st 2011 vs quarter ending December 31st 2010, $millions

Segment | Revenue | Change | Profit | Change
Client (Windows + Live) | 4736 | -320 | 2850 | -64
Server and Tools | 4772 | +484 | 1996 | +285
Online | 784 | +71 | -458 | +101
Business (Office) | 6279 | +169 | 4152 | +65
Entertainment and devices | 4237 | +539 | 528 | -138

A few observations. Server revenue (though not profit) exceeded client revenue; I am not sure if this is the first time it has done so, but it is unusual. The Office division enjoyed a remarkable quarter, and the press release mentions 10% growth in Exchange and SharePoint, and 30% growth (from a smaller base) in Lync and Dynamics CRM. Azure? Not mentioned so I presume revenue is small.

Where is Office 365? Somewhere in the Office figures I would guess; and once again, since it is not mentioned, I think we can assume it is not delivering a large amount of revenue yet. I would like to know more though.

What Microsoft calls Online comprises Bing search, online services and advertising income. Another hefty loss, but revenue is up, the loss somewhat reduced, and Microsoft claims that “Bing-powered US market share, including Yahoo! properties, was approximately 27%”. Not bad.

This is the big quarter for gaming, and Xbox delivered accordingly. The faltering Windows Mobile and Windows Phone 7 are lost somewhere in those Xbox numbers, and again their revenue is not mentioned in the press release.

Meet Resilient File System (ReFS), a new file system for Windows

Microsoft has announced the Resilient File System (ReFS), a replacement for the NTFS file system which has been used since the first release of Windows NT in 1993.

The new file system increases limits in NTFS as follows:

Limit | NTFS | ReFS
Max file size | 2^64 - 1 bytes | 2^64 - 1 bytes
Max volume size | 2^40 bytes | 2^78 bytes
Max files in a directory | 2^32 - 1 (per volume) | 2^64
Max file name length | 32K unicode (255 unicode) | 32K unicode
Max path length | 32K | 32K

I have done my best to set out the NTFS limits but it is complicated, and there are limitations in the Windows API as well as in NTFS. See this article for more on NTFS limits; and this article for an explanation of file name and path length limits in the Windows API.
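
Those Windows API limits are easy to bump into in practice. The sketch below, which assumes a writable C:\Temp and uses arbitrary example paths, shows the classic case: many Win32 functions reject paths longer than MAX_PATH (260 characters) unless you use the extended-length \\?\ prefix, regardless of what NTFS or ReFS can store on disk.

```cpp
// Illustrates the Win32 path length limit: ordinary paths are rejected beyond
// MAX_PATH (260 characters), while the extended-length \\?\ prefix allows much
// longer paths (up to roughly 32K characters) for the Unicode file APIs.
#include <windows.h>
#include <string>
#include <cstdio>

int main() {
    CreateDirectoryW(L"C:\\Temp", NULL);          // example base path
    std::wstring base = L"C:\\Temp";
    std::wstring component(200, L'a');            // each component stays under 255 chars

    // Build nested directories until the full path exceeds MAX_PATH.
    std::wstring plain = base;
    for (int i = 0; i < 2; ++i) {
        plain += L"\\" + component;
        if (!CreateDirectoryW(plain.c_str(), NULL)) {
            DWORD err = GetLastError();
            if (err != ERROR_ALREADY_EXISTS) {
                printf("Plain path failed at %zu chars, error %lu (MAX_PATH is %d)\n",
                       plain.size(), err, MAX_PATH);
                break;
            }
        }
    }

    // The same directories via the extended-length prefix, which bypasses the
    // MAX_PATH check in the API layer (cleanup is left out of this sketch).
    std::wstring extended = L"\\\\?\\" + base;
    for (int i = 0; i < 2; ++i) {
        extended += L"\\" + component;
        if (!CreateDirectoryW(extended.c_str(), NULL)) {
            DWORD err = GetLastError();
            if (err != ERROR_ALREADY_EXISTS) {
                printf("Extended path failed at %zu chars, error %lu\n",
                       extended.size(), err);
                return 1;
            }
        }
    }
    printf("Extended-length path of %zu characters created\n", extended.size());
    return 0;
}
```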

Microsoft’s announcement focuses on two things. One is resilience, with claims that ReFS is better at preserving data in the event of power failure or other calamity. Another is how ReFS is designed to work alongside Storage Spaces, about which I posted earlier this month.

Of the two, Storage Spaces will be more visible to users. In addition, it sounds as if ReFS will not be the default in Windows 8 client:

…we will implement ReFS in a staged evolution of the feature: first as a storage system for Windows Server, then as storage for clients, and then ultimately as a boot volume. This is the same approach we have used with new file systems in the past.

Note that there are losses as well as gains in ReFS. Short file names are gone, so are quotas, so is compression:

The NTFS features we have chosen to not support in ReFS are: named streams, object IDs, short names, compression, file level encryption (EFS), user data transactions, sparse, hard-links, extended attributes, and quotas.

Overall ReFS strikes me as a conservative rather than radical upgrade. This is not the return of WinFS, an abandoned project which was to bring relational file storage to Windows. It will not help, in itself, with the biggest problem client users have with their file system: finding their stuff. Nor does it have built-in deduplication, which can make storage substantially more efficient. Microsoft says the file system is pluggable (as is NTFS) so that features like deduplication can be added by other providers or by Microsoft in other products.
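
Applications that rely on one of those dropped features will need to check what they are running on. One way to do that, sketched below with an example drive letter, is GetVolumeInformation, which reports the file system name along with a set of capability flags:

```cpp
// Query a volume for its file system name (e.g. "NTFS" or "ReFS") and for
// capability flags, so an application does not rely on features (such as
// compression, EFS or quotas) that ReFS drops.
#include <windows.h>
#include <cstdio>

int main() {
    wchar_t fsName[MAX_PATH + 1] = L"";
    DWORD serial = 0, maxComponentLen = 0, fsFlags = 0;

    if (GetVolumeInformationW(L"D:\\",        // example volume
            NULL, 0,                          // volume label not needed
            &serial, &maxComponentLen, &fsFlags,
            fsName, ARRAYSIZE(fsName))) {
        wprintf(L"File system: %ls\n", fsName);
        wprintf(L"Max name component length: %lu\n", maxComponentLen);
        wprintf(L"Supports compression: %ls\n",
                (fsFlags & FILE_FILE_COMPRESSION) ? L"yes" : L"no");
        wprintf(L"Supports EFS encryption: %ls\n",
                (fsFlags & FILE_SUPPORTS_ENCRYPTION) ? L"yes" : L"no");
        wprintf(L"Supports disk quotas: %ls\n",
                (fsFlags & FILE_VOLUME_QUOTAS) ? L"yes" : L"no");
        wprintf(L"Supports hard links: %ls\n",
                (fsFlags & FILE_SUPPORTS_HARD_LINKS) ? L"yes" : L"no");
    } else {
        wprintf(L"GetVolumeInformation failed, error %lu\n", GetLastError());
    }
    return 0;
}
```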

OEMs are still breaking Windows: can Microsoft fix this with Windows 8?

Mark Russinovich works for Microsoft and has deep knowledge of Windows internals; he created the original Sysinternals tools which are invaluable for troubleshooting.

His account of troubleshooting a new PC purchased by a member of his family is both amusing and depressing:

My mom recently purchased a new PC, so as a result, I spent a frustrating hour removing the piles of crapware the OEM had loaded onto it (now I would recommend getting a Microsoft Signature PC, which are crapware-free). I say frustrating because of the time it took and because even otherwise simple applications were implemented as monstrosities with complex and lengthy uninstall procedures. Even the OEM’s warranty and help files were full-blown installations. Making matters worse, several of the craplets failed to uninstall successfully, either throwing error messages or leaving behind stray fragments that forced me to hunt them down and execute precision strikes.

I admire his honesty. What he is describing, remember, is his company’s core product, following its mutilation by one of the companies Microsoft calls “partners”.

Russinovich adds:

As my cleaning was drawing to a close, I noticed that the antimalware the OEM had put on the PC had a 1-year license, after which she’d have to pay to continue service. With excellent free antimalware solutions on the market, there’s no reason for any consumer to pay for antimalware, so I promptly uninstalled it (which of course was a multistep process that took over 20 minutes and yielded several errors). I then headed to the Internet to download what I – not surprisingly given my affiliation – consider the best free antimalware solution, Microsoft Security Essentials (MSE).

Right. I do the same. However, the MSE install failed, probably thanks to a broken transfer application used to migrate files and settings from an old PC, and it took him hours of work to identify the problem and complete the install.

What interests me here is not so much the specific problems, but Microsoft’s big problem: that buying a new Windows PC is so often a terrible user experience. Not always: business PCs tend to be cleaner, and some OEMs are better than others. Nevertheless, although I have had Microsoft folk tell me a number of times that its partners were getting the message, that to compete with Apple they need to deliver a better experience, the problem has not been cracked.

There is something about the ecosystem which ensures that users get a bad product. It goes like this, I guess: customers are price-sensitive, and to hit the required price OEM vendors take money from anti-malware companies and others desperate to drive users towards their products. Yet in doing so they perpetuate the situation where you have to buy Apple, or be a computer professional, in order to get a clean install. That describes a broken ecosystem.

Microsoft’s Signature PCs are another option, but they are only available from Microsoft stores.

The next interesting question is whether Microsoft can fix this with Windows 8. It may want to follow the example of Windows Phone 7, which is carefully locked down so that OEMs and operators can add their own apps, but their ability to customise the operating system is limited, protecting the user experience. It is hard to see how Microsoft can achieve the same with the x86 version of Windows 8, since this remains an open platform, though it may be possible to insulate the Metro side from too much tinkering. Windows 8 on ARM, on the other hand, may well follow the Windows Phone pattern.

Running Windows on an Apple iPad

I love the convenience of the iPad but there are times when I miss Windows apps. It is not just for work; there is nothing on the iPad to rival Jack Bridge, for example.

The solution is to run Windows on the iPad via remote desktop.

[Image: a Windows desktop running on the iPad over remote desktop]

Most versions of Windows have remote desktop built-in, though you do need to install a client on the iPad. I have tried several and settled for the moment on Mocha RDP. If you tap the up arrow at bottom right, you get a toolbar which controls the on-screen keyboard, extra keys useful for Windows, and a menu with options including a macro of pre-defined keystrokes. It even works with my cheap iPad keyboard.

The downside of this approach is that Windows needs to be running somewhere on your network. However, Mocha RDP supports Wake on LAN, so you can turn the PC on remotely; note that this normally needs to be enabled in the PC’s BIOS.
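
Wake on LAN itself is simple: the sleeping machine’s network adapter listens for a “magic packet”, a UDP broadcast consisting of six 0xFF bytes followed by the target’s MAC address repeated sixteen times. Mocha RDP sends this for you, but as a rough sketch of what happens under the hood (the MAC address below is a placeholder), it looks like this with Winsock:

```cpp
// Sends a Wake-on-LAN "magic packet": 6 bytes of 0xFF followed by the target
// MAC address repeated 16 times, broadcast over UDP (port 9 by convention).
#include <winsock2.h>
#include <cstring>
#include <cstdio>
#pragma comment(lib, "ws2_32.lib")

int main() {
    // Placeholder MAC address of the machine to wake.
    unsigned char mac[6] = { 0x00, 0x11, 0x22, 0x33, 0x44, 0x55 };

    // Build the 102-byte magic packet.
    unsigned char packet[102];
    memset(packet, 0xFF, 6);
    for (int i = 1; i <= 16; ++i)
        memcpy(packet + i * 6, mac, 6);

    WSADATA wsa;
    if (WSAStartup(MAKEWORD(2, 2), &wsa) != 0) return 1;

    SOCKET s = socket(AF_INET, SOCK_DGRAM, IPPROTO_UDP);
    int broadcast = 1;
    setsockopt(s, SOL_SOCKET, SO_BROADCAST,
               (const char*)&broadcast, sizeof(broadcast));

    sockaddr_in addr = {};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(9);                  // "discard" port, commonly used for WoL
    addr.sin_addr.s_addr = INADDR_BROADCAST;   // broadcast on the local subnet

    int sent = sendto(s, (const char*)packet, (int)sizeof(packet), 0,
                      (const sockaddr*)&addr, sizeof(addr));
    printf(sent == (int)sizeof(packet) ? "Magic packet sent\n" : "Send failed\n");

    closesocket(s);
    WSACleanup();
    return 0;
}
```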

In my case I already run a Hyper-V server, a free download. I have installed Windows 7 on a VM (virtual machine), so it is always available.

The iPad also supports VPN (Virtual Private Network), so given a decent broadband connection I could connect to Windows while out and about. Alternatively there are systems like LogMeIn which do not require a VPN, though you have to install the LogMeIn agent on the target PC.

The general approach makes a lot of sense to me. Technically it is a hybrid thin/thick client approach. An iPad or other tablet is smart and has its own local apps and storage, but does not attempt to provide the full capabilities of a PC or Mac. When you need that, you can log into a remote desktop.

It is another example of how the mobile revolution is making us rethink how we do computing. The thin client concept is nothing new, but it is only now that it is becoming compelling for users as well as administrators, giving them the convenience of a tablet as well as access to rich applications like Microsoft Office.

Microsoft no doubt has its own plans for combining tablets with desktop-as-a-service. I would guess that it involves Windows 8 on ARM; but it will take some effort to tempt users away from their iPads.

NVIDIA plans to merge CPU and GPU – eventually

I spoke to Dr Steve Scott, NVIDIA’s CTO for Tesla, at the end of the GPU Technology Conference which has just finished here in Beijing. In the closing session, Scott talked about the future of NVIDIA’s GPU computing chips. NVIDIA releases a new generation of graphics chips every two years:

  • 2008 Tesla
  • 2010 Fermi
  • 2012 Kepler
  • 2014 Maxwell

Yes, it is confusing that the Tesla brand, meaning cards for GPU computing, has persisted even though the Tesla family is now obsolete.

[Image: Dr Steve Scott showing off the power efficiency of GPU computing]

Scott talked a little about a topic that interests me: the convergence or integration of the GPU and the CPU. The background here is that while the GPU is fast and efficient for parallel number-crunching, it is of course still necessary to have a CPU, and there is a price to pay for the communication between the two. The GPU and the CPU each have their own memory, so data must be copied back and forth, which is an expensive operation.
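
That copy cost is visible in even the simplest CUDA host code. The sketch below, which needs the CUDA toolkit, omits the kernel launch and most error handling, and uses an arbitrary buffer size, does nothing more than allocate device memory and shuttle data across the bus in both directions:

```cpp
// Host-side sketch of the CPU/GPU copy overhead: device memory is separate,
// so data must be explicitly copied across the bus before and after any GPU
// work (the kernel launch itself is omitted here).
#include <cuda_runtime.h>
#include <vector>
#include <cstdio>

int main() {
    const size_t n = 1 << 24;                  // ~16M floats, about 64 MB
    std::vector<float> host(n, 1.0f);

    float* device = nullptr;
    if (cudaMalloc((void**)&device, n * sizeof(float)) != cudaSuccess) {
        printf("cudaMalloc failed\n");
        return 1;
    }

    // Host -> device copy: this traffic is the price of the separate memories.
    cudaMemcpy(device, host.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    // ... a kernel operating on 'device' would run here ...

    // Device -> host copy to bring the results back.
    cudaMemcpy(host.data(), device, n * sizeof(float), cudaMemcpyDeviceToHost);

    cudaFree(device);
    return 0;
}
```

Pinned or mapped host memory (cudaHostAlloc) can reduce this overhead, but the two memory spaces remain distinct.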

One solution is for GPU and CPU to share memory, so that a single pointer is valid on both. I asked CEO Jen-Hsun Huang about this, and he did not hold out much hope:

We think that today it is far better to have a wonderful CPU with its own dedicated cache and dedicated memory, and a dedicated GPU with a very fast frame buffer, very fast local memory, that combination is a pretty good model, and then we’ll work towards making the programmer’s view and the programmer’s perspective easier and easier.

Scott on the other hand was more forthcoming about future plans. Kepler, which is expected in the first half of 2012, will bring some changes to the CUDA architecture which will “broaden the applicability of GPU programming, tighten the integration of the CPU and GPU, and enhance programmability,” to quote Scott’s slides. This integration will include some limited sharing of memory between GPU and CPU, he said.

What caught my interest though was when he remarked that at some future date NVIDIA will probably build CPU functionality into the GPU. The form that might take, he said, is that the GPU will have a couple of cores that do the CPU functions. This will likely be an implementation of the ARM CPU.

Note that this is not promised for Kepler, or even for Maxwell, but was thrown out as a general statement of direction.

There are a couple of further implications. One is that NVIDIA plans to reduce its dependence on Intel. ARM is a better partner, Scott told me, because its designs can be licensed by anyone. It is not surprising then that Intel’s multi-core evangelist James Reinders was dismissive when I asked him about NVIDIA’s claim that the GPU is far more power-efficient than the CPU. Reinders says that the forthcoming MIC (Many Integrated Core) processors codenamed Knights Corner are a better solution, referring to the:

… substantial advantages that the Intel MIC architecture has over GPGPU solutions that will allow it to have the power efficiency we all want for highly parallel workloads, but able to run an enormous volume of code that will never run on GPGPUs (and every algorithm that can run on GPGPUs will certainly be able to run on a MIC co-processor).

In other words, Intel foresees a future without the need for NVIDIA, at least in terms of general-purpose GPU programming, just as NVIDIA foresees a future without the need for Intel.

Incidentally, Scott told me that he left Cray for NVIDIA because of his belief in the superior power efficiency of GPUs. He also described how the Titan supercomputer operated by the Oak Ridge National Laboratory in the USA will be upgraded from its current CPU-only design to incorporate thousands of NVIDIA GPUs, with the intention of achieving twice the speed of Japan’s K computer, currently the world’s fastest.

This whole debate also has implications for Microsoft and Windows. Huang says he is looking forward to Windows on ARM, which makes sense given NVIDIA’s future plans. That said, the impression I get from Microsoft is that Windows on ARM is not intended to be the same as Windows on x86 save for the change of processor. My impression is that Windows on ARM is Microsoft’s iOS, a locked-down operating system that will be safer for users and more profitable for Microsoft as app sales are channelled through its store. That is all very well, but it suggests that we will still need x86 Windows if only to retain open access to the operating system.

Another interesting question is what will happen to Microsoft Office on ARM. It may be that x86 Windows will still be required for the full features of Office.

This means we cannot assume that Windows on ARM will be an instant hit; much is uncertain.