All posts by onlyconnect

China’s Tianhe-2 Supercomputer takes top ranking, a win for Intel vs Nvidia

The International Supercomputing Conference (ISC) is under way in Leipzig, and one of the announcements is that China’s Tianhe-2 is now the world’s fastest supercomputer according to the Top 500 list.

This has some personal interest for me, as I visited its predecessor Tianhe-1A in December 2011, on a press briefing organised by Nvidia; the company was, I guess, on a diplomatic mission to promote Tesla, the GPU accelerator boards used in Tianhe-1A (itself the world’s fastest supercomputer for a period).

It appears that the mission failed, insofar as Tianhe-2 uses Intel Phi accelerator boards rather than Nvidia Tesla.

Tianhe-2 has 16,000 nodes, each with two Intel Xeon IvyBridge processors and three Xeon Phi processors for a combined total of 3,120,000 computing cores.

says the press release. Previously, the world’s fastest was the US Titan, which does use Nvidia GPUs.

Nvidia has reason to worry. Tesla boards are present in 39 of the top 500 systems, whereas Xeon Phi appears in only 11; but Phi has not been out for long and is growing fast. A newly published paper shows Xeon Phi besting Tesla on sparse matrix-vector multiplication:

we demonstrate that our implementation is 3.52x and 1.32x faster, respectively, than the best available implementations on dual Intel® Xeon® Processor E5-2680 and the NVIDIA Tesla K20X architecture.

In addition, Intel has just announced the successor to Xeon Phi, codenamed Knights Landing. Knights Landing can function as the host CPU as well as an accelerator board, and has integrated on-package memory to reduce data transfer bottlenecks.

Nvidia does not agree that Xeon Phi is faster:

The Tesla K20X is about 50% faster in Linpack performance, and in terms of real application performance we’re seeing from 2x to 5x faster performance using K20X versus Xeon Phi accelerator.

says the company’s Roy Kim, Tesla product manager. The truth, I suspect, is that it depends on the type of workload, and I would welcome more detail on this.

It is also worth noting that Tianhe-2 does not better Titan on power/performance ratio.

  • Tianhe-2: 3,120,000 cores, 1,024,000 GB Memory, Linpack perf 33,862.7 TFlop/s, Power 17,808 kW.
  • Titan: 560,640 cores, 710,144 GB Memory, Linpack perf 17,590 TFlop/s, Power 8,209 kW.
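The arithmetic is simple: dividing Linpack TFlop/s by power in kW gives GFlops per watt. A quick check with the figures above:

```shell
# Linpack energy efficiency from the Top 500 figures above.
# TFlop/s divided by kW is numerically the same as GFlops per watt.
awk 'BEGIN {
    printf "Tianhe-2: %.2f GFlops/W\n", 33862.7 / 17808
    printf "Titan:    %.2f GFlops/W\n", 17590 / 8209
}'
```

which puts Tianhe-2 at roughly 1.90 GFlops/W against 2.14 GFlops/W for Titan.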

Fixing lack of output in AWstats after Debian Linux upgrade

I use AWStats to analyse logs on several web sites that I manage. After a recent upgrade to Debian 7.0 “Wheezy” I was puzzled to find that my web stats were no longer being updated.

I verified that the Cron job which runs the update script was running. I verified that if I ran the same command from the console, it ran correctly, even when using sudo to run it with the same permissions as Apache. I also noted that the update button on the stats pages worked correctly. An odd problem.

This is how it rested for a while, and I manually updated the stats. It was annoying though, so I took a closer look.

First, I amended one of the Cron jobs so that it output to a file. Reading the file after the next failed update, I could see the error message:

Error: LogFile parameter is not defined in config/domain file
Setup (file, web server or permissions) may be wrong.

I knew the config file was fine, but checked anyway, and of course the LogFile was specified OK.

It was a clue though. Eventually I came across this bug report by Simone Capra:

Hi all, i’ve found a problem:
When run from another perl program, it finds a config file that doesn’t exist!

I applied the suggested fix in awstats.pl, changing:

if (open( CONFIG, "$SiteConfig" ) ) {

to

if ($SiteConfig=~ /^[\\/]/ && open( CONFIG, "$SiteConfig" ) ) {

With that change, awstats only opens $SiteConfig directly when it is an absolute path (one beginning with a slash or backslash), so a spurious relative config file can no longer be picked up. Presto, everything is running OK.
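One debugging step here is worth remembering: a cron job only leaves evidence if you redirect its output, and you need both streams, since errors like the one above go to stderr. A sketch, with an illustrative script path rather than the exact Debian entry:

```shell
# In the crontab, append stdout and stderr from the update run to a log file.
# The schedule and script path here are illustrative examples:
#   0 3 * * * /usr/share/awstats/tools/update.sh >> /var/log/awstats-update.log 2>&1

# The same 2>&1 redirection captures errors when testing from a console:
sh -c 'echo "stats updated"; echo "Error: LogFile parameter is not defined" >&2' \
    >> /tmp/awstats-test.log 2>&1
cat /tmp/awstats-test.log   # shows both the normal output and the error line
```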

Microsoft and mediocrity in programming

A post by Ahmet Alp Balkan on working as a developer at Microsoft has stimulated much discussion. Balkan says he joined Microsoft 8 months ago (or two years ago if you count when he started as an intern) and tells a depressing tale (couched in odd language) of poor programming practice. Specifically:

  • Lack of documentation and communication. “There are certain people, if they got hit by a bus, nobody can pick up their work or code.”
  • Inability to improve the codebase. “Nobody will appreciate you for fixing styling or architectural issues in their core, in fact they may get offended.”
  • Lack of enthusiasm. “Writing better code is not a priority for the most”
  • Lack of productivity. “I spend most of my time trying to figure out how others’ uncommented/undocumented code work, debugging strange things and attending daily meetings.”
  • Lack of contribution to the community. “Everybody loves finding Stack Overflow answers on search results, but nobody contributes those answers.”
  • Lack of awareness of the competition. “No one I met in Windows Azure team heard about Heroku or Rackspace.”
  • Working by the book. “Nobody cares what sort of mess you created. As long as that functionality is ready, it is okay and can always be fixed later.”
  • Clipboard inheritance. “I’ve seen source files copy pasted across projects. As long as it gets shit done (described above) no one cares if you produced unmaintainable code.”
  • Using old tools. “Almost 90% of my colleagues use older versions of Office, Windows, Visual Studio and .NET Framework.”
  • Crippling management hierarchy. “At the end, you are working for your manager’s and their managers’ paychecks.”

There are a couple of points to emphasize. This is one person in one team within a very large corporation, and should not be taken as descriptive of Microsoft programming culture as a whole. Balkan’s team is in “the test org”, he says, and not making product decisions. Further, many commenters observe that they have seen similar problems at other organisations.

Nevertheless, some of the points chime with other things I have seen. Take this post by Ian Smith, formerly a Microsoft-platform developer, on trying to buy a Surface Pro at Microsoft’s online store. From what he describes, the software behind the store is of dreadful quality. Currently, there is a broken image link on the home page.


This is not how you beat the iPad.

Another piece of evidence is in the bundled apps for Windows 8. The more I have reflected on this, the more I feel that supplying poor apps with Windows 8 was one of the worst launch mistakes. Apps like Mail, Calendar and Contacts on the Metro-style side have the look of waterfall development (though I have no inside knowledge of this). They look like what you would get from having a series of meetings about what the apps should do, and handing the specification over to a development team. They just about do the job, but without flair, without the benefit of an iterative cycle of improvements based on real user experience.

When the Mail app was launched, it lacked the ability to see the URL behind a hyperlink before tapping it, making phishing attempts hard to spot. This has since been fixed in an update, but how did that slip through? Details matter.

A lot is known about how to deliver high quality, secure and robust applications. Microsoft itself has contributed excellent insights, in books like Steve McConnell’s Code Complete and Michael Howard’s Writing Secure Code. The Agile movement has shown the importance of iterative development, and strong communication between all project stakeholders. Departing from these principles is almost always a mistake.

The WinRT platform needed a start-up culture. “We’re up against iPad and Android, we have to do something special.” Microsoft can do this; in fact, Windows Phone 7 demonstrated some of that in its refreshing new user interface (though the 2010 launch was botched in other ways).

Another piece of evidence: when I open a Word document from the SkyDrive client and work on it for a while, typing starts to slow down and I have to save the document locally in order to continue. I am not alone in experiencing this bug. Something is broken in the way Office talks to SkyDrive. It has been that way for many months. This is not how you beat Dropbox.

In other words, I do think Microsoft has a problem, though equally I am sure it does not apply everywhere. Look, for example, at Hyper-V and how that team has gone all-out to compete with VMware and delivered strong releases.

Unfortunately mediocrity, where it does exist, is a typical side-effect of monopoly profits and complacency. Microsoft (if it ever could) cannot afford for it to continue.

VLC efforts targeting WinRT with open source tools could enable more open source ports

An email from VideoLAN concerning the port of the open source VLC media player to WinRT, the tablet platform in Windows 8, provides insight into some of the technical difficulties facing open source developers.


This is the heart of the problem:

The build process of VLC is not integrated with Windows Tools, notably Visual Studio, because VLC uses Unix Tools to run on all platforms. This is one of the reasons why VLC media player works on Windows, Linux, BSD, Solaris, iOS, Android, OS/2 and so many other operating systems.

In order to qualify for Windows Store distribution, apps must pass Microsoft’s security requirements, avoiding prohibited API calls. The VLC developers have done most of that successfully, but hit a problem with the Microsoft C Runtime, MSVCRT. Many open source projects use the ancient version 6.0 for maximum compatibility, but:

on WinRT, one MUST use MSVCRT 11.0 in order to pass the validation. This meant that we had to modify our compiler and toolchain to be able to link with this version.

When we asked Microsoft, some engineers told us that this could not possibly succeed, since the validation would not allow application compiled with 3rd party compilers to link with MSVCRT110. We did not want to believe them, since this would have killed the project.

And, they were wrong. We did it, but this took us way more time than anything we had anticipated. The final work was shared and integrated in our toolchain, Mingw-W64. All other open source applications will benefit from that, from now on.

Apparently the final piece of work is working out how to call the WinRT interop layer (the bit that looks like COM but is not COM) from C code. That too is now working, so VLC is completing the job of rewriting headers to call these new APIs.

This work could have wider consequences. Since VLC is open source, all these efforts are available to others, which means that porting other open source projects that use a similar tool chain should be easier.

This is especially significant for Windows RT, the ARM port, where it is not possible to install desktop apps.

VideoLAN’s work could be a great benefit to the WinRT Platform. Microsoft’s engineers should be doing everything they can to help, rather than (as the email implies) telling the developers that it cannot work.

Windows Server 2012 R2, System Center 2012 R2, SQL Server 14: what’s new, and what is the Cloud OS?

Earlier this month I attended a three-day press briefing on what is coming in the R2 wave of Microsoft’s server products: Windows Server, System Center and SQL Server.

There is a ton of new stuff, too much for a blog post, but here are the things that made the biggest impression.

First, I am beginning to get what Microsoft means by “Cloud OS”. I am not sure that this is a useful term, as it is fairly confusing, but it is worth teasing out as it gives a sense of Microsoft’s strategy. Here’s what lead architect Jeffrey Snover told me:

I think of it as a central organising thought. That’s our design centre, that’s our north star. It’s not necessarily a product, it goes across some things … for example, I would absolutely include SQL [Server] in all of its manifestations in our vision of a cloud OS. Cloud OS has two missions. Abstracting resources for consumption by multiple consumers, and then providing services to applications. Modern applications are all consuming SQL … we’re evolving SQL to the more scale-out, elastic, on-demand attributes that we think of as cloud OS attributes.

If you want to know what Cloud OS looks like, it is something like this:

[screenshot]

Yes, it’s the Azure portal, and one of today’s big announcements is that this is the future of System Center, Microsoft’s on-premise cloud management system, as well as of Azure, the public cloud. Azure technology is coming to System Center 2012 R2 via an add-on called the Azure Pack: self-service VMs, web sites, SQL databases, service bus messaging, virtual networks, online storage and more.

Snover also talked about another aspect to Cloud OS, which is also significant. He says that Microsoft sees cloud as an “operating system problem.” This is the key to how Microsoft thinks it can survive and prosper versus VMware, Amazon and so on. It has a hold of the whole stack, from the tiniest detail of the operating system (memory management, file system, low-level networking and so on) to the highest level, big Azure datacenters.

The company is also unusual in its commitment to private, public and hybrid cloud. The three-cloud story, which Microsoft reiterated obsessively during the briefing, is public cloud (Azure), private cloud (System Center) and hosted cloud (service providers). Ideally all three will look the same and work the same – differences of scale aside – though the Azure Pack is only the first stage towards convergence. Hyper-V is the common building block, and we were assured that Hyper-V in Azure is exactly the same as Hyper-V in Windows Server, from 2012 onwards.

I had not realised until this month that Snover is now lead architect for System Center as well as Windows Server. Without both roles, of course, he could scarcely architect “Cloud OS”.

Here are a few other things to note.

Hyper-V 2012 R2 has some great improvements:

  • Generation 2 VMs (64-bit Server 2012 and Windows 8 and higher guests only) strip out legacy emulation and support UEFI boot from SCSI
  • Replica supports a range of intervals from 30 seconds to 15 minutes
  • Data compression can double the speed of live migration
  • Live VM cloning lets you copy a running VM for troubleshooting offline
  • Online VHDX resize – grow or shrink
  • Linux guests now support Live Migration, Live Backup, Dynamic Memory and online VHDX resize

SQL Server 14 includes in-memory optimization, code-named Hekaton, which can deliver stunning speed improvements. There is also compilation of stored procedures to native code, subject to some limitations. The snag with Hekaton? Your data has to fit in RAM.

Like Generation 2 VMs, Hekaton is the result of re-thinking a product in the light of technical advances. Old warhorses like SQL Server were designed when RAM was tiny, and everything had to be fetched from disk, modified, and written back. Bringing that model into RAM as-is is a waste. Hekaton removes the overhead of the disk/RAM model almost completely, though it does have to write data back to disk when transactions complete. The data structures are entirely different.

PowerShell Desired State Configuration (DSC) is a declarative syntax for defining the state of a server, combined with a provider that knows how to read or apply it. It is work in progress, with limited providers currently, but immensely interesting, if Microsoft can both make it work and stay the course. The reason is that using PowerShell DSC you can automate everything about an application, including how it is deployed.

Remember Whitehorse? This was a brave but abandoned attempt to model deployment in Visual Studio as part of application development. What if you could not only model it, but deploy it, using the cloud automation and self-service model to create the VMs and configure them as needed? As a side benefit, you could version control your deployment. Linux is way ahead of Windows here, with tools like Puppet and Chef, but the potential is now here. Note that Microsoft told me it has no plans to do this yet but “we like the idea”, so watch this space.

Storage improvements. Both data deduplication and Storage Spaces are getting smarter. Deduplication can be used for running VHDs in a VDI deployment, with huge storage saving. Storage Spaces support hybrid pools with SSDs alongside hard drives, hot data automatically moved, and the ability to pin files to the SSD tier.

Server Essentials for small businesses is now a role in Windows Server as well as a separate edition. If you use the role, rather than the edition, you can use the Essentials tools for up to 100 or so users. Unfortunately that will also mean Windows Server CALs; but it is a step forward from the dead-end 25-user limit in the current product. Small Business Server with bundled Exchange is still missed though, and not coming back. More on this separately.

What do I think overall? Snover is a smart guy and if you buy into the three-cloud idea (and most businesses, for better or worse, are not ready for public cloud) then Microsoft’s strategy does make sense.

The downside is that there remains a lot of stuff to deal with if you want to implement Microsoft’s private cloud, and I am not sure whether System Center admins will all welcome the direction towards using Azure tools on-premise, having learned to deal with the existing model.

The server folk at Microsoft have something to brag about though: nine consecutive quarters of double-digit growth. It is quite a contrast with the declining PC market and the angst over Windows 8, leading to another question: long-term, can Microsoft succeed in server but fail in client? Or will (for better or worse) those two curves start moving in the same direction? Informed opinions, as ever, are welcome.

Windows 8: return of Start button illuminates Microsoft’s painful transition

The Start button is coming back. At least, that’s the strong rumour, accompanied by leaked screenshots from preview builds. See Mary Jo Foley’s post complete with screen grab, though note that this is the Start button, not the Start menu. Other rumoured changes are boot to desktop by default, and the All Apps view by default in the Start screen.

Will this fix Windows 8? Absolutely not.

There are two reasons. First, in one sense Windows 8 does not need fixing. I’ve been running it from the first previews, and find it solid and fast. The new Start screen works well, and I’m now accustomed to tapping the Windows key and typing to start apps that are not already on the taskbar. It is a better app launcher and organiser than what it replaces, though I am not excited about Live Tiles which are out of sight and out of mind most of the time.

Second, this kind of minor UI change will not address the larger problem, which is the lack of compelling Metro-style apps for the platform. Nor will it fully placate those for whom nothing but making Metro completely invisible is acceptable.

These revisions are intended to make Windows 8 more acceptable to a market that essentially does not want it to change. The core market for Windows is increasingly conservative, being formed of business users with a big investment in the platform who do not want the hassle of retraining users, and home users who are used to Microsoft’s OS and not inclined to switch. While this is a large market, it is also a declining one, with tablets and smartphones taking over many former PC roles, and Macs increasingly the platform of choice for high-end users who need the productivity of a full OS.

Rather than content itself with a declining market, Microsoft came out with its bold re-imagining of Windows, with a new tablet-friendly app platform, while keeping faith with the past by preserving the desktop environment. Predictably, this was not a hit with the conservative market described above; in fact, it was the last thing they wanted, confusing and alienating.

Microsoft made it particularly hard for these users by making the new Metro environment hard to ignore. The Start screen, some settings, default apps for file types including images, PDFs and music, and the power button hidden in the right-hand Charms menu all cause confusion.

Only the modern app platform has the potential to lift Windows beyond its large but suffocating and declining market of change-resistant users. Unfortunately the first months of Windows 8 have been more or less the worst case for Microsoft. Existing users dislike it and new users have failed to embrace it.

A rough ride for Windows 8 was expected, though if the script had run according to plan there should have been mitigating factors. A wave of Windows 8 tablets should have delivered a delightful experience with touch while still offering desktop productivity when needed. Well, it has happened a little bit, but Windows 8 tablets have suffered from multiple issues including high prices, lack of availability, fiddly designs, and in the case of Windows RT (the ARM version) poor performance and confusing marketing. Here’s a review of the Lenovo IdeaPad Yoga 11 RT machine, from Ebuyer, which shows what can go wrong:

THIS IS NOT A LAPTOP. It runs the dreadful Windows RT which is NOT windows 8, but a very poor limited version of 8. You can only download what Microsoft wants you to have. It came with a free Norton. The dealer convinced me that the failure to be able to download this was my deficiency. NOT – Norton cannot be downloaded onto RT machines. Neither can any other security software except defender which is already on it. You cannot install Chrome (much better than Explorer) It does not accept I tunes, You cannot dispense with the Microsoft log in password, which I do not need. Where the instructions for how to change the settings are, is still a mystery – as usual THERE IS NO INSTRUCTION MANUAL IN PAPER. You have to hunt for everything or go to an online forum.

A shame, because personally I like the concept of Windows RT with its low power consumption and nearly tinker-proof OS.

Is there hope for Windows 8? Sure. The core of the OS is excellent on the desktop side, less good on the Metro side but this can be improved. The app story remains poor, though occasionally a decent app comes along, like Adobe’s Photoshop Express: easy, fluid, elegant photo editing which works on both ARM and Intel.


It is fair to say, though, that Microsoft and its partners have plenty of work to do if they are to make this new Windows a success.

Build Mac and iOS apps in Visual Studio: Oxygene for Cocoa

Remobjects has released Oxygene for Cocoa, which lets you build apps for Mac and iOS using Visual Studio and the Oxygene language.

Oxygene is a Delphi-like language, making this an easy transition for Delphi developers. Until the most recent release, a version of Oxygene, called Prism, was bundled with Delphi, though this targeted .NET rather than Cocoa. Oxygene can also build apps for the Java runtime, making it a three-platform solution.

The cross-platform approach is different from that taken by Embarcadero with FireMonkey, a cross-platform framework for Delphi itself. FireMonkey abstracts the GUI as well as the non-visual code, and in many cases controls are drawn by FireMonkey rather than using the native controls on platforms such as iOS. By contrast, Oxygene works directly with the Cocoa frameworks, so you will build the GUI in code or using the Xcode tools on the Mac.

More like Xamarin then? “We do work together with Mono and with Xamarin,” says Remobjects chief Marc Hoffman. “Oxygene for .NET works with the regular Mono framework for desktop or server apps. But when you get to the devices, the benefit with Oxygene is that you get much closer to the framework, you don’t have the weight of providing an abstraction for the classes you want to use.  If you write a UITableViewController to define a view, then you really write a UITableViewController, the same as you would in Objective-C, just the language is different, whereas in Xamarin you write a different class that sits on top and Mono does the mapping.”

Why not just use Xcode? This is in part a language choice. Remobjects says that Oxygene is “better than Objective-C” thanks to features like automatic boxing of integers, floats and strings, and generic arrays. There is more about the language here. Perhaps more important, if you know Pascal or Delphi it will look more familiar. You also get the ability to share code between Windows, Android, Mac and iOS, though this will be the non-visual code. Developers can also work mainly in Visual Studio rather than in Xcode.

The disadvantage is that you need two machines, or a VM running Windows on a Mac, and a remote connection to a Mac in order to debug.

I plan to try out Oxygene for Cocoa soon and of course will report on the experience.

Why custom templates might not appear in Word 2013

I have a custom Word template which I use for transcribing interviews (it lets me start and stop the audio with a key combination). I installed this into the location defined for user templates. This option is in File – Options – Advanced – File Locations.


However, when I chose File – New in Word, my custom template did not appear. The reason, I discovered, is that Word has an additional option which sets the save location of personal templates. This was blank in my installation.


You have to set this to be the same as the user template path in File locations. After you do that, personal templates show up when you do File – New. Note that you also have to click on the PERSONAL heading before you see them.


It works. Now for a little rant.

  • Why are there two locations? What is meant to be the difference between the location for user templates, and the location for personal templates?
  • Why does a Save location impact what happens when you are starting a new document?
  • How did the personal template location get to be blank?
  • If one of these locations is blank, why is Word not smart enough to have a look in the other one?

I guess this may be a bug.

While I am on the subject, it appears that there is no automatic way to sync custom templates across different Office installations, even if you sign in with the same account. A shame.

Miguel de Icaza: don’t blame Google for Microsoft’s contempt for developers

Xamarin’s Miguel de Icaza (founder of the Mono project) has complained on Twitter about Microsoft’s Windows Division’s “contempt for developers” when it created the Windows Runtime and a “4th incompatible Xaml stack”, in a conversation prompted by the company’s spat with Google over the YouTube app for Windows Phone. Google wants this removed because it does not show YouTube ads, to which Microsoft counters that the API for showing these ads is not available.


I am more interested in his general reflections on the wisdom (or lack of it) shown by Microsoft in creating a new platform for touch-friendly apps in Windows 8, that lacks compatibility with previous Windows frameworks. “No developer wants to build apps twice for Windows: one for desktop, one for winstore” he also remarked.

The four XAML stacks are Windows Presentation Foundation, Silverlight (for which de Icaza created a version for Linux called Moonlight), Windows Phone (which runs a slightly different version of Silverlight), and now the Windows Runtime.

Could Microsoft have done this differently, without compromising the goal of creating a new tablet personality for Windows rather than continue with doomed attempts to make the desktop touch-friendly?

The obvious answer is that it could have used more of Silverlight, which had already been adapted to a touch environment for Windows Phone. On the other hand, the Windows division was keen to support native code and HTML/JavaScript as equally capable options for Windows Runtime development. In practice, I have heard developers remark that HTML/JavaScript is better than C#/XAML for the new platform.

It is worth noting that the Windows Runtime stack is by no means entirely incompatible with what has gone before. It still uses the Windows API, although parts are not available for security reasons, and for non-visual code much of the .NET Framework works as before.