
A year of blogging: another crazy year in tech

At this time of year I allow myself a little introspection. Why do I write this blog? In part because I enjoy it; in part because it lets me write what I want to write, rather than what someone will commission; in part because I need to be visible on the Internet as an individual, not just as an author writing for various publications; in part because I highly value the feedback I get here.

Running a blog has its frustrations. Adding content here has to take a back seat to paying work at times. I also realise that the site is desperately in need of redesign; I’ve played around with some tweaks in an offline version but I’m cautious about making changes because the current format just about works and I don’t want to make it worse. I am a writer and developer, but not a designer.

One company actually offered to redesign the blog for me, but I held back for fear that a sense of obligation would prevent me from writing objectively. That said, I have considered doing what Adobe’s Serge Jespers did and offering a prize for a redesign; if you would like to supply such a prize, in return for a little publicity, let me know. One of my goals is to make use of WordPress widgets to add more interactivity and a degree of future-proofing. I hope 2010 will be the year of a new-look ITWriting.com.

So what are you reading? Looking at the stats for the year proves something I was already aware of: that the most-read posts are not news stories but how-to articles that solve common problems. The readers are not subscribers, but individuals searching for a solution to their problem. For the record, the top five in order:

Annoying Word 2007 problem – can’t select text – when Office breaks

Cannot open the Outlook window – what sort of error message is that? – when Office breaks again

Visual Studio 6 on Vista – VB 6 just won’t die

Why Outlook 2007 is slow – Microsoft’s official answer – when Office frustrates

Outlook 2007 is slow, RSS broken – when Office still frustrates

The most popular news posts on ITWriting.com:

London Stock Exchange migrating from .NET to Oracle/UNIX platform – case study becomes PR disaster

Parallel Programming: five reasons for caution. Reflections from Intel’s Parallel Studio briefing – a contrarian view

Apple Snow Leopard and Exchange – the real story – hyped new feature disappoints

Software development trends in emerging markets – are they what you expect?

QCon London 2009 – the best developer conference in the UK

and a few others that I’d like to highlight:

The end of Sun’s bold open source experiment – Sun is taken over by Oracle, though the deal has been subject to long delays thanks to EU scrutiny

Is Silverlight the problem with ITV Player? – Microsoft, you have a problem – prophetic insofar as ITV later switched to Adobe Flash; it’s not as good as BBC iPlayer but it is better than before

Google Chrome OS – astonishing – a real first reaction written during the press briefing; my views have not changed much though many commentators don’t get its significance for some reason

Farewell to Personal Computer World – 30 years of personal computing – worth reading the comments if you have any affection for this gone-but-not-forgotten publication

Is high-resolution audio (like SACD) audibly better than CD? – still a question that fascinates me

When the unthinkable happens: Microsoft/Danger loses customer data – as a company Microsoft is not entirely dysfunctional but for some parts there is no better word

Adobe’s chameleon Flash shows its enterprise colours – some interesting comments on this Flash for the Enterprise story

Silverlight 4 ticks all the boxes, questions remain – in 2010 we should get some idea of Silverlight’s significance, now that Microsoft has fixed the most pressing technical issues

and finally HAPPY NEW YEAR

Browser wars: IE loses 12% market share in 2009, Germany hates it

I’ve been looking at the browser stats for 2009. According to StatCounter, Microsoft began the year with a 67.19% share for IE (versions 6-8 combined) and ends it with 55.23%. That’s a loss of 12 percentage points, or an 18% relative decline, depending on how you figure it.

The biggest part of that share has gone to Firefox, which started with 25.08% and closes with 31.92% – a gain of nearly 7 percentage points, or a rise of 27%.
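The two ways of figuring it are worth spelling out: a percentage-point change simply subtracts the shares, while a relative change divides that difference by the starting share. A quick sketch in Python, using the StatCounter figures quoted above:

```python
def share_change(start, end):
    """Return (percentage-point change, relative change in %)."""
    points = end - start
    relative = points / start * 100
    return round(points, 2), round(relative, 1)

# IE 6-8 combined: 67.19% in January, 55.23% in December
print(share_change(67.19, 55.23))  # → (-11.96, -17.8): a 12-point loss, an 18% decline

# Firefox: 25.08% to 31.92%
print(share_change(25.08, 31.92))  # → (6.84, 27.3): a 7-point gain, a 27% rise
```

Both descriptions are accurate; the relative figure always looks more dramatic for the smaller starting share.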

The big story is that Firefox 3.5 is now the world’s most popular browser. Although true on these figures, it is also true that IE 7 on the way down is crossing IE 8 on the way up; it’s possible that IE 8 will overtake Firefox sometime next year, though by no means certain.

However, there are huge regional variations. The UK loves IE: currently IE 8 is on 31.48% vs Firefox 3.5 on 19.2%, and IE overall is 56.02%. Germany on the other hand hates it:

According to these stats, Firefox 3.5 has a 44.19% share in Germany and IE 8 just 15.32%. The USA is somewhere in between, though closer to the UK in that IE 8 is in the lead with 26.64%.

Overall, clearly a good year for Mozilla and a bad one for Microsoft.

What about the future? Well, it’s notable that not all IE migrants are going to Firefox. The “Other” section is showing a steady increase, and I’d bet that a large chunk of Other is based on WebKit, either in mobile browsers or in Google Chrome. Apple’s Safari is also WebKit-based, and has increased its share significantly during 2009. Mozilla should worry that developers are largely choosing WebKit rather than Gecko.

A bigger concern for Mozilla is the big G, source of most of its income. Google pays Mozilla for search traffic sent its way. It cannot be good when your main customer has a product that competes directly with your own. I’m guessing that a Google browser will overtake Firefox during the next decade.

Adobe financials and the future of packaged software

I listened to Adobe’s investor conference call yesterday following the release of its fourth quarter results, to the end of November 2009.

The results themselves were mixed at best: revenue was down in all segments year on year and there was a $32 million GAAP net loss, but Adobe reported an “up-tick” towards the end of the quarter and says that it expects a strong 2010, presuming a successful launch for Creative Suite 5.

Adobe’s situation is interesting, in that while it is doing well in strengthening the Flash Platform for media and to a lesser extent for applications, that success is not reflected in its results.

The reason is that it depends largely on sales of design software (mainly Creative Suite) for its revenue. According to its datasheet [PDF], this was how its revenue broke down for the financial years 2006 to 2009:

                        2006   2007   2008   2009
Creative                 56%    60%    58%    58%
Business Productivity    32%    29%    30%    29%
Omniture (analytics)       –      –      –     1%
Platform                  4%     4%     6%     6%
Print and publishing      8%     6%     6%     6%

“Creative” is Creative Suite and its individual products, plus things like Audition and Scene 7.

“Business productivity” encompasses Acrobat (including Acrobat.com), LiveCycle servers, and Connect Pro web conferencing.

“Platform” is developer tools and Flash Platform Services, though not LiveCycle Data Services.

“Print and Publishing” is PostScript, Director, Captivate, and old stuff like PageMaker and FrameMaker but not InDesign.

Some of this segmentation seems illogical to me and probably to Adobe as well; there are no doubt historical reasons.

If the economy recovers and Creative Suite 5 delivers a strong upgrade, Adobe may well have the good 2010 that it is hoping for. One of the things mentioned by CEO Shantanu Narayen was that an aging installed base of PCs more than five years old was holding back its sales; no doubt most of those PCs are running Windows XP. It made me wonder how much the general disappointment with Vista has affected other companies, such as Adobe, which benefit when PCs are upgraded – and how much the good reception for Windows 7 may now help them.

Still, there is one aspect of the above figures that rings alarm bells for me. They show no evidence that Adobe is able to migrate its business from one dependent on packaged software sales to one that is service-based. That is important, because I suspect that the packaged software model is in permanent decline.

The pattern which I’ve seen now for many years as a software reviewer is that a vendor brings out version x of its product and explains why it is a must-have upgrade from version x-1, which (it turns out) has a number of deficiencies that are only now being addressed.

A year or two later, there’s another upgrade, another briefing, and lo! it is version x+1 that you really need; version x was not that good after all.

It is a difficult act for vendors to sustain, and hated by users too. Even when users have signed up for some sort of service contract that gets them new releases for free, many are reluctant to upgrade because of the pain factor; if the old edition is performing well, they see no need to switch.

The next-generation software world replaces this model with Internet applications where upgrade is seamless and at no extra cost. You pay for the service, either with money (Salesforce.com) or mainly with advertising (Google Apps).

Adobe is there, of course, with Acrobat.com for productivity applications, and also tools for building them with Flash, Flex and AIR. But it is one thing to be there, and another thing for those investments to be delivering an increasing proportion of overall revenue; and the table above suggests that progress is slow.

It will be fascinating to see how this unfolds over the coming decade.

Fear of Google

Shares in Rightmove, a UK web site for house sales, have dropped by more than 10% over the last couple of days, following a report by the Financial Times that “Google is in talks with British estate agents to launch an online property portal.”

I do not know what chances of success this venture has. Google does not instantly dominate any market into which it moves; the combination of Google Checkout and Google Shopping hasn’t killed off PayPal/eBay or Amazon as yet.

What interests me though is the impact of the rumour. It is not only Google’s size and profitability; it is the fact that Google is for many (most?) people the portal to the Web, giving it huge power to reach any Web-based market.

The restraining factor is that it is difficult for Google to exploit that power without falling foul of legislation that protects free markets. When I search for a product on Google, I don’t see any evidence that it favours its own shopping site, for example; though having said that I am only one click away from the Shopping link at the top of the page that searches only Google’s site.

Still, drawing the line between what is and is not reasonable is hard to do. For example, what if Google had a “search for houses on sale here” link in Google Maps?

If Android and later Chrome OS succeed, Google may become the portal to more than just the Web. Its services will have geo-location data as well. Its data-gathering algorithms might learn our shopping and eating out preferences and guide us with uncanny prescience towards things that we enjoy.

I am not surprised that Google rumours have the power to move markets.

Web advertising goes outside: digital signage using force.com and Media RSS

In the last 10 years or so video advertising screens have replaced static posters in busy public places like the London Underground. This is known in the trade as digital signage or Digital Out of Home (DOOH) advertising; and I was interested to speak to a company at the recent Salesforce.com Service Cloud 2 launch which is running digital signage systems on the Salesforce force.com platform. The company is signagelive, run by http://www.remotemedia.co.uk/, and its secret sauce is to use the internet and commodity technology to run 10,000 displays around the world cheaply and efficiently.

As I understood it from my brief conversation, a force.com application provides customers with a dashboard for managing their screens, usable from any web browser. Content is served to the screens over the Internet using Media RSS. This is well suited to the purpose since it is easy for customers to update, and fail-safe in that if the system fails or the connection breaks, screens just carry on displaying the last version of the feed which they retrieved. Since Media RSS is a standard, the content can also feed desktop applications; and of course it doesn’t have to be advertising, though often it is.

A signagelive display could be a low-powered network-connected device attached to a display, or a display alone with enough intelligence to retrieve a Media RSS feed and display its images – much as you can do in the home with something like an eStarling connected photo frame or a PhotoVu wireless digital picture frame, but with bigger displays. The company is looking forward to displays which include on-chip Adobe Flash players, since this will enable animation and video to be included at little extra cost. The media itself is currently stored on company servers, but is likely to move to Amazon S3 in future – which makes sense for scalability, pay as you go, and for taking advantage of Amazon’s global network, reducing latency.
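The fail-safe behaviour described above – keep showing the last feed you managed to fetch – is simple to implement. Here is a minimal sketch in Python; the feed URL and cache filename are hypothetical, and this is an illustration of the pattern, not signagelive’s actual code:

```python
import urllib.request
import xml.etree.ElementTree as ET

def load_feed(url, cache_path="last_feed.xml"):
    """Fetch the Media RSS feed; if the connection breaks, reuse the cached copy."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            data = resp.read()
        with open(cache_path, "wb") as f:
            f.write(data)              # remember the last good feed
    except OSError:
        with open(cache_path, "rb") as f:
            data = f.read()            # offline: carry on with the previous version
    return data

def media_urls(feed_xml):
    """Extract the media content URLs the screen should display."""
    root = ET.fromstring(feed_xml)
    # Media RSS puts its elements in the http://search.yahoo.com/mrss/ namespace
    return [el.get("url")
            for el in root.iter("{http://search.yahoo.com/mrss/}content")]
```

A display loop would then download each URL in turn and cycle through the images, re-fetching the feed on a timer to pick up changes.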

If you want to see an example, apparently the London Dockland Light Railway screens are driven by signagelive; they are also in Harrods.

CEO Jason Cremlins has a blog post about the future of DOOH. My further thought is that if you had devices able to run Flash applications, you could put this together with touch screens and add interactivity to the mix.

The boundaries between internet and non-internet advertising are blurring. Ad networks such as those run by Google can be extended to networks using this kind of technology in a blink. Why shouldn’t advertisers be able to select airport lounges or underground stations alongside Adsense for websites?

The less compelling aspect of the technology is that as the costs of running these advertising networks come down, the likelihood of intrusive advertising screens invading every possible public space increases.

I also found this interesting as an innovative use of the Salesforce platform. As I recall, it hooks into other force.com applications to handle billing, customer record management and so on, and shows the potential for Salesforce to move beyond CRM. With the Adobe Flash aspect as well this example brings together a number of themes that I’ve been mulling over and I enjoyed hearing about it.

Qt goes mobile, gets bling, aims for broader appeal

Here at Qt Developer Days in Munich we’ve heard how Nokia wants to see “Qt everywhere”, and will be supporting Qt on its Maemo operating system and on Symbian, as well as adding specific support for Windows 7 and Mac OS X 10.6, “Snow Leopard”. Qt already works on Microsoft Windows Mobile, and of course on Linux which is where it all started. What about Google Android, Palm WebOS, Apple iPhone? Nothing has been promised, but there is hope that Qt will eventually work on at least some of these other systems.

So is “Qt everywhere” a realistic proposition? Here are a few impressions from the conference. First, a bit of context. Qt is a C++ framework for cross-platform development, and although bindings for other languages exist, Nokia says it is focused on excellence in C++ rather than working with multiple languages. Developers get the advantages of both native code executables and cross-platform support, and Qt is popular on embedded systems as well as desktops and mobile devices.

Qt is an open source framework which was developed by a company called Trolltech, which Nokia acquired in 2008. Its motivation, one assumes, was to simplify development for its own multiple operating systems, especially Maemo and Symbian. Still, it has also taken its responsibilities to the open source community seriously. Qt was originally available either under the GPL, which requires developers to make their own applications available under the GPL as well, or under a commercial license. This limited Qt’s take-up. In March Nokia introduced a third option, the LGPL, which is more liberal and allows commercial development using the free license. The result, we were told, has been a 250% increase in usage (though how this is defined is uncertain) accompanied by “a small drop in revenue.”

Although the revenue decrease is troubling, it is not a disaster for Nokia whose main business is selling hardware; and if take up continues to increase I’d expect revenue to follow.

Since the Nokia acquisition, Qt has been energetically developed. 2009 has seen the release of a dedicated IDE called Qt Creator. I was interested to see a company that has chosen not to go the Eclipse route for its primary IDE, though there are plug-ins for both Eclipse and Visual Studio. The trolls explained that Eclipse came with too much baggage and they wanted something more perfectly suited to its purpose, a lean approach that is in keeping with the Qt philosophy.

Another important move is the inclusion of WebKit within the framework, the same open source HTML engine that powers Apple’s Safari, Adobe AIR, and the browser in numerous smartphones. WebKit also comes with a JavaScript engine, which Nokia is exploiting in several interesting ways.

The big deal at Qt Developer Days was another new project called Kinetic, which comprises four parts:

1. An animation API.

2. A state machine.

3. A graphical effects API.

4. A declarative API, currently called QML (Qt Markup Language), though this may change.

Many of these pieces, though not the last, are already present in Qt 4.6, just released in technical preview. Nokia has not announced a specific date for Kinetic, though there were mutters about “first half of 2010”.

The thinking behind Kinetic is to make it easier to support the graphical effects and transitions that users have come to expect, as well as improving the designer-developer workflow – showing that it is not only Adobe and Microsoft who are thinking about this.
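To make the state machine part of that list concrete: the idea is to model a UI as named states, with events triggering transitions and entry callbacks kicking off effects such as animations. A generic sketch in Python (an illustration of the concept only, not the Qt API):

```python
class StateMachine:
    """Tiny state machine: states, event-driven transitions, enter-callbacks."""
    def __init__(self, initial):
        self.state = initial
        self.transitions = {}   # (state, event) -> next state
        self.on_enter = {}      # state -> callback, e.g. start an animation

    def add_transition(self, state, event, next_state):
        self.transitions[(state, event)] = next_state

    def handle(self, event):
        nxt = self.transitions.get((self.state, event))
        if nxt is not None:
            self.state = nxt
            cb = self.on_enter.get(nxt)
            if cb:
                cb()            # run the transition effect for the new state
        return self.state

# A panel that slides open and closed in response to clicks
panel = StateMachine("closed")
panel.add_transition("closed", "click", "open")
panel.add_transition("open", "click", "closed")
print(panel.handle("click"))    # → open
```

Coupling each transition to an animation, rather than scripting the animation by hand, is what makes the Kinetic combination of state machine plus animation API attractive.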

QML is significant for several reasons. It is a JavaScript-like API: we were told that Nokia started out with XML but found it cumbersome, and settled on JavaScript instead. It is designed to work well with visual design tools, and Nokia has one code-named Bauhaus which will be part of Qt Creator. Finally, it allows snippets of JavaScript so that developers can create dynamic user interfaces.

At runtime, QML is rendered by a viewer widget, which can be programmatically controlled in C++ just like other Qt widgets.  

Nokia’s hope is that designers can be persuaded to work directly in the QML designer, enabling free exchange of code between designers and developers. It is a nice idea, though I doubt designers will easily transition from the more comfortable world of Photoshop and Flash. However, even if in the end QML is used more by developers than designers, it does greatly simplify the task of creating a dynamic Qt UI. Note that there is already a visual GUI designer in Qt Creator but this is geared towards static layouts.

Long term, who knows, we may see entire applications written in QML, opening up Qt to a new and broader audience.

You can see the latest Qt roadmap here.

Qt pros and cons

I was impressed that attendance here has increased – from around 500 last year to around 700 – despite the economy. Those developers I spoke to seemed to like Qt, praising the way it self-manages memory, though some find the model-view aspect too complex and apparently this is to be improved. Nokia’s stewardship and openness is appreciated and the Qt roadmap generally liked, though there is concern that its understandable focus on mobile may leave the desktop under-served.

Cross-platform capability is increasingly important, and for those who want the performance and capability of C++ along with really good Linux support – important for embedded use – Qt is a strong contender. The focus on mobile is right, not only because of Nokia’s own needs, but because demand for Smartphone apps can only increase.

Integrating with WebKit is a smart move, opening up possibilities for hybrid web/desktop applications and giving Windows developers an alternative to embedded IE with all its quirks.

The open source aspect is another strength. This is now a good selling point if you are developing for certain governments (the UK is one such) or other organisations that have a bias towards open source.

That said, talk of Qt everywhere is premature. The mobile space is fractured, and without iPhone, WebOS or Android Nokia cannot claim to have a universal solution. Nor has anyone else; but I’m just back from Adobe MAX where we heard about wider support for the Flash runtime. Then again, few choose between C++ and Flash; Adobe’s runtime is pretty much off the map for attendees here.

Qt is well-established in its niche, and is in good hands. I will be interested to see whether Nokia is successful in broadening its appeal.

Incidentally, if you can get to San Francisco you can still catch Qt Developer Days as it is running there from November 2nd-4th.

Guardian ungagging demonstrates power of Twitter (again)?

This morning the Guardian reported that it had been forbidden from reporting on a question to be asked in the UK parliament:

The Guardian is prevented from identifying the MP who has asked the question, what the question is, which minister might answer it, or where the question is to be found.

The Guardian is also forbidden from telling its readers why the paper is prevented – for the first time in memory – from reporting parliament. Legal obstacles, which cannot be identified, involve proceedings, which cannot be mentioned, on behalf of a client who must remain secret.

The only fact the Guardian can report is that the case involves the London solicitors Carter-Ruck, who specialise in suing the media for clients, who include individuals or global corporations.

The clue was enough for others to identify the case, which involves oil traders Trafigura, and Twitter responded in style, as an image from Trendsmap demonstrated.

A few hours later the injunction was lifted. Although there is no way to show what influence the Twitter activity had on the matter, it at least demonstrates how the real-time web can publicise events that others are trying to keep quiet.

One question remains. What happened to “publish and be damned”? Maybe this is now unaffordable, which remains a worry from a freedom of speech perspective.


Rentokil Initial adopting Google Apps – largest deployment yet, apparently

Following a successful 100-day trial with 800 users, Rentokil Initial is deploying Google Apps Premier Edition globally to “up to 35,000 colleagues” by the end of 2010, in what the press release describes as the:

Largest deployment of Google Apps™ Premier Edition to replace multiple email systems with a standard global email solution … The new platform will provide a single web-based communication and collaboration suite to replace the Group’s existing 180 email domains and 40 mail systems across its six operating divisions.

Note that the focus is on email, though the release also talks about “communication and collaboration”, including Google chat and video and shared calendars.

Rentokil is keen on the translation service which Google offers:

…the frustrations of not having access to a single company-wide email address database will disappear and the translation difficulties faced by those colleagues wanting to collaborate with others around the world will be lessened

says CIO Bryan Kinsella.

There is no mention of word processing, spreadsheets or presentation graphics in the release, suggesting that a wholesale move to Google for documents is not currently envisaged. That said, I suspect that once an organization signs up for email and collaboration services, they will end up using other parts of the platform as well.

Google’s progress in the Enterprise is interesting to watch. If it is successful, it will have a profound impact on the IT industry, and there will be less work for all those support organizations that spend their time keeping Microsoft systems up and running.

When the unthinkable happens: Microsoft/Danger loses customer data

Danger, a company acquired by Microsoft in April 2008, provides synchronization and online data storage for mobile devices, the best-known being the T-Mobile Sidekick. Here’s the Danger promise:

Data is always synchronized and backed up
Danger-powered devices are always connected to the Danger service. All user data is automatically and securely backed up over-the-air, and emails, photos, and organizer data are automatically synchronized with a Web-based application. All changes that are made on the device are instantly and automatically reflected on the user’s computer, and vice versa.

That dream is in tatters thanks to a currently unspecified server failure. Problems started over a week ago, culminating in this devastating “status update”:

Regrettably, based on Microsoft/Danger’s latest recovery assessment of their systems, we must now inform you that personal information stored on your device – such as contacts, calendar entries, to-do lists or photos – that is no longer on your Sidekick almost certainly has been lost as a result of a server failure at Microsoft/Danger … we recognize the magnitude of this inconvenience.

The word “inconvenience” does not express what some users are experiencing. Here’s an example:

I too have lost business contacts (over 200), family and friends mailing, email address & phone #. Good luck now with holiday cards.  Without my calendar, I now have no clue when all my upcoming appts are.  In addition, I have lost passwords, account codes and my gym workout routine.  I was unable to do my side jobs over the past two weekends without these codes.  To recover the information will take hours of my time worth way more than the month of service credit in addition to the money I have already lost not being able to work.

The entire reason I chose to stay with the sidekick and renew with t-mobile was because of the piece of mind knowing that my data information was backed up to an online system. 

So what next? People are drawing a variety of conclusions, the most obvious being that the cloud can never be trusted, or that Microsoft can never be trusted, or both. Of course there is no such thing as total data (or any other kind of) security, but risks can be minimized, and in the absence of nuclear war, earthquake or volcanic eruption this looks inexcusable – but bad things happen.

The company is promising an update tomorrow (October 12th). Personally I doubt that the data is really irrecoverable, knowing a little about what data forensics can achieve, but it may be economically irrecoverable. Still, the best thing Microsoft could do would be to announce that it can get the data back after all. Failing that, we need to understand as much as possible about what went wrong so that we can make our own judgment about what to conclude.

Presuming that the data does not reappear, this is going to get messy. What happens when the marketing information says one thing, but the small print says another (as is often the case)? One user found this in his contract:

The services and devices are provided on an “as-is” and “with all faults” basis and without warranties of any kind

which may well be typical. Then again, what about T-Mobile’s relationship with Microsoft?

Finally, while I accept that data may be safer in a cloud service provider’s data centre than on my cheap local hard drives, it is also obvious that cloud + local backup is even safer. Apparently this is one thing that Danger made somewhat difficult, and I’ve known this to be true of other cloud-based services.

Update: rumour has it that this was a failed SAN (Storage Area Network) upgrade without a backup. Further rumours of the poor state of Danger (and Windows Mobile) within Microsoft are in this RoughlyDrafted article.

Three reasons why Adobe Flash is hated

In the Adobe-shaped bubble of MAX 2009 in Los Angeles, Flash is the answer to everything, almost. That impression was reinforced yesterday when Chief Technology Officer Kevin Lynch spoke of his ambition to make AIR, the Flash-based out-of-browser runtime, into a universal runtime for SmartPhones, as I reported on The Register.

Many users and developers have a different perspective, and you can easily find examples in the comments on the piece linked above. I was also struck by the loud and spontaneous cheer accorded Opera’s Bruce Lawson when he presented HTML 5 as an alternative to Flash and Silverlight at the Future of Web Applications conference last week.

So why is Flash hated? Three main reasons come to mind.

The first is because most of the Flash content that we see is marketing and advertising. Most users prefer web sites that are ad-free, or at least where the advertising is low-key. On the marketing side, there are still plenty of occasions where you want to skip the intro. When I link to Adobe’s home page for MAX 2009, I always link to the Sessions page, not the home page which auto-plays a Flash movie with sound – because I think users would rather get straight to the content, rather than be startled or embarrassed by an unexpected broadcast. Fellow journalist Jon Honeyball tweeted recently:

using a blocker to rid myself of unwanted flash nonsense on web pages. And most of it is unwanted and unnecessary rubbish

A more nuanced angle on this same problem is that Flash developers are inclined to add a little bling to their applications, even if it is not marketing as such. Users who like applications that are sparse and lean react against this.

The second reason is that Flash can be detrimental to browser performance. There are two angles on this. One is that bugs or performance characteristics in the Flash Player, combined with perhaps badly written Flash content, can cause slowdowns or at worst lock-ups in the browser. The other is that much Flash content downloads a lot of data, to create its multimedia effects. This makes Flash pages larger and therefore slower. It is a consideration that matters particularly on mobile devices with slow or intermittent connections, which is why not everyone welcomes the prospect of full Flash on every SmartPhone.

Third, there are those who do not regard Flash as part of the open web, and want to see web content that can be rendered completely without the use of a proprietary runtime, and web standards controlled by a cross-industry group rather than by a single vendor. There could be political, ethical or pragmatic reasons behind this view; but it is one that is still strongly felt, as shown by the reaction to Lawson’s comments at FOWA.

Before you tell me, I realise that there are also plenty of reasons to like Flash; and I am not going to attempt to enumerate them here. My argument is that even those who love Flash need to recognise that users with negative perceptions may have good reasons for them. From this perspective, Apple’s resistance to Flash on the iPhone is a force for good, since it compels web developers to continue offering non-Flash content.

It also follows that anything Adobe can do to mitigate these problems will strengthen its campaign to get Flash everywhere. I am thinking of things like improved performance and reduced memory footprint in the player, and better handling of errant applications; demonstrating lean and mean Flash usage in its own sites and examples; and continuing to open the Flash runtime and its future to cross-industry input, even at the expense of relinquishing some control.
