From Big Blue to Big Red? IBM to acquire Red Hat


IBM has agreed to acquire Red Hat:

IBM will acquire all of the issued and outstanding common shares of Red Hat for $190.00 per share in cash, representing a total enterprise value of approximately $34 billion.

IBM is presenting this as a hybrid cloud play, claiming that businesses are held back from cloud migration “by the proprietary nature of today’s cloud market.”

IBM and Red Hat will be strongly positioned to address this issue and accelerate hybrid multi-cloud adoption. Together, they will help clients create cloud-native business applications faster, drive greater portability and security of data and applications across multiple public and private clouds, all with consistent cloud management. In doing so, they will draw on their shared leadership in key technologies, such as Linux, containers, Kubernetes, multi-cloud management, and cloud management and automation.

Notably, the announcement specifically refers to multi-cloud adoption, and says that the company intends to “build and enhance” partnerships with Amazon Web Services (AWS), Microsoft Azure, Google Cloud and Alibaba.

Red Hat will be a “distinct unit” within IBM, the intention being to preserve its open source culture and independence.

My own instinct is that we will see more IBM influence on Red Hat than Microsoft influence on GitHub, to take another recent example of an established tech giant acquiring a company with an open source culture.

IBM is coming from behind in the cloud wars, but with Linux ascendant, and Red Hat the leader in enterprise Linux, the acquisition gives the company a stronger position in today’s technology landscape.

Microsoft is making lots of money. Anything else notable in its first quarter financials?

Microsoft has released its financial statements for the first quarter of its financial year, which ended on 30th September. Here is the segment breakdown. Everything has moved in the right direction.

Quarter ending September 30th 2018 vs quarter ending September 30th 2017, $millions

Segment                               Revenue   Change   Operating income   Change
Productivity and Business Processes     9,771   +1,533              3,881     +875
Intelligent Cloud                       8,567   +1,645              2,931     +794
More Personal Computing                10,746   +1,368              3,143     +578

The segments break down as:

Productivity and Business Processes: Office, Office 365, Dynamics 365 and on-premises Dynamics, LinkedIn

Intelligent Cloud: Server products, Azure cloud services

More Personal Computing: Consumer including Windows, Xbox; Bing search; Surface hardware

Any points of interest? In his earnings call statement, CEO Satya Nadella talked up Teams, the Office 365 conferencing and collaboration solution:

“Teams is now the hub for teamwork for 329,000 organizations, including 87 of the Fortune 100. And, we are adding automated translation support for meetings, shift scheduling for firstline workers, and new industry-specific offerings including healthcare and small business.”

He also mentioned Power Apps and Flow, interesting to me because they are the most successful so far of the company’s efforts to come up with a low-code development platform:

“Power BI, Power Apps and Flow are driving momentum with customers and have made us a leader in no-code app building and business analytics in the cloud.”

He also mentioned the pending GitHub acquisition, which he says is “an opportunity to bring our tools and services to new audiences while enabling GitHub to grow and retain its developer-first ethos.”

Note that despite the cloud growth, More Personal Computing, the segment that includes Windows, remains the biggest single segment in terms of revenue.

Determining how much of Microsoft’s business is “cloud” is tricky. The figures in the productivity segment lump together Office 365 and on-premises products, while Office 365 itself is in part a subscription to desktop Office, so not pure cloud. Equally, the “intelligent cloud” segment includes on-premises server licenses. No doubt this fuzzing of what is and is not cloud in the figures is deliberate.

Windows on a Chromebook? How containers change everything

Apparently there are rumours concerning Windows on a Chromebook. I find this completely plausible, though unlike Barry Collins I would not recommend dual boot – always a horrible solution.

Rather, when I recently explored Chromebooks and Chrome OS, it was like the proverbial lightbulb going on in my head. Containers (used to implement Linux and Android on Chrome OS) change everything. The approach makes total sense: a secure, locked-down base operating system, and arbitrary applications running in isolated containers on top.

Could Chrome OS run Windows in a container? Not directly, since containers are isolated from the host operating system but share its base files and resources. However you could run Windows in a VM on a Chromebook, and with a bit of integration work this could be relatively seamless for the user; systems like Parallels do this trick on macOS. Instead of the wretched inconvenience of dual boot, you could run a Linux app here and a Windows app there, with everything integrating nicely together.

Microsoft could also re-engineer Windows along these lines. A lot of the work is already done. Windows supports containers, and you can choose the level of isolation: either lightweight process-isolated containers or containers based on Hyper-V. It also supports Linux containers, via Hyper-V. Currently all this is designed for non-visual server applications rather than client applications, but this could change. It is also possible to run Linux containers on the Windows Subsystem for Linux, though this is not officially supported.
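To make the isolation choice concrete, here is a minimal sketch using the Docker CLI on Windows; the image name and tag are illustrative and must match your host build:

    # Process isolation: lightweight, shares the host kernel
    docker run --isolation=process mcr.microsoft.com/windows/servercore:ltsc2019 cmd /c echo hello

    # Hyper-V isolation: each container gets its own minimal VM and kernel
    docker run --isolation=hyperv mcr.microsoft.com/windows/servercore:ltsc2019 cmd /c echo hello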

Windows RT failed for a few reasons: ARM-only, underpowered hardware, Windows 8 unpopularity, and most of all, inability to run arbitrary x86 Windows applications.

A container-based Windows could have the security and resilience of Windows RT, but without these limitations.

So I can imagine Google giving us the ability to run virtual Windows on Chrome OS. And I can imagine Microsoft building a future version of Windows in which you can run both Windows and Linux applications in isolated environments.

Linux Foundation Open Source Summit opens in Edinburgh: Microsoft praised for “facing reality”

The Linux Foundation Open Source Summit kicked off in Edinburgh today, with Executive Director Jim Zemlin declaring that the organization is now adding a new member daily. The Linux Foundation oversees more than 150 open source projects, including Linux, Kubernetes, Let’s Encrypt and Cloud Foundry, as well as the Cloud Native Computing Foundation, and has over 1,320 members.


Microsoft has been a member for several years, but has now also signed up to the Open Invention Network (OIN), promising patent non-aggression to other licensees. It is a significant move which has boosted both the OIN and the Linux Foundation.

image
Linux Foundation Executive Director Jim Zemlin

Keith Bergelt, CEO of OIN, took the stage to congratulate Microsoft on “facing the reality of the world as it is.”

Another important recent event is the statement by Linus Torvalds, in which he apologises for brusque behaviour and says he is taking some time off Linux kernel development:

“I need to change some of my behavior, and I want to apologize to the people that my personal behavior hurt and possibly drove away from kernel development entirely. I am going to take time off and get some assistance on how to understand people’s emotions and respond appropriately.”

What are the implications for Linux? Nobody knows; though LWN’s Jonathan Corbet spoke at this morning’s keynote to assure us that a new code of conduct, in which kernel developers promise to be nicer to each other, will be a good thing.

I interviewed Zemlin today and will post more from the event soon.

Microsoft’s Windows 10 October 2018 update on hold after some users suffer deleted documents: what to conclude?

Microsoft has paused the rollout of the October 2018 Windows update for Windows 10 while it investigates reports of users losing data after the upgrade.


Update: Microsoft’s “known issues” page now asks affected users to “minimize your use of the affected device”, suggesting that file recovery tools are needed for restoring documents, with uncertain results.

Windows 10, first released in July 2015, was the advent of “Windows as a service.” It was a profound change. The idea is that whether in business or at home, Windows simply updates itself from time to time, so that you always have a secure and up to date operating system. Sometimes new features arrive. Occasionally features are removed.

Windows as a service was not just for the benefit of us, the users. It is vital to Microsoft in its push to keep Windows competitive with other operating systems, particularly as it faces competition from increasingly powerful mobile operating systems that were built for the modern environment. A two-year or three-year upgrade cycle, combined with the fact that many do not bother to upgrade, is too slow.

Note that automatic upgrade is not controversial on Android, iOS or Chrome OS. Some iOS users on older devices have complained of performance problems, but in general there are more complaints about devices not getting upgraded, for example because of Android operators or vendors not wanting the bother.

Windows as a service has been controversial though. Admins have worried about the extra work of testing applications. There is a Long Term Servicing Channel, which behaves more like the old 2-3 year upgrade cycle, but it is not intended for general use, even in business. It is meant for single-purpose PCs such as those controlling factory equipment, or embedded into cash machines.

Another issue has been the inconvenience of updates. “Restart now” is not something you want to see just before giving a presentation, or while finishing one at the last minute. Auto-restart occasionally loses work if you have not saved documents.

The biggest worry though is the update going wrong, for example causing a PC to become unusable. In general this is rare. Updates do fail, but Windows simply rolls back to the previous version: annoying but not fatal.

What about deleting data? Again it is rare; but in this case recovery is not simple. You are in the realm of disk recovery tools, if you do not have a backup. However it turns out that users have reported updates deleting data for some time. Here is one from 4 months ago:

image

Why is the update deleting data? It is not yet clear, and there may be multiple reasons, but many of the reports I have seen refer to user documents stored outside the default location (C:\users\[USERNAME]\). Some users with problems have multiple folders called Documents. Some have moved the location the proper way (Location tab in properties of special folders like Documents, Downloads, Music, Pictures) and still had problems.
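Incidentally, if you want to check where Windows thinks your Documents folder lives, one quick way is to query the User Shell Folders registry key from a command prompt; for historical reasons the value name for Documents is Personal:

    reg query "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders" /v Personal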

Look through miglog.xml though (here is how to find it) and you will find lots of efforts to make sense of the user’s special folder layout. This is not a detailed diagnosis of the issue, just an observation from having ploughed through long threads on Reddit and elsewhere; of course these threads are full of noise.

Here is an example of a user who suffered the problem and had an unusual setup: the location of his special folders had been moved (before the upgrade) to an external drive, but there was still important data in the old locations.

We await the official report with interest. But what can we conclude, other than to take backups (which we knew already)?

Two things. One is that Microsoft needs to do a better job of prioritising feedback from its Insider programme. Losing data is a critical issue. The Feedback Hub, like the forums, is full of noise; but it is possible to identify critical issues there.

This is related, of course, to the suspicion that Microsoft is now too reliant on unpaid enthusiast testers, at the expense of thorough internal testing. Both are needed and both, I am sure, exist. What though is the proportion, and has internal testing been reduced on the basis of these widespread public betas?

The second thing is about priorities. There is a constant frustration that vendors (and Microsoft is not alone) pay too much attention to cosmetics and new features, and not enough to quality and fixing long-standing bugs and annoyances.

What do most users do after Windows upgrades? They are grateful that Windows is up and running again, and go back to working in Word and Excel. They do not care about cosmetic changes or new features they are unlikely to use. They do care about reliability. Such users are not wrong. They deserve better than to find documents missing.

One final note. Microsoft released Windows 10 1809 on 2nd October. However, the initial rollout was said to be restricted to users who manually checked Windows Update or used the Update Assistant; Microsoft said that automatic rollout would not begin until 9th October. In my case though, on one PC, I got the update automatically (no manual check, no Insider Build setting) on 3rd October. I have seen similar reports from others. I got the update on an HP PC less than a year old, and my guess is that this is the reason:

With the October 2018 Update, we are expanding our use of machine learning and intelligently selecting devices that our data and feedback predict will have a smooth update experience.

In other words, my PC was automatically selected to give Microsoft data on upgrades expected to go smoothly. I am guessing though. I am sure I did not trigger the update myself, since I was away all day on the 2nd October, and buried in work on the 3rd when the update arrived (I switched to a laptop while it updated). I did not lose data, even though I do have a redirected Documents folder. I did see one anomaly: my desktop background was changed from blue to black, and I had to change it back manually.

What should you do if you have this problem and do not have backups? Microsoft asks you to call support. As far as I can tell, the files really are deleted, so there will not be an easy route to recovery. The best chance is to use the PC as little as possible, and to do a low-level copy of the hard drive if you can. Shadow Copy Explorer may help. Another nice tool is Zero Assumption Recovery. What you recover depends on whether files have been overwritten by other files or not.
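For the low-level copy, one approach is to boot the affected machine from a Linux live USB and image the disk before trying any recovery tools. A minimal sketch, assuming the Windows disk is /dev/sda and a large external drive is mounted at /mnt/backup (adjust the device names for your own system, and be very careful with dd’s arguments):

    # Copy the raw disk to an image file, 4MB blocks, with progress output
    sudo dd if=/dev/sda of=/mnt/backup/disk.img bs=4M status=progress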

Update: Microsoft has posted an explanation of why the data loss occurred. It’s complicated, and all to do with folder redirection (with a dash of OneDrive sync). It affected some users who redirected “known folders” like Documents to another location. The April 2018 update created spurious empty folders for some of these users. The October 2018 update therefore sought to delete them, but in doing so also deleted non-empty folders. It still looks like a bad bug to me: these were legitimate folders for storing user data and should not have been removed if not empty.
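To be clear about the class of bug, this is not Microsoft’s actual code, just a shell-flavoured sketch of how a cleanup step destroys data when it assumes a folder is empty instead of checking:

    # Buggy: removes the old known-folder location unconditionally
    rm -rf "$OLD_DOCUMENTS"

    # Safer: rmdir only removes empty directories, and fails loudly otherwise
    rmdir "$OLD_DOCUMENTS"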

More encouraging is that Microsoft has made some changes to its feedback hub so that users can “provide an indication of impact and severity” when reporting issues. The hope is that Microsoft will find reports of severe bugs more easily and therefore take action.

Updated 8th Oct to remove references to OneDrive Sync and add support notes. Updated 10th Oct with reference to Microsoft’s explanatory post.

Review: Synology DS119j. Great system, but a single bay and underpowered hardware make it worth spending a bit more

Synology has released a new budget NAS, the DS119j, describing it as “an ideal first NAS for the home”.

It looks similar to the DS115j, which it probably replaces – currently both models are listed on Synology’s site. What is the difference? The operating system is now 64-bit, the CPU is now a dual-core ARMv8 (though still at 800 MHz), and the quoted read/write performance is slightly bumped from 100 MB/s to 108 MB/s, according to the documentation.

I doubt any of these details will matter to the intended users, except that the more powerful CPU will help performance – though it is still underpowered if you want to take advantage of the many applications this device supports.


What you get is the DiskStation itself, a fairly slim white box with connections for power, a Gigabit Ethernet port, and two USB 2.0 ports; it is disappointing to see the slow USB 2.0 standard used here. You will also find a power supply, an Ethernet cable, and a small bag of screws.


The USB ports are for attaching USB storage devices or printers. These can then be accessed over the network.

The DS119j costs around £100.

Initial setup

You can buy these units either empty, as mine was, or pre-populated with a hard drive. Presuming it is empty, you slide the cover off, fit the 3.5" hard drive, secure it with four screws, then replace the cover and secure that with two screws.

What disk should you buy? A NAS is intended to be always on and you should get a 3.5" disk that is designed for this. Two common choices are the WD (Western Digital) Red series, and Seagate IronWolf series. At the time of writing, both a 4TB WD Red and a 4TB IronWolf are about £100 from Amazon UK. The IronWolf Pro is faster and specified for a longer life (no promises though), at around £150.

What about SSD? This is the future of storage (though the man from Seagate at Synology’s press event said hard drives will continue for a decade or more). SSD is much faster, but on a home NAS that advantage is blunted by access over the network, and it is much more expensive for the same amount of storage. You would also need a SATA SSD and a 3.5" adapter. Probably not the right choice for this NAS.

Fitting the drive is not difficult, but neither is it as easy as it could be. It is not difficult to make bays in which drives can be securely fitted without screws. Further, the design of the bay is such that you have to angle a screwdriver slightly to turn the screws. Finally, the screw holes in the case are made entirely of plastic and it would be easy to overtighten them and strip the thread, so be careful.

Once assembled, you connect the DiskStation to a wired network and power it on. In most home settings, you will attach it to a network port on your broadband router; in other cases you may have a separate network switch. You cannot connect it over wifi, and that would anyway be a mistake, as you need the higher performance and reliability of a cabled connection.

To get started, you connect the NAS to your network (and therefore to the internet) and turn it on. To continue, you need to find it on the network, which you can do in one of several ways, including:

– Download the DS Finder app for Android or iOS

– Download Synology Assistant for Windows, Mac or Linux

– Have a look at your DHCP manager (probably in your router management for home users) and find the IP address
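A fourth option, if you have a machine with nmap installed, is to scan your subnet for DSM’s default web port. A sketch, assuming your LAN is 192.168.1.0/24 (DSM listens on port 5000 by default):

    nmap -p 5000 --open 192.168.1.0/24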

If you use DS Finder you can set up the Synology DiskStation from your phone. Otherwise, you can use a web browser (my preferred option). All you need to do to get started is to choose a username and password. You can also choose whether to link your DiskStation with a Synology account and create a QuickConnect ID for it. If you do this, you will be able to connect to your DiskStation over the Internet.

The DiskStation sets itself up in a default configuration. You will have network folders for music, photo, video, and another called home for other documents. Under home you will also find Drive, which behaves like a folder but has extra features for synchronization and file sharing. For full use of Drive, you need to install a Drive client from Synology.


If you attach a USB storage device to a port on the DS119j, it shows up automatically as usbshare1 on the network. This means that any USB drive becomes network storage, a handy feature, though only at USB 2.0 speed.

Synology DSM (Disk Station Manager)

Synology DSM is a version of Linux adapted by Synology. It is mature and robust, now at version 6.2. The reason a Synology NAS costs much more than, say, a 4TB WD Elements portable USB drive is that the Synology is actually a small server, focused on storage but capable of running many different types of application. DSM is the operating system. Like most Linux systems, you install applications via a package manager, and Synology maintains a long list of packages covering a diverse range of functions, from backup and media serving through to business-oriented uses such as running Java applications, a web server, Docker containers, support ticket management, email, and many more.

DSM also features a beautiful windowed user interface all running in the browser.


The installation and upgrade of packages is smooth and whether you consider it as a NAS, or as a complete server system for small businesses, it is impressive and (compared to a traditional Windows or Linux server) easy to use.

The question in relation to the DS119j is whether DSM is overkill for such a small, low-power device.

Hyper Backup

Given that this NAS only has a single drive, it is particularly important to back up any data. Synology includes an application for this purpose, called Hyper Backup.


Hyper Backup is very flexible and lets you back up to many destinations, including Amazon S3, Microsoft Azure, Synology’s own C2 cloud service, or local storage. For example, you could attach a large USB drive to the USB port and back up to that. Scheduling is built in.

I had a quick look at the Synology C2 service. It did not go well. I use Edge, the default web browser on Windows 10, and using Hyper Backup with Synology C2 just got me this error message:

image

I told Edge to pretend to be Firefox, which worked fine. I was invited to start a free trial. Then you get to choose a plan:

image

Plans start at €9.99 + VAT for 100GB for a year. Of course if you fill your 4TB drive that will not be enough. On the other hand, not everything needs to be backed up. Things like downloads that you can fetch again, or videos ripped from disks, are less critical, or better backed up to local drives. Cloud backup is ideal for important documents, though, since it is off-site. I have not compared prices, but I suspect that something like Amazon S3 or Microsoft Azure would be better value than Synology C2, though integration is smoother with Synology’s own service. Synology has its own datacentre in Frankfurt, so it is not just reselling Amazon S3; this may also help with compliance.

An ideal first NAS?

The DS119j is not an ideal first NAS for one simple reason: it has only a single bay, so it does not provide resilient storage. In other words, you should not have data that is stored only on this DiskStation unless it is unimportant to you. Ensure it is backed up, whether to another NAS, an external drive, or cloud storage.

Still, if you are aware of the risks with a single drive NAS and take sensible precautions, you can live with it.

I like Synology DSM, which makes these small NAS devices great value as small servers. For home users, they are great for shared folders, media serving (I use Logitech Media Server with great success), and PC backup. For small businesses, they are a strong substitute for the role once occupied by Microsoft’s Small Business Server, while being cheaper and easier to use.

If you only want a networked file share, there are cheaper options from the likes of Buffalo, but Synology DSM is nicer to use.

If you want to make fuller use of DSM though, this model is not the best choice. I noticed the CPU often spiked just using the control panel and package manager.


I would suggest stretching to at least the DS218j, which is similar but has two bays, 512MB of RAM and a faster CPU. Better still, I like the x86-based Plus series – but a 2-bay DS218+ is over £300. A DS218j is half that price, and perhaps the sweet spot for home users.

Finally, Synology could do better with documentation for the first-time user. Getting started is not too bad, but the fact is that DSM presents you with a myriad of options and applications, and a better orientation guide would be helpful.

Conclusion? OK, but get the DS218j if you can.

Linux applications and .NET Core on a Chromebook makes this an increasingly interesting device

I have been writing about Google Chromebooks of late and as part of my research went out and bought one, an HP Chromebook 14 that cost me less than £200. It runs an Intel Celeron N3350 processor and has a generous (at this price) 32GB storage; many of the cheaper models have only 16GB.

This is a low-end notebook for sure, but still boots quickly and works fine for general web browsing and productivity applications. Chrome OS (the proprietary version of the open source Chromium OS) is no longer an OS that essentially just runs Google’s Chrome browser, though that is still the main intent. It has for some time been able to run Android applications; these run in a container which itself runs Android. Android apps run fairly well though I have experienced some anomalies.

Recently Google has added support for Linux applications, though this is still in beta. The main motivation for this seems to be to run Android Studio, so that Googlers and others with smart Pixelbooks (high-end Chromebooks that cost between £999 and £1,699) can do a bit more with their expensive hardware.

I had not realised that even a lowly HP Chromebook 14 is now supported by the beta, but when I saw the option in settings I jumped at it.

image

It took a little while to download but then I was able to open a Linux terminal. Like Android, Linux runs in a container. It is also worth noting that Chrome OS itself is based on Linux so in one sense Chromebooks have always run Linux; however they have been locked down so that you could not, until now, install applications other than web apps or Android.

Linux is therefore sandboxed. It is configured so that you do not have access to the general file system. However the Chromebook Files application has access to your user files in both Chrome OS and Linux.

image

I found little documentation for running Linux applications so here are a few notes on my initial stumblings.

First, note that the Chromebook trackpad has no right-click. To right-click you do Alt-Click. Useful, because this is how you paste from the clipboard into the Linux terminal.

Similarly, there is no Delete key. To Delete you do Alt-Backspace.

I attribute these annoyances to the fact that Chrome OS was mostly developed by Mac users.

Second, no Linux desktop is installed. I did in fact install the lightweight LXDE, with partial success, but it does not work properly.

The idea is that you install GUI applications which run in their own window. It is integrated so that once installed, Linux applications appear in the Chromebook application menu.

I installed Firefox ESR (Extended Support Release). Then I installed an application which promises to be particularly useful for me, Visual Studio Code. Next I installed the .NET Core SDK, following the instructions for Debian.

image

Everything worked, and after installing the C# extension for VS Code I am able to debug and run .NET Core applications.
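For reference, here is a minimal sketch of the steps in the Linux terminal, assuming the container runs Debian 9 (as it did at the time of writing); the package feed URL and SDK version are Microsoft’s published ones for Debian 9, so check the current instructions if they have moved on:

    # Firefox ESR from the Debian repositories
    sudo apt-get update
    sudo apt-get install -y firefox-esr

    # Visual Studio Code: download the .deb from code.visualstudio.com, then
    sudo apt install ./code_*.deb

    # Register the Microsoft package feed and install the .NET Core SDK
    wget -q https://packages.microsoft.com/config/debian/9/packages-microsoft-prod.deb
    sudo dpkg -i packages-microsoft-prod.deb
    sudo apt-get update
    sudo apt-get install -y apt-transport-https dotnet-sdk-2.1

    # Create and run a console app to test
    dotnet new console -o hello
    cd hello
    dotnet run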

I understand that you will not be so lucky with VS Code if you have an ARM Chromebook. Intel x86 is the winner for compatibility.

What is significant to me is not only that you can now run desktop applications on a Chromebook, but also that you can work on a Chromebook without needing to be deeply hooked into the Google ecosystem. You still need a Google account of course, for log in and the Play Store.

You will also note from the screenshot above that Chrome OS is no longer just a full-screen web browser. Multiple overlapping windows, just like Windows and Mac.

These changes might persuade me to spend a little more on a Chromebook next time around. Certainly the long battery life is attractive. Following a tip, I disabled Bluetooth, and my Chromebook battery app is reporting 48% remaining, 9 hrs 23 minutes. A little optimistic I suspect, but still fantastic.

Postscript: I was always a fan of the disliked Windows RT, which combined a locked-down operating system with the ability to run Windows applications. Maybe container technology is the answer to the conundrum of how to provide a fully capable operating system that is also protected from malware. Having said that, there is no doubt that these changes make Chromebooks more vulnerable to malware; even if malware only runs in the Linux environment, it could do damage and steal data. The OS itself, though, will be protected.

Microsoft Azure Stack: a matter of compliance

At the Ignite conference last week in Orlando, Microsoft’s hardware partners were showing off their latest Azure Stack boxes.

In conversation, one mentioned to me that Azure Stack was selling better in Europe than in the USA. Why? Because stricter compliance regulations (perhaps alongside the fact that the major cloud platforms are all based in North America) make Azure Stack more attractive in Europe.

image
Lenovo’s Azure Stack

Azure Stack is not just “Azure for your datacentre”. It is a distinctive way to purchase IT infrastructure, where you buy the hardware but pay for the software with a usage-based model.

Azure and Azure Stack VMs are resilient, so you cannot compare the value directly with simply spinning up a VM on your own server. Azure Stack is a premium option, but the benefits are real: Microsoft mostly looks after the software, you can use the excellent Azure management tools, and you get deep integration with Azure in the cloud. Further, you can reduce the cost by scaling back at times of low demand, which is especially easy if you use abstracted services such as App Service rather than raw VMs.

How big is the premium? I would be interested to hear from anyone who has done a detailed comparison, but my guess is that running your own servers with Windows Server Datacenter licenses (allowing unlimited VMs once all the cores are licensed) is substantially less expensive.

You can see therefore that there is a good fit for organizations that want to be all-in on the cloud, but need to run some servers on-premises for compliance reasons.

Redesign coming to Outlook for Windows and Mac, but will Microsoft fix what matters most?

At its Ignite conference under way in Orlando, Microsoft has been talking about its plans for Outlook, the unavoidable email and personal information management client for Office 365 and Exchange.

A lot of UI design changes are on the way, as well as back-end changes that should improve our experience. One of the changes is that “AI-infused” search will surface top results, based on contacts we often communicate with, keyword matching and so on. Search is also getting faster; apparently it has already doubled in speed compared to earlier versions.


There will be a simplified ribbon, more use of colour, an improved calendar, and many small design changes.

On the Mac, this is what Outlook looks like today:

image

and this is what is planned:

image

The background shading is caused by transparency, which is configurable.

Nothing is set in stone and the previews we saw are just that, previews. Microsoft is looking for feedback via the Office Insider community, as well as previewing features in the application itself and inviting opinions.

It’s good to see redesign work on this application which is essential to many of us. However it is not clear that the things which matter most to me are being addressed. I had a chat with the speakers at the end and mentioned the following personal bugbears:

1. Message formatting still gets messed up, especially if you want to do things like replying inline to an email. If you click in the wrong place you can end up inheriting formatting from the message you are quoting, such that you cannot easily get back to normal typing. It is all to do with the use of Word as the message editor, but without all the features of Word to control it.

2. I’d like to see something in the UI to deter users from quoting a massive chain of previous correspondence in a message, sometimes unawares sending content that would have been better kept confidential.

3. Something many have asked for: delayed send, so that when you reply too hastily there is a window of time in which you can delete or edit the message before it is sent. Configurable, of course.

4. Attention paid to the many obscure dialogs, some of which have not been touched for decades. Take the “Open other user’s mailbox” dialog, which does not even offer a picklist; you have to type the name exactly right:

image

5. Ever had a call from someone who has inadvertently engaged Work Offline and does not know why mail is no longer arriving? I have.

6. In Outlook mobile, at least on Android, search is infuriating. It retrieves results, but if they are more than a couple of weeks old, you cannot see the message.

7. Better performance when your connection is poor. I realise this is challenging, but you would think that proper use of background processes could give the user a reasonable and informative experience. Instead, today you can get hangs, lies (“this folder is up to date”, when it is not), that certificate warning when you are on public wifi and have not yet logged in (why can’t Outlook detect this common scenario?), repeated password requests when there are network problems, and so on.

8. Why are Outlook profiles managed in a Mail applet in Control Panel? Admins know this, but why not make it an Outlook Configuration app that appears in the Start menu? It would be easier for those who get stumped when Outlook does not open.

I am sure you have your own list. The bottom line though is this: the cosmetics of the design do matter, but not as much as issues which can stop you getting things done.

Google search to become even more opaque? From answers to “journeys”

Google’s Ben Gomes has posted about the future of search. Nothing in it surprises me. Quick summary:

  • From answers to journeys: search to be more personalized and contextual, helping you “resume tasks where you left off”
  • Queryless information: surfacing information “relevant to your interests” without you asking.
  • More visual results. Because everyone likes a picture.

Personally I would prefer search to be improved in different ways. I would like:

– Clearer separation of ads from search results. It is to my mind wrong that brands have to advertise on their own brand name just to ensure that users searching for their brand find the official site, and not a competitor or intermediary.

– Better results. As a techie I am often looking for answers to technical queries. Search is very useful, but in general, I find too many results with the same question but no answers, too many old results that are no longer relevant, and not enough focus on community forums (where the answers often exist).

– Better authority. To a journalist, authority really matters; and I do not mean “reported by a well known news source”. Authority means first-party information: the announcement from the actual people or companies involved, the information on first-party sites or from actual employees. Finding this is quite a lot of work, and the algorithms could be much better.

What I do not want includes:

– over-personalized results, for two reasons. First, I am wary of giving away all the personal data which Google wants to use to personalize results. Second, factors like objectivity, balance, and accuracy matter much more to me. I do not want my own version of fake news: results designed to please me rather than to inform me. Nor do I want this for others, who may end up with a distorted view of the world.

Of course it depends what sort of search you are making. If you search for “best restaurant in Oxford”, what do you want? The most highly-rated restaurant (by some standard) among places where you typically choose to dine? Or the best according to the general population? Or the best according to top restaurant critics? It is not clear; and a journalist (say) might want a different answer to someone looking for a place to eat tonight.

All of this touches on a key point, which is search results versus marketing. Is search a way of researching information on the internet, or a marketing tool? I want the former; but unfortunately it will always be, at least in part, the latter. Particularly as we are unwilling to pay for it.

– too few results. Ten blue links was a luxury: ten answers to the same question, hopefully from different sources, so that we could see any diversity and make a selection. The search, um, experience now more often gives us just one result, or at least one prominent result with more available if you work at it. This is especially true of voice assistants, as I’ve noted elsewhere. There are obvious risks in the trend towards one-result searches, including the dominance of a few sources (and the squeezing out of the rest).

– opaque results. Wouldn’t it be great if you could find out why, exactly, Google has chosen to give you the results it has? Puzzling this out is of course the realm of countless SEO experts, and there is always the argument that if too much is known about the algorithms, they are easier to game.

The downside though is that we have to trust Google (as the dominant provider) to do the right thing in many different ways. It will not always do the right thing. If its vision of the next 20 years of search is accurate, we are being asked to become increasingly trusting, even as we are also discovering, through devastating political outcomes, that you cannot trust big algorithm-based, commercial internet providers to look after our best interests.
