All posts by Tim Anderson

Bob Dylan’s Mondo Scripto work and exhibition in London

If you are in London before 30 November 2018, and are any kind of Bob Dylan fan, then I highly recommend Mondo Scripto, an exhibition of his drawings and Iron Works gates.

image

Let’s start with the Iron Works. Way back in 2001 Dylan began constructing gates from scrap metal, as gifts for friends or for his own property. “Gates appeal to me because of the negative space they allow. They can be closed but at the same time they allow the seasons and breezes to flow. They can shut you out or shut you in. And in some ways there is no difference,” he says.

Iron is part of Dylan’s history: he was brought up in Hibbing, Minnesota, iron country, near a large open-pit mine. So for Dylan it is a return to roots as well as another way to exercise his creativity. His Iron Works were exhibited at the Halcyon Gallery in London in 2013, and you can see them now peppered around the display of drawings and paintings in the current exhibition.

The gates look strong but quirky, symmetrical in some ways but not in others, painted in subtle shades that bring out the metalness and variety. There are gear wheels, chains, spanners at odd angles, animal shapes, pliers, roller skates, wheels and springs all jumbled together but making a cohesive whole. Somewhat like the way he uses language in his songs.

image 

But what of Mondo Scripto? Dylan has taken 64 songs, written out the lyrics in blocks (so they are not particularly easy to read), and illustrated each one with a drawing. For example, here is one that I like, Just Like a Woman:

image

Maybe you would expect to see a woman in the drawing; but no, this is the line “as I stand inside the rain.”

Note that the exhibits at the Halcyon Gallery are the actual drawings, not just the signed prints you can buy for £1895 each (10 songs, limited editions of 495 for each song).

I am not much interested in the collector’s aspect here. I think that is a lot of money for a print and a signature. The originals also look much sharper and better than the prints, which is disappointing if you have the print. You may be able to negotiate to buy an original, but it will cost a lot more. I would quite like one of my favourites on the wall, or a Dylan gate in the garden, but the cost is too steep for me; if I were wealthy enough for it to be spare change, perhaps I would indulge. Then again, the profits are enabling this amazing exhibition, which is free to attend, so it is not so bad.

What I am interested in is the choice of songs, the choice of images and the way they are executed, and little details like small changes in the lyrics. For example, Ballad of a Thin Man in the original:

There ought to be a law against you comin’ around
You should be made to wear earphones

and in Mondo Scripto:

There ought to be a law against you coming around
Next time don’t forget to first telephone

I was intrigued to see that in Subterranean Homesick Blues, Dylan has written out:

don’t try ‘No Doze’

which is the latest chapter in a story: the original line is “don’t tie no bows”, but through humour and mis-transcription it has become what it is. Maybe Dylan just copied it, maybe he likes it better now; who knows?

This is Dylan, so there is variation everywhere. You can get a nice book/catalogue of Mondo Scripto for £45 (this one is a good buy), and this includes 60 songs written out with their drawings. However, many of the drawings are different from those in the exhibition. Apparently the book was done first, so those are the earlier versions. Just Like Tom Thumb’s Blues, for example, has a bottle of wine in the book (“I started out on Burgundy”) and a cityscape in the exhibition (“Back to New York City”?).

Knockin’ on Heaven’s Door has special treatment. This has been done as a series of 16 drawings, pen drawings rather than pencil sketches, including a variety of different doors and techniques for knockin’ on them. It begins with a hand knock, and ends with a rap from a cross. There is also a drill, a crowbar, a bottle, and so on.

Lovely humour, but also a meditation on death? Possibly, though Dylan has been singing about death at least since Fixin’ to Die on his very first album, so we should not take this as any kind of final statement.

It is nevertheless true that Mondo Scripto is Dylan’s reflection on his best-known songs, made new by drawings which bring out a striking image or thought and which remind you how extraordinary they are.

One of the features of this beautifully laid-out exhibition is a wall of books, some by but mostly about Dylan.

image

It is a reminder of how many of us have been entertained, absorbed and challenged by this body of work.

Mondo Scripto, along with the Iron Works, is remarkable, coming from a man who also tours incessantly and is at an age when many of us sit around doing nothing much at all.

image

The Mondo Scripto song list

1. Song To Woody
2. Blowin’ in The Wind
3. Girl From The North Country
4. Don’t Think Twice, It’s All Right
5. Masters Of War
6. One Too Many Mornings
7. A Hard Rain’s A-Gonna Fall
8. Oxford Town
9. Tombstone Blues
10. Desolation Row
11. It’s All Over Now, Baby Blue
12. It’s Alright, Ma (I’m Only Bleeding)
13. Like A Rolling Stone
14. Mr Tambourine Man
15. It Ain’t Me, Babe
16. Ballad Of A Thin Man
17. Just Like Tom Thumb’s Blues
18. The Times They Are A-Changin’
19. Ballad Of A Thin Man*
20. All I Really Want To Do
21. Rainy Day Women #12 & 35
22. I Want You
23. Highway 61 Revisited
24. Leopard-Skin Pill-Box Hat
25. Stuck Inside Of Mobile With The Memphis Blues Again
26. Visions Of Johanna
27. One Of Us Must Know (Sooner Or Later)
28. Subterranean Homesick Blues
29. She Belongs To Me
30. Maggie’s Farm
31. Love Minus Zero/No Limit
32. Just Like A Woman
33. Chimes Of Freedom
34. Positively 4th Street
35. Tangled Up In Blue
36. “Knockin’ On Heaven’s Door” series
37. All Along The Watchtower
38. Lay, Lady, Lay
39. The Man In Me
40. Tomorrow Is A Long Time
41. When I Paint My Masterpiece
42. I Shall Be Released
43. Forever Young
44. Knockin’ On Heaven’s Door
45. If You See Her, Say Hello
46. One More Cup Of Coffee (Valley Below)
47. “Subterranean Homesick Blues” series
48. Shelter From The Storm
49. Simple Twist of Fate
50. You’re Gonna Make Me Lonesome When You Go
51. Gotta Serve Somebody
52. Isis
53. Jokerman
54. Every Grain Of Sand
55. Hurricane
56. This Wheel’s On Fire
57. Man In The Long Black Coat
58. Isis
59. Things Have Changed
60. Workingman’s Blues #2
61. Mississippi
62. Ain’t Talkin’
63. Highlands
64. Make You Feel My Love


Microsoft’s Windows 10 October 2018 update on hold after some users suffer deleted documents: what to conclude?

Microsoft has paused the rollout of the October 2018 Update for Windows 10 while it investigates reports of users losing data after the upgrade.

image

Update: Microsoft’s “known issues” note now asks affected users to “minimize your use of the affected device”, suggesting that file recovery tools are needed for restoring documents, with uncertain results.

Windows 10, first released in July 2015, was the advent of “Windows as a service.” It was a profound change. The idea is that whether in business or at home, Windows simply updates itself from time to time, so that you always have a secure and up to date operating system. Sometimes new features arrive. Occasionally features are removed.

Windows as a service was not just for the benefit of us, the users. It is vital to Microsoft in its push to keep Windows competitive with other operating systems, particularly as it faces competition from increasingly powerful mobile operating systems that were built for the modern environment. A two-year or three-year upgrade cycle, combined with the fact that many do not bother to upgrade, is too slow.

Note that automatic upgrade is not controversial on Android, iOS or Chrome OS. Some iOS users on older devices have complained of performance problems, but in general there are more complaints about devices not getting upgraded, for example because of Android operators or vendors not wanting the bother.

Windows as a service has been controversial though. Admins have worried about the extra work of testing applications. There is a Long Term Servicing Channel, which behaves more like the old 2-3 year upgrade cycle, but it is not intended for general use, even in business. It is meant for single-purpose PCs such as those controlling factory equipment, or embedded into cash machines.

Another issue has been the inconvenience of updates. “Restart now” is not something you want to see just before giving a presentation, or working on it at the last minute, for example. Auto-restart occasionally loses work if you have not saved documents.

The biggest worry, though, is the update going wrong, for example causing a PC to become unusable. In general this is rare. Updates do fail, but Windows simply rolls back to the previous version, which is annoying but not fatal.

What about deleting data? Again it is rare; but in this case recovery is not simple. You are in the realm of disk recovery tools, if you do not have a backup. However, it turns out that users have been reporting updates deleting data for some time. Here is one report from four months ago:

image

Why is the update deleting data? It is not yet clear, and there may be multiple reasons, but many of the reports I have seen refer to user documents stored outside the default location (C:\users\[USERNAME]\). Some users with problems have multiple folders called Documents. Some have moved the location the proper way (via the Location tab in the properties of special folders such as Documents, Downloads, Music and Pictures) and still had problems.
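
As an aside: if you want to check where Windows currently thinks your special folders live before taking a feature update, the locations are recorded in the registry under the User Shell Folders key. Here is a minimal, read-only Python sketch that simply prints the current mappings; it is my own illustration, not an official diagnostic tool, though the registry path is the standard one.

import winreg

KEY = r"Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders"

# Print where the current user's special folders (Documents, Pictures and so on)
# actually point, expanding %USERPROFILE%-style variables for readability.
with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY) as key:
    index = 0
    while True:
        try:
            name, value, _ = winreg.EnumValue(key, index)
        except OSError:
            break
        print(name, "->", winreg.ExpandEnvironmentStrings(value))
        index += 1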

Look through miglog.xml though (here is how to find it) and you will find lots of efforts to make sense of the user’s special folder layout. This is not a detailed diagnosis of the issue, just an observation made after ploughing through long threads on Reddit and elsewhere; of course these threads are full of noise.

Here is an example of a user who suffered the problem and had an unusual setup: the location of his special folders had been moved (before the upgrade) to an external drive, but there was still important data in the old locations.

We await the official report with interest. But what can we conclude, other than to take backups (which we knew already)?

Two things. One is that Microsoft needs to do a better job of prioritising feedback from its Insider Feedback Hub. Losing data is a critical issue. The Feedback Hub, like the forums, is full of noise; but it is possible to identify critical issues there.

This is related of course to the suspicion that Microsoft is now too reliant on unpaid enthusiast testers, at the expense of thorough internal testing. Both are needed and both, I am sure, exist. What, though, is the proportion, and has internal testing been reduced on the basis of these widespread public betas?

The second thing is about priorities. There is a constant frustration that vendors (and Microsoft is not alone) pay too much attention to cosmetics and new features, and not enough to quality and fixing long-standing bugs and annoyances.

What do most users do after Windows upgrades? They are grateful that Windows is up and running again, and go back to working in Word and Excel. They do not care about cosmetic changes or new features they are unlikely to use. They do care about reliability. Such users are not wrong. They deserve better than to find documents missing.

One final note. Microsoft released Windows 10 1809 on 2nd October. However, the initial rollout was said to be restricted to users who manually checked Windows Update or used the Update Assistant. Microsoft said that automatic rollout would not begin until 9th October. In my case though, on one PC, I got the update automatically (no manual check, no Insider Build setting) on 3rd October. I have seen similar reports from others. I got the update on an HP PC less than a year old, and my guess is that this is the reason:

With the October 2018 Update, we are expanding our use of machine learning and intelligently selecting devices that our data and feedback predict will have a smooth update experience.

In other words, my PC was automatically selected to give Microsoft data on upgrades expected to go smoothly. I am guessing though. I am sure I did not trigger the update myself, since I was away all day on the 2nd October, and buried in work on the 3rd when the update arrived (I switched to a laptop while it updated). I did not lose data, even though I do have a redirected Documents folder. I did see one anomaly: my desktop background was changed from blue to black, and I had to change it back manually.

What should you do if you have this problem and do not have backups? Microsoft asks you to call support. As far as I can tell, the files really are deleted so there will not be an easy route to recovery. The best chance is to use the PC as little as possible; do a low-level copy of the hard drive if you can. Shadow Copy Explorer may help. Another nice tool is Zero Assumption Recovery. What you recover is dependent on whether files have been overwritten by other files or not.

Update: Microsoft has posted an explanation of why the data loss occurred. It’s complicated and all to do with folder redirection (with a dash of OneDrive sync). It affected some users who redirected “known folders” like Documents to another location. The April 2018 update created spurious empty folders for some of these users. The October 2018 update therefore sought to delete them, but in doing so also deleted non-empty folders. It still looks like a bad bug to me: these were legitimate folders for storing user data and should not have been removed if not empty.
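
To illustrate the point, rather than to reproduce Microsoft’s actual code (which I have not seen), here is the kind of defensive check you would hope a cleanup step performs before removing a supposedly spurious folder: delete it only if it really is empty. A small Python sketch:

import os

def remove_if_truly_empty(path):
    # Remove a leftover folder only if it contains nothing at all;
    # if there is any file or subfolder, leave the user's data alone.
    try:
        with os.scandir(path) as entries:
            if any(True for _ in entries):
                return False
        os.rmdir(path)
        return True
    except OSError:
        return False

# Hypothetical example: remove_if_truly_empty(r"C:\Users\someone\Documents")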

More encouraging is that Microsoft has made some changes to its feedback hub so that users can “provide an indication of impact and severity” when reporting issues. The hope is that Microsoft will find reports of severe bugs more easily and therefore take action.

Updated 8th Oct to remove references to OneDrive Sync and add support notes. Updated 10th Oct with reference to Microsoft’s explanatory post.

Review: Synology DS119j. Great system, but single bay and underpowered hardware make it worth spending a bit more

Synology has released a new budget NAS, the DS119j, describing it as “an ideal first NAS for the home”.

It looks similar to the DS115j which it probably replaces – currently both models are listed on Synology’s site. What is the difference? The operating system is now 64-bit, the CPU now a dual-core ARMv8, though still at 800 MHz, and the read/write performance slightly bumped from 100 MB/s to 108 MB/s, according to the documentation.

I doubt any of these details will matter to the intended users, except that the more powerful CPU will help performance – though it is still underpowered, if you want to take advantage of the many applications which this device supports.

image

What you get is the DiskStation, which is a fairly slim white box with connections for power, a Gigabit Ethernet port, and two USB 2.0 ports. Disappointing to see the slow USB 2.0 standard used here. You will also find a power supply, an Ethernet cable, and a small bag of screws.

image

The USB ports are for attaching USB storage devices or printers. These can then be accessed over the network.

The DS119j costs around £100.

Initial setup

You can buy these units either empty, as mine was, or pre-populated with a hard drive. Presuming it is empty, you slide the cover off, fit the 3.5" hard drive, secure it with four screws, then replace the cover and secure that with two screws.

What disk should you buy? A NAS is intended to be always on and you should get a 3.5" disk that is designed for this. Two common choices are the WD (Western Digital) Red series, and Seagate IronWolf series. At the time of writing, both a 4TB WD Red and a 4TB IronWolf are about £100 from Amazon UK. The IronWolf Pro is faster and specified for a longer life (no promises though), at around £150.

What about SSD? This is the future of storage (though the man from Seagate at Synology’s press event said hard drives will continue for a decade or more). SSD is much faster, but on a home NAS that advantage is blunted by accessing it over a network, and it is much more expensive for the same amount of storage. You would also need a SATA SSD and a 3.5" adapter. Probably not the right choice for this NAS.

Fitting the drive is not difficult, but neither is it as easy as it could be. It is not hard to design bays in which drives can be securely fitted without screws. Further, the design of the bay is such that you have to angle a screwdriver slightly to turn the screws. Finally, the screw holes in the case are made entirely of plastic and it would be easy to overtighten them and strip the thread, so be careful.

Once assembled, you connect the DiskStation to a wired network and power it on. In most home settings, you will attach it to a network port on your broadband router; in other cases you may have a separate network switch. You cannot connect it over wifi, and that would anyway be a mistake, since you want the higher performance and reliability of a cabled connection.

To get started, the NAS needs to be connected to your network (and therefore to the internet) and turned on. To continue, you need to find it on the network, which you can do in one of several ways, including:

– Download the DS Finder app for Android or iOS.

– Download Synology Assistant for Windows, Mac or Linux

– Have a look at your DHCP manager (probably in your router management for home users) and find the IP address
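
– Alternatively, if you would rather not install an app, probe your subnet for the DSM web interface, which listens on port 5000 by default. The Python sketch below does this; it assumes a typical 192.168.1.x home network, so adjust the prefix to suit yours.

import socket

SUBNET_PREFIX = "192.168.1."   # assumption: change this to match your own network
DSM_PORT = 5000                # DSM's default HTTP port

for host in range(1, 255):
    address = SUBNET_PREFIX + str(host)
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(0.2)
        # connect_ex returns 0 if something is listening on the port
        if sock.connect_ex((address, DSM_PORT)) == 0:
            print("Possible DiskStation at", address)

Anything that answers is then worth opening in a browser on port 5000.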

If you use DS Finder you can set up the Synology DiskStation from your phone. Otherwise, you can use a web browser (my preferred option). All you need to do to get started is to choose a username and password. You can also choose whether to link your DiskStation with a Synology account and create a QuickConnect ID for it. If you do this, you will be able to connect to your DiskStation over the Internet.

The DiskStation sets itself up in a default configuration. You will have network folders for music, photo, video, and another called home for other documents. Under home you will also find Drive, which behaves like a folder but has extra features for synchronization and file sharing. For full use of Drive, you need to install a Drive client from Synology.

image

If you attach a USB storage device to a port on the DS119j, it shows up automatically as usbshare1 on the network. This means that any USB drive becomes network storage, a handy feature, though only at USB 2.0 speed.

Synology DSM (Disk Station Manager)

Synology DSM is a version of Linux adapted by Synology. It is mature and robust, now at version 6.2. The reason a Synology NAS costs much more than, say, a 4TB WD Elements portable USB drive is that the Synology is actually a small server, focused on storage but capable of running many different types of application. DSM is the operating system. Like most Linux systems, you install applications via a package manager, and Synology maintains a long list of packages covering a diverse range of functions, from backup and media serving through to business-oriented uses such as running Java applications, a web server, Docker containers, support ticket management, email, and many more.

DSM also features a beautiful windowed user interface all running in the browser.

image

The installation and upgrade of packages is smooth and whether you consider it as a NAS, or as a complete server system for small businesses, it is impressive and (compared to a traditional Windows or Linux server) easy to use.

The question in relation to the DS119j is whether DSM is overkill for such a small, low-power device.

Hyper Backup

Given that this NAS only has a single drive, it is particularly important to back up any data. Synology includes an application for this purpose, called Hyper Backup.

image

Hyper Backup is very flexible and lets you back up to many destinations, including Amazon S3, Microsoft Azure, Synology’s own C2 cloud service, or local storage. For example, you could attach a large USB drive to the USB port and back up to that. Scheduling is built in.

I had a quick look at the Synology C2 service. It did not go well. I use the default web browser on Windows 10, Edge, and using Hyper Backup to Synology C2 just got me this error message.

image

I told Edge to pretend to be Firefox, which worked fine. I was invited to start a free trial. Then you get to choose a plan:

image

Plans start at €9.99 + VAT for 100GB for a year. Of course, if you fill your 4TB drive that will not be enough. On the other hand, not everything needs to be backed up. Things like downloads that you can download again, or videos ripped from disks, are less critical, or better backed up to local drives. Cloud backup is ideal, though, for important documents, since it is an off-site backup. I have not compared prices, but I suspect that something like Amazon S3 or Microsoft Azure would be better value than Synology C2, though integration is smoother with Synology’s own service. Synology has its own datacentre in Frankfurt, so it is not just reselling Amazon S3; this may also help with compliance.

An ideal first NAS?

The DS119j is not an ideal NAS for one simple reason: it has only a single bay so does not provide resilient storage. In other words, you should not have data that is stored only on this DiskStation, unless it is not important to you. You should ensure that it is backed up, maybe to another NAS or external drive, or maybe to cloud storage.

Still, if you are aware of the risks with a single drive NAS and take sensible precautions, you can live with it.

I like Synology DSM, which makes these small NAS devices great value as small servers. For home users, they are great for shared folders, media serving (I use Logitech Media Server with great success), and PC backup. For small businesses, they are a strong substitute for the role once occupied by Microsoft’s Small Business Server, while being cheaper and easier to use.

If you only want a networked file share, there are cheaper options from the likes of Buffalo, but Synology DSM is nicer to use.

If you want to make fuller use of DSM though, this model is not the best choice. I noticed the CPU often spiked just using the control panel and package manager.

image

I would suggest stretching to at least the DS218j, which is similar but has two bays, 512MB of RAM and a faster CPU. Better still, I like the x86-based Plus series – but a 2-bay DS218+ is over £300. A DS218j is half that price and perhaps the sweet spot for home users.

Finally, Synology could do better with documentation for the first-time user. Getting started is not too bad, but the fact is that DSM presents you with a myriad of options and applications and a better orientation guide would be helpful.

Conclusion? OK, but get the DS218j if you can.

Linux applications and .NET Core on a Chromebook make this an increasingly interesting device

I have been writing about Google Chromebooks of late and as part of my research went out and bought one, an HP Chromebook 14 that cost me less than £200. It has an Intel Celeron N3350 processor and a generous (at this price) 32GB of storage; many of the cheaper models have only 16GB.

This is a low-end notebook for sure, but it still boots quickly and works fine for general web browsing and productivity applications. Chrome OS (the proprietary version of the open source Chromium OS) is no longer an OS that essentially just runs Google’s Chrome browser, though that is still the main intent. It has for some time been able to run Android applications, which run in a container hosting the Android runtime. Android apps run fairly well, though I have experienced some anomalies.

Recently Google has added support for Linux applications, though this is still in beta. The main motivation for this seems to be to run Android Studio, so that Googlers and others with smart Pixelbooks (high-end Chromebooks that cost between £999 and £1,699) can do a bit more with their expensive hardware.

I had not realised that even a lowly HP Chromebook 14 is now supported by the beta, but when I saw the option in settings I jumped at it.

image

It took a little while to download, but then I was able to open a Linux terminal. Like Android, Linux runs in a container. It is also worth noting that Chrome OS itself is based on Linux, so in one sense Chromebooks have always run Linux; however, they have been locked down so that you could not, until now, install applications other than web apps or Android apps.

Linux is therefore sandboxed. It is configured so that you do not have access to the general file system. However the Chromebook Files application has access to your user files in both Chrome OS and Linux.

image

I found little documentation for running Linux applications so here are a few notes on my initial stumblings.

First, note that the Chromebook trackpad has no right-click. To right-click you do Alt-Click. Useful, because this is how you paste from the clipboard into the Linux terminal.

Similarly, there is no Delete key. To Delete you do Alt-Backspace.

I attribute these annoyances to the fact that Chrome OS was mostly developed by Mac users.

Second, no Linux desktop is installed. I did in fact install the lightweight LXDE desktop with partial success, but it does not work properly.

The idea is that you install GUI applications which run in their own window. It is integrated so that once installed, Linux applications appear in the Chromebook application menu.

I installed Firefox ESR (Extended Support Release).  Then I installed an application which promises to be particularly useful for me, Visual Studio Code. Next I installed the .NET Core SDK, following the instructions for Debian.

image

Everything worked, and after installing the C# extension for VS Code I am able to debug and run .NET Core applications.

I understand that you will not be so lucky with VS Code if you have an ARM Chromebook. Intel x86 is the winner for compatibility.

What is significant to me is not only that you can now run desktop applications on a Chromebook, but also that you can work on a Chromebook without needing to be deeply hooked into the Google ecosystem. You still need a Google account of course, for log in and the Play Store.

You will also note from the screenshot above that Chrome OS is no longer just about a full-screen web browser. Multiple overlapping windows, just like Windows and Mac.

These changes might persuade me to spend a little more on a Chromebook next time around. Certainly the long battery life is attractive. Following a tip, I disabled Bluetooth, and my Chromebook battery app is reporting 48% remaining, 9 hrs 23 minutes. A little optimistic I suspect, but still fantastic.

Postscript: I was always a fan of the disliked Windows RT, which combined a locked-down operating system with the ability to run Windows applications. Maybe container technology is the answer to the conundrum of how to provide a fully capable operating system that is also protected from malware. Having said that, there is no doubt that these changes make Chromebooks more vulnerable to malware; even if malware only runs in the Linux environment, it could do damage and steal data. The OS itself, though, will be protected.

Microsoft Azure Stack: a matter of compliance

At the Ignite conference last week in Orlando, Microsoft’s hardware partners were showing off their latest Azure Stack boxes.

In conversation, one mentioned to me that Azure Stack was selling better in Europe than in the USA. Why? Because stricter compliance regulations (perhaps alongside the fact that the major cloud platforms are all based in North America) make Azure Stack more attractive in Europe.

image
Lenovo’s Azure Stack

Azure Stack is not just “Azure for your datacentre”. It is a distinctive way to purchase IT infrastructure, where you buy the hardware but pay for the software with a usage-based model.

Azure / Azure Stack VMs are resilient, so you cannot compare the value directly with simply running up a VM on your own server. Azure Stack is a premium option. The benefits are real. Microsoft mostly looks after the software, you can use the excellent Azure management tools, and you get deep integration with Azure in the cloud. Further, you can reduce the cost by scaling back at times of low demand, which is especially easy if you use abstracted services such as App Service rather than raw VMs.

How big is the premium? I would be interested to hear from anyone who has done a detailed comparison, but my guess is that running your own servers with Windows Server Datacenter licenses (allowing unlimited VMs once all the cores are licensed) is substantially less expensive.

You can see therefore that there is a good fit for organizations that want to be all-in on the cloud, but need to run some servers on-premises for compliance reasons.

Redesign coming to Outlook for Windows and Mac, but will Microsoft fix what matters most?

At its Ignite conference under way in Orlando, Microsoft has been talking about its plans for Outlook, the unavoidable email and personal information management client for Office 365 and Exchange.

A lot of UI design changes are on the way, as well as back-end changes that should improve our experience. One of the changes is that “AI-infused” search will surface top results, based on contacts we often communicate with, keyword matching and so on. Search is also getting faster; apparently it has already doubled in speed compared to earlier versions.

image

There will be a simplified ribbon, more use of colour, an improved calendar, and many small design changes.

On the Mac, this is what Outlook looks like today:

image

and this is what is planned:

image

The background shading is caused by transparency, which is configurable.

Nothing is set in stone and the previews we saw are just that, previews. Microsoft is looking for feedback via the Office Insider community, as well as previewing features in the application itself and inviting opinions.

It’s good to see redesign work on this application which is essential to many of us. However it is not clear that the things which matter most to me are being addressed. I had a chat with the speakers at the end and mentioned the following personal bugbears:

1. Message formatting still gets messed up especially if you want to do things like replying inline to an email. If you click in the wrong place you can still end up inheriting formatting from the message you are quoting such that you cannot easily get back to normal typing. It is all to do with the use of Word for the message editor, but without all the features of Word to control it.

2. I’d like to see something in the UI that would deter users from quoting a massive chain of previous correspondence in a message, sometimes unwittingly sending content that would have been better kept confidential.

3. Something many have asked for: delayed send, so that when you reply too hastily there is a window of time in which you can delete or edit the message before it is sent. Configurable, of course.

4. Attention paid to the many obscure dialogs, some of which have not been touched for decades. Like the Open other user’s mailbox control, which is not even a picklist; you have to type the name exactly right:

image

5. Ever had a call from someone who has inadvertently engaged Work Offline and does not know why mail is no longer arriving? I have.

6. In Outlook mobile, at least on Android, search is infuriating. It retrieves results, but if they are more than a couple of weeks old, you cannot see the message.

7. Better performance when your connection is poor. I realise it is challenging, but you would think that proper use of background processes could give the user a reasonable and informative experience. Instead, today you can get hangs, lies (“this folder is up to date”, when it is not), that certificate warning when you are on public wifi and have not logged in yet (why can’t Outlook detect this common scenario?), repeated password requests when there are network problems, and so on.

8. Why are Outlook profiles managed in a Mail applet in Control Panel? Admins know this, but why not make it an Outlook Configuration app that appears in the Start menu? It would be easier for those who get stumped when Outlook does not open.

I am sure you have your own list. The bottom line though is this: the cosmetics of the design do matter, but not as much as issues which can stop you getting things done.

Google search to become even more opaque? From answers to “journeys”

Google’s Ben Gomes has posted about the future of search. Nothing in it surprises me. Quick summary:

  • From answers to journeys: search to be more personalized and contextual, helping you “resume tasks where you left off”
  • Queryless information: surfacing information “relevant to your interests” without you asking.
  • More visual results. Because everyone likes a picture.

Personally I would prefer search to be improved in different ways. I would like:

– clearer separation of ads from search results. It is to my mind wrong that brands have to advertise based on their own brand name, just to ensure that users searching for their brand find the official site, and not a competitor or intermediary

– Better results. As a techie I am often looking for answers to technical queries. Search is very useful, but in general, I find too many results with the same question but no answers, too many old results that are no longer relevant, and not enough focus on community forums (where the answers often exist).

– Better authority. As a journalist, I find authority really matters; and I do not mean “reported by a well-known news source”. Authority means first-party information: the announcement from the actual people or companies involved, the information on the first-party sites or from actual employees. Finding this is quite a lot of work, and the algorithms could be much better.

What I do not want includes:

– over-personalized results. There are two reasons. First, I am wary about giving away all the personal data which Google wants to use to personalize results. Second, factors like objectivity, balance, and accuracy matter much more to me. I do not want my own version of fake news, results designed to please me rather than to inform me. Nor do I want this for others, who may end up with a distorted view of the world.

Of course it depends what sort of search you are making. If you search for “best restaurant in Oxford”, what do you want? The most highly-rated restaurant (by some standard) among places where you typically choose to dine? Or the best according to the general population? Or the best according to top restaurant critics? It is not clear; and a journalist (say) might want a different answer to someone looking for a place to eat tonight.

All of this touches on a key point, which is search results versus marketing. Is search a way of researching information on the internet, or a marketing tool? I want the former; but unfortunately it will always be, at least in part, the latter. Particularly as we are unwilling to pay for it.

– too few results. Ten blue links was a luxury: 10 answers to the same question, hopefully from different sources, so we can see any diversity and make a selection. The search, um, experience now more often gives us just one result, or at least, one prominent result and more available if you work at it. This is especially true of voice assistants as I’ve noted elsewhere. There are obvious risks in the trend towards one-result searches, including dominance of a few sources (and the squeezing out of the rest).

– opaque results. Wouldn’t it be great if you could find out why, exactly, Google has chosen to give you the results it does? Puzzling this out is of course the realm of countless SEO experts, and there is always the argument that if too much is known about the algorithms, they are easier to game.

The downside though is that we have to trust Google (as the dominant provider) to do the right thing in many different ways. It will not always do the right thing. If its vision of the next 20 years of search is accurate, we are being asked to become increasingly trusting, even as we are also discovering, through devastating political outcomes, that you cannot trust big algorithm-based, commercial internet providers to look after our best interests.

Microsoft will add an Azure-hosted Windows (and Office) Virtual Desktop to Office 365 for small businesses

What if small businesses could add a virtual Windows desktop option to their Office 365 subscription, enabling users to log on to a remote desktop and run the full desktop versions of Office as well as other Windows applications, without the hassle of managing local PC desktops?

This is coming, according to information gleaned from the announcements here at Microsoft’s Ignite conference in Orlando. It is all part of a new Azure service called Windows Virtual Desktop, which sees Microsoft getting serious about desktop virtualisation on Azure for the first time.

image
Microsoft Ignite is under way in Orlando, Florida

Windows desktop virtualization is already available on Azure, but Microsoft currently points you to its partners Citrix or VMware for this. You will still be able to do this, and the third-party solutions still have advantages, especially in terms of management and imaging tools, but of course they are expensive. Windows Virtual Desktop will be interesting to large organisations which are already licensed for virtual desktop access via subscriptions including Windows 10 Enterprise E3 and E5.

There will apparently be options for both hosted Windows 10, and shared hosting using Remote Desktop Services, but exactly what will be in the Office 365 offering is not yet clear.

A cost-effective solution for small businesses wanting a hosted virtual desktop on Azure is something new though and if Microsoft prices it right, I would expect it to be popular. Virtual desktops are handy for staff working at home or on the go, for example.

Will the pricing be right? That is not yet known of course. But it does look hopeful that Microsoft may be moving away from its policy of making Windows desktop virtualisation deliberately expensive in order to protect its licensing income. 

Microsoft’s 82 Ignite announcements: what really matters

Microsoft’s PR team has helpfully summarised many of the announcements at the Ignite event, kicking off today in Orlando. I count 82, but you might make it fewer or many more, depending on what you call an announcement. And that is not including the business apps announcements made at the end of last week, most notably the arrival of the HoloLens-based Remote Assist in Dynamics 365.

image

Not all announcements are equal. Some, like the release of Windows Server 2019, are significant but not really news; we knew it was coming around now, and the preview has been around for ages. Others, like larger Azure managed disk sizes (8, 16 and 32TB), are cool if that is what you need, but hardly surprising; the specification of available cloud infrastructure is continually being enhanced.

Note that this post is based on what Microsoft chose to reveal to press ahead of the event, and there is more to come.

It is worth observing though that of these 82 announcements, only 3 or 4 are not cloud related:

  • SQL Server 2019 public preview
  • [Windows Server 2019 release] – I am bracketing this because many of the new features in Server 2019 are Azure-related, and it is listed under the heading Azure Infrastructure
  • Chemical Simulation Library for Microsoft Quantum
  • Surface Hub 2 release promised later this year

Microsoft’s journey from being an on-premises company, to being a service provider, is not yet complete, but it is absolutely the focus of almost everything new.

I will never forget an attendee at a previous Microsoft event a few years back telling me, “this cloud stuff is not relevant to us. We have our own datacenter.” I cannot help wondering how much Office 365 and/or Azure that person’s company is consuming now. Of course on-premises servers and applications remain important to Microsoft’s business, but it is hard to swim against the tide.

Ploughing through 82 announcements would be dull for me to write and you to read, so here are some things that caught my eye, aside from those already mentioned.

1. Azure confidential computing in public preview. A new series of VMs using Intel’s SGX technology lets you process data in a hardware-enforced trusted execution environment.

2. Cortana Skills Kit for Enterprise. Currently invite-only, this is intended to make it easier to write business bots “to improve workforce productivity” – or perhaps, an effort to reduce the burden on support staff. I recall examples of using conversational bots for common employee queries like “how much holiday allowance do I have remaining, and which days can I take off?”. As to what is really new here, I have yet to discover.

3. A Python SDK for Azure Machine Learning. Important given the popularity of Python in this space.

4. Unified search in Microsoft 365. Is anyone using Delve? Maybe not, which is why Microsoft is bringing a search box to every cloud application, which is meant to use Microsoft Graph, AI and Bing to search across all company data and bring you personalized results. Great if it works.

5. Azure Digital Twins. With public preview promised on October 15, this lets you build “comprehensive digital models of any physical environment”. Once you have the model, there are all sorts of possibilities for optimization and safe experimentation.

6. Azure IoT Hub to support the Android Things platform via the Java SDK. Another example of Microsoft saying, use what you want, we can support it.

7. Azure Data Box Edge appliance. The assumption behind Edge computing is both simple and compelling: it pays to process data locally so you can send only summary or interesting data to the cloud. This appliance is intended to simplify both local processing and data transfer to Azure.

8. Azure Functions 2.0 hits general availability. Supports .NET Core, Python (see the sketch after this list).

9. Helm repositories in Azure Container Registry, now in public preview.

10. Windows Autopilot support extended to existing devices. This auto-configuration feature previously only worked with new devices. Requires the Windows 10 October 2018 Update, or an automated upgrade to it.
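
Since Python support is one of the headline additions in Functions 2.0, here is roughly what a minimal HTTP-triggered function looks like in the Python programming model. This is my own illustrative sketch: it also needs the azure-functions package and the usual function.json binding configuration, which I have omitted.

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Echo a greeting based on the ?name= query string parameter
    name = req.params.get("name")
    if name:
        return func.HttpResponse("Hello, " + name)
    return func.HttpResponse("Pass a name in the query string", status_code=400)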

Office and Office 365

In the Office 365 space there are some announcements:

1. LinkedIn integration with Office 365. Co-author documents and send emails to LinkedIn contacts, and surface LinkedIn information in meeting invites.

2. Office Ideas. Suggestions as you work to improve the design of your document, or suggest trends and charts in Excel. Sounds good but I am sceptical.

3. OneDrive for Mac gets Files on Demand. A smarter way to use cloud storage, downloading only files that you need but showing all available documents in Mac Finder.

4. New staff scheduling tools in Teams. Coming in October. “With new schedule management tools, managers can now create and share schedules, employees can easily swap shifts, request time off, and see who else is working.” Maybe not a big deal in itself, but Teams is huge, as I previously noted. Apparently the largest Team is over 100,000 strong now and there are 50+ out there with 10,000 or more members.

Windows Virtual Desktop

This could be nothing, or it could be huge. I am working on the basis of a one-paragraph statement that promises “virtualized Windows and Office on Azure … the only cloud-based service that delivers a multi-user Windows 10 experience, is optimized for Office 365 Pro Plus … with Windows Virtual Desktop, customers can deploy and scale Windows and Office on Azure in minutes, with built-in security and compliance.”

A preview is targeted by the end of 2018.

Virtual Windows desktops are already available on Azure, via partnership with Citrix or VMware Horizon, but Microsoft has held back from what is technically feasible in order to protect its Windows and Office licensing income. By the time you have paid for licenses for Windows Server, Remote Desktop Services access per user, Office per user, and whatever third-party technology you are using, it gets expensive.

This is mainly about licensing rather than technology, since supporting multiple users running Office applications is now a light load for a modern server.

If Microsoft truly gets behind a pure first-party solution for hosted desktops on Azure at a reasonable cost, the take up would be considerable since it is a handy solution for many scenarios. This would not please its partners though, nor the many hosting companies which offer this.

On the other hand, Microsoft may want to compete more vigorously with Amazon Web Services and its Workspaces offering. Workspaces is still Windows, but of course integrates nicely with AWS solutions for storage, directory, email and so on, so there is a strategic aspect here.

Update: A little more on Windows Virtual Desktop here.

More details soon.

Microsoft Office 365 and Google G-Suite: why multi-factor authentication is now essential

Businesses using Office 365, Google G-Suite or other hosted environments (but especially Microsoft and Google) are vulnerable to phishing attacks that steal user credentials. Here is a recent example, which sailed through Microsoft’s spam and malware filters despite its attempts to use AI and other techniques to catch them.

image

If a user clicks the link and signs in, the bad guys have their credentials. What are the consequences?

– at best, a bunch of spam sent out from the user’s account, causing embarrassment and a quick password reset.

– at worst, something much more serious. Once an unauthorised party has user credentials, there are all sorts of social engineering possibilities to escalate the attack, obtain other credentials, or see what interesting data can be found in collaborative document stores and shared applications.

– another risk is that the attacker discovers information about an organisation’s customers and contacts them to advise of new bank details, which of course direct payments to the attacker’s account.

The truth is there are many risks and it is worth every effort to prevent this happening in the first place.

However, it is hard to educate every user to the extent that you can be confident they will never click a link in an email such as the one above, or reveal their password in some other way – such as reusing a password that has already been leaked (check here to find out, for example).

Multi-factor authentication (MFA), which is now easy to set up on both Office 365 and G-Suite, helps matters by requiring users to enter a one-time code from their mobile, either via an authenticator app or a text message, before they can log in. It does not cost any extra, and now is the time to set it up, if you have not already.
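
For the curious, the one-time codes an authenticator app generates are not magic: they are standard TOTP (RFC 6238), an HMAC of the current 30-second time step using a secret shared with the service when you scan the setup QR code. Nothing is transmitted between phone and service at login time; both sides simply compute the same code. A minimal Python sketch, with a purely illustrative secret:

import base64, hashlib, hmac, struct, time

def totp(secret_b32, interval=30, digits=6):
    # Decode the shared secret and HMAC the current time step
    key = base64.b32decode(secret_b32, casefold=True)
    counter = struct.pack(">Q", int(time.time()) // interval)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226), then keep the last few digits
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # example secret, not a real one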

It seems to me that in some ways the prevalence of a few big providers in hosted email and applications has made matters easier for the hackers. They know that a phishing attack simulating, say, Office 365 support will find many potential victims.

The more positive view is that even small businesses can now easily use Enterprise-grade security, if they choose to take advantage.

I do not think MFA is perfect. It usually depends on a mobile phone, and given that possession of a user’s phone also often enables you to reset the password, there is a risk that the mobile becomes the weak link. It is well known that social engineering against mobile providers can persuade them to cancel a SIM and issue a new one to an impostor.

That said, hijacking a phone is a lot more effort than sending out a million phishing emails, and on balance enabling MFA is well worth it.