Tech tribalism leads to BAD computing decisions

Computing, and I use the term in the widest sense, has always been tribal to an extent. People have loyalties, and there’s nothing wrong with that. This year, tribes are called "ecosystems", but whatever the current label, looking around the Interweb it seems to me that tribalism is becoming more prevalent and more aggressive. It’s as if everyone stood on soapboxes with their fingers in their ears, shouting "LALALALALALALA", while at the same time (a good trick, this) yelling through a megaphone that theirs is the only way and anyone who doesn’t agree is just too stupid to be considered human.

Famously, way back in 1994, the writer and thinker Umberto Eco (The Name of the Rose) compared computing loyalties to religions: Apple followers were Catholics who believed that they would find salvation through following the One True Path. Conversely, PC users, like Protestants, were obliged to find their own way through the many paths open to them, and not all would be saved. And (I guess) Linux users are the hairy prophets who come out of the desert proclaiming, "It’s really easy. Honestly. And these days you only have to scourge yourself with thorns once a week …"

Tribal Warfare

Divisions like this cannot be a good thing, and we can’t all be right. Why don’t we begin by admitting that there is no best ecosystem? They all have good points, and they all have weaknesses. It all depends on what you want to do, because nobody and nothing is best at everything.

It used to be axiomatic that you looked at the problem (or need) first, identified the software solution that best fitted what you had to do, and purchased whatever hardware you needed to run it on. Tribal loyalties turn that procedure on its head. If you decide on the platform you’ll use before you even know what’s available, the danger is that in the end nobody is really happy.

Multiculturalism Computes

A couple of examples: a little while ago I was involved in setting up a small digital TV channel from scratch. The decision on equipping the edit suites was entirely software (not ecosystem) driven. We needed editing software that could fit in with broadcast television stations and facility houses, and that experienced freelance editors had already worked with. The obvious choice was Avid, but we felt we’d rather look at more cost-effective alternatives.

Although I have personally worked with Adobe Premiere Pro on Windows for some time, for the TV channel we chose Apple Final Cut Pro on Mac Pros -- because that was right for the problem we had at the time. Final Cut Pro is a great piece of software, but then so is Premiere Pro. (They’re extremely similar, partly because Final Cut was initially developed by one of the guys who had previously written Premiere for Adobe.) Final Cut was the right choice in that situation simply because there were more freelance editors around with experience of Final Cut Pro than of Premiere Pro.

The office desktops, however, were Windows PCs, partly because it was easier to support Microsoft's operating system, but mainly because Windows software was what our employees were used to. In both cases the choice was made to provide the greatest compatibility with the surrounding environment. (And, though I have used PCs and Windows since the 1980s, the experience of trying this combination made me think very seriously about going for a Mac Pro on my next upgrade cycle. Except now it seems Apple has binned the Mac Pro and killed Final Cut…)

PC Pluralism

Another example. We started out running coolcucumber.tv, a principally Internet TV channel, using Windows Media Video on a Windows Media Server. This was because at the time we started (about three years ago) the picture quality for a given bandwidth was demonstrably better with Windows Media Video than with anything else -- especially when combined with the Windows streaming server. The situation has changed since then, and the improved quality and popularity of H.264/MPEG4 have made it the obvious choice for streaming video.

So we have replaced the Windows video with H.264/MPEG4 (re-encoded from the original masters, of course), and moved to a Linux server, because that’s what we need to give the widest accessibility to the video. (In particular, with the software we’re using, streaming to iPads/iPhones only works from a Linux server.)
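For the curious, the re-encode itself is nothing exotic. Here is a minimal sketch of the sort of batch job it amounts to -- assuming ffmpeg is installed and the masters sit in a local folder; the directory names and bitrates are purely illustrative, not our actual settings:

# batch_reencode.py -- illustrative sketch only; folder names and bitrates are
# assumptions, not the channel's real settings. Requires ffmpeg on the PATH.
import pathlib
import subprocess

MASTERS = pathlib.Path("masters")   # hypothetical folder of original master files
OUTPUT = pathlib.Path("h264")       # hypothetical destination for the web copies
OUTPUT.mkdir(exist_ok=True)

for master in sorted(MASTERS.glob("*.mov")):
    target = OUTPUT / (master.stem + ".mp4")
    subprocess.run([
        "ffmpeg", "-i", str(master),
        "-c:v", "libx264", "-b:v", "2500k",   # H.264 video at an example bitrate
        "-c:a", "aac", "-b:a", "128k",        # AAC audio
        "-movflags", "+faststart",            # index at the front so playback starts sooner
        str(target),
    ], check=True)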

These are (small-scale) corporate examples, but there’s no reason why you shouldn’t do this for yourself. Next time you decide to upgrade your desktop, laptop, or other device, you could take a little time to look at what your habits and needs really are, and whether a different device might suit you better.

Although we all have personal preferences, the most important thing is the overall picture, rather than an unthinking "well, it obviously has to be Mac, or Windows, or iPad or Android". In the end you might not change anything, but at least you will have looked with open eyes and an open mind.

And finally… This year’s prize for flexibility of thought goes to the person of my acquaintance who recently, and reluctantly, acquired something that could run Microsoft Word to use alongside his 25-year-old RISC OS Acorn machine. He made the decision because he works with so many others who use Word and he needed the compatibility. Software-driven, see?
