There has been a great deal of enthusiasm about making indie movies on Digital SLRs (and even some television -- Joe Wilcox of this parish reminds me that an episode of “House” was filmed on a Canon 5D, as was one of the BBC’s “Wallander” episodes). There are two main reasons for this enthusiastic adoption -- firstly, both camera body and lenses are incredibly low-cost compared to a conventional digital TV or movie camcorder, and secondly, they have full-frame (35mm size) sensors to give that shallow depth of field “film-look”.
While I’ve followed all this with interest, I’ve never personally been fully convinced.
In the last few weeks we’ve been bombarded with a series of really important new hardware or software announcements. Take your pick: iPad mini, Nexus 4 and Surface among many, many, many more. Commentary is relentless from so-called official pundits and overly excited users -- what in the days of paper would have deforested an area of the planet the size of Brazil.
You know what? None of it really matters. For all the noise about what these multi-billion dollar companies make, none of them has produced anything really new. We’ve seen no paradigm shifts. No Big Ideas. Nothing that will really change our lives in any way at all. It’s all been like putting racing wheels on the family car. Looks great, but doesn’t actually achieve anything real. Absolutely, our daily lives in the West have been changed in extraordinary ways by this technology as compared to, say, 1990. But not 2012. Has the tide reached its high point? Does IT innovation really matter any more?
Computing, and I use the term in the widest sense, has always been tribal to an extent. People have loyalties, and there’s nothing wrong with that. This year, tribes are called "ecosystems", but whatever the current label, looking around the Interweb it seems to me that tribalism is becoming more prevalent and more aggressive. It’s as if everyone stood on soapboxes with their fingers in their ears, shouting "LALALALALALALA", while at the same time (a good trick, this) yelling through a megaphone that theirs is the only way and anyone who doesn’t agree is just too stupid to be considered human.
Famously, way back in 1994, the writer and thinker Umberto Eco (The Name of the Rose) compared computing loyalties to religions: Apple followers were Catholics who believed that they would find salvation through following the One True Path. Conversely, PC users, like Protestants, were obliged to find their own way through the many paths open to them, and not all would be saved. And (I guess) Linux users are the hairy prophets who come out of the desert proclaiming, "It’s really easy. Honestly. And these days you only have to scourge yourself with thorns once a week …"
The launch of the iPhone 5, and the fuss that’s being made over it (wow, 2 million sales in 24 hours) shows once again how far IT is embedded in every part of our lives. How lost would we be without any of the electronic kit and systems we so depend on? Even your toaster likely has a microprocessor embedded in it. And all of that makes us very vulnerable in ways that were almost totally unknown to our grandfathers. It’s not the natural world that has changed. It’s us.
You may remember that a few weeks ago there were widely publicized warnings of a solar storm which, in the end, had limited effects. And no doubt this caused many people to think that solar storms are never what you might call a real and serious problem. But consider this: 153 years ago, beginning on August 28th 1859, a super space storm occurred, one of such proportions as to make Hurricane Katrina look like a minor inconvenience.
The Commodore 64 celebrates its thirtieth birthday this month. That’s 64 kilobytes for around $600. A massive amount of RAM at the time. And for another $600 you could buy a 5.25-inch floppy disk drive, which could store 170kB on a disk. Programs loaded completely into RAM so that you could remove the program disk from the drive and insert another one to store data. Where can you get a word processor or database that will run in 64k now? Yes, of course we’re routinely doing things now that were only distant dreams back then. But I began my computing experience running my business on just such a Commodore 64.
By 1986 mass market PC clones featured a colossal 512k of RAM and a 4.77MHz processor. But although that was a massive step forward, in no time you needed to upgrade to 640k RAM, and then find ways of using the upper memory area between 640k and 1MB. In 1990, Windows 3.0 needed 7MB of disk space -- so you’d need a hard drive to run it, which not everyone had.
It’s the Next Big Thing. Any vaguely IT-related person just has to say something like “computing is moving to the cloud” and everyone nods their heads wisely. And so it is with Office 2013. I’ve been using the public preview of Office since it appeared two weeks ago, and I have to say I like it; and I also like the much more straightforward integration with SkyDrive and SharePoint. But there’s still no way I’m going to change my default habit of local saving and working to using the cloud as my primary storage. And here’s why.
There are several aspects to this, and the first two are most revealing of the way in which people sitting in Redmond, Wash., Cupertino, Calif., or most other major corporations live in a different world from the rest of the population of this little blue planet of ours.
Another day, another tech product launch, and all those numbers that go with it. We just love numbers, don’t we? And generally the bigger the better. (Except in cases where they’re supposed to be small, obviously.)
The numbers in hardware and software specs are useful tools, and it’s true that bigger numbers are often better. But those same numbers carry hidden dangers, too. Like a burger that’s too big to be good for you, that extra dollop of cream on your cake, or the Italian town of San Gimignano, where each family just had to build a tower taller than all the others -- we can become addicted to the figures without thinking about what they really mean. So let’s not blindly give every latest marketing prophet his profit, but consider our own health first.
The Galaxy S III is a wonderful beast, whose 8-megapixel stills camera can also shoot video in full HD (1920x1080). So is this a triumph of technological democratisation? Is the Galaxy S III all you need to challenge the dominance of the Hollywood Studios and their ridiculous $200m budgets? After all, some movies are made on Digital SLRs these days. Read on to find out.
The answer is a definite, well, maybe. As I’ve noted in my review, the S III’s pictures are surprisingly good, and while it would be a stretch to say you could shoot "The Avengers" on a smartphone, there is a lot you can do with this tiny camera. It could certainly be a terrific little helper for blogging, web-reporting and almost anything else you might want to put on the Internet, or produce for home or office use. Obviously it has limitations, even when compared to dedicated camcorders at the same price point, so here are a few suggestions and tips for getting the best out of it.
I got my shiny new Galaxy S III about ten days ago -- my first phone upgrade for quite a while. Although I’ve been watching smartphone developments with great interest, I used my Nokia N900 for nearly three years. Nothing out there really looked much better. But suddenly the flagship phones of this generation seem to be a significant improvement over their predecessors.
So was it worth the wait? Absolutely. There are plenty of reviews which give you all the numbers. This is a personal account of what it’s like to use the beast for real (with my contract committing me to it for two years).