How is a television like a fax machine? They are both obsolete.
Remember a time when nobody had a fax machine? Then suddenly everybody had one. And now, once again, nobody has a fax machine. What would once have arrived by fax today comes as a PDF attached to an e-mail, a text, or any of a number of messaging services. Well, the same transformation is happening to traditional television, and for generally similar reasons. Just as fax machines seemed to disappear in only a few years, I’ll be surprised if broadcast TV in the U.S. survives another decade.
I’ve already written one prediction about autonomous cars -- that they’ll be far later to the market than most pundits and autonomous car inventors are suggesting. Today’s prediction is about a tangentially-related technology -- aerial delivery drones. These drones are definitely coming just as fast as regulators will allow them, but I don’t think they’ll be implemented in the way people expect. What we’ll see, I predict, is something I call Pizza-to-the-Neighborhood or PTTN.
Aerial drones are a new type of distribution network operating in a new kind of ether. They don’t travel on roads, and neither do they travel in what we conventionally think of as airspace. Over cities, which is where these delivery drones are going to be used, airplanes are legally restricted from operating below 1000 feet unless they are actively taking off or landing. Helicopters get to bend this rule a bit because they can claim to be taking off or landing almost anywhere, but fixed-wing aircraft have to stay above 1000 feet, making the space below 1000 feet the emerging realm of autonomous drones.
We all know people who seem to not like anything. There are very successful people who sometimes seem to have reached that success entirely through saying "no." I’m not that kind of person. I’m an optimist. I’m even a bit of a risk-taker. But I can’t say that we’re going to see anything beyond more beta tests of self-driving cars in 2019. So my Prediction #4 is that self-driving cars won’t hit the retail market in any fashion this year. We simply aren’t ready and probably won’t be for years to come.
The problem with self-driving cars isn’t the technology. Heck, we’ve had the technology pretty much whipped for the past decade. Throw in all the more recent data collected by Google and -- especially -- all those Teslas on Autopilot, and nearly all the kinks have been worked out of having cars drive themselves. Still, it won’t be allowed to happen, because people are going to die -- mainly because of idiot drivers.
Prediction #2 -- And then there were only 3.5 VPC Cloud players. Cloud computing will continue to grow in 2019 with the key term being not Public Cloud, Private Cloud or Hybrid Cloud -- which are all so 2018 -- but Virtual Private Cloud (VPC). Virtual Private Cloud is an Amazon Web Services (AWS) invention but all the AWS competitors seem to be embracing the idea.
What will develop is that VPC solutions based on Open Source Linux will change the Infrastructure-as-a-Service (IaaS) cloudscape to VPC-only during 2019.
People -- well, investors and financial analysts -- seem to worry a lot about Apple. They tend to see Apple as either wonderful or terrible, bound for further greatness or doomed. What Apple actually is, is huge -- a super tanker of a company. And, like a super tanker, it’s hard to quickly change Apple’s direction or to make it go appreciably faster or slower. Those who see Apple as doomed, especially, should remember they are worrying about the most profitable enterprise in the modern history of business. Those who see Apple as immortal should remember that’s impossible.
The worry about Apple in 2019 seems to be that the smart phone market may have peaked, or maybe that Apple has made the mistake of building its products so well that they last too long. Then there’s the concern that Steve Jobs is gone and why isn’t Apple reinventing itself and the world yet again through another new product category?
Now, finally, to my predictions for 2019. This is, I believe, my 22nd and possibly last year of looking ahead, so I want to do something different and potentially bigger. Our old format works fine but I’ve been pondering this and I really think we’re at a sea-change in technology. It’s not just that new tech is coming but we as consumers of that tech are in major transitions of our own. It has as much to do with demographics as technology. So while I’ll be looking ahead all this week, coming up with the usual 10 predictions, I want to make sure we all understand that this isn’t business as usual. This time it really IS different.
I’ve been thinking about 50-year cycles. The year 1968, which was 50 years in the past when I started writing this column back in November, saw huge social and political upheaval, with student riots all over the world, the rise of the hippie movement in the USA, the Summer of Love, and the founding of Intel. Most of the technical progress we have seen since 1968 has been driven by microprocessors, which were largely the work of Intel. And it took 50 years, but we’re now approaching the Internet of Things, where processors will be in everything and everything will be linked or monitored, which is either good or bad depending on who you are.
I can’t put this off any longer, so here are the tech predictions I made a year ago for 2018. We have to see how well or poorly I did before we can move on to my predictions for 2019 and beyond. These old predictions have been edited for length, but not to avoid embarrassment. I try to never avoid embarrassment.
One thing I’ve noticed over the years is that my predictions get longer and longer (this column, alone, is 4329 words -- my second longest, ever) as they have drifted from new products to explaining new strategies. This sometimes works against the prediction since it is often easier to claim success if your goal is vague, but I see it more as a tribute to my readers. Many of you have been with me for decades and the very fact that we are both still here has as much to do with the work as with its results. How the future fits together is just as important as where it is heading.
With Apple shares down more than 20 percent from their all-time highs of only a few weeks ago, writers are piling on about what’s wrong in Cupertino. But sometimes writers looking for a story don’t fully understand what they are talking about. And that seems to me to be the case with complaints that Apple is too far behind in adopting 5G networking technology in future iPhones. For all the legitimate stories about how Apple should have done this or that, 5G doesn’t belong on the list. And that’s because 5G isn’t really about mobile phones at all.
Just to get this out of the way, I see Apple shares currently presenting a huge buying opportunity. A good Christmas quarter will regain that lost 20 percent, and I don’t see any reason why Apple shouldn’t have a good Christmas quarter.
So IBM is buying Red Hat (home of the largest Enterprise Linux distribution) for $34 billion and readers want to know what I think of the deal. Well, if I made a list of acquisitions and things to do to save IBM, buying Red Hat would have been very close to the top of that list. It should have bought Red Hat 10 years ago when the stock market was in the gutter. Jumping the gun a bit, I have to say the bigger question is really which company’s culture will ultimately dominate? I’m hoping it’s Red Hat.
The deal is a good fit for many reasons explained below. And remember Red Hat is just down the road from IBM’s huge operation in Raleigh, NC.
Microsoft co-founder Paul Allen died on Monday at age 65. His cause of death was non-Hodgkin lymphoma, the same disease that nearly killed him back in 1983. Allen, who was every bit as important to the history of the personal computer as Bill Gates, had found an extra 35 years of life back then thanks to a bone marrow transplant. And from the outside looking in, I’d say he made great use of those 35 extra years.
Of all the early PC guys, Allen was probably the most reclusive. Following his departure from Microsoft in 1983 I met him only four times. But prior to his illness Allen had been a major factor at Microsoft and at MITS, maker of the original Altair 8800 microcomputer for which Microsoft provided the BASIC interpreter and where Allen was later head of software.
Kai-Fu Lee's new book says Artificial Intelligence will be Google vs China and will kill half the world's jobs
Kai-Fu Lee was born in Taiwan but grew up in Tennessee, which is nothing -- nothing -- like Taiwan or China. His PhD is from Carnegie Mellon, and for the first half of his career Lee was "that voice recognition guy," first at Apple, then Microsoft, then Google. Lee took Google to China the first time (a new Google China effort is starting just now). Today Lee is an Artificial Intelligence expert who runs a $1 billion venture fund with offices in Taipei and Beijing and, according to Anina (the pretty girl in the picture with me on my site who has lived in Beijing for most of the last decade), he’s "an absolute technology rock star -- everyone in China knows Kai-Fu Lee."
Lee is also a prolific author and in his latest book, coming out in September, he wants to explain to the world how Artificial Intelligence will be dominated by Silicon Valley (mainly Google) and China, with China having somewhat of an edge, how half of all jobs in the world are going to disappear because of AI, but how that only sounds like a bad thing. Well maybe it is a bad thing but there’s no way around it and things could all turn out better in the end. Maybe.
My favorite UK TV producer once had to sell his house in Wimbledon and move to an apartment in Central London just to get his two adult sons to finally leave home. Now something similar seems to be happening in American IT. Some people are calling it age discrimination. I’m not sure I’d go that far, but the strategy is clear: IT is urbanizing -- moving to city centers where the labor force is perceived as being younger and more agile.
The poster child for this tactic is McDonald’s, based for 47 years in Oak Brook, Illinois, until just this summer, when it moved to a new Intergalactic HQ downtown in the Chicago Loop. Not everybody has left the old digs, but McDonald’s has opened a software division at the new HQ specifically working on McDonald’s cloud offerings, which is to say working on the future of McDonald’s IT.
After 31 years of doing this column pretty much without a break, I’m finally back from a family crisis and moving into a new house, which sadly are not the same things. Why don’t I feel rested? I have a big column coming tomorrow but wanted to take this moment to just cover a few things that I’ve noticed during our move.
We have become cord cutters. Before the fire we had satellite TV (Dish) and could have kept it, but I wanted to try getting our video entertainment strictly over the Internet. It’s been an interesting experience so far and has taught us all a few lessons about what I expect will be a coming crisis of people blowing past their bandwidth caps.
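To see why blowing past a bandwidth cap is so easy for a streaming-only household, here is a back-of-the-envelope sketch. The 1 TB cap and the per-hour streaming rates are my own illustrative assumptions, not figures from Dish or any particular ISP:

```python
# Rough estimate of monthly streaming data use against an ISP data cap.
# All numbers are illustrative assumptions, not quotes from any provider.

GB_PER_HOUR = {"HD": 3.0, "4K": 7.0}  # assumed approximate streaming rates
CAP_GB = 1024  # an assumed 1 TB monthly cap

def monthly_usage_gb(hours_per_day: float, quality: str) -> float:
    """Data consumed over a 30-day month at a given daily viewing habit."""
    return hours_per_day * 30 * GB_PER_HOUR[quality]

for quality in ("HD", "4K"):
    # Five hours a day is a plausible TV-heavy household
    used = monthly_usage_gb(hours_per_day=5, quality=quality)
    status = "exceeded" if used > CAP_GB else "OK"
    print(f"{quality}: {used:.0f} GB used, cap {status}")
```

Under these assumptions, the same viewing habit that fits comfortably under the cap in HD (450 GB) blows past it once the household switches to 4K (1050 GB), which is the crisis in miniature.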
One of the darkest secrets of Information Technology (IT) is called the Productivity Paradox. Google it and you’ll learn that for at least 40 years, study after study has found that spending money on IT -- any money -- doesn’t increase organizational productivity. We don’t talk about this much as an industry because it’s the negative side of IT. Instead we speak in terms of Return on Investment (ROI) or Total Cost of Ownership (TCO). But there is finally some good news: cloud computing actually increases productivity, and we can prove it.
The Productivity Paradox doesn’t claim that IT is useless, by the way, just that we tend to spend more money on it than we get back in benefits from those expenditures. IT still enabled everything from precision engineering to desktop publishing to doctoring movie star photos, but did so at a considerable cost. Follow the history of any organization more than 50-60 years old and you’ll see that they acquired along the way whole divisions devoted not to manufacturing or sales but just to schlepping bits and keeping them safe. Yes, IT reduced the need for secretaries, telephone operators, and travel agents, but it more than replaced those with geeks generally making higher wages.
I began writing the print version of this rag in September, 1987. Ronald Reagan was President, almost nobody carried a mobile phone, Bill Gates was worth $1.25 billion, and there was no Internet in the sense we know it today because Al Gore had yet to "invent" it. My point here is that a lot can change in 30+ years and one such change that is my main topic is that, thanks to the GDPR, the Internet is no longer American. We’ve lost control. It’s permanent and probably for the best.
Before readers start attacking, let’s first deal with the issue of Al Gore and the Internet. What Gore actually said to Wolf Blitzer on CNN in March, 1999 was "During my service in the United States Congress, I took the initiative in creating the Internet." And he did. From 1986 to 1991 Gore sponsored various bills to both expand and speed up what had been the ARPAnet and to allow commercial activity on the network.