AT&T re-enters the data services field by way of the cloud

Engineers first envisioned a realistic concept for remote storage of electronic data during the 1960s. Data would be stored and retrieved over a radically redefined telephone network, one that wasn't expected to materialize until 1980 or so. And since it required the telephone, the master of the new concept seemed inevitably to be the Bell System -- AT&T.

It didn't happen that way (the breakup of AT&T aside) because local storage ended up being relatively cheap, and hard drives made sense. But four decades later, in a vastly different global economy, businesses' appetite for storage is outgrowing the ability of even cheap technologies like hard drives to keep up. So businesses are once again investigating a telecommunications-based option, and it is against that backdrop of historical irony that AT&T is re-entering the picture. This morning, the company announced a programmed, systematic entry into the cloud-based data storage market, choosing a few customers at a time for a new on-demand storage service model it's calling Synaptic Storage as a Service.

The value proposition is this: Businesses preparing to invest in massive data storage infrastructures for such functions as historical backups of financial transactions, medical records management (a huge new legal requirement for hospitals), and handling duplicates of company webcasts, could be spending millions up front for capacity they may end up not needing. What's more, the hardware providing that capacity may reach the end of its service life and need replacing long before the company has fully amortized the investment. AT&T's service will cater specifically to businesses that need high capacity, while charging them only for the capacity they actually consume.
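To make that arithmetic concrete, here is a minimal sketch of the trade-off; the prices and growth figures are purely hypothetical illustrations, not AT&T's actual rates:

    # Illustrative only: hypothetical prices and growth, not AT&T's rates.
    # Compares buying capacity up front against paying per terabyte-month
    # when actual consumption falls short of the provisioned capacity.

    UPFRONT_COST_PER_TB = 3000.0      # assumed purchase + install cost
    CLOUD_RATE_PER_TB_MONTH = 150.0   # assumed on-demand rate

    provisioned_tb = 500                            # bought "just in case"
    consumed_tb = [40 + 10 * m for m in range(36)]  # actual 3-year growth

    upfront_total = UPFRONT_COST_PER_TB * provisioned_tb
    pay_per_use_total = sum(tb * CLOUD_RATE_PER_TB_MONTH for tb in consumed_tb)

    print(f"Up-front purchase:        ${upfront_total:,.0f}")
    print(f"Pay-per-use over 3 years: ${pay_per_use_total:,.0f}")

Under these made-up figures, the pay-per-use customer comes out ahead precisely because consumption never reaches the provisioned ceiling; reverse that assumption, and the math can flip.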

It's by no means a new field: Amazon is the trailblazer in cloud services, and IBM has been steering its Tivoli customers in the direction of the cloud since 2007. Meanwhile, startups like Zetta Technologies -- founded by a handful of Netscape veterans -- are already doing a splendid job of making the case for applying the public utility service model to mass data storage.

A recent Zetta white paper (PDF available here) makes its case more succinctly than even AT&T: "Large-scale managed NAS arrays...are very complex and expensive to operate, and this complexity increases exponentially as the scale of the storage increases. Good storage administrators are a scarce, expensive resource, and storage consumes extensive, costly space, cooling and power in the data center. Storage vendors also play too large a role in dictating both purchase quantity and purchase interval, based on confusing hardware/software bundling and configuration requirements, which may or may not align to your needs. And vendor lock-in is a real concern -- moving to another storage vendor can involve an expensive forklift upgrade in technology and significant training on a whole new set of processes and tools."

The ace that AT&T expects to play in order to trump IBM, Amazon, and Zetta in this game comes from EMC, recognized everywhere as the leading network storage provider, though its market share is said to be on a bunny-slope slide toward 25%. Still, EMC has been working on a technology called Atmos -- a networked storage infrastructure on such a massive scale that the only real way to provide it to customers, as EMC itself has said, would be through some well-known third party.

"One way end-users will utilize cloud computing is to access their applications and information from a third-party provider -- like a large telecommunications company -- that has built a global cloud infrastructure," states a recent EMC online brochure entitled "Cloud Optimized Storage." "That cloud infrastructure will make massive amounts of unstructured information available on the Web, and will require policy to efficiently disperse the information worldwide."

What EMC means by "policy" is a way for systems on the customer end to apply rules that determine when data resides in local storage, locally accessible network storage, and/or cloud storage. As an EMC white paper explains (PDF available here), "EMC Atmos improves operational efficiency by automatically distributing information based on business policy. The user-defined policies dictate how, when, and where the information resides."
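As a rough illustration of the idea, here is a minimal sketch of a placement policy; the tiers, object attributes, and thresholds are hypothetical and do not reflect EMC's actual Atmos policy syntax:

    # Hypothetical sketch of policy-driven placement, in the spirit of
    # what EMC describes; this is NOT the actual Atmos policy language.
    from dataclasses import dataclass

    @dataclass
    class StoredObject:
        name: str
        size_mb: int
        age_days: int
        compliance_hold: bool   # e.g., a medical-records retention rule

    def placement(obj: StoredObject) -> str:
        """Return the storage tier dictated by (hypothetical) business policy."""
        if obj.compliance_hold:
            return "cloud"      # long-term, geographically dispersed copies
        if obj.age_days < 30:
            return "local"      # hot data stays close to the application
        if obj.size_mb > 1024:
            return "network"    # large, warm data on locally accessible NAS
        return "cloud"          # everything else moves off-premises

    for obj in (StoredObject("q3-webcast.mp4", 2048, 45, False),
                StoredObject("patient-1138.rec", 2, 400, True)):
        print(obj.name, "->", placement(obj))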

These policies also determine by what means this data may be accessible. Though Atmos provides "legacy" file storage architectures such as CIFS (some may be surprised to consider that "legacy"), it is also capable of mandating that certain storage only be made accessible through Web applications, using protocols such as SOAP and REST. This may require a great deal of customer education as to how policy works in this context and how it should be managed, which may be why AT&T is rolling out its SSS service in what it's calling a "controlled" manner. While the rollout begins this month, the company has not yet revealed any interim milestones for wider availability, and is also not revealing any plans for offering smaller-scale services ("SSS sss?") to the general public.
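For the Web-application route, the flavor is roughly the following; the host name, path scheme, and credential header here are placeholders, not the documented Atmos or AT&T API:

    # Sketch of REST-style object access; the host, path, and auth header
    # are placeholders and do not reflect any vendor's documented API.
    import http.client

    HOST = "storage.example.net"        # placeholder service endpoint
    TOKEN = "replace-with-credential"   # placeholder auth token

    def put_object(path: str, data: bytes) -> int:
        """Store an object with a plain HTTP PUT; returns the status code."""
        conn = http.client.HTTPSConnection(HOST)
        conn.request("PUT", "/rest/objects/" + path, body=data,
                     headers={"Authorization": "Bearer " + TOKEN,
                              "Content-Type": "application/octet-stream"})
        status = conn.getresponse().status
        conn.close()
        return status

    def get_object(path: str) -> bytes:
        """Fetch the object back with an HTTP GET."""
        conn = http.client.HTTPSConnection(HOST)
        conn.request("GET", "/rest/objects/" + path,
                     headers={"Authorization": "Bearer " + TOKEN})
        body = conn.getresponse().read()
        conn.close()
        return body

    print(put_object("backups/ledger-2009.csv", b"account,amount\n"))

The appeal of the REST route is that a Web application needs nothing beyond a URL and credentials -- no CIFS mount, no client-side storage driver.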
