What cloud transformation means for your legacy systems
With Salesforce purchasing Slack for $27.7 billion and Microsoft Teams reporting over 115 million daily users, collaboration tools have proven they are here to stay. With remote work now ubiquitous, companies rely on sharing data outside the traditional, on-premises network to reach employees at home. For organizations to get the most from their communication platforms, data policies must therefore allow remote access to files. As a result, many organizations have been forced to accelerate their cloud adoption to meet the needs of a remote workforce, which raises the question: what is to become of legacy systems?
While much thought is being given to new cloud initiatives, this narrow focus can let legacy data and systems fall by the wayside. Data regulations do not pertain only to the storage of new information; they also mandate the proper storage of data from years past. So while it is critical during a cloud transition to consider how to defensibly govern data remotely, consideration must also be given to how to access, amend, delete, and recall data in legacy systems. If you cannot retrieve user records, personal data, and the information needed for corporate and legal compliance, then simply storing it is pointless, as it fails to meet the demands of regulators. As the EU’s General Data Protection Regulation and the California Consumer Privacy Act have recently shown, any customer may request that an organization produce their stored personal data, and failing to meet these requests can cost millions.
Pandemic speeds up migration of infrastructure to the cloud
A new report from hybrid infrastructure solutions provider INAP shows that 54 percent of tech leaders say the pandemic has motivated their organization to move applications and workloads off-premise.
The survey of 500 IT infrastructure managers and senior technology leaders reveals that 53 percent say that their organizations are migrating to colocation and hyperscale public cloud environments, with 50 percent also turning to hosted private cloud solutions.
Combating the rising costs of the cloud [Q&A]
Many businesses have moved their operations to the cloud in recent years, spending an estimated $96.4 billion on cloud infrastructure services last year.
The pandemic has accelerated this shift and has led many enterprises to take a hard look at the rising costs of the cloud and consider how they can get the most from their investment.
The state of the public cloud in the enterprise
How are enterprises using the public cloud? How mature are cloud programs and operating models? What are the main technical and business benefits? What is holding businesses back? What are the next steps?
At the beginning of 2020, the Contino team set out to answer these questions.
Employees' home networks could lead to SMEs failing security assessments
With the pandemic forcing more people to work from home, businesses in the UK -- particularly smaller ones -- may not have considered the fact that their employees' home networks now fall under the scope of regulatory and certification requirements.
According to a report from support solutions company A&O IT Group, if an individual works from home more than half of their time, their network must be compliant with current regulations.
Skills shortages blamed for problems with SAP cloud migration
SAP remains one of the most commonly used enterprise resource planning tools and many businesses are looking to migrate their SAP to the cloud from on-premise systems. But this is being held back by a shortage of skills.
New research from Ensono shows just four percent of UK IT leaders say they have completed their SAP to public cloud migration strategy. Of those who haven't completed projects, 80 percent say they have postponed or canceled their migration of SAP applications to the public cloud due to the SAP skills shortage, and 74 percent have made a similar decision due to a lack of public cloud skills.
Why do you need a global footprint for your cloud?
With the increased awareness around cloud solutions, most organizations immediately think about reducing cost and shortening time-to-market. As more ideas around cloud are discussed, other criteria like performance, security, compliance, workload segmentation, and how to integrate the cloud into an existing environment become more relevant. The profile of a global cloud footprint, however, is an equally important consideration.
It may be time to think about why having a standardized global cloud footprint matters. Here are ten good reasons why:
86 percent of IT pros see hybrid cloud as the ideal model
Hybrid cloud is seen as the ideal infrastructure model according to 86 percent of respondents to a new survey by Nutanix.
It also reveals that the pandemic has shifted the way IT leaders think about their future plans. The majority of respondents (nearly 76 percent) report the pandemic has made them think more strategically about IT, and nearly half (46 percent) say their investments in hybrid cloud, spanning public and private clouds, have increased as a direct result of the pandemic.
Less than a third of organizations use cloud data leakage protection
Only 31 percent of organizations use cloud DLP, despite 66 percent citing data leakage as their top cloud security concern, according to a new report from Bitglass.
In addition, organizations say they are unable to maintain visibility into file downloads (45 percent), file uploads (50 percent), DLP policy violations (50 percent), and external sharing (55 percent) in the cloud.
Failure to keep up with complexity leaves businesses at ransomware risk
Increasingly businesses have data stored in hybrid- and multi-cloud environments, but a new report shows that this extra complexity could also be putting data at risk.
The report out today from Veritas Technologies found that only 36 percent of respondents say their security has kept pace with their IT complexity, underscoring the need for greater use of data protection solutions that can protect against ransomware across increasingly varied environments.
IT leaders and front line staff disagree on cloud priorities
While 47 percent of IT decision-makers strongly agree that COVID-19 has accelerated their cloud maturity, only 29 percent of line-of-business IT employees feel the same.
A new report from technology modernization firm SPR surveyed 400 IT decision-makers and the same number of workers to look at how IT teams see their businesses’ cloud resiliency strategy for 2020 and beyond.
IBM makes it easier for clients to use public cloud services
IBM is using this week's KubeCon to announce an initiative enabling clients to take better advantage of public cloud services in any environment they choose.
From today the company will open source Kubeflow Pipelines on Tekton to provide a standardized solution for creating and deploying machine learning models in production and to make machine learning models portable across hybrid cloud environments.
Enterprises accelerate cloud transformation but struggle with security
Enterprises have embraced the moving of multiple applications to the cloud using containers and are utilizing Kubernetes for orchestration. But the findings of a new report also confirm that many are inadequately securing the data stored in these new cloud-native environments.
The report from cloud-native data protection specialist Zettaset shows businesses are continuing to leverage existing legacy security technology as a solution.
IT spending remains buoyant despite the pandemic
In the face of restrictive lockdowns and stay-at-home orders, IT budgets have held up remarkably well according to a new study, as technology becomes a critical ingredient in launching new products and services.
The report from OpsRamp is based on responses from 230 IT operations and DevOps executives in the US and UK with at least 500 employees and $5 million in annual IT budgets.
Operating as cloud first: What it really takes
Let’s take a look back to a time before COVID-19. Systems engineers walked the datacenter floor and managed the infrastructure on-site. If needed, a team could purchase, physically receive, and rack-and-stack new infrastructure to run critical platforms. For many, this former reality seems like a long-lost memory.
Since the beginning of the pandemic, price instability has limited predictive budgeting, manufacturers have faced debilitating delays, and individuals can no longer enter and exit a datacenter at will. The demand for web-based applications has increased as consumers change the way they interact with everything from grocery shopping to entertainment. Many organizations face the harsh reality of working to meet demand while relying on an unstable supply chain.