World Backup Day -- We need to change the name
It remains essential to make copies of our most important data, as World Backup Day has rightly reminded us for years. But the biggest risk to that data today is no longer the traditional business continuity and disaster recovery scenarios World Backup Day was originally envisioned to address. The biggest threat to data today is destructive cyber attacks in the form of ransomware and wipers.
Tackling these threats by simply making copies and recovering the data after an incident is not enough. Instead, we need a World Resilience Day, where cyber incidents are investigated, threats mitigated, and systems hardened before recovery, to prevent recurrence and further impact.
Why deploying infrastructure without backup is always a risky gamble
In today's digital landscape, where data is omnipresent across various platforms and devices, maintaining efficient backup processes has become increasingly critical. Yet, despite the inherent risks, a surprising number of organizations continue to deploy their infrastructure without adequate backup measures in place.
According to a poll conducted among IT professionals, only 25 percent adhere to industry best practices for data backup, creating potentially dangerous data gaps in production systems and employee risk management. Initially enticed by the allure of cost savings, many companies overlook the necessity of investing in backup solutions, only to face dire consequences in the long run.
Beyond monitoring: How observability fuels security, sustainability and employee satisfaction
As cloud-based networks, mobility and the rise of hybrid work have spiked CIOs' ongoing interest in observability, companies are forced to consider how observability factors into their daily systems and routines. To understand what's happening across their applications, devices, and infrastructure -- and how these affect employees -- companies must consider how monitoring and observability go hand in hand to improve their IT operations.
As any good CIO knows, observability is not just about where something is happening but why it's happening, which helps teams resolve network issues faster. With that insight, companies can better maintain a seamless digital experience that supports some of their most critical initiatives: heightening security, lowering their carbon footprint and improving overall employee morale.
AI-powered tools: Boosting efficiency in content creation for marketers and creators
As the year draws to a close, it is worth reflecting on the last 12 months, particularly to assess what has proven effective in light of newly introduced tools like generative AI-powered solutions. Gen AI has been a prominent theme this year, and organizations across industries are actively exploring its potential. Many companies are not only using popular gen AI tools but also building their own custom ones. In fact, the latest reports suggest the gen AI market is poised for remarkable growth, with projections indicating it could reach an astounding $1.3 trillion by 2032.
McKinsey's findings reveal that commercial leaders are allocating more than 20 percent of their digital budgets to gen AI, underscoring their pioneering role in pushing for company-wide adoption. This push can be good news for marketing teams. Marketers in particular find themselves at the threshold of transformative possibilities with AI tools for content creation. These AI-powered solutions promise to streamline content creation, making it faster and more precise. Marketers can harness AI to generate compelling visuals, craft persuasive messaging, and tailor content to specific audiences with unprecedented precision. Let's dive into specific areas where gen AI has benefited marketers and creatives.
Preparing for potential regulations around AI in electronic bill payment and presentment
The speed with which enterprise-level artificial intelligence has moved from the realm of theoretical -- if not outright science fiction -- to a widely adopted business tool has been nothing short of astonishing. The mad dash to find and implement applications for new, AI-based solutions is reminiscent of the rapid ascension of cloud technology in both fervor and consistency: the race to the cutting edge is taking place across industries.
Where sectors differ is in the level of caution their respective norms and requirements oblige them to apply in the process. While those in all industries ought to approach new technologies and tools with healthy prudence, in my personal opinion (and I am not an attorney), those of us in the electronic bill payment and presentment (EBPP) space must additionally consider the regulations that govern individuals' privacy and the security of their data when it comes to something as private as their finances.
Embracing cloud repatriation: Strategies for successful workload migration
A recent study by Citrix revealed that 25 percent of UK organizations have chosen to migrate more than half of their cloud-based workloads back to on-premises infrastructures. This phenomenon, known as cloud repatriation, is gathering significant momentum and forcing enterprises to re-evaluate their cloud strategies.
While the cloud once promised cost savings, scalability, and flexibility, many organizations have found themselves grappling with unforeseen expenses, security concerns, performance issues, compatibility problems, and service downtime.
Five questions to ask before you choose a cloud provider
The cloud landscape has never been more complex. Research reveals that the majority of EMEA IT decision makers (ITDMs) have hybrid cloud strategies (68 percent) and even more (72 percent) have relationships with multiple public cloud providers. Every provider offers different services, each with their own pros and cons, and this complexity can be challenging to manage without the right underlying data architecture.
Given the cost, compliance and business risk implications of cloud, due diligence is increasingly important. But for organizations to work out what they need from their cloud service providers (CSPs), they must understand their own data first.
Zero Trust: Moving beyond the chewy centre of cybersecurity
As modern technology becomes increasingly complex, so does the task of securing it. Adding to the complexity are the proliferation of decentralised technologies such as cloud services and the Internet of Things (IoT), and the move to remote working, all of which have changed how cybersecurity experts craft the defences for their systems.
In simpler days, IT systems were comparatively easy to ring-fence, as there was a solidly defined corporate security perimeter, or “trust boundary.” This formed the basis of the classic, trust-based security strategy, where any user inside the boundary was implicitly trusted by default, and anyone outside the boundary was denied access. Being connected to the private network was the only verifiable credential needed to access the system and all its data.
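To make the contrast concrete, here is a minimal sketch of the two access models. It is purely illustrative -- the function names, network range, and stub checks are assumptions for this example, not any specific product's API: the perimeter model grants access on network location alone, while a zero-trust decision verifies identity, device health, and authorization on every request.

```python
# Illustrative sketch: perimeter-based trust vs. a zero-trust access decision.
from ipaddress import ip_address, ip_network

CORPORATE_NETWORK = ip_network("10.0.0.0/8")  # hypothetical trust boundary

def perimeter_allows(source_ip: str) -> bool:
    """Classic model: being inside the boundary is the only credential."""
    return ip_address(source_ip) in CORPORATE_NETWORK

# Hypothetical stand-ins -- real deployments would call an identity
# provider, a device-posture service, and a policy engine.
def identity_verified(token: str) -> bool:
    return token == "valid-token"

def device_healthy(device_id: str) -> bool:
    return device_id in {"laptop-042"}

def authorized(user: str, resource: str) -> bool:
    return (user, resource) in {("alice", "payroll-db")}

def zero_trust_allows(token: str, device_id: str, user: str, resource: str) -> bool:
    """Zero trust: never trust, always verify -- every request, every time."""
    return (identity_verified(token)
            and device_healthy(device_id)
            and authorized(user, resource))

# Perimeter model: network location alone grants access.
print(perimeter_allows("10.1.2.3"))  # True -- inside the boundary
# Zero trust: nothing is granted until every check passes.
print(zero_trust_allows("valid-token", "laptop-042", "alice", "payroll-db"))  # True
```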
Harnessing the value of data with data monetization
Businesses around the globe are using new technologies to change the world. But this wouldn't be possible without the use of sensitive data such as Personally Identifiable Information (PII) and Protected Health Information (PHI) to drive advancements in personalization and sophistication. However, if companies are using data typically associated with medical records and insurance claims, it raises the question: is personal data secure?
It is possible to balance data privacy with extracting the value of that information through a data modernization strategy that enhances and accelerates digital transformation efforts.
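One common building block of such a strategy is pseudonymization: replacing direct identifiers with irreversible tokens before data is analyzed or shared. The sketch below is a minimal illustration only -- the salt handling, field names, and token truncation are assumptions, and production systems typically rely on a vaulted tokenization service with proper key management:

```python
# Minimal pseudonymization sketch: swap direct identifiers (PII) for salted,
# irreversible tokens so records stay joinable and analyzable without
# exposing the underlying identity. Illustrative only, not a production recipe.
import hashlib
import hmac

SECRET_SALT = b"rotate-me-and-store-me-in-a-kms"  # hypothetical secret

def tokenize(value: str) -> str:
    """Deterministic token: same input -> same token, but not reversible."""
    return hmac.new(SECRET_SALT, value.encode(), hashlib.sha256).hexdigest()[:16]

record = {"name": "Jane Doe", "ssn": "078-05-1120", "diagnosis": "J45.20"}

safe_record = {
    "patient_token": tokenize(record["ssn"]),  # joinable, but no longer raw PII
    "diagnosis": record["diagnosis"],          # the analytic value is preserved
}
print(safe_record)
```

Note that deterministic tokens over low-entropy inputs can still be guessed by brute force, which is one reason real deployments keep the salt in a key-management service and layer on access controls.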
Is synthetic data the solution to data privacy challenges?
Synthetic data is artificial data that was not generated by real-world events. As such, it can be created by computer programs and AI tools using a variety of techniques, with generative adversarial networks and diffusion models among the most popular and effective today. Synthetic data may come in many forms, but images and textual information are currently the most feasible options.
If you are interested in AI and ML developments, you have probably heard the term already -- "sanitized" synthetic data is the latest hype in AI training, believed by some to solve the pressing data privacy and ownership challenges posed by real data. However, it all sounds like sunshine and rainbows only until you stop and consider that the AI algorithms used to generate synthetic data still need to be trained on real data -- the very obstacle they promise to remove.
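To make that circularity concrete, here is a deliberately toy sketch of the core pattern behind synthetic data generation: fit a model to real data, then sample new records from the model. Real pipelines use GANs or diffusion models rather than a Gaussian fit, and the data below is invented for illustration, but the dependency on real training data is the same:

```python
# Toy illustration of the synthetic-data idea: learn the statistics of real
# data, then sample brand-new records from the learned model. Note the model
# cannot exist without the real (potentially sensitive) data it was fit on.
import random
import statistics

real_ages = [34, 41, 29, 50, 38, 45, 33, 47]  # stand-in for sensitive records

mu = statistics.mean(real_ages)     # "training" step: learn the distribution
sigma = statistics.stdev(real_ages)

# Sampling step: synthetic values drawn from the learned distribution --
# none is a copy of a real individual's value, but all depend on real data.
synthetic_ages = [round(random.gauss(mu, sigma)) for _ in range(8)]
print(synthetic_ages)
```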
Why human risk management is key to data protection
Personal data is constantly being processed and transferred in numerous ways -- whether in healthcare applications, store loyalty programs, during purchases or while browsing online. With such a vast amount of personal data in circulation, the likelihood of errors occurring is heightened.
It feels like almost every day we hear of another company being breached -- with data stolen by cybercriminals looking to hijack an individual's identity, access accounts or commit fraud. Things are also getting easier for cybercriminals, thanks to technology advancements like generative AI assisting with more convincing phishing emails and deepfake content.
Workplace communications: The role of artificial intelligence and audio-visual solutions
Technology is reshaping the way we work, and the convergence of Audio Visual (AV) solutions with Artificial Intelligence (AI) is not only redefining how we communicate but also playing a pivotal role in digitizing workspaces and helping organizations achieve their sustainability objectives.
Research from technology analyst Valoir reveals that AI has the potential to automate 40 percent of the average working day, allowing business leaders to increase productivity levels like never before. This trend is expected to continue, with the global market anticipated to see an annual growth rate of over 15 percent between 2024 and 2030.
To get to AGI, we must first solve the AI challenges of today, not tomorrow
If the World Economic Forum in Davos was any indication, AI safety and security will be this year’s top priority for AI developers and enterprises alike. But first, we must overcome hype-driven distractions that siphon attention, research, and investment away from today’s most pressing AI challenges.
In Davos, leaders from across the technology industry gathered to preview innovations and prophesy what's to come. The excitement was impossible to ignore, and whether deserved or not, the annual meeting has built a reputation for exacerbating technology hype cycles and serving as an echo chamber for technology optimists.
How clean code can bridge the developer and security divide
Regardless of industry, software is now an organization's most critical business asset, as its competitive edge often depends on it. As companies become more technologically savvy and dependent upon their software to meet revenue goals and deliver products or services to customers, they cannot afford to underestimate the importance of secure, high-quality code.
The more this becomes evident, the greater the pressure on developers to deliver. Leaders expect their developer teams to work faster, ship more features, and write "better" code, but the technical debt accrued as a result of these escalating demands creates a slowdown effect as developers try to keep up. Technical debt can consume a third of developers' time to address, with refactoring later costing two or even three times as much as a proactive fix. While AI code generation tools can help shoulder the burden of producing large amounts of code and handling mundane tasks so developers can focus on collaborative or creative work, AI-generated code shouldn't be trusted at face value. When code is not properly reviewed for maintainability, security, and reliability (i.e. Clean Code attributes), poor-quality code problems creep in.
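As a small illustration of what a review for Clean Code attributes catches, consider this kind of before-and-after. The example is contrived for this piece, not taken from any real codebase, but the bug in the "before" version (a mutable default argument) is exactly the sort of reliability flaw that slips through when code -- human- or AI-written -- ships unreviewed:

```python
# Before: code that "works" in a quick test but accrues technical debt.
# The mutable default argument is a classic Python bug: the same list is
# shared across calls, so results silently accumulate. The names are opaque.
def proc(d, out=[]):
    for k in d:
        if d[k] > 0:
            out.append(k)
    return out

# After: the same behavior made maintainable and reliable -- a fresh list
# on every call, descriptive names, and an explicit, documented contract.
def positive_keys(values: dict[str, float]) -> list[str]:
    """Return the keys whose values are strictly positive."""
    return [key for key, value in values.items() if value > 0]

print(positive_keys({"a": 1.5, "b": -2.0, "c": 0.3}))  # ['a', 'c']
```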
Immutability: A boost to your backup security
As the volume of data continues to increase and the threat landscape continues to evolve, it is increasingly important for organizations to protect backup data from unwanted deletion. Threats today can take the form of a malicious insider deleting backup data or a targeted cyberattack on the backups themselves. Modern ransomware attacks often first seek out and destroy backups before moving on to encrypting production data. Companies therefore benefit from implementing immutability -- making data writable once but not editable or deletable for a defined period of time -- as part of their data protection arsenal, to help them avoid, or recover from, the loss of production data.
The rise in cyber incidents -- which, according to the Veeam Data Protection Trends Report 2023, have been the leading cause of outages over the past three years -- is bringing the need for immutability to the fore, particularly as most organizations reported falling victim to cyber incidents twice a year on average.
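In practice, immutability is often enforced at the storage layer. As one illustrative option (the article does not prescribe a particular tool), Amazon S3 Object Lock can place a WORM retention period on each backup object. A minimal boto3 sketch follows, assuming a bucket created with Object Lock enabled; the bucket, key, and file names are hypothetical:

```python
# Sketch of WORM-style immutability with S3 Object Lock via boto3.
# In COMPLIANCE mode, the object cannot be deleted or overwritten by any
# user -- including the root account -- until the retention date passes,
# which is what keeps backups out of reach of ransomware and rogue insiders.
from datetime import datetime, timedelta, timezone
import boto3

s3 = boto3.client("s3")

with open("backup.tar.gz", "rb") as backup_file:  # hypothetical backup artifact
    s3.put_object(
        Bucket="example-backup-bucket",          # must have Object Lock enabled
        Key="backups/2024-06-01.tar.gz",
        Body=backup_file,
        ObjectLockMode="COMPLIANCE",
        ObjectLockRetainUntilDate=datetime.now(timezone.utc) + timedelta(days=30),
    )
```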