Why poor IT asset lifecycle management is rapidly becoming a serious cyber vulnerability

Earlier this year, banking giant Morgan Stanley agreed to pay plaintiffs $60 million to settle a class-action lawsuit resulting from a pair of data breaches discovered in 2019.

While companies like Morgan Stanley find themselves under constant cyber attack, these breaches did not come from a hacker breaking into a database or an employee accidentally exposing customer information. The cause was simpler: Morgan Stanley threw away decommissioned servers that had not been completely wiped, leaving customers’ personally identifiable information vulnerable.
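
Sanitization of retired hardware is cheap to verify in code. Below is a minimal sketch -- not Morgan Stanley's process -- that checks whether a wiped disk image or block device reads back as all zeros; the target path is a placeholder.

```python
# Minimal sketch: verify that a decommissioned disk image reads back
# as all zeros after wiping. The path is a placeholder -- point it at
# a disk image or block device you control.
import sys

CHUNK = 1024 * 1024  # read 1 MiB at a time

def is_wiped(path: str) -> bool:
    """Return True if every byte in the file or device is zero."""
    zero = bytes(CHUNK)
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK)
            if not chunk:  # end of file: nothing non-zero was found
                return True
            if chunk != zero[:len(chunk)]:
                return False

if __name__ == "__main__":
    target = sys.argv[1] if len(sys.argv) > 1 else "disk.img"
    print(f"{target}: {'wiped' if is_wiped(target) else 'NOT wiped'}")
```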

Continue reading

How IoT connectivity is reaching new heights

IoT solutions that use SIM-based cellular technology for connectivity are not new -- but the speed at which IoT is expanding to embrace ever more dynamic use cases is compelling and is creating market confusion in equal measure. The market spans deployments that are reaching maturity -- the standardized, tried-and-tested M2M SIM variety -- and others, such as 5G SIM-based IoT, that are still largely in their infancy. Distinguishing the solutions that can be bought with confidence from those where continued innovation warrants discussion and consultation may not be straightforward. For the latter, choosing the right cellular (SIM) technology and network type requires an understanding of the technical requirements of each use case and the data profile of the asset to be connected.
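
To make that matching concrete, here is a toy sketch of how a use case's data profile might map to a network type. The thresholds and categories are illustrative assumptions only, not vendor guidance.

```python
# Illustrative sketch only: a toy rule-of-thumb mapping from a use
# case's data profile to a cellular IoT technology. Thresholds are
# assumptions for illustration, not vendor guidance.
from dataclasses import dataclass

@dataclass
class DataProfile:
    kbps_needed: float     # sustained throughput required
    latency_ms: float      # tolerable round-trip latency
    battery_years: float   # required battery life in the field

def suggest_technology(p: DataProfile) -> str:
    if p.battery_years >= 5 and p.kbps_needed < 60:
        return "NB-IoT"    # low power, low data: metering, sensors
    if p.kbps_needed < 1000 and p.latency_ms > 100:
        return "LTE-M"     # moderate data with mobility: trackers
    if p.latency_ms <= 20:
        return "5G"        # low latency: real-time control
    return "4G LTE"        # general-purpose fallback

# A battery-powered smart meter reporting a few readings per hour:
print(suggest_technology(
    DataProfile(kbps_needed=10, latency_ms=5000, battery_years=10)))  # NB-IoT
```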

With the definition of IoT expanding almost daily and suppliers increasingly jumping on the IoT bandwagon, this is a complex landscape, requiring knowledge, understanding, and expert partnerships. Nick Sacke, Head of IoT Solutions at Comms365, explains how to navigate the maze of options to optimise and future-proof your cellular IoT investments.

Continue reading

Does your Microsoft 365 need to be protected?

As organizations move to fully embrace the cloud, the significant benefits of running IT infrastructure via cloud services are becoming even more evident. Not only do cloud-based services come at a far lower cost than physical platforms and deployments; IT leaders can also side-step much of the risk and 'heavy lifting' around tech investment and maintenance by moving it out of local data centers. They can also enjoy expert third-party systems management and reliable service delivery without giving up much control for end users.

Microsoft 365 is a great case in point. The procurement model for this ever-expanding suite of high-quality IT services is based on a per-user price. It scales easily as teams and organizations grow and can therefore help to optimise budgets, avoiding payment for infrastructure that may go unused. It’s also constantly growing, with new features and functionality added all the time to keep IT departments at the cutting edge of optimal business processes.

Continue reading

The essential ingredient to manage the complexity of energy models: AI

In just a few years, energy management has become a real headache. Not only must the growth in electricity consumption be taken into account, but also the decline of fossil fuels and the rise of renewable energy production, which is inherently intermittent -- a complexity that cannot be resolved without massive data collection, high computing capacity and AI algorithms.

While global warming should be pushing us to reduce our energy consumption, on July 15, 2021 the International Energy Agency (IEA) announced an increase in global electricity demand of five percent in 2021 and four percent in 2022. This trend shows no sign of reversing, driven by increasingly abundant and electricity-hungry technology, by a global population expected to reach 10 billion by 2050, and by the growth of metropolises and megacities that remain energy intensive.
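
At its core, the modeling task is fitting structure to consumption data. The toy sketch below fits a linear trend plus a daily cycle to synthetic hourly demand with NumPy; real forecasting systems use far richer data, features and models.

```python
# Toy sketch of the kind of model behind demand forecasting: fit a
# trend plus a daily cycle to synthetic hourly consumption data.
# Illustrative only -- real systems are far richer than this.
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(24 * 28)  # four weeks of hourly readings
demand = (100 + 0.05 * hours                      # slow upward trend
          + 20 * np.sin(2 * np.pi * hours / 24)   # daily cycle
          + rng.normal(0, 3, hours.size))         # measurement noise

# Design matrix: intercept, linear trend, and a daily sine/cosine pair
X = np.column_stack([np.ones_like(hours, dtype=float), hours,
                     np.sin(2 * np.pi * hours / 24),
                     np.cos(2 * np.pi * hours / 24)])
coef, *_ = np.linalg.lstsq(X, demand, rcond=None)
print("fitted trend per hour:", round(coef[1], 3))  # recovers ~0.05
```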

Continue reading

Game developers shouldn't overlook Python's potential

Python is an object-oriented, general-purpose, high-level programming language created in 1991 by Guido van Rossum. Since then, Python has become one of the most popular programming languages worldwide. It often ranks high in surveys -- for instance, it claimed the first spot in the Popularity of Programming Language index and came second in the TIOBE index.

Python has gained acclaim as a widely used tool for projects of many kinds, from data analytics and visualization to artificial intelligence, language development, design, and web development. Yet Python isn’t renowned for its use in video game development. Does this mean game developers should disregard Python completely?
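
For the skeptical, this is roughly what a minimal game loop looks like in Python, assuming the third-party pygame package is installed (pip install pygame): handle events, draw a frame, cap the frame rate.

```python
# A minimal sketch of a game loop in Python using pygame
# (a third-party package: pip install pygame).
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 480))
clock = pygame.time.Clock()
x = 0

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:   # window close button
            running = False
    screen.fill((30, 30, 30))                           # clear frame
    pygame.draw.circle(screen, (200, 80, 80), (x % 640, 240), 20)
    pygame.display.flip()                               # present frame
    x += 4                                              # move the circle
    clock.tick(60)                                      # cap at 60 FPS

pygame.quit()
```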

Continue reading

Critical steps to ransomware protection in the cloud

The ransomware threat is very real, with attacks growing in size and frequency. This is driven, in part, by the acceleration of digital transformation initiatives and the move to embrace digital services, as well as by the rapid adoption of hybrid ways of working.

Because new digital systems require multiple access points for customers, partners, and employees, the attack surface has expanded vastly. That expansion has hastened the rise in ransomware attacks, as attackers quickly took advantage of the increased number of possible attack vectors.

Continue reading

Lessons learned from 633 destructive ransomware events

The threat landscape continues to see rapid evolution, especially as the digital world grows increasingly connected and more organizations outsource business services. Adversaries are getting smarter, and their techniques are getting more advanced by the day. This has put a spotlight on the security of our global supply chain and how unstable and unprotected it is.

In fact, software supply chain attacks tripled in 2021. The potential ripple effects of risks and disruptions within an organization’s supply chain are immense. Research shows that a data breach affecting multiple parties causes 26X the financial damage of the worst single-party breach.

Continue reading

Pay up or play different? Five tips for beating ransomware with backups

When it comes to ransomware, sometimes the cost of downtime can exceed the cost of paying up. Companies with frozen data and systems face lost revenue, lost productivity, customer departures, and damaged reputations -- never mind the cost of the ransom itself. Take an organization like Colonial Pipeline, which should have had healthy backups in place to recover quickly from its attack, and most likely did. However, it opted to shell out $4.4 million in ransom because it didn’t know how long it would take to get up and running again.

And according to ITIC's 2021 Hourly Cost of Downtime survey, one hour of a server being inoperable costs $300,000 or more for 91 percent of mid-sized and large enterprises.
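
Taken together, those two figures make the trade-off easy to sketch: the break-even point between paying and restoring is simply the ransom divided by the hourly downtime cost.

```python
# Back-of-the-envelope calculation using the figures cited above.
RANSOM = 4_400_000           # Colonial Pipeline's reported payment, USD
DOWNTIME_PER_HOUR = 300_000  # ITIC's floor for 91% of mid/large enterprises

break_even_hours = RANSOM / DOWNTIME_PER_HOUR
print(f"Paying only 'wins' if restoring would take more than "
      f"{break_even_hours:.1f} extra hours of downtime")  # ~14.7 hours
```

At roughly 15 hours, any organization confident it can restore from backups faster than that comes out ahead by not paying.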

Continue reading

Fighting cybercrime: What's next for Microsoft 365

It has been over a decade since we were first introduced to the Microsoft 365 brand -- and now it is one of the most widely used subscription service lines in the world. Last year marked the 10th anniversary, and looking back at its early days, the service has only expanded its scope and capabilities, especially when it comes to the Security & Compliance Center.

The swift ascension of Microsoft 365 hardly comes as a surprise, given the hybrid world we now find ourselves in. However, as the number of M365 users continues to increase at a rapid pace, the security risks for both users and admins will only grow as well. To put that in perspective, between January and December 2021 alone, Microsoft Azure Active Directory blocked more than 25.6 billion malicious attempts to hijack enterprise customer accounts by brute-forcing stolen passwords.
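
The first line of defense against that kind of brute-forcing is throttling. The sketch below is illustrative only -- not Microsoft's implementation -- tracking failed sign-ins per account and refusing further attempts past a threshold.

```python
# Illustrative sketch only -- not Microsoft's implementation. Track
# failed sign-ins per account and refuse further attempts once a
# threshold is crossed within a sliding lockout window.
import time
from collections import defaultdict

MAX_FAILURES = 5        # attempts allowed inside the window
LOCKOUT_SECONDS = 900   # 15-minute sliding window

failures = defaultdict(list)  # account -> timestamps of recent failures

def attempt_allowed(account: str) -> bool:
    """Allow a sign-in attempt unless the account is locked out."""
    now = time.time()
    # Discard failures that have aged out of the window
    failures[account] = [t for t in failures[account]
                         if now - t < LOCKOUT_SECONDS]
    return len(failures[account]) < MAX_FAILURES

def record_failure(account: str) -> None:
    failures[account].append(time.time())

# Usage: check attempt_allowed() before validating a password, and
# call record_failure() whenever validation fails.
```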

Continue reading

Breaking away from tech giants: Why businesses should consider Filecoin storage

When it comes to storing data, businesses can feel inundated with options from a whole host of providers, all promising competitive rates and the security of their assets. Add blockchain solutions into the mix, and decision-makers have a lot to weigh up when choosing the best solution for their business.

With cloud-based technologies and blockchain in particular, jargon can be a problem when it comes to fully understanding the principles and mechanics at play. For this reason, many business leaders may dismiss these technologies without considering the benefits they could offer. So, what exactly is Filecoin?
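
Part of the answer is content addressing: Filecoin, like IPFS, identifies data by what it is rather than where it lives. Real CIDs use multihash encodings; this toy sketch shows only the core idea.

```python
# Toy illustration of content addressing, the idea underpinning
# Filecoin and IPFS storage: the address of a piece of data is
# derived from the bytes themselves. Real CIDs use multihash
# encodings; a bare SHA-256 hex digest stands in for one here.
import hashlib

def toy_content_address(data: bytes) -> str:
    """Derive a stable identifier from the content itself."""
    return hashlib.sha256(data).hexdigest()

doc = b"quarterly report contents..."
print(toy_content_address(doc))         # same bytes -> same address
print(toy_content_address(doc + b"x"))  # any change -> new address
```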

Continue reading

How systems integrators scale IoT to enable global deployments for customers

Today, the promise of IoT is in little doubt. Use cases range from connected products to connected assets and we are seeing IoT deployed across a multitude of industries from telehealth and electric vehicle (EV) connectivity to smart vending, payment systems and more. Accelerated in part by the pandemic and remote working, IoT deployments are now becoming mainstream.

That said, deploying a single IoT device or prototype and ensuring it is functional is one thing; deploying at scale across multiple sites and geographies is where IoT starts to become far more challenging.

Continue reading

Four keys to successful product lifecycle management

As companies begin their yearly evaluation of goals and objectives, some new discussions may be brewing. The need to adapt product development processes to support new types of supplier collaboration, flexible sourcing strategies, and digital transformation efforts has caused a paradigm shift toward more flexible, sustainable platform technologies that enable companies to rapidly adapt to disruptions and opportunities in the market.

These new demands on businesses have stretched the limits of their legacy PLM software. Overly customized PLM software with a history of unresolved technical debt is driving many companies to a point of reflection, asking: where do we go from here? Do we spend millions redeploying or painfully upgrading our traditional PLM software? Will those efforts make our business more resilient and more agile? Here are four questions every company must answer:

Continue reading

Why the pandemic's effect on cloud is more than a technology change

The COVID-19 pandemic has led to a technology leapfrog beyond anything we’ve seen in decades. But now that we’ve made this leap, is there any going back? And do we have the right technology for enterprises to keep up with new demands?

According to Pew Research, roughly two years into the pandemic, 59 percent of U.S. workers who say their jobs can mainly be done from home were working from home. Our research found that digital experiences like online gaming, streaming and telehealth increased dramatically during the early days of the pandemic. And now, technologies and experiences that caught on during the pandemic, like cloud gaming, are exploding.

Continue reading

Which technology trends can help organizations achieve their digital ambitions?

In the era of fast business, organizations face increased pressure to continually improve and rapidly iterate on their digital transformation strategies. Nearly 70 percent of companies cite digital transformation as their top IT priority, while McKinsey estimates that COVID-19 has sped up digital adoption by seven years.

While the urgency to transform increases, 90 percent of businesses report facing at least one barrier in their efforts to drive digital change. But what does this really mean, and how can an organization achieve its digital ambitions in the era of fast business?

Continue reading

AI's evolution from oddity to ubiquity

There was a time when the notion of machines "thinking" was the stuff of fantasy. We had the Tin Man in The Wizard of Oz, but that was fiction, a nice story created in Hollywood. The idea that a computer could think, solve problems, and learn from experience like a human was just too far-fetched in the early 20th century. But science happens, and in the mid-1950s the first program designed to mimic human-like problem solving, Logic Theorist, was created -- and the rest, as they say, is history.

Over the next twenty years, the concept of artificial intelligence evolved. Many thought that the attention artificial intelligence received after Logic Theorist was created would propel the new discipline into practical, real-world applications. But there was a problem: the computer storage and processing speed needed to run machine learning algorithms over large amounts of data were just not up to snuff. To fully realize the potential of AI, computers needed to get faster, with more storage. Lucky for us, computers did get faster, much faster, and as Moore’s Law predicted, storage increased too, at a rate that eventually caught up to the requirements of AI.

Continue reading
