CISOs will shift their priorities in 2020


Not too long ago, information security was a human-scale issue. Because the number of assets to compromise was contained, and because there were only a few attack vectors in the adversarial arsenal, enterprises were able to train security analysts to identify and mitigate threats and vulnerabilities.
Managed endpoints, internal applications, routers, switches, DNS servers and domain controllers comprised the majority of an enterprise’s network presence. In today’s world, mobile devices, cloud applications, IoT, and third-party connections to vendors have dramatically grown the enterprise digital footprint. Adversaries, too, were not nearly as sophisticated as they are today, leveraging only a small fraction of modern-day attack vectors. Today’s threat actors have a much larger arsenal to draw on, including newly discovered vulnerabilities, misconfigured cloud services, and more services and applications exposed to the internet.
Organizations search for tougher cybersecurity measures as APTs run rampant


Advanced persistent threats (APTs) have become aggressive in their attempts to breach organizations’ networks. These malicious actors look to gain unauthorized access to infrastructures for prolonged periods of time so that they can perform various acts, including mining and stealing sensitive data. Their ability to evade conventional security measures has allowed them to cause costly data breaches at many businesses.
Hackers have even found ways to intensify their malicious activities. According to an Accenture report, threat actors and groups have now teamed up to conduct targeted intrusions and spread malware. Among them are financially motivated groups such as the Cobalt Group and Contract Crew. These increasing cyberattack threats have prompted companies to toughen up their security. Gartner estimates that security spending will grow to $170.4 billion in 2022.
Cloud predictions for 2020


Multi-cloud environments have been a hot topic for the last year. Already, businesses have been realizing the benefits of a vendor-agnostic approach, which not only minimizes costs but gives them the freedom to innovate. However, there are a couple of aspects of operations which will be key in ensuring multi-cloud remains viable for enterprises in the long-term.
Despite the freedom which comes with a vendor-neutral ecosystem, orchestrators haven’t yet overcome the headache associated with migrating workloads between these different cloud infrastructures. The past year saw major cloud players like IBM making acquisitions to address this, but as yet, they haven’t found a successful solution. Over the next year, this will be a priority for enterprises looking to remove the bottlenecks in their CI/CD pipeline. Organizations will invest in services that can help them harness a multi-cloud ecosystem by supporting fast deployment, scalability, integration and operational tasks across public and private clouds.
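By way of illustration, here is a minimal, purely hypothetical sketch of the kind of provider-agnostic abstraction such services build on: the same workload definition is handed to interchangeable cloud back-ends. Every class, name and image reference below is invented for the example.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class Workload:
    """A cloud-agnostic description of something to deploy."""
    name: str
    image: str          # container image to run
    replicas: int = 1


class CloudProvider(ABC):
    """Common interface so the same workload can target any cloud."""

    @abstractmethod
    def deploy(self, workload: Workload) -> str:
        """Deploy the workload and return an identifier for the deployment."""


class PublicCloudProvider(CloudProvider):
    def deploy(self, workload: Workload) -> str:
        # In practice this would call the public cloud's API or SDK.
        return f"public://{workload.name} ({workload.replicas}x {workload.image})"


class PrivateCloudProvider(CloudProvider):
    def deploy(self, workload: Workload) -> str:
        # In practice this would talk to an on-premises orchestrator.
        return f"private://{workload.name} ({workload.replicas}x {workload.image})"


def deploy_everywhere(workload: Workload, providers: list[CloudProvider]) -> list[str]:
    """Fan the same workload definition out to every configured cloud."""
    return [provider.deploy(workload) for provider in providers]


if __name__ == "__main__":
    app = Workload(name="checkout-service", image="registry.example.com/checkout:1.4", replicas=3)
    for result in deploy_everywhere(app, [PublicCloudProvider(), PrivateCloudProvider()]):
        print("deployed:", result)
```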
5 ways technology is revolutionizing nightlife


A night on the town brings a feeling like no other: letting your hair down, catching up with old friends and making new ones as you dance the night away. From the alternative and underground to the VIP and luxury, there was something for everyone.
But it seems something has changed. In 2018, The Guardian reported that the value of the UK’s nightclub scene had dropped by an estimated £200m in the past five years. People are swapping gin for gyms and martinis for mini golf. So, is the nightclub industry on its way out? With some adapting, evolving and a heavy helping of tech, it’s possible that we’re seeing nightclubs claw back their popularity. Gone are the days of cheap pints and sticky floors. Nowadays people want uniqueness, something that’s worthy of uploading to Instagram, and something entirely experiential.
The world increasingly relies on open source -- here's how to control its risks


Open source software’s hold on the IT sector has deepened in the last five years. An estimated 96 percent of applications use open source components, and big players like Microsoft, IBM and even the U.S. government now embrace open source projects for their software needs. But while open source has transformed organizations’ ability to use proven and maintained code in the development of new software, it is not immune to security problems. Using code that’s readable by anyone brings risks -- and issues have occurred in the past.
It’s true that open source makes security efforts more transparent since it’s happening out in the open. If there are flaws in the code, they’re often resolved quickly by committed members of the open source community. Additionally, many open source projects have security scans built into their build processes, so contributions that introduce vulnerabilities directly or through dependencies are few and far between. But leaving the code in the open also allows bad actors to write attacks specific to unpatched vulnerabilities or to unrealized vulnerabilities in libraries that products actively depend on. As a result, teams using open source need to take steps to remain secure.
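To make that concrete, below is a minimal sketch of the kind of dependency audit a build pipeline might run, assuming a lockfile of pinned versions and a toy advisory list; real scanners draw on curated vulnerability databases and handle version ranges, transitive dependencies and much more. All package names and versions here are invented.

```python
"""Minimal sketch of a build-time dependency check against an advisory list."""

# Pinned dependencies as they might appear in a lockfile (illustrative names/versions).
pinned_dependencies = {
    "examplelib": "2.4.1",
    "fastparser": "1.0.3",
    "webtoolkit": "0.9.7",
}

# A toy advisory database: package -> versions with known vulnerabilities.
# Real scanners pull this from curated sources such as public CVE/advisory feeds.
known_vulnerable = {
    "fastparser": {"1.0.2", "1.0.3"},
    "webtoolkit": {"0.8.0"},
}


def audit(dependencies: dict[str, str], advisories: dict[str, set[str]]) -> list[str]:
    """Return a list of findings for pinned versions that match an advisory."""
    findings = []
    for package, version in dependencies.items():
        if version in advisories.get(package, set()):
            findings.append(f"{package}=={version} has a known vulnerability; upgrade required")
    return findings


if __name__ == "__main__":
    problems = audit(pinned_dependencies, known_vulnerable)
    if problems:
        for line in problems:
            print("FAIL:", line)
        raise SystemExit(1)  # fail the build so the vulnerable pin cannot ship
    print("OK: no pinned dependency matches a known advisory")
```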
How can the US prepare for these 2020 predictions?


Predicting everything that will happen in 2020 is an impossible task; however, the foundation has been laid for two security events to occur. First, all signs point towards the enactment of a federal data privacy law. The fact that the California Consumer Privacy Act (CCPA) takes effect on January 1, 2020 shows that the US is starting to take a more steadfast approach to consumer privacy. However, if every state were to enact its own laws, then organizations that operate within the US would have to navigate 50 different mandates. One unified, federal regulation would make it far more seamless for businesses to continue operations, all while remaining compliant.
Second, it is likely that we will see foreign meddling occur in the 2020 US presidential election. This occurred in 2016, and there have already been reports of foreign entities attempting to interfere with US government agencies. In fact, the state of Ohio recently thwarted an attack from a Russian-backed organization on its voting systems. Let’s dive more into these predictions below.
The phishing tricks that break through standard email filters


Some phishing emails are easy to spot: the spelling is bad, the spoofed email is clearly a fake, and the images are too warped to have possibly been sent by a reputable brand. If you receive one of these low-quality phishing emails, you’re lucky. Today’s phishing emails are extremely sophisticated, and if you’re not well trained to spot one, you probably won’t.
Email filters have long relied on fingerprint and reputation-based threat detection to block phishing emails. A fingerprint is essentially all the evidence a phisher leaves behind -- a signature that, once identified, will be recognized on future phishing attempts and the phishing email or webpage blocked. Examples of a fingerprint include the header, subject line, and HTML.
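As a rough sketch of how fingerprint matching works, the example below reduces a message’s sender, subject line and HTML to a hash and compares it against signatures of previously confirmed phishing emails. The normalization, the hash choice and the sample campaign are assumptions for illustration; the second check also shows why a naive exact match is easy for a determined attacker to evade.

```python
import hashlib


def fingerprint(sender: str, subject: str, html_body: str) -> str:
    """Reduce the header, subject line and HTML to a single comparable signature."""
    normalized = "\n".join(part.strip().lower() for part in (sender, subject, html_body))
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()


# Signatures harvested from previously confirmed phishing emails (illustrative campaign).
known_bad = {
    fingerprint(
        "security@paypa1-support.example",
        "Your account has been limited",
        "<a href='http://paypa1-support.example/login'>Restore access</a>",
    )
}


def is_known_phish(sender: str, subject: str, html_body: str) -> bool:
    """Block a message whose fingerprint matches a previously seen attempt."""
    return fingerprint(sender, subject, html_body) in known_bad


if __name__ == "__main__":
    # An identical re-send of the campaign is caught...
    print(is_known_phish(
        "security@paypa1-support.example",
        "Your account has been limited",
        "<a href='http://paypa1-support.example/login'>Restore access</a>",
    ))  # True
    # ...but trivially tweaking the subject line slips past this exact match,
    # which is why fingerprinting alone struggles against modern phishing.
    print(is_known_phish(
        "security@paypa1-support.example",
        "Your account has been limited!",
        "<a href='http://paypa1-support.example/login'>Restore access</a>",
    ))  # False
```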
Forecasting the cloud security landscape in 2020


Every year, threat actors evolve the tactics, techniques, and procedures (TTPs) they use to exfiltrate customer, company and partner data, interrupt business operations, implant ransomware, and more. In fact, cybercrime damage costs are predicted to hit $6 trillion annually by 2021, according to research from Cybersecurity Ventures. In 2020, as cybercriminals refine their methods, we will continue to see a plethora of breaches occur due to a common vulnerability: misconfigurations.
Despite organizations running an average of 40 percent of their workloads in the public cloud, most companies fail to recognize that the risk of misconfiguration in the public cloud is higher than in traditional IT environments. In the new year we will also see a greater focus placed on identity in cloud security -- a challenge that’s easier said than done, since approaches that worked in traditional data center environments do not translate to the cloud.
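As one concrete example of checking for such a misconfiguration, the sketch below uses boto3 to flag S3 buckets whose ACLs grant access to everyone. It assumes AWS credentials are already configured, and it only inspects ACL grants, which is just one of several ways a bucket can end up exposed.

```python
"""Sketch: flag S3 buckets whose ACL grants access to all users."""
import boto3

PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}


def publicly_accessible_buckets(s3_client) -> list[tuple[str, str]]:
    """Return (bucket, permission) pairs for buckets granted to everyone."""
    findings = []
    for bucket in s3_client.list_buckets()["Buckets"]:
        acl = s3_client.get_bucket_acl(Bucket=bucket["Name"])
        for grant in acl["Grants"]:
            grantee = grant.get("Grantee", {})
            if grantee.get("Type") == "Group" and grantee.get("URI") in PUBLIC_GRANTEES:
                findings.append((bucket["Name"], grant["Permission"]))
    return findings


if __name__ == "__main__":
    client = boto3.client("s3")
    for name, permission in publicly_accessible_buckets(client):
        print(f"MISCONFIGURED: bucket '{name}' grants {permission} to everyone")
```

Production tooling would also examine bucket policies, public access block settings and IAM, which this sketch deliberately ignores.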
No-deal Brexit: Tips for migrating data to preserve the free flow of business


With Brexit looming large, the only thing that is certain is the uncertainty of Brexit’s impact. A no-deal Brexit conjures visions of trucks and ships backed up at border crossings and ports of entry, slowing commerce to a snail’s pace. But the real business impact of a no-deal Brexit is in the free flow of data between the EU and the U.K., and many small and mid-sized businesses are likely unprepared. It is estimated that 80 percent of Britain’s economy is founded on services, not goods. Between 2005 and 2015, the volume of data entering and leaving the U.K. increased 28 times, and 75 percent of this data was exchanged with EU countries.
In a no-deal Brexit, the U.K. will become a "third country," no longer part of the General Data Protection Regulation (GDPR). Consequently, according to the U.K.’s Information Commissioner’s Office, data from EU countries will likely no longer be able to flow freely into the U.K. without a contract in place between the sender and U.K.-based SMBs that meets EU-approved terms. For those unprepared, the fallout from this could be disastrous. U.K. businesses that manage or store large volumes of data within the EU, such as those in the financial or tech industries, may look to relocate their operations to minimize the risks and impacts for their business. In moving their operations, businesses will need to transfer large volumes of data.
IoT's powerful promises


How big is the Internet of Things (IoT) market? According to a Gartner report, connected devices across all technologies will reach 20.6 billion by 2020. The early adopters of this technology can be found in agriculture, utilities, and cities, but the applications are seemingly endless and extend into homes and even wearable devices. Just as the Internet has changed our lives, so too will its extension into the many "things" that connect and extract pertinent data to improve our personal and professional existence.
Injecting smarts into these otherwise mute devices will require new methods of connectivity.
Why data stagnation is a threat to digital transformation


Companies are now readily investing in digital transformation to completely digitize their internal operations and get ahead of the competition. But most companies end up focusing too much on the number of applications they are integrating and too little on how those applications are actually helping their employees.
When departments use different applications that don’t integrate well, it can lead to data stagnation and isolation, which threaten digital transformation initiatives.
Complex transformations need analytics and intelligence


In the public sector, IT projects often struggle. The Infrastructure and Projects Authority (IPA) annual report, published in July 2018, assessed 133 large and risky programs the UK government has in flight. Overall, the IPA noted a general increase in projects ranked red or amber-red -- which indicates projects are undeliverable or at high risk of failure -- from 38 to 46, and a decline in those rated amber-green or green, from 28 to 24. It happens in the commercial markets too, but of course, it doesn’t always make the headlines.
Clients -- commercial or public sector -- need to look for delivery organizations that make greater use of analytics and intelligence if they are to drive the successful completion of complex IT transformations.
The rise of first-party data: Why quality matters over quantity


For years, digital marketers have paid hand over fist in the digital gold rush for data. Instead of a tangible product, tech companies earn millions in revenue from the data they collect on previous, current and future digital consumers. But digital marketers seeking to gobble up as much data as they can for their campaigns, while not stopping to consider the source of or methods used to collect it, are taking the wrong approach. The age-old mantra of "quality over quantity" has never been more relevant in online advertising, and marketers must quickly and fully embrace first-party data or risk their digital campaigns (and bottom lines) falling flat.
The primary reason to use first-party data over third-party data from data marketplace platforms is simple: it’s better. Publishers, apps and ad platforms alike can gather first-party data directly from their audiences and customers, whether that data be purchases, app downloads, in-app actions, social media interactions, or subscriptions. This data comes directly from the source, making it as precise and accurate as possible. This is in stark contrast to third-party data, which is aggregated from multiple platforms and combined into a larger data set where buyers generally do not know the exact sources of their data.
How, what, where, when, and why of experimentation


Every new feature starts as an idea. Not all ideas are good ideas. Therefore, not every new feature is a good idea. So how do you know which feature is a good idea and which one isn’t? You experiment.
The idea of experimenting on users or in production may sound scary and complicated, but it doesn’t have to be. The questions below shed some light on common concerns surrounding experimentation and can help you determine whether it is right for you.
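For a concrete sense of what a basic experiment involves, here is a minimal A/B-test sketch: users are deterministically assigned to a control or treatment group, and the resulting conversion rates are compared with a simple two-proportion z statistic. The bucketing scheme, metric and numbers are assumptions for illustration, not a prescribed method.

```python
"""Minimal A/B-experiment sketch: deterministic bucketing plus a simple comparison."""
import hashlib
import math


def assign_variant(user_id: str, experiment: str) -> str:
    """Deterministically place a user in 'control' or 'treatment'."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode("utf-8")).hexdigest()
    return "treatment" if int(digest, 16) % 2 else "control"


def z_score(conversions_a: int, users_a: int, conversions_b: int, users_b: int) -> float:
    """Two-proportion z statistic for the difference in conversion rates."""
    p_a, p_b = conversions_a / users_a, conversions_b / users_b
    pooled = (conversions_a + conversions_b) / (users_a + users_b)
    std_err = math.sqrt(pooled * (1 - pooled) * (1 / users_a + 1 / users_b))
    return (p_a - p_b) / std_err


if __name__ == "__main__":
    # Route each user to one experience of the new feature.
    print(assign_variant("user-42", "new-checkout-flow"))

    # Hypothetical results: 120/1000 converted with the feature, 100/1000 without.
    z = z_score(120, 1000, 100, 1000)
    print(f"z = {z:.2f}")  # |z| above roughly 1.96 suggests the gap is unlikely to be chance
```

Deterministic hashing keeps a returning user in the same bucket across sessions, which is what makes the measured difference attributable to the feature rather than to who happened to see it.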
Is dark data valuable?


A tsunami of dark data is coming -- data that has never been analyzed, tagged, classified, organized, evaluated, used to predict future states or control other processes, or put up for sale for others to use. So, what do we do with this data? First, we have to understand that exponentially more is coming. We see this in autonomous technology, where vehicles generate four thousand gigabytes per day.
Data is also becoming more complex, as most of it is already in video or other complicated forms. Seemingly free storage is encouraging people to store more and defer deletion.
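To give a sense of that scale, here is a quick back-of-the-envelope calculation; the fleet size below is purely an assumed figure for illustration.

```python
"""Back-of-the-envelope scale check; the fleet size is an assumed figure."""

gb_per_vehicle_per_day = 4_000      # ~4 TB per autonomous vehicle per day, as cited above
assumed_fleet_size = 1_000_000      # hypothetical fleet for illustration

daily_gb = gb_per_vehicle_per_day * assumed_fleet_size
daily_pb = daily_gb / 1_000_000     # 1 PB = 1,000,000 GB (decimal units)
yearly_eb = daily_pb * 365 / 1_000  # 1 EB = 1,000 PB

print(f"{daily_pb:,.0f} PB per day, roughly {yearly_eb:,.0f} EB per year")
```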