Cyber Essentials? What's that then?

Laptop protect security

New research from Lookout finds that 40 percent of security pros have no clue about the UK Cyber Essentials framework -- the government-backed scheme that aims to help UK organizations improve their cyber resilience against the most common cyberattacks.

The research, carried out at Infosecurity Europe, surveyed 246 security professionals and finds that only 28 percent of organizations have fully implemented Cyber Essentials. Of those that haven't, 58 percent cite a lack of awareness or understanding of the scheme as the reason.

Continue reading

The return of data modeling -- this time it's strategic [Q&A]

financial data exchange

Over the past decade data modeling -- setting up data structures aligned to business requirements -- has tended to take something of a back seat as businesses have rushed to bring products to market.

But we're producing more data than ever and need ways to process it effectively. That's why Satish Jayanthi, CTO and co-founder at Coalesce, believes it's time for data modeling to make a comeback in enterprise strategy. We spoke to him to find out more.

Continue reading

Industry reacts to new SEC breach disclosure rules

data breach

On Wednesday the US Securities and Exchange Commission (SEC) approved new rules that require publicly traded companies to disclose details of a cyberattack within four days of determining that it has a 'material' impact on their finances.

This marks a major shift in how data breaches are disclosed and industry figures have been quick to give their views on the effect the new rules will have.

Continue reading

Shifting left to improve data reliability [Q&A]

Left turn shift left

The concept of 'shifting left' is often used in the cybersecurity industry to refer to addressing security earlier in the development process.

But it's something that can be applied to data management too. Shifting left in this sense means performing data reliability checks sooner. The ability to execute data reliability tests earlier in the data pipelines helps keep bad data out of production systems.
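
To make that concrete, here's a minimal, hypothetical sketch (not taken from the article, with made-up field names and rules) of what a 'shifted-left' reliability check might look like: records are validated at ingestion, and anything that fails is quarantined with a reason before it can reach a production table.

```python
# Hypothetical sketch of a shift-left data reliability check run at ingestion,
# before records reach production. Field names and rules are illustrative only.
from datetime import datetime

REQUIRED_FIELDS = {"order_id", "customer_id", "amount", "created_at"}

def check_record(record: dict) -> list[str]:
    """Return a list of reliability problems found in a single record."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    amount = record.get("amount")
    if amount is not None and not isinstance(amount, (int, float)):
        problems.append("amount is not numeric")
    elif isinstance(amount, (int, float)) and amount < 0:
        problems.append("amount is negative")
    try:
        datetime.fromisoformat(record.get("created_at", ""))
    except (TypeError, ValueError):
        problems.append("created_at is not an ISO-8601 timestamp")
    return problems

def ingest(records: list[dict]) -> tuple[list[dict], list[tuple[dict, list[str]]]]:
    """Split a batch into clean records and rejected records with reasons."""
    clean, rejected = [], []
    for record in records:
        problems = check_record(record)
        if problems:
            rejected.append((record, problems))  # quarantine with reasons
        else:
            clean.append(record)                 # safe to load downstream
    return clean, rejected

if __name__ == "__main__":
    batch = [
        {"order_id": 1, "customer_id": "c-1", "amount": 19.99,
         "created_at": "2023-07-28T10:00:00"},
        {"order_id": 2, "customer_id": "c-2", "amount": -5},
    ]
    good, bad = ingest(batch)
    print(f"{len(good)} clean record(s), {len(bad)} rejected")
```

The point of running checks this early is that bad records are caught and explained at the pipeline's entrance rather than discovered downstream in production dashboards.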

Continue reading

Employees share more secrets with AI than they would in a bar

Concept of chat bot in modern business communication

A new study of 1,000 office workers across the US and UK shows that half of us already use AI tools at work, with one-third using them weekly and 12 percent daily.

But the report from Cybsafe finds that 38 percent of US users of generative AI admit to sharing data they wouldn't casually reveal to a friend in a bar.

Continue reading

Cyber risks increased by workers taking summer vacations

Remote working beach

As workers take time off for summer holidays, there's a greater risk that personal devices and public Wi-Fi will be used to access sensitive corporate data.

Vulnerability management specialist Hackuity warns that this is a time when organizations are at their most vulnerable and cybercriminals are well aware of the fact.

Continue reading

A third of SMBs dispose of old hardware in landfill

e-waste

As growing businesses rush to upgrade their hardware, many are simply throwing old computers, routers, and other IT assets into the trash, leading to security and environmental concerns.

A new study from Capterra of 500 IT professionals at US small and midsize businesses (SMBs) reveals that nearly a third (29 percent) engage in improper IT hardware disposal practices.

Continue reading

Cybercriminals get their very own generative AI

Hack and AI concept

We've already seen how generative AI can be used in cyberattacks, but now it seems there's an AI model aimed squarely at cybercriminals.

Every hero has a nemesis, and it looks like ChatGPT's could be FraudGPT. Research from security and operations analytics company Netenrich shows that recent activity on dark web forums reveals the emergence of FraudGPT, which has been circulating on Telegram channels since July 22nd.

Continue reading

How data centers need to rethink their vulnerability assessments [Q&A]

Data center

Data centers are increasingly faced with more sophisticated attack techniques, putting the information they hold at risk.

Specific vulnerabilities such as misconfigurations may slip under the radar of traditional security scans. We spoke to Daniel dos Santos, head of security research at Forescout, to discuss the potential impact of these vulnerabilities and why data centers need to strengthen their risk management.

Continue reading

Updated AI engine aims to boost productivity across business teams

Artificial intelligence

Artificial intelligence is finding its way into many areas of business. But its value depends on the quality of the training data and user prompts it receives.

Dynatrace is looking to address this with an update to its Davis AI engine that creates what it calls a 'hypermodal artificial intelligence', combining fact-based, predictive- and causal-AI insights with new generative-AI capabilities.

Continue reading

Generative AI assistant helps secure the cloud

Cloud data protection

Cloud security company Sysdig is launching a new generative AI assistant specifically designed to help with cloud security.

Whereas standard AI chatbots are designed to answer a specific question using a single large language model (LLM) and stateless analysis, Sysdig Sage uses a unique human-to-AI controller that mediates user interactions with LLMs to provide more advanced, tailored recommendations.
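
As a rough illustration of what a mediating controller between a user and an LLM can do -- this is a generic, hypothetical sketch, not Sysdig Sage's actual implementation or API, and every name in it is assumed -- the controller below keeps conversation state and enriches each question with runtime context before any model call.

```python
# Generic, hypothetical sketch of a controller mediating user-to-LLM interaction.
# Not Sysdig Sage's implementation; names and behavior are illustrative only.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Controller:
    llm: Callable[[str], str]                    # stand-in for any LLM call
    history: List[dict] = field(default_factory=list)

    def ask(self, question: str, runtime_context: Dict[str, object]) -> str:
        # Keep state across turns, unlike a stateless single-shot chatbot.
        self.history.append({"role": "user", "content": question})
        # Enrich the question with environment context before it reaches the model.
        prompt = (
            f"Cloud security assistant.\nRuntime context: {runtime_context}\n"
            f"Conversation so far: {self.history}\nQuestion: {question}"
        )
        answer = self.llm(prompt)
        self.history.append({"role": "assistant", "content": answer})
        return answer

if __name__ == "__main__":
    fake_llm = lambda prompt: "Review the flagged bucket policy first."  # stand-in model
    controller = Controller(llm=fake_llm)
    print(controller.ask("Why was this workload flagged?",
                         {"cluster": "prod", "alerts": 3}))
```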

Continue reading

New tool uses AI to help ensure AI-generated content is fit for humans

Artificial Intelligence Bias

Experts reckon that over 90 percent of internet content could be AI-generated by the end of the decade. But we all know that AI isn't perfect; it can introduce biases and errors.

Checking material to ensure it's suitable for the target audience is therefore essential. User experience research platform WEVO is launching a new research tool, WEVO 3.0, to ensure that AI-generated products and experiences are well received by their target human audience.

Continue reading

How software-defined instrumentation is changing testing and measurement [Q&A]

Software testing

The testing and measurement industry, like any other, is looking towards digital transformation projects, but it's a sector that has in the past been slow to adapt.

Software-defined instrumentation looks set to change that and drive transformation efforts forward. We spoke to Daniel Shaddock, the co-founder and CEO of Liquid Instruments and a professor of physics at the Australian National University, to find out more.

Continue reading

Business IT model needs to change to cope with 'workquake'

Office chaos abyss

A pandemic-induced shift to remote work, combined with relentless technological advances such as generative AI, has resulted in a 'workquake', creating a rapidly evolving landscape in which enterprises must adapt to new technologies, working practices, and business models without established procedures.

New research from Doherty Associates identifies a growing complexity avalanche for IT teams. Smaller teams, in particular, report an increase in the breadth and depth of tasks in addition to business-as-usual duties.

Continue reading

Open source supply chain attacks specifically target banking

Broken piggy bank

Researchers at Checkmarx have detected several open-source software supply chain attacks that specifically target the banking sector.

These attacks use advanced techniques, including attaching malicious functionality to specific components in the victim bank's web assets. The attackers also employed deceptive tactics, such as creating fake LinkedIn profiles to appear credible, and set up customized command-and-control centers for each target, exploiting legitimate services for illicit activities.

Continue reading
