Improving data analysis with AI [Q&A]


Generative AI is making its presence felt in more and more areas, but there are well-founded concerns about the accuracy of the information it provides.
Is it possible to combine the convenience of a large language model AI system with the logic and accuracy of advanced analytics? Arina Curtis, CEO and co-founder of DataGPT, thinks so. We spoke to her to find out more.
Why not all AI is created equal and how the wrong choice could be hurting your business [Q&A]


AI seems to be everywhere at the moment, but despite its ubiquity, it isn't all the same.
Steve Benton, VP of threat research for Anomali, talked to us about why not all AI is equal and what businesses need to consider to ensure they get the most from the technology.
How to build a successful data lakehouse strategy [Q&A]


The data lakehouse has captured the imagination of modern enterprises looking to streamline their architectures, reduce costs and improve the governance of self-service analytics.
From data mesh support to providing a unified access layer for analytics and data modernization for the hybrid cloud, it offers plenty of business cases, but many organizations are unsure where to start building one.
Why structured data offers LLMs tremendous benefits -- and a major challenge [Q&A]


ChatGPT and other LLMs are designed to be trained on and learn from unstructured data -- namely, text. This has enabled them to support a variety of powerful use cases.
However, these models struggle to analyze structured data, such as numerical and statistical information organized in databases, limiting their potential.
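
One widely used workaround is to serialize structured records into plain sentences before handing them to the model. The sketch below illustrates the idea; the table, column names and final print step are hypothetical examples, not taken from any tool discussed here.

```python
# A minimal sketch, assuming a hypothetical table of revenue records:
# structured rows are flattened into sentences so a text-only LLM can
# reason over them. No real LLM API is called; the prompt is just printed.

rows = [
    {"region": "EMEA", "quarter": "Q3", "revenue_usd": 1_200_000},
    {"region": "APAC", "quarter": "Q3", "revenue_usd": 950_000},
]

def serialize_row(row: dict) -> str:
    """Flatten one database row into a plain-language sentence."""
    return "; ".join(f"{key} is {value}" for key, value in row.items())

prompt = (
    "Given these records, which region had the higher Q3 revenue?\n"
    + "\n".join(serialize_row(r) for r in rows)
)
print(prompt)  # this text, rather than the raw table, is what the model sees
```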
Growing data volumes drive enterprise IT priorities


A new survey of 500 data and IT leaders who manage data workloads of 150 terabytes or more shows that growing volumes of data are increasingly driving business priorities.
The study, from hyperscale data analytics platform Ocient, finds 90 percent of IT and data leaders are planning to remove or replace existing big data and analytics technologies in the next six to 12 months.
Data governance is top enterprise priority when introducing AI


IT and business leaders are largely allowing employee use of generative AI but the majority (66 percent) are concerned about the data governance risks from AI, including privacy, security and the lack of data source transparency in vendor solutions.
The latest 2023 State of Unstructured Data Management survey from Komprise is based on responses from 300 global enterprise storage IT and business decision makers at companies with more than 1,000 employees in the US and UK. It finds that 90 percent of organizations allow employee use of generative AI.
The future of AI lies in open source


I'm almost getting sick of hearing about AI and its ability to change the world for the better, for the worse, for who knows what? But when you get to the heart of what AI is and how it can be applied to unlock value in businesses and everyday life, you have to admit that we're standing on the edge of a revolution. This revolution is likely to change our lives significantly in the short term, and perhaps tremendously so in the medium term.
It wasn't that long ago that I felt sold short by the promise of AI. About eight years ago I saw someone demonstrate a machine's ability to recognize certain flowers. Although impressive, it was a clunky experience, and while I could imagine applications, it didn't excite me. Fast forward a few years, and my real moment of surprise came when I found thispersondoesnotexist. My brain couldn't work out why these were not real people, and it stuck with me. My next big moment was podcast.ai and its first AI-generated discussion between Joe Rogan and Steve Jobs. But, as for everyone else on the planet, the real breakthrough was ChatGPT and the conversation I had with the 'Ghost in the Machine'.
Storage challenges in a world of high-volume, unstructured data [Q&A]


The amount of data held by enterprises is growing at an alarming rate, yet it's often still being stored on 20-year-old technology.
Add to this a proliferation of different types of systems -- or even different storage platforms for specific use cases -- and you have greater complexity at a time when it's hard to find new IT personnel.
The return of data modeling -- this time it's strategic [Q&A]


Over the past decade, data modeling -- setting up data structures aligned to business requirements -- has tended to take something of a back seat as businesses have rushed to bring products to market.
But we're producing more data than ever and need ways to process it effectively. That's why Satish Jayanthi, CTO and co-founder at Coalesce, believes it's time for data modeling to make a comeback in enterprise strategy. We spoke to him to find out more.
Shifting left to improve data reliability [Q&A]


The concept of 'shifting left' is often used in the cybersecurity industry to refer to addressing security earlier in the development process.
But it's something that can be applied to data management too. Shifting left in this sense means performing data reliability checks sooner: running data reliability tests earlier in data pipelines helps keep bad data out of production systems.
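
As a concrete illustration, here is a minimal Python sketch of such a shift-left check, assuming a hypothetical order feed; the field names and validation rules are invented for the example and don't come from any particular product.

```python
# A minimal sketch of a "shift-left" reliability check: validating records
# at ingestion, before they ever reach a production table. The schema and
# rules below are illustrative assumptions, not from any specific tool.

from datetime import datetime

def check_record(record: dict) -> list[str]:
    """Return a list of reliability problems found in one incoming record."""
    problems = []
    if record.get("order_id") is None:
        problems.append("missing order_id")
    if not isinstance(record.get("amount"), (int, float)) or record["amount"] < 0:
        problems.append("amount must be a non-negative number")
    try:
        datetime.fromisoformat(record.get("created_at", ""))
    except ValueError:
        problems.append("created_at is not an ISO timestamp")
    return problems

incoming = [
    {"order_id": 1, "amount": 19.99, "created_at": "2023-10-02T14:00:00"},
    {"order_id": None, "amount": -5, "created_at": "not a date"},
]

# Bad rows are quarantined here, upstream, instead of being discovered
# later in a production dashboard.
good = [r for r in incoming if not check_record(r)]
bad = [r for r in incoming if check_record(r)]
print(f"loaded {len(good)} rows, quarantined {len(bad)}")
```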
More than half of enterprises overwhelmed by data


Today's typical large organization is holding 35 petabytes of data across its systems, and this is expected to double by 2025. But 75 percent of IT leaders are concerned that their current infrastructure won't be able to scale to meet this demand.
A new report, from infrastructure specialist Hitachi Vantara, shows that while technologies like generative AI are spurring a goldrush to greater insights, automation, and predictability, they are simultaneously putting pressure on the already-strained infrastructure and hybrid cloud environments on which they run.
Data quality incidents take two days or more to resolve


The latest State of Data Quality survey from Bigeye finds that more than half of respondents have experienced five or more data issues over the last three months.
Another 40 percent say they have experienced moderate to severe data incidents within the last six months, and that it took a major effort to avoid damage to the business. These incidents range from those severe enough to impact the company's bottom line to those that reduce engineer productivity.
The problem of unstructured data in foundation models [Q&A]


Artificial intelligence is only as good as the data it has to work with, which means large volumes of information are needed to train the software to get the best results.
Ensuring data quality is therefore a key task in any AI implementation. We talked to Snorkel AI CEO Alex Ratner to find out more about the issues involved and how organizations can overcome them.
What if cloud data was stored on floppy disks?

New solution offers cheaper enterprise-grade cloud storage


A new enterprise-grade, native cloud storage solution aims to deliver increased speed, affordability and security thanks to its use of decentralized Web3 technology.
Impossible Cloud supports almost unlimited capacity, and its Object Storage solution offers a scalable, cost-efficient alternative for organizations that require reliable and secure storage.