Google partners with the White House for Climate Data Initiative
Global warming, or global climate change, is a polarizing topic. Many people staunchly believe in it, while others are skeptical. We will not tell you what to believe -- healthy debate is a good thing. However, science is ultimately the deciding factor.
Today, the White House announces the Climate Data Initiative, which should make it easier for people to obtain data on climate change. Many people are likely dubious of this initiative after the initial debacle that was the HealthCare.gov website -- the administration's technology reputation is tainted. Luckily, this time Obama and crew have partnered with Google, which should make for a better experience.
Major organizations face looming information crisis
Big data, cloud technology, social networking and the switch to mobile computing are all contributing to an increase in the amount of information enterprises have to deal with.
This is forcing companies to focus on the information that’s most relevant, risk-related and value-generating. As a result, Gartner is predicting that 33 percent of Fortune 100 organizations will experience an information crisis by 2017, due to their inability to effectively value, govern and trust their enterprise information.
Cray increases revenue and expands its customer base
Supercomputer maker Cray has reported record revenue for 2013 and is moving into new markets.
Moves towards big data, simulations, predictive modeling and other applications have increased demand for the company's products beyond its traditional base of large corporations and government bodies.
Intel Data Platform helps businesses unlock their big data
Many companies are keen to exploit the potential of big data but are wary of the potential costs involved in doing so.
To help businesses get maximum value from their big data investments, Intel is launching the Intel Data Platform, a software suite based on open source technologies.
Elasticsearch adds features and scalability in new release
Elasticsearch, the big data search and analytics specialist, has launched its first major product release.
Elasticsearch 1.0 is built on the company's experience of helping businesses deploy Elasticsearch as part of the ELK stack, which is used by many major organizations including Netflix, SoundCloud and Facebook.
GoGrid simplifies moving to big data
Big data is something that many companies are keen to exploit, but implementing big data solutions involves a number of hurdles.
Open Data Services specialist GoGrid is aiming to make the move to big data easier with its launch of 1-Button Deploy technology.
Microsoft uses Bing search data for Super Bowl purposes
When it comes to American Football, I am a big fan of the Jets. This is a great tragedy in my life, as this team has been consistently bad for many years. There is only one time a year that I will support a different team -- Super Bowl Sunday.
Yes, I like to pick one of the two teams to root for during the big game. Typically, I pick it arbitrarily -- whoever has the prettier cheerleaders, cooler quarterback, etc. This year, I am rooting for the Denver Broncos, because I like Peyton Manning. As a Jets fan, I don't get many opportunities to root for a quality quarterback. However, according to Bing, I am in the minority, as the majority of my state, New York, is rooting for the Seattle Seahawks.
Elasticsearch goes commercial with Marvel real-time monitoring
The popular open source analytics platform Elasticsearch aims to help businesses unlock the power of big data.
The company is launching two new products, Elasticsearch ELK which brings together three open source products to create an end-to-end analytics solution, and its first commercial product Marvel, a real-time management and monitoring solution.
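To give a sense of what an end-to-end analytics request looks like in practice, here is a minimal sketch of an Elasticsearch-style search body combining a full-text query with an aggregation. The index and field names ("logs", "message", "status") are hypothetical, invented for illustration; an ELK deployment would POST a body like this to the Elasticsearch REST search endpoint.

```python
import json

# Hypothetical search request body in the Elasticsearch query DSL:
# a full-text match plus an aggregation that buckets hits by field
# value. Index and field names are made up for this sketch.
search_body = {
    "query": {
        "match": {"message": "timeout"}   # full-text match on a log field
    },
    "aggs": {
        "errors_by_status": {
            "terms": {"field": "status"}  # bucket matching docs by status
        }
    },
    "size": 10,                           # return at most 10 hits
}

# In a real deployment this would be sent to an endpoint such as
# http://localhost:9200/logs/_search; here we just show that it
# serializes to the JSON the REST API expects.
print(json.dumps(search_body, indent=2))
```

The same request returns both the matching documents and the aggregated counts, which is what makes a single Elasticsearch query useful as an analytics building block.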
IBM creates new business unit for Watson supercomputer
Technology giant IBM has announced that it's to invest more than $1 billion to create a new business unit for Watson, the supercomputer that beat human contestants on the TV quiz show Jeopardy.
The new Watson Group will be headed by Michael Rhodin, previously senior vice president of the company’s software solutions group. The unit will be based in New York and have around 2,000 employees.
IBM buys Aspera to speed up movement of big data
One of the problems with big data is its sheer size. This leads to problems when it comes to moving files around and can lead to a loss of competitiveness if companies aren't able to process files in a timely manner. The issue can be magnified if it involves transferring files to and from cloud platforms.
IBM has recognized this problem and in response has swallowed up California-based Aspera, a specialist in high speed transfer techniques. The company's patented "fasp" technology can reduce the transmission times for large files or data sets by up to 99.9 percent. It overcomes bottlenecks in broadband wide area networks that slow the transfer of extremely large files, such as high-definition video or scientific research data, over long distances.
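The long-distance bottleneck mentioned above largely comes down to how standard TCP behaves: a single stream's throughput is roughly capped at window size divided by round-trip time, no matter how fast the link is. A back-of-envelope calculation (numbers illustrative, not from the article) shows why:

```python
# Rough look at the bottleneck that UDP-based transfer protocols like
# Aspera's fasp target: a single standard TCP stream's throughput is
# approximately limited to window_size / round_trip_time.

def tcp_throughput_mbps(window_bytes, rtt_seconds):
    """Approximate maximum single-stream TCP throughput in megabits/sec."""
    return (window_bytes * 8) / rtt_seconds / 1_000_000

# Illustrative: a 64 KB window over a 100 ms transcontinental link.
window = 64 * 1024   # bytes
rtt = 0.100          # seconds
print(f"{tcp_throughput_mbps(window, rtt):.1f} Mbps")  # ~5.2 Mbps
```

So even on a 1 Gbps link, one long-distance TCP stream can crawl along at a few megabits per second, which is why protocols that sidestep this limit can cut transfer times for huge files so dramatically.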
Amazon Web Services rolls out Kinesis to all users
Since it launched in 2006, Amazon Web Services has become a major player in the cloud computing sector. It's now aiming to move further into the big data arena with the rollout of Amazon Kinesis. Kinesis (from the Greek word meaning movement) is a managed service to handle the real-time processing of high-volume streaming data.
By using Amazon Kinesis customers will be able to store and process terabytes of data from hundreds of thousands of sources each hour. This will enable them to write applications that take action on real-time data -- things like website click-streams, marketing and financial transactions, social media feeds, logs and metering data, and location-tracking events.
Data Defined Storage: creating business value from Big Data for financial services organizations
Financial services organizations (FSOs) generate huge volumes of unstructured data -- volumes that roughly double every two years, according to an IDC report. To innovators this signifies increased opportunity for better business insights. However, while these mass volumes of data promise potential value, they can also pose a substantial challenge if the appropriate underlying infrastructure is not in place to let organizations store, protect and understand their data, unlocking the value of information as a strategic business enabler.
Ever-increasing amounts of electronic data, growing standards of accountability, and new rules governing data use and security have resulted in the need for a new approach to managing digital assets -- one that supports business policies and ensures the long-term preservation of data, without compromising quick discovery and access should the need arise.
DataStax releases a NoSQL database with automatic management
Because NoSQL databases are less restrictive than the more conventional relational model, offering simpler design and improved scaling, they're popular for handling big data and real-time web applications. However, this comes at the price of higher maintenance demands.
The latest release of DataStax Enterprise (DSE) 3.2 addresses this with the addition of automated management services, allowing companies to concentrate on generating revenue rather than maintaining the database. This makes it the first NoSQL solution to have management taken care of by the database itself, bringing features that would previously only have been available in products like Oracle to the NoSQL market.
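To make the design trade-off concrete, here is a small sketch in plain Python (not any particular database's API) contrasting the relational approach, which joins normalized tables at read time, with the denormalized modeling typical of NoSQL stores, where a single key lookup answers the query. The customer and order data are invented for illustration:

```python
# Relational style: two normalized "tables" joined by customer_id
# at query time. Flexible, but every read pays for the join.
customers = {1: {"name": "Acme Corp"}}
orders = [{"order_id": 100, "customer_id": 1, "total": 250.0}]

def order_with_customer(order_id):
    """Join an order with its customer record at read time."""
    order = next(o for o in orders if o["order_id"] == order_id)
    return {**order, "customer_name": customers[order["customer_id"]]["name"]}

# NoSQL style: one denormalized record per order, so a single key
# lookup answers the query -- simpler to distribute and scale, at the
# cost of duplicated data the application must keep consistent.
orders_by_id = {
    100: {"order_id": 100, "customer_name": "Acme Corp", "total": 250.0}
}

print(order_with_customer(100))
print(orders_by_id[100])
```

That duplicated, application-managed data is part of why NoSQL deployments have historically carried higher maintenance demands, which is the burden automated management services aim to take off operators.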
New Kapow release streamlines big data use
It seems that everyone is keen to get in on the big data trend at the moment. If you're still unsure of what it is and where it comes from, take a look at the handy infographic we published last week. If you want to start making use of it, then the company behind that graphic, Kapow Software, has just released its latest product to make it easier to extract big data from any source.
Kapow Enterprise 9.3 uses synthetic APIs, which allow it to draw data from a number of sources and integrate it into existing business processes. What the company calls Kapplets enable users to run and manage thousands of automated data integration applications at the same time. They can then view the different data streams in an integrated way and act on the findings.
Where to find and access big data
Big data is now massively important to many organizations. The more data -- both structured and unstructured -- that firms can access and analyze, the better their insight and decision making processes can become, and that in turn can lead to better performance, improved efficiencies, and reduced risk.
Kapow Software has created an attractive new infographic that provides an overview of the different avenues and channels that big data is pulled from. These data pools include archives, docs, media, data storage, social media, business apps, the public web, and sensor data.
BetaNews, your source for breaking tech news, reviews, and in-depth reporting since 1998.