How to Analyze 1.8 Trillion GB of Data, and Growing

Analyzing Trillions of Gigabytes of Data
Source: ZE

More than 1.8 trillion gigabytes of information was created in 2011, and that amount is expected to grow by a factor of 50 before 2020 (IDC Digital Universe Study). The cost to organizations of storing this information is projected to increase from US$379.9 million in 2011 to nearly US$6 billion in 2016 (IDC). Given the amount of capital being spent on data archiving, it is evident that businesses understand its value.

Data capture, however, is not a viable stand-alone strategy. Organizations need ways to manage data and technologies to extract the information they need. As Suranjan Sam from the Information Management Group writes, “The true value from data comes from being able to contextualise, consume, and understand it. There is no point in collecting and storing data if it’s completely incomprehensible to the people who need to make decisions based on it” (Business Computing World).

For data to create value, it must reveal knowledge that businesses can use to gain competitive advantages. From this standpoint, the ability to leverage data will become the basis for survival and growth for organizations. The power of analytics is evident to corporations: the use of analytics has increased 300% since 2009 (Accenture). In fact, a 2013 Zpryme survey of 83 executive professionals from U.S. utilities found that 76% of utilities plan “to acquire data grid analytics software and favor solutions integrated into an advanced distribution management system” (Zpryme).

The challenge in analytics lies in extracting and delivering the right information to the right people at the right time. More companies are turning to analytics as their primary predictive tool, but research shows that only 39% of organizations are able to generate information relevant to their business strategies (Accenture). Part of the issue is that only 40% of businesses have access to technology capable of retrieving actionable data, even though “applying technology that ensures data integrity, quality and accessibility” is a major key to achieving effective analytics, according to Narendra Mulani, senior managing director of Accenture Analytics.

ZE PowerGroup Inc. addresses this analytics challenge with Market Analyzer, a component of its enterprise data management software, the ZEMA Suite.

Market Analyzer enables users to retrieve timely market data and perform advanced analysis to improve decision making, gain better insight, and manage performance through an easy-to-use interface. It is simple to find, analyze, and manipulate data through queries using pre-defined formulas and transformations including forward curve construction, extrapolation, interpolation, unit conversions, and more.
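To make the interpolation transformation mentioned above concrete, here is a minimal sketch of linear interpolation on a forward curve in Python. This is an illustration of the general technique, not Market Analyzer's actual implementation; the tenors and prices are hypothetical.

```python
# Hypothetical example: linearly interpolating a missing point on a forward curve.
# Tenors are months-to-delivery; prices are illustrative, not real market data.
known_tenors = [1, 3, 6]           # months to delivery with quoted prices
known_prices = [2.90, 2.70, 2.50]  # corresponding prices (assumed USD/MMBtu)

def interpolate(tenor, tenors, prices):
    """Linearly interpolate between the two quoted tenors bracketing `tenor`."""
    for (t0, p0), (t1, p1) in zip(zip(tenors, prices), zip(tenors[1:], prices[1:])):
        if t0 <= tenor <= t1:
            weight = (tenor - t0) / (t1 - t0)
            return p0 + weight * (p1 - p0)
    raise ValueError("tenor outside the quoted range; extrapolation needed")

# Price for a delivery month halfway between the 1- and 3-month quotes.
print(interpolate(2, known_tenors, known_prices))
```

Extrapolation works similarly but extends the curve beyond the last quoted tenor, which is why a tool would typically expose both as separate transformations.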

With the built-in wildcard search, filter, and data selection tools, users can easily obtain the information they need without comprehensive knowledge of the underlying data.

The graph below illustrates Market Analyzer’s ability to quickly calculate and visualize forward curve spreads and potential arbitrage opportunities. Here we see the differences in CME and ICE prices for Henry Hub penultimate and last day futures contracts.

Figure 1: CME and ICE Prices forward curve visualization.
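The arithmetic behind a spread like the one in Figure 1 is simply the difference between two forward curves at matching delivery months. A minimal Python sketch of that calculation, using hypothetical prices rather than actual CME or ICE settlements:

```python
# Illustrative forward-curve spread calculation.
# Delivery months and prices are hypothetical, not real CME or ICE data.
cme_curve = {"2024-01": 2.85, "2024-02": 2.80, "2024-03": 2.62}
ice_curve = {"2024-01": 2.83, "2024-02": 2.81, "2024-03": 2.60}

def curve_spread(curve_a, curve_b):
    """Return the price difference (a - b) for delivery months common to both curves."""
    common = sorted(set(curve_a) & set(curve_b))
    return {month: round(curve_a[month] - curve_b[month], 4) for month in common}

spread = curve_spread(cme_curve, ice_curve)
for month, diff in spread.items():
    print(month, diff)
```

A persistent non-zero spread between the two exchanges' curves is what would flag a potential arbitrage opportunity for further investigation.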

Not only can Market Analyzer help users retrieve valuable market information, but the software also supports rapid decision-making by allowing users to effortlessly share analysis results across their organizations.

To learn more about Market Analyzer, or to book a free demonstration, please feel free to contact us.
