As the world adopts an increasingly data-centric outlook, businesses must find ways to ensure that they have access to the best data available to assist them in decision making. While analyzing and visualizing data to build business intelligence has always been a key aspect of decision making, the sheer amount of information currently being generated brings unique challenges. It is estimated that 1.8 zettabytes (1.8 trillion gigabytes) of information was created in 2011, a ninefold increase over the previous five years. There are no signs that this rate of production will slow; in fact, the amount of information is estimated to increase fiftyfold by 2020.[1]
First Layer: Data
What’s more, the data being produced comes from a multitude of sources and is stored in a variety of formats, none of which are necessarily compatible with each other. It is no longer a case of simply storing similar data in a database and accessing it as needed. Organizations working in the energy and commodity markets require market, financial, weather, and GIS data (to name a few) in a timely fashion to build actionable information for decision making. Before any intelligence can be derived from these different data types and sources, they must be normalized into a form that can be consumed by internal systems.
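To make the idea concrete, here is a minimal, hypothetical sketch (plain Python, not ZEMA itself) of normalizing price records that arrive as CSV and as XML into one common schema; the field names, date formats, and schema are illustrative assumptions.

```python
# Minimal sketch: normalize price records arriving as CSV and XML into one
# common schema. All field names and formats are illustrative assumptions.
import csv
import io
import xml.etree.ElementTree as ET
from datetime import datetime

def normalize_csv(text):
    """Yield records like {'date': date, 'market': str, 'price': float} from CSV text."""
    for row in csv.DictReader(io.StringIO(text)):
        yield {
            "date": datetime.strptime(row["TradeDate"], "%Y-%m-%d").date(),
            "market": row["Hub"].strip(),
            "price": float(row["SettlePrice"]),
        }

def normalize_xml(text):
    """Yield the same record shape from an XML document."""
    for node in ET.fromstring(text).iter("quote"):
        yield {
            "date": datetime.strptime(node.get("date"), "%d/%m/%Y").date(),
            "market": node.findtext("hub").strip(),
            "price": float(node.findtext("price")),
        }

csv_feed = "TradeDate,Hub,SettlePrice\n2014-08-01,Henry Hub,3.84\n"
xml_feed = '<quotes><quote date="01/08/2014"><hub>NBP</hub><price>6.12</price></quote></quotes>'

records = list(normalize_csv(csv_feed)) + list(normalize_xml(xml_feed))
print(records)  # both feeds now share one schema, ready for a single store
```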
At many companies, a large part of collecting, normalizing, and storing this data is still done manually. With sources ranging from FTP servers, websites, and emails to web APIs, and a vast array of formats including .doc, .csv, .xml, .pdf, and .html, manual data handling can be very costly in terms of time and human-induced errors. To improve efficiency and reduce instances of bad data, companies need a data management system that can automate data capture, handle multiple sources and formats, and validate incoming data. According to leading industry experts and clients, ZEMA excels in all of these areas; let’s see how different groups in the energy and commodity trading markets can leverage it to handle their data concerns.
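As an illustration of automated capture and validation, the following sketch pulls one hypothetical CSV report over HTTP and checks it before it is stored; the URL, column names, and checks are assumptions made for the example, not part of any product.

```python
# Sketch of automated capture and validation for one hypothetical CSV feed.
# The URL, required columns, and checks are illustrative assumptions.
import csv
import io
import urllib.request

FEED_URL = "https://example.com/reports/daily_settlements.csv"  # hypothetical
REQUIRED_COLUMNS = {"TradeDate", "Hub", "SettlePrice"}

def fetch(url):
    with urllib.request.urlopen(url, timeout=30) as resp:
        return resp.read().decode("utf-8")

def validate(text):
    """Return (good_rows, errors); flag missing columns and non-numeric prices."""
    reader = csv.DictReader(io.StringIO(text))
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        return [], [f"missing columns: {sorted(missing)}"]
    good, errors = [], []
    for line_no, row in enumerate(reader, start=2):  # line 2 = first data row
        try:
            row["SettlePrice"] = float(row["SettlePrice"])
            good.append(row)
        except ValueError:
            errors.append(f"line {line_no}: bad price {row['SettlePrice']!r}")
    return good, errors

rows, problems = validate(fetch(FEED_URL))
print(f"{len(rows)} valid rows, {len(problems)} problems")
```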
Second Layer: Visualization
The aforementioned onslaught of data has traders spending more and more of their time dealing with the access and control of data instead of using it to drive decision making. ZEMA gives traders the ability to display not only data, analyses, and charts from ZEMA, but also content from a wide array of other sources, including RSS feeds, websites, and weather maps. The dashboards are dynamic and can be refreshed on a set schedule or on demand. These features allow traders to view all of the information they need in one location, regardless of data type.
In Figure 1, several liquefied natural gas (LNG) information sources are embedded in a dashboard. From the top left, moving clockwise, there are: an RSS feed from an LNG news source, a ZEMA graph showing additions to and withdrawals from US LNG storage over the last three decades, a Bloomberg webpage showing benchmark prices for energy and oil, a ZEMA table showing various LNG freight costs, and a live satellite weather feed of the US Gulf Coast. New sources of information can be added to dashboards, and dashboards can be shared between users, ensuring that traders stay up to date with the markets.

Market participants, including risk managers and analysts, need access to a wide variety of data to build the reports that summarize market status along with current risks and opportunities. These reports demand a great deal of data that is often published in different reports, from multiple sources, and at different time granularities. ZEMA provides an easy-to-use environment where these data sources can be compared even when they are published at different time granularities, making it possible to select the source of data that best meets the organization’s needs.
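For example, a daily price series and a monthly index can be rolled up to a common granularity before they are compared. The sketch below uses pandas with made-up values purely to illustrate the alignment step.

```python
# Sketch: compare a daily price series against a monthly index by
# resampling the daily series to month-end averages. Values are made up.
import pandas as pd

daily = pd.Series(
    [3.8, 3.9, 4.1, 4.0, 3.7, 3.6],
    index=pd.date_range("2014-06-27", periods=6, freq="D"),
    name="daily_hub_price",
)
monthly = pd.Series(
    [3.85, 3.75],
    index=pd.to_datetime(["2014-06-30", "2014-07-31"]),
    name="monthly_index",
)

# Roll the daily series up to monthly granularity, then line the two up.
daily_as_monthly = daily.resample("M").mean()
comparison = pd.concat([daily_as_monthly, monthly], axis=1)
print(comparison)
```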
Last Layer: Integration
ZEMA integration tools empower IT specialists and developers by giving them access to the functionality of ZEMA and the data it warehouses. Through web services, data can be extracted from ZEMA and moved to downstream systems via either SOAP or standard web requests, in a variety of formats. Client Connect allows companies to connect their pre-existing internal databases with ZEMA. The Data Manager (DM) API facilitates the building of custom software solutions using DM’s pre-existing code library; through it, automatic data dumps can be set up to downstream systems in a variety of formats, including .xls, .txt, .xml, and .csv.
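As a generic illustration of this kind of integration flow, the sketch below pulls a dataset over a web request and writes it out as a .csv dump for a downstream system. The endpoint, parameters, and authentication are hypothetical placeholders and do not represent ZEMA’s actual web-service interface.

```python
# Generic sketch: pull a dataset over a web request and hand it to a
# downstream system as a .csv dump. Endpoint, parameters, and token are
# hypothetical placeholders, not an actual product interface.
import csv
import json
import urllib.parse
import urllib.request

ENDPOINT = "https://data.example.com/api/v1/series"   # hypothetical
PARAMS = {"series": "lng_freight_costs", "start": "2014-07-01", "end": "2014-07-31"}
TOKEN = "REPLACE_WITH_API_TOKEN"                       # hypothetical auth scheme

def extract(endpoint, params, token):
    """Fetch the dataset as a JSON list of row dictionaries (assumed response shape)."""
    url = f"{endpoint}?{urllib.parse.urlencode(params)}"
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)

def dump_csv(rows, path):
    """Write the extracted rows to a .csv file for a downstream system."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)

rows = extract(ENDPOINT, PARAMS, TOKEN)
dump_csv(rows, "lng_freight_costs.csv")
```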
ZEMA transforms the external data mess into a single, coherent data source and ensures that the data is in the correct format for consumption by downstream systems. It acts as the data backbone facilitating end-to-end systems integration. This frees individual groups within the organization from having to deal with the ever-shifting landscape of data sources and formats, allowing them to focus on building the actionable intelligence that drives decision making. To see how ZEMA provides seamless end-to-end data management, book a complimentary demo.
[1] Lucas Mearian, “World’s data will grow by 50X in next decade, IDC study predicts,” Computerworld, June 28, 2011, accessed August 5, 2014, http://www.computerworld.com/s/article/9217988/World_s_data_will_grow_by_50X_in_next_decade_IDC_study_predicts?taxonomyId=19&pageNumber=1