April 2016 Newsletter


By Mike Skeffington

If there is anything we can take away from the last 18 months, it’s that uncertainty abounds in the E&P world. More and more companies are turning to analytics to examine past performance and model future behaviors. Nearly every operator these days is building a business intelligence strategy or a data warehouse infrastructure, or teaching teams of people how to use multivariate analytical tools. However, the challenge that most often arises is not the analysis, but rather the underlying data. More data does not make for better decisions…better data makes better decisions.

In our last newsletter, we said that only when you have current, complete, comprehensive, high-quality, and pervasive data will you be able to realize the full value of information certainty to protect capital, preserve business value, and beat the competition. That’s what ‘better’ data looks like. A business intelligence strategy will not deliver the intended benefits until there is a systematic approach to acquire data, validate it, process it into information, and present it to business users across the enterprise in the systems and formats needed. Better data needs to address all stages of the well lifecycle and provide value to all user groups.


New EnergyIQ Solutions!

By Diana Lovshe

This month has marked several new initiatives for EnergyIQ. Not only have we launched a new and updated website, but we have also introduced new solutions. We invite you to learn about these new solutions by using the links below.


IQinsights comes with everything you need to ensure quality data and the insights to make informed decisions. IQinsights is a robust offering that helps you drive business value and gain a competitive advantage through trusted, valid information. With a single point of access to all relevant and critical well performance information, IQinsights enables you to identify, assess, and adjust business decisions quickly and cost-effectively.

IQinsights includes EnergyIQ’s Data Assurance Service, which ensures the data is always current and complete. The system automatically reports loading errors, and our team of experts diagnoses and resolves each issue, bringing the database back to a trusted state of integrity and giving you peace of mind.


Built on the robust foundation of Trusted Data Manager (TDM), IQgeoscience gives you the ability to load and manage geoscience data from any source, including commercial providers, partners, state governments, and your proprietary data. This includes well and production data, logs, directional surveys, formation picks, frac/stimulation data, documents, and more. Through an aggregation and blending process, your geoscientists can confirm the certainty of the data they have loaded, resulting in the most trusted information.



IQlifecycle is the best solution to help you operate more efficiently. IQlifecycle gives you an in-depth understanding of your operational efficiencies by centrally managing your corporate well information across the life of the well. EnergyIQ’s foundation, TDM, combines an innovative data model with sophisticated software capabilities to create an enterprise well hierarchy. The well hierarchy is created by uniquely identifying each well and all of its components, and establishing and managing the relationships between them. As data is loaded from multiple sources, it is matched with an existing well, then aggregated and blended to create the well hierarchy.



By incorporating IQempower into your business, you are able to bring together intellectual property, asset-oriented data, external regulatory and opportunity data, and operational business information. IQempower supports your critical business decisions by creating a trusted, high-value view of all of your available information, from a single well to a view of your whole business.



Formerly known as ActiveExchange, EnergyIQ’s event management platform has been renamed IQexchange. IQexchange seamlessly translates events into relevant activity, allowing real-time data routing and synchronization so that you always have the most significant and comprehensive information. IQexchange is bundled with data flows that you can plug in immediately, but it is also an open platform that allows you to build custom data flows using a suite of built-in drag-and-drop agents.

Data Objects: The Evolution of PPDM’s ‘What is a Well?’

By Steve Cooper

Steve Cooper presented a paper on this subject at this year’s annual PPDM Data Management Symposium in Houston on April 11-12. There was a lot of positive response and feedback, and we thought we’d share it in our newsletter as well.

A Data Object is defined as a collection of data attributes combined with the information required to manage that object to support business workflows.

Establishing common Data Objects for critical E&P information establishes a platform for effective data management that is a logical extension to accepted industry standards such as the PPDM ‘What is a Well?’ initiative. Having a standard set of attributes defined for a Well, Wellbore and Completion, for example, would greatly facilitate the management, exchange and visualization of information across the Well lifecycle.
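As a rough sketch of what such a standard hierarchy of Data Objects might look like (a Python illustration; the attribute names are assumptions, not the PPDM-defined set), consider:

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative Data Objects; the attribute names are assumptions,
# not the PPDM 'What is a Well?' standard attribute set.

@dataclass
class Completion:
    completion_id: str
    completion_date: Optional[str] = None  # ISO date string, e.g. "2015-06-15"

@dataclass
class Wellbore:
    wellbore_id: str
    total_depth_m: Optional[float] = None
    completions: List[Completion] = field(default_factory=list)

@dataclass
class Well:
    well_id: str
    spud_date: Optional[str] = None
    wellbores: List[Wellbore] = field(default_factory=list)

# A well with one wellbore and one completion:
well = Well(
    well_id="W-001",
    spud_date="2015-03-01",
    wellbores=[Wellbore("WB-001", 3200.0, [Completion("C-001", "2015-06-15")])],
)
```

Defining the attribute set once per object, as the PPDM initiative proposes, makes exchange and visualization straightforward because every participating system agrees on what a Well, Wellbore, and Completion carry.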

Beyond simply defining the attributes that logically belong with a specific Data Object, however, the real value is derived by establishing the rules for the management and quality of the data associated with each object. This represents the practical implementation of the PPDM Business Rules initiative and establishes a foundation for delivering data of an industry recognized standard.

This paper discusses the definition and structure of common Data Objects, along with the benefits of adopting an industry-approved set for the effective management of E&P data.


Data Quality

By Alan Henson

Data science, analytics, and predictive models have been around for a long time. The insurance industry was arguably one of the first to embrace predictive analytics, with huge investments in actuarial tables to help assess the risk of insurance applicants. In fact, the oil and gas industry has been embracing a form of predictive analytics for over a decade, when you consider the type of mathematical models that go into reservoir characterization. However, before relying on predictive analytics or any data-driven model for decision making, one must first consider what’s going into the model. Bad data going in means bad models coming out, and this can lead to costly mistakes in the decision-making process. While analytics provides flashy graphics and interactive displays, it’s the data underneath that makes it possible, and the investment in ensuring high quality pays dividends.

Early in TDM’s life, the TDM Rules Engine was introduced to assess and measure the quality of the data stored within TDM. As a master data management solution focused on upstream oil and gas data, the Rules Engine provided out-of-the-box quality measurement across a variety of data checks comprising over 400 rules in total. To take data quality measurement to the next level, TDM 2016.1, due out in May, will offer a powerful evolution of the TDM Rules Engine that shifts the focus from the data quality of a population to the data quality of the Data Object, a collection of data attributes combined with the information required to manage the object to support business workflows.

The new TDM Data Object Quality Measurement (DOQM) engine focuses on measuring the quality of a record based upon the context of the data present, and it does this using a combination of Rules and Rule Sets. A Rule is what you might expect: an assessment of an attribute, or a grouping of attributes, to determine the data quality of the value(s) present. For example, one might want to determine whether the Spud Date is correct in the context of the Completion Date. A rule such as the following might be written:

wo.SpudDate < wb.CompletionDate
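In Python terms (a hedged sketch, not TDM’s actual LINQ-based evaluation; the dictionary layout is an assumption), evaluating such a contextual rule against two Data Objects might look like:

```python
from datetime import date

# Stand-ins for the Well Origin ('wo') and Wellbore ('wb') Data Objects.
wo = {"SpudDate": date(2015, 3, 1)}
wb = {"CompletionDate": date(2015, 6, 15)}

def spud_before_completion(wo, wb):
    """Contextual rule: the spud date must precede the completion date."""
    spud, completion = wo.get("SpudDate"), wb.get("CompletionDate")
    if spud is None or completion is None:
        return False  # a missing value fails the quality check
    return spud < completion

print(spud_before_completion(wo, wb))  # True for this sample data
```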

In this case, ‘wo’ represents the alias of the Well Origin Data Object and ‘wb’ represents the alias of the Wellbore Data Object. The two attribute values together contextually assess the quality of the data. Building on this concept, multiple Rules can be written and then grouped into Rule Sets based upon some pattern or higher-level quality test (such as testing all dates in a well header, focusing on depth-related tests, or ensuring that non-null fields are populated). When a Rule is assigned to a Rule Set, the Rule is given a weight, which becomes the weight of the Rule within the Rule Set. The weight acts as a measure of importance in determining the Rule Set’s overall data quality score. For example, assume a Rule Set has four rules associated with it:

  • Rule 1 has a weight of 20
  • Rule 2 has a weight of 10
  • Rule 3 has a weight of 5
  • Rule 4 has a weight of 15

This means that if each Rule passed perfectly, the Rule Set would have a maximum possible score of 50/50 (or 100% data quality). However, if Rule 1 failed or only partially passed, it would have a greater impact on the quality score than any of the remaining Rules. This allows Rule Set authors to assign importance to individual Rules and control how each impacts the data quality measurement.
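The weighted scoring described above can be sketched in Python (a minimal illustration; TDM’s engine has its own internals, and the pass-fraction representation is an assumption):

```python
def rule_set_score(results):
    """Compute a weighted data quality score for a Rule Set.

    `results` is a list of (weight, pass_fraction) pairs, where
    pass_fraction is 1.0 for a full pass, 0.0 for a failure, and
    something in between for a partial pass.
    """
    max_score = sum(weight for weight, _ in results)
    score = sum(weight * frac for weight, frac in results)
    return score, max_score

# The four example rules above, all passing perfectly:
score, maximum = rule_set_score([(20, 1.0), (10, 1.0), (5, 1.0), (15, 1.0)])
print(f"{score:g}/{maximum}")  # 50/50, i.e. 100% data quality

# A failure of Rule 1 (weight 20) hurts far more than a failure of
# Rule 3 (weight 5) would:
score, maximum = rule_set_score([(20, 0.0), (10, 1.0), (5, 1.0), (15, 1.0)])
print(f"{score:g}/{maximum}")  # 30/50, i.e. 60%
```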

The full ins and outs of the new Data Object Quality Measurement module are beyond the scope of this article, but there are many other exciting capabilities coming with the 2016.1 release. Two of the new features are:

  • The new Data Object Quality Measurement module works with Data Objects. This means any process can make use of the DOQM engine as long as it can map data to the open Data Object Standard and invoke a RESTful service. The engine is wide open and completely decoupled from the TDM PPDM 3.8 model
  • Rule Set results can now be stored, which allows a specific Data Object’s quality to be tracked over time. This enables trend analysis across a large population of data or down to a specific attribute. Additionally, the DOQM can assess and store quality results for data that doesn’t even reside in TDM (see the first bullet point)

There are many exciting new capabilities coming with the new DOQM module, and this article merely touches on a couple of them. With a Rule syntax based upon Microsoft’s LINQ Expression Language, users are empowered to write rules of varying complexity without requiring a deep knowledge of SQL or the underlying PPDM model. Stay tuned for the 2016.1 release and all of the features that will be coming with it.

A sneak peek at the new DOQM UI:



TDM Optics

By Mike Haden

Although the PPDM data model is well-suited for storing complex and comprehensive upstream operational data, it is not as well-suited to providing high-performance analytical results through tools such as Verdazo or TIBCO’s Spotfire. With the rising popularity of these analytics tools, which can quickly identify areas of improvement in operational efficiency, EnergyIQ has introduced TDM Optics as a complementary element in the EnergyIQ solution space.

TDM Optics is an extensible solution to extract key indicators from the business data, replicate those into a form easily consumed by leading analytic tools, and maintain and update the information on a regular basis. TDM Optics makes use of EnergyIQ’s business and technical knowledge and your data stored in the TDM database to ensure that you get more value out of your data investment. You can integrate your own proprietary data with TDM’s to generate powerful analytics, and the TDM Optics data is kept current with automatic updates. Built upon the underlying PPDM standard and using the TDM “golden” record, TDM Optics provides tools to “flatten” key data facets to the well hierarchy.


The layered design of TDM Optics begins with a large set of small, granular database views which are extensible. These database views are replicated to database tables during an automated update, shifting this load to off-peak hours. Broader views can be layered on top of the database tables, while applications can then use overarching views or the individual database tables as appropriate for best performance.

Partner templates in the leading analytical and GIS mapping tools and/or proprietary dashboards or other applications provide analytical and visual relief to your data landscape. Typical analyses include drilling optimization, benchmarking operator performance, asset acquisition analysis, directional drilling patterns and performance, and production analysis and optimization.


In the first phased release of TDM Optics, the summarized data includes well header, permit and area; cumulative, first ‘n’ operating, last ‘n’ operating, and last ‘n’ calendar months of production; top ‘n’ months of production; and top ‘n’ in first ‘m’ months of production, e.g. sum of the top 3 months of production in the first 6 months of production. Summary values include sum, average, minimum, maximum values over each production time period, and the count of producing months in that time period.
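For example, the ‘top n in first m months’ summary could be computed along these lines (a Python sketch under an assumed data layout, not the TDM Optics implementation):

```python
def top_n_in_first_m(monthly_volumes, n, m):
    """Sum the top `n` monthly volumes within the first `m` producing months.

    `monthly_volumes` lists producing-month volumes in chronological order.
    """
    first_m = monthly_volumes[:m]
    return sum(sorted(first_m, reverse=True)[:n])

# Sum of the top 3 months within the first 6 months of production:
volumes = [120, 340, 310, 280, 150, 90, 60, 40]
print(top_n_in_first_m(volumes, n=3, m=6))  # 340 + 310 + 280 = 930
```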

As an extensible solution component that is easy to implement in any TDM implementation, TDM Optics provides fast time to value with high-quality summary data at each level of the well hierarchy and industry-leading tools and templates, ensuring that you’re getting more value out of your data investment.

IHS Re-issue of Louisiana Production Data

By Amelia Webster

Wow – 5.5 million records loaded per customer!


Over the last six weeks, EnergyIQ support services staff have managed the removal of existing Louisiana Production data, completed loading of the new IHS-delivered baseload records, and resumed normal daily loading for all our customers who subscribe to Louisiana data. This process involved deleting over 4.5 million records in each customer environment, then loading almost 300,000 Production entities, which contributed over 5.5 million records across 14 different PPDM tables. We coordinated with each customer to ensure these processes had the least possible impact on their daily work and were timed to align with when and how IHS delivered the data to each customer.

The good news for all EnergyIQ customers who subscribe to the Data Assurance Service is that we did all the work for you, so everything was business as usual. Click here for our Data Assurance brochure.

Tech Forum Recap

The 3rd Annual Technical Forum was held March 8, 2016 at EnergyIQ’s Houston office. The Forum focused on strategic plans and technical direction for EnergyIQ’s Trusted Data Manager (TDM) application suite, designed specifically for E&P companies. The event was extremely well attended, both in person and via live-streaming webinar for remote attendees. Among the subjects covered, three key topics drew particular interest from the audience:

  • The Next Generation Data Quality Rules Engine
  • IQexchange – Architecture and Technology
  • Analytics – Platinum Value from Your Golden Data

“We are pleased with the result of this year’s Forum,” said Steve Cooper, President of EnergyIQ. “It was exciting to spend a day with our customers and prospects discussing both needs and solutions to their data quality and data management challenges. More than ever, E&P companies need to leverage information across the organization to drive efficiency and reduce costs. This year’s theme was all about data integration and transformation, which are critical to any successful data management strategy.”

If you were not able to join us this year—we missed you! Look for information coming soon about the EnergyIQ Rendezvous 2016 to be held in the Fall.

Meet the team

Chris Verret, VP of Software & Technology, EnergyIQ Exchange

Chris Verret leads the IQexchange team at EnergyIQ, bringing over 16 years of upstream G&G data and application integration experience. He leverages a unique blend of technical and domain expertise, and has a passion for designing and building software solutions that meet the needs of his customers.

Before joining EnergyIQ, Chris led the customer-facing products division at TGS, focused on next-generation delivery methods for geological data. Prior to that, Chris spent 10 years with Volant Solutions, leading the product and services teams through numerous successful releases and implementation projects. Chris graduated from Baylor University with a Bachelor of Business Administration, majoring in Computer Information Systems. When he’s not building software or onsite working with a customer, Chris enjoys spending time with his wife and four children.

TDM Tips & Tricks: Using Well Lists to Save and Share Query Results

Using TDM’s powerful search capabilities, users are able to save useful or common queries to locate their data, even as the data grows or changes. However, at times, a static list of wells is a better option as it is generally faster than re-querying the entire database when the wells of interest are already known.

Any search, run through either the QuickSearch or Queries pages, can be made into a Well List. Well Lists can be helpful once the search parameters have been narrowed down and a particular set of wells has been identified as of interest to your group or business, or in need of special attention. Once a working set of wells has been identified, you can quickly create a Well List by selecting the Create Well List icon in the main window. Created well lists also become available from the Lists tab in the left pane of the Home Page (a refresh of the page may be necessary after the save).


Privacy options allow you to control who can see or edit the list. Setting the list to Public allows all users to see the list, while setting it to Shared allows all users to see and make changes to the contents of the well list. Private restricts viewing and editing to the user who creates it.


Well lists can be used in subsequent queries, further filtered or expanded on, or used as a common baseline when shared between teams working with a particular set of wells. From the Queries page, the option exists to import the Well List directly. This can be done by selecting ‘Well Lists’ from the Query Parameters pane on the left, and then selecting the ‘Select Lists’ link.


Additionally, one can export the list of wells as a text file for use in other applications, either as a set of Government IDs or as a set of TDM E-Keys.

Where can you find us?

We’d love to see you in person. EnergyIQ will be attending and sponsoring many upcoming events; some of them are listed here. Please come by and say hello.

  • May 3, 2016: Spotfire User Group (Dallas TX)
  • May 3, 2016: PPDM Data Management Luncheon (Denver CO)
  • May 11, 2016: Alberta Data Architecture Meeting (Calgary AB)
  • May 17–19, 2016: PNEC Conference (Houston TX)
  • June 7, 2016: PPDM Data Management Luncheon (Oklahoma City OK)
  • June 14, 2016: PPDM Data Management Luncheon (Houston TX)
  • June 15, 2016: PPDM Data Management Luncheon (Calgary AB)
  • June 25–26, 2016: Denver MS150 (Denver and Ft. Collins, CO)

Check back often for updated information!

Care to Share?

Request a Demo
