My role, as a Solutions Architect and Canadian Oil & Gas data specialist, was to design and implement a pilot project to evaluate Denodo Data Virtualization Server as part of a BI / Data Governance initiative at Harvest Energy. The approach required the development of a number of business and technical test cases that would highlight common data governance and data analysis tasks, as well as tasks not currently available within the IT service catalog.
Challenges: Data Virtualization was a relatively new concept for the business users, and data governance is an IT service that is difficult to attach concrete business value to in a small to medium sized company. To promote a more self-serve environment, many of the test cases relied heavily on the Data Catalog, a fairly new feature within Denodo.
Primary Accomplishment: The proposal is currently under evaluation. The documentation generated has forced the company to take a step back and consider develop...
My role, as a Solutions Architect and Database Developer, was to design and implement a replacement for an existing Daily Production and Budget Monitoring Report. The new report needed to be sourced from an internally hosted Oracle Data Mart consolidating Production data from three sources (P2 Qbyte Financial, Ultimus Budget System, CGI PVR Estimates).
Challenges: The previous report was built in Excel and was manually refreshed (cut and paste) every morning. The process had grown in complexity and scope over time and was very time consuming to maintain, especially at the beginning of every month and year. The PVR data source was developed by a CGI resource, and the project required a lot of co-ordination and planning to deliver a cohesive design.
Primary Accomplishment: The project is currently in its testing phase and the business is evaluating it as part of a larger corporate reporting strategy.
My role, as a Database Developer and Analyst, was to design and implement a procedure for migrating historic SiteCore data into Siteview and Wellview. The resulting approach utilizes the Peloton SDK, an Access export of SiteCore, and a Visual Studio Project to process the historic data, track integrity errors, and append the information into Siteview/Wellview.
Challenges: The majority of the active data already exists in Siteview and Wellview, so additional care had to be taken in order to append this historic information into the production environment.
Primary Accomplishment: The client now has all historic information loaded into their production environment, and can retire the legacy SiteCore system. This saves licensing costs and database resources.
My role, as a Database Developer and Analyst, was to design and implement an interface for an International SAP BI system that would extract, transform, and summarize Canadian operational data. The data required consists of actual (Qbyte) and forecast (Ultimus) amounts from proprietary internal systems, as well as numerous manual sources. The Canadian data is transformed and rationalised into a form that meets the international reporting standards of the owner company and gives them the timely information they need to make critical decisions.
Challenges: Transforming the Canadian operational data so that it met the requirements of the pre-existing SAP BI system but still retaining the original Canadian context for vetting purposes was an essential design element. Integrating spreadsheet data into the process in a maintainable and efficient manner was also critical, as the time constraints put on the process demanded a quick turn-around time between when the data was update...
My role, as a Database Developer and Analyst, was to design and implement a solution for maintaining hierarchical data such that it could be applied to a number of different proprietary systems. The types of data ranged from Alberta Strike Areas to corporate Financial Reporting Hierarchies. The system needed to be able to relate user, location, and other types of data to any level of a hierarchy in such a way as to ensure the synchronization of core data systems.
Challenges: Integration between Oracle and the Windows Enterprise (AD and Exchange Server) was essential in accomplishing requirements like responsibility routing and notification. The application of my NTS and DLS VirtualGrid routines allowed for location-based boundary authentication and map layer creation.
Primary Accomplishment: The client is now able to synchronize all of its core systems, greatly reducing operational confusion and increasing the precision of executive reports.
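The core of the hierarchy solution above is relating users, locations, and other data to any level of a hierarchy and resolving them downward. A minimal sketch of that idea, assuming a simple adjacency-list (node-to-parent) model; the class and field names are illustrative stand-ins, not the production Oracle schema:

```python
# Level-agnostic hierarchy store: items (users, locations, ...) can be
# attached at any node, and a node "sees" everything attached at it or
# at any of its ancestors. Names here are hypothetical.

class Hierarchy:
    def __init__(self):
        self.parent = {}    # node -> parent node (None for the root)
        self.attached = {}  # node -> list of attached items

    def add_node(self, node, parent=None):
        self.parent[node] = parent

    def attach(self, node, item):
        self.attached.setdefault(node, []).append(item)

    def ancestors(self, node):
        """Walk from a node up to the root, inclusive."""
        chain = []
        while node is not None:
            chain.append(node)
            node = self.parent.get(node)
        return chain

    def effective_items(self, node):
        """Items visible at a node: its own plus every ancestor's."""
        items = []
        for n in self.ancestors(node):
            items.extend(self.attached.get(n, []))
        return items
```

For example, an auditor attached at "Corporate" is automatically visible from a "Reporting" node nested two levels below, which is what keeps the related systems in sync when the hierarchy changes.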
My role, as a Database Developer and Analyst, was to design and implement a solution for communicating Land changes (Greensheets) to other departments in an efficient and timely manner. The system would be required to facilitate multiple levels of review and approval as well as the subscription and distribution of this data to other business units. The final design would need to be adaptable to other data sources and data subscribers.
Challenges: This system is an enterprise solution that requires input from a number of business units, which historically have had different perspectives and expectations of the data. The requirements gathering and sign-off phases were essential before coding began. The long tail of the planning phases led to changes in desktop environments that had to be quickly worked around with major tool substitutions. The system infrastructure and basic operations needed to be scalable and dynamic such that additional ‘modules’ could be plugged in...
My role, as a Database Developer and Analyst, was to design and implement a solution for calculating the Asset Retirement Obligation costs for Wells, Pipelines, and Facilities. The system would be required to track each of the entities by interfacing with existing systems and providing a user interface to audit and resolve any issues resulting from the appending and merging of the data from these systems.
Challenges: The calculation was previously performed in an Excel spreadsheet, which served as the main source of specifications and logic. Some additional information was provided by the stakeholders, but the approach, design, and business logic were not pre-defined.
Primary Accomplishment: This client's ARO process is now fully auditable and reproducible, resulting in less time spent with auditors and a more concise understanding of their regulatory obligations.
This system is in use and is being internally maintained.
My role, as a Land Systems Specialist, was to code a number of the data transformation code modules, advise on data mapping schematics, and help troubleshoot CS related data anomalies.
Challenges: The PPDM data model is very flexible and expansive in what it can store. Making the simplistic and sometimes makeshift CS Explorer data fit within this model was very challenging. The biggest challenge of this project was the conversion of the CS inclusion-based Structured Rights data into a format that could be consumed by Husky’s internal GIS system. The format had to be relatable to other internal and external Well data sources such that a meaningful Well schematic could be produced to display the Land information in-line with proprietary engineering and geological data.
Primary Accomplishment: The transformed Land data now contained in their internally managed PPDM data warehouse was fully accessible by their SAP and GIS systems, which allowed them to disseminate and lev...
The Alberta Spacing and Holding Discovery and Update utility has evolved from work I’d done at a number of sites (EOG and Devon Energy), but it wasn’t until I completed the enhancements at PennWest that I feel the GeoWebWorks data was being used to its full potential. In addition to automating and standardizing the calculation of DSU values for Wells using the Spacing Order information, the utility matches Holding information to proprietary Mineral Files to discover situations where Holdings are in default or where the Mineral information is incomplete. Beyond the deliverables derived from a single data import, a history is maintained such that business processes can be triggered off events like a change in a calculated DSU or the inactivation of a Mineral that falls under a Holding.
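The Holding-to-Mineral matching step described above amounts to joining the two record sets and flagging exceptions. A minimal sketch under that assumption; the dict field names are hypothetical stand-ins for the actual GeoWebWorks / Land System columns:

```python
# Match each Holding to its Mineral File and flag the two exception
# cases the utility discovers: no matching Mineral, or an inactive
# Mineral still sitting under a Holding. Field names are illustrative.

def audit_holdings(holdings, minerals):
    """Return (holding_id, issue) pairs for Holdings in default or
    with incomplete Mineral information."""
    by_file = {m["file_no"]: m for m in minerals}
    issues = []
    for h in holdings:
        m = by_file.get(h["mineral_file"])
        if m is None:
            issues.append((h["holding_id"], "no matching mineral file"))
        elif m["status"] != "ACTIVE":
            issues.append((h["holding_id"], "mineral inactive"))
    return issues
```

In the real utility this runs against each data import and the results feed the maintained history, so downstream business processes can trigger off newly appearing issues rather than a full re-audit.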
Challenges (PennWest): This utility is very portable and was designed to be plugged into different sites and Land Systems simply by altering values within cross reference tables. The big...
This project was part of a CS Explorer version 8 deployment for Husky Energy. The existing Structured Rights data needed to be standardized and translated such that Rights searches within CS Explorer were reliable and complete. In addition to ensuring the data integrity within CS, the Rights data was also required to be compliant with Husky’s pre-existing systems.
Challenges: In addition to managing and guiding the manual translation team, I was responsible for developing the metrics for gauging the success of the migration. The challenge in this was that the only tool we had for checking our progress was the CS Explorer Right Regen utility. This tool produces a simple text log with generic errors and File references. During the project we had to press the vendor for a number of bug fixes and at times were testing unreleased code for them. The biggest challenge for this project was devising workarounds that would enable us to use IHS data within CS. Formations and their age...
The Well Offset Discovery utility is a nightly process that leverages Land System data against a public data hub (IHS) to discover, among other things, non-proprietary Wells that may be too close to a company’s Land Rights and/or drawing from a common Pool within a specified Drilling Spacing Unit. It is designed to do the majority of data analysis during off-peak hours and provide up-to-date data for its interface module. Currently, the interface module is an MS Access front end, utilizing Oracle Procedures and Objects for all data operations. The discovery process is a set of Oracle Procedures triggered by a basic Oracle Job.
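The proximity test at the heart of the discovery pass can be sketched in a few lines. This is a simplified illustration assuming planar coordinates, a fixed buffer distance, and hypothetical field names; the production version is a set of Oracle procedures running against IHS data with DSU- and Pool-aware logic:

```python
# Flag public (non-proprietary) wells that fall within a buffer of a
# company land parcel's reference point. Planar distance and the
# 400 m default are illustrative assumptions, not the real rules.

from math import hypot

def find_offsets(company_lands, public_wells, buffer_m=400.0):
    """Return (land_id, well_id, distance) for wells inside the buffer."""
    hits = []
    for land in company_lands:
        for well in public_wells:
            d = hypot(well["x"] - land["x"], well["y"] - land["y"])
            if d <= buffer_m:
                hits.append((land["id"], well["id"], round(d, 1)))
    return hits
```

Running this kind of scan nightly, off-peak, is what lets the Access front end serve current results without paying the cost of querying the external data store interactively.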
Challenges: Performance is always an issue when querying external public data stores like IHS. Separating the discovery process from the browsing interface was essential if we were going to allow users to drill into current data in a timely manner. With the separation of the two functions, a number of server/client checks and balances needed to exist in...
The PO Database Application is a VB application distributed via Citrix Server to local desktops and field offices. It has three separate modules: one for data entry, designed specifically for Operator, Field Office, and Production Accounting use; a second for executive reporting; and a third for system reporting and maintenance. All business and data integrity logic, as well as interface display data, are handled via procedure calls to a PL/SQL Package. The PO Database system allows the field to quickly issue work POs while still maintaining a high level of data integrity. The reporting components pull actual as well as budget data together for a real-time picture of current spending.
Primary Accomplishment: Instead of having to spend the extra time and money to rush the deployment of a new accounting system I was able to create a stop-gap solution that would collect the data in a compatible format within an Oracle environment so that w...
The Capital Report utility is an extract and reporting utility that allows Financial Accounting to generate a Capital Report just after month end and use it to balance transactions against individual records. It has two basic functions. The first extracts all line items for the previous month end from the QBFM system and applies a number of data transformations; the resulting data is exported into an Excel file, which is distributed amongst individual engineers, who then append and update any data that falls within their own Area. All the Area changes are then merged into one final Master copy of the Excel export. The second takes an Excel file in the required format and produces a number of highly formatted, summarized executive reports. This reporting utility can display the same report for either the Area data or the Executive data.
Primary Accomplishment: With this project I was able to improve not only their data int...
The EUB Well Transfer Utility is an Access Database that prepares License Transfers for upload into the AEUB Well Transfer Application via their website using their proprietary XML format. The process takes an Excel spreadsheet created from an export out of a Land System, and generates an XML text file in the format required. It can track a number of transfers based on Transfer ID or a number of other attributes so that a transfer can be re-submitted at any time. Audit reporting is also available to reduce the number of load errors encountered when using the AEUB Web Site.
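The core transformation above is turning spreadsheet rows into an XML upload file. A minimal sketch using the standard library; the element names and row fields below are placeholders for illustration, not the actual proprietary AEUB schema:

```python
# Build an XML transfer document from spreadsheet-style rows.
# Element and field names are hypothetical, not the AEUB format.

import xml.etree.ElementTree as ET

def build_transfer_xml(rows):
    """rows: dicts with transfer_id, licence_no, from_ba, to_ba."""
    root = ET.Element("LicenceTransfers")
    for r in rows:
        t = ET.SubElement(root, "Transfer", id=str(r["transfer_id"]))
        ET.SubElement(t, "Licence").text = str(r["licence_no"])
        ET.SubElement(t, "FromBA").text = r["from_ba"]
        ET.SubElement(t, "ToBA").text = r["to_ba"]
    return ET.tostring(root, encoding="unicode")
```

Because each transfer carries its own ID, regenerating the file for a subset of rows (e.g. only the licenses that failed validation) is a cheap operation, which is what makes the re-submission workflow practical.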
Primary Accomplishment: The Alberta EUB DDS web utility is cumbersome and designed for single license transfers; its batch-transfer capability lacks the mechanism that would allow you to successfully submit a batch without at least one issue. My utility allows my clients to quickly re-generate XML files and react to any error they may encounter as they work through the hundreds of licenses they...
The Land Utilities is an installable VB application that connects to a number of Land system environments (CS Explorer, Landman, and Land Rite). It contains reports and exports that enable porting the information to other applications. There is a Schedule A report that Husky uses to post Land information on their land sale website, as well as map layer exports for Accumap and Maplab.
Challenges: Generic install with automatic database client discovery and login
Primary Accomplishment: An off-the-shelf Land support utility that provides my client with all of the benefits of value-add reports and data checks that help them get the most out of their Land System.