Customer Case Study: Cutler-Hammer (EATON) | Treehouse Software




Treehouse Software Customer Case Study:
Cutler-Hammer (EATON)

After looking at several options, Brian Johnson, Database Administrator at Cutler-Hammer, recommended replacing their existing in-house-developed solution with tRelational and DPS for "data processing efficiency". Due to resource availability and timing considerations, Cutler-Hammer contracted Treehouse consulting services to create the tRelational models and perform the DPS implementation.

The following is a discussion between Brian and Treehouse Software's representatives.

Brian, can you tell us a little about Cutler-Hammer?

Cutler-Hammer, a division of Eaton Corporation, is one of the world's leading suppliers of electrical control products and power distribution equipment, with sales of approximately $2 billion in the year 2000. We have approximately 59,000 employees and 195 manufacturing sites in 24 countries.

Cutler-Hammer sells a complete line of low- and medium-voltage assemblies, from substations, switchgear, and panelboards to load centers, transformers, and safety switches. These products are used wherever there is a demand for electrical power: in residences, high-rise apartment and office buildings, commercial sites, hospitals, and factories. Nearly everyone reading this article is affected by Cutler-Hammer products every day, every time you turn on a light or plug in your computer.

What kind of application is this, and who are the key players at Cutler-Hammer for this project?

Vista, as we call our ADABAS/NATURAL applications, is a mission-critical system supporting core functions such as order entry, order management, shipping, and invoicing, and has performed this role for all of Cutler-Hammer since 1991. There are over 260 ADABAS files, the largest containing 40 million records, with approximately 10,000 NATURAL objects. The application processes over 10,000 batch jobs per day, and we usually see a peak of about 1,300 users per day.

Much of the transactional data generated by Vista needs to be analyzed to obtain enough business information to make accurate predictions and correct decisions. Because best-of-breed data warehouse and business intelligence (BI) query tools were available for ORACLE, it became the platform of choice for Mentor and Genesis, our data warehousing projects.

In support of the warehousing effort, I handle the ADABAS issues, Ketan Shah is our Oracle DBA, and Gus Delerme drives the requirements as Decision Support Manager. As is the nature of warehousing efforts, Gus' requirements for the scope of the Data Warehouse projects increased over time, which in turn increased the volume of data propagation from Vista to the warehouse. This exposed the limits of the current process, and it was becoming a maintenance nightmare.

Cutler-Hammer's "Project Scope"

  • Purpose of project: Replace the third-party products from SAS with the third-party tools from Treehouse Software. We use these tools to move data from our Vista system on the ADABAS database to the Mentor and Genesis data warehousing systems on ORACLE.
  • Reason for the tool conversion project: Data processing efficiency.
  • Project Scope: This project will require resources to continue and complete the process of changing our current SAS routines to tRelational scripts. Acceptance will be determined by error-free code running in production for at least thirty days. The data source is ADABAS (four databases) and the target is an ORACLE Operational Data Store.
  • Skills: This project will require UNIX, ADABAS, and ORACLE (PL/SQL) proficiency.
  • Tasks: Set up the tRelational model and build the DPS processes for the appropriate files/tables.

What was the existing data transfer process?

We extracted After Images from the PLOG (via PLEU) and then massaged the extracts with NATURAL programs into fixed-length (uncompressed) format. SAS read the flat files and loaded them into ORACLE tables. This flow ran from OS/390 to UNIX.

What were the motives for replacement?

More and more data was being requested, and the propagation processes were taking longer and longer to complete. Again, maintenance was becoming more difficult to perform and coordinate.

Did you look at other solutions?

We carefully reviewed other product offerings and found that tRelational and DPS were the only products available that could handle the entire ADABAS-to-RDBMS process end-to-end. Most products only offered a piece or two of the solution.


 

"We're impressed. Starting in mid-September, we required Treehouse to develop a 30-day error-free parallel process by the end of the year. They completed the job with plenty of time to spare.

Our experts tell me the products and consulting services Treehouse provided really bailed us out from a very programmatic Data Warehouse solution. They say the new product solution is much easier to maintain and will enable us to significantly expand on the Warehouse.

Additionally, I'm happy to hear that there will be a significant cost savings over time."

Randy Carson
Chief Executive Officer, Cutler-Hammer


 

What features made tRelational and DPS the products of choice?

tRelational and DPS were the most mature solution, having been built to be as efficient as possible. This efficiency could be seen in:

  • PLOG extraction, which reads the PLOG directly in compressed form;
  • data transformation, which uses efficient assembler routines (many are supplied with the product);
  • simplified maintenance, with a modeling tool that reads FDTs and DDMs and generates DDL and DPS parameters; and
  • scalability for future requirements of the Data Warehouse.

Other important features were that the Treehouse solution had absolutely no effect on ADABAS and offered tuning options (e.g., Duplicate Change Scan, Materialization Output).

What roles did Treehouse consulting play in the project?

Treehouse played a key role in the planning and management of the project because they brought experience from past data warehousing and migration projects. Because of this experience, they were able to recommend several improvements to the process.

Treehouse consultants also delivered the needed technical skills in ADABAS, ORACLE, and UNIX for project completion. And in the end, the folks at Treehouse met the given time and deliverable objectives (or we wouldn't have paid them).

Did tRelational and DPS deliver process efficiency improvements?

The DPS process was implemented in parallel with the existing one, so we were able to verify that both produced the same results. Overall, DPS ran in 40% of the time that our SAS process took.

Did tRelational and DPS provide added functionality, flexibility, and maintainability?

Maintenance was easy. All we needed to do was re-implement a file (a menu option in tRelational), and in about one minute I could re-run a batch job to generate DPS parms. Voila!

Did implementation meet your acceptance criteria (30-day parallel)?

Yeah, we paid Treehouse (laughs).

How does this affect your ADABAS system?

The Treehouse ADABAS-to-RDBMS solution actually entrenches Software AG products at Cutler-Hammer by eliminating the pressure to migrate off of ADABAS/NATURAL, since we can now easily get ADABAS data to ORACLE using these TSI products. If it weren't this easy, we'd look at moving to a homogeneous environment (i.e., all ORACLE).

How does this affect your current ORACLE system?

tRelational and DPS allow us to expand the use of data warehouse, data mart, and distributed applications that require data from the ADABAS source.

Are there plans to transfer more data to ORACLE?

You have no idea!

Project Overview

Cutler-Hammer required modeling and mapping of 37 source ADABAS files from 4 production databases to 53 target tables. They did not require an initial Materialization of the ORACLE tables, and they requested a non-typical implementation of DPS Propagation: a full table row image (i.e., Materialization output) as the result of Propagation processing, as opposed to the typical SQL Inserts, Updates, and Deletes.

Project Highlights

TSI met with Cutler-Hammer during the initial project scope and contract agreements and for a single project status meeting, but conducted the consulting services remotely.

tRelational Modeling and Mapping Review

Cutler-Hammer had an existing ORACLE Schema to reproduce in tRelational. TSI utilized the tRelationalPC V4.0.0 ODBC Import option to load the existing ORACLE schema (tables) into the tRelational repository. tRelationalPC's enhanced click, drag, and drop functionality enabled quick and easy ADABAS field to RDBMS column mappings. TSI created four models, one model for each unique Database ID (DBID), containing 3, 4, 12, and 34 tables. TSI created the batch jobs to generate the ORACLE DDL and the DPS Parameters.

tRelational provides the unique ADABAS-field-to-RDBMS-column mapping. An imported schema (via tRelationalPC ERWIN or ODBC Import) contains the physical model (e.g., tables, columns, and constraints). To establish a mapping, click the target column, drag it to the Implemented file area, and drop it on the desired source ADABAS field.

DPS Implementation

TSI implemented a DPS Propagation Extract and Materialization Transformation (achieved by executing PARAMETER='PROPAGATION EXTRACT, MATERIALIZATION TRANS'). The Materialization Transformation output file must then be processed with the DPSSPLIT utility program, which produces individual data files.
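DPSSPLIT's actual record layout and invocation are product-specific and not shown here, but the splitting step can be pictured with a small sketch. In this hypothetical layout, each output record carries its target table name as the first comma-separated field:

```python
from collections import defaultdict

def split_by_table(records):
    """Group combined output records by target table.

    Assumes (hypothetically) that the table name is the first
    comma-separated field of each record; the remainder is the row data.
    """
    files = defaultdict(list)
    for rec in records:
        table, _, row = rec.partition(",")
        files[table].append(row)
    return dict(files)

# One combined output file, destined for two tables:
combined = [
    "ORDERS,1001,WIDGET",
    "CUSTOMERS,42,ACME",
    "ORDERS,1002,GADGET",
]
split = split_by_table(combined)
```

Each resulting group would then be written to its own data file for downstream transfer and loading.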

Mainframe to UNIX (End-to-End) Processing

Cutler-Hammer was concerned about the mainframe FTP processing and the end-to-end mainframe-to-UNIX scheduling. TSI provided a newly developed DPS Service Utility (DSU), which offers an additional means of verifying the reliability of file transfers between the mainframe and a UNIX system. DSU has two pieces: a client portion residing on the mainframe and a server portion running as a daemon on the UNIX system. The utility compares record counts of the files residing on the mainframe and the UNIX system and reports any differences found. DSU then executes a completion script or an error script based on the match or mismatch of the results.
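The count-and-compare pattern DSU implements can be sketched as follows. This is an illustration only: the function and script names are hypothetical, and DSU's real mainframe client/UNIX daemon interface is not shown.

```python
def choose_script(mainframe_count, unix_path, completion_script, error_script):
    """Compare the record count reported for the mainframe file with the
    line count of the transferred UNIX file, and return the script that
    should be run: the completion script on a match, the error script
    on a mismatch."""
    with open(unix_path) as f:
        unix_count = sum(1 for _ in f)
    return completion_script if unix_count == mainframe_count else error_script

# The caller would then launch the chosen script, e.g. with
# subprocess.run([choose_script(3, "/data/orders.dat", "load.sh", "alert.sh")]).
```

Gating the UNIX load behind this check is what gives the end-to-end job stream its "ensured execution": a truncated transfer triggers the error script instead of silently loading partial data.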

TSI implemented a mainframe DPSSPLIT and an FTP and DSU step for each of the split files. Upon successful verification, DSU executes a shell script to perform the UNIX load processing. The Propagation result rows are loaded to temporary "staging" tables and subsequently processed with ORACLE PL/SQL scripts to Update or Insert rows into the production report repository. Note: Cutler-Hammer requested all Deletes to be ignored. These requirements were satisfied with additional processing scripts.
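The staging-to-production step was implemented in ORACLE PL/SQL, which is not reproduced here. As a rough illustration of the same logic (insert-or-update, with deletes ignored per Cutler-Hammer's requirement), here is a sketch using Python's SQLite module as a stand-in; the table, columns, and operation codes are hypothetical:

```python
import sqlite3

def apply_staged_rows(conn, rows):
    """Apply staged propagation rows to the production report table.

    Each row is (op, key, value); "D" (delete) images are skipped,
    everything else is inserted, or updated if the key already exists.
    """
    for op, key, value in rows:
        if op == "D":  # Cutler-Hammer requested all Deletes be ignored
            continue
        conn.execute(
            "INSERT INTO report_repo (id, val) VALUES (?, ?) "
            "ON CONFLICT(id) DO UPDATE SET val = excluded.val",
            (key, value),
        )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE report_repo (id INTEGER PRIMARY KEY, val TEXT)")
apply_staged_rows(
    conn,
    [("I", 1, "a"), ("I", 2, "b"), ("U", 1, "a2"), ("D", 2, "b")],
)
```

After these four staged rows, row 1 holds its updated value and row 2 survives because its delete image was ignored.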

Project Summary

TSI successfully implemented a fully operational system in a test environment to conduct 30-day parallel validation and acceptance testing. TSI modeled the implementation on the existing production ADABAS-to-RDBMS data warehouse and the processes set up during the initial product trial and review of tRelational and DPS.

TSI expanded on the existing procedures and scripts to provide for a more flexible and maintainable implementation. Log files, error trapping, and the ensured end-to-end execution satisfied the client's objectives and accomplished their desired results.

 

 

 



© 2017 Treehouse Software, Inc.