Treehouse Software Customer Case Study:
Penn State University

The following is a recent discussion between Rusty Sodergren, Lead Database Programmer Analyst at Penn State University, Bill Cook, Manager of Database Administration (DBA) for Enterprise Systems at Penn State University, and Treehouse Project Managers.

Please describe the Penn State University (PSU) organization.

Penn State is a multi-campus, public land-grant university that improves the lives of the people of Pennsylvania, the nation, and the world through integrated, high-quality programs in teaching, research, and service.

Please describe your department and charter.

Information Technology Services (ITS): ITS provides the infrastructure that enables Penn State students, faculty, and staff to make maximum use of appropriate information technology tools in their learning, teaching, research, outreach, administration, and support activities. ITS also provides the cost-effective information technology resources required to support continuous improvement in the University's ability to fulfill its diverse mission.

Please briefly describe your ADABAS/NATURAL environment.

We have two production databases containing the Integrated Business Information Systems (IBIS), the Integrated Student Information Systems (ISIS), and the Alumni Development Information Systems (ADIS).

Currently all of the core University business computing and information systems reside in ADABAS and have been programmed using NATURAL. The ISIS system includes student registration, enrollment, grades, scheduling, student loans, student aid, bursar, and admissions. The IBIS system includes accounting, budget, payroll, inventory, purchasing, human resources, financial systems, and electronic forms. The Alumni system contains all kinds of alumni and development information, donations, giving, etc.

What prompted PSU to evaluate tRelational and DPS?

PSU has been considering various tools over the years to enhance propagation of ADABAS data to other environments, including various RDBMSs and flat files. Like other shops, we have also suffered from the processing overload of NATURAL extracts as a means of data extraction and propagation.

"With the assistance of Treehouse's excellent remote and 'on site' support, we established a process that quickly was production ready."

Bill Cook
Manager of Database Administration (DBA)
Penn State University

We spoke to Mitch Doricich (Treehouse National Sales Manager) and company during a conference in 2002 about tRelational and DPS. Previously, at the 2001 NATURAL Conference, we attended a tRe/DPS demo conducted by Wayne Lashley (Treehouse Director of Technical Operations) and Brian Johnson (from Cutler-Hammer). We talked to other vendors as well. The Treehouse solution had the largest market share, and we knew it was a seasoned product.

Can you describe the tRe/DPS Pilot experience?

The pilot was very successful. We learned a lot about the product set and its capabilities. With the assistance of Treehouse's excellent remote and 'on site' support, we established a process that quickly was production ready.

We completed an end-to-end nightly production propagation process that executed in 15 minutes, including the FTP and the load to ORACLE. This was a significant time savings compared to the existing method, a NATURAL extract that was then used to perform a full refresh of the data.

We were very pleased, and your technical support had an answer for every question. We were impressed with the product's data transformations and features, as well as with the excellent support.

Would you give us some details on the PSU tRe/DPS Projects?

The first project was for the Undergraduate Admissions Office at Penn State. They had developed an application that generates 'match codes' for the names and addresses of all prospective students and applicants for admission. These match codes are used to identify redundant information in the database. For instance, a person could be in the database as "J. Doe" and "John Doe" or as "Bill" and "William." Or, they might have two different addresses in the database that both belong to the same person. The match codes identify these redundancies and help prevent PSU from corresponding with the same person as if they were two or more different people. This reduces the number of duplicate letters, emails, personal contacts, etc., thus saving time, effort, and funds and helping PSU present a more professional image to students, prospects, and their families. It also prevents important information from being split across two different records, such as SAT scores getting assigned to "William" and applicant information getting assigned to "Bill."
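The case study does not describe how PSU's application actually computes its match codes. As a rough illustration of the idea only, the sketch below collapses a name/address pair into a normalized code so that variant spellings of the same person compare equal; the nickname table, code format, and function name are invented for this example.

```python
import re

# Illustrative nickname equivalences -- the real PSU application's
# matching rules are not described in the case study.
NICKNAMES = {"bill": "william", "will": "william", "jim": "james", "bob": "robert"}

def match_code(name: str, address: str) -> str:
    """Collapse a name/address pair into a normalized match code so
    that variants such as "Bill"/"William" or "J."/"John" compare equal."""
    parts = re.sub(r"[^a-z ]", "", name.lower()).split()
    parts = [NICKNAMES.get(p, p) for p in parts]
    given, surname = parts[0][0], parts[-1]          # first initial + full surname
    addr = re.sub(r"[^a-z0-9]", "", address.lower()) # strip punctuation/spaces
    return f"{surname}:{given}:{addr[:12]}"

# "Bill Doe" and "William Doe" at the same address yield the same code,
# so the two records would be flagged as the same person.
a = match_code("Bill Doe", "123 Main St.")
b = match_code("William Doe", "123 Main St")
```

Real-world matching systems typically layer in phonetic encodings and address standardization, but the principle is the same: reduce each record to a canonical key before comparing.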

The Undergraduate Admissions Office, in conjunction with ITS Enterprise Systems DBA group, created a tRelational data model to capture all of the names and addresses from the "person" file and "correspondence" file. Initially, a DPS materialization of the data was done to capture all of the names and addresses (filtering on home address only). Then, daily propagation started, which captured all name and address changes. Both the materialization and propagation processes use ACTALODF, which creates a fixed-length flat file of the information to be passed to a NATURAL program that subsequently reads the file and generates match codes. Additionally, the propagation process uses APC (ADABAS PLOG Consolidation) to capture only the final image of the record, rather than all of the changes that were done in a day, since it is not necessary to know about each change, only the final result.
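The key idea behind the APC step described above is that, when only the end-of-day state matters, a full day of change records can be collapsed to one final image per record. The following is a minimal sketch of that idea under simplified assumptions (in-memory tuples keyed by ISN); it is not the actual APC implementation, which operates on ADABAS PLOGs.

```python
# Sketch of PLOG consolidation: given a day's change records in order,
# keep only the final image of each record (keyed by ISN).
# Illustrative only -- not the actual APC implementation.

def consolidate(changes):
    """changes: list of (isn, action, image) tuples in log order.
    Returns {isn: image} holding just the last surviving image."""
    final = {}
    for isn, action, image in changes:
        if action == "delete":
            final.pop(isn, None)   # record ended the day deleted
        else:                      # "insert" or "update"
            final[isn] = image     # later images overwrite earlier ones
    return final

plog = [
    (101, "insert", {"name": "Bill Doe"}),
    (101, "update", {"name": "William Doe"}),  # same record, later image
    (202, "insert", {"name": "Jane Roe"}),
    (202, "delete", None),                     # created and removed the same day
]
final = consolidate(plog)  # only ISN 101's last image survives
```

Downstream processes then see one row per changed record instead of every intermediate update, which is exactly why PSU found it unnecessary "to know about each change, only the final result."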

It sounds like ACTALODF and APC are going to be well-used features.

Yes. Both ACTALODF and APC are providing a great benefit and are now being used for additional processes, and plans are being made to expand the use of these features. In fact, we recently modified the original data model to feed some new applications that are related to a very important, high-profile project at Penn State. PSU is in the midst of converting the primary faculty/staff/student identifier from the Social Security Number to a new Penn State ID Number. The original data model developed for Admissions was modified to capture some additional name/address data. An application was then written to parse this data to identify name, address, and SSN changes, and then update two new ADABAS files with the SSN changes and match code information.

"We are now able to put together a quick model to capture the data they were looking for. This is a much easier and faster method than using ADASEL."

Bill Cook
Manager of Database Administration (DBA)
Penn State University

There are a couple of other projects currently being discussed that will also use ACTALODF/APC as input to current processes. One use will be to populate report datasets (in SAS) that the Office of Student Aid uses as their data warehouse. They currently use NATURAL extracts and the "pull of data" is a full refresh done weekly. Student Aid plans to implement DPS to capture changes daily so their information is more up to date. This simple process enables the Office of Student Aid to avoid doing a full weekly refresh of the data, saving CPU cycles, as well as providing much better service to their office.

Finally, we have already benefited greatly by using ACTALODF/APC for some internal troubleshooting. Periodically, programmers come to the DBA with a request to identify information contained on a particular PLOG. We are now able to put together a quick model to capture the data they were looking for. This is a much easier and faster method than using ADASEL.

We'd also like to mention that Treehouse wrote a special column routine called COLAMAX for us to capture the last occurrence of a PE, which was very important to this project and other future projects.

Is tRe/DPS helping Penn State in any other areas?

Yes. In addition to the projects previously mentioned, Admissions requested more frequent updates of their data that currently resides in the Penn State Data Warehouse. In particular, they were looking for daily updates to the data, as well as expanded access. A number of methods for meeting these goals were discussed, including modifying the current NATURAL extract process, possibly using ADASEL, or using a product called SAS/Access. Our DBA staff was approached by Admissions for some help using SAS/Access, and we mentioned the tRe/DPS trial to them. It was decided that we'd try some test models using the tRelational Autogen feature. Admissions was thrilled with the results.

As a result of Admissions' request, we began the process of creating normalized staging tables for their office. We did an Autogen of nine ADABAS files. The whole process was very simple and straightforward. The most time-consuming part of the project is running the file analysis, which is required for Autogen, but once that is finished, the rest is gravy. tRelational does everything, including creating the DDL and GENning the parameters used as input by DPS.
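Normalizing an ADABAS file into relational tables, as Autogen does, chiefly means splitting repeating fields (MUs and periodic groups) out into child tables keyed back to the parent record. The sketch below shows that kind of decomposition in miniature; the field names, dict layout, and ISN-plus-sequence keying are illustrative assumptions, not tRelational's actual generated schema.

```python
# Sketch of Autogen-style normalization: an ADABAS record with a
# periodic group (PE) of addresses becomes one parent row plus one
# child row per occurrence, linked by ISN and occurrence number.
# Illustrative only -- not tRelational's generated DDL or mappings.

def normalize(records):
    parents, children = [], []
    for rec in records:
        parents.append({"isn": rec["isn"], "name": rec["name"]})
        for seq, addr in enumerate(rec["addresses"], start=1):
            children.append({"isn": rec["isn"], "seq": seq, "address": addr})
    return parents, children

people = [{"isn": 7, "name": "John Doe",
           "addresses": ["123 Main St", "45 Oak Ave"]}]
parents, children = normalize(people)
```

Applied across nested groups and multiple files, this one-to-many fan-out is how nine ADABAS files can legitimately become 162 relational tables.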

We materialized the nine ADABAS files into 162 MS SQL Server tables. Then, we began daily propagation of inserts, updates, and deletes. The daily process is very fast and has been virtually trouble free. Admissions has written a number of VIEWS of the normalized data that in turn are being used to produce daily Admissions reports both at our University Park and Commonwealth Campuses.

Next, we in DBA will be working closely with Penn State's Data Warehouse Consultant to select warehouse tables that may benefit from being populated from the staging tables. In addition, we will work together in analyzing the feasibility and benefits of adding new tables to the warehouse or restructuring selected existing tables. We anticipate that extracting data from the staging tables will make this work more timely and efficient.

Ultimately, we do not plan on providing direct access to the staging tables, though they may be made available for some ad-hoc reports by a limited number of power users. The long-term plan is to expand the staging tables to include more Autogenned ADABAS files, which may then be used to populate additional existing data warehouse tables or to create new ones.

What were the major decision points or factors in purchasing tRe/DPS?

The major point, and it's really fundamental, is that it allowed us to accomplish things that we could not do otherwise. For instance, we could not populate daily updates to the Admissions data in the Data Warehouse due to the time it takes to run the NATURAL extracts. We had investigated alternative solutions without success.

"... we began daily propagation of inserts, updates, and deletes. The daily process is very fast and has been virtually trouble free."

Bill Cook
Manager of Database Administration (DBA)
Penn State University

How would you rate the products' features, functionality, and performance?

We would give them an A grade. We really have not yet come across a requirement that we could not find a way to accomplish.

How would you rate the Treehouse support, skills, and knowledge related to the processing requirements?

We would give an A+. As good as it could possibly be. We deal with several fine vendors and you guys are at the top. The timeliness, the quality, and the knowledge are excellent. The development of the new COLAMAX ETR met an important requirement, which was very helpful, and we really appreciate this level of customer support. In particular, Dan Sycalik's support and knowledge have been key to our success.

Do you have future plans for additional tRelational and DPS projects?

Absolutely! As a matter of fact, there are a number of projects in the works. Number one is the replacement of the current process of populating the transcript table in the Data Warehouse. The other is converting the NATURAL extract process for the Office of Student Aid Data Warehouse. Recently, we met to discuss an upgrade from weekly to daily updates.

We are beginning to get the word out through various means, including presentations to committees (University-wide Data Access Committee) and internal presentations (Enterprise Systems Group).

Would you recommend tRe/DPS for other clients considering ADABAS data transfer?

Most definitely!



Office Location

2605 Nicholson Road, Suite 1230
Sewickley, PA 15143
