Issues have been identified and logged throughout the Conference Room Pilot. In addition, several open issues may be left over from Project Team Training or other early project activities. At this stage in the project, all open issues should be:
2. Classified as to the nature of the problem or question involved. Categories may include software issues, policy/procedural issues, training issues, technical issues, and others.
3. Given a priority. Priorities should be designated as Medium, High, or Critical; no documented issue should be assigned a priority of "Low".
3. Assigned to a member of the project team, even if resolution will primarily involve personnel outside of the project team.
4. Given a target resolution date. The responsible team member should have a documented solution, or a plan to develop a solution, by the assigned date.
Issues which are likely to result in custom programming should be dealt with as soon as possible. The estimates for any resulting custom development will impact the project budget and schedule.
From this point forward, throughout the remainder of the project, a complete issues list should be maintained, periodically updated, and circulated. The issues list is the minimum project management tool which needs to be in place.
The dates assigned to the various issues need to be consistent with the overall project plan. The project work plan may itself need to be updated periodically, if events dictate a delay in key milestone events, such as System Test, User Training, or Conversion.
Each issue on the issues list should also be recorded on a Project Issue Report form. The sample below illustrates a basic issues list format.
Sample Project Issues List
Project Issues (As of July 25, 19XX)

Issue Num. | Application | Description | Class | Priority | Assigned | Target
1 | A/P | How many remote A/P users need training? | Training | Medium | Bob W. | 8/1
2 | A/P | How to track expense items for one-time vendors. | Software | High | Bob W. | 8/15
3 | SOP | How to add sales tax to Sales Orders? | Software | Critical | Tom J. | 8/31
4 | SOP | Which transactions should Sales Order clerks have access to? | Policy/Proced. | Medium | Tom J. | 8/31
5 | SOP | Need to have additional pricing & discount flexibility for selected customers. | Software | Critical | Tom J. | 9/10
6 | Tech | A second CPU will be needed for custom programming. | Tech | High | Alice P. | 9/10
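Where the issues list is maintained electronically rather than on paper, a minimal sketch of the same record layout might look like the following. The class and field names are illustrative assumptions, not part of the issues list format itself.

```python
from dataclasses import dataclass

# Illustrative issue record mirroring the sample issues list above;
# the field names are assumptions, not part of the published format.
@dataclass
class ProjectIssue:
    number: int
    application: str      # e.g. "A/P", "SOP", "Tech"
    description: str
    classification: str   # Software, Policy/Proced., Training, Tech, ...
    priority: str         # Medium, High, or Critical -- "Low" is not used
    assigned_to: str      # always a member of the project team
    target_date: str      # date by which a solution (or plan) is documented

issues = [
    ProjectIssue(2, "A/P", "How to track expense items for one-time vendors.",
                 "Software", "High", "Bob W.", "8/15"),
    ProjectIssue(3, "SOP", "How to add sales tax to Sales Orders?",
                 "Software", "Critical", "Tom J.", "8/31"),
]

# Critical software issues are the likeliest source of custom programming,
# so surface them first when the list is circulated.
for issue in issues:
    if issue.priority == "Critical" and issue.classification == "Software":
        print(f"#{issue.number} ({issue.application}) target {issue.target_date}: "
              f"{issue.description}")
```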
As custom development work nears completion, a detailed System Test plan should be documented. In cases where no custom programming is being done, the test plan can be written at the conclusion of the Conference Room Pilot.
Building the System Test Plan
The Conference Room Pilot script (containing a function list and exercise documents) is the best source of system test plan material. Functions which were tested in the CRP are the basis for testing the entire system in the System Test.
The System Test plan will reflect three key differences between a CRP and a comprehensive System Test:
1. The System Test will include modifications and system interfaces built during the Development / Programming phase.
2. Conversion programs should be run during the System Test, and the converted data should be used in normal transaction activity afterward.
3. The System Test should include a prediction or calculation of expected results before test cycles are executed, and actual results should be compared with these expectations.
The System Test plan should consist of three components:
1. System Test Overview and Schedule
2. Test Condition / Test Item Cross Reference
3. Test Item Control Sheets
The test plan should begin with the Overview section, stating:
the purpose of the test
the resources who will participate
the testing approach (data sources, transaction flow, outputs and reports)
the time frame involved
A Gantt chart showing the schedule for the different test activities (Ex: Financial applications in week one, Payroll processing in week two, stress testing in week three, etc.) should be included.
The remainder of the System Test plan can be constructed using the forms located at the back of the System Integration section of this guide (Test Condition Inventory and Test Planning Sheet). Instructions for using these forms are included.
The Conduct System Test activity in this guide should be reviewed before the System Test plan is considered complete.
Now that a detailed list of tests has been constructed, the data necessary to perform these tests must be identified and either converted or entered.
As in the CRP preparation, requirements for test data depend on the tests being conducted. Unlike the CRP, the System Test will normally require large volumes of data, rather than small samples.
The following approach to developing test data will generally yield the best results:
1. All data which will be converted (account balances, customer files, etc.) for live use later on should also be converted now, in full volume, for the System Test.
2. Interfaces to and from other systems or hardware platforms should be operated during the System Test, to provide actual data for transaction processing and reporting.
3. Data files which will be manually built when the new system goes live should be keyed for the System Test. An example might be a Vendor Master file where only 150 vendors will reside, making electronic conversion impractical. The volume entered should be sufficient to test all desired conditions, and to fill more than one page on any reports which list this data (transaction registers, master file lists, etc.)
An alternative technique would be to manually load these files to full volume, and then maintain them in parallel with the current production files until the new system goes live.
4. Generally, transaction data (vs. master file data) will be generated during the System Test, and would not be keyed ahead of time.
A typical test model for a test of Financial application software might require:
An Account Master File and a Vendor Master File which are manually keyed in preparation for System Testing.
A 5000-record Customer Master File, converted electronically.
Journal Entries, Invoices, Vouchers and Checks entered and processed during the testing activity itself.
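As one illustration of preparing converted data in full volume, the sketch below generates a flat-file Customer Master extract of roughly the size described above. The file name, field layout, and values are assumptions for illustration only; in practice the converted data would come from the legacy system.

```python
import csv
import random

# Generate a full-volume Customer Master extract for the System Test.
# The 5000-record volume matches the test model above; the field layout
# is an assumption for illustration only.
NUM_CUSTOMERS = 5000

with open("customer_master_test.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["customer_no", "name", "taxable", "credit_limit"])
    for n in range(1, NUM_CUSTOMERS + 1):
        writer.writerow([
            f"C{n:05d}",                          # customer number
            f"Test Customer {n}",                 # name
            random.choice(["Y", "N"]),            # taxable flag, to vary conditions
            random.choice([5000, 10000, 25000]),  # credit limit
        ])

print(f"Wrote {NUM_CUSTOMERS} customer records for conversion testing.")
```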
Once the data is complete and ready for testing, a backup should be made of the entire System Test environment. This backup may need to be restored if for any reason the System Test needs to be restarted.
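A minimal sketch of taking such a backup for a file-based test environment is shown below, assuming the entire System Test environment lives under a single directory; the path is hypothetical.

```python
import shutil
from datetime import datetime

# Archive the entire System Test environment so it can be restored if the
# test must be restarted. The directory path is a hypothetical example.
TEST_ENV_DIR = "/data/system_test"

timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
archive = shutil.make_archive(f"system_test_backup_{timestamp}", "gztar",
                              root_dir=TEST_ENV_DIR)
print(f"System Test environment backed up to {archive}")

# If the test must be restarted, restore from the archive:
# shutil.unpack_archive(archive, TEST_ENV_DIR)
```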
The scope of the system test may include stress testing, integration testing, user testing, and basic system testing. Working definitions of these test categories follow:
Stress Test: Processing a large volume of transactions through the system to test response time and throughput. These tests often involve setting up large numbers of workstations and users to enter transactions simultaneously. Alternatively, transactions which simulate user activity may be generated by programs designed for this purpose.
Stress Testing focuses specifically on system performance and testing the hardware and software under realistic expected production volumes. Stress testing can be costly and time-consuming to prepare for, and is most appropriate when there is considerable doubt as to whether the new system can comfortably handle its expected transaction volume.
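The sketch below shows one simple way such simulated activity could be driven and timed, using threads as stand-in users. The enter_transaction routine is a hypothetical placeholder for whatever program actually posts a transaction in the system under test.

```python
import threading
import time
import random

# Stand-in for the real transaction entry program or API under test.
def enter_transaction(user_id, txn_no):
    time.sleep(random.uniform(0.05, 0.25))   # placeholder for real work

# Each simulated user enters a fixed number of transactions and records
# the response time for each one.
def simulate_user(user_id, txn_count, timings):
    for n in range(txn_count):
        start = time.perf_counter()
        enter_transaction(user_id, n)
        timings.append(time.perf_counter() - start)

NUM_USERS = 25        # simulated workstations entering transactions at once
TXNS_PER_USER = 40    # transactions each simulated user enters

timings = []
threads = [threading.Thread(target=simulate_user, args=(u, TXNS_PER_USER, timings))
           for u in range(NUM_USERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

timings.sort()
print(f"{len(timings)} transactions, "
      f"median {timings[len(timings) // 2]:.3f}s, "
      f"worst {timings[-1]:.3f}s")
```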
Integration Test: Testing the interaction of two or more system components or programs. For example, an Accounts Payable custom Voucher Entry program may be tested along with a custom program which produces checks.
Integration testing almost always involves passing data between one program and another, or sharing data in a file. In the Accounts Payable example, one program may add vouchers to an open item file, while the second program (the check printer) accesses those same vouchers and creates checks. Verifying that the check amounts and other information are accurate, according to the voucher data input, would complete the test.
A series of such tests, encompassing all critical points where data is shared between programs, forms an Integration Test.
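A minimal sketch of one such test, using the Accounts Payable example and simplified stand-ins for the two custom programs, might look like this.

```python
# Integration test sketch for the Accounts Payable example: one routine adds
# vouchers to a shared open item "file" (here a list), a second routine
# produces checks from it, and the test verifies that the amounts agree.
# Both routines are simplified stand-ins for the real custom programs.

open_items = []

def enter_voucher(vendor, amount):
    open_items.append({"vendor": vendor, "amount": amount, "paid": False})

def print_checks():
    checks = []
    for item in open_items:
        if not item["paid"]:
            checks.append({"vendor": item["vendor"], "amount": item["amount"]})
            item["paid"] = True
    return checks

# Test: the checks produced must match the vouchers entered.
vouchers_in = [("Acme Supply", 1250.00), ("Baker Freight", 310.75)]
for vendor, amount in vouchers_in:
    enter_voucher(vendor, amount)

checks_out = print_checks()
assert len(checks_out) == len(vouchers_in)
for (vendor, amount), check in zip(vouchers_in, checks_out):
    assert check["vendor"] == vendor and check["amount"] == amount

print("Integration test passed: check amounts match voucher input.")
```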
User Test: Testing which allows system users an opportunity to designate the system as ready for use, or to raise issues to the contrary. A User Test, sometimes called a User Acceptance Test, is distinguished by the involvement of actual system users, vs. project team members who represent user departments and organizations.
A User Test would be appropriate where there has been little user involvement in the implementation work to date, or where there is a history of user dissatisfaction with automated systems.
Users whose first participation in the project is during a User Test must be trained prior to the testing period.
Basic System Test: The basic System Test is an organized testing activity, following the Test Condition Inventory and Test Planning Sheets completed during system test planning. The Test Planning Sheets are used in the same way the CRP Exercise Documents were used in the Conference Room Pilot.
Basic system testing using these forms is the most detailed and thorough testing which will normally be done, and represents the most important portion of the System Test. The other System Test components described in this section are optional, depending on circumstances such as those outlined below.
During basic system testing, each test cycle groups system transactions into logical functions and a realistic process flow. For example, a test cycle for an Order Processing system might include entering and pricing an order, adding tax and freight amounts, generating a warehouse Pick List, and creating shipping and billing documents. This cycle could include many programs and test conditions.
A second cycle might include the same steps, but involve a different customer type and perhaps non-taxable products in the order itself.
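The sketch below illustrates the expected-versus-actual comparison for one such cycle, using illustrative pricing rules and an order total predicted before the cycle is run.

```python
# Sketch of one basic System Test cycle for Order Processing: price an order,
# add tax and freight, and compare the result with an expected value that was
# calculated before the cycle was run. The pricing rules are illustrative.
TAX_RATE = 0.06
FREIGHT = 15.00

def price_order(lines, taxable):
    subtotal = sum(qty * unit_price for qty, unit_price in lines)
    tax = round(subtotal * TAX_RATE, 2) if taxable else 0.0
    return round(subtotal + tax + FREIGHT, 2)

# Test cycle 1: taxable order, two line items.
order_lines = [(2, 49.95), (1, 200.00)]
expected_total = 332.89   # predicted by hand before the test run
actual_total = price_order(order_lines, taxable=True)

if actual_total == expected_total:
    print(f"Cycle 1 passed: order total {actual_total}")
else:
    print(f"Cycle 1 FAILED: expected {expected_total}, got {actual_total} "
          "-- log a Project Issue Report")
```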
Project Issue Reports should be logged throughout the System Test, and time should be allowed for re-testing as issues are resolved and the system is adjusted.
Summary of System Test Components
Test Component | Circumstances where Most Appropriate
Stress Test | Uncertainty about the computer/network being able to handle the expected volume of transactions or data.
Integration Test | Extensive custom modifications, external interfaces, or sharing of data between programs or applications.
User Test | Little user involvement in the project to date; history of user dissatisfaction with computer systems.
Basic System Test | All implementations.
During system testing, changes are identified which need to be made to custom programs (interfaces and modifications). As discrepancies are discovered in system processing, programs are changed and tests are rerun.
In this manner, many of the problems discovered during system testing are resolved. Since additional custom programming, even minor changes, must be re-tested, the Conduct System Test and Adjust Custom Programming activities overlap considerably.
The System Test activity is not complete until all changes to custom software and to any setup options have been thoroughly tested.
Documentation
Program documentation should be maintained as custom programs are adjusted and finalized. This is particularly critical for technical documentation, such as field mappings, detailed flowcharts, and program comments.
Program revisions made during system testing are often done on a trial-and-error basis, requiring several changes before the revision process is complete. For this reason, waiting until all changes have been made before revising technical and user documentation can minimize the overall documentation effort.
"Freezing" the System
During the first run through the System Test, and corresponding changes to any custom software, the entire software solution is in a somewhat fluid state. Once the last changes are made to the software, the tests need to be rerun for all relevant system areas.
When the last round of tests is complete, all team members and users must understand that no further software changes will be allowed. In addition to custom programming, this includes setup options, code files, user-driven parameters, and any other adjustments which could potentially affect system processing and invalidate the just-completed System Test.
In preparation for User Training and eventual live operation, hardware and facilities must be ordered or otherwise prepared. Items which must be purchased externally need to be ordered with sufficient lead time to allow for setup and testing before being used.
Hardware and other components which must be considered include:
A CPU which will serve as a central computer or network server
Workstations for all system users, including cabling and connection devices
Additional DASD or other data storage to handle expected production volumes
Additional memory and/or additional processors
Network management software, program schedulers, or other utilities
Special forms such as Checks, Purchase Orders, Invoices, and Vouchers
The actual installation of workstations on users' desks may be postponed until just before a planned live date, but planned locations for all terminals, PCs, and printers should be documented as these items are ordered.
Plans for the placement of workstations and printers should be made public in time for management to deal with any questions or disagreements which may arise. A general understanding of the rationale for placement of workstations should be gained by all users, who often see system access as either a status symbol or a burden.
User Training Facility
By the end of the System Integration project phase, the facility which will be used for User Training should be established, and furnished with the necessary hardware and system access. Each workstation in the facility must be individually tested before users are brought in for training.
At the completion of all other activities in the System Integration Phase, the appropriate Project Manager's Checklists need to be reviewed or completed.
Project Manager's Checkpoints, end of System Integration Phase:
Review & Complete Project Administration Checklist
SYSTEM TEST DOCUMENT INSTRUCTIONS
The accompanying System Test documents support the planning and documentation of a thorough System Test. The Test Condition Inventory and the Test Planning Sheet are to be used as follows:
1. Start with the Test Condition Inventory document. Complete the columns labeled "Test Cond. No." and "Test Condition Description". Here you are listing the conditions and variations you want to test. For example: a Sales Order with one Direct Ship item, an order with more than one Direct Ship item, a taxable Sales Order, an order changed after taxes are calculated, etc.
2. Now use the Test Planning Sheet to create individual testing steps. Test exercises should be grouped into test cycles, and numbered. Each exercise should address at least one test condition, and may include several. For example, one test sheet may call for an order to be entered with two Direct Ship items, and for both items to be treated as taxable. Two conditions are thus tested in one test exercise.
3. As testing steps are documented on the Test Planning Sheets, complete the "Test Cycle" and "Test Item" columns on the Test Condition Inventory. Now you have a list of what conditions and variations were tested, and when and how they were tested. This is valuable information.
4. Use the Project Issue Report document to identify system or procedural issues and track them to various dispositions.
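For teams that prefer to track this cross reference electronically, a minimal sketch of the same bookkeeping is shown below; the condition descriptions and cycle/item labels are taken from the examples above, and the data structures are illustrative assumptions.

```python
# Sketch of the Test Condition Inventory cross reference: each condition
# records the test cycle and test item where it was exercised, so untested
# conditions can be spotted before the System Test is declared complete.
conditions = {
    1: "Sales Order with one Direct Ship item",
    2: "Sales Order with more than one Direct Ship item",
    3: "Taxable Sales Order",
    4: "Order changed after taxes are calculated",
}

# (condition number, test cycle, test item) entries are filled in as Test
# Planning Sheets are written; one exercise may cover several conditions.
where_tested = [
    (2, "Cycle 1", "Item 3"),   # two Direct Ship items...
    (3, "Cycle 1", "Item 3"),   # ...both treated as taxable
]

for number, description in conditions.items():
    status = ", ".join(f"{cyc}/{item}"
                       for cond, cyc, item in where_tested if cond == number)
    print(f"Cond {number}: {description} -> {status or 'NOT YET TESTED'}")
```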
Test Condition Inventory (System Test)

Test Cond. No. | Test Condition Description | Where Tested: Test Cycle | Where Tested: Test Item
(blank rows are provided on the form for entries)
System Test
Test Item Control Sheet

A. Test Cycle: ______________________

B. Test Item Number: ______________________

C. Testing Approach:
______________________________________________
______________________________________________

D. Setup/Prerequisites:
______________________________________________
______________________________________________