TPC Benchmark Status September 1999
TPC Benchmark Status is published about every two months.
The first and primary purpose of the newsletter is to keep interested parties informed about the content, issues, and schedule of the TPC's benchmark development efforts. The second purpose is to invite new members to join these important development efforts.
We've already outlined most of the reasons for joining the TPC in another article, Why Join.
To receive the status report by email, please click here.
Last Meeting
The TPC held a General Council meeting on August 25th in Portland, Oregon. There were no major new developments at this meeting. The Subcommittee activity to highlight was again related to TPC-W. At the June TPC meeting, the Council approved issuing a draft of the TPC-W benchmark for public and TPC member review; at this meeting, the TPC-W Subcommittee reported back on the comments it received.
TPC-W Subcommittee
Jerry Buggert of Unisys, the TPC-W Subcommittee Chair, presented the comments the Subcommittee had received during the company/member review period. During the review period, proactive efforts were made to contact e-commerce companies outside the TPC membership to solicit comments, and many of these companies did provide feedback.
- The Subcommittee received 175 comments from 20 sources. In summary, there were no show-stopping criticisms or holes found in the benchmark workload, and the consensus among Subcommittee members was that the TPC-W benchmark could be completed according to the schedule previously submitted to the Council (see schedule below). The Subcommittee reviewed all the comments it received; they break down as follows:
- 95 comments considered critical or important.
- 21 duplicate comments.
- 59 minor editorial or "desirable" comments.
- The Subcommittee addressed 64 comments at this meeting, including all critical comments. Several of these comments focused on the impact of text searches. The common theme was that typical usage does not require database scans for text searches and that these scans were taking up too much processing time in the benchmark workload. The Subcommittee agreed to minimize the influence of these scans and to allow the use of commercially available text search engines (inside or outside of the DBMS).
- Jerry also reported on some TPC-W prototype results and identified the following major points:
- Most of the time on the database server was spent in text searches and best-seller identification.
- The number of concurrent users was lower than expected (280 versus the 3,000 expected).
- Very low I/O rate on the database server at the 100,000-item scale factor.
- Significant time spent transferring images from the application server.
- Security overhead was low.
- The current specification was too database-centric, and the Subcommittee agreed to change the specification to allow additional functionality in the application server area. The functionality change was to allow application-level caching and to adopt a higher level of security (a brief sketch of such caching follows this list).
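To make the application-server change concrete, here is a minimal sketch of what application-level caching might look like. It is illustrative only: the class name, the time-to-live policy, and the "best_sellers" key are assumptions made for the example, not part of the TPC-W specification.

    import time

    class PageCache:
        """Tiny in-memory cache an application server might use so that popular
        pages (or images) are not regenerated from the database on every request."""

        def __init__(self, ttl_seconds=60):
            self.ttl = ttl_seconds
            self._store = {}  # key -> (expiry time, cached value)

        def get_or_build(self, key, build_fn):
            """Return the cached value for key, rebuilding it only on a miss or expiry."""
            entry = self._store.get(key)
            if entry is not None:
                expires_at, value = entry
                if time.time() <= expires_at:
                    return value              # cache hit: no database work
            value = build_fn()                # cache miss: e.g., query the DBMS and render
            self._store[key] = (time.time() + self.ttl, value)
            return value

    # Hypothetical usage: serve the best-seller page from cache instead of
    # recomputing it against the database for every visitor.
    cache = PageCache(ttl_seconds=30)
    page = cache.get_or_build("best_sellers", lambda: "<html>rendered best-seller page</html>")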
TPC-W Schedule
- 7/1999: TPC-W submitted for TPC company review.
- 10/1999: TPC-W submitted for TPC mail ballot approval.
- 12/1999: TPC-W approved as an official TPC benchmark.
- 12/1999: TPC-W results can be published.
TPC-C Subcommittee
The TPC-C Subcommittee reported on three major items:
- A recommendation to create a new minor revision of the current TPC-C Version 3.4 and move it to Version 3.5 in 60 days. The Subcommittee wanted to clarify the specification's wording to require that pricing, while provided by a specific vendor, should "be obtainable in a majority of the country where it is priced." The Council approved the Subcommittee's recommendation, and Version 3.5 will become the official version on October 25, 1999.
- A report on progress on the new major version (Version 4.0), and the release of a draft of the Version 4.0 benchmark specification on the TPC web site for public review and comment.
- A report on the white paper explaining the changes in Version 4.0. The Subcommittee announced that it is publishing a detailed white paper explaining these changes, scheduled for completion in February 2000.
The goals for TPC-C Version 4.0 can be summarized as follows:
- Recalibrate the benchmark.
- Increase transaction CPU requirements by a factor of 10.
- Reduce transaction disk I/O requirements by a factor of 2.
- Move the read-to-write ratio from approximately 60:40 to 80:20 (illustrated in the sketch after this list).
- Grow the benchmark.
- Add new features that are representative of today's OLTP environment.
- Add and/or adjust metrics to improve the relevance of benchmark results.
- Retain existing benchmark strengths.
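To illustrate what the recalibrated read-to-write ratio means for a benchmark driver, here is a minimal sketch of a transaction-mix selector, assuming a driver simply biases its choice of the next transaction type; the function names and the fixed seed are illustrative, not taken from the TPC-C specification.

    import random

    def make_mix_selector(read_fraction, seed=1999):
        """Return a function that labels each generated transaction as a read or a
        write so that, on average, reads make up read_fraction of the stream."""
        rng = random.Random(seed)  # fixed seed keeps the sketch repeatable
        def next_kind():
            return "read" if rng.random() < read_fraction else "write"
        return next_kind

    # Old calibration (~60:40) versus the Version 4.0 target (~80:20).
    for label, read_fraction in [("60:40 mix", 0.60), ("80:20 mix", 0.80)]:
        pick = make_mix_selector(read_fraction)
        sample = [pick() for _ in range(10000)]
        reads = sample.count("read")
        print(label, "->", reads, "reads,", len(sample) - reads, "writes")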
Recalibrating the benchmark ensures that the tested configurations will be more representative of OLTP environments in the industry. The transactions have been modified to increase in complexity as well as in computational and operational requirements. Because each transaction now does more work and performs less I/O, a given system sustains fewer transactions per minute and needs fewer disks to do so, which in turn reduces the amount of hardware required and, thus, the cost of generating benchmark results. The pricing requirements will be more realistic, with the number of users in the benchmarked system more closely matching those found in an actual customer setting.
In addition to recalibrating the benchmark, new functions were added that are common in the OLTP environment, but were not part of TPC-C Version 3. Functions such as referential integrity and disk parity protection have been added, and the complexity of the flow of the transactions has been increased. Changes to the transactions have increased the database functionality requirements. In addition to the functional enhancements to the benchmark, improvements have been made to the disclosure requirements to ensure that the results fairly represent the hardware and software measured.
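As a concrete picture of what adding referential integrity means, here is a minimal sketch using SQLite from Python. The warehouse and district table names echo TPC-C, but the columns are simplified and the constraint shown is only an assumed example, not the wording of the Version 4.0 requirement.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces foreign keys only when asked

    conn.execute("CREATE TABLE warehouse (w_id INTEGER PRIMARY KEY, w_name TEXT)")
    conn.execute("""
        CREATE TABLE district (
            d_id   INTEGER,
            d_w_id INTEGER NOT NULL REFERENCES warehouse(w_id),  -- the integrity rule
            d_name TEXT,
            PRIMARY KEY (d_w_id, d_id)
        )
    """)

    conn.execute("INSERT INTO warehouse VALUES (1, 'W1')")
    conn.execute("INSERT INTO district VALUES (1, 1, 'D1')")       # parent row exists: accepted

    try:
        conn.execute("INSERT INTO district VALUES (2, 99, 'D2')")  # no warehouse 99: rejected
    except sqlite3.IntegrityError as err:
        print("Rejected by the database, not by application code:", err)

The point of such a requirement is visible in the last statement: the orphaned row is refused by the database engine itself rather than relying on the application to police the relationship.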
Finally, every effort has been made to retain the positive and still relevant aspects of the previous versions of the TPC-C specification.
Overview of New Features
One of the key aspects of improving any application or specification is the introduction of new technologies and requirements. Three new measurement requirements have been added, along with a couple of changes to the reporting requirements. One of the new technology requirements is the introduction of referential integrity. Another is the continuous protection of the durable medium (disks) containing the database. The third requirement is a new approach to defining more realistic workload requirements.
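For the durable-medium requirement, one widely used protection scheme is striping with parity, where any single lost disk block can be rebuilt from the surviving blocks. The sketch below shows only the XOR-parity idea behind such schemes (as in RAID 5); it is a general illustration, not the benchmark's requirement text.

    from functools import reduce

    def parity_block(blocks):
        """XOR equal-length data blocks together to form the parity block."""
        return bytes(reduce(lambda a, b: a ^ b, column) for column in zip(*blocks))

    def rebuild(surviving_blocks, parity):
        """Reconstruct a lost block by XORing the survivors with the parity block."""
        return parity_block(surviving_blocks + [parity])

    # Three data blocks in a stripe, plus the parity computed across them.
    blocks = [b"warehouse", b"district!", b"orderline"]
    parity = parity_block(blocks)

    # Simulate losing the middle block and rebuilding it from the rest of the stripe.
    restored = rebuild([blocks[0], blocks[2]], parity)
    print(restored)  # b'district!'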
Changes have been made to the database schema, and a new table has been added. The schema changes have been made to support the application and processing requirements of the benchmark, and do not necessarily affect the layout of the database or the data access patterns from TPC-C Version 3. The changes made to the reporting requirements are intended to better predict a system's ability to continue processing at the level of reported throughput at the end of the day, not just during its peak measurement.
TPC-C Version 4.0 Schedule
- 10/1999: TPC-C Version 4 (V.4) submitted for TPC company review.
- 12/1999: TPC-C V. 4 submitted for TPC mail ballot approval.
- 2/2000: TPC-C V. 4 approved as an official TPC benchmark.
- 4/2000: First TPC-C V. 4 results can be published.
TPC-R/H Subcommittees
The TPC-R/H Subcommittees (as you will recall, TPC-R and TPC-H were created when TPC-D was split into two workloads in April 1999) held a joint meeting. The primary work of the joint Subcommittees was to recommend creating a new minor version of each benchmark. In addition to making a few minor editorial changes, the Subcommittees wanted to narrow the range of parameters in one of the TPC-R/H queries (Query 18) to make the benchmarks fairer and the results more consistent. The Council approved the Subcommittees' recommendation, and thus on October 25, 1999, TPC-H Version 1.1 becomes Version 1.2 and TPC-R Version 1.0 becomes Version 1.1.
The TPC and Public Involvement
The TPC's focus continues to be on publishing its new TPC-W benchmark and creating a new major revision of TPC-C (Version 4.0). Both benchmark specifications are now available on the TPC web site. The TPC welcomes suggestions on these efforts from any source, so please feel at liberty to send in your comments. Also, it's important to remember that TPC benchmark specifications are industry standards: any party can run and publish a TPC benchmark. Generating a TPC benchmark result is a complex process, but your company doesn't have to go it alone. There are consultants who are familiar with the benchmarks, and if you need references, please feel free to contact me.
All Benchmark Status Reports