Mix design is arguably the most crucial aspect of customer service, at least for cast-in-place commercial projects. Low breaks can mean catastrophic lost revenue and even lawsuits. Sure, the producer could rely on a gut feeling that a mix design proven on a somewhat similar commercial project will work again and simply submit the old data. But the stakes are too high for that, and the cost of going back to the drawing board for every project is also high.

It certainly would save time and money if the producer could predict mix performance through computer modeling. But consider the daunting number of variables: water-cement ratio, ambient temperature, aggregate gradation, and cement temperature, to name just a few.
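As a rough illustration of how even one of these variables drives performance, the classic Abrams' law relates water-cement ratio to compressive strength. The sketch below is a toy example, not one of the NIST models, and the constants A and B are textbook-style placeholder values rather than figures calibrated to any real mix:

```python
# Toy illustration (not the NIST/VCCTL model): Abrams' law, the classic
# empirical rule S = A / B**(w/c) relating water-cement ratio to 28-day
# compressive strength. A and B are placeholder constants for illustration.

def predicted_strength_psi(w_c_ratio: float,
                           a: float = 14000.0,   # assumed constant, psi
                           b: float = 4.0) -> float:  # assumed constant
    """Estimate 28-day compressive strength from the water-cement ratio."""
    return a / (b ** w_c_ratio)

if __name__ == "__main__":
    for wc in (0.40, 0.50, 0.60):
        print(f"w/c = {wc:.2f} -> ~{predicted_strength_psi(wc):,.0f} psi")
```

Even this single-variable rule shows why modeling pays: a small change in w/c ratio moves predicted strength by hundreds of psi, and the real models juggle dozens of such interacting variables at once.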

Cemex, Dyckerhoff Zement, Holnam, Master Builders Technologies, the Portland Cement Association (PCA), and W.R. Grace joined forces with the National Institute of Standards and Technology (NIST) last January to form the Virtual Cement and Concrete Testing Laboratory (VCCTL) consortium for just such an undertaking. NIST's Gaithersburg, Md.-based Building and Fire Research Laboratory (BFRL) studies and tests building materials and construction methods, and it is involved in NIST's Partnership for High Performance Concrete Technology (PHPCT), which was formed to increase concrete's share of the construction market. A PHPCT knowledge system called "Hypercon" is designed to predict HPC performance. NIST/BFRL has been developing the Hypercon database for 12 years, much of its content compiled at annual workshops. Version 1.0 of a modeling software program that uses the data is available at the VCCTL Web site: http://ciks.cbt.nist.gov/vcctl.

A complementary program called the Concrete Optimization Software Tool (COST) is accessible through the VCCTL site. This program, which should be particularly valuable to the producer, is designed to let visitors optimize concrete mix proportions using statistical and graphical analysis.
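To give a sense of what proportion optimization involves, the toy sketch below searches for the leanest mix that still meets a target strength. It is not COST's actual method; the Abrams-style strength model, the fixed water demand, and the cement price are all illustrative assumptions:

```python
# Toy mix-proportion search in the spirit of an optimization tool (this is
# NOT the COST tool's algorithm). Assumptions: strength follows an
# Abrams-style rule with placeholder constants, water demand per cubic yard
# is fixed by the target slump, and cost is driven by cement content.

WATER_LB_PER_CU_YD = 300.0   # assumed water demand for the target slump
CEMENT_PRICE_PER_LB = 0.05   # assumed cement price, $/lb

def strength_psi(w_c: float) -> float:
    """Abrams-style 28-day strength estimate; constants are placeholders."""
    return 14000.0 / (4.0 ** w_c)

def cheapest_mix(target_psi: float):
    """Grid-search w/c ratios; return (cost, cement lb/cu yd, w/c) or None.

    With water demand fixed, a higher w/c ratio means less cement, so the
    cheapest feasible mix is the highest w/c that still meets strength.
    """
    best = None
    for wc in (x / 100 for x in range(35, 71)):   # w/c from 0.35 to 0.70
        if strength_psi(wc) < target_psi:
            continue                               # fails the strength target
        cement_lb = WATER_LB_PER_CU_YD / wc
        cost = cement_lb * CEMENT_PRICE_PER_LB
        if best is None or cost < best[0]:
            best = (round(cost, 2), round(cement_lb, 1), wc)
    return best

if __name__ == "__main__":
    print(cheapest_mix(6000.0))
```

A real tool optimizes over many more factors at once (admixture dosages, aggregate gradation, air content) and against measured response surfaces rather than a one-line strength rule, but the economic logic is the same: find the proportions that meet spec at the lowest cost.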