INTRODUCTION

The increasing potential of computer technology influences almost all areas of human activity. In structural design, and in the assessment of reliability in particular, the transition from the slide-rule era to the era of computer and information technologies leads to new approaches. In order to bridge the divide between the existing concepts and qualitatively new methods, reengineering of the reliability assessment process is needed. Attention should be given to education that leads to a transition from the deterministic to the probabilistic "way of thinking", to the development of new design concepts, to the restructuring of specifications, and to the direct application of databases and knowledge bases.

The increasing number of publications dealing with probabilistic approaches to structural reliability assessment indicates a qualitatively new trend in the concepts applicable to the designer's work. The dramatic developments in computer technology make it possible to consider a transition to qualitatively new structural reliability assessment concepts and design processes. Only some ten years ago, powerful personal computers were not available to all designers. Today, a major part of a designer's activities is unmanageable without these powerful tools. How is it possible that one of the key activities, the structural reliability assessment, is still based on methods developed in the era of primitive computational tools? What are the main issues and research needs to take into account when attempting to bridge the divide between the methods used at present and the concepts of the future? Let us turn our attention to some of the main issues.

The concepts applied in the current specifications (such as allowable stress design and the partial factors method) are based on the "design point" approach, i.e., on separating the defined maximum load effects from the minimum resistance in the analysis and on comparing these two quantities. Researchers and specification-writing committees have used deterministic, and later probabilistic, approaches and calibration procedures to specify the corresponding factors and the assessment procedures presented in codes. The designer's involvement in the actual reliability check is limited to the interpretation of the criteria and instructions contained in the specifications.

The current code design strategy does not give the designer a chance to contribute directly to the actual reliability analysis. The designer does not consider the variables and their interaction; he or she applies quantities contained in the specification and does not need to understand clearly the substance of criteria that are often hidden in "black boxes". From the designer's point of view, all the reliability assessment concepts applied in the current specifications are deterministic. The designer has to follow strictly the instructions given in the documents, while the extent to which his or her expertise enters the assessment is not very significant.

It can be observed that the specifications are becoming more and more complex, their volume is increasing, and their content is less and less manageable. Inadequate attention is given to the true role of the reliability assessment. Other objections may be mentioned as well. As has already been reviewed in several papers (see, e.g., The JCSS probabilistic model code), design according to the current specifications based on the partial factors concept does not necessarily lead to balanced safety. Also, the common representation of loadings (characteristic values and load factors) gives only a limited chance for a consistent evaluation of load effect combinations and does not allow at all for an analysis of combinations of load effects with two or more components (see, e.g., Simulation Based Reliability Assessment).

In order to cross the qualitative divide between the current practice and an arrangement corresponding to the potential of the available computer technology, reengineering of the entire assessment procedure should be considered. Reengineering of structural dimensioning and reliability assessment can be defined as the fundamental rethinking and radical redesign of processes to achieve dramatic improvements in critical contemporary measures of the safety, durability, and serviceability of structures and structural components. An important subject of this redesign is the representation and interaction of the variables involved in the analysis of the reliability function. It can be expected that the overall architecture of a future design process will differ considerably from today's routine. One of the main expected improvements is the transition from the semi-probabilistic to the fully probabilistic reliability assessment concept. Such a transition can be expressed by symbolic equations: from the current criterion max S < min R to the probabilistic check Pf = P(R − S < 0) < Pd, where R is the variable resistance, S is the resulting combination of variable load effects, Pf is the probability of failure, and Pd is the specified target probability.
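Stated more formally, the transition can be written as follows (a minimal restatement of the two symbolic criteria above; Pf denotes the probability of failure):

```latex
% Design-point (deterministic) check versus the fully probabilistic check
\max S < \min R
\qquad\longrightarrow\qquad
P_f = P(R - S < 0) < P_d
```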

Attention should be given to the development of databases (containing information about the variables, such as material properties, imperfections, transformation models, loading, etc.) and of knowledge bases. The results of the reengineering should bring about a qualitatively new level of design practice, expressed by significant economic effects related to the efficient use of the available computer technology. The reengineering requires the education of students, designers, and all others involved. Especially at universities and in university extension programs, the training should lead to the probabilistic way of thinking.

The 'fully' probabilistic concepts applicable to specifications can be based on various approaches, such as analytical or numerical procedures (see, e.g., The JCSS probabilistic model code) or simulation techniques (see, for example, SBRA documented in Simulation Based Reliability Assessment). The SBRA concept is based on the Limit States philosophy, on input parameters represented by bounded histograms, and on the Monte Carlo method. All the input variables are expressed by bounded histograms, the reliability function RF = R − S is analyzed with the use of the basic Monte Carlo technique, and the reliability is expressed by the probability of failure Pf. The application of SBRA is demonstrated in some two hundred examples. The application of SBRA in the designer's work would require the approval of criteria assuring the acceptability of the statistical input (see Developments in LRFD in the United States of America).
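As an illustration only, the following sketch estimates Pf by plain Monte Carlo sampling in the spirit described above; the bounded histograms, the bin values, and the target probability Pd are assumed placeholders for this example, not data taken from the SBRA publications.

```python
# Minimal sketch of a fully probabilistic check Pf = P(R - S < 0) < Pd
# using plain Monte Carlo sampling. All histograms and Pd below are
# illustrative assumptions, not calibrated SBRA data.
import numpy as np

rng = np.random.default_rng(seed=1)
N = 1_000_000                                        # number of Monte Carlo trials

# Bounded "histograms" of the input variables (assumed shapes for illustration):
r_bins = np.array([300., 320., 340., 360., 380.])    # resistance bin edges (e.g. kN)
r_freq = np.array([0.05, 0.25, 0.40, 0.30])          # relative frequencies per bin
s_bins = np.array([150., 200., 250., 300., 350.])    # load-effect bin edges (e.g. kN)
s_freq = np.array([0.40, 0.35, 0.20, 0.05])

def sample_histogram(bins, freq, size):
    """Pick a bin according to its relative frequency, then sample uniformly within it."""
    idx = rng.choice(len(freq), size=size, p=freq / freq.sum())
    return rng.uniform(bins[idx], bins[idx + 1])

R = sample_histogram(r_bins, r_freq, N)              # variable resistance
S = sample_histogram(s_bins, s_freq, N)              # combined variable load effect

RF = R - S                                           # reliability function RF = R - S
Pf = np.mean(RF < 0.0)                               # estimated probability of failure

Pd = 7e-5                                            # target probability (illustrative value)
print(f"Pf = {Pf:.2e}, acceptable: {Pf < Pd}")
```

In an actual analysis the histograms would come from calibrated statistical data for loads, material properties, and imperfections, and the number of trials would be chosen with respect to the magnitude of the target probability Pd.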

One of the important issues related to the introduction of the fully probabilistic concept is the definition of the reference levels regarding safety, durability, and serviceability, including the corresponding target probabilities of failure. In the partial factors concept the "ultimate" carrying capacity refers to the collapse (disposal) limit state, while in the case of the probabilistic SBRA concept the reference levels are defined, for example, by the onset of yielding, by a tolerable permanent deformation, or by the magnitude of "damage" in performance design.

The transition from the partial factors concept to fully probabilistic concepts (such as SBRA) requires, among other things, recognition of the fully probabilistic approach in specifications. Recently revised standards (such as ČSN 73 1401) already allow the application of SBRA in design work and specify the target probability Pd.

The concepts applied so far in specifications are restricted to the assessment of structural elements and components. As has been emphasized in the paper Developments in LRFD in the United States of America, it is already time to consider extending the reliability concepts to structural systems. It can be assumed that the probabilistic concept should be introduced in the design of elements and components first and extended to systems after the fundamentals of the probabilistic assessment of components have been made clear to designers.

There has been significant progress in the development of fully probabilistic structural reliability assessment concepts. The dramatic development of computer technology allows for considering the general application of the Limit States philosophy using simulation techniques as a powerful tool. The following issues can be considered:

All the legal aspects of the qualitatively new structural reliability assessment system must be studied by specification committees in cooperation with the state authorities and other institutions having jurisdiction (see, for example, Developments in LRFD in the United States of America).



References

[1] Vrouwenvelder, A.C.W.M. (1997). The JCSS probabilistic model code. Structural Safety, Vol. 19, No. 3, pp. 245-251.

[2] ČSN 73 1401 (1998). Design of Steel Structures. Czech Standards Institute, Prague, Czech Republic.

[3] Marek, P., Guštar, M., and Anagnos, T. (1995). Simulation Based Reliability Assessment. CRC Press, Boca Raton, Florida.

[4] Galambos, T. (1998). Developments in LRFD in the United States of America. In: Structural Engineering World Wide 1998, Elsevier Science Ltd.

[5] Sundararajan, C. (Raj) (1995). Probabilistic Structural Mechanics Handbook. Chapman & Hall.