Title: Assessing external reinforcement on reinforced concrete beams using neural networks
Permanent Link: http://ufdc.ufl.edu/UF00100824/00001
 Material Information
Title: Assessing external reinforcement on reinforced concrete beams using neural networks
Physical Description: Book
Language: English
Creator: Nandy, Sujay, 1976-
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2001
Copyright Date: 2001
 Subjects
Subject: Building Construction thesis, M.S.B.C   ( lcsh )
Dissertations, Academic -- Building Construction -- UF   ( lcsh )
Genre: government publication (state, provincial, territorial, dependent)   ( marcgt )
bibliography   ( marcgt )
theses   ( marcgt )
non-fiction   ( marcgt )
 Notes
Summary: KEYWORDS: external reinforcement, neural networks
Thesis: Thesis (M.S.B.C.)--University of Florida, 2001.
Bibliography: Includes bibliographical references (p. 61-63).
System Details: System requirements: World Wide Web browser and PDF reader.
System Details: Mode of access: World Wide Web.
Statement of Responsibility: by Sujay Nandy.
General Note: Title from first page of PDF file.
General Note: Document formatted into pages; contains ix, 144 p.; also contains graphics.
General Note: Vita.
 Record Information
Bibliographic ID: UF00100824
Volume ID: VID00001
Source Institution: University of Florida
Holding Location: University of Florida
Rights Management: All rights reserved by the source institution and holding location.
Resource Identifier: oclc - 49231708
alephbibnum - 002763568
notis - ANP1590












ASSESSING EXTERNAL REINFORCEMENT ON REINFORCED CONCRETE BEAMS
USING NEURAL NETWORKS
















By

SUJAY NANDY


A THESIS PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
MASTER OF SCIENCE IN BUILDING CONSTRUCTION

UNIVERSITY OF FLORIDA


2001
















ACKNOWLEDGEMENT


I express sincere appreciation to my committee chair and cochair, Dr. Larry Muszynski and

Dr. Ian Flood, for their guidance, insight and support during this research. I also thank Dr. Abdol

Chini, who was on my committee, for his help, guidance and motivation.















TABLE OF CONTENTS


ACKNOWLEDGEMENT .......................................................... ii

LIST OF TABLES ............................................................ v

LIST OF FIGURES ........................................................... vi

ABSTRACT .................................................................. viii

CHAPTERS

1 INTRODUCTION ............................................................ 1

    1.1 Background to the Problem and Literature Review ................... 1
        1.1.1 The Need for External Reinforcement ......................... 1
        1.1.2 Methods of Design/Analysis and their Limitations ............ 2
        1.1.3 Discussion of ANN Applications and How They May Be
              Applicable to this Problem ................................. 8
    1.2 Aim, Research Objectives and Methodology .......................... 11

2 AN OVERVIEW ON THE USE OF NEURAL NETWORKS IN CIVIL
  ENGINEERING ............................................................. 13

    2.1 Application Areas ................................................. 13
    2.2 Rationale for Use in Analysis of External Reinforcement
        Performance ....................................................... 18

3 EXPERIMENTAL SETUP AND DATA GENERATION .................................. 19

4 DEVELOPMENT OF THE ARTIFICIAL NEURAL NETWORK MODEL ...................... 23

    4.1 Establishing Training and Testing Patterns ........................ 23
    4.2 Training Results .................................................. 37

5 PERFORMANCE AND ANALYSIS OF THE ANN MODELS .............................. 40

    5.1 Relative Performance of the ANNs (Accuracy and Speed) ............. 40
    5.2 Analysis of ANN Results to Compare with FEM ....................... 55

6 CONCLUSIONS AND RECOMMENDATIONS ......................................... 58



















LIST OF REFERENCES ........................................................ 61

APPENDICES

A TRAINING DATA ........................................................... 64

B BEAM DATA ............................................................... 124

C TEST SET DATA ........................................................... 133

BIOGRAPHICAL SKETCH ....................................................... 144



















































LIST OF TABLES


Table page

1. X-Section Areas of Tensile Steel and External Reinforcement Configurations ........ 21

2. Ranking of Neural Paradigms in Order of Performance ............................... 39

3. Training Performance of Neural Paradigms .......................................... 41

4. Test Set Performance of Neural Paradigms .......................................... 42

5. Performance of Neural Paradigms Based on External Reinforcement Configuration .... 54















LIST OF FIGURES


Figure page

1. X-Section of Sample Beams ......................................................... 20

2. Beam Loading Configuration ........................................................ 20

3. 3-Layer Simple Backpropagation Network ............................................ 26

4. 3-Layer Backpropagation Network using Turboprop ................................... 27

5. 4-Layer Backpropagation using Turboprop ........................................... 28

6. 3-Layer Backpropagation with Jump Connections ..................................... 29

7. 4-Layer Backpropagation with Jump Connections ..................................... 30

8. 5-Layer Backpropagation with Jump Connections ..................................... 31

9. GRNN Network ...................................................................... 32

10. Ward Networks with 2 Hidden Layers ............................................... 34

11. Ward Networks with 3 Hidden Layers ............................................... 35

12. Ward Network with 2 Hidden Layers (with Input-Output Layers Connected) .......... 36

13. Scatter Plot of 3-Layer Backpropagation Model .................................... 43

14. Scatter Plot of 3-Layer Backpropagation using Turboprop .......................... 44

15. Scatter Plot of 4-Layer Backpropagation using Turboprop .......................... 45

16. Scatter Plot of 3-Layer Backpropagation with Jump Connections .................... 46

17. Scatter Plot of 4-Layer Backpropagation with Jump Connections .................... 47

18. Scatter Plot of 5-Layer Backpropagation with Jump Connections .................... 48

19. Scatter Plot of GRNN Network ..................................................... 49
















20. Scatter Plot of Ward Network with 2 Hidden Layers ................................ 50

21. Scatter Plot of Ward Network with 3 Hidden Layers ................................ 51

22. Scatter Plot of Ward Network with 2 Hidden Layers (Input-Output Connected) ...... 52















Abstract of Thesis Presented to the Graduate School
Of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Master of Science in Building Construction

ASSESSING EXTERNAL REINFORCEMENT ON REINFORCED CONCRETE BEAMS
USING NEURAL NETWORKS

By

Sujay Nandy

August 2001

Chair: Dr. Larry Muszynski
Cochair: Dr. Ian Flood
Major Department: Building Construction.

This thesis aims to develop a neural network model of the performance of

externally reinforced concrete beams that is sufficiently accurate to be of use to practicing

engineers, and to overcome the speed problem of existing models of analysis, thus

facilitating the search for an optimal design solution.

The use of carbon fiber reinforced plastic (CFRP) sheets as externally bonded

reinforcement is now generally recognized as an efficient and valid procedure for

strengthening and upgrading structural concrete members. In this thesis, data collected from

a Wright Laboratory Airbase CFRP laminate project are used for a feasibility study by

applying neural networks to predict the deflection of tested beams.

Finite element methods (FEM) of solution to this problem have a number of

drawbacks; in particular, they are computationally slow, thereby severely limiting the

number of alternative design decisions that can be tested. Earlier work involving the

application of neural networks to similar structural engineering problems has proven









successful, demonstrating the feasibility of the neural network approach in terms of

performance estimates and the speed with which results can be generated.

The performance of alternative types of neural networks is studied and compared

with previous FEM analyses in terms of accuracy and processing time. This thesis

demonstrates that using an artificial neural

network to analyze external reinforcement is feasible, and a well-trained artificial neural

network reveals an extremely fast convergence and a high degree of accuracy.














CHAPTER 1
INTRODUCTION

1.1 Background to the Problem and Literature Review



1.1.1 The Need for External Reinforcement

Several modern cities are being faced with a rapidly deteriorating infrastructure.

The cost of replacing entire deteriorated structures is prohibitive. Research has

therefore focused on methods of strengthening existing structures. The rehabilitation of

concrete structures is a challenging problem faced by engineers today. An acceptable

method of repairing weak or damaged concrete structural members in bridges or buildings

(due to damage, increased load and average daily traffic, marginal design, poor

construction, inferior materials, etc.) is the use of external reinforcement. The advantages

of this method over others such as post-tensioning or additional supports are lower cost,

ease of application and maintenance, elimination of special anchorages and the ability to

strengthen the structure while it remains in use (Crasto and Kim, 1996). Reinforcement may

be in the form of bonded steel plates or fiber-reinforced plastics (FRPs). The bonded steel

plate effectively acts as a second layer of reinforcement, increasing flexural strength by up

to 40%. Epoxy bonded steel plates have been extensively used to strengthen buildings in

Switzerland, and bridges in England, the former USSR, Poland and Japan (Dussek, 1990;

Klaiber, 1987; Maeda, 1980). However, the potential for corrosion at the epoxy/steel

interface is a significant drawback of using bonded steel plates. Experiments to determine

steel/adhesive interface residual strength after 15 years of exposure to weathering, at the









Swiss Federal Laboratories, confirmed this fact (MacDonald and Calder, 1982). Fiber-

reinforced composites are now generally recognized to offer superior features. Carbon

fiber-reinforced plastics (CFRPs) possess high specific stiffness, strength, and durability

in saline/marine environments, are resistant to corrosion by chemicals, and are more cost-

effective compared to bonded steel reinforcement. Lab tests in Switzerland and Germany

have also shown that replacement of steel plates with CFRP can reduce the total cost of a

reinforcing project by about 20% (Mufti,1991). The weight of a structure reinforced with

CFRP will also be significantly lower than one reinforced with steel plates.



1.1.2 Methods of Design/Analysis and their Limitations

Finite element analysis (FEM) has been used to calculate the behavior of externally

reinforced structural concrete beams. However, the FEM approach is computationally

expensive, often involving CPU times of several hours just for two-dimensional analysis

(Saadatmanesh and Malek, 1998) and is awkward to use when modeling geometrically

complicated forms. More often than not, the FEM approach requires a three-dimensional model, thus

increasing processing time by an order of magnitude. Both the lengthy CPU time and the

inconvenience of building FEM models for this type of problem reduce the number of

alternative external reinforcement configurations that can be evaluated in a study and

hinder the search for an optimal solution (Flood et al., 2000). This thesis evaluates an

alternative approach to the problem, using neural networks developed from observations of

the actual performance of beams tested in the laboratory. The objective of the research is to

determine the viability of using neural networks to obtain accurate predictions of the

performance of externally reinforced concrete beams, and to assess the speed at which the

method can generate solutions.









Considerable research has been done involving the study of strengthening concrete

structures through the application of external reinforcement. Hussain et al. (1995) have

analyzed the Flexural Behavior of Precracked Reinforced Concrete Beams Strengthened

Externally by Steel Plates. Their paper presents comprehensive data and interpretation on

the plate bonding repair technique in terms of effects of plate thickness and end anchorage

on ductility, ultimate load and mode of failure. Reinforced concrete beams were preloaded

to 85% of their ultimate capacity and subsequently repaired by bonding steel plates of

different thicknesses with and without end anchorages. Anchor bolts were used for end

anchorages. The repaired beams showed higher strength than the original beams, provided

the plates did not exceed a certain limiting thickness. Increasing the plate thickness changed

the mode of failure of the repaired beams from flexural to premature failure, developed due

to shear and/or tearing of the plate, causing a reduction in ductility. End anchorages to the

bonded plates could not prevent the premature failure of the beams but improved ductility

with decreasing significance as the plate thickness increased, and yielded a marginal

improvement in ultimate strength. A procedure for designing the bonded plate is suggested,

whereby the maximum shear and peeling stresses at the interface do not exceed the

corresponding limiting values at which tearing of concrete takes place.

Chajes et al. (1994) have researched the Flexural Strengthening Of Concrete

Beams Using Externally Bonded Composite Materials. A series of reinforced concrete

beams were tested in four-point bending to determine the ability of externally-bonded

composite fabrics to improve the beams' flexural capacity. The fabrics used were made of

aramid, E-glass and graphite fibers, and were bonded to the beams using a two-part epoxy

adhesive. The different fabrics were chosen to allow a variety of fabric stiffnesses and









strengths to be studied. The external composite fabric reinforcement led to a 36 to 57%

increase in flexural capacity and a 45 to 53% increase in flexural stiffness. For the beams

reinforced with E-glass and graphite fiber fabrics, tensile failures occurred in the

maximum moment region. The beams reinforced with aramid fabric failed due to the

crushing of the concrete in the compression zone. In addition to the test results, an

analytical model based on the stress-strain relationships of the concrete, steel and

composite fabrics is presented. Using the model, beam response is computed and

compared with the experimental results. The comparisons indicate that the flexural

behavior of composite-fabric-reinforced concrete beams can be accurately predicted using

the described method.

Chajes et al. (1995) have demonstrated the Shear Strengthening Of Reinforced

Concrete Beams Using Externally Applied Composite Fabrics. A series of twelve under-

reinforced concrete beams was tested to study the effectiveness of using externally applied

composite fabrics as a method of increasing a beam's shear capacity. Woven composite

fabrics made of aramid, E-glass, and graphite fibers were bonded to the web of the T-

beams using a two-component epoxy adhesive. The three different fabrics were chosen to

allow various fabric stiffnesses and strengths to be studied. The beams were tested in

flexure, and the performance of eight beams with external shear reinforcement was

compared with the results of four control beams with no external reinforcement. All the

beams failed in shear and those with composite reinforcement displayed excellent bond

characteristics. For the beams with external reinforcement, increases in ultimate strength of

60 to 150 percent were achieved.









Sharif et al. (1994) studied the Strengthening Of Initially Loaded Reinforced

Concrete Beams Using FRP Plates. The repair of initially loaded reinforced concrete

beams with epoxy-bonded fiberglass reinforced plastic (FRP) plates is experimentally

investigated. The RC beams are initially loaded to 85% of the ultimate flexural capacity

and subsequently repaired with FRP plates, bonded to the soffit of the beam. The plate

thickness is varied to assess the premature failure initiated at the plate curtailment zone due

to the high concentration of shear and peeling stresses. Different repair and anchoring

schemes are conducted in an effort to eliminate such failures and insure ductile behavior.

The behavior of the repaired beams is represented by load-deflection curves and the

different modes of failure are discussed. The results generally indicate that the flexural

strength of the repaired beams is increased. The ductile behavior of the repaired beams

increases as the plate thickness decreases. The use of an I-jacket plate provided a proper

anchorage system and improved the ductility of beams repaired with plates of large

thickness.

Shahawy et al. (1995) have investigated Flexural Behavior Of Reinforced

Concrete Rectangular Beams Strengthened With CFRP Laminates. The reinforcement

used was epoxy-bonded carbon fiber reinforced plastic (CFRP) laminate. Comprehensive

test data are presented on the effect of CFRP laminates, bonded to the soffit of a beam, on

the first crack load, cracking behavior, deflections, serviceability loads, ultimate strength

and failure modes. The increase in strength and stiffness provided by the bonded laminates

is assessed by varying the number of laminates. The results generally indicate that the flexural

strength of strengthened beams is significantly increased. Theoretical analysis using

specially developed computer software is presented to predict the ultimate strength and









moment-deflection behavior of the beams. The comparison of the experimental results with

theoretical values is presented, along with an investigation of failure modes.

Ritchie et al. (1991) have investigated the External Reinforcement Of Concrete

Beams Using Fiber Reinforced Plastics. A series of 16 under-reinforced beams was

tested to study the effectiveness of external strengthening using fiber-reinforced plastic

(FRP) plates. Plates of glass, carbon and aramid fibers were bonded to the tension side of

the beams using a two-part epoxy. FRP is attractive for this application due to its good

tensile strength, low weight and resistance to corrosion. An iterative analytical method was

developed to predict the stiffness and maximum strength in bending of the plated beam.

Increases in stiffness (over the working load range) from 17 to 99 percent and increases in

strength (ultimate) from 40 to 97 percent were achieved for the beams with FRP plates.

Predicted and actual load-deflection curves showed fairly good agreement, although

generally the theoretical curves were stiffer. Experimental failure did not occur in the

maximum moment region on many of the beams, despite attempts at end anchorage to

postpone local shear failure. The ultimate loads of the beams that did fail in the maximum

moment region were within about 5 percent of predicted values.

Bohner and Burleigh (1998) studied Thermal Non-Destructive Testing (TNDT) Of

Composite Reinforcement Adhesively Bonded To Concrete. Two civil structures that were

reinforced with composite materials were examined. The first was an 8-foot-tall column with a

rectangular cross-section. This column was reinforced with adhesively bonded fiberglass

shells to simulate a seismic retrofit. The second structure was a marine pier that was

upgraded in deck loading capacity by the application of 4 types of composite

reinforcement. TNDT was successful in detecting both simulated and actual disbonds in









several types of composite reinforcements. Information on the types of defects that

occur in these structures and their locations, has led to process improvements in the

application of adhesively bonded laminated composites to steel reinforced concrete

structures.

Jones and Hanna (1997) investigated Composite Wraps For Aging Infrastructure.

The paper evaluated the ability of externally bonded composite wraps to increase the load-

carrying capacity of concrete columns and beams. Failure loads were found to occur when

the composite wraps reached a critical strain level.

Kaempen (1996) analyzed Composite-Reinforced Concrete Building And Bridge

Structures. By eliminating the need to use steel as a concrete reinforcing material, and

using composite laminate structures, the composite-reinforced structure will not suffer

degradation caused by corrosion. The use of composite laminate structures enables an

economical method for reducing the inertial mass of concrete load bearing structures used

in multi-story buildings and highway overpasses, thereby making them safer when they

experience seismic shocks and displacements caused by bombs or earthquakes.

Crasto et al. (1998) studied the Rehabilitation Of Concrete Bridge Beams With

Externally-Bonded Composite Plates. The Air Force Materials Directorate worked

closely with Butler County in Ohio to demonstrate the feasibility of strengthening concrete

beams in bridge decks with externally-bonded composite plates. As a result of these

studies a graphite epoxy, AS4C/1919, was selected for the composite reinforcement, which

was bonded to the concrete with an epoxy adhesive under ambient conditions. Tests were

scaled up to 8.51-m concrete beams identical to those employed on a bridge. The bonding

materials and processes employed in the lab trials were adapted for larger-sized beams.









Control beams and beams with composite plates were tested in flexure to provide baseline

data for future comparisons, and the data compared with analytical predictions. An

optimized bonding process was then employed to bond composite plates onto beams in the

bridge. The integrity of the composite and adhesive bond was periodically monitored.

After a predetermined outdoor exposure in the actual service environment, the beams were

removed from the bridge and their residual flexural properties determined.



1.1.3 Discussion of ANN Applications and How they may be Applicable to this
Problem

Artificial Neural Networks (ANNs) have been successfully utilized to solve

problems in civil engineering and structures.

Cao et al. (1998) have studied the Application Of Artificial Neural Networks To

Load Identification. Their study describes the application of an artificial neural network to

identify the loads distributed across a cantilevered beam. The distributed loads are

approximated by a set of concentrated loads. The paper demonstrates that using an

artificial neural network to identify loads is feasible and a well-trained artificial neural

network reveals an extremely fast convergence and a high degree of accuracy in the

process of load identification for a cantilevered beam model.

Hegazy et al. (1998) analyzed the Neural Network Approach For Predicting The

Structural Behavior Of Concrete Slabs. Neural networks have been used as a means to

develop efficient predictive models of the structural behavior of concrete slabs. Four

neural networks have been developed to model the load deflection behavior of concrete

slabs, the final crack-pattern formation, and both the reinforcing-steel and concrete strain

distributions at failure. The four neural networks were trained and tested using the









experimental results of 38 full-scale slabs. The results of this study indicated the

applicability of neural networks in predicting deflection, stress and strain failures of

concrete slabs.

Highsmith, Alton, and Keshav (1997) have analyzed the Use Of Measured

Damage Parameters To Predict The Residual Strength Of Impacted Composites Through

A Neural Network Approach. The poor performance of composite materials under

transverse quasi-static and impact loading is of major concern in their application as

primary load carrying components in advanced structural applications. Their paper reports

the result of a modeling exercise that used neural networks as a tool to predict the loss in

residual strength resulting from localized damage in impacted laminates. Several measured

fabric fracture parameters, as well as matrix damage areas, obtained from damaged

laminates, were used as inputs. Neural networks were used to identify those damage

parameters that were essential for effective residual strength prediction. The predicted

strength values were found to be in very good agreement with those obtained from

experiments indicating the suitability of neural networks in this application.

Mukherjee et al. (1996) analyzed the Prediction Of Buckling Load Of Columns

Using Artificial Neural Networks. They developed a tool for the prediction of buckling

load of columns, which required minimal assumptions using neural computing techniques.

This concept can be extended to include a variety of column types in a single model for the

buckling load of columns. This concept can also be further extended to reliability analysis,

since the network can also predict the standard deviation in the column strength.

Peetathawatchai and Connor (1996) studied the Applicability Of Neural Network

Systems For Structural Damage Diagnosis. A general architecture of neural network









systems for structural damage diagnosis and a methodology for designing the components

of the architecture were developed and evaluated. Importance was placed on system design

issues like choice of variables, the methodology for choosing the excitation and the type of

vibrational signature for the monitored structure, the configuration of the neural networks,

and their training algorithm. These design issues were first examined in detail for the case

of single-point damage of a multispan beam, and the evaluation was then extended to the

case of multi-point damage.

Alexander et al. (1996) investigated the Application Of Artificial Neural

Networks To Concrete Pavement Joint Evaluation. Using a falling weight deflectometer,

pertinent inputs required to evaluate deflection and stress load transfer efficiencies of

concrete pavement joints were experimentally determined. A database was generated

using numerical integration of Westergaard-type integrals and was used to train a

backpropagation neural network algorithm for joint evaluation. The resulting computer

program is simple, efficient and precise, and can be used on site for immediate results. Its

predictions were verified by comparisons with closed-form and finite-element solutions

pertaining to data collected at three major civilian airports. It was demonstrated that

significant savings could be achieved through the reduction of the dimensionality of the

problem, which could be reinvested in broadening the range of applicability of the neural

network.

Mikami et al. (1998) studied a Neural Network System For Reasoning Residual

Axial Forces Of High-Strength Bolts In Steel Bridges. High-strength bolts of steel bridges

gradually loosen in service, and have to be periodically inspected by experts with

hammers. Mitsui Engineering & Shipbuilding Company Ltd. developed an automatic









looseness detector of high-strength bolts. A system to reason the residual axial forces of

high-strength bolts of steel bridges was built based on a neural network with the faculty of

pattern recognition, and its reasoning accuracy was verified.



1.2 Aim, Research Objectives and Methodology

The aim of this research was to develop a neural network model of the performance

of externally reinforced beams that was sufficiently accurate to be of use to practicing

engineers, and overcome the speed problem of existing methods of analysis, thus

facilitating the search for an optimal design solution.



Research Objectives:

1. To select the most accurate neural network architecture for the problem.

2. To select the training procedure to be adopted for the network based on effectiveness
of results generated.

3. To determine the independent variables to be used within the model which are
significant to developing an accurate model.

4. To compare relative performance of ANNs and FEM in terms of accuracy and speed,
to determine which is most appropriate for the problem at hand.


Methodology:

1. Collate data from tested beams to provide the training and testing patterns for the ANN
models.

2. Train and test various neural paradigms and determine how far to train them.

3. Evaluate and compare the accuracy of predictions and speed of solution of different
paradigms, by finding the error in predictions and correlation coefficients (a sketch of
these measures follows this list).

4. Qualitatively evaluate and compare the best network with FEM in terms of accuracy,
by observing the error in predicted values from the actual ones in either case.










5. Evaluate the processing time required for ANNs and FEM by observing the time
required to emerge with an accurate solution in each case, by comparing with
published results from FEM analysis.
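The error and correlation measures referred to in item 3 can be illustrated as follows. This is a minimal sketch, assuming the actual and ANN-predicted deflections are available as numeric sequences; the exact error definition reported by NeuroShell2 may differ, and the function name is illustrative.

```python
import numpy as np

def prediction_metrics(actual, predicted):
    """Return the mean absolute error and the correlation coefficient (r)
    between actual and ANN-predicted deflections."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    mae = float(np.mean(np.abs(predicted - actual)))      # average error in deflection units
    r = float(np.corrcoef(actual, predicted)[0, 1])       # linear correlation coefficient
    return mae, r
```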














CHAPTER 2
AN OVERVIEW ON THE USE OF NEURAL NETWORKS IN CIVIL ENGINEERING



2.1 Application Areas

The analysis of the performance of carbon fiber reinforced plastics applied to

concrete beams is a typical civil engineering problem that involves prediction of a

system's behavior based on a small set of laboratory observations. This requires the

development of a mathematical model that will predict the performance of the system by

scientifically extrapolating the results of the laboratory tests onto an undefined system.

Finite Element Methods and Neural Networks are two such mathematical tools that can be

applied for the solution of this problem. The inherent ability of Neural Networks to detect

patterns in a set of data, and to apply those patterns to a set of unknown data, makes them

suitable for this analysis.

Artificial Neural Networks (ANNs) are applicable in Civil Engineering because

they can address the problem areas described below (Flood and Kartam, 1994):

Using ANNs, incomplete and uncertain data about a system can be mapped into a

description of the state of the system. Civil engineering requires the interpretation of

incomplete, unorganized data to recognize and formulate problems. An example is the

problem of detecting damage in a building with thousands of structural members, by

collecting data at various locations on the structure. This is an inverse problem that

involves determination of a state from the observed behavior of a system.









Engineers have to analyze potential problem solutions and recognize solutions that

deliver desired system behavior and eliminate those that will not. The potentially feasible

solution space must then be refined. ANNs can help in the mapping from a space of desired

system behaviors to a space of system form attributes that deliver that behavior.

Engineers must be able to predict complex behavior of systems from known system

configuration and environmental loads to which the system is subjected. This is a problem

of determining a mapping from cause to effect, which can be achieved by utilizing an ANN.

Another aspect of engineering analysis is the prediction of unmeasured attributes

based on measured attributes, for example predicting how a model will behave beyond the

bounds of its observed behavior. Evaluation of potential solutions to problems and

selection of a solution from available alternatives, by estimating values of evaluation

criteria (i.e. effects) using a set of known form and behavior attributes (i.e. causes) is an

area where ANNs can be effectively used.

Engineering involves planning, scheduling and allocating resources for the

activities required to construct a selected alternative. Mapping a selected alternative onto

the activities required to build it, is an inverse mapping from the completed product (i.e.

effect) to the activities that cause it to exist (i.e. causes). Once the needed activities are

identified, the types and amounts of resources required for each activity must be

determined, which involves two inverse mappings.

Monitoring the state of resource usage and predicting the resources required to

complete a project must be done continuously. The prediction of final costs is a direct

mapping from the existing state of the project and the resource usage to the total cost of the

project.









The control and operation of dynamic systems as solutions to formulated problems,

for example the control of a HVAC system within a large multi-function building, requires

the determination of a control strategy. In determining the control strategy, the engineer

must determine the mapping between the space of system states and the space of applied

control forces. This is an inverse mapping problem. All of these problems can be

classified into two categories: mapping from cause to effect for estimation and prediction,

and inverse mapping from effects to possible causes. The nature of a neural network is to

map from a set of input patterns to a set of output patterns.

ANNs are currently being applied in civil engineering in classification /

interpretation tasks (i.e. inverse mapping from observations to known classes), diagnosis

(i.e. inverse mapping from observed effect to cause), modeling (mapping from cause to

effect), and control (inverse mapping from observed state to control forces).

One of the most straightforward and common methods of applying neural networks

involves direct mapping from a vector of inputs to a vector of outputs. Examples of the use

of ANNs as direct-mapping devices in civil engineering are:

* A system for selecting vertical formwork (Kamarthi et al., 1992). In this case, the
network maps a set of inputs representing the situation in which a formwork system is
implemented, onto a set of outputs representing recommendation levels for different
formwork systems.

* Seismic hazard prediction (Wong et al., 1992). A network was trained to map a vector
of inputs describing an earthquake, local geology and location data onto an output
providing a forecast of its intensity at the location.

* Predicting tower-guy pre-tension (Issa et al., 1992). A vector describing key aspects of
a guyed tower, is mapped onto an output that estimates the optimal pre-tension for the
cables.

ANNs are applied to solve inverse mapping problems, such as:









* Damage detection in a structural frame (Wu et al., 1992). A modeled structural frame
was artificially damaged and subjected to base excitation from several recorded
earthquakes. The results from these analyses were then used as training cases to train a
network to take displacement observations and predict the location and severity of
individual member damage.

* Detecting damage in structural systems (Szewczyk and Hajela, 1994). The ANN they
developed is quickly able to acquire and compute the mapping from patterns of
displacement observations to the location and severity of individual member
damage.


The use of a modular approach in neural network development includes:

* Simulation of construction activity (Flood, 1990). A network was used to simulate a
construction process. A number of modules representing sub-processes can be used to
model a wide range of different processes by linking modules as required. Users can
extend the construct by developing new modules suited to their particular needs.
* Estimating truck attributes from the strain response of the structure over which they are
traveling (Gagarine et al., 1992). The network receives as input a vector of values
representing the strain measured at a fixed point on a bridge girder during the passage
of a truck. Each element in this vector represents strain at a different point in time. The
network outputs a prediction of the velocity of the truck, the spacings of axles and the
load on each axle. The network comprises two modules: the first identifies the
basic class of the truck and the second estimates the axle loads, axle spacings or
velocity of the truck. Modularization enabled the system to reach an
acceptable level of accuracy and speed.

* Estimating earthmoving equipment production (Karshenas and Feng, 1992). The
network comprises a number of modules arranged in parallel, each dedicated to the
prediction of speed of earthmoving equipment under certain environmental factors. The
modular approach facilitates the inclusion and removal of new and obsolete equipment
considered by the network.


Another area where ANNs are being applied in engineering is for creating models

for making predictions and estimations. Some examples are:

* Estimating the strain behavior of a geomaterial in response to changes in its stress state.

* Prediction of flow of a river (Karunanithi et al., 1994). The ANN is trained to take a
period of historical river flow data and to predict the flow immediately beyond that
period.









* Use of an ANN as an estimator (Chao and Skibniewski, 1994). The use of an ANN to
estimate the productivity of various construction activities is described. Data needed
for training the network was obtained from observations of bench-scale operations and
construction simulations.


Many problems in Civil engineering require the output of a series of results over

time. An example is the simulation of construction processes where a prediction is

required of the likely behavior and performance of a system at successive points in time.

The simulation system makes predictions about the next state of a system based on its

current state. Using two or more previous states as inputs to the network can make an

accurate prediction of the following step.

Applications of this approach include:

* Simulating dynamic loading on structures caused by environmental factors, such as
winds.

* Extrapolating the time series of ground accelerations during a seismic event.

* Modeling the thermodynamic behavior of a building or its components.

* Predicting the dynamic response of a structure to sudden loading

* Projecting over a period of time the flow response of a drainage system to a storm.

Optimization problems are characterized by the need to select a viable solution

from a large number of alternatives, which is optimal according to some measure of

performance. Typically, finding an overall optimum solution is very difficult. Neural

networks can often overcome this limitation, providing quick solutions that are near

optimal. Examples of such problems are:

* Determining the optimal numbers and spacings for access shafts to a tunnel so that
construction time is minimized.

* Arranging the cutting of material (such as rebars) so that wastage is minimized.









* Scheduling activities so that the demand for equipment and labor resources is as
constant as possible.



2.2 Rationale for Use in Analysis of External Reinforcement Performance

Neural Networks can be effectively used in engineering problems involving the

prediction of unmeasured attributes based on measured attributes, such as in the prediction

of how a model will behave beyond the bounds of its observed behavior.

In the analysis of external reinforcement on RC beams, the deflection of the beam

often needs to be determined from other known criteria in order to assess the stability of the

beam. The measured attributes in this case are the load on the beam and the configuration of

external reinforcement (placed on the tension face, on both the tension and shear faces, or

absent), while the deflection needs to be predicted when the measured bounds are exceeded.

Neural networks achieve the ability to predict by 'learning' the data patterns in the

training set. Depending on the paradigm or architecture selected for training, neural

networks use either hidden neurons with weights corresponding to the learning rate, or

smoothing factors for genetic adaptive training, to home in on a solution. The effectiveness

of training is verified by evaluation of the learnt pattern on a test set. Training continues

until the network reaches the criteria set by the user for termination of training. Termination

of training may be set after a pre-determined number of epochs (passes through the training

patterns) or after a specified number of events is reached after minimum error is attained.

The accuracy of prediction is however limited to the nature of the data the network is

trained upon. The determination of when to stop training is also an important criterion, since

excessive training could cause the network to memorize the training set and not generalize

well on new data.














CHAPTER 3
EXPERIMENTAL SETUP OF TESTED BEAMS



A concerted research effort was carried out between 1994 and 1999 by the Wright

Laboratory Airbase Survivability Section (WL/FIVCS) to explore the strengthening effect

of CFRP laminates applied to reinforced concrete beams, both in the tensile and shear

areas of the beam (Ross et al., 1994). The data from the WL/FIVCS project was used for

a feasibility study using neural networks to predict the deflection of externally reinforced

concrete beams.

Figure 1 shows the cross-section of the beams used in the WL/FIVCS study while

Figure 2 shows the beam length and positioning of the loads. In all beam samples, the

cross-section was kept to 200 mm by 200 mm square, and the diameter of the compression

steel was kept constant at 9.5 mm. However, the diameter of the tensile steel bars varied

from 9.5 mm to 22 mm between sample beams. Three-ply CFRP laminates oriented at

0°/90°/0° were bonded to either the tension face of the beam only or both the tension face

and the sides (shear faces) of the beam. A total of 10 beams were tested, with the various

configurations shown in Table 1.














Figure 1: X-Section of Sample Beams. [Cross-section drawing showing 2-#3 (9.5 mm) compression rebars, #3 (9.5 mm) stirrups, two tensile rebars of varying diameter, CFRP on the shear faces, and CFRP on the tension face.]


Figure 2: Beam Loading Configuration. [Elevation drawing showing two point loads of P/2 applied at the third points of the 2745 mm span (915 mm spacing); 150 mm and 200 mm section dimensions are also annotated.]










Table 1: X-Section Areas of Tensile Steel and External Reinforcement Configurations

Beam Test   X-Section Area of Tensile Steel (2 bars)   External Reinforcement (None / Tensile Face / Shear Face)
 1          568 mm2 (19 mm bar dia)                    Y
 2          142 mm2 (9.5 mm bar dia)                   Y
 3          568 mm2 (19 mm bar dia)                    Y
 4          142 mm2 (9.5 mm bar dia)                   Y
 5          568 mm2 (19 mm bar dia)                    Y    Y
 6          142 mm2 (9.5 mm bar dia)                   Y    Y
 7          774 mm2 (22 mm bar dia)                    Y
 8          774 mm2 (22 mm bar dia)                    Y
 9          258 mm2 (13 mm bar dia)                    Y
10          258 mm2 (13 mm bar dia)                    Y


Loads were applied to each beam, starting at 0 kN and gradually increasing until

the beam failed. The loads were noted at approximately every 5 mm increase in deflection.

A total of 254 observations resulted from these experiments.

The data generated from loading the beams can be found in Appendix B. The

254 observations made in the beam deflection experiments were divided into two sets: one

for training the neural network; and the second for validating its performance after training.

For each beam, 10% of the observations were selected at random for the validation test,








providing a training set of 229 beam deflection observations, and a test set of 25 beam

deflection observations. A conventional feedforward backpropagation neural network

(Neuroshell2 by Ward Systems Group, 1995) was used to train and test the data.
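The 229/25 split described above can be reproduced in outline as follows. This is a sketch of the general procedure, not of NeuroShell2's internal mechanism; it assumes the observations are grouped per beam as (load, steel area, CFRP configuration, deflection) tuples, and all names are illustrative.

```python
import random

def split_observations(beams, test_fraction=0.10, seed=0):
    """Split beam deflection observations into training and test sets.

    `beams` maps a beam id to a list of (load, steel_area, cfrp_config,
    deflection) tuples.  Roughly 10% of each beam's observations are
    withheld at random, mirroring the per-beam selection in the text.
    """
    rng = random.Random(seed)
    train, test = [], []
    for observations in beams.values():
        observations = list(observations)
        rng.shuffle(observations)
        n_test = max(1, round(test_fraction * len(observations)))
        test.extend(observations[:n_test])
        train.extend(observations[n_test:])
    return train, test
```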














CHAPTER 4
DEVELOPMENT OF THE ANN MODEL

4.1 Establishing Training and Testing Patterns

This chapter presents the setup of neural paradigms used in analysis of the data, and

training performance of the different models.

Neural networks mimic the human brain's ability to classify patterns or make

predictions or decisions based on past experience. The Neuroshell software (Ward

Systems) is able to 'learn' patterns from training data and make its own classifications,

predictions, or decisions when presented with new data. However, the results obtained

from Neuroshell are not always absolutely accurate, especially if the patterns it is fed are

in some way conflicting or incomplete.

A total of 10 beams were tested, with configurations as shown in Table 1. The 254

observations collected across these beams were divided into two sets: the larger set was used

to train the network, while 10% of the observations were selected at random for testing the

validity of the network.

For this problem, using the Neuroshell2 software, the neural network paradigms

adopted were the Backpropagation method (3 layer simple, 3 layer advanced, 4 layer

advanced), 3-, 4- and 5-layer Backpropagation networks with Jump Connections, a GRNN

network, and Ward nets (networks involving multiple hidden slabs with different

activation functions, developed by Ward Systems).
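Of these paradigms, the GRNN has the simplest mathematical form: its output is a distance-weighted average of the training outputs, controlled by a smoothing factor. The sketch below illustrates that general form, not NeuroShell2's genetic-adaptive implementation, and all names are illustrative.

```python
import numpy as np

def grnn_predict(x, train_inputs, train_outputs, sigma=0.5):
    """General Regression Neural Network prediction for one input vector.

    The prediction is a weighted average of the training outputs, where each
    weight is a Gaussian kernel of the distance between `x` and a training
    pattern; `sigma` plays the role of the smoothing factor tuned in training.
    """
    d2 = np.sum((train_inputs - x) ** 2, axis=1)      # squared distances to each pattern
    w = np.exp(-d2 / (2.0 * sigma ** 2))              # kernel weights
    return float(np.dot(w, train_outputs) / np.sum(w))
```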









The inputs used for the networks were the load on the beam, the area of tensile steel,

and the CFRP configuration (external reinforcement represented by 0 for no external

reinforcement, 1 for external reinforcement on the tension face only, and 2 for external

reinforcement on both the tension face and shear faces). The output was the predicted

deflection of the beam. It is assumed that the load, area of steel and type of external

reinforcement influence the deflection of the beam. The network treats each input as a

continuous variable that represents the strength of the corresponding input neuron.
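A single pattern therefore consists of three continuous inputs and one target value. The sketch below shows one way such a pattern might be assembled; the constant and function names are hypothetical and the example values are illustrative, not measured data.

```python
import numpy as np

# Encoding of the external reinforcement configuration used as the third input:
# 0 = no CFRP, 1 = CFRP on the tension face only, 2 = CFRP on tension and shear faces.
CFRP_NONE, CFRP_TENSION, CFRP_TENSION_AND_SHEAR = 0, 1, 2

def make_pattern(load_kn, steel_area_mm2, cfrp_config, deflection_mm):
    """Return (input vector, target) for one beam deflection observation."""
    x = np.array([load_kn, steel_area_mm2, cfrp_config], dtype=float)
    y = float(deflection_mm)
    return x, y

# Example call with illustrative (not measured) values.
x, y = make_pattern(40.0, 258.0, CFRP_TENSION, 12.0)
```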

The objective of training the network was to enable it to generalize future data and

produce the most accurate answers. A feature of Neuroshell that was utilized in analysis is

"Net-Pci f'L I". Net-Perfect creates an entirely separate set of data patterns from the entire

set of patterns, called a test set. The test set is used to evaluate how well the network is

predicting or classifying. The test set used was 10% of the size of the training set. Data

patterns for the test set were either selected at random by the computer, or were selected in

a rotational manner from the entire set of data. Net-Perfect functions by minimizing the

mean of the error factors on the test set. The Net-Perfect interval was normally set to 200,

and it was set to 225 when Turboprop was being used. Another feature of Neuroshell that

was utilized is "Turboprop". Turboprop operates faster than other Backpropagation

methods, and is not sensitive to learning rate and momentum. Training proceeds through an

entire epoch before weights on neurons are updated. All the weight changes are added and

are updated at the end of the epoch. The Net-Perfect interval must therefore be set to

the number of training patterns.

Training was stopped when either the average error value fell below a pre-defined

threshold or the number of training epochs (1 epoch is one run through the entire set of









patterns) exceeded a specified number, or the number of events since the last minimum

error exceeded a specified number. This depended on the neural paradigm adopted. For the

Backpropagation and Ward networks, training was stopped when the number of events

since minimum error for the test set reached 40,000 events. For the GRNN network, an

initial smoothing factor was used and termination of training was set to no improvement

exceeding 1% in 20 generations of genetic breeding.
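The stopping rule used for the backpropagation and Ward networks (halt once 40,000 events have passed without a new minimum test-set error) is a form of early stopping. A generic sketch of that logic is given below; `train_one_event` and `test_error` are hypothetical callables standing in for whatever the training environment provides.

```python
def train_with_early_stopping(train_one_event, test_error, patience=40_000):
    """Keep training until `patience` events pass without a new minimum
    average error on the withheld test set."""
    best_error = float("inf")
    events_since_best = 0
    while events_since_best < patience:
        train_one_event()                 # present one pattern / apply one training event
        err = test_error()                # current average error on the test set
        if err < best_error:
            best_error = err
            events_since_best = 0         # new minimum: reset the counter
        else:
            events_since_best += 1
    return best_error
```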

Definitions of some terminology used in the description of the neural paradigms are

given below:

Learning Rate: Each time a pattern is presented to the network, the weights leading to an

output node are modified slightly during learning in the direction required to produce a

smaller error the next time the same pattern is presented. The amount of weight

modification is the learning rate times the error. The larger the learning rate, the larger the

weight changes, and the faster the learning will proceed.

Momentum: Large learning rates often lead to oscillation of weight changes and learning

never completes, or the model converges to a solution that is not optimum. One way to

allow faster learning without oscillation is to make the weight change a function of the

previous weight change to provide a smoothing effect. The momentum factor determines the

proportion of the last weight change that is added into the new weight change.

Neuron: A neuron is a basic building block of simulated neural networks which processes

a number of input values to produce an output value. Usually the neuron sums the input

values and then applies a non-linear function to the sum to arrive at the output value.

Pattern: A pattern is a single record (or row) of variables that influence a network's

predictions or classifications.










Weights: As neurons pass values from one layer of the network to the next layer, the values

are modified by a weight value in the link that represents connection strengths between the

neurons.
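Taken together, the learning-rate and momentum definitions correspond to the familiar delta-rule update with a momentum term. The sketch below states that rule generically; the exact scaling used inside NeuroShell2 may differ, and the default values shown are those listed for the simple backpropagation network in Figure 3.

```python
def update_weight(w, prev_change, error_term, learning_rate=0.6, momentum=0.9):
    """One weight update: the new change is the learning rate times the error
    term (taken in the direction that reduces the error) plus the momentum
    times the previous weight change."""
    change = learning_rate * error_term + momentum * prev_change
    return w + change, change
```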



Training Paradigms

1) 3-Layer Backpropagation Network











Input Slab:  3 neurons;  scale function Linear(0,1);  learning rate 0.6;  momentum 0.9;  initial weight 0.3
Hidden Slab: 17 neurons; scale function Logistic;     learning rate 0.6;  momentum 0.9;  initial weight 0.3
Output Slab: 1 neuron;   scale function Logistic;     learning rate 0.6;  momentum 0.9;  initial weight 0.3

Figure 3: 3-Layer Simple Backpropagation Network.
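A forward pass through the topology of Figure 3 (three inputs scaled to the (0, 1) range, seventeen logistic hidden neurons, one logistic output) can be sketched as follows. The weights here are random placeholders drawn in the ±0.3 initial-weight range, not trained values.

```python
import numpy as np

def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W1 = rng.uniform(-0.3, 0.3, size=(17, 3))   # input slab -> hidden slab weights
b1 = np.zeros(17)
W2 = rng.uniform(-0.3, 0.3, size=(1, 17))   # hidden slab -> output slab weights
b2 = np.zeros(1)

def forward(x_scaled):
    """x_scaled: 3-element input vector already scaled to the (0, 1) range."""
    hidden = logistic(W1 @ x_scaled + b1)
    return logistic(W2 @ hidden + b2)        # predicted (scaled) deflection
```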




Backpropagation networks are known for their ability to generalize well on a wide

variety of problems. They are used for a vast majority of working neural network

applications. Fig.3 shows the standard type of backpropagation network in which every

layer is connected or linked to the immediately previous layer. The number of input and

output neurons was set equal to the number of inputs and outputs respectively. The number

of hidden neurons was calculated as

# of hidden neurons = ½ (inputs + outputs) + square root of the number of patterns in the training file.

To calculate the number of neurons, the following formula was found to be better: # of









hidden neurons = 2 × square root of (number of inputs, or defining characteristics, + the

number of outputs, or classifying characteristics), rounded down to the nearest integer.

The complexity was set to very simple, Net-Perfect interval to 200, pattern selection as

random, and save training on best test set.
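
For illustration, the two neuron-count formulas above can be evaluated directly; assuming 3 inputs, 1 output and a 225-pattern training set (the one-epoch Net-Perfect interval quoted for the later networks, an assumption about the exact pattern count), the default formula gives the 17 hidden neurons used here:

    import math

    def default_hidden_neurons(n_inputs, n_outputs, n_patterns):
        # 1/2 (inputs + outputs) + square root of the number of training patterns
        return int(0.5 * (n_inputs + n_outputs) + math.sqrt(n_patterns))

    def alternative_hidden_neurons(n_inputs, n_outputs):
        # twice the square root of (inputs + outputs), rounded down
        return math.floor(2 * math.sqrt(n_inputs + n_outputs))

    print(default_hidden_neurons(3, 1, 225))   # 17
    print(alternative_hidden_neurons(3, 1))    # 4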



2) 3-Layer Backpropagation Network Using Turboprop

Figure 4 shows a backpropagation network in which every layer is connected or

linked to the immediately previous layer. "Turboprop" is a training method that operates

much faster in the batch mode than other Neuroshell2 backpropagation methods, and has the

additional advantage that it is not sensitive to learning rate and momentum. Training

proceeds through an entire epoch (a complete set of training patterns) before the weights

are updated by adding all the weight changes.
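
A simplified sketch of this epoch-wise (batch) update is given below; the fixed gradient step is only a placeholder, since Turboprop's actual step-size rule is internal to NeuroShell2 and, as noted, does not depend on a learning rate or momentum:

    import numpy as np

    def train_one_epoch_batch(weights, patterns, grad_fn, step=0.1):
        """Accumulate the weight changes for every training pattern and apply
        them once at the end of the epoch (illustrative batch update only)."""
        total_delta = np.zeros_like(weights)
        for inputs, target in patterns:
            total_delta += -step * grad_fn(weights, inputs, target)
        return weights + total_delta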









Input Slab Hidden Slab Output Slab

Neurons: 3 Neurons: 17 Neurons: 1
Scale Function: Scale Function: Scale Function:
Linear(-1,1) Logistic Logistic
Learning rate 0.1 Learning rate 0.1 Learning rate 0.1
Momentum 0.1 Momentum 0.1 Momentum 0.1
Initial Weight 0.3 Initial Weight 0.3 Initial Weight 0.3

Figure 4: 3-Layer Backpropagation Network using Turboprop.










The number of input and output neurons was again set equal to the number of inputs

and outputs respectively. The number of hidden neurons was calculated as # of hidden

neurons= (Inputs + Outputs) + sq. root of no. of patterns in training file.

Pattern selection was rotational, save training was set to best test set, Net-perfect interval

to 200, and stop training was set to events since minimum average error > 40000. Missing

values were considered error conditions. Training was done with training and test set in

memory. Learning rate, momentum and initial weights were selected by the network,

which is the function of Turboprop.




3) 4-Layer Backpropagation using Turboprop


Input Slab: Neurons: 3; Scale Function: Linear(-1,1); Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Hidden Slab 1: Neurons: 8; Scale Function: Logistic; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Hidden Slab 2: Neurons: 8; Scale Function: Logistic; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Output Slab: Neurons: 1; Scale Function: Logistic; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3

Figure 5: 4-Layer Backpropagation using Turboprop




Figure 5 shows a backpropagation network using Turboprop, in which every layer

is connected or linked to the immediately previous layer. Use of two hidden layers










facilitates further training on the data. Pattern selection was rotational, save training was

set to best test set, Net-perfect interval to 200, and stop training was set to events since

minimum average error > 40000. Missing values were considered error conditions.

Training was done with training and test set in memory. Learning rate, momentum and

initial weights were selected by the network, which is the function of Turboprop.



4) 3-Layer Backpropagation, with Jump Connections.


Input Slab Hidden Slab Output Slab

Neurons: 3 Neurons: 17 Neurons: 1
Scale Function: Scale Function: Scale Function:
Linear(-1,1) Logistic Logistic
Learning rate 0.1 Learning rate 0.1 Learning rate 0.1
Momentum 0.1 Momentum 0.1 Momentum 0.1
Initial Weight 0.3 Initial Weight 0.3 Initial Weight 0.3

Figure 6: 3-Layer Backpropagation with Jump Connections




"Jump C,,niie' tiiIi" imply that every layer is connected or linked to every

previous layer, in the backpropagation model. Pattern selection was set to rotational, save










training was set to best test set, Net-perfect interval to 225 (equal to one epoch), and stop

training was set to events since minimum average error > 40000. Missing values were

considered error conditions. Training was done with training and test set in memory.

Learning rate, momentum and initial weights were selected by the network, which is the

function of Turboprop.




5) 4-Layer Backpropagation with Jump Connections


Input Slab: Neurons: 3; Scale Function: Linear(-1,1); Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Hidden Slab 1: Neurons: 8; Scale Function: Logistic; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Hidden Slab 2: Neurons: 8; Scale Function: Logistic; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Output Slab: Neurons: 1; Scale Function: Logistic; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3

Figure 7: 4-Layer Backpropagation with Jump Connections


A 4-layer Backpropagation Network shown in Fig. 7 uses 2 hidden layers with the

same activation function. This and the subsequent 5-layer paradigm were used to test

whether they provided improvements on the 3-Layer backpropagation model. Pattern









selection was set to rotational, save training was set to best test set, Net-perfect interval to

225 (equal to one epoch), and stop training was set to events since minimum average error

> 40000. Missing values were considered error conditions. Training was done with

training and test set in memory. Learning rate, momentum and initial weights were selected

by the network, which is the function of Turboprop.



6) 5-Layer Backpropagation, with Jump Connections


Input Slab: Neurons: 3; Scale Function: Linear(-1,1); Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Hidden Slab 1: Neurons: 6; Scale Function: Logistic; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Hidden Slab 2: Neurons: 6; Scale Function: Logistic; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Hidden Slab 3: Neurons: 6; Scale Function: Logistic; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Output Slab: Neurons: 1; Scale Function: Logistic; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3

Figure 8: 5-Layer Backpropagation with Jump Connections


As shown in Figure 8, the use of 3 hidden slabs, with every layer connected to every

previous layer, enables the network to run through a greater number of training sequences,

which may result in a better-trained network.












Pattern selection was set to rotational, save training was set to best test set, Net-perfect

interval to 225 (equal to one epoch), and stop training was set to epochs > 1000 for the

training set. Missing values were considered error conditions. Training was done with

training and test set in memory. Learning rate, momentum and initial weights were selected

by the network, which is the function of Turboprop.



7) GRNN Network

GRNN networks train quickly on sparse data sets. GRNN is a 3-layer network,

where there must be one hidden neuron for each training pattern.


Input Slab: Neurons: 3; Scale Function: Linear(0,1)
Hidden Slab: Neurons: 255
Output Slab: Neurons: 1
Smoothing factor = 3.329E-02
Figure 9: GRNN network

The number of hidden neurons is the number of patterns in the training set for a

GRNN network. (Fig.9) There are no training parameters such as learning rate and

momentum as there are in Backpropagation networks, but there is a smoothing factor that is

used when the network is applied to new data. The smoothing factor determines how

tightly the network matches its predictions to the data in the training patterns. While using

Neuroshell2, the smoothing factor is automatically computed by Net-Perfect. The number









of neurons in the input layer corresponds to the number of inputs, and the number of

neurons in the output layer is the number of outputs. GRNN networks work by comparing

patterns based upon the 'distance' between them. The distance metric selected for this

problem was the Vanilla (Euclidean) and a Genetic Adaptive method was adopted to

facilitate selection of appropriate smoothing factors. Missing values were considered to be

error conditions. Genetic breeding was applied with a pool size of 20, with automatic

termination for 20 generations with no improvement of 1%.
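
The prediction step of a textbook GRNN can be sketched as a kernel-weighted average of the stored training outputs, with one "hidden neuron" per training pattern; the default smoothing value below is the factor reported for this study, and the code is an illustration rather than NeuroShell2's implementation:

    import numpy as np

    def grnn_predict(x, train_X, train_y, smoothing=0.03329):
        """Weight each training pattern by a Gaussian kernel of its Euclidean
        distance from x, then return the weighted average of the outputs."""
        d2 = np.sum((np.asarray(train_X, float) - np.asarray(x, float)) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * smoothing ** 2))
        return float(np.dot(w, np.asarray(train_y, float)) / (np.sum(w) + 1e-12))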



8) Ward Networks (implementing multiple hidden slabs with different activation
functions).

"WardXinl ,ni A\" are special neural network paradigms developed by Ward

Systems Group Inc. This network uses a backpropagation method with 2 hidden slabs,

usually each with a different activation function. Pattern selection was set to rotational,

save training was set to best test set, Net-perfect interval to 225 (equal to one epoch), and

stop training was set to events since minimum average error > 40000.


























Input Slab: Neurons: 3; Scale Function: Linear(-1,1); Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Hidden Slab 1: Neurons: 8; Scale Function: Gaussian; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Hidden Slab 2: Neurons: 8; Scale Function: Gaussian; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Output Slab: Neurons: 1; Scale Function: Logistic; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3

Figure 10: Ward Networks with 2 Hidden Layers


Missing values were considered error conditions. Training was done with training

and test set in memory. By applying different activation functions to hidden slabs, different

features in a pattern processed through a network can be detected. Combining the different

feature sets in the output layer may lead to a better prediction, since different 'views' of the

data are obtained.
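
A simplified forward pass through such a network is sketched below, with two parallel hidden slabs using Gaussian and Gaussian-complement activations whose outputs are combined at a logistic output slab; the weight matrices are illustrative placeholders, not values from the trained networks:

    import numpy as np

    def gaussian(x):
        return np.exp(-x ** 2)

    def gaussian_complement(x):
        return 1.0 - np.exp(-x ** 2)

    def ward_forward(x, W_slab1, W_slab2, W_out):
        """Two parallel hidden slabs give two 'views' of the same input pattern;
        their outputs are concatenated and passed to a logistic output neuron."""
        h1 = gaussian(W_slab1 @ x)
        h2 = gaussian_complement(W_slab2 @ x)
        h = np.concatenate([h1, h2])
        return 1.0 / (1.0 + np.exp(-(W_out @ h)))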















9) Ward Networks with 3 hidden layers


Input Slab: Neurons: 3; Scale Function: Linear(-1,1); Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Hidden Slab 1: Neurons: 6; Scale Function: Gaussian; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Hidden Slab 2: Neurons: 6; Scale Function: Tanh; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Hidden Slab 3: Neurons: 6; Scale Function: Gaussian Comp; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Output Slab: Neurons: 1; Scale Function: Logistic; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3

Figure 11: Ward Networks with 3 Hidden Layers

This network uses a backpropagation method with 3 hidden slabs. Different

activation functions applied to hidden layer slabs detect different features in a pattern

processed through the network. Combining the different feature sets in the output layer may lead

to a better prediction. Pattern selection was set to rotational, save training was set to best

test set, Net-perfect interval to 225 (equal to one epoch), and stop training was set to events









since minimum average error > 40000. Missing values were considered error conditions.

Training was done with training and test set in memory.



10) Ward Networks with 2 hidden layers (with input connected to output layer)

This network uses a backpropagation method with 2 hidden slabs and a connection between

the input and output layers. Each hidden slab has a different activation function. Pattern

selection was set to rotational, save training was set to best test set, Net-perfect interval to

225 (equal to one epoch), and stop training was set to events since minimum average error

> 40000. Missing values were considered error conditions. Training was done with

training and test set in memory.


Input Slab: Neurons: 3; Scale Function: Linear(-1,1); Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Hidden Slab 1: Neurons: 8; Scale Function: Gaussian; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Hidden Slab 2: Neurons: 8; Scale Function: Gaussian Comp; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3
Output Slab: Neurons: 1; Scale Function: Logistic; Learning rate 0.1; Momentum 0.1; Initial Weight 0.3

Figure 12: Ward Network with 2 Hidden Layers (with Input-Output Layers Connected)











4.2 Training Results

The results of training were analyzed in terms of the 'Perfect model', which was

assumed to have a correlation coefficient of 1.0, Mean Absolute Error of 0, and training

time in the order of a few seconds.

Error was measured for each observation as the absolute difference between the

neural network predicted deflection and the actual beam deflection. The most accurate

network was found to be the 5-layer Backpropagation network with jump connections. The

mean absolute error in the network was found to be 0.909 mm with a correlation factor to

the perfect model of 0.973. Training time was approximately 20 seconds. Training was

stopped when epochs since minimum average error exceeded 1000 on the training set.

Using the GRNN network, with a smoothing factor of 0.03329, the mean absolute error was

obtained as 1.86 mm, with a correlation factor to the perfect model of 0.9326. Training

time was approximately 30 seconds, training being terminated when genetic breeding

failed to produce an improvement greater than 1% over 20 generations. A 3-layer simple

backpropagation network gave good results too, using a learning rate of 0.6 and a

momentum of 0.9. Learning rate, as defined earlier, is the factor by which weights leading

to an output node are modified, resulting in a smaller error the next time the same pattern is

presented. Momentum is a factor that determines the proportion of the last weight change

that is added to the new weight change, in order to allow for faster learning without

oscillation of weight changes (which would result in a non-optimal solution). Different

numbers of hidden neurons were applied and the network performance was analyzed. The

optimal network configuration was determined to be a single hidden layer comprising 17

hidden neurons. Training lasted for 22.5 hours before there was no further significant

improvement in network performance. At this stage, the mean absolute error in the









backpropagation network was found to be 1.68 mm. The correlation between the

backpropagation model and the perfect model was found to be 0.94. The 4-layer

backpropagation method with jump connections, resulted in a mean absolute error of

2.399mm, and a correlation factor to the perfect model of 0.9058. Training time for this

model was of the order of 30 seconds. Training was stopped when events since minimum

average error exceeded 40,000.

The correlation factors were validated from the set of patterns (the 10% randomly

selected test patterns) not used for training. The performance of each neural paradigm was

tested on the test set of data, to determine the correlation coefficients and mean absolute

errors generated on the test data set. A correlation coefficient very close to 1.0 would indicate

satisfactory performance, while a correlation coefficient close to 0 would indicate poor

performance. A low mean absolute error value would also represent good performance.

On the test set, the 5-layer Backpropagation network with jump connections had the best

performance, with a correlation coefficient of 0.9 and a mean absolute error of 2.2 mm. The

results are summarized in Table 2 below.

However, the Backpropagation networks with jump connections gave higher

maximum absolute errors. The worst performance was of the Ward networks

Backpropagation model with 2 hidden layers. It resulted in a mean absolute error of

6.687mm with a correlation factor to the perfect model of 0.5064 on the training set.

Actual training performance data for all the networks are included in Appendix 'A'. Test

set performance data are in Appendix C.









Table 2: Ranking Of Neural Paradigms In Order Of Performance.


Rank | Network type | Mean Abs Error (mm) | Training time | Correlation coefficient
1 | 5-Layer Backprop with Jump connections | 2.2 | 20 seconds | 0.9
2 | Simple 3-Layer backpropagation | 1.98 | 22.5 hours | 0.94
3 | GRNN network | 2.64 | 30 seconds | 0.95
4 | Ward Network: 3 hidden layers | 2.7 | 40 seconds | 0.92
5 | 4-Layer Backprop with Jump connections | 2.8 | 35 seconds | 0.91
6 | Ward Network: 2 hidden layers, input-output layers connected | 3.34 | 30 seconds | 0.91
7 | 3-Layer Backprop with Jump connections | 3.8 | 35 seconds | 0.9
8 | 3-Layer Backprop using Turboprop | 3.64 | 35 seconds | 0.87
9 | 4-Layer Backprop using Turboprop | 6.52 | 30 seconds | 0.79
10 | Ward Network: 2 hidden layers | 6 | 65 seconds | 0.718














CHAPTER 5
PERFORMANCE AND ANALYSIS OF THE ANN MODELS

5.1 Relative Performance of the ANNs



This chapter looks at the performance of ANN models relative to each other. This

is achieved by comparing performance parameters such as correlation coefficient among

the training sets and test sets.

The performance of the ANNs was measured in terms of R-squared, mean squared

error, mean absolute error, minimum absolute error, maximum absolute error, and the correlation

coefficient r. Definitions of these terms are given below:

R-squared: This is the square of the correlation coefficient. A perfect fit would

result in an R-squared value of 1, a very good fit near 1, and a very poor fit near 0.

Mean Squared Error: This is the mean over all patterns in the file of the square of

the actual value minus the predicted value, i.e., the mean of (actual - predicted)².

Mean Absolute Error: This is the mean over all patterns of the absolute value of

the actual minus the predicted, i.e., the mean of |actual - predicted|.

Min Absolute Error: This is the minimum of |actual - predicted| over all patterns.

Max Absolute Error: This is the maximum of |actual - predicted| over all patterns.

Correlation Coefficient: This is a statistical measure of the strength of the

relationship between the actual and predicted outputs. This can range from +1 to -1,

indicating a positive or negative linear relationship.
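
These measures can be computed directly from paired actual and predicted deflections; the helper below is an illustrative sketch, not the NeuroShell2 statistics module:

    import numpy as np

    def performance_measures(actual, predicted):
        actual = np.asarray(actual, dtype=float)
        predicted = np.asarray(predicted, dtype=float)
        err = actual - predicted
        r = np.corrcoef(actual, predicted)[0, 1]   # correlation coefficient
        return {
            "r_squared": r ** 2,
            "mean_squared_error": np.mean(err ** 2),
            "mean_absolute_error": np.mean(np.abs(err)),
            "min_absolute_error": np.min(np.abs(err)),
            "max_absolute_error": np.max(np.abs(err)),
            "correlation_coefficient": r,
        }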









Table 3 below summarizes the training performances of the neural paradigms used

for data analysis:



Table 3: Training Performance Of Neural Paradigms.

Network | R-squared | Mean Squared Error | Mean Abs Error | Min Abs Error | Max Abs Error | Correlation Coefficient
Simple 3-layer backpropagation | 0.9584 | 7.132 | 1.678 | 0.000 | 18.55 | 0.983
3-layer Backprop using Turboprop | 0.8088 | 32.8 | 3.52 | 0.00 | 23.72 | 0.899
4-layer Backprop using Turboprop | 0.577 | 72.437 | 6.555 | 0.046 | 29.98 | 0.783
3-layer Backprop with Jump Connections | 0.8511 | 25.53 | 3.583 | 0.000 | 20.74 | 0.925
4-layer Backprop with Jump Connections | 0.9058 | 16.154 | 2.399 | 0.000 | 23.01 | 0.952
5-layer Backprop with Jump Connections | 0.9730 | 4.629 | 0.909 | 0.000 | 28.020 | 0.987
GRNN network | 0.9326 | 11.568 | 1.861 | 0.000 | 15.228 | 0.967
Ward Network: 2 hidden layers | 0.5064 | 84.668 | 6.687 | 0.000 | 31.935 | 0.748
Ward Network: 3 hidden layers | 0.8686 | 22.537 | 2.97 | 0.00 | 21.943 | 0.933
Ward Network: 2 hidden layers, In-Out connected | 0.8547 | 24.93 | 3.59 | 0.000 | 22.614 | 0.925









Table 4: Test Set Performance Of Neural Paradigms


Network | R-squared | Mean Squared Error | Mean Abs Error | Min Abs Error | Max Abs Error | Correlation Coefficient
Simple 3-layer backpropagation | 0.8827 | 16.844 | 1.980 | 0.003 | 18.55 | 0.940
3-layer Backprop using Turboprop | 0.7576 | 34.814 | 3.636 | 0.00 | 22.512 | 0.874
4-layer Backprop using Turboprop | 0.5556 | 63.84 | 6.516 | 0.325 | 24.249 | 0.789
3-layer Backprop with Jump Connections | 0.8015 | 28.516 | 3.797 | 0.00 | 20.75 | 0.902
4-layer Backprop with Jump Connections | 0.8254 | 25.082 | 2.805 | 0.220 | 23.01 | 0.909
5-layer Backprop with Jump Connections | 0.7900 | 30.17 | 2.2 | 0.00 | 28.02 | 0.896
GRNN network | 0.8851 | 16.503 | 2.641 | 0.092 | 15.228 | 0.949
Ward Network: 2 hidden layers | 0.4985 | 72.034 | 6.008 | 0.00 | 26.26 | 0.718
Ward Network: 3 hidden layers | 0.8363 | 23.513 | 2.696 | 0.000 | 21.94 | 0.92
Ward Network: 2 hidden layers, input-output layers connected | 0.8183 | 26.097 | 3.340 | 0.000 | 20.855 | 0.905














Performance Graphs of the Neural Paradigms

The graphs below are scatter plots of predicted versus actual deflection. Points

lying along the 45° line on the plot indicate an accurate result. Only the test set data were

used in plotting the graphs. The accuracy of the network is demonstrated by its ability to

predict well on the test data. The 45° line is shown in white on the graphs. The mean

absolute error value is the average distance of the points from the 45°

line.
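
The plots that follow can be re-created from the test-set predictions with a short routine like the sketch below (illustrative matplotlib code, not the original figures):

    import numpy as np
    import matplotlib.pyplot as plt

    def plot_predicted_vs_actual(actual, predicted, title):
        """Scatter of predicted vs. actual deflection with a 45-degree reference
        line; points on the line are perfect predictions."""
        actual = np.asarray(actual, dtype=float)
        predicted = np.asarray(predicted, dtype=float)
        plt.scatter(actual, predicted, s=12, label="Predicted vs Actual deflection")
        upper = max(actual.max(), predicted.max())
        plt.plot([0, upper], [0, upper], color="gray", label="45-degree line")
        plt.xlabel("Actual deflection (mm)")
        plt.ylabel("Predicted deflection (mm)")
        plt.title(title)
        plt.legend()
        plt.show()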



Simple 3-layer Backpropagation network





Figure 13: Scatter Plot of 3-Layer Backpropagation Model

Figure 13 shows fairly good agreement between predicted and actual deflections

on the test set. The mean absolute error was 1.98 mm and the correlation coefficient was









0.940 for this model. The 3-Layer simple backpropagation model was a good predictor for

this problem, as is observed in the plot above.




3-layer Backpropagation using Turboprop





Figure 14: Scatter Plot of 3-Layer Backpropagation using Turboprop




Figure 14 shows a scattered plot of predicted versus actual deflections. The

mean absolute error was 3.64 mm and the correlation coefficient was 0.874. The 3-layer

backprop using Turboprop was not a very good predictor, as can be inferred from the

above plot.












4-layer Backpropagation using Turboprop





Figure 15: Scatter Plot of 4-Layer Backpropagation using Turboprop




Figure 15 shows a widely scattered plot of predicted versus actual

deflections, reflected in a mean absolute error of 6.52 mm and a correlation

coefficient of 0.789 for this model. The 4-layer backprop using Turboprop did not

perform well for this problem.










3-layer Backpropagation with Jump connections





Figure 16: Scatter Plot of 3-Layer Backpropagation with Jump Connections




Figure 16 shows the plot of predicted versus actual deflections grouped

about the 45° line. The mean absolute error was 3.8 mm and the correlation coefficient

was 0.9. This indicates moderate prediction accuracy.









4-layer Backpropagation with Jump Connections





Figure 17: Scatter Plot of 4-Layer Backpropagation with Jump Connections




Figure 17 shows better correlation between predicted and actual

deflections. The mean absolute error was 2.8 mm and the correlation coefficient was 0.91.

The graph demonstrates fairly good prediction accuracy.









5-layer Backpropagation with Jump Connections




Figure 18: Scatter Plot of 5-Layer Backpropagation with Jump Connections




Figure 18 shows the plot of predicted versus actual deflections closely

clustered about the 45° line. The mean absolute error was 2.2 mm and the correlation

coefficient was 0.9 for this paradigm. From the graph, it can be deduced that the 5-Layer

Backprop model with Jump connections was a good predictor.













GRNN network





Figure 19: Scatter Plot of GRNN Network




Figure 19 shows the plot of predicted versus actual deflection values closely

scattered about the 45° line. The mean absolute error was 2.6 mm and the correlation

coefficient was 0.95 for this model. This model provided moderately good accuracy of

predictions.













Ward Network Backpropagation with 2 hidden slabs





Figure 20: Scatter Plot of Ward Network with 2 Hidden Layers




Figure 20 shows very low correlation between predicted and actual

deflections. The mean absolute error was 6 mm and the correlation coefficient was 0.718

for the 4-layer Ward network. This model provided very low accuracy of predictions on

the test set data, as can be observed from the above graph.












Ward Network 3 hidden layer Backpropagation model





Figure 21: Scatter Plot of Ward Network with 3 Hidden Layers




Figure 21 shows the plot of predicted versus actual deflections grouped

fairly closely about the 45° line. The mean absolute error was 2.7 mm and the correlation

coefficient was 0.92 for the 5-layer Ward network. Accuracy of predictions was fairly

good for data in the lower range, but decreased as deflection values increased.












Ward Network 2 hidden layer Backpropagation with input-output slab connection





Figure 22: Scatter Plot of Ward Network- 2 Hidden Layers with Input-Output Connected




Figure 22 shows the plot of predicted versus actual deflections for test set

data. The points are scattered about the 45° line. The mean absolute error was 3.34 mm

and the correlation coefficient was 0.91 for the 4-layer Ward network (with input-output

slabs connected). From the above graph, it can be inferred that prediction accuracy with

this paradigm is moderate.

From the above analysis, it is determined that the best training neural network

paradigm for this problem is the 5-Layer Backpropagation model with jump connections. It

had an R-squared value of 0.9730 and a mean absolute error of 0.909mm on the pattern set

(including both training and test set data). Its time to converge to a solution was

approximately 20 seconds.









The 3-layer simple backpropagation network also yielded good results, with an R-

squared value of 0.9584 and a mean absolute error of 1.678mm on the pattern set.

However the time taken by this network to reach this level of accuracy was 22.5 hours,

which was considerably longer than the other paradigms.

The 4-layer backpropagation network with jump connections resulted in an R-

squared value of 0.9058 and a mean absolute value of 2.399mm on the pattern set. The time

taken to converge was approximately 30 seconds.

The GRNN method had an R-squared value of 0.9326 and a mean absolute error of

1.86mm on the pattern set, and converged to a solution within 1 minute.

The Ward network (designed by Ward Systems) with two hidden layers and a

connection from the input to the output layer also arrived at good results. It had an R-squared

value of 0.8547 and a mean absolute error of 3.59 mm on the pattern set. Time to complete

training was about 1 minute.

A comparison of the performances of the network paradigms based on type of

external reinforcement is presented below. For each network paradigm, the mean absolute

error was found from the test set data for external reinforcement configurations of 0 (no

external reinforcement), 1 (external reinforcement on tension face only), and 2 (external

reinforcement on both tension and shear faces).









Table 5. Performance Of Neural Paradigms Based On External Reinforcement
Configuration

Network | Mean Absolute Test Set Error (mm) for External Reinforcement = 0 | Mean Absolute Test Set Error (mm) for External Reinforcement = 1 | Mean Absolute Test Set Error (mm) for External Reinforcement = 2
3-Layer Backpropagation | 0.183 | 0.518 | 0.441
3-Layer Backpropagation with Jump Connections | 1.853 | 0.598 | 0.757
3-Layer Backpropagation using Turboprop | 0.557 | 0.506 | 1.355
4-Layer Backpropagation using Jump Connections | 1.616 | 2.689 | 0.676
4-Layer Backpropagation using Turboprop | 1.542 | 2.719 | 1.595
5-Layer Backpropagation using Jump Connections | 0.659 | 3.048 | 0.711
GRNN network | 0.615 | 0.388 | 0.522
Ward 2 hidden layer network | 1.566 | 1.151 | 0.789
Ward 2 hidden layer with Input-output connected | 1.28 | 1.258 | 1.28
Ward 3-hidden layer | 0.813 | 1.262 | 0.229
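
The per-configuration errors in Table 5 can be obtained by grouping the test-set absolute errors by the CFRP configuration code; the helper below is a hypothetical sketch (the array names are assumptions, not columns from the thesis data files):

    import numpy as np

    def mae_by_configuration(cfrp_codes, actual, predicted):
        """Mean absolute test-set error for each CFRP code
        (0 = none, 1 = tension face, 2 = tension and shear faces)."""
        cfrp_codes = np.asarray(cfrp_codes)
        abs_err = np.abs(np.asarray(actual, float) - np.asarray(predicted, float))
        return {code: float(abs_err[cfrp_codes == code].mean()) for code in (0, 1, 2)}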












5.2 Analysis of ANN results to compare with FEM

The advantage of the neural network approach compared to FEM in analysis of

external reinforcement on RC beams, is that it does not require a new model to be

developed for each new problem, and is capable of producing an answer to the problem

within seconds. The FEM method is computationally expensive, often requiring CPU times

of several hours for just two-dimensional analysis. It would be difficult to use in modeling

geometrically complicated forms. For analysis of this problem, a three-dimensional model

would be required to consider all the external reinforcement configurations on beams, and

three-dimensional analysis using FEM would increase processing time over a two-

dimensional analysis.

A finite element study of the hardening and rehabilitation of concrete structures

using carbon fiber reinforced plastics (CFRP) was carried out in 1994 (Ross et al., 1994).

The data used for that study were the same as those used in this thesis. The

ADINA finite element computer code was used to calculate the beam response of concrete

beams, both with and without CFRP. The FEM analysis required considerable man-hours

as well as CPU time. The inelastic-tensile response of the concrete cracking caused the

load step to be reduced almost down to 1.0 lb increments during the FEM calculation. This

led to increased CPU time on the higher loadings of the beam reinforced with No. 7 steel

bars. However, the FEM analysis was worthwhile in that it provided a verification of the

section analysis of beams and appeared to agree very well with the experimental data. The

lengthy CPU time and the inconvenience of building FEM models for this problem, reduce









the number of alternative external reinforcement configurations that can be evaluated in a

study.

In comparison, the ANN analysis on the same data yielded results with a

correlation to the perfect model in the range of 95% to 72%. Mean absolute errors on

deflection prediction were in the range of 1.9 mm to 6.5 mm. Average training time for the

network paradigms was in the range of 20 seconds to 45 seconds. In one instance (for the

3-Layer simple backpropagation model), training time was 22.5 hours before significant

improvement in reduction of error values was noticed.

Garcia studied the relative performance of clustering-based neural network

and statistical pattern recognition models for non-destructive damage detection (Garcia,

1997). The average localization error for the statistical pattern recognition model (0.25%

of the span) was less than the average localization error for the clustering-based neural

network method (0.75% of the span). The conclusion of the study was that the statistical

pattern recognition model was more efficient in locating damage in terms of the probability

of detection and localization accuracy. However, the author conceded that when

appropriately trained, the generalization capabilities of the clustering-based neural

network should perform as well as statistical pattern recognition techniques. Theoretically,

the neural network model should work equally well as the statistical model.

The search for an optimal solution to this problem is easily accomplished using

different neural paradigms and experimenting with their number of neurons, weights,

learning rates and momentum. The accuracy of a neural network analysis is reliable, and is

comparable to an FEM analysis of the problem. An ANN is computationally expensive to

train, in terms of the computing power required, but results can be obtained









in seconds. This is a significant advantage over FEM analysis, which would require re-

building a model for every new analysis that had to be performed and would require CPU

time of the order of several hours to arrive at a solution. The difficulty involved in

formulating an FEM model would also make it inaccessible to the lay person. In

comparison, implementing a Neural Network analysis requires only a working knowledge

of the software that would be used to analyze the model.

Some advantages of using ANNs are summarized below:

1. They use weighted connections and massively parallel processing with fault tolerance, so
that they can automatically learn from experience (internal representation) (Kireetoh,
1995).

2. They have the generalization capability to learn complex patterns of inputs and provide
meaningful solutions to problems even when input data contain errors, or are incomplete,
or are not presented during training. In other words, they have the ability to integrate
information from multiple sources and incorporate new features without degrading prior
learning (Karunasekera, 1992; Hawley et al., 1993; Medsker et al., 1993; Chao and
Skibniewski, 1994; Flood and Kartam ,1994a and 1994b).

3. They are distribution free because no prior knowledge is needed about the statistical
distribution of the classes in the data sources in order to apply the method for
classification. This is an advantage over most statistical methods that require modelling of
data (Karunasekera, 1992; Hawley et al., 1993; Wu and Lim, 1993; Khoshgoftaar and
Lanning, 1995). Neural networks could avoid some of the shortcomings of the currently
used statistically or empirically based techniques.

4. They take care of determining how much weight each data source should have in the
classification, which remains a problem for statistical methods (Karunasekera, 1992; Wu
and Lim, 1993). The non-linear learning and smooth interpolation capabilities give the
neural network an edge over standard computers and rule-based systems for solving certain
problems (Kimoto et al.,1993; Wu and Lim, 1993).














CHAPTER 6
CONCLUSIONS AND RECOMMENDATIONS



This thesis demonstrates the viability of using neural networks to predict deflection

in externally reinforced concrete beams. The neural network approach is determined to be

able to provide accurate estimates of beam deflection from three parameters (the cross-

sectional area of tensile steel, the position of external reinforcement, and the load) for a

beam of fixed size and load orientation. All networks applied generated accurate

predictions for deflection, the least accurate being within 15% of the actual beam

deflection.

A synopsis of performance data for the network paradigms implemented in this

study is below:

* For the 3-layer backpropagation model, mean absolute error was obtained as 1.98 mm
with a correlation coefficient of 0.94 on the test set data.

* The 5-layer backpropagation model with jump connections arrived at a mean absolute
error of 2.2 mm, with a correlation coefficient of 0.9 on the test set data.

* For the GRNN network, mean absolute error was obtained as 2.6 mm with a
correlation coefficient of 0.95 on the test set data.

* The worst performance for the scope of this study was the Ward Network with 2
hidden layers, resulting in a mean absolute error of 6 mm and a correlation coefficient
of 0.72.


The most accurate neural network architecture for this problem was the 3-Layer

backpropagation model, although its training time was significantly longer than the other

paradigms. Based on effectiveness of results generated, the 5-Layer backpropagation









model with jump connections would probably be the best training procedure to be adopted

for this problem. The input variables used within the neural network models that

were found to be significant in training were the load applied to the beam, the area of

tensile steel, and the presence or absence of CFRP together with its application configuration; the

output predicted was the deflection generated under the loading condition.

A comparison of the relative performance of ANNs and FEM suggested that

comparable accuracies in deflection prediction were obtained for the best Neural

paradigm for this problem and the FEM analysis method. The ANN method generally had a

training time in the order of 20-45 seconds and the trained network could predict values on

a new set of data within seconds. The FEM method required processing time in the order

of several hours, and involved reconstructing the model for every new beam loading

configuration and beam reinforcement.

It was qualitatively determined that an Artificial Neural Network model is

advantageous relative to the FEM model in the analysis of this problem, since an ANN does not

require the creation of a new model for each new problem, and is capable of producing an

answer to a problem within seconds. For the use of FEM, a three-dimensional model

would be required, which would be difficult to construct, as well as being computationally

expensive by requiring lengthy CPU time.

A limitation of this work is the inability to compare quantitatively the performance (in

terms of speed and accuracy) of the FEM method on the same set of training data.



This thesis arrived at a neural network model for the performance of externally

reinforced beams that is sufficiently accurate to be applicable in actual engineering









problems, and overcomes the speed problem of existing methods of analysis such as FEM

and Statistical pattern recognition. However, the accuracy of paradigms that did not predict

well within the scope of this study, may be improved by having a larger set of training data.

Another area that is open for experimentation is the adjustment of number of neurons, the

learning rate and momentum, and activation functions at the neurons. Varying these

parameters with a larger set of training data may provide better predictions with the

network models that failed to provide very accurate predictions in this analysis.

Future work could analyze the same problem using Probabilistic neural networks,

and the Kohonen unsupervised network, if a larger and more variable set of data is

available. For this thesis, the data available was inadequate to generate meaningful results

using these two paradigms. The fitness of all individuals in the population was found to be

the same and the Genetic algorithm could not proceed. An adequate solution was not

reached since a larger breeding pool was not available.















LIST OF REFERENCES


Alexander, D. R., Ioannides, A. M., Hammons, M. I. and Davis, C. M. Application
of Artificial Neural Networks to Concrete Pavement Joint Evaluation. Transportation
Research Record n 1540 Nov (1996). pp. 56-64.

Bohner, Richard and Burleigh, Douglas. Thermal Non-destructive Testing (TNDT)
of Composite Reinforcements Adhesively Bonded to Concrete Civil Structures.
International SAMPE Symposium and Exhibition (Proceedings) v 43 n 2 (1998).

Chajes, M. J., Januszka, T. F., Mertz, D. R., Thomson, T. A. Jr. and Finch, W. W.
Jr. Shear Strengthening of Reinforced Concrete Beams Using Externally Applied
Composite Fabrics. ACI Structural Journal v 92 n 3 May-June (1995).

Chajes, M. J., Thomson, T. A. Jr., Januszka, T. F. and Finch, W. W. Jr. Flexural
Strengthening of Concrete Beams Using Externally Bonded Composite Materials.
Construction and Building Materials v 8 n 3 Sept (1994) pp. 191-201.

Cao, X., Sugiyama, Y. and Mitsui, Y. Application of Artificial Neural Networks to
Load Identification. Computers and Structures v 69 n 1 Oct (1998). pp. 63-78.

Crasto, Allan S. and Kim, Ran Y. Rehabilitation of Concrete Bridge Beams with
Externally-Bonded Composite Plates. Part II. 41st International SAMPE Symposium,
March 24-28, (1996).

Crasto, A. S., Kim, R. Y. and Mistretta, J. P. Rehabilitation of Concrete Bridge
Beams with Externally Bonded Composite Plates. International SAMPE Symposium
and Exhibition (Proceedings) v 41 n 2 (1996). pp. 1269-1279.

Dussek, I. J. Strengthening of Bridge Beams and Similar Structures by Means of
Epoxy-Resin-Bonded External Reinforcement. Transportation Research Record 785,
Transportation Research Board, (1980), pp. 21-24.

Flood, Ian and Kartam, Nabil. Neural Networks in Civil Engineering II: Systems
and Application. Journal of Computing in Civil Engineering v 8 n 2, April (1994).

Flood, Ian and Kartam, Nabil. Artificial Neural Networks: Fundamentals and
Applications. Chapter 2: Systems, American Society of Civil Engineers, Reston, VA.















Flood, I., Nandy, S. and Muszynski, L.C. Assessing External Reinforcement on RC
Beams Using Neural Nets. Proceedings of the VIIIth International Conference:
Computing in Civil and Building Engineering, August (2000) pp. 1114-1120.

Garcia, Gabe. Relative performance of clustering based neural network and
statistical pattern recognition models for non-destructive damage detection. Smart
Materials and Structures, v 6 n 4 Aug (1997). pp. 415-424.

Hegazy, T., Tully, S. and Marzouk, H. Neural Network approach for Predicting the
Structural Behavior of Concrete Slabs. Canadian Journal of Civil Engineering v 25 n
4 Aug (1998). pp. 668-677.

Highsmith, Alton L. and Keshav, Sineesh. Using Measured Damage Parameters to
Predict the Residual Strength of Impacted Composites: A Neural Network Approach.
Journal of Composites Technology and Research v 19 n 4 Oct (1997). pp. 195-201.

Hussain, M., Sharif, A., Basunbul, I.A., Baluch, M.H. and Al-Sulaimani, G.J.
Flexural Behavior of Precracked Reinforced Concrete Beams Strengthened
Externally by Steel Plates. ACI Structural Journal, Jan-Feb (1995).

Jones, R. and Hanna, S. Composite Wraps for Aging Infrastructure. Theoretical and
Applied Fracture Mechanics v 28 n 2 Dec (1997). pp. 125-134.

Kaempen, Charles E. Composite-reinforced Concrete Building and Bridge
Structures. International SAMPE Symposium and Exhibition (Proceedings) v 41 n 1
(1996). pp. 679-686.

Klaiber, F.W. Methods of Strengthening Existing Highway Bridges. NCHRP
Research Report No. 293, Transportation Research Board, (1987), pp. 114.

MacDonald, M.D. and Calder, A.J. Bonded Steel Plating for Strengthening
Concrete Structures. International Journal of Adhesives, 2(2), (1982), pp. 119-127.

Maeda, Y. Deterioration and Repairing of Reinforced Concrete Slabs of Highway
Bridges in Japan. Technology Reports (Osaka University), 30 (1599), (1980), pp. 135-
144.

Mikami, S., Tanaka, S. and Tatsuya, H. Neural Network System For Reasoning
Residual Axial Forces Of High-Strength Bolts In Steel Bridges. Computer Aided
Civil and Infrastructure Engineering v 13 n 4 July (1998). pp. 237-246.















Mufti, A.A. Introduction and Overview, in Advanced Composite Materials with
Application to Bridges. Canadian Society for Civil Engineering, Montreal, Canada,
(1991), pp. 2.

Mukherjee, A., Deshpande, J.M. and Anmala, J. Prediction of Buckling Load of
Columns Using Artificial Neural Networks. Journal of Structural Engineering v 122 n
11 Nov (1996). pp. 1385-1387.

NeuroShell2 Manual Third Edition August (1995). Ward Systems Group Inc.,
Frederick, MD 21702.

Peetathawatchai, C. and Connor, J.J. Jr. Applicability of Neural Network Systems
for Structural Damage Diagnosis. Proceedings of Engineering Mechanics v 1 (1996).
ASCE, NewYork. pp. 68-71.

Ritchie, P.A., Thomas, D.A., Lu, L.W. and Connelly, G.M. External Reinforcement
of Concrete Beams Using Fiber Reinforced Plastics. ACI Structural Journal, July-
August (1991).

Ross, C.A., Jerome, D.M. and Hughes, M.L. Hardening and Rehabilitation of
Concrete Structures Using Carbon Fiber Reinforced Plastics (CFRP). Final Report
for period April- September (1994), Wright Laboratory Armament Directorate,
Tyndall AFB, FL 32403.

Saadmanesh, H. and Malek, A.M. Design Guidelines for Flexural Strengthening of
RC Beams with FRP Plates. J. of Composites for Construction, 2, 4, (1998) pp. 158-164.

Shahawy, M.A., Arockiasamy, M., Beitelman, T. and Sowrirajan, R. Reinforced
Concrete Rectangular Beams Strengthened with CFRP Laminates. Florida Dept of
Transportation, May (1995).

Sharif, A., Al-Sulaimani, G.J., Basunbul, I.A., Baluch, M.H. and Ghaleb, B.N.
Strengthening of Initially Loaded Reinforced Concrete Beams Using FRP Plates.
ACI Structural Journal, March-April (1994).

















APPENDIX A


TRAINING DATA

Appendix A incorporates the data used in training the Neural Network models.
Load is represented in KiloNewtons, Area of tensile steel in square millimeters, CFRP
external reinforcement as a value (0= no CFRP, 1= CFRP on tension face, 2= CFRP on
tension and shear face), Actual deflection and the Network predicted deflection in
millimeters, and the Error is calculated as the difference of the Actual and Predicted
deflections.




5-Layer Backpropagation Network with Jump Connections


Actual Predicted
deflection Deflection
(in mm) (in mm) Error
Load(KN) Area of steel,mm2 (2-bars) CFRP Actual(1) Network(1) Act-Net(1)
0 567.7408 0 0 0 0
3.6920226 567.7408 00.254000008 0.04293159 0.21106842
7.5174918 567.7408 00.5080000161.062196732 -0.5541967
10.898139 567.7408 00.7620000241.443637967 -0.6816379
13.4113833 567.7408 01.0160000321.672314644 -0.6563146
14.9460192 567.7408 01.2699999811.843201756 -0.5732018
15.5465289 567.7408 01.5240000491.917415023 -0.393415
17.0144415 567.7408 01.7779999972.114818096 -0.3368181
18.460113 567.7408 02.0320000652.328862429 -0.2968624
19.2830337 567.7408 02.2860000132.458258629 -0.1722586
20.5952586 567.7408 02.5399999622.674554825 -0.1345549
25.6217472 567.7408 03.8099999433.595994234 0.21400571
31.0040934 567.7408 05.0799999244.715117455 0.36488247
36.8757438 567.7408 06.3499999056.071269035 0.27873087
42.035679 567.7408 07.6199998867.371602058 0.24839783
46.9732032 567.7408 08.8900003438.703854561 0.18614578
51.9774507 567.7408 010.1599998510.13076687 0.02923298
57.4265202 567.7408 011.43000031 11.7524786 -0.3224783
63.6317871 567.7408 012.69999981 13.6554203 -0.9554205
69.2143032 567.7408 013.97000027 15.4078455 -1.4378452
74.3519973 567.7408 015.2399997717.07495117 -1.8349514
79.2895215 567.7408 016.5100002318.75535965 -2.2453594
84.6273855 567.7408 017.7800006918.18295097 -0.4029503
88.2526848 567.7408 019.0499992419.09909821 -0.049099
90.298866 567.7408 020.3199996920.75270844 -0.4327087




















91.9002252 567.7408 022.8600006122.71040916 0.14959145
92.7231459 567.7408 024.1299991623.98640823 0.14359092
93.6127899 567.7408 025.39999962 25.5587616 -0.158762
94.9250148 567.7408 027.9400005328.15386772 -0.2138672
96.081552 567.7408 030.4799995430.57716179 -0.0971622
97.7718756 567.7408 033.0200004634.07014084 -1.0501404
99.3954759 567.7408 035.5600013737.13212585 -1.5721245
100.707701 567.7408 038.0999984739.29722214 -1.1972237
102.264578 567.7408 040.6399993941.47063828 -0.8306389
103.354392 567.7408 043.1800003142.74575043 0.4342498E
104.310759 567.7408 045.72000122 43.7137413 2.00625992
105.311609 567.7408 048.2599983244.59341812 3.6665802
0 141.9352 0 0 0 C
2.891343 141.9352 00.254000008 0.4506419 -0.1966419
8.3181714 141.9352 00.5080000160.337584734 0.1704152E
9.9862539 141.9352 00.7620000240.589119852 0.17288017
13.1444901 141.9352 01.0160000320.943647206 0.0723528`
14.3010273 141.9352 01.2699999811.355989695 -0.0859897
15.9691098 141.9352 01.524000049 1.44854176 0.07545829
17.9040855 141.9352 01.7779999972.184108257 -0.4061082
17.8596033 141.9352 02.0320000652.114660025 -0.08266
18.2821842 141.9352 02.2860000135.693309784 -3.407309E
18.1264965 141.9352 02.5399999623.278800964 -0.738801
18.1932198 141.9352 03.8099999434.135506153 -0.3255062
18.6380418 141.9352 05.0799999249.446576118 -4.3665762
18.3711486 141.9352 06.3499999057.256037712 -0.906037E
18.3489075 141.9352 07.6199998866.897034645 0.72296524
18.4156308 141.9352 08.8900003437.880403519 1.00959682
18.7492473 141.9352 010.159999859.776610374 0.38338947
19.6611324 141.9352 011.43000031 11.9268589 -0.4968586
20.0614722 141.9352 012.6999998112.84028435 -0.1402845
20.6397408 141.9352 013.9700002713.93496132 0.03503895
21.3292149 141.9352 015.2399997715.10927963 0.13072014
21.9519657 141.9352 016.5100002316.38286781 0.12713242
22.5302343 141.9352 017.7800006917.84153938 -0.0615387
23.0862618 141.9352 019.04999924 19.2322464 -0.1822472
23.4643605 141.9352 020.3199996920.22411537 0.09588432
23.8869414 141.9352 021.5900001521.68719482 -0.0971947
24.3095223 141.9352 022.8600006123.70238304 -0.8423824
24.5986566 141.9352 024.1299991625.42682838 -1.2968292
24.7543443 141.9352 025.3999996226.47483444 -1.074834E
25.1546841 141.9352 027.9400005329.52471161 -1.5847111
25.5105417 141.9352 030.4799995432.52240753 -2.04240E
25.6662294 141.9352 033.0200004633.84803772 -0.828037`


91.3219566 567.7408 0 21.59000015 21.9244976 -0.3344975




















25.9331226 141.9352 038.0999984736.03686905 2.06312942
26.1555336 141.9352 040.63999939 37.7077713 2.9322280S
0 567.7408 1 0 0 C
2.53993362 567.7408 10.2540000080.153437749 0.10056226
4.28363586 567.7408 10.5080000160.442647249 0.06535277
6.48995298 567.7408 10.7620000240.765775084 -0.0037751
8.7629934 567.7408 11.0160000321.073728323 -0.0577282
11.7744383 567.7408 11.269999981 1.30730319 -0.0373032
14.6880224 567.7408 11.5240000490.376362383 1.14763767
17.0144415 567.7408 11.7779999970.704520822 1.0734791E
19.0116923 567.7408 12.0320000651.001803517 1.03019655
24.5853119 567.7408 12.286000013 1.88305676 0.40294325
29.7496954 567.7408 12.5399999622.773976564 -0.233976C
34.5938069 567.7408 13.8099999433.680078506 0.12992144
40.03398 567.7408 15.0799999244.783222198 0.29677772
45.5186353 567.7408 16.3499999055.989840984 0.36015892
51.1990122 567.7408 17.6199998867.340175152 0.27982472
56.9238713 567.7408 18.8900003438.801509857 0.0884904S
62.6442823 567.7408 110.1599998510.35294628 -0.1929464
68.7161026 567.7408 111.4300003112.05708885 -0.6270885
74.810164 567.7408 112.6999998110.94734859 1.75265121
80.2547852 567.7408 113.9700002712.42880726 1.54119301
86.0419195 567.7408 115.2399997714.04766178 1.1923379S
91.8690877 567.7408 116.5100002315.76563549 0.74436474
97.1179873 567.7408 117.7800006917.42340279 0.356597S
102.460299 567.7408 119.0499992419.22584534 -0.1758461
106.205701 567.7408 120.3199996920.56944847 -0.249448E
108.425363 567.7408 121.5900001521.42269516 0.1673049S
110.235788 567.7408 122.8600006122.17486954 0.68513107
113.852191 567.7408 124.1299991623.92639923 0.20359992
117.366285 567.7408 125.3999996226.08973312 -0.6897335
120.90262 567.7408 127.9400005328.80529022 -0.8652897
124.038615 567.7408 130.47999954 31.5897274 -1.109727S
127.041163 567.7408 133.0200004634.41215897 -1.3921585
129.665613 567.7408 135.5600013736.84076309 -1.2807617
132.178857 567.7408 138.0999984739.01304245 -0.913044
134.491932 567.7408 140.6399993940.81595993 -0.1759605
135.990982 567.7408 143.1800003141.87023926 1.3097610E
137.378826 567.7408 145.7200012242.76473999 2.95526122
105.311609 567.7408 148.2599983220.24028206 28.0197162
0 141.9352 1 0 0 C
2.66448378 141.9352 10.2540000080.326959103 -0.0729591
4.35035916 141.9352 10.5080000160.550672412 -0.0426724
7.44187206 141.9352 10.7620000241.159987569 -0.3979875


25.8441582


141.9352


0135.56000137135.32495499


0.2350463S




















14.5323347 141.9352 11.2699999811.458437324 -0.1884372
16.6274464 141.9352 11.524000049 1.61352396 -0.089523S
17.6416405 141.9352 11.7779999971.735392809 0.0426071S
19.976956 141.9352 12.0320000652.078426361 -0.0464262
21.1512861 141.9352 12.2860000132.278162956 0.00783706
21.5115919 141.9352 12.5399999622.344336033 0.19566392
21.978655 141.9352 13.8099999432.434988976 1.37501097
25.4705077 141.9352 15.0799999243.667862654 1.41213727
28.1928184 141.9352 16.349999905 6.02493906 0.32506084
30.0699672 141.9352 17.6199998867.317203522 0.30279636
32.0004947 141.9352 18.890000343 8.20187664 0.6881237
33.1303426 141.9352 110.159999858.669887543 1.4901122
38.0500739 141.9352 111.4300003111.18537712 0.2446231E
39.2911273 141.9352 112.6999998112.05879593 0.6412038E
41.3995835 141.9352 113.9700002713.86200333 0.10799694
42.3737437 141.9352 115.2399997714.83884335 0.40115642
44.219755 141.9352 116.5100002316.88510132 -0.3751011
45.5230835 141.9352 117.7800006917.71737862 0.06262207
47.0221336 141.9352 119.0499992419.01166916 0.0383300E
47.9696045 141.9352 120.3199996920.23558044 0.08441925
49.6287905 141.9352 121.5900001522.68108368 -1.0910835
48.7035608 141.9352 122.8600006121.24388885 1.61611176
48.748043 141.9352 124.1299991621.30796432 2.82203484
48.9170753 141.9352 125.39999962 21.5553112 3.84468842
50.2159556 141.9352 127.94000053 23.7544651 4.18553542
50.8965332 141.9352 130.4799995425.24596596 5.2340335E
52.4623067 141.9352 133.0200004630.23975372 2.78024672
53.6099474 141.9352 135.5600013735.52633286 0.03366852
0 567.7408 2 00.258493662 -0.2584937
1.21881228 567.7408 20.2540000080.319325924 -0.065325S
6.52998696 567.7408 20.508000016 0 0.50800002
9.31457268 567.7408 20.7620000240.228649765 0.53335026
12.8509076 567.7408 21.0160000320.728937685 0.28706235
15.8401114 567.7408 21.2699999811.191228986 0.078771
18.1087036 567.7408 21.5240000491.559157729 -0.0351577
20.3595029 567.7408 21.7779999971.938352108 -0.1603521
21.8007262 567.7408 22.0320000652.188704014 -0.156703S
23.5043945 567.7408 22.286000013 2.49243331 -0.2064332
24.6920692 567.7408 22.5399999622.709263325 -0.1692634
30.3235157 567.7408 23.8099999433.796237946 0.013762
36.333061 567.7408 25.0799999245.067764759 0.01223516
42.6940156 567.7408 26.3499999056.544143677 -0.194143E
48.3565996 567.7408 27.6199998867.973833561 -0.3538337
54.5885558 567.7408 28.8900003439.673434258 -0.783433S


10.5778672 141.9352 1 1.016000032 1.485563278 -0.4695632




















66.4475104 567.7408 211.4300003110.65399075 0.77600956
72.1723695 567.7408 212.6999998112.32174397 0.37825584
78.328706 567.7408 213.9700002714.19344139 -0.2234411
84.1825635 567.7408 215.2399997716.01932907 -0.7793292
90.1476265 567.7408 216.5100002317.88908958 -1.3790894
96.0237251 567.7408 217.7800006919.71518135 -1.9351807
101.637379 567.7408 219.0499992421.45820618 -2.4082069
106.294665 567.7408 220.3199996922.93422508 -2.6142254
109.90662 567.7408 221.5900001524.11208725 -2.5220871
111.85494 567.7408 222.8600006124.76057434 -1.9005737
113.901121 567.7408 224.12999916 25.4509716 -1.3209724
115.5781 567.7408 225.3999996226.02355003 -0.6235504
119.835047 567.7408 227.9400005327.51267242 0.42732811
123.847341 567.7408 230.4799995429.04049492 1.43950462
127.477089 567.7408 233.0200004630.70206642 2.31793404
131.854137 567.7408 235.56000137 33.2913475 2.26865387
135.163613 567.7408 238.0999984735.66852951 2.43146896
138.366331 567.7408 240.6399993938.14926147 2.49073792
0 141.9352 2 00.215275362 -0.2152754
0.05782686 141.9352 20.2540000080.220672145 0.03332786
4.92862776 141.9352 20.5080000160.945739269 -0.4377392
9.68377494 141.9352 20.7620000240.842015207 -0.0800152
12.8953898 141.9352 21.0160000320.420753062 0.59524697
16.5918606 141.9352 21.2699999810.844469666 0.42553031
20.2883314 141.9352 21.524000049 1.32541728 0.19858277
22.281134 141.9352 21.7779999971.611992955 0.16600704
24.3629009 141.9352 22.0320000651.977523923 0.05447614
25.4883006 141.9352 22.2860000132.265819311 0.0201807
25.6217472 141.9352 22.5399999622.308584929 0.23141502
29.8920384 141.9352 23.8099999435.169563293 -1.3595634
33.6952665 141.9352 25.0799999246.741468906 -1.661469
36.3775432 141.9352 26.3499999057.412313461 -1.0623136
39.589158 141.9352 27.619999886 8.21795845 -0.5979586
43.0765625 141.9352 28.8900003439.147535324 -0.25753E
46.6974136 141.9352 210.1599998510.19715214 -0.0371522
49.7133067 141.9352 211.4300003111.16771412 0.26228619
52.1108973 141.9352 212.6999998112.71244335 -0.0124435
54.855449 141.9352 213.9700002714.58885479 -0.6188545
57.2663843 141.9352 215.2399997715.59836769 -0.3583679
59.5883551 141.9352 216.51000023 16.5992012 -0.089201
61.7146043 141.9352 217.7800006917.52415848 0.25584221
63.8675428 141.9352 219.0499992418.46446419 0.58553505
66.5676123 141.9352 220.31999969 19.6777935 0.64220619
68.8717903 141.9352 221.5900001520.83288002 0.75712012


60.5847564 567.7408 2 10.15999985 9.869497299 0.29050255




















73.4045264 141.9352 224.1299991623.65086746 0.4791317
75.9933905 141.9352 225.3999996225.45420456 -0.054204S
80.0813047 141.9352 227.9400005328.35725975 -0.4172592
82.2698289 141.9352 230.47999954 29.9337368 0.54626274
84.1736671 141.9352 233.02000046 31.3278389 1.6921615(
0 774.192 0 0 0 C
17.79288 774.192 02.5399999621.754709005 0.7852909(
31.13754 774.192 05.0799999244.157893181 0.92210674
44.4822 774.192 07.6199998867.092042923 0.5279569(
57.82686 774.192 010.1599998510.58166313 -0.4216632
68.94741 774.192 012.6999998113.88581562 -1.185815E
80.06796 774.192 015.2399997715.29004574 -0.05004(
91.18851 774.192 017.7800006916.01569748 1.76430321
102.30906 774.192 020.3199996919.47645378 0.84354591
111.2055 774.192 022.8600006122.75304985 0.1069507(
115.65372 774.192 025.3999996225.04311371 0.35688591
0 774.192 1 0 0 C
26.68932 774.192 12.5399999621.945963502 0.5940364(
37.80987 774.192 15.079999924 2.99820137 2.08179855
55.60275 774.192 17.6199998868.355574608 -0.7355747
66.7233 774.192 110.15999985 9.60956955 0.5504302
80.06796 774.192 112.6999998110.81829071 1.8817091
95.63673 774.192 115.23999977 15.5503273 -0.3103275
106.75728 774.192 117.7800006919.18312263 -1.4031219
122.32605 774.192 120.3199996920.71483994 -0.3948402
133.4466 774.192 122.8600006124.24181557 -1.381815
142.34304 774.192 125.3999996227.29301643 -1.893016E
0 258.064 0 00.002316356 -0.0023164
20.01699 258.064 05.0799999244.763210297 0.31678962
28.91343 258.064 010.1599998513.58454227 -3.4245424
35.58576 258.064 015.2399997717.60326195 -2.3632622
36.475404 258.064 020.3199996918.53609085 1.78390884
37.80987 258.064 025.3999996220.50885582 4.891143E
40.03398 258.064 030.4799995428.78214455 1.69785E
40.923624 258.064 035.5600013735.28065872 0.27934265
41.813268 258.064 040.6399993942.00836182 -1.3683624
42.702912 258.064 045.7200012247.07924652 -1.3592452
43.370145 258.064 050.7999992449.58152008 1.21847916
0 258.064 1 00.052454859 -0.052454S
24.46521 258.064 15.079999924 4.9526577 0.12734222
37.80987 258.064 110.159999859.757398605 0.40260124
51.15453 258.064 115.23999977 13.6613884 1.57861137
55.60275 258.064 120.3199996915.68225384 4.6377458(
66.7233 258.064 125.3999996225.40167618 -0.0016766


71.2738291 141.9352 2 22.86000061 22.24450493 0.61549568
80.06796 258.064 135.5600013737.40085983 -1.840858E
86.74029 258.064 140.63999939 42.2421608 -1.6021614
91.18851 258.064 145.7200012245.43424225 0.28575897
97.86084 258.064 150.7999992448.98107529 1.8189239E
Mean absolute error = 0.909 mm, Max error = 28 mm
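
The error summaries quoted after each table are computed from the paired Actual(1) and Network(1) deflection columns. As a point of reference only, a minimal Python sketch is given below; the deflection values in it are a few rounded, illustrative numbers rather than the full table.

    # Minimal sketch, not from the thesis: reproducing the per-table summary
    # statistics from paired actual and predicted deflections (values in mm).
    actual    = [0.254, 0.508, 0.762, 1.016, 1.270]   # Actual(1), illustrative values
    predicted = [0.399, 0.878, 1.667, 2.191, 2.485]   # Network(1), illustrative values

    errors = [a - p for a, p in zip(actual, predicted)]          # Act-Net(1)
    mean_abs_error = sum(abs(e) for e in errors) / len(errors)   # "Mean absolute error"
    max_error = max(abs(e) for e in errors)                      # "Max error"
    print(f"Mean absolute error = {mean_abs_error:.3f} mm, Max error = {max_error:.2f} mm")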




4-Layer Backpropagation with Jump Connections



Load (kN)   Area of steel, mm2 (2 bars)   CFRP   Actual deflection, Actual(1) (mm)   Predicted deflection, Network(1) (mm)   Error, Act-Net(1) (mm)
0 567.7408 0 0 0 0
3.692023 567.7408 00.2540000080.399079323-0.14507931
7.517492 567.7408 00.5080000160.877570987-0.36957097
10.89814 567.7408 00.7620000241.666728139-0.90472811
13.41138 567.7408 01.0160000322.190505505-1.17450547
14.94602 567.7408 01.2699999812.484836102-1.21483612
15.54653 567.7408 01.5240000492.597754717-1.07375467
17.01444 567.7408 01.7779999972.872861624-1.09486163
18.46011 567.7408 02.0320000653.146835089-1.11483502
19.28303 567.7408 02.2860000133.305569649-1.01956964
20.59526 567.7408 02.5399999623.564107418-1.02410746
25.62175 567.7408 03.8099999434.624937057-0.81493711
31.00409 567.7408 05.0799999245.878867626 -0.7988677
36.87574 567.7408 06.3499999057.365961552-1.01596165
42.03568 567.7408 07.6199998868.764969826-1.14496994
46.9732 567.7408 08.890000343 10.1795845-1.28958416
51.97745 567.7408 010.1599998511.68638706-1.52638721
57.42652 567.7408 011.4300003113.40984249-1.97984219
63.63179 567.7408 012.6999998115.48184776-2.78184795
69.2143 567.7408 013.9700002717.45951462-3.48951435
74.352 567.7408 015.2399997719.39911842-4.15911865
79.28952 567.7408 016.5100002321.40340042-4.89340019
84.62739 567.7408 017.7800006923.76263809-5.98263741
88.25268 567.7408 019.0499992425.48676109-6.43676186
90.29887 567.7408 020.3199996926.49821091-6.17821121
91.32196 567.7408 021.5900001527.01211548-5.42211533
91.90023 567.7408 022.8600006127.30451393-4.44451332
92.72315 567.7408 024.1299991627.72253609-3.59253693
93.61279 567.7408 025.3999996228.17635345-2.77635384
94.92501 567.7408 027.9400005328.84760857-0.90760803


72.28358 258.064 1 30.47999954 31.00452995 -0.5245304
97.77188 567.7408 033.02000046 30.29895212.721048355
99.39548 567.7408 035.5600013731.11468887 4.4453125
100.7077 567.7408 038.0999984731.763113026.336885452
102.2646 567.7408 040.63999939 32.51632698.123672485
103.3544 567.7408 043.18000031 33.031509410.14849091
104.3108 567.7408 045.7200012233.47459412 12.2454071
105.3116 567.7408 048.2599983233.9287300114.33126831
0 141.9352 0 00.043521088-0.04352109
2.891343 141.9352 00.254000008 0.443133146-0.18913314
8.318171 141.9352 00.5080000161.215917826-0.70791781
9.986254 141.9352 00.762000024 1.48878181-0.72678179
13.14449 141.9352 01.0160000322.137047768-1.12104774
14.30103 141.9352 01.2699999812.457399368-1.18739939
15.96911 141.9352 01.5240000493.146503925-1.62250388
17.90409 141.9352 01.7779999975.179655552-3.40165555
17.8596 141.9352 02.0320000655.101323605-3.06932354
18.28218 141.9352 02.2860000135.930646896-3.64464688
18.1265 141.9352 02.5399999625.602532864 -3.0625329
18.19322 141.9352 03.8099999435.739846706-1.92984676
18.63804 141.9352 05.079999924 6.783280849-1.70328093
18.37115 141.9352 06.3499999056.1303477290.219652176
18.34891 141.9352 07.619999886 6.079579831.540420055
18.41563 141.9352 08.8900003436.2335672382.656433105
18.74925 141.9352 010.159999857.078927517 3.08107233
19.66113 141.9352 011.430000319.9545879361.475412369
20.06147 141.9352 012.6999998111.393733021.306266785
20.63974 141.9352 013.9700002713.539802550.430197716
21.32921 141.9352 015.2399997716.07967567 -0.8396759
21.95197 141.9352 016.5100002318.27796745-1.76796722
22.53023 141.9352 017.7800006920.19708443-2.41708374
23.08626 141.9352 019.0499992421.90499687-2.85499763
23.46436 141.9352 020.3199996922.97935104-2.65935135
23.88694 141.9352 021.5900001524.09128952-2.50128937
24.30952 141.9352 022.8600006125.10789299-2.24789238
24.59866 141.9352 024.1299991625.74933434-1.61933517
24.75434 141.9352 025.3999996226.07702637-0.67702675
25.15468 141.9352 027.9400005326.865171431.074829102
25.51054 141.9352 030.4799995427.503664022.976335526
25.66623 141.9352 033.0200004627.765867235.254133224
25.84416 141.9352 035.5600013728.053495417.506505966
25.93312 141.9352 038.0999984728.192684179.907314301
26.15553 141.9352 040.6399993928.5277500212.11224937
0 567.7408 1 00.722970307-0.72297031
2.539934 567.7408 10.2540000081.220793724-0.96679372


96.08155 567.7408 0 30.47999954 29.43906975 1.040929794
6.489953 567.7408 10.7620000240.7167701720.045229852
8.762993 567.7408 11.0160000320.8849945660.131005466
11.77444 567.7408 11.2699999811.1510026450.118997335
14.68802 567.7408 11.5240000491.544953465-0.02095342
17.01444 567.7408 11.7779999971.949771166-0.17177117
19.01169 567.7408 12.0320000652.238199949-0.20619988
24.58531 567.7408 12.2860000132.686860561-0.40086055
29.7497 567.7408 12.5399999623.301107407-0.76110744
34.59381 567.7408 13.8099999433.3526263240.457373619
40.03398 567.7408 15.0799999242.8104457862.269554138
45.51864 567.7408 16.3499999053.3189558983.031044006
51.19901 567.7408 17.6199998864.5641078953.055891991
56.92387 567.7408 18.890000343 6.074637892.815362453
62.64428 567.7408 110.15999985 7.7300212.429978848
68.7161 567.7408 111.430000319.6260776521.803922653
74.81016 567.7408 112.69999981 11.67566491.024334908
80.25479 567.7408 113.9700002713.646856310.323143959
86.04192 567.7408 115.2399997715.92088604-0.68088627
91.86909 567.7408 116.5100002318.44743729-1.93743706
97.11799 567.7408 117.7800006920.96040344-3.18040276
102.4603 567.7408 119.0499992423.72528267-4.67528343
106.2057 567.7408 120.3199996925.72960091-5.40960121
108.4254 567.7408 121.5900001526.91476822-5.32476807
110.2358 567.7408 122.8600006127.86836052-5.00835991
113.8522 567.7408 124.12999916 29.7115593-5.58156013
117.3663 567.7408 125.3999996231.39616013-5.99616051
120.9026 567.7408 127.9400005332.96994019-5.02993965
124.0386 567.7408 130.4799995434.26395416-3.78395462
127.0412 567.7408 133.0200004635.41999817-2.39999771
129.6656 567.7408 135.5600013736.37069321-0.81069183
132.1789 567.7408 138.0999984737.234428410.865570068
134.4919 567.7408 140.6399993937.993125922.646873474
135.991 567.7408 143.1800003138.467864994.712135315
137.3788 567.7408 145.7200012238.896312716.823688507
105.3116 567.7408 148.2599983225.2499980923.01000023
0 141.9352 1 0 0 0
2.664484 141.9352 10.254000008 00.254000008
4.350359 141.9352 10.508000016 00.508000016
7.441872 141.9352 10.7620000240.139960214 0.62203981
10.57787 141.9352 11.0160000320.6480946540.367905378
14.53233 141.9352 11.269999981 1.251677990.018321991
16.62745 141.9352 11.5240000491.5130293370.010970712
17.64164 141.9352 11.7779999971.6754211190.102578878
19.97696 141.9352 12.0320000652.294844866 -0.2628448


4.283636 567.7408 1 0.508000016 1.008885026 -0.50088501
21.51159 141.9352 12.5399999622.867594004-0.32759404
21.97866 141.9352 13.809999943 3.034215450.775784492
25.47051 141.9352 15.0799999244.4602780340.619721889
28.19282 141.9352 16.3499999056.1019978520.248002052
30.06997 141.9352 17.6199998867.3660988810.253901005
32.00049 141.9352 18.8900003438.7200384140.169961929
33.13034 141.9352 110.159999859.5303878780.629611969
38.05007 141.9352 111.4300003113.22031403-1.79031372
39.29113 141.9352 112.6999998114.19918251 -1.4991827
41.39958 141.9352 113.9700002715.90360451-1.93360424
42.37374 141.9352 115.2399997716.70453644-1.46453667
44.21976 141.9352 116.5100002318.23343086-1.72343063
45.52308 141.9352 117.7800006919.31227303-1.53227234
47.02213 141.9352 119.04999924 20.5416584-1.49165916
47.9696 141.9352 120.31999969 21.3080101-0.98801041
49.62879 141.9352 121.5900001522.62288857-1.03288841
48.70356 141.9352 122.86000061 21.89431190.965688705
48.74804 141.9352 124.1299991621.929616932.200382233
48.91708 141.9352 125.3999996222.06352997 3.33646965
50.21596 141.9352 127.9400005323.078607564.861392975
50.89653 141.9352 130.4799995423.599969866.880029678
52.46231 141.9352 133.0200004624.769969948.250030518
53.60995 141.9352 135.5600013725.60054588 9.95945549
0 567.7408 2 0 0 0
1.218812 567.7408 20.254000008 00.254000008
6.529987 567.7408 20.508000016 00.508000016
9.314573 567.7408 20.762000024 00.762000024
12.85091 567.7408 21.0160000320.5112361910.504763842
15.84011 567.7408 21.2699999811.1072473530.162752628
18.1087 567.7408 21.5240000491.605949402-0.08194935
20.3595 567.7408 21.7779999972.141289711-0.36328971
21.80073 567.7408 22.0320000652.504390717-0.47239065
23.50439 567.7408 22.2860000132.950294018 -0.664294
24.69207 567.7408 22.5399999623.266360044-0.72636008
30.32352 567.7408 23.809999943 4.16734314 -0.3573432
36.33306 567.7408 25.0799999242.4475445752.632455349
42.69402 567.7408 26.3499999053.6788749692.671124935
48.3566 567.7408 27.6199998865.3887410162.231258869
54.58856 567.7408 28.890000343 7.47891141.411088943
60.58476 567.7408 210.159999859.634044647 0.5259552
66.44751 567.7408 211.4300003111.48759651-0.05759621
72.17237 567.7408 212.6999998112.481215480.218784332
78.32871 567.7408 213.9700002714.37507439-0.40507412
84.18256 567.7408 215.2399997716.69459915-1.45459938


21.15129 141.9352 1 2.286000013 2.733283281 -0.44728327
96.02373 567.7408 217.7800006916.780260090.999740601
101.6374 567.7408 219.0499992417.881380081.168619156
106.2947 567.7408 220.3199996920.146389010.173610687
109.9066 567.7408 221.5900001522.23096085-0.64096069
111.8549 567.7408 222.8600006123.43406677-0.57406616
113.9011 567.7408 224.1299991624.74685478-0.61685562
115.5781 567.7408 225.3999996225.85406303-0.45406342
119.835 567.7408 227.9400005328.73684692-0.79684639
123.8473 567.7408 230.4799995431.42387199-0.94387245
127.4771 567.7408 233.0200004633.69792557-0.67792511
131.8541 567.7408 235.5600013736.13762283-0.57762146
135.1636 567.7408 238.0999984737.751117710.348880768
138.3663 567.7408 240.63999939 39.14363481.496364594
0 141.9352 2 0 0 0
0.057827 141.9352 20.254000008 00.254000008
4.928628 141.9352 20.5080000160.515699506-0.00769949
9.683775 141.9352 20.7620000241.403264523 -0.6412645
12.89539 141.9352 21.016000032 2.23127079-1.21527076
16.59186 141.9352 21.2699999811.491104364-0.22110438
20.28833 141.9352 21.5240000490.8468111160.677188933
22.28113 141.9352 21.7779999971.122720957 0.65527904
24.3629 141.9352 22.0320000651.6248956920.407104373
25.4883 141.9352 22.2860000131.9571176770.328882337
25.62175 141.9352 22.539999962 1.998919010.541080952
29.89204 141.9352 23.8099999433.5534756180.256524324
33.69527 141.9352 25.0799999245.126929283-0.04692936
36.37754 141.9352 26.349999905 6.21759510.132404804
39.58916 141.9352 27.6199998867.4556918140.164308071
43.07656 141.9352 28.8900003438.7538995740.136100769
46.69741 141.9352 210.15999985 10.131697650.028302193
49.71331 141.9352 211.4300003111.359695430.070304871
52.1109 141.9352 212.6999998112.415092470.284907341
54.85545 141.9352 213.9700002713.730570790.239429474
57.26638 141.9352 215.23999977 14.99348640.246513367
59.58836 141.9352 216.51000023 16.31047440.199525833
61.7146 141.9352 217.7800006917.60169411 0.17830658
63.86754 141.9352 219.0499992418.984306340.065692902
66.56761 141.9352 220.3199996920.80295563-0.48295593
68.87179 141.9352 221.5900001522.40211105 -0.8121109
71.27383 141.9352 222.8600006124.08474159-1.22474098
73.40453 141.9352 224.1299991625.56837845-1.43837929
75.99339 141.9352 225.3999996227.33743095-1.93743134
80.0813 141.9352 227.9400005330.02065659-2.08065605
82.26983 141.9352 230.4799995431.39419556-0.91419601


90.14763 567.7408 2 16.51000023 17.8630085 -1.35300827
0 774.192 0 0 0 0
17.79288 774.192 02.539999962 02.539999962
31.13754 774.192 05.0799999241.8208850623.259114861
44.4822 774.192 07.6199998864.374661446 3.24533844
57.82686 774.192 010.159999857.4441056252.715894222
68.94741 774.192 012.6999998110.39051056 2.30948925
80.06796 774.192 015.2399997713.663366321.576633453
91.18851 774.192 017.7800006917.217792510.562208176
102.3091 774.192 020.3199996921.00687218-0.68687248
111.2055 774.192 022.8600006124.20622253-1.34622192
115.6537 774.192 025.39999962 25.8806591-0.48065948
0 774.192 1 0 0 0
26.68932 774.192 12.539999962 02.539999962
37.80987 774.192 15.079999924 05.079999924
55.60275 774.192 17.6199998862.101116896 5.51888299
66.7233 774.192 110.159999854.4511780745.708821774
80.06796 774.192 112.699999817.7777371414.922262669
95.63673 774.192 115.2399997712.307194712.932805061
106.7573 774.192 117.7800006915.91643143 1.86356926
122.3261 774.192 120.3199996921.47688866-1.15688896
133.4466 774.192 122.8600006125.99133682-3.13133621
142.343 774.192 125.3999996230.08508301-4.68508339
0 258.064 0 0 0 0
20.01699 258.064 05.079999924 18.042799-12.9627991
28.91343 258.064 010.1599998525.79247475-15.6324749
35.58576 258.064 015.2399997729.96155167-14.7215519
36.4754 258.064 020.3199996930.48217583-10.1621761
37.80987 258.064 025.3999996231.24627876-5.84627914
40.03398 258.064 030.4799995432.47169876-1.99169922
40.92362 258.064 035.5600013732.944103242.615898132
41.81327 258.064 040.6399993933.406028757.233970642
42.70291 258.064 045.7200012233.8573455811.86265564
43.37015 258.064 050.7999992434.1888351416.61116409
0 258.064 1 0 0 0
24.46521 258.064 15.0799999244.7497267720.330273151
37.80987 258.064 110.1599998511.44941711-1.28941727
51.15453 258.064 115.2399997719.67797279-4.43797302
55.60275 258.064 120.3199996922.82769966-2.50769997
66.7233 258.064 125.3999996230.41502762 -5.015028
72.28358 258.064 130.4799995433.65052414 -3.1705246
80.06796 258.064 135.5600013737.49334717-1.93334579
86.74029 258.064 140.6399993940.25579834 0.38420105
91.18851 258.064 145.7200012241.866542823.853458405
97.86084 258.064 150.7999992443.970718386.829280853


84.17367 141.9352 2 33.02000046 32.55231094 0.467689514
Mean absolute error= 2.399 mm, Max error= 23.01 mm


GRNN Network


Load (kN)   Area of steel, mm2 (2 bars)   CFRP   Actual deflection, Actual(1) (mm)   Predicted deflection, Network(1) (mm)   Error, Act-Net(1) (mm)
0 567.7408 0 00.068654977-0.068654977
3.692023 567.7408 00.2540000080.345111012-0.091111004
7.517492 567.7408 00.508000016 0.78768158-0.279681563
10.89814 567.7408 00.762000024 0.96657896-0.204578936
13.41138 567.7408 01.0160000321.307263136-0.291263103
14.94602 567.7408 01.2699999811.487757564-0.217757583
15.54653 567.7408 01.5240000491.551990509 -0.02799046
17.01444 567.7408 01.777999997 1.76014626 0.017853737
18.46011 567.7408 02.0320000651.988907576 0.043092489
19.28303 567.7408 02.2860000132.095602512 0.190397501
20.59526 567.7408 02.5399999622.230579853 0.309420109
25.62175 567.7408 03.8099999433.154846907 0.655153036
31.00409 567.7408 05.0799999244.830469608 0.249530315
36.87574 567.7408 06.3499999055.759119034 0.590880871
42.03568 567.7408 07.6199998867.368995667 0.251004219
46.9732 567.7408 08.8900003438.066763878 0.823236465
51.97745 567.7408 010.1599998510.05042458 0.109575272
57.42652 567.7408 011.43000031 10.3930378 1.036962509
63.63179 567.7408 012.6999998111.75105667 0.948943138
69.2143 567.7408 013.9700002712.92796993 1.042030334
74.352 567.7408 015.2399997712.76074314 2.47925663
79.28952 567.7408 016.5100002315.41827679 1.091723442
84.62739 567.7408 017.7800006916.84755516 0.932445526
88.25268 567.7408 019.0499992419.03291893 0.017080307
90.29887 567.7408 020.31999969 21.1650219-0.845022202
91.32196 567.7408 021.5900001522.20096016-0.610960007
91.90023 567.7408 022.8600006122.81951904 0.040481567
92.72315 567.7408 024.1299991623.77983093 0.350168228




















94.92501 567.7408 027.9400005326.37422943 1.565771102
96.08155 567.7408 030.4799995427.20980644 3.2701931
97.77188 567.7408 033.0200004629.53329849 3.48670196E
99.39548 567.7408 035.5600013733.43157196 2.128429412
100.7077 567.7408 038.0999984734.98027039 3.11972808E
102.2646 567.7408 040.6399993936.54109955 4.098899841
103.3544 567.7408 043.1800003138.52260208 4.657398224
104.3108 567.7408 045.7200012239.66244507 6.057556152
105.3116 567.7408 048.2599983238.97538376 9.284614562
0 141.9352 0 00.061245669-0.061245669
2.891343 141.9352 00.2540000080.273599833-0.01959982E
8.318171 141.9352 00.5080000160.792349279-0.284349262
9.986254 141.9352 00.7620000240.851477981-0.089477956
13.14449 141.9352 01.0160000321.274574041-0.25857400S
14.30103 141.9352 01.2699999811.641238451 -0.37123847
15.96911 141.9352 01.524000049 2.95342803-1.429427981
17.90409 141.9352 01.7779999975.108992577-3.33099257S
17.8596 141.9352 02.0320000655.056828022-3.024827957
18.28218 141.9352 02.2860000135.565524101-3.27952408E
18.1265 141.9352 02.5399999625.374355793-2.834355831
18.19322 141.9352 03.8099999435.455670357-1.645670414
18.63804 141.9352 05.0799999246.024611473-0.944611549
18.37115 141.9352 06.3499999055.677164078 0.672835827
18.34891 141.9352 07.6199998865.649073601 1.970926285
18.41563 141.9352 08.8900003435.733707905 3.15629243S
18.74925 141.9352 010.159999856.175785542 3.984214306
19.66113 141.9352 011.43000031 7.61255312 3.817447186
20.06147 141.9352 012.699999818.386609077 4.313390732
20.63974 141.9352 013.970000279.694396973 4.275603294
21.32921 141.9352 015.2399997711.57620525 3.66379451E
21.95197 141.9352 016.5100002313.61049557 2.899504662
22.53023 141.9352 017.7800006915.79814053 1.981860161
23.08626 141.9352 019.0499992418.10823822 0.941761017
23.46436 141.9352 020.3199996919.71082497 0.60917472E
23.88694 141.9352 021.5900001521.43466568 0.155334472
24.30952 141.9352 022.8600006123.00414848-0.144147872
24.59866 141.9352 024.1299991623.95874214 0.17125701S
24.75434 141.9352 025.3999996224.42818451 0.97181510S
25.15468 141.9352 027.9400005325.48514748 2.45485305E
25.51054 141.9352 030.4799995426.23686409 4.243135452
25.66623 141.9352 033.0200004626.50755692 6.512443542
25.84416 141.9352 035.5600013726.77099228 8.789009094
25.93312 141.9352 038.0999984726.88341141 11.21658707
26.15553 141.9352 040.6399993927.10398865 13.53601074


93.61279 567.7408 0 25.39999962 24.9139061 0.486093521
2.539934 567.7408 10.2540000080.278613478 -0.02461347
4.283636 567.7408 10.5080000160.428214788 0.07978522E
6.489953 567.7408 10.762000024 0.66350466 0.098495364
8.762993 567.7408 11.0160000320.862418592 0.15358144
11.77444 567.7408 11.2699999811.086756349 0.183243632
14.68802 567.7408 11.5240000491.432662249 0.091337E
17.01444 567.7408 11.7779999971.705374241 0.072625756
19.01169 567.7408 12.0320000651.957491279 0.074508786
24.58531 567.7408 12.2860000132.642647505-0.356647491
29.7497 567.7408 12.5399999624.336151123-1.796151161
34.59381 567.7408 13.809999943 4.48984623-0.679846287
40.03398 567.7408 15.0799999246.881983757-1.801983832
45.51864 567.7408 16.349999905 7.12364006-0.773640156
51.19901 567.7408 17.6199998869.603768349-1.983768462
56.92387 567.7408 18.890000343 9.77480793-0.884807587
62.64428 567.7408 110.1599998510.93559837-0.775598526
68.7161 567.7408 111.4300003112.31064129-0.880640984
74.81016 567.7408 112.6999998112.79867458-0.098674774
80.25479 567.7408 113.9700002714.75630856-0.786308289
86.04192 567.7408 115.2399997716.32108116 -1.08108139
91.86909 567.7408 116.5100002322.18501663-5.675016402
97.11799 567.7408 117.78000069 25.2514782-7.471477509
102.4603 567.7408 119.04999924 31.9801445-12.93014526
106.2057 567.7408 120.3199996930.18181419-9.86181449S
108.4254 567.7408 121.5900001523.91671562-2.32671546S
110.2358 567.7408 122.8600006122.94147301-0.081472397
113.8522 567.7408 124.1299991624.17117119-0.04117202E
117.3663 567.7408 125.3999996225.69538498-0.295385361
120.9026 567.7408 127.9400005328.17720795-0.237207412
124.0386 567.7408 130.4799995430.56855202-0.08855247E
127.0412 567.7408 133.0200004633.03781891-0.017818451
129.6656 567.7408 135.5600013735.29885864 0.261142731
132.1789 567.7408 138.0999984737.57992172 0.520076752
134.4919 567.7408 140.6399993940.38449097 0.255508422
135.991 567.7408 143.1800003141.96487427 1.21512603E
137.3788 567.7408 145.72000122 42.9292717 2.790729522
105.3116 567.7408 148.2599983233.03176498 15.22823334
0 141.9352 1 00.081549957-0.081549957
2.664484 141.9352 10.254000008 0.27455768-0.020557672
4.350359 141.9352 10.5080000160.421100914 0.086899102
7.441872 141.9352 10.7620000240.730577767 0.031422257
10.57787 141.9352 11.0160000320.896230161 0.119769871
14.53233 141.9352 11.2699999811.577337503-0.307337522
16.62745 141.9352 11.5240000493.082778215-1.558778167


0 567.7408 1 0 0.100738443 -0.100738443
19.97696 141.9352 12.032000065 6.653576374-4.62157630S
21.15129 141.9352 12.2860000138.434126854-6.148126841
21.51159 141.9352 12.5399999629.177409172 -6.63740921
21.97866 141.9352 13.80999994310.32806778-6.518067837
25.47051 141.9352 15.07999992419.99451447-14.91451454
28.19282 141.9352 16.34999990515.21820545-8.868205547
30.06997 141.9352 17.6199998867.755726337-0.135726452
32.00049 141.9352 18.8900003438.015823364 0.87417697S
33.13034 141.9352 110.159999858.172818184 1.987181664
38.05007 141.9352 111.4300003110.50219727 0.92780304
39.29113 141.9352 112.6999998111.12367535 1.576324462
41.39958 141.9352 113.9700002712.22212219 1.74787807E
42.37374 141.9352 115.2399997712.96298027 2.277019501
44.21976 141.9352 116.5100002315.17603588 1.33396434E
45.52308 141.9352 117.7800006916.81250381 0.967496872
47.02213 141.9352 119.0499992418.86930275 0.180696487
47.9696 141.9352 120.3199996920.12705421 0.1929454E
49.62879 141.9352 121.5900001521.67520523-0.08520507E
48.70356 141.9352 122.8600006120.89399719 1.96600341E
48.74804 141.9352 124.1299991620.93459511 3.195404052
48.91708 141.9352 125.3999996221.08416176 4.31583786
50.21596 141.9352 127.9400005322.22624969 5.71375083S
50.89653 141.9352 130.4799995423.12311935 7.35688018E
52.46231 141.9352 133.0200004626.04013062 6.979869842
53.60995 141.9352 135.56000137 26.7578373 8.80216407E
0 567.7408 2 00.114406541-0.114406541
1.218812 567.7408 20.2540000080.175531358 0.078468651
6.529987 567.7408 20.5080000160.644696653-0.136696637
9.314573 567.7408 20.7620000240.858443141-0.096443117
12.85091 567.7408 21.0160000321.172438741-0.15643870E
15.84011 567.7408 21.2699999811.485689759-0.21568977E
18.1087 567.7408 21.5240000491.734228611-0.210228562
20.3595 567.7408 21.7779999971.926045418-0.148045421
21.80073 567.7408 22.0320000652.123580456-0.091580391
23.50439 567.7408 22.2860000132.385480881-0.099480867
24.69207 567.7408 22.5399999622.481292725 0.058707237
30.32352 567.7408 23.8099999434.007786751-0.19778680E
36.33306 567.7408 25.0799999244.941522121 0.138477802
42.69402 567.7408 26.3499999056.544304848-0.194304942
48.3566 567.7408 27.6199998867.656723499-0.036723614
54.58856 567.7408 28.8900003439.021968842-0.13196849E
60.58476 567.7408 210.15999985 10.2035284-0.043528557
66.44751 567.7408 211.4300003111.51954174-0.08954143E
72.17237 567.7408 212.69999981 12.6736393 0.026360512


17.64164 141.9352 1 1.777999997 4.102640152 -2.324640155
84.18256 567.7408 215.2399997715.55913067-0.31913089E
90.14763 567.7408 216.5100002318.39377975-1.88377952C
96.02373 567.7408 217.7800006921.13720894-3.357208252
101.6374 567.7408 219.0499992425.02436829-5.97436904S
106.2947 567.7408 220.31999969 24.3280201-4.008020401
109.9066 567.7408 221.5900001522.77288246-1.18288230S
111.8549 567.7408 222.8600006123.35476303-0.494762421
113.9011 567.7408 224.1299991624.20061111-0.070611954
115.5781 567.7408 225.3999996224.86204147 0.53795814E
119.835 567.7408 227.9400005327.70519066 0.23480987E
123.8473 567.7408 230.4799995430.48184395-0.001844406
127.4771 567.7408 233.0200004633.21837997-0.198379517
131.8541 567.7408 235.5600013736.86117935-1.301177979
135.1636 567.7408 238.0999984740.42089462-2.32089614S
138.3663 567.7408 240.6399993942.21762466-1.57762527E
0 141.9352 2 00.101856261-0.101856261
0.057827 141.9352 20.2540000080.102890939 0.15110907
4.928628 141.9352 20.5080000160.488759369 0.01924064E
9.683775 141.9352 20.762000024 0.8446154-0.082615376
12.89539 141.9352 21.016000032 1.09699297-0.080992937
16.59186 141.9352 21.2699999812.320744514-1.050744534
20.28833 141.9352 21.52400049 4.7382617-3.214261651
22.28113 141.9352 21.7779999976.704191685-4.92619168E
24.3629 141.9352 22.032000065 10.0276022-7.995602131
25.4883 141.9352 22.28600001311.16633892-8.880338907
25.62175 141.9352 22.53999996211.26454353-8.724543571
29.89204 141.9352 23.8099999436.469919205-2.659919262
33.69527 141.9352 25.0799999247.244553566-2.164553642
36.37754 141.9352 26.3499999057.864371777-1.514371872
39.58916 141.9352 27.61999988610.34260845-2.722608566
43.07656 141.9352 28.89000034312.41287327-3.52287292E
46.69741 141.9352 210.1599998516.99487877-6.834878922
49.71331 141.9352 211.4300003119.98731613-8.557315826
52.1109 141.9352 212.6999998122.20508003-9.505080222
54.85545 141.9352 213.9700002720.60906029 -6.63906002
57.26638 141.9352 215.2399997715.89608288-0.656083107
59.58836 141.9352 216.51000023 16.5824337-0.072433472
61.7146 141.9352 217.7800006917.75958633 0.020414352
63.86754 141.9352 219.0499992418.89602089 0.15397834E
66.56761 141.9352 220.3199996920.41453552-0.09453582E
68.87179 141.9352 221.5900001521.58165741 0.008342742
71.27383 141.9352 222.8600006122.92370033-0.063699722
73.40453 141.9352 224.1299991623.99844933 0.13154983E
75.99339 141.9352 225.3999996225.19221306 0.20778656


78.32871 567.7408 2 13.97000027 14.20945644 -0.239456177
82.26983 141.9352 230.4799995430.77278519-0.292785645
84.17367 141.9352 233.0200004632.84871292 0.171287537
0 774.192 0 0 0 C
17.79288 774.192 02.5399999622.539999962 C
31.13754 774.192 05.0799999245.047780991 0.032218932
44.4822 774.192 07.6199998867.620000362-4.76837E-0O
57.82686 774.192 010.159999859.649963379 0.510036469
68.94741 774.192 012.6999998112.18996525 0.510034561
80.06796 774.192 015.23999977 14.214674 1.02532577E
91.18851 774.192 017.7800006917.74778175 0.032218932
102.3091 774.192 020.3199996920.31954002 0.000459671
111.2055 774.192 022.8600006125.39996529-2.53996467(
115.6537 774.192 025.3999996225.39953995 0.000459671
0 774.192 1 0 0 C
26.68932 774.192 12.5399999622.572219133-0.032219172
37.80987 774.192 15.0799999246.350004673-1.27000474S
55.60275 774.192 17.6199998868.130036354-0.51003646S
66.7233 774.192 110.1599998510.67003345-0.510033607
80.06796 774.192 112.6999998113.72532463-1.025324821
95.63673 774.192 115.23999977 15.2726717-0.03267192E
106.7573 774.192 117.7800006920.32003403 -2.54003334
122.3261 774.192 120.3199996920.32045937-0.000459671
133.4466 774.192 122.8600006122.86000061 C
142.343 774.192 125.3999996225.39999962 C
0 258.064 0 0 0 C
20.01699 258.064 05.0799999245.080000401-4.76837E-07
28.91343 258.064 010.1599998510.09634686 0.063652992
35.58576 258.064 015.2399997716.85069275-1.61069297E
36.4754 258.064 020.3199996917.03648567 3.283514022
37.80987 258.064 025.3999996219.57788277 5.822116852
40.03398 258.064 030.4799995433.52078247-3.04078292E
40.92362 258.064 035.5600013737.51759338 -1.95759201
41.81327 258.064 040.6399993940.62810516 0.011894226
42.70291 258.064 045.7200012243.19643021 2.523571014
43.37015 258.064 050.7999992444.76833344 6.031665802
0 258.064 1 0 0 C
24.46521 258.064 15.0799999245.143631458-0.06363153E
37.80987 258.064 110.15999985 17.1569519-6.996952057
51.15453 258.064 115.23999977 15.3347578-0.094758034
55.60275 258.064 120.3199996920.22537994 0.094619751
66.7233 258.064 125.3999996225.41034508-0.01034545S
72.28358 258.064 130.4799995430.46968079 0.01031875(
80.06796 258.064 135.56000137 35.5606575-0.00065612E
86.74029 258.064 140.6399993940.73394012-0.09394073E


80.0813 141.9352 2 27.94000053 28.02173805 -0.08173752
97.86084 258.064 150.7999992450.79931641 0.000682831
Mean absolute error= 1.86 mm, Max error = 15.23 mm
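
For readers unfamiliar with the GRNN architecture named above, a general regression neural network predicts by taking a Gaussian-kernel-weighted average of the stored training targets (Specht's formulation). The sketch below is only an illustration under assumed settings (the smoothing factor and a handful of example patterns taken loosely from the tables); it is not the network configuration used in this study.

    # Minimal GRNN prediction sketch (illustrative only, not the thesis model).
    # Inputs: (load kN, area of steel mm2, CFRP); output: deflection in mm.
    import numpy as np

    def grnn_predict(x, train_X, train_y, sigma):
        """Gaussian-kernel-weighted average of the training targets."""
        d2 = np.sum((train_X - x) ** 2, axis=1)     # squared distance to each stored pattern
        w = np.exp(-d2 / (2.0 * sigma ** 2))        # kernel weights
        return float(np.sum(w * train_y) / np.sum(w))

    # A few illustrative training patterns; in practice the inputs would be scaled
    # and sigma (the smoothing factor) chosen by calibration against a test set.
    train_X = np.array([[20.6, 567.7, 0.0], [51.9, 567.7, 0.0], [79.3, 567.7, 0.0]])
    train_y = np.array([2.54, 10.16, 16.51])        # actual deflections, mm
    print(grnn_predict(np.array([60.0, 567.7, 0.0]), train_X, train_y, sigma=10.0))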




Simple Backpropagation Network

Load (kN)   Area of steel, mm2 (2 bars)   CFRP   Actual deflection, Actual(1) (mm)   Trained network output, Network(1) (mm)   Error, Act-Net(1) (mm)   Absolute error (mm)
0 567.7408 0 0 0 0 0
3.692023 567.7408 0 0.254000008 0 0.254000008 0.254000008
7.517492 567.7408 0 0.508000016 0.136708513 0.371291503 0.371291503
10.89814 567.7408 0 0.762000024 0.554100096 0.207899928 0.207899928
13.41138 567.7408 0 1.016000032 0.885760069 0.130239964 0.130239964
14.94602 567.7408 0 1.269999981 1.099540353 0.170459628 0.170459628
15.54653 567.7408 0 1.524000049 1.185815215 0.338184834 0.338184834
17.01444 567.7408 0 1.777999997 1.403404474 0.374595523 0.374595523
18.46011 567.7408 0 2.032000065 1.627652049 0.404348016 0.404348016
19.28303 567.7408 0 2.286000013 1.759989858 0.526010156 0.526010156
20.59526 567.7408 0 2.539999962 1.978402972 0.56159699 0.56159699
25.62175 567.7408 0 3.809999943 2.904310226 0.905689716 0.905689716
31.00409 567.7408 0 5.079999924 4.052268028 1.027731895 1.027731895
36.87574 567.7408 0 6.349999905 5.44604826 0.903951645 0.903951645
42.03568 567.7408 0 7.619999886 6.715855598 0.904144287 0.904144287
46.9732 567.7408 0 8.890000343 7.894256115 0.995744228 0.995744228
51.97745 567.7408 0 10.15999985 8.999910355 1.160089493 1.160089493
57.42652 567.7408 0 11.43000031 10.08813 1.341870308 1.341870308
63.63179 567.7408 0 12.69999981 11.23656178 1.463438034 1.463438034
69.2143 567.7408 0 13.97000027 12.3045435 1.665456772 1.665456772
74.352 567.7408 0 15.23999977 13.45160866 1.788391113 1.788391113
79.28952 567.7408 0 16.51000023 14.82118511 1.688815117 1.688815117
84.62739 567.7408 0 17.78000069 16.68751335 1.092487335 1.092487335
88.25268 567.7408 0 19.04999924 18.31244278 0.737556458 0.737556458
90.29887 567.7408 0 20.31999969 19.81343842 0.506561279 0.506561279
91.32196 567.7408 0 21.59000015 21.20120811 0.388792038 0.388792038
91.90023 567.7408 0 22.86000061 22.39188766 0.468112946 0.468112946
92.72315 567.7408 0 24.12999916 24.78429604 -0.654296875 0.654296875
93.61279 567.7408 0 25.39999962 28.07761002 -2.677610397 2.677610397
94.92501 567.7408 0 27.94000053 32.34885025 -4.408849716 4.408849716
96.08155 567.7408 0 30.47999954 34.62709045 -4.147090912 4.147090912


91.18851 258.064 1 45.72000122 45.62606431 0.09393692 0.09393692
99.39548 567.7408 0 35.56000137 38.69404984 -3.134048462 3.134048462
100.7077 567.7408 0 38.09999847 40.56068802 -2.460689545 2.460689545
102.2646 567.7408 0 40.63999939 43.20806503 -2.568065643 2.568065643
103.3544 567.7408 0 43.18000031 45.30474472 -2.124744415 2.124744415
104.3108 567.7408 0 45.72000122 47.35056305 -1.630561829 1.630561829
105.3116 567.7408 0 48.25999832 49.90964508 -1.649646759 1.649646759
0 141.9352 0 0 0.193302155 -0.193302155 0.193302155
2.891343 141.9352 0 0.254000008 1.629522681 -1.375522673 1.375522673
8.318171 141.9352 0 0.508000016 0.660596192 -0.152596176 0.152596176
9.986254 141.9352 0 0.762000024 0.184476256 0.577523768 0.577523768
13.14449 141.9352 0 1.016000032 0.211023986 0.804976046 0.804976046
14.30103 141.9352 0 1.269999981 0.641276181 0.6287238 0.6287238
15.96911 141.9352 0 1.524000049 1.892618299 -0.36861825 0.36861825
17.90409 141.9352 0 1.777999997 4.732728004 -2.954728007 2.954728007
17.8596 141.9352 0 2.032000065 4.648146629 -2.616146564 2.616146564
18.28218 141.9352 0 2.286000013 5.486230373 -3.20023036 3.20023036
18.1265 141.9352 0 2.539999962 5.168624878 -2.628624916 2.628624916
18.19322 141.9352 0 3.809999943 5.303521156 -1.493521214 1.493521214
18.63804 141.9352 0 5.079999924 6.248215675 -1.168215752 1.168215752
18.37115 141.9352 0 6.349999905 5.672150135 0.67784977 0.67784977
18.34891 141.9352 0 7.619999886 5.625362396 1.994637489 1.994637489
18.41563 141.9352 0 8.890000343 5.766269207 3.123731136 3.123731136
18.74925 141.9352 0 10.15999985 6.495925903 3.664073944 3.664073944
19.66113 141.9352 0 11.43000031 8.669061661 2.760938644 2.760938644
20.06147 141.9352 0 12.69999981 9.685228348 3.014771461 3.014771461
20.63974 141.9352 0 13.97000027 11.19730091 2.772699356 2.772699356
21.32921 141.9352 0 15.23999977 13.04705906 2.192940712 2.192940712
21.95197 141.9352 0 16.51000023 14.7531271 1.756873131 1.756873131
22.53023 141.9352 0 17.78000069 16.38246346 1.397537231 1.397537231
23.08626 141.9352 0 19.04999924 18.03722 1.012779236 1.012779236
23.46436 141.9352 0 20.31999969 19.2599659 1.060033798 1.060033798
23.88694 141.9352 0 21.59000015 20.79418564 0.795814514 0.795814514
24.30952 141.9352 0 22.86000061 22.62019539 0.239805222 0.239805222
24.59866 141.9352 0 24.12999916 24.12665367 0.00334549 0.00334549
24.75434 141.9352 0 25.39999962 25.05439949 0.345600128 0.345600128
25.15468 141.9352 0 27.94000053 27.92940712 0.010593414 0.010593414
25.51054 141.9352 0 30.47999954 31.23256874 -0.752569199 0.752569199
25.66623 141.9352 0 33.02000046 32.93581009 0.084190369 0.084190369
25.84416 141.9352 0 35.56000137 35.07962799 0.480373383 0.480373383
25.93312 141.9352 0 38.09999847 36.22628021 1.873718262 1.873718262
26.15553 141.9352 0 40.63999939 39.27550507 1.364494324 1.364494324
0 567.7408 1 0 0.492317379 -0.492317379 0.492317379
2.539934 567.7408 1 0.254000008 0.64180845 -0.387808442 0.387808442
4.283636 567.7408 1 0.508000016 0.739326239 -0.231326222 0.231326222


97.77188 567.7408 0 33.02000046 36.75547409 -3.735473633 3.735473633
8.762993 567.7408 1 1.016000032 0.984877467 0.031122565 0.031122565
11.77444 567.7408 1 1.269999981 1.159057856 0.110942125 0.110942125
14.68802 567.7408 1 1.524000049 1.345639944 0.178360105 0.178360105
17.01444 567.7408 1 1.777999997 1.513237 0.264762998 0.264762998
19.01169 567.7408 1 2.032000065 1.673750162 0.358249903 0.358249903
24.58531 567.7408 1 2.286000013 2.223892212 0.062107801 0.062107801
29.7497 567.7408 1 2.539999962 2.897157907 -0.357157946 0.357157946
34.59381 567.7408 1 3.809999943 3.689193487 0.120806456 0.120806456
40.03398 567.7408 1 5.079999924 4.753669739 0.326330185 0.326330185
45.51864 567.7408 1 6.349999905 5.960764408 0.389235497 0.389235497
51.19901 567.7408 1 7.619999886 7.255840302 0.364159584 0.364159584
56.92387 567.7408 1 8.890000343 8.50428772 0.385712624 0.385712624
62.64428 567.7408 1 10.15999985 9.635409355 0.524590492 0.524590492
68.7161 567.7408 1 11.43000031 10.72402096 0.705979347 0.705979347
74.81016 567.7408 1 12.69999981 11.80426788 0.895731926 0.895731926
80.25479 567.7408 1 13.97000027 12.90162563 1.068374634 1.068374634
86.04192 567.7408 1 15.23999977 14.37173748 0.868262291 0.868262291
91.86909 567.7408 1 16.51000023 16.27722168 0.232778549 0.232778549
97.11799 567.7408 1 17.78000069 18.31809807 -0.538097382 0.538097382
102.4603 567.7408 1 19.04999924 24.35286903 -5.302869797 5.302869797
106.2057 567.7408 1 20.31999969 30.27274323 -9.95274353 9.95274353
108.4254 567.7408 1 21.59000015 31.13757133 -9.547571182 9.547571182
110.2358 567.7408 1 22.86000061 31.68259621 -8.822595596 8.822595596
113.8522 567.7408 1 24.12999916 32.60660553 -8.476606369 8.476606369
117.3663 567.7408 1 25.39999962 33.36376572 -7.963766098 7.963766098
120.9026 567.7408 1 27.94000053 34.25434875 -6.314348221 6.314348221
124.0386 567.7408 1 30.47999954 36.16093445 -5.680934906 5.680934906
127.0412 567.7408 1 33.02000046 38.69126511 -5.671264648 5.671264648
129.6656 567.7408 1 35.56000137 40.08866882 -4.52866745 4.52866745
132.1789 567.7408 1 38.09999847 41.64070129 -3.54070282 3.54070282
134.4919 567.7408 1 40.63999939 43.83740616 -3.197406769 3.197406769
135.991 567.7408 1 43.18000031 45.52642822 -2.346427917 2.346427917
137.3788 567.7408 1 45.72000122 47.01876831 -1.29876709 1.29876709
105.3116 567.7408 1 48.25999832 29.7077713 18.55222702 18.55222702
0 141.9352 1 0 0 0 0
2.664484 141.9352 1 0.254000008 0.066419229 0.187580779 0.187580779
4.350359 141.9352 1 0.508000016 0.252668709 0.255331308 0.255331308
7.441872 141.9352 1 0.762000024 0.652956843 0.109043181 0.109043181
10.57787 141.9352 1 1.016000032 1.149867177 -0.133867145 0.133867145
14.53233 141.9352 1 1.269999981 1.936225057 -0.666225076 0.666225076
16.62745 141.9352 1 1.524000049 2.439715385 -0.915715337 0.915715337
17.64164 141.9352 1 1.777999997 2.708379507 -0.93037951 0.93037951
19.97696 141.9352 1 2.032000065 3.396142244 -1.364142179 1.364142179
21.15129 141.9352 1 2.286000013 3.781824112 -1.495824099 1.495824099


6.489953 567.7408 1 0.762000024 0.859926701 -0.097926676 0.097926676
21.97866 141.9352 1 3.809999943 4.070908546 -0.260908604 0.260908604
25.47051 141.9352 1 5.079999924 5.463754654 -0.38375473 0.38375473
28.19282 141.9352 1 6.349999905 6.759959221 -0.409959316 0.409959316
30.06997 141.9352 1 7.619999886 7.765193462 -0.145193577 0.145193577
32.00049 141.9352 1 8.890000343 8.890574455 -0.000574112 0.000574112
33.13034 141.9352 1 10.15999985 9.589231491 0.570768356 0.570768356
38.05007 141.9352 1 11.43000031 12.9133234 -1.483323097 1.483323097
39.29113 141.9352 1 12.69999981 13.81235123 -1.112351418 1.112351418
41.39958 141.9352 1 13.97000027 15.40355682 -1.433556557 1.433556557
42.37374 141.9352 1 15.23999977 16.18022346 -0.940223694 0.940223694
44.21976 141.9352 1 16.51000023 17.79878998 -1.288789749 1.288789749
45.52308 141.9352 1 17.78000069 19.17110252 -1.391101837 1.391101837
47.02213 141.9352 1 19.04999924 21.22759438 -2.177595139 2.177595139
47.9696 141.9352 1 20.31999969 22.92936516 -2.609365463 2.609365463
49.62879 141.9352 1 21.59000015 26.71951675 -5.129516602 5.129516602
48.70356 141.9352 1 22.86000061 24.49193954 -1.631938934 1.631938934
48.74804 141.9352 1 24.12999916 24.59306908 -0.463069916 0.463069916
48.91708 141.9352 1 25.39999962 24.98345375 0.416545868 0.416545868
50.21596 141.9352 1 27.94000053 28.23765373 -0.297653198 0.297653198
50.89653 141.9352 1 30.47999954 30.06507683 0.414922714 0.414922714
52.46231 141.9352 1 33.02000046 34.55067444 -1.530673981 1.530673981
53.60995 141.9352 1 35.56000137 38.19617844 -2.636177063 2.636177063
0 567.7408 2 0 0.400904119 -0.400904119 0.400904119
1.218812 567.7408 2 0.254000008 0.4150199 -0.161019892 0.161019892
6.529987 567.7408 2 0.508000016 0.549874723 -0.041874707 0.041874707
9.314573 567.7408 2 0.762000024 0.680652797 0.081347227 0.081347227
12.85091 567.7408 2 1.016000032 0.918638468 0.097361565 0.097361565
15.84011 567.7408 2 1.269999981 1.183959126 0.086040854 0.086040854
18.1087 567.7408 2 1.524000049 1.420191526 0.103808522 0.103808522
20.3595 567.7408 2 1.777999997 1.677958012 0.100041986 0.100041986
21.80073 567.7408 2 2.032000065 1.852141738 0.179858327 0.179858327
23.50439 567.7408 2 2.286000013 2.064659834 0.221340179 0.221340179
24.69207 567.7408 2 2.539999962 2.2160604 0.323939562 0.323939562
30.32352 567.7408 2 3.809999943 2.96302247 0.846977472 0.846977472
36.33306 567.7408 2 5.079999924 3.843336105 1.236663818 1.236663818
42.69402 567.7408 2 6.349999905 4.946492195 1.40350771 1.40350771
48.3566 567.7408 2 7.619999886 6.100585938 1.519413948 1.519413948
54.58856 567.7408 2 8.890000343 7.495708942 1.394291401 1.394291401
60.58476 567.7408 2 10.15999985 8.841981888 1.31801796 1.31801796
66.44751 567.7408 2 11.43000031 10.0554285 1.3745718 1.3745718
72.17237 567.7408 2 12.69999981 11.10530949 1.594690323 1.594690323
78.32871 567.7408 2 13.97000027 12.13686752 1.833132744 1.833132744
84.18256 567.7408 2 15.23999977 13.15285397 2.087145805 2.087145805
90.14763 567.7408 2 16.51000023 14.40681267 2.103187561 2.103187561


21.51159 141.9352 1 2.539999962 3.905904293 -1.365904331 1.365904331
101.6374 567.7408 2 19.04999924 17.96095657 1.089042664 1.089042664
106.2947 567.7408 2 20.31999969 19.76120186 0.558797836 0.558797836
109.9066 567.7408 2 21.59000015 22.07649994 -0.486499786 0.486499786
111.8549 567.7408 2 22.86000061 26.3451004 -3.485099792 3.485099792
113.9011 567.7408 2 24.12999916 30.46710014 -6.337100983 6.337100983
115.5781 567.7408 2 25.39999962 31.66956902 -6.269569397 6.269569397
119.835 567.7408 2 27.94000053 33.36264038 -5.422639847 5.422639847
123.8473 567.7408 2 30.47999954 35.31147385 -4.831474304 4.831474304
127.4771 567.7408 2 33.02000046 37.72388458 -4.703884125 4.703884125
131.8541 567.7408 2 35.56000137 40.73505402 -5.175052643 5.175052643
135.1636 567.7408 2 38.09999847 42.30021286 -4.200214386 4.200214386
138.3663 567.7408 2 40.63999939 43.1695137 -2.529514313 2.529514313
0 141.9352 2 0 0.137486309 -0.137486309 0.137486309
0.057827 141.9352 2 0.254000008 0.139902964 0.114097044 0.114097044
4.928628 141.9352 2 0.508000016 0.394391268 0.113608748 0.113608748
9.683775 141.9352 2 0.762000024 0.758991599 0.003008425 0.003008425
12.89539 141.9352 2 1.016000032 1.085956812 -0.069956779 0.069956779
16.59186 141.9352 2 1.269999981 1.558459401 -0.28845942 0.28845942
20.28833 141.9352 2 1.524000049 2.149601221 -0.625601172 0.625601172
22.28113 141.9352 2 1.777999997 2.523323774 -0.745323777 0.745323777
24.3629 141.9352 2 2.032000065 2.95853281 -0.926532745 0.926532745
25.4883 141.9352 2 2.286000013 3.213919401 -0.927919388 0.927919388
25.62175 141.9352 2 2.539999962 3.245172262 -0.7051723 0.7051723
29.89204 141.9352 2 3.809999943 4.359958172 -0.549958229 0.549958229
33.69527 141.9352 2 5.079999924 5.55638361 -0.476383686 0.476383686
36.37754 141.9352 2 6.349999905 6.528000832 -0.178000927 0.178000927
39.58916 141.9352 2 7.619999886 7.841059685 -0.221059799 0.221059799
43.07656 141.9352 2 8.890000343 9.456921577 -0.566921234 0.566921234
46.69741 141.9352 2 10.15999985 11.33368397 -1.17368412 1.17368412
49.71331 141.9352 2 11.43000031 13.027071 -1.597070694 1.597070694
52.1109 141.9352 2 12.69999981 14.43449116 -1.734491348 1.734491348
54.85545 141.9352 2 13.97000027 16.08234978 -2.11234951 2.11234951
57.26638 141.9352 2 15.23999977 17.53337288 -2.293373108 2.293373108
59.58836 141.9352 2 16.51000023 18.90504265 -2.395042419 2.395042419
61.7146 141.9352 2 17.78000069 20.1142807 -2.334280014 2.334280014
63.86754 141.9352 2 19.04999924 21.27074814 -2.220748901 2.220748901
66.56761 141.9352 2 20.31999969 22.59747505 -2.277475357 2.277475357
68.87179 141.9352 2 21.59000015 23.6075592 -2.017559052 2.017559052
71.27383 141.9352 2 22.86000061 24.54394531 -1.683944702 1.683944702
73.40453 141.9352 2 24.12999916 25.30059624 -1.170597076 1.170597076
75.99339 141.9352 2 25.39999962 26.25604439 -0.856044769 0.856044769
80.0813 141.9352 2 27.94000053 29.2156105 -1.27560997 1.27560997
82.26983 141.9352 2 30.47999954 31.49516106 -1.015161514 1.015161514
84.17367 141.9352 2 33.02000046 32.80700684 0.212993622 0.212993622


96.02373 567.7408 2 17.78000069 16.02496147 1.755039215 1.755039215
0 774.192 0 0 0 0 0
17.79288 774.192 0 2.539999962 1.373519421 1.166480541 1.166480541
31.13754 774.192 0 5.079999924 3.093397141 1.986602783 1.986602783
44.4822 774.192 0 7.619999886 5.819315434 1.800684452 1.800684452
57.82686 774.192 0 10.15999985 9.013239861 1.146759987 1.146759987
68.94741 774.192 0 12.69999981 11.09754086 1.602458954 1.602458954
80.06796 774.192 0 15.23999977 12.47515583 2.764843941 2.764843941
91.18851 774.192 0 17.78000069 15.0552597 2.724740982 2.724740982
102.3091 774.192 0 20.31999969 23.69821548 -3.37821579 3.37821579
111.2055 774.192 0 22.86000061 26.39966965 -3.539669037 3.539669037
115.6537 774.192 0 25.39999962 28.10910416 -2.709104538 2.709104538
0 774.192 1 0 0.64108932 -0.64108932 0.64108932
26.68932 774.192 1 2.539999962 1.457342625 1.082657337 1.082657337
37.80987 774.192 1 5.079999924 2.501196384 2.578803539 2.578803539
55.60275 774.192 1 7.619999886 5.716783524 1.903216362 1.903216362
66.7233 774.192 1 10.15999985 8.103355408 2.05664444 2.05664444
80.06796 774.192 1 12.69999981 10.23294353 2.467056274 2.467056274
95.63673 774.192 1 15.23999977 11.80722141 3.432778358 3.432778358
106.7573 774.192 1 17.78000069 21.22229385 -3.442293167 3.442293167
122.3261 774.192 1 20.31999969 25.94849205 -5.628492355 5.628492355
133.4466 774.192 1 22.86000061 30.1002121 -7.240211487 7.240211487
142.343 774.192 1 25.39999962 32.46292496 -7.062925339 7.062925339
0 258.064 0 0 0 0 0
20.01699 258.064 0 5.079999924 5.539708138 -0.459708214 0.459708214
28.91343 258.064 0 10.15999985 9.292630196 0.867369652 0.867369652
35.58576 258.064 0 15.23999977 15.98624706 -0.746247292 0.746247292
36.4754 258.064 0 20.31999969 18.06868744 2.251312256 2.251312256
37.80987 258.064 0 25.39999962 22.40293694 2.997062683 2.997062683
40.03398 258.064 0 30.47999954 32.91650391 -2.436504364 2.436504364
40.92362 258.064 0 35.56000137 37.7833252 -2.223323822 2.223323822
41.81327 258.064 0 40.63999939 42.58349991 -1.943500519 1.943500519
42.70291 258.064 0 45.72000122 46.93615341 -1.216152191 1.216152191
43.37015 258.064 0 50.79999924 49.69796753 1.102031708 1.102031708
0 258.064 1 0 0 0 0
24.46521 258.064 1 5.079999924 4.389547825 0.690452099 0.690452099
37.80987 258.064 1 10.15999985 9.37134552 0.788654327 0.788654327
51.15453 258.064 1 15.23999977 15.42032433 -0.180324554 0.180324554
55.60275 258.064 1 20.31999969 17.75981522 2.560184479 2.560184479
66.7233 258.064 1 25.39999962 23.87393951 1.526060104 1.526060104
72.28358 258.064 1 30.47999954 29.78920364 0.690795898 0.690795898
80.06796 258.064 1 35.56000137 33.51081848 2.049182892 2.049182892
86.74029 258.064 1 40.63999939 40.13523102 0.504768372 0.504768372
91.18851 258.064 1 45.72000122 45.23723602 0.482765198 0.482765198
97.86084 258.064 1 50.79999924 50.79999924 0 0


















Average absolute error = 1.678 mm, Worst (maximum) error = 18.55 mm
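
The Network(1) values tabulated above come from a plain feed-forward network trained by backpropagation on (load, area of steel, CFRP) inputs. The following is only a minimal sketch of that kind of model under assumed choices (one hidden layer of five sigmoid units, the learning rate, the epoch count, and a handful of rows used as training pairs); it is not the thesis implementation or its settings.

    # Minimal backpropagation sketch (illustrative only, not the thesis code):
    # a small feed-forward network mapping (load kN, steel area mm2, CFRP)
    # to a midspan deflection in mm, mirroring the table's input/output columns.
    import numpy as np

    rng = np.random.default_rng(0)

    # A few (input, target) pairs taken loosely from the tabulated data.
    X = np.array([[20.6, 567.7, 0.0], [52.0, 567.7, 0.0], [74.8, 567.7, 1.0], [119.8, 567.7, 2.0]])
    y = np.array([[2.54], [10.16], [12.70], [27.94]])      # deflections, mm

    Xs = X / X.max(axis=0)                                  # crude scaling of each input column
    ys = y / y.max()                                        # scale targets to [0, 1]

    W1 = rng.normal(0.0, 0.5, (3, 5)); b1 = np.zeros(5)     # input -> hidden (5 sigmoid units)
    W2 = rng.normal(0.0, 0.5, (5, 1)); b2 = np.zeros(1)     # hidden -> linear output
    def sig(z): return 1.0 / (1.0 + np.exp(-z))
    lr = 0.5

    for _ in range(5000):                                   # gradient-descent epochs
        h = sig(Xs @ W1 + b1)                               # forward pass
        out = h @ W2 + b2
        err = out - ys                                      # prediction error
        dW2 = h.T @ err / len(Xs); db2 = err.mean(axis=0)
        dh = (err @ W2.T) * h * (1.0 - h)                   # backpropagate through the sigmoid layer
        dW1 = Xs.T @ dh / len(Xs); db1 = dh.mean(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    pred_mm = (sig(Xs @ W1 + b1) @ W2 + b2) * y.max()       # predictions rescaled to mm
    print(np.round(pred_mm, 2))                             # approximations of the four target deflections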


3-Layer Backpropagation using Turboprop


Actual deflection, Actual(1) (mm)   Predicted deflection, Network(1) (mm)   Error, Act-Net(1) (mm)
0 6.15631485 -6.15631485
0.254000008 1.138810515 -0.88481051
0.508000016 0 0.508000016
0.762000024 0 0.762000024
1.016000032 0 1.016000032
1.269999981 0 1.269999981
1.524000049 0.078504875 1.445495173
1.777999997 0.336136997 1.441863
2.032000065 0.664544761 1.367455304
2.286000013 0.885246754 1.40075326
2.539999962 1.287467003 1.252532959
3.809999943 3.329458952 0.480540991
5.079999924 5.934710979 -0.85471106
6.349999905 8.179730415 -1.82973051
7.619999886 9.228502274 -1.60850239
8.890000343 9.697112083 -0.80711174
10.15999985 9.912654877 0.247344971
11.43000031 10.02551651 1.404483795
12.69999981 10.24702358 2.452976227
13.97000027 11.88910389 2.080896378
15.23999977 17.156353 -1.91635323
16.51000023 20.20324898 -3.69324875
17.78000069 20.74677086 -2.96677017
19.04999924 20.79724503 -1.74724579
20.31999969 21.00123024 -0.68123055
21.59000015 23.15250587 -1.56250572
22.86000061 24.42451286 -1.56451225
24.12999916 24.79109573 -0.66109657




















27.94000053 24.82291412 3.117086411
30.47999954 24.82328224 5.6567173
33.02000046 24.8235817 8.196418762
35.56000137 24.82374573 10.73625565
38.09999847 24.82383347 13.27616501
40.63999939 24.82391357 15.81608582
43.18000031 24.82395554 18.35604477
45.72000122 24.82399559 20.89600563
48.25999832 24.82403374 23.43596458
0 0 0
0.254000008 0 0.254000008
0.508000016 0 0.508000016
0.762000024 0 0.762000024
1.016000032 0 1.016000032
1.269999981 0 1.269999981
1.524000049 0 1.524000049
1.777999997 2.890006304 -1.11200631
2.032000065 2.763377905 -0.73137784
2.286000013 4.090994358 -1.80499434
2.539999962 3.569220781 -1.02922082
3.809999943 3.788141966 0.021857977
5.079999924 5.424072742 -0.34407282
6.349999905 4.406261921 1.943737984
7.619999886 4.326285839 3.293714046
8.890000343 4.568499088 4.321501255
10.15999985 5.877993584 4.282006264
11.43000031 10.02998829 1.400012016
12.69999981 11.91524601 0.784753799
13.97000027 14.50101089 -0.53101063
15.23999977 17.27939606 -2.03939629
16.51000023 19.52147102 -3.01147079
17.78000069 21.41847229 -3.6384716
19.04999924 23.09936333 -4.04936409
20.31999969 24.16726685 -3.84726715
21.59000015 25.2894249 -3.69942474
22.86000061 26.3352108 -3.47521019
24.12999916 27.00612831 -2.87612915
25.39999962 27.35225677 -1.95225716
27.94000053 28.19364738 -0.25364685
30.47999954 28.88322067 1.59677887
33.02000046 29.16791534 3.852085114
35.56000137 29.48086357 6.079137802
38.09999847 29.63244057 8.467557907
40.63999939 29.9973526 10.64264679


25.39999962 24.82056808 0.579431534
0.254000008 0.075299069 0.178700939
0.508000016 0.845724881 -0.33772486
0.762000024 1.829347134 -1.06734711
1.016000032 1.447966456 -0.43196642
1.269999981 0.22117649 1.048823491
1.524000049 1.993330598 -0.46933055
1.777999997 3.042953968 -1.26495397
2.032000065 3.590646982 -1.55864692
2.286000013 5.763460636 -3.47746062
2.539999962 8.10219574 -5.56219578
3.809999943 8.808190346 -4.9981904
5.079999924 9.797991753 -4.71799183
6.349999905 21.17933655 -14.8293366
7.619999886 21.61666489 -13.996665
8.890000343 21.98265266 -13.0926523
10.15999985 22.7983799 -12.6383801
11.43000031 23.94981575 -12.5198154
12.69999981 24.58876419 -11.8887644
13.97000027 24.78120232 -10.811202
15.23999977 24.86640167 -9.6264019
16.51000023 24.95792007 -8.44791985
17.78000069 25.11673546 -7.33673477
19.04999924 25.44863701 -6.39863777
20.31999969 25.8666172 -5.54661751
21.59000015 26.22137642 -4.63137627
22.86000061 26.58433914 -3.72433853
24.12999916 27.53830719 -3.40830803
25.39999962 28.76669502 -3.3666954
27.94000053 30.23113441 -2.29113388
30.47999954 31.58755875 -1.1075592
33.02000046 32.80170441 0.218296051
35.56000137 33.71876907 1.8412323
38.09999847 34.44133759 3.658660889
40.63999939 34.97133636 5.668663025
43.18000031 35.25140762 7.928592682
45.72000122 35.47114563 10.24885559
48.25999832 25.74827576 22.51172256
0 0 0
0.254000008 0 0.254000008
0.508000016 0.368255615 0.139744401
0.762000024 1.529534221 -0.7675342
1.016000032 3.074371338 -2.05837131
1.269999981 5.052273273 -3.78227329
1.524000049 2.004262209 -0.48026216




















2.032000065 3.036234617 -1.00423455
2.286000013 3.635456562 -1.34945655
2.539999962 3.818439007 -1.27843904
3.809999943 4.05442524 -0.2444253
5.079999924 5.760643005 -0.68064308
6.349999905 7.090192318 -0.74019241
7.619999886 8.164099693 -0.54409981
8.890000343 9.616269112 -0.72626877
10.15999985 10.69600868 -0.53600883
11.43000031 15.97401619 -4.54401588
12.69999981 16.87057304 -4.17057323
13.97000027 17.86073303 -3.89073277
15.23999977 18.15473747 -2.9147377
16.51000023 18.57078743 -2.0607872
17.78000069 18.9006443 -1.12064362
19.04999924 19.7933712 -0.74337196
20.31999969 21.21567726 -0.89567757
21.59000015 26.49410057 -4.90410042
22.86000061 23.14398766 -0.28398705
24.12999916 23.28608322 0.843915939
25.39999962 23.84941101 1.550588608
27.94000053 28.6625843 -0.72258377
30.47999954 30.69244957 -0.21245003
33.02000046 33.05683136 -0.0368309
35.56000137 33.77766037 1.782341003
0 0 0
0.254000008 0 0.254000008
0.508000016 0 0.508000016
0.762000024 0 0.762000024
1.016000032 2.056507111 -1.04050708
1.269999981 3.721345186 -2.45134521
1.524000049 2.526202202 -1.00220215
1.777999997 2.921381235 -1.14338124
2.032000065 3.174379587 -1.14237952
2.286000013 3.43252182 -1.14652181
2.539999962 3.588263512 -1.04826355
3.809999943 4.115826607 -0.30582666
5.079999924 4.577964306 0.502035618
6.349999905 6.297617435 0.052382469
7.619999886 7.04801321 0.571986675
8.890000343 8.014572144 0.8754282
10.15999985 9.621442795 0.538557053
11.43000031 11.57218933 -0.14218903
12.69999981 12.80493164 -0.10493183


1.777999997 1.94941628 -0.17141628