
USE OF INFORMATION TECHNOLOGY BY COUNTY EXTENSION AGENTS OF THE FLORIDA COOPERATIVE EXTENSION SERVICE

By

JON AUSTIN GREGG

A THESIS PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE

UNIVERSITY OF FLORIDA

2002


ACKNOWLEDGMENTS

I would like to express my appreciation to Dr. Tracy Irani, who graciously assumed the chair of my thesis committee. Dr. Irani's patience, insightful guidance, and encouragement have been exemplary. I am most grateful for all of her efforts on my behalf. My two other committee members, Dr. Rickie Rudd and Dr. William Summerhill, each provided me with unique insight and excellent recommendations. Their contributions to this study are significant, and I acknowledge them with my sincere appreciation.

Dr. Christine Waddill and Dr. Larry Arrington of the Florida Cooperative Extension Service were instrumental to this study's success, and I am most indebted to both of them. My deepest gratitude is reserved for all the county Extension agents who so generously gave their time to complete the survey that I sent to them. I am overwhelmed by their response, and hope that the results of this study will help them in a manner similar to the way they helped me.

It is appropriate that I express my appreciation to my original committee chair, Dr. Tracy Hoover. Under her guidance I began to write the thesis that you now hold in your hands or observe via the Internet. I would like to thank Dr. Hoover for her efforts. I must also acknowledge the two friends who finally succeeded in getting me involved with graduate school. Dr. Marshall Breeze and Dr. Mathew Baker played big roles in this effort, and to both of them I extend my heartfelt appreciation. I would also like to take this opportunity to thank the AEC department chairman, Dr. Ed Osborne, for his unfailing support and wisdom. Dr. Osborne often seemed to know what was best for me when perhaps I did not. In addition, Sid Sachs and Karen Navitsky were especially helpful to me in the completion of my study. I am humbled by the patience and generosity that they both have shown me over the duration of this endeavor.


TABLE OF CONTENTS

ACKNOWLEDGMENTS

ABSTRACT

CHAPTERS

1 INTRODUCTION
    Background
    The Florida Cooperative Extension Service
    Information Technology and the Cooperative Extension Service
    Need for the Study
    Definition of Terms
    Limitations of the Study
    Assumptions
    Significance of the Study
    Organization of the Remainder of the Chapters

2 REVIEW OF THE LITERATURE
    Introduction
    Part 1 - A Historical Perspective of the Computer's Technical Development
        A Look at Related Innovation from Ancient Times to 1950
        The War Years: Calculating Needs Drive Innovation
        Re-invention, Public Attention, and Diffusion: The 1950s
        The 1960s: Refinement, More Innovation, and More Adoption
        Towards the Personal Computer
        Personal Computing: Technology Fuses with Latent Desire
        Software Diminishes Complexity, Enhances Compatibility
    Part 2 - The Application and Use of Information Technology
        General Use of Information Technology in Today's Society
        Information Technology in Primary and Secondary Education
        Information Technology in Higher Education
        Information Technology and the Cooperative Extension Service
        Information Technology and the Florida Cooperative Extension Service
    Part 3 - Theoretical Aegis of the Study
        Training Needs
        Determining Training Needs
    Summary

3 METHODOLOGY
    Introduction
    Research Design
    Population
    Instrumentation
    Data Collection
    On Web-based Surveys
    How This Study Addressed Sources of Error
    Data Analysis

4 RESULTS
    Objective 1 - Describe County Extension Agents' Demographic Characteristics and, Based on Those Characteristics, Determine Their Use of Information Technology, Including Self-Assessed Level of Overall Computer Skills
        A General Description of the Respondents
        Comparing Response Groups
        Use of Information Technology and Self-Assessed Level of Overall Computer Skills
        An Examination of the Non-respondents
        Self-rated Computer Skills and Demographics
        Computer Usage and Demographics
        Source of Computer Knowledge and Demographics
        Demographic Snapshots by Age
            Age group 20-30
            Age group 31-40
            Age group 41-50
            Age group 51-60
            Age group 61-70
    Objective 2 - Determine How County Extension Agents Are Using Information Technology on the Job in Terms of Hardware Use, and the Nature and Frequency of Use of Specific Types of Software
        Connectivity, Hardware and Operating System Use, etc.
        Patterns of Use of Electronic Mail
        Patterns of Use of Word Processing Software
        Patterns of Use of Spreadsheet Software
        Patterns of Use of Presentation Software
        Patterns of Use of the World Wide Web
        Patterns of Use of the Web Page Editing/Creation Activity
    Objective 3 - Determine County Extension Agents' Perceived Level of Skill with Regard to a Specific Set of Information Technology Tasks
    Objective 4 - As a Means to Recommend Future Information Technology Training, Describe the Relationship Between Agents' Perceived Importance of, and Self-Assessed Knowledge About, Each of a Specific Set of Information Technology Skills
    Summary

5 SUMMARY, CONCLUSIONS AND RECOMMENDATIONS
    Summary
    Procedure
    Limitations of the Study
    Key Findings and Implications
    Discussion
    Discussion of the Methodology
    Conclusions
    Recommendations

APPENDIX

A SURVEY OF COMPUTER TECHNOLOGY SKILLS AND TRAINING NEEDS
B THE SURVEY OF COMPUTER TECHNOLOGY SKILLS INSTRUMENT, PAPER VERSION
C PRINCIPLES FOR THE DESIGN OF WEB SURVEYS AND THEIR RELATIONSHIP TO TRADITIONAL SOURCES OF SURVEY ERROR (DILLMAN & BOWKER, 2001)
D SCALES AND THE VALUES THEY REPRESENTED
E E-MAIL FROM RESEARCHERS TO PILOT POPULATION INTRODUCING THE STUDY
F E-MAIL FROM RESEARCHERS TO PILOT POPULATION REMINDING THEM TO PARTICIPATE
G E-MAIL FROM RESEARCHERS TO PILOT POPULATION THANKING THEM FOR THEIR PARTICIPATION, AND INFORMING THEM THAT THEY WERE PART OF A PILOT STUDY
H E-MAIL FROM THE DEAN OF EXTENSION INTRODUCING THE STUDY
I E-MAIL FROM THE RESEARCHERS TO THE POPULATION INTRODUCING THE STUDY AND ASKING FOR PARTICIPATION
J E-MAIL FROM RESEARCHERS TO THE POPULATION REMINDING THEM TO PARTICIPATE (FIRST REMINDER MESSAGE)
K E-MAIL FROM RESEARCHERS TO THE POPULATION REMINDING THEM TO PARTICIPATE (SECOND REMINDER MESSAGE)
L E-MAIL FROM RESEARCHERS TO THE POPULATION REMINDING THEM TO PARTICIPATE (NONRESPONSE MESSAGE)
M E-MAIL FROM THE DISTRICT EXTENSION DIRECTOR OF THE NORTH CENTRAL DISTRICT TO HIS AGENTS ASKING FOR THEIR PARTICIPATION IN THE STUDY
N LETTER FROM RESEARCHERS TO AGENTS WHO HAD NOT RESPONDED TO THE ON-LINE SURVEY
O LETTER FROM RESEARCHERS TO AGENTS WHO HAD NOT RESPONDED TO THE ON-LINE SURVEY REMINDING THEM TO PARTICIPATE

REFERENCES

BIOGRAPHICAL SKETCH


Abstract of Thesis Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Master of Science

USE OF INFORMATION TECHNOLOGY BY COUNTY EXTENSION AGENTS OF THE FLORIDA COOPERATIVE EXTENSION SERVICE

By

Jon Austin Gregg

December 2002

Chairperson: Tracy A. Irani
Major Department: Agricultural Education and Communication

The purpose of this study was to examine the use of information technology (IT) amongst county Extension agents of the University of Florida's Florida Cooperative Extension Service. Four objectives delineated the research: describe county Extension agents' demographics, and their use of IT vis-à-vis those demographics; determine how county Extension agents are using hardware and software on the job; determine county Extension agents' perceived level of skill with regard to a specific set of IT tasks; and, lastly, recommend future IT training by describing the relationship between agents' perceived importance of, and self-assessed knowledge about, specific IT skills.

The entire population of 331 county Extension agents was considered for this study. A mixed-mode methodology, which employed an electronic survey instrument and a traditional paper survey instrument, was used to collect the data. Agents were given three weeks to complete the electronic version of the survey; those who had not responded were thereafter mailed the paper version. Respondents were subsequently categorized according to which methodology they chose to use. Additional categorization was performed on the electronic respondents according to when they submitted a completed survey. Of the 331 individuals in the population, 278 responded electronically and 21 via paper, for an overall response rate of 90.3%.

Information collected by the survey was subjected to a battery of statistical analyses. Summary statistics and ANOVA were used to compare and contrast patterns of IT use among agents of different gender, age, area of programmatic concentration, and response category. A weighting formula, based on a series of questions asked within the survey, was employed to derive agents' future training needs.

County Extension agents painted a picture of an information-technology-savvy organization accommodating its clientele through Web sites, e-mail, and other sophisticated forms of information delivery. This current state of affairs is contrasted with findings from a similar study conducted ten years ago on Florida county Extension agents' use of information technology. Key findings, associated implications, and recommendations for future research are then offered from the study at hand.
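For the reader checking the arithmetic, the overall response rate follows directly from the figures reported above:

\[
\frac{278 + 21}{331} = \frac{299}{331} \approx 0.903 = 90.3\%.
\]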


CHAPTER 1
INTRODUCTION

Background

The Cooperative Extension Service is a public, non-formal education system established by the Smith-Lever Act of 1914. Charged by Congress to diffuse "useful and practical information on subjects relating to agriculture and home economics" among the people of the United States, Extension evolved from the farmers' institutes of the late 1800s and early 1900s. The organization was originally designed as a partnership of the land-grant universities and the U.S. Department of Agriculture, but provisions of the Smith-Lever Act enabled a third legal partner, the counties of the states, to be included in the venture. Each partner, though having considerable independence in staffing, funding, and developing programs, nevertheless contributes functions essential to the whole system (Rasmussen, 1989). Extension in the United States and its protectorates is believed to be the largest such organization in the world, utilizing the resources of 67 land-grant universities, certain community colleges, and thousands of county agents (Seevers, Graham, Gamon, & Conklin, 1997).

An administrator appointed by the secretary of agriculture leads the federal Extension partner. This individual reports to the undersecretary of science and education, and strives to accomplish Extension's mission "to assure an effective nationwide Cooperative Extension Service that is responsive to priority needs and the federal interests and policies with quality information, education, and problem-solving programs" (Rasmussen, 1989, p. 5). Over the years Extension has responded well to "federal interests." During both world wars the organization spurred increases in agricultural production and engaged in service functions such as soliciting for Liberty Bonds and serving on local draft boards. During the Depression, Extension participated in many New Deal programs including the Farm Credit Administration, the Rural Electrification Administration, the Tennessee Valley Authority, and the Soil Conservation Service (Rasmussen, 1989). Today the federal partner directs special attention and funding to the state partners through "National Initiatives" in such areas as water quality, food safety, and workforce preparation (Cooperative State Research, Education, and Extension Service [CSREES], 2001).

The state Extension partners are located at land-grant universities, and are headed by a director or dean selected by the university with the concurrence of the secretary of agriculture (Rasmussen, 1989). An annual plan of work is submitted by the state Extension director for approval by the federal secretary of agriculture. The state partner is also responsible for the administrative oversight of the county partner. Individuals at the university or research center level who conduct research or who specialize in disseminating research-based information are called "state Extension specialists." Most state specialists are members of an academic department associated with the sponsoring land-grant institution, and are available to county Extension agents to help apply university-based research to solve local problems.

It is primarily at the county level, through the county Extension agent, that the university meets the people. Described variously as an "Extension educator, change agent, teacher, or social activist," the county agent "serves as an educational broker for the community" (Seevers et al., 1997, p. 52). "County Extension agents constantly live amid and encourage change in people and their surroundings" (Rasmussen, 1989, p. 7). They provide leadership and expertise, and extend knowledge needed to solve local problems (Seevers et al., 1997). The county Extension agent participates in a storied profession of dedication, long hours, and of gaining the trust of people in order to help them improve their lives through education based on scientific knowledge (Rasmussen, 1989).

The Florida Cooperative Extension Service

Extension work in Florida effectively began in 1909 with a $7,500-a-year appropriation from the Florida State Legislature. This legislative action enabled federal authorities to send Florida its first state demonstration agent, A.S. Meharg, who developed a successful program before his resignation in 1913. Extension proper began on May 25, 1915, when Florida accepted the provisions of the federal Smith-Lever Act. Peter H. Rolfs was its first director (Cooper, 1976).

Today the Florida Cooperative Extension Service operates as part of the University of Florida's Institute of Food and Agricultural Sciences, and has a presence in each of the state's 67 counties. The organization conducts educational programming in areas such as agriculture, food safety, energy conservation, family economics, and youth development (University of Florida, Institute of Food and Agricultural Sciences, 2001).

Information Technology and the Cooperative Extension Service

In the early 1920s the Cooperative Extension Service adopted two new innovations, the radio and the telephone, to keep rural people informed of Extension activities and, with radio, to deliver educational programming (Rasmussen, 1989). The next significant innovation in electronic communications, television, was also used by the organization to work with clientele (Rasmussen, 1989). When the personal computer began its widespread diffusion in the early 1980s, Extension, along with the rest of the world, was introduced to a new technology that would quickly evolve into a revolutionary means of communication. During the early days of the personal computer's diffusion, Cantrell (1982) reported that Extension educators, lacking computer competencies, were in jeopardy of becoming less computer literate than their clientele, thus evidencing a slowness by agents to adopt the innovation. Ten years later Ruppert (1992) stated, "Extension educators cannot escape the computer revolution and will be challenged in their roles with the responsibility of helping people understand and make the best use of such technology" (p. 4). Eight years thereafter, and after monumental technological progress in personal computing, Albright (2000) stated that knowledge had become the central focus of the global economy, and that a transition to "incorporate the technology required for the dissemination of knowledge" (p. 11) is nowhere more important than within organizations that have always produced knowledge (i.e., Extension). Furthermore, Albright states that the organization's leadership must "consider societal, global, and demographic changes and effectively embrace information technology as an impetus to further the mission of CES" (Albright, 2000, p. 16).

The capability, then, for Extension agents to learn and to apply the use of computers, software, and associated peripheral devices (collectively, information technology) for purposes of serving clientele and in support of Extension's administrative infrastructure has become an essential job-related skill. Albright (2000), addressing the need for organizations to "adapt to the technology explosion" (p. 3), states: "It is critical that Extension re-invest in employees and train them in the necessary skills to remain competitive and serve a dynamic community" (p. 4). Martin (2001) echoes this: "With more clients using computers to obtain information, it will be critical for agents and other field staff to gain the computer skills necessary to use computers as a means for gaining greater efficiency in obtaining and sharing educational information" (p. 1).

To measure the ability of Extension professionals to use information technology, Albright, in her 2000 study of Texas county Extension agents, asked agents to self-rate their computer skills in eight areas ranging from word processing to the use of peripheral devices. Albright also sought to determine future training needs by asking agents to rank each of the eight specific information technology skill areas according to the importance the agent ascribed to the skill area, the agent's knowledge of the skill area, and the agent's ability to apply the skill area to their job. These three constructs were operationally defined in the following manner: "importance" described the importance of a particular skill to an agent's job function; "knowledge" measured the ability to accurately recall or summarize information associated with a skill; and "application" measured the ability of the agent to use specific skills on the job. Albright found that the general population of Extension agents indicated that their strongest skills were in word processing, e-mail, Internet use, and file management, respectively (Albright, 2000). Older agents in the study self-reported fewer information technology skills than younger ones, and indicated that their primary source of IT knowledge stemmed from on-the-job training. Younger agents were found to be "more self-directed in their technology learning" (Albright, 2000, p. 94). Both younger and older agents reported having participated in little IT training within the two years prior to the study (Albright, 2000). Usage of the Internet was seen by agents as being a "very critical" means of program delivery, and, by younger agents in the survey, a potential means to receive training (Albright, 2000, p. 95).
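Albright's three constructs map naturally onto discrepancy-based needs assessment. As an illustrative sketch only (a Borich-type mean weighted discrepancy score; neither Albright's exact formula nor the weighting formula of the present study, which Chapter 3 describes), a training-need score for skill area k could be computed as

\[
\mathrm{MWDS}_k = \bar{I}_k \cdot \frac{1}{n} \sum_{i=1}^{n} \left( I_{ik} - K_{ik} \right)
\]

where \(I_{ik}\) is agent \(i\)'s importance rating of skill area \(k\), \(K_{ik}\) is that agent's self-assessed knowledge rating, \(\bar{I}_k\) is the mean importance rating, and \(n\) is the number of agents. Larger values flag skill areas where perceived importance most exceeds self-assessed knowledge, and hence stronger training needs.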


Based on her research, Albright concluded that the agents in the general population of Texas Extension agents studied needed training in the following areas: Web page development, peripheral device management, presentation software, file management, e-mail, word processing, Internet, and spreadsheets. She also found that agents who had taught themselves computer skills self-reported lower computer skills ability. Different employees thus have different training needs, and should learn skills commensurate with their current level of expertise: "The literature supports that it is counterproductive to design one training plan for all agents and expect learning to occur" (Albright, 2000, p. 106). Albright also states that developing a set of "specific skill standards or competencies" would provide "benchmarks" for employees to meet when developing their computer skills (Albright, 2000, p. 104). This would establish expectations for specific levels of employee computer competencies, with the implication that training needs could be differentiated and addressed.

Need for the Study

The last systematic study of general computer use amongst county Extension agents of the Florida Cooperative Extension Service (FLCES) was conducted by Kathleen C. Ruppert in 1992. Her objective was to "determine whether county extension agents use computers, to what extent they use computers, and what factors may be inhibiting or encouraging their use" (Ruppert, 1992, p. vi). The factors "inhibiting or encouraging" computer use "were operationally defined as subscales of the Computing Concerns Questionnaire (CCQ)"[1] which, along with a battery of questions about "personal and situational factors," was used in a census study of FLCES county agents (Ruppert, 1992, p. vi). Ninety-four percent of the population responded. After subjecting her data to a battery of statistical procedures, Ruppert found that "almost half" of the agents had a computer on their desk, and that one-third of them made use of a computer at home. Of computer skills, agents "were most adept at computer word processing," followed by VAX (computer network), databases, the IFAS CD-ROM, spreadsheets, and computer graphics (Ruppert, 1992, p. 101). Agents associated with the Agriculture, 4-H, and Marine program areas had "significantly higher" computer use mean scores[2] than agents in other areas (Ruppert, 1992, p. 101).

Of the eight subscales of the Computing Concerns Questionnaire, those concerns which focused "either on the individual or the client and how the agents interact with the computer and how their computer work affects their clientele" were found to be statistically significant (Ruppert, 1992, p. 102). Ruppert also found that "age, program area, typing, computer training, and computer resource contact were all significant demographic and situational independent variables that affected the overall computer use mean score of county agents" (Ruppert, 1992, p. 102).

[1] The 32-item Computing Concerns Questionnaire was developed and verified by Martin, and based on the Concerns Based Adoption Model developed by Hall and colleagues as a means to identify and explain the discrete stages of concern that individuals progress through (and express) as they adopt an innovation (Ruppert, 1992).

[2] This mean is computed from a linear model relating computer use to agents' "personal and situational factors." A model was also constructed using the responses from the Computing Concerns Questionnaire.
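Comparisons of this kind (computer-use scores across program areas, age groups, and so on) are also what the present study's analyses rely on; as the abstract notes, summary statistics and ANOVA were used. The following is a minimal sketch of such a one-way ANOVA, assuming hypothetical data: the column names, program areas, and scores are illustrative placeholders, not the actual dataset of either study.

    # One-way ANOVA comparing computer-use scores across program areas.
    # All column names and values below are hypothetical placeholders.
    import pandas as pd
    from scipy import stats

    df = pd.DataFrame({
        "program_area": ["Agriculture", "4-H", "Marine", "Home Economics",
                         "Agriculture", "4-H", "Marine", "Home Economics"],
        "use_score": [4.2, 3.9, 4.1, 3.1, 4.5, 3.6, 4.3, 2.8],
    })

    # Collect each program area's scores and run the F-test.
    groups = [g["use_score"].to_numpy() for _, g in df.groupby("program_area")]
    f_stat, p_value = stats.f_oneway(*groups)
    print(f"F = {f_stat:.2f}, p = {p_value:.3f}")

A significant F statistic would indicate that at least one program area's mean use score differs from the others, which is the form of the program-area differences Ruppert reported.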


Since the Ruppert study, many technological advances have occurred, including faster machines, widespread connectivity to the Internet, use of graphical user interfaces in software, and use of the World Wide Web to retrieve and disseminate information. These developments have, over the past 10 years, changed the nature of workplace computer use among FLCES county Extension faculty. The following questions thus arise: Have county agents kept abreast of the manifold technological changes of the past 10 years? Are they utilizing the Web for information to fulfill client needs? Are they disseminating information to clientele through Web sites or e-mail? Are agents using e-mail to exchange information, and can they attach a file to such messages? And finally, to what degree of sophistication do agents use everyday office software products such as word processors or spreadsheets?

There was a need, then, to investigate the current use of information technology, level of information technology skills, and the workplace application of modern information technology among county Extension agents of the Florida Cooperative Extension Service. Ultimately, Extension administrative entities, and other parties interested in this issue, will be provided with objective, research-derived information that should yield an understanding of county agents' IT use, and consequently enable the specific IT training needs of county faculty to be addressed.

The objectives of this study were therefore to

1. Describe county Extension agents' demographic characteristics and, based on those characteristics, determine their use of information technology, including self-assessed level of overall computer skills.

2. Determine how county Extension agents are using information technology on the job in terms of hardware and software use.

3. Determine county Extension agents' perceived level of skill with regard to a specific set of information technology tasks.

4. Recommend future information technology training by describing the relationship between agents' perceived importance of, and self-assessed knowledge about, specific information technology skills.

Definition of Terms

For purposes of this study, the following terms are defined:

1. "Information technology" refers to computers, computer software, and peripheral devices connected to computers such as modems, scanners, Ethernet, digital television, etc.

2. "Office-type software products" refers to software that performs such tasks as word processing, spreadsheets, browsing the World Wide Web, electronic mail, etc.

3. The FLCES is the Florida Cooperative Extension Service.

Limitations of the Study

A census of the population of county Extension agents of the FLCES was conducted using an instrument accessible via the World Wide Web. Those agents not responding to the on-line instrument after three weeks' duration were sent a traditional paper instrument. Due to the nature of a census study, the specific IT infrastructure in place within the FLCES, and the specific IT knowledge and skills that might be possessed by FLCES county agents, the findings of the study cannot be generalized to Extension organizations elsewhere, though they are likely to offer insight to those organizations.

Assumptions

It was assumed that the county agents of the Florida Cooperative Extension Service who responded to the survey did so with truthfulness and honesty.

Significance of the Study

The level of skills and workplace application of information technology by county Extension agents of the FLCES is presently under-researched. This situation is significant to Extension because the ability to effectively use IT in a current, up-to-date manner bears directly on the organization's operational effectiveness in two fundamental areas: its internal functioning, and its mission to serve the needs of its clientele. Studying patterns of county Extension agents' IT skills will help paint a clearer picture of the strengths and weaknesses faced by the organization in this important area. Recommendations addressing specific needs for agent computer training will follow from the findings of this study. Such information may be important to FLCES administrators, particularly in light of the potential for enhanced organizational efficiencies.

Organization of the Remainder of the Chapters

This thesis is presented in five chapters. Chapter 1 introduces the study, and proceeds to Chapter 2, a review of the literature. Chapter 3 discusses the study's methodology, including research design and procedures followed. A detailed report on the data collected is provided in Chapter 4, and Chapter 5 offers a summary of the study, conclusions, and recommendations based on the study's results.


CHAPTER 2
REVIEW OF THE LITERATURE

Introduction

This review of literature encompasses three parts. Part 1 engages in a broad historical perspective of the development of computers that has led to today's modern information technology. The intent of the section is to impart to the reader a distinct feel for the profound impact that the computer and its peripheral devices have had on societies throughout the world. Part 2 discusses the application and use of information technology, establishing why its use is an advantageous, if not necessary, skill in this day and age. Shown here is the extent of information technology's penetration into the workplace, school, and home. Both the present and potential impact of information technology on higher education, with considerations specific to Extension, is discussed. Part 3 establishes the theoretical aegis under which this study functions.

Part 1 - A Historical Perspective of the Computer's Technical Development

A Look at Related Innovation from Ancient Times to 1950

Today's computer is a fusion of innovations, having evolved from many and varied calculating devices, some dating to antiquity. Perhaps the modern computer's most distant progenitor is the abacus, a counting device composed of beads strung on rods. The abacus diffused widely among the merchants of ancient Asia and is still used in parts of the world today (Ross, 1986).


The next significant innovation in calculating devices occurred in the 1640s, when nineteen-year-old French mathematician Blaise Pascal invented a gear-driven machine that could add and subtract. Some thirty years later the German mathematician Gottfried Wilhelm von Leibniz extended the capacity of Pascal's machine to include multiplication and division. Another widely used calculating device, the slide rule, also stems from this era (Ross, 1986).

In the 1830s an Englishman named Charles Babbage theorized an "analytical engine" which foretold of modern computers. The "engine" incorporated a programming component based on Joseph Jacquard's system of using punched cards to operate weaving looms in a prescribed manner. This innovative feature represented a distinct break from processing immediate input, and can be seen as the progenitor of modern, stored computer programs (Ross, 1986).

American engineer Herman Hollerith ushered in the next significant step towards modern computers. Responding to a competition held by the U.S. Census Bureau to find the best means to tabulate the 1890 census, Hollerith invented an electromechanical tabulating machine that successfully automated the census counting process. Based on punched cards, the machine more than halved the time it took to tabulate the previous (1880) census, saving the government an estimated five million dollars. In 1896 Hollerith founded the Tabulating Machine Company that, for the better part of the next century, would play a pivotal role in the development and diffusion of computing devices (Campbell-Kelly and Aspray, 1996; Ross, 1986). A short examination of this company's emergence follows: McKinley's assassination in 1901 brought about a change in the leadership of the Census Bureau. The Bureau's new leader soon ended the business relationship with Hollerith's Tabulating Machine Company, which forced the company to focus on diffusing its tabulating machine into new markets. With the introduction of an improved "automatic" version of the machine, adoption of punch card tabulating spread rapidly throughout many diverse corporate and governmental entities. In 1911 Hollerith sold the company, which was merged with two other businesses to become the Computing Tabulating and Recording Company (CTR). The new company's president, Thomas J. Watson, Sr., established a highly effective sales force that facilitated the diffusion of the tabulating machine throughout the world. In 1924 the CTR Company renamed itself the International Business Machines Corporation (IBM) (Campbell-Kelly and Aspray, 1996).

The War Years: Calculating Needs Drive Innovation

In the early 1940s the Moore School of Electrical Engineering at the University of Pennsylvania possessed a Bush Differential Analyzer (a large mechanical calculating machine). In proximity to this school was the Army's Ballistics Research Laboratory (BRL) at the Aberdeen (Maryland) Proving Grounds. This laboratory, which also had a Bush Differential Analyzer, was responsible for creating firing tables for each new ballistic weapon fielded by the United States military. With use of the differential analyzer, a firing table containing data on up to 3,000 trajectories could be completed in about a month. A team of 100 human calculators (characteristically young women) working with desktop calculators took approximately the same amount of time to complete a table. As the war progressed the BRL fell behind schedule in completing firing tables, thus creating a bottleneck in the deployment of new weapons. It turned to the Moore School for help, but even with this assistance the deployment of new weapons fell farther behind. The need for effective calculating technology thus became urgent, and this spurred applied research into a solution for the problem (Campbell-Kelly and Aspray, 1996).

In August of 1942 John Mauchly of the Moore School proposed to build an electronic computer to expedite the calculation of firing tables, thus relieving the bottleneck. Initially ignored, the proposal was revisited in the spring of 1943 and approved shortly thereafter. Mauchly then teamed with a 24-year-old electrical engineer named Presper Eckert, eventually devising an electronic computing machine called ENIAC (Electronic Numerical Integrator and Computer).

A chance meeting on a railway platform between the BRL/Moore School liaison officer, Herman H. Goldstine, and mathematics genius John von Neumann of Princeton's Institute for Advanced Study led to von Neumann's involvement with the Moore School's electronic computer project. By this time ENIAC was at such a stage of completion that its design had been frozen. It had three major shortcomings: too little storage, too many tubes, and a prodigious amount of time required to reprogram it. These deficiencies led to the development of the EDVAC (Electronic Discrete Variable Automatic Computer), which, incorporating many of von Neumann's theoretical insights, pioneered aspects of electronic computing that hold to this day (Campbell-Kelly and Aspray, 1996).

EDVAC was a "stored program" computer that consisted of an input device, memory, a control unit, an arithmetic unit, and an output device. In the spring of 1945 von Neumann published "A First Draft of a Report on the EDVAC," which detailed "the complete logical formulation of the new machine," a formulation which "ultimately was the technological basis for the worldwide computer industry" (Campbell-Kelly and Aspray, 1996, p. 94). This document, though intended only for internal use, was rapidly disseminated around the world. In the meanwhile, shortly after the end of the war, Mauchly and Eckert's ENIAC computer came to life. Its intriguing physical appearance and its speed of 5,000 operations per second generated tremendous coverage in mass media channels, thus attracting public and scientific interest. Responding to the publicity, the Moore School sponsored a series of lectures in 1946 specifically to diffuse information on the stored-program computer. The lectures established a link between the school and the many governmental, university, and industrial entities working on computers in the late 1940s (Campbell-Kelly and Aspray, 1996).

ENIAC and EDVAC are but two examples of the "first generation" of fully electronic computational devices. The British COLOSSUS, developed in secrecy to break the infamous Nazi Enigma cipher, and the German Z1 developed by Konrad Zuse were similar devices. Based on vacuum tubes, these "first generation" computers were very large and consumed prodigious amounts of electricity. Each was a unique creation dedicated only to solving mathematical problems.

Re-invention, Public Attention, and Diffusion: The 1950s

The end of the war allowed the once secret digital computation techniques to quickly diffuse into the civilian arena. This brought roughly thirty firms in the United States into the computer business. About ten were established in Great Britain. Office machine manufacturers, electronics and control equipment suppliers, and entrepreneurial start-ups were the three types of companies attempting to capitalize on the new technology (Campbell-Kelly and Aspray, 1996).


During the 1950s the computer, hitherto a mathematical instrument with limited application, was re-invented as a data-processing machine (Campbell-Kelly and Aspray, 1996). Leading the way, in early 1951, was the Electronic Control Company, whose UNIVAC system was geared specifically towards business. IBM, after initially committing the company's resources to develop a scientific computer called the "Defense Calculator," quickly realized that it should focus its efforts on developing machines for civilian business. The wake-up call for IBM occurred when the U.S. Census Bureau adopted the UNIVAC system to address the bureau's computational needs (Campbell-Kelly and Aspray, 1996).

Popular excitement about computers in the early 1950s was reflected in mass media channels such as business magazines, which fanned interest with sanguine predictions of a paperless revolution driven by sophisticated automata. Many business establishments were thus spurred to adopt the computer at this early date regardless of cost effectiveness (Campbell-Kelly and Aspray, 1996).

New technical innovations continued to provide improvements to the nascent computer industry's product. Out of MIT's Project Whirlwind, a contract to build a flight simulator for the military, came the inventions of magnetic core memory and "real time" operation. The core memory innovation alone would quickly replace all other types of memory, reaping MIT large royalty payments. Real-time operation, in which a computer immediately responds to external input, enabled new military and business applications. Stemming directly from Whirlwind's technological breakthroughs came the military's SAGE early-warning air defense system. Although IBM was the primary contractor for the project and gained tremendous technological advantage in the industry as a result, SAGE nevertheless spun off key technological innovations such as printed circuitry, mass-storage devices, graphical displays, digital communications, and core memory to a host of different companies. The project also trained a large cadre of software engineers. At the end of the 1950s only about 6,000 computers were installed worldwide, but a critical mass of technological innovation was in place to begin intense commercial exploitation of the machine (Campbell-Kelly and Aspray, 1996).

Two high-level programming languages were introduced in the 1950s that endure to the present day: FORTRAN, for scientific applications, and COBOL, for business. These innovations facilitated further adoption of the computer because of their similarity to natural, spoken language, and because they had built-in algorithms that clearly spelled out errors in newly written code.

Programming costs constituted the largest expense associated with a computer installation, and thus companies preferred to obtain ready-made applications written by outside vendors rather than develop them on their own. Recognizing this, the computer manufacturers began to bundle software that had specific application, be it banking, insurance, manufacturing, etc., with their computers. Libraries of existing programs were included with systems, and the free exchange of code was facilitated through user groups like SHARE (Campbell-Kelly and Aspray, 1996).

It should be noted that by the end of the 1950s the British computer industry, though it was first to market a computer (the Ferranti Mark I), was struggling for survival. Campbell-Kelly and Aspray attribute this situation to a lack of enthusiasm amongst Britain's "old fashioned businesses" to adopt the innovation (p. 106). The social consequence of the British establishment's non-adoption was to effectively stifle a new, innovative industry that was trying to take hold, thus ensuring American dominance in the arena for years to come.

The 1960s: Refinement, More Innovation, and More Adoption

Replacing the vacuum tube in the late 1950s, the discrete transistor ushered in the second and third generations of fully electronic computers. These new machines were much more compact, consumed less power, and did not generate nearly as much heat (Ross, 1986). This spurred further adoption of the innovation, and by the end of the decade there was a tenfold increase in the number of installed systems, to almost 80,000 in the United States and 50,000 elsewhere (Campbell-Kelly and Aspray, 1996, p. 130).

Transistors, however, were soon obsolete, being replaced by a truly revolutionary innovation called the integrated circuit. Invented by Jack S. Kilby[1] of the Texas Instruments Corporation, the integrated circuit readily lent itself to miniaturization and sophistication. Its introduction in the late 1960s ushered in the beginning of a fourth generation of digital computational devices, which would steadily increase in power and speed while dropping in size and price (Ross, 1986).

Rapidly increasing numbers of computers created a demand for application programs that, by 1965, supported 40-50 large software contractors and approximately 2,750 smaller ones. By the end of the decade these services were in greater demand because the average-size company was unable to develop software in-house that could effectively exploit modern computing power. It simply cost too much money. Custom-designed applications purchased from outside contractors, however, were also prohibitively expensive. This problem opened the door to "packaged software," which effectively distributed development costs across a whole market (Campbell-Kelly and Aspray, 1996).

Some other notable technological developments in the computing arena hail from the 1960s era: time-sharing, the BASIC programming language, and the minicomputer. Computer time-sharing systems, developed through large grants from the U.S. Advanced Research Projects Agency, enabled multiple parties, even at divergent locations, to simultaneously use a large computer. This innovation markedly increased computer efficiency, and spawned what would be known as the "computer utility" industry, a short-lived phenomenon that envisaged "piping computer power into homes" (Campbell-Kelly and Aspray, 1996, p. 217). It should be noted that whereas the popular marketplace application of time-sharing failed, it remains an instrumental part of almost all mainframe computing today.

The BASIC (Beginners All-purpose Symbolic Instruction Code) computer language emanated from the Dartmouth (College) Time-Sharing System, and was devised with simplicity in mind. It diffused rapidly through the educational establishment, making it de rigueur for manufacturers to include it on any new system designated for this market. Though some criticized its simplicity, BASIC emerged as a user-friendly language that made it possible for a wide spectrum of people to adopt the use of computers. It was to be the first widely available programming language for the forthcoming personal computer, and "laid the foundations of Microsoft" (Campbell-Kelly and Aspray, 1996, p. 211).

[1] Kilby was awarded the 2000 Nobel Prize in Physics for his role in the invention of the monolithic integrated circuit.


20 Towards the Personal Computer Minicomputers emerged from MIT’s Wh irlwind project to become a product of the electronics industry (as differentia ted from mainline computer manufacturers such as IBM). They were part of the re volution that miniaturized electronics, an effort that brought the world pocket calculato rs, digital watches, and ultimately the personal computer. Minicomputers enjoyed two distinct attributes that lead to their widespread adoption by scientific, academic, and engineering entities: They were far less expensive than a mainframe (having no bundled software, peripheral devices, or marketing overhead built into its price), a nd they allowed for “hands-on computing” like unto the early 1950s. Minicomputer use, especially the Digital PDP-8, spawned interest in computing amongst the students, experienced engineers, and young technicians who used them, and from this interest a “strong computer hobbyis t culture” emerged (Campbell-Kelly and Aspray, 1996, p. 225). In 1966 the Amateur Computer Society was founded, and through its “ACS Newslette r” publication a network of like-minded individuals was formed. The microprocessor arrived in 1971. De veloped over the course of two years by Intel Corporation, it was designed as a ge neral-purpose logic ch ip that could be programmed for specific applications such as a calculator. In fact the first Intel microprocessor was sold in early 1971 to Busicom, a Japanese calculator manufacturer. This chip, however was soon to be re-invented. Precipitous declines in the price of el ectronic calculators soon lead Busicom to relinquish the marketing right s to the general-purpose logic chip back to Intel, who, in November of 1971 began marketing it as a “computer on a chip.” This was the

PAGE 30

21 Intel 4004 microprocessor. A low-powered de vice capable of processing only four bits of information at a time, it neve rtheless sold for $1,000 a copy. Competition from such companies as Motorola, Zilog, and Mostek would soon drive the price of microprocessors much lower (Campbell-Kelly and Aspray, 1996). Personal Computing: Technology Fuses with Latent Desire Two distinct groups played a role in the eventual inception of the personal computer: Computer hobbyists and those involved with “com puter liberation” movement. The hobbyists were concentrated in the Silicone Valley region, around Massachusetts’ Route 128 co rridor, and in lesser nu mbers through the country. Resembling, if not out rightly stemming from the “ham” radio culture, these individuals were character istically young male “technophi les” often with some professional association with th e electronics industry. They were likely to read such mass media publications as “P opular Electronics” from which kits to build such things as stereos and television sets c ould be obtained. Mini computers were too costly for these individuals, as was com puter use by way of time-sharing computer utilities. This then sparke d a desire for economical comput er hardware that could be readily owned by an individual (C ampbell-Kelly and Aspray, 1996). Congruent to the desire for personally owned hardware, and hailing from the anti-establishment culture of the 1960s, th e computer liberation movement espoused the “radical idea called hypertext,” a vision whereby common people could economically access a “universe of informa tion held on computers” (Campbell-Kelly and Aspray, 1996, p. 239). Inhibiting this visi on was the fact that most all computers were “rigidly controlled in government bureaucracies or private corporations”

PAGE 31

22 (Campbell-Kelly and Aspray, 1996, p. 239). Computer hardware that could be personally owned would facili tate computer liberation. In January of 1975 the first microprocessor-based computer was offered as a $397 kit. Called the Altair 800, its availa bility was announced exclusively on the cover of Popular Electronics magazine. Though it had no keyboard or monitor, and unto itself did nothing other than light up a few small light bulbs, it generated a million dollars worth of orders in the firs t three months it was offered. Soon other companies were marketing add-on components for the system such as additional memory, storage devices and software. Th e Altair 800 galvanized the attention of Bill Gates and Paul Allen who formed a company named “Micro-Soft” and quickly developed a BASIC programm ing system to accompany this fledgling personal computer (Campbell-Kelly and Aspray, 1996). New communication channels dedicated to the innovation opened up practic ally overnight – from “The Homebrew Computer Club” near Sili cone Valley, to “Byte,” and “Popular Computing” magazines. By 1977 a chain of stores, ComputerLand, would sell machines and software nationwide (Campbell-Kelly and Aspray, 1996). Key items such as screens and keyboards, which existed from the evolution of mainframe computers, contributed to the ra pid development of the personal computer away from its simplistic beginnings. By 1977 there were three leading manufacturers of personal computers whose products each appealed to a different segment of the market. For Apple, the Apple II machin e was a “home/personal computer,” an attempt to position it beyond the hobby mark et. Tandy’s TRS-80 machine appealed to Radio Shack’s clientele of hobbyists and video game enthusiasts. For Commodore,

PAGE 32

the personal computer was conceived as an extension of its line of calculators (Campbell-Kelly and Aspray, 1996).

Software drove adoption of the personal computer (PC). Computer games, simulation programs for education, and, perhaps most importantly, business applications transformed the machine from the realm of the hobbyist to a utility. Leading this transformation was the VisiCalc spreadsheet, which, coupled with the (relative) speed and flexibility of the PC, allowed businesses to easily model various financial scenarios. "Suddenly it became obvious to businessmen that they had to have a personal computer. VisiCalc made it feasible to use one. No prior technical training was needed to use the spreadsheet program. Once, both hardware and software were for hobbyists, the personal computer a mysterious toy, used if anything for playing games. But after VisiCalc the computer was recognized as a crucial tool" (Slater, as quoted by Campbell-Kelly and Aspray, 1996, p. 251). The software evidently had copious relative advantage over analogous mainframe software in its speed and flexibility, and because it could be used virtually for free after a modest purchase expense. VisiCalc certainly was compatible with businesses' existing values, especially if they had been using similar software on a mainframe. Given that VisiCalc needed "no prior technical training," its complexity was such that it could be easily adopted. Evidently it was easy to try, and furthermore the success of those trials was very obvious to businesses. Thus this innovation was readily adopted.

By 1980 there were many spreadsheets on the market, along with word processing software and the first of the database products. The PC itself was sporting new monitors that displayed 80 columns of text in both upper and lower case, and
printers were quite affordable. Its potential as a business machine had clearly arrived (Campbell-Kelly and Aspray, 1996).

IBM's entry into the personal computer business had the instantaneous effect of casting a seal of approval on the PC technology. The IBM badge was an emphatic statement that the PC was legitimate technology compatible with business everywhere – and business responded, in a big way. Launched in New York City on August 12th, 1981, the IBM Personal Computer generated "intense" interest from the mass media, thus diffusing knowledge of the innovation far and wide. This attention was in addition to IBM's own memorable advertising campaign, which featured a Charlie Chaplin look-alike designed to humanize the PC machine (to make it seem compatible to those considering adoption). At a price of $2,880 there was soon a waiting list for the product, and IBM quickly quadrupled production (Campbell-Kelly and Aspray, 1996).

Software Diminishes Complexity, Enhances Compatibility

The IBM personal computer architecture, with its Intel 8088 processor, 64 kilobytes of RAM, and floppy disk drive, quickly became an industry standard. All computer manufacturers either switched to the new standard or suffered the consequences. A notable exception was Apple Computer, whose business approach to the IBM competition was to design a better operating system and, resultantly, better application software. Apple's president, Steve Jobs, had seen the future, so to speak, when he accepted an invitation to visit the Xerox Corporation's Palo Alto research laboratories in 1979. (The visit was in response to an invitation extended by Xerox, which was an early investor in Apple.) During his visit Jobs witnessed, amongst other things, the mouse and the graphical user interface (GUI). Clearly
amazed, he commented that Xerox could "blow away" the competition with the technology. Taking his observations back to Apple headquarters in Cupertino, California, Jobs convinced his colleagues that what he had seen at Xerox was the technological way to go. In May of 1983 Apple launched the "Lisa" computer, which incorporated the GUI operating system and mouse innovations. At $16,995 the Lisa was a commercial failure. In January 1984 Apple introduced another computer, the Macintosh, which also incorporated the GUI and the mouse. Though described as making "every other personal computer appear old-fashioned and lackluster" (Campbell-Kelly and Aspray, 1996, p. 276), the Macintosh, priced at $2,200, failed to garner much adoption outside of the computer enthusiast market, education, and printing and media companies. Regardless of Apple's failure to gain widespread adoption of its products, the GUI was an innovation that would eventually propel adoption of the personal computer throughout society – and one company, Microsoft, having written much of the software for the Macintosh, had gained intimate knowledge of how the GUI technology worked (Campbell-Kelly and Aspray, 1996).

Responding to the very apparent advantage of the Apple operating system, other firms launched similar GUI products. The first was VisiCorp, publisher of VisiCalc, which introduced VisiOn in October of 1983. Soon thereafter, in early 1985, Digital Research launched GEM. Microsoft, having licensed characteristics of the Macintosh operating system, released Windows in late 1985, and IBM, initially partnered with Microsoft, began work on OS/2 in 1987. Sooner or later each of these operating systems would fail to be commercially viable, except Windows (Campbell-Kelly and Aspray, 1996).

Buoyed by royalties generated from its Disk Operating System (DOS), a copy of which was installed on every IBM PC sold, Microsoft had the resources to develop and effectively market software – and to weather marketplace failures when they occurred. The first version of Windows saw little adoption. It was "unbearably slow" even on machines running the latest Intel 286 microprocessor, and was perceived as a "gimmick" with little advantage over DOS (Campbell-Kelly and Aspray, 1996, p. 278). Yet Microsoft persisted, developing a base of Windows applications for the IBM PC. When the second version of Windows was released it sold over 2 million copies. By the third release of Windows, in May of 1990, microprocessor power had been enhanced to the point where the Windows GUI operated with reasonable alacrity. This was the era of the Intel 386 and 486 microprocessors. At this time Microsoft chairman Bill Gates, presiding over a $10 million worldwide media spectacular at the launch of the new GUI, proclaimed that Windows 3.0 "puts the personal back into millions of MS-DOS-based computers" (Campbell-Kelly and Aspray, 1996, p. 281). Five years later even more extravagant media events heralded the August 1995 release of Windows, now renamed "Windows 95." Further releases of the software occurred in 1998, 2000, and 2001. The Windows GUI, along with a diverse array of sophisticated software sporting a Windows-based commonality in design, fueled adoption of the personal computer across society to the greatest numbers ever.

Part 2-The Application and Use of Information Technology

General Use of Information Technology in Today's Society

"At work, school, and home, the personal computer has become a basic tool" (U.S. Commerce Department, U.S. Census Bureau, 1999). By October of 1997, 37.4
million or 36.6% of American households had acquired a computer. More than 80% of the children living in a household with a computer used it – primarily for education and games, but also for word processing, graphics and design, and e-mail. Boys and girls use computers almost equally, but for different purposes. In 1997 almost half of American adults used a computer at work, home, or school. Half of all employed adults used a computer on the job, a greater degree of use than at home or at school. The fraction of adults using computers on the job increases to 75% if they have a college education. Women, because they hold primarily technical or administrative jobs within industry, tend to have higher levels of computer use than men. Men and women also use computers differently at work (U.S. Commerce Department, U.S. Census Bureau, 1999).

This "basic tool" in the employment setting is used primarily for word processing. Other uses, in order of frequency, are keeping customer records and accounts, e-mail and communications, calendar/scheduling, databases, spreadsheets, and bookkeeping. Of less common use are inventory control, analysis, invoicing, sales and marketing, graphics and design, desktop publishing and newsletters, and programming (U.S. Commerce Department, U.S. Census Bureau, 1999).

For many Americans use of the Internet is becoming an increasingly common daily activity. Business transactions, personal correspondence, research and information gathering, and shopping are now routinely conducted via computers connected to the Internet. In August 2000, 41.5% of American households had Internet access, and 116 million Americans were online at some location. This figure was projected to grow substantially by the middle of 2001. Adoption of Internet
technology, and thus the use of computers, is occurring amongst almost all Americans regardless of demographic characteristics. Even groups that traditionally have been lagging behind the national trend are now making dramatic gains. This includes rural households, whose rate of Internet penetration is now 38.9%, a percentage far closer to the national rate than in the past (U.S. Department of Commerce, 2000).

Underscoring the growing importance of Internet activity to all Americans, the Commerce Department states: "Each year, being digitally connected becomes even more critical to economic, educational and social advancement. Now that a large number of Americans regularly use the Internet to conduct daily activities, people who lack access to those tools are at a growing disadvantage. Therefore, raising the level of digital inclusion – by increasing the number of Americans using the technology tools of the digital age – is a vitally important national goal" (U.S. Department of Commerce, 2000, p. 1).

The significance of digital inclusion, and thus the need to adopt the innovation, has not escaped the notice of industry, as evidenced by Chase Manhattan Bank's (CMB) recent commitment to a multi-year, multi-million dollar grant to develop an extensive home-school computer network at an inner-city school in Brooklyn, New York. Included in the grant are state-of-the-art equipment to be given away free to students and staff, a new school website, and the volunteered time of 1,500 CMB employees (Chase Manhattan Bank, 2000).

The Federal government is also catalyzing the building of the nation's Internet framework through a key component of the 1996 Telecommunications Act. Called the "E-Rate," the legislation allows school districts and libraries to purchase
telecommunication services at significant discounts. The result has been nearly $6 billion expended toward improving telecommunications infrastructure, and Internet access, at predominately needy schools and libraries (Benton Foundation, 2000).

Information Technology in Primary and Secondary Education

During the 1950s computing power was used almost exclusively to develop new technology. In spite of this "preoccupation," the emergence of computer-based education took form in flight simulation and various industry-based employee-training programs. By the early 1960s computing power had tentatively reached the mainstream (K-12) education establishment, which led to the programmed-instruction movement (as exemplified by the Stanford Project and PLATO). Adoption of computers, however, was stymied by their large cost and a lack of individuals who knew how to operate them (Ross, 1986). Regardless of these difficulties, the stage was being set for the further integration of computing power into the mainstream education establishment.

Writing in the second half of the 1960s, Loughary indicated that in society at large "computers and sophisticated communication devices" had become accepted "as natural parts of our environment" (Loughary, 1966). Mainstream education, he indicated, was not excluded: "The concepts underlying systems and electronic communications devices are playing increasingly important roles in education and, if the thinking and planning of some educational leaders is valid, are destined to become basic and necessary to education in the not too distant future" (Loughary, 1966, p. xi). Loughary is auguring the increased use of computers for instructional purposes, as he goes on to observe that the machine was evolving from "the garage and workshop of education" (metaphorically, administrative/bookkeeping functions) to "the kitchen
and living room" (metaphorically, the classroom) (Loughary, 1966). Capping off the thought he states: "The resulting potential for change in our educational institutions are overwhelming" (Loughary, 1966, p. xi). Change as predicted by Loughary would indeed take place, though at a much slower pace than thought at the time. Both technological and, perhaps more significantly, sociological hurdles still needed to be overcome before widespread adoption of the technology took place.

One can imagine the reaction of a professional educator to a "man-machine" system that performed many of their traditional functions. Such a system, postulated by Loughary, integrated the storage, retrieval, and high-speed printing of indexed reference material along with diagnostic testing of students and the ability to produce cumulative student progress reports (Loughary, 1966). Of the possible teacher reaction to computers being used as a means to aid instruction (and having to learn how to use them in this manner), Loughary writes: "While anticipating the possibilities for individualizing and enriching instruction, he is reluctant to part with the professional methods developed over the years and in which he has a real personal investment. Few people after having gained professional status enjoy returning to the role of novice. Nevertheless, the extent and rapidity with which man-machine systems and new technology are implemented in education will depend upon the willingness of professional, experienced teachers at all levels – kindergarten through college – to experience some basic re-education in machine and systems technology" (Loughary, 1966, p. 6).

Regardless of the reluctance of mainstream professional educators, the use of computers to instruct pupils was gaining momentum. "Throughout the 1960s,
corporations and universities initiated projects to develop and evaluate programmed instruction" (Ross, 1986, p. 7). Evidencing the lack of involvement of mainstream educators, Loughary indicates that discoveries stemming from the new field were reported primarily in industry and agency publications, with little information to be found in professional educational research journals (Loughary, 1966). Certain educators, however, did fathom the implications of computers in instruction, and engaged in speculation about what the future might hold: "Computers will play an increasingly major role. It does not take much imagination to envisage increasing individual study as a lifelong effort, conceivably occurring in one's home via individual consoles connected to large computers by way of telephone lines or electron beams. As with today's soft drink and candy-vending machines, we may live with 'quick learning' machines capable of rapidly updating an individual in specific skills. One can go on and on with speculation of the details of tomorrow. However, the demands of today make it abundantly clear that radical changes in the concepts and operation of education must come, and come soon" (Tondow, writing in Loughary, 1966, p. 80).

Twenty-some years later in the late 1980s, when microcomputer use was burgeoning in all areas of society, the computer's role in education was only just beginning: "Although the use of computers was introduced into the educational systems of some OECD countries in the late 1960s and early 1970s, the major developments in the use of computers in schools have taken place in the 1980s since the advent of the microcomputer" (Winship, 1989). (The OECD, the Organization for Economic Co-operation and Development, as of 1989 included among its members all major European democracies, the United States, Japan, Australia, and New Zealand.) As was the case in the 1960s,
access to computer hardware in the late 1980s was a factor inhibiting its widespread adoption and use in education. Even though education systems had made relatively large investments in computer equipment, the average ratio of computers to pupils in secondary schools in the United States was shown to be 1:27, which meant that, on average, a student was receiving only five to seven minutes of direct computer contact per day (Winship, 1989). Revealing that this situation had not much changed halfway through the next decade, diSessa indicates that in 1995 there were about three computers per "average" 30-student classroom in the United States, and that by 1999 only 10 percent of U.S. high schools had reached a ratio of ten students per computer or better (diSessa, 1999).

Software issues also slowed adoption. In contrast to the 1960s, when the availability of software products was limited, the 1980s saw an estimated 1,500 to 2,400 new packages published per year in the U.S. alone. Of all these titles, however, only 12 percent were deemed of good quality, with another 18 percent being of tolerable quality (Winship, 1989).

The role of the teacher in adopting computer technology appears as pivotal in the 1980s as it was in the 1960s. "Teachers in general seem to resist technological progress and may appear to be the biggest stumbling block inhibiting changes in the way computers are used in schools" (Winship, 1989, p. 29). Gilbert De Landsheere reasons why this resistance occurs: "The methods that teachers use are governed by beliefs and attitudes that have been deeply and unconsciously absorbed during their school career, which in some countries begins at the age of three in the case of more than 90 percent of children and thus lasts at least 15 years up to the end of
compulsory education. That is why, when they themselves become teachers, they tend to copy the teaching techniques advocated by the more recent teaching theory. Before training anyone to use new technologies, or, more accurately, concurrently with this training, the underlying attitudes and habits of educational practice need to be thoroughly reformed. This will be a complex and costly operation and will only be achieved by working together with the teachers over a long period of time and by endeavoring to resolve jointly the problems that they have decided to tackle" (Winship, 1989, p. 29).

Lerner, citing Papert, says that primary and secondary schools resist change because educational policy is dominated by bureaucracies at all levels of government. Furthermore, the intellectual establishment, which dominates educational thinking, stems from a culture where change is extremely slow. Another inhibitor of change is that school, as we know it today, is deeply imbued in both individual and societal consciousness (Lerner, 1997).

That computers will have a profound impact on society and educational institutions is a philosophical theme that threads its way from the 1960s to the present. Writing in 1966, Tondow suggested that an "information explosion" led by the computer was bringing "profound" change to society (Tondow, writing in Loughary, 1966). Elaborating further, he says, "It is apparent that the computer represents one of the major social as well as technological changes of our times. It is equally apparent that we have not yet learned to fully utilize this equipment and have a limited sense of its ultimate impact" (Tondow, writing in Loughary, 1966, p. 30). Writing in 1986, Ross, citing a 1981 article by Johnson, states: "As you glance at this page, a revolution is taking place around you. Signs of it can be seen everywhere – on T.V.,
in magazines, and in offices, homes and schools. The computer age has arrived, and in the opinion of some, it will be significant enough to be labeled that by historians" (Ross, 1986, p. 1). Quoting Herbert Simon that the computer is a "once in several centuries" innovation, diSessa intones in 1999 that "Computers are incontestably transforming our civilization. Comparisons of our current information revolution to the industrial revolution are commonplace and apt. Almost no corner of society is untouched by computers" (diSessa, 1999, p. 3). He goes on to hypothesize that computers can be the technical foundation of a new and "dramatically enhanced literacy," the influence and scope of which will rival the current text-based literacy (diSessa, 1999). This thinking appears to clearly establish a dynamic link between human cognitive activity and modern computer technologies.

Information Technology in Higher Education

From administrative functions to academics, information technology has become an integral part of higher education. A 2000 survey conducted by The Campus Computing Project (CCP) reported that two-fifths of the participating colleges (42.7%) have courses that use Web resources as a component of the syllabus, and three-fifths (59.3%) of the participating colleges have courses that use electronic mail. Many campus services, from undergraduate admission applications and checking grades to paying tuition, are becoming available online. Perhaps a more portentous development is that over half of the colleges participating in the CCP survey report offering one or more full college courses online (Campus Computing Project, 2000). In fact, Fairleigh Dickinson University of Hackensack, New Jersey has mandated that students take one online course per year during their matriculation (School to Require One Online Course, 2000).

All of the above means that college students have to know how to use computers. Florida State University prefers that students fulfill computer competency requirements in their freshman year. The university states: "Regardless of the vehicle used to satisfy the computer competency requirement, students must demonstrate: 1. Basic familiarity with computer hardware, operating systems, and file concepts; 2. Working knowledge of a word processor or text editor and at least one other software application (e.g., spreadsheet, database, etc.); 3. Working knowledge of the World Wide Web (WWW) and electronic mail" (Florida State University, College of Arts and Sciences – Department of Computer Sciences Academics, 2000). The College of Agriculture and Life Sciences at Cornell University has similar requirements, mandating that their students graduate with a working knowledge of word processing, presentation tools, spreadsheet analysis, database management, graphics, the World Wide Web, e-mail, and the ability to make effective use of information on the Internet (Johnson et al., 1999).

A demonstrable level of student computer competency serves not only to facilitate the processes of higher education, but also responds to the technology demands of prospective employers. A study conducted by Cornell University led its investigators to conclude that agricultural employers "have a high expectation of computer literacy in recent college graduates" (Johnson et al., 1999). Computer competency requirements such as those of FSU and Cornell, and presumably many other universities, help students to succeed both at the university and in the job market.

Certain observers feel that the World Wide Web is bringing dramatic change to academia. Duderstadt states: "There is an increasing sense that information technology will have an even more profound impact in the future on educational activities of colleges and universities and on how we deliver our services. To be sure, there have been earlier technology changes, such as television, but never before has there been such a rapid and sustained period of change with such broad social applications" (Katz & Associates, 1999, p. 5). Richard N. Katz, using a hypothetical entering freshman at the University of California Santa Cruz as an example, augurs a very technologically driven campus in the year 2010. It is at this time that the first class of students to have grown up with the Web will enter college. His hypothetical student would have been solicited to enroll at UCSC as early as the 10th grade, based on PSAT test scores and transcripts that were made available to college officials electronically. By the beginning of their actual college experience, this individual will have completed two semesters of collegiate work by way of the Web and appearances by a UCSC professor at their high school location. Once on campus, Katz's hypothetical freshman would be issued a "personal digital assistant" allowing them to select from a variety of online courses offered by numerous UCSC "academic partners," which include the seven other members of the UC system. Katz's student would also benefit from the UCSC campus being equipped with wireless technologies, allowing for very easy connectivity to such services as a "virtual bookstore," and so forth (Katz & Oblinger, 2000). "Preposterous? Yes, the scenario no doubt understates the likely student expectations and campus capabilities of a decade from now by an order of magnitude" (Katz & Oblinger, 2000, p. xv).

Distance learning, or distributed education, is bringing new and significant competition to traditional academic organizations. Entities such as the University of Phoenix, WebCT, and Eduprise.com are offering Web-based products and services in what Katz calls the "e-learning 'marketspace'" (Katz & Oblinger, 2000). In this "marketspace" commercial entities compete with traditional academic organizations, which likely could begin to compete with themselves. As distance learning becomes more common, students might soon have the option of obtaining the classroom experience of renowned instructors located anywhere in the world (Katz & Associates, 1999). Furthermore, the distributed learning environment appears to converge with the psychological nature of students raised in an era of interaction with electronic devices. "They approach learning as a 'plug and play' experience: they are unaccustomed and unwilling to learn sequentially – to read the manual – and instead are inclined to plunge in and learn through participation and experimentation" (Katz & Associates, 1999, p. 7).

On a broadly philosophic note, Katz states that education, and thus knowledge, has become the determining factor in the wealth of nations and the key to individuals' standard of living. He posits that democratic societies bear a responsibility to their citizenry to provide them with affordable, and moreover, accessible, high-quality education. This, he says, has long been the theme of higher education in America, which over time has encompassed more and more individuals from a broader segment of society. The new and increasingly powerful technologies associated with computers are seen by Katz as an opportunity for U.S. higher education to capitalize on its global preeminence – perhaps some day meeting the
demands of not only the domestic population, but also of a global educational "channel surfer" who carefully selects courses based on such criteria as content and price. Revenues generated from such ventures could conceivably subsidize traditional modes of instruction found on campus, which are not as remunerative (Katz & Associates, 1999).

Information Technology and the Cooperative Extension Service

Extension, along with the rest of the world's societies, is now living in an age of rapid change brought about by information technology. In this environment county agents must possess up-to-date IT skills to effectively meet the demands placed on them by the increased use of IT by both clientele and Extension administrative entities. Ladewig states: "Face-to-face communication with clientele is a very important method that we will always rely on to bring timely information to our clientele. However, we must also examine how computer technology can help county Extension agents deliver relevant information and support educational programs" (Ladewig, 1999, p. 1). Martin (1998) states: "Computer and information technologies are vital components of Extension's current and future infrastructure. Agents and staff will have to transmit information between offices and clientele at a distance" (p. 3). Echoing this, Rasmussen states: "Communication is the key to the operations of the county Extension office. More and more county Extension offices are turning to computers and other electronic technology to improve the communications with the state offices and with university specialists, as well as with the people they serve" (1989, p. 8).

Inextricably tied to today's information technology, and of clear importance to furthering Extension's mission, is the Internet. "There are tremendous opportunities
for Cooperative Extension (CE) on the Internet. These opportunities are for improved functionality of the CE system, and new opportunities for communities that sustain the CE system" (Tennessen, PonTell, Romine, & Motheral, 1997). Bamka (2000) states that it is important for Extension professionals to teach agriculture professionals to become familiar with the Internet in order to take advantage of its use in developing markets for, and promoting, agricultural products. Sherfey, Hiller, Macduff, and Mack (2000) describe an Internet-based system designed by Extension that assists professionals in developing their volunteer management skills.

Clientele, some with Extension's assistance and some without, are using the Internet to acquire information and also to market agricultural goods. New Jersey hay producers have found marketing success over the Web (Bamka, 2000), and Iowa farmers, with a 33% Internet penetration rate, are using a net-based service to price commodities: "To be successful in the 21st century, you have to have access to better information and sophisticated tools," said David Lyons, director of business development for the Iowa Farm Bureau Federation (Bohrer, 2000, p. 6G). Baker (1998) reported that 46% of Florida Farm Bureau County Directors surveyed felt the Internet helped them do well in their jobs.

The Internet has also enabled the creation of new ways that Extension professionals can receive in-service training. Lippert, Plank, and Camberato (1998) and Lippert, Plank, and Radhakrishna (2000) described in-service training for Extension professionals in the Southeast that used a listserv and a Web site. The authors investigated two different in-service trainings that used this method, and
reported that the participants broadly accepted it and demonstrated suitable knowledge retention of the subject matter studied.

Internal accountability of Extension activity, especially planning and reporting needs, is increasingly being done over the Internet by way of Web-based applications. This is the case with the FLCES, the North Carolina Cooperative Extension Service, and the Clemson University Cooperative Extension Service in South Carolina. Radhakrishna and Pinion (1999) stated that accountability is becoming more important because of stricter mandates legislated by federal, state, local and university authorities. Web-based accountability systems are helping to accommodate these new demands.

The World Wide Web, Internet mail, modern GUI operating systems, and a suite of office software now challenge the Extension professional on a daily basis. Internal administrative needs and clientele needs both increasingly call for the use of this modern information technology. A demand is thus placed on the organization to ensure that its professionals obtain the necessary skills to function in this modern context: "Knowledge has been the product of Extension since its inception. As CES embraces the knowledge economy, leadership must find ways to insure that their employees become knowledge workers in the information age" (Albright, 2000, p. 17). Employee training established from a clear understanding of the workforce's present information technology skills is a clear-cut way of ensuring that Extension professionals are using information technology in an effective manner.

Information Technology and the Florida Cooperative Extension Service

The information technology (IT) revolution ushered in by the microcomputer is now slightly over twenty years old. Use of the technology has progressed within
the Florida Cooperative Extension Service (FLCES) to the point where every county Extension agent has an up-to-date personal computer on his/her desk that is equipped with a suite of current or near-current "office-type" software products (L. Arrington, private communication, 2000). Additionally, the FLCES provides all county offices with the resources to connect to the Internet. Serving clientele needs, in-service training for Extension faculty, and administrative applications such as information gathering, communication, and planning and reporting are all present or potential uses of the organization's IT infrastructure.

Part 3-Theoretical Aegis of the Study

Training Needs

Albright states: "Before training programs are undertaken by organizations, there should first be a front-end analysis to determine why the training is needed" (Albright, 2000, p. 41). Training needs might stem from employees not knowing how to perform a task, from something preventing employees from doing the task, or from a lack of incentive to perform the task (Albright, 2000). When it is determined that training is needed, a needs analysis should be performed to assess what may be causing a deficit in employee performance (Albright, 2000). This analysis should reveal employee competencies, and in so doing establish the objectives of the training (Albright, 2000).

Determining Training Needs

Albright, in her 2000 study of Texas county Extension agents, developed an instrument called the "Computer Technology Skills and Training Needs." This instrument, designed to assess agents' IT training needs, used the Borich Needs Assessment Model as its basis. The Borich model, later confirmed by Barrick,
Ladewig, and Hedges, functions by having respondents self-assess their knowledge about a competency, the importance of a competency to their job, and to what degree of skill they are able to apply a competency to their job (Albright, 2000). Albright states: "The strength of Borich's model allows for finer judgments in rating each competency and allows for a more relevant evaluation of the response" (Albright, 2000, p. 64).

The Borich model predicts that differences will occur between the rankings (importance, knowledge, and application) for each competency considered. Therefore a respondent might give a high rank to the importance of a skill, but give a low rank to their knowledge and/or application of that same skill. Training needs are thus more appropriately chosen by comparing mathematical combinations of the rankings for each of the competencies, rather than from a single ranking (e.g., importance) of one competency alone (Albright, 2000). Ultimately Albright used the formula (Importance mean – Knowledge mean) X Importance mean to derive a hierarchy of training needs. (To illustrate with hypothetical ratings on a five-point scale: a competency with an importance mean of 4.5 and a knowledge mean of 3.0 scores (4.5 – 3.0) X 4.5 = 6.75, ranking it as a greater training need than a competency with means of 4.0 and 3.5, which scores (4.0 – 3.5) X 4.0 = 2.0.) She states: "Knowledge and application should be considered in determining relationships; however, the knowledge factor, when weighted and applied to the importance factor becomes the most appropriate measurement to determine ranking" (Albright, 2000, p. 87).

Summary

From its ancient beginning in Mesopotamia as a series of beads situated on string or shaft to its modern inception with miniaturized circuitry, the computer is a device conceived by man to transmute the complex into the simple. It has been employed to preserve the world's democracies in time of great peril, to further science in its quest to explain natural phenomena, and to aid medicine in its fight
against disease. Computing devices and their peripherals, collectively termed "information technology," have enabled complex and large businesses such as the airline industry to grow, and in so doing have fostered unprecedented levels of economic prosperity around the globe. In educational settings information technology is having a significant impact, changing longstanding teaching methodology and holding the promise to distribute education to those who previously may have been excluded.

Modern information technology – powerful personal computers running sophisticated, easy-to-use software products and integrated with communication technologies enabling access to the World Wide Web – has revolutionized the way individuals can accrue and disseminate information. Certain thinkers propose that this is the basis of a new form of literacy (diSessa, 2000). Others indicate that the power of information technology transforms organizations into highly competitive, agile entities whose workers use information to produce new knowledge (Albright, 2000).

The Florida Cooperative Extension Service is an organization whose technological infrastructure is suitably developed to participate in the information revolution at hand (L. Arrington, private communication, 2000). Spurring the modern, effective use of information technology for its own internal functioning, and for the benefit of its clientele, is an effort that the organization needs to pursue. To what degree county Extension agents are able to use modern information technology to meet this need is presently under-researched. What is known, however, is that
Extension, the great system of education beyond the classroom, must be prepared to take advantage of the still unfolding revolution in information technology.
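To make the Borich-style ranking described in Part 3 concrete, the short sketch below shows one way the (Importance mean – Knowledge mean) X Importance mean calculation could be carried out. It is purely illustrative: the Python implementation, the competency names, and the ratings are this writer's assumptions, not code or data from Albright's (2000) study or from the study at hand.

# Hypothetical sketch of a Borich-style ranking of training needs:
# need = (importance mean - knowledge mean) * importance mean,
# with competencies sorted from greatest training need to least.

ratings = {
    "e-mail":          {"importance": [5, 5, 5], "knowledge": [4, 5, 5]},
    "word processing": {"importance": [5, 5, 4], "knowledge": [3, 4, 3]},
    "spreadsheets":    {"importance": [4, 3, 4], "knowledge": [2, 2, 3]},
}

def mean(values):
    return sum(values) / len(values)

def need_score(item):
    importance = mean(item["importance"])
    knowledge = mean(item["knowledge"])
    return (importance - knowledge) * importance

ranked = sorted(ratings.items(), key=lambda kv: need_score(kv[1]), reverse=True)
for name, item in ranked:
    print(f"{name}: weighted discrepancy = {need_score(item):.2f}")

With these invented ratings the sketch ranks word processing first and e-mail last, even though e-mail received the highest raw importance ratings – which is precisely the point of the model: importance alone does not determine training need.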

CHAPTER 3
METHODOLOGY

Introduction

The following objectives were established in Chapter 1 to guide the study:

1. Describe county Extension agents' demographic characteristics and, based on those characteristics, determine their use of information technology, including self-assessed level of overall computer skills.

2. Determine how county Extension agents are using information technology on the job in terms of hardware and software use.

3. Determine county Extension agents' perceived level of skill with regard to a specific set of information technology tasks.

4. Recommend future information technology training by describing the relationship between agents' perceived importance of and self-assessed knowledge about specific information technology skills.

Research Design

This study uses applied research methodology in that it seeks to answer practical questions associated with an immediate problem (Ary et al., 1996). It is quantitative research, having collected observations that readily lend themselves to numerical representation (Rossi & Freeman, 1993). Furthermore, the study's underlying research design is primarily descriptive in nature, revealing the existing state of IT use amongst FLCES county agents vis-à-vis the various demographic variables measured. Some inferential statistical procedures were utilized. T-tests were performed to examine for the existence of significant differences between the means of certain variables of interest. ANOVA was employed similarly. In conjunction with the training needs analysis, correlations between the "knowledge,"
and "application" constructs were conducted. Both Ruppert (1992) and Albright (2000) employed similar survey-based research in their respective studies of information technology use amongst county Extension agents.

Population

The population for this study was county Extension agents in the employ of the Florida Cooperative Extension Service. At the time the study was initiated, during the summer of 2002, this population numbered 331. Due to the relatively small population size, and also due to the need to accurately capture demographic differences within the population, a census was conducted. Ruppert (1992) and Albright (2000) both used a census in their respective studies of county Extension faculty.

County Extension agents possess unique characteristics based upon formative experiences (age, social class, gender), educational experiences (university attended, undergraduate and graduate programs), and other factors such as intelligence, motivation, and personality traits (Baker et al., 1997). Accordingly, the population of FLCES county agents displays a diversity of demographic features – features that this study wishes to describe in conjunction with the agents' use of information technology, including self-assessed level of overall computer skills.

Instrumentation

Data for this study were collected by way of an instrument adapted from Albright's 2000 survey of Texas county Extension agents. The online version of this adaptation is provided as Appendix A, and the paper version is provided as Appendix B. Albright's instrument, the "Survey of Computer Technology Skills and Training Needs" (SCTS), is based on the methodological framework of the (1980) Borich
Needs Assessment Model as verified by Barrick, Ladewig, and Hedges in 1983 (Albright, 2000). The list of computer competencies used in the SCTS was derived from two documents: the Texas Education Agency's Texas Essential Knowledge and Skills (for Texas teachers, and Texas students in grades K-12), and the Texas Technology Essential Knowledge and Skills (for Texas teachers, and Texas students as of the 8th grade), a document developed by "business, industry, and educational professionals" (Albright, 2000, p. 38). Albright reports the Texas Technology Essential Knowledge and Skills document as having "quickly become a national standard among educational institutions" (Albright, 2000, p. 38). The SCTS instrument was subjected to expert review, and furthermore, was compared to similar national tests of computer competency for content and quality of the technology competencies it addressed. These competencies were found to be "…as or more complete and comprehensive than each of the other assessments reviewed" (Albright, 2000, p. 60). In addition, the SCTS instrument was subjected to a pilot study.

Reliability of the SCTS instrument was established during data analysis by examining internal consistency of each scale of computer competency using statistical procedures. As Albright puts it: "Cronbach's Alpha using summated scale scores completed for each respondent was used on ratings of importance, knowledge and application. Questions were grouped to address specific goals of the study and were assessed for consistency using this procedure" (Albright, 2000, p. 65).

Albright administered the SCTS survey to two groups via the World Wide Web. Group one was "a purposive sample of 44 CEAs who are high users of computer technology as identified by a TAEX Computer Information Technology
workgroup" (Albright, 2000, p. 58). The second group was the general population of county Extension agents in the employ of the Texas Agricultural Extension Service. The response rate to the SCTS survey for the first group was 95%; for the second group it was 64%. No comparison was made between the groups.

Albright's survey, and consequently the study at hand, asked "agents to report self-perceived technology skills, their ability to apply the skills to their work and their perception of the importance of the technology skills" (Albright, 2000, p. 59). Three constructs – "importance," "knowledge," and "application" – were assessed by way of three questions that were asked for each area of computer skills the survey considered. These constructs were "operationalized" as follows:

Importance: Importance of this skill to your job function.

Knowledge: Your knowledge of this topic (your ability to accurately recall or summarize the subject matter).

Application: Your ability to use this skill in your job. (Albright, 2000, p. 62)

The SCTS functions by soliciting responses to a series of questions designed to reveal specific demographic characteristics associated with the respondent, including information on prior computer training. The instrument then asks the respondent if they can perform specific computer technology skills associated with whichever of the eight types of computer software (i.e., e-mail, word processing, etc.) the instrument is presently considering. Within consideration of one of these specific types of software, the three construct questions are then posed. Here the respondent self-assesses their knowledge of the software, the importance they ascribe to the
software, and their ability to apply skills using the software to their job. Immediately thereafter respondents are given the opportunity to add, in their own words, any additional skills associated with the software they feel are needed for successful employees.

A review of this study's adaptation of the SCTS instrument was conducted by a panel of experts chosen for their knowledge of Florida Cooperative Extension Service county Extension agents and/or information technology. The panel included representative(s) from the Dean for Extension's Office, the District Extension Director's Office, the Department of Agricultural Education and Communication, and the University of Florida's administrative computing department (Information Systems). As a result of this review many changes were made to the adapted instrument, including enhanced content, scales, and readability. The changed instrument, however, retained the fundamental underpinnings necessary to analyze training needs according to the Borich et al. model. This instrument was then subjected to a pilot test involving 20 agents chosen at random from the general population of FLCES county agents. This is described in more detail below.

Data Collection

Data collection followed a "mixed-mode" approach as described by Ladner, Wingenbach, and Raven (2001). This approach gives individuals a period of time (in the case of this study, 3 weeks) to complete a Web-based survey instrument, but then sends a paper copy of the survey instrument to those individuals who have not completed the Web-based version. It is believed that this method accommodates those individuals who do not have access to the Web, or who prefer not to complete the survey via the Web. This study also
took into consideration methodology for electronic surveys as described by Dillman (2000) in his book Mail and Internet Surveys. Content of the reminder messages sent by the researcher followed recommendations set forth by Glenn D. Israel (Israel, 2000).

The Web-based pilot test of the study's survey instrument involved 20 randomly chosen county Extension agents from the population of agents, and commenced on June 24th, 2002. On that day an e-mail message introducing the study was sent to the pilot population from the researcher. This message contained a link to the Web site that hosted the study's survey instrument, and provided a unique, individualized access code for each potential participant to gain access to the instrument. A reminder message was e-mailed to non-responding individuals 3 days later. Thereafter the researcher telephoned non-respondents with a personal appeal to participate. The messages transmitted by the researcher for all phases of the study are included as Appendix E – Appendix O.

The full Web-based survey was introduced on July 5th, 2002 by a message e-mailed to all county agents from Dr. Christine T. Waddill, Dean for Extension. On July 7th, 2002 the study commenced when the researcher e-mailed a message containing specific information on the survey's rationale, a hyperlink to the World Wide Web site hosting the survey instrument, and the agent's unique, individualized access code. Included in this message were e-mail addresses and telephone numbers to contact the researcher or his faculty advisor if need be. Reminder messages containing the hyperlink to the survey and the agent's unique access code were sent July 12th, July 16th, and July 22nd to those agents who had not yet completed the
Web-based study. Reminders were also sent out via e-mail on July 11th and July 25th by way of the Dean's "Comings and Goings" bi-monthly electronic publication. The District Extension Directors were each asked by the researcher to encourage participation, which resulted in additional e-mail reminders sent to specific segments of the population. Dillman's (2000) assertion that multiple contacts with potential respondents are as important to electronic surveys as regular mail surveys was readily confirmed by this study.

On August 1st, 2002 the population of agents who had not filled in the Web-based instrument was sent a packet via conventional mail that included the introductory letter from the researcher and his faculty advisor, a paper version of the survey instrument, and a self-addressed stamped return envelope with which to return the completed instrument. The introductory letter in this package contained language indicating that the survey could alternately be filled out online, and provided the URL to the site and the individual's unique access code. A single reminder message was sent by post on August 14th to those agents who had not returned the paper survey, or who had not completed it online. This reminder letter also included language indicating that the survey could alternately be filled out online, and provided the URL to the site and the individual's unique access code. The survey concluded on September 1st, 2002.

On Web-based Surveys

The first electronic surveys conducted via the Internet were predominately done through e-mail (Solomon, 2001). With the advent of the World Wide Web and its enabling hypertext markup language (HTML), electronic surveys soon became ensconced in this new venue – and became known as "Web-based surveys"
(Solomon, 2001, p. 2). This methodology emerged in approximately 1996-1997 (Dillman & Bowker, 2001; Solomon, 2001). Due to their low cost relative to conventional surveys (paper-based, face-to-face, computer-assisted telephone surveys, etc.), and their ability to quickly return copious amounts of data from the tremendous populations they reached, Web-based surveys experienced explosive growth (Dillman & Bowker, 2001; Yun & Trumbo, 2000; Solomon, 2001). Writing from a market research perspective, Jeavons (1999) reported that "fashion" played a role in making Web-assisted interviewing "a booming industry," and that the ability to perform some sort of Web-based data collection has become "almost mandatory" for market research companies (p. 69). Coomber (1997), who made novel use of the Web to perform a sociological survey on a specific population, suggests that the Internet "presents enormous possibilities" to reach individuals that are desired as research subjects.

Leadership for Web-based social (and market) survey procedures came not from the "survey methodology community" but rather stemmed in large part from computer programmers (Dillman & Bowker, 2001, p. 1). This produced a situation where technological innovation in survey design and implementation, as performed by the programmers, proceeded without the methodological rigor practiced by survey methodologists (Dillman & Bowker, 2001). Two such cases involving "highly visible" Web-based sample surveys purporting to have yielded scientifically viable results are shown by Dillman and Bowker as having practiced questionable methodology that did not take into account the presence of certain types of error. Just like other types of sample surveys, those conducted via the Web are also subject to
four distinct types of error: coverage error, sampling error, measurement error, and non-response error (Dillman & Bowker, 2001).

Of the above, coverage error, or the error resulting from drawing a sample that does not adequately represent a population, is of particular concern in Web-based surveys, especially those of the general public (Coomber, 1997; Dillman & Bowker, 2001; Solomon, 2001). Though this situation is seen as one that will moderate in the future as more individuals use the Web (Coomber, 1997), currently not everyone has access. Under certain circumstances, however, Web-based surveys can be conducted in a scientifically valid manner. Dillman and Bowker (2001) state: "Some populations – employees of certain organizations, members of professional organizations, certain types of businesses, students at many universities and colleges and groups with high levels of education – do not exhibit large coverage problems. When nearly all members of a population have computers and Internet access, as is already the case for many such groups, coverage is less of a problem" (p. 5). It would appear that county Extension faculty of the FLCES are such a population, and thus the issue of coverage error is averted.

Non-response error, though, remains a concern for all surveys, both Web-based and conventional. As Bosnjak and Tuten (2001) put it: "Non-response is of particular importance to researchers because the unknown characteristics and attitudes of non-respondents may cause inaccuracies in the results of the study in question" (p. 2). The authors then identify three traditional types of response to requests to participate in a survey: unit non-response, where an individual does not have access to the survey, refuses to respond, or is unable to respond; item non-
response, where only certain items in a returned survey are answered; and, lastly, complete response.

Dillman and Bowker (2001) indicate that response to Web-based surveys is likely to be low, and can potentially cause non-response error. Computer programs running in the background of Web-based surveys have, however, enabled researchers to identify respondent behavior, including modes of non-response. Bosnjak and Tuten (2001) have classified the following patterns: complete responders; unit non-responders; answering drop-outs (individuals who answer some questions, but then drop out of the survey before its end); lurkers (individuals who view all of a survey's questions, but answer no questions); lurking drop-outs (individuals who only view a fraction of the questions, then drop out); item non-responders; and item non-responding drop-outs (individuals who view some questions, answer some, and then leave the survey before its end) (p. 6). Understanding these patterns might aid in ameliorating non-response error in Web-based surveys.

In addition to low response rates, poor questionnaire design and a respondent's lack of computer skills can lead to premature termination of the survey, with the implication of introducing non-response bias (Dillman & Bowker, 2001, p. 6). The authors illustrate this by identifying seven different scenarios, ranging from respondents not knowing how to erase answers to having to take multiple actions in order to answer a question. Furthermore, non-response could possibly occur due to incompatibilities between the Web-based survey and the respondent's hardware or software (Dillman & Bowker, 2001). Different browsers, different versions of HTML, lack of random access memory, slow Internet
connections, and, in some instances, use of the Java programming language can all cause the survey to be difficult, if not impossible, to complete (Dillman & Bowker, 2001).

Measurement error also presents new issues for Web-based surveys. The foremost difficulty here is how to make a survey's response stimuli identical from one respondent to the next (Dillman & Bowker, 2001). The study at hand used a very basic level of HTML, which gave some assurance that all individuals received the same response stimuli.

Addressing ways of reducing the four types of survey error (coverage, sampling, non-response, and measurement) as they pertain to Web-based surveys, Dillman and Bowker (2001) promulgated the "Principles for the design of web surveys and their relationship to traditional sources of survey error," which is here included as Appendix C. Though presented by the authors with the caveat that the principles are but one attempt to develop such procedures, they nevertheless range broadly through a gamut of issues salient to the design of Web-based surveys and how each impacts a potential source of error. The introductory page, choice of first question, visual appearance of questions, and use of graphical symbols or words to convey level of completion of the survey are amongst the items considered.

How This Study Addressed Sources of Error

In general, the study sought to reduce measurement and non-response error by having followed as many of the recommendations presented in Dillman and Bowker's Principles as possible. Coverage error was not an issue, as the study was based on a census of FLCES county agents, and as such each participant had a known
non-zero probability of being included. Sampling error was also moot because a census was conducted. Measurement error was addressed by following Dillman and Bowker's Principles, including a simple, motivating welcome screen, an interesting first question, easy-to-understand navigation buttons, and a clear indication of how much of the survey a respondent had completed. Reduction of non-response error also followed the Principles, and included an e-mail invitation from the Dean for Extension asking for participation, and e-mail reminders urging participation, which were sent at pre-arranged times after the start of the survey. District Extension Directors were asked to encourage participation. Agents' use of their unique access codes also enabled the researcher to directly address issues of non-response. As a means to combat non-response due to inability, or reluctance, to use the Web, a paper-based version of the survey was sent via post to all agents who did not complete the Web-based instrument. Due to the tremendous response rate (90.3%), non-response bias appeared not to be a concern for this study. A limited investigation of the non-respondents (n = 32) did not reveal any obvious differences in gender. Because age was collected only through the survey itself, the analysis used non-respondents' rank as a means to assess whether there was an age effect, on the assumption that age and rank are reasonably strongly correlated. The majority of non-respondents were of rank I. Most non-respondents were located in the Northeast and South Extension districts.

Data Analysis

The SAS System for Windows, Release 8.2, was used to analyze the data. An alpha level of .05 was set a priori. Frequency distributions and descriptive statistics such as the mean and standard deviation were calculated for all appropriate

Frequency distributions and descriptive statistics such as the mean and standard deviation were calculated for all appropriate survey items and presented in tabular form (Albright, 2000; Ary et al., 1996; Johnson et al., 1999; Ruppert, 1992). Analysis of variance with an associated Duncan's test was employed to test for differences in means between levels of certain variables such as age. Association between the construct variables was described using Pearson's product-moment correlation (Albright, 2000). Cronbach's coefficient alpha was used to test the consistency of the scale.
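The study's computations were run in SAS, but the same quantities are easy to reproduce elsewhere. The sketch below shows, in Python, descriptive statistics, a Pearson correlation, and a direct implementation of Cronbach's coefficient alpha from its standard formula; the column names and toy data are invented for illustration and are not the study's data.

    import numpy as np
    import pandas as pd
    from scipy import stats

    def cronbach_alpha(items: pd.DataFrame) -> float:
        # alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars / total_var)

    # Hypothetical stand-in for the survey data (the real study had 299 records).
    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "knowledge": rng.integers(1, 6, 299),    # 1-5 construct items
        "importance": rng.integers(1, 6, 299),
        "application": rng.integers(1, 6, 299),
    })

    print(df.agg(["mean", "std"]))                             # descriptive statistics
    r, p = stats.pearsonr(df["knowledge"], df["application"])  # construct association
    print(f"r = {r:.3f}, p = {p:.4f}")                         # compare p to alpha = .05
    print(f"alpha = {cronbach_alpha(df):.4f}")                 # scale consistency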


CHAPTER 4
RESULTS

This study investigated the current use of information technology, level of information technology skills, and the workplace application of modern information technology among county Extension agents of the Florida Cooperative Extension Service. The study used applied research methodology in that it sought to answer practical questions associated with an immediate problem (Ary et al., 1996). In light of the manifold technological change of the past 10 years and its impact on Extension, the following questions arise: Have county agents kept abreast of this change? Are they utilizing the Web to find information to fulfill clientele needs? Are they disseminating information to clientele through Web sites or e-mail? Are agents using e-mail to exchange information, and can they attach a file to such messages? And finally, with what degree of sophistication do agents use everyday office software products such as word processors or spreadsheets?

To answer these questions a survey instrument was adapted from that used in a similar (2000) study of county Extension agents in the state of Texas. The adapted instrument included ninety-nine questions that recorded personal and situational factors, and measured patterns of information technology use, specific skills practiced for six different types of software, and types of computer hardware and connectivity. The instrument also assessed future information technology training needs.


To these ends, specific questions asked agents to gauge their knowledge of, their ability to apply to their job, and their perceived importance of the six types of software. Responses to these questions were then analyzed, and an order of training need derived.

This study presents its findings in sequence with the major objectives established in Chapter 1. Those objectives were to

1. Describe county Extension agents' demographic characteristics and, based on those characteristics, determine their use of information technology, including self-assessed level of overall computer skills.
2. Determine how county Extension agents are using information technology on the job in terms of hardware and software use.
3. Determine county Extension agents' perceived level of skill with regard to a specific set of information technology tasks.
4. Recommend future information technology training by describing the relationship between agents' perceived importance of and self-assessed knowledge about specific information technology skills.

Objective 1
Describe County Extension Agents' Demographic Characteristics and, Based on Those Characteristics, Determine Their Use of Information Technology, Including Self-Assessed Level of Overall Computer Skills

A General Description of the Respondents

The number of county Extension agents employed by the Florida Cooperative Extension Service at the inception of this study was 331. Two hundred ninety-nine agents, or 90.33% of this population, completed the study's survey instrument either on-line or on paper. By gender the respondents were 57.86% female (n = 173) and 42.14% male (n = 126). This distribution of males and females mirrored that of the general population of county Extension agents (58.01% female and 41.99% male) at the beginning of the study. The majority of respondents (63.54%) indicated that their age fell between 41 and 60 years (n = 190).


Most respondents (69.90%) reported work experience, including time both inside and outside of Extension, of 16 or more years. Table 1 presents this information.

Table 1. Number of Respondents by Gender, Age, and Years of Work Experience

Characteristic                 N        %
Gender
  Male                       126     42.14
  Female                     173     57.86
Age group
  20-30                       35     11.71
  31-40                       51     17.06
  41-50                       97     32.44
  51-60                       93     31.10
  61-70                       19      6.35
  No response                  4      1.34
Years of work experience
  Less than 5 years           22      7.36
  5-10 years                  31     10.37
  11-15 years                 34     11.37
  16+ years                  209     69.90
  No response                  3      1.00

Comparing Response Groups

Of the 299 respondents, 278 (92.98%) completed the electronic version of the survey instrument on-line, and 21 (7.02%) completed the paper version. For purposes of comparison, respondents in this study were divided into four groups: Early On-line Respondents, who completed the survey on-line (n = 65); Late On-line Respondents, who completed the survey on-line (n = 65); All On-line Respondents, which includes all respondents who completed the survey on-line (n = 278); and Paper Respondents, who completed the paper version of the survey (n = 21).


To form the early and late on-line groups, the on-line respondents, excluding respondents to the pilot study, were divided into percentage quartiles (Glenn D. Israel, personal communication, October 2002). The first and last quarters of these respondents were chosen to form the early and late groups, respectively.

An examination for differences between the Early On-line Respondents and Late On-line Respondents was then performed. As is shown in Table 2, the percentages of male (44.62%) and female (55.38%) early on-line respondents are essentially equal to the gender percentages for all respondents. This changes for the late on-line respondents, with females (64.62%) constituting a greater percentage of that group.

Table 2. Frequency and Percent by Gender for the Early and Late On-line Response Groups

                              Male           Female
Response Group               N      %       N      %
Early On-line Respondents   29   44.62     36   55.38
Late On-line Respondents    23   35.38     42   64.62
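The quartile split described above is straightforward to express in code. A minimal sketch, assuming a hypothetical frame of on-line submissions with a completion timestamp (the column names and dates are invented for the example):

    import pandas as pd

    # Hypothetical on-line submissions ordered by completion time.
    online = pd.DataFrame({
        "agent_id": range(260),
        "submitted": pd.date_range("2002-07-08", periods=260, freq="6h"),
    })

    # Rank submissions by time and cut the ranking into four equal quartiles;
    # the first and fourth quartiles form the early and late groups.
    ranks = online["submitted"].rank(method="first")
    online["quartile"] = pd.qcut(ranks, 4, labels=[1, 2, 3, 4])
    early = online[online["quartile"] == 1]
    late = online[online["quartile"] == 4]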


The analysis then examined the Early On-line Respondents and Late On-line Respondents for differences in mean response to age, years of work experience, self-rated computer skills, and hours of weekly computer use. This information is provided in Table 3.

Table 3. Means and Standard Deviations of Early and Late On-line Respondents for Various Variables

                           Early On-line      Late On-line
Variable                   Mean     SD        Mean     SD
Age Group                  3.13    1.13       2.95    1.21
Years Work Exp.            3.53    0.88       3.33    1.07
Self-rated Com. Skills     3.63    0.67       3.33    0.79
Hours of Usage/Week        4.90    1.14       4.63    1.32

Note that the study employed various scales to measure levels of these variables (e.g., agents 20-30 years old were assigned the numeric value for the first level of a six-level scale). Appendix D gives the scales, and the values they represent, for all variables using a scale.

A t-test for statistically significant differences between the means of these selected variables was then conducted. As Table 4 indicates, a significant difference in the mean self-rated computer skills score was found, with the early respondents averaging a higher score. No other significant differences were found.

An examination for differences between the All On-line Respondents group and the Paper Respondents group was then performed. As is shown in Table 5, the gender percentages for both the All On-line Respondents and Paper Respondents groups are essentially equal to those found for the study's total, undifferentiated group of respondents (n = 299).


Table 4. T-test for Significant Differences between Early and Late On-line Respondents

Variable                       t Value     Pr
Age Group                       0.90      0.37
Years Work Experience           1.16      0.25
Self-rated Computer Skills      2.26      0.026*
Hours of Usage per Week         1.27      0.20

*Significant at the α = 0.05 level.

Table 5. Frequency and Percent by Gender for the Response Groups

                            Male            Female
Response Group            N      %        N      %
Electronic Respondents  117   42.09     161   57.91
Paper Respondents         9   42.86      12   57.14

The All On-line Respondents and Paper Respondents groups were then examined for differences in mean response for age, years of work experience, self-rated computer skills, and hours of weekly computer use. This information is provided in Table 6. A t-test for statistically significant differences between the means of these selected variables was then conducted. As Table 7 indicates, no significant differences were found.


Table 6. Means and Standard Deviations of Electronic vs. Paper Response Groups

                           Electronic         Paper
Variable                   Mean     SD      Mean     SD
Age Group                  3.05    1.10     2.80    1.11
Years Work Exp.            3.47    0.93     3.20    1.10
Self-rated Com. Skills     3.51    0.74     3.19    0.92
Hours of Usage/Week        4.75    1.25     4.66    1.39

Table 7. T-test for Significant Differences between Electronic and Paper Respondents

Variable                       t Value     Pr
Age Group                       0.98      0.32
Years Work Experience           1.23      0.21
Self-rated Computer Skills      1.86      0.06
Hours of Usage per Week         0.31      0.75
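Each of the comparisons above is a two-sample t-test on group means. As a rough illustration only (the arrays below are stand-in values, not the study's data):

    from scipy import stats

    # Hypothetical self-rated skill scores (1-5) for two response groups.
    early_skills = [4, 4, 3, 5, 4, 3, 4]
    late_skills = [3, 3, 4, 3, 2, 4, 3]

    # Two-sample t-test on the group means, as in Tables 4 and 7.
    t, p = stats.ttest_ind(early_skills, late_skills)
    print(f"t = {t:.2f}, p = {p:.3f}")  # significant if p < .05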


Use of Information Technology and Self-Assessed Level of Overall Computer Skills

County agents' use of information technology (IT) is here analyzed vis-à-vis various demographics collected by the study. Age, gender, work experience, agents' major programmatic area, and other characteristics are examined to determine their effect on IT use and self-assessed level of computer skill. Note that the total population of respondents (n = 299) is being examined.

The analysis begins by differentiating respondents according to gender, age, and work experience. Table 8 shows that females between 20-40 years of age constitute 38.15% (n = 66) of the female respondents, whereas males in the same age range constitute only 16.13% (n = 20) of the male respondents. As would be expected to follow from this finding, more female respondents (25.44%) than male respondents (7.14%) reported 10 or fewer years of work experience.

Table 8. Age and Work Experience of County Extension Agents, Differentiated by Gender

                              Male            Female
Variable                    N      %        N      %
Age group
  20-30                     3    2.42      32   18.50
  31-40                    17   13.71      34   19.65
  41-50                    42   33.87      55   31.79
  51-60                    50   40.32      43   24.86
  61-70                    12    9.68       7    4.05
  No response               2    1.59       2    1.16
Years of work experience
  Less than 5 years         5    3.97      17    9.83
  5-10 years                4    3.17      27   15.61
  11-15 years              13   10.32      21   12.14
  16+ years               103   81.75     106   61.27
  No response               1    0.79       2    1.16


The majority of agents in both gender groups reported 16 or more years of work experience: males with 16+ years constituted 81.75% of their gender, whereas 61.27% of the female population reported 16+ years of experience. An examination, by way of a t-test, for differences between the mean age of males and females yielded a statistically significant difference: female agents are, on average, younger than male agents. This finding led to a statistical examination for differences between the genders' mean years of work experience, which also proved to be statistically significant. Table 9 gives the details of the t-tests on both age and years of work.

Table 9. T-tests for Significant Differences between Males and Females

                        Male             Female
Variable              Mean    SD       Mean    SD     t Value    Pr
Age Group             3.41   0.92      2.76   1.14      5.21   <.0001*
Years Work Exp.       3.71   0.71      3.26   1.05      4.11   <.0001*

*Significant at the α = 0.05 level.

An Examination of the Non-respondents

Thirty-two agents (9.66% of the population) did not respond to the study's survey. Analysis was done to determine if this group had any distinguishing characteristics. As Table 10 shows, the non-respondents were 59.38% female (n = 19) and 40.62% male (n = 13), which is slightly different from the gender breakdown for the population of respondents. Age was an item supplied by respondents within the survey, so analysis of the non-respondents based on this variable is unavailable. The rank, however, of the non-respondents was available.


Given that rank is usually correlated with an agent's age, the analysis of non-respondents proceeded accordingly. Table 10 shows that 37.50% (n = 12) of the non-respondents are of Extension Agent I rank, 21.88% (n = 7) are of Extension Agent II rank, and so on. Table 10 also shows the Extension districts where the non-respondents are located. Note that 37.50% of the non-respondents are from the South Extension district.

Table 10. Characteristics of Non-respondents (N = 32)

Characteristic        N        %
Gender
  Male               13     40.62
  Female             19     59.38
Rank
  EA I               12     37.50
  EA II               7     21.88
  EA III              7     21.88
  EA IV               6     18.74
District
  Northwest           4     12.50
  Northeast           7     21.88
  Central             6     18.75
  South Central       3      9.37
  South              12     37.50

Self-rated Computer Skills and Demographics

Agents were asked to rate their overall computer skills on a scale from "very poor" to "excellent." As shown in Table 11, 84.95% (n = 254) of the respondents reported their skills to be either average or above average.


Table 12 reports this information by gender, and shows that 85.37% (n = 107) of the males and 84.97% (n = 147) of the females rated their skills as either average or above average.

Table 11. Self-rated Overall Computer Skill for All Respondents

Skill Rating        N        %
Very Poor           3      1.00
Poor               18      6.02
Average           129     43.14
Above Average     125     41.81
Excellent          22      7.36
No Response         2      0.67

Table 12. Self-rated Overall Computer Skills by Gender

                    Male            Female
Skill Rating      N      %        N      %
Very Poor         1    0.79       2    1.16
Poor              8    6.35      10    5.78
Average          55   43.65      74   42.77
Above Average    52   41.27      73   42.20
Excellent        10    7.94      12    6.94
No Response       0    0          2    1.16


A t-test for significant differences between the mean self-rated skill levels of males and females was performed, and the results were not significant. Table 13 provides the results of this test.

Table 13. T-test for Significant Difference between Male and Female Mean Self-rated Overall Computer Skills

                              Male            Female
Variable                    Mean    SD      Mean    SD    t Value    Pr
Self-rated Computer Skill   3.49   0.76     3.48   0.76     0.07    0.93

Analysis was then conducted on self-rated computer skills by age. Table 14 shows that, across the five age groups, most agents responded that they have average to above average overall computer skills. Note that the study's three "very poor" responses stem from the 61-70 and 41-50 age groups.

Table 14. Self-rated Overall Computer Skills by Age

               Age 20-30     Age 31-40     Age 41-50     Age 51-60     Age 61-70
Skill Rating   N     %       N     %       N     %       N     %       N     %
Very Poor      0     0       0     0       1    1.03     0     0       2   10.53
Poor           1    2.86     1    1.96     5    5.15    10   10.75     0     0
Average       11   31.43    18   35.29    44   45.36    43   46.24    12   63.16
Above Av.     20   57.14    24   47.06    41   42.27    34   36.56     4   21.05
Excellent      3    8.57     6   11.76     6    6.19     6    6.45     1    5.28
No Resp.       0     0       2    3.92     0     0       0     0       0     0


An analysis of variance was conducted to determine if differences in mean self-rated computer skills score existed between the five age groups. As Table 15 reports, significant differences do exist. A further comparison of the means was performed using Duncan's multiple range test. The results of this procedure are displayed in Table 16: the 61-70 age group differs significantly from the 20-30, 31-40, and 41-50 age groups, while the 51-60 age group, which belongs to both groupings, does not differ significantly from any other group. Note that groups with the same Duncan grouping letter designation are not significantly different.

Table 15. Analysis of Variance for Self-rated Overall Computer Skills (N = 295)

Source      DF        SS       MS     F Value    Pr > F
Model        4     8.046    2.011       3.59     0.0070*
Error      288   161.182    0.559
C Total    292   169.228

*Significant at the α = .05 level
DEP MEAN = 3.491   ROOT MSE = 0.748   R-Square = 0.047   C.V. = 21.426

Table 16. Results of Duncan's Test for Comparing Means (Independent Variable: Age; Dependent Variable: Self-rated Overall Computer Skills)

Levels of the Independent Variable     N     Mean    Duncan Grouping
Age Group 20-30                       35    3.714         A
Age Group 31-40                       49    3.714         A
Age Group 41-50                       97    3.474         A
Age Group 51-60                       93    3.387         A B
Age Group 61-70                       19    3.105           B
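For readers who want to replicate this kind of comparison outside SAS, the sketch below runs the one-way ANOVA in Python. Duncan's multiple range test has no SciPy implementation, so the more conservative Tukey HSD procedure stands in for the pairwise comparisons behind the letter groupings; the group data are random stand-ins, not the study's scores.

    import numpy as np
    from scipy import stats

    # Hypothetical self-rated skill scores per age group (random stand-ins).
    rng = np.random.default_rng(1)
    groups = {
        "20-30": rng.normal(3.7, 0.7, 35),
        "31-40": rng.normal(3.7, 0.7, 49),
        "41-50": rng.normal(3.5, 0.7, 97),
        "51-60": rng.normal(3.4, 0.7, 93),
        "61-70": rng.normal(3.1, 0.7, 19),
    }

    # One-way ANOVA across the five age groups.
    f, p = stats.f_oneway(*groups.values())
    print(f"F = {f:.2f}, p = {p:.4f}")

    # Pairwise post hoc comparisons (Tukey HSD, SciPy >= 1.8) if significant.
    if p < 0.05:
        print(stats.tukey_hsd(*groups.values()))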


Agents' major area of programmatic activity (Agriculture; 4-H; Marine; Family, Youth and Community Services (FYCS); and Other) was then considered in the analysis of self-rated computer skills. Table 17 shows that the majority of agents across all program areas self-rated their overall computer skills as average to above average. Note that 2 of the study's 3 "very poor" responses stem from the FYCS program area, and that the Agriculture and 4-H program areas had 15 "excellent" responses between them.

Table 17. Self-rated Overall Computer Skills by Program Area

               Agriculture      4-H          Marine         FYCS          Other
Skill Rating    N     %       N     %       N     %       N     %       N     %
Very Poor       1    0.74     0    0.00     0    0.00     2    2.33     0    0.00
Poor            6    4.44     2    4.00     1    8.33     8    9.30     1    6.25
Average        59   43.70    20   40.00     4   33.33    41   47.67     5   31.25
Above Av.      58   42.96    22   44.00     7   58.33    30   34.88     8   50.00
Excellent      11    8.15     4    8.00     0    0.00     5    5.81     2   12.50
No Resp.        0    0.00     2    4.00     0    0.00     0    0.00     0    0.00

An analysis of variance was conducted to determine if differences in mean self-rated overall computer skills score existed between the five programmatic areas. As Table 18 reports, no significant differences were found. Duncan's multiple range test for comparison of means was then performed.


Table 19 confirms that there were no significant differences between mean self-rated overall computer skills for the program areas.

Table 18. Analysis of Variance for Self-rated Overall Computer Skills (N = 297) (Independent Variable: Program Area; Dependent Variable: Self-rated Overall Computer Skills)

Source      DF        SS       MS     F Value    Pr > F
Model        4     3.620    0.905       1.57     0.1829
Error      292   168.587    0.577
C Total    296   172.208

DEP MEAN = 3.488   ROOT MSE = 0.759   R-Square = 0.021   C.V. = 21.783

Table 19. Results of Duncan's Test for Comparing Means (Independent Variable: Program Area; Dependent Variable: Self-rated Overall Computer Skills)

Levels of the Independent Variable     N     Mean    Duncan Grouping
Program Area Other                    16    3.687         A
Program Area 4-H                      48    3.583         A
Program Area Agriculture             135    3.533         A
Program Area Marine                   12    3.500         A
Program Area FYCS                     86    3.325         A

Computer Usage and Demographics

As reported in Table 20, 113 agents (37.79%) responded that they use their computers, both at home and at work, over 20 hours a week. Another 78 agents (26.09%) reported computer use of between 16-20 hours a week. Table 21 shows weekly computer use by gender.


A t-test for significant differences in mean hours of weekly computer use between the genders was performed, and the results were not significant. Table 22 provides the results of this test.

Table 20. Hours of Computer Use per Week for All Respondents

Level of Use            N        %
1-5 hours/week         18      6.02
6-10 hours/week        44     14.72
11-15 hours/week       46     15.38
16-20 hours/week       78     26.09
20+ hours/week        113     37.79

Table 21. Hours of Computer Use per Week by Gender

                 Male            Female
Level of Use    N      %        N      %
1-5 hours      10    7.94       8    4.62
6-10 hours     18   14.29      26   15.03
11-15 hours    20   15.87      26   15.03
16-20 hours    32   25.40      46   26.59
20+ hours      46   36.51      67   38.73


Table 22. T-test for Significant Difference between Male and Female Mean Hours of Weekly Computer Use

                         Male            Female
Variable               Mean    SD      Mean    SD    t Value    Pr
Hours of Weekly Use    4.68   1.31     4.79   1.23     0.78    0.43

Table 23 examines weekly computer use by age, and shows that 42.86% (n = 15) of the 20-30 age group reported spending over 20 hours a week on the computer, as did 21 agents (41.18%) in the 31-40 age group, 35 agents (36.08%) in the 41-50 age group, and 37 agents (39.78%) in the 51-60 age group. Only 4 agents (21.05%) in the 61-70 age group reported 20+ hours a week of computer use, and 26.32% (n = 5) of this group reported being on the computer 1-5 hours a week.

Table 23. Hours of Computer Use per Week by Age

               Age 20-30     Age 31-40     Age 41-50     Age 51-60     Age 61-70
Level          N     %       N     %       N     %       N     %       N     %
1-5 hours      0     0       1    1.96     4    4.12     7    7.53     5   26.32
6-10 hours     6   17.14     9   17.65    13   13.40    13   13.98     3   15.79
11-15 hours    4   11.43    10   19.61    15   15.46    12   12.90     4   21.05
16-20 hours   10   28.57    10   19.61    30   30.93    24   25.81     3   15.79
20+ hours     15   42.86    21   41.18    35   36.08    37   39.78     4   21.05


An analysis of variance was conducted to determine if significant differences in mean hours of weekly computer use existed between the five age groups. As Table 24 reports, significant differences do exist. Duncan's multiple range test for comparison of means was then performed. As reported in Table 25, the mean hours of weekly computer use for the 61-70 age group is less than that of the younger groups.

Table 24. Analysis of Variance for Hours of Weekly Computer Use (N = 295) (Independent Variable: Age Group)

Source      DF        SS       MS     F Value    Pr > F
Model        4    16.171    4.042       2.59     0.0368*
Error      290   452.255    1.559
C Total    294   468.427

*Significant at the α = .05 level
DEP MEAN = 4.755   ROOT MSE = 1.248   R-Square = 0.034   C.V. = 26.257

Table 25. Results of Duncan's Test for Comparing Means (Independent Variable: Age Group; Dependent Variable: Hours of Weekly Computer Use)

Levels of the Independent Variable     N     Mean    Duncan Grouping
Age 20-30                             35    4.97          A
Age 41-50                             97    4.81          A
Age 31-40                             51    4.80          A
Age 51-60                             93    4.76          A
Age 61-70                             19    3.89          B


Table 26 shows hours of weekly computer use differentiated by program area. Sixty-one Agriculture agents (45.19%) and 21 4-H agents (42.00%) report using the computer over 20 hours a week; the percentages of Marine and FYCS agents using the computer over 20 hours a week are lower. An analysis of variance was conducted to determine if significant differences in mean weekly computer use existed between the five programmatic areas. As Table 27 reports, no significant differences were found.

Table 26. Hours of Weekly Computer Use by Program Area

              Agriculture      4-H          Marine         FYCS          Other
Level          N     %       N     %       N     %       N     %       N     %
1-5 hours      9    6.67     1    2.00     1    8.33     6    6.98     1    6.25
6-10 hours    22   16.30     6   12.00     2   16.67    14   16.28     0    0.00
11-15 hours   14   10.37     5   10.00     4   33.33    19   22.09     4   25.00
16-20 hours   29   21.48    17   34.00     2   16.67    25   29.07     5   31.25
20+ hours     61   45.19    21   42.00     3   25.00    22   25.58     6   37.50

Table 27. Analysis of Variance for Hours of Weekly Computer Use (N = 299)

Source      DF        SS       MS     F Value    Pr > F
Model        4    12.369    3.092       1.95     0.1019
Error      294   465.817    1.584
C Total    298   478.187

DEP MEAN = 4.749   ROOT MSE = 1.258   R-Square = 0.025   C.V. = 26.504


Duncan's multiple range test for comparison of means was then performed. Table 28 shows that no significant differences in mean hours of weekly computer use exist between the different program areas.

Table 28. Results of Duncan's Test for Comparing Means (Independent Variable: Program Area; Dependent Variable: Hours of Weekly Computer Use)

Levels of the Independent Variable     N     Mean    Duncan Grouping
Program Area 4-H                      50    5.02          A
Program Area Other                    16    4.93          A
Program Area Agriculture             135    4.82          A
Program Area FYCS                     86    4.50          A
Program Area Marine                   12    4.33          A

Source of Computer Knowledge and Demographics

Agents were asked to respond "yes" or "no" to a list of independent questions about the sources of their computer knowledge. This information is detailed in Table 29, which shows that many agents report learning their computer skills at work. In addition to the sources of knowledge listed in the instrument, agents were also given the opportunity to fill in another source of knowledge on their own. Eighteen responses were recorded in this manner, with the following being salient examples: computer books, the IFAS Help Desk, the military, courses at computer shops, and software manuals.


Table 29. Number and Percent of Agents Responding "Yes" to Questions about Where Most Computer Knowledge Was Learned (Questions Asked Independently of Each Other)

Question                                               N        %
Self-taught at home                                  196     65.55
Learned in college or high school                     95     31.77
Self-taught at work                                  264     88.29
Learned at work through in-service training          200     66.89
Learned from family or friends outside of work       151     50.50
Learned from co-workers at work                      229     76.59

Recalling that the study's respondents are 42.14% male and 57.86% female, Table 30 shows a similar gender distribution for most sources of computer knowledge.

Table 30. Number and Percent of Agents, by Gender, Responding "Yes" to Questions about Where Most Computer Knowledge Was Learned (Responses Are Independent)

                            Male            Female
Source of Knowledge       N      %        N      %
Self-taught at home      85   43.37     111   56.63
Learned in school        27   28.42      68   71.58
Self-taught at work     112   42.42     152   57.58
In-service training      90   45.00     110   55.00
Family, etc.             56   37.09      95   62.91
Co-workers               94   41.05     135   58.95


Two exceptions, however, are apparent. First, females, in greater percentages than males, indicate that they learned their computer knowledge in school. This might be related to the previous finding that female agents are significantly younger than male agents, and thus may have had more exposure to the technology in the school setting. Second, more females than males indicate that they learned from family or friends outside of work.

An examination of Table 31, which details source of computer knowledge by age, reveals that agents 41-60 years of age report more often than agents in either the 20-40 or 61-70 age groups that they acquired their knowledge at home. The youngest agents, those 20-30 years of age, indicated that they acquired their knowledge in high school or college more often than any other age group.

Table 31. Number and Percent of Agents, by Age, Responding "Yes" to Questions about Where Most Computer Knowledge Was Learned (Responses Are Independent)

               Age 20-30     Age 31-40     Age 41-50     Age 51-60     Age 61-70
Source         N     %       N     %       N     %       N     %       N     %
Home          23   11.98    34   17.70    66   34.38    55   28.65    14    7.29
School        32   33.68    26   27.37    21   22.11    13   13.68     3    3.16
Work          28   10.77    50   19.23    84   32.31    83   31.92    15    5.77
In-service    13    6.53    31   15.58    69   34.67    70   35.18    16    8.04
Family, etc.  21   14.09    32   21.48    47   31.54    41   27.52     8    5.37
Co-workers    18    7.96    41   18.14    77   34.07    75   33.19    15    6.64


On the other hand, agents 41-60 years of age responded most often that their computer knowledge was self-taught at work, and this same age range accounts for the bulk of the "learned at work through in-service training" responses. This age group, with much more frequency than the other age groups, also responded that co-workers were a source of their computer knowledge.

Table 32 shows source of computer knowledge reported by program area. Large percentages of "yes" responses were recorded across the program areas for the "self-taught at work" source of computer knowledge. With the exception of the Marine program area, agents across the program areas also frequently responded that co-workers were a source of computer knowledge.

Table 32. Number and Percent of Agents, by Program Area, Responding "Yes" to Questions about Where Most Computer Knowledge Was Learned (Responses Are Independent)

              Agriculture      4-H          Marine         FYCS          Other
Source          N     %      N     %      N     %       N     %       N     %
Home           92   68.15   33   66.00    6   50.00    55   63.95    10   62.50
School         34   25.19   22   44.00    6   50.00    26   30.23     7   43.75
Work          122   90.37   42   84.00    8   66.67    78   90.70    14   87.50
In-service     94   69.63   29   58.00    7   58.33    63   73.26     7   43.75
Family, etc.   60   44.44   26   52.00    5   41.67    52   60.47     8   50.00
Co-workers    103   76.30   34   68.00    7   58.33    74   86.05    11   68.75

Agents were asked if they had taken any computer courses since the year 2000. As evidenced in Table 33, the majority (52.17%, n = 159) responded "no."


Table 34 shows the results of a follow-up question that asked the principal reason for not taking a computer course since 2000. Seventy agents (23.41%) indicated lack of time as the reason, and 29 agents (9.70%) indicated too few in-service training (IST) days.

Table 33. Agents Who Have/Have Not Taken Computer Courses since 2000

Response        N        %
Yes           141     47.16
No            159     52.17
No response     2      0.67

Table 34. Reason for Not Taking a Computer Course since 2000

Response               N        %
Lack of time          70     23.41
Lack of access        14      4.68
Too expensive          3      1.00
No incentive           9      3.01
Not available         11      3.68
Too few IST days      29      9.70
Other                 35     11.71
No response          128     42.81

By gender, 61.90% (n = 78) of the males responded that they had not taken a computer course since 2000, whereas 53.76% (n = 93) of the females responded that they had taken one.


This information is provided in Table 35. Table 36 shows that 30.16% (n = 38) of the males and 18.50% (n = 32) of the females responded that lack of time was the principal reason for not taking a computer course.

Table 35. Agents, by Gender, Who Have/Have Not Taken Computer Courses since 2000

                 Male            Female
Response       N      %        N      %
Yes           48   38.10      93   53.76
No            78   61.90      78   45.09
No response    0    0          2    1.16

Table 36. Reason, by Gender, for Not Taking a Computer Course since 2000

                      Male            Female
Response            N      %        N      %
Lack of time       38   30.16      32   18.50
Lack of access      5    3.97       9    5.20
Too expensive       1    0.79       2    1.16
No incentive        5    3.97       4    2.31
Not available       5    3.97       6    3.47
Too few IST days   11    8.73      18   10.40
Other              17   13.49      18   10.40
No response        44   34.92      84   48.55


Agents were also given the opportunity to list other reasons for not taking computer courses. Among the 38 responses received this way were: "not offering what I need," "no mid-level to high-end application training offered," "scheduling conflicts," "usually too generic and basic," "signed up for classes but they were cancelled," and "nothing new I'm interested in."

Table 37 presents an analysis, by age group, of agents who have and have not taken a computer course since 2000. Agents 61-70 years of age show the highest incidence (57.89%) of not having taken a computer class since 2000, whereas agents 31-40 years of age (54.90%) are the most likely to have taken a course. Table 38 shows that, across all age groups, lack of time is the most often given reason for not taking a computer course. Table 39 and Table 40 examine the computer course questions by program area, again showing lack of time to be the most frequently given reason for not taking a course.

Table 37. Agents, by Age Group, Who Have/Have Not Taken Computer Courses since 2000

            Age 20-30     Age 31-40     Age 41-50     Age 51-60     Age 61-70
Response    N     %       N     %       N     %       N     %       N     %
Yes        15   42.86    28   54.90    45   46.39    44   47.31     8   42.11
No         20   57.14    22   43.14    52   53.61    49   52.69    11   57.89
No Resp.    0     0       1    1.96     0     0       0     0       0     0


Table 38. Reason, by Age Group, for Not Taking a Computer Course since 2000

                 Age 20-30     Age 31-40     Age 41-50     Age 51-60     Age 61-70
Response         N     %       N     %       N     %       N     %       N     %
Lack of time     8   22.86     5    9.80    25   25.77    24   25.81     6   31.58
Lack of access   0     0       2    3.92     4    4.12     5    5.38     2   10.53
Expense          0     0       1    1.96     1    1.03     1    1.08     0     0
Incentive        3    8.57     1    1.96     1    1.03     4    4.30     0     0
Availability     2    5.71     2    3.92     5    5.15     2    2.15     0     0
IST days         3    8.57     7   13.73    10   10.31     9    9.68     0     0
Other            3    8.57     5    9.80    12   12.37    11   11.83     4   21.05
No Resp.        16   45.71    28   54.90    39   40.21    37   39.78     7   36.84

Table 39. Agents, by Program Area, Who Have/Have Not Taken a Computer Course since 2000

            Agriculture      4-H          Marine         FYCS          Other
Response     N     %       N     %       N     %       N     %       N     %
Yes         53   39.26    22   44.00     5   41.67    52   60.47     9   56.25
No          82   60.74    27   54.00     7   58.33    33   38.37     7   43.75
No Resp.     0    0.00     1    2.00     0    0.00     1    1.16     0    0.00


Table 40. Reason, by Program Area, for Not Taking a Computer Course since 2000

                 Agriculture      4-H          Marine         FYCS          Other
Response          N     %       N     %       N     %       N     %       N     %
Lack of time     37   27.41    10   20.00     4   33.33    16   18.60     3   18.75
Lack of access    5    3.70     3    6.00     1    8.33     4    4.65     1    6.25
Expense           2    1.48     1    2.00     0    0.00     0    0.00     0    0.00
Incentive         4    2.96     2    4.00     0    0.00     0    0.00     2   12.50
Availability      8    5.93     1    2.00     0    0.00     2    2.33     0    0.00
IST days         11    8.15     6   12.00     1    8.33    11   12.79     0    0.00
Other            15   11.11     6   12.00     1    8.33    11   12.79     2   12.50
No Resp.         53   39.26    21   42.00     4   33.33    42   48.84     8   50.00

Two hundred ninety-five agents responded to the question "if you have a question about a computer-related issue, where are you most likely to seek an answer?" The majority, 52.17% (n = 156), indicated that they turned to a colleague or support staff in the office. The second most frequent response, at 24.41% (n = 73), was "from your district's computer support personnel." Table 41 shows this information. In addition to the supplied responses, thirty-three agents indicated "other" as their answer to this question and voluntarily offered various responses such as: county computer support personnel, IFAS help, Gainesville IT or county IT, office support staff or county computer support personnel, district computer personnel, or "my friend the computer geek."


Table 41. Where Agents Seek Answers to Computer-related Questions

Response                                                N        %
From a colleague or support staff in the office       156     52.17
From a colleague or support staff in another county     6      2.01
From your district's computer support personnel        73     24.41
You find the answer on your own                        27      9.03
Other                                                  33     11.04
No response                                             4      1.34

Demographic Snapshots by Age

Age group 20-30

This group of agents accounted for 11.71% of the respondents (n = 35). The majority of this group, 88.57%, reported average to above average computer skills. Insofar as computer use is concerned, 42.86% of this group spends 20+ hours a week on the computer, and another 40.00% spends between 11-20 hours a week on the computer. Of all the sources of computer knowledge that the survey inquired about, the most frequent response by this group was "learned in college or high school." More than half, 57.14%, have not taken a computer course since 2000, with 22.86% indicating that lack of time was the reason. When asked where they sought answers to their computer-related questions, 40.00% of this age group responded "from a colleague or support staff in the office."


Age group 31-40

This group of agents accounted for 17.06% of the respondents (n = 51). The majority of this group, 82.35%, reported average to above average computer skills. Insofar as computer use is concerned, 41.18% of this group reported spending 20+ hours a week on the computer, and another 39.22% spends between 11-20 hours a week on the computer. Of all the sources of computer knowledge that the survey inquired about, the most frequent response by this group was "self-taught at work." Of the respondents in this age group, 43.14% have not taken a computer course since 2000, with 13.73% indicating that too few in-service training days was the reason. The majority of this age group, 50.98%, indicated that they sought answers to computer-related issues "from a colleague or support staff in the office."

Age group 41-50

This group of agents accounted for 32.44% of the respondents (n = 97), and was the largest of the survey. The majority of this group, 87.63%, reported average to above average computer skills. Insofar as computer use is concerned, 36.08% of this group reports spending 20+ hours a week on the computer, and another 46.39% spends between 11-20 hours a week on the computer. Of all the sources of computer knowledge that the survey inquired about, the most frequent response by this group was "self-taught at work." Of the respondents in this age group, 53.61% have not taken a computer course since 2000, and 25.77% indicated that lack of time was the reason. This group responded 56.70% of the time that they sought answers to computer-related issues "from a colleague or support staff in the office."


Age group 51-60

This group of agents accounted for 31.10% of the respondents (n = 93), and was the second largest of the survey. The majority of this group, 82.80%, reported average to above average computer skills. Insofar as computer use is concerned, 39.78% of this group reports spending 20+ hours a week on the computer, and another 38.71% spends between 11-20 hours a week on the computer. Of all the sources of computer knowledge that the survey inquired about, the most frequent response by this group was "self-taught at work." Of the respondents in this age group, 52.69% have not taken a computer course since 2000, and 25.81% indicated that lack of time was the reason. Responding to the question about where they sought answers to computer-related issues, this group responded 51.61% of the time "from a colleague or support staff in the office."

Age group 61-70

This group of agents accounted for 6.35% of the respondents (n = 19), and was the most senior of the survey. The majority of this group, 63.16%, reported average computer skills. Insofar as computer use is concerned, 21.05% of this group reported spending 20+ hours a week on the computer, and another 36.84% spends between 11-20 hours a week on the computer. Of all the sources of computer knowledge that the survey inquired about, the most frequent response by this group was "learned at work through in-service training." Of the respondents in this age group, 57.89% have not taken a computer course since 2000, and 31.58% indicated that lack of time was the reason. The majority of 61-70 year old county agents, 63.16%, seek answers to computer-related issues from colleagues or support staff in the office.


Objective 2
Determine How County Extension Agents Are Using Information Technology on the Job in Terms of Hardware Use, and the Nature and Frequency of Use of Specific Types of Software

Connectivity, Hardware, and Operating System Use

Ninety-five percent (n = 285) of agents responding to the question as to whether they have a computer on their desk at the office said "yes." Ninety-eight percent (n = 293) of the agents responding to the question asking what sort of computer they had on their desk indicated an IBM PC clone with the Windows operating system. Ninety-nine percent (n = 296) of agents responding to whether the computer on their desk was connected to the Internet indicated "yes." The majority of agents, 68.90%, responded "yes" to the question about whether they did office work on their home computer. When asked if they used a laptop computer, 75.92% (n = 227) of the responses were "yes," with males (77.78%) and females (74.57%) being almost equally distributed. When asked if they used a Palm Pilot, iPAQ, or similar device, 34.78% (n = 104) of the respondents indicated "yes," with almost equal responses from males (36.51%) and females (33.53%).

Patterns of Use of Electronic Mail

Asked if they use e-mail, 100% (n = 299) of the respondents answered "yes." Agents were then asked to give their average daily use of e-mail. As shown in Table 42, 26.42% (n = 79) of the agents responded 31-45 minutes a day, and 25.08% (n = 75) responded 46-60 minutes a day. Asked if they use e-mail to communicate with clientele, a large majority of the agents, 91.97% (n = 275), said "yes." A follow-up question asked agents how often they e-mailed clientele during the month.


Table 42. Agents' Average Daily Use of E-mail

Average Daily Use             N        %
0-15 minutes a day           16      5.35
16-30 minutes a day          73     24.41
31-45 minutes a day          79     26.42
46-60 minutes a day          75     25.08
Over 60 minutes a day        56     18.73

Seventy-five respondents (25.08%) indicated that they e-mailed clientele more than 20 times a month, and another 75 (25.08%) responded 1-5 times a month. This information is provided in Table 43.

Table 43. How Often Agents E-mail Clientele during the Month

Number of Times per Month         N        %
1-5 times a month                75     25.08
6-10 times a month               58     19.40
11-15 times a month              36     12.04
16-20 times a month              36     12.04
More than 20 times a month       75     25.08
Not applicable                   17      5.69
No response                       2      0.67

Agents were also asked to estimate the number of clientele they reached via e-mail during a typical month.


As Table 44 reports, 56.86% (n = 170) of the respondents indicated they contact between 1 and 25 clientele per month via e-mail, and 29 agents (9.70%) reported contacting over 100 clientele per month via e-mail.

Table 44. Estimated Number of Clientele Reached via E-mail during a Typical Month

Est. Number of Clientele Reached        N        %
1-25 clientele per month              170     56.86
26-50 clientele per month              47     15.72
51-75 clientele per month              14      4.68
76-100 clientele per month             19      6.35
More than 100 clientele per month      29      9.70
Not applicable                         16      5.35
No response                             4      1.34

Patterns of Use of Word Processing Software

Word processing software is used by 96.66% (n = 289) of the responding county agents. Agents were asked to give their average daily use of word processing software. Table 45 shows that 25.08% (n = 75) of the respondents indicated they use word processing software more than 90 minutes a day. Another 23.41% (n = 70) of the respondents indicated using word processing software between 46-60 minutes a day. When asked which word processing program they most often used, 46.49% (n = 139) of the respondents to this question indicated Corel WordPerfect, 44.48% (n = 133) indicated MS Word, and 5.35% (n = 16) indicated "other." Eleven agents (3.68%) did not respond to the question.


Table 45. Agents' Average Daily Use of Word Processing Software

Average Daily Use                 N        %
0-15 minutes a day               16      5.35
16-30 minutes a day              28      9.36
31-45 minutes a day              40     13.38
46-60 minutes a day              70     23.41
61-90 minutes a day              61     20.40
More than 90 minutes a day       75     25.08
No response                       9      3.01

Patterns of Use of Spreadsheet Software

Asked if they use spreadsheet software, 60.87% (n = 182) said "yes." Agents were then asked to give, in terms of minutes, their average monthly use of spreadsheet software. Table 46 shows that 44 agents (14.72%) use spreadsheet software between 0 and 15 minutes a month, and another 38 agents (12.71%) indicated that they use spreadsheet software 16-30 minutes a month. When asked which spreadsheet software they used most often, a plurality of all respondents, 48.83% (n = 146), indicated Microsoft Excel, whereas 10.03% (n = 30) indicated Corel Quattro Pro and 1.34% (n = 4) indicated "other." One hundred nineteen agents did not respond to the question about the most frequently used brand of spreadsheet software.


Table 46. Agents' Average Monthly Use of Spreadsheet Software

Average Monthly Use                N        %
0-15 minutes a month              44     14.72
16-30 minutes a month             38     12.71
31-45 minutes a month             28      9.36
46-60 minutes a month             25      8.36
61-90 minutes a month             17      5.69
More than 90 minutes a month      30     10.03
No response                      117     39.13

Patterns of Use of Presentation Software

Asked if they use presentation software such as Microsoft PowerPoint or Corel Presentations, 245 agents (81.94%) answered "yes." Table 47 shows the number of times per year that agents estimated using presentation software. Seventy-six agents (25.42%) reported using presentation software over 20 times a year, and another 51 respondents (17.06%) indicated average use of presentation software between 6-10 times a year. Asked which presentation software they most often used, the majority of agents, 74.92% (n = 224), responded that they used Microsoft PowerPoint, while 6.02% (n = 18) used Corel Presentations and 1.34% (n = 4) indicated that they used another product. Fifty-three agents (17.73%) did not respond to the question.


Table 47. Agents' Estimated Average Yearly Use of Presentation Software

Average Yearly Use             N        %
0-5 times a year              59     19.73
6-10 times a year             51     17.06
11-15 times a year            33     11.04
16-20 times a year            30     10.03
More than 20 times a year     76     25.42
No response                   50     16.72

Patterns of Use of the World Wide Web

Agents were asked if they could surf or browse the Internet. Two hundred ninety-four, or 98.33%, of the respondents answered "yes." When asked the open-ended question "in general, what is your opinion of the World Wide Web and its use in Extension work," 226 agents voluntarily responded in writing. Almost all of these agents, 97.34% (n = 220), used very strong positive statements such as: "Indispensable"; "Wonderful resource"; "Vital for quick information & ideas"; "Absolutely essential"; "Invaluable"; "Very important."

Patterns of Web Page Editing/Creation Activity

When asked whether they edit or create Web pages, 22.74% (n = 68) of agents said "yes." Responding to the related question "who is primarily responsible for maintaining your county's Extension service Web site," 21 respondents (7.02%) indicated office support staff, followed closely by "I am" (6.69%, n = 20) and "other" (4.68%, n = 14).


Objective 3
Determine County Extension Agents' Perceived Level of Skill with Regard to a Specific Set of Information Technology Tasks

The study endeavored to generate a more objective assessment of agents' computer skills by asking agents to self-report whether they could perform specific tasks with each of the six types of software that the study considered (e-mail, word processors, spreadsheets, presentation software, Web browsing, and Web development). To these ends, a series of yes/no questions about specific skills associated with the six types of software was asked. As indicated in Table 48, questions about e-mail skills and skills associated with surfing the World Wide Web received high percentages of "yes" responses, followed by diminishing levels of "yes" responses for word processing skills, presentation software skills, spreadsheet skills, and Web page editing/development skills. For purposes of comparison, this analysis was done for the Paper Respondents (n = 21), the All On-line Respondents (n = 278), and the total population (n = 299).


Table 48. Number and Percent of Respondents Indicating "Yes" to Specific Skills Questions

                                                    Paper          All On-line         All
                                                  Respondents      Respondents      Respondents
Specific Software Skill                            N      %         N      %         N      %
E-mail
Do you use e-mail?                                21   100.00      278  100.00      299  100.00
Can you attach and send files (attachments)
  through e-mail?                                 18    85.71      267   96.04      285   95.32
Are you a member of an e-mail listserv?           18    85.71      263   94.60      281   93.98
Can you find addresses in your e-mail
  program's address book?                         19    90.48      262   94.24      281   93.98
Can you create and use e-mail distribution
  lists using your e-mail program?                16    76.19      198   71.22      214   71.57
Can you access your e-mail away from the
  office?                                         13    61.90      184   66.19      197   65.89
Do you use e-mail folders to organize sent
  or received e-mail messages?                    12    57.14      167   60.07      179   59.87
Word Processing
Do you use word processing software?              20    95.24      269   96.76      289   96.66
Can you use edit features such as cut and
  paste?                                          19    90.48      263   94.60      282   94.31
Can you set page margins?                         17    80.95      246   88.49      263   87.96
Can you create tables?                            14    66.67      222   79.86      236   78.93
Can you set tabs?                                 16    76.19      212   76.26      228   76.25


Table 48. Continued

                                                    Paper          All On-line         All
                                                  Respondents      Respondents      Respondents
Specific Software Skill                            N      %         N      %         N      %
Word Processing (continued)
Can you perform a mail merge using a dataset
  of names, etc.?                                  4    19.05       91   32.73       95   31.77
Spreadsheet
Do you use spreadsheet software such as MS
  Excel or Corel Quattro Pro?                      8    38.10      174   62.59      182   60.87
Can you format cells in a spreadsheet to
  number, currency, etc.?                          3    14.29      150   53.96      153   51.17
Can you sort data in a spreadsheet?                5    23.81      120   43.17      125   41.81
Can you write formulas in a spreadsheet?           1     4.76      120   43.17      121   40.47
Can you create a graph or chart using the
  spreadsheet software?                            1     4.76      116   41.73      117   39.13
Can you use nested functions in a
  spreadsheet?                                     2     9.52       62   22.30       64   21.40
Presentation Software
Do you use presentation software?                 16    76.19      229   82.37      245   81.94
Can you use different views in the
  presentation software package?                  16    76.19      212   76.26      228   76.25
Can you insert graphics and pictures from a
  variety of resources?                           15    71.43      190   68.35      205   68.56
Can you create a master slide?                    14    66.67      186   66.91      200   66.89
Can you create a slide show that runs
  automatically?                                  10    47.62      163   58.63      173   57.66


Table 48. Continued

                                                    Paper          All On-line         All
                                                  Respondents      Respondents      Respondents
Specific Software Skill                            N      %         N      %         N      %
Presentation Software (continued)
Can you create automatic builds and
  transitions?                                     7    33.33      140   50.36      147   49.16
World Wide Web
Can you surf or browse the Internet?              21   100.00      273   98.20      294   98.33
Can you use a search engine such as Yahoo or
  Google to find Web pages?                       20    95.24      273   98.20      293   97.99
Can you download files from the Internet?         21   100.00      263   94.60      284   94.98
Can you bookmark frequently used Web pages?       19    90.48      256   92.09      275   91.97
Web Page Editing/Development
Do you create or edit Web pages?                   3    14.29       65   23.38       68   22.74
Can you edit Web pages?                            3    14.29       59   21.22       62   20.74
Can you create hyperlinks?                         2     9.52       55   19.78       57   19.06
Can you create a Web page using MS FrontPage
  or another HTML editor?                          2     9.52       54   19.42       56   18.73
Can you incorporate graphics into Web pages?       2     9.52       54   19.42       56   18.73
Can you convert existing files into HTML?          0     0.00       48   17.27       48   16.05
Can you create a Web page using native HTML?       0     0.00       21    7.55       21    7.02


Percentages of "yes" responses to the specific software skills questions were very similar across the three categories of respondents, with the notable exception of spreadsheet use, for which the percentage of "yes" responses was much lower for the Paper Respondents group.

Objective 4
As a Means to Recommend Future Information Technology Training, Describe the Relationship between Agents' Perceived Importance of, and Self-Assessed Knowledge about, Each of a Specific Set of Information Technology Skills

This aspect of the analysis follows that used by Albright in her similar (2000) study of county Extension agents in the state of Texas. In that study, Albright showed that agents' self-assessed knowledge about an information technology skill is closely correlated with their ability to apply the skill on the job. Agents reporting a high level of knowledge about a skill are therefore, in all likelihood, able to apply that skill to their jobs. For the study at hand, this close correlation between knowledge and application held true. Table 49 provides the results of the correlation.


Table 49. Product-Moment Correlation Coefficients for Knowledge and Application by Skill Category

                          E-mail      Word Proc.  Spreads.    Present.    WWW         Web Dev.
                          Applic.     Appl.       Appl.       Appl.       Appl.       Appl.
E-mail Knowledge          .844***
                          (N = 297)
Word Processing           .516***     .918***
Knowledge                 (N = 288)   (N = 287)
Spreadsheet Knowledge     .433***     .468***     .918*
                          (N = 182)   (N = 178)   (N = 181)
Presentation Software     .479***     .551***     .306***     .915***
Knowledge                 (N = 245)   (N = 239)   (N = 165)   (N = 244)
WWW Knowledge             .564***     .540***     .441***     .519***     .916***
                          (N = 296)   (N = 287)   (N = 181)   (N = 244)   (N = 296)
Web Devlpmt. Knowledge    .350*       .511***     .526***     .506***     .553***     .930***
                          (N = 69)    (N = 69)    (N = 59)    (N = 63)    (N = 69)    (N = 69)

*Pr < 0.01   **Pr < 0.001   ***Pr < 0.0001

Albright (2000), following methodology described by Borich, posits that a weighted score combining an agent's knowledge of a skill with the level of importance that the agent ascribes to that skill "is a stronger relationship to consider for training" (p. 84) than is the knowledge score alone. This weighting functions as a mathematical equation using the mean values of the knowledge and importance constructs: (Importance Mean - Knowledge Mean) x Importance Mean = Training Need (Albright, 2000, p. 87).


The effect of the weighting is to accentuate the importance of a skill when knowledge of that skill is small. Thus, for example, if agents respond in a manner indicating that a skill is important, but also respond that they have little knowledge about that skill, the value of the importance of that skill will be increased. Performed for all skills of interest, the weighted knowledge scores are then ranked, with the most pressing training needs garnering the highest values.

Using the weighting method described above, training needs were calculated and ordered for each of the six IT skills areas the study considered. The results are listed in Tables 50 and 51, and show that e-mail skills, with a weighted knowledge score of 4.33, appeared at the top of the list of training needs, followed by presentation software skills (3.32), word processing skills (3.27), World Wide Web skills (1.81), Web editing/development skills (1.55), and lastly spreadsheet skills (0.99).

Table 50. Mean Importance and Knowledge Construct Scores and Weighted Knowledge Score by Skills Category

Skill Category      Importance Mean   Knowledge Mean   Weighted Knowledge Score
E-mail                   4.42              3.44                 4.33
Word Processing          4.36              3.61                 3.27
Spreadsheet              3.00              2.67                 0.99
Presentation             4.10              3.29                 3.32
WWW                      3.95              3.49                 1.81
Web Development          3.45              3.00                 1.55
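The ranking that follows in Table 51 falls directly out of this formula. As a minimal sketch in Python, using the rounded construct means from Table 50 (so the last digit of a score may differ slightly from the published table):

    # Borich-style weighted training-need scores from the Table 50 construct means.
    means = {  # skill: (importance mean, knowledge mean)
        "E-mail": (4.42, 3.44),
        "Word Processing": (4.36, 3.61),
        "Spreadsheet": (3.00, 2.67),
        "Presentation": (4.10, 3.29),
        "WWW": (3.95, 3.49),
        "Web Development": (3.45, 3.00),
    }

    # Weighted score = (importance - knowledge) * importance, ranked high to low.
    scores = {skill: (imp - know) * imp for skill, (imp, know) in means.items()}
    for rank, (skill, score) in enumerate(
            sorted(scores.items(), key=lambda kv: kv[1], reverse=True), start=1):
        print(f"{rank}. {skill}: {score:.2f}")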


Table 51. Borich (1990) Training Need Ranking

Skill Category      Weighted Knowledge Score   Rank of Training Need
E-mail                       4.33                       1
Presentation                 3.32                       2
Word Processing              3.27                       3
WWW                          1.81                       4
Web Development              1.55                       5
Spreadsheet                  0.99                       6

Respondents were then differentiated by gender and the Borich model applied in the same manner as above. As Tables 52 and 53 indicate, a different ranking of training need is evident when gender is considered. For males the top three needs are presentation software, e-mail, and word processing. For females the top three training needs are e-mail, word processing, and presentation software.


Table 52. Mean Importance and Knowledge Construct Scores and Weighted Knowledge Score by Skills Category and Gender

                              Male                           Female
Skill Category     Import.  Know.   Wted. Know.   Import.  Know.   Wted. Know.
                   Mean     Mean    Score         Mean     Mean    Score
E-mail             4.36     3.48    3.84          4.46     3.42    4.64
Word Processing    4.28     3.51    3.32          4.42     3.67    3.30
Spreadsheet        2.97     2.79    0.55          3.02     2.58    1.31
Presentation       4.15     3.22    3.89          4.06     3.35    2.88
WWW                3.97     3.53    1.74          3.94     3.46    1.86
Web Development    3.65     3.40    0.91          3.27     2.64    2.06

Table 53. Borich (1990) Training Need Ranking, Differentiated by Gender

                          Male                       Female
Skill Category     Weighted Know. Score  Rank   Weighted Know. Score  Rank
E-mail                   3.84             2           4.64             1
Word Processing          3.32             3           3.30             2
Spreadsheet              0.55             6           1.31             6
Presentation             3.89             1           2.88             3
WWW                      1.74             4           1.86             5
Web Development          0.91             5           2.06             4


Respondents were then differentiated according to age: respondents 20-41 years of age formed one group, and those aged 41 and above formed the second group. As Tables 54 and 55 indicate, the ranking of training needs differs for the two groups. For the 20-41 age group the top three needs are e-mail, word processing, and presentation software; note that this ranking is the same as the female respondents' training ranking. For the 41-and-above age group the top three training needs are e-mail, presentation software, and word processing.

Table 54. Mean Importance and Knowledge Construct Scores and Weighted Knowledge Score by Skills Category and Age Group

                            Ages 20-41                   Age 41 and Above
Skill Category     Import.  Know.   Wted. Know.   Import.  Know.   Wted. Know.
                   Mean     Mean    Score         Mean     Mean    Score
E-mail             4.51     3.65    3.88          4.39     3.36    4.52
Word Processing    4.48     3.83    2.92          4.31     3.50    3.47
Spreadsheet        3.12     2.89    0.73          2.95     2.60    1.02
Presentation       4.10     3.53    2.32          4.10     3.18    3.75
WWW                4.05     3.82    0.94          3.90     3.37    2.10
Web Development    3.65     3.26    1.40          3.31     2.88    1.44


Table 55. Borich (1990) Training Need Ranking, Differentiated by Age Group

                         Ages 20-41                  Age 41 and Above
Skill Category     Weighted Know. Score  Rank   Weighted Know. Score  Rank
E-mail                   3.88             1           4.52             1
Word Processing          2.92             2           3.47             3
Spreadsheet              0.73             6           1.02             6
Presentation             2.32             3           3.75             2
WWW                      0.94             5           2.10             5
Web Development          1.40             4           1.44             4

Summary

The 299 respondents were categorized according to gender, age, program area, and when and how they submitted their completed survey instrument. Summary statistics were provided for the respondents, showing percentages of males, females, age groups, etc. T-tests revealed no significant difference in mean self-rated computer score between males and females, but an analysis of variance did show differences in this score between certain of the age groups. A t-test showed a significant difference in self-rated computer skills between the Early On-line Respondents and Late On-line Respondents.

Analysis of the three construct variables designed to reveal agents' future training needs was then conducted for the following groups of respondents: the undifferentiated group of respondents, male respondents, female respondents, respondents 20-41 years of age, and respondents 41 years of age and above.


For the undifferentiated group of respondents, e-mail topped the list of training needs, followed by presentation software, word processing, WWW use, Web development, and spreadsheet skills. For males the top three training needs were shown to be presentation software, e-mail, and word processing. For females the top three training needs were e-mail, word processing, and presentation software. The 20-41 age group's top three training needs were e-mail, word processing, and presentation software. The over-41 age group's top three training needs were e-mail, presentation software, and word processing.

As a means to measure consistency between responses for the various items, Cronbach's coefficient alpha was computed. This figure was 0.7962.


CHAPTER 5
SUMMARY, CONCLUSIONS, AND RECOMMENDATIONS

Summary

This study sought to determine how county Extension agents of the Florida Cooperative Extension Service are using information technology, and what future information technology training those agents might need. To these ends, agents were asked about their use of hardware and software, and also to self-assess their overall level of computer skills. Data were returned from 90.3% of the population of county agents and subjected to statistical analysis. This chapter offers a summary, conclusions, and recommendations based upon the data.

Procedure

The population for this study was county Extension agents in the employ of the University of Florida, Florida Cooperative Extension Service. In July 2002, when the study's survey commenced, this population was 331. Regular and courtesy Extension agents were included.

The survey instrument, adapted from Albright (2000), was revised and then reviewed by a panel of experts. In its final form this instrument contained 99 individual items that collected information on software skills, patterns of IT use, the three constructs used to assess future software training needs, and personal and situational factors. The instrument was made available to the population via a dedicated university-hosted Web site. Agents were assigned their own individual codes that allowed access to the instrument and identified their participation.


Information was collected on numerous independent variables of interest, which included: the type of computer used at the office; whether the agent has their own computer on their desk; whether their office computer is connected to the Internet; whether the agent uses a laptop PC or a Palm Pilot-type device; what peripheral devices are used at the office; estimated hourly computer use per week; whether the agent does office work on their home computer; where most of their computer knowledge was learned; overall computer skills rating; whether the agent has taken a computer course since the year 2000; the principal reason for not taking a computer class; where questions about computer-related issues are answered; whether agents use e-mail, word processing, spreadsheets, presentation software, surf the World Wide Web, or are able to create or edit Web pages; and the agent's years of work experience and age. The survey avoided asking many personal and situational questions, such as gender, rank, Extension district, and county, by using the unique access code that tied each individual response to a comprehensive database.

Notification of the study, its Web address, the agent's unique access code, and an introductory message from the researchers was transmitted to each individual Extension agent via electronic mail. Those individuals who had invalid e-mail addresses were sent a paper copy of the notification message by way of the United States Postal Service. After three weeks' duration, non-respondents were sent a paper version of the instrument, introductory letter, etc., also by way of the United States Postal Service.1 Ninety percent (n = 299) of the population of county agents responded.

Limitations of the Study

One limitation of the study is that it was a census of a single population, the county Extension agents of the FLCES. Given the nature of this type of study, the specific IT infrastructure in place within the FLCES, and the specific IT knowledge and skills possessed by FLCES county agents, the findings of the study cannot be generalized to Extension organizations elsewhere, though they are likely to offer insight to those organizations.

Another limitation of the study was that it was not based on an objective demonstration of information technology skills competency, but rather on the felt needs, perceptions, and self-ratings of the respondents. Under these circumstances, and simply under the normal circumstances of a study like this, a certain element of bias can be introduced, possibly skewing the results. It can be rationalized, however, that the respondents to this study had little reason to either overtly or inadvertently bias the results.2 Respondents were guaranteed anonymity, the information solicited was in essence benign, and the overall rationale for the survey was supportive of the respondents' careers as county Extension agents. Furthermore, this study shared similar findings with Albright's 2000 study of county Extension agents in the state of Texas, a consistency that appears to lend veracity to the responses here recorded.

1 For what it is worth: the principal researcher used regular-issue self-adhesive stamps that, in his estimation, gave the cover a look distinct from that of most other (especially metered) mail. The combination of stamps required for the paper survey packet had a striking contrast of colors.

2 One must also consider the possibility that response might be biased by agents answering in a manner that they feel will please administrative entities.


Key Findings and Implications

One key finding of this study is that female respondents are significantly younger than the male respondents, with many more females than males in the two youngest age groups. A related key finding is that, by a margin of over 15%, the majority of respondents are female. (Recall that the distribution of gender amongst the respondents is essentially equal to the distribution of gender for the population of agents at the inception of the study.) An implication of these findings is that the gender demographics of the Florida Cooperative Extension Service appear to be changing. Perhaps evidencing this change is that respondents to Ruppert's 1992 study of county Extension agents in Florida were 48.8% female.

Another key finding is that agents in the two youngest age groups (those aged 20-30 and 31-40) indicated most frequently that college or high school is where they obtained their computer knowledge. This was not the case for the older age groups, which more often indicated other sources of computer knowledge. The implication here is that younger agents, having already developed computer skills, may benefit from, and be more attracted to, computer training that considers subject matter in a more in-depth manner.

Key findings associated with agents' use of hardware include: three-quarters of the respondents reported using a laptop computer; approximately two-thirds of the respondents use a computer from 16 to over 20 hours a week; and over two-thirds of the respondents do office work on their home computer. The implication of these findings is that the Extension organization is now extensively utilizing information technology.


Key findings associated with agents' use of software include: the vast majority of agents use e-mail to communicate with clientele; over three-quarters of the agents use presentation software; and just over 20% of the agents responded that they could edit or create Web pages. An implication of these findings is that agents have adopted, or are adopting, these powerful technological tools, and that this may portend a change in the way traditional Extension services are delivered.

There are two key findings associated with computer use and gender. The first is that there was no significant difference in mean self-rated computer skill between males and females. The second is that there was no significant difference in mean hours of weekly computer use between males and females. Implication: in this study males and females are at parity in terms of perceived skill and use of information technology.

Another key finding of the study was that the two oldest age groups (ages 51-60 and 61-70) had self-rated computer skills scores that were significantly lower than those of the younger age groups. This jibes in part with another finding, which was that agents in the oldest age group use their computers significantly less than do the other age groups. Implication: older individuals in the organization may not have adopted information technology as robustly as have their younger colleagues.

In addition to the previous finding, the study also showed that there were no significant differences in mean self-rated computer skills score when the agents were grouped into program areas. A related key finding is that there were no significant differences in mean weekly computer use between the program area groups.


Implication: the level of information technology use, and the ability to use information technology, appear to be similar for each program area.

A key finding with respect to when on-line respondents tendered completed surveys was that the Early On-line Respondents group had a significantly lower mean self-rated computer skills score than the Late On-line Respondents. Implication: self-rated computer skill appears to be related to when agents responded, and response timing may thus reflect how readily agents utilize the technology to conduct their daily business.

Analysis of the three constructs designed to determine training need produced another key finding for the study: e-mail skills, followed by presentation software skills, were derived from agents' responses to be the number one and number two training needs. This ranking is based on what the agents perceived to be important. This felt importance is indicative of both an interest in, and a desire to learn, the subject matter (H. W. Ladewig, personal communication, October 2002). The implication, then, is for appropriate entities to address information technology training needs vis-à-vis the felt needs of their employees.

Discussion

Ten years ago, in 1992, Ruppert conducted a study on computer use by county Extension agents of the Florida Cooperative Extension Service. All county agents, a population of 277 at that time, were asked to participate in the study by way of a paper survey. Response to the survey was a remarkable 94.22% (n = 261) (Ruppert, 1992). Ruppert reported that 92.70% of the respondents had access to a computer, on average, 28.64 hours a week. However, 54.4% of the respondents did not have a computer on their desk. Many agents shared a computer with a colleague or office staff. Slightly less than one-third of the respondents reported that they used a computer at home.

Six areas of computer expertise were considered in Ruppert's study: VAX3 use, word processing, databases, spreadsheets, CD-ROM, and computer graphics. For each of the six areas of expertise, agents were categorized as "nonusers," "novices," "intermediate," or "old hand" according to their self-reported ability to apply skills to those areas. Using the level-of-use categories, Ruppert reported an overall mean score for the six areas of computer expertise she considered. This figure was 1.01, where 0 corresponded to "nonuser," 1 corresponded to "novice," 2 corresponded to "intermediate," and 3 corresponded to "old hand." The average agent, thus, was slightly above the level of a novice user. It should be noted, as a means of comparison to Ruppert's finding, that the large majority of respondents (over 92%) to the current study rated their overall computer skills from "average" to "excellent."

There is a stark contrast to be made between the results reported ten years ago by Ruppert and those reported by the study at hand. For the 2002 survey, 95.32% of the respondents reported having a computer on their desk at the office (45.6% in 1992), and 99.00% of those computers are connected to the Internet. The majority (63.88%) of agents reported in 2002 that they use their computers from 16 to over 20 hours a week (average availability of a computer was 28.64 hours a week in 1992). Over two-thirds of the 2002 respondents indicated that they do office work on their home computer (it was slightly less than one-third in 1992).

3 The VAX was a minicomputer manufactured by the Digital Equipment Corporation. Among other things, it provided the IFAS organization with computer networking and an e-mail facility.


In 2002, 96.66% of the respondents indicated that they use word processing software, up from 79.00% in 1992. The share of agents who reported using spreadsheet software climbed from 37.70% in 1992 to 60.87% in 2002. Database use, CD-ROM use, and computer graphics, examined by Ruppert in 1992, were not considered by this study. VAX use, as performed in 1992, has become obsolete.

E-mail, Web browsing, and presentation software are new areas of computer-related tasks that have come to the fore, and are recording very high levels of use by county Extension agents. E-mail now facilitates communications throughout the Extension organization, both for administrative purposes and in serving clientele need. The vast majority (98.33%) of county agents surf the World Wide Web, and indicate that it is an extremely important resource for work-related information. Web sites are also used to disseminate information to clientele, and for Extension administrative purposes such as registering for in-service training. Presentation software is used by 81.94% of the respondents, which is perhaps an endorsement of the efficacy of this genre of software and its importance in delivering educational information. High use of presentation software may also be construed as an indication that other information technology, such as digital projectors, has been adopted.

That over two-thirds of the agents reported using their computers from 16 to over 20 hours a week leads this researcher to ask whether information technology is becoming the tail that is wagging the Extension dog. Indeed, the figure on computer use included time both at home and at the office, but for the sake of argument let us assume that most of that computer time is expended at the office. For a normal workweek of 40 hours, this would mean agents are on the computer approximately half their working hours. One wonders if this does not herald some sort of fundamental change in Extension work; that farm visits, face-to-face contact with clientele, demonstrations, and answering telephone calls are being supplanted by e-mail messages, Web sites, and so forth.

The fourteenth question in the survey asked agents to indicate the one main reason that they had not taken a computer course since the year 2000. As previously reported in Chapter 4, the majority of respondents to this question indicated that "lack of time" was the reason they have not taken a computer course. This is the same finding as reported by Albright (2000). There does appear, however, to be some lack of specificity to this response. A question thus arises as to just what exactly "lack of time" means. Any future research investigating why agents are not taking computer courses would be advised to reformulate the response categories to this question in such a manner as to home in on more specific reasons for not participating in training.

Commentary regarding the results of the Borich needs assessment conducted by this study is here offered: an examination of the mean "knowledge" construct scores generated by this study shows high levels of knowledge for the skills areas ranking highest for training needs. Thus, for example, we see that though e-mail was ranked the number one training need for the undifferentiated group of respondents, the mean level of knowledge reported for this skill was 3.44 on a 5-point scale. As indicated, this effect was recorded for the other top two training needs.

It undoubtedly would be a hard sell to convince Extension administrators to commit resources to train agents in those areas that they already appear to know the most about. This, then, seems to point to a disadvantage of the Borich model: though agents may indeed have a strong felt need for certain training, is this really the training needed to ensure that the organization's IT requirements are effectively being met? Perhaps additional, more objective, measures need to accompany the model.

On the other hand, given the importance of e-mail and presentation software to everyday Extension activities, it seems both logical and reasonable that the needs assessment facet of this study ascertained these two types of software to be, respectively, the number one and number two training needs of the undifferentiated group of county agents. Agents are in essence saying that this communication software that they use on a regular basis is very important to their jobs, and they want to know more about its effective use.

Is it reasonable, then, to say that the Borich methodology really does accurately assess what agents need training in? The answer seems to be affirmative insofar as detecting agents' desired training needs. But quite possibly those desires may not correspond to where some not-so-desired training may be useful, like spreadsheet use.

One last note on the Borich methodology: gauged by examining the patterns of "yes" responses to the specific skills questions that this study asked, the methodology did appear to correctly assess levels of knowledge. Thus, for example, high levels of "yes" response to the various e-mail skills corresponded to a high mean knowledge construct score.

Regarding gender parity and IT use: ten years ago Ruppert reported that females had less computer access than males, and that fewer females had a computer
on their desk than did males. Ruppert also reported that females were significantly more likely to have received computer skills training from someone else versus having learned them on their own (Ruppert, 1992). Whereas the latter appears to hold to this day (females are much more likely than males to have learned their computer skills in school, from co-workers, or from family members), the issues of computer availability appear to be a thing of the past.

Regarding program area and IT use: Ruppert (1992) reported a significantly lower "computer mean use score"4 for "home economics or other agents" (p. 101). Though the present study has no exact equivalent mean to contrast to that of Ruppert's study, it did find that there was no significant difference in the mean self-rated computer skills score between the program areas. The tide of information technology seems to have evenly washed through the organization's major programmatic areas.

4 This mean was derived from the results of the "Computing Concerns Questionnaire (CCQ)," which Ruppert administered as part of her survey of county Extension agents. The CCQ, based on concerns agents feel about computer use, measures level of adoption.

Discussion of the Methodology

Extension agents were notified three days in advance of the study by an e-mail message from Dr. Christine T. Waddill, Dean for Extension, who encouraged participation. Thereafter the initial e-mail message from the researchers was sent to the population of county agents, and proclaimed the study to be "a unique opportunity for you to express your opinion about computer use on your job, to identify computer hardware and software that you might need, and to provide information that will help determine what future computer training should be offered to county Extension agents." This initial message also provided a hyperlink to the Web-based instrument, and the recipient's unique access code.

It seems likely that the dean's endorsement, and the fact that all of the messages regarding the study were mailed under both the graduate student's and his committee chair's names, lent a large margin of authority to the request to participate. A clear association with the University of Florida and a department well known to Extension agents undoubtedly further established the authority of the study. Perhaps also the graduate student's previous employment in a high-visibility role in Extension lent a certain element of name recognition to the study. Between all the official and quasi-official attention, and perhaps due also to the timeliness of the topic, Extension agents probably perceived the study as important.

Though the official start date of the survey was Monday, July 8th, 2002, the message introducing the study was sent out on Sunday, July 7th, 2002. Response was immediate, with 6 completed surveys arriving that day. Monday the 8th brought 67 more completed surveys, and 28 arrived on Tuesday the 9th. A marked slow-down in response then occurred on Wednesday the 10th (5 surveys received) and Thursday the 11th (14 surveys received). On Thursday evening the researchers e-mailed a reminder message to all members of the population who had not yet returned a completed instrument. Response the next day (Friday the 12th) picked up, with 27 completed surveys received. Over the weekend 3 more completed surveys were tendered, and on Monday, July 15th, an additional 16 surveys were received. Response then dwindled again: Tuesday brought 8, Wednesday 7, and Thursday 2. On Thursday evening a second reminder was e-mailed by the researchers
to all members of the population who had not yet participated. Friday saw response once again climb, with 22 completed surveys being received. This pattern of heightened response after a reminder message repeated itself throughout the duration of the survey. Simply put, sending reminder messages on a regular basis was clearly an important factor in generating response. It should be noted that the principal researcher also appealed to each of the District Extension Directors to contact agents within their districts and encourage participation.

The researchers recorded relatively few requests for help. Several agents called or e-mailed with problems ranging from not being able to locate their access code to having a browser that was incompatible with the survey's Web site. Approximately four agents who received reminder messages to participate indicated to the researchers that they had indeed already submitted completed surveys. These agents were asked to complete the on-line survey again, and all successfully tendered completed surveys. There were, however, at least two known cases where agents unsuccessfully tried to submit completed surveys, in one case at least two times. It is not known why this situation existed, nor is it known how many respondents were thus thwarted in their effort to participate.

The data generated by the survey were tendered to the researchers by way of electronic mail. Virtually all of the returned surveys were filled out to completion. An algorithm created by the principal researcher's Web guru was run to separate the numeric information from the respondents' comments, and two flat files were generated. The flat file of numeric data was readily analyzable with the Statistical Analysis System (SAS).
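The record format used by that post-processing step is not documented here; purely as an illustration, and under the assumption that each e-mailed response arrived as one delimited line of fields, the numeric/comment split might be sketched as follows (Python; the pipe-delimited layout, file paths, and function names are hypothetical):

import csv

def _is_numeric(field):
    """True for answer fields like '4' or '3.5'."""
    return field.replace(".", "", 1).isdigit()

def split_responses(raw_path, numeric_path, comments_path):
    """Split raw survey records into a numeric flat file and a comments flat file."""
    with open(raw_path) as raw, \
         open(numeric_path, "w", newline="") as num_out, \
         open(comments_path, "w", newline="") as com_out:
        num_writer = csv.writer(num_out)
        com_writer = csv.writer(com_out)
        for line in raw:
            # Assumed layout: access code, then answer fields, pipe-delimited.
            code, *answers = line.rstrip("\n").split("|")
            num_writer.writerow([code] + [a for a in answers if _is_numeric(a)])
            com_writer.writerow([code] + [a for a in answers if a and not _is_numeric(a)])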


Agents seemed to have a positive disposition towards the study's Web-based format. Over two-thirds (68.56%) of the respondents rated Web-based surveys to be of average, above average, or high convenience. Over three-quarters (76.55%) said they would be likely, more than likely, or highly likely to respond to a Web-based survey. Agents' responses when asked to estimate how many Web-based surveys they had participated in ranged from a high of 45 to a low of 0 (mean = 5.60).

It should be noted that e-mail communication for the main survey was conducted very effectively by using MS Word in conjunction with MS Excel to perform a merge-to-e-mail routine. In this manner close to one thousand individualized e-mail messages were broadcast to the population over the duration of the study.

Conclusions

Based on the empirical evidence drawn from this study, the following major conclusions are offered:

1. The Web-based methodology used to administer the survey worked for both researcher and respondent, and worked very well.

2. Agents of the Florida Cooperative Extension Service have embraced information technology, and are using it on the job more than ever before.

3. It is evident that a significant change in the way county Extension agents use information technology has occurred in the past 10 years.

4. Agents have a desire to learn more about the staples of their everyday IT experience: e-mail, presentation software, word processing, and the Web. Future information technology training in these areas should go beyond the basics.

Recommendations

The following recommendations stem from this study:

1. The majority of agents report that they have not taken a computer course since 2000, and lack of time was given as the principal reason for not doing so. Further evidence, stemming from county agents' written commentary submitted with the survey, suggests that the computer classes that have been offered did not cover desired subject matter, were too basic, or were poorly conducted. It is recommended that these concerns be investigated and addressed. It is also recommended that computer training on the subjects shown by this study as top needs (e-mail and presentation software) be made a priority.

2. Further research should be done to develop an objective understanding of what information technology skills all agents should be expected to know. What would a suitable suite of IT skills include? Training should then be focused on this set of skills.

3. Research should be conducted on the most effective use of information technology by county agents of the Florida Cooperative Extension Service. Training on these findings should then take place.

4. Research should be conducted to examine exactly how county Extension agents are using software. For example, over 45% of agents in this study responded that their average use of word processing software was from 60 to over 90 minutes a day. What is this time being devoted to?

5. Research similar to the above should be done on county Extension agents' use of hardware.

6. Research should be done to determine exactly what hardware and software county agents are employing, and whether these products are useful and needed.

7. Research should be done to determine how county Extension agents are using the World Wide Web in conjunction with their educational efforts. Are they looking to other IFAS Web sites (possibly maintained by state Extension specialists) for information? Corporate Web sites? Web sites maintained by other state and national Extension organizations?

8. Research should be done to determine how county Extension agents are using the World Wide Web to disseminate information to clientele. The efficacy of this approach to clientele contact should be assessed, including gaining an understanding of which clientele are being served, and to what degree.

9. Ancillary to the above, research should be conducted to determine how Web site development and maintenance are impacting county Extension agents' jobs.

10. Future survey research on county Extension agents should strongly consider the Web-based methodology used by this study. High levels of computer literacy, access to the Internet, and other attributes make county agents a good choice for this efficient, economical means of collecting information.


APPENDIX A
SURVEY OF COMPUTER TECHNOLOGY SKILLS AND TRAINING NEEDS

Survey of Computer Technology Skills Summer 2002

...and thank you for taking a few minutes of your valuable time to participate. Please click on the box below to begin the survey.

Begin the survey


Survey of Computer Technology Skills Summer 2002

You are a pioneer, as this is the first Web-based survey conducted exclusively for all county Extension agents in Florida. Your participation is greatly appreciated, and you have our sincere thanks! The last survey on Florida county Extension agents' computer use was done 10 years ago. What a long way the technology has come since then, and how different the use of this important resource must now be!

Your participation in this study is completely voluntary, and there is no penalty for not participating. The purpose of the study is to gain an understanding of the level of skill, patterns of use, and workplace application of information technology amongst county Extension agents of the Florida Cooperative Extension Service. You do not have to answer any question you do not wish to answer, and you may withdraw from the study at any time without consequence. The survey takes approximately 15 minutes to complete. We believe there are no direct risks or benefits to you for participating in this study. Your information will be assigned a code number. The list connecting your name to this number will be kept in a locked file in a secure location. When the study is complete and the data have been analyzed, the list will be destroyed. Your name will never appear in any report. There is no compensation for participation in this research.

If you have any questions about your rights concerning this study, please contact the UFIRB Office, Box 12250, University of Florida, Gainesville, FL 32611-2250. If you have any questions about the survey, please contact either Dr. Tracy Irani (352) 392-0502 (e-mail IRANI@ufl.edu), or Austin Gregg (352) 392-1285 (e-mail JAGREGG@ufl.edu).

< Previous Page   Next Page >


Survey of Computer Technology Skills Summer 2002

Please click in the box and enter your survey access code:

Submit


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

The survey begins with some questions about the kind of computer hardware you use. NOTE: Please ONLY use the buttons found at the bottom of the survey's pages to go forward or back. Thanks!

1. What sort of computer do you primarily use at the office?
   An IBM PC clone with the Windows operating system / An Apple computer / Other:

2. Do you have your own computer on your desk at the office?
   Yes / No

3. If you have a computer on your desk at the office, is it connected to the Internet?
   Yes / No / I don't have a computer on my desk

4. Do you use a laptop computer for your job?
   Yes / No

5. Do you use a Palm Pilot, I-Paq or similar type device for your job?
   Yes / No


6. Please indicate the peripheral devices you might use at the office: Click "yes" or "no" to each
   A. Laser Printer: Yes / No
   B. Color Printer: Yes / No
   C. Other Type of Printer: Yes / No
   D. Scanner: Yes / No
   E. Web Cam: Yes / No
   F. Speakers connected to your computer: Yes / No
   G. CD Burner (device that allows you to create CDs): Yes / No

7. If you were to make a "wish list" of hardware or software that you would like to have at the office, what item would you most want?
   A computer projector / A laptop computer / A color printer / A new desktop computer / A scanner / New software / Other: Please indicate by typing in the box below...

8. What improvements in hardware and/or software do you believe would most positively impact your ability to serve clientele? Please indicate by typing in the box below...

Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

The survey now asks how often you use a computer, how you learned your computer skills, and where you turn for help when you have questions.

9. In a normal week, estimate the number of hours you spend using a computer (both at home and at work):
   0 hours / 1-5 hours / 6-10 hours / 11-15 hours / 16-20 hours / More than 20 hours

10. Do you do office work on your home computer?
   Yes / No / Not Applicable

11. Where have you learned most of your computer knowledge? Click "yes" or "no" to each
   A. Self taught at home: Yes / No
   B. Learned in college or high school: Yes / No
   C. Self taught at work: Yes / No
   D. Learned at work through in-service training, etc.: Yes / No
   E. Learned from family or friends outside of work: Yes / No
   F. Learned from co-workers at work: Yes / No
   G. Other:

12. How would you rate your overall computer skills?
   Very Poor / Poor / Average / Above Average / Excellent

13. Have you taken any computer courses since 2000? (These may include courses on word processing, spreadsheets, the Windows operating system, etc.)
   Yes / No


14. If not, what was the principal reason for not taking computer courses?
   Lack of time / Lack of access / Too expensive / No incentive / Not available / Too few In-service training days / Other: Please indicate by typing in the box below...

15. If you have a question about a computer-related issue, where are you most likely to seek an answer?
   From a colleague or support staff in the office / From a colleague or support staff in another county / From your district's computer support personnel / You find the answer on your own / Other: Please indicate by typing in the box below...

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

The survey now inquires if you use e-mail.

16a. Do you use e-mail?
   Yes / No

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

You have responded that you do use e-mail. The survey would like to ask you the following:

16b. On average, how much time do you devote to e-mail during the day?
   0-15 minutes a day / 16-30 minutes a day / 31-45 minutes a day / 46-60 minutes a day / More than 60 minutes a day

16c. Do you use e-mail to communicate with clientele?
   Yes / No

16d. How often do you e-mail clientele during the month?
   1-5 times a month / 6-10 times a month / 11-15 times a month / 16-20 times a month / More than 20 times a month / Not applicable

16e. Please estimate the number of clientele you reach via e-mail during a typical month:
   1-25 clientele / 26-50 clientele / 51-75 clientele / 76-100 clientele / More than 100 clientele a month / Not applicable

16f. Can you attach and send files (attachments) through e-mail?
   Yes / No

16g. Are you a member of an e-mail listserv that distributes professional information to you?
   Yes / No


16h. Can you find addresses in your e-mail program's address book?
   Yes / No

16i. Can you create and use e-mail distribution lists using your e-mail program?
   Yes / No

16j. Do you use e-mail folders to organize sent or received e-mail messages?
   Yes / No

16k. Can you access your e-mail away from the office using a laptop, PDA, or some other computer?
   Yes / No

16l. Please indicate if there are any other e-mail skills that are necessary for successful Extension employees:

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

Please consider the following:

How important are e-mail skills to your job (such as communicating with colleagues, sharing information with clientele, receiving important administrative messages, etc.)?
   1 low importance / 2 somewhat important / 3 average importance / 4 above average importance / 5 high importance

How would you describe your level of knowledge about e-mail skills?
   1 low level of knowledge / 2 somewhat knowledgeable / 3 average knowledge / 4 above average knowledge / 5 high level of knowledge

How would you describe your ability to apply e-mail skills to your job?
   1 low ability to apply the skills / 2 somewhat able to apply the skills / 3 average ability to apply the skills / 4 above average ability to apply the skills / 5 high ability to apply the skills

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

The survey now asks if you use word processing software.

17a. Do you use word processing software (such as Microsoft Word or Corel WordPerfect)?
   Yes / No

Your survey is now more than 1/4 complete!

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

You have responded that you do use word processing software. The survey would like to ask you the following:

17b. On average, how often do you use word processing during the day?
   0-15 minutes a day / 16-30 minutes a day / 31-45 minutes a day / 46-60 minutes a day / 61-90 minutes a day / more than 90 minutes a day

17c. Can you use edit features such as cut and paste?
   Yes / No

17d. Can you set page margins?
   Yes / No

17e. Can you set tabs?
   Yes / No

17f. Can you create tables with your word processing software?
   Yes / No

17g. Can you perform "mail merge" using a dataset of names/addresses and forms (i.e. a letter)?
   Yes / No


17h. Do you use word processing to create brochures, fact sheets, or other educational documents?
   Yes / No

17i. Which word processing software do you most often use?
   Corel WordPerfect / Microsoft Word / Other:

17j. Please indicate if there are any other word processing skills that are necessary for successful Extension employees:

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

Please consider the following:

How important are word processing skills to your job?
   1 low importance / 2 somewhat important / 3 average importance / 4 above average importance / 5 high importance

How would you describe your level of knowledge about word processing?
   1 low level of knowledge / 2 somewhat knowledgeable / 3 average knowledge / 4 above average knowledge / 5 high level of knowledge

How would you describe your ability to apply word processing skills to your job?
   1 low ability to apply the skills / 2 somewhat able to apply the skills / 3 average ability to apply the skills / 4 above average ability to apply the skills / 5 high ability to apply the skills

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

The survey now asks if you use spreadsheets.

18a. Do you use spreadsheet software such as Microsoft Excel or Corel Quattro Pro?
   Yes / No

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

You have responded that you do use spreadsheet software. The survey would like to ask you the following:

18b. On average, how often do you use spreadsheet software during the month?
   0-15 minutes a month / 16-30 minutes a month / 31-45 minutes a month / 46-60 minutes a month / 61-90 minutes a month / more than 90 minutes a month

18c. Can you format cells in a spreadsheet to number, or currency, etc.?
   Yes / No

18d. Can you sort data in a spreadsheet?
   Yes / No

18e. Can you use nested functions in a spreadsheet?
   Yes / No

18f. Can you create a graph or chart from data in a spreadsheet (using the spreadsheet software)?
   Yes / No

18g. Can you write formulas in a spreadsheet?
   Yes / No


18h. Which spreadsheet software do you most often use?
   Corel Quattro Pro / Microsoft Excel / Other:

18i. Please indicate if there are any other spreadsheet skills that are necessary for successful Extension employees:

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

Please consider the following:

How important are spreadsheet skills to your job?
   1 low importance / 2 somewhat important / 3 average importance / 4 above average importance / 5 high importance

How would you describe your level of knowledge about spreadsheet skills?
   1 low level of knowledge / 2 somewhat knowledgeable / 3 average knowledge / 4 above average knowledge / 5 high level of knowledge

How would you describe your ability to apply spreadsheet skills to your job?
   1 low ability to apply the skills / 2 somewhat able to apply the skills / 3 average ability to apply the skills / 4 above average ability to apply the skills / 5 high ability to apply the skills

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

Presentation software such as Microsoft PowerPoint can be used to add structure to a meeting. A presentation might include charts, graphics, Web links, and so forth.

19a. Do you use presentation software (such as Corel Presentations or Microsoft PowerPoint)?
   Yes / No

Just a few more minutes! You are now more than 1/2 way done!

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

You have responded that you do use presentation software. The survey would like to ask you the following:

19b. On average, how often do you use presentation software for your job?
   1-5 times a year / 6-10 times a year / 11-15 times a year / 16-20 times a year / More than 20 times a year

19c. Can you use different views in the presentation software package such as slide sorter, slide, notes page, or slide show?
   Yes / No

19d. Can you create a master slide?
   Yes / No

19e. Can you insert graphics and pictures from a variety of resources?
   Yes / No

19f. Can you create automated builds and transitions?
   Yes / No

19g. Can you create a slide show that runs automatically?
   Yes / No


19h. Which presentation software do you most often use?
   Corel Presentations / Microsoft PowerPoint / Other:

19i. Please indicate any other presentation skills that are necessary for successful Extension employees:

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

Please consider the following:

How important are presentation software skills to your job?
   1 low importance / 2 somewhat important / 3 average importance / 4 above average importance / 5 high importance

How would you describe your level of knowledge about presentation software?
   1 low level of knowledge / 2 somewhat knowledgeable / 3 average knowledge / 4 above average knowledge / 5 high level of knowledge

How would you describe your ability to apply presentation software skills to your job?
   1 low ability to apply the skills / 2 somewhat able to apply the skills / 3 average ability to apply the skills / 4 above average ability to apply the skills / 5 high ability to apply the skills

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

The survey would now like to ask you about the World Wide Web:

20a. Can you "surf" or browse the Internet?
   Yes / No

20b. Can you bookmark frequently used Web pages?
   Yes / No

20c. Can you download files from the Internet?
   Yes / No

20d. Can you use a search engine (such as Yahoo or Google) to find Web pages?
   Yes / No

20e. Which Web browser do you most often use?
   Netscape Navigator / Microsoft Internet Explorer / Other:

20f. Have you ever registered for In-service training using Extension's In-Service Training Web site?
   Yes / No

20g. Have you ever taken an on-line course to receive academic credit?
   Yes / No


20h. In general, what is your opinion of the World Wide Web and its use in Extension work?

20i. Please indicate any other Web browsing skills that are necessary for successful Extension employees:

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

Please consider the following:

How important are Web browsing skills to your job?
   1 low importance / 2 somewhat important / 3 average importance / 4 above average importance / 5 high importance

How would you describe your level of knowledge about Web browsing skills?
   1 low level of knowledge / 2 somewhat knowledgeable / 3 average knowledge / 4 above average knowledge / 5 high level of knowledge

How would you describe your ability to apply Web browsing skills to your job?
   1 low ability to apply the skills / 2 somewhat able to apply the skills / 3 average ability to apply the skills / 4 above average ability to apply the skills / 5 high ability to apply the skills

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

The survey would now like to ask whether you edit or create Web pages.

21a. Do you edit or create Web pages?
   Yes / No

Keep on clicking; the survey is almost done!

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

You have responded that you do edit or create Web pages. The survey would like to ask you the following:

21b. Can you edit Web pages?
   Yes / No

21c. Can you create a Web page using Microsoft FrontPage or another HTML editor?
   Yes / No

21d. Can you create a Web page using native HTML?
   Yes / No

21e. Can you create hyperlinks?
   Yes / No

21f. Can you incorporate graphics into Web pages?
   Yes / No

21g. Can you convert existing files into HTML?
   Yes / No

21h. Who is primarily responsible for maintaining your county's Extension service Web site?
   I am / Another Extension agent in the office / Office support staff / A hired consultant / Other:


21i. Please indicate any other Web page creation skills that are necessary for successful Extension employees:

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

Please consider the following:

How important are Web page creation skills to your job?
   1 low importance / 2 somewhat important / 3 average importance / 4 above average importance / 5 high importance

How would you describe your level of knowledge about Web page creation skills?
   1 low level of knowledge / 2 somewhat knowledgeable / 3 average knowledge / 4 above average knowledge / 5 high level of knowledge

How would you describe your ability to apply Web page creation skills to your job?
   1 low ability to apply the skills / 2 somewhat able to apply the skills / 3 average ability to apply the skills / 4 above average ability to apply the skills / 5 high ability to apply the skills

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

Please tell us what you think of Web-based surveys:

Compared to other types of surveys (paper, telephone, etc.), how do you rate the convenience of Web-based surveys?
   1 low convenience / 2 somewhat convenient / 3 average convenience / 4 above average convenience / 5 high convenience

How likely are you to respond to a Web-based survey as compared to other types of surveys (paper, telephone, etc.)?
   1 Not likely to respond / 2 Somewhat likely to respond / 3 Likely to respond / 4 More than likely to respond / 5 Highly likely to respond

How many Web-based surveys do you estimate you have participated in? Please indicate a number

Do you wish to add any comments about Web-based surveys?

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

Kindly indicate the following:

23. How many years of professional work experience do you have?
   Extension only: Less than 5 years / 5-10 years / 11-15 years / 16+ years
   Total work experience (including Extension): Less than 5 years / 5-10 years / 11-15 years / 16+ years

24. Please indicate the group that includes your age:
   20-30 / 31-40 / 41-50 / 51-60 / 61-70 / 71+

< Previous Page   Next Page >


Survey of Computer Technology Skills, Summer 2002

Please complete all questions. Please ONLY use the buttons at the bottom of the page to go forward or back in the survey... otherwise data might be lost.

If you would like to add any points you consider important to the use of computer technology by Extension agents, please feel free to add them in the space below. Any suggestions for training and other thoughts and comments you might have would be welcomed. Thank you again for your participation!

Submit Completed Survey

PLEASE CLICK THE BUTTON ABOVE TO SUBMIT THE SURVEY


APPENDIX B
THE SURVEY OF COMPUTER TECHNOLOGY SKILLS INSTRUMENT, PAPER VERSION


The survey begins with some questions about the kind of computer hardware you use:

1. What sort of computer do you primarily use at the office? (check only one)
( ) An IBM PC clone with the Windows operating system
( ) An Apple computer
( ) Other:

2. Do you have your own computer on your desk at the office?
( ) Yes ( ) No (please skip to question 4)

3. If you have a computer on your desk at the office, is it connected to the Internet?
( ) Yes ( ) No ( ) I don't have a computer on my desk

4. Do you use a laptop computer for your job?
( ) Yes ( ) No

5. Do you use a Palm Pilot, I-Paq or similar type device for your job?
( ) Yes ( ) No

6. Please indicate the peripheral devices you might use at the office (check all that apply):
( ) Laser Printer
( ) Color Printer
( ) Other type of Printer
( ) Scanner
( ) Web Cam
( ) Speakers connected to your computer
( ) CD burner

7. If you were to make a "wish list" of hardware or software that you would like to have at the office, what item would you most want? (check only one)
( ) A computer projector
( ) A laptop computer
( ) A color printer
( ) A new desktop computer
( ) A scanner
( ) New Software
( ) Other:

8. What improvements in hardware and/or software do you believe would most positively impact your ability to serve clientele?

The survey now asks how often you use a computer, how you learned your computer skills, and where you turn for help when you have questions.

9. In a normal week, estimate the number of hours you spend using a computer (both at home and at work):
( ) 0 Hours ( ) 1-5 Hours ( ) 6-10 Hours ( ) 11-15 Hours ( ) 16-20 Hours ( ) More than 20 Hours

10. Do you do office work on your home computer?
( ) Yes ( ) No ( ) Not Applicable

11. Where have you learned most of your computer knowledge? (check all that apply)
( ) Self taught at home
( ) Learned in college or high school
( ) Self taught at work
( ) Learned at work through In-service training, etc.
( ) Learned from family or friends outside of work
( ) Learned from co-workers at work
( ) Other:

Survey of Computer Technology Skills, Summer 2002, page 1


12. How would you rate your overall computer skills? (check only one)
( ) Very Poor ( ) Poor ( ) Average ( ) Above Average ( ) Excellent

13. Have you taken any computer courses since 2000? (These may include courses on word processing, spreadsheets, the Windows operating system, etc.)
( ) Yes ( ) No

14. If not, what was the principal reason for not taking computer courses? (check only one)
( ) Lack of time ( ) Lack of access ( ) Too expensive ( ) No incentive ( ) Not available ( ) Too few In-service training days ( ) Other:

15. If you have a question about a computer-related issue, where are you most likely to seek an answer? (check only one)
( ) From a colleague or support staff in the office
( ) From a colleague or support staff in another county
( ) From your district's computer support personnel
( ) You find the answer on your own
( ) Other:

16. The survey now inquires if you use e-mail.

16a. Do you use e-mail?
( ) Yes ( ) No (please skip to question 17 on page 3)

16b. On average, how much time do you devote to e-mail during the day?
( ) 0-15 minutes a day ( ) 16-30 minutes a day ( ) 31-45 minutes a day ( ) 46-60 minutes a day ( ) More than 60 minutes a day

16c. Do you use e-mail to communicate with clientele?
( ) Yes ( ) No (please skip to question 16f)

16d. How often do you e-mail clientele during the month?
( ) 1-5 times a month ( ) 6-10 times a month ( ) 11-15 times a month ( ) 16-20 times a month ( ) More than 20 times a month ( ) Not applicable

16e. Please estimate the number of clientele you reach via e-mail during a typical month:
( ) 1-25 clientele ( ) 26-50 clientele ( ) 51-75 clientele ( ) 76-100 clientele ( ) More than 100 clientele a month ( ) Not applicable

16f. Can you attach and send files (attachments) through e-mail?
( ) Yes ( ) No

16g. Are you a member of an e-mail listserv that distributes professional information to you?
( ) Yes ( ) No

16h. Can you find addresses in your e-mail program's address book?
( ) Yes ( ) No

16i. Can you create and use e-mail distribution lists using your e-mail program?
( ) Yes ( ) No

Survey of Computer Technology Skills, Summer 2002, page 2


16j. Do you use e-mail folders to organize sent or received e-mail messages?
( ) Yes ( ) No

16k. Can you access your e-mail away from the office using a laptop, PDA, or some other computer?
( ) Yes ( ) No

16l. Please indicate if there are any other e-mail skills that are necessary for successful Extension employees:

How important are e-mail skills to your job (such as communicating with colleagues, sharing information with clientele, receiving important administrative messages, etc.)?
( ) 1 low importance ( ) 2 somewhat important ( ) 3 average importance ( ) 4 above average importance ( ) 5 high importance

How would you describe your level of knowledge about e-mail skills?
( ) 1 low level of knowledge ( ) 2 somewhat knowledgeable ( ) 3 average knowledge ( ) 4 above average knowledge ( ) 5 high level of knowledge

How would you describe your ability to apply e-mail skills to your job?
( ) 1 low ability to apply the skills ( ) 2 somewhat able to apply the skills ( ) 3 average ability to apply the skills ( ) 4 above average ability to apply the skills ( ) 5 high ability to apply the skills

17. The survey now inquires if you use word processing software.

17a. Do you use word processing software?
( ) Yes ( ) No (please skip to question 18 on page 4)

17b. On average, how often do you use word processing during the day?
( ) 0-15 minutes a day ( ) 16-30 minutes a day ( ) 31-45 minutes a day ( ) 46-60 minutes a day ( ) 61-90 minutes a day ( ) more than 90 minutes a day

17c. Can you use edit features such as cut and paste?
( ) Yes ( ) No

17d. Can you set page margins?
( ) Yes ( ) No

17e. Can you set tabs?
( ) Yes ( ) No

17f. Can you create tables with your word processing software?
( ) Yes ( ) No

17g. Can you perform "mail merge" using a dataset of names/addresses and forms (i.e. a letter)?
( ) Yes ( ) No

17h. Do you use word processing to create brochures, fact sheets, or other educational documents?
( ) Yes ( ) No

17i. Which word processing software do you most often use?
( ) Corel WordPerfect ( ) Microsoft Word ( ) Other:

17j. Please indicate if there are any other word processing skills that are necessary for successful Extension employees:

Survey of Computer Technology Skills, Summer 2002, page 3


How important are word processing skills to your job?
( ) 1 low importance ( ) 2 somewhat important ( ) 3 average importance ( ) 4 above average importance ( ) 5 high importance

How would you describe your level of knowledge about word processing?
( ) 1 low level of knowledge ( ) 2 somewhat knowledgeable ( ) 3 average knowledge ( ) 4 above average knowledge ( ) 5 high level of knowledge

How would you describe your ability to apply word processing skills to your job?
( ) 1 low ability to apply the skills ( ) 2 somewhat able to apply the skills ( ) 3 average ability to apply the skills ( ) 4 above average ability to apply the skills ( ) 5 high ability to apply the skills

18. The survey now asks if you use spreadsheets.

18a. Do you use spreadsheet software such as Microsoft Excel or Corel Quattro Pro?
( ) Yes ( ) No (please skip to question 19 on page 5)

18b. On average, how often do you use spreadsheet software during the month?
( ) 0-15 minutes a month ( ) 16-30 minutes a month ( ) 31-45 minutes a month ( ) 46-60 minutes a month ( ) 61-90 minutes a month ( ) more than 90 minutes a month

18c. Can you format cells in a spreadsheet to number, or currency, etc.?
( ) Yes ( ) No

18d. Can you sort data in a spreadsheet?
( ) Yes ( ) No

18e. Can you use nested functions in a spreadsheet?
( ) Yes ( ) No

18f. Can you create a graph or chart from data in a spreadsheet (using the spreadsheet software)?
( ) Yes ( ) No

18g. Can you write formulas in a spreadsheet?
( ) Yes ( ) No

18h. Which spreadsheet software do you most often use?
( ) Corel Quattro Pro ( ) Microsoft Excel ( ) Other:

18i. Please indicate if there are any other spreadsheet skills that are necessary for successful Extension employees:

How important are spreadsheet skills to your job?
( ) 1 low importance ( ) 2 somewhat important ( ) 3 average importance ( ) 4 above average importance ( ) 5 high importance

How would you describe your level of knowledge about spreadsheet skills?
( ) 1 low level of knowledge ( ) 2 somewhat knowledgeable ( ) 3 average knowledge ( ) 4 above average knowledge ( ) 5 high level of knowledge

How would you describe your ability to apply spreadsheet skills to your job?
( ) 1 low ability to apply the skills ( ) 2 somewhat able to apply the skills ( ) 3 average ability to apply the skills ( ) 4 above average ability to apply the skills ( ) 5 high ability to apply the skills

Survey of Computer Technology Skills, Summer 2002, page 4

PAGE 169

160

19. Presentation software like Microsoft PowerPoint can be used to add structure to a meeting. A presentation might include charts, graphics, Web links, and so forth.

19a. Do you use presentation software such as Microsoft PowerPoint or Corel Presentations? ( ) Yes ( ) No (please skip to question 20)

19b. On average, how often do you use presentation software for your job? ( ) 1-5 times a year ( ) 6-10 times a year ( ) 11-15 times a year ( ) 16-20 times a year ( ) More than 20 times a year

19c. Can you use different views in the presentation software package such as slide sorter, slide, notes page, or slide show? ( ) Yes ( ) No

19d. Can you create a master slide? ( ) Yes ( ) No

19e. Can you insert graphics and pictures from a variety of resources? ( ) Yes ( ) No

19f. Can you create automated builds and transitions? ( ) Yes ( ) No

19g. Can you create a slide show that runs automatically? ( ) Yes ( ) No

19h. Which presentation software do you most often use? ( ) Corel Presentations ( ) Microsoft PowerPoint ( ) Other:

How important are presentation software skills to your job? ( ) 1 low importance ( ) 2 somewhat important ( ) 3 average importance ( ) 4 above average importance ( ) 5 high importance

How would you describe your level of knowledge about presentation software? ( ) 1 low level of knowledge ( ) 2 somewhat knowledgeable ( ) 3 average knowledge ( ) 4 above average knowledge ( ) 5 high level of knowledge

How would you describe your ability to apply presentation software skills to your job? ( ) 1 low ability to apply the skills ( ) 2 somewhat able to apply the skills ( ) 3 average ability to apply the skills ( ) 4 above average ability to apply the skills ( ) 5 high ability to apply the skills

20. The survey would now like to ask you about the World Wide Web.

20a. Can you "surf" or browse the Internet? ( ) Yes ( ) No

20b. Can you bookmark frequently used Web pages? ( ) Yes ( ) No

20c. Can you download files from the Internet? ( ) Yes ( ) No

20d. Can you use a search engine (such as Yahoo or Google) to find Web pages? ( ) Yes ( ) No

20e. Which Web browser do you most often use? ( ) Netscape Navigator ( ) Microsoft Internet Explorer ( ) Other:

Survey of Computer Technology Skills, Summer 2002, page 5

PAGE 170

161

20f. Have you ever registered for in-service training using Extension's In-Service Training Web site? ( ) Yes ( ) No

20g. Have you ever taken an on-line course to receive academic credit? ( ) Yes ( ) No

20h. In general, what is your opinion of the World Wide Web and its use in Extension work?

20i. Please indicate any other Web browsing skills that are necessary for successful Extension employees:

How important are Web browsing skills to your job? ( ) 1 low importance ( ) 2 somewhat important ( ) 3 average importance ( ) 4 above average importance ( ) 5 high importance

How would you describe your level of knowledge about Web browsing skills? ( ) 1 low level of knowledge ( ) 2 somewhat knowledgeable ( ) 3 average knowledge ( ) 4 above average knowledge ( ) 5 high level of knowledge

How would you describe your ability to apply Web browsing skills to your job? ( ) 1 low ability to apply the skills ( ) 2 somewhat able to apply the skills ( ) 3 average ability to apply the skills ( ) 4 above average ability to apply the skills ( ) 5 high ability to apply the skills

21. The survey would now like to ask you whether you edit or create Web pages.

21a. Can you create or edit Web pages? ( ) Yes ( ) No (please skip to question 22 on page 7)

21b. Can you edit Web pages? ( ) Yes ( ) No

21c. Can you create a Web page using Microsoft FrontPage or another HTML editor? ( ) Yes ( ) No

21d. Can you create a Web page using native HTML? ( ) Yes ( ) No

21e. Can you create hyperlinks? ( ) Yes ( ) No

21f. Can you incorporate graphics into Web pages? ( ) Yes ( ) No

21g. Can you convert existing files into HTML? ( ) Yes ( ) No

21h. Who is primarily responsible for maintaining your county's Extension service Web site? ( ) I am ( ) Another Extension agent in the office ( ) Office support staff ( ) A hired consultant ( ) Other:

21i. Please indicate any other Web page creation skills that are necessary for successful Extension employees:

Survey of Computer Technology Skills, Summer 2002, page 6

PAGE 171

162

How important are Web creation skills to your job? ( ) 1 low importance ( ) 2 somewhat important ( ) 3 average importance ( ) 4 above average importance ( ) 5 high importance

How would you describe your level of knowledge about Web creation skills? ( ) 1 low level of knowledge ( ) 2 somewhat knowledgeable ( ) 3 average knowledge ( ) 4 above average knowledge ( ) 5 high level of knowledge

How would you describe your ability to apply Web creation skills to your job? ( ) 1 low ability to apply the skills ( ) 2 somewhat able to apply the skills ( ) 3 average ability to apply the skills ( ) 4 above average ability to apply the skills ( ) 5 high ability to apply the skills

22. Please tell us what you think of Web-based surveys:

22a. Compared to other types of surveys (paper, telephone, etc.), how do you rate the convenience of Web-based surveys? ( ) 1 low convenience ( ) 2 somewhat convenient ( ) 3 average convenience ( ) 4 above average convenience ( ) 5 high convenience

22b. How likely are you to respond to a Web-based survey as compared to other types of surveys (paper, telephone, etc.)? ( ) 1 Not likely to respond ( ) 2 Somewhat likely to respond ( ) 3 Likely to respond ( ) 4 More than likely to respond ( ) 5 Highly likely to respond

22c. How many Web-based surveys do you estimate you have participated in? Please indicate a number:

22d. Do you wish to add any comments about Web-based surveys?

Kindly indicate the following:

23. How many years of professional work experience do you have? Extension only: ( ) Less than 5 years ( ) 5-10 years ( ) 11-15 years ( ) 16+ years Total work experience (including Extension): ( ) Less than 5 years ( ) 5-10 years ( ) 11-15 years ( ) 16+ years

24. Please indicate the group that includes your age: ( ) 20-30 ( ) 31-40 ( ) 41-50 ( ) 51-60 ( ) 61-70 ( ) 71+

If you would like to add any points you consider important to the use of computer technology by Extension agents, please feel free to add them in the space below. Any suggestions for training and other thoughts and comments you might have would be welcomed. Thank you again for your participation!

UFIRB 2002 534 123AB

Survey of Computer Technology Skills, Summer 2002, page 7
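The paired construct ratings gathered throughout the instrument (importance, knowledge, and ability to apply) are the inputs to the training-needs weighting the thesis describes. The exact formula appears in the methodology chapter, not in this appendix, so the fragment below is only a minimal sketch, assuming a Borich-type mean weighted discrepancy score, a common convention in Extension needs assessments; the function names and sample ratings are hypothetical.

    # Illustrative sketch only, not the study's actual analysis code.
    # A Borich-type score weights the mean (importance - knowledge) gap
    # by the mean importance rating for a skill area.

    def mean(values):
        return sum(values) / len(values)

    def weighted_discrepancy(importance, knowledge):
        """Mean (importance - knowledge) gap, weighted by mean importance."""
        gaps = [i - k for i, k in zip(importance, knowledge)]
        return mean(gaps) * mean(importance)

    # Hypothetical 1-5 ratings for one skill area from five respondents:
    importance = [5, 4, 5, 3, 4]
    knowledge = [3, 2, 4, 3, 2]
    print(round(weighted_discrepancy(importance, knowledge), 2))  # 5.88

Under a scheme like this, larger scores flag skill areas rated important but not well known, which is exactly the ranking a training recommendation needs.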

PAGE 172

APPENDIX C
PRINCIPLES FOR THE DESIGN OF WEB SURVEYS AND THEIR RELATIONSHIP TO TRADITIONAL SOURCES OF SURVEY ERROR (DILLMAN & BOWKER, 2001)

PAGE 173

164

Each numbered principle was checked (X) against the traditional sources of survey error it addresses: sampling, coverage, measurement, and nonresponse.

1. Introduce the Web questionnaire with a welcome screen that is motivational, emphasizes the ease of responding, and instructs respondents on the action needed for proceeding to the next page. X
2. Provide a PIN number for limiting access only to people in the sample. X X
3. Choose for the first question an item that is likely to be interesting to most respondents, easily answered, and fully visible on the first screen of the questionnaire. X
4. Present each question in a conventional format similar to that normally used on paper self-administered questionnaires. X X
5. Restrain the use of color so that figure/ground consistency and readability are maintained, navigational flow is unimpeded, and measurement properties of questions are maintained. X
6. Avoid differences in the visual appearance of questions that result from different screen configurations, operating systems, browsers, partial screen displays, and wrap-around text. X X X
7. Provide specific instructions on how to take each necessary computer action for responding to the questionnaire, and other necessary instructions, at the point where they are needed. X
8. Use drop-down boxes sparingly, consider the mode implications, and identify each with a "click here" instruction. X
9. Do not require respondents to provide an answer to each question before being allowed to answer any subsequent ones. X
10. Provide skip directions in a way that encourages marking of answers and being able to click to the next applicable question. X
11. Construct Web questionnaires so they scroll from question to question unless order effects are a major concern, and/or telephone and Web survey results are being combined. X X X
12. When the number of answer choices exceeds the number that can be displayed in a single column on one screen, consider double-banking with an appropriate grouping device to link them together. X
13. Use graphical symbols or words that convey a sense of where the respondent is in the completion process, but avoid ones that require significant increases in computer memory. X X
14. Exercise restraint in the use of question structures that have known measurement problems on paper questionnaires, e.g., check-all-that-apply and open-ended questions. X X
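Two of these principles lend themselves to a concrete illustration: limiting access with a respondent PIN (principle 2) and never forcing an answer before later questions are shown (principle 9). The sketch below is a minimal, hypothetical server-side fragment; the field names and storage are assumptions for demonstration and do not describe the survey software actually used in the study. The access code shown is the one printed in the study's correspondence.

    # Minimal sketch of principles 2 and 9; everything here is hypothetical.

    ISSUED_CODES = {"123AB"}  # one access code issued to each sampled respondent

    def admit(code):
        """Principle 2: admit only respondents holding an issued PIN."""
        return code.strip().upper() in ISSUED_CODES

    def record_answer(responses, question_id, answer):
        """Principle 9: a skipped question is stored as missing, never rejected."""
        responses[question_id] = answer if answer else None

    responses = {}
    if admit("123ab"):
        record_answer(responses, "20h", None)  # open-ended item left blank
        record_answer(responses, "22a", "4")
    print(responses)  # {'20h': None, '22a': '4'}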

PAGE 174

APPENDIX D
SCALES AND THE VALUES THEY REPRESENTED

PAGE 175

166

Variable: Est. Hours of Weekly Computer Use
1 = 0 Hours
2 = 1-5 Hours
3 = 6-10 Hours
4 = 11-15 Hours
5 = 16-20 Hours
6 = More than 20 Hours

Variable: Overall Self-rated Computer Skills
1 = Very Poor
2 = Poor
3 = Average
4 = Above Average
5 = Excellent

Variable: Time Devoted to E-mail per Day
1 = 0-15 Minutes a Day
2 = 16-30 Minutes a Day
3 = 31-45 Minutes a Day
4 = 46-60 Minutes a Day
5 = More than 60 Minutes a Day

Variable: How Often do you E-mail Clientele/Month
1 = 1-5 Times a Month
2 = 6-10 Times a Month
3 = 11-15 Times a Month
4 = 16-20 Times a Month
5 = More than 20 Times a Month

Variable: Est. Clientele Reached by E-mail per Month
1 = 1-25 Clientele/Month
2 = 26-50 Clientele/Month
3 = 51-75 Clientele/Month
4 = 76-100 Clientele/Month
5 = More than 100 Clientele/Month

Variable: Word Processing Use (Avg. Minutes/Day)
1 = 0-15 Minutes a Day
2 = 16-30 Minutes a Day
3 = 31-45 Minutes a Day

PAGE 176

167

Variable: Word Processing Use (continued)
4 = 46-60 Minutes a Day
5 = 61-90 Minutes a Day
6 = More than 90 Minutes a Day

Variable: Spreadsheet Use (Average Minutes/Month)
1 = 0-15 Minutes a Month
2 = 16-30 Minutes a Month
3 = 31-45 Minutes a Month
4 = 46-60 Minutes a Month
5 = 61-90 Minutes a Month
6 = More than 90 Minutes a Month

Variable: Presentation Software Use (Avg. Times/Year)
1 = 1-5 Times a Year
2 = 6-10 Times a Year
3 = 11-15 Times a Year
4 = 16-20 Times a Year
5 = More than 20 Times a Year

Variable: Work Experience (Including Extension)
1 = Less than 5 Years
2 = 5-10 Years
3 = 11-15 Years
4 = 16+ Years

Variable: Age Group
1 = 20-30
2 = 31-40
3 = 41-50
4 = 51-60
5 = 61-70
6 = 71+

Variable: Construct Variable Importance
1 = Low Importance
2 = Somewhat Important
3 = Average Importance
4 = Above Avg. Importance
5 = High Importance

Variable: Construct Variable Knowledge
1 = Low Level of Knowledge
2 = Somewhat Knowledgeable
3 = Average Knowledge
4 = Above Avg. Knowledge
5 = High Level of Knowledge

PAGE 177

168

Variable: Construct Variable Application
1 = Low Ability to Apply the Skills
2 = Somewhat Able to Apply the Skills
3 = Average Ability to Apply the Skills
4 = Above Average Ability to Apply the Skills
5 = High Ability to Apply the Skills
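Since the thesis reports its results in terms of these integer codes, a lookup from code back to label is the natural way to read its tables. The fragment below is a minimal illustration built from two of the scales above; the dictionary and function names are assumptions for demonstration, not the study's analysis code.

    # Illustrative sketch: scale codes from Appendix D as lookup tables.

    WEEKLY_COMPUTER_USE = {
        1: "0 Hours", 2: "1-5 Hours", 3: "6-10 Hours",
        4: "11-15 Hours", 5: "16-20 Hours", 6: "More than 20 Hours",
    }

    CONSTRUCT_IMPORTANCE = {
        1: "Low Importance", 2: "Somewhat Important", 3: "Average Importance",
        4: "Above Avg. Importance", 5: "High Importance",
    }

    def label(scale, code):
        """Translate a coded response back to its label; flag stray codes."""
        return scale.get(code, "invalid code")

    print(label(WEEKLY_COMPUTER_USE, 4))   # 11-15 Hours
    print(label(CONSTRUCT_IMPORTANCE, 5))  # High Importance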

PAGE 178

169

APPENDIX E
E-MAIL FROM RESEARCHERS TO PILOT POPULATION INTRODUCING THE STUDY

Dr. Tracy Irani, Assistant Professor
Mr. Austin Gregg, Graduate Student
University of Florida
Department of Agricultural Education and Communication
P.O. Box 110540
Gainesville, FL 32611-0540

June 24th, 2002

County Extension Agent
University of Florida IFAS

Dear County Extension Agent,

The last 10 years have brought tremendous change in the way we use, and think about the use of, computers. County agents, along with many others in societies throughout the world, have experienced the impact brought by such things as the World Wide Web, Windows, and fast, reliable personal computers.

This is a request for approximately 15 minutes of your time to participate in a Web-based study on county Extension agents' use of computers. It is a unique opportunity for you to express your opinion about computer use on your job, to identify computer hardware and software that you might need, and to provide information that will help determine what future computer training should be offered to county Extension agents.

As a means to check your name off the survey's distribution list, and as a means to deter unauthorized use of the survey's Web site, all participants have been given their own access code. Your response will be kept strictly confidential, and your name will never appear on any report. We will be happy to share the results of the study with you upon its conclusion.

If you have any questions or experience any difficulties with the survey, please feel free to contact us by e-mail (IRANI@ufl.edu or JAGREGG@ufl.edu). You can reach Tracy Irani by telephone at 352/392-0502, and Austin Gregg at 352/392-1285.

PAGE 179

170

The survey can be found at: http://survey.ifas.ufl.edu

Please use access code 123AB

Thank you in advance for your help!

PAGE 180

171

APPENDIX F
E-MAIL FROM RESEARCHERS TO PILOT POPULATION REMINDING THEM TO PARTICIPATE

Dr. Tracy Irani, Assistant Professor
Mr. Austin Gregg, Graduate Student
University of Florida
Department of Agricultural Education and Communication
P.O. Box 110540
Gainesville, FL 32611-0540

June 27th, 2002

County Extension Agent
University of Florida IFAS

Dear County Extension Agent,

A few days ago you should have received an e-mail message from us requesting your participation in a Web-based survey on county Extension agents' use of computers. If you have already completed the survey, we'd like to thank you for helping us out.

If you have not yet visited the Web site and filled in the survey, perhaps you would do so sometime today. The survey can be completed quickly and efficiently, and your response is very important to the success of the study.

Feel free to contact either of us if you have any questions about, or difficulties with, the survey. Our e-mail addresses are IRANI@ufl.edu and JAGREGG@ufl.edu. Our telephone numbers are 352/392-0502 (Tracy Irani) and 352/392-1285 (Austin Gregg).

The survey can be found at: http://survey.ifas.ufl.edu

Please use access code 123AB

PAGE 181

172

APPENDIX G
E-MAIL FROM RESEARCHERS TO PILOT POPULATION THANKING THEM FOR THEIR PARTICIPATION, AND INFORMING THEM THAT THEY WERE PART OF A PILOT STUDY

Dr. Tracy Irani, Assistant Professor
Mr. Austin Gregg, Graduate Student
University of Florida
Department of Agricultural Education and Communication
P.O. Box 110540
Gainesville, FL 32611-0540

July 6th, 2002

County Extension Agent
University of Florida IFAS

Dear County Extension Agent,

Many of you will have already received an e-mail message from Dean Waddill announcing the study you've just participated in. The reason for this apparent incongruity is that you were part of the pilot study. Your participation allowed us to verify the reliability of the survey and the survey's Web site.

The information that you graciously provided will be included in the master data set, and we won't be bugging you with any more e-mails! Thank you again for your help. We are very appreciative!

Sincerely,

Austin Gregg/Tracy Irani

P.S. We hope to put the results up on the Web sometime in October or November.

PAGE 182

173

APPENDIX H
E-MAIL FROM THE DEAN OF EXTENSION INTRODUCING THE STUDY

FRIDAY, JULY 05, 2002, 3:29 PM

ELECTRONIC MEMORANDUM

TO: All County Faculty
FROM: Christine T. Waddill
SUBJECT: Extension Faculty Computer Usage Survey

On July 8th you will receive a survey asking for your input on computer usage. The study is being led by Austin Gregg and Tracy Irani, in the Department of Agricultural Education and Communication. We hope you will take a few minutes to respond, as it will provide useful information for developing long-range plans for hardware and training needs in Extension.

Look for an e-mail from Dr. Irani that will direct you to the Web site hosting the survey.

CTW/jmv

PAGE 183

174

APPENDIX I
E-MAIL FROM THE RESEARCHERS TO THE POPULATION INTRODUCING THE STUDY AND ASKING FOR PARTICIPATION

Dr. Tracy Irani, Assistant Professor
Mr. Austin Gregg, Graduate Student
University of Florida
Department of Agricultural Education and Communication
P.O. Box 110540
Gainesville, FL 32611-0540

Monday, July 8th, 2002

Dear County Extension Agent,

The last 10 years have brought tremendous change in the way we use, and think about the use of, computers. County agents, along with many other people in societies throughout the world, have experienced the impact brought by technologies such as the World Wide Web, Windows, and fast, reliable personal computers.

This is a request for approximately 15 minutes of your time to participate in a Web-based study on county Extension agents' use of computers. It is a unique opportunity for you to express your opinion about computer use on your job, to identify computer hardware and software that you might need, and to provide information that will help determine what future computer training should be offered to county Extension agents.

As a means to check your name off the survey's distribution list and to deter unauthorized use of the survey's Web site, all participants have been given their own access code. Your response will be kept strictly confidential, and your name will never appear on any report. We will be happy to share the results of the study with you upon its conclusion.

If you have any questions about the study or experience any difficulties with the survey, please feel free to contact us by e-mail: IRANI@ufl.edu or JAGREGG@ufl.edu. You can reach Tracy Irani by telephone at 352/392-0502, and Austin Gregg at 352/392-1285.

The survey can be found at: http://survey.ifas.ufl.edu

Please use this code to access the survey: 123AB

PAGE 184

175

The code consists of 3 numbers and 2 letters.

Thank you in advance for helping us with our study,

Tracy Irani/Austin Gregg

PAGE 185

176

APPENDIX J
E-MAIL FROM RESEARCHERS TO THE POPULATION REMINDING THEM TO PARTICIPATE (FIRST REMINDER MESSAGE)

Dr. Tracy Irani, Assistant Professor
Mr. Austin Gregg, Graduate Student
University of Florida
Department of Agricultural Education and Communication
P.O. Box 110540
Gainesville, FL 32611-0540

Friday, July 12th, 2002

Dear County Extension Agent,

A few days ago you should have received a message from us requesting your participation in a Web-based survey on county Extension agents' use of computers. If you have already completed the survey, we'd like to take this opportunity to thank you for helping us out.

If you have not yet visited the Web site and filled in the survey, perhaps you'd do so sometime today. The survey can be completed quickly and easily, and your response is very important to the success of the study.

Please use this code to access the survey: 123AB

The code above consists of 3 numbers and 2 letters.

The survey can be found at: http://survey.ifas.ufl.edu

Feel free to contact either of us if you have any questions about, or experience any difficulties with, the survey. Our e-mail addresses are IRANI@ufl.edu and JAGREGG@ufl.edu. You can reach Tracy Irani by telephone at 352/392-0502, and Austin Gregg at 352/392-1285.

We really appreciate your help with our study!

Sincerely,

Tracy Irani/Austin Gregg

PAGE 186

177

APPENDIX K
E-MAIL FROM RESEARCHERS TO THE POPULATION REMINDING THEM TO PARTICIPATE (SECOND REMINDER MESSAGE)

Dr. Tracy Irani, Assistant Professor
Mr. Austin Gregg, Graduate Student
University of Florida
Department of Agricultural Education and Communication

Friday, July 19th, 2002

Dear County Extension Agent,

We sure do hate to bug you again, but… we still haven't heard from you on the Web-based survey of county agents' use of computers. If you've been unable to access the site, or have experienced any other difficulties completing the survey, please contact us by e-mail or telephone. Our e-mail addresses are IRANI@ufl.edu and JAGREGG@ufl.edu. You can reach Tracy Irani by telephone at 352/392-0502, and Austin Gregg at 352/392-1285.

Please use this code to access the survey: 123AB

The code above consists of 3 numbers and 2 letters.

The survey can be found at: http://survey.ifas.ufl.edu

We hope you find a few minutes to join your colleagues who have already responded to our survey. Your response is very important to us!

Thank you for your consideration,

Tracy Irani/Austin Gregg

PAGE 187

178

APPENDIX L
E-MAIL FROM RESEARCHERS TO THE POPULATION REMINDING THEM TO PARTICIPATE (NONRESPONSE MESSAGE)

Dr. Tracy Irani, Assistant Professor
Mr. Austin Gregg, Graduate Student
University of Florida
Department of Agricultural Education and Communication

Friday, July 26th, 2002

Dear County Extension Agent,

We are writing to you concerning our study of county agents' use of computers. As of today we have yet to receive your completed survey. Many of your colleagues have successfully completed the easy, fast, and efficient on-line survey. The accuracy of the study, however, depends on you and those remaining agents who have yet to respond. Won't you take a few minutes of your time today and help us out?

If you have any questions or experience any difficulties with the survey, please feel free to contact us by e-mail (IRANI@ufl.edu or JAGREGG@ufl.edu). You can reach Tracy Irani by telephone at 352/392-0502, and Austin Gregg at 352/392-1285.

Please use access code: 123AB

The survey can be found at: http://survey.ifas.ufl.edu

Thank you for your help!

Sincerely,

Tracy Irani/Austin Gregg

PAGE 188

179

APPENDIX M
E-MAIL FROM THE DISTRICT EXTENSION DIRECTOR OF THE NORTH CENTRAL DISTRICT TO HIS AGENTS ASKING FOR THEIR PARTICIPATION IN THE STUDY

Monday, July 29, 2002, 1:05 PM

Good afternoon everyone,

I've attached a note from Austin Gregg regarding a survey he is conducting on computer usage by agents. I hope all of you take a few minutes and complete the survey for him. Response has slowed down considerably, and about one-third of all agents have replied. Thanks.

Rodney L. Clouser

(The attached "note" from the principal researcher referenced in the DED's e-mail above is here included:)

Thursday, July 11, 2002, 8:53 AM

Good morning Dr. Clouser,

As I suspect you are aware, we began a Web-based survey on county agents' use of computers last Monday. Anyway, as is characteristic with Web-based surveys, we had a big response right off the bat, but it has now slowed to a trickle. (Thus far we have received 124 responses out of a potential 315.)

I'm writing today to ask if you would be so kind as to encourage participation in the study. It's easy to complete, and takes about 15 minutes. All agents have been contacted with information on how to find and gain access to the survey. CEDs have replied in droves (a finding I suspect), and most all respondents (CEDs and agents alike) have been adding very interesting comments. The study should produce good information.

Thanks for any assistance you might be able to give,

Austin G.

PAGE 189

180

APPENDIX N
LETTER FROM RESEARCHERS TO AGENTS WHO HAD NOT RESPONDED TO THE ON-LINE SURVEY

Dear County Extension Agent,

By now you most likely have heard about our Web-based survey. Because we haven't heard from you, we thought you might prefer the paper version, which is enclosed. The Web version is still available to you, and the address and access code are on this page. Thanks for your patience, and your help!

Dr. Tracy Irani, Assistant Professor
Mr. Austin Gregg, Graduate Student
University of Florida
Department of Agricultural Education and Communication
P.O. Box 110540
Gainesville, FL 32611-0540

August 1st, 2002

County Extension Agent
University of Florida IFAS

Dear County Extension Agent,

The last 10 years have brought tremendous change in the way we use, and think about the use of, computers. County agents, along with many others in societies throughout the world, have experienced the impact brought by such things as the World Wide Web, Windows, and fast, reliable personal computers.

PAGE 190

181

This is a request for approximately 15 minutes of your time to participate in a study on county Extension agents' use of computers. It is a unique opportunity for you to express your opinion about computer use on your job, to identify computer hardware and software that you might need, and to provide information that will help determine what future computer training should be offered to county Extension agents.

Although we have enclosed a paper copy of the survey, you can also quickly and easily complete it on-line. Simply go to http://survey.ifas.ufl.edu and use access code 123AB. The code consists of 3 numbers and 2 letters.

Your participation in this study is completely voluntary, and there is no penalty for not participating. The purpose of the study is to gain an understanding of the level of skill, patterns of use, and workplace application of information technology amongst county Extension agents of the Florida Cooperative Extension Service. You do not have to answer any question you do not wish to answer, and you may withdraw from the study at any time without consequence. The survey takes approximately 15 minutes to complete. We believe there are no direct risks or benefits to you for participating in this study.

Your information will be assigned a code number. The list connecting your name to this number will be kept in a locked file in a secure location. When the study is complete and the data have been analyzed, the list will be destroyed. Your name will never appear in any report. There is no compensation for participation in this research. If you have any questions about your rights concerning this study, please contact the UFIRB Office, Box 12250, University of Florida, Gainesville, FL 32611-2250.

If you have any questions or experience any difficulties with the survey, please feel free to contact us by e-mail (IRANI@ufl.edu or JAGREGG@ufl.edu). You can reach Tracy Irani by telephone at 352/392-0502, and Austin Gregg at 352/392-1285.

Thank you in advance for your help! We will be happy to share the results of the study with you when it is complete.

Tracy Irani/Austin Gregg

Agreement: I have read the procedure described above. I voluntarily agree to participate in the procedure and I have received a copy of this description.

Participant: ___________________________________ Date: ________________

Principal Investigator: ___________________________ Date: ________________

PAGE 191

182

APPENDIX O
LETTER FROM RESEARCHERS TO AGENTS WHO HAD NOT RESPONDED TO THE ON-LINE SURVEY REMINDING THEM TO PARTICIPATE

Dr. Tracy Irani, Assistant Professor
Mr. Austin Gregg, Graduate Student
University of Florida
Department of Agricultural Education and Communication
P.O. Box 110540
Gainesville, FL 32611-0540

August 14th, 2002

Dear County Agent,

Just a brief message to ask if you would be so kind as to participate in our survey… We've had a great response from Extension agents across the state, but your response is still very, very important to us! Won't you take a few minutes today to complete either the paper or on-line survey?

Feel free to contact either of us if you have any questions about the survey. Our e-mail addresses are IRANI@ufl.edu and JAGREGG@ufl.edu. Our telephone numbers are 352/392-0502 (Tracy Irani) and 352/392-1285 (Austin Gregg).

Although we sent you a paper copy, you can also very easily complete the survey on-line. Simply go to http://survey.ifas.ufl.edu and use the following access code: 123AB (The code is 3 numbers and 2 letters.)

Thank you for your participation! We look forward to sharing the results with everybody sometime in the near future!

Tracy Irani and Austin Gregg

PAGE 192

183

REFERENCES

Albright, B. B. (2000). Cooperative Extension and the information technology era: An assessment of current competencies and future training needs of county Extension agents. (Doctoral dissertation, Texas A&M University, 2000). Dissertation Abstracts International, 61, 2668.

Ary, D., Jacobs, L. C., & Razavieh, A. (1996). Introduction to research in education. Fort Worth, TX: Harcourt Brace College Publishers.

Baker, M. T., Rudd, R. D., Hoover, T. S., & Grant, T. (1997). Differences in learning styles of county Extension faculty in Florida based upon personal and organizational characteristics. In Proceedings of the 24th National Agricultural Education Research Meeting: Vol. 24. Learning styles (pp. 343-352).

Baker, M. T., & Wilson, M. (1998). Internet access, on-line resources used, and training needs of Florida Farm Bureau county directors. In Proceedings of the 25th National Agricultural Education Research Meeting: Vol. 25. Agricultural communications and information resources (pp. 461-470).

Bamka, W. J. (2000). Using the Internet as a farm marketing tool. Journal of Extension, 38(2). Retrieved March 3rd, 2002, from http://www.joe.org/joe/2000april/tt1.html

Benton Foundation (2000). Study finds e-rate is achieving its goal of building Internet framework for 21st century schools. Retrieved March 3rd, 2002, from http://www.benton.org/e-rate/pressrelease.html

Bohrer, B. (2000, July 23rd). Internet is an information, marketing tool for farmers. The Gainesville Sun, pp. 5G, 6G.

Bosnjak, M. (2001). Classifying response behaviors in Web-based surveys. Journal of Computer-Mediated Communication, 6(3). Retrieved February 28th, 2002, from http://www.ascusc.org/jcmc/vol6/issue3/boznjak.html

Campbell-Kelly, M., & Aspray, W. (1996). Computer. New York, NY: Basic Books.

Campus Computing Project (2000). The 2000 national survey of information technology in US higher education. Retrieved March 3rd, 2002, from http://www.campuscomputing.net/summaries/2000/index.html

PAGE 193

184

Cantrell, M. J. (1982). An assessment of attitudes, needs and delivery system for microcomputer applications by agricultural and extension educators in Mississippi. (Doctoral dissertation, Mississippi State University, 1982). Dissertation Abstracts International, 43, 3488.

Chase Manhattan Bank (2000). Chase launches most extensive home-school computer network in the nation. Retrieved April 1st, 2000, from http://biz.yahoo.com/bw/001018/ny_chase_n.html

Coomber, R. (1997). Using the Internet for survey research. Sociological Research Online, 2(2). Retrieved February 28th, 2002, from http://www.socresonline.org.uk/2/2/2.html

Cooper, J. F. (1976). Dimensions in history: recounting Florida Cooperative Extension Service progress, 1909-76. Gainesville, FL: Alpha Delta Chapter, Epsilon Sigma Phi.

CREES Program Information (2001). National initiatives. Retrieved April 3rd, 2001, from http://www.reeusda.gov/100/programs/init.htm

Dillman, D. A. (2000). Mail and Internet surveys: the tailored design method (2nd ed.). New York, NY: John Wiley & Sons, Inc.

Dillman, D. A., & Bowker, D. K. (2001). The Web questionnaire challenge to survey methodologists. Retrieved March 2nd, 2002, from http://survey.sesrc.wsu.edu/dillman/zuma_paper_dillman_bowker.pdf

diSessa, A. A. (2000). Changing minds: computers, learning and literacy. Cambridge, MA: The MIT Press.

Florida State University, College of Arts and Sciences, Department of Computer Sciences Academics (2000). Computer competency courses. Retrieved March 3rd, 2002, from http://www.cs.fsu.edu/academics/compcomp.html

Israel, G. D. (2000). Examples of correspondence with explanatory annotations for a mail survey of Florida homeowners. Unpublished manuscript, Department of Agricultural Education and Communication, University of Florida.

Jeavons, A. (1999). Ethology and the Web. Retrieved April 1st, 2002, from http://w3.one.net/~andrewje/ethology.html

Johnson, D. M., Ferguson, J. A., & Lester, M. L. (1999). Computer experiences, self-efficacy, and knowledge of students enrolled in introductory university agriculture courses. Journal of Agricultural Education, 40(2), 28-37.

PAGE 194

185

Katz, R. N. (1999). Dancing with the devil: information technology and the new competition in higher education. San Francisco, CA: Jossey-Bass Publishers.

Katz, R. N., & Oblinger, D. G. (Eds.). (2000). The E is for everything: E-commerce, E-business, and E-learning in the future of higher education. San Francisco, CA: Jossey-Bass Publishers.

Ladewig, H. W. (1999). Survey of computer technology skills and training needs. Retrieved March 3rd, 2002, from http://extensioneducation.tamu.edu/computer/

Ladner, M. D., Wingenbach, G. J., & Raven, M. R. (2002). Internet and paper based data collection methods in agricultural education research. Journal of Southern Agricultural Education Research, 52(1), 49-60.

Lerner, M. (1997). The current state of technology and education: How computers are used in K-12 and Brown University classrooms. Retrieved February 1st, 2001, from http://www.netspace.org/~mrl/handbook/inted.html

Lippert, R. M., Plank, O., Camberato, J., & Chastain, J. (1998). Regional Extension in-service training via the Internet. Journal of Extension, 36(1). Retrieved March 3rd, 2002, from http://www.joe.org/joe/1998february/a3.html

Lippert, R. M., Plank, O., & Radhakrishna, R. (2000). Beyond perception: A pretest and posttest evaluation of a regional Internet Extension in-service training. Journal of Extension, 38(2). Retrieved March 3rd, 2002, from http://www.joe.org/joe/2000april/a2.html

Loughary, J. W. (1966). Man-machine systems in education. New York, NY: Harper & Row.

Martin, B. L. (1998). Computer anxiety levels of Virginia Cooperative Extension field personnel. (Doctoral dissertation, Virginia Polytechnic Institute and State University, 1998). Dissertation Abstracts International, 59, 792.

Martin, B. L., Stewart, D. L., & Hillison, J. (2001). Computer anxiety levels of Virginia Extension personnel. Journal of Extension, 39(1). Retrieved April 13th, 2002, from http://www.joe.org/joe/2001february/a1.html

Radhakrishna, R., & Pinion, J. (1999). Developing a Web-based system to address accountability reporting needs for Cooperative Extension. Proceedings of the 26th Annual National Agricultural Education Conference: Vol. 26. Cooperative Extension (pp. 193-201).

Rasmussen, W. D. (1989). Taking the university to the people. Ames, IA: Iowa State University Press.

PAGE 195

186

Ross, S. M. (1986). BASIC programming for educators. Englewood Cliffs, NJ: Prentice Hall.

Rossi, P. H., & Freeman, H. E. (1993). Evaluation: a systematic approach. Newbury Park, CA: Sage Publications, Inc.

Ruppert, K. C. (1992). Factors affecting the utilization of computers by county Extension agents in Florida. (Doctoral dissertation, University of Florida, 1992). Dissertation Abstracts International, 54, 2915.

School to require one online course. (2000, October 22). The Gainesville Sun, p. 6A.

Seevers, B., Graham, D., Gamon, J., & Conklin, N. (1997). Education through Cooperative Extension. Albany, NY: Delmar Publishers.

Sherfey, L. E. B., Hiller, J., Macduff, N., & Mack, N. (2000). Washington State University on-line volunteer management certification program. Journal of Extension, 38(4). Retrieved April 1st, 2001, from http://www.joe.org/joe/2000august/tt1.html

Solomon, D. J. (2001). Conducting Web-based surveys. Practical Assessment, Research & Evaluation, 7(19). Retrieved February 28th, 2002, from http://ericae.net/pare/getvn.asp?v=7&n=19

Tennessen, D. J., PonTell, S., Romine, V., & Motheral, S. W. (1997). Opportunities for Cooperative Extension and local communities in the information age. Journal of Extension, 35(5). Retrieved March 3rd, 2002, from http://www.joe.org/joe/1997october/comm1.html

U.S. Commerce Department, U.S. Census Bureau (1999). Computer use in the United States: population characteristics. Retrieved April 15th, 2001, from http://www.census.gov/pressrelease/www/1999/cb99-194.html

U.S. Commerce Department (2000). Falling through the net: Toward digital inclusion (executive summary). Retrieved April 15th, 2001, from http://osecnt13.osec.doc.gov/public.nsf/docs/fftn-tdi-executive-summary

University of Florida, Institute of Food and Agricultural Sciences (2001). IFAS home page. Retrieved March 3rd, 2002, from http://www.ifas.ufl.edu/www/extension/ces.htm

Winship, J. A. (1989). Information technologies in education: the quest for quality software. Washington, D.C.: Organization for Economic Co-operation and Development (CERI).

PAGE 196

187

Yun, G. W., & Trumbo, C. W. (2000). Comparative response to a survey executed by post, e-mail, & Web form. Journal of Computer-Mediated Communication, 6(1). Retrieved February 28th, 2002, from http://www.ascusc.org/jcmc/vol6/issue1/yun.htm

PAGE 197

189

BIOGRAPHICAL SKETCH

Jon Austin Gregg was born in October of 1956, in Gainesville, Florida. He grew up in Gainesville, a small town in rural north central Florida that is home to the University of Florida. He graduated from Buchholz High School in 1974. His college career began at Santa Fe Community College in Gainesville, where he received an AA degree in 1978. Transferring to the University of Florida thereafter, he received a B.S. degree in statistics in 1984.

After graduating from college Austin ran a small landscape maintenance business. He also engaged in post-baccalaureate studies, taking more courses in the area of statistics. In 1987 he began working for the University of Florida, and over the next five years held several OPS positions in computer-related fields. He was hired as a full-time University of Florida employee in 1992, and began working for the Institute of Food and Agricultural Sciences Program Evaluation and Organizational Development unit. At Program Evaluation he was responsible for many aspects of collecting and analyzing planning and reporting information generated by the Florida Cooperative Extension Service. Currently he is a computer programmer analyst working for the Information Systems Department of the University of Florida's Division of Finance and Administration.

Austin's father was an internationally known research scientist, his mother an instructor of music. Austin is a member of both the Gamma Sigma Delta and Alpha Tau Alpha academic honor societies.


Permanent Link: http://ufdc.ufl.edu/UFE0000539/00001

Material Information

Title: The Use of information technology by county extension agents of the Florida cooperative extension service
Physical Description: ix, 189 p.
Creator: Gregg, Jon Austin ( Dissertant )
Irani, Tracy A. ( Thesis advisor )
Rudd, Rickie ( Reviewer )
Summerhill, William ( Reviewer )
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2002
Copyright Date: 2002

Subjects

Subjects / Keywords: Agricultural Education and Communication thesis, M.S
Dissertations, Academic -- UF -- Agricultural Education and Communication

Notes

Abstract: The purpose of this study was to examine the use of information technology (IT) amongst county Extension agents of the University of Florida's Florida Cooperative Extension Service. Four objectives delineated the research: Describe county Extension agents' demographics, and use of IT vis-à-vis those demographics; determine how county Extension agents are using hardware and software on the job; determine county Extension agents' perceived level of skills with regard to a specific set of IT tasks; and lastly recommend future IT training by describing the relationship between agents' perceived importance of and self-assessed knowledge about specific IT skills. The entire population of 331 county Extension agents was considered for this study. A mixed-mode methodology, which employed an electronic survey instrument and a traditional paper survey instrument, was used to collect the data. Agents were given three weeks to complete the electronic version of the survey, and thereafter mailed the paper version. Respondents were subsequently categorized according to what methodology they chose to use. Additional categorization was performed on the electronic respondents according to when they submitted a completed survey. Of the 331 individuals in the population, 278 responded electronically, and 21 via paper for an overall response rate of 90.3 percent. Information collected by the survey was subjected to a battery of statistical analyses. Summary statistics and ANOVA were used to compare and contrast patterns of IT use among agents of different gender, age, area of programmatic concentration, and response category. A weighting formula, based on a series of questions asked within the survey, was employed to derive agents' future training needs. County Extension agents painted a picture of an information technology savvy organization accommodating its clientele through Web sites, e-mail, and other sophisticated forms of information delivery. This current state of affairs is contrasted to findings from a similar study conducted ten years ago on Florida county Extension agents' use of information technology. Key findings and associated implications, and recommendations for future research are then offered from the study at hand.
Subject: agents, amongst, based, collection, computer, Cooperative, county, data, Extension, Florida, information, Service, survey, technology, use, Web
General Note: Title from title page of source document.
General Note: Includes vita.
Thesis: Thesis (M.S.)--University of Florida, 2002.
Bibliography: Includes bibliographical references.
Original Version: Text (Electronic thesis) in PDF format.

Record Information

Source Institution: University of Florida
Holding Location: University of Florida
Rights Management: All rights reserved by the source institution and holding location.
System ID: UFE0000539:00001


This item has the following downloads:


Full Text











USE OF INFORMATION TECHNOLOGY BY COUNTY EXTENSION AGENTS OF
THE FLORIDA COOPERATIVE EXTENSION SERVICE

















By

JON AUSTIN GREGG


A THESIS PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
MASTER OF SCIENCE

UNIVERSITY OF FLORIDA


2002














ACKNOWLEDGMENTS


I would like to express my appreciation to Dr. Tracy Irani, who graciously

assumed the chair of my thesis committee. Dr. Irani's patience, insightful guidance,

and encouragement have been exemplary. I am most grateful for all of her efforts on

my behalf.

My two other committee members, Dr. Rickie Rudd and Dr. William

Summerhill, each provided me with unique insight and excellent recommendations.

Their contributions to this study are significant, and I acknowledge them with my

sincere appreciation.

Dr. Christine Waddill and Dr. Larry Arrington of the Florida Cooperative

Extension Service were instrumental to this study's success, and I am most indebted

to both of them. My deepest gratitude is reserved for all the county Extension agents

who so generously gave their time to complete the survey that I sent to them. I am

overwhelmed by their response, and hope that the results of this study will help them

in a manner similar to the way they helped me.

It is appropriate that I express my appreciation to my original committee

chair, Dr. Tracy Hoover. Under her guidance I began to write the thesis that you now

hold in your hands or observe via the Internet. I would like to thank Dr. Hoover for

her efforts.

I must also acknowledge the two friends who finally succeeded in getting me

involved with graduate school. Dr. Marshall Breeze and Dr. Mathew Baker played









big roles in this effort, and to both of them I extend my heartfelt appreciation. I

would also like to take this opportunity thank the AEC department chairman, Dr. Ed

Osborne, for his unfailing support and wisdom. Dr. Osborne seemed to often know

what was best for me, when perhaps I did not. In addition, Sid Sachs and Karen

Navitsky were especially helpful to me in the completion of my study. I am humbled

by the patience and generosity that they both have shown me over the duration of this

endeavor.















TABLE OF CONTENTS


A C K N O W L E D G M E N T S .......................................................................... .....................ii

A B S T R A C T ............ ................... .................. ................................... v iii

CHAPTERS

1 IN TR O D U C TIO N ....................... ........................... .. ........ ............. ..

B background ............................ .... .................................... ................
The Florida Cooperative Extension Service ............... .................... ............... 3
Information Technology and the Cooperative Extension Service .........................3
N eed for the Study ....................... ............................................ .....6.
D definition of Term s ................................................. .. ..... ... ........ .. ..
Limitations of the Study............... ................... ..................
A ssum options .............................................. 9
Significance of the Study ................................................. ............................ 9
Organization of the Remainder of the Chapters ..................................................10


2 REVIEW OF THE LITERATURE ............................................................ .. ............. 11

Introduction ................ ............................. .. .......... .............................. 11
Part 1-A Historical Perspective of the Computer's Technical Development............11
A Look at Related Innovation from Ancient Times to 1950 .........................11
The War Years: Calculating Needs drive Innovation...............................13
Re-invention, Public Attention, and Diffusion: The 1950s ........................15
The 1960s: Refinement, more Innovation, and more Adoption ...................18
Towards the Personal Computer.................................... ..............20
Personal Computing: Technology Fuses with Latent Desire......................21
Software Diminishes Complexity, Enhances Compatibility .......................24
Part 2-The Application and Use of Information Technology ..................................26
General Use of Information Technology in Today's Society........................26
Information Technology in Primary and Secondary Education ....................29
Information Technology in Higher Education..............................................34
Information Technology and the Cooperative Extension Service .................38
Information Technology and the Florida Cooperative Extension
Service....................................................... .. ...... 40
Part 3-Theoretical A egis of the Study.............................................. .................. 41
T raining N eeds .................................................................. ............... 4 1
D eterm ining Training N eeds...................................... ........................ 41









S u m m a ry ................................................................................. 4 2

3 M E T H O D O L O G Y .............................................................................. ......................45

Introduction ...................................................................................................... 45
R research D esign...................................................... 45
P o p u latio n .................................................................4 6
In strum entation ................................................................4 6
D ata C collection ................................................................4 9
On Web-based Surveys ......................................... 51
How This Study Addressed Sources of Error .............................................. ...55
D ata A naly sis ......... ........... ........................................................ 56


4 RE SU LTS .............. ....................... .............................. .... 58

Objective 1 Describe County Extension Agents' Demographic Characteristics
and, Based on Those Characteristics, Determine Their Use of Information
Technology, Including Self-Assessed Level of Overall Computer Skills...........59
A General Description of the Respondents..............................................59
Com paring R response G roups .................... ................. ............... .... 60
Use of Information Technology and Self-Assessed Level of Overall Computer
Skills ............................................................................................... ........ 64
An Examination of the Non-respondents.....................................................66
Self-rated Computer Skills and Demographics.....................................67
Computer Usage and Dem graphics .................................................................. 72
Source of Computer Knowledge and Demographics............................ 77
Demographic Snapshots by Age..... ............ .. ................................ 86
A g e g rou p 2 0 -3 0 ...................................................... ................ .. 86
A ge group 3 1-40 ............................ ........................ .. ................ .. 87
A ge group 41-50 ............................ .................... .. ...... ... 87
A ge group 5 1-60 .......................... ......................... ................... .. 88
A ge group 61-70 ............. ..... .. .............. .. ..... .......... .. ........... 88
Objective 2 Determine How County Extension Agents are Using Information
Technology on the Job in terms of Hardware Use, and the Nature and
Frequency of Use of Specific Types of Software.................... ... .............89
Connectivity, Hardware and Operating System Use, etc ...............................89
Patterns of Use of Electronic Mail..... ................................................. ...........89
Patterns of Use of Word Processing Software......................... ............91
Patterns of Use of Spreadsheet Software ................................ ................92
Patterns of Use of Presentation Software ................................................93
Patterns of Use of the World Wide Web ..............................................94
Patterns of Use of the Web Page Editing/Creation Activity..........................94
Objective 3 Determine county Extension Agents Perceived Level of Skill with
Regard to a Specific set of Information Technology Tasks..............................95









Objective 4 As a Means to Recommend Future Information Technology
Training, Describe the Relationship Between Agents' Perceived
Importance of, and Self-Assessed Knowledge about each of a Specific set
of Information Technology Skills........ ....... .......... .......................... 99
Sum m ary ......... ..... ............. ..................................... ........................... 105


5 SUMMARY, CONCLUSIONS, AND RECOMMENDATIONS................................107

Sum m ary ......... ..... ............. ..................................... ........................... 107
P procedure ......... .............. ....................................107
L im stations of the Study.................................................. ............................... 109
Key Findings and Implications ........... .............. ................. .. 110
D discussion .................................................................................. ........ .. ....... .........112
Discussion of the M methodology ...... .............................................. .. ............... 117
Conclusions ............... ......... ........................120
Recommendations ......... ............................. ...... ......... .............120


APPENDIX

A SURVEY OF COMPUTER TECHNOLOGY SKILLS AND TRAINING NEEDS ......122

B THE SURVEY OF COMPUTER TECHNOLOGY SKILLS INSTRUMENT
PAPER VERSION ....................................................... ........... ...... ... .... 155

C PRINCIPLES FOR THE DESIGN OF WEB SURVEYS AND THEIR
RELATIONSHIP TO TRADITIONAL SOURCES OF SURVEY ERROR
(D IL LM A N & B O W K ER 2001) ....................................... ............ ........................... 163

D SCALES AND THE VALUES THEY REPRESENTED ....................................165

E E-MAIL FROM RESEARCHERS TO PILOT POPULATION INTRODUCING
T H E ST U D Y ............................................................................169

F E-MAIL FROM RESEARCHERS TO PILOT POPULATION REMINDING
TH EM T O PA R TIC IPA TE ..................................................................... ..................171

G E-MAIL FROM RESEARCHERS TO PILOT POPULATION THANKING
THEM FOR THEIR PARTICIPATION, AND INFORMING THEM THAT
THEY W ERE PART OF A PILOT STUDY................................................................ 172

H E-MAIL FROM THE DEAN OF EXTENSION INTRODUCING THE STUDY......... 173

I E-MAIL FROM THE RESEARCHERS TO THE POPULATION
INTRODUCING THE STUDY AND ASKING FOR PARTICIPATION.....................174









J E-MAIL FROM RESEARCHERS TO THE POPULATION REMINDING
THEM TO PARTICIPATE (FIRST REMINDER MESSAGE) ....................................176

K E-MAIL FROM RESEARCHERS TO THE POPULATION REMINDING
THEM TO PARTICIPATE (SECOND REMINDER MESSAGE)............................ 177

L E-MAIL FROM RESEARCHERS TO THE POPULATION REMINDING
THEM TO PARTICIPATE NONRESPONSEE MESSAGE) .............................178

M E-MAIL FROM THE DISTRICT EXTENSION DIRECTOR OF THE NORTH
CENTRAL DISTRICT TO HIS AGENTS ASKING FOR THEIR
PARTICIPATION IN THE STUDY.....................................................179

N LETTER FROM RESEARCHERS TO AGENTS WHO HAD NOT
RESPONDED TO THE ON-LINE SURVEY................................................180

O LETTER FROM RESEARCHERS TO AGENTS WHO HAD NOT
RESPONDED TO THE ON-LINE SURVEY REMINDING THEM TO
PARTICIPATE....................................................................181

REFERENCES.....................................................................183

BIOGRAPHICAL SKETCH............................................................189

Abstract of Thesis Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Master of Science

USE OF INFORMATION TECHNOLOGY BY COUNTY EXTENSION AGENTS
OF THE FLORIDA COOPERATIVE EXTENSION SERVICE

By

Jon Austin Gregg

December 2002

Chairperson: Tracy A. Irani
Major Department: Agricultural Education and Communication

The purpose of this study was to examine the use of information technology

(IT) amongst county Extension agents of the University of Florida's Florida

Cooperative Extension Service. Four objectives delineated the research: Describe

county Extension agents' demographics, and use of IT vis-a-vis those demographics;

determine how county Extension agents are using hardware and software on the job;

determine county Extension agents' perceived level of skills with regard to a specific

set of IT tasks; and lastly recommend future IT training by describing the relationship

between agents' perceived importance of and self-assessed knowledge about specific

IT skills.

The entire population of 331 county Extension agents was considered for this

study. A mixed-mode methodology, which employed an electronic survey instrument

and a traditional paper survey instrument, was used to collect the data. Agents were

given three weeks to complete the electronic version of the survey, and thereafter

mailed the paper version. Respondents were subsequently categorized according to

what methodology they chose to use. Additional categorization was performed on the

electronic respondents according to when they submitted a completed survey. Of the

331 individuals in the population, 278 responded electronically, and 21 via paper for

an overall response rate of 90.3%.

Information collected by the survey was subjected to a battery of statistical

analyses. Summary statistics and ANOVA were used to compare and contrast

patterns of IT use among agents of different gender, age, area of programmatic

concentration, and response category. A weighting formula, based on a series of

questions asked within the survey, was employed to derive agents' future training

needs.

County Extension agents painted a picture of an information technology savvy

organization accommodating its clientele through Web sites, e-mail, and other

sophisticated forms of information delivery. This current state of affairs is contrasted

to findings from a similar study conducted ten years ago on Florida county Extension

agents' use of information technology. Key findings and associated implications, and

recommendations for future research are then offered from the study at hand.

CHAPTER 1
INTRODUCTION

Background

The Cooperative Extension Service is a public, non-formal education system

established by the Smith-Lever Act of 1914. Charged by Congress to diffuse "useful and

practical information on subjects relating to agriculture and home economics" among the

people of the United States, Extension evolved from the farmer's institutes of the late

1800s and early 1900s. The organization was originally designed as a partnership of the

land-grant universities and the U.S. Department of Agriculture, but provisions of the

Smith-Lever Act enabled a third legal partner, the counties of the states, to be included in

the venture. Each partner, though having considerable independence in staffing, funding,

and developing programs, nevertheless contributes functions essential to the whole

system (Rasmussen, 1989). Extension in the United States and its protectorates is

believed to be the largest such organization in the world, utilizing the resources of 67

land-grant universities, certain community colleges, and thousands of county agents

(Seevers, Graham, Gamon, & Conklin, 1997).

An administrator appointed by the secretary of agriculture leads the federal

Extension partner. This individual reports to the undersecretary of science and education,

and strives to accomplish Extension's mission "to assure an effective nationwide

Cooperative Extension Service that is responsive to priority needs and the federal

interests and policies with quality information, education, and problem-solving

programs" (Rasmussen, 1989, p. 5). Over the years Extension has responded well to

"federal interests." During both world wars the organization spurred increases in

agricultural production and engaged in service functions such as soliciting for Liberty

Bonds, and serving on local draft boards. During the depression Extension participated

in many New Deal programs including the Farm Credit Administration, the Rural

Electrification Administration, the Tennessee Valley Authority, and the Soil

Conservation Service (Rasmussen, 1989). Today the federal partner directs special

attention and funding to the state partners through "National Initiatives" in such areas as

water quality, food safety, and workforce preparation (Cooperative State Research,

Education, and Extension Service, (CSREES), 2001).

The state Extension partners are located at Land Grant universities, and are

headed by a director or dean selected by the university with the concurrence of the

secretary of agriculture (Rasmussen, 1989). An annual plan of work is submitted by the

state Extension director for approval by the federal secretary of agriculture. The state

partner is also responsible for the administrative oversight of the county partner.

Individuals at the university or research center level who conduct research or who

specialize in disseminating research-based information are called "state Extension

specialists." Most state specialists are members of an academic department associated

with the sponsoring Land-Grant institution, and are available to county Extension agents

to help apply university-based research to solve local problems.

It is primarily at the county level, through the county Extension agent, that the

university meets the people. Described variously as an "Extension educator, change

agent, teacher, or social activist," the county agent "serves as an educational broker for

the community" (Seevers et al., 1997, p. 52). "County Extension agents constantly live

amid and encourage change in people and their surroundings" (Rasmussen, 1989, p. 7).

They provide leadership and expertise, and extend knowledge needed to solve local

problems (Seevers et al., 1997). The county Extension agent participates in a storied

profession of dedication, long hours, and of gaining the trust of people in order to help

them improve their lives through education based on scientific knowledge (Rasmussen,

1989).

The Florida Cooperative Extension Service

Extension work in Florida effectively began in 1909 with a $7,500 a year

appropriation from the Florida State Legislature. This legislative action enabled federal

authorities to send Florida its first state demonstration agent, A.S. Meharg, who

developed a successful program before his resignation in 1913. Extension proper began

on May 25th, 1915, when Florida accepted the provisions of the federal Smith-Lever Act.

Peter H. Rolfs was its first director (Cooper, 1976).

Today the Florida Cooperative Extension Service operates as part of the

University of Florida's Institute of Food and Agricultural Sciences, and has a presence in

each of the state's 67 counties. The organization conducts educational programming in

areas such as agriculture, food safety, energy conservation, family economics, and youth

development (University of Florida, Institute of Food and Agricultural Sciences, 2001).

Information Technology and the Cooperative Extension Service

In the early 1920s the Cooperative Extension Service adopted two new

innovations, the radio and the telephone, to keep rural people informed of Extension

activities and, with radio, to deliver educational programming (Rasmussen, 1989). The

next significant innovation in electronic communications, television, was also used by the

organization to work with clientele (Rasmussen, 1989). When the personal computer

began its widespread diffusion in the early 1980s, Extension, along with the rest of the

world, was introduced to a new technology that would quickly evolve into a

revolutionary means of communication. During the early days of the personal

computer's diffusion Cantrell (1982) reported that Extension educators, lacking computer

competencies, were in jeopardy of becoming less computer literate than their clientele -

thus evidencing a slowness by agents to adopt the innovation. Ten years later Ruppert

(1992) stated "Extension educators cannot escape the computer revolution and will be

challenged in their roles with the responsibility of helping people understand and make

the best use of such technology" (p. 4). Eight years thereafter, and after monumental

technological progress in personal computing, Albright (2000) stated that knowledge had

become the central focus of the global economy, and that a transition to "incorporate the

technology required for the dissemination of knowledge" (p. 11) is nowhere more

important than within organizations that have always produced knowledge (i.e.

Extension). Furthermore Albright states that the organization's leadership must

"consider societal, global, and demographic changes and effectively embrace information

technology as an impetus to further the mission of CES" (Albright, 2000, p. 16).

The capability, then, for Extension agents to learn and to apply the use of

computers, software and associated peripheral devices (collectively, information

technology) for purposes of serving clientele and in support of Extension's administrative

infrastructure, has become an essential job-related skill. Albright (2000), addressing the

need for organizations to "adapt to the technology explosion" (p. 3) states: "It is critical

that Extension re-invest in employees and train them in the necessary skills to remain

competitive and serve a dynamic community" (p. 4). Martin (2001) echoes this: "With

more clients using computers to obtain information, it will be critical for agents and other

field staff to gain the computer skills necessary to use computers as a means for gaining

greater efficiency in obtaining and sharing educational information" (p. 1).

To measure the ability of Extension professionals to use information technology,

Albright, in her 2000 study of Texas county Extension agents, asked agents to self-rate

their computer skills in eight areas ranging from word processing to the use of peripheral

devices. Albright also sought to determine future training needs by asking agents to rank

each of the eight specific information technology skill areas according to the importance

the agent ascribed to a skill area, the agent's knowledge of the skill area, and the agent's

ability to apply the skill area to their job. These three constructs were operationally

defined in the following manner: "Importance" described the importance of a particular

skill to an agent's job function; "knowledge" measured the ability to accurately recall or

summarize information associated with a skill; and "application" measured the ability of

the agent to use specific skills on the job. Albright found that the general population of

Extension agents indicated that their strongest skills were in word processing, e-mail,

Internet use, and file management, respectively (Albright, 2000). Older agents in the

study self-reported fewer information technology skills than younger ones, and indicated

that their primary source of IT knowledge stemmed from on-the-job training. Younger

agents were found to be "more self-directed in their technology learning" (Albright,

2000, p. 94). Both younger and older agents reported having participated in little IT

training within the two years prior to the study (Albright, 2000). Usage of the Internet

was seen by agents as being a "very critical" means of program delivery, and, by younger

agents in the survey, a potential means to receive training (Albright, 2000, p. 95).
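
Albright's three constructs lend themselves to a simple computation. The
weighting formula employed in the present study is described later in this
thesis; purely as an illustration, the Python sketch below applies a
Borich-style mean weighted discrepancy score, one common way of turning
importance and knowledge ratings into a training-priority ranking. The
ratings, the agents, and the 1-to-5 scale are all invented for the example:

    # Illustration only: invented ratings on a hypothetical 1-to-5 scale.
    from statistics import mean

    # Each tuple is one agent's (perceived importance, self-assessed knowledge).
    ratings = {
        "Web page development": [(5, 2), (4, 2), (5, 3)],
        "Word processing":      [(5, 5), (4, 4), (5, 4)],
    }

    for skill, pairs in ratings.items():
        mean_importance = mean(i for i, _ in pairs)
        # Weight each agent's importance-knowledge gap by the skill's
        # mean importance, then average the weighted gaps across agents.
        score = mean((i - k) * mean_importance for i, k in pairs)
        print(f"{skill}: weighted discrepancy = {score:.2f}")

Under such a scheme, a skill rated important but poorly known (Web page
development above) earns a far higher score, and thus a higher training
priority, than a skill already mastered (word processing).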

Based on her research, Albright concluded that agents from the general population

of Texas Extension agents that were studied needed training in the following areas: Web

page development, peripheral device management, presentation software, file

management, E-mail, word processing, Internet, and spreadsheets. She also found that

agents who had taught themselves computer skills self-reported lower computer skills

ability. Different employees thus have different training needs and should learn skills

commensurate with their current level of expertise: "The literature supports that it is

counterproductive to design one training plan for all agents and expect learning to occur"

(Albright, 2000, p. 106). Albright also states that developing a set of "specific skill

standards or competencies" would provide "benchmarks" for employees to meet when

developing their computer skills (Albright, 2000, p. 104). This would establish

expectations for specific levels of employee computer competencies, with the implication

that training needs could be differentiated and addressed.

Need for the Study

The last systematic study of general computer use amongst county Extension

agents of the Florida Cooperative Extension Service (FLCES) was conducted by

Kathleen C. Ruppert in 1992. Her objective was to "determine whether county extension

agents use computers, to what extent they use computers, and what factors may be

inhibiting or encouraging their use" (Ruppert, 1992, p. vi). The factors "inhibiting or

encouraging" computer use "were operationally defined as subscales of the Computing

Concerns Questionnaire (CCQ)"1 which, along with a battery of questions about



1 The 32-item Computing Concerns Questionnaire was developed and verified by Martin, and based on the
Concerns Based Adoption Model developed by Hall and colleagues as a means to identify and explain
discrete stages of concern that individuals progress through (and express) as they adopt an innovation
(Ruppert, 1992).

"personal and situational factors," was used in a census study of FLCES county agents

(Ruppert, 1992, p. vi). Ninety-four percent of the population responded. After subjecting

her data to a battery of statistical procedures, Ruppert found that "almost half" of the

agents had a computer on their desk, and that one-third of them made use of a computer

at home. Of computer skills, agents "were most adept at computer word processing,"

followed by VAX (computer network), databases, the IFAS CD-ROM, spreadsheets, and

computer graphics (Ruppert, 1992, p. 101).

Agents associated with the Agriculture, 4-H, and Marine program areas had

"significantly higher" computer use mean scores2 than agents in other areas (Ruppert,

1992, p. 101). Of the eight subscales of the Computing Concerns Questionnaire, those

concerns which focused "either on the individual or the client and how the agents interact

with the computer and how their computer work effects their clientele" were found to be

statistically significant (Ruppert, 1992, p.102). Ruppert also found that "age, program

area, typing, computer training, and computer resource contact were all significant

demographic and situational independent variables that affected the overall computer use

mean score of county agents" (Ruppert, 1992, p. 102).
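
Comparisons of this kind, testing whether mean computer-use scores differ
across groups such as program areas, are made with analysis of variance, a
procedure the present study also employs. Purely as a hypothetical
illustration (the program areas echo Ruppert's, but every score below is
invented), a one-way ANOVA can be run in a few lines of Python:

    # Hypothetical data: invented computer-use scores for three program areas.
    from scipy.stats import f_oneway

    use_scores = {
        "Agriculture":    [72, 68, 75, 80, 66],
        "4-H":            [70, 74, 69, 77, 71],
        "Home Economics": [58, 62, 55, 60, 64],
    }

    # One-way ANOVA across the three groups.
    f_stat, p_value = f_oneway(*use_scores.values())
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

A p-value below the chosen alpha (e.g., 0.05) would indicate that at least one
program area's mean use score differs from the others, the kind of difference
Ruppert reported.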

Since the Ruppert study many technological advances have occurred including

faster machines, widespread connectivity to the Internet, use of graphical user interfaces

in software, and use of the World Wide Web to retrieve and disseminate information.

These developments have, over the past 10 years, changed the nature of workplace

computer use among FLCES county Extension faculty. The following questions thus

arise: Have county agents kept abreast of the manifold technological changes of the past


2 This mean is computed from a linear model relating computer use to agent's "personal and situational
factors." A model was also constructed using the responses from the Computing Concerns Questionnaire.


10 years? Are they utilizing the Web for information to fulfill client need? Are they

disseminating information to clientele through Web sites or e-mail? Are agents using e-

mail to exchange information, and can they attach a file to such messages? And finally,

to what degree of sophistication do agents use everyday office software products such as

word processors, or a spreadsheet?

There was a need, then, to investigate the current use of information technology,

level of information technology skills, and the workplace application of modern

information technology among county Extension agents of the Florida Cooperative

Extension Service. Ultimately, Extension administrative entities, and other parties

interested in this issue, will be provided with objective, research-derived information that

should provide an understanding of county agents' IT use, and consequently enable the

specific IT training needs of county faculty to be addressed.

The objectives of this study were therefore:

1. Describe county Extension agents' demographic characteristics and, based on those
characteristics, determine their use of information technology, including self-assessed
level of overall computer skills.

2. Determine how county Extension agents are using information technology on the job
in terms of hardware and software use.

3. Determine county Extension agents' perceived level of skill with regard to a specific
set of information technology tasks.

4. Recommend future information technology training by describing the relationship
between agents' perceived importance of, and self-assessed knowledge about specific
information technology skills.

Definition of Terms

For purposes of this study, the following terms are defined:

1. Information Technology refers to computers, computer software, and peripheral
devices connected to computers such as modems, scanners, Ethernet, digital
television, etc.

2. Office-type software products refers to software that performs such tasks as word
processing, spreadsheets, browsing the World Wide Web, electronic mail, etc.

3. The FLCES is the Florida Cooperative Extension Service.

Limitations of the Study

A census of the population of county Extension agents of the FLCES was

conducted using an instrument accessible via the World Wide Web. Those agents not

responding to the on-line instrument after three weeks were sent a

traditional paper instrument. Due to the nature of a census study, the specific IT

infrastructure in place within the FLCES, and the specific IT knowledge and skills that

might be possessed by FLCES county agents, the findings of the study cannot be

generalized to Extension organizations elsewhere, though they are likely to offer insight

to those organizations.

Assumptions

It was assumed that the county agents of the Florida Cooperative Extension

Service who responded to the survey did so with truthfulness and honesty.

Significance of the Study

The level of skills and workplace application of information technology by county

Extension agents of the FLCES is presently under-researched. This situation is

significant to Extension because the ability to effectively use IT in a current, up-to-date

manner bears directly on the organization's operational effectiveness in two fundamental

areas: Its internal functioning, and its mission to serve the needs of its clientele.

Studying patterns of county Extension agents' IT skills will help paint a clearer picture of

the strengths and weaknesses faced by the organization in this important area.

Recommendations addressing specific needs for agent computer training will follow from

the findings of this study. Such information may be important to FLCES administrators,

particularly in light of the potential for enhanced organizational efficiencies.

Organization of the Remainder of the Chapters

This thesis is presented in five chapters. Chapter 1 introduces the study, and

proceeds to Chapter 2, a review of the literature. Chapter 3 discusses the study's

methodology, including research design and procedures followed. A detailed report on

the data collected is provided in Chapter 4, and Chapter 5 engages in a summary of the

study, conclusions, and recommendations based on the study's results.

CHAPTER 2
REVIEW OF THE LITERATURE

Introduction

This review of literature encompasses three parts. Part 1 engages in a broad

historical perspective of the development of computers that has led to today's

modern information technology. The intent of the section is to impart to the reader a

distinct feel for the profound impact that the computer and its peripheral devices have

had on societies throughout the world. Part 2 discusses the application and use of

information technology, establishing why its use is an advantageous, if not necessary,

skill in this day and age. Shown here is the extent of information technology's

penetration into the workplace, school, and home. Both the present and potential

impact of information technology on higher education, with considerations specific to

Extension, is discussed. Part 3 establishes the theoretical aegis under which this

study functions.

Part 1-A Historical Perspective of the Computer's Technical Development

A Look at Related Innovation from Ancient Times to 1950

Today's computer is a fusion of innovations, having evolved from many and

varied calculating devices, some dating to antiquity. Perhaps the modern

computer's most distant progenitor is the abacus, a counting device comprised of

beads strung on rods. The abacus widely diffused among the merchants of ancient

Asia and is still used in parts of the world today (Ross, 1986).

The next significant innovation in calculating devices occurred in the 1640s

when nineteen-year-old French mathematician Blaise Pascal invented a gear-driven

machine that could add and subtract. Some thirty years later the German

mathematician Gottfried Wilhelm von Leibnitz extended the capacity of Pascal's

machine to include multiplication and division. Another widely used calculating

device, the slide rule, also stems from this era (Ross, 1986).

In the 1830s an Englishman named Charles Babbage theorized an "analytical

engine" which foretold of modem computers. The "engine" incorporated a

programming component based on Joseph Jacquard's system of using punched cards

to operate weaving looms in a prescribed manner. This innovative feature

represented a distinct break from processing immediate input, and can be seen as the

progenitor of modern, stored computer programs (Ross, 1986).

American engineer Herman Hollerith ushered in the next significant step

towards modern computers. Responding to a competition held by the U.S. Census

Bureau to find the best means to tabulate the 1890 census, Hollerith invented an

electromechanical tabulating machine that successfully automated the census

counting process. Based on punched cards, the machine more than halved the time it

took to tabulate the previous (1880) census, saving the government an estimated five

million dollars. In 1896 Hollerith founded the Tabulating Machine Company that, for

the better part of the next century, would play a pivotal role in the development and

diffusion of computing devices (Campbell-Kelly and Aspray, 1996; Ross, 1986). A

short examination of this company's emergence follows: McKinley's assassination in

1901 brought about a change in the leadership of the Census Bureau. The Bureau's

new leader soon ended the business relationship with Hollerith's Tabulating Machine

Company, which forced the company to focus on diffusing its tabulating machine into

new markets. With the introduction of an improved "automatic" version of the

machine, adoption of punch card tabulating spread rapidly throughout many diverse

corporate and governmental entities. In 1911 Hollerith sold the company, which was

merged with two other businesses to become the Computing Tabulating and

Recording Company (CTR). The new company's president, Thomas J. Watson, Sr.,

established a highly effective sales force that facilitated the diffusion of the tabulating

machine throughout the world. In 1924 the CTR Company renamed itself the

International Business Machines Corporation (IBM) (Campbell-Kelly and Aspray,

1996).

The War Years: Calculating Needs drive Innovation

In the early 1940s the Moore School of Electrical Engineering at the

University of Pennsylvania possessed a Bush Differential Analyzer (a large

mechanical calculating machine). In proximity to this school was the Army's

Ballistics Research Laboratory (BRL) at the Aberdeen (Maryland) Proving Grounds.

This laboratory, which also had a Bush Differential Analyzer, was responsible for

creating firing tables for each new ballistic weapon fielded by the United States

military. With use of the differential analyzer, a firing table containing data on up to

3,000 trajectories could be completed in about a month. A team of 100 human

calculators (characteristically young women) working with desktop calculators took

approximately the same amount of time to complete a table. As the war progressed

the BRL fell behind schedule in completing firing tables, thus creating a bottleneck to

the deployment of new weapons. It turned to the Moore School for help, but even

with this assistance the deployment of new weapons fell farther behind. The need for

effective calculating technology thus became urgent, and this spurred applied

research into a solution for the problem (Campbell-Kelly and Aspray, 1996).

In August of 1942 John Mauchly of the Moore School proposed to build an

electronic computer to expedite the calculation of firing tables, thus relieving the

bottleneck. Initially ignored, the proposal was revisited in the spring of 1943 and

approved shortly thereafter. Mauchly then teamed with a 24-year-old electrical

engineer named Presper Eckert, eventually devising an electronic computing machine

called ENIAC (Electronic Numerical Integrator and Computer).

A chance meeting on a railway platform between the BRL/Moore School

liaison officer, Herman H. Goldstine, and mathematics genius John von Neumann of

Princeton's Institute for Advanced Study led to von Neumann's involvement with

the Moore School's electronic computer project. By this time ENIAC was at such a

stage of completion that its design had been frozen. It had three major shortcomings:

Too little storage, too many tubes, and it took a prodigious amount of time to re-

program. These deficiencies led to the development of the EDVAC (Electronic

Discrete Variable Automatic Computer), which, incorporating many of von Neumann's

theoretical insights, pioneered aspects of electronic computing that hold to this day

(Campbell-Kelly and Aspray, 1996).

EDVAC was a "stored program" computer that consisted of an input device,

memory, a control unit, an arithmetic unit, and an output device. In the spring of

1945 von Neumann published "A First Draft of a Report on the EDVAC" which

detailed "the complete logical formulation of the new machine," a formulation which

"ultimately was the technological basis for the worldwide computer industry"

(Campbell-Kelly and Aspray, 1996, p. 94). This document, though intended only for

internal use, was rapidly disseminated around the world. In the meanwhile, shortly

after the end of the war, Mauchly and Eckert's ENIAC computer came to life. Its

intriguing physical appearance and its speed of 5,000 operations per second generated

tremendous coverage in mass media channels, thus attracting public and scientific

interest. Responding to the publicity, the Moore School sponsored a series of lectures

in 1946 specifically to diffuse information on the stored-program computer. The

lectures established a link between the school, and the many governmental, university

and industrial entities working on computers in the late 1940s (Campbell-Kelly and

Aspray, 1996).
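
The stored-program design just described, in which instructions are held in
memory alongside the data and executed under the direction of the control
unit, can be made concrete with a toy simulation. The Python sketch below is
purely illustrative; EDVAC's actual word size, serial memory, and instruction
set are not modeled:

    # A toy stored-program machine: instructions and data share one memory.
    memory = [
        ("LOAD", 7),   # 0: acc <- memory[7]
        ("ADD", 8),    # 1: acc <- acc + memory[8]
        ("STORE", 9),  # 2: memory[9] <- acc
        ("PRINT", 9),  # 3: send memory[9] to the output device
        ("HALT", 0),   # 4: stop
        None, None,    # 5-6: unused cells
        2, 3, 0,       # 7-9: data cells
    ]

    acc = 0  # the arithmetic unit's accumulator
    pc = 0   # the control unit's program counter

    while True:
        op, addr = memory[pc]  # fetch and decode the next instruction
        pc += 1
        if op == "LOAD":
            acc = memory[addr]
        elif op == "ADD":
            acc += memory[addr]
        elif op == "STORE":
            memory[addr] = acc  # a program may even rewrite its own memory
        elif op == "PRINT":
            print(memory[addr])  # prints 5 (i.e., 2 + 3)
        elif op == "HALT":
            break

Because the program lives in the same rewritable memory as the data, changing
the computation is a matter of loading new contents rather than rewiring the
machine, the very shortcoming that plagued ENIAC.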

ENIAC and EDVAC are but two examples of the "first generation" of fully

electronic computational devices. The British COLOSSUS, developed in secrecy to

break high-level German teleprinter ciphers, and the Z series of machines developed

by Konrad Zuse were similar devices. Based on vacuum tubes, these "first generation"

computers were very large and consumed prodigious amounts of electricity.

Each was a unique creation dedicated only to solving mathematical problems.

Re-invention, Public Attention, and Diffusion: The 1950s

The end of the war allowed the once secret digital computation techniques to

quickly diffuse into the civilian arena. This brought roughly thirty firms in the United

States into the computer business. About ten were established in Great Britain.

Office machine manufacturers, electronics and control equipment suppliers, and

entrepreneurial start-ups were the three types of companies attempting to capitalize

on the new technology (Campbell-Kelly and Aspray, 1996).

During the 1950s the computer, hitherto a mathematical instrument with

limited application, was re-invented as a data-processing machine (Campbell-Kelly

and Aspray, 1996). Leading the way, in early 1951, was the Electronic Control

Company whose UNIVAC system was geared specifically towards business. IBM,

after initially committing the company's resources to develop a scientific computer

called the "Defense Calculator," quickly realized that it should focus its efforts on

developing machines for civilian business. The wake-up call for IBM occurred when

the U.S. Census Bureau adopted the UNIVAC system to address the bureau's

computational needs (Campbell-Kelly and Aspray, 1996).

Popular excitement about computers in the early 1950s was reflected in mass

media channels such as business magazines, which fanned interest with sanguine

predictions of a paperless revolution driven by sophisticated automata. Many

business establishments were thus spurred to adopt the computer at this early date

regardless of cost effectiveness (Campbell-Kelly and Aspray, 1996).

New technical innovations continued to provide improvements to the nascent

computer industry's product. Out of MIT's Project Whirlwind, a contract to build a

flight simulator for the military, came the inventions of magnetic core memory and

"real time" operation. The core memory innovation alone would quickly replace all

other types of memory, reaping MIT large royalty payments. Real time operation, in

which a computer immediately responds to external input, enabled new military and

business applications. Stemming directly from Whirlwind's technological

breakthroughs came the military's SAGE early warning air defense system. Although

IBM was the primary contractor for the project and gained tremendous technological

advantage in the industry as a result, SAGE nevertheless spun off key technological

innovations such as printed circuitry, mass-storage devices, graphical displays, digital

communications, and core memory to a host of different companies. The project also

trained a large cadre of software engineers. At the end of the 1950s only about 6,000

computers were installed worldwide, but a critical mass of technological innovation

was in place to begin intense commercial exploitation of the machine (Campbell-

Kelly and Aspray, 1996).

Two high-level programming languages were introduced in the 1950s that

endure to the present day: FORTRAN, for scientific applications, and COBOL, for

business. These innovations facilitated further adoption of the computer because of

their similarity to natural, spoken language, and because they had built-in algorithms

that clearly spelled out errors in newly written code.

Programming costs constituted the largest expense associated with a computer

installation, and thus companies preferred to obtain ready-made applications written

by outside vendors rather than develop them on their own. Recognizing this, the

computer manufacturers began to bundle software that had specific application, be it

banking, insurance, manufacturing, etc., to their computers. Libraries of existing

programs were included with systems, and the free exchange of code was facilitated

through user groups like SHARE (Campbell-Kelly and Aspray, 1996).

It should be noted that by the end of the 1950s the British computer industry,

though it was first to market a computer (the Ferranti Mark I), was struggling for

survival. Campbell-Kelly and Aspray attribute this situation to a lack of enthusiasm

amongst Britain's "old fashioned businesses" to adopt the innovation (Page 106).

The social consequence of the British establishment's non-adoption was to effectively

stifle a new, innovative industry that was trying to take hold, thus ensuring American

dominance in the arena for years to come.

The 1960s: Refinement, more Innovation, and more Adoption

Replacing the vacuum tube in the late 1950s, the discrete transistor ushered in

the second and third generations of fully electronic computers. These new machines

were much more compact in size, consumed less power, and did not generate nearly

as much heat (Ross, 1986). This spurred further adoption of the innovation, and by

the end of the decade there was a tenfold increase in the number of installed systems -

to almost 80,000 in the United States, and 50,000 elsewhere (Campbell-Kelly and

Aspray, 1996, p. 130).

Transistors, however, were soon obsolete, being replaced by a truly

revolutionary innovation called the integrated circuit. Invented by Jack S. Kilby1 of

the Texas Instruments Corporation, the integrated circuit readily lent itself to

miniaturization and sophistication. Its introduction in the late 1960s ushered in the

beginning of a fourth generation of digital computational devices, which would

steadily increase in power and speed while dropping in size and price (Ross, 1986).

Rapidly increasing numbers of computers created a demand for application

programs that, by 1965, supported 40-50 large software contractors and

approximately 2,750 smaller ones. By the end of the decade these services were in

greater demand because the average-size company was unable to develop software in-

house that could effectively exploit modern computing power. It simply cost too


1 Kilby was awarded the 2000 Nobel Prize in Physics for his role in the invention of the monolithic
integrated circuit.

much money. Custom-designed applications purchased from outside contractors,

however, were also prohibitively expensive. This problem opened the door to

"packaged software," which effectively distributed development costs across a whole

market (Campbell-Kelly and Aspray, 1996).

Some other notable technological developments in the computing arena hail

from the 1960s era: Time-sharing, the BASIC programming language, and the

minicomputer. Computer time-sharing systems, developed through large grants from

the U.S. Advanced Research Projects Agency, enabled multiple parties, even at

divergent locations, to simultaneously use a large computer. This innovation

markedly increased computer efficiency, and spawned what would be known as the

"computer utility" industry, a short-lived phenomena that envisaged "piping computer

power into homes" (Campbell-Kelly and Aspray, 1996, p.217). It should be noted

that whereas the popular marketplace application of time-sharing failed, it remains an

instrumental part of almost all mainframe computing today.

The BASIC (Beginners All-purpose Symbolic Instruction Code) computer

language emanated from the Dartmouth (College) Time-Sharing System, and was

devised with simplicity in mind. It diffused rapidly through the educational

establishment, making it de rigueur for manufacturers to include it on any new system

designated for this market. Though some criticized its simplicity, BASIC emerged as

a user-friendly language that made it possible for a wide spectrum of people to adopt

the use of computers. It was to be the first widely available programming language

for the forthcoming personal computer and "laid the foundations of Microsoft"

(Campbell-Kelly and Aspray, 1996, p. 211).

Towards the Personal Computer

Minicomputers emerged from MIT's Whirlwind project to become a product

of the electronics industry (as differentiated from mainline computer manufacturers

such as IBM). They were part of the revolution that miniaturized electronics, an

effort that brought the world pocket calculators, digital watches, and ultimately the

personal computer. Minicomputers enjoyed two distinct attributes that led to their

widespread adoption by scientific, academic, and engineering entities: They were far

less expensive than a mainframe (having no bundled software, peripheral devices, or

marketing overhead built into their price), and they allowed for "hands-on computing"

akin to that of the early 1950s.

Minicomputer use, especially of the Digital PDP-8, spawned interest in

computing amongst the students, experienced engineers, and young technicians who

used them, and from this interest a "strong computer hobbyist culture" emerged

(Campbell-Kelly and Aspray, 1996, p. 225). In 1966 the Amateur Computer Society

was founded, and through its "ACS Newsletter" publication a network of like-minded

individuals was formed.

The microprocessor arrived in 1971. Developed over the course of two years

by Intel Corporation, it was designed as a general-purpose logic chip that could be

programmed for specific applications such as a calculator. In fact the first Intel

microprocessor was sold in early 1971 to Busicom, a Japanese calculator

manufacturer. This chip, however, was soon to be re-invented.

Precipitous declines in the price of electronic calculators soon led Busicom

to relinquish the marketing rights to the general-purpose logic chip back to Intel, who,

in November of 1971 began marketing it as a "computer on a chip." This was the

Intel 4004 microprocessor. A low-powered device capable of processing only four

bits of information at a time, it nevertheless sold for $1,000 a copy. Competition

from such companies as Motorola, Zilog, and Mostek would soon drive the price of

microprocessors much lower (Campbell-Kelly and Aspray, 1996).

Personal Computing: Technology Fuses with Latent Desire

Two distinct groups played a role in the eventual inception of the personal

computer: Computer hobbyists and those involved with the "computer liberation"

movement. The hobbyists were concentrated in the Silicon Valley region, around

Massachusetts' Route 128 corridor, and in lesser numbers throughout the country.

Resembling, if not outright stemming from, the "ham" radio culture, these

individuals were characteristically young male "technophiles" often with some

professional association with the electronics industry. They were likely to read such

mass media publications as "Popular Electronics" from which kits to build such

things as stereos and television sets could be obtained. Minicomputers were too

costly for these individuals, as was computer use by way of time-sharing computer

utilities. This then sparked a desire for economical computer hardware that could be

readily owned by an individual (Campbell-Kelly and Aspray, 1996).

Congruent with the desire for personally owned hardware, and hailing from the

anti-establishment culture of the 1960s, the computer liberation movement espoused

the "radical idea called hypertext," a vision whereby common people could

economically access a "universe of information held on computers" (Campbell-Kelly

and Aspray, 1996, p. 239). Inhibiting this vision was the fact that almost all computers

were "rigidly controlled in government bureaucracies or private corporations"

(Campbell-Kelly and Aspray, 1996, p. 239). Computer hardware that could be

personally owned would facilitate computer liberation.

In January of 1975 the first microprocessor-based computer was offered as a

$397 kit. Called the Altair 8800, its availability was announced exclusively on the

cover of Popular Electronics magazine. Though it had no keyboard or monitor, and

unto itself did nothing other than light up a few small light bulbs, it generated a

million dollars worth of orders in the first three months it was offered. Soon other

companies were marketing add-on components for the system such as additional

memory, storage devices and software. The Altair 8800 galvanized the attention of

Bill Gates and Paul Allen who formed a company named "Micro-Soft" and quickly

developed a BASIC programming system to accompany this fledgling personal

computer (Campbell-Kelly and Aspray, 1996). New communication channels

dedicated to the innovation opened up practically overnight from "The Homebrew

Computer Club" near Silicone Valley, to "Byte," and "Popular Computing"

magazines. By 1977 a chain of stores, ComputerLand, would sell machines and

software nationwide (Campbell-Kelly and Aspray, 1996).

Key items such as screens and keyboards, which existed from the evolution of

mainframe computers, contributed to the rapid development of the personal computer

away from its simplistic beginnings. By 1977 there were three leading manufacturers

of personal computers whose products each appealed to a different segment of the

market. For Apple, the Apple II machine was a "home/personal computer," an

attempt to position it beyond the hobby market. Tandy's TRS-80 machine appealed

to Radio Shack's clientele of hobbyists and video game enthusiasts. For Commodore,

the personal computer was conceived as an extension of its line of calculators

(Campbell-Kelly and Aspray, 1996).

Software drove adoption of the personal computer (PC). Computer games,

simulation programs for education, and perhaps most importantly business

applications transformed the machine from the realm of the hobbyist to a utility.

Leading this transformation was the VisiCalc spreadsheet that, coupled with the

(relative) speed and flexibility of the PC, allowed businesses to easily model various

financial scenarios. "Suddenly it became obvious to businessmen that they had to

have a personal computer. VisiCalc made it feasible to use one. No prior technical

training was needed to use the spreadsheet program. Once, both hardware and

software were for hobbyists, the personal computer a mysterious toy, used if anything

for playing games. But after VisiCalc the computer was recognized as a crucial tool"

(Slater, as quoted by Campbell-Kelly and Aspray, 1996, p. 251). The software

evidently had copious relative advantage over analogous mainframe software in its

speed and flexibility, and because it could be used virtually for free after a modest

purchase expense. VisiCalc certainly was compatible with business's existing values,

especially if they had been using similar software on a mainframe. Given that

VisiCalc needed "no prior technical training," its complexity was such that it could be

easily adopted. Evidently it was easy to try, and furthermore the success of those

trials was very obvious to businesses. Thus this innovation was readily adopted.

By 1980 there were many spreadsheets on the market, along with word

processing software and the first of the database products. The PC itself was sporting

new monitors that displayed 80 columns of text in both upper and lower case, and

printers were quite affordable. Its potential as a business machine had clearly arrived

(Campbell-Kelly and Aspray, 1996).

IBM's entry into the personal computer business had the instantaneous effect

of casting a seal of approval on the PC technology. The IBM badge was an emphatic

statement that the PC was legitimate technology compatible with business

everywhere, and business responded in a big way. Launched in New York City on

August 12th, 1981, the IBM Personal Computer generated "intense" interest from

mass media, thus diffusing knowledge of the innovation far and wide. This attention

was in addition to IBM's own memorable advertising campaign that featured a

Charlie Chaplin look-alike designed to humanize the PC machine (to make it seem

compatible to those considering adoption). At a price of $2,880 there was soon a

waiting list for the product, and IBM quickly quadrupled production (Campbell-Kelly

and Aspray, 1996).

Software Diminishes Complexity, Enhances Compatibility

The IBM personal computer architecture with its Intel 8088 processor, 64

kilobytes of RAM, and floppy disk drive quickly became an industry standard. All

computer manufacturers either switched to the new standard or suffered the

consequences. A notable exception was Apple Computer, whose business approach

to the IBM competition was to design a better operating system, and resultantly,

better application software. Apple's president, Steve Jobs had seen the future, so to

speak, when he had accepted an invitation to visit the Xerox Corporation's Palo Alto

research laboratories in 1979. (The visit was in response to an invitation extended by

Xerox, who was an early investor in Apple.) During his visit Jobs witnessed,

amongst other things, the mouse and the graphical user interface (GUI). Clearly

amazed, he commented that Xerox could "blow away" the competition with the

technology. Taking his observations back to Apple headquarters in Cupertino

California, Jobs convinced his colleagues that what he had seen at Xerox was the

technological way to go. In May of 1983 Apple launched the "Lisa" computer that

incorporated the GUI operating system and mouse innovations. At $16,995 the Lisa

was a commercial failure. In January 1984 Apple introduced another computer, the

Macintosh, which also incorporated the GUI and the mouse. Though described as

making "every other personal computer appear old-fashioned and lackluster"

(Campbell-Kelly and Aspray, 1996, p. 276), the Macintosh, priced at $2,200, failed to

garner much adoption outside of the computer enthusiast market, education, and

printing and media companies. Regardless of Apple's failure to gain widespread

adoption of its products, the GUI was an innovation that would eventually propel

adoption of the personal computer throughout society, and one company, Microsoft,

having written much of the software for the Macintosh, had gained intimate

knowledge of how the GUI technology worked (Campbell-Kelly and Aspray, 1996).

Responding to the very apparent advantage of the Apple operating system,

other firms launched similar GUI products. The first was VisiCorp, the maker of

VisiCalc, which introduced VisiOn in October of 1983. Soon thereafter, in early 1985, Digital Research launched

GEM. Microsoft, having licensed characteristics of the Macintosh operating system,

released Windows in late 1985, and IBM, initially partnered with Microsoft, began

work on OS/2 in 1987. Sooner or later each of these operating systems, except

Windows, would fail to be commercially viable (Campbell-Kelly and Aspray, 1996).

Buoyed by royalties generated from its Disk Operating System (DOS), a copy

of which was installed on every IBM PC sold, Microsoft had the resources to develop

and effectively market software, and to weather marketplace failures when they

occurred. The first version of Windows saw little adoption. It was "unbearably

slow" even on machines running the latest Intel 286 microprocessor, and was

perceived as a "gimmick" with little advantage over DOS (Campbell-Kelly and

Aspray, 1996, p. 278). Yet Microsoft persisted, developing a base of Windows

applications for the IBM PC. When the second version of Windows was released it

sold over 2 million copies. By the third release of Windows in May of 1990

microprocessor power had been enhanced to the point where the Windows GUI

operated with reasonable alacrity. This was the era of the Intel 386 and 486

microprocessors. At this time Microsoft chairman Bill Gates, presiding over a $10

million worldwide media spectacular at the launch of the new GUI, proclaimed

Windows 3.0 "puts the personal back into millions of MS-DOS-based computers"

(Campbell-Kelly and Aspray, 1996, p. 281). Five years later even more extravagant

media events heralded the August 1995 release of Windows, now renamed

"Windows95." Further releases of the software occurred in 1998, 2000, and 2001.

The Windows GUI, along with a diverse array of sophisticated software sporting a

Windows-based commonality in design, fueled adoption of the personal computer

across society to the greatest numbers ever.

Part 2-The Application and Use of Information Technology

General Use of Information Technology in Today's Society

"At work, school, and home, the personal computer has become a basic tool"

(U.S. Commerce Department, U.S. Census Bureau, 1999). By October of 1997, 37.4

million, or 36.6%, of American households had acquired a computer. More than 80%

of the children living in a household with a computer used it primarily for education

and games, but also for word processing, graphics and design, and e-mail. Boys and

girls use computers almost equally, but for different purposes. In 1997 almost half of

American adults used a computer at work, home or school. Half of all employed

adults used a computer on the job, a greater degree of use than at home or at school.

The fraction of adults using computers on the job increases to 75% if they have a

college education. Women, because they hold primarily technical or administrative

jobs within industry, tend to have higher levels of computer use than men. Men and

women also use computers differently at work (U.S. Commerce Department, U.S.

Census Bureau, 1999).

This "basic tool" in the employment setting is used primarily for word

processing. Other uses, in order of frequency, are keeping customer records and

accounts, e-mail and communications, calendar/scheduling, databases, spreadsheets,

and bookkeeping. Of less common use is inventory control, analysis, invoicing, sales

and marketing, graphics and design, desktop publishing and newsletters, and

programming (U.S. Commerce Department, U.S. Census Bureau, 1999).

For many Americans, use of the Internet is becoming an increasingly common

daily activity. Business transactions, personal correspondence, research and

information gathering, and shopping are now routinely conducted via computers

connected to the Internet. In August 2000, 41.5% of American households had

Internet access, and 116 million Americans were online at some location. This figure

is projected to grow substantially by the middle of 2001. Adoption of Internet

technology, and thus the use of computers, is occurring amongst almost all Americans

regardless of demographic characteristics. Even groups that traditionally have been

lagging behind the national trend are now making dramatic gains. This includes rural

households, whose rate of Internet penetration is now 38.9%, a percentage far closer

to the national rate than in the past (U.S. Department of Commerce, 2000).

Underscoring the growing importance of Internet activity to all Americans, the

Commerce Department states: "Each year, being digitally connected becomes even

more critical to economic, educational and social advancement. Now that a large

number of Americans regularly use the Internet to conduct daily activities, people

who lack access to those tools are at a growing disadvantage. Therefore, raising the

level of digital inclusion by increasing the number of Americans using the

technology tools of the digital age is a vitally important national goal" (U.S.

Department of Commerce, 2000, p. 1).

The significance of digital inclusion, and thus the need to adopt the

innovation, has not escaped the notice of industry, as evidenced by Chase Manhattan

Bank's (CMB) recent commitment to a multi-year, multi-million dollar grant to

develop an extensive home-school computer network at an inner-city school in

Brooklyn New York. Included in the grant is state of the art equipment to be given

away free to students and staff, a new school website, and the volunteered time of

1500 CMB employees (Chase Manhattan Bank, 2000).

The federal government is also catalyzing the building of the nation's Internet

framework through a key component of the 1996 Telecommunications Act. Called

the "E-Rate," the legislation allows school districts and libraries to purchase

telecommunication services at significant discounts. The result has been nearly $6

billion expended toward improving telecommunications infrastructure, and Internet

access, at predominantly needy schools and libraries (Benton Foundation, 2000).

Information Technology in Primary and Secondary Education

During the 1950s computing power was used almost exclusively to develop

new technology. In spite of this "preoccupation," the emergence of computer-based

education took form in flight simulation and various industry-based employee-

training programs. By the early 1960s computing power had tentatively reached the

mainstream (K-12) education establishment, which led to the programmed-

instruction movement (as exemplified by the Stanford Project and PLATO).

Adoption of computers, however, was stymied by their large cost and a lack of

individuals who knew how to operate them (Ross, 1986). Regardless of these

difficulties, the stage was being set for the further integration of computing power

into the mainstream education establishment.

Writing in the second half of the 1960s, Loughary indicated that in society at

large "computers and sophisticated communication devices" had become accepted

"as natural parts of our environment" (Loughary, 1966). Mainstream education, he

indicated, was not excluded: "The concepts underlying systems and electronic

communications devices are playing increasingly important roles in education and, if

the thinking and planning of some educational leaders is valid, are destined to become

basic and necessary to education in the not too distant future" (Loughary, 1966, p. xi).

Loughary is auguring the increased use of computers for instructional purposes, as he

goes on to observe that the machine was evolving from "the garage and workshop of

education" (metaphorically, administrative/bookkeeping functions), to "the kitchen

and living room" (metaphorically, the classroom) (Loughary, 1966). Capping off the

thought, he states: "The resulting potential for change in our educational institutions

are overwhelming" (Loughary, 1966, p. xi). Change as predicted by Loughary would

indeed take place, though at a much slower pace than thought at the time. Both

technological, and perhaps more significantly, sociological hurdles still needed to be

overcome before widespread adoption of the technology took place.

One can imagine the reaction of a professional educator to a "man-machine"

system that performed many of their traditional functions. Such a system, postulated

by Loughary, integrated the storage, retrieval, and high-speed printing of indexed

reference material along with diagnostic testing of students and the ability to produce

cumulative student progress reports (Loughary, 1966). Of the possible teacher

reaction to computers being used as a means to aid instruction (and having to learn

how to use them in this manner), Loughary writes: "While anticipating the

possibilities for individualizing and enriching instruction, he is reluctant to part with

the professional methods developed over the years and in which he has a real personal

investment. Few people after having gained professional status enjoy returning to the

role of novice. Nevertheless, the extent and rapidity with which man-machine

systems and new technology are implemented in education will depend upon the

willingness of professional, experienced teachers at all levels, kindergarten through college, to experience some basic re-education in machine and systems technology"

(Loughary, 1966, p. 6).

Regardless of the reluctance of mainstream professional educators, the use of

computers to instruct pupils was gaining momentum. "Throughout the 1960s,









corporations and universities initiated projects to develop and evaluate programmed

instruction" (Ross, 1986, p. 7). Evidencing the lack of involvement of mainstream

educators, Loughary indicates that discoveries stemming from the new field were

reported primarily in industry and agency publications, with little information to be

found in professional educational research journals (Loughary, 1966). Certain

educators, however, did fathom the implications of computers in instruction, and

engaged in speculation about what the future might hold: "Computers will play an

increasingly major role. It does not take much imagination to envisage increasing

individual study as a lifelong effort, conceivably occurring in one's home via

individual consoles connected to large computers by way of telephone lines or

electron beams. As with today's soft drink and candy-vending machines, we may

live with 'quick learning' machines capable of rapidly updating an individual in

specific skills. One can go on and on with speculation of the details of tomorrow.

However, the demands of today make it abundantly clear that radical changes in the

concepts and operation of education must come, and come soon" (Tondow, writing in

Loughary, 1966, p. 80).

Twenty-some years later in the late 1980s, when microcomputer use was

burgeoning in all areas of society, the computer's role in education was only just

beginning: "Although the use of computers was introduced into the educational

systems of some OECD2 countries in the late 1960s and early 1970s, the major

developments in the use of computers in schools have taken place in the 1980s since

the advent of the microcomputer" (Winship, 1989). As was the case in the 1960s,


2 Organization for Economic Co-operation and Development. Membership as of 1989 included all major European democracies, the United States, Japan, Australia, and New Zealand.









access to computer hardware in the late 1980s was a factor inhibiting its widespread

adoption and use in education. Even though education systems had made relatively

large investments in computer equipment, the average ratio of computers to pupils in

secondary schools in the United States was shown to be 1:27, which meant that, on

average, a student was receiving only five to seven minutes of direct computer

contact per day (Winship, 1989). Revealing that this situation had not much changed

halfway through the next decade, diSessa indicates that in 1995 there were about

three computers per "average" 30-student classroom in the United States. By 1999

only 10 percent of U.S. high schools had a student/computer ratio of 1:10, and the rest had less favorable ratios (diSessa, 1999).

Software issues also slowed adoption. In contrast to the 1960s when the

availability of software products was limited, the 1980s saw an estimated 1,500 to

2,400 new packages published per year in the U.S. alone. Of all these titles, however,

only 12 percent were deemed of good quality, with another 18 percent being of

tolerable quality (Winship, 1989).

The role of the teacher in adopting computer technology appears as pivotal in

the 1980s as it was in the 1960s. "Teachers in general seem to resist technological

progress and may appear to be the biggest stumbling block inhibiting changes in the

way computers are used in schools" (Winship, 1989, p. 29). Gilbert De Landsheere

reasons why this resistance occurs: "The methods that teachers use are governed by

beliefs and attitudes that have been deeply and unconsciously absorbed during their

school career, which in some countries begins at the age of three in the case of more

than 90 percent of children and thus lasts at least 15 years up to the end of









compulsory education. That is why, when they themselves become teachers, they

tend to copy the teaching they experienced rather than the techniques advocated by more recent teaching theory.

Before training anyone to use new technologies, or, more accurately, concurrently

with this training, the underlying attitudes and habits of educational practice need to

be thoroughly reformed. This will be a complex and costly operation and will only be

achieved by working together with the teachers over a long period of time and by

endeavoring to resolve jointly the problems that they have decided to tackle"

(Winship, 1989, p. 29). Lerner, citing Papert, says that primary and secondary schools resist change because educational policy is dominated by bureaucracies at all levels of government. Furthermore, the intellectual establishment, which dominates educational thinking, stems from a culture where change is extremely slow. Another inhibitor of change is that school, as we know it today, is deeply embedded in both individual and societal consciousness (Lerner, 1997).

That computers will have a profound impact on society and educational institutions is a philosophical theme that threads its way from the 1960s to the present. Writing in 1966, Tondow suggested that an "information explosion" led by the

computer was bringing "profound" change to society (Tondow, writing in Loughary,

1966). Elaborating further he says, "It is apparent that the computer represents one of

the major social as well as technological changes of our times. It is equally apparent

that we have not yet learned to fully utilize this equipment and have a limited sense of

its ultimate impact" (Tondow, writing in Loughary, 1966, p. 30). Writing in 1986,

Ross, citing a 1981 article by Johnson, states: "As you glance at this page, a

revolution is taking place around you. Signs of it can be seen everywhere: on T.V.,









in magazines, and in offices, homes and schools. The computer age has arrived, and

in the opinion of some, it will be significant enough to be labeled that by historians"

(Ross, 1986, p. 1). Quoting Herbert Simon's observation that the computer is a "once in several centuries" innovation, diSessa intones in 1999 that "Computers are incontestably transforming our civilization. Comparisons of our current information revolution to the industrial revolution are commonplace and apt. Almost no corner of society is untouched by computers" (diSessa, 1999, p. 3). He goes on to hypothesize that computers can be the technical foundation of a new and "dramatically enhanced literacy," the influence and scope of which will rival the current text-based literacy

(diSessa, 1999). This thinking appears to clearly establish a dynamic link between

human cognitive activity and modern computer technologies.

Information Technology in Higher Education

From administrative functions to academics, information technology has

become an integral part of higher education. A 2000 survey conducted by The

Campus Computing Project (CCP) reported that two-fifths of the participating

colleges (42.7%) have courses that use Web resources as a component of the syllabus,

and three-fifths (59.3%) of the participating colleges have courses that use electronic

mail. Many campus services, from undergraduate admission applications to checking grades and paying tuition, are becoming available online. Perhaps a more portentous

development is that over half of the colleges participating in the CCP survey report

offering one or more full college courses online (Campus Computing Project, 2000).

In fact, Fairleigh Dickinson University of Hackensack, New Jersey has mandated that

students take one online course per year during their matriculation (School to Require

One Online Course, 2000).









All of the above means that college students have to know how to use

computers. Florida State University prefers that students fulfill computer competency

requirements in their freshman year. The university states: "Regardless of the

vehicle used to satisfy the computer competency requirement, students must

demonstrate: 1. Basic familiarity with computer hardware, operating systems, and

file concepts; 2. Working knowledge of a word processor or text editor and at least

one other software application (e.g., spreadsheet, database, etc.); 3. Working

knowledge of the World Wide Web (WWW) and electronic mail" (Florida State

University, College of Arts and Sciences Department of Computer Sciences

Academics, 2000). The College of Agriculture and Life Sciences at Cornell

University has similar requirements, mandating that their students graduate with a

working knowledge of word processing, presentation tools, spreadsheet analysis,

database management, graphics, the World Wide Web, e-mail, and the ability to make

effective use of information on the Internet (Johnson et al., 1999).

A demonstrable level of student computer competency serves not only to facilitate the processes of higher education, but also to respond to the technology demands of prospective employers. A study conducted by Cornell University led its

investigators to conclude that agricultural employers "have a high expectation of

computer literacy in recent college graduates" (Johnson et al., 1999). Computer

competency requirements such as those of FSU and Cornell, and presumably many

other universities, help students to succeed both at the university and in the job

market.









Certain observers feel that the World Wide Web is bringing dramatic change

to academia. Duderstadt states: "There is an increasing sense that information

technology will have an even more profound impact in the future on educational

activities of colleges and universities and on how we deliver our services. To be sure,

there have been earlier technology changes, such as television, but never before has

there been such a rapid and sustained period of change with such broad social

applications" (Katz & Associates, 1999, p. 5). Richard N. Katz, using a hypothetical

entering freshman at the University of California Santa Cruz as an example, augurs a

very technologically driven campus in the year 2010. It is at this time that the first

class of students to have grown up with the Web will enter college. His hypothetical

student would have been solicited to enroll at UCSC as early as the 10th grade, based

on PSAT test scores and transcripts that were made available to college officials

electronically. By the beginning of their actual college experience, this individual

will have completed two semesters of collegiate work by way of the Web and

appearances by a UCSC professor at their high school location. Once on campus,

Katz's hypothetical freshman would be issued a "personal digital assistant" allowing

them to select from a variety of online courses offered by numerous UCSC "academic

partners," which include the seven other members of the UC system. Katz's student

would also benefit from the UCSC campus being equipped with wireless

technologies, allowing for very easy connectivity to such services as a "virtual bookstore," and so forth (Katz & Oblinger, 2000). "Preposterous? Yes, the scenario

no doubt understates the likely student expectations and campus capabilities of a

decade from now by an order of magnitude" (Katz & Oblinger, 2000, p. xv).









Distance learning, or distributed education, is bringing new and significant

competition to traditional academic organizations. Entities such as the University of

Phoenix, WebCT, and Eduprise.com are offering Web-based products and services in

what Katz calls the "e-learning 'marketspace"' (Katz & Oblinger, 2000). In this

"marketspace" commercial entities compete with traditional academic organizations,

which likely could begin to compete with themselves. As distance learning becomes

more common, students might soon have the option of obtaining the classroom

experience of renowned instructors located anywhere in the world (Katz &

Associates, 1999). Furthermore, the distributed learning environment appears to

converge with the psychological nature of students raised in an era of interaction with

electronic devices. "They approach learning as a 'plug and play' experience: they are

unaccustomed and unwilling to learn sequentially, to read the manual, and instead

are inclined to plunge in and learn through participation and experimentation" (Katz

& Associates, 1999, p. 7).

On a broadly philosophic note, Katz states that education, and thus

knowledge, has become the determining factor in the wealth of nations and the key to

individuals' standard of living. He posits that democratic societies bear a

responsibility to their citizenry to provide them with affordable and, moreover,

accessible, high-quality education. This, he says, has long been the theme of higher

education in America, which over time has encompassed more and more individuals

from a broader segment of society. The new and increasingly powerful technologies

associated with computers are seen by Katz as an opportunity for U.S. higher

education to capitalize on its global preeminence, perhaps some day meeting the









demands of not only the domestic population, but also of a global educational

"channel surfer" who carefully selects courses based on such criteria as content and

price. Revenues generated from such ventures could conceivably subsidize

traditional modes of instruction found on campus, which are not as remunerative

(Katz & Associates, 1999).

Information Technology and the Cooperative Extension Service

Extension, along with the rest of the world's societies, is now living in an age of

rapid change brought about by information technology. In this environment county

agents must possess up-to-date IT skills to effectively meet the demands placed on

them by the increased use of IT by both clientele and Extension administrative

entities. Ladewig states: "Face-to-face communication with clientele is a very

important method that we will always rely on to bring timely information to our

clientele. However, we must also examine how computer technology can help county

Extension agents deliver relevant information and support educational programs"

(Ladewig, 1999, p. 1). Martin (1998) states: "Computer and information

technologies are vital components of Extension's current and future infrastructure.

Agents and staff will have to transmit information between offices and clientele at a

distance" (p. 3). Echoing this, Rasmussen states: "Communication is the key to the

operations of the county Extension office. More and more county Extension offices

are turning to computers and other electronic technology to improve the

communications with the state offices and with university specialists, as well as with

the people they serve" (1989, p. 8).

Inextricably tied to today's information technology, and of clear importance to

furthering Extension's mission, is the Internet. "There are tremendous opportunities









for Cooperative Extension (CE) on the Internet. These opportunities are for improved

functionality of the CE system, and new opportunities for communities that sustain

the CE system" (Tennessen, PonTell, Romine, & Motheral, 1997). Bamka (2000)

states that it is important for Extension professionals to teach agriculture

professionals to become familiar with the Internet in order to take advantage of its use

in developing markets for, and promoting, agricultural products. Sherfey, Hiller,

Macduff, and Mack (2000) describe an Internet-based system designed by Extension

that assists professionals in developing their volunteer management skills. Clientele,

some with Extension's assistance, and some without, are using the Internet to acquire

information and also to market agricultural goods. New Jersey hay producers have

found marketing success over the Web (Bamka, 2000), and Iowa farmers, with a 33%

Internet penetration rate, are using a net-based service to price commodities: "'To be

successful in the 21st century, you have to have access to better information and

sophisticated tools,' said David Lyons, director of business development for the Iowa

Farm Bureau Federation" (Bohrer, 2000, p. 6G). Baker (1998) reported that 46% of

Florida Farm Bureau County Directors surveyed felt the Internet helped them do well

in their jobs.

The Internet has also enabled the creation of new ways that Extension

professionals can receive in-service training. Lippert, Plank, and Camberato (1998)

and Lippert, Plank, and Radhakrishna (2000) described in-service training for

Extension professionals in the Southeast that used a listserv and a Web site. The

authors investigated two different in-service trainings that used this method, and









reported that the participants broadly accepted it, and demonstrated suitable

knowledge retention of the subject matter studied.

Internal accountability of Extension activity, especially planning and reporting

needs, is increasingly being done over the Internet by way of Web-based applications.

This is the case with the FLCES, the North Carolina Cooperative Extension Service,

and the Clemson University Cooperative Extension Service in South Carolina.

Radhakrishna and Pinion (1999) stated that accountability is becoming more important

because of stricter mandates legislated by federal, state, local and university

authorities. Web-based accountability systems are helping to accommodate these new

demands.

The World Wide Web, Internet mail, modern GUI operating systems, and a suite of office software now challenge the Extension professional on a daily basis. Internal administrative needs and clientele needs both increasingly call for the use of this modern information technology. A demand is thus placed on the organization to

ensure that its professionals obtain the necessary skills to function in this modern

context: "Knowledge has been the product of Extension since its inception. As CES

embraces the knowledge economy, leadership must find ways to insure that their

employees become knowledge workers in the information age" (Albright, 2000, p.

17). Employee training established from a clear understanding of the workforce's

present information technology skills is a clear-cut way of ensuring that Extension

professionals are using information technology in an effective manner.

Information Technology and the Florida Cooperative Extension Service

The information technology (IT) revolution ushered in by the microcomputer

is now slightly over twenty years old. Use of the technology has progressed within









the Florida Cooperative Extension Service (FLCES) to the point where

every county Extension agent has an up-to-date personal computer on his/her desk

that is equipped with a suite of current or near-current "office-type" software products

(L. Arrington, private communication, 2000). Additionally, the FLCES provides all

county offices with the resources to connect to the Internet. Serving clientele needs,

in-service training for Extension faculty, and administrative applications such as

information gathering, communication, and planning and reporting are all present or

potential uses of the organization's IT infrastructure.

Part 3-Theoretical Aegis of the Study

Training Needs

Albright states: "Before training programs are undertaken by organizations,

there should first be a front-end analysis to determine why the training is needed"

(Albright, 2000, p. 41). Training needs might stem from employees not knowing how

to perform a task, from something preventing employees from doing the task, or from a lack of

incentive to perform the task (Albright, 2000). When it is determined that training is

needed, a needs analysis should be performed to assess what may be causing a deficit

in employee performance (Albright, 2000). This analysis should reveal employee

competencies, and in so doing establish the objectives of the training (Albright,

2000).

Determining Training Needs

Albright, in her 2000 study of Texas county Extension agents, developed an

instrument called the "Computer Technology Skills and Training Needs." This

instrument, designed to assess agents' IT training needs, used the Borich Needs

Assessment Model as its basis. The Borich model, later confirmed by Barrick,









Ladewig, and Hedges, functions by having respondents self-assess their knowledge

about a competency, the importance of that competency to their job, and the degree of skill with which they are able to apply that competency to their job (Albright, 2000). Albright

states: "The strength of Borich's model allows for finer judgments in rating each

competency and allows for a more relevant evaluation of the response" (Albright,

2000, p. 64).

The Borich model predicts that differences will occur between the rankings

(importance, knowledge, and application) for each competency considered.

Therefore a respondent might give a high rank to the importance of a skill, but give a

low rank to their knowledge and/or application of that same skill. Training needs are thus more appropriately identified by comparing mathematical combinations of the rankings for each competency, rather than from a single ranking (e.g., importance) of one competency alone (Albright, 2000). Ultimately Albright used the formula (Importance mean - Knowledge mean) x Importance mean to derive a hierarchy of training needs. She states: "Knowledge and application should be

considered in determining relationships; however, the knowledge factor, when

weighted and applied to the importance factor becomes the most appropriate

measurement to determine ranking" (Albright, 2000, p. 87).

Summary

From its ancient beginning in Mesopotamia as a series of beads situated on

a string or shaft to its modern inception with miniaturized circuitry, the computer is a

device conceived by man to transmute the complex into the simple. It has been

employed to preserve the world's democracies in time of great peril, to further

science in its quest to explain natural phenomena, and to aid medicine in its fight









against disease. Computing devices and their peripherals, collectively termed

"information technology," have enabled complex and large businesses such as the

airline industry to grow, and in so doing have fostered unprecedented levels

economic prosperity around the globe. In educational settings information

technology is having a significant impact, changing longstanding teaching

methodology and holding the promise to distribute education to those who previously

may have been excluded.

Modern information technology, powerful personal computers running sophisticated, easy-to-use software products integrated with communication technologies enabling access to the World Wide Web, has revolutionized the way individuals can accrue and disseminate information. Certain thinkers propose that this is the basis of a new form of literacy (diSessa, 1999). Others indicate that the

power of information technology transforms organizations into highly competitive,

agile entities whose workers use information to produce new knowledge (Albright,

2000).

The Florida Cooperative Extension Service is an organization whose

technological infrastructure is suitably developed to participate in the information

revolution at hand (L. Arrington, private communication, 2000). Spurring the

modern, effective use of information technology for its own internal functioning, and

for the benefit of its clientele, is an effort that the organization needs to pursue. To

what degree county Extension agents are able to use modern information technology

to meet this need is presently under-researched. What is known, however, is that








Extension, the great system of education beyond the classroom, must be prepared to

take advantage of the still unfolding revolution in information technology.














CHAPTER 3
METHODOLOGY

Introduction

The following objectives were established in Chapter 1 to guide the study:

1. Describe county Extension agents' demographic characteristics and, based on
those characteristics, determine their use of information technology, including
self-assessed level of overall computer skills.

2. Determine how county Extension agents are using information technology on the
job in terms of hardware and software use.

3. Determine county Extension agents' perceived level of skill with regard to a
specific set of information technology tasks.

4. Recommend future information technology training by describing the relationship
between agents' perceived importance of, and self-assessed knowledge about
specific information technology skills.

Research Design

This study uses applied research methodology in that it seeks to answer

practical questions associated with an immediate problem (Ary et al., 1996). It is

quantitative research, having collected observations that readily lend themselves to

numerical representation (Rossi & Freeman, 1993). Furthermore, the study's

underlying research design is primarily descriptive in nature, revealing the existing

state of IT use amongst FLCES county agents vis-a-vis various demographic

variables measured. Some inferential statistical procedures were utilized. T-tests

were performed to examine for the existence of significant differences between the

means of certain variables of interest. ANOVA was employed similarly. In

conjunction with the training needs analysis, correlations between the "knowledge" and "application" constructs were computed. Both Ruppert (1992) and Albright

(2000) employed similar survey-based research in their respective studies of

information technology use amongst county Extension agents.

Population

The population for this study was county Extension agents in the employ of

the Florida Cooperative Extension Service. At the time the study was initiated,

during the summer of 2002, this population numbered 331. Due to the relatively

small population size, and also due to the need to accurately capture demographic

differences within the population, a census was conducted. Ruppert (1992) and

Albright (2000) both used a census in their respective studies of county Extension

faculty.

County Extension agents possess unique characteristics based upon formative

experiences (age, social class, gender), educational experiences (university attended,

undergraduate and graduate programs), and other factors such as intelligence,

motivation, and personality traits (Baker et al., 1997). Accordingly, the population of

FLCES county agents displays a diversity of demographic features, features that this

study wishes to describe in conjunction with the agents' use of information

technology, including self-assessed level of overall computer skills.

Instrumentation

Data for this study was collected by way of an instrument adapted from

Albright's 2000 survey of Texas county Extension agents. The on-line version of this

adaptation is provided as Appendix A, and the paper version is provided as Appendix

B. Albright's instrument, the "Survey of Computer Technology Skills and Training

Needs" (SCTS), is based on the methodological framework of the (1980) Borich









Needs Assessment Model as verified by Barrick, Ladewig, and Hedges in 1983

(Albright, 2000). The list of computer competencies used in the SCTS was derived

from two documents: The Texas Education Agency's Texas Essential Knowledge

and Skills (for Texas teachers, and Texas students in grades K-12), and the Texas

Technology Essential Knowledge and Skills (for Texas teachers, and Texas students

as of the 8th grade), a document developed by "business, industry, and educational

professionals" (Albright, 2000, p. 38). Albright reports the Texas Technology

Essential Knowledge and Skills document as having "quickly become a national

standard among educational institutions" (Albright, 2000, p. 38). The SCTS

instrument was subjected to expert review, and furthermore, was compared to similar

national tests of computer competency for content and quality of the technology

competencies it addressed. These competencies were found to be "... as or more

complete and comprehensive than each of the other assessments reviewed" (Albright,

2000, p. 60). In addition, the SCTS instrument was subjected to a pilot study.

Reliability of the SCTS instrument was established during data analysis by

examining internal consistency of each scale of computer competency using statistical

procedures. As Albright puts it: "Cronbach's Alpha using summated scale scores

completed for each respondent was used on ratings of importance, knowledge and

application. Questions were grouped to address specific goals of the study and were

assessed for consistency using this procedure" (Albright, 2000, p. 65).
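
As an illustration of the procedure (the study itself used statistical software rather than custom code), Cronbach's alpha for one construct's summated scale can be sketched as follows, with invented ratings:

    # Cronbach's alpha for a scale: alpha = k/(k-1) * (1 - sum of item
    # variances / variance of summated scores). All ratings are invented.

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    def cronbach_alpha(items):
        """items: one list of respondent ratings per scale item."""
        k = len(items)                  # number of items in the scale
        n = len(items[0])               # number of respondents
        item_var = sum(variance(col) for col in items)
        totals = [sum(col[i] for col in items) for i in range(n)]  # summated scores
        return (k / (k - 1)) * (1 - item_var / variance(totals))

    importance_items = [
        [4, 5, 3, 4, 5, 2],  # item 1, six hypothetical respondents
        [4, 4, 3, 5, 5, 2],  # item 2
        [3, 5, 2, 4, 4, 1],  # item 3
    ]
    print(f"alpha = {cronbach_alpha(importance_items):.2f}")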

Albright administered the SCTS survey to two groups via the World Wide

Web. Group one was "a purposive sample of 44 CEAs who are high users of

computer technology as identified by a TAEX Computer Information Technology









workgroup" (Albright, 2000, p. 58). The second group was the general population of

county Extension agents in the employ of the Texas Agricultural Extension Service.

The response rate to the SCTS survey for the first group was 95%; for the second

group it was 64%. No comparison was made between the groups.

Albright's survey, and consequently the study at hand, asked "agents to report

self-perceived technology skills, their ability to apply the skills to their work and their

perception of the importance of the technology skills" (Albright, 2000, p. 59). Three

constructs, "importance," "knowledge," and "application" were assessed by way of

three questions which were asked for each area of computer skills the survey

considered. These constructs were "operationalized" as follows:

Importance: Importance of this skill to your job function.

Knowledge: Your knowledge of this topic (your ability to accurately recall

or summarize the subject matter).

Application: Your ability to use this skill in your job.

(Albright, 2000, p. 62).

The SCTS functions by soliciting response to a series of questions designed to

reveal specific demographic characteristics associated with the respondent, including

information on prior computer training. The instrument then asks the respondent if

they can perform specific computer technology skills associated with whichever of

the eight types of computer software (e.g., e-mail, word processing) the

instrument is presently considering. Within consideration of one of these specific

types of software, the three construct questions are then posed. Here the respondent

self-assesses their knowledge of the software, the importance they ascribe to the









software, and their ability to apply skills using the software to their job. Immediately

thereafter respondents are given the opportunity to add, in their own words, any

additional skills associated with the software they feel are needed by successful employees.

A review of this study's adaptation of the SCTS instrument was conducted by

a panel of experts chosen for their knowledge of Florida Cooperative Extension

Service county Extension agents and/or information technology. The panel included

representative(s) from the Dean for Extension's Office, the District Extension

Director's Office, the Department of Agricultural Education and Communication, and

the University of Florida's administrative computing department (Information

Systems). As a result of this review many changes were made to the adapted

instrument, including enhanced content, scales, and readability. The changed

instrument, however, retained the fundamental underpinnings necessary to analyze

training needs according to the Borich model. This instrument was then

subjected to a pilot test involving 20 agents chosen at random from the general

population of FLCES county agents. This is described in more detail below.

Data Collection

Data collection followed a "mixed-mode" approach as described by Ladner,

Wingenbach, and Raven (2001). This approach gives individuals a period of time (in

the case of this study, 3 weeks) to complete a Web-based survey instrument, but then

sends a paper copy of the survey instrument to those individuals who have not

completed the Web-based version. It is believed that this method accommodates

those individuals who do not have access to the Web, or who prefer not to complete the survey via the Web. This study also









took into consideration methodology for electronic surveys as described by Dillman

(2000) in his book Mail and Internet Surveys. Content of the reminder messages sent

by the researcher followed recommendations set forth by Glenn D. Israel (Israel,

2000).

The Web-based pilot test of the study's survey instrument involved 20

randomly chosen county Extension agents from the population of agents, and

commenced on June 24th, 2002. On that day an e-mail message introducing the study

was sent to the pilot population from the researcher. This message contained a link to

the Web site that hosted the study's survey instrument, and provided a unique,

individualized access code with which each potential participant could gain access to the instrument. A reminder message was e-mailed to non-responding individuals 3 days

later. Thereafter the researcher telephoned non-respondents with a personal appeal to

participate. The messages transmitted by the researcher for all phases of the study are

included as Appendix E through Appendix O.

The full Web-based survey was introduced on July 5th, 2002 by a message e-

mailed to all county agents from Dr. Christine T. Waddill, Dean for Extension. On

July 7th, 2002, the study commenced when the researcher e-mailed a message

containing specific information on the survey's rationale, a hyperlink to the World

Wide Web site hosting the survey instrument, and the agent's unique, individualized

access code. Included in this message were e-mail addresses and telephone numbers

to contact the researcher or his faculty advisor if need be. Reminder messages

containing the hyperlink to the survey and the agent's unique access code were sent

July 12th, July 16th, and July 22nd to those agents who had not yet completed the









Web-based study. Reminders were also sent out via e-mail on July 11th and July 25th by way of the Dean's "Comings and Goings" bi-monthly electronic publication.

The District Extension Directors were each asked by the researcher to encourage

participation, which resulted in additional e-mail reminders sent to specific segments

of the population. Dillman's (2000) assertion that multiple contacts with potential

respondents are as important to electronic surveys as to regular mail surveys was readily

confirmed by this study.

On August 1st, 2002, the population of agents who had not filled in the Web-

based instrument was sent a packet via conventional mail that included the

introductory letter from the researcher and his faculty advisor, a paper version of the

survey instrument, and a self-addressed stamped return envelope with which to return

the completed instrument. The introductory letter in this package contained language

indicating that the survey could alternately be filled out on-line, and provided the

URL to the site and the individual's unique access code. A single reminder message

was sent by post on August 14th to those agents who had not returned the paper

survey, or who had not completed it on-line. This reminder letter also included

language indicating that the survey could alternately be filled out on-line, and

provided the URL to the site and the individual's unique access code. The survey

concluded on September 1st, 2002.

On Web-based Surveys

The first electronic surveys conducted via the Internet were predominantly

done through e-mail (Solomon, 2001). With the advent of the World Wide Web and

its enabling hypertext markup language (HTML), electronic surveys soon became

ensconced in this new venue and became known as "Web-based surveys"









(Solomon, 2001, p. 2). This methodology emerged in approximately 1996-1997 (Dillman & Bowker, 2001; Solomon, 2001). Due to their low cost relative to conventional surveys (paper-based, face-to-face, computer-assisted telephone surveys, etc.), and their ability to quickly return copious amounts of data from the tremendous populations they reached, Web-based surveys experienced explosive growth (Dillman & Bowker, 2001; Yun & Trumbo, 2000; Solomon, 2001). Writing from a market

research perspective, Jeavons (1999) reported that "fashion" played a role in making

web-assisted interviewing "a booming industry," and that the ability to perform some

sort of Web-based data collection has become "almost mandatory" for market

research companies (p. 69). Coomber (1997), who made novel use of the Web to

perform a sociological survey on a specific population, suggests that the Internet

"presents enormous possibilities" to reach individuals that are desired as research

subjects.

Leadership for Web-based social (and market) survey procedures came not

from the "survey methodology community" but rather stemmed in large part from

computer programmers (Dillman & Bowker, 2001, p. 1). This produced a situation

where technological innovation in survey design and implementation, as performed

by the programmers, proceeded without the methodological rigor practiced by survey

methodologists (Dillman & Bowker, 2001). Two such cases involving "highly

visible" Web-based sample surveys purporting to have yielded scientifically viable

results are shown by Dillman and Bowker as having practiced questionable

methodology that did not take into account the presence of certain types of error. Just

like other types of sample surveys, those conducted via the Web are also subject to









four distinct types of error: coverage error, sampling error, measurement error, and non-response error (Dillman & Bowker, 2001).

Of the above, coverage error, or the error resulting from drawing a sample that

does not adequately represent a population, is of particular concern in Web-based

surveys, especially those of the general public (Coomber, 1997; Dillman & Bowker, 2001; Solomon, 2001). Though this situation is expected to diminish in the future as more individuals use the Web (Coomber, 1997), currently not everyone has access.

Under certain circumstances, however, Web-based surveys can be conducted in a

scientifically valid manner. Dillman and Bowker (2001) state: "Some populations -

employees of certain organizations, members of professional organizations, certain

types of businesses, students at many universities and colleges, and groups with high

levels of education - do not exhibit large coverage problems. When nearly all

members of a population have computers and Internet access, as is already the case

for many such groups, coverage is less of a problem" (p. 5). It would appear that

county Extension faculty of the FLCES is such a population, and thus the issue of

coverage error is averted.

Non-response error, though, remains a concern for all surveys, both Web-

based and conventional. As Bosnjak and Tuten (2001) put it: "Non-response is of

particular importance to researchers because the unknown characteristics and

attitudes of non-respondents may cause inaccuracies in the results of the study in

question" (p. 2). The authors then identify three traditional types of response to

requests to participate in a survey: unit non-response, where an individual does not

have access to the survey, refuses to respond, or is unable to respond; item non-









response, where only certain items in a returned survey are answered; and, lastly,

complete response.

Dillman and Bowker (2001) indicate that response to Web-based surveys is

likely to be low, and can potentially cause non-response error. Computer programs

running in the background of Web-based surveys have, however, enabled researchers

to identify respondent behavior, including modes of non-response. Bosnjak and Tuten

(2001) have classified the following patterns: Complete responders; Unit non-

responders; Answering drop-outs (individuals who answer some questions, but then

drop out of the survey before its end); Lurkers (individuals who view all of a survey's

questions, but answer no questions); Lurking drop-outs (individuals who only view a

fraction of the questions, then drop out); Item non-responders; and Item non-

responding drop-outs (individuals who view some questions, answer some, and then leave the survey before its end) (p. 6). Understanding these patterns might aid in

ameliorating non-response error in Web-based surveys.

In addition to low response rates, poor questionnaire design and a

respondent's lack of computer skills can lead to "premature termination of the

survey," with the implication of introducing non-response bias (Dillman & Bowker,

2001, p. 6). The authors illustrate this by identifying seven different scenarios

ranging from respondents not knowing how to erase answers, to having to take

multiple actions in order to answer a question. Furthermore, non-response could

possibly occur due to incompatibilities between the Web-based survey and the

respondent's hardware or software (Dillman & Bowker, 2001). Different browsers,

different versions of HTML, lack of random access memory, slow Internet









connections, and in some instances, use of the Java programming language, can all

cause the survey to be difficult, if not impossible, to complete (Dillman & Bowker,

2001).

Measurement error also presents new issues for Web-based surveys. The

foremost difficulty here is how to make a survey's response stimuli identical from

one respondent to the next (Dillman & Bowker, 2001). The study at hand used a very

basic level of HTML, which gave some assurance that all individuals

received the same response stimuli.

Addressing ways of reducing the four types of survey error (coverage, sample,

non-response, and measurement) as they pertain to Web-based surveys, Dillman and

Bowker (2001) promulgated the "Principles for the design of web surveys and their

relationship to traditional sources of survey error," which is here included as

Appendix C. Though presented by the authors with the caveat that the principles are

but "one attempt to develop such procedures," they nevertheless range broadly

through a gamut of issues salient to the design of Web-based surveys and how each

impacts a potential source of error. The introductory page, choice of first question,

visual appearance of questions, and use of graphical symbols or words to convey

level of completion of the survey are amongst the items considered.

How This Study Addressed Sources of Error

In general, the study sought to reduce measurement and non-response error by following as many of the recommendations presented in Dillman and

Bowker's "Principles" as possible. Coverage error was not an issue, as the study was

based on a census of FLCES county agents, and as such each participant had a known









non-zero probability of being included. Sampling error was also moot because a census

was conducted.

Measurement error was addressed by following Dillman and Bowker's "Principles," including a simple, motivating welcoming screen, an interesting first

question, easy-to-understand navigation buttons, and a clear indication of how much

of the survey a respondent has completed. Reduction of non-response error also

followed the "Principles," and included an e-mail invitation from the dean for

Extension asking for participation, and e-mail reminders urging participation, which were sent at pre-arranged times after the start of the survey. District Extension Directors were asked to encourage participation. Agents' use of their unique access codes also enabled the researcher to directly address issues of non-response. As a

means to combat non-response due to inability or reluctance to use the Web, a paper-based version of the survey was sent via post to all agents who did not complete the Web-based instrument. Due to the tremendous response rate (90.3%), the issue of non-response bias appeared not to be a concern for this study. A limited investigation of the non-respondents (n = 32) did not reveal any obvious differences in gender. Because age was an item collected only within the survey itself, the analysis looked at non-respondents' rank as a means to assess whether there was an age effect; it is assumed that age and rank have a reasonably strong correlation. The largest group of non-respondents was of rank I, and most were located in the Northeast and South Extension districts.

Data Analysis

The SAS System for Windows, Release 8.2 was used to analyze the data.

An alpha level of .05 was set a priori. Frequency distributions and descriptive

statistics such as the mean and standard deviation were calculated for all appropriate survey items and presented in tabular form (Albright, 2000; Ary et al., 1996; Johnson

et al., 1999; Ruppert, 1992). Analysis of variance with an associated Duncan's test was employed to test for differences in means between levels of certain variables, such as age. Association between the construct variables was described using Pearson's product-moment correlation (Albright, 2000). Cronbach's coefficient alpha was used to test the internal consistency of the scales.
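
The study's analyses were run in SAS; purely as an illustration, a comparable one-way analysis of variance (without the Duncan's post-hoc step) could be sketched in Python as follows, with invented group scores:

    # Illustrative one-way ANOVA across three hypothetical age groups;
    # compare the resulting p-value to the a priori alpha of .05.

    from scipy import stats

    usage_20_30 = [5, 6, 5, 4, 6]  # weekly-usage scale scores (invented)
    usage_41_50 = [4, 4, 5, 3, 4]
    usage_61_70 = [3, 4, 3, 3, 4]

    f_stat, p_value = stats.f_oneway(usage_20_30, usage_41_50, usage_61_70)
    print(f"F = {f_stat:.2f}, p = {p_value:.3f}")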














CHAPTER 4
RESULTS

This study investigated the current use of information technology, level of

information technology skills, and the workplace application of modern information

technology among county Extension agents of the Florida Cooperative Extension

Service. The study used applied research methodology in that it sought to answer

practical questions associated with an immediate problem (Ary et al., 1996).

In light of the manifold technological change of the past 10 years and its

impact on Extension, the following questions arise: Have county agents kept abreast of this change? Are they utilizing the Web to find information to fulfill clientele needs? Are they disseminating information to clientele

through Web sites or e-mail? Are agents using e-mail to exchange information, and

can they attach a file to such messages? And finally, to what degree of sophistication

do agents use everyday office software products such as word processors, or

spreadsheets?

To answer these questions a survey instrument was adapted from that used in

a similar (2000) study of county Extension agents in the state of Texas. The

adapted instrument included ninety-nine questions that recorded personal and

situational factors, and measured patterns of information technology use, specific

skills practiced for six different types of software, and types of computer hardware

and connectivity. The instrument also assessed future information technology

training needs. To these ends specific questions asked agents to gauge their









knowledge of, their ability to apply, and their perceived importance of the six

types of software. Response to these questions was then analyzed, and an order of

training need derived.

This study presents its findings in sequence with the major objectives

established in Chapter 1. Those objectives were to

1. Describe county Extension agents' demographic characteristics and, based on
those characteristics, determine their use of information technology, including
self-assessed level of overall computer skills.

2. Determine how county Extension agents are using information technology on the
job in terms of hardware and software use.

3. Determine county Extension agents' perceived level of skill with regard to a
specific set of information technology tasks.

4. Recommend future information technology training by describing the relationship
between agents' perceived importance of, and self-assessed knowledge about
specific information technology skills.

Objective 1

Describe County Extension Agents' Demographic Characteristics and, Based on
Those Characteristics, Determine Their Use of Information Technology, Including
Self-Assessed Level of Overall Computer Skills

A General Description of the Respondents

The number of county Extension agents employed by the Florida Cooperative

Extension Service at the inception of this study was 331. Two hundred ninety-nine

agents, or 90.33% of this population completed the study's survey instrument either

on-line or by paper. By gender the respondents were 57.86% female (n = 173), and

42.14% male (n = 126). This distribution of males and females mirrored that of the

general population of county Extension agents (58.01% female and 49.99% male) at

the beginning of the study. The majority of respondents (63.54%) indicated that their

age fell between 41 and 60 years (n = 190). Most respondents (69.90%) reported









work experience, including both inside and outside of Extension, of 16 or more years.

Table 1 immediately below presents this information.


Table 1.
Number of Respondents by Gender, Age, and Years of Work Experience

Characteristic N %N

Gender

Male 126 42.14
Female 173 57.86

Age Group

20-30 35 11.71
31-40 51 17.06
41-50 97 32.44
51-60 93 31.10
61-70 19 6.35
No response 4 1.34

Years of Work Experience

Less than 5 years 22 7.36
5-10 years 31 10.37
11-15 years 34 11.37
16+ years 209 69.90
No response 3 1.00



Comparing Response Groups

Of the 299 respondents, 278 (92.98%) completed the electronic version of the

survey instrument on-line, and 21 (7.02%) completed the paper version. For purposes

of comparison, respondents in this study were divided into four groups: "Early On-line Respondents," the earliest quarter of those who completed the survey on-line (n = 65); "Late On-line Respondents," the latest quarter of those who completed the survey on-line (n = 65); "All On-line Respondents," all respondents who completed the survey on-line (n = 278); and "Paper Respondents," who completed the paper version of the survey (n = 21). To form the

early and late on-line groups, the on-line respondents, excluding respondents to the

pilot study, were divided into percentage quartiles (Glenn D. Israel, personal

communication, October 2002). The first and last quarters of these respondents were

chosen to form the early and late groups, respectively.
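
To picture the grouping step, the following minimal sketch orders hypothetical on-line respondents by completion date and takes the first and last quarters; the agent IDs, dates, and column names are invented for illustration.

    # Order on-line respondents (pilot excluded) by completion date,
    # then take the first and last quarters as the early/late groups.

    import pandas as pd

    online = pd.DataFrame({
        "agent_id": range(1, 13),
        "completed": pd.to_datetime([
            "2002-07-07", "2002-07-07", "2002-07-08", "2002-07-09",
            "2002-07-12", "2002-07-14", "2002-07-16", "2002-07-18",
            "2002-07-22", "2002-07-25", "2002-08-02", "2002-08-10",
        ]),
    })

    online = online.sort_values("completed").reset_index(drop=True)
    quarter = len(online) // 4
    early_group = online.head(quarter)  # earliest quarter of responders
    late_group = online.tail(quarter)   # latest quarter of responders
    print(early_group["agent_id"].tolist(), late_group["agent_id"].tolist())

Comparing the earliest and latest responders in this way is a common proxy check for non-response bias, on the reasoning that late responders tend to resemble non-respondents.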

An examination for differences between the Early On-line Respondents and

Late On-line Respondents was then performed. As is shown in Table 2, the

percentages of male (44.62%) and female (55.38%) early on-line respondents are

essentially equal to the percentages of gender for all respondents. This changes for

the late on-line respondents, with females (64.62%) constituting a greater percentage

of this category.

Table 2.
Frequency and Percent by Gender for the Early and Late On-line Response Groups

Response Group Male Female

N      %N         N      %N

Early On-line Respondents 29 44.62 36 55.38

Late On-line Respondents 23 35.38 42 64.62



The analysis then examined the Early On-line Respondents and Late On-line

Respondents for differences in mean response to age, years of work experience, self-

rated computer skills, and hours of weekly computer use. This information is

provided in Table 3 below. Note that the study employed various scales to









Table 3.
Means and Standard Deviations of Early and Late On-line Respondents by Various
Variables

Variable                 Early On-line Respondents    Late On-line Respondents
                         Mean      SD                 Mean      SD
Age Group                3.13      1.13               2.95      1.21
Years Work Exp.          3.53      0.88               3.33      1.07
Self-rated Com. Skills   3.63      0.67               3.33      0.79
Hours of Usage/Week      4.90      1.14               4.63      1.32


measure levels of these variables (e.g., agents 20-30 years old were assigned the numeric value "1" for having indicated their age fell in the first level of a six-level

scale). Appendix D gives the scales and values they represent (for all variables using

a scale). A t-test for statistically significant differences between the means of these

selected variables was then conducted. As Table 4 on the following page indicates, a

significant difference in the mean self-rated computer skills score was found, with the

early respondents averaging a higher score. No other significant differences were

found.
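
As an illustration only (the study's own computations were done in SAS), an independent-samples t-test of this kind might be sketched as follows, using invented skill ratings:

    # Independent-samples t-test on invented 1-5 skill ratings for two
    # groups; the printed p-value is compared to alpha = .05.

    from scipy import stats

    early_skills = [4, 4, 3, 4, 5, 4, 3, 4]
    late_skills = [3, 3, 4, 3, 3, 4, 2, 3]

    t_stat, p_value = stats.ttest_ind(early_skills, late_skills)
    verdict = "significant" if p_value < 0.05 else "not significant"
    print(f"t = {t_stat:.2f}, p = {p_value:.3f} ({verdict} at alpha = .05)")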

An examination for differences between the All On-line Respondents group

and the Paper Respondents group was then performed. As is shown in Table 5 on

the following page, the percentages of male and female for both the All On-line

Respondents and Paper Respondents groups are essentially equal to the percentages

of gender found for the study's total, undifferentiated group of respondents (n = 299).











Table 4.
T-test for Significant Difference between Early and Late On-line Respondents

Variable t Value Pr

Age Group 0.90 0.37

Years Work Experience 1.16 0.25

Self-rated Computer Skills 2.26 0.026*

Hours of Usage per Week 1.27 0.20

*Significant at the α = 0.05 level.


Table 5.
Frequency and Percent by Gender for the Response Groups

Response Group           Male                Female
                         N       %N          N       %N
Electronic Respondents   117     42.09       161     57.91
Paper Respondents        9       42.86       12      57.14


The All On-line Respondents and Paper Respondents groups were then

examined for differences in mean response for age, years of work experience, self-

rated computer skills, and hours of weekly computer use. This information is

provided in Table 6 on the following page. A t-test for statistically significant

differences between the means of these selected variables was then conducted. As

Table 7 on the following page indicates, no significant differences were found.










Table 6.
Means and Standard Deviations of Electronic Vs. Paper Response Groups

Variable Electronic Respondents Paper Respondents

Mean SD Mean SD

Age Group 3.05 1.10 2.80 1.11

Years Work Exp. 3.47 0.93 3.20 1.10

Self-rated Com. Skills 3.51 0.74 3.19 0.92

Hours of Usage/Week 4.75 1.25 4.66 1.39



Table 7.
T-test for Significant Difference between Electronic and Paper Respondents

Variable t Value Pr

Age Group 0.98 0.32

Years Work Experience 1.23 0.21

Self-rated Computer Skills 1.86 0.06

Hours of Usage per Week 0.31 0.75



Use of Information Technology and Self-Assessed Level of Overall Computer Skills

County agents' use of information technology (IT) is here analyzed vis-a-vis

various demographics collected by the study. Aspects of age, gender, work

experience, agents' major programmatic area, and other characteristics are examined

to determine their effect on IT use and self-assessed level of computer skill. Note that

the total population of respondents (n = 299) is being examined.









The analysis begins by differentiating respondents according to gender, age,

and work experience. Table 8 shows females between 20-40 years of age constitute

38.15% (n = 66) of the female respondents, whereas males in the same age range

constitute only 16.13% (n = 20) of the male respondents.


Table 8.
Age and Work Experience of County Extension Agents Differentiated by Gender

Variable Male Female

N %N N %N
Age Group

20-30 3 2.42 32 18.50

31-40 17 13.71 34 19.65

41-50 42 33.87 55 31.79

51-60 50 40.32 43 24.86

61-70 12 9.68 7 4.05

No response 2 1.59 2 1.16

Years of Work Experience

Less than 5 years 5 3.97 17 9.83

5-10 years 4 3.17 27 15.61

11-15 years 13 10.32 21 12.14

16+ years 103 81.75 106 61.27

No response 1 0.79 2 1.16









As would be expected to follow from this finding, more female respondents (25.44%) than male respondents (7.14%) reported 10 or fewer years of work experience. The majority of

agents in both gender groups reported more than 16 years of work experience. Males

with 16+ years of work experience constituted 81.75% of their gender, whereas

61.27% of the female population reported 16+ years of experience. An examination

by way of a t-test for differences between the mean age of males and females yielded

a statistically significant difference. Female agents are, on average, younger than

male agents. This finding led to a statistical examination of differences between mean years of work experience for the genders, which also proved to be statistically significant. Table 9 gives the details of the t-tests on both age and years of work experience.

Table 9.
T-test for Significant Differences between Males and Females

Variable Male Female t Value Pr

Mean SD Mean SD

Age Group 3.41 0.92 2.76 1.14 5.21 <.0001*

Years Work Exp. 3.71 0.71 3.26 1.05 4.11 <.0001*

*Significant at the α = 0.05 level.

An Examination of the Non-respondents

Thirty-two agents (9.66% of the population) did not respond to the study's

survey. Analysis was done to determine if this group had any distinguishing

characteristics. As Table 10 on the following page shows, the non-respondents were

59.38% female (n = 19) and 40.62% male (n = 13), which is slightly different from the

gender breakdown for the population of respondents. Age was an item supplied by

the respondents, so analysis of the non-respondents based on this variable is

unavailable. The rank, however, of the non-respondents was available. Given that

rank is usually correlated with an agent's age, the non-respondents were analyzed accordingly. Table 10 shows that 37.50% (n = 12) of the non-respondents hold Extension Agent I rank, 21.88% (n = 7) hold Extension Agent II rank, and so on. Table

10 also shows the Extension districts where the non-respondents are located. Note

that 37.50% of the non-respondents are from the "South" Extension district.


Table 10.
Characteristics of Non-respondents (N = 32)

Characteristic N %N

Gender

Male 13 40.62
Female 19 59.38

Rank

EA I 12 37.50
EA II 7 21.88
EA III 7 21.88
EA IV 6 18.75

District

Northwest 4 12.50
Northeast 7 21.88
Central 6 18.75
South Central 3 9.37
South 12 37.50
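The frequency-and-percent breakdowns reported in Table 10 and similar tables can be generated with a short script. The following is a minimal sketch, assuming pandas is available; the rank list is reconstructed from the counts in Table 10 rather than taken from the study's raw records.

```python
# Minimal sketch of the frequency-and-percent tabulation behind Table 10,
# reconstructed from the reported rank counts (N = 32 non-respondents).
import pandas as pd

ranks = ["EA I"] * 12 + ["EA II"] * 7 + ["EA III"] * 7 + ["EA IV"] * 6

counts = pd.Series(ranks).value_counts()           # N per rank
percents = (counts / counts.sum() * 100).round(2)  # %N per rank

print(pd.DataFrame({"N": counts, "%N": percents}))
```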




Self-rated Computer Skills and Demographics

Agents were asked to rate their overall computer skills on a scale from "poor"

to "excellent." As shown in Table 11 on the following page, 84.95% (n = 254) of the

respondents reported their skills to be either "average" and "above average." Table









12 reports this information by gender, and shows that 85.37% (n = 107) of the males

rated their skills as being either "average" or "above average," and 84.97% (n = 147)

of the females rate their skills as being either "average" or "above average."


Table 11.
Self-rated Overall Computer Skill for All Respondents

Skill Rating N %N

Very Poor 3 1.00

Poor 18 6.02

Average 129 43.14

Above Average 125 41.81

Excellent 22 7.36

No Response 2 0.67



Table 12.
Self-rated Overall Computer Skill by Gender

Skill Rating Male Female

N %N N %N

Very Poor 1 0.79 2 1.16

Poor 8 6.35 10 5.78

Average 55 43.65 74 42.77

Above Average 52 41.27 73 42.20

Excellent 10 7.94 12 6.94

No Response 0 0 2 1.16

A t-test for significant differences between mean self-rated skill level for

males and mean self-rated skill level for females was performed, and the results were

not significant. Table 13 below provides the results of this test.

Table 13.
T-test for Significant Difference between Male and Female Mean Self-rated Overall
Computer Skill

Variable Male Female t Value Pr

Mean SD Mean SD

Self-rated Computer Skill 3.49 0.76 3.48 0.76 0.07 0.93


Analysis was then conducted on self-rated computer skills by age. Table 14

shows that, across the five age groups, most agents responded that they have

"average" to "above average" overall computer skills. Note the study's three "very

poor" responses stem from the 61-70 age group and the 41-50 age group.

Table 14.
Self-rated Overall Computer Skill by Age

Skill Rating Age 20-30 Age 31-40 Age 41-50 Age 51-60 Age 61-70

N %N N %N N %N N %N N %N

Very Poor 0 0 0 0 1 1.03 0 0 2 10.53

Poor 1 2.86 1 1.96 5 5.15 10 10.75 0 0

Average 11 31.43 18 35.29 44 45.36 43 46.24 12 63.16

Above Av. 20 57.14 24 47.06 41 42.27 34 36.56 4 21.05

Excellent 3 8.57 6 11.76 6 6.19 6 6.45 1 5.26

No Resp. 0 0 2 3.92 0 0 0 0 0 0

An analysis of variance was conducted to determine whether differences in mean self-rated computer skill scores existed between the five age groups. As Table 15 reports, significant differences were found. A further comparison of the means was performed using Duncan's multiple range test. The results of this procedure, displayed in Table 16, show that the mean self-rated computer skill of the 61-70 age group differs significantly from those of the three youngest age groups, while the 51-60 age group does not differ significantly from any other group.

Table 15.
Analysis of Variance for Self-rated Overall Computer Skill (N = 293)

Source DF SS MS F Value Pr > F

Model 4 8.046 2.011 3.59 0.0070*

Error 288 161.182 0.559

C Total 292 169.228

*Significant at the α = .05 level
DEP MEAN = 3.491 ROOT MSE = 0.748 R-Square = 0.047 C.V. = 21.426

Table 16.
Results of Duncan's Test for Comparing Means - Independent Variable is Age, Dependent Variable is Self-rated Overall Computer Skill

Levels of the Independent Variable N Mean Duncan Grouping

Age Group 20-30 35 3.714 A

Age Group 31-40 49 3.714 A

Age Group 41-50 97 3.474 A

Age Group 51-60 93 3.387 AB

Age Group 61-70 19 3.105 B

Note that groups with the same "Duncan Grouping" letter designation are not significantly different.
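The analysis-of-variance and mean-comparison steps summarized in Tables 15 and 16 can be sketched as follows. The group vectors are hypothetical skill scores, not the study's data, and because Duncan's multiple range test is not implemented in the common Python statistics libraries, Tukey's HSD is substituted here as a comparable post-hoc procedure.

```python
# Minimal sketch of a one-way ANOVA (Table 15) followed by a post-hoc
# comparison of group means (Table 16). Data are hypothetical; Tukey's HSD
# stands in for Duncan's multiple range test, which scipy/statsmodels lack.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

g_young = [4, 4, 3, 5, 4, 4, 3]   # hypothetical scores, e.g., age 20-30
g_mid = [3, 4, 4, 3, 4, 3, 4]     # hypothetical scores, e.g., age 41-50
g_senior = [3, 3, 4, 3, 2, 3, 3]  # hypothetical scores, e.g., age 61-70

# Omnibus F-test: do any group means differ?
f_stat, p_value = stats.f_oneway(g_young, g_mid, g_senior)
print(f"F = {f_stat:.2f}, Pr > F = {p_value:.4f}")

# Post-hoc step: which pairs of group means differ?
scores = np.array(g_young + g_mid + g_senior)
groups = ["20-30"] * 7 + ["41-50"] * 7 + ["61-70"] * 7
print(pairwise_tukeyhsd(scores, groups, alpha=0.05))

# In the SAS-style summary of Table 15, R-Square = Model SS / Total SS and
# C.V. = 100 * ROOT MSE / DEP MEAN (here 100 * 0.748 / 3.491 = 21.4).
```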

Agents' major area of programmatic activity (Agriculture, 4-H, Marine, Other, and Family, Youth and Community Services (FYCS)) was then considered in

the analysis of self-rated computer skills. Table 17 shows that the majority of agents

across all program areas self-rated their overall computer skills as "average" to

"above average." Note that 2 of the study's 3 "very poor" responses stem from the

FYCS program area, and that program areas Agriculture and 4-H had 15 "excellent"

responses between them.

Table 17.
Self-rated Overall Computer Skill by Program Area

Skill Rating Agriculture 4-H Marine FYCS Other

N %N N %N N %N N %N N %N

Very Poor 1 0.74 0 0.00 0 0.00 2 2.33 0 0.00

Poor 6 4.44 2 4.00 1 8.33 8 9.30 1 6.25

Average 59 43.70 20 40.00 4 33.33 41 47.67 5 31.25

Above Av. 58 42.96 22 44.00 7 58.33 30 34.88 8 50.00

Excellent 11 8.15 4 8.00 0 0.00 5 5.81 2 12.50

No Resp. 0 0.00 2 4.00 0 0.00 0 0.00 0 0.00


An analysis of variance was conducted to determine whether differences in mean self-rated overall computer skill scores existed between the five programmatic areas. As Table 18 on the following page reports, no significant differences were found. Duncan's multiple range test for comparison of means was then performed. Table 19 (Pg. 71)

shows that there were no significant differences between mean self-rated overall

computer skills for the program areas.

Table 18.
Analysis of Variance for Self-rated Overall Computer Skill (N = 297) - Independent Variable is Program Area, Dependent Variable is Self-rated Overall Computer Skill

Source DF SS MS F Value Pr > F

Model 4 3.620 0.905 1.57 0.1829

Error 292 168.587 0.577

C Total 296 172.208

DEP MEAN = 3.488 ROOT MSE = 0.759 R-Square = 0.021 C.V. = 21.783

Table 19.
Results of Duncan's Test for Comparing Means - Independent Variable is Program Area, Dependent Variable is Self-rated Overall Computer Skill

Levels of the Independent Variable N Mean Duncan Grouping

Program Area Other 16 3.687 A

Program Area 4-H 48 3.583 A

Program Area Agriculture 135 3.533 A

Program Area Marine 12 3.500 A

Program Area FYCS 86 3.325 A


Computer Usage and Demographics

As is reported in Table 20 on the following page, 113 agents (37.79%)

responded that they use their computers, both at home and at work, over 20 hours a

week. Another 78 agents (26.09%) reported computer use of between 16 and 20 hours.

Table 21 on the following page shows weekly computer use by gender. A t-test for

significant differences in mean hours of weekly computer use between the genders

was performed, and the results were not significant. Table 22 on the following page

provides the results of this test.


Table 20.
Hours of Computer Use per Week for All Respondents

Level of Use N %N

1-5 Hours/week 18 6.02

6-10 Hours/week 44 14.72

11-15 Hours/week 46 15.38

16-20 Hours/week 78 26.09

20+ Hours/week 113 37.79




Table 21.
Hours of Computer Use per Week by Gender

Level of Use Male Female

N %N N %N

1-5 Hours 10 7.94 8 4.62

6-10 Hours 18 14.29 26 15.03

11-15 Hrs. 20 15.87 26 15.03

16-20 Hrs. 32 25.40 46 26.59

20+ Hours 46 36.51 67 38.73

Table 22.
T-test for Significant Difference between Male and Female Mean Hours of Weekly Computer Use

Variable Male Female t Value Pr

Mean SD Mean SD

Hours of Weekly Use 4.68 1.31 4.79 1.23 0.78 0.43


Table 23 examines weekly computer use by age, and shows that 42.86% (n =

15) of the 20-30 age group reported spending over 20 hours a week, as did 21 agents

(41.18%) in the 31-40 age group, 35 agents (36.08%) in the 41-50 age group, and 37

(39.78%) agents in the 51-60 age group. Only 4 agents (21.05%) in the 61-70 age

group reported 20+ hours a week of computer use, and 26.32% (n = 5) of this group

reported being on the computer 1-5 hours a week.


Table 23.
Hours of Computer Use per Week by Age

Level Age 20-30 Age 31-40 Age 41-50 Age 51-60 Age 61-70

N %N N %N N %N N %N N %N

1-5 Hours 0 0 1 1.96 4 4.12 7 7.53 5 26.32

6-10 Hours 6 17.14 9 17.65 13 13.40 13 13.98 3 15.79

11-15 Hrs. 4 11.43 10 19.61 15 15.46 12 12.90 4 21.05

16-20 Hrs. 10 28.57 10 19.61 30 30.93 24 25.91 3 15.79

20+ Hours 15 42.86 21 41.18 35 36.08 37 39.78 4 21.05

An analysis of variance was conducted to determine if significant differences in mean hours of weekly computer use existed between the five age groups. As Table 24 reports, significant differences were found. Duncan's multiple range test for comparison of means was then performed. As reported in Table 25, the mean hours of weekly computer use for the 61-70 age group is significantly less than that of the younger groups.


Table 24.
Analysis of Variance for Hours of Weekly Computer Use (N = 295) - Independent Variable is Age Group

Source DF SS MS F Value Pr > F

Model 4 16.171 4.042 2.59 0.0368*

Error 290 452.255 1.559

C Total 294 468.427

*Significant at the α = .05 level
DEP MEAN = 4.755 ROOT MSE = 1.248 R-Square = 0.034 C.V. = 26.257

Table 25.
Results of Duncan's Test for Comparing Means - Independent Variable is Age Group, Dependent Variable is Hours of Weekly Computer Use

Levels of the Independent Variable N Mean Duncan Grouping

Age 20-30 35 4.97 A

Age 41-50 97 4.81 A

Age 31-40 51 4.80 A

Age 51-60 93 4.76 A

Age 61-70 19 3.89 B

Table 26 shows hours of weekly computer use differentiated by program area. Sixty-one Agriculture agents (45.19%) and 21 4-H agents (42.00%) reported using the computer over 20 hours a week. The percentages of Marine and FYCS agents using the computer over 20 hours a week are lower.

An analysis of variance was conducted to determine if significant differences in mean weekly computer use existed between the five programmatic areas. As Table 27 reports, no significant differences were found.

Table 26.
Hours of Weekly Computer Use by Program Area

Level Agriculture 4-H Marine FYCS Other

N %N N %N N %N N %N N %N

1-5 Hours 9 6.67 1 2.00 1 8.33 6 6.98 1 6.25

6-10 Hours 22 16.30 6 12.00 2 16.67 14 16.28 0 0.00

11-15 Hrs. 14 10.37 5 10.00 4 33.33 19 22.09 4 25.00

16-20 Hrs. 29 21.48 17 34.00 2 16.67 25 29.07 5 31.25

20+ Hours 61 45.19 21 42.00 3 25.00 22 25.58 6 37.50



Table 27.
Analysis of Variance for Hours of Weekly Computer Use (N = 299)

Source DF SS MS F Value Pr > F

Model 4 12.369 3.092 1.95 0.1019

Error 294 465.817 1.584

C Total 298 478.187

DEP MEAN = 4.749 ROOT MSE = 1.258 R-Square = 0.025 C.V. = 26.504

Duncan's multiple range test for comparison of means was then performed. Table 28 shows that no significant differences in mean hours of weekly computer use exist between the different program areas.

Table 28.
Results of Duncan's Test for Comparing Means - Independent Variable is Program Area, Dependent Variable is Hours of Weekly Computer Use

Levels of the Independent Variable N Mean Duncan Grouping

Program Area 4-H 50 5.02 A

Program Area Other 16 4.93 A

Program Area Agriculture 135 4.82 A

Program Area FYCS 86 4.50 A

Program Area Marine 12 4.33 A



Source of Computer Knowledge and Demographics

Agents were asked to respond "yes" or "no" to a list of independent questions

about their source of computer knowledge. This information is detailed in Table 29

on the following page, and shows that many agents report learning their computer

skills at work. In addition to the sources of knowledge listed in the instrument, agents

were also given the opportunity to fill in another source of knowledge on their own.

Eighteen responses were recorded in this manner, with the following being salient

examples: "Computer books," "IFAS Help Desk," "military," "courses at computer

shops," and "software manuals."

Table 29.
Number and Percent of Agents Responding "Yes" to Questions about Where Most Computer Knowledge was Learned (Questions asked Independently of Each Other)

Question N %N

Self-taught at home 196 65.55

Learned in college or high school 95 31.77

Self-taught at work 264 88.29

Learned at work through in-service training 200 66.89

Learned from family or friends outside of work 151 50.50

Learned from co-workers at work 229 76.59
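Because the items in Table 29 were asked independently, each percentage is simply the share of the full respondent pool (n = 299) answering "yes" to that item. The brief sketch below, using the counts reported in Table 29, reproduces the table's percentages.

```python
# Minimal sketch of the independent yes/no tabulation in Table 29. Each %N is
# the share of the full respondent pool answering "yes"; counts are taken
# directly from the table.
N_RESPONDENTS = 299

yes_counts = {
    "Self-taught at home": 196,
    "Learned in college or high school": 95,
    "Self-taught at work": 264,
    "Learned at work through in-service training": 200,
    "Learned from family or friends outside of work": 151,
    "Learned from co-workers at work": 229,
}

for source, n in yes_counts.items():
    print(f"{source}: N = {n}, %N = {100 * n / N_RESPONDENTS:.2f}")
```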



Recalling that the study's respondents are 42.14% male and 57.86% female,

Table 30 shows a similar gender distribution for most sources of computer

Table 30.
Number and Percent of Agents, by Gender, Responding "Yes" to Questions about Where Most Computer Knowledge was Learned (Responses are Independent)

Source of Knowledge Male Female

N %N N %N

Self-taught at home 85 43.37 111 56.63

Learned in school 27 28.42 68 71.58

Self-taught at work 112 42.42 152 57.58

In-service Training 90 45.00 110 55.00

Family, etc. 56 37.09 95 62.91

Co-workers 94 41.05 135 58.95

knowledge. Two exceptions, however, are apparent: Females, in greater percentages

than males, indicate that they learned their computer knowledge in school. This

might be related to the previous finding that female agents are significantly younger

than male agents, and thus may have had more exposure to the technology in the school

setting. The second exception between the genders in source of computer knowledge

is that more females indicate they learned from family or friends outside of work than

did males.

An examination of Table 31, which details source of computer knowledge by age, reveals that agents in the 41-60 age range report more often that they acquired their knowledge at home than do agents in either the 20-40 or 61-70 age groups. The youngest age group of agents, those 20-30 years of age, indicated that they acquired their knowledge "in high school or college" more often than any other

Table 31.
Number and Percent of Agents, by Age, Responding "Yes" to Questions about Where Most Computer Knowledge was Learned (Responses are Independent)

Source Age 20-30 Age 31-40 Age 41-50 Age 51-60 Age 61-70

N %N N %N N %N N %N N %N

Home 23 11.98 34 17.70 66 34.38 55 28.65 14 7.29

School 32 33.68 26 27.37 21 22.11 13 13.68 3 3.16

Work 28 10.77 50 19.23 84 32.31 83 31.92 15 5.77

In-service 13 6.53 31 15.58 69 34.67 70 35.18 16 8.04

Family, etc. 21 14.09 32 21.48 47 31.54 41 27.52 8 5.37

Co-workers 18 7.96 41 18.14 77 34.07 75 33.19 15 6.64

age range. On the other hand, agents 41-60 years of age responded most often that their source of computer knowledge was self-taught at work. In-service training is the predominant source of computer knowledge for agents 41-60 years of age. This same age group, with much more frequency than other age groups, also responded that co-workers are a source of their computer knowledge.

Table 32 shows computer knowledge reported by program area. Large

percentages of "yes" responses were recorded across the program areas for the "self-taught at work" source of computer knowledge. With the exception of the Marine

program area, agents across the program areas frequently responded that co-workers

were a source of computer knowledge.

Table 32.
Number and Percent of Agents, by Program Area, Responding "Yes" to Questions about Where Most Computer Knowledge was Learned (Responses are Independent)

Source Agriculture 4-H Marine FYCS Other

N %N N %N N %N N %N N %N

Home 92 68.15 33 66.00 6 50.00 55 63.95 10 62.50

School 34 25.19 22 44.00 6 50.00 26 30.23 7 43.75

Work 122 90.37 42 84.00 8 66.67 78 90.70 14 87.50

In-service 94 69.63 29 58.00 7 58.33 63 73.26 7 43.75

Family, etc. 60 44.44 26 52.00 5 41.67 52 60.47 8 50.00

Co-workers 103 76.30 34 68.00 7 58.33 74 86.05 11 68.75



Agents were asked if they had taken any computer courses since the year

2000. As evidenced in Table 33 on the following page, the majority (52.17%, n = 156) responded "no." Table 34 shows the results of a follow-up question that asked the "principal reason" for not taking a computer course since 2000. Seventy agents (23.41%) indicated "lack of time" as the reason, and 29 agents (9.70%) indicated too few in-service training (IST) days.

Table 33.
Agents who Have/Have not Taken Computer Courses Since 2000

Response N %N

Yes 141 47.16

No 156 52.17

No response 2 0.67


Table 34.
Reason for not Taking a Computer Course Since 2000

Response N %N

Lack of Time 70 23.41

Lack of Access 14 4.68

Too Expensive 3 1.00

No Incentive 9 3.01

Not Available 11 3.68

Too few IST Days 29 9.70

Other 35 11.71

No response 128 42.81



By gender, 61.90% (n = 78) of the males responded that they had not taken a computer course since 2000, whereas 53.76% (n = 93) of the females responded that they had taken a computer course since 2000. This information is provided in Table 35 below. Table 36 shows that 30.16% (n = 38) of the males and 18.50% (n = 32) of the females responded that "lack of time" was the principal reason for not taking a computer


Table 35.
Agents, by Gender, who Have/Have not Taken Computer Courses Since 2000

Response Male Female

N %N N %N

Yes 48 38.10 93 53.76

No 78 61.90 78 45.09

No response 0 0 2 1.16




Table 36.
Reason, by Gender, for not Taking a Computer Course Since 2000

Response Male Female

N %N N %N

Lack of Time 38 30.16 32 18.50

Lack of Access 5 3.97 9 5.20

Too Expensive 1 0.79 2 1.16

No Incentive 5 3.97 4 2.31

Not Available 5 3.97 6 3.47

Too few IST Days 11 8.73 18 10.40

Other 17 13.49 18 10.40

No response 44 34.92 84 48.55

course. Agents were also given the opportunity to list other reasons for not taking

computer courses. Among the 38 responses received this way were: "Not offering

what I need," "no mid level to high end application training offered," "scheduling

conflicts," "usually too generic and basic," "signed up for classes but they were

cancelled," and "nothing new I'm interested in."

Table 37 presents an analysis, by age group, of agents who have not taken a

computer course since 2000. Agents 61-70 years of age show the highest incidence

(57.89%) of not taking a computer class since the year 2000, whereas agents 31-40 years of age (54.90%) are the most likely to have taken a course.

Table 38 on the following page shows that across all age groups, "lack of

time" is the most often given response for not taking a computer course. Table 39

(Pg. 83) and Table 40 (Pg. 84) examine the computer course questions by program

area, again showing "lack of time" to be the most frequently given reason for not

taking a course.

Table 37.
Agents, by Age Group, who Have/Have not Taken Computer Courses Since 2000

Response Age 20-30 Age 31-40 Age 41-50 Age 51-60 Age 61-70

N %N N %N N %N N %N N %N

Yes 15 42.86 28 54.90 45 46.39 44 47.31 8 42.11

No 20 57.14 22 43.14 52 53.61 49 52.69 11 57.89

No Resp. 0 0 1 1.96 0 0 0 0 0 0

Table 38.
Reason, by Age Group, for not Taking a Computer Course Since 2000

Response Age 20-30 Age 31-40 Age 41-50 Age 51-60 Age 61-70

N %N N %N N %N N %N N %N

LO Time 8 22.86 5 9.80 25 25.77 24 25.81 6 31.58

LO Access 0 0 2 3.92 4 4.12 5 5.38 2 10.53

Expense 0 0 1 1.96 1 1.03 1 1.08 0 0

Incentive 3 8.57 1 1.96 1 1.03 4 4.30 0 0

Availability 2 5.71 2 3.92 5 5.15 2 2.15 0 0

IST Days 3 8.57 7 13.73 10 10.31 9 9.68 0 0

Other 3 8.57 5 9.80 12 12.37 11 11.83 4 21.05

No Resp. 16 45.71 28 54.90 39 40.21 37 39.78 7 36.84


Table 39.
Agents, by Program Area, who Have/Have not Taken a Computer Course Since 2000

Response Agriculture 4-H Marine FYCS Other

N %N N %N N %N N %N N %N

Yes 53 39.26 22 44.00 5 41.67 52 60.47 9 56.25

No 82 60.74 27 54.00 7 58.33 33 38.37 7 43.75

No Resp. 0 0.00 1 2.00 0 0.00 1 1.16 0 0.00

Table 40.
Reason, by Program Area, for not taking a Computer Course Since 2000

Response Agriculture 4-H Marine FYCS Other

N %N N %N N %N N %N N %N

LO Time 37 27.41 10 20.00 4 33.33 16 18.60 3 18.75

LO Access 5 3.70 3 6.00 1 8.33 4 4.65 1 6.25

Expense 2 1.48 1 2.00 0 0.00 0 0.00 0 0.00

Incentive 4 2.96 2 4.00 0 0.00 0 0.00 2 12.50

Availability 8 5.93 1 2.00 0 0.00 2 2.33 0 0.00

IST Days 11 8.15 6 12.00 1 8.33 11 12.79 0 0.00

Other 15 11.11 6 12.00 1 8.33 11 12.79 2 12.50

No Resp. 53 39.26 21 42.00 4 33.33 42 48.84 8 50.00



Two hundred ninety-five agents responded to the question "If you have a question about a computer-related issue, where are you most likely to seek an answer?" The majority, 52.17% (n = 156), indicated that they turned to "a colleague or support staff in the office." The second most frequent response, at 24.41% (n = 73), was "from your district's computer support personnel." Table 41 on the following page shows this information. In addition to the supplied responses, thirty-three agents indicated "other" in answer to this question, and voluntarily offered various responses

such as: "County computer support personnel," "IFAS help," "Gainesville IT or

county IT," "office support staff or county computer support personnel," "from

district computer personnel," or "my friend the computer geek."

Table 41.
Where Agents Seek Answer to Computer-related Questions

Response N %N

From a colleague or support staff in the office 156 52.17

From a colleague or support staff in another county 6 2.01

From your district's computer support personnel 73 24.41

You find the answer on your own 27 9.03

Other 33 11.04

No response 4 1.34



Demographic Snapshots by Age

Age group 20-30

This group of agents accounted for 11.71% of the respondents (n = 35). The

majority of this group, 88.57%, reported average to above average computer skills.

Insofar as computer use is concerned, 42.86% of this group spends 20+ hours a week

on the computer, and another 40% spends between 11-20 hours a week on the

computer. Of all the sources of computer knowledge that the survey inquired about,

the most frequent response by this group was "learned in college or high school."

More than half, 57.14%, have not taken a computer course since 2000, with 22.86%

indicating that lack of time was the reason. When asked where they sought answers

for their computer-related questions, 40.00% of this age group responded "from a

colleague or support staff in the office."

Age group 31-40

This group of agents accounted for 17.06% of the respondents (n = 51). The

majority of this group, 82.35%, reported average to above average computer skills.

Insofar as computer use is concerned, 41.18% of this group reported spending 20+

hours a week on the computer, and another 39.22% spends between 11-20 hours a

week on the computer. Of all the sources of computer knowledge that the survey

inquired about, the most frequent response by this group was "self-taught at work."

Of the respondents in this age group, 43.14% have not taken a computer course since

2000, with 13.73% indicating that too few in-service training days was the reason.

The majority of this age group, 50.98%, indicated that they sought answers for

computer-related issues from "a colleague or support staff in the office."

Age group 41-50

This group of agents accounted for 32.44% of the respondents (n = 97), and

was the largest of the survey. The majority of this group, 87.63%, reported average

to above average computer skills. Insofar as computer use is concerned, 36.08% of

this group reports spending 20+ hours a week on the computer, and another 46.39%

spends between 11-20 hours a week on the computer. Of all the sources of computer

knowledge that the survey inquired about, the most frequent response by this group

was "self-taught at work." Of the respondents in this age group, 53.61% have not

taken a computer course since 2000, and 25.77% indicated that lack of time was the

reason. This group responded 56.70% of the time that they sought answers to

computer-related issues from "a colleague or support staff in the office."

Age group 51-60

This group of agents accounted for 31.10% of the respondents (n = 93), and

was the second largest of the survey. The majority of this group, 82.80%, reported

average to above average computer skills. Insofar as computer use is concerned,

39.78% of this group reports spending 20+ hours a week on the computer, and

another 38.81% spends between 11-20 hours a week on the computer. Of all the

sources of computer knowledge that the survey inquired about, the most frequent

response by this group was "self-taught at work." Of the respondents in this age

group, 52.69% have not taken a computer course since 2000, and 25.81% indicated

that lack of time was the reason. Responding to the question about where they sought

answers to computer-related issues, this group responded 51.61% of the time "from a

colleague or support staff in the office."

Age group 61-70

This group of agents accounted for 6.35% of the respondents (n = 19), and

was the most senior of the survey. The majority of this group, 63.16%, reported

average computer skills. Insofar as computer use is concerned, 21.05% of this group

reported spending 20+ hours a week on the computer, and another 36.84% spends

between 11-20 hours a week on the computer. Of all the sources of computer

knowledge that the survey inquired about, the most frequent response by this group

was "learned at work through in-service training, etc." Of the respondents in this age

group, 57.89% have not taken a computer course since 2000, and 31.58% indicated

that lack of time was the reason. The majority of 61-70 year old county agents,

63.16%, seek answers to computer-related issues from colleagues or support staff in

the office.

Objective 2

Determine How County Extension Agents are Using Information Technology on the
Job in terms of Hardware Use, and the Nature and Frequency of Use of Specific
Types of Software

Connectivity, Hardware and Operating System Use, etc.

Ninety-five percent (n = 285) of agents responding to the question as to whether they have a computer on their desk at their office said "yes." Ninety-eight percent (n = 293) of the agents responding to the question asking what sort of computer they had on their desk indicated an IBM PC clone running the Windows

operating system. Ninety-nine percent (n = 296) of agents responding to whether the

computer on their desk was connected to the Internet, indicated "yes." The majority

of agents, 68.90%, responded "yes" to the question about whether they did office

work on their home computer. When asked if they used a laptop computer, 75.92%

(n = 227) of the responses to the question were "yes," with males, 77.78%, and

females, 74.57%, being almost equally distributed. When asked if they used a Palm

Pilot, I-Paq, or some such similar device, 34.78% (n = 104) of the respondents to the

question indicated "yes," with almost equal responses from males (36.51%) and females (33.53%).

Patterns of Use of Electronic Mail

Asked if they use e-mail, 100% (n = 299) of the respondents answered "yes."

Agents were then asked to give their average daily use of e-mail. As is shown in

Table 42 on the following page, 26.42% (n = 79) of the agents responded "31-45

minutes a day," and 25.08% (n = 75) responded "46-60 minutes a day." Asked if

they use e-mail to communicate with clientele, a large majority, 91.97% (n = 275), of

the agents said "yes." A follow-up to this question asked agents how often they

Table 42.
Agents' Average Daily Use of E-mail

Average Daily Use N %N

0-15 minutes a day 16 5.35

16-30 minutes a day 73 24.41

31-45 minutes a day 79 26.42

46-60 minutes a day 75 25.08

Over 60 minutes a day 56 18.73



e-mailed clientele during the month. Seventy-five respondents (25.08%) indicated

"more than 20 times a month," and another 75 (25.08%) responded "1-5 times a

month." This information is provided in Table 43. Agents were also asked to

estimate the number of clientele they reached via e-mail during a typical month. As

Table 43.
How Often Agents E-mail Clientele during the Month

Number of Times per Month N %N

1-5 times a month 75 25.08

6-10 times a month 58 19.40

11-15 times a month 36 12.04

16-20 times a month 36 12.04

More than 20 times a month 75 25.08

Not applicable 17 5.69

No response 2 0.67

Table 44 reports, 56.86% (n = 170) of the respondents indicate they contact between

1 and 25 clientele per month via e-mail, and 29 agents (9.70%) reported contacting

over 100 clientele per month via e-mail.

Table 44.
Estimated Number of Clientele Reached via E-mail during a Typical Month

Est. Number of Clientele Reached N %N

1-25 clientele per month 170 56.86

26-50 clientele per month 47 15.72

51-75 clientele per month 14 4.68

76-100 clientele per month 19 6.35

More than 100 clientele per month 29 9.70

Not applicable 16 5.35

No response 4 1.34



Patterns of Use of Word Processing Software

Word processing software is used by 96.66% (n = 289) of the responding

county agents. Agents were asked to give their average daily use of word processing

software. Table 45 on the following page shows that 25.08% (n = 75) of the

respondents indicated they use word processing software more than 90 minutes a day.

Another 23.41% (n = 70) of the respondents indicate using word processing software

between 46-60 minutes a day. When asked which word processing software program

they most often used, 46.49% (n = 139) of the respondents to this question indicated

Corel WordPerfect, 44.48% (n = 133) indicated MS Word, and 5.35% (n = 16)

indicated "other." Eleven agents (3.68%) did not respond to the question.