Citation
Assessing Internationalization Efforts

Material Information

Title:
Assessing Internationalization Efforts Utilizing Item Response Theory to Validate Intercultural Competency and Global Awareness in Postsecondary Undergraduate Students
Creator:
Wilson, Timothy J
Place of Publication:
[Gainesville, Fla.]
Florida
Publisher:
University of Florida
Publication Date:
Language:
English
Physical Description:
1 online resource (221 p.)

Thesis/Dissertation Information

Degree:
Doctorate (Ph.D.)
Degree Grantor:
University of Florida
Degree Disciplines:
Higher Education Administration
Human Development and Organizational Studies in Education
Committee Chair:
CAMPBELL,DALE FRANKLIN
Committee Co-Chair:
MILLER,DAVID
Committee Members:
BROPHY,TIMOTHY S
SAMMONS,DAVID J
Graduation Date:
8/9/2014

Subjects

Subjects / Keywords:
College students ( jstor )
Communication skills ( jstor )
Critical thinking ( jstor )
Cultural values ( jstor )
International students ( jstor )
Item response theory ( jstor )
Modeling ( jstor )
Outcomes of education ( jstor )
Students ( jstor )
Undergraduate students ( jstor )
Human Development and Organizational Studies in Education -- Dissertations, Academic -- UF
instrument -- intercultural -- internationalization -- irt
City of Gainesville ( local )
Genre:
bibliography ( marcgt )
theses ( marcgt )
government publication (state, provincial, territorial, dependent) ( marcgt )
born-digital ( sobekcm )
Electronic Thesis or Dissertation
Higher Education Administration thesis, Ph.D.

Notes

Abstract:
This study investigated the development and validation of an internationalization assessment to understand postsecondary undergraduate students' self-perceptions of global awareness and intercultural competency. The analysis consisted of a two-step process of instrument development and instrument validation, more specifically item validation. Utilizing classical test theory procedures and survey design principles, specific survey items gauging student perceptions of global and intercultural issues were designed, adapted, and ultimately constructed within a final internationalization assessment. The data source for the instrument development phase consisted of undergraduate students enrolled in education and political science courses at a large, research-intensive postsecondary institution located in the southeastern United States. In the second phase of analysis, item fit and validation were conducted through piloting the instrument with postsecondary undergraduate students enrolled in general education courses consisting of more than 100 students. Based upon the theoretical framework of Samejima's (1969) graded response model for polytomous item response theory items, the piloting of the 26 internationalization survey items addressing critical thinking and communication skills in a global context was analyzed from responses of over 800 postsecondary students. ( en )
General Note:
In the series University of Florida Digital Collections.
General Note:
Includes vita.
Bibliography:
Includes bibliographical references.
Source of Description:
Description based on online resource; title from PDF title page.
Source of Description:
This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Thesis:
Thesis (Ph.D.)--University of Florida, 2014.
Local:
Adviser: CAMPBELL,DALE FRANKLIN.
Local:
Co-adviser: MILLER,DAVID.
Electronic Access:
RESTRICTED TO UF STUDENTS, STAFF, FACULTY, AND ON-CAMPUS USE UNTIL 2016-08-31
Statement of Responsibility:
by Timothy J Wilson.

Record Information

Source Institution:
UFRGP
Rights Management:
Applicable rights reserved.
Embargo Date:
8/31/2016
Resource Identifier:
968131656 ( OCLC )
Classification:
LD1780 2014 ( lcc )

Full Text

PAGE 1

ASSESSING INTERNATIONALIZATION EFFORTS: UTILIZING ITEM RESPONSE THEORY TO VALIDATE INTERCULTURAL COMPETENCY AND GLOBAL AWARENESS IN POSTSECONDARY UNDERGRADUATE STUDENTS

By

TIMOTHY J. WILSON

A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2014

PAGE 2

© 2014 Timothy J. Wilson

PAGE 3

To my family, friends, and fiancée who have supported me through this long journey. Also to my parents: though you are not here with me physically, I know you have been with me all along.

PAGE 4

ACKNOWLEDGMENTS

First and foremost, I want to thank God, who has granted me wisdom and perseverance throughout the dissertation process. I thank my chair, Dr. Dale Campbell, for his consistent support throughout the experience of my graduate program. I want to thank Dr. David Miller, who has been instrumental in my development in research and for allowing me to be part of the taskforce in developing the Quality Enhancement Plan. I also want to express my gratitude to my family and friends for their prayers and support. Specifically, I want to thank my fiancée, Rosana, who has not only supported me but also held me accountable along the way. In completing this doctoral program, I appreciate the support of my friends who have listened, consoled, and directed me through this process.

PAGE 5

TABLE OF CONTENTS

ACKNOWLEDGMENTS .... 4
LIST OF TABLES .... 9
LIST OF FIGURES .... 10
LIST OF ABBREVIATIONS .... 11
ABSTRACT .... 12

1 INTRODUCTION .... 14
    Purpose of the Study .... 21
    Research Questions .... 22
    Rationale for Study .... 23
    Scope of Study .... 27
    Significance of Study .... 27
    Summary .... 28

2 LITERATURE REVIEW .... 30
    Definition of Key Terms .... 31
        Internationalization vs. Globalization .... 31
        Global Awareness .... 35
        Intercultural Communication .... 35
        Critical Thinking .... 36
        Communication Skills .... 36
        Assessment .... 36
        Methodological Definitions .... 38
            Tests with dichotomous variables .... 38
            Tests with non-dichotomous variables .... 38
            Construct .... 39
            Scale of measurement .... 39
            Theta/latent ability .... 39
            Threshold parameter/item difficulty parameter .... 39
            Item discrimination parameters (slope parameters) .... 39
            Item difficulty parameter .... 39
            Item characteristic curve (ICC)/Category response curve (CRC) .... 39
            Step functions .... 40
            Adjacent vs. Cumulative Step Approach .... 40
            Adjacent step approach .... 40
            Cumulative step approach .... 40
    Theoretical Framework .... 40

PAGE 6

        Classical Test Theory .... 41
        Item Response Theory .... 44
            Item response theory assumptions .... 47
            Item response theory benefits .... 48
        Item Response Theory Models .... 51
            Dichotomously Scored Item Models .... 51
            Ordered Polytomously Scored Item Models .... 54
            Rating Scale Model (RSM) .... 56
            Partial Credit Model (PCM) .... 57
            Generalized Partial Credit Model (GPCM) .... 58
        Graded Response Model (GRM) .... 58
    Application of Model to Current Study .... 61
    Review of Assessment Instruments .... 61
        Intercultural Development Inventory .... 63
        Global Perspective Inventory .... 65
        Cross-Cultural Adaptability Inventory .... 67
        Behavioral Assessment Scale for Intercultural Communication (BASIC) and Assessment of Intercultural Competence (AIC) .... 68
    Summary .... 71

3 METHODOLOGY .... 75
    Proposed Aims .... 77
    Analytical Methods .... 77
    Item Specification and Instrument Development .... 79
        Item Piloting Phase One .... 81
        Item Piloting Phase Two .... 84
    Graded Response Model Analysis .... 88
    Data Source .... 91
    Data Sample .... 93
    Limitations of Study .... 94
    Summary .... 95

4 RESULTS .... 108
    Model Fit .... 108
    Factor Loadings .... 111
    Instrument Information .... 116
    Item Analysis .... 118
        Item Calibration .... 118
        Item Information .... 119
        Category Response Curves (CRC) .... 121
    Summary .... 123

5 DISCUSSION .... 137
    Summary of Study Contributions .... 137

PAGE 7

    Review of Purpose and Research Questions .... 138
    Summary of Research Findings .... 139
    Item Analysis Recommendations .... 140
        Critical Thinking .... 140
        Communication Skills .... 142
    Instrument Implementation .... 144
    Summary .... 145

6 IMPLICATIONS AND FUTURE RESEARCH .... 149
    Implications for Institutional Practice .... 149
    Future Research .... 154
        Institutional and Student Demographics .... 155
        Methodological Modifications .... 156
    Conclusion .... 158

APPENDIX

A CRITICAL THINKING OPERATIONAL DEFINITION .... 160
B COMMUNICATION OPERATIONAL DEFINITION .... 161
C ITEM SPECIFICATIONS FOR CRITICAL THINKING STUDENT LEARNING OUTCOME .... 162
D ITEM SPECIFICATIONS FOR COMMUNICATION STUDENT LEARNING OUTCOME .... 168
E TEN ANCHOR ITEMS FOR ALTERNATE FORMS .... 183
F ALTERNATE FORM A FOR FIRST PILOTING PHASE .... 184
G ALTERNATE FORM B FOR FIRST PILOTING PHASE .... 188
H ALTERNATE FORM A FOR SECOND PILOTING PHASE .... 192
I ALTERNATE FORM B FOR SECOND PILOTING PHASE .... 196
J FINAL PILOT OF COMMUNICATION SKILL ITEMS (SECOND PHASE) .... 200
K INSTITUTIONAL REVIEW BOARD APPROVAL .... 202
L INFORMED CONSENT FORM .... 206
M ESTIMATED ITEM PARAMETERS FOR CRITICAL THINKING AND COMMUNICATION SKILL ITEMS .... 208

LIST OF REFERENCES .... 209

PAGE 8

BIOGRAPHICAL SKETCH .... 221

PAGE 9

LIST OF TABLES

2-1  Differences Between Classical Test Theory and Item Response Theory .... 73
3-1  Critical Thinking Item Pilot Student Demographics (Phase 1) .... 96
3-2  Eliminated Critical Thinking Items, Alternate Form A (Phase 1) .... 96
3-3  Retained Critical Thinking Item Properties, Alternate Form A (Phase 1) .... 97
3-4  Eliminated Critical Thinking Items, Alternate Form B (Phase 1) .... 97
3-5  Retained Critical Thinking Item Properties, Alternate Form B (Phase 1) .... 98
3-6  Retained Item Discriminations from Critical Thinking Pilot, Phase 1, Alternate Form A & Form B .... 99
3-7  Retained Critical Thinking Item Discrimination Levels, Alternate Form A (Phase 2) .... 100
3-8  Retained Critical Thinking Item Discrimination Levels, Alternate Form B (Phase 2) .... 101
3-9  Communication Item Discrimination Levels, Alternate Form A (Phase 2) .... 102
3-10 Retained Communication Item Discrimination Levels, Alternate Form B (Phase 2) .... 103
3-11 Eliminated Communication Items, Alternate Forms A and B (Phase 2) .... 104
3-12 Retained Communication Item Discrimination for Alternate Forms A and B, Phase 2 .... 105
3-13 Final Internationalization Assessment Items: Critical Thinking & Communication Skills .... 106
3-14 Recoded Variables of Final Internationalization Assessment Items: Critical Thinking & Communication Skills .... 107
4-1  Exploratory Factor Analysis Results .... 126
4-2  Two-Factor Correlations of Instrument Items .... 126
4-3  Critical Thinking Factor Correlations with and without Item 14 (Social) .... 127
4-4  Communication Skills Factor Correlations with and without Item 14 (Social) .... 127
4-5  Highest Information Levels and Corresponding Theta Levels for Instrument Items .... 128

PAGE 10

LIST OF FIGURES

2-1  An Item Characteristic Curve for a Polytomous Item in the Construct of Intercultural Competency .... 74
4-1  Scree Plot of Eigenvalues .... 129
4-2  Critical Thinking Test Information .... 130
4-3  Communication Skills Test Information .... 130
4-4  Communication Skills Test Information Without Item 14 (Social) .... 131
4-5  Information Levels for Item 21 (Enjoylearn) .... 131
4-6  Information Levels for Item 24 (Appdiff) .... 132
4-7  Information Levels for Item 5 (Knownorms) .... 132
4-8  Information Levels for Item 11 (Knowbeliefs) .... 133
4-9  Information Levels for Item 14 (Social) .... 133
4-10 Category Response Curves for Item 12 (Recdec) .... 134
4-11 Category Response Curves for Item 21 (Enjoylearn) .... 134
4-12 Category Response Curves for Item 24 (Appdiff) .... 135
4-13 Category Response Curves for Item 5 (Knownorms) .... 135
4-14 Category Response Curves for Item 11 (Knowbeliefs) .... 136
5-1  Category Response Curves for Item 2 (Multiperspect) .... 147
5-2  Category Response Curves for Item 14 (Social) .... 147
5-3  Information Levels for Item 24 (Appdiff) with Item 14 Eliminated .... 148

PAGE 11

LIST OF ABBREVIATIONS

AIC      Assessment of Intercultural Competence
BASIC    Behavioral Assessment Scale for Intercultural Communications
CRC      Category Response Curve
CTT      Classical Test Theory
GPCM     Generalized Partial Credit Model
GPI      Global Perspectives Inventory
GRM      Graded Response Model
ICC      Item Characteristic Curve
IDI      Intercultural Development Inventory
IRT      Item Response Theory
PCM      Partial Credit Model
RSM      Rating Scale Model

PAGE 12

Abstract of Dissertation Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy

ASSESSING INTERNATIONALIZATION EFFORTS: UTILIZING ITEM RESPONSE THEORY TO VALIDATE INTERCULTURAL COMPETENCY AND GLOBAL AWARENESS IN POSTSECONDARY UNDERGRADUATE STUDENTS

By

Timothy J. Wilson

August 2014

Chair: Dale Campbell
Co-chair: David Miller
Major: Higher Education Administration

This study investigated the development and validation of an internationalization assessment to understand the self-perceptions of global awareness and intercultural competency of postsecondary undergraduate students. The analysis consisted of a two-step process of instrument development and instrument validation, more specifically item validation. Utilizing classical test theory procedures and survey design principles, specific survey items gauging student perceptions of global and intercultural issues were designed, adapted, and ultimately constructed within a final internationalization assessment. The data source for the instrument development phase consisted of undergraduate students enrolled in education and political science courses at a large, research-extensive postsecondary institution located in the southeastern United States, and students participating in a study abroad program. In the second phase of analysis, item fit and validation were conducted through piloting the instrument with postsecondary undergraduate students enrolled in general education courses consisting of more than 100 students. Based upon the theoretical framework of Samejima's (1969) graded response model for polytomous item response theory items, the piloting of the 26 internationalization survey items addressing critical thinking and communication skills in a global context was analyzed from responses of over 800 postsecondary students.
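The graded response model named in the abstract can be sketched numerically. In Samejima's GRM, each polytomous item has a discrimination (slope) parameter a and ordered threshold parameters b_1 < b_2 < ...; the probability of responding in category k or above follows a two-parameter logistic curve of the latent trait theta, and individual category probabilities are differences of adjacent cumulative curves. The following is a minimal pure-Python sketch; the function name and the parameter values are illustrative only, not estimates from this instrument.

```python
import math


def grm_category_probs(theta, a, thresholds):
    """Samejima's graded response model: probability of each ordered
    response category for a respondent at latent trait level theta.

    theta      -- latent trait (ability) level
    a          -- item discrimination (slope) parameter
    thresholds -- ordered category boundary parameters b_1 < b_2 < ...
    Returns a list of K+1 category probabilities (K = len(thresholds)).
    """
    # Cumulative curves P*(X >= k): 1 below the lowest boundary,
    # a 2PL logistic at each boundary, 0 above the highest.
    cumulative = [1.0]
    for b in thresholds:
        cumulative.append(1.0 / (1.0 + math.exp(-a * (theta - b))))
    cumulative.append(0.0)
    # Each category probability is the difference of adjacent curves.
    return [cumulative[k] - cumulative[k + 1]
            for k in range(len(thresholds) + 1)]


# Illustrative 5-category Likert-type item with hypothetical parameters.
probs = grm_category_probs(theta=0.0, a=1.5, thresholds=[-2.0, -0.5, 0.5, 2.0])
assert abs(sum(probs) - 1.0) < 1e-9  # category probabilities sum to 1
```

Plotting these category probabilities across a range of theta values produces the category response curves (CRCs) discussed in Chapters 4 and 5; in practice the a and b parameters are estimated from response data rather than fixed by hand.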

PAGE 14

CHAPTER 1
INTRODUCTION

Increased global integration has prompted the need for greater international educational opportunities at postsecondary institutions, which has raised awareness among higher education professionals of the learning outcomes associated with internationalized exposure. Given the increased emphasis on developing and producing globally aware and interculturally competent students, postsecondary institutions have engaged in curricular and co-curricular internationalization efforts. Within these broad internationalization interventions, higher education institutions have designed, developed, and implemented curricula and programs. Specific examples of broad efforts include the expansion of education abroad opportunities to such fields as economics (Georgia Tech, 2011) and the University of Melbourne's analysis of alumni in the workforce to determine international curricular components (Leask, 2013). The increased role of international programs at U.S. universities and colleges is directly related to greater globalization (Rudenstine, 1997; Zhai, 2004). Held and McGrew (2002, p. 1) link globalization to increasingly interconnected worldwide activities. In response to globalization, higher education institutions are adapting their campuses and curricula to better prepare their students for a more global workforce. Given

PAGE 15

the perceived mandate to develop students who are globally competent, many postsecondary institutions have strategically designed and implemented internationally focused programs to ensure student success. A common extension of postsecondary international efforts is the recruitment and enrollment of international students at the undergraduate and graduate levels. Over the past decade, the postsecondary international student population at U.S. universities and colleges grew to 4% of the total U.S. higher education population, a record high of 819,644 students in 2012-2013. In the 2012-2013 academic year alone, the number of international students studying at U.S. institutions of higher education increased by 7% (IIE, 2013). This considerable growth in international student enrollment has contributed significantly to the economics, research, curricula, social interactions, and campus climate of U.S. postsecondary institutions. The Institute of International Education (2012) estimated that international students studying at U.S. postsecondary institutions contributed, through tuition payments and living expenses, approximately $21.8 billion to the U.S. economy in 2011-2012. Given the economic impact international students can have on institutional resources (tuition, room and board) as well as the local economy, postsecondary institutions continue to develop international marketing strategies to attract students from around the world. In addition to the economic impact on campuses and local economies, international students are a valuable resource for the internationalization of curricula, as well as for furthering diverse research perspectives and agendas. Given that many international students serve as either teaching or research assistants at the graduate level, they play an instrumental role in student interactions within the classroom as well as offering a varied cultural perspective to the research process (Zhai, 2004; Ladd & Rudy, 1999). In fact, research has found that, as international

PAGE 16

teaching assistants learn how to navigate the cultural differences in classroom instruction, they also provide intentional cultural interactions with their students in this process (Zhai, 2004). These cultural exchanges are important for developing a more interculturally competent and globally aware campus climate. While the integration of international students can foster a more culturally diverse campus climate, incorporating international perspectives within curricular activities allows postsecondary institutions to directly further the global knowledge and competency of students. In terms of curricular development, postsecondary institutions have invested in a range of resources, from integrating international components within specific classes to creating an international scholar designation. Specific implementation includes the incorporation of classes on intercultural communication and various course components with instruction in, exposure to, and reflection on international concepts (e.g., world geography, international relations, and intercultural communications). By coupling international components in the curricular and co-curricular arenas, colleges and universities are better able to offer a holistic approach to campus internationalization. Although curricular exposure to intercultural dimensions and perspectives is vital for student learning, intentional interactions between members of different cultural groups foster the understanding of the concepts learned by students within the classroom. Research indicates that exchanges between different cultural groups reduce prejudice and improve understanding of different cultural perspectives (Pettigrew & Tropp, 2000). In fact, these exchanges foster the development of empathy and the ability to take on different perspectives based upon knowledge of the respective cultural worldviews (Maringe, Lumby, & Morrison, 2007; Mark, 1994). As a result of such research, U.S. postsecondary institutions have implemented programs in

PAGE 17

multicultural education (J.A. Banks & Banks, 2004), peace education (Stomfay-Stitz, 1993), international or global education (Merryfield, 1996), and culturally relevant or responsive education (Gay, 2000). Many postsecondary institutions have also incorporated internationally focused classes into the fields of teacher education (Cusher & Mahon), engineering (Grandin & Hedderich, 2009), social work (Fong, 2005), and the health care fields (Anand & Lahiri, 1999). Through the expansion of international emphasis in such degree areas as global master's of business administration at the graduate level and global economic and international affairs at the undergraduate level, institutions have also bolstered internationally focused curricular offerings. In addition to offering internationally designated classes and degree programs, higher education institutions have recognized international learning opportunities through an international designation on student transcripts (Antioch University, 2013; Georgia Tech, 2011; Thompson River University, 2013). For example, at Georgia Institute of Technology, students who participate in study abroad opportunities twice during their undergraduate studies and take courses in modern languages, global economics, and international affairs receive an international designation on their transcript (Georgia Tech, 2011). An expansion of the Georgia Tech international designation is the Global Competency designation offered at Thompson River University. In order to receive this Global Competency recognition, the postsecondary student must complete a compilation of curricular and co-curricular activities and document these experiences in a portfolio (Thompson River University, 2013, retrieved at http://www.tru.ca/global.html). Based upon four categories consisting of learning another foreign language, engaging in education abroad opportunities, participating in campus intercultural or


international activities, and participation in external intercultural and international activities, Thompson Rivers University recognizes international awareness and competency. Incorporation of an international designation on student transcripts provides educational institutions the opportunity to discuss specific institutional components that represent an interculturally and globally competent student. In addition, this designation ensures review by future employers of students who have experienced cross-cultural situations. In the pursuit of developing and fostering intercultural understanding in their students, U.S. postsecondary institutions have also expanded education abroad opportunities, developed international branch campuses, and invited international speakers to U.S. campuses. In terms of education abroad programs, not only have universities expanded opportunities to more countries, but they have also implemented programs to attract more students, and more diverse students as well, to participate in these programs. The expansion of education abroad programs has extended geographically to the Middle East and the BRIC countries (Brazil, Russia, India, and China) (IIE, 2012). In addition to developing a more expansive study abroad experience for students, higher education institutions have also investigated how to expand study abroad opportunities to students who have not historically participated in international learning programs, specifically minority students. For example, educational institutions have attempted to market and involve students of color in education abroad opportunities (Salisbury, Paulsen, & Pascarella, 2010).

In order to engage minority students in study and education abroad programs, higher education institutions have implemented such research-based measures as hiring a more culturally diverse staff, increasing scholarship opportunities, and conducting outreach programs to parents and guardians (Brux & Fry, 2009; Salisbury, Umbach, Paulsen, & Pascarella, 2009). By incorporating internationally focused interventions that address the social capital (faculty-student mentorship programs), financial capital (study abroad scholarships), and cultural capital (parental exposure to international activities) of students of color, universities have learned to address the growing need to incorporate minority students into a strategic internationalization plan (Clemens, 2002; Hembroff & Rusz, 1993; Salisbury, Umbach, Paulsen, & Pascarella, 2009). Educational institutions have also ventured into more entrepreneurial endeavors by establishing branch campuses outside the United States. From 1999 to 2009, the number of international branch campuses doubled from 82 to 163 (Knight, 2011). For example, New York University (NYU) has developed an international branch in Dubai (United Arab Emirates) that has enrolled students not only from the Persian Gulf region but also postsecondary students from China, Brazil, Ethiopia, and Indonesia (Knight, 2011). These international campuses allow institutions to develop a presence in the host country, thereby developing an alternative source of revenue as well as potentially attracting more international students to the central campus. Institutions of higher education have also recruited and invited more international speakers to college campuses through international events or discipline-specific video conferences (Lloyd, 2010). An increase in international perspectives allows students to negotiate different viewpoints as well as interact with scholars from various regions of the world (Gauvin, 2005). In addition, postsecondary institutions have utilized technology to bring international experiences to the classroom and the campus as a whole. Research has shown that incorporating international interaction via technology sources can assist in student learning about international concepts as well as increase student cross-cultural interactions on their own campuses (Brown, 2000; Edmunson, 2007). Whether it is the development of overseas campuses or the inclusion of internationally


recognized speakers, increased efforts to internationalize campuses have led to a more culturally diverse campus climate. While the inclusion of international speakers offers short-term exposure to intercultural perspectives, postsecondary institutions have also recruited and assimilated international scholars into various institutional research and teaching opportunities. Fostered through the Fulbright United States Scholar Program, over 800 visiting faculty members and professionals come to U.S. campuses each year (Council for International Exchange of Scholars, 2014). Combining both international speakers and visiting international scholars can cultivate a campus environment that initiates intercultural thinking and interactions. In order to accomplish an institutional mission and vision of intercultural competency and global awareness, postsecondary institutions have integrated international students into the student development process, recognizing the value that such students bring in fostering knowledge and understanding of other cultures. But U.S. postsecondary institutions have also recognized the key value of instilling in the culture of the institution the ability to cognitively process cultural differences and to effectively manage cultural conflicts. Research indicates that internationalization attempts are more effective when they are attached to the strategic vision of the university (Peterson et al., 2002). By incorporating internationalization into the strategic plan of the institution, the goal is to gain the support of university stakeholders, including central administration (funding of scholarships for study abroad), academic departments (international concepts in classes), and students (involvement in intercultural activities). For example, when the Oregon university system implemented internationalization as a component of its strategic plan, it witnessed an increase in intercultural competency and an expansion of internationalization efforts across all


campuses (Lee, Abd-Ella, & Burks, 2005). By aligning internationalization efforts with the institutional mission and vision, postsecondary institutions are better able to develop international programs and policies and evaluate their effectiveness.

Purpose of the Study

As postsecondary institutions further their internationalization efforts, they need to measure their students' understanding of global issues and competence in cross-cultural interactions. As more resources are allocated to international programs and policies (Green, 2007; Norton, 2008), assessing these initiatives, especially in terms of student development, becomes ever more vital. In order to assess internationally focused student learning outcomes, universities and colleges have designated institutional taskforces to develop international competencies to frame their various international curricular and co-curricular activities. Whether in initiatives to internationalize curricula or to enroll more international students, postsecondary institutions have established assessment procedures to ensure that these endeavors align with the learning competencies and outcomes. In fact, Samonte and Pastor (2011) recommend that internationalization assessment, particularly in the area of intercultural competency, should be further examined within the context of holistic student development. The purpose of this study is to examine the validity of internationalization assessments of student self-perceptions of intercultural competency and global awareness, with a specific focus on critical thinking and communication skills. In this study, multiple methods of item analysis are conducted to address the psychometric properties of the assessment. Classical test theory measures were applied to determine item discrimination and difficulty for the purpose of item retention and elimination for a survey instrument. In addition, I determine psychometric properties of the student learning outcomes of critical thinking and


communication with regard to internationalization. Based upon this analysis, internationalization stakeholders can determine the contribution of specific items. Broadly, this study intends to offer higher education administrators a more thorough method for evaluating student self-reported perceptions of intercultural competency and global awareness. As a result, this analysis aims to assist administrators in developing an assessment that can be used to evaluate the effectiveness of institutional internationalization efforts.

Research Questions

The following research questions constitute the core of this empirical analysis:

1. What are the psychometric properties of an assessment of student self-perceptions of intercultural competency and global awareness (critical thinking and communication skills) in regards to internationalization?

2. When evaluating the self-perceptions of intercultural competency and global awareness (critical thinking and communication skills) of students in regards to internationalization, how valid is the assessment?

3. Which items provide the most information regarding student self-perceptions of intercultural competency and global awareness (critical thinking and communication skills) in regards to internationalization?

In examining the development and psychometric properties of an internationalization assessment, there are a multitude of conceptual and methodological frameworks available. For this particular study, it was important to apply a framework that incorporated instrument development (classical test theory) and item alignment (item response theory). By beginning with a classical test theory model for instrument development and continuing with further item analysis, this study attempts to provide a comprehensive instrument analysis of the perceptions of critical thinking and communication skills of postsecondary students in relation to global awareness and intercultural competency.
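The two-stage framework described above pairs classical test theory item statistics with a polytomous item response model. As an illustrative sketch only — using synthetic Likert data and made-up parameter values, not the study's instrument, data, or estimated parameters — item difficulty, corrected item-total discrimination, and Samejima-style graded response category probabilities can be computed as follows:

```python
import numpy as np

def item_difficulty(responses, max_score=4):
    """Mean item score rescaled to [0, 1]; for Likert items, higher
    values indicate items that are easier to endorse."""
    return responses.mean(axis=0) / max_score

def item_discrimination(responses):
    """Corrected item-total correlation: each item is correlated with
    the total of the remaining items (classical test theory screening)."""
    n_items = responses.shape[1]
    total = responses.sum(axis=1)
    return np.array([
        np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
        for j in range(n_items)
    ])

def grm_category_probs(theta, a, b):
    """Samejima (1969) graded response model: P(X = k) for k = 0..K,
    given ability theta, discrimination a, and ordered thresholds b."""
    b = np.asarray(b, dtype=float)
    cum = 1.0 / (1.0 + np.exp(-a * (theta - b)))  # P(X >= k), k = 1..K
    upper = np.concatenate(([1.0], cum))           # P(X >= k), k = 0..K
    lower = np.concatenate((cum, [0.0]))           # P(X >= k + 1)
    return upper - lower

# Synthetic 5-point Likert responses (0-4) for 6 items and 200 respondents,
# driven by a single latent trait so discriminations come out positive.
rng = np.random.default_rng(0)
trait = rng.normal(size=(200, 1))
responses = np.clip(np.round(2 + trait + rng.normal(size=(200, 6))), 0, 4)

difficulty = item_difficulty(responses)        # each value lies in [0, 1]
discrimination = item_discrimination(responses)
probs = grm_category_probs(theta=0.0, a=1.5, b=[-1.5, -0.5, 0.5, 1.5])
# probs sums to 1 across the five response categories
```

In the actual analysis, the graded response model parameters are estimated from the pilot data rather than fixed as in this sketch; the example only shows how the category probabilities of the item response model relate to the classical test theory screening statistics used for item retention.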


Rationale for Study

As postsecondary institutions increase their internationalization efforts to meet the demands of globalization, these institutions will review and implement various curricular and co-curricular programs and policies to foster student global preparedness. In implementing these international learning opportunities for their students, higher education institutions are responding to the necessity of developing globally competent students. Exposure to international and cross-cultural environments benefits students in the competitive global job market as well as in their personal and intrapersonal development (Braskamp). Hoffa and Hoffa (1996) state that study abroad experiences can be a valuable preparation for the global job market by offering students opportunities to implement curricular principles in practical settings. Participation in study abroad programs often leads to better job placement rates for postsecondary students, especially in career positions focusing on international engagement (Wiers-Jenssen, 2008). Even as students pursue advanced degrees, graduate programs increasingly value international experience (Martinez, 2011). Students who demonstrate international awareness are more likely to succeed in an increasingly complex and competitive global marketplace. In addition to the economic and job placement opportunities for those students who engage in international learning opportunities, exposure to multicultural perspectives fosters such holistic student development qualities as cognitive, interpersonal, and intrapersonal skills (Clarke, 2009). Recently, researchers have examined whether participation in international learning opportunities can lead to positive benefits such as academic achievement, degree attainment, cognitive development, increased multicultural understanding, and improved interpersonal communication skills (Braskamp, Braskamp, & Merrill, 2009; Clarke, Flaherty, Wright, & McMillen, 2009). Kaufman, Martin, and


Weaver (1992) discovered that study abroad participation positively affects educational outcomes such as cognitive and personal development, as well as expanding the international perspectives of students. Understanding the effect international exposure can have on cognitive development is critical in developing cognitively complex postsecondary students. Moreover, internationalization learning outcomes focused on developing interpersonal and intrapersonal competence promote self-confidence through developing a positive view of oneself and others. Research shows that self-confidence increases when a person successfully overcomes challenges, such as those encountered through studying abroad (Laubscher, 1994; Neff, 2001). Laubscher (1994) found that study abroad participants were more confident in their ability to take risks and as a result developed a profound sense of independence and self-reliance. Research has also shown that international experience (curricular or co-curricular) provides exposure to diversity, enhancing educational outcomes such as leadership skills, the ability to work in teams, and increased interaction with peers and faculty members (De Sousa, 2005). By exposing students to diverse cultures, international programs and curricula provide opportunities for students to question personal cultural beliefs as well as experience growth in terms of ethnic identity, racial identity, and intercultural sensitivity (Day-Vines, Barker, & Exum, 1998; Neff, 2001). Coupled with intrapersonal and interpersonal development, participation in internationally focused programs supports academic achievement and persistence (Bates, 1997; Braskamp & Merrill, 2009; Clarke, 2009). Given the role that international experiences can play in academic achievement and student development, it is important to understand the factors that make internationalization efforts effective, both inside and outside the classroom.

Aligning their institutional missions and strategic plans with the current global culture, higher education institutions have incorporated intercultural competency and global awareness


into their campus climates. Given the increased importance of intercultural competency and global awareness among college students, postsecondary institutions have assessed the development of student competencies through professional assessment instruments (e.g., the Intercultural Development Inventory and the Global Perspectives Inventory). Historically, internationalization assessment has focused on education abroad experiences (Deardorff, 2004). However, with increased attempts at on-campus internationalization, through such mechanisms as the implementation of an internationally themed Quality Enhancement Plan (QEP), postsecondary institutions have invested vast resources (time, energy, and money) to ensure the production of globally and interculturally competent students. The allocation of resources to international efforts can be illustrated by the growth of U.S. branch campuses overseas. From 1999 to 2009, the number of overseas branch campuses doubled, with 162 domestic postsecondary institutions venturing into the global education arena (Altbach, 2011). The increased inclusion of internationalization in the mission and vision of higher education institutions has led to more globally focused programming and consequently an increased need for comprehensive program assessment. With such large investments in internationalization, universities and colleges have recognized the need to evaluate the effectiveness of these international ventures. In order to conduct a more comprehensive assessment of international programs and policies, postsecondary institutions have often relied on external organizations to assess their internationalization outcomes (Deardorff, 2006). These external evaluations have been implemented using various professionally designed instruments to measure intercultural and global competencies (Braskamp & Merrill, 2009; Engberg & Fox, 2011).

Despite the high stakes in using these instruments for assessing internationalization, there has been little research regarding their reliability in terms of


generalizability (Engberg & Fox, 2002; Samonte & Pastor, 2011). In a study conducted on the Global Perspectives Inventory by Samonte and Pastor (2011), the researchers discovered that several items did not align well with their intended constructs. They therefore recommend further item analysis by intercultural content experts to ensure appropriate item wording. Hammer (2011) investigated the validity of the Intercultural Development Inventory (IDI); using a sample of 4,763 participants across 11 cross-cultural groups, the study showed that the IDI reliably measured its underlying construct (Intercultural Sensitivity). In defining internationalization outcomes, each institution of higher education may have a specific and unique interest in understanding what constitutes internationalization competency for its campus. Aligning student learning outcomes with specific postsecondary internationalization efforts is especially vital in developing an institutional assessment of internationally focused courses, programs, and policies. Relying on external assessment of specific institutional climates can be challenging, since various definitions exist for the underlying concepts of internationalization (e.g., intercultural competency) (Deardorff, 2004; Knight, 1999). Given the unique institutionally focused perspective on intercultural competency and global awareness, assessment of defined student learning outcomes may also need to be tailored to the individual institution. This study examines the assessment development process as it pertains to evaluating the specifically defined internationalization student learning outcomes at a large, public, research-extensive postsecondary institution in the southeastern United States. In addition, this analysis investigates the


validity and reliability of the internationalization instrument to better ensure understanding of student perceptions of critical thinking and communication in a global context.

Scope of Study

The scope of this study is to examine the development and psychometric properties of an instrument assessing student self-perceptions of the internationally focused outcomes of global awareness and intercultural competency. Specifically, the student learning outcomes of critical thinking and communication were analyzed in terms of item development, reliability, and validity. The sample consisted of postsecondary undergraduate students enrolled in first- and second-year classes at a large, public, research-extensive postsecondary institution in the southeastern United States. The diverse student demographics of the sample reflect the varied perspectives of first- and second-year students on critical thinking and communication in various intercultural settings. Given the unique nature of institutional conceptualization of internationalization, the instrument development process was the result of student, faculty, and administrative input at this particular institution. However, the instrument development process offers a potential model for developing an assessment instrument as part of an overall assessment process.

Significance of Study

Comprehensively and effectively assessing international efforts is vital to ensuring that curriculum interventions result in the development of globally and culturally competent students. As universities and colleges incorporate internationalization principles as a component of the institutional mission, it becomes even more imperative that specific methods of evaluation be developed to measure student self-perceptions of intercultural competency and global awareness. At the institutional level, this


study attempts to analyze the assessment process, specifically in developing and revising an evaluation tool to fit institutional needs. Through the use of institutional assessment, postsecondary administrators can both realign current internationally focused initiatives with learning outcomes and investigate new opportunities for fostering international perspectives. For postsecondary faculty members, the significance of this study is that it intends to provide insights into their curricular initiatives toward the successful development of internationalized student learning outcomes. At the individual level, this study attempts to provide a comprehensive understanding of student development in these competencies. By analyzing demographics (e.g., gender), this study also investigates the differences, and potential disparities, in specified learning objectives among students. This study therefore provides a format for the introduction of intercultural competency and global competency as concepts to postsecondary students, as well as a method to further assess their development within these competencies throughout their postsecondary educational experience. This analysis permits higher education institutions to assess the alignment of their policy, support, and implementation dimensions within and across the organization in relation to the results of internationalization. Through alignment of international efforts with agreed-upon learning outcomes, postsecondary institutions will be better able to develop a campus climate that prepares their students for the globalized world.

Summary

This chapter highlighted the increased influence globalization and internationalization have on institutions of higher education. As society becomes more integrated with people of


diverse cultural experiences, postsecondary institutions are attempting to prepare their students for a more global workforce. The rationales underlying internationalization efforts can vary with regard to such factors as mission, student population, faculty profile, geographic location, funding source, level of resources, and orientation to local, national, and international interests (Iuspa, 2010; Knight, 2004). While Knight (2004) lays a foundation for internationalized curricula, other international educators add such rationales as professional capabilities (Bremer & Van der Wende, 1995), student development (Mestenhauser, 1998), and socialization (Paige & Mestenhauser, 1999). As postsecondary institutions develop international programs and plans to foster a more intercultural climate, it has become more critical for these institutions to evaluate the effectiveness of these implementations by understanding their impact on student global awareness and intercultural competency. The purpose of this study is to examine the development and psychometric properties of an intercultural instrument for assessing learning outcomes from defined interventions to promote global awareness and intercultural competency. Given the institution-specific nature of student learning outcomes, this study investigates the process of developing an institutionally specific assessment tool. The instrument development process consists of stakeholder definitions of specific competencies, outlining student learning outcomes, item generation and revision, and instrument and item piloting. In addition, this study analyzes the specific items that provide information in relationship to internationalization at this specific research institution. This study aims to provide faculty and administrators with insights into this assessment development process.


CHAPTER 2
LITERATURE REVIEW

Understanding the holistic development of postsecondary students in the areas of intercultural competency and global awareness has become crucial for institutions of higher education. The purpose of this study is to examine the validity of internationalization assessments of student self-perceptions of intercultural understanding and global awareness. Therefore, this literature review explores the relationship between internationalization components at the postsecondary level and the assessment principles and methods utilized to ensure effective internationalization efforts. This literature review also includes a brief overview of the components and process of internationalization in the history of institutions of higher education in the United States. In particular, I address the common misconception involving the use of the terms globalization and internationalization interchangeably. In order to develop a common understanding of internationalization, this review discusses and provides a synopsis of definitions of internationalization utilized by postsecondary institutions and internationally focused professional organizations. This section lays the groundwork to comprehend the key student competencies of global awareness and intercultural communication, as well as the student learning outcomes of critical thinking and communication skills. In addition, this literature review addresses the definition of assessment in the context of this analysis as well as providing clarification of the methodological components of item response theory. After outlining these assessment principles, the literature review proposes a theoretical framework for this analysis. The framework utilized for this analysis draws from specific item response theory models. After describing the theoretical framework that guides this study, I discuss the various standardized intercultural


assessment instruments utilized by U.S. postsecondary institutions. Despite a significant amount of research on internationalization efforts at postsecondary institutions, scant research exists on the respective instruments employed in measuring the effectiveness of these efforts (Edwards, 2009; Samonte & Pastor, 2011).

Definition of Key Terms

This research study focuses on internationalization efforts at postsecondary institutions, specifically student understanding of global awareness and intercultural competency. In order to provide a clear foundation for this study, specific key terms need to be addressed. This section offers the definitions of these key terms, which are further built upon throughout the study. Beginning at the macro level, the difference between the commonly confused terms internationalization and globalization is delineated. Following the definition of internationalization, an understanding of key variables related to learning competencies and outcomes is provided, including global awareness, intercultural communication, critical thinking, and communication skills. Transitioning from international concepts to methodological concepts, I distinguish the variety of uses of assessment and offer definitions that frame the analysis of student self-perceptions of critical thinking and communication in an international context. Given that this study is a methodological analysis of the psychometric properties of an intercultural assessment, I also address important methodological terms that are vital in understanding the effectiveness of the assessment tool.

Internationalization vs. Globalization

With the increasing promotion of global concepts within postsecondary institutions, it is vital to have an institutional definition of internationalization in order to develop appropriate assessment instruments. In determining this institutional definition, universities and colleges


must distinguish between the concepts of internationalization and globalization. Commonly misinterpreted, globalization and internationalization play different roles in the arena of higher education. As postsecondary institutions attempt to produce more globally aware students, these institutions are implementing interventions that address the opportunities globalization brings. Globalization has been described as a force that shapes how the world works and structures the way institutions and actors operate. Moving beyond this meaning-making definition, Vaira (2004) provides a more multifaceted approach, breaking globalization down into two prominent theories: convergent and divergent. Within the convergent theory, globalization is viewed as a top-down process involving the homogenization of global factors (cultural, political, and economic). By contrast, the divergent theory sees globalization as a nonlinear, bottom-up process with heterogeneous effects, especially at the local level (Vaira, 2004). However, despite these different definitions of globalization, it is apparent that, as Knight notes, while globalization describes actual flows across borders, internationalization describes how a country responds to the impact of globalization (Qiang, 2003). With internationalization, national identity and culture are the main components. In order to comprehend internationalization, it is important to have an understanding of one's own culture as well as foreign cultures in order to appropriately develop an international vision. According to Altbach and Knight (2006), internationalization comprises the policies and practices that academic systems, institutions, and individuals undertake to respond to the global academic environment.


The definition of internationalization at the postsecondary level has evolved over time. In its infancy, educational stakeholders needed to carefully distinguish between internationalization at the national and institutional levels (Knight, 2004). While the institutional level is more clearly limited to educational efforts, the national role in internationalization is much broader in scope and incorporates such areas as employment, agriculture, science, and technology. Although these areas can affect institutions of higher education, the ways they address these areas differ significantly from those of governmental entities (Qiang, 2003). A second phase in the definition of internationalization occurred prior to the 1980s. During this period, many countries used the term international education. With the use of international education, institutions began to identify key characteristics in order to differentiate this new concept from preexisting educational categories, including comparative education, global education, and multicultural education (Knight, 2004). In this phase, such key characteristics of internationalization as intercultural competency and global competency were introduced into institutional definitions of internationalization for mission and vision development (Deardorff, 2004; Iuspa, 2006). However, in the late 1980s, colleges and universities began to replace the term international education with the term internationalization. While international education was often used in reference to the national role, internationalization was commonly restricted to the institutional level and entailed a set of international activities (Knight, 2004). An example of the activities-based definition of internationalization is one by Arum and van de Water (1992), which defines international education in terms of the multiple activities, programs, and services that fall within international studies, international educational exchange, and technical cooperation.


Progressing from an activity-based foundation to a more operational one, Knight (1994) defined internationalization as the process of integrating an international and intercultural dimension into the teaching (learning), research, and service functions of the institution. This definition thereby became a catalyst for incorporating the primary elements of internationalization: it frames internationalization as a dynamic process that is not fragmented but integrates international dimensions. In addition, this definition incorporates the three overarching functions of a postsecondary institution: teaching, research, and service. Other definitions, such as Van der Wende's, place a broader emphasis on making higher education responsive to the requirements and challenges related to the globalization of societies, economies, and labor markets. Internationalization definitions similar to that of Van der Wende extend the reach of internationalization beyond the educational sector into external environments. In developing an appropriate definition of internationalization, it is important not to specify rationales, benefits, outcomes, and activities. By incorporating such concepts, the definition of internationalization is limited, which can create obstacles in the assessment of institutional internationalization. According to Knight (2004), it is vital that internationalization remain a broad and neutral concept; at the same time,


a more focused definition with specific parameters is necessary if internationalization is to be assessed and to advance higher education (de Wit, 2002). Many critics of definitions of internationalization hold that these definitions do not address further projections of internationalization efforts. Since many countries and institutions look to internationalization as a means to an end, such as quality improvement, Qiang (2003) holds that internationalization should not be solely an aim but should also lead higher education institutions toward international standards, as well as making them more open and responsive to the global environment. With all these considerations in mind, Knight (2003) provides a comprehensive definition of internationalization as "the process of integrating an international, intercultural or global dimension into the purpose or delivery of postsecondary education" (p. 2). In addition, Knight (2003) delineates the specific components of the internationalization definition. This definition provides the framework for this analysis, which is as follows: Internationalization is the conscious integration of intercultural and global dimensions into student learning. Constructed by an institutional taskforce of faculty and administrators, this definition of internationalization forms the basis for the competencies and student learning outcomes assessed in this study.

Global Awareness

Global awareness consists of the following definition: Students comprehend the trends, challenges, and opportunities that affect local communities and communities worldwide, and apply this knowledge responsibly and effectively.

Intercultural Communication

Intercultural communication consists of the following definition: Students interact effectively with members of other cultures.


Critical Thinking

Critical thinking is characterized by appropriate judgments, comprehensive analysis, effective reasoning, and solution-finding skills in terms of intercultural competency and global awareness.

Communication Skills

Communication is the development and utilization of the skills of cultural sensitivity, cultural awareness, situational adaptability, and effective oral and written expression in terms of intercultural competency and global awareness.

Assessment

This study relies on an understanding of the definition and concepts of intercultural assessment. With the definitions of intercultural communication and global awareness previously addressed, it is important to provide a clear foundation for the components of assessment that are being utilized in this analysis. One key distinction to be made is the difference between assessment and evaluation. As compared to assessment, the term evaluation has a longer academic history, and evaluation is a commonly agreed-upon term among professionals (as reflected in the work of the Joint Committee on Standards for Educational Evaluation). Although there is a general consensus over the meaning of evaluation, many assessment specialists break down the role of evaluation even further. In fact, Scriven (1967) further delineates the forms of evaluation into summative and formative. By formative, Scriven (1967) refers to the use of evaluations to improve a program while it is still underway; by summative, Scriven (1967) outlines the use of evaluations for a final understanding of the


program, class, etc. Similar to the understanding of internationalization, assessment in academia can have vast components depending on the scope and lens through which the assessment is conducted. In fact, Madaus, Kellaghan, and Stufflebeam (2000) classify the numerous approaches to assessment into three categories: methods-oriented evaluation models, improvement/accountability-oriented evaluation models, and social agenda-directed/advocacy models. The methods-oriented approach emphasizes the technical quality of evaluation questions and procedures (Stufflebeam, 2001). The improvement/accountability-oriented approach emphasizes accountability, consumer orientation, and accreditation (Stufflebeam, 2001). Finally, the social agenda-directed/advocacy approach employs evaluation in the service of addressing social concerns and serving stakeholders (Stufflebeam, 2001).

While these three branches attempt to categorize the various assessment models, there are numerous evaluation models. For example, Fitzpatrick, Worthen, and Sanders (1997) provide a program-focused assessment framework that entails the identification of evaluation questions and criteria in the assessment process. This assessment method provides a very comprehensive guide by combining curricular assessment principles with a compilation of various evaluation models.

Within the research literature, there exists a range of definitions of student assessment. At its broadest, assessment has been defined as any effort to gather, analyze, and interpret evidence which describes effectiveness. Over the development of the field, definitions have transitioned from more of a cumulative


assessment of student learning and understanding (summative) to understanding the learning processes of students throughout the course (formative) (Wiliam, 2011). Wiliam (2011) indicates that the evaluation process originally assessed the effectiveness of programs or activities once they were completed, but in recent years, the role of assessment has been to investigate the learning process itself. Given this construct, this analysis defines the assessment process in a broader manner. Given that this analysis can be both summative (alignment of SLOs) and formative (design of new internationally focused programs), assessment needs to provide a comprehensive and flexible role that can adapt throughout the analysis. Therefore, for this analysis, I utilize the broad definition of assessment previously stated: any effort to gather, analyze, and interpret evidence which describes effectiveness. In the context of intercultural assessment, this broad definition allows for either a formative or summative approach to understanding student perceptions of internationalization concepts.

Methodological Definitions

Methodological definitions for the research are outlined below. The majority of these definitions come from Reeves (2002), Introduction to Modern Measurement Primer.

Tests with dichotomous variables: Testing instruments that only allow for the values of 0 or 1 as possible item scores (Crocker & Algina, 1986).

Tests with non-dichotomous variables: Testing instruments that allow for a range of possible points awarded to a response that is not restricted to 0 or 1 (more than two responses) (Embretson & Reise, 2000).


Construct: A construct can also be known as a trait, domain, ability, latent variable, or theta. For this analysis, a construct consists of the definitional components found with theta/latent ability.

Scale of measurement: Scales of measurement refer to ways in which variables/numbers are defined and categorized, such as ordinal, nominal, ratio, or interval.

Theta/latent ability: The unobservable (or latent) construct being measured by the questionnaire. These constructs or traits are measured along a continuous scale. Examples of constructs in internationalization assessment include global awareness, intercultural competency, and cultural sensitivity.

Threshold parameter/item difficulty parameter: Threshold parameters represent the trait level necessary to respond above threshold with a 0.50 probability when the lower asymptote equals 0 (Hays, Morales, & Reise, 2000).

Item discrimination parameter (slope parameter): Describes the strength of an item's discrimination between people with trait levels (θ) below and above the threshold, b. The a parameter may also be interpreted as describing how an item is related to the trait measured by the scale. It is the slope of the item characteristic curve at the inflection point.

Item difficulty parameter: A point on θ where a person has a 50% chance of responding positively to the scale item.

Item characteristic curve (ICC)/Category response curve (CRC): A curve depicting the relationship between the probability of responding in a given item category and the level on the construct measured by the scale.
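The threshold and discrimination definitions above can be made concrete with a minimal sketch of a two-parameter logistic item characteristic curve. The parameter values (a = 1.5, b = 0.0) are hypothetical, chosen only for illustration, and the D = 1.7 scaling constant discussed later in this chapter is omitted.

```python
import math

def icc_2pl(theta: float, a: float, b: float) -> float:
    """Probability of endorsing a dichotomous item under a 2PL curve."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical item: discrimination a = 1.5, difficulty b = 0.0.
a, b = 1.5, 0.0

# At theta == b, the endorsement probability is exactly 0.50 ...
print(icc_2pl(b, a, b))   # → 0.5

# ... and the curve is steepest there; its slope equals a / 4.
eps = 1e-6
slope = (icc_2pl(b + eps, a, b) - icc_2pl(b - eps, a, b)) / (2 * eps)
print(round(slope, 3))    # → 0.375
```

The numerical slope at θ = b recovers a/4, which is why the a parameter is described above as the slope at the inflection point.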


Step functions: Category intersections considered as step difficulties associated with the transition from one category to the next; there are m_i step difficulties (intersections) for an item with m_i + 1 response categories (Embretson & Reise, 2000).

Adjacent vs. Cumulative Step Approach

Adjacent step approach: The adjacent step approach defines each of the J step functions. The jth step function specifies the probability of successfully advancing from category j - 1 to category j. For example, a polytomous item with four score levels (i.e., j = 0, 1, 2, and 3) would consist of the following three steps: step 1 would be defined by successfully advancing from a 0 to a 1 (step 0 → step 1), step 2 would be defined by successfully advancing from a 1 to a 2 (step 1 → step 2), and step 3 would be defined by successfully advancing from a 2 to a 3 (step 2 → step 3).

Cumulative step approach: The cumulative step approach defines the step functions in a manner consistent with the graded response model (GRM) (Samejima, 1997). Under the GRM, the jth step function specifies the probability of successfully advancing from categories 0, 1, . . . , j - 1 to categories j, j + 1, . . . , m. Following the example of an item with four score levels, step 1 would be defined by successfully advancing from a 0 to a 1, 2, or 3 (step 0 → steps 1, 2, 3); step 2 would be defined by successfully advancing from a 0 or a 1 to a 2 or a 3 (steps 0, 1 → steps 2, 3); and step 3 would be defined by successfully advancing from a 0, 1, or 2 to a 3 (steps 0, 1, 2 → step 3).

Theoretical Framework

A key component in validating an assessment instrument is exploring and analyzing item fit for the respective instrument. In terms of methodological procedure for item fit investigation and test measurement, many measurement experts rely on either classical test theory or item


response theory (IRT) (Embretson & Reise, 2000; Thorpe & Favia, 2012). The development of this instrument follows a two-step process. The first phase consists of utilizing classical test theory as a framework to determine item retention. The second phase consists of utilizing item response theory as a framework to determine item fit for the instrument piloting phase. This section provides a concise review of classical test theory, followed by a concise review of the similarities and differences between the polytomous IRT models. Following the review of the IRT models, Samejima's (1969) graded response model (GRM) is discussed, providing a rationale for utilizing this model to explore item fit statistics.

Classical Test Theory

Classical test theory (CTT) uses a set of principles and assumptions to examine the effectiveness of established proxy indicators in estimating variables of interest (DeVellis, 2006). CTT offers the possibility of utilizing observable data (e.g., survey responses) to gain insights into often unobserved constructs (e.g., global awareness). Traditionally, the analysis of test items has been conducted utilizing classical test theory. This is the basis for the concept of the true score, a key component of classical test theory. According to classical test theory, an observed score consists of a true score plus measurement error (Crocker & Algina, 1986). The following formula is used to express CTT:

X = T + E (2-1)

where X represents the observed score of the individual on the test, T represents the individual's true score, and E represents the random error of the individual on the test (Crocker & Algina, 1986). With regard to random error, CTT holds that there are external factors outside the trait of interest that influence the observed score. Because CTT assumes randomness of errors, individual item error statistics are believed to cancel each other out (DeVellis, 2006). In CTT, the assumption of random error also means these errors are considered independent of each other. In other words, random error is item specific, and each item has a unique error statement.

Classical test theory provides such statistics as item difficulty, which indicates the proportion of correct responses to an item; item discrimination, which entails the corrected item-total correlation; and reliability measures. In all the CTT statistics, the results are dependent upon the participant sample. In addition to representing the proportion of correct responses, item difficulty can also reflect the rate at which an item is endorsed (DeVellis, 2006). In polytomous response scales (e.g., Likert-style scales), item difficulty can be determined by the number of response endorsements. For example, in the assessment instrument, an item with which most students agree receives many endorsements and could be considered to have a lower difficulty level.

Other key components of classical test theory are scale reliability and validity. The concept of scale reliability addresses the issue of consistency of measurement (Hambleton & Jones, 1993; Embretson & Reise, 2000). In measuring scale reliability, groups of items are examined for internal consistency; reliability reflects the proportion of variance in a set of scores that can be attributed to a common influence on the items, that is, to the latent variable the items share.

Scale validity determines whether the intended characteristic or trait is actually being measured by the scale. DeVellis (2006) lays out the two basic goals of validity: (1) to provide convincing evidence of the scope of the variable of interest, and (2) to demonstrate that the scores yielded by that scale take on values that are consistent with our understanding of how the phenomenon of interest varies in the real world. For the present analysis, given that the intended constructs to be measured are critical thinking and communication skills in a cross-cultural context, the goal is to ensure that the respective items of the scale are effectively measuring these two established constructs.

While classical test theory procedures are commonly utilized in measurement, it is important to understand the attributes that both CTT and IRT offer for analysis. In classical test theory, responses are typically scored by summing the responses to the items (Crocker & Algina, 1986). For example, if a respondent endorses 15 out of 30 items, they earn a score of 50% on the test. In classical test theory, more items generally mean greater scale reliability; therefore, the reliability of the scale relates to the number of items in the scale (Thorpe & Favia, 2012). By contrast, in item response theory (IRT), shorter and equally reliable scales can be developed with appropriate item placement.

Even with other measurement options, many researchers have continued to use classical test theory because it offers several advantages. One advantage of CTT is that it is easy to


manage the analysis; the relative ease of calculating a correlation coefficient or transitioning between statistical packages is a benefit of using CTT to analyze data. Another important advantage of CTT is that items can be used that only modestly address the measured trait. In such cases, the inclusion of more items in the instrument can improve the level of reliability.

Despite the advantages of CTT, the model also has limitations. One limitation of the CTT model is that, while increasing the number of items can increase reliability, this can lead to the inclusion of overly similar items. The attempt to create highly correlated items can also introduce additional factors (e.g., grammatical confusion, wording) that can affect measurement of the variables of interest. In addition, classical test theory only provides a single estimate of reliability and its parallel standard error of measurement (Embretson & Reise, 2000; Thorpe & Favia, 2012). Given the efficiency and sample size for the instrument development phase of this analysis, classical test theory serves as a beneficial framework to construct the final assessment instrument. However, given the limitations of CTT, item response theory methods allow for a more robust and comprehensive analysis of item information.

Item Response Theory

Whereas classical test theory relies on the total test score to represent a construct such as intercultural competency or level of critical thinking (DeVellis, 2006), item response theory utilizes both the individual responses to each question and the specific properties of the questions. The roots of item response theory (IRT) can be traced back to the conceptual foundation described in L.L. Thurstone's early measurement work, particularly the introduction of a latent trait or ability, which differentiated this construct from the observed test score. In 1960, George Rasch introduced and promoted the need for statistical models with


specific objectivity, the concept that people and item parameters are estimated separately but are comparable on a similar metric. The main distinction of item response theory is its ability to describe the association between individual item responses and the latent trait that the instrument measures. According to Hambleton, Swaminathan, and Rogers (1991), two primary assumptions lay the foundation for item response theory: (1) the performance of an individual on a test item can be estimated by a set of factors called traits (latent) or abilities, and (2) the connection between individual performance on the test items and the underlying traits for the item performance is expressed as item characteristic functions, or item characteristic curves (ICC).

The concept of latent traits or abilities involves the construct that the instrument measures and is a continuous construct that explains the covariance among item responses (Steinberg & Thissen, 1995). Participants with higher theta values have a higher probability of giving a correct response (dichotomous) or of endorsing an item (polytomous) (Reeves, 2002). Item characteristic curves (ICC) indicate how an increase in the latent trait level affects the probability of giving a correct response or endorsing an item (Embretson & Reise, 2000). ICCs also illustrate the probability that an examinee may not endorse or correctly answer an item based upon their latent ability. Figure 3-1 is an item characteristic curve for a polytomous item concerning intercultural competency that illustrates that those participants with a higher theta value have a greater probability of endorsing the item. In this example, there are five possible responses based upon a Likert-style response scale.


In addition to the underlying principles of latent abilities and item characteristic functions, IRT procedures incorporate parameters describing both the items and the individual participants. These parameters guide the amount of information obtained from the items and the item responses. The information gained from items is vital in determining how well each item measures the test construct of interest. In terms of item parameters, each item is characterized by one or more model parameters that describe such features as the difficulty of the item in relationship to a participant's latent ability. The item difficulty parameter, expressed in IRT as the b parameter, indicates the specific point on the scale of the instrument where a participant with a specific θ has a 50% chance of responding positively (correctly answering or endorsing) to the item when the lower asymptote is zero (Steinberg & Thissen, 1995). Given the nature of the S-shaped distribution in IRT, the difficulty of the item is defined at the steepest part of the slope of the S-curve (Embretson & Reise, 2000). Typically, items with higher levels of difficulty are less likely to be answered correctly or to be endorsed.

The item discrimination parameter, expressed as the a parameter, describes the strength of an item's ability to differentiate among people with trait levels (θ) below and above the threshold b. The a parameter can also be interpreted as describing how an item is related to the trait being measured by the item (Chernyshenko, Stark, Chan, Drasgow, & Williams, 2001). Given that a good test item will show the relationship between latent ability and item endorsement (e.g., higher ability will have a greater chance of answering the question correctly), the item discrimination parameter explains how well an item differentiates among individual latent abilities. The discrimination values of typical items range from .05 to 2 (Embretson & Reise, 2000). While the item difficulty parameter indicates the location of the steepest part of the


ICC, the discrimination parameter reflects the steepness of the curve at that point: as the slope increases, the discrimination value of an item will be higher. The discrimination is the slope of the ICC at the point where the difficulty is defined. A higher discrimination value indicates that those with a higher theta value will be more likely to answer the question correctly or endorse the item (Orlando & Thissen, 2000).

As specifically applied to dichotomous test items, the c parameter within item response theory indicates the likelihood that a participant with a low theta (ability level) can guess the correct response to an item. While the guessing parameter provides important item information for dichotomous items, polytomous items, especially those with Likert response scales, are not adjusted for the c parameter.

Item response theory assumptions

At the core of item response theory are two important assumptions: unidimensionality and local independence. Unidimensionality may be interpreted simply as an instrument/test in which all of the items measure the same construct. For example, unidimensionality in an intercultural assessment instrument means that all the items on the assessment measure only a particular dimension of intercultural ability (e.g., critical thinking or communication skills); the performance of students on the survey is thus explained only by their ability in that dimension of intercultural capacity. Therefore, the presence of a dominating factor which can account for a significant portion of the test performance is considered standard for meeting this assumption. Unidimensionality can be determined by a comparison of the first and second eigenvalues (factors extracted from a factor analysis) for each scale. A common criterion for judging whether


unidimensionality has been met is to determine whether the first factor accounts for a significant proportion of the overall variance (Reise & Waller, 1990). Given the affective factors that may influence test performance (e.g., test anxiety), it is difficult to strictly meet the unidimensionality assumption. In fact, many empirical studies have found that the assumption of unidimensionality is rarely met (Ackerman, 1987; Michaelides, 2010). According to Traub (1983), conditions such as differences in instruction and curriculum, test-taking strategies (e.g., performance on timed tests), and different vocabulary levels can cause potential violations of the IRT unidimensionality assumption. For this analysis, a potential threat is that students may differ in how they interpret the items, introducing variation in the designated instrument dimensions.

The assumption of local independence holds that participants' responses to the test items are related to nothing other than their latent ability; conditional on theta, the item responses are independent of one another. According to Chen and Thissen (1997), the estimated parameters will be different if the local independence assumption is violated, which may result in inaccurate determination of items for scale construction. Only by meeting these two assumptions are the item results considered reliable.

Item response theory benefits

Given that item response theory allows for flexibility in both refining testing instruments and item construction and calibration, it has considerable potential in educational contexts. However, since IRT models have not been used much in the development and evaluation of


internationalization assessment instruments, it is important to review research literature that closely resembles the various constructs of intercultural competency. Attitudinal and personality inventories (e.g., self-confidence, adaptability) can closely resemble such intercultural competencies as adaptation and self-efficacy, so studies utilizing IRT models in the field of psychology can shed light on the effectiveness of IRT procedures in test refinement. In a study by Uttaro and Lehman (1999), the researchers explored the responses of 1,805 respondents to a thirty-five-item scale called the Quality of Life Interview. In this analysis, Uttaro and Lehman (1999) utilized the graded response model to examine the measurement properties of the scale's items.

The use of item response theory has several benefits in comparison to classical test theory. One key advantage is that, unlike CTT, IRT is not sample dependent. Unlike classical test theory, whose measurements of item discrimination and item difficulty rely on sample estimates, IRT does not depend on the sample to develop parameter estimates and, in fact, is considered scale invariant across different groups. Another important advantage of item response theory is its ability to provide analysis of individual estimated trait levels independently of the respective questions. Unlike CTT, which is dependent on the set of questions for analysis and for determining the summed score, IRT is better able to assess individual trait levels on individual items. The use of scaled scoring in item response theory is beneficial for item analysis because it allows researchers to consider the pattern of positive and negative responses to items as well as item difficulty and discrimination parameters to estimate trait levels. In other words, different


participants can have the same summed score on the instrument but have different IRT latent scores based upon different response patterns. With these strengths, IRT offers a comprehensive model for item analysis.

While item response theory offers considerable benefits in terms of flexibility of item analysis and strong foundational assumptions, IRT model specifications can also be influenced by methodological effects such as differential guessing and context. Although the concept of guessing is usually focused on dichotomously scored tests (e.g., multiple choice, true/false), differential guessing can also refer to the omission of items (e.g., a student read the question but failed to answer it). In a study by Choppin (1974) of student response patterns on an academic standards survey based upon geographical region, the researcher discovered that many students, almost 50% for certain regions in the United States, left survey questions unanswered. Given that item response theory relies on latent ability estimates, omitted items limit determination of participant-specific ability.

The context of the item presentation can also influence item parameters. One area of context influence can be the item setting (e.g., item order). Research has shown that changing the position of an item can lead to different item parameter estimates (Kolen & Brennan, 2004; Yen, 1980). Yen (1980) also determined that item difficulty could be influenced by item location. These studies revealed that higher discriminating items were often found at the end of a test, while lower discriminating items were often found at the beginning of tests (Wise, Chia, & Park, 1989). To counteract item setting issues, many IRT implementations use alternate forms with sections of items placed in reverse order to counteract test fatigue. Given the small number of items in this analysis (twenty-six), the effect of test fatigue on student performance is predicted to be minimal.
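The point that identical summed scores can map to different IRT trait estimates can be illustrated with a minimal grid-search maximum-likelihood sketch. The three 2PL items and their parameter values below are hypothetical, chosen so that one item is far more discriminating than the others.

```python
import math

def p_2pl(theta, a, b):
    """2PL probability of a positive response."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def mle_theta(responses, items):
    """Grid-search maximum-likelihood estimate of theta for a 2PL item set."""
    def loglik(theta):
        total = 0.0
        for x, (a, b) in zip(responses, items):
            p = p_2pl(theta, a, b)
            total += math.log(p) if x == 1 else math.log(1.0 - p)
        return total
    grid = [i / 100.0 for i in range(-400, 401)]
    return max(grid, key=loglik)

# Three hypothetical items: one highly discriminating, two weak.
items = [(2.5, 0.0), (0.5, 0.0), (0.5, 0.0)]

# Both respondents have the same summed score of 1 ...
endorse_strong = [1, 0, 0]  # endorsed only the discriminating item
endorse_weak = [0, 1, 0]    # endorsed only a weakly discriminating item

# ... yet the pattern-based ML estimates differ in sign and magnitude.
print(mle_theta(endorse_strong, items))
print(mle_theta(endorse_weak, items))
```

Endorsing the highly discriminating item pulls the estimate above zero, while endorsing only a weak item pulls it well below zero, even though a CTT summed score treats the two respondents identically.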


Item Response Theory Models

Unlike classical test theory, item response theory allows for the utilization of various model types to explore and investigate ranges of item and test information. These models combine various forms of the item parameters of item difficulty, item discrimination, and guessing. The more parameters that a model incorporates, the greater the amount of information needed from the participants, and as a result, the greater the amount of information obtained from the test. The flexibility of IRT allows for unidimensional analysis, which measures a single dominant ability, as well as multidimensional analysis, which measures multiple dominant abilities. In addition, in moving from dichotomously scored items to polytomously scored items, item response theory adapts to the transition more easily than classical (or traditional) test theory, needing only changes to the models themselves (Thissen, Nelson, Billeaud, & McLeod, 2001). Given the capacity of IRT models to adapt to various testing situations while addressing the unique needs of test instruments, the different item response theory models allow for flexibility of usage, especially in determining the appropriate fit of items for the assessment instrument.

Dichotomously Scored Item Models

Item response theory models express the probability of obtaining a correct response to a test item as a function of a latent ability or trait (θ). In dichotomously scored models, the response options for a specific item are limited to either correct or incorrect scores. For dichotomously scored item responses, the most common models differ based upon which parameters are included: the one-parameter logistic (1PL, or Rasch) model, the two-parameter logistic (2PL) model, and the three-parameter logistic (3PL) model. The following description of dichotomously scored IRT models progresses from most complex (3PL) to least complex (1PL).


In the three-parameter logistic IRT model (3PL), the three parameters are item discrimination (a), item difficulty (b), and a lower asymptote or guessing parameter (c). The mathematical form of the 3PL model for item i is as follows:

P(Y_is = 1 | θ_s) = c_i + (1 - c_i) · exp[D a_i (θ_s - b_i)] / (1 + exp[D a_i (θ_s - b_i)]) (2-2)

In the 3PL formula, P represents the probability of a positive response to item i, Y represents the item response, s is the individual participant, exp is the exponential function (the base of the natural logarithm raised to the quantity in brackets), and a, b, and c represent the parameters characterizing an item. Within the model equations, all of the different dichotomously scored models can incorporate a scaling constant, D, which is equal to 1.7. The purpose of the scaling constant is to allow the IRT logistic models to approximate the IRT normal ogive models (normal probability distribution). By incorporating all three parameters, items within the 3PL model differ from other items in terms of three item parameters, and the probability of a positive response on each item is affected by that item's difficulty, discrimination, and guessing parameters.

Built from the three-parameter logistic model, the two-parameter logistic model (2PL) holds the assumption of a lower asymptote equaling zero (c_i = 0). Adjusted to address testing situations in which guessing is assumed to be negligible (Hambleton, Swaminathan, & Rogers, 1991, p. 34), the 2PL model incorporates both the item difficulty and item discrimination parameters. The mathematical form of the 2PL model for item i is shown below:

P(Y_is = 1 | θ_s) = exp[D a_i (θ_s - b_i)] / (1 + exp[D a_i (θ_s - b_i)]) (2-3)
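Equations 2-2 and 2-3 can be transcribed directly into a short sketch; the item parameter values below are hypothetical, and the only difference between the two functions is the lower asymptote c.

```python
import math

D = 1.7  # scaling constant approximating the normal ogive metric

def p_3pl(theta, a, b, c):
    """Equation 2-2: probability of a positive response under the 3PL model."""
    return c + (1.0 - c) / (1.0 + math.exp(-D * a * (theta - b)))

def p_2pl(theta, a, b):
    """Equation 2-3: the 3PL with the lower asymptote fixed at zero."""
    return p_3pl(theta, a, b, 0.0)

# Hypothetical item: a = 1.2, b = 0.5, guessing parameter c = 0.2.
# At very low theta, the 3PL curve floors at c while the 2PL approaches zero.
print(round(p_3pl(-4.0, 1.2, 0.5, 0.2), 3))  # → 0.2
print(round(p_2pl(-4.0, 1.2, 0.5), 3))       # → 0.0
```

At θ = b the 3PL probability is c + (1 - c)/2 (here 0.6) rather than 0.5, which is why the b parameter marks the 50% point only when the lower asymptote is zero, as noted earlier.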


Considered to be the least complex and most restrictive logistic model, the one-parameter logistic model (1PL) assumes that the shape of the probability distribution is the same for each item in the assessment. In other words, the 1PL model relies on two assumptions: that the lower asymptote equals zero, and that all the items are equally discriminating. Therefore, the 1PL model assumes that the spread of right and wrong answers is the same for each item. In the 1PL model, item difficulty is the only item characteristic that influences performance. The mathematical form of the 1PL model for item i is shown below:

P(Y_is = 1 | θ_s) = exp[D (θ_s - b_i)] / (1 + exp[D (θ_s - b_i)]) (2-4)

Considered by some psychometricians to be the least complex of the dichotomously scored IRT models, the Rasch model for dichotomously scored items can either be treated as part of the 1PL logistic model or can be addressed separately. A key characteristic of the Rasch model that separates it from the 1PL model is the assumption that items are equally discriminating in that the discrimination is unable to vary by item. Therefore, the Rasch model, unlike the 1PL model, assumes that the item discrimination and slope of all items is equal to 1. The principle of a common item slope allows person and item parameters to be estimated separately yet compared on a similar metric. Another key principle of the Rasch model is the concept of sufficiency. This concept holds that the sum of the scores across the items is sufficient to determine the latent ability (θ); every individual with the same summed score will have the same estimated latent ability or trait without regard for differences in item response patterns. The mathematical form of the Rasch model for dichotomously scored items for item i is shown below:

P(Y_is = 1 | θ_s) = exp(θ_s - b_i) / (1 + exp(θ_s - b_i)) (2-5)
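The sufficiency principle can be checked numerically against Equation 2-5: because the Rasch log-likelihood depends on the response pattern only through the summed score, two different patterns with the same sum produce the same ability estimate. The item difficulties below are hypothetical, and a simple grid search stands in for a full estimation routine.

```python
import math

def rasch_loglik(theta, responses, difficulties):
    """Log-likelihood of a response pattern under Equation 2-5."""
    total = 0.0
    for x, b in zip(responses, difficulties):
        p = 1.0 / (1.0 + math.exp(-(theta - b)))
        total += math.log(p) if x == 1 else math.log(1.0 - p)
    return total

def mle_theta(responses, difficulties):
    grid = [i / 100.0 for i in range(-400, 401)]
    return max(grid, key=lambda t: rasch_loglik(t, responses, difficulties))

# Hypothetical item difficulties.
bs = [-1.0, 0.0, 1.0]

# Two different patterns with the same summed score of 1 ...
theta_1 = mle_theta([1, 0, 0], bs)  # endorsed the easiest item
theta_2 = mle_theta([0, 0, 1], bs)  # endorsed the hardest item

# ... produce identical Rasch ability estimates.
print(theta_1 == theta_2)  # → True
```

This is the opposite of the 2PL behavior sketched earlier, where the pattern of responses, not just their sum, moved the estimate.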


In this formula, P represents the probability of answering the item correctly, with θ representing the latent ability of the participant. In addition, b represents the difficulty of the item, i is the specific item, s is the person, and Y represents the item response. Since this research study involves polytomous items (Likert-style response scales), the remainder of this theoretical framework addresses only polytomous IRT models in order to determine the appropriate model for this study.

Ordered Polytomously Scored Item Models

Item response theory models have been utilized for several decades (Andrich, 1978; Samejima, 1969). To date, however, there has been little application of polytomous IRT models to attitude or personality measurement and no application of these models to intercultural competency assessment (Madera, 2003). Similar to the dichotomous models, the unique characteristics of polytomous models include different parameter estimates and step approaches (Embretson & Reise, 2000, p. 95), and they are often used in performance-based assessments. Thissen and Steinberg (1986) indicate that most multiple-category response models can be classified as either "divide-by-total" or "difference" models. A divide-by-total model can be utilized for ordered or nominal responses, while a difference model (Samejima, 1969) fits more appropriately for ordered responses.

Mellenbergh (1995) offered an alternative classification of parametric IRT models for polytomously scored items. His approach involved deciphering multi-categorical responses by adjacent or cumulative categories of step functions. By adjacent categories, Mellenbergh (1995) indicated that the jth step function specifies the probability of successfully advancing from one category to the next. For example, a polytomous item with four score levels (i.e., j =


0, 1, 2, and 3) would consist of the following three steps: step 1 would be defined by successfully advancing from a 0 to a 1 (step 0 → step 1), step 2 would be defined by successfully advancing from a 1 to a 2 (step 1 → step 2), and step 3 would be defined by successfully advancing from a 2 to a 3 (step 2 → step 3). A cumulative step function, by contrast, specifies the probability of successfully advancing from categories 0, 1, . . . , j - 1 to categories j, j + 1, . . . , m. Following the example of an item with four score levels, step 1 would be defined by successfully advancing from a 0 to a 1, 2, or 3 (step 0 → steps 1, 2, 3).

As for step functions, these functions, as addressed in the definition section of this paper, represent the transitions between adjacent response categories (Gattamorta & Penfield, 2012). For polytomous item scales, the concept of step functions involves the probability that a participant will move from one category to a higher category on the response scale (e.g., Agree → Strongly Agree). In addition, the c parameter for polytomous item scales assesses the ability of a participant to endorse a specific response category.

Encompassing these different item information functions, the polytomous item response theory models consist of four ordinal data models: the rating scale model (Andersen, 1977), the partial credit model (Masters, 1982), the modified graded response model (Muraki, 1990), and the graded response model (Samejima, 1969), along with one nominal data model, the nominal response model (NRM). Following an overview of the respective model components, Samejima's (1969) graded response model is offered as the methodological framework utilized to assess item fit for this research study.
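The contrast between the adjacent and cumulative step definitions can be sketched with hypothetical category counts for a single four-category item; the numbers below are illustrative only.

```python
# Hypothetical distribution of responses across four ordered categories (0-3).
counts = [10, 20, 40, 30]
total = sum(counts)

def cumulative_step(j):
    """Cumulative (GRM-style) step j: proportion scoring in category j or higher."""
    return sum(counts[j:]) / total

def adjacent_step(j):
    """Adjacent (e.g., partial credit) step j: proportion in category j,
    among respondents in category j - 1 or j."""
    return counts[j] / (counts[j - 1] + counts[j])

print([round(cumulative_step(j), 2) for j in (1, 2, 3)])  # → [0.9, 0.7, 0.3]
print([round(adjacent_step(j), 2) for j in (1, 2, 3)])    # → [0.67, 0.67, 0.43]
```

The cumulative steps pool all higher categories together, as the graded response model does, while the adjacent steps condition only on the two neighboring categories, which is why the two approaches can order the same transitions differently.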


Rating Scale Model (RSM)

In terms of complexity of IRT models, the rating scale model (RSM) is the most parsimonious given its reduced number of parameter estimates. Considered to be the Rasch version of the polytomous IRT models, the rating scale model has no discrimination parameter, and the number of steps depends upon the number of possible responses. Since there is no discrimination parameter within the RSM, the slope for all items is equal to 1 (Cole, Smith, Rabin, & Kaufman, 2004). As an adjacent-categories model, the RSM considers each one-step progression independently (e.g., step 2 to step 3). In addition, the RSM holds that the relative difficulty across items remains constant but allows for variation of step difficulty within the item itself. Therefore, under the rating scale model on a polytomous item scale (e.g., a Likert-style scale), all items would have the same difficulty level, but the movement from one step to another (e.g., neutral to agree) could have a different difficulty level than another step (e.g., agree to strongly agree). Similar to other Rasch polytomous models (e.g., the partial credit model), the rating scale model treats the observed raw score as a sufficient statistic for estimating person ability (Embretson & Reise, 2000). The formula for the rating scale model is as follows:

P_ix(θ_s) = exp(Σ_{j=0}^{x} [θ_s - (b_i + τ_j)]) / Σ_{r=0}^{m} exp(Σ_{j=0}^{r} [θ_s - (b_i + τ_j)])    (2-6)

where b_i is the location (difficulty) of item i and the step thresholds τ_j are shared by all items on the scale. While the RSM makes no assumption regarding specific item step difficulties, it does assume little variation in step structure across items. For response-style scales with common response categories across items (e.g., 1 = agree, 2 = neutral, 3 = disagree), the rating scale model expects scant variation in terms of step difficulty across the different items (Cole, Smith, Rabin, & Kaufman, 2004).

Partial Credit Model (PCM)

Like the rating scale model, the partial credit model (PCM) does not incorporate an item discrimination parameter (a parameter; all item slopes = 1), and the observed score of the respective participant is considered sufficient to examine score estimates (Masters, 1982). The PCM was initially developed to address item analysis for questions that have multiple steps, yet graded scoring required partial credit for steps in item completion (Masters, 1982). Given the parameter estimates of the PCM, Masters and Wright (1996) indicate that this model is appropriate for a broad range of polytomous measurements. An example of PCM procedures in test information could be a math exam item that allows partial credit for the use of the correct mathematical steps. The formula for the partial credit model is as follows:

P_ix(θ_s) = exp(Σ_{j=0}^{x} (θ_s - b_ij)) / Σ_{r=0}^{m} exp(Σ_{j=0}^{r} (θ_s - b_ij))    (2-7)

where the step difficulties b_ij are now estimated separately for each item. Since the partial credit model is an adjacent-categories approach (e.g., step 2 to step 3), the item progression assumption under this model means that each step progression is evaluated separately. For example, on a five-point Likert-style survey, the PCM analyzes each step (e.g., step 4 to step 5) individually rather than considering step difficulty as an accumulation (steps 1, 2, 3 to step 4). In addition, unlike non-Rasch-oriented models, the PCM parameter estimates do not indicate a 50% probability of responding above a category threshold; rather, the statistics indicate the relative difficulty of each step (Thorpe & Favia, 2012).

Generalized Partial Credit Model (GPCM)

An adaptation of the partial credit model is the generalized partial credit model (GPCM). Continuing to build upon the PCM concept of adjacent step categories, the GPCM is distinguishable from the partial credit model in that the GPCM incorporates a discrimination parameter that allows item slopes to vary across items. Despite this individual difference between the models, both the PCM and GPCM do not require step-progressive order in terms of response items. Therefore, the GPCM and PCM can have steps of varying difficulty, whereby the difficulty of a step on a five-response Likert-style scale could exceed that of a later step. The formula for the generalized partial credit model is as follows:

P_ix(θ_s) = exp(Σ_{j=0}^{x} a_i(θ_s - b_ij)) / Σ_{r=0}^{m} exp(Σ_{j=0}^{r} a_i(θ_s - b_ij))    (2-8)

Graded Response Model (GRM)

The final polytomous model under consideration is Samejima's (1969) graded response model (GRM). The graded response model includes a separate slope parameter for each item and an item response parameter (Muraki, 1990). The most commonly used IRT model for polytomous item responses is the graded response model (Baker, Rounds, & Zevon, 2000). According to Embretson and Reise (2000, pp. 97-98), in fitting the GRM to a specific measure, complications can arise in item parameter estimation or the subsequent parameter interpretation as a result of a measure's characteristics.

The GRM offers the most complete test item information by providing the most flexibility in item parameter dimensions (Hambleton & Swaminathan, 2010). The graded response model complies with the cumulative step approach in that the participant considers all previous item responses before considering the next item (steps 0, 1 to step 2). Unlike the partial credit model, which treats distinct thresholds within each item independently (Masters, 1982), the GRM considers the endorsement of a particular response alternative as requiring the successful accomplishment of all previous steps (Reckase, 2009). Analogous to the 2PL model, the GRM has one discrimination parameter and a set of difficulty parameters for each item: a between-category discrimination parameter and n - 1 threshold parameters, where n is the number of response categories. The discrimination parameter (a) indicates the shape of the category response curves, with higher discrimination parameters yielding steeper curves. Curves that are narrow and peaked indicate that the response categories differentiate well across theta. The difficulty parameters (b) represent item difficulty, which is the theta level at which individuals have a 50% probability of responding affirmatively to the more severe adjacent response category. For n response categories, the GRM estimates n - 1 difficulty parameters. For example, if an item has four response options (e.g., not effective, somewhat effective, effective, highly effective), the first b parameter represents the theta level at which individuals have a 50% probability of selecting a response above not effective (i.e., somewhat effective, effective, or highly effective). The following indicates the formula utilized for the graded response model:

P*_ix(θ_s) = 1 / (1 + exp[-a_i(θ_s - b_ix)]),  with  P_ix(θ_s) = P*_ix(θ_s) - P*_i,x+1(θ_s)    (2-9)

where P*_ix is the cumulative probability of responding in category x or higher.
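The divide-by-total versus difference distinction between these models can be sketched in a few lines. The functions below compute category probabilities for the GPCM of equation (2-8) (with the PCM as the special case a = 1) and the GRM of equation (2-9); all parameter values are hypothetical and for illustration only:

```python
import math

def gpcm_probs(theta, a, steps):
    """Divide-by-total (adjacent categories): generalized partial credit
    model. Setting a = 1.0 recovers the partial credit model (PCM)."""
    # Each category's numerator is exp of the running sum a * (theta - b_j).
    sums = [0.0]
    for b in steps:
        sums.append(sums[-1] + a * (theta - b))
    numers = [math.exp(s) for s in sums]
    total = sum(numers)
    return [n / total for n in numers]

def grm_probs(theta, a, thresholds):
    """Difference model: Samejima's graded response model. Each cumulative
    curve P*(X >= x) is a 2PL logistic; category probabilities are the
    differences of adjacent cumulative curves. Thresholds must be ordered."""
    cum = [1.0]
    cum += [1.0 / (1.0 + math.exp(-a * (theta - b))) for b in thresholds]
    cum.append(0.0)
    return [cum[x] - cum[x + 1] for x in range(len(thresholds) + 1)]

# Hypothetical four-category item (three steps/thresholds)
print(gpcm_probs(0.0, 1.0, [-1.0, 0.2, 1.1]))  # PCM: steps need not be ordered
print(grm_probs(0.0, 1.5, [-1.2, 0.1, 1.4]))   # GRM: ordered thresholds
```

Under the GRM, each threshold b_x marks the theta at which P(X >= x) = 0.5, matching the 50% interpretation of the b parameters described above; under the PCM/GPCM, the steps carry no such cumulative interpretation and may even be reversed in order.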


In contrast to the other IRT polytomous models, the graded response model allows the discrimination parameter to vary across items. Utilized in assessing personality traits and health-related outcomes, the GRM has become standard in validating and analyzing test instruments and specific items that address these areas. For example, in terms of personality domain analysis, the graded response model has been used to explore item content specificity for the NEO-PI Conscientiousness scale (Schmidt, Ryan, Stierwalt, & Powell, 1995). The NEO domain scale of personality assessment offers an analogous framework for understanding the underlying student learning outcome concepts of openness in intercultural communication and global competency. In addition, multidimensional item response theory models have been applied to related attitudinal measures (Hsieh, von Eye, & Maier, 2010).

When compared to alternative item response theory models, the graded response model offers several advantages for the present analysis. First, the GRM is less confining than the Rasch models (RSM and PCM) and, therefore, may be able to more adequately describe the item information, thereby preventing item alteration or dismissal as a result of inadequate information (Cock, Emons, Nefs, Pops, & Power, 2011). In addition, the GRM, similar to Rasch models, relies on logistic functions, but unlike the Rasch models, the GRM permits varying slopes across items (Hambleton & Swaminathan, 2010). Because Rasch models assume equal slopes across items, the RSM and PCM hold that all items in the instrument have the same discrimination power. In analyzing real data, the assumption of equal slopes across items can be unrealistic and can result in poor item fit conclusions based upon a lack of item discrimination information.

Application of Model to Current Study

Samejima's (1969) graded response model allows for a more robust analysis of assessment item information and allows the current study to examine multiple item parameter estimates. In particular, this study focuses on item discriminations, item difficulties, and item and model fit statistics. The first layer of the analysis uses classical test theory procedures to develop the overall assessment tool. By analyzing the contribution of specific items to the overall assessment instrument (discrimination levels), classical test theory provides vital statistics to determine which items are retained or eliminated from the final instrument. In addition to applying classical test theory principles, item response theory components and procedures are offered to further understanding of the item information. The use of item response theory allows for an understanding of global awareness and intercultural communication at both a macro (assessment instrument) and micro level (item discrimination, item difficulty).

Review of Assessment Instruments

Various assessments of global awareness and intercultural competency have been implemented at higher education institutions. These methods of intercultural assessment have included both direct methods (exams, essays, and portfolios) and more indirect methods (self-reporting surveys, journals). Despite the different assessment methods, researchers have investigated what international content experts believe incorporates the key components of intercultural competency (Deardorff, 2004). In a Delphi study analysis of 23 international experts, Deardorff (2004) found that these international experts agreed upon common definitional components of intercultural competency, including understanding and valuing cultural differences, experiencing other cultures, self-awareness of one's own culture (cultural differences), interdependence, effectiveness (e.g., interactions), and responsibilities. However, despite this definitional consensus, the experts did not agree upon the key components of an overall assessment of intercultural competency. In fact, Deardorff (2004) discovered that most international content experts believe that a comprehensive intercultural competency assessment model needs to incorporate both quantitative and qualitative methods. Despite the fact that greater attention has been given to the qualitative analysis of intercultural competency, international experts assert the importance of quantitative assessment, with 70% allowing quantitative measurement data for intercultural evaluation (Deardorff, 2004).

A developing way of understanding intercultural development has been through the use of student portfolios. Researchers have discovered that the use of portfolios provides an effective means of capturing student intercultural learning (Jacobson, Slecher, & Maureen, 1999, p. 470). In fact, student portfolios exhibited three distinct types of student learning: interactive, passive, and active (Jacobson, Slecher, & Maureen, 1999). In addition to portfolio analysis for student understanding, other research has used case studies of student feedback to assess intercultural knowledge. In assessing intercultural knowledge and the comprehensive internationalization process through case study analysis, Iupusa (2006) discovered in a mixed-methods study that there is often a mixed response among faculty and students toward the overall perceived institutional support of internationalization efforts, as well as a failure to adequately encourage students to participate in internationally focused student organizations. Whether through student reflection journals, portfolio analysis, or institutional case study evaluation, many postsecondary institutions have engaged in various assessment methods to better understand the effectiveness of international learning opportunities, especially programs and policies that reflect student growth in cultural and global knowledge.

Beyond student portfolios and case studies, postsecondary institutions have researched and implemented standardized intercultural assessment instruments. Despite the plethora of instruments to assess intercultural and global knowledge, most postsecondary institutions have used the following standardized assessment tools: the Behavioral Assessment Scale for Intercultural Communication (BASIC), the Intercultural Development Inventory (IDI), the Global Perspective Inventory (GPI), and the Cross-Cultural Adaptability Inventory (CCAI).

Intercultural Development Inventory

The most common standardized assessment instrument utilized by institutions of higher education is the Intercultural Development Inventory (IDI). The IDI consists of fifty items measuring intercultural sensitivity and growth as defined by the Developmental Model of Intercultural Sensitivity (DMIS) (Hammer & Bennett, 1997). The DMIS consists of three overarching domains: monocultural, transitional, and global/intercultural (Hammer & Bennett, 1997). The monocultural domain anticipates that an individual's perceptions of culture are dependent on views of their own culture. This domain consists of the two subcomponents of denial and polarization. The transitional domain (minimization) depends upon the ability to relate to other cultures by understanding cultural similarities, but it acknowledges that comprehending and adapting to cultural difference entails a more thorough understanding. Finally, the intercultural or global mindset consists of the two subdomains of acceptance and adaptation, and it involves understanding and empathizing with different cultural frameworks (Hammer, 2011).


Traditionally used to measure student intercultural development through participation in study abroad experiences, the IDI has also been implemented in the health and psychological fields (Altshuler, Sussman, & Kachur, 2003). The IDI is often administered at multiple time points to gauge change in intercultural development (Medina-López-Portillo, 2004). The IDI was validated with a sample of 4,763 individuals from 11 distinct cross-cultural groups (Hammer, 2010). In piloting the IDI, the developers conducted a comprehensive factor analysis, which exhibited strong correlations between the core orientations of the intercultural dimensions as well as a robust measure that allows for generalizability across cultures (Hammer, 2010). Other studies have investigated the validity of the IDI in terms of various intercultural models (Greenholtz, 2005; Medina-López-Portillo, 2004). In a mixed-methods analysis of the intercultural development of postsecondary students through a study abroad opportunity in Mexico, Medina-López-Portillo (2004) found that the IDI was highly reliable in relation to the concepts of the Developmental Model of Intercultural Sensitivity (DMIS), as well as indicating a potential intercultural stage progression for the respective students. Medina-López-Portillo (2004) does indicate some limitations of the IDI and suggests that further research is needed on the longitudinal use of this instrument.

Other research studies have illustrated the reliability and validity of IDI results (Engle & Engle, 2004; Medina-López-Portillo, 2004). Most of the research conducted on the Intercultural Development Inventory involves study abroad populations and assesses student intercultural development over specific time periods. For example, in a study by Paige, Cohen, and Shively (2004), study abroad students illustrated increases in the acceptance and adaptation constructs of the Intercultural Development Inventory. In addition, the researchers found the instrument to be reliable for this population (Paige, Cohen, & Shively, 2004). Although the Intercultural Development Inventory is often relied on by institutions of higher education, other intercultural assessment instruments, such as the Global Perspectives Inventory (GPI) and Cross-Cultural Adaptability Inventory (CCAI), have been utilized by postsecondary institutions, especially in the area of study abroad evaluation.

Global Perspective Inventory

The Global Perspectives Inventory consists of three different forms to measure general student, new student, and post-study-abroad student intercultural development. The GPI is based upon intercultural maturity theory (King & Baxter Magolda, 2005) and intercultural communication theory. The GPI is divided into three domains: cognitive, intrapersonal, and interpersonal; each domain contains two scales (see Appendix A).

The cognitive domain addresses one's knowledge and increasingly complex understanding of multiple cultural perspectives. This domain asks individuals to consider what they know and understand (Braskamp, Braskamp, & Merril, 2009). The intrapersonal domain addresses individuals' awareness of who they are and the integration of self-identity into their personhood. This domain asks individuals to consider their direction, purpose, and personal strengths and weaknesses. Within this domain, the scales attempt to measure how individuals view their development in terms of self-identity. The two scales within this domain are identity and affect (Braskamp, Braskamp, & Merril, 2009). Finally, the interpersonal domain asks individuals to consider how comfortable they are in relating to others from different cultural backgrounds, with potentially different acceptable norms, as they move from dependency to interdependence. The two scales within this domain are social responsibility and social interaction (Braskamp, Braskamp, & Merril, 2009).

Although the GPI has been administered at over fifty-five different postsecondary campuses, there has been a dearth of research investigating its methodology as well as that of the other international assessments (Samonte & Pastor, 2011). Previous research has analyzed this instrument from a factor analysis perspective (Samonte & Pastor, 2011; Engberg & Fox, 2011). Although factor analysis is a prevalent analytical method for determining whether patterns of item responses match the intended constructs, there are a variety of different types of validity evidence that should be analyzed for such instruments. In an exploratory factor analysis conducted by Samonte and Pastor (2011), the researchers discovered that while some of the GPI subscales measured the intended constructs, there were several potential adjustments needed at the item (wording revision) and scale level. These suggested changes included changing negatively worded items to positive wording, reanalyzing the instrument constructs, and reevaluating questions in relationship to the respective constructs (Samonte & Pastor, 2011). While the GPI has been administered at a number of different campuses, there is little evidence of the reliability of this instrument for large undergraduate populations.


Cross-Cultural Adaptability Inventory

An instrument designed to measure an individual's ability to adapt to different cultural environments and interact with members of other cultures is the Cross-Cultural Adaptability Inventory (CCAI). Developed by Kelley and Meyers (1993), the CCAI's four cross-cultural scales (emotional resilience, flexibility/openness, perceptual acuity, and personal autonomy) reflect key components of successful cross-cultural adaptation. The emotional resilience scale involves the ability to maintain a positive emotional state when confronted with cultural ambiguity or difference (Kelley & Meyers, 1993). The flexibility and openness scale measures the degree to which an individual values cultural openness, and the personal autonomy scale measures the degree to which an individual maintains a personal value system while at the same time respecting others and their value systems. The authors identified behavioral, emotional, and problem-solving skills that are related to successful cross-cultural adaptation. The Cross-Cultural Adaptability Inventory is thus designed to gauge an individual's readiness and openness to adapt to different cultural situations. In studies addressing such participants as physical therapists (Kraemer & Beckstead, 2003), foreign medical students (Majumdar, Keystone, & Cuttress, 1999), and behavioral health professionals (Stanhope, Soloman, Pernell-Arnold, Sands, & Bourjolly, 2005), the CCAI has exhibited adequate reliability. For example, when Kraemer and Beckstead (2003) investigated the cross-cultural adaptability of 228 graduate students enrolled in a physical therapy program, they found the CCAI to be internally consistent with an estimated reliability of .90 and to have the potential to predict future cross-cultural student adaptability (Kraemer & Beckstead, 2003).

As with other instruments (Davis et al., 2007), the validity and reliability of the CCAI have been investigated not only in different populations but also through various methodological procedures. In a factor analysis of the CCAI, Davis and Feeney (2007) discovered a poor fit of the four-factor scale and recommended more research and adaptation of the CCAI at both the construct and item level (Davis & Feeney, 2007). Hoffman (2002) likewise examined the CCAI as a measure of students' ability to adapt cross-culturally. In analyzing the reliability of the Cross-Cultural Adaptability Inventory, Tsai (1995) found a positive correlation between the student results and the CCAI scales over an intermediate period of time. In addition to analyzing the longitudinal effectiveness of the CCAI, researchers have investigated the effectiveness of the CCAI in predicting individuals' ability to adapt to cross-cultural situations (Ward, Berno, & Main, 2000). Ward, Berno, and Main (2000) examined the instrument in relation to participants' psychological and sociocultural distress overseas. Although the CCAI has demonstrated utility for certain populations, the questions and constructs of the CCAI do not address the specific competencies and student learning outcomes of this study.

Behavioral Assessment Scale for Intercultural Communication (BASIC) and Assessment of Intercultural Competence (AIC)

Other intercultural inventories include behavioral assessments such as the Behavioral Assessment Scale for Intercultural Communication (BASIC), developed by Koester and Olebe (1988), and the Assessment of Intercultural Competence (AIC). The BASIC assessment instrument measures cross-cultural behavioral adaptability through the coordination of eight behavioral scales (Fantini, 2009). This instrument was validated through its implementation with 263 college students (Fantini, 1995). In the validation of the BASIC instrument, researchers discovered that the eight-scale model was reliable and exhibits only one underlying trait (Olebe & Koester, 1989). The strengths of the Behavioral Assessment Scale are that it acknowledges universal dimensions of communication while recognizing cultural differences, allows ratings from multiple perspectives (participants, expert observers), and permits the assessment of current and future intercultural communication ability (Olebe & Koester, 1988). While the BASIC inventory focused on the behavioral components of cross-cultural behavior, the Assessment of Intercultural Competence (AIC) assesses intercultural competence and specifically addresses language proficiency (Fantini, 2009). Fantini (1995) held that the AIC, as a self-assessment instrument, provides a useful gauge of intercultural ability; a key component of the AIC is its ability to measure the longitudinal intercultural development of its participants. These two intercultural assessment instruments did not have the necessary validity and reliability measures to ensure significant results for larger student populations.

In reviewing these previous instruments as well as other professionally designed intercultural assessments, I found, in respect to this analysis, two major concerns: the psychometric properties of the instruments and the alignment of instrument constructs with established student learning outcomes. In the area of psychometric properties, the key principles investigated were the validity and reliability of the instruments. With such instruments as the Intercultural Development Inventory (IDI), Cross-Cultural Adaptability Inventory (CCAI), and the Global Perspectives Inventory (GPI), there have been several validity analyses of these instruments (Braskamp, Braskamp, & Merril, 2009; CCAI Research Bibliography, 2011; Hammer, 2011). Despite the numerous validity studies, many of the research studies did not involve large, diverse samples, and many implementations of the instruments were based upon programmatic areas (e.g., study abroad participants, physical therapy students) rather than at an institutional level (Samonte & Pastor, 2011; Stanhope, Soloman, Pernell-Arnold, Sands, & Bourjolly, 2005). For example, DeJaeghere and Cao (2009) investigated the reliability of the IDI in understanding the intercultural competence of U.S. K-12 teachers. Although the researchers illustrated adequate fit of the IDI dimensions, the sample size of 86 teachers would be substantially less than the sample utilized in the present study. Given that many validity studies addressed a sample size of less than 1,000 and specifically focused on postsecondary liberal arts institutions (Braskamp, Braskamp, & Merril, 2009), there was considerable concern about the reliability of these instruments when utilized in a diverse, research-extensive higher educational institution with a potential sample size of over 5,000. For example, in a research study conducted by Kraemer and Beckstead (2003) on 228 entry-level master's students in physical therapy, they found that the CCAI was a reliable instrument for assessing the cross-cultural knowledge of PT students. However, once again, the small sample size and program-focused demographics did not offer sufficient psychometric evidence for large-scale institutional use.

In addition to the psychometric concerns about the professionally designed instruments, the constructs of the assessments do not align specifically to the student learning outcomes. The overarching competencies of the student learning outcomes for this analysis consisted of global awareness and intercultural communication. From these competencies, three specific student learning outcomes address content knowledge, critical thinking skills, and communication skills in a global context. While many of the reviewed instruments address intercultural communication, few specifically delineate such skills as critical thinking and communication as measured outcomes of their assessment. For example, with the Global Perspectives Inventory, the focus of the inventory is on holistic student development in areas of cognitive, interpersonal, and intrapersonal development (Braskamp, Braskamp, & Merril, 2011). Although the GPI may offer constructs addressing communication skills, it did not provide specific construct parameters addressing the critical thinking student learning outcome. Similar to the GPI, many of the other professionally designed intercultural assessments did not fully align with the entire set of student learning outcomes.

Summary

As postsecondary institutions prepare students for the demands of globalization, these colleges and universities must develop a core definition of institutional internationalization and respective international efforts in order to develop appropriate assessment procedures. With no settled definition of internationalization (Deardorff, 2004), many institutions have placed little emphasis on the overall evaluation, and, more specifically, even less attention has been focused on the evaluation of student learning outcomes as a result of postsecondary internationalization (Deardorff, 2004). Upon settling on an operationally defined version of internationalization, postsecondary institutions have used professionally designed instruments or developed their own intercultural assessment instruments. In the process of developing locally designed assessment tools, researchers have suggested a comprehensive research agenda in the development, piloting, and validation of intercultural instruments (Dowd, Sawatzky, & Korn, 2011; Museus & Maramba, 2011; Nuñez, 2009; Tanaka, 2002).


Table 2-1. Differences between classical test theory and item response theory

Classical Test Theory                                 | Item Response Theory
Measures of precision are fixed for all scores.       | Precision measures vary across scores.
Longer scales of measurement increase reliability.    | Shorter, targeted scales of measurement can be equally reliable.
Test properties are sample dependent.                 | Test properties are sample independent.
Comparing respondents requires parallel scales.       | Different scales can be placed on a common metric.
Summed scores are on an ordinal scale of measurement. | Scores are on an interval scale of measurement.

*Reeves (2002), An Introduction to Modern Measurement Theory
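The first contrast in Table 2-1, precision that varies across scores, can be illustrated numerically. Under a 2PL IRT model, the item information function I(θ) = a²P(θ)(1 − P(θ)) peaks near the item's difficulty and shrinks toward the extremes of the trait, unlike the single fixed standard error of classical test theory. The parameters below are hypothetical:

```python
import math

def item_information(theta, a, b):
    """Fisher information for a 2PL dichotomous item: I = a^2 * P * (1 - P).
    Hypothetical parameters; illustrates IRT's score-dependent precision."""
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))
    return a * a * p * (1.0 - p)

# Information peaks at theta = b and falls off at the extremes, so the
# standard error of measurement varies across the latent trait.
for theta in (-2.0, 0.0, 2.0):
    print(theta, round(item_information(theta, a=1.5, b=0.0), 3))
```

In practice, summing these item information functions over an instrument yields the test information curve, showing where on the trait continuum the instrument measures most precisely.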


Figure 2-1. An Item Characteristic Curve for a Polytomous Item in the Construct of Intercultural Competency


CHAPTER 3
METHODOLOGY

The purpose of this chapter is to describe the research methodology utilized in this study investigating the item difficulties and discrimination of an internationalization assessment through item response theory procedures. This chapter addresses the previously outlined research questions and hypotheses guiding this study. Following the research aims and questions, this chapter includes a discussion of the data source and sample. After an overview of the data source and sample information, a specific section on instrument development provides details concerning instrument item generation, revision, piloting, and implementation. The subsequent section describes the respective item response theory terminology utilized in determining the items. Next, the statistical method employed to analyze the internationalization instrument items is specified. Finally, this chapter concludes with a discussion of the limitations of this study.

This study involved a methodological analysis at the core of the instrument development process and builds upon the instrument design by incorporating further instrument validation with item response theory procedures. In this methodological analysis, the study employs two phases of instrument development and analysis. In the first phase, the construction of the internationalization assessment focusing on critical thinking and communication skills was built upon classical test theory assumptions and procedures. In this phase, based upon the agreed student learning outcomes, item specification templates were constructed to define and generate specific items for the internationalization instrument. Coupled with the item specifications, this phase applied initial piloting of the instrument to determine which items to retain and which to eliminate. Utilizing classical test theory procedures, items were selected for implementation on the final internationalization assessment.
In the second phase of this study, specific item analysis was conducted utilizing item response theory assumptions and procedures. Based upon the data sources, specific item fit statistics in regards to item discrimination, difficulty, and participant latent ability were determined in order to distinguish which items fit within the specified instrument constructs. In utilizing these two phases, the goal is to develop a valid and reliable internationalization assessment that can be implemented over several time periods to evaluate student progress in the areas of cross-cultural critical thinking and communication.

The primary aim of this study is to investigate the validity and reliability of specific items of an internationalization assessment in order to determine undergraduate student development, specifically in the areas of critical thinking and communication skills. In addition, this study attempts to expand upon previous intercultural assessment studies in numerous ways. One expansion of this study involves incorporating feedback from a larger, research-based postsecondary institution. Many of the samples for intercultural instrument validity studies involve smaller student populations and are often limited to those students participating in study abroad programs (Hammer, 2011; Hoffman, 2002; Braskamp, Braskamp, & Merril, 2009; Koester, 1998). Secondly, this study utilizes a measurement model, item response theory specifically, which has not been previously applied to intercultural and internationalization assessment tools. Unlike the professionally designed internationalization assessments, many testing agencies, like the Educational Testing Service (ETS) and the Florida Department of Education with the Florida Comprehensive Assessment Test (FCAT), have utilized item response theory measures to ensure that the specific questions address the designated domains as well as the desired item difficulty and discrimination (Tang, 1996).
Although item response theory procedures, especially the graded response model, have mainly been utilized with dichotomous response scales, these procedures can provide further insights into the polytomous response scales found within intercultural and behavioral assessment instruments.

There are three primary questions guiding this empirical study:
1. What are the psychometric properties of an assessment of student self-perceptions of intercultural competency and global awareness (critical thinking and communication skills) in internationalization?
2. How well do specific items discriminate among student self-perceptions of intercultural competency and global awareness (critical thinking and communication skills) in regards to internationalization?
3. What are the difficulty levels of specific items measuring student self-perceptions of intercultural competency and global awareness (critical thinking and communication skills) in regards to internationalization?

Proposed Aims

The intent of this study is to demonstrate the procedure for developing an internationalization assessment tool utilizing classical test theory and for analyzing the specific psychometric properties of its items through Samejima's (1969) graded response model. In this study, the instrument and items address postsecondary undergraduate perceptions of critical thinking and communication skills in relationship to other cultures. Therefore, the two aims that guided the research study are:
1. To develop an internationalization assessment utilizing classical test theory in order to measure undergraduate students' critical thinking and communication skills in an international context.
2. To investigate and analyze the validity and reliability of specific items on an institutional internationalization instrument through the application of item response theory.

Analytical Methods

This analysis consisted of quantitative methods for both item generation for the assessment tool and the analysis and interpretation of item reliability. Data analysis was conducted in two phases: instrument generation and item analysis of discrimination and difficulty parameters. The phase one analysis of item correlations for instrument development is provided in this chapter, along with a detailed description of the instrument development procedure. For the phase two analysis, this chapter offers an overview of methodological procedures and considerations related to item response theory concepts as applied to item fit statistics. In the following chapter, the results of the phase two analysis, as well as more advanced data analysis, are provided with further explanation of the model's application to the undergraduate population.

The instrument data analysis provided in this chapter begins with the investigation and determination of the items retained and eliminated for the final instrument construction. In the item piloting, classical test theory measures were implemented, including a summary of student demographics in terms of gender and class level. Demographics were collected during the first piloting stage from the student participants within the comparative politics class; these basic demographics in gender and class level assisted in matching the population of interest (e.g., first-year students). Utilizing classical test theory procedures, an item and instrument reliability analysis was conducted using SPSS. In this reliability analysis, item-total score correlations were evaluated against a criterion of .25 (Crocker & Algina, 1986). Items that correlated with the total score above the .25 level were retained, while items below this level were not. In addition, items were analyzed by determining whether the overall instrument reliability improved with the removal of the item. Items eliminated under this standard were reviewed for content validity to determine whether they addressed relevant components of the critical thinking and communication student learning outcomes.
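The retention rule just described (keep an item only if its correlation with the total score clears .25, and check whether removing it would raise overall reliability) can be sketched as follows. The data here are simulated for illustration only (the study's responses are not reproduced), and the function names are my own:

```python
import numpy as np

# Simulated data: 200 respondents answer 6 five-point Likert items
# driven by a single latent trait plus noise.
rng = np.random.default_rng(42)
latent = rng.normal(size=(200, 1))
raw = latent + rng.normal(scale=1.0, size=(200, 6))
items = np.clip(np.round(raw + 3), 1, 5)  # map onto a 1-5 scale

def cronbach_alpha(x):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total-score variance)."""
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1)
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def corrected_item_total(x):
    """Correlation of each item with the total score excluding that item."""
    total = x.sum(axis=1)
    return np.array([np.corrcoef(x[:, j], total - x[:, j])[0, 1]
                     for j in range(x.shape[1])])

alpha = cronbach_alpha(items)
discriminations = corrected_item_total(items)
retained = discriminations >= 0.25   # the study's .25 retention threshold
```

The same two quantities (alpha and corrected item-total correlations) are what SPSS reports in its reliability procedure; the iterative eliminations described below simply repeat this computation after each removal.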
Once items were determined to contribute significantly to the overall internal consistency of the instrument, they were re-piloted on one form to analyze their reliability with a different population of undergraduate students at the same institution. The next section provides further description of the retained and eliminated items, with their corresponding item discrimination levels. The subsequent section addresses the piloting of the final instrument based upon Samejima's (1969) graded response model.

Item Specification and Instrument Development

The first phase of analysis involved the investigation and development of an assessment to ensure not only evaluation of the overarching competencies, global awareness and intercultural competency, but also, more specifically, to address critical thinking and communication skills. In this initial phase, various standardized professional intercultural instruments were reviewed and investigated. Comprised of eight international and psychometric content experts, an institutional internationalization assessment subcommittee reviewed seven of the most commonly implemented standardized intercultural instruments. Drawing on developer information presentations and individual evaluations completed by committee members, the review analyzed the following seven assessments for validity, reliability, content, and alignment with internationalization goals and outcomes: Global Perspectives Inventory (GPI), Intercultural Development Inventory (IDI), Global Competencies Inventory, Global Competence Aptitude Assessment (GCAA), Cross-Cultural Adaptability Inventory (CCAI), Global Awareness Profile, and Intercultural Effectiveness Scale (IES). The assessment committee determined that these instruments did not meet the standard of validity and reliability desired for the specific institutional population and did not effectively align with institutionally constructed student learning outcomes. From this instrument review, the determination was made to develop an intercultural assessment.


Based upon the decision to proceed with developing an institutionally focused assessment tool, the initial process consisted of developing item specifications to serve as a framework for item generation (e.g., item writing and content verification). With a focus on the two student learning outcomes (SLOs) of critical thinking and communication skills, relevant research literature was reviewed and analyzed to develop an operational definition for each of the respective SLOs (Ennis, 1985; Facione, 1995; Willingham, 2007). Emerging themes from the literature on critical thinking and communication skills led to the development of two SLO-specific acronyms. For the critical thinking concept, the acronym established was JARS, representing the critical thinking components of judgment, analysis, reasoning, and solution finding. For the communication skills concept, the acronym established was SPAAA, representing the communication components of sensitivity, production, awareness, acceptance, and adaptability. Based upon a comprehensive literature review, the operational definition of critical thinking skills consists of judgment (Case, 2005; Ennis, 1985; Facione, 1990; Lipman, 1988; Tindal & Nolet, 1995), analysis (Ennis, 1985; Facione, 1990; Halpern, 1998; Paul, 1992), reasoning (Ennis, 1985; Facione, 1990; Paul, 1992; Willingham, 2007), and solution finding (Ennis, 1985; Halpern, 1998; Willingham, 2007).
In relationship to communication skills, the operational definition for this student learning outcome consists of sensitivity (Olson & Kroeger, 2001; Ting-Toomey, 1999), production (Deardorff, 2006; Griffith & Harvey, 2000; Gudykunst, 1993; Sue, 2001; Ting-Toomey & Kurogi, 1998), awareness (Landis, Bennett, & Bennett, 2004; Paige, Jorstad, Paulson, Klein, & Colby, 1999; Storti, 1999; Storti & Bennhold-Samaan, 1998), acceptance (Dinges, 1983; Ting-Toomey, 1999), and adaptability (Dinges, 1983; Olson & Kroeger, 2001; Kim, 1991; Ting-Toomey, 1999). Appendix A and Appendix B provide the operational definitions of critical thinking and communication skills, respectively. To provide a template for constructing and aligning items with specific student learning outcomes, an item specification template was developed as a hybrid of the Davidson and Lynch model (2002) and the Florida Performing Fine Arts Assessment Project (Brophy, 2007). Within the item specification parameters, item generation consisted of identifying such concepts as the big idea, the enduring understanding, and the item benchmark. These concepts provided the description and goals for the wording and response attributes of each item. To further link the items to their respective item characteristics, the item specification lists the item types used to measure students' competencies as well as the content limits, which indicate the length of time estimated for each response. Finally, the item specification parameters offer specific examples of the item types (e.g., self-rating, assessment, behavioral). In Appendix C, the item specification template for the critical thinking student learning outcome offers ways to align items with the various components and potential response options of the SLO operational definition; Appendix D addresses the communication skills student learning outcome.

Item Piloting Phase One

Based upon the designed item specification templates, an item bank was generated consisting of 70 communication and 68 critical thinking Likert-response questions. In addition, the assessment team developed alternate forms of the items (critical thinking and communication) to be piloted. In conducting the first pilot analysis of the internationalization assessment tool, I conducted an item analysis by administering the instrument to a 2000-level Introduction to Comparative Politics class. The overall class enrollment consisted of 190 students, and from this implementation I received 128 responses.
Given the time constraints on taking the survey as well as the large number of items, I decided to analyze only the critical thinking questions, with the intent of conducting further analysis of the communication items in subsequent pilots. Students were given alternate forms (Form A and Form B), each containing the same ten anchor questions. The ten anchor items consisted of both critical thinking items (5) and communication skill items (5); these anchor items are listed in Appendix E. Both alternate forms opened with two student demographic questions addressing gender and class level, followed by a question identifying which alternate form the student was answering. In terms of internationalization questions, the first pilot phase placed the 10 anchor items at the beginning of the survey, followed by the critical thinking items and then the communication skill items. As previously stated, only the critical thinking items were analyzed in this pilot phase. For alternate Form A, the anchor items were the first 10 items, with another 23 items measuring critical thinking (33 items). In Appendix F, the critical thinking items for alternate Form A are listed in addition to the 35 communication skills items (total = 67 items). For alternate Form B, the anchor items were the first ten items, with another 25 items measuring the critical thinking student learning outcome (total = 35 items). In Appendix G, the critical thinking items for alternate Form B are listed, in addition to the 36 communication skills items (total = 71). The response scale for both instruments was a 5-point Likert scale (1 = Strongly Disagree, 2 = Disagree, 3 = Neutral, 4 = Agree, 5 = Strongly Agree). The distribution of the forms consisted of 73 responses for alternate Form A, with a male/female distribution of 29 and 44 respectively, and 55 responses for alternate Form B, with a male/female distribution of 20 and 35 respectively.
In addition, the distribution in terms of class level for both forms included 68 first-year students, 31 second-year students, 23 third-year students, and 6 fourth-year students (Table 3-1).


To determine item retention and elimination, classical test theory procedures were utilized for both alternate forms to assess the critical thinking items. For alternate Form A, questions were eliminated iteratively if they exhibited item discrimination levels (i.e., item-form correlations) less than .25 (Crocker & Algina, 1986). In the first round of elimination, these questions included Item 12, Item 18, Item 20, Item 29, Item 30, Item 34, and Item 35. A second analysis was run without these questions, and Items 14 and 24 were eliminated due to correlations lower than .25. In the third round of analysis, Item 26 was eliminated, also due to its lower discrimination level. The list of eliminated questions is found in Table 3-2. (Note: Items 20 and 29 were reverse coded due to their negative wording.) With the removal of the eliminated questions, the analysis resulted in a Cronbach's alpha of .901; the means of the alternate Form A critical thinking items, their corresponding standard deviations, and the item discriminations are found in Table 3-3. For alternate Form B, implementing the same item evaluation procedure, I eliminated critical thinking questions that exhibited discrimination less than .25. These eliminated questions included Item 12, Item 16, Item 17, Item 21, Item 24, Item 25, Item 26, Item 32, and Item 37. In addition, Items 29, 34, and 35 needed to be reverse coded and potentially reworded to offer consistently positive wording; once reverse coded, these items exhibited sufficient discrimination levels to be retained in the instrument. The list of eliminated critical thinking items is found in Table 3-4. A second analysis conducted with the remaining questions generated a Cronbach's alpha of .926, with all remaining critical thinking items above the .25 threshold.
The retained critical thinking items from alternate Form B, with their corresponding means, standard deviations, and item discriminations, are listed in Table 3-5.
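The reverse coding applied to the negatively worded items (e.g., Items 29, 34, and 35 on Form B) is a simple transformation on the 5-point scale; a minimal sketch:

```python
def reverse_code(response, scale_max=5):
    """Flip a Likert response so that a negatively worded item scores in the
    same direction as the rest of the scale: on a 1-5 scale, 1 -> 5 and 5 -> 1."""
    return scale_max + 1 - response

# A respondent's raw answers to a negatively worded item, recoded:
recoded = [reverse_code(r) for r in [1, 2, 3, 4, 5]]  # -> [5, 4, 3, 2, 1]
```

Without this recoding, a negatively worded item correlates negatively with the total score and would be spuriously flagged for elimination.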


Among the items that exhibited lower discriminations, a majority involved the respondent making a judgment based upon a cultural comparison. For example, Item 16 on Form B, a cultural comparison question, showed a result similar to its counterpart on Form A; these critical thinking items were eliminated. One recommendation would be to reword the cultural comparison questions, which could yield increased correlations. In addition, as will be further discussed in Chapter 5, it would be beneficial to further analyze these cultural comparison questions to understand how students respond to them, especially in light of their individual cultural identities. Listed by level of item discrimination, the critical thinking items retained for Form A and Form B appear in Table 3-6. The total pool of critical thinking items from the two alternate forms that were further piloted in the second phase consisted of 40 items.

Item Piloting Phase Two

This second phase of piloting analyzed the items with regard to the two factors of critical thinking and communication separately. The second stage of the piloting process consisted of investigating the reliability of the retained critical thinking items and the initial communication skill items. Utilizing two alternate forms (Form A and Form B) with consistent critical thinking items from the phase one pilot and different communication items, the forms were administered to undergraduate educational psychology students who had a research subject participation requirement as part of their coursework. For alternate Form A, the instrument consisted of the retained critical thinking questions from phase one piloting (40 items) and 36 items addressing the communication skills student learning outcome. Beginning with the student demographic questions of gender and class level, alternate Form A presents the communication skills items (3-38) followed by the 40 retained critical thinking items from pilot phase one (39-78). The items for alternate Form A are listed in Appendix H. For alternate Form B, the instrument consisted of the retained critical thinking items and 34 items (different from Form A) addressing the communication skills student learning outcome. Similar to alternate Form A, alternate Form B began with the student demographic questions (gender and class level), followed by the communication skill items (3-36) and the critical thinking items (37-76). The items for alternate Form B are listed in Appendix I. The research participants consisted of undergraduate students currently enrolled in the following courses: Introduction to Educational Psychology, Human Growth and Development, The Young Adolescent, and The Adolescent. Administered through an online survey tool, the responses numbered 68 students for alternate Form A and 70 students for alternate Form B. The research participation requirement yielded strong response rates; however, it did not require the inclusion of any student demographics in order to gain participation credit. Given the anonymity of the online assessment, this phase of piloting did not allow for a determination of gender and class level differences. Based upon the second piloting of the retained critical thinking questions, the item discriminations were analyzed as two separate scales of critical thinking and communication skills. Table 3-7 indicates the item discrimination levels for the critical thinking questions listed on alternate Form A; for these items, the combined analysis resulted in a Cronbach's alpha of .934. Table 3-8 lists the critical thinking questions with their respective discrimination levels for alternate Form B.
For alternate Form B, the critical thinking items together produced a reliability above .90.


Given the time constraints on the examination and the high discrimination levels of the first twelve items, administrators determined to limit the critical thinking items to the twelve most discriminating items. Each of the selected critical thinking items had a discrimination level above .60 among the 40 total items, and the alpha level of the instrument with all 40 items was .945. To align these twelve items with the student learning outcomes, a taskforce of international experts reviewed the items for validity. In addition, during the initial phase of the second part of the item piloting, the goal was to determine which communication skills items offered the most information to be retained for the final assessment. Table 3-9 lists the first pilot of the communication skills items with their respective discrimination levels for alternate Form A, and Table 3-10 offers the item discrimination levels for the communication skills questions on alternate Form B. While many items exhibited item-form correlations higher than the .25 threshold, some communication skills items were eliminated due to their lower discrimination levels (see Table 3-11). Similar to the critical thinking cultural comparison questions, many of the communication skills items addressed cultural comparison in the areas of cultural interactions and cultural perspectives. For example, one of the communication skills questions asked whether people of the respondent's culture are more accepting of cultural differences than people of other cultures; this item exhibited a markedly lower discrimination than most of the other communication items (.185). For this item, as well as other communication skills items assessing cultural differences, further research and potential rewording are needed to understand how communication styles are viewed through different cultural lenses.
Once the eliminated communication items were removed and the data reanalyzed, the list of retained communication skills items with their discrimination levels was determined (see Table 3-12).
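Retaining the most discriminating items, as was done for both scales, amounts to ranking items by their discrimination values and keeping the top k. A sketch with hypothetical item labels and values (not the study's actual estimates):

```python
# Hypothetical corrected item-total correlations keyed by item label.
discrims = {"CT01": 0.72, "CT02": 0.41, "CT03": 0.66,
            "CT04": 0.28, "CT05": 0.58, "CT06": 0.19}

def top_k_items(values, k):
    """Return the k item labels with the largest discrimination values."""
    return sorted(values, key=values.get, reverse=True)[:k]

selected = top_k_items(discrims, 3)  # -> ['CT01', 'CT03', 'CT05']
```

In the study, this ranking step was followed by an expert content-validity review, since a purely statistical selection could drop items that cover essential parts of the learning outcomes.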


Once the final retained critical thinking items were established (12 items), the communication items illustrating the largest discrimination values were re-piloted on a common form with the same population of students. The re-pilot of these communication skills items consisted of two student demographic questions (gender and class level) and 42 communication skills items (see Appendix J). The sample of postsecondary students for the communication skills items (n = 57) consisted of 93% females (53) and 7% males (4), with class levels divided between 36.8% first-year students (21), 24.6% second-year students (14), 24.6% third-year students (14), and 14% fourth-year students (8). Given the focus on the communication student learning outcome, critical thinking questions were not included in this phase of piloting. The second-phase list of retained communication items is found in Table 3-12. Based upon the re-pilot of the retained communication items, the overall instrument Cronbach's alpha was strong, at .925. To maintain consistency with the retained critical thinking scale, fourteen items were selected from among the items that together generated an alpha level of .90. These fourteen items were revised by both the assessment committee and the taskforce to ensure content validity. Based upon the item piloting phases described above, the final intercultural assessment instrument consisted of 12 items addressing the critical thinking student learning outcome and 14 items addressing the communication skills student learning outcome (total = 26). Table 3-13 outlines the final assessment tool, in which the first 12 survey items address critical thinking and the following 14 survey questions reflect communication skills in a global context. For the purpose of analysis and clarification, the items have been recoded based upon the major concept that each item addresses (see Table 3-14).
Throughout the rest of this paper, the items are referred to both by item number and by recoded variable. Following the development of the internationalization instrument, the second analysis, of specific item fit, is conducted utilizing item response theory measures, specifically the graded response model.

Graded Response Model Analysis

In this section, a detailed description is given of item response theory measurement as illustrated through the graded response model. The student questionnaire included 26 items measuring critical thinking and communication skills in cross-cultural settings. Utilizing the previously designed intercultural assessment instrument, the variables of gender and class level were recorded. The analysis of item fit statistics was conducted utilizing an exploratory factor analysis, model fit indices, and item response theory procedures (the graded response model). An exploratory factor analysis (EFA) using SPSS was conducted to determine item fit for the various factors. Given the dearth of analytical scrutiny on the instrument, an exploratory factor analysis is the preferred methodology when the quality of the items in relationship to their constructs has not been comprehensively investigated (Fabrigar, Wegener, MacCallum, & Strahan, 1999). An exploratory factor analysis assists in ensuring that the assumption of unidimensionality is met by analyzing the items for each construct as well as the overall item set (Norris & Lecavalier, 2010). In determining model fit and unidimensionality, Mplus version 7 was utilized in the estimation of the model. Two fit indices, the Root Mean Square Error of Approximation (RMSEA) and the Standardized Root Mean Square Residual (SRMR), were used to assess the fit of the factors and the local independence of the data. In terms of model fit, the RMSEA is a widely used index (Marsh, Hau, & Grayson, 2005; McDonald & Ho, 2002).
A key advantage of using RMSEA to determine model fit is that it allows for hypothesis testing, because its asymptotic distribution is known.
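One common point-estimate form of the RMSEA, computed from a model's chi-square, degrees of freedom, and sample size, can be sketched as follows. The fit statistics below are hypothetical (the study's Mplus output is not reproduced here), and the .06 cutoff follows Hu and Bentler (1999):

```python
from math import sqrt

def rmsea(chi2, df, n):
    """Point estimate of the Root Mean Square Error of Approximation:
    sqrt(max(chi2 - df, 0) / (df * (n - 1))). A model whose chi-square
    does not exceed its degrees of freedom yields 0."""
    return sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Hypothetical model: chi-square of 320 on 290 df with 862 respondents.
fit = rmsea(320.0, 290, 862)
adequate = fit <= 0.06  # the adequate-fit guideline adopted in this analysis
```

Because the point estimate has a known sampling distribution, one can also test whether the population RMSEA falls below a chosen bound, which is the close-fit testing discussed next.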


By utilizing this distribution, researchers are able to conduct hypothesis tests of close versus not-close fit rather than being limited to exact-fit measures (Browne & Cudeck, 1993; MacCallum, Browne, & Sugawara, 1996; Savalei, 2012). In addition, RMSEA is often used in structural equation modeling (SEM) as a supplementary fit method in order to accommodate larger sample sizes (Marais, 2007). Alongside the RMSEA, the Standardized Root Mean Square Residual (SRMR) was utilized to assess model fit. The SRMR summarizes how far the model-implied correlations fall from the observed correlations, with a value of 0 indicating perfect fit. The SRMR tends to be smaller for larger sample sizes and carries no penalty for model complexity (Kenny, 2012). Based upon guidelines established by Hu and Bentler (1999), this analysis adopted adequate fit values for RMSEA at or below .06 and good fit values for SRMR at or below .08.

In utilizing item response theory procedures, it is important to address several foundational model assumptions. One key assumption of IRT is that the underlying theta (latent trait) being assessed is unidimensional; item response theory defines this assumption as systematic covariance among the items (Orlando, 2007). To ensure that the assumption of unidimensionality is met, an exploratory factor analysis can be conducted for both the specific constructs (critical thinking and communication) and the overall instrument. A factor analysis is a prevalent analytical method for determining whether patterns of relationships among items conform to the instrument's hypothesized constructs, and it is often utilized when there has been a dearth of analytical scrutiny on the specific instrument (Fabrigar, Wegener, MacCallum, & Strahan, 1999).

Samejima's (1969) graded response model was then applied to the sample of undergraduate students; the next section addresses the data source characteristics in more detail. The data were analyzed using the graded response model (GRM) to determine estimated theta values, standard errors, item and person parameters, and item thresholds. The graded response model was utilized due to the comprehensiveness of its item information and its allowance for analysis of Likert-type responses across all the items (Madera, 2003). In addition, a key strength of the GRM is that it allows the relative difficulty between steps to vary across items. This flexibility in step difficulty is important because different questions, as well as different steps, have different levels of difficulty; allowing flexible step adjustment therefore provides more item information. In conducting an item fit analysis using R statistical packages, category response curves and item information functions were generated to understand the ability distribution for the respective items. In IRT analysis, item calibration is needed for both dichotomous and polytomous response scales, along with parameter estimation for each item under its IRT model (e.g., the graded response model). For this study, IRT item parameters for each factor were estimated under a two-parameter logistic graded response model, and each item was characterized in terms of its discrimination (a) and threshold (b) parameters. To determine both item discrimination and threshold indexes, PARSCALE 4.1 under the graded response model was utilized with a maximum inter-cycle change of .01. PARSCALE 4.1 utilizes marginal maximum likelihood (MML) estimation and assumes a normal distribution of theta, with a mean of 0 and a standard deviation of 1. There are four general approaches to the estimation of item response models: joint maximum likelihood, conditional maximum likelihood, marginal maximum likelihood, and Bayesian estimation (Johnson, 2007). Unlike the other approaches, marginal maximum likelihood estimates the marginal (i.e., aggregate) parameters that are the most likely to have generated the observed sample data (Bock & Aitkin, 1981). The benefits of marginal maximum likelihood are that it is considered unbiased, useful for estimating the mean and variance of a survey scale, asymptotically consistent, and efficient (Hortensius, 2012). In coordination with the factor analysis, the item calibration provided a more thorough understanding of each specific item's parameters.

Data Source

The data analyzed for this study were derived from undergraduate students at a large, public, research-extensive postsecondary institution. In the Fall of 2013, the assessment instrument was administered to undergraduate students enrolled in the following courses: GEO2242 Extreme Weather (126), ANT2000 General Anthropology (129), ANT2410 Cultural Anthropology (137), AST1002 Discover the Universe (104), CHM1025 Introduction to Chemistry (250), WIS Wildlife Ecology and Conservation (596), CLP2001 Personal Growth (144), and COM1000 Intro Communication Studies (187). Therefore, the total number of potential responses to be analyzed from these large-enrollment undergraduate courses was 1,673. In order to amass a response base sufficient to estimate item parameters, these classes were chosen for their large enrollments as well as to gain more first-year student perspectives. In addition to the access to first-year students, these interdisciplinary courses offer a broad perspective from students of various majors. Beyond the large undergraduate classes, this internationalization instrument was also piloted among undergraduate students from various majors prior to their engagement in a study abroad experience. Given the future role of the indirect instrument in assessing intercultural competency and global awareness across academic programs, it is important for the pilot sample to represent the target population of undergraduate students at a large, research-extensive university. Depending on the number of students participating in the survey, the intention is to analyze first-year student responses separately, both for validity development and to gain potential baseline data for further instrument implementations. Given the enrollment numbers of the previously stated courses and their multiple sections, the population of analysis could represent approximately 3,500 undergraduate students. The instrument was administered to the student population through the online survey service surveymonkey.com. The survey was implemented over a three-week period: a two-week survey window followed by a one-week reminder period. Since this study initiates the pilot phase of a longitudinal analysis of undergraduate intercultural and global development, implementation of the instrument commenced in the Fall of 2013 for the incoming class; the instrument will be administered annually to the same students in the Fall of 2014 and consecutive fall semesters until their graduation. However, this specific study addresses only the Fall 2013 implementation and the data generated from that population.
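As a compact illustration of the two-parameter graded response model used for the item calibration, the sketch below computes a single item's category probabilities from a discrimination parameter a and ordered thresholds b. The parameter values are hypothetical, not the study's PARSCALE estimates:

```python
import numpy as np

def grm_category_probs(theta, a, thresholds):
    """Samejima-style graded response model for one item: the cumulative
    probability of responding in category k or above is the logistic of
    a * (theta - b_k); category probabilities are differences of adjacent
    cumulative probabilities."""
    b = np.asarray(thresholds, dtype=float)
    cum = 1.0 / (1.0 + np.exp(-a * (theta - b)))   # P(X >= 2), ..., P(X >= K)
    upper = np.concatenate(([1.0], cum))           # P(X >= 1) is always 1
    lower = np.concatenate((cum, [0.0]))           # P(X >= K+1) is always 0
    return upper - lower                           # P(X = 1), ..., P(X = K)

# Hypothetical 5-point Likert item: a = 1.4, four ordered thresholds.
probs = grm_category_probs(theta=0.0, a=1.4, thresholds=[-1.5, -0.5, 0.4, 1.3])
```

Because each threshold enters separately, the spacing between category steps can differ from item to item, which is the flexibility in step difficulty noted above as a strength of the GRM.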


Data Sample

The sample used for this study contained only undergraduate students in various classes at a large, four-year, degree-granting research postsecondary institution located in the southeastern United States. The sample came from the following classes: GEO2242 Extreme Weather (126), ANT2000 General Anthropology (129), ANT2410 Cultural Anthropology (137), AST1002 Discover the Universe (104), CHM1025 Introduction to Chemistry (250), WIS Wildlife Ecology and Conservation (596), CLP2001 Personal Growth (144), and COM1000 Intro Communication Studies (187), as well as from a cohort of undergraduate students at the same institution who were about to participate in a study abroad program. The number of responses analyzed from the previously listed courses was 646, and the number of responses analyzed from the pre-study-abroad students was 216; therefore, the total number of responses utilized in this study was 862. Given that the survey population for the undergraduate courses was 1,673 and the number of responses from those courses was 646, the instrument response rate was 38.6%. For this study, it was determined that undergraduate students from various majors would be the focus of analysis, because the intended use of the internationalization instrument is a more longitudinal understanding of student development in the areas of intercultural competency and global awareness. Given that this analysis focuses on the piloting of an internationalization instrument, the main criterion is obtaining a large enough sample to conduct parameter estimation. Research indicates that a minimum of approximately 500 responses can provide baseline data for parameter estimates (Hays, Morales, & Reise, 2000; Madera, 2003); however, my goal in the analysis was to obtain a minimum of 1,000 responses.
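The reported course-based response rate follows directly from the counts above:

```python
responses = 646      # analyzed responses from the listed courses
population = 1673    # potential respondents across those courses
response_rate = responses / population  # ~0.386, i.e., the reported 38.6%
```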
Since this is a pilot analysis focused on the psychometric properties of the specific instrument items, the student demographic information does not


influence the investigated item discrimination and difficulty levels. However, for further analysis, student gender, class enrollment, and previous international experience (e.g., study abroad participation, internationally designated class enrollment) will be collected.

Limitations of Study

The primary limitation of this study is the population utilized for the pilot analysis. This study was conducted on first year undergraduate students at a large, public research extensive postsecondary institution in the southeast U.S. Given that the proposed instrument intends to measure undergraduate students at all class levels (second year, third year, etc.), the current population offers appropriate baseline data necessary to determine item parameter estimates. However, it would be beneficial to include cross-sectional data on other undergraduate students at various class levels. Another potential limitation is that the study included only cross-sectional data and not longitudinal data. Given the limited nature of cross-sectional data, analysis of item information statistics over several implementations could not be conducted. The findings of this study are representative only of first year undergraduate students at this specific institution, but the instrument development process can provide a guideline for other similar postsecondary institutions engaging in the assessment of internationalization efforts. There were also limitations based upon the methodological procedures. Although the graded response model (GRM) was applied to this study, other item response theory polytomous models were available. Depending upon the nature of the item parameters, it would be possible to analyze the item fit statistics under the Rasch models (Rating Scale and Partial Credit Models) as well as the Modified Partial Credit Model. In addition, an exploratory factor analysis (EFA) was conducted to determine whether the assumption of unidimensionality was achieved.
However, given that the assessment instrument had specifically defined constructs of


critical thinking and communication, with aligned items addressing these constructs, it would be possible to conduct a confirmatory factor analysis (CFA) to determine whether the items were related to the critical thinking and communication constructs.

Summary

This chapter provided an overview of the methodology that was utilized to guide this research study and assist in answering the stated research questions. The purpose of this study is to examine the development and psychometric properties of the assessment instrument. In addressing the research questions, the chapter described the data source and sampling method utilized to draw the study sample. Finally, the chapter concluded with specific limitations in the areas of population and analytical methods that resulted from this study.
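As background for the graded response model referenced in the limitations above, the GRM (Samejima, 1969) expresses the probability of each ordered response category as the difference of adjacent cumulative logistic curves. The sketch below is illustrative only; the discrimination and threshold values are hypothetical, not estimates from this study's data.

```python
import math

def grm_category_probs(theta, a, thresholds):
    """Graded response model: probability of each response category.

    theta      -- respondent trait level
    a          -- item discrimination
    thresholds -- ordered category boundary difficulties (b_1 < ... < b_{K-1})
    """
    # Cumulative probability of responding in category k or higher.
    p_star = [1.0]
    for b in thresholds:
        p_star.append(1.0 / (1.0 + math.exp(-a * (theta - b))))
    p_star.append(0.0)
    # Each category probability is the difference of adjacent cumulative curves.
    return [p_star[k] - p_star[k + 1] for k in range(len(thresholds) + 1)]

# Illustrative five-category Likert item with discrimination 1.2.
probs = grm_category_probs(theta=0.0, a=1.2, thresholds=[-2.0, -0.8, 0.5, 1.7])
```

Because the cumulative curves are ordered, the category probabilities are positive and sum to one for any trait level, which is what makes the model suitable for the five-point Likert items used in this instrument.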


Table 3-1. Critical Thinking Item Pilot Student Demographics (Phase 1)

                Form A   Form B
Gender
  Females           44       35
  Males             29       20
Class Level
  First Year        41       27
  Second Year       14       17
  Third Year        16        7
  Fourth Year        2        4

Table 3-2. Eliminated Critical Thinking Items, Alternate Form A (Phase 1)

Item  Question
12    Some cultures are better than others.
18    People from my culture work harder than people from other cultures.
14    When comparing cultures, there are some cultures that are better than others.
20    I do not have a culture.
24    As compared to other perspectives, my cultural point of view should be used to make decisions.
26    When comparing cultures, there are some cultures that are better than others.
29    It is not important to know about my cultural values.
30    I believe you can know someone well without knowing about their cultural beliefs.
34    Everyone has the same cultural values.
35    I depend on others to tell me how to determine my cultural beliefs.


Table 3-3. Retained Critical Thinking Item Properties, Alternate Form A (Phase 1)

Item     Mean   Standard Deviation   Discrimination
Item 4   4.12    .901                 .558
Item 5   3.89    .977                 .516
Item 6   4.11    .977                 .515
Item 7   3.92   1.076                 .622
Item 8   3.45   1.169                 .557
Item 9   4.18    .897                 .590
Item 10  3.71    .948                 .546
Item 11  3.97   1.071                 .617
Item 13  3.60    .949                 .525
Item 15  4.03    .975                 .628
Item 16  4.03    .923                 .710
Item 17  3.63   1.012                 .434
Item 19  3.37   1.028                 .440
Item 21  3.74   1.007                 .459
Item 22  4.06    .939                 .501
Item 23  3.65    .977                 .572
Item 25  3.65   1.026                 .568
Item 27  3.32   1.004                 .314
Item 28  3.66   1.130                 .419
Item 31  3.85    .884                 .688
Item 32  3.32    .825                 .343
Item 33  3.18   1.153                 .372
Item 36  3.82   1.222                 .441

Table 3-4. Eliminated Critical Thinking Items, Alternate Form B (Phase 1)

Item  Question
12    Some cultures are better than others.
16    Students from other cultures are more motivated than students from my culture.
17    My culture values critical thinking more than other cultures.
21    My personal cultural view should be used to make decisions.
24    It is acceptable for one culture to have more opportunities than another culture.
25    People from other cultures value family more than people from my culture.
26    I often question my own cultural views.
32    I discuss international issues with members of other cultures.
37    It is easy to determine right and wrong in different cultural settings.


Table 3-5. Retained Critical Thinking Item Properties, Alternate Form B (Phase 1)

Item     Mean   Standard Deviation   Discrimination
Item 4   4.04   1.205                 .631
Item 5   4.00   1.000                 .501
Item 6   4.07   1.074                 .657
Item 7   4.13    .842                 .698
Item 8   3.96   1.021                 .478
Item 9   4.42    .892                 .614
Item 10  4.07    .863                 .638
Item 11  4.07   1.031                 .564
Item 13  3.71    .991                 .587
Item 14  4.04    .796                 .685
Item 15  3.98   1.215                 .639
Item 18  3.40   1.176                 .556
Item 19  3.49   1.141                 .588
Item 20  3.33   1.168                 .276
Item 22  4.31    .848                 .719
Item 23  4.40    .863                 .765
Item 27  3.00    .953                 .279
Item 28  3.67    .905                 .480
Item 29  3.47   1.358                 .415
Item 30  3.76   1.246                 .592
Item 31  4.00    .953                 .791
Item 33  4.18    .912                 .450
Item 34  3.89   1.172                 .553
Item 35  4.51    .968                 .512
Item 36  3.11   1.133                 .257
Item 38  3.76   1.131                 .674


Table 3-6. Retained Item Discriminations from Critical Thinking Pilot Phase 1, Alternate Form A and Form B

Questions  Discrimination
1. I have been exposed to several cultural ways of thinking.  .791
2. I consider different viewpoints before making a decision.  .710
3. I consider different perspectives before making conclusions about the world.  .696
4. I am able to manage when faced with multiple cultural perspectives.  .687
5. I am open to different cultural ways of thinking in any international context.  .681
6. I can make effective decisions when placed in different cultural situations.  .666
7. Knowing about other cultural norms and beliefs is important to me.  .661
8. I am able to think critically to interpret global and intercultural issues.  .650
9. I actively learn about different cultural norms.  .649
10. Understanding different points of view is a priority to me.  .631
11. I can recognize how different cultures solve problems.  .630
12. I am open to other cultural ways of thinking.  .628
13. I can contrast important aspects of different cultures with my own.  .623
14. I am able to critically think to interpret global and intercultural issues.  .622
15. Knowing about other cultural beliefs is important.  .621
16. I am able to recognize how members of other cultures make decisions.  .612
17. I am able to analyze arguments utilizing different perspectives.  .594
18. Cultural diversity is an important part of the college experience.  .592
19. I understand how cultural beliefs and values influence decision making.  .590
20. It is important to know about my cultural values.  .568
21. Other people say that I am interested in other cultures.  .562
22. I feel comfortable working with international students.  .559
23. I am able to demonstrate knowledge of global and intercultural conditions and interdependences.  .557
24. When placed in different cultural situations, I ask students of other cultures questions about their culture.  .556
25. I can reflect on the impact of my decisions in different cultural settings.  .525
26. I understand why there is conflict among nations of different cultures.  .501
27. Cultural knowledge is an important job skill.  .495
28. I am able to understand what composes a culture.  .459
29. I challenge other people to understand different cultural perspectives.  .444
30. I am patient when managing in different cultural situations.  .440
31. I actively learn about different cultural norms.  .434
32. I am confident of my cultural beliefs and values.  .419
33. Not everyone has a culture.  .404
34. I feel confident I know how to act in most cultural situations.  .415
35.  .359
36. I work to bridge differences between cultures.  .343
37. I try to find solutions to cultural differences.  .314
38. I am able to assist others in discovering their cultural beliefs.  .279
39. When looking at global issues, I tend to view these issues through my cultural lens.  .276
40. Cultural diversity is an important part of the college experience.  .257


Table 3-7. Retained Critical Thinking Item Discrimination Levels, Alternate Form A (Phase 2)

Questions  Discrimination
I am open to different cultural ways of thinking in any international context.  .649
I consider different perspectives before making conclusions about the world.  .647
It is important to know about my cultural values.  .408
I am able to think critically to interpret global and intercultural issues.  .630
I am able to demonstrate knowledge of global and intercultural conditions and interdependences.  .521
I understand how cultural beliefs and values influence decision making.  .506
I can make effective decisions when placed in different cultural situations.  .603
I can reflect on the impact of my decisions in different cultural settings.  .574
I am open to other cultural ways of thinking.  .556
I consider different perspectives before making a decision.  .556
I am patient when managing in different cultural situations.  .615
I actively learn about different cultural norms.  .601
I am able to understand what composes a culture.  .510
I understand why there is conflict among nations of different cultures.  .297
I am able to recognize how members of other cultures make decisions.  .572
Knowing about other cultural norms and beliefs is important to me.  .677
I try to find solutions to cultural differences.  .402
I am able to manage when faced with multiple cultural perspectives.  .668
I work to bridge differences between cultures.  .345
Other people say that I am interested in other cultures.  .480
Cultural diversity is an important part of the college experience.  .557
I can contrast important aspects of different cultures with my own.  .601
Cultural knowledge is an important job skill.  .507
When placed in different cultural situations, I ask students of other cultures questions about their culture.  .339
I challenge other people to understand different cultural perspectives.  .635
When looking at global issues, I tend to view these issues through my cultural lens.  .295
I feel comfortable working with international students.  .503
Knowing about other cultural beliefs is important.  .613
I am able to assist others in discovering their cultural beliefs.  .391
I can recognize how different cultures solve problems.  .574
I am able to analyze arguments utilizing different perspectives.  .491
I have been exposed to several cultural ways of thinking.  .604
Understanding international news is important to understand other cultures.  .301
I feel confident I know how to act in most cultural situations.  .388
Understanding different points of view is a priority to me.  .559
Not everyone has a culture.  .336
There should be one universal culture.  .424


Table 3-8. Retained Critical Thinking Item Discrimination Levels, Alternate Form B (Phase 2)

Questions  Discrimination
I am open to different cultural ways of thinking in any international context.  .738
I consider different perspectives before making conclusions about the world.  .736
It is important to know about my cultural values.  .301
I am able to think critically to interpret global and intercultural issues.  .655
I am able to demonstrate knowledge of global and intercultural conditions and interdependences.  .675
I understand how cultural beliefs and values influence decision making.  .504
I can make effective decisions when placed in different cultural situations.  .701
It is important to know about my cultural values.  .467
I can reflect on the impact of my decisions in different cultural settings.  .554
I am open to other cultural ways of thinking.  .594
I consider different perspectives before making a decision.  .471
I actively learn about different cultural norms.  .708
I am able to understand what composes a culture.  .508
I understand why there is conflict among nations of different cultures.  .466
I am able to recognize how members of other cultures make decisions.  .655
Knowing about other cultural norms and beliefs is important to me.  .661
I try to find solutions to cultural differences.  .572
I am confident of my cultural beliefs and values.  .305
I am able to manage when faced with multiple cultural perspectives.  .691
I work to bridge differences between cultures.  .537
Other people say that I am interested in other cultures.  .664
Cultural diversity is an important part of the college experience.  .636
I can contrast important aspects of different cultures with my own.  .627
Cultural knowledge is an important job skill.  .491
When placed in different cultural situations, I ask students of other cultures questions about their culture.  .551
I challenge other people to understand different cultural perspectives.  .561
When looking at global issues, I tend to view these issues through my cultural lens.  .313
I feel comfortable working with international students.  .624
Knowing about other cultural beliefs is important.  .653
I am able to assist others in discovering their cultural beliefs.  .449
I can recognize how different cultures solve problems.  .696
I am able to analyze arguments utilizing different perspectives.  .414
I have been exposed to several cultural ways of thinking.  .559
Understanding international news is important to understand other cultures.  .318
There should be one universal culture.  .338
I feel confident I know how to act in most cultural situations.  .396
Understanding different points of view is a priority to me.  .705


Table 3-9. Communication Item Discrimination Levels, Alternate Form A (Phase 2)

Question  Discrimination
I can reflect on the impact of my decisions in different cultural settings.  .271
I am able to communicate effectively with members of other cultures.  .623
I am able to interact effectively with members of other cultures.  .690
I do not feel threatened when presented with multiple perspectives.  .633
I can reflect on the impact of my decisions.  .419
I feel comfortable discussing international issues.  .346
I can look at the world through the eyes of a person from another culture.  .351
I often have conversations with members of other cultures.  .756
I often ask questions about culture to members of other cultures.  .450
I appreciate members of other cultures teaching me about their culture.  .656
I feel confident I know how to act in most cultural situations.  .354
When I am in a conversation with a student from another culture, I often ask questions.  .356
I can clearly articulate my message to members of other cultures.  .441
When working on group projects with a student from another culture, I feel confident that I am able to collaborate with this student.  .486
I evaluate situations in my own culture based on my experiences and knowledge of other cultures.  .332
I have the ability to recognize cultural differences easily.  .644
I am patient when dealing with members of other cultures.  .565
I can clearly articulate my point of view to members of other cultures.  .508
I like working in groups with students from other countries.  .652
After a conversation about different cultural views, I reflect about the discussion.  .553
I feel comfortable in conversations that may involve cultural differences.  .441
When working on a group project, I enjoy collaborating with students from other countries.  .772
I demonstrate flexibility when interacting with members of another culture.  .670
When interacting with an international student, I am conscious of the cultural differences.  .431
I try to have friends from different cultural backgrounds.  .590
When analyzing global issues, I tend to view these issues only through my cultural perspective.  .300


Table 3-10. Retained Communication Item Discrimination Levels, Alternate Form B (Phase 2)

Question  Discrimination
I am able to communicate effectively with members of other cultures.  .399
I am able to interact effectively with members of other cultures.  .410
I prefer to socialize with people of my culture.  .289
I often have conversations with members of other cultures.  .652
I often ask questions about culture to members of other cultures.  .488
I appreciate members of other cultures teaching me about their culture.  .624
I feel confident I know how to act in most cultural situations.  .406
I could discuss and contrast various behavioral patterns in my culture with those of another culture.  .516
I am confident of my cultural beliefs.  .440
I demonstrate flexibility when interacting with persons from another culture.  .321
I am able to find commonalities between different cultures.  .454
I appreciate differences between cultures.  .452
I am able to adapt to different cultural situations.  .661
I enjoy learning about other cultures.  .575
It is important to interact with international students.  .530
When I do not understand  .330
I am aware of the social interactions of other cultures.  .340
Others would say that I interact with members of other cultures.  .654
I enjoy learning about other cultures.  .553
I am confident that I can adapt to different cultural environments.  .684
Being a global citizen is an important value in my life.  .592


Table 3-11. Eliminated Communication Items, Alternate Forms A and B (Phase 2)

Question  Discrimination
I do not feel threatened when presented with multiple perspectives.  .122
I feel uncomfortable in situations outside my cultural experiences.  .229
I can reflect on the impact of my decisions.  .159
I am sensitive to situations that involve a cultural misunderstanding.  .078
Too much emphasis is placed on cultural differences.  .008
When interacting with an international student, I am conscious of the cultural differences.  .121
Most of my friends have the same cultural perspective as mine.  .150
People are all the same despite cultural differences.  .024
People should interact with members of their own cultural group.  .130
I feel offended when someone imposes their cultural view on me.  .019
When presented with a different cultural perspective, I can adapt it to my way of thinking.  .208
I deal with my emotions and frustrations when interacting with a different culture.  .106
I have frequently been in situations where there was a cultural misunderstanding.  .062
People from my culture are more accepting of cultural differences than people of other cultures.  .185
When in different cultural situations, I adapt my behavior to fit in.  .033
 .054
I feel uncomfortable in situations outside my cultural experiences.  .120
I feel my cultural perspective is the most appropriate.  .127
I act differently when around students from other cultures.  .093
I believe you can know someone well without knowing about their cultural beliefs.  .237
I have the ability to recognize cultural differences easily.  .151
I am confident of my cultural beliefs.  .044
I believe you can know someone well without knowing their cultural beliefs.  .080


Table 3-12. Retained Communication Item Discrimination for Alternate Forms A and B (Phase 2)

Questions  Discrimination
1. I demonstrate flexibility when interacting with members of another culture.  .692
2. I prefer to socialize with people of my culture.  .662
3. I am confident that I can adapt to different cultural environments.  .643
4. I am able to communicate effectively with members of other cultures.  .632
5. I like working in groups with students from other countries.  .620
6. I feel comfortable in conversations that may involve cultural differences.  .616
7. When working on a group project, I enjoy collaborating with students from other countries.  .610
8. I often ask questions about culture to members of other cultures.  .602
9. I enjoy learning about other cultures.  .588
10. I appreciate members of other cultures teaching me about their culture.  .565
11. I am able to interact effectively with members of other cultures.  .554
12. I appreciate differences between cultures.  .542
13. I feel comfortable discussing international issues.  .541
14. I can clearly articulate my point of view to members of other cultures.  .538
15. I am able to adapt to different cultural situations.  .535
16. Others would say that I have friends from different countries.  .526
17. Others would say that I interact with members of other cultures.  .522
18. After a conversation about different cultural views, I reflect about the discussion.  .498
19. When working on group projects with a student from another culture, I include this student in project discussions.  .488
20. I can look at the world through the eyes of a person from another culture.  .487
21. I can clearly articulate my message to members of other cultures.  .481
22. I could discuss and contrast various behavioral patterns in my culture with those of another culture.  .476
23. When interacting with an international student, I am conscious of the cultural differences.  .457
24. I am patient when dealing with members of other cultures.  .449
25. I try to have friends from different cultural backgrounds.  .436
26. It is important to interact with international students.  .418
27. I often have conversations with members of other cultures.  .414
28. I do not feel threatened when presented with multiple perspectives.  .407
29.  .382
30. Being a global citizen is an important value in my life.  .371
31. I am able to find commonalities between different cultures.  .363
32. I evaluate situations in my own culture based on my experiences and knowledge of other cultures.  .360
33. When I am in a conversation with a student from another culture, I often ask questions.  .359
34. I feel confident I know how to act in most cultural situations.  .298
35. I am aware of the social interactions of other cultures.  .269


Table 3-13. Final Internationalization Assessment Items: Critical Thinking and Communication Skills

Critical Thinking
1. I consider different perspectives before making conclusions about the world.
2. I am able to manage when faced with multiple cultural perspectives.
3. I am open to different cultural ways of thinking in any international context.
4. I can make effective decisions when placed in different cultural situations.
5. Knowing about other cultural norms and beliefs is important to me.
6. I am able to think critically to interpret global and intercultural issues.
7. I actively learn about different cultural norms.
8. Understanding different points of view is a priority to me.
9. I can recognize how different cultures solve problems.
10. I can contrast important aspects of different cultures with my own.
11. Knowing about other cultural beliefs is important.
12. I am able to recognize how members of other cultures make decisions.

Communication Skills
13. I demonstrate flexibility when interacting with members of another culture.
14. I prefer to socialize with people of my culture.
15. I am confident that I can adapt to different cultural environments.
16. I am able to communicate effectively with members of other cultures.
17. I like working in groups with students from other countries.
18. I feel comfortable in conversations that may involve cultural differences.
19. When working on a group project, I enjoy collaborating with students from other countries.
20. I often ask questions about culture to members of other cultures.
21. I enjoy learning about other cultures.
22. I appreciate members of other cultures teaching me about their culture.
23. I am able to interact effectively with members of other cultures.
24. I appreciate differences between cultures.
25. I feel comfortable discussing international issues.
26. I can clearly articulate my point of view to members of other cultures.


Table 3-14. Recoded Variables of Final Internationalization Assessment Items: Critical Thinking and Communication Skills

Critical Thinking  Recoded Variable
I consider different perspectives before making conclusions about the world.  Perspectives
I am able to manage when faced with multiple cultural perspectives.  Multiperspect
I am open to different cultural ways of thinking in any international context.  Waythink
I can make effective decisions when placed in different cultural situations.  Effectdec
Knowing about other cultural norms and beliefs is important to me.  Knownorms
I am able to think critically to interpret global and intercultural issues.  Thinkcritical
I actively learn about different cultural norms.  Actlearn
Understanding different points of view is a priority to me.  Ptsview
I can recognize how different cultures solve problems.  Solvprob
I can contrast important aspects of different cultures with my own.  Contrcult
Knowing about other cultural beliefs is important.  Knowbeliefs
I am able to recognize how members of other cultures make decisions.  Recdec

Communication Skills  Recoded Variable
I demonstrate flexibility when interacting with members of another culture.  Flex
I prefer to socialize with people of my culture.  Social
I am confident that I can adapt to different cultural environments.  Adapt
I am able to communicate effectively with members of other cultures.  Commeffect
I like working in groups with students from other countries.  Grpwork
I feel comfortable in conversations that may involve cultural differences.  Comfortconv
When working on a group project, I enjoy collaborating with students from other countries.  Grpcollob
I often ask questions about culture to members of other cultures.  Askquest
I enjoy learning about other cultures.  Enjoylearn
I appreciate members of other cultures teaching me about their culture.  Teachcult
I am able to interact effectively with members of other cultures.  Inteffect
I appreciate differences between cultures.  Appdiff
I feel comfortable discussing international issues.  Intissues
I can clearly articulate my point of view to members of other cultures.  Articulate


CHAPTER 4
RESULTS

The purpose of this chapter is to present the findings from the methodological analyses that were conducted to investigate the primary research questions guiding this study. The results are presented in three main sections, moving from examination of the overall model fit to specific item information. The first section addresses the fit of the data in relationship to different factor models and dimensions. In this section, the results explain how many factors were analyzed in the model as well as the loading of various items in relationship to the specific factors. The next section analyzes the overall test information in order to determine which ability levels offer the most information for the internationalization assessment for critical thinking and communication skills. Given this emphasis on specific item calibration and item analysis, the final section offers a comprehensive investigation of all of the items in terms of item fit and item information. This chapter concludes with highlights of the results from the data analyses.

Model Fit

The first analysis procedure investigated the number of factors that fit within the model. Since researchers have identified the continuing difficulty of assessing the fit of IRT models (Reise, Widaman, & Pugh, 1993), this analysis utilized several methods to assist in determining the number of factors in the model and the fit of these factors based upon the data. In considering model fit, traditionally, individual items as well as data fit were analyzed to determine overall model fit (i.e., pattern of responses). Based upon the IRT assumption of local independence, it is assumed that individual item fit results in comprehensive model fit of the data. While statistical tests for goodness of fit (i.e., chi-squared fit statistics) are the most widely used in assessing model fit, this analysis does not utilize some of these fit statistics given their sample size sensitivity.
Chi-square fit statistics can often overstate model misfit,


especially within IRT polytomous models (Chernyshenko, Stark, Chan, Drasgow, & Williams, 2001). Therefore, the following methods were conducted to determine model fit: exploratory factor analysis and evaluation of the model based upon the fit indices of the Root Mean Square Error of Approximation (RMSEA) and the Standardized Root Mean Square Residual (SRMR). In determining whether the data fit the model of interest, an exploratory factor analysis was conducted as well as a factor analysis reviewing the various fit indices (e.g., RMSEA, SRMR). The exploratory factor analysis was analyzed up to four factors, using SPSS and MPlus. The results of the factor analysis showed that one factor contained 53.1% of the total variance, with the second factor incorporating another 6.1% of the total variance (59.2%). As illustrated in Table 4-1, the inclusion of any additional factors in the model after the first two factors did not provide a significant increase in the composition of the total variance. As illustrated by Figure 4-1, the scree plot indicates that a two-factor solution, which explained 59% of the variance, could be utilized, as a flattening effect occurs after the first two factors. In addition to investigating variance composition, a factor procedure was conducted to examine the eigenvalues of the extracted components. Based upon their respective eigenvalues, the results supported the retention of three factors with eigenvalues greater than 1.00 (Kaiser-Guttman rule). The respective factor loadings and proportions of variance are illustrated in Table 4-1. The Kaiser-Guttman rule of retaining all principal components with eigenvalues greater than one was utilized to yield this three-component solution (Kaiser, 1991).
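The Kaiser-Guttman retention rule described above amounts to a simple threshold on the eigenvalues of the item correlation matrix. The sketch below uses illustrative eigenvalues chosen to resemble the pattern reported here, not the study's actual estimates in Table 4-1.

```python
# Illustrative eigenvalues for a hypothetical 26-item correlation matrix.
eigenvalues = [13.8, 1.6, 1.1, 0.9, 0.7, 0.5]

# Kaiser-Guttman rule: retain every component with an eigenvalue above 1.00.
retained = [ev for ev in eigenvalues if ev > 1.0]

def variance_explained(evs, k, n_items=26):
    """Proportion of total variance captured by the first k components.

    For a correlation matrix, total variance equals the number of items.
    """
    return sum(evs[:k]) / n_items

print(len(retained))                                 # 3 components retained
print(round(variance_explained(eigenvalues, 1), 3))  # first-factor share
```

Because the third illustrative eigenvalue sits just above 1.00, the rule retains three components even though the scree pattern flattens after two, which mirrors the tension between the two criteria discussed in this section.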
Therefore, while the factor analysis indicated that a two-factor solution statistically contributed to the overall variance, the eigenvalue determination utilizing the Kaiser-Guttman principle resulted in a three-factor solution, with the third factor's eigenvalue much closer to the 1.00 baseline. Although the Kaiser-Guttman rule has been one of the most widely


used criteria for determining the number of principal components based upon eigenvalues (Costello & Osborne, 2005), research has shown that this rule can also tend to overestimate the number of factors, especially in data sets with an increased number of variables (Kaiser, 1991; Nunnally & Bernstein, 1994). Given the differences in model fit statistics and the sensitivity of the Kaiser-Guttman method, further analysis utilizing the fit indices of the Root Mean Square Error of Approximation (RMSEA) and the Standardized Root Mean Square Residual (SRMR) was conducted. The use of these fit indices provided more information in justifying a two-factor fit for the data set. Following Hu and Bentler's recommended cutoffs, an RMSEA of .08 or less and an SRMR of .06 or less were utilized as criteria for adequate fit. For one factor, the RMSEA indicated a less than adequate fit (.11), while the SRMR resulted in an adequate fit (.06). For two factors in the model, the SRMR also showed adequate model fit (SRMR = .042), and the two-factor RMSEA was much closer to the Hu and Bentler standard (RMSEA = .089). Although the two-factor RMSEA result falls outside the established standards, recent research (Hooper, Coughlan, & Mullen, 2008) has indicated that data resulting in an RMSEA of .10 can also indicate a mediocre fit. To further understand the model fits based upon the corresponding RMSEA, the 95% confidence intervals were analyzed based upon a standard of .05 for appropriate fit (Chernyshenko, Stark, Chan, Drasgow, & Williams, 2001). Similar to the respective RMSEA and SRMR results in terms of model fit, the RMSEA confidence intervals also illustrated a closer fit for the two-factor analysis, 95% CI [.085, .093], as compared to the one-factor analysis, 95% CI [.106, .113]. Therefore, given the exploratory factor analysis in combination with the RMSEA and SRMR results, a two-factor analysis for this study

PAGE 111

was conducted. Retaining the two factors for analysis fit appropriately within the model, given that the two constructs composing the internationalization assessment were critical thinking and communication skills.

Factor Loadings

Based upon the two-factor analysis, the analysis utilized both a varimax (orthogonal) and an oblique rotation. The protocol adopted for the principal component analysis was to rotate the matrix of loadings to obtain orthogonal (independent) factors (varimax rotation), and the analysis was then repeated using an oblique rotation. Although there were significant differences between the rotations, the results led to more conclusive interpretations using the orthogonal rotation. In addition, the goal of the analysis was to investigate the two factors as uncorrelated (orthogonal). The established parameters for item loading only included those items that had a loading greater than .30 on any factor. A factor loading cutoff of .30 or greater was utilized based upon research conducted by Hair, Tatham, Anderson, and Black (1998), who found that for sample sizes of 350 or more a .30 factor loading provided significant results. Given that the sample size for this analysis was over 800, the factor thresholds (Hair et al., 1998) could be appropriately applied to this sample. The internationalization assessment consisted of two factors, critical thinking and communication skills. The specific items that represented the critical thinking construct were Items 1 through 12, while the items comprising the communication skills construct consisted of Items 13 through 26. Therefore, the first factor in terms of the factor loading consisted of the critical thinking construct. For the first factor (critical thinking), the following items all loaded significantly: 1, 2, 3, 4, 6, 7, 8, 9, 10, 12, 13, 15, 16, 18, 25, and 26.
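As an illustration of the threshold rule described above, the following sketch (not part of the original analysis) applies the .30 cutoff to a handful of loadings reported in Table 4-2; the item numbers and loading values come from that table, while the helper function itself is hypothetical.

```python
# Hypothetical sketch: flag items whose loadings exceed the .30 threshold
# used in this study (Hair et al., 1998). Loadings are a subset of Table 4-2.
loadings = {
    # item: (factor 1 "critical thinking", factor 2 "communication")
    4:  (0.735, 0.031),
    5:  (0.240, 0.623),
    8:  (0.373, 0.412),   # cross-loading item
    14: (0.117, 0.256),   # loads on neither factor
    21: (0.016, 0.903),
}

THRESHOLD = 0.30

def classify(item_loadings, threshold=THRESHOLD):
    """Return the list of factors (1-indexed) an item loads on."""
    return [i + 1 for i, value in enumerate(item_loadings) if abs(value) >= threshold]

for item, pair in loadings.items():
    factors = classify(pair)
    label = ", ".join(str(f) for f in factors) if factors else "none"
    print(f"Item {item}: loads on factor(s) {label}")
```

Consistent with the discussion, this reproduces Item 8 loading on both factors and Item 14 loading on neither.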
For example, in relationship to the first factor, Item 4 (Effectdec) loaded at .756, indicating that students' cultural decision-making ability contained a significant proportion of the variance for critical thinking. Table 4-2 indicates that all the items incorporating the critical thinking construct, except for Item 5 (Knownorms) and Item 11 (Knowbeliefs), loaded on the first factor. Although Item 11, knowing about other cultural beliefs is important, did not load significantly on the critical thinking factor, its loading was close to the threshold cutoff (.256). In addition, Item 5 (Knownorms), knowing about other cultural norms and beliefs is important to me, failed to load on its factor of interest; both items did, however, load significantly on the second factor, communication (.610 and .591, respectively). Given that both Item 5 and Item 11 address similar sub-concepts of critical thinking, it would be beneficial to further analyze the wording of these questions. Given the difficulty students exhibited in articulating their own cultural values during the item development phase of this instrument, postsecondary institutions may benefit from providing various opportunities for students to explore their own definition of cultural values. The items that had the highest loadings on the critical thinking factor were Item 12 (Recdec) (.884), Item 9 (Solvprob) (.866), and Item 6 (Thinkcritical) (.788). Item 9 and Item 12 accounted for the highest proportion of the critical thinking factor. Given that recognition of the cultural decision-making process is vital for students to feel
comfortable in relating to different problem-solving methods (Cusher, 2008), it is important that these items provide a significant contribution to the critical thinking factor in order to lay a strong foundation for developing other critical thinking components that build upon cultural recognition. In addition to the highest loading items, there were several items that loaded at the lower end of the significance level for the critical thinking factor. For example, Item 8 (Ptsview) (.384) loaded lower than the other critical thinking items. This question specifically addresses a foundational belief about the value of understanding different points of view and had lower factor loadings on both the first and second factor (.412). Despite Item 8's lower loading on the critical thinking factor, all the other items illustrated a strong loading on this first factor. In regards to the second factor, communication skills, the following items loaded significantly: 1, 3, 5, 8, 11, 13, and 17 through 24. The items with a communication construct that failed to load significantly on the second factor were: Item 14 (Social) (.256), Item 15 (Adapt) (.264), Item 16 (Commeffect) (.155), Item 25 (.151), and Item 26 (Articulate) (.013). While Item 14 will be further addressed in the following sections, the items that failed to provide a significant proportion of the communication factor addressed such sub-constructs of communication as adaptation and articulation. Although these items did not load significantly on the communication factor, they did load moderately on the critical thinking factor. It is quite possible that respondents viewed cultural adaptation as a critical thinking component.
In other words, the responder could view the concept of cultural adaptation as incorporating a cognitive process whereby cultural recognition must first be applied before one can appropriately adapt to a different cultural situation. However, Item 16, which is directly related to communication skills, may require further investigation to understand the rationale behind its failure to load on its respective factor. Within the communication factor, the items that exhibited high factor loadings included Item 21 (Enjoylearn), Item 22 (Teachcult), addressing people of other cultures teaching me about their culture, and Item 24 (Appdiff), addressing appreciating differences between cultures. In contrast to the constructs that composed the highest proportion of the variance for critical thinking, the items that loaded the highest on the communication factor were dimensions that are foundational for communication skills development (e.g., cultural appreciation, learning about other cultures). As postsecondary institutions continue to develop internationalization efforts, providing opportunities for students to learn about other cultures can eliminate potential obstacles in internationalization efforts. There were five items that loaded on both factors: Item 3, being open to different cultural ways of thinking in any international context; Item 8, valuing different points of view; Item 13, flexibility when interacting with members of another culture; Item 18, comfort with cultural differences; and Item 23 (Inteffect). For the items which loaded on both factors, there are possible themes that could create overlap between the two factors. For example, Items 13 and 18, which loaded
on both factors, both address interacting with members of other cultures in order to exhibit a sense of comfort and flexibility in communication. Item 14 (Social) failed to load on either of the respective factors. This item, which addresses socializing with students of similar cultural backgrounds, loaded weakly on the first factor (.117) and only slightly stronger on the second factor (.256). As the only item of the assessment that required reverse coding, Item 14 (Social) also exhibited a higher residual variance level than all the other items (.961 as compared to an average of .4 for all other items). Given that the removal of Item 14 (Social) resulted in significantly increased levels of information over a broader range of ability levels, specific item loadings on both factors improved. For example, for Item 1, the removal of Item 14 (Social) increased the loading on factor one to .487 from a previous loading of .477. In fact, all items comprising the critical thinking dimension increased in loading on the first factor with the elimination of this item. Table 4-3 illustrates the respective item loadings on the critical thinking factor with and without Item 14 (Social). In the area of communication skills, with the removal of Item 14 (Social), a number of item loadings on the second factor increased significantly (see Table 4-4). For example, with the removal of Item 14, Item 15 (Adapt) strengthened in its loading on the communication factor from .264 to .434, and Item 26 (Articulate), which did not load significantly on the communication factor with the inclusion of Item 14 (.000), showed a strengthening of factor correlation with the removal of Item 14 (.592). However, with the removal of Item 14, some items did decrease in their strength of loading (e.g., Item 18 (Comfortconv) going from .399 to .375). Though there was a decrease in factor loading for such items as Item 18 (Comfortconv), this decrease did not result in a factor loading below the .30 threshold cutoff. In addition, the
total number of items loading on the second factor without Item 14 (Social) increased from 12 to 13. In addition, with the removal of Item 14 (Social), the RMSEA resulted in a slight increase (.093) while the SRMR resulted in a slight decrease (.043).

Instrument Information

In classical test theory, the precision of an instrument is summarized by a single standard error of measurement (SEm). In item response theory, by contrast, the amount of information provided by the instrument varies across the range of ability (Thorpe et al., 2007, p. 179). Overall, for both the factors of critical thinking and communication skills, the internationalization assessment had a vast range of theta levels and information provided by each survey item. In terms of theta levels, the range for item difficulties was from a low of -3.9 to a high of 1.6. The instrument information showed the highest levels of information (slope peaks) at the ability levels of -2.5, -1, and 0. These lower theta levels do not necessarily indicate that the participating students have lower ability levels but rather that it requires a lower ability level in either critical thinking or communication skills to endorse a specific response category. In Likert-scaled polytomous instruments, the movement of the information function and category response curves to the left side of the theta spectrum is often the result of high levels of agreement with the items. Given the lower range of theta levels, many items required lower ability levels in the areas of critical thinking and communication to endorse a specific response category. For example, for Item 1, which addresses considering different perspectives before making conclusions about the world, a student taking the internationalization instrument would need only a low critical thinking level to endorse the item. This pattern was even more evident in items such as Item 17,
for which the ability level regarding the value of group work was a -3.5. Further information regarding the specific item ability levels is discussed in the item information and category response sections. Given that two factors comprised the majority of the variance, the test information for the first factor of critical thinking and for the second factor of communication skills was computed separately. The first factor, critical thinking, exhibited an information level of about 20 for the ability levels of -3 to -1. The item information level begins to decrease at a theta level of .5 (see Figure 4-2). In the critical thinking factor, the highest amount of information came from the ability levels of -2.5, -1.0, and 0. Therefore, the critical thinking items provide the most discrimination among students at these ability levels. While these ability levels coincided with the overall instrument information ability levels, the amount of information offered by the critical thinking factor reached a climax of approximately 20, compared to an overall instrument information level of approximately 35. Despite the differences in information levels, the critical thinking factor still provided a significant information range. For the second factor, communication skills, the information level resulted in significantly higher information, I(-3) = 40, at a smaller ability level range (around -3) than the overall test information function and the critical thinking factor (see Figure 4-3). This limited item information range shows that the test information for the communication factor fails to provide information over a broad ability range and only provided significant information for students who have a low ability level of -3. Since Item 14 (Social) failed to load on either factor, the test information function was reanalyzed with this item removed. In removing Item 14 (Social), the test information function for communication significantly improved (see Figure 4-4).
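The behavior described here follows from a basic IRT identity: test information at a given theta is the sum of the item information functions. The sketch below illustrates, with hypothetical 2PL-style information curves rather than the study's actual GRM estimates, why removing a near-flat item such as Item 14 barely reduces total information.

```python
import numpy as np

# Illustrative sketch (not the study's actual curves): the test information
# function is the pointwise sum of item information functions, so a weakly
# discriminating item contributes almost nothing to the total.
theta = np.linspace(-4, 2, 121)

def item_info(a, b, theta):
    """Toy bell-shaped information curve centered at b with peak a**2/4
    (the 2PL maximum); stands in for the GRM item information here."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return a**2 * p * (1.0 - p)

# Hypothetical parameters loosely in the reported slope range (.27 to 3.1);
# the last entry mimics a weak, Item 14-like item.
items = [(2.9, -2.3), (3.1, -2.0), (2.9, -1.0), (0.27, -1.0)]

total = sum(item_info(a, b, theta) for a, b in items)
without_weak = sum(item_info(a, b, theta) for a, b in items[:-1])

# The weak item's contribution to peak information is tiny
print(round(total.max() - without_weak.max(), 3))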
With the elimination of Item 14 (Social), the information level for the second factor, the communication
factor, illustrated a more consistent item information range over a broader range of ability (-2 to .5). In regards to the communication ability levels, the most information came at the ability levels of -2.5, -1.0, and 0, with information levels of approximately 27, 23, and 21. The removal of Item 14 resulted in an information ability range and levels more parallel to the overall instrument information levels.

Item Analysis

Item Calibration

Based upon the factor analysis, which resulted in the two factors of critical thinking and communication skills, the IRT item calibration was performed to determine the estimated item parameters for each item on its corresponding factor. The number of threshold parameters remained consistent across the items in the data convergence process. Despite the higher response distributions for certain items (e.g., Item 22 with 86% agreement), none of the response categories needed to be collapsed to calibrate the items. The item parameter estimates for all the critical thinking and communication items are in Appendix M. These item parameter estimates resulted in a theta range of -3.9 to 1.6, with Item 16 (Commeffect) containing the lowest ability level and Item 12 (Recdec) illustrating the highest theta. In terms of discrimination levels, the item slopes ranged from a low of .27 (Item 14, Social) to a high of 3.1 (Item 24, Appdiff). The item parameters and item graphics showed that for factor one (critical thinking) the items provided good measurement among students with lower estimated values, and the items for the second factor (communication) showed good measurement among lower estimated values as well but illustrated lower probability levels in the transition between agreement levels (agree to strongly agree).
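To make the calibration output concrete, the following sketch evaluates Samejima's graded response model at a single theta using the discrimination and threshold estimates reported for Item 24 (Appdiff): a = 3.1 and b = (-3.0, -2.1, -1.0, 0.5), where the negative threshold signs are assumed from the reported theta range.

```python
import math

# Sketch of Samejima's graded response model using the reported Item 24
# (Appdiff) estimates: a = 3.1, b = (-3.0, -2.1, -1.0, 0.5). The negative
# signs on the thresholds are assumed from context (theta range -3.9 to 1.6).
A_24 = 3.1
B_24 = [-3.0, -2.1, -1.0, 0.5]

def boundary(theta, a, b):
    """P*(theta): probability of responding in category k or higher."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def category_probs(theta, a=A_24, b=B_24):
    """Probabilities of each of the five ordered response categories."""
    stars = [1.0] + [boundary(theta, a, bk) for bk in b] + [0.0]
    return [stars[k] - stars[k + 1] for k in range(len(stars) - 1)]

probs = category_probs(0.0)
print([round(p, 3) for p in probs])   # five category probabilities
print(round(sum(probs), 3))           # probabilities sum to 1
```

At theta = 0 the model concentrates probability in the upper agreement categories, mirroring the pattern described for the communication items.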


Item Information

The item information provides a measure of reliability for each item within the instrument. Reliability addresses the amount of error in an overall test or item. While classical test theory provides one reliability measure based on true and error variance in observed scores for the instrument, item response theory (IRT) assumes that each item contributes its own information: the amount of information an item offers depends on the steepness of its category response curves (its discrimination), while the location of that information along the theta scale depends on the item's threshold parameters. Therefore, the item information indicates how well each score is being estimated at a given ability level (Favia, Comins, & Thorpe, 2012, p. 6). In understanding information levels, items that are typically lower than 1.0 are generally less informative (Favia, Comins, & Thorpe, 2012). For this analysis, the item information levels were reported in correspondence with their factors. The item information scale shows the relationship between the amount of information that an item provides and the ability level required for the respective items. In analyzing the highest levels of item information, the range of item information extends from an insignificant information level for Item 14 (Social) to items that provide significant information, such as Item 21 (I(-2.3) = 2.6). The items that exhibited higher levels of item information at their highest theta levels are listed in Table 4-5. In the critical thinking factor, the items that were the most informative at their corresponding theta levels were Item 3 (Waythink), Item 5 (Knownorms), Item 10 (Contrcult), and Item 11 (Knowbeliefs). For example, with Item 3 (Waythink), the information levels were the highest at I(-2.5) = 2.0, I(-1.0) = 1.5, and I(0) = 1.4 on the first factor, critical thinking.
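The item information levels reported here can be sketched numerically. Under the graded response model, item information satisfies I(theta) = sum over categories of (P_k'(theta))^2 / P_k(theta); the hedged example below estimates the derivatives by finite differences, using the reported Item 5 (Knownorms) discrimination (a = 2.9) with thresholds whose negative signs are assumed from context.

```python
import math

# Hedged sketch: numerically estimate GRM item information via the identity
# I(theta) = sum_k (dP_k/dtheta)**2 / P_k(theta), with central finite
# differences for the derivatives. Parameters use the reported Item 5
# discrimination (a = 2.9); threshold signs are assumed from context.
A, B = 2.9, [-2.2, -2.0, -1.0, 0.01]

def category_probs(theta, a=A, b=B):
    stars = [1.0] + [1.0 / (1.0 + math.exp(-a * (theta - bk))) for bk in b] + [0.0]
    return [stars[k] - stars[k + 1] for k in range(len(stars) - 1)]

def item_info(theta, h=1e-5):
    lo, hi = category_probs(theta - h), category_probs(theta + h)
    mid = category_probs(theta)
    info = 0.0
    for k in range(len(mid)):
        d = (hi[k] - lo[k]) / (2 * h)   # numerical dP_k/dtheta
        if mid[k] > 1e-12:              # near-zero categories contribute ~nothing
            info += d * d / mid[k]
    return info

# Information concentrates in the low-theta region, as reported for Item 5
print(item_info(-2.0) > item_info(1.5))
```

This mirrors the pattern described in the text: information peaks near the (low) threshold locations and falls away at higher ability levels.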


For the communication dimension, the items that were the most informative at their theta levels were Item 21 (Enjoylearn), Item 22 (Teachcult), Item 23 (Inteffect), and Item 24 (Appdiff). As illustrated in Figure 4-5, Item 21 (Enjoylearn), addressing enjoyment of learning about other cultures, reached an information level of I(-2.3) = 2.6 at an ability level of -2.3. From the ability range of -2.3 to approximately 0, Item 21 (Enjoylearn) provides the most information, with additional peak information levels of 2.2 and 2.3 at the ability levels of -1 and 0, respectively. Similar to Item 21, Item 24 (Appdiff), addressing appreciating differences between cultures, exhibited high levels of information, I(-3) = 2.6, and was able to more consistently maintain those higher information levels over a wider range of ability levels (-3 to -.5). Even with the given ability levels for Item 24 (Appdiff), there is much more volatility in its item information levels. For example, while in the ability range of -3 to -1 the information was considerably high, it then dropped substantially to below 2.0 at a theta of -2.0. This pattern of extreme information highs and lows is consistent further along the theta range (see Figure 4-6). While Item 21 (Enjoylearn) and Item 24 (Appdiff) exhibited the highest information levels for the communication dimension, Item 5 (Knownorms) and Item 11 (Knowbeliefs) exhibited their highest information levels of 2.5 at ability levels of -2.5 and -2.2, respectively. For Item 5 (Knownorms), the information function illustrated consistently higher levels of information over the range of ability levels (see Figure 4-7). Similar to Item 5 (Knownorms), Item 11 (Knowbeliefs) also demonstrates consistently less volatile information levels within its theta range (see Figure 4-8).


In terms of items with less information, Item 15 (Adapt) (I(-3.0) = .91) and Item 16 (Commeffect) (I(-2.1) = .8) both exhibited highest information levels close to the 1.0 threshold for item information. In terms of extreme item information, however, Item 14 (Social), addressing socializing with students of similar cultural backgrounds, illustrated a question that failed to offer information at any ability level. Unlike the other instrument questions, which have information levels as high as 2.0, the Item 14 (Social) information range is very slight, with a low of .02215 and a high of .02235 (see Figure 4-9).

Category Response Curves (CRC)

The category response curves (CRC) indicate the amount of information for each step within the item as well as the respective ability levels needed for each step progression. The key characteristics to analyze for the category response curves are the slopes of the various curves, which indicate how much information each step has to offer, and the respective theta levels, which indicate the ability level needed to transition from one step to the next. For example, for Item 12 (Recdec) (see Figure 4-10), the CRCs show that it takes approximately a critical thinking ability level of -1.0 for a student to endorse the highest response categories. Even with Item 12 (Recdec) illustrating one of the lower discrimination levels (a12 = 1.9), the majority of the items within both factors (critical thinking and communication skills) illustrated significant discrimination levels (all items except Item 14 above 1.0). As illustrated in the item information functions and item parameter estimates, the CRCs had ability levels on the lower end of the theta range, consisting of abilities between -3.9 and 1.6, with most of the items' ability levels congregating in the -3 to .5 range. Despite the lower ability levels for most of the items, there were several items which exhibited significant amounts of item information.
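The step-transition interpretation used here can be verified directly: in the graded response model, each threshold parameter is by definition the theta at which the corresponding cumulative ("category k or higher") curve crosses a 50% probability. The sketch below uses the reported Item 12 (Recdec) discrimination (a = 1.9) with illustrative, hypothetical threshold values.

```python
import math

# Sketch of the CRC threshold interpretation: under the graded response
# model, P*(theta = b_k) = .50 exactly. The discrimination is the reported
# Item 12 (Recdec) value; the thresholds below are hypothetical.
a = 1.9
b = [-2.6, -2.0, -1.0, 0.2]   # hypothetical monotone thresholds

def p_star(theta, bk):
    """Probability of endorsing category k or higher."""
    return 1.0 / (1.0 + math.exp(-a * (theta - bk)))

for bk in b:
    print(f"P*(theta = {bk:+.1f}) = {p_star(bk, bk):.2f}")   # each prints 0.50
```

Reading a CRC plot, then, amounts to locating these 50% crossing points; the steepness of each crossing is governed by the discrimination a.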
As illustrated in the item information function for Item


21 (Enjoylearn), the category response curves exhibited steeper slopes in the ability range from -3 to -.5. These steeper slopes (a21 = 2.9) illustrate more discrimination of information for the respective item. In Item 21 (Enjoylearn), the steepness of the slope corresponds with the item information function. In analyzing the CRCs for Item 21 (Enjoylearn), the steepest curves occurred for the transitions at the lower end of the ability range (b21 = -2.5, -2.2, -1.5, 0.1). A key distinction of the CRCs for this item is the flat transition from strongly disagree to disagree; for Enjoylearn, these lowest categories offered little information relative to the other selections (see Figure 4-11). One possible solution is collapsing these categories (strongly disagree and disagree) in the analysis to see if more information can be obtained from these items. Item 24 (Appdiff) illustrated significant information as well as substantial steepness (a24 = 3.1) in the information slope. Within the ability level range for Appdiff, all four steps exhibited a 50% probability of item endorsement in accordance with step progression. In fact, with Item 24 (Appdiff), students at the upper end of the ability range had an 80% probability of endorsing the highest response category (see Figure 4-12) (b24 = -3.0, -2.1, -1.0, 0.5). Item 5 (Knownorms) (a5 = 2.9) illustrated significant information within its category response curves (see Figure 4-13). For all the different possible selections, Knownorms exhibited steepness in each of its curves. In fact, at an ability level of approximately


theta = 0, the probability of a student selecting the highest response category exceeded that of the other options (b5 = -2.2, -2.0, -1.0, .01). Following the pattern of their item information function, the CRCs for Item 11 (Knowbeliefs) (a11 = 2.9) offered considerable steepness in the slope of each of the possible selections. Similar to Item 5 (Knownorms), Item 11 (Knowbeliefs) showed that each higher level of agreement was endorsed at a progressively higher ability level (see Figure 4-14) (b11 = -2.3, -2.0, -1.5, 0.0). Consistent with the factor analysis and item information results, Item 14 (Social) illustrated a lower discrimination level (slope) for all categories (a14 = .27). This lower discrimination level for Item 14 means that this item does not offer enough information to distinguish between students of differing ability levels. In addition, this item fails to illustrate a progression in ability level for the ordered category responses (b14 = 1.0, 1.0, 1.0, 1.0). By having the same communication skills ability level for all categories, this item indicates that all students are at the same ability level and that it does not take an increased ability in communication skills to endorse a higher response category. While a lower discrimination level and a lack of progressive ability levels do not necessarily mean an item is not valid, they do indicate that the item contributes little measurement information to the instrument.

Summary

This chapter presented findings from the internationalization instrument pilot given to over 800 undergraduate students enrolled in first-year courses at a research extensive university located in the southeastern United States. Initially, the results addressed how adequately the data fit the model. From the model analysis, the results showed that two factors (critical thinking and


communication skills) appropriately addressed the data. Following the model fit interpretation, factor loadings of the items in relationship to their respective factors were addressed. In the area of critical thinking factor loading, all the critical thinking items loaded on this factor except for Items 5 and 11. In regards to the factor loadings for communication skills, the majority of these items significantly loaded on the communication factor, except for Items 14 through 16 and Items 25 and 26. There were several items that loaded on both factors (i.e., Items 3, 8, 13, 18, and 23) and one item, Item 14 (Social), which did not load on either factor. Next, the chapter provided the results of the overall instrument information as well as the information for both the critical thinking and communication skills factors. The overall instrument provided a significant amount of information, and the critical thinking information function also provided information to benefit the implementation of the instrument. However, a key finding within the communication factor was that the items illustrated significantly more information with the removal of Item 14 (Social) than with that item retained. Encompassed within the two different student learning outcomes of critical thinking and communication skills, the results analyzed how the respective items fit within these two constructs and the amount of information that each item offers in determining the effectiveness of the survey questions in measuring these learning outcomes. For the critical thinking learning outcomes, the results illustrated that several items, such as Item 5 (Knownorms) and Item 11 (Knowbeliefs), exhibited higher levels of item information than other critical thinking items. In addition, the critical thinking items illustrated information functions above 1.0 and lower ability levels, which were consistent with the overall ability levels of the instrument.
The communication skills learning outcome resulted in items that exhibited high information levels (i.e., Item 21 and Item 24) and followed a similar pattern of lower theta levels. The
next chapter expands on the discussion of these results and offers instrument and item recommendations within the context of the theoretical framework and the research literature on postsecondary internationalization.


Table 4-1. Exploratory Factor Analysis Results
Factor   Eigenvalue   Proportion   Cumulative   RMSEA   SRMR   Confidence Interval
1        13.806       53.1         53.1         .11     .06    [.106, .113]
2        1.596        6.140        59.239       .089    .042   [.085, .093]
3        1.185        4.557        63.797       .073    .029   [.069, .077]

Table 4-2. Two-Factor Correlations of Instrument Items
Item   Factor 1   Factor 2
1      .477       .301
2      .619       .204
3      .442       .403
4      .735       .031
5      .240       .623
6      .788       .013
7      .581       .176
8      .373       .412
9      .866       .130
10     .664       .164
11     .256       .601
12     .884       .146
13     .463       .371
14     .117       .256
15     .456       .264
16     .545       .155
17     .197       .513
18     .419       .399
19     .238       .482
20     .048       .699
21     .016       .903
22     .034       .909
23     .385       .464
24     .172       .721
25     .545       .151
26     .656       .000
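The fit indices in Table 4-1 can be checked mechanically against the cutoffs described in the text. The sketch below is illustrative only: the RMSEA and SRMR values come from Table 4-1, while the classification helper and its labels are hypothetical.

```python
# Sketch checking the Table 4-1 fit indices against the cutoffs used in this
# study: RMSEA <= .08 and SRMR <= .06 for adequate fit, with RMSEA up to .10
# read as "mediocre" following Hooper et al. (2008).
FITS = {1: (0.11, 0.06), 2: (0.089, 0.042), 3: (0.073, 0.029)}  # factors: (RMSEA, SRMR)

def judge_rmsea(rmsea):
    if rmsea <= 0.08:
        return "adequate"
    if rmsea <= 0.10:
        return "mediocre"
    return "poor"

for factors, (rmsea, srmr) in FITS.items():
    srmr_label = "adequate" if srmr <= 0.06 else "poor"
    print(f"{factors}-factor: RMSEA {rmsea} ({judge_rmsea(rmsea)}), SRMR {srmr} ({srmr_label})")
```

Run against the table, this reproduces the narrative above: the one-factor RMSEA is inadequate, the two-factor RMSEA is mediocre with an adequate SRMR, and the three-factor solution meets both cutoffs.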


Table 4-3. Critical Thinking Factor Correlations with and without Item 14 (Social)
Item   With Item 14   Without Item 14
1      .477           .487
2      .619           .637
3      .442           .458
4      .735           .756
5      .240           .250
6      .788           .802
7      .581           .600
8      .373           .384
9      .866           .881
10     .664           .674
11     .256           .264
12     .884           .902

Table 4-4. Communication Skills Factor Correlations with and without Item 14 (Social)
Item   With Item 14   Without Item 14
13     .371           .468
15     .264           .434
16     .155           .451
17     .513           .541
18     .399           .375
19     .482           .499
20     .699           .701
21     .903           .900
22     .909           .910
23     .464           .391
24     .721           .727
25     .151           .553
26     .000           .592


Table 4-5. Highest Information Levels and Corresponding Theta Levels for Instrument Items
Item   Theta (Ability) Level   Information Level
3      -3                      1.9
5      -2.5                    2.5
10     -3                      1.7
11     -2.2                    2.5
21     -2.3                    2.6
22     -3                      2.2
23     -3                      1.9
24     -2                      2.6

Figure 4-1. Scree Plot of Eigenvalues

Figure 4-2. Critical Thinking Test Information

Figure 4-3. Communication Skills Test Information

Figure 4-4. Communication Skills Test Information Without Item 14 (Social)

Figure 4-5. Information Levels for Item 21 (Enjoylearn)

Figure 4-6. Information Levels for Item 24 (Appdiff)

Figure 4-7. Information Levels for Item 5 (Knownorms)

Figure 4-8. Information Levels for Item 11 (Knowbeliefs)

Figure 4-9. Information Levels for Item 14 (Social)

Figure 4-10. Category Response Curves for Item 12 (Recdec)

Figure 4-11. Category Response Curves for Item 21 (Enjoylearn)

Figure 4-12. Category Response Curves for Item 24 (Appdiff)

Figure 4-13. Category Response Curves for Item 5 (Knownorms)

Figure 4-14. Category Response Curves for Item 11 (Knowbeliefs)


CHAPTER 5
DISCUSSION

This chapter discusses the results presented in Chapter Four within the context of the study's theoretical framework. To address the results of this analysis, the chapter first reviews the original purpose of the study and the research questions that guided it. Next, the findings are discussed in light of assessing students' understanding of critical thinking and communication skills in regards to intercultural competency and global awareness. With the discussion of the findings, this chapter offers concepts for intercultural assessment improvement and elaborates on ideas for areas of further research. The research findings are presented based upon how effectively this internationalization instrument measured students' self-perceptions of intercultural competency and global awareness. This chapter concludes with an overview of the findings from this study.

Summary of Study Contributions

As internationalization efforts increase at postsecondary institutions, there is a vital need to comprehend the outcomes of these various international policies and programs. In understanding the effectiveness of international efforts, educational institutions have invested in platforms to assess these outcomes, especially in the area of intercultural competency. Beyond assessing individual international programs, institutions have realized the necessity of comprehensively assessing the international and intercultural perspectives of all their students (Deardorff, 2006). As institutions develop and implement international programs and policies that affect the whole institution, the role of assessment becomes expansive, with a need to assess the effectiveness of these implementations at the institution, college, curricular, co-curricular, faculty, and student levels. In order to address the overall effectiveness of internationalization efforts, postsecondary institutions have utilized


such assessment procedures as student involvement in international events and internationally focused course rubrics to understand the role of internationalization on campus. Coupled with student engagement, incorporation of international course components, and measuring international student enrollment, understanding students' self-perceptions of their intercultural development is vital not only to understand student development but also to create interventions that foster their progression in the areas of intercultural competency and global awareness. Given the specific culture and climate that each institution possesses, this study provides a template for the development and validation of an institutionally specific instrument for any institution attempting to understand the views of its own students. While a plethora of external instruments exist to measure students' understanding of global concepts, the alignment of these instruments to specific postsecondary institutions' internationalization goals can vary. Therefore, this study assists higher education institutions in the process of evaluating their own internationalization student learning outcomes through the development of an institutionally specific internationalization instrument.

Review of Purpose and Research Questions

As indicated in previous chapters, the purpose of this study was to examine the validity of an internationalization assessment in relation to its measurement of students' self-perceptions of intercultural understanding and global awareness, with a specific focus on critical thinking and communication skills. Guiding this analysis, the research questions for this study include:

1. What are the psychometric properties of an assessment of student self-perceptions of intercultural competency and global awareness (critical thinking and communication skills) in regards to internationalization?

2. When evaluating the self-perceptions of intercultural competency and global awareness (critical thinking and communication skills) of students in regards to internationalization, how valid is the assessment?

3. How reliable is the assessment in measuring students' self-perceptions of intercultural competency and global awareness (critical thinking and communication skills) in regards to internationalization?


Summary of Research Findings

The following section presents the results from the data analysis within the context of internationalization instrument development and validation. To guide the discussion of the instrument and specific item findings, the findings are discussed based upon analysis of model fit, item correlations, and item information. In relationship to the student learning outcomes, this section specifically discusses the research findings based upon the two overarching areas of critical thinking and communication skills in a global context. In terms of factor analysis and model fit, the results illustrated that, while three factors could be relevant to the data, two factors (critical thinking and communication) more adequately fit the survey data. Based upon the two-factor analysis, which comprised 59% of the overall variance, the item correlations varied for each factor; however, the items generally correlated significantly with their respective factors. For example, for the critical thinking factor (Items 1 through 12), the majority of the critical thinking items loaded on the first factor (10 of the 12 items). Similarly, a majority of the communication skills items loaded on the second factor. In fact, the majority of the communication items increased in factor loading strength when Item 14 (Social) was removed. Since the items aligned with their respective factors, these results offer support for the fit of the items within the established item specifications. While many of the items loaded on their respective factors, questions such as Item 23 (Inteffect) loaded on both factors significantly. Despite the consistent item fit for many of the instrument questions, Item 14 (Social) failed to correlate with either factor and was also the only item that needed to be reverse scored for analysis. The lack of information provided by Item 14 (Social)


requires further analysis and item adaptation. Further recommendations regarding Item 14 (Social) and other instrument questions are addressed in the next section.

Item Analysis Recommendations

Critical Thinking

As previously indicated, many of the critical thinking items correlated on the first factor. In addition to item correlation, this study consisted of analyzing item information and item characteristic curves to determine item fit within the internationalization assessment. The items that offered the most information within the critical thinking component were Items 2, 3, 5, 10, and 11. These items offered information ranging from 1.5 to 2.5. The other critical thinking items all had information levels above 1.0, indicating that the critical thinking questions provided considerable information about the discrimination of the items as well as the critical thinking ability levels of the students (Thorpe & Favia, 2012). In terms of item information, Item 1 (Perspectives), Item 3 (Waythink), Item 4 (Effectdec), Item 5 (Knownorms), Item 10 (Contrcult), and Item 11 (Knowbeliefs) all indicated lesser amounts of information for some response categories, particularly Items 3 (Waythink) and 1 (Perspectives). Therefore, one recommendation would be to revisit those specific questions. A further recommendation would be to conduct a random sample feedback session with a group of undergraduate students to gain further insights into the wording of Items 1 (Perspectives) and 3 (Waythink). Another option is collapsing response categories for items such as Item 1. In analyzing the item parameter estimates after collapsing the adjacent disagreement categories, the theta values for Item 1 decreased (moved further left on the ability scale) and resulted in a slightly decreased amount of maximum critical thinking


141 information. In further implementations of this assessment, researchers could analyze the previously listed items by either collapsing the highest disagreement category with the adjacent category to determine information and ability levels. Through the collapsing of categories, further adaptation of the instrument could be determined if there needs to changes ma de to the scale changes. While it may lead to more item information and that the graded response model allows for differing response scales, participating student s may feel confused as they experience one or two items that have different response scales; thereby, affecting their responses. Another possible analysis would be to relocate these items to a latter part of the critical thinking question s to see if the s ame results would be gained. Given that both Item s (1 and 3) correlate significantly on the critical thinking factor, I would not recommend removing these items. In addition to the previously mentioned items, Item 5 (Knownorms) , illustrated a small ; range of Item 5 (Know 2.1 to 2.5, I would recommend no change to the wording or response scale of Item 5 (Knownorms). In term s of item information levels, Item 9 (Solvprob) offered the least amount of information of all the critical thinking items (approximately 1.0). Despite having the least amount of information, Item 9 (Solvprob) did show consistent steepness of sl ope with it s category response curves, consistent ability level range with the overall instrument , and was above the 1.0 information thresh hold. ( Favia, Comins, & Thorpe, 2012 ) . In addition, the slopes of the various category response curves illustrated a steepnes s that offered significant information about the items , especially in Item s 2, 3, 5, 8, 10, 11, and 12. While many of the items illustrated significant steepness of slope, the ability level (theta) of the critical thinking items varied. The theta levels of critical thinking items were lower than


expected, ranging from approximately -3 to .5. For example, the theta levels required to endorse the agreement categories of Item 2 (Multiperspect) were lower than for all the other critical thinking items; the theta level required to select the neutral and above categories was approximately -1.0. These theta levels were lower than those of the other critical thinking items. Another key characteristic of Item 2 (Multiperspect) is that there is a large increase in ability level (approximately 1.0) required to move between categories. Items 3 (Waythink), 9 (Solvprob), and 10 (Contrcult) also have consistent category spacing below the neutral category. Given these characteristics, I would not recommend altering Item 2 (Multiperspect) (see Figure 5-1).

Communication Skills

Similar to the critical thinking items, many of the communication skill items correlated on the second factor, communication skills. In analyzing the item information functions for the communication skills questions, the information values for these items ranged from .02215 (Item 14) to 2.6. While the highest-information items, including Item 21 (2.55) and an item at 2.2, offered the greatest amounts of item information, several other items offered lesser amounts (i.e., Item 19 and Item 25, with information of .9 and .8, respectively). Despite these items offering lower amounts of information, Item 19 correlates highly with the communication skills factor (.692) and aligns with the overall ability range of the internationalization instrument. One key characteristic of the communication items was the lack of information and discrimination offered by Item 14 (Social), which includes the phrase


"socialize with people of my culture." As illustrated in the category response curve for Item 14 (Social), the slope for each of the possible selections was flat and peaked at a point of 1.0, and did not indicate a progression between the different steps of the response scale (see Figure 5-2). Given the lack of information obtained from Item 14 (Social), the data were reanalyzed without this item, and as a result there was a significant increase in factor loading and item information among the other communication skills items. This removal of Item 14 (Social) showed an immediate benefit, as the amount of information for the total communication skills questions increased over a larger student ability range in communication, from an information level of 40 at one specific theta level (-3.0) to an information level of approximately 25 across a theta range of -3 to .5. As compared to the whole instrument, the item-level effect of the removal of Item 14 (Social) also illustrated a beneficial impact. For example, for Item 24 (Appdiff), the information level with Item 14 (Social) retained reached a peak of approximately 2.6 over a theta range of -3.0 to .3; however, when Item 14 (Social) was eliminated from the model, the information level for Item 24 (Appdiff) increased to almost 3.5 over the same ability range (see Figure 5-3). Given the improved overall and specific item fit with the removal of Item 14 (Social), there are several possible recommendations for the communication-focused items. With most communication skills items fitting well within the model, it is my recommendation to not alter or adapt the current items except for Item 14 (Social). One possible recommendation is the removal of Item 14 (Social). By removing this item, not only is there an increase of information for the other items, but it will also bring the number of items addressing critical thinking (12) and communication skills (13) into better alignment.
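The effect of dropping a flat item like Item 14 can be understood through the additivity of information functions: under local independence, test information is the sum of the item informations, so an item with near-zero discrimination contributes almost nothing. The sketch below uses dichotomous 2PL items for brevity rather than the polytomous graded response model used in this study; all parameter values are hypothetical.

```python
import math

def info_2pl(theta, a, b):
    """Fisher information of a 2PL item: a^2 * P(theta) * (1 - P(theta))."""
    p = 1 / (1 + math.exp(-a * (theta - b)))
    return a * a * p * (1 - p)

# Two well-discriminating items plus one flat, Item-14-like item.
items = [(2.0, -1.0), (1.8, -0.5), (0.05, 0.0)]
theta = -0.5

total = sum(info_2pl(theta, a, b) for a, b in items)
without_flat = sum(info_2pl(theta, a, b) for a, b in items[:-1])
print(round(total - without_flat, 4))  # flat item's contribution -> 0.0006
```

Because the flat item adds essentially no information at any ability level, removing it costs nothing while simplifying the scale, consistent with the recommendation above.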
However, if Item 14 (Social) were not removed, another possible alternative would be to explore further feedback from undergraduates on their perceptions of the wording of this question. While also gaining student


feedback, it could be beneficial to have the assessment taskforce content experts review the item for potential changes in wording. As a result of further student and content expert feedback, it could be possible to reword Item 14 (Social) to address the same concept of students limiting their exposure to members of other cultural groups.

Instrument Implementation

As universities and colleges increase efforts to internationalize their campuses, there is a continued need to develop and utilize assessment instruments and protocols to further understand the effectiveness of these international programs. While a comprehensive assessment plan for international programs incorporates both direct and indirect measures, increased emphasis has been placed on the utilization of intercultural assessments to gain an extensive understanding of student views on cultural issues. Adding to the plethora of intercultural instruments utilized for internationalization assessment purposes, this study offers not only another instrument for postsecondary institutions to use in their assessment procedures but also, perhaps more importantly, provides a framework for colleges and universities to utilize in developing their own intercultural instruments. Since each campus has a unique and diverse climate, postsecondary institutions should review and analyze the constructs and learning outcomes of intercultural instruments to ensure that they align with the established instructional outcomes. In the construction of this study's instrument, professionally designed instruments were reviewed and determined not to adequately fit the institutional outcomes; thus, the first step in the application of any intercultural instrument is the aligning of instrument outcomes with institutional outcomes. The instrument developed in this study specifically aligned with the specific institution's internationalization competencies and outcomes.
As higher education administrators progress in their internationalization assessment plans, the alignment of


instrument outcomes is important to ensure effective evaluation of student views of intercultural competency. Based upon the framework outlined in the study, if a postsecondary institution decides to develop its own intercultural assessment, a vital component will be engaging key institution stakeholders in the development process. These key stakeholders, including faculty, administrators, and assessment and international content experts, will provide the foundational knowledge to not only assist in the development of learning outcomes but also offer valuable insights into the wording of items that will directly relate to the outcomes of interest. After a series of item piloting phases, higher education administrators can modify their instrument to represent their institutional internationalization initiatives. Once a valid and reliable instrument is developed, it is important for postsecondary institutions to assess student perspectives on the international competencies routinely and longitudinally to gain a more comprehensive and long-term understanding of development in these areas.

Summary

This chapter focuses on how to practically apply the results of item analysis to further the instrument's alignment to learning outcomes. Critical thinking and communication in international contexts are key concepts in this internationalization assessment. In investigating the critical thinking items, the recommendations addressed further analysis of items through student feedback for items that did not provide as much information overall (e.g., Item 9). Despite the recommendation that more student input into the wording of the critical thinking items is necessary, most of the critical thinking questions provided significant information about the items and the ability of the students. Similar to the critical thinking items, the majority of the communication items fit well within the model and offered significant item information.
The one exception to the communication items was Item 14 (Social).


Given the scant amount of information obtained from Item 14 (Social), the recommendation was to either remove the item completely from the instrument or to obtain further student and content expert input into the wording of this question.


Figure 5-1. Category Response Curves for Item 2 (Multiperspect)

Figure 5-2. Category Response Curves for Item 14 (Social)
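Figures 5-1 and 5-2 plot category response curves derived from Samejima's graded response model. The following is a minimal sketch of how such curves are computed; the discrimination and boundary parameters here are hypothetical illustrations, not the estimates from this study.

```python
import math

def grm_category_probs(theta, a, bs):
    """Graded response model: probability of each of len(bs)+1 ordered
    categories for an examinee at ability theta, given discrimination a
    and increasing boundary parameters bs. Each boundary curve P*(k) is
    a 2PL logistic; category probabilities are successive differences."""
    star = [1.0] + [1 / (1 + math.exp(-a * (theta - b))) for b in bs] + [0.0]
    return [star[k] - star[k + 1] for k in range(len(bs) + 1)]

# Hypothetical 5-category item with boundaries on the low end of the
# theta scale, loosely echoing the ranges reported in the text.
probs = grm_category_probs(theta=-1.0, a=2.0, bs=[-3.0, -2.0, -1.0, 0.0])
print(round(sum(probs), 6))  # category probabilities sum to 1.0
```

Plotting each category's probability across a grid of theta values produces curves like those in Figures 5-1 and 5-2; a flat curve for every category, as with Item 14, signals an item that does not separate students by ability.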


Figure 5-3. Information Levels for Item 24 (Appdiff) with Item 14 Eliminated


CHAPTER 6
IMPLICATIONS AND FUTURE RESEARCH

The present study investigated the development and validation of an internationalization assessment of student perceptions of critical thinking and communication in a global context. This final chapter introduces institutional policy and program implications related to student perceptions of global awareness and intercultural competency. Next, suggestions for future research in the area of internationalization are explored to further understand student perceptions of global and intercultural competency. Finally, this chapter ends with final thoughts related to this study, as well as challenges that campuses must overcome to develop a climate that promotes the student learning outcomes of critical thinking and communication in an international context.

Implications for Institutional Practice

It is important to understand student views of global awareness and intercultural competency in order to assure that programs and policies are effective at promoting these outcomes. This study generated potential policy considerations relevant to the development of students' understanding of intercultural issues. In analyzing the results of the various instrument items and the concepts that each question addressed, it is important to consider potential interventions for both curricular and co-curricular activities to develop students' abilities in the areas of globally focused critical thinking and communication skills. In the area of critical thinking in a global context, the frequency results illustrated that students generally perceived themselves as possessing globally focused critical thinking skills. Given that many students feel they are able to think critically when placed in different cultural


situations, there are several other areas that postsecondary institutions can address to increase perceptions about critical thinking. Several items had lower agreement levels, including Item 4 (Effectdec); Item 7; an item addressing how other cultures solve problems; and Item 12 (Recdec). Based upon these low-agreement items, there are potential policy and programmatic themes of cultural decision making and cultural recognition that higher education institutions can specifically investigate to further a campus climate of intercultural competency. Both Items 4 (Effectdec) and 12 (Recdec) encompass decision-making recognition and processes and had lower percentages of agreement selected (e.g., Item 4 = 21.3% and Item 12 = 26.5%). Given that many of the students were first- and second-year students, the lack of exposure to international students and students of other cultures could result in lower agreement levels. Therefore, to address the understanding of cultural decision-making skills, postsecondary institutions can incorporate such curricular initiatives as international case study analysis and internationalization of course curriculum to offer insights into the various cultural decision-making processes. Also within the classroom, the utilization of international students to expound upon their cultural decision-making processes could assist domestic students in understanding how different cultures approach various issues. To lay a foundation of cultural understanding and an avenue for cultural recognition, postsecondary institutions could offer first-year students a course that focuses on intercultural competency. For example, as part of the first-year experience, students would participate in a class whose content would cover a specific section on intercultural


communications and relations. In addition to direct assessment through multiple choice exams and open-ended essays, the enrolled students could be required to conduct periodic interviews with international students about relevant questions regarding their beliefs, attitudes, and interests. Coupled with international exposure through class interviews, students could be required to keep reflective journals regarding what they have been learning and feeling about the intercultural experiences. Given the importance of knowing the cultural beliefs of others, coupling exposure to various cultural beliefs with reflective exercises can further reinforce student confidence that it is important not only to understand other cultural beliefs but also to engage in a comprehensive evaluation of one's own. Moreover, it is important to provide international education opportunities because this allows students to question personal cultural beliefs as well as experience growth in terms of ethnic identity, racial identity, and intercultural sensitivity while developing self-confidence through a positive view of oneself and others (Day-Vines, Barker, & Exum, 1998; Neff, 2001). By providing a campus climate where students have a foundational knowledge of their own cultural identity and a willingness to understand and accept other cultural beliefs, postsecondary institutions would be preparing their students for the diverse and global workforce that lies ahead of them. In analyzing the communication skills questions, there were similar frequency trends to the critical thinking questions, as many of the undergraduate students expressed agreement with several survey questions. In particular, many of these participating students indicated that they are flexible in communicating with people of other cultures as well as appreciating the differences between cultures.
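Frequency results like the agreement percentages cited in this chapter (e.g., 21.3% for Item 4) come from simple category tallies of the Likert responses. A sketch with fabricated responses, purely to illustrate the computation:

```python
from collections import Counter

def agreement_percent(responses, agree_codes=(4, 5)):
    """Percent of respondents selecting Agree (4) or Strongly Agree (5)
    on a 5-point Likert item coded 1 (Strongly Disagree) to 5."""
    counts = Counter(responses)
    agree = sum(counts[c] for c in agree_codes)
    return 100 * agree / len(responses)

# Hypothetical responses to one item from ten students.
item_responses = [5, 4, 3, 2, 4, 3, 3, 2, 4, 1]
print(agreement_percent(item_responses))  # -> 40.0
```

The same tally, run per item, is what flags low-agreement items such as Items 4 and 12 for the policy and programmatic attention discussed above.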
However, there were several items, addressing such issues as group collaboration and comfort in discussing international issues, that illustrated higher


percentages of disagreement, including Item 16; Item 17, concerning working with students from other countries; Item 19; and Item 25, concerning discussing international issues. Curricular and co-curricular initiatives could be developed and implemented in order to foster opportunities for American students to work with international students in group settings. For curricular initiatives, the mere inclusion of group work may not suffice for students to feel a sense of excitement about intercultural group work. As part of the team-building process, course instructors could provide intentional prompts to assist group members in the process of getting acquainted with other team members. Research illustrates that students express more satisfaction with group work as well as an increased willingness to engage in cross-cultural collaboration if they are provided opportunities to form bonds before initiating group assignments (Gatfield, 1999; Volet & Ang, 1998). In addition to investing in curricular components for further group work opportunities, specific programs can be implemented that provide more co-curricular exposure to various cultural groups. To support co-curricular opportunities, a key program would be intercultural communication workshops, which could consist of both one-day and two-day formats. Michigan State University instituted a two-day workshop consisting of half domestic students and half international students engaging in role playing, mini-lectures, and small group discussions. This program was known as the Michigan State University Internationalizing Student Life Project (Mark, 1994). The two-day workshops can be instituted


within the first year of a student's experience in order to develop positive experiences between culture groups. In addition to student experiential workshops, one-day workshops for staff would be conducted, focused on understanding international students' perspectives. Coupled with more formal intercultural workshops, college campuses can create an interculturally supportive climate by incorporating more group work activities that allow international students to work on teams with American students in a non-competitive atmosphere. Providing opportunities for socializing among diverse student groups will be a key principle in the development of a more interculturally inclusive campus. In the development of social interaction programs, both campus-wide activities and intergroup dialogues could be implemented. These social interaction opportunities could consist of weekly sessions between an equal number of students (7 or 8) from two or more different countries (e.g., China, India) and could be conducted by a trained moderator, with the opportunity for students to engage in focused reflections on cultural issues (Gurin et al., 2002). The goals of these intergroup dialogues are to discern differences and commonalities, incorporate readings on intergroup relations, learn to deal with conflict, and define a collaborative action step. Given that intergroup dialogues have been shown to make a significant positive impact on cultural perceptions (Mayhew & Fernandez, 2007), it is important for all institutional stakeholders, especially students, to participate in these events. In addition, building upon and reinforcing students' experience with various cultural groups, a more specific volunteer effort could be incorporated into their first-year experience. Research on first-year students illustrates that many have previously engaged in volunteer opportunities during their secondary studies (Astin, 1998).
Therefore, building upon their previous interests, a common volunteer experience with international students could be


implemented. This volunteer opportunity could be held in conjunction with a local community service agency. As a more long-term implementation, this first-year volunteer experience could be extended into various service learning and volunteer opportunities for other undergraduate students. By offering diverse co-curricular opportunities for American and international students to collaborate in a group atmosphere, undergraduate students may feel more confident and empowered in engaging in team projects with members of other cultural groups.

Future Research

Although common themes encompassing internationalization exist across postsecondary institutions, differences in internationalization philosophies and perspectives exist at various higher education institutions. While many institutional strategies include such international components as global awareness and intercultural competency, the operational definitions of these concepts as well as the various institutional demographics (e.g., student composition, private/public institution) create disparities in ways to assess student development in internationally focused learning outcomes. When considering specific institutional demographics for international assessment, there exists a dearth of information on assessing student competencies in the areas of intercultural competency and global awareness (Deardorff, 2006). While this study provided insights on undergraduate students' perceptions of critical thinking and communication in an international context, further research is necessary to better comprehend their development in the area of intercultural competency during their college experience. There are three areas of future research for understanding intercultural competency and global awareness utilizing the internationalization instrument developed in this study. First, various institutional demographics can be investigated through a more diverse sampling of students and different higher education institutions.


Second, longitudinal analysis of student perceptions of critical thinking and communication needs to be conducted to inform continual program and policy evaluation and adaptation. Finally, utilizing different methodological procedures would strengthen the validation of the internationalization instrument as well as shed further light on student development in relationship to international outcomes.

Institutional and Student Demographics

In the area of institutional demographics, this study focused on undergraduate students enrolled in a large, research-extensive university located in the southeastern United States. Within this particular institution, the pilot for the instrument consisted of undergraduate students enrolled in large general education courses and students participating in study abroad programs. While these courses provided significant numbers of responses, the student composition was limited to students enrolled in these courses, who were primarily first-year and second-year students. Given that the internationalization instrument intends to measure all undergraduate students' competencies, it would be valuable to pilot the instrument among a more diverse student population (e.g., third-year and fourth-year students). Although this analysis did not investigate individual student demographics, future research could investigate the relationship between internationally focused critical thinking and communication constructs and such student demographics as race, major, gender, and participation in international programs. By analyzing various other student demographic groups, other methodological modifications such as differential item functioning (DIF) could be applied to determine ability level and information differences based upon various groups. Items are said to exhibit differential item functioning when individuals being measured have equal ability levels but are from different groups and do not have the same probability of endorsing the same response.


In fact, research illustrates that individuals with different cultural backgrounds can produce differential item functioning in test translations (Ellis, 1989). Although this internationalization instrument does gather student information in terms of class enrollment and international experience, the purpose of this study was to investigate the function and overall fit of each item within the instrument regardless of participant demographics. However, further investigation of these previously listed student variables could assist in determining how comprehensively international programs are affecting the whole student body. In addition to extending instrument validation within the current postsecondary institution, considering other institutional climates such as comprehensive universities, liberal arts schools, and community colleges would provide further validation of the international assessment tool. Given the dearth of international assessment conducted at community colleges (Blair, Phinney, & Phillippe, 2001), further implementation of this internationalization assessment at community colleges may lead to fostering international efforts for undergraduate students who may not attend a four-year institution. By implementing student assessment of intercultural competency and global awareness at different postsecondary institutions and among various college students, universities and colleges could use the instrument to develop intentional and measurable international opportunities that foster global competency among their students.

Methodological Modifications

The final area to be addressed in regards to future research involves methodological modifications that could strengthen this study and offer further insights into undergraduate students' perceptions. Whether it is the restriction of a given sample size or the methodology applied to a study, there are limitations within every research analysis that a researcher attempts to overcome.
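One common screen for the uniform DIF described above is the Mantel-Haenszel procedure, which compares endorsement of a (dichotomized) item between a reference and a focal group within matched ability strata. The sketch below uses fabricated counts purely for illustration; a common odds ratio near 1 suggests no uniform DIF.

```python
def mantel_haenszel_or(strata):
    """Mantel-Haenszel common odds ratio across ability strata.
    Each stratum is (a, b, c, d): reference endorse/not-endorse,
    focal endorse/not-endorse. Values far from 1 flag uniform DIF."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

# Hypothetical counts for three ability strata (low/middle/high).
strata = [(30, 10, 28, 12), (25, 15, 24, 16), (12, 28, 11, 29)]
print(round(mantel_haenszel_or(strata), 2))  # -> 1.17
```

In practice the odds ratio is tested against chance (e.g., via the MH chi-square) and interpreted alongside IRT-based DIF methods before an item is flagged for revision.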
For future studies on assessing internationalization efforts and perspectives, there are three methodological


considerations: time points of analysis, item response models, and qualitative analysis. First, a methodological modification to this study would be to include various time points of analysis. The purpose of this study was to validate an internationalization assessment and its respective items through a pilot study of mostly first-year undergraduate students. Although this study investigated one specific time point, further analysis would benefit from applying this internationalization instrument in a longitudinal fashion to assess item fit in a more comprehensive manner. Although piloting is important for item and instrument validation, it is vital to comprehend student views of internationalization concepts over their postsecondary experience. By analyzing items in a longitudinal study, more evidence can be provided that supports item wording and fit or offers insights into item adaptation. Additionally, research shows that student views of intercultural and global issues can change over time, and therefore it is important to have several time points of analysis (McAllister & Irvine, 2000). By gathering more data from a more diverse population of students over an extended time period, this internationalization instrument could better assess student development and learning in these areas. Second, the inclusion of various item response theory (IRT) models in the analysis of item fit should be conducted to further understand item properties. Given the limited amount of research conducted on intercultural assessments utilizing item response theory procedures (Samonte & Pastor, 2011), there is a need for further analysis utilizing various IRT models. The different models for item response theory offer unique contributions to item specification analysis.
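As context for comparing polytomous IRT models: in the graded response model each item carries its own discrimination and its own boundary parameters, whereas the rating scale model constrains all items to share one set of step parameters, shifted by a per-item location. A schematic sketch of that parameter structure (all values hypothetical):

```python
def rsm_thresholds(location, shared_steps):
    """Rating scale model structure: every item shares one set of
    step parameters, shifted by an item-specific location."""
    return [location + s for s in shared_steps]

# GRM structure: each item gets its own discrimination AND thresholds.
grm_items = {
    "Item 2": dict(a=2.0, bs=[-3.0, -2.0, -1.0, 0.0]),
    "Item 9": dict(a=1.2, bs=[-2.5, -1.5, -0.2, 0.8]),
}

# RSM structure: one shared step set; items differ only in location.
steps = [-1.5, -0.5, 0.5, 1.5]
rsm_items = {name: rsm_thresholds(loc, steps)
             for name, loc in [("Item 2", -1.0), ("Item 9", 0.2)]}
print(rsm_items["Item 2"])  # -> [-2.5, -1.5, -0.5, 0.5]
```

Fitting both structures to the same response data and comparing fit indices would show how much the extra flexibility of the GRM is actually needed for these items.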
For example, the graded response model (GRM) allows freedom in the various steps of the selection process, while other models, like the rating scale model (RSM), restrict


various components of item information. In addition to utilizing the different IRT models, future research should consider applying other methodological analyses, such as conducting further factor analyses on the subcomponents of the internationalization assessment, simple dependent t-tests for intercultural development (matched pairs) (Hammer, Bennett, & Wiseman, 2003), and a confirmatory factor analysis to explore how well the items measure the constructs of critical thinking and communication skills in a global context. Finally, another extension of this study is to incorporate a qualitative analysis component to comprehend students' perceptions of intercultural and global issues. In investigating item information and determining item fit, the focus of these principles has traditionally been on quantitative data. While this study focuses on specific item information, further research needs to focus on how students perceive the intercultural concepts that these survey items address. In assessing student views of internationalization, qualitative data (e.g., semi-structured interviews, focus groups) could provide a more comprehensive understanding of intercultural competency. Qualitative investigation of the respective items and their constructs can assist in assessing whether the questions continue to address the specified student learning outcomes, and these results can then be translated into more appropriate quantitative methods.

Conclusion

In a statement, the President of the Institute of International Education (IIE), Allan E. Goodman, indicated that globalization is not a passing trend and that learning how to interact with people from other countries and cultures equips future leaders in all sectors. Because such preparation is vital for students regardless of the discipline they choose, postsecondary institutions need to develop programs and policies that reinforce understanding of


various cultures to better prepare students to enter a globalized workforce and community. As institutions continue to implement international programs and policies, researchers should continue to assess and evaluate the effectiveness of these programs to foster a campus climate that emphasizes intercultural and global awareness.


APPENDIX A
CRITICAL THINKING OPERATIONAL DEFINITION

Student Learning Outcome #2: Critical Thinking
Students think critically to interpret global and intercultural issues.

Operational Definition: The students will be able to exemplify effective critical thinking skills in interpreting global and intercultural issues by using the following: JARS (Judgment, Analysis, Reasoning, Solution Finding)

Judgment: Judging and/or evaluating (Case, 2005; Ennis, 1985; Facione, 1990; Lipman, 1988; Tindal & Nolet, 1995)
Analysis: Analyzing arguments (Ennis, 1985; Facione, 1990; Halpern, 1998; Paul, 1992)
Reasoning: Making inferences using inductive or deductive reasoning (Ennis, 1985; Facione, 1990; Paul, 1992; Willingham, 2007)
Solution Finding: Making decisions and/or solving problems (Ennis, 1985; Halpern, 1998; Willingham, 2007)

APPENDIX B
COMMUNICATION OPERATIONAL DEFINITION

Student Learning Outcome #3: Communication
Students communicate effectively with members of other cultures.

Operational Definition: The students will be able to exemplify effective communication with members of other cultures through the use of the following traits: SPAAA (Sensitivity, Production, Awareness, Adaptability, Acceptance)

Sensitivity: Cultural empathy and sensitivity, nonjudgmental perceptiveness (Olson & Kroeger, 2001; Ting-Toomey, 1999)
Production: Active listening, conversation reflectiveness, conveying a clear and intended message, decoding skills, engaging in collaboration, and effective interactions (Deardorff, 2006; Griffith & Harvey, 2000; Gudykunst, 1993; Sue, 2001; Ting-Toomey & Kurogi, 1998)
Awareness: Cross-cultural awareness, engaging in international relationships (Landis, Bennett, & Bennett, 2004; Paige, Jorstad, Paulson, Klein, & Colby, 1999; Storti, 1999; Storti & Bennhold-Samaan, 1998)
Adaptability: Flexibility, high tolerance for cultural ambiguity and differences, managing cultural misunderstandings (Dinges, 1983; Olson & Kroeger, 2001; Kim, 1991; Ting-Toomey, 1999)
Acceptance: Open-mindedness, cultural reception (Dinges, 1983; Ting-Toomey, 1999)

APPENDIX C
ITEM SPECIFICATIONS FOR CRITICAL THINKING STUDENT LEARNING OUTCOME

Student Learning Outcome #2: Students think critically to interpret global and intercultural issues
Operational Definition: JARS (Judgment, Analysis, Reasoning, Solution Finding)

Benchmark #1
Big Idea: Judgment
Enduring Understanding: Making judgments
Benchmark: Students will use multiple strategies to make appropriate judgments.
Item types: Agreement, Self Rating, Task/Behavior Domain
Content Limits: The text should contain clear and sufficient context for determining the meaning of the question. Each question should not take more than 30 seconds to answer.
Stimulus Attributes: Students will effectively judge and/or evaluate issues that exist in global and intercultural situations.
Response Attributes: Students will exhibit effective judgment by selecting a response to the listed assessment item in terms of agreement/self-rating/behavioral task domains.
Sample Questions:
Agreement: I can evaluate cultural differences from an informed perspective. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Agreement: I can suspend judgment and appreciate the complexities of communicating and interacting interculturally. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Self Rating: How effective are you in evaluating cultural differences from an informed perspective? (Not at all effective/Somewhat effective/Effective/Very effective)

Behavioral/Task Domain: Please choose your response in the following situation: You read about an international situation regarding human rights in another country. You feel disturbed by the situation. Which one of the following may be your next step?
A. Initiate a conversation with a student from that country
B. Conduct more research on the subject
C. Talk to a professor regarding the topic
D. Do nothing
Essay: oppression? Why not?
Essay: Discuss: You should avoid the assumption that all Latinos speak Spanish.

Benchmark #3
Big Idea: Analysis
Enduring Understanding: Analyzing arguments
Benchmark: Students will comprehend and consider various cultural arguments.
Item types: Agreement, Self Rating, Task/Behavior Domain
Content Limits: The text should contain clear and sufficient context for determining the meaning of the question. Each question should not take more than 30 seconds to answer.
Stimulus Attributes: Students will use multiple cultural perspectives in analyzing arguments when dealing with global and intercultural issues.
Response Attributes: Students will exhibit effective analysis of arguments by selecting a response to the listed assessment item in terms of agreement/self-rating/behavioral task domains.
Sample Questions:
Agreement: I take into account different perspectives before drawing conclusions about the world around me.

(Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Agreement: I could contrast my own behaviors with those of my hosts in important areas (e.g., social interactions, basic routines, time orientation, etc.). (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Self Rating: How effective are you in taking into account different perspectives before drawing conclusions about the world around you? (Not at all effective/Somewhat effective/Effective/Very effective)
Essay: What do you find to be the most important global issue facing our world today? Give examples of how it is affecting our world. What potential solutions would you advise?

Benchmark #4
Big Idea: Reasoning
Enduring Understanding: Making inferences using inductive or deductive reasoning
Benchmark: Students will use various reasoning strategies to make informed conclusions.
Item types: Agreement, Self Rating, Task/Behavior Domain
Content Limits: The text should contain clear and sufficient context for determining the meaning of the question. Each question should not take more than 30 seconds to answer.
Stimulus Attributes: Students will make inferences using inductive or deductive reasoning regarding global and intercultural issues.
Response Attributes: Students will exhibit effective reasoning strategies by selecting a response to the listed assessment item in terms of agreement/self-rating/behavioral task domains.

Sample Questions:
Agreement: I evaluate situations in my own culture based on my experiences and knowledge of other cultures. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Self Rating: How effective are you in understanding the reasons behind cultural differences? (Not at all effective/Somewhat effective/Effective/Very effective)
Essay: Do groups with diverse members make better decisions? Discuss.

Benchmark #5
Big Idea: Solution Finding
Enduring Understanding: Making decisions
Benchmark: Students will use multiple strategies to decide upon potential solutions to cultural problems.
Item types: Agreement, Self Rating, Task/Behavior Domain
Content Limits: The text should contain clear and sufficient context for determining the meaning of the question. Each question should not take more than 30 seconds to answer.
Stimulus Attributes: Students will use multiple strategies to decide upon potential solutions to cultural problems.
Response Attributes: Students will exhibit effective problem solving by selecting a response to the listed assessment item in terms of agreement/self-rating/behavioral task domains.
Sample Questions:
Agreement: I understand the reasons and causes of conflict among nations of different cultures. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Self Rating: How effective are you in understanding the reasons and causes of conflict among nations of different cultures? (Not at all effective/Somewhat effective/Effective/Very effective)
Essay: In a group project with an international student, how would you respond to their group contribution?


APPENDIX D
ITEM SPECIFICATIONS FOR COMMUNICATION STUDENT LEARNING OUTCOME

Benchmark #1
Big Idea: Sensitivity
Enduring Understanding: Cultural empathy and sensitivity
Benchmark: Students will demonstrate a level of sensitivity toward other cultures.
Item types: Agreement, Self Rating, Task/Behavior Domain
Content Limits: The text should contain clear and sufficient context for determining the meaning of the question. Each question should not take more than 30 seconds to answer.
Stimulus Attributes: Students will exhibit cultural empathy, sensitivity, and nonjudgmental perceptiveness when communicating with members of other cultures.
Response Attributes: Students will demonstrate cultural empathy and sensitivity by selecting a response to the listed assessment item in terms of agreement/self-rating/behavioral task domains.
Sample Questions:
Agreement: I do not feel threatened emotionally when presented with multiple perspectives. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Self Rating: How effective are you in situations where cultural differences in the way people express their emotions lead to misunderstanding? (Not at all effective/Somewhat effective/Effective/Very effective)
Essay: Considering different cultures, is it appropriate for a teacher or manager to praise a student or an employee in front of others? Might it be appropriate to criticize or reprimand a student or employee in private?

Benchmark #2
Big Idea: Sensitivity
Enduring Understanding: Nonjudgmental perspectives
Benchmark: Students will demonstrate a level of sensitivity toward other cultures.
Item types: Agreement, Self Rating, Task/Behavior Domain
Content Limits: The text should contain clear and sufficient context for determining the meaning of the question. Each question should not take more than 30 seconds to answer.
Stimulus Attributes: Students will exhibit cultural empathy, sensitivity, and nonjudgmental perceptiveness when communicating with members of other cultures.
Response Attributes: Students will demonstrate nonjudgmental perspectives by selecting a response to the listed assessment item in terms of agreement/self-rating/behavioral task domains.
Sample Questions:
Agreement: I am open to learning about other cultures. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Self Rating: How effective are you at understanding different perspectives from members of other cultures? (Not at all effective/Somewhat effective/Effective/Very effective)
Essay: Consider traditions. Provide examples.

Benchmark #3
Big Idea: Production
Enduring Understanding: Conversation reflectiveness
Benchmark: Students will demonstrate effective communicative skills production with members of other cultures.
Item types: Agreement, Self Rating, Task/Behavior Domain
Content Limits: The text should contain clear and sufficient context for determining the meaning of the question. Each question should not take more than 30 seconds to answer.
Stimulus Attributes: Students will demonstrate effective conversation reflectiveness skills with members of other cultures.
Response Attributes: Students will demonstrate effective conversation reflectiveness skills by selecting a response to the listed assessment item in terms of agreement/self-rating/behavioral task domains.

Sample Questions:
Agreement: I often reflect upon discussions regarding international issues. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Agreement: I can reflect on the impact and consequences of my decisions and choices on my hosts. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Self Rating: How effective are you at reflecting on discussions regarding international issues? (Not at all effective/Somewhat effective/Effective/Very effective)
Essay:

Benchmark #4
Big Idea: Production
Enduring Understanding: Convey a clear and intended message
Benchmark: Students will demonstrate effective communicative skills production with members of other cultures.
Item types: Agreement, Self Rating, Task/Behavior Domain
Content Limits: The text should contain clear and sufficient context for determining the meaning of the question. Each question should not take more than 30 seconds to answer.
Stimulus Attributes: Students will demonstrate the ability to convey a clear and intended message with members of other cultures.
Response Attributes: Students will demonstrate the ability to convey a clear and intended message with members of other cultures by selecting a response to the listed assessment item in terms of agreement/self-rating/behavioral task domains.
Sample Questions:
Agreement: I can clearly articulate my message to members of other cultures. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Self Rating: How effective are you at articulating your message to members of other cultures? (Not at all effective/Somewhat effective/Effective/Very effective)
Essay:

Benchmark #5
Big Idea: Production
Enduring Understanding: Decoding skills
Benchmark: Students will demonstrate effective decoding skills with members of other cultures.
Item types: Agreement, Self Rating, Task/Behavior Domain
Content Limits: The text should contain clear and sufficient context for determining the meaning of the question. Each question should not take more than 30 seconds to answer.
Stimulus Attributes: Students will demonstrate effective decoding skills with members of other cultures.
Response Attributes: Students will demonstrate effective decoding skills with members of other cultures by selecting a response to the listed assessment item in terms of agreement/self-rating/behavioral task domains.
Sample Questions:
Agreement: I am able to understand cultural differences in conversations with members of other cultures. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Self Rating: How effective are you at understanding cultural differences in conversations with members of other cultures? (Not at all effective/Somewhat effective/Effective/Very effective)
Essay: Share examples of subtle actions or communications directed to someone, which may be Discuss.

Benchmark #6
Big Idea: Production
Enduring Understanding: Engage in collaboration
Benchmark: Students will demonstrate the ability to engage in collaboration with members of other cultures.
Item types: Agreement, Self Rating, Task/Behavior Domain
Content Limits: The text should contain clear and sufficient context for determining the meaning of the question. Each question should not take more than 30 seconds to answer.
Stimulus Attributes: Students will demonstrate the ability to engage in collaboration with members of other cultures.
Response Attributes: Students will demonstrate the ability to engage in collaboration with members of other cultures by selecting a response to the listed assessment item in terms of agreement/self-rating/behavioral task domains.
Sample Questions:
Agreement: When working on group projects with a student of another culture, I feel confident that I am able to collaborate with this student. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Self Rating: I am effective at negotiating responsibilities with international students when working on a team project. (Not at all effective/Somewhat effective/Effective/Very effective)
Essay: When working in a team of students or colleagues that includes individuals from other cultures, how would you proceed with group goals while being sensitive to cultural differences?

Benchmark #7
Big Idea: Production
Enduring Understanding: Effective interactions
Benchmark: Students will demonstrate effective interaction skills with members of other cultures.
Item types: Agreement, Self Rating, Task/Behavior Domain
Content Limits: The text should contain clear and sufficient context for determining the meaning of the question. Each question should not take more than 30 seconds to answer.
Stimulus Attributes: Students will demonstrate effective interaction skills with members of other cultures.
Response Attributes: Students will demonstrate effective interaction skills with members of other cultures by selecting a response to the listed assessment item in terms of agreement/self-rating/behavioral task domains.
Sample Questions:
Agreement: I often have conversations with members of other cultures. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Agreement: I use appropriate strategies for adapting to the host culture and reducing stress. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Self Rating: How effective are you at having conversations with members of other cultures? (Not at all effective/Somewhat effective/Effective/Very effective)
Essay: Do you feel you are able to engage effectively with people of other cultures? Why or why not?

Benchmark #8
Big Idea: Awareness
Enduring Understanding: Cross-cultural awareness
Benchmark: Students will demonstrate cross-cultural awareness with members of other cultures.
Item types: Agreement, Self Rating, Task/Behavior Domain
Content Limits: The text should contain clear and sufficient context for determining the meaning of the question. Each question should not take more than 30 seconds to answer.
Stimulus Attributes: Students will demonstrate cross-cultural awareness with members of other cultures.
Response Attributes: Students will demonstrate cross-cultural awareness with members of other cultures by selecting a response to the listed assessment item in terms of agreement/self-rating/behavioral task domains.
Sample Questions:
Agreement: I evaluate situations in my own culture based on my experiences and knowledge of other cultures. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Agreement: I could discuss and contrast various behavioral patterns in my own culture with those of another culture. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Self Rating: How effective are you at evaluating situations in your own culture based on your experiences and knowledge of other cultures? (Not at all effective/Somewhat effective/Effective/Very effective)
Essay: In which cultures might touching be important when others are greeted? In which cultures might touching not be appropriate? Discuss.
Essay: In which cultures might eye contact be important and respectful? In which cultures might there be a different scenario? Explain.

Benchmark #9
Big Idea: Awareness
Enduring Understanding: Engage in international relationships
Benchmark: Students will demonstrate the ability to engage in international relationships with members of other cultures.
Item types: Agreement, Self Rating, Task/Behavior Domain
Content Limits: The text should contain clear and sufficient context for determining the meaning of the question. Each question should not take more than 30 seconds to answer.
Stimulus Attributes: Students will demonstrate the ability to engage in international relationships with members of other cultures.
Response Attributes: Students will demonstrate the ability to engage in international relationships by selecting a response to the listed assessment item in terms of agreement/self-rating/behavioral task domains.
Sample Questions:
Agreement: I often show interest in new cultural aspects (e.g., to understand the values, history, traditions, etc.). (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Self Rating: I am effective at engaging in relationships with international students. (Not at all effective/Somewhat effective/Effective/Very effective)
Essay: How specifically might one work to increase or achieve cultural competence? Provide examples in the community, school, work, and other situations.

Benchmark #10
Big Idea: Adaptability
Enduring Understanding: Flexibility
Benchmark: Students will demonstrate communicative flexibility with members of other cultures.
Item types: Agreement, Self Rating, Task/Behavior Domain
Content Limits: The text should contain clear and sufficient context for determining the meaning of the question. Each question should not take more than 30 seconds to answer.
Stimulus Attributes: Students will demonstrate communicative flexibility with members of other cultures.
Response Attributes: Students will demonstrate communicative flexibility by selecting a response to the listed assessment item in terms of agreement/self-rating/behavioral task domains.
Sample Questions:
Agreement: I demonstrate flexibility when interacting with persons from another culture. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Self Rating: I am effective at demonstrating flexibility when interacting with persons from another culture. (Not at all effective/Somewhat effective/Effective/Very effective)
Essay: Adapting to different cultural situations is important. Why or why not?

Benchmark #11
Big Idea: Adaptability
Enduring Understanding: High tolerance for cultural ambiguity and differences
Benchmark: Students will demonstrate a high tolerance for cultural ambiguity and differences with members of other cultures.
Item types: Agreement, Self Rating, Task/Behavior Domain
Content Limits: The text should contain clear and sufficient context for determining the meaning of the question. Each question should not take more than 30 seconds to answer.
Stimulus Attributes: Students will demonstrate a high tolerance for cultural ambiguity and differences with members of other cultures.
Response Attributes: Students will demonstrate a high tolerance for cultural ambiguity and differences by selecting a response to the listed assessment item in terms of agreement/self-rating/behavioral task domains.
Sample Questions:
Agreement: I can look at the world through the eyes of a person from another culture. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Agreement: I adjusted my behavior, dress, etc., as appropriate, to avoid offending my hosts. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Self Rating: I am effective in situations where there are cultural differences. (Not at all effective/Somewhat effective/Effective/Very effective)
Essay: Discuss: Although persons may have different cultures, they may share similar needs and values. Although persons may identify with the same culture, they may have needs and values that are not the same. Provide examples.

Benchmark #12
Big Idea: Adaptability
Enduring Understanding: Manage cultural misunderstandings
Benchmark: Students will demonstrate the ability to manage cultural misunderstandings with members of other cultures.
Item types: Agreement, Self Rating, Task/Behavior Domain
Content Limits: The text should contain clear and sufficient context for determining the meaning of the question. Each question should not take more than 30 seconds to answer.
Stimulus Attributes: Students will demonstrate the ability to manage cultural misunderstandings with members of other cultures.
Response Attributes: Students will demonstrate the ability to manage cultural misunderstandings by selecting a response to the listed assessment item in terms of agreement/self-rating/behavioral task domains.
Sample Questions:
Agreement: I have seen many situations where cultural differences in the way people express their emotions led to misunderstanding. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Agreement: I deal with my emotions and frustrations when interacting with a different culture. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Self Rating: I am effective at managing in situations where there are cultural misunderstandings. (Not at all effective/Somewhat effective/Effective/Very effective)
Essay: When communicating with persons of other cultures and ethnicities, what are common miscommunications? How can we learn to avoid them?

Benchmark #13
Big Idea: Acceptance
Enduring Understanding: Open-mindedness
Benchmark: Students will demonstrate open-mindedness with members of other cultures.
Item types: Agreement, Self Rating, Task/Behavior Domain
Content Limits: The text should contain clear and sufficient context for determining the meaning of the question. Each question should not take more than 30 seconds to answer.
Stimulus Attributes: Students will demonstrate open-mindedness with members of other cultures.
Response Attributes: Students will demonstrate open-mindedness by selecting a response to the listed assessment item in terms of agreement/self-rating/behavioral task domains.
Sample Questions:
Agreement: I am open-minded when learning about new cultures. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Self Rating: I am effective at being patient when trying to learn about new cultural aspects. (Not at all effective/Somewhat effective/Effective/Very effective)
Essay: Describe your reaction to a time you were in a situation that involved a different cultural norm from your own, and how you felt and responded to this situation.

Benchmark #14
Big Idea: Acceptance
Enduring Understanding: Cultural reception
Benchmark: Students will demonstrate the ability to be culturally receptive with members of other cultures.
Item types: Agreement, Self Rating, Task/Behavior Domain
Content Limits: The text should contain clear and sufficient context for determining the meaning of the question. Each question should not take more than 30 seconds to answer.
Stimulus Attributes: Students will demonstrate the ability to be culturally receptive with members of other cultures.
Response Attributes: Students will demonstrate the ability to be culturally receptive with members of other cultures by selecting a response to the listed assessment item in terms of agreement/self-rating/behavioral task domains.
Sample Questions:
Agreement: I implement differing cultural norms into my everyday life. (Strongly Agree/Agree/Neutral/Disagree/Strongly Disagree)
Self Rating: I am effective at implementing different cultural components into my life. (Not at all effective/Somewhat effective/Effective/Very effective)
Essay: If a few employees or students are speaking their first language, such as Spanish or Japanese, and there is not a reason at this time for them to be communicating with a person who does not speak their native language, should the employees or students be told by a manager, teacher, or someone else to refrain from speaking their native language? Discuss.

APPENDIX E
TEN ANCHOR ITEMS FOR ALTERNATE FORMS

1. I am open to different cultural ways of thinking in any international context.
2. I consider different perspectives before making conclusions about the world.
3. I do not feel threatened when presented with perspectives from outside the U.S.
4. I feel uncomfortable in situations outside my cultural experiences.
5. I prefer to socialize with people from my culture.
6. In a global context, I can reflect on the impact of my decisions.
7. In a global context, I understand how cultural beliefs and values influence decision making.
8. It is important to know about my cultural values.
9. Some cultures are better than others.
10. I feel comfortable discussing international issues.
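Anchor items shared across alternate forms are commonly used to link the two forms' score scales. As one illustration (not the study's documented procedure), a mean-sigma linear linking of the anchor-score distributions can be sketched as follows; the score values here are simulated and purely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical anchor-item total scores (10 common items) observed in the
# group taking Form A and the group taking Form B.
anchor_a = rng.normal(35, 5, size=400)
anchor_b = rng.normal(33, 6, size=400)

def mean_sigma_link(x_from, x_to):
    """Return a linear transformation mapping scores on the 'from' scale
    onto the 'to' scale, matching the anchor means and standard deviations."""
    slope = np.std(x_to, ddof=1) / np.std(x_from, ddof=1)
    intercept = np.mean(x_to) - slope * np.mean(x_from)
    return lambda score: slope * score + intercept

# Map Form B scores onto the Form A scale via the common anchors.
to_form_a = mean_sigma_link(anchor_b, anchor_a)
equated_mean = to_form_a(float(np.mean(anchor_b)))
```

By construction, the Form B anchor mean maps exactly onto the Form A anchor mean; more elaborate designs would instead link the IRT item parameters directly.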

APPENDIX F
ALTERNATE FORM A FOR FIRST PILOTING PHASE

A. Please indicate how strongly you agree or disagree with the statements in terms of yourself. For the following questions, please consider the word "culture" as "interacting with and thinking about cultures from outside the United States."

1. Gender: 1. Female 2. Male
2. Class Level: 1. First Year 2. Second Year 3. Third Year 4. Fourth Year or more
3. Survey Form: 1. Form A 2. Form B
4. I am open to different cultural ways of thinking in any international context.
5. I consider different perspectives before making conclusions about the world.
6. It is important to know about my cultural values.
7. I am able to think critically to interpret global and intercultural issues.
8. I am able to demonstrate knowledge of global and intercultural conditions and interdependences.
9. I understand how cultural beliefs and values influence decision making.
10. I can make effective decisions when placed in different cultural situations.
11. It is important to know about my cultural values.
12. Some cultures are better than others.
13. I can reflect on the impact of my decisions in different cultural settings.
14. When comparing cultures, there are some cultures that are better than others.
15. I am open to other cultural ways of thinking.

16. I consider different perspectives before making a decision.
17. I am patient when managing in different cultural situations.
18. People from my culture work harder than people from other cultures.
19. I actively learn about different cultural norms.
20. I do not have a culture.
21. I am able to understand what composes a culture.
22. I understand why there is conflict among nations of different cultures.
23. I am able to recognize how members of other cultures make decisions.
24. As compared to other perspectives, my cultural point of view should be used to make decisions.
25. Knowing about other cultural norms and beliefs is important to me.
26. When comparing cultures, there are some cultures that are better than others.
27. I try to find solutions to cultural differences.
28. I am confident of my cultural beliefs and values.
29. It is not important to know about my cultural values.
30. I believe you can know someone well without knowing about their cultural beliefs.
31. I am able to manage when faced with multiple cultural perspectives.
32. I work to bridge differences between cultures.
33. Other people say that I am interested in other cultures.
34. Everyone has the same cultural values.
35. I depend on others to tell me how to determine my cultural beliefs.
36. Cultural diversity is an important part of the college experience.
37. I am able to communicate effectively with members of other cultures.
38. I feel uncomfortable in situations outside my cultural experiences.

PAGE 186

39. I am able to interact effectively with members of other cultures.
40. I do not feel threatened when presented with multiple perspectives.
41. I prefer to socialize with people of my culture.
42. I can reflect on the impact of my decisions.
43. I feel comfortable discussing international issues.
44. I can look at the world through the eyes of a person from another culture.
45. I often have conversations with members of other cultures.
46. I often ask questions about culture to members of other cultures.
47. I appreciate members of other cultures teaching me about their culture.
48. I feel confident I know how to act in most cultural situations.
49. I feel my cultural perspective is the most appropriate.
50. When I am in a conversation with a student from another culture, I often ask questions.
51. I can clearly articulate my message to members of other cultures.
52. When working on group projects with a student from another culture, I feel confident that I am able to collaborate with this student.
53. I evaluate situations in my own culture based on my experiences and knowledge of other cultures.
54. I have the ability to recognize cultural differences easily.
55. I deal with my emotions and frustrations when interacting with a different culture.
56. I act differently when around students from other cultures.
57. I have frequently been in situations where there was a cultural misunderstanding.
58. People from my culture are more accepting of cultural differences than people of other cultures.
59. When analyzing global issues, I tend to view these issues only through my cultural perspective.
60. I believe you can know someone well without knowing about their cultural beliefs.

61. When in different cultural situations, I adapt my behavior to fit in.
62. nderstand my point of view.
63. I am patient when dealing with members of other cultures.
64. I can clearly articulate my point of view to members of other cultures.
65. I like working in groups with students from other countries.
66. After a conversation about different cultural views, I reflect about the discussion.
67. I feel comfortable in conversations that may involve cultural differences.
68. When working on a group project, I enjoy collaborating with students from other countries.
69. I demonstrate flexibility when interacting with members of another culture.
70. When interacting with an international student, I am conscious of the cultural differences.
71. I try to have friends from different cultural backgrounds.

APPENDIX G
ALTERNATE FORM B FOR FIRST PILOTING PHASE

A. Please indicate how strongly you agree or disagree with the statements in terms of yourself. For the following questions, please consider the word "culture" as "interacting with and thinking about cultures from outside the United States."

1. Gender: 1. Female 2. Male
2. Class Level: 1. First Year 2. Second Year 3. Third Year 4. Fourth Year or more
3. Survey Form: 1. Form A 2. Form B
2. I am open to different cultural ways of thinking in any international context.
3. I consider different perspectives before making conclusions about the world.
4. It is important to know about my cultural values.
5. I am able to think critically to interpret global and intercultural issues.
6. I am able to demonstrate knowledge of global and intercultural conditions and interdependences.
7. I understand how cultural beliefs and values influence decision making.
8. I can make effective decisions when placed in different cultural situations.
9. It is important to know about my cultural values.
10. Some cultures are better than others.
11. I can reflect on the impact of my decisions in different cultural settings.
12. I can contrast important aspects of different cultures with my own.

13. Cultural knowledge is an important job skill.
14. Students from other cultures are more motivated than students from my culture.
15. My culture values critical thinking more than other cultures.
16. When placed in different cultural situations, I ask students of other cultures questions about their culture.
17. I challenge other people to understand different cultural perspectives.
18. When looking at global issues, I tend to view these issues through my cultural lens.
19. My personal cultural view should be used to make decisions.
20. I feel comfortable working with international students.
21. Knowing about other cultural beliefs is important.
22. It is acceptable for one culture to have more opportunities than another culture.
23. People from other cultures value family more than people from my culture.
24. I often question my own cultural views.
25. I am able to assist others in discovering their cultural beliefs.
26. I can recognize how different cultures solve problems.
27. Not everyone has a culture.
28. I am able to analyze arguments utilizing different perspectives.
29. I have been exposed to several cultural ways of thinking.
30. I discuss international issues with members of other cultures.
31. Understanding international news is important to understand other cultures.
32. Incorporating more diverse opinions does not assist in solving problems.
33. There should be one universal culture.
34. I feel confident I know how to act in most cultural situations.

35. It is easy to determine right and wrong in different cultural settings.
36. Understanding different points of view is a priority to me.
37. I am able to communicate effectively with members of other cultures.
38. I feel uncomfortable in situations outside my cultural experiences.
39. I am able to interact effectively with members of other cultures.
40. I do not feel threatened when presented with multiple perspectives.
41. I prefer to socialize with people of my culture.
42. I can reflect on the impact of my decisions.
43. I feel comfortable discussing international issues.
44. I can look at the world through the eyes of a person from another culture.
45. I often have conversations with members of other cultures.
46. I often ask questions about culture to members of other cultures.
47. I appreciate members of other cultures teaching me about their culture.
48. I feel uncomfortable in situations outside my cultural experiences.
49. I feel comfortable assisting international students in adjusting to the US culture.
50. I often reflect upon discussions regarding international issues.
51. I often have conversations with members of other cultures.
52. I could discuss and contrast various behavioral patterns in my culture with those of another culture.
53. I am confident of my cultural beliefs.
54. Most of my friends have the same cultural perspective as mine.
55. I demonstrate flexibility when interacting with persons from another culture.
56. I am able to find commonalities between different cultures.
57. I appreciate differences between cultures.

58. I am able to adapt to different cultural situations.
59. People should interact with members of their own cultural group.
60. People are all the same despite cultural differences.
61. I enjoy learning about other cultures.
62. It is important to interact with international students.
63.
64. I am sensitive to situations that involve a cultural misunderstanding.
65. I feel offended when someone imposes their cultural view on me.
66. I am aware of the social interactions of other cultures.
67. Others would say that I interact with members of other cultures.
68. I enjoy learning about other cultures.
69. I am confident that I can adapt to different cultural environments.
70. Too much emphasis is placed on cultural differences.
71. When presented with a different cultural perspective, I can adapt it to my way of thinking.
72. Being a global citizen is an important value in my life.

APPENDIX H
ALTERNATE FORM A FOR SECOND PILOTING PHASE

1. Gender:
2. Class Level:
3. I can reflect on the impact of my decisions in different cultural settings.
4. I am able to communicate effectively with members of other cultures.
5. I feel uncomfortable in situations outside my cultural experiences.
6. I am able to interact effectively with members of other cultures.
7. I do not feel threatened when presented with multiple perspectives.
8. I prefer to socialize with people of my culture.
9. I can reflect on the impact of my decisions.
10. I feel comfortable discussing international issues.
11. I can look at the world through the eyes of a person from another culture.
12. I often have conversations with members of other cultures.
13. I often ask questions about culture to members of other cultures.
14. I appreciate members of other cultures teaching me about their culture.
15. I feel confident I know how to act in most cultural situations.
16. I feel my cultural perspective is the most appropriate.
17. When I am in a conversation with a student from another culture, I often ask questions.
18. I can clearly articulate my message to members of other cultures.
19. When working on group projects with a student from another culture, I feel confident that I am able to collaborate with this student.
20. I evaluate situations in my own culture based on my experiences and knowledge of other cultures.
21. I have the ability to recognize cultural differences easily.

22. I deal with my emotions and frustrations when interacting with a different culture.
23. I act differently when around students from other cultures.
24. I have frequently been in situations where there was a cultural misunderstanding.
25. People from my culture are more accepting of cultural differences than people of other cultures.
26. When analyzing global issues, I tend to view these issues only through my cultural perspective.
27. I believe you can know someone well without knowing about their cultural beliefs.
28. When in different cultural situations, I adapt my behavior to fit in.
29. I feel frustrat
30. I am patient when dealing with members of other cultures.
31. I can clearly articulate my point of view to members of other cultures.
32. I like working in groups with students from other countries.
33. After a conversation about different cultural views, I reflect about the discussion.
34. I feel comfortable in conversations that may involve cultural differences.
35. When working on a group project, I enjoy collaborating with students from other countries.
36. I demonstrate flexibility when interacting with members of another culture.
37. When interacting with an international student, I am conscious of the cultural differences.
38. I try to have friends from different cultural backgrounds.
39. I am open to different cultural ways of thinking in any international context.
40. I consider different perspectives before making conclusions about the world.
41. It is important to know about my cultural values.
42. I am able to think critically to interpret global and intercultural issues.
43. I am able to demonstrate knowledge of global and intercultural conditions and interdependences.

44. I understand how cultural beliefs and values influence decision making.
45. I can make effective decisions when placed in different cultural situations.
46. It is important to know about my cultural values.
47. I can reflect on the impact of my decisions in different cultural settings.
48. I am open to other cultural ways of thinking.
49. I consider different perspectives before making a decision.
50. I am patient when managing in different cultural situations.
51. I actively learn about different cultural norms.
52. I am able to understand what composes a culture.
53. I understand why there is conflict among nations of different cultures.
54. I am able to recognize how members of other cultures make decisions.
55. Knowing about other cultural norms and beliefs is important to me.
56. I try to find solutions to cultural differences.
57. I am confident of my cultural beliefs and values.
58. I am able to manage when faced with multiple cultural perspectives.
59. I work to bridge differences between cultures.
60. Other people say that I am interested in other cultures.
61. Cultural diversity is an important part of the college experience.
62. I can contrast important aspects of different cultures with my own.
63. Cultural knowledge is an important job skill.
64. When placed in different cultural situations, I ask students of other cultures questions about their culture.
65. I challenge other people to understand different cultural perspectives.
66. When looking at global issues, I tend to view these issues through my cultural lens.
67. I feel comfortable working with international students.
68. Knowing about other cultural beliefs is important.

69. I am able to assist others in discovering their cultural beliefs.
70. I can recognize how different cultures solve problems.
71. Not everyone has a culture.
72. I am able to analyze arguments utilizing different perspectives.
73. I have been exposed to several cultural ways of thinking.
74. Understanding international news is important to understand other cultures.
75. Incorporating more diverse opinions does not assist in solving problems.
76. There should be one universal culture.
77. I feel confident I know how to act in most cultural situations.
78. Understanding different points of view is a priority to me.

APPENDIX I
ALTERNATE FORM B FOR SECOND PILOTING PHASE

1. Gender
2. Class Level
3. I am able to communicate effectively with members of other cultures.
4. I feel uncomfortable in situations outside my cultural experiences.
5. I am able to interact effectively with members of other cultures.
6. I do not feel threatened when presented with multiple perspectives.
7. I prefer to socialize with people of my culture.
8. I can reflect on the impact of my decisions.
9. I feel uncomfortable in situations outside my cultural experiences.
10. I can reflect on the impact of my decisions.
11. I often have conversations with members of other cultures.
12. I often ask questions about culture to members of other cultures.
13. I appreciate members of other cultures teaching me about their culture.
14. I feel confident I know how to act in most cultural situations.
15. I could discuss and contrast various behavioral patterns in my culture with those of another culture.
16. I am confident of my cultural beliefs.
17. Most of my friends have the same cultural perspective as mine.
18. I demonstrate flexibility when interacting with persons from another culture.
19. I am able to find commonalities between different cultures.
20. I appreciate differences between cultures.
21. I am able to adapt to different cultural situations.
22. People should interact with members of their own cultural group.

23. People are all the same despite cultural differences.
24. I enjoy learning about other cultures.
25. It is important to interact with international students.
27. I am sensitive to situations that involve a cultural misunderstanding.
28. I feel offended when someone imposes their cultural view on me.
29. I am aware of the social interactions of other cultures.
30. Others would say that I interact with members of other cultures.
31. I enjoy learning about other cultures.
32. I am confident that I can adapt to different cultural environments.
33. Too much emphasis is placed on cultural differences.
34. When presented with a different cultural perspective, I can adapt it to my way of thinking.
35. Being a global citizen is an important value in my life.
36. When interacting with an international student, I am conscious of the cultural differences.
37. I am open to different cultural ways of thinking in any international context.
38. I consider different perspectives before making conclusions about the world.
39. It is important to know about my cultural values.
40. I am able to think critically to interpret global and intercultural issues.
41. I am able to demonstrate knowledge of global and intercultural conditions and interdependences.
42. I understand how cultural beliefs and values influence decision making.
43. I can make effective decisions when placed in different cultural situations.
44. It is important to know about my cultural values.
45. I can reflect on the impact of my decisions in different cultural settings.

46. I am open to other cultural ways of thinking.
47. I consider different perspectives before making a decision.
48. I am patient when managing in different cultural situations.
49. I actively learn about different cultural norms.
50. I am able to understand what composes a culture.
51. I understand why there is conflict among nations of different cultures.
52. I am able to recognize how members of other cultures make decisions.
53. Knowing about other cultural norms and beliefs is important to me.
54. I try to find solutions to cultural differences.
55. I am confident of my cultural beliefs and values.
56. I am able to manage when faced with multiple cultural perspectives.
57. I work to bridge differences between cultures.
58. Other people say that I am interested in other cultures.
59. Cultural diversity is an important part of the college experience.
60. I can contrast important aspects of different cultures with my own.
61. Cultural knowledge is an important job skill.
62. When placed in different cultural situations, I ask students of other cultures questions about their culture.
63. I challenge other people to understand different cultural perspectives.
64. When looking at global issues, I tend to view these issues through my cultural lens.
65. I feel comfortable working with international students.
66. Knowing about other cultural beliefs is important.
67. I am able to assist others in discovering their cultural beliefs.
68. I can recognize how different cultures solve problems.
69. Not everyone has a culture.
70. I am able to analyze arguments utilizing different perspectives.

71. I have been exposed to several cultural ways of thinking.
72. Understanding international news is important to understand other cultures.
73. Incorporating more diverse opinions does not assist in solving problems.
74. There should be one universal culture.
75. I feel confident I know how to act in most cultural situations.
76. Understanding different points of view is a priority to me.

APPENDIX J
FINAL PILOT OF COMMUNICATION SKILL ITEMS (SECOND PHASE)

1. Gender:
2. Class Level:
3. I am able to communicate effectively with members of other cultures.
4. I am able to interact effectively with members of other cultures.
5. I do not feel threatened when presented with multiple perspectives.
6. I prefer to socialize with people of my culture.
7. I feel comfortable discussing international issues.
8. I can look at the world through the eyes of a person from another culture.
9. I often have conversations with members of other cultures.
10. I often ask questions about culture to members of other cultures.
11. I appreciate members of other cultures teaching me about their culture.
12. I feel confident I know how to act in most cultural situations.
13. When I am in a conversation with a student from another culture, I often ask questions.
14. I can clearly articulate my message to members of other cultures.
15. When working on group projects with a student from another culture, I feel confident that I am able to collaborate with this student.
16. I evaluate situations in my own culture based on my experiences and knowledge of other cultures.
17. I have the ability to recognize cultural differences easily.
18. When analyzing global issues, I tend to view these issues only through my cultural perspective.
19. I am patient when dealing with members of other cultures.
20. I can clearly articulate my point of view to members of other cultures.
21. I like working in groups with students from other countries.

22. After a conversation about different cultural views, I reflect about the discussion.
23. I feel comfortable in conversations that may involve cultural differences.
24. When working on a group project, I enjoy collaborating with students from other countries.
25. I demonstrate flexibility when interacting with members of another culture.
26. When interacting with an international student, I am conscious of the cultural differences.
27. I try to have friends from different cultural backgrounds.
28. I could discuss and contrast various behavioral patterns in my culture with those of another culture.
29. I am confident of my cultural beliefs.
30. I am able to find commonalities between different cultures.
31. I appreciate differences between cultures.
32. I am able to adapt to different cultural situations.
33. I enjoy learning about other cultures.
34. It is important to interact with international students.
35. When I do not understa
36. I am aware of the social interactions of other cultures.
37. Others would say that I interact with members of other cultures.
38. I enjoy learning about other cultures.
39. I am confident that I can adapt to different cultural environments.
40. Being a global citizen is an important value in my life.
41. I believe you can know someone well without knowing about their cultural beliefs.

APPENDIX K
INSTITUTIONAL REVIEW BOARD APPROVAL

UFIRB 02 Social & Behavioral Research Protocol Submission Form

This form must be typed. Send this form and the supporting documents to IRB02, PO Box 112250, Gainesville, FL 32611. Should you have questions about completing this form, call 352-392-0433.

Title of Protocol: Internationalization Assessment for University of Florida Quality Enhancement Plan
Principal Investigator: Timothy J Wilson
UFID #: 9435 0196
Degree/Title: Doctoral Student/Research Assistant
Mailing Address (if on campus include PO Box address): 1215 Norman Hall, PO Box 117049, Gainesville, FL 32611-7049
Email: tjwilson@coe.ufl.edu
Department: School of Human Development and Organizational Studies
Telephone #: 352-273-4293
Co-Investigator(s): UFID#: Email:
Supervisor (if PI is student): Dale Campbell
UFID#: 2916 5530
Degree/Title: Professor
Mailing Address (if on campus include PO Box address): 1215 Norman Hall, PO Box 117049, Gainesville, FL 32611-7049
Email: dfc@coe.ufl.edu
Department: School of Human Development and Organizational Studies
Telephone #: 352-273-4300
Date of Proposed Research: October 4, 2013 to October 18, 2013
Source of Funding (a copy of the grant proposal must be submitted with this protocol if funding is involved): Self-funded

Scientific Purpose of the Study: This study develops an instrument to understand undergraduate student perceptions of global awareness and intercultural competency. In support of accreditation, this study intends to pilot the instrument on undergraduate students from various majors who are currently enrolled in specific large-enrollment courses.

Describe the Research Methodology in Non-Technical Language (explain what will be done with or to the research participant): Once permission is granted by the instructor of the course, the instructor will notify the student participants by email or through Institutional Planning & Research. The students will be sent an email with the attached documentation and survey information. Here is the information provided to the students:

Dear Students,

In its efforts to develop a quality educational experience for its students, UF has decided to focus on internationalization of the learning experience for the undergraduate population through a strategic initiative known as Learning Without Borders: Internationalizing the Gator Nation. In order to understand student progress in global awareness and intercultural competency, we are attempting to survey UF undergraduate students from various academic disciplines. You have been identified for this survey because of your enrollment in _______ (class). The survey focuses on your personal views of global awareness and intercultural competency. This survey is designed to collect information about UF undergraduate student perceptions of internationalization, and the results will be examined to provide further insights into the reliability and validity of the survey. The survey will take about 10-15 minutes of your time. Your responses will be held in confidence and your identity will never be revealed. Only the evaluator or survey administrator, Dr. M. David Miller, will have access to the individual survey responses. All reporting of data will be done in the aggregate without identification of individual respondents.
Your participation will assist the University of Florida in becoming a global campus that fosters students who will be better prepared for the global workforce. Your data will be held strictly confidential. The survey can be completed through the following link:

Please complete the survey by _____ (Date). If you have questions or concerns regarding this survey, please contact me at tjwilson@coe.ufl.edu. Thanks for your participation.

In addition, the students will be asked to answer the following questions based on a 5-response-option Likert scale (Strongly Disagree, Disagree, Neutral, Agree, Strongly Agree):

1. I consider different perspectives before making conclusions about the world.
2. I am able to manage when faced with multiple cultural perspectives.
3. I am open to different cultural ways of thinking in any international context.
4. I can make effective decisions when placed in different cultural situations.
5. Knowing about other cultural norms and beliefs is important to me.
6. I am able to think critically to interpret global and intercultural issues.
7. I actively learn about different cultural norms.
8. Understanding different points of view is a priority to me.
9. I can recognize how different cultures solve problems.
10. I can contrast important aspects of different cultures with my own.
11. Knowing about other cultural beliefs is important.
12. I am able to recognize how members of other cultures make decisions.
13. I demonstrate flexibility when interacting with members of another culture.
14. I prefer to socialize with people of my culture.
15. I am confident that I can adapt to different cultural environments.
16. I am able to communicate effectively with members of other cultures.
17. I like working in groups with students from other countries.
18. I feel comfortable in conversations that may involve cultural differences.
19. When working on a group project, I enjoy collaborating with students from other countries.
20. I often ask questions about culture to members of other cultures.
21. I enjoy learning about other cultures.
22. I appreciate members of other cultures teaching me about their culture.
23. I am able to interact effectively with members of other cultures.
24. I appreciate differences between cultures.
25. I feel comfortable discussing international issues.
26. I can clearly articulate my point of view to members of other cultures.
In addition, students will be asked to provide such demographic info as gender, course enrollment, previous enrollment in an internationally designated course, study abroad experience, and whether or not they are an international student.

Describe Potential Benefits: Participation in this study is voluntary. There are no tangible, potential benefits to survey participants. Participants will be informed via informed consent that they are not required to respond to questions they do not wish to answer. The responses will create a better informed conceptualization of internationalization perspectives of UF undergraduate students.

Describe Potential Risks (if risk of physical, psychological or economic harm may be involved, describe the steps taken to protect participants): There are no perceived physical or economic risks associated with participation in this study. Steps will be taken to protect the identity of the participants in all presentations and publications of this study.

Describe How Participant(s) Will Be Recruited: The survey administrators will contact potential participants via email, and students who are enrolled in the following courses will be asked to participate: ACG2021, CLP2001, AST1002, COM1000, ANT2000, ANT2410, CHM1025, GEO2242, and WIS2040.

Maximum Number of Participants (to be approached with consent): 5000
Age Range of Participants: 18-60
Amount of Compensation/course credit: 0/0

Describe the Informed Consent Process (attach a copy of the informed consent document; see http://irb.ufl.edu/irb02/samples.html for examples of consent): Participants will read the consent paragraph of the survey and agree to participate before completing the rest of the survey. The consent paragraph informs individuals of the nature of the study, the rights of participants, and who they can contact for additional information. It will inform participants that their participation is voluntary and that their responses and identity will be kept confidential to the extent provided by law. They can discontinue their participation at any time without consequences.

(SIGNATURE SECTION)
Principal Investigator(s) Signature: Date:
Co-Investigator(s) Signature(s): Date:
Supervisor Signature (if PI is a student): Date:
Department Chair Signature: Date:


APPENDIX L
INFORMED CONSENT FORM


APPENDIX M
ESTIMATED ITEM PARAMETERS FOR CRITICAL THINKING AND COMMUNICATION SKILL ITEMS

Item Parameter Estimates for Factor One: Critical Thinking

Item Name       a      b1     b2     b3     b4
Perspectives    2.1    2.4    2.1    1.6    0.5
Multiperspect   2.4    3.2    2.2    1.7    0.5
Waythink        2.5    2.1    3.0    1.7    0.4
Effectdec       1.9    3.1    2.1    1.8    1.0
Knownorms       2.9    2.2    2.0    1.0    .01
Thinkcritical   2.0    3.0    1.9    .06    0.6
Actlearn        1.9    2.3    1.6    .02    0.7
Ptsview         2.1    3.0    2.0    1.0    .04
Solvprob        1.9    2.3    1.8    .03    1.2
Contrcult       2.4    2.5    2.0    1.0    0.5
Knowbeliefs     2.9    2.3    2.0    1.5    0.0
Recdec          1.9    3.2    2.8    0.3    1.6

Item Parameter Estimates for Factor Two: Communication Skills

Item Name       a      b1     b2     b3     b4
Flex            2.3    2.2    3.6    1.7    0.5
Social          0.27   1.0    1.0    1.0    1.0
Adapt           1.7    3.7    2.5    1.0    1.0
Commeffect      1.7    3.9    2.3    0.9    1.6
Grpwork         1.7    3.6    2.2    0.5    1.0
Comfortconv     2.4    3.5    2.1    1.0    .08
Grpcollob       1.8    3.6    2.5    0.2    1.0
Askquest        1.8    3.0    1.8    1.0    0.5
Enjoylearn      2.9    2.2    2.5    1.5    0.1
Teachcult       2.8    3.0    2.6    1.1    0.2
Inteffect       2.6    3.2    2.5    1.0    0.5
Appdiff         3.1    3.0    2.1    1.0    0.5
Intissues       1.7    3.2    2.0    0.8    1.0
Articulate      1.6    3.8    2.1    0.9    1.2
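The a (discrimination) and b1-b4 (category threshold) columns above are parameters of Samejima's graded response model for five-point Likert items. A minimal sketch of how such parameters map a latent trait level theta to the five response-category probabilities is given below; the function name and the numeric values used in the example are illustrative assumptions, not estimates taken from the table (whose printed signs may not be reliable in this conversion):

```python
import math

def grm_category_probs(theta, a, thresholds):
    """Samejima's graded response model: probabilities of the k+1
    ordered response categories for an item with discrimination `a`
    and k increasing thresholds, at latent trait level `theta`."""
    # Cumulative curves P(X >= category k), bracketed by 1 and 0.
    cum = [1.0]
    cum += [1.0 / (1.0 + math.exp(-a * (theta - b))) for b in thresholds]
    cum.append(0.0)
    # Each category probability is the gap between adjacent cumulative curves.
    return [cum[k] - cum[k + 1] for k in range(len(thresholds) + 1)]

# Illustrative 5-point item (hypothetical a and b values):
probs = grm_category_probs(theta=0.0, a=2.0, thresholds=[-2.4, -1.6, -0.5, 1.0])
```

Under this model the thresholds must be strictly increasing, the five probabilities always sum to 1, and raising theta shifts probability mass toward the highest ("Strongly Agree") category.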


LIST OF REFERENCES

Ackerman, T. A., & American College Testing Program. (1988). The use of unidimensional item parameter estimates of multidimensional items in adaptive testing. Iowa City, IA: American College Testing Program.

Altbach, P. G., Reisberg, L., & Rumbley, L. E. (2010). Tracking a global academic revolution. Change: The Magazine of Higher Learning, 42(2), 30-39.

Altshuler, L., Sussman, N., & Kachur, E. (2003). Assessing changes in intercultural sensitivity among physician trainees using the Intercultural Development Inventory. International Journal of Intercultural Relations, 27(4), 387-401.

Andersen, E. B. (1977). Sufficient statistics and latent trait models. Psychometrika, 42, 69-81.

Arasaratnam, L. A., Banerjee, S. C., & Dembek, K. (2010). Sensation seeking and the integrated model of intercultural communication competence. Journal of Intercultural Communication Research, 39(2), 69-79.

Astin, A. W., & American Council on Education. (1991). Assessment for excellence: The philosophy and practice of assessment and evaluation in higher education. New York: American Council on Education.

Benson, J. (1998). Developing a strong program of construct validation: A test anxiety example. Educational Measurement: Issues and Practice, 17, 10-17.

Braskamp, L. A., Braskamp, D. C., & Merrill, K. C. (2009). Assessing progress in global learning and development of students with education abroad experiences. Frontiers: The Interdisciplinary Journal of Study Abroad, 18, 101-118.

Bremer, L., & van der Wende, M. (Eds.). (1995). Internationalizing the curricula in higher education: Experiences in the Netherlands. The Hague, The Netherlands: Organization for International Co-operation in Higher Education.

Brophy, T. (2007). The Florida Music Assessment: 2007 elementary phase I field test report.

Brown, D. G. (2000). Interactive learning: Vignettes from America's most wired campuses. Bolton, MA: Anker Pub. Co.

Burris, A. (2006). Institutional effectiveness in internationalization: A case study of internationalization at three higher education institutions. Ed.D. dissertation, The George Washington University, District of Columbia. Retrieved from Dissertations & Theses: A&I. (Publication No. AAT 3199633).

Case, R. (2005). Moving critical thinking to the main stage. Education Canada, 45(2), 45-49.


Chen, W., & Thissen, D. (1997). Local dependence indexes for item pairs using item response theory. Journal of Educational and Behavioral Statistics, 22(3), 265. doi:10.2307/1165285
Chernyshenko, O., Stark, S., Chan, K., Drasgow, F., & Williams, B. (2001). Fitting item response theory models to two personality inventories: Issues and insights. Multivariate Behavioral Research, 36(4), 523-562. doi:10.1207/S15327906MBR3604_03
Childress, L. K. (2009). Internationalization Plans for Higher Education Institutions. Journal of Studies in International Education, 13(3), 289-309.
Choppin, J. (1997). The learning organization. Managing Service Quality, 7(6), 269-273. doi:10.1108/09604529710186606
Collier, M. J. (1989). Cultural and Intercultural Communication Competence: Current Approaches and Directions for Future Research. International Journal of Intercultural Relations, 13(3), 287-302.
Costello, A. B., & Osborne, J. W. (2005). Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research & Evaluation, 10(7).
Crocker, L., & Algina, J. (1986). Introduction to classical and modern test theory. Holt, Rinehart and Winston.
Cushner, K. (2008). International Socialization of Young People: Obstacles and Opportunities. International Journal of Intercultural Relations, 32(2), 164-173.
Deardorff, D. K. (2006). Identification and Assessment of Intercultural Competence as a Student Outcome of Internationalization. Journal of Studies in International Education, 10(3), 241-266.
Deardorff, D. K. (2004). The identification and assessment of intercultural competence as a student outcome of internationalization at institutions of higher education in the United States. Retrieved from http://repository.lib.ncsu.edu/ir/handle/1840.16/5733
DeJaeghere, J. (2009). Does Professional Development Matter? International Journal of Intercultural Relations, 33(5), 437-447.
International Strategies and Activities of Public U.S. Universities. http://arizona.openrepository.com/arizona/handle/10150/293359, accessed October 7, 2013.
De Wit, H. (2002). Internationalization of higher education in the United States of America and Europe. Westport, CT: Greenwood Publishing Group.


Dinges, N. (1983). Intercultural competence. In D. Landis & R. W. Brislin (Eds.), Handbook of intercultural training (Vol. 1, pp. 176-202). Elmsford, NY: Pergamon.
Dillenbourg, P., Baker, M., Blaye, A., & O'Malley, C. (1996). The evolution of research on collaborative learning. In E. Spada & P. Reiman (Eds.), Learning in humans and machine: Towards an interdisciplinary learning science (pp. 189-211). Oxford, England: Elsevier.
Dowd, A. C., Sawatzky, M., & Korn, R. (2011). Theoretical Foundations and a Research Agenda to Validate Measures of Intercultural Effort. The Review of Higher Education, 35(1), 17-44.
Dunning, J., Edward, H., Chen, K. Y., & Fatouros, A. (1998). Board of Advisers. http://unctad.org/en/docs/iteiit9v7n1_en.pdf, accessed October 7, 2013.
Edmundson, A. (2007). Globalized e-learning cultural challenges. Hershey, PA: Information Science Pub.
Ellingboe, B. J. (1997). The most frequently asked questions about internationalization. Retrieved February 20, 1998, from http://education.umn.edu/IntEduc/Int.FAQs.html
Ellis, B. B. (1989). Differential item functioning: Implications for test translations. Journal of Applied Psychology, 74(6), 912-921. doi:10.1037/0021-9010.74.6.912
Embretson, S. E., & Reise, S. P. (2000). Item response theory for psychologists. Mahwah, NJ: Lawrence Erlbaum Associates.
Engberg, M. E., & Fox, K. (2011). Exploring the Relationship Between Undergraduate Service-Learning Experiences and Global Perspective-Taking. Journal of Student Affairs Research and Practice, 48(1).
Engberg, M. E., & Fox, K. (2011). Service participation and the development of a global perspective. Journal of Student Affairs Research and Practice, 48(1), 85-105.
Engberg, D., & Green, M. (2002). Promising practices: Spotlighting excellence in comprehensive internationalization. Washington, D.C.: American Council on Education.
Ennis, R. H. (1989). Critical thinking and subject specificity: Clarification and needed research. Educational Researcher, 18(3), 4-10.
Ennis, R. H. (1985). A logical basis for measuring critical thinking skills. Educational Leadership, 43(2), 44-48.
Ewell, P. T. (1988). Implementing assessment: Some organizational issues. In T. Banta (Ed.), Implementing outcomes assessment: Promise and perils (New Directions for Institutional Research, 59, pp. 15-28). San Francisco: Jossey-Bass.


Fabrigar, L. R., Wegener, D. T., MacCallum, R. C., & Strahan, E. J. (1999). Evaluating the use of exploratory factor analysis in psychological research. Psychological Methods, 4, 272-299.
Facione, P. A. (2000). The disposition toward critical thinking: Its character, measurement, and relation to critical thinking skill. Informal Logic, 20(1), 61-84.
Fan, X. (1998). Item Response Theory and Classical Test Theory: An Empirical Comparison of Their Item/Person Statistics. Educational and Psychological Measurement, 58(3), 357-381.
Favia, A., Comins, N. F., & Thorpe, G. L. (2012). The elements of item response theory and its framework in analyzing introductory astronomy college student misconceptions. I. Galaxies. arXiv preprint arXiv:1206.2302.
Fitzpatrick, J. L., Worthen, B. R., & Sanders, J. R. (2004). Program evaluation: Alternative approaches and practical guidelines. Boston: Allyn and Bacon.
Fung, S., & Filippo, J. (2002). What Kinds of Professional International Opportunities May Be Secured for Faculty? New Directions for Higher Education, (117), 57-62.
Gatfield, T. (1999). Examining student satisfaction with group projects and peer assessment. Assessment & Evaluation in Higher Education, 24(4), 365-377.
Graf, A., & Harland, L. K. (2005). Expatriate Selection: Evaluating the Discriminant, Convergent, and Predictive Validity of Five Measures of Interpersonal and Intercultural Competence. Journal of Leadership & Organizational Studies, 11(2), 46-62.
Grandin, J. M., & Hedderich, N. (2009). Global competence for engineers. The SAGE handbook of intercultural competence, 362-373.
Green, M. F. (2007). Internationalizing community colleges: Barriers and strategies. New Directions for Community Colleges, 138, 15-24.
Greenholtz, J. F. (2005). Does intercultural sensitivity cross cultures? Validity issues in porting instruments across languages and cultures. International Journal of Intercultural Relations, 29(1), 73-89. doi:10.1016/j.ijintrel.2005.04.010
Gudykunst, W. B., & Nishida, T. (2001). Anxiety, Uncertainty, and Perceived Effectiveness of Communication Across Relationships and Cultures. International Journal of Intercultural Relations, 25(1), 55-71.
Guth, S., & Helm, F. (2011). Developing Multiliteracies in ELT through Telecollaboration. ELT Journal, 66(1), 42-51.


Hadis, B. F. (2005). Why are they better students when they come back? Determinants of academic focusing gains in the study abroad experience. The Interdisciplinary Journal of Study Abroad, 11(August), 57-67.
Hair, J. F., Tatham, R. L., Anderson, R. E., & Black, W. (1998). Multivariate data analysis (5th ed.). London: Prentice Hall.
Hambleton, R. K., Swaminathan, H., & Rogers, H. J. (1991). Fundamentals of item response theory. Newbury Park, CA: SAGE Publications, Inc.
Hambleton, R. K., & Swaminathan, H. (2010). Item response theory: Principles and applications. Norwell, MA: Kluwer Academic Publishers.
Hammer, M. R. (2011). Additional Cross-cultural Validity Testing of the Intercultural Development Inventory. International Journal of Intercultural Relations, 35(4), 474-487.
Hammer, M. R., & Bennett, M. J. (1998). The Intercultural Development Inventory (IDI) manual. Portland, OR: Intercultural Communication Institute.
Hammer, M. R., Bennett, M. J., & Wiseman, R. (2003). Measuring intercultural sensitivity: The intercultural development inventory. International Journal of Intercultural Relations, 27, 421-443.
Halpern, D. F. (2001). Assessing the effectiveness of critical thinking instruction. The Journal of General Education, 50(4), 270-286.
Harari, M. (1992). The internationalization of the curricula. In C. Klasek (Ed.), Bridges to the futures: Strategies for internationalizing higher education (pp. 52-79). Carbondale, IL: Association of International Education Administrators.
Hays, R. D., Morales, L. S., & Reise, S. P. (2000). Item response theory and health outcomes measurement in the 21st century. Medical Care, 38(9), II28-II42.
Hayward, F. M., & Siaya, L. M. (2001). Public experience, attitudes and knowledge: A report on two national surveys about international education. Washington, D.C.: American Council on Education.
Hooper, D., Coughlan, J., & Mullen, M. R. (2008). Structural equation modelling: Guidelines for determining model fit. Electronic Journal of Business Research Methods, 6(1).
Horan, P. M., DiStefano, C., & Motl, R. W. (2003). Wording Effects in Self-Esteem Scales: Methodological Artifact or Response Style? Structural Equation Modeling: A Multidisciplinary Journal, 10(3), 435-455.


Hsieh, C., von Eye, A., & Maier, K. (2010). Using a multivariate multilevel polytomous item response theory model to study parallel processes of change: The dynamic association between adolescents' social isolation and engagement with delinquent peers in the National Youth Survey. Multivariate Behavioral Research, 45(3), 508. doi:10.1080/00273171.2010.483387
Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6, 1-55.
Hudzik, J. K. (2011). Comprehensive Internationalization. Washington, DC: NAFSA, The Association of International Educators. https://www.pubapps.vcu.edu/global/world/internationalization/pdf/Comprehensive_Internationalization.pdf, accessed October 7, 2013.
Hunter, B. (2006). What Does It Mean to Be Globally Competent? Journal of Studies in International Education, 10(3), 267-285.
Iuspa, F. E. (2010). Assessing the effectiveness of the internationalization process in Higher Education Institutions: A case study of Florida International University. Retrieved from http://digitalcommons.fiu.edu/etd/316/
Jacobson, W., Sleicher, D., & Burke, M. (1999). Portfolio Assessment of Intercultural Competence. International Journal of Intercultural Relations, 23(3), 467-492.
Jayakumar, U. M. (2008). Can Higher Education Meet the Needs of an Increasingly Diverse and Global Society? Campus Diversity and Cross-cultural Workforce Competencies. Harvard Educational Review, 78(4), 615-651.
Johnson, M. S. (2007). Marginal maximum likelihood estimation of item response models in R. Journal of Statistical Software, 20(10), 1-24.
Kelley, C., & Meyers, J. (1993). Cross-cultural adaptability inventory manual. Minneapolis, MN: National Computer Systems.
Kim, Y. Y. (1992). Intercultural communication competence: A systems-thinking view. In W. B. Gudykunst & Y. Y. Kim (Eds.), Readings on communicating with strangers: An approach to intercultural communication (pp. 371-381). New York: McGraw-Hill.
King, P. M., & Baxter Magolda, M. B. (2005). A Developmental Model of Intercultural Maturity. Journal of College Student Development, 46(6), 571-592.
Kirkwood, T. F. (2001). Our Global Age Requires Global Education: Clarifying Definitional Ambiguities. The Social Studies, 92(1), 10-15.
Klak, T., & Martin, P. (2003). Do University-sponsored International Cultural Events Help Students to Appreciate "Difference"? International Journal of Intercultural Relations, 27(4), 445-465.


Knight, J. (2011). Education hubs: A fad, a brand, an innovation? Journal of Studies in International Education, 15, 221-240.
Knight, J. (2004). Internationalization remodeled: Definition, approaches, and rationales. Journal of Studies in International Education, 8(1), 5-31.
Kolen, M. J., & Brennan, R. L. (2004). Test equating, scaling, and linking: Methods and practices. New York: Springer.
Kraemer, T. J., & Beckstead, J. (2003). Establishing the reliability of using the Cross-Cultural Adaptability Inventory with physical therapist students. Journal of Physical Therapy Education, Spring.
Lai, E. R. (2011). Critical thinking: A literature review. A research report. Pearson Learning.
Lipman, M. (1988). Critical thinking: What can it be? Educational Leadership, 46(1), 38-43.
Lowis, M., & Castley, A. (2008). Factors Affecting Student Progression and Achievement: Prediction and Intervention. A Two-year Study. Innovations in Education and Teaching International, 45(4), 333-343.
McAllister, G., & Irvine, J. J. (2000). Cross-cultural competency and multicultural teacher education. Review of Educational Research, 70(1), 3-24.
Madaus, G. F., Kellaghan, T., & Stufflebeam, D. L. (2000). Evaluation models: Viewpoints on educational and human services evaluation. Boston: Kluwer Academic Publishers.
Madera, E. K. (2003). Application of the graded response model to the assessment of student attitudes. ProQuest, UMI Dissertations Publishing.
Medina López-Portillo, A. (2004). College students' intercultural sensitivity development as a result of their studying abroad: A comparative description of two types of study abroad programs. 65(6), 2185-2186.
Michaelides, M. P. (2010). A review of the effects on IRT item parameter estimates with a focus on misbehaving common items in test equating. Frontiers in Psychology, 1, 167. doi:10.3389/fpsyg.2010.00167
Majumdar, B., Keystone, J. S., & Cuttress, L. A. (1999). Cultural sensitivity training among foreign medical graduates. Medical Education, 33(3), 177-184.
Marais, I., & Andrich, D. (2007). RUMMss: Rasch Unidimensional Measurement Models Simulation Studies Software. Perth: The University of Western Australia.
Marsh, H. W. (1996). Positive and Negative Global Self-esteem: A Substantively Meaningful Distinction or Artifactors? Journal of Personality and Social Psychology, 70(4), 810.


Masters, G. N. (1982). A Rasch model for partial credit scoring. Psychometrika, 47(2), 149-174.
Maynard, A., & Martini, M. I. (2005). Learning in cultural context: Family, peers, and school. New York: Springer. doi:10.1007/0-387-27550-9
Mellenbergh, G. J. (1995). Conceptual notes on models for discrete polytomous item responses. Applied Psychological Measurement, 19(1), 91-100. doi:10.1177/014662169501900110
Mestenhauser, J. (2002). In search of a comprehensive approach to international education: A systems perspective. In Critical approaches to international education in the age of cyberculture (pp. 165-202). London: Lit Verlag.
Mestenhauser, J. A., & Ellingboe, B. J. (Eds.). (1998). Reforming the higher education curricula: Internationalizing the campus. Phoenix, AZ: Oryx Press.
Muraki, E. (1992). A generalized partial credit model: Application of an EM algorithm. Applied Psychological Measurement, 16(2), 159-176. doi:10.1177/014662169201600206
Musil, C. M. (2006). Assessing Global Learning: Matching Good Intentions with Good Practice. Washington, DC: Association of American Colleges and Universities.
In G. M. Thomas, J. W. Meyer, F. O. Ramirez, & J. Boli, Institutional structure: Constituting state, society, and the individual. Newbury Park, CA: Sage.
Norris, M., & Lecavalier, L. (2010). Evaluating the use of exploratory factor analysis in developmental disability psychological research. Journal of Autism and Developmental Disorders, 40(1), 8-20. doi:10.1007/s10803-009-0816
Norton, I. (2008, September 26). Changing the face of study abroad. The Chronicle of Higher Education. Retrieved November 10, 2008, from http://chronicle.com/weekly/v55/i05/05b01201.htm
Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric theory.
Olson, C. L., & Kroeger, K. R. (2001). Global competency and intercultural sensitivity. Journal of Studies in International Education, 5, 116-137.
Orlando, M., & Thissen, D. (2000). Likelihood-based item fit indices for dichotomous item response theory models. Applied Psychological Measurement, 24(1), 48-62.
Orr, R. J., & Scott, W. R. (2008). Institutional Exceptions on Global Projects: A Process Model. Journal of International Business Studies, 39(4), 562-588.


Paige, M. (2005). Internationalization of higher education: Performance assessment and indicators.
Paige, R. M. (Ed.). (1993). Education for the intercultural experience. Yarmouth, ME: Intercultural Press.
Paige, M., & Mestenhauser, J. (1999, October). Internationalizing educational administration. Educational Administration Quarterly, 35(4), 500-517.
Paige, R. M., Jacobs-Cassuto, M., Yershova, Y. A., & DeJaeghere, J. (2003). Assessing intercultural sensitivity: An empirical analysis of the Hammer and Bennett intercultural development inventory. International Journal of Intercultural Relations, 27(4), 467-486.
Paul, R. W. (1992). Critical thinking: What, why, and how? New Directions for Community Colleges, 1992(77), 3-24.
Pedersen, P. J. (2010). Assessing Intercultural Effectiveness Outcomes in a Year-long Study Abroad Program. International Journal of Intercultural Relations, 34(1), 70-80.
Peterson, G. P. (2011). Recent Trends and Milestones: Summer 2011. http://smartech.gatech.edu/handle/1853/39256, accessed October 7, 2013.
Pottinger, P. S., & Goldsmith, J. (Eds.). (1979). Defining and measuring competence. San Francisco: Jossey-Bass.
Purnell, L. (2002). The Purnell Model for Cultural Competence. Journal of Transcultural Nursing, 13(3), 193-196.
Qiang, Z. (2003). Internationalization of higher education: Towards a conceptual framework. Policy Futures in Education, 1(2), 248. doi:10.2304/pfie.2003.1.2.5
Reeve, B. B. (2002). An Introduction to Modern Measurement Theory. National Cancer Institute.
Reise, S. P., & Waller, N. G. (2009). Item response theory and clinical measurement. Annual Review of Clinical Psychology, 5(1), 27-48. doi:10.1146/annurev.clinpsy.032408.153553
Reise, S. P., Widaman, K. F., & Pugh, R. H. (1993). Confirmatory factor analysis and item response theory: Two approaches for exploring measurement invariance. Psychological Bulletin, 114(3), 552.
Reckase, M. (2009). Multidimensional item response theory. New York: Springer.


Rudenstine, N. (1997). Empires of the mind. Harvard International Review, 20(1), 84.
Ruzzier, M., Hisrich, R. D., & Antoncic, B. (2006). SME Internationalization Research: Past, Present, and Future. Journal of Small Business and Enterprise Development, 13(4), 476-497.
Samejima, F. (1969). Estimation of latent ability using a response pattern of graded scores. Psychometrika Monograph Supplement, No. 17.
Samonte, K., & Pastor, D. (2011). An Exploratory Factor Analysis of the Global Perspective Inventory. Presented at the 23rd Annual Convention of the Association for Psychological Sciences.
Schmit, M. J., Ryan, A. M., Stierwalt, S. L., & Powell, S. L. (1995). Frame-of-reference effects on personality scores and criterion-related validity. Journal of Applied Psychology, 80, 607-620.
Scriven, M. (1967). The methodology of evaluation. In R. W. Tyler, R. M. Gagne, & M. Scriven (Eds.), Perspectives of curriculum evaluation (pp. 39-83). Chicago: Rand McNally.
Scott, P. (2000). Globalization and Higher Education: Challenges for the 21st Century. Journal of Studies in International Education, 4(1), 3-10.
Scott, P. (Ed.). (1998). The globalization of higher education. Buckingham: Open University Press.
Southern Association of Colleges and Schools. (2012, July). Quality enhancement plan guidelines: Indicators of an acceptable quality enhancement plan. Retrieved March 10 from http://www.sacscoc.org/pdf/Quality%20Enhancement%20Plan%20Guidelines.pdf
Sperandio, J., Grudzinski-Hall, M., & Stewart-Gambino, H. (2010). Developing an Undergraduate Global Citizenship Program: Challenges of Definition and Assessment. International Journal of Teaching and Learning in Higher Education, 22(1), 12-22.
Spitzberg, B. H., & Cupach, W. R. (1984). Interpersonal communication competence. London: Sage.
Stanhope, V., Soloman, P., Pernell-Arnold, A., Sands, R. G., & Bourjolly, J. N. (2005). Evaluating cultural competence among behavioral health professionals. Psychiatric Rehabilitation Journal, 28(3), 225-233.
Straffon, D. A. (2003). Assessing the Intercultural Sensitivity of High School Students Attending an International School. International Journal of Intercultural Relations, 27(4), 487-501.
Stufflebeam, D. L., & Shinkfield, A. J. (2007). Evaluation theory, models, and applications. San Francisco: Jossey-Bass.


Taylor, J. (2004). Toward a Strategy for Internationalization: Lessons and Practice from Four Universities. Journal of Studies in International Education, 8(2), 149-171.
Thissen, D., Nelson, L., Rosa, K., & McLeod, L. D. (2001). Item response theory for items scored in more than two categories. In D. Thissen & H. Wainer (Eds.), Test scoring (pp. 141-186). Hillsdale, NJ: Lawrence Erlbaum Associates.
Thissen, D., & Steinberg, L. (1986). A taxonomy of item response models. Psychometrika, 51(4), 567-577.
Thompson Rivers University. Global Competency: Recognizing Future Global Leaders. Retrieved from http://www.tru.ca/global.html
Tindal, G., & Nolet, V. (1995). Curriculum-based measurement in middle and high schools: Critical thinking skills in content areas. Focus on Exceptional Children, 27(7), 1-22.
Ting-Toomey, S. (1999). Communicating across cultures. New York: Guilford.
Uttaro, T., & Lehman, A. (1999). Graded response modeling of the quality of life interview. Evaluation and Program Planning, 22(1), 41-52. doi:10.1016/S0149-7189(98)00039-1
Upcraft, M. L., & Schuh, J. H. (2001). Assessment practice in student affairs: An applications manual. San Francisco: Jossey-Bass Publishers.
Vaira, M. (2004). Globalization and Higher Education Organizational Change: A Framework for Analysis. Higher Education, 48(4), 483-510.
Van der Wende, M. (1999). An innovation perspective on internationalization of higher education institutionalization: The critical phase. Journal of Studies in International Education, 3(1), 3-22.
Van der Linden, W. J., & Hambleton, R. K. (1997). Item response theory: Brief history, common models, and extensions. Handbook of modern item response theory, 1-28.
Volet, S. E., & Ang, G. (1998). Culturally mixed groups on international campuses: An opportunity for inter-cultural learning. Higher Education Research & Development, 17(1), 5-23.
Wiers-Jenssen, J. (2008). Does higher education attained abroad lead to international jobs? Journal of Studies in International Education, 12(2), 101-130. doi:10.1177/1028315307307656
Wiley, K. S. (2008). Enhancing Student Learning through the Development of Identity and Christian Servant Leadership. http://www.etbu.edu/export/sites/default/QEP/Jeopardy/QEP__Document1_Final_Copy.pdf, accessed October 7, 2013.


Williams, C. T., & Johnson, L. R. (2011). Why Can't We Be Friends? Multicultural Attitudes and Friendships with International Students. International Journal of Intercultural Relations, 35(1), 41-48.
Wiliam, D. (2011). What is assessment for learning? Studies in Educational Evaluation, 37(1), 3-14. doi:10.1016/j.stueduc.2011.03.001
Willingham, D. T. (2007). Critical thinking: Why is it so hard to teach? American Educator, 8-19.
Wright, D. W., Mead, R. J., & Bell, S. R. (1980). BICAL: Calibrating items with the Rasch model. Research Memorandum No. 23C, June 1980. Department of Education, University of Chicago.
Zhai, L. (2004). Studying international students: Adjustment issues and social support. Journal of International Agricultural and Extension Education, 11(1), 97-104.


BIOGRAPHICAL SKETCH

Timothy J. Wilson received his Ph.D. in Higher Education Administration from the University of Florida in the summer of 2014. He has over 10 years of experience in international education. During his doctoral studies, Tim was an alumni research fellow working on projects ranging from access and equality issues to program assessment. In addition, he served three years as the program director for the Community College Futures Assembly, an independent policy think tank for community college presidents, trustees, administrators, and faculty. Prior to his doctoral work, Tim had broad international experience consisting of teaching English as a second language, immigration advising, institutional partnerships and exchanges, and conducting training and development at universities worldwide. In addition, he has consulted for international schools in the areas of hospitality and tourism and STEM programs.