LTRC 2013 CONFERENCE OVERVIEW

MONDAY, JULY 1
8:30-9:00    Workshop Registration
9:00-12:00   Morning Sessions (4 parallel workshops)
12:00-1:00   Lunch Break
1:00-5:00    Afternoon Sessions (4 parallel workshops)
6:00-8:00    Workshop Reception

TUESDAY, JULY 2
8:30-9:00    Workshop Registration
9:00-12:00   Morning Sessions (3 parallel workshops)
12:00-1:00   Lunch Break
1:00-5:00    Afternoon Sessions (3 parallel workshops)
12:00-5:00   ILTA Executive Board Session
5:40-6:20    LTRC Newcomers Session
6:30-8:00    Opening Reception (KICE)

WEDNESDAY, JULY 3
8:15-5:30    Conference Registration
8:30-5:30    Exhibition
9:00-9:30    Opening Ceremony (Conference Chair; ILTA President; Dignitaries from MOE & SNU; President of KICE, Seong Taeje)
9:30-9:45    Group Photo
9:45-9:55    Remembering John Trim
9:55-10:55   Samuel J. Messick Memorial Lecture
10:55-11:10  Break
11:10-12:50  Paper Session 1 (2 Parallel Paper Sessions)
12:50-2:20   Lunch Break
12:50-2:20   Language Assessment Quarterly Editorial Board Meeting
2:20-3:20    Poster Session
3:20-3:40    Break
3:40-5:40    Symposia 1 (Plenary – no parallel sessions)
6:00-7:00    LTRC 2013 Special Session on Language Testing in Korea

THURSDAY, JULY 4
7:45-6:50    Conference Registration
8:00-6:50    Exhibition
8:00-9:40    Paper Session 2 (3 Parallel Sessions)
9:40-10:00   Break
10:00-10:55  British Council Scholar Lecture
10:55-11:10  Break
11:10-12:50  Paper Session 3 (3 Parallel Sessions)
12:50-2:50   Lunch Break
12:50-2:50   ILTA Business Meeting
2:50-4:50    Symposia 2 (3 Parallel Sessions)
4:50-5:10    Break
5:10-6:50    Paper Session 4 (3 Parallel Sessions)

FRIDAY, JULY 5
8:15-5:25    Conference Registration
8:30-5:25    Exhibition
8:30-10:10   Paper Session 5 (3 Parallel Sessions)
10:10-10:30  Break
10:30-12:00  Works-In-Progress Session
12:00-1:30   Lunch Break
1:30-2:35    Paper Session 6 (3 Reduced Parallel Sessions)
2:35-2:55    Break
2:55-4:55    Symposia 3/Paper Session 7 (2 symposia + 1 paper session)
4:55-5:25    Closing Comments
6:30-9:30    Banquet and Awards Presentation

CONFERENCE SCHEDULE: PRECONFERENCE WORKSHOPS

MONDAY, JULY 1

8:30-9:00   Workshop Registration
            Location: Workshop Rooms, Seoul National University (SNU)

9:00-12:00  Morning Sessions (4 parallel workshops)

Workshop 1 – Speaking: Rating Scales and Raters in Speaking Assessment
Gad S. Lim, Cambridge English Language Assessment
Evelina D. Galaczi, Cambridge English Language Assessment
Location: Room 309, Shinyang Humanities Hall

Workshop 2 – Writing: Defining and Assessing Writing Ability: Best Practices and Current Issues in Classroom and Large-Scale Writing Assessment
Sara Cushing Weigle, Georgia State University
Lynda Taylor, University of Bedfordshire
Location: Room 302, Shinyang Humanities Hall

Workshop 3 – CDA: Cognitive Diagnostic Assessment: Modeling and Application
Eunice Eunhee Jang, University of Toronto
Jimmy de la Torre, Rutgers University
Location: Room 115, Building 3

Workshop 5 – ELP Tests: Developing English Language Proficiency Tests: An Integrated Framework for Understanding the Steps
Dorry M. Kenyon, Center for Applied Linguistics
Location: Room 301, Shinyang Humanities Hall

12:00-1:00  Lunch Break (Shinyang Humanities Hall & Building 3)

1:00-5:00   Afternoon Sessions (4 parallel workshops)
Workshops 1, 2, 3, and 5 continue in the same rooms as the morning sessions.

6:00-8:00   Reception for Workshop Participants
            Location: Conference Hall, SNU Faculty Club

TUESDAY, JULY 2

8:30-9:00   Workshop Registration
            Location: Workshop Rooms, Seoul National University (SNU)

9:00-12:00  Morning Sessions (3 parallel workshops)

Workshop 2 – Writing: Defining and Assessing Writing Ability: Best Practices and Current Issues in Classroom and Large-Scale Writing Assessment
Sara Cushing Weigle, Georgia State University
Lynda Taylor, University of Bedfordshire
Location: Room 302, Shinyang Humanities Hall

Workshop 3 – CDA: Cognitive Diagnostic Assessment: Modeling and Application
Eunice Eunhee Jang, University of Toronto
Jimmy de la Torre, Rutgers University
Location: Room 115, Building 3

Workshop 4 – EQSIRT: Item Response Theory Modeling with EQSIRT
Peter M. Bentler, UCLA
Eric Wu, Multivariate Software Inc.
Location: Room 316, CTL Building

12:00-1:00  Lunch Break (Shinyang Humanities Hall, Building 3 & CTL Building)

1:00-5:00   Afternoon Sessions (3 parallel workshops)
Workshops 2, 3, and 4 continue in the same rooms as the morning sessions.

12:00-5:00  ILTA Executive Board Session
            Location: Fraser Place, Conference Room 2 (2nd Floor)

5:10-6:00   LTRC Newcomers Session
            Location: Meeting Room, Olive Tower (22nd Floor)

6:30-8:00   Opening Reception (sponsored by KICE)
            Location: Reception Hall Ofelis, Olive Tower (21st Floor)

CONFERENCE SCHEDULE: MAIN CONFERENCE

WEDNESDAY, JULY 3

8:15-5:30   Conference Registration (Foyer)
8:30-5:30   Exhibition (KCCI)

9:00-9:30   OPENING CEREMONY (Grand Hall)
Moderator: Young Shik Lee, Hannam University
Welcoming Address: Yong-Won Lee, Co-Chair, LTRC 2013 Organizing Committee
Opening Address: Dan Douglas, ILTA President
Congratulatory Speeches: Seung-il Na, Vice Minister of Education; Chang-Ku Byun, Executive Vice President and Provost of Seoul National University; Taeje Seong, President of Korea Institute for Curriculum and Evaluation

9:30-9:45   GROUP PHOTO
9:45-9:55   Remembering John Trim

SAMUEL J. MESSICK MEMORIAL LECTURE (Grand Hall)
Session Chair: Antony Kunnan, California State University, Los Angeles / Nanyang Technological University
9:55-10:00   Introduction
10:00-10:40  Lecture: Establishing a Validation Framework for Classroom Assessment
             Terry Ackerman, University of North Carolina Greensboro
10:40-10:50  Q&A
10:50-10:55  Presentation of the Samuel J. Messick Memorial Lecture Award (by Xiaoming Xi, Educational Testing Service)

10:55-11:10  BREAK

11:10-12:50  PAPER SESSION 1 (2 PARALLEL)

Session Theme: Diagnostic Language Assessment (Grand Hall)
Session Chair: Alan Urmston, Hong Kong Polytechnic University
11:10-11:40  How do language teachers diagnose reading and writing in a second or foreign language?
             Lea Nieminen, University of Jyvaskyla; Eeva-Leena Haapakangas, University of Jyvaskyla; J. Charles Alderson, Lancaster University; Riikka Ullakonoja, University of Jyvaskyla; Ari Huhta, University of Jyvaskyla
11:45-12:15  Implementing CDA in an Institutional Test: A New Networking Model and Experiment with a New CDM and Task Type
             Yeon-Sook Yi, Seoul National University
12:20-12:50  An MFRM approach to subskill divisibility in diagnostic language assessment
             Hongwen Cai, Guangdong University of Foreign Studies

Session Theme: Diagnostic Language Assessment & Psychometrics (Seminar Room)
Session Chair: Craig Deville, Measurement Incorporated
11:10-11:40  Comparison of halo detection methods in language proficiency ratings
             Matthew J. Borneman, SWA Consulting Inc.; Eric A. Surface, SWA Consulting Inc.
11:45-12:15  Finite mixture models for extracting learner proficiency profiles based on analytically scored performance assessments
             Ikkyu Choi, University of California, Los Angeles
12:20-12:50  Multiple regression formula or correction formulas: a better use for pseudoword false alarm data?
             Raymond Stubbe, Kyushu Sangyo University

12:50-2:20   LUNCH BREAK
12:50-2:20   Language Assessment Quarterly Editorial Board Meeting (Seminar Room)

2:20-3:20   POSTER SESSION (Grand Hall & Foyer)
Session Chairs: Hanki Jung, Korea Army Academy at Yeong-Cheon; Aejin Kang, Sookmyung Women's University

A validity argument for the use of scores from a web-search-permitted integrated writing test
  Heesung Grace Jun, Iowa State University

Developing theory-based diagnostic tests of grammar: application of Processability Theory
  Ros Hirch, Seoul National University

MERLIN – Illustrating and researching the CEFR levels with a multilingual online platform
  Katrin Wisniewski, Technical University of Dresden

Effectiveness of a diagnostic e-report for independent learning
  Michelle Raquel, The Hong Kong Polytechnic University; Carrie Tsang, The Hong Kong Polytechnic University; Roxanne Wong, City University of Hong Kong; Winnie Shum, Lingnan University; Gwendoline Guan, City University of Hong Kong

The eye tracker study of text comprehension and vocabulary knowledge
  Jungok Bae, Sungmook Choi, Minho Lee, Young-Min Jang, Sangwook Kim, Moon-Jung Jang, Kyungpook National University

The validation process on the Brazilian English Proficiency Exam for Air Traffic Controllers
  Natalia de Andrade Raymundo, ICEA - Brazilian Air Force; Natália de Castro Marques, ICEA - Brazilian Air Force

Complexity comparisons between examination passages and real-world reading texts at 2 CEFR levels
  Daniel J. Reed, Aaron Ohlrogge, Hyojung Lim, Heekyoung Kim, Michigan State University

Formative and diagnostic language assessment within a plurilingual approach to language learning in Swiss schools
  Peter Lenz, University of Fribourg

Development of diagnostic tools assessing English teachers' performance
  Sangha Lee, Bokyung Cho, KyungAe Jin, Korea Institute for Curriculum and Evaluation

Teachers' use of English language proficiency descriptor scales in the Ontario educational context: Supporting formative purposes of language assessment
  Saskia Stille, June Starke, Eunice Jang, Maryam Wagner, Maggie Dunlop, OISE/University of Toronto

Investigating the potential use of learning analytics and data mining in advancing research on language tests
  Samira ElAtia, The University of Alberta

Establishing cut scores through technological resources on the general version of the Listening and Reading Tests of the Canadian English Language Proficiency Index Program (CELPIP-G)
  Angel Arias, University of Montreal; Amery Wu, University of British Columbia

Introduction to the NEAT Online Rater Training Program for Speaking in South Korea
  Mee-Jee Kim, Chae Kwan Jung, Korea Institute for Curriculum and Evaluation

Development and validation of two oral instruments for measuring interlanguage pragmatic competence
  Rui Xu, Guangdong University of Foreign Studies & Jinggangshan University

Factor structure and factorial invariance of an institutional English Placement Test (EPT)
  Zhi Li, Hyejin Yang, Jooyoung Lee, Iowa State University

Impact of English speaking test preparation on Korean test-takers' identity
  Nahee Kim, University of Leicester

Charting the landscape: Assessing ELLs' academic achievement through classroom content tests
  Beth Clark-Gareca, New York University

Placement decision of ESL students based on holistic scores of Automated Writing Evaluation (AWE)
  Hyejin Yang, Zhi Li, Stephanie Link, Volker Hegelheimer, Iowa State University

Developing a general English proficiency test for NATO military training and cooperation purposes
  May Tan, Canadian Defence Academy

Investigating the effectiveness and impact of three- versus four-option multiple-choice in listening test items using qualitative and quantitative methods
  Lauren Kennedy, Jacquelin Church, Second Language Testing, Inc.

Investigating the validity of the reading items in the Internet-based National English Ability Test
  Younghyo Park, Su Yon Yim, Jun-Shik Kim, Bokyung Cho, Suh Keong Kwon, Korea Institute for Curriculum and Evaluation (KICE)

The effects of different types of feedback on Japanese EFL learners' TOEFL iBT reading practice test performance
  Yasuyo Sawaki, Waseda University

The relationship between TOEFL iBT speaking scores and oral ability in an academic EFL environment
  Gary J. Ockey, Educational Testing Service; Eric Setoguchi, Kanda University of International Studies; Dennis Koyama, Purdue University; Angela Sun, Kanda University of International Studies

A meta-analysis of generalizability studies on task and rater effects in L2 speaking and writing
  Yo In'nami, Shibaura Institute of Technology; Rie Koizumi, Juntendo University

Validating the ACTFL Listening Proficiency Test
  Erwin Tschirner, Olaf Bärenfänger, University of Leipzig

Extrapolating test performance to non-test setting: The use of SEM for model testing
  Kadeessa Abdul-Kadir, Public Service Department

Qualitative change in junior high school students' peer assessments on speaking performances
  Hidetoshi Saito, Ibaraki University

The Effect of Students' Background Characteristics on English Performance in the NAEA
  Young-Ju Lee, Korea Institute for Curriculum and Evaluation

Test takers' strategy use and reading test performance: A multiple-sample study
  Limei Zhang, Christine Goh, Antony Kunnan, Nanyang Technological University

Monitoring the effects of exam renewal: examining formulaic phrases in two speaking test formats
  Fabiana MacMillan, Barb Dobson, Jayanti Banerjee, CaMLA

Investigating the impact of different misfit criteria on item parameter estimation using the Rasch model
  Keita Nakamura, Eiken Foundation of Japan

Functionality of a contingency-table approach for differential item functioning
  Seon-Hi Shin, Seoul National University

Using bifactor-MIRT composite scores as valid indicators of ESP reading test performance
  Yuyang Cai, The University of Hong Kong

How impressionistic can holistic scoring be? A test-taker perspective
  Qin Xie, Hong Kong Institute of Education

Comparative diagnostic assessment of foreign language vocabulary: Differences in English and German learners' word knowledge
  Tibor Vigh, Olga S. Hrebik, Istvan Thekes, Tibor Vidakovich, University of Szeged

Assessing the impact of rater negotiation on writing and speaking test rubric scores
  Gerriet Janssen, University of Hawaii at Manoa / Universidad de los Andes-Colombia; Valerie Meier, Jonathan Trace, University of Hawaii at Manoa

Foreign language proficiency testing: Individual differences in the accuracy of self-assessment
  Daniel S. Stanhope, Jennifer Lindberg McGinnis, Eric A. Surface, SWA Consulting Inc.

Skill integration in language assessment: A comparative study of PTE-Academic and IB-CET
  Yan Jin, Xiaoyi Zhang, Shanghai Jiao Tong University

3:20-3:40   BREAK

3:40-5:40   SYMPOSIA 1 (PLENARY) (Grand Hall)
Organizers: Yong-Won Lee, Seoul National University; Eunice E. Jang, University of Toronto
Discussant: Alister Cumming, University of Toronto
Future of diagnostic language assessment: Moving beyond where we are
  Charles Alderson, Tineke Brunfaut, and Luke Harding, Lancaster University; Matthew Poehner, Pennsylvania State University; Eunice E. Jang, University of Toronto; Carol Chapelle, Elena Cotos, and Jooyoung Lee, Iowa State University

6:00-7:00   LTRC 2013 Special Session on Language Testing in Korea (Grand Hall)
  Kyung-Ae Jin and her colleagues, Korea Institute for Curriculum and Evaluation

THURSDAY, JULY 4

7:45-6:30   Conference Registration (Foyer)
8:00-6:30   Exhibition (KCCI)

8:00-9:40   PAPER SESSION 2 (3 PARALLEL)

Session Theme: Construct/Validation (Conference Hall A)
Session Chair: Yan Jin, Shanghai Jiao Tong University
8:00-8:30   Assessing English as a global language: An empirical analysis of reading proficiency in CEFR and PISA
            Johanna Möller, IPN Kiel; Michael Leucht, IPN Kiel; Hans Anand Pant, IQB Berlin; Olaf Köller, IPN Kiel
8:35-9:05   A meta-analytic investigation of the relationship between language proficiency and performance
            Eric Surface, Amanda Gissel, Matthew Borneman, SWA Consulting Inc.
9:10-9:40   Theoretical and practical issues in cross-language comparisons of proficiency: The European Survey on Language Competences
            Neil Jones, Nick Saville, Cambridge English Language Assessment

Session Theme: Speaking & Writing Assessment (Conference Hall B)
Session Chair: Sara Cushing Weigle, Georgia State University
8:00-8:30   Validation research for developing and applying the automated scoring program for the speaking section of the NEAT
            Dongkwang Shin, Hoky Min, Sang-Bok Park, Chae Kwan Jung, Hunwoo Joo, Mee-Jee Kim, Korea Institute for Curriculum and Evaluation
8:35-9:05   Students' perceptions of the effects of rubric-referenced self-assessment on EFL writing: A developmental perspective
            Weiqiang Wang, Yongqiang Zeng, Haiyan He, Guangdong University of Foreign Studies
9:10-9:40   Investigating rating processes in an EAP writing test: Insights into scoring validity
            Jessica Wu, Tung-Mei Ma, The Language Training and Testing Center

Session Theme: Performance-Based Assessment (Seminar Room)
Session Chair: Jungok Bae, Kyungpook National University
8:00-8:30   Handling sparse data in performance-based language assessment under the generalizability-theory framework
            Chih-Kai (Cary) Lin, University of Illinois at Urbana-Champaign
8:35-9:05   Writing proficiency and scoring judgment: The case of preservice EFL teachers in mainland China
            Li Liu, The Chinese University of Hong Kong; Tan Jin, Shutong Research Institute of International Language Education
9:10-9:40   Context, construct, and consequences: Washback of the College English Test in China
            Youyi Sun, Queen's University

9:40-10:00  BREAK

BRITISH COUNCIL SCHOLAR LECTURE (Conference Halls A&B)
Session Chair: Young Shik Lee, Hannam University
10:00-10:05  Introduction
10:05-10:40  Lecture: Accountability: Standards and assessment in learning systems
             Barry O'Sullivan, The British Council
10:45-10:55  Q&A

10:55-11:10  BREAK

11:10-12:50  PAPER SESSION 3 (3 PARALLEL)

Session Theme: LSP Assessment (Conference Hall A)
Session Chair: Yasuyo Sawaki, Waseda University
11:10-11:40  Expanding the construct underlying speaking assessment criteria in a specific-purpose language test
             Sally O'Hagan, John Pill, University of Melbourne; Barbara Zhang, The OET Centre
11:45-12:15  Investigating test takers' processes on an LSP 'skim and scan' reading task: A validation study of the Occupational English Test
             Kellie Frost, Hyejeong Kim, John Pill, Catriona Fraser, Ute Knoch, University of Melbourne
12:20-12:50  Developing a test for diplomats: Challenges, impact and accountability
             Dhiravat Bhumichitr, David Gardner, Devawongse Varopakarn Institute of Foreign Affairs; Rita Green, TDTA

Session Theme: Speaking Assessment (Conference Hall B)
Session Chair: Lorena Llosa, New York University
11:10-11:40  Do test-takers' self-assessments correspond to their oral test performance?
             Yujie Jia, University of California, Los Angeles
11:45-12:15  Embedding nonverbal delivery into speaking assessment: A mixed-method rating scale validation
             Mingwei Pan, The Hong Kong Polytechnic University
12:20-12:50  Modeling speaker proficiency, comprehensibility, and perceived competence in a language use domain
             Jonathan Schmidgall, University of California, Los Angeles

Theme Session: Washback (Seminar Room)
Session Chair: John De Jong, VU Amsterdam University / Pearson
11:10-11:40  Interface between language assessment and teaching/learning: Teachers' grading decision-making
             Liying Cheng, Youyi Sun, Queen's University
11:45-12:15  The use of external standardised assessment in a school context for motivational and accountability purposes
             Angeliki Salamoura, Coreen Docherty, Miranda Hamilton, Neil Jones, Cambridge English Language Assessment
12:20-12:50  A longitudinal case study of the washback of the National College English Test on teachers' teaching processes and behaviours: classroom observation
             Xiangdong Gu, Chongqing University; Zhiqiang Yang, Chongqing University of Science and Technology

12:50-2:50   LUNCH BREAK
12:50-2:50   ILTA Annual Business Meeting (Conference Hall A)

2:50-4:50   SYMPOSIA 2 (3 PARALLEL)

The challenges and issues in developing English language tests in the Asian EFL context (Conference Hall A)
Organizers: Young Shik Lee, Hannam University; Sang-Bok Park, Korea Institute for Curriculum and Evaluation
Discussant: Lyle Bachman, University of California, Los Angeles
  Kyung-Ae Jin, Korea Institute for Curriculum and Evaluation; Yan Jin, Shanghai Jiao Tong University; Michael Todd Fouts, Eiken Foundation of Japan; Jessica R. W. Wu, The Language Training and Testing Center; Neil Drave, Hong Kong Examinations and Assessment Authority

World Englishes, English as a Lingua Franca and language testing: Change on the horizon? (Conference Hall B)
Organizers: Luke Harding, Lancaster University; Huei-Lien Hsu, Fu-Jen Catholic University
Discussant: Lynda Taylor, University of Bedfordshire
  Huei-Lien Hsu, Fu-Jen Catholic University; Luke Harding, Lancaster University; Hyejeong Kim, The University of Melbourne

Diagnosing reading in a second or foreign language: insights from a multi-method study of two different languages (Seminar Room)
Organizer: Ari Huhta, University of Jyvaskyla
Discussant: Eunice E. Jang, University of Toronto
  Ari Huhta, Lea Nieminen, Riikka Ullakonoja, Eeva-Leena Haapakangas, University of Jyvaskyla; Charles Alderson, Lancaster University

4:50-5:10   BREAK

5:10-6:50   PAPER SESSION 4 (3 PARALLEL SESSIONS)

Session Theme: Statistics & Psychometrics for Language Testers (Conference Hall A)
Session Chair: Seock-Ho Kim, The University of Georgia
5:10-5:40   Optimizing raw score usage to reduce measurement error
            John De Jong, VU Amsterdam University / Pearson; Ying Zheng, Pearson
5:45-6:15   Item-analysis methods and their implications for the ILTA Guidelines for Practice
            David Ellis, University of Maryland
6:20-6:50   Facilitating communication to stakeholders using statistical graphics
            Sung-Ock Sohn, Ikkyu Choi, University of California, Los Angeles

Session Theme: Speaking Assessment (Conference Hall B)
Session Chair: Alistair van Moere, Pearson
5:10-5:40   Rating scale development and use: The rater perspective
            Evelina Galaczi, Gad Lim, Nahal Khabbazbashi, Cambridge English Language Assessment
5:45-6:15   Developing and validating an automated speaking test for elementary students
            Young Shik Lee, Hannam University; Jungtae Kim, Pai Chai University; Hyun-Ju Kim, Dankook University; Taeyoung Jeong, Korea Military Academy
6:20-6:50   The role of the native speaker in aviation communication
            Carol Lynn Moder, Gene B. Halleck, Oklahoma State University

Session Theme: Non-English Language Assessment (Seminar Room)
Session Chair: Yo In'nami, Shibaura Institute of Technology
5:10-5:40   The usefulness of accreditation-mandated assessment in college foreign language programs
            John Davis, University of Hawaii at Manoa
5:45-6:15   Cantonese-speaking Chinese heritage learners' response patterns at a placement test
            Wei-Li Hsu, University of Hawaii at Manoa
6:20-6:50   Morpheme-by-morpheme Rasch analysis of a Korean C-test as a diagnostic tool
            Hyunah Ahn, University of Hawaii at Manoa

FRIDAY, JULY 5

8:15-5:25   Conference Registration (Foyer)
8:30-5:25   Exhibition (KCCI)

8:30-10:10  PAPER SESSION 5 (3 PARALLEL SESSIONS)

Session Theme: Validity, Accountability, & Validation (Conference Hall A)
Session Chair: Dan Douglas, Iowa State University
8:30-9:00   What is argument-based validation?
            Carol Chapelle, Hye-Won Lee, Iowa State University
9:05-9:35   Validity and accountability: IELTS as a measure of language ability for medical practitioners
            Vivien Berry, Centre for Language Assessment Research, University of Roehampton; Barry O'Sullivan, The British Council; Sandra Rugea, Centre for Language Assessment Research, University of Roehampton
9:40-10:10  Language ability of young English language learners: Definition, configuration, and implications
            Lin Gu, Educational Testing Service

Session Theme: Non-English Language Assessment & Longitudinal Analysis (Conference Hall B)
Session Chair: Toshihiko Shiotsu, Kurume University
8:30-9:00   Examining testlet effects in the TestDaF Reading Section: A testlet response modeling approach
            Thomas Eckes, TestDaF Institute
9:05-9:35   Development of diagnostic Japanese vocabulary assessment for non-native speakers' learning
            Yuan Sun, National Institute of Informatics; Hiroko Yabe, Megumi Shimada, Tokyo Gakugei University
9:40-10:10  Longitudinal and cross-sectional investigation into the development of speaking ability at Japanese schools
            Rie Koizumi, Juntendo University; Yo In'nami, Shibaura Institute of Technology

Session Theme: Assessment of Pragmatics & Strategic Competence (Seminar Room)
Session Chair: Liying Cheng, Queen's University
8:30-9:00   Validating task-based assessment of L2 pragmatics in interaction using mixed methods
            Soo Jung Youn, University of Hawaii at Manoa
9:05-9:35   An investigation into the nature of strategic competence through test-takers' lexico-grammatical test performance
            Nick Bi, University of Sydney
9:40-10:10  Testing implicature under operational conditions
            Carsten Roever, The University of Melbourne

10:10-10:30  BREAK

10:30-12:00  WORKS-IN-PROGRESS SESSION (Conference Halls A&B)

Session Chairs: Neil Jones, Cambridge English Language Assessment; Xiaoming Xi, Educational Testing Service

L2 collocational proficiency: Expanding the construct of speaking proficiency in automated speech scoring
  Sumi Han, Northern Arizona University

Investigating the assessment literacy of standardised language test score users in Canadian higher education
  Beverly Baker, Rika Tsushima, Shujiao Wang, Mariusz Galczynski, Sarah Desroches, McGill University

The development of second language writing proficiency: A linguistic analysis
  Yeon Joo Jung, Indiana University

Assessing writing: How do raters deal with aspects not covered in rating scales?
  Yi Mei, Queen's University

Investigating growth in paragraph writing skills of tertiary students
  Vahid Aryadoust, Siew Mei Wu, National University of Singapore

Exploring the criterial features of spoken performances on the same tasks at A2, B1, and B2
  Chihiro Inoue, Asahikawa Medical University

Computerized speaking tasks for assessing young English language learners
  Dorry M. Kenyon, Margaret E. Malone, Megan Montee, Anne Donovan, Center for Applied Linguistics

An investigation into the comparability of students' writing performance on the TOEFL iBT and in university writing courses
  Lorena Llosa, New York University; Margaret Malone, Center for Applied Linguistics; Jing Wei, New York University

Investigating prompt difficulty in automatically scored speaking performance assessments
  Troy Cox, Brigham Young University

Content validation study of a computerized business English test: developing a language framework
  Youngshin Chi, University of Illinois at Urbana-Champaign

The role of diagnostic assessment in academic support
  Jawee Perla, American University

Reconciliation between assessment for learning and assessment of learning in Chinese award-winning teachers' EFL classrooms
  Jiming Zhou, The University of Hong Kong

Listening subskills and metacognitive strategies in a diagnostic English language assessment
  Jeremy Gray, Lingnan University; Wai Lan Winnie Shum, Lingnan University; Yuanyuan Gwendoline Guan, City University of Hong Kong

Automatic writing assessment and feedback: An approach to improve construct and consequential validity
  Helen Yannakoudakis, University of Cambridge ESOL Examinations; Gad Lim, University of Cambridge ESOL Examinations; Øistein Andersen, iLexIR Ltd; Ted Briscoe, University of Cambridge Computer Laboratory; Fiona Barker, University of Cambridge ESOL Examinations

Preparing for the writing tasks of Graduate School Entrance English Examination: Stakeholders' practice as a response to test task demands
  Shasha Xu, Zhejiang University

Bridging assessment for learning to self-regulation in Chinese tertiary EFL writing classrooms
  Yongfei Wu, Queen's University

Diagnosing EFL writing difficulties in the Chinese context
  Cecilia Guanfang Zhao, Shanghai International Studies University

Innovative assessment tasks for academic English proficiency: An integrated listening-speaking task vs. a multimedia-mediated speaking task
  Hye Won Lee, Iowa State University

DIF investigations with different test types on Pearson Test of English Academic
  Xiaomei Song, Queen's University; Ying Zheng, Pearson

Optimizing the diagnostic utility of automated essay scoring
  Brent Bridgeman, Chaitanya Ramineni, Educational Testing Service

12:00-1:30  LUNCH BREAK

1:30-2:35   PAPER SESSION 6 (3 REDUCED PARALLEL SESSIONS)

Session Theme: Reading/Writing (Conference Hall A)
Session Chair: David Qian, Hong Kong Polytechnic University
1:30-2:00   Investigating Korean EFL College Students' Perception, Strategy Use and Task Performance on an English Integrated Reading-Writing Task
            Yoonhee Choe, Chongshin University
2:05-2:35   Relative significance of component skills to EFL reading: Implications for diagnosis and instruction
            Min Gui, Wuhan University (to be confirmed)

Session Theme: Disability Accommodation and Technology Issues (Conference Hall B)
Session Chair: Ute Knoch, University of Melbourne
1:30-2:00   Assessing students with disabilities: Listening to voices from the stakeholder community
            Lynda Taylor, Hanan Khalifa, University of Cambridge ESOL Examinations
2:05-2:35   Developing technology-enhanced language assessment tasks: Issues to consider
            Mikyung Kim Wolf, Alexis Lopez, Educational Testing Service

Session Theme: Integrated Tasks in Language Assessment (Seminar Room)
Session Chair: Gad S. Lim, Cambridge English Language Assessment
1:30-2:00   Impact of the length of stimulus materials on TOEFL® Junior Comprehensive Integrated Task Performance
            Youngsoon So, Educational Testing Service
2:05-2:35   Rhetorics and realities in preparing intensively for TOEFL iBT speaking test: Teacher and test taker perspectives
            Guoxing Yu, University of Bristol

2:35-2:55   BREAK

2:55-4:55   SYMPOSIA 3 AND PAPER SESSION 7 (2 symposia + 1 paper session)

Exploring the diagnostic potential of post-admission language assessments in English-medium universities (Conference Hall A)
Organizer: John Read, University of Auckland
  Ute Knoch and Cathie Elder, University of Melbourne; John Read, Janet von Randow, and Eleanor Clemeau, University of Auckland; Janna Fox, John Haggerty, and Zinat Goodarzi, Carleton University; Alan Urmston, Michelle Raquel, and Carrie Tsang, Hong Kong Polytechnic University

Broadening language assessment horizons: From large-scale accountability to diagnostic purposes (Conference Hall B)
Organizer: Carsten Wilmes, WIDA Consortium
Discussant: Lyle Bachman, University of California, Los Angeles
  Timothy Boals, Dorry Kenyon, Margo Gottlieb, Carsten Wilmes, Elizabeth Cranley, WIDA Consortium

Paper Session 7 (Seminar Room)
Session Theme: Automated Essay & Speech Scoring
Session Chair: Seon-Hi Shin, Seoul National University
2:55-3:25   Can you "game the system" by responding off-topic in automatically scored speaking tests?
            Jian Cheng, Alistair Van Moere, Masanori Suzuki, Pearson
3:30-4:00   Factor structure of a spoken Chinese test: investigating five subskill scores for diagnosis
            Masanori Suzuki, Yujie Jia, Pearson
4:05-4:35   Developing an automated essay scoring system for the NEAT
            Doyoung Park, Kija Si, Yongsang Lee, Sangwook Park, Eunyoung Lim, Seulki Koo, Hwangkyu Lim, Korea Institute for Curriculum and Evaluation

4:55-5:25   Closing Comments (Conference Hall A)

6:30-9:30   Banquet and Awards Presentation (Sejong Hall; bus from the hotel)

* Note: The current version of the conference program will go through several rounds of revision and editing before it is printed in June for distribution. If you have any suggestions for corrections and changes, please contact Dr. Sang Bok Park at: [email protected]
