Assessment and Teaching of 21st Century Skills


334 L. Darling-Hammond

must be used to evaluate students, in combination with the other evidence teachers assemble from the classroom. In other years, the use of the tasks is optional. As described by the QCA: "The tasks are designed to support teacher assessment. They can be used to indicate what pupils are able to do and inform future learning and teaching strategies. Individual tasks can be used to provide a basis for discussion by teachers and pupils on what has been achieved and to identify the next steps. They can support day-to-day assessment and generate outcomes which can contribute to the breadth of evidence which is used as the basis for periodic and transitional assessment."

At key stage 4, ages 15–16, the national qualification framework includes multiple pathways for students and consequently multiple measures of student achievement. There are four pathways based on students' aspirations after graduation: apprenticeship, diploma, the General Certificate of Secondary Education (GCSE), and the A-level examinations. Some students go on to a Further Education college to take vocationally related courses. They usually take the National Vocational Qualification using the apprenticeship model. Most students take the GCSE, a 2-year course of study evaluated by assessments both within and at the end of courses or units. Students may take as many single-subject or combined-subject assessments as they like, and they choose which ones to take on the basis of their interests and areas of expertise. The exams involve constructed-response items and structured, extended classroom-based tasks, which comprise from 25% to 60% of the final examination score. England is currently piloting new tasks for the GCSE with an increased emphasis on functional skills like problem-solving, team building, and communication as well as personal learning and thinking skills across subjects.
These new tasks, called "controlled assessments," are either designed by the awarding body and marked by teachers or designed by teachers and marked by the awarding body. Either way, teachers determine the timing of controlled assessments. These classroom-based assessments comprise 25% of the total examination score in subjects like business studies, classical civilization, English literature, geography, history, humanities, or statistics, and 60% of the total examination score in subject areas such as applied business, music and dance, design and technology, drama, engineering, English, English language, expressive arts, health and social care, home economics, ICT, manufacturing, media studies, and modern foreign languages. Examples of classroom-based tasks in English are given in Table 6.2 and in interactive computer technology (ICT) in Fig. 6.9.

During key stage 4, most students take five or more GCSE exams. Their performance determines the level of the diploma they receive, and whether they will go on to Advanced Studies that are later evaluated by A-level exams that qualify students for university admissions. England has 45 areas for A-level exams. The exam questions require extended answers aimed at assessing deeper levels of understanding and applications of knowledge to real-world problems, as illustrated in the example in Fig. 6.10. Most of the exams take the form of essay questions. The mathematics exams include questions that ask students to show their reasoning behind their answers.

6 Policy Frameworks for New Assessments 335

Table 6.2 Classroom-based assessment tasks, English GCSE

Reading literary texts, controlled assessment (coursework), 40 marks: Responses to three texts from a choice of tasks and texts. Candidates must show an understanding of texts in their social, cultural, and historical context.

Imaginative writing, controlled assessment (coursework), 40 marks: Two linked continuous writing responses from a choice of Text Development or Media.

Speaking and listening, controlled assessment (coursework), 40 marks: Three activities: a drama-focused activity, a group activity, an individual extended contribution. One activity must be a real-life context in and beyond the classroom.

Information and ideas, written examination, 80 marks (40 per section): Nonfiction and media: responses to unseen authentic passages. Writing information and ideas: one continuous writing response, chosen from two options.

A city council attempted to reduce traffic congestion by introducing a congestion charge. The charge was set at 4 pounds for the first year and was then increased by 2 pounds each year. For each of the first eight years, the council recorded the average number of vehicles entering the city center per day. The results are shown in the table:

Charge (pounds), x: 4, 6, 8, 10, 12, 14, 16, 18
Average number of vehicles per day (millions), y: 2.4, 2.5, 2.2, 2.3, 2.2, 1.8, 1.7, 1.5

1. Calculate the product moment correlation coefficient for these data.
2. Explain why x is the independent variable.
3. Calculate the equation of the regression line of y on x.
4a. Use your equation to estimate the average number of vehicles that will enter the city center per day when the congestion charge is raised to 20 pounds.
4b. Comment on the reliability of your estimate.
5. The council wishes to estimate the congestion charge required to reduce the average number of vehicles entering the city per day to 1.0 million. Assuming that a reliable estimate can be made by extrapolation, state whether they should use the regression line of y on x or the regression line of x on y. Give a reason for your answer.

Fig. 6.10 English A-level question from a probability and statistics examination

Foreign language exams require oral presentations. The A-level exam in English literature asks students to show their skills and knowledge in four sections: poetry, drama, prose, and general, analyzing works of literature they have read as part of their curriculum in terms of their meaning and interpretation as well as literary devices and writing strategies. Coursework accounts for 25–30% of the A-level score, depending on the course. Students must now also complete an independently designed extended research project as part of the A-level assessments. Teachers mark assessments in a moderated process managed by the five examination agencies that organize sets of examinations.

Litchfield Promotions works with over 40 bands and artists to promote their music and put on performances in England. The number of bands they have on their books is gradually expanding. Litchfield Promotions needs to be sure that each performance will make enough money to cover all the staffing costs and overheads as well as make a profit. Many people need to be paid: the bands, sound engineers, and lighting technicians. There is also the cost of hiring the venue. Litchfield Promotions needs to create an ICT solution to ensure that they have all necessary information and that it is kept up to date. Their solution will show income, outgoings, and profit.

Candidates will need to:
1. Work with others to plan and carry out research to investigate how similar companies have produced a solution. The company does not necessarily have to work with bands and artists or be a promotions company.
2. Clearly record and display your findings.
3. Recommend a solution that will address the requirements of the task.
4. Produce a design brief, incorporating timescales, purpose, and target audience.

Produce a solution, ensuring that the following are addressed:
1. It can be modified to be used in a variety of situations.
2. It has a friendly user interface.
3. It is suitable for the target audience.
4. It has been fully tested.

You will need to:
1. Incorporate a range of software features, macros, modelling, and validation checks, used appropriately.
2. Obtain user feedback.
3. Identify areas that require improvement, recommending improvements, with justification.
4. Present information as an integrated document.
5. Evaluate your own and others' work.

Fig. 6.9 Controlled assessment tasks, interactive computer technology GCSE
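The arithmetic behind the probability-and-statistics A-level question above can be checked numerically. The following Python sketch is an editorial illustration, not part of the examination materials; it computes the summary statistics, the product moment correlation coefficient, and the regression line of y on x from the data in the question's table.

```python
# Data from the congestion-charge question: charge x in pounds,
# average vehicles per day y in millions, over eight years.
x = [4, 6, 8, 10, 12, 14, 16, 18]
y = [2.4, 2.5, 2.2, 2.3, 2.2, 1.8, 1.7, 1.5]

n = len(x)
mean_x, mean_y = sum(x) / n, sum(y) / n
sxx = sum((xi - mean_x) ** 2 for xi in x)                          # 168.0
syy = sum((yi - mean_y) ** 2 for yi in y)                          # 0.915
sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))   # -11.6

# Part 1: product moment correlation coefficient r = Sxy / sqrt(Sxx * Syy)
r = sxy / (sxx * syy) ** 0.5          # about -0.936: strong negative correlation

# Part 3: regression line of y on x, y = a + b*x
# (x is the independent variable because the council sets the charge)
b = sxy / sxx                         # about -0.069 million vehicles per pound
a = mean_y - b * mean_x               # about 2.835

# Part 4a: predicted traffic when the charge rises to 20 pounds
estimate = a + b * 20                 # about 1.45 million vehicles per day

print(f"r = {r:.3f}; y = {a:.3f} + ({b:.4f})x; estimate at 20 pounds = {estimate:.2f} million")
```

For part 5, one common examination convention is to keep the y-on-x line even when estimating x, because the charge is a controlled (non-random) variable; solving 1.0 = a + bx with the values above gives a charge of roughly 26.6 pounds, though the estimate involves extrapolation well beyond the observed data.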
While England has moved to include some school-based assessments in its increasingly performance-oriented assessment system, Scotland, Wales, and Northern Ireland have gone even further in revising their approaches to assessment.

Scotland

Scotland has a governing body for its educational system that is separate from that of the UK and uses a set of assessments called the Scottish Survey of Achievement, administered in the third, fifth, and seventh years of primary school, as well as standardized courses and benchmark exams in secondary school. The assessment tasks for the primary courses and general secondary courses are designed and marked by teachers and lecturers. Schools use external assessments for the intermediate and advanced secondary courses. The Scottish Qualifications Authority designs and scores those assessments, which may take the form of examinations, project work, or portfolios (Scottish Qualifications Authority 2004; The Scottish Government 2008).

Wales

Wales recently separated from the system used in England and now has its own governing body for its educational system (Archer 2006). Wales abolished national

exams for children through age 14. Much like Finland, during the primary years, Welsh schools have a national school curriculum supported by teacher-created, administered, and scored assessments. During the secondary years, teachers create and manage all assessment of 14-year-old students, while students 16 years and older are encouraged to participate in the relevant GCSE exams and A-level courses and exams administered by the U.K.'s Qualifications and Curriculum Authority (Welsh Assembly Government 2008a, b). With these changes to its assessment system, Wales hopes to increase student engagement, engage students in more creative tasks, and reduce teaching to the test (Archer 2006).

Northern Ireland

Northern Ireland is in the process of implementing an approach at all levels called "Assessment for Learning." This approach emphasizes locally developed, administered, and scored assessments and focuses on five key actions:

1. Sharing learning intentions, where students and teacher agree upon learning intentions to give them ownership over their learning.
2. Sharing and negotiating success criteria, where students and teacher together create the criteria for successful completion of a task to help with self-assessment.
3. Feedback, where teachers provide ongoing feedback during formative assessment sessions.
4. Effective questioning, where teachers introduce strategies like using open-ended questions and giving more thinking time so students will feel more confident thinking aloud and explaining their reasoning.
5. How pupils reflect on their learning, where teachers provide students with strategies to think about what they have learned.

Northern Ireland does not require schools to externally assess students up through age 14, but it provides teachers with the option to give students end of stage 3 assessments that are externally graded through the Northern Ireland Council for the Curriculum, Examinations and Assessment (CCEA).
These are largely open-ended assessments that evaluate how students reason, think, and solve problems. CCEA provides multiple assessments for stage 4, according to which pathway a student chooses to follow, including taking the GCSE exam and A-level courses and exams from the U.K. system, whether aiming toward university or a vocational degree (Council for the Curriculum, Examinations and Assessment 2008a, b).

Conclusion

A variety of challenges confront nations seeking to integrate twenty-first century skills into standards, curriculum, assessment, and teaching. An examination of assessment policies and practices in these four nations suggests a range of potential

opportunities for evaluating twenty-first century skills in both on-demand tests and curriculum-embedded assessments. The growing move to promote assessment of, for, and as learning, rather than seeing testing as a separate, disjointed element of the education enterprise, may provide opportunities for strengthening the teaching and learning of twenty-first century skills, as well as their assessment.

The growing emphasis on school-based performance assessments in many countries appears to strengthen teaching: teachers learn more deeply about how to enact standards by participating in scoring and/or reviewing student work. It may also increase curriculum equity, since all students engage in more common activities and instructional supports as part of the required assessments. Some assessment policies also seek to use assessment to strengthen teaching by considering how to provide both feedback and "feedforward" information. They incorporate rich feedback to students, teachers, and schools about what has been learned, and they shape students' future learning by offering opportunities for student and teacher reflection that supports learning to learn. Technology supports for these efforts are becoming increasingly sophisticated and should be shared across states and nations. Given the critical importance of these initiatives to the teaching and acquisition of twenty-first century skills, the ATC21S project should facilitate countries' efforts to develop optimal policy strategies that integrate school-based assessments of ambitious intellectual performances with large-scale assessments that seek to measure problem-solving, critical thinking, collaboration, and learning to learn in increasingly sophisticated ways.

References

Archer, J. (2006, December 19). Wales eliminates national exams for many students. Education Week. Retrieved on September 11, 2008, from http://www.edweek.org/ew/articles/2006/12/20/16wales.h26.html?qs=Wales
Buchberger, F., & Buchberger, I. (2004). Problem solving capacity of a teacher education system as a condition of success? An analysis of the "Finnish case." In F. Buchberger & S. Berghammer (Eds.), Education policy analysis in a comparative perspective (pp. 222–237). Linz: Trauner.
Chan, J. K., Kennedy, K. J., Yu, F. W., & Fok, P. (2008). Assessment policy in Hong Kong: Implementation issues for new forms of assessment. The Hong Kong Institute of Education. Retrieved on September 12, 2008, from http://www.iaea.info/papers.aspx?id=68
Council for the Curriculum, Examinations and Assessment. (2008a). Curriculum, key stage 3, post-primary assessment. Retrieved on September 12, 2008, from http://www.ccea.org.uk/
Council for the Curriculum, Examinations and Assessment. (2008b). Qualifications. Retrieved on September 12, 2008, from http://www.ccea.org.uk/
Dixon, Q. L. (2005). Bilingual education policy in Singapore: An analysis of its sociohistorical roots and current academic outcomes. International Journal of Bilingual Education and Bilingualism, 8(1), 25–47.
Education Bureau. (2001). Domain on learning and teaching. Hong Kong: Education Department.
European Commission. (2007/2008). Eurybase, the information database on education systems in Europe: The education system in Finland.
Finnish National Board of Education. (2007, November 12). Background for Finnish PISA success. Retrieved on September 8, 2008, from http://www.oph.fi/english/SubPage.asp?path=447,65535,77331

Finnish National Board of Education. (2008a, April 30). Teachers. Retrieved on September 11, 2008, from http://www.oph.fi/english/page.asp?path=447,4699,84383
Finnish National Board of Education. (2008b, June 10). Basic education. Retrieved on September 11, 2008, from http://www.oph.fi/english/page.asp?path=447,4699,4847
Hautamäki, J., & Kupiainen, S. (2002, May 14). The Finnish learning to learn assessment project: A concise report with key results. Prepared for the Workshop on Learning-to-Learn Assessment, Brussels. Helsinki: Centre for Educational Assessment, Helsinki University.
Hautamäki, J., Arinen, P., Eronen, S., Hautamäki, A., Kupiainen, S., Lindblom, B., Niemivirta, M., Pakaslahti, L., Rantanen, P., & Scheinin, P. (2002). Assessing learning-to-learn: A framework. Helsinki: Centre for Educational Assessment, Helsinki University, and the National Board of Education in Finland.
HKEAA. (2009). School-based assessment (SBA). Retrieved on August 10, 2011, from http://www.hkeaa.edu.hk/en/sba
Kaftandjieva, F., & Takala, S. (2002). Relating the Finnish Matriculation Examination English test results to the CEF scales. Paper presented at the Helsinki Seminar on Linking Language Examinations to the Common European Framework of Reference for Languages: Learning, Teaching, Assessment.
Kaur, B. (2005). Assessment of mathematics in Singapore schools: The present and future. Singapore: National Institute of Education.
Korpela, S. (2004, December). The Finnish school, a source of skills and well-being: A day at Stromberg Lower Comprehensive School. Retrieved on September 11, 2008, from http://virtual.finland.fi/netcomm/news/showarticle.asp?intNWSAID=30625
Laukkanen, R. (2008). Finnish strategy for high-level education for all. In N. C. Soguel & P. Jaccard (Eds.), Governance and performance of education systems. Dordrecht: Springer.
Lavonen, J. (2008). Reasons behind Finnish students' success in the PISA scientific literacy assessment. University of Helsinki, Finland. Retrieved on September 8, 2008, from http://www.oph.fi/info/finlandinpisastudies/conference2008/science_results_and_reasons.pdf
Ng, P. T. (2008). Educational reform in Singapore: From quantity to quality. Educational Research for Policy and Practice, 7, 5–15.
Qualifications and Curriculum Authority. (2009). Assessing pupils' progress: Assessment at the heart of learning. Retrieved on May 23, 2009, from http://www.qca.org.uk/libraryAssets/media/12707_Assessing_Pupils_Progress_leaflet_-_web.pdf
Queensland Government. (2001). New basics: The why, what, how and when of rich tasks. Retrieved on September 12, 2008, from http://education.qld.gov.au/corporate/newbasics/pdfs/richtasksbklet.pdf
Scottish Qualifications Authority. (2004, March). Scotland's national qualifications: Quick guide. Retrieved on September 11, 2008, from http://www.sqa.org.uk/files_ccc/NQQuickGuide.pdf
Singapore Examinations and Assessment Board. (2006). 2006 A-level examination. Singapore: Author.
Stage, E. K. (2005, Winter). Why do we need these assessments? Natural Selection: Journal of the BSCS, 11–13.
The Finnish Matriculation Examination. (2008). Retrieved on September 8, 2008, from http://www.ylioppilastutkinto.fi/en/index.html
The Scottish Government. (2008). Schools: Attainment. Retrieved on September 11, 2008, from http://www.scotland.gov.uk/Topics/Education/Schools/curriculum/Attainment
Victoria Curriculum and Assessment Authority. (2009). Planning for assessment. http://vels.vcaa.vic.edu.au/support/tla/assess_planning.html
Welsh Assembly Government. (2008a). Primary (3–11). Retrieved on September 12, 2008, from http://old.accac.org.uk/eng/content.php?cID=5
Welsh Assembly Government. (2008b). Secondary (11–16). Retrieved on September 12, 2008, from http://old.accac.org.uk/eng/content.php?cID=6



Index

P. Griffin et al. (eds.), Assessment and Teaching of 21st Century Skills, DOI 10.1007/978-94-007-2324-5, © Springer Science+Business Media B.V. 2012

