Anyone Unhappy with the Level Test Used at their School?

Posted: Sat Nov 10, 2007 2:22 pm
by bsjess
Hi Everyone,
I'm currently teaching in an adult language school in Paris, and I was shocked to see the quality of their supposed "level" test. Incoming students pay for this test which is littered with typos, silly oral questions, and multiple choice questions with more than one correct answer! :x :x :x :evil: :evil: :evil:

I plan on seeing the director about it, but first...

Anyone have the same problem?
What are your feelings about the "level," or placement test used in your school?

Would love to hear from you.

Yes!

Posted: Mon Jun 09, 2008 8:13 pm
by Eric18
Everybody has this problem. Perhaps your poll should have asked "Is anybody happy with the accuracy of their school's placement test?"

Seriously, different placement tests produce different results and different problems. At least from the half-dozen I've seen used, none actually delivers what it promises. But neither does the SAT, the TOEFL!

Standardized exams: you can't live with or without them!

Assessment tests

Posted: Mon Nov 29, 2010 10:53 am
by andrel
Hi there,
I am teaching in Korea, and the test that determines our students' levels is, I think, clearly unsatisfactory--mostly because it is run by Koreans, for Koreans. Foreign teachers have no say in the matter at all. Here we tread the fine line between business and educational institute. Unfortunately, the leaning tends to be toward the former.

Posted: Wed Sep 21, 2011 9:41 am
by longshikong
Canada has a good oral and written 'placement test' for immigration purposes... but that's not a private school. I'm guessing that even DELTA and CELTA programs only gloss over assessment, so it really depends on the DOS or Ed Dept.--a shame really, because assessment is such an important tool for teachers, students and even marketing.

I recommended recording placement interviews 10 years ago as a means of tracking adult progress over time, but even today, few private schools keep anything other than student contact info.

Re: Yes!

Posted: Wed Sep 21, 2011 10:18 am
by longshikong
Eric18 wrote:Everybody has this problem. Perhaps your poll should have asked "Is anybody happy with the accuracy of their school's placement test?"

Seriously, different placement tests produce different results and different problems. At least from the half-dozen I've seen used, none actually delivers what it promises. But neither does the SAT, the TOEFL!

Standardized exams: you can't live with or without them!
The problem you describe originates in coursebook authors' ignorance. Many coursebooks claim in their prefaces to adopt the communicative approach but clearly do not. The management of schools that adopt such books simply base their curriculum and placement tests on them, unaware of such discrepancies. And if not much thought went into the placement test, that doesn't bode well for the rest of the program either.

Posted: Thu Oct 06, 2011 12:34 pm
by Maestra
IMHO, it depends on the location of the school. Of course, schools in non-native-English-speaking countries will have lower standards, but the tests should not contain typos or grammatical errors. In some cases, the questions may be literal translations from the local language, which can be confusing for us but perhaps more understandable to the students. :lol:

Posted: Thu Oct 06, 2011 2:08 pm
by longshikong
I've told college students to demand bonus marks for locating errors in their exams. In China, college English tests have at least one mistake per page. Often, a question has no correct answer, or more than one.

Posted: Wed Jan 18, 2012 1:41 pm
by longshikong
Has it ever struck you that most initial placement (level) tests are actually (mid-term or final) achievement tests--they provide limited information about the overall strengths and weaknesses of prospective students. They just leave students feeling inadequate for not being able to respond to a single question from a coursebook they've never studied.

As for course achievement tests, here's something to ponder regarding content- vs objective-based testing:
The alternative approach is to base the test content directly on the objectives of the course. This has a number of advantages. First, it compels course designers to be explicit about objectives. Secondly, it makes it possible for performance on the test to show just how far students have achieved those objectives. This in turn puts pressure on those responsible for the syllabus and for the selection of books and materials to ensure that these are consistent with the course objectives.

Tests based on objectives work against the perpetuation of poor teaching practice, something which course-content-based tests, almost as if part of a conspiracy, fail to do. It is my belief that to base test content on course objectives is much to be preferred: it will provide more accurate information about individual and group achievement, and it is likely to promote a more beneficial backwash effect on teaching.

From Testing for Language Teachers by Arthur Hughes, Cambridge University Press, © 1989

Currently I'm teaching from a highly content-driven syllabus. The courses I started teaching two months ago expect children (prompted by flashcards) to be able to ask and answer sets of limited-response questions by the end of the course: Have you been overseas? Do you like oranges? etc. The expectation is that through continual drilling, children will eventually pick up the vocab and questions, but I'd rather test their understanding by their ability to make their own questions and answers from a wider vocabulary than the one given. So basically, my colleague and I are having to redefine the objectives of our courses to meet the needs of our students.