Anyone Unhappy with the Level Test Used at their School?

Forum for the discussion of assessment and testing of ESL/EFL students



Are you unhappy with your language school's test?

Poll ended at Sun Nov 18, 2007 2:22 pm

Yes: 0 votes
No: 0 votes
I'm 50/50: 0 votes
No opinion: 0 votes

Total votes: 0

bsjess
Posts: 3
Joined: Thu Nov 08, 2007 4:07 pm

Anyone Unhappy with the Level Test Used at their School?

Post by bsjess » Sat Nov 10, 2007 2:22 pm

Hi Everyone,
I'm currently teaching in an adult language school in Paris, and I was shocked to see the quality of its supposed "level" test. Incoming students pay for this test, which is littered with typos, silly oral questions, and multiple-choice questions with more than one correct answer! :x :x :x :evil: :evil: :evil:

I plan on seeing the director about it, but first...

Anyone have the same problem?
What are your feelings about the "level" or placement test used at your school?

Would love to hear from you.

Eric18
Posts: 151
Joined: Fri May 18, 2007 12:38 pm
Location: Los Angeles, California

Yes!

Post by Eric18 » Mon Jun 09, 2008 8:13 pm

Everybody has this problem. Perhaps your poll should have asked "Is anybody happy with the accuracy of their school's placement test?"

Seriously, different placement tests produce different results and different problems. Of the half-dozen I've seen used, none actually delivers what it promises. But then, neither does the SAT or the TOEFL!

Standardized exams: you can't live with or without them!

andrel
Posts: 1
Joined: Sat Jun 27, 2009 8:56 pm
Location: Ulsan, Korea

Assessment tests

Post by andrel » Mon Nov 29, 2010 10:53 am

Hi there,
I am teaching in Korea, and I think the test that determines our students' levels is clearly unsatisfactory, mostly because it is written and administered by Koreans for Koreans. Foreign teachers have no say in the matter. Here we tread the fine line between business and educational institution; unfortunately, the leaning tends to be towards the former.

longshikong
Posts: 88
Joined: Mon Oct 26, 2009 1:49 am

Post by longshikong » Wed Sep 21, 2011 9:41 am

Canada has a good oral and written 'placement test' for immigration purposes... but that's not a private school. I'm guessing that even DELTA and CELTA programs only gloss over assessment, so in practice it depends on the DOS or the Ed Dept.--a shame, really, because assessment is such an important tool for teachers, students and even marketing.

I recommended recording placement interviews 10 years ago as a way of tracking adult learners' progress over time, but even today few private schools keep anything other than student contact info.

longshikong
Posts: 88
Joined: Mon Oct 26, 2009 1:49 am

Re: Yes!

Post by longshikong » Wed Sep 21, 2011 10:18 am

Eric18 wrote:Everybody has this problem. Perhaps your poll should have asked "Is anybody happy with the accuracy of their school's placement test?"

Seriously, different placement tests produce different results and different problems. Of the half-dozen I've seen used, none actually delivers what it promises. But then, neither does the SAT or the TOEFL!

Standardized exams: you can't live with or without them!
The problem you describe originates in coursebook authors' ignorance. Many coursebooks claim in their preface to adopt the communicative approach but clearly do not. The management of schools that adopt such books simply base their curriculum and placement tests on them, unaware of such discrepancies. If not much thought went into the placement test, it doesn't sound promising for the rest of the program either.

Maestra
Posts: 2
Joined: Mon Oct 03, 2011 12:20 pm
Location: United States of America

Post by Maestra » Thu Oct 06, 2011 12:34 pm

IMHO, it depends on the location of the school. Of course, schools in non-native English-speaking countries will have lower standards, but the tests still should not contain typos or grammatical errors. In some cases the questions may be literal translations from the local language, which can be confusing for us but perhaps more understandable to the students. :lol:

longshikong
Posts: 88
Joined: Mon Oct 26, 2009 1:49 am

Post by longshikong » Thu Oct 06, 2011 2:08 pm

I've told college students to demand bonus marks for locating errors in their exams. In China, college English tests have at least one mistake per page. Often there's no correct answer, or more than one.

longshikong
Posts: 88
Joined: Mon Oct 26, 2009 1:49 am

Post by longshikong » Wed Jan 18, 2012 1:41 pm

Has it ever struck you that most initial placement (level) tests are actually (mid- or final-course) achievement tests? They provide limited information about the overall strengths and weaknesses of prospective students, and they just leave students feeling inadequate for not being able to answer a single question from a coursebook they've never studied.

As for course achievement tests, here's something to ponder regarding content- vs objective-based testing:
The alternative approach is to base the test content directly on the objectives of the course. This has a number of advantages. First, it compels course designers to be explicit about objectives. Secondly, it makes it possible for performance on the test to show just how far students have achieved those objectives. This in turn puts pressure on those responsible for the syllabus and for the selection of books and materials to ensure that these are consistent with the course objectives.

Tests based on objectives work against the perpetuation of poor teaching practice, something which course-content-based tests, almost as if part of a conspiracy, fail to do. It is my belief that to base test content on course objectives is much to be preferred: it will provide more accurate information about individual and group achievement, and it is likely to promote a more beneficial backwash effect on teaching.

From Testing for Language Teachers, Arthur Hughes, Cambridge University Press, © 1989

Currently I'm teaching from a highly content-driven syllabus. The courses I started teaching two months ago expect children, prompted by flashcards, to be able to ask and answer sets of limited-response questions by the end of the course: Have you been overseas? Do you like oranges? etc. The expectation is that through continual drilling the children will eventually pick up the vocab and the questions, but I'd rather test their understanding by their ability to form their own questions and answers from a wider vocabulary than the one given. So basically, my colleague and I are having to redefine the objectives of our courses to meet the needs of our students.
