s_allard Triglot Senior Member Canada Joined 5443 days ago 2704 posts - 5425 votes Speaks: French*, English, Spanish Studies: Polish
| Message 305 of 319 11 May 2014 at 11:41pm | IP Logged |
I think that Iversen's efforts at measuring his actual English vocabulary demonstrate again what we have alluded to here, i.e. that demonstrated vocabulary size increases with sample size. This is not unexpected. A variant of this is the so-called dictionary effect, which describes how using a bigger dictionary tends to inflate estimated vocabulary sizes.
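(For anyone who wants to see this effect in numbers, here is a minimal simulation, not taken from either poster and purely illustrative: if word frequencies follow a roughly Zipfian distribution over an assumed active vocabulary of 20,000 words, the number of distinct words that actually appear keeps climbing as the sample grows, which is why a short speech sample surfaces only a small slice of what a speaker could use. The vocabulary size and the weights are assumptions for the sketch, not measured figures.)

```python
# Illustrative sketch only: how many distinct words show up in samples of
# different sizes, assuming a Zipf-like frequency distribution over a
# hypothetical 20,000-word active vocabulary. All numbers are made up.
import random

random.seed(42)

VOCAB_SIZE = 20_000
# Zipfian weights: the word of rank r gets weight 1/r.
weights = [1.0 / rank for rank in range(1, VOCAB_SIZE + 1)]

def distinct_words_in_sample(num_tokens):
    """Draw num_tokens word tokens and count how many distinct words appear."""
    tokens = random.choices(range(VOCAB_SIZE), weights=weights, k=num_tokens)
    return len(set(tokens))

for sample_size in (500, 2_000, 15_000, 60_000):
    print(f"{sample_size:>6} tokens -> ~{distinct_words_in_sample(sample_size)} distinct words")
```

The exact counts depend on the assumed distribution, but the curve keeps rising rather than levelling off, which is the sample-size effect described above.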
But these results can be seen from a different angle. How many words do you need to know to demonstrate
speaking proficiency in a relatively short test period? Let's say that your actual speaking time is 30 minutes. I
think that this is all the time the examiners will need in order to determine how well you master the spoken
language. How many different words do you have to use in 30 minutes to demonstrate C2 speaking proficiency?
It would seem that the answer for a language like English is around 500 words or even less. No doubt, if you spoke for 5 hours, you would use more words. But since vocabulary size is not being measured, what is the point of having the candidate speak for 5 hours?
We could also imagine that examiners probably have a good idea of a person's speaking proficiency in less than
10 minutes by the general way the person can handle pronunciation, grammar and vocabulary. It is very unlikely
that the candidate speaks really well for 10 minutes and then falls apart for the next 20 minutes.
Iversen Super Polyglot Moderator Denmark berejst.dk Joined 6716 days ago 9078 posts - 16473 votes Speaks: Danish*, French, English, German, Italian, Spanish, Portuguese, Dutch, Swedish, Esperanto, Romanian, Catalan Studies: Afrikaans, Greek, Norwegian, Russian, Serbian, Icelandic, Latin, Irish, Lowland Scots, Indonesian, Polish, Croatian Personal Language Map
| Message 306 of 319 12 May 2014 at 12:06am | IP Logged |
I agree that even a short interview can be enough to give an examiner a fairly good idea of a candidate's level, and I'm also sure that nobody counts the number of words spoken during such a test. A small background vocabulary will however mean that the candidate has a fairly narrow comfort zone, and a narrow comfort zone will be vulnerable.
However, I was not even speculating about language tests when I did my 'research' - I had assumed that my first estimate of 2600 headwords actually represented the size of my comfort zone (i.e. the words I would keep repeating), but instead it turned out that I just keep adding new words with growing corpora. I still think there is an upper limit, but that hunch is mostly based on my experience with extremely large dictionaries, where my percentage of passively known words tumbles downwards compared to measurements based on more modest dictionaries.
As for the topic of test behaviour, I think something similar might happen with very long tests, simply because people can't keep up the steam indefinitely if they are constantly straining to speak well and avoid their weak points. So maybe it takes twenty minutes before they crack, but they do crack if their linguistic basis isn't robust and comprehensive enough.
I remember a math exam at a school of economics where I studied in the 80s. For most of the time we discussed statistical functions, and I referred to beta functions and other fun things way beyond what we had been taught during the course. But just before we ended, the examiner asked me a simple question from the prescribed syllabus, and I couldn't hide that I simply didn't know a thing about that part of it. Actually I hadn't even bothered to find out what the syllabus included - and that is not the only time I have had that attitude. The examiner would not have discovered that gap if the test had been shorter (and afterwards I was scolded because I blew the teacher's one and only chance to award a top grade to somebody, but that's another story).
Since then I have forgotten all my math, and today I doubt that I could do even a simple chi-squared test. Now I apply my idiosyncratic study techniques to language study, and there I can fortunately avoid taking exams.
Edited by Iversen on 12 May 2014 at 12:18am
| s_allard Triglot Senior Member Canada Joined 5443 days ago 2704 posts - 5425 votes Speaks: French*, English, Spanish Studies: Polish
| Message 307 of 319 12 May 2014 at 5:04am | IP Logged |
Actually, this discussion raises an interesting question: how many posts does one have to read to get an idea of the poster's writing proficiency? As of this writing, Iversen has 8333 posts. How many of these should we read to determine Iversen's writing ability on the CEFR scale? One is probably not enough. But do we have to read 100? Maybe just ten. Or even just five. I don't have an exact answer myself, but I think everybody gets the point. A pretty small sample can give us an overall idea of performance proficiency.
Iversen Super Polyglot Moderator Denmark berejst.dk Joined 6716 days ago 9078 posts - 16473 votes Speaks: Danish*, French, English, German, Italian, Spanish, Portuguese, Dutch, Swedish, Esperanto, Romanian, Catalan Studies: Afrikaans, Greek, Norwegian, Russian, Serbian, Icelandic, Latin, Irish, Lowland Scots, Indonesian, Polish, Croatian Personal Language Map
| Message 308 of 319 12 May 2014 at 9:52am | IP Logged |
I would say that 100 long posts would be enough to show my writing skills (or lack thereof in certain languages), and the 8333 (now 8334) posts are more like a test of stamina.
What my tests suggest has not much to do with the evaluation of my writing skills - they just show that when you have got a fairly large active vocabulary then you WILL use it, at least in writing. Or in other words: they show that it is meaningless to use the words actually used as a measure of your potentially active vocabulary - if 15,000 words aren't enough to find the saturation point or the level that defines an asymptotic curve, then you can just as well settle for a smaller sample. And of course that's what happens during most written examinations - nobody expects you to write 15,000 or 30,000 words. And that's arguably a result that would have been at the edge of linguistics if I had been able to conduct the test in a squeaky-clean scientific way.
Does this result also apply to speech? I dunno - the jury is out on that one. Maybe it does, because you can't check a dictionary while you are talking. One corollary of the result for writing is that you simply can't test the size of the active vocabulary by making the exams longer - but that doesn't exclude that you can test other things, like stamina, versatility and the number of topics where you are at ease. So simply boosting the size of the examination essays is not a good idea, but asking for several shorter texts on different levels might be relevant. Or asking for essays in different styles.
But these observations mostly concern writing and languages at a fairly advanced level. If I had some really large word samples of my writing in a weak language in digital form, then I might test whether the saturation effect is visible there. But so far I have only reached the 15,000-word level in French (and 13,000 in German) - and both French and German are among the languages where I wouldn't expect to see an effect. The problem is of course that you tend to write less in languages where you have to look things up and check grammar etc. So in practice that line of research has run into a wall, at least for my part.
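(For anyone curious to repeat the kind of count described above, here is a rough sketch; it is a reconstruction for illustration, not Iversen's actual method or data. It runs through your own texts in order, tracks the cumulative number of distinct word forms, and lets you see whether the curve flattens towards an asymptote or keeps climbing. It counts inflected word forms rather than dictionary headwords, so absolute figures will not match the 15,000 mentioned above; only the shape of the curve is comparable. The file names are placeholders.)

```python
# Rough sketch: cumulative distinct word forms across a growing corpus of
# one's own texts. Tokenizer and file names are placeholders, not from the posts.
import re

def cumulative_distinct_words(paths):
    """Yield (tokens_so_far, distinct_word_forms_so_far) after each text file."""
    seen = set()
    total_tokens = 0
    for path in paths:
        with open(path, encoding="utf-8") as f:
            # Crude tokenizer: runs of letters, lowercased; good enough for a trend line.
            tokens = re.findall(r"[^\W\d_]+", f.read().lower())
        total_tokens += len(tokens)
        seen.update(tokens)
        yield total_tokens, len(seen)

# Hypothetical usage with placeholder file names:
# for tokens, forms in cumulative_distinct_words(["post_001.txt", "post_002.txt"]):
#     print(f"after {tokens:>6} tokens: {forms} distinct word forms")
```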
Edited by Iversen on 12 May 2014 at 2:54pm
| s_allard Triglot Senior Member Canada Joined 5443 days ago 2704 posts - 5425 votes Speaks: French*, English, Spanish Studies: Polish
| Message 309 of 319 12 May 2014 at 2:04pm | IP Logged |
Iversen wrote:
I would say that 100 long posts would be enough to show my writing skills (or lack thereof in certain languages), and the 8333 (now 8334) posts are more like a test of stamina.
....
Why 100 posts? That seems like a lot. I don't have a specific number myself, but, not surprisingly, I tend to favour
the other end of the spectrum. I would say that two long posts (minimum 200 words) are enough to get a good
idea of the writing proficiency. Then add another post just to confirm. So I'm looking at three posts.
My reasoning is that within a specific writing genre such as these posts the quality of production does not vary
much from one sample to the next. By quality I mean the ability to construct sentences of a certain complexity
and variety. We look for nuances and natural native-like style. We assume that these qualities will be constant
regardless of the subject matter.
One could test for different writing genres such as fiction, newspaper reporting, letter writing, academic articles,
etc.
Now, one could get picky and say that the quality probably varies over time as the writer gains more experience. That's probably true, but for our purposes it's not a big issue. It would suggest, though, that we use recent samples to assess current writing skills.
The fundamental point here is that the assessment of language performance has little to do with vocabulary size. Vocabulary usage skill? Yes. Size? No.
| Serpent Octoglot Senior Member Russian Federation serpent-849.livejour Joined 6610 days ago 9753 posts - 15779 votes 4 sounds Speaks: Russian*, English, FinnishC1, Latin, German, Italian, Spanish, Portuguese Studies: Danish, Romanian, Polish, Belarusian, Ukrainian, Croatian, Slovenian, Catalan, Czech, Galician, Dutch, Swedish
| Message 310 of 319 12 May 2014 at 2:57pm | IP Logged |
The question is whether the "vocabulary usage skill" has to do with vocabulary size. Being able to use vocabulary precisely and make distinctions requires you to know the vocabulary in the first place. Working around your gaps is just one part of those vague skills.
| s_allard Triglot Senior Member Canada Joined 5443 days ago 2704 posts - 5425 votes Speaks: French*, English, Spanish Studies: Polish
| Message 311 of 319 12 May 2014 at 6:35pm | IP Logged |
Anyone who has done any kind of professional writing or editing knows that there is such a thing as writing quality. Speaking quality is more nebulous.
I won't outline a course on what good writing is, but a couple of items come to mind: proper grammar, proper word usage, readability, logic, lack of repetition. In fact, good editing usually reduces the length of a text because one eliminates a lot of verbosity. Bloated vocabulary is a bad trait.
| Serpent Octoglot Senior Member Russian Federation serpent-849.livejour Joined 6610 days ago 9753 posts - 15779 votes 4 sounds Speaks: Russian*, English, FinnishC1, Latin, German, Italian, Spanish, Portuguese Studies: Danish, Romanian, Polish, Belarusian, Ukrainian, Croatian, Slovenian, Catalan, Czech, Galician, Dutch, Swedish
| Message 312 of 319 12 May 2014 at 7:48pm | IP Logged |
But L2 is completely different from L1!
Naturally, using fancy linguistic units is also a skill. Many learners look ridiculous when they try to show off the cool words they've learned, especially if they know a lot of basic words, a tiny number of advanced words and nothing in between. But the solution is acquiring more vocabulary, not less, in order to be more precise.
And at an exam those words are often dragged out of you by force if you don't use them spontaneously.