Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Messages - Sebastian_Rocheleau

Assessment of Writing / Formative portfolio-based assessment
« on: May 12, 2017, 12:08:54 AM »
Here's a portfolio-based assessment folder I have developed for an academic writing course for graduate students. It enables instructors to assess their students' intake throughout the two major units of the class I teach (annotated bibliography & secondary research paper). Feel free to modify this portfolio to meet your students' needs. It includes a task targeting the higher levels of Bloom's taxonomy and a set of self-reflection prompts to raise students' awareness of their own writing and the content covered in class. There is also a section for instructor feedback on each lesson's portfolio file.

I suggest using Box (if you have access) for this portfolio because of the browser-based editing and FERPA compliance that Box provides. If you cannot use Box, Google Drive is an adequate substitute. Enjoy!

What an interesting phenomenon. Just to clarify: when you say achievement test, I assume you mean a low-stakes, in-class summative test?
I think it really depends on the construct you are assessing with the reading in the test. If your construct is writing, then I don't see it being problematic. It might actually be beneficial, as it would make the task more focused. For example, you could give students a text they have already read, but have them write something new based on the text. If the purpose of the text is to assess reading, then using a previously covered text would cause issues with various aspects of test validity (predictive, convergent, face, and content). I'm not really sure what can be done to fix this situation in Korea. It really depends on what the instructors wish to assess and their awareness of the limitations of this form of testing.
When I was teaching in the Japanese junior high school system, I saw some instructors generating tests that did use previously covered texts as part of the items for reading, grammar, and speaking assessment. Maybe a reason for this in EFL environments is instructor confidence in item design? Some teachers might doubt their ability to create a text item that covers the same aspects of the reading construct that were covered in the in-class text.

I know that here at the University of Illinois we generate all texts for assessment. We normally base the text on an authentic source (newspaper article, research article, etc.) and then modify it to fit our needs, but like any type of item design, there are drawbacks to this approach. It can be difficult to maintain a consistent level of difficulty between semesters if the items are always changing, so we use several different techniques to keep difficulty stable. For example, to maintain a consistent level of vocabulary difficulty in our English Placement Test, we set word frequency guidelines for text items (e.g., of the total words in the text, 60% from the top 1k words, 15% from the 2k band, 15% from the Academic Word List, and 10% above 2k). This allows us to increase the test's validity.
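To make the guideline concrete, here is a minimal sketch of how a draft text item could be checked against frequency-band targets like the ones above. The band word lists here are tiny placeholders for illustration only, not the actual 1k/2k/AWL lists we use, and the function name is my own invention:

```python
import re
from collections import Counter

def band_profile(text, bands):
    """Return the percentage of tokens falling into each frequency band.

    `bands` maps band name -> set of words, ordered from most to least
    frequent; each token is counted in the first band that contains it,
    and any leftover tokens fall into an "other" bucket.
    """
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for tok in tokens:
        for name, words in bands.items():
            if tok in words:
                counts[name] += 1
                break
        else:
            counts["other"] += 1
    total = len(tokens)
    return {name: 100 * n / total for name, n in counts.items()}

# Toy band lists, for illustration only.
bands = {
    "top_1k": {"the", "students", "read", "a", "text", "and", "write"},
    "top_2k": {"summary"},
    "awl": {"analyze"},
}
profile = band_profile("The students read a text and write a summary", bands)
```

With real band lists loaded from files, the resulting percentages could then be compared against the 60/15/15/10 targets before a text item goes into the test.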

I think one thing that needs to be added to this training is how to monitor how much of student feedback is turning from input into intake. From my experience in feedback training seminars, instructors tend to focus on how to give feedback and don't really follow up with students to see if they actually understood what the feedback meant. Some suggestions to help teachers understand how effective their feedback was for their students include:
  • a follow up questionnaire for students
  • an individual conference where students confirm and clarify the teacher's feedback
  • some form of student reflection, on paper or in media form (I recommend screencasts if you have time), on the feedback and how they plan to incorporate it into the assignment

Feedback--Teacher / Free screencasting tool for video-based feedback
« on: April 19, 2017, 11:21:54 PM »
Greetings! I have been using OBS (Open Broadcaster Software) to give video-based feedback to my students and to make instructional videos. You can record mics and computer-based audio and include graphic overlays with multiple windows in your screencast. The learning curve can be a bit of a challenge, but it's completely free and offers the same features that "premium" paid software includes.

Reading Activities (during reading) / Article Analysis
« on: March 09, 2017, 08:40:36 PM »
Here's an article analysis activity to help turn learners into active readers. See the link for all info!

Harry Potter Lessons! / Pronunciation Prediction Using Harry Potter.
« on: March 09, 2017, 11:25:46 AM »
Greetings! Many people generate an internal monologue when they read, but how do we know if L2 English learners' internal monologue reflects real-world pronunciation? This activity bridges reading and pronunciation. Enjoy!

Briana, Sebastian and Aaricka

Here's a link to our activities during the reading section.

Just to add some future modifications for a longer class: you could add a student-generated word list section where students write down textspeak constructions they do not understand and find their meanings. You could also cover the frequency of textspeak in messaging; the study that formed the backbone of this lesson mentioned that only 10-20% of language is shortened. Students could also do an audience analysis to raise learners' awareness of with whom, and at what level of complexity, they use textspeak with the people they know.
