Category Archives: H817

Assessment

Assessment is an interesting subject.  In everyday life assessment is usually against a standard of ‘fit for purpose’. But what is the purpose of education?

  • ‘It’s a good thing’.  So everyone does it, learning things they may not necessarily need to know, as an entry into civilised society and as a qualification for employment.  The higher the score the better you are deemed to be; hence highly structured, consistent assessment which does not try to assess useful knowledge, just knowledge to a standard.
  • To be better at what you have chosen to do.  Hence vocational education, including things like law and accountancy.  Even here the qualification is really only an entry card; real life demands trawling the internet for precedent and case law.  Assessment in this case should be authentic to practical demands and involve problem solving.  Some of it is like this; some of it is just mindless repetition of facts.
  • To be able to do something new.  This is the tricky one but particularly relevant to OU and distance education.  In this case, reliance on context and ‘real life’ situations actively discriminates against those with no current context.  Case studies are OK where the context is provided but questions relying on use of knowledge from practice and contextual background are not.

So it is horses for courses.  However, it is difficult to see how formative assessment/feedback on coursework could fail to be beneficial in all three situations, in that it is by nature personal, relevant and remedial.  Coursework used in summative assessment, on the other hand, can be ‘gamed’ and hence has got a bad name – it is said to be too easy.  It does, though, represent a much closer approximation to how knowledge is gained and refined in everyday life than terminal summative assessment.

So I think Web 2.0 tools should be very valuable for assessment.  Assessment of forums, blogs, quizzes, wikis, shared projects should all contribute to measurement of performance.  How one does that consistently and equitably is another discussion but it seems to be a demonstrably more appropriate system than trying to make terminal summative assessment authentic.

Reflection on H817 Block 3

What did I expect?

I expected a highly structured approach – that is almost inevitable where a technique is being taught through a worked example.  The level of structure was – as has become typical in MAODE – rather more than I expected at MA level.  With the trend even at GCSE to move back from heavily prescribed learning and assessment, it is a jolt to be force-fed processes, templates and headings down the required path.  I thought, however, that we handled it well – we didn’t ignore the process, but we did take some shortcuts and overrides in line with our team judgement.

What did I hope for? What did I do, and what did I achieve?

I hoped to learn about design and I achieved that. I was (more than most) unfamiliar with software/learning design processes and I found the structured approach useful. I particularly liked the Personas, Forces and Use Cases which required me to take that user perspective.  I was surprised at the degree of care we had to take to ensure engagement from users.  I am from a generation used to doing what they are told – I have to learn to earn that attention from the current generation of users.  This is valuable for me as I take the MA forward into real application in today’s environment.

Which barriers and challenges did I confront, and how did we resolve them? Finally, what did I learn?

We are counselled to talk about process only as it impacts the learning, but I can’t ignore the barriers to practical learning put in place by the micro-sequential process (by item, by day, by each team member), which flew in the face of the availability of a part-time, world-spread course membership.  It impacted learning by compressing useful processes into windows of availability and by making it impossible to compare learning and interpretation because of the mismatch of timings.  We resolved this by a sequential approach of building on each other’s work, which was highly successful – but it would have been high risk in a less cohesive group.

I have already discussed the learning I achieved about design.  I also learnt the real value of trusting and respecting the intellectual inputs of fellow learners as complementary and part of one’s own learning process.

Reflections on Week 17 – design principles, conceptualising and storyboarding

Officially this section is called ‘ideation’ but that is a step too far from English for my comfort.  It is interesting that some word contractions seem to work and some just seem to jar. I guess the key is clarity – it takes me longer to work out that ideation means creation of ideas than to use the longer phrase!

The design patterns and principles exercise was useful – distilling the key learning messages out of the case studies and theories which would be relevant to our design.  This then fed into storyboarding which again showed the diverse (but not divergent) strengths of our team.  We approached it from our experience – I created a flowchart for the game process and an event map and Gordon then fleshed out the flowchart into a more professional systems diagram.  Christine fed from this into a much more visual mindmap style of process flow with each level subdivided into what will become separate page elements of the game.  Floriane took the user’s point of view, taking a couple of personas through the process to see how well it met their needs and preferences.  Altogether a great combination of relevant skills to get a useful and usable result.  Much quicker than duplicating and debating each process!

We are now approaching the convergence phase – producing a coherent product.  Some nervousness but equally complete confidence that our pragmatic approach will produce a competent product.  Tom Peters has just produced a great paper – ‘Systems have their place; SECOND place’!  He points out that human values, organisational culture and instinct for what is appropriate – the soft stuff – supported by hard systems can produce great results.  Systems on their own can create conceptually perfect but practically disastrous results.  I think our team has concentrated well on trusting our judgement, interpreting the ‘rules’ to produce a good product in the time available.

Peters, T. (2013)  Systems have their place; SECOND place [Online]. Available from http://www.tompeters.com/docs/SystemsSecondPlace021113_final.pdf (accessed 07 June 2013).

Reflection on Week 16 – Inspiration?

Inspiration is the somewhat optimistic title for the research into case studies, frameworks and design patterns.  As in much of the research into the value added by ICT to education, the answer seems to be ‘not as much as you would expect’.  The key problem seems to be the failure to identify ways of using games (in this case) to teach.  The case studies and research reviews seemed to conclude that most games were in the behaviourist mode of drill-and-practice rather than teaching or providing a constructivist environment for learning.  Was Mor’s tongue in his cheek when he identified that constructivist games were ‘mainly in research settings’ – a case maybe of researchers trying to justify their current fashion in educational theory?

Reading the National Curriculum one can understand why.  All the requirements are phrased in terms of demonstration of outcomes, with no comment on how teachers attempt to create such outcomes.  Little surprise, then, that teach-to-test has become endemic: it provides the answers to the questions the National Curriculum poses. The end appears to justify the means.

It has been interesting to observe the different preferences for techniques introduced in this block.  However much the course designers try to shoe-horn us into a prescribed approach, many contributors rebel because they just don’t see things that way. To quote one of the more analytical contributors ‘it didn’t seem to want to do anything for me…the whole thing leaves me quite cold. I just don’t think in this way to be honest’.  This highlights that, with an experienced cohort at MA level there are serious questions to be asked about the very prescriptive way in which this block, module and the whole MAODE edifice is structured. Should we not be expected and entitled to interpret the learning in a more individual way and, most important, in the way that adds most value to our particular needs?  If paid MAs don’t allow this, it opens the way to MOOCs and other such user-driven environments.

Mor, Y., Winters, N., Cerulli, M., Bjork, S., Alexopoulou, E., Bennerstedt, U., Childs, M., Jonker, V., Kynigos, C., Pratt, D. and Wijers, M. (2006) Learning patterns for the design and deployment of mathematical games [Online], London, Institute of Education. Available at http://eprints.ioe.ac.uk/4223/1/LP-LitReview-v2.pdf (accessed 24 May 2013).

Reflections on Week 15 and forward view

I enjoyed Week 15 overall.  Generating Personas seems a good way of capturing experience of how users or stakeholders view new developments, enabling those tests to be applied in the design of the product.  The Personas led on to Forces, conflicts and the Force Map.  The map became a little mechanistic to build but usefully showed the interaction between Personas and game design attributes.

The team has worked really well.  We could easily have agonised about roles, work division and outside distractions but instead we have just got on with it!  Everyone has contributed, showing a lot of self-motivation and leaving the team leader to steer and tidy, which Christine is doing effectively.  Good mature, self-directed stuff – what adult learners should be showing.

We are now into more familiar H8XX territory – reviewing case studies and theories to see what is relevant to our design.  There is a lot out there, though one has to say that most of the more interesting approaches are research studies rather than real public games.  Mor et al. (2006) identified the combining of mathematical skills and instructional skills as the key issue but I would add game design experience to that.  It is one thing to conceptualise a game, quite another to know the tricks for putting it into practice.

I think that this is going to be the pinch point in weeks 17-19.  We will be overwhelmed with inputs and ideas and will need to focus those down to a game and game-play concept. Simultaneously however, we will be trying to learn how to put a game prototype together – an entirely different skill.  I hope it will not feel like a traditional TMA reflection – wondering if it was worth doing all the preparatory work to have so little in terms of output?

Mor, Y., Winters, N., Cerulli, M., Bjork, S., Alexopoulou, E., Bennerstedt, U., Childs, M., Jonker, V., Kynigos, C., Pratt, D. and Wijers, M. (2006) Learning patterns for the design and deployment of mathematical games [Online], London, Institute of Education. Available at http://eprints.ioe.ac.uk/4223/1/LP-LitReview-v2.pdf (accessed 24 May 2013).

Reflections on Week 14 – Intro to Design Studio

Positives –

I like the project and I am delighted to be in a well-balanced and proactive team.

The concept of the Design Studio seems to be well thought through, if rather exhaustive in structure.  The emulation of an artistic design studio is a good idea, allowing peer review and input to achieve the best outcome without sacrificing individuality.  The process design seems to allow all team members to contribute their experience.

Concerns

Is the process exhaustive or exhausting? There is not much room for instinct, flair and unconstrained brainstorming. It is not technology-led, but it is certainly rigidly process-led.  Having part of the assessment based on team process is good, but it discourages treating the rules as guidance rather than decrees.

As in Block 1, the required sequential team actions do not really accommodate the reality of different availabilities during the week and weekend, though we are coping.

Verdict

Positive so far.  I will be interested to see if the rigour of the process produces the perceived best result. A bit like Communism – good in theory but not noted for great outcomes?

Team Roles

I agree that the categorisations in Salas et al. (2005) and Kay et al. (2006) are useful competencies/behaviours within a team.  I don’t agree that all team members have to aspire to or practise all the competencies all of the time. The team leader may not want the application designer to be wasting time by simultaneously trying to exercise back-up behaviour or mutual performance monitoring.  I don’t usually like mechanistic solutions which fit people into boxes, but I did find Myers-Briggs and Belbin useful in my business career.  I expect people are familiar with them but, essentially:

  • The Myers-Briggs methodology fits people into one of 16 personality types. That sounds crude, but most people’s first reaction on reading the description of their type is ‘how on earth did they know?’.  It focuses on strengths but, as importantly, identifies complementary weaknesses. It helps you to accept that, if you are a good strategist/conceptual thinker, you are unlikely also to be diligent at the fine detail. Hence instead of trying to hide this ‘weakness’ you strategise to accommodate it.
  • Belbin identifies team skills such as ‘Resource Investigator’ and ‘Completer/Finisher’ and proposes that a good team will have a combination of such skills in its members.  Their punchline is ‘you can’t be a perfect individual but you can have a perfect team’.

Given the diverse experience of MAODE students, I think this sort of approach is relevant to team development. We can’t pick our members, but we can allocate roles to relevant strengths or interests. Hence it will be appropriate not just to compare our experience but also to identify known strengths or bêtes noires.  We’re here to learn online education, not to experiment in becoming more complete people – there’s a MOOC for that!