Forums | Mahara Community
Developing the use of SmartEvidence
02 May 2018, 9:41 PM
I wanted to post my experience using SmartEvidence and trying to design it for use with degree apprenticeships. Any comments and feedback are welcome.
From the academics I have been working with, I've learned that degree apprenticeships require students to collect evidence from their workplace to meet standards set out by the professional body. These students are using Mahara to collect and upload evidence, and I wanted to see whether we could introduce SmartEvidence so that a tutor could sign off each competency once the appropriate evidence has been added.
So I took the professional standard and converted it into a JSON file (I am not a coder, but I can recommend Atom, which helped me with the syntax), and it looked like this:
(unable to post screenshot)
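For anyone who hasn't seen one, a SmartEvidence framework file is a JSON document roughly along these lines. This is only a sketch: the key names follow my recollection of the sample .matrix files shipped with Mahara and should be checked against a current sample, and the standard text here is placeholder, not the professional body's wording.

```json
{
    "framework": {
        "institution": "all",
        "name": "Degree Apprenticeship Standard (placeholder)",
        "description": "Competencies set out by the professional body.",
        "selfassess": false,
        "evidencestatuses": {
            "begun": 0,
            "incomplete": 1,
            "partialcomplete": 2,
            "completed": 3
        },
        "standards": [
            {
                "standardid": 1,
                "shortname": "C1",
                "name": "Competency 1 (placeholder)",
                "standard": "What the apprentice must demonstrate.",
                "standardelements": [
                    {
                        "elementid": 1,
                        "shortname": "1.1",
                        "name": "Sub-competency 1.1 (placeholder)",
                        "standardelement": "A more specific requirement.",
                        "parentelementid": null
                    }
                ]
            }
        ]
    }
}
```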
I presented this to the academics for their feedback, and this is what I got:
After taking a look at the SmartEvidence criteria, Claire commented that a number of pieces of evidence may be required to show that a criterion has been met. She would like two further features. The first is to show how many pieces of evidence are required in numerical form: x/x have been completed.
The second is the ability to mark a piece of evidence as good and as contributing to a criterion without in itself completing it. For example, one criterion requires five blog posts that show progression in an area; the tutor may provide feedback after each post, marking whether it contributes.
Sub-topics should be collapsible, and the headings should show the same traffic-light system once all of their sub-headings have been completed.
Finally, she would like to see a traffic light on the headers once they have been completed, and, once all the criteria in a section have been completed, a traffic light to indicate this.
My personal feelings about SmartEvidence:
How should the annotation section be used? When users upload or create their evidence on a portfolio page, the only thing I feel you need to do in this box is write "this is now ready to be assessed". Should this not just be a button, or am I missing the point?
When I presented this at Mahoodle a week ago, one question I was asked was how we give externals access to sign off work when other sections of the portfolio may contain sensitive material that they should not see. The previous institution I worked at used PebblePad. They had developed a very interesting app whereby a student could create a piece of evidence and show it to their mentor/line manager in the workplace, who could verify with a signature that the content of the evidence was correct/the student's own work. At this point the evidence would lock, become un-editable, and then be uploaded to the portfolio. Would something like this be worth investigating?
Sorry for the long post; please comment and give feedback.
03 May 2018, 5:04 PM
Thank you very much for your thoughts and questions. That's exactly what we at the core team are looking for now that more institutions have SmartEvidence available. We need to keep in mind that the current functionality was the initial phase of development, based on analysis that the University of Canberra and we at Catalyst carried out (with input from others who attended the hackfest at Mahara Hui NZ in 2014).
Currently, the alignment with a standard is based on a page. Thus, if you have 5 blog posts that contribute to a standard, they can all be displayed on one page and viewed together, and the icon would apply to all 5 of them, as the student would only write one annotation / reflection for the entire page. We don't just provide tick boxes to say that a standard was completed; we require an annotation so the student can, and has to, explain why they think their evidence aligns with a standard, making their decision clear(er) to the assessor. SmartEvidence shouldn't just be a multiple-choice exercise but should ask students to reflect, make meaningful connections, and get into the habit of reflecting / summarizing by providing the annotation.
Individual pieces of evidence can contribute to the overall completion of a standard, and the annotation applies to the entire page housing that evidence. If a teacher wants to give feedback without making an assessment, they can do so on the annotation; if they want to provide feedback on individual artefacts, they can still do so by commenting on the evidence directly. That wouldn't help with calculating a numerical score, but that can be investigated.
Standards are already collapsible (at least since Mahara 17.10), but additional subsections aren't.
Recently, I looked into how we could align individual pieces of evidence to a standard, as that had come up as a question. I came up with the idea of allowing annotations / reflections for the same standard more than once on a page, which is not possible right now. The idea was to make as few changes as necessary (given the low budget available) while still getting maximum benefit.
The tricky thing was to find a way to keep the matrix overview page manageable (page 2 in the file I linked to). So, pages continue to be displayed rather than individual pieces of evidence. However, instead of displaying the status icon for 1 reflection, the status icon displayed underneath a page for a standard is an aggregated view of all the reflections on a page. Displaying multiple icons could easily overwhelm a viewer even when there are only 5 reflections per page. Furthermore, in order to view the evidence in relation to the reflection, the page would need to be viewed anyway.
Instead of displaying multiple icons per standard per page, which would get unwieldy very quickly, the reflection summary area would be used to show the counts of the individual reflections' statuses on the page, and the icon would indicate the overall assessment.
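To make the aggregation idea concrete, here's a small sketch. It's illustrative only, not Mahara code: the status names and the "weakest status wins" roll-up rule are my assumptions, as is the x/y count from the earlier post.

```python
from enum import IntEnum

class Status(IntEnum):
    """Hypothetical per-reflection statuses mirroring the traffic-light icons."""
    BEGUN = 0
    INCOMPLETE = 1
    PARTIAL = 2
    COMPLETED = 3

def aggregate(statuses):
    """Collapse several per-reflection statuses into one page-level icon.

    Assumed rule: the page shows the *weakest* assessment, so a standard
    only reads as completed when every reflection on the page is completed.
    """
    if not statuses:
        return None
    return Status(min(statuses))

def progress(statuses, required):
    """Numeric 'x/y completed' summary, as requested in the thread."""
    done = sum(1 for s in statuses if s == Status.COMPLETED)
    return f"{done}/{required} completed"

# Example: two completed reflections and one partial one on a page
# roll up to a 'partial' icon, with a 2/5 count in the summary area.
page = [Status.COMPLETED, Status.PARTIAL, Status.COMPLETED]
print(aggregate(page), progress(page, 5))
```

Whether the weakest or the most recent status should win is exactly the kind of design question the mock-ups are meant to tease out.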
An alternative (not mocked up in the PDF) would be to add the reflection directly onto the piece of evidence and make it part of it. That would have had bigger implications and thus wasn't looked into.
The last page shows another option for working with reflections and the standards. If your students are only expected to write 1 reflection per piece of evidence and then select the standards to which they want to align their evidence, this could be a more efficient way of displaying the reflections and the alignments. The challenge there would be how to remove standard alignments once they are no longer needed / wanted (as everything is now in one block), so this would need more changes than the first option, but it can certainly be looked into.
At the moment, only people with the role of staff can assess a SmartEvidence collection (unless it's self-assessment), and thus they need to be logged in. We made this decision because other assessment options are only available to staff, and with a secret URL people can't be verified: anybody with the URL could make the assessment, so there would be no way of tracking who the real assessor was.
PebblePad's solution of showing the assessor the portfolio while the student is logged in and then asking the assessor to sign the portfolio on a mobile app is one way of dealing with accounts. However, as far as I know, this doesn't work when students and assessors are remote from each other. It is, though, an option for people who are local.
BTW, if you'd be willing to share your framework with the rest of the community, I'd be happy to include it in our repository of sample files.
03 May 2018, 8:14 PM
Hi Edd - here's the screenshot as requested :)