Introduction to critical appraisal!
Welcome to all students studying the MSc in Forensic Analysis. This blog post is part of your Research Methods and Practical Skills module led by Helen Green (USSKM3-30-M: USSKM3 is the module code, 30 indicates this is a 30-credit module, and M indicates master’s level).
I’m Viv Rolfe and we have a number of two-hour sessions together as follows:
Week 9 (20 September)
Lecture and discussion –> Notes here lecture-1-19sept2016_online
Task 1 – writing critically
Week 11 (4th October)
Week 15 (Independent learning)
Week 16 (8th November)
Week 17 (15th November)
Week 18 (22nd November)
My goal for you is of course to pass the January 2017 examination in which you will write a critical appraisal of a forensic science journal article. I also hope we have a constructive and fun time in these sessions and that you will also develop valuable skills in critical thinking and critical writing.
I think we take for granted our ability to read scientific articles, and write about them, but do we ever stop to question whether we are being really effective? How are your critical thinking skills? Do you sometimes think critically about the scientific world around you, or are you too rushed to stop and do so? Do you consider yourself a fair person, unbiased, in the way you think and communicate your ideas with others?
I hope these sessions help you grow as critical thinkers and writers. You might wish to watch this introductory video to critical thinking, which references the work of Richard Paul who was a leading proponent in this area.
How are these sessions structured?
You will need to bring a pen and paper to these sessions, as a big part of them will be you developing your thinking and writing skills. We shall be forming pairs and groups to discuss aspects of forensic science research and court case studies. Each week there will be a ‘task’ or homework which I very much hope you will all take part in; these will include the opportunity for you to complete small writing tasks for me to help you develop your talents!
Some preliminary reading for weeks 9 – 11
I am hoping you’ll find these sessions a bit of an ‘eye opener’ and we will be challenging some of the established doctrines that surround our research industry – from experimentation and interpretation to communication and publication. Here is a blog post that I wrote reflecting on the quality of medical research – or often, the lack of it.
We are going to base some of this module work on the writing of Trish Greenhalgh. She has perfected the art of ‘trashing a paper’, and there are a number of articles that you can refer to, all freely available here. I’d focus on two at the start of this module:
Getting Your Bearings
Assessing the Methodological Quality
We’ll work toward understanding these papers toward the end:
Papers that Report Drugs Trials
Papers that Report Diagnostic Screening
Greenhalgh T (-) How to read a paper. The BMJ. Available: http://www.bmj.com/about-bmj/resources-readers/publications/how-read-paper
The do’s and don’ts of publishing.
American Association of Immunologists (2010). Dos and Don’ts for Authors and Reviewers. Available: http://vivrolfe.com/research-methods/Assets/Scientific%20Publishing_Dos_Donts.pdf
These items may also be helpful:
A blog post on basic journal searching and some openly licensed (again free) learning resources covering all basic skills for students.
Let’s get started!
You have my UWE email address and contact details on the Blackboard Module Page, but you can also contact and chat with me via Twitter – in fact, I would love for you to share any interesting articles or videos relating to our studies.
How to think about assessment.
A starting point – or refresher – for anyone wishing to think about assessment and feedback is Phil Race’s textbook (available as a free eBook via our UWE library):
“The lecturer’s toolkit: A practical guide to assessment, learning and teaching”. (Latest UWE edition, 2015).
Chapter 2 gets us thinking about how to make assessments (and feedback) fair, valid, authentic and reliable. I think in recent years we have become confused regarding the purposes of formative and summative assessment. Our summative assessments come with the expectation of providing feedback, and programme assessment strategies tend to lose the scope for incremental development. Here are the four main purposes of assessment:
- Formative assessment – assessment ‘for’ learning uses assessment to raise student achievement.
- Summative assessment – assessment ‘of’ learning judges work against standards or criteria.
- Diagnostic – identification of needs/knowledge at the start of a session/course.
- Ipsative – self-assessment based on student previous levels of achievement.
Based on Phil’s toolkit, and some of my own ideas, here are some important principles; the first four are Phil’s:
- Validity – how does the assessment link to the learning outcomes (i.e. does it follow the principles of constructive alignment)?
- Reliability – how do you ensure consistency between markers and quality control?
- Authenticity – is the assessment relevant to the discipline, or does it more broadly provide skills for employability?
- Fairness – do your students have a practice run or feedback on a draft piece of work? Are the assessment and marking criteria clear, and is the availability of support clear to learners?
- Engagement – is there evidence of the work in progress (e.g. blog or progress log, or submission of drafts to Turnitin)?
- Innovation – are you inviting originality from the students, or, even better, fostering co-creation?
- Inclusivity – does your approach support all students on an equal basis, or are you merely testing their ability to take the assessment? Are you offering assessments in other formats? Is your overall curriculum design for assessments diverse enough to offer equity of opportunity overall?
When you are designing assessments – ideally as a team – you need to consider the individual student as well as a programmatic approach. I believe that if you get many of these elements right – engagement, innovation and inclusivity, then you will achieve the goal of ‘designing out’ academic offences in the holistic approach described by Macdonald and Carroll (2006).
Academic offences – and how we should be thinking about them.
The paper “Plagiarism detection and prevention” by Ursula McGowan (2005) always struck a chord with me. Ursula describes how the introduction of so-called ‘plagiarism detection software’ and academic offence policies has taken precedence over ensuring students have the right skills and understanding in the first place. She rightly argues we have put ‘the cart before the horse’, and a number of studies have confirmed that students new to university often do not know what academic offences are, and that international students may be particularly disadvantaged as they are not used to the regulations and cultural norms of our education system. Are we helping our students learn, or are we just trying to stop them cheating? Review the language in your module handbook. Are you policing students or building support and a culture of honesty and academic integrity?
What causes academic offences?
Academic offences can occur for many reasons. They may be inadvertent due to lack of student skill or understanding, they may of course be purposeful, or they can quite often be encouraged by poor assessment design. The main offences we are dealing with here include plagiarism, collusion and contract cheating (the purchasing of assignments from essay mill sites, or exchanges through social media or forums). On most sites where work can be purchased, the work is run through text-matching software and occasional words are altered to break the text string, so the purchased work won’t be detected by the digital tools often used in institutions (see the sections below). BUT BUT BUT the well-trained eye of a teacher or tutor will spot such work a mile off. And of course, the quickest and simplest thing to do is to copy and paste any suspected text into Google to see what comes up.
Are we really sure what these offences are?
What is plagiarism? “The action or practice of taking someone else’s work, idea, etc., and passing it off as one’s own” (Oxford English Dictionary 2007). This can be done in a number of ways, or combinations of these:
- Copying words verbatim without acknowledging the source
- Poor paraphrasing – a poor attempt at writing in the student’s own words – without acknowledging the source
- Using other media without acknowledgement (e.g. photographs/diagrams – and of course the acquisition of additional permissions may also be required here)
It is always worth checking your own university definitions and regulations around academic offences as they may differ slightly. You can access the UWE student study skills guide on plagiarism here –> plagiarism.
For correct acknowledgement we would look for a citation and reference in the correct format. Work can also be ‘self-plagiarised’ if previously submitted work is reused but not cited (i.e. acknowledged as the student’s own previous work). If a full assignment is resubmitted without citation, then this would be self-plagiarism, alongside a fuller consideration of poor academic practice. Plagiarism is all about the lack of acknowledgement of the ownership of the original work. Broader aspects of cheating could include altering or inventing data, for example.
What is collusion? Another form of academic misconduct, common in group working where the assignment brief is not made clear to students. Here, students submit assignments that have been completed with other people.
Contract cheating? The purchasing of coursework online could not be simpler, whether through essay mill websites or through easy enough contact with other students. The only way of combating this is using the steps below to ensure the originality and individuality of submitted work.
How do we design out plagiarism?
Going back to our assessment principles, and in particular the ones that can help ‘design out’ academic offences:
I do believe there should be no such thing as an academic offence, or that it should very rarely occur. Students are often accused of collusion because, for example, the requirements and allowances of group working, say for writing up laboratory practicals, are not clearly explained. (Is this a fair assessment?) We set them high-stakes assignments (e.g. a dissertation write-up) often with no evidence of work in progress. (Can we encourage engagement and therefore verify their work?) Research suggests students are more tempted to cheat when they are under stress – look at your programme assessment strategy. Is coursework bunched up at the end of term, encouraging them to cut corners? Is the assessment blend diverse and flexible enough to accommodate learner abilities? (Inclusivity?) Are students getting adequate formative feedback to develop their writing, and is this incremental, helping them to develop across their years of study? (Fairness?) Are you setting the same essay titles year on year? (Innovation?)
We must look at our individual practice and the assessments that we set, and we must also look at our overall programme curriculum design. Macdonald and Carroll (2006) talk about a holistic approach from policy to curriculum design and student skills needs, and about balancing “low-stakes formative assessment for learning and using high-stakes assessment sparingly to genuinely measure student learning”.
So ask yourself some questions about your own assessments:
- Do you set the same essay title year on year?
- If students are retaking a module or the year, do you set the same assignment (tempting them to resubmit old work and self-plagiarise)?
- Do you provide an obvious title (“describe the electrical conductance system of the heart”) that can simply be Googled and easily ‘patchwork-written’ or purchased?
- Is the student answering your question or establishing one for themselves? Or can you include a reflective question asking “how did you approach this assignment?” (Innovation and originality?)
- Are you monitoring progress by introducing iterative steps? E.g. if students are completing a 2000-word essay, why not include a search strategy and/or annotated bibliography that can be submitted separately for quick feedback (or peer feedback)?
- Do you encourage independent critical thinking, or do you slog through providing all the feedback yourself? Why not use Turnitin or similar for the submission and self-review of drafts? (Evidence of engagement.)
- Why not go the whole hog and establish the practice of students as ‘co-creators’ of their curricula and assessments? (Innovative and original; fairness; engagement).
‘Plagiarism detection software’ klaxon!
This post has already mentioned the use of digital tools for the detection of plagiarism. First of all, this is a misnomer. Software such as Turnitin(R) or SafeAssign(R) does not detect plagiarism – you do! Or if the student is using these services to check their work, then they are forming a judgement upon it. These tools use text-matching algorithms to identify strings of words similar to other sources. They make the matches visible, often highlighting them in colour, and produce a percentage value of matched text. Do not fall into the trap of using this percentage as a cut-off for plagiarism, as I have seen in some university regulations. Commercially available plug-ins such as Turnitin can be used with virtual learning environments (commonly Moodle or Blackboard).
Turnitin, as an example, works by scanning submitted work against the company’s database of previously submitted work, articles and documents available from publishers, and internet pages. Not all publishers (especially some journals) make their text available for comparison, and the matches provided do not necessarily equate to the original source of the material. Whilst I do think these are useful tools, there is some work to be done to use them effectively.
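To make the ‘text-matching, not plagiarism detection’ point concrete, here is a toy sketch in Python. This is emphatically not Turnitin’s actual algorithm (which is proprietary) – just a minimal illustration of how matching word sequences can produce a similarity percentage, and how altering the occasional word breaks the match:

```python
# A toy illustration of text-matching: compare word n-gram "shingles"
# between a submission and a source. Real services use proprietary
# algorithms and vast databases; this only shows the basic idea
# behind a "similarity percentage".

def shingles(text, n=3):
    """Return the set of lowercase n-word sequences in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def match_percentage(submission, source, n=3):
    """Percentage of the submission's shingles also found in the source."""
    sub = shingles(submission, n)
    if not sub:
        return 0.0
    return 100.0 * len(sub & shingles(source, n)) / len(sub)

source   = "the quick brown fox jumps over the lazy dog"
copied   = "the quick brown fox jumps over the lazy dog"
altered  = "the quick brown wolf jumps over the lazy dog"
original = "a slow red hen walks under a busy bridge today"

print(match_percentage(copied, source))    # 100.0 - verbatim copy
print(match_percentage(altered, source))   # one changed word knocks out
                                           # every shingle containing it
print(match_percentage(original, source))  # 0.0 - no shared word strings
```

Notice how swapping a single word removes every three-word string containing it – exactly why essay mills tweak occasional words to dodge these tools, and why the percentage is a prompt for human judgement, not a verdict.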
What should you think about before implementing such software?
In some of my previous work, we were about to roll out the use of Turnitin for all student assignments in our university. One thing was clear: it was very important to proceed carefully with students and staff when doing this. Think about the message you are presenting when you make it compulsory for students to submit work to something that is essentially for the detection of cheating. Ask yourself also, are you providing a robust programme of skills development for new students? And what impact is this extra assessment step going to have on staff marking time? In this paper I used Turnitin to provide instant formative feedback for students – but the process for getting there involved student and staff interviews (Rolfe 2011).
Using Turnitin formatively = assessment for learning.
So in rolling out Turnitin, I worked with students and staff through interviews and questionnaires to come up with the best solution. The end result was brilliant – students helped us evolve a ‘self-service’ approach to enable them to check their own work. They could submit draft assignments to Turnitin and review their own reports. Any passages of their writing that were matched (highlighted in the software’s report) would have to be rewritten. Of course, references and text in quotations WOULD be highlighted, so it was also a way of crudely checking that references were correct!
The added bonus of this idea was that students would be regularly submitting DRAFTS of work in progress. This is an important step for verifying the ownership of work and engagement in the assessment process. For Turnitin and SafeAssign you can alter the submission and report settings to facilitate their use in a variety of ways, e.g. ensuring student draft work is not submitted to the database and will not therefore be a match against their final work.
I hope this post is useful and can help inject some creativity and fun back into assessments. As Macdonald and Carroll (2006) conclude – we need to look at the causes and not just the symptoms. And that means also looking at our own teaching and assessment practice.
Presentation from 2009
Macdonald, R. and Carroll, J. (2006). Plagiarism – a complex issue requiring a holistic institutional approach. Assessment and Evaluation in Higher Education, 31 (2), 233-245.
McGowan, U. (2005). Plagiarism detection and prevention: Are we putting the cart before the horse? Proceedings of the HERDSA Conference.
Rolfe V. 2008. Powerpoint guide for staff and students to understand Turnitin reports. (The software version will have changed, but the principles may be helpful to you). PPT slides –> Understanding Turnitin Reports
Rolfe V. 2011. Can Turnitin be used to provide instant formative feedback? British Journal of Educational Technology, 42(4), pp.701-710. –> 10-108002602930500262536
Rolfe V. 2016. Powerpoint slides from a staff plagiarism workshop.
(Happy to run with your team!) PPT slides –> Plagiarism Workshop_Feb2016
- 29th March The HEA published a report on ‘fellowships and student engagement’, unpicking the impact of fellowships on teaching, which is a nice read. However, a line is included saying that students are more able to contact and interact with staff “outside of formal class hours/work on activities other than coursework”. This casual comment worried me, as it massages the expectation that staff are always available.
- 30th March Dave Cormier wrote about the resilience that students need to demonstrate to become successful learners. This seems a combination of emotional and academic resilience – what we might think of in the UK more in terms of transitions? This reminds me of Helen Beetham’s work on digital wellbeing.
- 31st March Times Higher Education article suggesting stress is not always a bad thing. There is no mention of institutional responsibility or support.
- 2nd April Frances Bell discussed the idea of institutional fragility and links to some further excellent writing on resilience and well being.
- Article on White Fragility
Keep sharing folks.
RefWorks is a simple, intuitive on-line application for organising references to articles that you may wish to use as part of a project or piece of research. Often, though, the difficult bits are exporting references or getting the information that you require. This is where Google Scholar can come in very handy, and the short video shows how I use Scholar and also another bibliographic database, PubMed, to get my ‘stuff’ into RefWorks.
Today there was another nail in the coffin of the UK Higher Education (HE) sector as we know it. I see the arena in which I have worked for 10 years undergo such policy changes that the underlying principles that drew me to work in HE are fading away. I didn’t sign up to a sector that cares more about league tables than learners. But perhaps because I’ve also worked in the private sector, I’m just not familiar with the comings and goings of education policy, infrastructure and investment. Perhaps this level of turbulence, poor economic management and lack of long-term vision is normal. In industry we used to say it was a “moveable feast”, which at least conjured welcome images.
Today the HEA announced the delay of the 2016 National Teaching Fellowship scheme, quite understandable really as a result of financial pressure and insecurity of the future of HEFCE. It is beginning to sound like the song “I know an old lady who swallowed a fly”. The implication of each announcement sends ripples through the community and I feel now we are a few steps away from swallowing the “horse”. Here are some other notable ripples.
Higher Education Academy
To become a self-sustaining organisation by 2017. Well, all I can say is that I wouldn’t be sitting here now if it wasn’t for the Higher Education Academy (HEA). I bring no baggage or memories of whoever or whatever was there before. I walked straight out of industry into a lecturer job and I knew nothing about students or the teaching of science. The HEA – subject centre for bioscience – offered an instant community. It was fast-food professional development – like pouring hot water onto your Pot Noodle and getting an instant meal – you attended the bioscience events and got an immediate sense of the profession and a set of skills and resources on which to build your career. I was also a participant in other subject centres, which was vital for observing approaches from other disciplines. You could pick the best thinking – the ‘designing out plagiarism’ materials from 2005 from the subject centre for ICT I still refer to today. But for me the Bioscience Subject Centre was the ‘Bombay Bad Boy’ of them all. The community, the expertise, the research methodology and skills. I wrote for the first time, I published in my first peer-reviewed journal and I presented at my first education conferences. I gained some funding. I got my first promotion to Senior Lecturer and then to Principal Lecturer. I have three national awards.
And then there was TechDis, who provided technological support and guidance to support the needs of all learners. The team of staff were employed by the HEA, and since January 1st 2015, TechDis has been languishing helplessly on the internet archive. That the government can think so little of the education requirements of the general population is beyond belief. Some 6.8% of undergraduate learners who decide to draw down Disability Funding Allowance (excluding postgraduates and part-time learners, and likely an underestimate) contribute to the diversity and richness of our learner population. That is still around 82,000 students in the UK. We commonly then also think of dyslexia – 10% of the population. So without even thinking too deeply about other differences and needs that our learners (and the always overlooked staff) may be experiencing, the idea that we do not invest centrally to support the full TechDis service and the specialist expertise therein is inconceivable.
It is deeply worrying to see the lack of visibility of the importance of diversity in universities across the board – gender imbalance, lack of BME participation. A quick Google search will direct you to a wealth of information, including material from HEFCE, Times Higher Education, ScienceGrrl.co.uk and Sciencecampaign.org.uk. Parts of the sector are toxic with discrimination. And the most staggering thing for me in the last week or so was a colleague saying that if a learner can’t spell by the age of 18 then it is tough – they are an adult. You can pick up professional skills and knowledge relatively easily with a bit of training and investment. However, once discriminatory attitudes become entrenched, I would think it hugely challenging to change the culture and recreate a positive environment once more.
And what about Cetis? Not that I understand much about technology but I know enough to appreciate you need standards of practice and uniform ways of working. Again, a small group who pack a big punch – they have provided the ‘brains’ behind educational approaches toward learning, assessment, diversity, new technologies and much more. Again I cannot understand why they are not supported through a little investment to recognise the essential work that they do. However I do believe they are presently holding their own and have become a sustainable organisation despite their transition in May 2015.
Jisc have weathered an immense storm and undergone dramatic organisational change, which in itself has to be an impressive success since the announcement of funding changes in 2013. However I do worry about what we are left with, and whether the education sector needs the newer commercially-focused organisation as opposed to the passionate and expert staff who responded to the technological needs of the sector. The annual Digifest seems to position Jisc as a technology broker, and through the lack of project funding and the loss of those valuable networking opportunities, the result is a feeling of disengagement from academic teams.
While I welcome the excellent online resources and the few networking opportunities that remain, there isn’t the support for digital innovation or professional development that we saw previously. I would think this will have a major impact on the competitiveness of the UK HE sector against global players. Jisc is nowhere to be seen in significant areas of activity, including participation in open education, and has been noticeably silent in responding to major challenges for the HE sector, including the Teaching Excellence Framework.
National Teaching Fellowship Scheme
The NTF scheme has awarded around 700 fellowships to outstanding education professionals – teachers, librarians and professional support staff. Very different to the UK Professional Standards Framework (UKPSF), which recognises four levels of professional competency, the NTF scheme is a reward for excellence based on an annual quota, attracting a small amount of funding for those awarded. These individuals are often the champions within their institutions, leading innovation and questioning decisions that impact on their colleagues or jeopardise the spirit of their learning community. In a time when there is practically no research or project funding to support education endeavours, the funds are a welcome lifeline for staff who otherwise would not be able to financially support projects or attend conferences and events. I dearly hope the announcement, now delayed until the spring, brings good news, but deep inside I don’t feel so optimistic.
So we’ve swallowed the horse.
But will we die?
By next year we will know not just the fate of the National Teaching Fellowship Scheme, but also that of the HEA and HEFCE. This period of lack of investment, and the prospect of establishing new professional organisations and regulators, will take its toll on our education sector – with ramifications not just for Higher Education but ripples across the schools, college, adult and community sectors. The challenges ahead:
- Rebuilding communities of practice and changing mindsets and cultures will take time and investment.
- New staff are reinventing the wheel and not sharing across their subject disciplines; they don’t know that a body of literature and practice reports is out there – and why should they?
- The loss of visibility of learner diversity in universities, and of support for it, could be significant.
- Lack of investment has pulled the rug from under our feet in terms of evidence-base – who is exploring urgent topical issues and informing the community?
- What university – even now – is going to prioritise investment for learning and teaching over discipline research? Show me one that is not already investing in the next REF.
- The poor curation of our literature-base and sector reports is a travesty and waste of public funds.
- The loss of networks will erase sector ‘memory’ of past work, people and practices.
There is room for optimism
At least we have social media, and new networks and communities can pop up in an instant. The Wednesday evening #LTHEchat on Twitter is one great example. However we cannot sustain a sector that increasingly relies on home-working and self-funding to participate. This maintains connections but does not fuel innovation.
Will the Teaching Excellence Framework bring optimism and provoke genuine change and recognition for learning and teaching in universities, or will it join the other key performance indicators on the league-table scrap heap? As observed in a keynote session at a national conference last year – do we need any more metrics, and aren’t there bigger problems to solve?
Add the shoe sizes of VCs into league tables! It would be just as accurate.
What can we do?
This is what I think – feel free to tweet me more ideas or respond in the comments box:
- We can respond to online e-petitions and provoke parliamentary debate.
- You can gather your communities – professional bodies, institutions, or even individuals – to blog, lobby and shout.
- You can respond to parliamentary committee inquiries individually or as a group.
- You can lobby your institutional leaders and decision makers.
- You can lobby your students!
- Form pedagogic communities within your institution.
And now for a poem
I would not by any means consider myself an expert because there are so many nuances and technicalities to writing good quality questions that do not discriminate, and that actually test knowledge and understanding and not just examination technique.
This guide is licensed under CC-BY-SA and I fully advocate people developing their own versions and examples with their programme and teaching teams.
Right-click the links to download them and have fun.
OMG. This Minion really does look like me. I shudder to think what I will actually look like when I am in charge. But you know what, I don’t care. I’ll be in charge and sorting out this unholy and ungodly mess.
I hate to turn out a blog post unsubstantiated and ill-referenced, but sometimes I do think plain and simple opinion is important.
I wonder at what point education innovation in the UK is going to entirely come to a halt. It can’t be far off if it hasn’t already done so.
I have spent an amazing day being trained alongside education researchers from around the UK. But we were peering over our vol-au-vents and thinking: well, this is all great, but there is no funding for me to do any of this.
What on earth is seriously happening to the UK Higher Education sector? Come on.
I can’t entirely blame the organisations involved, who previously dished out vast sums of money to support pedagogy and technology projects, training, networks and research. But I do look back feeling quite enraged at things like CETL (Centres for Excellence in Teaching and Learning) a few years back, which dished out some £350 million to the sector. WHAAAT? Around half of that investment is no longer visible or of any permanent use to the sector. But gosh, what we could do with a few pounds right now.
OK, so these organisations are busy licking their wounds, but the fact that they haven’t stood up with the HE sector to lobby for support is unfathomable. What seems to have happened instead is that almost instantly they underwent significant institutional change and reorganisation to reposition themselves as commercial ventures. OK, that may be impressive, but personally I feel utterly let down by them, and like many others, having worked on and supported work in the sector for decades, I feel utterly betrayed. In some quarters, entire repositories of educational materials have gone. Do prospective students and families realise what an absolute mess the whole thing is in?
So what is it with UK Higher Education? We are supposed to be players in a global market, to be widening our entry gates, to be ensuring students have grand employability opportunities, to be keeping up with the increasing and insane demands of the NHS and professional bodies (yes, I deal with three on one single course of 15 students per year). So really, how can we evaluate and address simple questions without money? I’m not even talking about full economic costing to buy me out for one hour a month – that is ridiculous. But to put it simply: 1) I DO need to feel some sense of value in what I am doing from my institution, 2) I need some fair reimbursement for the extensive time I put in (outside of work) working in YOUR HIGHER EDUCATION SECTOR, and 3) I wouldn’t mind the odd financial reimbursement for the training or the odd conference here and there that increasingly I pay for myself. I am 47 and am still renting a house. We talk about the NEW generation of people experiencing difficulties, but there is a whole generation who are still struggling.
But size of money isn’t everything
I’m not saying research investment is the be all and end all. I’ve worked in industry and know that throwing millions of pounds at projects does not necessarily produce inventive steps or life-changing results. When UK universities receive funding to produce results targeted toward certain outputs and impacts, that by definition cannot be robust research. Money increasingly goes to a more polarised body of institutions, and I hate to hear of money simply being wasted because an institution received so much and it is the end of the financial year. This is WASTEFUL to the UK overall. It knocks blue-sky research. It bashes creativity. We have a generation of researchers now (well, the ones with jobs) who think entirely about the outcomes of their work rather than what would be interesting, or what might happen if we combined these theories… innovation has come to a crashing halt.
OK so I moved from scientific research to education a few years ago. If about half of UK universities aren’t bothered with learning and teaching at all, so that the investment isn’t going to come from those institutions, then there is practically NO money for very fundamental EDUCATION research from anywhere else.
1 There is little money for education research and investment.
2 There is not much money to invest in developing education.
3 Nobody is interested in your child, or investing in what their needs are to gain a fruitful education.
I really can’t see how much of the UK is going to compete globally for very much longer
I absolutely do think a university education is tremendous. But it isn’t fair, it isn’t equal, and great parts of it do not work. I bite my tongue talking to prospective students and parents about education when I know the majority of my time and that of colleagues goes on administration: sorting out timetables, counting bus tickets, inputting data, verifying administration decisions, and spending a vast amount of time requesting rooms or car parking. I might say these things flippantly, but when students need accommodation then I am on the case. But I do think, why am I doing this when I absolutely know people who would be far better than me at doing these things? Please sir – we want to teach!!!! Don’t start me on workload administration – my 9-year-old niece could do this for me. Charmingly, one of the senior faculty members who introduced the scheme (which I do not condemn overall) did not even realise we inputted the same data year on year.
So what happens next?
OK so it is well established that academic hours are huge, but nobody really does anything about it. Academics have always worked long hours, but usually writing papers and because they are engrossed in their research. I work weekends to stick boring numbers into a workload system. Because I have to compile over 200 documents for an NHS programme review. A complete and utter waste of time in terms of real value given to students. And this brings the notion of ‘hyper stress’ – immense stress caused by the sheer volume of tasks – and things like data input, if you are tired, dyslexic or whatever, do take a huge amount of attention to detail and skill to undertake correctly.
This worries me totally. I’ve had an amazing day today at a research workshop in London, and I am tired of conversations that show we will do the research anyway. We all sat there thinking about the important work needing to be done and, with few exceptions – verified by coffee and lunchtime chats – well, we’ll do this anyway. We will work evenings and weekends to make sure the concerns we have about international students, widening participation, and making sure young people get the best out of their university experience… we will make sure these get addressed. Not because of any UK sector leadership that used to come from the HEA or Jisc, or from our institutions…
…but because of us.
Is this a form of torture?
The situation has become so bizarre that you start to think you are living some crazy dream. Is this a form of torture? I’m in my mid-40s and driven at times to work around the clock for a job I’m allocated 2 days a week for? Because I’m under-allocated on a system (despite having a fantastic boss), I still have to take on more. I’ve never been so physically ill in my life. But is this some joke? Is someone going to leap out from behind a lamp post and say, you silly thing?
Part of some big master plan!
Perhaps I should just admit defeat and stop caring. Is this what is intended? Am I supposed to just put all my PowerPoint slides on Blackboard and assess every student by multiple choice questions? Because frankly, with no money for innovation, and with no realistic look at what academic staff do, that is where we are.
Bring on the summer. A colleague and I have just finished interviewing for 3 student internship positions to work with us over the summer. We interviewed 7 students, from 1st years through to graduates, and have been absolutely blown away by the talent and entrepreneurship shown by our candidates. It’s left me thinking that their abilities and inherent skills stretch far beyond their undergraduate curricula. I’m questioning what we are doing at all to give young people the opportunity to explore, experiment and test their own abilities. University should be life-changing, after all?
Things that they are all doing?
Each without exception showed tremendous initiative and had looked for opportunities throughout their educational journey so far to be involved with projects. One candidate, at the age of 15, had surveyed his school peers about the economic and financial plight of his home country. Another had helped enhance her family’s florist business. Another exports second-hand textbooks to schools overseas that are in need. Another has a long history of working part-time in a fast food chain – not for the faint-hearted, surely – and spoke about seeing ‘through’ the job in order to collaborate and connect with his colleagues. These things were hidden and we had to tease them out, which was interesting; they were clearly not viewed as being immediately relevant. There is work to be done there clearly!
Personality traits they shared?
These are intuitive, opportunistic and dynamic people. They are willing to get stuck in and have a go. In fact, I think this project sounded quite dull from a scientist’s perspective – I really didn’t think it would attract anyone, as it aims to explore public and patient involvement in education and research. I’m a bit nervous about having to move something in our department in a new direction. Our mantra is that all students just want to work in laboratories, but I have got this hugely wrong. The advert did attract a large number of students, and every one had a sense of adventure. It intrigues me that we just don’t see this in the classroom. Work to be done here clearly also.
So what are we doing to nurture this?
Where are our free spaces for students to develop their own ideas, initiative and intuition? We have undergraduate projects and activities built into university processes, but this immediately creates a barrier and starts constraining and clamping down on ideas. We need to open things up, but how? As we move along the road of ‘embedding employability into the curriculum’ and ‘embedding entrepreneurship’, in my 10 years’ experience of higher education I know enough to realise this is the nail in the coffin. We see it all over: once something becomes embedded, a target, a metric, it loses all effect, and time is spent obsessing over the process and why we are 0.25% down on last year’s target.
However, these internship schemes are a real success, as are international exchanges (e.g. #DMUglobal), opportunities to work in local communities (e.g. DMU Frontrunner), and graduate futures awards (e.g. UWE Futures). I met a Graduate Associate from Solent University at OER15 who had been given ‘freedom’ to ‘go and do some stuff’ and came up with an astonishing open course to help introduce international students to university life before coming to the UK. These are all really good things.
If I ruled the world….
I would like to see education growing the individual and not just assessing the learning gained. There should be an ‘open module’ each year in which students can work on a campus based project; take time out to run a centre for student innovation; or take time out for a community or international initiative. And back to the original idea, shouldn’t we involve the public or relevant stakeholders in everything we teach?
Sorry, this is an ‘off the top of my head’ article – I’m sure there are many more great initiatives out there that I have missed. I can’t wait for the summer – not to sit on a beach or go and visit my friends Wendy and Mabel the donkeys at Weston-super-Mare, but to work with a group of amazing people.
...and I wanna play the game with you.
What do these items have in common?
Amazon gift vouchers.
A free iPad.
Tickets to the ball.
Free entry to see “Scouting for Girls”.
£10 printing credit.
You may be forgiven for thinking these items are about to feature in a 2015 version of the Generation Game (where a varied assortment of items was placed on a conveyor belt and contestants had to memorise them to win the prizes). No. These items signify the arrival of that good old time of year again. Roll up. Roll up. It is the National Student Survey.
GO ON I KNOW YOU WANT TO!
History of the survey
The National Student Survey (NSS) in the UK started in 2005 and asked university students in their 3rd (mostly final) year of study a series of questions relating to their experience on their chosen course (NSS, HEFCE 2005). The survey was intended to measure the ‘quality of higher education’. The questions have largely remained unchanged, and the most acclaimed – and most likely to bring vice-chancellors out in an uneasy sweat – is Q22: “overall, how satisfied are you with your course?” The survey has been modified over the years and now includes additional questions for NHS-based courses. The data presented is subject to a number of benchmarks and adjustments.
The survey was launched early on to a fair amount of resistance: critics questioned the poor style of analysis that bundles the data together and reduces the clarity of results, the reliance on broad and unspecific questions, and, quite worryingly on the back of this, the fact that universities subsequently make important strategic decisions based on the outcomes of the survey (Jumping through hoops on a white elephant, 2008).
Is experience becoming more important than education?
The survey, despite these early robust discussions, has not gone away – quite the contrary. Whilst the survey itself remains largely unchanged, the resourcing and time investment by institutions to ensure an effective process for collecting responses is now a significant activity in every academic calendar. It is part of the ‘business’ of Higher Education and feeds the sector’s growing appetite for key performance indicators, data, league tables, frameworks and benchmarks, as we all set to the business of ‘measuring’ what education is. Couple this volume of work with poor data management systems and the increasing need to deliver and duplicate this data in a variety of forms to other places, and it is not surprising that the administration and teaching teams within universities are under huge pressure and experience unhealthily long working hours (UCU 2014 Survey). As with other areas of the public sector, I’m sure we’d much rather spend our time educating young people and doing the job we were originally intended to do.
But what will happen next – some positive action, or collapse? I get a sense of a rise in scepticism across the sector, as reported in a previous blog post (Guinea pigs in a maelstrom, 2014), where at the Society for Research into Higher Education annual conference Bob Burgess and Jurgen Enders raised questions about league tables:
Aren’t there bigger problems to solve?
Like an academic arms race.
Add the shoe sizes of VC’s into league tables! Would be just as accurate.
I am optimistic we are on the verge of one brave institutional leader saying enough is enough.
So does it provide a useful view of the quality of education? What do people think?
I don’t doubt at all that prospective students and their families should be better informed about the performance of the institution to which they might be making a considerable and hefty commitment. However, are league tables the best way of doing it? I would question whether people read, or indeed understand, the increasing numbers of them. Having spent 12 years doing open days at three different universities, I can honestly say I cannot recall it being a subject of conversation once. University choice is about the gut feel of a place; it is about coming to an open day and meeting great existing students, academic and technical staff. If anyone makes a decision based on the position of a university on a table alone, they must be pretty mad.
But what do people really think, and, going back to the cuddly toy, are incentives wrong? Incentives are an established means in market research of improving response rates and the quality of responses to questionnaires. You might think they naturally bias responses toward being more favourable? Research shows this is not always the case, but the optimum incentive point must be found, otherwise the opposite can happen and respondents start to get pretty naffed-off. One way to minimise any bias would be to get students to complete the survey independently of their university – they complete the ‘Destination of Leavers from Higher Education’ (graduate employment) survey 6 months after leaving, so why not the NSS? Why not manage the survey centrally, via the students’ union, to relieve the burden on the teaching teams?
So what do students think of the survey? We don’t really know for sure, but one media article about it attracted a colourful range of comments:
Doing ourselves a favour by reviewing the university positively
We were pressed by tutors to answer certain questions in a particular way
Taxpayers deserve more open, fuller accountability by this sector because of the huge amounts now spent and the financial burden put on our young people.
Still, I remember my uni days fondly and would encourage anyone to seek out a uni experience and screw the untrustworthy rankings.
You can read the conversation for yourself. (BBC, Universities face survey warning, 2008).
What about the meaning of the data?
So what do we really know about the meaning of the data? Do we ever really sit and question it? All the data is openly retrievable, with datasets going back to 2005 on the HEFCE website (http://www.hefce.ac.uk/whatwedo/lt/publicinfo/nss/data/2008/ ). Here is the approach I took to looking at it in the first instance.
1) I downloaded all the Higher Education year datasets to an Excel spreadsheet.
2) I looked at the ‘registered’ data as opposed to the ‘taught’ data – responses attributed to the institution at which the student was registered, rather than to the institution where the majority of teaching may have taken place.
3) I manually corrected the variation in university names over the years, and included the latest name for those institutions that had been renamed.
4) I sorted the data by institution to allow for comparisons across each year.
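The steps above can be sketched in code. This is a minimal illustration using a tiny made-up table in place of the real HEFCE downloads – the column names, institution rename, and satisfaction figures here are all hypothetical, so adapt them to the actual files:

```python
import pandas as pd

# 1) Combined yearly datasets (faked inline here; in practice, read each
#    year's download and concatenate the frames).
data = pd.DataFrame({
    "institution": ["Leeds Metropolitan University", "University of Bristol",
                    "Leeds Beckett University", "University of Bristol"],
    "year": [2013, 2013, 2014, 2014],
    "population": ["registered", "registered", "registered", "taught"],
    "q22_satisfied_pct": [82, 90, 84, 91],
})

# 2) Keep only the 'registered' records (institution of registration,
#    not of teaching).
data = data[data["population"] == "registered"]

# 3) Normalise institution names that changed over the years, mapping
#    older names onto the latest one (example rename only).
renames = {"Leeds Metropolitan University": "Leeds Beckett University"}
data["institution"] = data["institution"].replace(renames)

# 4) Sort by institution and year so each institution's results line up
#    for year-on-year comparison.
data = data.sort_values(["institution", "year"]).reset_index(drop=True)
print(data)
```

With the rename applied, the two Leeds rows line up as one institution across 2013 and 2014, which is exactly what the year-on-year comparison needs.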
Work by Paula Surridge (http://www.bristol.ac.uk/esu/ug/nss/research.html ) informed the use of benchmarking, adjusting for subject, ethnicity, age, mode of study, gender and disability. It does not adjust for socio-economic group, which on the surface is rather surprising. In 2008 the benchmarking changed, so comparing to data prior to that is not terribly useful.
HOW SATISFIED ARE STUDENTS ACROSS THE UNIVERSITY SECTOR?
When I first plotted this out, arranging the HEIs in England alphabetically, I thought it looked pretty and rather interesting. My partner thought it looked like the German World Cup football strip. My statistician, to whom I gave the data blinded, observed “clearly some pattern and cyclical event going on”. I enlisted the help of a second statistician to analyse the data.
Performing an ANCOVA to compare year-on-year differences, there were significant differences between each year group, with the exception of 2013 to 2014. Each institution was incrementally better year upon year until 2013, when there was another benchmark change.
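To give a feel for this kind of year-on-year test, here is a simplified sketch – a plain one-way ANOVA on invented satisfaction percentages rather than the full benchmark-adjusted ANCOVA our statisticians ran. The cohort means are made up to mimic the pattern described: rising until 2013, then a plateau:

```python
import random
from scipy.stats import f_oneway

random.seed(0)

# Invented satisfaction percentages for 50 institutions per year.
scores_2012 = [random.gauss(80, 4) for _ in range(50)]
scores_2013 = [random.gauss(83, 4) for _ in range(50)]  # shifted upwards
scores_2014 = [random.gauss(83, 4) for _ in range(50)]  # plateau after 2013

# One-way ANOVA: is there any difference between the year groups overall?
f_stat, p_value = f_oneway(scores_2012, scores_2013, scores_2014)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

A significant overall result like this would then be followed by pairwise comparisons, which is where the 2013-vs-2014 exception would show up (their underlying means are the same here by construction).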
Conclusion? The data suggests students are more satisfied year on year with the HEIs in England. Or are the processes used to gather the data improving year on year?
HOW DO THE DIFFERENT UNIVERSITY GROUPINGS COMPARE?
Data was sorted according to Russell Group, Alliance University or other.
By sorting universities in England into their commonly referred-to groupings and extrapolating the dataset, the Russell Group clearly achieve higher satisfaction rates than the Alliance Universities and all others. The data does, however, show their rate of improvement slowing down. Should the benchmarking remain unchanged, by 2018 the Alliance Universities and the others will match the performance of the Russell Group. By 2023 the ‘others’ nose past the winning post, being the first to reach 100%.
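The extrapolation is just a straight-line fit projected forward. A back-of-the-envelope version, using invented group averages (not the real NSS figures), asks when the trend line crosses 100% satisfaction:

```python
import numpy as np

# Invented yearly averages for the 'others' grouping (perfectly linear
# here for illustration; real data would be noisier).
years = np.array([2010, 2011, 2012, 2013, 2014])
others_avg = np.array([78.0, 80.5, 83.0, 85.5, 88.0])

# Fit a straight line and solve for the year at which it hits 100%.
slope, intercept = np.polyfit(years, others_avg, 1)
year_at_100 = (100 - intercept) / slope
print(f"Trend reaches 100% satisfaction around {year_at_100:.0f}")
```

The absurdity of the projection is rather the point: a satisfaction measure that trends inexorably toward 100% for every grouping says more about the measure than about education.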
Improvement in satisfaction or processes?
It is not clear what the nature of these observations really is, and from all my discussions of the data there are some interesting hypotheses. I hope this article prompts some serious data analyst to interrogate the datasets more fully. I have done the same analysis on other questions – and there are interesting differences with those also.
But do we see this elsewhere? In the 2014 REF the sector seemed to incrementally improve, with suggestions that the evaluation is flawed (The Guardian, REF 2014), and whilst we cannot doubt the amazing and outstanding work that does go on in UK universities, we could say that this too was due to improvements in the system.
One thought that has come up a few times is that we are dealing with a system that is corrupt – and it is human nature that if you set us targets and measure our performance, we will work to comply with those targets. We all know that is exactly what happens – how else do we spend much of our academic time?
Donald Campbell was a social scientist who observed this in his writings in 1976:
The more any quantitative social indicator (or even some qualitative indicator) is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.
Charles Goodhart, an economist, made similar observations in 1975, now known as ‘Goodhart’s Law’:
When a measure becomes a target, it ceases to be a good measure.
Where do we go from here?
I do think the HE sector needs to take a good look at itself and fully understand the series of measures and targets against which our performance is increasingly evaluated (research, student satisfaction, teaching performance). If we do persist in having monitoring systems, they have to be run effectively. We are distracting academic staff from doing their jobs, and the pressure on teams to get good results, as we’ve seen with the REF, can develop a sinister side (The Guardian, REF 2014). Higher education is not a theme park delivering a jolly experience. It should be a nurturing and at times challenging one, enabling learners to develop and achieve meaningful goals, and I would therefore be quite happy if my students were at times left a little unsatisfied because I had chosen to stretch their thinking and their approaches.
“Good game. Good game”.