Other Voices: The Living Museum, Dahomey, and the ethics of AI

I know it’s a crowded field, but I came across an AI / open data development recently that really made me stop and take a breath.

The Living Museum introduces itself as follows:

If the artifacts in museums could talk, what would you say to them? Would you ask about their origins, or what life was like back in their eras? Or would you simply listen to their stories?

Created by an independent developer, Jonathan Talmi, The Living Museum is an experimental AI interface that uses content from the British Museum’s openly licensed digital collections database to enable users to curate personalised exhibits and “talk” to individual artefacts about their history and origins. The developer is unaffiliated with the British Museum and makes it clear that the data is used under the terms of the CC BY-NC-SA licence.
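To give a concrete sense of how an interface like this might be wired together, here is a minimal, hypothetical sketch in Python of turning an openly licensed collection record into a “persona” prompt for a language model. The field names and the `build_persona_prompt` function are my own illustrative assumptions, not the British Museum’s actual data schema or Talmi’s implementation.

```python
# Hypothetical sketch: composing an artefact "persona" prompt from an
# openly licensed collection record. The field names below are
# illustrative assumptions, not the British Museum's actual schema.

def build_persona_prompt(record):
    """Return a system prompt asking a language model to answer as the artefact."""
    return (
        f"You are the {record['title']} ({record['culture']}, {record['date']}).\n"
        f"Provenance: {record['provenance']}.\n"
        "Answer questions about your history and origins in the first person.\n"
        "If asked about contested provenance or repatriation, step out of the "
        "persona and answer factually.\n"
        "Collection data reused under the CC BY-NC-SA licence."
    )

# An invented example record, not a real database entry.
example_record = {
    "title": "bronze ritual vessel",
    "culture": "Shang dynasty",
    "date": "c. 1200 BC",
    "provenance": "acquisition history unrecorded",
}

prompt = build_persona_prompt(example_record)
print(prompt)
```

In a real app a prompt like this would be sent to a hosted model alongside each user question; the ethical questions raised about the project apply regardless of how the plumbing is arranged.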

In an introductory blog post, Talmi says:

I hope this project demonstrates that technology like AI can increase immersion, thereby improving educational outcomes, without sacrificing authenticity or factuality.

The app was launched on the Museums Computer Group mailing list and Twitter a couple of weeks ago, and it was met with a generally favourable response. However, there were some dissenting voices from curators, art historians, and authors, who pointed out the problematic nature of imposing AI-generated voices on artefacts of deep spiritual and cultural significance, whose presence in the BM’s collections is hugely contested.

Others questioned the macabre ethics of foisting an artificial voice on actual human remains, such as the museum’s collection of mummies. I had a surreal conversation with the mummy of Cleopatra, who died in Thebes aged 17 during the reign of Trajan. It was a deeply unsettling experience.

This is where “authenticity and factuality” were both sacrificed…

The response actually acknowledges the disrespectful and ethically questionable nature of the whole project. My head was starting to melt at this point.

Pressing the question of repatriation prompts the voice to “step out of the artificial artifact persona”…

The whole experience was as surreal as it was disturbing

There was also criticism from some quarters that the developer had “exploited” the work of professional curators by using the British Museum’s data set without their explicit knowledge or permission. It’s important to note that the CC BY-NC-SA licence does explicitly allow anyone to use the British Museum’s data within its terms; however, just because the licence says you can, doesn’t necessarily mean you should. When it comes to reusing open content, the licence is not the only thing that should be taken into consideration. This is one of the key points raised by the Ethics of Open Sharing working group commissioned by Creative Commons in 2021 and led by Josie Fraser. The report of the working group acknowledges that not everything should be shared openly, and highlights issues relating to cultural appropriation:

Ethical open sharing may require working in partnership with individuals, communities and groups and ensuring their voices are heard and approaches respected. While in some cases openly sharing resources can help to promote cultural heritage and redress gaps in knowledge, in others it may be experienced as cultural insensitivity, disrespect or appropriation — for example, in relation to sacred objects or stories and funerary remains.

Something that both the British Museum and developers using its digital collections should perhaps consider. 

By coincidence, The Living Museum launched just as Mati Diop’s film Dahomey, winner of the Berlin Film Festival’s Golden Bear award, was released. Dahomey also gives a voice to sacred cultural artefacts: a collection of looted treasures being repatriated from France to the former kingdom of Dahomey, in present-day Benin. In Diop’s absorbing and hypnotic film the power figure of the Dahomeyan king Ghezo speaks in Fon, his voice disembodied and electronically modified.

 
In an interview with Radio 4’s Screenshot (23:20), Diop spoke eloquently about “the violence of the absence of the artefacts from the African continent.”

“These artefacts are not objects, they have been objectified by the Western eye, by the colonial perspective, locked into different stages, art objects, ethnographic objects, even locked into beauty.”

“To me it was immediate to give back a voice to these artefacts because I felt that the film is what restitution is about, which is giving back a voice, which is giving back a narrative, a perspective. The film tries to embody the meaning of restitution.”

I was lucky enough to see Dahomey at the GFT, accompanied by a conversation with Giovanna Vitelli, Head of Collections at The Hunterian, and Dr Christa Roodt and Andreas Giorgallis of the University of Glasgow. The Hunterian is just one of a number of museums interrogating the harms perpetuated by their colonial legacy, through its Curating Discomfort intervention. The conversation touched on power, control and sacredness, with Vitelli noting:

“Possession means power. We, the museums, hold the power, and control the power of language.  The film speaks powerfully about voices we in the global north do not hear.” 

I’ve written in the past about the importance of considering whose voices are included in and excluded from open spaces and the creation and curation of open knowledge. On the surface it may appear that AI initiatives facilitated by the cultural commons, like The Living Museum, have the potential to bring collections to life and give a voice to marginalised subjects; however, it’s important to question the authenticity of those voices. By imposing inauthentic AI-generated voices on culturally sensitive artefacts there is a serious risk of perpetuating exploitative colonial legacies and racist ideology, rather than addressing harms and increasing knowledge equity. Something for us all to think about.

Open Education and AI: Proselytisers, prophets and poets.

I’ve been dipping my toes back into the debate about open education and AI over the last few weeks. I stepped back from this space earlier in the year, both for personal reasons and because I was getting a bit dispirited by the signal-to-noise ratio. It’s still a very noisy space, more so if anything, but there are some weel-kent voices emerging that are hard to ignore.

David Wiley laid out his stall last month in the webinar Why Open Education Will Become Generative AI Education, and his views have been predictably polarising. There have already been several thoughtful responses to David, which I can highly recommend reading.

I don’t want to repeat the very pertinent points that have already been made, but I do want to add my concerns about the starting point of David’s argument, which is:

“the primary goal of the open education movement has been to increase access to educational opportunities. The primary strategy for accomplishing this goal has been to increase access to educational materials. And the primary tactic for implementing this strategy has been to create and share OER.” 
~ Why Generative AI Is More Effective at Increasing Access to Educational Opportunity than OER

This is certainly one view of the open education movement (which is by no means a homogeneous entity), but open education isn’t just about goals, strategies and tactics; there are other perspectives that need to be taken into consideration. I find this content-centric view of open education a bit simplistic and reductive, and I had hoped that we’d moved on from this by now. I would suggest that the primary purpose of open education is to improve knowledge equity, support social justice, and increase diversity and inclusion. While content and OER have an important role to play, the way to do this is by sharing open practice.

This slide in particular made me pause…

Leaving aside the use of the Two Concepts of Liberty, which is not unproblematic, I’m presuming “users” equates here to teachers and learners, which is a whole other topic of debate. It’s certainly true that open licences alone don’t grant the skills and expertise needed to engage in “high-demand revise and remix activities”, but I’m not sure anyone ever claimed they did? And yes, GenAI could be a way to provide users with these skills, but at what cost? There’s little discussion here about the ethical issues of copyright theft, algorithmic bias, exploitation of labour, and the catastrophic environmental impact of AI. Surely a more responsible and sustainable way to gain these skills and expertise is to connect with other teachers and learners, other human beings, and to share our pedagogy and practice? While there’s a certain logic to David’s hypothesis, it doesn’t take into account the diversity of practice that can make open education so empowering.

Aside from the prediction that Generative AI Education will save / replace / supersede OER, I couldn’t help feeling that there is still an underlying assumption that OER = open textbooks. (This was also an issue I had with one of the keynotes at this year’s OER24 Conference.) It shouldn’t need saying, but there are myriad kinds of open resources above and beyond open textbooks. What about student co-created OER, for example? It’s through the process of creation, of gathering information, of developing digital and copyright literacy skills, of formulating knowledge and understanding, that learning takes place. The OER, the content created, is a valuable, tangible output of that process, but it’s not the most important thing. If we ask GenAI to produce our OER, what happens to the process of learning by doing, creating and connecting with other human beings?

This issue was touched on by Maren Deepwell and Audrey Watters in the most recent episode of Maren’s brilliant Leading Virtual Teams podcast. It’s been really inspiring to see Audrey re-enter the fray of education technology criticism. We need her clear, incisive voice and fearless critique now more than ever.

Touching on the language we use to talk about AI, Audrey reminded us that “Human memory and computer memory are not the same thing.” And in her The Extra Mile newsletter she says:

“I do not believe that the machine is or can be “intelligent” in the way that a human can. I don’t think that generative AI and LLMs work the same way my mind does.” 

This very much called to mind Helen Beetham’s thoughtful perspective on ethics and AI at the ALT Winter Summit last year where she said that “generative”, “intelligence”, and “artificial” are all deeply problematic concepts.  

“Every definition is an abstraction made from an engineering perspective, while neglecting other aspects of human intelligence.”

Towards the end of the podcast, Maren and Audrey talked about the importance of the embodied nature of being and learning, how we tap into such a deep well of embodied knowledge when we learn. It’s unthinkable to outsource this to AI, for the simple reason that AI is stupid. 

The embodied human nature of learning was also the theme of Marjorie Lotfi’s beautiful six-part poem, Interrogating Learning, commissioned by Edinburgh Futures Institute for the inaugural event of their Learning Curves Future of Education series. Marjorie weaves together the voices of displaced women and, I believe, speaks more deeply about what it means to learn than any disembodied “artificial intelligence” ever could. 

What have you learned?

When asked this question how will a woman answer?

For a moment she’s back in her mother’s belly
a heart beating out a rush of cortisol
or a warm dream of sleep listening through a barrier of skin and blood
before even her own first breath.

And then the day she’s born
blinking at the bright of daylight, candle, bulb,
hearing the low buzz of electric
and the sudden clarity of a voice she knows already.
Learning it again.

There have been a thousand things to learn in every day I’ve been alive,
the woman thinks,
and I am 53 this year.

(Please listen to Marjorie reading the six parts of Interrogating Learning in the video below.)

 

OER24: Gathering Courage

Hands of Hope, Cork, CC BY, Lorna M. Campbell

Last week the OER24 Conference took place at the Munster Technological University in Cork and I was privileged to go along with our OER Service intern Mayu Ishimoto. 

The themes of this year’s conference were: 

  • Open Education Landscape and Transformation
  • Equity and Inclusion in OER
  • Open Source and Scholarly Engagement
  • Ethical Dimensions of Generative AI and OER Creation
  • Innovative Pedagogies and Creative Education

The conference was chaired with inimitable style by MTU’s Gearóid Ó Súilleabháin and Tom Farrelly, the (in)famous Gasta Master.

The day before the conference I met up with a delegation of Dutch colleagues from a range of sectors and organisations for a round-table workshop on knowledge equity and open pedagogies. In a wide-ranging discussion we covered the value proposition and business case for open, the relationship between policy and practice, sustainability and open licensing, student engagement and co-creation, authentic assessment, and the influence of AI. I led the knowledge equity theme and shared experiences and case studies from the University of Edinburgh. Many thanks to Leontien van Rossum from SURF for inviting me to participate.

A Cautionary Fairy Tale

The conference opened the following day with Rajiv Jhangiani’s keynote, “Betwixt fairy tales & dystopian futures – Writing the next chapter in open education”, a cautionary tale of a junior faculty member learning to navigate the treacherous path between commercial textbook publishers on the one hand and open textbooks on the other. It was a familiar tale to many North American colleagues, though perhaps less relatable to those of us from UK HE, where the model of textbook use is rather different, OER expertise resides with learning technologists rather than librarians, OER tends to encompass a much broader range of resources than open textbooks, and open resources are as likely to be co-created by students as authored by staff.

However, Rajiv did make several points that were universal in their resonance. In particular, he pointed out that it’s perverse to use the moral high ground of academic integrity to defend remote proctoring systems that invade student privacy, and tools that claim to identify student use of AI, when these companies trample all over copyright and discriminate against ESL speakers. If we create course policies that are predicated on mistrust of students, we have no right to criticise them for being disengaged. Rajiv also cautioned against using OER as a band-aid to cover inequity in education; it might make us feel good, but it distracts us from reality. Rajiv called for ethical approaches to education technology, encouraging us not to be distracted by fairy tales, but to engage with hope and solidarity while remaining firmly grounded in reality.

Rajiv Jhangiani, OER24, CC BY Lorna M. Campbell.

Ethical Dimensions of Generative AI and OER Creation

Generative AI (GAI) loomed large at the conference this year and I caught several presentations that attempted to explore the thorny relationship between openness and GAI. 

UHI have taken a considered approach by developing policy, principles, and staff- and student-facing guidance that emphasises ethical, creative, and environmentally aware use of generative AI. They are also endorsing a small set of tools that provide a range of functionality and stand up to scrutiny in terms of data security. These include MS Copilot, Claude, OpenAI ChatGPT, Perplexity, Satlas and Semantic Scholar. Keith Smyth, Dean of Learning & Teaching at UHI, outlined some of the challenges they are facing, including AI and critical literacy, tensions around convenience and creation, and the relationship between GAI and open education. How does open education practice sit alongside generative AI? There are some similarities in terms of ethos; GAI repurposes, reuses, and remixes resources, but in a really selfish way. To address these ambiguities, UHI are developing further guidance on GAI and open education practice and will try to foster a culture that values and prioritises sharing and repurposing resources as OER.

Patricia Gibson gave an interesting talk about “Defending Truth in an Age of AI Generated Misinformation: Using the Wiki as a Pedagogical Device”. GAI doesn’t know about truth; it is designed to generate the most plausible response from the available data, and if it doesn’t have sufficient data, it simply guesses or “hallucinates”. Patricia cautioned against letting machines flood our information channels with misinformation and untruth. Misinformation creates inaccuracy and unreliability and leads us to question what truth is. However, awareness of GAI is also teaching us to question images and information we see online, enabling us to develop critical digital and AI literacy skills. Patricia went on to present a case study about Business students working collaboratively to develop wiki content, which echoed many of the findings of Edinburgh’s own Wikipedia in the curriculum initiatives. This enabled the students to co-create collaborative knowledge, develop skills in sourcing information, curate fact-checked information, engage in discussion and deliberation, and counter misinformation.

Interestingly, the Open Data Institute presented at the conference for what I think may be the first time. Tom Pieroni, ODI Learning Manager, spoke about a project to develop a GAI tutor for use on a Data Ethics Essentials course: Generative AI as an Assistant Tutor: Can responsible use of GenAI improve learning experiences and outcomes?

CC BY SA, Tom Pieroni, Open Data Institute

One of the things I found fascinating about this presentation was that while there was some evaluation of the pros and cons of using the GAI tutor, there was no discussion about the ethics of GAI itself. Perhaps that is part of the course content? One of the stated aims of the Assistant AI Tutor project is to “Explore AI as a method for personalising learning.” This struck me because earlier in the conference someone, sadly I forget who, had made the sage comment that all too often technology in general and AI in particular effectively remove the person from personalised learning.

Unfortunately I missed Javiera Atenas and Leo Havemann’s session on A data ethics and data justice approach for AI-Enabled OER, but I will definitely be dipping in to the slides and resources they shared. 

Student Engagement and Co-Creation

Leo Havemann, Lorna M. Campbell, Mayu Ishimoto, Cárthach Ó Nuanáin, Hazel Farrell, OER24, CC0.

I was encouraged to hear a number of talks that highlighted the importance of enabling students to co-create open knowledge as this was one of the themes of the talk that OER Service intern Mayu Ishimoto and I gave on Empowering Student Engagement with Open Education. Our presentation explored the transformative potential of engaging students with open education through salaried internships, and how these roles empower students to go on to become radical digital citizens and knowledge activists. There was a lot of interest in Information Services Group’s programme of student employment and several delegates commented that it was particularly inspiring to hear Mayu talking about her own experience of working with the OER Service.  

Open Education at the Crossroads

Laura Czerniewicz and Catherine Cronin opened the second day of the conference with an inspiring, affirming and inclusive keynote, The Future isn’t what it used to be: Open Education at a Crossroads (OER24 keynote resources). Catherine and Laura have the unique ability to be fearless and clear-sighted in facing and naming the crises and inequalities that we face, while never losing faith in humanity, community and collective good. I can’t adequately summarise the profound breadth and depth of their talk here; instead I’d recommend that you watch their keynote and read their accompanying essay. I do want to highlight a couple of points that really stood out for me though.

Laura pointed out that we live in an age of conflict, where the entire system of human rights is under threat. The early hope of the open internet is gone; a thousand flowers have not bloomed. Instead, the state and the market control the web, Big Tech is the connective tissue of society, and the dominant business model is extractive surveillance capitalism.

AI has caused a paradigmatic shift, and there is an irony around AI and open licensing: by giving permission for re-use, we are giving permission for potential harms, e.g. facial recognition software being trained on openly licensed images. Copyright is in turmoil as a result of AI, and we need to remember that there is a difference between what is legal and what is ethical. We need to rethink what we mean by open practice when GAI is based on free extractive labour. Having written about the contested relationship of invisible labour and open education in the past, this last point really struck me.

HE for Good was written as an antidote to these challenges.  Catherine & Laura drew together the threads of HE for Good towards a manifesto for higher education and open education, adding:

“When we meet and share our work openly and with humility we are able to inspire each other to address our collective challenges.”

CC BY NC, Catherine Cronin & Laura Czerniewicz, OER24

Change is possible they reminded us, and now is the time.  We stand at a crossroads and we need all parts of the open education movement to work together to get us there.  In the words of Mary Robinson, former President of Ireland, former UN High Commissioner for Human Rights, and current Chair of the Elders:

“Our best future can still lie ahead of us, but it is up to everyone to get us there.”  

Catherine Cronin & Laura Czerniewicz, OER24, CC BY, Lorna M. Campbell.

The Splintering of Social Media

One theme that emerged during the conference is what Catherine and Laura referred to as the “splintering of social media”, with a number of presenters exploring the impact this has had on open education community and practice. This splintering has led people to seek new channels to share their practice, with some turning to the fediverse, podcasting and internet radio. Blogging didn’t seem to feature quite as prominently as a locus for sharing practice and community, but it was good to see Martin Weller still flying the flag for open ed blogging, and I’ve been really encouraged to see how many blog posts have been published reflecting on the conference.

Gasta! 

The Gasta sessions, overseen by Gasta Master Tom Farrelly, were as raucous and entertaining as ever. Every presenter earned their applause and their Gasta! beer mat. It seems a bit mean to single any out, but I can’t finish without mentioning Nick Baker’s Everyone’s Free..to use OEP, to the tune of Baz Luhrmann’s “Everybody’s Free (To Wear Sunscreen)”, Alan Levine’s Federated, and Eamon Costello’s hilarious Love after the algorithm: AI and bad pedagogy police. Surely the first time an OER Conference has featured Jon Bon Jovi sharing his thoughts on the current state of the pedagogical landscape?!

Eamon Costello, Jon Bon Jovi, Tom Farrelly, Alan Levine, OER24, CC BY, Lorna M. Campbell

The closing of an OER Conference is always a bit of an emotional experience, and this year more so than most. The conference ended with a heartfelt standing ovation for open education stalwart Martin Weller, who is retiring and heading off for new adventures, and a fitting and very lovely impromptu verse of The Parting Glass by Tom. Tapadh leibh a h-uile duine agus chì sinn an ath-bhliadhna sibh! (Thank you everyone, and we’ll see you next year!)

Martin Weller, Tom Farrelly, Gearóid Ó Súilleabháin, CC BY, Lorna M. Campbell, OER24.

* The title of this blog post is taken from this lovely tweet by Laura Czerniewicz.

OER24 Conference: Empowering Student Engagement with Open Education

This week I’m looking forward to travelling to Cork with OER Service intern Mayu Ishimoto for the OER24 Conference. The conference is being hosted by Munster Technological University this year and chaired by Gearóid Ó Súilleabháin and Tom Farrelly. The theme this year is digital transformation in education, and Mayu and I will be presenting a research paper on Empowering Student Engagement with Open Education.

At the University of Edinburgh, student engagement is a fundamental aspect of our strategic support for OER and open education and our institutional commitment to digital transformation. As part of Information Services Group’s programme of student employment, the university’s OER Service and Online Course Production Service regularly employ student interns in a number of roles including Open Content Curators, OER support officers, media studio assistants, and open textbook co-creators. These roles enable students to gain a wide range of core competencies and transferable attributes, including digital and information literacy skills, which open the door to new careers and employment opportunities, while also providing the opportunity to develop open practice and digital competence, and improve knowledge equity.

Our research paper will explore the transformative potential of engaging students with open education through salaried internships, exploring how these roles empower students to go on to become radical digital citizens and knowledge activists, not just passive consumers of information, but active and engaged creators of open knowledge.   We will also provide guidance on how other institutions can adopt and adapt this model to engage students with open education and transform their digital skills.

2023 End of Year Reflection

Posting an end of year round up at the end of January might seem a bit daft, but I’m already one step ahead of last year, when I posted my end of year reflection in February! 

The beginning of the year was a succession of real highs and lows. UCU entered a long phase of industrial action, which came at a particularly challenging time for me, as January and February are usually when I’m preparing for Open Education Week and the OER Conference. However, I also took some time out for a trip to New York with friends, which turned out to be one of the high points of my year.

Open Education Week

For Open Education Week we ran a webinar that celebrated 10 years of open course development at the University of Edinburgh and shared the open course creation workflow that we’ve developed and refined over the years. 

 

OER23 Conference

It was great to see the OER Conference returning to Scotland in March, when it was hosted by UHI in Inverness. Inverness is a place that is very close to my heart: it’s the main city in the Highlands, and it’s also where we used to go on holiday when I was a kid. Inverness is still a stopping-off point on the journey home when I go to visit family in Stornoway, so I had a slightly weird feeling of nostalgia and homesickness while I was there; it was odd being in Inverness and not travelling on further north and west.

One of the themes of this year’s conference was Open Scotland +10, and Joe Wilson and I ran a number of sessions, including a pre-conference workshop and closing plenary, to reflect on how the open education landscape in Scotland has evolved over the last decade, and to discuss potential ways to advance open education across all sectors of Scottish education.

Open Scotland closing plenary panel at the OER23 Conference, by Tim Winterburn.

Generative AI

Like many working in technical, educational and creative sectors, I found it impossible to ignore the discourse around generative AI, though I hope I managed to avoid getting swept up in the hype and catastrophising. In July I wrote an off-the-cuff summary of some of the many ethical issues related to generative AI and LLMs that are becoming increasingly hard to ignore: Generative AI – Ethics all the way down. I appreciated having an opportunity to revisit these issues again at the end of the year when I joined the ALT Winter Summit on Ethics and Artificial Intelligence, which provided much food for thought. Helen Beetham’s keynote, Whose Ethics? Whose AI? A relational approach to the challenge of ethical AI, was particularly thoughtful and thought-provoking.

Student Interns

Much of the summer was taken up with recruiting and managing our Open Content Curator student interns.  It’s always a joy working with our interns, their energy and enthusiasm is endlessly inspiring, and this year’s interns, August and Mayu, were no exception. I suggested it might be fun for them to interview each other about their experience of working with the OER Service and, with the help of our fabulous Media Team, they produced this lovely video. 

 

I was delighted when August and Mayu were shortlisted for the Student Employee of the Year Award in Information Services Group’s Staff Recognition Awards, in acknowledgement of their outstanding work with the OER Service and their wider contribution to ISG and the University. 

Their Finest Hour

The OER Service welcomed another student intern in the summer, Eden Swimer, who joined us to help run a digital collection day as part of Their Finest Hour, a National Lottery Heritage funded project at the University of Oxford which is collecting and preserving the everyday stories and objects of the Second World War. Organising and running the digital collection day proved to be a huge undertaking, and we couldn’t have done it without the help of 26 volunteers from across ISG and beyond who committed so much time and energy to the project.

 

The digital collection day took place in Rainy Hall, New College, at the end of November and it was a huge success. Over 100 visitors attended, and volunteers recorded over 50 interviews and took thousands of photographs, all of which will be uploaded to an open licensed archive that will be launched by the University of Oxford in June this year. It was a deeply moving event; many of the stories recorded were truly remarkable, and the visitors clearly appreciated having the opportunity to share their families’ stories. In some cases these stories were being told by the last surviving relatives of those who had witnessed the historic events of WW2, and there was a real sense of preserving their experiences for posterity.

Their Finest Hour digital collection day by Fiona Hendrie

The collection day was covered by STV and you can see a short clip of their news item here: Second World War memories to be preserved at university collection day

Publications

It was a privilege to work with co-authors Frances Bell, Lou Mycroft, Giulia Forsythe and Anne-Marie Scott to contribute a chapter on the “FemEdTech Quilt of Care and Justice in Open Education” to Catherine Cronin and Laura Czerniewicz’s timely and necessary Higher Education for Good: Teaching and Learning Futures.

“Quilting has always been a communal activity and, most often, women’s activity. It provides a space where women are in control of their own labour: a space where they can come together to share their skill, pass on their craft, tell their stories, and find support. These spaces stand outside the neoliberal institutions that seek to appropriate and exploit our labour, our skill, and our care. The FemEdTech-quilt assemblage has provided a space for women and male allies from all over the world to collaborate, to share their skills, their stories, their inspiration, and their creativity. We, the writers of this chapter, are five humans who each has engaged with the FemEdTech Quilt of Care and Justice in Open Education in different ways, and who all have been active in the FemEdTech network.” 

I was also invited to submit a paper to a special open education practice edition of Edutec Journal.  Ewan McAndrew, Melissa Highton and I co-authored a paper on “Supporting open education practice: Reflective case studies from the University of Edinburgh.”

“This paper outlines the University of Edinburgh’s long-running strategic commitment to supporting sustainable open education practice (OEP) across the institution. It highlights how the University provides underpinning support and digital capability for OEP through central services working with policy makers, partners, students, and academics to support co-creation and active creation and use of open educational resources to develop digital literacy skills, transferable attributes, and learning enhancement. We present a range of case studies and exemplars of authentic OEP evidenced by reflective practice and semi-structured ethnographic interviews, including Wikimedia in the Curriculum initiatives, open textbook production, and co-creation of interdisciplinary STEM engagement resources for schools. The paper includes recommendations and considerations, providing a blueprint that other institutions can adopt to encourage sustainable OEP. Our experience shows that mainstreaming strategic support for OEP is key to ensuring inclusive and equitable quality education and promoting lifelong learning opportunities for all.”

Writing this paper was an interesting experience, as Edutec is a research journal that expects evidence to be presented in a very particular way. As a service division, we support practice rather than undertaking academic research, so the case studies we present are based on authentic reflective practice rather than empirical research; however, it was useful to think about this practice from a different perspective.

Wikimedia UK

In July I was awarded Honorary Membership of Wikimedia UK in recognition of my contribution to the work of the charity during my six years as a Trustee. When my term as a trustee came to an end, I was hoping that I’d have more time to contribute to the Wikimedia projects. That hasn’t quite happened; I didn’t manage to do any Wikipedia editing in 2023, but I did enjoy taking part in Wiki Loves Monuments again. I also digitised some pictures I took of the Glasgow Garden Festival way back in 1988 and uploaded them to Wikimedia Commons to share them with the fabulous After the Garden Festival project, which is attempting to locate and archive the legacy of the festival.

Teddy Bears Picnic, sponsored by Moray District Council. CC BY, Lorna M. Campbell on Wikimedia Commons.

ALT

I made a short-lived trip to the ALT Conference in Warwick in September. Unfortunately I had to leave early as I came down with a stinking cold. I was really disappointed to have to miss most of the conference, as it was outgoing CEO Maren Deepwell’s last event and I was also due to receive an Honorary Life Membership of ALT award. It was a huge honour to receive this award, as ALT has been a significant part of my professional life for over two decades now. You can read my short reflection on the award here: Honorary Life Membership of ALT.

For almost three decades Lorna has been a champion of equitable higher education and an open education activist. Lorna’s lifelong commitment to and passion for equality and diversity clearly is evident in her work, yet Lorna tends not to push herself forward and celebrate – or even self-acknowledge – her many achievements.
ALT press release.

Kenneth White, 1936 – 2023

I was deeply saddened to hear of the death of Kenneth White in August.  Despite being an avid reader of Scottish poetry, and having studied Scottish Literature at Glasgow University for a couple of years, I hadn’t come across White until my partner introduced me to him in 2002.  His absence from Glasgow’s curriculum, and indeed his relative obscurity in his homeland, is striking given that he was a graduate of Glasgow University who went on to become the chair of 20th century poetics at Paris-Sorbonne. White, however, has always been a writer who divides the critics, particularly in Scotland. A poet, writer, philosopher, traveller, and self-identified transcendental Scot, White founded the International Institute of GeoPoetics and was a regular visitor to the Edinburgh International Book Festival, where I was fortunate to see him read.  To say that White’s writing, particularly his meditations on openness and the Atlantic edge, had a profound effect on me, is something of an understatement. This blog is named after the title of White’s collected poetic works and his lines frequently find their way into more unguarded pieces I’ve written.  I’ll leave you with a few words from the man himself. 

Image of the coast with the words of Scotia Deserta by Kenneth White.

ALT Winter Summit on Ethics and Artificial Intelligence

Last week I joined the ALT Winter Summit on Ethics and Artificial Intelligence. Earlier in the year I was following developments at the interface between ethics, AI and the commons, which resulted in this blog post: Generative AI: Ethics all the way down. Since then, I’ve been tied up with other things, so I appreciated the opportunity to turn my attention back to these thorny issues. Chaired by Natalie Lafferty, University of Dundee, and Sharon Flynn, Technological Higher Education Association, both of whom have been instrumental in developing ALT’s influential Framework for Ethical Learning Technology, the online summit presented a wide range of perspectives on ethics and AI, both practical and philosophical, from scholars, learning technologists and students.

Whose Ethics? Whose AI? A relational approach to the challenge of ethical AI – Helen Beetham

Helen Beetham opened the summit with an inspiring and thought-provoking keynote that presented the case for relational ethics. Positionality is important in relational ethics; ethics must come from a position, from somewhere. We need to understand how our ethics are interwoven with relationships and technologies. The ethics of AI companies come from nowhere. Questions of positionality and power engender the question “whose artificial intelligence”?  There is no definition of AI that does not define what intelligence is. Every definition is an abstraction made from an engineering perspective, while neglecting other aspects of human intelligence.  Some kinds of intelligence are rendered as important, as mattering, others are not. AI has always been about global power and categorising people in certain ways.  What are the implications of AI for those that fall into the wrong categories?

Helen pointed out that DARPA have funded AI intensively since the 1960s, reminding me of the many learning technology standards that have their roots in the defence and aeronautical industries.

A huge amount of human refinement is required to produce training data models; this is the black box of human labour, mostly involving labourers in the global south.  Many students are also working inside the data engine in the data labelling industry. We don’t want to think about these people because it affects the magic of AI.

At the same time, tools are being offered to students to enable them to bypass AI detection, to “humanise” the output of AI tools. The “sell” is productivity, that this will save students’ time, but who benefits from this productivity?

Helen noted that the terms “generative”, “intelligence”, and “artificial” are all very problematic and said she preferred the term “synthetic media”.  She argued that it’s unhelpful to talk about the skills humans need to work alongside AI, as these tools have no agency, they are not co-workers. These approaches create new divisions of labour among people, and new divisions about whose intelligence matters. We need a better critique of AI literacy and to think about how we can ask questions alongside our students. 

Helen called for universities to share their research and experience of AI openly, rather than building their own walled gardens, as this is just another source of inequity.  As educators we hold a key ethical space.  We have the ingenuity to build better relationships with this new technology, to create ecosystems of agency and care, and empower and support each other as colleagues.

Helen ended by calling for spaces of principled refusal within education. In the learning of any discipline there may need to be spaces of principled refusal, this is a privilege that education institutions can offer. 

Developing resilience in an ever-changing AI landscape – Mary Jacob, Aberystwyth University

Mary explored the idea of resilience and why we need it. In the age of AI we need to be flexible and adaptable, we need an agile response to emerging situations, critical thinking, emotional regulation, and we need to support and care for ourselves and others. AI is already embedded everywhere, we have little control over it, so it’s crucial we keep the human element to the forefront.  Mary urged us to notice our emotions and think critically, bring kindness and compassion into play, and be our real, authentic selves.  We must acknowledge we are all different, but can find common ground for kindness and compassion.  We need tolerance for uncertainty and imperfection and a place of resilience and strength.

Mary introduced Aberystwyth’s AI Guidance for staff and students and also provided a useful summary of what constitutes AI literacy at this point in time.

Mary Jacob's AI Literacy

Achieving Inclusive education using AI – Olatunde Duruwoju, Liverpool Business School

Tunde asked how we address gaps in equity and inclusion. Time and workload are often cited as barriers that prevent these issues from being addressed; however, AI can help reduce these burdens by improving workflows and capacity, which in turn should help enable us to achieve inclusion.

When developing AI strategy, it’s important to understand and respond to your context. That means gathering intersectional demographic data that goes beyond protected characteristics. The key is to identify and address individual students’ issues, rather than just treating everyone the same. Try to understand the experience of students with different characteristics. Knowing where your students are coming from and understanding their challenges and risks is fundamental to addressing inclusion.

AI can be used in the curriculum to achieve inclusion. For example, AI can be helpful for international students who may not be familiar with specific forms of assessment. Exams trigger anxiety, so how do we use AI to move away from exams?

Olatunde Duruwoju - Think intersectionality

AI Integration & Ethical Reflection in Teaching – Tarsem Singh Cooner

Tarsem presented a fascinating case study on developing a classroom exercise for social work students on using AI in practice.  The exercise drew on the Ethics Guidelines on Reliable AI from the European Group on Ethics, Science and New Technologies and mapped this against the Global Social Work Ethical Principles.

Tarsem Singh Cooner - comparison of Principles on Reliable AI  and Global Social Work Ethical Principles

The assignment was prompted by the fact that practitioners are using AI to uncritically write social work assessments and reports. Should algorithms be used to predict risk and harm, given they encode race and class bias? The data going into the machine is not benign and students need to be aware of this.

GenAI and the student experience – Sue Beckingham, Louise Drum, Peter Hartley & students

Louise highlighted the lack of student participation in discussions around AI. Napier University set up an anonymous padlet to allow students to tell them what they thought. Most students are enthusiastic about AI. They use it as a dialogue partner to get rapid feedback. It’s also helpful for disabled and neurodivergent students, and those who speak English as a second language, who use AI as an assistive technology. However, students also said that using AI is unfair and feels like cheating. Some added that they like the process of writing and don’t want to lose that, which prompted Louise to ask whether we’re outsourcing the process of critical thinking. Louise encouraged us to share our practice through networks, adding that collaboration and cooperation are key and can lead to all kinds of serendipity.

The students provided a range of different perspectives:

Some reported conflicting feelings and messages from staff about whether and how AI can be used, or whether it’s cheating.  Students said they felt they are not being taught how to use AI effectively.

GCSEs and the school system just don’t work for many students, not just neurodivergent ones; it’s all about memorising things. We need more skills-based learning rather than outcome-based learning.

Use of AI tools echoes previous concerns about the use of the internet in education. There was a time when there was considerable debate about whether the internet should be used for teaching & learning.

AI can be used to support new learning. It provides on-hand personal assistance that’s there 24/7. Students create fictional classmates and partners who they can debate with. A lot of it is garbage but some of it is useful. Even when it doesn’t make sense, it makes you think about other things that do make sense.

A few thoughts…

As is often the case with any new technology, many of the problematic issues that AI has thrown up relate less to the technology itself and more to the nature of our educational institutions and systems. This is particularly true of issues relating to equity, diversity and inclusion: whose knowledge and experiences are valued, and whose are marginalised?

It’s notable that several speakers mentioned the use of AI in recruitment. Sue Beckingham noted that AI can be helpful for interview practice, though Helen highlighted research suggesting that applicants who use ChatGPT’s paid functionality perform much better in recruitment than those who don’t. This suggests that we need to be thinking about authentic recruitment practices in much the same way we think about authentic assessment. Can we create recruitment processes that mitigate or bypass the impact of these systems?

I particularly liked Helen’s characterisation of AI as synthetic media, which helps to defuse some of the hype and sensationalism around these technologies.

The key to addressing many of the issues relating to the use of AI in education is to share our practice and experience openly and to engage our colleagues and students in conversations that are underpinned by contextual ethical frameworks such as ALT’s Framework for Ethical Learning Technology.  Peter Hartley noted that universities that have already invested in student engagement and co-creation are at an advantage when it comes to engaging with AI tools.

I’m strongly in favour of Helen’s call for spaces of principled refusal; however, at the same time we need to be aware that the genie is out of the bottle. These tools are out in the world now, they are in our education institutions, and they are being used by students in increasingly diverse and creative ways, often to mitigate the impact of systemic inequities. While it’s important to acknowledge the exploitative nature and very real harms perpetrated by the AI industry, the issues and potential raised by these tools also give us an opportunity to question and address systemic inequities within the academy. AI tools provide a valuable starting point for opening conversations about difficult ethical questions about knowledge, understanding, and what it means to learn and be human.

Honorary Life Membership of ALT

Many thanks to Martin Hawksey for sharing this picture of Helen O’Sullivan announcing the award.

I’m hugely honoured to have been awarded Honorary Life Membership of the Association for Learning Technology at ALT’s 30th Annual Conference Gala at the University of Warwick.  Unfortunately I wasn’t there to receive the award in person because, in a stroke of spectacularly bad timing, I’ve come down with a really horrible cold. Though as Maren pointed out, the great thing about Honorary Life Membership is that you can celebrate it any time! 

For almost three decades Lorna has been a champion of equitable higher education and an open education activist. Lorna’s lifelong commitment to and passion for equality and diversity clearly is evident in her work, yet Lorna tends not to push herself forward and celebrate – or even self-acknowledge – her many achievements.
~ ALT press release.

ALT has been a significant part of my professional life for over 20 years. I attended my first ALT Conference in the late 1990s, joined the Board of Trustees in 2016, and was awarded Senior CMALT in 2022. Serving on the ALT Board and various committees, including the ALT Scotland SIG and the COOL SIG, has been immensely rewarding on both a personal and professional level. I learned a huge amount from my fellow trustees and ALT colleagues and benefited enormously from working with a diverse group of people from a wide range of backgrounds, who I might not have had the opportunity to work with otherwise. I really appreciated having the opportunity to engage with the wider learning technology community at a senior level and to contribute to sector level strategic initiatives. But perhaps most importantly, working with ALT gave me an opportunity to give something back to the Association in return for their dedicated commitment to openness and ethics in learning technology. I’m really humbled that this award acknowledges my open practice and open education advocacy. Open education is a cause that I have a deep personal commitment to, and though at times it has felt like a bit of a quiet uphill struggle, it has also been immensely rewarding.

I’m also really touched to be following in the footsteps of other Honorary Life Members who have been a real inspiration to me throughout my career, including Josie Fraser, Linda Creanor, Frances Bell and Teresa MacKinnon.  However I can’t reflect on this award without acknowledging the exemplary leadership of ALT’s outgoing CEO Maren Deepwell, who successfully steered the Association through many changes and challenges. Throughout her tenure as chief executive, Maren has really embodied ALT’s core values.  It’s been a privilege and a pleasure to work with Maren over the years. I’ve learned a great deal from her and been continually inspired by her professionalism, commitment, and compassion.  I have no doubt that ALT will continue to go from strength to strength under the leadership of CEO Billy Smith and I look forward to seeing new directions ALT will take with him at the helm.

Photograph of L. Campbell presenting at ALTC 2019.

Picture by Chris Bull for Association for Learning Technology, CC BY-NC 2.0, 2019.

 

Generative AI – Ethics all the way down

How to respond to the affordances and challenges of generative AI is a pressing issue that many learning technologists and open education practitioners are grappling with right now and I’ve been wanting to write a blog post about the interface between AI, large language models and the Commons for some time. This isn’t that post.  I’ve been so caught up with other work that I’ve barely scratched the surface of the articles on my rapidly expanding reading list.  Instead, these are some short, sketchy notes about the different ethical layers that we need to consider when engaging with AI.  This post is partly inspired by technology ethics educator Casey Fiesler, who has warned education institutions of the risk of what she refers to as ethical debt. 

“What’s accruing here is not just technical debt, but ethical debt. Just as technical debt can result from limited testing during the development process, ethical debt results from not considering possible negative consequences or societal harms. And with ethical debt in particular, the people who incur it are rarely the people who pay for it in the end.”
~ Casey Fiesler, The Conversation

Apologies for glossing over the complexity of these issues; I just wanted to get something down in writing while it’s fresh in my mind.

Ethics of large language models and Common Crawl data sets

Most generative AI tools use data sets scraped from the web and made available for research and commercial development. Some of the organisations creating these data sets are non-profits, others are commercial companies, and the relationship between the two is not always transparent. Most of these data sets scrape content directly from the web regardless of ownership, copyright, licensing and consent, which has led to legitimate concerns about all kinds of rights violations. While some companies claim to employ these data sets under the terms of fair use, questions have been raised about using such data for explicitly commercial purposes. Some open advocates have said that while they have no objection to these data sets being used for research purposes, they are very concerned about commercial use. Content creators have also raised objections to their creative works being used to train commercial applications without their knowledge or consent. As a result, a number of copyright infringement lawsuits have been filed by artists, creators, cultural heritage organisations and copyright holders.

There are more specific issues relating to these data sets and Creative Commons licensed content. All CC licences include an attribution clause: in order to use a CC licensed work you must attribute the creator. LLMs and other large data sets are unable to fulfil this crucial attribution requirement, so they ride roughshod over one of the foundational principles of Creative Commons.

LLMs and common crawl data sets are out there in the world now.  The genie is very much out of the bottle and there’s not a great deal we can do to put it back, even if we wanted to. It’s also debatable what, if anything, content creators, organisations and archives can do to prevent their works being swept up by web scraping in the future. 
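That said, one small, voluntary measure does exist. Some crawlers, including Common Crawl’s CCBot and OpenAI’s GPTBot, publish their user-agent strings and claim to honour robots.txt directives, so a site owner who wants to signal an opt-out from future scraping can add something like the sketch below. To be clear, this is only a signal: compliance is entirely at the crawler’s discretion, and it does nothing about content that has already been collected.

```
# robots.txt, served from the site root
# Ask Common Crawl's crawler not to collect the site
User-agent: CCBot
Disallow: /

# Ask OpenAI's crawler to stay away too
User-agent: GPTBot
Disallow: /
```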

Ethics of content moderation and data filtering

Because these data sets are scraped wholesale from the web, they inevitably include all kinds of offensive, degrading and discriminatory content. In order to ensure that this content does not influence the outputs of generative AI tools and damage their commercial potential, these data sets must be filtered and moderated. Because AI tools are not smart enough to filter out this content automatically, the majority of content moderation is done by humans, often from the global majority, working under exploitative and extractive conditions. In May, content moderators in Africa who provide services for Meta, OpenAI and others voted to establish the first African Content Moderators Union, to challenge low pay and exploitative working conditions in the industry.

Most UK universities have a commitment to ending modern slavery and uphold the terms of the Modern Slavery Act. For example the University of Edinburgh’s Modern Slavery Statement says that it is “committed to protecting and respecting human rights and have a zero-tolerance approach to slavery and human trafficking in all its forms.” It is unclear how commitments such as these relate to content workers who often work under conditions that are exploitative and degrading at best, and a form of modern slavery at worst. 

Ethics of anthropomorphising AI 

The language used to describe generative AI tools often humanises and anthropomorphises them, either deliberately or subconsciously. They are ascribed human characteristics and abilities, such as intelligence and the ability to dream. One of the most striking examples is the use of “hallucinating”. When ChatGPT makes up non-existent references to back up erroneous “facts”, this is often described as “hallucinating”. This propensity has led to confusion among some users when they have attempted to find these fictional references. Many commentators have pointed out that these tools are incapable of hallucinating, they’re just getting shit wrong, and that the use of such humanising language purposefully disguises and obfuscates the limitations of these systems.

“Hallucinate is the term that architects and boosters of generative AI have settled on to characterize responses served up by chatbots that are wholly manufactured, or flat-out wrong.”
~ Naomi Klein, The Guardian

Ethics of algorithmic bias

Algorithmic bias is a well known and well documented phenomenon (cf. Safiya U. Noble’s Algorithms of Oppression) and generative AI tools are far from immune to it. Valid arguments have been made about the bias of the “intelligence” these tools claim to generate. Because the majority of AI applications are produced in the global north, they invariably replicate a particularly white, male, Western world view, with all the inherent biases that entails. Diverse they are not. Wayne Holmes has noted that AI ignores minority opinions and marginalised perspectives, perpetuating a Silicon Valley perspective and world outlook. Clearly there are considerable ethical issues about education institutions that have a mission to be diverse and inclusive using tools that engender harmful biases and replicate real world inequalities.

“I don’t want to say I’m sure. I’m sure it will lift up the standard of living for everybody, and, honestly, if the choice is lift up the standard of living for everybody but keep inequality, I would still take that.”
~ Sam Altman, OpenAI CEO. 

Ethics of catastrophising

Much has been written about the dangers of AI, often by the very individuals who are responsible for creating these tools. Some claim that generative AI will end education as we know it, while others prophesy that AI will end humanity altogether. There is no doubt that this catastrophising helps to feed the hype cycle and drive traffic to these tools and applications; however, Timnit Gebru and others have pointed out that by focusing attention on some nebulous future catastrophe, the founding fathers of AI are purposefully distracting us from the current real world harms caused by the industry they have created, including reproducing systems of oppression, worker exploitation, and massive data theft.

“The harms from so-called AI are real and present and follow from the acts of people and corporations deploying automated systems. Regulatory efforts should focus on transparency, accountability and preventing exploitative labor practices.”
~ Statement from the listed authors of Stochastic Parrots on the “AI pause” letter

Nirit Weiss-Blatt’s (@DrTechlash) “Taxonomy of AI Panic Facilitators”, a visualisation of leading AI Doomers (X-risk open letters, media interviews & OpEds). Some AI experts enable them, while others oppose them. The gender dynamics are fucked up. It says a lot about the panic itself.

Not really a conclusion

Clearly there are many ethical issues that education institutions must take into consideration if they are to use generative AI tools in ways that are not harmful. However, this doesn’t mean that there is no place for AI in education, far from it. Many AI tools are already being used in education, often with beneficial results; captioning systems are just one example that springs to mind. I also think that generative AI can potentially be used as an exemplar to teach complex and nuanced issues relating to the creation and consumption of information, knowledge equity, the nature of creativity, and post-humanism. Whether this potential outweighs the ethical issues remains to be seen.

A few references 

AI has social consequences, but who pays the price? Tech companies’ problem with ‘ethical debt’ ~ Casey Fiesler, The Conversation 

Statement from the listed authors of Stochastic Parrots on the “AI pause” letter ~ Timnit Gebru (DAIR), Emily M. Bender (University of Washington), Angelina McMillan-Major (University of Washington), Margaret Mitchell (Hugging Face)

Open letter to News Media and Policy Makers re: Tech Experts from the Global Majority ~ @safiyanoble (Algorithms of Oppression), @timnitGebru (ex Ethical Artificial Intelligence Team), @dalitdiva, @nighatdad, @arzugeybulla, @Nanjala1, @joana_varon

150 African Workers for ChatGPT, TikTok and Facebook Vote to Unionize at Landmark Nairobi Meeting ~ Time

AI machines aren’t ‘hallucinating’. But their makers are ~ Naomi Klein, The Guardian  

Just Because ChatBots Can’t Think Doesn’t Mean They Can’t Lie ~ Maria Bustillos, The Nation 

Artificial Intelligence and Open Education: A Critical Studies Approach ~ Dr Wayne Holmes, UCL 

‘What should the limits be?’ The father of ChatGPT on whether AI will save humanity – or destroy it ~ Sam Altman interview, The Guardian 

Open Scotland @10 Plenary Panel synthesis & outputs

To mark 10 years of the Open Scotland initiative, Joe Wilson and I ran two events as part of the OER23 Conference at UHI in Inverness, which provided an opportunity for members of the education community to reflect on how the open education landscape in Scotland has evolved over the last decade, and to discuss potential ways to advance open education across all sectors of Scottish education. 

Open Scotland Pre-Conference Workshop

Joe has already written up our pre-conference Open Scotland workshop, which brought together around 40 colleagues, in person and online, to discuss key challenges and priorities. You can read Joe’s summary of the workshop here: Open Scotland Reflections on Pre-Conference Workshop.

 

OpenScotland @10 Plenary Panel

The closing plenary panel of OER23 brought together open education practitioners from within Scotland and beyond.  Panel participants were Lorna M. Campbell, Open Scotland and University of Edinburgh; Scott Connor, UHI;  Maren Deepwell, ALT; Stuart Nicol, University of Edinburgh; Robert Schuwer, consultant and former UNESCO Chair on Open Educational Resources; Joe Wilson, Open Scotland and City of Glasgow College.  Each member of the panel was invited to briefly share their thoughts on future directions for Open Education, before we opened the discussion to the floor. 

Photograph of Open Scotland Plenary Panel at the OER23 Conference.

Open Scotland Plenary Panel by Tim Winterburn.

Stuart Nicol, Head of Educational Design and Engagement at the University of Edinburgh, acknowledged that while it’s disappointing that there hasn’t been more support from the Scottish Government, there has been support for open education at a number of institutions, including the University of Edinburgh. Stuart highlighted the important role of committed people who have pushed the open agenda within institutions. Short of having government level commitment and policy, Stuart suggested we need to provide opportunities for people to come together to share practice and to encourage institutions to work together.

Scott Connor, Digital and Open Education Lead at UHI’s Learning and Teaching Academy, outlined UHI’s strategic commitment to open education, which is underpinned by an OER Policy and a framework for the development of open educational practices. Scott highlighted the lack of government mandates and funding as barriers to engagement with open education and suggested that real impact would come through the government adopting the Scottish Open Education Declaration and using it to mandate that resources created with public funding should be shared openly to benefit everyone.

Both Scott and Stuart highlighted the OER policies adapted and adopted by the University of Edinburgh and UHI as a prime example of open education collaboration.

Photograph of Open Scotland Plenary Panel at the OER23 Conference.

Open Scotland Plenary Panel by Tim Winterburn.

Robert Schuwer, independent consultant and former UNESCO Chair of OER, provided an overview of open education in The Netherlands, where the government has supported a range of OER initiatives and stimulation grants since 2006. In 2014 the Education Ministry issued a strategic agenda stating that by 2025 all teachers should share their learning materials. Although some institutions such as TU Delft are front-runners, other smaller institutions are just getting started.

Robert suggested that the biggest challenge is to cross the chasm from early adopters and innovators to the majority of teachers, to encourage them to adopt principles of openness in education.  He suggested connecting to teachers’ passion, which is teaching rather than sharing materials, and highlighting how open education can help them to become better teachers. 

Maren Deepwell, CEO of the Association for Learning Technology, reminded us that we’re not just talking about openness in Higher Education; we’re looking at all sectors, including schools, training, vocational education, FE, HE, and research. The UK Government looks at Open Access research and thinks the open box is ticked. ALT has tried to reach out to both the Scottish Government and the Department for Education, but often there is no one with responsibility for open education policy beyond Open Access and Open Research funding. 

Maren noted that we tend to see open education as another challenge alongside Brexit, the cost of living crisis, climate change, sustainability, etc., and ultimately it is never at the top of the agenda.  She suggested that our opportunity is to present openness as a way to solve these challenges.  It’s ingrained in us that openness is an extra step, one that teachers need more time, more funding, and more skills to take.  Instead we need to highlight how openness could solve resource scarcity and training issues, and help small independent providers collaborate across sectors.  We need to show openness as a way to solve these challenges, rather than as a standalone challenge in its own right.

Photograph of Open Scotland Plenary Panel at the OER23 Conference.

Open Scotland Plenary Panel by Tim Winterburn.

Opening the discussion to the floor, members of the community put forward a range of comments and suggestions including: 

  • Taking a whole population approach to education rather than a sectoral approach. Open education is a way to educate for all our futures, not just those who can afford a good education. Open educators should collaborate with demographic data experts to see how open education could address key challenges of our ageing population, including health and social care. 
  • Start with early interventions at primary school level. How do children learn, what do they learn, what role models do they see? Start to train a new generation of people to think in different ways. Currently there is no mention of openness in the General Teaching Council programme, but a logical place to start would be with teaching staff who are teaching children how to learn.  However, because of concerns about GDPR, teachers work in closed environments, and there are challenges around safeguarding and managing digital identities. 
  • Scotland’s baby box has been an important mechanism for learning for both parents and children, so why not add a leaflet about open education?
  • Scotland has always had a very egalitarian tradition of education, the principles of openness fit well with this tradition, from school all the way up, so it’s frustrating that we haven’t been able to introduce open education at school level.
  • Maybe we’re trying too hard to change policy, perhaps it would be better to focus on doing fun stuff and sharing open practice. Do what you can at the small level; small OER, rather than big OER. This can be really powerful. Sharing in small ways can make a difference.
  • People hear about Open Scotland and are interested in open education, but they’re constrained by their local authorities or their college marketing teams. 
  • The strength of open education is in the grass roots; as soon as it gets sucked into politics, it gets watered down. There is a risk that comes with government policy and funding. You cede some control when policy is dictated at that level.  At grass roots level we can control it, shape it and manage it.  It’s hard work pushing upwards, but there is a danger when it comes from the other direction that we lose something and open education gets co-opted by people we may not wish to work with. 
  • Robert Schuwer countered this point by noting that this has not happened in The Netherlands.  Government support is provided at all levels of education, but there is a lot of autonomy within institutions. The only mandates were the 2014 strategic agenda and a 2020 Open Access research mandate, both of which have been beneficial.  Robert also noted that students lobbied the Education Minister and had direct input to the 2014 sharing agenda.  This was also the case at the University of Edinburgh, where EUSA encouraged the University to support open education and OER. 
  • We have a political problem in that our education ministers don’t know much about education, so openness is never a priority.  We need to trust ourselves and continue with the grass roots work.  We need to feed messages up to government ministers that open education can be a solution to sustainability and other strategic agendas.  We need to take our advocacy up a notch, perhaps take out an advert in the press. 

Next steps

The next step will be to continue synthesising the outputs of the workshop and plenary panel, captured in this Padlet, with a view to drafting a new Open Scotland manifesto to share with the community and move the open education agenda forward. 


OER23 Conference: Imagining hopeful futures

I’m a bit late with this OER23 reflection; it’s taken me a couple of weeks to catch up with myself and to let some of the ideas generated by the conference percolate.  

It was fabulous to see the OER Conference returning to Scotland for the first time since we hosted it at the University of Edinburgh in 2016, and I was particularly pleased to see the conference visit the University of the Highlands and Islands in Inverness.  Inverness holds a rather special place in my heart as the site of many childhood holidays (it seemed like such a big city compared to Stornoway!) and as a stopping off point on annual journeys home to the Hebrides.  I had a slightly weird feeling of nostalgia and homesickness while I was there; it was odd being in Inverness and not traveling on further north and west. Perhaps not coincidentally, sense of place and community were two themes that emerged throughout the conference. 

As one of the few universities in Scotland, along with Edinburgh, with a strategic commitment to open education, including an OER Policy and a Framework for the Development of Open Education Practices, UHI was a fitting venue for the conference. Keith Smyth and his UHI colleagues were the warmest of hosts and the airy Inverness campus was a beautiful location with plenty of space to breathe, think, and (re)connect. It was lovely seeing so many colleagues from around the world experiencing a Highland welcome for the first time. 

UHI Inverness, CC BY, Lorna M. Campbell

One of the main themes of the conference was “Open Education in Scotland – celebrating 10 years of the Scottish Open Education Declaration” and Joe Wilson and I ran both a pre-conference workshop and the closing plenary panel to reflect on progress, or not, over the last ten years and to map a way forward.  I’ll be reflecting on these discussions in another post.

Rikke Toft Nørgård opened the conference with a fantastic and fantastical keynote on “Hyper-Hybrid Futures? Reimagining open education and educational resources Places // Persons // Planets” (slides, recording) that challenged us to imagine and manifest transformative speculative futures for education.  Her call for “open hopepunk futures in grimdark times” clearly resonated with participants. Rikke described hopepunk as a sincerely activist approach to fighting for a more hopeful future.  I particularly liked her vision for place-ful OERs; education that has a home, that belongs and dwells in placefulness, being some-where, not any-where. 

Anna-Wendy Stevenson also picked up on this idea of belonging and placefulness in her keynote “Setting the Tone: The democratisation of music education in the Highlands and Islands and beyond” (recording). Anna-Wendy is the course leader of UHI’s award-winning BA in Applied Music, a blended learning course that enables students to study music in their own communities while providing opportunities for both virtual and place-based residencies in the Outer Hebrides and beyond.  Having grown up in the Hebrides I appreciate the importance of having the opportunity to study at home, and the benefits this can bring to students and the community.  I left the islands to go to university and, like many graduates, never returned.  While eighteen-year-old me wouldn’t have passed up the opportunity to move to “the mainland” in a month of Sundays (IYKYK), I would have jumped at the chance if there had been a possibility to go back home to continue studying archaeology at postgraduate level. It’s wonderful that students now have that opportunity. After Anna-Wendy’s keynote, it was lovely to hear her playing traditional Scottish music with some of her students who have benefited from this place-based approach to music education. 

It was great being able to attend the conference with a group of colleagues from the University of Edinburgh, several of whom were experiencing the conference for the first time. Fiona Buckland and Lizzy Garner-Foy from the Online Course Production Service gave a really inspiring presentation about the University’s investment in open education, which has resulted in 100 free short online courses and over 1000 open educational resources (OER) that have benefited almost 5 million learners over the last 10 years. It makes you proud 🙂

Tracey Madden told the story of the University’s digital badges pilot project and the challenges of developing a sustainable service that assures both quality and accessibility. Stuart Nicol and I shared the university’s experience of transforming the curriculum with OER and presented case studies from the fabulous GeoScience Outreach course and our indefatigable Wikimedian in Residence (slides). We shared a padlet of open resources, along with staff and student testimonies, which you can explore here: Open For Good – Transforming the curriculum with OER at the University of Edinburgh.

 

The Edinburgh team also had a really productive meeting with a delegation of colleagues from a wide range of institutions and organisations in the Netherlands to share our experiences of supporting open education policy and practice at institutional and national level in our respective countries. 

As with so many OER Conferences, hope and joy were prominent themes that were woven into the fabric of the event. Catherine Cronin gave us an update on the eagerly anticipated book Higher Education for Good: Teaching and Learning Futures, which she has been editing with Laura Czerniewicz. 

Prajakta Girme spoke about “Warm Spaces”: open multicultural spaces, or “pockets of community”, for vulnerable communities and non-students within the university environment. Frances Bell and Lou Mycroft asked how we can use feminist posthuman storytelling to promote activism in FemEdTech and open education, challenging us to develop “productive approaches to exploring uncertain educational futures critically, retaining the pragmatic hope offered by Posthuman Feminism.”  Frances had brought one of the Femedtech quilts (it was lovely to see my Harris Tweed square at home in the Highlands) and she invited us to write speculative futures for the quilt assemblage.  You can read my micro-speculative future on femedtech.net here: Reconnecting with Joy.

Frances Bell and the Femedtech quilt, CC BY, Lorna M. Campbell

I also had a really lovely conversation with Bryan Mathers of Visual Thinkery about our shared experience of reconnecting with our Gàidhlig / Gaeilge language and culture. His Patchwork Province zines had me laughing and nodding along in rueful recognition. 

I always leave the OER Conferences inspired and hope-full and this year it was lovely to end the conference by sharing a quiet, reflective train journey with Catherine, Joe and Louise Drumm, who captured this beautiful image as we traveled home through the Highlands.