You’re Tired!


It was on this very day 10 years ago that markets woke up to a problem when French bank BNP Paribas halted redemptions – the withdrawal of investors’ money – on three of its investment funds. It was this act that triggered what became the UK’s most recent and largest financial crash. Coincidentally, it was also ten years ago that Apple launched its first iPhone. These two anniversaries, though not intrinsically linked, have given me cause to think about where we are a decade later, and have prompted this post.

From 2008 onwards, the word recession became part of our everyday life and language, and as budget cuts began to take hold, the educational zeitgeist – be it around compulsory or post-16 teaching and learning – started to be re-framed around two key words: innovation and enterprise. It makes practical sense to think in these terms: if the country’s economy is flat-lining, then the need to do more with less (innovation) and the need to try to make a living when the financial ground is barren (enterprise) become matters of survival. I've written about innovation in education before, so if you want to check my earlier post out before continuing, do go ahead.

If I were to link these two words with cognitive processes, then creativity of thought, lateral thinking and imagination would be fundamental.

A decade ago, the iPhone was the epitome of innovation. It was, and still is, the 'must have' smartphone: it's ubiquitous, it's rather beautiful, and it has, effectively, replaced the office. But now that it's ten years old and developers' ideas are running out, each brand new iteration becomes less groundbreaking and certainly less innovative.

As a member of the conference panel for the next International Conference of Education, Research and Innovation in November (ICERI 2017), I’ve been reading through a selection of papers submitted by educators, researchers and technologists from around the world. Though it has been a genuinely interesting and rewarding experience, something has been niggling me, and I can't shake it off. And it's not just something that's coming from some of the submissions I've been reading: I've noticed it locally too. Things that are being mooted as innovative are, on closer inspection, no more than the re-hashing or re-branding of concepts, methods and processes that have been around for ages. At the risk of sounding like 'Irritated of Nantwich', I'm going to suggest that filming clinical skills procedures and then making them available for students to view online is not innovative, asking teachers to curate and then reflect upon their CPD by way of an online portfolio or blog isn't a new idea, and recording audio feedback for students rather than typing assignment feedback isn't enterprising. These are all genuine examples of practice that has been labelled innovative and enterprising and that have been on my radar for a few weeks. The thing is, they are also all examples of practice that were on the same radar a decade ago.

So have we run out of ideas?

I don't think so. I'd like to think that our collective imagination is limitless. But I do think that because we live and work in a daily state of emergency – where our day in the office amounts to little more than fire-fighting, and our home lives are increasingly fraught and lived in the shadow of political unrest, inequality, lack of resources and a race to the bottom – it can be hard to find the space, the time and the right frame of mind in which to be innovative or enterprising. And so, because we are told that we must be enterprising and innovative in order to raise our institution's profile and remain relevant, but we can't reasonably be inspired on command, we re-package, rename and rebadge projects that have already been done, hoping that our audience likes the emperor's newly-tailored suit. Maybe if we had time to breathe and, vitally, were allowed to take risks and make mistakes, we would be able to be truly innovative. Sadly, I can't see a time when this will be allowed to happen.

But I am determined to finish on a positive note. A cafe down the road from me not only makes the best coffee in Cardiff, it has also been innovative and enterprising by doing one small thing. You know those cardboard coffee sleeves that you get in Starbucks and Costa? Visit the cafe, buy one of their fantastic coffees, and then, for one pound, you can buy a reusable coffee sleeve made from fabric. Simple. Brilliant. And it saves trees too.

Game On! (Part 2)

A warning to other bloggers out there: this is what happens when you promise your audience that you are going to write something groundbreaking based around a very specific subject (and one you arrogantly think is yours and yours alone to write about), but instead start procrastinating and writing blog posts about other things.

I have mentioned a couple of times in recent weeks my need to write about a theory that has been buzzing around my head for a while now: that console gaming is therapeutic. Well, an article from The Conversation has just popped up in my Facebook feed, and it says exactly what I should have said a few weeks ago. It would appear that my groundbreaking theory isn’t as groundbreaking as I had assumed. However, finding out that others share your theory is comforting. This particular article makes for concise and genuinely interesting reading, had my head nodding enthusiastically in agreement (and my teeth grinding in frustration at my own laziness), and, vitally, goes on to cite academic research published in April 2016. The research, carried out by Leonard Reinecke of the Johannes Gutenberg-Universität Mainz, found that when video games are used systematically after exposure to stressful situations and strain, recovery is a significant facet of the gaming experience (Reinecke, 2016). Console gaming as therapy. Boom!

In terms of my own recovery from stressful periods, I concur absolutely. Were it not for Dragon Age II during an incredibly stressful period of my professional life, I would probably not be in the positive frame of mind I’ve been able to maintain for a few years. I’d also be lucky to be able to work at all. Skyrim got me through equally tough periods in my personal life, and on a smaller scale, I still always have a ‘go-to game’ in my PS4 in case I’ve had a trying day and need to let off steam and ‘ground myself’ again. This week, it’s mostly Elder Scrolls Online, though reading this back to myself, I really do need to find a genre of game that doesn’t involve swords, sorcery or picking flowers in order to make potions…

Reference:

Reinecke, L., Games and Recovery: The Use of Video and Computer Games to Recuperate from Stress and Strain, located at: https://www.researchgate.net/publication/232594932_Games_and_Recovery_The_Use_of_Video_and_Computer_Games_to_Recuperate_from_Stress_and_Strain, date accessed: 10th July, 2017

‘Isn’t she in Game of Thrones?’

‘Darksansastark…’ said my erstwhile partner when he caught me clapping my hands with the same level of enthusiasm a horse-mad four-year-old might display on receiving a pony for her birthday.

I’ll go back a little. It was the 6th July, and I had just received an email from Linden Lab informing me that I had been selected to be among the first to create ‘social VR experiences’ with Sansar, Linden’s virtual world for the virtual reality generation.

It’s been a long time since I was at the forefront of anything techy, so my excitement was entrenched in that little part of me that wants to try everything before anyone else does. The fact that it took me ages to log in, and that my avatar is about as customisable as a lump of coal, has the face of a corpse and walks like she has done a number two in her pants, is nothing. Whatever happens, I was one of the first people to log into Sansar. And, most importantly, as a developer, I can build my own space. Ladies and gentlemen, let me introduce you to:


Oh, we learning technology types may scoff at how far behind we think our institutions are compared to the rest of the world, but I’m pretty certain that Cardiff University is (probably one of) the first HE institution(s) to have a presence in Sansar, though how long it will last, I don’t know.

So how does it look and how does it feel? Graphically, Sansar is so advanced it makes Second Life look like an early Sega Mega Drive game, and the quality of the audio is just fantastic: like being in a cutting-edge cinema. I’m even adoring the font style used for messaging (I have a thing about fonts. I may need help). But it’s not all beer and skittles, and at the moment it’s really just a triumph of style over substance. Because it’s so new it’s frustratingly limited, it’s laggy, and even the usually simple process of trying to move objects around is a pig. So here we have exactly what we had 12 years ago when Second Life was introduced – something with real promise, and a glimpse of a future that I really want to be a part of, but with more bugs than an NHS hospital. As with Second Life’s early days, Sansar delights and frustrates in equal measure, and in this iteration, the virtual world can’t even be used with VR headsets as yet – despite this being the very premise on which it was founded.

I don’t care. I’m still really excited. Here’s a screenshot of my avatar in her new Cardiff University ‘home’ to tantalise and delight you:


The way to academics’ hearts is through their minds

I presented the following abstract at Cardiff University’s Learning and Teaching conference on Tuesday. And no, I haven’t forgotten about those ‘gaming is the future’ blog posts I keep promising; other things keep getting in the way!

When it comes to Technology Enhanced Learning (TEL), there has long been an emphasis on demonstrating how to use digital tools in staff development sessions. However, there is little evidence of staff development sessions examining the methods and models behind TEL. Institutional directives request that staff use a Virtual Learning Environment (VLE) and offer training on the mechanics of uploading documents and renaming folders, but they do not explain the methodologies or pedagogic models behind using a VLE. Other directives require that academic staff embed digital literacy skills into their teaching practice in order to hone their students’ own skills. Academic staff are rarely asked if they know what digital literacy means themselves; the hope, it would seem, is that the meaning of digital literacy is learnt and passed on to students through a process of osmosis. I would suggest that, since academics and teachers work from a taxonomy of pedagogy, it is from this taxonomy that staff development should be approached.

Repeated reviews into the professional development of teachers and ways to diminish their fear of technology have recommended that staff are given substantial time if they are going to acquire and, in turn, transfer to the classroom the knowledge and skills necessary to effectively and completely infuse technology into curriculum areas (Brand, 1997). However, lack of time is just one issue, and the constant emphasis on the need to ‘find time’ merely distracts from the proverbial elephant in the room: that academics are ‘scared’ of technology because they aren’t told how it fits a familiar pedagogic framework. Learning technologists are expert at explaining how to use a tool, but often miss out the pedagogical value of the tool, assuming that the teacher will think of a use for it.

In response to this, I currently run sessions for teachers and academic staff looking at methods and models such as the flipped classroom, Personal Learning Networks, blended learning, digital literacy, the benefits of online communities of practice, and the differences between pedagogy, andragogy and heutagogy. We have debated at length Prensky’s notion of the Digital Native against the Visitors and Residents model espoused by Dave White. We have looked at the psychology behind the online learner and their need to feel part of a group. When staff begin to understand these theories and methods, they feel better placed to choose tools that are appropriate to their curricula, their students and the relevant assessment processes.

I would suggest that there is a real need to do more of this. If academics can see things from their particular (and familiar) perspective, they will see which tools work best and can then, if needed, be taught how to use them.

Technology often feels like something that is being ‘done’ to people via institution-wide directives, and not something that they can do themselves. It is now 2017, so the time has come for a change in thinking.

References:

Brand, G.A., (1997), Training Teachers for Using Technology, Journal of Staff Development, Winter 1997 (Vol. 19, No. 1)

Prensky, M., (2001), Digital Natives, Digital Immigrants, located at: http://www.marcprensky.com/writing/Prensky%20-%20Digital%20Natives,%20Digital%20Immigrants%20-%20Part1.pdf, date accessed: 22nd February, 2017

White, D.S. and Le Cornu, A., (2011), Visitors and Residents: a New Typology for Online Engagement, located at: http://firstmonday.org/ojs/index.php/fm/article/view/3171/3049, date accessed: 22nd February, 2017

Figuratively Speaking

I often like to think of myself as the Queen of the Cliché or Dark Mistress of the Analogy. Not in terms of any Stephen Fry-esque use of language, but because I’ve noticed that it’s how I communicate, by and large, on a professional basis. However, having read the following post through before pressing ‘Publish’, I’m going to back-pedal on that first sentence a bit. I’m not as cool as I think I am, and I sound more like Theresa May by the week, with her ‘strong and stable magic money tree’ soundbites and the way her limbs squeak if Team Maybot have forgotten to squirt some WD40 into her joints. Mine just squeak because I’m old. But let’s return to the subject at hand and my trotting out of paraphrased stock phrases ad nauseam. Here are some examples I have used in the last week alone:

‘You can lead a horse to water, but you can’t make it drink. And actually, you can’t even lead it to water half the time.’

(Translation: You can tell people about technology, but you can’t make them use it. And much of the time, because they are just too busy or too frightened or too uninterested in technology in the first place, you won’t even get the opportunity to tell them about it.)

‘If you build it, they will NOT come.’

(Translation: If you knew your students had set up their own Facebook group for your course, why did you then set up a group for them on Yammer, and why are you now complaining about how technology doesn’t work because they aren’t using it?)

‘We can’t just switch off the Internet! It’s too late for that!’

(Translation: I know you felt more in control when you were the ruler of the lecture theatre and in charge of the overhead projector and acetates, but it’s 2017, the world has changed, and our students really do expect more than that, so please try to accept that and let me help you to move on.)

That last one I used just yesterday, at a programme planning meeting. In 2018 we’re going to be offering some new modules at postgraduate level, and I had been invited by the module directors to deliver a presentation looking at how and why we need to work together to embed technology into these new modules.

Before I did my bit, the marketing team talked about how to make these modules more sellable both internationally and locally. ‘What are the School’s USPs?’ asked the team. A member of the teaching team stuck their hand up to offer a suggestion.

‘That we use traditional teaching methods,’ said my colleague, ‘and none of this blended learning stuff we keep hearing about. My entire course is taught completely face to face in the lecture theatre, and I think that this is what my students want and makes my course unique.’

So. Despite a swathe of comments to the contrary made by NSS respondents, I realised again that I am still wading through wet cement and fighting the same battle I’ve been trying to fight for almost 15 years: technology versus tradition. A million questions, suppositions and accusations clamoured for recognition at the front of my brain hole:

‘Thank God your course is unique…but do you think that’s what they want? Have you ever actually asked them…or are you scared of technology and too proud to admit it…or are you scared of technology and terrified of being replaced by a robot…or are you so close to retirement that you just want to carry on coasting along for another two or three years without being hassled…or are you completely uninterested in technology, and assume that because you are, everyone is? And what is it that I haven’t done, because clearly I have failed you if you feel this way?’

I delivered my presentation a little later, about how blended learning, e-learning and micro-learning could give these new modules an edge, then talked about how respondents to student surveys are crying out for the same things – more lecture capture, a more organised and up-to-date Virtual Learning Environment, and…yes…because it’s too late to switch off the internet…more technology enhanced learning and an end to didactic, lecture-theatre-based content 8 hours a day, 5 days a week.

I’m not sure whether my message got through, because it’s hard to be heard when the cement mixers keep depositing their warm sludge around your ankles wherever you go. 😉

(BTW: I’ve not forgotten about my ‘why console gaming is great’ series of posts, it’s just that this cropped up and I needed to write it down.)

 

Game On! (Part 1)


I wanted to write a post about video gaming and how it has not only been a source of entertainment to me for over 40 years but, almost as a by-product, also a tool for therapy and for education. I then realised it would make for a very long post. Instead, I am going to chop it up into digestible pieces and attempt to write a regular series of thematic posts. It makes sense, sequentially, to make the first one an overview of sorts, and to set the scene. So here it is:

Introduction (or: ‘Coming to think of it, are Generation X the original gamers?’)

I first ‘owned’ a video game console in 1978. I was 8 years old and my parents ran one of the first pubs in the country to have a Space Invaders cabinet standing in the corner of the public bar. It terrified and delighted me in equal measure. After school, I would stand at my dad’s side and delightedly watch him zapping those sideways-creeping, crab-like aliens, the music becoming quicker and more ominous as more were destroyed and their rampaging across the screen became more desperate. This he would do at opening time so, fairly often, before I was sent upstairs and away from the soon-to-be smoke-filled haze of an adults-only environment (and how many hyphens do I need to construct a sentence, eh readers?), the first customer of the evening would wander in and dad would have to leave the game mid-level, asking me to take over until all three lives were lost (as they were within about 15 seconds: I was eight, had no fine motor skills or coordination, and was too scared to do anything other than panic, stand there doing nothing and die immediately), and that’s when the game would terrify me. Five minutes of excitement, 15 seconds of terror, then upstairs quickly for crispy pancakes, Coronation Street, bath and bed.

Via stints in my teenage years with a Commodore 16 and a few lacklustre attempts at playing Super Mario and Sonic the Hedgehog in my early twenties (which, ultimately, were both games about testing the player’s timing rather than problem solving, and didn’t interest me in the slightest), video gaming had fallen so far off my radar it was now on Mars. It was only in my mid-to-late twenties, to silence my console-addicted husband of the time who kept nagging me to ‘just have a go’, that I picked up a joypad, stuck Tomb Raider on his PlayStation, and my world changed.

I’m 47 now. I own an Xbox 360 and a PS4, and I still play Tomb Raider games, but my tastes have changed. I’m currently playing Elder Scrolls Online (so I perhaps should have realised years ago, when I was addicted to Fighting Fantasy books, that role-playing games would become my go-to genre). This year alone I have dedicated what amounts to hundreds of hours of free time to Final Fantasy XV, and a remastered version of Skyrim that I first completed 6 years ago when it was originally released on an older system. Last year it was Fallout 3, Fallout New Vegas and Fallout 4 – though thanks to Donald Trump, I can no longer play these without wincing. I’m a sucker for a few hours on Silent Hill too. But I have to play with the lights on.

You might be shaking your head in disgust now. All those hours hunched over a console, staring at a television, when you could be doing something healthier / more sociable / ‘useful’!  Tsk!

Over the next few posts I hope to be able to change your mind or, if you are a fellow gamer, confirm what you have been suspecting for a while. Either way, I’ll be talking about why I think gaming is not the product of a misspent youth, not just for ‘sad people’, and hasn’t, as yet, turned me into a thug or a terrorist. I will also be suggesting that gaming can be used as therapy and looking at why I think console gaming may have value as a teaching and learning tool.

Make Mine a Double


I’ve been thinking a lot about the psychology of eLearning. I’m not even certain that ‘the psychology of eLearning’ is a thing – but it is something that I find myself banging on about a lot at the moment, especially when talking to academics, instructional designers and other learning technologists, so I may just sound like a crazy fool. Or this may be old news, in which case you can stop reading now.

If you’re still reading, I’ll try to explain what I mean by way of one of my patented food analogies:

A couple of weeks ago I went back to my home town for the weekend to catch up with family and friends, and started the weekend in the pub. First stop was the local Wetherspoons, where my partner ordered a gin and tonic. He specifically ordered Tarquin’s gin (a brand that he had never tried before), and it was served in a standard tumbler with a couple of ice cubes and a slice of lime. And very nice it was too – I had a sip and decided that I wanted to buy a bottle while I was in Cornwall.

We finished the night in our hotel’s bar, where my partner ordered another Tarquin’s gin and tonic. This time the drink was served in a long-stemmed gin glass, again with ice and a slice of lemon. Again, I tried a sip – but this time it tasted much better. It shouldn’t have; it was the same mix of the same ingredients – but I am convinced that because it was served in a ‘proper’ gin glass and not a standard tumbler, it made a real difference. On a subconscious level, I guess I felt that I was being ‘better cared for’. Maybe it had a placebo-like effect: because it looked nicer, it tasted nicer.

I think the same applies in eLearning; specifically via instructional design.

A lot of people involved in developing online resources suggest that function trumps design. As long as an online package does what it is supposed to do, then it works. And on a mechanistic level, yes, it does. I mean, who wants something that looks nice, but where the interactions don’t work? But I have seen a lot of fully-functioning elearning packages that, for all their drag-and-drop activities, multiple choice questions and multimedia elements, just look uninspiring. Lots of black text on a white background, lots of corporate clipart, and tonnes of ‘click forward for more of the same’. These packages do what they need to do, and students work through them because they need to – but do the students feel valued by the content? Are these packages the elearning equivalent of a gin and tonic in Wetherspoons? And, if you are paying up to £9k per year to study, should you, on some subliminal or subconscious level, feel valued by the institution you’ve given your money to?

I don’t know. Obviously, I can’t make a judgement based on a theory I haven’t really done anything with. Maybe I should talk to some students, see if I’m on to anything here, or just stating the bleeding obvious.

Don’t be Scared of the Dork


James Clay wrote a thought-provoking post last week. Called Show me the Evidence, it talks about how: “when demonstrating the potential of TEL and learning technologies to academics, the issue of evidence of impact often arises. You will have a conversation which focuses on the technology and then the academic or teacher asks for evidence of the impact of that technology.”

James cites fear as a key reason behind this, suggesting that many lecturers don’t have the capabilities to use IT, so lack the basic confidence to use learning technologies. To save face, and because it would be mortifying to have to confess to a lack of skills, they ask for the “evidence”. This then enables them to delay things.

Weirdly, I can’t think of a single occasion when an academic has asked me for empirical evidence or to cite the research that frames my work. I tend to go about things the other way – heading the academics off at the pass, because I am the one who is afraid of looking like a dunce in front of them.

I delivered a lunchtime session to academics looking at the flipped classroom model last week. The conversation turned to the (still) widely-held belief that anyone under 25 is a techy wizard, while the rest of us can barely use our smartphones. (A different kind of stalling technique, perhaps? It’s always academics who bring this up.) I offered some ramblings about Marc Prensky’s ‘Digital Natives’ theory being a load of old cock-and-bull, and Dave White’s ‘Visitors and Residents’ model being more realistic and less ‘pigeon-holey’. The group liked this as it appealed to their academic mindsets, so I was then able to sneakily show them some tools while they were feeling more at home.

Another thing I sometimes do is suggest that the academic in question might want to try out the method or tool being suggested, and then write a paper about the experience. Again, this appeals because it’s more in tune with how academics tend to work. I think a lot of the fear that James mentions comes from an assumption that learning technologists and academics speak two totally different and incompatible languages. We don’t, but it can certainly be hard to prove it!

No, I will NOT fix your computer.


And on that fateful day, when my (then) line manager told me that I needed to choose between the path I was already treading – that of the lecturer – and the route I was constantly meandering over to – that of the learning technologist – he told me to choose carefully. ‘If you decide to be a learning technologist,’ he said cautiously, ‘you will walk around with an invisible target on your chest.’

Turns out he was wrong. It’s not a target I have emblazoned across my chest, but a question mark.

Nobody knows what I do. Everyone thinks they know, but they are always, always wrong. It can be best encapsulated by a corridor-based chat I had with a lecturer just before Christmas. It went something like this:

Lecturer: ‘Ah, Bex, I’m glad I bumped into you. I can’t seem to get my computer linked to my printer.’

Me: ‘Ah. You need to have a chat with the IT support team about that.’

Lecturer: ‘But you work in IT!’

And there it is. And there it always is. Today another lecturer popped into the office to tell my colleague that he couldn’t get the projector screen to display anything in a lecture theatre. And when my colleague pointed out that it wasn’t her job to fix it, he was surprised. Because we work in technology, ergo, we fix computers.

So what do we do about it? It’s simple enough: the first word of our job title is LEARNING; the second is technology. Yet that first word seems to go unheard, so as far as anyone knows, we work in learning TECHNOLOGY.

It’s a battle I know many in my role are fighting. The odds are stacked against us, though. Generally (though this is slowly changing), learning technologists are employed on non-teaching, administrative contracts. This is at odds with what we want to do: consult with teaching staff and show them how best to enhance teaching and learning with a technological bent, carry out research into the use of technology in practice and, quite simply, make teaching better. But when I use words such as ‘pedagogy’ (or, God forbid, ‘heutagogy’), or talk about digital literacy, or how to engage students with peer assessment techniques that involve technology, people look at me as if I have just given the Pope a wedgie.

And so I spend my days enrolling staff onto courses on our LMS, showing people how to embed YouTube videos into PowerPoint slides and copying and pasting content from paper-based rubrics into Grademark. So I ask – no, plead – for some guidance.

How do I get rid of this question mark?

Baby, you’re a Star!

Last week, for almost 36 hours, I became an internet celebrity.

I’d just discovered that, after discussions between BBC Worldwide and Cardiff City Council, the Doctor Who Experience (DWE) is going to be closed next summer. As a massive Whovian (literally and metaphorically), I wanted to sob with fury and indignation. I bloody well moved to Cardiff so that I could visit the DWE as often as possible (not strictly true – but I’d be lying if I said that having the DWE in Cardiff didn’t play a tiny part in my overall decision to leave Cornwall), and now they are going to take it away.

Cardiff is the home of Doctor Who. It’s so synonymous with it, even the road signs have Daleks on them!

As a resident of Cardiff, I love watching the Doctor and his erstwhile companions running around ‘London’ or ‘Nevada’ or ‘the Planet of the Merenghi’ and going ‘Oh look – that’s Fast Eddie’s Diner down in the Bay!’ or ‘That’s the Haydn Ellis Building – I went there for a meeting last week!’ I also appreciate that Doctor Who and the DWE bring a lot of income into Cardiff, which may be a capital city, but by other capital-city standards is really no bigger than a large town. It has a relatively small city centre and it’s located in one of the poorest areas of Europe – South Wales – which has some of the highest rates of unemployment and lowest levels of socioeconomic health in Europe: if anything brings extra income into the area, it should be welcomed with open arms – not closed down!

The news of the closure angered my inner fan girl, but the knock-on effect for the local economy worried the grown-up me. So I decided to set up a petition to stop the closure of the DWE. I did this for two reasons:

  • Like a child who had had their sweets taken away, I was genuinely upset at the thought of losing the DWE, and scared of the repercussions for Cardiff.
  • I wanted to do something to take my mind (and maybe the minds of others) off the shit-fest that was the American Presidential Election.

So I set up the petition, shared it on Facebook, expected my mum and 2 of my friends to sign it out of pity, and went about my business.

When I checked my personal email account a few hours later, I was very surprised to see hundreds of messages of support from people who had signed the petition. I had a quick look at the petition itself, and was completely gobsmacked to see that almost 3,000 people had signed. Wales Online had, somehow, got hold of the story too. To this day, I do not know how.

The next day, journalists began to call me. Not big-league journalists, but local papers and websites – and a reporter from BBC Wales Today called saying he wanted to interview me for that evening’s local television news. I’ve still not watched this, as I know I will cringe.

So what’s this got to do with learning technology? Well, there’s good news and bad news.

The Bad News

As a woman, I had to think long and hard about what I was going to do – I assumed that when I pressed the ‘launch petition’ button I would get a fair amount of misogynist abuse, so I prepared myself accordingly. Actually, I very nearly didn’t start the petition because of the possibility of trolling and flaming. As it turns out, I needn’t have worried, and I have received just one abusive comment – the content of which was laughable. But the fact remains that, as a woman, I had to think long and hard about whether putting myself ‘out there’ was a good idea.

This got me to thinking. I wonder how many women have decided against posting something that could make a difference online because of the spectre of online sexism, and of rape and death threats. And now that we appear to be moving into the ‘Age of Endarkenment’, (thanks in no small part to that shit-fest I mentioned earlier), I’m wondering whether this will be the last time I allow my voice to be heard on the Internet.

The Good News

But there’s a positive side to all of this – my petition took seconds to set up and launch, and has travelled the world gathering thousands of digital signatures as it goes. In 5 days, it had garnered over 6,000 signatures – and many, many messages from fans all over the world wanting to talk about how upset they were to hear that the DWE is going to close.  Hundreds of fans messaged me through the petition’s webpage (and many others found and messaged me on Facebook and Twitter), and shared stories of their travels to the UK and their pilgrimage to Cardiff. Many more expressed real disappointment that the travel plans they had been saving up for next year – to travel half way across the world for the sole purpose of visiting the DWE – now lay in ruins. But for 36 brief hours, we were a global community, brought together by a shared passion, wanting to share our experiences and stories of Doctor Who fandom, of visits to the DWE, and our collective grief at the thought of losing something so important to us – and to the beleaguered economy.

A week later and I have almost 6,200 signatures. I’m going to take the petition to the council and send a copy to BBC Worldwide (along with much heartfelt pleading). I didn’t think it would come to this – but clearly there’s a lot of passion out there for Cardiff’s very own Dalek museum! And it’s my duty now to finish what I started. The Internet can be one hell of a force for good – communities are formed in seconds, people rally around a cause, and strength is gathered in numbers. Had I printed out a petition and carted it around Cardiff with me for six days, I’d have been knackered, needed a new pair of shoes, and might have gathered 600 signatures – but not over 6,000!

In Other News…

…so here’s a thought. Maybe when we teach students how to be digitally literate, we should also make a solid effort to teach them about respecting those of us who aren’t white, middle class, heterosexual males. Collaboration is one of the linchpins of digital literacy, so let’s make sure we teach our students that the people we may need to collaborate with ARE ALL THE SAME UNDERNEATH. And let’s teach them that it isn’t cool to break Wheaton’s Law. In fact, let’s add this to the skills agenda. I think, when I look at the way things are shaping up out there, it’s the most sensible thing we can do.