Middle Aged = Digital Native?

It’s been a busy few months with a pretty full social calendar, a dissertation to finish before the end of September, and a landslide of work-based projects, all of which have conspired against me to stop my blogging.

And, to be brutally honest, despite the aforementioned social life outside of work, I’ve been feeling listless, unenthusiastic and devoid of mojo for a couple of months.  And it goes without saying that when you feel as if all the pleasure you once took in all things learning technology-esque has buggered off, it’s pretty much impossible to think of anything to blog about.

However, an office move, a couple of work-based quick wins, an invitation to sit on the advisory board for an international conference and a couple of speaking engagements have all pulled me out of my temporary rut.  The icing on my ‘happy cake’ was provided at a Jisc event yesterday, when a delegate approached me to say how much they enjoyed reading my blog posts. Well, clearly, I have a public to entertain!  Which is why I was rather pleased when, whilst floundering in the bath last night, I had an idea for a post that captured my interest.  So here it is.

At yesterday’s event, one or two common themes cropped up across the day.  One of these was the notion that the digital native did not exist.  As some of you may know, I hate pigeon-holing, and am frustrated at the notion that ‘anyone under 25 is a digital wizard, and anyone over 30 is a digital dinosaur.’  You may as well say that anyone with a shoe size larger than 8 will only eat pepperoni pizza while those with smaller feet will always stick to spaghetti. Nonsense.

Yet it appears that putting every aspect of one’s behaviour, personality, abilities and preferences into clearly labelled boxes is here to stay, so I’ll add my two penn’orth and posit that people of my age (I’m 45) are probably at the BEST age to understand technology and to ‘get’ the concept of digital literacy. And that’s because we know when to use it to enhance what we are doing and when to stick to ‘old school’, non-techy methods. And that’s probably got something to do with the fact that we were there at the start.

When I was 7 I played my first ever video game.  It was 1977, it was Space Invaders, and yes, I was very lucky because my parents were publicans who were fortunate enough to have one of the very first arcade gaming machines in the country in their pub. I remember being fascinated and terrified in equal parts – after opening the doors of the pub for the evening I was happy to watch my dad zapping that curtain of crab-like, pixelated blobs moving down the screen, terrified that a customer might come in before he lost his three lives and the joystick would be passed over to me to finish his turn.  (Before being sent upstairs – not good for business to have a 7 year old running around the public bar demanding ‘gimm and tommics’ at 6.00 in the evening.)

Soon it was the 1980s and the first rudimentary home computers were making their mark. I remember getting a scorchingly average grade in my CSE Computer Studies exam in 1986. Thing is, as much as I liked trying to programme in BASIC, I was more entranced with the Commodore 16 my parents had bought my brother and me for Christmas, and had become obsessed with playing Mercenary and those text-based ‘choose your own adventure’ stories, so had never really practised coding.  But with a Commodore at home and a suite of BBC B models at school, I can say that I was there, right at the start of the digital revolution.  We had a top-loading video recorder at home too.  And a microwave. Ha!  Who are the digital natives NOW then?

Thing is, because I was there when technology started to become mainstream but before it became an established part of daily life, I still have a heap of non-digital skills that I use regularly.  Note taking, when done with a pen and paper, is much more meaningful than simply taking a photo of notes on a whiteboard, or recording a lecture to listen to or watch again later.  That act of putting pen to paper – of having to think about forming the correct shapes in the correct order – commits the word to your brain in a way typing never will.

And books!  Yes, those proper, smelly books, with their cracked and bent spines standing proudly on shelves – nothing can beat that (other than wandering into Waterstones and browsing for an hour). And yet it feels incredibly liberating to go on holiday armed with 16 books on a Kindle that weighs less than a bag of ‘Monster Munch’.

I was there when the Internet started to become popular, so was able to navigate it while it was still constructed of 16 pages. On the way, I learned about how I projected myself online, and how best to manage my growing digital identities (professional / social).  I did this just in time for Twitter, Facebook and LinkedIn to start making their mark.

I know instinctively when technology will make a difference and, vitally, when it won’t. I have learned, as the Internet has grown, how to stay secure. My parents did not post photos of my achievements all over Facebook, so my childhood was very secure and totally private. I played outside, developed social skills by talking to my friends, and I still see the value of disconnecting and living as I did pre-1995, with no mobile phone and no tethering to the digital. Indeed, once a year I purposely stick all my gadgets in a cupboard and take a tent to the middle of nowhere so that I can have one of those ‘digital detoxes’ that seem to be trendy amongst Guardian readers.

I had a Walkman when I was a teenager and a Discman in my 20s so getting an iPod in my 30s felt like normal progression rather than something new. Developments in technology don’t scare or overwhelm me, but neither am I on the eternal hunt for an upgrade to my smartphone or 1,000 more Facebook friends.

So today I will copy and paste this document from my OneDrive to my blog site, have a quick blast on The Witcher III on my PS4 when I get home, check Facebook on my iPad after dinner and go to bed with a cup of tea and a book. Made of paper. With smells.

BYOD (Bring Your Own Disprin)

My job means I get to play around all day with a nice mix of technology and education.  It means that I need to know about lots of emerging and developing technologies, theories, ways of teaching and learning, hardware, software…and so on, and it also means that I need to be (seen as) positive and optimistic about all things digital, which I always try to be.  And yet, when I stumbled across this post on the JISC RSC Wales blog yesterday, it made me feel as if a great weight had been lifted from my shoulders:



For a number of years now, teaching and learning with mobile devices – now referred to as Bring Your Own Device (BYOD, because education needs acronyms) – has been a constant theme.  It has also been something I have wilfully ignored, because BYOD has always felt to me like a massive and incredibly knotty topic, as well as a way of working that sounds both time-consuming and tricky to manage.  Past experience has taught me that the IT infrastructure (well, the internet) in most institutions isn’t quite ‘fat’ enough or fast enough to deal with the volume of data pinging back and forth. Teachers have to find a way to get students with a massive range of skill levels to do the same thing on a variety of devices running a variety of operating systems.  And this opens up a veritable shed full of possible problems.

What if, for example, using an all singing and dancing app sounds fine in theory…but it isn’t available on all operating systems (Microsoft, I’m looking at you)?

What if the WiFi signal is weak or keeps dropping out? What if your students are having trouble connecting their devices to the Internet? I did a demo of a browser-based quiz (using Kahoot) in a couple of PGCE sessions recently, thinking that bypassing branded apps and sticking to the one thing all mobile devices have – the internet – would keep things quick and simple.  What I thought would take no more than 10 minutes took closer to 30 because, despite the wealth of mobile devices present in the classroom, half the class just couldn’t get their devices to connect to the WiFi. We got there in the end, but were I being observed I would have received a right talking-to at the end of the session.

It’s the easiest thing in the world to assume that everyone can use every aspect of their device, from Internet settings to film-editing apps, but usually the truth is quite different.  Owners of the smartest of gadgets will likely admit to only using them for phone calls / text messages / Facebook / taking photos. So assuming that all students can use their gadgets to the full is blinkered, naive, and possibly arrogant. And assuming that all teachers have more than a working knowledge of how all mobile devices work is really asking the impossible.  Yet surely, for an activity to work, that has to be the case, doesn’t it?

And what if there are more students than devices? It may be good to have a spare iPad available to give to someone without their own tablet…but if they have no tablet, they probably have even less idea than those mentioned above of how to use the shiny and slightly scary tablet the lecturer has proudly put in front of them. And do students want to be picked out by their tutor and peers, for whatever reason, as ‘the one who still hasn’t got a smartphone’?

Equality of access is about more than ensuring that everyone has a device in front of them. For students with physical or special learning needs, deploying the right devices and software is vital…and more complicated.  There can be accessibility issues beyond connection speed too.  ‘Blackboard’ can be accessed through a browser, but it is an incredibly frustrating site to use even on a device with a screen as large as an iPad’s, so it must be hellish on a BlackBerry.  It can also be accessed via an app, but only on an Apple or Android-powered device, so it is no good for people using Microsoft devices. And bingo!  We have an inequality-of-service issue.

So I completely and utterly understand why teachers don’t bother. And I know that I should slap on my positive face and try to convince them that this is how (someone) has decided our students will learn BEST from now on, so get on board because you don’t want to get a reputation as an educational dinosaur. And if the shed full of problems wasn’t there, I would.

I don’t want to be seen as a Luddite, and there are some common sense approaches to BYOD mentioned in the following articles, so I’ll finish up by linking to these, thereby leaving on a more positive note.

UFI Charitable Trust: Primer on Bring Your Own Device – 7 reasons to leave them to their own devices (advocates letting students use their own devices in ways that suit them as a means of learning, rather than trying to deliver lessons with prescriptive ‘you need a mobile device, this app and a working knowledge of network troubleshooting to do this’ content).

Donald Clark: Keep on taking the tablets – 7 reasons why this is lousy advice (there must be something magical about the number 7!  Quite liking the author’s conclusion:  “I’m not against the use of tablets in schools, I just think that turning it into a ‘movement’ is a mistake and that too many of these projects are poorly planned, badly procured and lack proper evaluation.”)