Over the past few days there have been a whole bunch of interesting tweets relating to tricks that help people write better. Many of these have the #GetYourManuscriptOut hashtag that was set up by Raul Pacheco-Vega in his blogpost about the problems with finishing off papers http://www.raulpacheco.org/2014/07/getyourmanuscriptout-or-how-to-fight-procrastination-in-academic-writing-through-crowdsourcing/.
I guess all this attention on academic writing is because there’s a lot of writing going on at this time of year: many PhD students are currently finishing up their theses, as are many MSc students, academics are trying to get their papers submitted, and those of us working in HCI are already focused on the CHI deadline on 22nd September. As a result, I’ve spent most of today reading drafts, making comments on them, and pointing my students to various online resources that provide useful hints and tips about reverse-outlining
and how to write good paragraphs https://medium.com/advice-and-help-in-authoring-a-phd-or-non-fiction/how-to-write-paragraphs-80781e2f3054.
So what are the secrets of writing a good paper? Rachel Cayley writes about the importance of extensive revision of a manuscript http://explorationsofstyle.com/2014/06/11/committing-to-extensive-revision-from-the-archives/. But how much revision is enough? How many drafts does it take to write a paper that’s good enough to be submitted? I discussed exactly this with one of my post-docs this summer. We’d made a rather last-minute decision to submit a paper to a conference and had just one week until the deadline. In that week the paper went from version 1 to version 12. After we’d submitted she asked me “Is it normal to go through that many revisions?” My immediate response was “Yes - of course!” but it prompted me to take a quick look through my folders to see how many revisions my papers usually go through before submission. Very quickly it became obvious that my papers tend to go through at least 10 versions before submission, and those that get accepted have (perhaps unsurprisingly) often had even more. It turns out that in this case 12 was enough and our paper was accepted. Of course we hadn’t been writing it all from scratch. We had various files which contained the method and analysis, and we’d spent many hours talking about the results. What we did do in that week though was quickly identify a narrative for the paper, write up the story, and iterate over the details of how best to present the findings. So, if you have done a lot of the thinking already, then in a week of concentrated effort you can iterate enough to get a paper from zero to submission.
I was surprised today by another colleague when I sat down to read version 1 of the paper we had decided to write for CHI2015. The reason I was surprised is that although it was version 1 it was so polished. It was so good it even prompted me to write to her asking how many versions it had really gone through before she sent it to me. “Just the one” she said! This isn’t the whole truth though: as with the paper I wrote about above, we’d already done a lot of work on this one before she wrote the first word. We had written a fantasy abstract (an abstract for a paper that you’ve not even done the research for yet) back in January, and then after doing the study, we’d written an abstract for a talk (in May), and written and given the presentation (in July and August), so actually writing the first version of the paper was easy. The draft of the paper was started just two weeks ago with a file of notes that outlined the structure of the paper, the literature that needed to be included, and the main contribution. It took just a week to write. My co-author wrote “I think an awful lot of thinking has gone into this paper before I sat down to write it - we’ve spoken many times. […] So we’ve had a head start with this one and I’m glad we didn’t need 14 iterations to get it ready…”.
But not all papers can be written so quickly. Another paper I’m currently working on for CHI2015 is on version 9. We’ve already done a lot of iteration on it. Version 1 was created in mid June and mainly consisted of cutting and pasting text that had already been written elsewhere into the correct format. There it sat for two weeks before more work was done on it. Through versions 2 to 6 the story changed: the paper got chopped from a full 10-page paper to a short 4-page note and then back up again to a 10-page paper. During this process we changed our minds about exactly which studies were going to be included, but by mid August we had settled on the story. The next two weeks saw versions 7 and 8, in which we refined the write-up of the details of the studies and worked on making the story flow. There’ll be another version or two before the deadline, I’m sure. It’s clearly taken a lot longer than a week to write this one. A big part of the reason is that doing the writing has been part of the process of thinking about what our results mean. (See http://explorationsofstyle.com/2014/06/04/using-writing-to-clarify-your-own-thinking-from-the-archives/ for Rachel Cayley’s blog post about using writing to clarify your thinking.)
Having a deadline is always a good way to focus the mind and find the motivation to get something down on paper. So with the CHI deadline now three weeks away, is there enough time to get a paper together if you haven’t already got to version 8?
I’ve got one that reached version 6 today but we’ve had to strip out all the text and engage in some reverse outlining to try to identify one clear narrative for the paper. It’s been a painful day as a result but I’m still hopeful that we can craft it into a good paper before the deadline - there’s still a way to go but it seemed like we made a lot of progress today. With 3 weeks until the deadline I think this one stands a good chance. I’m even tempted to start another one - a paper I’ve been meaning to start for a year. With all the thinking that’s gone into it already perhaps it would only take a week to get into shape? The concrete deadline might be enough to make me put finger to keyboard.
And then there’s another that we’re really excited about but we’re not even halfway through the data collection! If all goes to plan we’ll have finished running the study next week. It’s still tempting to think we might be able to do that one too. In my more rational moments it’s obvious that this will be hard to pull off. We won’t be able to write a polished draft in just a week as we haven’t had the luxury of months of thinking about the results and how best to communicate them. Nor will there be time for 10 to 12 versions of the paper to help us craft that story. Perhaps we should line this one up for September 2015?
By Charlene Jennett, University College London and Anna L Cox, University College London
It’s irritating when you try to talk to someone playing a videogame. You tell them dinner is ready and they completely ignore you. Their eyes are glued to the screen, their fingers frantically pushing buttons. We find it rude and it has led to many an argument in the family home.
But research suggests that your children, partner or parent may not be simply ignoring you when they’re plugged in to World of Warcraft – they may be experiencing something called “inattentional blindness”. This is when a person chooses to focus on one thing and, as a result, is blind to everything else around them.
Psychologists say that our lives would be pretty chaotic if we didn’t selectively attend to things in our environment in this way. Every noise, every sight, every smell, would distract us from our goals. We would simply feel overwhelmed with incoming information and we wouldn’t be able to get anything done.
This is, in fact, a common experience of people who are diagnosed with attention deficit disorder. They feel overwhelmed because they are unable to focus their attention. People selectively attend to some things over others all the time to avoid this feeling. It’s a natural process.
But there are some activities that absorb your attention more than others. When sports players or musicians feel extremely focused on what they are doing, they might say they are “in the zone” or “in full flow”. Videogame players describe themselves as feeling “immersed” when they’re focusing. They are fully engaged in a new reality, as though submerged in water.
In our research at the UCL Interaction Centre, we have been investigating immersion for several years now, following on from studies on inattentional blindness carried out by psychologists in the 1950s and 1960s. We ask our participants to focus on one source of information while ignoring others. But where older studies had participants staring at screens or listening to sounds, we asked ours to play a video game.
In one study we asked participants to play a driving game until they were told to stop but we didn’t tell them how they would be told. Part way through the game, a small pop-up box appeared on the bottom right of the screen saying “End of experiment – click here.” Participants who performed well in the game were slower to click on the pop-up box. These were the players who rated themselves as highly immersed in our immersive experience questionnaire.
In another study we asked participants to play a spaceship game. They were told that several distracting sounds would be broadcast into the room but that they should ignore them and continue playing. Some of the sounds related to the game, such as a voice saying “space games are boring” while others were person-relevant, such as a voice saying “London is boring”. Others, such as a voice saying “collecting stamps is boring”, were simply irrelevant. At the end of the study, participants were asked to remember as many of the distracting sounds as possible. We found that participants who performed well in the game recalled fewer auditory distracters, particularly the irrelevant ones.
Our findings share quite a few similarities with traditional psychology experiments. People are less aware of visual distractions when they are highly focused on a videogame. They are less aware of auditory distractions too, with only the most relevant breaking through to their conscious attention.
Feedback and immersion
There is another key difference between our work and traditional psychology experiments: unlike watching a screen, you get feedback when you play a videogame. We found that a person’s immersive experience, and the extent to which they were less aware of distracters, is related to the feedback they received. Positive feedback and positive perceptions of performance are essential for keeping a person’s attention during gaming.
But even when an indicator of performance is clearly unrelated to their true performance, players are unable to prevent themselves from interpreting it as meaningful. In another version of our spaceship experiment, we rigged the game so that no matter how well the player controlled the spaceship, they would either score really well or score really badly. Despite it being obviously rigged we found the same results. Participants who scored high in the game rated themselves as more immersed and recalled fewer auditory distractions. What seems to be important for immersion then is not that players actually perform well, but that they are able to perceive themselves as performing well.
These results reveal the powerful impact that feedback has on people’s immersive experiences and their motivation to continue with an activity. Receiving regular feedback that you are doing well is pleasurable. It might even be viewed as addictive in some ways, as it motivates the player to keep coming back for more.
This in part helps us explain why the gamer in your life ignores you when you tell them it’s dinnertime. Game designers have clocked that providing feedback as part of the game encourages us to keep on playing. This same feedback encourages inattentional blindness in the player. And given that children are more prone to inattentional blindness than adults it’s a wonder that they hear anything you say.
Charlene Jennett was supported by an EPSRC DTA studentship.
Anna L Cox receives funding from the EPSRC and NIHR.
This article was originally published on The Conversation. Read the original article.
By Emily Collins, University College London and Anna L Cox, University College London
Videogames have had a particularly bad rap lately, not least after a UK coroner suggested a link between Call of Duty and teenage suicide. But recent evidence suggests that gaming can be good for us and, in particular, can help us unwind after a stressful day at work.
Many of us spend much of our working days immersed in technology. A smartphone or tablet comes with many advantages, such as flexible working, but the spread of work-based technology can add to stress too. Many people complain about the constant pressure to reply to e-mails as soon as they are received, even if that’s at 10 o’clock at night.
And even if we do switch off from work, technology is often at the centre of our home life. Watching a favourite show, keeping up to date with social media or reading online blogs are a vital component of many people’s evening activities. And while videogames were once seen as the preserve of teenagers, they have grown exponentially in popularity in recent years, largely because players no longer need a specialist console to enjoy them.
Roughly 1 out of every 4 UK adults, across ages and genders, now plays some kind of digital game at least once a week. Despite this rising popularity, digital games rarely make headlines for anything positive. Before the most recent controversy about suicide, they had been blamed for a variety of negative effects, including causing attention deficit disorder and encouraging acts of violence and antisocial behaviour. At the very least, they are perceived to be a bit of a waste of time, with no real, external value.
But our recent evidence suggests that not only can digital games be good for you, they may also be beneficial for your work life. Post-work recovery is a vital part of feeling prepared for the next day because it’s when you replenish the mental resources you used up during the daily grind. And it’s beginning to look like the type of activity you do during this period is important too.
Activities such as playing team sports and socialising have been found to benefit recovery but for many, this requires dedication, time and resources that simply aren’t available. So turning to activities that can be performed for brief periods of time in any location could be an ideal solution. This is exactly where digital games fit in.
Our work has built on previous studies that found that those who play digital games are better recovered than those who don’t. We were interested in whether the type of game is important; the experience of playing casual games such as 2048 is likely to be very different to that of playing Call of Duty, for example. We asked participants to estimate how much time they spent playing a variety of genres of digital games in a week and then got them to complete a questionnaire to assess how much interference they experienced from work while at home and how well they recovered after gaming. They were also asked about the social support they receive from others, both on and offline.
Those who played digital games were more recovered and experienced less work-home interference than those who didn’t. And the more time people spent playing, the more recovered they felt. First-person shooter games were particularly beneficial.
While no one genre helped with every aspect of recovery, many were good for one or another. Massive multiplayer online games, which generally involve completing challenges of increasing difficulty, were good for giving players a sense of mastery. Action games, on the other hand, were related to relaxation.
We also found that out of those who claimed to have formed relationships as a result of the digital games they played, the extent of recovery was influenced by the amount of social support they had online. This suggests that digital games are effective in helping with recovery from work at least in part because they provide an opportunity to socialise.
This particular study can’t establish conclusively that the games the participants played were directly responsible for the improved recovery. It may just be that those who play digital games simply have more time available to recover. But the findings do go some way to suggesting that this causal link is possible. We are currently in the process of running more controlled studies, testing whether digital games directly improve recovery and, if so, whether the kind of game or the level of immersion are important factors.
Emily Collins receives funding from the Engineering and Physical Sciences Research Council.
Anna L Cox receives funding from the EPSRC and NIHR.
This article was originally published on The Conversation. Read the original article.
By Anna L Cox, University College London and Duncan Brumby, University College London
A new app is about to come on the market with promises to dramatically increase the speed at which you read. Spritz is a text streaming technology that allows you to read a sentence, one word at a time. Each word is shown for only a brief flash, in the same place on a screen, before the next appears.
Up to 1,000 words can be shown every minute, which would allow you to race through an entire novel in just 90 minutes. This could revolutionise the way we read, particularly on small-screen devices, such as smartphones and smartwatches.
How it works
When we read, our eyes normally make a series of brief pauses to extract information before moving on. These are referred to as eye movement fixations and saccades. It is only during fixations that the eye processes what is before it.
Spritz allows you to read faster by presenting words in the same place on the screen. This means that the eye can remain in the same spot and does not have to waste time making lengthy saccades to get to the next word.
When we read we have to recognise words. Spritz supports this by focusing our attention on the most informative part of a word so that we recognise it quickly – the word’s optimal recognition position. This is done by highlighting the letter that should be fixated in a different colour to the rest. Doing this grabs our attention and makes us fixate at that point.
Spritz also takes into account some of the factors that explain why we don’t fixate on each word for the same length of time. Rather than reading each letter in a word, our visual system is able to process common words as whole units, using the context of the text and the overall shape of the word as clues to aid recognition. This means that words that are very similar to each other in shape, like “bed” and “fed”, will require more time to process than words that are more distinct, like “decide”. Spritz therefore slows down when displaying very short words and speeds up for words that are four to seven letters long.
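The timing heuristics described above can be sketched in a few lines of code. This is only an illustration of the general idea, assuming a fixed base rate per word; the specific multipliers are my own guesses, not Spritz’s actual parameters:

```python
import re

def rsvp_delays(text, wpm=300):
    """Yield (word, delay-in-seconds) pairs using simple Spritz-like heuristics."""
    base = 60.0 / wpm  # seconds per word at the nominal reading rate
    for word in text.split():
        delay = base
        letters = re.sub(r"\W", "", word)  # strip punctuation before measuring length
        if len(letters) <= 3:
            delay *= 1.3   # linger on very short words, which are harder to tell apart
        elif len(letters) <= 7:
            delay *= 0.9   # mid-length words are recognised quickly as whole units
        if word.endswith((".", "!", "?")):
            delay *= 2.0   # extra pause at the end of a sentence
        yield word, delay
```

A real display loop would show each word at its optimal recognition position and sleep for the computed delay before flashing the next one.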
Reading or skimming?
Reading is more than just moving your eyes across a page. It is a cognitive activity. Our brain is busy processing the meaning of the words that have been read. For demanding content or tasks, this cognitive processing can take time. We naturally compensate for this by lingering on some words for longer than others to give our brains time to process the meaning of what we are reading.
The Spritz website hints that some in-house testing has been done and claims that people are able to perform well when tested on their comprehension and recall of a text they have just “spritzed”.
But there’s a potential problem with treating all words equally and pushing everything through a high-speed funnel. In experiments using rapid serial visual presentation (RSVP), participants have demonstrated an effect called attentional blink: they are more likely to miss something important because their attention is still occupied by something they have just experienced. For example, if you’ve just read an emotionally charged word such as “coffin” or “murder”, you’re more likely to miss something important if it occurs very soon afterwards.
The current system doesn’t appear to offer any deeper adaptation to the content that is being presented. It doesn’t slow down if a main character dies suddenly, or if a complex idea is presented in a sentence. This is not so surprising, as it would be an extremely difficult technological problem to solve.
Spritz does slow down at the end of sentences to provide the reader with a little extra time to process the sentence that has just been read. But this might not be sufficient. In fact, readers frequently re-read sections of text that did not make complete sense on the first pass – something that is not possible with this system.
Spritz users might find themselves missing important information that they would have caught with regular reading as a result.
So while this technology could open up fantastic opportunities for reading small snippets of text, such as a tweet, on a very small screen, there’s probably some way to go in the development of the technology before you’d want to read War and Peace on a smart watch.
Anna L Cox receives funding from the EPSRC.
Duncan Brumby, University College London, receives funding from the EPSRC.
This article was originally published on The Conversation. Read the original article.
Reading a blog post by Think Productive’s founder Graham Allcott http://www.thinkproductive.co.uk/end-of-the-month-the-lemon/ has got me thinking about where we started with the Digital Epiphanies project http://www.digitalepiphanies.org/. In Graham’s blogpost he talks about epiphanies he’s experienced over the past month as a result of some personal challenges, and the changes he intends to make to both work and non-work aspects of his life.
This has reminded me of how our ideas on the Digital Epiphanies project were inspired by Jane McGonigal’s TED talk http://www.ted.com/talks/jane_mcgonigal_the_game_that_can_give_you_10_extra_years_of_life.html where she talks about the top 5 regrets of the dying http://www.theguardian.com/lifeandstyle/2012/feb/01/top-five-regrets-of-the-dying:
1. I wish I’d had the courage to live a life true to myself, not the life others expected of me.
2. I wish I hadn’t worked so hard.
3. I wish I’d had the courage to express my feelings.
4. I wish I had stayed in touch with my friends.
5. I wish that I had let myself be happier.
She also talks about the experience of post-traumatic growth. This is a positive change that occurs in response to a highly challenging life experience. Instead of being paralysed by stress when faced with difficulties, those who experience post-traumatic growth reflect on their priorities and change their lives. It seems that the traumatic event motivates them to live a life with fewer regrets.
We wondered whether we could use digital technologies to give people the opportunity to reflect upon the way they live their lives and consider whether their actions are in line with their own values. Could we facilitate post-traumatic growth without the trauma?
Personal informatics tools are a range of technologies that enable people to track aspects of their lives. We’ve been investigating whether personal informatics tools can give people digital epiphanies (moments of insight about their digital habits). Given our focus on work-life balance, we’ve been thinking a lot about work-related digital activities that seem to take up lots of our time and invade our non-work time - the most obvious candidate being email. We’ve also considered non-work activities that people often engage in when they feel that they should be working, such as social networking.

Many of the existing personal informatics tools that track how you spend your time on your digital technologies include some measure of productivity, i.e. they try to make explicit whether you’re spending your time effectively. They do this by classifying activities as productive (e.g. working on a word document), neutral (the default setting for email and scheduling activities) or highly distracting (social networking sites). The implication is of course that we should minimise our time on email and social networking sites so that we can spend our time doing more “real work”. This sounds like good advice, as long as we maintain firm boundaries between work and non-work and don’t let work grow to take up more and more of our time. And that’s not always easy, particularly with smartphones beeping with every email that lands in our inboxes. It’s worth remembering, I think, that one of the top 5 regrets of the dying was “I wish I hadn’t worked so hard”.
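To make the classification idea concrete, here is a minimal sketch of how such a tool might turn tracked time into a productivity measure. The category names and weights are illustrative assumptions of mine, not the configuration of any real tool:

```python
# Hypothetical activity categories and weights, loosely modelled on the
# productive / neutral / highly-distracting scheme described above.
PRODUCTIVITY = {
    "word_processor": 2,    # productive
    "email": 0,             # neutral (the common default)
    "calendar": 0,          # neutral
    "social_network": -2,   # highly distracting
}

def productivity_score(minutes_by_activity):
    """Time-weighted average score across tracked activities, from -2 to +2."""
    total = sum(minutes_by_activity.values())
    if total == 0:
        return 0.0
    weighted = sum(PRODUCTIVITY.get(activity, 0) * minutes
                   for activity, minutes in minutes_by_activity.items())
    return weighted / total
```

On this scheme, an hour in a word processor followed by an hour on a social network would average out to zero - which is exactly the kind of summary that invites reflection on whether the balance matches your values.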
At the start of this post I mentioned Graham’s blogpost in which he argues that “disruptive times in your life are where you see the bigger picture from. Disruptive doesn’t mean ‘bad’ it just means ‘different from the everyday’.” This has made me think. For technology to really create digital epiphanies that can make us think about whether our current habits and behaviours are in line with our values, perhaps we need more than the data collection and opportunities for reflection that personal informatics tools can provide. Perhaps we need disruption to our everyday activities to have a real epiphany. Sometimes this might be a nagging feeling that things are not quite right. Or a complaint from a family member that we’re working too much. Could a technology provide the disruption?
Switching off, going dark, saying no. These are all phrases that relate to good advice about how to get things done or, more to the point, how to avoid being distracted and concentrate on the things you want or need to work on. This week I’m trying to carve out time by switching off email and avoiding other forms of digital communication so I can concentrate on the huge pile of tasks I have to get through. The thing is that I’m finding it really difficult. I mean *really* difficult.
I started off by deciding that I was going to take my own advice and try a once-a-day email strategy (Bradley, Brumby, Cox and Bird (2013) How to Manage Your Inbox: Is a Once a Day Strategy Best?). I even scheduled it in my day. Inspired by a blog post by Think Productive’s Graham Allcott (http://www.thinkproductive.co.uk/the-lemon-routine-rhythm/) I decided to dedicate the morning to important tasks, check email at lunchtime, and then use the afternoons for more communal activities such as meetings.
Just 24 hours in, it all went wrong when I had to check my email first thing as I was expecting to receive a file from a colleague which I needed to work on. As the 47 emails piled into my inbox I found it impossible to ignore them.
I’d successfully ignored them the previous night when doing the same thing. That time I’d used the snooze function in my GTD Outlook add-in, which enables you to snooze a message until the following day. But this time the snooze button didn’t seem appropriate. I didn’t want every email from yesterday to disappear until tomorrow. So they sat there in my inbox, looking at me, and it was all of 3 minutes before I started going through them (I’m addicted to having my inbox at zero). 90 minutes later I had answered emails, added things to my to-do list, and deleted a whole bunch. What I hadn’t done was work on the document I had been waiting for!!
In recent years there has been a profound shift in the way that people consume television programmes. We’re no longer constrained to 4 or 5 channels on the single TV in the living room. With the rise of internet TV we can use a mobile device, such as a laptop or tablet computer, to watch our favourite television show whenever we like. But look around the average living room and what are people doing? They’re no longer glued to the box in the corner but are shifting their attention across multiple devices: keeping up with others through email and social networking sites, as well as playing games and looking up information (Müller, Gove, & Webb, 2012; Stawarz, Cox, Bird, & Benedyk, 2013).
Encouraged by a friend who loves The X Factor and her husband who works for a (competitor) UK TV channel, I subscribed to The X Factor app on Saturday (for research purposes!!!). There’s loads of video content on the app which I’ve not looked at. The bit that I played around with, the 5th judge, only appears when the show is live. As each act does their audition the app offers you the chance to vote. If you log in via facebook then you can also see whether your friends voted ‘yes’ or ‘no’.
But I guess the truth is that I don’t really watch The X Factor; in fact I don’t often really watch any TV. More frequently you’ll find that it’s on in the background whilst I sit on the sofa and do my emails. But what I discovered yesterday was that this app works really hard to stop me doing this by demanding my attention. It talks to me and tells me what the majority of 5th judges have decided, thus prompting me to interact with it. And just when the adverts come on and I think I can dedicate my attention to my email, it asks me questions and makes a ticking-clock noise that suggests I only have a limited time in which to give my response - who can resist that?? Companion apps like this are really successful in keeping the viewer’s attention on the TV programme and preventing them from switching to their email or social networking sites.
This week I also came across a startup called CanFocus http://www.canfocus.com/. They’ve designed a button that switches all your email, phone and IM statuses to “do not disturb” so that you can focus on work and “become a productivity superhero”. There’s been many a situation where I’ve sat down with the idea of watching something, only to be distracted by a desire to engage with a digital device. I think I need one of these to help me resist temptations to engage in work when I’m supposed to be doing non-work!
I’m on sabbatical for the next term. Many academics take this opportunity to relocate to another university, often one in another country, for a change of scenery and an opportunity to recharge their batteries. This isn’t something that’s open to me given that I have two young children and a partner with a fulltime job that doesn’t provide the same opportunity for a paid sabbatical. As a result, I have a term without teaching and (most of my) admin duties. It seems challenging to make the most of this opportunity so that I look back on this time and feel that it was really different in some way from any other term. Given that a change is as good as a rest, how am I going to change things without changing things in my personal life too much?
In order to help me decide what to do I talked to some friends and colleagues and also posted the question on facebook. I got a bunch of interesting suggestions: https://www.facebook.com/Anna.L.Cox/posts/10151893679466189
So what did I settle on?
- Blogging (suggested by Charlene Jennett) - hence this post. I’m going to write something every week.
- I have some short trips planned to give some talks.
- Maria Kutar suggested life-logging, which, given my current interest in personal informatics tools, is also something I’m going to add to my list.
- Paul Marshal’s suggestion of juggling might morph into ‘juggling work-life balance in new ways’ - probably not quite what he had in mind, but I’m going to try out a few things, including Steve Payne’s suggestion of time off from facebook and email!
- Jo Iacovides suggested playing video games.
- And Lisa Tweedie and Ann Blandford suggested voluntary work.
Best get busy then!
It seems as though every day we are being warned that once something is on the internet it never really disappears. Although I can find articles written about me 9 years ago http://www.timeshighereducation.co.uk/189972.article unfortunately I can’t find my new webpages (and nor can anyone else!) because the powers that be have changed a single character in the URL! They’ve decided that underscores are out of fashion and hyphens are the new black! So my webpage is now at http://www.ucl.ac.uk/uclic/people/a-cox. The cost of this tiny change? My business cards need re-printing, URLs on posters being displayed at CHI2013 won’t work, and my colleagues and I have had to go through all our webpages updating the links.
My OH bought me a coffee this morning using his chip-and-PIN card. The waiter handed him the machine, he entered his PIN, pressed enter, and handed the machine back to the waiter. What’s wrong with this picture? Well, the machine hadn’t been asking for the PIN; it was asking how much of a tip he would like to give. So now, not only did we have a four-figure bill for a couple of coffees, he’d also told the waiter the PIN for his card!
The waiter said that my OH wasn’t even the first person to do this. Last week, someone had made the same error, but had also gone as far as entering their PIN a second time to actually pay. The waiter had chased the customer into the street to explain that he’d paid over £2,000 for his lunch!