Episode 40 – The Ethics of Video Games Research (with Ashley Guajardo and Ann Johnson)

[Release Date: April 12, 2022] There’s lots of research going on about video games. Some of it involves product testing, like VR games or the effectiveness of interfaces. Some of it looks at the impact of gaming on players for things like violence, sexism, and addiction. What ethical concerns come into play when doing research like this on live human beings like you and me?

SHOW TRANSCRIPT


00:11:37.620 –> 00:11:48.390
Shlomo Sher: Today is our first episode with two guests, our first four-person episode. We’re going to talk about essentially the ethics of video game research, a topic…

126
00:11:49.050 –> 00:12:07.290
Shlomo Sher: That is incredibly interesting and one that we’ve never remotely touched on, and I think most people know very, very little about. This is one of those things that, I think, takes place behind the scenes as far as most of us who play video games are concerned.

127
00:12:08.610 –> 00:12:09.060
Shlomo Sher: So.

128
00:12:10.200 –> 00:12:11.520
Shlomo Sher: Ashley, can I call you Ashley?

129
00:12:11.820 –> 00:12:12.840
Professor_Ashley: Yes, that’s perfect.

130
00:12:13.110 –> 00:12:29.040
Shlomo Sher: Great. So Ashley is an associate professor… well, let’s do our official introductions. Ashley Guajardo is an associate professor of Entertainment Arts and Engineering at the University of Utah. We love the University of Utah. Can you tell how many guests we’ve had from the…

131
00:12:29.040 –> 00:12:33.000
Shlomo Sher: …University of Utah? You guys are awesome, you do such cool things.

132
00:12:33.120 –> 00:12:34.020
Professor_Ashley: Four people.

133
00:12:34.440 –> 00:12:35.670
Shlomo Sher: And you’re party people.

134
00:12:37.020 –> 00:12:46.680
Shlomo Sher: …where she teaches game design and games user research. When she isn’t researching Twitch streamers or Twitch streaming herself, she serves on the institutional review board…

135
00:12:47.160 –> 00:12:57.660
Shlomo Sher: …which is what we’re here for. And she’s currently co-directing the #GamesUR Summit 2022, the largest conference for games user researchers.

136
00:12:59.070 –> 00:13:00.660
Shlomo Sher: And and.

137
00:13:02.100 –> 00:13:09.840
Shlomo Sher: Ann, when people ask what your dream job would be, your answer is always the same: making one-of-a-kind clothes. And she’s here to do that for you.

138
00:13:11.730 –> 00:13:20.130
Shlomo Sher: Maybe not, but she loves the creativity and artistry that pours out of her when designing and sewing clothes. And yet she’s a scientist at heart, and by profession.

139
00:13:20.670 –> 00:13:29.910
Shlomo Sher: She has degrees in biology, chemistry, and public health. Wow, I gotta say, I’m pretty impressed by that.

140
00:13:30.870 –> 00:13:31.170
me too.

141
00:13:32.220 –> 00:13:49.740
Shlomo Sher: Her career is making sure people who volunteer to be in research studies are as safe and as respected as possible. She’s the director of the University of Utah’s IRB and Human Research Protection Program. Wow, you guys sound amazing. Ashley and Ann, welcome to the show.

142
00:13:52.830 –> 00:14:06.900
Shlomo Sher: Okay, so we are here to talk about IRBs. IRB stands for institutional review board, but that still means absolutely nothing to the majority of our listeners, so let’s just start with this: what is an IRB?

143
00:14:10.230 –> 00:14:11.850
Professor_Ashley: That sounds like a great question for Ann.

149
00:14:23.910 –> 00:14:35.070
Ann Johnson: The soft answer for what an institutional review board is: we’re an ethics committee, and we review research for ethical issues. And it’s not just…

150
00:14:35.520 –> 00:14:44.490
Ann Johnson: …any research; it’s actually only the research that’s done on people, on living human beings. Clearly there’s a lot of research out there in the world that’s done on a lot of different things.

151
00:14:44.760 –> 00:15:00.930
Ann Johnson: What the IRB cares about is the research that’s done on people, and so we look at the ethical constructs, the ethical practices of research, and we make sure that the research is as safe and as fair as possible when people participate.

152
00:15:02.010 –> 00:15:03.780
Shlomo Sher: Fair sounds interesting.

153
00:15:04.920 –> 00:15:14.760
A Ashcraft: I frequently ask my students, when I have brand-new students, to write me a short essay about what fair means, and of course it’s always different.

154
00:15:16.560 –> 00:15:21.270
A Ashcraft: Yeah, you know, everybody knows what fair is, but nobody can quite define it.

155
00:15:23.880 –> 00:15:28.020
A Ashcraft: In the context of this, in the context of these IRBs…

156
00:15:30.090 –> 00:15:31.590
A Ashcraft: …what is fair?

157
00:15:33.330 –> 00:15:38.550
Ann Johnson: that’s a really great question and, just as you said, the concept of fair is always changing.

158
00:15:39.210 –> 00:15:52.140
Ann Johnson: And we’re all different people: things I think are fair, you might think are unfair, so we’re living in a subjective reality. But in research, we really think about fair in terms of who is…

159
00:15:53.250 –> 00:15:59.880
Ann Johnson: …who is bearing the burden of the research, and who is getting the benefit of the research. So just, for example…

160
00:16:00.300 –> 00:16:11.460
Ann Johnson: …in history we’ve done a lot of terrible things to people in the name of research, and they bore the burden of that research so the rest of us could benefit. That’s not really fair, and so…

161
00:16:12.240 –> 00:16:20.970
Ann Johnson: …we are always trying, as an IRB, to make sure that the people who are bearing the weight of the research, who are being tested on, who are contributing of themselves…

162
00:16:21.510 –> 00:16:31.050
Ann Johnson: are also potentially the people that are going to benefit and that we’re not targeting people unfairly to be in research, but that we’re also not excluding people unfairly.

163
00:16:31.380 –> 00:16:37.530
Ann Johnson: …when it comes to being in research. That is a balance, and it is hard to find, but that is the goal and that’s the ideal.

164
00:16:38.160 –> 00:16:44.070
Shlomo Sher: Can you give us an example of this? So let’s say, okay, I want to do a research project involving video games.

190
00:19:05.820 –> 00:19:13.650
Shlomo Sher: Okay, so what does that mean? So I’m doing a research project on video games; what kinds of things do I need to worry about?

191
00:19:13.860 –> 00:19:22.680
Shlomo Sher: Let’s talk about safety first and then fairness. I’m assuming fairness is going to be a lot more complicated than safety, but what do I know? I’ve never actually…

192
00:19:22.710 –> 00:19:24.150
Shlomo Sher: …had anything go through an IRB.

193
00:19:25.530 –> 00:19:27.300
Professor_Ashley: So i’m i’m sorry, no.

194
00:19:28.290 –> 00:19:29.820
Ann Johnson: No, go ahead, Ashley, you started.

195
00:19:30.270 –> 00:19:31.410
Professor_Ashley: I was going to give.

196
00:19:32.880 –> 00:19:35.700
Professor_Ashley: The insight from someone who studies games.

197
00:19:37.050 –> 00:19:49.950
Professor_Ashley: …specifically first, and then we can dive into the more official IRB answer. In terms of safety, we have safety designations at the institutional review board that…

198
00:19:51.210 –> 00:20:00.150
Professor_Ashley: …determine whether or not the study is minimal risk or greater than minimal risk. For the most part, video game studies tend to be minimal risk, meaning that we’re not…

199
00:20:00.630 –> 00:20:06.120
Professor_Ashley: …poking people with needles, we’re not injecting them with stuff to see what happens.

200
00:20:06.810 –> 00:20:15.510
Professor_Ashley: We’re not exposing them to things that they wouldn’t encounter in their daily life. Most of the time when we are researching games, we are…

201
00:20:15.960 –> 00:20:23.460
Professor_Ashley: …researching groups of people who already play games, not always but most of the time, so this is nothing outside of their ordinary day.

202
00:20:24.330 –> 00:20:30.360
Professor_Ashley: So in that case we’re classified as minimal risk, but it is still human subjects research, with some exceptions.

203
00:20:30.780 –> 00:20:40.740
Professor_Ashley: If we’re doing what my games user research lab does, we’re actually doing product research, and that doesn’t meet the federal definition of human subjects research in quite the same way.

204
00:20:41.220 –> 00:20:51.540
Professor_Ashley: We’re not studying to produce generalizable data about a population; we’re studying just a product, a product in development, and we’re trying to make that product better.

205
00:20:51.990 –> 00:20:57.690
Professor_Ashley: So it’s similar to student evals, right? Student evaluations of your classes are a type of research, but…

206
00:20:58.290 –> 00:21:12.570
Professor_Ashley: …you’re not going to go and write research papers based off that data; you’re going to use it to improve your pedagogy and your teaching. So in that case, video games are pretty minimal risk, but if you’re asking people questions you definitely need to seek IRB approval…

207
00:21:13.650 –> 00:21:22.500
Professor_Ashley: …outside of product development, because we want to make sure that you’re not causing harm. So in this case, if your video game happened to have…

208
00:21:22.980 –> 00:21:34.260
Professor_Ashley: …a component which might bring up emotions that would not necessarily be encountered in daily life, or deals with topics that folks might find disturbing…

209
00:21:34.590 –> 00:21:43.680
Professor_Ashley: …then at the minimum we would want you to tell people, so that they can make an informed choice and consent to that research rather than having it sprung on them…

210
00:21:44.550 –> 00:21:56.910
Professor_Ashley: …covertly. Of course, covert research can happen, and it can happen ethically, but there are safety protocols in place to ensure that people are not harmed in the process of gathering data.

211
00:21:57.360 –> 00:22:01.020
A Ashcraft: So both physically and/or emotionally harmed.

212
00:22:01.410 –> 00:22:15.300
Professor_Ashley: Yeah. So physically, with most games (asterisk, of course there are exceptions) we’re not doing a whole lot of physical research. The exception, of course, being virtual reality, and then there are physical concerns about…

213
00:22:16.560 –> 00:22:31.380
Professor_Ashley: …running into things, losing balance, falling over. Especially when we get into VR studies in physiology, we’re not looking at “can we make a really cool VR game,” but can we use VR for phantom limb pain? Can we use VR…

214
00:22:32.640 –> 00:22:42.000
Professor_Ashley: …in place of an analgesic or an anesthetic? Can we use VR to manage pain? That’s a question that, if you do a quick library search, comes up in a lot of medical journals…

215
00:22:42.390 –> 00:22:49.320
Professor_Ashley: …and pain management journals, and it’s a really interesting question. But, you know, what kind of physical concerns do we have with VR?

216
00:22:50.700 –> 00:22:54.600
Professor_Ashley: And if you read the headset warnings… I didn’t, I’m not gonna lie.

217
00:22:55.650 –> 00:23:05.640
Professor_Ashley: We have an HTC VR kit here, and I think we have an Index at my house, and I never read the safety warnings, because, you know, I’m going to…

218
00:23:07.500 –> 00:23:18.210
Professor_Ashley: …I know that’s when you should read the safety warnings. So I sat down to read them, because I was actually preparing a presentation on this for the IRB, and there are things that I never considered.

219
00:23:19.350 –> 00:23:34.710
Professor_Ashley: So in addition to the balance stuff, or hitting a ceiling fan, right, those things I had considered. But VR has been correlated (I don’t know that we’ve established a causal relationship) with myopia, so nearsightedness, particularly in…

220
00:23:34.710 –> 00:23:54.120
Professor_Ashley: …kids. Having a screen so close to their eyes, with that light, is showing developmental problems with eyesight. And of course motion sickness I anticipated. But I didn’t know this: because the headset rests on your sinuses, it can actually exacerbate cold and flu and sinus symptoms, which is of…

221
00:23:54.510 –> 00:23:55.530
Professor_Ashley: …interest. So if you’re…

222
00:23:55.740 –> 00:24:02.670
Professor_Ashley: …if you’re getting sick and you use VR, you can feel more sick. It doesn’t give you a cold, right? So for a while people were thinking…

223
00:24:03.060 –> 00:24:14.460
Professor_Ashley: …“oh, it’s because we’re sharing headsets that we’re all getting sick,” but what we think is actually happening is that it’s just putting pressure on your sinuses and making your nose run more. So there are some minor safety concerns like that.

224
00:24:15.450 –> 00:24:21.330
Shlomo Sher: So is that the sort of thing where, if I’m doing research on VR…

225
00:24:21.840 –> 00:24:30.120
Shlomo Sher: …on a VR game, and I’m going to have subjects wear a headset, I should let them know ahead of time: if you have a cold, this is potentially going to aggravate your symptoms? And…

226
00:24:30.540 –> 00:24:40.230
Shlomo Sher: …do I need to let them know that long-term use of VR devices has been shown to increase myopia in kids? Is that the idea, something like that?

227
00:24:40.530 –> 00:24:51.210
Professor_Ashley: Yeah, so that becomes a really interesting question, and in the past I’ve defaulted to advising the researchers to read the warnings. It varies by headset, right, but, for example…

228
00:24:51.810 –> 00:24:59.790
Professor_Ashley: …I want to say the Oculus says no one under the age of eight should be using the headset, and that’s because of these studies on ocular development.

229
00:25:00.960 –> 00:25:02.040
Professor_Ashley: Ironically oculus.

230
00:25:02.130 –> 00:25:03.960
Shlomo Sher: Can I just say Ashley that uh.

231
00:25:04.050 –> 00:25:08.430
Shlomo Sher: …I’ve been looking for an excuse to not let my son use my Oculus, and…

232
00:25:08.970 –> 00:25:12.960
Professor_Ashley: that’s the worst part right because you have this shiny piece of tech that kids want and you’re like.

233
00:25:13.020 –> 00:25:14.580
Professor_Ashley: No your eyesight.

234
00:25:15.150 –> 00:25:25.470
Shlomo Sher: Right, yes. Well, you know, now I’ve got the excuse I needed, which is great. There are plenty of other games he can play that are fantastic for him; he doesn’t need to use my VR headset.

235
00:25:27.270 –> 00:25:40.800
Professor_Ashley: So the advisory I give to research teams is to adopt the language that’s used for the headset devices themselves. If the company is saying, hey, look, we don’t think that people of this age should be using this product, then…

236
00:25:41.040 –> 00:25:45.270
Professor_Ashley: …you have to take that into consideration when conducting research with that product.

237
00:25:45.990 –> 00:25:47.190
A Ashcraft: yeah that makes perfect sense.

238
00:25:48.000 –> 00:25:53.340
Shlomo Sher: So, to go back to the other sort of content warning, right? So let’s say…

239
00:25:54.360 –> 00:26:00.420
Shlomo Sher: …I want to measure reactions to games like Ethnic Cleansing, right?

240
00:26:02.670 –> 00:26:12.630
Shlomo Sher: You know, do I need to let… so, Ethnic Cleansing, we’ve talked about this game before. Essentially it’s a racist game; if that doesn’t sound like a racist game, I don’t know what does.

241
00:26:13.740 –> 00:26:15.060
Shlomo Sher: Right and it’s part of.

242
00:26:15.390 –> 00:26:16.830
A Ashcraft: That’s the title of the game, by the way, for the listeners.

243
00:26:16.830 –> 00:26:17.250
Shlomo Sher: The title…

244
00:26:17.790 –> 00:26:19.470
A Ashcraft: Ethnic Cleansing is the title of the game.

245
00:26:20.010 –> 00:26:20.880
Shlomo Sher: Right right.

246
00:26:21.900 –> 00:26:36.810
Shlomo Sher: Right. Do I have to inform my study subjects ahead of time that they are playing a game that will have racist elements? Do I need to give them the title? How much do I need to protect them against this?

247
00:26:38.490 –> 00:26:42.330
Ann Johnson: I think that’s a really good question when we look at.

248
00:26:43.080 –> 00:26:56.820
Ann Johnson: …you know, harm. We’ve just talked about all these physical harms to people, but there’s also that whole vast array of psychological harm, social harm, harm within relationships, and maybe legal harm…

249
00:26:57.450 –> 00:27:09.300
Ann Johnson: …that could come to you, depending on certain things that you do in life, and potentially in research. So when we look at harm that could come to research subjects, we’ve got to take all of that into account.

250
00:27:10.470 –> 00:27:21.150
Ann Johnson: And I think we could all agree that there’s a difference in the new harm that’s introduced in a research study for someone who’s never played, let’s say, this racist game before…

251
00:27:21.570 –> 00:27:34.830
Ann Johnson: …versus someone who’s been playing it prior to ever being in a research study. So that’s something for us to think about too: should we expose new people to playing this racist game, or should we maybe recruit people who are…

252
00:27:36.300 –> 00:27:41.580
Ann Johnson: …already playing it? They’re already doing that harm to themselves, whether they perceive it as harm or not.

253
00:27:42.930 –> 00:27:51.750
Ann Johnson: And so, picking the right population that is going to receive less harm from our research is something that we would want to consider as well.

254
00:27:52.110 –> 00:28:05.160
Ann Johnson: But then, just as you said, we would want to be able to talk about any harms that are going to come from being in our study with the person before they say yes or no to doing whatever we’re asking them to do.

255
00:28:05.970 –> 00:28:15.150
Shlomo Sher: You know, it strikes me that in having my study subjects play a game like this, I’m kind of looking to see if it harms them.

256
00:28:15.840 –> 00:28:30.450
Shlomo Sher: And I’m assuming in lots of studies that’s kind of the point of the study: to see if the product or the subject matter or something harms my subjects. But your job is to make sure I don’t harm my subjects. How does that work?

257
00:28:31.800 –> 00:28:40.560
Ann Johnson: I think something that is really fundamental to understanding what IRBs do is that we can’t always get rid of all the harm…

258
00:28:40.950 –> 00:28:56.370
Ann Johnson: …of a research study, right? Some studies are just risky. So if we take the example of, say, a new medical drug that’s going to treat some new condition, those doctors are also trying to see if this new drug harms you, because it might.

259
00:28:57.450 –> 00:29:06.060
Ann Johnson: But they’re also trying to see if it benefits, you and I think that is the other key component, that we can apply over to video game research as well.

260
00:29:06.390 –> 00:29:16.050
Ann Johnson: Yes, it is important to understand if certain video games, or certain contexts that we play within, are harming people, but what’s the benefit?

261
00:29:16.680 –> 00:29:21.870
Ann Johnson: Is there is there some benefit you’re trying to get out of the research like are you trying to show that.

262
00:29:22.380 –> 00:29:35.940
Ann Johnson: …this harm is bad, that we should limit this harm in society? You know, that’s a benefit to society, potentially, right. But if we’re just trying to see if things harm people, just for the sake of seeing if they harm people, perhaps that’s…

263
00:29:36.030 –> 00:29:45.390
Ann Johnson: …not ethical, right? So it’s that risk-benefit balance that we’re trying to assess, and it’s not just the risk to me as a person, but to the people around me…

264
00:29:45.810 –> 00:29:53.910
Ann Johnson: …not just the benefits to me as a person, but also the benefits to the society around me. So we’ve got to weigh all of that, which can get really complicated.

265
00:29:54.990 –> 00:29:56.430
Ann Johnson: But we’ve got to consider all sides.

266
00:29:56.820 –> 00:30:10.920
Shlomo Sher: Do you have any kind of systematic way of weighing that? I mean, it seems like a really tough challenge to figure stuff like this out. And again, feel free to use the racist game example I just gave you.

267
00:30:11.490 –> 00:30:17.610
Shlomo Sher: What would you do with that? I have no idea what I would do with that, you know. But then again, I’m not in your position.

268
00:30:19.980 –> 00:30:29.430
Ann Johnson: Right. So I think, you know, one of the things that actually makes medical research easier is that we can measure harm a lot better…

269
00:30:30.030 –> 00:30:38.190
Ann Johnson: …or at least in more quantifiable ways. So if I give you this drug, or I do this surgery on you, I can see how it affects your…

270
00:30:38.550 –> 00:30:43.860
Ann Johnson: Your heart rate, I can see how it affects your blood chemistry like I can measure those things right.

271
00:30:44.850 –> 00:31:03.000
Ann Johnson: When we’re talking about other kinds of harm, the harm is actually really subjective. So if we take a racist video game, right, you could argue… and I’m going to say this in a very simple way, because there’s a lot of complexity to racism: I am a white woman.

272
00:31:04.140 –> 00:31:06.060
Ann Johnson: I don’t experience racism.

273
00:31:07.920 –> 00:31:08.370
Right.

274
00:31:10.590 –> 00:31:12.120
Ann Johnson: I don’t experience racism at all.

275
00:31:14.550 –> 00:31:23.130
Ann Johnson: So the population that we’re studying actually matters and the harm could be different, depending on who you are.

276
00:31:23.910 –> 00:31:41.760
Ann Johnson: And so that’s something that we have to take into account as well when we’re doing the research. Also, the benefit to me could be different than the benefit to, say, a Black woman, for studying this. So I think thinking about it in that way is important, too.

277
00:31:43.650 –> 00:31:44.370
Shlomo Sher: Okay, great.

278
00:31:45.960 –> 00:31:51.180
Shlomo Sher: Let me go back now to the issue of fairness, right?

279
00:31:52.590 –> 00:32:00.810
Shlomo Sher: Right, you talked about fairness in two ways. So, yeah, here’s my fairness question: I want to do something about…

280
00:32:01.920 –> 00:32:03.870
Shlomo Sher: You know let’s keep it simple, I want to know if.

281
00:32:05.310 –> 00:32:20.790
Shlomo Sher: …how receptive players will be to this new game mechanic I’ve introduced to VR games. Why not VR games? It seems like such a great area of study, right? What do I need to consider in terms of fairness?

282
00:32:22.080 –> 00:32:27.060
Professor_Ashley: So now this is product development, right? So this is, for me, really interesting, because…

283
00:32:28.080 –> 00:32:41.790
Professor_Ashley: …for some reason we have this idea that if research doesn’t meet the federal definition of what research is, and you don’t have to go through an IRB, then you can just do what you want, which is not true.

284
00:32:42.240 –> 00:32:56.070
Professor_Ashley: These these principles of research ethics should still apply, regardless of whether or not you need to go and fill out a form with an institutional review board, so I see this quite often in terms of.

285
00:32:57.660 –> 00:33:05.310
Professor_Ashley: …playtesting. So we know that there is a… okay, I shouldn’t say “we know,” because I don’t know of any…

286
00:33:06.990 –> 00:33:19.830
Professor_Ashley: …quantifiable, peer-reviewed study on this. But as someone who directs a research summit, at least for this year, and has been to the research summit for user research and product research on video games…

287
00:33:20.730 –> 00:33:39.540
Professor_Ashley: …there’s been a historical bias in who is selected for playtesting, being usually white and male. There are anecdotal examples of this, stories that people tell, even from the other side, of…

288
00:33:40.980 –> 00:33:49.290
Professor_Ashley: …going in for playtesting, and staff assuming that the boyfriend was there to playtest and the woman wasn’t interested in playtesting, that she just drove her boyfriend there or something, when…

289
00:33:49.320 –> 00:33:50.760
Professor_Ashley: When she was in fact the person.

290
00:33:51.090 –> 00:34:01.200
Professor_Ashley: …who was selected to do the playtesting. So in this case we have this issue of justice, which is one of the principles, I think I mentioned previously, of the Belmont Report.

291
00:34:02.490 –> 00:34:10.350
Professor_Ashley: …which was the Belmont Report, and my details on this are a bit shaky; it was written in 1979, something like that…

292
00:34:11.790 –> 00:34:20.550
Professor_Ashley: which grew out of this desire to have a formalized government document that says what is what is good research in terms of ethics.

293
00:34:21.300 –> 00:34:30.720
Professor_Ashley: One of the principles there is justice, which echoes what Ann was saying previously about participant selection. So while you may be…

294
00:34:31.110 –> 00:34:41.340
Professor_Ashley: …you may have a really good reason for selecting a certain group of participants, even in video game studies, right. So if, for example, you want to see if your new game is colorblind-friendly…

295
00:34:42.210 –> 00:34:51.330
Professor_Ashley: …it makes sense to pick participants who are colorblind, right? You are excluding people, yes, but are you excluding people for a good reason?

296
00:34:51.930 –> 00:35:02.160
Professor_Ashley: Also yes, because you want to make your game more accessible. So in that case the most just thing to do would be to pick people with colorblindness, to make your game more colorblind-accessible.

297
00:35:02.880 –> 00:35:10.200
Professor_Ashley: But if you’re excluding people because of convenience, which is a problem we see, I think, across research studies…

298
00:35:12.150 –> 00:35:14.790
Professor_Ashley: …and I know Ann has more to say later…

299
00:35:15.870 –> 00:35:24.420
Professor_Ashley: …but when you’re excluding people because it’s difficult to reach them, because they don’t speak English as a primary language, because…

300
00:35:25.020 –> 00:35:30.840
Professor_Ashley: They might live far away because you’re you’re a game studio that’s located in a city and.

301
00:35:31.530 –> 00:35:38.730
Professor_Ashley: …the people that you might want to research are out in the suburbs and so transportation is difficult… should you exclude them because it’s difficult?

302
00:35:39.630 –> 00:35:47.280
Professor_Ashley: Probably not you should probably try and work out a way that they can still participate because you’re interested in a diversity of opinions right so.

303
00:35:48.120 –> 00:36:07.110
Professor_Ashley: The idea of justice is that the barriers to entry to participate in research should be lowered and/or equal for everyone. Your race, your gender, your faith, your economic status shouldn’t inhibit you from participating in research, even if it’s just, quote unquote, product research.

304
00:36:08.190 –> 00:36:23.580
Ann Johnson: If I could add to that, just some thoughts about society in general: one of the questions that comes up for IRBs is how much do we need to try to make research fair when the world is unfair?

305
00:36:23.820 –> 00:36:24.900
Shlomo Sher: Right right.

306
00:36:25.050 –> 00:36:29.850
Ann Johnson: And so, are we holding ourselves to a standard of trying to correct the world’s injustices…

307
00:36:31.320 –> 00:36:38.790
Ann Johnson: In a way, that maybe isn’t realistic, but I think one of the things that is really important to think about is.

308
00:36:39.300 –> 00:36:49.080
Ann Johnson: There have been research studies that have perpetuated injustices and often the reasons why something is inconvenient to do.

309
00:36:49.440 –> 00:36:58.470
Ann Johnson: In research is because it’s inconvenient in society, and so are we just continuing to perpetuate that inconvenience perpetuate that injustice.

310
00:36:58.980 –> 00:37:12.000
Ann Johnson: …that comes from it. So research definitely has to consider convenience; there’s not always a lot of money to study all the things in a perfectly just way. And yet…

311
00:37:12.390 –> 00:37:17.760
Ann Johnson: If the goal is to put benefit out into society based on the research that you’re doing.

312
00:37:18.690 –> 00:37:26.700
Ann Johnson: the goal probably also should be to make society a little bit better, a little more just, alongside of that. So I think this has

313
00:37:27.060 –> 00:37:36.870
Ann Johnson: hit IRBs a little harder in the last decade, with renewed social justice movements that have gone on in the United States and internationally,

314
00:37:37.320 –> 00:37:55.530
Ann Johnson: where we can't sit back and not listen to these voices anymore. We need to actually care about the social injustice that certain communities are experiencing, so that as we do research we can help in small ways. We can't do it all, but helping in smaller ways allows

315
00:37:56.580 –> 00:37:59.610
Ann Johnson: For those injustices to be overcome in our society.

316
00:38:00.120 –> 00:38:12.600
A Ashcraft: So an example like that would be, you know: we run these tests only during work hours, and so, you know, when we're looking

317
00:38:13.980 –> 00:38:19.980
A Ashcraft: to test people who are, you know, 20 to 30 years old, that's probably when they're working,

318
00:38:21.270 –> 00:38:23.850
A Ashcraft: or they're, you know, rich.

319
00:38:27.270 –> 00:38:34.200
Shlomo Sher: Yeah, I was thinking of classic stuff, you know; it's harder for me to translate it to video games. I'm hoping,

320
00:38:34.530 –> 00:38:40.800
Shlomo Sher: you know, Ashley, you could help translate it to video games. But I was thinking about classic things like medical research that ends up being done only on men,

321
00:38:41.490 –> 00:38:54.780
Shlomo Sher: and thereby, essentially, you know, the product is released, and then women are not really protected in the same way as men are, because they were never included as part of the study.

322
00:38:55.080 –> 00:38:56.790
Professor_Ashley: Well, that's VR, baby.

323
00:38:58.350 –> 00:39:01.020
Professor_Ashley: virtual reality is a really fun example of that because.

324
00:39:01.110 –> 00:39:03.900
Professor_Ashley: Motion sickness disproportionately impacts women.

325
00:39:04.320 –> 00:39:05.520
Professor_Ashley: I think it’s something like.

326
00:39:06.150 –> 00:39:10.710
Professor_Ashley: 50% of women... I should look up the actual number.

327
00:39:12.000 –> 00:39:14.670
Professor_Ashley: But the proportion... maybe it's like 40%? Hold on.

328
00:39:15.090 –> 00:39:19.170
A Ashcraft: But even so, I had no idea that it was gender-related at all.

329
00:39:20.550 –> 00:39:29.460
Professor_Ashley: Yeah, yeah. So some studies say that estrogen is the cause of motion sickness, or can increase feelings of motion sickness.

330
00:39:30.480 –> 00:39:41.010
Professor_Ashley: But anyway, I can look up the exact number later. Women are disproportionately affected by motion sickness, and we know that that's exacerbated by certain factors in games, like

331
00:39:41.580 –> 00:39:51.270
Professor_Ashley: you know, movement, whether it's a seated VR game versus a standing VR game, and then if you have a frame rate drop. So if your game is

332
00:39:51.870 –> 00:40:05.100
Professor_Ashley: jarring or glitchy at all, that's going to really trigger motion sickness. And so you're making a product that about a quarter of the population cannot access, or will just be incredibly sick if they access it, and

333
00:40:06.060 –> 00:40:12.510
Professor_Ashley: it's like either people don't know that, right, because it's not common knowledge, or people are like, oh well, you know...

334
00:40:13.560 –> 00:40:20.850
Professor_Ashley: Oh well? It's like, you're excluding this massive percentage of the population from your product, right?

335
00:40:20.940 –> 00:40:25.410
Shlomo Sher: And if your testing obviously doesn't include women, you will never find this out.

336
00:40:25.800 –> 00:40:30.960
Professor_Ashley: Yeah, exactly, exactly. So in a lot of ways it's very similar to the medical research that only

337
00:40:32.010 –> 00:40:32.730
Professor_Ashley: studied men.

338
00:40:33.420 –> 00:40:46.470
Shlomo Sher: Right, yeah, that's really interesting, right? I'm a woman, I get my Oculus Quest, you know, I want to play my game and I can't figure out why I can't do this, and so I'm made to suffer

339
00:40:47.940 –> 00:40:53.280
Shlomo Sher: simply because, when the researchers did their study, they just didn't include people like me.

340
00:40:53.910 –> 00:41:02.730
Professor_Ashley: Yeah. And just as an aside, if there are women out there listening who are triggered by motion sickness in VR,

341
00:41:03.090 –> 00:41:13.950
Professor_Ashley: there are a couple of pro tips that I've discovered. I get motion sick very easily, and so sitting down, finding games in VR that are seated, helps me, and putting a fan on.

342
00:41:15.030 –> 00:41:26.130
Professor_Ashley: So, having a box fan that you can direct at your face, it's kind of the same effect. I also get seasick, so for me it feels like when you're on a ship and you go above deck to get fresh air; it feels like that.

343
00:41:27.570 –> 00:41:28.560
Professor_Ashley: So that’s helpful.

344
00:41:28.860 –> 00:41:39.420
Shlomo Sher: But sorry, that was a tangent. Is this a personal hack that you've come up with, or is this something that has been researched? Because, I mean, we're talking research here.

345
00:41:39.660 –> 00:41:47.940
Professor_Ashley: Yeah, I wish I were that clever. That actually came out of the Games User Research Summit, and the video is online; I can give you the link.

346
00:41:48.870 –> 00:42:07.200
Professor_Ashley: It's on YouTube, and I can give you the citation as well; I'll have to look that up. But yeah, some great research pro tips are shared at the Games User Research Summit every year on how to make your playtesters more comfortable while they're testing games.

347
00:42:07.500 –> 00:42:08.400
A Ashcraft: While they’re getting sick.

348
00:42:08.850 –> 00:42:09.120
Be.

349
00:42:10.830 –> 00:42:11.880
A Ashcraft: comfortable but.

350
00:42:12.690 –> 00:42:14.700
Shlomo Sher: Though it’s interesting because it means that.

351
00:42:14.730 –> 00:42:25.110
Shlomo Sher: unless you pass that information on to players later on, right, what you're testing under is ideal conditions that aren't going to apply to regular play conditions.

352
00:42:26.820 –> 00:42:27.390
Ann Johnson: Right.

353
00:42:27.450 –> 00:42:28.170
A Ashcraft: Right right right.

354
00:42:29.940 –> 00:42:32.610
Ann Johnson: And that happens in medical research as well right.

355
00:42:32.760 –> 00:42:40.860
Ann Johnson: where we have these really confined conditions and it doesn't translate into the real world. And so, same with video game research.

356
00:42:42.240 –> 00:42:43.290
A Ashcraft: yeah that’s fascinating.

357
00:42:44.880 –> 00:42:47.610
Shlomo Sher: Um, okay, so let me ask; I'm going to go to number two.

358
00:42:48.990 –> 00:42:49.410
Shlomo Sher: So.

359
00:42:50.550 –> 00:42:56.550
Shlomo Sher: You know, we've talked a lot at this point about fairness, some of the concerns we have.

360
00:42:57.720 –> 00:43:05.010
Shlomo Sher: We haven't actually talked a whole lot about the range of video game research projects, and I'm just kind of curious if,

361
00:43:05.850 –> 00:43:17.700
Shlomo Sher: number one, you could tell us about some of the ways that video games are researched, and number two, some of the ways where maybe the morally right thing is not always the practical thing to do.

362
00:43:18.210 –> 00:43:28.110
A Ashcraft: Yeah, and one of you mentioned also that there's a difference between morals and ethics that I'd like to dive into, like: if something's morally wrong, is it ethically wrong?

363
00:43:30.660 –> 00:43:31.800
Shlomo Sher: Let's do this one at a time.

364
00:43:32.460 –> 00:43:33.420
Professor_Ashley: Okay, which one do you want to.

365
00:43:33.600 –> 00:43:47.340
Shlomo Sher: Let's start, actually, with... give us some examples of research on video games that we haven't talked about that, let's say, doesn't involve VR.

366
00:43:48.480 –> 00:43:50.160
Professor_Ashley: Okay, and.

367
00:43:51.960 –> 00:43:54.990
Professor_Ashley: In terms of ethical issues that could come up.

368
00:43:55.830 –> 00:44:02.700
Shlomo Sher: Maybe even just regular kinds of research that happen. We probably should have done this at the beginning of the program, but we just kind of go with the flow.

369
00:44:03.360 –> 00:44:05.040
Professor_Ashley: Sure sure sure.

370
00:44:05.070 –> 00:44:06.660
Professor_Ashley: So sorry.

371
00:44:07.680 –> 00:44:13.200
Ann Johnson: No... so, Ashley, you know, I think you're obviously the best person to talk about specific kinds of research.

372
00:44:13.500 –> 00:44:27.300
Ann Johnson: But I can also just talk in generalities about, you know, here's something that might be legal but isn't moral, or here's something that is moral but may not be practical, and I can bounce off of that.

373
00:44:28.320 –> 00:44:38.130
Shlomo Sher: Okay, great. So let's start with what kinds of research there are, then let's go to moral but not practical or legal. So, Ashley, if you could start us off.

374
00:44:39.060 –> 00:44:39.960
Professor_Ashley: yeah so.

375
00:44:41.700 –> 00:44:46.050
Professor_Ashley: In terms of just broadly the types of video game research that happens, I.

376
00:44:46.050 –> 00:44:47.430
Shlomo Sher: mean there’s.

377
00:44:48.030 –> 00:44:57.210
Professor_Ashley: there's so much, I can't even... I did a library search recently, looking specifically at VR studies, and there were over a million results.

378
00:44:57.300 –> 00:44:57.870
A Ashcraft: Like wow.

379
00:44:58.140 –> 00:45:03.870
Professor_Ashley: So even if you try to narrow down by date, people are just doing so much really interesting stuff

380
00:45:04.920 –> 00:45:05.400
Professor_Ashley: Out there.

381
00:45:07.320 –> 00:45:17.820
Professor_Ashley: So one thing that might be interesting... this is an example I actually give my students in class to think about morality, practicality, and

382
00:45:18.270 –> 00:45:31.380
Professor_Ashley: research ethics is: say, for example, you were developing a mobile game for teenagers, and there's a social media component, and

383
00:45:32.190 –> 00:45:45.690
Professor_Ashley: you know, this is a fictional game; this game does not exist, this is just a thought experiment. What if you're studying to find out how fun the game is, how appealing it is? You're not doing anything controversial; the game isn't controversial either. It's...

384
00:45:47.670 –> 00:45:51.660
Professor_Ashley: it's a farm game, because I'm using really bad examples right now.

385
00:45:53.460 –> 00:46:00.780
Professor_Ashley: Okay, let's do something more creative: it's a furniture crafting game. What could go wrong with that? There's nothing controversial about this.

386
00:46:01.320 –> 00:46:09.450
Professor_Ashley: But what if you have a chat feature? And what if, in one of your research reports, a participant says that they're being bullied and harassed?

387
00:46:10.200 –> 00:46:24.240
Professor_Ashley: What if that participant says: please don't tell my parents, I don't want them to take this game away; I really enjoy the game, but I'm getting bullied and I'm scared. What do you do? Well, in my case it's easy: I'm a mandatory reporter.

388
00:46:25.680 –> 00:46:36.510
Professor_Ashley: The state says I have to tell them. But in anyone else's case, that might be a real moral quandary. So what is legal? What do you legally have to do? Well, that's a gray area.

389
00:46:37.200 –> 00:46:49.200
Professor_Ashley: Could the parents turn around and sue if you didn't tell them? I mean, people, particularly in the United States, can sue for just about anything. Would it go anywhere? I don't know. Is there precedent? I don't know; I'm not a legal expert, right.

390
00:46:50.430 –> 00:46:50.940
Professor_Ashley: Is it.

391
00:46:52.170 –> 00:47:01.860
Professor_Ashley: If you tell the parents, is that morally appropriate? You're just taking this child's game away, and that could have been the child's only outlet for socialization.

392
00:47:03.240 –> 00:47:14.070
Professor_Ashley: If you don't tell the parents, and the child continues to get bullied and it affects their self-esteem, or something worse happens, then what have you done? Have you been negligent in your role as a game developer?

393
00:47:14.370 –> 00:47:28.740
Professor_Ashley: So this is an example that I do not have a clear answer for; I do not know of a precedent where this has come up. This is kind of just a thought experiment I like my students to struggle and grapple with. But yeah, I'd like to hear what Ann has to say.

394
00:47:30.480 –> 00:47:42.180
Ann Johnson: Yeah, I think that's a great example of what's legal versus what's actually the right thing to do, right? And we grapple with that; our laws don't always dictate the right thing to do.

395
00:47:43.500 –> 00:47:54.450
Ann Johnson: But here's the flip side, to turn another example on its head: there may be something that we would all agree is morally right to do, say the concept of informed consent.

396
00:47:55.260 –> 00:48:06.990
Ann Johnson: You know, you say, yes, that is a great principle; we want to respect people by letting them give informed consent and choose what they want to do based on good information.

397
00:48:07.200 –> 00:48:09.810
Professor_Ashley: Should we define informed consent real quick?

398
00:48:11.940 –> 00:48:12.420
Ann Johnson: So.

399
00:48:12.750 –> 00:48:13.230
A Ashcraft: If you like.

400
00:48:14.490 –> 00:48:14.730
Shlomo Sher: yeah.

401
00:48:15.960 –> 00:48:24.180
Ann Johnson: So consent is obviously your ability to say yes or no, making a choice, but the informed part means you had good information to make that choice.

402
00:48:24.780 –> 00:48:36.510
Ann Johnson: Okay, so I think that's important. But again, informed consent sounds like a great thing, and you would think, yes, everyone should give informed consent for all the things that they do. But think about the practicalities of that.

403
00:48:38.220 –> 00:48:46.830
Ann Johnson: It can be really challenging. Let's say you're trying to study a video game and you're trying to look at the data of thousands of players.

404
00:48:48.450 –> 00:48:56.910
Ann Johnson: Getting true informed consent, other than just those terms and conditions you check the box next to when you sign up for, you know, using a game...

405
00:48:57.210 –> 00:48:57.420
Ann Johnson: Right.

406
00:48:57.660 –> 00:48:59.070
A Ashcraft: Which no one ever reads.

407
00:49:00.360 –> 00:49:04.350
Ann Johnson: Exactly, yeah. It's really impractical.

408
00:49:05.250 –> 00:49:15.960
Ann Johnson: And so we like to think that we've gotten informed consent by having those terms and conditions and blah blah blah, but we're openly admitting we're not reading it. So we're not informed: we gave consent, but we're not informed.

409
00:49:16.440 –> 00:49:29.460
Ann Johnson: So, you know, that is a real-life practicality that means we're not honoring something that we morally think is right, but because it's impractical, we don't care.

410
00:49:30.600 –> 00:49:32.460
Ann Johnson: You know and where’s that balance.

411
00:49:33.390 –> 00:49:33.780
Right.

412
00:49:35.040 –> 00:49:42.660
Shlomo Sher: And it's, you know, it's interesting. I mean, normally we say, well, it's your risk; we're giving you the opportunity to become informed.

413
00:49:43.200 –> 00:49:50.040
Shlomo Sher: But we know that you're not really taking that opportunity, so we really know you're not informed. But the...

414
00:49:50.460 –> 00:49:55.350
Shlomo Sher: you know, the ethical part was to respect your agency by giving you that opportunity. But, of course,

415
00:49:55.740 –> 00:50:03.240
Shlomo Sher: as with, you know, any kind of medical situation, or any legal document I ever signed, ever,

416
00:50:03.660 –> 00:50:22.530
Shlomo Sher: right, there's also the limitation of actually having the capacity to be informed and understand what you're doing in the first place, because you might not have the expertise in it. Um, I want to kind of go back to something you said earlier,

417
00:50:24.600 –> 00:50:33.750
Shlomo Sher: which is, it's interesting as regards being informed, right, being given correct information: but sometimes research lies.

418
00:50:34.170 –> 00:50:43.470
Shlomo Sher: Right, it deceives. I mean, some of the greatest psychological studies... and I don't know how exactly psychological studies fit into video game research, but they've got to fit into

419
00:50:44.130 –> 00:50:44.550
Shlomo Sher: Research.

420
00:50:44.610 –> 00:50:46.380
Shlomo Sher: in lots of ways, right.

421
00:50:47.460 –> 00:50:52.200
Shlomo Sher: How much deception can I do in my research?

422
00:50:54.000 –> 00:50:56.790
Ann Johnson: that’s a really good question so let’s think about.

423
00:50:57.780 –> 00:51:10.650
Ann Johnson: You know... let's say, well, I won't say in the IRB world; in the human research world, we have three core ethical principles, and we've talked about all of them here today. Informed consent is a big one;

424
00:51:11.280 –> 00:51:29.220
Ann Johnson: that's one value we have. Weighing risks and benefits, that's the second value we have. And then fairness and justice, that's the third value we have. So in the case that you just described, this idea of deceiving people in order to study them, we're basically trying to weigh

425
00:51:30.420 –> 00:51:39.330
Ann Johnson: our value of informed consent against our value of risk-benefit: the benefit of that research, of what you're going to learn, right?

426
00:51:40.020 –> 00:51:56.580
Ann Johnson: And so, is it more valuable to learn the thing we're going to learn from this research, even though we're deceiving people, or is it more valuable to get their informed consent? What do we value more? And so, with every study, we then have to decide: are the benefits worth it?

427
00:51:57.720 –> 00:52:08.220
Ann Johnson: Is it worth the deception? Now, in psychology research, you know, it is commonly accepted practice that you don't just deceive people and then walk away.

428
00:52:08.790 –> 00:52:17.100
Ann Johnson: You deceive them, they do the study, right, they do the thing you're asking them to do, and then you actually tell them that you deceived them before they walk away,

429
00:52:17.610 –> 00:52:30.720
Ann Johnson: and tell them why you needed to deceive them in the first place. So you're actually getting informed consent later, a little bit, right? Because even though they couldn't make that full decision up front, they are getting information later.

430
00:52:31.170 –> 00:52:40.170
Ann Johnson: And we ask our researchers to respect people and say: if you hated what we did to you, you can pull all your data from our research project, right?

431
00:52:41.610 –> 00:52:48.660
Ann Johnson: And you know you have the right to voice that so it’s not a perfect way of getting informed consent, but it helps fill in some of the gaps.

432
00:52:50.130 –> 00:52:55.440
Ann Johnson: that would cause people to feel like it was an ethically questionable practice in the first place.

433
00:52:56.970 –> 00:53:04.590
A Ashcraft: Interesting. Yeah, so that brings into question... there's a lot of stuff that happens in games, particularly now with mobile games and other connected games, where

434
00:53:07.200 –> 00:53:19.650
A Ashcraft: the game itself is in some ways a giant test, right? And people test all kinds of things; there's A/B testing for UI issues. In some ways these are psychological tests, right?

435
00:53:20.130 –> 00:53:30.780
A Ashcraft: Like, does this interface work better for doing whatever it is that we're attempting to do than this other interface? And we're going to test half of our population, you know, 30,000 gamers,

436
00:53:31.950 –> 00:53:38.880
A Ashcraft: against one, and then the other 30,000 against the other, and we get a lot of good data out of this stuff, a lot of good information.
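
The split Andy describes, half of the players seeing one interface and the rest seeing the other, is usually implemented with a deterministic assignment function, so a given player always lands in the same arm. Here is a minimal sketch; the hash-based scheme and the `new_ui` experiment name are illustrative assumptions, not anything described in the episode:

```python
import hashlib

def assign_variant(player_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically assign a player to one arm of an A/B test.

    Hashing the experiment name together with the player ID gives a
    stable, roughly 50/50 split without storing an assignment table.
    """
    digest = hashlib.sha256(f"{experiment}:{player_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# With a 60,000-player population, roughly 30,000 land in each arm.
population = [f"player{i}" for i in range(60_000)]
arm_a = sum(1 for p in population if assign_variant(p, "new_ui") == "A")
```

Because the assignment is a pure function of the IDs, the same player sees the same interface on every session, and different experiment names hash to independent splits.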

437
00:53:40.680 –> 00:53:54.630
A Ashcraft: You know, every game has its terms that you check the box off and don't read. But should that be something that we consider through the same lens that the IRBs do?

438
00:53:55.920 –> 00:54:05.100
Shlomo Sher: Oh, I see. So, just to be clear, this would be just kind of practical research that the game itself does,

439
00:54:05.580 –> 00:54:08.340
Shlomo Sher: Right continuously right and continuously.

440
00:54:08.430 –> 00:54:09.090
A Ashcraft: Absolutely.

441
00:54:09.330 –> 00:54:17.130
Shlomo Sher: you're getting feedback from users, I mean, directly or indirectly; I guess you could always, you know, ask them after the fact.

442
00:54:17.370 –> 00:54:19.560
A Ashcraft: We're just getting it directly; we're just seeing

443
00:54:19.590 –> 00:54:22.170
A Ashcraft: what they do in the game, and we're collecting that data.

444
00:54:22.410 –> 00:54:31.800
Shlomo Sher: Right. So yes, I'd love to hear that. How is this different from what you guys do, and are there any lessons that you think game companies should take from this?

445
00:54:32.820 –> 00:54:33.990
Professor_Ashley: So that is.

446
00:54:35.130 –> 00:54:39.630
Professor_Ashley: the product development type angle. So, one of the ways I think about this, and

447
00:54:41.010 –> 00:54:45.840
Professor_Ashley: I don't know that this is, you know... I don't know that this is in the Belmont Report, right;

448
00:54:46.350 –> 00:55:00.180
Professor_Ashley: I don't know that this is officially documented, but I tend to think about things in terms of intervention, in the psychological sense, in terms of user testing: how much we are impacting, changing, or

449
00:55:01.710 –> 00:55:07.530
Professor_Ashley: messing with people’s natural behaviors so if, for example, I want to study.

450
00:55:09.450 –> 00:55:11.370
Professor_Ashley: people’s reaction to.

451
00:55:12.930 –> 00:55:23.610
Professor_Ashley: a virtual world, in one version where sight is very limited and audio is more apparent, and another version where sight is there but there's no audio track, something like that, right?

452
00:55:23.850 –> 00:55:29.160
Professor_Ashley: I'm introducing a change: I'm having them play, then introducing this change and seeing the results.

453
00:55:30.390 –> 00:55:38.820
Professor_Ashley: I'm impacting them by introducing this change in a minor way that's presumably not going to cause anyone any psychological distress or any kind of harm.

454
00:55:40.650 –> 00:55:43.260
Professor_Ashley: Whereas, if I have someone.

455
00:55:44.280 –> 00:55:44.970
Professor_Ashley: play.

456
00:55:46.620 –> 00:55:54.840
Professor_Ashley: like a very cutesy, colorful game, and all of a sudden a giant scary monster comes out to scare them, because I'm introducing that stimulus

457
00:55:55.110 –> 00:56:04.890
Professor_Ashley: as a type of intervention to see how... I mean, I don't know why you'd want to do this research unless you're doing something biometric, right; you want to see how fast you can make someone's heart rate go up.

458
00:56:06.990 –> 00:56:12.660
Professor_Ashley: Very cruel but fun. That's a little bit more serious: now I'm impacting them, impacting their physicality.

459
00:56:12.990 –> 00:56:22.080
Professor_Ashley: I am intentionally disrupting what they are setting out to do. So I think, in terms of when companies are collecting telemetry

460
00:56:22.980 –> 00:56:39.240
Professor_Ashley: and analytical data from their users: their users are playing the game as intended, they're playing the product that they purchased. Of course, okay, maybe they have some complaints about this product as users; you know, we've all been on Reddit before.

461
00:56:39.480 –> 00:56:40.980
Professor_Ashley: But they purchased this product, they're

462
00:56:40.980 –> 00:56:52.380
Professor_Ashley: playing this product, they're happy to play this product. Now, sending data to the company about pinch points where users are dying, sending data to the company about how much in-game currency they've earned

463
00:56:53.310 –> 00:57:00.720
Professor_Ashley: and how much they've spent, to find out if the economy is balanced: all of these things are fairly innocuous. They're there

464
00:57:00.990 –> 00:57:13.980
Professor_Ashley: for the benefit of the game and for the benefit of the game's balancing, and presumably to make that person even happier with the game the next time they play it, because it'll be more balanced and theoretically more enjoyable,
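
The innocuous telemetry Ashley lists, death pinch points and currency earned versus spent, boils down to simple event aggregation on the studio's side. A rough sketch; the event names and payload fields here are made up for illustration:

```python
from collections import Counter

# Hypothetical telemetry events: (player_id, event_type, payload).
events = [
    ("p1", "death", {"level": "mine", "x": 12, "y": 40}),
    ("p2", "death", {"level": "mine", "x": 13, "y": 41}),
    ("p3", "death", {"level": "forest", "x": 2, "y": 9}),
    ("p1", "currency_earned", {"amount": 100}),
    ("p1", "currency_spent", {"amount": 80}),
]

# Pinch points: where are players dying most often?
deaths_by_level = Counter(e[2]["level"] for e in events if e[1] == "death")

# Economy balance: total currency earned vs. spent.
earned = sum(e[2]["amount"] for e in events if e[1] == "currency_earned")
spent = sum(e[2]["amount"] for e in events if e[1] == "currency_spent")
```

Nothing here disrupts play; the line Ashley draws is between passively aggregating events like these and deliberately injecting changes into a session to see how players react.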

465
00:57:15.420 –> 00:57:20.400
Professor_Ashley: which is different from intentionally disrupting someone's gameplay to see how they react, right?

466
00:57:20.460 –> 00:57:24.480
A Ashcraft: Well, what about things like because we’ve certainly heard a lot of stories about.

467
00:57:26.310 –> 00:57:32.550
A Ashcraft: using A/B testing to see which things will lead people to buy more in-game purchases.

468
00:57:34.200 –> 00:57:35.010
Professor_Ashley: Oh marketing.

469
00:57:35.370 –> 00:57:35.640
Oh, my.

470
00:57:39.930 –> 00:57:43.710
Professor_Ashley: So, in terms of... would the question be, is it ethical?

471
00:57:44.130 –> 00:57:51.360
A Ashcraft: Well, no, not... I mean, how can we make that more ethical? Because it's not going to go away.

472
00:57:52.740 –> 00:57:53.100
Shlomo Sher: Right so.

473
00:57:53.220 –> 00:58:04.740
Shlomo Sher: the question is not whether, let's say, research about what marketing works is ethical, I guess, but maybe what concerns need to go into it.

474
00:58:04.800 –> 00:58:05.220
A Ashcraft: yeah.

475
00:58:05.250 –> 00:58:05.880
Shlomo Sher: Could should we.

476
00:58:06.090 –> 00:58:15.240
A Ashcraft: Should we be using the lens that Ann talks about for the IRB for doing these sorts of things as well?

477
00:58:16.710 –> 00:58:16.830
Ann Johnson: I.

478
00:58:17.160 –> 00:58:18.510
Ann Johnson: really think you can.

479
00:58:19.740 –> 00:58:31.560
Ann Johnson: Sorry, Ashley. I think you can. One of the things that I teach, and why it's probably particularly interesting that my bio starts with the fact that I'm a sewist and I love to sew,

480
00:58:32.100 –> 00:58:42.780
Ann Johnson: Is I actually have kind of taken on some projects where I go out into what you would think are completely unrelated fields like sewing and artistry.

481
00:58:43.500 –> 00:58:53.940
Ann Johnson: You would think it's unrelated to research ethics, but I go and I talk about how our three core ethical principles are used all over our lives, in lots of different fields and in lots of different ways.

482
00:58:54.420 –> 00:59:01.860
Ann Johnson: And so I think it’s really easy to look at things like informed choices risk benefit balance and justice.

483
00:59:02.220 –> 00:59:07.530
Ann Johnson: across all the things that we do; they're just great ethical principles to weigh

484
00:59:07.800 –> 00:59:17.460
Ann Johnson: with all of the decisions that we're making. And so when you're looking at, you know, A/B testing different ways you would advertise to people to get them to make purchases, you absolutely can say:

485
00:59:18.030 –> 00:59:29.550
Ann Johnson: what's the risk to the person and to society here? What's the benefit to the person and society here? Are the burdens and the benefits equitably distributed? Is it a just way, a fair way, of doing things?

486
00:59:30.060 –> 00:59:37.500
Ann Johnson: So, even though that type of activity doesn't need to be reviewed by an IRB, I think the work above

487
00:59:38.970 –> 00:59:55.920
Ann Johnson: is something that you can still apply ethics to; you can still use these principles to make better decisions. If you're trying to make the world more ethical, you would absolutely be able to do that. And, sorry, Ashley, I cut you off, please.

488
00:59:56.580 –> 00:59:56.970
Professor_Ashley: Oh, I would.

489
00:59:57.720 –> 01:00:13.560
Professor_Ashley: say, from a very practical standpoint, you have this idea of: well, are you making people spend more money, or were those people going to spend that money anyway and you've just made it easier for them, because your user interface is more aligned to how people think, it's more intuitive, right?

490
01:00:14.760 –> 01:00:23.610
Professor_Ashley: I mean, I don't know of a causal relationship between having a particular aesthetic design and causing people to spend more money.

491
01:00:24.210 –> 01:00:36.930
Professor_Ashley: I do know that with good user experience design practices you want to limit the barriers that exist for people to be able to spend their money. I mean, I have an example: I was trying to get a food delivery app,

492
01:00:38.340 –> 01:00:45.300
Professor_Ashley: like a box of groceries that's delivered to your house, and I tried two different apps and I couldn't figure out how to give them my money.

493
01:00:45.510 –> 01:00:55.890
Professor_Ashley: Like, I wanted this service, I was a client, let's go, I've got my credit card in hand; but your UI is so terrible that I cannot figure out how to order my first box, so I am not going to use your service.

494
01:00:56.310 –> 01:01:05.760
Professor_Ashley: So when we're talking about the conversion between the number of people who download a game or an app and the number of people who pay for the game or the app,

495
01:01:06.360 –> 01:01:13.830
Professor_Ashley: we're looking at how much of that is people getting frustrated and closing it and uninstalling it because they can't figure out how to do what they want to do.

496
01:01:14.280 –> 01:01:25.350
Professor_Ashley: That might not necessarily always be giving someone money, right; your UI might be terrible and people get frustrated not being able to play the game in general, and then close the app and never convert to a paying customer.

497
01:01:26.010 –> 01:01:35.910
Professor_Ashley: But I totally echo and agree with Ann as well: in terms of dark design practices, we absolutely can implement this moral framework there.

498
01:01:36.960 –> 01:01:46.650
Professor_Ashley: I just wanted to be careful to suss out the practicalities: user interface design and user experience design are not always there to fleece people.

499
01:01:48.270 –> 01:01:50.640
Professor_Ashley: We've got to make money.

500
01:01:51.210 –> 01:01:52.620
A Ashcraft: That's right, and it might not even be

501
01:01:52.680 –> 01:02:07.950
A Ashcraft: your business model, right; it might be, you know, the way that you serve ads or the way that you do anything. But we do a lot of this sort of testing all the time, that was my point, and I was sort of struck by how

502
01:02:08.430 –> 01:02:16.680
A Ashcraft: the descriptions of these psychological tests mapped over to this kind of A/B testing that we do in games and other apps.

503
01:02:16.800 –> 01:02:30.150
Shlomo Sher: It's interesting with the A/B testing also whether you guys have demographics on it, whether these are just players or whether you have information that gives you demographics on these players, and how

504
01:02:30.150 –> 01:02:40.320
Shlomo Sher: that would kind of, you know, how that would play into it. I was thinking, Ashley, with your example, I was thinking of, you know, all these games that have

505
01:02:41.220 –> 01:02:53.760
Shlomo Sher: ads in the middle to get you to buy other games, and how hard it is for me to find the X to shut them off, or those two little arrows to fast-forward them, and how you can essentially do A/B testing where

506
01:02:54.330 –> 01:03:09.810
Shlomo Sher: the button is just a little tiny bit smaller, or it's in another place that's even harder to find. And again, I think the way Ann explained it is, okay, you know, what are the potential harms here, right? How does that weigh against the benefits? I thought that was really great, just,

507
01:03:09.870 –> 01:03:10.230
Shlomo Sher: yeah.

508
01:03:10.260 –> 01:03:11.430
Shlomo Sher: You know how you could apply it.

509
01:03:12.030 –> 01:03:19.410
A Ashcraft: I kind of love this as just a set of lenses that we should look at all this stuff through, maybe.

510
01:03:19.980 –> 01:03:20.730
Professor_Ashley: Totally agree.

511
01:03:20.820 –> 01:03:21.300
yeah.

512
01:03:22.410 –> 01:03:25.050
Shlomo Sher: Andy, really quick, I'm having a time check with you.

513
01:03:25.260 –> 01:03:27.450
A Ashcraft: Right, we're good for another 10.

514
01:03:27.930 –> 01:03:42.510
Shlomo Sher: Great. Okay, guys, so let me ask you this. Probably the most famous topics of research on video games are the ones that have made it to the public discussion, right?

515
01:03:43.440 –> 01:03:58.980
Shlomo Sher: They're about, you know, violence, sexism, and addiction, right? That's been kind of the big three, right? Um, you know, I want to know if an ultra-violent video game, whatever exactly that means, you know, but let's say a very violent video game, right,

516
01:04:00.120 –> 01:04:03.360
Shlomo Sher: is going to actually lead to violence, right? And,

517
01:04:05.130 –> 01:04:15.360
Shlomo Sher: as I'm sure you guys are, at least Ashley has some understanding of the history, you know, of this, but um,

518
01:04:16.020 –> 01:04:30.360
Shlomo Sher: you know, I remember reading a study on aggression and, you know, looking at how they were doing the study and thinking, my God, number one, their sample was so small, but number two, they were using the blast of horns.

519
01:04:31.770 –> 01:04:42.930
Shlomo Sher: Right? So you're playing a game and you have this ability to blast a horn that's going to be really loud in someone's face, and if you do that, that was a sign that you were aggressive.

520
01:04:43.500 –> 01:04:58.320
Shlomo Sher: And I thought to myself, oh my God, that is ridiculous in terms of measuring aggression. But, of course, you know, you're limited, right, in what you can do. So, you know, how are you limited, right?

521
01:04:59.100 –> 01:05:01.530
Professor_Ashley: I think it’s really interesting that you bring up that they had such a.

522
01:05:01.530 –> 01:05:06.030
Professor_Ashley: small sample size, because, I mean, yes, in terms of your,

523
01:05:06.840 –> 01:05:08.400
Shlomo Sher: Particular study I was looking at.

524
01:05:08.430 –> 01:05:11.310
Shlomo Sher: yeah, yeah, sure, there's some with large sample sizes.

525
01:05:11.370 –> 01:05:15.510
Professor_Ashley: Oh yeah, no, no, I know the exact study you mean, and I can't remember the names of the authors at the moment, but

526
01:05:16.380 –> 01:05:24.000
Professor_Ashley: it's really interesting because, of course, when we're doing a research design, if we're doing it quantitatively, we want as many people as possible.

527
01:05:24.270 –> 01:05:37.830
Professor_Ashley: And I'm always heartened a little bit when you have these wild studies, right, and you see that they have a small sample size, because it means that the IRB is working, right? Because people did not consent to participate in the research.

528
01:05:38.760 –> 01:05:49.530
Professor_Ashley: I mean, I'm always floored by pain management studies, where, I mean, they're not inflicting permanent harm, but you see them all the time, where people hold an ice cube,

529
01:05:50.220 –> 01:05:56.520
Professor_Ashley: which is discomfort, right? Or they will snap rubber bands on their arm, or they will expose skin to heat or cold,

530
01:05:56.850 –> 01:06:11.250
Professor_Ashley: and then determine how effective an analgesic is, or an anesthetic, or whatever the intervention is, to try and reduce, you know, not feel that pain. And every time I read one of these, I just think, who in the hell would want to do this?

531
01:06:14.490 –> 01:06:25.620
Shlomo Sher: As an undergrad I participated in, you know, clinical trials because I needed the money, and I totally would have held an ice cube for money, right?

532
01:06:26.130 –> 01:06:29.250
Shlomo Sher: You know, we need money, would be my guess.

533
01:06:29.970 –> 01:06:30.870
A Ashcraft: yeah so our.

534
01:06:31.620 –> 01:06:33.540
Professor_Ashley: Research respect, it comes down to:

535
01:06:33.600 –> 01:06:40.560
Professor_Ashley: if you would do it for money, and if that seems reasonable to you, we respect you as a person, we respect your ability to consent to that.

536
01:06:41.670 –> 01:06:42.210
Professor_Ashley: Go for it.

537
01:06:45.360 –> 01:06:56.820
Ann Johnson: Although we do have to make the decision of, like, how much pain are we willing to let you choose to inflict on yourself at your own expense. So even though you might be a competent adult,

538
01:06:57.120 –> 01:07:09.540
Ann Johnson: right, there might be a threshold of risk that we as an IRB are not okay with you choosing, that amount of risk, you know, whether it's for money or, you know, whatever; the result could be disability after that risk is taken.

539
01:07:11.280 –> 01:07:23.070
Ann Johnson: One of the things that I thought about with this question, and this is a really somber topic, unfortunately, but I think it's a really good example, is studies about child abuse.

540
01:07:24.210 –> 01:07:34.830
Ann Johnson: So I think we can all agree that figuring out how to reduce child abuse in our society is a great thing to figure out.

541
01:07:35.100 –> 01:07:38.970
Ann Johnson: And yet we don’t want to do any studies, where we’re actually abusing children.

542
01:07:40.230 –> 01:07:51.000
Ann Johnson: That's not the goal, right. And yet we need to study it, and so the way we design our studies is actually how we address kind of the issues

543
01:07:51.360 –> 01:08:00.510
Ann Johnson: of risk and benefit here, right? I'm not going to design a study that abuses children; that's not right. However, I could design a study that observes

544
01:08:01.590 –> 01:08:12.180
Ann Johnson: child abuse behavior that's happening out in the real-world environment, right, and then study, you know, things that go along with that observation. I think it's the same

545
01:08:12.750 –> 01:08:17.790
Ann Johnson: With violent video game research right if we.

546
01:08:18.750 –> 01:08:27.690
Ann Johnson: If there is the hypothesis that violent video games are causing people to be violent, which is questionable, but if that's your hypothesis, then

547
01:08:28.050 –> 01:08:37.980
Ann Johnson: you need to go about studying that in a way that doesn't increase harm to people. And so perhaps, again, as we talked about earlier, you don't go and expose naive

548
01:08:38.430 –> 01:08:44.550
Ann Johnson: video game players who have never played a violent video game to all these violent video games and then see what happens.

549
01:08:45.390 –> 01:08:58.830
Ann Johnson: But there are a lot of people out there that are already playing violent video games, so there's a pool to potentially observe and study. Now, the science of, you know, how we measure

550
01:08:59.850 –> 01:09:15.060
Ann Johnson: their progression toward violence or not is very interesting in and of itself, and correlation versus causation is also, I think, an interesting thing to think about here, something we have to be careful about as scientists. But

551
01:09:17.040 –> 01:09:25.530
Ann Johnson: I think it's important to really say, what am I really trying to study here? What's the social problem I'm really trying to understand and maybe trying to benefit from addressing?

552
01:09:26.040 –> 01:09:31.710
Ann Johnson: And then, how do I make sure that I am not just harming people for the sake of harming people to answer my question?

553
01:09:33.300 –> 01:09:38.970
Shlomo Sher: Right. I just basically really want to get a psychologist who studies stuff like this.

554
01:09:40.110 –> 01:09:41.880
Shlomo Sher: I've had my eye on Chris Ferguson

555
01:09:43.260 –> 01:09:52.260
Shlomo Sher: to come talk about, you know, creating studies for dealing with issues like this and how they think about doing that.

556
01:09:54.090 –> 01:10:01.590
Shlomo Sher: You know, though, it does make me think of the study. Actually, Ashley, I don't know if we're thinking about the same study, or if there are lots of studies that do this,

557
01:10:01.860 –> 01:10:06.660
Shlomo Sher: but one of the things that they did want to know is the difference between people who do play

558
01:10:07.410 –> 01:10:19.500
Shlomo Sher: violent games, and I don't remember how violent we're talking about, maybe it was just first-person shooters, but they did want to know the difference between people who play regularly and people that don't play, so they can see if,

559
01:10:20.400 –> 01:10:32.310
Shlomo Sher: you know, if you're more desensitized to violence or anything like that. And so they did do that, but again, I'm curious if, maybe they were playing Call of Duty, as opposed to, like, Manhunt or

560
01:10:32.310 –> 01:10:32.910
Professor_Ashley: Something yeah.

561
01:10:33.390 –> 01:10:44.370
Shlomo Sher: Right. And that's when we're talking about also adults versus kids, which is another one that seems really interesting. Speaking of,

562
01:10:45.210 –> 01:10:59.520
Shlomo Sher: you know, one of the big concerns with kids is pathological play, addiction, right? And it really kind of makes me wonder, again, about study design for an addictive behavior.

563
01:11:00.600 –> 01:11:03.540
Shlomo Sher: How do you do that, and, you know, ethically?

564
01:11:05.220 –> 01:11:07.500
Professor_Ashley: The design part's between you and God.

565
01:11:08.970 –> 01:11:10.860
Professor_Ashley: In terms of doing the research,

566
01:11:12.030 –> 01:11:20.610
Professor_Ashley: that's something that the IRB wouldn't have to deal with, because it's product research, thank goodness, because that's an ethical land mine.

567
01:11:22.350 –> 01:11:22.830
Professor_Ashley: But.

568
01:11:24.210 –> 01:11:35.610
Professor_Ashley: I think it really goes back to thinking about the core principles and the why. Why are you doing this study? Are you doing the study to make a profit?

569
01:11:37.320 –> 01:11:45.120
Professor_Ashley: I mean, some people value that, right? That's not something that I value, and I would not participate or

570
01:11:45.930 –> 01:12:02.670
Professor_Ashley: run a study according to what's going to line my pockets. But, you know, if you're doing it because you're a small indie company and you're trying to make it big, I mean, that's a little bit more relatable, a little bit more sympathetic, we can say.

571
01:12:03.330 –> 01:12:07.620
Shlomo Sher: I want to make it big by creating addicts out of my players so they'll spend a lot of money.

572
01:12:08.580 –> 01:12:13.410
A Ashcraft: Or: I've created a mechanic that I'm worried might be addicting,

573
01:12:13.650 –> 01:12:15.090
A Ashcraft: and I don't want it to do this.

574
01:12:16.020 –> 01:12:23.580
Shlomo Sher: Or, let's say, I'm a psychologist who is concerned about addiction and just wants to do this for the sake of knowledge.

575
01:12:25.050 –> 01:12:27.060
Professor_Ashley: So how can you do that ethically? Mm hmm.

576
01:12:27.150 –> 01:12:36.150
Professor_Ashley: Yeah, um, well, make sure that your participants know what they're signing up for, that they know that there's the potential that they could become addicted,

577
01:12:36.780 –> 01:12:48.330
Professor_Ashley: and make sure you have some sort of onboarding and offboarding process where you can break them of that addictive habit and/or provide support in terms of therapy or addiction counseling

578
01:12:48.840 –> 01:12:55.500
Professor_Ashley: for them afterwards, should they actually become dependent on this particular mechanic, I'd say. What do you think?

579
01:12:57.150 –> 01:13:04.230
Ann Johnson: yeah I think one of the ways that we reduce risk isn’t necessarily by preventing it, but maybe by treating it after the fact.

580
01:13:05.340 –> 01:13:14.940
Ann Johnson: So, you know, with those additional offerings of services, it's like we as researchers aren't trying to research how to provide the services later, but we're trying to do

581
01:13:15.240 –> 01:13:20.190
Ann Johnson: good by you as the person who was in our study. If we can help you

582
01:13:20.640 –> 01:13:35.610
Ann Johnson: based on something that we've learned, and leave you better off than when you came here, or at least try to leave you better off, then, you know, maybe we've done our duty to help you, and to help ourselves as researchers, right, to gain that knowledge.

583
01:13:36.750 –> 01:13:42.450
Professor_Ashley: Yeah, Dr. Rachel Kowert would be a great person to talk to. She's a research psychologist who's done

584
01:13:42.990 –> 01:13:50.190
Professor_Ashley: loads of research into video game ethics, video games and violence, and children. She's also a parent,

585
01:13:50.490 –> 01:14:03.570
Professor_Ashley: so she's kind of a talking head for these kinds of things, which is really nice. She also produced The Video Game Debate book, which is all about the controversy surrounding video games. So I can provide an introduction to her if you'd like; she's awesome.

586
01:14:03.870 –> 01:14:04.740
Shlomo Sher: We would love that.

587
01:14:05.640 –> 01:14:07.740
Professor_Ashley: She doesn't work for the University of Utah.

588
01:14:11.280 –> 01:14:12.750
Ann Johnson: Which is a shame we.

589
01:14:12.750 –> 01:14:13.830
Ann Johnson: really want her to.

590
01:14:18.510 –> 01:14:21.570
Shlomo Sher: I have to say, I mean, this conversation really gave me

591
01:14:23.220 –> 01:14:31.770
Shlomo Sher: a respect for how hard it is to do, you know, practical, ethical research on some of the most controversial topics,

592
01:14:32.820 –> 01:14:34.890
Shlomo Sher: you know, in video games.

593
01:14:35.730 –> 01:14:41.070
A Ashcraft: I agree, and I feel like I've come away with a new set of lenses that I can apply.

594
01:14:42.390 –> 01:14:54.570
Professor_Ashley: One thing to be clear about, though: in terms of, like, the paperwork, I don't think it is that difficult. I think it's more about

595
01:14:55.050 –> 01:15:13.290
Professor_Ashley: the thought experiments and thinking through your research design in an ethical way that's challenging. But you should still do it, not only because it makes you a better person, but because researching controversial things is really important. The more

596
01:15:14.370 –> 01:15:22.500
Professor_Ashley: ethical kind of alarm bells that sound, it isn't necessarily a deterrent; it means you're thinking the right things, you're asking the right questions,

597
01:15:22.830 –> 01:15:29.640
Professor_Ashley: and you're putting your participants first and thinking about your research design. But it shouldn't dissuade you from doing that research; it should just

598
01:15:29.970 –> 01:15:46.200
Professor_Ashley: help encourage you to do that research ethically. And your local institutional review board, in whatever form that takes, is friendly and there to help you. For some reason, I think on campuses in particular, IRBs get this reputation for being this, like, draconian,

599
01:15:47.520 –> 01:15:50.250
Professor_Ashley: You know bureaucracy that’s just there to ruin your day.

600
01:15:51.510 –> 01:15:59.670
Professor_Ashley: We're really not. We want to see you be successful, because, you know, we're researchers too, and we're really excited to generate knowledge. It's really important to us.

601
01:16:02.370 –> 01:16:09.840
Shlomo Sher: Awesome. I'm going to ask you guys if there was anything else that you think our listeners should know about the ethics of research on human subjects involving video games, and I think,

602
01:16:10.290 –> 01:16:17.130
Shlomo Sher: actually, that was a great answer to that. Ann, really, unless you want to add anything to it, we'd love to hear from you as well.

603
01:16:17.340 –> 01:16:20.040
Professor_Ashley: I have one quick thing.

604
01:16:21.150 –> 01:16:34.320
Professor_Ashley: So much of research ethics is just not taught. So this is not necessarily ethics and games, but when I have students come through my user research class and I tell them about research ethics,

605
01:16:34.890 –> 01:16:40.440
Professor_Ashley: they, you know, they don't necessarily say this aloud, they're very kind of sweet people and they wouldn't hurt my feelings,

606
01:16:40.770 –> 01:16:45.660
Professor_Ashley: but I get the kind of "why should we be learning about this, we're just studying video games." And,

607
01:16:46.110 –> 01:16:56.370
Professor_Ashley: Ann made a point earlier that our history with research ethics in this country is absolutely atrocious, and I would encourage everyone, when they have the spoons,

608
01:16:57.270 –> 01:17:03.660
Professor_Ashley: which might not be anytime soon, because the world is difficult, but there is a book called Medical Apartheid by Harriet Washington.

609
01:17:04.230 –> 01:17:15.510
Professor_Ashley: And as part of my own self-improvement work for Black History Month, I decided to read that book. It deals with race, particularly the

610
01:17:16.170 –> 01:17:33.060
Professor_Ashley: unfair, unjust, and disproportionate amount of medical research that has been done on Black people throughout United States history without consent. And, I mean, full disclosure, it is a dark, dark book; it is something that you need to

611
01:17:34.170 –> 01:17:40.020
Professor_Ashley: prepare yourself for and do when you're in a good headspace, for sure, because, I mean, we have a lot

612
01:17:41.070 –> 01:17:45.630
Professor_Ashley: To be ashamed of in terms of our history, but it’s very important that we know it.

613
01:17:47.940 –> 01:17:58.110
Ann Johnson: And if I could add on to that, you know, examining our history is really important. It's brought us to where we are, and it's helped us to make better ethical decisions.

614
01:17:58.620 –> 01:18:12.210
Ann Johnson: However, what's really interesting about video game research is how fast the technology changes. You all know this. And when technology changes fast, our ability to keep up ethically kind of struggles.

615
01:18:12.840 –> 01:18:20.370
Ann Johnson: So one of the things IRBs have to grapple with is that changing landscape of what's acceptable, what's ethical, what's fair.

616
01:18:21.780 –> 01:18:28.680
Ann Johnson: Because our society changes, just like the technology changes. And so I would just encourage everybody to

617
01:18:29.160 –> 01:18:36.870
Ann Johnson: look at the potential for the future of being able to sit with ourselves and apply these ethical lenses

618
01:18:37.560 –> 01:18:52.410
Ann Johnson: to how video game research changes over time, what new research questions are going to come up because of that, but maybe also what new ethical quandaries we are going to be put in because of it. And that's what I feel is incredibly interesting

619
01:18:53.700 –> 01:19:15.570
Ann Johnson: about my field of study and what I do today. But it's also imperative, if we're not going to repeat history by treating people badly, that we continue to question the status quo and move the ethical conversation as fast as the technology is moving, too.

620
01:19:16.470 –> 01:19:17.490
Shlomo Sher: Well, said.

621
01:19:17.640 –> 01:19:18.300
Professor_Ashley: yeah very.

622
01:19:18.840 –> 01:19:19.500
A Ashcraft: Hard agree.

623
01:19:19.770 –> 01:19:20.400
Shlomo Sher: All right.

624
01:19:21.810 –> 01:19:30.300
Shlomo Sher: All right. Ann Johnson, Ashley Guajardo, thank you so much for coming on the show. We really, really appreciate it. Yay!

625
01:19:30.810 –> 01:19:32.070
Professor_Ashley: Thank you so much for having us.

626
01:19:32.550 –> 01:19:36.480
Shlomo Sher: All right, really great podcast, everybody.
