{"id":1054,"date":"2023-03-26T05:04:49","date_gmt":"2023-03-26T05:04:49","guid":{"rendered":"https:\/\/ethicsandvideogames.com\/?p=1054"},"modified":"2023-04-14T23:09:11","modified_gmt":"2023-04-14T23:09:11","slug":"episode-66-video-game-design-lessons-from-moral-psychology-with-paul-formosa-and-malcolm-ryan","status":"publish","type":"post","link":"https:\/\/ethicsandvideogames.com\/?p=1054","title":{"rendered":"Episode 66 \u2013 Video Game Design Lessons from Moral Psychology \u00a0(with Paul Formosa and Malcolm Ryan)"},"content":{"rendered":"\n<p>[Release Date: April 11, 2023]&nbsp;How do players morally engage with games?&nbsp; What can user experience research and moral psychology tell us about how players experience and think about ethical decisions in games?&nbsp; We chat with philosopher Paul Formosa and game designer Malcolm Ryan about their ongoing collaborative research exploring these questions.<\/p>\n\n\n\n<iframe style=\"border-radius:12px\" src=\"https:\/\/open.spotify.com\/embed\/episode\/2tgs82WTBupFzQyQjc29W2?utm_source=generator&#038;theme=0\" width=\"100%\" height=\"152\" frameBorder=\"0\" allowfullscreen=\"\" allow=\"autoplay; clipboard-write; encrypted-media; fullscreen; picture-in-picture\" loading=\"lazy\"><\/iframe>\n\n\n\n<p class=\"has-text-align-center\">SHOW TRANSCRIPT<\/p>\n\n\n\n<p>00:05:14.680 &#8211;&gt; 00:05:22.970<br>Shlomo Sher: All right. Welcome, everybody. We&#8217;re here with Paul Formosa, a professor of Philosophy, head of the Department of Philosophy, and Co-director of the Center for Agency, Values and Ethics.<\/p>\n\n\n\n<p>60<br>00:05:23.020 &#8211;&gt; 00:05:30.880<br>Shlomo Sher: You know, I should have stopped before we did this, Paul, and asked you guys: how do you pronounce it? Is it Macquarie?<\/p>\n\n\n\n<p>61<br>00:05:30.970 &#8211;&gt; 00:05:38.230<br>Shlomo Sher: Yeah, Macquarie. Okay. Anything else I need to know how to pronounce? No, we&#8217;re good.
Alright.<\/p>\n\n\n\n<p>62<br>00:05:38.460 &#8211;&gt; 00:05:49.509<br>Shlomo Sher: All right. Welcome, everybody. We&#8217;re here with Paul Formosa, a professor of Philosophy, head of the Department of Philosophy, and Co-director of the Center for Agency, Values and Ethics at Macquarie University in Sydney,<\/p>\n\n\n\n<p>63<br>00:05:49.520 &#8211;&gt; 00:06:16.329<br>Shlomo Sher: Australia. Paul has published widely on topics in moral and political philosophy, with a recent focus on ethical issues raised by new technology, such as video games and AI. He also collaborates regularly with colleagues from a range of different disciplines outside of philosophy, one of whom is our other guest today, Malcolm Ryan. Malcolm Ryan is course director of the Game Design and Development program in the School of Computing at Macquarie University in Sydney, Australia.<\/p>\n\n\n\n<p>64<br>00:06:16.340 &#8211;&gt; 00:06:23.919<br>Shlomo Sher: By the way, we love our Aussies. We&#8217;ve had so many on this show. You guys do some really interesting stuff.<\/p>\n\n\n\n<p>65<br>00:06:24.020 &#8211;&gt; 00:06:37.090<br>Shlomo Sher: Malcolm has published in game design, virtual reality, and artificial intelligence. His current research focuses on how players make ethical decisions in video games, in collaboration with colleagues from philosophy, psychology, and creative writing.<\/p>\n\n\n\n<p>66<br>00:06:37.270 &#8211;&gt; 00:06:50.279<br>Shlomo Sher: He blogs about video game ethics research at Morality Play dot org. So both of you guys really do collaboration with people from other fields, which is fantastic, and they&#8217;re here to talk to us together,<\/p>\n\n\n\n<p>67<br>00:06:50.360 &#8211;&gt; 00:06:53.550<br>Shlomo Sher: right? The research they do together<\/p>\n\n\n\n<p>68<br>00:06:53.560 &#8211;&gt; 00:07:18.580<br>Shlomo Sher: is really founded in moral psychology and user experience, or UX, research.
They&#8217;re interested in how players engage morally with games, how they exercise their ethical thinking to make decisions in games, and how they experience these kinds of decisions. The hope is that this research will lead to richer, more engaging ethical content in games, both for entertainment and education. And wow, that was a very long introduction.<\/p>\n\n\n\n<p>69<br>00:07:18.590 &#8211;&gt; 00:07:24.910<br>Shlomo Sher: Malcolm and Paul, welcome to the show. Yeah, thank you. Thank you very much for having us.<\/p>\n\n\n\n<p>70<br>00:07:26.040 &#8211;&gt; 00:07:42.409<br>Shlomo Sher: Okay. So our episode is about how what you&#8217;ve learned about moral psychology can help video games better engage with ethics. So, to set things up clearly for listeners: what is moral psychology?<\/p>\n\n\n\n<p>71<br>00:07:43.290 &#8211;&gt; 00:08:02.359<br>Malcolm Ryan: I&#8217;ll take that one. So essentially, moral psychology is the study of, I like to think of it as, how we do morality when we&#8217;re doing morality. As thinking creatures, when we&#8217;re being moral, we&#8217;ve got various processes going on in our minds, and moral psychology tries to work out how we do what we do when we&#8217;re being moral.<\/p>\n\n\n\n<p>72<br>00:08:02.440 &#8211;&gt; 00:08:12.219<br>Shlomo Sher: And so it&#8217;s different from, sorry, someone&#8217;s starting a mower outside. That&#8217;s very timely.<\/p>\n\n\n\n<p>73<br>00:08:12.230 &#8211;&gt; 00:08:22.990<br>Malcolm Ryan: It&#8217;s different from philosophical ethics in that it&#8217;s not asking, sort of theoretically, what is ethics, or what is right and wrong. It&#8217;s more asking: okay, you&#8217;re sitting down.
You&#8217;re making an ethical decision.<\/p>\n\n\n\n<p>74<br>00:08:23.040 &#8211;&gt; 00:08:26.180<br>Malcolm Ryan: What is going on in your head when you&#8217;re doing this?<\/p>\n\n\n\n<p>75<br>00:08:26.210 &#8211;&gt; 00:08:33.750<br>Malcolm Ryan: And I&#8217;m really interested in it because, you know, these are the questions we as game designers think about all the time: how does the player<\/p>\n\n\n\n<p>76<br>00:08:34.240 &#8211;&gt; 00:08:43.710<br>Malcolm Ryan: think about what they&#8217;re doing when they&#8217;re playing our game? In this case, the game presents a moral problem of some sort.<\/p>\n\n\n\n<p>77<br>00:08:43.780 &#8211;&gt; 00:08:47.739<br>Paul Formosa: What would I add to it?<\/p>\n\n\n\n<p>78<br>00:08:47.750 &#8211;&gt; 00:09:07.780<br>Paul Formosa: Well, I guess we&#8217;ll get into it as we go along. But look, I agree with Malcolm&#8217;s kind of definition there. And when we think about moral psychology and games, it&#8217;s pretty straightforward: how are people making those moral decisions in games? What sort of things influence those decisions, for example, things like morality meters, or the design of the game,<\/p>\n\n\n\n<p>79<br>00:09:07.790 &#8211;&gt; 00:09:23.530<br>Paul Formosa: the different characters they interact with, and so on. And the other thing we also want to think a little about in moral psychology is moral development and moral engagement. So, you know, when we&#8217;re thinking about moral development, we&#8217;re also thinking: how can people change, or get better or worse?<\/p>\n\n\n\n<p>80<br>00:09:23.560 &#8211;&gt; 00:09:39.030<br>Paul Formosa: And we can also think about how, by engagement with games, we might be able to make that a lot better or worse.
And what sort of different design strategies or techniques will interact or engage with those moral capacities. That&#8217;s the sort of thing we&#8217;ve been trying to look at in our research.<\/p>\n\n\n\n<p>81<br>00:09:39.080 &#8211;&gt; 00:09:41.770<br>Shlomo Sher: Okay. So<\/p>\n\n\n\n<p>82<br>00:09:41.960 &#8211;&gt; 00:09:51.589<br>Malcolm Ryan: Okay, so I&#8217;ll go. Historically, both in philosophical ethics and also in the psychology, there have sort of been 2<\/p>\n\n\n\n<p>83<br>00:09:51.670 &#8211;&gt; 00:10:04.170<br>Malcolm Ryan: contrasting approaches to thinking about how we do this. One is the more rational: okay, morality is about thinking about certain rules and standards, you know, and working out how to apply those.<\/p>\n\n\n\n<p>84<br>00:10:04.180 &#8211;&gt; 00:10:11.729<br>Malcolm Ryan: And so it&#8217;s historically a very sort of rationalist approach to how we do morality: we sit down and we solve moral problems.<\/p>\n\n\n\n<p>85<br>00:10:11.760 &#8211;&gt; 00:10:31.130<br>Malcolm Ryan: The alternative approach is thinking about morality more as character, or as sort of intrinsic, intuitive things. And some people will go so far as saying, you know, all that reasoning is just post hoc rationalization of what we emotionally want or don&#8217;t want.
And we just go along with that.<\/p>\n\n\n\n<p>86<br>00:10:31.140 &#8211;&gt; 00:10:44.510<br>Malcolm Ryan: Modern moral psychology seems to sort of prefer what&#8217;s called a dual process model, which is like the thinking fast and slow model that people are familiar with from Kahneman&#8217;s research, and so forth,<\/p>\n\n\n\n<p>87<br>00:10:44.520 &#8211;&gt; 00:10:52.189<br>Malcolm Ryan: where we do have these rational processes, and sometimes we sit down and we do reason through our moral responses.<\/p>\n\n\n\n<p>88<br>00:10:52.240 &#8211;&gt; 00:11:10.560<br>Malcolm Ryan: But we also have these intuitive moral responses and automatic moral responses, and in a lot of what we&#8217;re doing in real life, we don&#8217;t have the opportunity to sit down and do the moral math. We just make decisions automatically. And so this is sort of the model that informs what we&#8217;re doing:<\/p>\n\n\n\n<p>89<br>00:11:10.570 &#8211;&gt; 00:11:25.729<br>Malcolm Ryan: thinking about, yeah, we do have these times when we sit down and we really nut out a moral choice, but we also have times where we&#8217;re just sort of reacting morally to things. And so we&#8217;re using that sort of psychological model. And then there are sort of<\/p>\n\n\n\n<p>90<br>00:11:25.740 &#8211;&gt; 00:11:31.329<br>Malcolm Ryan: more detailed theories about the different components of that, and what we&#8217;re doing when we&#8217;re doing that.<\/p>\n\n\n\n<p>91<br>00:11:31.910 &#8211;&gt; 00:12:00.499<br>Shlomo Sher: Okay.
Games obviously can present you with lots of different opportunities, right, to engage both of those ways of thinking. Though when most people think of how video games try to engage players with ethics, I think what they typically tend to think about is morality meters, or big moral dilemmas with a lot at stake. Like, you know, my favorites are from the Mass Effect games, which give you options to choose from, and then you get rated<\/p>\n\n\n\n<p>92<br>00:12:00.510 &#8211;&gt; 00:12:17.009<br>Shlomo Sher: for your actions as Paragon or Renegade, or good or evil, or something like that, and they affect your character&#8217;s moral alignment on some sort of morality meter. Your research shows that this sort of focus is too narrow, right, that there are other aspects of our moral expertise<\/p>\n\n\n\n<p>93<br>00:12:17.020 &#8211;&gt; 00:12:35.560<br>Shlomo Sher: that games engage with. And your notes use this phrase, moral expertise, as if it&#8217;s clear, as if we know what that means. So before we talk about moral expertise: what exactly are morality meters? And are there really substantially different types of morality meters?<\/p>\n\n\n\n<p>94<br>00:12:36.050 &#8211;&gt; 00:12:51.879<br>Malcolm Ryan: So I mean, some sort of moral accounting in games has been around since at least the early Ultima games,
where you get points for goodness or evilness in the game.<\/p>\n\n\n\n<p>95<br>00:12:51.930 &#8211;&gt; 00:13:00.669<br>Malcolm Ryan: I guess the morality meter really kind of first appeared as a visible on-screen mechanic in Knights of the Old Republic.<\/p>\n\n\n\n<p>96<br>00:13:00.850 &#8211;&gt; 00:13:13.759<br>Malcolm Ryan: I think that was the first game that had an explicit meter on the screen telling you how many points of good or evil you had, and as you made decisions in the game, that was reflected there. And so, essentially,<\/p>\n\n\n\n<p>97<br>00:13:13.770 &#8211;&gt; 00:13:31.970<br>Malcolm Ryan: there are lots of variations on this, and lots of games that have some moral accounting going on, whether behind the scenes or explicitly presented to you. But when we&#8217;re talking about morality meters, especially in the research we&#8217;re doing, we mean a very visible, on-screen,<\/p>\n\n\n\n<p>98<br>00:13:31.980 &#8211;&gt; 00:13:40.690<br>Malcolm Ryan: very explicit meter showing you: yes, this decision resulted in you getting 10 points of good; this decision resulted in you getting<\/p>\n\n\n\n<p>99<br>00:13:40.770 &#8211;&gt; 00:13:54.230<br>Malcolm Ryan: negative 10 points, or 10 points of evil. The meter starts somewhere at 0 and moves towards good or evil as the game assesses your moral character, in some sense, based on your choices.<\/p>\n\n\n\n<p>100<br>00:13:54.300 &#8211;&gt; 00:13:59.829<br>Malcolm Ryan: And there are a lot of games that do have
this as a very explicit mechanic in the game.<\/p>\n\n\n\n<p>101<br>00:14:00.080 &#8211;&gt; 00:14:03.860<br>A Ashcraft: Would you count the giant beast in Black and White<\/p>\n\n\n\n<p>102<br>00:14:04.080 &#8211;&gt; 00:14:06.260<br>A Ashcraft: as a morality meter?<\/p>\n\n\n\n<p>103<br>00:14:06.270 &#8211;&gt; 00:14:24.480<br>Malcolm Ryan: Yeah, there&#8217;s definitely a kind of moral accounting going on there, and it&#8217;s showing it. Similarly, I think Fable does it much more in terms of showing your avatar changing and expressing it in that way. So there are definitely more subtle ways of showing this sort of thing, but still, behind the scenes there&#8217;s<\/p>\n\n\n\n<p>104<br>00:14:24.490 &#8211;&gt; 00:14:36.890<br>Malcolm Ryan: a numeric value in the game, which is, you are currently at 20 points of good, and that is represented on the screen somehow, visually. I think, I mean, we&#8217;re looking at very explicit meters because, you know, we wanted,<\/p>\n\n\n\n<p>105<br>00:14:37.030 &#8211;&gt; 00:14:42.120<br>Malcolm Ryan: for the research starting out, we wanted to say, yeah, this is the thing; see how it affects people&#8217;s<\/p>\n\n\n\n<p>106<br>00:14:42.360 &#8211;&gt; 00:14:59.079<br>Malcolm Ryan: decisions. But ultimately there are lots of ways this can be done.
I guess a related mechanic is the kind of reputation mechanic that you see in games like Fallout: New Vegas and other games like that, where it&#8217;s less explicitly about telling you<\/p>\n\n\n\n<p>107<br>00:14:59.130 &#8211;&gt; 00:15:07.400<br>Malcolm Ryan: your overall good or evil, and more about, well, this faction or this individual regards you as good or evil, which<\/p>\n\n\n\n<p>108<br>00:15:07.480 &#8211;&gt; 00:15:09.280<br>Malcolm Ryan: allows you<\/p>\n\n\n\n<p>109<br>00:15:09.340 &#8211;&gt; 00:15:19.879<br>Malcolm Ryan: as a designer to present more than one axis of morality, which is again something we haven&#8217;t gone into yet. Something I&#8217;m interested in exploring further is: what if we have<\/p>\n\n\n\n<p>110<br>00:15:20.280 &#8211;&gt; 00:15:39.849<br>Malcolm Ryan: different morality meters? So far we have sort of looked at having one morality meter, but having it express different kinds of morality. But it would be interesting for the research to look at: well, you&#8217;ve got this meter which tells you, according to this person&#8217;s standards, you&#8217;re a good person, but according to this other person&#8217;s standards, you&#8217;re less good.<\/p>\n\n\n\n<p>111<br>00:15:39.930 &#8211;&gt; 00:15:48.569<br>Shlomo Sher: Right. To me, those 2 things seem dramatically different, and it makes sense that you guys separate them.<\/p>\n\n\n\n<p>112<br>00:15:48.630 &#8211;&gt; 00:15:58.039<br>Shlomo Sher: You know, one is judging you very explicitly. The other one is not so much judging you, but kind of letting you know that you have a reputation<\/p>\n\n\n\n<p>113<br>00:15:58.070 &#8211;&gt; 00:16:05.980<br>Shlomo Sher: right, among different groups, and allows you reputation management, rather than to kind of,
you know, mess with your<\/p>\n\n\n\n<p>114<br>00:16:06.570 &#8211;&gt; 00:16:23.950<br>Shlomo Sher: I don&#8217;t know, overall alignment, or, depending how you look at it, overall character. So tell us about the studies that you guys did. You ran a study of morality meters, and it included creating a game specifically for that purpose. What did you guys learn about how users interact with morality meters?<\/p>\n\n\n\n<p>115<br>00:16:24.230 &#8211;&gt; 00:16:25.520<br>Malcolm Ryan: Do you want to talk about that, Paul?<\/p>\n\n\n\n<p>116<br>00:16:25.570 &#8211;&gt; 00:16:29.379<br>Paul Formosa: How about you start and talk about the game, and I&#8217;ll talk about the qualitative study.<\/p>\n\n\n\n<p>117<br>00:16:29.540 &#8211;&gt; 00:16:44.370<br>Malcolm Ryan: Okay, yeah. So our aim was to look at this very basically, because there&#8217;s a lot of conversation in the research about whether morality meters are good or bad.<\/p>\n\n\n\n<p>118<br>00:16:44.380 &#8211;&gt; 00:17:00.619<br>Malcolm Ryan: A lot of the concern is whether they just purely instrumentalize the morality and turn it into a point-scoring exercise, so people don&#8217;t actually think morally about their choices; they just go and, you know, choose the thing that is labeled good. And there is<\/p>\n\n\n\n<p>119<br>00:17:00.720 &#8211;&gt; 00:17:09.460<br>Malcolm Ryan: lots of research that is based on, you know, player ethnography or deep readings of games, but very little mass research,<\/p>\n\n\n\n<p>120<br>00:17:09.480 &#8211;&gt; 00:17:12.400<br>Malcolm Ryan: you know, getting lots of people to play a game and seeing what happens.<\/p>\n\n\n\n<p>121<br>00:17:12.450 &#8211;&gt; 00:17:18.989<br>Malcolm Ryan: And so we wanted to look into that and get some real data on whether the morality
meter actually changes people&#8217;s decisions.<\/p>\n\n\n\n<p>122<br>00:17:19.130 &#8211;&gt; 00:17:25.630<br>Malcolm Ryan: And we had a bunch of different kinds of ethical decisions we wanted to represent in the game, and we wanted to show whether<\/p>\n\n\n\n<p>123<br>00:17:25.720 &#8211;&gt; 00:17:32.829<br>Malcolm Ryan: different kinds of morality meters, telling you different kinds of morality, had an effect on the player.<\/p>\n\n\n\n<p>124<br>00:17:33.320 &#8211;&gt; 00:17:40.029<br>Malcolm Ryan: In order to do this, we looked at existing games, but there are very few games out there that are<\/p>\n\n\n\n<p>125<br>00:17:40.280 &#8211;&gt; 00:17:51.529<br>Malcolm Ryan: short enough to play in a research setting, that give us the explicit mechanics and control over those mechanics in the way that we wanted, to change up the design<\/p>\n\n\n\n<p>126<br>00:17:51.580 &#8211;&gt; 00:18:07.010<br>Malcolm Ryan: and all those other factors, and allow us to measure the things we want to measure. And so we thought, okay, we really need to make our own game in order to do this properly. That way we have the full source of the game, and we can make any design changes we want.<\/p>\n\n\n\n<p>127<br>00:18:07.020 &#8211;&gt; 00:18:19.970<br>Malcolm Ryan: And so we specifically designed this game, The Great Fire, which you can play on the Morality Play webpage. Actually, it&#8217;s on Morality Play on itch.io;<\/p>\n\n\n\n<p>128<br>00:18:20.130 &#8211;&gt; 00:18:22.210<br>you can play it there.<\/p>\n\n\n\n<p>129<br>00:18:22.400 &#8211;&gt; 00:18:38.529<br>Malcolm Ryan: For the game,
we engaged a creative writing academic from Macquarie and a local game development team, Chaos Theory Games, and they helped write the game and develop the game, and produced a beautiful game.<\/p>\n\n\n\n<p>130<br>00:18:38.940 &#8211;&gt; 00:18:46.639<br>Malcolm Ryan: It&#8217;s sort of a film noir game set in a cinema in sort of 1940s country Australia,<\/p>\n\n\n\n<p>131<br>00:18:46.680 &#8211;&gt; 00:18:58.159<br>Malcolm Ryan: and you play an usher at the cinema. It&#8217;s sort of a visual novel style game telling the story of one day, one tragic day, at work.<\/p>\n\n\n\n<p>132<br>00:18:58.430 &#8211;&gt; 00:19:13.120<br>Malcolm Ryan: And as such, you have a lot of difficult moral decisions to make along the way, and we were then able to record exactly what players were doing in those moral decisions, and present a variety of different morality meters there,<\/p>\n\n\n\n<p>133<br>00:19:13.130 &#8211;&gt; 00:19:42.879<br>Malcolm Ryan: showing different things. So we had one sort of morality meter in which the obviously intuitively good choices were all labeled as good, and the obviously intuitively evil choices were labeled as evil, although we did have some choices that are actually sort of closer to being trolley problem dilemmas, where it wasn&#8217;t clearly good or evil. And so we were interested in seeing how changing that meter around, recommending some things as good and some things as evil, would change<\/p>\n\n\n\n<p>134<br>00:19:42.890 &#8211;&gt; 00:19:44.060<br>Malcolm Ryan: things.<\/p>\n\n\n\n<p>135<br>00:19:44.690 &#8211;&gt; 00:19:58.949<br>Malcolm Ryan: Hmm. Yeah. And so, by making our own game, we could then run an experiment.
Actually, we ran lots of experiments. The nice thing about having our own game is we can keep coming up with another experiment idea and doing a different version of it.<\/p>\n\n\n\n<p>136<br>00:19:59.020 &#8211;&gt; 00:20:04.420<br>Malcolm Ryan: But the first one, the one Paul will talk about: we ran a qualitative study where we got, I can&#8217;t remember,<\/p>\n\n\n\n<p>137<br>00:20:04.720 &#8211;&gt; 00:20:15.719<br>Paul Formosa: around 25, I think. We had 2 groups play the game, basically, with one group having the intuitive meter setting. And I think there were 7 or 8 key choices,<\/p>\n\n\n\n<p>138<br>00:20:15.730 &#8211;&gt; 00:20:32.800<br>Paul Formosa: and we wanted them to be nicely structured. So we had basic morality versus self-interest type choices, in a contact and a non-contact version: in one, you sort of trip over someone and steal their money; in the other, the money is in a stand and you steal it. We had 2 trolley problems, which are basically deontological versus consequentialist type decisions.<\/p>\n\n\n\n<p>139<br>00:20:32.810 &#8211;&gt; 00:20:43.120<br>Paul Formosa: So one is a version of the standard trolley problem where you pull a lever, and one is a version of the one where you push the large man off the bridge; in our case you have to kick a chair out from under somebody.<\/p>\n\n\n\n<p>140<br>00:20:43.220 &#8211;&gt; 00:20:59.959<br>Paul Formosa: And then we had a nominal choice, a coin toss, which should be 50-50, so we wanted to see, okay, does the meter impact something like that?
And we had another choice at the end, which sort of came up organically, about getting revenge on the evil person who does all this damage, or you could hand them over to the police.<\/p>\n\n\n\n<p>141<br>00:21:00.280 &#8211;&gt; 00:21:16.290<br>Paul Formosa: And so, as Malcolm said, one group got what we consider the intuitive meter, where doing the moral thing rather than the self-interested thing counts as good: so stealing would count as evil and not stealing would count as good. And the other group had that flipped around, so stealing counts as good, not stealing counts as evil,<\/p>\n\n\n\n<p>142<br>00:21:16.520 &#8211;&gt; 00:21:30.829<br>Paul Formosa: and likewise with the others. Now, of course, with the trolley problems it&#8217;s much more ambiguous, you know, although most people think you should pull the lever; not everyone agrees. And then we should probably explain what the trolley problem is, just in case somebody doesn&#8217;t know what it is.<\/p>\n\n\n\n<p>143<br>00:21:30.840 &#8211;&gt; 00:21:45.490<br>Paul Formosa: Yeah. So trolley problems are widely studied in moral psychology. There&#8217;s a lot of studies on them, and they come out of the work of the philosopher Philippa Foot. And basically the idea is: imagine there&#8217;s a runaway train, it&#8217;s hurtling down a track, and there&#8217;s a lever there.<\/p>\n\n\n\n<p>144<br>00:21:45.500 &#8211;&gt; 00:21:54.430<br>Paul Formosa: There&#8217;s 3 people on the track. If you don&#8217;t do anything, the train is gonna run over and kill those 3 people. If you pull this lever, it&#8217;ll turn the train onto another path, and it&#8217;ll kill one person.<\/p>\n\n\n\n<p>145<br>00:21:54.560 &#8211;&gt; 00:21:59.979<br>Paul Formosa: What do you do? Well, most people think, oh, in that case, pull the lever, save the 3.
Unfortunately, you&#8217;ve killed one.<\/p>\n\n\n\n<p>146<br>00:22:00.250 &#8211;&gt; 00:22:01.110<br>Paul Formosa: Now<\/p>\n\n\n\n<p>147<br>00:22:01.160 &#8211;&gt; 00:22:08.080<br>Paul Formosa: there&#8217;s a variation, what&#8217;s called the fat man or large man version, where you imagine there&#8217;s a bridge. Same thing: a runaway train.<\/p>\n\n\n\n<p>148<br>00:22:08.150 &#8211;&gt; 00:22:33.809<br>Paul Formosa: Once again there&#8217;s 3 people on the track, but instead of pulling a lever, to stop the train killing the 3 you have to push a large man off a bridge. Now, you can&#8217;t jump off yourself, you&#8217;re too small, or something like that, but the large man will somehow stop the train. I don&#8217;t know how it&#8217;s supposed to happen, but anyway, his mass, or his intestines, will get into the wheels. I really don&#8217;t know; this is where the hand waving comes in.<\/p>\n\n\n\n<p>149<br>00:22:33.820 &#8211;&gt; 00:22:38.730<br>Paul Formosa: But anyway, most people think, well, actually no, it&#8217;d be wrong to push this large guy off<\/p>\n\n\n\n<p>150<br>00:22:38.740 &#8211;&gt; 00:22:56.790<br>Paul Formosa: the bridge. And then the question is, what&#8217;s the difference? Right, in both cases one person is dying and 3 people are saved. So why is it okay to pull the lever, but not push the large guy? And there are lots of answers people give. One: you&#8217;re using the large man as a means, whereas in the other case it&#8217;s just a foreseen side effect, not an intended consequence; and there&#8217;s the doctrine of double effect, and all sorts of<\/p>\n\n\n\n<p>151<br>00:22:56.800 &#8211;&gt; 00:23:10.570<br>Paul Formosa: philosophical issues which are not that interesting here.
So what we wanted, though, is to put equivalent versions into our game. And the reason they&#8217;re interesting is because there are competing moral concerns, right?<\/p>\n\n\n\n<p>152<br>00:23:10.580 &#8211;&gt; 00:23:21.680<br>Paul Formosa: It would be good to save the 3, but is it all right to, you know, use personal violence in one case, or impersonal levers in the other case?<\/p>\n\n\n\n<p>153<br>00:23:21.820 &#8211;&gt; 00:23:23.399<br>Paul Formosa: So basically, what we found<\/p>\n\n\n\n<p>154<br>00:23:23.430 &#8211;&gt; 00:23:30.039<br>Paul Formosa: was really interesting. Now, of course, it&#8217;s only a small study, that needs to be kept in mind, and Malcolm will talk about a bigger study afterwards.<\/p>\n\n\n\n<p>155<br>00:23:30.730 &#8211;&gt; 00:23:43.640<br>Paul Formosa: But basically, we went through and coded the various themes that people brought up, and we found 4 main sorts of responses when we asked people about how they thought about and engaged with the meter.<\/p>\n\n\n\n<p>156<br>00:23:43.870 &#8211;&gt; 00:23:50.029<br>Paul Formosa: Well, 2 of them were about indifference to or rejection of the meter. So about half the people<\/p>\n\n\n\n<p>157<br>00:23:50.050 &#8211;&gt; 00:24:04.840<br>Paul Formosa: just said, I ignored it, didn&#8217;t pay attention to it, or deliberately put it aside or rejected it, or something like that. So about half the people in both our groups, whether the meter was intuitive or non-intuitive, just rejected it or were indifferent to it. I think that&#8217;s an interesting finding,<\/p>\n\n\n\n<p>158<br>00:24:04.920 &#8211;&gt; 00:24:13.949<br>Paul Formosa: that half of people just didn&#8217;t engage with the meter at all, didn&#8217;t particularly like it, didn&#8217;t want it influencing their decisions.
They wanted to make decisions on their own merits, not with the meter telling them what to do.<\/p>\n\n\n\n<p>159<br>00:24:14.640 &#8211;&gt; 00:24:24.470<br>Paul Formosa: But I think the more interesting finding is that in our intuitive-meter group, the most common thing was treating the meter as a guide, as a moral guide. So people would say things like,<\/p>\n\n\n\n<p>160<br>00:24:25.060 &#8211;&gt; 00:24:40.880<br>Paul Formosa: I didn&#8217;t just follow it, but it made me sort of stop and think; or, without it I would have just kind of done whatever, but it made me think seriously about what was going on, or it got me thinking about this. So they actually treated it as a kind of useful moral guide.<\/p>\n\n\n\n<p>161<br>00:24:40.930 &#8211;&gt; 00:24:49.389<br>Shlomo Sher: And so was the idea that they know they&#8217;re being judged in some way, because they know it&#8217;s going to affect where they are on the meter? Is that the idea?<\/p>\n\n\n\n<p>162<br>00:24:50.190 &#8211;&gt; 00:25:07.619<br>Paul Formosa: Okay, so I mean, these are all self-report things; this is what people tell us. What&#8217;s actually going on we get to in a second, when we look at the quantitative study, where we actually see: is it making a difference or not? But what people told us was that they would take the meter into account when making decisions and thinking about what to do, and so on.<\/p>\n\n\n\n<p>163<br>00:25:08.050 &#8211;&gt; 00:25:10.449<br>Paul Formosa: So that was the most common response when it was the intuitive meter.<\/p>\n\n\n\n<p>164<br>00:25:10.670 &#8211;&gt; 00:25:21.320<br>Paul Formosa: But when we look at the non-intuitive meter group, it flipped around. That was the least common thing, and the most common thing was treating the meter as a score.<\/p>\n\n\n\n<p>165<br>00:25:21.700 &#8211;&gt; 00:25:25.200<br>Paul Formosa: So they thought of it as a score.
They said things like, oh,<\/p>\n\n\n\n<p>166<br>00:25:25.540 &#8211;&gt; 00:25:37.120<br>Paul Formosa: I thought it was the wrong thing to do, but it said it was 15 good, so I did it. Or, I was already on 90, and so I did it because I wanted to get to 100 on my meter. Or other people would have different sorts of<\/p>\n\n\n\n<p>167<br>00:25:37.140 &#8211;&gt; 00:25:44.739<br>Paul Formosa: strategies; they might say, oh, I want to sort of keep my meter about halfway, so this one was plus 15 and would end up getting me back to about halfway, so I picked that.<\/p>\n\n\n\n<p>168<br>00:25:45.350 &#8211;&gt; 00:26:02.780<br>Paul Formosa: So you can think of these as 2 approaches: a much more instrumental relationship to the meter, and a more, I guess, intrinsic relationship, where it&#8217;s getting you to think about morality. So it&#8217;s really interesting: when the meter was non-intuitive, people just instrumentalized<\/p>\n\n\n\n<p>169<br>00:26:02.790 &#8211;&gt; 00:26:08.419<br>Paul Formosa: the meter a lot more. It was just another mechanic, another score to be optimized or do what you want with.<\/p>\n\n\n\n<p>170<br>00:26:08.500 &#8211;&gt; 00:26:26.180<br>Paul Formosa: It was kind of meaningless, in a sense, to them, morally. It wasn&#8217;t promoting reflection or anything like that; whereas the intuitive meter tended to do that much more often. So I thought that was a really interesting finding, that it did seem to lead to this more reflective play
if the meter was intuitive, and more instrumental play if it was not.<\/p>\n\n\n\n<p>171<br>00:26:26.190 &#8211;&gt; 00:26:31.099<br>A Ashcraft: So the difference between the intuitive meter and the non-intuitive meter is?<\/p>\n\n\n\n<p>172<br>00:26:32.070 &#8211;&gt; 00:26:43.820<br>Malcolm Ryan: The non-intuitive meter is flipped around, so that every choice that you might intuitively think of as good was labeled evil, and vice versa. Stealing money was good; you get points for good.<\/p>\n\n\n\n<p>173<br>00:26:44.000 &#8211;&gt; 00:26:58.669<br>Malcolm Ryan: Kicking a dog was one of the most controversial choices amongst all the players. There&#8217;s just a harmless dog sitting there, and it says, do you want to kick the dog or not kick the dog?<\/p>\n\n\n\n<p>174<br>00:26:58.740 &#8211;&gt; 00:27:28.269<br>Malcolm Ryan: Surprisingly, of all the choices, that was the one where people just went, no, there&#8217;s no way I&#8217;m kicking the dog. There&#8217;s a choice at the end of whether you want to shoot a guy, and people were much happier with that idea than they were with just kicking the dog. Although there was one person in the study who played with the counterintuitive meter and just followed it on every single choice, because when I asked him, why did you steal the money, he said, because the meter says it was good,<\/p>\n\n\n\n<p>175<br>00:27:28.300 &#8211;&gt; 00:27:47.170<br>Malcolm Ryan: and just did the thing. So a lot of our results are, in general, very generalized, but there are some people who react very differently to this thing. Some people completely ignored it and said, no, I paid no attention to the meter at all, I just did my own thing. And some people went, 
yeah,<\/p>\n\n\n\n<p>176<br>00:27:47.270 &#8211;&gt; 00:27:56.989<br>Malcolm Ryan: I did what the meter told me, and that was the whole game for me.<\/p>\n\n\n\n<p>177<br>00:27:57.090 &#8211;&gt; 00:28:01.859<br>A Ashcraft: Do you suppose that some of that has to do with, if it&#8217;s a non-intuitive meter, it&#8217;s creating some sort of<\/p>\n\n\n\n<p>178<br>00:28:02.120 &#8211;&gt; 00:28:04.679<br>A Ashcraft: friction, like I have to resolve<\/p>\n\n\n\n<p>179<br>00:28:04.800 &#8211;&gt; 00:28:14.790<br>A Ashcraft: the difference, the cognitive dissonance, between what I think is good and what this game is telling me is good?<\/p>\n\n\n\n<p>180<br>00:28:15.490 &#8211;&gt; 00:28:29.210<br>Paul Formosa: Yeah, that&#8217;s definitely something we found in the next study, which we should tell you about in a second. The first thing we found was with the reaction times: the first time they got that counterintuitive thing, there was a very long reaction time, because, I think, it&#8217;s, whoa, what&#8217;s going on? They&#8217;re trying to process it.<\/p>\n\n\n\n<p>181<br>00:28:29.250 &#8211;&gt; 00:28:49.180<br>Paul Formosa: We also have in that second study, which Malcolm will talk about in a minute, a mixed meter, which was intuitive up until the trolley problems and then flipped around. And so that was trying to create exactly what you said: trying to get over that cognitive dissonance, trying to build some trust and then flipping it around and looking at the impacts.<\/p>\n\n\n\n<p>182<br>00:28:49.190 &#8211;&gt; 00:29:09.079<br>Malcolm Ryan: Yeah, so like Paul said, we ran 4 versions: the baseline, with no meter at all, just to see how people would normally play the game; the intuitive meter, which recommended everything you would normally think of as good as good. 
The counterintuitive meter, which labeled everything that you normally think of as good as evil; and then the mixed meter, which started off<\/p>\n\n\n\n<p>183<br>00:29:09.090 &#8211;&gt; 00:29:20.400<br>Malcolm Ryan: intuitive for the first 5 or 6 decisions. It was doing the intuitive thing: don&#8217;t kick the dog, don&#8217;t steal the money, everything like that. Then they got to the trolley problems, and we switched it over. And so we said, you know, what happens if we switch these?<\/p>\n\n\n\n<p>184<br>00:29:20.480 &#8211;&gt; 00:29:35.039<br>Paul Formosa: I should say what the rating was in the trolley problems, actually, as that will probably help. In the standard trolley problem, the switch case, we said that the intuitive thing, the good thing to do, is pull the lever. In the<\/p>\n\n\n\n<p>185<br>00:29:35.090 &#8211;&gt; 00:29:48.589<br>Paul Formosa: large man version, the push-off-the-bridge version, which in our case means you have to kick a chair out from under somebody, which would hang them, most people tend to think that it&#8217;s wrong to push the large guy off the bridge, or in this case kick the chair out. So the intuitive thing in that case was not<\/p>\n\n\n\n<p>186<br>00:29:48.720 &#8211;&gt; 00:29:50.200<br>Paul Formosa: to kill the one person.<\/p>\n\n\n\n<p>187<br>00:29:50.590 &#8211;&gt; 00:29:53.789<br>Malcolm Ryan: And the baseline results, without the meters,<\/p>\n\n\n\n<p>188<br>00:29:53.940 &#8211;&gt; 00:30:12.129<br>Malcolm Ryan: reflected this throughout: people would mostly do the intuitively good thing. There was still 10% of people who kicked the dog, and one of the things that people did talk about in the qualitative study was, well, it&#8217;s just a game, I wanted to see what would happen, things like that. Or otherwise 
people would also say, it&#8217;s a game,<\/p>\n\n\n\n<p>189<br>00:30:12.410 &#8211;&gt; 00:30:32.150<br>Malcolm Ryan: I&#8217;m going to get rewarded for being good, or I&#8217;m going to get punished for being evil. So people had expectations of how the game would respond. We designed the game to be a very linear narrative. For people at home it might look like your choices really matter, but it was only the meter that really told you anything about that.<\/p>\n\n\n\n<p>190<br>00:30:32.180 &#8211;&gt; 00:30:40.570<br>Shlomo Sher: Yeah, it&#8217;s interesting with a setup like that, knowing that people are going to interpret the usefulness of the meter differently.<\/p>\n\n\n\n<p>191<br>00:30:40.900 &#8211;&gt; 00:30:45.350<br>Shlomo Sher: How do you get good results when you know that<\/p>\n\n\n\n<p>192<br>00:30:45.910 &#8211;&gt; 00:30:55.300<br>Shlomo Sher: some people will take the meter seriously, and other people will essentially use it as a mechanic that they think they&#8217;re just supposed to follow in order to do well in the game?<\/p>\n\n\n\n<p>193<br>00:30:55.310 &#8211;&gt; 00:31:13.260<br>Malcolm Ryan: Yeah. So we deliberately avoided telling them anything about the meter, and none of the instructions mentioned it at all, because what we really wanted to measure is, how do people respond to this at all? We&#8217;re not telling you you should do the good thing. There are no game instructions at all; there&#8217;s nothing in the game that says,<\/p>\n\n\n\n<p>194<br>00:31:13.270 &#8211;&gt; 00:31:20.349<br>Malcolm Ryan: this is what you should be doing. 
All there is is a bar at the top of the screen, with the word good at one end and the word evil at the other end, and it moves.<\/p>\n\n\n\n<p>195<br>00:31:20.540 &#8211;&gt; 00:31:24.489<br>Malcolm Ryan: And so that&#8217;s very minimal, and people<\/p>\n\n\n\n<p>196<br>00:31:24.550 &#8211;&gt; 00:31:36.930<br>Malcolm Ryan: project whatever meaning they expect onto that. Some of our players would have been quite game-literate, people who had played games before and would recognize that as a sort of thing, and that came through in the qualitative study. There are different people who<\/p>\n\n\n\n<p>197<br>00:31:37.000 &#8211;&gt; 00:31:46.179<br>Malcolm Ryan: had played this sort of game before and knew what this meant. But there are also people who were just responding to it as, yeah, there&#8217;s a thing on the screen that&#8217;s telling me that I&#8217;m good.<\/p>\n\n\n\n<p>198<br>00:31:47.230 &#8211;&gt; 00:32:04.600<br>Paul Formosa: You know, what does that mean? I should add quickly that in those big choices where the meter was affected, it did come up with the choice. It would say, you know, kick the dog, plus 15 good; don&#8217;t kick the dog, minus 15. So it was clear, before you made the choice, that it would affect the meter in this way.<\/p>\n\n\n\n<p>199<br>00:32:04.640 &#8211;&gt; 00:32:24.400<br>Malcolm Ryan: Yeah, for the sake of the experiment we wanted to be up front about that. It was something we went back and forth on in the design, but we figured it was necessary. A lot of games have these things hidden,<\/p>\n\n\n\n<p>200<br>00:32:24.410 &#8211;&gt; 00:32:41.809<br>Malcolm Ryan: but ultimately you&#8217;re doing the same sorts of behaviors over and over again, so you do learn what morality is attached to things, or you can go spoil yourself. 
But, you know, that means reading spoilers about how many karma points something is worth. We wanted to be up front about that, to let you know what the effect was going to be, because there wasn&#8217;t enough game<\/p>\n\n\n\n<p>201<br>00:32:41.820 &#8211;&gt; 00:32:46.110<br>Malcolm Ryan: to learn to predict what the meter might be rewarding you for.<\/p>\n\n\n\n<p>202<br>00:32:46.530 &#8211;&gt; 00:33:06.070<br>Malcolm Ryan: The interesting thing in the baseline data was that our expectations were validated. People mostly chose the intuitively good options for everything except for the large man, or footbridge, version of the trolley problem, which in our game was<\/p>\n\n\n\n<p>203<br>00:33:06.080 &#8211;&gt; 00:33:12.900<br>Malcolm Ryan: your boss. I&#8217;m spoiling the game here, so if you want, go play the game and come back.<\/p>\n\n\n\n<p>204<br>00:33:12.910 &#8211;&gt; 00:33:28.539<br>Malcolm Ryan: In the game there&#8217;s a crazy guy on the loose, killing people and setting fire to things and whatever. And there&#8217;s one scene where he sets it up so your boss is standing on a chair with a noose around his neck,<\/p>\n\n\n\n<p>205<br>00:33:28.550 &#8211;&gt; 00:33:42.039<br>Malcolm Ryan: and he tells you there&#8217;s a bomb wired to another room where other people will die, and he says, if you don&#8217;t kick the chair now and kill your boss, I will blow up the room and kill the 3 other people.<\/p>\n\n\n\n<p>206<br>00:33:42.630 &#8211;&gt; 00:34:00.609<br>Malcolm Ryan: And in the baseline data we got exactly 50-50: 50% of people chose to kill their boss and save 3 lives, and 50% of people chose to do the opposite, to say no, they wouldn&#8217;t kill him. 
And then the 3 people died.<\/p>\n\n\n\n<p>207<br>00:34:00.620 &#8211;&gt; 00:34:14.039<br>Malcolm Ryan: And I thought that was really interesting, because we had a real moral dilemma there that didn&#8217;t have an intuitively good thing to do. And this was where we could really see: is the meter going to affect that? Is the meter going to sway people&#8217;s opinions one way or another?<\/p>\n\n\n\n<p>208<br>00:34:14.050 &#8211;&gt; 00:34:21.639<br>Malcolm Ryan: And the data was interesting, in that it depended on the type of meter, and it depended on the type of decision.<\/p>\n\n\n\n<p>209<br>00:34:22.179 &#8211;&gt; 00:34:23.219<br>Malcolm Ryan: So<\/p>\n\n\n\n<p>210<br>00:34:23.320 &#8211;&gt; 00:34:42.130<br>Malcolm Ryan: for the early decisions, where there was an intuitively good choice, people largely weren&#8217;t affected by the meter. Giving them a meter which said, you should steal money, you should kick the dog, like Paul said, there was this reaction time where people were like, wait, what? I don&#8217;t understand why that&#8217;s good. But then<\/p>\n\n\n\n<p>211<br>00:34:42.139 &#8211;&gt; 00:35:00.790<br>Malcolm Ryan: very quickly people just seemed to learn to ignore the meter and say, I&#8217;m going to do the good thing anyway. I&#8217;m not going to steal money just because the meter says it&#8217;s good. I&#8217;m not going to kick a dog just because the meter says it&#8217;s good. So we sort of validated that. 
There&#8217;s some existing research which says, yeah, people play good regardless of what you tell them to do in most of these games.<\/p>\n\n\n\n<p>212<br>00:35:00.810 &#8211;&gt; 00:35:06.520<br>Malcolm Ryan: And so that was in line with previous research.<\/p>\n\n\n\n<p>213<br>00:35:06.800 &#8211;&gt; 00:35:10.629<br>Malcolm Ryan: But then we came to the trolley problems, and we said, well, okay,<\/p>\n\n\n\n<p>214<br>00:35:10.720 &#8211;&gt; 00:35:14.039<br>Malcolm Ryan: what do people do here? And again, it depends what the meter had been telling you.<\/p>\n\n\n\n<p>215<br>00:35:14.070 &#8211;&gt; 00:35:23.160<br>Malcolm Ryan: If the meter had been telling you evil things all along, saying kick the dog, saying steal the money,<\/p>\n\n\n\n<p>216<br>00:35:23.360 &#8211;&gt; 00:35:30.499<br>Malcolm Ryan: the meter had no effect on the trolley problems. It was still 50-50 in that case, so people had just learnt that the meter<\/p>\n\n\n\n<p>217<br>00:35:30.550 &#8211;&gt; 00:35:33.420<br>Malcolm Ryan: is not a moral guide, I&#8217;m going to ignore it,<\/p>\n\n\n\n<p>218<br>00:35:33.440 &#8211;&gt; 00:35:37.599<br>Malcolm Ryan: and people seemed to make choices the same way as before.<\/p>\n\n\n\n<p>219<br>00:35:37.690 &#8211;&gt; 00:35:42.130<br>Malcolm Ryan: But if the meter had been telling you good things all along,<\/p>\n\n\n\n<p>220<br>00:35:42.180 &#8211;&gt; 00:35:51.900<br>Malcolm Ryan: then you get to that choice. And we had 2 versions of the meter at that point: one which had been telling you good and said, okay, kick the chair, and one which had been telling you good and said, don&#8217;t kick the chair.<\/p>\n\n\n\n<p>221<br>00:35:52.050 &#8211;&gt; 00:36:03.169<br>Malcolm Ryan: And that seemed to sway people. People seemed to go, all right, okay, so the meter has been 
a trustworthy guide; it&#8217;s been matching my personal morality, although we don&#8217;t have<\/p>\n\n\n\n<p>222<br>00:36:03.360 &#8211;&gt; 00:36:19.530<br>Malcolm Ryan: any way to assess whether that&#8217;s what was going on, but at least it&#8217;s been telling me things that are intuitively good. And at that point there was a significant difference between the people who were told, yeah, kicking the chair is a good thing, and the people who were told, not kicking the chair is a good thing.<\/p>\n\n\n\n<p>223<br>00:36:19.590 &#8211;&gt; 00:36:25.549<br>Malcolm Ryan: So it didn&#8217;t seem like their behavior was consistent with people just trying to optimize the meter.<\/p>\n\n\n\n<p>224<br>00:36:25.590 &#8211;&gt; 00:36:44.959<br>Malcolm Ryan: It seemed much more like people were taking this as a point of data in their decision making, in their moral choice, a second opinion: here&#8217;s one that&#8217;s hard for me, let me look and see what the meter says. And that was consistent with some of the verbal feedback we were getting from people in the<\/p>\n\n\n\n<p>225<br>00:36:44.970 &#8211;&gt; 00:36:46.259<br>Malcolm Ryan: qualitative study.<\/p>\n\n\n\n<p>226<br>00:36:46.390 &#8211;&gt; 00:36:54.949<br>Malcolm Ryan: What I&#8217;m interested in exploring, and what we didn&#8217;t think to do in this experiment, although we have done it in later experiments, is to ask them after the fact:<\/p>\n\n\n\n<p>227<br>00:36:54.970 &#8211;&gt; 00:37:08.170<br>Malcolm Ryan: what did you use to make your decisions? We&#8217;ve done a similar study since, which we haven&#8217;t published yet, where rather than telling you, this is good and this is evil, we&#8217;ve told you,<\/p>\n\n\n\n<p>228<br>00:37:08.340 &#8211;&gt; 00:37:25.410<br>Malcolm Ryan: most people chose this. 60% of people chose this, 
60% of people chose that, a classic kind of Walking Dead-style thing, but again presented on the screen when you&#8217;re making the choice. And again, this is sort of preliminary data, but that seems to also sway people&#8217;s opinions.<\/p>\n\n\n\n<p>229<br>00:37:25.420 &#8211;&gt; 00:37:41.779<br>Malcolm Ryan: If you say 60% of people chose to kick the chair, people will favor that choice; if you say people didn&#8217;t, people favor the other choice. And after that study we asked them in a survey, what was the foundation for your decision making? What factors did you use? Was it<\/p>\n\n\n\n<p>230<br>00:37:41.790 &#8211;&gt; 00:37:57.610<br>Malcolm Ryan: your personal morality, your values, trying to do the good thing on the meter, trying to do the evil thing on the meter, trying to do things for people you like in the game, trying to progress the narrative of the game? Lots of different kinds of motivations that we&#8217;ve encountered.<\/p>\n\n\n\n<p>231<br>00:37:57.730 &#8211;&gt; 00:38:03.539<br>Malcolm Ryan: And by and large people said, my personal morality was the thing that drove my decisions in the game.<\/p>\n\n\n\n<p>232<br>00:38:04.040 &#8211;&gt; 00:38:14.590<br>Malcolm Ryan: And we asked, did you try to optimize the meter in any way? No, not at all, I was not influenced by the meter; they strongly disagreed with the idea that they were influenced by the meter,<\/p>\n\n\n\n<p>233<br>00:38:14.880 &#8211;&gt; 00:38:21.319<br>Malcolm Ryan: which is interesting, because clearly they were influenced by the meter. And so<\/p>\n\n\n\n<p>234<br>00:38:21.370 &#8211;&gt; 00:38:31.689<br>Malcolm Ryan: we&#8217;re doing more variations of this study, and this is something that I want to look into a bit more deeply: how much do people think they&#8217;re being influenced by these factors, 
versus how much are they actually being influenced?<\/p>\n\n\n\n<p>235<br>00:38:31.770 &#8211;&gt; 00:38:48.670<br>Shlomo Sher: And so the next study that we want to do is an eye-tracking study. We want to see, actually, how much attention are they paying to the meter at any point in time while playing the game?<\/p>\n\n\n\n<p>236<br>00:38:48.680 &#8211;&gt; 00:39:07.740<br>Shlomo Sher: I mean, the bandwagon effect has got to play a role in all this, even though you weren&#8217;t changing the number of people in real time, so could they see that? Good question. No, the data was all based on the previous study, and then adjusted appropriately<\/p>\n\n\n\n<p>237<br>00:39:07.750 &#8211;&gt; 00:39:19.370<br>Malcolm Ryan: to the versions that we wanted for the decisions we wanted to test.<\/p>\n\n\n\n<p>238<br>00:39:19.860 &#8211;&gt; 00:39:22.530<br>Shlomo Sher: Okay, let me.<\/p>\n\n\n\n<p>239<br>00:39:22.740 &#8211;&gt; 00:39:25.389<br>Shlomo Sher: I want to go now to<\/p>\n\n\n\n<p>240<br>00:39:25.600 &#8211;&gt; 00:39:33.569<br>Shlomo Sher: making the kind of big moral judgments that we&#8217;re used to in video games. Guys, just to make sure, we&#8217;re done with the morality meter part?<\/p>\n\n\n\n<p>241<br>00:39:33.860 &#8211;&gt; 00:39:34.490<br>Okay.<\/p>\n\n\n\n<p>242<br>00:39:34.690 &#8211;&gt; 00:40:03.480<br>Shlomo Sher: So okay, let&#8217;s go back to talking about the big moral judgments that people make in video games. Right? So my moral engagement with the game here is that I get to choose whether to do X or Y, right? 
So I decide whether to save the Krogan race in Mass Effect, or risk the possibility that they&#8217;ll unleash their wrath upon the galaxy later on; or I get to decide whether to euthanize my best friend in Life is Strange, as she&#8217;s asking me to do, or to leave her to live the rest of her life as a quadriplegic.<\/p>\n\n\n\n<p>243<br>00:40:03.490 &#8211;&gt; 00:40:14.310<br>Shlomo Sher: Every player is familiar with these kinds of decisions. Hard decisions like these are supposed to engage us with morality. And from what you guys found, do they actually work as intended?<\/p>\n\n\n\n<p>244<br>00:40:15.000 &#8211;&gt; 00:40:20.470<br>Malcolm Ryan: So, to go back to the moral psychology stuff we talked about at the beginning,<\/p>\n\n\n\n<p>245<br>00:40:20.620 &#8211;&gt; 00:40:33.590<br>Malcolm Ryan: one of the problems, I think, especially with this sort of morality meter research, and also with these very big binary choices, do you do the good thing or the evil thing, or the sort of choice between things,<\/p>\n\n\n\n<p>246<br>00:40:33.700 &#8211;&gt; 00:40:45.459<br>Malcolm Ryan: is what we see when we look at how people actually make moral choices in real life. 
There&#8217;s a model that we tend to follow, drawing from moral psychology, called the four component model.<\/p>\n\n\n\n<p>247<br>00:40:45.610 &#8211;&gt; 00:40:50.300<br>Malcolm Ryan: Moral psychologist James Rest and colleagues, and a lot of people following in this tradition,<\/p>\n\n\n\n<p>248<br>00:40:50.370 &#8211;&gt; 00:40:54.079<br>Malcolm Ryan: say that there are really 4 different sorts of<\/p>\n\n\n\n<p>249<br>00:40:54.520 &#8211;&gt; 00:40:55.700<br>Malcolm Ryan: processes<\/p>\n\n\n\n<p>250<br>00:40:55.880 &#8211;&gt; 00:40:59.710<br>Malcolm Ryan: that are going on when we&#8217;re doing ethics.<\/p>\n\n\n\n<p>251<br>00:40:59.790 &#8211;&gt; 00:41:07.890<br>Malcolm Ryan: And they are: moral motivation, or moral focus, which is your drive to do the moral thing in the first place;<\/p>\n\n\n\n<p>252<br>00:41:07.920 &#8211;&gt; 00:41:15.870<br>Malcolm Ryan: moral sensitivity, which is your ability to read a situation and see that it is a moral situation, because moral situations don&#8217;t<\/p>\n\n\n\n<p>253<br>00:41:16.000 &#8211;&gt; 00:41:26.160<br>Malcolm Ryan: present themselves to us as a moral thing, and to see what the salient factors are, to read the world in a moral way;<\/p>\n\n\n\n<p>254<br>00:41:26.200 &#8211;&gt; 00:41:45.350<br>Malcolm Ryan: moral judgment, which is then making the decision, deciding what it is that you&#8217;re supposed to do, using whatever factors influence your moral choice; and then moral action, which is to go out and actually put that action into effect in the world. 
And usually, you know, being moral is not an easy thing.<\/p>\n\n\n\n<p>255<br>00:41:45.410 &#8211;&gt; 00:41:46.889<br>Malcolm Ryan: And<\/p>\n\n\n\n<p>256<br>00:41:46.930 &#8211;&gt; 00:42:06.939<br>Malcolm Ryan: the problem with a lot of games, from this model, is that we really reduce that down to moral judgment, which is the thing that we&#8217;re doing here. We sort of eliminate moral sensitivity by saying, here is a moral choice. It&#8217;s heavily signposted in the game: there is a moral question here, here are the 2 options, and you just have to choose between those options.<\/p>\n\n\n\n<p>257<br>00:42:06.950 &#8211;&gt; 00:42:11.190<br>Malcolm Ryan: And then we also sort of remove moral action, because often it&#8217;s just, okay,<\/p>\n\n\n\n<p>258<br>00:42:11.210 &#8211;&gt; 00:42:13.529<br>Malcolm Ryan: make the dialogue choice that<\/p>\n\n\n\n<p>259<br>00:42:13.550 &#8211;&gt; 00:42:17.260<br>Malcolm Ryan: chooses that thing, and then you don&#8217;t have to put it into effect.<\/p>\n\n\n\n<p>260<br>00:42:17.310 &#8211;&gt; 00:42:30.610<br>Malcolm Ryan: Right, it&#8217;s one click either way. Yeah, one click, I do that thing. Well, the choice itself might be difficult, but once you&#8217;ve made the choice, it&#8217;s one click and it&#8217;s done.<\/p>\n\n\n\n<p>261<br>00:42:30.640 &#8211;&gt; 00:42:40.570<br>Malcolm Ryan: And so a lot of these big choices present themselves in those ways, and traditionally we&#8217;ve had very much this focus on:<\/p>\n\n\n\n<p>262<br>00:42:40.670 &#8211;&gt; 00:42:43.370<br>Malcolm Ryan: here is an option, 
what do you do?<\/p>\n\n\n\n<p>263<br>00:42:44.670 &#8211;&gt; 00:42:59.039<br>Malcolm Ryan: I&#8217;m interested in thinking about this as design advice, looking into those other kinds of moral components that we&#8217;re bringing to bear, and thinking about how to design for those. So, thinking about how to design for moral sensitivity:<\/p>\n\n\n\n<p>264<br>00:42:59.080 &#8211;&gt; 00:43:09.370<br>Malcolm Ryan: how to let the player read the moral questions out of the world, rather than hand them the moral problem, which some of those bigger choices do. I mean, Life is Strange doesn&#8217;t;<\/p>\n\n\n\n<p>265<br>00:43:09.760 &#8211;&gt; 00:43:16.579<br>Malcolm Ryan: it&#8217;s more in the narrative: well, what are the moral questions here? It doesn&#8217;t hand you the,<\/p>\n\n\n\n<p>266<br>00:43:16.620 &#8211;&gt; 00:43:33.490<br>Malcolm Ryan: often you have the 2 people saying, you should do this, and you should do that, and you have to choose. When you have a rich narrative and you have more going on, people read in lots of different factors. We found this in our game: one of the last decisions you make in the game is choosing whether or not to kill the bad guy,<\/p>\n\n\n\n<p>267<br>00:43:33.510 &#8211;&gt; 00:43:36.890<br>Malcolm Ryan: or whether to hand him over to the police.<\/p>\n\n\n\n<p>268<br>00:43:36.920 &#8211;&gt; 00:43:52.090<br>Malcolm Ryan: And people would report things like, the name of the town was Mayhem. The very first introductory text presents the name of the town as Mayhem, and one player said, I didn&#8217;t think,<\/p>\n\n\n\n<p>269<br>00:43:52.100 &#8211;&gt; 00:43:59.449<br>Malcolm Ryan: in a town called Mayhem, I didn&#8217;t think that the police would be trustworthy. 
And so I thought I should take the law into my own hands. And so they read<\/p>\n\n\n\n<p>270<br>00:43:59.480 &#8211;&gt; 00:44:14.019<br>Malcolm Ryan: those other elements of the narrative into that moral choice. And so we&#8217;re very capable of taking in much bigger and richer concerns than, as designers, we often feel like we need to lay out:<\/p>\n\n\n\n<p>271<br>00:44:14.320 &#8211;&gt; 00:44:19.829<br>Malcolm Ryan: here are the arguments for the one side, here are the arguments for the other side, now choose.<\/p>\n\n\n\n<p>272<br>00:44:20.140 &#8211;&gt; 00:44:26.730<br>Malcolm Ryan: And the sort of lens of moral sensitivity suggests, well, look,<\/p>\n\n\n\n<p>273<br>00:44:26.790 &#8211;&gt; 00:44:45.909<br>Malcolm Ryan: respect the player&#8217;s moral sensitivity. Respect the fact that they can read the details of a larger situation and make up their own mind, and bring to bear factors that we as designers may have never even considered a part of that decision. And I think big narrative games like Life is Strange really do that, in that it&#8217;s not just,<\/p>\n\n\n\n<p>274<br>00:44:45.920 &#8211;&gt; 00:44:51.660<br>Malcolm Ryan: here is a moral choice. It&#8217;s not your Mass Effect, where you walk up to a couple of strangers who ask you to<\/p>\n\n\n\n<p>275<br>00:44:51.680 &#8211;&gt; 00:45:06.970<br>Malcolm Ryan: solve this moral problem for us, here it is. This is a person you&#8217;ve been living with and being friends with for hours of play already. You have this established relationship, and there&#8217;s a lot going on in this one decision that you&#8217;re making at this point in time.<\/p>\n\n\n\n<p>276<br>00:45:07.410 &#8211;&gt; 00:45:28.930<br>Paul Formosa: I think it&#8217;s useful, sorry, I think it&#8217;s useful then to break it back down into those different components, right? 
So in Life is Strange, for example, moral action is still very easy: just click this, click that. But things like moral focus, there are lots of reasons why you might care about morality, there are lots of competing interests; you&#8217;ve got your relationships and all the rich narrative that&#8217;s been going on.<\/p>\n\n\n\n<p>277<br>00:45:28.940 &#8211;&gt; 00:45:38.939<br>Paul Formosa: And sensitivity likewise. It doesn&#8217;t just flag, this is a moral issue or not a moral issue; it&#8217;s usually your relationships with the other characters that make something a moral issue or not a moral issue, and how does this choice relate to them?<\/p>\n\n\n\n<p>278<br>00:45:39.020 &#8211;&gt; 00:45:57.480<br>Paul Formosa: So moral sensitivity can be quite difficult: what exactly is morally relevant here? That&#8217;s going to be quite challenging. Moral focus: you&#8217;ve got personal relationships, and how do they work with morality? Once again, that can make moral focus quite difficult. Judgment, obviously a difficult thing to do. But then moral action is pretty easy: you just click this button or that button.<\/p>\n\n\n\n<p>279<br>00:45:57.490 &#8211;&gt; 00:46:11.960<br>Paul Formosa: So you can think of how different ways of designing games can make some of those components easier or harder, or engage those skills in different sorts of ways. And it&#8217;s not like every game has to do all those things; it&#8217;s fine that moral action is easy in this game, but these other ones might be hard.<\/p>\n\n\n\n<p>280<br>00:46:11.970 &#8211;&gt; 00:46:29.360<br>Paul Formosa: So it&#8217;s just a matter of thinking through: how does a game engage with these different aspects, and how easy or hard does the design want to make each of those? 
But I guess the important thing is thinking more broadly than moral judgment: making a judgment is just one thing you can do, but there are always other things too.<\/p>\n\n\n\n<p>281<br>00:46:29.790 &#8211;&gt; 00:46:41.570<br>A Ashcraft: In the follow-up questions, when they were giving you verbal feedback, did anybody talk about making decisions because they thought the game would go longer?<\/p>\n\n\n\n<p>282<br>00:46:43.270 &#8211;&gt; 00:46:54.060<br>Malcolm Ryan: Hmm, no, I don&#8217;t remember that. There were definitely people who were thinking it would influence the future of the game. But there is certainly<\/p>\n\n\n\n<p>283<br>00:46:54.620 &#8211;&gt; 00:47:06.940<br>Malcolm Ryan: some idea of this. In fact, this is a study I was recently pondering with a student: what if we made a study where you made a sequence of moral decisions, but you also had<\/p>\n\n\n\n<p>284<br>00:47:07.030 &#8211;&gt; 00:47:09.590<br>Malcolm Ryan: a combat scene in between them, and<\/p>\n\n\n\n<p>285<br>00:47:09.730 &#8211;&gt; 00:47:21.180<br>Malcolm Ryan: there was a factor in the moral decision which made you more likely to win or lose the combat. And if you lost the combat you lost the game, and the game would stop, because the punishment in most games eventually is<\/p>\n\n\n\n<p>286<br>00:47:21.330 &#8211;&gt; 00:47:27.410<br>Malcolm Ryan: stopping you from playing the game. The reward is, there is more game, and the punishment is, you lost.<\/p>\n\n\n\n<p>287<br>00:47:27.450 &#8211;&gt; 00:47:37.890<br>Malcolm Ryan: I think about it the other way. I think about it as reward: yeah, the best reward you can give a player is more game.<\/p>\n\n\n\n<p>288<br>00:47:38.140 &#8211;&gt; 00:47:45.270<br>Malcolm Ryan: Yeah. 
And so I would be very interested in doing that study of saying, okay, here is a moral choice, but there is also a pragmatic cost.<\/p>\n\n\n\n<p>289<br>00:47:45.800 &#8211;&gt; 00:48:04.510<br>Malcolm Ryan: Do you do the thing that carries a risk? Maybe every choice is: do the good thing, but take a 5% chance of losing the next combat, or a 10% chance, or whatever it is. How willing are you to weigh up that chance of losing the game versus doing the moral thing?<\/p>\n\n\n\n<p>290<br>00:48:04.640 &#8211;&gt; 00:48:21.339<br>Malcolm Ryan: Nobody really expressed that in the study that we looked at, but it&#8217;s certainly a factor that&#8217;s really unique to making moral choices in games versus making moral choices in real life. Although I guess many of our moral choices in real life also affect our ability to continue<\/p>\n\n\n\n<p>291<br>00:48:21.380 &#8211;&gt; 00:48:28.460<br>Malcolm Ryan: in the relationship, or continue in the job, or continue in whatever the game is that we&#8217;re playing that the moral choice comes up in.<\/p>\n\n\n\n<p>292<br>00:48:28.760 &#8211;&gt; 00:48:33.199<br>Malcolm Ryan: But there is this very specific thing in the game where<\/p>\n\n\n\n<p>293<br>00:48:33.650 &#8211;&gt; 00:48:35.199<br>Malcolm Ryan: you feel the need to<\/p>\n\n\n\n<p>294<br>00:48:35.300 &#8211;&gt; 00:48:48.089<br>Malcolm Ryan: play more game, and that&#8217;s the objective of the game: to keep playing the game and to finish the game in some sense.
And so we haven&#8217;t examined that, and it hasn&#8217;t come up in a study so far, but it would certainly be an interesting<\/p>\n\n\n\n<p>295<br>00:48:48.210 &#8211;&gt; 00:48:49.839<br>Malcolm Ryan: factor to examine,<\/p>\n\n\n\n<p>296<br>00:48:49.950 &#8211;&gt; 00:48:52.629<br>Malcolm Ryan: and something that I would love to run an experiment on.<\/p>\n\n\n\n<p>297<br>00:48:52.720 &#8211;&gt; 00:49:12.219<br>Paul Formosa: Sorry, I was just going to say we did get some comments near that. Some people talked about it: I just chose what would make the story more interesting. At the beginning, with stealing the old guy&#8217;s money, I thought it would be kind of boring if I didn&#8217;t steal it. So some people picked it because they thought narratively it would be more interesting.<\/p>\n\n\n\n<p>298<br>00:49:12.230 &#8211;&gt; 00:49:21.619<br>A Ashcraft: It wasn&#8217;t always how you would lengthen it, but you could make it more interesting. Right, if I didn&#8217;t know how long the game was, I might try to play the meter in<\/p>\n\n\n\n<p>299<br>00:49:21.700 &#8211;&gt; 00:49:34.280<br>A Ashcraft: such a way that the game extended. Like, I&#8217;m going to try to keep it in the middle, right? Because the game might end if I get it to one end or the other.<\/p>\n\n\n\n<p>300<br>00:49:34.790 &#8211;&gt; 00:50:03.639<br>Malcolm Ryan: There certainly were people doing that, too. There were people who in the early choices did something immoral just to try it. And then we did a focus group discussion, so there were like 2 or 3 people talking, and they were so embarrassed by making that choice. The people who reported those things were like, I didn&#8217;t realize where it was going. I thought it was just a game. I thought I could just steal this money.
It would be fine.<\/p>\n\n\n\n<p>301<br>00:50:03.650 &#8211;&gt; 00:50:16.259<br>Malcolm Ryan: And the guy you steal the money from comes back later in the game and actually helps you survive, and things like that. And they said, really, if I&#8217;d known what kind of game it was and where it was going, I would never have made that choice.<\/p>\n\n\n\n<p>302<br>00:50:16.270 &#8211;&gt; 00:50:31.730<br>Malcolm Ryan: So there&#8217;s a point at which it turned for them from being just a game, something where I can do what I like and explore things, to: this is a story that I&#8217;m invested in, and I care about the characters. I think that, as a designer, that&#8217;s sort of where it<\/p>\n\n\n\n<p>303<br>00:50:31.850 &#8211;&gt; 00:50:45.949<br>Malcolm Ryan: comes to that question of moral focus: how do we, as designers, get people to engage? It&#8217;s very tempting in a game to just game it, right? Most of our games are amoral spaces where we don&#8217;t ask the player to even think about the morality of what they&#8217;re doing, and that&#8217;s fine.<\/p>\n\n\n\n<p>304<br>00:50:46.050 &#8211;&gt; 00:50:54.739<br>Malcolm Ryan: Games are meant to be sandboxes where we can mess around, but in some games we also want to say, no, this is a game where your morality matters.<\/p>\n\n\n\n<p>305<br>00:50:55.170 &#8211;&gt; 00:51:13.299<br>Malcolm Ryan: And so that&#8217;s a moral focus question: how do we get you into the game in a way that says you need to be thinking about the morality of your choices in this game? Whereas in your average shooter you&#8217;re not going to go, oh my God, should I shoot this person or not? Games don&#8217;t often bring that in.<\/p>\n\n\n\n<p>306<br>00:51:13.780 &#8211;&gt; 00:51:31.410<br>Shlomo Sher: You know, I really like the idea of focusing more on moral sensitivity, right?
Because, yeah, there&#8217;s tons of choices that lead to ethical reflection, and some where you can put your controller down and think about it for half an hour before deciding. But that&#8217;s all about reasons.<\/p>\n\n\n\n<p>307<br>00:51:31.420 &#8211;&gt; 00:51:42.339<br>Shlomo Sher: And I&#8217;m curious. I&#8217;m trying to remember Life is Strange, since we were talking about that. I think there is a scenario where, let&#8217;s say, there&#8217;s a kid in the quad who seems, you know, lonely,<\/p>\n\n\n\n<p>308<br>00:51:42.480 &#8211;&gt; 00:51:59.159<br>and you have the opportunity, but the game is not going to force you, to, let&#8217;s say, come up and talk to that fellow student. Right? That&#8217;s an opportunity that might test your moral sensitivity, or it might get you to be the kind of player that just checks out every option,<\/p>\n\n\n\n<p>309<br>00:51:59.170 &#8211;&gt; 00:52:16.289<br>right? But I&#8217;m curious whether you guys know some games that you think might have been well designed to explore the moral sensitivity that players might have for the situation. And if you do, is the idea then to&#8230;<\/p>\n\n\n\n<p>310<br>00:52:16.300 &#8211;&gt; 00:52:29.370<br>because I can see a situation where you encounter characters, and things will happen in your interaction that they will think are morally important, but your character might not have realized it. And later on they could essentially, in the conversation,<\/p>\n\n\n\n<p>311<br>00:52:29.380 &#8211;&gt; 00:52:43.339<br>come back to you, perhaps upset, perhaps happy, because of how things morally turned out according to how they saw it, but you might not have any idea. I&#8217;m just kind of curious if you guys have any ideas of how to do the moral sensitivity part.
Well,<\/p>\n\n\n\n<p>312<br>00:52:43.380 &#8211;&gt; 00:52:46.569<br>Malcolm Ryan: so, well, I mean,<\/p>\n\n\n\n<p>313<br>00:52:46.620 &#8211;&gt; 00:53:04.850<br>something that I find very interesting is this: most of our games signpost, here is a moral choice, and I want an answer. We&#8217;ve been thinking a little bit about how you step back. How do you design a game to not tell the player that they&#8217;re making moral choices?<\/p>\n\n\n\n<p>314<br>00:53:04.860 &#8211;&gt; 00:53:22.669<br>And part of that comes back to making the game more systemic and more generic in the choices that you&#8217;re making, so that it&#8217;s not one scripted choice that says, this is the moral choice, and these are the options. It&#8217;s: here is a system with a number of choices that you&#8217;re making along the way, and<\/p>\n\n\n\n<p>315<br>00:53:23.040 &#8211;&gt; 00:53:32.450<br>there is a moral arc to that gameplay, but there isn&#8217;t necessarily a moment where you&#8217;re making the moral choice. You&#8217;re making a bunch of choices, and<\/p>\n\n\n\n<p>316<br>00:53:32.600 &#8211;&gt; 00:53:34.280<br>they have moral impact.<\/p>\n\n\n\n<p>317<br>00:53:34.590 &#8211;&gt; 00:53:35.229<br>And<\/p>\n\n\n\n<p>318<br>00:53:35.330 &#8211;&gt; 00:53:53.590<br>so we actually did an investigation looking at Papers, Please in this regard, as a more systemic, systems-driven game where there is ethics to what you&#8217;re doing in the game. Well, there are some scripted moments where you&#8217;re clearly making a scripted choice.
But there&#8217;s also just an overall<\/p>\n\n\n\n<p>319<br>00:53:53.710 &#8211;&gt; 00:53:59.819<br>Malcolm Ryan: moral arc to what you&#8217;re doing, and whether or not you feel like you came away as a good person in that game.<\/p>\n\n\n\n<p>320<br>00:54:00.310 &#8211;&gt; 00:54:01.229<br>Malcolm Ryan: And<\/p>\n\n\n\n<p>321<br>00:54:01.280 &#8211;&gt; 00:54:10.070<br>Malcolm Ryan: so this is a very interesting, different way of approaching morality in games: rather than saying, here is a moral choice, it&#8217;s, here is<\/p>\n\n\n\n<p>322<br>00:54:10.600 &#8211;&gt; 00:54:28.989<br>Malcolm Ryan: a system which has moral consequence. Similar games: Frostpunk does that well, I think. Again, it&#8217;s more of a management sim that you&#8217;re playing there, but overall the choices you make have a sort of moral impact, and there are ways of reading that in there. And those<\/p>\n\n\n\n<p>323<br>00:54:29.280 &#8211;&gt; 00:54:38.420<br>Malcolm Ryan: sorts of games then require more of you as a player to bring moral focus to the game, because it&#8217;s not in your face, saying, here is a moral choice:<\/p>\n\n\n\n<p>324<br>00:54:38.600 &#8211;&gt; 00:54:43.460<br>Malcolm Ryan: do you euthanize your friend? It&#8217;s: here is a society you have to run.<\/p>\n\n\n\n<p>325<br>00:54:43.520 &#8211;&gt; 00:54:45.409<br>Malcolm Ryan: There are competing interests.<\/p>\n\n\n\n<p>326<br>00:54:45.520 &#8211;&gt; 00:54:47.720<br>Malcolm Ryan: At the end of the day, are you a good person?<\/p>\n\n\n\n<p>327<br>00:54:47.820 &#8211;&gt; 00:54:50.450<br>Malcolm Ryan: And also,<\/p>\n\n\n\n<p>328<br>00:54:50.460 &#8211;&gt; 00:55:08.399<br>Malcolm Ryan: it also challenges you on moral action, because it says, yeah, you want to do the right thing, you want to do the good thing in Papers, Please.
You want to let through the people you&#8217;re going to be nice to, but you have to actually be good at the game to do that. You can&#8217;t just go, I&#8217;m going to do the good thing and get good points. It&#8217;s like,<\/p>\n\n\n\n<p>329<br>00:55:08.410 &#8211;&gt; 00:55:13.870<br>Malcolm Ryan: I really want to do the good thing, but I&#8217;m so bad at this game that I&#8217;m on the verge of losing, and I can&#8217;t.<\/p>\n\n\n\n<p>330<br>00:55:14.220 &#8211;&gt; 00:55:23.319<br>Malcolm Ryan: My child is sick, and whatever, and I need to actually get better at the game in order to be allowed to exercise my morality within the game.<\/p>\n\n\n\n<p>331<br>00:55:23.400 &#8211;&gt; 00:55:24.970<br>Malcolm Ryan: So I think<\/p>\n\n\n\n<p>332<br>00:55:25.040 &#8211;&gt; 00:55:41.990<br>A Ashcraft: It seems like you would want to have low-value decisions early on, just to get people used to the kinds of decisions that they&#8217;re making, so they can explore the kinds of agency that they have in the game, and the kinds of decisions that are going to come.<\/p>\n\n\n\n<p>333<br>00:55:42.000 &#8211;&gt; 00:55:56.459<br>A Ashcraft: I think about the Black Mirror episode Bandersnatch, the choose-your-own-adventure kind of style. I don&#8217;t know if you all have seen it. It&#8217;s great: the first 2 choices are utterly meaningless,<\/p>\n\n\n\n<p>334<br>00:55:57.090 &#8211;&gt; 00:56:08.229<br>but they get you used to, oh, this is how I make choices, because this is a different medium, and this is what I can do.
How does this work?<\/p>\n\n\n\n<p>335<br>00:56:08.350 &#8211;&gt; 00:56:12.780<br>A Ashcraft: And so you make a couple of choices that are not meaningful at all<\/p>\n\n\n\n<p>336<br>00:56:12.820 &#8211;&gt; 00:56:16.970<br>before it gets into anything that&#8217;s meatier than that.<\/p>\n\n\n\n<p>337<br>00:56:17.410 &#8211;&gt; 00:56:22.309<br>Paul Formosa: Yeah, I think that&#8217;s a really important point, and it brings us back to something we talked about at the start around moral expertise.<\/p>\n\n\n\n<p>338<br>00:56:22.450 &#8211;&gt; 00:56:41.239<br>Paul Formosa: Too often we don&#8217;t treat morality as something that can be difficult or hard or easier; we just throw people in. But think about how a game always starts with some tutorials on how to use the system. The opponents start off easy and get harder, because you&#8217;re supposed to progress and get better. But we often don&#8217;t think about morality that way in games. We just throw people in, and<\/p>\n\n\n\n<p>339<br>00:56:41.250 &#8211;&gt; 00:57:01.100<br>this sort of thing suggests, and some have written about this as well, that once you think about morality as an expertise or a skill, you can start off with simpler types of dilemmas where there&#8217;s kind of an obvious answer, and you get different kinds of feedback from that. Then you can ramp up morality in the same way you might ramp up the,<\/p>\n\n\n\n<p>340<br>00:57:01.190 &#8211;&gt; 00:57:07.849<br>you know, toughness of bosses, or something like that. So yeah, I think that&#8217;s definitely right: to sort of think about<\/p>\n\n\n\n<p>341<br>00:57:07.880 &#8211;&gt; 00:57:18.240<br>all the different aspects, you know.
You might ramp up sensitivity, or something like that, or action, but do that gradually, so that the player gets experience and gets better at it.<\/p>\n\n\n\n<p>342<br>00:57:18.260 &#8211;&gt; 00:57:37.649<br>Paul Formosa: And once again, Papers, Please is kind of good like that, in that there&#8217;s just all these little decisions you have to make all the time, and it&#8217;s not obvious. It&#8217;s really exploring this idea of the banality of evil: how just having standard motives, like I want to do my job and provide for my family, can lead you to be involved in evil actions.<\/p>\n\n\n\n<p>343<br>00:57:37.660 &#8211;&gt; 00:57:40.390<br>Paul Formosa: And that&#8217;s why I think the sensitivity is<\/p>\n\n\n\n<p>344<br>00:57:40.580 &#8211;&gt; 00:57:42.570<br>Paul Formosa: intentionally difficult in that game, because<\/p>\n\n\n\n<p>345<br>00:57:42.600 &#8211;&gt; 00:57:50.589<br>Paul Formosa: you&#8217;ve got to feed your family, it&#8217;s cold at night, you&#8217;ve got little money, so you kind of want to focus on that. But then there&#8217;s people pleading for their lives coming through.<\/p>\n\n\n\n<p>346<br>00:57:50.660 &#8211;&gt; 00:57:53.219<br>Paul Formosa: So you&#8217;ve kind of&#8230;<\/p>\n\n\n\n<p>347<br>00:57:53.460 &#8211;&gt; 00:58:11.090<br>Paul Formosa: It&#8217;s so busy sometimes. The game makes you focus very quickly on processing the passports, and so it&#8217;s almost pushing you: don&#8217;t worry about the dialogue, just quickly process, process. But of course, then you&#8217;re missing the moral point as well. So in that sense it does a really good job.
Another<\/p>\n\n\n\n<p>348<br>00:58:11.100 &#8211;&gt; 00:58:15.129<br>Paul Formosa: one I quite like, more about moral action, is in The Walking Dead, the first one.<\/p>\n\n\n\n<p>349<br>00:58:15.310 &#8211;&gt; 00:58:22.149<br>A lot of the choices are just simple: do this, do that. But there&#8217;s this really nice one where you, Lee, are trying to stop, I think, Kenny.<\/p>\n\n\n\n<p>350<br>00:58:22.170 &#8211;&gt; 00:58:33.439<br>Kenny&#8217;s driving the train. He&#8217;s really upset; I think his son has just got infected, or something like that. And the dialogue choice is not like, get him to stop without violence, or use violence. You have to make about 8 or 9<\/p>\n\n\n\n<p>351<br>00:58:33.490 &#8211;&gt; 00:58:36.299<br>decisions, dialogue choices, in a row,<\/p>\n\n\n\n<p>352<br>00:58:36.370 &#8211;&gt; 00:58:44.589<br>and they all require quite a degree of emotional intelligence: what&#8217;s going to anger him, what&#8217;s going to calm him down, what&#8217;s going to frustrate him, what&#8217;s not?<\/p>\n\n\n\n<p>353<br>00:58:44.640 &#8211;&gt; 00:58:55.190<br>And you&#8217;ve got to make, like I said, I don&#8217;t know how many choices in a row, and only when you&#8217;ve done that will you be able to talk him down without violence. So instead of just selecting a talk-him-down-without-violence option, you actually have to have<\/p>\n\n\n\n<p>354<br>00:58:55.200 &#8211;&gt; 00:59:08.690<br>moral skill in this case: moral action skills around emotional sensitivity, things like that. We can imagine other things, like communication skills or leadership skills, being needed to actually achieve a moral goal, rather than just clicking a box.
So I think that&#8217;s another nice example.<\/p>\n\n\n\n<p>355<br>00:59:08.700 &#8211;&gt; 00:59:23.110<br>Shlomo Sher: It&#8217;s interesting how much of this is engaging in personal relationships with people and thinking in terms of personal relationships, instead of these kind of trolley problem examples, which are really abstract, or really<\/p>\n\n\n\n<p>356<br>00:59:23.510 &#8211;&gt; 00:59:25.390<br>Shlomo Sher: dealing with,<\/p>\n\n\n\n<p>357<br>00:59:25.560 &#8211;&gt; 00:59:32.920<br>Shlomo Sher: you know, people that you don&#8217;t have, let&#8217;s say, relationships with, right? Because digging into these relationships kind of<\/p>\n\n\n\n<p>358<br>00:59:33.700 &#8211;&gt; 00:59:43.930<br>Shlomo Sher: connects with you seeing a person from a variety of dimensions, hopefully. And I&#8217;m curious how things like that stack against interpersonal morality.<\/p>\n\n\n\n<p>359<br>00:59:43.940 &#8211;&gt; 01:00:02.079<br>Shlomo Sher: Sorry, against impartial morality. Right? So you have this interpersonal morality, let&#8217;s say, with Kenny in the first Walking Dead. He&#8217;s your friend. How should you treat your friend at a point where you think he&#8217;s not doing the right thing? And this is a kind of nice thing, I think, in another dilemma in that first one:<\/p>\n\n\n\n<p>360<br>01:00:02.390 &#8211;&gt; 01:00:17.499<br>you have loyalty to him as a friend, and you&#8217;re aware that as a friend you have certain obligations to him, and you&#8217;re aware that he&#8217;s very upset. But he also wants to do something that seems,<\/p>\n\n\n\n<p>361<br>01:00:17.510 &#8211;&gt; 01:00:22.700<br>from the point of view of impartial morality, and I don&#8217;t remember what it is, but it seems unjust in some way,<\/p>\n\n\n\n<p>362<br>01:00:22.730 &#8211;&gt; 01:00:27.500<br>right.
And I thought that contrast was really nice.<\/p>\n\n\n\n<p>363<br>01:00:27.640 &#8211;&gt; 01:00:36.409<br>Shlomo Sher: But again, with that contrast, you need to make a moral judgment, but it&#8217;s not just a moral judgment. It&#8217;s also, how do you manage the moral expectations of your friends,<\/p>\n\n\n\n<p>364<br>01:00:36.560 &#8211;&gt; 01:00:41.479<br>right? And managing moral expectations, I think, was the more interesting part.<\/p>\n\n\n\n<p>365<br>01:00:42.600 &#8211;&gt; 01:00:55.790<br>Paul Formosa: Yeah, and that pulls on a bunch of different skills there. For one, you might think it&#8217;s moral focus: am I to prioritize my friendship over morality? But there&#8217;s also moral sensitivity: are there some of these<\/p>\n\n\n\n<p>366<br>01:00:56.170 &#8211;&gt; 01:01:14.980<br>other kinds of moral dimensions to my relationship with this other person? And it&#8217;s not clearly a case of, this is moral, this is not moral. So what are the morally salient issues? The fact that I&#8217;m a friend, that I have this deep relationship with him, is that morally salient? And how does it weigh up with other morally salient features? And I think this actually is<\/p>\n\n\n\n<p>367<br>01:01:15.020 &#8211;&gt; 01:01:21.270<br>a really good feature of games. As Malcolm said, we talked about this idea of games as a sandbox, but we can also think about them as a way to explore<\/p>\n\n\n\n<p>368<br>01:01:21.300 &#8211;&gt; 01:01:25.980<br>ethics, a way to explore morality, and that&#8217;s one of those cases where there are<\/p>\n\n\n\n<p>369<br>01:01:26.270 &#8211;&gt; 01:01:37.389<br>genuine, different, competing ethical considerations.
And so it&#8217;s about what you&#8217;re prioritizing, what you&#8217;re sensitive to, as well as the kind of judgment, and implementing it as well.<\/p>\n\n\n\n<p>370<br>01:01:37.410 &#8211;&gt; 01:01:44.500<br>Paul Formosa: And I think it&#8217;s nice that we can explore those, and again the game doesn&#8217;t have to tell you which one of those is the wrong one, or not moral; it can be partly up to you.<\/p>\n\n\n\n<p>371<br>01:01:44.690 &#8211;&gt; 01:01:51.889<br>But I just want to quickly touch on another thing you mentioned, the interpersonal relations. I think that when we think about morality day to day,<\/p>\n\n\n\n<p>372<br>01:01:52.150 &#8211;&gt; 01:02:08.990<br>it&#8217;s pretty much usually interpersonal stuff. That&#8217;s the bread and butter of morality, for we&#8217;re not all facing these big life-and-death decisions all the time. Maybe some people face that some of the time, but those aren&#8217;t the sort of moral decisions we have to face every day. And so, I think,<\/p>\n\n\n\n<p>373<br>01:02:09.000 &#8211;&gt; 01:02:15.389<br>getting those into games as well, thinking about how to put them in games and letting plenty of people explore them, I think, is also an<\/p>\n\n\n\n<p>374<br>01:02:15.420 &#8211;&gt; 01:02:17.780<br>important aspect, too.<\/p>\n\n\n\n<p>375<br>01:02:18.420 &#8211;&gt; 01:02:20.899<br>Malcolm Ryan: So there&#8217;s another<\/p>\n\n\n\n<p>376<br>01:02:21.790 &#8211;&gt; 01:02:32.990<br>Malcolm Ryan: theory of moral psychology:
Moral foundations theory, which is based in the idea of intuitive morality, and it says there are<\/p>\n\n\n\n<p>377<br>01:02:33.000 &#8211;&gt; 01:02:48.499<br>5, maybe 6, different sorts of moral priorities that different people rate at different levels of importance. From what I can remember, there are concerns about justice and fairness, concerns about care versus harm,<\/p>\n\n\n\n<p>378<br>01:02:48.520 &#8211;&gt; 01:03:08.349<br>Malcolm Ryan: concerns about purity, authority, and loyalty to your in-group, and liberty and freedom. And there&#8217;s evidence that different groups in different societies will put priority on some of those over others.<\/p>\n\n\n\n<p>379<br>01:03:08.360 &#8211;&gt; 01:03:10.370<br>In Western society,<\/p>\n\n\n\n<p>380<br>01:03:10.840 &#8211;&gt; 01:03:20.640<br>Malcolm Ryan: more liberal people put priority on fairness, care, and liberty, and more conservative people put relatively more priority on authority and<\/p>\n\n\n\n<p>381<br>01:03:20.930 &#8211;&gt; 01:03:25.589<br>Malcolm Ryan: loyalty and purity, which is interesting. I think<\/p>\n\n\n\n<p>382<br>01:03:25.850 &#8211;&gt; 01:03:37.630<br>Malcolm Ryan: there&#8217;s some evidence that people do respond in games following these sorts of foundations. And there are measures to say, what are your foundations, what are your priorities,<\/p>\n\n\n\n<p>383<br>01:03:38.020 &#8211;&gt; 01:03:47.489<br>Malcolm Ryan: and to look at how you respond if you present these kinds of choices in games. But I also think they make for really interesting material for game designers, to say, okay, I want to design a choice.<\/p>\n\n\n\n<p>384<br>01:03:47.610 &#8211;&gt; 01:03:48.569<br>Malcolm Ryan: We lean<\/p>\n\n\n\n<p>385<br>01:03:48.600 &#8211;&gt; 01:04:05.949<br>Malcolm Ryan: heavily on the trolley problem a lot of the time, on these utilitarian versus sort of deontological choices.
But there are other kinds of dilemmas that challenge us in more interesting ways, more really natural kinds of dilemmas. We very rarely have these how-many-people-do-I-save kinds of choices in real life,<\/p>\n\n\n\n<p>386<br>01:04:06.040 &#8211;&gt; 01:04:15.509<br>Malcolm Ryan: but we do often have choices between, do I do the caring thing, or do I do the fair thing? As a teacher, I encounter this all the time. There&#8217;s a student<\/p>\n\n\n\n<p>387<br>01:04:15.520 &#8211;&gt; 01:04:28.599<br>Malcolm Ryan: who, for whatever reasons, hasn&#8217;t done their assignment and comes to you with some sob story, and I&#8217;m faced with a moral dilemma. Do I care for this person? The caring thing to do right now would be to say,<\/p>\n\n\n\n<p>388<br>01:04:28.650 &#8211;&gt; 01:04:32.500<br>Malcolm Ryan: that doesn&#8217;t matter, just do the work and I&#8217;ll give you the mark.<\/p>\n\n\n\n<p>389<br>01:04:32.830 &#8211;&gt; 01:04:40.519<br>Malcolm Ryan: The fair thing is to treat all students equally, and there are other students who didn&#8217;t come to me with those problems, and they just<\/p>\n\n\n\n<p>390<br>01:04:40.550 &#8211;&gt; 01:04:43.589<br>Malcolm Ryan: submitted the best work that they could within the deadline,<\/p>\n\n\n\n<p>391<br>01:04:43.800 &#8211;&gt; 01:04:58.549<br>Malcolm Ryan: and the fair thing would be to treat them all equally and say to this student, no, I can&#8217;t do that. And so, I think, as designers, these moral foundations are a really good way of thinking about much more natural, everyday kinds of decisions.<\/p>\n\n\n\n<p>392<br>01:04:58.770 &#8211;&gt; 01:05:18.400<br>Malcolm Ryan: Do I do the thing that is looking after my friends, versus do I obey the person who&#8217;s in authority over me? Or do I?
Do the thing that&#8217;s going to make me feel uncomfortable, versus do I actually do the caring thing, or whatever, in the circumstance?<\/p>\n\n\n\n<p>393<br>01:05:19.590 &#8211;&gt; 01:05:28.910<br>Malcolm Ryan: And so I think that gives us much more material as designers than these sort of big utilitarian versus deontological philosophical moral choices.<\/p>\n\n\n\n<p>394<br>01:05:29.730 &#8211;&gt; 01:05:32.649<br>A Ashcraft: Though there are, there are&#8230;<\/p>\n\n\n\n<p>395<br>01:05:32.670 &#8211;&gt; 01:05:43.249<br>A Ashcraft: I believe there are these online take-this-quiz-and-get-your-moral-foundation-ratings things, right? Are they of any value?<\/p>\n\n\n\n<p>396<br>01:05:43.330 &#8211;&gt; 01:05:59.399<br>Malcolm Ryan: Well, I mean, they are. The big one is actually made by the researchers who do this work, and that&#8217;s presumably the moral foundations questionnaire you can actually take online. And that is actually one of the most&#8230;<\/p>\n\n\n\n<p>397<br>01:05:59.450 &#8211;&gt; 01:06:12.480<br>There&#8217;s more data validating that as a survey than basically any kind of psychological instrument that I&#8217;ve investigated in this field at all. They have massive worldwide studies of that. And so,<\/p>\n\n\n\n<p>398<br>01:06:12.560 &#8211;&gt; 01:06:22.869<br>yeah, like any of these things, it&#8217;s very much based on self-report. It&#8217;s asking, which of these things do you prioritize? And so it is<\/p>\n\n\n\n<p>399<br>01:06:22.910 &#8211;&gt; 01:06:28.249<br>to some degree representative of your self-image of yourself as a moral person, and maybe not&#8230;<\/p>\n\n\n\n<p>400<br>01:06:28.480 &#8211;&gt; 01:06:35.280<br>One of the interesting questions there is:
Does your behavior actually reflect what you say on the questionnaire?<\/p>\n\n\n\n<p>401<br>01:06:35.340 &#8211;&gt; 01:06:44.849<br>Shlomo Sher: I&#8217;m curious. You know, I was a huge fan of Ultima IV when I was in middle school,<\/p>\n\n\n\n<p>402<br>01:06:44.970 &#8211;&gt; 01:07:01.980<br>and I was 13 when I played it. And I&#8217;m thinking, because in Ultima IV you&#8217;re engaging in actions that are then correlated to virtues, and you get points, and there are morality systems for virtue.<\/p>\n\n\n\n<p>403<br>01:07:01.990 &#8211;&gt; 01:07:20.240<br>It&#8217;s interesting whether someone could do something like that, but instead of these morality meters&#8230; because in the beginning of Ultima IV you&#8217;re asked these small dilemmas that create what your persona is in the game, where your moral meter starts.<\/p>\n\n\n\n<p>404<br>01:07:20.250 &#8211;&gt; 01:07:39.299<br>I wonder if something like that could be done in a game that incorporates these core values in some sort of test that sets up your character in the game in accordance with whatever kind of value system you have. I&#8217;ve been dying to see anything like Ultima IV for years,<\/p>\n\n\n\n<p>405<br>01:07:39.390 &#8211;&gt; 01:07:41.950<br>just because it&#8217;s such a personal favorite.<\/p>\n\n\n\n<p>406<br>01:07:42.360 &#8211;&gt; 01:07:49.019<br>Malcolm Ryan: This is something that relates back to the morality meter research.
We talked in early about the the idea of reputation as well as being<\/p>\n\n\n\n<p>407<br>01:07:49.040 &#8211;&gt; 01:07:53.890<br>Malcolm Ryan: sort of competing meters or different ways of of moral standards in the world.<\/p>\n\n\n\n<p>408<br>01:07:53.930 &#8211;&gt; 01:08:01.539<br>Malcolm Ryan: and study that I again want to do when in when I get time to make. But we have to make a new game and making a game takes forever.<\/p>\n\n\n\n<p>409<br>01:08:01.550 &#8211;&gt; 01:08:19.889<br>Malcolm Ryan: But I want to make a game where which does keep these different moral foundations against each other and presents you with Well, maybe just another meter, but a meter that has more axes that show you. Yeah, you&#8217;re You&#8217;re sitting here in the in the in the moral world at the moment. How does and this this? This choice is going to give you points in<\/p>\n\n\n\n<p>410<br>01:08:20.380 &#8211;&gt; 01:08:22.670<br>Malcolm Ryan: injustice, and this choice is going to give you points in<\/p>\n\n\n\n<p>411<br>01:08:22.910 &#8211;&gt; 01:08:24.699<br>Malcolm Ryan: Okay.<\/p>\n\n\n\n<p>412<br>01:08:24.740 &#8211;&gt; 01:08:43.929<br>Malcolm Ryan: And the idea there is is sort of I&#8217;m interested in exploring the idea of moral role play where it&#8217;s not often we go. I&#8217;m going to play a good player on to play an evil player, but you know, and we go. We do set into a game to say I&#8217;m going to play evil, and I wanna that&#8217;s moral role. Play. It&#8217;s not reflecting my values. It&#8217;s like. I want to be a villain.<\/p>\n\n\n\n<p>413<br>01:08:43.939 &#8211;&gt; 01:08:58.029<br>Malcolm Ryan: but interested in these games which the which give you this richer set of of okay. Well, i&#8217;m going to moral. I&#8217;m going to role, play. I&#8217;m. 
a person who&#8217;s really caring and really kind, or I&#8217;m going to role play a person who really cares about justice, and that&#8217;s the<\/p>\n\n\n\n<p>414<br>01:08:58.080 &#8211;&gt; 01:09:13.529<br>Malcolm Ryan: primary driver. And maybe it&#8217;s less a reflection of my personal morality and more a choice of a character that I want to play in the game, using these meters then as a guide for moral role play rather than as a measurement of my morality. Excuse me.<\/p>\n\n\n\n<p>415<br>01:09:15.140 &#8211;&gt; 01:09:19.719<br>A Ashcraft: I can say, as a big role player, I have done a lot of that.<\/p>\n\n\n\n<p>416<br>01:09:19.800 &#8211;&gt; 01:09:20.540<br>Malcolm Ryan: Yeah.<\/p>\n\n\n\n<p>417<br>01:09:20.790 &#8211;&gt; 01:09:38.090<br>Malcolm Ryan: Yeah, absolutely. I mean, even D&amp;D, with its lawful and chaotic alignments, kind of gives you different approaches to morality. You say, I&#8217;m going to be good: I can be lawful good, abiding by the rules, or I can be chaotic good, ignoring the rules and just doing whatever matters.<\/p>\n\n\n\n<p>418<br>01:09:38.189 &#8211;&gt; 01:09:39.880<br>Malcolm Ryan: And I think<\/p>\n\n\n\n<p>419<br>01:09:39.899 &#8211;&gt; 01:09:46.679<br>Malcolm Ryan: giving the player more room for that kind of moral role play will lead to more interesting and rich games, in a sense.<\/p>\n\n\n\n<p>420<br>01:09:46.720 &#8211;&gt; 01:10:03.060<br>Malcolm Ryan: There&#8217;s a game by an Australian studio coming out this year, called Broken Roads, which I&#8217;m really keen to see, because it sort of expands this idea of the morality meter to represent different moral philosophies in your morality meter.<\/p>\n\n\n\n<p>421<br>01:10:03.340 &#8211;&gt; 01:10:10.699<br>Malcolm Ryan: I&#8217;m very excited to see it. 
We&#8217;ve had a bit of a chat with the developer about what they&#8217;re doing, and<\/p>\n\n\n\n<p>422<br>01:10:11.240 &#8211;&gt; 01:10:17.580<br>Malcolm Ryan: yeah, I&#8217;m keen to play that game. Maybe my next research paper will be playing the game and reporting back.<\/p>\n\n\n\n<p>423<br>01:10:17.690 &#8211;&gt; 01:10:33.059<br>Shlomo Sher: Cool. Hey, guys, I hate to say it, because I have a bunch of other questions, but I&#8217;m looking at the clock. It&#8217;s 13&nbsp;min. So I want to ask you guys the last 2 questions, and then I really have to get out of here. So the first one is really:<\/p>\n\n\n\n<p>424<br>01:10:33.300 &#8211;&gt; 01:10:34.349<br>Shlomo Sher: there&#8217;s a lot of<\/p>\n\n\n\n<p>425<br>01:10:34.380 &#8211;&gt; 01:10:38.679<br>Shlomo Sher: places to go here. I mean, you know, I feel like<\/p>\n\n\n\n<p>426<br>01:10:38.940 &#8211;&gt; 01:10:49.899<br>Shlomo Sher: the way you&#8217;re talking about it, mostly games explore one out of 4 dimensions of morality. And you know,<\/p>\n\n\n\n<p>427<br>01:10:50.270 &#8211;&gt; 01:10:58.040<br>Shlomo Sher: what are some of the implications, if you&#8217;re saying let&#8217;s look at the other 3, for game designers, players, anyone interested in ethics?<\/p>\n\n\n\n<p>428<br>01:11:00.350 &#8211;&gt; 01:11:15.959<br>Malcolm Ryan: So yeah, we wrote a paper. Jesse Schell wrote the Book of Lenses for game design, where he had this concept of presenting the questions to ask yourself as a designer, not answers about how to do design. They&#8217;re very much posed as,<\/p>\n\n\n\n<p>429<br>01:11:16.060 &#8211;&gt; 01:11:17.360<br>Malcolm Ryan: you know, these sorts of prompts.<\/p>\n\n\n\n<p>430<br>01:11:17.570 &#8211;&gt; 01:11:26.789<br>Malcolm Ryan: And we wanted to follow that model and say: well, what is this 4 component model? 
What are the questions it poses to us as designers to ask, to consider these kinds of factors?<\/p>\n\n\n\n<p>431<br>01:11:26.930 &#8211;&gt; 01:11:34.340<br>Malcolm Ryan: And we didn&#8217;t want to say, look, this is the way to make a moral game, and you have to make your games better. Because<\/p>\n\n\n\n<p>432<br>01:11:35.160 &#8211;&gt; 01:11:39.859<br>Malcolm Ryan: the moral choices in games at the moment are good, but I think we have more scope by<\/p>\n\n\n\n<p>433<br>01:11:39.900 &#8211;&gt; 01:11:58.500<br>Malcolm Ryan: sitting down and looking at our game and asking: how are we engaging the player&#8217;s moral sensitivity? Are we handing the moral questions to them, or are we inviting them to find them? How are we engaging the player&#8217;s moral focus? What is it about our game that invites us to think about playing this game morally? How are we engaging moral action? Do we just<\/p>\n\n\n\n<p>434<br>01:11:58.510 &#8211;&gt; 01:12:08.839<br>Malcolm Ryan: click on it, and the moral thing happens? Or do we have to strategize about it? Do we have to be skillful in character, in the dialogue? Do we have to be skillful in some other sense?<\/p>\n\n\n\n<p>435<br>01:12:08.930 &#8211;&gt; 01:12:19.069<br>Malcolm Ryan: Again, there&#8217;s no one answer to these questions. For some of these we can say, oh yeah, we&#8217;re totally ignoring that; moral action is not in this game.<\/p>\n\n\n\n<p>436<br>01:12:19.180 &#8211;&gt; 01:12:31.420<br>Malcolm Ryan: But for some we can say, we really want to dive into moral sensitivity. We want to make the moral problems rich and ambiguous, and bring a lot of factors to bear on that, and we&#8217;re going to do that through narrative, or through whatever it might be.<\/p>\n\n\n\n<p>437<br>01:12:31.630 &#8211;&gt; 01:12:40.450<br>Malcolm Ryan: So this is where I think, you know, understanding 
the psychology really just gives us a better set of tools to look at our design as game designers, and that, for me, is,<\/p>\n\n\n\n<p>438<br>01:12:40.520 &#8211;&gt; 01:12:41.559<br>Malcolm Ryan: you know, as<\/p>\n\n\n\n<p>439<br>01:12:41.580 &#8211;&gt; 01:12:43.190<br>Malcolm Ryan: you know, I come into this<\/p>\n\n\n\n<p>440<br>01:12:43.320 &#8211;&gt; 01:12:57.160<br>Malcolm Ryan: with more concern about games than about ethics, in the sense that I believe a lot of what we&#8217;re doing can be used to make games that teach ethics and make games that improve ethical development.<\/p>\n\n\n\n<p>441<br>01:12:57.170 &#8211;&gt; 01:13:03.609<br>Malcolm Ryan: But for me, moral choice is fun, and as a designer I want to make more interesting, more meaningful, more sort of<\/p>\n\n\n\n<p>442<br>01:13:03.630 &#8211;&gt; 01:13:09.960<br>Malcolm Ryan: adult games. You know, like all of our media. All of our media wrestles with morality in<\/p>\n\n\n\n<p>443<br>01:13:10.140 &#8211;&gt; 01:13:13.690<br>Malcolm Ryan: very interesting ways. It&#8217;s a big part of what we do for entertainment, and<\/p>\n\n\n\n<p>444<br>01:13:13.770 &#8211;&gt; 01:13:24.459<br>Malcolm Ryan: I want to make games which wrestle with that in much more interesting ways. And I think engaging with these moral psychologies gives us those lenses to look at our game and say: how can we make this better by<\/p>\n\n\n\n<p>445<br>01:13:24.880 &#8211;&gt; 01:13:27.520<br>Malcolm Ryan: better engaging the player&#8217;s moral expertise?<\/p>\n\n\n\n<p>446<br>01:13:28.050 &#8211;&gt; 01:13:35.690<br>Shlomo Sher: Yeah, that&#8217;s great. 
Do you want to add anything to that, Paul, out of curiosity, coming more from the ethics side than the game side?<\/p>\n\n\n\n<p>447<br>01:13:36.640 &#8211;&gt; 01:13:44.349<br>Paul Formosa: Not really. I mean, as I said, games are sandboxes, and I think it&#8217;s really interesting the way we can,<\/p>\n\n\n\n<p>448<br>01:13:44.430 &#8211;&gt; 01:13:49.470<br>Paul Formosa: you know, role-play different ethical personas or explore different ethical situations.<\/p>\n\n\n\n<p>449<br>01:13:49.650 &#8211;&gt; 01:14:08.569<br>Paul Formosa: So I think games are useful in that way, and that sort of then leads into the stuff Malcolm was talking about. Like, well, if we want that, then we want to think about the ways we can engage those different aspects of our moral expertise. Or, you know, can games teach us to be better, and what counts as better? How do we do that?<\/p>\n\n\n\n<p>450<br>01:14:08.640 &#8211;&gt; 01:14:17.399<br>Paul Formosa: One thing we&#8217;re actually working on, another game we haven&#8217;t talked about yet, is looking at cybersecurity ethics. And so we have a game that tries to teach sensitivity around cybersecurity ethics.<\/p>\n\n\n\n<p>451<br>01:14:17.410 &#8211;&gt; 01:14:28.649<br>Paul Formosa: So one thing is, what are the ethical issues in cybersecurity? They&#8217;re not obvious. And so one thing that game is trying to do is, okay, let&#8217;s just focus on sensitivity: how can we make people more aware of what the ethical issues are in cybersecurity in a gaming context?<\/p>\n\n\n\n<p>452<br>01:14:28.660 &#8211;&gt; 01:14:38.680<br>Paul Formosa: So there&#8217;s an example where we want to improve sensitivity, and, you know, then we can design the game to try and achieve that. 
And so I think just being aware that there are these different components,<\/p>\n\n\n\n<p>453<br>01:14:38.790 &#8211;&gt; 01:14:58.259<br>Paul Formosa: being conscious and thinking about them. And then, as a gamer, flipping that around: thinking about how those different components are being engaged with or not, being challenged or not challenged, you know, and thinking more broadly about how ethics is put into games and how you engage with it in games, beyond just, here&#8217;s the big choice, what do you do?<\/p>\n\n\n\n<p>454<br>01:14:58.980 &#8211;&gt; 01:15:14.199<br>Shlomo Sher: All right, cool. So, guys, we end our podcast with essentially this: what do you want to leave our listeners with? And we ask people to do it in under 1&nbsp;min. We rarely have had 2 people, so<\/p>\n\n\n\n<p>455<br>01:15:14.470 &#8211;&gt; 01:15:28.090<br>Shlomo Sher: which one of you guys wants to take that, to give us what you want to leave us with, in under 1&nbsp;min? We end up using it, if it&#8217;s good, in a promotional capacity.<\/p>\n\n\n\n<p>456<br>01:15:29.360 &#8211;&gt; 01:15:38.989<br>Malcolm Ryan: I can. Hmm. All right. So, let me start that again.<\/p>\n\n\n\n<p>457<br>01:15:39.490 &#8211;&gt; 01:15:50.159<br>Malcolm Ryan: I mean, I think moral psychology is useful to design. A lot of what we teach in game design is understanding the player: understanding<\/p>\n\n\n\n<p>458<br>01:15:50.170 &#8211;&gt; 01:16:07.069<br>Malcolm Ryan: what the player will do and understanding what the player will feel. These are the fundamental questions of game design: how do we design the game to get the player to do what we want them to do and feel what we want them to feel? Understanding player psychology is a big part of that. As a designer, 
I want to get into the player&#8217;s head.<\/p>\n\n\n\n<p>459<br>01:16:07.190 &#8211;&gt; 01:16:17.340<br>Malcolm Ryan: So if we want to design morally engaging games, we want to get into the player&#8217;s moral psychology; we want to understand how the player&#8217;s moral thinking works.<\/p>\n\n\n\n<p>460<br>01:16:17.350 &#8211;&gt; 01:16:37.270<br>Malcolm Ryan: And there&#8217;s a lot we can learn from the existing moral psychology research to say: okay, well, here&#8217;s what&#8217;s going on in your player&#8217;s head when they&#8217;re making a moral choice in your game. And here are these different components, moral focus, moral sensitivity, moral judgment, moral action, these different components of their thinking that are going on when they&#8217;re making and executing the moral<\/p>\n\n\n\n<p>461<br>01:16:37.280 &#8211;&gt; 01:16:54.589<br>Malcolm Ryan: choice. And as designers, if we can learn about this kind of psychology and use it to reflect on our game design, we can make games that engage that moral expertise and exercise it in play. More interestingly, we can also learn more about how players experience that.<\/p>\n\n\n\n<p>462<br>01:16:54.600 &#8211;&gt; 01:17:04.750<br>Malcolm Ryan: E. C.<\/p>\n\n\n\n<p>463<br>01:17:04.790 &#8211;&gt; 01:17:15.739<br>Malcolm Ryan: I would encourage any game designers out there to go and investigate moral psychology. Come to our blog; we have lots of resources on moral psychology research and how it might relate to games.<\/p>\n\n\n\n<p>464<br>01:17:16.230 &#8211;&gt; 01:17:25.399<br>Malcolm Ryan: The blog is moralityplay.org. And then use that as a sort of lens to think about your design. And for players:<\/p>\n\n\n\n<p>465<br>01:17:25.520 &#8211;&gt; 01:17:46.830<br>Malcolm Ryan: yeah, think about what you&#8217;re doing when you&#8217;re making moral choices in games. Think about, 
you know, what it was that influenced this particular choice. Was I influenced by these instrumental factors of the meter, or worrying about losing the game? Or was I actually following my own moral judgment? Was I morally role playing in this situation? And do I want to enter into this game in a way that I&#8217;m playing<\/p>\n\n\n\n<p>466<br>01:17:46.840 &#8211;&gt; 01:17:47.990<br>a new kind of morality?<\/p>\n\n\n\n<p>467<br>01:17:48.040 &#8211;&gt; 01:17:59.619<br>Malcolm Ryan: And it gives you more reflection on how you&#8217;re playing, what your own moral psychology is, and why you&#8217;re making the choices you&#8217;re making. And then maybe you can think about how that relates to real life, although it<\/p>\n\n\n\n<p>468<br>01:18:00.140 &#8211;&gt; 01:18:01.609<br>Oh, that&#8217;s interesting.<\/p>\n\n\n\n<p>469<br>01:18:02.050 &#8211;&gt; 01:18:14.079<br>Shlomo Sher: That really trailed off at the end there, but I&#8217;m going to take a clip from that. I can make that work.<\/p>\n\n\n\n<p>470<br>01:18:14.090 &#8211;&gt; 01:18:41.240<br>Shlomo Sher: Malcolm Ryan, Paul Formosa, thank you so much. This was just super interesting. I&#8217;ve got my class right after this, and I&#8217;m going to talk to them about the 4 dimensions here. Actually, I forgot how you said to call these, the 4 components of how we morally engage, or how we engage morality in this case, and so also how we could engage morality in games.<\/p>\n\n\n\n<p>471<br>01:18:41.250 &#8211;&gt; 01:18:43.890<br>Shlomo Sher: And we want to thank you guys.<\/p>\n\n\n\n<p>472<br>01:18:44.300 &#8211;&gt; 01:18:53.420<br>Shlomo Sher: It was a real pleasure, real pleasure. 
Thank you very much.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>[Release Date: April 11, 2023]&nbsp;How do players morally engage with games?&nbsp; What can user experience research and moral psychology tell us about how players experience<\/p>\n","protected":false},"author":1,"featured_media":1055,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"nf_dc_page":"","_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"_FSMCFIC_featured_image_caption":"","_FSMCFIC_featured_image_nocaption":"","_FSMCFIC_featured_image_hide":"","_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[4],"tags":[55,231,232,74],"class_list":["post-1054","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-the-podcast","tag-game-design","tag-game-dev","tag-game-psychology","tag-moral-psychology"],"jetpack_featured_media_url":"https:\/\/i0.wp.com\/ethicsandvideogames.com\/wp-content\/uploads\/2023\/03\/66-1-1-5.png?fit=1200%2C628&ssl=1","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/ethicsandvideogames.com\/index.php?rest_route=\/wp\/v2\/posts\/1054","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ethicsandvideogames.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/ethicsandvideogames.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/ethicsandvideogames.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/ethicsandvideogames.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=1054"}],"version-history":[{"count":3,"href":"https:\/\/ethicsandvideogames.com\/index.php?rest_route=\/wp\/v2\/posts\/1054\/revisions"}],"predecessor-version":[{"id":1078,"href":"https:\/\/ethicsandvideogames.com\/index.php?rest_route=\/wp
\/v2\/posts\/1054\/revisions\/1078"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/ethicsandvideogames.com\/index.php?rest_route=\/wp\/v2\/media\/1055"}],"wp:attachment":[{"href":"https:\/\/ethicsandvideogames.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=1054"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/ethicsandvideogames.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=1054"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/ethicsandvideogames.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=1054"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}