Some more embodiment analyses

Here are some more exploratory analyses of the embodiment data used in the Embodiment in character-based video games paper.

I also collected workload data using the raw NASA TLX when gathering data for the EFA and CFA, but I did not use the workload data in the analyses. My assumption was that workload would correlate with embodiment, but I did not examine this at the time.

The first run of linear mixed models (using lme4::lmer) showed problems with the model assumptions, so I dropped games with fewer than eight answers. The linear mixed model results are shown in Fig 1 and indicate a significant effect. However, even after dropping those games the residuals are not normally distributed and the variance of the random effect is practically zero. Hence, I ran the model again using a robust linear mixed model, which gives similar results: rlmer's workload estimate is 0.295 vs lmer's 0.301. As the impact of the game is low (the random effect is small), a Pearson correlation is a feasible way to quantify the relation between the variables: r=0.26, p=0.0023, n=138. The correlation is again in line with the mixed model results.
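As a sketch, the pipeline described above could look like the following in R. The data frame `d` and its columns are hypothetical stand-ins (here simulated), not the real study data; robustlmm::rlmer accepts the same call as lmer for the robust re-fit.

```r
library(lme4)

set.seed(1)
# Simulated stand-in for the real data (hypothetical structure)
d <- data.frame(
  Game     = factor(rep(paste0("g", 1:10), each = 14)),
  workload = rnorm(140)
)
d$embodiment <- 4 + 0.3 * d$workload + rnorm(140)

# Drop games with fewer than eight answers
keep <- names(which(table(d$Game) >= 8))
d <- d[d$Game %in% keep, ]

m <- lmer(embodiment ~ workload + (1 | Game), data = d)
# robustlmm::rlmer() takes the same formula for the robust re-fit

# Diagnostics: residual normality and random-effect variance
qqnorm(resid(m)); qqline(resid(m))
print(VarCorr(m))

# With a near-zero random effect, a plain Pearson correlation
# is a reasonable cross-check
print(cor.test(d$embodiment, d$workload))
```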

So there seems to be a small relationship between experienced workload and embodiment. This would be natural: as workload increases, the players are less able to focus on their game controllers.


Fig 1. Model: embodiment ~ workload + (1|Game)

The study 4 data of the embodiment study allows examining the relation between completion times and the sense of presence. Unlike embodiment (cf. study 3), presence does not seem to relate to completion times (see Fig 2). Notably, this model shows autocorrelation and I am unsure how to reduce it.


Fig 2. Model: Presence ~ Trial * log(Time) + (1|id)

The study 4 data of the embodiment study also allows examining the relation between completion times and embodiment. A relation between completion times and embodiment similar to study 3 is seen in the static 3rd-person camera condition, which is the same condition as in study 3 (see Fig 3). Embodiment and completion times do not relate in the other conditions. However, this model also shows autocorrelation; I am unsure how to reduce it, and the autocorrelation can have an impact on the results.

If the results can be trusted, they are in line with study 3 in the embodiment paper. Notably, the relation between embodiment and completion times was not present in the 1st-person and traditional 3rd-person camera conditions.
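One way to inspect the autocorrelation mentioned above is to look at the autocorrelation function of the model residuals. A base-R sketch with simulated stand-in data (all names and values hypothetical), using the Fig 3 model form:

```r
set.seed(2)
# Simulated stand-in for the study 4 data (hypothetical structure)
d <- data.frame(
  Trial = rep(1:10, times = 12),
  Time  = rexp(120, rate = 0.1) + 1
)
d$Embodiment <- 4 + 0.2 * log(d$Time) + rnorm(120, sd = 0.5)

m <- lm(Embodiment ~ Trial * log(Time), data = d)

# Residual autocorrelation at lags 1..10; spikes well outside
# +/- 2/sqrt(n) suggest autocorrelated residuals
a <- acf(resid(m), lag.max = 10, plot = FALSE)
print(round(a$acf[-1], 2))
```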


Fig 3. Model: Embodiment ~ Trial * log(Time)


Embodiment in character-based video games

Petri Lankoski

This is the author's version of the paper. The authoritative version is available via ACM. DOI:

The paper was presented at Academic Mindtrek '16 (October 17-18, 2016, Tampere, Finland) and published in the conference proceedings. (c) 2016 ACM. ISBN 978-1-4503-4367-1/16/10…


Embodiment is used to denote the sense that something is a part of one's body. The sense of one's own body is argued to relate to the sense of agency over one's own actions and of ownership of the body. This sense of one's own body can incorporate something external to the body, such as simple tools or virtual hands. The premise of the study is that player-characters and game controllers become embodied in a way similar to a tool or a virtual hand. In order to study embodiment, a psychometric scale is developed using explorative factor analysis (n=104). The scale is evaluated with two sets of data (n=103 and n=89) using confirmatory factor analysis. The embodiment scale ended up having two dimensions: controller ownership and player-character embodiment. Finally, the embodiment scale is tested and put into action in two studies with the hypotheses that 1) embodiment and players' skills correlate and 2) the sense of presence and embodiment correlate. The data (n=37 and n=31), analysed using mixed-effects models, support both hypotheses.


Confidence intervals & credible intervals

This is a note for me.

Confidence intervals: "Are the observed data x reasonable given the hypothesised values of θ?" == P(x | θ)


Credible intervals: "What values of θ are reasonable given the observed data x?" == P(θ | x)

These are related via Bayes' theorem: P(θ | x) ∝ P(θ)P(x | θ)

Credible intervals are part of the Bayesian approach.
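Written out in full, the relation is Bayes' theorem, where the denominator normalises the posterior:

```latex
\underbrace{P(\theta \mid x)}_{\text{posterior}}
  = \frac{\overbrace{P(x \mid \theta)}^{\text{likelihood}}\;
          \overbrace{P(\theta)}^{\text{prior}}}{P(x)}
  \propto P(x \mid \theta)\, P(\theta)
```

A credible interval is read off the posterior P(θ | x); a confidence interval is a statement about the sampling distribution P(x | θ).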

Embodiment Scale

During the last year, I have been developing a way to evaluate the embodiment experience in video games. I will present the scale development study at Academic Mindtrek '16 (Tampere, Oct 17th to 19th, 2016).

The scale is based on work by Longo et al. (2008), who developed a psychometric scale for evaluating embodiment in the rubber hand illusion.

Embodiment scale (7-point Likert scale, strongly disagree–strongly agree) with two subscales:

Controller ownership subscale

  • I perceived the game controller as a part of my body
  • I perceived the game controller as an extension of my body
  • It seemed like the game controller had disappeared

Player-character embodiment subscale

  • It seemed like I was in the location where my character was
  • I perceived the character I control as an extension of my body
  • It seemed like the character I controlled was I

The average of all items or average of subscales separately can be used as the embodiment score.
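As a sketch, computing the scores from item responses might look like this in R; the data frame and column names are made up for illustration:

```r
# Hypothetical responses: 7-point Likert items, one row per participant
resp <- data.frame(
  co1 = c(5, 3, 6), co2 = c(4, 2, 6), co3 = c(3, 2, 5),  # controller ownership
  pc1 = c(6, 4, 7), pc2 = c(5, 3, 6), pc3 = c(4, 3, 6)   # player-character embodiment
)

controller_ownership <- rowMeans(resp[, c("co1", "co2", "co3")])
pc_embodiment        <- rowMeans(resp[, c("pc1", "pc2", "pc3")])
embodiment           <- rowMeans(resp)   # overall score: mean of all items

print(embodiment)
```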


  • M. R. Longo, F. Schüür, M. P. Kammers, M. Tsakiris, and P. Haggard. What is embodiment? A psychometric approach. Cognition, 107(3):978–998, June 2008.

My forthcoming paper

  • Lankoski, P. (2016). Embodiment in videogames. In: Academic Mindtrek '16 (October 17-18, 2016, Tampere, Finland). New York: ACM Press. DOI=10.1145/2994310.2994320. (Version on my blog)

Tutorial: 1st-person sneak in Unity 5, part 8

It is time to create a start menu for the game. For that, we need a scene. After creating the scene, rename the scene file to “start_screen” or something like that.

(Scroll down for the links to previous parts of this tutorial.)

Add the scene to the build settings (File->Build Settings…)

Make sure that the start_screen scene is scene number zero (that scene will be loaded when the game starts).

The figure below shows the schematics of the start screen. Arrows indicate state transitions triggered by button pushes.

State transitions and the buttons.


To easily show the main screen and subscreens (OK to start new, Credits, OK to quit), we need a canvas for each and a script that sets the canvases active when they should be shown. That script should also handle the logic needed to start a new game or continue an already started game.


using UnityEngine;
using UnityEngine.SceneManagement;

public class StartScreen : MonoBehaviour {

	// Canvases that contain all the screens
	// of the start menu
	public Canvas mainScreen;
	public Canvas creditsScreen;
	public Canvas confirmNewGameScreen;
	public Canvas confirmQuitScreen;

	// The name of the level that should be loaded
	// with a new game selection
	public string nextLevel;

	private string currLevel;

	void Start () {
		currLevel = PlayerPrefs.GetString(PlayerLogic.CURRENT_LEVEL_KEY);
		if (currLevel != "") {
			Debug.Log ("StartScreen.Start(): last completed level: " + currLevel);
		}
		if (nextLevel == "") {
			Debug.LogError ("StartScreen.Start(): Specify the level for a new game at " + name);
		}
		// start with the main screen visible
		MainScreenCb ();
	}

	public void CreditsButtonCb() {
		// the player clicked the credits button
		mainScreen.gameObject.SetActive (false);
		creditsScreen.gameObject.SetActive (true);
	}

	public void MainScreenCb() {
		// we make sure that the main screen is shown and
		// subscreens are hidden.
		mainScreen.gameObject.SetActive (true);
		creditsScreen.gameObject.SetActive (false);
		confirmNewGameScreen.gameObject.SetActive (false);
		confirmQuitScreen.gameObject.SetActive (false);
	}

	public void ConfirmQuitCb() {
		// the player clicked the quit button. Now we hide
		// the main screen and show the confirm quit canvas
		mainScreen.gameObject.SetActive (false);
		confirmQuitScreen.gameObject.SetActive (true);
	}

	public void ConfirmNewGameCb() {
		// the player clicked the new button. Now we hide
		// the main screen and show the confirm new game canvas
		mainScreen.gameObject.SetActive (false);
		confirmNewGameScreen.gameObject.SetActive (true);
	}

	public void QuitCb() {
		// the player clicked quit in confirm quit
		// exiting...
		Debug.Log ("StartScreen.QuitCb()");
		Application.Quit ();
	}

	public void NewGameCb() {
		// The player confirmed a new game selection:
		// deleting possible old save information
		// and loading the first level after that
		Debug.Log ("StartScreen.NewGameCb()");
		PlayerPrefs.DeleteKey (PlayerLogic.CURRENT_LEVEL_KEY);
		PlayerPrefs.Save ();
		SceneManager.LoadScene (nextLevel);
	}

	public void ContinueCb() {
		// The player clicked continue:
		// loading the level based on the save information
		SceneManager.LoadScene (currLevel);
	}
}

Add the StartScreen script to an empty game object (and rename that game object to StartScreen).

After you have added Button and Text game objects under each canvas, you need to connect each button to the correct function in the StartScreen script.

To add a function to a button, I will use the New Game button as an example:

  • Select the button on the hierarchy view.
  • Scroll down on the Inspector view until you see On Click area under Button script component
  • Click +
  • Select StartScreen game object (click the small circle on the left-hand side, bottom row of the On Click area).
  • Select ConfirmNewGameCb function on the pop-up list on the right-hand side of On Click area.

The New Game Button's On Click area should look like the figure below after you have done this.


Adding a function call to On Click. The Button object's On Click part (inside the red rectangle) should look like this.

After this, add functions to all Button game objects:

Main screen

  • Credits Button should call CreditsButtonCb()
  • NewGame Button should call ConfirmNewGameCb()
  • Continue Button should call ContinueCb()
  • Quit Button should call ConfirmQuitCb()

Credits screen

  • Back Button should call MainScreenCb()

Confirm new game screen

  • Back Button should call MainScreenCb()
  • OK Button should call NewGameCb()

Confirm quit screen

  • Back Button should call MainScreenCb()
  • OK Button should call QuitCb()

Now we are almost ready. Remember to set the button and text anchors so that the screens look OK in different resolutions and aspect ratios.

Previous parts of the tutorial


About my formal education & stuff

As there seems to be some interest in my education, I thought to add a short note about it so that this is out here.

I have a Master's degree in New Media (art & design). That degree includes 60-70% computer science, mathematics, and multimedia (which at that time meant formal languages, signal processing, virtual reality, etc.). The rest of my Master's is interactive storytelling, management, and game design.

My doctorate is in Art and Design, in a department focusing on new media, which means obligatory studies in the philosophy of art and aesthetics, and some HCI. Of course, the research literature & methods part of the studies was about games.

Besides my Master's studies, I worked as a software designer for almost five years, developing network management systems (e.g. data visualisation and management tools) with C++, X/Motif, SQL, and Perl.

As part of my work in academia, I have been developing games for teaching and research (location-aware mobile games, games/interactive narrative for interactive television, …).