Since there was a bank holiday, I’m compiling these two weeks into a single post for this unit, especially as the readings provided for the period were all on a single week’s slides.
Mobile AR — creating augmented experiences
This one was an interesting read — it focused a lot on the different kinds of experiences AR can provide, and on the social impact AR might have. The “check-in” feature that many social apps now offer is a really good example of this, but more recently I suppose this could also be applied to AR games that have entered the mainstream, such as Pokémon Go, a game renowned for creating a feeling of community in people’s local areas.
While not especially applicable to anything I’m working on right now course-wise, this kind of technology is something I’d love to work with in future, perhaps once it’s become more wearable, as I feel wearable AR might be the next step in creating social experiences in AR.
The idea of imageability is one I might be able to apply to my current project — the notion that places like cities feel legible because they are formed in a consistent and predictable manner. This might be something worth researching when laying out my levels.
Future Presence: How Virtual Reality Is Changing Human Connection, Intimacy, and the Limits of Ordinary Life
The first thing that stood out to me was the mention of the VOID — an attraction that was also discussed in this week’s Voices of VR podcast. It’s not something I’ve tried personally (though I’d love to), but in design it reminds me somewhat of the Star Wars VR experience that was in London a few years back. The concept of using real objects and placing them where they appear to be in the virtual world, along with tertiary effects such as wind, is amazing, and I imagine it really helps make the entire experience feel immersive.
The six types of compresence defined here seem like useful concepts to reflect on in future when talking about virtual reality projects, as they quite aptly describe the interaction between users in different settings.
Voices of VR #351 — Redirected Touch
The idea of “tricking” the human mind with haptics and other effects is a really awesome one, and I hope it’s explored more in the future. The ability to disconnect the senses from what we see is a really interesting concept, and being able to essentially warp a player’s perspective through their sense of touch is quite scary in a way. The guest mentioned a few examples where he’d be touching what appears to be a straight wall but is actually curved in reality; the manipulated sense of touch, combined with the visual presentation of a straight wall in the virtual space, makes the player feel as if nothing is awry.
While I’m not sure I can apply this practically to much that I’m currently working on, perhaps this kind of sensory manipulation would work well in combination with some of my earlier projects from this year, such as the walking simulator I developed in the first term of the course, which had an emphasis on visual trickery. Perhaps something like that combined with these effects in a VR or AR space would make for a fun experience?
Project Work
Having read over our formative feedback, I decided to spend some time over this period re-doing some of the research and pre-production material we were previously tasked with creating. I’ll also be creating a separate blog post covering the technical progression of development, as I realised it’s something I should have done prior to this point — but I have notes saved regarding the changes, so I may just create one large post collecting all of those.
This section of the blog post will be updated gradually as I become happy enough with each item’s current state. Given our final week did not have any assigned readings, I’m compiling the work from that week into this section as well, as most of it was catching up on these pre-production elements anyway.
Research
With the new theme I decided I needed to re-research the subject matter. Having switched to a COVID theme, I thought I would look up COVID-related research on lockdown procedures and on how rapidly and easily the disease spread. The gameplay is somewhat inspired by this — I wanted the game to actually be quite easy, but with the potential to rapidly spiral out of control if you stop paying attention for long enough.
Much of this research came from the World Health Organization (WHO) and provided a lot of statistics and information regarding COVID-19. A substantial amount of it can be found here; the gist of it is that we are still seeing tens of thousands of people dying from it daily worldwide.
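To illustrate the “easy until it spirals” dynamic this research points to, here is a minimal sketch of unchecked versus mitigated growth; the growth rates are made-up assumptions for illustration, not real epidemiological figures:

```python
# Minimal sketch of why unchecked spread spirals out of control.
# The growth rates below are illustrative assumptions, not real data.

def project_cases(initial, daily_growth_rate, days):
    """Project case counts under simple exponential growth."""
    cases = [initial]
    for _ in range(days):
        cases.append(cases[-1] * (1 + daily_growth_rate))
    return cases

unchecked = project_cases(100, 0.20, 30)   # 20% daily growth
mitigated = project_cases(100, 0.02, 30)   # 2% daily growth

print(f"After 30 days, unchecked: {unchecked[-1]:,.0f} cases")
print(f"After 30 days, mitigated: {mitigated[-1]:,.0f} cases")
```

The same starting point ends up wildly different after a month, which is the feeling I want the gameplay to capture: small lapses in attention compound.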
Mood Board
Script
- Control scheme: not based on any specific controller mapping, but instead designed to work across multiple controller types. Provided below is a demonstration image of how cross-platform VR controls translate between Oculus and Vive. (I am using an HP Reverb, but it shares most of its control layout with the Oculus Touch controllers.)
- In this project, the main inputs used are the grip button and the trigger. The grip button is used to pick up and place objects, and the trigger is used for teleportation. The joystick (or trackpad, if on a Vive) can be used to rotate objects you are holding, or to bring them closer to or further from your hand.
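Since the scheme is deliberately controller-agnostic, it could be represented as a simple binding table; the input and action names below are illustrative assumptions rather than anything from a specific VR SDK:

```python
# Rough sketch of the controller-agnostic binding described above.
# Input and action names are illustrative, not from any real VR SDK.

ACTION_BINDINGS = {
    "grip":       "grab_or_place_object",   # pick up / place held object
    "trigger":    "teleport",               # point-and-click locomotion
    "joystick_x": "rotate_held_object",     # trackpad x-axis on a Vive
    "joystick_y": "push_pull_held_object",  # move object nearer / further
}

def resolve_action(input_name):
    """Look up which gameplay action a physical input maps to."""
    return ACTION_BINDINGS.get(input_name, "unbound")

print(resolve_action("grip"))  # grab_or_place_object
print(resolve_action("menu"))  # unbound
```

Keeping the bindings in one table like this is what makes the Oculus/Vive/Reverb differences a per-device detail rather than a gameplay concern.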
Scene 1 — Tutorial / Introduction
- The player begins in a field, and text (with a voiceover) is presented describing the basic VR control system. There would be a button to skip this scene for players already competent with basic VR functionality. The voiceover/text then goes on to describe the basic gameplay of the first level, in the form of a single house spreading and a cage being provided, explaining how the interaction works before transitioning to the first level. During this scene, the player’s mobility is locked so that they can focus on what is being shown.
Scene 2 — Level 0
- This level serves as a functional demo of the gameplay and will likely form most of the vertical slice I need to develop for this unit. It is a basic demonstration of the previously explained mechanic: the player is placed in a forest with a few houses, and given two “cages” to control the spread.
Scene 3 — Level Select
- Having completed the tutorial level, the player is placed into a level select area: a mountainous space filled with little gateways that can be teleported into to choose a level to travel to. The player can also view their high scores for any given level from here. Naturally, it is also possible to quit the game from here via a similar exit.
Scene 4 onwards — Levels 1–10
- These levels continue from the mechanic introduced in Level 0, gradually building upon it: hazards such as pedestrians walking past infected areas and carrying the spread themselves in a smaller radius are added, along with new mechanics in the later levels. Primarily, the difference between levels is theme and density of structures, as density massively informs how difficult a level is to maximise score on. Upon completing any given level, the player returns to the level select scene to view high scores and pick a new level. Given the game’s arcade-style gameplay, there is no plot “ending” per se, but it is meant to show that once you reach the more densely populated or less educated areas, it becomes far harder to monitor and control the spread of the disease.
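The core loop described above (spread to adjacent structures, contained by placing cages, with density driving difficulty) can be sketched as a toy grid simulation; everything here is an illustrative assumption rather than the project’s actual code:

```python
# Toy sketch of the core spread mechanic: infected houses spread to
# neighbouring houses each tick unless a "cage" has been placed.
# Names and numbers are illustrative, not from the real project.

def step(houses, infected, caged):
    """One tick: every uncaged infected house infects its uncaged neighbours."""
    new_infected = set(infected)
    for (x, y) in infected:
        if (x, y) in caged:
            continue  # a caged house cannot spread further
        for neighbour in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if neighbour in houses and neighbour not in caged:
                new_infected.add(neighbour)
    return new_infected

# A dense 3x3 cluster of houses, with the infection starting at the centre.
houses = {(x, y) for x in range(3) for y in range(3)}
start = {(1, 1)}

# Unchecked, the whole dense cluster is reached within two ticks...
uncontrolled = step(houses, step(houses, start, set()), set())
# ...but caging the source immediately halts the spread.
controlled = step(houses, step(houses, start, {(1, 1)}), {(1, 1)})

print(len(uncontrolled), len(controlled))  # 9 1
```

The denser the house set, the more neighbours each infected house has, which is exactly why the later, denser levels should be far harder to keep under control.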
Interaction, Immersion, and Spatial Plans