Senses & Scripts

Both of these elements were thoroughly explored in rehearsals, which helped when it came to applying sensation and clarity to our story. Rehearsals also gave us a broader understanding of how to produce these elements efficiently for the show. What would we need to create certain smells? To make audiences feel like they were in a completely different space? To make them feel welcomed, warm, distanced or displaced? Those answers came with sensory experimentation.

→ Sensory elements

“As we become accustomed to and crave the mediation of the screen, we become increasingly cut off from human contact, we are in increasing need of reclaiming mass public interactions to provide us with proximal human contact. We have bleached out personal smells, animal smells, and tastes from our world. It is a comparatively sensory bland environment that our animal brains crave stimulation to mimic the hunt, to savour the smells, tastes, and touches of the natural and social world. As a result, there is an ever-increasing need to create events that stimulate our bodies as well as our minds.” - Stephen Di Benedetto, The Provocation of the Senses in Contemporary Theatre (2010: 88).

Wanting to create a full-sensory experience, we set out on an exploration of sensations. Taking inspiration from our mentor Roderick Morgan, we began our sensory R&D by using blindfolds, in preparation for the VR headset that blocks out the 'real' world around the audience.

Ben and Carly created sensory journeys for each other. These journeys were designed to inspire feeling within the participant and had varying combinations of sensory-stimulating elements.

Some of these experiments included: 

  • Letting Go - Beach Experience
  • Central Park Walk
  • Jazz Bar
  • 99 Balloons 

In Letting Go - Beach Experience, Carly led Ben through a touch journey that included sand, water, glitter, rocks, and textured floors. As Ben was blindfolded, Carly led him by his hands through the journey. Carly's intention was to inspire feelings of peace, calm and "letting go." Afterwards, Ben reported feeling soothed, serene and calm. This showed us that these sensory elements and techniques for touch were effective in creating a calm feeling. Watch a segment of Letting Go (Right)

Central Park Walk was created by Ben for Carly. In this experience, Carly wore a headset playing 360-degree footage from Central Park during a race. Ben asked Carly to view the footage in five different ways (see more on this in R&D 2: VR Experimentation).

We found that a full combination of storytelling, 360 video and sensory stimulation (including touching leaves, pipes, siding on buildings and sticks) was the most effective for Carly in terms of an emotional, immersive experience. While the other options provided some sensory stimulation, they did not create a story. The fifth and final option was the best because it guided the journey through sensation and storytelling. See a link to Carly's notes on this experience here

Jazz Bar was a sensory experience that Ben created to immerse Carly in a jazz bar setting. Here, Carly was blindfolded and guided through the experience by Ben. Ben found a live recording of a jazz band playing at a bar, which surrounded Carly with the sounds of clinking glasses, scooting chairs and soothing, jazzy tones. Ben sat her down at a table and placed an opened cider under her nose. Carly then drank from the cider. She was then led to a piano, where she was free to touch the keys and play a song if she wanted to. We found that this experience was quite successful in creating a bar setting. The use of intimate touch, the smell of cider and the sound coming from the headphones created a full and rich environment.

99 Balloons was a storytelling and sensory experience created by Carly for Ben. In this experience, Carly guided Ben to a chair and sat him down. She then played the audio from the video 99 Balloons, a story about an infant with cancer who lived for 99 days. In this experiment, Carly wanted to see how introducing touch stimuli could affect the impact of the story on the audience. Ben listened to the heartbreaking story for about three minutes before Carly added any touch sensations. At that point, Carly placed swaddled sweatshirts into Ben's arms as the story mentioned the parents holding their baby. Ben gasped, and was completely in awe of how much he perceived this lump of sweatshirts as a baby. This showed us how effective storytelling can be in creating an environment so strong that very few sensations are needed to make a big impact.

After experimenting with smell, touch and taste, we felt confident applying sensory elements to the story.

→ Script edits: how the script changed over time + dramaturgy/clarity

The script was devised considering the themes, messages and imagery Carly and Ben had spoken about during the proposal stage. We set the story in a science-fictional world, in an altered reality. We hoped that an abstract narrative would justify the use of VR. By doing so, we could convey our ethical questioning of the rise of digital and social media. What was more important: the time you spent with someone in person, or the legacy you left behind? How could we remind audiences of intimacy with one another in a world that is becoming less personal in its interactions? Ironically, we were trying to ask these questions through a digital medium, but this was intentional. Hopefully, we would be reminding audiences of what digital media lacks, and how feeling something is still more poignant than just watching it happen. What does it mean to experience, and not to be told? Our aim was to make the audience more active in their emotions, engagement and immersion within our experience.

Our very first draft was finished in October, and it acted as our starting point during the devising process. We made sure that it was rich in imagery, to help when devising moments of sensation. We also aimed to show the foundations of what we were trying to say as clearly as possible. The choice to write a script was controversial amongst our tutors, especially because we had not been in the rehearsal room that long. However, we made it clear that this was a working script, constructed to translate our themes, messages and performance style, and acting as a basis for our experimentation. The medium was new to us, and we were unsure how to experiment or devise from scratch with it. This difficulty was compounded by the fact that every experience we had seen before had been promotional, a game or a documentary. We had not been exposed to many experiences that focused solely on a VR-exclusive story that stood on its own. The script could act as our source material, giving us something to work with when it came to filming and exploring different ways to execute what we wanted for the audience experience.

The First Draft can be found here

Prior to buying the camera, however, we took on board our tutors' notes to refine our script and focus on sensory interaction. We had recurring meetings with Nohar and Nick to discuss the content, form and specifics of the narrative. These were also discussed with our PAT tutors. We found that writing for this particular medium made it less of a script and more of an instruction manual for us as facilitators, and a description of the experience for readers. Much of it looked at imagery, transitions, cues, focus points and interactions with a participant. However, it was laced with scenes that contained dialogue to drive the narrative, as well as voice-over. We were also encouraged to write and devise text during our time working with sensory elements. For example:

  • How could you devise text for darkness?
  • How could you manipulate a participant using only types of sound?

This was also suggested for our time testing out the camera. As we were working with a different awareness of the space, we came up with a few thoughts and ideas that we could consider, film and edit. Out of the list of questions we had, these were a few:

  • How would text (and movement) draw the focus of a participant? How can you use text to manipulate that?
  • How does your relationship with a participant change when working with a first person angle?
  • What can you do in VR with story elements that you can't necessarily do in a live theatre space?

We did write some dialogue to explore these questions.

An example can be found here

We were very aware that we were working with 360 film, a new medium for both of us. We were unfamiliar with how to film with a 360-degree camera and how long it would take to edit and stitch* the footage, which were just two of the considerations we reflected on when developing a schedule. By interviewing practitioners such as Chris Elson from Diverse Interactive, we addressed these concerns. In his practice, the footage had to be stitched and edited manually, which was extremely slow, but necessary for their experiences. Fifteen seconds of film could take Chris up to four hours to manage. Luckily, our footage was stitched automatically, which sped up this process. However, we knew that we had effects to add and new software to get used to, so we needed to give ourselves plenty of time to be able to do so comfortably and efficiently.
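
To give a rough sense of why manual stitching was never an option for us, here is a back-of-the-envelope calculation. The final running time used here (around ten minutes) is our own assumption for illustration, not a figure Chris gave us:

\[ \frac{600\ \text{s of footage}}{15\ \text{s per stitched chunk}} \times 4\ \text{hours per chunk} \approx 160\ \text{hours of manual stitching} \]

Automatic stitching removes almost all of that overhead, leaving the editing, effects and learning of new software as the main costs in our schedule.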

It was because of this that we needed our final draft sorted by March. With locations and dates confirmed and the camera available to use, we knew that we had to have the script as concrete as possible to ensure that we were happy with the final product. Loading the footage onto the software, editing and rehearsing with the film would take time, especially when combining all of the mediums together. If we did want to make changes, they could either be made in the editing room or through re-shoots, which we made sure we had time for.

Script edits.jpg

To make this goal achievable, we sought to have regular discussions on the script. We met regularly to sift through each scene, updating text wherever we could based on any decisions made in the rehearsal room. We were also fortunate that our tutors were available to offer us feedback throughout the devising process. Through various meetings, we addressed the issues our script faced. In a meeting with Nohar in February, we found that although the chain of events and scenes may have been clear, the characters lacked depth. We acted on this note by writing individual scenes that would give characters spotlight moments to drive the plot more emotionally. How did they feel at a certain plot point? About a certain character's actions? We started to devise monologues to explore how these matters could be addressed.

A sample of those ideas/dialogues can be found here

We were very lucky that our peers offered to help with our process too. We gave a reading of the script at a writing workshop we organised with a couple of other MFAs, who also read their own excerpts and ideas for discussion. It was extremely useful to hear the script read out loud in a different voice, as well as to hear comments on how to further its development. We had mentioned beforehand that the script was not gender-biased, and our main concern was to see if it read well. Everyone was also encouraged to summarise their themes, or the piece overall, in short paragraphs - to address the clarity, imagery and messages we wanted to reflect in our projects.

A common note was that the script was quite confusing in a couple of ways. There were many plot points which should either be explored more or completely cut in order to make the narrative that much clearer. Not only that, but the participant's role was unclear.

  • Should there be a different term for their character?
  • Who are they with regards to the story?
  • Why is the narrative 'a secret' and why are all these actions happening?  

Other notes included requests to make moments more specific and to allow time for moments involving the participant. The script lacked a sense of awareness of the participant, and we were told to factor that in. We were also given great references to research to help with our concept, including Glen Neath's Séance piece.

More notes from our feedback session can be found here

We also found that edits could be made as a result of experimentation in the rehearsal room. Sometimes a moment of play with sensations would emerge that we wanted to factor into the script for the audience to experience, or we would be playing with the camera, testing out different angles for perception or interaction in relation to space or performance. We would add or remove moments depending on what we were trying to say and whether it could be achieved. Every time a decision like this was made, we would assess the dramaturgy and the reasons why we were deciding to leave moments out or include them in the narrative.

Alongside our main project, we thought it would be interesting to devise smaller projects to test out on others. Ideas included a 'Piggy in the Middle' styled storytelling experience, in which participants immersed in VR had the choice to listen to either Carly's or Ben's story. Each participant would feel sensory stimuli relative to whichever story they were watching. However, only one filming attempt was made and the idea didn't progress throughout the year, though it is something we would be interested in pursuing. This was just one example.

The one sister project we did make some progress on was a piece called The Applicant. The setup and style of this particular project were identical to what we wanted to do with The Extension. The audience would have been given information for 'an interview', which disclosed location, timings etc. They would have met one of us in a room, set up with one small table and two chairs facing each other. When told that they were 'late', they would be instructed to put on the headset. From there, the footage would have transported them to a café, where they would have met The Manager character, who would discuss their future employment. Exterior sensations would correspond to the visual imagery, e.g. the smell of coffee as a waiter walked past. After the footage ended, each participant would be seen to the door. When they got home, a follow-up email about employment would be sent to them - as if it were real. The script was written, and we were in talks with an actress for the role of The Manager. However, due to time constraints concerning the main project, and availability, we decided to develop it at a later date.

The script for The Applicant can be found here

In March, we felt that we had reached a great draft of the script.

  • We had created depth for our characters. Monologues and individual, intimate moments that addressed plot points made the characters more emotionally engaging and easier to invest in.
  • We had made the narrative clearer by simplifying the plot. Intruding elements, like the possibility of a previous attempt at creating an A.I. of the couple's child, were removed. The audience's realisation about the loss of a child strengthened the Scientist's choice to create an A.I. of herself for the Professor, making that understanding more poignant.
  • We had justified each choice for the audience's perception of the piece. Why were they experiencing these moments, and what was their standing in the piece? We made it clear that they were acting as the A.I. character, and that abstract moments were addressed within the text.

The final draft of the script can be found here

Possibilities, Limitations & Working with the Camera


Prior to working with cameras and getting too attached to preconceived material, Roderick advised that we watch and try as many VR experiences as we could to broaden our scope.


Before buying any new equipment, including our camera, we used our phones to view VR content. However, we did purchase a course on Unity (logo pictured Right), which Roderick also recommended. This course was dedicated to creating VR games, but it gave us more options and knowledge, and broadened our scope in case we decided we wanted to head down the animated route instead.

We also bought Zeiss VR One headsets (pictured Right), which were compatible with our personal iPhones. 

We played many games, such as Ninja Run, which required the participant to move the avatar using head rotations.

We watched a few short National Geographic segments.

We even viewed photos taken by others.

The most notable experience we encountered, though, was the Sky VR Jungle Book experience (linked Right). Have a go at watching and navigating your viewpoint using the cursor at the top left of the YouTube video:

This experience was especially helpful as it taught us how to direct a participant's attention through sound and moving image - a difficulty we found was unique to VR in the theatrical context we were applying it to. How could we cue certain moments to draw attention to them? To ensure the audience saw what we wanted? This experience helped pave the way.

Before getting hands on with the camera, we decided that we would apply sensory elements to videos we had already watched. This would require us to imagine what a person may touch, smell, taste and hear using particular imagery, which would make the task of creating our own experience a lot more efficient in the months to come.

EXAMPLE: As an exercise, Ben found a short video of Central Park. He deconstructed what a walk through Central Park, with this particular imagery, would feel like and began shortlisting how he could create an experience that made you feel as if you were there. The video was also chosen because Carly had memories of her trips to New York, including ones of Central Park, a factor we noted in our feedback afterwards. A series of experiments was carried out:

1) A blindfolded tour of the damp, cold CSSD courtyard, without visuals or score. Exploring senses other than visual.

2) A tour of Central Park, without visual aid, CSSD courtyard & narration from video. 

3) A tour of Central Park, with visual aid and the CSSD courtyard, using senses with the visual. No score.

4) Tour of Central Park, standing motionless in PS2. Narration of video only. No other sensory elements.

5) Inside PS2, using Stewart Copeland's Autumn Plains from Spyro 2 (below), played out loud as a score. Carly sat still in a chair, with the visual aid of Central Park.

The range of these experiments was designed to see which combination of elements created the most poignant immersive experience. What was more effective, the abstract or the realistic? Was it effective to merge both, or keep them separate? We also asked whether any memories were evoked by the different sensory combinations. Alongside that, an open feedback session was held to ask about individual parts of the experience, like clarifying what was touched at a certain point and any associations or feelings it evoked.


 

During these experiments, it occurred to us that we would need wireless headphones to reduce the risk of technical issues during the performance. What if the participant felt the cord during the performance? Worse still, what if they were to disconnect the wire from the phone during the experience? We knew we would need a pair that cancelled out noise too, as we did not want any unwanted exterior noise invading the narrative. We chose to buy a pair of Axceed wireless noise-cancelling headphones (left), which catered to our needs perfectly.

 

 

Before buying and testing a camera, we also researched which apps were best for playing video content and for mirroring footage to help with sensory cues. At this point in time, we were using a wired connection, which was extremely restrictive of a participant's movement. We also looked at apps that could show us a preview of what recording VR experiences could look like. We found a variety of apps for each.

E.g. Mirroring (From Left to Right): TeamViewer, AirDroid, MirrorOp

 

Some were extremely effective, others proved problematic. For example, a lot of the VR players could not play content that was stored on phones. All of the VR recording apps could only take photos at best, and many of the mirroring apps proved difficult to work with, only working for limited periods at a time, which was not feasible for what we were trying to achieve.

In consultation with our mentor Roderick, as well as research online, we decided to purchase a Samsung 360 camera. We found that the latest version (2017) was better on technical specifics, but the image on the previous version (2016) was preferable. The latest version also worried us with potential compatibility issues, which we could not risk, and it was very much dedicated to handheld use, which was not desirable for our needs. The 2016 camera could be mounted on a tripod and record from the perspective we wanted. It was also a cheaper option. The next few weeks were used to test the camera out. How could we record demos that made the camera's viewpoint seem like another person's? We looked at levels of elevation, the proximity of our bodies to the camera, and opportunities for sensory engagement that could be implemented in the recordings we took.

 

Samsung 360 Camera (2016) Vs. Samsung 360 Camera (2017)

Perception, Choreography & Introduction to Insta360 Pro

 

Phase 3: Filming on Insta360, how the camera differed. Software + editing. Perception and camera angles etc. + THE WORKSHOP @ CENTRAL

 

After a tutorial with Ken Mizutani, we began our exploration of the Insta360 Pro. While the Samsung 360 has two lenses, the Insta360 Pro has six, making the images smoother. Shooting with the Insta360 Pro required us to be hyper-aware of our surroundings. Every corner of the room is captured in 360º, and in 8K definition. This meant we had the ability to create film-quality footage for our show.
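
As a rough, back-of-the-envelope comparison of the two cameras (assuming the Insta360 Pro records roughly 7,680 horizontal pixels at 8K and the Samsung Gear 360 roughly 3,840 at 4K; the exact figures depend on model and mode):

\[ \frac{7680\ \text{px}}{360^\circ} \approx 21\ \text{px per degree} \qquad \text{vs.} \qquad \frac{3840\ \text{px}}{360^\circ} \approx 11\ \text{px per degree} \]

Roughly twice as much detail lands in any given part of the participant's view, which helps explain why the Insta360 Pro footage felt closer to film quality.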

INSTA360.jpg

We first experimented with this camera in the movement studio, testing each of the settings and exploring different angles for filming. Through this experimentation, we found that by extending the tripod to its fullest height, the lenses of the camera sat at the height of Carly's eyes. Shooting from this perspective was perfect for our story, as the audience discovers that they are playing the role of the A.I. that Carly's character created in her image.

But designing the film shoots was not just about getting the height of the camera right. In each shoot, we had to:

  • Check the lighting in the room
  • Position the camera in a place that matched the needs of the storytelling and post-shoot editing
  • Calibrate the camera to pick up the accurate dimensions of the room
  • Ensure that every aspect of the space was purposeful

Most of the shoot was completed in a flat near Old Street. The flat was an excellent location for shooting because it was quiet, spacious and had a cohesive design. This allowed us to obtain clear audio, clearly mapped-out scenes and a sense of continuity for the audience to grasp onto.

When filming outside, we found the images were clear, but slightly corrupted if the lighting changed. The audio was not as clear due to wind and the vast open space. Because of this, most of the footage we shot outside has the audio stripped away, usually with a voice-over playing on top of the video.

 

View our broken down scene list for production here

Carly's placement with Tara D'Arquian on Bad Faith

In February of 2018, Tara offered me the role of Assistant Choreographer on her piece Bad Faith, a dance and poetry fusion. As a choreographer interested in devising, I found the role a perfect fit. See the initial list of duties Tara sent me here

D'Arquian and Doehler re-shaping choreography - 14 February 2018 


The placement took me to Brussels, where I had the privilege of assisting D'Arquian and her co-performer, Laura Doehler. Travelling to Brussels was a thrill, and added so much to the richness of the placement. I set intentions for exploration during the placement: to examine D'Arquian's style of choreography, to see how she collaborated with others, and to observe how the elements of poetry were fused with dance.

The rehearsals took place at L'Escaut, an arts and architecture firm that boasts a large and bright rehearsal space. Here, D'Arquian led the rehearsal sessions. She came into rehearsals with choreography that she had crafted herself, pulling from the array of modern and contemporary dance training she obtained at Trinity Laban. D'Arquian very much wanted Doehler's own creative practice to impact the choreography of Bad Faith. To blend D'Arquian's and Doehler's choreography, they held improvisation sessions that took inspiration from particular emotions and lines of poetry in Bad Faith written by Jemima Foxtrot. These improvisations formed a basis for each section of the show, creating a cohesive structure that supported the strong spirit of devised performance in Bad Faith.

Jemima Foxtrot and the Extended Cast of Bad Faith

While D'Arquian and Doehler created the choreography, I acted as an outside eye on the movement, often being asked which elements of the work were strong and which could be developed further. I also documented the week of rehearsals through film, photos and typed notes. See an example of rehearsal notes here

Arriving back in London, the Bad Faith creative team welcomed in the Extended Cast (EC), a group of women aged 65 and up interested in dance, poetry and performance. It was my duty to:

 

Bad Faith EC and Assistant Choreographer 

  • Schedule the rehearsals for the EC
  • Document all rehearsals and coordinate feedback messages
  • Assist and lead rehearsals for the EC's movement and spoken performance 
  • Film and edit promotional videos with EC and other creative team members
  • Liaise between the performance venue (Trinity Laban) and the EC 

 

In rehearsals, the EC helped to write parts of the script through improvised writing sessions, where Jemima gave them a stimulus (of song and text) to inspire their work. I had the chance to put my creative, organisational, and hospitality skills to work.  It was a joy working with these incredible women, and I had such a rich experience watching and assisting in this process from start to finish. 

I applied the skills I gained from watching D'Arquian's process to my own work: her use of contact improvisation techniques informed moments of touch in The Extension, and the marketing and communication skills I gained during this placement helped me promote and organise The Extension.

 

 

Ben's placement with Dante or Die on User Not Found

 

DOD.jpg

When I met Daphna in the third term of the first year of the MFA, she mentioned that Dante or Die's next piece would look at digital legacy. She explained that the play itself would focus on the social media presence of a person after death, and how it could be maintained or perceived by those left living. Daphna went on to discuss the technology involved: the piece was going to be told with the help of smartphones, all synchronised, conveying imagery and sequences that helped tell the story. As the piece was both thematically and stylistically similar to our current idea, I jumped at the opportunity to work on it. With more talks of joining the project in February, plans were made for an attachment to help with R&D and previews.

In March 2018, Ben joined Dante or Die in one of the early phases of R&D. The first week consisted of a full script read, design, technical cues and the blocking of scenes one and eight. The week led up to a rehearsed reading preview for industry professionals, as well as staff from the University of Reading, who Dante or Die were working with on a research front.

During the week I was given a range of tasks, from making a list of apps for the characters of Terry and Luka, to setting up the speakers, to arranging and designing the space as a café. One of the more complicated tasks, however, was finding new headphones to work with the smartphones, as the ones the company currently had were not the best in terms of sound quality. To do this, I had to find out particular specifications and do extensive research on which models of headphones fit the bill; we needed wireless, radio-frequency-compatible headphones. After narrowing it down, we made a list of potential investments and also went out to the shops to see if we could find any to test. We bought a couple of models to try out at a later date, prior to bulk-buying the definite option for the show.

The weeks that I spent with Dante or Die after the first followed the same sort of routine. I lent a hand wherever I could:

  • Helping with designing the tech (which included fitting together custom-made lamps)
  • Helping with the sound by operating QLab when the technical stage manager was away.
  • Installing software onto each phone. I was also in charge of setting up each phone and labelling them.
  • Designing social media pages for the characters of Terry, Luka and Laurent Mercier - this included formatting what each page would look like and what it would say, and designing the real pages.

E.g. You can view Laurent Mercier's Facebook page here

I also met Marmelo Digital, a small tech company that specialises in web and app development for theatrical experiences and events. I observed how they worked on designing elements for the piece, giving as well as receiving feedback. This input in the rehearsal room was new to me, as I had never been in an R&D with programmers, let alone worked alongside them and observed their process. I asked about recent and future projects that they had been a part of, all of which seemed incredibly interesting!

I was also fortunate to spend some time at Soho Sonic Studios to watch a recording session. I learnt how sound engineers and designers worked within their space, separate from the rehearsal room. Similarly to Marmelo, it was interesting to observe how they collaborated with Dante or Die - interpreting feedback, editing different types of score for the piece, and trying out new ideas live in front of us.

I accompanied the theatre company to the New Wolsey in Ipswich for the Pulse Festival. I had never been to the venue before, let alone the festival. Learning about the process and organisation that working with a venue, and a piece of this scale, entails was enlightening. The days were long, but scheduled to optimise efficiency - which of course included breaks in between. Treated as the first preview, it was interesting to see what worked and what didn't. For example, some movement, in practice with an audience, didn't fit Daphna and Terry's vision for the moment, so it had to be revised in the rehearsal room days later. However, it was useful to see whether and how all the phones worked together, how new app updates behaved, and how the lighting worked (both in partnership with and separately from the sound). The venue was well equipped for the piece, which meant time was shaved off the set-up. For example, the venue had blackout blinds, which was beneficial to the setting of the piece; some venues don't, and a solution would need to be found for that potential issue.

 

UOR.jpg

The placement was also incredibly useful for research. Ben was very lucky to be able to attend a symposium held by the University of Reading, at which Dante or Die spoke about social media technologies in immersive performance with regard to their show User Not Found. There were many talks around this subject area, including a talk from practitioners on digitising spaces and accessibility. This was extremely useful for Ben, especially because it related to his essay question on immersion within digital performance. Not only that, but he learned a lot about how to make these pieces more accessible, which he will definitely utilise in his practice in the future.

Notes from the panel discussion on Digitising Spaces and Accessibility can be found here

 

The last week of Ben's R&D will be spent prepping for the Roundhouse in London. The most recent rehearsal photos for that run can be found below.

Who did we speak to?

Over the course of the academic year, we interviewed the practitioners listed below. We had little to no idea about the practicalities of VR, and had not experimented with it in enough depth to answer some of our questions ourselves. We found interviewing practitioners especially helpful because we gained a lot of practical information about VR going forward.

Each practitioner had different perspectives and views on our questions, but each answer was extremely helpful for us to consider for our piece, as well as to implement in our research essays. In terms of practical advice, for example, Chris Elson (from Diverse Interactive - pictured below) told us that the editing process could be quite long, due to the amount of information a computer would have to process, and that this should be considered when scheduling your time.

A transcript of our interview with 59 Productions' Lysander Ashton can be found here.

Over the course of the year, we've been able to talk to many different practitioners, who all use digital elements, including VR, as part of their practice. They are all listed below.

From left to right: 

  • Breaking Fourth
  • 59 Productions
  • Diverse Interactive
  • Raindance Film Festival: VRX
  • Epiphany VR
  • The Tom Sawyer Effect
  • The National Theatre
  • CyberRauber VR
  • Curious Directive
  • DotDotDot
  • Blast Theory
  • Douglas O'Connell

Ben Mason (The Tom Sawyer Effect) in the experience (LEFT). Chris Elson (Diverse Interactive) takes a selfie with the company after the interview (RIGHT)

Some of these practitioners even came to see our show. This was even more beneficial as they were able to give feedback from their specialist point of view. Ben Mason from The Tom Sawyer Effect talked to us about the nature of presenting the experience. He felt that some moments were like 'a hat on top a hat' (Grant: Working Journal) and that we should simplify the narrative, as it was too much to think about in so little time.

 

Our list of interview questions can be found here

 

Pre & Post Production

 


The Motion Capture Workshops

Before filming, Carly and Ben thought it would be beneficial to participate in a full day of motion capture workshops. Led by Sarah Perry, founder of the movement company Shapes in Motion, the workshops taught us about different elements of motion capture that we thought would be useful for our practice.

The first workshop was called 'Intro to Mocap/Creation of a Creature'. Sarah began by teaching us about motion capture, and the science and logic behind certain physicalities. After a warm-up, we were tasked with exploring different qualities of movement using techniques from Laban. This included levels, direct and indirect movement, and hard and soft pressures. After experimenting with different elements, including tableau, we devised a creature, which we manifested through walks, gestures and noises to form what was called an accent.

To end the session, we sat down and watched some videos about the development of motion capture and how it works on a film set. One interesting thing Sarah taught us during the session was not to drag or slide, especially when on the floor, because the motion capture sensors would not pick up this quality of movement, which would be a problem for the digital designers.

The next workshop was called 'Hitting the Brief', which taught us about interpreting character descriptions. Sarah had brought along a couple of animators, who had designs of characters that we had to interpret as performers. One of these designs can be seen on the right-hand side. We had to consider the quality of movement this character had.

  • How does it move? - Was it fast? Flexible?
  • How does it behave? - Does it make any noise? Is it clumsy?

The animators gave us feedback on our interpretation of these designs, before asking us to devise another sequence for another character. The feedback was useful as it put what a designer is looking for into perspective. Research and work on a character were vital - one animator said that many people come in with stereotypes of a character based on other media, which usually leads them to give an over-the-top, pantomime-like performance. One example was an Orc character, which we actually tried hitting the brief for. The animator stressed that this character was not idiotic, as it is in some interpretations, but rather intimidating. One of the key focuses of this particular character was its distribution of strength and weight - how did this affect its walk, and how did it carry heavy objects?

These workshops were really useful in providing us with some context about physicality within a virtual environment. They provoked some questions: How did our characters interact with the space? How did we want the AI to live within the space? How could we make every decision clear, and avoid mistakes along the way?

 We think this will be especially helpful for work in the future, where we could be using animation or programming to create our virtual world, rather than 360 film.

Filming w. PJ.jpg

Filming

Filming took about three whole days, which was more than enough time to shoot all the scenes. For continuity, we made sure we had costume changes for different scenes, and we manipulated the set around us so that it looked different each time.

We made sure that we took at least 3 takes of every scene, and scheduled breaks in between to have lunch and charge the cameras. 

Filming was really interesting, because we had to be aware of the camera angles for every single shot. For example, if the A.I. was meant to be sitting on a chair, we had to shorten the tripod and place it on the chair so it would remain at eye level with Carly. Not only that, but we were working with a different playing space than usual. We had to consider our proximity to the camera in terms of sound and lighting, as well as how it would look to the participant in our attempted interactions with them.

 

Editing:

Editing Room.jpg

As mentioned, we had to make sure that we filmed with more than enough time before show day. This was because we were not sure how long the process would take us to produce a strong cut of the narrative. We scheduled the shoot in March, and it was lucky that we did. Editing the entire show took a long time - three weeks of full days in the editing suite, to be exact. The element that took the longest was stitching and compressing the files we had. There were also a few scenes which didn't save to the camera's memory card, but because we had enough time, we could re-shoot the scenes we considered vital and still be on schedule. We used Final Cut Pro, which was a new piece of software for both of us, so we had to learn on the job. Ken Mizutani from TSD Media was kind enough to give us a tutorial on how to use the software, as well as to book out the studio for us when we needed it.

 

 


 

We learnt how to add effects, change the opening camera angles for audience perception and manage audio, amongst other things. Upon completion, we uploaded the files to our Google Drive to assess. Once happy, we downloaded the film onto our two Samsung Galaxy S7 phones in anticipation of the show.

The Galaxy S7s were the phones we invested in for the next stage of our development. They were cheap, reliable, easy to use and had great visual quality for the video content. We bought two so that we could switch phones between performances while the other charged. Among our new purchases, we also bought an internet router and a Samsung Gear VR headset, which was compatible with our Galaxy phones. We also bought another set of Axceed headphones, again to be able to swap pairs so one could charge during performance days.

 

What Happened?

 

Electrick Village Updated Poster.jpg

The details:

Where: The Antelope, Tooting.

When: 6th, 7th, 14th & 18th of May

Part of the Wandsworth Arts Fringe

We debuted the show at the Wandsworth Arts Fringe, using The Antelope pub in Tooting as our venue.

Promoting The Extension:

Flyers: We promoted the show in a number of ways. We designed our own posters (Top Left) and had them printed as flyers (Far Left, Below), which we posted around Royal Central, Swiss Cottage and the Tooting Broadway area, as well as at The Antelope itself.

Reviews: We also invited reviewers to the show. Dante or Die were kind enough to show Ben how to create a press release, which he composed using a template they offered him. You can find that press release here. We received a few responses - some could not make it, some offered to promote the show via their web pages, and others could attend. A review of the show, written by the blog There Ought to Be Clowns (Middle, Below), can be found here.

Radio: We also had an interview arranged with Wandsworth Radio (Far Right, Below) to promote the show. We managed to get a slot as part of their Spotlight broadcast in the evening, which gave us access to prime-time listeners and was brilliant for our outreach. This was our first radio broadcast as a company, and it was really fun. It was interesting trying to put the experience into words, and it taught us how to market ourselves and what to expect when promoting the show at a later stage. That interview can be found here.

Social Media: We made sure that we kept up our presence on social media. We have Instagram, Facebook and Twitter pages, where we kept our followers up to date. We also paid for promotional posts, aiming to increase awareness across all platforms.

Facebook: https://www.facebook.com/ElectrickVillage/

Twitter: https://twitter.com/ElectrickVille

Instagram: https://www.instagram.com/electrickvillagetc/

Get in:

We set up the experience every day. We allocated one table for technical elements, which housed our laptops and phone chargers, and another for sensory elements, which were all carefully laid out. These stations were strategically placed at the sides of the room to increase the size of the playing space. We had to make sure that each phone was charged (or charging if the battery was low), check that sensory elements were stocked up, and ensure the right technical set-up was in place to avoid any problems during the show. We also made sure that booking for each show day closed at 9 AM that day, so that there weren't any last-minute bookings we were not aware of. If we were in the middle of a heavy show day, we would not necessarily be able to check, so we decided to prep everything during each show morning, including organising breaks etc.


Examples: Making sure of the Bluetooth connection between the phones and headphones was vital. Without it, the participant wouldn't be able to hear what was going on in the narrative. The same went for each phone's connection to the computer: without the mirroring software, we could not project what the participant was seeing onto our monitors via the AirDroid application. We found that the internet connection the venue offered was not strong enough for an efficient stream of the participant's screen. We tried to create a local network via our own router to bypass this problem, but this also proved problematic. We found that streaming via a 4G hotspot was the most efficient, and our problem was solved. We also made sure that we had our cue sheet ready, and that we had alternative foods for those with allergies or different dietary requirements - for example, a vegan-friendly bag of sweets, which we used as an alternative to milk chocolate.

How the show worked:

Each participant, after booking a ticket, would receive an email disclosing more details. Set up as a secret meeting, the email included the venue location, a briefing about the experience, a request for allergen details and extra details about the characters themselves (which took the form of video logs). That email can be found here. We had also strategically placed WANTED posters that we had made along the routes to and from the venue. An example can be found above (Top Right).

When each audience member met us outside the venue, as requested, at the time of their slot, we would take them into the venue through a side door with a sense of urgency. Upon locking it, we would clarify whether they had any allergies and whether they had seen the email, before briefing them for the experience. After making sure that the headset and the headphones were comfortable on their head, they would be led through into the playing space. They would engage in the main part of the experience, which had us moving back and forth around them in an attempt to match imagery with sensory elements. We were also very careful to make sure collisions with us or the venue were avoided. As the experience came to a close, the participant would be led out of the room. At the end, they would be asked if they had any questions, before we clarified the end of the narrative, which teased that we would be in touch again. Each participant was asked if 'they could keep their word' - their answer would determine the content of their follow-up email. They were escorted out of the venue and the door was locked behind them, leaving them to continue with their day.

After their run was over, each participant would get a follow-up email that tied up the story. There were video logs and newspaper clippings (Below) that added to the experience of the narrative. Not only that, but a survey was included to help us for research purposes, and to help with improving the show for future dates. You can find that email here.

 

 

Watch the video below to see how Ben and Carly interacted with the audience and facilitated  The Extension.

What did we learn?

Videos of Participants

What we found

The first two days of shows were quieter than the later two, and the participants who came to see the show were slightly static in their movement. As we hadn't experienced much variety in participants' energy or engagement with the piece, we were not practised when it came to the more varied group of participants we met on days 3 and 4. We had to be more active in our care for participants on days 3 and 4, as they explored the space around them more than participants had on the two days before. It meant that our interactions had to be precise, and we had to be aware of participants colliding with us or the space, which would have been a breach of safety and of the immersion within the piece. Not only that, but we had to adapt our sensory interactions to a participant's position in the space.

Days 3 and 4 also tested our stamina as facilitators, performing shows back to back with less time for breaks. In addition, we were tested in our organisation of the technology - for example, switching phones every two or three participants so they didn't overheat or power off due to low battery during the next person's experience. The same applied to the headphones. We also cleaned and switched headsets more frequently, and reassessed the quantity of sensory stimuli we needed per show day.

It was interesting to see the other ways participants interacted with the piece, beyond how they moved through the space. On days 1 and 2 we had no vocal engagement with the piece, but on the days after, people engaged with the experience much more. They responded to cues in the narrative, as well as indicating how they felt about certain sensations. They were much more reactive to stimuli, including one person vocally rejecting a moment of taste sensation with a 'no thank you'.

We also took precautions at the beginning and end of the piece, mainly because of the venue itself. We made sure that the participant was always with someone, just in case they hurt themselves by colliding with the interior of the venue. At the beginning, we made sure to communicate that there was a step before entering the space. So they didn't trip, Carly told them that she would indicate the step was coming up with a double pulse on their hands. Every single participant understood this interaction, and negotiated the step safely when entering and leaving the space. We also made sure that someone was with them afterwards, as they were still immersed in the experience while next to a flight of stairs. With people being so physically curious in their movement, we made sure that they were observed to avoid serious injury.

 

What next?:

Over the next year, we will be continuing to develop the piece further, addressing feedback that we received during our Wandsworth Arts Fringe run and at our showing at Royal Central.


We have also teamed up with Clare McCall (MA ATP 2016/2017) on her one-woman show Social Media Suicide. Clare approached us to talk about implementing digital elements in her show, and we agreed to collaborate on the piece once our commitments to the first run of our own show (and our assignments) were finished. That way, we could focus and balance our time more effectively on each project. Since our original meetings in March, our experience of working with digital elements has expanded considerably. Our skills have advanced quite a bit, and we feel that we can deliver interesting and useful elements to help with Clare's piece.

Edfringe.jpg

 

 

 

This revision of our show will begin prior to our own three-day run at the Edinburgh Fringe Festival. We are excited to perform the experience, which will take place in The Cutting Room (Out of the Blue venues) on the 16th, 17th and 18th of August.

National_Youth_Theatre_(logo).svg.png

 

 

 

After our run at the Fringe festival, we have the honour of hosting a workshop/lecture at the National Youth Theatre. This engagement came from our contact with the Sheffield-based theatre company Epiphany VR, who were interested in our work after an interview we conducted with them for our own research. The lecture is scheduled for the 28th of August, as part of Epiphany's digital course designed specifically for the NYT.

 

Our current schedule extends to mid-October, when we are provisionally scheduled to show and discuss our piece at Queen's University Belfast. At the moment, we have spoken about arranging two or three days to increase our outreach. In terms of audience, we can expect students, graduates and staff to attend. During our stay, we aim to discuss questions about postgraduate life, Central and the Advanced Theatre Practice course, amongst other things.

What did we look at?

The Presentation:

Halfway through the year, our course leaders decided that it could be beneficial for us, and for the MA ATP class, to give presentations based on our practice and research. We agreed, noting that putting our theoretical and practical notes on paper would put us in good stead for the end-of-year essay assignment. The presentation was divided into two parts, reflecting both Carly's and Ben's individual research and how it merged in collaboration. We discussed phenomenology, the presence of the senses, as well as our experience with VR technology.

One of the points we raised about the difference with VR was the question of theatrical cues. With 360-degree vision, a participant could be looking anywhere, and if a cue is purely visual, they may not engage with the next step of the narrative. However, if you engage the other senses, such as sound, a participant is more likely to focus their attention on responding to that cue.

WORKSHOPPING VR AND SENSORY EXPERIENCES

In February, we hosted a sensory VR workshop, offered to ATP year one students, in which we asked the participants to put on the VR headset and headphones and become immersed in a short virtual, sensory-stimulating world. As this four-minute VR journey had the same structure as our full-length experience, the workshop was imperative for seeing how audiences would react to sensory elements in tandem with VR. The sensory elements included feeling wind, smelling coffee, stepping onto grass and tasting a sweet.

 

From this workshop, we figured out just how close we needed to get the scent stimuli to the audience in order for them to smell them. We also discovered very practical ways of moving the audience, for example using two hands to guide them rather than one hand on their hand and one hand guiding their back. Furthermore, we discovered that the juxtaposition of senses and images was exciting for the audience. They enjoyed the moment when it looks like Carly's virtual self is feeding them an orange, but Carly's live self feeds them a sugary sweet instead. As seen in the video, this made workshop attendee Hoa-Yeh laugh, and she encouraged these moments of dissonance. We used these findings to strengthen our work on The Extension.

 

 

 

How did we keep track?

Everything we read, watched or experienced, either in or out of the rehearsal room, was noted. This was done both in physical written form and on our Google Drive, where we organised everything into specific folders (see image, Right).

 

We kept diaries logging our rehearsals and meetings, which we would then transcribe into a Word document and upload to our storage space.

 

 

 

The Google Drive was really useful to us because it gave us access to all the types of media we wanted. VR content was among the most useful, but we also kept track of articles and books that we read, as well as pictures and videos that we thought would be a useful reference for our practice. An example would be the TEDx Talk given by Graham Hancock on Life & Death after Consciousness (Video, Left).

 

 

 

 

Bibliographies:

During our research and development, we would keep track of what we were reading and what we should read. The same applied to videos or any other media. Attached below are both Carly and Ben's essay bibliographies for you to read over:

 

Carly's Bibliography here

Ben's Bibliography here

 

You can find Carly and Ben's joint bibliography in the SIP Proposals at the top of this website. 

 

What and how did we spend?

 

Since the proposal of their joint VR project, Carly and Ben knew that they would need to invest in the equipment necessary to create a technical piece like The Extension. Because of the research we did at the beginning of our process, we spent our designated budget on VR headsets, headphones, Samsung S7 phones and the Samsung Gear 360 camera. We also purchased the sensory elements needed for our experimentation, including sand, essential oils, small carpets, food, drink and various textured materials.

Once we had the necessities to create and perform The Extension, we pitched for monetary support to take the show beyond the Wandsworth Arts Fringe as ambassadors for Central and the MFA Advanced Theatre Practice course.

See our budget pitch here

 

Electrick Village couldn't have made it through the year without the help of some incredible people. We would like to take this opportunity to thank all those who have helped us over the past two years. They are:

 

Our Course Leaders: Nick Wood, Nohar Lazarovich, Lynne Kendrick

Our Tutors: Jane Munroe, Farokh Soltani-Shirazi

Our Placements: Dante or Die (Daphna Attias, Terry O'Donovan), Bad Faith (Tara D'Arquian) 

Our Mentors: Roderick Morgan & Tara D'Arquian 

TSD Media: Ken Mizutani & Roberto Puzone

Our Practitioners: Chris Elson, Ben Mason, Lysander Ashton, Jack Lowe, Douglas O'Connell, Bjorn Lengers, Connie Harrison, Ben Carlin, David Kaskel, Dan Lamont, Maria Rakusanova, Roderick Morgan.

MA ATP - Class of 2017

MFA ATP - Class of 2018