The Seven Sentence Stories (Saphya + ZZ)


192.168.50.184/~zz791/DevWeb_w14-assignment

Idea: a collaborative writing platform that bridges physical distance and ignites the imaginations of authors around the globe. We wanted the benefits of a social medium focused on writing and idea-making, and thus TSSS was born. Using this app, people can create new stories, edit existing ones, or simply read.

Front End:

jQuery Mobile Theme Picker

[Screenshots: building a custom theme using ThemeRoller from the jQuery Mobile site]

 

HTML

changeStory(): updates another page in our HTML called “readstory” with the identity of whichever story is clicked on the main page. So, if you click on “test”, the empty “readstory” page is filled with the title, story, and collaborator list of “test”.

showStories(): uses the ‘story-list’ GET call to list all the stories on the list-view page.

filterStories(): uses the ‘story-list’ GET call to compare the search input against the existing story titles in the database and return the stories that exactly match the search.

storyUpdate(): uses the ‘set-story’ GET call to add the new sentence to the page in a ‘p’ tag. It also updates the database.

newStory(): uses the ‘set-story’ GET call to add a new row for that story to the MySQL database and updates the list of stories on the main page.
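As an aside, the exact-match behavior of filterStories() is easy to sketch outside jQuery. A minimal Python stand-in (the story titles here are invented for illustration):

```python
# filterStories() logic, minus the jQuery: keep only stories whose title
# exactly matches the search input (no partial matches).
stories = ["test", "seven sentences", "qilin tales"]   # as from 'story-list'

def filter_stories(search, titles):
    return [t for t in titles if t == search]

print(filter_stories("test", stories))   # ['test']
print(filter_stories("tes", stories))    # [] -- exact match only
```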

 

Back End:


phpMyAdmin

In phpMyAdmin, we made a new database called “collab_fiction”. In it we have a table called ‘stories’ with columns ‘id’, ‘dates’, ‘author’, ‘sentence’, and ‘storyid’. In our index.php, we have four GET commands:

‘set-story’: adds a new row to the table

‘get-stories’: lists all rows in the table

‘get-story’: lists all rows under a given storyid

‘story-list’: lists all unique storyids in the table

We use these commands in different JavaScript functions in our HTML code.
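The four commands map onto simple SQL queries over the ‘stories’ table. Here is a sketch using Python’s sqlite3 as a stand-in for MySQL (the sample rows are made up; the real app issues these queries from index.php):

```python
import sqlite3

# In-memory stand-in for the MySQL 'stories' table in 'collab_fiction'.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE stories "
           "(id INTEGER PRIMARY KEY, dates TEXT, author TEXT, "
           "sentence TEXT, storyid TEXT)")

def set_story(dates, author, sentence, storyid):
    # 'set-story': adds a new row to the table
    db.execute("INSERT INTO stories (dates, author, sentence, storyid) "
               "VALUES (?, ?, ?, ?)", (dates, author, sentence, storyid))

def get_stories():
    # 'get-stories': lists all rows in the table
    return db.execute("SELECT * FROM stories").fetchall()

def get_story(storyid):
    # 'get-story': lists all rows under that storyid
    return db.execute("SELECT * FROM stories WHERE storyid = ?",
                      (storyid,)).fetchall()

def story_list():
    # 'story-list': lists all unique storyids in the table
    return [row[0] for row in db.execute("SELECT DISTINCT storyid FROM stories")]

set_story("2017-05-22", "saphya", "Once upon a time...", "test")
set_story("2017-05-22", "zz", "...there was a qilin.", "test")
print(story_list())   # ['test']
```

sqlite3 just keeps the sketch self-contained; the queries themselves are the same ones MySQL would run.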

 

Problems:

Although the story updated instantly, the author list did not update unless you refreshed the page. We tried to fix this by placing the portion of code that controlled the author list within the submit button’s “click” function, but this only generated duplicates. We are still seeking to solve this issue.

 

Reflection:

I really enjoyed this project because I got to work with phpMyAdmin and MySQL, both of which I’d heard about but had never seen in action. After making the database and learning the right jargon to pull information from it, it was relatively easy to write a for loop inside a .getJSON method in our HTML. Then it became a matter of displaying the specific part of that JSON, which required a lot of tweaking and still does to this day to perfect. I am satisfied with the work we were able to produce in such a limited time. TSSS works fine locally on a personal device if you create a database with the same name and the same table, but this is not ideal. I still want to learn how to put this online, but that is a future challenge.

The Immersive Soundscape of HATCH/宝贝 (Nicole + Saphya)

Finding Sound


“Coins 1” by ProjectsU012 aka Happy Noise

“8-Bit Wrong 2” by TheDweebMan aka Wrong Noise

“8bit-harmony-lowcutoff-envelope” by DirtyJewbs aka Theme Music

We stuck with chiptune and 8-bit audio to match the retro feel of our game.

In-game Music


1. Sad Noise: this sound is activated whenever a heart is removed

2. Happy Noise: this sound is activated every time the player satisfies the qilin by scanning the Vuforia markers, giving the qilin pizza and/or coffee

3. Theme Music: this sound is constant and plays on a loop from Awake

 

Google Tango Spatial Sound Experiment

Using the same audio, Sean helped us make a Unity scene for the Tango in which users can walk around a 3D soundscape and approach orbs with an AudioSource attached. The closer you got to an orb, the clearer its sound; the farther away, the fainter.


HATCH/宝贝 Core Mechanic (Nicole + Saphya)


The Basics:

The foundation of our app is controlled by four scripts: Time Management, Heart System, Camera Button, and Tracking Event Handler.

Time Management controls the active states of the hearts in the GUI. Using an array, this script sets the active state of each heart to false at thirty-second intervals by calling a function in the HeartSystem script named HeartDeletion().

https://github.com/saphya-council/hatch-spr17/blob/master/timemanagement.cs

Camera Button is connected to the lower button in the GUI that activates the ARCamera. This script is attached to both the coffee button and the pizza button, and passes several parameters tied to the button’s identity into the Tracking Event Handler script. If OnTrackingFound() returns true for the specified image target, HeartAddition() from the HeartSystem script is invoked.

https://github.com/saphya-council/hatch-spr17/blob/master/camerabutton.cs

Tracking Event Handler cross-checks what the ARCamera is scanning against the image target the CameraButton script has sent it. This ensures that the camera doesn’t deinitialize after scanning just any image target in the database, and that the qilin receives the correct icon popup in the game.

https://github.com/saphya-council/hatch-spr17/blob/master/trackingeventhandler.cs

Heart System is connected to both the CameraButton and Time Management scripts, and controls the qilin’s animations and appearance. If the qilin’s hearts reach zero, the qilin turns into a pile of bones; otherwise it remains a cute qilin.

https://github.com/saphya-council/hatch-spr17/blob/master/heartsystem.cs
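Stripped of Unity specifics, the heart bookkeeping these scripts share can be sketched in a few lines of Python. This is a stand-in, not our C#: only the HeartDeletion()/HeartAddition() names and the pile-of-bones rule come from the app; the heart count and everything else here is assumed.

```python
class HeartSystem:
    """Python stand-in for the Unity HeartSystem script."""
    def __init__(self, n_hearts=3):
        self.hearts = [True] * n_hearts      # active states of the GUI hearts

    def heart_deletion(self):
        """Called by TimeManagement every 30 s: deactivate the last active heart."""
        for i in reversed(range(len(self.hearts))):
            if self.hearts[i]:
                self.hearts[i] = False
                return

    def heart_addition(self):
        """Called by CameraButton after a successful marker scan."""
        for i in range(len(self.hearts)):
            if not self.hearts[i]:
                self.hearts[i] = True
                return

    @property
    def alive(self):
        return any(self.hearts)              # all hearts gone -> pile of bones

hs = HeartSystem()
for _ in range(3):
    hs.heart_deletion()                      # 90 s with no pizza or coffee
print(hs.alive)   # False: the qilin turns into a pile of bones
```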

Future Plans:

We tried to incorporate multiplayer compatibility so that couples can chat and take care of the same qilin. We first attempted this through the Unity Multiplayer Networking tutorial; however, it did not work between a PC and a mobile phone, because there was no server to connect the two devices. Next, on Sean’s advice, we explored Photon Unity Networking. This worked better, because Photon provides a cloud service that can be accessed much like an API. In the short time we had, we prioritized perfecting the Vuforia camera and the appearance of our app over the networking component. In the future, we hope to finish our work on that part of the app.

HATCH/宝贝 Final Doc Post (Saphya + Nicole)

PROMOTIONAL VIDEO:

 

WHAT IS HATCH?


宝贝/hatch is an augmented reality dating game that encourages “real world” interaction.


We realized that the problem with the culture of online dating is that people are less likely to meet in person, which is an issue we hope to rectify through our app.


A combination of Neopets and Tinder: users match with each other based on common interests in their profiles, which triggers an event where they must find and foster a pet together.


These rules are put in place to establish a serious-minded relationship between two people. We want users to use our app as a means for physical interaction, not to prolong virtual discourse.


Users can obtain resources in different places within the academic building to raise their pet. In the cafeteria one can scan for food; in the cafe one can scan for treats; and at any water cooler one can scan for water. The bare necessities. Users are also urged to visit their pet together every now and then to give their pet a love boost.


 

HOW DOES IT WORK?


The foundation of our app is controlled by four scripts: Time Management, Heart System, Camera Button, and Tracking Event Handler.

Time Management controls the active states of the hearts in the GUI. Using an array, this script sets the active state of each heart to false at thirty-second intervals by calling a function in the HeartSystem script named HeartDeletion().

https://github.com/saphya-council/hatch-spr17/blob/master/timemanagement.cs

Camera Button is connected to the lower button in the GUI that activates the ARCamera. This script is attached to both the coffee button and the pizza button, and passes several parameters tied to the button’s identity into the Tracking Event Handler script. If OnTrackingFound() returns true for the specified image target, HeartAddition() from the HeartSystem script is invoked.

https://github.com/saphya-council/hatch-spr17/blob/master/camerabutton.cs

Tracking Event Handler cross-checks what the ARCamera is scanning against the image target the CameraButton script has sent it. This ensures that the camera doesn’t deinitialize after scanning just any image target in the database, and that the qilin receives the correct icon popup in the game.

https://github.com/saphya-council/hatch-spr17/blob/master/trackingeventhandler.cs

Heart System is connected to both the CameraButton and Time Management scripts, and controls the qilin’s animations and appearance. If the qilin’s hearts reach zero, the qilin turns into a pile of bones; otherwise it remains a cute qilin.

https://github.com/saphya-council/hatch-spr17/blob/master/heartsystem.cs

Here are our markers:

[Marker images: fdpizza, fdcoffee]

 

FUTURE PLANS:

We tried to incorporate multiplayer compatibility so that couples can chat and take care of the same qilin. We first attempted this through the Unity Multiplayer Networking tutorial; however, it did not work between a PC and a mobile phone, because there was no server to connect the two devices. Next, on Sean’s advice, we explored Photon Unity Networking. This worked better, because Photon provides a cloud service that can be accessed much like an API. In the short time we had, we prioritized perfecting the Vuforia camera and the appearance of our app over the networking component. In the future, we hope to finish our work on that part of the app.

 

Staging Fright “824A” Documentation

The story:


In the beginning, our haunted house story was designed for the interior of room 824A. We planned to enlist a professor to pose as an innocent teacher looking for something she lost. The victim, i.e. you, would ideally volunteer to help the professor and look for this lost item in room 824A. The professor gives you explicit instructions to search in locker A, and as planned, you go directly to locker A first; however, the item isn’t there. Upon opening the locker, a sound is triggered from deeper in the room, from another locker, and a curious victim investigates the noise. In that locker is something subtly scary, along with a trigger that activates a louder set of noises to scare the victim off. When the victim turns to leave, a short film of a person being mutilated plays on a small computer screen. At this point, the victim really wants to leave, but they can’t exit the way they entered because the culprit from the video is standing in their way! The victim is successfully scared away and runs out of the room through the other door.

*Story edited for storage room:

Instead of a physical figure, we decided to explore the power of projection and cast a ghost onto the wall.

Technical:

We wanted the project to be less manual and more automatic, so we relied primarily on motion detection to trigger several events within our Max patch. The only manual control we have over the patch is opening and closing the webcam, plus the uppermost toggle that activates the entire patch.


We have four events in total. A webcam is attached to one of the shelves in the storage room, and we apply a fade between that feed and a video of a stairwell, which triggers when the victim enters the matrix at any point in the second column (4). This also triggers the “comehere” mp3 clip, to urge the victim deeper into storage (1). As the victim gets closer to finding the item they were sent into the room to look for, they enter coordinates 1,3 of the jit.split, which activates the “iSeeU” mp3 (2); once the sfplay~ finishes, the patch reads the “ghostCrawl_sound” mov and plays the video (3).
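Stripped of Max specifics, the trigger logic amounts to splitting the frame into grid cells and firing each event the first time enough change shows up in its cell. A rough Python sketch of that idea (the grid size, threshold, and toy frames are all invented):

```python
# Sketch of the motion-trigger idea behind the patch (pure Python, no Max):
# split each frame into a grid of cells and fire an event the first time
# motion (a large enough change from the previous frame) appears in a cell.
GRID = (4, 4)          # rows x columns, like jit.split regions
THRESHOLD = 30         # mean per-cell brightness change that counts as motion

def cell_means(frame, grid=GRID):
    rows, cols = grid
    h, w = len(frame), len(frame[0])
    means = {}
    for r in range(rows):
        for c in range(cols):
            block = [frame[y][x]
                     for y in range(r * h // rows, (r + 1) * h // rows)
                     for x in range(c * w // cols, (c + 1) * w // cols)]
            means[(r, c)] = sum(block) / len(block)
    return means

def detect(prev, cur, fired):
    """Return newly triggered cells, remembering ones that already fired."""
    events = []
    a, b = cell_means(prev), cell_means(cur)
    for cell in a:
        if abs(a[cell] - b[cell]) > THRESHOLD and cell not in fired:
            fired.add(cell)
            events.append(cell)
    return events

dark  = [[0] * 8 for _ in range(8)]
moved = [[0] * 8 for _ in range(8)]
for y in range(2):          # motion appears in the top-left corner
    for x in range(2):
        moved[y][x] = 255
fired = set()
print(detect(dark, moved, fired))   # [(0, 0)]
```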

Using remote management, we controlled the security camera feed playing within the room and the ghost video playing from the projector all within one patch.

 

‘The Legend of Link’ Documentation by Saphya

Week 1 (02/5): Conceptualising

Hypertext Fiction Game with Musical Output

Inspiration


In the beginning, I was focused on the visuals of my project and less on the story, which was essentially the root of my project. Without it, I wouldn’t have an interactive text adventure because there’d be no text! As you can see from my earlier inspiration, This Book is a Dungeon and Choice of Games are essentially hypertext fiction games. While the latter looks more like a book read in the browser and uses toggles and input to make choices, the former is an external application that uses a hypertext style of progression. I thought this approach would be so simple that it would leave me time to decorate my application a little, and because it would be scripted in HTML/CSS and JavaScript, I could make those changes easily. I was considering node.js, p5.js, tone.js, and rita.js for the interactive and musical components. However, I was not skilled in JavaScript libraries and didn’t want to rely on third-party sources such as Ren’Py, choiceofgames.com, or Twine for my text adventure.

I was set on Python, and so I started looking up tutorials on Python text adventures. Surprisingly, they weren’t hard to find, and because most of them lived in the terminal environment, I became fascinated with retro games and the history of text adventures. Zork is probably the most comparable existing text adventure out there. I love the frustration of these games (when the computer doesn’t recognize a certain string of words) and the reward you get if you manage to get deep into the lore. That’s something I hoped to emulate in my own text adventure.

Week 2 (02/12): Paper Prototype


As part of my homage to the 80s, I decided to base the content of my text-based adventure on the popular video game character Link. Characters, items, and locations were researched on the Zeldapedia, specifically from the first Zelda game. In the beginning I wrote the story out in my notebook, but my hands got too tired, so I transferred over to the PC.


paper prototype

For my paper prototype I created a flip panel, which has been compared to the book A Hundred Thousand Billion Poems. One reads the passage on a flap, then turns the corresponding flap to choose a direction for the character to go in. In my text adventure, I wanted to use cardinal directions and the occasional ‘look’ and ‘inspect’, as in Zork and Mystery House.

Notes

  • have a heart system where, if injured, your character “limps” instead of “walks”
  • You have to find Link!

Week 3 (02/19): Officially Not a Hypertext

“My project will no longer be a hypertext fiction. I like the style of Zork and Mystery House because I think there are more opportunities for interactivity when the user must type their own input rather than choosing from a list of cues. It gives players the illusion that they have some degree of freedom, when in reality the program is just checking for target words in the input string. In order to achieve this style of interactive literature, I want to take advantage of Python’s cmd module.” -Saphya on March 5th
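A minimal sketch of what the cmd module provides (the rooms and commands here are placeholders, not the real game content):

```python
import cmd

class Adventure(cmd.Cmd):
    """Tiny Zork-style parser built on Python's cmd module."""
    prompt = "> "
    intro = "You wake in a dark forest."

    def __init__(self):
        super().__init__()
        self.room = "forest"

    def do_go(self, direction):
        """go <direction>: move the character."""
        if self.room == "forest" and direction == "north":
            self.room = "cave"
            print("You enter a cave.")
        else:
            print("You can't go that way.")

    def do_look(self, arg):
        """look: describe the current room."""
        print("You are in the " + self.room + ".")

    def do_quit(self, arg):
        """quit: end the game."""
        return True          # returning True stops the cmd loop

# Adventure().cmdloop() starts the interactive '> ' prompt;
# cmd dispatches input like 'go north' to do_go('north') automatically.
```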

Notes

  • processing instead of python?
    • string manipulation of giant text file
    • put it on a server
  • ASCII art, pixel art

Week 4 (02/26): User Testing

Using this tutorial, I made a simple text adventure in Python using the cmd and json modules.


Python script (left) and story with different nodes (right)


running the game in the command prompt

I struggled a lot with including dialogue in the game, because JSON strings must be delimited with double quotes, so mixing double quotes and single quotes inside the dialogue broke Python’s parsing. I settled for dashed lines instead to show the transition between speaker and narrator.
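For reference, the quoting clash can also be worked around by escaping the inner double quotes, since JSON strings must themselves be double-quoted. A tiny sketch with made-up dialogue:

```python
import json

# JSON string values must be wrapped in double quotes, so double quotes
# inside the dialogue have to be escaped as \" for the parser to accept them.
raw = '{"node": "intro", "text": "The merchant says, \\"Welcome, traveler.\\""}'
node = json.loads(raw)
print(node["text"])   # The merchant says, "Welcome, traveler."
```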

Notes

  • font too small in command line
  • don’t use cardinal directions, use “go forward” or “return”

Week 5 (03/5): MIDI?

Since I want my game to have a musical component, I was looking at ways to turn the Zelda MP3 into text that I could plug into the JSON files as attributes for the different areas in the game.


what happens when you open an mp3 file in a text editor?


“The Legend of Zelda Side B” in MIDI

The goal is to concatenate the hexadecimal codes into one giant text file and export it as an MP3 only after the player has reached the final area. That way, players end up with a playable audio file that is remixed according to the various paths they decided to take.
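The plan, sketched in Python (the per-area byte chunks below are fake stand-ins for slices of the actual Zelda audio):

```python
# Sketch of the remix plan: collect each visited area's audio bytes along the
# player's path, then export one combined file after the final area.
area_audio = {
    "forest": b"\xff\xfb\x90\x00FOREST",
    "cave":   b"\xff\xfb\x90\x00CAVE",
    "castle": b"\xff\xfb\x90\x00CASTLE",
}

path_taken = ["forest", "cave", "castle"]          # the route the player chose
remix = b"".join(area_audio[area] for area in path_taken)

with open("remix.mp3", "wb") as f:                 # export only at the end
    f.write(remix)
print(len(remix))
```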

Week 6 (03/12): Storywriting


story map v1


my word doc for the story, paths that have been included in the game are scratched out

In order to form an ending for my story/game, I needed to figure out how the different locations, characters, and items relate to each other, so that the player can figure out what to do and where to go next. This required me to revisit earlier parts of my story and edit them so that the story is coherent and easy to follow. For every decision I made a branching path, creating more and more indentation the deeper I got into a path. I had to anticipate every possible action players might type, so a lot of prompts might seem repetitive.

Week 7 (03/19): Music21 & More Commands

I wanted to revisit the musical component of my project, because I could not find a straightforward way for the game to control the MIDI. In the end, I used the music21 library to convert the individual Zelda notes back into audio. I did this by opening the theme’s MIDI file in MuseScore and reading off information such as pitch, dots, and duration for the notes I wanted to capture in my game.


https://github.com/saphya-council/theLegendofLink/blob/master/src/musescore_midi.py

Besides cardinal directions, I added actions such as “inspect”, “look”, “intimidate”, “talk”, etc. I wanted to anticipate every possible input a user might write, and because of this my game and its help list became quite messy. I expected players to appreciate this, since there had been moments where certain actions seemed lacking and needed to be included, but it turns out less is more.

Notes

  • use raspberry pi to make your game like a standalone application!
  • remove install music21 error message

Week 8 (03/26): User Testing & Raspberry Pi

I was advised to use a Raspberry Pi for my Python game, so I prepped one with all my files. The quality was much better on the Pi because the terminal font was not as small as it is on Windows. Going forward, I want to run LoL on a very ancient-looking computer, as in The House Abandon, to complete the retro feel I was initially going for.


user tester playing LoL on raspberry pi

Notes

  • edit grammar, vocab issues
  • ASCII art?
  • opening message is a prompt with instructions, change to opening “1” get.room(1) once player types something
  • prompt get to know the orientation of people
  • counter for how many times people get ‘you can’t go that way’
  • check if forest visited/merchant visited

Week 9 (04/02): ASCII Art

[Screenshots: making a logo and map in Python]

I wanted to add some personality to my game so that it isn’t just text, and the ASCII art complements it a lot, I think. I also made a map so that players aren’t confused about their location in the game/story and can move around with more confidence. It also helps them take advantage of the many controls I’ve added to the game.

Notes

  • add music background through winaudio library

Week 10 (04/09): Final Edits

Notes

  • finish story ending (fight w/ Ganon)
  • find some way to export midi w/ music21 after player “quits”
  • update directions
  • procedural generated music

All my source code (excluding the .json files) can be accessed here on my GitHub!

Saphya & Teresa Project 2 “Oh the agony!”

In-class Practice with Max/MSP


For the in-class activity, we decided to combine Jitter, kslider, and the Arduino serial connection to switch between two videos in Max. To do this, we simply had a ‘change’ object connected to two ‘jit.movie’ objects, activating one or the other depending on whether the number coming in from the kslider was greater or less than a threshold. On the Arduino side, we used a potentiometer to control the ‘slider’ object driving the kslider’s input. It was important to use ‘zmap’ to map the serial feed (which runs from 0 to 1023) onto the kslider’s range (0 to 127).
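zmap’s scaling is just a clamped linear map, which is easy to state outside Max; a Python sketch of the 0–1023 to 0–127 mapping:

```python
def zmap(x, in_lo, in_hi, out_lo, out_hi):
    """Linear range mapping like Max's zmap object (input clipped to range)."""
    x = max(in_lo, min(in_hi, x))
    return out_lo + (x - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

# Potentiometer reading (0-1023) -> kslider value (0-127)
print(zmap(0, 0, 1023, 0, 127))            # 0.0
print(zmap(1023, 0, 1023, 0, 127))         # 127.0
print(round(zmap(512, 0, 1023, 0, 127)))   # 64
```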

Homework

We decided to take our earlier example further by including a sensor. We used a light sensor to change a seemingly happy video into something horrifying when it detects that the lights are off. To do this, we simply switched our input to the light sensor and changed the integer value attached to our ‘change’ object. We also removed the kslider.


 

Saphya Plays with Gmail, Marvel, Astronauts API

Initially I wanted to use the Gmail API to forward messages from my portfolio site directly to my email, but as it turns out, it isn’t safe to leave my computer vulnerable to that kind of information. I was directed to Formspree, GetSimpleForm, and Kontactr, which are email shortcuts that have users go through a third-party service to draft an email and forward it to the right parties. This was not what I wanted in the end, so I looked up an alternative API, unrelated to the initial dream of direct email that I was going for.

https://github.com/craigprotzel/Mashups/tree/master/__HELP#openish-apis

I decided to test out the Marvel API on my SpongeBob fansite, just to see if it would work. Unfortunately, it didn’t either, because I needed a server behind my client in order to retrieve the information I wanted. My code was as follows:

var HERO = document.getElementById('hero').value;
var url = 'https://gateway.marvel.com:443/v1/public/characters?name=' + HERO + '&apikey=' + KEY;
var ts = new Date().getTime();
var hash = crypto.createHash('md5').update(ts + PRIV_KEY + KEY).digest('hex');
url += '&ts=' + ts + '&hash=' + hash;
console.log(HERO);
console.log(url);
//return $.get(url);
//document.getElementById('hero-content').appendChild(url);


Source code from: https://dzone.com/articles/more-examples-marvel-api

I had a problem with generating a timestamp and a hash for the URL in order to access the information the website was ‘getting’. I think this could have been avoided if I had a server, but I didn’t realise that until later.
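On a server, the timestamp-and-hash step is straightforward: the Marvel API defines the hash as md5(ts + privateKey + publicKey). A Python sketch (the keys are placeholders):

```python
import hashlib
import time

def marvel_auth_params(public_key, private_key):
    """Build the ts/apikey/hash query parameters the Marvel API expects,
    where hash = md5(ts + privateKey + publicKey)."""
    ts = str(int(time.time() * 1000))
    digest = hashlib.md5((ts + private_key + public_key).encode()).hexdigest()
    return {"ts": ts, "apikey": public_key, "hash": digest}

params = marvel_auth_params("PUBLIC_KEY_HERE", "PRIVATE_KEY_HERE")
print(sorted(params))   # ['apikey', 'hash', 'ts']
```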

In conclusion, I resorted to an API that works strictly client-side, with no server required. The People In Space API is an open API, so I didn’t have to jump through hoops to get it to work. Unfortunately, there wasn’t a lot it could offer, since the number of people in space stands at 5.


 

Saphya, Digital Performance Ch 1 + Goosebumps

Writing Assignment 2:

In Chapter 1 of Digital Performance, the author opens a discourse on the contention that using technology in theater devalues its purity. I disagree with Dixon’s commentary and would argue that his examples are biased and depict obviously crude uses of technology in live theater. Productions like Fever (2001) and Monsters of Grace (1998) are convenient evidence for Dixon’s argument, but they are only two examples in a long history of mixed media productions. The claim that the beauty of theater lies in its liveness rests on the misguided assumption that digital imagery cannot be performed in real time. Thus, it is possible for theater and technology to work in harmony, executed in such a way that critics will appreciate digital performance as thoughtful art.

Dixon cites the one-man show Fever (2001) and Monsters of Grace (1998) as evidence to support his claim that technology disrupts the natural balance of theater, yet Fever is a satirical example and should be taken with a grain of salt. As a solution, Dixon could have brought Fever up earlier in the book as a short analogy for how critics perceive theater and new media. However, in this specific chapter, if Dixon’s motive is to show evidence that “technological intrusion is alien; the two forms are aesthetic enemies” (28), his argument would be stronger if he cited a production whose inclusion of the digital was not meant to poke fun at digital performance (as with Fever). On the contrary, Monsters of Grace did not attempt to be a satire; rather, it was an honest failure born of poor execution.

The problem with most mixed media theatre today is technology’s use as a ‘prop’: directors or actors overuse new media without considering its effect on the greater production. To Dixon, “digital performance is an additive process,” so technology is inherently superfluous and deemed unnecessary. This is how technology can get so out of hand, permeating every aspect of a production until it loses its luster and productions “opt for the spectacle of thought rather than thought itself” (28). If technology were considered at the forefront of the production’s germination, would it still be an extra part? I would argue no: only if technology is considered later is it a second thought, a potential component. Like anything one comes up with later, it is most likely unimportant in the grand scheme of things and merely decorative. On the other hand, if the technology is the first thought, it must be an actual component, and without it the entire piece would come apart.

It is important to distinguish between needs and wants when it comes to digital performance. As a director, one should ask oneself, “Is this important to the narrative?” and weigh the choice carefully before importing technology with fanfare. A theater production should not be afraid to explore the power of technology; it should simply keep asking whether the technology serves a purpose. If more and more productions consider this when creating their stories, they can avoid the artificial traps of too much technology.

Project 1

For my first project, I decided to explore another physical and visual means of measuring fear in humans by studying piloerection. Piloerection, or goosebumps, is the “involuntary erection or bristling of hairs due to a sympathetic reflex usually triggered by cold, shock or fright”. I composed a short auditory experience to be used in a dark, quiet room and subjected each test subject to the audio, using a DSLR positioned over their forearms to record any goosebumps.

Test Subject 1

The first subject showed no physical reactions. He remained calm and composed throughout the recording. No sign of goosebumps.

Comments:

“nice end”

Test Subject 2

The second subject jumped twice while listening. No sign of goosebumps.

Comments:

“that was really creepy…”

Test Subject 3

The third subject showed no physical response. No sign of goosebumps.

Comments:

“towards the end the voice got creepier but it was sorta calming”

Conclusion

I would say that this experiment was a failure. No goosebumps showed at all during the three trials, which leads me to believe that goosebumps aren’t an effective indicator of fear. The subjects could easily show discomfort on their faces, but goosebumps did not develop as quickly as I had hoped. I believe goosebumps take longer to form, so for future experiments I should extend the suspense over a longer period to give them ample time to build up.

Sources:

Foley : freesound.org

Script :

White Tail Spider

Saphya, Responsive Website

 

After I got the gist of tags and CSS in different views, I could easily style my portfolio site the way I wanted. It was helpful to give everything a border so that I could adjust my different tags accordingly in the window.

In terms of design, I wanted a very simple layout with little color. A lot of websites these days have rounded edges, but I was more taken with sharp corners, so my website looks very boxy and retro. Totally unintentional, but I love the compliments!

I planned to have three pages: a homepage to host my projects, another page for contact info, and a final page for my portfolio. I also planned to make a logo for myself and the website, but I never got around to it, so it still says “LOGO GOES HERE”.

One thing I struggled with a lot in making this website was the hamburger menu for the mobile view. There was a lot of JavaScript and “display = none”s and “display = block”s involved so that the webpage didn’t seem too crowded. On a smaller screen, it was hard to place important links in an obvious position when the page looked different on different devices.

Desktop View


Mobile View


Feedback

  • load content in the background
  • text area instead of input
  • multiple categories instead of sort by year
  • change the size for older readers

All in all, I think my desktop view is far more polished than my mobile view, so I would like to keep working on it until users like both formats equally. That will require more experimentation with margins and floats, two things I am horrible at.

http://192.168.50.184/~smc737/midterm/index.html

*You can view my updated portfolio site here: https://saphya-council.github.io/portfolio/