Sienna Ramos

Projects

Completed

A Movie Script Ending

March 2019

Co-wrote an a cappella arrangement of Death Cab for Cutie's A Movie Script Ending.

AcaPalette

October 2018

Finalist at the Hacking Arts Hackathon and Conference hosted at the MIT Media Lab. Intense emotions, such as anger, cause the speaking voice to deepen and sound more tense. Using sound processing, we can detect when a speaker is experiencing an intense emotion and map that emotion to a color display. A potential application is an art installation that questions the nature of arguments: what are the emotional differences between a long argument that fills the display with vibrant colors and one that keeps the display colorless for as long as possible?
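The core mapping can be sketched in a few lines. This is a minimal illustration, not the hackathon code: it assumes vocal intensity is estimated as RMS energy of an audio buffer, and the `lo`/`hi` thresholds are made-up placeholder values.

```python
import math

def rms(samples):
    """Root-mean-square energy of an audio buffer (floats in [-1, 1])."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def energy_to_color(energy, lo=0.02, hi=0.5):
    """Map vocal energy to an RGB color: calm -> cool blue, intense -> hot red.
    lo/hi are hypothetical calibration thresholds."""
    t = min(max((energy - lo) / (hi - lo), 0.0), 1.0)
    return (int(255 * t), 0, int(255 * (1 - t)))
```

A real installation would refine this with pitch and spectral features rather than energy alone, but the energy-to-color interpolation captures the idea.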

FPGA iPad

December 2017

Final Project with Alex Leffell for 6.111, Introductory Digital Systems Laboratory. We designed a Verilog system that rotates, resizes, and translates images on a VGA display. The user's pinch and twist finger gestures are captured by an NTSC camera and translated into commands for the image-processing module.
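The rotate/resize/translate step is a standard inverse-mapped affine transform: for each output pixel, the hardware computes which source pixel to fetch. A minimal Python sketch of that coordinate math (not the Verilog, which is not reproduced here):

```python
import math

def inverse_map(x_out, y_out, angle, scale, tx, ty):
    """Given an output pixel, find the source pixel under a
    rotate-by-angle, scale, translate transform (inverse mapping,
    as a hardware pixel pipeline typically does)."""
    # Undo the translation.
    x, y = x_out - tx, y_out - ty
    # Undo the scaling.
    x, y = x / scale, y / scale
    # Undo the rotation (rotate by -angle).
    c, s = math.cos(angle), math.sin(angle)
    return (x * c + y * s, -x * s + y * c)
```

In an FPGA this would typically be done in fixed-point arithmetic with precomputed sine/cosine values, but the structure is the same.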

Adder(f)all

October 2017

A solo-less a cappella arrangement of Tei Shi's Adder(f)all.

Pressure Sensitive MIDI Controller

May 2017

Final Project for 6.115, Microcomputer Project Laboratory. I created 12 "keys" out of pressure-sensitive resistor material and connected them to a PSoC microcontroller, which translated key and pressure information into MIDI note and volume data. A touch screen connected to an Intel 8051 microcontroller let the user select the waveform (sine, sawtooth, or square) and the octave of the note. All of this information was sent to a laptop running Pure Data, which consolidated the data from the PSoC and 8051 to generate sounds.
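The key-to-MIDI translation can be illustrated with a short sketch. This is a hypothetical reconstruction, not the PSoC firmware: it assumes pressure is normalized to 0.0–1.0 and maps it to MIDI note-on velocity.

```python
def pressure_to_midi(key_index, pressure, octave=4):
    """Translate a key press into a 3-byte MIDI note-on message.
    key_index: 0-11 (one of the 12 keys); pressure: 0.0-1.0."""
    note = 12 * (octave + 1) + key_index   # MIDI note number (C4 = 60)
    velocity = max(1, min(127, round(pressure * 127)))
    status = 0x90                          # note-on, channel 1
    return bytes([status, note, velocity])
```

The waveform selection from the 8051 would travel as a separate message; on the laptop side, Pure Data routes note, velocity, and waveform into its synthesis patch.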

Introception

April 2017

Fluid Interfaces

Part of Xin Liu's Master's thesis in the Fluid Interfaces Group at the MIT Media Lab. Xin was studying people's perception of their own bodies and wanted a wearable device that reacts to the wearer's breath by playing back either real-time or delayed breath sounds. I built the device's software using Python and Arduino, tested sensors, and helped design the device. Once it was built, Xin ran a study of emotional responses to using the device; I helped design the survey and consolidate the collected data.
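The real-time-versus-delayed playback amounts to a ring-buffer delay line. A minimal sketch of that logic (an illustration under assumed names, not the actual device code):

```python
from collections import deque

class BreathDelay:
    """Play a breath-audio stream back either live or delayed by a
    fixed number of samples, using a fixed-length ring buffer."""

    def __init__(self, delay_samples):
        # Pre-fill with silence so the first delayed outputs are zero.
        self.buffer = deque([0.0] * delay_samples, maxlen=delay_samples)

    def process(self, sample, delayed=True):
        out = self.buffer[0] if delayed else sample
        self.buffer.append(sample)  # oldest sample falls off the front
        return out
```

Switching `delayed` toggles between hearing one's breath as it happens and hearing it offset in time, which is the contrast the study examined.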

cycles

December 2016

Final Project for 21M.361, Electronic Music Composition I. Originally performed in four-speaker surround sound, this piece is a walk through the repetitive and intense life of an MIT student.

Past Lives

October 2016

An a cappella arrangement of BØRNS's Past Lives.

Water

September 2015

An a cappella arrangement of Andrew Huang's Water.

The Human Joystick

January 2015

Momentum

Project for 16.682, Momentum: Brain-Computer Interfaces. On a team of four, I helped build a robotic arm controlled by signals from the user's muscle movements. As the team's hardware specialist, I assembled the robot, designed the input-filtering scheme, and wrote C programs for various functions. I was also the point person for budget management and part acquisition. We presented the project in a talk and a poster session.
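Muscle-signal input filtering for this kind of controller is commonly a rectify-smooth-threshold pipeline. A minimal sketch of that idea (an illustration with made-up window and threshold values, not the project's C code):

```python
def filter_and_threshold(signal, window=5, threshold=0.3):
    """Rectify a raw muscle-signal stream, smooth it with a moving
    average, and emit on/off commands when it crosses a threshold."""
    commands = []
    for i in range(len(signal)):
        # Moving average over the last `window` rectified samples.
        window_vals = signal[max(0, i - window + 1): i + 1]
        smoothed = sum(abs(v) for v in window_vals) / len(window_vals)
        commands.append(smoothed > threshold)
    return commands
```

The smoothing suppresses single-sample spikes, so the arm only moves on sustained muscle activation rather than noise.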

Terrascope: Our Energy Future

December 2014

Terrascope

Project for 12.000, Solving Complex Problems. I researched energy policy in China as part of a 50-person team developing an energy portfolio for the world.

Google Trailblazer: Project ICE

November 2013

I was part of a group of students selected by Google to design academic programs that promote computer science. With my team of five, I planned a platform and forum where K-12 students could compete with each other to complete coding lessons in languages like Scratch, Python, and Java.