Monthly Archives: March 2015

Lukasz Pypec – Computational Research

www.vitaljacket.com

Project 1 – Vital Jacket

Creator – Avenida D. Afonso Henriques

A suit that tracks your body as you move.  The suit monitors your heart and stores the data for about 72 hours.  The input is the person: the wearer's heart is tracked as they walk, run, or do anything else while the suit keeps monitoring it.  This is a great way to keep an eye on your heart as you go about your daily life.  The output is the data the suit creates, which is stored on a small hardware box kept in a pocket of the suit.

The suit has electrodes that connect inside the suit to the hardware box.  The electrodes send the data to the box, which stores it for 72 hours.  The artists are trying to help people monitor their hearts and make sure nothing is wrong as they go about their daily lives.  This can help doctors know about their patients' heart conditions and is a great opportunity to catch early cases of heart disease.  Plus, it's fashionable.  The technology keeps the suit comfortable so it doesn't put stress on the heart or produce faulty data.  The suit lets users monitor their hearts and stay healthy, serving as fashion as well as a reassurance that lets them enjoy their lives.

Video: https://youtube.com/watch?v=c2c-y3VRSPA

Project 2 – Flip Deck

Creator: Vincent Leclerc

Link:  vincenteclerc.com/flip

The Flip Deck is a skateboard that takes people's emotions and turns them into dynamic interfaces.  It displays animations that convey motion and speed using LED lights.  The input is the person's emotion: by touching the board, the sensors and circuitry read the person's emotion and light up the LEDs.  The output is the LEDs that flash and light up based on that input.

There is circuitry embedded in the skateboard, and it is connected to an 802.15.4 radio.  It also uses flip-flop gates.  I think the artist is trying to show what kind of emotion the person is experiencing and using that to create a small light show.  The entire project focuses on the mood of the person: if it's a strong emotion, the lights flash faster and stronger than for a weaker emotion.  The artist wants to express a person's emotion in some form of art.
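
This mapping from emotion intensity to flashing speed is easy to mock up. The sketch below is not the Flip Deck's actual firmware, just a small Processing illustration of the behavior described above, with the mouse's vertical position standing in for the board's emotion/touch sensor.

// Illustration only: stronger "emotion" reading -> faster, brighter flashing.
// Mouse height stands in for the board's sensor.
void setup() {
  size(300, 300);
  noStroke();
}

void draw() {
  background(0);
  float intensity = map(mouseY, height, 0, 0, 1);        // 0 = calm, 1 = strong emotion
  float rate = lerp(0.5, 8, intensity);                   // flashes per second
  boolean on = (millis() / 1000.0 * rate) % 1.0 < 0.5;    // simple on/off cycle
  if (on) {
    fill(lerp(80, 255, intensity));                       // brighter at higher intensity
    ellipse(width/2, height/2, 80, 80);
  }
}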

Project 3 – E-Static Shadows

Creator: Dr. Zane Berzina and Professor Janis Jeffries

www.zaneberzina.com/e-staticshadows.htm

E-Static Shadows is an exploration of the static electricity surrounding our interactions with everything around us, detecting, processing, and displaying electrostatic charges as audio-visual patterns on a textile.  The input of this project is human interaction with the e-static shadow: touching the piece allows it to interact with the person.  The output of the project is the LEDs, which form patterns whenever a person interacts with the e-static shadows.

The entire project is connected with circuitry.  The static produced by us connects with the circuitry to light up the LEDs.  I think the artist is trying to show us that the static coursing over our bodies can be used to create art with LEDs.  This gives us a way to be creative using static as our tool.  I think the project is trying to put us in a happy mood that helps with creating art, hopefully relieving stress and letting us just have fun.
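
A rough Processing analogy for that charge-to-LED idea is below. It is not the actual piece, which uses real electrostatic sensors woven into a textile; here the mouse stands in for the detected static charge, and a grid of "LEDs" brightens around it like a shadow.

int cell = 20;   // size of one "LED" cell in the grid

void setup() {
  size(400, 400);
  noStroke();
}

void draw() {
  background(0);
  for (int x = 0; x < width; x += cell) {
    for (int y = 0; y < height; y += cell) {
      float d = dist(x + cell/2, y + cell/2, mouseX, mouseY);
      float glow = map(constrain(d, 0, 120), 0, 120, 255, 0);   // nearer = brighter
      fill(glow);
      ellipse(x + cell/2, y + cell/2, cell * 0.6, cell * 0.6);
    }
  }
}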


Andrew J. Charles: Computational Research

Project #1
Project Name & Creator/s: Genesis, 1999 created by Eduardo Kac
Project URL: http://www.ekac.org/geninfo.html
Project Description: This project lets the user take a simple sentence from Genesis in the Hebrew Bible, turn it into Morse code, and then translate that Morse code into the bases used to build DNA (a rough sketch of this chain follows the entry below).
Input: You start by typing a simple message, which is then translated into Morse code.
Output: The Morse code is then translated once more into the genetic sequence used to build DNA.
Process: Installation with genetically modified E. coli bacteria, ultraviolet light, a computer, and the Internet.
Statement: By converting a simple message through successive translations, one can see how it would look as genetic code. The message goes through a lot to get there, but the idea is surprisingly creative.
Critique: For something made back in the late 1990s, it is rather interesting to see such a thing exist for its time, since it means we can control how genetic code is ordered and arranged based on how a person plans it. I do wonder if there is more to experiment with in this area.
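
The text-to-Morse-to-DNA chain can be sketched in a few lines of Processing. The element-to-base mapping used here (dot = C, dash = T, letter space = G, word space = A) is my reading of Kac's translation scheme, so treat it as an assumption and check the project page for the exact rule; the sample message and Morse table are only illustrative.

import java.util.HashMap;

// Minimal sketch (my own illustration, not Kac's code): message -> Morse -> DNA bases.
HashMap<Character, String> morse = new HashMap<Character, String>();

void setup() {
  morse.put('b', "-..."); morse.put('e', ".");    morse.put('g', "--.");
  morse.put('h', "...."); morse.put('i', "..");   morse.put('l', ".-..");
  morse.put('r', ".-.");  morse.put('t', "-");

  String message = "let there be light";   // any short phrase with letters in the table
  String dna = "";
  for (int i = 0; i < message.length(); i++) {
    char c = message.charAt(i);
    if (c == ' ') {
      dna += "A";                                        // word space
    } else {
      String code = morse.get(c);
      for (int j = 0; j < code.length(); j++) {
        dna += (code.charAt(j) == '.') ? "C" : "T";      // dot or dash
      }
      if (i + 1 < message.length() && message.charAt(i + 1) != ' ') {
        dna += "G";                                      // space between letters
      }
    }
  }
  println(dna);
}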

Project #2
Project Name & Creator/s: Avatar Machine, 2008 created by Marc Owens
Project URL: http://cargocollective.com/marcowens/Avatar-Machine
Project Description: The general idea of the 'Avatar Machine' is for the user to experience themselves from the point of view of a virtual character, as in third-person gaming. This lets the user see and express their behavior from a perspective different from their usual one.
Input: “The super wide camera mounted to the body harness is pointed back at the user. The video is streamed live to a head mounted display worn by the user.” This means you wear the camera first.
Output: “The effect is real life viewed in third person.” Then you are ready to experiment from this point of view.
Process: Camera, Computer, Mirrors and Harnesses
Statement: Through the use of cameras and computers, one can become their own 'virtual character', given that their perspective is third person.
Critique: I have to say the concept and the video are a rather interesting way to show how this would work. But if there were a way to make the 'armor' smaller and lighter, it would be a much better base to build on.

Project #3
Project Name & Creator/s: Sharkrunners, 2007, created by Area/Code (USA, est. 2005)
Project URL: http://areacodeinc.com/projects/sharkrunners/
Project Description: It is a game made for Discovery Channel's 20th anniversary that allows for oceanic research as well as shark research.
Input: You control ships in the game, while the sharks actually move according to GPS transmitters attached to their fins.
Output: The ships move in real time, and players get updates and announcements about possible shark encounters.
Process: GPS telemetry from the tagged sharks, player-controlled ships, and a web-based game that delivers the encounter updates.
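
The core mechanic described above, real shark positions meeting player-controlled ship positions, can be illustrated with a trivial distance check. This is just my sketch of the idea in Processing, not Area/Code's game code; the coordinates and threshold are made up.

// Toy version of the encounter check: each shark has a GPS position, each ship has a
// player-set position, and when the two get close enough the player is alerted.
void setup() {
  // positions as (longitude, latitude) pairs; these numbers are invented
  PVector ship  = new PVector(-70.25, 41.60);
  PVector shark = new PVector(-70.23, 41.58);

  float threshold = 0.05;                      // "close enough", in degrees (arbitrary)
  if (PVector.dist(ship, shark) < threshold) {
    println("Shark encounter! Notify the player.");
  } else {
    println("No encounter yet.");
  }
}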

Amoni B: Computational Research

3/10/14

Project 1:

Project Name & Creator/s:

M Dress by CuteCircuit (founders Francesca Rosella and Ryan Genz)

Project URL: 

http://cutecircuit.com/collections/m-dress/

Project Description: 

This is a stylish dress that operates as a mobile phone. You can make and receive calls conveniently with gestures.

Input:

An activated mobile SIM card is needed. Human hand gestures provide the responses: bringing your hand to your face answers a call, and moving your hand down hangs up.

Output:

A phone-enabled dress once the SIM is inserted. A call is picked up with the movement of the hand to the face; dialing out and hanging up as well.

Process: 

CuteCircuit created this technology using soft circuitry; it is solar powered. They created a hand-gesture mechanism. I believe that after they designed the dress, they figured out ways to hide the wires.

Statement: 

The artists are combining convenience with fewer mishaps.  CuteCircuit knows the hardship of carrying a phone when wearing clothes with no pockets, or in general. They also realize the horror of misplacing a phone. This dress offers ease of use without worry or hassle.

Critique: 

The hidden components are amazing. It's a relief and a shock that something like this is out there. However, I wonder if you would have to wear the same dress over and over again.

Project 2:

Project Name & Creator/s:

Heartbeat Hoodie by Diana Eng & Rhode Island School of Design

Project URL: 

http://www.trendhunter.com/trends/heartbeat-hoodie-by-diana-eng

Project Description: 

This hoodie takes images of what is in front of the wearer when their heartbeat increases. It documents the interesting moments.

Input:

The following are the inputs: heart rate and motion.

Output:

Outputs include camera operation and images.

Process: 

A program on a “BASIC Stamp” microcontroller handles the detection. When it senses accelerated motion or heart rate, it activates the hidden camera to capture the surroundings.
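
The trigger logic can be mocked up quickly. The real hoodie runs on a BASIC Stamp; the Processing sketch below is only an illustration of the idea, using a faked heart-rate signal, a slow-moving baseline, and a threshold crossing that stands in for "take a photo". All the numbers are assumptions.

float baseline = 70;    // resting beats per minute (assumed)
float adapt = 0.02;     // how quickly the baseline follows the signal

void setup() {
  size(200, 200);
  frameRate(10);
}

void draw() {
  float bpm = 70 + 40 * noise(frameCount * 0.05);    // fake sensor reading
  baseline = lerp(baseline, bpm, adapt);             // slow-moving average
  boolean excited = bpm > baseline + 15;             // sudden jump detected
  background(excited ? color(255, 80, 80) : color(40));
  if (excited) {
    println("snap! bpm = " + int(bpm));              // stand-in for triggering the camera
  }
}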

Statement: 

They are trying to document the times that make you most excited. It also documents when your emotions are stirred. Great for recalling moments or for documentation.

Critique: 

It seems silly to have a camera on your head, or even on a hood for that matter.  Still, it is great for alarming situations, or if you want to remember that wonderful moment you had.

Project 3:

Project Name & Creator/s: 

Scentsory Design by Hills, Latham, Geesin, Evens, Feltham, & Briggs

Project URL:

https://www.youtube.com/watch?v=G9nLCPEGzRw

Project Description: 

A variety of clothes made to benefit a person's health. The clothes release scents or chemicals based on need.

Input:

Scent, human body visuals, emotions

Output:

Scent, chemicals.

Process: 

Microfluids flow through microscopic tubing and exit through small incisions; that tubing is built into the clothing.  Liquid is sprayed on targeted areas based on the body's state.

Statement: 

They are trying to enrich life, human interaction, and health. This happens discreetly, without others knowing what is happening. Those who are sick or have problems such as asthma can benefit. It can also be used to attract people toward you or to deodorize.

Critique: 

This seems good for those who are unhealthy or want perfume all the time. I wonder what the risks are if it malfunctions. I would also like to know the specifics of the sensor capabilities.

Sean: Computational Research

Project 1:

Project Name/Creators: Blur-braincoat by Diller, Scofidio, and Renfro

Project URL: http://www.dsrny.com/#/projects/blur-braincoat

 Project Description: This project reinvents the way humans interact with each other by removing the ability for people to physically visualize another person. The project works to connect participants to a network proxy and have them interact without being able to judge based upon visual expressions.

Input: The user of the braincoat acts as the input by completing a survey of questions before interacting with other participants. The user's survey answers determine the response the braincoat emits to other users, rather than the user's own human expression.

Output: After completing a survey, the user of the braincoat interacts with other participants through the coat. The coat gives off simple digital signals such as a light to show how the user would respond. These responses could include affinity, embarrassment, or even shock.

Process: The coat works by connecting users through a network and matching people based upon their survey results. If a user is usually shy when interacting with others face to face, the coat will emulate that disposition through its light components or even sounds. The coat's main design idea is that it clouds the user's features to give an anonymous result.
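
The survey-based matching could work in many ways; one simple possibility is to treat each participant's answers as a vector and pair people whose vectors are closest. The Processing sketch below is my own illustration of that idea, not the installation's actual software, and the answer values are invented.

// Each participant answers the same questions on a 1-5 scale; the system pairs the
// visitor with whoever has the smallest squared difference across all answers.
int[] me = {1, 4, 5, 2, 3};
int[][] others = {
  {5, 1, 2, 4, 4},
  {2, 4, 5, 1, 3},
  {3, 3, 3, 3, 3}
};

void setup() {
  int best = -1;
  float bestDist = Float.MAX_VALUE;
  for (int p = 0; p < others.length; p++) {
    float d = 0;
    for (int q = 0; q < me.length; q++) {
      d += sq(me[q] - others[p][q]);         // squared difference per question
    }
    if (d < bestDist) { bestDist = d; best = p; }
  }
  println("closest match: participant " + best);
}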

 Statement: The blur-braincoat works off the concept of social relations and how we have preconceived ideas about people simply based on what we see from their physical features. The artist explains that we can judge a person’s gender, age, race, and even social standing. Acting as a type of physical social network the braincoat matches individuals not on appearance, but on ideal comparisons.

Critique: The concept of the braincoat is revolutionary for social networking, but it lacks practicality in that the artists need to conduct a questionnaire and have users wear a prosthetic suit in order for it to function. The idea could be improved if anonymity could somehow be upheld without an external garment or questionnaire.

Project 2:

Project Name/Creators: Face visualiser by Daito Manabe

Project URL: http://www.daito.ws/en/work/smiles.html

Project Description: This project works to recreate human facial expression and emotion through the use of computer programming and basic knowledge of electricity and how muscles in the face work. The design allows for control of the muscles in the face and gives off unusual facial reactions that might not be possible to recreate naturally.

Input: For this project, the input would be the synchronized electrical signals given off by a computer program. These could be modulated to allow for different patterns, and in the artist's demonstration the computer emitted signals based on music.

Output: The output of the project was the way the face reacted to the electrical stimulation. Manabe explained that the output signals would hurt his face similar to a feeling of needles on skin, but the larger issue was that his responses would sometimes inhibit sight and breathing.

Process: The process of this project is to create an array of digital signals that translate into facial expressions using electrodes attached to the user's major muscles. The project can be linked to music, giving off the illusion that the face is dancing and moving in many ways that do not seem natural.

Statement: Although the original concept was to capture Manabe's emotional expressions and translate them to another person, the hypothesis was incorrect and the artist found it was not possible to recreate emotions such as a smile without human intervention. The project was then used to show synchronized motion between two users' faces.

Critique: The concept is interesting, but strays from its original purpose. I believe that if Manabe could actually find a way to match his expressions to another user's face, the response would look much more synchronized than random facial muscle movement.

Project 3:

Project Name/Creators: XSense by Adam Danielsson, Per Nilsson, Melvin Ochsmann, Koen Van Mol, Robert Winters, Tamara Klein, and Andreas Nertlinge

Project URL: http://www.slide.nu/projects.php?id=53

Project Description: This project allows users to experience a natural phenomenon called synesthesia, the cross-association of color, sound, and visuals, through the use of a modified helmet. This is a rare occurrence in some people that may cause the brain's senses to blend together due to a chemical imbalance. The result could include hearing visuals or associating a color with a taste.

Input: The input in this project is the physical space and events going on around the user. As the user experiences sensory stimulation, the helmet converts the signals or crosses them with other signals. This includes giving the user the ability to receive sound and visualize it through the helmet's embedded LED lights.

Output: The output of the project would be the user's reaction and response, which gives a sense of spatial awareness even with the sensory inputs being crossed. The best description of the project is that it's similar to a sonar system, which allows animals to send sound waves and receive visual awareness of what is around them.

Process: The process of this project takes in environmental stimuli and converts and translates the external signals into the internal components of the helmet. Even with the external inputs crossed, the user still feels as if they can recognize the space around them.

Statement: The statement of this project is to give users feelings and senses that could not be felt without the chemical imbalance of synesthesia. It seems to show that we can still be aware with our senses crossed, as the brain has ways of interpreting these new environmental inputs.

Critique: I believe the concept is appealing, but I would like to see how the product could be used in real world scenarios such as providing sensory vision to the blind or even visual hearing for the deaf.

Sean: Computational Concept

My concept for a computational art display is a device that captures simple hand motions, allowing a user to move a digital line, projected against a wall, and create sounds much like a synthesizer. The motion of the user's hand could be set to create different wave types such as sine, saw, and square to produce different effects. Drastic changes in motion could affect different components of the sound and create different tones.
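
A first pass at this could be prototyped with the mouse standing in for hand tracking. In the Processing sketch below, horizontal position picks the wave type, vertical position sets the frequency of the drawn line, and actual audio output is left out; the mappings are just one assumption about how the controls might feel.

void setup() {
  size(800, 300);
  stroke(255);
  noFill();
}

void draw() {
  background(0);
  float freq = map(mouseY, 0, height, 1, 12);       // cycles across the screen
  int waveType = int(map(mouseX, 0, width, 0, 3));   // 0 = sine, 1 = saw, 2 = square
  waveType = constrain(waveType, 0, 2);

  beginShape();
  for (int x = 0; x < width; x++) {
    float phase = (x * freq / width) % 1.0;          // 0..1 within each cycle
    float v;
    if (waveType == 0)      v = sin(TWO_PI * phase);         // sine
    else if (waveType == 1) v = 2 * phase - 1;               // saw
    else                    v = (phase < 0.5) ? 1 : -1;      // square
    vertex(x, height/2 + v * 100);
  }
  endShape();
}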

Kenny Chiu: Computational Research

Project 1:

Project Name & Creator/s: The Muscle-Computer Interface – Desney S Tan, T Scott Saponas, Daniel Kelly and Babak Parviz

Project URL: http://research.microsoft.com/en-us/um/redmond/groups/cue/muci/

Project Description: The Muscle-Computer Interface seeks to turn the human body into a functioning mouse and keyboard. It allows a person's gestures or finger movements to control the computer's interface, as opposed to the traditional mouse and keyboard.

Input: The input of this project is the user, whose arm movements, gestures, and postures are picked up by a band worn on the arm. The input varies depending on how the fingers and arm move; even the pressure created by a finger can serve as input.

Output: When the arm band receives a signal from the user, it acts as if it were a mouse or keyboard and does what the user intends. The type of output varies, as it can replace the functions of a mouse, a keyboard, and even a game controller.

Process: By placing electrode-like wires across the arm, the user's muscle activity sends signals from the nerves into the computer, indicating the gesture used or the muscle that moved. After detecting the movement or pressure, the computer performs the intended function like a regular user interface. The exact process depends on what the interface is controlling.
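
The research itself relies on machine-learning classifiers trained on real EMG data, but the basic muscle-signal-to-command idea can be sketched with a simple threshold. The Processing mock-up below fakes the signal (holding the mouse button stands in for tensing a muscle) and fires a "click" command when the smoothed signal strength crosses a threshold; all the numbers are arbitrary.

float envelope = 0;   // smoothed strength of the (simulated) muscle signal

void setup() {
  size(400, 200);
  frameRate(30);
}

void draw() {
  // Fake EMG sample: quiet noise normally, strong bursts while the mouse is pressed.
  float sample = random(-1, 1) * (mousePressed ? 1.0 : 0.1);
  envelope = lerp(envelope, abs(sample), 0.1);

  background(envelope > 0.3 ? color(0, 180, 0) : 40);
  if (envelope > 0.3) {
    println("gesture detected -> send 'click' to the interface");
  }
}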

Statement: The Muscle-Computer Interface is built on the idea that we often cannot access our electronic devices when we need them, because of the hassle of everyday life and the things we carry. When our hands are occupied with objects or other matters, we cannot reach for our devices; the Muscle-Computer Interface solves much of that problem by letting the user's gestures become the input instead of having to handle an actual input device while we're busy.

Critique: The concept is very innovative, but it is quite primitive at its current stage, as it requires electrodes, and I doubt anyone will walk around with wires hanging off their arm all day. More importantly, it will take a while before this technology can be built into our mobile phones, which is where the greatest hassle lies, since nobody carries an entire desktop with them.

Project 2:

Project Name & Creator/s: Spinning Shaft – Alejandro and Moire Sina

Project URL: http://bermant.arts.ucla.edu/sina_spinning_shaft.htm

Project Description: The Spinning Shaft project creates a beautiful light show using neon tubes; the circular motion of the tubes creates a sort of cylinder of neon light circling about. The neon lights can also go on and off at different times, creating a spectacular light show.

Input: The neon light of the piece is part of the output, but surprisingly, part of the project's input is the user. The user can participate by activating the device or lights by touch, and can modify the lights and their rhythm with small gestures such as clapping or pressing a switch.

Output: With the neon light and the participant, the lights go on and off, which is part of the output. They also spin at a constant velocity, which leads to the final result: the spinning cylinder of light that is created.

Process: While the exact processing is unclear, it is clear that a power supply is used, and depending on how the user participates, there may be motion sensors or other types of sensors that trigger reactions from the lights. They act as switches that route power in and out of the neon tubes.

Statement: The statement isn't very clear either, since the project doesn't seem too abstract. It is pretty direct, as its purpose seems to be creating a spectacular light show through the use of moving light works.

Critique: The piece itself, while simple, is very beautiful. The ability to let participants control part of the light show gives it a wide range of lights and patterns. Despite its beauty, it is still a simple piece and doesn't have much functionality. It certainly serves its purpose well and looks very beautiful.

Project 3:

Project Name & Creator/s: A-Volve – Christa Sommerer and Laurent Mignonneau

Project URL: http://www.medienkunstnetz.de/works/a-volve/

Project Description: This project is an example of genetic art. It allows the user to construct artificial life forms by drawing their shape and anatomical features, then release them into a virtual, water-like environment and observe their lives as if they were real creatures. Users can observe a creature's capabilities and watch it evolve or produce offspring. Some can be killed or eaten, simulating the actual evolution of life in the ocean.

Input: The users are the original input, as they are the ones who create these creatures and release them into the virtual environment. They alone dictate the shape and anatomical advantages the artificial life has. Even after releasing their artificial life, users can continue to affect the creatures in various ways.

Output: The output can vary, but ultimately its effect is on the creatures. The specific action depends on the creature or the user's action. For example, one output would be a user placing a hand in the pool to stop a predator, thereby saving an artificial prey animal from it. The user can control and create these creatures as they wish, producing a variety of outputs.

Process: The actual process is difficult to determine, as the creatures most likely follow an algorithm that makes them behave like animal life. A touch screen is required, since the user interacts with the creatures. How the creatures behave may or may not be determined by their anatomy or by the users.
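
How A-Volve actually breeds its creatures isn't spelled out here, but the "produce offspring" idea can be illustrated with a toy crossover step: an offspring inherits each trait from one of two parents, with a little random mutation. The Processing sketch below is purely speculative, and the traits are made up.

// Speculative sketch (not the actual A-Volve code): mixing two parents' drawn traits.
float[] parentA = {0.8, 0.3, 0.6};   // e.g. body length, fin size, speed (invented traits)
float[] parentB = {0.2, 0.9, 0.4};

void setup() {
  float[] child = new float[parentA.length];
  for (int i = 0; i < child.length; i++) {
    // pick each trait from one parent at random, then nudge it slightly (mutation)
    float trait = (random(1) < 0.5) ? parentA[i] : parentB[i];
    child[i] = constrain(trait + random(-0.05, 0.05), 0, 1);
  }
  printArray(child);
}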

Statement: This project is a clear example of the combination of biological science and technology. The artificial life is created by technology but also has the characteristics of a living organism. The "genetics" of the artificial life are taken into account and simulated in a very accurate way. The authors certainly created a demonstration of how far artificial intelligence can go, especially when it begins replicating the actions of real animals.

Critique: The project looks to be very entertaining and well made. The technology seems very innovative for its time and can amaze even now. The algorithm necessary to create these behaviors must be quite complex. This project could be both artistic and fun, as it relies heavily on user interaction and experience. In fact, this project wouldn't be a big deal without the interaction of the users, which makes it a rather unique project.

Kenny Chiu: Computational Concept

The concept I have come up with uses pressure sensing to let a user create ripples as a projection or some type of effect on a wall or screen. The greater the pressure the user applies, the larger and higher-frequency the ripples become. This could be done with computation, since the basis of the idea is simply circles expanding over an area depending on how far the surface is pushed down.
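
A quick Processing mock-up of this is below. Since there is no pressure sensor here, how long the mouse button is held stands in for pressure: a longer press spawns more rings. The speeds and counts are arbitrary guesses at the feel.

ArrayList<PVector> ripples = new ArrayList<PVector>();  // x, y, and z used as the radius
float pressure = 0;

void setup() {
  size(600, 400);
  noFill();
  stroke(255);
}

void draw() {
  background(0);
  if (mousePressed) pressure += 0.5;     // "pressing harder" the longer it's held
  for (PVector p : ripples) {
    p.z += 2;                            // every ripple keeps expanding
    ellipse(p.x, p.y, p.z, p.z);
  }
}

void mouseReleased() {
  int rings = 1 + int(pressure / 10);    // more pressure -> more rings
  for (int r = 0; r < rings; r++) {
    ripples.add(new PVector(mouseX, mouseY, r * 15));
  }
  pressure = 0;
}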

Amoni B: Computational Concept

I would like to create a Processing sketch that uses different hues and shades of purple. As someone moves the cursor left to right, the different shades take effect (from the white highlight to the deepest color). As the cursor moves vertically, from top to bottom, the hues change, for example through lavender, violet, and periwinkle.
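
One minimal reading of this in Processing is below: mouseX controls how deep the shade is (pale on the left, fully saturated on the right) and mouseY shifts the hue within a purple band. The exact hue range is my assumption; matching the named purples would need tuning.

void setup() {
  size(600, 400);
  colorMode(HSB, 360, 100, 100);   // hue in degrees, saturation and brightness 0-100
  noStroke();
}

void draw() {
  float hue = map(mouseY, 0, height, 250, 290);   // roughly lavender-ish to violet-ish
  float sat = map(mouseX, 0, width, 0, 100);      // left = near white, right = deep color
  background(hue, sat, 100);
}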