AerForge

December 2018

Description

It begins with a point and ends with a print.

AerForge is a project about change, and how the different spaces in which we create change the creation itself. The entity with which the user interacts transitions through realms and dimensions of creation, exploring and connecting different environments.

To begin with, the AerForge experience is intangible. The user draws an imaginary line into the air in front of them, and out of thin air comes a visual. Projected on a screen in front of the user is the line they began to draw, and as the user continues to move their hand through nothing, they create something. Thin columns appear on the projection, their height matching the height of the user’s line. With a wave, the user rotates the projected image and out of lines and columns emerges a form. Though empty-handed, the user is able to create and engage with a virtual object. The user places their hands out in front of them, palms up as if asking for something. This gesture cues the virtual object to be downloaded and sent to a 3D printer. The final transformation brings AerForge into the physical world, and into the hands of the user.

Tracking Hand Movement

The Leap is able to track hands, fingers, joints, and different combinations of those. To create AerForge we tracked the user’s right index finger and both hands. The right index fingertip was used as input for the circles that made up the line the user drew, and it was also used to set the size and position of the cuboids. The left-hand input controlled an orbit camera; since orbit around the x-axis was to be limited, the x-value of the left hand’s position changed the position of the camera, and thus the view the user had of the scene. The “save” and “reload” functions were triggered by gestures in which both hands were tracked, using not only their position but also their orientation and whether they were open or in a fist. To save, the user puts both hands palms up; to reload and start over with a new design, the user makes a fist with both hands.
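
A minimal sketch of that mapping is shown below. It is an illustration rather than our actual code: the TrackedHand structure, the interpret_frame function, and the camera scaling factor are all assumptions standing in for the values the Leap reports each frame (palm position, palm orientation, open hand versus fist, and the right index fingertip).

from dataclasses import dataclass
from typing import Optional, Tuple

Vector = Tuple[float, float, float]  # (x, y, z) in the Leap's coordinate space

@dataclass
class TrackedHand:
    """Hypothetical container for the per-hand values read from the Leap."""
    palm_position: Vector
    palm_up: bool          # True when the palm normal points upward
    is_fist: bool          # True when the hand is closed
    index_tip: Optional[Vector] = None  # only used for the right hand

def interpret_frame(left: Optional[TrackedHand],
                    right: Optional[TrackedHand],
                    camera_angle: float) -> dict:
    """Map one frame of hand data to AerForge's three kinds of input."""
    actions = {"draw_point": None, "camera_angle": camera_angle, "command": None}

    # Right index fingertip: adds a circle to the drawn line and sizes a cuboid.
    if right is not None and right.index_tip is not None:
        actions["draw_point"] = right.index_tip

    # Left hand x-position: drives the orbit camera (orbit limited to one axis).
    if left is not None:
        actions["camera_angle"] = left.palm_position[0] * 0.01  # scale is arbitrary

    # Two-hand gestures: both palms up saves the model, both fists reloads the scene.
    if left is not None and right is not None:
        if left.palm_up and right.palm_up:
            actions["command"] = "save"
        elif left.is_fist and right.is_fist:
            actions["command"] = "reload"

    return actions

if __name__ == "__main__":
    # Mock frame: right hand drawing, left hand orbiting the camera.
    left = TrackedHand(palm_position=(-80.0, 200.0, 10.0), palm_up=False, is_fist=False)
    right = TrackedHand(palm_position=(60.0, 210.0, 5.0), palm_up=False, is_fist=False,
                        index_tip=(65.0, 250.0, 0.0))
    print(interpret_frame(left, right, camera_angle=0.0))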

Drawing a Shape

We drew a series of ellipses to the screen at the user’s fingertip coordinates, which allowed the user to see what they were drawing and how that changed the AerForge model. We also positioned the camera (within the code) so that the user was looking at the profile of the model, thus giving the illusion of 2-dimensionality. The ability to orbit the scene with a gesture gave the user more freedom, in the sense that they were not limited to the 2D perspective but could move away from the profile view.
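
To illustrate how a drawn profile can become a row of thin columns, the sketch below bins the fingertip samples along the x-axis and keeps the tallest drawn point in each bin; each bin then corresponds to one cuboid whose height matches the line. The bin width and the points_to_columns helper are assumptions for illustration, not the geometry code AerForge actually uses.

from typing import Dict, List, Tuple

Point = Tuple[float, float]  # (x, y) of the fingertip, as seen in the profile view

def points_to_columns(points: List[Point],
                      column_width: float = 5.0) -> List[Tuple[float, float]]:
    """Quantize the drawn line into thin columns.

    Each column is (x_center, height): the x-axis is split into bins of
    column_width, and a bin's height is the highest y the user drew in it.
    """
    heights: Dict[int, float] = {}
    for x, y in points:
        bin_index = int(x // column_width)
        heights[bin_index] = max(heights.get(bin_index, 0.0), y)

    return [((i + 0.5) * column_width, h) for i, h in sorted(heights.items())]

if __name__ == "__main__":
    # A short stroke rising then falling; prints the resulting column profile.
    stroke = [(0, 10), (3, 14), (7, 22), (12, 30), (16, 25), (21, 12)]
    for x_center, height in points_to_columns(stroke):
        print(f"column at x={x_center:.1f}, height={height:.1f}")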

3D Printing the Model

The issue with this step of the process is getting from the downloaded STL file to g-code on the printer. After being downloaded, the STL file has to be repaired (using 3D Builder), then sliced (using Cura), then sent to the printer. All of this has to be done by a person and can’t easily be automated. The standard method of sending g-code to the printer is via SD card, which can be tedious if we have to move the SD card from the printer to the laptop and back again after every print to add a new file.
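
The slicing step on its own could in principle be scripted; the sketch below wraps CuraEngine’s command-line slicer in Python. The flag names and the printer definition file are assumptions that would need checking against the installed version of CuraEngine, and the repair step in 3D Builder would still have to be done by hand.

import subprocess
from pathlib import Path

def slice_stl(stl_path: str, gcode_path: str,
              printer_def: str = "fdmprinter.def.json") -> None:
    """Call the CuraEngine CLI to turn an STL into g-code.

    The flags (-j for the printer definition, -l for the model, -o for the
    output) are assumptions based on CuraEngine's command-line slicer and
    should be verified against the installed version.
    """
    if not Path(stl_path).exists():
        raise FileNotFoundError(stl_path)
    subprocess.run(
        ["CuraEngine", "slice", "-j", printer_def, "-l", stl_path, "-o", gcode_path],
        check=True,
    )

if __name__ == "__main__":
    # Example: slice the model AerForge saved, writing the g-code next to it.
    slice_stl("aerforge_model.stl", "aerforge_model.gcode")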

However, using AstroPrint we could bypass both 3D Builder and Cura. Additionally, AstroPrint allows wireless communication between the laptop and the 3D printer, which means we could avoid transferring files through an SD card.

Influences

Air Matter is an interactive installation that allows the participant to generate virtual vases using motions drawn from traditional pottery. Its creators also used the Leap Motion Controller; however, the specifics of their hand tracking differ from ours, as does the way they used the coordinates from the Leap Controller.

Mahnoor’s Rainy Weather Controller created a visual display from physical input, which allowed for a great deal of interactivity. She used material sensors, manipulated by the user, to send a stream of inputs that altered the display. The constant user-generated input provided by the sensors is something we wanted to incorporate, though we also wanted the user to be independent of the sensor.

A previous project of mine, Nameblem, was also an influence. My method of using specific variables from a text entry to manipulate the design of a 3D model allowed the creation of a custom 3D print; the customization came from the user inputting their name, so the user did not interact directly with the design. We expanded upon this method for AerForge’s 3D modelling. Though we maintained the framework of using user input as design variables, we made a number of changes to increase customization.

Team Members

Salisa Jatweerapong

Melissa Roberts

Mahnoor Shahid

Samantha Sylvester
