Robosense 3.0

Interactive Robotic Printing Integrated with Inflatable Dynamic Printbed (2018)

Directed by Prof. Jenny Sabin

Worked with Madeleine Eggers, Chi Yamakawa, Alexander Wolkow

Robosense 3.0 was an experimental project conducted in the Sabin Design Lab at Cornell University, where I served as a research associate and lab personnel.

0.00 Abstract

This project aims to streamline a system for interaction between designers and a digital fabrication workflow tailored to their skill set - a workflow which is computationally advanced while remaining user-friendly to designers not trained in the use of code or scripting. Simultaneously, the project explores the ability of such a workflow to update dynamically - to adjust its toolpath to accommodate a changing work environment (the printbed). In doing so, Robosense 3.0 examines the place of the human user within a cooperative design space created by the inherent shortcomings of robotic fabrication setups, which, despite some ability to update their toolpaths dynamically, cannot operate unilaterally within a changing work environment. By taking the opportunity to insert human intuition into an equation which is otherwise digital in nature, a design space is created which exploits the results emergent from this interaction.

1.00 Context

This project draws on its predecessor, Robosense 2.0 (Bilotti, Norman, & Rosenwasser et al., 2018), as foundational research for the development of a dynamically updating robotic fabrication setup which also entertains the contribution of human users within an otherwise digital workflow. Robosense 2.0 explicitly explored the role of the material within a digital fabrication workspace, examining material properties both as constraints on fabrication settings and as a design driver. Robosense 3.0 seeks to contribute to this developing body of research by engaging with the nuances of cooperation between digital fabrication methods and human intuition.

2.00 Scope

Our project is concerned with the development of a dynamic work surface (printbed), the evaluation and updating of robotic movement (a toolpath), human compensation/cooperation via modulation of the extrusion rate of the material used in printing, and the autonomous updating of a toolpath and, eventually, extrusion rate.

3.00 State of the Art

Robosense (A. Moorman et al., 2016) established a Python-centered workflow through which designers with some knowledge of code and scripting could interface with digital fabrication tools, e.g., an ABB robotic arm (Fig.x). Robosense 2.0 (J. Bilotti et al., 2018) streamlined this workflow, emphasizing the role of Grasshopper within the process of generating robot code, and thus lessening the extent to which users must write Python code. Critically, Robosense 2.0 broadened the scope of the project to exploit the material properties of the medium used, i.e., clay. Not only were material properties used to inform toolpath and extrusion parameters, but their characteristics were openly explored as design elements.

Various projects have explored and employed the ability of digital fabrication processes to dynamically update their toolpaths according to changes in their environment and/or the work site. The 2013/14 ICD/ITKE pavilion (Fig.x) used a KUKA-driven end effector to deposit epoxy-resin-impregnated carbon fibre along structurally determined paths on the inside of an ETFE formwork. Because of the pressure applied to the formwork, its precise position is dynamic with respect to the effector. In order to facilitate deposition onto this moving work surface, a dynamically updating workflow was designed which iteratively adjusted the robotic toolpath according to where the work surface was at any given time. The 2018 ETH project Spatial Timber Assemblies (Fig.x) explores a case study at the pavilion scale in which robots with dynamically updating toolpaths cooperate with humans who complete their own critical tasks. This requires not only a digital fabrication system sophisticated enough to allow the robots (in this case two) to update their paths according to changes in the work site, but also the provision of tasks to which humans are particularly suited, or to which robots are unsuited. Tasks are partitioned between those for the robot and those for the human.
What Robosense 3.0 seeks to investigate is a creative role for humans within a collaborative environment including robots which complete their own essential tasks. Through acknowledging the limitations of digital fabrication tools, we hope to exploit what may be perceived as a gap in performance as an opportunity for creative expression.

4.00 Aim

Our aim is to exploit the inherent delay between changes in a fabrication space and the rapidity with which a dynamically updating robotic fabrication system can adapt to those changes, carving out a design space, or niche, within which humans can contribute their natural abilities to improve the performance of such a system. By creating a design space within which robots and humans collaborate, novel design outcomes might be achieved.

Our investigations involve:

Development of a dynamic work surface (printbed)

Evaluation and updating of robotic movement (toolpath)

Human compensation/cooperation via modulation of extrusion rate

Autonomous toolpath updating and extrusion control

5.00 Methods

5.10 End Effectors

A series of adhesive extruders were designed, built, and tested for their ability to extrude adhesive in a consistent and predictable way, and for their compatibility with modulation by a human user during extrusion tests.

5.11 Syringe Extruder

This deposition setup for syringe-based extrusion uses a stepper motor to depress a piston, squeezing out a bead (Fig.x). The 200 ml custom syringe end effector is attached to the side of the robot, which allows us to deposit large amounts of material with minimal travel distance. The limitation of this extruder was that it was controlled by an Arduino offering little to no control over extrusion speed.

5.12 Stepper-driven Extruder

In an effort to move forward with tests as quickly as possible, a design was taken from previous research (Biosynthetic Robotic Fabrication, Cornell AAP, Spring 2016. Instructor: Jenny E. Sabin; Students: Paola Cuevas & Natalie Hemlick) and assembled. As per the design, a stepper motor was used to drive extrusion, and a Grasshopper script was written in order to drive it (Fig.x).

5.13 Motor-driven Extruder

Following issues with our ability to control the consistency of extrusion through the use of a stepper motor, a variable-speed electric motor was used instead. With this setup, extrusion speed could be set at a constant rate for the duration of a test, and (unlike with the stepper motor) extrusion was uninterrupted (Fig.x).

5.14 Pneumatic Extruder

In an effort to reduce the complexity of the extrusion setup, and to endow the extrusion apparatus with the ability to be modulated manually, a pneumatic caulk gun (ChemiCar, USA) was purchased and fixed to the robot using a plywood frame (Fig.x). Extrusion pressure was modulated via a trigger placed between the compressor and the extruder (Fig.x). Pressure at the compressor was set to 15 psi, and the pressure reaching the extruder was modulated through manual actuation of the trigger. The caulk extruder itself was fitted with an emergency pressure release valve set to 150 psi in order to prevent pressure buildup, and potential explosion of the extruder, should a blockage occur. For our most recent tests, the plywood framing for the extruder was reduced to minimal dimensions in order to reduce the likelihood of its colliding with the printbed and to allow the robot itself more freedom of movement.

5.20 Dynamic Printbed

5.21 Background

The printbed designed here draws from precedent thesis research by Gergana Rusenova (ITECH, University of Stuttgart; Supervisors: A. Menges, K. Dierichs, E. Baharlou, 2015), who designed, constructed, and employed a dynamic formwork for the creation of volumes within an aggregate of designed granular particles. When inflated, the balloons provide a void within the formwork around which the particles aggregate, forming structure. When deflated, the aggregate is left to stand on its own. In our design, an air compressor is connected to a manifold, branching into four tubes, each connected in turn to a balloon. In the middle of each tube is a solenoid valve connected to an Arduino controlling its on/off state. This setup allows the inflation/deflation of individual balloons beneath an elastic textile cladding which forms a lofted surface between the underlying balloons.

5.22 Physical Construction

The final build of the dynamic printbed is made up of metal struts, balloons, a spandex membrane, and a system of custom-printed balloon holders, tubes, solenoids, and an air compressor. The UNISTRUT strut-channel framing is strong enough to hold the fabric in tension and allows for a fully adjustable balloon bed. The solenoids control the airflow that regulates the inflation and deflation of the balloons under the spandex. The fabric thus conforms to the elastically growing balloons beneath it and is easily replaceable after each test.

5.23 Printbed/Solenoid Control

To control a solenoid valve with the Arduino, the corresponding pin is simply set HIGH for the appropriate time. However, the solenoid operates at a different voltage from the Arduino; connecting the two directly would damage the board. A TIP120 transistor is therefore needed to act as a bridge. The TIP120 allows the Arduino's lower DC voltage (5 V) to switch the higher voltage (12 V) that drives the solenoid valve. It can be considered a switch in which applying current to the base (B) allows current to flow between the collector (C) and emitter (E). The flyback diode connected across the solenoid (1N4007) allows current to flow in only one direction: when the transistor switches off, the solenoid's coil attempts to keep current flowing, and the diode feeds this current back through the coil until it dissipates. Since the transistor does the heavy work in the circuit, the code is relatively simple. When an Arduino pin is set HIGH, the transistor's collector conducts to its emitter, which activates the solenoid valve. When using Grasshopper to control the solenoid valves, we send a series of digital signals (HIGH or LOW) to a serial port, which the Arduino reads to activate the solenoids.
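The computer-to-Arduino link can be reduced to writing one character per solenoid over the serial port. The sketch below is a minimal illustration under an assumed framing (one '1'/'0' per valve, newline-terminated); the actual firmware protocol used in the project may differ.

```python
def encode_solenoid_states(states):
    """Encode per-valve on/off states as one serial command line.

    Assumed protocol (illustrative only): '1' = open (inflate),
    '0' = closed (deflate), one character per solenoid, newline-
    terminated so the Arduino can read a full frame at a time.
    """
    return ("".join("1" if s else "0" for s in states) + "\n").encode("ascii")


def send_states(port, states):
    """Write the encoded command to an open serial-port-like object
    (e.g. a pyserial Serial instance, or anything with .write())."""
    port.write(encode_solenoid_states(states))
```

On the Arduino side, the matching sketch would read one line per frame and set each TIP120 base pin HIGH or LOW accordingly.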

To embody the vision of human input, all solenoid valves can be controlled by a gesture sensor connected to a computer as the input device, generating HIGH and LOW values. When the user lifts their hands away from the work surface, the sensor outputs HIGH, which inflates the balloon; when the user's hands move closer to the work surface, the sensor outputs LOW, which deflates the balloon.
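This gesture mapping amounts to a threshold on sensed hand height. A minimal sketch follows; the 200 mm threshold is an illustrative assumption, not a value recorded by the project.

```python
# Hypothetical threshold (mm) separating a "lifted" hand from a "close"
# one -- illustrative only; the project's actual calibration may differ.
HAND_THRESHOLD_MM = 200.0


def gesture_to_signal(hand_height_mm):
    """Map sensed hand height above the work surface to a digital signal.

    Returns "HIGH" (inflate the balloon) when the hand is lifted beyond
    the threshold, and "LOW" (deflate) when it moves closer.
    """
    return "HIGH" if hand_height_mm > HAND_THRESHOLD_MM else "LOW"
```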

5.30 Sensing

The pipeline for creating a responsive robotic toolpath involved five key phases: establishing a simple toolpath to be altered, reading the environment, translating coordinate systems, evaluating the toolpath curve, and refreshing the toolpath values.

Phase one was to establish a simple toolpath, such as a line tracing across the printbed. The robot was to follow this line, continually changing its z-value to correspond to the height of the balloon it read at its current location along its toolpath. Phase two consisted of using a Kinect to read the environment as a point cloud, to then send to the robot as its means of perceiving its target - in our case, the continually varying printbed topography. Phase three involved remapping the points read by the vision system - at an arbitrary location and scale in space - to the real-world Cartesian coordinate system in which the robot would be operating. This step placed a QR code on each corner of the printbed and remapped the field of points read by the Kinect to the four corners of a box in space in front of the robot, where we could then accurately match the digital toolpath to the real bed. Phase four evaluated the toolpath curve according to time and travel speed: the robot's XY position along the toolpath curve at t=0, t=1, etc. A straight, two-dimensional toolpath curve in the XY plane was generated by intersecting the point cloud with a vertical plane coinciding with the simple toolpath established in phase one, interpolating between the intersection points, and evaluating the resultant (now three-dimensional) curve along its length. This had the effect of projecting a line onto an actively updating point cloud. At its current location along the line, the robot would receive a z-value read from the point cloud and update its current target z-value accordingly, actively adjusting to the height of the bed. Phase five performed the active updating of the toolpath in the robot's proprietary software.
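The corner-based remapping of phase three, and the z-lookup of phase four, can be sketched in a few lines. This is a simplified illustration assuming the QR markers yield an axis-aligned bounding rectangle in the sensor's frame (the real pipeline worked on a full Kinect point cloud):

```python
def remap_xy(p, src_min, src_max, dst_min, dst_max):
    """Remap a point's XY from the sensor's bounding rectangle (found
    via the QR markers at the printbed corners) to the robot's real-world
    work-area rectangle. Axis-aligned rectangles are assumed here."""
    x, y, z = p
    u = (x - src_min[0]) / (src_max[0] - src_min[0])  # normalized position
    v = (y - src_min[1]) / (src_max[1] - src_min[1])
    return (dst_min[0] + u * (dst_max[0] - dst_min[0]),
            dst_min[1] + v * (dst_max[1] - dst_min[1]),
            z)


def z_at(points, x, y):
    """Return the z of the remapped cloud point nearest to (x, y) --
    the value used to update the robot's current target height."""
    return min(points, key=lambda q: (q[0] - x) ** 2 + (q[1] - y) ** 2)[2]
```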
Using Python scripts altered from Robosense 2.0 (Bilotti, Norman, & Rosenwasser, 2018), the z-value was sent out of Grasshopper into a text file, to be retrieved by a separate Python script, which unpacked the coordinates sent out in RAPID code and replaced the current z-value with the contents of the text file - the new z-value - at 50 ms intervals. The robot's RAPID code targets actively update using the open-source software OpenABB, and the robot is able to responsively change its height based on its "sensed" environmental inputs. Unfortunately, phase five ultimately had too many issues to be fruitful, so controlled, hardcoded tests were conducted using the same principles outlined above to simulate active updating and explore the gap between perceiving and reacting.
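The substitution step of this 50 ms loop can be sketched as a string rewrite on a RAPID robtarget. The target format below is the typical robtarget form (first bracketed triple = [x, y, z]); the actual project scripts are not reproduced here.

```python
import re


def replace_z(rapid_target, new_z):
    """Replace the z component of a RAPID robtarget position string,
    e.g. "[[800,0,120],...]" becomes "[[800,0,135.2],...]".

    Assumes the first bracketed numeric triple is the [x,y,z] position,
    as in a standard robtarget -- an assumption for illustration.
    """
    def sub(m):
        return "[{},{},{}]".format(m.group(1), m.group(2), new_z)

    return re.sub(r"\[\s*([-\d.]+)\s*,\s*([-\d.]+)\s*,\s*([-\d.]+)\s*\]",
                  sub, rapid_target, count=1)
```

In the project's loop, the new z would be read from the shared text file every 50 ms and substituted into the current target before it is sent to the robot.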

5.40 Human Contribution

Fig.x In updating its toolpath in accordance with changes in the printbed's topology, some degree of error or delay is always manifested in the robot's actual resultant movement. As the rate at which the printbed changes increases, this delay increases. The delay manifests as a growing vertical displacement between the effector and the work surface. In turn, this results in a mismatch between the rate at which adhesive is extruded and the vertical displacement. When the extrusion rate is too high for the displacement, the material curls and piles on itself; to achieve a straight bead, the extrusion rate should be reduced. The system as designed has no capacity to reason through this discrepancy or to react accordingly - but a human is able to adjust the extrusion rate intuitively. As the robot passes over the work surface with constantly changing displacements in z, a human modulates the extrusion rate accordingly, compensating for the robot's inaccuracy.

5.50 Preliminary Studies

Fig.x An adhesive bead was laid over a work surface. In this case the default extrusion rate was too high for the tool's displacement from the surface - the material piled and curled on itself. When a human modulates the extrusion rate accordingly (and also in accordance with inconsistencies in displacement), a straight, consistent bead can be achieved. A straight, consistent bead might not always be desirable, but the fact that it can be achieved demonstrates the efficacy of human input.

5.60 Simulation and Concluding Studies

In order to simulate an accelerating dynamic printbed, successive tests were conducted with an identical toolpath where each iteration used a larger vertical displacement from the work surface (Fig. x - x). All other variables were kept constant.

Following the same principles as the updating toolpath, the script used to calculate the toolpath was based on updating points along the work surface. The toolpath line was generated by simulating a cloth draped over two inflating and deflating spheres, and then intersecting the draped surface with a plane to create a single-line toolpath. As the toolpath curve changes, the resultant curves are recorded at 1-second intervals. The curves are parametrized in the direction of travel, allowing us to extract the point along curve 0 at t=0 s / dx=125 px, curve 1 at t=1 s / dx=250 px, etc. (Fig.x)



The final toolpath curve is interpolated from the points generated. When executing the hardcoded toolpath and the dynamic bed at the same time, the robot speed is synchronized to bed inflation/deflation via trial and error. To simulate a more rapidly changing environment, the z-height of the toolpath is moved upward incrementally to artificially produce the typical z-height gap between sensing and reacting. In this gap, the human can intervene by modulating extruder pressure to ensure bead consistency. (Fig.x)
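The time-indexed sampling above can be sketched as follows: from the curve recorded at t=i, take the point at the robot's travel distance at that instant, then interpolate the final toolpath through those points. The constant travel speed is an illustrative assumption standing in for the trial-and-error synchronization described above.

```python
import math


def point_at(curve, dist):
    """Return the point at arc length `dist` along a polyline `curve`
    (a list of (x, y, z) tuples), by linear interpolation."""
    travelled = 0.0
    for a, b in zip(curve, curve[1:]):
        seg = math.dist(a, b)
        if travelled + seg >= dist:
            f = (dist - travelled) / seg  # fraction along this segment
            return tuple(a[i] + f * (b[i] - a[i]) for i in range(3))
        travelled += seg
    return curve[-1]  # past the end: clamp to the last point


def sample_toolpath(curves, speed):
    """From curves recorded at 1 s intervals, take curve i's point at
    travel distance i * speed -- the robot's position at t = i s."""
    return [point_at(c, i * speed) for i, c in enumerate(curves)]
```

The final executed toolpath is then the curve interpolated through the points this sampling returns.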

6.00 Results

Tests were done for each extruder type, and the pneumatic assembly described (5.14) was additionally tested over a dynamic printbed (see 5.20).

6.10 Syringe extruder

The syringe extruder proved difficult to control in many respects. The liquid-plastic materials we tested were not viscous enough and dripped over the balloon, such that the robot's original path was hardly reflected on the balloon. The material would then harden, rendering the syringe unusable. Thus, even if the position and height of the bed were perfectly calibrated, our results were not accurate. In addition, because the motor was controlled by an Arduino via Grasshopper - a completely separate setup from the robot - the two systems were never perfectly aligned. The way the extruder was built made it necessary for it to be perfectly level for the piston to lower correctly, which it never truly achieved. The physical balloon tests that resulted from this extruder were both inaccurate and inconsistent because of how little control it provided.

6.20 Stepper-driven Extruder

Results from the stepper-driven extruder demonstrated inconsistency in extrusion. Because the motor runs only in increments delivered from a Grasshopper script, extrusion happens only in bursts. The results show this: adhesive is extruded in quick bursts, interrupted by periods of no extrusion (Fig.x), until the code can be manually reset and run again. Control of this setup is cumbersome and unpredictable. Additionally, this design was plagued with physical problems: several components broke under the pressure of extrusion and needed to be repaired using two-part epoxy (Fig.x).

6.30 Motor-driven Extruder

When the same extruder was revised to use an electric motor, a constant extrusion speed could be selected. This resulted in an even pattern of extrusion, but did not give the user the opportunity to modulate extrusion speed during a test (Fig.x).

6.40 Pneumatic Extruder

When the pneumatic setup (see 5.14) was employed, the adhesive extrusion rate could be modulated by hand. More curling of the bead was observed as the displacement between the extruder and the work surface was increased, and the ability of a human contributor to bring the bead under control (achieve a consistent path) decreased proportionately. There is a transition period between when a human begins to modulate the extrusion rate and when the bead straightens. As displacement increased, the point at which the bead straightens - despite modulation starting at the same time in each trial - moved further along the total path length (Fig.x).

7.00 Discussion

When we acknowledge the limitations of digital fabrication workflows, we open the doors to opportunities for humans to express themselves creatively. That is, when we allow robotics to handle the complex parameters of the worksite which computational systems handle well, we free the more nuanced, physical parameters for human control; we furnish a previously inaccessible venue within which humans might capitalise on their intuitive sense of space, proportion, and material. Maybe we don't want our machines to control everything.

In developing this study, the importance of a learning curve on the part of the human contributor was noted. Although this approach in theory furnishes an opportunity for human contribution, some degree of experience or learning on their part is essential before that contribution is effective. In our case, the human participant needed time and experience in learning how to modulate adhesive extrusion as desired. However, it might be argued that a human’s ability to learn - in this case - far exceeded a robotic system’s ability to do the same, bolstering the argument for setting aside some tasks for humans.

Although the artistic or creative implications of this proposal might be evident, practical or everyday implications might also emerge; this approach might facilitate workflows designed to allow craftsmen or construction workers to work more naturally or safely within complex and/or changing work environments which could be controlled by robotic systems.