Doctor of Philosophy (PhD)
An experimental study was conducted to investigate the feasibility of integrating the Soar cognitive architecture with a robotic system to aid in the disassembly of electronic waste (e-waste). The study proceeded in stages: first, the sensors and tools that composed the system were characterized individually, and then the Soar cognitive architecture was integrated to assess how it improved the system's performance. Given the growing volume of e-waste and its adverse effects on human health, a robotic system that can aid in its disassembly is essential.
The first part of this study investigated the performance of three models of the Microsoft Kinect sensor using the OpenNI driver from PrimeSense. The study explored the accuracy, repeatability, and resolution of each Kinect model's ability to determine the distance to a planar target. An ANOVA was performed to establish whether the Kinect model, the operating temperature, or their interaction were significant factors in the Kinect's ability to calculate the distance to the target. Gauge blocks of different sizes were also used to test how well a Kinect could reconstruct precise objects, and machinist blocks were used to examine how accurately the Kinect could reconstruct objects set up at an angle and determine the location of the center of a hole. All the Kinect models could determine the location of a target with a low standard deviation (less than 2 mm), and at close distances the resolution of every model was 1 mm. The ANOVA showed that the best-performing Kinect at close distances was model 1414, and at farther distances was model 1473. The internal temperature of the Kinect sensor influenced the distance it reported. Using different correction factors, the Kinect could determine the volume of a gauge block and the angles at which machinist blocks were set up with less than 10 percent error.
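The accuracy and repeatability metrics described above can be sketched in a few lines. This is an illustrative computation only: the readings and gauge-block dimensions below are invented numbers, not measurements from the study.

```python
from statistics import mean, stdev

def characterize_readings(readings_mm, true_distance_mm):
    """Summarize repeated depth readings against a known target distance.

    Returns (bias, repeatability): the mean error relative to the true
    distance, and the sample standard deviation of the readings.
    """
    bias = mean(readings_mm) - true_distance_mm
    repeatability = stdev(readings_mm)
    return bias, repeatability

def percent_error(measured, nominal):
    """Percent error of a reconstructed quantity (e.g. gauge-block volume)."""
    return abs(measured - nominal) / nominal * 100.0

# Hypothetical repeated readings (mm) of a planar target at 1000 mm.
readings = [1001, 1000, 999, 1002, 1000, 1001]
bias, spread = characterize_readings(readings, true_distance_mm=1000)

# Hypothetical reconstructed vs. nominal gauge-block volume (mm^3).
vol_err = percent_error(measured=7980.0, nominal=8000.0)
```

With these invented numbers, `spread` is about 1.05 mm (under the 2 mm repeatability reported in the study) and `vol_err` is 0.25 percent (under the 10 percent error bound).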
After the Kinect's performance was characterized, the study investigated the performance of an automated robotic system that used a combination of vision and force sensing to remove screws from the backs of laptops. This system used two webcams, one fixed above the robot and one mounted on it, as well as a sensor-equipped (SE) screwdriver. Experimental studies were conducted to test the performance of the SE screwdriver and the vision system, varying the internal brightness settings of the webcams, the method used to illuminate the workspace, and the color of the laptop case. A localized light source, with the brightness setting increased as the laptop case became darker, produced the best results. In this study, the SE screwdriver successfully removed 96.5% of the screws from laptops.
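The guideline above, that darker laptop cases call for higher webcam brightness settings, can be expressed as a simple lookup heuristic. The function below is a hypothetical sketch: the darkness scale and the specific brightness values are assumptions, not settings from the study.

```python
def choose_brightness(case_darkness, settings=(64, 128, 192, 255)):
    """Pick a webcam brightness setting from a fixed ladder of values.

    case_darkness is in [0.0, 1.0], where 1.0 is the darkest laptop case;
    darker cases map to higher brightness settings, per the study's finding.
    """
    index = min(int(case_darkness * len(settings)), len(settings) - 1)
    return settings[index]
```

For example, a light-colored case (darkness 0.1) would select the lowest setting, while a near-black case (darkness 0.9) would select the highest.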
With a method established to locate and remove screws from a laptop automatically, along with guidelines for finding holes based on laptop color and camera brightness, the study added the Soar cognitive architecture to the system. Soar's long-term memory module, semantic memory, was used to remember information about laptop models and screw holes. The system was trained on multiple laptop models, and the way Soar was used to facilitate screw removal was varied to determine the system's best performance. In all cases, Soar could determine the correct laptop model and the orientation in which it was placed in the system. Remembering hole locations decreased trial times for all laptop models by at least 60%. The system performed best when the number of training trials used to explore circle locations was limited, as this decreased the total trial time by over 10% for most laptop models and orientations.
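The speedup from remembering hole locations comes from skipping the camera-based search once a (model, orientation) pair has been seen. The sketch below is a toy Python analogue of that recall-or-search pattern; Soar itself stores these as semantic memory elements and productions, not Python dicts, so every name here is illustrative.

```python
class HoleMemory:
    """Toy analogue of Soar's semantic memory for screw-hole locations."""

    def __init__(self):
        # (model, orientation) -> list of (x, y) hole coordinates
        self._store = {}

    def recall(self, model, orientation):
        return self._store.get((model, orientation))

    def learn(self, model, orientation, holes):
        self._store[(model, orientation)] = list(holes)

def locate_holes(memory, model, orientation, search_fn):
    """Return remembered hole locations when available; otherwise run the
    slow vision search once and store the result for future trials."""
    holes = memory.recall(model, orientation)
    if holes is None:
        holes = search_fn()  # expensive camera-based exploration
        memory.learn(model, orientation, holes)
    return holes
```

On the first trial for a given laptop model and orientation the search runs; on every later trial the locations are recalled directly, which is the mechanism behind the reported reduction in trial times.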
DiFilippo, Nicholas M., "Framework for the Automated Disassembly of Electronic Waste Using the Soar Cognitive Architecture" (2016). Open Access Dissertations. Paper 514.
Available for download on Friday, November 30, 2018