Robotics and Tangible Computing
One of my early efforts was the Bikeswipe project. Along with Dr. Troy Messina of the Centenary College Department of Physics, and students Richard Lopez '11 and Roland Womack '10, we designed an automated bike checkout system to provide a convenient and secure way to use the campus Green Bikes. With one swipe of an ID at any bike lock station on campus, a student, staff, or faculty member can take or return RFID-tagged keys for bikes, with every transaction recorded and validated over a wireless connection to a remote SQL database. We spent Summer 2008 laying the groundwork for our design, and in Fall 2008 we constructed and programmed a working prototype. The implementation was mature enough for a remote station deployment, and Richard Lopez and Roland Womack presented their work at the Centenary Student Research Forum in April 2009. Dr. Messina and I are currently re-engineering this project around an Arduino microcontroller and simplifying many of its components for publication as a Make:Online project.
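The core of the checkout workflow is the transaction record: each swipe either checks a key out or returns it, depending on the key's last logged action. The sketch below illustrates that logic using an in-memory SQLite database as a stand-in for the project's remote SQL server; the table layout, column names, and `swipe` function are illustrative assumptions, not the actual Bikeswipe schema.

```python
import sqlite3

# In-memory stand-in for the remote SQL database; the schema here is
# hypothetical, chosen only to illustrate the checkout/return logging.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE transactions (
        card_id  TEXT NOT NULL,     -- campus ID swiped at the station
        key_rfid TEXT NOT NULL,     -- RFID tag on the bike key
        action   TEXT NOT NULL,     -- 'checkout' or 'return'
        ts       TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    )
""")

def swipe(card_id, key_rfid):
    """Record a swipe: if the key's last logged action was a checkout,
    this swipe returns it; otherwise it checks the key out."""
    row = conn.execute(
        "SELECT action FROM transactions WHERE key_rfid = ? "
        "ORDER BY ts DESC, rowid DESC LIMIT 1", (key_rfid,)).fetchone()
    action = "return" if row and row[0] == "checkout" else "checkout"
    conn.execute(
        "INSERT INTO transactions (card_id, key_rfid, action) "
        "VALUES (?, ?, ?)", (card_id, key_rfid, action))
    conn.commit()
    return action
```

Because every swipe is appended rather than updated, the log doubles as an audit trail: the current state of any key is simply its most recent row.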
Separately, Bradlee Robertson '10, an English major and computer science minor, and I designed a Java controller for a virtual Urban Search and Rescue robot. In the RoboCupRescue task, a disaster scenario is simulated in Unreal Tournament 2004, and robots are launched into the virtual world to map the state of the building and to identify possible victims in need of rescue. We used this controller to develop artificially intelligent robots capable of navigating test scenarios. Over two semesters we continued our research, augmenting the controller to support interactions among multiple robots and completing an exploration of clustering algorithms for detecting victims.
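One way to see why clustering matters here: a robot reports many noisy sightings of the same victim as it moves, so nearby reports must be merged into a single detected victim. The sketch below shows one simple approach, a greedy distance-threshold clustering over 2D sighting coordinates; it is an illustrative assumption, not the specific algorithm from our study.

```python
import math

def cluster_sightings(points, radius=2.0):
    """Greedy distance-threshold clustering: each reported sighting
    joins the first cluster whose centroid lies within `radius` of it;
    otherwise it starts a new cluster. Returns (centroid, members)
    pairs, one per suspected victim."""
    clusters = []  # each entry: [sum_x, sum_y, list_of_members]
    for x, y in points:
        for c in clusters:
            cx, cy = c[0] / len(c[2]), c[1] / len(c[2])
            if math.hypot(x - cx, y - cy) <= radius:
                c[0] += x
                c[1] += y
                c[2].append((x, y))
                break
        else:  # no cluster was close enough
            clusters.append([x, y, [(x, y)]])
    return [((c[0] / len(c[2]), c[1] / len(c[2])), c[2]) for c in clusters]
```

A single pass like this is order-dependent but cheap enough to run continuously as sightings stream in from multiple robots.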
This robotics research continued with students Eren Corapcioglu '12 and Kathryn Hardey '12, along with colleague Dr. Matthew Jadud and his student Molly Mattis at Allegheny College in Meadville, PA. The students were supported through a Collaborative Research Experience for Undergraduates grant, funded by the NSF. Our project investigated the feasibility of using native parallel languages for the construction of effective robotic controllers and the evolution of new controllers using genetic programming techniques. Our intention was to qualify and quantify the limits and benefits of this approach, and to lay a potential foundation for evolving robotic controllers to the more complex (and true-to-life) RoboCupRescue competitions.
Our research proceeded in two phases. First, we immersed ourselves in a specific, real-world robotics task: finding and extinguishing a candle within a physical model of a home environment for the Trinity College Fire Fighting Home Robot Contest (TCFFHRC). This competition attracts teams from around the country to create robots capable of responding to an alarm and extinguishing a candle in a maze that resembles a house. Kathryn and Molly collaborated remotely and built an Arduino-based robot that placed 15th out of 41 contestants in the Senior Division of the contest. Kathryn presented a poster at the 2011 Centenary Student Research Forum, as did Molly Mattis at Allegheny.
Second, we transitioned from hand-crafted controllers based on occam and the subsumption architecture to layered control systems that we evolve using genetic algorithm techniques. We showed how process networks written in dataflow languages can serve as a basis for evolutionary robotics, where the level of controller complexity can be easily tuned. Our work examined the evolution of our prior process network to improve a robot's performance on a task requiring the coordination of multiple simultaneous behaviors. We employed an evolutionary algorithm over the basic control processes to improve performance on the locomotion and object-avoidance aspects of the fire-fighting task, studied various levels of mutation and crossover to find settings for improved performance, and evolved controllers capable of outperforming our hand-crafted solution on the physical task.
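The generic shape of such an evolutionary loop, with selection, crossover, and mutation rates as tunable parameters, can be sketched as follows. This is a minimal illustration over plain parameter vectors; in the actual work the genome encoded a layered occam process network, and all rates, sizes, and the fitness function here are assumptions for the sketch.

```python
import random

def evolve(fitness, genome_len=6, pop_size=20, generations=40,
           mutation_rate=0.1, crossover_rate=0.7, seed=1):
    """Minimal generational genetic algorithm over real-valued
    parameter vectors, with elitism, tournament selection,
    one-point crossover, and Gaussian per-gene mutation."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]                      # elitism: keep the best two
        while len(next_pop) < pop_size:
            # tournament selection: best of three random individuals
            a, b = (max(rng.sample(scored, 3), key=fitness)
                    for _ in range(2))
            child = list(a)
            if rng.random() < crossover_rate:      # one-point crossover
                cut = rng.randrange(1, genome_len)
                child = a[:cut] + b[cut:]
            for i in range(genome_len):            # per-gene mutation
                if rng.random() < mutation_rate:
                    child[i] += rng.gauss(0, 0.2)
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)
```

Varying `mutation_rate` and `crossover_rate`, as we did in the study, trades off exploration of new controllers against exploitation of ones that already perform well.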
Kathryn and Eren presented their work in progress at the Summer Science Seminar Series and at the 2012 Centenary Student Research Forum. In addition, our work was accepted as a full paper at GECCO 2012, a top conference on evolutionary computation, with a 30% acceptance rate.
I began the Pherophone project in Summer 2010 with three students: Nolan Baker '10, Jacob Jennings '12, and Kathryn Hardey '12. Pherophone is based on insect communication, in which ants leave temporary, anonymous messages to help guide and coordinate group activity. Our application brought this model to the smartphone: users drop short text messages at selected locations and detect messages left by others. The app also allowed us to collect data and explore our users' communication patterns so that we could understand the benefits and weaknesses of this unique model of communication. The local media picked up this project, leading to interviews with three television reporters and a correspondent from the Shreveport Times.
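The two defining behaviors of this pheromone model are proximity and evaporation: a message is visible only near where it was dropped, and only while it is fresh. The sketch below illustrates both with a haversine distance check and a time-to-live; the message format, radius, and TTL are assumptions for illustration, not Pherophone's actual parameters.

```python
import math
import time

MESSAGE_TTL = 3600  # seconds; messages "evaporate" like pheromone trails

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearby_messages(messages, lat, lon, radius_m=50.0, now=None):
    """Return still-fresh messages dropped within radius_m of the user.
    Each message is a dict: {'text', 'lat', 'lon', 'dropped_at'}."""
    now = time.time() if now is None else now
    return [m for m in messages
            if now - m["dropped_at"] < MESSAGE_TTL
            and haversine_m(lat, lon, m["lat"], m["lon"]) <= radius_m]
```

Filtering by freshness before distance keeps the expensive trigonometry off messages that have already evaporated.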
Our work inspired a separate collaboration with Dr. Matthew Jadud of Allegheny College on the development of course materials for our CSC 234 Data Structures courses. My Pherophone experience led us to incorporate Android OS programming into this course as a motivating framework. Working again with my student Jacob Jennings, we created ten exercises in which students implement standard data structures within Android implementations of card and dice games. We reported on the preliminary success of our approach in a paper presented at the Smartphones Across the Curriculum (SMACK) workshop in May 2011 in Honolulu, Hawaii. I continue to refine and augment this approach, which has led to an infusion of Android across my curriculum: a homework project on steganography in CSC 450 Cryptography, student projects on genetic programming for art in CSC 440 Artificial Intelligence, and, this semester, a revision of my CSC 310 Database course in which students program mobile applications for community partners in the local area.
In parallel with this work, I began a conversation with Dr. Michael Rogers of Northwest Missouri State concerning the differences and similarities between iPhone and Android development from the perspective of computer science educators. Our goal was to compare and contrast two mobile operating systems that can be used for classroom teaching: iOS, Apple's platform, is very popular but has a steep learning curve for developers, while Android's share is smaller but growing and its threshold for entry is lower. We summarized our findings in a submission to the ACM SIGCSE 2011 conference on computer science education, where this work was accepted and presented in Dallas, TX, in March 2011. This highly regarded conference attracts over 2,000 educators from across the world each year and had a paper acceptance rate of 34%. In Summer 2011, we continued to develop more complicated apps, and we presented a joint workshop at SIGCSE 2012 demonstrating the Model-View-Controller software pattern on both platforms. The acceptance rate for workshops that year was 46%.
Over Summer 2011, I recruited students Michael Hoppe '14 and Gerhardt Funk '13 to implement another location-based Android application building on the initial work of Pherophone. We developed a beta version of Cogmality, an augmented-reality game in which players collect resources (wood, coal, wool, water, etc.) by moving through the world to local businesses found in Google Places, then combine them to create useful in-game objects. Our beta testers were very excited by the concept, and we are now developing the full version of the game, in which players who have made sufficient progress can also create their own items and submit them for use by others. We plan to release the game on the Android Market in Fall 2012 and analyze usage patterns to understand the formation of communities and the dissemination of knowledge within this domain.
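The combining mechanic described above amounts to recipe matching over a multiset of collected resources. The sketch below illustrates one way to model it with Python's `collections.Counter`; the recipes and item names are invented for illustration and are not Cogmality's actual content.

```python
from collections import Counter

# Hypothetical crafting recipes, mapping an item to the resources it
# consumes; Cogmality's real recipes are not reproduced here.
RECIPES = {
    "torch":   Counter({"wood": 1, "coal": 1}),
    "blanket": Counter({"wool": 3}),
}

def craft(inventory, item):
    """Try to craft `item`: if the inventory covers the recipe,
    return the remaining inventory; otherwise return None."""
    need = RECIPES[item]
    if any(inventory[r] < n for r, n in need.items()):
        return None                      # missing ingredients
    return inventory - need              # Counter subtraction drops zeros
```

Representing inventories as `Counter` objects makes the "do I have enough?" check and the consumption step one comparison and one subtraction each.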
I am currently researching ways to improve mobile user interface designs using EEG data, a project titled Neuraid. Don Adley '13 and I plan to investigate the creation of a software system that is capable of simultaneously tracking eye-movement and reading EEG information from users while they are interacting with mobile applications. The collected data will then be analyzed to detect when and where the user was alternately comfortable and frustrated with the use of the application. We will train our system while users view and use both good and bad user interface designs, test our model on designs in progress, and validate our results using domain experts.
Other Peer-Reviewed Research
Dr. Mark Schlatter and I explored a combinatorial game theory analysis of the game Babylon, using a combined approach of mathematical analysis and computer simulation to drive our exploration of the game and our search for winning strategies. Babylon had not previously been explored with mathematical analysis, and our work illuminated connections between this game and the underlying theory of integer partitions. This work could have implications for other games, and it is interesting as research into a simple mathematical object. We also hope our work will encourage more collaboration in combinatorial game theory between mathematicians and computer scientists. This work was accepted for publication in the journal Integers in early 2011 and has spurred mathematical investigations of this game by other researchers.
In collaboration with Dr. Jeffrey Agnew of the Centenary College of Louisiana Department of Geology, we explored the application of Inductive Logic Programming to a new domain involving decapod crustacean claws. We found that we can distinguish dactyl shapes by automatically extracting relational features that describe their underlying spatial structure. Our work was presented in two venues: first at the International Conference on Inductive Logic Programming (ILP 2008), held September 10-12, 2008 in Prague, Czech Republic, as a late-breaking paper entitled "Learning Comprehensible Relational Features to Distinguish Subfossil Decapod Crustacean Dactyls"; and second at the Geological Society of America Annual Meeting on October 8, 2008, with a poster in the Paleontology IV - Stratigraphy and Morphology section entitled "Distinguishing Dactyls of Crab Species Using Relational Machine Learning," where we received feedback on the applied portion of the research and discussed new domains where this approach would be relevant.