Overview
Abstract

This project presents a large-scale experimental study in which a humanoid robot learned to press and detect doorbell buttons autonomously. The models for action selection and visual detection were grounded in the robot's sensorimotor experience and learned without human intervention. Experiments were performed with seven doorbell buttons, which provided auditory feedback when pressed. The robot learned to accurately predict the locations of the functional components of each button. The trained visual model was also able to detect the functional components of novel buttons.

Paper

Sukhoy, V. and Stoytchev, A., "Learning to Detect the Functional Components of Doorbell Buttons Using Active Exploration and Multimodal Correlation," In Proceedings of the 2010 IEEE International Conference on Humanoid Robots (Humanoids), Nashville, TN, pp. 572-579, December 6-8, 2010. PDF.
[Framework figure] Click on a stage in the framework to view the video that describes it.
BibTeX Snippet

@InProceedings{sukhoy2010Humanoids,
  author    = {V. Sukhoy and A. Stoytchev},
  title     = {Learning to Detect the Functional Components of Doorbell Buttons Using Active Exploration and Multimodal Correlation},
  booktitle = {Proceedings of the 2010 IEEE International Conference on Humanoid Robots (Humanoids)},
  year      = {2010},
  pages     = {572--579},
  address   = {Nashville, TN},
  month     = {December}
}

Earlier Papers