{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2024,9,12]],"date-time":"2024-09-12T17:59:18Z","timestamp":1726163958887},"publisher-location":"Cham","reference-count":9,"publisher":"Springer International Publishing","isbn-type":[{"type":"print","value":"9783031062483"},{"type":"electronic","value":"9783031062490"}],"license":[{"start":{"date-parts":[[2022,1,1]],"date-time":"2022-01-01T00:00:00Z","timestamp":1640995200000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2022,5,20]],"date-time":"2022-05-20T00:00:00Z","timestamp":1653004800000},"content-version":"vor","delay-in-days":139,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2022,1,1]],"date-time":"2022-01-01T00:00:00Z","timestamp":1640995200000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2022,5,20]],"date-time":"2022-05-20T00:00:00Z","timestamp":1653004800000},"content-version":"vor","delay-in-days":139,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":[],"published-print":{"date-parts":[[2022]]},"abstract":"Abstract<\/jats:title>We propose a method for estimating the frictional force between a contacted surface and the human touch using thermal video images captured using an infrared thermographic camera. Because this method can estimate force remotely, its application to various situations, in which the measurement is difficult to obtain using conventional contact-based methods, is expected. Furthermore, thermal images have the advantage of measuring physical quantities directly related to frictional force. As a result of machine learning using the measured data from multiple subjects and materials, we succeeded in estimating the frictional force with a high accuracy from the information of the temperature change on the surface. In addition, we account for both the frictional and direct heat transferred between the finger and object affecting the temperature change; therefore, we attempted to improve the accuracy by extracting only frictional heat. 
Consequently, our method succeeded in improving the accuracy.<\/jats:p>","DOI":"10.1007\/978-3-031-06249-0_27","type":"book-chapter","created":{"date-parts":[[2022,5,19]],"date-time":"2022-05-19T16:05:38Z","timestamp":1652976338000},"page":"234-242","update-policy":"http:\/\/dx.doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":0,"title":["Estimation of Frictional Force Using the Thermal Images of Target Surface During Stroking"],"prefix":"10.1007","author":[{"ORCID":"http:\/\/orcid.org\/0000-0002-7950-0248","authenticated-orcid":false,"given":"Mitsuhiko","family":"Shimomura","sequence":"first","affiliation":[]},{"ORCID":"http:\/\/orcid.org\/0000-0002-5042-1773","authenticated-orcid":false,"given":"Masahiro","family":"Fujiwara","sequence":"additional","affiliation":[]},{"ORCID":"http:\/\/orcid.org\/0000-0002-9362-4407","authenticated-orcid":false,"given":"Yasutoshi","family":"Makino","sequence":"additional","affiliation":[]},{"ORCID":"http:\/\/orcid.org\/0000-0002-3006-430X","authenticated-orcid":false,"given":"Hiroyuki","family":"Shinoda","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2022,5,20]]},"reference":[{"key":"27_CR1","doi-asserted-by":"crossref","unstructured":"Brahmbhatt, S., Ham, C., Kemp, C., Hays, J.: Contactdb: analyzing and predicting grasp contact via thermal imaging. In: Proceedings of the IEEE\/CVF Conference on Computer Vision and Pattern Recognition (2019)","DOI":"10.1109\/CVPR.2019.00891"},{"key":"27_CR2","doi-asserted-by":"crossref","unstructured":"Dunn, T., Banerjee, S., Banerjee, N.K.: User-independent detection of swipe pressure using a thermal camera for natural surface interaction. In: 2018 IEEE 20th International Workshop on Multimedia Signal Processing (MMSP), pp. 1\u20136. IEEE (2018)","DOI":"10.1109\/MMSP.2018.8547052"},{"key":"27_CR3","doi-asserted-by":"crossref","unstructured":"Grieve, T., Lincoln, L., Sun, Y., Hollerbach, J.M., Mascaro, S.A.: 3D force prediction using fingernail imaging with automated calibration. In: 2010 IEEE Haptics Symposium, pp. 113\u2013120. IEEE (2010)","DOI":"10.1109\/HAPTIC.2010.5444669"},{"issue":"5","key":"27_CR4","doi-asserted-by":"publisher","first-page":"1116","DOI":"10.1109\/TRO.2015.2459411","volume":"31","author":"TR Grieve","year":"2015","unstructured":"Grieve, T.R., Hollerbach, J.M., Mascaro, S.A.: 3-D fingertip touch force prediction using fingernail imaging with automated calibration. IEEE Trans. Rob. 31(5), 1116\u20131129 (2015)","journal-title":"IEEE Trans. Rob."},{"key":"27_CR5","doi-asserted-by":"crossref","unstructured":"Kishino, Y., Shirai, Y., Yanagisawa, Y., Ohara, K., Mizutani, S., Suyama, T.: Identifying human contact points on environmental surfaces using heat traces to support disinfect activities: poster abstract. In: Proceedings of the 18th Conference on Embedded Networked Sensor Systems, SenSys 2020, pp. 768\u2013769. Association for Computing Machinery, New York (2020)","DOI":"10.1145\/3384419.3430597"},{"key":"27_CR6","doi-asserted-by":"publisher","first-page":"134","DOI":"10.1016\/j.bspc.2019.01.011","volume":"50","author":"A Marban","year":"2019","unstructured":"Marban, A., Srinivasan, V., Samek, W., Fern\u00e1ndez, J., Casals, A.: A recurrent convolutional neural network approach for sensorless force estimation in robotic surgery. Biomed. Signal Process. Control 50, 134\u2013150 (2019)","journal-title":"Biomed. Signal Process. 
Control"},{"issue":"1","key":"27_CR7","doi-asserted-by":"publisher","first-page":"26","DOI":"10.1109\/TRA.2003.820931","volume":"20","author":"SA Mascaro","year":"2004","unstructured":"Mascaro, S.A., Asada, H.H.: Measurement of finger posture and three-axis fingertip touch force using fingernail sensors. IEEE Trans. Rob. Autom. 20(1), 26\u201335 (2004)","journal-title":"IEEE Trans. Rob. Autom."},{"issue":"2","key":"27_CR8","doi-asserted-by":"publisher","first-page":"204","DOI":"10.1109\/TOH.2018.2803053","volume":"11","author":"S Yoshimoto","year":"2018","unstructured":"Yoshimoto, S., Kuroda, Y., Oshiro, O.: Estimation of object elasticity by capturing fingernail images during haptic palpation. IEEE Trans. Haptics 11(2), 204\u2013211 (2018)","journal-title":"IEEE Trans. Haptics"},{"key":"27_CR9","doi-asserted-by":"crossref","unstructured":"Zolfaghari, M., Singh, K., Brox, T.: Eco: efficient convolutional network for online video understanding. In: Proceedings of the European Conference on Computer Vision (ECCV), pp. 695\u2013712 (2018)","DOI":"10.1007\/978-3-030-01216-8_43"}],"container-title":["Lecture Notes in Computer Science","Haptics: Science, Technology, Applications"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/978-3-031-06249-0_27","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,3,12]],"date-time":"2024-03-12T11:24:44Z","timestamp":1710242684000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/978-3-031-06249-0_27"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022]]},"ISBN":["9783031062483","9783031062490"],"references-count":9,"URL":"https:\/\/doi.org\/10.1007\/978-3-031-06249-0_27","relation":{},"ISSN":["0302-9743","1611-3349"],"issn-type":[{"type":"print","value":"0302-9743"},{"type":"electronic","value":"1611-3349"}],"subject":[],"published":{"date-parts":[[2022]]},"assertion":[{"value":"20 May 2022","order":1,"name":"first_online","label":"First Online","group":{"name":"ChapterHistory","label":"Chapter History"}},{"value":"EuroHaptics","order":1,"name":"conference_acronym","label":"Conference Acronym","group":{"name":"ConferenceInfo","label":"Conference Information"}},{"value":"International Conference on Human Haptic Sensing and Touch Enabled Computer Applications","order":2,"name":"conference_name","label":"Conference Name","group":{"name":"ConferenceInfo","label":"Conference Information"}},{"value":"Hamburg","order":3,"name":"conference_city","label":"Conference City","group":{"name":"ConferenceInfo","label":"Conference Information"}},{"value":"Germany","order":4,"name":"conference_country","label":"Conference Country","group":{"name":"ConferenceInfo","label":"Conference Information"}},{"value":"2022","order":5,"name":"conference_year","label":"Conference Year","group":{"name":"ConferenceInfo","label":"Conference Information"}},{"value":"22 May 2022","order":7,"name":"conference_start_date","label":"Conference Start Date","group":{"name":"ConferenceInfo","label":"Conference Information"}},{"value":"25 May 2022","order":8,"name":"conference_end_date","label":"Conference End Date","group":{"name":"ConferenceInfo","label":"Conference Information"}},{"value":"13","order":9,"name":"conference_number","label":"Conference Number","group":{"name":"ConferenceInfo","label":"Conference Information"}},{"value":"eurohaptics2022","order":10,"name":"conference_id","label":"Conference 
ID","group":{"name":"ConferenceInfo","label":"Conference Information"}},{"value":"http:\/\/www.eurohaptics2022.org","order":11,"name":"conference_url","label":"Conference URL","group":{"name":"ConferenceInfo","label":"Conference Information"}},{"value":"Single-blind","order":1,"name":"type","label":"Type","group":{"name":"ConfEventPeerReviewInformation","label":"Peer Review Information (provided by the conference organizers)"}},{"value":"OCS","order":2,"name":"conference_management_system","label":"Conference Management System","group":{"name":"ConfEventPeerReviewInformation","label":"Peer Review Information (provided by the conference organizers)"}},{"value":"129","order":3,"name":"number_of_submissions_sent_for_review","label":"Number of Submissions Sent for Review","group":{"name":"ConfEventPeerReviewInformation","label":"Peer Review Information (provided by the conference organizers)"}},{"value":"36","order":4,"name":"number_of_full_papers_accepted","label":"Number of Full Papers Accepted","group":{"name":"ConfEventPeerReviewInformation","label":"Peer Review Information (provided by the conference organizers)"}},{"value":"0","order":5,"name":"number_of_short_papers_accepted","label":"Number of Short Papers Accepted","group":{"name":"ConfEventPeerReviewInformation","label":"Peer Review Information (provided by the conference organizers)"}},{"value":"28% - The value is computed by the equation \"Number of Full Papers Accepted \/ Number of Submissions Sent for Review * 100\" and then rounded to a whole number.","order":6,"name":"acceptance_rate_of_full_papers","label":"Acceptance Rate of Full Papers","group":{"name":"ConfEventPeerReviewInformation","label":"Peer Review Information (provided by the conference organizers)"}},{"value":"2.1","order":7,"name":"average_number_of_reviews_per_paper","label":"Average Number of Reviews per Paper","group":{"name":"ConfEventPeerReviewInformation","label":"Peer Review Information (provided by the conference organizers)"}},{"value":"1.8","order":8,"name":"average_number_of_papers_per_reviewer","label":"Average Number of Papers per Reviewer","group":{"name":"ConfEventPeerReviewInformation","label":"Peer Review Information (provided by the conference organizers)"}},{"value":"Yes","order":9,"name":"external_reviewers_involved","label":"External Reviewers Involved","group":{"name":"ConfEventPeerReviewInformation","label":"Peer Review Information (provided by the conference organizers)"}}]}}