{"id":1026,"date":"2023-06-24T03:43:58","date_gmt":"2023-06-23T22:13:58","guid":{"rendered":"https:\/\/icmi.acm.org\/2024\/?page_id=1026"},"modified":"2024-04-29T22:40:03","modified_gmt":"2024-04-29T17:10:03","slug":"call-for-papers","status":"publish","type":"page","link":"https:\/\/icmi.acm.org\/2024\/call-for-papers\/","title":{"rendered":"Call for Papers"},"content":{"rendered":"<p>[et_pb_section fb_built=&#8221;1&#8243; admin_label=&#8221;section&#8221; _builder_version=&#8221;4.14.4&#8243; background_enable_image=&#8221;off&#8221; custom_padding=&#8221;3px||0px|||&#8221; global_colors_info=&#8221;{}&#8221; theme_builder_area=&#8221;post_content&#8221;][et_pb_row admin_label=&#8221;row&#8221; _builder_version=&#8221;4.14.4&#8243; background_size=&#8221;initial&#8221; background_position=&#8221;top_left&#8221; background_repeat=&#8221;repeat&#8221; width=&#8221;90%&#8221; custom_padding=&#8221;4px|||||&#8221; global_colors_info=&#8221;{}&#8221; theme_builder_area=&#8221;post_content&#8221;][et_pb_column type=&#8221;4_4&#8243; _builder_version=&#8221;3.25&#8243; custom_padding=&#8221;|||&#8221; global_colors_info=&#8221;{}&#8221; custom_padding__hover=&#8221;|||&#8221; theme_builder_area=&#8221;post_content&#8221;][et_pb_text _builder_version=&#8221;4.14.4&#8243; _module_preset=&#8221;default&#8221; text_font=&#8221;||||||||&#8221; text_text_color=&#8221;#4f4f4f&#8221; text_font_size=&#8221;13px&#8221; header_4_text_color=&#8221;#3DBDA8&#8243; header_4_line_height=&#8221;2em&#8221; header_5_text_color=&#8221;#6292C2&#8243; header_5_line_height=&#8221;1.6em&#8221; custom_margin=&#8221;||0px|||&#8221; global_colors_info=&#8221;{}&#8221; theme_builder_area=&#8221;post_content&#8221;]<\/p>\n<h4><strong><\/strong><\/h4>\n<h4><strong>Call for Papers<\/strong><\/h4>\n<p><a href=\"https:\/\/new.precisionconference.com\/submissions\/icmi24a\" target=\"_blank\" rel=\"noopener\">Go to Submission site<\/a><strong><\/strong><\/p>\n<p><span style=\"font-weight: 400;\">The 
26th International Conference on Multimodal Interaction (ICMI 2024) will be held in San Jos\u00e9, Costa Rica. ICMI is the premier international forum that brings together multimodal artificial intelligence (AI) and social interaction research. Multimodal AI encompasses technical challenges in machine learning and computational modeling such as representations, fusion, data, and systems. <\/span><span style=\"font-weight: 400;\">The study of social interactions encompasses both human-human and human-computer interactions. A unique aspect of ICMI is its multidisciplinary nature, which values both scientific discoveries and technical modeling achievements, with an eye towards impactful applications for the good of people and society.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Novelty will be assessed along two dimensions: scientific novelty and technical novelty. Accepted papers at ICMI 2024 will need to be novel along at least one of these two dimensions:<\/span><\/p>\n<ul>\n<li><strong>Scientific Novelty<\/strong>: <span style=\"font-weight: 400;\">Papers should bring new scientific knowledge about human social interactions, including human-computer interactions. Examples include discovering new behavioral markers that are predictive of mental health, or showing how behavioral patterns relate to children\u2019s interactions during learning. It is the responsibility of the authors to perform a proper literature review and clearly discuss the novelty of the scientific discoveries made in their paper.<\/span><\/li>\n<li><strong>Technical Novelty<\/strong>: <span style=\"font-weight: 400;\">Papers should propose novelty in their computational approach for recognizing, generating, or modeling multimodal data. Examples include novelty in the learning and prediction algorithms, in the neural architecture, or in the data representation. 
Novelty can also be associated with new usages of an existing approach.<\/span><\/li>\n<\/ul>\n<p><span style=\"font-weight: 400;\">Commitment to ethical conduct is required: submissions must adhere to ethical standards, particularly when human-derived data are employed. Authors are encouraged to read the ACM Code of Ethics and Professional Conduct (<\/span><a href=\"https:\/\/ethics.acm.org\/\"><span style=\"font-weight: 400;\">https:\/\/ethics.acm.org\/<\/span><\/a><span style=\"font-weight: 400;\">).<\/span><\/p>\n<h5><b>ICMI 2024 conference theme:<\/b><\/h5>\n<p><span style=\"font-weight: 400;\">The theme of this year&#8217;s ICMI conference is &#8220;<\/span><b>Equitability and environmental sustainability in multimodal interaction technologies.<\/b><span style=\"font-weight: 400;\">&#8221; The focus is on how multimodal systems and multimodal interactive applications can serve as tools to bridge the digital divide, particularly in underserved communities and countries, with a specific emphasis on those in Latin America and the Caribbean. The conference aims to examine the design principles that can make multimodal systems more equitable and sustainable in applications such as health and education, thereby catalyzing positive change for historically marginalized groups, including racial\/ethnic minorities and indigenous peoples. The conference also explores the intersection between multimodal interaction technologies and environmental sustainability: how these technologies can be designed to help understand, communicate, and mitigate the adverse impacts of climate change, especially in the Latin America and Caribbean region. 
The conference will explore the potential of multimodal systems in fostering community resilience, raising awareness, and facilitating education related to climate change, thereby contributing to a holistic approach that encompasses both social and environmental dimensions.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Additional topics of interest include, but are not limited to:<\/span><\/p>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Affective computing and interaction<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Cognitive modeling and multimodal interaction<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Gesture, touch and haptics<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Healthcare, assistive technologies<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Human communication dynamics<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Human-robot\/agent multimodal interaction<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Human-centered AI 
and ethics<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Interaction with smart environments<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Machine learning for multimodal interaction<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Mobile multimodal systems<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Multimodal behaviour generation<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Multimodal datasets and validation<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Multimodal dialogue modeling<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Multimodal fusion and representation<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Multimodal interactive applications<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Novel multimodal datasets<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Speech behaviours in social interaction<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">System components and multimodal platforms<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Visual behaviours in social interaction<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Virtual\/augmented reality and multimodal interaction<\/span><\/li>\n<\/ul>\n<h5><b>ACM Publication Policies<\/b><\/h5>\n<p><b><span style=\"font-weight: 400;\">By submitting your article to an ACM Publication, you are hereby acknowledging that you and your co-authors are subject to all 
<\/span><a href=\"https:\/\/www.acm.org\/publications\/policies\"><span style=\"font-weight: 400;\">ACM Publications Policies<\/span><\/a><span style=\"font-weight: 400;\">, including ACM\u2019s new <\/span><a href=\"https:\/\/www.acm.org\/publications\/policies\/research-involving-human-participants-and-subjects\"><span style=\"font-weight: 400;\">Publications Policy on Research Involving Human Participants and Subjects<\/span><\/a><span style=\"font-weight: 400;\">. Alleged violations of this policy or any ACM Publications Policy will be investigated by ACM and may result in a full retraction of your paper, in addition to other potential penalties, as per ACM Publications Policy.<\/span><\/b><\/p>\n<p><b><span style=\"font-weight: 400;\"><br \/><\/span><span style=\"font-weight: 400;\">Please ensure that you and your co-authors <\/span><a href=\"https:\/\/orcid.org\/register\"><span style=\"font-weight: 400;\">obtain an ORCID ID<\/span><\/a><span style=\"font-weight: 400;\">, so you can complete the publishing process for your accepted paper. ACM has been involved in ORCID from the start and we have recently made a <\/span><a href=\"https:\/\/authors.acm.org\/author-resources\/orcid-faqs\"><span style=\"font-weight: 400;\">commitment to collect ORCID IDs from all of our published authors<\/span><\/a><span style=\"font-weight: 400;\">. ACM requires that all accepted journal authors register and provide ACM with valid ORCIDs prior to paper publication. 
We are committed to improving author discoverability, ensuring proper attribution, and contributing to ongoing community efforts around name normalization; your ORCID ID will help in these efforts.<\/span><\/b><\/p>\n<h5><strong>Important Dates<\/strong><\/h5>\n<table style=\"width: 100%; border-collapse: collapse;\">\n<tbody>\n<tr>\n<td style=\"width: 50%;\">Abstract deadline<\/td>\n<td style=\"width: 50%;\">April 26th, 2024<\/td>\n<\/tr>\n<tr>\n<td style=\"width: 50%;\">Paper Submission<\/td>\n<td style=\"width: 50%;\"><strong>May 10th, 2024<\/strong><\/td>\n<\/tr>\n<tr>\n<td style=\"width: 50%;\"><span style=\"font-weight: 400;\">Rebuttal Period<\/span><\/td>\n<td style=\"width: 50%;\"><span style=\"font-weight: 400;\">June 16th-23rd, 2024<\/span><\/td>\n<\/tr>\n<tr>\n<td style=\"width: 50%;\"><span style=\"font-weight: 400;\">Paper notification<\/span><\/td>\n<td style=\"width: 50%;\"><span style=\"font-weight: 400;\">July 18th, 2024<\/span><\/td>\n<\/tr>\n<tr>\n<td style=\"width: 50%;\"><span style=\"font-weight: 400;\">Camera-ready paper<\/span><\/td>\n<td style=\"width: 50%;\"><strong>August 16th, 2024<\/strong><\/td>\n<\/tr>\n<tr>\n<td style=\"width: 50%;\"><span style=\"font-weight: 400;\">Presenting at main conference<\/span><\/td>\n<td style=\"width: 50%;\"><span style=\"font-weight: 400;\">November 5th-7th, 2024<\/span><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][\/et_pb_section]<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Call for Papers Go to Submission site The 26th International Conference on Multimodal Interaction (ICMI 2024) will be held in San Jos\u00e9, Costa Rica. 
ICMI is the premier international forum that brings together multimodal artificial intelligence (AI) and social interaction research. Multimodal AI encompasses technical challenges in machine learning and computational modeling such as representations, [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_et_pb_use_builder":"on","_et_pb_old_content":"","_et_gb_content_width":"","inline_featured_image":false,"footnotes":""},"class_list":["post-1026","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/icmi.acm.org\/2024\/wp-json\/wp\/v2\/pages\/1026","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/icmi.acm.org\/2024\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/icmi.acm.org\/2024\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/icmi.acm.org\/2024\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/icmi.acm.org\/2024\/wp-json\/wp\/v2\/comments?post=1026"}],"version-history":[{"count":21,"href":"https:\/\/icmi.acm.org\/2024\/wp-json\/wp\/v2\/pages\/1026\/revisions"}],"predecessor-version":[{"id":1627,"href":"https:\/\/icmi.acm.org\/2024\/wp-json\/wp\/v2\/pages\/1026\/revisions\/1627"}],"wp:attachment":[{"href":"https:\/\/icmi.acm.org\/2024\/wp-json\/wp\/v2\/media?parent=1026"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}