{"id":273,"date":"2022-02-22T11:10:17","date_gmt":"2022-02-22T05:40:17","guid":{"rendered":"https:\/\/icmi.acm.org\/2022\/?page_id=273"},"modified":"2023-04-25T23:32:55","modified_gmt":"2023-04-25T18:02:55","slug":"call-for-papers","status":"publish","type":"page","link":"https:\/\/icmi.acm.org\/2023\/call-for-papers\/","title":{"rendered":"Call for Papers"},"content":{"rendered":"<p>[et_pb_section fb_built=\"1\" admin_label=\"section\" _builder_version=\"4.14.4\" background_enable_image=\"off\" custom_padding=\"3px||5px|||\" global_colors_info=\"{}\" theme_builder_area=\"post_content\"][et_pb_row admin_label=\"row\" _builder_version=\"4.14.4\" background_size=\"initial\" background_position=\"top_left\" background_repeat=\"repeat\" width=\"90%\" custom_padding=\"4px||6px|||\" global_colors_info=\"{}\" theme_builder_area=\"post_content\"][et_pb_column type=\"4_4\" _builder_version=\"3.25\" custom_padding=\"|||\" global_colors_info=\"{}\" custom_padding__hover=\"|||\" theme_builder_area=\"post_content\"][et_pb_text admin_label=\"Text\" _builder_version=\"4.14.4\" _module_preset=\"default\" text_font=\"||||||||\" text_text_color=\"#000000\" text_font_size=\"13px\" header_4_text_color=\"#072f93\" header_5_text_color=\"#085593\" text_orientation=\"justified\" custom_margin=\"||18px|||\" hover_enabled=\"0\" global_colors_info=\"{}\" theme_builder_area=\"post_content\" sticky_enabled=\"0\"]<\/p>\n<h4><strong>Call for Papers<\/strong><\/h4>\n<p><span style=\"font-weight: 400;\">The 25th International Conference on Multimodal Interaction (ICMI 2023) will be held in Paris, France. 
ICMI is the premier international forum that brings together multimodal artificial intelligence (AI) and social interaction research. Multimodal AI encompasses technical challenges in machine learning and computational modeling such as representations, fusion, data and systems. The study of social interactions covers both human-human interactions and human-computer interactions. A unique aspect of ICMI is its multidisciplinary nature, which values both scientific discoveries and technical modeling achievements, with an eye towards impactful applications for the good of people and society.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Novelty will be assessed along two dimensions: scientific novelty and technical novelty. Accepted papers at ICMI 2023 will need to be novel along at least one of the two dimensions:<\/span><\/p>\n<ul>\n<li><strong>Scientific Novelty<\/strong>: <span style=\"font-weight: 400;\">Papers should bring new scientific knowledge about human social interactions, including human-computer interactions. Examples include discovering new behavioral markers that are predictive of mental health, or showing how new behavioral patterns relate to children\u2019s interactions during learning. It is the responsibility of the authors to perform a proper literature review and clearly discuss the novelty of the scientific discoveries made in their paper.<\/span><\/li>\n<li><strong>Technical Novelty<\/strong>: <span style=\"font-weight: 400;\">Papers should propose novelty in their computational approach for recognizing, generating or modeling multimodal data. Examples include novelty in the learning and prediction algorithms, in the neural architecture, or in the data representation. 
Novelty can also be associated with new uses of an existing approach.<\/span><\/li>\n<\/ul>\n<p><span style=\"font-weight: 400;\">Please see the Submission Guidelines for Authors (<a href=\"https:\/\/icmi.acm.org\/2023\/guidelines-for-authors\/\">https:\/\/icmi.acm.org\/2023\/guidelines-for-authors\/<\/a>)<\/span><span style=\"font-weight: 400;\"> for detailed submission instructions. Commitment to ethical conduct is required, and submissions must adhere to ethical standards, in particular when human-derived data are employed. Authors are encouraged to read the ACM Code of Ethics and Professional Conduct (<\/span><a href=\"https:\/\/ethics.acm.org\/\"><span style=\"font-weight: 400;\">https:\/\/ethics.acm.org\/<\/span><\/a><span style=\"font-weight: 400;\">).<\/span><\/p>\n<p><b>ICMI 2023 conference theme:<\/b><\/p>\n<p><span style=\"font-weight: 400;\">The theme for this year\u2019s conference is \u201cScience of Multimodal Interactions\u201d. As the community grows, it is important to understand the main scientific pillars that underpin a deep understanding of multimodal social interactions. As a first step, we want to acknowledge key discoveries and contributions that the ICMI community has enabled over the past 20+ years. As a second step, we reflect on the core principles, foundational methodologies and scientific knowledge involved in studying and modeling multimodal interactions. This will help establish a distinctive research identity for the ICMI community while at the same time embracing its multidisciplinary, collaborative nature. 
This research identity and long-term agenda will enable the community to develop future technologies and applications while maintaining its commitment to world-class scientific research.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Additional topics of interest include, but are not limited to:<\/span><\/p>\n<ul>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Affective computing and interaction<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Cognitive modeling and multimodal interaction<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Gesture, touch and haptics<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Healthcare, assistive technologies<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Human communication dynamics<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Human-robot\/agent multimodal interaction<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Human-centered AI 
and ethics<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Interaction with smart environments<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Machine learning for multimodal interaction<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Mobile multimodal systems<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Multimodal behaviour generation<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Multimodal datasets and validation<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Multimodal dialogue modeling<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Multimodal fusion and representation<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Multimodal interactive applications<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Novel multimodal datasets<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Speech behaviours in social interaction<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">System components and multimodal platforms<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Visual behaviours in social interaction<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><span style=\"font-weight: 400;\">Virtual\/augmented reality and multimodal interaction<\/span><\/li>\n<\/ul>\n<h5><strong><span>ACM Publication Policies<\/span><\/strong><\/h5>\n<p>By submitting your article to an ACM Publication, you are hereby acknowledging that you and your co-authors are subject to all <a 
target=\"_blank\" href=\"https:\/\/www.acm.org\/publications\/policies\" rel=\"noopener noreferrer\" style=\"font-size: 13px;\">ACM Publications Policies<\/a><span style=\"font-size: 13px;\">, including ACM\u2019s new <\/span><a target=\"_blank\" href=\"https:\/\/www.acm.org\/publications\/policies\/research-involving-human-participants-and-subjects\" rel=\"noopener noreferrer\" style=\"font-size: 13px;\">Publications Policy on Research Involving Human Participants and Subjects<\/a><span style=\"font-size: 13px;\">. Alleged violations of this policy or any ACM Publications Policy will be investigated by ACM and may result in a full retraction of your paper, in addition to other potential penalties, as per ACM Publications Policy.<\/span><span style=\"font-size: 13px;\"><br \/>\nPlease ensure that you and your co-authors<\/span><span style=\"font-size: 13px;\">\u00a0<\/span><a target=\"_blank\" href=\"https:\/\/orcid.org\/register\" rel=\"noopener noreferrer\" style=\"font-size: 13px;\">obtain an ORCID ID<\/a><span style=\"font-size: 13px;\">, so you can complete the publishing process for your accepted paper. 
ACM has been involved in ORCID from the start and we have recently made a<\/span><span style=\"font-size: 13px;\">\u00a0<\/span><a target=\"_blank\" href=\"https:\/\/authors.acm.org\/author-resources\/orcid-faqs\" rel=\"noopener noreferrer\" style=\"font-size: 13px;\">commitment to collect ORCID IDs from all of our published authors<\/a><span style=\"font-size: 13px;\">. The collection process has started and will roll out as a requirement throughout 2022. We are committed to improving author discoverability, ensuring proper attribution, and contributing to ongoing community efforts around name normalization; your ORCID ID will help in these efforts.<br \/>\n<\/span><strong style=\"color: #085593; font-size: 16px;\">Important Dates<\/strong><\/p>\n<table width=\"479\">\n<tbody>\n<tr>\n<td width=\"199\">Abstract deadline<\/td>\n<td width=\"280\"><span>May 1st, 2023<\/span><\/td>\n<\/tr>\n<tr>\n<td width=\"199\">Paper submission<\/td>\n<td width=\"280\"><span>May 8th, 2023<\/span><\/td>\n<\/tr>\n<tr>\n<td width=\"199\">Rebuttal period<\/td>\n<td width=\"280\">June 23-29, 2023<\/td>\n<\/tr>\n<tr>\n<td width=\"199\">Paper notification<\/td>\n<td width=\"280\">July 21, 2023<\/td>\n<\/tr>\n<tr>\n<td width=\"199\">Camera-ready paper<\/td>\n<td width=\"280\">August 14, 2023<\/td>\n<\/tr>\n<tr>\n<td width=\"199\">Presenting at main conference<\/td>\n<td width=\"280\">October 9-13, 2023<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>[\/et_pb_text][\/et_pb_column][\/et_pb_row][\/et_pb_section]<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Call for Papers The 25th International Conference on Multimodal Interaction (ICMI 2023) will be held in Paris, France. ICMI is the premier international forum that brings together multimodal artificial intelligence (AI) and social interaction research. 
Multimodal AI encompasses technical challenges in machine learning and computational modeling such as representations, fusion, data and systems. The study [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"open","template":"","meta":{"_et_pb_use_builder":"on","_et_pb_old_content":"","_et_gb_content_width":"","inline_featured_image":false,"footnotes":""},"class_list":["post-273","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/icmi.acm.org\/2023\/wp-json\/wp\/v2\/pages\/273","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/icmi.acm.org\/2023\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/icmi.acm.org\/2023\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/icmi.acm.org\/2023\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/icmi.acm.org\/2023\/wp-json\/wp\/v2\/comments?post=273"}],"version-history":[{"count":26,"href":"https:\/\/icmi.acm.org\/2023\/wp-json\/wp\/v2\/pages\/273\/revisions"}],"predecessor-version":[{"id":1293,"href":"https:\/\/icmi.acm.org\/2023\/wp-json\/wp\/v2\/pages\/273\/revisions\/1293"}],"wp:attachment":[{"href":"https:\/\/icmi.acm.org\/2023\/wp-json\/wp\/v2\/media?parent=273"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}