[CfP] Call for Papers: Information Processing & Management (IP&M) (IF: 6.222) Special Issue on "Science Behind Neural Language Models"

    From Ptaszynski Michal@21:1/5 to All on Wed Feb 2 03:52:35 2022
    Dear Colleagues,

    ** Apologies for cross-posting **

    This is Michal Ptaszynski from Kitami Institute of Technology, Japan.

    We are accepting papers for the Information Processing & Management
    (IP&M) (IF: 6.222) journal Special Issue on Science Behind Neural
    Language Models. This special issue is also a Thematic Track at the
    Information Processing & Management Conference 2022 (IP&MC2022),
    meaning that at least one author of each accepted manuscript will
    need to attend the IP&MC2022 conference. For more information about
    IP&MC2022, please visit:
    https://www.elsevier.com/events/conferences/information-processing-and-management-conference

    The deadline for manuscript submission is June 15, 2022, but your paper
    will be reviewed immediately after submission and will be published as
    soon as it is accepted.

    We hope you will consider submitting your paper:
    https://www.elsevier.com/events/conferences/information-processing-and-management-conference/author-submission/science-behind-neural-language-models

    Info regarding submission:
    https://www.elsevier.com/events/conferences/information-processing-and-management-conference/author-submission

    Best regards,

    Michal PTASZYNSKI, Ph.D., Associate Professor
    Department of Computer Science
    Kitami Institute of Technology
    165 Koen-cho, Kitami, 090-8507, Japan
    TEL/FAX: +81-157-26-9327
    michal@mail.kitami-it.ac.jp

    ===========================================
    Information Processing & Management (IP&M) (IF: 6.222) Special Issue
    on "Science Behind Neural Language Models"
    &
    Information Processing & Management Conference 2022 (IP&MC2022)
    Thematic Track on "Science Behind Neural Language Models"

    Motivation

    The last several years have seen an explosive rise in the popularity
    of neural language models, especially large pretrained language
    models based on the transformer architecture. The fields of Natural
    Language Processing (NLP) and Computational Linguistics (CL) have
    experienced a shift from simple language models, such as
    bag-of-words, and word representations, like word2vec or GloVe, to
    contextually aware language models, such as ELMo and, more recently,
    BERT and GPT, including their improvements and derivatives. The
    consistently high performance obtained by BERT-based models across
    various tasks even convinced Google to apply BERT as the default
    backbone in its search engine query expansion module, thus making
    BERT-based models mainstream and a strong baseline in NLP/CL
    research. The popularity of large pretrained language models has
    also allowed major growth among companies providing freely available
    repositories of such models and, more recently, the founding of
    Stanford University’s Center for Research on Foundation Models
    (CRFM). However, despite the overwhelming popularity and undeniable
    performance of large pretrained language models, or “foundation
    models”, the specific inner workings of those models have been
    notoriously difficult to analyze, and the causes of the (usually
    unexpected and unreasonable) errors they make difficult to untangle
    and mitigate. As neural language models keep gaining in popularity
    while expanding into the area of multimodality by incorporating
    visual and speech information, it has become all the more important
    to thoroughly analyze, fully explain, and understand their internal
    mechanisms. In other words, the science behind neural language
    models needs to be developed.

    Aims and scope

    With the above background in mind, we propose this Information
    Processing & Management Conference 2022 (IP&MC2022) Thematic Track
    and Information Processing & Management journal Special Issue on the
    Science Behind Neural Language Models. The TT/SI will focus on
    topics that deepen our knowledge of how neural language models work.
    Therefore, instead of taking up basic topics from the fields of CL
    and NLP, such as improving part-of-speech tagging or standard
    sentiment analysis, regardless of whether they apply neural language
    models in practice, we will focus on promoting research that
    specifically aims at analyzing and understanding the “bells and
    whistles” of neural language models, for which a generally accepted
    science has not yet been established.

    Target audience

    The TT/SI is aimed at scientists, researchers, scholars, and
    students performing research on the analysis of pretrained language
    models, with a specific focus on explainable approaches to language
    models, analysis of the errors such models make, and methods for
    debiasing, detoxification, and other improvements to pretrained
    language models. The TT/SI will not accept research on
    well-established basic NLP/CL topics, such as improvement of
    part-of-speech tagging or sentiment analysis, even if it applies
    neural language models, unless it directly contributes to furthering
    the understanding and explanation of the inner workings of
    large-scale pretrained language models.


    List of Topics

    The Thematic Track / Special Issue invites papers on topics
    including, but not limited to, the following:
    - Neural language model architectures
    - Improvement of neural language model generation process
    - Methods for fine-tuning and optimization of neural language models
    - Debiasing neural language models
    - Detoxification of neural language models
    - Error analysis and probing of neural language models
    - Explainable methods for neural language models
    - Neural language models and linguistic phenomena
    - Lottery Ticket Hypothesis for neural language models
    - Multimodality in neural language models
    - Generative neural language models
    - Inferential neural language models
    - Cross-lingual or multilingual neural language models
    - Compression of neural language models
    - Domain specific neural language models
    - Expansion of information embedded in neural language models


    Important Dates:

    - Thematic track manuscript submission due date (authors are welcome
      to submit early, as reviews will be rolling): June 15, 2022
    - Author notification: July 31, 2022
    - IP&MC2022 conference presentation and feedback: October 20-23, 2022
    - Post-conference revision due date: January 1, 2023

    Submission Guidelines:

    Submit your manuscript to the Special Issue category (VSI: IPMC2022
    HCICTS) through the online submission system of Information
    Processing & Management: https://www.editorialmanager.com/ipm/

    Authors should prepare their submissions following the IP&M Guide
    for Authors
    (https://www.elsevier.com/journals/information-processing-and-management/0306-4573/guide-for-authors).
    All papers will be peer-reviewed following the IP&MC2022 reviewing
    procedures.

    The authors of accepted papers will be obligated to participate in
    IP&MC2022 and present their papers to the community to receive
    feedback. Accepted papers will be invited for revision after
    receiving feedback at the IP&MC2022 conference. Submissions will be
    given premium handling at IP&M following its peer-review procedure
    and, if accepted, published in IP&M as full journal articles, with
    an additional option for a short conference version at IP&MC2022.

    Please see this infographic for the manuscript flow:
    https://www.elsevier.com/__data/assets/pdf_file/0003/1211934/IPMC2022Timeline10Oct2022.pdf

    For more information about IP&MC2022, please visit
    https://www.elsevier.com/events/conferences/information-processing-and-management-conference.


    Thematic Track / Special Issue Editors:

    Managing Guest Editor: Michal Ptaszynski (Kitami Institute of
    Technology)

    Guest Editors:
    Rafal Rzepka (Hokkaido University)
    Anna Rogers (University of Copenhagen)
    Karol Nowakowski (Tohoku University of Community Service and Science)


    For further information, please feel free to contact Michal
    Ptaszynski directly.
