Discussion:
Final CfP: 1st Workshop on Semantic Explainability (SemEx 2019), co-located with ICSC 2019
Basil Ell
2018-11-16 22:18:09 UTC
*10 days left* to submit to the 1st Workshop on Semantic Explainability
(SemEx 2019).

Submission deadline: *Nov 26, 2018* – 23:59 Hawaii Time

------------------------------------------------------------------------------------------------------------------------------
                                        Final Call for Research Papers
------------------------------------------------------------------------------------------------------------------------------

                        1st Workshop on Semantic Explainability (SemEx 2019)
                               http://www.semantic-explainability.com/

                                            co-located with
         The 13th IEEE International Conference on Semantic Computing (ICSC 2019)
                   Jan 30 - Feb 1, 2019 - Newport Beach, California, USA

------------------------------------------------------------------------------------------------------------------------------
Overview
------------------------------------------------------------------------------------------------------------------------------
In recent years, the explainability of complex systems such as decision
support systems, automatic decision systems, machine
learning-based/trained systems, and artificial intelligence in general
has been expressed not only as a desired property, but also as a
property that is required by law. For example, the General Data
Protection Regulation's (GDPR) "right to explanation" demands that the
results of ML/AI-based decisions be explained. The explainability of
complex systems, especially of ML-based and AI-based systems, becomes
increasingly relevant as more and more aspects of our lives are
influenced by these systems' actions and decisions.

Several workshops address the problem of explainable AI. However, none
of these workshops focuses on semantic technologies such as ontologies
and reasoning. We believe that semantic technologies and explainability
intersect in two ways. First, systems that are based on semantic
technologies must be explainable like all other AI systems. Second,
semantic technologies seem predestined to help make systems that are
not based on semantic technologies explainable.

Turning a system that already makes use of ontologies into an
explainable system could be supported by those ontologies, as ideally
they capture some aspects of the users' conceptualizations of a problem
domain. However, how can such systems make use of these ontologies to
generate explanations of the actions they performed and the decisions
they took? Which criteria must an ontology fulfill so that it supports
the generation of explanations? Do we have adequate ontologies that
allow explanations to be expressed and that allow us to model and
reason about what is understandable or comprehensible for a certain
user? What kind of lexicographic information is necessary to generate
linguistic utterances? How can a system's understandability be
evaluated? How should ontologies be designed for system
understandability? What models of human-machine interaction let a user
interact with the system until they understand a certain action or
decision? How can explanatory components be reused with other systems
that they were not designed for?
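
As a minimal illustration of the first of these questions (not part of
the call itself; the vocabulary, labels, and sentence template below
are invented), a subclass axiom and rdfs:label entries can already be
verbalized into a one-step explanation, sketched here in Python with
rdflib:

  from rdflib import Graph, Namespace, RDF, RDFS, Literal

  EX = Namespace("http://example.org/")

  g = Graph()
  g.add((EX.loan42, RDF.type, EX.BalloonLoan))               # observed fact
  g.add((EX.BalloonLoan, RDFS.subClassOf, EX.HighRiskLoan))  # domain axiom
  g.add((EX.BalloonLoan, RDFS.label, Literal("balloon loan")))
  g.add((EX.HighRiskLoan, RDFS.label, Literal("high-risk loan")))

  def explain(graph, individual):
      # Verbalize one reasoning step: the individual is a C, and
      # every C is a D; rdfs:label supplies the lexical material.
      for cls in graph.objects(individual, RDF.type):
          for sup in graph.objects(cls, RDFS.subClassOf):
              c = graph.value(cls, RDFS.label)
              d = graph.value(sup, RDFS.label)
              yield ("It was flagged because it is a %s, "
                     "and every %s is a %s." % (c, c, d))

  for sentence in explain(g, EX.loan42):
      print(sentence)

Even this toy example surfaces two of the questions above: the quality
of the explanation hinges on the available lexicographic information
(the labels) and on whether the superclass is a concept the user
actually understands.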

Turning systems that are based not on ontologies but on sub-symbolic
representations/distributed semantics, such as deep learning-based
approaches, into explainable systems might likewise be supported by the
use of ontologies. Some efforts in this field have been referred to as
neural-symbolic integration.
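
A hypothetical toy example of this direction (the labels and the
taxonomy below are invented; a real system would obtain them from a
learned model and an ontology): a sub-symbolic classifier's output is
grounded in symbolic concepts so that the explanation can generalize
beyond the raw label:

  # Softmax-style output of a (here, faked) sub-symbolic classifier.
  scores = {"tabby": 0.71, "siamese": 0.18, "beagle": 0.11}

  # Toy symbolic layer: a label-to-superclass mapping standing in
  # for an ontology's class hierarchy.
  superclass = {"tabby": "cat", "siamese": "cat", "beagle": "dog"}

  label = max(scores, key=scores.get)
  print("Classified as '%s' (confidence %.0f%%); "
        "in the taxonomy, every %s is a %s."
        % (label, 100 * scores[label], label, superclass[label]))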

This workshop aims to bring together international experts interested
in the application of semantic technologies for explainability of
artificial intelligence/machine learning to stimulate research,
engineering, and evaluation – towards making machine decisions
transparent, retraceable, comprehensible, interpretable, explainable,
and reproducible. Semantic technologies have the potential to play an
important role in the field of explainability, since they lend
themselves very well to the task: they make it possible to model users'
conceptualizations of the problem domain. However, this field has so
far been only rarely explored.

-------------------------------------------------------------------------------------------------------------------------------
                                              Topics of Interest
-------------------------------------------------------------------------------------------------------------------------------

Topics of interest include, but are not limited to:

- Explainability of machine learning models based on semantics/ontologies
- Exploiting semantics/ontologies for explainable/traceable recommendations
- Explanations based on semantics/ontologies in the context of decision
making/decision support systems
- Semantic user modelling for personalized explanations
- Design criteria for explainability-supporting ontologies
- Dialogue management and natural language generation based on
semantics/ontologies
- Visual explanations based on semantics/ontologies
- Multi-modal explanations using semantics/ontologies
- Interactive/incremental explanations based on semantics/ontologies
- Ontological modeling of explanations and user profiles

------------------------------------------------------------------------------------------------------------------------------
                                             Author Instructions
------------------------------------------------------------------------------------------------------------------------------

Manuscripts should be prepared according to the IEEE ICSC Author
Guidelines (formatting guidelines, LaTeX styles, and the Word template
are available at https://www.ieee-icsc.org/submission). Submissions
must be in English, must not be longer than eight (8) pages, must be
provided as a PDF file, and must be submitted via the SemEx 2019
EasyChair site (https://easychair.org/conferences/?conf=semex2019).

------------------------------------------------------------------------------------------------------------------------------
                                                         Important Dates
------------------------------------------------------------------------------------------------------------------------------
Submission deadline: Nov 26, 2018 – 23:59 Hawaii Time
Notification of acceptance: Dec 10, 2018 – 23:59 Hawaii Time
Camera-ready version due: Dec 17, 2018 – 23:59 Hawaii Time
Workshop Date: Jan 30, 2019 (subject to conference schedule)

------------------------------------------------------------------------------------------------------------------------------
                                                     Workshop Organizers
------------------------------------------------------------------------------------------------------------------------------
Philipp Cimiano – Bielefeld University
Basil Ell – Bielefeld University, Oslo University
Axel-Cyrille Ngonga Ngomo – Paderborn University
--
Dr. Basil Ell
AG Semantic Computing
Bielefeld University
Bielefeld, Germany
CITEC, 2.311
+49 521 106 2951
Basil Ell
2018-11-26 21:18:35 UTC
We extended the deadline to *December 3rd, 2018* – 23:59 Hawaii Time!
--
Dr. Basil Ell
AG Semantic Computing
Bielefeld University
Bielefeld, Germany
CITEC, 2.311
+49 521 106 2951