Second International Workshop on
Performance and Evaluation
of Data Management Systems
Beijing, China, Friday, June 15, 2007
Call for Papers
The first goal of this workshop is to present insights gained from
experimental results in the area of data management systems. The second goal
is to promote the scientific validation of experimental results and to
facilitate the emergence, within the data management community, of an
accepted methodology for gathering, reporting, and sharing performance
measures.
Current conferences and journals do not encourage the submission of mostly
or purely experimental results. It is often difficult or impossible to
reproduce published experimental results, either because the source code of
research prototypes is not made available or because the experimental
framework is poorly documented. Most performance studies have limited depth
because of space constraints, and their validity is limited in time because
the assumptions made in the experimental framework eventually become
obsolete.
This workshop is meant as a forum for presenting
quantitative evaluation of various data management techniques and systems.
We invite the submission of original results from researchers, practitioners,
and developers. Of particular interest are:
- performance comparisons between competing techniques,
- studies revisiting published results,
- unexpected performance results on rare but interesting cases,
- negative results,
- scalability experiments.
We also invite contributions that quantify the performance of deployed
applications of data management systems.
To be considered, submissions should present a reproducible experimental
framework. Based on the information presented in the paper, it should be
possible for a reader to:
- install and configure the system being studied,
- reproduce the workload,
- run the experiments and perform the measurements being reported.
Note that the above requirements do not imply that the software used in the
reported measurements must be open source. Performance studies of systems
whose access is in some way restricted should clearly state the version,
distribution, and all relevant configuration parameters, enabling a willing
reader to reproduce the experiments once he or she has obtained the software.