Blue note 1: Scheduling other documents
Jacco van Ossenbruggen
Introduction
I guess this is the first blue note of the INS2 group.
A.k.a. "Lynda's brave new world" because it reflects the
discussion Lynda and I had about the main differences between
SMIL 1.0 and the future SMIL 2.0, from the perspective of the
Amsterdam Hypermedia Model. At the time of writing, the results
of this discussion are only partially visible on the blackboard
we used, so the first goal of this note is to archive the
results in a more permanent fashion. A second goal is to find
out what the short-term issues are, in order to be better
prepared for our work within the W3C SYMM WG. A third goal is
to identify the real fundamental issues we need to address in
our long-term research agenda. So it is safe to say that this
is quite an ambitious first blue note...
With SMIL 1.0, we can schedule and position various media types,
without needing to know the precise inner details of these media
types. In SMIL 2.0, we want to be able to schedule (and, in some
cases, position) elements within these media types. This
requirement raises a whole new set of issues to be discussed.
Typical examples of media types (or WHAT elements do we want to schedule)
We want to be able to schedule the elements of the following
(non-exhaustive) list of media types:
- HTML
- SVG
- XML
- VRML (?)
- CSS (?!)
Fragment identification (or HOW we address these elements)
If we want to schedule elements of other media types, we need a
mechanism to address the elements to be scheduled. This is a
problem already addressed in hypermedia linking (anchoring) and
style sheet languages (selectors). Note that the spatial layout
of media items can also be defined relative to anchor positions,
and anchors have also been used to attach semantic information
to specific fragments of a media item. Examples of addressing
mechanisms include:
- XML IDs and HTML names, image maps
- CSS selectors
- XPointers
- Media-type specific (e.g. MPEG-7)
- HyTime location addressing
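To make this concrete, the same paragraph of an HTML document
could be addressed via several of these mechanisms. This is a
sketch; the file name, id and class are invented for
illustration:

```
<!-- target document: doc.html -->
<p id="intro" class="abstract">...</p>

doc.html#intro                    (HTML/XML ID fragment)
p.abstract                        (CSS selector)
doc.html#xpointer(id('intro'))    (XPointer)
```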
Scheduling languages (or WHERE do we define the schedule)
If we want to schedule elements of other media types, we need
to determine where (that is, in which document, or even in
which style sheet) we will define the schedule. Examples
include:
- In the media type itself, by extending the media type with
  a notion of time (e.g. HTML+TIME)
- Using a time-based document type which is able to refer to
  the elements being scheduled (e.g. SMIL++, "time-sheet")
- In a style sheet language which is extended with the notion
  of time (e.g. CSS+TIME, BHTML)
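As a sketch of the second option, a hypothetical "time-sheet"
could combine SMIL 1.0 timing containers (par, seq and the
ref media object element exist in SMIL 1.0) with fragment
identifiers pointing into an external HTML document; using
such fragment references for scheduling is exactly the
speculative part:

```
<!-- hypothetical time-sheet scheduling parts of doc.html -->
<seq>
  <par dur="10s">
    <ref src="doc.html#intro"/>   <!-- the intro paragraph -->
    <ref src="logo.png"/>
  </par>
  <ref src="doc.html#section2" begin="2s"/>
</seq>
```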
Interaction (or WHO controls the schedule)
Given a specific schedule, we need a mechanism to control it in
order to support user interaction, to take into account changes
in the run-time environment and to allow scripting for
applications that need behavior which cannot be described with
current declarative methods. Examples include the various
DOM-related event models.
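A minimal sketch of the control layer such an event model
would drive: the schedule object and its method names are
invented for illustration, and in a browser handleEvent would
be wired to DOM events such as "click".

```javascript
// Hypothetical sketch: a schedule whose clock is controlled by
// events, as a DOM event handler would do in a browser.
function makeSchedule() {
  let paused = false;
  let elapsed = 0;            // schedule time in ticks
  return {
    tick() { if (!paused) elapsed += 1; },  // advance the clock
    handleEvent(type) {       // e.g. registered for DOM events
      if (type === "pause") paused = true;
      if (type === "resume") paused = false;
    },
    elapsed() { return elapsed; }
  };
}

const s = makeSchedule();
s.tick(); s.tick();           // clock advances to 2
s.handleEvent("pause");
s.tick();                     // ignored while paused
s.handleEvent("resume");
s.tick();
console.log(s.elapsed());     // prints 3
```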
Integration problems
Independent of the issues related to timing and scheduling,
the process of integrating different media types by itself
raises many new issues. Examples include:
- Spatial scrolling vs. temporal scheduling. What should
  happen if an element has been scrolled outside the current
  window when it is activated?
- Link behavior. Do links between/among all those different
  (synchronized) media types and concurrent media streams
  remain manageable, and do they behave in a consistent
  manner?
- Transitions. How do we define/implement transition effects
  between all those different media types?
- Flow-based vs. absolute positioning. How do we integrate
  layout mechanisms based on text flow (common in HTML and
  other text-based media) with mechanisms based on
  absolute/relative positioning (common in multimedia, vector
  graphics, VR, etc.)?
- Combining multiple document hierarchies. Several media
  types use hierarchical composition, but often for different
  reasons. In text, the composition usually follows the text
  flow; in multimedia, it often follows the temporal flow;
  and in yet other media types, the composition may be based
  purely on semantics. Mechanisms that work for a specific
  media type and depend on that media type's hierarchical
  composition therefore typically do not work for media types
  whose hierarchies are based on other composition
  mechanisms.
- Combining intra-media and inter-media synchronization. If
  we define external schedules for media types that are
  already time-based, conflicts may arise. The same applies
  to defining spatial relations for media types that define
  their own layout.
- APIs. Even if we solve all of these issues at the modeling
  level, we still need to define and standardize the
  associated APIs to realize this in an open environment such
  as the Web. Is it reasonable to expect that we can
  standardize and implement the (many) associated APIs needed
  to solve these issues?
$Id: newworld.html,v 1.1 2001/02/21 19:28:38 lynda Exp $