Call for Papers

                            RV'03
                      Third Workshop on
                     Runtime Verification
                http://www.cis.upenn.edu/rv2003

                        July 13, 2003
                     Boulder, Colorado, USA
 
                   Affiliated with CAV'03
     http://www.cs.utexas.edu/users/trcenter/CAV/cav2003homepage.html



OBJECTIVES

The objective of RV'03 is to bring together scientists from both
academia and industry to debate how to monitor, analyze, and guide
the execution of programs. The longer-term goal is to investigate
the use of lightweight formal methods applied during the execution
of programs from two points of view. On the one hand, is run-time
application of formal methods a viable complement to the current
heavyweight methods of proving programs correct before execution,
such as model checking and theorem proving? On the other hand,
does formality improve traditional ad-hoc monitoring techniques
used in performance monitoring, distributed debugging, and so on?
Dynamic program monitoring and analysis can occur during testing
or during operation. The subject covers several technical fields,
as outlined below.

Dynamic Program Analysis. Techniques that gather information during
program execution and use it to infer properties of the program,
either during testing or in operation. Examples include algorithms
for detecting multi-threading errors, such as deadlocks and data
races, in execution traces.

Specification Languages and Logics. Formal methods researchers have
investigated logics and developed technologies suitable for model
checking and theorem proving, but monitoring may motivate new
observation-based foundational logics.

Program Instrumentation. Techniques for instrumenting programs, at 
the source code or object code/byte code level, to emit relevant 
events to an observer.

Program Guidance. Techniques for guiding the behavior of a program
once its specification is violated, ranging from standard exception
handling to advanced planning. Guidance can also be used during
testing to expose errors.

Novel Applications of Run-Time Verification. Formalisms that go
beyond correctness properties, including, but not limited to,
performance properties, survivability, and fault tolerance.

Submissions addressing both foundational and practical aspects of
dynamic monitoring are encouraged.


INVITED SPEAKER

Aloysius K. Mok 
Department of Computer Science
University of Texas, Austin, USA
http://www.cs.utexas.edu/~mok

Professor Mok has developed a sophisticated framework for monitoring timing
constraints. 

SUBMISSIONS

Full submissions should be sent by ** May 12 **.

Submissions may be up to 20 pages and may describe recent work,
work in progress, or even highly speculative work on any aspect
of dynamic program monitoring and analysis.

Topics of interest include, but are not limited to, the following:

- Specification languages and logics for program monitoring. This
  includes real-time logics and automata.
- Predictive analysis: predicting possible errors in other traces
  from a single execution trace.
- Event extraction: how to instrument source code or object code
  to emit events to an observer during execution.
- Tracing and dynamic analysis of concurrent/distributed systems,
  including multi-threading analysis, such as deadlock and data race detection.
- On-the-fly program behavior correction when a specification is
  violated during execution.
- Program execution guidance to expose errors.
- Synergy with other program analysis techniques such as testing, 
  model checking and static analysis.

Abstracts and submissions should be sent to one of the organizers.

We expect that accepted papers will be published in Electronic Notes in
Theoretical Computer Science. Selected papers will be considered for
publication in a prestigious journal. 


DATES:

  Submissions:   May 12, 2003
  Notification:  June 12, 2003
  Final papers:  June 22, 2003
  Workshop:      July 13, 2003


WEBSITE:

  http://cis.upenn.edu/rv2003/


PROGRAM COMMITTEE:

  Saddek Bensalem     (VERIMAG)
  Rance Cleaveland    (State University of New York at Stony Brook)
  Ann Gates           (University of Texas, El Paso)
  Patrice Godefroid   (Bell Laboratories)
  Gerard Holzmann     (Bell Laboratories)
  Susan Horwitz       (University of Wisconsin, Madison)
  Aloysius K. Mok     (University of Texas, Austin)
  Michael Moeller     (University of Oldenburg)
  Henny Sipma         (Stanford University)
  Oleg Sokolsky       (University of Pennsylvania)
  Scott Stoller       (State University of New York at Stony Brook)
  Mahesh Viswanathan  (University of Illinois, Urbana-Champaign)
  Sergio Yovine       (VERIMAG)
  Lenore Zuck         (New York University)

STEERING COMMITTEE:

  Klaus Havelund      (NASA Ames Research Center - Kestrel Technology)
  Insup Lee           (University of Pennsylvania)
  Grigore Rosu        (University of Illinois, Urbana-Champaign)

  Several additional PC members are pending confirmation.

ORGANIZING COMMITTEE:
  Oleg Sokolsky       (University of Pennsylvania)
  Mahesh Viswanathan  (University of Illinois, Urbana-Champaign)