Publication

Algorithmic differentiation of Java programs

  • Algorithmische Differentiation von Java Programmen

Slusanschi, Emil-Ioan; Bischof, Christian (Thesis advisor)

Aachen : Publikationsserver der RWTH Aachen University (2008)
Dissertation / PhD Thesis

Aachen, Techn. Hochsch., Diss., 2008

Abstract

Derivatives are a crucial component in many areas of science and engineering, and their accurate evaluation is required in numerous scientific applications. One technique widely used to compute derivatives is Automatic Differentiation (AD). The fact that, to date, no usable AD tool exists for Java motivated the development of an AD tool for the Java language. Because Java bytecode is portable and rigorously standardized, our ADiJaC (Automatic Differentiation for Java Class files) tool implements AD transformations on Java class files. However, an effective AD implementation requires a more abstract internal representation than the unstructured bytecode. One of the main goals of this work has been the identification of a level of intermediate representation suitable for automatic differentiation transformations, and its subsequent implementation for Java bytecode programs. The need for a unified intermediate representation serving as the basis for all kinds of program transformations has led to the development of IRs such as Jimple and Grimp in the Soot framework. At this level, enough information about the program code is available, while a useful degree of abstraction is maintained. As we constructively show in this work, this level of abstraction also enables an efficient implementation of AD-specific transformations. The ADiJaC tool implements forward-mode AD transformations, in both scalar and vector variants, with support for the entire Java 2 Math library. Moreover, ADiJaC handles issues that arise in the semantics of automatic differentiation transformations, such as the need to "deep-clone" certain objects in order to obtain local copies of these objects rather than mere references to them. The reverse-mode implementation consists of two main sweeps: a forward sweep and a reverse sweep.
The former saves the necessary intermediate variables, while the latter computes the required adjoints. ADiJaC saves context variables on a local stack rather than recomputing them, implementing a so-called "joint" reversal strategy. The fact that the stack objects are local to each adjoint method, coupled with the ability to use on-demand garbage collection, leads to a particularly efficient implementation of this approach. Because loop structures are represented in the Soot IRs as combinations of if and goto statements, and can therefore be confused with ordinary conditional statements, it is necessary to identify these structures properly. We implement transformations called loopPass and ifPass that capture this structure and deal with it effectively when implementing forward- and reverse-mode AD transformations. Using tags, internal data structures, and specific algorithms, ADiJaC extracts loop information from the unstructured Java bytecode, thus enabling the implementation of efficient AD transformations. Exception handling and warning reporting are also supported in both AD modes. The accuracy and correctness of both forward- and reverse-mode ADiJaC transformations were successfully verified on codes from the MINPACK-2 test problem collection, which are often used as standard AD test cases. The runtime and memory requirements for these problems show that ADiJaC generates correct and efficient code. ADiJaC-generated derivative code was also compared with Tapenade-generated code for equivalent Fortran codes, and ADiJaC fared quite well against the robust, well-established, and reliable Tapenade AD tool in performance comparisons. The speedup and efficiency obtained for a "derivative strip-mining" technique through thread-based parallelization of the forward implementation were also shown. The results demonstrate that most AD-enhanced codes scale quite well on shared-memory platforms.
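As a rough illustration of the forward-mode transformation discussed above, each active variable is paired with a tangent variable that is propagated alongside the original computation. The following hand-written Java sketch (illustrative names, not ADiJaC's actual generated code) shows the scalar forward mode for a small function:

```java
// Hand-written sketch of forward-mode (tangent) AD; names and structure
// are illustrative assumptions, not ADiJaC's actual output.
public class ForwardModeSketch {
    // Original function: f(x) = sin(x) * x^2
    public static double f(double x) {
        return Math.sin(x) * x * x;
    }

    // Forward-mode variant: every variable v carries a tangent v_d.
    // Returns {f(x), df/dx * x_d} for a given input tangent (seed) x_d.
    public static double[] f_d(double x, double x_d) {
        double t = Math.sin(x);
        double t_d = Math.cos(x) * x_d;             // chain rule: d(sin x) = cos(x) dx
        double y = t * x * x;
        double y_d = t_d * x * x + t * 2 * x * x_d; // product rule on t * x^2
        return new double[] { y, y_d };
    }
}
```

Seeding `x_d = 1.0` yields the derivative df/dx at x; the vector forward mode generalizes this by propagating an array of tangents per variable.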
The development of the ADiJaC tool is an ongoing process. Future optimizations include loop fusion for the derivative computation in the vector forward mode, joining iterations over the derivative vectors of different statements whenever the dependencies in the original code permit it. In the reverse-mode approach to the automatic differentiation of real applications, a substantial fraction of the total execution time is spent on stack operations for saving and retrieving intermediate values. Accordingly, these operations should be minimized using analyses similar to Tapenade's TBR ("to be recorded") analysis, which reduces the number of values that are stored.
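The two-sweep joint reversal and the stack operations mentioned above can be sketched by hand. In this illustrative Java fragment (assumed names, not ADiJaC's generated code), the forward sweep pushes intermediates that would otherwise be lost onto a method-local stack, and the reverse sweep pops them while accumulating adjoints in reverse statement order:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hand-written sketch of reverse-mode AD with a joint reversal:
// a forward sweep saving intermediates on a local stack, then a
// reverse sweep popping them to compute adjoints. Illustrative only.
public class ReverseModeSketch {
    // Original: y = sin(x1) * x2
    // Returns the adjoints {x1_b, x2_b} for an output adjoint seed y_b.
    public static double[] grad(double x1, double x2, double y_b) {
        // Local stack: eligible for garbage collection once the method returns.
        Deque<Double> stack = new ArrayDeque<>();

        // Forward sweep: compute and save needed intermediates.
        double t = Math.sin(x1);
        stack.push(t);                // save t for the reverse sweep
        double y = t * x2;            // function value (unused in this sketch)

        // Reverse sweep: propagate adjoints in reverse statement order.
        double t_b = 0, x1_b = 0, x2_b = 0;
        t = stack.pop();              // restore saved intermediate
        t_b += x2 * y_b;              // from y = t * x2
        x2_b += t * y_b;
        x1_b += Math.cos(x1) * t_b;   // from t = sin(x1)
        return new double[] { x1_b, x2_b };
    }
}
```

A TBR-style analysis would notice when a saved value (here `t`) is cheap to recompute or still live, and drop the corresponding push/pop pair, which is the kind of stack-traffic reduction the future work above targets.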

Institutions

  • Chair of High Performance Computing (Computer Science 12) [123010]
  • Department of Computer Science [120000]