SPARK Plugin for Eclipse User Guide

Author: Ken Conley
NOTE: This document applies to SPARK-IDE version 0.3.7 (circa April 2005)

This guide contains instructions on how to get the most out of the SPARK plugin for Eclipse. Please make sure that you have followed the installation instructions first and have successfully installed the plugin before continuing.


The SPARK Plugin is still under active development, and the features described in this document are likely to change.


Basics

Switching into the SPARK Perspective

Expert tip: You can switch perspectives quickly within Eclipse by pressing Ctrl+F8.

Eclipse has a notion of "Perspectives" and "Views." A Perspective is a preset layout of various windows, which are called Views. You can think of a Perspective as a driver's seat preset that saves the position of the seat, steering wheel, mirrors, and so on. If you are editing .spark files, you will want a large editing window for your files; if you are debugging your files, you will want views that show you what different variables are bound to; and if you are checking files out of CVS, you will want a view listing various CVS repositories. Perspectives let you switch quickly, in a task-oriented manner, between which windows are being shown.

Perspectives in Eclipse are opened using the Window->Open Perspective menu. The initial menu shows the more common perspectives that are available. There are several more perspectives that are available if you select "Other..." Views in Eclipse are opened using the Window->Show View menu. Depending on which perspective you are currently in, this menu will change, but you can access any of the views by selecting Window->Show View->Other...

For editing your .spark files, there is a specialized perspective, predictably referred to as the SPARK perspective. You can open the SPARK perspective using the Window->Open Perspective menu. If you do not see the SPARK perspective immediately listed, click on "Other...", and you should see the SPARK perspective listed there.

After you have opened the SPARK perspective, you should see the SPARK perspective icon on the left-hand side of your Eclipse Window (or on the top right if you are using Eclipse 3.0). In the future, you can switch into the SPARK perspective by clicking on this icon, without having to use the Window->Open Perspective menu.

List of SPARK Views

There are several views specially defined for use with SPARK and SPARK-related tools:

These views are accessible using the Window->Show View menu. If these are not immediately listed, select "Other..." and then select the folder labelled "Debug". Most of these views are automatically set to open once you open the appropriate perspective.

List of Common Eclipse Views

The following is a list of non-SPARK-specific Views that you are likely to encounter while using Eclipse. If the listed view is not visible, you can click on Window->Show View. If you do not see the desired view listed, click on "Other..." and then find the view you want in the appropriate folder (e.g. Basic Views->Console).

Editing project properties

NOTE: ignore this step if you are not a Java developer. I've included it to introduce Java developers to some more features of Eclipse.

We won't actually be changing any settings in this step, but now is a good time to show you where Eclipse stores its project settings.

  1. Switch into the Java perspective (Window->Open Perspective->Java Perspective) if you haven't already, and also make sure that the Navigator view is open (Window->Show View->Navigator).
  2. Right click on the spark project in the Navigator and select Properties.
  3. Select "Java Build Path". This shows you which Java resources Eclipse uses to compile your project. At the very bottom you'll notice a field called Default output folder, which sets where your .class files will be generated. The Source tab lets you configure where the actual .java files are; the Projects tab lets you indicate other projects that this project depends on; the Libraries tab configures where the .jar files are; and the Order and Export tab lets you determine the order in which the resources are included.
  4. Select Java Task Tags in the left-hand pane. This is the configuration page for an interesting feature of Eclipse. When you are editing Java code, you can leave a comment in your code like:
    String foo = "bar"; //TODO: rename this to foobar
    Eclipse will see this TODO comment, and add 'rename this to foobar' to the list of tasks in the Tasks view.
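The comment above can be dropped into any Java class; Eclipse also recognizes FIXME and XXX as task tags by default. Here is a minimal, self-contained sketch (the class and method names are made up for illustration):

```java
public class TaskTagDemo {
    // TODO: rename this to foobar -- Eclipse adds this note to the Tasks view
    static String foo() {
        return "bar";
    }

    public static void main(String[] args) {
        // FIXME and XXX are Eclipse's other default task tags
        System.out.println(foo());
    }
}
```

Each tagged comment appears as a row in the Tasks view, and double-clicking the row jumps straight to the comment in the editor.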

Editing your SPARK-L files

Expert tip 1: Eclipse has several keyboard shortcuts for navigating between editors. Ctrl+E pulls up a list of all open editors. Ctrl+PageDown and Ctrl+PageUp navigate between editor tabs. Alt+Left and Alt+Right navigate backwards and forwards like a Web browser.

Expert tip 2: You can leave yourself 'TODO' notes by typing #TODO: your todo note at the beginning of a line. A summary of all your TODO notes is available in the "Tasks" view, and a blue marker will also appear next to your scrollbar that you can click to jump to the note quickly.

Expert tip 3: You can quickly navigate to syntax errors and TODO markers in a file using Ctrl+. and Ctrl+, (or you can use the next/previous annotation buttons on the toolbar, or the Navigate menu).

  1. Switch into the SPARK perspective (Window->Open Perspective->SPARK Perspective) if you haven't already, and also make sure that the Navigator view is open (Window->Show View->Navigator).
  2. Find the file you want to edit in the Navigator and double-click on it. (NOTE: all .spark files should have a SPARK icon next to them).

The SPARK-L editor provides several basic editing features:

Creating new SPARK-L files

  1. Switch into the SPARK perspective (Window->Open Perspective->SPARK Perspective) if you haven't already, and also make sure that the Navigator view is open (Window->Show View->Navigator).
  2. Find the folder containing the SPARK package that you want to add your new .spark file to (e.g. spark_examples).
  3. Right click on this package and select "New->File" and name your new file with a .spark extension.

Synchronizing with CVS

  1. Switch into the SPARK perspective (Window->Open Perspective->SPARK Perspective) if you haven't already, and also make sure that the Navigator view (Window->Show View->Navigator) is open.
  2. Right click on the file or directory tree that you want to update/commit to CVS. If you are updating everything, you can select the project itself.
  3. Select "Team->Synchronize with Repository"
  4. A new view should open titled "Synchronize". It might be a bit small, so feel free to double click on "Synchronize" to make it full screen.
  5. The Synchronize view has three different modes: incoming mode, outgoing mode, and incoming/outgoing mode. Select whichever is appropriate to what you are doing (checkin/checkout/merge).
  6. Eclipse will show you a list of files that are appropriate to the mode you selected. To commit or update files/directory trees from CVS, right click (Mac: control-click) on them and select the appropriate operation. You can also select "Show Content Comparison," which will open up a side-by-side comparison of the resources.
    If you don't see the appropriate update/commit action, check to make sure that you are in the correct mode. You can only update in the incoming or incoming/outgoing mode, and you can only commit in the outgoing or incoming/outgoing mode.

Debugging

Opening the Debug Perspective

The Debug Perspective is where you will be running all of your SPARK processes, so it's important to learn how to switch into it first.

  1. Select Window->Open Perspective->Debug Perspective. If you are not already in the Debug Perspective, your views should change. You can tell that you are in the Debug Perspective by the presence of a view called Debug. There should also be a view with tabs labelled "Expressions" and "Variables", among others.
  2. Once you have opened the Debug Perspective once, you can switch back to it more quickly by clicking on the Debug perspective icon on the left side of your Eclipse Window.

Debugging SPARK modules

  1. Open the file containing the SPARK module that you want to run. For this example, we will use spark_examples.delivery. Select Run->Debug. Do NOT select Run->Run as this is broken.
  2. In the left pane there should be a list of "configurations" including SPARK Interpreter. Click on "SPARK Interpreter" and select "New".
  3. Some of the values will automatically be filled in for you, but you will still need to tell Eclipse the location of OAA and Jython. The error messages at the top of the screen will guide you through the required values. (Expert tip: You can override the default values by using the preferences menu)
  4. Click Debug
  5. SPARK will now run the module you selected, with all output sent to the Console view. You can interact with SPARK directly in this view by typing commands, or you can use some of the built-in SPARK IDE tools to interact with SPARK. NOTE: if you get an error message telling you that you
    "could not create connector, most likely the port is still in use"
    or
    "Only one instance of SPARK can run be running - please terminate any existing SPARK processes before starting a new one."
    you will need to follow the Terminating SPARK instructions for getting rid of existing SPARK processes.

Terminating a SPARK debug process (IMPORTANT)

There are multiple ways to terminate a SPARK debug process:

Sending commands to SPARK

Execute Window

  1. Start up a SPARK debug process (see "Debugging SPARK modules").
  2. Look for the SPARK Execute view. If you do not see this view, select Window->Show View->Spark Execute.

Debugging SPARK modules Part II (Knowledge Base)

You can monitor the value of predicate expressions in the SPARK agent knowledge base by using the Evaluate/Test view, which is part of the Debug Perspective (if you do not see this view, select Window->Show View->Evaluate/Test).

The list of predicate expressions you enter is saved between sessions so that you do not have to re-enter them. If a predicate on the list is not applicable to your current SPARK interpreter session, no results for it will be shown.

Debugging SPARK modules Part III (Stepping)

  1. Follow the instructions in Debugging SPARK modules (part I) to get a SPARK interpreter running. Also please follow the instructions in Sending Commands to SPARK so you learn how to use the Spark Execute view.
  2. In the Debug view, Spark Agent Thread should appear. Please click on it once to select it.
  3. Pause the SPARK Interpreter. You can do this using the pause icon on the Debug view title bar.
    NOTE: You can resume the SPARK interpreter by selecting the resume icon.
  4. Run a SPARK command using the SPARK Execute view. You should see a new intention appear underneath Spark Agent Thread.
  5. Step the debugger. You can do this by using the step into or step over icons which are on the Debug view menu. If you have trouble identifying the icons, hold your mouse over one of the arrow-shaped icons and a pop-up should appear identifying it for you.
  6. You can continue stepping the debugger until SPARK runs out of tasks to execute. You can select new tasks to run at any time during the stepping process.

Debugging SPARK modules Part IV (variable bindings)

  1. Follow the instructions in Part III (Stepping) to get the SPARK interpreter into stepping mode.
  2. To the right of the Debug view you should see a window with tabs labelled "Variables", "Expressions", and others. Select the tab labelled "Variables".
  3. As you step through the SPARK process, you should see the local variable bindings appear in this view.

Setting SPARK Interpreter preferences

You can override some of the default settings (e.g. SPARK home, Jython home) that the SPARK Plugin chooses when you run a SPARK module:

Configuring Step Filters

Step filters are a very useful feature when stepping through SPARK process models. A simple process model can require hundreds of steps to complete, and you may only wish to monitor a very small handful of these. Step filters allow you to specify events and keywords to either monitor or ignore. For example, you can tell SPARK to report 'SUCCEEDED' and 'FAILED' events, but ignore 'EXECUTING' events. Similarly, you can tell it to ignore events with the text 'print' in them or only report events with the text 'foobar.'
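As a mental model, the filtering described above amounts to a simple predicate over events. The sketch below only illustrates the concept and is not the plugin's actual implementation; the event types come from the example above, and the sample task text is made up:

```java
import java.util.List;

// Illustrative model of keyword-based step filtering; NOT the plugin's
// actual implementation. The sample task text is invented.
public class StepFilterSketch {
    // Report an event only if its type is not ignored and its text does
    // not contain any ignored keyword.
    static boolean report(String type, String text,
                          List<String> ignoredTypes, List<String> ignoredKeywords) {
        if (ignoredTypes.contains(type)) {
            return false;
        }
        for (String keyword : ignoredKeywords) {
            if (text.contains(keyword)) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        // Ignore 'EXECUTING' events and anything mentioning 'print'.
        List<String> ignoredTypes = List.of("EXECUTING");
        List<String> ignoredKeywords = List.of("print");

        System.out.println(report("SUCCEEDED", "deliverItem pkg1", ignoredTypes, ignoredKeywords)); // true
        System.out.println(report("EXECUTING", "deliverItem pkg1", ignoredTypes, ignoredKeywords)); // false
        System.out.println(report("FAILED", "printStatus pkg1", ignoredTypes, ignoredKeywords));    // false
    }
}
```

Only events for which the predicate returns true would be reported during stepping; everything else is silently skipped.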

You can access the step filter configuration in one of three different ways:

  1. Under the 'Run' menu: 'Configure SPARK Step Filters'
  2. On your toolbar: the 'Configure SPARK Step Filters' button
  3. On the right-click menu in the 'Debug' view: 'Configure SPARK Step Filters'