Computer Programs in Seismology

Tutorials

Last updated April 9, 2025

This section contains links to tutorials that illustrate the use of many of the Computer Programs in Seismology codes. These examples were developed to test codes, to address research topics, and to respond to the needs of users. In most cases, a compressed archive is provided so that the user can duplicate the example and use it as a prototype for modification. As needs arise, new tutorials are added to this compilation.

I believe that the value of simulations lies in the insights that they provide for the design of actual field experiments and in defining the limitations of actual data sets. The tutorial on regional wave propagation in the ocean illustrates both the limitations of ambient-noise empirical Green's functions when using vertical-component data and the opportunities offered by transverse-transverse cross-correlations.

Note that, because of the age of some of the tutorials, links to external sites may no longer work. This should not affect the lessons inherent in each tutorial.

Basic Seismology

Synthetics

Surface Waves

Receiver Functions

Earth Structure

Sources

Seismic Data and Instrumentation

Velocity Models

Simulations

The purpose of the following is to demonstrate how the tools of Computer Programs in Seismology can be used to simulate observed data, so that the problems inherent in real data sets are better understood.

Data Downloads


Shell Scripting

Computer Programs in Seismology is a documented package of well-tested algorithms for certain types of seismological study. However, possession of the programs is just the first step in research. Equally important is the use of these programs to prepare and analyze data sets. For any researcher, this is difficult until one becomes comfortable with data analysis.

The purpose of these Tutorials is to provide proven procedures for data analysis in areas of common interest. New topics will continue to be added to this list.
Some of the Tutorials are tests with synthetic data, which serve to highlight the limitations of any analysis technique; they also serve as a validation of the compiled programs on a given computer. Other tutorials provide real data sets for consideration.

Considerable use is made of SHELL scripts (get any modern book on LINUX and read the section on the BASH shell). The BASH shell is the default shell on CYGWIN, LINUX and MacOS-X, and it is itself a programming language. A simple example of a bash shell script (with extensive comments) is the following.

Assume that we have a file named DOIT. Also assume that this is an executable shell script, e.g.,

> ls -l DOIT
-rwxr-xr-x 1 rbh rbh 3619 2007-01-04 08:16 DOIT	
[The x means that this is an executable]
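
If the x flags are missing, the file can be made executable with chmod, e.g.,

> chmod +x DOIT

If the current directory is not in the PATH, the script can be run as ./DOIT.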

Assume that the contents of this file are the following:

#!/bin/sh
#####
# A line starting with an initial # is a comment.
# I use comments a lot to document scripts so that I know
# what I did when I look at them later
#####
#####
# This script will get ray parameters for a given source
# depth and epicentral distance in degrees from the
# program udtdd
#####
#####
# define event depth as a shell variable - note
# no spaces are permitted
#####
EVDP=100
#####
# loop over epicentral distances
#####
for GCARC in 30 40 50 60 70 80 90
do
	RAYP=`udtdd -GCARC ${GCARC} -EVDP ${EVDP}`
	#####
	# the desired ray parameter is placed in the shell variable
	# RAYP. We can use this value later. Note that the command
	# syntax is -GCARC distance_degrees  The ${GCARC} places
	# the value from the for loop into the proper position
	#####
	#####
	# now list the results using the echo command
	#####
	echo ${GCARC} ${EVDP} ${RAYP}
done

I can now run the script as follows:

DOIT
30 100 0.0787601843
40 100 0.0742743611
50 100 0.0685746446
60 100 0.0613518916
70 100 0.0545108728
80 100 0.0479682162
90 100 0.0420032926
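
As a sketch of how such a script might be extended, the following also loops over several source depths and appends the results to a file for later use. The depth values and the output file name rayp.out are arbitrary choices for illustration; udtdd is invoked exactly as above.

#!/bin/sh
#####
# loop over several source depths and epicentral distances
# and save the ray parameters to a file for later use
# (the depths and the file name rayp.out are illustrative)
#####
rm -f rayp.out
for EVDP in 10 50 100
do
	for GCARC in 30 40 50 60 70 80 90
	do
		RAYP=`udtdd -GCARC ${GCARC} -EVDP ${EVDP}`
		#####
		# append distance, depth and ray parameter to rayp.out
		#####
		echo ${GCARC} ${EVDP} ${RAYP} >> rayp.out
	done
done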

Use SHELL scripts and include comments in them.