GPSeismic Tutorial
Document contents - This document contains a brief overview and tutorial for use of GPSeismic software. Because of
its importance, considerable time is spent on Geodesy. Consult individual manuals for detailed use of the QuikLoad,
QuikView, GPSQL, QuikMap and QuikCon applications.
When you see the attention icon, read the corresponding section very carefully.
Contact information - GPSeismic is a software product of Trimble Navigation Ltd.
Cliff Harris is the author of QuikLoad, QuikView, QuikMap, and QuikEdit.
e-mail cliff_harris@trimble.com Tel 1-802-872-7770
Rudy Lambert is the author of GPSQL, QuikCon, GPArc, GPNav, GPLocator, Project Manager and NGSearch.
e-mail rudy_lambert@trimble.com Tel 1-828-263-0356
Bruce Payne is involved in all aspects of GPSeismic including geoid model support, application installation, source code
management and raw data decoding.
e-mail bruce_payne@trimble.com Tel 1-720-587-4892
Web site - http://www.gpseismic.com/
Software Overview
A First Look - If you are reading this, you probably are new to
GPSeismic, a suite of applications that represents several hundred
megabytes of installed software. The normal question is "Where do I start?" and the standard answer is "It depends on what you want to do."
With GPSeismic, you can perform actions as diverse as vehicle tracking
or inertial navigation system processing. GPSeismic currently consists of
eleven applications.
It is instructive to note that GPSeismic started as two applications, QuikLoad and QuikView, in support of the Trimble
real-time GPS stakeout system. QuikLoad imported or generated preplots in grid coordinates, converted these coordinates
to WGS84 geographic coordinates, and then allowed the user to create stakeout files for the Trimble system. QuikView
converted the stakeout files back to grid coordinates. Both programs allowed these functions to be conducted graphically.
GPSeismic now includes support for numerous GPS, inertial and conventional systems and contains an application called
GPSQL for data management. An MDB database serves as the repository of all surveyed points and associated quality
control indicators. It may also contain the preplots (aka, stakeout design points).
There are several additional programs that the user can take advantage of. There is a presentation mapping application, an incredibly powerful application capable of preplot redesign and coordinate problem solving, one capable of conventional data processing, one capable of real-time vehicle tracking, and one capable of real-time fleet tracking.
In the typical day-to-day usage scenario, three applications are used. We have preplots that we either generate or import into QuikLoad for the purpose of creating stakeout files. QuikView processes these stakeout files and deposits all survey data into a Microsoft Access (.mdb) database. GPSQL allows you to access this data and create any conceivable client deliverable (coordinate files, reports, graphs, etc.).
There are two other applications you might find yourself using quite frequently:
QuikMap
There is often the need to modify preplots in some manner. Perhaps there
are exclusion zones that require that points be deleted or moved. QuikMap
is a powerful application that allows the user to make these modifications
and much more. So it might very well be that for a particular project,
QuikMap is first used to make the required preplot modifications, and
these modified preplots are then imported into QuikLoad.
Note that QuikMap can do a lot more than just preplot re-design. If you
need to work with DEMs, exclusion zones or image processing, this
program is for you.
QuikCon
In some areas around the world, conventional surveying
still rules. GPSeismic contains a conventional data
processing program called QuikCon which outputs
coordinate files or a special proprietary file suitable for
import into QuikView. It specializes in processing reciprocal-type traverses and currently supports over twenty formats including TDS, Western/Gecos DCO, SP3, CGG's CHE and JOB, Sokkia, Swift, SDR 20 and 33, and GSI.
The rest of the story
The remaining programs might or might not be utilized depending on the user's needs. They include a mapping application called GPArc which is based on ESRI's ArcObjects libraries, an application called GPNav which is used for vehicle tracking and vessel navigation, one called GPLocator capable of monitoring up to 100 vehicles in real time, QuikEdit, a Windows editor, and NGSearch, a program which searches NGS control data CDs.
Software Installation
You will install from the file you downloaded from our website at:
http://www.gpseismic.com/download.htm
Perform a full installation. Note that manuals are in CHM format and are automatically installed in the Manuals folder.
The installation is your typical run-of-the-mill installation with a couple of exceptions. The first is that there are two
folders that need to be specified. One folder is for the applications and the default is /Program Files/. The other
folder is /GPSeismicData/ and this must be at a location with read/write permissions. The default is immediately
off the root folder.
If you are installing for the first time on a computer, don't plug in the security key until after you install the software (and drivers) and have re-booted. If you do, Windows might get confused as to which USB driver should be installed and install the wrong one. Also, note carefully the last dialog that gives you the option of installing the security key software drivers. You do have to do this if you have not done it on a previous version install.
Note that GPSeismic will run on a computer it is installed on (for the first time) for a period of 4 days without a key.
Other required files/programs you might need - We support every regional or world geoid model in use today that we are aware of. GPSeismic expects the geoid model files to be in their native format as distributed by the respective government or university agencies. Note that we fully support the Trimble .ggf format.
Data collector serial communications software - Most modern systems are capable of working as mass storage devices so you can simply use Windows Explorer to move files around. If your system uses a proprietary communication program, it must come from the respective vendor.
Interpolative datum transformation models - The file associated with the NADCON NAD27<->NAD83 datum transformation (conus.ncn) is supplied with GPSeismic. So is the Canadian NTV2 file. There are many other similar files distributed which can be found in a folder called \GPSeismic\Geodesy\ once GPSeismic is installed. This includes state HARN files.
Licensing
Once GPSeismic is installed, there is a program called License Manager that you might have to use. Consult our online
document for information concerning license activation, network licenses and more.
Using GPSeismic
First Things First - What is the WGS84 datum and why does it come into play?
In very general terms, there are many latitude/longitude systems, not just one. For example, one survey marker can have
several sets of latitude/longitude coordinates, each a bit different from the other. In the United States, survey markers
have latitude/longitude coordinates that have their origins in a large conventional survey effort carried out in the early
1900s. However, with the advent of GPS, survey markers also have a second set of latitude/longitude coordinates that
have their origins in satellite observations of that marker. The first set of latitude/longitudes is called NAD27 which is
short for North American Datum of 1927. The GPS-based latitude/longitude system is called the WGS84 datum which
stands for World Geodetic System of 1984.
The short and sweet summary of these different systems is this. Each latitude/longitude system, along with a mathematically defined shape on which the coordinates exist, is called a datum. The shape is called an ellipsoid; NAD27 uses one called Clarke 1866 while WGS84 uses one called, not surprisingly, WGS84. WGS84 is a world datum and is available anywhere. Regional datums such as NAD27 are only used in a specific area. Our primary problem is that the difference between identical coordinates in WGS84 and a local datum can range from very small to very large (hundreds of meters), and although we must normally report final survey coordinates in the local datum, we should perform GPS surveys in the WGS84 datum.
You will see this recurring theme: when using GPS, WGS84 coordinates and ellipsoidal heights should be used at any real-time reference, and errors can result in the rover positioning if there is a difference between the actual WGS84 coordinates of the reference and those that are utilized (i.e., local). Why is this?
As an analogy, consider the use of a total station in placing several points at predetermined
locations. We are given a setup point and a computed range and bearing for each point. A last
minute change requires shifting all points 5 meters south so a decision is made to add 5 meters to
the computed ranges. This would prove a correct solution only for the points due South of the total
station and increasing error would result as the bearing deviated from this. In summary, altering the
range only is incorrect and we must alter ranges and angles to establish a new framework in which
we can work.
Now, substitute a GPS satellite for the total station. Whenever we alter the true WGS84 coordinates of a GPS reference station, we artificially alter the computed satellite ranges. This means we alter the range corrections that are transmitted, since they are the difference between these computed ranges and those we observe. We thereby make the same mistake that was made in the analogy.
An additional factor is that if we use local datum coordinates at a reference and believe our surveyed
coordinates are also in the local datum, this might be true near the reference station but not further
away. The local datum might exhibit geodetic tensions and warps that the highly precise RTK is
incapable of realizing.
There's an additional item to consider. If you use mean sea level height at the reference, the rover is not necessarily going to record a mean sea level height because, frankly, sea level isn't level. It's a surface that undulates in response to gravity. Consider Carlsbad, New Mexico. In this region, the gradient can be 0.5 meters per 5 kilometers. So if you use msl height at the reference, the recorded msl height 15 km away is 1.5 meters in error. For this reason, you should use WGS84 ellipsoid height and then, in processing, use a geoid model to convert each point's height back to msl height. A geoid model is a sort of magic black box that, given WGS84 coordinates, provides a value that can be used to convert a WGS84 ellipsoid height to local height.
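The relationship the geoid model resolves is a simple one: the geoid-ellipsoid separation N ties the ellipsoidal height h to the orthometric (msl) height H through H = h - N. Here is a minimal sketch of that conversion in Python with hypothetical values; the function name and numbers are illustrative only and not part of GPSeismic.

```python
# Minimal sketch of applying a geoid model value (hypothetical numbers).
# h_ellipsoid: WGS84 ellipsoidal height recorded by the rover
# geoid_sep_n: geoid-ellipsoid separation (N) interpolated from a geoid model
#              at the point's WGS84 latitude/longitude
def msl_height(h_ellipsoid: float, geoid_sep_n: float) -> float:
    """Convert a WGS84 ellipsoidal height to an approximate msl height."""
    return h_ellipsoid - geoid_sep_n

# Example: the rover records 123.45 m ellipsoidal height and the geoid model
# reports N = -26.80 m at that location, so the msl height is about 150.25 m.
print(msl_height(123.45, -26.80))   # 150.25
```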
Project Manager
In Project Manager, you may create a project at which time you are asked to select
a coordinate system, a datum transformation method, and a geoid model, and then
prompted for several other important processing parameters. Once these selections
are made, you can operate the various applications and will observe that the
system selected in Project Manager, the geoid model, and other preferences are
being utilized. It is possible to avoid Project Manager and work with the applications only, choosing systems and entering parameters as you go. However, by using Project Manager, you are less likely to forget one or more of these settings, and the settings for each project you create are saved, which allows for switching between projects later on.
Managing Your Geodetic Settings
You must choose a coordinate system and a datum transformation method when creating a project. GPSeismic ships with dozens of predefined coordinate systems and datum transformations. All of the parameters associated with systems and datum transformations reside in the Geodesy.mdb file which is located in the \GPSeismicData\Geodesy\ folder. You can access the database by selecting Geodetic Settings from the utility menu of Project Manager. The same geodetic settings utility can be accessed from all of the GPSeismic applications. Before we create our project, let's delve into the geodetic settings utility and see how we can view the parameters of a coordinate system or a datum transformation.
When the main geodetic settings interface is displayed, you have
the choice of about a dozen items. This includes a conversion
calculator, geoid heights calculator, and a number of viewing and
creation utilities.
Step 2 - On the dialog above, press the tool button, Create (At A Point). For seismic operations, the suggested datum transformation method is a three-parameter shift (aka, a Molodensky shift) in order to preserve the relative positions of the seismic stakeout locations. If these parameters are not known explicitly, they are determined using this utility.
In the Create-Datum-At-A-Point dialog, to derive a three-parameter shift, you
must choose the ellipsoid associated with the local datum, and enter the
geographic coordinates in local and WGS84 datums for a single point, then press
the Compute button. You should then see the geocentric offsets in the bottom of
the dialog. Note here that the entry format for geographic coordinates is quite flexible and includes DD MM SS.sss, DD.ddd, and unsigned entries provided the hemisphere is designated (e.g., the local west longitude could be entered as -92 45 31.0300 or as 92 45 31.0300 W). The three parameter values represent the dX, dY and dZ shifts to WGS84 respectively. If you leave the checkbox on this dialog checked, then the three parameters will be copied to the Shifts To WGS84 X, Y and Z textboxes of the parent dialog. The final step is to press Update on the parent dialog. That datum transformation should now be available for selection at a later time.
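Under the hood, Create (At A Point) is essentially converting both sets of geographic coordinates to geocentric XYZ, each on its own ellipsoid, and differencing them. The sketch below shows that computation with hypothetical coordinates and the published Clarke 1866 and WGS84 ellipsoid constants; it illustrates the math only and is not GPSeismic code.

```python
import math

def geodetic_to_ecef(lat_deg, lon_deg, h, a, inv_f):
    """Convert geodetic coordinates on a given ellipsoid to geocentric XYZ (meters)."""
    f = 1.0 / inv_f
    e2 = f * (2.0 - f)                                  # first eccentricity squared
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = a / math.sqrt(1.0 - e2 * math.sin(lat) ** 2)    # prime vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - e2) + h) * math.sin(lat)
    return x, y, z

# Hypothetical common point: local coordinates on Clarke 1866, WGS84 coordinates on WGS84.
local = geodetic_to_ecef(32.0, -92.7586194, 0.0, 6378206.4, 294.9786982)     # Clarke 1866
wgs84 = geodetic_to_ecef(32.0003, -92.7590, 0.0, 6378137.0, 298.257223563)   # WGS84

dx, dy, dz = (w - l for w, l in zip(wgs84, local))
print(f"Shifts to WGS84: dX={dx:.1f} m  dY={dy:.1f} m  dZ={dz:.1f} m")
```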
Creating a Datum Transformation When The Specific Parameters are Known
Creating a datum transformation when you already know the datum transformation parameters requires you to display the dialog as in Step 1 above, select the method and ellipsoid, and then simply enter the parameters. The datum transformation is normally one of four types: 1) a three-parameter shift, 2) a seven-parameter shift, 3) a ten-parameter shift or 4) an interpolative black-box type such as NADCON or Canada's NTV2. In the case of the latter, the shift file must exist in the \GPSeismic\Geodesy\ folder and you must select it on the dialog. The final step is to press the Update button with the add option chosen.
TIP - The most common mistakes made when creating a datum transformation are failing to select the proper local datum ellipsoid or entering signs incorrectly.
NTV2 Note - For you Canadians, we have a datum listed in our geodetic database for the NTV2 interpolative datum transformation.
If you are a good surveyor, once either a datum transformation or coordinate system (or both) have been
selected or defined, you should test the conversion against known coordinates using the conversion calculator
available from the main Geodetic Settings dialog. In fact, it wouldn't be a bad idea to compare what the GPSeismic
conversion calculator gives you against another conversion program.
Creating the Project...at last
IT'S GO TIME! If you have an existing project, the first dialog asks you whether to back up the current settings. It's a good idea to say YES to this. Then a dialog is displayed that prompts the user for the most important information in the entire project creation sequence, namely the coordinate system and datum transformation method. If you don't know what this is for, bail out now and start reading from the top of this document again. Make sure for the following exercises, you select the settings you see here.
Note that on the dialog the Select button displays, you can press the buttons next to system and datum to list the specific parameters for each item. If you were using NAD27, you would expect to see that you were using the conus.ncn shift file when you pressed the button next to the datum.
On the same dialog, the user can select the geoid model to use. If one is not available at this point, the user may ignore
this selection and move forward in creating the project. However, if one is available, the user can navigate to it and select
it. The possible models are available from the file dialog file filter. If you skipped this step, make a note to yourself to get
the geoid model and specify it in the Preferences menu of QuikView and GPSQL, or alternatively, you can modify the
geodetic parameters of the current project (which will eventually be the one we are creating).
Note - It is unlikely you will adjust heights based on a selection of a classical datum shift (3 or 7 parameter), but if you do, check the appropriate box and skip the selection of the geoid model.
The next dialog allows the user to add specialty fields to the database and/or
change the names of several fields in the database rather than use their default
names. If changing the names of default fields, the data placed in these fields
does not change! Only the field names are affected.
If you choose the last item of the drop list, you can select a database model. The new project database will contain all the fields the selected database contains. Note that if you fail to add a specialty field at this point, it's always possible to add fields to the database later in the GPSQL application.
The next dialog to be displayed requires a name for the project, a place to put the project, and a subdirectory structure. The name can be practically anything but cannot be the name of an existing project. The user can dictate where the project directory structure is placed, but let's use the default. This action will create a directory structure based on separate directories for receivers and sources and create some files in a directory called DATABASE. Don't delete these files or this directory. Feel free to do anything you want with the others that are created. They will be empty.
The final dialog requires that the user enter several important processing
parameters. These are not all of the processing parameters, but those that are
considered to be the most important based on the fact that they affect the
results that are ultimately deposited in the database.
Source and Receiver Azimuths - Some of the more important parameters
are the azimuths that are specified for source and receiver. These are
required in order to accurately compute Inline and Crossline offsets in the
QuikView application. These azimuths may be entered in Grid or Geodetic. If
the user enters grid, a small dialog will be displayed in which the user must
enter approximate local grid coordinates for the prospect. From this dialog,
once the user presses Compute, the grid convergence value will appear for the entered point. Press Apply and the values
will be applied to each of the entered azimuths. The 'Geodetic' option button will then be automatically selected. Now
select the other items that tell the QuikView application how to discriminate between source and receiver. This is as
important as entering the correct azimuths! Use 0 degrees for source and 90 for receiver in this tutorial. When the
approximate coordinate dialog appears, enter an X of 530000 and a Y of 2000000.
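In effect, the Apply step just applies the computed grid convergence to each entered grid azimuth to produce a geodetic azimuth. A hedged sketch of that idea follows; the convergence value and the sign convention shown are illustrative only (conventions vary by projection), and GPSeismic derives the actual convergence from the approximate coordinates you enter.

```python
# Illustrative grid-to-geodetic azimuth adjustment. The sign convention for the
# convergence varies by projection and software, so treat this as a sketch only.
def grid_to_geodetic_azimuth(grid_azimuth_deg: float, convergence_deg: float) -> float:
    """Apply grid convergence to a grid azimuth (illustrative sign convention)."""
    return (grid_azimuth_deg + convergence_deg) % 360.0

# With a hypothetical convergence of +0.35 degrees near X=530000, Y=2000000:
print(grid_to_geodetic_azimuth(0.0, 0.35))    # source azimuth
print(grid_to_geodetic_azimuth(90.0, 0.35))   # receiver azimuth
```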
There are several remaining items on this dialog that should be addressed:
Bin Digits - the number of digits in the bin (station) portion of the points that will be in the project. Get this wrong and the fields which represent the line portion and station portion of a point (what we call Track and Bin) will be populated incorrectly. Use 4 for this tutorial (see the sketch after this list).
Local Time Offset - offset from GMT. Again, there are two time fields in the database, local and GMT. Get this wrong and your local time will be wrong. Not devastating, but you might as well address it here. Leave this at 0 for this tutorial.
Checkpoints - When QuikView adjusts heights by applying the geoid/ellipsoid separation for a point, any point with this user-entered character string will not have its delta-height from the preplot recalculated. Therefore, the measured differences between monument height and recorded height in the field will be preserved. Checkpoints are also used in other places in QuikView for quality control. Have two strings which represent checkpoints? Enter both and separate them with a comma. We won't concern ourselves with this for our tutorial, so you can use the default.
QC Parameters - Based on user entries, several quality control checks will be automatically created that can be used in QuikView to quickly QC your data during processing. These queries, for example, may reflect the user's desire to isolate GPS code tracking points with undesirable DOP or number of satellites, large offsets, or bad instrument heights. Alternatively, the user may build his own queries using the Build tool button. This launches the Query Building dialog that allows the user to save named queries that will help in the QC process. Remember that it is always possible to build these queries from QuikView at a later time.
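To make the Bin Digits idea concrete, here is a minimal sketch of how a station name splits into its line (Track) and station (Bin) portions. The example station and function are hypothetical; GPSeismic performs this split internally when it populates the Track and Bin fields.

```python
# Split a station name into (track, bin) using the number of digits reserved
# for the bin portion. Example values are made up for illustration.
def split_station(station: str, bin_digits: int = 4) -> tuple[str, str]:
    """Return (track, bin) given the number of trailing digits used for the bin."""
    return station[:-bin_digits], station[-bin_digits:]

print(split_station("10251042", bin_digits=4))   # ('1025', '1042')
```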
When the project is finally created, it will appear in Project Manager's project list. The user is prompted for whether to make the settings current. Note that if the user chooses NO, the settings can be made current at any time. If this is the first project, you obviously make them current.
Before exiting Project Manager, look at all current settings in the right of the display. Glance through these. These are the
current settings for your project.
Note that when you created the project, if you used the default structure, you created several directories, all of which are empty except DATABASE, which contains an empty database (.mdb), a geodetic parameters file (.gdp), a miscellaneous settings file (.set), a history of everything you do for this project (.log), and a file of stranded preplot points (.uhf).
Final thoughts on Project Manager - You should create another project in Project Manager. Once you do, try switching between the two. You do this by clicking the one you want to be current in the list. Choose the menu item, Restore This Project To Current. You will be asked if you want to back up current settings. It's a good idea to answer Yes to this in almost every situation. Doing so will take every parameter from every application and place them in the project's SET file. After you restore the project, you can make any change you want, and when you restore the original project, you will be back to the state at which you left it.
Verifying coordinate transformation - Once a file is imported and transformed, you can verify
coordinate transformation by right clicking on a point. The WGS84 coordinates and local grid
coordinates are displayed. The coordinates should be reasonable and for control points,
there should be a match with published coordinates for this point. By the way, right click with the
Shift key pressed and you will get a modal dialog. Press Esc or click on this dialog to get rid of it.
If the coordinates were imported and transformed correctly, let's append preplots to the database at this point. We only have to do this once (as opposed to QuikView, in which case we will be appending the surveyed data each day). You append the database by selecting File/Database and pressing the Add all... button on this dialog.
At this point, you normally select and populate the controller file. One of the easiest ways to do this is to lasso the points. You lasso by double clicking, then single clicking at points defining the polygon, and then double clicking at the point where you started. You will see a dialog that asks whether to place all points in the collector file or Refine Using SQL. You can send all points to the collector file, or you can send a subset by defining a query. If you press Yes, all points are used to populate the collector file. Select Refine Using SQL.
Refining the contents of a lasso - If you say No, then you will see a spreadsheet of all selected points. If you press the build query tool button (a hammer), you will see the query building dialog. It's beyond the scope of this tutorial to discuss all facets of query building, but if you wanted just sources, you would go to the second tab page, select Station (value), the > symbol for the operator, and type in any number greater than the highest receiver number (try 50000000 here). Then press And Into Criteria. Press the far left tool button on this dialog and the spreadsheet content will reflect the query; finally press the QuikLoad tool button (it looks like a multi-color collector), an action which sends the selected points to the collector file. On the map display, the points in the collector will become green, the points not in the collector, red.
Remember that for making any other upload files on subsequent days, it is only necessary to import the QLD file and select the points desired for the upload file. It is not necessary to transform the coordinates each time. How you actually get the upload file into a particular manufacturer's collector varies. "It is beyond the scope of this presentation" is a polite way of saying: get the manufacturer's manual out now. Some systems use serial comms, but most use cards which can be inserted into the PC and appear as additional hard drives. So uploading is as easy as moving a file using Windows Explorer.
A word about UHF (Upload History) files - The default behavior of QuikLoad is to create a file of station names for all points uploaded. When you import your QLD file the next day, this file is utilized to both color the previously selected stations green (i.e., those in the UHF file) and make them un-selectable for upload. You can change any of this behavior in the Miscellaneous dialog, and a cool thing to do is take advantage of the database and have QuikLoad automatically replace this file with all points that have been surveyed. See the UHF tab page of the Miscellaneous dialog for this option. What you want to do is create a query which isolates all valid Postplot table survey points. Then, QuikLoad will recreate the UHF file each time it is started. It means you will essentially be working with a map of points in which you can discriminate between surveyed and non-surveyed points. Confusing? Yes, at this point of your learning, it is. However, it's quite useful, so if you want to blow this off right now, do so. Just make a note to yourself to visit this feature at a later time.
HI dialog - The next dialog allows you to review and possibly change the instrument heights used by the rovers. What you will see is every HI used for ranges of stations. You can see each range by using the scroll arrows. Any of these can be changed by entering a new HI and pressing Update. The effect of changing the HI is to change the ground adjusted height. Note that this dialog can be displayed by selecting HI Entrys in the File menu at any time during the processing session.
If you have processed this file before and had made some changes to the HI and perhaps some
station name changes, an edit file would have been created in the same directory as the data
collector file. A raw file is never altered, so changes are recorded to this EDT file. If one exists, QuikView will ask at this
point if it should re-apply these edits.
Minimum processing steps - You can do a number of things to QC the data, but minimally, to process the data you have
to:
perform the horizontal transformation
and perform the vertical adjustment.
Note that specific QC methods vary greatly, but some suggested items are:
use the Changes dialog (or try double clicking a point) - this is used for station name, HI, and comment changes. Try making at least one change here. Notice two things. One is that you must press the Update button on this dialog for the change to take effect. The second is that when you type in any change, an update all button appears. Check this and all points will be changed to your entry.
use the Masks dialog which can isolate bad DOPs, precisions, and unit variance. This is the query builder dialog we saw briefly in QuikLoad. This is a complex dialog and you still might be uncomfortable with it. That's all right. Skip this part for now.
Here are some items that are very important; however, if you addressed them when creating a project, you shouldn't have to address them now. If you want to check, use the tool buttons described below:
local time offset, number of digits in bin portion of station, and checkpoint designation. All of these are located on the Miscellaneous dialog.
rx/source azimuths; determination of rx and source. You can enter these on the Offsets dialog, but again, if you created the project correctly, you shouldn't have to change these.
One final required item is to update the database by pressing the update database tool button. You can also create any desired reports or charts.
Final thoughts about QuikView - Note that QuikView NEVER alters the input file. All changes are done in memory, and will indeed be reflected in the database. Also, a small separate file is created with the edits you make. If you open the same file again, you are prompted for whether to re-apply previous edits. Given this, you should play around a bit with the sample file and try changing HIs and the like in order to gain some insight into what you can and can't do to the data. Also, take a look at the utility menu and you will see that there are numerous supported data formats and systems. There is also a generic ASCII import feature that allows you to import practically any file that might be out there.
Preferences - Display the Preferences/Options dialog, but for now don't change anything. Just remember that there are certain options you might want to exercise at a later time and this is where you will go. For example, the dialog contains options that allow you to create a backup each time you start GPSQL.
We can build six template queries by pressing the appropriate tool button in the Define menu and then entering a value that separates receivers from sources numerically (30000000 will do) in the dialog that follows. You will then be asked at which number (1-99) to place the queries. Leave this at 1. What will appear will be six SQL syntax queries that isolate all postplots, source postplots, receiver postplots, and the same for preplots. Select each and look at the SQL syntax and chances are you will gain immediate insight into the language of databases.
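For the curious, the sketch below illustrates the kind of SQL the six template queries contain. The table and field names (POSTPLOT, PREPLOT, STATION) are assumptions made for the example; look at the generated queries in GPSQL to see the exact syntax your database uses.

```python
# Hedged illustration of six template queries split by a receiver/source separator.
SEPARATOR = 30000000   # value that separates receivers from sources numerically

template_queries = {
    "All postplots":      "SELECT * FROM POSTPLOT",
    "Source postplots":   f"SELECT * FROM POSTPLOT WHERE STATION > {SEPARATOR}",
    "Receiver postplots": f"SELECT * FROM POSTPLOT WHERE STATION <= {SEPARATOR}",
    "All preplots":       "SELECT * FROM PREPLOT",
    "Source preplots":    f"SELECT * FROM PREPLOT WHERE STATION > {SEPARATOR}",
    "Receiver preplots":  f"SELECT * FROM PREPLOT WHERE STATION <= {SEPARATOR}",
}
for name, sql in template_queries.items():
    print(f"{name}: {sql}")
```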
Note that we could have built our six template queries using the project queries button as
well. We would have obtained the same queries but the descriptions would have been clearer in the sense that they
would have referred to source and receiver. This was made possible by your choices earlier in Project Manager.
Here are some things you can do with the currently selected query:
Press one of the eye icons in the QueryA menu to display the records in a spreadsheet. There are two spreadsheet viewers. The classic has been around for a while and will disappear in a future release. The standard viewer has superior display capabilities. The classic, on the other hand, is a bit faster to load and allows the user to edit cells in one step. The standard viewer requires an edit followed by a save.
Press the custom report icon to create a custom report. The custom report builder will display all fields defined by the query. You drag these from the left side to the right (output) side. Play with the various formatting options and press the small eye icon to get a preview of the first 100 records in the format you specified. When you get it the way you want it, you can save all records to a file. Yes, there are many formatting options, but that gives you the flexibility to create anything the client might desire.
For practice, get several fields including easting, northing, latitude and longitude into the rightmost list box. Highlight the easting and then the northing fields and use the precision selection to flag the currently selected field, which annotates the field with a /p2. Select the lat/longs and use the DMS.ssss selection, which annotates the field with a /d2. These annotations will be used to tell GPSQL to write the fields to a certain precision and/or format.
On the Format tab page, you can determine whether the file will be fixed card column or delimited. You can also specify variable card columns. Experiment as much as you want here and eventually view the report by pressing the eye tool button. This action displays the first 100 records so you can see if it is what you had in mind. Finally, use the Save or Save/Append File menu items to actually save a file to disk. Remember, you can't hurt anything if the format is wrong, you'll just create bad reports. You may save this format with a name by pressing the save format tool button, and re-use the saved format in a report with another selected query.
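If the fixed card column versus delimited distinction is new to you, the short sketch below contrasts the two and shows the effect of a two-decimal precision such as the /p2 annotation described above. The field widths and record values are hypothetical.

```python
# Contrast fixed card-column output with delimited output for one record.
station, easting, northing = "10251042", 530123.4567, 2000456.7891

# Fixed card columns: every field starts at a known column and has a set width.
fixed = f"{station:<12}{easting:>12.2f}{northing:>13.2f}"

# Delimited: fields separated by a chosen character, here a comma.
delimited = f"{station},{easting:.2f},{northing:.2f}"

print(fixed)       # station padded to 12 columns, coordinates right-justified to 2 decimals
print(delimited)   # 10251042,530123.46,2000456.79
```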
Press the seismic coordinate file icon to create a seismic coordinate file. When you initiate this routine, you are first prompted for the name of the file. Then the field selection dialog will be displayed.
The field selection dialog comes up often, so let's discuss it. In the case of seismic files (and other outputs), the required fields are well known. For seismic files, the fields include point name (aka, station), local coordinates (grid and geographic), local height and an optional descriptor. These are exactly what you will see as the selected default fields in the field selection dialog. Can you change them? Yes. Should you change them? Not normally. But the flexibility is there to do so.
What about the other options? Normally, you won't need to use them, but you could use them if you had to. For example, there is one option that takes any points with the same name and produces one record with average coordinates.
The next dialog will be the coordinate system dialog. Why? The fields from the
database do not need to be transformed. They are already in the system/datum
they should be. The reason for this dialog is simply to create an appropriate
header should you want to. That is an option you get on the final dialog of this
process.
The final dialog is the standard GPSeismic ASCII export dialog. As with the import dialog, if
this is a new installation, the first thing you want to do is press the Restore Standards button
so that several industry standard formats become available for selection. Again, as with the
import dialog, spend some time here to look over your options. Use the Preview Button to
examine what a record will look like. Also note that you can include a header by checking the appropriate box and, if necessary, you can type into the header textbox. Any change is remembered the next time you display this dialog. Also note that the header created by GPSeismic does not adhere to any standard (such as SEG, since that standard itself has a nebulous definition).
Other Query Actions - To be sure, there are other things you can do with a selected query (well over forty). This includes immediately mapping the results using QuikMap, making a DXF file, and graphing up to six fields. Displaying and reporting are the most important, which is why we took a closer look at them. Note that anything you do with a query involves displaying or reporting the data, not modifying it. So feel free to look at any or all of these actions if you want.
Building Queries
Let's change direction now and build a query. At this point, select the seventh query, which is currently undefined. Then press the build query button in the Define menu, which displays the query builder dialog.
Building queries - When the query builder dialog is displayed, click once on the POSTPLOT table. On this first tab page, you can change the way the records are ordered, the default being the first field in the table. You can select specific fields for display, but don't select any at this point. Note that by selecting View in the Query menu you can display the results of your query. At this point, it is all POSTPLOT records ordered by Station (text).
Tip - A better field to order by is actually Station (value). The Station (text) field is just that, text. The ordering does not normally give you what you expect.
Delete the criteria above (highlight the text and press delete) and then select Station (text), Like for the operator, and '*carito*' (without the single quotes) as the value. Press And Into Criteria and then display the results. The significance of the asterisks is that they represent wild cards. Essentially you are saying to return everything that has the string carito in it. Note that with regard to text, GPSQL is not case sensitive.
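The sketch below imitates what that wildcard criterion does. Note that the MDB (Jet/Access) SQL dialect behind GPSQL queries uses * as its wildcard where standard SQL would use %; the station names in the example are made up.

```python
# Case-insensitive wildcard matching, analogous to:  Station (text) Like '*carito*'
import fnmatch

stations = ["CARITO_1001", "carito_1002", "NORTH_2001"]
matches = [s for s in stations if fnmatch.fnmatch(s.lower(), "*carito*")]
print(matches)   # ['CARITO_1001', 'carito_1002']
```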
In order to copy the query you have built to query 7 that is currently undefined, either press the top left tool button or
make sure 7 is selected from the dropdown in the toolbar and press the arrow to the left.
On the main display, you will see the actual syntax of the query displayed. In list item number 7 at
left, type in a name for the query. You can now click on another query and back to this one and see
that it has been automatically saved.
To edit cells in the spreadsheet viewer, press the edit tool button. This action temporarily modifies the table to include a primary index and places the grid in edit mode. Once edits are made, press the save tool button.
Final Thoughts On GPSQL - There is a Backup feature that creates a copy of the database at startup. This is a good
feature to turn on. Make sure you make a backup on a floppy, zip, CD or other media as often as you can.
There are many more powerful query building and database modification features we have left untouched, but the ones described are required and will handle the majority of data management requirements. Recent additions include Query Builder helpers, which allow the standard query building dialog to be expanded to reveal neat helper functions for dealing with strings, values and dates, and Aliases, which allow the user to create fields based on an expression which could involve other fields. Aliases don't really exist in the database, but are nonetheless available for display, reports, etc.
The image has been removed here to show clearly the effects of the auto-offset function with the parameters we used. Although we didn't do it here, we can limit the offsetting to inline or crossline, or constrain it to a swath through the use of grid definition files.
As with any operation, you can select the File menu item, Outputs in order to display
the Outputs dialog. From here we can send primary, hit or any other category of points
to a coordinate file, DXF file, or SHP file. We could also save the current search zones
as a SHP file or DXF file.
A few final thoughts on exclusion zones
We imported an existing exclusion zone (.xzo), but how would we make such a file? There are two ways: 1) We can
import ASCII coordinate files which represent features such as wells, and in this point mode we would specify a radius to
use. We can also import a file of points that represent points along a line and specify a minimum distance from the line.
We can also import a file of points that represent a polygon. Each of these searches is initiated by selecting the Begin
Search item from the Search menu. 2) We can dynamically draw these types of features. This is initiated by selecting the
Dynamic Search item from the search menu. All exclusion zones, no matter what their origin was, can be saved to an
xzo file.
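To visualize the three kinds of exclusion-zone geometry just described, here is a small sketch using the shapely library purely as an illustration; GPSeismic builds these zones internally from the imported ASCII files or from dynamic drawing, and the coordinates, radius and buffer distance below are hypothetical.

```python
# Three exclusion-zone geometries: points with a radius, a line with a minimum
# distance, and a polygon. A preplot point is excluded if it falls in any of them.
from shapely.geometry import Point, LineString, Polygon

wells    = [Point(531000, 2001000).buffer(200)]                              # point mode: 200 m radius
pipeline = LineString([(530000, 2000000), (532000, 2000500)]).buffer(100)    # 100 m either side of the line
lease    = Polygon([(530500, 2000200), (531500, 2000200),
                    (531500, 2001200), (530500, 2001200)])                   # polygon mode

preplot = Point(531050, 2001020)
excluded = (any(w.contains(preplot) for w in wells)
            or pipeline.contains(preplot)
            or lease.contains(preplot))
print(excluded)   # True: the point falls inside the well radius
```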
Also note that exclusion zones can be made with a specific color and hatching as well as a string attribute. Try double-right-clicking in an exclusion zone and you will display a dialog allowing you to set these features.
Comparing files - Here's a simple QuikMap tutorial demonstrating comparisons:
You will need the following files:
\GPSeismic\Sample Data\Preplots\GPSeismic_postplot.sp1
\GPSeismic\Sample Data\Preplots\GPSeismic_sr_preplots.sp1
First, choose Edit/Clear All if you have done anything previously in the application. From the File menu, open the
GPSeismic_sr_preplots.sp1 as the initial file. Choose SEG P1 grid for import type.
Second, open GPSeismic_postplot.sp1 as the secondary file.
Third, go to the File menu and select the Compare/Offsets menu item. Zoom in to a region where there
are a significant number of offsets. Preplot points for which a match has been found are red and for
points with no match, green. Blue points here represent the postplot points in the secondary layer. Arcs
connect preplots to postplots and can be changed to straight lines in the Miscellaneous dialog.
From the File menu, display the Outputs dialog by selecting the Output item. This time, notice that in addition to saving
points in several ways, the connecting arcs can be saved as SHP or DXF files. Also, the Reports tool button is enabled.
Press this then select Yes when prompted about displaying matches only. In a few seconds, a spreadsheet will be
displayed with the results of the comparison. The spreadsheet will display offsets expressed in several different ways.
From here you can save a custom report if required.
Note that as with our previous exercise, inline and crossline azimuths imply a reference azimuth. For comparisons, this
happens to be the azimuth displayed in the status panel. Since our sources are oriented at grid north, the default value of
0 was correct.
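For reference, the sketch below shows how a reference azimuth turns a plain easting/northing offset into inline/crossline components. The sign convention is one common choice and should be checked against the values QuikMap reports; with the reference azimuth at 0 (grid north, as in this exercise), inline is simply the northing difference and crossline the easting difference.

```python
import math

def inline_crossline(de: float, dn: float, ref_azimuth_deg: float) -> tuple[float, float]:
    """Rotate an (easting, northing) offset into a line's reference frame."""
    a = math.radians(ref_azimuth_deg)            # azimuth clockwise from grid north
    inline = dn * math.cos(a) + de * math.sin(a)
    crossline = de * math.cos(a) - dn * math.sin(a)
    return inline, crossline

print(inline_crossline(de=2.5, dn=-1.0, ref_azimuth_deg=0.0))   # (-1.0, 2.5)
```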
Moving points - Here's a simple QuikMap tutorial demonstrating the use of grid definition files in moving points:
You will need the following files:
\GPSeismic\Sample Data\Preplots\GPSeismic_sr_preplots.sp1
\GPSeismic\Sample Data\Preplots\GPSeismic_sr_grd_definition.grd
First, choose Edit/Clear All if you have done anything previously in the application. From the File menu, open the
GPSeismic_sr_preplots.sp1 as the initial file. Choose SEG P1 grid for import type.
Next, use the File menu item, Open Grid Definition File to select and open the
GPSeismic_sr_grd_definition.grd file. A dialog will be shown that displays the parameters of the grid
definition file. These are all fine for this exercise, so press OK. Zoom in a bit and you will be able to see the
depiction of the bins with respect to the points.
Carefully draw a polygon around some points by first double clicking with the mouse, then single clicking, an action that draws the sides of the encompassing polygon for each mouse click. When you get very close to the place where you started, right click to display a popup menu with a number of things you can do with the points inside the polygon you have described. Among the various choices you have in performing an action on the points in the polygon, Group Traverse is the one we will select. Then we will right click again to display the popup with that selection made and select Close polygon and process data.
Now, a dialog will be displayed that allows you to move the enclosed points based on several offsetting strategies. Let's choose Delta Inline/Crossline and enter 120 for the Crossline value. We will also check on Compute station number and Create at bin center.
By making these last two choices, we will ensure that each point is moved exactly to the center of the bin in which it falls (even if we were a bit off in our entered offset value) and that the station will be renumbered according to the bin it falls in.
Note that as with our previous exercise, Delta inline and crossline imply a reference
azimuth. For group traverses, this happens to be the azimuth displayed in the status
panel. Since our sources are oriented at grid north, the default value of 0 was
correct.
When we press OK to the above dialog, the points will be moved to their new bin
locations.
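The Create at bin center behavior amounts to snapping each moved point to the center of the bin it lands in. A minimal sketch of that idea for an unrotated grid follows; the origin and 30 m bin sizes are hypothetical, and in QuikMap the real values (including any rotation) come from the grid definition file.

```python
# Snap a point to the center of the bin it falls in, for an unrotated grid.
def snap_to_bin_center(x, y, origin_x, origin_y, bin_dx, bin_dy):
    """Return the center of the bin containing (x, y)."""
    col = int((x - origin_x) // bin_dx)
    row = int((y - origin_y) // bin_dy)
    return (origin_x + (col + 0.5) * bin_dx,
            origin_y + (row + 0.5) * bin_dy)

# A point nudged ~120 m crossline but slightly off lands exactly on a bin center:
print(snap_to_bin_center(530118.7, 2000263.2, 530000.0, 2000000.0, 30.0, 30.0))
```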
On the left side of the dialog, in the Editing Levels group box, check every box except zenith angle. Press OK. Once this is done, press the eye tool button in the group box. Cells containing information that exceeds the levels will turn red. Go to the Errors tab page and press the eye button there. A list of all errors will be generated with the row they appear on. Click on one and it will scroll to that row.
Select "Open Qcn file' from File menu. Select the QCN file created above.
A spreadsheet of the processed traverse will appear along with a map
visualization and a height profile. You will notice in traverse data that
there are over forty fields, many blank and there is no MSL adjustment,
no SF applied and 0 split values (even though they appeared in DCO
editor earlier. These are all processing settings that you can elect to turn
on from the Settings tab page. For example, on the Settings tab page,
check the split errors check boxes and then press OK. Process the
Sample.Qcn file again. This time, you will see the applied splits in traverse
data. You should experiment with other processing settings. If you turn on
scale factoring, make sure to select UTM Zone 15N.
Eventually, on the Settings page, select 'use any found in QCC' in the Fixed BS Azimuths
group box. Process the Sample.qcn file again and you will see in the traverse data that
the backsight azimuth of one of the turnpoints was held fixed. The difference between
the computed and fixed azimuth is extremely large for demonstration purposes. You
might want to take a look at the QCC control file again so that you can see why this point
was held fixed by QuikCon.
Try selecting the File menu item, 'Prorate Azimuths'. The routine does exactly what the menu item says. Any computed/known azimuth difference is prorated backwards through the traverse. In the traverse data, you can see all coordinates and azimuths change. You will note that the Fixed Az Delta is zero because, by prorating, by the time this point is processed, the fixed backsight matches the computed azimuth.
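Conceptually, prorating distributes the azimuth misclosure in increments back through each setup so that the final computed backsight matches the fixed azimuth. A hedged sketch with made-up numbers follows; QuikCon's exact weighting may differ.

```python
# Prorate an azimuth misclosure back through a traverse (illustrative values).
computed_azimuths = [45.010, 46.520, 44.995, 45.480]   # one per setup, degrees
fixed_closing_azimuth = 45.400                          # known azimuth at the last setup

misclosure = fixed_closing_azimuth - computed_azimuths[-1]
n = len(computed_azimuths)
prorated = [az + misclosure * (i + 1) / n for i, az in enumerate(computed_azimuths)]

print(f"misclosure = {misclosure:+.3f} deg")
print([round(a, 3) for a in prorated])   # last value now equals the fixed azimuth
```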
You can go further by actually adjusting the coordinates. This can be done by holding
points fixed in the Control File or doing that individually from the File menu. Eventually,
you will want to display the Export tab page, specify the output type and preferences, and
create a report.
General
If using Leica, when selecting the GPS system in QuikLoad and QuikView,
remember that there are several systems and you must choose the
appropriate one from the list that you can drop down. One of the most
frequent technical support calls we get for Leica users is that they cannot
properly import files. This is often traced to the fact that they simply have to
select the system they are working with.
File format - Leica data collector files are not like any other. A data collector file is actually many files with distinct names and extensions like X01, X02, etc. All the files that comprise the data collector file are given the name GeoDb or DBX, which stands for geodetic database. Within this GeoDb, there might exist one or more jobs. Leica GeoDbs are typically moved from data collector to computer using flash RAM cards. Once inserted in a computer, a flash card simply becomes another drive. The GPSeismic programs QuikLoad and QuikView have a Transfer GeoDB File menu item which allows the user to move GeoDb jobs from folder to folder. The dialog in QuikLoad is used to select the folder where the GeoDb is placed and the job created. In QuikView, the same dialog is used to navigate to the folder the GeoDb is in and select the job to be imported.
QuikLoad Uploading
The next dialog will display what jobs are currently in the directory.
We could select one or create a new one. Here we are going to create
a new job called MyJob. You might or might not see a final dialog
which has to do with the creation of a Leica transformation set. This
is information which allows the WGS84 coordinates in the data
collector to be viewed in local grid coordinates.
QuikView Downloading
One important setting to address before importing files with QuikView is whether those files have preplot coordinates in them in addition to the surveyed coordinates. Some systems do have these and some don't. For Leica 1200 data, we normally have to specify a preplot QLD file. In this manner, we can see both preplot AND postplot points and generate offsets between the two. You specify a QLD file on the Processing Settings tab page of the Miscellaneous dialog.
After the preplot file is specified (which only has to be done once),
select Open Data Collector file on the File menu. As with QuikLoad,
you will first be prompted for a directory where the jobs exist.
Navigate to the desired folder. Then you will see a dialog displaying
the jobs that are in the folder. Check the desired job and press OK.