@@ -11,8 +11,11 @@ Introduction
functional units and should be easier to run and easier to interpret.

Some properly installed and fully functional PostgreSQL installations
- can fail these regression tests due to artifacts of the genetic optimizer.
- See the v6.1-specific release notes in this document for further details.
+ can fail these regression tests due to artifacts of floating point
+ representation and time zone support. The current tests are evaluated
+ using a simple "diff" algorithm, and so are sensitive to small system
+ differences. For apparently failed tests, examining the differences
+ may reveal that they are not significant.
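The "simple diff" evaluation described above can be sketched as follows. This is an illustrative Python sketch, not the actual test driver; the file names and contents are hypothetical stand-ins for an expected/results pair.

```python
# Sketch: any textual difference fails a test, even one that is not
# significant.  These strings are hypothetical stand-ins for the
# contents of an expected/results file pair.
import difflib

expected = ["four|4.000000000000\n"]
actual   = ["four|4.000000000001\n"]

delta = list(difflib.unified_diff(expected, actual,
                                  fromfile="expected/float8.out",
                                  tofile="results/float8.out"))
print("failed" if delta else "ok")
# Inspecting the delta shows the difference is one unit in the twelfth
# decimal place, i.e. not significant.
for line in delta:
    print(line, end="")
```

Examining the printed difference by eye, rather than trusting the pass/fail verdict alone, is exactly the manual step the paragraph above recommends.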

Preparation

@@ -58,17 +61,17 @@ Running the regression test

make all runtest

- Normally, the regression test should be run as the pg_superuser as the
- 'src/test/regress' directory and sub-directories are owned by the
+ Normally, the regression test should be run as the pg_superuser since
+ the 'src/test/regress' directory and sub-directories are owned by the
pg_superuser. If you run the regression test as another user, the
'src/test/regress' directory tree should be writable by that user.

Comparing expected/actual output

- The results are in the files in the ./results directory. These
- results can be compared with results in the ./expected directory
- using 'diff'. The files might not compare exactly. The following
- paragraphs attempt to explain the differences.
+ The results are in files in the ./results directory. These results
+ can be compared with results in the ./expected directory using 'diff'.
+ The files might not compare exactly. The following paragraphs attempt
+ to explain the differences.
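The comparison step just described can be sketched in Python. The directory layout and the sample file are illustrative only; the real files live under src/test/regress.

```python
# Sketch of comparing ./results against ./expected file by file.
# The temporary directories and "sample.out" are hypothetical stand-ins.
import os
import tempfile

base = tempfile.mkdtemp()
for d in ("expected", "results"):
    os.mkdir(os.path.join(base, d))
with open(os.path.join(base, "expected", "sample.out"), "w") as f:
    f.write("1|one\n2|two\n")
with open(os.path.join(base, "results", "sample.out"), "w") as f:
    f.write("1|one\n2|two\n")

# Each results file is compared against its expected counterpart;
# an exact match is "ok", anything else is "failed".
for name in sorted(os.listdir(os.path.join(base, "results"))):
    with open(os.path.join(base, "expected", name)) as f:
        want = f.readlines()
    with open(os.path.join(base, "results", name)) as f:
        got = f.readlines()
    verdict = "ok" if want == got else "failed"
    print(name, "..", verdict)
```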

OID differences

@@ -94,6 +97,9 @@ DATE/TIME differences
most of the date and time results will reflect your local time zone and
will fail the regression testing.

+ There appear to be some systems which do not accept the same syntax for
+ setting the local time zone.
+
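The time-zone dependence can be seen with a small Python sketch. Note the assumptions: time.tzset() is only available on Unix-like systems, and PST8PDT is the zone these tests expect; a system that rejects that syntax may want a zoneinfo name such as "America/Los_Angeles" instead.

```python
# Sketch: date/time output depends on the process's local time zone.
# "PST8PDT" is the POSIX-style spelling assumed here; some systems
# only accept zoneinfo names instead (an assumption worth checking).
import os
import time

os.environ["TZ"] = "PST8PDT"
time.tzset()                  # Unix-only; re-reads TZ for this process
t = time.localtime(0)         # the epoch, 1970-01-01 00:00:00 UTC
print(t.tm_year, t.tm_hour)   # UTC-8 puts this at 1969-12-31 16:00 PST
```

Run in a different zone, the same timestamp formats differently, which is why date/time tests fail outside the expected zone.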
FLOATING POINT differences

Some of the tests involve computing 64-bit (FLOAT8) numbers from table
@@ -107,6 +113,9 @@ FLOATING POINT differences
of these differences which are usually 10 places to the right of
the decimal point.

+ Some systems signal errors from pow() and exp() differently from
+ the mechanism expected by the current Postgres code.
+
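The signaling difference can be illustrated from Python, which normalizes whatever the C library reports (errno, a HUGE_VAL return, or a floating point signal) into an exception. This is a sketch of the general phenomenon, not the Postgres code path.

```python
# Sketch: C libraries signal pow()/exp() overflow in different ways;
# Python's math module converts the C library's report into an
# OverflowError, so the overflow is observable portably here.
import math

try:
    math.exp(10000.0)        # far beyond the FLOAT8 (~1e308) range
    signalled = None
except OverflowError as exc:
    signalled = type(exc).__name__
print("overflow signalled as:", signalled)
```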
POLYGON differences

Several of the tests involve operations on geographic data about the
@@ -184,6 +193,17 @@ Current release notes (Thomas.Lockhart@jpl.nasa.gov)
to differences in implementations of pow() and exp() and the signaling
mechanisms used for overflow and underflow conditions.

- The "random" results in the random test do not seem to produce random
- results on my test machine (Linux/gcc/i686).
+ The "random" results should cause the "random" test to be reported as
+ "failed", since the regression tests are evaluated using a simple diff.
+ However, "random" does not seem to produce random results on my test
+ machine (Linux/gcc/i686).
+
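Why a genuinely random test must diff as "failed" can be sketched as follows; the two seeded generators are a stand-in for two independent test runs, chosen so the sketch is deterministic.

```python
# Sketch: a test that emits random values produces different output on
# each run, so a byte-for-byte diff against a stored expected file
# reports "failed" even when the test behaves correctly.  The two
# seeds stand in for two independent runs of the test.
import random

rng1, rng2 = random.Random(1), random.Random(2)
run1 = [rng1.random() for _ in range(3)]
run2 = [rng2.random() for _ in range(3)]
print("diff verdict:", "ok" if run1 == run2 else "failed")
```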
+ Sample timing results
+
+ Timing under Linux 2.0.27 seems to have a roughly 5% variation from run
+ to run, presumably due to the timing vagaries of multitasking systems.

+ Time    System
+ 06:12   Pentium Pro 180, 32MB, Linux 2.0.27, gcc 2.7.2 -O2 -m486
+ 12:06   P-100, 48MB, Linux 2.0.29, gcc
+ 39:58   Sparc IPC 32MB, Solaris 2.5, gcc 2.7.2.1 -O -g