Release Notes For Tivoli Workload Scheduler, Version 8.5
To download the appropriate package for your operating system, see the Tivoli Workload
Scheduler download page.
For detailed system requirements for all operating systems, see the Detailed System
Requirements page.
To access the Tivoli Workload Scheduler documentation, see the online Information Center
Installation improvements
Variable tables
Workload Service Assurance
Problem determination and troubleshooting enhancements
Enhanced integration features
New Tivoli Dynamic Workload Console features
What has changed in the Tivoli Workload Automation library
Installation improvements
Common shared architecture for Tivoli Workload Scheduler and Tivoli Dynamic
Workload Console
From this version, Tivoli Workload Scheduler and Tivoli Dynamic Workload Console can
be installed together as an integrated product. This gives the following benefits:
Automatic response file generation during the Tivoli Workload Scheduler and Tivoli
Dynamic Workload Console installation and upgrade processes
You have the option, during the installation process, to require that a response file is
generated with the selections made during the process. The generated response file
can then be used to clone the installation on a number of other systems, greatly
simplifying the product deployment process.
Addition of diagnosis and resume of a failed Tivoli Dynamic Workload Console
installation
As with Tivoli Workload Scheduler, the Tivoli Dynamic Workload Console installation
can automatically detect a number of specific failure points, so that you can correct the
problem and rerun the steps that failed.
Fix pack rollback function for Tivoli Dynamic Workload Console
Automatic upgrading from previous versions is possible. In addition, a rollback
function is available when a fix pack is applied.
Variable tables
Variable tables are new scheduling objects that can help you reuse job and job stream
definitions.
Workload Service Assurance helps job scheduling personnel meet their service level
agreements. Using Workload Service Assurance, job schedulers can flag mission-critical jobs
for privileged processing throughout the scheduling process. At plan run time the Workload
Service Assurance components, Time Planner and Plan Monitor, dynamically monitor whether
the conditions for the timely completion of mission-critical jobs are met, and automatically
promote these jobs or their predecessors if they are late.
Using Tivoli Dynamic Workload Console views, operators can monitor the progress of the
critical network, find out about current and potential problems, release dependencies, and
rerun jobs.
Tivoli Dynamic Workload Console version 8.5 includes modeling and topology functions.
You can now use a single graphical user interface to model and monitor your workload
environment.
A Workload Designer was added for modeling, covering all the use cases related to creating
and maintaining the definition of the automation workload, such as jobs, job streams,
resources, variable tables, workstation classes, and their dependencies.
You can now use the Tivoli Dynamic Workload Console to define Tivoli Workload
Scheduler topology objects, covering all the use cases related to the definition and
maintenance of the scheduling environment, such as workstations and domains.
What has changed in the IBM Tivoli Workload Automation version 8.5 publications
These are the changes made to the Tivoli Workload Automation publications, which are
divided into a series of libraries:
The following summarizes the changes made to the Tivoli Workload Automation
cross-product publications:
Publications
New publication. It lists, with a brief description, all publications in the Tivoli
Workload Automation library.
Glossary
New publication. The product glossary that was previously included at the back of all
the publications is now in a separate publication.
Download documents, System Requirements documents, and Release Notes
documents
Improvements have been made to the layout and more content has been added.
Tivoli Workload Scheduler distributed library
The following summarizes the changes made to the Tivoli Workload Scheduler
distributed library:
The manual is now in two parts, a user's guide, which includes conceptual information
and describes user scenarios, followed by a reference to the commands and utilities.
All information relating to the installation, upgrade, and uninstallation, and the
troubleshooting of these activities, has been moved to the
Installation Guide
All information relating to the configuration or administration of the Tivoli
Dynamic Workload Console and the embedded WebSphere Application
Server has been moved to the Administration Guide
All information relating to the troubleshooting of the Tivoli Dynamic
Workload Console and the embedded WebSphere Application Server has been
moved to the Troubleshooting Guide
For full details of what is new, see the IBM Tivoli Workload Automation Overview
Interoperability tables
Support at level of older product or component: For all products and components
described in this section, the level of support is at that of the older product or component.
In the tables in this section the following acronyms are used:
MDM 8.5      Y Y Y
DM 8.5       Y Y Y Y Y
Agent 8.5    Y Y Y Y Y
Tivoli Dynamic Workload Console: compatibility
TDWC 8.5
    MDM/DM/FTA 8.5: Y
    MDM/DM/FTA 8.4: Y, with Fix Pack 2 (or later fix packs) installed on the Tivoli
    Workload Scheduler V8.4 component.
    MDM/DM/FTA 8.3: Y, with Fix Pack 5 (or later fix packs) installed on the Tivoli
    Workload Scheduler V8.3 component.
TDWC 8.4, with or without any fix packs
    MDM/DM/FTA 8.4: Y, with or without any fix packs installed on the Tivoli
    Workload Scheduler V8.4 component.
TDWC 8.3
    MDM/DM/FTA 8.3: Y, with Fix Pack 2 (or later fix packs) installed on the Tivoli
    Workload Scheduler V8.3 component.
Note:
The indicated fix pack levels are the minimum required. You are strongly
recommended to maintain all Tivoli Workload Scheduler components at the latest fix
pack level.
Tivoli Workload Scheduler Job Scheduling Console: compatibility
            MDM/DM/FTA  MDM/DM/FTA  MDM/DM/FTA  z/OS Conn  z/OS Controller
            8.5         8.4         8.3         8.3        8.3
JSC 8.4     Y (1)       Y           Y (3)       Y (4)      Y (5)
JSC 8.3     Y (1)       Y (2)       Y           Y          Y (5)
Notes
1. The list below summarizes the behavior and limitations of using the Job Scheduling
Console V8.4 or V8.3 with a master domain manager V8.5:
o Limitations in using variable tables:
You cannot manage variable tables.
You can manage only the parameters defined in the default variable
table.
You cannot create parameters when the default variable table is
locked.
You can perform only the browse action on objects that address
variable tables. For example, to modify the properties of a job stream
that references a variable table you must use the command line or the
Tivoli Dynamic Workload Console.
During the submission, you cannot change the properties of a job
stream that uses a variable table.
You can perform only the browse action on jobs that are flagged as
critical.
o Limitations in using the integration with Tivoli Dynamic Workload Broker:
Both in the database and in the plan, you cannot perform any
operations on workstations defined as broker workstation; you can
only browse them. They are shown as standard agent workstations. In
particular, the Create Another function, although not disabled, must
not be used, because it does not produce a copy of the original broker
workstation definition but it produces a standard agent workstation
definition.
Both in the database and in the plan, you cannot perform any
operations on jobs that manage Tivoli Dynamic Workload Broker
jobs; you can only browse them. They are shown as standard jobs. In
particular, the Create Another function, although not disabled, must
not be used, because it does not produce a copy of the original job
definition but a job definition without the workload broker
specifications.
2. For firewall support, Java™ Virtual Machine 1.4.2 SR9 must be installed on the
computer where the Job Scheduling Console is installed
3. For firewall support, Fix PK47309 must be installed on the Tivoli Workload
Scheduler component
4. For firewall support, Fix Pack 3 of the z/OS Connector 8.3 must be installed
5. With fix PK41611 installed on the z/OS controller
6. If you have applied the standard agent integration PTFs (UK17981, UK18052,
UK18053) you must also install fix PK33565 on the z/OS Controller
                                MDM/DM/FTA                     JSC
                                8.5, 8.4, 8.3, 8.2.1, 8.2.0    8.4, 8.3, 8.2.1
TWS for Apps
8.4, 8.3, 8.2.1, 8.2.0          Y                              Y
Note:
It is recommended that you upgrade Tivoli Workload Scheduler for Applications to
the latest fix pack version before upgrading the Tivoli Workload Scheduler engine to
version 8.5. This applies to all Tivoli Workload Scheduler for Applications versions
up to version 8.4. If you have already upgraded the Tivoli Workload Scheduler engine
to version 8.5 and now need to upgrade any version of Tivoli Workload Scheduler for
Applications up to version 8.4, refer to Tech Note 1384969.
On the installation window some buttons seem to disappear if you press Next and then
Back
Workaround:
When you click Back, the two buttons do not disappear; they are just moved a little
further down in the window. You must scroll down to see the buttons again.
Parentheses () are not permitted in the Tivoli Workload Scheduler installation path
When installing Tivoli Workload Scheduler, you cannot specify parentheses in the
installation path field.
Workaround:
Rerun the installation, using the correct value for the password.
On Red Hat Enterprise Linux V5.0 the automount feature does not work
On Red Hat Enterprise Linux V5.0 after inserting the DVD and double-clicking on
the desktop icon, the following message is displayed:
This is because the automount feature that mounts the DVD does so with the option
-noexec, which is not compatible with the way Tivoli Workload Scheduler needs to
use the DVD.
Workaround:
To solve the issue, umount the DVD, and manually remount it, using the following
command:
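The exact command depends on your system; the following is only an illustrative sketch,
in which the /dev/cdrom device and /media/cdrom mount point are assumptions rather
than values taken from these release notes:
umount /media/cdrom
mount -o ro,exec /dev/cdrom /media/cdrom
The exec option overrides the noexec restriction applied by the automount feature.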
Two installations cannot coexist on the same workstation if they have the same
TWSuser name, where one is a local user account and the other is a domain user
account.
Workaround:
Upgrade notes
Upgrade procedure for platforms out of support
To upgrade master domain managers or backup masters running on platforms that are
no longer supported by version 8.5 (such as AIX® 32 bit kernel), run the parallel
upgrade procedure on a supported system. Upgrade procedures are documented in the
IBM Tivoli Workload Scheduler: Planning and Installation Guide
When performing a direct upgrade from version 8.2 to version 8.5, the tokensrv
process must not be stopped
When performing a direct upgrade from Tivoli Workload Scheduler version 8.2 to
version 8.5, in a Windows® environment, the token server process must be active for
the automatic data migration steps to complete successfully.
For a detailed description of the problem and the workaround to apply, refer to the
technote TWS incorrectly installs licenses of some other products
The connection does not work due to a missing variable in the configuration of the
embedded WebSphere Application Server on Tivoli Workload Scheduler.
If you run Tivoli Workload Scheduler version 8.4 (where Custom user registry is the
default configuration value on Linux), or if you use PAM authentication, you might
experience these connectivity problems with Tivoli Dynamic Workload Console
version 8.5.
To check if the Tivoli Workload Scheduler version 8.4 engine on the Linux system is
configured based on the Custom user registry, do the following on the master domain
manager of the Tivoli Workload Scheduler version 8.4 engine:
check whether the configuration contains the realm=LocalOSServerREALM value.
File eif.templ is recreated when migrating from version 8.4 GA to version 8.5
When Tivoli Workload Scheduler version 8.4.0 (the General Availability version
without the addition of any fix pack) is migrated to version 8.5, before the embedded
WebSphere Application Server is restarted, the file
<TWS_home>/appserver/profiles/twsprofile/temp/TWS/EIFListener/eif.templ is removed
and replaced with a new one.
This implies that, if you had changed the value of the BuffEvtmaxSize property in this
file, your changes are lost.
If this happens, you must set the value of this property again in the new version of the
file. The section Managing the event processor in the IBM Tivoli Workload
Scheduler: Administration Guide documents how to do this. (38971)
Note that the new copy of the file is created in the new path, which is:
<TWA_home>/eWas/profiles/twaprofile/temp/TWS/EIFListener/eif.templ
Workaround
If you need to deploy large numbers of event rules, there are two measures you can
take to improve performance:
Increase the Java heap size of the application server, as described in the
Performance chapter of the Administration Guide. The point at
which you should increase your heap size is difficult to quantify, but consider a
deployment of several thousand rules as being at risk of an out-of-memory
failure; see the example below.
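A minimal sketch of this measure, assuming the heap is raised by passing a larger -Xmx
value through the genericJvmArguments attribute in the server.xml file referenced
elsewhere in these notes (the 1024 MB figure is only an illustration, not a recommended
value):
genericJvmArguments="-Xmx1024m"
Restart the embedded WebSphere Application Server after changing the value.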
The absolute keyword specifies that the start date is based on the calendar day rather
than on the production day. (41192)
When using the Legacy ID and Start of Day Evaluation, dependencies are not handled
correctly
Workaround
None. (40729)
The deploy (D) flag is not set on workstations after the ResetPlan command
This is not a problem that affects the processing of events but just the visualization of
the flag which indicates that the event configuration file has been received at the
workstation.
Workaround
You can choose to do nothing, because the situation will be normalized the next time
that the event processor sends an event configuration file to the workstation.
Alternatively, if you want to take a positive action to resolve the problem, do the
following:
Create a dummy event rule that applies only to the affected workstations
Perform a planman deploy to send the configuration file (see the command sketch after this list)
Monitor the receipt of the file on the agent
When it is received, delete the dummy rule at the event processor. (36924 /
37851)
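A command sketch of steps 2 and 3 follows; the use of conman showcpus as the command
that displays the deploy flag is an assumption, because the note above does not name the
command:
planman deploy
conman showcpus
When the D flag appears for the affected workstations, delete the dummy rule at the
event processor.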
Some data is not migrated when you migrate the database from DB2® to Oracle, or from
Oracle to DB2
Neither of the two migration procedures migrates the following information from the
source database:
You are using Tivoli Workload Scheduler in an environment where nodes are in
different time zones, but the time zone feature is not enabled. The time-related status
of a job (for example, LATE) is not reported correctly on workstations other than that
where the job is being run.
Workaround
Enable the time zone feature to resolve this problem. See the User's Guide and
Reference Manual to learn more about the time zone feature. See the
Guide for instructions about how to enable it in the global options. (37358)
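As an illustration only, enabling the feature from the master domain manager command
line might look like the following; the option name shown is an assumption about the
optman global options and is not taken from these release notes:
optman chg tz=YES
The change typically takes effect the next time the plan is generated with JnextPlan.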
You are monitoring a log file for a specific log message, using the
LogMessageWritten event. The message is written to the file but the event is not
triggered.
Cause
The SSM agent monitors the log file. It sends an event when a new message is written
to the log file that matches the string in the event rule. However, there is a limitation.
It cannot detect the very latest message to be written to the file, but only messages
prior to the latest. Thus, when message line "n" is written containing the string that
the event rule is configured to search for, the agent does not detect that a message has
been written, because the message is the last one in the file. When any other message
line is written, whether or not it contains the monitored string, the agent is now able to
read the message line containing the string it is monitoring, and sends an event for it.
Workaround
There is no workaround to resolve this problem. However, note that in a typical log
file, messages are being written by one or other processes frequently, perhaps every
few seconds, and the writing of a subsequent message line will trigger the event in
question. If you have log files where few messages are written, you might want to
attempt to write a dummy blank message after every "real" message, in order to
ensure that the "real" message is never the last in the file for any length of time.
(33723)
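For example, a small follow-up write such as the one below (the log path and text are
hypothetical) ensures that the monitored message is no longer the last line in the file:
echo "heartbeat" >> /var/log/myapp.log
The SSM agent can then read the preceding line containing the monitored string and send
the event.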
If you use the Microsoft Remote Desktop Connection to operate Tivoli Workload
Scheduler, you must always use it with the "/console" parameter; otherwise Tivoli
Workload Scheduler gives inconsistent results.
The plan time displayed by the planman showinfo command might not match
the time set in the operating system of the workstation. For example, the time
zone set for the workstation is GMT+2 but planman showinfo displays plan times
according to the GMT+1 time zone.
This situation arises when the WebSphere Application Server Java Virtual Machine
does not recognize the time zone set on the operating system.
Workaround:
Set the time zone defined in the server.xml file equal to the time zone defined for
the workstation in the Tivoli Workload Scheduler database, by editing the
genericJvmArguments attribute as follows:
genericJvmArguments="-Duser.timezone=time_zone"
where time_zone is the time zone defined for the workstation in the Tivoli
Workload Scheduler database.
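For example, for a workstation defined in the America/Chicago time zone (an illustrative
value; use the time zone actually defined for your workstation in the database), the
attribute would read:
genericJvmArguments="-Duser.timezone=America/Chicago"
Restart the embedded WebSphere Application Server for the change to take effect.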
WebSphere Application Server limitations in a pure IPV6 environment when using the
Job Scheduling Console or the Tivoli Dynamic Workload Console (35681)
When you install Tivoli Workload Scheduler, the following WebSphere Application
Server variables are initialized as follows to allow communication in a mixed IPV4
and IPV6 environment:
java.net.preferIPv6Addresses=false
java.net.preferIPv4Stack=false
If your configuration requires the use of a pure IPV6 environment, or you have
specific firewall configuration settings that block IPV4 packets, the connection
between the Tivoli Workload Scheduler master domain manager and the Tivoli
Dynamic Workload console or the Job Scheduling Console fails.
Workaround:
To establish a connection in this specific environment, you must initialize the variable
as follows:
java.net.preferIPv6Addresses=true
$TWS_home/appserver/profiles/twsprofile/config/cells/DefaultNode/nodes/DefaultNode/servers/server
If, instead, you want to use IPV4 communication exclusively, set:
java.net.preferIPv4Stack=true
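Because these are standard Java system properties, one way to apply them, on the
assumption that they can be passed as JVM arguments through the same
genericJvmArguments attribute used in the time zone workaround above, is to add the
corresponding -D option, for example:
genericJvmArguments="-Djava.net.preferIPv6Addresses=true"
or, for IPv4-only communication:
genericJvmArguments="-Djava.net.preferIPv4Stack=true"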
Different behavior of UNIX and Windows operating systems at springtime daylight
savings (94279)
For example:
Windows
Workaround
Avoid creating jobs that have a start time in the "lost" hour and are due to run on the
night of the daylight savings change (a Saturday night in spring).
The writer process on a fault-tolerant agent does not download the Symphony file
(22485)
If you delete the Symphony file on a fault-tolerant agent, writer should automatically
download it when it next links to the master domain manager. However, this does not
happen if you launch conman before the file has downloaded.
Workaround
Delete the mailbox.msg file and writer downloads the Symphony file.
File monitor provider events: older event configurations might stay active on
workstations after rule redeployment (34103)
The status of the local monitoring configuration on the agent will be corrected when
one of the following occurs:
Event rule management: Deploy flag is not maintained in renewed symphony (36924)
The deploy flag (D) indicates that a workstation is using an up-to-date package
monitoring configuration and can be displayed by running the
command. Testing has revealed that the flag is lost from the symphony file when the
file is renewed after a JnextPlan or ResetPlan command. Although the event
monitoring configuration deployed to the agents is the latest one, and event
management works properly, an incorrect monitoring agent status is shown on the
workstations.
Time zone not enabled: wrong schedtime reported on the command line of agents
(36588)
The following problem has been found when a job stream is defined with time
restrictions and the time zone feature is not enabled: after running JnextPlan, you run the
showschedules command on both the master and the agents. You notice that, while
the time restriction times remain the same, the schedtimes shown on the agents are
converted to the workstation's time zone. This should not happen when time zone
conversion is not enabled.
If you run the date command after setting the Tivoli Workload Scheduler environment
(tws_env), the date is shown in GMT and not in your time zone. (IZ36977)
On AIX 6.1 batchman process does not correctly recognize the timezone of the local
workstation (IZ70267)
On AIX 6.1, the batchman process does not correctly recognize the time zone of the local
machine that is set to GMT, even if, in the Tivoli Workload Scheduler CPU
definition, it is set to the correct time zone. You see the following message in the
stdlist log:
"10:29:39 24.11.2008|BATCHMAN:AWSBHT126I
Time in CPU TZ (America/Chicago): 2008/11/24 04:29
10:29:39 24.11.2008|BATCHMAN:AWSBHT127I
Time in system TZ (America/Chicago): 2008/11/24 10:29
10:29:39 24.11.2008|BATCHMAN:+
10:29:39 24.11.2008|BATCHMAN:+ AWSBHT128I
Local time zone time differs from workstation
time zone time by 360 minutes."
Batchman does not recognize the correct time zone because AIX 6.1 uses ICU
(International Components for Unicode) libraries to manage the time zone of the
system, and these ICU libraries are in conflict with the Tivoli Workload Scheduler
ones.
Workaround
Export the TZ environment variable to the old POSIX format, for example CST6CDT,
before starting Tivoli Workload Scheduler. This is an example of the POSIX naming
convention instead of an Olson name convention (for example, America/Chicago).
This bypasses the default ICU-based time zone management introduced in AIX
6.1, switching back to the old POSIX behavior (as in AIX 5.x).
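For example, assuming Tivoli Workload Scheduler is started with the StartUp script from
the TWSuser home directory (adapt the steps to the way you start the product):
export TZ=CST6CDT
./StartUp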
Globalization notes
This section describes software limitations, problems, and workarounds for globalized
versions of Tivoli Workload Scheduler.
The InstallShield wizard installation fails if DBCS characters are used in the
-is:tempdir path (36979)
If you are installing using the -is:tempdir option and you specify DBCS characters
in the path, the installation fails.
Workaround:
In the output of the composer list and display commands, the list and report headers
are in English (22301, 22621, 22654)
This has been done to avoid a misalignment of the column headers in DBCS versions
that was making it difficult to understand the information.
In the output of the product reports, the report headers are in English
This has been done to avoid a misalignment of the column headers in DBCS versions
that was making it difficult to understand the information.
All information is stored and passed between modules as UTF-8, and some characters
occupy more than one byte in UTF-8. For DBCS languages, each character is three
bytes long. Western European national characters are two bytes long. Other Western
European characters are one byte long.
On Windows operating systems, you cannot create a calendar with a name containing
Japanese characters using the "makecal" command (123653).
Workaround:
On Windows operating systems, the Tivoli Workload Scheduler joblog is created with
incorrect characters (APAR IY81171)
You are working in a non-English language environment and you have correctly set
the LANG and TWS_TISDIR environment variables. However, the Tivoli Workload
Scheduler joblog is created with incorrect characters in the body of the log (the
headers and footers of the log are correct).
Workaround
The problem is caused by the code page in use. Windows editors and applications use
code page 1252 which is correct for writing text files. However, the DOS shell uses
the default code page 850. This can cause some problems when displaying particular
characters.
To resolve this problem for Tivoli Workload Scheduler jobs, add the following line to
the beginning of the file jobmanrc.cmd on the workstation:
chcp 1252
For further details about the jobmanrc.cmd file, see the section on customizing job
processing on a workstation in the Tivoli Workload Scheduler: User's Guide and
Reference
It is possible to resolve this problem for all applications on the workstation, by using
regedit to set the DOS code page globally in the following registry keyword:
Note:
Microsoft warns you to take particular precautions when changing registry entries.
Ensure you follow all instructions in Microsoft documentation when performing this
activity.
Tivoli Workload Scheduler version 8.5 includes all applicable APARs fixed in V8.3 Fix Pack
4 and V8.4 Fix Pack 1, plus the following:
IZ07038 : Access check for scripts defined without fully qualified path.
IZ08936 : If a job has "recovery rerun", submitting a job stream logs AWSBHT023E
to twsmerge.log.
IZ10297 : Ad hoc submit job stream containing an "until" dependency after midnight
+1 day is incorrectly added to the "until" time.
IZ10349 : Wrong date in "last start time" and "scheduled start" is set by
IZ12472 : On Tivoli Workload Scheduler V8.3 Fix Pack 4, security errors can occur
when connecting the Job Scheduling Console when a Symphony file is not present.
IZ12504 : "Schedule not found" error when the SAP R/3 extended agent interception
collector attempts to put an interception job onto a nonexistent job stream.
IZ13027 : Tivoli Workload Scheduler V8.3 fix pack installation overwrites the
jobmanrc in the maestro_home directory.
IZ13672 : "Scheduled time" of a job submitted without a date (to start immediately)
shows the next day.
IZ14290 : Fault-tolerant agent stays unlinked even after issuing link commands.
IZ14618 : Job streams that do not have an "at" time dependency at the job stream
level have a start date of next day instead of current day
IZ14746 : After FINAL job stream, the unison_sched_id of job streams matches the
carried forward FINAL job stream's sched_id.
IZ15392 : Cannot kill two jobs with same job name in userjobs from the Job
Scheduling Console.
IZ15584 : Concurrent conman sbs fails with AWSJCS011E with Oracle database.
IZ16002 : Multiple users browsing joblog at same time, caused crash of Tivoli
Workload Scheduler V8.3 with Fix Pack 4.
IZ16601 : Some job streams are duplicated on different days in the plan during
migration.
IZ16832 : Tivoli Event Console event management fails when a fault-tolerant agent is
not reachable.
IZ16857 : Unixlocl extended agent stops if backup domain manager is stopped when
"enswfaulttol"=yes.
IZ17294 : Extended agent is shown as unlinked on Job Scheduling Console, although
it is linked.
IZ17475 : The rep7 report on V8.3 displayed the "elapsed time" in minutes. In V8.4
it is displaying the elapsed time in seconds.
IZ17565 : The usage command xref -u does not list the -when
IZ17806 : Garbled MBCS characters in a joblog header, if the MBCS characters are
passed as a parameter in a scriptname field.
IZ18166 : Unixlocl extended agent stops if backup domain manager is stopped when
"enswfaulttol"=yes.
IZ19308 : FINAL job stream not scheduled correctly at Daylight Saving Time
change.
IZ20328 : On the Job Scheduling Console, the last run time information disappears in
the All Job Definitions list view after right-clicking to access the job properties panel.
IZ21378 : Incorrect output when running "reptr" and outputting to text file.
IZ21379 : In the "Set Alternate Plan" panel of the Job Scheduling Console, V8.3, the
"Plan Start Time" and "Plan End Time" have incorrect times when the timezones
feature is switched off in optman.
IZ21464 : After using the "Set Alternate Plan" panel of the Job Scheduling Console,
V8.3, to view the job streams in an old plan, removing a schedlog file that was
browsed by that panel does not increase the available disk space as it would if the
removed schedlog file had not been browsed by that panel.
IZ21879 : Special characters in passwords are not allowed during the installation.
IZ22712 : Tivoli Workload Scheduler V8.4 "child workstation link changed" event
rule does not work.
IZ22904 : SSM 4.0 in Tivoli Workload Scheduler V8.4 logs Windows events about
the missing driver pcamp50.sys.
IZ22949 : The V8.3 rmstdlist did not work properly when launched outside the
maestro_home directory.
IZ22954 : After applying Fix Pack 04, XRXTRCT does not put all of the jobs in the
output files.
IZ24025 : Batchman went down when the job stream was canceled by the keywords
onuntil canc.
IZ24747 : For a chain of rerun jobs stageman is considering the first job in the chain
to determine if it should be carried forward, instead of the latest job. If the first rerun
job is in ABEND state and the last in SUCC state, the job is carried forward, even
though it has run successfully.
IZ25226 : After running switchmgr to the backup master domain manager, when the
master domain manager is brought back online or restarted the fault-tolerant agents
are unlinked.
IZ26739 : A job stream that has a pending predecessor incorrectly goes into
status.
IZ27478 : Jobs fail or do not execute properly when the system path on Windows
exceeds 1024 characters.
IZ28131 : A job stream with the onuntil cont option is not carried forward.
IZ28400 : Specifying "schedid" without a job stream ID did not return a syntax error.