Neural Network Toolbox™ Release Notes
How to Contact MathWorks
www.mathworks.com Web
comp.soft-sys.matlab Newsgroup
508-647-7000 Phone
508-647-7001 Fax
Trademarks
MATLAB and Simulink are registered trademarks of The MathWorks, Inc. See
www.mathworks.com/trademarks for a list of additional trademarks. Other product or brand
names may be trademarks or registered trademarks of their respective holders.
Patents
MathWorks products are protected by one or more U.S. patents. Please see
www.mathworks.com/patents for more information.
Contents

Summary by Version
Summary by Version
This table provides quick access to what’s new in each version. For
clarification, see “Using Release Notes” on page 2.
Version           New Features     Compatibility     Fixed Bugs and                 Related Documentation
                  and Changes      Considerations    Known Problems                 at Web Site
V7.0 (R2010b)     Yes (Details)    Yes (Summary)     Bug Reports                    Current product documentation
V6.0.4 (R2010a)   No               No                Bug Reports (includes fixes)   None
V6.0.3 (R2009b)   No               No                Bug Reports (includes fixes)   None
V6.0.2 (R2009a)   No               No                Bug Reports (includes fixes)   None
V6.0.1 (R2008b)   No               No                Bug Reports (includes fixes)   None
V6.0 (R2008a)     Yes (Details)    Yes (Summary)     Bug Reports (includes fixes)   None
V5.1 (R2007b)     Yes (Details)    Yes (Summary)     Bug Reports (includes fixes)   None
Release notes include the following information:
• New features
• Changes
• Potential impact on your existing files and practices
Review the release notes for other MathWorks® products required for this
product (for example, MATLAB® or Simulink®) for enhancements, bug fixes, and
compatibility considerations that might also impact you.
If you are upgrading from a software version other than the most recent one,
review the release notes for all interim versions, not just for the version you are
installing. For example, when upgrading from V1.0 to V1.2, review the release
notes for V1.1 and V1.2.
Version 7.0 (R2010b) Neural Network Toolbox™ Software
Network diagrams shown in the Neural Time Series Tool, the Neural Training
Tool, and by the view(net) command have been improved to show tap delay
lines in front of weights; the sizes of inputs, layers, and outputs; and the time
relationship of inputs and outputs. Open-loop feedback outputs and inputs are
indicated with matching tabs and indents in their respective blocks.
The Save Results panel of the Neural Network Time Series Tool allows you to
generate both a Simple Script, which demonstrates how to get the same results
as were obtained with the wizard, and an Advanced Script, which provides an
introduction to more advanced techniques.
The Train Network panel of the Neural Network Time Series Tool introduces
four new plots, which you can also access from the Network Training Tool and
the command line.
The dynamic response can be plotted, with colors indicating how targets were
assigned to training, validation and test sets across timesteps. (Dividing data
by timesteps and other criteria, in addition to by sample, is a new feature
described in “New Time Series Validation” on page 13.)
plotresponse(targets,outputs)
Simpler time series neural network creation is provided for NARX and
time-delay networks, and a new function creates NAR networks. All the
network diagrams shown here are generated with the command view(net).
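For instance, the creation functions can be called as follows (a sketch; the delay ranges and hidden layer size of 10 are illustrative):

```matlab
net = timedelaynet(1:2, 10);   % time-delay network
net = narxnet(1:2, 1:2, 10);   % NARX network
net = narnet(1:2, 10);         % NAR network, created by the new narnet function
```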
Several new data sets provide sample problems that can be solved with these
networks. These data sets are also available from the ntstool GUI and at the
command line.
[x, t] = simpleseries_dataset;
[x, t] = simplenarx_dataset;
[x, t] = exchanger_dataset;
[x, t] = maglev_dataset;
[x, t] = ph_dataset;
[x, t] = pollution_dataset;
[x, t] = refmodel_dataset;
[x, t] = robotarm_dataset;
[x, t] = valve_dataset;
The preparets function formats input and target time series for time series
networks, by shifting the inputs and targets as needed to fill initial input and
layer delay states. This function simplifies what is normally a tricky data
preparation step that must be customized for details of each kind of network
and its number of delays.
[x, t] = simplenarx_dataset;
net = narxnet(1:2, 1:2, 10);
[xs, xi, ai, ts] = preparets(net, x, {}, t);
net = train(net, xs, ts, xi, ai);
y = net(xs, xi, ai)
The output-to-input feedback of NARX and NAR networks (or of custom time
series networks with output-to-input feedback loops) can be converted between
open- and closed-loop modes using the two new functions closeloop and
openloop.
net = narxnet(1:2, 1:2, 10);
net = closeloop(net)
net = openloop(net)
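For example, a trained NARX network can be closed for multistep prediction (a sketch following the earlier preparets example):

```matlab
[x, t] = simplenarx_dataset;
net = narxnet(1:2, 1:2, 10);
[xs, xi, ai, ts] = preparets(net, x, {}, t);
net = train(net, xs, ts, xi, ai);
netc = closeloop(net);                     % feedback is now internal
[xs, xi, ai] = preparets(netc, x, {}, t);  % re-prepare data for the closed loop
yc = netc(xs, xi, ai);                     % multistep closed-loop simulation
```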
The total delay through a network can be adjusted with the two new functions
removedelay and adddelay. Removing a delay from a NARX network that has
a minimum input and feedback delay of 1, so that it now has a minimum delay
of 0, allows the network to predict the next target value one timestep ahead of
when that value is expected.
net = removedelay(net)
net = adddelay(net)
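A sketch of this early-prediction workflow, reusing the NARX example data from above:

```matlab
[x, t] = simplenarx_dataset;
net = narxnet(1:2, 1:2, 10);
[xs, xi, ai, ts] = preparets(net, x, {}, t);
net = train(net, xs, ts, xi, ai);
net2 = removedelay(net);                      % minimum delay is now 0
[xs, xi, ai, ts] = preparets(net2, x, {}, t); % data must be re-prepared
y = net2(xs, xi, ai);                         % predicts targets one timestep early
```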
The new function catsamples allows you to combine multiple time series into
a single neural network data variable. This is useful for creating input and
target data from multiple input and target time series.
x = catsamples(x1, x2, x3);
t = catsamples(t1, t2, t3);
When the time series are not the same length, the shorter time series can be
padded with NaN values. These indicate "don't care" (or equivalently "don't
know") inputs and targets, and have no effect during simulation and training.
x = catsamples(x1, x2, x3, 'pad')
t = catsamples(t1, t2, t3, 'pad')
Alternatively, the shorter series can be padded with any other value, such as
zero.
x = catsamples(x1, x2, x3, 'pad', 0)
There are many other new and updated functions that make it easier to
manipulate neural network time series data.
help nndatafun
However, many time series problems involve only a single time series. To
support validation in this case, you can set the new divideMode property to
divide data up by timestep. This is the default setting for narxnet and other
time series networks.
net.divideMode = 'time'
This property can be set manually, and can be used to divide targets across
both sample and timestep, across all target values (i.e., across sample,
timestep, and output element), or to perform no data division at all.
net.divideMode = 'sampletime'
net.divideMode = 'all'
net.divideMode = 'none'
When the feedback mode of the output is set to 'closed', the properties
change to reflect that the output-to-input feedback is now implemented with
internal feedback by removing input j from the network, and having output
properties as follows:
net.outputs{i}.feedbackInput = [];
net.outputs{i}.feedbackMode = 'closed'
Another output property keeps track of the proper closed-loop delay when a
network is in open-loop mode. Normally this property has this setting:
net.outputs{i}.feedbackDelay = 0
You can define error weights by sample, output element, time step, or network
output:
ew = [1.0 0.5 0.7 0.2]; % Weighting errors across 4 samples
ew = [0.1; 0.5; 1.0]; % ... across 3 output elements
ew = {0.1 0.2 0.3 0.5 1.0}; % ... across 5 timesteps
ew = {1.0; 0.5}; % ... across 2 network outputs
Error weights can also be defined across any combination of these dimensions.
For example, weighting errors across two time series (i.e., two samples) over
four timesteps:
ew = {[0.5 0.4], [0.3 0.5], [1.0 1.0], [0.7 0.5]};
In the general case, error weights can have exactly the same dimension as
targets, where each target has an associated error weight.
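For instance, error weights can be passed to train and to the performance functions (a sketch, assuming the simplefit_dataset sample data; the weighting scheme is purely illustrative):

```matlab
[x, t] = simplefit_dataset;
net = feedforwardnet(10);
ew = linspace(0.1, 1, numel(t));    % one weight per sample
net = train(net, x, t, {}, {}, ew); % error weights are the sixth argument
y = net(x);
perf = mse(net, t, y, ew)           % error-weighted performance
```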
Some performance functions are now obsolete, as their functionality has been
implemented as options within the four remaining performance functions: mse,
mae, sse, and sae.
Compatibility Considerations
The old performance functions and old performance argument lists continue
to work as before, but are no longer recommended.
gensim has new options for generating neural network systems in Simulink.
• Name: the system name
• SampleTime: the sample time
• InputMode: 'port', 'workspace', 'constant', or 'none'
• OutputMode: 'display', 'port', 'workspace', 'scope', or 'none'
• SolverMode: 'default' or 'discrete'
For instance, here a NARX network is created and set up in MATLAB to use
workspace inputs and outputs.
[x, t] = simplenarx_dataset;
net = narxnet(1:2, 1:2, 10);
[xs, xi, ai, ts] = preparets(net, x, {}, t);
net = train(net, xs, ts, xi, ai);
net = closeloop(net);
[sysName, netName] = gensim(net, 'InputMode', 'workspace', ...
'OutputMode', 'workspace', 'SolverMode', 'discrete');
Simulink neural network blocks now allow initial conditions for input and
layer delays to be set directly by double-clicking the neural network block.
setsiminit and getsiminit provide command-line control for setting and
getting input and layer delays for a neural network Simulink block.
setsiminit(sysName, netName, net, xi, ai);
Subobjects of the network, such as inputs, layers, outputs, biases, weights, and
parameter lists, also display with links.
net.inputs{1}
net.layers{1}
net.outputs{2}
net.biases{1}
net.inputWeights{1, 1}
net.trainParam
The training tool nntraintool and the wizard GUIs nftool, nprtool, nctool,
and ntstool provide numerous hyperlinks to documentation.
For instance, here you can calculate the error gradient for a newly created and
configured feedforward network.
net = feedforwardnet(10);
[x, t] = simplefit_dataset;
net = configure(net, x, t);
d = staticderiv('dperf_dwb', net, x, t)
% Old function
net = newff(x,t,hiddenSizes, transferFcns, trainingFcn, ...
learningFcn, performanceFcn, inputProcessingFcns, ...
outputProcessingFcns, dataDivisionFcn)
The new functions (and the old functions they replace) are:
feedforwardnet (newff)
cascadeforwardnet (newcf)
competlayer (newc)
distdelaynet (newdtdnn)
elmannet (newelm)
fitnet (newfit)
layrecnet (newlrn)
linearlayer (newlin)
lvqnet (newlvq)
narxnet (newnarx, newnarxsp)
patternnet (newpr)
perceptron (newp)
selforgmap (newsom)
timedelaynet (newtdnn)
The network’s inputs and outputs are created with size zero, then configured
for data when train is called or by optionally calling the new function
configure.
net = configure(net, x, t)
Unconfigured networks can be saved and reused by configuring them for many
different problems. unconfigure sets a configured network's inputs and
outputs back to size zero, so the network can later be configured for other data.
net = unconfigure(net)
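A sketch of the configure/unconfigure round trip, using the simplefit_dataset sample data:

```matlab
[x, t] = simplefit_dataset;
net = feedforwardnet(10);
net = configure(net, x, t);   % size inputs and outputs for this data
net = train(net, x, t);
y = net(x);
net = unconfigure(net);       % reset sizes to zero so the network can be reused
```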
Compatibility Considerations
Old functions continue working as before, but are no longer recommended.
Improved GUIs
The neural fitting (nftool), pattern recognition (nprtool), and clustering
(nctool) GUIs have been updated with links back to the nnstart GUI. They
give the option of generating either simple or advanced scripts in their last
panel. They also ask for confirmation when closing if a script has not been
generated or the results have not yet been saved.
Compatibility Considerations
The trainlm and trainbr training parameter MEM_REDUC is now obsolete.
Code referring to it will generate a warning and should be updated.
Compatibility Considerations
Any custom functions of these types, or code which calls these functions
manually, will need to be updated.
Version 6.0.3 (R2009b) Neural Network Toolbox™ Software
Version 6.0.1 (R2008b) Neural Network Toolbox™ Software
Version 6.0 (R2008a) Neural Network Toolbox™ Software
• plotperform—Plot performance.
• plottrainstate—Plot training state.
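For example, both plots can be generated from the training record returned by train (a sketch using the R2008a-era newff syntax, and assuming the simplefit_dataset sample data is available):

```matlab
[x, t] = simplefit_dataset;
net = newff(x, t, 20);
[net, tr] = train(net, x, t);   % tr is the training record
plotperform(tr)                 % performance versus epoch
plottrainstate(tr)              % training state (gradient, validation checks)
```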
Compatibility Considerations
To turn off the new training window and display command-line output (which
was the default display in previous versions), use these two training
parameters:
net.trainParam.showWindow = false;
net.trainParam.showCommandLine = true;
Compatibility Considerations
You can call the newsom function using conventions from earlier versions of the
toolbox, but using its new calling conventions gives you faster results.
Compatibility Considerations
The code generated by nftool is different from the code generated in previous
versions. However, code generated by earlier versions still operates correctly.
Version 5.1 (R2007b) Neural Network Toolbox™ Software
For detailed information about each function, see the corresponding reference
pages.
Changes to the syntax of network-creation functions have the following
benefits:
• You can now specify input and target data values directly. In the previous
release, you specified input ranges and the size of the output layer instead.
• The new syntax automates preprocessing, data division, and postprocessing
of data.
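For example, the new syntax lets you create a network directly from data (a sketch; the data sizes and 20-neuron hidden layer are illustrative):

```matlab
p = rand(3, 100);       % example input data (3 elements, 100 samples)
t = rand(1, 100);       % example target data
net = newff(p, t, 20);  % new syntax: data values, not ranges and layer sizes
```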
This command also sets properties of the network such that the functions sim
and train automatically preprocess inputs and targets, and postprocess
outputs.
In the previous release, you had to use the following three commands to create
the same network:
pr = minmax(p);
s2 = size(t,1);
net = newff(pr,[20 s2]);
Compatibility Considerations
Your existing code still works but might produce a warning that you are using
obsolete syntax.
To create the same network in a previous release, you used the following longer
code:
[p1,ps1] = removeconstantrows(p);
[p2,ps2] = mapminmax(p1);
[t1,ts1] = mapminmax(t);
pr = minmax(p2);
s2 = size(t1,1);
net = newff(pr,[20 s2]);
net = train(net,p2,t1);
y1 = sim(net,p2);
y = mapminmax('reverse',y1,ts1);
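For comparison, a sketch of the equivalent streamlined code in this release (same p and t as above):

```matlab
net = newff(p,t,20);   % preprocessing and reverse-processing set up automatically
net = train(net,p,t);  % inputs and targets are processed internally
y = sim(net,p);        % outputs are reverse-processed to the original scale
```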
net.outputs{2}.processFcns = {'removeconstantrows','mapminmax'}
These defaults process outputs by removing rows with constant values across
all samples and mapping the values to the interval [-1 1].
sim and train automatically process inputs and targets using the input and
output processing functions, respectively. sim and train also reverse-process
network outputs as specified by the output processing functions.
For more information about processing input, target, and output data, see
“Multilayer Networks and Backpropagation Training” in the Neural Network
Toolbox™ User’s Guide.
The following input properties are automatically set and you cannot change
them:
Note: These output properties require a network that has the output layer as
the second layer.
The following new output properties are automatically set and you cannot
change them:
Previously, you entered the following code to accomplish the same result:
pr = minmax(p);
s2 = size(t,1);
For more information about data division, see “Multilayer Networks and
Backpropagation Training” in the Neural Network Toolbox™ User’s Guide.
Compatibility Considerations
Several properties are now obsolete, as described in the following table. Use the
new properties instead.

Obsolete Property       New Property
net.numTargets          net.numOutputs
net.targetConnect       net.outputConnect
net.targets             net.outputs
Version 5.0.1 (R2006b) Neural Network Toolbox™ Software
Version 5.0 (R2006a) Neural Network Toolbox™ Software
Custom Networks
The training functions in Neural Network Toolbox are enhanced to let you
train arbitrary custom dynamic networks that model complex dynamic
systems. For more information about working with these networks, see the
Neural Network Toolbox™ documentation.
New Function    Obsolete Functions
mapminmax       premnmx, postmnmx, tramnmx
mapstd          prestd, poststd, trastd
processpca      prepca, trapca
Each new function is more efficient than its obsolete predecessors because it
accomplishes both preprocessing and postprocessing of the data. For example,
previously you used premnmx to process a matrix, and then postmnmx to return
the data to its original state. In this release, you accomplish both operations
using mapminmax; to return the data to its original state, you call mapminmax
again with 'reverse' as the first argument:
mapminmax('reverse',Y,PS)
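A minimal sketch of the round trip with example data:

```matlab
X = [1 2 4; 8 16 32];
[Y, PS] = mapminmax(X);            % each row is mapped to [-1, 1]; PS holds the settings
X2 = mapminmax('reverse', Y, PS);  % recovers the original X
```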
The following derivative functions are now obsolete:

dmae, dmse, dmsereg, dnetprod, dnetsum, dposlin, dpurelin, dradbas,
dsatlin, dsatlins, dsse, dtansig, dtribas
Compatibility Considerations
To calculate a derivative in this version, you must pass a derivative argument
to the function. For example, to calculate the derivative of a hyperbolic tangent
sigmoid transfer function A with respect to N, use this syntax:
A = tansig(N,FP)
dA_dN = tansig('dn',N,A,FP)
Compatibility Summary for Neural Network Toolbox™ Software