
InputOutputSystem.copy makes shallow copies #728


Closed
roryyorke opened this issue Apr 24, 2022 · 4 comments · Fixed by #797

Comments

@roryyorke
Contributor

This may be by design, but I found it unexpected (see below). If it is by design, it would help to add "shallow" to the docstring.

In [220]: g = ct.rss()

In [221]: g.D[0,0]
Out[221]: -1.8274167505512213

In [222]: g2 = g.copy()

In [223]: g2.D[0,0] = 1

In [224]: g.D[0,0]
Out[224]: 1.0

One way to deep copy is to use control.ss; continuing from above:

In [225]: g3 = ct.ss(g)

In [226]: g3.D[0,0] = 5

In [227]: g.D[0,0]
Out[227]: 1.0
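
Another option that should give an independent copy is copy.deepcopy from the standard library; a minimal sketch (I haven't checked whether StateSpace defines __deepcopy__, but the default recursive behaviour should handle the ndarray attributes):

import copy

import control as ct

g = ct.rss()
g5 = copy.deepcopy(g)            # should recursively copy A, B, C, D as well
g5.D[0, 0] = 5
print(g.D[0, 0] == g5.D[0, 0])   # expected False: the original is unchanged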
@murrayrm
Member

Good catch.

My sense is that if someone calls the copy method explicitly, we should do a deep copy. But we should also think about compatibility with NumPy, SciPy, etc., and I'm not sure what they do.

Some other situations we should also think about:

  • If you call the ss or tf factory functions and pass them a state space system or transfer function, respectively, they just return that object. Should they do a copy instead? Shallow or deep?
  • When you use a single system more than once in an interconnected system, we generate a copy (so that things like the state of one block are distinct from the state of the other). Shallow or deep?

@roryyorke
Contributor Author

But we should also think about compatibility with NumPy, SciPy, etc and I'm not sure what they do.

The answer is "it's complicated"; for arrays of immutable types (e.g., numbers), ndarray.copy appears to be deep:

In [5]: a = np.array([1,2,3])

In [6]: b = a.copy()

In [7]: b[0] = 100

In [8]: a[0]
Out[8]: 1

but for mutable array elements, with dtype=object, you can see the copies are actually shallow:

In [11]: c = np.array([{'a':23},{'b':99}],dtype=object)

In [12]: c[0]
Out[12]: {'a': 23}

In [13]: d = c.copy()

In [14]: d[0]['b'] = 100

In [15]: c[0]
Out[15]: {'a': 23, 'b': 100}

If you call the ss or tf factory functions and pass them a state space system or transfer function, respectively, they just return that object. Should they do a copy instead?

I don't think this is right for ss; see my example above. AFAICT, control.ss is control.iosys.ss, which calls control.statesp._ss, which calls control.statesp.StateSpace; this in turn calls control.statesp._ssmatrix on each of A, B, C and D, which makes a copy of these arrays via np.array (which does copy by default).
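
A quick way to confirm that the arrays are copied (a sketch, assuming NumPy's np.shares_memory is available):

import numpy as np
import control as ct

g = ct.rss()
g3 = ct.ss(g)
# If _ssmatrix copies via np.array, the two systems hold distinct buffers:
print(np.shares_memory(g.A, g3.A))   # expected: False
print(np.shares_memory(g.D, g3.D))   # expected: False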

When you use a single system more than once in an interconnected system, we generate a copy (so that things like the state of one block are distinct from the state of the other). Shallow or deep?

I think deep, so that if you fiddle with entries of one (scaling with a gain, say), you don't affect the others.

@hazrmard

hazrmard commented Jul 21, 2022

@roryyorke I agree with your assessment. copy by default should be deep. This affected some experiments I was running on copies of a system. For example, I was modifying a copy of a LinearIOSystem (made with control.ss) using the *= operation, and that changed the matrix values of the original in place. This was unexpected behavior, since I implicitly understood copy to duplicate the system, not just share references to the original's matrices.

With regard to the ss or tf functions, I believe a deep copy should be made. They are factory functions that create new systems, and they should keep their own copies of the parameters passed in. If modifications are needed, the returned system's attributes can be modified directly. This will prevent implicit side effects.

If shallow copies are to be kept, I think it would be useful to have a disclaimer about in-place operations like *=:

import numpy as np
import control

A, B, C, D = [np.eye(2) for _ in range(4)]

sys = control.ss(A, B, C, D)
sys_copy = sys.copy()
# in-place operations modify the original (the copy shares its matrices)
sys_copy.A *= 2
print('Copy (in-place assignment)\n', sys_copy.A)
print('Original sys (after copy)\n', sys.A)
print('Actual matrix (after copy)\n', A)
print()

sys = control.ss(A, B, C, D)
sys_copy = sys.copy()
# re-assigning a matrix does not modify the original
sys_copy.A = sys_copy.A * 2
print('Copy (re-assignment)\n', sys_copy.A)
print('Original sys (after copy)\n', sys.A)
print('Actual matrix (after copy)\n', A)

@ilayn

ilayn commented Jul 21, 2022

In NumPy and SciPy copying is a bit easier, since the main issue is the underlying memory buffer (direct or indirect) and the rest is just metadata sprinkled on top. Once the buffer is copied, and if the dtype is suitable, it is quite easy to copy the rest. For object dtypes it comes down to the copy() of the object itself; if the object is mutable, the copy ends up shallow because that is what the object's __copy__ dictates.

I wouldn't bother with object arrays, since many consider them a historical, overzealous inclusion. For generic use it is fair to assume that NumPy and SciPy favor deep copies; then again, the shallow "copy" of a NumPy array is just the assignment a = b, so copy() has little choice other than to be deep.

Here, indeed, the copies should be deep, but copying user-defined classes is always a pain.
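
For what it's worth, one way a copy method can sidestep most of that pain is to delegate to copy.deepcopy; a minimal sketch with a hypothetical class (not the python-control implementation):

import copy

import numpy as np


class SystemSketch:
    """Hypothetical container with mutable NumPy attributes."""

    def __init__(self, A, B, C, D):
        self.A, self.B, self.C, self.D = A, B, C, D

    def copy(self):
        # Deep copy, so the new object's arrays are independent of the
        # original's; ndarray supports deep copying, so this just works.
        return copy.deepcopy(self)


orig = SystemSketch(*[np.eye(2) for _ in range(4)])
dup = orig.copy()
dup.A *= 2
print(orig.A[0, 0])   # 1.0 -- the in-place change on the copy does not leak back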
