
Implement arithmetic and logical operator support? #939

Open
tyarkoni opened this issue Jul 24, 2020 · 5 comments · May be fixed by #1014

Comments

@tyarkoni

This has been suggested before (by me, and probably others), but it would be quite helpful to have nibabel classes (at least in the NIfTI hierarchy) support arithmetic and logical operators as syntactic sugar for common operations, at least in cases where the semantics can be defined with little or no ambiguity.

E.g., `img1 & img2` would give the logical conjunction of two images, `img1 - img2` the difference, etc. It doesn't seem likely that users would be confused about what these operators do, and obviously they would only work for operands with identical dimensions/affines.
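For concreteness, a minimal sketch of the intended semantics on raw arrays, using plain NumPy only (`data1`/`data2` stand in for the data of two aligned images; the image wrapper itself is the part this issue proposes):

```python
import numpy as np

# Stand-ins for the data arrays of two images with identical shapes/affines
data1 = np.array([[True, False], [True, True]])
data2 = np.array([[True, True], [False, True]])

conjunction = data1 & data2                          # elementwise logical AND
difference = data1.astype(int) - data2.astype(int)   # elementwise difference
```

The proposal is essentially to lift these elementwise semantics up to the image objects, after verifying that shapes and affines match.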

@effigies
Member

I think we could probably do something like:

from functools import partialmethod
import operator

import numpy as np


class OperableImage:
    def _op(self, other, op):
        # Check header... need to have axes in the same places
        # Check affine
        # Check shape... maybe some limited broadcasting
        # Check types? Problematic types will be caught by numpy, but it
        # might be cheaper to check before loading data.
        dataobj = op(np.asanyarray(self.dataobj), np.asanyarray(other.dataobj))
        return self.__class__(dataobj, self.affine, self.header)

    # partialmethod, rather than partial, so that self is bound correctly
    __and__ = partialmethod(_op, op=operator.and_)
    ...

This seems like it could easily fit into DataobjImage, delegating compatibility checks out to headers and subclass methods. At a minimum, the first operand's class has to be constructible from the resulting dataobj, affine, and header.

The user definitely gives up control of memory management, e.g., caching, so we'll need to be clear about that tradeoff in the docs.
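As a self-contained check of the pattern sketched above, here is a toy stand-in (the class and attribute names are invented for illustration, not nibabel API). Note that `functools.partialmethod`, rather than `functools.partial`, is needed so that `self` is bound when the operator is invoked on an instance:

```python
from functools import partialmethod
import operator

import numpy as np


class OperableArray:
    """Toy stand-in for an image class: wraps an array and routes all
    binary operators through a shared _op method."""

    def __init__(self, data):
        self.data = np.asanyarray(data)

    def _op(self, other, op):
        # A real image class would also check affines/headers here
        if self.data.shape != other.data.shape:
            raise ValueError("shape mismatch")
        return self.__class__(op(self.data, other.data))

    # partialmethod implements the descriptor protocol, so a & b and
    # a - b dispatch correctly through the class
    __and__ = partialmethod(_op, op=operator.and_)
    __sub__ = partialmethod(_op, op=operator.sub)


a = OperableArray([3, 5, 7])
b = OperableArray([1, 5, 2])
print((a - b).data)   # [2 0 5]
```

A usage note: because special methods are looked up on the type, a plain `partial` stored as a class attribute would not bind `self`; `partialmethod` exists precisely for this case.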

We also have something like this with ArraySequences in the streamlines API:

def _define_operators(cls):
    """ Decorator which adds support for some Python operators. """
    def _wrap(cls, op, inplace=False, unary=False):

        def fn_unary_op(self):
            return self._op(op)

        def fn_binary_op(self, value):
            return self._op(op, value, inplace=inplace)

        setattr(cls, op, fn_unary_op if unary else fn_binary_op)
        fn = getattr(cls, op)
        fn.__name__ = op
        fn.__doc__ = getattr(np.ndarray, op).__doc__

    for op in ["__add__", "__sub__", "__mul__", "__mod__", "__pow__",
               "__floordiv__", "__truediv__", "__lshift__", "__rshift__",
               "__or__", "__and__", "__xor__"]:
        _wrap(cls, op=op, inplace=False)
        _wrap(cls, op=f"__i{op.strip('_')}__", inplace=True)

    for op in ["__eq__", "__ne__", "__lt__", "__le__", "__gt__", "__ge__"]:
        _wrap(cls, op)

    for op in ["__neg__", "__abs__", "__invert__"]:
        _wrap(cls, op, unary=True)

    return cls


@_define_operators
class ArraySequence(object):

    def _op(self, op, value=None, inplace=False):
        """ Applies some operator to this arraysequence.

        This handles both unary and binary operators with a scalar or
        another array sequence. Operations are performed directly on the
        underlying data, or a copy of it, which depends on the value of
        `inplace`.

        Parameters
        ----------
        op : str
            Name of the Python operator (e.g., `"__add__"`).
        value : scalar or :class:`ArraySequence`, optional
            If None, the operator is assumed to be unary.
            Otherwise, that value is used in the binary operation.
        inplace : bool, optional
            If False, the operation is done on a copy of this array
            sequence. Otherwise, this array sequence gets modified
            directly.
        """
        seq = self if inplace else self.copy()

        if is_array_sequence(value) and seq._check_shape(value):
            elements = zip(seq._offsets, seq._lengths,
                           self._offsets, self._lengths,
                           value._offsets, value._lengths)

            # Change seq.dtype to match the operation resulting type.
            o0, l0, o1, l1, o2, l2 = next(elements)
            tmp = getattr(self._data[o1:o1 + l1], op)(value._data[o2:o2 + l2])
            seq._data = seq._data.astype(tmp.dtype)
            seq._data[o0:o0 + l0] = tmp

            for o0, l0, o1, l1, o2, l2 in elements:
                seq._data[o0:o0 + l0] = getattr(self._data[o1:o1 + l1],
                                                op)(value._data[o2:o2 + l2])
        else:
            args = [] if value is None else [value]  # Dealing with unary and binary ops.
            elements = zip(seq._offsets, seq._lengths,
                           self._offsets, self._lengths)

            # Change seq.dtype to match the operation resulting type.
            o0, l0, o1, l1 = next(elements)
            tmp = getattr(self._data[o1:o1 + l1], op)(*args)
            seq._data = seq._data.astype(tmp.dtype)
            seq._data[o0:o0 + l0] = tmp

            for o0, l0, o1, l1 in elements:
                seq._data[o0:o0 + l0] = getattr(self._data[o1:o1 + l1],
                                                op)(*args)

        return seq

So it's perhaps worth thinking about how to turn this into an abstract base class from which both objects could derive.
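One hedged sketch of what such a shared base might look like (`OperableMixin`, `_compatible`, and `_apply` are hypothetical names, not anything in nibabel): subclasses supply a compatibility check and an application hook, and the mixin installs the dunder methods at subclass creation time.

```python
import operator

import numpy as np


class OperableMixin:
    """Hypothetical mixin: subclasses implement _compatible() and
    _apply(), and receive binary operators for free."""

    _BINOPS = {"__add__": operator.add, "__sub__": operator.sub,
               "__and__": operator.and_, "__or__": operator.or_}

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        for name, op in OperableMixin._BINOPS.items():
            # Default argument captures op at definition time
            def method(self, other, _op=op):
                if not self._compatible(other):
                    return NotImplemented
                return self._apply(_op, other)
            method.__name__ = name
            setattr(cls, name, method)


class Grid(OperableMixin):
    """Toy subclass standing in for DataobjImage or ArraySequence."""

    def __init__(self, data):
        self.data = np.asanyarray(data)

    def _compatible(self, other):
        return isinstance(other, Grid) and self.data.shape == other.data.shape

    def _apply(self, op, other):
        return Grid(op(self.data, other.data))


g = Grid([1, 2]) + Grid([3, 4])
print(g.data)  # [4 6]
```

The design choice here is that the mixin owns only the operator plumbing, while each subclass decides what "compatible" means (shape/affine for images, offsets/lengths for array sequences).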

@matthew-brett
Member

Yes - nice idea.

@htwangtw
Contributor

htwangtw commented May 6, 2021

I will do it. Any idea where the code should live? Inside streamlines?

@effigies
Member

effigies commented May 6, 2021

No, DataobjImages shouldn't depend on streamlines. If we use the same tooling, we would want it to be somewhere they could both depend on it. Maybe something like nibabel/arrayops.py? We don't have to settle on a name right away.

@htwangtw
Contributor

htwangtw commented May 6, 2021

Sounds good. We can finalise the names etc later.

@htwangtw htwangtw linked a pull request May 6, 2021 that will close this issue
4 participants