🔍 Code Extractor

class DynamicMap

Maturity: 31

File: /tf/active/vicechatdev/patches/spaces.py
Lines: 731 - 1729
Complexity: moderate

Purpose

A DynamicMap is a type of HoloMap where the elements are dynamically generated by a callable. The callable is invoked with values associated with the key dimensions or with values supplied by stream parameters.
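
For orientation, a minimal, hedged sketch of the usual construction pattern, assuming the standard HoloViews API (holoviews imported as hv, numpy as np); the dimension names frequency and amplitude are purely illustrative:

import numpy as np
import holoviews as hv

def sine_curve(frequency, amplitude):
    # Invoked on demand with values of the declared key dimensions
    xs = np.linspace(0, 2 * np.pi, 200)
    return hv.Curve((xs, amplitude * np.sin(frequency * xs)), 'x', 'y')

# The callable's arguments map onto the key dimensions by name
dmap = hv.DynamicMap(sine_curve, kdims=['frequency', 'amplitude'])

# Declare explicit values so the otherwise unbounded dimensions can be explored
dmap = dmap.redim.values(frequency=[1, 2, 3], amplitude=[0.5, 1.0])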

Source Code

class DynamicMap(HoloMap):
    """
    A DynamicMap is a type of HoloMap where the elements are dynamically
    generated by a callable. The callable is invoked with values
    associated with the key dimensions or with values supplied by stream
    parameters.
    """

    # Declare that callback is a positional parameter (used in clone)
    __pos_params = ['callback']

    kdims = param.List(default=[], constant=True, doc="""
        The key dimensions of a DynamicMap map to the arguments of the
        callback. This mapping can be by position or by name.""")

    callback = param.ClassSelector(class_=Callable, constant=True, doc="""
        The callable used to generate the elements. The arguments to the
        callable includes any number of declared key dimensions as well
        as any number of stream parameters defined on the input streams.

        If the callable is an instance of Callable it will be used
        directly, otherwise it will be automatically wrapped in one.""")

    streams = param.List(default=[], constant=True, doc="""
       List of Stream instances to associate with the DynamicMap. The
       set of parameter values across these streams will be supplied as
       keyword arguments to the callback when the events are received,
       updating the streams. Can also be supplied as a dictionary that
       maps parameters or panel widgets to callback argument names that
       will then be automatically converted to the equivalent list
       format.""")

    cache_size = param.Integer(default=500, bounds=(1, None), doc="""
       The number of entries to cache for fast access. This is an LRU
       cache where the least recently used item is overwritten once
       the cache is full.""")

    positional_stream_args = param.Boolean(default=False, constant=True, doc="""
       If False, stream parameters are passed to the callback as keyword arguments.
       If True, stream parameters are passed to callback as positional arguments.
       Each positional argument is a dict containing the contents of a stream.
       The positional stream arguments follow the positional arguments for each kdim,
       and they are ordered to match the order of the DynamicMap's streams list.
    """)

    def __init__(self, callback, initial_items=None, streams=None, **params):
        streams = (streams or [])
        if isinstance(streams, dict):
            streams = streams_list_from_dict(streams)

        # If callback is a parameterized method and watch is disabled add as stream
        if (params.get('watch', True) and (util.is_param_method(callback, has_deps=True) or
            (isinstance(callback, FunctionType) and hasattr(callback, '_dinfo')))):
            streams.append(callback)

        if isinstance(callback, types.GeneratorType):
            callback = Generator(callback)
        elif not isinstance(callback, Callable):
            callback = Callable(callback)

        valid, invalid = Stream._process_streams(streams)
        if invalid:
            msg = ('The supplied streams list contains objects that '
                   'are not Stream instances: {objs}')
            raise TypeError(msg.format(objs = ', '.join('%r' % el for el in invalid)))

        super().__init__(initial_items, callback=callback, streams=valid, **params)

        if self.callback.noargs:
            prefix = 'DynamicMaps using generators (or callables without arguments)'
            if self.kdims:
                raise Exception(prefix + ' must be declared without key dimensions')
            if len(self.streams)> 1:
                raise Exception(prefix + ' must have either streams=[] or a single, '
                                + 'stream instance without any stream parameters')
            if self._stream_parameters() != []:
                raise Exception(prefix + ' cannot accept any stream parameters')

        if self.positional_stream_args:
            self._posarg_keys = None
        else:
            self._posarg_keys = util.validate_dynamic_argspec(
                self.callback, self.kdims, self.streams
            )

        # Set source to self if not already specified
        for stream in self.streams:
            if stream.source is None:
                stream.source = self
            if isinstance(stream, Params):
                for p in stream.parameters:
                    if isinstance(p.owner, Stream) and p.owner.source is None:
                        p.owner.source = self

        self.periodic = periodic(self)

        self._current_key = None

    @property
    def opts(self):
        return Opts(self, mode='dynamicmap')

    @property
    def redim(self):
        return Redim(self, mode='dynamic')

    @property
    def unbounded(self):
        """
        Returns a list of key dimensions that are unbounded, excluding
        stream parameters. If any of theses key dimensions are
        unbounded, the DynamicMap as a whole is also unbounded.
        """
        unbounded_dims = []
        # Dimensioned streams do not need to be bounded
        stream_params = set(self._stream_parameters())
        for kdim in self.kdims:
            if str(kdim) in stream_params:
                continue
            if kdim.values:
                continue
            if None in kdim.range:
                unbounded_dims.append(str(kdim))
        return unbounded_dims

    @property
    def current_key(self):
        """Returns the current key value."""
        return self._current_key

    def _stream_parameters(self):
        return util.stream_parameters(
            self.streams, no_duplicates=not self.positional_stream_args
        )

    def _initial_key(self):
        """
        Construct an initial key for based on the lower range bounds or
        values on the key dimensions.
        """
        key = []
        undefined = []
        stream_params = set(self._stream_parameters())
        for kdim in self.kdims:
            if str(kdim) in stream_params:
                key.append(None)
            elif kdim.default is not None:
                key.append(kdim.default)
            elif kdim.values:
                if all(util.isnumeric(v) for v in kdim.values):
                    key.append(sorted(kdim.values)[0])
                else:
                    key.append(kdim.values[0])
            elif kdim.range[0] is not None:
                key.append(kdim.range[0])
            else:
                undefined.append(kdim)
        if undefined:
            msg = ('Dimension(s) {undefined_dims} do not specify range or values needed '
                   'to generate initial key')
            undefined_dims = ', '.join(['%r' % str(dim) for dim in undefined])
            raise KeyError(msg.format(undefined_dims=undefined_dims))

        return tuple(key)


    def _validate_key(self, key):
        """
        Make sure the supplied key values are within the bounds
        specified by the corresponding dimension range and soft_range.
        """
        if key == () and len(self.kdims) == 0: return ()
        key = util.wrap_tuple(key)
        assert len(key) == len(self.kdims)
        for ind, val in enumerate(key):
            kdim = self.kdims[ind]
            low, high = util.max_range([kdim.range, kdim.soft_range])
            if util.is_number(low) and util.isfinite(low):
                if val < low:
                    raise KeyError("Key value %s below lower bound %s"
                                   % (val, low))
            if util.is_number(high) and util.isfinite(high):
                if val > high:
                    raise KeyError("Key value %s above upper bound %s"
                                   % (val, high))

    def event(self, **kwargs):
        """Updates attached streams and triggers events

        Automatically find streams matching the supplied kwargs to
        update and trigger events on them.

        Args:
            **kwargs: Events to update streams with
        """
        if self.callback.noargs and self.streams == []:
            self.param.warning(
                'No streams declared. To update a DynamicMaps using '
                'generators (or callables without arguments) use streams=[Next()]')
            return
        if self.streams == []:
            self.param.warning('No streams on DynamicMap, calling event '
                               'will have no effect')
            return

        stream_params = set(self._stream_parameters())
        invalid = [k for k in kwargs.keys() if k not in stream_params]
        if invalid:
            msg = 'Key(s) {invalid} do not correspond to stream parameters'
            raise KeyError(msg.format(invalid = ', '.join('%r' % i for i in invalid)))

        streams = []
        for stream in self.streams:
            contents = stream.contents
            applicable_kws = {k:v for k,v in kwargs.items()
                              if k in set(contents.keys())}
            if not applicable_kws and contents:
                continue
            streams.append(stream)
            rkwargs = util.rename_stream_kwargs(stream, applicable_kws, reverse=True)
            stream.update(**rkwargs)

        Stream.trigger(streams)


    def _style(self, retval):
        "Applies custom option tree to values return by the callback."
        from ..util import opts
        if self.id not in Store.custom_options():
            return retval
        spec = StoreOptions.tree_to_dict(Store.custom_options()[self.id])
        return opts.apply_groups(retval, options=spec)


    def _execute_callback(self, *args):
        "Executes the callback with the appropriate args and kwargs"
        self._validate_key(args)      # Validate input key

        # Additional validation needed to ensure kwargs don't clash
        kdims = [kdim.name for kdim in self.kdims]
        kwarg_items = [s.contents.items() for s in self.streams]
        hash_items = tuple(tuple(sorted(s.hashkey.items())) for s in self.streams)+args
        flattened = [(k,v) for kws in kwarg_items for (k,v) in kws
                     if k not in kdims]

        if self.positional_stream_args:
            kwargs = {}
            args = args + tuple([s.contents for s in self.streams])
        elif self._posarg_keys:
            kwargs = dict(flattened, **dict(zip(self._posarg_keys, args)))
            args = ()
        else:
            kwargs = dict(flattened)
        if not isinstance(self.callback, Generator):
            kwargs['_memoization_hash_'] = hash_items

        with dynamicmap_memoization(self.callback, self.streams):
            retval = self.callback(*args, **kwargs)
        return self._style(retval)


    def options(self, *args, **kwargs):
        """Applies simplified option definition returning a new object.

        Applies options defined in a flat format to the objects
        returned by the DynamicMap. If the options are to be set
        directly on the objects returned by the DynamicMap a simple
        format may be used, e.g.:

            obj.options(cmap='viridis', show_title=False)

        If the object is nested the options must be qualified using
        a type[.group][.label] specification, e.g.:

            obj.options('Image', cmap='viridis', show_title=False)

        or using:

            obj.options({'Image': dict(cmap='viridis', show_title=False)})

        Args:
            *args: Sets of options to apply to object
                Supports a number of formats including lists of Options
                objects, a type[.group][.label] followed by a set of
                keyword options to apply and a dictionary indexed by
                type[.group][.label] specs.
            backend (optional): Backend to apply options to
                Defaults to current selected backend
            clone (bool, optional): Whether to clone object
                Options can be applied inplace with clone=False
            **kwargs: Keywords of options
                Set of options to apply to the object

        Returns:
            Returns the cloned object with the options applied
        """
        if 'clone' not in kwargs:
            kwargs['clone'] = True
        return self.opts(*args, **kwargs)


    def clone(self, data=None, shared_data=True, new_type=None, link=True,
              *args, **overrides):
        """Clones the object, overriding data and parameters.

        Args:
            data: New data replacing the existing data
            shared_data (bool, optional): Whether to use existing data
            new_type (optional): Type to cast object to
            link (bool, optional): Whether clone should be linked
                Determines whether Streams and Links attached to
                original object will be inherited.
            *args: Additional arguments to pass to constructor
            **overrides: New keyword arguments to pass to constructor

        Returns:
            Cloned object
        """
        callback = overrides.pop('callback', self.callback)
        if data is None and shared_data:
            data = self.data
            if link and callback is self.callback:
                overrides['plot_id'] = self._plot_id
        clone = super(UniformNdMapping, self).clone(
            callback, shared_data, new_type, link,
            *(data,) + args, **overrides)

        # Ensure the clone references this object to ensure
        # stream sources are inherited
        if clone.callback is self.callback:
            from ..operation import function
            with util.disable_constant(clone):
                op = function.instance(fn=lambda x, **kwargs: x)
                clone.callback = clone.callback.clone(
                    inputs=[self], link_inputs=link, operation=op,
                    operation_kwargs={}
                )
        return clone


    def reset(self):
        "Clear the DynamicMap cache"
        self.data = OrderedDict()
        return self


    def _cross_product(self, tuple_key, cache, data_slice):
        """
        Returns a new DynamicMap if the key (tuple form) expresses a
        cross product, otherwise returns None. The cache argument is a
        dictionary (key:element pairs) of all the data found in the
        cache for this key.

        Each key inside the cross product is looked up in the cache
        (self.data) to check if the appropriate element is
        available. Otherwise the element is computed accordingly.

        The data_slice may specify slices into each value in the
        the cross-product.
        """
        if not any(isinstance(el, (list, set)) for el in tuple_key):
            return None
        if len(tuple_key)==1:
            product = tuple_key[0]
        else:
            args = [set(el) if isinstance(el, (list,set))
                    else set([el]) for el in tuple_key]
            product = itertools.product(*args)

        data = []
        for inner_key in product:
            key = util.wrap_tuple(inner_key)
            if key in cache:
                val = cache[key]
            else:
                val = self._execute_callback(*key)
            if data_slice:
                val = self._dataslice(val, data_slice)
            data.append((key, val))
        product = self.clone(data)

        if data_slice:
            from ..util import Dynamic
            dmap = Dynamic(self, operation=lambda obj, **dynkwargs: obj[data_slice],
                           streams=self.streams)
            dmap.data = product.data
            return dmap
        return product


    def _slice_bounded(self, tuple_key, data_slice):
        """
        Slices bounded DynamicMaps by setting the soft_ranges on
        key dimensions and applies data slice to cached and dynamic
        values.
        """
        slices = [el for el in tuple_key if isinstance(el, slice)]
        if any(el.step for el in slices):
            raise Exception("DynamicMap slices cannot have a step argument")
        elif len(slices) not in [0, len(tuple_key)]:
            raise Exception("Slices must be used exclusively or not at all")
        elif not slices:
            return None

        sliced = self.clone(self)
        for i, slc in enumerate(tuple_key):
            (start, stop) = slc.start, slc.stop
            if start is not None and start < sliced.kdims[i].range[0]:
                raise Exception("Requested slice below defined dimension range.")
            if stop is not None and stop > sliced.kdims[i].range[1]:
                raise Exception("Requested slice above defined dimension range.")
            sliced.kdims[i].soft_range = (start, stop)
        if data_slice:
            if not isinstance(sliced, DynamicMap):
                return self._dataslice(sliced, data_slice)
            else:
                from ..util import Dynamic
                if len(self):
                    slices = [slice(None) for _ in range(self.ndims)] + list(data_slice)
                    sliced = super(DynamicMap, sliced).__getitem__(tuple(slices))
                dmap = Dynamic(self, operation=lambda obj, **dynkwargs: obj[data_slice],
                               streams=self.streams)
                dmap.data = sliced.data
                return dmap
        return sliced


    def __getitem__(self, key):
        """Evaluates DynamicMap with specified key.

        Indexing into a DynamicMap evaluates the dynamic function with
        the specified key unless the key and corresponding value are
        already in the cache. This may also be used to evaluate
        multiple keys or even a cross-product of keys if a list of
        values per Dimension are defined. Once values are in the cache
        the DynamicMap can be cast to a HoloMap.

        Args:
            key: n-dimensional key corresponding to the key dimensions
               Scalar values will be evaluated as normal while lists
               of values will be combined to form the cross-product,
               making it possible to evaluate many keys at once.

        Returns:
            Returns evaluated callback return value for scalar key
            otherwise returns cloned DynamicMap containing the cross-
            product of evaluated items.
        """
        self._current_key = key

        # Split key dimensions and data slices
        sample = False
        if key is Ellipsis:
            return self
        elif isinstance(key, (list, set)) and all(isinstance(v, tuple) for v in key):
            map_slice, data_slice = key, ()
            sample = True
        elif self.positional_stream_args:
            # First positional args are dynamic map kdim indices, remaining args
            # are stream values, not data_slice values
            map_slice, _ = self._split_index(key)
            data_slice = ()
        else:
            map_slice, data_slice = self._split_index(key)
        tuple_key = util.wrap_tuple_streams(map_slice, self.kdims, self.streams)

        # Validation
        if not sample:
            sliced = self._slice_bounded(tuple_key, data_slice)
            if sliced is not None:
                return sliced

        # Cache lookup
        try:
            dimensionless = util.dimensionless_contents(get_nested_streams(self),
                                                        self.kdims, no_duplicates=False)
            empty = self._stream_parameters() == [] and self.kdims==[]
            if dimensionless or empty:
                raise KeyError('Using dimensionless streams disables DynamicMap cache')
            cache = super().__getitem__(key)
        except KeyError:
            cache = None

        # If the key expresses a cross product, compute the elements and return
        product = self._cross_product(tuple_key, cache.data if cache else {}, data_slice)
        if product is not None:
            return product

        # Not a cross product and nothing cached so compute element.
        if cache is not None: return cache
        val = self._execute_callback(*tuple_key)
        if data_slice:
            val = self._dataslice(val, data_slice)
        self._cache(tuple_key, val)
        return val


    def select(self, selection_specs=None, **kwargs):
        """Applies selection by dimension name

        Applies a selection along the dimensions of the object using
        keyword arguments. The selection may be narrowed to certain
        objects using selection_specs. For container objects the
        selection will be applied to all children as well.

        Selections may select a specific value, slice or set of values:

        * value: Scalar values will select rows along with an exact
                 match, e.g.:

            ds.select(x=3)

        * slice: Slices may be declared as tuples of the upper and
                 lower bound, e.g.:

            ds.select(x=(0, 3))

        * values: A list of values may be selected using a list or
                  set, e.g.:

            ds.select(x=[0, 1, 2])

        Args:
            selection_specs: List of specs to match on
                A list of types, functions, or type[.group][.label]
                strings specifying which objects to apply the
                selection on.
            **selection: Dictionary declaring selections by dimension
                Selections can be scalar values, tuple ranges, lists
                of discrete values and boolean arrays

        Returns:
            Returns an Dimensioned object containing the selected data
            or a scalar if a single value was selected
        """
        if selection_specs is not None and not isinstance(selection_specs, (list, tuple)):
            selection_specs = [selection_specs]
        selection = super().select(selection_specs=selection_specs, **kwargs)
        def dynamic_select(obj, **dynkwargs):
            if selection_specs is not None:
                matches = any(obj.matches(spec) for spec in selection_specs)
            else:
                matches = True
            if matches:
                return obj.select(**kwargs)
            return obj

        if not isinstance(selection, DynamicMap):
            return dynamic_select(selection)
        else:
            from ..util import Dynamic
            dmap = Dynamic(self, operation=dynamic_select, streams=self.streams)
            dmap.data = selection.data
            return dmap


    def _cache(self, key, val):
        """
        Request that a key/value pair be considered for caching.
        """
        cache_size = (1 if util.dimensionless_contents(
            self.streams, self.kdims, no_duplicates=not self.positional_stream_args)
                      else self.cache_size)
        if len(self) >= cache_size:
            first_key = next(k for k in self.data)
            self.data.pop(first_key)
        self[key] = val


    def map(self, map_fn, specs=None, clone=True, link_inputs=True):
        """Map a function to all objects matching the specs

        Recursively replaces elements using a map function when the
        specs apply, by default applies to all objects, e.g. to apply
        the function to all contained Curve objects:

            dmap.map(fn, hv.Curve)

        Args:
            map_fn: Function to apply to each object
            specs: List of specs to match
                List of types, functions or type[.group][.label] specs
                to select objects to return, by default applies to all
                objects.
            clone: Whether to clone the object or transform inplace

        Returns:
            Returns the object after the map_fn has been applied
        """
        deep_mapped = super().map(map_fn, specs, clone)
        if isinstance(deep_mapped, type(self)):
            from ..util import Dynamic
            def apply_map(obj, **dynkwargs):
                return obj.map(map_fn, specs, clone)
            dmap = Dynamic(self, operation=apply_map, streams=self.streams,
                           link_inputs=link_inputs)
            dmap.data = deep_mapped.data
            return dmap
        return deep_mapped


    def relabel(self, label=None, group=None, depth=1):
        """Clone object and apply new group and/or label.

        Applies relabeling to children up to the supplied depth.

        Args:
            label (str, optional): New label to apply to returned object
            group (str, optional): New group to apply to returned object
            depth (int, optional): Depth to which relabel will be applied
                If applied to container allows applying relabeling to
                contained objects up to the specified depth

        Returns:
            Returns relabelled object
        """
        relabelled = super().relabel(label, group, depth)
        if depth > 0:
            from ..util import Dynamic
            def dynamic_relabel(obj, **dynkwargs):
                return obj.relabel(group=group, label=label, depth=depth-1)
            dmap = Dynamic(self, streams=self.streams, operation=dynamic_relabel)
            dmap.data = relabelled.data
            with util.disable_constant(dmap):
                dmap.group = relabelled.group
                dmap.label = relabelled.label
            return dmap
        return relabelled

    def _split_overlays(self):
        """
        Splits a DynamicMap into its components. Only well defined for
        DynamicMap with consistent number and order of layers.
        """
        if not len(self):
            raise ValueError('Cannot split DynamicMap before it has been initialized')
        elif not issubclass(self.type, CompositeOverlay):
            return None, self

        from ..util import Dynamic
        keys = list(self.last.data.keys())
        dmaps = []
        for key in keys:
            el = self.last.data[key]
            def split_overlay_callback(obj, overlay_key=key, overlay_el=el, **kwargs):
                spec = util.get_overlay_spec(obj, overlay_key, overlay_el)
                items = list(obj.data.items())
                specs = [(i, util.get_overlay_spec(obj, k, v))
                         for i, (k, v) in enumerate(items)]
                match = util.closest_match(spec, specs)
                if match is None:
                    raise KeyError('{spec} spec not found in {otype}. The split_overlays method '
                                   'only works consistently for a DynamicMap where the '
                                   'layers of the {otype} do not change.'.format(
                                       spec=spec, otype=type(obj).__name__))
                return items[match][1]
            dmap = Dynamic(self, streams=self.streams, operation=split_overlay_callback)
            dmap.data = OrderedDict([(list(self.data.keys())[-1], self.last.data[key])])
            dmaps.append(dmap)
        return keys, dmaps

    def decollate(self):
        """Packs DynamicMap of nested DynamicMaps into a single DynamicMap that
        returns a non-dynamic element

        Decollation allows packing a DynamicMap of nested DynamicMaps into a single
        DynamicMap that returns a simple (non-dynamic) element. All nested streams are
        lifted to the resulting DynamicMap, and are available in the `streams`
        property.  The `callback` property of the resulting DynamicMap is a pure,
        stateless function of the stream values. To avoid stream parameter name
        conflicts, the resulting DynamicMap is configured with
        positional_stream_args=True, and the callback function accepts stream values
        as positional dict arguments.

        Returns:
            DynamicMap that returns a non-dynamic element
        """
        from .decollate import decollate
        return decollate(self)

    def collate(self):
        """Unpacks DynamicMap into container of DynamicMaps

        Collation allows unpacking DynamicMaps which return Layout,
        NdLayout or GridSpace objects into a single such object
        containing DynamicMaps. Assumes that the items in the layout
        or grid that is returned do not change.

        Returns:
            Collated container containing DynamicMaps
        """
        # Initialize
        if self.last is not None:
            initialized = self
        else:
            initialized = self.clone()
            initialized[initialized._initial_key()]

        if not isinstance(initialized.last, (Layout, NdLayout, GridSpace)):
            return self

        container = initialized.last.clone(shared_data=False)
        type_counter = defaultdict(int)

        # Get stream mapping from callback
        remapped_streams = []
        self_dstreams = util.dimensioned_streams(self)
        streams = self.callback.stream_mapping
        for i, (k, v) in enumerate(initialized.last.data.items()):
            vstreams = streams.get(i, [])
            if not vstreams:
                if isinstance(initialized.last, Layout):
                    for l in range(len(k)):
                        path = '.'.join(k[:l])
                        if path in streams:
                            vstreams = streams[path]
                            break
                else:
                    vstreams = streams.get(k, [])
            if any(s in remapped_streams for s in vstreams):
                raise ValueError(
                    "The stream_mapping supplied on the Callable "
                    "is ambiguous please supply more specific Layout "
                    "path specs.")
            remapped_streams += vstreams

            # Define collation callback
            def collation_cb(*args, **kwargs):
                layout = self[args]
                layout_type = type(layout).__name__
                if len(container.keys()) != len(layout.keys()):
                    raise ValueError('Collated DynamicMaps must return '
                                     '%s with consistent number of items.'
                                     % layout_type)

                key = kwargs['selection_key']
                index = kwargs['selection_index']
                obj_type = kwargs['selection_type']
                dyn_type_map = defaultdict(list)
                for k, v in layout.data.items():
                    if k == key:
                        return layout[k]
                    dyn_type_map[type(v)].append(v)

                dyn_type_counter = {t: len(vals) for t, vals in dyn_type_map.items()}
                if dyn_type_counter != type_counter:
                    raise ValueError('The objects in a %s returned by a '
                                     'DynamicMap must consistently return '
                                     'the same number of items of the '
                                     'same type.' % layout_type)
                return dyn_type_map[obj_type][index]

            callback = Callable(partial(collation_cb, selection_key=k,
                                        selection_index=type_counter[type(v)],
                                        selection_type=type(v)),
                                inputs=[self])
            vstreams = list(util.unique_iterator(self_dstreams + vstreams))
            vdmap = self.clone(callback=callback, shared_data=False,
                               streams=vstreams)
            type_counter[type(v)] += 1

            # Remap source of streams
            for stream in vstreams:
                if stream.source is self:
                    stream.source = vdmap
            container[k] = vdmap

        unmapped_streams = [repr(stream) for stream in self.streams
                            if (stream.source is self) and
                            (stream not in remapped_streams)
                            and stream.linked]
        if unmapped_streams:
            raise ValueError(
                'The following streams are set to be automatically '
                'linked to a plot, but no stream_mapping specifying '
                'which item in the (Nd)Layout to link it to was found:\n%s'
                % ', '.join(unmapped_streams)
            )
        return container


    def groupby(self, dimensions=None, container_type=None, group_type=None, **kwargs):
        """Groups DynamicMap by one or more dimensions

        Applies groupby operation over the specified dimensions
        returning an object of type container_type (expected to be
        dictionary-like) containing the groups.

        Args:
            dimensions: Dimension(s) to group by
            container_type: Type to cast group container to
            group_type: Type to cast each group to
            dynamic: Whether to return a DynamicMap
            **kwargs: Keyword arguments to pass to each group

        Returns:
            Returns object of supplied container_type containing the
            groups. If dynamic=True returns a DynamicMap instead.
        """
        if dimensions is None:
            dimensions = self.kdims
        if not isinstance(dimensions, (list, tuple)):
            dimensions = [dimensions]

        container_type = container_type if container_type else type(self)
        group_type = group_type if group_type else type(self)

        outer_kdims = [self.get_dimension(d) for d in dimensions]
        inner_kdims = [d for d in self.kdims if not d in outer_kdims]

        outer_dynamic = issubclass(container_type, DynamicMap)
        inner_dynamic = issubclass(group_type, DynamicMap)

        if ((not outer_dynamic and any(not d.values for d in outer_kdims)) or
            (not inner_dynamic and any(not d.values for d in inner_kdims))):
            raise Exception('Dimensions must specify sampling via '
                            'values to apply a groupby')

        if outer_dynamic:
            def outer_fn(*outer_key, **dynkwargs):
                if inner_dynamic:
                    def inner_fn(*inner_key, **dynkwargs):
                        outer_vals = zip(outer_kdims, util.wrap_tuple(outer_key))
                        inner_vals = zip(inner_kdims, util.wrap_tuple(inner_key))
                        inner_sel = [(k.name, v) for k, v in inner_vals]
                        outer_sel = [(k.name, v) for k, v in outer_vals]
                        return self.select(**dict(inner_sel+outer_sel))
                    return self.clone([], callback=inner_fn, kdims=inner_kdims)
                else:
                    dim_vals = [(d.name, d.values) for d in inner_kdims]
                    dim_vals += [(d.name, [v]) for d, v in
                                   zip(outer_kdims, util.wrap_tuple(outer_key))]
                    with item_check(False):
                        selected = HoloMap(self.select(**dict(dim_vals)))
                        return group_type(selected.reindex(inner_kdims))
            if outer_kdims:
                return self.clone([], callback=outer_fn, kdims=outer_kdims)
            else:
                return outer_fn(())
        else:
            outer_product = itertools.product(*[self.get_dimension(d).values
                                                for d in dimensions])
            groups = []
            for outer in outer_product:
                outer_vals = [(d.name, [o]) for d, o in zip(outer_kdims, outer)]
                if inner_dynamic or not inner_kdims:
                    def inner_fn(outer_vals, *key, **dynkwargs):
                        inner_dims = zip(inner_kdims, util.wrap_tuple(key))
                        inner_vals = [(d.name, k) for d, k in inner_dims]
                        return self.select(**dict(outer_vals+inner_vals)).last
                    if inner_kdims or self.streams:
                        callback = Callable(partial(inner_fn, outer_vals),
                                            inputs=[self])
                        group = self.clone(
                            callback=callback, kdims=inner_kdims
                        )
                    else:
                        group = inner_fn(outer_vals, ())
                    groups.append((outer, group))
                else:
                    inner_vals = [(d.name, self.get_dimension(d).values)
                                     for d in inner_kdims]
                    with item_check(False):
                        selected = HoloMap(self.select(**dict(outer_vals+inner_vals)))
                        group = group_type(selected.reindex(inner_kdims))
                    groups.append((outer, group))
            return container_type(groups, kdims=outer_kdims)


    def grid(self, dimensions=None, **kwargs):
        """
        Groups data by supplied dimension(s) laying the groups along
        the dimension(s) out in a GridSpace.

        Args:
        dimensions: Dimension/str or list
            Dimension or list of dimensions to group by

        Returns:
        grid: GridSpace
            GridSpace with supplied dimensions
        """
        return self.groupby(dimensions, container_type=GridSpace, **kwargs)


    def layout(self, dimensions=None, **kwargs):
        """
        Groups data by supplied dimension(s) laying the groups along
        the dimension(s) out in a NdLayout.

        Args:
        dimensions: Dimension/str or list
            Dimension or list of dimensions to group by

        Returns:
        layout: NdLayout
            NdLayout with supplied dimensions
        """
        return self.groupby(dimensions, container_type=NdLayout, **kwargs)


    def overlay(self, dimensions=None, **kwargs):
        """Group by supplied dimension(s) and overlay each group

        Groups data by supplied dimension(s) overlaying the groups
        along the dimension(s).

        Args:
            dimensions: Dimension(s) of dimensions to group by

        Returns:
            NdOverlay object(s) with supplied dimensions
        """
        if dimensions is None:
            dimensions = self.kdims
        else:
            if not isinstance(dimensions, (list, tuple)):
                dimensions = [dimensions]
            dimensions = [self.get_dimension(d, strict=True)
                          for d in dimensions]
        dims = [d for d in self.kdims if d not in dimensions]
        return self.groupby(dims, group_type=NdOverlay)


    def hist(self, dimension=None, num_bins=20, bin_range=None,
             adjoin=True, **kwargs):
        """Computes and adjoins histogram along specified dimension(s).

        Defaults to first value dimension if present otherwise falls
        back to first key dimension.

        Args:
            dimension: Dimension(s) to compute histogram on
            num_bins (int, optional): Number of bins
            bin_range (tuple optional): Lower and upper bounds of bins
            adjoin (bool, optional): Whether to adjoin histogram

        Returns:
            AdjointLayout of DynamicMap and adjoined histogram if
            adjoin=True, otherwise just the histogram
        """
        def dynamic_hist(obj, **dynkwargs):
            if isinstance(obj, (NdOverlay, Overlay)):
                index = kwargs.get('index', 0)
                obj = obj.get(index)
            return obj.hist(
                dimension=dimension,
                num_bins=num_bins,
                bin_range=bin_range,
                adjoin=False,
                **kwargs
            )

        from ..util import Dynamic
        hist = Dynamic(self, streams=self.streams, link_inputs=False,
                       operation=dynamic_hist)
        if adjoin:
            return self << hist
        else:
            return hist


    def reindex(self, kdims=[], force=False):
        """Reorders key dimensions on DynamicMap

        Create a new object with a reordered set of key dimensions.
        Dropping dimensions is not allowed on a DynamicMap.

        Args:
            kdims: List of dimensions to reindex the mapping with
            force: Not applicable to a DynamicMap

        Returns:
            Reindexed DynamicMap
        """
        if not isinstance(kdims, list):
            kdims = [kdims]
        kdims = [self.get_dimension(kd, strict=True) for kd in kdims]
        dropped = [kd for kd in self.kdims if kd not in kdims]
        if dropped:
            raise ValueError("DynamicMap does not allow dropping dimensions, "
                             "reindex may only be used to reorder dimensions.")
        return super().reindex(kdims, force)


    def drop_dimension(self, dimensions):
        raise NotImplementedError('Cannot drop dimensions from a DynamicMap, '
                                  'cast to a HoloMap first.')

    def add_dimension(self, dimension, dim_pos, dim_val, vdim=False, **kwargs):
        raise NotImplementedError('Cannot add dimensions to a DynamicMap, '
                                  'cast to a HoloMap first.')

    def __next__(self):
        if self.callback.noargs:
            return self[()]
        else:
            raise Exception('The next method can only be used for DynamicMaps using'
                            'generators (or callables without arguments)')

Parameters

Name    Type      Default    Kind
bases   HoloMap   -          -

Parameter Details

bases: HoloMap, the base class that DynamicMap extends

Return Value

Not applicable (DynamicMap is a class; see the Class Interface below for per-method return values)

Class Interface

Methods

__init__(self, callback, initial_items, streams)

Purpose: Initializes the DynamicMap: wraps the callback in a Callable (or Generator), validates the supplied streams and registers the DynamicMap as their source.

Parameters:

  • callback: Parameter
  • initial_items: Parameter
  • streams: Parameter

Returns: None

opts(self) property

Purpose: Returns an Opts helper bound to this DynamicMap (mode='dynamicmap'), used to apply plotting and style options.

Returns: Opts instance

redim(self) property

Purpose: Returns a Redim helper in dynamic mode, used to rename, re-range or otherwise redefine the dimensions of the DynamicMap.

Returns: Redim instance
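
A brief sketch of the helper in use, assuming the standard hv API; the dimension name x is illustrative:

import holoviews as hv

dmap = hv.DynamicMap(lambda x: hv.Curve([(0, 0), (1, x)]), kdims=['x'])

# Bound the dimension with a continuous range...
bounded = dmap.redim.range(x=(0, 10))
# ...or with an explicit sample of values
sampled = dmap.redim.values(x=[1, 2, 3])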

unbounded(self) property

Purpose: Returns a list of key dimensions that are unbounded, excluding stream parameters. If any of these key dimensions are unbounded, the DynamicMap as a whole is also unbounded.

Returns: List of the names of unbounded key dimensions

current_key(self) property

Purpose: Returns the current key value.

Returns: The most recently requested key (None before the first evaluation)

_stream_parameters(self)

Purpose: Collects the parameter names supplied by the attached streams.

Returns: List of stream parameter names

_initial_key(self)

Purpose: Constructs an initial key based on the defaults, declared values or lower range bounds of the key dimensions.

Returns: Tuple forming the initial key

_validate_key(self, key)

Purpose: Make sure the supplied key values are within the bounds specified by the corresponding dimension range and soft_range.

Parameters:

  • key: Parameter

Returns: None

event(self)

Purpose: Updates attached streams and triggers events. Automatically finds the streams matching the supplied keyword arguments, updates them and triggers events on them.

Parameters:

  • **kwargs: Stream parameter values to update the streams with

Returns: None
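
A hedged sketch of programmatic stream updates via event(), assuming the standard hv.streams.PointerXY stream:

import holoviews as hv
from holoviews.streams import PointerXY

pointer = PointerXY(x=0.0, y=0.0)
dmap = hv.DynamicMap(lambda x, y: hv.Points([(x, y)]), streams=[pointer])

# Matches the 'x' and 'y' stream parameters, updates them and triggers an update
dmap.event(x=1.0, y=2.0)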

_style(self, retval)

Purpose: Applies the custom option tree to values returned by the callback.

Parameters:

  • retval: Parameter

Returns: The callback's return value with any custom options applied

_execute_callback(self)

Purpose: Executes the callback with the positional and keyword arguments derived from the supplied key and the stream contents.

Returns: The element returned by the callback, with any custom options applied

options(self)

Purpose: Applies simplified option definitions, returning a new object. Options set directly on the objects returned by the DynamicMap may use a flat format, e.g. obj.options(cmap='viridis', show_title=False); for nested objects they must be qualified by a type[.group][.label] spec, e.g. obj.options('Image', cmap='viridis', show_title=False) or obj.options({'Image': dict(cmap='viridis', show_title=False)}). An optional backend may be specified, and clone=False applies the options in place.

Returns: The cloned object with the options applied

clone(self, data, shared_data, new_type, link)

Purpose: Clones the object, overriding data and parameters. data replaces the existing data, shared_data controls whether the existing data is reused, new_type casts the clone and link determines whether Streams and Links attached to the original are inherited; additional positional and keyword arguments are passed through to the constructor.

Parameters:

  • data: Parameter
  • shared_data: Parameter
  • new_type: Parameter
  • link: Parameter

Returns: Cloned object

reset(self)

Purpose: Clear the DynamicMap cache

Returns: The DynamicMap itself, with its cache cleared

_cross_product(self, tuple_key, cache, data_slice)

Purpose: Returns a new DynamicMap if the key (tuple form) expresses a cross product, otherwise returns None. The cache argument is a dictionary (key: element pairs) of all the data found in the cache for this key. Each key inside the cross product is looked up in the cache (self.data) and the element is computed if it is not available. The data_slice may specify slices into each value in the cross-product.

Parameters:

  • tuple_key: Parameter
  • cache: Parameter
  • data_slice: Parameter

Returns: A new DynamicMap containing the cross-product of evaluated keys, or None if the key does not express a cross product

_slice_bounded(self, tuple_key, data_slice)

Purpose: Slices bounded DynamicMaps by setting the soft_ranges on key dimensions and applies data slice to cached and dynamic values.

Parameters:

  • tuple_key: Parameter
  • data_slice: Parameter

Returns: The sliced DynamicMap, or None if no slices were supplied

__getitem__(self, key)

Purpose: Evaluates the DynamicMap with the specified key. Indexing evaluates the dynamic callback unless the key and corresponding value are already in the cache. Lists of values per dimension evaluate a cross-product of keys, and once values are in the cache the DynamicMap can be cast to a HoloMap.

Parameters:

  • key: Parameter

Returns: The evaluated callback return value for a scalar key, otherwise a cloned DynamicMap containing the cross-product of evaluated items
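
A sketch of scalar and cross-product indexing, assuming the standard hv API; the dimension name freq and its values are illustrative:

import numpy as np
import holoviews as hv

def curve(freq):
    xs = np.linspace(0, 1, 100)
    return hv.Curve((xs, np.sin(2 * np.pi * freq * xs)))

dmap = hv.DynamicMap(curve, kdims=['freq']).redim.values(freq=[1, 2, 3])

element = dmap[2]          # scalar key: evaluates the callback (or hits the cache)
subset = dmap[[1, 3]]      # list of values: evaluates the cross-product of keys
hmap = hv.HoloMap(subset)  # cached entries can then be cast to a HoloMap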

select(self, selection_specs)

Purpose: Applies a selection along the object's dimensions using keyword arguments, e.g. ds.select(x=3) for an exact match, ds.select(x=(0, 3)) for a slice or ds.select(x=[0, 1, 2]) for a set of values. The selection may be narrowed to particular objects with selection_specs and, for container objects, is applied to all children as well.

Parameters:

  • selection_specs: Parameter

Returns: A Dimensioned object containing the selected data, or a scalar if a single value was selected

_cache(self, key, val)

Purpose: Request that a key/value pair be considered for caching.

Parameters:

  • key: Parameter
  • val: Parameter

Returns: None

map(self, map_fn, specs, clone, link_inputs)

Purpose: Recursively replaces elements by applying a map function to all objects matching the supplied specs; by default the function is applied to every contained object, e.g. dmap.map(fn, hv.Curve) applies fn to all contained Curve objects.

Parameters:

  • map_fn: Parameter
  • specs: Parameter
  • clone: Parameter
  • link_inputs: Parameter

Returns: The object after map_fn has been applied
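
A hedged sketch of mapping over the dynamic output, assuming the DynamicMap returns hv.Curve elements; the relabel inside the mapped function is illustrative:

import holoviews as hv

dmap = hv.DynamicMap(lambda x: hv.Curve([(0, 0), (1, x)]), kdims=['x'])
dmap = dmap.redim.values(x=[1, 2, 3])

# Applied to every Curve the DynamicMap produces, now and on later evaluations
relabelled = dmap.map(lambda curve: curve.relabel('Scaled'), hv.Curve)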

relabel(self, label, group, depth)

Purpose: Clones the object and applies a new group and/or label, relabelling contained objects up to the supplied depth.

Parameters:

  • label: Parameter
  • group: Parameter
  • depth: Parameter

Returns: Relabelled object
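
A small sketch of dynamic relabelling; the group and label strings are illustrative:

import holoviews as hv

dmap = hv.DynamicMap(lambda x: hv.Curve([(0, 0), (1, x)]), kdims=['x'])

# Both the DynamicMap and the elements it returns carry the new group and label
renamed = dmap.relabel(group='Waves', label='Sensor A')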

_split_overlays(self)

Purpose: Splits a DynamicMap into its components. Only well defined for DynamicMap with consistent number and order of layers.

Returns: A tuple of the overlay keys and a list of DynamicMaps, one per overlay layer (or (None, self) if the output is not a CompositeOverlay)

decollate(self)

Purpose: Packs a DynamicMap of nested DynamicMaps into a single DynamicMap that returns a simple (non-dynamic) element. All nested streams are lifted to the resulting DynamicMap and exposed on its streams property, and its callback is a pure, stateless function of the stream values. To avoid stream parameter name conflicts, the result is configured with positional_stream_args=True, so the callback receives stream values as positional dict arguments.

Returns: DynamicMap that returns a non-dynamic element

collate(self)

Purpose: Unpacks a DynamicMap that returns Layout, NdLayout or GridSpace objects into a single container of that type holding DynamicMaps, assuming the items in the returned layout or grid do not change.

Returns: Collated container containing DynamicMaps

groupby(self, dimensions, container_type, group_type)

Purpose: Groups the DynamicMap by one or more dimensions, returning an object of container_type (expected to be dictionary-like) containing the groups, each cast to group_type. Additional keyword arguments are passed to each group. Non-dynamic container or group types require the corresponding dimensions to declare explicit values.

Parameters:

  • dimensions: Parameter
  • container_type: Parameter
  • group_type: Parameter

Returns: Object of the supplied container_type containing the groups
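
A hedged sketch of grouping into a non-dynamic container, assuming the standard hv API; the grouped dimension must declare explicit values because NdLayout is not dynamic:

import numpy as np
import holoviews as hv

def curve(freq, amp):
    xs = np.linspace(0, 1, 100)
    return hv.Curve((xs, amp * np.sin(2 * np.pi * freq * xs)))

dmap = hv.DynamicMap(curve, kdims=['freq', 'amp'])
dmap = dmap.redim.values(freq=[1, 2, 3], amp=[0.5, 1.0])

# One DynamicMap (still keyed on 'amp') per 'freq' value, laid out in an NdLayout
grouped = dmap.groupby('freq', container_type=hv.NdLayout)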

grid(self, dimensions)

Purpose: Groups the data by the supplied dimension(s), laying the groups out in a GridSpace.

Parameters:

  • dimensions: Parameter

Returns: GridSpace with the supplied dimensions

layout(self, dimensions)

Purpose: Groups the data by the supplied dimension(s), laying the groups out in an NdLayout.

Parameters:

  • dimensions: Parameter

Returns: NdLayout with the supplied dimensions

overlay(self, dimensions)

Purpose: Groups the data by the supplied dimension(s) and overlays each group along those dimension(s) in an NdOverlay.

Parameters:

  • dimensions: Parameter

Returns: NdOverlay object(s) with the supplied dimensions
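
A sketch of overlaying along one dimension, assuming the standard hv API; the overlaid dimension must declare values because NdOverlay is not dynamic:

import numpy as np
import holoviews as hv

xs = np.linspace(0, 1, 100)
dmap = hv.DynamicMap(
    lambda freq, amp: hv.Curve((xs, amp * np.sin(2 * np.pi * freq * xs))),
    kdims=['freq', 'amp']
).redim.values(freq=[1, 2, 3], amp=[0.5, 1.0])

# The remaining 'freq' dimension stays dynamic; each 'amp' value is overlaid
overlaid = dmap.overlay('amp')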

hist(self, dimension, num_bins, bin_range, adjoin)

Purpose: Computes a histogram along the specified dimension(s) and, by default, adjoins it to the DynamicMap. Defaults to the first value dimension if present, otherwise falls back to the first key dimension.

Parameters:

  • dimension: Parameter
  • num_bins: Parameter
  • bin_range: Parameter
  • adjoin: Parameter

Returns: AdjointLayout of the DynamicMap and the adjoined histogram if adjoin=True, otherwise just the histogram
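
A hedged sketch of a dynamically recomputed histogram, assuming the returned element declares a 'y' value dimension; names and values are illustrative:

import numpy as np
import holoviews as hv

def scatter(seed):
    rng = np.random.default_rng(int(seed))
    return hv.Scatter((np.arange(100), rng.normal(size=100)), 'x', 'y')

dmap = hv.DynamicMap(scatter, kdims=['seed']).redim.values(seed=[0, 1, 2])

# Histogram of the 'y' dimension, recomputed whenever the DynamicMap updates
with_hist = dmap.hist(dimension='y', num_bins=30)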

reindex(self, kdims, force)

Purpose: Creates a new DynamicMap with a reordered set of key dimensions; dropping dimensions is not allowed on a DynamicMap, and force is not applicable.

Parameters:

  • kdims: Parameter
  • force: Parameter

Returns: Reindexed DynamicMap
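
A small sketch of reordering (not dropping) key dimensions, assuming the standard hv API:

import holoviews as hv

dmap = hv.DynamicMap(lambda x, y: hv.Points([(x, y)]), kdims=['x', 'y'])

# Reordering is allowed; omitting a dimension would raise a ValueError
reordered = dmap.reindex(['y', 'x'])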

drop_dimension(self, dimensions)

Purpose: Not supported on a DynamicMap; raises NotImplementedError advising to cast to a HoloMap first.

Parameters:

  • dimensions: Parameter

Returns: None

add_dimension(self, dimension, dim_pos, dim_val, vdim)

Purpose: Not supported on a DynamicMap; raises NotImplementedError advising to cast to a HoloMap first.

Parameters:

  • dimension: Parameter
  • dim_pos: Parameter
  • dim_val: Parameter
  • vdim: Parameter

Returns: None

__next__(self)

Purpose: Advances a generator-based DynamicMap (or one whose callable takes no arguments) by evaluating it with an empty key; raises an exception for any other DynamicMap.

Returns: The next element produced by the callback
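
A hedged sketch of the generator pattern; note that a generator instance (not the function) is passed and that no key dimensions may be declared:

import holoviews as hv

def count_points():
    # An infinite generator yielding a growing set of points
    points = []
    i = 0
    while True:
        points.append((i, i % 5))
        i += 1
        yield hv.Points(points)

dmap = hv.DynamicMap(count_points())

element = next(dmap)  # advances the generator and returns the new element
element = next(dmap)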

Required Imports

import itertools
import types
from numbers import Number
from itertools import groupby
from functools import partial

Usage Example

# Example usage (the first positional argument is the callback):
# dmap = DynamicMap(callback, kdims=['x'])
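
A fuller, hedged sketch combining a key dimension with a custom stream, assuming the standard HoloViews API; the Phase stream and all names are illustrative:

import numpy as np
import holoviews as hv
from holoviews.streams import Stream

# A custom stream with a single 'phase' parameter
Phase = Stream.define('Phase', phase=0.0)
phase_stream = Phase()

def waveform(frequency, phase):
    # 'frequency' comes from the key dimension, 'phase' from the stream
    xs = np.linspace(0, 2 * np.pi, 200)
    return hv.Curve((xs, np.sin(frequency * xs + phase)), 'x', 'y')

dmap = hv.DynamicMap(waveform, kdims=['frequency'], streams=[phase_stream])
dmap = dmap.redim.values(frequency=[1, 2, 3])

# Updating the stream re-invokes the callback with the new phase value
dmap.event(phase=np.pi / 2)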

Similar Components

AI-powered semantic similarity - components with related functionality:

  • class HoloMap 70.1% similar

    HoloMap is an n-dimensional mapping container that stores viewable elements or overlays indexed by tuple keys along declared key dimensions, enabling interactive exploration through widgets.

    From: /tf/active/vicechatdev/patches/spaces.py
  • class Callable 63.4% similar

    Callable is a wrapper class for callback functions used with DynamicMaps, providing memoization, stream management, and input/output tracking capabilities.

    From: /tf/active/vicechatdev/patches/spaces.py
  • function dimensioned_streams 61.2% similar

    Filters and returns streams from a DynamicMap that have parameters matching the DynamicMap's key dimensions.

    From: /tf/active/vicechatdev/patches/util.py
  • function dynamicmap_memoization 61.1% similar

    A context manager that temporarily controls memoization behavior of a callable object based on the state of associated streams, disabling memoization when transient streams are triggering.

    From: /tf/active/vicechatdev/patches/spaces.py
  • class periodic_v1 59.3% similar

    A utility class that manages periodic event updates for DynamicMap objects, allowing scheduled triggering of stream updates that can be started and stopped.

    From: /tf/active/vicechatdev/patches/spaces.py