Converts the Map<int, ObjectRef> type to the Matrix<ObjectRef> type when the ObjectRef of the given Map<int, ObjectRef> is either the Matrix<float> type or the Matrix<complex<float> > type.
No files are required.
When to use
This node is used to convert the Map<int, ObjectRef> type to the Matrix<ObjectRef> type when the ObjectRef of the given Map<int, ObjectRef> is either the Matrix<float> type or the Matrix<complex<float> > type. Given a Map<int, Matrix<float> > as the input, the node outputs a Matrix<float>. Given a Map<int, Matrix<complex<float> > > as the input, it outputs a Matrix<complex<float> >.
Input
: Map<int, Matrix<float> > or Map<int, Matrix<complex<float> > > of the Map<int, ObjectRef> type.
Output
: any type. Note that the supported data types are the Matrix<float> type and the Matrix<complex<float> > type.
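At the type level, the node collapses a keyed collection of matrices into a single matrix of the same element type. The minimal C++ sketch below pictures that relation; Matrix here is only a nested-vector stand-in for the library's Matrix class, and mapToMatrix is a hypothetical helper name used for illustration, not the node's API.

```cpp
#include <complex>
#include <map>
#include <string>
#include <vector>

// Stand-in for the library's Matrix<T> (hypothetical, for illustration only).
template <typename T>
using Matrix = std::vector<std::vector<T>>;

// Map<int, Matrix<float>>               -> Matrix<float>
// Map<int, Matrix<std::complex<float>>> -> Matrix<std::complex<float>>
template <typename T>
Matrix<T> mapToMatrix(const std::map<int, Matrix<T>>& input, const std::string& method);
```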
Parameter
| Parameter name | Type | Default value | Unit | Description |
| METHOD | string | min_key | | The method to convert the Map<int, ObjectRef> type to the Matrix<ObjectRef> type. Select min_key, max_key, average, or summation. |
| DEBUG | bool | false | | Enable or disable output of the conversion status to the standard output. |
METHOD
: string type. The method to convert the Map<int, ObjectRef> type to the Matrix<ObjectRef> type. Select min_key, max_key, average, or summation. Selecting min_key or max_key outputs the Matrix<float> or Matrix<complex<float> > stored under the minimum or the maximum key of the input Map<int, ObjectRef>. Selecting average or summation outputs a Matrix<float> or a Matrix<complex<float> > whose values are the element-wise average or summation of the values of the input Map<int, ObjectRef>. The default value is min_key.
DEBUG
: bool type. Setting the value to true outputs the conversion status to the standard output. The default value is false.
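The four METHOD values can be pictured with a small self-contained sketch, using the same nested-vector stand-in for Matrix<float> as above; mapToMatrix remains a hypothetical helper, not the node's actual implementation.

```cpp
#include <cstddef>
#include <iterator>
#include <map>
#include <stdexcept>
#include <string>
#include <vector>

// Stand-in for Matrix<float> (hypothetical, for illustration only).
using Matrix = std::vector<std::vector<float>>;

// Illustrates the METHOD semantics described above.
Matrix mapToMatrix(const std::map<int, Matrix>& input, const std::string& method) {
    if (input.empty()) throw std::invalid_argument("empty input map");

    // std::map keeps keys sorted, so begin()/rbegin() give the minimum/maximum key.
    if (method == "min_key") return input.begin()->second;
    if (method == "max_key") return input.rbegin()->second;

    // average / summation: element-wise accumulation over all stored matrices.
    Matrix acc = input.begin()->second;
    for (auto it = std::next(input.begin()); it != input.end(); ++it)
        for (std::size_t r = 0; r < acc.size(); ++r)
            for (std::size_t c = 0; c < acc[r].size(); ++c)
                acc[r][c] += it->second[r][c];

    if (method == "summation") return acc;
    if (method == "average") {
        for (auto& row : acc)
            for (float& v : row) v /= static_cast<float>(input.size());
        return acc;
    }
    throw std::invalid_argument("unknown METHOD: " + method);
}
```

The element-wise reading of average and summation matches the worked example at the end of this section.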
The correspondence between the input type, the METHOD value, and the output is summarized below; the numbers (1) to (4) refer to the example outputs at the end of this section.
| INPUT | METHOD | OUTPUT |
| Map<int, Matrix<float> > | min_key | Matrix<float> (example output (1)) |
| Map<int, Matrix<float> > | max_key | Matrix<float> (example output (2)) |
| Map<int, Matrix<float> > | average | Matrix<float> (example output (3)) |
| Map<int, Matrix<float> > | summation | Matrix<float> (example output (4)) |
| Map<int, Matrix<complex<float> > > | min_key | Matrix<complex<float> > |
| Map<int, Matrix<complex<float> > > | max_key | Matrix<complex<float> > |
| Map<int, Matrix<complex<float> > > | average | Matrix<complex<float> > |
| Map<int, Matrix<complex<float> > > | summation | Matrix<complex<float> > |
Example
INPUT: Three input values, each consisting of a key and a 2x2 matrix.
\[ \begin{array}{ccc} \left\{ \begin{array}{cc} 0, & \left[ \begin{array}{cc} 1 & 2\\ 3 & 4 \end{array} \right] \end{array} \right\} , & \left\{ \begin{array}{cc} 1, & \left[ \begin{array}{cc} 5 & 6\\ 7 & 8 \end{array} \right] \end{array} \right\} , & \left\{ \begin{array}{cc} 2, & \left[ \begin{array}{cc} 9 & 10\\ 11 & 12 \end{array} \right] \end{array} \right\} \end{array} \]
OUTPUT(1): the 2x2 matrix stored under key 0, the minimum key (min_key).
\[ \left[ \begin{array}{cc} 1 & 2\\ 3 & 4 \end{array} \right] \]
OUTPUT(2): the 2x2 matrix stored under key 2, the maximum key (max_key).
\[ \left[ \begin{array}{cc} 9 & 10\\ 11 & 12 \end{array} \right] \]
OUTPUT(3): a 2x2 matrix, the element-wise average of the matrices for keys 0 to 2 (average).
\[ \left[ \begin{array}{cc} 5 & 6\\ 7 & 8 \end{array} \right] \]
OUTPUT(4): a 2x2 matrix, the element-wise summation of the matrices for keys 0 to 2 (summation).
\[ \left[ \begin{array}{cc} 15 & 18\\ 21 & 24 \end{array} \right] \]
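As a quick arithmetic check of outputs (3) and (4), the short program below reproduces the element-wise summation and average of the three 2x2 matrices from the example. It uses the same nested-vector stand-in for Matrix<float> and is illustrative only.

```cpp
#include <cstdio>
#include <map>
#include <vector>

int main() {
    // Example input: keys 0..2, each mapped to a 2x2 matrix.
    std::map<int, std::vector<std::vector<float>>> input = {
        {0, {{1, 2}, {3, 4}}},
        {1, {{5, 6}, {7, 8}}},
        {2, {{9, 10}, {11, 12}}},
    };

    // Element-wise summation over all three matrices.
    std::vector<std::vector<float>> sum(2, std::vector<float>(2, 0.0f));
    for (const auto& kv : input)
        for (int r = 0; r < 2; ++r)
            for (int c = 0; c < 2; ++c) sum[r][c] += kv.second[r][c];

    // summation -> [15 18; 21 24] (OUTPUT(4)); average -> [5 6; 7 8] (OUTPUT(3)).
    std::printf("summation: [%g %g; %g %g]\n", sum[0][0], sum[0][1], sum[1][0], sum[1][1]);
    std::printf("average:   [%g %g; %g %g]\n",
                sum[0][0] / 3, sum[0][1] / 3, sum[1][0] / 3, sum[1][1] / 3);
    return 0;
}
```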