leakage_from_time_evolution_operator
- leakage_from_time_evolution_operator(matrix)
Compute the leakage from the qubit subspace of a quantum operation.
Leakage is defined as the probability that a quantum operation causes the system to evolve outside the qubit subspace. For a unitary operator, the leakage is zero. For a non-unitary operator (e.g., a unitary projected onto the qubit subspace), the leakage is strictly positive and at most 1.
- Parameters:
matrix (ndarray | Array) – A square complex matrix representing the quantum operation. It may also be a multidimensional array whose last two dimensions correspond to the time evolution operators.
- Return type:
- Returns:
Leakage from the qubit subspace.
- Raises:
SimphonyError – If the input matrix is not square in the last two dimensions.
Notes
The leakage \(L\) is calculated as:
\[L = 1 - \frac{1}{d} \text{Tr}(U^\dagger U),\]
where \(d\) is the dimension of the matrix \(U\), and \(U^\dagger\) is the Hermitian conjugate of \(U\).
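The formula above can be sketched directly in NumPy. The `leakage` function below is a hypothetical stand-in written for illustration, not the library implementation; it uses the identity \(\text{Tr}(U^\dagger U) = \sum_{ij} |U_{ij}|^2\) so it also handles batched (multidimensional) inputs:

```python
import numpy as np

def leakage(matrix: np.ndarray) -> np.ndarray:
    """Illustrative sketch of L = 1 - Tr(U^dagger U) / d.

    `matrix` may carry leading batch dimensions; the last two
    dimensions must form a square matrix.
    """
    d = matrix.shape[-1]
    if matrix.shape[-2] != d:
        raise ValueError("matrix must be square in the last two dimensions")
    # Tr(U^dagger U) equals the sum of |U_ij|^2 over the last two axes.
    tr = np.sum(np.abs(matrix) ** 2, axis=(-2, -1))
    return 1.0 - tr / d

# A unitary (here the Pauli-X gate) has zero leakage.
X = np.array([[0, 1], [1, 0]], dtype=complex)
print(leakage(X))  # 0.0

# A qutrit rotation that fully transfers population out of level 1,
# projected onto the qubit subspace, loses half the trace norm.
P = np.array([[1, 0], [0, 0]], dtype=complex)
print(leakage(P))  # 0.5
```

Summing \(|U_{ij}|^2\) avoids forming the product \(U^\dagger U\) explicitly, which keeps the batched case a single vectorized reduction.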