and after decoding all windows we will therefore have committed all \acp{vn}.
\node[align=center] at ($(a00)!0.5!(b01)$)
{%
$\bm{H}_\text{overlap}^{(\ell)}$ \\[3mm]
$=
\left(\bm{H}_\text{DEM}\right)_{\mathcal{J}_\text{overlap}^{(\ell)},
\mathcal{I}_\text{commit}^{(\ell)}}$%
};
\end{tikzpicture}

\caption{Visual representation of notation used for window splitting.}
\label{fig:vis_rep}
\end{figure}

% Intro

\content{Change view from PCM to Tanner graph}
\content{Call attention to SC-LDPC-like structure}
\content{High-level overview of modification}
The sliding-window structure visible in \Cref{fig:windowing_pcm} is
highly reminiscent of the way \ac{sc}-\ac{ldpc} codes are decoded.
Switching our viewpoint to the Tanner graph depicted in
\Cref{fig:windowing_tanner}, however, reveals an important
difference between \ac{sc}-\ac{ldpc} decoding and the
sliding-window decoding procedure detailed above.
While the windowing process is similar, the algorithm above
reinitializes the decoder to a clean state when moving to
the next window.
It therefore forgoes an integral feature of \ac{sc}-\ac{ldpc}
decoding: exploiting the spatially coupled structure by passing soft
information from earlier to later spatial positions.

% Warm-Start decoding for BP
We propose a modification to the procedure detailed in
\Cref{subsec:Window Splitting and Sequential Sliding-Window Decoding}:
instead of zero-initializing the \ac{bp} messages of the next window,
we perform a \emph{warm start} by initializing the messages in the
overlapping region to the values they last held during the decoding of
the previous window.
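As a rough illustration of this carry-over (a sketch only, not the actual implementation; the function and argument names are hypothetical, and messages are stored per Tanner-graph edge):

```python
def warm_start_messages(H_next, prev_msgs, overlap_edges):
    """Initialize the BP messages of the next window.

    Edges (check, variable) inside the overlap region start from the
    message values last held in the previous window; all other edges
    start from the neutral LLR value 0.
    """
    msgs = {}
    for chk, row in enumerate(H_next):
        for var, entry in enumerate(row):
            if entry == 0:
                continue  # no Tanner-graph edge here
            edge = (chk, var)
            # warm start: reuse the previous window's message on overlap edges
            msgs[edge] = prev_msgs.get(edge, 0.0) if edge in overlap_edges else 0.0
    return msgs
```

Zero-initializing the non-overlap edges reproduces the cold-start behavior there, so the modification only affects the overlapping region.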
\begin{figure}[t]
\centering

\tikzset{
% ...
\label{fig:windowing_tanner}
\end{figure}

%%%%%%%%%%%%%%%%
\subsection{Warm-Start Belief Propagation Decoding}
\label{subsec:Warm-Start Belief Propagation Decoding}

% Warm-Start decoding for BP

\content{Pass messages to next window}
\content{(?) Explicitly mention initialization using only CN->VN
messages + swapping of CN and VN update?}
\content{(?) Algorithm}

\begin{figure}[t]
\centering

\tikzset{
% ...
\label{fig:messages_tanner}
\end{figure}

%%%%%%%%%%%%%%%%
\subsection{Warm-Start Belief Propagation with Guided Decimation Decoding}
\label{subsec:Warm-Start Belief Propagation with Guided Decimation Decoding}

% Warm-Start decoding for BPGD

\content{Modified structure of BPGD $\rightarrow$ In addition to
messages, pass decimation info}
\content{(?) Explicitly mention decimation info = channel llrs?}
\content{(?) Algorithm}
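One way the note above could be made concrete: if decimation decisions are recorded as hard values, they can be handed to the next window by saturating the corresponding channel LLRs, so the next window's \ac{bpgd} run treats those \acp{vn} as fixed. This is only a sketch under an assumed sign convention (positive LLR favors the value $0$); `carry_decimation` and its arguments are hypothetical names:

```python
import math

def carry_decimation(channel_llrs, decimated, overlap_vars):
    """Propagate decimation decisions into the next window's channel LLRs.

    Decimated variable nodes inside the overlap are pinned by saturating
    their channel LLR to +/- infinity.
    """
    llrs = dict(channel_llrs)  # leave the original intact
    for var, value in decimated.items():
        if var in overlap_vars:
            # assumed convention: +inf pins the VN to 0, -inf pins it to 1
            llrs[var] = math.inf if value == 0 else -math.inf
    return llrs
```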

\begin{figure}[t]
\centering

\tikzset{
% ...
\label{fig:messages_decimation_tanner}
\end{figure}

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\section{Numerical Results}
\label{sec:Numerical Results}

% Intro

In this section, we perform numerical experiments to evaluate the
modification to sliding-window decoding we introduced in
\Cref{sec:warm_start_bp}.
We chose to carry out our simulations on \ac{bb} codes, as they have
recently emerged as particularly promising candidates for practical
\ac{qec}, offering high encoding rates and large minimum distances
while admitting short-depth syndrome extraction circuits
\cite[Sec.~1]{bravyi_high-threshold_2024}.
Specifically, we chose the $\llbracket 144, 12, 12 \rrbracket$ \ac{bb}
code, as it represents a good trade-off between code size and
simulation tractability \cite{gong_toward_2024}.
We employ standard circuit-based depolarizing noise as described in
\Cref{subsec:Choice of Noise Model}, and report performance in terms
of the per-round \ac{ler} as defined in
\Cref{subsec:Per-Round Logical Error Rate}.
All data points have been generated by simulating at least $200$
logical error events.

For the practical aspects of the implementation, several layers of
abstraction must be considered.
The lowest layer is the circuit-level simulator.
It serves as the backbone of all further simulations, handling the
quantum mechanical aspects of the system, including the modeling of
noise on gates, idling qubits, and measurements according to the
chosen noise model.

Moving one level of abstraction higher, the syndrome extraction
circuit itself must be generated.
This entails defining the gate schedule for the ancilla measurements
and specifying the error locations introduced by the noise model,
both of which depend on the code and noise model in question.

Even further up, given an already constructed syndrome extraction
circuit and the resulting \acf{dem}, we must split the detector error
matrix into separate windows and manage the interplay between the
inner decoders acting on those individual windows.
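The splitting step itself mirrors the notation $\left(\bm{H}_\text{DEM}\right)_{\mathcal{J}^{(\ell)}, \mathcal{I}^{(\ell)}}$ used earlier: each window is a sub-matrix selected by a detector (row) index set and an error-mechanism (column) index set. A minimal sketch (hypothetical names, plain lists standing in for the actual matrix type):

```python
def window_submatrix(H_dem, detector_rows, error_cols):
    """Extract one window's sub-matrix from the full DEM check matrix,
    selecting detector rows J^(l) and error-mechanism columns I^(l)."""
    return [[H_dem[j][i] for i in error_cols] for j in detector_rows]
```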

Finally, we require the decoder itself, which operates on a
\acf{pcm} and a syndrome, with no dependence on the complexity of the
layers below.
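This layer's contract is deliberately narrow: given a \ac{pcm} and a syndrome, return an error estimate whose syndrome matches over $\mathrm{GF}(2)$. A sketch of that minimal correctness check (illustrative only; `satisfies_syndrome` is a hypothetical helper, not part of our decoders' API):

```python
def satisfies_syndrome(H, error, syndrome):
    """Check the minimal decoder contract: the returned error estimate
    must reproduce the observed syndrome over GF(2)."""
    for row, s_bit in zip(H, syndrome):
        # binary dot product of one PCM row with the error estimate
        parity = sum(h * e for h, e in zip(row, error)) % 2
        if parity != s_bit:
            return False
    return True
```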

In our implementation, Stim \cite{gidney_stim_2021} served as the
circuit-level simulator, chosen for its efficiency and native support
for the \ac{dem} formalism.
For the circuit generation, we employed utilities from QUITS
\cite{kang_quits_2025}, which provides syndrome extraction circuit
generation for a number of different \ac{qldpc} codes.
We initially created a Python implementation that also used QUITS for
the window splitting and the subsequent sliding-window decoding; the
\ac{bp} and \ac{bpgd} decoders were likewise first implemented in
Python.
After a preliminary investigation, we opted for a complete
reimplementation in Rust, whose compiled nature yields substantially
higher simulation speeds; we reimplemented both the window splitting
and the decoders themselves.
%%%%%%%%%%%%%%%%
\subsection{Belief Propagation}
