Introducing low-rank and sparse constraints can effectively restore the low-rank structure of images and remove noise and outliers. Meanwhile, the regularization term further enhances robustness and generalization, enabling the method to adapt to different types of visual art image restoration tasks.
3.1. Low-rank Visual Art Image Restoration Based on Progressive Regularization
LRM restoration has been extensively applied in many fields such as image restoration and deblurring. In an image data matrix, each row or column can be approximately expressed in terms of other rows or columns, meaning that the matrix contains a large amount of redundant information [16]. In this sense, image restoration is a typical LRM restoration problem. The image restoration process is displayed in Fig. 1.
Fig. 1. Image recovery process.
In Fig. 1, image restoration repairs damaged regions and eliminates blurring in the image. Although image restoration techniques can effectively repair or deblur images, there is still a significant gap in clarity between the restored image and the original image for various reasons. Therefore, the study introduces a weight update strategy. The weighted nuclear norm minimization method is used for matrix filling, as shown in Eq. (1).
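For reference, a standard formulation of weighted nuclear norm minimization for matrix completion, consistent with the description below but offered only as an illustrative form that may differ from the paper's exact Eq. (1), is

$$\min_{X}\ \|X\|_{w,*} = \sum_{i} w_i\,\sigma_i(X) \quad \text{s.t.} \quad p_{\Omega}(X) = p_{\Omega}(Y),$$

where $\sigma_i(X)$ denotes the $i$-th singular value of $X$ and $w_i \geq 0$ are the weights.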
In Eq. (1), $\Omega$ represents a binary indicator matrix of the same size as $Y$, and $p_{\Omega}(Y)$ represents the Hadamard product of the indicator matrix and the observation matrix $Y$. This constraint requires the estimated matrix $X$ to match the observation matrix $Y$ on the undamaged (observed) entries. An adjacency matrix is a matrix that describes the nearest-neighbor relationships between nodes. In this representation, a one-dimensional array stores all the vertices of the graph, while a two-dimensional array stores the relationships between them [17]. The adjacency matrix is specifically shown in Fig. 2.
Fig. 2. Adjacency matrix diagram.
In Fig. 2, $+1$ indicates that two nodes can perceive each other, while a blank entry indicates that they cannot. For an undirected graph, the adjacency matrix must be symmetric, its main diagonal entries are 0, and its off-diagonal entries are not necessarily 0. In an undirected graph, the degree of vertex $i$ equals the number of non-zero elements in row $i$. In a directed graph, the adjacency matrix is not necessarily symmetric, and the out-degree of node $i$ equals the number of non-zero elements in row $i$ [18]. LRM recovery takes the rank of the matrix as the sparsity measure. The linear measurement model is shown in Eq. (2).
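As an illustrative form consistent with the symbols defined next, and not necessarily the paper's exact Eq. (2), the linear measurement model can be written as

$$y = A(X), \qquad X \in R^{m \times n},\ y \in R^p.$$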
In Eq. (2), $y$ represents the observed data, $A$ represents the linear operator, $X \in R^{m \times n}$, and $y \in R^p$. The definition of the linear operator is specifically shown in Eq. (3).
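A common component-wise definition of such a linear operator, again given only as an illustration, is

$$[A(X)]_i = \langle A_i, X \rangle = \mathrm{tr}\!\left(A_i^{\top} X\right), \qquad i = 1, \ldots, p.$$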
In Eq. (3), $\langle A_i, X \rangle$ represents the matrix inner product of $A_i$ and $X$. From a mathematical perspective, low-rank restoration reduces to a rank minimization problem, as displayed in Eq. (4).
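An illustrative form of this rank minimization problem, using the measurement model above, is

$$\min_{X}\ rank(X) \quad \text{s.t.} \quad A(X) = y.$$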
In Eq. (4), $rank(X)$ signifies the rank function of matrix $X$. The weight vector is updated in the following form, as shown in Eq. (5).
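One common reweighting rule of this general kind, offered only as an illustration (the paper's exact Eq. (5) may differ), assigns the $i$-th singular value the weight

$$w_i^{(l)} = \frac{1}{\sigma_i\!\left(X^{(l)}\right) + \varepsilon_l},$$

so that larger singular values receive smaller weights and are therefore penalized less.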
In Eq. (5), $\varepsilon_l$ represents a sequence in the range of 0 to 1 that gradually decreases and approaches 0. Traditional regularization algorithms require $\varepsilon$ to take a small fixed value and remain constant during the iterations. This algorithm instead takes $\varepsilon_l = 1$ as the initial value and decreases it from high to low. The large initial value first admits some suboptimal solutions; as $\varepsilon_l$ decreases, the solution set continuously shrinks and gradually approaches the optimal solution. This regularization method can improve the sparsity of the rank and achieve better results. A block-based image restoration algorithm is introduced to exploit the non-local self-similarity
image restoration algorithm is introduced to address the non-local self-similarity
characteristics of images. This algorithm mainly includes similar block matching,
matrix low-rank approximation, image reconstruction, and iterative diffusion. In the
block matching stage, the image $Y$ with missing regions is first divided into $N$ overlapping blocks. For each block, a similarity measure is used to find other blocks that are similar to it. Next, each similar block is reshaped into a column vector, and a block matrix is formed by stacking all of these column vectors side by side. The
process of building a block matrix is shown in Fig. 3.
Fig. 3. The process of building a block matrix.
In Fig. 3, to build the block matrix, the art image is first divided into several small blocks, and for each block other similar blocks in the image are found through a similarity search algorithm. Each similar block is converted into column vector form, and these column vectors are stacked side by side to form the block matrix. In this process, the adjacency matrix plays an important role by describing the adjacency relationships between nodes. For image restoration, the nodes here can be image blocks, and the adjacency matrix reflects the similarity or correlation between these image blocks. In this way, the adjacency matrix helps the algorithm identify and exploit similar areas in the image, thereby preserving the structure and texture information of the image during the repair process. In image restoration, the adjacency matrix not only helps to search for and match similar blocks, but also helps to optimize the combination of image blocks during the iterative process, so that the repaired image better retains the features and style of the original image.
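To make the block matching and block-matrix construction described above concrete, the following is a minimal Python sketch. The patch size, search stride, number of similar blocks, and the Euclidean distance used as the similarity measure are assumptions for illustration rather than the paper's exact settings.

```python
import numpy as np

def build_block_matrix(image, ref_top, ref_left, patch=8, stride=4, n_similar=16):
    """Collect patches similar to a reference patch and stack them as columns."""
    ref = image[ref_top:ref_top + patch, ref_left:ref_left + patch]
    h, w = image.shape
    candidates = []
    for top in range(0, h - patch + 1, stride):
        for left in range(0, w - patch + 1, stride):
            blk = image[top:top + patch, left:left + patch]
            # Euclidean distance as the (assumed) similarity measure
            candidates.append((np.sum((blk - ref) ** 2), top, left))
    candidates.sort(key=lambda c: c[0])
    best = candidates[:n_similar]
    # Vectorize each similar patch into a column and stack the columns
    cols = [image[t:t + patch, l:l + patch].reshape(-1) for _, t, l in best]
    block_matrix = np.stack(cols, axis=1)   # shape: (patch*patch, n_similar)
    positions = [(t, l) for _, t, l in best]
    return block_matrix, positions
```

Each column of the returned matrix is a vectorized patch, so the block matrix is expected to be close to low rank when the patches are truly similar. In practice the search is usually restricted to a local window around the reference patch for speed; the exhaustive search here only keeps the sketch short.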
3.2. Image Restoration Based on Singular Value Entropy Function
Singular value arrangement in image restoration uses singular value decomposition to decompose an image, obtain a set of singular values, and arrange these singular values in descending order. In image restoration, this singular value arrangement is used to restore or reconstruct damaged or missing image parts. Specifically, singular value decomposition factorizes a complex image matrix into several simple matrices, of which the singular value matrix is an important component. These singular values reflect the important features and structural information of the image. Therefore, the main features can be preserved by retaining the larger singular values, while the smaller singular values can be ignored or set to zero to remove noise, fill missing parts, or perform other image restoration operations [19,20]. Singular value decomposition and arrangement are shown in Fig. 4.
Fig. 4. Singular value decomposition and arrangement.
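A minimal Python sketch of the truncation step described above, which keeps the larger singular values and sets the smaller ones to zero; the keep-ratio rule is an assumption for illustration.

```python
import numpy as np

def truncate_singular_values(block_matrix, keep_ratio=0.1):
    """Low-rank approximation: keep the largest singular values, zero out the rest."""
    U, s, Vt = np.linalg.svd(block_matrix, full_matrices=False)
    k = max(1, int(np.ceil(keep_ratio * len(s))))  # assumed rule: keep the top 10%
    s_kept = np.zeros_like(s)
    s_kept[:k] = s[:k]                             # smaller singular values set to zero
    return (U * s_kept) @ Vt                       # reconstructed low-rank matrix
```

The asymptotically regularized shrinkage discussed in this paper replaces such a hard cut-off with a weighted contraction of the singular values; the hard truncation here only illustrates the general idea of keeping the dominant singular values.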
In image processing, scratches are equivalent to impulse noise at different positions in the spatial domain, making the disorder of pixel values in the image more severe. In contrast, pixel values that are randomly lost according to a uniform distribution cause far less disorder. For highly disordered systems, the entropy minimization method can provide effective improvement. To address this issue, an image repair method based on entropy weights is designed to further optimize the treatment of scratches. The specific definition of the entropy function is shown in Eq. (6).
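A common way to define the entropy of a nonnegative vector $x$, given here only as an illustrative form that may differ from the paper's exact Eq. (6), normalizes the entries and applies the Shannon entropy:

$$E(x) = -\sum_{i} \frac{|x_i|}{\|x\|_1} \ln \frac{|x_i|}{\|x\|_1}.$$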
In Eq. (6), $x \in R$. Information theory indicates that a random variable attains its maximum entropy when it is uniformly distributed. Conversely, a lower entropy value means that the values are more concentrated, that is, the vector $x$ is sparser. Building on the low-rank approximation, the singular value distribution of the repaired image can be improved, resulting in better sparsity and more concentrated main information, thereby improving the quality of image restoration. Therefore, the study introduces another rank approximation method, namely the entropy function of the singular values. For an LRM $X$, the rank minimization approximation is equivalent to solving the minimization problem of the corresponding singular value entropy function. This approach can better capture and utilize the low-rank structure of matrices, thereby optimizing the image restoration effect. The rank minimization approximation is specifically shown in Eq. (7).
Under certain fidelity constraints, the equivalent form of the above problem is displayed in Eq. (8).
The entropy function minimization can be transformed into a weighted nuclear norm minimization problem. This entropy-based weight is applied to the image restoration framework to process the image block by block. Unlike global low-rank modeling, image restoration is achieved by utilizing the low-rankness of similar-block matrices and solving the weighted nuclear norm minimization problem corresponding to the entropy gradient weights. This better captures local structures and maintains their integrity during the repair process, thereby improving the repair effect. The consensus weight method can cause the repair result to be overly smooth, so the weight value is specifically defined as shown in Eq. (9).
During the iteration process, an iterative regularization step is introduced to reinforce the improved estimation results by back-projecting the residual image onto the estimated image. Specifically, at the beginning of each iteration, the estimated image output by the previous iteration is updated and then used as the input image for that iteration. This method uses the information from the previous iteration to gradually optimize the image, thereby improving estimation accuracy and stability, as shown in Eq. (10).
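One common form of such a back-projection update, consistent with the roles of $\hat{X}$ and $\delta$ described next but offered only as an illustration, is

$$Y^{(k)} = \hat{X}^{(k-1)} + \delta\, p_{\Omega}\!\left(Y - \hat{X}^{(k-1)}\right),$$

in which the residual on the observed entries is fed back into the input of the current iteration.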
In Eq. (10), $\hat{X}$ represents the repaired estimate of the damaged image, and $\delta$ is an adjustment parameter. Peak Signal-to-Noise Ratio (PSNR) is an objective standard used to assess image or video quality. PSNR is frequently used to compare the quality differences between original images and compressed, processed, or reconstructed images, as shown in Eq. (11).
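For 8-bit images (peak value 255), the standard definition of PSNR is

$$\mathrm{PSNR} = 10 \log_{10} \frac{255^2}{\mathrm{MSE}}, \qquad \mathrm{MSE} = \frac{1}{mn} \sum_{i=1}^{m} \sum_{j=1}^{n} \left(X_{ij} - \hat{X}_{ij}\right)^2,$$

where $X$ is the original image and $\hat{X}$ the restored image.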
Parameters $M$ and $\delta$ directly affect the effectiveness of the algorithm. The impact of parameters $M$ and $\delta$ on PSNR is shown in Fig. 5.
Fig. 5. Effect of δ on PSNR.
Finally, $M$ is chosen as 5 and $\delta$ as 0.4 to complete the image restoration task. Mean Absolute Error (MAE) and Root Mean Square Error (RMSE) are taken as evaluation metrics to measure the error in the repair results. MAE is a commonly used error measure that represents the average absolute deviation between the repaired values and the true values. A small MAE indicates that the repair result is close to the true value and the repair effect is better. The MAE is shown in Eq. (12).
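The standard definition of MAE over $N$ pixels is

$$\mathrm{MAE} = \frac{1}{N} \sum_{i=1}^{N} \left|\hat{x}_i - x_i\right|,$$

where $\hat{x}_i$ and $x_i$ denote the repaired and true pixel values, respectively.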
RMSE considers the variance and mean of the repair error. Compared with MAE, RMSE
is more sensitive to outliers. A small RMSE value indicates better stability of the
repair result and better repair effect. The RMSE is shown in Eq. (13).
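Correspondingly, the standard definition of RMSE is

$$\mathrm{RMSE} = \sqrt{\frac{1}{N} \sum_{i=1}^{N} \left(\hat{x}_i - x_i\right)^2}.$$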
The steps of the LRM image restoration method based on the asymptotically regularized singular value function roughly include image partitioning, constructing similar block groups, singular value decomposition, singular value shrinkage, reconstructing the LRM, and image block aggregation, as shown in Fig. 6.
Fig. 6. The specific process steps of the algorithm.
In Fig. 6, the image to be restored is divided into overlapping or non-overlapping blocks during the restoration process. For each image block, other similar blocks are searched for and combined into a matrix. The matrix composed of each similar block group is subjected to singular value decomposition, and the asymptotically regularized singular value function is used to shrink the singular values: each singular value is adjusted according to its size and the preset thresholds or regularization parameters. The adjusted singular values and the corresponding left and right singular vectors are then used to reconstruct an LRM. All reconstructed low-rank matrices are converted back into image blocks and placed back in their corresponding positions in the image; if the blocks overlap, weighted averaging may be needed to fuse the overlapping regions. The above process is repeated until the image quality no longer improves significantly. The weight update strategy uses the weighted nuclear norm minimization method to fill the matrix, in which the weight vector gradually decreases during the iterative process: the initial value is large so that some suboptimal solutions are included, and it then gradually decreases to approximate the optimal solution. This process enhances the sparsity of the rank and thus yields a better repair effect. The regularization term effectively restores the low-rank structure of the image and removes noise and outliers by introducing low-rank and sparse constraints. At the same time, the robustness and generalization ability of the algorithm are enhanced, so that it can adapt to different types of visual art image restoration tasks. During restoration, these parameters interact with the low-rank components of the image: the damaged image is decomposed into low-rank and sparse components through the low-rank approximation model, the low-rank matrix is estimated by the iterative optimization algorithm, and the progressive regularization strategy is used to gradually approximate the original image. Finally, the repaired low-rank and sparse components are synthesized to obtain the repaired image.
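The following is a condensed Python sketch of the overall block-based pipeline summarized above, reusing the build_block_matrix and truncate_singular_values helpers sketched earlier. The iteration count, block geometry, simple-averaging aggregation, and the use of a back-projection step with parameter delta are assumptions for illustration rather than the paper's exact algorithm.

```python
import numpy as np

def restore_image(Y, mask, iters=10, patch=8, stride=4, delta=0.4, keep_ratio=0.1):
    """Block-based low-rank restoration: match blocks, shrink singular values, aggregate.

    Y    : degraded image (2D float array), missing pixels set to 0
    mask : binary indicator of observed pixels (the role of Omega in the text)
    """
    X = Y.copy()
    h, w = Y.shape
    for _ in range(iters):
        # Back-project the observed residual into the estimate (iterative regularization)
        X = X + delta * mask * (Y - X)
        acc = np.zeros_like(X)
        cnt = np.zeros_like(X)
        for top in range(0, h - patch + 1, stride):
            for left in range(0, w - patch + 1, stride):
                blocks, positions = build_block_matrix(X, top, left, patch, stride)
                low_rank = truncate_singular_values(blocks, keep_ratio)
                # Put repaired blocks back and average the overlapping regions
                for col, (t, l) in enumerate(positions):
                    acc[t:t + patch, l:l + patch] += low_rank[:, col].reshape(patch, patch)
                    cnt[t:t + patch, l:l + patch] += 1
        X = np.where(cnt > 0, acc / np.maximum(cnt, 1), X)
    return X
```

The back-projection line plays the role attributed to Eq. (10) under the illustrative form given earlier, and the weighted averaging of overlapping blocks corresponds to the aggregation step described for Fig. 6.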