Prikladnaya Diskretnaya Matematika

Prikl. Diskr. Mat., 2020, Number 48, Pages 100–108 (Mi pdm708)  

Computational Methods in Discrete Mathematics

Associative memory based on cellular neural networks with bipolar stepwise activation function

M. S. Tarkov

Rzhanov Institute of Semiconductor Physics SB RAS, Novosibirsk, Russia

Abstract: Cellular neural networks (CNNs) with a bipolar stepwise activation function, trained on a given set of binary reference images, are considered. The trained CNN variants with different cell neighborhood sizes were tested on the problem of filtering noisy reference images. It is established that the global training methods traditionally used in Hopfield networks (the Hebb method and the projection method) generate high-level noise (tens of percent) at the output of cellular networks even in the absence of input noise. A local analogue of the projection method is proposed that filters noisy images significantly better than the classical local perceptron learning algorithm. The local Hebb method outperforms these two methods only for the minimal neighborhood and high noise levels (at least $70\,\%$). The influence of the number of quantization levels of the CNN weights on the CNN information capacity is investigated. It is shown that: 1) with more than $8$ quantization levels and $16\times 16$ neurons, the capacity of the CNN with quantized weights trained by the local Hebb rule approaches the capacity of the CNN with continuous weights; 2) with the local projection method, a similar result is achieved with at least $64$ quantization levels.
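The scheme described in the abstract can be illustrated with a minimal sketch: a CNN on a toroidal grid whose cells are connected only within an $r$-neighborhood, trained by the local Hebb rule, updated synchronously through the bipolar step activation, with an optional uniform weight quantizer. This is an assumption-laden illustration, not the paper's implementation: the neighborhood shape (Chebyshev radius $r$), the toroidal boundary, the synchronous update, and the symmetric uniform quantizer are all choices made here for concreteness.

```python
import numpy as np

def bipolar_step(u):
    # Bipolar stepwise activation: maps any real input to {-1, +1}.
    return np.where(u >= 0, 1, -1)

def neighborhood_offsets(r):
    # Chebyshev r-neighborhood of a cell, excluding the cell itself.
    return [(di, dj) for di in range(-r, r + 1)
                     for dj in range(-r, r + 1) if (di, dj) != (0, 0)]

def train_local_hebb(patterns, r):
    """Local Hebb rule (assumed form): one weight map per neighborhood
    offset, w_off(i,j) = (1/P) * sum_p x_p(i,j) * x_p at the offset cell,
    with toroidal wraparound via np.roll."""
    P = patterns.shape[0]
    return {off: sum(x * np.roll(x, shift=off, axis=(0, 1)) for x in patterns) / P
            for off in neighborhood_offsets(r)}

def cnn_step(state, weights):
    # Synchronous update: each cell sums its weighted neighborhood
    # inputs, then applies the bipolar step activation.
    u = sum(w * np.roll(state, shift=off, axis=(0, 1))
            for off, w in weights.items())
    return bipolar_step(u)

def recall(noisy, weights, iters=10):
    # Iterate the CNN until a fixed point (filtered image) is reached.
    x = noisy.copy()
    for _ in range(iters):
        nxt = cnn_step(x, weights)
        if np.array_equal(nxt, x):
            break
        x = nxt
    return x

def quantize(w, levels):
    # Symmetric uniform quantizer over the observed weight range
    # (an assumed quantization scheme for the capacity experiment).
    m = np.max(np.abs(w))
    if m == 0:
        return w
    step = 2 * m / (levels - 1)
    return np.round(w / step) * step
```

As a usage sketch: store one random $8\times 8$ bipolar pattern, flip a pixel, and `recall` restores it; quantizing the weight maps with `quantize(w, 8)` before recall preserves their signs here, so the filtering behavior is unchanged in this toy case.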

Keywords: cellular neural networks, noise filtering, perceptron training algorithm, local projection method, cell neighborhood, informational capacity of cellular neural network, weight quantization.

DOI: https://doi.org/10.17223/20710410/48/9

Full text: PDF file (1142 kB)

UDC: 621.396:621.372

Citation: M. S. Tarkov, “Associative memory based on cellular neural networks with bipolar stepwise activation function”, Prikl. Diskr. Mat., 2020, no. 48, 100–108

Citation in format AMSBIB
\Bibitem{Tar20}
\by M.~S.~Tarkov
\paper Associative memory based on cellular neural networks with bipolar stepwise activation function
\jour Prikl. Diskr. Mat.
\yr 2020
\issue 48
\pages 100--108
\mathnet{http://mi.mathnet.ru/pdm708}
\crossref{https://doi.org/10.17223/20710410/48/9}


Linking options:
  • http://mi.mathnet.ru/eng/pdm708
  • http://mi.mathnet.ru/eng/pdm/y2020/i2/p100


© Steklov Mathematical Institute RAS, 2021