Thesis Information

Chinese title:

 基于数据处理的联想记忆网络结构扩充 (Structure Extension of Associative Memory Networks Based on Data Processing)

Name:

 李丽娜

Student ID:

 1304110237

Confidentiality level:

 Public

Language of thesis:

 Chinese

Discipline code:

 081101

Discipline:

 Engineering - Control Science and Engineering - Control Theory and Control Engineering

Student type:

 Doctoral

Degree:

 Doctor of Engineering

University:

 Xidian University

School:

 School of Mechano-Electronic Engineering

Major:

 Control Science and Engineering

Research direction:

 Topological structure extension of associative memory networks

First supervisor:

 李志武

First supervisor's affiliation:

 School of Mechano-Electronic Engineering, Xidian University

Completion date:

 2022-06-20

Defense date:

 2022-08-31

English title:

 The Structure Extension of Associative Memories Based on Data Processing

Chinese keywords:

 single-level associative memory networks; two-level associative memory networks; tensor associative memory networks; nonlinear transformation; autoencoder; fuzzy C-means

English keywords:

 single-level associative memories; two-level associative memories; tensor associative memories; nonlinear function; autoencoding; fuzzy C-means

Chinese abstract:

In recent years, advances in science and technology and in living standards have driven the pursuit of ever finer detail in recorded events and objects, and massive volumes of data records have followed. While enjoying the convenience technology brings, people face large amounts of tedious data processing. Today's data sets are not only high-dimensional but also interrelated in various ways, which places higher demands on the capacity, speed, and efficiency of data processing. Research on data-processing methods is receiving growing attention; how to record, process, and analyze data efficiently has long been an active frontier topic in the field.

This work applies optimization algorithms such as particle swarm optimization and gradient descent to a theoretical study of a series of associative memory mechanisms, and uses data-space transformations, autoencoders, and fuzzy clustering to optimize the logic of several data association mechanisms. By proposing logical extensions of the associative memory network structure, these mechanisms are further improved so that data can be stored, recalled, and processed faster, more efficiently, and more accurately, providing a theoretical basis for understanding and developing big-data processing methods and for further experimental research. The main contributions are summarized as follows:

1. An associative memory network builds a memory matrix between two data sets; the output set is obtained by a logical operation between the input set and the memory matrix. For the single-level network structure, the concept of a data-space transformation (a nonlinear piecewise-function mapping) is introduced to process the input and output sets. A data set is first transformed into a new space and then stored in the form of an associative memory matrix; at recall time, the inverse of the transformation recovers the recalled version of the original data set. Three transformation modes exist: transforming the input space, transforming the output space, and transforming both. In the experiments, nonlinear piecewise functions process the input and output sets, and the function parameters, namely the number of break points (i.e., how the domain is segmented), are optimized by a particle swarm algorithm whose objective is the error between the recalled and original data sets. The experiments focus on several common associative memory networks (correlation, fuzzy, and morphological associative memories) and demonstrate that the nonlinear transformation effectively improves storage and recall capability. They also reveal the relationship between the number of break points and the recall error of the different networks: the more break points the piecewise function has, the smaller the recall error. This shows that introducing nonlinear transformations improves the recall ability of single-level associative memory networks.

2. A two-level associative memory network adds a hidden-layer matrix to the single-level network; each original data set performs logical operations with this hidden data set, yielding two associative memory matrices. For the two-level structure, a multilayer neural network, the autoencoder, is introduced to process the data, and a two-level associative memory model with an autoencoder interface is proposed. The model consists of two units. The first uses an autoencoder to reduce the dimensionality of the original high-dimensional data, producing a good representation of the input set; this new representation helps the logic-oriented associative memory of the second unit store and recall the data, and a decoder finally recovers the recalled version of the original data set. In these experiments, particle swarm optimization, differential evolution, and gradient descent are used to optimize the autoencoder parameters. The experiments show that increasing the number of hidden layers and varying the dimensionality of each layer improve storage and recall to different degrees. Comparisons show that the two-level network with an autoencoder interface stores and recalls data more effectively than the original two-level and single-level networks.

3. When multiple data sets must be stored simultaneously, the memory matrix becomes a huge high-dimensional object. In the multi-data-set associative memory network (i.e., the tensor associative memory network), a fuzzy clustering method, fuzzy C-means, is introduced to cluster the data sets. The memory then stores the cluster centers and membership degrees of the data sets instead of one overall memory matrix for all the original data, reducing the storage and computation required; the recalled versions of the data sets are finally reconstructed from the cluster centers and membership functions. In these experiments, particle swarm optimization and gradient descent are used to select the fuzzy C-means parameters. With suitable clustering parameters, extensive experiments show that the fuzzy C-means-based tensor associative memory effectively increases storage capacity and reduces recall error.

 

Keywords: single-level associative memory networks, two-level associative memory networks, multi-data-set (tensor) associative memory networks, nonlinear transformation, autoencoder, fuzzy C-means

 

English abstract:

Recently, with the development of science and technology, data have come to be recorded at an ever finer level of detail, and massive volumes of records follow. Nowadays, data sets are not only high-dimensional but also interact with each other, which places higher requirements on the speed, mechanisms, and efficiency of data processing. More attention has been paid to research on data-processing methods; the recording, processing, and analysis of data remain essential parts of the field.

The main objective of this dissertation is to introduce a logic-oriented associative memory architecture that realizes the storage and recall mechanisms efficiently, quickly, and accurately. The dissertation carries out a series of theoretical studies and developments on associative memories using gradient-based learning mechanisms, particle swarm optimization (PSO), and differential evolution (DE), and the logic development of the memories is optimized by means of nonlinear transformations (mappings) of the data spaces, autoencoding mechanisms, and fuzzy clustering. These modifications and optimizations allow the association mechanisms to store, process, and recall data more quickly, efficiently, and accurately. The results provide a theoretical basis for the in-depth understanding and development of big-data processing with granular data, and an elementary theoretical basis for further experiments. The main results are as follows:
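The population-based optimizers named above follow the standard PSO velocity-and-position update. A minimal sketch (the inertia weight `w`, acceleration constants `c1`/`c2`, swarm size, and bounds are illustrative defaults, not the dissertation's settings):

```python
import random

def pso(f, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimize f over a box with a basic particle swarm."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm's best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # pull toward the particle's own best and the swarm's best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

In the dissertation, `f` would be the recall error of a memory as a function of the parameters being tuned (e.g., piecewise-function break points).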

1. A memory matrix is created between the input data set and the output data set. In single-level fuzzy associative memories, the data are first transformed into a new space, and the resulting objects (patterns) are stored in the memory; the recall is then obtained through the inverse of the nonlinear transformation. The objective of the transformations is to enhance the recall abilities of the memories. Nonlinear transformations are applied to transform the input space, transform the output space, and transform both input and output spaces. The transformations are realized in the form of piecewise linear functions whose parameters are optimized with PSO. A comprehensive suite of experiments involves several types of associative memories, such as correlation, fuzzy, and morphological associative memories. The experiments reveal some interesting relationships between the parameters of the nonlinear mappings and the resulting quality of recall, and help quantify the extent to which the nonlinear mappings improve it.
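As an illustration of one of the three modes, the sketch below transforms only the output space with a monotone piecewise-linear map before storing the patterns in a correlation (outer-product) memory. The break points in `knots` are hypothetical values standing in for the PSO-optimized parameters, and the inputs are taken orthonormal so the correlation recall is exact:

```python
import numpy as np

def piecewise(x, knots):
    # monotone piecewise-linear map of [0, 1] onto itself;
    # assumption: break points map to uniformly spaced outputs
    xs = np.concatenate(([0.0], np.sort(knots), [1.0]))
    ys = np.linspace(0.0, 1.0, len(xs))
    return np.interp(x, xs, ys)

def piecewise_inv(y, knots):
    # inverse map: swap the roles of the break points
    xs = np.concatenate(([0.0], np.sort(knots), [1.0]))
    ys = np.linspace(0.0, 1.0, len(xs))
    return np.interp(y, ys, xs)

# store transformed output patterns in a correlation memory M = sum_k T(y_k) x_k^T
rng = np.random.default_rng(1)
X = np.eye(4)                    # orthonormal inputs: recall is exact
Y = rng.random((4, 3))           # output patterns in [0, 1]
knots = np.array([0.3, 0.6])     # hypothetical break points (PSO would tune these)
M = piecewise(Y, knots).T @ X
y_hat = piecewise_inv(M @ X[0], knots)   # recall: inverse-transform M x
```

Here `y_hat` reproduces `Y[0]`; with non-orthonormal inputs the recall error would be nonzero, which is exactly the quantity the PSO objective measures.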

2. Two-level associative memories add one hidden-layer data set to single-level associative memories. The first memory matrix is established between the input data and the hidden matrix, and the second between the hidden matrix and the output data. We develop a logic-driven model of two-level fuzzy associative memories augmented by autoencoding. It is composed of two functional modules. The first implements an efficient dimensionality reduction of the original high-dimensional data with an autoencoder, producing a new representation of the input data; this helps the logic-oriented associative memory constituting the second module store and complete the recall. The recall data set is then obtained by the decoder. The optimization of the association matrices studied in the dissertation involves both gradient-based learning mechanisms and population-based algorithms, i.e., PSO and DE. A suite of experimental studies quantifies the performance of the proposed approach: as the number of coding layers increases and the number of nodes per layer varies, the data storage and recall capability improve to different degrees.
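A minimal sketch of the first module, assuming a one-hidden-layer linear autoencoder trained by plain gradient descent (layer size, learning rate, and epoch count are illustrative; the dissertation also tunes such parameters with PSO and DE):

```python
import numpy as np

def train_autoencoder(X, hidden, lr=0.1, epochs=3000, seed=0):
    """Fit W_enc (d x h) and W_dec (h x d) minimizing the reconstruction error
    ||X W_enc W_dec - X||^2 by gradient descent."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W_enc = rng.normal(scale=0.3, size=(d, hidden))
    W_dec = rng.normal(scale=0.3, size=(hidden, d))
    for _ in range(epochs):
        H = X @ W_enc                       # encode: compressed representation
        E = H @ W_dec - X                   # reconstruction error
        g_dec = H.T @ E / n                 # gradients of the squared error
        g_enc = X.T @ (E @ W_dec.T) / n
        W_dec -= lr * g_dec
        W_enc -= lr * g_enc
    return W_enc, W_dec

# illustrative use: rank-2 data compressed to 2 hidden dimensions
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2)) @ rng.normal(size=(2, 6)) * 0.5
W_enc, W_dec = train_autoencoder(X, hidden=2)
```

In the proposed architecture the hidden codes `X @ W_enc` would be what the logic-oriented memory stores, and `W_dec` plays the role of the decoder that recovers the recall.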

3. Associative memories need a strong capacity to store numerous data sets with minimal recall error. With the explosion of data, a huge data set can be split into subsets that interact with each other through a relationship. Tensor associative memories aim at further improving storage capacity and recall performance. In tensor associative memories, we combine the memory with cluster analysis, applying fuzzy C-means to the multiple data sets. The memory matrix is built from the cluster centers and membership degrees of the data sets, which reduces the storage and computation required by the original associative memory matrix; the recall is then reconstructed using the cluster centers and membership functions. In this part of the experiments, PSO and a gradient-based method are used to optimize the parameters of the fuzzy C-means. The results show that, for tensor associative memories, combining with fuzzy C-means effectively improves the data storage capacity and yields recall with minimal error.
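The clustering step can be sketched with the standard fuzzy C-means alternation between centers and memberships (the fuzzifier `m = 2` and iteration count are illustrative defaults; the dissertation tunes such parameters with PSO and gradient-based methods):

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    """Plain fuzzy C-means: returns cluster centers V (c x d)
    and the membership matrix U (n x c, rows sum to 1)."""
    rng = np.random.default_rng(seed)
    n = len(X)
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)       # random fuzzy partition
    for _ in range(iters):
        Um = U ** m
        V = (Um.T @ X) / Um.sum(axis=0)[:, None]             # update centers
        D = np.linalg.norm(X[:, None, :] - V[None, :, :], axis=2) + 1e-12
        U = 1.0 / (D ** (2.0 / (m - 1.0)))                   # update memberships
        U /= U.sum(axis=1, keepdims=True)
    return V, U

# illustrative use: two well-separated blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (15, 2)), rng.normal(5.0, 0.1, (15, 2))])
V, U = fuzzy_cmeans(X, c=2)
```

In the tensor memory described above, it is `V` and `U` (rather than the full data sets) that the associative memory stores, and a recall is reconstructed from them.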

 

Key words: single-level associative memories, two-level associative memories, tensor associative memories, nonlinear function, autoencoding, fuzzy C-means

参考文献:
[1] 侯媛彬, 杜京义, 汪梅. 神经网络[M]. 西安电子科技大学出版社. 2007.
[2] 陈苒, 董占球. 三类联想记忆模型的分析与比较[J], 计算机应用.2000, 22-24.
[3] A. Pedrycz, F. Dong, K. Hirota. Nonlinear mappings in problem solving and their PSO-based development[J], Information Sciences. Vol. 181,Issue 19, October 1 2011, 4112-4123.
[4] F. Li, C.H. Chang, A. Basu. A 0.7 V low-powerfully programmable Gaussian function generator for brain-inspired Gaussian correlation associative memory[J], Neurocomputing. Vol.138 , 2014, 69-77.
[5] R.J. Marks, L.E. Atlas, J.J. Choi. Performance analysis of associative memories with nonlinearity in the correlation domain[J], Applied Optics. October 27 1988, 2900-2904.
[6] Z.Y. Chen, C.P. Kwong. Recurrent correlation associative memories with multiple-value[C], in Proc. IEEE International Conference on Neural Networks. June 27-July 2 1994, 1068-1073.
[7] H.B. Ji, K.S. Leung, Y. Leung. Gaussian correlation associative memory[C], in Proc. IEEE International Conference on Neural Networks, November/ December 1995, 1761-1766.
[8] R.C. Wilson, R.E. Hancock. A study of pattern recovery in recurrent correlation associative memories[J], IEEE Transactions Neural Network. Vol. 14, 2003, 506-519.
[9] Y. Kuroe Y. Taniguchi. Models of orthogonal type complex-valued dynamic associative memories and their performance comparison[C], Artificial Neural Networks -ICANN 2007, Lecture Notes in Computer Science. Vol. 4668, 2007, 838-847.
[10] R.C. Wilson, E.R. Hancock. Optimising pattern recovery in recurrent correlation associative memories[C], in Proc. 15th International Conference on Pattern Recognition. 2000, 1005-1009.
[11] F. Chung, T. Lee. Towards a high capacity fuzzy associative memory model[C], in Proc. IEEE International Conference on Neural Networks. June 27-July 2, 1994, 1595-1599.
[12] P. Sussner, M.E. Valle. Implicative fuzzy associative memories[J], IEEE Transactions on Fuzzy System. Vol.14, Issue 6, June 2006, 793-807.
[13] F. Chung, T. Lee. On fuzzy associative memory with multiple-rule storage capacity[J], IEEE Transactions on Fuzzy Systems. Vol.4, Issue 3, August 1996, 375-384.
[14] J.L. Davidson, F. Hummer. Morphology neural networks: an introduction with applications, Circuits[J], Systems and Signal Processing. December 1993, 177-210.
[15] B.H. Wang, G. Vachtsevanos. Fuzzy associative memories: identification and control of complex systems[C], in Proc. 5th IEEE International Symposium on Intelligent Control. September 5-7 1990, 910-915.
[16] J.B. Fan, F. Jin, Y. Shi. A learning rule for fuzzy associative memories[C], in Proc. of the IEEE International Joint Conference on Neural Networks. 1994, 4273-4277.
[17] P. Liu. The fuzzy associative memory of max-min fuzzy neural networks with threshold[J], Fuzzy Set Systems. Vol. 107, Issue 2, October 16 1999, 147-157.
[18] E. Esmi, P. Sussner, H. Bustince. θ-fuzzy associative memories (θ-FAMs)[J], IEEE Transactions on Fuzzy Systems. Vol. 23, March 27 2015, 313-326.
[19] Z.C. Wang, J. Zhang. Detecting pedestrian abnormal behavior based on fuzzy associative memory[C], in Proc. 4th International Conference on Natural Computation. October 18-20 2008, 143-147.
[20] W. Pedrycz. Fuzzy neural networks and neurocomputations[J], Fuzzy Sets Systems. Vol. 56, 1993, 1-28.
[21] A. Sudo, A. Sato, O. Hasegawa. Associative memory of online learning in noisy environments using self-organizing incremental neural network[J], IEEE Transactions Neural Network. Vol. 20, 2009, 964-972.
[22] B. Milidge, T. Salvatori, Y. Song. Universal Hopfield Networks: A General Framework for Single-Shot Associative Memory Models[J], Neural and Evolutionary Computing. February 2022, 557-575.
[23] D. Maximov, V. I. Goncharenko, Y. S. Legovich. Multi-valued neural networks I: a multi-valued associative memory[J], Neural Computing and Applications. Vol. 33, April 2021, 10189-10198.
[24] V. M. Ladwani, V. Ramasubramanian. M-ary Hopfield Neural Network Based Associative Memory Formulation: Limit-Cycle Based Sequence Storage and Retrieval[M], Lecture Notes in Computer Science. September 2021.
[25] 冯乃勤, 刘春红, 张聪品等. 形态学联想记忆框架研究[J], 计算机学报. 2010, 157-166.
[26] M. S. Mahnoud, Y. Xia. LMI-based Exponential Stability Criterion for Bidirectional Associative Memory Neural Networks[J], Neurocomputing. Vol. 74, Issue 1-3, December 2010, 284-290.
[27] C. Aouiti, M. B. Rezeg. Impulsive multidirectional associative memory neural networks: New results[J], International Journal of Biomathematics.Vol. 14, Issue 7, 2021, 601-610.
[28] C. Aouiti, S. Rathinasamy, F. Touati. Global dissipativity of fuzzy bidirectional associative memory neural networks with proportional delays[J], Iranian Journal of Fuzzy Systems. Vol. 18, Issue 2, January 2021, 65-80.
[29] K. S. Chiu, T. Li. New stability results for bidirectional associative memory neural networks model involving generalized piecewise constant delay[J], Mathematics and Computers in Simulation. Vol. 194, April 2022, 719-743.
[30] 曾水玲, 徐蔚鸿, 杨静宇. 模糊形态学双向联想记忆网络的性质[J], 模式识别与人工智能. 2012, 54-62.
[31] 曾水玲. 一种模糊双向联想记忆网络的性质研究[J], 控制工程. 2016, 1774-1778.
[32] R. Huang, P.K. Mungai, J. Ma, K.K. Wang. Associative memory and recall model with KID model for human activity recognition[J], Future Generation Computer Systems. Vol. 92, March 2019, 312-323.
[33] C. Guan, Y. Cheng, H. Zhao. Semantic Role Labeling with Associated Memory Network[J], Computation and Language. August 2019, 3361-3371.
[34] D. Krotov, J. J. Hopfield. Dense Associative Memory for Pattern Recognition[M], Advances in Unconventional Computing. 2016, 1172-1180.
[35] G. Herzmann, G. Minor, M. Adkins. Neural correlates of memory encoding and recognition for own-race and other-race faces in an associative -memory task[J], Brain Research. Vol. 1655, January 2017, 194-203.
[36] T. Yamaguchi, K. Goto, T. Takagi. Intelligent control of a flying vehicle using fuzzy associative memory system[C], IEEE International Conference on Fuzzy Systems, April 1992,1139-1149.
[37] D. Tamojay, K.G. Anjan, M. Anjan. Singular value Decomposition applied to Associative Memory of Hopfield Neural Network[M], materials today: PROCEEDINGS. Vol. 5, Issue 1, Part 2, 2018, 2222-2228.
[38] A. Wu, Z. Zeng, X. Song. Global Mittag-Leffler stabilization of fractional-order bidirectional associative memory neural networks[J], Neurocomputing. Vol. 177, February 2016, 489-496.
[39] Y. Li, J. Li, J. Li, S. Duan, L. Wang, M. Guo. A reconfigurable bidirectional associative memory network with memristor bridge[J], Neurocomputing. Vol. 454, September 2021, 382-391.
[40] M. Xia, J. Fang, Y. Tang. Dynamic depression control of chaotic neural networks for associative memory[J], Neurocomputing. Vol. 73, January 2010, 776-783.
[41] D. Reay, T.C. Green, B.W. Williams. Application of associative memory neural networks to the control of a switched reluctance motor[C], Conference of the IEEE Industrial Electronics Society, Vol. 1, December 1993, 200-206.
[42] J. Liu, M. Gong, H. He. deep associative neural network for associative memory based on unsupervised representation learning[J], Neural Networks. Vol. 113, May 2019, 41-53.
[43] G. Yang, F. Ding. Associative memory optimized method on deep neural networks for image classification[J], Information Sciences. Vol. 533, September 2020, 108-119.
[44] A.V. Roberto, H. Sossa. Behavior of morphological associative memories with true-color image pattern[J], Neurocomputing. Vol. 73, Issues 1-3, December 2009, 225-244.
[45] 吴锡生, 王宇辉, 王士同. 模糊形态联想记忆网络抗随机噪声研究及应用[J], 计算机工程与应用. 2007, 66-68.
[46] D. Ventura, T. Martinez. Quantum associative memory with exponential capacity[C], 1998 IEEE International Joint Conference on Neural Networks Proceedings. Vol. 1, 1998, 509-513.
[47] M. Andrecut, M. K. Ali. Quantum Associative Memory[J], International Journal of Modern Physics B. Vol. 17, Issue 12, 2003, 2447-2472.
[48] M. Zak. Quantum-inspired resonance for associative memory[J], Chaos, Solitons & Fractals. Vol. 41, Issue 5, September 2009, 2306-2312.
[49] FMDP. Neto, AJD. Silva, WRD. Oliveira. Non-unitary Quantum Associative Memory[C], 2017 Brazilian Conference on Intelligent Systems. October 2017, 97-102.
[50] A. K. Behera, M. Rao, S. Sastry. Enhancing associative memory recall in non-equilibrium materials through activity[J], Disordered Systems and Neural Networks. March 2022, 24-43.
[51] H. Bao, R. Zhang, Y. Mao. The capacity of the dense associative memory networks[J], Neurocomputing. Vol. 469, January 2022, 198-208.
[52] I. V. Stepanyan. Neural Network Modeling and Organization of a Hierarchical Associative Memory System[J], Journal of Machinery Manufacture and Reliabiliry. Vol. 50, January 2022, 735-742.
[53] M. Tsuji, T. Isokawa, M. Kobayashi. Gradient Descent Learning for Hyperbolic Hopfield Associative Memory[J], Transactions of the Institute of Systems Control and Information Engineers. Vol. 34, Issue 1, January 2021, 11-22.
[54] F.M. Paula Neto, A.J. Silva, W.R. Oliveira, T.B. Ludermir. Quantum probabilistic associative memory architecture[J], Neurocomputing. Vol. 351, July 2019, 101-110.
[55] I. Mohsen, R. Abbas, K. Deqian, R. Tajana, M.R. Jan. Exploring Hyperdimensional Associative Memory[C], IEEE International Symposium on High Performance. May 8 2017, 85-90.
[56] P. Sussner, T. Schuster. Interval-valued fuzzy morphological associative memories: Some theoretical aspects and applications[J], Information Sciences. Vol. 438, April 2018, 127-144.
[57] R.R. Rogelio, A.P. Mario, Y.M. Cornelio. Pattern classification using smallest normalized difference associative memory[J], Pattern Recognition Letters. Vol. 93, July 2017, 104-112.
[58] 何虎, 王麒淋, 董丽亚. 一种提高联想记忆脉冲神经网络准确度的学习方法[P]. 中国, CN111260054A, 2020.
[59] 曾水玲. 基于三角模的模糊联想记忆网络[J], 计算机研究与发展. 2013, 998-1004.
[60] A. Pedrycz. Bidirectional and multidirectional associative memories as models in linkage analysis in data analytics: Conceptual and algorithmic developments[J], Knowledge-Based Systems. Vol. 142, February 2018, 160-169.
[61] C. Zhou, X. Zeng, H. Jiang. A generalized bipolar auto-associative memory model based on discrete recurrent neural networks[J], Neurocomputing. Vol. 162, August 2015, 201-208.
[62] 王敏,王士同,吴小俊. 新模糊形态学联想记忆网络的初步研究[J], 电子学报. 2003, 690-693.
[63] M. Guo, Y. Zhu, R. Liu. An associative memory circuit based on physical memristors[J], Neurocomputing. Vol. 472, February 2022, 12-23.
[64] R. Morales, N. Hernandez, R. Cruz. Entropic Associative Memory for Manuscript Symbols[J], Machine Learning. February 2022, 413-437.
[65] 王剑,万冬梅,毛宗源. 新型联想记忆神经网络的硬件实现研究[J], 计算机工程与应用. 2003, 25-29.
[66] Y. Kultur, B. Turhan, A. Bener. Ensemble of neural network with associative memory (ENNA) for estimating software development costs[J], Knowledge-Based System. Vol. 22, Issue 6, August 2009, 395-402.
[67] T. Chen, L. Wang, S. Duan. Implementation of circuit for reconfigurable memristive chaotic neural network and its application in associative memory[J], Neurocomputing. Vol. 380, March 2020, 36-42.
[68] K.O.T. Andre, F.R.A. Aluizio. Control strategies for Hopf bifurcation in a chaotic associative memory[J], Neurocomputing. Vol. 323, January 2019, 157-174.
[69] R. S. Sembiring, S. Efendi, S. Suwilo. Improving the accuracy of old and young face detection in the template matching method with Fuzzy associative Memory(FAM)[C], IOP Conference Series Materials Science and Engineering. Vol. 725, Issue 1, January 2020, 12117-12123.
[70] A. A. Mofrad, S. A. Mofrad, A. Yazidi. On Neural Associative Memory Structures: Storage and Retrieval of Sequences in a Chain of Tournaments[J], Neural Computation. Vol. 33, Issue 9, June 2021, 1-30.
[71] G. Bao, S. Gong, X. Zhou. Associative memory Synthesis Based on Region Attractive Recurrent Neural Networks[J], Neural Processing Letters. April 2022, 10823-10831.
[72] T. Kohonen. Correlation matrix memories[J], IEEE Transactions Computing. Vol. 21, 1972, 353-359.
[73] B. Kosko. Fuzzy associative memories[C], in Proc. 2nd Joint Technology Workshop on Neural Networks and Fuzzy Logic. 1991, 3-58.
[74] G.X. Ritter, P. Sussner. Morphological associative memories[J], IEEE Transactions Neural Network. Vol. 9, 1998, 281-293.
[75] A. Pedrycz, F. Dong, K. Hirota. Representation of neural networks through their multi-linearization[J], Neurocomputing. Vol. 74, Issue 17, October 2011, 2852-2860.
[76] C.Y. Liou, W.C. Cheng, J.W. Liou, D.R. Liou. Autoencoder for words[J], Neurocomputing. Vol. 139, September 2014, 84-96.
[77] J.C. Bezdek, R. Ehrlich, W. Full. FCM: The fuzzy c-means clustering algorithm[C]. Computer & Geoences. 1984, 191-203.
[78] N.R. Pal, J.C. Bezdek. On cluster validity for the fuzzy c-means model[J], IEEE Transactions on Fuzzy Systems. 2002, 370-379.
[79] R. Perfetti, E. Ricci. Recurrent correlation associative memories: a feature space perspective, IEEE Transactions Neural Network[J]. Vol. 19, 2008, 333-345.
[80] Y. Suzuki, N. Konno, J. Maeda. Associative memory system using fuzzy sets[C], in Proc. 14th International Conference on Pattern Recognition. August 16-20 1998, 331-333.
[81] G.X. Ritter, G. Urcid, L. Iancu. Reconstruction of patterns from noisy inputs using morphological associative memories[J], Journal of Mathematical Imaging & Vision. Vol. 19, 2003, 95-111.
[82] K. Ratnavelu, M. Manikandan, P. Balasubramaniam. Synchronization of fuzzy bidirectional associative memory neural networks with various time delays[J], Applied Mathematics & Computation. Vol. 270, November 2015, 582-605.
[83] T.D. Bui, T.H. Nong, T.K. Dang. Improving learning rule for fuzzy associative memory with combination of content and association[J], Neurocomputing. Vol. 149, 2015, 59-64.
[84] M.E. Valle, P. Sussner. Storage and recall capabilities of fuzzy morphological associative memories with adjunction-based learning[J], Neural Networks. Vol. 24, 2011, 75-90.
[85] M.E. Valle. Fundamentals and applications of fuzzy morphological associative memories[D], Ph. D thesis, State University of Campinas, 2007.
[86] Y.Y. Yao. A partition model of granular computing[J], Transactions Rough Set. Vol. 3100, 2004, 232-253.
[87] P. Sussner, M.E. Valle. Grayscale morphological associative memories[J], IEEE Transactions Neural Network. Vol. 17, 2006, 559-570.
[88] T. Saeki, T. Miki. Effectiveness of the block splitting approach on morphological associative memory without a kernel image[C], in International Conference on Fuzzy Systems. July 16-21 2006, 1175-1178.
[89] M.E. Valle. A class of sparsely connected auto associative morphological memories for large color images[J], IEEE Transactions Neural Network. Vol. 20, 2009, 1045-1050.
[90] B. Kosko. Bidirectional associative memory[J], IEEE Transactions on Systems Man & Cybernetics. Vol. 18, 1988, 49-60.
[91] G.X. Ritter, J.L. Diaz-de-Leon, P. Sussner. Morphological bidirectional associative memories[J], Neural Networks. Vol. 12, Issue 6, July 1999, 851-867.
[92] S. Chartier, M. Boukadoum. Encoding static and temporal patterns with a bidirectional hetero associative memory[J], Journal of Applied Mathematics. 2011, 1-34.
[93] S. Chartier, G. Giguère, P. Renaud, J.M. Lina, R. Proulx. FEBAM: A feature-extracting bidirectional associative memory[C], in Proc. the International Joint Conference on Neural Networks. August 12-17 2007, 1679-1684.
[94] M.P. Singh, V.K. Sarawat. Multilayer feed forward neural networks for non-linear continuous bidirectional associative memory[J], Applied Soft Computing. Vol. 61, December 2017, 700-713.
[95] C. Sha, H. Zhao. Design and analysis of associative memories based on external inputs of continuous bidirectional associative networks[J], Neurocomputing. Vol. 266, November 2017, 433-444.
[96] G. Quiroz, L. Ice, A. Delgado, T.S. Humble. Particle track classification using quantum associative memory[J], Nuclear Instruments nad Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment. Vol. 1010, September 2021, 988-1023.
[97] R.A. Ayoubi, H. Ziade, M.A. Bayoumi. Hopfield associative memory on mesh[C], International Symposium on Circuits & Systems. Vol. 5, May 23-26 2004, 1-10.
[98] D. Krotov, Hierarchical Associative Memory[J], Neural and Evolutionary Computing. July 2021, 723-727.
[99] H. Bao, R. Zhang, Y. Mao. The capacity of the dense associative memory networks[J], Neurocomputing. Vol. 469, January 2022, 198-208.
[100] R.J. Hammell, T. Sudkamp. A Two-level Architecture for Fuzzy Learning[J], Journal of Intelligent & Fuzzy Systems. January 1995, 273-286.
[101] B. Cruz, H. Sossa, R. Barron. A New Two-Level Associative Memory for Efficient Pattern Restoration[J], Neural Processing Letters. Vol. 25, February 2007, 1-16.
[102] N. Ikeda, P. Watta, M. Artiklar, MH. Hassoun. A two-level hamming network for high performance associative memory[J], Neural Networks. Vol. 14, Issue 9, November 2011, 1189-1200.
[103] L. Benjamin, H. Damien, P. Isabelle. WCET analysis of multi-level set associative data caches[C], 9th Intl. Workshop on Worst-Case Execution Time WCET Analysis. Vol. 10, June 30 2009, 118-127.
[104] R.M. Gomes, A.P. Braga, H.E. Borges. Storage capacity of hierarchically coupled associative memories[M], Brazilian Symposium on Neural Networks. 2006, 196-201.
[105] R.K. Arimilli, L.J. Clark, J.S. Dodson, G.L. Guthrie, W.J. Starke. Method and system for managing speculative requests in a multi-level memory hierarchy[M], July 2002.
[106] I-C. Kuo, Z. Zhang. The capacity of associative memory by using a combination rule[C], Proc. of IEEE International Conference on Systems Man and Cybernetics. August 6 2002, 2478-2479.
[107] T. Kubota. A higher Order Associative Memory with McCulloch-Pitts neurons and Plastic Synapses[C], International Joint Conference on Neural Networks. August 12-17 2007, 1982-1989.
[108] T. Kubota. Second order associative memory models with threshold logics-eigen mode selections[C], IEEE International Conference on Systems. October 7-10 2007, 1884-1889.
[109] I. Schlag, J. Schmidhuber. Learning to Reason with Third-Order Tensor Products[C], Proc. of the 32nd International Conference on Neural Information Processing Systems. December 2018, 1003-1014.
[110] V. Tresp, Y. Ma. The tensor Memory Hypothesis[J/OL], arXiv e-prints. August 2017, 1708-1716.
[111] J. Kennedy, R. Eberhart. Particle Swarm Optimization[C], in Proc. IEEE International Conference on Neural Networks. 1995, 1942-1948.
[112] S. Mirjalili. Particle Swarm Optimization[M], Studies in Computational Intelligence. January 2019, 15-31.
[113] I.C. Trelea. The particle swarm optimization algorithm: convergence analysis and parameter selection[J], Information Processing Letters. Vol. 85, 2003, 317-325.
[114] M. Meissner, M. Schmuker, G. Schneider. Optimized particle swarm optimization (OPSO) and its application to artificial neural network training[J], BMC Bioinformatic. Vol. 7, Issue 1, February 2006, 7-125.
[115] M. Isiet, M. Gadala. Self-adapting control parameters in particle swarm optimization[J], Applied Soft Computing. Vol. 83, October 2019, 105653-105670.
[116] X. Cai, H. Qiu, L. Gao. An efficient surrogate-assisted particle swarm optimization algorithm for high-dimensional expensive problems[J], Knowledge-Based Systems. available online July 30 2019, 104901-104922.
[117] Y.J. Kenned, R.C. Eberhart. Swarm Intelligence[M], San Francisco: Morgan Kaufmann Publisher. 2001.
[118] 李兵, 蒋尉孙. 混沌优化方法及其应用[J], 控制理论与应用. 1997, 613-615.
[119] Y.H. Shi, R.C. Eberhart. Empirical study of particle swarm optimizer[C], Proc. of Congress on Evolutionary Computation. 1999, 1945-1950.
[120] 吕振肃, 侯志容. 自适应变异的粒子群优化算法[J], 电子学报. 2004, 416-420.
[121] 张劲松, 李歧强, 王朝霞. 基于混沌搜索的混合粒子群优化算法[J], 山东大学学报. 2007,47-50.
[122] 孟红记, 郑鹏, 梅国晖. 基于混沌序列的粒子群优化算法[J], 控制与决策. 2006, 263-266.
[123] W.M. Alhasan, S. Ibrahim, H.A. Hefny, S.I. Shaheen. LDWMeanPSO: A new improved particle swarm optimization technique[C], Computer Engineering Conference. 2011, 37-43.
[124] Z. Zhan, J. Zhang, Y. Li, H. Chung. Adaptive Particle Swarm Optimization[J], IEEE transactions on Systems, Man, and Cybernetics. Vol. 39, Issue 6, 2009, 1362-1381.
[125] R. Storn, K. Price. Differential evolution- a simple and efficient heuristic for global optimization over continuous spaces, Journal of Global Optimization, Journal of Global Optimization[J]. Vol. 11, Issue 4, December 1997, 341-359.
[126] R. Storn, K. Price. Differential Evolution: Numerical Optimization Made Easy[J], Dr. Dobb’s Journal. April 1997, 18-24.
[127] K. Price, R.M. Storn, J.A. Lampinen. Differential Evolution: A Practical Approach to Global Optimization[M], Springer Berlin Heidelberg New York. January 2005.
[128] V. Feoktistov. Differential Evolution: In Search of Solutions[J], New York Ny American Society of Civil Engineers, Vol. 4, January 2006, 13-20.
[129] S. Das, P.N. Suganthan. Differential Evolution: A Survey of the State-of-the-art[J], IEEE Transactions on Evolutionary Computation. Vol. 15, February 2011, 4-31.
[130] Z. Meng, J.S. Pan, L. Kong. Parameters with Adaptive Learning Mechanism (PALM) for the enhancement of Differential Evolution[J], Knowledge-Based Systems. Vol. 141, February 2018, 92-112.
[131] E.N. Goncalves, M.AR. Belo, A.P. Batista. Self-adaptive multi-objective differential evolution algorithm with first front elitism for optimizing network usage in networked control systems[J], Applied Soft Computing. Vol. 114, January 2022, 108112-108128.
[132] U.B. Vyas, V.A. Shah. Differential evolution based regression algorithm for mathematical representation of electrical parameters in lithium-ion battery model[J], Journal of Energy Storage. Vol. 45, January 2022, 103673.
[133] J. Barzilai, J.M. Borwein. Two-point step size gradient method[J], IMA journal of numerical analysis. Vol. 8, Issue 1, January 1998, 141-148.
[134] Y. Yuan, Step-sizes for the gradient method, MAS/IP Studies in Advanced Mathematics. Providence[M], RI: American Mathematical Society. Vol. 42, Issue 3, 2008, 785-796.
[135] Y. Lecun, L. Bottou, Y. Bengio, P. Haffner. Gradient-based learning applied to document recognition[J], Proc. of the IEEE. Vol. 86, Issue 11, December 1998, 2278-2324.
[136] T. Le, B. Vo, H. Fujita, N.T. Nguyen, S.W. Baik. A fast and accurate approach for bankruptcy forecasting using squared logistics loss with GPU-based extreme gradient boosting[J], Information Sciences. Vol. 494, August 2019, 294-310.
[137] C. Yin, S. Dadras, X. Huang, Y. Cheng, H. Malek. The design and performance analysis of multivariate fractional-order gradient-based extremum seeking approach[J], Applied Mathematical Modelling. Vol. 62, October 2018, 680-700.
[138] A.A. Gorodetsky, J.D. Jakeman. Gradient-based optimization for regression in the functional tensor-train format[J], Journal of Computational Physics. Vol. 374, December 2018, 1219-1238.
[139] L. Tang, Y. Zhu, Q. Fu. Designing PAR-constrained periodic/aperiodic sequence via the gradient-based method[J], Signal Processing. Vol. 147, June 2018, 11-22.
[140] X. Sheng. A relaxed gradient based algorithm for solving generalized coupled Sylvester matrix equations[J], Journal of the Franklin Institute. Vol. 355, Issue 10, July 2018, 4282-4297.
[141] A. Bargiela, W. Pedrycz. Granular Computing: An Introduction[M], Kluwer Academic Publishers. Boston, MA, 2002.
[142] W. Pedrycz. Knowledge-Based Clustering[M]. 2005. 11-22.
[143] P. Sussner, C.R. Medeiros. An introduction to morphological associative memories in complete lattices and inf-semilattices[C], in IEEE International Conference on Fuzzy Systems. June 10-15, 2012, 1-8.
[144] C. Zhong, W. Pedrycz, Z. Li. Fuzzy associative memories: A design thorough fuzzy clustering[J], Neurocomputing. Vol. 173, Part 3, January 2015, 1154-1162.
[145] C.Y. Liou, W.C. Cheng, J.W. Liou, D.R. Liou. Autoencoder for words[J], Neurocomputing. Vol. 139, September 2014, 84-96.
[146] D.H. Ackley, G.E. Hinton, T.J. Sejnowski. A learning algorithm for Boltzmann machines[J], Cognitive science. Vol. 9, Issue 1, 1985, 147-169.
[147] G.E. Hinton. Connectionist learning procedures[J], Machine Learning. 1990, 555-610.
[148] A. Makhzani, B. Frey. k-Sparse Autoencoders[C], International Conference on Learning Representations. March 22 2014, 1312-1319.
[149] D. Chicco, P. Sadowski, P. Baldi. Deep autoencoder neural networks for gene ontology annotation predictions[C], Acm Conference on Bioinformantics. September 2014, 533-540.
[150] R. AH, W. Pedrycz, A. Balamash. Logic-driven autoencoders[J], Knowledge-Based Systems. July 2019, 104874-104887.
[151] J. Zhao, X. G, J. Zhou, Q. Sun, Y. Xiao, Z. Zhang, Z. Fu. Attribute mapping and autoencoder neural network based matrix factorization initialization for recommendation systems[J], Knowledge-Based Systems. Vol. 166, February 2019, 132-139.
[152] Y. Ye, L. Chen, S. Hou. DeepAM: a heterogeneous deep learning framework for intelligent malware detection[J], Knowledge and Information Systems. Vol. 54, Issue 4, April 2017, 1-21.
[153] A. Sperduti. Labelling Recursive Auto-associative Memory[J], Connection Science. Vol. 6, Issue 4, January 1994, 429-459.
[154] B. Larras, B. Boguslawski, C. Lahuec. Analog encoded neural network for power management in MPSoC[J], Analog Integrated Circuits & Signal Processing. Vol. 81, Issue 3, December 2014, 595-605.
[155] A.A. Frolov, D. Husek, I.P. Muraviev. Associative memory based on sparsely encoded Hopfield-like neural network[C], International Symposium on Neuroinformatics and Neurocomputers. 1995, 70-76.
[156] M. Crespo-Garcia, J.L. Cantero, A. Pomyalov, S. Boccaletti, M. Atienza. Functional neural networks underlying semantic encoding of associative memories[J], Neuroimage. Vol. 50, Issue 3, April 2010, 1258-1270.
[157] J. Mcelroy, P. Gader. Generalized Encoding and Decoding Operators for Lattice-Based Associative Memories[J], IEEE Transactions on Neural Networks. Vol. 20, Issue 3, November 2009, 1647-1678.
[158] L. Doorenbos, S. Cavuoti, M. Brescia, A. D'Isanto, G. Longo. Comparison of Outlier Detection Methods on Astronomical Image Data[J], Intelligent Astrophysics. Vol. 39, April 2021, 1-29.
[159] K. Ahrabian, B. Babaali. On Usage of Autoencoders and Siamese Networks for Online Handwritten Signature Verification[J], Neural and Evolutionary Computing. Vol.31, December 2019, 9321-9334.
[160] N. R. Dzakiyullah, A. Pramuntadi, A. K. Fauziyyah. Semi-Supervised Classification on Credit Card Fraud Detection using AutoEncoders[J], Bright Publisher. Vol. 2, January 2021, 1-7.
[161] M. Saha, A. Santara, P. Mitra, A. Chakraborty, R. S. Nanjundiah, R.J. Hyndman. Prediction of the Indian summer monsoon using a stacked autoencoder and ensemble regression model[J], International Journal of Forecasting. Vol. 37, 2021, 58-71.
[162] C. Duan, J. Sun, K. Li, Q. Li. A Dual-Attention Autoencoder Network for Efficient Recommendation System[J], Electronics. Vol. 10, June 2021, 1581-1590.
[163] Y. Song, S. Hyun, Y. G. Cheong. A Systematic Approach to Building Autoencoders for Intrusion Detection[C], Silicon Valley Cybersecurity Conference. April 2021, 188-204.
[164] Y. Saito, T. Nakamura, Y. Ijima, K. Nishida, S. Takamichi. Non-parallel and many-to-many voice conversion using variational autoencoders integrating speech recognition and speaker verification[J], Acoustical Science and Technology. Vol. 42, January 2021, 1-11.
[165] D. Sculley. Web-scale k-means clustering[C], Proceedings of the 19th International Conference on World Wide Web. April 2010, 1177-1178.
[166] J. Zhou, W. Pedrycz, X. Yue. Projected fuzzy C-means clustering with locality preservation[J], Pattern Recognition. Available online November 2, 2020, 107748-107761.
[167] S. Askari. Fuzzy C-Means clustering algorithm for data with unequal cluster sizes and contaminated with noise and outliers: Review and development[J], Expert Systems with Applications. Vol. 165, March 1 2021, 113856-113866.
[168] H. Verma, A. Gupta, D. Kumar. A modified intuitionistic fuzzy c-means algorithm incorporating hesitation degree[J], Pattern Recognition Letters. Vol. 122, May 1 2019, 45-52.
[169] L. Mantilla, Y. Yari. FCM Algorithm: Analysis of the Membership Function Influence and Its Consequences for Fuzzy Clustering[M], Applications of Computational Intelligence. February 2021, 119-132.
[170] N. Feng, S. Li, L. Ao, et al. Research progress of morphological associative memory networks[J], Computer Engineering and Design. 2010, 4665-4673.
[171] S. Vaidyanathan, K. Hachemi, B. Beddad. MRI images segmentation using improved spatial FCM clustering and pillar algorithms[J], International Journal of Biomedical Engineering and Technology. Vol. 36, January 2021, 220-229.
[172] B. Zhu, S. Liang, S. Qu. An improved clustering algorithm based on FOA and autoencoder[J], Journal of Henan University. 2020, 70-79.
[173] X. Hu, S. Yan. An image segmentation algorithm based on FCM clustering[J], Computer Engineering and Design. 2018, 159-164.
[174] Y. Li, S. Li, S. Peng, S. Zhao. Extraction of plateau lake water bodies based on an improved FCM algorithm[J], Journal of Intelligent and Fuzzy Systems. Vol. 41, June 2021, 1-14.
[175] Y. Li, X. Zuo, F. Yang, et al. Remote sensing image segmentation algorithms based on FCM clustering and its improvements[J], Journal of Zhejiang Agricultural Sciences. 2017, 518-520.
[176] M.H. Zhang. Short-term Traffic Flow Prediction of Non-detector Intersections Based on FCM[J], Computer Technology and Development. April 2017, 39-45.
[177] L. Zhang. Research on fuzzy clustering methods based on information granules[D]. Dalian: Dalian University of Technology, 2018.
[178] J. Dong, H. Huang. Information organization method, apparatus, and device based on information granule space[P]. China, CN202011599193.9, 2021.
[179] X. Zhu. Granular data description for system modeling and data mining[D]. Xi'an: School of Mechano-Electronic Engineering, Xidian University, 2018.
[180] X. Gan, M. Li, Y. Wang. An uncertainty measurement method for interval-valued information systems based on information granulation[J], Computer Applications and Software. 2021, 107-114.
CLC Number: N37

Call Number: 56010

Open Date: 2023-03-25
