References for "Can IP Laws keep up with AI"
1. N. N. Aizenberg and I. N. Aizenberg, "The Universal Neural-like Logical Elements in Pattern Recognition and Digital Image Processing", Proc. of the 1st Int. Conf. on Information Technologies for Image Analysis and Pattern Recognition, Lvov, 1990, vol. 2, pp. 141-145; N. N. Aizenberg and I. N. Aizenberg, "Model of the Neural Network Basic Elements (Cells) with Universal Functionality and Various Hardware Implementations", Proc. of the 2nd International Conference "Microelectronics for Neural Networks", Kyrill & Methody Verlag, Munich, 1991, pp. 77-83; N. N. Aizenberg and I. N. Aizenberg, "CNN Based on Multi-Valued Neuron as a Model of Associative Memory for Gray-Scale Images", Proceedings of the Second IEEE International Workshop on Cellular Neural Networks and their Applications, Technical University Munich, Germany, October 14-16, 1992, IEEE 92TH0498-6, ISBN 0-7803-875-1, pp. 36-41.
2. Progress in this area is commonly classified into four categories: parameter pruning and sharing, low-rank factorization, transferred/compact convolutional filters, and knowledge distillation (a minimal pruning sketch follows this list). See Y. Cheng, D. Wang, P. Zhou, and T. Zhang, "A Survey of Model Compression and Acceleration for Deep Neural Networks", arXiv:1710.09282v7 (2019).
3. See C. Buciluǎ, R. Caruana, and A. Niculescu-Mizil, "Model Compression", in Proceedings of the 12th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '06), New York, NY, USA, 2006, ACM, pp. 535-541; G. Hinton, O. Vinyals, and J. Dean, "Distilling the Knowledge in a Neural Network", arXiv:1503.02531 (2015).
4. Distillation makes it possible to run a system that approximates a sophisticated model's behavior within the limited computational resources of a mobile device (a minimal distillation sketch follows the list).
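
To make the first compression category in note 2 concrete, the following is a minimal sketch of magnitude-based parameter pruning, assuming nothing beyond NumPy; the weight matrix and the 50% pruning ratio are hypothetical placeholders, not values taken from the cited survey.

import numpy as np

# Hypothetical layer weights; in practice these come from a trained network.
rng = np.random.default_rng(0)
weights = rng.normal(size=(4, 4))

# Magnitude-based pruning: zero out the 50% of weights smallest in absolute value.
threshold = np.quantile(np.abs(weights), 0.5)
mask = np.abs(weights) >= threshold
pruned = weights * mask

print(f"kept {mask.sum()} of {mask.size} weights")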
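
Similarly, a minimal sketch of the distillation idea described by Hinton et al. (note 3): the student is trained against the teacher's output distribution softened by a temperature T. The logits and the temperature value below are hypothetical placeholders, not taken from the cited paper.

import numpy as np

def softmax(logits, T=1.0):
    # Temperature T > 1 softens the distribution, exposing the teacher's
    # relative confidence across the non-predicted classes.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

teacher_logits = np.array([4.0, 1.0, 0.2])  # large, sophisticated model
student_logits = np.array([2.5, 0.8, 0.4])  # small, mobile-sized model
T = 2.0

p_teacher = softmax(teacher_logits, T)
p_student = softmax(student_logits, T)

# Cross-entropy of the student's softened outputs against the teacher's:
# minimizing this loss transfers the teacher's behavior to the student.
distill_loss = -np.sum(p_teacher * np.log(p_student))
print(f"distillation loss at T={T}: {distill_loss:.4f}")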