Convolutional Neural Networks News | Vimarsana

"Text classification based on machine learning for Tibetan social netwo" by Hui Lv, Fenfang Li et al.

Social network technologies have gained widespread attention in many fields. However, research on the Tibetan Social Network (TSN) has been limited to sentiment analysis of micro-blogs, and few researchers have focused on text classification and data mining in TSN. This narrow scope cannot meet the needs of the majority of Tibetan users or surface the text information they actually care about. In this paper, we investigate and compare different models for the classification of Tibetan text. Machine learning models including Naive Bayes (NB), Random Forest (RF), Support Vector Machine (SVM), fastText and text Convolutional Neural Networks (CNN) are used as classifiers to determine the best approach for the Tibetan Social Network. In addition, term frequency-inverse document frequency (TF-IDF) is used to extract hot words and generate a word cloud. The results show that Random Forest significantly outperforms the other machine learning algorithms on Tibetan text classification. ....

Tibetan Social Network, Convolutional Neural Networks, Naive Bayes, Random Forest, Support Vector Machine, Data Mining, Social Network, Text Classification
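
The classifier comparison described in the abstract can be illustrated with a minimal, hypothetical scikit-learn sketch: TF-IDF features feeding Naive Bayes, Random Forest and a linear SVM, followed by TF-IDF-based hot-word extraction. The toy `texts`/`labels` data, the default vectorizer settings, and the omission of fastText, the text CNN and Tibetan-specific word segmentation are all assumptions for illustration, not the paper's actual setup.

```python
# Hypothetical sketch of a TF-IDF classifier comparison; not the paper's code.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Toy placeholder corpus; the paper uses Tibetan social-network posts
# with its own word segmentation.
texts = ["sample document one", "sample document two",
         "another sample text", "one more short text"] * 10
labels = [0, 1, 0, 1] * 10

classifiers = {
    "Naive Bayes": MultinomialNB(),
    "Random Forest": RandomForestClassifier(n_estimators=200),
    "Linear SVM": LinearSVC(),
}

for name, clf in classifiers.items():
    # The same TF-IDF vectorizer feeds every classifier so the comparison
    # differs only in the model, not in the features.
    pipeline = make_pipeline(TfidfVectorizer(), clf)
    scores = cross_val_score(pipeline, texts, labels, cv=5, scoring="f1_macro")
    print(f"{name}: macro-F1 = {scores.mean():.3f}")

# TF-IDF "hot word" extraction; the paper additionally renders these
# terms as a word cloud.
vec = TfidfVectorizer()
tfidf = vec.fit_transform(texts)
mean_scores = tfidf.mean(axis=0).A1
top = sorted(zip(vec.get_feature_names_out(), mean_scores),
             key=lambda pair: -pair[1])[:10]
print("hot words:", [word for word, _ in top])
```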

"Defensive Few-shot Learning" by Wenbin Li, Lei Wang et al.

This paper investigates a new and challenging problem, defensive few-shot learning, which aims to learn a robust few-shot model against adversarial attacks. Simply applying existing adversarial defense methods to few-shot learning cannot effectively solve this problem, because the commonly assumed sample-level distribution consistency between the training and test sets no longer holds in the few-shot setting. To address this, we develop a general defensive few-shot learning (DFSL) framework to answer two key questions: (1) how to transfer adversarial defense knowledge from one sample distribution to another, and (2) how to narrow the distribution gap between clean and adversarial examples under the few-shot setting. To answer the first question, we propose an episode-based adversarial training mechanism that assumes task-level distribution consistency to better transfer the adversarial defense knowledge. As for the second question, within each few-s ....

Adversarial Attacks, Convolutional Neural Networks, Defensive Few Shot Learning, Distribution Consistency, Episodic Training, Graphics Processing Units, Image Classification, Learning Systems, Task Analysis
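
As a rough illustration of episode-based adversarial training (not the paper's actual DFSL implementation), the PyTorch-style sketch below assumes a prototypical-network episode, one-step FGSM perturbations on the query set, and images scaled to [0, 1]; the framework's distribution-alignment components are not reproduced.

```python
# Hypothetical sketch of episode-based adversarial training for few-shot learning.
import torch
import torch.nn.functional as F

def prototypes(embed, support_x, support_y, n_way):
    """Class prototypes = mean embedding of each class's support samples."""
    z = embed(support_x)
    return torch.stack([z[support_y == c].mean(0) for c in range(n_way)])

def episode_loss(embed, protos, query_x, query_y):
    """Cross-entropy over negative squared distances to the prototypes."""
    zq = embed(query_x)
    logits = -torch.cdist(zq, protos) ** 2
    return F.cross_entropy(logits, query_y)

def adversarial_queries(embed, protos, query_x, query_y, eps=8 / 255):
    """One-step FGSM perturbation of the query images w.r.t. the episode loss."""
    x = query_x.clone().detach().requires_grad_(True)
    loss = episode_loss(embed, protos, x, query_y)
    grad = torch.autograd.grad(loss, x)[0]
    # Assumes inputs are images normalized to [0, 1].
    return (x + eps * grad.sign()).clamp(0, 1).detach()

def train_episode(embed, optimizer, support_x, support_y, query_x, query_y, n_way):
    protos = prototypes(embed, support_x, support_y, n_way)
    adv_x = adversarial_queries(embed, protos.detach(), query_x, query_y)
    # Train on clean and adversarial queries jointly: every episode sees both,
    # so the defense is learned at the task level rather than being tied to
    # one fixed sample distribution.
    loss = (episode_loss(embed, protos, query_x, query_y)
            + episode_loss(embed, protos, adv_x, query_y))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```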