AI Uncertainty Based on Rademacher Complexity and Shannon Entropy
Title: AI Uncertainty Based on Rademacher Complexity and Shannon Entropy
Description:
From a communication channel coding perspective, this paper presents both a theoretical and a practical discussion of AI uncertainty, capacity, and evolution for pattern classification, based on the classical Rademacher complexity and Shannon entropy.
First, AI capacity is defined by analogy with the capacity of a communication channel.
It is shown qualitatively that the classical Rademacher complexity and the Shannon rate in communication theory are closely related through their definitions.
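For concreteness, the empirical Rademacher complexity of a finite hypothesis class can be estimated by Monte Carlo averaging over random sign vectors (a minimal sketch of the standard definition; the function name and toy hypotheses below are illustrative, not taken from the paper):

```python
import numpy as np

def empirical_rademacher(predictions, n_trials=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity of a
    finite hypothesis class, given each hypothesis's +/-1 predictions on
    a fixed sample of size n (rows = hypotheses, columns = points)."""
    rng = np.random.default_rng(seed)
    n = predictions.shape[1]
    total = 0.0
    for _ in range(n_trials):
        sigma = rng.choice([-1.0, 1.0], size=n)   # Rademacher signs
        total += np.max(predictions @ sigma) / n  # sup over hypotheses
    return total / n_trials

# Two toy hypotheses evaluated on a sample of 4 points.
preds = np.array([[1, 1, -1, -1],
                  [1, -1, 1, -1]], dtype=float)
print(empirical_rademacher(preds))  # close to 0.25 for this toy class
```

A richer hypothesis class can better correlate with random signs, so its estimate is larger; this is the sense in which the complexity measures how much "noise rate" the class can fit.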
Second, based on Shannon's mathematical theory of communication coding, several necessary and sufficient conditions are derived for an AI's error rate to approach zero in classification problems.
A 1/2 criterion on Shannon entropy is derived so that the error rate can approach zero, or equal zero, for AI pattern classification problems.
Finally, the analysis and theory are supported by examples of AI pattern classification with error rate approaching zero or equal to zero.
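The channel-coding connection can be illustrated with two standard quantities: the Shannon entropy of a discrete source and the capacity C = 1 - H(p) of a binary symmetric channel, below which Shannon's channel coding theorem guarantees codes whose error rate approaches zero (an illustrative sketch of these standard facts, not the paper's exact 1/2 criterion):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 - H(p). By Shannon's channel coding theorem, any rate R < C
    admits codes whose error rate approaches zero."""
    return 1.0 - shannon_entropy([p, 1.0 - p])

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: maximal binary uncertainty
print(bsc_capacity(0.11))           # ~0.5, since H(0.11) is about 1/2
```

The noiseless case p = 0 gives C = 1 and permits exactly zero error, mirroring the distinction the abstract draws between error rate approaching zero and error rate equal to zero.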
Impact Statement: Controlling the error rate of AI pattern classification is crucial in many life-critical AI applications.
This paper investigates AI uncertainty, capacity, and evolution.
Necessary and sufficient conditions for an AI's error rate to approach zero are derived from Shannon's communication coding theory.
Design methodologies for AI pattern classifiers with zero, or asymptotically zero, error rate are illustrated using Shannon's coding theory.
The method shows how to control the error rate of an AI system, how to measure its capacity, and how to evolve it to higher levels.
Index Terms: Rademacher Complexity, Shannon Theory, Shannon Entropy, Vapnik–Chervonenkis (VC) dimension.

