SUMMARY / RELATED TOPICS

Information theory

Information theory studies the quantification and communication of information. It was proposed by Claude Shannon in 1948, in a landmark paper titled "A Mathematical Theory of Communication", to find fundamental limits on signal processing and communication operations such as data compression. Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, the understanding of black holes, and numerous other fields. The field is at the intersection of mathematics, computer science, neurobiology, information engineering, and electrical engineering. The theory has found applications in other areas, including statistical inference, natural language processing, neurobiology, human vision, the evolution and function of molecular codes, model selection in statistics, thermal physics, quantum computing, plagiarism detection, pattern recognition, and anomaly detection.

Important sub-fields of information theory include source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, grey system theory, and measures of information. Applications of fundamental topics of information theory include lossless data compression, lossy data compression, and channel coding. Information theory is also used in information retrieval, intelligence gathering, statistics, and even in musical composition. A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (two equally likely outcomes) provides less information, that is, lower entropy, than identifying the outcome of a roll of a fair die (six equally likely outcomes). Other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy. Broadly, information theory studies the transmission, processing, and utilization of information. Abstractly, information can be thought of as the resolution of uncertainty.
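As a minimal sketch of the coin-versus-die example above in Python (the helper name entropy is ours for illustration, not from any particular library), Shannon entropy is H = -sum of p log2 p over the possible outcomes:

    import math

    def entropy(probabilities):
        # Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # A fair coin has two equally likely outcomes: exactly 1 bit.
    print(entropy([0.5, 0.5]))   # 1.0
    # A fair six-sided die has six: log2(6), about 2.585 bits, so its
    # outcome carries more information than the coin flip.
    print(entropy([1/6] * 6))    # ~2.585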

In the case of communication of information over a noisy channel, this abstract concept was made concrete in 1948 by Claude Shannon in his paper "A Mathematical Theory of Communication", in which information is thought of as a set of possible messages. The goal is to send these messages over a noisy channel and to have the receiver reconstruct the message with low probability of error, in spite of the channel noise. Shannon's main result, the noisy-channel coding theorem, showed that, in the limit of many channel uses, the rate of information that is asymptotically achievable equals the channel capacity, a quantity that depends on the statistics of the channel over which the messages are sent. Information theory is associated with a collection of pure and applied disciplines that have been investigated and reduced to engineering practice under a variety of rubrics throughout the world over the past half century or more: adaptive systems, anticipatory systems, artificial intelligence, complex systems, complexity science, informatics, and machine learning, along with systems sciences of many descriptions.
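To make the capacity statement concrete, consider the binary symmetric channel, which flips each transmitted bit independently with crossover probability p; its capacity has the closed form C = 1 - H(p), where H is the binary entropy function. A minimal sketch, with illustrative function names of our own choosing:

    import math

    def binary_entropy(p):
        # H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0.
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_capacity(p):
        # Capacity of the binary symmetric channel in bits per channel use.
        # The noisy-channel coding theorem says any rate below this value
        # is achievable with vanishing error probability; no rate above is.
        return 1.0 - binary_entropy(p)

    print(bsc_capacity(0.0))    # 1.0: a noiseless bit pipe
    print(bsc_capacity(0.11))   # ~0.5: half a bit per use survives the noise
    print(bsc_capacity(0.5))    # 0.0: output is independent of input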

Information theory is a broad and deep mathematical theory, with equally broad and deep applications, among which is the vital field of coding theory. Coding theory is concerned with finding explicit methods, called codes, for increasing the efficiency and reducing the error rate of data communication over noisy channels to near the channel capacity. These codes can be subdivided into data compression (source coding) and error-correction (channel coding) techniques; in the latter case, it took many years to find the methods Shannon's work proved were possible (see the sketch after this paragraph for the simplest such code). A third class of information theory codes are cryptographic algorithms, and concepts and results from coding theory and information theory are widely used in cryptography and cryptanalysis; see the article on the ban for a historical application. The landmark event that established the discipline of information theory and brought it to immediate worldwide attention was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948. Prior to this paper, limited information-theoretic ideas had been developed at Bell Labs, all implicitly assuming events of equal probability.
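As a small hedged example of an error-correction technique (the simplest possible one, not one of the near-capacity methods that took years to find), here is a three-fold repetition code with majority-vote decoding:

    def encode_repetition(bits, n=3):
        # Repeat each bit n times; the redundancy lets the decoder outvote noise.
        return [b for b in bits for _ in range(n)]

    def decode_repetition(received, n=3):
        # Majority vote over each block of n received bits.
        return [int(sum(received[i:i + n]) > n // 2)
                for i in range(0, len(received), n)]

    message = [1, 0, 1]
    codeword = encode_repetition(message)   # [1,1,1, 0,0,0, 1,1,1]
    corrupted = codeword[:]
    corrupted[1] = 0                        # flip one bit in the first block
    assert decode_repetition(corrupted) == message  # single error corrected

This code has rate 1/3 and corrects only one flipped bit per block; the point of Shannon's theorem is that far more efficient codes exist at rates up to capacity, and explicit constructions approaching that limit took decades to develop.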

Harry Nyquist's 1924 paper, "Certain Factors Affecting Telegraph Speed", contains a theoretical section quantifying "intelligence" and the "line speed" at which it can be transmitted by a communication system, giving the relation W = K log m, where W is the speed of transmission of intelligence, m is the number of different voltage levels to choose from at each time step, and K is a constant. Ralph Hartley's 1928 paper, "Transmission of Information", uses the word "information" as a measurable quantity, reflecting the receiver's ability to distinguish one sequence of symbols from any other, thus quantifying information as H = log S^n = n log S, where S is the number of possible symbols and n the number of symbols in a transmission. The unit of information was therefore the decimal digit, which has since sometimes been called the hartley in his honor as a unit of information. Alan Turing in 1940 used similar ideas as part of the statistical analysis of the breaking of the German Second World War Enigma ciphers.
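As a hedged sketch of these two historical measures (using log base 10, so Hartley's unit is the decimal digit, or hartley; the value of the constant K and the function names here are illustrative assumptions):

    import math

    def nyquist_line_speed(m, K=1.0):
        # Nyquist (1924): W = K log m, where m is the number of
        # distinguishable voltage levels per time step and K is a
        # constant fixed by the signalling rate.
        return K * math.log10(m)

    def hartley_information(S, n):
        # Hartley (1928): H = log(S^n) = n log S, where S is the number of
        # possible symbols and n the number of symbols in a transmission.
        return n * math.log10(S)

    # Ten symbols drawn from a 26-letter alphabet:
    print(hartley_information(26, 10))   # ~14.15 hartleys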

Much of the mathematics behind information theory with events of different probabilities was developed for the field of thermodynamics by Ludwig Boltzmann and J. Willard Gibbs.

Mill Spring, Missouri

Mill Spring is a village in Wayne County, Missouri, United States, along the Black River. The population was 189 at the 2010 census. Mill Spring was laid out in 1871, and the community took its name from a nearby spring of the same name. A post office called Mill Spring had been in operation since 1874; it was shut down in 2014. Mill Spring is located in a tributary valley on the northeast edge of the Black River floodplain; the community of Leeper is one mile to the northwest along Missouri Route 49. According to the United States Census Bureau, the village has a total area of 0.44 square miles, all land. At the 2010 census, there were 189 people, 81 households, and 46 families living in the village. The population density was 429.5 per square mile, and there were 106 housing units at an average density of 240.9 per square mile. The racial makeup was 96.8% White, 1.1% African American, 0.5% from other races, and 1.6% from two or more races. Hispanic or Latino residents of any race were 0.5% of the population. Of the 81 households, 25.9% had children under the age of 18 living with them, 38.3% were married couples living together, 13.6% had a female householder with no husband present, 4.9% had a male householder with no wife present, and 43.2% were non-families.

30.9% of all households were made up of individuals, and 13.6% had someone living alone who was 65 years of age or older. The average household size was 2.33 and the average family size was 3.02. The median age was 37.8 years, and 21.7% of residents were under the age of 18. The gender makeup was 48.1% female. At the 2000 census, there were 81 households and 55 families living in the village. The population density was 1,041.8 per square mile, and there were 104 housing units at an average density of 494.7 per square mile. The racial makeup was 96.35% White, 0.46% Native American, 0.91% Pacific Islander, and 2.28% from two or more races. Of the 81 households, 30.9% had children under the age of 18 living with them, 45.7% were married couples living together, 13.6% had a female householder with no husband present, and 30.9% were non-families. 24.7% of all households were made up of individuals, and 16.0% had someone living alone who was 65 years of age or older. The average household size was 2.70 and the average family size was 3.13.

28.8% of the population were under the age of 18, 8.7% from 18 to 24, 24.2% from 25 to 44, 17.8% from 45 to 64, and 20.5% were 65 years of age or older. The median age was 37 years. For every 100 females, there were 104.7 males; for every 100 females age 18 and over, there were 97.5 males. The median household income was $22,750 and the median family income was $26,389. Males had a median income of $18,125 versus $14,063 for females; the per capita income was $9,723. About 13.5% of families and 22.7% of the population were below the poverty line, including 25.0% of those under the age of 18 and 20.0% of those 65 or over. Notable residents include Bernarr Macfadden, magazine publisher and fitness authority.

Wentzville Ice Arena

Wentzville Ice Arena is an arena and recreational sport facility located in Wentzville, Missouri, owned and operated by the City of Wentzville Parks & Recreation Department. It served as the home of the Lindenwood University (LU) Lions men's and women's ice hockey teams and the LU synchronized skating team until they relocated to the newly built Centene Community Ice Center. The facility features two NHL-size sheets of ice used for ice hockey, figure skating, open skating, and local high school hockey. Other features of the arena include eight locker rooms, heated bleacher seating, and a concession stand. Current capacity for spectators is about 750 for each of the two NHL-size hockey rinks. The ice arena received renovations during the summer of 2010 to prepare for the transition of the university's athletic department to the NCAA. The renovation included new in-ice logos, new boards, and major updates to the women's locker room, including a new video room and a new workout room, for the team's move to NCAA Division I. The university also constructed a new men's ACHA DII/JV locker room.

It also hosts youth ice hockey tournaments. The arena was built in 1998 as the Wentzville Ice Arena; the facility was developed and owned by Wayne Stumpf. The 67,800-square-foot complex was constructed at a cost of $4.5 million and first opened its doors to the public on December 18, 1998. Naming rights were later sold, and the arena became known as the CenturyTel Ice Arena. The arena was sold to Lindenwood University in July 2004. In June 2019, the City of Wentzville purchased the arena and renamed it the Wentzville Ice Arena.