1.
Ronald Fisher
–
Sir Ronald Aylmer Fisher FRS, who published as R. A. Fisher, was an English statistician and biologist who used mathematics to combine Mendelian genetics and natural selection. This helped to create the neo-Darwinian synthesis of evolution known as the modern evolutionary synthesis. He was also a prominent eugenicist in the early part of his life. He is known as one of the three founders of population genetics. He outlined Fisher's principle as well as the Fisherian runaway and sexy son hypothesis theories of sexual selection, and he also made important contributions to statistics, including the method of maximum likelihood, fiducial inference, and the derivation of various sampling distributions, among many others. Anders Hald called him "a genius who almost single-handedly created the foundations for modern statistical science". Not only was he the most original and constructive of the architects of the neo-Darwinian synthesis; Fisher was also the father of modern statistics and experimental design, and he could therefore be said to have provided researchers in biology and medicine with their most important research tools. Geoffrey Miller said of him: "To biologists, he was an architect of the modern synthesis that used mathematical models to integrate Mendelian genetics with Darwin's selection theories. To psychologists, Fisher was the inventor of various statistical tests that are still supposed to be used whenever possible in psychology journals. To farmers, Fisher was the founder of agricultural research." Fisher was born in East Finchley in London, England, one of twins, the other being still-born. From 1896 until 1904 the family lived at Inverforth House in London, where English Heritage installed a blue plaque in 2002, before moving to Streatham. He entered Harrow School at age 14 and won the school's Neeld Medal in mathematics. In 1909 he won a scholarship to Gonville and Caius College, Cambridge. In 1919 he began working at Rothamsted Research; his fame grew, and he began to travel and lecture widely.
In 1937 he visited the Indian Statistical Institute in Calcutta, founded by P. C. Mahalanobis, and often returned to encourage its development, being the guest of honour at its 25th anniversary in 1957, when it had 2,000 employees. His marriage disintegrated during World War II, in which his eldest son George was killed; his daughter Joan, one of his biographers, married the noted statistician George E. P. Box. Fisher had gained a scholarship to study mathematics at the University of Cambridge in 1909. In 1915 he published a paper, The evolution of sexual preference, on sexual selection and mate choice. He published The Correlation Between Relatives on the Supposition of Mendelian Inheritance in 1918, in which he introduced the term variance; Joan Box, Fisher's biographer and daughter, says that Fisher had resolved this problem as early as 1911. Between 1912 and 1922 Fisher recommended, analyzed and vastly popularized maximum likelihood. In 1928 Joseph Oscar Irwin began a three-year stint at Rothamsted and became one of the first people to master Fisher's innovations. Fisher's first application of the analysis of variance was published in 1921, and he pioneered the principles of the design of experiments, the statistics of small samples, and the analysis of real data.
2.
Normal distribution
–
In probability theory, the normal distribution is a very common continuous probability distribution. Normal distributions are important in statistics and are often used in the natural and social sciences to represent real-valued random variables whose distributions are not known. The normal distribution is useful because of the central limit theorem: physical quantities that are expected to be the sum of many independent processes often have distributions that are nearly normal. Moreover, many results and methods can be derived analytically in explicit form when the relevant variables are normally distributed. The normal distribution is sometimes informally called the bell curve; however, many other distributions are bell-shaped. The probability density of the normal distribution is

f(x | μ, σ²) = (1 / √(2πσ²)) · e^(−(x − μ)² / (2σ²))

where μ is the mean or expectation of the distribution, σ is the standard deviation, and σ² is the variance. A random variable with a Gaussian distribution is said to be normally distributed and is called a normal deviate. The simplest case, with μ = 0 and σ = 1, is known as the standard normal distribution. The factor 1/2 in the exponent ensures that the distribution has unit variance; this function is symmetric around x = 0, where it attains its maximum value 1/√(2π), and has inflection points at x = +1 and x = −1. Authors may differ on which normal distribution should be called the standard one; in any parameterization, the probability density must be scaled by 1/σ so that the integral is still 1. If Z is a standard normal deviate, then X = Zσ + μ will have a normal distribution with expected value μ and standard deviation σ. Conversely, if X is a normal deviate with parameters μ and σ, then Z = (X − μ)/σ will have a standard normal distribution. Every normal distribution is the exponential of a quadratic function, f(x) = e^(ax² + bx + c), where a is negative. In this form, the mean value μ is −b/(2a); for the standard normal distribution, a is −1/2, b is zero, and c is −ln(2π)/2. The standard Gaussian density is denoted with the Greek letter ϕ.
The alternative form of the Greek phi letter, φ, is also used quite often. The normal distribution is often denoted by N(μ, σ²); thus when a random variable X is distributed normally with mean μ and variance σ², one may write X ~ N(μ, σ²). Some authors advocate using the precision τ = 1/σ² as the parameter defining the width of the distribution, instead of the standard deviation σ or the variance σ².
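The density formula and the standardization rule Z = (X − μ)/σ described above can be sketched in a few lines of Python. This is an illustrative sketch (not from the original article); the function names are chosen here for clarity.

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density f(x) = exp(-(x - mu)^2 / (2 sigma^2)) / sqrt(2 pi sigma^2)."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

def standardize(x, mu, sigma):
    """Map a normal deviate X to a standard normal deviate Z = (X - mu) / sigma."""
    return (x - mu) / sigma

# The standard normal density is symmetric about 0 and peaks there
# with value 1/sqrt(2*pi), as stated in the text.
peak = normal_pdf(0.0)
```

Evaluating `normal_pdf` at symmetric points ±x returns equal values, matching the stated symmetry around x = 0.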
3.
Statistics
–
Statistics is a branch of mathematics dealing with the collection, analysis, interpretation, presentation, and organization of data. In applying statistics to, e.g., a scientific, industrial, or social problem, it is conventional to begin with a statistical population or process to be studied; populations can be diverse topics such as all people living in a country or every atom composing a crystal. Statistics deals with all aspects of data, including the planning of data collection in terms of the design of surveys and experiments. Statistician Sir Arthur Lyon Bowley defined statistics as "numerical statements of facts in any department of inquiry placed in relation to each other". When census data cannot be collected, statisticians collect data by developing specific experiment designs; representative sampling assures that inferences and conclusions can safely extend from the sample to the population as a whole. In contrast, an observational study does not involve experimental manipulation. Inferences in mathematical statistics are made under the framework of probability theory, which deals with the analysis of random phenomena. A standard statistical procedure involves the test of the relationship between two data sets, or between a data set and synthetic data drawn from an idealized model. A hypothesis is proposed for the statistical relationship between the two data sets, and this is compared as an alternative to an idealized null hypothesis of no relationship between the two data sets. Rejecting or disproving the null hypothesis is done using statistical tests that quantify the sense in which the null can be proven false, given the data. Working from a null hypothesis, two basic forms of error are recognized: Type I errors (rejecting a true null hypothesis) and Type II errors (failing to reject a false null hypothesis). Multiple problems have come to be associated with this framework, ranging from obtaining a sufficient sample size to specifying an adequate null hypothesis; measurement processes that generate statistical data are also subject to error.
Many of these errors are classified as random or systematic; the presence of missing data or censoring may result in biased estimates, and specific techniques have been developed to address these problems. Statistics continues to be an area of active research, for example on the problem of how to analyze big data. Statistics is a body of science that pertains to the collection, analysis, interpretation or explanation, and presentation of data. Some consider statistics to be a distinct mathematical science rather than a branch of mathematics. While many scientific investigations make use of data, statistics is concerned with the use of data in the context of uncertainty; mathematical techniques used for this include mathematical analysis, linear algebra, stochastic analysis, differential equations, and measure-theoretic probability theory. In applying statistics to a problem, it is common practice to start with a population or process to be studied. Populations can be diverse topics such as all people living in a country or every atom composing a crystal. Ideally, statisticians compile data about the entire population; this may be organized by governmental statistical institutes.
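The null-hypothesis testing described above usually reduces to computing a test statistic from two data sets. As a minimal sketch (not part of the original article), the following computes Welch's t-statistic for the difference of two sample means, a standard statistic for testing a null hypothesis of no difference; the sample data here are hypothetical.

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t-statistic: (mean_a - mean_b) / sqrt(s_a^2/n_a + s_b^2/n_b).

    Large |t| is evidence against the null hypothesis that the two
    populations share the same mean."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances (n - 1 divisor)
    return (mean(sample_a) - mean(sample_b)) / math.sqrt(va / na + vb / nb)

# Hypothetical data: two small samples with similar means.
t_stat = welch_t([2, 4, 4, 4, 5, 5, 7, 9], [1, 2, 3, 4, 5, 6, 7, 8])
```

A small |t| like the one computed here would not justify rejecting the null; converting t to a p-value additionally requires the t-distribution's degrees of freedom.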
4.
Probability
–
Probability is the measure of the likelihood that an event will occur. Probability is quantified as a number between 0 and 1; the higher the probability of an event, the more certain it is that the event will occur. A simple example is the tossing of a fair coin: since the coin is unbiased, the two outcomes are equally probable, so the probability of heads equals the probability of tails. Since no other outcomes are possible, each probability is 1/2; this type of probability is also called a priori probability. Probability theory is used to describe the underlying mechanics and regularities of complex systems. For example, tossing a coin twice will yield head-head, head-tail, tail-head, or tail-tail. The probability of getting an outcome of head-head is 1 out of 4 outcomes, or 1/4 or 0.25. This interpretation considers probability to be the relative frequency in the long run of outcomes. A modification of this is propensity probability, which interprets probability as the tendency of some experiment to yield a certain outcome. Subjectivists assign numbers per subjective probability, i.e. as a degree of belief. The degree of belief has been interpreted as the price at which you would buy or sell a bet that pays 1 unit of utility if E, 0 if not E. The most popular version of subjective probability is Bayesian probability, which includes expert knowledge as well as data to produce probabilities. The expert knowledge is represented by a prior probability distribution, and the data are incorporated in a likelihood function. The product of the prior and the likelihood, normalized, results in a posterior probability distribution that incorporates all the information known to date. The scientific study of probability is a modern development of mathematics. Gambling shows that there has been an interest in quantifying the ideas of probability for millennia, but there are reasons, of course, for the slow development of the mathematics of probability.
Games of chance provided the impetus for the mathematical study of probability. According to Richard Jeffrey, before the middle of the seventeenth century the term "probable" meant approvable: a probable action or opinion was one such as sensible people would undertake or hold. However, in legal contexts especially, "probable" could also apply to propositions for which there was good evidence. The sixteenth-century Italian polymath Gerolamo Cardano demonstrated the efficacy of defining odds as the ratio of favourable to unfavourable outcomes.
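The two-toss example above, where head-head is 1 of 4 equally likely outcomes, can be verified by brute-force enumeration. This is an illustrative sketch, not part of the original article.

```python
from itertools import product
from fractions import Fraction

# Enumerate every equally likely outcome of two fair coin tosses:
# ('H','H'), ('H','T'), ('T','H'), ('T','T').
outcomes = list(product("HT", repeat=2))

# A priori probability of head-head: favourable outcomes / total outcomes.
p_head_head = Fraction(sum(1 for o in outcomes if o == ("H", "H")), len(outcomes))
```

`p_head_head` is exactly 1/4, matching the 0.25 stated in the text; the same enumeration scales to any number of tosses via `repeat=n`.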
5.
SAS Institute
–
SAS Institute is an American multinational developer of analytics software based in Cary, North Carolina. SAS develops and markets a suite of analytics software, which helps users access, manage, analyze and report on data. The company is the world's largest privately held software business, and its software is used by most of the Fortune 500. SAS has developed a model workplace environment and benefits program designed to retain employees, allow them to focus on their work, and reduce operating costs; it provides on-site, subsidized or free healthcare, gyms and daycare. It became an independent, private business led by current CEO James Goodnight and three other project leaders from the university in 1976. SAS grew from $10 million in revenues in 1980 to $1.1 billion by 2000; a larger proportion of these revenues is spent on research and development than at most other software companies, at one point more than double the industry average. The Statistical Analysis System began as a project at North Carolina State University's agricultural department. It was originally led by Anthony James Barr in 1966, who was joined by NCSU graduate student James Goodnight in 1967 and John Sall in 1973. In the early 1970s, the software was primarily leased to other agricultural departments in order to analyze the effects of factors such as soil and weather on crop yields. The project was funded by the National Institutes of Health and later by a coalition of university statistics programs called the University Statisticians of the Southern Experiment Stations. By 1976 the software had 100 customers, and 300 people attended the first SAS user conference in Kissimmee, Florida that year. Goodnight, Barr, Sall and another participant, Jane Helwig, founded SAS Institute Inc. as a private company on July 1, 1976. Barr and Helwig later sold their interest in the company. SAS's tradition of polling users for suggestions to improve the software through the SASWare Ballot was adopted during its first year of operation.
Many of the company's employee perks, such as free fruit and reasonable work hours, date from its early years. In the late 1970s, the company established its first marketing department. SAS started building its current headquarters in a forested area of Cary, North Carolina in 1980. Later that year it started providing on-site daycare in order to keep an employee who was planning on becoming a stay-at-home mom. By 1984, SAS had begun building a fitness center, medical center, on-site cafe and other facilities, and it had also developed some of its other benefits programs. SAS became known as a good place to work and was frequently recognized by national magazines like BusinessWeek, Working Mother and Fortune for its work environment. During the 1980s, SAS was one of Inc. Magazine's fastest growing companies in America, from 1979 to 1985. It grew more than ten percent per year, from $10 million in revenues in 1980 to $1.1 billion by 2000; in 2007, SAS revenue was $2.15 billion, and in 2013 its revenue was $3.02 billion. By the late 1990s, SAS was the largest privately held software company, and the Associated Press reported that analysts attributed the growth to aggressive R&D spending. It had the highest ratio of revenues spent on R&D in the industry for eight years, setting a record of 34 percent of its revenues in 1993. The company began its relationship with Microsoft and development for Windows operating systems in 1989.
6.
Central limit theorem
–
If this procedure is performed many times, the central limit theorem says that the computed values of the average will be distributed according to the normal distribution. The central limit theorem has a number of variants; in its common form, the random variables must be independent and identically distributed (i.i.d.). In variants, convergence of the mean to the normal distribution also occurs for non-identical distributions or for non-independent observations. In more general usage, a central limit theorem is any of a set of weak-convergence theorems in probability theory. When the variance of the i.i.d. variables is finite, the attractor distribution is the normal distribution. In contrast, the sum of a number of i.i.d. random variables with power-law tail distributions decreasing as |x|^(−α−1), where 0 < α < 2, will tend to an alpha-stable distribution with stability parameter α as the number of variables grows. Suppose we are interested in the sample average

Sn = (X1 + ⋯ + Xn) / n

of these random variables. By the law of large numbers, the sample averages converge in probability and almost surely to the expected value µ as n → ∞. The classical central limit theorem describes the size and the distributional form of the stochastic fluctuations around the deterministic number µ during this convergence. For large enough n, the distribution of Sn is close to the normal distribution with mean µ and variance σ²/n. The usefulness of the theorem is that the distribution of √n(Sn − µ) approaches normality regardless of the shape of the distribution of the individual Xi. Formally, the theorem can be stated as follows.

Lindeberg–Lévy CLT. Suppose {X1, X2, …} is a sequence of i.i.d. random variables with E[Xi] = µ and Var[Xi] = σ² < ∞. Then as n approaches infinity, the random variables √n(Sn − µ) converge in distribution to a normal N(0, σ²):

√n(Sn − µ) →d N(0, σ²).

Note that the convergence is uniform in z, in the sense that

lim n→∞ sup z∈R | Pr(√n(Sn − µ) ≤ z) − Φ(z/σ) | = 0,

where Φ is the standard normal cumulative distribution function. Another variant of the theorem is named after the Russian mathematician Aleksandr Lyapunov.
In this variant of the central limit theorem the random variables Xi have to be independent, but not necessarily identically distributed. The theorem also requires that the random variables |Xi| have moments of some order (2 + δ). Suppose {X1, X2, …} is a sequence of independent random variables, each with finite expected value μi and variance σi², and let sn² = Σ i=1..n σi². In practice it is usually easiest to check Lyapunov's condition for δ = 1. If a sequence of random variables satisfies Lyapunov's condition, then it also satisfies Lindeberg's condition; the converse implication, however, does not hold. In the same setting and with the same notation as above, Lindeberg's condition reads: suppose that for every ε > 0,

lim n→∞ (1/sn²) Σ i=1..n E[ (Xi − μi)² · 1{|Xi − μi| > ε sn} ] = 0,

where 1{…} is the indicator function.
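The basic phenomenon, repeated sample averages clustering normally around µ with spread σ/√n, can be checked empirically. The following simulation is an illustrative sketch (not from the original article), using uniform variables on [0, 1], for which µ = 1/2 and σ² = 1/12.

```python
import random
from statistics import mean, stdev

random.seed(0)  # make the simulation reproducible

n = 100        # sample size per average
trials = 2000  # number of computed averages

# Compute many sample averages Sn of n i.i.d. uniform variables.
averages = [mean(random.random() for _ in range(n)) for _ in range(trials)]

# The CLT predicts the averages cluster near mu = 0.5 with
# standard deviation sigma / sqrt(n) = sqrt(1/12) / 10 ~ 0.0289.
observed_mean = mean(averages)
observed_sd = stdev(averages)
```

A histogram of `averages` would show the bell shape, even though the underlying uniform distribution is flat, which is exactly the point of the theorem.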
7.
Wolfram Mathematica
–
Wolfram Mathematica is a mathematical symbolic computation program, sometimes termed a computer algebra system, used in many scientific, engineering, mathematical, and computing fields. It was conceived by Stephen Wolfram and is developed by Wolfram Research of Champaign, Illinois; the Wolfram Language is the programming language used in Mathematica. The kernel interprets expressions and returns result expressions. All content and formatting can be generated algorithmically or edited interactively. Standard word processing capabilities are supported, including real-time multi-lingual spell-checking. Documents can be structured using a hierarchy of cells, which allow for outlining and sectioning of a document and support automatic numbering and index creation. Documents can be presented in a slideshow environment for presentations. Notebooks and their contents are represented as Mathematica expressions that can be created, modified or analyzed by Mathematica programs or converted to other formats. The front end includes development tools such as a debugger, input completion, and automatic syntax highlighting. Among the alternative front ends is the Wolfram Workbench, an Eclipse-based integrated development environment; it provides project-based code development tools for Mathematica, including revision management, debugging, profiling, and testing. There is a plugin for IntelliJ IDEA-based IDEs to work with Wolfram Language code which, in addition to syntax highlighting, can analyse and auto-complete local variables. The Mathematica Kernel also includes a command-line front end. Other interfaces include JMath, based on GNU Readline, and MASH, which runs self-contained Mathematica programs from the UNIX command line. Version 5.2 added automatic multi-threading when computations are performed on multi-core computers; this release included CPU-specific optimized libraries. In addition, Mathematica is supported by third-party specialist acceleration hardware such as ClearSpeed.
Support for CUDA and OpenCL GPU hardware was added in 2010. Also, since version 8 it can generate C code, which is automatically compiled by a system C compiler, such as GCC or Microsoft Visual Studio. A free-of-charge version, Wolfram CDF Player, is provided for running Mathematica programs that have been saved in the Computable Document Format. It can also view standard Mathematica files, but not run them, and it includes plugins for common web browsers on Windows and Macintosh. WebMathematica allows a web browser to act as a front end to a remote Mathematica server; it is designed to allow a user-written application to be remotely accessed via a browser on any platform, though it may not be used to provide full access to Mathematica. Due to bandwidth limitations, interactive 3D graphics is not fully supported within a web browser. Wolfram Language code can be converted to C code or to an automatically generated DLL. Wolfram Language code can also be run on a Wolfram cloud service as a web-app or as an API, either on Wolfram-hosted servers or in an installation of the Wolfram Enterprise Private Cloud. Communication with other applications occurs through a protocol called Wolfram Symbolic Transfer Protocol; it allows communication between the Wolfram Mathematica kernel and front end, and also provides a general interface between the kernel and other applications.
8.
Doug Altman
–
Douglas Altman FMedSci is an English statistician best known for his work on improving the reliability and reporting of medical research and for highly cited papers on statistical methodology. Doug Altman graduated in statistics from the University of Bath; his first job was in the Department of Community Medicine at St Thomas's Hospital Medical School. He then spent 11 years working for the Medical Research Council's Clinical Research Centre, where he worked almost entirely as a consultant in a wide variety of medical areas. In 1998 he was made Professor of Statistics in Medicine by the University of Oxford. Altman is regarded as a leading authority on the execution and reporting of health research, and has played a leading role in establishing better standards. He is also one of the authors of the IDEAL framework for improving surgical research. His textbook Practical Statistics for Medical Research, published in 1991, has sold 50,000 copies in hardback. Altman is the author of over 450 papers on statistical methodology, with 11 being cited over 1,000 times. Among them is one Lancet paper, which has been cited over 23,000 times and is ranked 29th in the Nature/Web of Science Top 100 most-cited research papers of all time. Altman was awarded the Bradford Hill Medal by the Royal Statistical Society for his contributions to statistics in 1997. Altman is also editor-in-chief of Trials and a Fellow of the Academy of Medical Sciences. His books include: Practical Statistics for Medical Research (Monographs on Statistics and Applied Probability), Douglas G. Altman, ISBN 0-412-27630-5; Systematic Reviews in Healthcare: Meta-Analysis in Context, editors Douglas G. Altman, Iain Chalmers, Gerd Antes, Michael Bradburn, Mike Clarke, Matthias Egger, ISBN 0-7279-1488-X; Statistics With Confidence: Confidence Intervals and Statistical Guidelines, editors Douglas G. Altman, David Machin, T. N. Bryant; and a volume edited by Douglas G. Altman and Iain Chalmers.
ISBN 0-7279-0904-5; Statistics in Practice: Articles Published in the British Medical Journal, editors Sheila M. Gore, Douglas G. Altman, ISBN 0-7279-0085-4. A list of over 396 articles by Doug Altman is available through PubMed. Selected papers include: Moher D, Schulz KF and Altman DG for the CONSORT Group, "Revised recommendations for improving the quality of reports of parallel group randomized trials"; "Statistical methods for assessing agreement between two methods of clinical measurement" (a reprint is available); the BMJ Statistical Notes, a series of articles on the use of statistics by Doug Altman; "Measurement in medicine: the analysis of method comparison studies"; "Measuring agreement in method comparison studies", Statistical Methods in Medical Research 8, 135–160; and "Comparing methods of measurement: why plotting difference against standard method is misleading".
9.
Microsoft Excel
–
Microsoft Excel is a spreadsheet developed by Microsoft for Windows, macOS, Android and iOS. It features calculation, graphing tools, pivot tables, and a macro programming language called Visual Basic for Applications. It has been a very widely applied spreadsheet for these platforms, especially since version 5 in 1993, and it forms part of Microsoft Office. Microsoft Excel has the basic features of all spreadsheets, using a grid of cells arranged in numbered rows and letter-named columns to organize data manipulations like arithmetic operations. It has a battery of supplied functions to answer statistical, engineering and financial needs. In addition, it can display data as line graphs, histograms and charts, and with a very limited three-dimensional graphical display. It allows sectioning of data to view its dependencies on various factors for different perspectives. Excel was not designed to be used as a database. Microsoft allows for a number of optional command-line switches to control the manner in which Excel starts. The Windows version of Excel supports programming through Microsoft's Visual Basic for Applications (VBA), which is a dialect of Visual Basic. Programming with VBA allows spreadsheet manipulation that is awkward or impossible with standard spreadsheet techniques. Programmers may write code directly using the Visual Basic Editor, which includes a window for writing code, debugging code, and a code module organization environment. A common and easy way to generate VBA code is by using the Macro Recorder: the Macro Recorder records actions of the user and generates VBA code in the form of a macro. These actions can then be repeated automatically by running the macro. The macros can also be linked to different trigger types like keyboard shortcuts, a command button or a graphic, and the actions in the macro can be executed from these trigger types or from the generic toolbar options.
The VBA code of the macro can also be edited in the VBE. Advanced users can employ user prompts to create an interactive program, or react to events such as sheets being loaded or changed. Macro-recorded code may not be compatible between Excel versions: some code that is used in Excel 2010 cannot be used in Excel 2003, and a macro that changes the colors or other aspects of cells may not be backward compatible. User-created VBA subroutines execute these actions and operate like macros generated using the macro recorder. From its first version Excel supported end-user programming of macros and user-defined functions. Beginning with version 5.0 Excel recorded macros in VBA by default; after version 5.0 the option to record macros in the older XLM language was discontinued. All versions of Excel, including Excel 2010, are capable of running an XLM macro. Excel supports charts, graphs, and histograms generated from specified groups of cells. The generated graphic component can either be embedded within the current sheet or added as a separate object, and these displays are dynamically updated if the content of cells changes.
10.
SciPy
–
SciPy is an open-source Python library used for scientific computing and technical computing. SciPy builds on the NumPy array object and is part of the NumPy stack, which includes tools like Matplotlib and pandas; an expanding set of scientific computing libraries is continually being added to the NumPy stack. This NumPy stack has similar users to other applications such as MATLAB and GNU Octave, and the NumPy stack is sometimes also referred to as the SciPy stack. SciPy is also a family of conferences for users and developers of these tools; Enthought originated the SciPy conference in the United States and continues to sponsor many of the international conferences as well as host the SciPy website. The SciPy library is distributed under the BSD license. It is also supported by NumFOCUS, a community foundation for supporting reproducible and accessible science. A typical Python scientific computing environment includes many dedicated software tools, with the SciPy package providing key algorithms and functions core to Python's scientific computing capabilities. NumPy provides some functions for linear algebra, Fourier transforms and random number generation, but not with the generality of the equivalent functions in SciPy. NumPy can also be used as an efficient multi-dimensional container of data with arbitrary data types; this allows NumPy to seamlessly and speedily integrate with a wide variety of databases. Older versions of SciPy used Numeric as an array type, which is now deprecated in favor of the newer NumPy array code. In the 1990s, Python was extended to include an array type for numerical computing called Numeric. As of 2000, there were a growing number of extension modules and increasing interest in creating a complete environment for scientific and technical computing. In 2001, Travis Oliphant, Eric Jones, and Pearu Peterson merged code they had written; the newly created package provided a standard collection of common numerical operations on top of the Numeric array data structure.
Since then the SciPy environment has continued to grow with more packages. See also: list of numerical analysis software; comparison of numerical analysis software; SageMath. External resources include the official SciPy website, the NumPy website, the SciPy Course Outline by Dave Kuhlman, and the Python Scientific Lecture Notes.
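Two quick examples of the "common numerical operations" SciPy layers on top of NumPy are numerical integration and root finding. This is an illustrative sketch (not from the original article) using two long-standing SciPy functions, `scipy.integrate.quad` and `scipy.optimize.brentq`; it assumes SciPy and NumPy are installed.

```python
import numpy as np
from scipy import integrate, optimize

# Numerically integrate sin(x) over [0, pi]; the exact value is 2.
area, abs_err = integrate.quad(np.sin, 0, np.pi)

# Bracketed root finding: cos changes sign on [0, 3], with root pi/2.
root = optimize.brentq(np.cos, 0, 3)
```

`quad` also returns an estimate of the absolute error, illustrating the extra generality SciPy adds over NumPy's basic numerical routines.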
11.
Standard deviation
–
In statistics, the standard deviation is a measure that is used to quantify the amount of variation or dispersion of a set of data values. The standard deviation of a random variable, statistical population, data set, or probability distribution is the square root of its variance. It is algebraically simpler, though in practice less robust, than the average absolute deviation. A useful property of the standard deviation is that, unlike the variance, it is expressed in the same units as the data. There are also other measures of deviation from the norm, including the mean absolute deviation. In addition to expressing the variability of a population, the standard deviation is commonly used to measure confidence in statistical conclusions. For example, the margin of error in polling data is determined by calculating the expected standard deviation in the results if the same poll were to be conducted multiple times. This derivation of a standard deviation is often called the standard error of the estimate, or standard error of the mean when referring to a mean. It is computed as the standard deviation of all the means that would be computed from that population if an infinite number of samples were drawn. It is very important to note that the standard deviation of a population and the standard error of a statistic derived from that population are quite different, though related. The reported margin of error of a poll is computed from the standard error of the mean and is typically about twice the standard deviation, the half-width of a 95 percent confidence interval. The standard deviation is also important in finance, where the standard deviation on the rate of return on an investment is a measure of the volatility of the investment. For a finite set of numbers, the population standard deviation is found by taking the square root of the average of the squared deviations of the values from their average value. For example, suppose the marks of a class of eight students are the eight values 2, 4, 4, 4, 5, 5, 7, 9. These eight data points have a mean of 5: (2 + 4 + 4 + 4 + 5 + 5 + 7 + 9) / 8 = 5. This formula is valid only if the eight values with which we began form the complete population; the calculation changes slightly if the values instead were a sample drawn from some larger parent population.
In that case the result would be called the sample standard deviation. Dividing by n − 1 rather than by n gives an unbiased estimate of the variance of the larger parent population; this is known as Bessel's correction. As a slightly more complicated real-life example, the average height for adult men in the United States is about 70 inches, with a standard deviation of around 3 inches.
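The eight-mark example and Bessel's correction can be checked directly with Python's standard library, which provides both the population and the sample versions. This sketch is not part of the original article.

```python
import math
from statistics import pstdev, stdev

marks = [2, 4, 4, 4, 5, 5, 7, 9]  # the eight marks from the example above

# Population standard deviation: the summed squared deviations (32) are
# divided by n = 8, so the result is sqrt(32 / 8) = 2.0.
population_sd = pstdev(marks)

# Sample standard deviation with Bessel's correction: divide by
# n - 1 = 7 instead, giving sqrt(32 / 7), roughly 2.138.
sample_sd = stdev(marks)
```

The sample value is always slightly larger than the population value, reflecting the correction for estimating the mean from the same data.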
12.
MATLAB
–
MATLAB is a multi-paradigm numerical computing environment and fourth-generation programming language. Although MATLAB is intended primarily for numerical computing, an optional toolbox uses the MuPAD symbolic engine, and an additional package, Simulink, adds graphical multi-domain simulation and model-based design for dynamic and embedded systems. In 2004, MATLAB had around one million users across industry and academia. MATLAB users come from various backgrounds in engineering, science, and economics. Cleve Moler, the chairman of the computer science department at the University of New Mexico, started developing MATLAB in the late 1970s. He designed it to give his students access to LINPACK and EISPACK without them having to learn Fortran; it soon spread to other universities and found a strong audience within the applied mathematics community. Jack Little, an engineer, was exposed to it during a visit Moler made to Stanford University in 1983. Recognizing its commercial potential, he joined with Moler and Steve Bangert; they rewrote MATLAB in C and founded MathWorks in 1984 to continue its development. These rewritten libraries were known as JACKPAC. In 2000, MATLAB was rewritten to use a newer set of libraries for matrix manipulation, LAPACK. MATLAB was first adopted by researchers and practitioners in control engineering, Little's specialty; it is now also used in education, in particular the teaching of linear algebra and numerical analysis, and is popular amongst scientists involved in image processing. The MATLAB application is built around the MATLAB scripting language; common usage involves using the Command Window as an interactive mathematical shell or executing text files containing MATLAB code. Variables are defined using the assignment operator, =. MATLAB is a weakly typed programming language because types are implicitly converted. It is a dynamically typed language because variables can be assigned without declaring their type, except if they are to be treated as symbolic objects.
Values can come from constants or from computation involving values of other variables. For example, a simple array is defined using the colon syntax init:increment:terminator. For instance, array = 1:2:9 defines a variable named array which is an array consisting of the values 1, 3, 5, 7 and 9; that is, the array starts at 1, increments with each step from the previous value by 2, and stops once it reaches 9. The increment value can be left out of this syntax: ari = 1:5 assigns to the variable named ari an array with the values 1, 2, 3, 4, and 5, since the default value of 1 is used as the incrementer. Indexing is one-based, which is the usual convention for matrices in mathematics, although not for some programming languages such as C and C++. Matrices can be defined by separating the elements of a row with blank space or comma; the list of elements should be surrounded by square brackets. Parentheses are used to access elements and subarrays; sets of indices can be specified by expressions such as 2:4, which evaluates to [2, 3, 4].
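For readers more familiar with Python, MATLAB's colon ranges described above can be sketched with `range`. This is a rough analogue, not MATLAB itself: note that MATLAB's terminator is inclusive while Python's stop value is exclusive, and MATLAB indexing is one-based while Python's is zero-based.

```python
# MATLAB's array = 1:2:9 builds [1 3 5 7 9]; in Python the stop must
# lie past the inclusive terminator, hence range(1, 10, 2).
array = list(range(1, 10, 2))

# MATLAB's ari = 1:5 uses the default increment of 1.
ari = list(range(1, 6))

# MATLAB's array(1) is the first element; Python uses array[0].
first = array[0]

# The MATLAB index expression 2:4 evaluates to [2, 3, 4].
indices = list(range(2, 5))
```

The inclusive/exclusive difference is the usual pitfall when translating colon ranges between the two languages.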