1.
Light painting
–
The technique is used for both scientific and artistic purposes, as well as in commercial photography. Light painting dates back to 1889, when Étienne-Jules Marey and Georges Demeny traced human motion in the first known light painting, "Pathological Walk From in Front". Man Ray, in his 1935 series Space Writing, was the first known art photographer to use the technique. Photographer Barbara Morgan began making light paintings in 1940. In 1949 Pablo Picasso was visited by Gjon Mili, a photographer and lighting innovator, and Picasso immediately started making images in the air with a small flashlight in a dark room. This series of photos became known as Picasso's light drawings. A related series, Schwingungsfigur, of complex linear meshes, often with moiré effects, was produced using a point-source light on a pendulum. During the 1970s and 80s, Eric Staller used this technology for numerous projects that were called Light Drawings. Light paintings up to 1976 are classified as light drawings; in 1977 Dean Chamberlain gave birth to light painting with his image Polyethylene Bags On Chaise Longue at the Rochester Institute of Technology. Dean Chamberlain was the first artist to dedicate his entire body of work to the light painting art form. The artist photographer Jacques Pugin made several series of images with the light drawing technique in 1979. Picasso and Mili's images should be regarded as some of the first light drawings; now, with modern light painting, choreography and performance are used more frequently to organize and photograph the work.

Since the 1980s, Vicki DaSilva has been working exclusively in light painting. In 1980 she started making deliberate text light graffiti works, the first being Cash. She continued these light graffiti photographs throughout the 1980s and eventually started using 4-foot fluorescent bulbs hooked up to pulley systems to create sheets of light. In the early 2000s she began making work with 8-foot fluorescent lamps, holding the lamp vertically.

Light painting requires a slow shutter speed, usually at least a second in duration. Light paintings can be created using a webcam; the painted image can already be seen while drawing by using a monitor or projector. Another technique is the projection of images onto irregular surfaces; a photograph or other fixed portrayal of the resulting image is then made. Kinetic light painting is achieved by moving the camera, and is the antithesis of traditional photography: at night, or in a dark room, the camera can be removed from the tripod and used like a paintbrush. An example is using the sky as the canvas and the camera as the brush, putting energy into moving the camera by stroking lights and making patterns. A variety of light sources can be used, ranging from simple flashlights to dedicated devices like the Hosemaster, which uses a fiber optic light pen. Other sources of light, including candles, matches, fireworks, lighter flints, glowsticks, and the PixelStick, offer users the ability to control their light source in order to create images, from letters to pictures that they have imported.
2.
Computer program
–
A computer program is a collection of instructions that performs a specific task when executed by a computer. A computer requires programs to function, and typically executes the program's instructions in a central processing unit. A computer program is written by a computer programmer in a programming language. From the program in its human-readable form of source code, a compiler can derive machine code, a form consisting of instructions that the computer can directly execute. Alternatively, a program may be executed with the aid of an interpreter. A part of a program that performs a well-defined task is known as an algorithm. A collection of programs, libraries, and related data is referred to as software. Computer programs may be categorized along functional lines, such as application software or system software.

The earliest programmable machines preceded the invention of the digital computer. In 1801, Joseph-Marie Jacquard devised a loom that would weave a pattern by following a series of perforated cards. Patterns could be woven and repeated by arranging the cards. In 1837, Charles Babbage was inspired by Jacquard's loom to attempt to build the Analytical Engine. The names of the components of the device were borrowed from the textile industry, in which yarn was brought from the store to be milled. The device would have had a store (memory) to hold 1,000 numbers of 40 decimal digits each. Numbers from the store would then have been transferred to the mill. It was programmed using two sets of perforated cards, one to direct the operation and the other for the input variables. However, after more than 17,000 pounds of the British government's money, the thousands of cogged wheels and gears never fully worked together.

During a nine-month period in 1842–43, Ada Lovelace translated the memoir of Italian mathematician Luigi Menabrea; the memoir covered the Analytical Engine. The translation contained Note G, which completely detailed a method for calculating Bernoulli numbers using the Analytical Engine. This note is recognized by some historians as the world's first written computer program. In 1936, Alan Turing introduced the Universal Turing machine, a theoretical device that can model every computation that can be performed on a Turing complete computing machine. It is a finite-state machine that has an infinitely long read/write tape. The machine can move the tape back and forth, changing its contents as it performs an algorithm.
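To make the idea concrete, the paragraph above distinguishes a program (source code that a compiler turns into machine code) from an algorithm (a well-defined sub-task within it). The sketch below is our own illustration, not taken from the article: a complete C program whose function gcd implements Euclid's algorithm, one small, well-defined task inside a larger program. The choice of Euclid's algorithm and the example inputs are ours.

```c
#include <stdio.h>

/* Euclid's algorithm: a well-defined task (an "algorithm")
   expressed as one function within a larger program. */
static unsigned gcd(unsigned a, unsigned b)
{
    while (b != 0) {          /* repeat until the remainder is zero */
        unsigned r = a % b;
        a = b;
        b = r;
    }
    return a;
}

int main(void)
{
    printf("gcd(1071, 462) = %u\n", gcd(1071, 462));  /* prints 21 */
    return 0;
}
```

Compiling this source file (for example with a C compiler such as cc) produces machine code the computer can execute directly, which is exactly the source-to-machine-code step the paragraph describes.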
3.
Programming language
–
A programming language is a formal computer language designed to communicate instructions to a machine, particularly a computer. Programming languages can be used to create programs that control the behavior of a machine or to express algorithms. From the early 1800s, programs were used to direct the behavior of machines such as Jacquard looms. Thousands of different programming languages have been created, mainly in the computer field. Many programming languages require computation to be specified in an imperative form, while other languages use other forms of program specification such as the declarative form. The description of a language is usually split into the two components of syntax and semantics. Some languages are defined by a specification document, while other languages have a dominant implementation that is treated as a reference. Some languages have both, with the language defined by a standard and extensions taken from the dominant implementation being common.

A programming language is a notation for writing programs, which are specifications of a computation or algorithm. Some, but not all, authors restrict the term "programming language" to those languages that can express all possible algorithms. For example, PostScript programs are created by another program to control a computer printer or display. More generally, a language may describe computation on some, possibly abstract, machine. It is generally accepted that a complete specification for a programming language includes a description, possibly idealized, of a machine or processor for that language. In most practical contexts, a programming language involves a computer; consequently, programming languages are usually defined and studied this way. Programming languages usually contain abstractions for defining and manipulating data structures or controlling the flow of execution.

The theory of computation classifies languages by the computations they are capable of expressing; all Turing complete languages can implement the same set of algorithms. ANSI/ISO SQL-92 and Charity are examples of languages that are not Turing complete. Markup languages like XML, HTML, or troff, which define structured data, are not usually considered programming languages. Programming languages may, however, share the syntax with markup languages if a computational semantics is defined; XSLT, for example, is a Turing complete XML dialect. Moreover, LaTeX, which is mostly used for structuring documents, also contains a Turing complete subset. The term "computer language" is sometimes used interchangeably with "programming language".
4.
Syntax
–
In linguistics, syntax is the set of rules, principles, and processes that govern the structure of sentences in a given language, specifically word order. The term syntax is also used to refer to the study of such principles and processes. The goal of many syntacticians is to discover the syntactic rules common to all languages. In mathematics, syntax refers to the rules governing the notation of mathematical systems, such as formal languages used in logic. The word syntax comes from Ancient Greek σύνταξις "coordination", which consists of σύν syn, "together", and τάξις táxis, "an ordering". A basic feature of a language's syntax is the sequence in which the subject, verb, and object usually appear in sentences. Over 85% of languages usually place the subject first, either in the sequence SVO or the sequence SOV; the other possible sequences are VSO, VOS, OVS, and OSV, the last three of which are rare.

In the West, the school of thought that came to be known as traditional grammar began with the work of Dionysius Thrax. For centuries, work in syntax was dominated by a framework known as grammaire générale. This system took as its premise the assumption that language is a direct reflection of thought processes and that there is therefore a single, most natural way to express a thought. It became apparent, however, that there was no such thing as the most natural way to express a thought. The Port-Royal grammar modeled the study of syntax upon that of logic: syntactic categories were identified with logical ones, and all sentences were analyzed in terms of Subject – Copula – Predicate. Initially, this view was adopted even by the early comparative linguists such as Franz Bopp.

The central role of syntax within theoretical linguistics became clear only in the 20th century, and there are a number of theoretical approaches to the discipline of syntax. One school of thought, founded in the works of Derek Bickerton, sees syntax as a branch of biology; other linguists take a more Platonistic view, since they regard syntax to be the study of an abstract formal system. Yet others consider syntax a taxonomical device to reach broad generalizations across languages. The hypothesis of generative grammar is that language is a structure of the human mind. The goal of generative grammar is to make a complete model of this inner language. This model could be used to describe all human language and to predict the grammaticality of any given utterance. This approach to language was pioneered by Noam Chomsky. Most generative theories assume that syntax is based upon the constituent structure of sentences. Generative grammars are among the theories that focus primarily on the form of a sentence. In categorial grammar, for example, the syntactic category of an intransitive verb is a complex category notated as NP\S instead of V; NP\S is read as a category that searches to the left for an NP and outputs a sentence. The category of a transitive verb is defined as an element that requires two NPs to form a sentence.
5.
Proof of concept
–
A proof of concept is usually small and may or may not be complete. The appearance of the term in news archives suggests it might have been in use as long ago as 1967. The term appeared in a 1969 hearing of the Committee on Science and Astronautics, and the column also provided definitions for the related but distinct terms breadboard, prototype, engineering prototype, and brassboard.

Sky Captain and the World of Tomorrow, 300, and Sin City were all shot in front of a greenscreen, with almost all backgrounds added digitally; all three used proof-of-concept short films. In the case of Sin City, the short film became the prologue of the final film. Pixar sometimes creates short animated films that use a difficult or untested technique; their short film Geri's Game used new techniques for the animation of cloth. In engineering and technology, a rough prototype of a new idea is often constructed as a proof of concept. For example, a proof of concept of an electrical device may be constructed using a breadboard. A patent application often requires a demonstration of functionality prior to being filed. Some universities have proof of concept centers to fill the funding gap for seed-stage investing and to accelerate the commercialization of university innovations. Proof of concept centers provide seed funding to novel, early-stage research that most often would not be funded by any other conventional source.

In the field of business development and sales, a vendor may allow a prospective customer to trial a product. In these cases, the proof of concept may mean the use of specialized sales engineers to ensure that the vendor makes a best-possible effort. WinZapper was a proof of concept that possessed the bare minimum of capabilities needed to remove an item from the Windows Security Log. Once a vendor is satisfied, a prototype is developed which is then used to seek funding or to demonstrate to prospective customers. A steel thread is a technical proof of concept that touches all of the technologies in a solution. By contrast, a proof of technology aims to determine the solution to some technical problem or to demonstrate that a given configuration can achieve a certain throughput. No business users need be involved in a proof of technology. A pilot project refers to an initial roll-out of a system into production, targeting a limited scope of the intended final solution. The scope may be limited by the number of users who can access the system, the business processes affected, and so on. The purpose of a pilot project is to test, often in a production environment. Although not suggested by natural language, and in contrast to usage in other areas, Proof of Principle and Proof of Concept are not synonymous; a third term, Proof of Mechanism, is closely related and is also described here.
6.
Brian Kernighan
–
Brian Wilson Kernighan is a Canadian computer scientist who worked at Bell Labs alongside Unix creators Ken Thompson and Dennis Ritchie and contributed to the development of Unix. He is also coauthor of the AWK and AMPL programming languages; the "K" of K&R C and the "K" in AWK both stand for Kernighan. Since 2000, Brian Kernighan has been a professor in the Computer Science Department of Princeton University. Born in Toronto, Kernighan attended the University of Toronto between 1960 and 1964, earning his Bachelor's degree in engineering physics. He received his PhD in electrical engineering from Princeton University in 1969 for research supervised by Peter Weiner. Kernighan has held a professorship in the department of computer science at Princeton since 2000; each fall he teaches a course called "Computers in Our World".

Kernighan's name became widely known through co-authorship of the first book on the C programming language with Dennis Ritchie. Kernighan affirmed that he had no part in the design of the C language. He authored many Unix programs, including ditroff. In collaboration with Shen Lin he devised well-known heuristics for two NP-complete optimization problems: graph partitioning and the travelling salesman problem. In a display of authorial equity, the former is usually called the Kernighan–Lin algorithm, while the latter is known as the Lin–Kernighan heuristic. Kernighan was an editor for Prentice Hall International. His Software Tools series spread the essence of C/Unix thinking with makeovers for BASIC, FORTRAN, and Pascal, and he has said that if stranded on an island with only one programming language, it would have to be C. Kernighan coined the term "Unix" and helped popularize Thompson's Unix philosophy. Kernighan is also known as a coiner of the expression "What You See Is All You Get", a sarcastic variant of the original "What You See Is What You Get". Kernighan's term is used to indicate that WYSIWYG systems might throw away information in a document that could be useful in other contexts. Kernighan's original 1978 implementation of "Hello, World!" was sold at The Algorithm Auction, the world's first auction of computer algorithms. In 1996, Kernighan taught CS50, the Harvard University introductory course in computer science. His students on CS50 included David J. Malan, who now runs the course.
7.
Computer
–
A computer is a device that can be instructed to carry out an arbitrary set of arithmetic or logical operations automatically. The ability of computers to follow a sequence of operations, called a program, makes them applicable to a wide range of tasks; such computers are used as control systems for a very wide variety of industrial and consumer devices. The Internet is run on computers, and it connects millions of other computers.

Since ancient times, simple manual devices like the abacus aided people in doing calculations. Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century. The first digital electronic calculating machines were developed during World War II. The speed, power, and versatility of computers has increased continuously and dramatically since then. Conventionally, a modern computer consists of at least one processing element, typically a central processing unit, and some form of memory. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices, output devices, and input/output devices that perform both functions. Peripheral devices allow information to be retrieved from an external source and the results of operations to be saved and retrieved.

The word "computer" originally referred to a person who carried out calculations or computations; the word continued with the same meaning until the middle of the 20th century. From the end of the 19th century the word began to take on its more familiar meaning, a machine that carries out computations. The Online Etymology Dictionary gives the first attested use of "computer" in the 1640s, meaning "one who calculates", and states that the use of the term to mean "calculating machine" is from 1897. The Online Etymology Dictionary indicates that the modern use of the term, to mean a programmable digital electronic computer, dates from 1945 under this name, and theoretically from 1937, as the Turing machine.

Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was probably a form of tally stick. Later record-keeping aids throughout the Fertile Crescent included calculi which represented counts of items, probably livestock or grains, sealed in hollow unbaked clay containers. The use of counting rods is one example. The abacus was initially used for arithmetic tasks; the Roman abacus was developed from devices used in Babylonia as early as 2400 BC. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money. The Antikythera mechanism is believed to be the earliest mechanical analog computer, according to Derek J. de Solla Price. It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to circa 100 BC.
8.
The C Programming Language (book)
–
The book was central to the development and popularization of the C programming language and is still widely read and used today. The first edition, published in 1978, was the first widely available book on the C programming language. C was created by Dennis Ritchie; Brian Kernighan wrote the first C tutorial. The authors came together to write the book in conjunction with the language's early development at AT&T Bell Labs. The version of C described in this book is referred to as K&R C. The second edition of the book has since been translated into over 20 languages. In 2012 an eBook version of the second edition was published in ePub and Mobi formats. ANSI C, first standardized in 1989, has undergone several revisions; however, no new edition of The C Programming Language has been issued to cover the more recent standards.

BYTE stated in August 1983 that the book "is the definitive work on the C language. Don't read any further until you have this book." Jerry Pournelle wrote in the magazine that year that the book "is still the standard." He continued, "You can learn the C language without getting Kernighan and Ritchie; you're also working too hard if you make it the only book on C that you buy." The C Programming Language has often been cited as a model for technical writing, with reviewers describing it as having clear presentation. Examples generally consist of programs of the type one is likely to encounter in daily usage of the language. Its authors said, "We have tried to retain the brevity of the first edition. C is not a big language, and it is not well served by a big book. We have improved the exposition of critical features, such as pointers, we have refined the original examples, and we have added new examples in several chapters." For instance, the treatment of complicated declarations is augmented by programs that convert declarations into words and vice versa. As before, all examples have been tested directly from the text, which is in machine-readable form.

The book introduced the "hello, world" program, which just prints out the text "hello, world", and numerous texts since then have followed that convention for introducing a programming language. Before the advent of ANSI C, the first edition of the book served as the de facto standard of the language for writers of C compilers. The second edition's reference manual is meant for easy comprehension by programmers, but not as a definition for compiler writers; that role properly belongs to the standard itself. Appendix B is a summary of the facilities of the standard library.
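The "hello, world" program mentioned above is short enough to show in full. The sketch below is written in present-day ANSI C rather than the pre-ANSI style of the 1978 first edition, but it prints the same text the book's example does:

```c
#include <stdio.h>

/* The book's famous first example: print "hello, world" and exit. */
int main(void)
{
    printf("hello, world\n");
    return 0;
}
```

Following the convention the book established, a program like this is typically the first complete example a reader compiles and runs when learning a new language.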
9.
Bell Labs
–
Nokia Bell Labs is an American research and scientific development company, owned by Finnish company Nokia. Its headquarters are located in Murray Hill, New Jersey, in addition to laboratories around the rest of the United States. The historic laboratory originated in the late 19th century as the Volta Laboratory. Bell Labs was also at one time a division of the American Telephone & Telegraph Company (AT&T), half-owned through its Western Electric manufacturing subsidiary. Eight Nobel Prizes have been awarded for work completed at Bell Laboratories.

In 1880, the French government awarded Alexander Graham Bell the Volta Prize of 50,000 francs, approximately US$10,000 at that time, for the invention of the telephone. Bell used the award to fund the Volta Laboratory in Washington, D.C., in collaboration with Sumner Tainter. The laboratory is also variously known as the Volta Bureau, the Bell Carriage House, the Bell Laboratory, and the Volta Laboratory. The laboratory focused on the analysis, recording, and transmission of sound. Bell used his considerable profits from the laboratory for further research and education to permit the "diffusion of knowledge relating to the deaf". This resulted in the founding of the Volta Bureau c. 1887, located at Bell's father's house at 1527 35th Street in Washington, D.C., where its carriage house became their headquarters in 1889. In 1893, Bell constructed a new building close by at 1537 35th Street, specifically to house the lab. The building was declared a National Historic Landmark in 1972.

In 1884, the American Bell Telephone Company created the Mechanical Department from the Electrical and Patent Department. The first president of research was Frank B. Jewett, who stayed there until 1940. Ownership of Bell Laboratories was evenly split between AT&T and the Western Electric Company. Its principal work was to plan, design, and support the equipment that Western Electric built for Bell System operating companies; this included everything from telephones and telephone exchange switches to transmission equipment. Bell Labs also carried out consulting work for the Bell Telephone Company. A few workers were assigned to basic research, and this attracted much attention, especially since they produced several Nobel Prize winners. Until the 1940s, the principal locations were in and around the Bell Labs Building in New York City. Of the later locations, Murray Hill and Crawford Hill remain in existence. The largest grouping of people in the company was in Illinois, at Naperville-Lisle, in the Chicago area, which had the largest concentration of employees prior to 2001. Since 2001, many of the locations have been scaled down or closed. The Holmdel site, a 1.9 million square foot structure set on 473 acres, was closed in 2007. The mirrored-glass building was designed by Eero Saarinen. In August 2013, Somerset Development bought the building, intending to redevelop it into a commercial and residential project. The prospects of success are clouded by the difficulty of readapting Saarinen's design and by the current glut of aging office space.
10.
Newline
–
In computing, a newline, also known as a line ending, end of line, or line break, is a special character or sequence of characters signifying the end of a line of text and the start of a new line. The actual codes representing a newline vary across operating systems, which can be a problem when exchanging text files between systems with different newline representations. The concepts of line feed (LF) and carriage return (CR) are closely associated, and can be considered either separately or together. In the physical media of typewriters and printers, two axes of motion, "down" and "across", are needed to create a new line on the page. Although the design of a machine must consider them separately, the logic of software can combine them together as one event. This is why a newline in character encoding can be defined as CR and LF combined into one (commonly called CR+LF or CRLF). Some character sets provide a separate newline character code. EBCDIC, for example, provides an NL character code in addition to the CR and LF codes. Unicode, in addition to providing the ASCII CR and LF control codes, also provides a "next line" control code, as well as control codes for "line separator" and "paragraph separator" markers.

Two ways to view newlines, both of which are self-consistent, are that newlines either separate lines or that they terminate lines. If a newline is considered a separator, there will be no newline after the last line of a file. Some programs have problems processing the last line of a file if it is not terminated by a newline; on the other hand, programs that expect newline to be used as a separator will interpret a final newline as starting a new line. Conversely, if a newline is considered a terminator, all lines including the last are expected to be terminated by a newline. In many applications a separate character called "manual line break" exists for forcing line breaks inside a single paragraph. The glyph for the control character for a hard return is usually a pilcrow (¶).

Some rare systems, such as QNX before version 4, used the ASCII RS character as the newline character. EBCDIC systems, mainly IBM mainframe systems including z/OS and i5/OS, use NL as the character combining the functions of line feed and carriage return. The equivalent Unicode character is called NEL. Note that EBCDIC also has control characters called CR and LF, but the numerical value of LF differs from the one used by ASCII. Additionally, some EBCDIC variants also use NL but assign a different numeric code to the character. Operating systems for the CDC 6000 series defined a newline as two or more zero-valued six-bit characters at the end of a 60-bit word; some configurations also defined a zero-valued character as a colon character. The ZX80 and ZX81, home computers from Sinclair Research Ltd, used a specific character set with the code NEWLINE as the newline character. OpenVMS uses a record-based file system, which stores text files as one record per line, so in most file formats no line terminators are actually stored; the records themselves could contain the same line terminator characters, which could either be considered a feature or a nuisance depending on the application.
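To show why the differing conventions matter in practice, here is a small sketch of our own (not from the article) in C. It reads lines from standard input and strips a trailing LF or CR+LF, so the same code copes with both Unix-style and Windows-style text files:

```c
#include <stdio.h>
#include <string.h>

/* Strip a trailing newline, whether it is "\n" (LF) or "\r\n" (CR+LF). */
static void chomp(char *line)
{
    size_t len = strlen(line);
    if (len > 0 && line[len - 1] == '\n')
        line[--len] = '\0';            /* drop the LF */
    if (len > 0 && line[len - 1] == '\r')
        line[len - 1] = '\0';          /* drop a CR that preceded it, or a bare trailing CR */
}

int main(void)
{
    char buf[1024];
    while (fgets(buf, sizeof buf, stdin) != NULL) {
        chomp(buf);
        printf("[%s]\n", buf);         /* brackets make the stripped line visible */
    }
    return 0;
}
```

A filter written this way treats the newline as a terminator it can safely discard, which sidesteps most of the separator-versus-terminator ambiguity described above.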
11.
ASCII
–
ASCII, abbreviated from American Standard Code for Information Interchange, is a character encoding standard. ASCII codes represent text in computers, telecommunications equipment, and other devices. Most modern character-encoding schemes are based on ASCII, although they support many additional characters. ASCII was developed from telegraph code, and its first commercial use was as a seven-bit teleprinter code promoted by Bell data services. Work on the ASCII standard began on October 6, 1960. The first edition of the standard was published in 1963, underwent a major revision during 1967, and experienced its most recent update during 1986. Compared to earlier telegraph codes, the proposed Bell code and ASCII were both ordered for more convenient sorting of lists, and added features for devices other than teleprinters.

Originally based on the English alphabet, ASCII encodes 128 specified characters into seven-bit integers as shown by the ASCII chart above. The characters encoded are the numbers 0 to 9, lowercase letters a to z, uppercase letters A to Z, basic punctuation symbols, and control codes that originated with Teletype machines. For example, lowercase j would become binary 1101010 and decimal 106. ASCII includes definitions for 128 characters: 33 are non-printing control characters that affect how text and space are processed, and 95 are printable characters, including the space. The IANA encourages use of the name US-ASCII for Internet uses of ASCII.

The ASA became the United States of America Standards Institute and ultimately the American National Standards Institute. There was some debate at the time whether there should be more control characters rather than the lowercase alphabet. The X3.2.4 task group voted its approval for the change to ASCII at its May 1963 meeting. The X3 committee made other changes, including other new characters, renaming some control characters, and moving or removing others. ASCII was subsequently updated as USAS X3.4-1967, then USAS X3.4-1968, and ANSI X3.4-1977. The committee also proposed a 9-track standard for magnetic tape, and attempted to deal with some punched card formats. The X3.2 subcommittee designed ASCII based on the earlier teleprinter encoding systems. Like other character encodings, ASCII specifies a correspondence between digital bit patterns and character symbols. This allows digital devices to communicate with each other and to process and store character-oriented information. Before ASCII was developed, the encodings in use included 26 alphabetic characters, 10 numerical digits, and a small number of special symbols. These codes, like ITA2, were in turn based on the 5-bit telegraph code Émile Baudot invented in 1870 and patented in 1874.

The committee debated the possibility of a shift function, which would allow more than 64 codes to be represented by a six-bit code. In a shifted code, some character codes determine choices between options for the following character codes. It allows compact encoding, but is less reliable for data transmission. The standards committee decided against shifting, and so ASCII required at least a seven-bit code. The committee considered an eight-bit code, since eight bits would allow two four-bit patterns to efficiently encode two digits with binary-coded decimal.
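The correspondence between characters and seven-bit integers is easy to inspect from a program. The following is our own minimal sketch in C (not part of the article); on an ASCII system it prints the code of lowercase j in several bases, matching the value 106 (binary 1101010) quoted above:

```c
#include <stdio.h>

int main(void)
{
    char c = 'j';
    /* On an ASCII system, 'j' is code point 106 (binary 1101010). */
    printf("'%c' = %d decimal, %o octal, 0x%X hex\n", c, c, c, c);
    return 0;
}
```

Expected output on an ASCII machine: 'j' = 106 decimal, 152 octal, 0x6A hex.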
12.
Sun Microsystems
–
Sun contributed significantly to the evolution of several key computing technologies, among them Unix, RISC processors, thin client computing, and virtualized computing. Sun was founded on February 24, 1982. At its height, the Sun headquarters were in Santa Clara, California, on the former west campus of the Agnews Developmental Center. On January 27, 2010, Oracle Corporation acquired Sun for US$7.4 billion. Sun's other technologies include the Java platform, MySQL, and NFS. Sun was a proponent of open systems in general and Unix in particular.

The initial design for what became Sun's first Unix workstation, the Sun-1, was conceived by Andy Bechtolsheim when he was a graduate student at Stanford University in Palo Alto, California. Bechtolsheim originally designed the SUN workstation for the Stanford University Network communications project as a personal CAD workstation. It was designed around the Motorola 68000 processor with an advanced memory management unit to support the Unix operating system with virtual memory support. He built the first ones from spare parts obtained from Stanford's Department of Computer Science. On February 24, 1982, Vinod Khosla, Andy Bechtolsheim, and Scott McNealy, all Stanford graduate students, founded Sun Microsystems. Bill Joy of Berkeley, a primary developer of the Berkeley Software Distribution (BSD), joined soon after. The Sun name is derived from the initials of the Stanford University Network. Sun was profitable from its first quarter in July 1982. By 1983 Sun was known for producing 68000-based systems with high-quality graphics that were the only computers other than DEC's VAX to run 4.2BSD. It licensed the computer design to other manufacturers, which typically used it to build Multibus-based systems running Unix from UniSoft.

Sun's initial public offering was in 1986 under the stock symbol SUNW. The symbol was changed in 2007 to JAVA; Sun stated that the brand awareness associated with its Java platform better represented the company's current strategy. Sun's logo, which features four interleaved copies of the word "sun" in the form of a rotationally symmetric ambigram, was designed by professor Vaughan Pratt. The initial version of the logo was orange and had the sides oriented horizontally and vertically, but it was rotated to stand on one corner and re-colored purple.

In the dot-com bubble, Sun began making more money. It also began spending more, hiring workers and building itself out. Some of this was because of genuine demand, but much was from web start-up companies anticipating business that would never happen. Sales in Sun's important hardware division went into free-fall as customers closed shop. Several quarters of steep losses led to executive departures, rounds of layoffs, and other cost cutting. In December 2001, the stock fell to the 1998, pre-bubble level of about $100. But it kept falling, faster than many other tech companies; a year later it had dipped below $10 but bounced back to $20.
13.
Scalable Vector Graphics
–
Scalable Vector Graphics (SVG) is an XML-based vector image format for two-dimensional graphics with support for interactivity and animation. The SVG specification is an open standard developed by the World Wide Web Consortium (W3C) since 1999. SVG images and their behaviors are defined in XML text files. This means that they can be searched, indexed, scripted, and compressed. As XML files, SVG images can be created and edited with any text editor. All major modern web browsers, including Mozilla Firefox, Internet Explorer, Google Chrome, Opera, Safari, and Microsoft Edge, have SVG rendering support. SVG has been in development within the World Wide Web Consortium since 1999; the early SVG Working Group decided not to develop any of the commercial submissions, but to create a new markup language that was informed by, but not really based on, any of them.

SVG allows three types of graphic objects: vector graphic shapes such as paths and outlines consisting of straight lines and curves, bitmap images, and text. Graphical objects can be grouped, styled, transformed, and composited into previously rendered objects. The feature set includes nested transformations, clipping paths, alpha masks, filter effects, and template objects. SVG drawings can be interactive and can include animation, defined in the SVG XML elements or via scripting that accesses the SVG Document Object Model (DOM). SVG uses CSS for styling and JavaScript for scripting. Text, including internationalization and localization, appearing in plain text within the SVG DOM enhances the accessibility of SVG graphics.

The SVG specification was updated to version 1.1 in 2011. There are two Mobile SVG Profiles, SVG Tiny and SVG Basic, meant for mobile devices with reduced computational and display capabilities. Scalable Vector Graphics 2 became a W3C Candidate Recommendation on 15 September 2016; SVG 2 incorporates several new features in addition to those of SVG 1.1. Though the SVG specification primarily focuses on a vector graphics markup language, its design includes the basic capabilities of a page description language like Adobe's PDF. It contains provisions for rich graphics and is compatible with CSS for styling purposes. SVG has the information needed to place each glyph and image in a chosen location on a printed page. A print-specialized subset of SVG is currently a W3C Working Draft. SVG drawings can be dynamic and interactive. Time-based modifications to the elements can be described in SMIL, or can be programmed in a scripting language; the W3C explicitly recommends SMIL as the standard for animation in SVG. A rich set of event handlers such as onmouseover and onclick can be assigned to any SVG graphical object.

SVG images, being XML, contain many repeated fragments of text, so they are well suited for lossless data compression algorithms. When an SVG image has been compressed with the industry-standard gzip algorithm, it is referred to as an SVGZ image. Conforming SVG 1.1 viewers will display compressed images. An SVGZ file is typically 20 to 50 percent of the original size. W3C provides SVGZ files to test for conformance. SVG 1.0 became a W3C Recommendation on 4 September 2001, and SVG 1.1 became a W3C Recommendation on 14 January 2003.
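Because an SVG image is ordinary XML text, any program that writes text can produce one. The sketch below is our own illustration (not from the article), kept in C for consistency with the other examples here: it emits a tiny standalone SVG document containing a single circle. The markup shows only a minimal subset of the format.

```c
#include <stdio.h>

/* Write a minimal SVG document (plain XML text) to standard output. */
int main(void)
{
    printf("<svg xmlns=\"http://www.w3.org/2000/svg\" "
           "width=\"100\" height=\"100\">\n"
           "  <circle cx=\"50\" cy=\"50\" r=\"40\" "
           "fill=\"none\" stroke=\"black\" stroke-width=\"2\"/>\n"
           "</svg>\n");
    return 0;
}
```

Redirecting the program's output to a file such as circle.svg and opening it in any of the browsers listed above displays the circle; compressing the same file with gzip would yield an SVGZ image as described in the paragraph on compression.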
14.
Perl
–
Perl is a family of high-level, general-purpose, interpreted, dynamic programming languages. The languages in this family include Perl 5 and Perl 6. Though Perl is not officially an acronym, there are various backronyms in use, the best-known being Practical Extraction and Reporting Language. Perl was originally developed by Larry Wall in 1987 as a general-purpose Unix scripting language to make report processing easier; since then, it has undergone many changes and revisions. Perl 6, which began as a redesign of Perl 5 in 2000, eventually evolved into a separate language. Both languages continue to be developed independently by different development teams and liberally borrow ideas from one another.

The Perl languages borrow features from other programming languages including C, shell script, and AWK. They provide powerful text processing facilities without the arbitrary data-length limits of many contemporary Unix command-line tools, facilitating easy manipulation of text files. Perl 5 gained widespread popularity in the late 1990s as a CGI scripting language, in part due to its unsurpassed regular expression and string-parsing abilities. In addition to CGI, Perl 5 is used for system administration, network programming, finance, and bioinformatics. It has been nicknamed "the Swiss Army chainsaw of scripting languages" because of its flexibility and power, and in 1998 it was also referred to as the "duct tape that holds the Internet together", in reference to both its ubiquitous use as a glue language and its perceived inelegance.

Larry Wall began work on Perl in 1987, while working as a programmer at Unisys. The language expanded rapidly over the next few years. Perl 2, released in 1988, featured a better regular expression engine. Perl 3, released in 1989, added support for binary data streams. Originally, the documentation for Perl was a single man page. In 1991, Programming Perl, known to many Perl programmers as the "Camel Book" because of its cover, was published and became the de facto reference for the language. At the same time, the Perl version number was bumped to 4, not to mark a major change in the language but to identify the version that was documented by the book. Perl 4 went through a series of releases, culminating in Perl 4.036 in 1993. At that point, Wall abandoned Perl 4 to begin work on Perl 5. Initial design of Perl 5 continued into 1994. The perl5-porters mailing list was established in May 1994 to coordinate work on porting Perl 5 to different platforms; it remains the primary forum for development, maintenance, and porting of Perl 5. Perl 5.000 was released on October 17, 1994. It was a nearly complete rewrite of the interpreter, and it added many new features to the language, including objects, references, lexical variables, and modules.
15.
Python (programming language)
–
Python is a widely used high-level programming language for general-purpose programming, created by Guido van Rossum and first released in 1991. The language provides constructs intended to enable writing clear programs on both a small and large scale, and it has a large and comprehensive standard library. Python interpreters are available for many operating systems, allowing Python code to run on a wide variety of systems. CPython, the reference implementation of Python, is open source software and has a community-based development model. CPython is managed by the non-profit Python Software Foundation.

About the origin of Python, Van Rossum wrote in 1996: "Over six years ago, in December 1989, I was looking for a 'hobby' programming project that would keep me occupied during the week around Christmas. My office would be closed, but I had a home computer, and not much else on my hands. I decided to write an interpreter for the new scripting language I had been thinking about lately. I chose Python as a working title for the project, being in a slightly irreverent mood."

Python 2.0 was released on 16 October 2000 and had major new features, including a cycle-detecting garbage collector and support for Unicode. With this release the development process was changed and became more transparent. Python 3.0, a major, backwards-incompatible release, was released on 3 December 2008 after a long period of testing. Many of its features have been backported to the backwards-compatible Python 2.6.x and 2.7.x version series. The End Of Life date for Python 2.7 was initially set at 2015, then postponed to 2020.

Many other paradigms are supported via extensions, including design by contract and logic programming. Python uses dynamic typing and a mix of reference counting and a cycle-detecting garbage collector for memory management. An important feature of Python is dynamic name resolution, which binds method and variable names during program execution. The design of Python offers some support for functional programming in the Lisp tradition. The language has map, reduce, and filter functions, as well as list comprehensions, dictionaries, and sets; the standard library has two modules that implement functional tools borrowed from Haskell and Standard ML. Python can also be embedded in existing applications that need a programmable interface. While offering choice in coding methodology, the Python philosophy rejects exuberant syntax, such as in Perl, in favor of a sparser, less-cluttered grammar. As Alex Martelli put it, "To describe something as clever is not considered a compliment in the Python culture." Python's philosophy rejects the Perl "there is more than one way to do it" approach to language design in favor of "there should be one—and preferably only one—obvious way to do it". Python's developers strive to avoid premature optimization, and moreover reject patches to non-critical parts of CPython that would offer an increase in speed at the cost of clarity.
16.
Ruby (programming language)
–
Ruby is a dynamic, reflective, object-oriented, general-purpose programming language. It was designed and developed in the mid-1990s by Yukihiro "Matz" Matsumoto in Japan. According to its creator, Ruby was influenced by Perl, Smalltalk, Eiffel, Ada, and Lisp. It supports multiple programming paradigms, including functional, object-oriented, and imperative. It also has a dynamic type system and automatic memory management.

Ruby was conceived on February 24, 1993. Matsumoto has described his motivation: "I knew Perl, but I didn't like it really, because it had the smell of a toy language. The object-oriented language Python seemed very promising, but I didn't like it, because I didn't think it was a true object-oriented language; OO features appeared to be add-on to the language. As a language maniac and OO fan for 15 years, I really wanted a genuine object-oriented, easy-to-use scripting language. I looked for one, but couldn't find one."

The name "Ruby" originated during a chat session between Matsumoto and Keiju Ishitsuka on February 24, 1993, before any code had been written for the language. Initially two names were proposed, Coral and Ruby; Matsumoto chose the latter in a later e-mail to Ishitsuka. Matsumoto later noted a factor in choosing the name Ruby: it was the birthstone of one of his colleagues. The first public release, Ruby 0.95, was announced on Japanese domestic newsgroups on December 21, 1995. Subsequently, three more versions of Ruby were released in two days. The release coincided with the launch of the Japanese-language ruby-list mailing list. In the same year, Matsumoto was hired by netlab.jp to work on Ruby as a full-time developer.

In 1998, the Ruby Application Archive was launched by Matsumoto. In 1999, the first English-language mailing list, ruby-talk, began, which signaled a growing interest in the language outside Japan. In this same year, Matsumoto and Keiju Ishitsuka wrote the first book on Ruby, The Object-oriented Scripting Language Ruby. It would be followed in the early 2000s by around 20 books on Ruby published in Japanese. By 2000, Ruby was more popular than Python in Japan. In September 2000, the first English-language book, Programming Ruby, was printed, which was later freely released to the public, further widening the adoption of Ruby amongst English speakers. In early 2002, the English-language ruby-talk mailing list was receiving more messages than the Japanese-language ruby-list. Ruby 1.8 was initially released in August 2003, was stable for a long time, and was retired in June 2013. Although deprecated, there is still code based on it. Ruby 1.8 is only partially compatible with Ruby 1.9. Ruby 1.8 has been the subject of several industry standards.
17.
Assembly language
–
Each assembly language is specific to a particular computer architecture. In contrast, most high-level programming languages are generally portable across multiple architectures. Assembly language may also be called symbolic machine code. Assembly language is converted into machine code by a utility program referred to as an assembler. The conversion process is referred to as assembly, or assembling the source code; assembly time is the computational step where an assembler is run.

Assembly language uses a mnemonic to represent each low-level machine instruction or opcode, and typically also each architectural register, flag, and so on. Depending on the architecture, these elements may also be combined for specific instructions or addressing modes using offsets or other data as well as fixed addresses. Many assemblers offer additional mechanisms to facilitate program development, to control the assembly process, and to aid debugging. A macro assembler includes a macro facility so that assembly language text can be represented by a name. A cross assembler is an assembler that is run on a computer or operating system of a different type from the system on which the resulting code is to run; cross-assembling facilitates the development of programs for systems that do not have the resources to support software development. A microassembler is a program that helps prepare a microprogram, called firmware, to control the low-level operation of a computer. A meta-assembler is a term used in some circles for a program that accepts the syntactic and semantic description of an assembly language and generates an assembler for that language.

An assembler program creates object code by translating combinations of mnemonics and syntax for operations and addressing modes into their numerical equivalents. This representation typically includes an operation code as well as other control bits and data. The assembler also calculates constant expressions and resolves symbolic names for memory locations; the use of symbolic references is a key feature of assemblers, saving tedious calculations and manual address updates after program modifications. Most assemblers also include facilities for performing textual substitution, e.g. to generate common short sequences of instructions as inline code instead of called subroutines. Some assemblers may also be able to perform some simple types of instruction set-specific optimizations. One concrete example of this may be the ubiquitous x86 assemblers from various vendors; most of them are able to perform jump-instruction replacements in any number of passes, on request.

Like early programming languages such as Fortran, Algol, Cobol, and Lisp, assemblers have been available since the 1950s; however, assemblers came first, as they are far simpler to write than compilers for high-level languages. There may be several assemblers with different syntax for a particular CPU or instruction set architecture. Despite different appearances, the different syntactic forms generally generate the same numeric machine code (see further below). A single assembler may also have different modes in order to support variations in syntactic forms as well as their exact semantic interpretations. There are two types of assemblers, distinguished by how many passes through the source are needed to produce the executable program.
18.
PlayStation Portable homebrew
–
PlayStation Portable homebrew refers to the process of using exploits and hacks to execute unsigned code on the PlayStation Portable (PSP). Soon after the PSP was released, hackers began to discover exploits in the PSP that could be used to run unsigned code on the device. Sony released version 1.51 of the PSP firmware in May 2005 to plug the holes that hackers were using to gain access to the device. On 15 June 2005 the hackers distributed the code of the PSP on the internet. BusinessWeek dubbed this the carrot-and-stick approach. In August 2005 Sony released version 2.0 of the firmware, which included a web browser, file compatibility updates, and other features. Hackers and other homebrew enthusiasts then encountered the first trojan for the PSP: users attempting to downgrade their PSP using this software instead found that it was rendered inoperable, as the software deleted mandatory system files. Over the course of 2005 Sony released six different versions of the firmware. This reportedly caused more buzz in the community than any recent official offerings for the device.

Dark AleX is a Spanish programmer who writes homebrew applications for the PlayStation Portable; Dark AleX, as well as other variations of the name, is a pseudonym under which he works. One of the drawbacks of downgrading the PSP is that new media may require the presence of a newer firmware edition. Dark_AleX released a custom firmware called Dark Alex's Open Edition firmware, or Custom Firmware, which opens up the firmware. Sony quickly patched the firmware again, continuing the cat-and-mouse game with the hackers and users. In 2006 Sony released six updates to the firmware, and in 2007 they released another six updates. In July 2007 Dark_AleX officially stopped his work on the PSP. Some people even suggested that Dark_AleX was paid by Sony not to release any more custom firmware, but Sony denied this.

Half Byte Loader (HBL) is an open source project that aims at loading homebrew on the PlayStation Portable handheld console through user-mode exploits. It does not provide any mechanism for loading official games or ISO images. HBL was built from scratch to be easily portable to any user-mode exploit. The project was created and started by m0skit0 and ab5000, and it is currently maintained by wololo. HBL was created initially for the Medal of Honor Heroes exploit. An alpha version was released as open source by m0skit0 and ab5000 in November 2009, which ran very simple homebrews. When the Patapon 2 demo exploit was found and leaked, wololo joined the project. The AdvancedPSP forums, which hosted the project, were shut down by the hosting provider, and the project moved to the wololo/talk forums. Wololo also created a new public SVN repository for HBL at Google Code. Other PSP hackers, such as Davee and neur0n, joined in to help the development of this port.
19.
Lisp (programming language)
–
Lisp is a family of computer programming languages with a long history and a distinctive, fully parenthesized prefix notation. Originally specified in 1958, Lisp is the second-oldest high-level programming language in use today; only Fortran is older, by one year. Lisp has changed since its early days, and many dialects have existed over its history. Today, the best-known general-purpose Lisp dialects include Common Lisp and Scheme. Lisp was originally created as a practical mathematical notation for computer programs, influenced by the notation of Alonzo Church's lambda calculus. It quickly became the favored programming language for artificial intelligence research.

The name LISP derives from "LISt Processor". Linked lists are one of Lisp's major data structures, and Lisp source code is itself made of lists. Thus, Lisp programs can manipulate source code as a data structure. The interchangeability of code and data gives Lisp its instantly recognizable syntax: all program code is written as s-expressions, or parenthesized lists. Lisp was invented by John McCarthy in 1958 while he was at the Massachusetts Institute of Technology (MIT). McCarthy published its design in a paper in Communications of the ACM in 1960, entitled "Recursive Functions of Symbolic Expressions and Their Computation by Machine". He showed that with a few simple operators and a notation for functions, one can build a Turing-complete language for algorithms. Information Processing Language was the first AI language, from 1955 or 1956, and already included many of the concepts, such as list-processing and recursion, that came to be used in Lisp.

McCarthy's original notation used bracketed M-expressions that would be translated into S-expressions; as an example, the M-expression car[cons[A,B]] is equivalent to the S-expression (car (cons A B)). Once Lisp was implemented, programmers rapidly chose to use S-expressions, and M-expressions were abandoned. M-expressions surfaced again with short-lived attempts such as MLISP by Horace Enea. Lisp was first implemented by Steve Russell on an IBM 704 computer. Russell had read McCarthy's paper and realized that the Lisp eval function could be implemented in machine code. The result was a working Lisp interpreter which could be used to run Lisp programs, or more properly, evaluate Lisp expressions. Two assembly language macros for the IBM 704 became the primitive operations for decomposing lists: car (Contents of the Address part of Register) and cdr (Contents of the Decrement part of Register). From the context, it is clear that the term "register" is used here to mean memory register. Lisp dialects still use car and cdr for the operations that return the first item in a list and the rest of the list, respectively. The first complete Lisp compiler, written in Lisp, was implemented in 1962 by Tim Hart and Mike Levin at MIT. This compiler introduced the Lisp model of incremental compilation, in which compiled and interpreted functions can intermix freely. The language used in Hart and Levin's memo is much closer to modern Lisp style than McCarthy's earlier code. Lisp was a difficult system to implement with the compiler techniques and stock hardware of the 1970s.
20.
Haskell (programming language)
–
Haskell /ˈhæskəl/ is a standardized, general-purpose purely functional programming language, with non-strict semantics and strong static typing. It is named after logician Haskell Curry. The latest standard of Haskell is Haskell 2010; as of May 2016, a group is working on the next version. Haskell features a type system with type inference and lazy evaluation. Type classes first appeared in the Haskell programming language. Its main implementation is the Glasgow Haskell Compiler (GHC). Haskell is based on the semantics, but not the syntax, of the language Miranda. Haskell is used widely in academia and is also used in industry.

Following the release of Miranda by Research Software Ltd in 1985, interest in lazy functional languages grew; by 1987, more than a dozen non-strict, purely functional programming languages existed. Of these, Miranda was used most widely, but it was proprietary software. A committee was formed whose purpose was to consolidate the existing functional languages into a common one that would serve as a basis for future research in functional-language design. The first version of Haskell was defined in 1990, and the committee's efforts resulted in a series of language definitions. The committee expressly welcomed creating extensions and variants of Haskell 98 via adding and incorporating experimental features. In February 1999, the Haskell 98 language standard was originally published as The Haskell 98 Report. In January 2003, a revised version was published as Haskell 98 Language and Libraries. The language continues to evolve rapidly, with the Glasgow Haskell Compiler implementation representing the current de facto standard. In early 2006, the process of defining a successor to the Haskell 98 standard, informally named Haskell Prime, began. This was intended to be an ongoing incremental process to revise the language definition, producing a new revision up to once per year. The first revision, named Haskell 2010, was announced in November 2009. It introduces the Language-Pragma-Syntax-Extension, which allows for code designating a Haskell source as Haskell 2010 or requiring certain extensions to the Haskell language.

Haskell features lazy evaluation, pattern matching, list comprehension, and type classes. It is a purely functional language, which means that in general, functions in Haskell have no side effects. A distinct construct exists to represent side effects, orthogonal to the type of functions: a pure function may return a side effect which is subsequently executed, modeling the impure functions of other languages. Haskell has a strong, static type system based on Hindley–Milner type inference. Haskell's principal innovation in this area is to add type classes, originally conceived as a principled way to add overloading to the language, but since finding many more uses. The construct which represents side effects is an example of a monad. Monads are a general framework which can model different kinds of computation, including error handling, nondeterminism, parsing, and software transactional memory. Monads are defined as ordinary datatypes, but Haskell provides some syntactic sugar for their use. Haskell has an open, published specification, and multiple implementations exist.
21.
Factorial
–
In mathematics, the factorial of a non-negative integer n, denoted by n!, is the product of all positive integers less than or equal to n; for example, 5! = 5 × 4 × 3 × 2 × 1 = 120. The value of 0! is 1, according to the convention for an empty product. The factorial operation is encountered in many areas of mathematics, notably in combinatorics and algebra. Its most basic occurrence is the fact that there are n! ways to arrange n distinct objects into a sequence. This fact was known at least as early as the 12th century. Fabian Stedman, in 1677, described factorials as applied to change ringing; after describing a recursive approach, Stedman gives a statement of a factorial, beginning "Now the nature of these methods is such…". The factorial function is formally defined by the product n! = 1 × 2 × ⋯ × n, or equivalently by the recurrence n! = 1 if n = 0 and n! = n × (n − 1)! if n > 0. All of the above definitions incorporate the instance 0! = 1, in the first case by the convention that the product of no numbers at all is 1. This is convenient for several reasons: there is exactly one permutation of zero objects; the recurrence relation (n + 1)! = n! × (n + 1), valid for n > 0, extends to n = 0; it allows for the expression of many formulae, such as the exponential function, as a power series; and it makes many identities in combinatorics valid for all applicable sizes. For example, the number of ways to choose 0 elements from the empty set is 0!/(0! × 0!) = 1, and more generally, the number of ways to choose all n elements among a set of n is n!/(n! × 0!) = 1. The factorial function can also be defined for non-integer values using more advanced mathematics, detailed in the section below; this more generalized definition is used by advanced calculators and mathematical software such as Maple or Mathematica. Although the factorial function has its roots in combinatorics, formulas involving factorials occur in many areas of mathematics. There are n! different ways of arranging n distinct objects into a sequence, and factorials often appear in the denominator of a formula to account for the fact that ordering is to be ignored. A classical example is counting k-combinations from a set with n elements. One can obtain such a combination by choosing a k-permutation (successively selecting and removing an element of the set, k times), for a total of n × (n − 1) × ⋯ × (n − k + 1) possibilities. This, however, produces the k-combinations in a particular order that one wishes to ignore; since each k-combination is obtained in k! different ways, the number of k-combinations is this product divided by k!. This number is known as the binomial coefficient, because it is also the coefficient of X^k in (1 + X)^n.
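As a worked illustration of the definitions above, here is a small Haskell sketch computing n! both as a product and by the recurrence, together with the binomial coefficient n!/(k! × (n − k)!). The names factorial, factorial' and choose are chosen for this example.

```haskell
-- Product form: n! = 1 * 2 * ... * n (the empty product gives 0! = 1).
factorial :: Integer -> Integer
factorial n = product [1..n]

-- Recursive form: 0! = 1 and n! = n * (n - 1)! for n > 0.
factorial' :: Integer -> Integer
factorial' 0 = 1
factorial' n = n * factorial' (n - 1)

-- Number of k-combinations of an n-element set: n! / (k! * (n - k)!).
choose :: Integer -> Integer -> Integer
choose n k = factorial n `div` (factorial k * factorial (n - k))

main :: IO ()
main = do
  print (factorial 5)    -- 120
  print (factorial' 0)   -- 1
  print (5 `choose` 2)   -- 10
```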
22.
VHDL
–
VHDL is a hardware description language used in electronic design automation to describe digital and mixed-signal systems such as field-programmable gate arrays and integrated circuits. VHDL can also be used as a general-purpose programming language. VHDL was originally developed at the behest of the U.S. Department of Defense in order to document the behavior of the ASICs that supplier companies were including in equipment. The idea of being able to simulate the ASICs from the information in this documentation was so attractive that logic simulators were developed that could read the VHDL files. The next step was the development of logic synthesis tools that read the VHDL and produce an implementation of the circuit. A problem not solved by the initial standard, however, was multi-valued logic, where a signal's drive strength and unknown values are also considered. This required IEEE standard 1164, which defined the 9-value logic types, including the scalar std_logic. Minor changes in the standard added the idea of protected types and removed some restrictions from port mapping rules. In addition to IEEE standard 1164, several child standards were introduced to extend the functionality of the language: IEEE standard 1076.2 added better handling of real and complex data types; IEEE standard 1076.3 introduced signed and unsigned types to facilitate arithmetical operations on vectors; and IEEE standard 1076.1 provided analog and mixed-signal circuit design extensions. Some other standards support wider use of VHDL, notably VITAL. In June 2006, the VHDL Technical Committee of Accellera approved the so-called Draft 3.0 of VHDL-2006. While maintaining full compatibility with older versions, this proposed standard provides numerous extensions that make writing and managing VHDL code easier; these changes should improve the quality of synthesizable VHDL code, make testbenches more flexible, and allow wider use of VHDL for system-level descriptions. In 2008, Accellera released VHDL 4.0 to the IEEE for balloting for inclusion in IEEE 1076-2008, and the VHDL standard IEEE 1076-2008 was published in January 2009. The IEEE Standard 1076 defines the VHSIC Hardware Description Language, or VHDL. The language has undergone numerous revisions and has a variety of sub-standards associated with it that augment or extend it in important ways. 1076 was, and continues to be, a milestone in the design of electronic systems. IEEE 1076-1987 was the first standardized revision of version 7.2 of the language from the United States Air Force. IEEE 1076-1993 brought significant improvements resulting from years of feedback and is probably the most widely used version with the greatest vendor tool support; the following revision introduced the use of protected types. IEEE 1076-2002 was a minor revision of 1076-2000 in which the rules with regard to buffer ports were relaxed. IEC 61691-1-1:2004 is the IEC adoption of IEEE 1076-2002. IEEE 1076-2008 is a major revision released on 2009-01-26; among other changes, this standard introduces the use of external names. Such a model is processed by a synthesis program only if it is part of the logic design.
23.
Embedded system
–
An embedded system is a computer system with a dedicated function within a larger mechanical or electrical system, often with real-time computing constraints. It is embedded as part of a complete device, often including hardware and mechanical parts. Embedded systems control many devices in common use today; ninety-eight percent of all microprocessors are manufactured as components of embedded systems. Examples of properties of typical embedded computers, when compared with general-purpose counterparts, are low power consumption, small size, rugged operating ranges, and low per-unit cost. This comes at the price of limited processing resources, which make embedded computers more difficult to program; for example, intelligent techniques can be designed to manage the power consumption of embedded systems. Modern embedded systems are often based on microcontrollers, but ordinary microprocessors are also common. In either case, the processor used may range from general-purpose types to those specialised in a certain class of computations; a common standard class of dedicated processors is the digital signal processor. Since the embedded system is dedicated to specific tasks, design engineers can optimize it to reduce the size and cost of the product and increase its reliability. Some embedded systems are mass-produced, benefiting from economies of scale. Complexity varies from low, with a single microcontroller chip, to very high, with multiple units, peripherals and networks mounted inside a large chassis or enclosure. One of the very first recognizably modern embedded systems was the Apollo Guidance Computer; an early mass-produced embedded system was the Autonetics D-17 guidance computer for the Minuteman missile, released in 1961. When the Minuteman II went into production in 1966, the D-17 was replaced with a new computer that was the first high-volume use of integrated circuits. Since these early applications in the 1960s, embedded systems have come down in price, and there has been a dramatic rise in processing power. An early microprocessor, for example the Intel 4004, was designed for calculators and other small systems, but still required external memory and support chips. By the early 1980s, memory, input and output system components had been integrated into the same chip as the processor, forming a microcontroller. Microcontrollers find applications where a general-purpose computer would be too costly. A comparatively low-cost microcontroller may be programmed to fulfill the same role as a large number of separate components.
24.
Microcontroller
–
A microcontroller is a small computer on a single integrated circuit; in modern terminology, it is a system on a chip, or SoC. A microcontroller contains one or more CPUs along with memory and programmable input/output peripherals. Program memory in the form of ferroelectric RAM, NOR flash or OTP ROM is also included on chip. Microcontrollers are designed for embedded applications, in contrast to the microprocessors used in personal computers or other general-purpose applications consisting of various discrete chips. Mixed-signal microcontrollers are common, integrating analog components needed to control non-digital electronic systems. Some microcontrollers may use four-bit words and operate at frequencies as low as 4 kHz for low power consumption. Other microcontrollers may serve performance-critical roles, where they may need to act more like a digital signal processor, with higher clock speeds. The first microprocessor was the 4-bit Intel 4004 released in 1971, followed by the Intel 8008; however, both processors required external chips to implement a working system, raising total system cost and making it impossible to economically computerize appliances. One book credits TI engineers Gary Boone and Michael Cochran with the creation of the first microcontroller in 1971. The result of their work was the TMS1000, which became available in 1974. It combined read-only memory, read/write memory, processor and clock on one chip and was targeted at embedded systems. Intel's 8048 likewise combined RAM and ROM on the same chip; this chip would find its way into one billion PC keyboards. At that time Intel's president, Luke J. Valenter, stated that the microcontroller was one of the most successful products in the company's history. Most microcontrollers at this time had concurrent variants. One had an erasable EPROM program memory, with a transparent quartz window in the lid of the package to allow it to be erased by exposure to ultraviolet light, often used for prototyping; the other was a one-time programmable (OTP) PROM variant. The PROM used the same type of memory as the EPROM but, lacking a window to expose it to ultraviolet light, could not be erased. The erasable versions required ceramic packages with quartz windows, making them more expensive than the OTP versions. Atmel later introduced the first microcontroller using Flash memory; other companies rapidly followed suit, with both memory types. Cost has plummeted over time, with the cheapest 8-bit microcontrollers being available for under 0.25 USD in quantity in 2009, and nowadays microcontrollers are cheap and readily available for hobbyists, with large online communities around certain processors. In the future, MRAM could potentially be used in microcontrollers, as it has infinite endurance. In 2002, about 55% of all CPUs sold in the world were 8-bit microcontrollers and microprocessors.
25.
Field-programmable gate array
–
A field-programmable gate array (FPGA) is an integrated circuit designed to be configured by a customer or a designer after manufacturing, hence the term "field-programmable". The FPGA configuration is generally specified using a hardware description language. Logic blocks can be configured to perform complex combinational functions, or merely simple logic gates like AND. In most FPGAs, logic blocks also include memory elements, which may be simple flip-flops or more complete blocks of memory. Contemporary field-programmable gate arrays have large resources of logic gates and RAM blocks to implement complex digital computations. As FPGA designs employ very fast I/Os and bidirectional data buses, it becomes a challenge to verify correct timing of valid data within setup time and hold time; floor planning enables resource allocation within FPGAs to meet these timing constraints. FPGAs can be used to implement any logical function that an ASIC could perform. Some FPGAs have analog features in addition to digital functions; fairly common are differential comparators on input pins designed to be connected to differential signaling channels. The FPGA industry sprouted from programmable read-only memory (PROM) and programmable logic devices (PLDs). PROMs and PLDs both had the option of being programmed in batches in a factory or in the field; however, programmable logic was hard-wired between logic gates. In the late 1980s, the Naval Surface Warfare Center funded an experiment proposed by Steve Casselman to develop a computer that would implement 600,000 reprogrammable gates; Casselman was successful, and a patent related to the system was issued in 1992. Some of the foundational concepts and technologies for programmable logic arrays, gates, and logic blocks predate the FPGA itself. Xilinx co-founders Ross Freeman and Bernard Vonderschmitt invented the first commercially viable field-programmable gate array in 1985, the XC2064. The XC2064 had programmable gates and programmable interconnects between gates, the beginnings of a new technology and market. The XC2064 had 64 configurable logic blocks, with two three-input lookup tables. More than 20 years later, Freeman was entered into the National Inventors Hall of Fame for his invention. Altera and Xilinx continued unchallenged and quickly grew from 1985 to the mid-1990s; by 1993, Actel was serving about 18 percent of the market, and by 2010, Altera, Actel and Xilinx together represented approximately 77 percent of the FPGA market. The 1990s were an explosive period of time for FPGAs, both in sophistication and the volume of production. In the early 1990s, FPGAs were primarily used in telecommunications; by the end of the decade, FPGAs found their way into consumer, automotive, and industrial applications. Some devices combine a conventional processor with programmable logic on a single chip; this work mirrors the architecture by Ron Perlof and Hana Potash of Burroughs Advanced Systems Group, which combined a reconfigurable CPU architecture on a chip called the SB24. That work was done in 1982. The Atmel FPSLIC is another such device, which uses an AVR processor in combination with Atmel's programmable logic architecture.
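To make the lookup-table idea concrete, here is a rough, vendor-neutral sketch in Haskell of a three-input LUT: the "configuration" is simply the eight-entry truth table of the desired Boolean function, and evaluation is an index into that table. The LUT3 type and helper names are invented for this illustration and do not model any particular FPGA family.

```haskell
import Data.Bits (shiftL, (.|.))

-- A 3-input LUT is just its 8-entry truth table, indexed by the input bits.
type LUT3 = [Bool]

-- "Configure" a LUT from any Boolean function of three inputs.
configure :: (Bool -> Bool -> Bool -> Bool) -> LUT3
configure f = [ f a b c | a <- [False, True], b <- [False, True], c <- [False, True] ]

-- Evaluate the configured LUT on concrete inputs by indexing the table.
evalLUT :: LUT3 -> Bool -> Bool -> Bool -> Bool
evalLUT lut a b c = lut !! idx
  where
    toBit x n = if x then 1 `shiftL` n else 0
    idx = toBit a 2 .|. toBit b 1 .|. toBit c 0

main :: IO ()
main = do
  let andGate  = configure (\a b c -> a && b && c)
      majority = configure (\a b c -> (a && b) || (a && c) || (b && c))
  print (evalLUT andGate True True True)     -- True
  print (evalLUT majority True False True)   -- True
```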
26.
Complex programmable logic device
–
A complex programmable logic device (CPLD) is a programmable logic device with complexity between that of PALs and FPGAs, and architectural features of both. The main building block of the CPLD is a macrocell, which contains logic implementing disjunctive normal form (sum-of-products) expressions. Some CPLD features are in common with PALs, such as non-volatile configuration memory: unlike many FPGAs, an external configuration ROM isn't required, and the device can operate as soon as power is applied. Routing constraints that tie logic blocks to external pins affect some older devices, but this is usually not a factor for larger CPLDs and newer CPLD product families. Other features are in common with FPGAs, such as the large number of gates available: CPLDs typically have the equivalent of thousands to tens of thousands of logic gates, allowing implementation of moderately complicated data processing devices, whereas PALs typically have a few hundred gate equivalents at most, while FPGAs typically range from tens of thousands to several million. A good example is where a CPLD is used to load configuration data for an FPGA from non-volatile memory. CPLDs were an evolutionary step up from the even smaller devices that preceded them, PLAs. These in turn were preceded by standard logic products, which offered no programmability and were used to build logic functions by physically wiring several standard logic chips together. The main distinction between FPGA and CPLD device architectures is that FPGAs are internally based on look-up tables, while CPLDs form the logic functions with a sea of gates (e.g., sum of products).
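As a toy illustration of the sum-of-products (disjunctive normal form) logic that a macrocell realizes, the following Haskell sketch evaluates a function stored as a list of product terms; the types and the XOR example are invented for this sketch and do not model any particular CPLD.

```haskell
-- A literal names an input and whether it appears non-inverted (True) or inverted.
type Literal       = (Int, Bool)
type Product       = [Literal]        -- AND of literals
type SumOfProducts = [Product]        -- OR of product terms

-- A term is satisfied when all its literals hold; the function is the OR of terms.
evalSOP :: SumOfProducts -> [Bool] -> Bool
evalSOP terms inputs = any (all literal) terms
  where literal (i, pos) = if pos then inputs !! i else not (inputs !! i)

-- Example: XOR of inputs 0 and 1 expressed as a*b' + a'*b.
xorSOP :: SumOfProducts
xorSOP = [ [(0, True), (1, False)], [(0, False), (1, True)] ]

main :: IO ()
main = mapM_ (print . evalSOP xorSOP)
             [[False,False],[False,True],[True,False],[True,True]]
-- prints False, True, True, False
```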
27.
Light-emitting diode
–
A light-emitting diode (LED) is a two-lead semiconductor light source. It is a p–n junction diode which emits light when activated: when a suitable voltage is applied to the leads, electrons are able to recombine with electron holes within the device, releasing energy in the form of photons. This effect is called electroluminescence, and the color of the light is determined by the band gap of the semiconductor. LEDs are typically small, and integrated optical components may be used to shape the radiation pattern. Appearing as practical electronic components in 1962, the earliest LEDs emitted low-intensity infrared light; infrared LEDs are still used as transmitting elements in remote-control circuits. The first visible-light LEDs were also of low intensity and limited to red. Modern LEDs are available across the visible, ultraviolet, and infrared wavelengths, with very high brightness. Early LEDs were often used as indicator lamps for electronic devices; they were soon packaged into numeric readouts in the form of seven-segment displays and were commonly seen in digital clocks. Recent developments in LEDs permit them to be used in environmental and task lighting. LEDs have allowed new displays and sensors to be developed, while their high switching rates are also used in advanced communications technology. LEDs have many advantages over incandescent light sources, including lower energy consumption, longer lifetime, improved physical robustness, smaller size, and faster switching. Light-emitting diodes are now used in applications as diverse as aviation lighting, automotive headlamps, advertising, general lighting, traffic signals, and camera flashes. As of 2017, LED lights for home room lighting are as cheap as or cheaper than compact fluorescent lamp sources of comparable output. They are also more energy-efficient and, arguably, have fewer environmental concerns linked to their disposal. Electroluminescence as a phenomenon was discovered in 1907 by the British experimenter H. J. Round of Marconi Labs, using a crystal of silicon carbide. Russian inventor Oleg Losev reported creation of the first LED in 1927; his research was distributed in Soviet, German and British scientific journals, but no practical use was made of the discovery for several decades. Rubin Braunstein of the Radio Corporation of America reported on infrared emission from gallium arsenide and other semiconductor alloys in 1955. Braunstein observed infrared emission generated by simple diode structures using gallium antimonide, gallium arsenide and indium phosphide. In 1957, Braunstein further demonstrated that the rudimentary devices could be used for non-radio communication across a short distance. As noted by Kroemer, Braunstein "…had set up a simple optical communications link": the emitted light was detected by a PbS diode some distance away, and this signal was fed into an amplifier and played back by a loudspeaker. Intercepting the beam stopped the music, and, as Kroemer recalled, "we had a great deal of fun playing with this setup." This setup presaged the use of LEDs for optical communication applications. By October 1961, researchers at Texas Instruments had demonstrated efficient light emission and signal coupling between a GaAs p-n junction light emitter and an electrically isolated semiconductor photodetector.
28.
Debian
–
The Debian Project was first announced in 1993 by Ian Murdock; Debian 0.01 was released on September 15, 1993, and the first stable release was made in 1996. The Debian stable release branch is one of the most popular Linux distributions for personal computers and network servers. New distributions are updated continually, and the next candidate is released after a time-based freeze. As one of the earliest operating systems based on the Linux kernel, Debian was developed openly and distributed freely, and this decision drew the attention and support of the Free Software Foundation, which sponsored the project for one year from November 1994 to November 1995. Upon the ending of the sponsorship, the Debian Project formed the non-profit organisation Software in the Public Interest. While Debian's main port, Debian GNU/Linux, uses the Linux kernel, other ports exist based on BSD kernels and the HURD microkernel; all use the GNU userland and the GNU C library. Debian has access to online repositories that contain over 50,000 software packages, making it one of the largest software compilations. Debian officially contains only free software, but non-free software can be downloaded from the Debian repositories. Debian includes popular free programs such as LibreOffice, the Firefox web browser, Evolution mail, the K3b disc burner, the VLC media player, the GIMP image editor, and the Evince document viewer, and it is a popular choice for web servers. Debian supports Linux officially, and offered kFreeBSD for version 7 but not 8. GNU/kFreeBSD was released as a technology preview for the IA-32 and x86-64 architectures, and lacked the amount of software available in Debian's Linux distribution; official support for kFreeBSD was removed for version 8, which did not provide a kFreeBSD-based distribution. Several flavors of the Linux kernel exist for each port; for example, the IA-32 port has flavors for PCs supporting Physical Address Extension and real-time computing, and for older PCs. The Linux kernel does not officially contain firmware without sources, although such firmware is available in non-free packages. Debian offers DVD and CD images for installation that can be downloaded using BitTorrent or jigdo; physical discs can also be bought from retailers. Debian offers different network installation methods: a minimal install of Debian is available via the netinst CD, whereby Debian is installed with just a base system, and another option is to boot the installer from the network. Installation images are hybrid on some architectures and can be used to create a bootable USB drive. The default bootstrap loader is GNU GRUB version 2, though the package name is simply grub, while version 1 was renamed to grub-legacy; this conflicts with, e.g., Fedora, where grub version 2 is named grub2. The default desktop may be chosen from the DVD boot menu among GNOME, KDE Software Compilation, Xfce and LXDE, and from special disc 1 CDs. Debian offers CD images specifically built for GNOME, KDE Software Compilation and Xfce; MATE is officially supported, while Cinnamon support was added with Debian 8.0 Jessie. Less common window managers such as Enlightenment, Openbox, Fluxbox, IceWM and Window Maker are also available. The default desktop environment of version 7.0 Wheezy was temporarily switched to Xfce, because GNOME 3 did not fit on the first CD of the set; the default for version 8.0 Jessie was changed again to Xfce in November 2013. Debian releases live install images for CDs, DVDs and USB thumb drives, for the IA-32 and x86-64 architectures, and with a choice of desktop environments.
29.
Ubuntu (operating system)
–
Ubuntu is a Debian-based Linux operating system published by Canonical Ltd, who offer commercial support. It uses Unity as its user interface for the desktop. Ubuntu is the most popular operating system running in hosted environments, so-called clouds. Development of Ubuntu is led by UK-based Canonical Ltd, a company owned by South African entrepreneur Mark Shuttleworth. Canonical generates revenue through the sale of technical support and other services related to Ubuntu. The Ubuntu project is committed to the principles of open-source software development; people are encouraged to use free software, study how it works, and improve upon it. Ubuntu is built on Debian's architecture and infrastructure to provide Linux server, desktop, phone and tablet operating systems. The first release was in October 2004. Starting with Ubuntu 6.06, every fourth release, one release every two years, receives long-term support (LTS). Long-term support includes updates for new hardware, security patches and updates to the Ubuntu stack. The first LTS releases were supported for three years on the desktop and five years on the server; since Ubuntu 12.04 LTS, LTS releases get regular point releases with support for new hardware and integration of all the updates published in that series to date. Ubuntu packages are based on packages from Debian's unstable branch, and both distributions use Debian's deb package format and package management tools. Debian and Ubuntu packages are not necessarily compatible with each other, however. Many Ubuntu developers are also maintainers of key packages within Debian, and Ubuntu cooperates with Debian by pushing changes back to Debian, although there has been criticism that this does not happen often enough. Ian Murdock, the founder of Debian, had expressed concern about Ubuntu packages potentially diverging too far from Debian to remain compatible. Before release, packages are imported from Debian unstable continuously and merged with Ubuntu-specific modifications; one month before release, imports are frozen, and packagers then work to ensure that the frozen features interoperate well together. Ubuntu is currently funded by Canonical Ltd. On 8 July 2005, Mark Shuttleworth and Canonical announced the creation of the Ubuntu Foundation; the purpose of the foundation is to ensure the support and development of all future versions of Ubuntu, and Mark Shuttleworth describes its goal as ensuring the continuity of the Ubuntu project. On 12 March 2009, Ubuntu announced developer support for third-party cloud management platforms. Unity has become the default GUI for Ubuntu Desktop, although following the release of Ubuntu 17.10 it will move to the GNOME 3 desktop instead, as work on Unity ends. A default installation of Ubuntu contains a range of software that includes LibreOffice, Firefox, Thunderbird and Transmission.
30.
Linux
–
Linux is a Unix-like computer operating system assembled under the model of free and open-source software development and distribution. The defining component of Linux is the Linux kernel, an operating system kernel first released on September 17, 1991 by Linus Torvalds. The Free Software Foundation uses the name GNU/Linux to describe the operating system, which has led to some controversy. Linux was originally developed for personal computers based on the Intel x86 architecture, but has since been ported to many other platforms. Because of the dominance of Android on smartphones, Linux has the largest installed base of all operating systems. Linux is also the leading operating system on servers and other big-iron systems such as mainframe computers, and it is used by around 2.3% of desktop computers. The Chromebook, which runs the Linux-kernel-based Chrome OS, dominates the US K–12 education market and represents nearly 20% of the sub-$300 notebook sales in the US. Linux also runs on embedded systems, devices whose operating system is built into the firmware and is highly tailored to the system; this includes TiVo and similar DVR devices, network routers, facility automation controls and televisions, and many smartphones and tablet computers run Android and other Linux derivatives. The development of Linux is one of the most prominent examples of free and open-source software collaboration. The underlying source code may be used, modified and distributed, commercially or non-commercially, by anyone under the terms of its respective licenses, such as the GNU General Public License. Typically, Linux is packaged in a form known as a Linux distribution for both desktop and server use. Distributions intended to run on servers may omit all graphical environments from the standard install, and because Linux is freely redistributable, anyone may create a distribution for any intended use. The Unix operating system was conceived and implemented in 1969 at AT&T's Bell Laboratories in the United States by Ken Thompson, Dennis Ritchie, Douglas McIlroy and others. First released in 1971, Unix was written entirely in assembly language, as was common practice at the time; later, in a key pioneering approach in 1973, it was rewritten in the C programming language by Dennis Ritchie. The availability of a high-level language implementation of Unix made its porting to different computer platforms easier. Due to an earlier antitrust case forbidding it from entering the computer business, AT&T licensed the operating system's source code widely; as a result, Unix grew quickly and became widely adopted by academic institutions and businesses. In 1984, AT&T divested itself of Bell Labs and, freed of the legal obligation requiring free licensing, began selling Unix as a proprietary product. The GNU Project, started in 1983 by Richard Stallman, has the goal of creating a complete Unix-compatible software system composed entirely of free software; later, in 1985, Stallman started the Free Software Foundation. By the early 1990s, many of the programs required in an operating system were completed, although low-level elements such as device drivers, daemons, and the kernel were stalled and incomplete. Linus Torvalds has stated that if the GNU kernel had been available at the time, he probably would not have written his own. Although not released until 1992 due to legal complications, development of 386BSD, from which NetBSD, OpenBSD and FreeBSD descended, predated that of Linux, and Torvalds has also stated that if 386BSD had been available at the time, he probably would not have created Linux. Although the complete source code of MINIX was freely available, the licensing terms prevented it from being free software until the licensing changed in April 2000.
31.
APT (Debian)
–
APT was originally designed as a front-end for dpkg to work with Debian's .deb packages, but it has since been modified to also work with the RPM Package Manager system via APT-RPM. The Fink project has ported APT to Mac OS X for some of its own package management tasks. There has been an apt program since APT version 1.0; apt is a collection of tools distributed in a package named apt. A significant part of apt is defined in a C++ library of functions, but apt also includes command-line programs for dealing with packages. Three such programs are apt, apt-get and apt-cache; they are commonly used in examples of apt because they are simple and ubiquitous. The apt package is of "important" priority in all current Debian releases. APT can be considered a front-end to dpkg, friendlier than the older dselect front-end. While dpkg performs actions on individual packages, apt tools manage relations between them, as well as sourcing and management of higher-level versioning decisions. APT is often hailed as one of Debian's best features, which Debian developers attribute to the strict quality controls in Debian's policy. A major feature in APT is the way it calls dpkg: it does topological sorting of the list of packages to be installed or removed, so that dpkg is invoked in a workable sequence. In some cases, it utilizes the --force options in dpkg; however, it only does this when it is unable to calculate how to avoid the reason dpkg requires the action to be forced. The user indicates one or more packages to be installed; each package name is phrased as just the name portion of the package, not a fully qualified filename. Notably, apt automatically gets and installs packages upon which the indicated package depends. This was an original distinguishing characteristic of apt-based package management systems, as it avoided installation failure due to missing dependencies; another such distinction is remote repository retrieval of packages. APT provides other options to override decisions made by apt-get's conflict resolution system. One option is to force a particular version of a package; this can downgrade a package and render dependent software inoperable, so the user must be careful. Finally, the pinning mechanism (apt_preferences) allows the user to create an alternative installation policy for individual packages; the user can specify packages by POSIX regular expression. apt-get is the command-line package management tool supplied with the Debian package apt. APT searches its cached list of packages and lists the dependencies that must be installed or updated; triggers are the treatment of deferred actions. APT retrieves, configures and installs the dependencies automatically. Other commands used in apt-get include update, which is used to resynchronize the package index files from their sources; the lists of available packages are fetched from the locations specified in /etc/apt/sources.list. For example, when using a Debian archive, this command retrieves and scans the Packages.gz files, so that information about new and updated packages is available.
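The following is a hedged Haskell sketch of the dependency-ordering idea only; it is not APT's actual algorithm or data model, and the package names and the deps table are hypothetical. It simply visits dependencies depth-first so that each package appears in the output after everything it depends on.

```haskell
import qualified Data.Map as M

type Pkg = String

-- Hypothetical dependency table: package -> packages it depends on.
deps :: M.Map Pkg [Pkg]
deps = M.fromList
  [ ("myapp",  ["libfoo", "libbar"])
  , ("libfoo", ["libc"])
  , ("libbar", ["libc"])
  , ("libc",   [])
  ]

-- Depth-first post-order visit: dependencies come out before dependents.
installOrder :: M.Map Pkg [Pkg] -> [Pkg] -> [Pkg]
installOrder table roots = snd (foldl visit ([], []) roots)
  where
    visit acc@(seen, order) p
      | p `elem` seen = acc
      | otherwise =
          let (seen', order') = foldl visit (p : seen, order)
                                            (M.findWithDefault [] p table)
          in (seen', order' ++ [p])

main :: IO ()
main = print (installOrder deps ["myapp"])
-- ["libc","libfoo","libbar","myapp"]
```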
32.
Coupling (computer programming)
–
Coupling is usually contrasted with cohesion: low coupling often correlates with high cohesion, and vice versa. Low coupling is often a sign of a well-structured computer system and a good design, and when combined with high cohesion, it supports the general goals of high readability and maintainability. Structured design, including cohesion and coupling, was published in the article by Stevens, Myers and Constantine and in the book by Yourdon and Constantine. Coupling can be low (loose) or high (tight); the common types are as follows. Content coupling occurs when one module modifies or relies on the internal workings of another module; in this situation, a change in the way the second module produces data might also require a change in the dependent module. Common coupling occurs when two modules share the same global data; changing the shared resource might imply changing all the modules using it. External coupling occurs when two modules share an externally imposed data format, communication protocol, or device interface; this is basically related to communication with external tools and devices. Control coupling is one module controlling the flow of another, by passing it information on what to do. Stamp coupling occurs when modules share a composite data structure and use only parts of it; in this situation, a modification in a field that a module does not need may lead to changing the way the module reads the record. Data coupling occurs when modules share data through, for example, parameters; each datum is an elementary piece, and these are the only data shared. Message coupling is the loosest type of coupling; it can be achieved by state decentralization, with component communication done via parameters or message passing. No coupling means modules do not communicate at all with one another. Subclass coupling describes the relationship between a child and its parent: the child is connected to its parent, but the parent is not connected to the child. Temporal coupling occurs when two actions are bundled together into one module just because they happen to occur at the same time. In recent work, various other coupling concepts have been investigated and used as indicators for different modularization principles used in practice. Tight coupling has practical costs: assembly of modules might require more effort and/or time due to the increased inter-module dependency, and a particular module might be harder to reuse and/or test because dependent modules must be included. Regarding message transmission overhead and performance: since a message must be transmitted in full to retain its complete meaning, message transmission must be optimized. Longer messages require more CPU and memory to produce, transmit and receive; also, when necessary, receivers must reassemble a message into its original state to completely receive it. Hence, to optimize runtime performance, message length must be minimized and message meaning must be maximized.
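A small, hedged Haskell sketch contrasting two of the categories above: common-style coupling through a shared mutable reference versus data coupling through explicit parameters. The function names are invented for this illustration.

```haskell
import Data.IORef

-- Common-style coupling (tighter): both operations manipulate the same
-- shared mutable reference, standing in for shared global data, so a
-- change to how that state is used can ripple into every user of it.
addToShared :: IORef Int -> Int -> IO ()
addToShared ref n = modifyIORef ref (+ n)

-- Data coupling (looser): the only dependency between caller and callee
-- is the parameters passed in and the result returned.
addPure :: Int -> Int -> Int
addPure total n = total + n

main :: IO ()
main = do
  ref <- newIORef 0
  addToShared ref 5
  addToShared ref 7
  readIORef ref >>= print             -- 12, via the shared mutable state
  print (addPure (addPure 0 5) 7)     -- 12, via explicit data flow
```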
33.
Deb (file format)
–
Deb is the format, as well as the filename extension, of the software package format for the Debian distribution and its derivatives. Debian packages are standard Unix ar archives that include two tar archives: one archive holds the control information and the other contains the installable data. dpkg provides the basic functionality for installing and manipulating Debian packages; generally, end users don't manage packages directly with dpkg but instead use the APT package management software or other APT front-ends such as Synaptic or KPackage. Debian packages can be converted into other package formats, and vice versa, using alien. Some core Debian packages are available as udebs, and are used only for bootstrapping a Debian installation. Although these files use the udeb filename extension, they adhere to the same structure specification as ordinary deb files. However, unlike their deb counterparts, udeb packages contain only essential functional files; in particular, documentation files are normally omitted. udeb packages are not installable on a standard Debian system, but are used in the Debian-Installer. Prior to Debian 0.93, a package consisted of a file header and two concatenated gzip archives. Since Debian 0.93, a deb package is implemented as an ar archive containing three files in a specific order: debian-binary, which contains a single line giving the package format version number; the control archive, a tar archive named control.tar that contains the maintainer scripts and package metadata, where compressing the archive with gzip or xz is supported and the file extension changes to indicate the compression method; and the data archive, a tar archive named data.tar that contains the actual installable files, where compressing the archive with gzip, bzip2, lzma or xz is supported and, again, the file extension changes to indicate the compression method. The control archive contents can include the following files: control, which contains a brief description of the package as well as other information such as its dependencies; md5sums, which contains MD5 checksums of all files in the package in order to detect corrupt or incomplete files; conffiles, which lists the files of the package that should be treated as configuration files (configuration files are not overwritten during an update unless specified); preinst, postinst, prerm and postrm, optional scripts that are executed before or after installing, updating or removing the package; config, an optional script that supports the debconf configuration mechanism; and shlibs, a list of shared library dependencies. Debian-based distributions support GPG signature verification of signed Debian packages, but most have this feature disabled by default.
34.
Foobar
–
The terms foobar, foo, bar and others are used as placeholder names in computer programming or computer-related documentation. They have been used to name entities such as variables and functions whose exact identity is unimportant. The etymology of foo is obscure; its use in connection with bar is generally traced to the World War II military slang FUBAR, later bowdlerised to foobar. The word foo on its own was used earlier, and it may be related to the Chinese word fu, which can mean happiness. The first known use of the terms in print in a programming context appears in a 1965 edition of MIT's Tech Engineering News. Foobar may have come about as a result of the pre-existing foo being conjoined with bar, an addition borrowed from the military's FUBAR. The use of foo in a programming context is generally credited to the Tech Model Railroad Club (TMRC) of MIT from circa 1960. One feature of the club's system was a clock on the dispatch board: when someone hit a scram switch, the clock stopped and the display was replaced with the word FOO, and at TMRC the scram switches are therefore known as foo switches. Because of this, an entry in the 1959 Dictionary of the TMRC Language defined FOO and noted that "our first obligation is to keep the foo counters turning". One book describing the MIT train room describes two buttons by the door, labeled foo and bar; these were general-purpose buttons and were often re-purposed for whatever fun idea the MIT hackers had at the time, hence the adoption of foo. An entry in the Abridged Dictionary of the TMRC Language states: "Multiflush: stop-all-trains-button. Next best thing to the red door button. Displays FOO on the clock when used." The term foobar was propagated through computer science circles in the 1960s. Foobar was also used as a variable name in the Fortran code of Colossal Cave Adventure; the variable FOOBAR was used to contain the player's progress in saying the magic phrase "Fee Fie Foe Foo". In "Hello World"-style code samples, foo and bar are commonly used to illustrate string concatenation, as in the sketch below. Foo Camp is an annual hacker convention, and BarCamp is a network of user-generated conferences; the name also came up during the United States v. Microsoft Corp. trial. foobar2000 is an audio player, and Google uses a web tool called foo.bar to recruit new employees. Foo Fighters is an American rock band named after the foo fighters, and Stanley Black & Decker has registered the trademark FUBAR, used as branding for a type of crowbar.
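The original sample is not reproduced in this extract; a minimal equivalent in Haskell, with foo and bar as the placeholder strings, might look like this.

```haskell
-- Placeholder names used only to demonstrate string concatenation.
foo :: String
foo = "Hello"

bar :: String
bar = "World"

main :: IO ()
main = putStrLn (foo ++ " " ++ bar ++ "!")   -- prints "Hello World!"
```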