Linux kernel
The Linux kernel is a free and open-source, Unix-like operating system kernel. The Linux family of operating systems is based on this kernel and is deployed both on traditional computer systems such as personal computers and servers, in the form of Linux distributions, and on various embedded devices such as routers, wireless access points, PBXes, set-top boxes, FTA receivers, smart TVs, PVRs and NAS appliances. While the adoption of the Linux kernel in desktop operating systems is low, Linux-based operating systems dominate nearly every other segment of computing, from mobile devices to mainframes; as of November 2017, all of the world's 500 most powerful supercomputers run Linux. The Android operating system for smartphones, tablet computers and smartwatches also uses the Linux kernel. The kernel was conceived and created in 1991 by Linus Torvalds for his personal computer, with no cross-platform intentions, but it has since expanded to support a huge array of computer architectures, many more than other operating systems or kernels.
Linux attracted developers and users who adopted it as the kernel for other free software projects, notably the GNU operating system, which was created as a free, non-proprietary operating system and was based on UNIX as a by-product of the fallout of the Unix wars. The Linux kernel API, the application programming interface through which user programs interact with the kernel, is meant to be stable and not to break userspace programs; as part of the kernel's functionality, device drivers control the hardware. However, the interface between the kernel and loadable kernel modules, unlike in many other kernels and operating systems, is deliberately not meant to be stable. The Linux kernel, developed by contributors worldwide, is a prominent example of free and open-source software, and day-to-day development discussions take place on the Linux kernel mailing list. The kernel is released under the GNU General Public License version 2, with some firmware images released under various non-free licenses. In April 1991, Linus Torvalds, at the time a 21-year-old computer science student at the University of Helsinki, started working on some simple ideas for an operating system.
He started with a task switcher in Intel 80386 assembly and a terminal driver. On 25 August 1991, Torvalds posted the following to comp.os.minix, a newsgroup on Usenet: I'm doing a (free) operating system for 386 AT clones. This has been brewing since April, and is starting to get ready. I'd like any feedback on things people like/dislike in minix. I've ported bash and gcc, and things seem to work; this implies that I'll get something practical within a few months. Yes - it's free of any minix code, and it has a multi-threaded fs. It is NOT portable, and it probably never will support anything other than AT-harddisks, as that's all I have :-(. It's in C, but most people wouldn't call what I write C; it uses every conceivable feature of the 386 I could find, as it was also a project to teach me about the 386. As already mentioned, it uses an MMU, for both paging and segmentation. It's the segmentation that makes it REALLY 386 dependent; some of my "C"-files are as much assembler as C. Unlike minix, I happen to LIKE interrupts, so interrupts are handled without trying to hide the reason behind them. After that, many people contributed code to the project.
Early on, the MINIX community contributed code and ideas to the Linux kernel. At the time, the GNU Project had created many of the components required for a free operating system, but its own kernel, GNU Hurd, was incomplete and unavailable; the Berkeley Software Distribution had not yet freed itself from legal encumbrances. Despite the limited functionality of the early versions, Linux gained developers and users. In September 1991, Torvalds released version 0.01 of the Linux kernel on the FTP server of the Finnish University and Research Network. It had 10,239 lines of code. On 5 October 1991, version 0.02 of the Linux kernel was released. Torvalds assigned version 0 to the kernel to indicate that it was for testing and not intended for production use. In December 1991, Linux kernel 0.11 was released. This version was the first to be self-hosted, as Linux kernel 0.11 could be compiled on a computer running the same kernel version. When Torvalds released version 0.12 in February 1992, he adopted the GNU General Public License version 2 in place of his previous self-drafted license, which had not permitted commercial redistribution.
On 19 January 1992, the first post to the new newsgroup alt.os.linux was submitted. On 31 March 1992, the newsgroup was renamed comp.os.linux. The fact that Linux is a monolithic kernel rather than a microkernel was the topic of a debate between Andrew S. Tanenbaum, the creator of MINIX, and Torvalds; this discussion, known as the Tanenbaum–Torvalds debate, started in 1992 on the Usenet discussion group comp.os.minix as a general debate about Linux and kernel architecture. Tanenbaum argued that microkernels were superior to monolithic kernels and that therefore Linux was obsolete. Unlike in traditional monolithic kernels, device drivers in Linux are configured as loadable kernel modules and can be loaded or unloaded while the system is running.
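The loadable-module mechanism can be illustrated with a minimal sketch in C, assuming a Linux build environment with the kernel headers installed; the module name and log messages are illustrative, and a real driver would register with a kernel subsystem rather than merely log.

/* Minimal sketch of a loadable kernel module (illustrative "hello" module).
 * It only logs a message when loaded and unloaded; a real driver would
 * register with a subsystem such as character devices or networking. */
#include <linux/init.h>
#include <linux/module.h>

MODULE_LICENSE("GPL");
MODULE_DESCRIPTION("Illustrative hello-world module");

static int __init hello_init(void)
{
        pr_info("hello: module loaded\n");
        return 0;               /* 0 means successful initialisation */
}

static void __exit hello_exit(void)
{
        pr_info("hello: module unloaded\n");
}

module_init(hello_init);
module_exit(hello_exit);

Built against the headers of the running kernel, such a module can typically be inserted with insmod and removed with rmmod, without rebooting the system.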
University of Texas at Austin
The University of Texas at Austin is a public research university in Austin, Texas. It is the flagship institution of the University of Texas System. The University of Texas was inducted into the Association of American Universities in 1929, becoming only the third university in the American South to be elected. The institution has the nation's eighth-largest single-campus enrollment, with over 50,000 undergraduate and graduate students and over 24,000 faculty and staff. A Public Ivy, it is a major center for academic research, with research expenditures exceeding $615 million for the 2016–2017 school year. The university houses seven museums and seventeen libraries, including the Lyndon Baines Johnson Library and Museum and the Blanton Museum of Art, and operates various auxiliary research facilities, such as the J. J. Pickle Research Campus and the McDonald Observatory. Among the university's faculty are recipients of the Nobel Prize, the Pulitzer Prize, the Wolf Prize, the Primetime Emmy Award, the Turing Award and the National Medal of Science, as well as many other awards.
As of October 2018, 11 Nobel Prize winners, 2 Turing Award winners and 1 Fields Medalist have been affiliated with the school as alumni, faculty members or researchers. Student athletes are members of the Big 12 Conference, and its Longhorn Network is the only sports network featuring the college sports of a single university. The Longhorns have won four NCAA Division I National Football Championships, six NCAA Division I National Baseball Championships and thirteen NCAA Division I National Men's Swimming and Diving Championships, and have claimed more titles in men's and women's sports than any other school in the Big 12 since the league was founded in 1996. The first mention of a public university in Texas can be traced to the 1827 constitution for the Mexican state of Coahuila y Tejas. Although Title 6, Article 217 of that constitution promised to establish public education in the arts and sciences, no action was taken by the Mexican government. After Texas obtained its independence from Mexico in 1836, the Texas Congress adopted the Constitution of the Republic, which, under Section 5 of its General Provisions, stated: "It shall be the duty of Congress, as soon as circumstances will permit, to provide, by law, a general system of education." On April 18, 1838, "An Act to Establish the University of Texas" was referred to a special committee of the Texas Congress, but was not reported back for further action.
On January 26, 1839, the Texas Congress agreed to set aside fifty leagues of land—approximately 288,000 acres —towards the establishment of a publicly funded university. In addition, 40 acres in the new capital of Austin were reserved and designated "College Hill." In 1845, Texas was annexed into the United States. The state's Constitution of 1845 failed to mention higher education. On February 11, 1858, the Seventh Texas Legislature approved O. B. 102, an act to establish the University of Texas, which set aside $100,000 in United States bonds toward construction of the state's first publicly funded university. The legislature designated land reserved for the encouragement of railroad construction toward the university's endowment. On January 31, 1860, the state legislature, wanting to avoid raising taxes, passed an act authorizing the money set aside for the University of Texas to be used for frontier defense in west Texas to protect settlers from Indian attacks. Texas's secession from the Union and the American Civil War delayed repayment of the borrowed monies.
At the end of the Civil War in 1865, the University of Texas's endowment was just over $16,000 in warrants, and nothing substantive had been done to organize the university's operations. The effort to establish a university was again mandated by Article 7, Section 10 of the Texas Constitution of 1876, which directed the legislature to "establish and provide for the maintenance and direction of a university of the first class, to be located by a vote of the people of this State, styled 'The University of Texas'". Additionally, Article 7, Section 11 of the 1876 Constitution established the Permanent University Fund, a sovereign wealth fund managed by the Board of Regents of the University of Texas and dedicated to the maintenance of the university. Because some state legislators perceived an extravagance in the construction of academic buildings of other universities, Article 7, Section 14 of the Constitution expressly prohibited the legislature from using the state's general revenue to fund construction of university buildings.
Funds for constructing university buildings had to come from the university's endowment or from private gifts to the university, but the university's operating expenses could come from the state's general revenues. The 1876 Constitution revoked the endowment of the railroad lands of the Act of 1858, but dedicated 1,000,000 acres of land, along with other property appropriated for the university, to the Permanent University Fund. This was to the detriment of the university, as the lands granted by the Constitution of 1876 represented less than 5% of the value of the lands granted under the Act of 1858. The more valuable lands reverted to the fund to support general education.
ALGOL 68
ALGOL 68 is an imperative computer programming language, conceived as a successor to the ALGOL 60 programming language, designed with the goal of a much wider scope of application and more rigorously defined syntax and semantics. The contributions of ALGOL 68 to the field of computer science have been deep, wide-ranging and enduring, although many of these contributions were only publicly identified when they had reappeared in subsequently developed programming languages. ALGOL 68 features include expression-based syntax, user-declared types and structures/tagged-unions, a reference model of variables and reference parameters, string and matrix slicing, and concurrency. ALGOL 68 was designed by the IFIP Working Group 2.1. On December 20, 1968, the language was formally adopted by Working Group 2.1 and subsequently approved for publication by the General Assembly of IFIP. ALGOL 68 was defined using a two-level grammar formalism invented by Adriaan van Wijngaarden. Van Wijngaarden grammars use a context-free grammar to generate an infinite set of productions that will recognize a particular ALGOL 68 program.
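ALGOL 68's united modes behave like tagged unions: a value records which constituent mode it currently holds, and a conformity clause branches on that tag. As a rough analogue only, and not ALGOL 68 syntax, a hand-rolled tagged union in C might look like the following sketch; the type and field names are illustrative.

#include <stdio.h>

/* A hand-rolled tagged union: the tag records which member of the union is
 * currently valid, loosely analogous to an ALGOL 68 united mode. */
enum value_tag { TAG_INT, TAG_REAL };

struct value {
    enum value_tag tag;
    union {
        int    i;
        double r;
    } as;
};

static void print_value(const struct value *v)
{
    switch (v->tag) {               /* branch on the tag, like a conformity clause */
    case TAG_INT:  printf("int: %d\n", v->as.i);  break;
    case TAG_REAL: printf("real: %f\n", v->as.r); break;
    }
}

int main(void)
{
    struct value a = { .tag = TAG_INT,  .as.i = 42 };
    struct value b = { .tag = TAG_REAL, .as.r = 3.14 };
    print_value(&a);
    print_value(&b);
    return 0;
}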
ALGOL 68 has been criticized, most prominently by some members of its design committee such as C. A. R. Hoare and Edsger Dijkstra, for abandoning the simplicity of ALGOL 60, becoming a vehicle for complex or overly general ideas, and doing little to make the compiler writer's task easier, in contrast to deliberately simple contemporaries such as C, S-algol and Pascal. In 1970, ALGOL 68-R became the first working compiler for ALGOL 68. In the 1973 revision, certain features – such as proceduring and formal bounds – were omitted (cf. the language of the unrevised report, denoted r0 below). Though European defence agencies promoted the use of ALGOL 68 for its expected security advantages, the American side of the NATO alliance decided to develop a different project, the Ada programming language, making its use obligatory for US defense contracts. ALGOL 68 had a notable influence within the Soviet Union, details of which can be found in Andrey Terekhov's 2014 paper "ALGOL 68 and Its Impact on the USSR and Russian Programming" and in "Алгол 68 и его влияние на программирование в СССР и России" (pages 336 and 342).
Steve Bourne, who was on the Algol 68 revision committee, took some of its ideas to his Bourne shell and to C. The complete history of the project can be found in C. H. Lindsey's A History of ALGOL 68. For a full-length treatment of the language, see Programming Algol 68 Made Easy by Dr. Sian Mountbatten, or Learning Algol 68 Genie by Dr. Marcel van der Veer, which includes the Revised Report. See also "A Shorter History of Algol 68".
ALGOL 68 – 3rd generation ALGOL:
Mar. 1968: Draft Report on the Algorithmic Language ALGOL 68 – edited by A. van Wijngaarden, B. J. Mailloux, J. E. L. Peck and C. H. A. Koster.
Oct. 1968: Penultimate Draft Report on the Algorithmic Language ALGOL 68 – Chapters 1-9, Chapters 10-12 – edited by A. van Wijngaarden, B. J. Mailloux, J. E. L. Peck and C. H. A. Koster.
Dec. 1968: Report on the Algorithmic Language ALGOL 68 – offprint from Numerische Mathematik, 14, 79-218 – edited by A. van Wijngaarden, B. J. Mailloux, J. E. L. Peck and C. H. A. Koster.
WG 2.1 members active in the original design of ALGOL 68: Friedrich L. Bauer • Hans Bekic • Edsger Dijkstra※ • Fraser Duncan※ • Jan Garwick※ • Gerhard Goos • Tony Hoare※ • Peter Zilahy Ingerman • Kees Koster • Peter Landin • Charles Lindsey • Barry Mailloux • John McCarthy • Jack Merner • Peter Naur‡ • Manfred Paul • John Peck • Willem van der Poel • Brian Randell※ • Doug Ross • Klaus Samelson • Gerhard Seegmüller※ • Michel Sintzoff • Wlad Turski※ • Aad van Wijngaarden • Niklaus Wirth‡ • Mike Woodger※ • Nobuo Yoneda.
‡ Resigned afterwards.
Sep. 1973: Revised Report on the Algorithmic Language Algol 68 – Springer-Verlag 1976 – edited by A. van Wijngaarden, B. J. Mailloux, J. E. L. Peck, C. H. A. Koster, M. Sintzoff, C. H. Lindsey, L. G. L. T. Meertens and R. G. Fisker.
1968: On December 20, 1968, the "Final Report" was adopted by the Working Group and subsequently approved by the General Assembly of UNESCO's IFIP for publication. Translations of the standard were made into Russian, German and Bulgarian, and later Japanese and Chinese; the standard was also made available in Braille.
1984: TC97 considered Algol 68 for standardisation as "New Work Item" TC97/N1642. West Germany, the Netherlands, the USSR and Czechoslovakia were willing to participate in preparing the standard, but the USSR and Czechoslovakia "were not the right kinds of member of the right ISO committees" and Algol 68's ISO standardisation stalled.
1988: ALGOL 68 subsequently became one of the GOST standards in Russia: GOST 27974-88 Programming language ALGOL 68 (Язык программирования АЛГОЛ 68) and GOST 27975-88 Programming language ALGOL 68 extended (Язык программирования АЛГОЛ 68 расширенный).
There are about 60 reserved words in the standard language: mode, op, proc, heap, long, short, bool, char, int, sema, void, file, struct, union, at "@", either (r0), is ":=:", isnt (is not in r0) ":/=:" ":≠:", of "→" (r0), false, nil "○", skip "~", co "¢", comment "¢", pr, case ~ in ~ ouse ~ in ~ out ~ esac, for ~ from ~ to ~ by ~ while ~ do ~ od.
Dutch people
Dutch people or the Dutch are a Germanic ethnic group native to the Netherlands. They speak the Dutch language. Dutch people and their descendants are found in migrant communities worldwide, notably in Aruba, Guyana, Curaçao, Brazil, Australia, South Africa, New Zealand and the United States. The Low Countries were situated around the border of France and the Holy Roman Empire, forming a part of their respective peripheries, and the various territories of which they consisted had become autonomous by the 13th century. Under the Habsburgs, the Netherlands were organised into a single administrative unit, and in the 16th and 17th centuries the Northern Netherlands gained independence from Spain as the Dutch Republic. The high degree of urbanization characteristic of Dutch society was attained at an early date. During the Republic the first series of large-scale Dutch migrations outside of Europe took place. The Dutch have left behind a substantial legacy despite the limited size of their country. The Dutch people are seen as the pioneers of capitalism, and their emphasis on a modern economy and a free market had a huge influence on the great powers of the West, especially the British Empire, its Thirteen Colonies and the United States.
The traditional arts and culture of the Dutch encompass various forms of traditional music, architectural styles and clothing, some of which are globally recognizable. Internationally, Dutch painters such as Rembrandt and Van Gogh are held in high regard. The dominant religion of the Dutch was Christianity, although in modern times the majority are no longer religious. Significant percentages of the Dutch are adherents of humanism, atheism or individual spirituality. As with all ethnic groups, the ethnogenesis of the Dutch has been a complex process. Though the majority of the defining characteristics of the Dutch ethnic group have accumulated over the ages, it is difficult to pinpoint the exact emergence of the Dutch people; the text below hence focuses on the history of the Dutch ethnic group. For Dutch colonial history, see the article on the Dutch Empire. In the first centuries CE, the Germanic tribes formed tribal societies with no apparent form of autocracy, with beliefs based on Germanic paganism, and speaking a dialect still resembling Common Germanic.
Following the end of the migration period in the West around 500, with large federations settling the decaying Roman Empire, a series of monumental changes took place within these Germanic societies. Among the most important of these were their conversion from Germanic paganism to Christianity, the emergence of a new political system centered on kings, and a continuing process of emerging mutual unintelligibility of their various dialects. The general situation described above is applicable to most if not all modern European ethnic groups with origins among the Germanic tribes, such as the Frisians, the Germans and the North Germanic peoples. In the Low Countries, this phase began when the Franks, themselves a union of multiple smaller tribes, began to make incursions into the northwestern provinces of the Roman Empire. In 358, the Salian Franks, one of the three main subdivisions of the Frankish alliance, settled the area's southern lands as foederati. Linguistically, Old Frankish or Low Franconian evolved into Old Dutch, first attested in the 6th century, whereas religiously the Franks converted to Christianity between around 500 and 700.
On a political level, the Frankish warlords abandoned tribalism and founded a number of kingdoms, culminating in the Frankish Empire of Charlemagne. However, the population make-up of the Frankish Empire, and of early Frankish kingdoms such as Neustria and Austrasia, was not dominated by Franks. Though the Frankish leaders controlled most of Western Europe, the Franks themselves were confined to the northwestern part of the Empire. The Franks in Northern France were assimilated by the general Gallo-Roman population and took over their dialects, whereas the Franks in the Low Countries retained their language, which would evolve into Dutch. The current Dutch-French language border has remained virtually identical ever since, and could be seen as marking the furthest pale of gallicization among the Franks. The medieval cities of the Low Countries, which experienced major growth during the 11th and 12th centuries, were instrumental in breaking down the relatively loose local form of feudalism. As they became powerful, they used their economic strength to influence the politics of their nobility.
During the early 14th century, beginning in and inspired by the County of Flanders, the cities in the Low Countries gained huge autonomy and dominated or influenced the various political affairs of the fief, including marriage succession. While the cities were of great political importance, they formed catalys
Unix
Unix is a family of multitasking, multiuser computer operating systems that derive from the original AT&T Unix, whose development started in the 1970s at the Bell Labs research center by Ken Thompson, Dennis Ritchie and others. Initially intended for use inside the Bell System, Unix was licensed by AT&T to outside parties in the late 1970s, leading to a variety of both academic and commercial Unix variants from vendors including the University of California, Microsoft, IBM and Sun Microsystems. In the early 1990s, AT&T sold its rights in Unix to Novell, which sold its Unix business to the Santa Cruz Operation in 1995; the UNIX trademark passed to The Open Group, a neutral industry consortium, which allows the use of the mark for certified operating systems that comply with the Single UNIX Specification. As of 2014, the Unix version with the largest installed base is Apple's macOS. Unix systems are characterized by a modular design, sometimes called the "Unix philosophy"; this concept entails that the operating system provides a set of simple tools that each performs a limited, well-defined function, with a unified filesystem as the main means of communication, and a shell scripting and command language to combine the tools to perform complex workflows.
Unix distinguishes itself from its predecessors as the first portable operating system: almost the entire operating system is written in the C programming language, which allows Unix to reach numerous platforms. Unix was meant to be a convenient platform for programmers developing software to be run on it and on other systems, rather than for non-programmers; the system grew larger as the operating system started spreading in academic circles and as users added their own tools to the system and shared them with colleagues. At first, Unix was not designed to be portable or multi-tasking; it later gained portability, multi-tasking and multi-user capabilities in a time-sharing configuration. Unix systems are characterized by various concepts: the use of plain text for storing data, a hierarchical file system, and the treatment of devices and certain kinds of inter-process communication as files. These concepts are collectively known as the "Unix philosophy". Brian Kernighan and Rob Pike summarize this in The Unix Programming Environment as "the idea that the power of a system comes more from the relationships among programs than from the programs themselves".
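In practice the shell realises this composition by wiring small programs together with pipes. The following C sketch, an illustration rather than code from any Unix source, shows roughly what a shell does for a two-stage pipeline such as ls | wc -l, using the pipe, fork, dup2, execlp and waitpid calls; the two commands are arbitrary examples.

/* Sketch of how a shell might connect two programs with a pipe,
 * roughly equivalent to running "ls | wc -l". */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>
#include <sys/wait.h>

int main(void)
{
    int fd[2];
    if (pipe(fd) == -1) { perror("pipe"); exit(1); }

    pid_t writer = fork();
    if (writer == 0) {                  /* child 1: "ls" writes into the pipe */
        dup2(fd[1], STDOUT_FILENO);
        close(fd[0]); close(fd[1]);
        execlp("ls", "ls", (char *)NULL);
        perror("execlp"); _exit(127);
    }

    pid_t reader = fork();
    if (reader == 0) {                  /* child 2: "wc -l" reads from the pipe */
        dup2(fd[0], STDIN_FILENO);
        close(fd[0]); close(fd[1]);
        execlp("wc", "wc", "-l", (char *)NULL);
        perror("execlp"); _exit(127);
    }

    close(fd[0]); close(fd[1]);         /* parent closes both ends and waits */
    waitpid(writer, NULL, 0);
    waitpid(reader, NULL, 0);
    return 0;
}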
In an era when a standard computer consisted of a hard disk for storage and a data terminal for input and output, the Unix file model worked quite well, as I/O was generally linear. In the 1980s, non-blocking I/O and the set of inter-process communication mechanisms were augmented with Unix domain sockets, shared memory, message queues and semaphores, and network sockets were added to support communication with other hosts. As graphical user interfaces developed, the file model proved inadequate to the task of handling asynchronous events such as those generated by a mouse. By the early 1980s, users began seeing Unix as a potential universal operating system, suitable for computers of all sizes; the Unix environment and the client–server program model were essential elements in the development of the Internet and the reshaping of computing as centered in networks rather than in individual computers. Both Unix and the C programming language were developed by AT&T and distributed to government and academic institutions, which led to both being ported to a wider variety of machine families than any other operating system.
Under Unix, the operating system consists of many libraries and utilities along with the master control program, the kernel. The kernel provides services to start and stop programs, handles the file system and other common "low-level" tasks that most programs share, and schedules access to avoid conflicts when programs try to access the same resource or device simultaneously. To mediate such access, the kernel has special rights, reflected in the division between user space and kernel space, although in microkernel implementations, like MINIX or Redox, functions such as network protocols may run in user space. The origins of Unix date back to the mid-1960s, when the Massachusetts Institute of Technology, Bell Labs and General Electric were developing Multics, a time-sharing operating system for the GE-645 mainframe computer. Multics featured several innovations, but presented severe problems. Frustrated by the size and complexity of Multics, but not by its goals, individual researchers at Bell Labs started withdrawing from the project.
The last to leave were Ken Thompson, Dennis Ritchie, Douglas McIlroy and Joe Ossanna, who decided to reimplement their experiences in a new project of smaller scale. This new operating system was initially without organizational backing and without a name; it was a single-tasking system. In 1970, the group coined the name Unics, for Uniplexed Information and Computing Service, as a pun on Multics, which stood for Multiplexed Information and Computer Services. Brian Kernighan takes credit for the idea, but adds that "no one can remember" the origin of the final spelling Unix. Dennis Ritchie, Doug McIlroy and Peter G. Neumann also credit Kernighan. The operating system was originally written in assembly language, but in 1973, Version 4 Unix was rewritten in C. Version 4 Unix, however, still had much PDP-11-dependent code and was not suitable for porting; the first port to another platform was made five years later.
Computer science
Computer science is the study of processes that interact with data and that can be represented as data in the form of programs. It enables the use of algorithms to manipulate and communicate digital information. A computer scientist studies the theory of computation and the practice of designing software systems; the field can be divided into theoretical and practical disciplines. Computational complexity theory is abstract, while computer graphics emphasizes real-world applications. Programming language theory considers approaches to the description of computational processes, while computer programming itself involves the use of programming languages and complex systems. Human–computer interaction considers the challenges in making computers useful and accessible. The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks, such as the abacus, have existed since antiquity, aiding in computations such as multiplication and division.
Algorithms for performing computations have existed since antiquity, even before the development of sophisticated computing equipment. Wilhelm Schickard designed and constructed the first working mechanical calculator in 1623. In 1673, Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped Reckoner; he may be considered the first computer scientist and information theorist, for, among other reasons, documenting the binary number system. In 1820, Thomas de Colmar launched the mechanical calculator industry when he released his simplified arithmometer, the first calculating machine strong enough and reliable enough to be used daily in an office environment. Charles Babbage started the design of the first automatic mechanical calculator, his Difference Engine, in 1822, which gave him the idea of the first programmable mechanical calculator, his Analytical Engine. He started developing this machine in 1834, and "in less than two years, he had sketched out many of the salient features of the modern computer".
"A crucial step was the adoption of a punched card system derived from the Jacquard loom", making it infinitely programmable. In 1843, during the translation of a French article on the Analytical Engine, Ada Lovelace wrote, in one of the many notes she included, an algorithm to compute the Bernoulli numbers, which is considered to be the first computer program. Around 1885, Herman Hollerith invented the tabulator, which used punched cards to process statistical information. In 1937, one hundred years after Babbage's impossible dream, Howard Aiken convinced IBM, which was making all kinds of punched card equipment and was also in the calculator business, to develop his giant programmable calculator, the ASCC/Harvard Mark I, based on Babbage's Analytical Engine, which itself used cards and a central computing unit. When the machine was finished, some hailed it as "Babbage's dream come true". During the 1940s, as new and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors.
As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. In 1945, IBM founded the Watson Scientific Computing Laboratory at Columbia University in New York City; the renovated fraternity house on Manhattan's West Side was IBM's first laboratory devoted to pure science. The lab is the forerunner of IBM's Research Division, which today operates research facilities around the world; the close relationship between IBM and the university was instrumental in the emergence of a new scientific discipline, with Columbia offering one of the first academic-credit courses in computer science in 1946. Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s; the world's first computer science degree program, the Cambridge Diploma in Computer Science, began at the University of Cambridge Computer Laboratory in 1953. The first computer science degree program in the United States was formed at Purdue University in 1962.
Since practical computers became available, many applications of computing have become distinct areas of study in their own rights. Although many initially believed it was impossible that computers themselves could be a scientific field of study, in the late fifties it became accepted among the greater academic population. It was the now well-known IBM brand that formed part of the computer science revolution during this time. IBM released the IBM 704 and the IBM 709 computers, which were used during the exploration period of such devices. "Still, working with the IBM [computer] was frustrating […] if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again". During the late 1950s, the computer science discipline was very much in its developmental stages, and such issues were commonplace. Time has since seen significant improvements in the effectiveness of computing technology. Modern society has seen a significant shift in the users of computer technology, from usage only by experts and professionals to a near-ubiquitous user base.
Computers were quite costly, and some degree of human aid was needed for efficient use—in part from professional computer operators. As computer adoption became more widespread and affordable, less human assistance was needed for common usage. Despite its short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society—in fact, along with electronics, it is
Crash (computing)
In computing, a crash, or system crash, occurs when a computer program such as a software application or an operating system stops functioning properly and exits. The program responsible may appear to hang until a crash reporting service reports the crash and any details relating to it. If the program is a critical part of the operating system, the entire system may crash or hang, resulting in a kernel panic or fatal system error. Most crashes are the result of executing invalid machine instructions. Typical causes include incorrect address values in the program counter, buffer overflow, overwriting a portion of the affected program code due to an earlier bug, accessing invalid memory addresses, using an illegal opcode or triggering an unhandled exception. The original software bug that started this chain of events is considered to be the cause of the crash, and is discovered through the process of debugging. The original bug can be far removed from the code that crashed. In earlier personal computers, attempting to write data to hardware addresses outside the system's main memory could cause hardware damage.
Some crashes are exploitable and let a malicious program or hacker execute arbitrary code, allowing the replication of viruses or the acquisition of data which would otherwise be inaccessible. An application typically crashes when it performs an operation that is not allowed by the operating system; the operating system then triggers an exception or signal in the application. Unix applications traditionally responded to the signal by dumping core. Most Windows and Unix GUI applications respond by displaying a dialogue box with the option to attach a debugger if one is installed. Some applications attempt to continue running instead of exiting. Typical errors that result in application crashes include: attempting to read or write memory that is not allocated for reading or writing by that application (a segmentation fault or, on x86, a general protection fault); attempting to execute privileged or invalid instructions; attempting to perform I/O operations on hardware devices to which it does not have permission; passing invalid arguments to system calls; attempting to access other system resources to which the application does not have permission; and attempting to execute machine instructions with bad arguments, such as divide by zero, operations on denormal numbers or NaN values, or memory access to unaligned addresses.
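The invalid-memory case can be shown with a deliberately contrived C sketch, not drawn from any real application: dereferencing a null pointer touches memory the process has not mapped, so on a typical Unix system the kernel raises an exception delivered as SIGSEGV and the process terminates, dumping core if core dumps are enabled.

#include <stdio.h>

int main(void)
{
    int *p = NULL;                      /* points to memory the process has not mapped */
    printf("about to write through a null pointer\n");
    *p = 42;                            /* invalid write: the OS delivers SIGSEGV */
    return 0;                           /* never reached */
}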
A "crash to desktop" is said to occur when a program unexpectedly quits, abruptly taking the user back to the desktop. The term is applied only to crashes where no error is displayed, hence all the user sees as a result of the crash is the desktop. Many times there is no apparent action that causes the crash. During normal function, the program may freeze for a short period of time and then close by itself; it may also show only a black screen and play the last few seconds of sound that was being played before it crashed to the desktop. Other times the crash may appear to be triggered by a certain action, such as loading an area. Crash-to-desktop bugs are considered problematic for users: since they display no error message, it can be difficult to track down the source of the problem if the times they occur and the actions taking place right before the crash do not appear to have any pattern or common ground. One way to track down the source of the problem for games is to run them in windowed mode. Windows Vista has a feature that can help track down the cause of a CTD problem when it occurs on any program.
Windows XP included a similar feature as well. Some computer programs, such as StepMania and the BBC's Bamzooki, crash to desktop if run in full-screen, but display the error in a separate window when the user has returned to the desktop. Crashes can also affect websites: the software running the web server behind a website may crash, rendering the site inaccessible entirely or providing only an error message instead of normal content. For example, if a site is using an SQL database for a script and the SQL database server crashes, PHP will display a connection error. An operating system crash occurs when a hardware exception occurs that cannot be handled. Operating system crashes can also occur when internal sanity-checking logic within the operating system detects that it has lost its internal self-consistency. Modern multi-tasking operating systems, such as Linux and macOS, usually remain unharmed when an application program crashes. Some operating systems, e.g. z/OS, have facilities for reliability and serviceability, and the OS can recover from the crash of a critical component, whether due to hardware failure, e.g. an uncorrectable ECC error, or to software failure, e.g. a reference to an unassigned page.
Depending on the application, the crash may contain the user's private information. Moreover, many software bugs which cause crashes are also exploitable for arbitrary code execution and other types of privilege escalation. For example, a stack buffer overflow can overwrite the return address of a subroutine with an invalid value, which will cause a segmentation fault when the subroutine returns; however, if an exploit overwrites the return address with a valid value, the code at that address will be executed. When crashes are collected in the field using a crash reporter, the next step for developers is to be able to reproduce them locally. For this, several techniques exist: STAR uses symbolic execution, and MuCrash mutates the test code of the application that has
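The stack-buffer-overflow scenario above can be made concrete with a deliberately unsafe C sketch; the function and buffer names are illustrative and the code is not taken from any real application. The unchecked strcpy can write past the end of the fixed-size buffer and overwrite the saved return address: a garbage value typically produces a segmentation fault when the function returns, while a carefully chosen value lets an attacker redirect execution.

#include <stdio.h>
#include <string.h>

/* Unsafe: copies an arbitrarily long, attacker-controlled string into a
 * 16-byte stack buffer with no bounds check.  Inputs longer than the buffer
 * overwrite adjacent stack memory, including the saved return address. */
static void greet(const char *name)
{
    char buf[16];
    strcpy(buf, name);                  /* classic stack buffer overflow */
    printf("hello, %s\n", buf);
}

int main(int argc, char **argv)
{
    if (argc > 1)
        greet(argv[1]);                 /* a long argv[1] corrupts the stack */
    return 0;
}

The usual mitigations are to bound the copy, for example with snprintf, and to rely on compiler-inserted stack canaries, which turn such overwrites into a controlled abort rather than arbitrary code execution.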