X Window System
The X Window System is a windowing system for bitmap displays, common on Unix-like operating systems. X provides the basic framework for a GUI environment: drawing and moving windows on the display device and interacting with a mouse and keyboard. X does not mandate the user interface – this is handled by individual programs; as such, the visual styling of X-based environments varies greatly. X originated at the Massachusetts Institute of Technology in 1984; the X protocol has been at version 11 since September 1987. The X.Org Foundation leads the X project, with the current reference implementation, X.Org Server, available as free and open-source software under the MIT License and similar permissive licenses. X is an architecture-independent system for remote graphical user interfaces and input device capabilities; each person using a networked terminal can interact with the display using any type of user input device. In its standard distribution it is a complete, albeit simple, display and interface solution which delivers a standard toolkit and protocol stack for building graphical user interfaces on most Unix-like operating systems and OpenVMS, and it has been ported to many other contemporary general-purpose operating systems.
X provides the basic framework, or primitives, for building such GUI environments: drawing and moving windows on the display and interacting with a mouse, keyboard or touchscreen. X does not mandate the user interface; programs may use X's graphical abilities with no user interface at all, so the visual styling of X-based environments varies greatly. Unlike most earlier display protocols, X was designed to be used over network connections rather than on an integral or attached display device. X features network transparency: an X program running on one computer on a network can display its user interface on an X server running on some other computer on the network. The X server is the provider of graphics resources and keyboard/mouse events to X clients; that is, the X server runs on the computer in front of a human user, while the X client applications run anywhere on the network and communicate with the user's computer to request the rendering of graphics content and to receive events from input devices, including keyboards and mice.
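As an illustration of this model, a minimal X client in C using Xlib might look like the sketch below. This is purely illustrative, assuming a system with libX11 installed (compile with cc demo.c -lX11); the client connects to whichever server the DISPLAY environment variable names, whether local (":0") or across the network ("remotehost:0").

    /* Minimal X client: connects to the X server named by the DISPLAY
     * environment variable, creates a window, and exits on a key press.
     * Every request and event below travels over the X protocol, so the
     * same program works with a local or a remote server. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <X11/Xlib.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);   /* NULL => use $DISPLAY */
        if (!dpy) {
            fprintf(stderr, "cannot connect to X server\n");
            return EXIT_FAILURE;
        }
        int scr = DefaultScreen(dpy);
        Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, scr),
                                         10, 10, 300, 200, 1,
                                         BlackPixel(dpy, scr),
                                         WhitePixel(dpy, scr));
        XSelectInput(dpy, win, ExposureMask | KeyPressMask);
        XMapWindow(dpy, win);

        for (;;) {
            XEvent ev;
            XNextEvent(dpy, &ev);            /* input events come back from the server */
            if (ev.type == Expose)
                XDrawString(dpy, win, DefaultGC(dpy, scr),
                            20, 30, "hello, X", 8);
            else if (ev.type == KeyPress)
                break;
        }
        XCloseDisplay(dpy);
        return EXIT_SUCCESS;
    }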
The fact that the term "server" is applied to the software in front of the user is often surprising to users accustomed to their programs being clients to services on remote computers. Here, rather than a remote database being the resource for a local app, the user's graphic display and input devices become resources made available by the local X server to both local and remotely hosted X client programs that need to share the user's graphics and input devices to communicate with the user. X's network protocol is based on X command primitives; this approach allows both 2D and 3D operations by an X client application running on a different computer to still be accelerated on the X server's display. For example, in classic OpenGL, display lists containing large numbers of objects could be constructed and stored in the X server by a remote X client program, then each rendered by sending a single glCallList across the network. X provides no native support for audio. X uses a client–server model: an X server communicates with various client programs.
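A display list in classic (fixed-function) OpenGL is built once and then replayed with a single call, as in the sketch below. This is a minimal illustration, not a complete remote-rendering setup: it assumes GLUT purely for window and context creation, which is a convenience and not part of X itself (compile with cc lists.c -lGL -lglut).

    /* Classic OpenGL display list: the geometry is uploaded once and
     * each frame replays it with a single glCallList, so a remote X
     * client sends one small request per frame instead of
     * re-transmitting every vertex. */
    #include <GL/gl.h>
    #include <GL/glut.h>

    static GLuint scene;

    static void display(void) {
        glClear(GL_COLOR_BUFFER_BIT);
        glCallList(scene);          /* one call replays the stored commands */
        glutSwapBuffers();
    }

    int main(int argc, char **argv) {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
        glutCreateWindow("display list demo");

        scene = glGenLists(1);      /* commands below are stored, not drawn */
        glNewList(scene, GL_COMPILE);
        glBegin(GL_TRIANGLES);
        glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.5f, -0.5f);
        glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.5f, -0.5f);
        glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.5f);
        glEnd();
        glEndList();

        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }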
The server accepts requests for graphical output (windows) and sends back user input (from keyboard, mouse, or touchscreen). The server may function as:
- an application displaying to a window of another display system
- a system program controlling the video output of a PC
- a dedicated piece of hardware
This client–server terminology – the user's terminal being the server and the applications being the clients – often confuses new X users, because the terms appear reversed. But X takes the perspective of the application rather than that of the end user: X provides display and I/O services to applications, so it is a server. The communication protocol between server and client operates network-transparently: the client and server may run on the same machine or on different ones, possibly with different architectures and operating systems. A client and server can communicate securely over the Internet by tunneling the connection over an encrypted network session. An X client itself may emulate an X server by providing display services to other clients; this is known as "X nesting". Open-source clients such as Xnest and Xephyr support such X nesting.
To use an X client application on a remote machine, the user may do the following:
- on the local machine, open a terminal window
- use ssh with the X forwarding argument to connect to the remote machine
- request local display/input service
The remote X client application will then make a connection to the user's local X server, providing display and input to the user. Alternatively, the local machine may run a small program that connects to the remote machine and starts the client application. Practical examples of remote clients include:
- administering a remote machine graphically
- using a client application to join with large numbers of other terminal users in collaborative workgroups
- running a computationally intensive simulation on a remote machine and displaying the results on a local desktop machine
Tektronix, Inc., widely known as Tek, is an American company best known for manufacturing test and measurement devices such as oscilloscopes, logic analyzers, and video and mobile test protocol equipment. Formerly an independent company, it is now a subsidiary of Fortive, a spinoff from Danaher Corporation. Several charities are, or were, associated with Tektronix, including the Tektronix Foundation and the M. J. Murdock Charitable Trust in Vancouver, Washington. The company traces its roots to the electronics revolution that followed World War II; it was first founded in December 1945 as Tekrad. However, the name was similar to that of a California company, so in 1946 the four partners – Howard Vollum; Jack Murdock and Miles Tippery, who had both served in the Coast Guard; and accountant Glenn McDowell – formed Tektronix, Inc., with each contributing an initial $2,600 for equal shares. Howard Vollum had graduated in 1936 from Reed College with a degree in physics and a keen interest in oscilloscopes, and he worked as a radio technician out of Jack Murdock's Murdock Radio and Appliance Company prior to the outbreak of war, during which he served in the Signal Corps.
Following the founding of Tektronix, Vollum invented the world's first triggered oscilloscope in 1946, a significant technological breakthrough. This oscilloscope, developed by Tektronix, was the model 511, a triggered-sweep oscilloscope. The first oscilloscope with a true time base was the Tektronix Model 513. The leading oscilloscope manufacturer at the time was DuMont Laboratories, which pioneered the frequency-synch sweep. Allen DuMont tried the 511 at an electronics show and was impressed, but when he saw the price of $795, about twice as much as his equivalent model, he told Howard Vollum at the show that they would have a hard time selling many. Tektronix was incorporated in 1946 with its headquarters at SE Foster Road and SE 59th Avenue in Portland, just six blocks from Murdock's first family home. In 1947 there were 12 employees. Four years later, in 1951, Tektronix had 250 employees. Murdock and Vollum were known humanitarians and sought to operate their business as one might run a large and caring family.
In 1984, Tektronix was named by authors Robert Levering, Milton Moskowitz, et al. as among The 100 Best Companies to Work for in America in their book of the same name. By 1950 the company had begun building a manufacturing facility in Washington County, Oregon, at Barnes Road and the Sunset Highway and, by 1956, had expanded the facility to 80,000 square feet; the company moved its headquarters to this site following an employee vote. A detailed story of Howard Vollum and Jack Murdock, along with the products that made Tektronix a leading maker of oscilloscopes, can be found at the Museum of Vintage Tektronix Equipment. In 1956, a large piece of property in nearby Beaverton became available; the company's employee retirement trust purchased the land and leased it back to the company. Construction began in 1957, and on May 1, 1959, Tektronix moved into its new Beaverton headquarters campus, on a 313-acre site which came to be called the Tektronix Industrial Park. In the late 1950s, Tektronix set a new trend in oscilloscope applications that would continue into the 1980s.
This was the introduction of the plug-in oscilloscope. Starting with the 530 and 540 series oscilloscopes, the operator could switch in different horizontal sweep or vertical input plug-ins; this allowed the oscilloscope to be an adaptable test instrument. Tektronix would add plug-ins to have the scope operate as a spectrum analyzer, waveform sampler, cable tester and transistor curve tracer. The 530 and 540 series also ushered in the delayed trigger, allowing triggering within a sweep rather than only at the beginning, which allows better waveform reproduction. In 1961, Tektronix sold its first portable oscilloscope, the model 321; this oscilloscope could run on rechargeable batteries, and it brought the oscilloscope into the transistor age. A year and a half later the model 321A came out, which was all-transistor; the 560 series introduced the rectangular CRT to oscilloscopes. In 1964 Tektronix made an oscilloscope breakthrough with the world's first mass-produced analog storage oscilloscope, the model 564. Hughes Aircraft Company is credited with the first working storage oscilloscope, but it was made in small numbers and is rare today.
In 1966, Tektronix brought out a line of high-frequency, full-function oscilloscopes called the 400 series. The oscilloscopes were packed with features for field-work applications; these scopes were outstanding performers, often preferred over laboratory bench models. The first models were a 16 MHz bandwidth model and the 453, a 50 MHz bandwidth model; the following year came the 454, a 150 MHz portable. These models put Tektronix well ahead of its competitors for years; the US military contracted with Tektronix for a "ruggedized" model 453 for field servicing. The 400 series models would continue to be popular choices into the 1980s, and the styling of the 400 series would be copied by Tektronix's competitors. 400 series oscilloscopes were still being used as of 2013. The company's IPO, when it publicly sold its first shares of stock, was on September 11, 1963. In 1974, the company acquired 256 acres in Wilsonville, Oregon, where it built a facility for its imaging group. By 1976, the company employed nearly 10,000 and was the state's largest employer.
Tektronix's 1956 expansion and, in 1
Oracle Corporation is an American multinational computer technology corporation headquartered in Redwood Shores, California. The company specializes in developing and marketing database software and technology, cloud engineered systems, and enterprise software products, particularly its own brands of database management systems. In 2018, Oracle was the third-largest software maker by revenue, after Microsoft and Alphabet. The company also develops and builds tools for database development and systems of middle-tier software, enterprise resource planning software, customer relationship management software, and supply chain management software. Larry Ellison co-founded Oracle Corporation in 1977 with Bob Miner and Ed Oates under the name Software Development Laboratories (SDL). Ellison took inspiration from the 1970 paper written by Edgar F. Codd on relational database management systems named "A Relational Model of Data for Large Shared Data Banks". He heard about the IBM System R database from an article in the IBM Research Journal provided by Oates.
Ellison wanted to make Oracle's product compatible with System R, but failed to do so, as IBM kept the error codes for their DBMS a secret. SDL changed its name to Relational Software, Inc. in 1979, and again to Oracle Systems Corporation in 1982, to align itself more closely with its flagship product, Oracle Database. At this stage Bob Miner served as the company's senior programmer. On March 12, 1986, the company had its initial public offering. In 1995, Oracle Systems Corporation changed its name to Oracle Corporation, officially named Oracle but sometimes referred to as Oracle Corporation, the name of the holding company. Part of Oracle Corporation's early success arose from using the C programming language to implement its products; this eased porting to different operating systems. Notable milestones include:
- 1979: offers the first commercial SQL RDBMS
- 1983: offers a VAX-mode database
- 1984: offers the first database with read-consistency
- 1986: offers a client–server DBMS
- 1987: introduces UNIX-based Oracle applications
- 1988: introduces PL/SQL
- 1992: offers full applications implementation methodology
- 1995: offers the first 64-bit RDBMS
- 1996: moves towards an open, standards-based, web-enabled architecture
- 1999: offers its first DBMS with XML support
- 2001: becomes the first to complete a 3-terabyte TPC-H world record
- 2002: offers the first database to pass 15 industry-standard security evaluations
- 2003: introduces what it calls "Enterprise Grid Computing" with Oracle10g
- 2005: releases its first free database, Oracle Database 10g Express Edition
- 2006: acquires Siebel Systems
- 2007: acquires Hyperion Solutions
- 2008: smart scans in software improve query response in the HP Oracle Database Machine / Exadata storage
- 2010: acquires Sun Microsystems
- 2013: begins use of Oracle 12c, capable of providing cloud services with Oracle Database
- 2014: acquires Micros Systems
- 2016: acquires NetSuite Inc.
Oracle ranked No. 82 in the 2018 Fortune 500 list of the largest United States corporations by total revenue. According to Bloomberg, Oracle's CEO-to-employee pay ratio is 1,205:1.
The CEO's compensation in 2017 was $108,295,023, while the median employee compensation was $89,887. Oracle designs and sells both software and hardware products, as well as offering services that complement them. Many of the products have been added to Oracle's portfolio through acquisitions. Oracle's E-Delivery service provides documentation. Oracle Database Release 10: In 2004, Oracle Corporation shipped release 10g as the latest version of Oracle Database. Release 11: Release 11g became the current Oracle Database version in 2007. Oracle Corporation released Oracle Database 11g Release 2 in September 2009; this version was available in four commercial editions (Enterprise Edition, Standard Edition, Standard Edition One, Personal Edition) and in one free edition, the Express Edition. The licensing of these editions carries various restrictions and obligations that were called complex by licensing expert Freirich Florea. The Enterprise Edition, the most expensive of the database editions, has the fewest restrictions, but its licensing remains complex.
Oracle Corporation constrains the Standard Edition and Standard Edition One with more licensing restrictions, in accordance with their lower price. Release 12: Release 12c became available on July 1, 2013. Oracle Corporation has acquired and developed the following additional database technologies:
- Berkeley DB, which offers embedded database processing
- Oracle Rdb, a relational database system running on OpenVMS platforms; Oracle acquired Rdb in 1994 from Digital Equipment Corporation and has since made many enhancements, with development continuing as of 2008
- TimesTen, which features in-memory database operations
- Oracle Essbase, which continues the Hyperion Essbase tradition of multi-dimensional database management
- MySQL, a relational database management system licensed under the GNU General Public License, developed by MySQL AB
- Oracle NoSQL Database, a scalable, distributed key-value NoSQL database
Oracle Fusion Middleware is a family of middleware
Configuration management (CM) is a systems engineering process for establishing and maintaining consistency of a product's performance and physical attributes with its requirements and operational information throughout its life. The CM process is used by military engineering organizations to manage changes throughout the system lifecycle of complex systems, such as weapon systems, military vehicles, and information systems. Outside the military, the CM process is used with IT service management as defined by ITIL, and with other domain models in civil engineering and other industrial engineering segments such as roads, canals and buildings. CM applied over the life cycle of a system provides visibility and control of its performance and physical attributes. CM verifies that a system performs as intended and is identified and documented in sufficient detail to support its projected life cycle. The CM process facilitates orderly management of system information and system changes for such beneficial purposes as revising capability.
The minimal cost of implementing CM is returned many fold in cost avoidance. The lack of CM, or its ineffectual implementation, can be expensive and can sometimes have catastrophic consequences such as failure of equipment or loss of life. CM emphasizes the functional relation between parts and systems for controlling system change. Changes to the system are proposed and implemented using a standardized, systematic approach that ensures consistency, and proposed changes are evaluated in terms of their anticipated impact on the entire system. CM verifies that changes are carried out as prescribed and that documentation of items and systems reflects their true configuration. A complete CM program includes provisions for the storing and updating of all system information on a component and system basis. A structured CM program ensures that documentation for items is accurate and consistent with the actual physical design of the item. In many cases, without CM, the documentation is not consistent with the item itself.
For this reason, engineers and management are forced to develop documentation reflecting the actual status of the item before they can proceed with a change. This reverse-engineering process is wasteful in terms of human and other resources and can be minimized or eliminated using CM. Configuration management originated in the United States Department of Defense in the 1950s as a technical management discipline for hardware material items, and it is now a standard practice in every industry. The CM process became its own technical discipline sometime in the late 1960s, when the DoD developed a series of military standards called the "480 series" that were subsequently issued in the 1970s. In 1991, the "480 series" was consolidated into a single standard known as MIL–STD–973, later replaced by MIL–HDBK–61 pursuant to a general DoD goal of reducing the number of military standards in favor of industry technical standards supported by standards-developing organizations. This marked the beginning of what has now evolved into the most distributed and accepted standard on CM, ANSI–EIA–649–1998.
Now adopted by numerous organizations and agencies, the CM discipline's concepts include systems engineering, Integrated Logistics Support, Capability Maturity Model Integration, ISO 9000, the PRINCE2 project management method, COBIT, the Information Technology Infrastructure Library, product lifecycle management, and application lifecycle management. Many of these functions and models have redefined CM from its traditional holistic approach to technical management; some treat CM as being similar to a librarian activity, and break out change control or change management as a separate or stand-alone discipline. CM is the practice of handling changes systematically so that a system maintains its integrity over time. CM implements the policies, procedures and tools that manage and evaluate proposed changes, track the status of changes, and maintain an inventory of system and support documents as the system changes. CM programs and plans provide technical and administrative direction to the development and implementation of the procedures, services, tools and resources required to develop and support a complex system.
During system development, CM allows program management to track requirements throughout the life cycle, through acceptance and into operations and maintenance. As changes occur in the requirements and design, they must be approved and documented, creating an accurate record of the system status. Ideally the CM process is applied throughout the system lifecycle. CM is often confused with asset management (AM), which inventories the assets on hand; the key difference between CM and AM is that the former does not manage the financial accounting aspects of assets but focuses on the service that the system supports. The CM process for both hardware and software configuration items comprises five distinct disciplines, as established in MIL–HDBK–61A and in ANSI/EIA-649. These disciplines are carried out as policies and procedures for establishing baselines and for performing a standard change-management process. The IEEE 12207 process, IEEE 12207.2, also has these activities and adds "Release management and delivery".
The five disciplines are: CM planning and management; configuration identification; configuration control; configuration status accounting; and configuration verification and audit.
Computer science is the study of processes that interact with data and that can be represented as data in the form of programs. It enables the use of algorithms to manipulate and communicate digital information. A computer scientist studies the theory of computation and the practice of designing software systems. Its fields can be divided into theoretical and practical disciplines: computational complexity theory is abstract, while computer graphics emphasizes real-world applications. Programming language theory considers approaches to the description of computational processes, while computer programming itself involves the use of programming languages and complex systems. Human–computer interaction considers the challenges in making computers useful and accessible. The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks such as the abacus have existed since antiquity, aiding in computations such as multiplication and division.
Algorithms for performing computations have existed since antiquity, even before the development of sophisticated computing equipment. Wilhelm Schickard designed and constructed the first working mechanical calculator in 1623. In 1673, Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped Reckoner. He may be considered the first computer scientist and information theorist for, among other reasons, documenting the binary number system. In 1820, Thomas de Colmar launched the mechanical calculator industry when he released his simplified arithmometer, the first calculating machine strong enough and reliable enough to be used daily in an office environment. Charles Babbage started the design of the first automatic mechanical calculator, his Difference Engine, in 1822, which gave him the idea of the first programmable mechanical calculator, his Analytical Engine. He started developing this machine in 1834, and "in less than two years, he had sketched out many of the salient features of the modern computer".
"A crucial step was the adoption of a punched card system derived from the Jacquard loom" making it infinitely programmable. In 1843, during the translation of a French article on the Analytical Engine, Ada Lovelace wrote, in one of the many notes she included, an algorithm to compute the Bernoulli numbers, considered to be the first computer program. Around 1885, Herman Hollerith invented the tabulator, which used punched cards to process statistical information. In 1937, one hundred years after Babbage's impossible dream, Howard Aiken convinced IBM, making all kinds of punched card equipment and was in the calculator business to develop his giant programmable calculator, the ASCC/Harvard Mark I, based on Babbage's Analytical Engine, which itself used cards and a central computing unit; when the machine was finished, some hailed it as "Babbage's dream come true". During the 1940s, as new and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors.
As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. In 1945, IBM founded the Watson Scientific Computing Laboratory at Columbia University in New York City; the renovated fraternity house on Manhattan's West Side was IBM's first laboratory devoted to pure science. The lab is the forerunner of IBM's Research Division, which today operates research facilities around the world; the close relationship between IBM and the university was instrumental in the emergence of a new scientific discipline, with Columbia offering one of the first academic-credit courses in computer science in 1946. Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s; the world's first computer science degree program, the Cambridge Diploma in Computer Science, began at the University of Cambridge Computer Laboratory in 1953. The first computer science degree program in the United States was formed at Purdue University in 1962.
Since practical computers became available, many applications of computing have become distinct areas of study in their own right. Although many initially believed it was impossible that computers themselves could be a scientific field of study, in the late fifties this view became accepted among the greater academic population. It is the now well-known IBM brand that formed part of the computer science revolution during this time. IBM released the IBM 704 and the IBM 709 computers, which were used during the exploration period of such devices. "Still, working with the IBM was frustrating... if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again". During the late 1950s, the computer science discipline was very much in its developmental stages, and such issues were commonplace. Time has seen significant improvements in the effectiveness of computing technology. Modern society has seen a significant shift in the users of computer technology, from usage only by experts and professionals to a near-ubiquitous user base.
Computers were quite costly, and some degree of human aid was needed for efficient use, in part from professional computer operators. As computer adoption became more widespread and affordable, less human assistance was needed for common usage. Despite its short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society; in fact, along with electronics, it is a founding science of the current epoch of human history, the Information Age.
O'Reilly Media is an American media company established by Tim O'Reilly that publishes books and Web sites and produces conferences on computer technology topics. Their distinctive brand features a woodcut of an animal on many of their book covers. The company began in 1978 as a private consulting firm doing technical writing, based in the Cambridge, Massachusetts area. In 1984, it began to retain publishing rights on manuals created for Unix vendors. A few 70-page "Nutshell Handbooks" were well received, but the focus remained on the consulting business until 1988. After a conference displaying O'Reilly's preliminary Xlib manuals attracted significant attention, the company began increasing production of manuals and books. The original cover art consisted of animal designs developed by Edie Freedman because she thought that Unix program names sounded like "weird animals". In 1993 O'Reilly Media created the first web portal when it launched one of the first Web-based resources, Global Network Navigator.
GNN was sold to AOL in one of the first large transactions of the dot-com bubble; it was the first site on the World Wide Web to feature paid advertising. Although O'Reilly Media got its start in publishing, two decades after its genesis the company expanded into event production. In 1997, O'Reilly launched The Perl Conference to cross-promote its books on the Perl programming language. Many of the company's other software bestsellers were on topics that were off the radar of the commercial software industry. In 1998, O'Reilly invited many of the leaders of software projects to a meeting. Initially called the freeware summit, the meeting became known as the Open Source Summit. The O'Reilly Open Source Convention is now one of O'Reilly's flagship events. Other key events include the Strata Conference on big data, the Velocity Conference on Web performance and operations, and FOO Camp. Past events of note include the Web 2.0 Summit. Overall, O'Reilly describes its business not as publishing or conferences, but as "changing the world by spreading the knowledge of innovators." Today, the company offers over one dozen conferences:
- Strata + Hadoop World
- OSCON
- Fluent
- Velocity
- The Next:Economy Summit
- The Next:Money Summit
- The Solid Conference
- The O'Reilly Software Architecture Conference
- The O'Reilly Design Conference
- O'Reilly Emerging Technology Conference
- Tools of Change Conference
- Web 2.0 Summit
- Web 2.0 Expo
- MySQL Conference and Expo
- RailsConf
- Where 2.0
- Money:Tech
- Gov 2.0 Expo and Gov 2.0 Summit
The O'Reilly School of Technology was discontinued as of January 6, 2016; new enrollments are no longer accepted.
In the late 1990s, O'Reilly founded the O'Reilly Network, which grew to include sites such as:
- LinuxDevCenter.com
- MacDevCenter.com
- WindowsDevCenter.com
- ONLamp.com
- O'Reilly Radar
In 2008 the company revised its online model and stopped publishing on several of its sites. The company also produced dev2dev in association with BEA, and java.net in association with Sun Microsystems and CollabNet. In 2001, O'Reilly launched Safari Books Online, a subscription-based service providing access to ebooks, as a joint venture with the Pearson Technology Group. Safari Books Online includes books and video from Adobe Press, Alpha Books, Cisco Press, FT Press, Microsoft Press, New Riders Publishing, O'Reilly, Peachpit Press, Prentice Hall, Prentice Hall PTR, Que and Sams Publishing. In 2014, O'Reilly Media acquired Pearson's stake, making Safari Books Online a wholly owned subsidiary of O'Reilly Media. O'Reilly redesigned the site and had some success in its attempt to expand beyond Safari's core B2C market into the B2B enterprise market.
In 2017, O'Reilly Media announced that it was no longer selling books directly, including ebooks; instead, customers were encouraged to sign up for Safari. In 2003, after the dot-com bust, O'Reilly's corporate goal was to reignite enthusiasm in the computer industry. To do this, Dale Dougherty and Tim O'Reilly decided to use the term "Web 2.0", coined in January 1999 by Darcy DiNucci. The term was used for the Web 2.0 Summit, run by O'Reilly and TechWeb. CMP registered Web 2.0 as a service mark "for arranging and conducting live events, namely trade shows, business conferences and educational conferences in various fields of computers and information technology." Web 2.0 framed what distinguished the companies that survived the dot-com bust from those that died, and identified key drivers of future success, including what is now called "cloud computing", big data, and new approaches to iterative, data-driven software development. In May 2006 CMP Media learned of an impending event called the "Web 2.0 Half day conference".
Concerned over their obligation to take reasonable means to enforce their trade and service marks, CMP sent a cease-and-desist letter to the non-profit Irish organizers of the event. This attempt to restrict the use of the term through legal mechanisms was criticized by some. The legal issue was resolved by O'Reilly apologizing for the early and aggressive involvement of attorneys, rather than simply calling the organizers, and allowing them to use the service mark for this single event. In January 2005 the compan