Computer network

A computer network is a digital telecommunications network that allows nodes to share resources. In computer networks, computing devices exchange data with each other over connections between nodes; these data links are established over cable media such as copper wires or optical fibre, or over wireless media such as Wi-Fi. Network devices that originate and terminate the data are called network nodes. Nodes are identified by network addresses and can include hosts such as personal computers and servers, as well as networking hardware such as routers and switches. Two such devices can be said to be networked together when one device is able to exchange information with the other, whether or not they have a direct connection to each other. In most cases, application-specific communications protocols are layered over other, more general communications protocols; this formidable collection of information technology requires skilled network management to keep it all running reliably. Computer networks support an enormous number of applications and services, such as access to the World Wide Web, digital video and audio, shared use of application and storage servers and fax machines, and use of email and instant messaging applications, among many others.
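The layering of an application-specific protocol over a more general one can be sketched with a toy example: a one-line "PING/PONG" protocol (hypothetical, for illustration only) carried over TCP between two nodes on the loopback interface, using only Python's standard socket and threading modules.

```python
# A toy one-line "PING/PONG" application protocol (hypothetical, for
# illustration) layered over TCP, a general transport protocol, using
# only Python's standard socket and threading modules on loopback.
import socket
import threading

ready = threading.Event()
port = [None]                      # filled in once the server has bound

def server():
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))     # port 0: let the OS pick a free port
    port[0] = srv.getsockname()[1]
    srv.listen(1)
    ready.set()
    conn, _ = srv.accept()
    with conn:
        request = conn.recv(1024).decode().strip()
        # application-layer rule: answer PING with PONG
        conn.sendall(b"PONG\n" if request == "PING" else b"ERR\n")
    srv.close()

t = threading.Thread(target=server)
t.start()
ready.wait()                        # wait until the server node is listening
with socket.create_connection(("127.0.0.1", port[0])) as cli:
    cli.sendall(b"PING\n")          # our toy protocol, carried over TCP
    reply = cli.recv(1024).decode().strip()
t.join()
print(reply)  # PONG
```

Real application protocols such as HTTP follow the same pattern: the transport layer moves bytes between nodes, while the application layer gives those bytes meaning.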
Computer networks differ in the transmission medium used to carry their signals, the communications protocols that organize network traffic, the network's size, the traffic-control mechanism, and the organizational intent. The best-known computer network is the Internet. The chronology of significant computer-network developments includes the following. In the late 1950s, early networks of computers included the U.S. military radar system Semi-Automatic Ground Environment (SAGE). In 1959, Anatolii Ivanovich Kitov proposed to the Central Committee of the Communist Party of the Soviet Union a detailed plan for the reorganisation of the control of the Soviet armed forces and of the Soviet economy on the basis of a network of computing centres, the OGAS. In 1960, the commercial airline reservation system Semi-Automatic Business Research Environment (SABRE) went online with two connected mainframes. In 1963, J. C. R. Licklider sent a memorandum to office colleagues discussing the concept of the "Intergalactic Computer Network", a computer network intended to allow general communications among computer users.
In 1964, researchers at Dartmouth College developed the Dartmouth Time Sharing System for distributed users of large computer systems. The same year, at the Massachusetts Institute of Technology, a research group supported by General Electric and Bell Labs used a computer to route and manage telephone connections. Throughout the 1960s, Paul Baran and Donald Davies independently developed the concept of packet switching to transfer information between computers over a network. Davies pioneered the implementation of the concept with the NPL network, a local area network at the National Physical Laboratory using a line speed of 768 kbit/s. In 1965, Western Electric introduced the first widely used telephone switch that implemented true computer control. In 1966, Thomas Marill and Lawrence G. Roberts published a paper on an experimental wide area network for computer time sharing. In 1969, the first four nodes of the ARPANET were connected using 50 kbit/s circuits between the University of California at Los Angeles, the Stanford Research Institute, the University of California at Santa Barbara, and the University of Utah.
Leonard Kleinrock carried out theoretical work to model the performance of packet-switched networks, which underpinned the development of the ARPANET. His theoretical work on hierarchical routing in the late 1970s with his student Farouk Kamoun remains critical to the operation of the Internet today. In 1972, commercial services using X.25 were deployed, later used as an underlying infrastructure for expanding TCP/IP networks. In 1973, the French CYCLADES network was the first to make the hosts responsible for the reliable delivery of data, rather than this being a centralized service of the network itself. Also in 1973, Robert Metcalfe wrote a formal memo at Xerox PARC describing Ethernet, a networking system based on the Aloha network developed in the 1960s and early 1970s by Norman Abramson and colleagues at the University of Hawaii. In July 1976, Robert Metcalfe and David Boggs published their paper "Ethernet: Distributed Packet Switching for Local Computer Networks" and collaborated on several patents received in 1977 and 1978.
In 1976, John Murphy of Datapoint Corporation created ARCNET, a token-passing network first used to share storage devices. In 1979, Robert Metcalfe pursued making Ethernet an open standard. In 1995, the transmission speed capacity for Ethernet increased from 10 Mbit/s to 100 Mbit/s. By 1998, Ethernet supported transmission speeds of 1 Gbit/s. Subsequently, higher speeds of up to 400 Gbit/s were added; the ability of Ethernet to scale is a contributing factor to its continued use. Computer networking may be considered a branch of electrical engineering, electronics engineering, telecommunications, computer science, information technology or computer engineering, since it relies upon the theoretical and practical application of the related disciplines. A computer network facilitates interpersonal communications, allowing users to communicate efficiently via various means: email, instant messaging, online chat, video telephone calls and video conferencing. A network allows sharing of computing resources.
Users may access and use resources provided by devices on the network, such as printing a document on a shared network printer or using a shared storage device. A network allows sharing of files, data and other types of information, giving authorized users the ability to access information stored on other computers on the network.
Dell

Dell is an American multinational computer technology company based in Round Rock, Texas, that develops, sells and supports computers and related products and services. Named after its founder, Michael Dell, the company is one of the largest technology corporations in the world, employing more than 145,000 people in the U.S. and around the world. Dell sells personal computers, data storage devices, network switches, computer peripherals, HDTVs, printers, MP3 players and electronics built by other manufacturers. The company is well known for its innovations in supply chain management and electronic commerce, particularly its direct-sales model and its "build-to-order" or "configure-to-order" approach to manufacturing, delivering individual PCs configured to customer specifications. Dell was a pure hardware vendor for much of its existence, but with the acquisition of Perot Systems in 2009, Dell entered the market for IT services; the company has since made additional acquisitions in storage and networking systems, with the aim of expanding its portfolio from offering computers only to delivering complete solutions for enterprise customers.
Dell was listed at number 51 in the Fortune 500 list until 2014. After going private in 2013, the newly confidential nature of its financial information prevented the company from being ranked by Fortune. In 2015, it was the third-largest PC vendor in the world after Lenovo and HP, and the largest shipper of PC monitors worldwide. Dell is the sixth-largest company in Texas by total revenue, according to Fortune magazine; it is the second-largest non-oil company in Texas, behind AT&T, and the largest company in the Greater Austin area. It was a publicly traded company, as well as a component of the NASDAQ-100 and S&P 500, until it was taken private in a leveraged buyout that closed on October 30, 2013. In 2015, Dell announced its acquisition of the enterprise technology firm EMC Corporation. Dell traces its origins to 1984, when Michael Dell created Dell Computer Corporation, which at the time did business as PC's Limited, while a student at the University of Texas at Austin. The dorm-room-headquartered company sold IBM PC-compatible computers built from stock components.
Dell dropped out of school to focus full-time on his fledgling business after getting $1,000 in expansion capital from his family. In 1985, the company produced the first computer of its own design, the Turbo PC, which sold for $795. PC's Limited advertised its systems in national computer magazines for sale directly to consumers and custom-assembled each ordered unit according to a selection of options; the company grossed more than $73 million in its first year of operation. In 1986, Michael Dell brought in Lee Walker, a 51-year-old venture capitalist, as president and chief operating officer to serve as Dell's mentor and implement Dell's ideas for growing the company. Walker was instrumental in recruiting members to the board of directors when the company went public in 1988. Walker retired in 1990 for health reasons, and Michael Dell hired Morton Meyerson, former CEO and president of Electronic Data Systems, to transform the company from a fast-growing medium-sized firm into a billion-dollar enterprise.
The company dropped the PC's Limited name in 1987 to become Dell Computer Corporation and began expanding globally. In June 1988, Dell's market capitalization grew from $30 million to $80 million following its June 22 initial public offering of 3.5 million shares at $8.50 a share. In 1992, Fortune magazine included Dell Computer Corporation in its list of the world's 500 largest companies, making Michael Dell the youngest-ever CEO of a Fortune 500 company. In 1993, to complement its own direct sales channel, Dell planned to sell PCs at big-box retail outlets such as Wal-Mart, which would have brought in an additional $125 million in annual revenue. Bain consultant Kevin Rollins persuaded Michael Dell to pull out of these deals, believing they would be money losers in the long run. Margins at retail were thin at best, and Dell left the reseller channel in 1994. Rollins would soon join Dell full-time and eventually become the company's president and CEO. Dell did not emphasize the consumer market, due to the higher costs and unacceptably low profit margins in selling to individuals and households.
While the industry's average selling price to individuals was going down, Dell's was going up, as second- and third-time computer buyers who wanted powerful computers with multiple features and did not need much technical support were choosing Dell. Dell found an opportunity among PC-savvy individuals who liked the convenience of buying direct, customizing their PC to their means, and having it delivered in days. In early 1997, Dell created an internal sales and marketing group dedicated to serving the home market and introduced a product line designed for individual users. From 1997 to 2004, Dell enjoyed steady growth, gaining market share from competitors even during industry slumps. During the same period, rival PC vendors such as Compaq, Gateway, IBM, Packard Bell and AST Research struggled, left the market, or were bought out. Dell surpassed Compaq to become the largest PC manufacturer in 1999. Operating costs made up only 10 percent of Dell's $35 billion in revenue in 2002, compared with 21 percent of revenue at Hewlett-Packard, 25 percent at Gateway and 46 percent at Cisco.
In 2002, when Compaq merged with Hewlett-Packard, the newly combined company took the top spot but struggled, and Dell soon regained its lead. Dell grew the fastest in the early 2000s. Dell attained an
Computer terminal

A computer terminal is an electronic or electromechanical hardware device used for entering data into, and displaying or printing data from, a computer or a computing system. The teletype was an example of an early hardcopy terminal and predated the use of a computer screen by decades. The acronym CRT (cathode-ray tube), which once referred to a computer terminal, has come to refer to a type of screen of a personal computer. Early terminals were inexpensive devices but slow compared to punched cards or paper tape for input; as the technology improved and video displays were introduced, terminals pushed these older forms of interaction from the industry. A related development was timesharing systems, which evolved in parallel and made up for any inefficiencies in the user's typing ability with the ability to support multiple users on the same machine, each at their own terminal. The function of a terminal is typically confined to display and input of data; a terminal that depends on the host computer for its processing power is called a "dumb terminal" or a thin client.
A personal computer can run terminal emulator software that replicates the function of a terminal, sometimes allowing concurrent use of local programs and access to a distant terminal host system. The terminal of the first working programmable, automatic, digital Turing-complete computer, the Z3, had a keyboard and a row of lamps to show results. Early user terminals connected to computers were electromechanical teleprinters/teletypewriters, such as the Teletype Model 33 ASR used for telegraphy, or the Friden Flexowriter. Keyboard/printer terminals that came later included the IBM 2741 and the DECwriter LA30. Respective top speeds of teletypes, the IBM 2741 and the LA30 were 10, 15 and 30 characters per second. Although at that time "paper was king", the speed of interaction was limited. Early video computer displays were sometimes nicknamed "glass TTYs" or "visual display units"; they used no CPU, instead relying on individual logic gates or primitive LSI chips, and they became popular input-output devices on many different types of computer system once several suppliers gravitated to a set of common standards: the ASCII character set (though early/economy models supported only capital letters), RS-232 serial ports, and 24 lines of 80 characters of text.
Models sometimes had two character-width settings, some type of cursor that could be positioned, and an implementation of at least three control codes (carriage return, line feed and bell), though often many more, such as escape sequences providing underlining, dim or reverse-video character highlighting, clearing the display, and positioning the cursor. The Datapoint 3300 from Computer Terminal Corporation was announced in 1967 and shipped in 1969, making it one of the earliest stand-alone display-based terminals. It kept the cost of display memory down by using a digital shift-register design and only 72 columns rather than the more common choice of 80. Starting with the Datapoint 3300, by the late 1970s and early 1980s there were dozens of manufacturers of terminals, including Lear Siegler, ADDS, Data General, DEC, Hazeltine Corporation, Heath/Zenith, Hewlett-Packard, IBM, Volker-Craig and Wyse, many of which had incompatible command sequences. The great variations in the control codes between makers gave rise to software that identified and grouped terminal types so the system software would display input forms using the appropriate control codes.
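As a concrete illustration of such control codes and escape sequences, the snippet below builds ANSI/VT100-style sequences in Python. These are one later, widely adopted convention; terminals of the era often used different, incompatible codes, which is exactly the variation described above.

```python
# ANSI/VT100-style control codes and escape sequences (illustrative;
# many real terminals of the era used different, incompatible codes).
ESC = "\x1b"          # the Escape control character that starts a sequence
BELL = "\a"           # control code 7: ring the terminal bell
CR, LF = "\r", "\n"   # carriage return and line feed

underlined = ESC + "[4m" + "warning" + ESC + "[0m"       # underline on, text, attributes off
reversed_video = ESC + "[7m" + "selected" + ESC + "[0m"  # reverse-video highlighting
clear_screen = ESC + "[2J"                               # clear the whole display
move_cursor = ESC + "[10;20H"                            # position cursor at row 10, column 20

# Writing these strings to a compatible terminal performs the action;
# here we only show the byte-level structure of one of them.
print(repr(underlined))  # '\x1b[4mwarning\x1b[0m'
```

The software layers mentioned above (terminal databases such as termcap and terminfo) existed precisely to map abstract operations like "underline" or "clear screen" onto each maker's particular byte sequences.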
The great majority of terminals were monochrome, with manufacturers variously offering green, white or amber, and sometimes blue, screen phosphors. Terminals with modest color capability were available but not widely used. An "intelligent" terminal does its own processing, usually implying that a microprocessor is built in, but not all terminals with microprocessors did any real processing of input: the main computer to which a terminal was attached would have to respond quickly to each keystroke. The term "intelligent" in this context dates from 1969. Notable examples include the IBM 2250 and IBM 2260, predecessors to the IBM 3270 and introduced with System/360 in 1964. Most terminals were connected to minicomputers or mainframe computers and had a green or amber screen. Terminals communicate wi
Chromebox

A Chromebox is a small-form-factor PC running Google's Chrome OS operating system. The device debuted in May 2012. Chromeboxes, like other Chrome OS devices including Chromebook laptops, primarily support web applications, thereby relying on an Internet connection for both software functionality and data storage; that connection, via a local area network, can be wireless or through an Ethernet port. The machines are classed as small-form-factor PCs and feature a power switch and a set of connections to support a keyboard, a pointing device and one or more monitors. Solid-state drives are used for storage, and only wireless printers are supported. The first Chromebox, released by Samsung on May 29, 2012, ran a dual-core Intel Celeron 867 processor at 1.3 GHz and featured six USB 2.0 ports and two DisplayPort++ slots compatible with HDMI, DVI and VGA. In February 2014, Google bundled an Intel Core i7 Chromebox with a business video conferencing package: a 1080p high-definition camera module, an external microphone/speaker and a remote control.
This Chromebox for Meetings system retailed for $999 plus a $250 annual management fee (waived the first year), a cost thousands of dollars less than other unified videoconferencing systems, including those from Cisco and Polycom. The system employed a Google Hangouts-like interface for up to 15 participants, a dedicated URL for sharing screens, and management accounts for scheduling meetings. An updated system announced in November 2017 featured a 4K camera and a machine-learning feature that automatically identifies and frames participants. In March 2014, Asus established a new price at the low end of the Chromebox market with a compact, 1.32-pound model that retailed at $179 and featured a Celeron CPU and four USB 3.0 ports. Yahoo Tech columnist David Pogue called the Asus device among the smallest, "least-expensive desktop computers sold", likening it to a Smart car: "You won’t be hauling lumber from Home Depot in it, but it’s a terrific deal—and most days, it’ll get you where you want to go."
In May, Asus released a faster model with an Intel Core i3 processor. Hewlett-Packard entered the market in June with a Chromebox powered by an Intel Celeron processor, optionally bundling a keyboard and mouse. In August, Acer introduced two models that could stand vertically and provided some business-oriented features, including encryption and fast deletion of local data. In September, Dell entered the market with an entry-level machine, as well as Dell's implementation of the Google video conferencing system. In August 2015, AOpen announced a family of Chromeboxes designed principally for driving the content of digital commercial signage; the models were ruggedized for on-site operation. The capability to run Android apps on Chrome OS devices, introduced by Google in 2016 and realized by certain Chromebooks in 2017, seemed to bypass Chromeboxes until a cluster of new Chromebox offerings appeared in 2018, including models from Acer and HP. Oregon-based CTL, a maker of Chromebooks since 2014, launched its first Chromebox in March 2018.
Graphical user interface
The graphical user interface (GUI) is a form of user interface that allows users to interact with electronic devices through graphical icons and visual indicators such as secondary notation, instead of text-based user interfaces, typed command labels or text navigation. GUIs were introduced in reaction to the perceived steep learning curve of command-line interfaces, which require commands to be typed on a computer keyboard; the actions in a GUI are performed through direct manipulation of the graphical elements. Beyond computers, GUIs are used in many handheld mobile devices such as MP3 players, portable media players, gaming devices, and smaller household and industrial controls. The term GUI tends not to be applied to other, lower-display-resolution types of interfaces, such as video games, or to interfaces that do not use flat screens, like volumetric displays, because the term is restricted to the scope of two-dimensional display screens able to describe generic information, in the tradition of the computer science research at the Xerox Palo Alto Research Center.
Designing the visual composition and temporal behavior of a GUI is an important part of software application programming in the area of human–computer interaction. Its goal is to enhance the efficiency and ease of use of the underlying logical design of a stored program, a design discipline named usability. Methods of user-centered design are used to ensure that the visual language introduced in the design is well tailored to the tasks. The visible graphical interface features of an application are sometimes referred to as chrome or GUI. Users interact with information by manipulating visual widgets that allow for interactions appropriate to the kind of data they hold; the widgets of a well-designed interface are selected to support the actions necessary to achieve the goals of users. A model–view–controller architecture allows flexible structures in which the interface is independent of, and indirectly linked to, application functions, so the GUI can be customized easily; this allows users to select or design a different skin at will, and eases the designer's work to change the interface as user needs evolve.
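The separation a model–view–controller structure provides can be sketched in a few lines. All class names here are hypothetical, not from any real GUI toolkit; the point is only that the view is linked to the model indirectly, so it can be swapped like a skin.

```python
# Minimal model-view-controller sketch (class names are illustrative).
# The model knows nothing about the view or controller; they are linked
# indirectly through callbacks, so the "skin" (view) can be swapped freely.
class CounterModel:
    """Application state: a single counter."""
    def __init__(self):
        self.value = 0
        self.listeners = []          # callbacks notified on change

    def increment(self):
        self.value += 1
        for notify in self.listeners:
            notify(self.value)

class CounterView:
    """Presentation only: renders the model's state as text."""
    def __init__(self):
        self.rendered = ""

    def render(self, value):
        self.rendered = f"Count: {value}"

class CounterController:
    """Translates user input into model operations."""
    def __init__(self, model, view):
        self.model = model
        model.listeners.append(view.render)   # indirect model -> view link

    def on_click(self):                       # e.g. a button press arrives here
        self.model.increment()

model, view = CounterModel(), CounterView()
controller = CounterController(model, view)
controller.on_click()
controller.on_click()
print(view.rendered)  # Count: 2
```

Replacing CounterView with a differently styled view requires no change to the model, which is exactly the flexibility described above.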
Good user interface design relates more to the user and less to the system architecture. Large widgets, such as windows, provide a frame or container for the main presentation content, such as a web page, email message or drawing. Smaller ones act as user-input tools. A GUI may be designed for the requirements of a vertical market as an application-specific graphical user interface. Examples include automated teller machines, point-of-sale touchscreens at restaurants, self-service checkouts used in retail stores, airline self-ticketing and check-in, information kiosks in public spaces such as train stations or museums, and monitors or control screens in embedded industrial applications that employ a real-time operating system. By the 1980s, cell phones and handheld game systems also employed application-specific touchscreen GUIs. Newer automobiles use GUIs in their navigation systems and multimedia centers, or in navigation/multimedia center combinations. A GUI uses a combination of technologies and devices to provide a platform that users can interact with, for the tasks of gathering and producing information.
A series of elements conforming to a visual language have evolved to represent information stored in computers. This makes it easier for people with few computer skills to use computer software. The most common combination of such elements in GUIs is the windows, icons, menus, pointer (WIMP) paradigm in personal computers. The WIMP style of interaction uses a virtual input device to represent the position of a pointing device, most commonly a mouse, and presents information organized in windows and represented with icons. Available commands are compiled together in menus, and actions are performed by making gestures with the pointing device. A window manager facilitates the interactions between windows and the windowing system; the windowing system handles hardware devices such as pointing devices and graphics hardware, as well as the positioning of the pointer. In personal computers, all these elements are modeled through a desktop metaphor to produce a simulation called a desktop environment, in which the display represents a desktop on which documents and folders of documents can be placed.
Window managers and other software combine to simulate the desktop environment with varying degrees of realism. Smaller mobile devices such as personal digital assistants and smartphones use the WIMP elements with different unifying metaphors, due to constraints in space and available input devices. Applications for which WIMP is not well suited may use newer interaction techniques, collectively termed post-WIMP user interfaces. As of 2011, some touchscreen-based operating systems such as Apple's iOS and Android use the class of GUIs named post-WIMP. These support styles of interaction using more than one finger in contact with a display, which allows actions such as pinching and rotating that are unsupported by a single pointer and mouse. Human interface devices for efficient interaction with a GUI include a computer keyboard, used together with keyboard shortcuts, and pointing devices for cursor control: mouse, pointing stick and trackball, as well as virtual keyboards and head-up displays. There are also actions performed by programs that affect the GUI.
For example, there are components like inotify or D-Bus to facilitate communication between computer programs. Ivan Sutherland developed Sketchpad in 1963, held as the first graphical co
Gigabyte Technology

GIGA-BYTE Technology Co., Ltd. is a Taiwanese manufacturer and distributor of computer hardware. Gigabyte's principal business is motherboards, with shipments of 4.8 million motherboards in Q1 2015, while Asus shipped around 4.5 million motherboards in the same quarter. Gigabyte also manufactures custom graphics cards and laptop computers. In 2010, Gigabyte was ranked 17th in "Taiwan's Top 20 Global Brands" by the Taiwan External Trade Development Council. The company is publicly held and traded on the Taiwan Stock Exchange under stock ID number TWSE: 2376. Gigabyte Technology was established in 1986 by Pei-Cheng Yeh. Gigabyte's components are used by Alienware, Falcon Northwest, CybertronPC and Origin PC, and in Technology Direct desktops with up to a 5-year warranty; they may also be purchased at retail by those who wish to build or upgrade a PC system themselves. One of Gigabyte's key advertised features on its motherboards is its "Ultra Durable" construction, advertised with "all solid capacitors". On 8 August 2006, Gigabyte announced a joint venture with Asus.
Gigabyte developed the world's first software-controlled power supply in July 2007, and in April 2010 it introduced a method for charging the iPad and iPhone from the computer. Gigabyte launched the world's first Z68 motherboard on 31 May 2011, with an on-board mSATA connection for Intel SSDs and Smart Response Technology. On 2 April 2012, Gigabyte released the world's first motherboard with 60A ICs from International Rectifier. Gigabyte designs and manufactures motherboards for both AMD and Intel platforms, and produces graphics cards and notebooks in partnership with AMD and Nvidia, including Nvidia's Turing chipsets and AMD's Vega and Polaris chipsets. Other products of Gigabyte have included desktop computers, tablet computers, mobile phones, personal digital assistants, server motherboards, server racks, networking equipment, optical drives, computer monitors, keyboards, cooling components, power supplies and computer cases. Aorus is a registered sub-brand trademark of Gigabyte belonging to Aorus Pte. Ltd., a company registered in Singapore.
Aorus specializes in gaming-related products such as motherboards, graphics cards, mice, headsets, cases and CPU coolers.
Unix

Unix is a family of multitasking, multiuser computer operating systems that derive from the original AT&T Unix, whose development started in the 1970s at the Bell Labs research center by Ken Thompson, Dennis Ritchie and others. Initially intended for use inside the Bell System, AT&T licensed Unix to outside parties in the late 1970s, leading to a variety of both academic and commercial Unix variants from vendors including the University of California, Microsoft, IBM and Sun Microsystems. In the early 1990s, AT&T sold its rights in Unix to Novell, which then sold its Unix business to the Santa Cruz Operation in 1995. The UNIX trademark passed to The Open Group, a neutral industry consortium, which allows the use of the mark for certified operating systems that comply with the Single UNIX Specification. As of 2014, the Unix version with the largest installed base is Apple's macOS. Unix systems are characterized by a modular design that is sometimes called the "Unix philosophy": the operating system provides a set of simple tools, each of which performs a limited, well-defined function, with a unified filesystem as the main means of communication and a shell scripting and command language to combine the tools to perform complex workflows.
Unix distinguishes itself from its predecessors as the first portable operating system: almost the entire operating system is written in the C programming language, allowing Unix to reach numerous platforms. Unix was meant to be a convenient platform for programmers developing software to be run on it and on other systems, rather than for non-programmers; the system grew larger as the operating system started spreading in academic circles and users added their own tools to the system and shared them with colleagues. At first, Unix was not designed to be multi-tasking; it gained portability, multi-tasking and multi-user capabilities in a time-sharing configuration over time. Unix systems are characterized by various concepts: the use of plain text for storing data, a hierarchical file system, treating devices and certain types of inter-process communication as files, and the use of a large number of small software tools that can be strung together through a command-line interpreter using pipes. These concepts are collectively known as the "Unix philosophy". Brian Kernighan and Rob Pike summarize this in The Unix Programming Environment as "the idea that the power of a system comes more from the relationships among programs than from the programs themselves".
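That idea of small programs combined through pipes can be sketched with Python's subprocess module driving three standard utilities, the equivalent of the shell pipeline `printf 'b\na\nb\n' | sort | uniq`. This assumes a POSIX system with printf, sort and uniq on the PATH.

```python
# Sketch of the Unix "relationships among programs" idea: three small
# standard tools connected via pipes, equivalent to the shell pipeline
#   printf 'b\na\nb\n' | sort | uniq
# (assumes a POSIX system with printf, sort and uniq available).
import subprocess

p1 = subprocess.Popen(["printf", "b\\na\\nb\\n"], stdout=subprocess.PIPE)
p2 = subprocess.Popen(["sort"], stdin=p1.stdout, stdout=subprocess.PIPE)
p3 = subprocess.Popen(["uniq"], stdin=p2.stdout, stdout=subprocess.PIPE)
p1.stdout.close()   # let p1 receive SIGPIPE if p2 exits early
p2.stdout.close()   # likewise for p2/p3
out = p3.communicate()[0].decode()
print(out, end="")  # the sorted, de-duplicated lines: a, then b
```

None of the three tools knows anything about the others; the pipe and the line-of-plain-text convention are the only interface, which is the point of the Kernighan and Pike quotation above.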
In an era when a standard computer consisted of a hard disk for storage and a data terminal for input and output, the Unix file model worked quite well, as I/O was generally linear. In the 1980s, non-blocking I/O and the set of inter-process communication mechanisms were augmented with Unix domain sockets, shared memory, message queues and semaphores, and network sockets were added to support communication with other hosts. As graphical user interfaces developed, the file model proved inadequate to the task of handling asynchronous events such as those generated by a mouse. By the early 1980s, users began seeing Unix as a potential universal operating system, suitable for computers of all sizes. The Unix environment and the client–server program model were essential elements in the development of the Internet and the reshaping of computing as centered in networks rather than in individual computers. Both Unix and the C programming language were developed by AT&T and distributed to government and academic institutions, which led to both being ported to a wider variety of machine families than any other operating system.
Under Unix, the operating system consists of many libraries and utilities along with the master control program, the kernel. The kernel provides services to start and stop programs, handles the file system and other common "low-level" tasks that most programs share, and schedules access to avoid conflicts when programs try to access the same resource or device simultaneously. To mediate such access, the kernel has special rights, reflected in the division between user space and kernel space, although in microkernel implementations, like MINIX or Redox, functions such as network protocols may run in user space. The origins of Unix date back to the mid-1960s, when the Massachusetts Institute of Technology, Bell Labs and General Electric were developing Multics, a time-sharing operating system for the GE-645 mainframe computer. Multics featured several innovations but presented severe problems. Frustrated by the size and complexity of Multics, but not by its goals, individual researchers at Bell Labs started withdrawing from the project.
The last to leave were Ken Thompson, Dennis Ritchie, Douglas McIlroy and Joe Ossanna, who decided to reimplement their experiences in a new project of smaller scale. This new operating system was initially without organizational backing and without a name, and it was a single-tasking system. In 1970, the group coined the name Unics, for Uniplexed Information and Computing Service, as a pun on Multics, which stood for Multiplexed Information and Computer Services. Brian Kernighan takes credit for the idea, but adds that "no one can remember" the origin of the final spelling Unix. Dennis Ritchie, Doug McIlroy and Peter G. Neumann also credit Kernighan. The operating system was originally written in assembly language, but in 1973, Version 4 Unix was rewritten in C. Version 4 Unix, however, still had much PDP-11-dependent code and was not suitable for porting. The first port to another platform was made five years f