Neil J. Gunther
Neil Gunther is a computer information systems researcher best known internationally for developing the open-source performance modeling software Pretty Damn Quick and the Guerrilla approach to computer capacity planning and performance analysis. He has been cited for his contributions to the theory of large transients in computer systems and packet networks, and for his universal law of computational scalability. Gunther is a Senior Member of both the Association for Computing Machinery and the Institute of Electrical and Electronics Engineers, as well as a member of the American Mathematical Society, American Physical Society, Computer Measurement Group and ACM SIGMETRICS. He is currently focused on developing quantum information system technologies. Gunther is an Australian of German and Scots ancestry, born in Melbourne on 15 August 1950. He attended Preston East Primary School from 1955 to 1956 and Balwyn North Primary School from 1956 until 1962. For his tenth birthday, Gunther received a copy of the now famous book The Golden Book of Chemistry Experiments from an older cousin.
Inspired by the book, he started working on various experiments, making use of chemicals that could be found around his house. After he spilled some potassium permanganate solution on his bedroom carpet, his mother confined him to an alcove in the garage, which he turned into a small laboratory, replete with industrial chemicals and second-hand laboratory glassware. Gunther was interested in finding out how things like detergents and oils were composed by cracking them in his fractionating column. He took particular interest in mixing paints for his art classes, as well as for his chemistry classes at Balwyn High School. His father, the Superintendent of Melbourne's electrical power station, borrowed an organic chemistry text from the chemists in the quality control laboratory; this led to an intense interest in synthesizing azo dyes. At around age 14, Gunther attempted to predict the color of azo dyes based on the chromophore-auxochrome combination. Apart from drawing up empirical tables, this effort was unsuccessful due to his lack of knowledge of quantum theory.
Gunther taught physics at San Jose State University from 1980 to 1981. He then joined Syncal Corporation, a small company contracted by NASA and JPL to develop thermoelectric materials for their deep-space missions. Gunther was asked to analyze the thermal stability test data from the Voyager RTGs, and he discovered that the stability of the silicon-germanium thermoelectric alloy was controlled by a soliton-based precipitation mechanism. JPL used his work to select the next generation of RTG materials for the Galileo mission launched in 1989. In 1982, Gunther joined Xerox PARC to develop parametric and functional test software for PARC's small-scale VLSI design fabrication line. He was later recruited onto the Dragon multiprocessor workstation project, where he developed the PARCbench multiprocessor benchmark. This was his first foray into computer performance analysis. In 1989, he developed a Wick-rotated version of Richard Feynman's quantum path integral formalism for analyzing performance degradation in large-scale computer systems and packet networks.
In 1990 Gunther joined Pyramid Technology, where he held positions as Senior Scientist and Manager of the Performance Analysis Group, responsible for attaining industry-high TPC benchmarks on their Unix multiprocessors. He also performed simulations for the design of the Reliant RM1000 parallel database server. Gunther founded Performance Dynamics Company as a sole proprietorship, registered in California in 1994, to provide consulting and educational services for the management of high performance computer systems, with an emphasis on performance analysis and enterprise-wide capacity planning. He went on to develop and release his own open-source performance modeling software, called "PDQ", around 1998. That software accompanied his first textbook on performance analysis, entitled The Practical Performance Analyst. Several other books have followed since then. In 2004, Gunther embarked on joint research into quantum information systems based on photonics. During the course of his research in this area, he developed a theory of photon bifurcation that is being tested experimentally at École Polytechnique Fédérale de Lausanne.
This represents yet another application of path integral formulation to circumvent the wave-particle duality of light. In its simplest rendition, this theory can be considered as providing the quantum corrections to the Abbe-Rayleigh diffraction theory of imaging and the Fourier theory of optical information processing. Inspired by the work of Tukey, Gunther explored ways to help systems analysts visualize performance in a manner similar to that available in scientific visualization and information visualization. In 1991, he developed a tool called Barry, which employs barycentric coordinates to visualize sampled CPU usage data on large-scale multiprocessor systems. More recently, he has applied the same 2-simplex barycentric coordinates to visualizing the Apdex application performance metric, which is based on categorical response time data. A barycentric 3-simplex (a tetrahedron), which can be swivelled on the computer screen using a mouse, has been found useful for visualizing packet network performance data. In 2008, he co-founded the PerfViz Google group.
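To make the barycentric idea concrete, the following is a minimal sketch (not Gunther's Barry tool itself) that maps a three-way CPU usage split, here assumed to be user/system/idle percentages, onto 2-D coordinates inside a triangle, which is the essence of a 2-simplex visualization:

```python
# Minimal sketch of 2-simplex (barycentric) plotting of CPU usage samples.
# The user/system/idle split and the triangle layout are illustrative assumptions.

def barycentric_to_cartesian(user, system, idle):
    """Map a (user, system, idle) CPU split to (x, y) coordinates in a triangle."""
    total = user + system + idle
    u, s, i = user / total, system / total, idle / total   # barycentric weights
    # Corners of an equilateral triangle: the user, system and idle vertices.
    verts = [(0.0, 0.0), (1.0, 0.0), (0.5, 3 ** 0.5 / 2)]
    x = u * verts[0][0] + s * verts[1][0] + i * verts[2][0]
    y = u * verts[0][1] + s * verts[1][1] + i * verts[2][1]
    return x, y

# Example: a sample that is 60% user, 30% system, 10% idle.
print(barycentric_to_cartesian(60, 30, 10))
```

A point plotted near a vertex indicates that the corresponding component dominates the sample, which is what makes this kind of plot useful for spotting dominant usage modes at a glance.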
The relative capacity C(N) of a computational platform with N processors is given by: C(N) = N / (1 + α(N - 1) + βN(N - 1)).
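A minimal sketch of evaluating this formula, in which α is read as the contention penalty and β as the coherency delay; the parameter values below are illustrative assumptions, not measured data:

```python
# Minimal sketch of Gunther's Universal Scalability Law (USL).
# alpha: contention penalty, beta: coherency delay (values below are hypothetical).

def usl_capacity(n, alpha, beta):
    """Relative capacity C(N) = N / (1 + alpha*(N - 1) + beta*N*(N - 1))."""
    return n / (1.0 + alpha * (n - 1) + beta * n * (n - 1))

for n in (1, 4, 16, 64):
    print(n, round(usl_capacity(n, alpha=0.05, beta=0.001), 2))
```

Because the βN(N - 1) term grows quadratically, C(N) eventually peaks and then declines, which is the retrograde scaling behavior the law is used to diagnose.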
Central processing unit
A central processing unit (CPU), also called a central processor or main processor, is the electronic circuitry within a computer that carries out the instructions of a computer program by performing the basic arithmetic, logic and input/output operations specified by the instructions. The computer industry has used the term "central processing unit" at least since the early 1960s. Traditionally, the term "CPU" refers to a processor, more specifically to its processing unit and control unit, distinguishing these core elements of a computer from external components such as main memory and I/O circuitry. The form and implementation of CPUs have changed over the course of their history, but their fundamental operation remains largely unchanged. Principal components of a CPU include the arithmetic logic unit (ALU) that performs arithmetic and logic operations, processor registers that supply operands to the ALU and store the results of ALU operations, and a control unit that orchestrates the fetching and execution of instructions by directing the coordinated operations of the ALU, registers and other components.
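To illustrate how these components cooperate, here is a toy sketch of the fetch-decode-execute cycle; the tiny instruction set and memory layout are invented for illustration and do not correspond to any real processor:

```python
# Toy fetch-decode-execute loop: the control flow fetches instructions,
# a register file holds operands, and an "ALU" performs the ADD. Purely illustrative.

memory = [("LOAD", 0, 7), ("LOAD", 1, 5), ("ADD", 2, 0, 1), ("HALT",)]
registers = [0] * 4
pc = 0  # program counter

while True:
    instr = memory[pc]          # fetch the next instruction
    pc += 1
    op = instr[0]               # decode the opcode
    if op == "LOAD":            # execute: place an immediate value in a register
        registers[instr[1]] = instr[2]
    elif op == "ADD":           # execute: the ALU adds two registers into a third
        registers[instr[1]] = registers[instr[2]] + registers[instr[3]]
    elif op == "HALT":
        break

print(registers)  # [7, 5, 12, 0]
```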
Most modern CPUs are microprocessors, meaning they are contained on a single integrated circuit (IC) chip. An IC that contains a CPU may also contain memory, peripheral interfaces and other components of a computer; some computers employ a multi-core processor, a single chip containing two or more CPUs called "cores". Array processors or vector processors have multiple processors that operate in parallel, with no unit considered central. There also exists the concept of virtual CPUs, which are an abstraction of dynamically aggregated computational resources. Early computers such as the ENIAC had to be physically rewired to perform different tasks, which caused these machines to be called "fixed-program computers". Since the term "CPU" is generally defined as a device for software execution, the earliest devices that could rightly be called CPUs came with the advent of the stored-program computer. The idea of a stored-program computer had been present in the design of J. Presper Eckert and John William Mauchly's ENIAC, but was initially omitted so that the machine could be finished sooner.
On June 30, 1945, before ENIAC was completed, mathematician John von Neumann distributed the paper entitled First Draft of a Report on the EDVAC. It was the outline of a stored-program computer that would eventually be completed in August 1949. EDVAC was designed to perform a certain number of instructions of various types. Significantly, the programs written for EDVAC were to be stored in high-speed computer memory rather than specified by the physical wiring of the computer. This overcame a severe limitation of ENIAC, namely the considerable time and effort required to reconfigure the computer to perform a new task. With von Neumann's design, the program that EDVAC ran could be changed simply by changing the contents of the memory. EDVAC, however, was not the first stored-program computer. Early CPUs were custom designs used as part of a larger and sometimes distinctive computer. However, this method of designing custom CPUs for a particular application has largely given way to the development of multi-purpose processors produced in large quantities. This standardization began in the era of discrete transistor mainframes and minicomputers and has accelerated with the popularization of the integrated circuit.
The IC has allowed increasingly complex CPUs to be designed and manufactured to tolerances on the order of nanometers. Both the miniaturization and standardization of CPUs have increased the presence of digital devices in modern life far beyond the limited application of dedicated computing machines. Modern microprocessors appear in electronic devices ranging from automobiles to cellphones, and sometimes even in toys. While von Neumann is most often credited with the design of the stored-program computer because of his design of EDVAC, and the design became known as the von Neumann architecture, others before him, such as Konrad Zuse, had suggested and implemented similar ideas. The so-called Harvard architecture of the Harvard Mark I, which was completed before EDVAC, used a stored-program design based on punched paper tape rather than electronic memory. The key difference between the von Neumann and Harvard architectures is that the latter separates the storage and treatment of CPU instructions and data, while the former uses the same memory space for both.
Most modern CPUs are primarily von Neumann in design, but CPUs with the Harvard architecture are seen as well, especially in embedded applications. Relays and vacuum tubes were commonly used as switching elements in early CPUs; the overall speed of a system is dependent on the speed of the switches. Tube computers like EDVAC tended to average eight hours between failures, whereas relay computers like the Harvard Mark I failed very rarely. In the end, tube-based CPUs became dominant because the significant speed advantages they afforded generally outweighed the reliability problems. Most of these early synchronous CPUs ran at low clock rates compared to modern microelectronic designs. Clock signal frequencies ranging from 100 kHz to 4 MHz were common at this time, limited largely by the speed of the switching devices they were built with.
Web application
In computing, a web application or web app is a client–server computer program in which the client runs in a web browser. Common web applications include webmail, online retail sales and online auctions. The general distinction between a dynamic web page of any kind and a "web application" is unclear. Web sites most likely to be referred to as "web applications" are those which have similar functionality to a desktop software application or to a mobile app. HTML5 introduced explicit language support for making applications that are loaded as web pages, but can store data locally and continue to function while offline. Single-page applications are more application-like because they reject the more typical web paradigm of moving between distinct pages with different URLs. Single-page frameworks like Sencha Touch and AngularJS might be used to speed development of such a web app for a mobile platform. There are several ways of targeting mobile devices when making a web application: Responsive web design can be used to make a web application, whether a conventional website or a single-page application, viewable on small screens and usable with touchscreens.
Progressive Web Apps are web applications that load like regular web pages or websites but can offer the user functionality such as working offline, push notifications and access to device hardware, traditionally available only to native mobile applications. Native apps or "mobile apps" run directly on a mobile device, just as a conventional software application runs directly on a desktop computer, without a web browser. Frameworks like React Native, Flutter and FuseTools allow the development of native apps for all platforms using languages other than each platform's standard native language. Hybrid apps embed a mobile web site inside a native app using a hybrid framework such as Apache Cordova, Ionic or Appcelerator Titanium; this allows development using web technologies while retaining certain advantages of native apps. In earlier computing models like client–server, the processing load for the application was shared between code on the server and code installed on each client locally. In other words, an application had its own pre-compiled client program, which served as its user interface and had to be separately installed on each user's personal computer.
Information technology
Information technology (IT) is the use of computers to store, retrieve and manipulate data or information, typically in the context of a business or other enterprise. IT is considered to be a subset of information and communications technology. An information technology system is generally an information system, a communications system or, more specifically, a computer system, including all hardware and peripheral equipment, operated by a limited group of users. Humans have been storing, retrieving and communicating information since the Sumerians in Mesopotamia developed writing in about 3000 BC, but the term information technology in its modern sense first appeared in a 1958 article published in the Harvard Business Review, whose authors remarked: "We shall call it information technology." Their definition consists of three categories: techniques for processing, the application of statistical and mathematical methods to decision-making, and the simulation of higher-order thinking through computer programs. The term is commonly used as a synonym for computers and computer networks, but it also encompasses other information distribution technologies such as television and telephones.
Several products or services within an economy are associated with information technology, including computer hardware, electronics, the internet, telecom equipment and e-commerce. Based on the storage and processing technologies employed, it is possible to distinguish four distinct phases of IT development: pre-mechanical, mechanical, electromechanical and electronic. This article focuses on the most recent period, which began in about 1940. Devices have been used to aid computation for thousands of years, probably initially in the form of a tally stick. The Antikythera mechanism, dating from about the beginning of the first century BC, is generally considered to be the earliest known mechanical analog computer, and the earliest known geared mechanism. Comparable geared devices did not emerge in Europe until the 16th century, and it was not until 1645 that the first mechanical calculator capable of performing the four basic arithmetical operations was developed. Electronic computers, using either relays or valves, began to appear in the early 1940s.
The electromechanical Zuse Z3, completed in 1941, was the world's first programmable computer, and by modern standards one of the first machines that could be considered a complete computing machine. Colossus, developed during the Second World War to decrypt German messages, was the first electronic digital computer. Although it was programmable, it was not general-purpose, being designed to perform only a single task. It also lacked the ability to store its program in memory. The first recognisably modern electronic digital stored-program computer was the Manchester Baby, which ran its first program on 21 June 1948. The development of transistors in the late 1940s at Bell Laboratories allowed a new generation of computers to be designed with greatly reduced power consumption. The first commercially available stored-program computer, the Ferranti Mark I, contained 4050 valves and had a power consumption of 25 kilowatts. By comparison, the first transistorised computer, developed at the University of Manchester and operational by November 1953, consumed only 150 watts in its final version.
Early electronic computers such as Colossus made use of punched tape, a long strip of paper on which data was represented by a series of holes, a technology now obsolete. Electronic data storage, which is used in modern computers, dates from World War II, when a form of delay line memory was developed to remove the clutter from radar signals, the first practical application of which was the mercury delay line. The first random-access digital storage device was the Williams tube, based on a standard cathode ray tube, but the information stored in it and in delay line memory was volatile in that it had to be continuously refreshed, and thus was lost once power was removed. The earliest form of non-volatile computer storage was the magnetic drum, invented in 1932 and used in the Ferranti Mark 1, the world's first commercially available general-purpose electronic computer. IBM introduced the first hard disk drive in 1956, as a component of their 305 RAMAC computer system. Most digital data today is still stored magnetically on hard disks, or optically on media such as CD-ROMs.
Until 2002 most information was stored on analog devices, but that year digital storage capacity exceeded analog for the first time. As of 2007, 94% of the data stored worldwide was held digitally: 52% on hard disks, 28% on optical devices and 11% on digital magnetic tape. It has been estimated that the worldwide capacity to store information on electronic devices grew from less than 3 exabytes in 1986 to 295 exabytes in 2007, doubling roughly every 3 years. Database management systems emerged in the 1960s to address the problem of storing and retrieving large amounts of data accurately and quickly. One of the earliest such systems was IBM's Information Management System (IMS), which is still deployed more than 50 years later. IMS stores data hierarchically, but in the 1970s Ted Codd proposed an alternative relational storage model based on set theory and predicate logic and the familiar concepts of tables, rows and columns. The first commercially available relational database management system was available from Oracle in 1981. All database management systems consist of a number of components that together allow the data they store to be accessed simultaneously by many users.
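As a minimal illustration of the relational idea (a sketch, not any particular database management system), the example below represents a table as rows of named columns and performs a simple relational selection; the data and names are invented:

```python
# Minimal sketch of the relational model: a table of rows with named columns,
# queried by a predicate over column values. Data and names are hypothetical.

employees = [
    {"id": 1, "name": "Ada",   "dept": "Engineering"},
    {"id": 2, "name": "Grace", "dept": "Engineering"},
    {"id": 3, "name": "Edgar", "dept": "Research"},
]

def select(table, predicate):
    """Return the rows of a table that satisfy a predicate (relational selection)."""
    return [row for row in table if predicate(row)]

print(select(employees, lambda row: row["dept"] == "Engineering"))
```

A real relational DBMS adds, among other things, declarative queries, indexing and the concurrency control that lets many users access the data simultaneously while preserving its integrity.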
Enterprise resource planning
Enterprise resource planning (ERP) is the integrated management of core business processes, often in real time and mediated by software and technology. ERP is usually referred to as a category of business management software, typically a suite of integrated applications, that an organization can use to collect, store and interpret data from these many business activities. ERP provides an integrated and continuously updated view of core business processes using common databases maintained by a database management system. ERP systems track business resources (cash, raw materials, production capacity) and the status of business commitments (orders, purchase orders, payroll). The applications that make up the system share data across the various departments that provide the data. ERP facilitates information flow between all business functions and manages connections to outside stakeholders. Enterprise system software is a multibillion-dollar industry that produces components supporting a variety of business functions. IT investments have become the largest category of capital expenditure in United States-based businesses over the past decade.
Though early ERP systems focused on large enterprises, smaller enterprises increasingly use ERP systems. The ERP system integrates varied organizational systems and facilitates error-free transactions and production, thereby enhancing the organization's efficiency. However, developing an ERP system differs from traditional system development. ERP systems run on a variety of computer hardware and network configurations, typically using a database as an information repository. The Gartner Group first used the abbreviation ERP in the 1990s to extend upon the capabilities of material requirements planning (MRP) and manufacturing resource planning (MRP II), as well as computer-integrated manufacturing. Without replacing these terms, ERP came to represent a larger whole that reflected the evolution of application integration beyond manufacturing. Not all ERP packages developed from a manufacturing core. By the mid-1990s ERP systems addressed all core enterprise functions. Governments and non-profit organizations also began to use ERP systems.
ERP systems experienced rapid growth in the 1990s. Because of the year 2000 problem and the introduction of the euro, both of which disrupted legacy systems, many companies took the opportunity to replace their old systems with ERP. ERP systems initially focused on automating back office functions that did not directly affect customers and the public. Front office functions such as customer relationship management, which deal directly with customers, e-business systems such as e-commerce, e-government, e-telecom and e-finance, and supplier relationship management became integrated later, when the internet simplified communicating with external parties. "ERP II" was coined in 2000 in an article by Gartner Publications entitled ERP Is Dead—Long Live ERP II. It describes web-based software that provides real-time access to ERP systems for employees and partners. The ERP II role expands beyond traditional transaction processing: rather than just managing buying, selling and so on, ERP II leverages information in the resources under its management to help the enterprise collaborate with other enterprises.
ERP II is more flexible than the first generation of ERP. Rather than confining ERP system capabilities within the organization, it goes beyond the corporate walls to interact with other systems. Enterprise application suite is an alternate name for such systems. ERP II systems are typically used to enable collaborative initiatives such as supply chain management, customer relationship management and business intelligence among business partner organizations through the use of various e-business technologies. Developers now make more effort to integrate mobile devices with the ERP system, and ERP vendors are extending ERP to these devices, along with other business applications. Technical stakes of modern ERP concern integration: hardware, networking and supply chains. ERP now covers more functions and roles, including decision making, stakeholders' relationships and transparency. Characteristically, an ERP system is an integrated system that operates in real time, uses a common database that supports all the applications, provides a consistent look and feel across modules, and is installed with elaborate application/data integration by the Information Technology department, provided the implementation is not done in small steps. Deployment options include on-premises, cloud hosted, or SaaS. An ERP system covers the following common functional areas.
In many ERP systems, these are called and grouped together as ERP modules:
Finance and accounting: general ledger, fixed assets, payables (including vouchering and payment), receivables and collections, cash management, financial consolidation
Management accounting: budgeting, cost management, activity-based costing
Human resources: recruiting, rostering, benefits and pension plans, diversity management, separation
Manufacturing: engineering, bill of materials, work orders, capacity, workflow management, quality control, manufacturing process, manufacturing projects, manufacturing flow, product life cycle management
Order processing: order to cash, order entry, credit checking, available to promise, shipping, sales analysis and reporting, sales commissioning
Supply chain management: supply chain planning, supplier scheduling, product configurator, order to
Systems engineering
Systems engineering is an interdisciplinary field of engineering and engineering management that focuses on how to design and manage complex systems over their life cycles. At its core, systems engineering utilizes systems thinking principles to organize this body of knowledge. The individual outcome of such efforts, an engineered system, can be defined as a combination of components that work in synergy to collectively perform a useful function. Issues such as requirements engineering, logistics, coordination of different teams, testing and evaluation, maintainability and many other disciplines necessary for successful system development, design and ultimate decommission become more difficult when dealing with large or complex projects. Systems engineering deals with work-processes, optimization methods and risk management tools in such projects. It overlaps technical and human-centered disciplines such as industrial engineering, mechanical engineering, manufacturing engineering, control engineering, software engineering, electrical engineering, organizational studies, civil engineering and project management.
Systems engineering ensures that all aspects of a project or system are considered and integrated into a whole. The systems engineering process is a discovery process, quite unlike a manufacturing process. A manufacturing process is focused on repetitive activities that achieve high quality outputs with minimum cost and time. The systems engineering process must begin by discovering the real problems that need to be resolved and identifying the most probable or highest impact failures that can occur; systems engineering involves finding solutions to these problems. The term systems engineering can be traced back to Bell Telephone Laboratories in the 1940s. The need to identify and manipulate the properties of a system as a whole, which in complex engineering projects may differ from the sum of the parts' properties, motivated various industries, especially those developing systems for the U.S. military, to apply the discipline. When it was no longer possible to rely on design evolution to improve upon a system and the existing tools were not sufficient to meet growing demands, new methods began to be developed that addressed the complexity directly.
The continuing evolution of systems engineering comprises the development and identification of new methods and modeling techniques. These methods aid in a better comprehension of the design and developmental control of engineering systems as they grow more complex. Popular tools that are often used in the systems engineering context were developed during these times, including USL, UML, QFD and IDEF0. In 1990, a professional society for systems engineering, the National Council on Systems Engineering (NCOSE), was founded by representatives from a number of U.S. corporations and organizations. NCOSE was created to address the need for improvements in systems engineering practices and education. As a result of growing involvement from systems engineers outside of the U.S., the name of the organization was changed to the International Council on Systems Engineering in 1995. Schools in several countries offer graduate programs in systems engineering, and continuing education options are also available for practicing engineers.
Systems engineering signifies only an approach and, more recently, a discipline in engineering. The aim of education in systems engineering is to formalize various approaches and, in doing so, identify new methods and research opportunities similar to those that occur in other fields of engineering. As an approach, systems engineering is interdisciplinary in flavour. The traditional scope of engineering embraces the conception, development and operation of physical systems. Systems engineering, as originally conceived, falls within this scope. "Systems engineering", in this sense of the term, refers to the building of engineering concepts. The use of the term "systems engineer" has evolved over time to embrace a wider, more holistic concept of "systems" and of engineering processes. This evolution of the definition has been a subject of ongoing controversy, and the term continues to apply to both the narrower and the broader scope. Traditional systems engineering was seen as a branch of engineering in the classical sense, that is, as applied only to physical systems, such as spacecraft and aircraft.
More recently, systems engineering has evolved to take on a broader meaning, especially when humans were seen as an essential component of a system. Checkland, for example, captures the broader meaning of systems engineering by stating that "engineering" can be read in its general sense. Enterprise Systems Engineering pertains to the view of enterprises, that is, organizations or combinations of organizations, as systems. Service Systems Engineering has to do with the engineering of service systems. Checkland defines a service system as a system which is conceived as serving another system. Most civil infrastructure systems are service systems. Systems engineering focuses on analyzing and eliciting customer needs and required functionality early in the development cycle, documenting requirements, then proceeding with design synthesis and system validation while considering the complete problem, the system lifecycle; this includes understanding all of the stakeholders involved. Oliver et al. claim that the systems engineering...