Automatic process control in continuous production processes combines the disciplines of control engineering and chemical engineering, using industrial control systems to achieve a level of production consistency and safety that could not be achieved by human manual control alone. It is implemented in industries such as oil refining, paper manufacturing, chemical processing and power generation. Installations range widely in size and complexity, but all enable a small number of operators to manage complex processes to a high degree of consistency; the development of large automatic process control systems was instrumental in enabling the design of large, high-volume and complex processes that could not otherwise be operated economically or safely. Applications range from controlling the temperature and level of a single process vessel to a complete chemical processing plant with several thousand control loops. Early process control breakthroughs came mostly in the form of water control devices.
Ktesibios of Alexandria is credited with inventing float valves to regulate the water level of water clocks in the 3rd century BC. In the 1st century AD, Heron of Alexandria invented a water valve similar to the fill valve used in modern toilets. Later process control inventions built on basic principles of physics. In 1620, Cornelis Drebbel invented a bimetallic thermostat for controlling the temperature in a furnace. In 1681, Denis Papin discovered that the pressure inside a vessel could be regulated by placing weights on top of the vessel lid. In 1745, Edmund Lee created the fantail to improve windmill efficiency. With the dawn of the Industrial Revolution in the 1760s, process control inventions aimed to replace human operators with mechanized processes. In 1784, Oliver Evans created a water-powered flour mill that operated using buckets and screw conveyors. Henry Ford applied the same principle in 1910 when the assembly line was created to decrease human intervention in automobile production.
For continuously variable process control, it was not until 1922 that a formal control law for what we now call PID control, or three-term control, was first developed through theoretical analysis by the Russian American engineer Nicolas Minorsky. Minorsky was researching and designing automatic ship steering for the US Navy and based his analysis on observations of a helmsman: the helmsman steered the ship based not only on the current course error, but also on the past error and the current rate of change. His goal was stability, not general control, which simplified the problem significantly. While proportional control provided stability against small disturbances, it was insufficient for dealing with a steady disturbance, notably a stiff gale, which required adding the integral term; the derivative term was then added to improve stability and control. Process control of large industrial plants has evolved through many stages. Initially, control was from panels local to the process plant; however, this required a large manpower resource to attend to the dispersed panels, and there was no overall view of the process.
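Minorsky's three terms map directly onto the discrete-time PID algorithm still at the core of industrial controllers. The following is a minimal Java sketch of that control law, not any particular vendor's implementation; the gains, setpoint and toy process model are illustrative values chosen for the example.

    // Minimal discrete-time PID controller: Minorsky's three-term control law.
    // Gains, setpoint and the toy process below are illustrative only.
    public class PidController {
        private final double kp, ki, kd;   // proportional, integral, derivative gains
        private double integral = 0.0;     // accumulated past error
        private double previousError = 0.0;

        public PidController(double kp, double ki, double kd) {
            this.kp = kp;
            this.ki = ki;
            this.kd = kd;
        }

        // Called once per control interval dt (seconds); returns the actuator output.
        public double update(double setpoint, double measurement, double dt) {
            double error = setpoint - measurement;            // current error
            integral += error * dt;                           // past error
            double derivative = (error - previousError) / dt; // rate of change
            previousError = error;
            return kp * error + ki * integral + kd * derivative;
        }

        public static void main(String[] args) {
            PidController pid = new PidController(2.0, 0.5, 0.1);
            double temperature = 20.0; // process variable, e.g. vessel temperature
            for (int step = 0; step < 5; step++) {
                double output = pid.update(100.0, temperature, 1.0);
                temperature += 0.05 * output; // toy first-order process response
                System.out.printf("step %d: output %.2f, temperature %.2f%n",
                                  step, output, temperature);
            }
        }
    }

The proportional term reacts to the current error, the integral term cancels steady disturbances such as Minorsky's stiff gale, and the derivative term damps the response.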
The next logical development was the transmission of all plant measurements to a permanently-manned central control room. This was the centralisation of all the localised panels, with the advantages of lower manning levels and an easier overview of the process; the controllers were mounted behind the control room panels, and all automatic and manual control outputs were transmitted back to the plant. However, whilst providing a central control focus, this arrangement was inflexible, as each control loop had its own controller hardware and continual operator movement within the control room was required to view different parts of the process. With the coming of electronic processors and graphic displays it became possible to replace these discrete controllers with computer-based algorithms, hosted on a network of input/output racks with their own control processors; these could be distributed around the plant and communicate with the graphic displays in the control room or rooms. Thus the distributed control system (DCS) was born. The introduction of DCSs allowed easy interconnection and re-configuration of plant controls such as cascaded loops and interlocks, and easy interfacing with other production computer systems.
The DCS enabled sophisticated alarm handling, introduced automatic event logging, removed the need for physical records such as chart recorders, allowed the control racks to be networked and thereby located locally to the plant to reduce cabling runs, and provided high-level overviews of plant status and production levels. The accompanying diagram is a general model showing the functional manufacturing levels in a large process using processor- and computer-based control. Referring to the diagram, Level 0 contains the field devices, such as flow and temperature sensors, and final control elements, such as control valves. To determine the fundamental model for any process, the inputs and outputs of the system are defined differently than for other chemical processes; the balance equations are defined by the control inputs and outputs rather than the material inputs.
Arm Holdings

Arm Holdings is a British multinational semiconductor and software design company owned by SoftBank Group and its Vision Fund. Headquartered in Cambridgeshire in the United Kingdom, its primary business is the design of ARM processors, although it also designs software development tools under the DS-5, RealView and Keil brands, as well as systems and platforms, system-on-a-chip infrastructure and software; as a holding company, it also holds shares of other companies. It is considered to be market-dominant for processors in mobile phones and tablet computers, and is one of the best-known "Silicon Fen" companies. Processors based on designs licensed from Arm, or designed by licensees of one of the Arm instruction set architectures, are used in all classes of computing devices, ranging from the world's smallest computer to processors in some supercomputers on the TOP500 list. Processors designed by Arm or by Arm licensees are used as microcontrollers in embedded systems, including real-time safety systems, biometric systems, smart TVs and all modern smartwatches, and are used as general-purpose processors in smartphones, laptops, desktops and supercomputers/HPC, e.g. as a CPU option in Cray's supercomputers.
Arm's Mali line of graphics processing units is used in laptops, in over 50% of Android tablets by market share, and in some versions of Samsung's smartphones and smartwatches; it is the third most popular GPU in mobile devices. Systems including iPhone smartphones contain many chips, from many different providers, that include one or more licensed Arm cores, in addition to those in the main Arm-based processor. Arm's core designs are also used in chips that support many common network-related technologies in smartphones, such as Bluetooth, WiFi and broadband, as well as in corresponding equipment such as Bluetooth headsets, 802.11ac routers and network providers' cellular LTE equipment. Arm's main CPU competitors in servers include Intel and AMD; in mobile applications, Intel's x86 Atom is a competitor, and AMD sells Arm-based chips as well as x86. Arm's main GPU competitors include mobile GPUs from Imagination Technologies, Nvidia and Intel. Despite competing in GPUs, Qualcomm and Nvidia have combined their GPUs with Arm-licensed CPUs.
Arm was a constituent of the FTSE 100 Index and had a secondary listing on NASDAQ; however, the Japanese telecommunications company SoftBank Group made an agreed offer for Arm on 18 July 2016, subject to approval by Arm's shareholders, valuing the company at £23.4 billion. The transaction was completed on 5 September 2016. The acronym ARM was first used in 1983 and stood for "Acorn RISC Machine". Acorn Computers' first RISC processor was used in the original Acorn Archimedes and was one of the first RISC processors used in small computers. However, when the company was incorporated in 1990, the acronym was changed to "Advanced RISC Machines", in light of the company's name "Advanced RISC Machines Ltd."; according to an interview with Steve Furber, the name change was at the behest of Apple, which did not wish to have the name of a former competitor, namely Acorn, in the name of the company. At the time of the IPO in 1998, the company name was changed to "ARM Holdings", often called simply ARM, like its processors.
On 1 August 2017, the logo was changed. The logo is now all lowercase, and other uses of "ARM" are in sentence case except where the whole sentence is upper case; so, for instance, it is now "Arm Holdings". The company was founded in November 1990 as Advanced RISC Machines Ltd and structured as a joint venture between Acorn Computers, Apple Computer and VLSI Technology. The new company intended to further the development of the Acorn RISC Machine processor, which was used in the Acorn Archimedes and had been selected by Apple for its Newton project. Its first profitable year was 1993, and the company's Silicon Valley and Tokyo offices were opened in 1994. Arm invested in Palmchip Corporation in 1997 to provide system-on-chip platforms and to enter the disk drive market. In 1998, the company changed its name from Advanced RISC Machines Ltd to ARM Ltd. The company was first listed on the London Stock Exchange and NASDAQ in 1998, and by February 1999, Apple's shareholding had fallen to 14.8%. In 2010, Arm joined with IBM, Texas Instruments, Samsung, ST-Ericsson and Freescale Semiconductor in forming Linaro, a non-profit open source engineering company.
Arm's acquisitions have included:

Micrologic Solutions, a software consulting company based in Cambridge
Allant Software, a developer of debugging software
Infinite Designs, a design company based in Sheffield
EuroMIPS, a smart card design house in Sophia Antipolis, France
The engineering team of Noral Micrologics, a debug hardware and software company based in Blackburn, England
Adelante Technologies of Belgium, creating its OptimoDE data engines business, a form of lightweight DSP engine
Axys Design Automation, a developer of ESL design tools
Artisan Components, a designer of physical IP, the building blocks of integrated circuits
KEIL Software, a leading developer of software development tools for the microcontroller market, including the 8051 and C16x platforms
The engineering team of PowerEscape
Falanx, a developer of 3D graphics accelerators
Programmable logic controller
A programmable logic controller (PLC) or programmable controller is an industrial digital computer, ruggedized and adapted for the control of manufacturing processes such as assembly lines or robotic devices, or any activity that requires high-reliability control, ease of programming and process fault diagnosis. PLCs were first developed in the automobile manufacturing industry to provide flexible, programmable controllers to replace hard-wired relays and sequencers. Since then, they have been adopted as high-reliability automation controllers suitable for harsh environments. A PLC is an example of a "hard" real-time system, since output results must be produced in response to input conditions within a limited time, otherwise unintended operation will result. PLCs range from small modular devices with tens of inputs and outputs (I/O), in a housing integral with the processor, to large rack-mounted modular devices with thousands of I/O, networked to other PLC and SCADA systems.
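In practice the hard real-time guarantee is met with a fixed scan cycle: read all inputs, solve the control logic, write all outputs, and confirm the cycle finished within its deadline. The Java sketch below simulates that loop; it is an illustration of the concept, not PLC firmware, and the 10 ms deadline and I/O names are assumed values for the example.

    // Simplified PLC-style scan cycle with a watchdog deadline check.
    // Real PLCs do this in firmware; timings and names here are illustrative.
    public class ScanCycleDemo {
        static final long SCAN_DEADLINE_NANOS = 10_000_000L; // 10 ms, assumed

        public static void main(String[] args) throws InterruptedException {
            boolean[] inputs = new boolean[8];
            boolean[] outputs = new boolean[8];
            for (int cycle = 0; cycle < 100; cycle++) {
                long start = System.nanoTime();
                readInputs(inputs);          // 1. sample the input image
                solveLogic(inputs, outputs); // 2. evaluate the control program
                writeOutputs(outputs);       // 3. update the output image
                long elapsed = System.nanoTime() - start;
                if (elapsed > SCAN_DEADLINE_NANOS) {
                    // Missing the deadline risks unintended operation, so a
                    // real controller would raise a watchdog fault here.
                    System.err.println("Watchdog: scan overrun at cycle " + cycle);
                }
                Thread.sleep(10); // pace the demonstration loop
            }
        }

        static void readInputs(boolean[] in) { /* sample field wiring here */ }

        static void solveLogic(boolean[] in, boolean[] out) {
            out[0] = in[0] && !in[1]; // placeholder rung
        }

        static void writeOutputs(boolean[] out) { /* drive actuators here */ }
    }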
They can be designed for multiple arrangements of digital and analog I/O, extended temperature ranges, immunity to electrical noise, and resistance to vibration and impact. Programs to control machine operation are stored in battery-backed or non-volatile memory. The PLC originated in the automotive industry in the USA. Before the PLC, control and safety interlock logic for manufacturing automobiles was composed of relays, cam timers, drum sequencers and dedicated closed-loop controllers. Since these could number in the hundreds or thousands, the process of updating such facilities for the yearly model change-over was time-consuming and expensive, as electricians needed to individually rewire the relays to change their operational characteristics. When digital computers became available, being general-purpose programmable devices, they were soon applied to control sequential and combinatorial logic in industrial processes. However, these early computers required specialist programmers and stringent operating environmental control of temperature and power quality.
To meet these challenges, the PLC was developed with several key attributes. It would tolerate the shop-floor environment, support discrete input and output in an extensible manner, not require years of training to use, and permit its operation to be monitored. Since many industrial processes have timescales easily addressed by millisecond response times and modern electronics facilitate building reliable controllers, performance could be traded off for reliability. In 1968, GM Hydramatic issued a request for proposals for an electronic replacement for hard-wired relay systems, based on a white paper written by engineer Edward R. Clark. The winning proposal came from Bedford Associates of Massachusetts, and the first PLC, designated the 084 because it was Bedford Associates' eighty-fourth project, was the result. Bedford Associates started a new company dedicated to developing, manufacturing and servicing this new product: Modicon, which stood for modular digital controller. One of the people who worked on that project was Dick Morley, considered to be the "father" of the PLC.
The Modicon brand was sold in 1977 to Gould Electronics, later acquired by the German company AEG and then by the French Schneider Electric, the current owner. One of the first 084 models built is now on display at Schneider Electric's facility in North Andover, Massachusetts; it was presented to Modicon by GM when the unit was retired after nearly twenty years of uninterrupted service. Modicon used the 84 moniker at the end of its product range. The automotive industry is still one of the largest users of PLCs. In a parallel development, Odo Josef Struger is also sometimes known as the "father of the programmable logic controller"; he was involved in the invention of the Allen-Bradley programmable logic controller between 1958 and 1960 and is credited with creating the PLC acronym. Allen-Bradley, the manufacturer of the controller, became a major programmable logic controller manufacturer in the United States during Struger's tenure. Early PLCs were designed to replace relay logic systems; these PLCs were programmed in "ladder logic", which resembles a schematic diagram of relay logic.
This program notation was chosen to reduce training demands for existing technicians. Other early PLCs used a form of instruction list programming, based on a stack-based logic solver. Modern PLCs can be programmed in a variety of ways, from relay-derived ladder logic to programming languages such as specially adapted dialects of BASIC and C. Another method is state logic, a high-level programming language designed to program PLCs based on state transition diagrams. The majority of PLC systems today adhere to the IEC 61131-3 control systems programming standard, which defines five languages: Ladder Diagram, Structured Text, Function Block Diagram, Instruction List and Sequential Function Chart. Many early PLCs did not have accompanying programming terminals capable of graphical representation of the logic, so the logic was instead represented as a series of logic expressions in some version of Boolean format, similar to Boolean algebra. As programming terminals evolved, it became more common for ladder logic to be used, for the aforementioned reasons and because it was a familiar format used for electro-mechanical control panels.
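The most common ladder-logic idiom, the start/stop "seal-in" rung, illustrates how these languages express relay behavior; in Structured Text it is roughly motor := (start OR motor) AND NOT stop. Below is an illustrative Java rendering of the same rung evaluated once per scan; the variable names are chosen for the example, not taken from any particular controller.

    // The classic start/stop seal-in (latch) rung, evaluated once per scan.
    // start/stop model momentary pushbuttons; names are illustrative.
    public class SealInRung {
        private boolean motor = false; // output coil, state held between scans

        public boolean scan(boolean start, boolean stop) {
            motor = (start || motor) && !stop; // the rung logic
            return motor;
        }

        public static void main(String[] args) {
            SealInRung rung = new SealInRung();
            System.out.println(rung.scan(true, false));  // start pressed -> true
            System.out.println(rung.scan(false, false)); // sealed in     -> true
            System.out.println(rung.scan(false, true));  // stop pressed  -> false
            System.out.println(rung.scan(false, false)); // stays off     -> false
        }
    }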
Newer formats such as state logic and Function Block (which is similar to the way logic is depicted when using digital integrated logic circuits) exist, but they are still not as popular as ladder logic.
Metadata is "data that provides information about other data". Many distinct types of metadata exist, among these descriptive metadata, structural metadata, administrative metadata, reference metadata and statistical metadata. Descriptive metadata describes a resource for purposes such as identification, it can include elements such as title, abstract and keywords. Structural metadata is metadata about containers of data and indicates how compound objects are put together, for example, how pages are ordered to form chapters, it describes the types, versions and other characteristics of digital materials. Administrative metadata provides information to help manage a resource, such as when and how it was created, file type and other technical information, who can access it. Reference metadata describes the contents and quality of statistical data Statistical metadata may describe processes that collect, process, or produce statistical data. Metadata was traditionally used in the card catalogs of libraries until the 1980s, when libraries converted their catalog data to digital databases.
In the 2000s, as digital formats became the prevalent way of storing data and information, metadata was also used to describe digital data using metadata standards. The first description of "meta data" for computer systems is purportedly noted by MIT's Center for International Studies experts David Griffel and Stuart McIntosh in 1967: "In summary we have statements in an object language about subject descriptions of data and token codes for the data. We have statements in a meta language describing the data relationships and transformations, ought/is relations between norm and data." There are different metadata standards for each different discipline. Describing the contents and context of data or data files increases their usefulness. For example, a web page may include metadata specifying what software language the page is written in, what tools were used to create it, what subjects the page is about, and where to find more information about the subject; this metadata can automatically improve the reader's experience and make it easier for users to find the web page online.
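In HTML, this page-level metadata lives in meta tags in the document head. The following Java sketch extracts name/content pairs from a small embedded page; it uses a simple regular expression for illustration, whereas a real crawler would use a full HTML parser, and the page content is invented for the example.

    // Extract simple <meta name="..." content="..."> pairs from HTML.
    // Regex-based for illustration only; real code should use an HTML parser.
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class MetaTagDemo {
        public static void main(String[] args) {
            String html = "<html><head>"
                + "<meta name=\"description\" content=\"A page about metadata\">"
                + "<meta name=\"keywords\" content=\"metadata, example\">"
                + "</head><body>...</body></html>"; // invented sample page
            Matcher m = Pattern
                .compile("<meta\\s+name=\"([^\"]+)\"\\s+content=\"([^\"]+)\"")
                .matcher(html);
            while (m.find()) {
                System.out.println(m.group(1) + " -> " + m.group(2));
            }
        }
    }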
A CD may include metadata providing information about the musicians and songwriters whose work appears on the disc. A principal purpose of metadata is to help users discover resources. Metadata helps to organize electronic resources, provide digital identification, and support the archiving and preservation of resources. Metadata assists users in resource discovery by "allowing resources to be found by relevant criteria, identifying resources, bringing similar resources together, distinguishing dissimilar resources, giving location information." Metadata on telecommunication activities, including Internet traffic, is widely collected by various national governmental organizations; this data can be used for mass surveillance. In many countries, the metadata relating to emails, telephone calls, web pages, video traffic, IP connections and cell phone locations is stored by government organizations. Metadata means "data about data". Although the "meta" prefix means "after" or "beyond", it is used to mean "about" in epistemology.
Metadata is defined as data providing information about one or more aspects of the data. Some examples include:

Means of creation of the data
Purpose of the data
Time and date of creation
Creator or author of the data
Location on a computer network where the data was created
Standards used
File size
Data quality
Source of the data
Process used to create the data

For example, a digital image may include metadata that describes how large the picture is, the color depth, the image resolution, when the image was created, the shutter speed and other data. A text document's metadata may contain information about how long the document is, who the author is, when the document was written, and a short summary of the document. Metadata within web pages can also contain descriptions of page content, as well as keywords linked to the content; these are called "metatags", and they were used as the primary factor in determining order for a web search until the late 1990s. Reliance on metatags in web searches decreased in the late 1990s because of "keyword stuffing".
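Several of the fields listed above, such as file size and creation time, are maintained by the file system itself and can be read programmatically. A minimal Java sketch using the standard java.nio.file API follows; the file path is a placeholder.

    // Read basic file-system metadata (data about a file, not its contents).
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.attribute.BasicFileAttributes;

    public class FileMetadata {
        public static void main(String[] args) throws Exception {
            // Placeholder path; pass a real file as the first argument.
            Path path = Paths.get(args.length > 0 ? args[0] : "example.txt");
            BasicFileAttributes attrs =
                Files.readAttributes(path, BasicFileAttributes.class);
            System.out.println("Size (bytes):  " + attrs.size());
            System.out.println("Created:       " + attrs.creationTime());
            System.out.println("Last modified: " + attrs.lastModifiedTime());
            System.out.println("Content type:  " + Files.probeContentType(path));
        }
    }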
Metatags were being misused to trick search engines into thinking some websites had more relevance in a search than they actually did. Metadata can be stored and managed in a database called a metadata registry or metadata repository. However, without context and a point of reference, it might be impossible to identify metadata just by looking at it. For example, by itself, a database containing several numbers, all 13 digits long, could be the results of calculations or a list of numbers to plug into an equation; without any other context, the numbers themselves can be perceived as the data. But given the context that this database is a log of a book collection, those 13-digit numbers may now be identified as ISBNs: information that refers to the book, but is not itself the information within the book. The term "metadata" was coined in 1968 by Philip Bagley in his book "Extension of Programming Language Concepts", where it is clear that he uses the term in the ISO/IEC 11179 "traditional" sense of "structural metadata", i.e. "data about the containers of data".
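Context aside, ISBNs also carry internal structure that helps identify them: the thirteenth digit of an ISBN-13 is a check digit, computed by weighting the digits alternately by 1 and 3 so that the total is divisible by 10. A small Java sketch of that standard checksum (the sample number is the commonly cited example ISBN 9780306406157):

    // Validate an ISBN-13: weight digits 1,3,1,3,... and require sum % 10 == 0.
    public class Isbn13Check {
        public static boolean isValidIsbn13(String isbn) {
            if (isbn == null || !isbn.matches("\\d{13}")) return false;
            int sum = 0;
            for (int i = 0; i < 13; i++) {
                int digit = isbn.charAt(i) - '0';
                sum += (i % 2 == 0) ? digit : 3 * digit; // alternate weights 1 and 3
            }
            return sum % 10 == 0;
        }

        public static void main(String[] args) {
            System.out.println(isValidIsbn13("9780306406157")); // true
            System.out.println(isValidIsbn13("9780306406158")); // false (bad check digit)
        }
    }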
Java (software platform)
Java is a set of computer software and specifications, originally developed by James Gosling at Sun Microsystems (later acquired by the Oracle Corporation), that provides a system for developing application software and deploying it in a cross-platform computing environment. Java is used in a wide variety of computing platforms, from embedded devices and mobile phones to enterprise servers and supercomputers. Java applets, which are less common than standalone Java applications, were run in secure, sandboxed environments to provide many features of native applications through being embedded in HTML pages. It is still possible to run Java in web browsers even after most of them dropped support for Java's VM. Writing in the Java programming language is the primary way to produce code that will be deployed as bytecode in a Java virtual machine (JVM). In addition, several languages have been designed to run natively on the JVM, including Clojure and Scala. Java syntax borrows from C and C++, but its object-oriented features are modeled after Smalltalk and Objective-C.
Java eschews certain low-level constructs such as pointers and has a simple memory model in which objects are allocated on the heap and all variables of object types are references. Memory management is handled through integrated automatic garbage collection performed by the JVM. On November 13, 2006, Sun Microsystems made the bulk of its implementation of Java available under the GNU General Public License. The latest version is Java 11, released on September 25, 2018; Java 11 is a supported long-term support (LTS) version. Oracle "highly recommend that you uninstall older versions of Java" because of serious risks due to unresolved security issues, and since Java 9 is no longer supported, Oracle advises its users to "immediately transition" to Java 11. Extended support for Java 6 ended in December 2018. The Java platform is a suite of programs that facilitate developing and running programs written in the Java programming language. A Java platform includes a compiler and a set of libraries. Java is not specific to any processor or operating system, as Java platforms have been implemented for a wide variety of hardware and operating systems with a view to enabling Java programs to run identically on all of them.
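A consequence of the reference-based memory model is that assigning one object variable to another copies the reference, not the object, and the garbage collector reclaims an object only once no reference to it remains. A short illustrative sketch:

    // Object variables are references into the heap, not the objects themselves.
    public class ReferenceDemo {
        static class Box { int value; }

        public static void main(String[] args) {
            Box a = new Box();           // object allocated on the heap
            Box b = a;                   // b references the same object as a
            b.value = 42;
            System.out.println(a.value); // prints 42: a and b alias one object
            b = null;                    // object still reachable through a
            a = null;                    // now unreachable; the JVM's garbage
                                         // collector may reclaim it automatically
        }
    }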
Different platforms target different classes of device and application domains:

Java Card: a technology that allows small Java-based applications to be run securely on smart cards and similar small-memory devices.
Java ME (Micro Edition): specifies several different sets of libraries for devices with limited storage and power capacities; it is used to develop applications for mobile devices, PDAs, TV set-top boxes and printers.
Java SE (Standard Edition): for general-purpose use on desktop PCs, servers and similar devices.
Java EE (Enterprise Edition): Java SE plus various APIs useful for multi-tier client–server enterprise applications.

The Java platform consists of several programs, each of which provides a portion of its overall capabilities. For example, the Java compiler, which converts Java source code into Java bytecode, is provided as part of the Java Development Kit (JDK); the Java Runtime Environment (JRE), complementing the JVM with a just-in-time (JIT) compiler, converts intermediate bytecode into native machine code on the fly. The Java platform also includes an extensive set of libraries.
The essential components in the platform are the Java language compiler, the libraries, and the runtime environment in which Java intermediate bytecode executes according to the rules laid out in the virtual machine specification. The heart of the Java platform is the concept of a "virtual machine" that executes Java bytecode programs; this bytecode is the same no matter what operating system the program is running under. However, new versions, such as Java 10, have made small changes, meaning the bytecode is in general only forward compatible. There is a JIT compiler within the Java Virtual Machine (JVM); the JIT compiler translates the Java bytecode into native processor instructions at run time and caches the native code in memory during execution. The use of bytecode as an intermediate language permits Java programs to run on any platform that has a virtual machine available; the use of a JIT compiler means that Java applications, after a short delay during loading and once they have "warmed up" by being all or mostly JIT-compiled, tend to run about as fast as native programs.
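The source-to-bytecode pipeline is visible even with the smallest program: javac emits a .class file of portable bytecode, and the java launcher hands it to whichever JVM the host provides.

    // HelloWorld.java
    // Compile: javac HelloWorld.java   (produces HelloWorld.class, bytecode)
    // Run:     java HelloWorld         (the JVM interprets/JIT-compiles it)
    public class HelloWorld {
        public static void main(String[] args) {
            System.out.println("Hello from the JVM");
        }
    }

The same HelloWorld.class file runs unchanged on any operating system with a compatible JVM, which is the portability property described above.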
Since JRE version 1.2, Sun's JVM implementation has included a just-in-time compiler instead of an interpreter. Although Java programs are cross-platform or platform independent, the code of the Java virtual machines that execute these programs is not: every supported operating platform has its own JVM. In most modern operating systems, a large body of reusable code is provided to simplify the programmer's job; this code is provided as a set of dynamically loadable libraries that applications can call at runtime. Because the Java platform is not dependent on any specific operating system, applications cannot rely on any of the pre-existing OS libraries; instead, the Java platform provides a comprehensive set of standard class libraries of its own.
Active Directory

Active Directory is a directory service that Microsoft developed for Windows domain networks. It is included in most Windows Server operating systems as a set of services. Initially, Active Directory was only in charge of centralized domain management; starting with Windows Server 2008, it became an umbrella title for a broad range of directory-based identity-related services. A server running Active Directory Domain Services is called a domain controller; it authenticates and authorizes all users and computers in a Windows domain type network, assigning and enforcing security policies for all computers and installing or updating software. For example, when a user logs into a computer that is part of a Windows domain, Active Directory checks the submitted password and determines whether the user is a system administrator or a normal user. Active Directory allows management and storage of information, provides authentication and authorization mechanisms, and establishes a framework to deploy other related services: Certificate Services, Active Directory Federation Services, Lightweight Directory Services and Rights Management Services.
Active Directory uses Lightweight Directory Access Protocol (LDAP) versions 2 and 3, Microsoft's version of Kerberos, and DNS. Active Directory, like many information-technology efforts, originated out of a democratization of design using Requests for Comments (RFCs); the Internet Engineering Task Force, which oversees the RFC process, has accepted numerous RFCs initiated by widespread participants. Active Directory incorporates decades of communication technologies into the overarching Active Directory concept and makes improvements upon them. For example, LDAP underpins Active Directory, and X.500 directories and the Organizational Unit preceded the Active Directory concept that makes use of those methods. The LDAP concept began to emerge even before the founding of Microsoft in April 1975, with RFCs as early as 1971. RFCs contributing to LDAP include RFC 1823, RFC 2307, RFC 3062 and RFC 4533. Microsoft previewed Active Directory in 1999, released it first with the Windows 2000 Server edition, and revised it to extend functionality and improve administration in Windows Server 2003.
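Because Active Directory speaks standard LDAP, it can be queried with any LDAP client. The sketch below uses the JDK's built-in JNDI LDAP provider; the server address, credentials, base DN and search filter are placeholders invented for the example, not values from the source.

    // Minimal LDAP search against a directory such as Active Directory,
    // using the JDK's built-in JNDI provider. All connection details are
    // placeholders for illustration.
    import java.util.Hashtable;
    import javax.naming.Context;
    import javax.naming.NamingEnumeration;
    import javax.naming.directory.DirContext;
    import javax.naming.directory.InitialDirContext;
    import javax.naming.directory.SearchControls;
    import javax.naming.directory.SearchResult;

    public class LdapQueryDemo {
        public static void main(String[] args) throws Exception {
            Hashtable<String, String> env = new Hashtable<>();
            env.put(Context.INITIAL_CONTEXT_FACTORY,
                    "com.sun.jndi.ldap.LdapCtxFactory");
            env.put(Context.PROVIDER_URL, "ldap://dc1.example.com:389"); // placeholder
            env.put(Context.SECURITY_AUTHENTICATION, "simple");
            env.put(Context.SECURITY_PRINCIPAL, "jdoe@example.com");     // placeholder
            env.put(Context.SECURITY_CREDENTIALS, "secret");             // placeholder

            DirContext ctx = new InitialDirContext(env);
            SearchControls controls = new SearchControls();
            controls.setSearchScope(SearchControls.SUBTREE_SCOPE);

            // Look up one user object by its Windows logon name.
            NamingEnumeration<SearchResult> results = ctx.search(
                "dc=example,dc=com", "(sAMAccountName=jdoe)", controls);
            while (results.hasMore()) {
                System.out.println(results.next().getNameInNamespace());
            }
            ctx.close();
        }
    }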
Additional improvements came with subsequent versions of Windows Server. In Windows Server 2008, additional services were added to Active Directory, such as Active Directory Federation Services; the part of the directory in charge of the management of domains, a core part of the operating system, was renamed Active Directory Domain Services and became a server role like the others. "Active Directory" became the umbrella title for a broader range of directory-based services. According to Bryon Hynes, everything related to identity was brought under Active Directory's banner. Active Directory Services consist of multiple directory services. The best known is Active Directory Domain Services, commonly abbreviated as AD DS or simply AD. Active Directory Domain Services is the cornerstone of every Windows domain network: it stores information about members of the domain, including devices and users, verifies their credentials and defines their access rights. The server running this service is called a domain controller. A domain controller is contacted when a user logs into a device, accesses another device across the network, or runs a line-of-business Metro-style app sideloaded onto a device.
Other Active Directory services, as well as most Microsoft server technologies, rely on or use Domain Services. Active Directory Lightweight Directory Services (AD LDS), formerly known as Active Directory Application Mode, is a light-weight implementation of AD DS that runs as a service on Windows Server. AD LDS shares the code base with AD DS and provides the same functionality, including an identical API, but does not require the creation of domains or domain controllers; it provides a Data Store for storage of directory data and a Directory Service with an LDAP Directory Service Interface. Unlike AD DS, multiple AD LDS instances can run on the same server. Active Directory Certificate Services (AD CS) establishes an on-premises public key infrastructure; it can create and revoke public key certificates for internal uses of an organization, and these certificates can be used to encrypt files and network traffic. AD CS predates Windows Server 2008, but its name was then simply Certificate Services. AD CS requires an AD DS infrastructure.
Active Directory Federation Services (AD FS) is a single sign-on service. With an AD FS infrastructure in place, users may use several web-based services or network resources using only one set of credentials stored at a central location, as opposed to having to be granted a dedicated set of credentials for each service. AD FS's purpose is an extension of that of AD DS: the latter enables users to authenticate with and use the devices that are part of the same network, using one set of credentials; the former enables them to use the same set of credentials in a different network. As the name suggests, AD FS works based on the concept of federated identity. AD FS requires an AD DS infrastructure. Active Directory Rights Management Services (AD RMS) is server software for information rights management shipped with Windows Server.