Electronic data processing
Electronic data processing refers to the use of automated methods to process commercial data, typically using simple, repetitive activities to process large volumes of similar information: for example, stock updates applied to an inventory, banking transactions applied to account and customer master files, ticketing transactions applied to an airline's reservation system, and billing for utility services. The modifier "electronic" or "automatic" was used with "data processing" from about 1960 to distinguish human clerical data processing from that done by computer. The first commercial business computer was developed in the United Kingdom in 1951 by the J. Lyons and Co. catering organization. Known as the 'Lyons Electronic Office', or LEO for short, it was developed further and used during the 1960s and early 1970s. (Joe Lyons formed a separate company to develop the LEO computers; it subsequently merged into English Electric Leo Marconi and then International Computers Limited.) By the end of the 1950s the punched card manufacturers – Powers-Samas, IBM and others – were marketing an array of computers.
Early commercial systems were installed by large organizations, which could afford to invest the time and capital necessary to purchase hardware, hire specialist staff to develop bespoke software and work through the consequent organizational and cultural changes. At first, individual organizations developed their own software, including data management utilities, themselves. Different products might each have 'one-off' bespoke software; this fragmented approach led to duplicated effort, and producing management information required manual effort. High hardware costs and slow processing speeds forced developers to use resources 'efficiently'. Data storage formats were compacted; a common example is the removal of the century from dates, which led to the 'millennium bug'. Data input required intermediate processing via punched paper tape or punched card and separate data entry – a repetitive, labor-intensive task, removed from user control and error-prone. Invalid or incorrect data needed correction and resubmission, with consequences for data and account reconciliation.
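The date compaction described above can be illustrated with a short sketch. Storing only two digits of the year saves space but makes '00' ambiguous; a common remediation was a pivot (windowing) rule. The function name and pivot value here are illustrative, not drawn from any particular historical system.

```python
def expand_year(yy: int, pivot: int = 50) -> int:
    """Map a two-digit stored year to a four-digit year using a pivot window.

    Years at or above the pivot are assumed to be 19xx; years below it, 20xx.
    """
    return 1900 + yy if yy >= pivot else 2000 + yy

print(expand_year(99))  # 1999
print(expand_year(1))   # 2001
```

Any fixed pivot only postpones the ambiguity; the underlying problem is that the century was discarded at storage time and cannot be recovered with certainty.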
Data storage was serial, on paper tape and later magnetic tape: the use of data storage within accessible memory was not cost-effective. Significant developments took place in 1959, with IBM announcing the 1401 computer, and in 1962, with ICT making delivery of the ICT 1301. Like all machines of this time, the core processor together with the peripherals – magnetic tape decks, drums, and card and paper tape input and output – required considerable space in specially constructed, air-conditioned accommodation. Parts of the punched card installation, in particular sorters, were retained to present the card input to the computer in pre-sorted form, reducing the processing time involved in sorting large amounts of data. Data processing facilities became available to smaller organizations in the form of the computer services bureau; these offered processing of specific applications, e.g. payroll, and were a prelude to the purchase of customers' own computers. Organizations also used these facilities for testing programs while awaiting the arrival of their own machine.
These initial machines were delivered to customers with limited software. The design staff was divided into two groups: systems analysts produced a systems specification and programmers translated the specification into machine language. Literature on computers and EDP was sparse, available mainly through articles in accountancy publications and material supplied by the equipment manufacturers; the first issue of The Computer Journal, published by The British Computer Society, appeared in mid-1958. The UK accountancy body now named The Association of Chartered Certified Accountants formed an Electronic Data Processing Committee in July 1958 with the purpose of informing its members of the opportunities created by the computer; the Committee's first booklet was An Introduction to Electronic Computers. In 1958 The Institute of Chartered Accountants in England and Wales produced a paper, Accounting by Electronic Methods, whose notes indicated the possible implications of using a computer. Progressive organizations attempted to go beyond the straight transfer of systems from punched card equipment and unit accounting machines to the computer, towards producing accounts to the trial balance stage and integrated management information systems.
New procedures redesigned the way paper flowed, changed organizational structures, called for a rethink of the way information was presented to management and challenged the internal control principles adopted by the designers of accounting systems. But the full realization of these benefits had to await the arrival of the next generation of computers. As with other industrial processes, commercial IT has in most cases moved from a custom-order, craft-based industry, where the product was tailored to fit the customer, to one based on mass production. Mass production has reduced costs, and IT is available to the smallest organization. LEO was hardware tailored for a single client; today, Intel Pentium and compatible chips are standard and become parts of other components, which are combined as needed. One notable change was the freeing of computers and removable storage from protected, air-filtered environments. Microsoft and IBM have at various times been influential enough to impose order on IT, and the resultant standardizations allowed specialist software to flourish.
Software is available off the shelf.
Birmingham is the second-most populous city in the United Kingdom, after London, and the most populous city in the English Midlands. It is the most populous metropolitan district in the United Kingdom, with an estimated 1,137,123 inhabitants, and is considered the social, cultural and commercial centre of the Midlands. It is the main centre of the West Midlands conurbation, the third most populated urban area in the United Kingdom, with a population of 2,897,303 in 2017. The wider Birmingham metropolitan area is the second largest in the United Kingdom, with a population of over 4.3 million, and the city is often referred to as the United Kingdom's "second city". A market town in the medieval period, Birmingham grew in the 18th-century Midlands Enlightenment and subsequent Industrial Revolution, which saw advances in science and economic development, producing a series of innovations that laid many of the foundations of modern industrial society. By 1791 it was being hailed as "the first manufacturing town in the world".
Birmingham's distinctive economic profile, with thousands of small workshops practising a wide variety of specialised and skilled trades, encouraged exceptional levels of creativity and innovation and provided an economic base for prosperity that was to last into the final quarter of the 20th century. The Watt steam engine was invented in Birmingham. The resulting high level of social mobility fostered a culture of political radicalism which, under leaders from Thomas Attwood to Joseph Chamberlain, was to give the city a political influence unparalleled in Britain outside London and a pivotal role in the development of British democracy. From the summer of 1940 to the spring of 1943, Birmingham was bombed by the German Luftwaffe in what is known as the Birmingham Blitz; the damage done to the city's infrastructure, in addition to a deliberate policy of demolition and new building by planners, led to extensive urban regeneration in subsequent decades. Birmingham's economy is now dominated by the service sector.
The city is a major international commercial centre, ranked as a beta world city by the Globalization and World Cities Research Network. Its metropolitan economy is the second largest in the United Kingdom, with a GDP of $121.1bn, and its six universities make it the largest centre of higher education in the country outside London. Birmingham's major cultural institutions – the City of Birmingham Symphony Orchestra, the Birmingham Royal Ballet, the Birmingham Repertory Theatre, the Library of Birmingham and the Barber Institute of Fine Arts – enjoy international reputations, and the city has vibrant and influential grassroots art, music and culinary scenes. People from Birmingham are called Brummies, a term derived from the city's nickname of "Brum", which originates from the city's old name, in turn thought to have derived from "Bromwich-ham"; the Brummie accent and dialect are distinctive. Birmingham's early history is that of a marginal area; the main centres of population and wealth in the pre-industrial English Midlands lay in the fertile and accessible river valleys of the Trent, the Severn and the Avon.
The area of modern Birmingham lay in between, on the upland Birmingham Plateau and within the densely wooded and sparsely populated Forest of Arden. There is evidence of early human activity in the Birmingham area dating back to around 8000 BC, with stone age artefacts suggesting seasonal settlements, overnight hunting parties and woodland activities such as tree felling; the many burnt mounds that can still be seen around the city indicate that modern humans first intensively settled and cultivated the area during the bronze age, when a substantial but short-lived influx of population occurred between 1700 BC and 1000 BC, caused by conflict or immigration in the surrounding area. During the 1st-century Roman conquest of Britain, the forested country of the Birmingham Plateau formed a barrier to the advancing Roman legions, who built the large Metchley Fort in the area of modern-day Edgbaston in AD 48 and made it the focus of a network of Roman roads. Birmingham as a settlement dates from the Anglo-Saxon era.
The city's name comes from the Old English Beormingahām, meaning the home or settlement of the Beormingas – indicating that Birmingham was established in the 6th or early 7th century as the primary settlement of an Anglian tribal grouping and regio of that name. Despite this early importance, by the time of the Domesday Book of 1086 the manor of Birmingham was one of the poorest and least populated in Warwickshire, valued at only 20 shillings, with the area of the modern city divided between the counties of Warwickshire and Worcestershire. The development of Birmingham into a significant urban and commercial centre began in 1166, when the Lord of the Manor Peter de Bermingham obtained a charter to hold a market at his castle and followed this with the creation of a planned market town and seigneurial borough within his demesne or manorial estate, around the site that became the Bull Ring. This established Birmingham as the primary commercial centre for the Birmingham Plateau at a time when the area's economy was expanding, with population growth nationally leading to the clearance and settlement of marginal land.
Within a century of the charter Birmingham had grown into a prosperous urban centre of merchants and craftsmen. By 1327 it was the third-largest town in Warwickshire, a position it would retain for the next 200 years; the principal governing institutions of medieval Birmingham – including the Guild of the Ho
Information and communications technology
Information and communications technology is an extensional term for information technology that stresses the role of unified communications and the integration of telecommunications and computers, as well as necessary enterprise software, middleware and audiovisual systems, that enable users to access, store and manipulate information. The term ICT is used to refer to the convergence of audiovisual and telephone networks with computer networks through a single cabling or link system. There are large economic incentives to merge the telephone network with the computer network system using a single unified system of cabling, signal distribution and management. ICT is a broad subject and the concepts are evolving; it covers any product that will store, manipulate, transmit or receive information electronically in a digital form. For clarity, Zuppo provided an ICT hierarchy where all levels of the hierarchy "contain some degree of commonality in that they are related to technologies that facilitate the transfer of information and various types of electronically mediated communications".
Theoretical differences between interpersonal-communication technologies and mass-communication technologies have been identified by the philosopher Piyush Mathur. The Skills Framework for the Information Age is one of many models for describing and managing competencies for ICT professionals in the 21st century. The phrase "information and communication technologies" has been used by academic researchers since the 1980s. The abbreviation "ICT" became popular after it was used in a report to the UK government by Dennis Stevenson in 1997 and in the revised National Curriculum for England and Northern Ireland in 2000. However, in 2012, the Royal Society recommended that the use of the term "ICT" should be discontinued in British schools "as it has attracted too many negative connotations". From 2014 the National Curriculum has used the word computing, which reflects the addition of computer programming into the curriculum. Variations of the phrase have spread worldwide; the United Nations has created a "United Nations Information and Communication Technologies Task Force" and an internal "Office of Information and Communications Technology".
The money spent on IT worldwide has been estimated as US$3.8 trillion in 2017 and has been growing at less than 5% per year since 2009. The estimated 2018 growth of the entire ICT market is 5%, with the biggest growth, of 16%, expected in the area of new technologies. The 2014 IT budget of the US federal government was nearly $82 billion. IT costs, as a percentage of corporate revenue, have grown 50% since 2002, putting a strain on IT budgets. Of current companies' IT budgets, 75% are recurrent costs, used to "keep the lights on" in the IT department, and 25% are the cost of new initiatives for technology development. The average IT budget has the following breakdown: 31% personnel costs, 29% software costs, 26% hardware costs and 14% costs of external service providers. The estimate of money to be spent in 2022 is just over US$6 trillion. The world's technological capacity to store information grew from 2.6 exabytes in 1986 to 15.8 in 1993, over 54.5 in 2000, 295 exabytes in 2007 and some 5 zettabytes in 2014.
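The average budget breakdown quoted above can be sanity-checked and applied to a concrete total with a short sketch; the percentages come from the text, but the $10M budget figure is an arbitrary example, not from the source.

```python
# Shares of the average IT budget as quoted in the text.
breakdown = {
    "personnel": 0.31,
    "software": 0.29,
    "hardware": 0.26,
    "external service providers": 0.14,
}
# The four shares should account for the whole budget.
assert abs(sum(breakdown.values()) - 1.0) < 1e-9

total = 10_000_000  # hypothetical $10M IT budget, for illustration only
for item, share in breakdown.items():
    print(f"{item}: ${share * total:,.0f}")
```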
This is the informational equivalent to 1.25 stacks of CD-ROM from the earth to the moon in 2007, and the equivalent of 4,500 stacks of printed books from the earth to the sun in 2014. The world's technological capacity to receive information through one-way broadcast networks was 432 exabytes of information in 1986, 715 exabytes in 1993, 1.2 zettabytes in 2000 and 1.9 zettabytes in 2007. The world's effective capacity to exchange information through two-way telecommunication networks was 281 petabytes of information in 1986, 471 petabytes in 1993, 2.2 exabytes in 2000, 65 exabytes in 2007 and some 100 exabytes in 2014. The world's technological capacity to compute information with humanly guided general-purpose computers grew from 3.0 × 10^8 MIPS in 1986 to 6.4 × 10^12 MIPS in 2007. The following is a list of OECD countries by share of ICT sector in total value added in 2013; the ICT Development Index ranks and compares the level of ICT use and access across the various countries around the world. In 2014 the ITU released the latest rankings of the IDI, with Denmark attaining the top spot, followed by South Korea.
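The capacity figures above imply compound annual growth rates that can be computed directly; a small sketch (the helper name is illustrative, and the printed rates are approximate):

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values over a span of years."""
    return (end / start) ** (1 / years) - 1

# Storage capacity: 2.6 EB (1986) -> 295 EB (2007), 21 years.
storage = cagr(2.6, 295, 2007 - 1986)
# General-purpose computation: 3.0e8 MIPS (1986) -> 6.4e12 MIPS (2007).
compute = cagr(3.0e8, 6.4e12, 2007 - 1986)

print(f"storage ~{storage:.1%}/yr, computation ~{compute:.1%}/yr")
```

Computation grew far faster than storage over the same period, roughly 60% per year against roughly 25% per year.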
The top 30 countries in the rankings include most high-income countries, where quality of life is higher than average, including countries from Europe and other regions such as Australia, Canada, Macao, New Zealand and the United States. In developing countries, ICT development is constrained by limited capabilities, and the objectives of ICT projects are often not met. On 21 December 2001, the United Nations General Assembly approved Resolution 56/183, endorsing the holding of the World Summit on the Information Society to discuss the opportunities and challenges facing today's information society. According to this resolution, the General Assembly related the Summit to the United Nations Millennium Declaration's goal of implementing ICT to achieve the Millennium Development Goals; it emphasized a multi-stakeholder approach to achieve these goals, using all stakeholders i
Typesetting is the composition of text by means of arranging physical types or their digital equivalents. Stored letters and other symbols are retrieved and ordered according to a language's orthography for visual display. Typesetting requires one or more fonts. One significant effect of typesetting was that authorship of works could be spotted more easily, making it difficult for copiers who had not gained permission. During much of the letterpress era, movable type was composed by hand for each page. Cast metal sorts were composed into words, lines, paragraphs and pages of text and bound together to make up a form, with all letter faces the same "height to paper", creating a level surface of type; the form was placed in a press and an impression made on paper. During typesetting, individual sorts are picked from a type case with the right hand and set into a composing stick held in the left hand from left to right, upside down as viewed by the setter; as seen in the photo of the composing stick, a lower case 'q' looks like a 'd', a lower case 'b' looks like a 'p', a lower case 'p' looks like a 'b' and a lower case 'd' looks like a 'q'.
This is reputed to be the origin of the expression "mind your p's and q's"; it might just as easily have been "mind your b's and d's". The diagram at right illustrates a cast metal sort: a face, b body or shank, c point size, 1 shoulder, 2 nick, 3 groove, 4 foot. Wooden printing sorts were in use for centuries in combination with metal type. Not shown, being more the concern of the casterman, is the "set", or width of each sort. Set width, like body size, is measured in points. In order to extend the working life of type, and to account for the finite number of sorts in a case of type, copies of forms were cast when subsequent printings of a text were anticipated, freeing the costly type for other work; this was prevalent in book and newspaper work, where rotary presses required type forms to wrap around an impression cylinder rather than be set in the bed of a press. In this process, called stereotyping, the entire form is pressed into a fine matrix such as plaster of Paris or papier mâché, called a flong, from which the stereotype form was electrotyped or cast of type metal.
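The point measurements of body size and set width mentioned above can be made concrete with a small conversion sketch. This assumes the modern DTP point (1/72 inch); the traditional printer's point differs slightly (1/72.27 inch), so historical measurements would not match exactly.

```python
# DTP (desktop publishing) definitions; traditional points differ slightly.
POINTS_PER_INCH = 72
POINTS_PER_PICA = 12
MM_PER_INCH = 25.4

def points_to_mm(pt: float) -> float:
    """Convert a point measurement to millimetres."""
    return pt / POINTS_PER_INCH * MM_PER_INCH

def points_to_picas(pt: float) -> float:
    """Convert a point measurement to picas (12 points per pica)."""
    return pt / POINTS_PER_PICA

print(f"{points_to_mm(12):.2f} mm")   # a 12 pt body is about 4.23 mm
print(points_to_picas(12))            # 1.0 pica
```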
Advances such as the typewriter and computer would push the state of the art farther ahead. Still, hand composition and letterpress printing have not fallen entirely out of use; since the introduction of digital typesetting, they have seen a revival as an artisanal pursuit. However, it is a small niche within the larger typesetting market. The time and effort required to manually compose the text led to several efforts in the 19th century to produce mechanical typesetting. While some, such as the Paige compositor, met with limited success, by the end of the 19th century several methods had been devised whereby an operator working a keyboard or other devices could produce the desired text. Most of the successful systems involved the in-house casting of the type to be used, hence are termed "hot metal" typesetting. The Linotype machine, invented in 1884, used a keyboard to assemble the casting matrices and cast an entire line of type at a time. In the Monotype System, a keyboard was used to punch a paper tape, which was then fed to control a casting machine.
The Ludlow Typograph used hand-set matrices, but otherwise used hot metal. By the early 20th century, the various systems were nearly universal in large newspapers and publishing houses. Phototypesetting or "cold type" systems first appeared in the early 1960s and displaced continuous casting machines; these devices consisted of glass or film disks or strips that spun in front of a light source to selectively expose characters onto light-sensitive paper. Initially they were driven by pre-punched paper tapes; later they were connected to computer front ends. One of the earliest electronic photocomposition systems was introduced by Fairchild Semiconductor. The typesetter typed a line of text on a Fairchild keyboard; to verify the correct content of the line, it was typed a second time. If the two lines were identical, a bell rang and the machine produced a punched paper tape corresponding to the text. With the completion of a block of lines, the typesetter fed the corresponding paper tapes into a phototypesetting device that mechanically set type outlines printed on glass sheets into place for exposure onto a negative film.
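The Fairchild double-keying check described above amounts to a simple compare-on-re-entry rule: punch tape only when both entries of a line match exactly. A minimal sketch, with an illustrative function name and sample strings:

```python
def verify_line(first_entry: str, second_entry: str) -> bool:
    """Return True (ring the bell, punch the tape) only on an exact match."""
    return first_entry == second_entry

# A matching re-entry passes; a single transposed pair of letters fails.
print(verify_line("Hot metal gave way", "Hot metal gave way"))  # True
print(verify_line("Hot metal gave way", "Hot metal gave wya"))  # False
```

Double keying catches most single-operator slips because two independent entries rarely contain the same error in the same place.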
Photosensitive paper was exposed to light through the negative film, resulting in a column of black type on white paper, or a galley. The galley was cut up and used to create a mechanical drawing or paste up of a whole page. A large film negative of the page was then used to make plates for offset printing. The next generation of phototypesetting machines to emerge were those that generated characters on a cathode ray tube. Typical of the type were the Alphanumeric APS2, IBM 2680, I.I.I. VideoComp, Autologic APS5 and Linotron 202; these machines were the mainstay of phototypesetting for much of the 1980s. Such machines could be "driven online" by a computer front-end system or take their data from magnetic tape. Type fonts were stored digitally on conventional magnetic disk drives. Computers excel at automatically typesetting and correcting documents. Character-by-character, computer-aided phototypesetting was, in turn, rendered obsolete in the 1980s by digital systems employing a raster image processor to render an entire page to a single high-resolution digital image, now known as imagesetting.
The first commercially successful laser imagesetter, able to make use of a raster image p
Radar is a detection system that uses radio waves to determine the range, angle, or velocity of objects. It can be used to detect aircraft, spacecraft, guided missiles, motor vehicles, weather formations and terrain. A radar system consists of a transmitter producing electromagnetic waves in the radio or microwave domain, a transmitting antenna, a receiving antenna, and a receiver and processor to determine properties of the object. Radio waves from the transmitter reflect off the object and return to the receiver, giving information about the object's location and speed. Radar was developed secretly for military use by several nations in the period before and during World War II. A key development was the cavity magnetron in the UK, which allowed the creation of small systems with sub-meter resolution. The term RADAR was coined in 1940 by the United States Navy as an acronym for RAdio Detection And Ranging. The term radar has since entered English and other languages as a common noun, losing all capitalization.
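The ranging principle described above follows directly from the round-trip travel time of the reflected wave: the pulse covers twice the target distance at the speed of light, so R = c·t/2. A minimal sketch (the helper name is illustrative):

```python
C = 299_792_458  # speed of light in vacuum, m/s

def echo_range_m(round_trip_s: float) -> float:
    """Target range in metres from the round-trip echo delay in seconds."""
    return C * round_trip_s / 2

# A 1 ms round trip corresponds to a target roughly 150 km away.
print(f"{echo_range_m(1e-3) / 1000:.1f} km")
```

The division by two is the essential step: the measured delay covers the outbound and return legs, while the range is one leg only.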
The modern uses of radar are diverse, including air and terrestrial traffic control, radar astronomy, air-defense systems, antimissile systems, marine radars to locate landmarks and other ships, aircraft anticollision systems, ocean surveillance systems, outer space surveillance and rendezvous systems, meteorological precipitation monitoring and flight control systems, guided missile target locating systems, ground-penetrating radar for geological observations, and range-controlled radar for public health surveillance. High-tech radar systems are associated with digital signal processing and machine learning, and are capable of extracting useful information from very high noise levels. Radar is a key technology that self-driving systems are designed to use, along with sonar and other sensors. Other systems similar to radar make use of other parts of the electromagnetic spectrum; one example is lidar. With the emergence of driverless vehicles, radar is expected to assist the automated platform to monitor its environment, thus preventing unwanted incidents.
As early as 1886, German physicist Heinrich Hertz showed that radio waves could be reflected from solid objects. In 1895, Alexander Popov, a physics instructor at the Imperial Russian Navy school in Kronstadt, developed an apparatus using a coherer tube for detecting distant lightning strikes; the next year, he added a spark-gap transmitter. In 1897, while testing this equipment for communicating between two ships in the Baltic Sea, he took note of an interference beat caused by the passage of a third vessel. In his report, Popov wrote that this phenomenon might be used for detecting objects, but he did nothing more with this observation. The German inventor Christian Hülsmeyer was the first to use radio waves to detect "the presence of distant metallic objects". In 1904, he demonstrated the feasibility of detecting a ship in dense fog, but not its distance from the transmitter. He obtained a patent for his detection device in April 1904 and later a patent for a related amendment for estimating the distance to the ship.
He got a British patent on September 23, 1904 for a full radar system, which he called a telemobiloscope. It operated on a 50 cm wavelength and the pulsed radar signal was created via a spark-gap. His system used the classic antenna setup of a horn antenna with parabolic reflector and was presented to German military officials in practical tests in Cologne and Rotterdam harbour, but was rejected. In 1915, Robert Watson-Watt used radio technology to provide advance warning to airmen, and during the 1920s went on to lead the U.K. research establishment to make many advances using radio techniques, including the probing of the ionosphere and the detection of lightning at long distances. Through his lightning experiments, Watson-Watt became an expert on the use of radio direction finding before turning his inquiry to shortwave transmission. Requiring a suitable receiver for such studies, he told the "new boy" Arnold Frederic Wilkins to conduct an extensive review of available shortwave units. Wilkins would select a General Post Office model after noting its manual's description of a "fading" effect when aircraft flew overhead.
Across the Atlantic in 1922, after placing a transmitter and receiver on opposite sides of the Potomac River, U.S. Navy researchers A. Hoyt Taylor and Leo C. Young discovered that ships passing through the beam path caused the received signal to fade in and out. Taylor submitted a report suggesting that this phenomenon might be used to detect the presence of ships in low visibility, but the Navy did not continue the work. Eight years later, Lawrence A. Hyland at the Naval Research Laboratory observed similar fading effects from passing aircraft. Before the Second World War, researchers in the United Kingdom, Germany, France, Italy, Japan, the Netherlands, the Soviet Union and the United States, independently and in great secrecy, developed technologies that led to the modern version of radar. Australia, New Zealand and South Africa followed prewar Great Britain's radar development, and Hungary generated its radar technology during the war. In France in 1934, following systematic studies on the split-anode magnetron, the research branch of the Compagnie Générale de Télégraphie Sans Fil, headed by Maurice Ponte with Henri Gutton, Sylvain Berline and M. Hugon, began developing an obstacle-locatin
Royal Electrical and Mechanical Engineers
The Corps of Royal Electrical and Mechanical Engineers is a corps of the British Army that maintains the equipment that the Army uses. Prior to REME's formation, maintenance was the responsibility of several different corps: the Royal Army Ordnance Corps (weapons and armoured vehicles); the Royal Engineers (engineering plant and machinery, and RE motor transport); the Royal Corps of Signals (communications equipment); the Royal Army Service Corps (other motor transport); and the Royal Artillery (heavy weapons artificers). During World War II, the increase in quantity and complexity of equipment exposed the flaws in this system. Pursuant to the recommendation of a Committee on Skilled Men in the Services chaired by William Beveridge, the Corps of Royal Electrical and Mechanical Engineers was formed on 1 October 1942. Such a major re-organisation was too complex to be carried out all at once in the middle of a world war; therefore, the changeover was undertaken in two phases. In Phase I, REME was formed on the existing framework of the RAOC Engineering Branch, strengthened by the transfer of certain technical units and tradesmen from the RE and RASC.
At the same time, a number of individual tradesmen were transferred into REME from other corps. The new corps was made responsible for repairing the technical equipment of all arms, with certain major exceptions. REME did not yet undertake: those repairs that were carried out by unit tradesmen who were driver/mechanics or fitters in regiments and belonged to the unit rather than being attached to it; repairs of RASC-operated vehicles, which remained the responsibility of the RASC; and repairs of RE specialist equipment, which remained the responsibility of the RE. In 1949, it was decided to extend REME's responsibilities; this decision was published in Army Council Instruction 110 of 1949, and the necessary reorganisation was carried out in the various arms and services in three stages between July 1951 and January 1952. The main changes were: the transfer to REME of most of the unit repair responsibilities of other arms; the provision of Light Aid Detachments for certain units that had not possessed them under the old organisation; and the provision of new REME workshops to carry out field repairs in RASC transport companies.
Maintenance of vessels of the RASC fleet whilst in port was given to the fleet repair branch, a civilian organisation which came under the REME umbrella. This organisation was responsible for arranging and overseeing ship refits. After some interim designs, the badge of the Corps was formalised in June 1943 for use as the cap-badge and collar-badge and on the buttons; it consisted of an oval, royally crowned laurel wreath. Within the wreath was a pair of calipers. Examples of these early badges can be found at the REME Museum. In 1947, the Horse and Lightning was adopted as the cap badge. At the end of the war, the Allies occupied the major German industrial centres to decide their fate; the Volkswagen factory at Wolfsburg became part of the British Zone in June 1945, and No. 30 Workshop Control Unit, REME, assumed control in July. They operated under the overall direction of Colonel Michael McEvoy at Rhine Army Headquarters, Bad Oeynhausen. Uniquely, he had experience of the KdF Wagen in his pre-war career as a motor racing engineer.
After visiting the Volkswagen factory, McEvoy had the idea of trying to get Volkswagen back into production to provide light transport for the occupying forces. The British Army, Red Cross and essential German services were chronically short of light vehicles. If the factory could provide them, there would be no cost to the British taxpayer and the factory could be saved. To do this, a good manager with technical experience would be needed. Maj. Ivan Hirst was told to "take charge of" the Volkswagen plant. Arriving in August 1945, he had drains fixed and bomb craters filled in. At first, the wartime Kubelwagen was viewed as a suitable vehicle. Once it became clear it could not be put back into production, the Volkswagen saloon or Kaefer was suggested. Hirst had an example delivered to Rhine Army headquarters, where it was demonstrated by Colonel McEvoy; the positive reaction led to the Military Government placing an order for 20,000 Volkswagens in September 1945. The REME Museum is based at MoD Lyneham.
The Defence School of Electronic and Mechanical Engineering at MoD Lyneham meets most of the training needs of the corps. With minor exceptions only, the Corps is now responsible for the examination, modification and recovery of all mechanical, electronic and optical equipment of the Army beyond the capacity of unit non-technical personnel. REME has its Regimental Headquarters collocated with 8 Training Battalion REME based in MOD Lyneham, in Wiltshire. All trade training and Artificer training of Electro/Mechanical trades of REME, and various related training for other units within the British Army and the Navy and Air Force, is conducted by 8 Training Battalion REME. In line with the Army 2020 review, there are seven Regular, two Training and six Army Reserve battalions within REME.
Regular Army Battalions:
1 Close Support Battalion REME – 4 CS Company, 12 CS Company
2 Close Support Battalion REME – 7 CS Company, 11 CS Company
3 Close Support Battalion REME – 5 Armoured Company, 20 Armoured Company, 18 Field Company
4 Close Support Battalion REME – 9 Armoured Company, 10 Armoured Company, 17 Field Company
5 Force Support Battalion REME – 1 Field Company, 2 Field Company, 15 Field Company
6 Close S
England is a country that is part of the United Kingdom. It shares land borders with Wales to the west and Scotland to the north-northwest; the Irish Sea lies to the west of England and the Celtic Sea to the southwest. England is separated from continental Europe by the North Sea to the east and the English Channel to the south. The country covers five-eighths of the island of Great Britain, which lies in the North Atlantic, and includes over 100 smaller islands, such as the Isles of Scilly and the Isle of Wight. The area now called England was first inhabited by modern humans during the Upper Palaeolithic period, but takes its name from the Angles, a Germanic tribe deriving its name from the Anglia peninsula, who settled during the 5th and 6th centuries. England became a unified state in the 10th century and, since the Age of Discovery, which began during the 15th century, has had a significant cultural and legal impact on the wider world. The English language, the Anglican Church and English law – the basis for the common law legal systems of many other countries around the world – developed in England, and the country's parliamentary system of government has been adopted by other nations.
The Industrial Revolution began in 18th-century England, transforming its society into the world's first industrialised nation. England's terrain is chiefly low hills and plains, especially in central and southern England; however, there is upland and mountainous terrain in the west. The capital is London, which has the largest metropolitan area in both the United Kingdom and the European Union. England's population of over 55 million comprises 84% of the population of the United Kingdom, largely concentrated around London, the South East and conurbations in the Midlands, the North West, the North East and Yorkshire, each of which developed as a major industrial region during the 19th century. The Kingdom of England – which after 1535 included Wales – ceased being a separate sovereign state on 1 May 1707, when the Acts of Union put into effect the terms agreed in the Treaty of Union the previous year, resulting in a political union with the Kingdom of Scotland to create the Kingdom of Great Britain. In 1801, Great Britain was united with the Kingdom of Ireland to become the United Kingdom of Great Britain and Ireland.
In 1922 the Irish Free State seceded from the United Kingdom, leading to the latter being renamed the United Kingdom of Great Britain and Northern Ireland. The name "England" is derived from the Old English name Englaland, which means "land of the Angles"; the Angles were one of the Germanic tribes that settled in Great Britain during the Early Middle Ages. The Angles came from the Anglia peninsula in the Bay of Kiel area of the Baltic Sea. The earliest recorded use of the term, as "Engla londe", is in the late-ninth-century translation into Old English of Bede's Ecclesiastical History of the English People. The term was then used in a different sense from the modern one, meaning "the land inhabited by the English", and it included English people in what is now south-east Scotland but was then part of the English kingdom of Northumbria. The Anglo-Saxon Chronicle recorded that the Domesday Book of 1086 covered the whole of England, meaning the English kingdom, but a few years later the Chronicle stated that King Malcolm III went "out of Scotlande into Lothian in Englaland", thus using it in the more ancient sense.
According to the Oxford English Dictionary, its modern spelling was first used in 1538. The earliest attested reference to the Angles occurs in the 1st-century work by Tacitus, Germania, in which the Latin word Anglii is used; the etymology of the tribal name itself is disputed by scholars. How and why a term derived from the name of a tribe less significant than others, such as the Saxons, came to be used for the entire country and its people is not known, but it seems to be related to the custom of calling the Germanic people in Britain Angli Saxones, or English Saxons, to distinguish them from the continental Saxons of Old Saxony between the Weser and Eider rivers in Northern Germany. In Scottish Gaelic, another language which developed on the island of Great Britain, the Saxon tribe gave its name to the word for England. An alternative name for England is Albion; the name Albion originally referred to the entire island of Great Britain. The nominally earliest record of the name appears in the Aristotelian Corpus, specifically the 4th-century BC De Mundo: "Beyond the Pillars of Hercules is the ocean that flows round the earth.
In it are two large islands called Britannia. But modern scholarly consensus ascribes De Mundo not to Aristotle but to Pseudo-Aristotle, i.e. it was written in the Graeco-Roman period or afterwards. The word Albion or insula Albionum has two possible origins: it either derives from a cognate of the Latin albus, meaning white, a reference to the white cliffs of Dover, or from the phrase "island of the Albiones" in the now lost Massaliote Periplus, attested through Avienus' Ora Maritima, to which the former served as a source. Albion is now applied to England in a more poetic capacity. Another romantic name for England is Loegria, related to the Welsh word for England and made popular by its use in Arthurian legend. The earliest known evidence of human presence in the area now known as England was that of Homo antecessor, dating to approximate