Numerical weather prediction
Numerical weather prediction uses mathematical models of the atmosphere and oceans to predict the weather based on current weather conditions. Though first attempted in the 1920s, it was not until the advent of computer simulation in the 1950s that numerical weather predictions produced realistic results. A number of global and regional forecast models are run in different countries worldwide, using current weather observations relayed from radiosondes, weather satellites and other observing systems as inputs. Mathematical models based on the same physical principles can be used to generate either short-term weather forecasts or longer-term climate predictions; the improvements made to regional models have allowed for significant improvements in tropical cyclone track and air quality forecasts. Manipulating the vast datasets and performing the complex calculations necessary to modern numerical weather prediction requires some of the most powerful supercomputers in the world. Even with the increasing power of supercomputers, the forecast skill of numerical weather models extends to only about six days.
Factors affecting the accuracy of numerical predictions include the density and quality of observations used as input to the forecasts, along with deficiencies in the numerical models themselves. Post-processing techniques such as model output statistics have been developed to improve the handling of errors in numerical predictions. A more fundamental problem lies in the chaotic nature of the partial differential equations that govern the atmosphere: it is impossible to solve these equations exactly, and small errors grow with time. Present understanding is that this chaotic behavior limits accurate forecasts to about 14 days even with accurate input data and a flawless model. In addition, the partial differential equations used in the model need to be supplemented with parameterizations for solar radiation, moist processes, heat exchange, vegetation, surface water and the effects of terrain. In an effort to quantify the large amount of inherent uncertainty remaining in numerical predictions, ensemble forecasts have been used since the 1990s to help gauge the confidence in the forecast and to obtain useful results farther into the future than otherwise possible.
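The chaotic error growth described above can be illustrated with a toy system. The sketch below (an illustrative example, not an operational model) integrates the Lorenz-63 equations, a classic low-order analogue of atmospheric convection, from a small ensemble of slightly perturbed initial states; the tiny perturbations grow until the ensemble members are spread across the whole attractor, which is exactly the behavior that motivates ensemble forecasting:

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system, a classic
    chaotic toy model of atmospheric convection."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

def integrate(state, n_steps):
    for _ in range(n_steps):
        state = lorenz_step(state)
    return state

# An "ensemble": several runs from slightly perturbed initial conditions.
rng = np.random.default_rng(0)
base = np.array([1.0, 1.0, 1.0])
members = [base + 1e-6 * rng.standard_normal(3) for _ in range(10)]
finals = np.array([integrate(m, 3000) for m in members])  # 30 model time units

# Perturbations of size 1e-6 have grown to order-one spread across
# the ensemble -- the hallmark of chaotic error growth.
spread = finals.std(axis=0)
```

Operational ensemble systems apply the same principle to full forecast models, perturbing both the initial conditions and aspects of the model physics.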
This approach analyzes multiple forecasts created with an individual forecast model or multiple models. The history of numerical weather prediction began in the 1920s through the efforts of Lewis Fry Richardson, who used procedures developed by Vilhelm Bjerknes to produce by hand a six-hour forecast for the state of the atmosphere over two points in central Europe, taking at least six weeks to do so. It was not until the advent of the computer and computer simulations that computation time was reduced to less than the forecast period itself. The ENIAC was used to create the first weather forecasts via computer in 1950, based on a simplified approximation to the atmospheric governing equations. In 1954, Carl-Gustaf Rossby's group at the Swedish Meteorological and Hydrological Institute used the same model to produce the first operational forecast. Operational numerical weather prediction in the United States began in 1955 under the Joint Numerical Weather Prediction Unit, a joint project of the U.S. Air Force and the Weather Bureau.
In 1956, Norman Phillips developed a mathematical model which could realistically depict monthly and seasonal patterns in the troposphere. Following Phillips' work, several groups began working to create general circulation models; the first general circulation climate model that combined both oceanic and atmospheric processes was developed in the late 1960s at the NOAA Geophysical Fluid Dynamics Laboratory. As computers have become more powerful, the size of the initial data sets has increased and newer atmospheric models have been developed to take advantage of the added available computing power; these newer models include more of the physical processes in the simplifications of the equations of motion in numerical simulations of the atmosphere. In 1966, West Germany and the United States began producing operational forecasts based on primitive-equation models, followed by the United Kingdom in 1972 and Australia in 1977; the development of limited area models facilitated advances in forecasting the tracks of tropical cyclones as well as air quality in the 1970s and 1980s.
By the early 1980s, models began to include the interactions of soil and vegetation with the atmosphere, which led to more realistic forecasts. The output of forecast models based on atmospheric dynamics is unable to resolve some details of the weather near the Earth's surface; as such, a statistical relationship between the output of a numerical weather model and the ensuing conditions at the ground was developed in the 1970s and 1980s, known as model output statistics. Starting in the 1990s, model ensemble forecasts have been used to help define the forecast uncertainty and to extend the window in which numerical weather forecasting is viable farther into the future than otherwise possible. The atmosphere is a fluid; as such, the idea of numerical weather prediction is to sample the state of the fluid at a given time and use the equations of fluid dynamics and thermodynamics to estimate the state of the fluid at some time in the future. The process of entering observation data into the model to generate initial conditions is called initialization.
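At its simplest, the model output statistics relationship described above is a regression fitted between a model's raw output and the conditions later observed at a station. The sketch below uses synthetic data and a single predictor for illustration; real MOS schemes fit many predictors per station and forecast lead time:

```python
import numpy as np

# Synthetic example: suppose a model's 2 m temperature forecast has a
# systematic bias and slope error relative to station observations.
rng = np.random.default_rng(1)
model_t = rng.uniform(-5, 25, size=200)                           # raw model forecasts (deg C)
observed_t = 0.9 * model_t - 1.5 + rng.normal(0, 1.0, size=200)   # synthetic "truth"

# Fit the MOS relationship observed = a * model + b by least squares.
a, b = np.polyfit(model_t, observed_t, deg=1)

# Apply the fitted correction to a new raw forecast.
raw_forecast = 12.0
corrected = a * raw_forecast + b
```

Given enough historical pairs of forecasts and observations, the fitted coefficients absorb systematic model biases that the dynamics alone cannot resolve.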
On land, terrain maps available at resolutions dow
University of Liverpool
The University of Liverpool is a public university based in the city of Liverpool, England. Founded as a college in 1881, it gained its royal charter in 1903 with the ability to award degrees and is one of the six original 'red brick' civic universities; it comprises three faculties organised into schools. It is a founding member of the Russell Group and of the N8 Group for research collaboration, and its management school is AACSB accredited. Ten Nobel Prize winners are amongst its alumni and past faculty, and the university offers more than 230 first degree courses across 103 subjects. Its alumni include the CEOs of GlobalFoundries, ARM Holdings, Tesco and The Coca-Cola Company. It was the world's first university to establish departments in oceanography, civic design and biochemistry (at the Johnston Laboratories). In 2006 the university became the first in the UK to establish an independent university in China, Xi'an Jiaotong-Liverpool University, making it the world's first Sino-British university.
For 2017–18, Liverpool had a turnover of £543.9 million, including £95.6 million from research grants and contracts. It has the fifth largest endowment of any university in England. Graduates of the university are styled with the post-nominal letters Lpool, to indicate the institution. The university has a strategic partnership with Laureate International Universities, a for-profit college collective, for University of Liverpool online; the partnership provides the technical infrastructure to deliver courses worldwide. The university was established in 1881 as University College Liverpool, admitting its first students in 1882. In 1884, it became part of the federal Victoria University. In 1894 Oliver Lodge, a professor at the university, made the world's first public radio transmission, and two years later took the first surgical X-ray in the United Kingdom. The Liverpool University Press was founded in 1899, making it the third oldest university press in England. Students in this period were awarded external degrees by the University of London.
Following a royal charter and act of Parliament in 1903, it became an independent university, called the University of Liverpool, with the right to confer its own degrees. The next few years saw major developments at the university, including Sir Charles Sherrington's discovery of the synapse and William Blair-Bell's work on chemotherapy in the treatment of cancer. From the 1930s to the 1940s, Sir James Chadwick and Sir Joseph Rotblat made major contributions to the development of the atomic bomb. From 1943 to 1966 Allan Downie, Professor of Bacteriology, was involved in the eradication of smallpox. In 1994 the university was a founding member of the Russell Group, a collaboration of twenty leading research-intensive universities, as well as a founding member of the N8 Group in 2004. In the 21st century, physicists and technicians from the University of Liverpool were involved in the construction of the Large Hadron Collider at CERN, working on two of the four detectors in the LHC. In 2004, Sylvan Learning, later known as Laureate International Universities, became the worldwide partner for University of Liverpool online.
The university has produced ten Nobel Prize winners, from the fields of science, medicine and peace. The Nobel laureates include the physician Sir Ronald Ross, physicist Charles Barkla, physicist Martin Lewis Perl, physiologist Sir Charles Sherrington, physicist Sir James Chadwick, chemist Sir Robert Robinson, chemist Har Gobind Khorana, physiologist Rodney Porter, economist Ronald Coase and physicist Joseph Rotblat. Sir Ronald Ross was the first British Nobel laureate, in 1902. The university is associated with Professors Ronald Finn and Sir Cyril Clarke, who jointly won the Lasker-DeBakey Clinical Medical Research Award in 1980, and Sir David Weatherall, who won the Lasker-Koshland Special Achievement Award in Medical Science in 2010; these Lasker Awards are popularly known as America's Nobels. Over the 2013/2014 academic year, members of staff took part in numerous strikes after being offered a pay rise of 1%, which unions equated to a 13% pay cut since 2008. The strikes were supported by both the university's Guild of Students and the National Union of Students.
Some students at the university supported the strike. The university is based around a single urban campus a five-minute walk from Liverpool City Centre, at the top of Brownlow Hill and Mount Pleasant. Occupying 100 acres, it contains 192 non-residential buildings that house 69 lecture theatres, 114 teaching areas and research facilities. The main site is divided into three faculties, including Life Sciences. The Veterinary Teaching Hospital and Ness Botanical Gardens are based on the Wirral Peninsula. There was a marine biology research station at Port Erin on the Isle of Man until it closed in 2006. Fifty-one residential buildings, on or near the campus, provide 3,385 rooms for students on a catered or self-catering basis. The centrepiece of the campus remains the university's original red brick building, the Victoria Building. Opened in 1892, it has been restored as the Victoria Gallery and Museum, complete with a cafe and activities for school visits.
In 2011 the university made a commitment to invest £660m in the 'Student Experience', £250m of which will be spent on student accommodation. Announced so far have been two large on-campus halls of residence (the first of which, Vine Court, opened in September 2012), new Veterinary Science facilities and a £10m refurbishment of the Liverpool Guild of Students. New Central Teaching Laboratories for physics, earth sciences, chemistry an
General circulation model
A general circulation model (GCM) is a type of climate model. It employs a mathematical model of the general circulation of a planetary atmosphere or ocean. It uses the Navier–Stokes equations on a rotating sphere with thermodynamic terms for various energy sources. These equations are the basis for computer programs used to simulate the Earth's atmosphere or oceans. Atmospheric and oceanic GCMs are key components of global climate models, along with land-surface components. GCMs and global climate models are used for weather forecasting, understanding the climate and forecasting climate change. Versions designed for decade-to-century time scale climate applications were created by Syukuro Manabe and Kirk Bryan at the Geophysical Fluid Dynamics Laboratory in Princeton, New Jersey. These models are based on the integration of a variety of fluid dynamical and sometimes biological equations. The acronym GCM originally stood for General Circulation Model; a second meaning, Global Climate Model, later came into use. While these do not refer to the same thing, General Circulation Models are the tools used for modelling climate, hence the two terms are sometimes used interchangeably.
However, the term "global climate model" is ambiguous and may refer to an integrated framework that incorporates multiple components including a general circulation model, or may refer to the general class of climate models that use a variety of means to represent the climate mathematically. In 1956, Norman Phillips developed a mathematical model that could realistically depict monthly and seasonal patterns in the troposphere; it became the first successful climate model. Following Phillips's work, several groups began working to create GCMs; the first to combine both oceanic and atmospheric processes was developed in the late 1960s at the NOAA Geophysical Fluid Dynamics Laboratory. By the early 1980s, the United States' National Center for Atmospheric Research had developed the Community Atmosphere Model. In 1996, efforts began to model vegetation types. The Hadley Centre for Climate Prediction and Research's HadCM3 model coupled ocean and atmosphere elements. The role of gravity waves was added in the mid-1980s.
Gravity waves are required to simulate global scale circulations accurately. Atmospheric and oceanic GCMs can be coupled to form an atmosphere-ocean coupled general circulation model (AOGCM). With the addition of submodels such as a sea ice model or a model for evapotranspiration over land, AOGCMs become the basis for a full climate model. A recent trend in GCMs is to apply them as components of Earth system models, e.g. by coupling ice sheet models for the dynamics of the Greenland and Antarctic ice sheets, and one or more chemical transport models (CTMs) for species important to climate. Thus a carbon CTM may allow a GCM to better predict anthropogenic changes in carbon dioxide concentrations. In addition, this approach allows accounting for inter-system feedback: e.g. chemistry-climate models allow the possible effects of climate change on the ozone hole to be studied. Climate prediction uncertainties depend on uncertainties in chemical and social models. Significant uncertainties and unknowns remain regarding the future course of human population and technology.
Three-dimensional GCMs apply discrete equations for fluid motion and integrate these forward in time. They contain parameterisations for processes such as convection that occur on scales too small to be resolved directly. A simple general circulation model consists of a dynamic core that relates properties such as temperature to others such as pressure and velocity. Examples are programs that solve the primitive equations, given energy input and energy dissipation in the form of scale-dependent friction, so that atmospheric waves with the highest wavenumbers are most attenuated; such models may be used to study atmospheric processes, but are not suitable for climate projections. Atmospheric GCMs (AGCMs) model the atmosphere using imposed sea surface temperatures, and may include atmospheric chemistry. AGCMs consist of a dynamical core which integrates the equations of fluid motion for surface pressure, horizontal components of velocity in layers, temperature and water vapor in layers, and radiation (split into solar/shortwave and terrestrial/infrared/longwave), together with parameterizations for convection, land surface processes, albedo, hydrology and cloud cover. A GCM contains prognostic equations that are a function of time together with diagnostic equations that are evaluated from them for a specific time period.
As an example, pressure at any height can be diagnosed by applying the hydrostatic equation to the predicted surface pressure and the predicted values of temperature between the surface and the height of interest. Pressure is used to compute the pressure-gradient force in the time-dependent equation for the winds. Oceanic GCMs (OGCMs) may contain a sea ice model. For example, the standard resolution of HadOM3 is 1.25 degrees in latitude and longitude, with 20 vertical levels, leading to approximately 1,500,000 variables. AOGCMs combine the two submodels and remove the need to specify fluxes across the interface of the ocean surface. These models are the basis for model predictions of future climate, such as are discussed by the IPCC. AOGCMs internalise as many processes as possible, and they have been used to provide predictions at a regional scale. While the simpler models are susceptible to
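The hydrostatic diagnosis described at the start of this passage can be made concrete: given a predicted surface pressure and predicted layer temperatures, pressure aloft follows from integrating dp/dz = -pg/(RT) upward one layer at a time. A minimal sketch, with illustrative layer depths and temperatures:

```python
import math

G = 9.80665      # gravitational acceleration, m s^-2
R_DRY = 287.05   # specific gas constant for dry air, J kg^-1 K^-1

def pressure_at_heights(p_surface, temperatures, dz):
    """Diagnose pressure at successive heights from surface pressure and a
    layer-mean temperature profile, by integrating the hydrostatic
    equation dp/dz = -p g / (R T) one layer at a time."""
    p = p_surface
    profile = [p]
    for t in temperatures:
        # Exact solution over a layer at constant temperature T:
        # p(z + dz) = p(z) * exp(-g dz / (R T))
        p *= math.exp(-G * dz / (R_DRY * t))
        profile.append(p)
    return profile

# Example: 1013.25 hPa at the surface, four 500 m layers.
temps = [288.0, 285.0, 282.0, 279.0]   # layer-mean temperatures, K
profile = pressure_at_heights(101325.0, temps, dz=500.0)
```

This is the diagnostic side of the model: the values come entirely from prognostic fields (surface pressure, temperature) at a given time, with no time integration of their own.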
Natural Environment Research Council
The Natural Environment Research Council is a British Research Council that supports research and knowledge transfer activities in the environmental sciences. NERC began in 1965 when several environmental research organisations were brought under one umbrella organisation; when most research councils were re-organised in 1994, it gained new responsibilities, including Earth observation and science-based archaeology. Collaboration between research councils increased in 2002. Its leadership has included:

Sir Graham Sutton
Professor John Krebs, Baron Krebs – 1994–1999
Sir John Lawton – 1999–2005
Professor Alan Thorpe – 2005–2011
Dr Steven Wilson – 2011–2012
Professor Duncan Wingham – from 1 January 2012

The council's head office is at Polaris House in Swindon, alongside the other six Research Councils. NERC's research centres provide leadership to the UK environmental science community and play significant and influential roles in international science collaborations. It supports a number of collaborative centres of excellence and subject-based designated Environmental Data Centres for the storage and distribution of environmental data.
The Natural Environment Research Council delivers independent research, survey and knowledge transfer in the environmental sciences, to advance knowledge of planet Earth as a complex, interacting system. The council's work covers the full range of atmospheric, biological and aquatic sciences, from the deep oceans to the upper atmosphere, and from the geographical poles to the equator. NERC's mission is to gather and apply knowledge, create understanding and predict the behaviour of the natural environment and its resources, and to communicate all aspects of the council's work. The British Meteorological Office (the Met Office) is not part of NERC. The NERC Airborne Research Facility (ARF) collects and processes remotely sensed data for use by the scientific community. Data are collected from one of four Twin Otter research aircraft operated by the British Antarctic Survey, processed by a data analysis team at the Plymouth Marine Laboratory and archived at the National Earth Observation Data Centre. The NERC ARF provides radiometrically corrected hyperspectral data from the AISA Fenix and Owl instruments.
See also: Conservation biology; Conservation ethic; Conservation movement; David Carson; Ecology; Ecology movement; Environmentalism; Environmental movement; Environmental protection; Habitat conservation; List of environmental organisations; Natural environment; Natural capital; Natural resource; Renewable resource; Royal Research Ship; Sustainable development; Sustainability.
External links: Official website; British Antarctic Survey; British Geological Survey; Centre for Ecology and Hydrology; National Centre for Atmospheric Science; National Centre for Earth Observation; NERC Centre for Population Biology; National Oceanography Centre; Research Councils UK; ARF homepage; ARSF-DAN Wiki.
Met Office
The Met Office is the United Kingdom's national weather service. It is an executive agency and trading fund of the Department for Business, Energy and Industrial Strategy, led by Chief Executive Penelope Endersby, who took on the role in December 2018, the first woman to do so. The Met Office makes meteorological predictions across all timescales, from weather forecasts to climate change. The Met Office was established in 1854 as a small department within the Board of Trade under Vice Admiral Robert FitzRoy as a service to mariners. The loss of the passenger vessel Royal Charter, and 459 lives, off the coast of Anglesey in a violent storm in October 1859 led to the first gale warning service. FitzRoy established a network of 15 coastal stations from which visual gale warnings could be provided for ships at sea. The new electric telegraph enabled rapid dissemination of warnings and led to the development of an observational network which could be used to provide synoptic analysis. The Met Office started in 1861 to provide weather forecasts to newspapers.
FitzRoy requested the daily traces of the photo-barograph at Kew Observatory to assist in this task, and similar barographs, as well as instruments to continuously record other meteorological parameters, were provided to stations across the observing network. Publication of forecasts ceased in May 1866 after FitzRoy's death but recommenced in April 1879. Following the First World War, the Met Office became part of the Air Ministry in 1919, the weather observed from the top of Adastral House giving rise to the phrase "the weather on the Air Ministry roof". As a result of the need for weather information for aviation, the Met Office located many of its observation and data collection points on RAF airfields, which accounts for the large number of military airfields mentioned in weather reports today. In 1936 the Met Office split, with services to the Royal Navy being provided by the Navy's own forecasting services. It became an executive agency of the Ministry of Defence in April 1990, a quasi-governmental role, being required to act commercially.
Following a machinery of government change, the Met Office became part of the Department for Business, Innovation and Skills on 18 July 2011, and subsequently part of the Department for Business, Energy and Industrial Strategy following the merger of BIS and the Department of Energy and Climate Change on 14 July 2016. Although no longer part of the MOD, the Met Office maintains strong links with the military through its front-line offices at RAF and Army bases both in the UK and overseas and its involvement in the Joint Operations Meteorology and Oceanography Centre with the Royal Navy. The Mobile Met Unit is a unit of Met Office staff, who are RAF reservists, who accompany forward units in times of conflict, advising the armed forces, particularly the RAF, of the conditions for battle. In September 2003 the Met Office moved its headquarters from Bracknell in Berkshire to a purpose-built £80m structure at Exeter Business Park, near junction 29 of the M5 motorway. The new building was opened on 21 June 2004 – a few weeks short of the Met Office's 150th anniversary – by Robert May, Baron May of Oxford.
It has a worldwide presence, including a forecasting centre in Aberdeen and offices in Gibraltar and on the Falklands. Other outposts lodge in establishments such as the Joint Centre for Mesoscale Meteorology at the University of Reading in Berkshire and the Joint Centre for Hydro-Meteorological Research site at Wallingford in Oxfordshire, and there is a Met Office presence at Army and Air Force bases within the UK and abroad. Royal Navy weather forecasts are provided by naval officers, not Met Office personnel. The Shipping Forecast is produced by the Met Office and broadcast on BBC Radio 4, for those traversing the seas around the British Isles. The Met Office issues Severe Weather Warnings for the United Kingdom through the National Severe Weather Warning Service; these warn of weather events that may endanger people's lives. In March 2008, the system was improved and a new stage of warning was introduced, the 'Advisory'. In September 2015 the Met Office established a "name our storms" project; the aim is to provide a single authoritative naming system for the storms that affect the UK and Ireland by asking the public to suggest names.
The first storm was named on 10 November of that year. The main role of the Met Office is to produce forecast models by gathering information from weather satellites in space and observations on Earth, processing it with a variety of models based on a software package known as the Unified Model. The principal weather products for UK customers are 36-hour forecasts from the operational 1.5 km resolution UKV model covering the UK and surroundings, 48-hour forecasts from the 12 km resolution NAE model covering Europe and the North Atlantic, and 144-hour forecasts from the 25 km resolution global model. The Met Office's Global Model forecast has been in the top 3 for global weather forecast performance in independent verification to WMO standards. Products for other regions of the globe are sold to customers abroad, provided for MOD operations abroad or provided free to developing countries in Africa. If necessary, forecasters may make adjustments to the computer forecasts. Data is stored in the Met Office's own PP-format.
Formed in 2009, the Flood Forecasting Centre is a joint venture between the Environment Agency and the Met Office to provide flood risk guidance for Engl
Climatology
Climatology or climate science is the scientific study of climate, scientifically defined as weather conditions averaged over a period of time. This modern field of study is regarded as a branch of the atmospheric sciences and a subfield of physical geography, one of the Earth sciences. Climatology now includes aspects of biogeochemistry. Basic knowledge of climate can be used within shorter-term weather forecasting using analog techniques such as the El Niño–Southern Oscillation, the Madden–Julian oscillation, the North Atlantic oscillation, the Northern Annular Mode (also known as the Arctic oscillation), the Northern Pacific Index, the Pacific decadal oscillation and the Interdecadal Pacific Oscillation. Climate models are used for a variety of purposes, from study of the dynamics of the weather and climate system to projections of future climate. Weather is the condition of the atmosphere over a period of time, while climate concerns atmospheric conditions over an extended to indefinite period of time.
Chinese scientist Shen Kuo inferred that climates shifted over an enormous span of time, after observing petrified bamboos found underground near Yanzhou, a dry-climate area unsuitable for the growth of bamboo. Early climate researchers include Edmund Halley, who published a map of the trade winds in 1686 after a voyage to the southern hemisphere. Benjamin Franklin first mapped the course of the Gulf Stream for use in sending mail from the United States to Europe. Francis Galton invented the term anticyclone. Helmut Landsberg fostered the use of statistical analysis in climatology, which led to its evolution into a physical science. The Greeks began the formal study of climate; the first distinct climate treatises were the works of Hippocrates, who wrote Airs, Waters and Places around 400 B.C.E. Climatology is approached in various ways, such as paleoclimatology, which seeks to reconstruct past climates by examining records such as ice cores and tree rings. Paleotempestology uses these same records to help determine hurricane frequency over millennia.
The study of contemporary climates incorporates meteorological data accumulated over many years, such as records of rainfall and atmospheric composition. Knowledge of the atmosphere and its dynamics is embodied in models, either statistical or mathematical, which help by integrating different observations and testing how they fit together. Modeling is used for understanding past and potential future climates. Historical climatology is the study of climate as related to human history and thus focuses only on the last few thousand years. Climate research is made difficult by the large scale, long time periods and complex processes which govern climate. Climate is governed by physical laws that can be expressed as differential equations; these equations are coupled and nonlinear, so that approximate solutions are obtained by using numerical methods to create global climate models. Climate is sometimes modeled as a stochastic process, but this is accepted as an approximation to processes that are otherwise too complicated to analyze. Scientists use climate indices based on several climate patterns in their attempt to characterize and understand the various climate mechanisms that culminate in our daily weather.
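The idea of obtaining approximate solutions by numerical methods can be seen in miniature in a zero-dimensional energy-balance model, which steps the global mean temperature forward in time until absorbed solar and emitted infrared radiation balance. This is an illustrative toy, not a global climate model; the effective emissivity below is a crude tuning stand-in for the greenhouse effect:

```python
# Zero-dimensional energy-balance model:
#   C dT/dt = S0 (1 - albedo) / 4 - eps * sigma * T^4
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0              # solar constant, W m^-2
ALBEDO = 0.3             # planetary albedo
EPS = 0.612              # effective emissivity (assumed tuning value)
C = 4.0e8                # heat capacity per unit area, J m^-2 K^-1 (rough ocean mixed layer)

def step(T, dt):
    """One forward-Euler step of the energy balance equation."""
    absorbed = S0 * (1.0 - ALBEDO) / 4.0
    emitted = EPS * SIGMA * T**4
    return T + dt * (absorbed - emitted) / C

# Integrate forward from a cold start until near equilibrium.
T = 255.0
dt = 86400.0                 # one day, in seconds
for _ in range(365 * 50):    # 50 years
    T = step(T, dt)
```

Starting near 255 K, the integration settles at roughly 288 K, close to Earth's observed global mean surface temperature; real climate models solve coupled nonlinear equations of this kind for millions of variables rather than one.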
Much in the way the Dow Jones Industrial Average, based on the stock prices of 30 companies, is used to represent the fluctuations in the stock market as a whole, climate indices are used to represent the essential elements of climate. Climate indices are devised with the twin objectives of simplicity and completeness, and each index represents the status and timing of the climate factor it represents. By their nature, indices are simple, and combine many details into a generalized, overall description of the atmosphere or ocean which can be used to characterize the factors which impact the global climate system. El Niño–Southern Oscillation (ENSO) is a global coupled ocean-atmosphere phenomenon. The Pacific Ocean signatures, El Niño and La Niña, are important temperature fluctuations in surface waters of the tropical eastern Pacific Ocean. The name El Niño, from the Spanish for "the little boy", refers to the Christ child, because the phenomenon is noticed around Christmas time in the Pacific Ocean off the west coast of South America.
La Niña means "the little girl". Their effect on climate in the subtropics and the tropics is profound. The atmospheric signature, the Southern Oscillation, reflects the monthly or seasonal fluctuations in the air pressure difference between Tahiti and Darwin. The most recent occurrence of El Niño started in September 2006 and lasted until early 2007. ENSO is a set of interacting parts of a single global system of coupled ocean-atmosphere climate fluctuations that come about as a consequence of oceanic and atmospheric circulation. ENSO is the most prominent known source of inter-annual variability in weather and climate around the world. The cycle occurs every two to seven years, with El Niño lasting nine months to two years within the longer-term cycle, though not all areas globally are affected. ENSO has signatures in the Pacific and Indian Oceans. In the Pacific, during major warm events, El Niño warming extends over much of the tropical Pacific and becomes linked to the Southern Oscillation intensity. While ENSO events are in phase between the Pacific and Indian Ocean
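The Tahiti–Darwin pressure difference described above is commonly summarized as a Southern Oscillation Index: a standardized anomaly of the monthly sea-level pressure difference between the two stations. The sketch below uses synthetic pressures for illustration; operational SOI definitions differ in their normalization details:

```python
import numpy as np

# Synthetic monthly mean sea-level pressures (hPa) for ten years.
rng = np.random.default_rng(2)
tahiti = 1013.0 + rng.normal(0, 2.0, size=120)
darwin = 1010.0 + rng.normal(0, 2.0, size=120)

# SOI-style index: standardized anomaly of the pressure difference.
diff = tahiti - darwin
soi = (diff - diff.mean()) / diff.std()

# Sustained negative values (lower pressure at Tahiti relative to Darwin)
# correspond to El Nino-like conditions; positive values to La Nina.
```

By construction the index has zero mean and unit variance over the reference period, which is what makes values from different eras comparable.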