Artificial intelligence in healthcare
Artificial intelligence in healthcare is the use of complex algorithms and software to emulate human cognition in the analysis of complicated medical data. Specifically, AI is the ability of computer algorithms to approximate conclusions without direct human input. What distinguishes AI technology from traditional technologies in health care is the ability to gather information, process it, and give a well-defined output to the end-user. AI does this through machine learning algorithms, which can create their own logic. To reduce the margin of error, AI algorithms need to be tested repeatedly. AI algorithms behave differently from humans in two ways. First, algorithms are literal: once a goal is set, an algorithm cannot adjust itself and understands only what it has been told explicitly. Second, algorithms are black boxes: they can predict outcomes with great precision, but they cannot explain the reasoning behind those predictions. The primary aim of health-related AI applications is to analyze relationships between prevention or treatment techniques and patient outcomes. AI programs have been developed and applied to practices such as diagnosis processes, treatment protocol development, drug development, personalized medicine, and patient monitoring and care.
Medical institutions such as the Mayo Clinic, Memorial Sloan Kettering Cancer Center, Massachusetts General Hospital, and the National Health Service have developed AI algorithms for their departments. Large technology companies such as IBM and Google, as well as startups such as Welltok and Ayasdi, have also developed AI algorithms for healthcare. Additionally, hospitals are looking to AI solutions to support operational initiatives that increase cost savings, improve patient satisfaction, and satisfy their staffing and workforce needs. Companies like Hospital IQ are developing predictive analytics solutions that help healthcare leaders improve business operations by increasing utilization, decreasing patient boarding, reducing length of stay, and optimizing staffing levels. Research in the 1960s and 1970s produced the first problem-solving program, or expert system, known as Dendral. While it was designed for applications in organic chemistry, it provided the basis for a subsequent system, MYCIN, considered one of the most significant early uses of artificial intelligence in medicine.
MYCIN and other systems such as INTERNIST-1 and CASNET did not achieve routine use by practitioners, however. The 1980s and 1990s brought the proliferation of the microcomputer and new levels of network connectivity. During this time, researchers and developers recognized that AI systems in healthcare must be designed to accommodate the absence of perfect data and to build on the expertise of physicians. Approaches involving fuzzy set theory, Bayesian networks, and artificial neural networks have been applied to intelligent computing systems in healthcare. Medical and technological advancements occurring over this half-century period that have enabled the growth of healthcare-related applications of AI include: improvements in computing power, resulting in faster data collection and processing; increased volume and availability of health-related data from personal and healthcare-related devices; growth of genomic sequencing databases; widespread implementation of electronic health record systems; improvements in natural language processing and computer vision, enabling machines to replicate human perceptual processes; and enhanced precision of robot-assisted surgery. Various specialties in medicine have shown an increase in research regarding AI.
The specialty that has gained the greatest attention is radiology. The ability to interpret imaging results may aid clinicians in detecting a minute change in an image that a clinician might accidentally miss. A study at Stanford created an algorithm that could detect pneumonia in chest X-rays at a level comparable to radiologists, and the Radiological Society of North America has devoted a large part of its conference schedule to the use of AI in imaging. The emergence of AI technology in radiology is perceived as a threat by some specialists, as the technology can perform certain tasks better than human specialists, changing the role radiologists currently have. Recent advances have suggested the use of AI to describe and evaluate the outcome of maxillofacial surgery and the assessment of cleft palate therapy in regard to facial attractiveness or age appearance. The growth of telemedicine has also shown the rise of possible AI applications: monitoring patients with AI may allow for the communication of information to physicians if possible disease activity has occurred.
A wearable device may allow for constant monitoring of a patient and the ability to notice changes that may be less distinguishable by humans. Electronic health records (EHRs) are crucial to the digitalization and information spread of the healthcare industry; however, logging all of this data comes with its own problems, like cognitive overload and burnout for users. EHR developers are now automating much of the process and starting to use natural language processing tools to improve it. One study conducted by the Centerstone Research Institute found that predictive modeling of EHR data achieved 70–72% accuracy in predicting individualized treatment response at baseline; in other words, an AI tool that scans EHR data can fairly accurately predict the course of disease in a person. The subsequent trend of large health companies merging with other health companies allows for greater health data accessibility, and greater health data may allow for more implementation of AI algorithms.
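As a rough illustration of the kind of predictive modeling described above, the following sketch trains a simple classifier on synthetic, tabular EHR-like features. The feature set, model choice, and data here are illustrative assumptions, not the method of the Centerstone study.

```python
# A minimal sketch of EHR-based predictive modeling on synthetic data.
# Real studies would use curated clinical variables and rigorous validation.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical baseline features (e.g., age, prior visits, a lab value,
# comorbidity count), standardized; purely illustrative.
n = 1000
X = rng.normal(size=(n, 4))

# Synthetic treatment-response labels loosely correlated with the features.
y = (X @ np.array([0.8, -0.5, 0.3, 0.6]) + rng.normal(scale=1.0, size=n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print(f"Held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.2f}")
```

On data like this, held-out accuracy in the 70% range is typical, which gives a feel for what a figure like the reported 70–72% means in practice.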
A large part of the industry focus on implementation of AI in the healthcare sector is in clinical decision support systems. As the amount of available data increases, AI decision support systems become more capable.
Georgia Institute of Technology
The Georgia Institute of Technology, commonly referred to as Georgia Tech, is a public research university and institute of technology in Atlanta, Georgia. It has satellite campuses in Savannah, Georgia. The school was founded in 1885 as the Georgia School of Technology as part of Reconstruction plans to build an industrial economy in the post-Civil War Southern United States. It initially offered only a degree in mechanical engineering; by 1901, its curriculum had expanded to include electrical and chemical engineering. In 1948, the school changed its name to reflect its evolution from a trade school into a larger and more capable technical institute and research university. Today, Georgia Tech is organized into six colleges and contains about 31 departments and units, with an emphasis on science and technology. It is well recognized for its degree programs in engineering, business administration, the sciences, and design. Georgia Tech is ranked 8th among all public national universities in the United States, 7th in the Best Engineering Schools ranking, and 35th among all colleges and universities in the United States by U.S. News & World Report, and 34th among global universities by Times Higher Education. Georgia Tech has been ranked as the "smartest" public college in America.
Student athletics, both organized and intramural, are a part of student and alumni life. The school's intercollegiate competitive sports teams, the four-time football national champion Yellow Jackets, and the nationally recognized fight song "Ramblin' Wreck from Georgia Tech" have helped keep Georgia Tech in the national spotlight. Georgia Tech fields eight men's and seven women's teams that compete in NCAA Division I athletics and the Football Bowl Subdivision. Georgia Tech is a member of the Coastal Division of the Atlantic Coast Conference.
The idea of a technology school in Georgia was introduced in 1865 during the Reconstruction period. Two former Confederate officers, Major John Fletcher Hanson and Nathaniel Edwin Harris, who had become prominent citizens in the town of Macon, Georgia after the Civil War, believed that the South needed to improve its technology to compete with the industrial revolution occurring throughout the North.
However, because the American South of that era was populated chiefly by agricultural workers and few technical developments were occurring there, a technology school was needed. In 1882, the Georgia State Legislature authorized a committee, led by Harris, to visit the Northeast to see firsthand how technology schools worked. The committee members were impressed by the polytechnic educational models developed at the Massachusetts Institute of Technology and the Worcester County Free Institute of Industrial Science. The committee recommended adapting the Worcester model, which stressed a combination of "theory and practice", the "practice" component including student employment and production of consumer items to generate revenue for the school. On October 13, 1885, Georgia Governor Henry D. McDaniel signed the bill to create and fund the new school. In 1887, Atlanta pioneer Richard Peters donated to the state four acres of the site of a failed garden suburb called Peters Park; the site was bounded on the south by North Avenue and on the west by Cherry Street.
He sold five adjoining acres of land to the state for US$10,000. This land was near Atlanta's northern city limits at the time of the school's founding, although the city has since expanded several miles beyond it. A historical marker on the large hill in Central Campus notes that the site occupied by the school's first buildings once held fortifications to protect Atlanta during the Atlanta Campaign of the American Civil War; the surrender of the city took place on the southwestern boundary of the modern Georgia Tech campus in 1864. The Georgia School of Technology opened in the fall of 1888 with two buildings: one held classrooms to teach students, while the other held a shop designed for students to produce goods to sell and fund the school. The two buildings were equal in size to show the importance of teaching both the mind and the hands, though at the time there was some disagreement as to whether the machine shop should have been used to turn a profit. On October 20, 1905, U.S. President Theodore Roosevelt visited Georgia Tech.
On the steps of Tech Tower, Roosevelt delivered a speech about the importance of technological education and shook hands with every student. Georgia Tech's Evening School of Commerce began holding classes in 1912. The evening school admitted its first female student in 1917, although the state legislature did not authorize attendance by women until 1920. Annie T. Wise became the first female graduate in 1919 and, the following year, became Georgia Tech's first female faculty member. In 1931, the Board of Regents transferred control of the Evening School of Commerce to the University of Georgia and moved the civil and electrical engineering courses at UGA to Tech. Tech replaced the commerce school with what became the College of Business; the commerce school would later split from UGA and become Georgia State University. In 1934, the Engineering Experiment Station was founded by W. Harry Vaughan with an initial budget of $5,000 and 13 part-time faculty. Founded as the Georgia School of Technology, Georgia Tech assumed its present name in 1948.
Stanford University
Leland Stanford Junior University, commonly known as Stanford University, is a private research university in Stanford, California. Stanford is known for its academic strength, its proximity to Silicon Valley, and its ranking as one of the world's top universities. The university was founded in 1885 by Leland and Jane Stanford in memory of their only child, Leland Stanford Jr., who had died of typhoid fever at age 15 the previous year. Stanford was a U.S. Senator and former Governor of California who made his fortune as a railroad tycoon. The school admitted its first students on October 1, 1891, as a coeducational and non-denominational institution. Stanford University struggled financially after the death of Leland Stanford in 1893 and again after much of the campus was damaged by the 1906 San Francisco earthquake. Following World War II, Provost Frederick Terman supported faculty and graduates' entrepreneurialism to build self-sufficient local industry in what would become known as Silicon Valley. The university is one of the top fundraising institutions in the country and became the first school to raise more than a billion dollars in a year.
The university is organized around three traditional schools consisting of 40 academic departments at the undergraduate and graduate level and four professional schools that focus on graduate programs in Law, Medicine, Education, and Business. Stanford's undergraduate program is the most selective in the United States by acceptance rate. Students compete in 36 varsity sports, and the university is one of two private institutions in the Division I FBS Pac-12 Conference; it has won more NCAA team championships than any other university. Stanford athletes have won 512 individual championships, and Stanford has won the NACDA Directors' Cup for 24 consecutive years, beginning in 1994–1995. In addition, Stanford students and alumni have won 270 Olympic medals, including 139 gold medals. As of October 2018, 83 Nobel laureates, 27 Turing Award laureates, and 8 Fields Medalists have been affiliated with Stanford as students, faculty, or staff. In addition, Stanford University is noted for its entrepreneurship and is one of the most successful universities in attracting funding for start-ups.
Stanford alumni have founded a large number of companies, which combined produce more than $2.7 trillion in annual revenue and had created 5.4 million jobs as of 2011, figures roughly equivalent to the 10th largest economy in the world. Stanford is the alma mater of 30 living billionaires and 17 astronauts, and is one of the leading producers of members of the United States Congress. Stanford University was founded in 1885 by Leland and Jane Stanford and dedicated to Leland Stanford Jr., their only child. The institution opened in 1891 on Stanford's previous Palo Alto farm. Despite being impacted by earthquakes in both 1906 and 1989, the campus was rebuilt each time. In 1919, the Hoover Institution on War, Revolution and Peace was started by Herbert Hoover to preserve artifacts related to World War I. The Stanford Medical Center, completed in 1959, is a teaching hospital with over 800 beds. The SLAC National Accelerator Laboratory, established in 1962, performs research in particle physics. Jane and Leland Stanford modeled their university after the great eastern universities, most notably Cornell University and Harvard University.
When Stanford opened in 1891, it was called the "Cornell of the West", largely because many of its faculty members, including its first president, David Starr Jordan, were former Cornell affiliates. Both Cornell and Stanford were among the first universities to make higher education accessible and open to women as well as to men. Cornell is credited as one of the first American universities to adopt this radical departure from traditional education, and Stanford became an early adopter as well. Most of Stanford University sits on a campus that is one of the largest in the United States. It is located on the San Francisco Peninsula, in the northwest part of the Santa Clara Valley, 37 miles southeast of San Francisco and 20 miles northwest of San Jose. In 2008, 60% of this land remained undeveloped. Stanford's main campus includes a census-designated place within unincorporated Santa Clara County, although some of the university land is within the city limits of Palo Alto. The campus also includes much land in unincorporated San Mateo County, as well as within the city limits of Menlo Park and Portola Valley.
The academic central campus is adjacent to Palo Alto, bounded by El Camino Real, Stanford Avenue, Junipero Serra Boulevard, and Sand Hill Road. The United States Postal Service has assigned it two ZIP codes: 94305 for campus mail and 94309 for P.O. box mail. It lies within area code 650. Stanford operates or intends to operate in various locations outside of its central campus. On the founding grant: Jasper Ridge Biological Preserve is a 1,200-acre natural reserve south of the central campus owned by the university and used by wildlife biologists for research. SLAC National Accelerator Laboratory is a facility west of the central campus operated by the university for the Department of Energy; it contains the longest linear particle accelerator in the world, 2 miles long, on 426 acres of land. Golf course and a seasonal lake: the university has its own golf course and a seasonal lake, both home to the vulnerable California tiger salamander; as of 2012, the seasonal lake, Lake Lagunita, was often dry.
Computer mouse
A computer mouse is a hand-held pointing device that detects two-dimensional motion relative to a surface. This motion is translated into the motion of a pointer on a display, which allows smooth control of the graphical user interface. The first public demonstration of a mouse controlling a computer system was in 1968. Originally wired to a computer, many modern mice are cordless, relying on short-range radio communication with the connected system. Mice originally used a ball rolling on a surface to detect motion, but modern mice often have optical sensors that have no moving parts. In addition to moving a cursor, computer mice have one or more buttons to allow operations such as the selection of a menu item on a display. Mice also feature other elements, such as touch surfaces and "wheels", which enable additional control and dimensional input. The earliest known publication of the term mouse referring to a computer pointing device is Bill English's July 1965 publication "Computer-Aided Display Control"; the name originated from the device's resemblance in shape and size to a mouse, the rodent, with the cord resembling its tail.
The plural for the small rodent is always "mice" in modern usage. The plural of a computer mouse is either "mouses" or "mice" according to most dictionaries, with "mice" being more common; the first recorded plural usage is "mice". The term "computer mouses" may be used informally in some cases. Although the plural of mouse is mice, the two words have undergone a differentiation through usage. The trackball, a related pointing device, was invented in 1946 by Ralph Benjamin as part of a post-World War II-era fire-control radar plotting system called the Comprehensive Display System. Benjamin was then working for the British Royal Navy Scientific Service. Benjamin's project used analog computers to calculate the future position of target aircraft based on several initial input points provided by a user with a joystick. Benjamin felt that a more elegant input device was needed and invented what he called a "roller ball" for this purpose. The device was patented in 1947, but only a prototype using a metal ball rolling on two rubber-coated wheels was ever built, and the device was kept as a military secret.
Another early trackball was built by British electrical engineer Kenyon Taylor in collaboration with Tom Cranston and Fred Longstaff. Taylor was part of the original Ferranti Canada, working on the Royal Canadian Navy's DATAR system in 1952. DATAR was similar in concept to Benjamin's display. Its trackball used four disks to pick up motion, two each for the X and Y directions, while several rollers provided mechanical support. When the ball was rolled, the pickup discs spun, and contacts on their outer rims made periodic contact with wires, producing pulses of output with each movement of the ball. By counting the pulses, the physical movement of the ball could be determined. A digital computer calculated the tracks and sent the resulting data to other ships in a task force using pulse-code modulation radio signals. This trackball used a standard Canadian five-pin bowling ball. It was not patented. Douglas Engelbart of the Stanford Research Institute has been credited in published books by Thierry Bardini, Paul Ceruzzi, Howard Rheingold, and several others as the inventor of the computer mouse.
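As a rough modern illustration of the pulse-counting scheme the DATAR trackball used, the sketch below decodes a stream of two-channel contact readings into a signed position count for one axis. The two-bit state table is a standard quadrature-decoding idiom and an assumption here, not the original hardware design.

```python
# A sketch of pulse counting: two pickup channels (A, B) per axis, read in
# quadrature so that both the amount and the direction of ball rotation can
# be recovered. Illustrative only; not the original DATAR circuit.

# Valid quadrature transitions: (previous A,B) -> (current A,B) gives +1 or -1.
TRANSITIONS = {
    ((0, 0), (0, 1)): +1, ((0, 1), (1, 1)): +1,
    ((1, 1), (1, 0)): +1, ((1, 0), (0, 0)): +1,
    ((0, 0), (1, 0)): -1, ((1, 0), (1, 1)): -1,
    ((1, 1), (0, 1)): -1, ((0, 1), (0, 0)): -1,
}

def count_pulses(samples):
    """Accumulate signed counts from a stream of (A, B) contact readings."""
    position = 0
    prev = samples[0]
    for cur in samples[1:]:
        position += TRANSITIONS.get((prev, cur), 0)  # ignore no-change/invalid states
        prev = cur
    return position

# One full forward cycle of the pickup discs yields four counts on this axis:
forward = [(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]
print(count_pulses(forward))  # -> 4
```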
Engelbart was recognized as such in various obituary titles after his death in July 2013. By 1963, Engelbart had established a research lab at SRI, the Augmentation Research Center, to pursue his objective of developing both hardware and software computer technology to "augment" human intelligence. That November, while attending a conference on computer graphics in Reno, Engelbart began to ponder how to adapt the underlying principles of the planimeter to X-Y coordinate input. On November 14, 1963, he first recorded his thoughts in his personal notebook about something he called a "bug", which in a "3-point" form could have a "drop point and 2 orthogonal wheels". He wrote that the "bug" would be "easier" and "more natural" to use, and that unlike a stylus, it would stay still when let go, which meant it would be "much better for coordination with the keyboard". In 1964, Bill English joined ARC, where he helped Engelbart build the first mouse prototype. They christened the device the mouse because early models had a cord attached to the rear part of the device, which looked like a tail and in turn made the device resemble the common mouse.
As noted above, this "mouse" was first mentioned in print in a July 1965 report, on which English was the lead author. On 9 December 1968, Engelbart publicly demonstrated the mouse at what would come to be known as The Mother of All Demos. Engelbart never received any royalties for it, as his employer SRI held the patent, which expired before the mouse became used in personal computers. In any event, the invention of the mouse was just a small part of Engelbart's much larger project of augmenting human intellect. Several other experimental pointing-devices developed for Engelbart's oN-Line System exploited different body movements – for example, head-mounted devices attached to the chin or nose – but the mouse won out because of its speed and convenience; the first mouse, a bulky device used two potentiometers perpendicular to each other and connected to wheels: the rotation of each wheel translated into motion along one axis. At the time of the "Mother of All Demos", Engelbart's group had been using their second generation, 3-button mouse for about a year.
On October 2, 1968, a mouse device named Rollkugel (German for "rolling ball") was developed by the German company Telefunken.
Vannevar Bush
Vannevar Bush was an American engineer and science administrator who, during World War II, headed the U.S. Office of Scientific Research and Development (OSRD), through which almost all wartime military R&D was carried out, including important developments in radar and the initiation and early administration of the Manhattan Project. He emphasized the importance of scientific research to national security and economic well-being, and was chiefly responsible for the movement that led to the creation of the National Science Foundation. Bush joined the Department of Electrical Engineering at the Massachusetts Institute of Technology in 1919 and founded the company now known as Raytheon in 1922. Bush became vice president of MIT and dean of the MIT School of Engineering in 1932, and president of the Carnegie Institution of Washington in 1938. During his career, Bush patented a string of his own inventions; he is known for his engineering work on analog computers and for the memex. Starting in 1927, Bush constructed a differential analyzer, an analog computer with some digital components that could solve differential equations with as many as 18 independent variables.
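As a loose digital analogue of what the differential analyzer did mechanically, the sketch below solves a simple differential equation by repeated step-by-step integration. The chosen equation, method, and step size are illustrative assumptions, not a model of Bush's machine.

```python
# The differential analyzer solved equations by chaining mechanical
# wheel-and-disc integrators. Here we mimic that with two numerical
# integration steps for the harmonic oscillator y'' = -y.

def integrate(y0, v0, dt=0.001, steps=10_000):
    y, v = y0, v0
    for _ in range(steps):
        v += -y * dt   # first "integrator": accumulate y'' into y'
        y += v * dt    # second "integrator": accumulate y' into y
    return y

# Starting at y=1, y'=0 and integrating for ~10 time units:
print(f"y(10) ~= {integrate(1.0, 0.0):.3f}")  # close to cos(10) ~= -0.839
```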
An offshoot of the work at MIT by Bush and others was the beginning of digital circuit design theory. The memex, which he began developing in the 1930s, was a hypothetical adjustable microfilm viewer with a structure analogous to that of hypertext. The memex and Bush's 1945 essay "As We May Think" influenced generations of computer scientists, who drew inspiration from his vision of the future. Bush was appointed to the National Advisory Committee for Aeronautics in 1938 and soon became its chairman. As chairman of the National Defense Research Committee (NDRC) and then director of OSRD, Bush coordinated the activities of some six thousand leading American scientists in the application of science to warfare. Bush was a well-known policymaker and public intellectual during World War II, when he was, in effect, the first presidential science advisor. As head of NDRC and OSRD, he initiated the Manhattan Project and ensured that it received top priority from the highest levels of government. In Science, The Endless Frontier, his 1945 report to the President of the United States, Bush called for an expansion of government support for science, and he pressed for the creation of the National Science Foundation.
Vannevar Bush was born in Everett, Massachusetts, on March 11, 1890, the third child and only son of Perry Bush, the local Universalist pastor, and his wife Emma Linwood. He had two older sisters, Edith and Reba. He was named after John Vannevar, an old friend of the family who had attended Tufts College with Perry. The family moved to Chelsea, Massachusetts, in 1892, and Bush graduated from Chelsea High School in 1909. He attended Tufts, like his father before him. A popular student, he was vice president of his sophomore class and president of his junior class. During his senior year, he managed the football team. He became a member of the Alpha Tau Omega fraternity and dated Phoebe Clara Davis, who also came from Chelsea. Tufts allowed students to gain a master's degree in four years simultaneously with a bachelor's degree. For his master's thesis, Bush invented and patented a "profile tracer", a mapping device for assisting surveyors. It had two bicycle wheels and a pen that plotted the terrain over which it traveled; it was the first of a string of inventions.
On graduation in 1913 he received both bachelor of science and master of science degrees. After graduation, Bush worked at General Electric in New York for $14 a week. As a "test man", his job was to assess equipment to ensure that it was safe. He transferred to GE's plant in Pittsfield, Massachusetts, to work on high-voltage transformers, but after a fire broke out at the plant, he and the other test men were suspended, and he returned to Tufts in October 1914 to teach mathematics. He spent the 1915 summer break working at the Brooklyn Navy Yard as an electrical inspector. Bush was awarded a $1,500 scholarship to study at Clark University as a doctoral student of Arthur Gordon Webster, but Webster wanted Bush to study acoustics, and Bush preferred to quit rather than study a subject that did not interest him. Bush subsequently enrolled in the Massachusetts Institute of Technology electrical engineering program. Spurred by the need for enough financial security to marry, he submitted his thesis, entitled Oscillating-Current Circuits: An Extension of the Theory of Generalized Angular Velocities, with Applications to the Coupled Circuit and the Artificial Transmission Line, in April 1916.
His adviser, Arthur Edwin Kennelly, tried to demand more work from him, but Bush refused, and Kennelly was overruled by the department chairman. Bush married Phoebe in August 1916; they had two sons, Richard Davis Bush and John Hathaway Bush. Bush accepted a job with Tufts, where he became involved with the American Radio and Research Corporation (AMRAD), which began broadcasting music from the campus on March 8, 1916. The station owner, Harold Power, hired him to run the company's laboratory, at a salary greater than that which Bush drew from Tufts. In 1917, following the United States' entry into World War I, he went to work with the National Research Council, attempting to develop a means of detecting submarines by measuring the disturbance in the Earth's magnetic field. His device worked. Bush left Tufts in 1919, although he remained employed by AMRAD, and joined the Department of Electrical Engineering at the Massachusetts Institute of Technology.
Machine
A machine is a mechanical structure that uses power to apply forces and control movement to perform an intended action. Machines can be driven by animals and people, by natural forces such as wind and water, or by chemical, thermal, or electrical power, and they include a system of mechanisms that shape the actuator input to achieve a specific application of output forces and movement. They can also include computers and sensors that monitor performance and plan movement; such assemblies are called mechanical systems. Renaissance natural philosophers identified six simple machines, the elementary devices that put a load into motion, and calculated the ratio of output force to input force, known today as mechanical advantage. Modern machines are complex systems that consist of structural elements and control components and include interfaces for convenient use. Examples include a wide range of vehicles, such as automobiles and airplanes; appliances in the home and office, including computers; building air-handling and water-handling systems; as well as farm machinery, machine tools, and factory automation systems and robots.
The English word machine comes through Middle French from Latin machina, which in turn derives from the Greek mēchanē. The word mechanical comes from the same Greek roots. A wider meaning of "fabric, structure" is found in classical Latin, but not in Greek usage; this meaning is found in late medieval French and was adopted from the French into English in the mid-16th century. In the 17th century, the word could also mean a scheme or plot, a meaning now expressed by the derived word machination. The modern meaning develops out of specialized application of the term to stage engines used in theater and to military siege engines, both in the late 16th and early 17th centuries. The OED traces the formal, modern meaning to John Harris' Lexicon Technicum, which has: Machine, or Engine, in Mechanicks, is whatsoever hath Force sufficient either to raise or stop the Motion of a Body... Simple Machines are reckoned to be Six in Number, viz. the Ballance, Pulley, Wheel and Screw... Compound Machines, or Engines, are innumerable.
The word engine, used as a synonym both by Harris and in later language, derives from Latin ingenium, "ingenuity, an invention". The hand axe, made by chipping flint to form a wedge, in the hands of a human transforms the force and movement of the tool into a transverse splitting force and movement of the workpiece. The idea of a simple machine originated with the Greek philosopher Archimedes around the 3rd century BC, who studied the Archimedean simple machines: lever, pulley, and screw. Archimedes discovered the principle of mechanical advantage in the lever. Later Greek philosophers defined the classic five simple machines and were able to calculate their mechanical advantage. Heron of Alexandria, in his work Mechanics, lists five mechanisms that can "set a load in motion". However, the Greeks' understanding was limited to statics and did not include dynamics or the concept of work. During the Renaissance the dynamics of the Mechanical Powers, as the simple machines were called, began to be studied from the standpoint of how much useful work they could perform, leading to the new concept of mechanical work.
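Archimedes' lever principle can be written compactly in modern notation; the equations below are the standard textbook formulation rather than anything stated in the ancient sources:

```latex
% Law of the lever: input and output moments about the fulcrum balance,
% so the mechanical advantage is the ratio of the lever arms.
F_{\text{in}}\, a = F_{\text{out}}\, b
\qquad\Longrightarrow\qquad
\mathrm{MA} = \frac{F_{\text{out}}}{F_{\text{in}}} = \frac{a}{b}
```

Here a and b are the distances from the fulcrum to the points where the input and output forces act.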
In 1586, the Flemish engineer Simon Stevin derived the mechanical advantage of the inclined plane, and it was included with the other simple machines. The complete dynamic theory of simple machines was worked out by the Italian scientist Galileo Galilei in 1600 in Le Meccaniche. He was the first to understand that simple machines do not create energy; they only transform it. The classic rules of sliding friction in machines were discovered by Leonardo da Vinci but remained unpublished in his notebooks; they were rediscovered by Guillaume Amontons and further developed by Charles-Augustin de Coulomb. James Watt patented his parallel motion linkage in 1782, which made the double-acting steam engine practical. The Boulton and Watt steam engine and later designs powered steam locomotives, steam ships, and factories. The Industrial Revolution was a period from 1750 to 1850 in which changes in agriculture, mining and technology had a profound effect on the social and cultural conditions of the times; it began in the United Kingdom and subsequently spread throughout Western Europe, North America and the rest of the world.
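Stevin's result, stated in modern notation (again a textbook formulation, assuming a frictionless inclined plane of length L, height h, and slope angle theta):

```latex
% Ideal mechanical advantage of a frictionless inclined plane
\mathrm{MA} = \frac{F_{\text{out}}}{F_{\text{in}}} = \frac{L}{h} = \frac{1}{\sin\theta}
```

This is consistent with Galileo's later insight that an ideal machine conserves rather than creates energy: the input force acts through the longer distance L while the load rises only through h, so F_in L = F_out h.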
Starting in the later part of the 18th century, there began a transition in parts of Great Britain's manual-labour and draft-animal-based economy towards machine-based manufacturing. It started with the mechanisation of the textile industries, the development of iron-making techniques, and the increased use of refined coal. The idea that a machine can be decomposed into simple movable elements led Archimedes to define the lever, pulley, and screw as simple machines. By the time of the Renaissance this list had increased to include the wheel and axle, the wedge, and the inclined plane. The modern approach to characterizing machines focuses on the components that allow movement, known as joints. Wedge: Perhaps the first example of a device designed to manage power is the hand axe (see also biface and Olorgesailie). A hand axe is made by chipping stone, such as flint, to form a bifacial edge, or wedge. A wedge is a simple machine that transforms lateral force and movement of the tool into a transverse splitting force and movement of the workpiece.
Clean coal technology
Clean coal technology is a collection of technologies being developed in attempts to lessen the negative environmental impact of coal energy generation and to mitigate worldwide climate change. When coal is used as a fuel source, the gaseous emissions generated by the thermal decomposition of the coal include sulfur dioxide, nitrogen oxides, and other chemical byproducts that vary depending on the type of coal being used. These emissions have been established to have a negative impact on the environment and human health, contributing to acid rain, lung cancer, and cardiovascular disease. As a result, clean coal technologies are being developed to remove or reduce pollutant emissions to the atmosphere. Some of the techniques that would be used to accomplish this include chemically washing minerals and impurities from the coal; improved technology for treating flue gases to remove pollutants to stringent levels and at higher efficiency; carbon capture and storage technologies to capture the carbon dioxide from the flue gas; and dewatering lower-rank coals to improve the calorific value, and thus the efficiency of the conversion into electricity.
Concerns exist regarding the economic viability of these technologies, the timeframe of delivery, potentially high hidden economic costs in terms of social and environmental damage, and the costs and viability of disposing of removed carbon and other toxic matter. In its original usage, the term "clean coal" referred to technologies designed to reduce emissions of pollutants associated with burning coal, such as washing coal at the mine; this step removes some impurities, including rocks and soil, and makes coal cheaper to transport. More recently, the definition of clean coal has been expanded to include carbon capture and storage. Clean coal technology addresses atmospheric problems resulting from burning coal; the primary focus was on SO2 and NOx, the most important gases in the causation of acid rain, and on particulates, which cause visible air pollution and have deleterious effects on human health. Several different technological methods are available for carbon capture as demanded by the clean coal concept (the basic gas chemistry is sketched below):
Pre-combustion capture – This involves gasification of a feedstock to form synthesis gas, which may be shifted to produce a H2- and CO2-rich gas mixture, from which the CO2 can be efficiently captured, separated, and sequestered. This technology is associated with Integrated Gasification Combined Cycle process configurations.
Post-combustion capture – This refers to capture of CO2 from the exhaust gases of combustion processes.
Oxy-fuel combustion – Fossil fuels such as coal are burned in a mixture of recirculated flue gas and oxygen, rather than in air, which eliminates nitrogen from the flue gas and enables efficient, low-cost CO2 capture.
The Kemper County IGCC Project, a proposed 582 MW coal gasification-based power plant, was expected to use pre-combustion capture to capture 65% of the CO2 the plant produced, which would have been utilized and geologically sequestered in enhanced oil recovery operations. However, after many delays and a cost run-up to $7.5 billion, the coal gasification project was abandoned, and as of late 2017 Kemper was under construction as a cheaper natural gas power plant. The Saskatchewan Government's Boundary Dam Integrated Carbon Capture and Sequestration Demonstration Project will use post-combustion, amine-based scrubber technology to capture 90% of the CO2 emitted by Unit 3 of the power plant.
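The gas-phase chemistry behind pre-combustion capture, as planned at plants like Kemper, can be summarized by the standard gasification and water-gas shift reactions. These are textbook idealizations (coal treated as pure carbon); real feedstocks and process conditions differ.

```latex
% Gasification of coal (idealized as carbon), then the water-gas shift
\mathrm{C + H_2O \longrightarrow CO + H_2}
\qquad
\mathrm{CO + H_2O \longrightarrow CO_2 + H_2}
```

The CO2 is then separated from the H2-rich stream before combustion, which is why this approach pairs naturally with IGCC plant configurations.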
An early example of a coal-based plant using carbon-capture technology is Swedish company Vattenfall's Schwarze Pumpe power station in Spremberg, built by the German firm Siemens, which went online in September 2008. The facility captures CO2 and acid-rain-producing pollutants, separates them, and compresses the CO2 into a liquid. Plans are to inject the CO2 into geological formations. Vattenfall considers this technology not a final solution for reducing CO2 in the atmosphere, but an achievable near-term solution while more desirable alternative approaches to power generation are made economically practical. Other examples of oxy-combustion carbon capture are in progress. Callide Power Station has retrofitted an existing 30-MWth PC-fired power plant to operate in oxy-fuel mode, and Babcock-ThermoEnergy's Zero Emission Boiler System is oxy-combustion-based. Other carbon capture and storage approaches are also being pursued. Low-rank coals have a higher moisture content, which means a lower energy content per tonne.
This causes increased emissions output. Reducing the moisture of the coal prior to combustion can reduce emissions by up to 50 percent. The UK government has been working towards a clean energy future and has supported clean coal projects across the country. In August 2010, UK-based company B9 Coal announced a clean coal project with 90% carbon capture to be put forward to DECC. This proposed project was to create pure streams of hydrogen and carbon dioxide: the hydrogen was to be used as an emissions-free fuel to run an alkaline fuel cell, whilst the carbon dioxide was to be captured and stored. In the late 1980s and early 1990s, the U.S. Department of Energy (DOE) began funding demonstration projects for clean coal technologies.