National Institute of Standards and Technology
The National Institute of Standards and Technology (NIST) is a physical sciences laboratory and a non-regulatory agency of the United States Department of Commerce. Its mission is to promote American innovation and industrial competitiveness. NIST's activities are organized into laboratory programs that include nanoscale science and technology, information technology, neutron research, material measurement, and physical measurement. The American AI Initiative has called on NIST to lead the development of appropriate technical standards for reliable, trustworthy, secure, and interoperable AI systems. The Articles of Confederation, ratified by the colonies in 1781, contained the clause, "The United States in Congress assembled shall have the sole and exclusive right and power of regulating the alloy and value of coin struck by their own authority, or by that of the respective states—fixing the standards of weights and measures throughout the United States". Article 1, Section 8, of the Constitution of the United States transferred this power to Congress:
"To coin money, regulate the value thereof, and of foreign coin, and fix the standard of weights and measures." In January 1790, President George Washington, in his first annual message to Congress, stated that "Uniformity in the currency, weights, and measures of the United States is an object of great importance, and will, I am persuaded, be duly attended to", and ordered Secretary of State Thomas Jefferson to prepare a plan for Establishing Uniformity in the Coinage, Weights, and Measures of the United States, afterwards referred to as the Jefferson report. On October 25, 1791, Washington appealed a third time to Congress: "A uniformity of the weights and measures of the country is among the important objects submitted to you by the Constitution and if it can be derived from a standard at once invariable and universal, must be no less honorable to the public council than conducive to the public convenience". It was not until 1838, however, that a uniform set of standards was worked out. In 1821, John Quincy Adams had declared, "Weights and measures may be ranked among the necessities of life to every individual of human society".
From 1830 until 1901, the role of overseeing weights and measures was carried out by the Office of Standard Weights and Measures, part of the United States Department of the Treasury. In 1901, in response to a bill proposed by Congressman James H. Southard, the National Bureau of Standards was founded with a mandate to provide standard weights and measures and to serve as the national physical laboratory for the United States. President Theodore Roosevelt appointed Samuel W. Stratton as the first director; the budget for the first year of operation was $40,000. The Bureau took custody of the copies of the kilogram and meter bars that were the standards for US measures and set up a program to provide metrology services for United States scientific and commercial users. A laboratory site was constructed in Washington, DC, and instruments were acquired from the national physical laboratories of Europe. In addition to weights and measures, the Bureau developed instruments for measuring electrical units and light.
In 1905 a meeting was called that would be the first "National Conference on Weights and Measures". Though conceived as a purely metrological agency, the Bureau of Standards was directed by Herbert Hoover to set up divisions to develop commercial standards for materials and products. Some of these standards were for products intended for government use, but product standards also affected private-sector consumption. Quality standards were developed for products including some types of clothing, automobile brake systems and headlamps, and electrical safety. During World War I, the Bureau worked on multiple problems related to war production, operating its own facility to produce optical glass when European supplies were cut off. Between the wars, Harry Diamond of the Bureau developed a blind-approach radio aircraft landing system. During World War II, military research and development was carried out, including development of radio propagation forecast methods, the proximity fuze, and the standardized airframe used for Project Pigeon, and shortly afterwards the autonomously radar-guided Bat anti-ship guided bomb and the Kingfisher family of torpedo-carrying missiles.
In 1948, financed by the United States Air Force, the Bureau began design and construction of SEAC, the Standards Eastern Automatic Computer. The computer went into operation in May 1950 using a combination of vacuum tubes and solid-state diode logic. Around the same time the Standards Western Automatic Computer (SWAC) was built at the Los Angeles office of the NBS by Harry Huskey and used for research there. A mobile version, DYSEAC, was built for the Signal Corps in 1954. Reflecting its changing mission, the National Bureau of Standards was renamed the National Institute of Standards and Technology in 1988. Following the September 11, 2001 attacks, NIST conducted the official investigation into the collapse of the World Trade Center buildings. As a measurement standards laboratory, NIST serves as the United States' National Metrology Institute; its official mission is to promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.
NIST had an operating budget of about $843.3 million for fiscal year 2007; its 2009 budget was $992 million.
A braille e-book is a refreshable braille display that uses electroactive polymers or heated wax, rather than mechanical pins, to raise braille dots on a display. Though not inherently expensive, the small scale of production has kept them from being economical; some e-books are produced alongside a printed format, as described in electronic publishing. Braille books were traditionally written on paper with the Perkins Brailler, a typewriter invented in 1951 and improved in 2008; braille books could also be produced with braille printers or embossers. In 2011 David S. Morgan produced the first SMART Brailler machine, which added a text-to-speech function and allowed digital capture of the data entered. In 1960 Robert Mann, a teacher at MIT, wrote DOTSYS, software that allowed automatic braille translation, and another group created an embossing device called the "M.I.T. Braillemboss". The Mitre Corporation team of Robert Gildea, Jonathan Millen, Reid Gerhart and Joseph Sullivan developed DOTSYS III, the first braille translator written in a portable programming language.
DOTSYS III was developed for the Atlanta Public Schools as a public domain program. Braille translators allowed braille text or books to be created automatically from ordinary script, without the need to type braille books on braille typewriters; embossers were still needed to produce physical books, a step that is unnecessary when the book is read on a braille e-reader. A Korean concept design published in 2009 by Yanko Design attracted attention. A British prototype design called "Anagraphs" was created in 2013, but funding from the European Union ran out before it could be brought to production. A braille e-book/tablet was slated to be released for purchase in the fourth quarter of 2016 by the Austrian company Blitab and was expected to be priced under US$3,000. As of February 2019 the company was inviting people to sign up as a "Tester", with the explanation, "Become one of the first to touch and feel the future of large scale tactile Braille displays."
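At its core, a braille translator is a lookup from print characters to braille cells. The minimal sketch below uses the standard uncontracted (Grade 1) letter-to-dot assignments for a few English letters and Unicode braille patterns; it illustrates the idea only and is not DOTSYS's actual translation table.

```python
# Minimal uncontracted (Grade 1) braille translation sketch.
# Unicode braille patterns start at U+2800; each of the six dots
# sets one bit: dot 1 = 0x01, dot 2 = 0x02, ..., dot 6 = 0x20.
DOTS = {
    "a": (1,), "b": (1, 2), "c": (1, 4), "d": (1, 4, 5), "e": (1, 5),
    "f": (1, 2, 4), "g": (1, 2, 4, 5), "h": (1, 2, 5), "i": (2, 4),
    "j": (2, 4, 5), "k": (1, 3), "l": (1, 2, 3),
}

def cell(dots):
    """Build one Unicode braille cell from a tuple of raised dots."""
    bits = 0
    for d in dots:
        bits |= 1 << (d - 1)
    return chr(0x2800 + bits)

def to_braille(text):
    """Translate known lowercase letters to braille cells; pass others through."""
    return "".join(cell(DOTS[ch]) if ch in DOTS else ch for ch in text)

print(to_braille("bad"))  # → ⠃⠁⠙
```

A real translator such as DOTSYS III additionally handles contractions, numbers and punctuation, but the table-driven structure is the same.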
An operating system (OS) is system software that manages computer hardware and software resources and provides common services for computer programs. Time-sharing operating systems schedule tasks for efficient use of the system and may include accounting software for cost allocation of processor time, mass storage and other resources. For hardware functions such as input, output and memory allocation, the operating system acts as an intermediary between programs and the computer hardware, although the application code is executed directly by the hardware and makes system calls to an OS function or is interrupted by it. Operating systems are found on many devices that contain a computer, from cellular phones and video game consoles to web servers and supercomputers. The dominant desktop operating system is Microsoft Windows, with a market share of around 82.74%; macOS by Apple Inc. is in second place, and the varieties of Linux are collectively in third place. In the mobile sector, Google's Android accounted for up to 70% of use in 2017; according to third-quarter 2016 data, Android on smartphones was dominant with 87.5 percent of the market and a growth rate of 10.3 percent per year, followed by Apple's iOS with 12.1 percent and a yearly decrease in market share of 5.2 percent, while other operating systems amounted to just 0.3 percent.
Linux distributions are dominant in the supercomputing sector. Other specialized classes of operating systems, such as embedded and real-time systems, exist for many applications. A single-tasking system can only run one program at a time, while a multi-tasking operating system allows more than one program to run concurrently; this is achieved by time-sharing, where the available processor time is divided between multiple processes. These processes are each interrupted in time slices by a task-scheduling subsystem of the operating system. Multi-tasking may be characterized as preemptive or co-operative. In preemptive multitasking, the operating system slices the CPU time and dedicates a slot to each of the programs. Unix-like operating systems, such as Solaris and Linux—as well as non-Unix-like ones, such as AmigaOS—support preemptive multitasking. Cooperative multitasking is achieved by relying on each process to yield time to the other processes in a defined manner. 16-bit versions of Microsoft Windows used cooperative multi-tasking.
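The cooperative scheme described above can be sketched with Python generators standing in for processes: each task does one unit of work and then voluntarily yields control back to a round-robin scheduler. The task names and step counts below are illustrative.

```python
from collections import deque

log = []  # records the interleaving of the tasks

def task(name, steps):
    """A cooperative task: does one unit of work, then yields the CPU."""
    for i in range(steps):
        log.append(f"{name}{i}")
        yield  # voluntarily hand control back to the scheduler

def run(tasks):
    """Round-robin scheduler: resume each ready task in turn until all finish."""
    ready = deque(tasks)
    while ready:
        t = ready.popleft()
        try:
            next(t)          # run the task until its next yield
            ready.append(t)  # still alive: requeue at the back
        except StopIteration:
            pass             # task completed

run([task("A", 2), task("B", 3)])
print(log)  # → ['A0', 'B0', 'A1', 'B1', 'B2']
```

If a task never yields (an infinite loop with no `yield`), the whole system stalls, which is exactly the weakness of cooperative multitasking that preemptive scheduling avoids.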
32-bit versions of both Windows NT and Win9x used preemptive multi-tasking. Single-user operating systems have no facilities to distinguish users but may allow multiple programs to run in tandem. A multi-user operating system extends the basic concept of multi-tasking with facilities that identify processes and resources, such as disk space, as belonging to multiple users, and the system permits multiple users to interact with the system at the same time. Time-sharing operating systems schedule tasks for efficient use of the system and may include accounting software for cost allocation of processor time, mass storage and other resources to multiple users. A distributed operating system manages a group of distinct computers and makes them appear to be a single computer; the development of networked computers that could be linked and communicate with each other gave rise to distributed computing. Distributed computations are carried out on more than one machine; when computers in a group work in cooperation, they form a distributed system.
In an OS, distributed and cloud computing context, templating refers to creating a single virtual machine image as a guest operating system and then saving it as a tool for creating multiple running virtual machines. The technique is used both in virtualization and in cloud computing management, and is common in large server warehouses. Embedded operating systems are designed to be used in embedded computer systems; they are designed to operate on small machines, such as PDAs, with less autonomy. Able to operate with a limited number of resources, they are compact and efficient by design. Windows CE and Minix 3 are examples of embedded operating systems. A real-time operating system is an operating system that guarantees to process events or data by a specific moment in time. A real-time operating system may be single- or multi-tasking, but when multitasking, it uses specialized scheduling algorithms so that a deterministic nature of behavior is achieved. An event-driven system switches between tasks based on their priorities or external events, while time-sharing operating systems switch tasks based on clock interrupts.
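The priority-driven dispatch of an event-driven real-time system can be sketched with a priority heap: whenever the dispatcher runs, it picks the most urgent ready task. The task names and priority numbers below are illustrative only.

```python
import heapq

# Event-driven scheduling sketch: the ready queue is a min-heap keyed on
# priority, so the dispatcher always selects the most urgent ready task.
ready = []  # entries are (priority, task_name); lower number = more urgent

def make_ready(priority, name):
    """An event (e.g. an interrupt) marks a task as ready to run."""
    heapq.heappush(ready, (priority, name))

def dispatch():
    """Pop and 'run' the highest-priority ready task."""
    priority, name = heapq.heappop(ready)
    return name

make_ready(3, "logging")
make_ready(1, "motor-control")  # hard deadline: most urgent
make_ready(2, "sensor-poll")

order = [dispatch() for _ in range(3)]
print(order)  # → ['motor-control', 'sensor-poll', 'logging']
```

A time-sharing system, by contrast, would rotate among these tasks on clock interrupts regardless of priority; the deterministic "most urgent first" rule is what gives a real-time system its predictability.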
A library operating system is one in which the services that a typical operating system provides, such as networking, are provided in the form of libraries and composed with the application and configuration code to construct a unikernel: a specialized, single-address-space machine image that can be deployed to cloud or embedded environments. Early computers were built to perform a series of single tasks, like a calculator. Basic operating system features were developed in the 1950s, such as resident monitor functions that could automatically run different programs in succession to speed up processing. Operating systems did not exist in their more complex forms until the early 1960s. Hardware features were added that enabled use of runtime libraries and parallel processing. When personal computers became popular in the 1980s, operating systems were made for them that were similar in concept to those used on larger computers. In the 1940s, the earliest electronic digital systems had no operating systems.
Electronic systems of this time were programmed on rows of mechanical switches or by jumper wires on plug boards. These were special-purpose systems that, for example, generated ballistics tables for the military or controlled the pri
A computer monitor is an output device that displays information in pictorial form. A monitor comprises the display device, circuitry and power supply. The display device in modern monitors is a thin-film-transistor liquid crystal display (TFT-LCD) with LED backlighting, which has replaced cold-cathode fluorescent lamp (CCFL) backlighting; older monitors used a cathode ray tube (CRT). Monitors are connected to the computer via VGA, Digital Visual Interface (DVI), HDMI, DisplayPort, low-voltage differential signaling (LVDS) or other proprietary connectors and signals. Originally, computer monitors were used for data processing while television receivers were used for entertainment. From the 1980s onwards, computers have been used for both data processing and entertainment, while televisions have implemented some computer functionality. The common aspect ratio of televisions and computer monitors has changed from 4:3 to 16:10, and then to 16:9. Modern computer monitors are interchangeable with conventional television sets. However, as computer monitors do not include components such as a television tuner and speakers, it may not be possible to use a computer monitor as a television without external components.
Early electronic computers were fitted with a panel of light bulbs where the state of each particular bulb would indicate the on/off state of a particular register bit inside the computer. This allowed the engineers operating the computer to monitor the internal state of the machine, so this panel of lights came to be known as the 'monitor'. As early monitors were only capable of displaying a limited amount of information and were transient, they were rarely considered for program output. Instead, a line printer was the primary output device, while the monitor was limited to keeping track of the program's operation. As technology developed, engineers realized that the output of a CRT display was more flexible than a panel of light bulbs and that, by giving control of what was displayed to the program itself, the monitor could become a powerful output device in its own right. Computer monitors were formerly known as visual display units (VDUs), but this term had largely fallen out of use by the 1990s. Multiple technologies have been used for computer monitors.
Until the 21st century most computer monitors used cathode ray tubes, but these have since been superseded by LCD monitors. The first computer monitors used cathode ray tubes. Prior to the advent of home computers in the late 1970s, it was common for a video display terminal using a CRT to be physically integrated with a keyboard and other components of the system in a single large chassis. The display was monochrome and far less sharp and detailed than on a modern flat-panel monitor, necessitating the use of large text and limiting the amount of information that could be displayed at one time. High-resolution CRT displays were developed for specialized military and scientific applications, but they were far too costly for general use. Some of the earliest home computers were limited to monochrome CRT displays, but color display capability was a standard feature of the pioneering Apple II, introduced in 1977, and the specialty of the more graphically sophisticated Atari 800, introduced in 1979. Either computer could be connected to the antenna terminals of an ordinary color TV set or used with a purpose-made CRT color monitor for optimum resolution and color quality.
Lagging several years behind, in 1981 IBM introduced the Color Graphics Adapter, which could display four colors with a resolution of 320 x 200 pixels, or 640 x 200 pixels with two colors. In 1984 IBM introduced the Enhanced Graphics Adapter, capable of producing 16 colors at a resolution of 640 x 350. By the end of the 1980s, color CRT monitors that could display 1024 x 768 pixels were available and affordable. During the following decade, maximum display resolutions increased and prices continued to fall. CRT technology remained dominant in the PC monitor market into the new millennium because it was cheaper to produce and offered viewing angles close to 180 degrees. CRTs still offer some image quality advantages over LCDs, but improvements to the latter have made them much less obvious. The dynamic range of early LCD panels was poor, and although text and other motionless graphics were sharper than on a CRT, an LCD characteristic known as pixel lag caused moving graphics to appear noticeably smeared and blurry.
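The arithmetic behind those two CGA graphics modes is worth spelling out: four colors need 2 bits per pixel and two colors need 1 bit, so both modes occupy exactly 16,000 bytes and fit within CGA's 16 KB of video memory. A back-of-the-envelope check, assuming packed pixels:

```python
# Both CGA graphics modes fit in the card's 16 KB of video RAM:
# 4 colors -> 2 bits per pixel, 2 colors -> 1 bit per pixel.
def framebuffer_bytes(width, height, colors):
    bits_per_pixel = (colors - 1).bit_length()  # 4 -> 2, 2 -> 1
    return width * height * bits_per_pixel // 8

low = framebuffer_bytes(320, 200, 4)   # low-res mode, 4 colors
high = framebuffer_bytes(640, 200, 2)  # high-res mode, 2 colors
print(low, high, low <= 16 * 1024)  # → 16000 16000 True
```

The same trade-off explains the pairing: doubling the horizontal resolution forces the color depth to halve so the total stays inside the fixed video memory.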
Multiple technologies have been used to implement LCDs. Throughout the 1990s, the primary use of LCD technology in computer monitors was in laptops, where the lower power consumption, lighter weight and smaller physical size of LCDs justified the higher price versus a CRT. The same laptop would be offered with an assortment of display options at increasing price points: monochrome, passive color, or active matrix color. As volume and manufacturing capability improved, the monochrome and passive color technologies were dropped from most product lines. TFT-LCD is a variant of LCD that is now the dominant technology used for computer monitors. The first standalone LCDs appeared in the mid-1990s, selling for high prices. As prices declined over a period of years they became more popular, and by 1997 they were competing with CRT monitors. Among the first desktop LCD computer monitors was the Eizo L66 in the mid-1990s, followed by the Apple Studio Display in 1998 and the Apple Cinema Display in 1999. In 2003, TFT-LCDs outsold CRTs for the first time, becoming the primary technology used for computer monitors.
The main advantages of LCDs over CRT displays are that LC
The Katholieke Universiteit Leuven, abbreviated KU Leuven, is a research university in the Dutch-speaking town of Leuven in Flanders, Belgium. It conducts teaching, research and services in the sciences, humanities, medicine and social sciences. In addition to its main campus in Leuven, it has satellite campuses in Kortrijk, Ghent, Ostend, Diepenbeek, Sint-Katelijne-Waver, and in Belgium's capital Brussels. KU Leuven is the largest university in the Low Countries; in 2017–18, more than 58,000 students were enrolled. Its primary language of instruction is Dutch, although several programs, particularly graduate degrees, are taught in English. KU Leuven ranks among the top 100 universities in the world; as of 2016–2017, it ranked 40th globally according to Times Higher Education, 79th according to QS World University Rankings, and 93rd according to the Academic Ranking of World Universities. According to Thomson Reuters, in 2016, 2017 and 2018 KU Leuven researchers filed more patents than any other university in Europe; as such, KU Leuven was ranked first in the publication's annual list of Europe's most innovative universities for those three years.
A number of its programs rank within the top 100 in the world according to QS World University Rankings by Subject, and it is the highest-ranked university in the Low Countries. The old University of Leuven was founded at the center of the historic town of Leuven in 1425, making it Belgium's first university. The University of Leuven closed during the Napoleonic period in 1797. The Catholic University of Leuven was "re-founded" in 1834 and is identified as a continuation of the older institution. In 1968, the Catholic University of Leuven split into the Dutch-language Katholieke Universiteit te Leuven and the French-language Université catholique de Louvain, which moved to Louvain-la-Neuve in Wallonia. The Catholic University of Leuven has been a major contributor to the development of Catholic theology and is considered the oldest extant Catholic university. Although Catholic in heritage, it operates independently from the Church, and KU Leuven is open to students from different faiths. For the history of the pre-1970 university, see Catholic University of Leuven.
In 1968, tensions between the Dutch-speaking and French-speaking communities led to the splitting of the bilingual Catholic University of Leuven into two "sister" universities, with the Dutch-language university becoming a functioning independent institution in Leuven in 1970 and the Université catholique de Louvain departing to a newly built greenfield campus in the French-speaking part of Belgium. Pieter De Somer became the first rector of the KUL. In 1972, the KUL set up a separate entity, Leuven Research & Development, to support industrial and commercial applications of university research; it has led to numerous spin-offs, such as the technology company Metris, and manages tens of millions of euros in investments and venture capital. The university's electronic learning environment, TOLEDO, which started in September 2001, was developed into the central electronic learning environment at the KUL. The name is an acronym for TOetsen en LEren Doeltreffend Ondersteunen ("supporting testing and learning effectively"), and it is the collective name for a number of commercial software programs and tools, such as Blackboard.
The project offers the Question Mark Perception assessment software to all institution members and has implemented the Ariadne KPS to reuse digital learning objects inside the Blackboard environment. On 11 July 2002, the KU Leuven became the dominant institution in the "KU Leuven Association". KU Leuven is a member of the Coimbra Group as well as of the LERU Group. Since November 2014, KU Leuven's Faculty of Economics and Business has been accredited by the European Quality Improvement System (EQUIS), a leading accreditation system specializing in higher education institutions for management and business administration. Since August 2017, the university has been led by Luc Sels. The Belgian archbishop André-Joseph Léonard is the current Grand Chancellor and a member of the university board. KU Leuven is dedicated to Mary, the mother of Jesus, under her traditional attribute as "Seat of Wisdom", and organizes an annual celebration on 2 February in her honour; on that day, the university awards its honorary doctorates.
The seal used by the university shows the medieval statue Our Lady of Leuven in a vesica piscis shape. In the academic year 2012–2013, the university held Erasmus contracts with 434 European establishments and had 22 central bilateral agreements in 8 countries, including the United States, South Africa, the Democratic Republic of the Congo, Vietnam and the Netherlands. The vast majority of international EU students came from the Netherlands, while most non-EU students came from China. KU Leuven hosts the world's largest banana genebank, the Bioversity International Musa Germplasm Transit Centre, which celebrated its 30th anniversary in 2017 and was visited by Deputy Prime Minister and Minister for Development Cooperation Alexander De Croo. Academic activity at KU Leuven is organized into three groups, each with its own faculties and schools offering programs up to doctoral level. While most courses are taught in Dutch, many, particularly graduate programs, are offered in English. Biomedical Sciences Group Department of Cardiovascular Sciences Department of Oral Health Sciences Department of Pharmaceutical and Pharmacological Sci
Burmese Braille is the braille alphabet of languages of Burma written in the Burmese script, including Burmese and Karen. Letters that may not seem at first glance to correspond to international norms are more recognizable when traditional romanization is considered. For example, သ s is rendered ⠹ th, which is how it was romanized when Burmese Braille was developed. The first braille alphabet for Burmese was developed by a Father Jackson ca. 1918. It had no provision for the voiced aspirate series of consonants, nor for the retroflex series, and Jackson provided distinct letters for complex onsets such as ky and hm and for various syllable rimes, with no regard to how they are written in the print Burmese alphabet. These aspects have all been changed, as have several of the letters for the values which were retained. However, some of the old letters remain, unusual by international standards, such as ⠌ for င ng and ⠪ for ီ i. The letters in print Burmese transcribe consonants and, in syllable-initial position, vowels. The consonants each have a corresponding letter in braille, but the initial vowels of print are in braille all written with ⠰ plus the letter for the appropriate diacritic.
The consonant ny has two forms in print. The stacking of consonants in print is indicated with ⠤ in braille; that is, Burmese Braille has two viramas, one corresponding to the print virama and one corresponding to stacking. For example, ကမ္ဘာ kambha "world" is written ⠅⠍⠤⠃⠁. The diacritics of print, which transcribe both vowels and consonants, are rendered with letters of their own in braille. ⠰ is used to mark syllable- or word-initial vowels, which have distinct letters in the Burmese print alphabet. The following punctuation is specific to Burmese; Western punctuation uses Western braille conventions.
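The kambha example above can be sketched as a character-by-character mapping. Only ⠤, the stacking virama, is given explicitly in the text; the other cells below are inferred from that single example and are illustrative, not a complete Burmese Braille table.

```python
# Character-to-cell sketch for the ကမ္ဘာ "world" example.
# Only U+2824 ⠤ (the stacking virama) is attested directly above;
# the remaining cells are inferred from the one worked example.
CELLS = {
    "\u1000": "\u2805",  # က k  -> ⠅
    "\u1019": "\u280D",  # မ m  -> ⠍
    "\u1039": "\u2824",  # ္ stacking indicator -> ⠤
    "\u1018": "\u2803",  # ဘ bh -> ⠃
    "\u102C": "\u2801",  # ာ ā  -> ⠁
}

def transcribe(word):
    """Map each Burmese-script character to its braille cell."""
    return "".join(CELLS[ch] for ch in word)

print(transcribe("\u1000\u1019\u1039\u1018\u102C"))  # → ⠅⠍⠤⠃⠁
```

The point of the sketch is the treatment of stacking: the print stacking of မ over ဘ becomes a linear sequence in braille, with ⠤ inserted between the two consonant cells.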
macOS is a series of graphical operating systems developed and marketed by Apple Inc. since 2001. It is the primary operating system for Apple's Mac family of computers. Within the market of desktop and home computers, by web usage, it is the second most used desktop OS after Microsoft Windows. macOS is the second major series of Macintosh operating systems. The first is colloquially called the "classic" Mac OS, introduced in 1984; the final release of that series, Mac OS 9, appeared in 1999. The first desktop version, Mac OS X 10.0, was released in March 2001, with its first update, 10.1, arriving later that year. After this, Apple began naming its releases after big cats, which lasted until OS X 10.8 Mountain Lion; since OS X 10.9 Mavericks, releases have been named after locations in California. Apple shortened the name to "OS X" in 2012 and changed it to "macOS" in 2016, adopting the nomenclature it was using for its other operating systems, iOS, watchOS and tvOS. The latest version is macOS Mojave, publicly released in September 2018.
Between 1999 and 2009, Apple sold a separate series of operating systems called Mac OS X Server. The initial version, Mac OS X Server 1.0, was released in 1999 with a user interface similar to Mac OS 8.5. After this, new versions were introduced concurrently with the desktop version of Mac OS X. Beginning with Mac OS X 10.7 Lion, the server functions were made available as a separate package on the Mac App Store. macOS is based on technologies developed between 1985 and 1997 at NeXT, a company that Apple co-founder Steve Jobs created after leaving Apple. The "X" in Mac OS X and OS X is the Roman numeral for ten and is pronounced as such. The X was a prominent part of the operating system's brand identity and marketing in its early years, but has receded in prominence since the release of Snow Leopard in 2009. UNIX 03 certification was achieved for the Intel version of Mac OS X 10.5 Leopard, and all releases from Mac OS X 10.6 Snow Leopard up to the current version have UNIX 03 certification. macOS shares its Unix-based core, named Darwin, and many of its frameworks with iOS, tvOS and watchOS.
A modified version of Mac OS X 10.4 Tiger was used for the first-generation Apple TV. Releases of Mac OS X from 1999 to 2005 ran on the PowerPC-based Macs of that period. After Apple announced that it was switching to Intel CPUs from 2006 onwards, versions were released for 32-bit and 64-bit Intel-based Macs. Versions from Mac OS X 10.7 Lion onwards run only on 64-bit Intel CPUs (in contrast to the ARM architecture used on iOS and watchOS devices) and do not support PowerPC applications. The heritage of what would become macOS originated at NeXT, a company founded by Steve Jobs following his departure from Apple in 1985. There, the Unix-like NeXTSTEP operating system was developed and launched in 1989. The kernel of NeXTSTEP is based upon the Mach kernel, developed at Carnegie Mellon University, with additional kernel layers and low-level user space code derived from parts of BSD. Its graphical user interface was built on top of an object-oriented GUI toolkit using the Objective-C programming language. Throughout the early 1990s, Apple had tried to create a "next-generation" OS to succeed its classic Mac OS through the Taligent and Gershwin projects, but all of them were abandoned.
This led Apple to purchase NeXT in 1996, allowing NeXTSTEP, by then called OPENSTEP, to serve as the basis for Apple's next-generation operating system. The purchase also led to Steve Jobs returning to Apple as interim, and later permanent, CEO, shepherding the transformation of the programmer-friendly OPENSTEP into a system that would be adopted by Apple's primary market of home users and creative professionals. The project was first code-named "Rhapsody" and then officially named Mac OS X. Mac OS X was presented as the tenth major version of Apple's operating system for Macintosh computers; previous Macintosh operating systems were named using Arabic numerals, as with Mac OS 8 and Mac OS 9. The letter "X" in Mac OS X's name refers to a Roman numeral and is therefore pronounced "ten" in this context. However, it is commonly pronounced like the letter "X". The first version of Mac OS X, Mac OS X Server 1.0, was a transitional product featuring an interface resembling the classic Mac OS, though it was not compatible with software designed for the older system.
Consumer releases of Mac OS X included more backward compatibility: Mac OS applications could be rewritten to run natively via the Carbon API. The consumer version of Mac OS X was launched in 2001 with Mac OS X 10.0. Reviews were variable, with extensive praise for its sophisticated, glossy Aqua interface but criticism of its sluggish performance. With Apple's popularity at a low, the makers of several classic Mac applications such as FrameMaker and PageMaker declined to develop new versions of their software for Mac OS X. Ars Technica columnist John Siracusa, who reviewed every major OS X release up to 10.10, described the early releases in retrospect as 'dog-slow, feature poor' and Aqua as 'unbearably slow and a huge resource hog'. Apple developed several new releases of Mac OS X. Siracusa's review of version 10.3 noted, "It's strange to have gone from years of uncertainty and vaporware to a steady annual supply of major new operating system releases." Version 10.4, Tiger, shocked executives at Microsoft by offering a number of features, such as fast file s