Changing a theme requires changing both the theme's CSS and its JavaScript, and when using Sencha Cmd a rebuild of the application may be required. All components work with each theme, but themes target different devices: the Classic theme has rather small elements not suited for touch devices, while Neptune Touch has larger elements better suited for phones. Ext JS comes in two flavours, called the classic and modern toolkits; they differ not only in their available themes but also in some of their APIs, so it is not easy to migrate from one toolkit to the other. There are plans to iron out some differences between the toolkits in Ext JS 7.1, planned for 2019. Ext JS is a composition of classes providing, among other things: an abstraction layer for browsers, state management, a server communication layer, layout and window management, event management, and routing. Ext JS has its own class system: classes are defined with Ext.define and instances are created with Ext.create; some classes can also be created by an alias, and instances of components are created automatically.
A class can extend built-in classes; custom components typically extend the built-in components. There is a built-in dynamic loader, and there are two types of dependencies in Ext JS: dependencies declared in the requires property are loaded before the class is created, while dependencies declared in the uses property might be loaded later. It is possible to override classes, including the built-in ones, which can be useful for creating patches: the overridden class gets merged with the new declaration, and each class can be overridden any number of times. Ext JS version 2.0 was released on 4 December 2007. This version was promoted as providing an interface and features more similar to those traditionally associated with desktop applications. Also promoted were the new user documentation, API documentation and samples. Ext JS 2.0 did not provide backward compatibility with version 1.1; a migration guide was developed to address this. Ext JS version 3.0 was released on 6 July 2009. This version added communication support for the new Ext.Direct server-side platform and new charting components, and it was backwards compatible with version 2.0 code.
Version 4.0 of the Ext framework was released on April 26, 2011. It includes a revised class structure, a revised data package, an animation and drawing package that uses SVG and VML, revised charting and theming, and an optional architecture that provides a model–view–controller style of code organization. Version 5.0 of the Ext JS framework was released on June 2, 2014. It includes the ability to build desktop apps on touch-enabled devices using a single code base, a Model View ViewModel (MVVM) architecture, two-way data binding, responsive layouts, and other component upgrades, with support for adding widgets inside a grid cell for data visualization and big data analytics. Ext JS 5 includes an upgraded touch-optimized charting package along with additional financial charting capabilities. Ext JS 5 supports modern and legacy browsers, including Safari 6+, Firefox, IE8+ and Opera 12+. On the mobile platform, Ext JS 5 supports Safari on iOS 6 and 7, Chrome on Android 4.1+, and Windows 8 touch-screen devices running IE10+.
Important: from Ext JS 5 onward, a license cannot be bought for fewer than five developers. Version 6.0 of the Ext JS framework was released on July 1, 2015. It merges the Sencha Touch framework into Ext JS. On 15 June 2010, the merger of Ext JS with jQTouch and Raphaël was announced, forming a new organisation called Sencha Inc. Ext JS continues to be available as a main product on the new Sencha website, together with Sencha Touch, Sencha GXT, Sencha Architect, Sencha Animator and Ext Core. On 23 August 2017, Sencha was acquired by IDERA, whose Embarcadero subsidiary is known for acquisitions of rapid application development tools.
Linux is a family of free and open-source operating systems based on the Linux kernel, an operating system kernel first released on September 17, 1991 by Linus Torvalds. Linux is typically packaged in a Linux distribution. Distributions include the Linux kernel and supporting system software and libraries, many of which are provided by the GNU Project. Many Linux distributions use the word "Linux" in their name, but the Free Software Foundation uses the name GNU/Linux to emphasize the importance of GNU software, causing some controversy. Popular Linux distributions include Debian and Ubuntu; commercial distributions include SUSE Linux Enterprise Server. Desktop Linux distributions include a windowing system such as X11 or Wayland and a desktop environment such as GNOME or KDE Plasma. Distributions intended for servers may omit graphics altogether or include a solution stack such as LAMP; because Linux is freely redistributable, anyone may create a distribution for any purpose. Linux was originally developed for personal computers based on the Intel x86 architecture, but has since been ported to more platforms than any other operating system.
Linux is the leading operating system on servers and other big iron systems such as mainframe computers, and the only OS used on TOP500 supercomputers. It is used by around 2.3 percent of desktop computers. The Chromebook, which runs the Linux kernel-based Chrome OS, dominates the US K–12 education market and represents nearly 20 percent of sub-$300 notebook sales in the US. Linux also runs on embedded systems, i.e. devices whose operating system is built into the firmware and is tailored to the system; this includes routers, automation controls, digital video recorders, video game consoles and smartwatches. Many smartphones and tablet computers run Android and other Linux derivatives; because of the dominance of Android on smartphones, Linux has the largest installed base of all general-purpose operating systems. Linux is one of the most prominent examples of open-source software collaboration; the source code may be used and distributed—commercially or non-commercially—by anyone under the terms of its respective licenses, such as the GNU General Public License.
The Unix operating system was conceived and implemented in 1969 at AT&T's Bell Laboratories in the United States by Ken Thompson, Dennis Ritchie, Douglas McIlroy and Joe Ossanna. First released in 1971, Unix was written in assembly language, as was common practice at the time. In a key pioneering approach, it was rewritten in 1973 in the C programming language by Dennis Ritchie; the availability of a high-level language implementation of Unix made its porting to different computer platforms easier. Due to an earlier antitrust case forbidding it from entering the computer business, AT&T was required to license the operating system's source code to anyone who asked; as a result, Unix grew and became widely adopted by academic institutions and businesses. In 1984, AT&T divested itself of Bell Labs. The GNU Project, started in 1983 by Richard Stallman, had the goal of creating a "complete Unix-compatible software system" composed of free software; work began in 1984. In 1985, Stallman started the Free Software Foundation, and in 1989 he wrote the GNU General Public License.
By the early 1990s, many of the programs required in an operating system were completed, although low-level elements such as device drivers and the kernel, called GNU Hurd, were stalled and incomplete. Linus Torvalds has stated that if the GNU kernel had been available at the time, he would not have decided to write his own. Although not released until 1992 due to legal complications, development of 386BSD, from which NetBSD, OpenBSD and FreeBSD descended, predated that of Linux; Torvalds has also stated that if 386BSD had been available at the time, he would not have created Linux. MINIX was created by Andrew S. Tanenbaum, a computer science professor, and released in 1987 as a minimal Unix-like operating system targeted at students and others who wanted to learn operating system principles. Although the complete source code of MINIX was available, the licensing terms prevented it from being free software until the licensing changed in April 2000. In 1991, while attending the University of Helsinki, Torvalds became curious about operating systems.
Frustrated by the licensing of MINIX, which at the time limited it to educational use only, he began to work on his own operating system kernel, which became the Linux kernel. Torvalds began the development of the Linux kernel on MINIX, and applications written for MINIX were also used on Linux. Later, Linux matured and further Linux kernel development took place on Linux systems. GNU applications replaced all MINIX components, because it was advantageous to use the freely available code from the GNU Project with the fledgling operating system. Torvalds initiated a switch from his original license, which prohibited commercial redistribution, to the GNU GPL. Developers worked to integrate GNU components with the Linux kernel, making a functional and free operating system. Linus Torvalds had wanted to call his invention "Freax", a portmanteau of "free", "freak" and "x" (as an allusion to Unix).
C++ is a general-purpose programming language developed by Bjarne Stroustrup as an extension of the C language, originally named "C with Classes". It has imperative, object-oriented and generic programming features, while also providing facilities for low-level memory manipulation. It is almost always implemented as a compiled language, and many vendors provide C++ compilers, including the Free Software Foundation, Intel and IBM, so it is available on many platforms. C++ was designed with a bias toward system programming and embedded, resource-constrained software and large systems, with performance and flexibility of use as its design highlights. C++ has also been found useful in many other contexts, with key strengths being software infrastructure and resource-constrained applications, including desktop applications and performance-critical applications. C++ is standardized by the International Organization for Standardization (ISO), with the latest standard version ratified and published by ISO in December 2017 as ISO/IEC 14882:2017.
The C++ programming language was initially standardized in 1998 as ISO/IEC 14882:1998, which was then amended by the C++03, C++11 and C++14 standards. The current C++17 standard supersedes these with new features and an enlarged standard library. Before the initial standardization in 1998, C++ was developed by Danish computer scientist Bjarne Stroustrup at Bell Labs since 1979 as an extension of the C language. C++20 is the next planned standard, in keeping with the current trend of a new version every three years. In 1979, Bjarne Stroustrup began work on "C with Classes", the predecessor to C++; the motivation for creating a new language originated from Stroustrup's experience in programming for his Ph.D. thesis. Stroustrup found that Simula had features that were helpful for large software development, but the language was too slow for practical use, while BCPL was fast but too low-level to be suitable for large software development. When Stroustrup started working at AT&T Bell Labs, he had the problem of analyzing the UNIX kernel with respect to distributed computing.
Remembering his Ph.D. experience, Stroustrup set out to enhance the C language with Simula-like features. C was chosen because it was general-purpose, fast, portable and widely used. As well as C and Simula's influences, other languages also influenced C++, including ALGOL 68, Ada, CLU and ML. Stroustrup's "C with Classes" added features to the C compiler, including classes, derived classes, strong typing and default arguments. In 1983, "C with Classes" was renamed to "C++", adding new features that included virtual functions, function name and operator overloading, constants, type-safe free-store memory allocation, improved type checking, and BCPL-style single-line comments with two forward slashes. Furthermore, it included the development of a standalone compiler for C++, Cfront. In 1985, the first edition of The C++ Programming Language was released, which became the definitive reference for the language, as there was not yet an official standard; the first commercial implementation of C++ was released in October of the same year.
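The early feature set described above can be sketched in a few lines of modern C++; the Shape and Rect names here are purely illustrative (and the `override` keyword is a later, C++11 convenience):

```cpp
// Classes, derived classes and default arguments date to "C with Classes"
// (1979); virtual functions and operator overloading arrived with the
// 1983 renaming to C++, as did these // single-line comments.
class Shape {
public:
    virtual ~Shape() {}
    virtual double area() const = 0;       // virtual function
};

class Rect : public Shape {                // derived class
    double w, h;
public:
    Rect(double w = 1.0, double h = 1.0)   // default arguments
        : w(w), h(h) {}
    double area() const override { return w * h; }
};

// Operator overloading: order rectangles by area.
bool operator<(const Rect& a, const Rect& b) {
    return a.area() < b.area();
}
```

Type-safe free-store allocation from the same era would be written as `Shape* s = new Rect(2.0, 3.0);` followed by `delete s;`.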
In 1989, C++ 2.0 was released, followed by the updated second edition of The C++ Programming Language in 1991. New features in 2.0 included multiple inheritance, abstract classes, static member functions, const member functions and protected members. In 1990, The Annotated C++ Reference Manual was published; this work became the basis for the future standard. Later feature additions included templates, namespaces, new casts and a boolean type. After the 2.0 update, C++ evolved relatively slowly until, in 2011, the C++11 standard was released, adding numerous new features, enlarging the standard library further, and providing more facilities to C++ programmers. After a minor C++14 update released in December 2014, various new additions were introduced in C++17, with further changes planned for 2020. As of 2017, C++ remains the third most popular programming language, behind Java and C. On January 3, 2018, Stroustrup was announced as the 2018 winner of the Charles Stark Draper Prize for Engineering, "for conceptualizing and developing the C++ programming language".
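Among the numerous C++11 features mentioned above, three commonly cited ones are range-based for loops, the auto keyword and lambda expressions; the short function below (the name sum_of_squares is invented for illustration) uses all three:

```cpp
#include <vector>

// Sum the squares of a sequence using three C++11 additions.
int sum_of_squares(const std::vector<int>& xs) {
    auto square = [](int v) { return v * v; };  // lambda expression (C++11)
    int total = 0;
    for (int x : xs) {                          // range-based for loop (C++11)
        total += square(x);
    }
    return total;
}
```

Calling `sum_of_squares({1, 2, 3})` uses yet another C++11 feature, list initialization, and yields 14.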
According to Stroustrup, "the name signifies the evolutionary nature of the changes from C". The name is credited to Rick Mascitti and was first used in December 1983; when Mascitti was questioned informally in 1992 about the naming, he indicated that it was given in a tongue-in-cheek spirit. The name comes from C's ++ increment operator and a common naming convention of using "+" to indicate an enhanced computer program. During C++'s development period, the language had been referred to as "new C" and "C with Classes" before acquiring its final name. Throughout C++'s life, its development and evolution has been guided by a set of principles: It must be driven by actual problems, and its features should be useful in real-world programs. Every feature should be implementable. Programmers should be free to pick their own programming style, and that style should be supported by C++. Allowing a useful feature is more important than preventing every possible misuse of C++. It should provide facilities for organising programs into separate, well-defined parts, and provide facilities for combining separately developed parts.
No implicit violations of the type system (but explicit violations are allowed).
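This last principle can be illustrated with narrowing conversions; the truncate function below is an invented example:

```cpp
// C++ rejects an implicit, lossy double -> int conversion in list
// initialization:
//
//     double pi = 3.14159;
//     int a{pi};     // compile error: narrowing conversion
//
// The same conversion is permitted once the programmer requests it
// explicitly, making the type-system "violation" visible in the code:
int truncate(double d) {
    return static_cast<int>(d);   // explicit, so it is allowed
}
```

Pointer types follow the same rule: a float* never converts to an int* implicitly, but reinterpret_cast makes such unsafe intent explicit.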
Electronic health record
An electronic health record (EHR), or electronic medical record (EMR), is the systematized collection of electronically stored patient and population health information in a digital format. These records can be shared across different health care settings. Records are shared through network-connected, enterprise-wide information systems or other information networks and exchanges. EHRs may include a range of data, including demographics, medical history and allergies, immunization status, laboratory test results, radiology images, vital signs, personal statistics like age and weight, and billing information. A decade ago, electronic health records were touted as key to increasing the quality of care. Today, providers are using data from patient records to improve quality outcomes through their care management programs. Combining multiple types of clinical data from the system's health records has helped clinicians identify and stratify chronically ill patients. EHRs can improve quality of care by using the data and analytics to prevent hospitalizations among high-risk patients.
EHR systems are designed to store data and to capture the state of a patient across time. They eliminate the need to track down a patient's previous paper medical records and assist in ensuring that data is accurate and legible. They can also reduce the risk of data replication, as there is only one modifiable file, which means the file is more likely to be up to date and decreases the risk of lost paperwork. Because the digital information is searchable and held in a single file, EMRs are more effective when extracting medical data for the examination of possible trends and long-term changes in a patient. Population-based studies of medical records may also be facilitated by the widespread adoption of EHRs and EMRs. The terms EHR, electronic patient record and EMR have often been used interchangeably, although differences between the models are now being defined. The electronic health record is a more longitudinal collection of the electronic health information of individual patients or populations; the EMR, in contrast, is the patient record created by providers for specific encounters in hospitals and ambulatory environments, which can serve as a data source for an EHR.
In contrast, a personal health record is an electronic application for recording personal medical data that the individual patient controls and may make available to health providers. While there is still a considerable amount of debate around the superiority of electronic health records over paper records, the research literature paints a more realistic picture of the benefits and downsides. The increased transparency and accessibility acquired by the adoption of electronic medical records may increase the ease with which they can be accessed by healthcare professionals, but can also increase the amount of information stolen by unauthorized persons or unscrupulous users versus paper medical records, as acknowledged by the increased security requirements for electronic medical records included in the Health Insurance Portability and Accountability Act (HIPAA) and by large-scale breaches in confidential records reported by EMR users. Concerns about security contribute to the resistance shown to their adoption. Handwritten paper medical records may be poorly legible.
Pre-printed forms, standardization of abbreviations and standards for penmanship were encouraged to improve the reliability of paper medical records. Electronic records may help with the standardization of forms and data input, and digitization of forms facilitates the collection of data for clinical studies. However, standardization may create challenges for local practice. Overall, those with EMRs that have automated notes and records, order entry and clinical decision support had fewer complications, lower mortality rates and lower costs. EMRs can be continuously updated. If the ability to exchange records between different EMR systems were perfected, it would facilitate the co-ordination of health care delivery in non-affiliated health care facilities. In addition, data from an electronic system can be used anonymously for statistical reporting in matters such as quality improvement, resource management and public health communicable disease surveillance. However, it is difficult to remove data from its context.
Ambulance services in Australia, the United States and the United Kingdom have introduced the use of EMR systems. EMS encounters in the United States are recorded using various platforms and vendors in compliance with the NEMSIS standard. The benefits of electronic records in ambulances include: patient data sharing, injury/illness prevention, better training for paramedics, review of clinical standards, better research options for pre-hospital care and design of future treatment options, data-based outcome improvement, and clinical decision support. Automated handwriting recognition of ambulance medical forms has been successful. For example, Intermedix TripTix offers handwriting support across all elements of the NEMSIS 3.3.4 and 3.4.0 standards as well as custom forms on Windows devices. These systems allow traditionally paper-based medical documents to be converted to digital at the time of entry with less cost overhead; the data can be efficiently used for epidemiological analysis, including de-identified data at the national level.
Digital formatting enables information to be used and shared over secure networks, and EHR systems can:
- Track care and outcomes
- Trigger warnings and reminders
- Send and receive orders and results
- Decrease billing processing time and create more accurate billing systems
Health Information Exchange is a technical and social framework that enables health information to move electronically between organizations.
In computing, a web application or web app is a client–server computer program in which the client runs in a web browser. Common web applications include webmail, online retail sales and online auctions. The general distinction between a dynamic web page of any kind and a "web application" is unclear; web sites most likely to be referred to as "web applications" are those which have similar functionality to a desktop software application or to a mobile app. HTML5 introduced explicit language support for making applications that are loaded as web pages, but can store data locally and continue to function while offline. Single-page applications are more application-like because they reject the more typical web paradigm of moving between distinct pages with different URLs. Single-page frameworks like Sencha Touch and AngularJS might be used to speed development of such a web app for a mobile platform. There are several ways of targeting mobile devices when making a web application: Responsive web design can be used to make a web application, whether a conventional website or a single-page application, viewable on small screens and work well with touchscreens.
Progressive Web Apps are web applications that load like regular web pages or websites but can offer the user functionality such as working offline, push notifications, and device hardware access traditionally available only to native mobile applications. Native apps or "mobile apps" run directly on a mobile device, just as a conventional software application runs directly on a desktop computer, without a web browser. Frameworks like React Native, Flutter and FuseTools allow the development of native apps for all platforms using languages other than each standard native language. Hybrid apps embed a mobile web site inside a native app using a hybrid framework such as Apache Cordova, Ionic or Appcelerator Titanium; this allows development using web technologies while retaining certain advantages of native apps. In earlier computing models like client–server, the processing load for the application was shared between code on the server and code installed on each client locally. In other words, an application had its own pre-compiled client program which served as its user interface and had to be separately installed on each user's personal computer.