University of Zurich
The University of Zurich, located in the city of Zürich, is the largest university in Switzerland, with over 26,000 students. It was founded in 1833 from the existing colleges of theology, law, and medicine. Currently the university has seven faculties, including Human Medicine, Economic Sciences, Law, and Natural Sciences, and it offers the widest range of subjects and courses of any Swiss higher education institution. It was the first university in Europe to be founded by the state rather than by a monarch or a church. Plans to appoint the controversial theologian David Friedrich Strauss to a chair provoked such opposition that the authorities eventually offered Strauss a pension before he had a chance to start his duties. The university allowed women to attend philosophy lectures from 1847 and admitted the first female doctoral student in 1866. The Faculty of Veterinary Medicine, the second-oldest such faculty in the world, was added in 1901, and in 1914 the university moved to new premises designed by the architect Karl Moser on Rämistrasse 71. The university is scattered across the city of Zurich, and its members can use several libraries, including the ETH library and the Zurich Central Library, which holds over 5 million volumes.
In 1962, the Faculty of Science proposed establishing the Irchelpark campus on the Strickhofareal. Construction of the first stage of the university buildings began in 1973, and the campus was inaugurated in 1979; construction of the second stage lasted from 1978 to 1983. The campus houses the Anthropologisches Museum (the university's anthropological museum) and the cantonal Staatsarchiv Zürich; the Institute and Museum for the History of Medicine is also part of the university. The University of Zurich as a whole ranks in the top ten in Europe, notably in the fields of bioscience and finance, and there is close-knit collaboration between the University of Zurich and ETH Zurich. The degree programme at its Faculty of Medicine takes six years. In international rankings, the Shanghai Jiao Tong University Ranking placed the university 54th globally and 15th in Europe; the THES – QS World University Rankings placed it 61st globally and 14th in Europe; the QS World University Rankings 2014 placed it 57th globally; the Professional Ranking of World Universities placed it 32nd globally and 10th in Europe; and the University Ranking by Academic Performance 2010 placed it 52nd globally and 1st in Switzerland. According to Handelsblatt, the Department of Economics was ranked first in the German-speaking area, and in 2009 the Faculty of Business Administration was ranked third in the German-speaking area.
Bachelor courses are taught in Swiss Standard German, though the use of English is increasing in many faculties; the only bachelor's programme taught entirely in English is the English Language and Literature programme. All master's courses at the Faculty of Science are held in English, and in some highly competitive and international programmes, such as the Master of Science in Quantitative Finance, all lectures are held in English. Twelve Nobel Prize recipients are associated with the university, primarily in physics. The university's Institute for Greek and Latin Philology created and maintains Corpus Córporum, a digital library.
National University of Singapore
The National University of Singapore (NUS) is an autonomous university in Singapore. Founded in 1905, it is the oldest institute of higher learning in Singapore, as well as the largest university in the country in terms of student enrolment. NUS is a research-intensive, comprehensive university with an entrepreneurial dimension, and it was ranked as Asia's top university in both the QS World University Rankings and the Times Higher Education World University Rankings in 2016. According to the 2016 QS World University Rankings, NUS is placed 12th in the world; it also fared well in the 2016–17 Times Higher Education World University Rankings, coming in 24th in the world and 1st in Asia. NUS's main campus is located in south-west Singapore adjacent to Kent Ridge; the Bukit Timah campus houses the Faculty of Law, the Lee Kuan Yew School of Public Policy, and research institutes, while the Duke-NUS Medical School is located at the Outram campus. Tan Jiak Kim, who was the first president of the Straits Chinese British Association, managed to raise 87,077 Straits dollars towards a medical school, and on 3 July 1905 the school was founded, known as the Straits Settlements and Federated Malay States Government Medical School.
In 1912, the school received an endowment of $120,000 from the King Edward VII Memorial Fund. Subsequently, on 18 November 1913, the name of the school was changed to the King Edward VII Medical School; in 1921, it was again changed to the King Edward VII College of Medicine to reflect its academic status. In 1928, Raffles College was established to promote arts and social sciences at the tertiary level for Malayan students. Two decades later, on 8 October 1949, Raffles College was merged with the King Edward VII College of Medicine to form the University of Malaya; the two institutions were merged to provide for the education needs of the Federation of Malaya. In 1960, the governments of the Federation of Malaya and Singapore indicated their desire to change the status of the university's divisions into that of a national university. Legislation was passed in 1961 establishing the former Kuala Lumpur division as the University of Malaya, while the Singapore division was renamed the University of Singapore on 1 January 1962. The National University of Singapore was formed by the merger of the University of Singapore and Nanyang University in 1980.
This was done in part out of a desire to pool the two institutions' resources into a single, stronger entity, and to promote English as Singapore's main language of education. The original crest of Nanyang University, with three intertwined rings, was incorporated into the new coat of arms of NUS. NUS began its entrepreneurial education endeavours in the 1980s with the setting up of the Centre for Management of Innovation and Technopreneurship in 1988; in 2001, this was renamed the NUS Entrepreneurship Centre. NUS has a semester-based modular system for conducting courses. It adopts features of the British system, such as small-group teaching; students may transfer between courses within their first two semesters, enrol in cross-faculty modules, or take up electives from different faculties. NUS has 16 faculties and schools, including a music conservatory, and has been ranked among the best in Asia by two international ranking systems, the QS World University Rankings and the Times Higher Education World University Rankings.
The QS World University Rankings 2016–17 ranked NUS 12th in the world and 1st in Asia; the Times Higher Education World University Rankings 2016–17 placed NUS 24th in the world and 1st in Asia, while its 2015–16 reputation rankings placed it 24th globally.
arXiv
In many fields of mathematics and physics, almost all scientific papers are self-archived on the arXiv repository. Begun on August 14, 1991, arXiv.org passed the half-million-article milestone on October 3, 2008; by 2014 the submission rate had grown to more than 8,000 per month. The arXiv was made possible by the compact, low-bandwidth TeX file format. Around 1990, Joanne Cohn began emailing physics preprints to colleagues as TeX files, but the number of papers being sent soon filled mailboxes to capacity. Additional modes of access were added over time: FTP in 1991 and Gopher in 1992. The term e-print was quickly adopted to describe the articles, and the site's original domain name was xxx.lanl.gov. Due to LANL's lack of interest in the rapidly expanding technology, in 1999 its founder, Paul Ginsparg, changed institutions to Cornell University, and the repository is now hosted principally by Cornell, with eight mirrors around the world. Its existence was one of the factors that led to the current movement in scientific publishing known as open access. Mathematicians and scientists regularly upload their papers to arXiv.org for worldwide access, and Ginsparg was awarded a MacArthur Fellowship in 2002 for his establishment of arXiv. The annual budget for arXiv is approximately $826,000 for 2013 to 2017, funded jointly by Cornell University Library and annual donations from member institutions; the donations were envisaged to vary in size between $2,300 and $4,000, based on each institution's usage. As of 14 January 2014, 174 institutions had pledged support for the period 2013–2017 on this basis. In September 2011, Cornell University Library took overall administrative and financial responsibility for arXiv's operation and development. Ginsparg was quoted in the Chronicle of Higher Education as saying it "was supposed to be a three-hour tour"; he remains on the arXiv Scientific Advisory Board and on the arXiv Physics Advisory Committee. The lists of moderators for many sections of the arXiv are publicly available. Additionally, an endorsement system was introduced in 2004 as part of an effort to ensure that content is relevant and of interest to current research in the specified disciplines.
Under the system, for categories that use it, an author must be endorsed by an established arXiv author before being allowed to submit papers to those categories. Endorsers are not asked to review the paper for errors. New authors from recognized academic institutions generally receive automatic endorsement, which in practice means that they do not need to deal with the endorsement system at all. However, the endorsement system has attracted criticism for allegedly restricting scientific inquiry. Grigori Perelman, who posted his proof of the Poincaré conjecture to arXiv, appears content to forgo the traditional peer-reviewed journal process, stating, "If anybody is interested in my way of solving the problem, it's all there – let them go and read about it." The arXiv generally re-classifies questionable works, e.g. into General Mathematics, rather than deleting them. Papers can be submitted in any of several formats, including LaTeX, and PDF printed from a word processor other than TeX or LaTeX. The submission is rejected by the software if generating the final PDF file fails, or if any image file is too large.
ArXiv now allows one to store and modify an incomplete submission; the time stamp on the article is set when the submission is finalized.
World Wide Web
The World Wide Web is an information space where documents and other web resources are identified by Uniform Resource Locators (URLs), interlinked by hypertext links, and accessible via the Internet. The English scientist Tim Berners-Lee invented the World Wide Web in 1989 and wrote the first web browser in 1990 while employed at CERN in Switzerland. The browser was released outside of CERN in 1991, first to research institutions starting in January 1991. The World Wide Web has been central to the development of the Information Age and is the primary tool billions of people use to interact on the Internet. Web pages are primarily text documents formatted and annotated with Hypertext Markup Language (HTML). In addition to formatted text, web pages may contain images, audio, and other media, and embedded hyperlinks permit users to navigate between web pages. Multiple web pages sharing a common theme or a common domain name make up a website. Website content can largely be provided by the publisher, or it can be interactive, where users contribute content or the content depends upon the users or their actions; websites may be mostly informative, primarily for entertainment, or largely for commercial, governmental, or non-governmental organisational purposes.
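The role URLs play in identifying resources can be illustrated with Python's standard library. The address below is that of the historic first CERN web page, used here purely as an example:

```python
from urllib.parse import urlparse

# Split a Uniform Resource Locator into the parts a browser acts on.
url = "http://info.cern.ch/hypertext/WWW/TheProject.html"
parts = urlparse(url)

print(parts.scheme)  # the protocol used to fetch the resource
print(parts.netloc)  # the host that serves it
print(parts.path)    # the document's location on that host
```

Each component answers a different question: how to talk to the server, which server to talk to, and which document to ask for.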
The World Wide Web featured in the 2006 Great British Design Quest organised by the BBC and the Design Museum. Tim Berners-Lee's vision of a global hyperlinked information system became a possibility by the second half of the 1980s. By 1985, the global Internet had begun to proliferate in Europe; in 1988 the first direct IP connection between Europe and North America was made, and Berners-Lee began to openly discuss the possibility of a web-like system at CERN. Such a system, he explained, could be referred to using one of the existing meanings of the word hypertext. At that point HTML and HTTP had already been in development for two months, and the first web server was about a month from completing its first successful test. While the read-only goal was met, accessible authorship of web content took longer to mature, arriving with the wiki concept, WebDAV, and Web 2.0. The proposal was modelled after the SGML reader Dynatext by Electronic Book Technology. A NeXT Computer was used by Berners-Lee as the world's first web server and to write the first web browser, WorldWideWeb, in 1990.
By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web, including the first web browser. The first web site, which described the project itself, was published on 20 December 1990; Paul Jones of UNC-Chapel Hill later preserved an early copy, storing it on a magneto-optical drive and on his NeXT computer. On 6 August 1991, Berners-Lee published a short summary of the World Wide Web project on the newsgroup alt.hypertext. This date is sometimes confused with the public availability of the first web servers, which had occurred months earlier. The first server outside Europe was installed at the Stanford Linear Accelerator Center in Palo Alto, though accounts differ substantially as to the date of this event.
Amazon S3
Amazon S3 is a web service offered by Amazon Web Services (AWS) that provides storage through web services interfaces. Amazon launched S3, its fifth publicly available web service, in the United States in March 2006 and in Europe in November 2007. At its inception, Amazon charged end users US$0.15 per gigabyte-month, with additional charges for bandwidth used in sending and receiving data. On November 1, 2008, pricing moved to tiers in which end users storing more than 50 terabytes receive discounted pricing. Amazon says that S3 uses the same scalable storage infrastructure that Amazon.com uses to run its own global e-commerce network, and S3 was reported to store more than 2 trillion objects as of April 2013. S3 uses include web hosting, image hosting, and storage for backup systems. S3 guarantees a 99.9% monthly uptime service-level agreement, that is, no more than about 43 minutes of downtime per month. Amazon does not make details of S3's design public, though it clearly manages data with an object storage architecture. According to Amazon, S3's design aims to provide scalability and high availability: S3 is designed to provide 99.999999999% durability and 99.99% availability of objects over a given year, though there is no service-level agreement for durability. S3 stores arbitrary objects up to 5 terabytes in size, each accompanied by up to 2 kilobytes of metadata; objects are organized into buckets and identified within each bucket by a unique, user-assigned key.
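The 99.9% monthly uptime SLA mentioned above translates into a concrete downtime budget, which a quick calculation makes explicit (assuming a 30-day month):

```python
# Downtime permitted by a 99.9% monthly uptime SLA, for a 30-day month.
minutes_per_month = 30 * 24 * 60            # 43,200 minutes in the month
allowed_downtime = minutes_per_month * 0.001  # the 0.1% that may be down
print(allowed_downtime)  # about 43 minutes
```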
Amazon Machine Images used in the Elastic Compute Cloud can be exported to S3 as bundles, and objects can be created and retrieved using either a REST-style HTTP interface or a SOAP interface. Additionally, objects can be downloaded using the HTTP GET interface, and requests are authorized using an access control list associated with each bucket and object. The Amazon AWS authentication mechanism allows the owner to create an authenticated URL with time-bounded validity; that is, someone can construct a URL that can be handed off to a third party for access for a limited time, such as the next 30 minutes. Every item in a bucket can also be served up as a BitTorrent feed: the S3 store can act as a seed host for a torrent, and any BitTorrent client can retrieve the file, which drastically reduces the costs for the download of popular objects. While the use of BitTorrent does reduce bandwidth, AWS does not provide native bandwidth limiting, and this can lead to users on the free tier of S3, or small hobby users, amassing dramatic bills. AWS representatives have stated that such a feature was on the design table from 2006 to 2010, but have more recently stated that the feature is no longer in development.
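The time-bounded authenticated URLs described above can be sketched in Python using only the standard library. This follows the shape of AWS's legacy query-string authentication scheme (Signature Version 2: an HMAC-SHA1 over the verb, expiry, and resource path); current deployments use Signature Version 4, and the bucket, key, and credentials below are placeholders, not real values:

```python
import base64
import hashlib
import hmac
import time
from urllib.parse import quote

def presigned_url(bucket, key, access_key, secret_key, expires_in=1800):
    """Build a time-limited S3 GET URL in the style of the legacy
    query-string authentication scheme (Signature Version 2)."""
    expires = int(time.time()) + expires_in
    # Verb, Content-MD5, Content-Type, Expires, then the resource path.
    string_to_sign = f"GET\n\n\n{expires}\n/{bucket}/{key}"
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(),
                      hashlib.sha1).digest()
    signature = quote(base64.b64encode(digest).decode(), safe="")
    return (f"https://{bucket}.s3.amazonaws.com/{key}"
            f"?AWSAccessKeyId={access_key}"
            f"&Expires={expires}"
            f"&Signature={signature}")

# Hypothetical bucket, key, and credentials for illustration only.
url = presigned_url("example-bucket", "photo.jpg", "AKIDEXAMPLE", "secret")
print(url)
```

Anyone holding the URL can fetch the object until the Expires timestamp passes, after which S3 rejects the request; the secret key itself never leaves the owner.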
A bucket can be configured to save HTTP log information to a sibling bucket, and Amazon S3 provides options to host static websites with index document and error document support. This support was added as a result of user requests dating to at least 2006; for example, suppose that Amazon S3 was configured with CNAME records to host http://subdomain.example.com/.
Computer science
Computer science is the study of the theory and engineering that form the basis for the design and use of computers. An alternate, more succinct definition of computer science is the study of automating algorithmic processes that scale. A computer scientist specializes in the theory of computation and the design of computational systems, and the field can be divided into a variety of theoretical and practical disciplines. Some fields, such as computational complexity theory, are highly abstract, while other fields focus on the challenges of implementing computation; human–computer interaction, for instance, considers the challenges in making computers and computations useful. The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks, such as the abacus, have existed since antiquity, and algorithms for performing computations have likewise existed since antiquity, even before the development of sophisticated computing equipment.
Wilhelm Schickard designed and constructed the first working mechanical calculator in 1623, and in 1673 Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped Reckoner. Charles Babbage, who started developing his Analytical Engine in 1834, may be considered the first computer scientist and information theorist, among other reasons because in less than two years he had sketched out many of the salient features of the modern computer. A crucial step was the adoption of a punched-card system derived from the Jacquard loom, making the machine infinitely programmable. Around 1885, Herman Hollerith invented the tabulator, which used punched cards to process statistical information; when the machine was finished, some hailed it as "Babbage's dream come true". During the 1940s, as new and more powerful computing machines were developed and it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. Computer science began to be established as an academic discipline in the 1950s.
The world's first computer science degree programme, the Cambridge Diploma in Computer Science, began at the University of Cambridge in 1953. The first computer science programme in the United States was formed at Purdue University in 1962. Since practical computers became available, many applications of computing have become distinct areas of study in their own right, and it is the now well-known IBM brand that formed part of the computer science revolution during this time. IBM released the IBM 704 and the IBM 709 computers; working with them was frustrating, since if you had misplaced as much as one letter in one instruction, the program would crash and you would have to start the whole process over again. During the late 1950s, the computer science discipline was very much in its developmental stages. Time has seen significant improvements in the usability and effectiveness of computing technology, and modern society has seen a significant shift in the users of computer technology, from usage only by experts and professionals to a near-ubiquitous user base.
Rsync
Rsync is a utility for efficiently transferring and synchronizing files across computer systems. It is commonly found on Unix-like systems and functions as both a file synchronization and file transfer program. The rsync algorithm is a type of delta encoding and is used to minimize network usage; zlib may be used for compression, and SSH or stunnel can be used for data security. Rsync is typically used for synchronizing files and directories between two different systems. For example, if the command rsync local-file user@remote-host:remote-file is run, rsync will use SSH to connect as user to remote-host. Once connected, it will invoke the remote host's rsync, and the two programs will determine what parts of the file need to be transferred over the connection. Rsync can also operate in a daemon mode, serving files in the native rsync protocol. It is released under version 3 of the GNU General Public License. Andrew Tridgell and Paul Mackerras wrote the original rsync, which was first announced on 19 June 1996.
Tridgell discusses the design and performance of rsync in chapters 3 through 5 of his 1999 Ph.D. thesis, and the program is currently maintained by Wayne Davison. Because of its flexibility and scriptability, rsync has become a standard Linux utility, and it has been ported to Windows, FreeBSD, NetBSD, OpenBSD, and macOS. Similar to rcp and scp, rsync requires the specification of a source and a destination, either of which may be remote, but not both. In the generic syntax, SRC is the file or directory to copy from, and DEST is the file or directory to copy to. Rsync can synchronize Unix clients to a central Unix server using rsync/ssh and standard Unix accounts. It can also be used locally, for example to efficiently synchronize files with a backup copy on an external hard drive, and a scheduling utility such as cron can carry out tasks such as automated encrypted rsync-based mirroring between multiple hosts and a central server. A command line to mirror FreeBSD might look like:
$ rsync -avz --delete ftp4.de.FreeBSD.org::FreeBSD/ /pub/FreeBSD/
The Apache distribution archive can be mirrored similarly:
$ rsync -avz --delete --safe-links rsync.apache.org::apache-dist /path/to/mirror
The preferred way to mirror the PuTTY website to the current directory is to use rsync, and rsync can also mimic the capabilities of Time Machine (see also tym) or make a full backup of the system root directory. An rsync process does its job by communicating with another rsync process, a sender and a receiver. At startup, the client has to connect to a peer process. If the transfer is local, the peer can be created with fork; if a remote host is involved, rsync starts a process to handle the connection, typically Secure Shell. Upon connection, a command is issued to start an rsync process on the remote host. Besides using remote shells, tunnelling can be used to have remote ports appear as local ones on the server where an rsync daemon runs. These possibilities allow security levels to be adjusted to the state of the art, while a plain rsync daemon can be enough for a local network.
GitHub
GitHub is a web-based Git version control repository and Internet hosting service. It offers all of the version control and source code management functionality of Git, as well as adding its own features, providing access control and collaboration features such as bug tracking, feature requests, and task management. GitHub offers plans for both private repositories and free accounts, which are commonly used to host open-source software projects. As of April 2016, GitHub reported having more than 14 million users and more than 85.5 million repositories. The trademark mascot of GitHub is Octocat, an anthropomorphized cat with cephalopod limbs. Development of the GitHub platform began on 1 October 2007, and the site was launched in April 2008 by Tom Preston-Werner, Chris Wanstrath, and PJ Hyett, after it had been available for a few months prior as a beta release. Projects on GitHub can be accessed and manipulated using the standard Git command-line interface, and GitHub allows registered and non-registered users to browse public repositories on the site.
Multiple desktop clients and Git plugins have been created by GitHub, and the site provides social-networking-like functions such as feeds and wikis. A user must create an account in order to contribute content to the site, but public repositories can be browsed and downloaded by anyone; with a registered account, users are able to discuss, create repositories, and submit contributions to others' repositories. The software that runs GitHub was written using Ruby on Rails and Erlang by GitHub, Inc. developers Chris Wanstrath and PJ Hyett. GitHub is mostly used for code, but it also supports emoji; GitHub Pages, through which small websites can be hosted from public repositories; nested task lists within files; visualization of geospatial data; 3D render files that can be previewed using an integrated STL file viewer, which displays the files on a 3D canvas powered by WebGL and Three.js; and previews of Photoshop's native PSD format, which can be compared to previous versions of the same file. GitHub's Terms of Service do not require public software projects hosted on GitHub to meet the Open Source Definition.
For that reason, it is advisable for users and developers intending to use a piece of software found on GitHub to read the license in the repository to determine whether it meets their needs. The Terms of Service state, "By setting your repositories to be viewed publicly, you agree to allow others to view and fork your repositories." GitHub also operates other services: a pastebin-style site called Gist for hosting code snippets, and a slide hosting service called Speaker Deck. Tom Preston-Werner presented the then-new Gist feature at a punk rock Ruby conference in 2008. Gist builds on the traditional simple concept of a pastebin by adding version control for code snippets, easy forking, and SSL encryption for private pastes. Because each gist has its own Git repository, multiple code snippets can be contained in a single paste, and forked code can be pushed back to the original author in the form of a patch, so gists can become more like mini-projects.
Search engine (computing)
A search engine is an information retrieval system designed to help find information stored on a computer system. The search results are presented in a list and are commonly called hits; search engines help to minimize the time required to find information. The most public, visible form of a search engine is a Web search engine, which searches for information on the World Wide Web. Search engines provide an interface to a group of items that enables users to specify criteria about an item of interest and have the engine find the matching items; the criteria are referred to as a search query. In the case of text search engines, the search query is typically expressed as a set of words that identify the desired concept that one or more documents may contain. There are several styles of search query syntax that vary in strictness. Some search engines apply improvements to search queries to increase the likelihood of providing a quality set of items, through a process known as query expansion.
Query understanding methods can be used to standardize query language. The list of items that meet the criteria specified by the query is typically sorted, or ranked; ranking items by relevance reduces the time required to find the desired information. Probabilistic search engines rank items based on measures of similarity, and sometimes popularity or authority, or use relevance feedback. Other types of search engines do not store an index: crawler, or spider, type search engines may collect and assess items at the time of the search query, dynamically considering additional items based on the contents of a starting item. Meta search engines store neither an index nor a cache, and instead simply reuse the index or results of one or more other search engines to provide an aggregated, final set of results.
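The core ideas above, an index mapping terms to items and a ranking that puts the most relevant hits first, can be sketched in a few lines of Python. This is a deliberately tiny model (term-overlap scoring over a toy corpus), not how a production engine ranks results:

```python
from collections import defaultdict

# Toy corpus standing in for the indexed collection of items.
documents = {
    1: "rsync efficiently transfers and synchronizes files",
    2: "a web search engine searches the world wide web",
    3: "search engines rank results by relevance",
}

# Build the inverted index: each term maps to the documents containing it.
index = defaultdict(set)
for doc_id, text in documents.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def search(query):
    """Return document ids ranked by how many query terms they contain."""
    scores = defaultdict(int)
    for term in query.lower().split():
        for doc_id in index.get(term, ()):
            scores[doc_id] += 1
    # Highest score first; ties broken by document id for determinism.
    return sorted(scores, key=lambda d: (-scores[d], d))

print(search("web search engine"))  # document 2 outranks document 3
```

Real engines replace the overlap count with weighted measures of similarity (e.g. TF-IDF) and signals of popularity or authority, exactly the refinements the paragraph above describes.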