Scrum (software development)
Scrum is an agile framework for managing knowledge work, with an emphasis on software development, although it has wide application in other fields and is increasingly being explored by traditional project teams more generally. It is designed for teams of three to nine members, who break their work into actions that can be completed within timeboxed iterations, called sprints, no longer than one month and most commonly two weeks, and who track progress and re-plan in 15-minute time-boxed stand-up meetings, called daily scrums. Approaches to coordinating the work of multiple scrum teams in larger organizations include large-scale scrum, the scaled agile framework, scrum of scrums, Scrum@Scale and Nexus, among others. Scrum is a lightweight and incremental framework for managing product development. It defines "a flexible, holistic product development strategy where a development team works as a unit to reach a common goal", challenges assumptions of the "traditional, sequential approach" to product development, and enables teams to self-organize by encouraging physical co-location or close online collaboration of all team members, as well as daily face-to-face communication among all team members and disciplines involved.
A key principle of Scrum is the dual recognition that customers will change their minds about what they want or need, and that there will be unpredictable challenges, for which a predictive or planned approach is not suited. As such, Scrum adopts an evidence-based empirical approach: accepting that the problem cannot be fully understood or defined up front, it focuses instead on maximizing the team's ability to deliver quickly, respond to emerging requirements, and adapt to evolving technologies and changes in market conditions. Scrum is sometimes seen written in all capitals, as SCRUM; because the word is not an acronym, this styling is incorrect. While the trademark on the term Scrum itself has been allowed to lapse, it is deemed to be owned by the wider community rather than an individual, so the leading capital for Scrum is retained in this article. Many of the terms used in Scrum are written with leading capitals. However, to maintain an encyclopedic tone, this article uses normal sentence case for these terms, unless they are recognized marks.
Hirotaka Takeuchi and Ikujiro Nonaka introduced the term scrum in the context of product development in their 1986 Harvard Business Review article, "The New New Product Development Game". Takeuchi and Nonaka argued in The Knowledge-Creating Company that it is a form of "organizational knowledge creation good at bringing about innovation continuously and spirally". The authors described a new approach to commercial product development that would increase speed and flexibility, based on case studies from manufacturing firms in the automotive and printer industries. They called this the holistic or rugby approach, as the whole process is performed by one cross-functional team across multiple overlapping phases, in which the team "tries to go the distance as a unit, passing the ball back and forth". In the early 1990s, Ken Schwaber used what would become Scrum at his company, Advanced Development Methods. In 1995, Jeff Sutherland and Schwaber jointly presented a paper describing the scrum framework at the Business Object Design and Implementation Workshop, held as part of Object-Oriented Programming, Systems, Languages & Applications '95 (OOPSLA '95) in Austin, Texas.
Over the following years, Schwaber and Sutherland collaborated to combine this material, with their experience and evolving good practice, into what became known as Scrum. In 2001, Schwaber worked with Mike Beedle to describe the method in the book Agile Software Development with Scrum. Scrum's approach to planning and managing product development involves bringing decision-making authority to the level of operational properties and certainties. In 2002, Schwaber with others founded the Scrum Alliance and set up the Certified Scrum accreditation series. Schwaber left the Scrum Alliance in late 2009 and founded Scrum.org, which oversees the parallel Professional Scrum accreditation series. Since 2009, a public document called The Scrum Guide has defined Scrum; it has been revised five times, with the current version dating from November 2017. In 2018, Schwaber and the Scrum.org community, along with leaders of the Kanban community, published The Kanban Guide for Scrum Teams. There are three roles in the scrum framework.
These roles are ideally co-located to ensure optimal communication among team members. Together, the three roles form the scrum team. While many organizations have other roles involved with defining and delivering the product, Scrum defines only these three. The product owner, representing the product's stakeholders and the voice of the customer, is responsible for the product backlog and accountable for maximizing the value that the team delivers. The product owner defines product items in customer-centric terms, adds them to the product backlog, and prioritizes them based on importance and dependencies. A scrum team should have only one product owner, and this role should not be combined with that of the scrum master. The product owner should focus on the business
Agile software development
Agile software development is an approach to software development under which requirements and solutions evolve through the collaborative effort of self-organizing, cross-functional teams and their customers or end users. It advocates adaptive planning, evolutionary development, early delivery and continual improvement, and it encourages rapid and flexible response to change. The term agile was popularized by the Manifesto for Agile Software Development. The values and principles espoused in this manifesto were derived from, and underpin, a broad range of software development frameworks, including Scrum and Kanban. There is significant anecdotal evidence that adopting agile practices and values improves the agility of software professionals and organizations. Iterative and incremental development methods can be traced back as early as 1957, with evolutionary project management and adaptive software development emerging in the early 1970s. During the 1990s, a number of lightweight software development methods evolved in reaction to the prevailing heavyweight methods, which critics described as overly regulated and micro-managed.
These included rapid application development, from 1991, among others. Although these methods all originated before the publication of the Agile Manifesto, they are now collectively referred to as agile software development methods. At the same time, similar changes were underway in aerospace. In 2001, seventeen software developers met at a resort in Snowbird, Utah, to discuss these lightweight development methods; among them were Kent Beck, Ward Cunningham, Dave Thomas, Jeff Sutherland, Ken Schwaber, Jim Highsmith, Alistair Cockburn and Robert C. Martin. Together they published the Manifesto for Agile Software Development. In 2005, a group headed by Cockburn and Highsmith wrote an addendum of project management principles, the PM Declaration of Interdependence, to guide software project management according to agile software development methods. In 2009, a group working with Martin wrote an extension of software development principles, the Software Craftsmanship Manifesto, to guide agile software development according to professional conduct and mastery.
In 2011, the Agile Alliance created the Guide to Agile Practices, an evolving open-source compendium of the working definitions of agile practices and elements, along with interpretations and experience guidelines from the worldwide community of agile practitioners. Based on their combined experience of developing software and helping others do so, the seventeen signatories to the manifesto proclaimed that they value:

Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan

That is to say, the items on the left are valued more than the items on the right. As Scott Ambler elucidated: tools and processes are important, but it is more important to have competent people working together effectively. Good documentation is useful in helping people to understand how the software is built and how to use it, but the main point of development is to create software, not documentation.
A contract is important, but it is no substitute for working with customers to discover what they need. A project plan is important, but it must not be so rigid that it cannot accommodate changes in technology or the environment, stakeholders' priorities, and people's understanding of the problem and its solution. Some of the authors formed the Agile Alliance, a non-profit organization that promotes software development according to the manifesto's values and principles. Introducing the manifesto on behalf of the Agile Alliance, Jim Highsmith said: The Agile movement is not anti-methodology; in fact, many of us want to restore credibility to the word methodology. We want to restore a balance. We embrace modeling, but not in order to file some diagram in a dusty corporate repository. We embrace documentation, but not hundreds of pages of rarely-used tomes. We recognize the limits of planning in a turbulent environment. Those who would brand proponents of XP or SCRUM or any of the other Agile Methodologies as "hackers" are ignorant of both the methodologies and the original definition of the term hacker.
The Manifesto for Agile Software Development is based on twelve principles:

Customer satisfaction by early and continuous delivery of valuable software
Welcome changing requirements, even in late development
Deliver working software frequently
Close, daily cooperation between business people and developers
Projects are built around motivated individuals, who should be trusted
Face-to-face conversation is the best form of communication
Working software is the primary measure of progress
Sustainable development, able to maintain a constant pace
Continuous attention to technical excellence and good design
Simplicity, the art of maximizing the amount of work not done, is essential
The best architectures and designs emerge from self-organizing teams
Regularly, the team reflects on how to become more effective and adjusts accordingly

Most agile development methods break product development work into small increments that minimize the amount of up-front planning and design. Iterations, or sprints, are short time frames that last from one to four weeks.
Each iteration involves a cross-functional team working in all functions: pl
An entity–relationship model (ER model) describes interrelated things of interest in a specific domain of knowledge. A basic ER model is composed of entity types, and specifies relationships that can exist between entities (instances of those entity types). In software engineering, an ER model is formed to represent the things a business needs to remember in order to perform business processes. The ER model then becomes an abstract data model that defines a data or information structure which can be implemented in a database, typically a relational database. Entity–relationship modeling was developed for database design by Peter Chen and published in a 1976 paper; however, variants of the idea existed previously. Some ER models show super and subtype entities connected by generalization-specialization relationships, and an ER model can be used in the specification of domain-specific ontologies. An entity–relationship model is the result of systematic analysis to define and describe what is important to processes in an area of a business; it does not define the business processes themselves.
It is typically drawn in a graphical form as boxes (entities) that are connected by lines (relationships) which express the associations and dependencies between entities. An ER model can also be expressed in a verbal form, for example: one building may be divided into zero or more apartments, but one apartment can only be located in one building. Entities may be characterized not only by relationships, but also by additional properties, which include identifiers called "primary keys". Diagrams created to represent attributes as well as entities and relationships may be called entity-attribute-relationship diagrams, rather than entity–relationship models. An ER model is usually implemented as a database. In a simple relational database implementation, each row of a table represents one instance of an entity type, and each field in a table represents an attribute type. In a relational database, a relationship between entities is implemented by storing the primary key of one entity as a pointer or "foreign key" in the table of another entity. There is a tradition for ER/data models to be built at two or three levels of abstraction.
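The building/apartment relationship described above can be sketched as a minimal relational implementation, here using Python's built-in sqlite3 module; the table and column names are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite leaves FK enforcement off by default

# Each table corresponds to an entity type; each column to an attribute.
conn.execute("""CREATE TABLE building (
    building_id INTEGER PRIMARY KEY,
    address     TEXT NOT NULL)""")

# One building contains zero or more apartments, but each apartment
# belongs to exactly one building: the relationship is stored as a
# foreign key on the "many" side, pointing at the "one" side.
conn.execute("""CREATE TABLE apartment (
    apartment_id INTEGER PRIMARY KEY,
    unit_number  TEXT NOT NULL,
    building_id  INTEGER NOT NULL REFERENCES building(building_id))""")

conn.execute("INSERT INTO building VALUES (1, '10 Main St')")
conn.executemany("INSERT INTO apartment VALUES (?, ?, ?)",
                 [(1, "1A", 1), (2, "1B", 1)])

# Traversing the relationship: all apartments in a given building.
units = [row[0] for row in conn.execute(
    """SELECT a.unit_number FROM apartment a
       JOIN building b ON a.building_id = b.building_id
       WHERE b.address = '10 Main St'
       ORDER BY a.unit_number""")]
print(units)  # ['1A', '1B']
```

With foreign keys enabled, inserting an apartment that references a non-existent building raises an integrity error, which is how the database enforces the "exactly one building" half of the verbal rule.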
Note that the conceptual-logical-physical hierarchy below is used in other kinds of specification, and is different from the three-schema approach to software engineering.

Conceptual data model: This is the highest-level ER model, in that it contains the least granular detail but establishes the overall scope of what is to be included within the model set. The conceptual ER model defines master reference data entities that are commonly used by the organization. Developing an enterprise-wide conceptual ER model is useful for documenting the data architecture of an organization. A conceptual ER model may be used as the foundation for one or more logical data models; the purpose of the conceptual ER model is then to establish structural metadata commonality for the master data entities between the set of logical ER models. The conceptual data model may also be used to form commonality relationships between ER models as a basis for data model integration.

Logical data model: A logical ER model does not require a conceptual ER model, especially if the scope of the logical ER model includes only the development of a distinct information system.
The logical ER model contains more detail than the conceptual ER model. In addition to master data entities, operational and transactional data entities are now defined; the details of each data entity are developed, and the relationships between these data entities are established. The logical ER model is, however, developed independently of the specific database management system into which it can be implemented.

Physical data model: One or more physical ER models may be developed from each logical ER model. The physical ER model is developed to be instantiated as a database; therefore, each physical ER model must contain enough detail to produce a database, and each physical ER model is technology dependent, since each database management system is somewhat different. The physical model is instantiated in the structural metadata of the database management system as relational database objects such as database tables, database indexes (such as unique key indexes) and database constraints (such as a foreign key constraint or a commonality constraint). The ER model is normally then used to design modifications to the relational database objects and to maintain the structural metadata of the database.
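The physical-model objects named above (tables, unique key indexes, foreign key constraints) can be illustrated as DDL. The sketch below uses SQLite via Python's sqlite3; the entity names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enable FK constraint enforcement

# Table: one physical structure per entity type in the model.
conn.execute("""CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    email       TEXT NOT NULL)""")

# Unique key index: structural metadata enforcing an alternate key.
conn.execute("CREATE UNIQUE INDEX ux_customer_email ON customer(email)")

# Foreign key constraint: the physical form of a relationship.
conn.execute("""CREATE TABLE sale (
    sale_id     INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL,
    FOREIGN KEY (customer_id) REFERENCES customer(customer_id))""")

conn.execute("INSERT INTO customer VALUES (1, 'a@example.com')")

errors = []
try:
    conn.execute("INSERT INTO customer VALUES (2, 'a@example.com')")  # duplicate key
except sqlite3.IntegrityError:
    errors.append("unique index enforced")
try:
    conn.execute("INSERT INTO sale VALUES (1, 99)")  # no such customer
except sqlite3.IntegrityError:
    errors.append("foreign key enforced")
print(errors)
```

Both inserts are rejected by the database engine itself, which is the sense in which the physical model's constraints live in the DBMS's structural metadata rather than in application code.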
The first stage of information system design uses these models during the requirements analysis to describe information needs, or the type of information to be stored in a database. The data modeling technique can be used to describe any ontology for a certain area of interest. In the case of the design of an information system based on a database, the conceptual data model is, at a later stage, mapped to a logical data model, such as the relational model; this in turn is mapped to a physical model during physical design. Note that sometimes, both of these phases are referred to as "physical design." An entity may be defined as a thing capable of an independent existence that can be uniquely identified. An entity is an abstraction from the complexities of a domain; when we speak of an entity, we speak of some aspect of the real world that can be distinguished from other aspects of the real world. An entity is a thing that exists either physically or logically. An entity may be a physical object such as a house or a car, an event
In software engineering, structured analysis and structured design are methods for analyzing business requirements and developing specifications for converting practices into computer programs, hardware configurations, and related manual procedures. Structured analysis and design techniques are fundamental tools of systems analysis; they developed from classical systems analysis of the 1970s. Structured analysis is still in use today. Structured analysis consists of interpreting the system concept into data and control terminology represented by data flow diagrams. The flow of data and control from bubble, to data store, to bubble can be difficult to track, and the number of bubbles can increase. One approach is to first define events from the outside world that require the system to react, then assign a bubble to each event. Bubbles that need to interact are then connected until the system is defined. Bubbles are grouped into higher-level bubbles to decrease complexity. Data dictionaries are needed to describe the data and command flows, and a process specification is needed to capture the transaction/transformation information.
Structured analysis and structured design are displayed with structure charts, data flow diagrams and data model diagrams, of which there were many variations, including those developed by Tom DeMarco, Ken Orr, Larry Constantine, Vaughn Frick, Ed Yourdon, Steven Ward, Peter Chen and others. These techniques were combined in various published system development methodologies, including the structured systems analysis and design method (SSADM), profitable information by design, Nastec structured analysis & design, SDM/70 and the Spectrum structured system development methodology. Structured analysis is part of a series of structured methods that "represent a collection of analysis and programming techniques that were developed in response to the problems facing the software world from the 1960s to the 1980s. In this timeframe most commercial programming was done in Cobol and Fortran, then C and BASIC. There was little guidance on 'good' design and programming techniques, and there were no standard techniques for documenting requirements and designs.
Systems were getting larger and more complex, and information system development became harder and harder to do." As a way to help manage large and complex software, the following structured methods emerged from the end of the 1960s onward:

Structured programming, circa 1967, with Edsger Dijkstra ("Go To Statement Considered Harmful") and Niklaus Wirth
Stepwise design, 1971
Nassi–Shneiderman diagram, 1972
Warnier/Orr diagram, 1974 ("Logical Construction of Programs")
HIPO (IBM hierarchy input-process-output), 1974
Structured design, around 1975, with Larry Constantine, Ed Yourdon and Wayne Stevens
Jackson structured programming, circa 1975, developed by Michael A. Jackson
Structured analysis, circa 1978, with Tom DeMarco, Gane & Sarson, and McMenamin & Palmer
Structured analysis and design technique (SADT), developed by Douglas T. Ross
Yourdon structured method, developed by Edward Yourdon
Structured analysis and system specification, published in 1979 by Tom DeMarco
Structured systems analysis and design method (SSADM), first presented in 1983, developed by the UK Office of Government Commerce
IDEF0, based on SADT, developed by Douglas T. Ross in 1985
Hatley-Pirbhai modeling, defined in "Strategies for Real-Time System Specification" by Derek J. Hatley and Imtiaz A. Pirbhai in 1988
Information engineering, popularised by James Martin

According to Hay, "information engineering was a logical extension of the structured techniques that were developed during the 1970s. Structured programming led to structured design; these techniques were characterized by their use of diagrams: structure charts for structured design, data flow diagrams for structured analysis, both to aid in communication between users and developers, and to improve the analyst's and the designer's discipline. During the 1980s, tools began to appear which both automated the drawing of the diagrams and kept track of the things drawn in a data dictionary". After the example of computer-aided design and computer-aided manufacturing, the use of these tools was named computer-aided software engineering (CASE). Structured analysis creates a hierarchy employing a single abstraction mechanism.
The structured analysis method can employ IDEF; it is process driven, and starts with a purpose and a viewpoint. The method identifies the overall function and iteratively divides functions into smaller functions, preserving the inputs, outputs and mechanisms necessary to optimize processes. Known as a functional decomposition approach, it focuses on cohesion within functions and coupling between functions, leading to structured data. The functional decomposition of the structured method describes the process without delineating system behavior, and dictates system structure in the form of required functions. The method identifies outputs as related to the activities. One reason for the popularity of structured analysis is its intuitive ability to communicate high-level processes and concepts, whether at the single-system or enterprise level. Discovering how objects might support functions for commercially prevalent object-oriented development is unclear. In contrast to IDEF, UML is interface driven, with multiple abstraction mechanisms useful in describing service-oriented architectures.
Structured analysis views a system from the perspective of the data flowing through it. The function of the system is described by processes. Structured analysis takes ad
A trademark (also trade mark or trade-mark) is a recognizable sign, design, or expression which identifies products or services of a particular source from those of others; trademarks used to identify services are usually called service marks. The trademark owner can be an individual, a business organization, or any legal entity. A trademark may be located on a label, a voucher, or on the product itself. For the sake of corporate identity, trademarks are often displayed on company buildings. The first legislative act concerning trademarks was passed in 1266 under the reign of Henry III, requiring all bakers to use a distinctive mark for the bread they sold. The first modern trademark laws emerged in the late 19th century. In France, the first comprehensive trademark system in the world was passed into law in 1857. The Trade Marks Act 1938 of the United Kingdom changed the system, permitting registration based on "intent-to-use", creating an examination-based process, and creating an application publication system. The 1938 Act, which served as a model for similar legislation elsewhere, contained other novel concepts such as "associated trademarks", a consent-to-use system, a defensive mark system, and a non-claiming right system.
The symbols ™ and ® can be used to indicate trademarks. A trademark identifies the brand owner of a particular product or service. Trademarks can be used by others under licensing agreements; the unauthorized usage of trademarks by producing and trading counterfeit consumer goods is known as brand piracy. The owner of a trademark may pursue legal action against trademark infringement. Most countries require formal registration of a trademark as a precondition for pursuing this type of action. The United States and other countries also recognize common law trademark rights, which means action can be taken to protect an unregistered trademark if it is in use. Still, common law trademarks offer the holder, in general, less legal protection than registered trademarks. A trademark may be designated by the following symbols: ™, ℠ and ®. A trademark is a name, phrase, symbol, image, or a combination of these elements. There is also a range of non-conventional trademarks comprising marks which do not fall into these standard categories, such as those based on colour, smell, or sound.
Trademarks which are considered offensive are often rejected according to a nation's trademark law. The term trademark is also used informally to refer to any distinguishing attribute by which an individual is identified, such as the well-known characteristics of celebrities. When a trademark is used in relation to services rather than products, it may sometimes be called a service mark, particularly in the United States. The essential function of a trademark is to identify the commercial source or origin of products or services, so a trademark, properly called, indicates source or serves as a badge of origin. In other words, trademarks serve to identify a particular business as the source of goods or services; the use of a trademark in this way is known as trademark use. Certain exclusive rights attach to a registered mark. Trademark rights generally arise out of the use of, or the maintenance of exclusive rights over, a sign in relation to certain products or services, assuming there are no other trademark objections. Different goods and services have been classified by the International Classification of Goods and Services into 45 Trademark Classes.
The idea behind this system is to specify and limit the extension of the intellectual property right by determining which goods or services are covered by the mark, and to unify classification systems around the world. In trademark treatises it is reported that blacksmiths who made swords in the Roman Empire are thought of as the first users of trademarks. Other notable trademarks that have been used for a long time include Löwenbräu, which claims use of its lion mark since 1383. The first trademark legislation was passed by the Parliament of England under the reign of King Henry III in 1266, requiring all bakers to use a distinctive mark for the bread they sold. The first modern trademark laws emerged in the late 19th century. In France, the first comprehensive trademark system in the world was passed into law in 1857 with the "Manufacture and Goods Mark Act". In Britain, the Merchandise Marks Act 1862 made it a criminal offence to imitate another's trade mark 'with intent to defraud or to enable another to defraud'.
In 1875, the Trade Marks Registration Act was passed, which allowed formal registration of trade marks at the UK Patent Office for the first time. Registration was considered to comprise prima facie evidence of ownership of a trade mark, and registration of marks began on 1 January 1876. The 1875 Act defined a registrable trade mark as 'a device, or mark, or name of an individual or firm printed in some particular and distinctive manner'. In the United States, Congress first atte
Office of Government Commerce
The Office of Government Commerce (OGC) was a UK government office established as part of HM Treasury in 2000. It was moved into the Efficiency and Reform Group of the Cabinet Office in 2010, before being closed in 2011. The OGC operated through the Government Procurement Service, an executive agency now known as the Crown Commercial Service. The purpose of the OGC was to support the procurement and acquisition process of public sector organisations in the UK through policy and process guidance and the negotiation of overarching service and provision frameworks. This was intended to improve value for money for the taxpayer, optimising the level of taxpayers' equity directed towards the delivery of services. Similar organisations can be found in most western European countries, for instance Hansel Ltd in Finland and Consip in Italy. The OGC supported initiatives to encourage better supplier relations, sustainable procurement, the benefits of utilising smaller suppliers and the potential of eProcurement.
Representing the UK at the European Union, the organisation assisted the public sector in applying EU procurement rules within the United Kingdom. The OGC was a member of Procurement G6, an informal group leading the use of framework agreements and e-procurement instruments in public procurement. The organisation also acted as sponsor for best practice in project, programme and service management, including:

Managing Successful Programmes
Projects in Controlled Environments
Management of Risk
Portfolio Management
Value Management
Information Technology Infrastructure Library
Portfolio, Programme and Project Offices (P3O)

These areas of best practice are now owned jointly by the UK government and Capita, and are managed by Axelos. On 24 April 2008 it was reported in the Daily Telegraph that a new logo for the OGC had been introduced at a cost of £14,000; the logo caused embarrassment.
Edward Nash Yourdon was an American software engineer, computer consultant, lecturer, and software engineering methodology pioneer. He was one of the lead developers of the structured analysis techniques of the 1970s, and a co-developer of both the Yourdon/Whitehead method for object-oriented analysis/design in the late 1980s and the Coad/Yourdon methodology for object-oriented analysis/design in the 1990s. Yourdon obtained his B.S. in applied mathematics from the Massachusetts Institute of Technology (MIT) in 1965, and did graduate work in electrical engineering and computer science at MIT and the Polytechnic Institute of New York. In 1964 Yourdon started working at Digital Equipment Corporation, developing FORTRAN programs for the PDP-5 minicomputer and assembler for the PDP-8. After working at a small consulting firm and as an independent consultant in the 1960s and early 1970s, in 1974 Yourdon founded his own consulting firm, YOURDON Inc., to provide educational and consulting services. After he sold this firm in 1986, he served on the boards of multiple IT consultancy corporations, and was an advisor on several research projects in the software industry throughout the 1990s.
In June 1997, Yourdon was inducted into the Computer Hall of Fame, along with such notables as Charles Babbage, James Martin, Grace Hopper and Gerald Weinberg. In December 1999, Crosstalk: The Journal of Defense Software Engineering named him one of the ten most influential people in the software field. In the late 1990s, Yourdon became the center of controversy over his belief that Y2K-related computer problems could result in severe software failures that would culminate in widespread social collapse. Due to the efforts of thousands of dedicated technologists and project managers, many of the potential critical system failure points that Yourdon and others identified were remediated early enough to make a difference. In the new millennium, Yourdon became a Faculty Fellow at the Information Systems Research Center of the University of North Texas, as well as a Fellow of the Business Technology Trends Council for the Cutter Consortium, where he was editor of the Cutter IT Journal.
After developing the structured analysis techniques of the 1970s and object-oriented analysis/design in the late 1980s and 1990s, in the new millennium Yourdon specialized in project management, software engineering methodologies and Web 2.0 development. He founded and published American Programmer magazine, and is the author of the book Decline and Fall of the American Programmer. In 1974, Yourdon founded the consulting firm Yourdon Inc. in New York, which provided consulting and publishing in the field of software engineering. In the early 1980s, the company had multiple offices in North America and Europe and a staff of 150 people; it trained over 250,000 people in the topics of structured programming, structured design, structured analysis, logical data modeling and project management. In 1986, Yourdon sold the consulting company, which became part of the Canadian software company CGI Informatique. The publishing division had published over 150 books on software engineering topics before it became part of Prentice Hall.
In the 1980s, Yourdon developed the Yourdon structured method (YSM), an approach to structured systems analysis and design based on functional structuring. The method supports two distinct phases, analysis and design. YSM proceeds through discrete steps, beginning with a feasibility study, and offers a series of models:

The behavioral model describes the required behavior of the system.
The processor environment model describes the allocation of computing functions to processor hardware.
The software environment model defines the software architecture within each processor.
The code organizational model shows the modular structure of each task.

The Yourdon structured method and the structured analysis and design technique are examples of structured design methods. During the late 1990s, he was one of the leading proponents of the theory that the 'Y2K bug' could lead to a collapse of civilization, or at least a protracted economic depression and technological breakdown on a wide scale; he wrote several books on the subject, including Time Bomb 2000, and produced at least one video putting forth that theory.
Yourdon was criticized when his predictions failed to materialize in any form, a blunder that may have cost him credibility with many in the software industry. In his final years, Yourdon served as an internationally recognized expert witness and computer consultant specializing in project management, software engineering methodologies and Web 2.0 development. He died in January 2016 as a result of a post-surgical blood infection. Yourdon was married to Toni Nash; he had three children, four grandchildren and five sisters. Yourdon was an avid photographer whose photos were published in The New York Times, the Los Angeles Times, The Wall Street Journal, Fast Company, Time/CNN, the New York Observer, New York magazine and the Huffington Post. Yourdon authored over 550 technical articles and authored or coauthored 26 computer books from 1967 onward. A selection:

1967. Real-Time Systems Design. Information & Systems Press.
1972. Design of On-Line Computer Systems. Prentice Hall.
Yourdon, Edward. St