Grading criteria for the 2nd iteration
Your main deliverable in the second iteration is the application itself. The additional deliverables will also be graded, but they exist only to support you in the process of creating the actual working application.
Application functionality. The functionality in the application covers at least one of the core use cases (peripheral use cases such as login/logout do not count as core use cases). The provided functionality is tested and works. The user can access the delivered functionality and test its behaviour with minimal additional assistance. The provided functionality is either bug-free, or any known bugs (which must not block testing of the functionality) are registered in the issue tracker. The functionality provided by the system reflects what is indicated in the release notes.
Continuous Integration (CI). There is an automated process in place that goes all the way from source code to a running application (outside the IDE). There is a trace that the CI process has been running for at least a week. When a new version of the code appears in the VCS, the new code is checked out, the application is built, and it is deployed to a server (if applicable) or made available as an executable file in a location that the customer and course organiser can access. In case of errors during the build, the CI environment reports them to the team.
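As an illustration, such a CI job usually boils down to a scripted build-test-deploy sequence that runs on every new commit. The sketch below is only a hypothetical example, not part of the grading criteria: the build script, test runner and deployment target (build.sh, pytest, the scp destination) are placeholders for whatever your project actually uses, and any CI tool that automates the same chain is acceptable.

```python
# Minimal sketch of the build-test-deploy step a CI job might run after checkout.
# All commands, paths and hosts below are hypothetical placeholders.
import subprocess
import sys

def run(step_name, command):
    """Run one pipeline step; return True on success, False on failure."""
    print(f"== {step_name} ==")
    result = subprocess.run(command, capture_output=True, text=True)
    if result.returncode != 0:
        # Surface the error so the CI environment can report it to the team
        # (e.g. via its build log, email or chat notification).
        print(f"{step_name} failed:\n{result.stdout}\n{result.stderr}", file=sys.stderr)
        return False
    return True

def main():
    steps = [
        ("Build", ["./build.sh"]),                      # compile/package the application
        ("Test", ["python", "-m", "pytest", "tests"]),  # run the automated tests
        ("Deploy", ["scp", "dist/app.jar", "deploy@example.org:/srv/app/"]),  # publish the artifact
    ]
    for name, command in steps:
        if not run(name, command):
            sys.exit(1)  # a non-zero exit makes the CI job show up as failed
    print("Build, test and deploy completed.")

if __name__ == "__main__":
    main()
```

The important property is that the whole chain runs automatically outside the IDE and that a failing step is visible to the whole team.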
Detailed requirements. At least 75% of the requirements identified and marked as in scope for the work delivered at the end of the course are detailed in the selected form (for example, as use cases). Detailed requirements are kept in individual wiki pages/documents. Detailed requirements are accessible from the overall requirements list. For any UML artifacts, you may also use an external UML tool and either add links to these artifacts in your Wiki or upload them to your Wiki.
UI wireframes/mockups. If applicable, the application UI has been prototyped using wireframes or mockups and there are links from the detailed requirements to the corresponding UI screens. In the case of a non-interactive application (such as a command-line app), there should be representative examples of "input-output" pairs (instead of UI mockups) and these examples should be linked to the requirements.
Scope. The tasks are estimated, and the estimates have been discussed with the customer. The project plan indicates which tasks will be delivered in the subsequent iterations. There is a clear picture of what functionality will be ready after iteration 3 and after iteration 4.
Release notes. Release notes for this iteration are present. Reading the release notes gives an overview of which features are supported by the released version of the system and thus serves as input for the acceptance testing by the customer and the course organiser. If the delivered functionality contains known bugs, those issues are highlighted in the release notes.
Collaboration infrastructure is in use. There are traces in the collaboration and development infrastructure of the work done. Namely:
- The version control system is used regularly and by several, preferably all, team members. Each commit is linked to a task registered in the issue tracker. The released version is tagged.
- Requirements in the wiki or other documents are updated regularly. Minutes of meetings with the customer are recorded. Changes in the requirements are recorded, if such changes have occurred.
- The issue tracking system is used regularly and kept up to date. Completed issues are marked as such. New issues are added when required and assigned to project team members. Issues are linked to a detailed requirement (whenever this link makes sense). The set of issues is complete relative to the current snapshot of the requirements. The issues/tasks are at an adequate level of granularity. All issues are assigned to a team member. All issues have a deadline.