Milestone: M1 Development Iteration
Milestone M1 Checklist
☐ Demo video
☐ Demo video report, including a who-did-what section
☐ Version of your code from the demo video in GitHub
☐ Up-to-date versions of your user stories, design artifacts, and planning artifacts
☐ Task report
☐ Customer sign-off
☐ GitHub release (URL submitted to eCourseware dropbox)
☐ Teammate evaluations
There are three main deliverables for Milestone M1: a demo video, a collection of project artifacts, and a live in-class demo session.
1. Demo Video and Video Report
Your team is responsible for creating a demo video of your software. This video mainly assists the course instructors in grading your progress on the project. The video must be accompanied by a video report that includes a who-did-what section listing who built each of the demoed features. The demo video and report must meet the following grading criteria:
- Criterion: Demonstrate the progress that the team has made so far.
- All the new features. Include all the latest features in the demo. Don’t leave any out. A big point of this exercise is to demonstrate all the wonderful progress that the team has made. Note that this criterion does not mean that you should skip re-demoing old features. It just means you shouldn’t skip the new ones. However, you should not include old work in the who-did-what report.
- Backend too. Although UI features are a high priority, you may also demonstrate that backend functionality is working, even if it’s not yet connected to the frontend. The key thing is to prove that the code runs and works! Along those lines, you may demo automated tests.
- Criterion: Display the team’s work in the best possible light.
- Story form. Any demonstration of UI must take place in the context of a cohesive story. That is, the presenter must describe one or more characters (with names, like Alice and/or Bob) and relate a story about the characters using the software. The presenter must stick to this story format. The story and accompanying demo must be well thought out and must not leave the audience with the impression that the presenter is making it up as they go along. Use realistic names for things, not made-up placeholders like “foo” and “slafjsd”.
- UI first. Since the UI is generally most interesting, you should lead with that.
- General audience. Don’t forget that not everyone is as familiar with your project as you are. To be on the safe side, explain it as if you are talking to someone who has never seen it before.
- No special effects or fancy editing. The video should clearly show a user (or users) interacting with your web app. Don’t add special effects or sound effects, which distract or detract from the authenticity of the interaction.
- Criterion: Length and format constraints.
- Time limit. The video must be no more than 10 minutes long.
- Fill the time. Your video should be at least 6 minutes; otherwise, you’re probably doing something wrong.
- Video format. The video must be shared via YouTube.
- Criterion: Make clear who contributed what to the project in the video report.
- New work only. Although you may demo features from previous iterations in the video, the who-did-what report should only mention features/work that are new to the latest milestone submission.
- Use the template. Your team’s video report must follow the Markdown template given below. In the video report, give an entry for each new/updated feature demoed. Each entry must include things like the time offset in the video where the feature was demoed, the name of the team member who created/updated the feature, a brief description of the feature, and the URL(s) of the pull request(s) in which the work was submitted.
# Video Report
- Project: (Nickname of Project)
- Milestone: (Milestone ID; e.g., M1)
- Demo Video: (URL for demo video; e.g., <https://youtu.be/xxx>)
## Who Did What
| Time Offset | Contributor | Contribution | Pull Request |
| ----------- | ----------- | ------------ | ------------ |
| (Offset into video; e.g., hh:mm:ss; sort rows sequentially) | (Team Member Name) | (Brief description of work on display) | (PR URL(s) for the submitted work; e.g., <http://xxx>; comma-separated list if multiple pull requests) |
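As an illustration, here is a hypothetical filled-in video report following the template above; the project name, team members, time offsets, and URLs are all invented:

```markdown
# Video Report

- Project: RecipeBox
- Milestone: M1
- Demo Video: <https://youtu.be/xxx>

## Who Did What

| Time Offset | Contributor | Contribution | Pull Request |
| ----------- | ----------- | ------------ | ------------ |
| 00:01:30 | Alice Example | Search recipes by ingredient | <https://github.com/example/recipebox/pull/12> |
| 00:04:10 | Bob Example | Backend API for saved recipes, demoed via automated tests | <https://github.com/example/recipebox/pull/15> |
```

Note that each row covers one new/updated feature and that the rows are sorted by time offset.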
Note that the creators of the demo video and accompanying video report are eligible for A&B points.
2. Project Artifacts
For Milestone M1, you will submit the following artifacts:
- a copy of your code (as a release in GitHub), and
- up-to-date versions of your design and planning artifacts (i.e., USs, sitemap, UI sketches, and model-class diagram).
The artifacts should satisfy the following grading criteria:
- Release in GitHub. To grade your code and other artifacts, I will download your milestone GitHub release.
- Code builds and runs. I should be able to build and run your code using the usual approach from the Boot Camp projects. If any special instructions are required to build/run your software, include them in the README file in your project’s top-level directory.
- Replicable demo. I should be able to replicate your demo video. If seed data is required to do so, you must somehow make that data available to me (possibly by giving instructions in the README).
- Artifact quality. All your artifacts must be of high quality. The criteria from Milestone M0 still apply, with the following addition.
- Code quality. Your code must follow common style guidelines and be well organized and readable. For example, all code must be properly indented, and class/variable/method names must be sensible. You should also do your utmost to avoid bugs and other sloppiness.
- Customer satisfaction. Your customer will provide feedback on how well your team has satisfied the requirements they gave you and how well aligned your team’s prioritization of the work has been with the customer’s priorities.
Note that there is an A&B eligible role (Quality Assurance Czar) with special responsibilities regarding milestone artifact quality.
3. Live In-Class Demo Session
For this session, each team will operate a demo booth. One member of your team (the “demo-booth operator”) must run the booth, providing visitors with an interactive demo of your team’s software. The remaining members of your team will circulate about the other booths, acting as visitors. The interactive demo must meet the following grading criteria:
- Clearly explain your project to visitors. Assume that visitors have never seen your project before. Thoroughly and clearly explain what problem your project solves and how it does so.
- Display the team’s work in the best possible light. Use presentation techniques discussed this semester to present your team’s software in an engaging and compelling way. Also, think about the best way to set up your booth. What equipment will you need? Extra monitors?
- Allow visitors to use your project. This is an interactive demo, which means that visitors should be allowed to try out your project if time allows.
- Time limits.
  - Don’t go too long. The demo must be no more than 8 minutes long.
  - Fill the time. Keep your visitors engaged throughout the 8 minutes.
Note that demo-booth operator is an A&B eligible role.
4. Project Workflow, Task Planning, and Task Reporting
All work contributed to the project must follow the process described in the Project Workflow Instructions document. This process includes full task planning and outcome reporting.
5. Submitting the Milestone
5.1. Customer Sign-Off
Before your milestone submission will be considered complete, your customer must sign off on it, as per the form below. (Note that I will contact the customer directly to collect their sign-off, so you need only obtain their verbal approval.)
5.2. GitHub Release
Once all team members’ tasks have been completed and their pull requests have been merged into the master branch, your team must create a release for the milestone:

- Set the Tag version to `m1v1`. If changes (e.g., bug fixes) are made to the release after it is created, you can create a new release that includes the changes; just be sure to increment the version (e.g., `m1v2`).
- Set the Release title to `Milestone M1, Version 1` (replacing `Version 1` with the appropriate version of the release).
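The tagging convention above can be sketched locally with plain git. This is only an illustration in a throwaway repository (the commit messages and identity are invented); the actual milestone release is created on GitHub’s Releases page:

```shell
# Sketch, assuming only that git is installed; uses a throwaway local repo.
# The tag names follow the m1v1 / m1v2 convention from the instructions.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.name "Demo"
git config user.email "demo@example.com"

git commit -q --allow-empty -m "milestone work"
git tag -a m1v1 -m "Milestone M1, Version 1"

# If a bug fix lands after the release, create a NEW versioned tag rather
# than moving the old one:
git commit -q --allow-empty -m "bug fix"
git tag -a m1v2 -m "Milestone M1, Version 2"

git tag --list 'm1v*'   # prints: m1v1  m1v2
```

Keeping every release version as its own tag means the graded release always points at an exact, unchanging snapshot of the code.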
As the last step, your team must submit the URL of the release page to the appropriate eCourseware dropbox. Only one team member needs to perform this step. If you need to correct a release, don’t forget to resubmit the URL as well to reflect the correct version.
6. Teammate Evaluations
At the end of each iteration, each team member must provide an evaluation of each other team member. Instructions and forms for performing these teammate evaluations will be communicated by email near the end of the iteration.
Milestone M1 Customer Sign-Off Form
Customers: Please indicate your approval of the following items—but ONLY if you agree 100% with the statement for the item.
If you have ANY disagreement, do not give your approval. Instead, provide the team with feedback, and have them resolve whatever issue is preventing your approval.
☐ New! I have reviewed a demo of the software, and I have provided the team with any feedback I had.
☐ I have reviewed the user stories, and they are up to date and consistent with my wishes.
☐ I have reviewed any new user-interface designs, and I approve of them.
Grading Rubric
Below are the grading items for this milestone, along with their point values and weights. If an item is not submitted at all, 0 points will be awarded for that item. The top-level bullets specify grading criteria; the sub-bullets indicate standard deductions for errors in a submitted item. The deduction list may not be complete, since there may be mistakes we did not anticipate; the deduction for an unexpected mistake will be assessed when it is discovered and will reflect how severe the instructor judges it to be. If the deductions for a grading item total more than the points available for that item, 0 points will be awarded for the item.
Progress on Product
20 points with weight of 30%
- Made inadequate progress given the number of team members and time in the iteration
- -2 to -20 depending on severity
Communication
Overall weight of 40%
Demo Video
20 points with weight of 20%
- Fails to provide a gentle introduction that addresses a general audience
- -2 deduction
- Fails to tell a user-task oriented cohesive story
- -3 deduction
- Fails to use realistic data
- -1 to -3 depending on severity
- Fails to demo key features that arguably should have been demoed
- -1 for each missing feature up to -5
- Fails to differentiate old and new features
- -2 deduction
- Bad demo length
- -1 per minute too short up to -4
- -1 per minute too long up to -4
- Bugs apparent
- -1 per bug up to -3
- Problems with the video report document
- -5 missing document
- -3 fails to follow Markdown template
Interactive Demo Presentation
20 points with weight of 20%
- Does not tell a user-task oriented cohesive story
- -3 deduction
- Does not use realistic data
- -1 to -3 depending on severity
- Does not provide a gentle introduction that addresses a general audience
- -2 deduction
- Bad demo length
- -1 per minute too short up to -4
- -1 per minute too long up to -4
- Bugs apparent
- -1 per bug up to -3
Process
Overall weight of 30%
Task Reporting
20 points with weight of 5%
- Failed to document outcomes of tasks
- -10 deduction
- Failed to follow template
- -2 per mistake up to -6
Release
20 points with weight of 5%
- Failed to upload correct release URL to eCourseware on time
- -8 deduction
User Stories
20 points with weight of 10%
- See Milestone M0 rubric
- Not up to date
- -6 for not keeping statuses up to date
- -2 per clearly missing user story (e.g., for features already implemented)
Model Class Diagram
20 points with weight of 10%
- See Milestone M0 rubric
- Not up to date
- -10 Not updated to be consistent with model implementation