1 Introduction
In this coursework, you will:
- design and implement automated tests for the program
- design and carry out a performance experiment to assess the performance of this program
- refactor this program
- use revision control to manage your source code
The program is available in two languages:
- C
- Python
You must select one of these languages for your assessment.
The assessment must be done individually.
1.1 Questions about the assessment
2 Source code
3 Assessment
1. Refactor the source code to improve its structure, design, readability, and performance. This must be driven by a set of automated tests.
2. Design and carry out a performance experiment on the source code.
You can develop your code and tests on whatever platform and under whatever environment you want but subject to the following important constraints:
1. You must create and use a Git source code repository, and a clone of this must be maintained in your user space on the University of Edinburgh GitLab service. Please ensure that your repository is private.
3. A minimal test is included with this program, and you must not change the commands given to invoke this test.
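The refactoring must be driven by automated tests. As a minimal sketch of what such a test might look like, assuming a Python submission and using the standard-library unittest module (the function `average` here is a hypothetical placeholder, not part of the provided program):

```python
import unittest

# Hypothetical function under test; in practice you would import this
# from the program you are refactoring rather than defining it here.
def average(values):
    """Return the arithmetic mean of a non-empty sequence of numbers."""
    if not values:
        raise ValueError("average() requires at least one value")
    return sum(values) / len(values)

class TestAverage(unittest.TestCase):
    def test_typical_input(self):
        # Valid input must keep producing the same output after refactoring.
        self.assertAlmostEqual(average([1.0, 2.0, 3.0]), 2.0)

    def test_invalid_input_raises(self):
        # Output for invalid input may change, but should still be handled.
        with self.assertRaises(ValueError):
            average([])
```

Tests of this shape can be run with `python -m unittest`, and re-run after every refactoring step to confirm behaviour is preserved.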
3.1 Code refactoring
- The goal of the coursework is to evaluate your use of revision control, refactoring, profiling, and testing. It is not to evaluate your knowledge of the intricacies of Python or C.
- Your submission must be based on one language only. If you submit an attempt containing multiple languages, only one of these will be marked.
- The markers are not looking for specifically Python-related or C-related refactorings. There are many refactorings that can be done that are not language specific.
- For all valid input, the program should produce the same output as before. You may however change the output produced for any invalid input.
- No parallel programming is expected or required, and the resulting refactored code could still be serial.
- For the C implementation, a Makefile is provided. There is no requirement for you to expand or change this, although you may wish to do so if it makes your development easier. Such changes will not be marked, although you must ensure any commands given in the README.md still work as expected.
- Changes to the README.md will not be marked explicitly. However, if you require the marker to install and/or use additional packages, then these need to be documented. This falls under the remit of the “Operation of refactored codes and tests” component of the marking, discussed below.
3.2 Performance experiment
- Investigate the effects on performance of different compiler optimisation flags.
- Identify hotspots in the program which could be optimised in future.
- Investigate the behaviour or performance of the program when configured with different parameters.
- Compare the behaviour or performance of different implementations of the program (e.g., original vs. refactored code).
- Carry out any other performance-related experiment you think is of interest.
You may choose one of these options or come up with your own. Please keep in mind that quality is more important than quantity. Credit will be given for intelligent and insightful analysis of your results, in preference to the quantity of data you present. You will be given credit for specific insights into the effect of your chosen experimental variables on this particular program, rather than comments on the effect on programs in general.
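If you choose a timing-based experiment, repeated measurements reported with a mean and spread are far more credible than single runs. A minimal sketch in Python (the `workload` function is a hypothetical stand-in for the program under study):

```python
import statistics
import time

def time_run(func, *args, repeats=5):
    """Time func(*args) several times; return (mean, stdev) in seconds.

    Reporting variability as well as a mean makes performance
    comparisons between configurations far more trustworthy.
    """
    samples = []
    for _ in range(repeats):
        start = time.perf_counter()
        func(*args)
        samples.append(time.perf_counter() - start)
    return statistics.mean(samples), statistics.stdev(samples)

# Hypothetical workload standing in for the program being measured.
def workload(n):
    return sum(i * i for i in range(n))

mean, spread = time_run(workload, 100_000)
```

The same pattern applies to timing a C executable from the shell: run it several times and report both the average and the variation.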
This experiment must be documented in a report capturing the following:
- Introduction: Motivation and context for your choice of experiment.
- Method: Description of how you carried out your experiment, what your setup was, and how you collected data.
- Results: Your results, including a clear graphical presentation of any performance data.
- Discussion: Analysis and discussion of your results.
- Conclusion: Some brief conclusions and suggestions for future work.
You can use whatever profiling and performance analysis tools you want as long as they are specified in your performance analysis report.
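For a Python submission, for example, the standard-library cProfile and pstats modules are one readily available option. A minimal sketch, with a hypothetical `workload` function standing in for the program:

```python
import cProfile
import io
import pstats

def workload(n):
    """Hypothetical stand-in for the program's main computation."""
    return sum(i * i for i in range(n))

profiler = cProfile.Profile()
profiler.enable()
workload(500_000)
profiler.disable()

# Sort the profile by cumulative time and print the top entries,
# which is usually enough to identify candidate hotspots.
buffer = io.StringIO()
stats = pstats.Stats(profiler, stream=buffer)
stats.sort_stats("cumulative").print_stats(5)
print(buffer.getvalue())
```

For a C submission, tools such as gprof or perf play an analogous role; whichever you use, name it and its settings in the report.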
You are expected to incorporate one of your performance improvement findings into the code refactoring. Those findings that are not implemented should be presented as future work in the report’s conclusion.
4 Assignments
This course has two submissions:
- Formative milestone part-way through the course
- Summative final assessment submission
The purpose of the milestone is to provide you with early feedback that you can incorporate into your final submission.
The deadlines will be shown on LEARN under Assessment Information for all students.
Submission is allowed up to 7 calendar days after the deadline, with a 5% deduction per day or part-day late.
If you submit your assessment by the deadline then it will be marked, and you cannot submit an amended version to be marked later.
If you do not wish the submitted version of your assessment to be marked, and intend submitting an amended version after the deadline, then it is your responsibility to let the course organiser know, before the deadline, that the submitted version should not be marked.
5 Submission format
Submissions for both the milestone and final assessment will take the form of PDF documents. Both documents must contain the following information for your GitLab repository on the first page:
Your submission file must be named using:
- Your student number.
- The course identifier: PSoc for Programming Skills (on-campus), or PSol for Programming Skills (online).
For example, if your student number is s1234567 and you are taking Programming Skills (on-campus), then you would name your submission file s1234567-PSoc.pdf.
Your reports must be a maximum of 8 pages, in size 12 text with 1.5 line spacing, excluding the title page, contents, glossary, references, and appendices. This comes to approximately 2,800 words. For any submission longer than 8 pages, the pages after the first 8 will not be read or marked.
LaTeX and Word templates fulfilling these requirements are provided on Learn under Additional materials -> Templates. You may use a different style of document as long as these requirements are met.
6 Marking
| Submission | Weight out of the total course mark | Weight out of submission |
| --- | --- | --- |
| Milestone | 0% | |
| - Code refactoring plans | | 50% |
| - Performance experiment plans | | 50% |
| Final submission | 100% | |
| - Code refactoring implementation | | 50% |
| - Performance experiment implementation | | 50% |
6.1 Milestone marking scheme (0%)
- Code refactoring plans
  - Use of version control
  - Test plan and design
- Performance experiment plans
  - Motivation and context
  - Experiment plan
6.2 Final submission marking scheme (100%)
The final submission will cover the following criteria:
| Component | Weight out of submission | Weight within component |
| --- | --- | --- |
| Code refactoring | 50% | |
| - Use of version control | | 10% |
| - Operation of refactored code and tests | | 10% |
| - Error handling and validation | | 10% |
| - Readability of source code | | 20% |
| - Refactoring of source code | | 25% |
| - Implementation and quality of tests | | 25% |
| Performance experiment | 50% | |
| - Motivation and context | | 10% |
| - Experimental approach | | 10% |
| - Result selection and presentation | | 10% |
| - Experiment reproducibility | | 10% |
| - Result analysis and interpretation | | 30% |
| - Identification and implementation of performance improvements | | 30% |
6.3 Provisional marks
7 Frequently asked questions
Can I change the directory structure or add additional files?
You can change the directory structure or add additional files or modules, as long as the main program still exists under its original name and is what is invoked to run the program.
The course teaches Continuous Integration, and this coursework assesses the use of version control; does this include CI?
Can I modify the README.md file?
Yes, in fact this is strongly encouraged. The README.md file acts as the program’s documentation, so if you make any modifications that the user must be aware of (e.g., introducing new dependencies), these should be documented.
When I try to visualise the simulation with ImageMagick, the output is too small
Is there a good reason to not always optimise C code with -O3?
Optimisation using the -O3 compiler flag can relax floating-point constraints, and you can get wrong results for a given algorithm if those constraints are relaxed too much.
The GCC bug tracker has reports which might give an idea of the kinds of issues -O3 optimisation can cause, for example:
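The underlying issue is that floating-point arithmetic is not associative, so optimisation passes that re-associate operations can change results. The effect is easy to demonstrate in any language; a Python sketch:

```python
# Floating-point addition is not associative: evaluating the same
# three additions in a different order can produce a different result,
# which is why aggressive optimisation flags that reorder arithmetic
# may change program output.
a, b, c = 0.1, 0.2, 0.3
left = (a + b) + c
right = a + (b + c)
print(left == right)  # → False under IEEE 754 double precision
print(left, right)
```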
When I run gprof the results always show that 100% of the time is spent in the main() function, regardless of the size of the grid
If, for example, a program reads in a very large file using a function read_data() that takes 9 minutes, and then processes the contents in a function process_data() that takes 1 minute, the profiler will show that 90% of the execution time is in read_data().
My profiler is not returning information about how much time is spent in each function
Where the runtime of the program is very short, a profiler might not return any information about how much time was spent in each function (either in seconds or as a percentage of the total runtime). You should think about how to run the program in such a way that it takes longer to complete.
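One common workaround is to repeat the computation enough times for the total runtime to become measurable. A Python sketch using the standard-library timeit module (`quick_task` is a hypothetical stand-in for a fast operation):

```python
import timeit

def quick_task():
    """Hypothetical fast operation that finishes too quickly to profile alone."""
    return sum(range(1_000))

# timeit runs the callable many times and reports the total elapsed time,
# so even sub-millisecond operations produce a measurable, stable timing.
total = timeit.timeit(quick_task, number=10_000)
per_call = total / 10_000
```

The equivalent for a whole program is to enlarge its input (e.g., a bigger grid) or wrap its invocation in a loop, so the profiler has enough samples to attribute time to individual functions.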
Your report is a technical report, rather than a literature review, or an academic paper. If you use or quote documentation, or appropriate websites, then you should cite these, but it is acceptable to use footnotes rather than using full references.