COMP6685 Deep Learning



RETRIEVAL ASSESSMENT
INDIVIDUAL (100% of total mark)
Deliverables:
1x Jupyter notebook

Task: You are required to develop Python code using TensorFlow (Keras), with additional comments, to answer the question in the next section. Your code should be able to run on CPUs.

Write your code, in the template provided on Moodle, to train a Recurrent Neural Network (RNN) on the public benchmark dataset named Poker Hand, available at https://archive.ics.uci.edu/ml/datasets/Poker+Hand.

The Poker Hand dataset is composed of one training set, named “poker-hand-training-true.data”, and one testing set, named “poker-hand-testing.data”. You will need to download both the training and testing sets to your local disk by clicking the Download link (at the top right of the page).

In the Poker Hand dataset, each data sample (row) is an example of a hand consisting of five playing cards drawn from a standard deck of 52. Each card is described by two attributes (suit and rank), for a total of 10 predictive attributes. There is one Class attribute that describes the "Poker Hand". You can find more information about this dataset at: https://www.kaggle.com/datasets/rasvob/uci-poker-hand-dataset

The dataset should be imported in the code. An example of how to import the dataset into your code can be found at the link below: https://www.kaggle.com/code/rasvob/uci-poker-dataset-classification
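As a minimal sketch, the import step could look like the following, using pandas with column names taken from the UCI attribute description. The file paths and column labels here are assumptions; in the notebook you would point `read_csv` at the two downloaded files (a tiny in-memory sample is used below so the snippet runs on its own):

```python
import io
import pandas as pd

# Column names follow the UCI attribute description: five cards, each with
# a suit (S) and rank (C) attribute, plus the hand class. Labels are assumed.
COLUMNS = [f"{a}{i}" for i in range(1, 6) for a in ("S", "C")] + ["CLASS"]

# In the assignment, replace the in-memory sample with the real files, e.g.:
# train_df = pd.read_csv("poker-hand-training-true.data", names=COLUMNS)
# test_df  = pd.read_csv("poker-hand-testing.data", names=COLUMNS)
sample = io.StringIO("1,10,1,11,1,13,1,12,1,1,9\n2,11,2,13,2,10,2,12,2,1,9\n")
train_df = pd.read_csv(sample, names=COLUMNS)

# Split predictive attributes from the class labels
X = train_df.drop(columns=["CLASS"]).to_numpy()
y = train_df["CLASS"].to_numpy()
print(X.shape, y.shape)  # (2, 10) (2,)
```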

In this assignment, you are required to implement a single vanilla RNN (not an LSTM nor a GRU) and add a comment for each of the parameters chosen. The RNN should be trained on the training set, and its performance should be evaluated on the testing set.

You can determine the settings of the RNN (including the number of layers, the number of recurrent neurons in each layer, regularisation, dropout, optimiser, activation function, learning rate, etc.) according to your own preference. However, it is important that the RNN can achieve good classification performance in terms of accuracy on the testing set after being trained on the training set for no more than 40 epochs.
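A vanilla RNN in Keras is built with the `SimpleRNN` layer. The sketch below is one possible starting point, not a reference solution: framing each hand as 5 timesteps (cards) of 2 features (suit, rank) is an assumption, and the unit count, dropout rate, and learning rate are illustrative values you would tune yourself.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

NUM_CLASSES = 10  # poker-hand classes 0-9

model = keras.Sequential([
    # Assumed input framing: 5 cards (timesteps), each with (suit, rank)
    layers.Input(shape=(5, 2)),
    # A single vanilla recurrent layer, as required (not LSTM/GRU);
    # 64 units and tanh are illustrative starting points
    layers.SimpleRNN(64, activation="tanh"),
    layers.Dropout(0.2),  # mild regularisation against overfitting
    layers.Dense(NUM_CLASSES, activation="softmax"),  # class probabilities
])
model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=1e-3),  # illustrative rate
    loss="sparse_categorical_crossentropy",  # labels are integers 0-9
    metrics=["accuracy"],
)

# The real inputs would be the loaded data reshaped to (n_samples, 5, 2);
# a dummy batch is used here only to show the expected shapes
dummy = np.zeros((4, 5, 2), dtype="float32")
probs = model.predict(dummy, verbose=0)
print(probs.shape)  # (4, 10)
```

Training would then be `model.fit(X_train, y_train, epochs=40, ...)`, keeping within the 40-epoch limit stated above.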

An acceptable classification accuracy rate on the testing set should be above 65%, namely, more than 65% of the testing data samples are correctly classified by the RNN model. You are also required to present the confusion matrix along with the classification accuracy as the final prediction result.
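The accuracy and confusion matrix can be produced with scikit-learn. In the sketch below the label arrays are dummy placeholders; in the notebook `y_test` would be the true testing labels and `y_pred` would come from `np.argmax(model.predict(X_test), axis=1)`.

```python
import numpy as np
from sklearn.metrics import accuracy_score, confusion_matrix

# Dummy placeholders; replace with the real testing labels and predictions
y_test = np.array([0, 1, 1, 2, 0, 0])
y_pred = np.array([0, 1, 2, 2, 0, 1])

acc = accuracy_score(y_test, y_pred)          # fraction correctly classified
cm = confusion_matrix(y_test, y_pred)         # rows: true class, cols: predicted
print(f"accuracy = {acc:.2%}")
print(cm)
```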

All main settings should be commented in the code. The output of each code block and the training progress of the RNN model should be kept in the submitted Jupyter notebook file. A question asking for final remarks on the results should be answered in the markdown cell defined in the template.

Submission:
  • by Moodle, within the deadline of Monday, 5th August 2024, before the cutoff at 23:55
  • Submit only a Jupyter notebook file. Use the template provided. The comments should be included in the file as code comments or in the markdown space allocated.
  • Your Jupyter notebook file name should include your Student ID and Name

Marking Scheme (100 marks for the assessment that corresponds to 25% of the total mark of the module):

• Importing the dataset (both training and testing sets) (10 marks)
• Correct definition and implementation of the RNN (20 marks)
• Training of the RNN on the training set (10 marks)
• Evaluation of the model on the testing set (10 marks)
• Acceptable classification accuracy on the testing set, with the confusion matrix presented (20 marks)
• Code outline, including useful comments in the code (10 marks)
• Code running without errors (10 marks)
• Final remarks/conclusions on the obtained results and ideas for further improvement of the accuracy (10 marks)
Plagiarism and Duplication of Material
Senate has agreed on the following definition of plagiarism: "Plagiarism is the act of repeating the ideas or discoveries of another as one's own. To copy sentences, phrases or even striking expressions without acknowledgement in a manner that may deceive the reader as to the source is plagiarism; to paraphrase in a manner that may deceive the reader is likewise plagiarism. Where such copying or close paraphrasing has occurred the mere mention of the source in a bibliography will not be deemed sufficient acknowledgement; in each such instance it must be referred specifically to its source. Verbatim quotations must be directly acknowledged either in inverted commas or by indenting."

The work you submit must be your own, except where its original author is clearly referenced. We reserve the right to run checks on all submitted work in an effort to identify possible plagiarism, and take disciplinary action against anyone found to have committed plagiarism.

When you use other people's material, you must clearly indicate the source of the material using the Harvard style (see http://www.kent.ac.uk/uelt/ai/styleguides.html).

In addition, substantial amounts of verbatim or near verbatim cut-and-paste from web-based sources, course material and other resources will not be considered as evidence of your own understanding of the topics being examined.
The Department publishes an on-line Plagiarism and Collaboration Frequently Asked Questions (FAQ) which is available at: http://www.cs.ukc.ac.uk/teaching/student/assessment/plagiarism.local
Work may be submitted to Turnitin for the identification of possible plagiarism. You can find out more about Turnitin at the following page: http://www.kent.ac.uk/uelt/ai/students/usingturnitinsts.html#whatisTurnitin
