10 October 2022
About the research project
Artificial Intelligence (AI) has emerged as an important technology with significant influence on many aspects of life. In the era of digital media, modern AI approaches, known as "deep learning" methods, can effectively exploit the abundance of data and computational power available to make highly accurate predictions. This advancement has produced breakthroughs in a number of application domains, for example, object detection and classification in autonomous driving systems, and automated diagnosis from medical images. However, deep learning methods have also raised concerns in business and society because they lack the capability to explain and interpret their predictions. Without interpretability, humans cannot understand the behaviour of AI well enough to establish trust and to acquire useful knowledge; for example, from a medical image we want to know not only whether a disease is present but also why.
Current approaches to interpretable AI focus on extracting the semantic relationships of features that form a prediction, while little has been done to integrate the representation and learning of these semantics. This project will study graph neural networks (GNNs) with attention mechanisms to achieve semantic and visual interpretation of decision making from image data. The objectives of this project include:
- Investigation of approaches that seamlessly integrate GNNs with attention mechanisms for image classification, segmentation, and object detection.
- A method that produces sensible explanations of how decisions are made while maintaining high prediction performance.
- An analysis of the relation between interpretability and effectiveness, e.g. how interpretability affects prediction performance.
This project will use medical image analysis as a case study to verify the interpretive capability of the proposed approaches and to demonstrate the practicality of the research.
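To make the core idea concrete, the following is a minimal sketch (not part of the project specification) of the kind of attention mechanism a GNN can use for interpretability, in the style of graph attention networks: each node attends to its neighbours, and the learned attention weights indicate which neighbouring regions contributed most to a decision. The function name, the use of NumPy, and the single-head formulation are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along an axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gat_layer(H, A, W, a, leaky_slope=0.2):
    """One illustrative graph-attention layer.

    H: (N, F) node features (e.g. one node per image region)
    A: (N, N) adjacency matrix, 1 where an edge exists (include self-loops)
    W: (F, F') learned projection; a: (2*F',) learned attention vector

    Returns the updated node features and the attention matrix `alpha`;
    row i of `alpha` can be read as an explanation of which neighbours
    node i relied on.
    """
    Z = H @ W                                    # project node features
    Fp = Z.shape[1]
    # attention logits e_ij = LeakyReLU(a^T [z_i || z_j])
    logits = (Z @ a[:Fp])[:, None] + (Z @ a[Fp:])[None, :]
    logits = np.where(logits > 0, logits, leaky_slope * logits)
    logits = np.where(A > 0, logits, -1e9)       # mask out non-edges
    alpha = softmax(logits, axis=1)              # normalise over neighbours
    return alpha @ Z, alpha
```

In a full model the attention weights would be learned end-to-end with the classifier; here they simply show how a per-neighbour weighting that sums to one over each node's neighbourhood can double as a visualisable explanation.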
Primary Supervisor: Dr Son Tran
Applicants will be considered for a Research Training Program (RTP) scholarship or Tasmania Graduate Research Scholarship (TGRS) which, if successful, provides:
- a living allowance stipend of $28,854 per annum (2022 rate, indexed annually) for 3.5 years
- a relocation allowance of up to $2,000
- a tuition fees offset covering the cost of tuition fees for up to four years (domestic applicants only)
If successful, international applicants will receive a University of Tasmania Fees Offset for up to four years.
As part of the application process you may indicate if you do not wish to be considered for scholarship funding.
Applicants should review the Higher Degree by Research minimum entry requirements.
Additional eligibility criteria specific to this project/scholarship:
- Applicants must be able to undertake the project on-campus.
The project is competitively assessed and awarded. Selection is based on academic merit and suitability to the project as determined by the College.
Additional essential selection criteria specific to this project:
- Have a solid background in computing
- Have excellent programming skills
Additional desirable selection criteria specific to this project:
- Have good knowledge in Python
- Have basic knowledge in machine learning
There is a three-step application process:
- Select your project, and check you meet the eligibility and selection criteria;
- Contact the Primary Supervisor, Dr Son Tran to discuss your suitability and the project's requirements; and
- Submit an application by the closing date listed above.
- Copy and paste the title of the project from this advertisement into your application. If you do not do this correctly, your application may be rejected.
- As part of your application, you will be required to submit a covering letter, a CV including two referees, and your project research proposal.
Following the application closing date, applications will be assessed within the College. Applicants should expect to receive notification of the outcome by email by the advertised outcome date.