diff --git a/_pages/cv.md b/_pages/cv.md
index f46c7e6..1218c5b 100644
--- a/_pages/cv.md
+++ b/_pages/cv.md
@@ -12,14 +12,15 @@ redirect_from:
[alham.fikri@mbzuai.ac.ae](mailto:alham.fikri@mbzuai.ac.ae)
-I am an assistant professor at MBZUAI. I obtained my Ph.D. from the University of Edinburgh’s Institute for Language, Cognition, and Computation, where I focused on enhancing the training and inference speed of machine translation. My studies were supervised by Dr. Kenneth Heafield and Dr. Rico Sennrich. Presently, my research centers on multilingual, low-resource, and low-compute NLP. I have worked on lightweight models via distillation, as well as adapting models to unseen languages in cases of limited training data.
-I have been developing various multilingual large language models such as BLOOMZ/mT0, Jais, Bactrian-X, and more. I have also worked on building multilingual and under-represented NLP resources and benchmarks, especially those that capture local culture and nuances, some of which were awarded best resource papers.
-In the past, I have gained industry experience at companies such as Amazon, Google, and Apple.
+[//]: # (I am an assistant professor at MBZUAI. I obtained my Ph.D. from the University of Edinburgh’s Institute for Language, Cognition, and Computation, where I focused on enhancing the training and inference speed of machine translation. My studies were supervised by Dr. Kenneth Heafield and Dr. Rico Sennrich. Presently, my research centers on multilingual, low-resource, and low-compute NLP. I have worked on lightweight models via distillation, as well as adapting models to unseen languages in cases of limited training data.)
+[//]: # (I have been developing various multilingual large language models such as BLOOMZ/mT0, Jais, Bactrian-X, and more. I have also worked on building multilingual and under-represented NLP resources and benchmarks, especially those that capture local culture and nuances, some of which were awarded best resource papers.)
+[//]: # (In the past, I have gained industry experience at companies such as Amazon, Google, and Apple.)
## Education
* **PhD, University of Edinburgh** (2016 - 2020)\
-Thesis: Approximating Neural Machine Translation for Efficiency; making NMT trains faster using distributed and asynchronous training, sparse gradient, and transfer learning.
+Thesis: Approximating Neural Machine Translation for Efficiency; making NMT training faster using distributed and asynchronous training, sparse gradients, and transfer learning. Supervised by Kenneth Heafield and Rico Sennrich.\
+Examiners: Graham Neubig and Barry Haddow.
* **MSc Artificial Intelligence, University of Edinburgh** (2014 - 2015)\
With distinction. Final project: Haiku generator with word vector model.
* **BSc Computer Science, Universitas Indonesia** (2010 - 2014)\