Natural Language Processing

Xfmers

Python Library to quickly initialise custom Transformer models
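
As an illustration of the idea (not the Xfmers API itself), the sketch below initialises a small custom Transformer from a configuration object using the Hugging Face transformers library; all layer sizes are placeholders.

    from transformers import BertConfig, TFBertModel

    # Build a small custom Transformer encoder from scratch (random weights),
    # rather than loading a pretrained checkpoint. Sizes are placeholders.
    config = BertConfig(
        vocab_size=8000,
        hidden_size=256,
        num_hidden_layers=4,
        num_attention_heads=4,
        intermediate_size=1024,
    )
    model = TFBertModel(config)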

DL Engineering

AI Lab Container

GPU container for rapid and flexible prototyping of AI applications

Optimising TensorFlow Performance

Tutorial at PyCon SG 2019 on faster training with TensorFlow 2.0, covering a variety of techniques.
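
One representative technique in that space (shown generically here, not necessarily the tutorial's exact code) is overlapping input processing with training via tf.data prefetching:

    import tensorflow as tf

    AUTOTUNE = tf.data.experimental.AUTOTUNE  # tf.data.AUTOTUNE in newer releases

    def make_dataset(features, labels, batch_size=64):
        # Shuffle, batch, and prefetch so the CPU prepares the next batch
        # while the GPU is still training on the current one.
        ds = tf.data.Dataset.from_tensor_slices((features, labels))
        ds = ds.shuffle(buffer_size=10_000)
        ds = ds.batch(batch_size)
        return ds.prefetch(AUTOTUNE)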

JupyterHub @ SUTD

NVIDIA GPU cluster supporting undergraduate deep learning courses and research: Moodle and JupyterHub running on an on-premises, multi-node Kubernetes cluster.

Setup process and configuration files are publicly available on GitHub.
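
For a sense of what such a deployment involves (the actual configuration files are in the GitHub repository), here is a minimal JupyterHub-on-Kubernetes sketch using KubeSpawner; the image name and resource limits are placeholders.

    # jupyterhub_config.py -- minimal sketch, not the production configuration
    c = get_config()  # provided by JupyterHub when it loads this file

    # Launch each user's notebook server in its own pod with one GPU attached.
    c.JupyterHub.spawner_class = "kubespawner.KubeSpawner"
    c.KubeSpawner.image = "example-registry/dl-notebook:latest"  # placeholder image
    c.KubeSpawner.extra_resource_limits = {"nvidia.com/gpu": "1"}
    c.KubeSpawner.mem_limit = "16G"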

NVStatsRecorder

Python Library to easily record and graph various NVIDIA GPU metrics during model training.
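
The underlying idea, illustrated below with NVML via the pynvml bindings rather than the NVStatsRecorder API itself, is to periodically sample GPU utilisation and memory while a training job runs.

    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

    samples = []
    for _ in range(10):  # sample once per second for 10 s
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        samples.append((util.gpu, mem.used / 2**20))  # % utilisation, MiB used
        time.sleep(1)

    pynvml.nvmlShutdown()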

NSCC On-boarding

Developed on-boarding examples to help research teams in Singapore run deep learning training on NSCC, Singapore's National Supercomputing Centre.

Open Source Contributions

Transformers

NLP Library by HuggingFace

#1763, #1668, #1590

Fixed Keras model.fit() training for the TF XLNet and TF XLM models

#1667, #1508

Added automatic mixed precision support for training examples and inference benchmarks

TensorFlow

DL Library by Google

#29249

Added docstring for tf.train.experimental.enable_mixed_precision_graph_rewrite
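
For context, this function wraps an optimizer so that eligible ops are rewritten to run in float16 with automatic loss scaling. A minimal usage sketch, assuming a TensorFlow release where the API is still available (it was later deprecated in favour of tf.keras.mixed_precision):

    import tensorflow as tf

    model = tf.keras.Sequential([tf.keras.layers.Dense(10)])

    # Wrap the optimizer so TensorFlow rewrites eligible ops to float16
    # and applies dynamic loss scaling automatically.
    opt = tf.keras.optimizers.SGD(learning_rate=0.01)
    opt = tf.train.experimental.enable_mixed_precision_graph_rewrite(opt)

    model.compile(optimizer=opt, loss="mse")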

Past Projects

Optimizing BERT Training Performance in tf.keras

Exploration of various techniques and their impact on BERT training throughput.
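
One such technique, shown here generically rather than as the project's exact code, is compiling the training step with XLA; the model and optimizer names are placeholders.

    import tensorflow as tf

    # jit_compile=True (experimental_compile=True on older TF 2.x releases)
    # asks XLA to fuse the ops in the train step, which can raise throughput
    # for models like BERT. `model` and `optimizer` are placeholders.
    @tf.function(jit_compile=True)
    def train_step(model, optimizer, x, y):
        with tf.GradientTape() as tape:
            logits = model(x, training=True)
            loss = tf.reduce_mean(
                tf.keras.losses.sparse_categorical_crossentropy(
                    y, logits, from_logits=True))
        grads = tape.gradient(loss, model.trainable_variables)
        optimizer.apply_gradients(zip(grads, model.trainable_variables))
        return loss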

Chrome Extension to combat Fake News

Chrome Extension to educate users on evaluating the credibility of a web page.

SmartBin

Real-time object detection and classification of recyclable waste.