Visualized: COVID-19 Cases in Singapore

I decided to take a deeper look at the COVID-19 cases in Singapore based on publicly available data. In the process, I came up with some interesting visualizations and observations. Here's what I've been able to do:

Network Graph of Local Clusters

One of the more worrying features of COVID-19 is its ability to spread…

Short Introduction to Knowledge Graphs

Knowledge graphs are useful for providing structured sources of information for many downstream tasks. Hence, it is an interesting problem to build large knowledge graphs (KGs) from a large text corpus. Being able to learn a KG from web-scale corpora means that we could leverage the large amount of unstructured information on websites (e.g. TechCrunch)…

Depth-wise Separable Convolutions: Performance Investigations

Depth-wise Separable Convolutions (shorthand: DepSep convolutions) have been proposed as an efficient alternative to traditional convolutions. They are used in models such as MobileNet (Howard et al., 2017), EfficientNet (Tan et al., 2019), and more. They have fewer parameters and require fewer floating-point operations (FLOPs) to compute. However, due to the complexities of modern…
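The parameter savings mentioned in the excerpt follow directly from how the two operations are factored. As a rough sketch (not taken from the post itself), the standard formulas can be compared for a hypothetical 3×3 layer with 128 input and 128 output channels:

```python
def conv_params(k, c_in, c_out):
    # Standard convolution: one k x k filter per (input channel, output channel) pair
    return k * k * c_in * c_out

def depsep_params(k, c_in, c_out):
    # Depthwise step: one k x k filter per input channel,
    # followed by a pointwise (1 x 1) convolution mixing channels
    return k * k * c_in + c_in * c_out

std = conv_params(3, 128, 128)     # 147,456 parameters
dws = depsep_params(3, 128, 128)   # 17,536 parameters
print(std, dws, round(std / dws, 2))  # roughly an 8.4x reduction
```

A similar factoring applies to the FLOP count, which is why the reduction in compute closely tracks the reduction in parameters.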

Transformer Architecture

This post provides a primer on the Transformer model architecture. It is extremely adept at sequence modelling tasks such as language modelling, where the elements in the sequences exhibit temporal correlations with each other.

Outline

- Encoder-Decoder Models
- Multi-headed Self-attention
- Transformer "Layer"
- Vaswani Transformer
- Transformer Family Tree

Encoder-Decoder Models

Transformers are a type of Encoder-Decoder model. In this section, we…
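The self-attention operation named in the outline is what lets each sequence element attend to every other element. As a minimal single-head sketch in NumPy (the projection matrices and dimensions here are illustrative, not from the post):

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Single-head scaled dot-product self-attention."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v       # project inputs to queries, keys, values
    scores = q @ k.T / np.sqrt(q.shape[-1])   # pairwise similarities, scaled by sqrt(d_k)
    # Numerically stable softmax over each row
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                         # each output is a weighted sum of values

rng = np.random.default_rng(0)
seq_len, d_model = 5, 8
x = rng.standard_normal((seq_len, d_model))
w_q, w_k, w_v = [rng.standard_normal((d_model, d_model)) for _ in range(3)]
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (5, 8): one attended representation per sequence position
```

The multi-headed variant simply runs several such attention operations in parallel with smaller projections and concatenates their outputs.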