Posts by Collection

GitHub Cheatsheet

  1. Store your username and password so Git stops prompting on every push: git config --global credential.helper store (see the sketch below)
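
A minimal sketch of the flow, assuming an HTTPS remote named origin and a branch named main (both hypothetical here): after enabling the store helper, the next authenticated operation prompts once and saves the credentials in plain text in ~/.git-credentials, so later operations do not prompt again. Note that GitHub now expects a personal access token in place of the account password.

     git config --global credential.helper store
     git push origin main   # prompts once for username and token/password
     git push origin main   # no prompt; credentials are read from ~/.git-credentials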

Log in to Jupyter Notebook remotely

  1. On the server: jupyter notebook --no-browser --port=8889
  2. On the local terminal: ssh -N -f -L localhost:8888:localhost:8889 username@serverIP
  3. In a local browser: open http://localhost:8888/ and enter the token printed in step 1 (the full sequence is sketched below).
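
Put together, a typical session looks like the sketch below (username, serverIP, and the port numbers are placeholders; any free ports work as long as the tunnel maps the local port you browse to onto the remote notebook port):

     # On the server: start a headless notebook on port 8889
     jupyter notebook --no-browser --port=8889

     # On the local machine: forward local port 8888 to port 8889 on the server
     # (-N runs no remote command, -f puts ssh in the background)
     ssh -N -f -L localhost:8888:localhost:8889 username@serverIP

     # Finally, open http://localhost:8888/ in a local browser and paste the
     # token printed by the jupyter command on the server.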

Publications

Sparse Convolutions for Faster Object Recognition

Published in AI Systems Workshop at SOSP, 2019

This paper presents a preliminary technique for accelerating ML inference on sparse inputs by modifying the convolution operator to be sparsity-aware.

Recommended citation: Wei Hao and Shivaram Venkataraman, "Sparse Convolutions for Faster Object Recognition", AI Systems Workshop at the Symposium on Operating Systems Principles (SOSP), 2019. http://learningsys.org/sosp19/assets/papers/21_CameraReadySubmission_sparse_conv_aisys19_final.pdf

Learning Amyloid Pathology Progression from Longitudinal PiB-PET Images in Preclinical Alzheimer’s Disease

Published in 2020 IEEE 17th International Symposium on Biomedical Imaging (ISBI), 2020

This paper presents a novel trainable network diffusion model that infers the propagation dynamics of amyloid pathology, conditioned on an individual-level structural connectivity network.

Recommended citation: Wei Hao, Nicholas M. Vogt, Zihang Meng, Seong Jae Hwang, Rebecca L. Koscik, Sterling C. Johnson, Barbara B. Bendlin, and Vikas Singh, "Learning Amyloid Pathology Progression from Longitudinal PiB-PET Images in Preclinical Alzheimer’s Disease", International Symposium on Biomedical Imaging (ISBI), 2020. https://ieeexplore.ieee.org/abstract/document/9098571

Serving DNNs like Clockwork: Performance Predictability from the Bottom Up

Published in 14th USENIX Symposium on Operating Systems Design and Implementation (OSDI), 2020

In this paper, starting with the predictable execution times of individual DNN inferences, we adopt a principled design methodology to successively build a fully distributed model serving system that achieves predictable end-to-end performance.

Recommended citation: Arpan Gujarati, Reza Karimi, Safya Alzayat, Wei Hao, Antoine Kaufmann, Ymir Vigfusson, Jonathan Mace, "Serving DNNs like Clockwork: Performance Predictability from the Bottom Up", 14th USENIX Symposium on Operating Systems Design and Implementation (OSDI), 2020. https://arxiv.org/abs/2006.02464

Talks

Oral Presentation at ISBI 2020

This presentation covers a novel trainable network diffusion model that infers the propagation dynamics of amyloid pathology, conditioned on an individual-level structural connectivity network.

Oral Presentation at OSDI 2020

This presentation covers Clockwork, a fully distributed DNN model serving system that achieves predictable end-to-end performance.

Teaching

Teaching experience 1

Undergraduate course, Columbia University, Computer Science Department, 2020

Ohhh my first TA job