Linnan Wang
I'm a senior deep learning engineer at NVIDIA. I got my Ph.D. from the CS department of Brown University, advised by Prof. Rodrigo Fonseca. Before Brown, I was an OMSCS student at Georgia Tech while working as a full-time software developer at Dow Jones.

Research interests: artificial intelligence, high-performance computing, distributed systems.
Publications

- Planning at Inference: MCTS Test-Time Scaling for Long Video Generation. Ritvik Bale, Ethan He, Ashwath Aithal, Linnan Wang. Submitted to ICLR 2026.
- Multi-Objective Optimization by Learning Space Partition. Yiyang Zhao, Linnan Wang, Kevin Yang, Tianjun Zhang, Tian Guo, Yuandong Tian.

To improve the search efficiency of neural architecture search (NAS), one-shot NAS trains a single supernet to approximate the performance of candidate architectures during search.
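The weight-sharing idea behind one-shot NAS can be illustrated with a toy sketch (all names, sizes, and the regression task below are made up for illustration, not taken from any paper): each layer of a tiny supernet chooses among a few candidate linear ops, every architecture shares the same weight pool, a random sub-network (path) is sampled each training step, and afterwards all candidate architectures are ranked by supernet loss without per-candidate retraining.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-layer linear supernet: each layer picks one of 3
# candidate weight matrices. All 3*3 = 9 architectures draw from the
# same pool of 6 matrices, so one training run prices them all.
DIM, N_LAYERS, N_OPS = 4, 2, 3
pool = [[rng.normal(size=(DIM, DIM)) * 0.5 for _ in range(N_OPS)]
        for _ in range(N_LAYERS)]

target = rng.normal(size=(DIM, DIM))   # toy task: approximate this linear map
X = rng.normal(size=(DIM, 64))         # training inputs
Y = target @ X                         # regression targets

def loss(path):
    """Mean squared error of the sub-network selected by `path`."""
    W1, W2 = pool[0][path[0]], pool[1][path[1]]
    R = W2 @ W1 @ X - Y
    return float(np.mean(R * R))

def train_step(path, lr=1e-3):
    """One SGD step updating only the sampled path's shared weights."""
    W1, W2 = pool[0][path[0]], pool[1][path[1]]
    H = W1 @ X
    R = W2 @ H - Y
    n = X.shape[1]
    # Analytic gradients of the mean squared error through both layers.
    pool[1][path[1]] -= lr * (2.0 / n) * (R @ H.T)
    pool[0][path[0]] -= lr * (2.0 / n) * (W2.T @ R @ X.T)

paths = [(i, j) for i in range(N_OPS) for j in range(N_OPS)]
init_losses = [loss(p) for p in paths]

# Supernet training: sample a random sub-network each step.
for _ in range(3000):
    train_step(tuple(rng.integers(N_OPS, size=N_LAYERS)))

# Search phase: rank all architectures by supernet loss, no retraining.
best = min(paths, key=loss)
print("best path:", best, "loss:", round(loss(best), 4))
```

The point of the sketch is the decoupling: training cost is paid once for the shared pool, while the search phase only evaluates candidates under the trained supernet, which is what makes one-shot NAS cheap relative to training each architecture from scratch.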
Software

- The release code of LA-MCTS, with its application to neural architecture search.
- A unified library of state-of-the-art model optimization techniques such as quantization, pruning, distillation, and speculative decoding; it compresses deep learning models for downstream deployment.

Bio

He got his Ph.D. from Brown University in 2021. His research topic is neural architecture search, and his NAS-related works have been published at ICML, NeurIPS, ICLR, CVPR, TPAMI, and AAAI. At NVIDIA, Linnan is continuing his R&D in NAS and shipping NAS-optimized models to NVIDIA core products. He is also an author on the NVIDIA Technical Blog.

Research article: FFT-Based Gradient Sparsification for the Distributed Training of Deep Neural Networks. Linnan Wang, Wei Wu, Junyu Zhang, Hang Liu, George Bosilca, Maurice Herlihy, Rodrigo Fonseca. HPDC '20: Proceedings of the 29th International Symposium on High-Performance Parallel and Distributed Computing, 6 June 2020. doi.org/10.1145/3369583.3392681