Guolin Ke, Di He & Tie-Yan Liu. Microsoft Research. {guolin.ke, dihe, tyliu}@microsoft.com. Abstract: In this work, we investigate the positional encoding methods used in …

Guolin Ke is currently the head of the Machine Learning Group at DP Technology, working on AI for Science. Previously, he was a Senior Researcher at the Machine Learning Group …
Guolin Ke, Zhenhui Xu, Jia Zhang, Jiang Bian, and Tie-Yan Liu. "DeepGBM: A Deep Learning Framework Distilled by GBDT for Online Prediction Tasks." In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining.

Guolin Ke, Jia Zhang, Zhenhui Xu, Jiang Bian, Tie-Yan Liu. TL;DR: We propose a universal neural network solution to derive effective NN architectures for tabular data automatically. Abstract: Neural Network (NN) has achieved state-of-the-art performances in many tasks within image, speech, and text domains. Such great …
Rethinking Positional Encoding in Language Pre-training
Authors: Gengmo Zhou, Zhifeng Gao, Qiankun Ding, Hang Zheng, Hongteng Xu, Zhewei Wei, Linfeng Zhang, Guolin Ke. Schematic illustration of the Uni-Mol framework. Uni-Mol is a universal 3D molecular pretraining framework that offers a significant expansion of representation capacity and application scope in drug design.

S. Lu, D. He, C. Xiong, G. Ke, W. Malik, Z. Dou, P. Bennett, T.-Y. Liu, A. Overwijk. Proceedings of the 2024 Conference on Empirical Methods in Natural Language …, 2024.

TL;DR: We have explored the direct application of Transformers to graph representation. With three simple, yet effective graph structural encodings, the proposed Graphormer works surprisingly well on a wide range of popular benchmark datasets. Abstract: The Transformer architecture has become a dominant choice in many domains, …