User profiles for Hao Peng

Hao Peng

- Verified email at buaa.edu.cn - Cited by 6843

Hao Peng

- Verified email at illinois.edu - Cited by 3723

Peng Hao

- Verified email at student.uts.edu.au - Cited by 1074

Random feature attention

H Peng, N Pappas, D Yogatama, R Schwartz… - arXiv preprint arXiv …, 2021 - arxiv.org
Transformers are state-of-the-art models for a variety of sequence modeling tasks. At their
core is an attention function which models pairwise interactions between the inputs at every …
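The snippet cuts off before the paper's key idea, so here is a minimal NumPy sketch of how random Fourier features can linearize softmax attention. It assumes unit-normalized queries and keys and a plain sin/cos feature map; RFA's actual formulation differs in detail (e.g., gating), and all names here are illustrative.

```python
import numpy as np

def random_features(x, W):
    # Random Fourier features: phi(x) @ phi(y) ~= exp(-||x - y||^2 / 2)
    # when the rows of W are drawn from N(0, I).
    proj = x @ W.T                                               # (n, D)
    return np.concatenate([np.sin(proj), np.cos(proj)], axis=-1) / np.sqrt(W.shape[0])

def random_feature_attention(Q, K, V, num_features=128, seed=0):
    # Linear-complexity stand-in for softmax attention. With unit-norm
    # queries and keys, exp(q @ k) = e * exp(-||q - k||^2 / 2), and the
    # constant e cancels between numerator and denominator.
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((num_features, Q.shape[-1]))
    Qn = Q / np.linalg.norm(Q, axis=-1, keepdims=True)
    Kn = K / np.linalg.norm(K, axis=-1, keepdims=True)
    phi_q, phi_k = random_features(Qn, W), random_features(Kn, W)
    numer = phi_q @ (phi_k.T @ V)    # never materializes the n x m attention matrix
    denom = phi_q @ phi_k.sum(axis=0)
    return numer / denom[:, None]

# Sanity check against exact softmax attention on small random inputs.
n, m, d = 4, 6, 16
rng = np.random.default_rng(1)
Q, K, V = rng.standard_normal((n, d)), rng.standard_normal((m, d)), rng.standard_normal((m, d))
Qn = Q / np.linalg.norm(Q, axis=-1, keepdims=True)
Kn = K / np.linalg.norm(K, axis=-1, keepdims=True)
A = np.exp(Qn @ Kn.T)
exact = (A / A.sum(axis=-1, keepdims=True)) @ V
approx = random_feature_attention(Q, K, V, num_features=4096)
print(np.max(np.abs(exact - approx)))    # shrinks as num_features grows
```

The payoff is the parenthesization in `numer`: computing `phi_k.T @ V` first keeps the cost linear in sequence length instead of quadratic.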

A survey on text classification: From shallow to deep learning

Q Li, H Peng, J Li, C Xia, R Yang, L Sun, PS Yu… - arXiv preprint arXiv …, 2020 - arxiv.org
Text classification is the most fundamental and essential task in natural language processing.
The last decade has seen a surge of research in this area due to the unprecedented …

Modeling of quantization effects in digitally controlled dc–dc converters

H Peng, A Prodic, E Alarcón… - IEEE Transactions on …, 2007 - ieeexplore.ieee.org
In digitally controlled dc–dc converters with a single voltage feedback loop, the two quantizers,
namely the analog-to-digital (A/D) converter and the digital pulse-width modulator (DPWM…
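The snippet names the two quantizers but truncates before their interaction. Below is a short, hypothetical Python illustration of the standard no-limit-cycle resolution condition for this loop; the numeric values and the buck-converter relation Vout ≈ D·Vin are assumptions for the example, not data from the paper.

```python
import numpy as np

# Illustrative operating point only; not taken from the paper.
VIN = 5.0          # buck converter input voltage (V)
N_DPWM = 10        # DPWM resolution (bits)
ADC_BIN = 0.020    # A/D converter quantization bin at the output (V)

def quantize(x, step):
    """Uniform mid-tread quantizer with the given step size."""
    return step * np.round(x / step)

# In a buck converter Vout ~= D * Vin, so one DPWM LSB moves the output by:
dpwm_step = VIN / 2**N_DPWM

# A widely used design rule for avoiding limit-cycle oscillations: the DPWM
# must resolve the output voltage more finely than the A/D converter, so the
# loop can settle inside a single ADC bin instead of hunting forever between
# adjacent duty-cycle levels.
print(f"DPWM output step: {1e3 * dpwm_step:.2f} mV")
print(f"ADC bin size:     {1e3 * ADC_BIN:.2f} mV")
print("DPWM finer than ADC:", dpwm_step < ADC_BIN)

# The reference generally falls between reachable DPWM levels:
vref = 1.500
d_quant = quantize(vref / VIN, 1 / 2**N_DPWM)
print(f"closest reachable Vout: {VIN * d_quant:.4f} V (target {vref} V)")
```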

Learning from context or names? an empirical study on neural relation extraction

H Peng, T Gao, X Han, Y Lin, P Li, Z Liu, M Sun… - arXiv preprint arXiv …, 2020 - arxiv.org
Neural models have achieved remarkable success on relation extraction (RE) benchmarks.
However, there is no clear understanding of which type of information affects existing RE …

A comprehensive survey on pretrained foundation models: A history from BERT to ChatGPT

…, G Wang, K Zhang, C Ji, Q Yan, L He, H Peng… - arXiv preprint arXiv …, 2023 - arxiv.org
Pretrained Foundation Models (PFMs) are regarded as the foundation for various downstream
tasks with different data modalities. A PFM (e.g., BERT, ChatGPT, and GPT-4) is trained on …

Transfer learning using computational intelligence: A survey

J Lu, V Behbood, P Hao, H Zuo, S Xue… - Knowledge-Based …, 2015 - Elsevier
Transfer learning aims to provide a framework to utilize previously acquired knowledge to
solve new but similar problems much more quickly and effectively. In contrast to classical …

[PDF] Classifying relations via long short term memory networks along shortest dependency paths

Y Xu, L Mou, G Li, Y Chen, H Peng… - Proceedings of the 2015 …, 2015 - aclanthology.org
Relation classification is an important research arena in the field of natural language
processing (NLP). In this paper, we present SDP-LSTM, a novel neural network to classify the …

A convolutional attention network for extreme summarization of source code

M Allamanis, H Peng, C Sutton - International conference on …, 2016 - proceedings.mlr.press
Attention mechanisms in neural networks have proved useful for problems in which the
input and output do not have fixed dimension. Often there exist features that are locally …

Complexity-based prompting for multi-step reasoning

Y Fu, H Peng, A Sabharwal, P Clark… - … Conference on Learning …, 2022 - openreview.net
We study the task of prompting large-scale language models to perform multi-step reasoning.
Existing work shows that when prompted with a chain of thoughts (CoT), sequences of …
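The abstract is cut off, but the paper's two ideas are straightforward to sketch: choose few-shot exemplars with the longest reasoning chains, and at inference majority-vote only over the most complex sampled chains. The Python below is a minimal sketch; counting non-empty lines as the complexity proxy follows the paper's spirit, while the field name `ex["cot"]` and the `keep_frac` threshold are illustrative assumptions.

```python
from collections import Counter

def num_steps(cot: str) -> int:
    # Proxy for reasoning complexity: count non-empty lines in the chain.
    return sum(1 for line in cot.splitlines() if line.strip())

def select_exemplars(pool, k=8):
    # Complexity-based prompting: put the k exemplars with the *longest*
    # reasoning chains into the few-shot prompt.
    return sorted(pool, key=lambda ex: num_steps(ex["cot"]), reverse=True)[:k]

def complexity_vote(samples, keep_frac=0.5):
    # Complexity-based consistency: sample several chains from the model,
    # keep only the most complex ones, and majority-vote their answers.
    # `samples` is a list of (chain_of_thought, answer) pairs.
    ranked = sorted(samples, key=lambda s: num_steps(s[0]), reverse=True)
    kept = ranked[: max(1, int(len(ranked) * keep_frac))]
    return Counter(answer for _, answer in kept).most_common(1)[0][0]

samples = [
    ("step 1\nstep 2\nstep 3\nso the answer is 42", "42"),
    ("step 1\nso the answer is 41", "41"),
    ("step 1\nstep 2\nstep 3\nstep 4\nanswer: 42", "42"),
    ("answer: 7", "7"),
]
print(complexity_vote(samples))   # votes among the two longest chains -> "42"
```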

Recent advances on host–guest material systems toward organic room temperature phosphorescence

X Yan, H Peng, Y Xiang, J Wang, L Yu, Y Tao, H Li… - Small, 2022 - Wiley Online Library
The design and characterization of purely organic room‐temperature phosphorescent (RTP)
materials for optoelectronic applications is currently the focus of research in the field of …