Random feature attention
Transformers are state-of-the-art models for a variety of sequence modeling tasks. At their
core is an attention function which models pairwise interactions between the inputs at every …
A survey on text classification: From shallow to deep learning
Text classification is the most fundamental and essential task in natural language processing.
The last decade has seen a surge of research in this area due to the unprecedented …
Modeling of quantization effects in digitally controlled dc–dc converters
In digitally controlled dc–dc converters with a single voltage feedback loop, the two quantizers,
namely the analog-to-digital (A/D) converter and the digital pulse-width modulator (DPWM…
Learning from context or names? an empirical study on neural relation extraction
Neural models have achieved remarkable success on relation extraction (RE) benchmarks.
However, there is no clear understanding of which type of information affects existing RE …
A comprehensive survey on pretrained foundation models: A history from BERT to ChatGPT
Pretrained Foundation Models (PFMs) are regarded as the foundation for various downstream
tasks with different data modalities. A PFM (e.g., BERT, ChatGPT, and GPT-4) is trained on …
Transfer learning using computational intelligence: A survey
Transfer learning aims to provide a framework to utilize previously-acquired knowledge to
solve new but similar problems much more quickly and effectively. In contrast to classical …
Classifying relations via long short term memory networks along shortest dependency paths
Relation classification is an important research arena in the field of natural language
processing (NLP). In this paper, we present SDP-LSTM, a novel neural network to classify the …
A convolutional attention network for extreme summarization of source code
Attention mechanisms in neural networks have proved useful for problems in which the
input and output do not have fixed dimension. Often there exist features that are locally …
Complexity-based prompting for multi-step reasoning
We study the task of prompting large-scale language models to perform multi-step reasoning.
Existing work shows that when prompted with a chain of thought (CoT), sequences of …
Recent advances on host–guest material systems toward organic room temperature phosphorescence
X Yan, H Peng, Y Xiang, J Wang, L Yu, Y Tao, H Li… - Small, 2022 - Wiley Online Library
The design and characterization of purely organic room‐temperature phosphorescent (RTP)
materials for optoelectronic applications is currently the focus of research in the field of …