Relational Semantic Behaviour in Transformer Models: An Empirical and Cognitively Informed Analysis of Attention-Based Meaning Construction
Abstract
Transformer-based models (TBMs) (Vaswani et al., 2017) have demonstrated remarkable performance across a wide range of natural language tasks, yet the nature of the semantic representations they construct remains an open question. In particular, it is unclear whether meaning in such models emerges primarily from the encoding of discrete lexical features or from relational patterns established through contextual interaction. This paper investigates the relational semantic behaviour of TBMs by analysing attention-driven transformations across varied linguistic inputs. Using a series of controlled empirical probes, we examine how attention mechanisms mediate context-sensitive meaning construction, focusing on relational dependencies rather than token-level representations alone. The analysis highlights systematic patterns through which semantic coherence arises from interactions between elements across a sequence, suggesting that meaning in transformer models is fundamentally relational and dynamically constructed. Rather than interpreting these findings as direct analogues of human cognition, the study adopts a cognitively informed perspective, using concepts from cognitive semantics and relational representation to interpret model behaviour. This approach allows transformer architectures to be examined as computational systems that exhibit structured, interpretable semantic organisation without assuming psychological equivalence. The results contribute to ongoing discussions at the intersection of cognitive computation and artificial intelligence by providing empirical evidence that attention mechanisms support relational semantic organisation in large language models. These findings have implications for model interpretability, the evaluation of semantic representations, and the development of cognitively grounded analytical frameworks for contemporary AI systems.
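The attention mechanism at the centre of this analysis can be illustrated with a minimal sketch of scaled dot-product self-attention (Vaswani et al., 2017) in NumPy. This is an illustrative toy example only: the variable names, array shapes, and random toy embeddings are assumptions for exposition, not the probes used in the paper. It shows the relational character of the computation, in that each output row is a mixture of value vectors weighted by that token's pairwise affinity with every other token in the sequence.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention over a single sequence.

    Q, K, V: arrays of shape (seq_len, d). Each output row is a
    context-weighted mixture of value vectors, so a token's
    representation depends on its relations to all other tokens.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # pairwise relational scores
    # Numerically stable softmax over each row of scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy "sequence" of 4 tokens with 8-dimensional embeddings
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
output, attn = scaled_dot_product_attention(X, X, X)  # self-attention
```

Each row of `attn` is a probability distribution over the sequence, so inspecting these weights (the core of attention-based probing) reveals which inter-token relations dominate a given token's contextualised representation.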