
DocPolarBERT: A Pre-trained Model for Document Understanding with Relative Polar Coordinate Encoding of Layout Structures


By Benno Uthayasooriyar and 3 other authors


Abstract: We introduce DocPolarBERT, a layout-aware BERT model for document understanding that eliminates the need for absolute 2D positional embeddings. We extend self-attention to take text block positions into account in a relative polar coordinate system rather than a Cartesian one. Despite being pre-trained on a dataset more than six times smaller than the widely used IIT-CDIP corpus, DocPolarBERT achieves state-of-the-art results. These results demonstrate that a carefully designed attention mechanism can compensate for reduced pre-training data, offering an efficient and effective alternative for document understanding.
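To make the core idea concrete, the sketch below computes a relative polar representation (distance and angle) between the centers of two text-block bounding boxes. This is a minimal illustration of the coordinate transformation the abstract describes, not the authors' implementation; the function name, the box format `(x0, y0, x1, y1)`, and the choice of box centers are assumptions, and how such pairs would be bucketed into attention biases is not shown.

```python
import math

def relative_polar(box_a, box_b):
    """Return (r, theta) from the center of box_a to the center of box_b.

    Boxes are (x0, y0, x1, y1) tuples; this format is an assumption
    made for illustration, not taken from the paper.
    """
    cxa, cya = (box_a[0] + box_a[2]) / 2, (box_a[1] + box_a[3]) / 2
    cxb, cyb = (box_b[0] + box_b[2]) / 2, (box_b[1] + box_b[3]) / 2
    dx, dy = cxb - cxa, cyb - cya
    r = math.hypot(dx, dy)          # Euclidean distance between centers
    theta = math.atan2(dy, dx)      # angle in radians, in (-pi, pi]
    return r, theta
```

For example, `relative_polar((0, 0, 2, 2), (3, 0, 5, 2))` yields a distance of 3.0 and an angle of 0.0 (the second block lies directly to the right of the first). Unlike absolute 2D embeddings, such pairwise quantities are invariant to translation of the whole page, which is one plausible reason a relative encoding can be learned from less pre-training data.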

Submission history

From: Benno Uthayasooriyar [view email]
[v1]
Fri, 11 Jul 2025 14:00:56 UTC (814 KB)
[v2]
Tue, 15 Jul 2025 07:51:41 UTC (814 KB)
[v3]
Thu, 31 Jul 2025 15:39:10 UTC (798 KB)

