Why is antialiasing with the integrated Intel GPU better than with the dedicated Nvidia RTX 3050?

I have a dual-GPU laptop (Nvidia RTX 3050). The Nvidia PRIME profile is set to On-Demand. If I run KiCad on the Nvidia GPU (`__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia kicad`), the antialiasing is worse than with the integrated Intel GPU. Why?

I use Kubuntu 24.04 with the Nvidia proprietary driver 575.64.03.
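For reference, which GPU actually renders a PRIME-offloaded process can be confirmed via the OpenGL renderer string reported by `glxinfo` (from the `mesa-utils` package). This is a generic check, not output from my machine:

```shell
# Print the OpenGL renderer in use, with and without PRIME render offload.
# Assumes glxinfo (mesa-utils) is available; prints a note if it is not.
check_renderer() {
  if command -v glxinfo >/dev/null 2>&1; then
    glxinfo -B | grep -i "opengl renderer"
  else
    echo "glxinfo not installed (sudo apt install mesa-utils)"
  fi
}

check_renderer                                   # default renderer (Intel on this setup)
__NV_PRIME_RENDER_OFFLOAD=1 \
__GLX_VENDOR_LIBRARY_NAME=nvidia check_renderer  # should report the Nvidia GPU
```

If the second invocation does not report the Nvidia GPU, the offload variables are not taking effect for that process.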

```
Application: KiCad x86_64 on x86_64
Version: 9.0.3-9.0.3-0~ubuntu24.04.1, release build

Libraries:
    wxWidgets 3.2.4
    FreeType 2.13.2
    HarfBuzz 8.3.0
    FontConfig 2.15.0
    libcurl/8.5.0 OpenSSL/3.0.13 zlib/1.3 brotli/1.1.0 zstd/1.5.5 libidn2/2.3.7 libpsl/0.21.2 (+libidn2/2.3.7) libssh/0.10.6/openssl/zlib nghttp2/1.59.0 librtmp/2.3 OpenLDAP/2.6.7

Platform: Ubuntu 24.04.3 LTS, 64 bit, Little endian, wxGTK, X11, KDE, x11

Build Info:
    Date: Jul 8 2025 13:02:59
    wxWidgets: 3.2.4 (wchar_t,wx containers) GTK+ 3.24
    Boost: 1.83.0
    OCC: 7.6.3
    Curl: 8.5.0
    ngspice: 42
    Compiler: GCC 13.3.0 with C++ ABI 1018
    KICAD_IPC_API=ON

Locale:
    Lang: en_GB
    Enc: UTF-8
    Num: 1,234.5
    Encoded кΩ丈: D0BACEA9E4B888 (sys), D0BACEA9E4B888 (utf8)
```