
displayport: use correct buffer size in setOuiSource #1034

Closed

tuxedo-aer wants to merge 1 commit into NVIDIA:main from tuxedo-aer:main

Conversation

@tuxedo-aer

Don't send more bytes than necessary, to avoid problems with displays that only accept the exact number of bytes defined in the DisplayPort standard.

Currently, 16 bytes are allocated and sent, although only addresses 300h to 309h are valid. The remaining bytes are zeroed, but even so, some displays appear to reject the write because of the extra bytes. Therefore, reduce the buffer to the required size and send only the relevant bytes.

This resolves the problem for me on TUXEDO InfinityBook Max and TUXEDO Stellaris devices with 5050 and 5060 GPUs. I have no evidence that this caused any actual problems with the driver, but doing things correctly and avoiding warnings in the console is always good.

@CLAassistant

CLAassistant commented Feb 20, 2026

CLA assistant check
All committers have signed the CLA.

@Binary-Eater
Collaborator

Binary-Eater commented Feb 21, 2026

Thanks @tuxedo-aer. We have already fixed this issue internally by correctly allocating the source OUI buffer so that addresses 0030Ch and above, which are reserved and lead to undefined behavior on certain sinks when written, are never touched. We appreciate your change.

The fix is tracked in internal bug 5783114.

