ufs: core: Fix heap corruption due to out of bounds write
author    Marek Vasut <marek.vasut+renesas@mailbox.org>
          Sun, 29 Mar 2026 23:11:36 +0000 (01:11 +0200)
committer Neil Armstrong <neil.armstrong@linaro.org>
          Wed, 22 Apr 2026 08:06:01 +0000 (10:06 +0200)
commit    c7ebdb9871dfd6e170a6dfeee39be234c37a4b53
tree      c7d841d289f7a633a7cd27df70f7845cba61b895
parent    c7299ff33ebfabe35baa6656590d977dedfba5b0
ufs: core: Fix heap corruption due to out of bounds write

ufshcd_read_string_desc() can perform an out-of-bounds write and
corrupt the heap in case the input utf-16 string contains code points
which convert to anything other than plain 7-bit ASCII.
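
To illustrate the trigger (the string content below is made up, it is
not taken from any real device), non-ASCII code points expand to more
than one utf-8 Byte each:

  #include <linux/types.h>      /* u16 */

  /* 'S' (U+0053), 'e' with acute (U+00E9) and the euro sign (U+20AC):
   * 3 utf-16 characters, but 1 + 2 + 3 = 6 utf-8 Bytes once converted. */
  u16 serial[] = { 0x0053, 0x00E9, 0x20AC, 0x0000 };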

This occurs because utf16_to_utf8(dst, src, size) in U-Boot behaves
differently from Linux utf16s_to_utf8s(..., maxlen), and the porting
process did not take that into consideration. The U-Boot variant
converts up to $size fixed-length 16-bit utf-16 input characters into
as many variable-length utf-8 output characters of 1..4 Bytes each.
That means that for an input of 16 utf-16 characters, the output can
be up to 64 Bytes long. The Linux variant converts the utf-16 input
into at most $maxlen Bytes worth of utf-8 output and stops at the
$maxlen limit. That means that for the same input with maxlen=32, the
processing stops after writing 32 output Bytes.
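
A minimal sketch of the U-Boot behaviour, assuming a made-up buffer
size and string content (this is not the driver's actual code, only
the utf16_to_utf8() call itself is real):

  #include <charset.h>          /* utf16_to_utf8() */
  #include <linux/types.h>
  #include <malloc.h>

  static void utf16_to_utf8_overrun_demo(void)
  {
          /* Four utf-16 characters, each of which encodes to 3 utf-8 Bytes */
          u16 name[] = { 0x30C6, 0x30B9, 0x30C8, 0x6587, 0x0000 };
          /* 4 characters + NUL, sized as if the output were plain ASCII */
          u8 *buf = malloc(5);

          if (!buf)
                  return;
          /*
           * size=4 is the number of input characters to consume, not
           * the size of 'buf': the call emits 12 utf-8 Bytes here and
           * writes past the end of the 5 Byte heap allocation.
           */
          utf16_to_utf8(buf, name, 4);
          free(buf);
  }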

In the case of U-Boot, the use of utf16_to_utf8() can therefore write
past the first $size Bytes of the destination buffer and corrupt the
surrounding content on the heap.

The fix is simple: allocate a buffer that is large enough to fit the
utf-8 string. The rest of the code in ufshcd_read_string_desc()
already correctly limits the buffer to fit into the DMA descriptor
afterward.
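
A minimal sketch of that kind of allocation, using a hypothetical
helper name and a worst-case bound of three utf-8 Bytes per utf-16
character (this is not the exact hunk applied to the driver):

  #include <charset.h>          /* utf16_to_utf8() */
  #include <linux/types.h>
  #include <malloc.h>

  /* Hypothetical helper, for illustration only */
  static u8 *utf16_dup_as_utf8(u16 *uc, size_t uc_chars)
  {
          /*
           * A utf-16 character never expands to more than three utf-8
           * Bytes (a surrogate pair of two characters expands to four),
           * so three Bytes per input character is a safe upper bound;
           * the extra zeroed Byte acts as a NUL terminator.
           */
          u8 *buf = calloc(uc_chars * 3 + 1, 1);

          if (!buf)
                  return NULL;
          utf16_to_utf8(buf, uc, uc_chars);
          return buf;
  }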

Signed-off-by: Marek Vasut <marek.vasut+renesas@mailbox.org>
Link: https://patch.msgid.link/20260329231151.332108-1-marek.vasut+renesas@mailbox.org
Signed-off-by: Neil Armstrong <neil.armstrong@linaro.org>
drivers/ufs/ufs-uclass.c