"Not enough space" error when printing a string

YounYokel
Prole
Posts: 33
Joined: Thu Oct 03, 2019 5:57 pm
Location: Kazakhstan

Re: "Not enough space" error when printing a string

Post by YounYokel » Sun Feb 02, 2020 11:07 am

zorg wrote:
Sun Feb 02, 2020 8:26 am
YounYokel wrote:
Sun Feb 02, 2020 6:44 am
raidho36 wrote:
Sun Feb 02, 2020 12:26 am
I insist that you head over to Wikipedia and study carefully the article about UTF-8 encoding.
Weird that the other function, the one that inserts a string at a given position, does work; it's almost the same.
I'm pretty sure inserting at an index that falls between two bytes of a multibyte character would also break.
Uh... I didn't understand a word. Sorry(
Let's make games with love.

Code: Select all

local major, minor, revision, codename = love.getVersion()
string.format('Made with LÖVE %s.%s (%s)', major, minor, codename)

dusoft
Party member
Posts: 104
Joined: Fri Nov 08, 2013 12:07 am

Re: "Not enough space" error when printing a string

Post by dusoft » Sun Feb 02, 2020 7:23 pm

YounYokel wrote:
Sun Feb 02, 2020 11:07 am
Uh... I didn't understand a word. Sorry(
The point is that UTF-8/Unicode characters are encoded in multiple bytes. So if you crop or take a substring of a string at the byte level, you can cut it in the middle of an actual Unicode character. E.g.

-- ASCII chars
abcdefgh

e.g. get (ASCII) chars 1 to 4 of this string: abcd

However:
-- Unicode chars (emojis)
🥰👍🍾

🥰 (Unicode) = \xf0\x9f\xa5\xb0 (UTF-8 encoding)

e.g. get (ASCII) char 1 of this string: \xf0
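
In Lua you can see this difference directly: the `#` operator counts bytes, while the `utf8` library (from Lua 5.3, also bundled with LÖVE) counts characters. A minimal sketch:

```lua
local utf8 = require("utf8")

print(#"abcd")                       -- 4: four ASCII characters, four bytes
print(#"\xf0\x9f\xa5\xb0")           -- 4: one emoji (🥰), but four bytes in UTF-8
print(utf8.len("abcd"))              -- 4: counts characters
print(utf8.len("\xf0\x9f\xa5\xb0"))  -- 1: one character
```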

You end up with a malformed string, and possibly an error, because LÖVE cannot render the invalid byte sequence.

You should never take a substring of Unicode/UTF-8 text manually at the byte level; use only UTF-8-aware functions for any cropping or substring cutting.
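
For example, a character-indexed substring can be built on top of `utf8.offset`, which maps a character index to a byte position (a sketch; `usub` is a hypothetical helper, not part of LÖVE's API):

```lua
local utf8 = require("utf8")

-- Substring by character index rather than byte index.
local function usub(s, i, j)
  local start = utf8.offset(s, i)      -- byte position where character i starts
  local stop  = utf8.offset(s, j + 1)  -- byte position just past character j
  stop = stop and stop - 1 or #s
  return s:sub(start, stop)
end

print(usub("🥰👍🍾", 1, 2))  -- "🥰👍": two whole characters, no split bytes
```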

See this converter: https://www.branah.com/unicode-converter
enter: 🥰
get (4 bytes): \xf0\x9f\xa5\xb0

remove the last byte (3 bytes): \xf0\x9f\xa5
get: a truncated sequence that is no longer valid UTF-8; a decoder may show a replacement character or a different glyph entirely (the converter renders 🥀)

If you accidentally remove bytes from the middle of a character, you get a malformed string. Also see "Invalid byte sequences" on Wikipedia:
https://en.wikipedia.org/wiki/UTF-8#Inv ... _sequences
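
You can also detect a malformed string before printing it: `utf8.len` returns `nil` (plus the byte position of the first invalid sequence) instead of a count. A small sketch:

```lua
local utf8 = require("utf8")

local s = ("\xf0\x9f\xa5\xb0"):sub(1, 3)  -- 🥰 with its last byte cut off
local count, errpos = utf8.len(s)
print(count, errpos)  -- count is nil; errpos points at the bad sequence
if not count then
  print("invalid UTF-8, refusing to draw it")
end
```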

raidho36
Party member
Posts: 2063
Joined: Mon Jun 17, 2013 12:00 pm

Re: "Not enough space" error when printing a string

Post by raidho36 » Mon Feb 03, 2020 1:21 am

Seeing how trying to do all of that Unicode handling manually at every corner gets us nowhere, I've made a library. See if it works for you.

https://love2d.org/forums/viewtopic.php?f=5&t=88211
