Provide a conversion function from UTF-8 to UTF-16.
Add missing #include <linux/types.h> in include/charset.h.
Remove superfluous #include <common.h> in lib/charset.c.
Signed-off-by: Heinrich Schuchardt <xypron.glpk@gmx.de>
Signed-off-by: Alexander Graf <agraf@suse.de>
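
The conversion added here is the UTF-8 to UTF-16 direction in lib/charset.c.
As a rough, standalone illustration of that direction (not the U-Boot
implementation; the function name and signature below are hypothetical and
validation of malformed input is omitted), a decoder walks the UTF-8 byte
sequence and emits UTF-16 code units, using a surrogate pair for code points
above U+FFFF:

#include <stddef.h>
#include <stdint.h>

/*
 * Illustrative sketch only, not the U-Boot API: decode well-formed UTF-8
 * and emit UTF-16 code units. dst needs room for at most srclen code
 * units, since a UTF-8 sequence of n bytes yields at most n units.
 * Malformed input is not rejected here.
 */
static size_t utf8_to_utf16_sketch(const uint8_t *src, size_t srclen,
				   uint16_t *dst)
{
	size_t i = 0, out = 0;

	while (i < srclen) {
		uint32_t cp;
		uint8_t b = src[i];

		if (b < 0x80) {				/* 1 byte, 7 bits */
			cp = b;
			i += 1;
		} else if ((b & 0xe0) == 0xc0) {	/* 2 bytes, 11 bits */
			cp = (b & 0x1f) << 6 | (src[i + 1] & 0x3f);
			i += 2;
		} else if ((b & 0xf0) == 0xe0) {	/* 3 bytes, 16 bits */
			cp = (b & 0x0f) << 12 | (src[i + 1] & 0x3f) << 6 |
			     (src[i + 2] & 0x3f);
			i += 3;
		} else {				/* 4 bytes, 21 bits */
			cp = (uint32_t)(b & 0x07) << 18 |
			     (src[i + 1] & 0x3f) << 12 |
			     (src[i + 2] & 0x3f) << 6 | (src[i + 3] & 0x3f);
			i += 4;
		}

		if (cp < 0x10000) {
			dst[out++] = (uint16_t)cp;
		} else {			/* surrogate pair */
			cp -= 0x10000;
			dst[out++] = 0xd800 | (uint16_t)(cp >> 10);
			dst[out++] = 0xdc00 | (cp & 0x3ff);
		}
	}

	return out;	/* UTF-16 code units written */
}

A code point in the Basic Multilingual Plane needs one UTF-16 code unit;
anything above U+FFFF needs two, which is exactly the case the
MAX_UTF8_PER_UTF16 reasoning in the next entry relies on.
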
The constant MAX_UTF8_PER_UTF16 is used to calculate
required memory when converting from UTF-16 to UTF-8.
If this constant is too big, we waste memory.
A code point encoded by one UTF-16 symbol is converted to a
maximum of three UTF-8 symbols, e.g.
0xffff could be encoded as 0xef 0xbf 0xbf.
The first byte carries four bits, the second and third byte
carry six bits each.
A code point encoded by two UTF-16 symbols is converted to four
UTF-8 symbols.
So in this case we need a maximum of two UTF-8 symbols per
UTF-16 symbol.
As the overall maximum is three UTF-8 symbols per UTF-16 symbol,
we need MAX_UTF8_PER_UTF16 = 3.
Fixes: 78178bb0c9d ("lib: add some utf16 handling helpers")
Signed-off-by: Heinrich Schuchardt <xypron.glpk@gmx.de>
Signed-off-by: Alexander Graf <agraf@suse.de>
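
To make the arithmetic above concrete, here is a small sketch (illustrative
only; the helper names are made up, and only MAX_UTF8_PER_UTF16 comes from
the code) of the exact UTF-8 length contributed by a single UTF-16 code unit
and the worst-case buffer size derived from the constant:

#include <stddef.h>
#include <stdint.h>

#define MAX_UTF8_PER_UTF16 3	/* per the reasoning above */

/*
 * Exact number of UTF-8 bytes produced for one UTF-16 code unit:
 *   up to U+007F      -> 1 byte
 *   up to U+07FF      -> 2 bytes
 *   rest of the BMP   -> 3 bytes (e.g. U+FFFF -> 0xef 0xbf 0xbf)
 *   surrogate half    -> 2 bytes (the full pair encodes to 4 bytes)
 * Every case is <= MAX_UTF8_PER_UTF16, so 3 is a safe multiplier.
 */
static size_t utf8_bytes_per_utf16_unit(uint16_t u)
{
	if (u < 0x80)
		return 1;
	if (u < 0x800)
		return 2;
	if (u >= 0xd800 && u <= 0xdfff)
		return 2;
	return 3;
}

/* Worst-case UTF-8 buffer size (without NUL) for n UTF-16 code units. */
static size_t utf16_to_utf8_max_size(size_t n)
{
	return n * MAX_UTF8_PER_UTF16;
}
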
We'll eventually want these in a few places in efi_loader, and also in
vsprintf.
Signed-off-by: Rob Clark <robdclark@gmail.com>