author    | Trent Piepho <tpiepho@impinj.com> | 2019-05-10 17:48:20 +0000
committer | Simon Glass <sjg@chromium.org>    | 2019-05-21 17:33:23 -0600
commit    | b061ef39c350c288542536b09dc01d9e984a12ac (patch)
tree      | d8bab333c9261a53eb0669f8d2595d4de3028e4a /drivers/core
parent    | 347ea0b63eb5143bf0e48aba65a41f50999367f0 (diff)
core: ofnode: Have ofnode_read_u32_default return a u32
It was returning an int, which doesn't work if the u32 it is reading,
or the default value, overflows a signed int.
While it could be made to work with careful casting, on a C
standard/compiler where the signed/unsigned conversions involved have
defined behavior, it seems obvious that signed values are meant to be
read with ofnode_read_s32_default() instead.
Cc: Simon Glass <sjg@chromium.org>
Signed-off-by: Trent Piepho <tpiepho@impinj.com>
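
For illustration only (this sketch is not part of the commit): a minimal,
self-contained C program showing the failure mode the message above
describes. The helpers read_default_as_int() and read_default_as_u32()
are hypothetical stand-ins for the old and new return types of
ofnode_read_u32_default().

#include <stdio.h>
#include <stdint.h>

/* Hypothetical stand-in for the old signature, which returned int. */
static int read_default_as_int(uint32_t val)
{
	return val;	/* implementation-defined when val > INT_MAX */
}

/* Hypothetical stand-in matching the corrected u32 return type. */
static uint32_t read_default_as_u32(uint32_t val)
{
	return val;	/* always preserves the value */
}

int main(void)
{
	uint32_t prop = 0x80000000u;	/* does not fit in a signed int */

	printf("as int: %d\n", read_default_as_int(prop));	/* typically -2147483648 */
	printf("as u32: %u\n", read_default_as_u32(prop));	/* 2147483648 */
	return 0;
}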
Diffstat (limited to 'drivers/core')
-rw-r--r-- | drivers/core/ofnode.c | 2 |
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/drivers/core/ofnode.c b/drivers/core/ofnode.c
index 12977a7790..c72c6e2673 100644
--- a/drivers/core/ofnode.c
+++ b/drivers/core/ofnode.c
@@ -39,7 +39,7 @@ int ofnode_read_u32(ofnode node, const char *propname, u32 *outp)
 	return 0;
 }
 
-int ofnode_read_u32_default(ofnode node, const char *propname, u32 def)
+u32 ofnode_read_u32_default(ofnode node, const char *propname, u32 def)
 {
 	assert(ofnode_valid(node));
 	ofnode_read_u32(node, propname, &def);
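
A usage sketch of the corrected API. The device tree path, property name,
and wrapper function below are invented for illustration; only
ofnode_path() and ofnode_read_u32_default() are real U-Boot calls.

#include <dm/ofnode.h>

/* Hypothetical caller demonstrating the corrected return type. */
static u32 demo_get_rate(void)
{
	ofnode node = ofnode_path("/soc/uart@1000");	/* example path */

	/* A default (or property value) above INT_MAX, such as
	 * 0xFFFFFFFF, now reaches the caller unchanged instead of
	 * being converted through a signed int. */
	return ofnode_read_u32_default(node, "clock-frequency", 0xFFFFFFFFu);
}

Callers that actually want signed values are expected to use
ofnode_read_s32_default(), as the commit message notes.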