
Commit eb214f2

xairy authored and torvalds committed
kasan, arm64: use ARCH_SLAB_MINALIGN instead of manual aligning
Instead of changing cache->align to be aligned to KASAN_SHADOW_SCALE_SIZE
in kasan_cache_create() we can reuse the ARCH_SLAB_MINALIGN macro.

Link: http://lkml.kernel.org/r/52ddd881916bcc153a9924c154daacde78522227.1546540962.git.andreyknvl@google.com
Signed-off-by: Andrey Konovalov <andreyknvl@google.com>
Suggested-by: Vincenzo Frascino <vincenzo.frascino@arm.com>
Cc: Andrey Ryabinin <aryabinin@virtuozzo.com>
Cc: Christoph Lameter <cl@linux.com>
Cc: Dmitry Vyukov <dvyukov@google.com>
Cc: Mark Rutland <mark.rutland@arm.com>
Cc: Vincenzo Frascino <vincenzo.frascino@arm.com>
Cc: Will Deacon <will.deacon@arm.com>
Signed-off-by: Andrew Morton <akpm@linux-foundation.org>
Signed-off-by: Linus Torvalds <torvalds@linux-foundation.org>
1 parent 63f3655 commit eb214f2

File tree

2 files changed: +6 -2 lines changed

2 files changed

+6
-2
lines changed

arch/arm64/include/asm/cache.h

Lines changed: 6 additions & 0 deletions
@@ -58,6 +58,12 @@
  */
 #define ARCH_DMA_MINALIGN	(128)
 
+#ifdef CONFIG_KASAN_SW_TAGS
+#define ARCH_SLAB_MINALIGN	(1ULL << KASAN_SHADOW_SCALE_SHIFT)
+#else
+#define ARCH_SLAB_MINALIGN	__alignof__(unsigned long long)
+#endif
+
 #ifndef __ASSEMBLY__
 
 #include <linux/bitops.h>

mm/kasan/common.c

Lines changed: 0 additions & 2 deletions
@@ -298,8 +298,6 @@ void kasan_cache_create(struct kmem_cache *cache, unsigned int *size,
 		return;
 	}
 
-	cache->align = round_up(cache->align, KASAN_SHADOW_SCALE_SIZE);
-
 	*flags |= SLAB_KASAN;
 }
