Commit history for path: root/mm/slub.c
Age        | Commit message                                                                     | Author                         | Files | Lines
2018-04-11 | kasan, slub: fix handling of kasan_slab_free hook                                  | Andrey Konovalov               | 1     | -23/+34
2018-04-05 | slab, slub: skip unnecessary kasan_cache_shutdown()                                | Shakeel Butt                   | 1     | -0/+11
2018-04-05 | slub: make size_from_object() return unsigned int                                  | Alexey Dobriyan                | 1     | -1/+1
2018-04-05 | slub: make struct kmem_cache_order_objects::x unsigned int                         | Alexey Dobriyan                | 1     | -35/+39
2018-04-05 | slub: make slab_index() return unsigned int                                        | Alexey Dobriyan                | 1     | -1/+1
2018-04-05 | slab: make usercopy region 32-bit                                                  | Alexey Dobriyan                | 1     | -1/+1
2018-04-05 | kasan: make kasan_cache_create() work with 32-bit slab cache sizes                 | Alexey Dobriyan                | 1     | -1/+1
2018-04-05 | slab: make kmem_cache_flags accept 32-bit object size                              | Alexey Dobriyan                | 1     | -2/+2
2018-04-05 | slub: make ->size unsigned int                                                     | Alexey Dobriyan                | 1     | -6/+6
2018-04-05 | slub: make ->object_size unsigned int                                              | Alexey Dobriyan                | 1     | -4/+4
2018-04-05 | slub: make ->cpu_partial unsigned int                                              | Alexey Dobriyan                | 1     | -3/+3
2018-04-05 | slub: make ->inuse unsigned int                                                    | Alexey Dobriyan                | 1     | -3/+2
2018-04-05 | slub: make ->align unsigned int                                                    | Alexey Dobriyan                | 1     | -1/+1
2018-04-05 | slub: make ->reserved unsigned int                                                 | Alexey Dobriyan                | 1     | -1/+1
2018-04-05 | slub: make ->remote_node_defrag_ratio unsigned int                                 | Alexey Dobriyan                | 1     | -5/+6
2018-04-05 | slab: make kmem_cache_create() work with 32-bit sizes                              | Alexey Dobriyan                | 1     | -1/+1
2018-04-05 | mm/slub.c: use jitter-free reference while printing age                            | Chintan Pandya                 | 1     | -4/+5
2018-02-06 | kasan: don't use __builtin_return_address(1)                                       | Dmitry Vyukov                  | 1     | -4/+4
2018-02-06 | kasan: detect invalid frees for large objects                                      | Dmitry Vyukov                  | 1     | -2/+2
2018-02-03 | Merge tag 'usercopy-v4.16-rc1' of git://git.kernel.org/pub/scm/linux/kernel/g...  | Linus Torvalds                 | 1     | -12/+37
2018-01-31 | slub: remove obsolete comments of put_cpu_partial()                                | Miles Chen                     | 1     | -3/+1
2018-01-31 | mm/slub.c: fix wrong address during slab padding restoration                       | Balasubramani Vivekanandan     | 1     | -3/+5
2018-01-15 | usercopy: Allow strict enforcement of whitelists                                   | Kees Cook                      | 1     | -1/+2
2018-01-15 | usercopy: WARN() on slab cache usercopy region violations                          | Kees Cook                      | 1     | -4/+19
2018-01-15 | usercopy: Prepare for usercopy whitelisting                                        | David Windsor                  | 1     | -2/+9
2018-01-15 | usercopy: Include offset in hardened usercopy report                               | Kees Cook                      | 1     | -6/+8
2017-11-15 | kmemcheck: rip it out                                                              | Levin, Alexander (Sasha Levin) | 1     | -3/+2
2017-11-15 | kmemcheck: remove whats left of NOTRACK flags                                      | Levin, Alexander (Sasha Levin) | 1     | -2/+0
2017-11-15 | kmemcheck: stop using GFP_NOTRACK and SLAB_NOTRACK                                 | Levin, Alexander (Sasha Levin) | 1     | -3/+1
2017-11-15 | kmemcheck: remove annotations                                                      | Levin, Alexander (Sasha Levin) | 1     | -20/+0
2017-11-15 | slub: fix sysfs duplicate filename creation when slub_debug=O                      | Miles Chen                     | 1     | -0/+4
2017-11-15 | slab, slub, slob: convert slab_flags_t to 32-bit                                   | Alexey Dobriyan                | 1     | -3/+3
2017-11-15 | slab, slub, slob: add slab_flags_t                                                 | Alexey Dobriyan                | 1     | -12/+14
2017-11-15 | mm: slabinfo: remove CONFIG_SLABINFO                                               | Yang Shi                       | 1     | -2/+2
2017-11-02 | License cleanup: add SPDX GPL-2.0 license identifier to files with no license      | Greg Kroah-Hartman             | 1     | -0/+1
2017-09-13 | mm: treewide: remove GFP_TEMPORARY allocation flag                                 | Michal Hocko                   | 1     | -1/+1
2017-09-08 | treewide: make "nr_cpu_ids" unsigned                                               | Alexey Dobriyan                | 1     | -1/+1
2017-09-06 | mm/slub.c: constify attribute_group structures                                     | Arvind Yadav                   | 1     | -1/+1
2017-09-06 | mm/slub.c: add a naive detection of double free or corruption                      | Alexander Popov                | 1     | -0/+4
2017-09-06 | mm: add SLUB free list pointer obfuscation                                         | Kees Cook                      | 1     | -5/+37
2017-09-06 | slub: tidy up initialization ordering                                              | Alexander Potapenko            | 1     | -2/+2
2017-08-18 | slub: fix per memcg cache leak on css offline                                      | Vladimir Davydov               | 1     | -1/+2
2017-07-06 | mm: memcontrol: account slab stats per lruvec                                      | Johannes Weiner                | 1     | -2/+2
2017-07-06 | mm: vmstat: move slab statistics from zone to node counters                        | Johannes Weiner                | 1     | -2/+2
2017-07-06 | mm/slub.c: wrap kmem_cache->cpu_partial in config CONFIG_SLUB_CPU_PARTIAL          | Wei Yang                       | 1     | -31/+38
2017-07-06 | mm/slub.c: wrap cpu_slab->partial in CONFIG_SLUB_CPU_PARTIAL                       | Wei Yang                       | 1     | -7/+11
2017-07-06 | mm/slub: reset cpu_slab's pointer in deactivate_slab()                             | Wei Yang                       | 1     | -13/+8
2017-07-06 | mm/slub.c: remove a redundant assignment in ___slab_alloc()                        | Wei Yang                       | 1     | -1/+0
2017-06-23 | slub: make sysfs file removal asynchronous                                         | Tejun Heo                      | 1     | -14/+26
2017-06-02 | slub/memcg: cure the brainless abuse of sysfs attributes                           | Thomas Gleixner                | 1     | -2/+4