| author | Aneesh Kumar K.V <aneesh.kumar@linux.ibm.com> | 2018-06-14 16:01:52 +0530 |
|---|---|---|
| committer | Michael Ellerman <mpe@ellerman.id.au> | 2018-06-20 09:13:25 +1000 |
| commit | fadd03c615922d8521a2e76d4ba2335891cb2790 (patch) | |
| tree | 69ffe2b951674633146183b0178d7a262adcddc9 /arch/powerpc/mm/hugetlbpage.c | |
| parent | 758380b8155f69b4e2f77f27562f8a7a466749d6 (diff) | |
| download | linux-fadd03c615922d8521a2e76d4ba2335891cb2790.tar.bz2 | |
powerpc/mm/hash/4k: Free hugetlb page table caches correctly.
With a 4K page size, hugetlb hugepage directories are allocated from their own slab
cache. With commit 0c4d26802 ("powerpc/book3s64/mm: Simplify the rcu callback for page table free")
we missed freeing these allocated hugepd tables.

Update pgtable_free to handle the hugetlb hugepd directory table.
Fixes: 0c4d268029bf ("powerpc/book3s64/mm: Simplify the rcu callback for page table free")
Signed-off-by: Aneesh Kumar K.V <aneesh.kumar@linux.ibm.com>
[mpe: Add CONFIG_HUGETLB_PAGE guard to fix build break]
Signed-off-by: Michael Ellerman <mpe@ellerman.id.au>
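As background for the fix, here is a minimal user-space sketch of the allocation side the commit message describes: with a 4K base page size, hugepage directories (hugepd) come from their own dedicated slab caches rather than the regular page-table caches. The struct and names below (model_cache, hugepd_16M_cache, the object sizes) are illustrative assumptions loosely modelling a kmem_cache; they are not kernel symbols.

```c
#include <stdio.h>
#include <stdlib.h>

/*
 * Illustrative stand-in for a slab cache (kmem_cache in the kernel):
 * one dedicated pool per hugepd directory size. All names and sizes
 * here are assumptions made up for this sketch.
 */
struct model_cache {
	const char *name;
	size_t obj_size;
};

static struct model_cache hugepd_16m_cache = { "hugepd_16M_cache", 64 };
static struct model_cache hugepd_16g_cache = { "hugepd_16G_cache", 512 };
static struct model_cache pud_table_cache  = { "pud_table_cache", 4096 };

static void *model_cache_alloc(struct model_cache *c)
{
	printf("allocating a %zu-byte object from %s\n", c->obj_size, c->name);
	return malloc(c->obj_size);
}

int main(void)
{
	/*
	 * A hugepd directory is much smaller than a regular page table
	 * and lives in its own cache, so a deferred free that is never
	 * told which cache the object came from cannot return it
	 * correctly; that is the missed free the patch addresses.
	 */
	void *hugepd = model_cache_alloc(&hugepd_16m_cache);
	void *pud    = model_cache_alloc(&pud_table_cache);

	free(hugepd);
	free(pud);
	(void)hugepd_16g_cache;		/* second hugepd size, unused in this sketch */
	return 0;
}
```

Because the directory comes from a dedicated cache, the eventual free must be told which cache that was; the hunk below supplies exactly that missing piece of information.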
Diffstat (limited to 'arch/powerpc/mm/hugetlbpage.c')
-rw-r--r-- | arch/powerpc/mm/hugetlbpage.c | 3 |
1 file changed, 2 insertions, 1 deletion
```diff
diff --git a/arch/powerpc/mm/hugetlbpage.c b/arch/powerpc/mm/hugetlbpage.c
index 7c5f479c5c00..8a9a49c13865 100644
--- a/arch/powerpc/mm/hugetlbpage.c
+++ b/arch/powerpc/mm/hugetlbpage.c
@@ -337,7 +337,8 @@ static void free_hugepd_range(struct mmu_gather *tlb, hugepd_t *hpdp, int pdshif
 	if (shift >= pdshift)
 		hugepd_free(tlb, hugepte);
 	else
-		pgtable_free_tlb(tlb, hugepte, pdshift - shift);
+		pgtable_free_tlb(tlb, hugepte,
+				 get_hugepd_cache_index(pdshift - shift));
 }
 
 static void hugetlb_free_pmd_range(struct mmu_gather *tlb, pud_t *pud,
```
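To make the one-line change above concrete, here is a hedged user-space sketch of what the new third argument provides: the (pdshift - shift) level offset is first translated into an index naming the dedicated hugepd slab cache, so the deferred free can return the directory to the cache it was allocated from. get_hugepd_cache_index() is the helper the hunk calls; everything else below (the MODEL_* names and values, the offsets 2 and 3, the made-up sizes) is an assumption for illustration, not the kernel's pgtable_free() or its constants.

```c
#include <stdio.h>
#include <stdlib.h>
#include <assert.h>

/*
 * Illustrative model only: the index values and MODEL_* names are
 * assumptions, not the kernel's constants.
 */
enum model_cache_index {
	MODEL_PTE_INDEX,
	MODEL_PMD_INDEX,
	MODEL_PUD_INDEX,
	MODEL_HUGEPD_16M_INDEX,	/* one index per supported hugepd size */
	MODEL_HUGEPD_16G_INDEX,
	MODEL_NR_CACHES,
};

static const char *cache_name[MODEL_NR_CACHES] = {
	"pte_cache", "pmd_cache", "pud_cache",
	"hugepd_16M_cache", "hugepd_16G_cache",
};

/*
 * Models the idea behind get_hugepd_cache_index(): translate the
 * (pdshift - shift) level offset of a hugepd into the index of the
 * dedicated cache it was allocated from. The offsets 2 and 3 are
 * made up for this sketch.
 */
static int model_get_hugepd_cache_index(int level_offset)
{
	switch (level_offset) {
	case 2:
		return MODEL_HUGEPD_16M_INDEX;
	case 3:
		return MODEL_HUGEPD_16G_INDEX;
	default:
		assert(0 && "not a hugepd level offset");
		return -1;
	}
}

/*
 * Models a pgtable_free()-style dispatcher: by the time the deferred
 * free runs, only this index is available, so it alone must identify
 * the cache to return the table to.
 */
static void model_pgtable_free(void *table, int index)
{
	assert(index >= 0 && index < MODEL_NR_CACHES);
	printf("returning %p to %s\n", table, cache_name[index]);
	free(table);
}

int main(void)
{
	void *hugepd = malloc(64);	/* pretend: came from hugepd_16M_cache */
	int pdshift = 24, shift = 22;	/* made-up shifts; only the offset matters */

	/*
	 * Before the fix, only the raw offset (pdshift - shift) was
	 * passed down, which never identifies the hugepd's own cache;
	 * per the commit message, these tables were missed on free.
	 * After the fix, the offset is translated to a cache index first:
	 */
	model_pgtable_free(hugepd, model_get_hugepd_cache_index(pdshift - shift));
	return 0;
}
```

The design point the fix relies on is that the actual free happens later, in the RCU callback referenced by the Fixes: tag, where only this encoded index is available, so the index by itself has to identify the right cache.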